The Volokh Conspiracy
Mostly law professors | Sometimes contrarian | Often libertarian | Always independent
The FTC jumps into Log4j cleanup with one foot
Episode 389 of the Cyberlaw Podcast
When it comes to spurring remediation of the log4j bug, the FTC's other foot, I argue, is lodged firmly in its mouth. It has published what can only be described as a regulatory blog post, reminding everyone of the $700 million in fines imposed on Equifax and threatening "to use its full legal authority to pursue companies that fail to take reasonable steps to protect consumer data from exposure as a result of Log4j." Tatyana Bolton defends the agency from a charge of heavy-handedness, arguing that this is the best way to get companies to patch quickly and that only "reasonable steps" are required. I think we'll hear "we only asked for reasonable steps" a lot from the FTC, now that it turns out that fixing the Log4j mess is going to require a lot more than regulatory muscle flexing. I also argue that the FTC's tough-guy pose is just that; when talking about the open source maintainers who actually have to generate many of the patches, the FTC doesn't threaten them with its "full legal authority." Instead, it acknowledges that open source coders "don't always have adequate resources and personnel," something the FTC "will consider as we work to address the root issues that endanger user security." Hmm, maybe Equifax should have pleaded inadequate resources and saved itself $700 million.
Speaking of fallible regulators, Glenn Gerstell gives us a tour of China's tech regulatory landscape, and the remarkable decline it has caused in the fortunes of consumer tech firms there, something the NYT covered in detail last week. Is that good news for Silicon Valley or for US competitiveness? Sadly, probably not, I conclude.
Mark MacCarthy explains why a proposal to combine cryptocurrency with Signal is causing angst among Signal's supporters, who fear an expansion of the end-to-end encrypted service's "regulatory attack surface."
Glenn covers the latest story about security risks and telecom gear from China.
Mark and I dig into the growing enthusiasm for regulating big Silicon Valley companies as gatekeepers. The Germans are about to apply that approach to Google. And the South Koreans are doing the same to Apple and its app store payment policies.
Tatyana notes the press coverage about possible tensions between two talented and strong cybersecurity officials in the White House: Anne Neuberger and Chris Inglis. I put Glenn on the spot about claims that Anne has "a particular tendency to clash with lawyers." That would only make me love her more, but to my regret, Glenn (who, as NSA's top lawyer, worked with her for years) absolves her of the charge.
Mark and I handicap the probability that the plaintiff will succeed in a highly charged lawsuit against Facebook/Meta for bringing together the boogaloo conspirators who killed a federal protective officer. It's a long shot, but if "negligent design" turns out to create liability for software and algorithms, Signal will have an even greater attack surface than its fans are now worried about.
Glenn explains the charges brought in China against Walmart for breaches of cybersecurity laws (hint: it's mostly not breaches of cybersecurity laws). Speaking of surprises that aren't surprises, Glenn also covers the announcement by Lloyd's of London that cyber insurance won't cover cyber-attacks attributable to nation-states.
Finally, I devote a few minutes to a rant about the Justice Department's decision to expand charges against Joe Sullivan, Uber's former CISO, for his role in paying "bug bounties" to hackers who looked more like crooks than bounty hunters when they compromised a bunch of Uber records. More than a year after charging Sullivan with obstruction of justice for using the "bug bounty" justification to keep the whole thing quiet, Justice piled on new charges of wire fraud for more or less the same thing. Glenn and I both question the decision to do this without any new facts to base the new charges on. And I point out the logical consequence of telling breach responders that they could face wire fraud charges if they decide not to disclose the breach (or maybe delay notice too long). The new Justice tack will be (or at least should be) fatal to the FBI's desire to be called in to assist and observe while companies are dealing with breaches. If there's even a small risk that a decision to delay or withhold notice could lead to a criminal investigation, why would any GC want to have an FBI agent sitting in the room while the decision is being made?
Download the 389th Episode (mp3)
You can subscribe to The Cyberlaw Podcast using iTunes, Google Play, Spotify, Pocket Casts, or our RSS feed. As always, The Cyberlaw Podcast is open to feedback. Be sure to engage with @stewartbaker on Twitter. Send your questions, comments, and suggestions for topics or interviewees to CyberlawPodcast@steptoe.com. Remember: If your suggested guest appears on the show, we will send you a highly coveted Cyberlaw Podcast mug!
The views expressed in this podcast are those of the speakers and do not reflect the opinions of their institutions, clients, friends, families, or pets.
Equifax's corporate charter should have been revoked and its assets liquidated.
"Instead, it acknowledges that open source coders "don't always have adequate resources and personnel," something the FTC "will consider as we work to address the root issues that endanger user security." Hmm, maybe Equifax should have pleaded inadequate resources and saved itself $700 million."
I'm not sure why Baker gets the platform to write this drivel when at times it's so clear he has no clue what he's talking about.
Equifax runs a for-profit business managing sensitive personal data; it is entirely reasonable to expect them to have reasonable security practices in place to support that business without massively violating the data subjects' privacy.
A random open source project could very well be some dead guy's hobby from five years ago that, it turns out, a bunch of companies have incorporated into their businesses. It's the responsibility of the companies making use of the code to make sure they are doing so safely, not the hobbyist who let them use his work for free.
I'm not sure the FTC is the agency best equipped to do so, but having someone think about systemic risks of relying on underfunded open source projects is probably a good idea.
Are you Sheldon Cooper disguised as "jb"? I feel that you need someone to hold up the SARCASM sign.
Turns out I can totally understand that the original statement was sarcastic and still think that the implication is stupid.
The FTC's tough-guy attitude would be a lot more impressive if there was a complete patch for the log4j bug in existence to begin with.
Threatening people over patching a third-party library that still hasn't had the bug in question completely fixed is just idiotic.
There's never going to be a complete patch because there are always going to be more bugs. But there's a patch right now that fixes all of the known exploits, so I'm not sure what you're trying to get at here.
Now, there are downstream projects that have not yet incorporated the fixes in the log4j library itself, so they're still vulnerable. That's a problem, but a different one.
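For most downstream Java projects the mechanical part really is just forcing the patched library through the build. Here's a rough sketch of what that looks like in a Gradle Kotlin build file (nothing official; the coordinates and the 2.17.1 version are the ones Apache published, the rest is illustrative):

```kotlin
// build.gradle.kts: pin the patched Log4j, even where it only arrives
// transitively through some other dependency.
dependencies {
    constraints {
        implementation("org.apache.logging.log4j:log4j-core:2.17.1") {
            because("Log4Shell (CVE-2021-44228) and the follow-up CVEs")
        }
        implementation("org.apache.logging.log4j:log4j-api:2.17.1")
    }
}
```

The hard part is knowing every place the library is hiding, which is a dependency-auditing problem, not a patching one.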
There have been (at least) four patches that claim to fix the log4j bug by now, so forgive me if I'm a little skeptical of the latest claims.
Speaking as a software dev: this isn't helping. And the idea that a simple, reasonable effort to fix it is all that is needed is hopelessly ham-handed.
1. Though the root cause isn't a core feature, it is used (and was kept in precisely because it was used). Those projects aren't likely to be simple fixes, as the functionality of the software will need to change, and depending on the industry that can incur testing and regulatory burdens (I work in finance, and FINRA is a harsh taskmaster on what data must be collected, how it is maintained and kept, and lots of other complexities). There's a rough sketch of the kind of behavior change I mean at the end of this comment.
2. There have already been patches that needed patches for introduced bugs. Rushing a 'fix' isn't always useful. We are in the 'you don't have time to do it right, but you have time to do it again?' zone.
3. Deployment isn't always easy. I worked in DoD space, and our system had a 4 hour window every 6 months to do releases. Emergency fixes were VERY costly, and had mission impacts. I wonder who wins the FTC / DoD fight on what gets deployed on GOTS? Though, I presume Lockheed, Boeing, et al will be the losers in any event.
4. Until this bug, there were huge companies using this library, but only ~6 people and/or organizations supporting it. Oracle, Facebook, Microsoft, IBM, yada all used this, but didn't support it. It is a tragedy of the commons (the biggest risk in free and open source software). The FTC's heavy-handedness won't help that. It may kill off lots of useful projects that companies don't want to support, but do want to use.
And that is off the top of my head.
Although, npm is a dumpster fire and if that died because of this (and whatever follow-on idiocy the admin state dreams up), it wouldn't be the worst thing that ever happened.
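To put something concrete behind point 1, here's a rough Kotlin sketch (my own illustration, assuming log4j-api and log4j-core are on the classpath) of the kind of behavior change the fix forces on code that actually used the lookup feature:

```kotlin
import org.apache.logging.log4j.LogManager

fun main() {
    val log = LogManager.getLogger("lookup-demo")

    // Before 2.15, lookups embedded in the logged *message* were resolved,
    // so this line logged the actual JVM version. From 2.16 on, message
    // lookups are gone and the literal text "${java:version}" is logged
    // instead. Anything that leaned on that resolution has to move the
    // lookup into the pattern layout or do the substitution itself.
    log.info("running on \${java:version}")
}
```

Multiply that by every code path and config that quietly used lookups, and that's the testing burden I'm talking about.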
"3. Deployment isn't always easy. I worked in DoD space, and our system had a 4 hour window every 6 months to do releases. Emergency fixes were VERY costly, and had mission impacts. I wonder who wins the FTC / DoD fight on what gets deployed on GOTS? Though, I presume Lockheed, Boeing, et al will be the losers in any event."
The original vulnerability was VERY, VERY bad. It's hard to imagine any deployment cost or mission impact outweighing the fact that almost anyone can trivially start running whatever code they want on your system. I'm sure it would be expensive and mess a bunch of missions up if, say, we discovered that we had to replace every piece of radio equipment used by the military because it turned out the Chinese had managed to get unimpeded access to the existing communications network, but I would also hope that the DoD wouldn't just slowly replace everything according to their usual procurement process either.
It wasn't on a public network. It is a serious vulnerability, but the bad actors are limited to those who could penetrate a SCIF. So the risk and cost of not fixing it still needs to outweigh the cost of fixing it "right now, though the heavens fall".
I'd say the FTC is a day late and a dollar short but it's more like a month late and so much more than a dollar that I don't even know where to start.
"According to the complaint in Equifax, a failure to patch a known vulnerability irreversibly exposed the personal information of 147 million consumers. Equifax agreed to pay $700 million to settle actions by the Federal Trade Commission, the Consumer Financial Protection Bureau, and all fifty states."
So now, not only are we legislating by blog post, but the source of this authority comes from the mere claim of authority in another complaint? In a pretty extreme scenario? Really?
I'm not entirely comfortable with any of this, though of course people should patch log4j issues.