Congress continues its power grab for the internet. The pretext this time was the mental health of teenagers who use Instagram. But U.S. lawmakers these days don't really need much impetus for intervention. Every few weeks—and sometimes more frequently—Congress holds new "Big Tech" hearings, designed to give them opportunities for showboating and pushing the same grab bag of internet regulations.
Yesterday's Senate subcommittee hearing on Facebook was no different. Led by Sen. Richard Blumenthal (D–Conn.) and Sen. Marsha Blackburn (R–Tenn.), members of the Subcommittee on Consumer Protection, Product Safety, and Data Security spewed the same old pre-packaged talking points—algorithms are bad, technology is addictive, think of the children, this legislation I introduced will save democracy—with little regard for the actual facts (or witnesses) at hand.
Much of it was clearly just senators wanting a dramatic soundbite of themselves condemning Facebook and technology. "Facebook and Big Tech are facing a Big Tobacco moment," opined Blumenthal, before reminiscing about his days suing cigarette companies. Earlier this week Facebook had a service outage, "but for years it has had a principles outage," said Sen. Ed Markey (D–Mass.). "The children of America are hooked on their product," said Sen. Roger Wicker (R–Miss.).
Again and again, senators posed broad and often absurd accusations against Facebook as questions to the hearing's witness, former Facebook employee Frances Haugen. Isn't CEO Mark Zuckerberg "the algorithm designer in chief?" asked Blumenthal, as if Facebook's head is personally programming Instagram to show teens certain content. Doesn't Facebook want to hook kids young because then their "lifetime value as a user" is greater? asked Blackburn, despite admitting that a Facebook staffer had said under oath last week that's not how they thought about it.
Haugen was adept at saying sympathetic-sounding things without actually agreeing to these statements (e.g., no, Zuck doesn't personally write Instagram algorithms, but he did create a culture where metrics matter…) and sometimes even outright denied them (no, to her knowledge, Facebook did not keep kids' data after deleting their accounts, she said). But Haugen did her own fair share of grandstanding about how Facebook puts profits before "safety" with surprisingly little concrete information for someone being touted as a "whistleblower."
The biggest "bombshell" that she disclosed is that some of Facebook's internal research showed teenagers, especially teen girls, felt negative after using Instagram. This news is being treated like some sort of major revelation in the press and by lawmakers, who conveniently ignore the fact that feeling lonely, socially competitive, and insecure is an unfortunate fact of being a teenager for many kids.
But there's scant evidence Instagram is uniquely bad on this front compared to other social media and/or countless non-internet-enabled factors (fashion magazines, celebrity culture, high school in general, etc.). Nor do we have reason to think that getting teens off Instagram would leave them without an online trigger for these same feelings. There are hundreds of apps teens can use to keep up with one another (or feel like they're failing to), and there are plenty of places teens can easily seek out information independent of Facebook's algorithms.
Senators yesterday made a big deal about teens finding extreme dieting content on Instagram. But easily accessible "pro-ana" content has been a feature of the internet since I was a teen, and I'm now almost 40 years old.
Nonetheless, senators yesterday did their now-typical hand-wringing about algorithms ("There is a question … if there is such a thing as a safe algorithm," said Blumenthal in opening statements). They also pulled the common but incredibly disingenuous trick of condemning a company for prohibiting the very underage users it says it will prohibit.
Blackburn kept coming back to the fact that Facebook had booted hundreds of thousands of under-13 users, in accordance with its minimum age rules. That's what we should want, right? But oh no—in Blackburn's warped logic, the very fact that these kids (lied about their age and) created accounts in the first place is Facebook's fault. The company should've somehow known their true ages.
Often, it seems the only thing that will satisfy lawmakers is for tech companies to have magical powers. But there's a frightening underlying point to such shenanigans. By having such unrealistic expectations for tech companies—expectations that they try to convince the public are just common sense—lawmakers can easily pivot to suggesting that a failure to meet these expectations (to magically divine user ages or whatever other absurd thing is being touted) necessitates giving more control to legislators, censoring more speech, collecting more private information about users, etc.
In this case, the only way to be sure no 11- or 12-year-olds join Instagram is to require every user of every age to submit a government-issued ID to join.
Killing internet anonymity is something that crops up frequently on lawmaker wish lists. So does weakening Section 230, which would let lawmakers severely threaten companies that don't do whatever they ask. And so do proposals to make tech companies share more user data with regulators, to let Congress decide what sorts of content can be seen on social media, and to give regulatory agencies more room to put tech companies on trial—a cross-agency fishing expedition that will hopefully turn up something senators can use to justify all the time they've wasted on this.
All of these solutions were on offer at yesterday's hearing, with little to suggest they would actually solve the problems people have with Facebook and everything to suggest this is about lawmakers getting their way.
In a statement after the hearing, Zuckerberg decried what he described as "the false picture of the company that is being painted" and pushed back on many of the senators' allegations:
Many of the claims don't make any sense. If we wanted to ignore research, why would we create an industry-leading research program to understand these important issues in the first place? If we didn't care about fighting harmful content, then why would we employ so many more people dedicated to this than any other company in our space—even ones larger than us? If we wanted to hide our results, why would we have established an industry-leading standard for transparency and reporting on what we're doing? And if social media were as responsible for polarizing society as some people claim, then why are we seeing polarization increase in the US while it stays flat or declines in many countries with just as heavy use of social media around the world?
At the heart of these accusations is this idea that we prioritize profit over safety and well-being. That's just not true. For example, one move that has been called into question is when we introduced the Meaningful Social Interactions change to News Feed. This change showed fewer viral videos and more content from friends and family—which we did knowing it would mean people spent less time on Facebook, but that research suggested it was the right thing for people's well-being. Is that something a company focused on profits over people would do?
The argument that we deliberately push content that makes people angry for profit is deeply illogical. We make money from ads, and advertisers consistently tell us they don't want their ads next to harmful or angry content. And I don't know any tech company that sets out to build products that make people angry or depressed. The moral, business and product incentives all point in the opposite direction.
In a post earlier this week, the company also pushed back on the idea that its research showed Instagram was purely negative for teens:
The research actually demonstrated that many teens we heard from feel that using Instagram helps them when they are struggling with the kinds of hard moments and issues teenagers have always faced. In fact, in 11 of 12 areas on the slide referenced by the Journal—including serious areas like loneliness, anxiety, sadness and eating issues—more teenage girls who said they struggled with that issue also said Instagram made those difficult times better rather than worse.
Take all of that for what you will. But one needn't totally buy the company's self-serving logic (or agree with everything it does) to see that Congress is working its own self-serving angle here, too.
.@FrancesHaugen is right – we need a dedicated regulatory agency to hold Facebook and other Big Tech companies accountable for how their algorithms push misinformation and how our data is used and misused for their profit.
We need a Data Protection Agency.
— Sen. Kirsten Gillibrand (@gillibrandny) October 5, 2021
California can't limit private prisons. "A divided federal appeals court panel ruled on Tuesday that a California state law aimed at phasing out private detention facilities in the state cannot be enforced because it is likely to unconstitutionally intrude on the federal government's power over immigration," reports Politico. "The 2-1 decision from the 9th U.S. Circuit Court of Appeals overturns a District Court ruling that upheld the 2019 California legislation, known as AB 32."
Treasury Secretary defends Biden's bank snooping proposal. The administration's plan would make banks report to the IRS the annual inflow and outflow of all bank, loan, and investment accounts either containing more than $600 or conducting more than $600 in transactions during the year. The proposal "is widely seen as an unprecedented invasion of privacy," notes the New York Post. But Treasury Secretary Janet Yellen dismissed this concern, saying on CNBC yesterday that "it's just a few pieces of information about individual bank accounts" and not a privacy violation.
Yellen's argument seems to rest on the idea that because banks wouldn't report the specifics of transactions, filling the IRS in on everyone's bank balances is no big deal.
"It is not reporting of individual transactions or anything of the like," said Yellen. "And it would be a simple thing for banks and other payment providers to provide along with the other information they're already providing."
A year after cop's QI denial was upheld at 10th Circuit, jury awards $2.5 million to man injured by cop's gunfire after he found himself an unwitting passenger in a chase. Includes $131K against the cop, who was given PD's medal of honor for this incident. https://t.co/5aLm0wkMCB
— Peter Bonilla (@pebonilla) October 5, 2021
• A new date has been set for the trial of former Backpage executives after a mistrial was declared last month. The new trial will start on February 9, 2022.
• Jacob Sullum takes a look at everyone's favorite new myth about Halloween candy dangers: that marijuana edibles are being handed out to children. "Police are still pushing this discredited scare, but it seems fewer people are falling for it," Sullum writes.
• John McWhorter defends the use of they as a gender-neutral singular pronoun.
• "There is no plausible explanation for the words 'pig', 'pigs', 'copper', and 'jerk' being on the State Police's list of additional bad words [filtered out from comments on the State Police's Facebook page] other than impermissible viewpoint discrimination," a federal court has ruled.
• Gen X is getting richer.
• Wealthy people's perception of themselves as not that wealthy helps drive their support for progressive taxation policies, finds a new study:
Related to the above tweet: Many comparatively rich people support more redistribution of wealth because they believe that there is a considerable share of the population with (even) higher incomes, who would bear the cost. https://t.co/kizi5GeqLe
— Rolf Degen (@DegenRolf) October 5, 2021