Social Media

We Don't Need a War on Screen Time

Unwired makes an unconvincing argument for heavy-handed tech regulation.


Unwired: Gaining Control over Addictive Technologies, by Gaia Bernstein, Cambridge University Press, 248 pages, $24.95

"For over a decade," Seton Hall Law School professor Gaia Bernstein recalls in Unwired, "I often sat down to write at my regular table at the coffee shop near my apartment. I took out my laptop, my iPhone, and my Kindle. I wrote down my list of tasks for the day. But then two and a half hours later, with little writing done and feeling drained, I wondered what happened. The answer was usually texts and emails, but more than anything, uncontrollable Internet browsing: news sites, blogs, Facebook. Every click triggered another. I no longer do this. At least, I try my best not to."

Although Bernstein calls her behavior "uncontrollable," she also says she managed to control it. But after several years of studying "technology overuse" and advising people on how to limit their children's "screen time," she has concluded that self-discipline and parental supervision are no match for wily capitalists who trick people into unhealthy attachments to their smartphones and computers. The solution, she believes, must include legal restrictions and mandates. Without government action, she says, there is no hope of "gaining control over addictive technologies."

To make her case that the use of force is an appropriate response to technologies that people like too much, Bernstein mixes anecdotes with a smattering of inconclusive studies and supposedly earthshaking revelations by former industry insiders. It all adds up to the same basic argument that has long been used to justify restrictions on drug use, gambling, and other pleasurable activities that can, but usually do not, lead to self-destructive, life-disrupting habits: Because some people do these things to excess, everyone must suffer.

Like drug prohibitionists who assert that the freedom to use psychoactive substances is illusory, Bernstein is determined to debunk the notion that people have any real choice when it comes to deciding how much time they spend online or what they do there. If you believe that, she says, you have been fooled by "the illusion of control," which blinds you to the reality that "the technology industry" is "calling the shots."

Given the underhanded techniques that designers and programmers use to lure and trap the audience they need to generate ad revenue and collect valuable data, Bernstein argues, people cannot reasonably be expected to resist the siren call of the internet. The result, she says, is an increasingly disconnected society in which adults stare at their phones instead of interacting with each other and children face a "public health crisis" caused by their obsession with social media and online games.

One problem for Bernstein's argument, as she concedes, is that the concept of "screen time" encompasses a wide range of activities, many of which are productive, edifying, sociable, or harmlessly entertaining. Another problem, which Bernstein by and large ignores, is the fine line between "abusive design," which supposedly encourages people to spend more time on apps, social media platforms, or games than they should, and functional design, which makes those things more attractive or easier to use.

Bernstein warns that ad-dependent and data-hungry businesses are highly motivated to maximize the amount of time that people spend on their platforms. So they come up with "manipulative features," such as "likes," autoplay, "pull to refresh," the "infinite scroll," notifications, and algorithm-driven recommendations, that encourage users to linger even when it is not in their well-considered interest to do so. But those same features also enhance the user's experience. If they didn't, businesses that eschewed them would have an advantage over their competitors.

Bernstein claims that autoplay, for example, serves merely to increase revenue by boosting exposure to advertising. Yet she repeatedly complains about autoplay on Netflix, which makes its money through subscriptions. The company nevertheless has decided that it makes good business sense to cue up the next episode of a series after a viewer has finished the previous one. Other streaming services have made the same decision.

Bernstein says that sneaky move gives people no time to decide whether they actually want to watch the next episode. But in fact there is a lag, during which viewers can stop autoplay with the press of a button. If they are not quick enough, they still have the option of choosing not to binge-watch You or The Diplomat, a decision they can implement with negligible effort. As a justification for government intervention, this is pretty weak tea.

One of the horror stories scattered through Unwired poses a similar puzzle. Daniel Petric, an Ohio teenager, was so obsessed with Halo 3 that he ended up playing it 18 hours a day. After his concerned father "confiscated the game" and locked it in a safe, Bernstein notes, Petric shot both of his parents, killing his mother.

Since Halo is an Xbox game, Bernstein cannot claim that its designers were determined to maximize ad revenue by maximizing play time. Rather, they were determined to maximize sales by making the game as entertaining as possible. While that may be a problem for some people in some circumstances, making video games as boring as possible is plainly not in the interest of most consumers, which is why it is a bad business model. But by Bernstein's logic, pretty much anything electronic that holds people's interest is suspect.

Bernstein does not propose a ban on compelling video games or binge-worthy TV shows. But she has lots of other ideas, including government-imposed addictiveness ratings, restrictions on the availability of online games (a policy she assures us is not limited to "totalitarian regimes like China"), mandatory warnings that nag people about spending too much time on an app or website, legally required default settings that would limit smartphone use, and a ban on "social media access by minors."

Bernstein also suggests "prohibiting specific addictive elements," such as loot boxes, autoplay, engagement badges, alerts, and the infinite scroll. And she wants the government to consider "prohibiting comments and likes on social media networks," which would make "social" something of a misnomer. If those steps proved inadequate, she says, legislators could take "a broader catch-all approach by prohibiting any feature that encourages additional engagement with the platform"—a category capacious enough to cover anything that makes a platform attractive to users.

Other possibilities Bernstein mentions include "a federal tort for social media harm," a Federal Trade Commission crackdown on excessively engaging platforms, and an effort to break up tech companies through antitrust litigation. She hopes the last strategy, which so far has proven far less successful than she suggests it could be, would create more competition, which in turn would help replace free services that rely on advertising and data harvesting with services that collect fees from users.

But why wait? "Social media is free," Bernstein writes. "Games are mostly free. But what if they were not? What if users had to pay?" Just as "taxes on cigarettes made them prohibitively expensive" and "effectively reduced the number of smokers," she says, a "pay-as-you-go model" would deter overuse, especially by teenagers.

Might some of these policies pose constitutional problems? Yes, Bernstein admits. "As laws and courts continuously expand First Amendment protection," she says, the fight to redesign devices and platforms through legislation and regulation "is likely to be an uphill battle." But as she sees it, that battle against freedom of speech must be fought, for the sake of our children and future generations.

One of Bernstein's stories should have prompted her to reflect on the wisdom of her proposals. When she was growing up in Israel, there was just one TV option: a government-run channel that decolorized its meager offerings lest they prove too alluring. The country's first few prime ministers, who "disfavored the idea of television generally," viewed even that concession to popular tastes as a regrettable "compromise." Frustrated Israelis eventually found a workaround: an "anti-eraser" that recolorized the shows and movies that the government wanted everyone to watch in black and white.

Oddly, Bernstein presents that story to bolster her argument that self-help software is not a satisfactory solution to overuse of social media. "Todo-book replaces Facebook's newsfeed with a user's to do list," she writes. "Instead of scrolling the feed, the user now sees tasks that they planned to do that day, and only when they have finished them does the newsfeed unlock." Although "Todo-book is a neat idea," she says, "the question becomes 'why take this circular route?' Why erase color only to restore it?"

This analogy is doubly puzzling. First, Bernstein is comparing Facebook's newsfeed, which she thinks makes the platform too appealing, with black-and-white broadcasting, which she thinks made TV less appealing. Second, it was the Israeli government, not a private business, that "erase[d] color," and it was rebellious viewers, aided by clever entrepreneurs, who "restore[d] it." There is a lesson there about misguided government meddling, but it is clearly not one that Bernstein has learned.