Politics

Learning From Kodak's Demise

What the end of a blue-chip company can teach us about the 2012 election.


The human brain is capable of memorizing 67,890 digits of pi, composing side two of Exile on Main Street, and inventing a dog-to-human translation device called the Bowlingual. Yet, we often-brilliant, always-innovating bipeds find it impossible to imagine changing the trajectory of the world we think we live in by more than a few degrees at any given moment. Whatever dominates today we assume will dominate tomorrow. This is true for our private lives, this is true for commerce, and this is especially true for politics.

Tectonic shifts in the course of human events are almost never predicted ahead of time, even by the very people who stomp on the cracks. When asked in August 1989, only a few months after the electrifying demonstrations in Tiananmen Square, whether the communist East Bloc would ever be democratic and free in his lifetime, Czech economist Václav Klaus said no. Less than five months later, he was the first finance minister of a free Czechoslovakia. Morgan Stanley trader Howie Hubler lost $9 billion on a single trade in 2007, not because he didn't think the bubble of mortgage-backed derivative securities would pop, but because he couldn't conceive of the price reduction exceeding 8 percent. For all but the last 10 days of 2007, the famed Iowa Electronic Markets (IEM) trading system for predicting major-party presidential nominees listed as its clear Republican favorite the famous ex-mayor of New York, Rudy Giuliani. Yet when it came time for people to actually vote in the primaries and caucuses, Giuliani lost in more than 40 states to Ron Paul, an obscure obstetrician/congressman whose name never even showed up on IEM's 2008 election trading board despite his ending up with the fourth-highest number of delegates. Massive, fast-paced change, whether liberational, destructive, or just plain weird, is always and everywhere underpriced.

You may have heard of confirmation bias, whereby people choose to notice and believe whatever rumors, news stories, and quasi-academic studies confirm their basic worldview. Well, get your mind around existence bias, where the mere fact of a person's, business's, political party's, or country's existence is taken as unspoken and unchallenged proof that the same entity will exist in largely the same form tomorrow, the next day, the next month, the next decade, forever and ever, amen. This despite the fact that the Western world, and the United States in particular, stands out in the history of Homo sapiens as the most vigorous producer of constant, dynamic change.

Dig up the time capsules for every decade preceding us, and you'll find retrospectively laughable anxieties about seemingly intractable threats that no longer exist. At the dawn of the new millennium, for example, the overwhelming majority of media observers agonized over how mere mortals could cope with the advent of the new Big Brother–style corporate behemoth called AOL Time Warner. As it turned out, the company set new records for financial losses before disbanding altogether. A decade before that, the question wasn't whether the Japanese would own and operate the U.S. economy but whether the American workplace would be free of insidious group calisthenics led by Toshiro Mifune types. The graduating class of 1980 could not imagine a world without high inflation and the growing communist threat, and its 1970 counterpart forecast constant Southeast Asian war fed by an endless military draft.

In 2012, it's tempting to give in to the pessimism and existence bias of the moment. Unemployment and underemployment have remained at levels not seen since the recessions of the early 1980s, and unlike in that era of Federal Reserve–imposed austerity via heightened interest rates, the economy has not been dosed with any medicine that hints at a better tomorrow. Indeed, debt and deficits are reaching levels not seen since World War II, when, as you might recall, we were fighting a world war. Against Hitler. And as bad as the current fiscal picture looks, there is rare unanimity across the discipline of economics—as well as inside the administration, from President Barack Obama on down—about one singularly unhappy fact: As the first wave of the baby boomers born between 1946 and 1964 starts to retire and goes on the public dole, things will only get much, much worse.

Yet there is a glowing ember of real hope in this gloomy picture, and it lies, paradoxically, right alongside our inability to detect it. The same revolutionary forces that have already upended much of American commerce and society over the past 40 years, delivering us not just from yesterday's bogeymen but into a futuretastic world of nearly infinite individual choice, specialization, and autonomy, are at long last beginning to buckle the cement under the most ossified chunk of American life: politics and government. A close if idiosyncratic reading of recent U.S. history gives us a blueprint for how to speed up that process of creative destruction in the realm of public policy. Because it's not true that nobody predicted such history-altering innovations as the Internet, nonviolent resistance to totalitarianism, and the home-brewing renaissance, among a thousand other happy developments in the modern world.

If we listen carefully to the theoreticians and practitioners who helped midwife these giant leaps toward the decentralization of power and the democratization of mankind, they have some surprisingly consistent things to say about changing or working around restrictive regimes, and—above all—altering the mindset that tolerates and perpetuates them. As any revolutionary will testify, there are structural impediments galore to our personal and global pursuit of happiness. Before we can sweep those roadblocks away, we have to declare our independence from the forces that conspire to keep us less than free and recognize that the status quo has no inalienable right to keep on keeping on.

Nothing in 21st-century life seems as archaic, ubiquitous, and immovable as the Republican and Democratic parties, two 19th-century political groupings that divide up the spoils of a combined $6.4 trillion annually in forcibly extracted taxpayer money at the federal, state, county, and municipal levels. While rhetorically and theoretically at odds with one another at any micro-moment in time, the two parties manage to create a mostly unbroken set of policies and governance structures that benefit well-connected groups at the expense of the individual. Americans have watched, with a growing sense of alarm and alienation, as first a Republican then a Democratic administration flouted public opinion by bailing out banks, nationalizing the auto industry, expanding war in Central Asia, throwing yet more good money after bad to keep housing prices artificially high, and prosecuting a drug war no one outside the federal government pretends is comprehensible, let alone winnable. It is easy to look upon this well-worn rut of political affairs and despair.

But what if that's the existence bias talking? What if the same elements that extend the incumbents' advantage threaten to hasten their demise? Luckily, economists have a particular fondness for studying what Democrats and Republicans have become: the longest-lived duopoly in American history. But while researchers have done interesting work explaining how duopolies collude with one another to carve up captive markets, they generally fail to address the most interesting moment of all: how customer-unfriendly collusion and customer-empowering technology combine to produce an inevitable consumer revolt, sweeping one or more of the dominant players into the dustbin of history.

In a widely circulated 2009 paper surveying the vast economic literature on the topic, the late Larry F. Darby presented a list of classic duopolies for discussion. Tellingly, several no longer existed, including MCI and AT&T (MCI, then known as WorldCom, filed what was then history's largest bankruptcy in 2002) and Macy's and Gimbels (Gimbels was the country's—and the world's—dominant department store chain in the 1930s; it ceased to exist in 1987). As such examples illustrate, there is nothing inherently stable about two organizations dominating a particular market in the hurly-burly of modern American life. In fact, there is plenty of reason to suspect that such arrangements are, if anything, unstable—particularly when technology allows captive consumers to flee.

It's worth taking a closer look at a single such case, one of the duopolies on Darby's list: Kodak and Fujifilm. Much as the Democratic Party once dominated the House of Representatives, Kodak was for much of the 20th century synonymous with color photography. Memories captured on film were "Kodak moments." Eastman Kodak was a bedrock member of the Dow Jones Industrial Average for more than seven decades. At one point the company enjoyed an amazing 96 percent share of the U.S. market for photographic film. Such was its dominance that the federal government sued Kodak for antitrust violations not once but twice, producing out-of-court settlements in 1921 and 1954. As recently as 1994, long after Japan's Fujifilm had entered the scene, the Justice Department argued that the antitrust settlements should remain in force, since Kodak had "long dominated" the industry, still enjoyed a U.S. market share of around 75 percent, and could "greatly outsell its rivals despite charging a higher price." (Careful observers of and participants in capitalism may notice in that latter claim a wonderful market opportunity.)

Fujifilm began competing with Kodak globally in the 1970s and seriously in the United States after the 1984 Olympics. Though always the junior partner on Kodak's home turf, the conglomerate held its own well enough that the duopoly soon attracted academic studies such as "Entry, Its Deterrence, and Its Accommodation," "Vertical Restraints and Market Access," and "Advertising Collusion in Retail Markets." The underlying assumption was that the duopoly's equilibrium would hold for the foreseeable future. Even those who noticed Kodak faltering in the late 1990s at the dawn of the digital age were still apt to say, as Fortune magazine did, "The Kodak brand remains solid gold, and its quality is not in dispute." No one could conceive of a photography world without Kodak playing its customary leading role.

This, stunningly, is no longer true. Eastman Kodak share prices tumbled from $60 in 2000 to $40 in 2001, to $10 in 2008, and under the $1 threshold by the end of 2011. The Dow Jones kicked the stock off its bedrock industrial average in 2004, and the New York Stock Exchange threatened the company with de-listing. Kodachrome—subject not just of a hit Paul Simon song but of the 1954 antitrust settlement that the federal government was trying to maintain four decades later—vanished from stores in 2009, and developers stopped processing the stuff for good at the end of 2010. The company closed scores of plants, laid off more than 10,000 employees, and has now filed for Chapter 11 bankruptcy.

What happened? Technological advances gave consumers choices that Kodak's fat bureaucracy was unwilling to provide. Writing in the Wall Street Journal in November 2006, William M. Bulkeley explained how the implications of this insight ranged far afield from the world of processing photographs:

Photography and publishing companies shouldn't be surprised when digital technology upends their industries. After all, their business success relied on forcing customers to buy things they didn't want. Photo companies made customers pay for 24 shots in a roll of film to get a handful of good pictures. Music publishers made customers buy full CDs to get a single hit song. Encyclopedia publishers made parents spend thousands of dollars on multiple volumes when all they wanted was to help their kid do one homework paper. The business models required customers to pay for detritus to get the good stuff. . . . Eastman Kodak and Fuji Photo Film had a highly profitable duopoly for 20 years before digital cameras came along. They never dreamed customers would quickly abandon film and prints.

When given real choice, especially the choice to go elsewhere, consumers will drop even the most beloved of brands for options that enhance their experience and increase their autonomy. We have all witnessed and participated in this revolutionary transfer of loyalty away from those who tell us what we should buy or think and toward those who give us tools to think and act for ourselves. No corner of the economy, of cultural life, or even of our personal lives hasn't felt the gale-force winds of this change. Except government.

Think of any customer experience that has made you wince or kick the cat. What jumps to mind? Waiting in multiple lines at the Department of Motor Vehicles. Observing the bureaucratic sloth and lowest-common-denominator performance of public schools, especially in big cities. Getting ritually humiliated going through airport security. Trying desperately to understand your doctor bills. Navigating the permitting process at your local city hall. Wasting a day at home while the gas man fails to show up. Whatever you come up with, chances are good that the culprit is either a direct government monopoly (as in the providers of K–12 education) or a heavily regulated industry or utility where the government is the largest player (as in health care).

Unlike government and its sub-entities, Kodak couldn't count on a guaranteed revenue stream: Consumers abandoned its products, and now the company is basically done. The history of private-sector duopolies and even monopolies is filled with such seemingly sudden disappearing acts: The A&P supermarket chain—if you're under 40 years old, or from the West Coast, you probably haven't even heard of it—was the dominant American grocer as recently as the 1950s. Big-box music retailers and bookstores were supposed to bestride the land like colossi at the turn of our new century, but Virgin Megastores have all but disappeared, and Borders has been liquidated. Dominant newspapers in one-paper towns were able to book some of the economy's highest profit margins for four decades—more than 20 percent a year, on average, positively dwarfing such hated industrial icons as Walmart—yet with the explosion of Web-based competition, these onetime mints are now among the least attractive companies in the economy.

There is a positive correlation between an organization's former dominance and its present inability to cope with 21st-century change. As technology business consultant Nilofer Merchant has aptly put it, "The Web turns old industries on their head. Industries that have had monopolies or highly profitable duopolies are the ones most likely to be completely gutted when a more powerful, more efficient system comes along." We need to hasten the inevitable arrival of that more efficient system on the doorstep of America's most stubborn, foot-dragging, reactionary sector—government at the local, state, and especially federal levels, and its officially authorized customer-hating agents, the Democrats and Republicans.

The most important long-term trend in American politics is the half-century leak in market share by the country's political duopoly. This January, Gallup released its latest study on the question of political self-identification, finding that "the proportion of independents in 2011 was the largest in at least 60 years": a stunning 40 percent. Democrats were at a desultory 31 percent, and Republicans proved utterly unable to capitalize on a bad, Democrat-led economy, trending downward to 27 percent.

Voters free from the affiliation of party membership are more inclined to view political claims with the skepticism they so richly deserve, to hear the dog whistle of tribal political calls as deliberate attacks on the senses rather than rousing calls to productive action. By refusing to confer legitimacy on the two accepted forms of political organization and discourse, they also hint strongly that another form is gathering to take their place.

When duopolies bleed share of a captive market, something potentially revolutionary is afoot. The Bush-Obama era of bailout economics and perennially deferred pain has produced a political backlash that has been evident most every time voters have had an opportunity to vent. When blue-state California was allowed in May 2009 to pass judgment on its political class, via a five-part budget-fix referendum (including "onetime" infusions of income tax and lottery proceeds) supported by a nearly unanimous cross-section of major politicians, newspapers, and interest groups, the slate lost by an average of 30 percentage points, despite opponents being outspent by an average of seven to one. Eight months later, unknown Republican Scott Brown won Teddy Kennedy's old Senate seat in three-to-one Democratic Massachusetts. Congressmen mostly canceled their traditional August town hall meetings in 2010 after getting too many earfuls in 2009, GOP establishment candidates for Senate were upended by Tea Party insurgents from Delaware to Kentucky to Nevada, and limited-government Republican insurgent Ron Paul has more than doubled his vote over 2008 so far in this presidential primary season.

For the first time in recent memory, participants in the political process, many of them newly engaged, are openly imagining and pushing for a world other than the one they currently live in. A certain spell is on the verge of breaking, with impacts we can only guess at.

Matt Welch (matt.welch@reason.com) is editor in chief of reason and Nick Gillespie (gillespie@reason.com) is editor in chief of reason.com and reason.tv. They are the co-authors of The Declaration of Independents: How Libertarian Politics Can Fix What's Wrong With America (PublicAffairs), from which this essay was adapted.