America's Tragic Mistake
The welfare state was created just as the "need" for it was disappearing.
In the 50 years since the Great Depression, history has discredited the principal premises on which the modern welfare state was built. America has created a giant, monopolistic public-service conglomerate by mistake, built on a series of interlocking misapprehensions.
It was the central premise of British economist John Maynard Keynes, and later of President Franklin D. Roosevelt, that the American economy had reached maturity, that there would henceforth be a chronic shortage of jobs and a chronic surplus of capital. But history has proved that premise and its corollaries dead wrong. Today it is clear that Keynes's work is most deficient in the quality he most wanted for it: the principal propositions of The General Theory of Employment, Interest, and Money are not general.
In 1946, the year of Keynes's death, Joseph Schumpeter wrote an appreciative but brutal summation of the mind and meaning of Keynes. The economist was, he said, "surprisingly insular." His advice, Schumpeter pointed out, was always English advice, yet Keynes "always exalted what was at any moment truth and wisdom for England into truth and wisdom for all times and places."
England's situation after World War I was unique. The country was broke. Its weakened social fabric, as Schumpeter said, had become rigid. Taxes and wages were too high to permit rapid reconstruction of the economy. Keynes believed that to return to the gold standard at prewar parity would be suicidal. (Gold, he correctly told his friends, would soon be reduced to the status of a constitutional monarch.) And so he prescribed monetary management.
Keynes's model makes sense only if—and it is a towering "if"—the industrial plant remains static; if, as so many believed during the '30s, the economic point of no return has been reached. But industrial change, the creation of new plants built on new technologies—the phenomenon that has most distinguished 20th-century capitalism—is a concept beyond the reach of the Keynesian model, with its assumption of stagnation.
Keynesian economics as presented in The General Theory is, as Professor Hicks contends, the economics of an era of depression. It is an economics of special cases. Schumpeter wrote: "Keynesians may hold that these special cases are the actual ones of our age. They cannot do more than that."
Today we know that what was taken for chronic stagnation was in fact temporary, that the economies of Europe and America were not mature but barely adolescent. The Keynesian edifice, which posited this "maturity" before the air age, before the computer, before microwave transmission, before the laser, before space travel, before automation, before robotics, rested on a false foundation. Keynes and the Keynesians had mistaken the late dawn of the industrial era for its twilight. And, as a result, the potentially healthy American economy has been under intensive care for nearly 50 years.
The fragility which the American economy showed during the '30s was not, as was so widely assumed, a reflection of an inherent weakness. It was a temporary condition induced by mistaken public policies. There is no solid consensus on what caused the Crash of '29 and delayed the recovery, but there is a near consensus on what did not. Whether the Great Depression was the result of a deliberate reduction of the money supply that asphyxiated the economy, or was caused by the imposition of a mercantilist tariff policy that suddenly shut off foreign markets when they were needed most, or was induced by some combination of these and other factors, is still being debated.
In the early '30s, the Federal Reserve allowed the money supply to contract by 30 percent over a period of several years. Economist Gottfried Haberler reminds us that today economists tremble when the money supply fails to increase by some targeted amount. During the Great Depression it fell nearly a third. Moreover, in 1932, the Hoover administration nearly doubled income taxes, increasing the top-bracket rate, for example, from 24 percent to 63 percent. And later, by raising bank reserve requirements, the Federal Reserve may have induced the steep slide of 1937.
To make matters worse, Congress passed the Smoot-Hawley tariff, history's highest, in 1930. A staggering number of economists for the time, 1,028 in all, vainly urged the president to veto the bill. In an open letter to the New York Times, they warned that stifling trade would never create employment. To some unknowable extent, Smoot-Hawley helped prolong the Great Depression.
There is an unmistakable tendency for economic activity to follow a cyclical course, to ebb and flow, and scholars are forever scrutinizing the evidence in search of patterns and chains of causation. But logic and experience suggest that, in the absence of central intervention, this rising and falling does not tend to be cumulative, but self-correcting. The Great Depression was clearly not a natural disaster but an artificial one.
Today we are able to question the cocksure conventional wisdom of the postwar period which says that World War II "cured" the Depression. It seems far more likely that Pearl Harbor merely suspended the prevailing prejudice against production. Temporarily freed from the fear that they were destroying their jobs, men and women began to produce wholeheartedly again.
Before the mobilization began, there had been an all-out effort to limit production, to plow crops under, to "spread the work," to push men and women out of the labor force. Overnight, that attitude was turned inside out as patriotism triumphed over economic superstition. Attempts to limit the work week gave way to efforts to extend it. Factories painted their windows black and worked around the clock. Retired persons rejoined the labor force by the tens of thousands. There was again, as there had been for decades, an acute labor shortage. Twenty million women went to work, many in jobs that had formerly been defined as off-limits for them. Teenagers were let out of school to harvest the crops. Factories proudly flew "E" flags awarded for high productivity.
Two days before the Pacific War ended, a Washington newspaperman warned that 10 million workers would be out of work in a month. A few days later the War Production Board predicted 7 to 9 million unemployed by Thanksgiving. The New York Times said that the "best" government and private sources predicted 8 million unemployed by spring 1946. Even the Wall Street Journal saw 6.2 million unemployed by the end of the year.
But the estimates of postwar unemployment had to be revised almost immediately. In December 1945, just as the House passed its version of an Employment Act, the Times reported that the official estimate of spring unemployment had been reduced from 8 million to 5 million. President Truman had to admit that unemployment was not "so serious or drastic" as his advisers had predicted.
Vice-President Henry Wallace, in a popular postwar book, called for a continuation of wartime spending levels to put 54 million to work—a goal considered irresponsibly utopian. Twelve months after V-J Day, employment had exceeded 57 million without the therapies suggested by the Employment Act.
The reconversion provided astounding evidence of an economic vitality which so many had come to believe had been lost. First came the basics: fuel, clothing, and food. In the Southwest, oil, cotton, and cattle became the basis for a boom that is still going on.
There was pent-up demand for familiar prewar products—automobiles (by 1949, the industry finally exceeded 1929 production), radios, and refrigerators. There were also dozens of new products.
The war ended on August 14. By November, the Edison Electric Appliance Corporation, using methods learned during the war, had converted from making armor-piercing bullet cores to the manufacture of fully automatic dishwashers—at a third less than prewar prices. This was one example among thousands.
This adjustment, which contradicted a practically unanimous pessimism, was accomplished during a massive reduction of federal spending. In fiscal 1945, the federal government spent more than $100 billion. The decline that followed was itself four times as large as total federal spending just six years before.
By V-J Day, the federal government was consuming almost half the gross national product. Between 1945 and 1946 the figure fell from 42 percent to 12 percent, and a year after that it fell still further. Yet the belief in the hazard of high productivity has persisted with awesome tenacity. The fear that the supply of work may someday dwindle and disappear still expresses itself in dozens of ways.
Work rules continue to outlaw efficient methods and mandate obsolete ones. Workers rarely attack machines anymore, as Luddites did in the early 19th century, but they cripple them by limiting their use or canceling their advantages, as when standby musicians must be paid when recorded music is used in a Broadway theater. Without the Luddite rationalization, practices like these would be seen as simple extortion.
The economy still struggles against high productivity—even as American producers lose markets to foreign competition. Thus, the belief that an economy is prone to decline causes it to decline. There are men and women still living who can remember the days when employment opportunities seemed inexhaustible, when there was an abiding shortage of labor. The Old World exported its unemployment to the United States. The capacity of the American economy to put people to work was not feeble and tentative but surging and insatiable. We offered work to millions, many unskilled and illiterate. Unemployment, the existence of a pool of men and women who wanted work, was not seen as a social burden to be dealt with dutifully and sacrificially. Workers were scarce and sought after: the system wanted and welcomed them.
But the Great Depression, however mistakenly, changed all that. There was and is a tenacious fear of a chronic labor surplus. We began to ration work, to look for reasons to exclude people from it. And the impulse survives. Young people, for example, are still banned from the work force, often at the peak of their innocent audacity and enthusiasm. "Child labor" is perceived as the ultimate outrage, but the root of the antagonism to it was the fear that children would take jobs men needed to feed their families. It is not proposed that six-year-olds be obliged to work from dawn to dark in mines or mills. But to exclude young people from the serious work of the world because of an exhausted supposition that there is not enough work to go around is grossly unfair to them.
The Luddite impulse persists, as the expectation of a labor surplus leads to policies that produce an apparent labor surplus. In 1982, it was proposed to relax the restrictions on 14- and 15-year-olds for the first time since the Great Depression. The new rules would have permitted these young people to work 6 more hours a week, when school is in session, to a total of 24 hours. It would have opened certain kinds of work now forbidden: the operation of switchboards, teletypewriters, and data-processing equipment, filling orders in warehouses, and polishing trucks and buses. The AFL-CIO protested. "At a time when their older brothers and sisters cannot find work," President Lane Kirkland told the press, "it is preposterous to lower the working rules for school-age youngsters."
In the same way and for the same reason, older people are arbitrarily shelved regardless of their preference. Bismarck chose 65 as the retirement age when he was setting up his social security system because so few people in that era lived that long. If, in the 1880s, it excluded people unnecessarily from the work force, the numbers were very small. But since then, life expectancy has nearly doubled—and millions of able-bodied older men and women spend decades in a retirement forced upon them by the folklore of chronic stagnation.
The social security system was conceived at a time of a presumably critical capital surplus. Thus, unlike private pension plans, its premiums were not invested. Beneficiaries were to be paid not from accumulated reserves but from the current contributions of the working population. This method, which pushed the cost of benefits forever forward to future generations, was, curiously, called a pay-as-you-go system. Now, this financing method seems scandalously improvident. But the Depression mentality would have been incapable of selecting a method that would have added capital to an economy which seemed to be suffering from a severe and disabling abundance of capital. The idea of capital shortage was then scarcely imaginable.
These and other anomalies have contributed to a gathering recognition not only that the accepted economic diagnosis of the Depression era was mistaken but also that policies based on that diagnosis created the very conditions they were designed to remedy. The euthanasia of the rentier was premature and ill-advised.
Of all the conclusions drawn from the Depression experience, the ones least examined were not economic but social: that independent institutions were inherently incapable of dealing with the problems of the industrial era and that full employment and retirement security had to be provided by the central authority if they were to be provided at all. There is no question that, after three years of Depression, these assumptions seemed sound. The independent sector, the network of voluntary institutions and practices through which Americans had acted on most of their common concerns for 150 years, did not in fact provide adequate stability and security during the grim years of 1929–32. But that did not mean that it never could.
The modern welfare state is in a sense the product of a misunderstanding, compounded by some cruel accidents of timing. Wilhelm Röpke, the Swiss political economist-philosopher who was the intellectual godfather of the post-World War II German economic "miracle," described this central paradox of the modern era: what we call the welfare state grew most rapidly after the need for it had largely disappeared. The large-scale need for social assistance resulted from a highly specific, ephemeral circumstance. As industrialization progressed, the old preindustrial patterns of mutual support disintegrated. The individual became a bewildered, dependent, isolated member of the new industrial proletariat, a refugee in his own society.
"The paradox," Röpke wrote in A Humane Economy (published in English translation in 1960), "is that the modern welfare state carries to an excess the system of government-organized mass relief precisely at a moment [he was writing in the late 1950s] when the economically advanced countries have largely emerged from that transition period and when, therefore, the potentialities of voluntary self-help by the individual or group are greatly enhanced." Massive state intervention was merely an expedient, made necessary by a society that was changing radically but unevenly, whose economic institutions were adapting more slowly. We were suffering from an acute, though probably temporary, institutional lag.
The expedient of state assistance, Röpke wrote, "was necessary as long as most factory workers were too poor to help themselves, too paralyzed by their proletarian position to be provident, and too disconnected from the old social fabric to rely on the solidarity and help of genuine small communities." Today, the industrialized nations have advanced well beyond that stage, and "the welfare state has outlived its necessity… it derives its origin and meaning from the conditions of an all but finished transition period of economic and social development."
But what Röpke could see in the late '50s as a difficult transition was seen in the '30s as a possibly terminal illness. After three years of stagnation, the American economy was perceived as being permanently disabled. Most Americans did not see the society's independent institutions as experiencing an adjustment; they saw them in ruins. It was easy to believe that a maturing economy had produced problems that were beyond the reach of independent institutions.
Just as the economic diagnoses fulfilled themselves—the belief in stagnation producing stagnation and the belief in labor surplus producing labor surplus—the social diagnosis became self-fulfilling. The universal assumption that the independent sector could not adapt to the needs of an industrial society guaranteed that it would not make that adjustment. The state quickly monopolized the public business. Thus, the independent sector was doomed by a limiting definition to perpetual adolescence. By the 1980s, American pluralism had almost disappeared, and our society's perception of itself had become grossly distorted.
In every major period in the history of science, a certain model, or paradigm, of the universe has been widely accepted. This dominant paradigm provides a framework for understanding the workings of the physical world. Scientists work to fill in the details by observation and experimentation. Some of the facts they find fit the paradigm, or seem to; others do not. Any divergence of fact—of reality—from the model creates a certain strain. When it becomes too great, a scientific revolution occurs. Scientists renounce the old paradigm and adopt a new one. The investigatory agenda is radically revised. For centuries, for example, the Ptolemaic cosmology was the dominant scientific paradigm. In time, in a painful upheaval, it gave way to the Copernican model.
Sheldon J. Wolin, in his splendid essay "Paradigms and Political Theories," has suggested that, in a similar way, social decisions are shaped by a dominant social paradigm. A society has a self-image, a set of widely shared beliefs about how it sets goals and gets things done, about where its strength resides, about its structure and the relationship of its parts, about how responsibilities are defined and distributed. In other words, Wolin says, a society "believes itself to be one thing and not another."
As with scientific paradigms, there is an abiding tension between reality and the prevailing social paradigm. When the divergence creates too great a strain, a conceptual revolution occurs. People repudiate the old social paradigm and embrace a new one. The new paradigm suggests a new way of looking at and understanding the world and provides a fresh blueprint for social renovation.
Wolin has found throughout history an intimate relationship between social crisis and innovative social theory. When societies begin to break down, when institutions begin to totter and authority to decompose, a search begins for a new paradigm, a new and imaginative "representation of what society would be like if it could be reordered." Constructive change begins when people begin to imagine an alternative.
In the 50 years since the Great Crash, we have come to accept a holistic, corporatist social cosmology with an all-embracing state at the center. Whatever happens is the business of the state; if other institutions exist, they are subordinate to the state. Institutions outside the state draw their legitimacy from the state, must justify their existence to the state, exist at the pleasure of the state. The state is the society. What is unofficial is illegitimate. Thus, private education, for example, is called "divisive"; its existence undermines the state. The idea of community is expressed only through the state.
What the state does "works" by definition. What the state does not do does not fit the paradigm—is unreal, or less real. Thus, any reduction in the size of the state diminishes the public good. As the black judge A. Leon Higginbotham said at Yale in the spring of 1982, "Anything that talks about a dilution of federal involvement can be translated as a reduction of resources, particularly for the weak, the poor and black."
The intellectuals in National Socialist Germany believed that local governments were redundant because there was no longer any distinction between the people and the government. Leninists have contended for years that there is no need for independent trade unions because the party is the embodiment of the working class.
When a Brooklyn family and their friends began to spend their Saturday mornings scrubbing a filthy Williamsburg subway station, they were ordered to desist. The head of the Transport Workers Union said that if the Transit Authority did not do the work officially, it should not be done.
But in this country and elsewhere, this monolithic, corporate social paradigm can no longer contain or explain social reality. People are beginning to realize that a large and growing fraction of social reality exists separately from the state, is beyond the reach of the state, is largely incomprehensible to the state and unresponsive to the only methods of coordination available to the state. There is a gathering awareness of how much work is done, not because of the state, but in spite of it and hence how much of society's future will be shaped outside the state.
Richard Cornuelle is the author of Reclaiming the American Dream and De-Managing America. This article is excerpted from his new book, Healing America, published in August by Putnam's. Copyright © 1983 by Richard Cornuelle.
This article originally appeared in print under the headline "America's Tragic Mistake."