Would the Founders Freak?
Constitution Series 1787-1987: They thought America could do best for the cause of liberty by eschewing allies and avoiding distant wars.
A major dilemma for the American republic throughout its formative years was how to deal effectively with foreign affairs while avoiding political or military entanglements that could endanger the fledgling nation. This dilemma became particularly acute during the era of the Articles of Confederation.
A belief that America had a unique mission—a destiny quite distinct from that of Europe—was already taking hold. Freed from the Old World's corrupt aristocratic system, the United States symbolized an invigorating experiment in individual liberty and limited government. Such an atmosphere of freedom, Americans believed, would enable them to establish a society that oppressed people everywhere would want to emulate. Political and military links—alliances and other manifestations of the destructive international politics that characterized Europe—might endanger the nation's independence and compromise the commitment to liberty.
At the same time, America could not be indifferent to foreign affairs. The new republic confronted powerful adversaries on its borders. Spain occupied strategic positions in Florida and Louisiana, while British forces remained entrenched in Canada and even maintained outposts throughout the Great Lakes area—territory that had clearly been ceded to the United States in the 1783 peace treaty ending the Revolutionary War.
The highly decentralized political structure established by the Articles of Confederation reflected revolutionary republican principles, but it was not the most efficacious system for dealing with foreign states. Long before the Constitutional Convention convened in Philadelphia 200 years ago, conservatives fretted about the lack of a powerful central government to manage foreign affairs. Alexander Hamilton asserted that only a stronger union could prevent "our being a ball in the hands of European powers, bandied against each other at their pleasure," and John Adams warned that disunity among the states could make them the "sport of transatlantic politicians of all denominations, who hate liberty in every shape." Efforts on the part of several states to negotiate individual commercial arrangements with European nations increasingly alarmed conservatives and seemed to underscore the need for a more coherent political mechanism to deal with foreign relations.
Treaties, Trade, and War
The Constitutional Convention did not include an abundance of direct discussions of the power to conduct foreign policy, but that issue represented an underlying motive for various provisions. For example, adoption of a clause granting Congress an exclusive and unqualified power "to regulate Commerce with Foreign Nations" was unquestionably a reaction to the Confederation era. Even establishing a distinct executive branch in the national government reflected the view, at least among conservative elements, that the Confederation Congress had been impotent in conducting foreign policy. Advocates of an independent executive believed that the national government needed an official authorized to represent the entire nation in its relations with foreign states.
Yet even this conservative-dominated convention was not about to abandon republican principles. One crucial feature of the new national government was its emphasis on a system of checks and balances. This power-sharing rationale applied just as rigorously to foreign policy as to domestic responsibilities.
Reflecting concern about problems that arose during the Confederation era, the Framers made the presidency the repository of authority for day-to-day conduct of foreign affairs. The president was authorized to receive the diplomats of other nations and to appoint America's own diplomatic representatives. He was also given the authority to negotiate treaties with foreign states. Finally, the president was invested with the power and responsibility of commander-in-chief of the nation's armed forces.
But the Founders also granted significant foreign policy powers to the legislative branch. The Senate was accorded the right to approve or reject presidential appointees, including the secretary of state and diplomatic ministers. Similarly, it was empowered to "advise and consent" with respect to treaties. Congress as a whole was given the power to regulate foreign commerce. It was also given the authority to raise military forces and provide funds for their continued operation.
Most important, the Framers gave Congress, not the president, the power to declare war. Although some constitutional scholars disagree, Arthur Schlesinger, Jr., W. Taylor Reveley, and others argue persuasively that "declare" essentially meant "authorize" or "begin." Even Alexander Hamilton, ordinarily an ardent partisan of executive power, proposed in the convention that the Senate "have the sole power of declaring war," with the executive to "have the direction of war when authorized or begun" (emphasis added).
By giving Congress the power to "declare" rather than "make" war (as suggested in an earlier draft), the delegates wanted to enable the president to repel sudden attacks on American territory. At the same time, they had no desire to grant the president unrestricted power to initiate hostilities. The ultimate decision whether the republic would go to war or remain at peace resided with Congress. This arrangement reflected the political orientation of the Founders—inheritors of the British Whig tradition with its renowned aversion to excessive executive power.
Although the governmental structure established by the Constitutional Convention enhanced the republic's ability to promote international commerce and fend off external adversaries, it was not especially conducive to playing an activist political or military role in international affairs. Such a role would have been best served by a powerful, unitary national government dominated by an unfettered executive; ample taxing authority; and the power to raise and sustain large military forces.
But the Convention preserved the concept of federalism and decentralized authority by relying on a separation of powers and checks and balances. Beyond such structural restraints, there was a pervasive hostility to large standing armies. James Madison typified this attitude when he observed in Federalist No. 41 that even a small standing army "has its inconveniences," while on an extensive scale "its consequences may be fatal." Even Madison's position was not sufficient for Anti-Federalists such as Edmund Randolph, who opposed the new Constitution largely because it did not place an explicit limitation on the size of the army.
For the Cause of Liberty
In the two centuries since the Philadelphia Convention, foreign policy, like domestic policy, has changed. Both the substance of U.S. foreign policy and the manner in which it is conducted have evolved dramatically.
The French Revolution was the first event to test the mettle of the new nation. It created problems for George Washington's administration in two ways. The United States and France were still linked by a military alliance formed in 1778—a matter of more than academic concern once the new French republic became embroiled in hostilities with its neighbors. Equally distressing was the tendency of some Americans to take sides concerning the European struggle.
In keeping with early Americans' conception of their nation's unique mission, Washington dealt with the first danger by proclaiming strict U.S. neutrality in 1793, in effect setting aside the alliance, an action that provoked cries of "presidential usurpation" from outraged French partisans in Congress. His second response was a political testament stressing the virtues of what would today be called an isolationist foreign policy.
In his Farewell Address in 1796 before leaving office, Washington cautioned against "inveterate antipathies" toward certain nations and "passionate attachments" to others. While affirming that the republic might employ "temporary alliances for extraordinary emergencies," Washington clearly advocated a detached approach to world affairs. "The great rule of conduct for us in regard to foreign nations is, in extending our commercial relations to have with them as little political connection as possible." In this way, he observed, we would not "entangle our peace and prosperity in the toils of European ambition, rivalship, interest, humor or caprice."
Despite Washington's admonitions, the republic found it difficult to extricate itself from the quarrels of the Old World. During the Napoleonic Wars in the early 1800s America was repeatedly caught in the middle, with Britain especially prone to violate this country's neutral maritime rights. It was not until the successful conclusion of the War of 1812—fought at least in part to defend those rights—that the United States was able to implement a comprehensive policy avoiding entanglements with foreign powers.
The contours of "isolationist" policy became evident in 1823 with the promulgation of the Monroe Doctrine, which emphasized the divergent interests and destinies of Europe and the Western Hemisphere. When President James Monroe presented it to Congress, he proclaimed Americans "anxious and interested spectators" with regard to European developments:
In the wars of the European powers in matters relating to themselves we have never taken any part, nor does it comport with our policy so to do. It is only when our rights are invaded or seriously menaced that we resent injuries or make preparation for our defense.
Essentially, the United States, reflecting its growing power, was carving out a security sphere. It was an explicit trade-off: the United States reiterated its intention to remain out of Europe's quarrels, while insisting that the conservative European monarchies make no move (such as recolonization efforts in Latin America) threatening the United States with encirclement. This two-hemispheres thesis remained the lodestar of U.S. foreign policy for the next century.
Avoiding foreign entanglements enabled the nation to concentrate on internal economic development and territorial expansion in North America. The latter drive was seen by proponents as an integral part of America's destiny, and most Americans perceived no contradiction between a vigorous program of continental expansion and an aversion to overseas commitments.
Yet the enthusiasm for continental expansion did increase the danger of war. A desire for Canadian territory contributed to the onset of conflict with Britain in 1812. An even more voracious hunger for the lightly populated northern provinces of Mexico was a major factor leading to the Mexican War in 1846.
Despite a certain measure of hypocrisy in the distinction between continental expansion and overseas nonintervention, American leaders were surprisingly consistent about avoiding entanglements outside North America until the end of the 19th century. Even calls for solidarity with other republican movements failed to alter that policy. When a wave of liberal republican uprisings convulsed much of Europe in the mid-1800s, the United States confined its reaction to moral support. In rejecting appeals for material assistance, Sen. Henry Clay expressed the views of a majority of his countrymen:
By the policy to which we have adhered since the days of Washington…we have done more for the cause of liberty in the world than arms could effect; we have shown to other nations the way to greatness and happiness.…Far better it is for ourselves…and the cause of liberty, that adhering to our pacific system and avoiding the distant wars of Europe, we should keep our lamp burning brightly on this western shore.
Toward Global Commitments
It was not until the Spanish-American War in 1898 that the nation departed significantly from traditional noninterventionist doctrine. The war itself was a brief, almost comic-opera affair, but its consequences were not. At war's end the United States had acquired its first protectorate (Cuba) and its first overseas colonies, most notably the Philippines. Anti-imperialists, such as Andrew Carnegie and Charles Francis Adams, warned that imitating European colonialism would irreparably weaken America's own exemplary commitment to democratic republican values.
It was a prophetic warning. Almost immediately a lengthy and brutal struggle ensued to suppress a Filipino independence movement. The United States also deemed it necessary to intervene in Cuba on several occasions in the succeeding quarter century to restore order and keep friendly regimes in power. With the advent of the so-called Roosevelt Corollary to the Monroe Doctrine in 1904, the United States began to police the entire Caribbean Basin—a role that led to repeated interventions up until the 1930s in Nicaragua, the Dominican Republic, Haiti, and other nations. While these actions did not directly contravene the core feature of traditional doctrine—the avoidance of entanglements outside the Western Hemisphere—they did weaken opposition to the principle of an interventionist foreign policy.
It was American entry into the First World War, however, that marked the beginning of the end for traditional noninterventionism. Although the United States entered that conflict ostensibly to defend neutral maritime rights, much as it had in 1812, there were crucial differences. In World War I, the United States formed extensive cooperative ties with the major powers opposing Germany. Moreover, Woodrow Wilson employed the most universal terms possible to justify the war effort, proclaiming it a crusade to make the world safe for democracy. At the end of the fighting he sought to take the country into a new world organization, the League of Nations, which would have entailed a host of permanent foreign commitments.
The Wilson era was a harbinger of the foreign policy that would emerge in the 1940s. Likewise, America's wartime experience suggested the domestic consequences of intervening abroad. Economic planning, military conscription, and strict censorship became normal features of society. The railroads were nationalized. Pacifists, draft opponents, and critics of U.S. involvement in the war were assaulted, intimidated, and imprisoned. Sedition legislation made even mild criticism of government policy a risky venture.
The interwar period has typically been portrayed as a virulently "isolationist" era. During the 1930s, a noninterventionist majority in Congress pushed through neutrality legislation designed to prevent a repetition of the process that drew the nation into the First World War. Yet in retrospect the interwar period was merely the "Indian summer" of noninterventionism.
The consensus began to fragment with Nazi Germany's wave of conquests. Franklin Roosevelt's administration, concluding that a Nazi-dominated Europe and a Japanese-dominated East Asia posed a threat to America's security and economic well-being, moved the United States into a de facto alliance with Britain long before the attack on Pearl Harbor produced a formal declaration of war. By the end of the Second World War, an interventionist consensus was almost as pervasive as noninterventionism had been previously. Whereas the Senate had rejected U.S. membership in the League of Nations, it approved the new United Nations Charter with only two dissenting votes.
Fearful of possible Soviet designs for global domination, a majority of Americans endorsed the containment doctrine inaugurated by the Truman administration. A vocal minority fought a rearguard action against the Truman Doctrine, the Marshall Plan, and other tangible manifestations of containment, warning, as had their forefathers, that America was undertaking onerous and dangerous obligations; but they were overwhelmed.
Perhaps no event more clearly symbolized the eclipse of noninterventionism than the decisive Senate approval of the NATO pact in 1949. NATO embodied precisely the kind of entanglement that Washington, Jefferson, and other Founders had warned against—a long-term military alliance with European nations. NATO was merely the first of a host of such alliances. Today, the United States has formal commitments to defend more than 40 nations, plus numerous informal (but entirely real) promises, such as the "special relationship" with Israel. Most recently, the "Reagan Doctrine" pledges the United States to assist anti-Soviet insurgent forces seeking to overthrow incumbent governments in several Third World countries. The contrast with the refusal to aid liberal republican movements during the 19th century could scarcely be more striking.
The Continuing Struggle
Just as the content of U.S. foreign policy has undergone a dramatic transformation in the past two centuries, so too has the manner in which it is conducted. Inevitable ambiguities in the Constitution's delineation of foreign policy powers between the executive and legislative branches created what Edward S. Corwin aptly termed "an invitation to struggle."
The most obvious change in the conduct of foreign policy has been an enormous expansion of presidential authority. Tellingly, the greatest growth of executive power has occurred during wartime or when military crisis seemed imminent. On those occasions, Congress has shown deference toward presidential initiatives.
James K. Polk provided a graphic lesson in how control over the military can, in the hands of an ambitious president, lead to an increase in executive authority. By ordering U.S. troops into disputed territory along the border between Texas and Mexico in 1846, he courted if not deliberately provoked a Mexican attack. Once American troops were being fired upon, Congress had little choice but to approve Polk's call for a declaration of war. Some historians contend that Franklin Roosevelt employed the same tactic to perfection nearly a century later.
Congress also allowed chief executives considerable latitude to use military force in "minor" incidents not involving serious danger of a large war. Thus Congress did not insist on a declaration of war when President Jefferson launched attacks against the Barbary Pirates. In the 20th century, presidents routinely deployed the Marines in various Caribbean and Central American nations, invariably without declarations of war and often without explicit congressional authorization.
Since the latter cases all occurred in a region long regarded as part of the republic's vital security zone and involved small nations that were incapable of posing a serious retaliatory threat, Congress exhibited indifference toward executive activism. But such episodes nourished, however inadvertently, notions of presidential preeminence and set precedents for unilateral actions on a far more massive scale in Korea and Vietnam.
By the 1940s, the ground had been well laid for a sustained expansion of presidential power, culminating in an "imperial presidency." Franklin Roosevelt maneuvered a reluctant nation toward war through a series of unilateral actions, transferring destroyers to Great Britain, proclaiming the western half of the Atlantic a "defense zone," ordering U.S. naval vessels to convoy British ships, and imposing a draconian trade embargo on Japan. Once the United States officially entered the war, he disregarded Congress to an even greater degree.
Major decisions reached with Britain and the Soviet Union, not only about the conduct of the war but about the postwar settlement, were concluded by executive agreement rather than treaty, excluding the Senate from the decision making.
Roosevelt's successors duplicated and even exceeded his activism. Meeting the Communist challenge and fulfilling global commitments appeared to place a premium on secrecy and decisive action. Truman responded to the Korean crisis by sending U.S. troops without even asking for an authorizing resolution, much less a declaration of war. Sen. Robert Taft angrily termed this action "a complete usurpation by the President of authority to use the Armed Forces of this country," but his views were in the minority. Truman even attempted to seize the nation's steel mills under claimed presidential authority as commander-in-chief during wartime. Although the Supreme Court struck down his thesis, that rebuke scarcely slowed the onrushing executive power juggernaut.
Both Lyndon Johnson and Richard Nixon asserted a right to conduct the Vietnam war without congressional approval. Even the Tonkin Gulf resolution was dismissed as superfluous moral support. Instead, both presidents relied on expansive interpretations of their constitutional powers as commander-in-chief and spun elaborate theories of "inherent" presidential power.
Congress finally reclaimed some of its foreign policy prerogatives in the 1970s. Passage of the War Powers Resolution, restrictions on the use of executive agreements in lieu of treaties, and provisions for congressional oversight of intelligence agencies restored at least a portion of the system of checks and balances. This congressional resurgence may prove transitory, however, as the Reagan administration has once again asserted extensive executive power over foreign policy.
From Liberty to Force
Whatever the exigencies of world affairs that have seemed to justify the foreign policy that has dominated this century, on balance the transformation from the "pacific system" lauded by Henry Clay has had an undeniably corrosive effect on the republic. It has created serious political distortions, as evidenced by the rise of an imperial presidency. It has also proved costly in other ways. Nearly 400,000 young Americans have perished in combat since the early 1940s. More than 100,000 of that number lost their lives in the Korean and Vietnam wars, whose relationship to U.S. security was, at best, obscure.
The financial cost is nearly as great. In the past four-and-a-half decades, the United States has spent nearly four trillion dollars on the military. According to government figures, more than 60 percent of America's defense budget subsidizes the security of allies and client states. This economic burden is disturbing, especially since the United States spends nearly 6.5 percent of GNP on the military while West Germany and Japan spend 3 percent and 1 percent, respectively.
Most troubling of all, interventionism has seriously compromised fundamental American values. Domestically, every major episode of foreign entanglement has, as the Founders warned, involved a discernible contraction of individual liberties (as Robert Higgs pointed out here in July, "In the Name of Emergency"). Externally, the United States has tarnished its image as a symbol of liberty by providing political, economic, and military support for an assortment of "friendly" Third World dictatorships.
Those who contend that international realities of our day require interventionism in defense of our liberty would do well to consider the words of John Quincy Adams, who warned long ago of a corruption of values if we abandoned our commitment to peaceful neutrality:
[America] well knows that by once enlisting under other banners than her own…she would involve herself beyond the power of extrication in all the wars of interest and intrigue, of individual avarice, envy, and ambition, which assume the colors and usurp the standard of freedom. The fundamental maxims of her policy would insensibly change from liberty to force.…She might become dictatress of the world. She would be no longer the ruler of her own spirit.
Ted Galen Carpenter is a historian and the director of foreign policy studies for the Cato Institute in Washington, D.C.
This article originally appeared in print under the headline "Would the Founders Freak?"