
DICTIONARY OF

American History Third Edition

EDITORIAL BOARD
Michael A. Bernstein, University of California, San Diego
Lizabeth Cohen, Harvard University
Hasia R. Diner, New York University
Graham Russell Hodges, Colgate University
David A. Hollinger, University of California, Berkeley
Frederick E. Hoxie, University of Illinois
Pauline Maier, Massachusetts Institute of Technology
Louis P. Masur, City College of New York
Andrew C. Rieser, State University of New York, Geneseo

CONSULTING EDITORS
Rolf Achilles, School of the Art Institute of Chicago
Philip J. Pauly, Rutgers University

DICTIONARY OF

American History Third Edition

Stanley I. Kutler, Editor in Chief

Volume 8 Subversion to Zuni

Dictionary of American History, Third Edition Stanley I. Kutler, Editor

© 2003 by Charles Scribner's Sons

Charles Scribner's Sons is an imprint of The Gale Group, Inc., a division of Thomson Learning, Inc. Charles Scribner's Sons® and Thomson Learning™ are trademarks used herein under license.

ALL RIGHTS RESERVED No part of this work covered by the copyright hereon may be reproduced or used in any form or by any means—graphic, electronic, or mechanical, including photocopying, recording, taping, Web distribution, or information storage retrieval systems—without the written permission of the publisher.

For more information, contact Charles Scribner’s Sons An imprint of the Gale Group 300 Park Avenue South New York, NY 10010

For permission to use material from this product, submit your request via Web at http://www.gale-edit.com/permissions, or you may download our Permissions Request form and submit your request by fax or mail to: Permissions Department The Gale Group, Inc. 27500 Drake Rd. Farmington Hills, MI 48331-3535 Permissions Hotline: 248-699-8006 or 800-877-4253, ext. 8006 Fax: 248-699-8074 or 800-762-4058

LIBRARY OF CONGRESS CATALOGING-IN-PUBLICATION DATA

Dictionary of American history / Stanley I. Kutler.—3rd ed.
p. cm.
Includes bibliographical references and index.
ISBN 0-684-80533-2 (set : alk. paper)
1. United States—History—Dictionaries. I. Kutler, Stanley I.
E174 .D52 2003
973'.03—dc21

Printed in the United States of America
10 9 8 7 6 5 4 3 2 1

CONTENTS

Volume 1
List of Maps . . . xi
Preface . . . xv
Aachen to Butler's Order No. 28

Volume 2
Cabeza de Vaca Expeditions to Demography and Demographic Trends

Volume 3
Denominationalism to Ginseng, American

Volume 4
Girl Scouts of the United States of America to Kwanzaa

Volume 5
La Follette Civil Liberties Committee Hearings to Nationalism

Volume 6
Native American Church to Pyramid Schemes

Volume 7
Quakers to Suburbanization

Volume 8
Subversion, Communist, to Zuni

Volume 9
Contents . . . v
Archival Maps . . . 1
U.S. History through Maps and Mapmaking . . . 2
Early Maps of the New World . . . 6
The Colonies . . . 12
Exploration of the American Continent . . . 19
Colonial Wars . . . 25
The Revolutionary War . . . 29
The Early Republic . . . 37
The War of 1812 . . . 42
The United States Expands . . . 45
Texas and the Mexican War . . . 52
Transportation . . . 56
Gold Rush in California . . . 59
The Civil War . . . 65
New York—The Development of a City . . . 70
Primary Source Documents . . . 79
The Colonial Period . . . 81
The Revolutionary War . . . 127
The Early Republic . . . 153
Expansion . . . 187
Slavery, Civil War, and Reconstruction . . . 267
Women's Rights . . . 325
Industry and Labor . . . 339
World War I . . . 363
The Great Depression . . . 375
World War II . . . 393
The Cold War . . . 411
Civil Rights . . . 445
The Vietnam War . . . 455
The Late Twentieth Century . . . 481

Volume 10
Directory of Contributors
Learning Guide
Index


DICTIONARY OF

American History Third Edition

S (Continued)

SUBVERSION, COMMUNIST. Defining itself by its commitment to a set of ideas and not its citizens’ ancestry or blood, the United States has long harbored a fear of subversion at the hands of enemies of democracy. The first antisubversive federal laws, the Alien and Sedition Acts of 1798, were intended to check revolutionary influences from France. In the 1830s the Anti-Masonic Party played on fears of a conspiracy of Freemasons; twenty years later distrust of Catholics and immigrants fueled the Know-Nothing Party. Abolitionists fretted about the slave conspiracy in the 1850s. During World War I, fears of enemy subversion led to passage of the Espionage Act, which was later used to prosecute antiwar and antidraft activists, and provided an excuse for widespread vigilantism. The Bureau of Investigation, later renamed the Federal Bureau of Investigation, organized a nationwide crackdown on suspected foreign anarchists and revolutionaries in 1919–1920, the so-called Palmer raids.

For much of the twentieth century the fear of communist subversion drove government officials to investigate "un-American" activities and legislate to control them. During its revolutionary periods, the Communist Party of the United States (CPUSA) openly boasted about its intent to overthrow the U.S. government and replace it with a Soviet-style regime. Even in more moderate periods, the habitual secrecy of Communist Party members generated concerns and fears of infiltration.

New York State's Lusk Committee began an inquiry into communism even before the formation of the first American Communist Party in 1919. The most famous and longest-lasting congressional body, the House Committee on Un-American Activities (HUAC), was first authorized as a special committee in 1938, following in the footsteps of earlier congressional inquiries in 1919 and 1930. One of its sponsors, Samuel Dickstein of New York (later revealed to be a source for Soviet intelligence), wanted it to focus on Nazi and fascist activities, a source of a variety of conspiracy theories and worries about domestic subversion. Under the direction of Martin Dies, however, HUAC mostly investigated communists. It became a standing committee of the House of Representatives in 1945.

The first significant congressional legislation targeting peacetime subversion since 1798 was the Alien Registration Act of 1940, also known as the Smith Act, which made it a crime to advocate or teach the overthrow of the government by force or violence. Its first victims were a group of Trotskyists, convicted in 1941, and a motley band of Nazis and fascists, whose lengthy trial during World War II ended in a mistrial. The national leadership of the CPUSA was convicted in 1948, and the Supreme Court upheld the constitutionality of the Smith Act in Dennis v. United States (1951). Six years later, in Yates v. United States, the Court effectively foreclosed further prosecutions. The Internal Security Act, passed in 1950 and usually called the McCarran Act, created the Subversive Activities Control Board, which attempted for years to compel communist and communist-front groups to register with it and reveal their members and financing. After protracted legal battles, in 1965 a divided Supreme Court found the registration provision unconstitutional.

Public fears about subversion were heightened by a series of espionage cases. In 1945 six people associated with a procommunist magazine, Amerasia, were arrested and charged with espionage. Two were fined for minor transgressions and the others never prosecuted. The case continued to fester; in 1950, following the triumph of Chinese communism, Senator Joseph McCarthy charged that John Stewart Service, one of the original defendants, was part of a cabal of communist sympathizers in the State Department who had sold out Chiang Kai-Shek.

After World War II, several defectors from Soviet intelligence, notably Igor Gouzenko and Elizabeth Bentley, alerted the FBI to widespread Soviet espionage. In 1948, Bentley and Whittaker Chambers testified before HUAC and named dozens of government employees as Soviet spies, most of whom took the Fifth Amendment and refused to answer questions. Several of the most prominent individuals, however, denied the charges, including Alger Hiss, a former high-ranking State Department official; Harry Dexter White, a former assistant secretary of the Treasury; the presidential adviser Lauchlin Currie; and Duncan Lee, formerly legal counsel to the head of the Office of Strategic Services. White died of a heart attack and the one-time chief of the Latin American division of the State Department, Laurence Duggan, committed suicide shortly after questioning. Hiss was convicted of perjury in 1950. The trial and conviction of Julius and Ethel Rosenberg for atomic espionage in 1951 further fueled fears that subversive forces had endangered U.S. national interests.

Using the Hiss case and the communist victory in China, Senator McCarthy began a campaign to purge suspected communists from government positions, accusing a host of people of subversion, notably Owen Lattimore, a one-time adviser to the State Department on China policy. When he became chairman of the Senate Committee on Government Operations in 1953, McCarthy launched a series of investigations, one of which, directed at the United States Army, eventually led to his censure by the Senate in 1954. Although congressional committees such as HUAC survived into the 1970s, they were never again as consequential as they had previously been.

Although McCarthy's charges were consistently off the mark, recently released material from Russian and American archives demonstrates that communist subversion had been a serious problem in the 1940s. Decrypted Soviet cables, collected by the top-secret Venona project beginning in 1943, were finally released in 1995 and confirmed that hundreds of Americans had spied for the USSR. Approximately 300 Americans worked for Soviet intelligence agencies during World War II; only about 125 were definitively identified by American counterintelligence, including virtually everyone named by Chambers and Bentley.

Although these identified Soviet sources lost their government positions by the end of the 1940s, the effort to uncover the others remained a high priority of counterintelligence and of an extensive loyalty and security program. The first executive order establishing such a program, instituted by President Harry Truman in 1947, authorized the discharge of government employees if "reasonable grounds" to doubt their loyalty existed and established a loyalty review board within the Civil Service Commission. President Dwight D. Eisenhower broadened the criteria for dismissal to encompass security risks. Critics charged that the procedures and criteria for determining loyalty and security were flawed. Approximately 2,700 government employees were dismissed and some 12,000 resigned between 1947 and 1956.

BIBLIOGRAPHY

Goldstein, Robert. Political Repression in Modern America from 1870 to 1976. 2d rev. ed. Urbana: University of Illinois Press, 2001.
Haynes, John Earl, and Harvey Klehr. Venona: Decoding Soviet Espionage in America. New Haven, Conn.: Yale University Press, 1999.
Powers, Richard Gid. Not Without Honor: The History of American Anticommunism. New York: Free Press, 1995.
Schrecker, Ellen. Many Are the Crimes: McCarthyism in America. Boston: Little, Brown, 1998.

Harvey Klehr

See also Anticommunism; Cold War.


SUBWAYS. See Railways, Urban, and Rapid Transit.

SUDDEN INFANT DEATH SYNDROME (SIDS), sometimes referred to as crib death, is a medical term for the decease of an apparently well infant. It describes a death that remains unexplained after all known and possible causes have been ruled out through autopsy, investigation of the scene, and review of the child's medical history. SIDS was first identified as a separate medical entity and named in 1969. SIDS causes the death of as many as 7,000 infants each year in the United States. It is the most common cause of death in children between their first month and first year of age.

SIDS more frequently affects males than females and nonwhites than whites. It affects infants born into poverty more often than those in higher-income situations. Most at risk are infants born to women with inadequate prenatal care, infants born prematurely, and infants whose mothers smoked during pregnancy or after delivery. Deaths usually occur during sleep, are more likely during cold months, and occur more frequently in infants who sleep on their stomachs than in infants who sleep on their backs. In 1994 a "Back to Sleep" campaign encouraging parents and caretakers to put babies to sleep on their backs was initiated as a cooperative effort of the U.S. Public Health Service, the American Academy of Pediatrics, the SIDS Alliance, and the Association of SIDS and Infant Mortality Programs.

The cause of SIDS is unknown. Theories include an unidentified birth defect, stress in a normal baby caused by infection or other factors, and failure to develop. Because no definitive cause can be found and because parents are totally unprepared for such a loss, the death often causes intense feelings of guilt.

BIBLIOGRAPHY

Guntheroth, Warren G. Crib Death: The Sudden Infant Death Syndrome. Armonk, N.Y.: Futura Publishing, 1995.

Jack Handler / f. b.

See also Childhood; Maternal and Child Health Care; Smoking.

SUEZ CRISIS. In the summer of 1956, British, French, and Israeli leaders deemed the conduct of Egyptian president Gamal Abdel Nasser as provocative. The European powers were humiliated by his nationalization of their shares in the Universal Suez Canal Company. They were concerned about Nasser's increasing contacts with the Soviet bloc, his founding role in the nonaligned movement, and his opposition to European influence in the Arab world, especially to French colonial rule in Algeria. Israel, agitated over continuous cross-border infiltration from Egypt and the blockade of maritime routes in the Red Sea and the Suez Canal, found more cause for worry in Egypt's forthcoming arms supplies from the Soviet bloc. Thus, Britain, France, and Israel joined forces in a surprise attack on Egypt, triggering the Suez Crisis on 29 October 1956. On that day, Israel invaded the Sinai peninsula and the Gaza Strip. Two days later the British and the French bombed major Egyptian cities, then conquered the Suez Canal area in early November.

The Americans and the Soviets collaborated to denounce and reverse what they viewed as a gross violation of Egyptian sovereignty by colonial powers. Using their military, economic, and political supremacy, they forced a cease-fire, then a full withdrawal of British, French, and Israeli troops. The United States gave assurances to Israel that it would enjoy safe passage in the Gulf of Aqaba, and UN peacekeepers were deployed as a buffer between Egypt and Israel. The United States also was concerned about Soviet encroachments in the Arab world. President Dwight D. Eisenhower was personally incensed with the timing of the crisis, which occurred in the final days of his reelection campaign and during a crisis in Hungary, where the Soviets were crushing a regime that wanted to leave the Warsaw Pact. He felt betrayed by Britain and France and viewed the assault as a challenge to his authority, but he also wanted to preserve Western influence.

The proximity of interests between the United States and Egypt was temporary given Nasser's aspirations to lead the Arabs and to display "positive neutrality" in the Cold War. Already, on 5 January 1957, the Eisenhower Doctrine, approved by the U.S. Senate two months later, promised U.S. economic assistance, military support, and even armed intervention to sustain regimes opposed to "international communism" in the Middle East. Egypt, a recipient of Soviet aid, quickly became a U.S. antagonist over influence in Syria, Iraq, and Jordan.

BIBLIOGRAPHY

Holland, Matthew. America and Egypt: From Roosevelt to Eisenhower. Westport, Conn.: Praeger, 1996.
Sneh, Itai. "Viewpoint: The Antagonism between the United States and Egypt Arose from the American View of Gamal Abdel Nasser as a Soviet Puppet." In Benjamin Frankel, ed., History in Dispute: Political and Social Movements, Vol. 2: American Social and Political Movements, 1945–2000: Pursuit of Liberty. Detroit, Mich.: St. James Press, 2000.
Woodward, Peter. Nasser. London and New York: Longman, 1992.

Itai Sneh

See also Arab Nations, Relations with; Egypt, Relations with.

SUFFOLK BANKING SYSTEM was a note clearing system for New England banks established in 1826 by Boston's Suffolk Bank. Membership required a bank to maintain a noninterest-bearing deposit with Suffolk. Deposits of notes of member banks were credited and debited at par to members' accounts. Since clearing was on a net basis, membership permitted banks to economize on specie. By 1838, virtually all New England banks had joined the system. Under the system, notes of New England banks circulated at par in that region. A newly formed competitor, the Bank of Mutual Redemption, drove out the system in 1858.

BIBLIOGRAPHY

Rolnick, Arthur J., Bruce D. Smith, and Warren E. Weber. "Lessons from a Laissez-Faire Payments System: The Suffolk Banking System (1825–58)." Federal Reserve Bank of St. Louis Review 80, no. 3 (May/June 1998): 105–116.
Whitney, D. R. The Suffolk Bank. Cambridge, Mass.: Riverside Press, 1878.

Warren E. Weber

See also Banking.

SUFFOLK RESOLVES. In defiance of the ban on town meetings, delegates from the towns of Suffolk County, Massachusetts (including Boston), met at private homes in Dedham on 6 September 1774 and in Milton on 9 September to plan resistance against the Coercive Acts. A committee, headed by Joseph Warren, was charged with drawing up an address to Governor Thomas Gage and resolves to send to the Continental Congress. Warren, the primary author, argued that liberties guaranteed by the British constitution and the laws of nature applied equally to Britons on the home island and in the colonies, and could not be thrown aside because of the "arbitrary will of a licentious minister." Thus, the convention recommended measures that included calling a provincial congress; withholding public funds from the treasury; nonintercourse with Britain, Ireland, and the West Indies; and raising a defensive militia. All these measures were intended to pressure Parliament into repealing the Coercive Acts. The resolves asserted the colonists' loyalty to George III and did not mention independence.

Paul Revere carried the resolves to the Continental Congress, which rallied to support Massachusetts, unanimously endorsing the resolves on 17 September. The Suffolk Resolves laid out a clear ideological justification for resistance and a plan of action for the Continental Congress to follow.

BIBLIOGRAPHY

Bailyn, Bernard. The Ideological Origins of the American Revolution. Enlarged ed. Cambridge, Mass.: Harvard University Press, 1991.
Brown, Richard D. Revolutionary Politics in Massachusetts: The Boston Committee of Correspondence and the Towns, 1772–1774. Cambridge, Mass.: Harvard University Press, 1970.
Ford, Worthington C., et al., eds. Journals of the Continental Congress, 1774–1789. 34 vols. Washington, D.C.: GPO, 1904–1937. (Volume 1 contains the full text of the Suffolk Resolves and the address to Governor Gage. Available on-line through the Library of Congress at http://memory.loc.gov.)
Maier, Pauline. From Resistance to Revolution: Colonial Radicals and the Development of American Opposition to Britain, 1765–1776. New York: Knopf, 1972.
Morgan, Edmund S. The Birth of the Republic, 1763–1789. 3d ed. Chicago: University of Chicago Press, 1992.

Aaron J. Palmer

SUFFRAGE

This entry includes 5 subentries:
Overview
Exclusion from the Suffrage
Colonial Suffrage
African American Suffrage
Woman's Suffrage

OVERVIEW

Suffrage, the right to vote on public matters, predates American history by several thousand years. Since the founding of the American colonies, definition of the breadth of suffrage has reflected a tension between the desire to legitimize political authority by permitting expressions of consent through public acts of voting and the desires and demands of various groups and individuals for public recognition and the opportunity to participate in the selection of political representatives and governmental policies. No clearer and more distinctly American example of this tension can be offered than the election of the first legislative assembly in the colony of Virginia in 1619. Nine days before the scheduled election of this representative assembly, "the Polonians resident in Virginia" successfully protested their exclusion from the upcoming election until it was agreed "that they shall be enfranchised, and made as free as any inhabitant there whatsoever."

Since 1776 the political definition of the right to vote has been contested principally over the conceptual boundary dividing eligible and ineligible voters. Until the twentieth century, state governments were largely responsible for the determination of this boundary, although local election officials participated in the enforcement and not uncommonly capricious interpretation of this legal definition. From the early national years to the Civil War, states were free to deny the right to vote with regard to a wide range of conditions, including gender, religion, race and ethnicity, citizenship, residency, tax status, wealth, literacy, mental competence, criminal conviction, and military service. States imposed and then abandoned many of these restrictions. Several states, however, never sanctioned religious or racial restrictions, and New Jersey granted women the right to vote from 1776 until 1807. Only three groups have consistently been deemed ineligible to vote: enslaved persons until 1865, and minors and nonresidents to the present.


The U.S. Constitution also has contributed to the definition of the right to vote. Article 1 requires that those deemed eligible to vote for members of a lower state legislative body are eligible to vote for members of the U.S. House of Representatives. The Seventeenth Amendment (1913) extends this requirement to U.S. Senate elections. The Fourteenth Amendment (1868) offers an incentive for states to expand their voter rolls by promising to reduce a state's representation in the U.S. House and the Electoral College in proportion to the number of male citizens over twenty-one years whose voting rights are denied or abridged. Congress and the U.S. Supreme Court have never enforced this constitutional provision. The Fifteenth Amendment (1870) prohibits states from denying any citizen the right to vote "on account of race, color or previous condition of servitude." This provision, however, was not enforced nationally until Congress enacted the 1965 Voting Rights Act. The Nineteenth Amendment (1920) prohibits the United States or the states from denying or abridging the privilege of voting "on account of sex." The Twenty-fourth Amendment (1964) prohibits states from collecting poll taxes from voters in presidential elections, and the Twenty-sixth Amendment (1971) lowers the minimum voting age to eighteen years.

Although great advances have been made to broaden the suffrage by expanding and enforcing the concept of voter eligibility, the history of voting in the United States still is overshadowed by the history of nonvoting. Indeed, whereas less than 20 percent of the population participated in national and state elections prior to 1920, the level of voter participation has exceeded 40 percent of the U.S. population only once, in 1992. Moreover, barely over half of all eligible voters vote in presidential election years and substantially less than this vote in nonpresidential election years.

BIBLIOGRAPHY

Keyssar, Alexander. The Right to Vote: The Contested History of Democracy in the United States. New York: Basic Books, 2000.
Kromkowski, Charles. Recreating the American Republic: Rules of Apportionment, Constitutional Change, and American Political Development, 1700–1870. New York: Cambridge University Press, 2002.

Charles A. Kromkowski

See also Voting Rights Act of 1965.

EXCLUSION FROM THE SUFFRAGE

It is generally estimated that because of state property and taxpaying qualifications, fewer than one-fourth of all white adult males were eligible to vote in 1787–1789, the time the U.S. Constitution was being ratified. The history of the suffrage in the United States since then has been one of steady expansion, partly through constitutional amendments and partly through legislation. The states largely abandoned the property qualifications for voting by 1850. The Fifteenth Amendment, ratified in 1870, forbade denial of the right to vote "on account of race, color, or previous condition of servitude." The Nineteenth Amendment, which was adopted in 1920, prohibited denial of the right to vote on account of sex. The poll tax was outlawed for federal elections by the Twenty-Fourth Amendment and for state elections by the Supreme Court decision in Harper v. Virginia Board of Elections. The Twenty-Sixth Amendment, ratified in 1971, lowered the age limit for all federal and state voting to eighteen.

Various obstacles to African American suffrage were progressively eliminated by Supreme Court decisions—for example, the white primary in 1944 (Smith v. Allwright) and the "reasonable interpretation" of the Constitution test in 1965 (Louisiana v. United States)—and by federal legislation, notably the Voting Rights Act of 1965, which outlawed literacy, educational, "good character," and voucher devices aimed at keeping black suffrage to a minimum. Thus, by 1972, all persons over eighteen, of whatever sex, color, or race, were legally entitled to vote. The remaining obstacles to voting were largely administrative in character and related to such matters as registration procedures and the times, places, and manner of holding elections.

[Figure: U.S. Presidential Elections: Turnout, 1788–2000 (As a Percentage of Total Population)]

BIBLIOGRAPHY

Branch, Taylor. Parting the Waters: America in the King Years, 1954–63. New York: Simon and Schuster, 1988.
Mann, Robert. The Walls of Jericho: Lyndon Johnson, Hubert Humphrey, Richard Russell, and the Struggle for Civil Rights. New York: Harcourt Brace, 1996.
Phillips, Kevin, and Paul H. Blackman. Electoral Reform and Voter Participation: Federal Registration, a False Remedy for Voter Apathy. Washington, D.C.: American Enterprise Institute for Public Policy Research, 1975.
Weisbrot, Robert. Freedom Bound: A History of America's Civil Rights Movement. New York: Norton, 1990.
Williamson, Chilton. American Suffrage: From Property to Democracy, 1760–1860. Princeton, N.J.: Princeton University Press, 1960.

David Fellman / t. g.



See also Ballot; Literacy Test; Massachusetts Ballot; Preferential Voting; Primary, White; Suffrage: Colonial Suffrage; Wade-Davis Bill.

COLONIAL SUFFRAGE

Neither the extent nor the exercise of suffrage in colonial America can be described precisely. Voting qualifications were fixed by each colony, and in many, the requirements were changed during the colonial period. The generally accepted philosophy was the English concept that only those with "a stake in society" should vote. Each colony established some property qualification for voting for the lower house of the provincial legislature, and in each colony the upper house was almost always appointed.

The definition of freeholder in the colonies varied from colony to colony. In New York, a freehold was an estate worth forty British pounds or bearing forty shillings rent; other colonies fixed acreage rather than money definitions for the term "freehold": one hundred acres in New Jersey; fifty acres in the Carolinas, Georgia, Maryland, Pennsylvania, and Delaware. Many colonies had alternatives to landholding as a suffrage qualification, usually the possession of other property but sometimes mere taxpaying. An added complication was the numerous separate qualifications established for dwellers in towns and boroughs, usually lower and more liberal than the general provincial franchise. Virginia town dwellers could vote by virtue of possession of a house and lot, and in North Carolina, all taxpaying tenants and homeowners in towns and boroughs were voters. New England town qualifications were bewilderingly varied, the net effect being to admit virtually all the adult male inhabitants to the franchise.

Limitations of race, sex, age, and residence were more often the result of custom than of law. Generally, Jews and Roman Catholics were barred, usually by their inability to take the English test oaths with regard to the Anglican Church. Maryland and New York specifically barred Catholics by statute, and New York excluded Jews by law in 1737. These prohibitions were not always enforced. Jews appear on New York City voting lists in 1768 and 1769, and Catholics voted in Virginia in 1741 and 1751. Women were excluded by statute only in four colonies, but there is rare evidence that any ever voted anywhere. The age qualification was almost universally twenty-one, but in Massachusetts, suffrage was confined to twenty-four-year-olds in the seventeenth century and sometimes extended to nineteen-year-olds in the eighteenth century. Pennsylvania's two-year residence requirement was the most stringent; other colonies usually demanded six months or a year. Slaves and indentured servants were invariably denied the franchise, and in the Carolinas, Georgia, and Virginia, freed blacks as well. Indians did vote at times in Massachusetts.

The number of adult males who qualified as voters under these requirements can only be estimated. Probably 50 to 75 percent of the adult male population could qualify as freeholders, and in some colonies up to 80 or 90 percent as freeholders or freemen. The relative ease of obtaining land in America and the high rate of wages opened the door fairly wide to those persons who sought the franchise. Suffrage limitations do not appear to have been a grievance in any of the popular protest movements that developed during the colonial period. On the other hand, this rather broadly based electorate usually voted into office a narrowly based leadership and deferred to its judgment in running the colonies' political affairs.

BIBLIOGRAPHY

Bailyn, Bernard. The Origins of American Politics. New York: Knopf, 1968.
Maier, Pauline. From Resistance to Revolution: Colonial Radicals and the Development of American Opposition to Britain, 1765–1776. New York: Knopf, 1972.
Williamson, Chilton. American Suffrage: From Property to Democracy, 1760–1860. Princeton, N.J.: Princeton University Press, 1960.
Wood, Gordon S. The Radicalism of the American Revolution. New York: Knopf, 1992.

Milton M. Klein / a. g.

See also Ballot; Literacy Test; Massachusetts Ballot; Preferential Voting; Primary, White; Suffrage: Exclusion from the Suffrage.

AFRICAN AMERICAN SUFFRAGE

Throughout the American colonial era, racial distinctions were not the principal legal or conventional means employed to restrict the right to vote or to hold office. The concepts of "freeman" and "freeholder," as well as gender, age, religion, and wealth requirements, ensured that the number of individuals eligible to vote remained small relative to the population. Among those eligible, however, adult African American male propertyholders were permitted to and did cast ballots in at least South Carolina, North Carolina, and Virginia. In the eighteenth century, a few colonies devised and adopted race-based restrictions, but the right to vote in other colonies remained free of similar limitations.

The American Revolution did not prompt a radical redefinition of the right to vote. In 1786, only two of the original thirteen states, Georgia and South Carolina, expressly restricted voting privileges to the eligible white population. The U.S. Constitution, written in 1787, recognized the authority of the states to define the right to vote. Between 1776 and 1860, about one-third of the states permitted voting by free African American adult males. Race-based voter eligibility restrictions became increasingly more common in the nineteenth century, especially among the states admitted into the Union after 1787. Among the twenty non-original states added before the American Civil War, only Vermont, Maine, and temporarily Kentucky (1792–1799) and Tennessee (1796–1834) did not explicitly limit voting rights to "free," "white" males.


African American Voter. While others wait behind him, a man takes part in the election process—after a century of actions, legal and otherwise, in some parts of the country to deprive blacks of their constitutional right. Library of Congress

Ironically, as states gradually broadened their electorates by abandoning their original property, tax payment, and religion requirements, many added explicitly racialist definitions of the right to vote into their state constitutions. The 1858 Oregon constitution, for example, expressly prescribed that “No Negro, Chinaman, or mulatto, shall have the right of suffrage.” By 1860, free African American adult males were legally able to vote in only five states. A sixth state, New York, imposed racially based registration and property restrictions in 1811 and 1821, effectively curtailing African American voting. In 1849, Wisconsin voters approved a legislature-endorsed act extending the right to vote to African American males, but a state elections board refused to recognize the new eligibility standard; as a result, this statutory grant did not become effective until after the Civil War. Although formidable, constitutional, statutory, and administrative bars against voting were not always fully enforced, especially at the local level. Indeed African Americans voted in Maryland as late as 1810, although they were denied the right in a 1783 statute and again by an amendment in 1801 to the state constitution; and John Mercer Langston, though denied voting rights by Ohio’s constitution, was elected a township clerk in 1855, thereby becoming the first African American elected official in the United States.

Reconstruction

Neither the Civil War nor ratification of the Thirteenth Amendment (1865), which banned slavery, altered the racially discriminatory prewar understanding of the right to vote. In the South, this recalcitrance was not surprising; until the federal government imposed military rule over the region in 1867, these states were governed by many of the proslavery state political elites who had engineered and supported secession in 1860 and 1861. Suffrage reform, it must be noted, also was not forthcoming in many Northern states. In the immediate wake of the Civil War, legislatures and voters in nine Northern states rejected state constitutional amendments that extended voting rights to African Americans.

Abolitionist activists like Frederick Douglass and members of the Republican-controlled U.S. Congress, however, continued to push for national suffrage reform, advocating and enacting numerous Reconstruction acts and the Fourteenth Amendment (1868). The latter constitutional amendment recognized African Americans as full citizens of the United States, guaranteed all persons equal protection of the law, and provided a mechanism for reducing a state's federal representation if it denied or abridged voting rights to any eligible male voters. Congress never used the latter mechanism, but it made ratification of the Fourteenth Amendment a precondition for readmission of each secessionist state into Congress.



[Figure 1: Average Southern State Voter Participation Rates in Gubernatorial Elections, 1859–1919 (vertical axis: voters per population). NOTE: Includes data from Alabama, Arkansas, Georgia, Florida, Louisiana, Mississippi, North Carolina, South Carolina, Tennessee, Texas, and Virginia. Intradecennial state population based on straight-line interdecennial interpolation of U.S. Census population data.]

Under the leadership of the Republicans, Congress additionally enacted the Fifteenth Amendment. Ratified in 1870, this amendment barred states from denying or abridging the right to vote on account of race, color, or previous condition of servitude, and it empowered Congress with the legislative authority to enforce this amendment.

The federal government's Reconstruction program and commitment to African American voting rights supported dramatic changes in the states that had been placed under military rule. By 1868, more than 800,000 African Americans had registered to vote as did approximately 660,000 eligible white voters. In addition to exercising their newly acquired right to vote, African American males also participated in political party and state constitutional conventions, and as elected and appointed state and local government officials. Between 1869 and 1901, twenty African Americans also served as U.S. Representatives and Blanche K. Bruce and Hiram R. Revels represented Mississippi as the first and only African American U.S. senators until Edward Brooke represented Massachusetts in the Senate from 1967 until 1979.

These electoral reforms and political achievements, however, were repeatedly resisted, tempered, and eventually overcome by the organized violence, voter intimidation, and electoral fraud tactics employed by white supremacist groups like the Ku Klux Klan and their various political supporters. In Louisiana alone, more than 2,000 were killed or injured before the 1868 presidential election.

The same year in Georgia, white legislators gained control over the state legislature by fraudulently expelling thirty legally elected African American state legislators. Congress responded to these and similar events, compiling testimony from the individuals affected, proposing the Fifteenth Amendment, and enacting additional enforcement legislation in 1870, 1871, and the Civil Rights Act of 1875.

Resistance to African American suffrage continued in the South, becoming politically acceptable and increasingly invidious with each success. The federal government's role in the Reconstruction of the South also decreased after the contested 1876 presidential election of Republican Rutherford B. Hayes and the subsequent withdrawal of federal supervision and military protection of the right to vote. Over the next four decades, southern state legislatures, governors, judiciaries, and numerous local governments systematically enacted and supported segregationist policies and electoral devices designed to accomplish under the cover of law what the Fifteenth Amendment expressly prohibited. These devices included locally administered registration laws; literacy, understanding of the Constitution, and character tests; cumulative poll taxes; criminal disenfranchisements; white party primary elections; closed political party membership rules; racially skewed redistricting plans; and so-called grandfather clauses, which effectively exempted some white voters from state voter restrictions. As a result of these exclusionary devices and practices, the number and political weight of African American voters declined substantially in every Southern state and the region fell under the one-party political domination of the Democratic Party. As Figure 1 reveals, the exclusion of African Americans from the electorate and the concomitant loss of party competition throughout the South depressed voter turnout from the 1870s to 1919.

The Twentieth Century

At the beginning of the twentieth century, civil rights activists like W. E. B. Du Bois and civil rights organizations like the National Association for the Advancement of Colored People (NAACP), established in 1909, initiated and sustained more organized private efforts to protect and to restore African American civil rights. Many of the initial successes of these efforts were achieved in litigation that challenged the constitutionality of state voting restrictions. In Guinn and Beal v. United States (1915), for example, the U.S. Supreme Court upheld Oklahoma literacy tests but found the state's grandfather clause to be an unconstitutional attempt to evade the Fifteenth Amendment. In Nixon v. Herndon (1927) and Nixon v. Condon (1932), the Court determined that all-white primary systems were unconstitutional if they were authorized by state action. Subsequently, in U.S. v. Classic (1941) and Smith v. Allwright (1944), the Supreme Court ruled against the constitutionality of all-white primary elections. In Gomillion v. Lightfoot (1960), the Court furthered the dismantling of state-supported disfranchisement schemes when it struck down a local Alabama redistricting plan that intentionally discriminated against African Americans.

Many factors, including long-term emigration patterns and the immediate need to desegregate the U.S. military during World War II, renewed congressional interest in civil and voting rights reform in the 1940s. In 1942, Congress exempted U.S. soldiers from state voter poll taxes, but state senators in the South repeatedly rejected or filibustered legislative efforts to broaden civil rights guarantees over the next two decades. Despite the setbacks in Congress, civil rights and voting rights reformers pursued their goals by mobilizing and orchestrating public protests and demonstrations throughout the South. Finally, in 1957 and 1960, Congress managed to enact two new Civil Rights Acts. The legislation created the United States Civil Rights Commission and authorized litigation by the U.S. Attorney General against voting rights violations. The Commission proved especially useful because it gathered and reported statistics that detailed the extent to which African Americans remained excluded from participating in U.S. elections. In 1962, Congress responded by endorsing the Twenty-fourth Amendment, which, when ratified two years later, banned state poll taxes required for voting in federal elections. Civil rights demonstrations and voter registration drives continued throughout the late 1950s and early 1960s, although they often were met with local and sometimes lethal levels of violence.

In 1964, Congress enacted a more expansive and enforceable Civil Rights Act, and in the aftermath of nationally televised attacks against a peaceful civil rights march in Selma, Alabama, President Lyndon Johnson and Congress approved the 1965 Voting Rights Act. The act banned literacy tests and other racially discriminatory devices, and it guaranteed direct federal supervision of voter registration, voting procedures, and elections in seven southern states and several nonsouthern states as well. In 1966, the U.S. Supreme Court upheld the constitutionality of the Voting Rights Act and extended the ban on poll taxes to state elections. Congress amended and extended the protections of the Voting Rights Act in 1970, 1975, and 1982.

In the states and jurisdictions covered by the act, the 1965 Voting Rights Act and its amendments had immediate and lasting effects upon African American voter registration, electoral participation, and political officeholding. In Mississippi, for example, the voting age percentage of the nonwhite population registered to vote increased from 6.7 to 59.8 percent between 1965 and 1967. Today African Americans register and vote at rates approximately similar to other ethnic groups. Federal protection of African American voting rights also has supported increases in the number of African American elected officials. In 1967, more than 200 African Americans were elected to state and local offices in southern states—twice the number elected before the act. Today, there are thirty-eight African American members of Congress, almost 600 African American state legislators, and more than 8,400 locally elected officials.

BIBLIOGRAPHY

Davidson, Chandler, and Bernard Grofman, eds. Quiet Revolution in the South: The Impact of the Voting Rights Act, 1965–1990. Princeton, N.J.: Princeton University Press, 1994.
Goldman, Robert M. Reconstruction and Black Suffrage: Losing the Vote in Reese and Cruikshank. Lawrence: University Press of Kansas, 2001.
Keyssar, Alexander. The Right to Vote: The Contested History of Democracy in the United States. New York: Basic Books, 2000.
Kousser, J. Morgan. Colorblind Injustice: Minority Voting Rights and the Undoing of the Second Reconstruction. Chapel Hill: University of North Carolina Press, 1999.
Kromkowski, Charles. Recreating the American Republic: Rules of Apportionment, Constitutional Change, and American Political Development, 1700–1870. New York: Cambridge University Press, 2002.

Charles A. Kromkowski

See also Civil Rights Act of 1957; Civil Rights Act of 1964; Civil Rights Movement; Reconstruction; Voting Rights Act of 1965; and vol. 9: An Interview with Fannie Lou Hamer.

WOMAN'S SUFFRAGE

The history of woman's suffrage in America begins with a seventeenth-century businesswoman, Margaret Brent. Brent, a Catholic immigrant to the colony of Maryland, was a property owner and the executrix and attorney of the estate of the Maryland governor Leonard Calvert.

In 1648, Brent demanded the right to two votes in the Maryland General Assembly. The first vote she claimed for herself, the second as the legal representative of the extensive Calvert estate. At the time, the colony faced political uncertainty caused by financial problems and a considerable amount of religious strife, and the General Assembly denied her claim to both votes. Brent protested her exclusion and the subsequent proceedings of the assembly, and she soon moved and settled permanently in Virginia. Although Brent's original bid for voting rights failed, women voted in several eighteenth-century colonial elections. The available evidence suggests that all of these women were widowed property owners.

Voting Rights from the Revolution to Reconstruction

After 1776, a larger but still comparatively small number of women voted more regularly in New Jersey elections until 1807, when the state amended its constitution to expressly prohibit woman's suffrage. Thereafter and with few exceptions until 1869, American women were barred from voting in all federal, state, and local elections. One noteworthy local exception to this exclusionary past was Kentucky's 1838 grant permitting voting privileges in school board elections to all propertied widows with school-age children.

Efforts to gain the right to vote for American women advanced in 1848 with the calling of a convention to meet in Seneca Falls, New York, to discuss the "social, civil and religious rights of women." Organized by Elizabeth Cady Stanton, Lucretia Mott, and others, and inspired by the abolitionist movement and the ideals of Quakerism and the Declaration of Independence, more than three hundred women and men attended. The Seneca Falls Convention included numerous speeches and Stanton and Mott's famous "Declaration of Sentiments," which proclaimed "that all men and women are created equal." Participants also resolved "it is the duty of the women of this country to secure to themselves their sacred right to the elective franchise." Similar conventions were held in the 1850s, promoting greater public awareness and a network of suffrage advocates and supporters. Still, women's suffragists had limited political success before the outbreak of the Civil War. In fact, only Michigan in 1855 and Kansas in 1861 extended school board election voting rights to women, and the Kansas Supreme Court voided the latter right in 1875.

The end of the war and the concomitant need for fundamental changes in the United States and many state constitutions created opportunities for many types of social, economic, and political change.

Woman’s Suffrage. Five suffragists in New York City promote a march in May 1912, to take place “rain or shine”; in 1917, New York State became the first state in the East to approve equal suffrage for women. Library of Congress



Woman's suffragists lobbied members of Congress, state legislators, Republican party leaders, and abolitionist groups with the hope of garnering support for their cause. Despite these efforts, neither Congress nor the others advocated extending voting rights to women in any of the Reconstruction amendments proposed and subsequently added to the U.S. Constitution. Indeed, the Fourteenth Amendment (1868) explicitly recognizes the power of states to deny the right to vote to "male inhabitants," a gender-specific description of voting rights not found in the original Constitution that must have discouraged woman's suffragists and intensified their subsequent lobbying efforts in Congress. Interestingly, the Fifteenth Amendment (1870) employed gender-neutral language, barring state denial or abridgment of "the right of citizens of the United States to vote" based upon race, color, or previous condition of servitude—thus leaving open the possibility for future state extensions of the right to vote to women.

Woman's Suffrage Organizations

Failure to achieve support in Congress for a constitutional right to vote divided woman's suffrage activists for the next twenty years. In 1869, Elizabeth Stanton, Susan B. Anthony, and others established the National Woman's Suffrage Association (NWSA). Unsatisfied with the results of their initial lobbying efforts, Stanton, Anthony, and the NWSA withheld support for the ratification of the Fourteenth and Fifteenth Amendments, thereby severing themselves from other suffragists as well as many of their former abolitionist and Republican allies. Under the leadership of Stanton and Anthony, the NWSA continued to work for a national constitutional amendment, focusing most of the energies and talents of the organization upon lobbying the United States Congress. These organizational investments, however, yielded both mixed and modest results. For example, between 1869 and 1888 members of Congress submitted eighteen constitutional amendments designed to extend voting rights to women, yet most of these proposals received little consideration and none won legislative approval in either the House or the Senate.

Outside of Congress, the NWSA experimented with other tactics, including a reform strategy involving civil disobedience and the federal judiciary. In 1872, Anthony and others succeeded in their efforts to be arrested for attempting to vote in state elections. Their trials attracted a considerable amount of attention to the suffrage movement and, in one case, a U.S. Supreme Court decision, Minor v. Happersett (1875). In Minor, however, the Court decisively rejected the claim that the term "citizens" in the Fourteenth Amendment granted the right to vote to women. The Court's decision was another setback for the NWSA, and it also signaled the Court's subsequent and similarly narrow reading of the individual rights protected by the Fifteenth Amendment.

White House Protest. A suffragist in 1917 calls on President Woodrow Wilson to support woman's suffrage; he did so in January 1918, after months of White House picketing, hunger strikes by some of the arrested women, and a visit by Wilson to militant leader Alice Paul in her Virginia jail cell. © Corbis

Suffrage advocates not aligned with the NWSA pursued their reform agenda within other organizations, including the American Woman's Suffrage Association (AWSA). Established in 1869, the AWSA directed most of its efforts toward achieving state suffrage reforms. Like the NWSA, the AWSA achieved limited success in its first twenty years. By 1889, women could vote in school-related elections in about twenty states and territorial governments; in four territorial states—Wyoming (1869), Utah (1870), Washington (1883), and Montana (1887)—women possessed equivalent voting rights with men.

Unification of the NWSA and AWSA in 1890 produced the National American Woman Suffrage Association (NAWSA), but during the next two decades the new organization achieved limited success. Although additional states extended woman's suffrage in school, municipal, tax, or bond elections, by 1910 only five states—Wyoming (1890), Colorado (1893), Utah (1896), Idaho (1896), and Washington (1910)—guaranteed women the right to vote in all elections. Despite these limited results, the NAWSA and various state organizations persisted with their lobbying and grassroots efforts. The persistence paid greater dividends in the 1910s as other social, economic, and political conditions fortuitously converged to accelerate the progress of the woman's suffrage movement.



An early indicator of this future was President William H. Taft's decision to speak at the NAWSA 1910 annual convention. Taft declined to offer an explicit endorsement of woman's suffrage, but his presence and speech sent a different message to both the public and NAWSA members. Another significant indicator was the Progressive party's public endorsement of woman's suffrage in 1912, for although it yielded limited immediate results, the endorsement underscored the long-term electoral and partisan stakes associated with the reform's enactment. Woman's suffragists, to be sure, also benefited greatly from the new environments created by industrialization and urbanization and from increased public interest in political reform and other social movements. By 1917, not only had the NAWSA membership increased to 2 million, but twelve additional states had approved woman's suffrage since 1910, increasing the total to seventeen states and adding both legitimacy and electoral consequences to the suffrage reform.

Throughout the decade, and especially after 1915, leaders of national woman's suffrage organizations like Carrie Chapman Catt of the NAWSA and Alice Paul and Lucy Burns of the Congressional Union, an organization established in 1913, began to focus their efforts upon winning congressional approval of an amendment to the U.S. Constitution. In addition to conducting a traditional lobbying campaign, the NAWSA and other organizations employed many of the tactics successfully used to achieve state constitutional reforms: authorizing and orchestrating mass marches, petition campaigns, and political candidate endorsements designed to exert electoral pressures upon the national political parties and members of Congress. In 1917, the National Women's Party, another new and decidedly more militant woman's suffrage organization, initiated a series of widely publicized protests and arrests at the White House. Many of the protesters chained themselves to the White House fence and some went on hunger strikes during their imprisonment.

By January 1918, the combination of these various efforts with others associated with the United States involvement in World War I set the conditions within which President Woodrow Wilson issued his endorsement of a national constitutional amendment. The U.S. House of Representatives quickly followed the president, agreeing by the required two-thirds majority to send the woman's suffrage amendment on to the states for ratification. The Senate, however, balked initially, stalling the amendment in Congress until June 1919, when it, too, finally endorsed the Nineteenth Amendment. Slightly more than a year later the thirty-sixth state, or the three-quarters of the states required by the U.S. Constitution, ratified the Nineteenth Amendment. The amendment, in part, provides that "The right of citizens of the United States to vote shall not be denied or abridged by the United States or by any State on account of sex."


Ironically, ratification of the Nineteenth Amendment did not produce dramatic national- or state-level changes in policies or party affiliation. The Nineteenth Amendment, however, did have immediate and permanent effects upon the American political landscape, bolstering its democratic characteristics and tendencies by nearly doubling the number of voters in almost every election except those occurring in southern states.

BIBLIOGRAPHY

Andersen, Kristi. After Suffrage: Women in Partisan and Electoral Politics before the New Deal. Chicago: University of Chicago Press, 1996.
Dinkin, Robert J. Voting in Provincial America: A Study of Elections in the Thirteen Colonies, 1689–1776. Westport, Conn.: Westview, 1977.
DuBois, Ellen Carol. Feminism and Suffrage: The Emergence of an Independent Women’s Movement in America, 1848–1869. Ithaca, N.Y.: Cornell University Press, 1978.
Harvey, Anna L. Votes Without Leverage: Women in American Electoral Politics, 1920–1970. New York: Cambridge University Press, 1998.
Keyssar, Alexander. The Right to Vote: The Contested History of Democracy in the United States. New York: Basic Books, 2000.
Kromkowski, Charles. Recreating the American Republic: Rules of Apportionment, Constitutional Change and American Political Development, 1700–1870. New York: Cambridge University Press, 2002.

Charles A. Kromkowski
See also vol. 9: Path Breaking; The Seneca Falls Declaration of Rights and Sentiments.

SUGAR ACTS were parliamentary measures designed to increase Great Britain’s profits from the lucrative West Indian and North American sugar trade. Throughout the American colonial period the British Empire depended on the West Indies for sugar. Wealthy sugar planters who resided in England used their political influence to bring about enactment of the Molasses Act (1733), which secured their monopoly by subjecting foreign molasses imported into any British colony to a duty of six pence per gallon. This law proved ineffective, however, in the absence of systematic measures to enforce it. In 1764 George Grenville, chancellor of the Exchequer, enacted a new sugar act, which he intended to end the smuggling trade in foreign molasses and at the same time secure revenue. The act lowered the duty on foreign molasses from six to three pence a gallon, raised the duties on foreign refined sugar, and increased the export bounty on British refined sugar bound for the colonies. These measures gave the British sugar planters an effective monopoly of the American sugar market. Smuggling of foreign sugar became unprofitable, as did the old illicit trade in foreign molasses. These changes sparked violent protests at first. Two years later, Parliament lowered the duty
to one penny a gallon, applied alike to foreign and British imports, and the protests on the molasses duty ended. At this lower rate, molasses yielded an average of £12,194 annually from 1767 to 1775. Other phases of the Sugar Act of 1764 were far more irritating to the colonists than was the lowered duty on molasses. One was a new duty on wine imported from Madeira, which had previously come in duty free and was the main source of profit for the fish and food ships returning from the Mediterranean. This part of the Sugar Act led to few direct protests, but it did produce some spectacular attempts at evasion, such as the wine-running episode in Boston involving a ship belonging to Capt. Daniel Malcolm, in February 1768. Even more provocative were measures imposing new bonding regulations that compelled shipmasters to give bond, even when they loaded their vessels with nonenumerated goods. The most controversial of these features was a provision that shipmasters had to give bond before they put any article, enumerated or nonenumerated, on board. The universal American practice, however, was to load first and then clear and give bond, which made it difficult for shipmasters to give a new bond at a customhouse before they brought every new consignment on board. Under the Sugar Act, any ship caught with any article on board before a bond covering that article had been given was subject to seizure and confiscation. The customs commissioners profited greatly from this provision. The most notorious seizures for technical violations of the bonding provision included John Hancock’s sloop Liberty (10 June 1768) and the Ann belonging to Henry Laurens of South Carolina.

BIBLIOGRAPHY

Andrews, K. R., et al. The Westward Enterprise: English Activities in Ireland, the Atlantic, and America, 1480–1650. Detroit, Mich.: Wayne State University Press, 1979.
McCusker, John J., and Russell R. Menard. The Economy of British America, 1607–1789. Chapel Hill: University of North Carolina Press, 1985.

O. M. Dickerson / s. b.
See also Colonial Commerce; Colonial Policy, British; Enumerated Commodities; Navigation Acts; Rum Trade; Smuggling, Colonial; Triangular Trade; West Indies, British and French.

SUGAR INDUSTRY dates back to the very founding of the New World, and has been intricately entangled with its history. Because of its role in the slave trade, sugar played an important role not only in the economy but also in how social relations developed in the New World. In the infamous “triangle trade,” English colonies in the Caribbean shipped sugar to England for refining, and the products went to Africa where traders exchanged them for slaves, who were brought to the Caribbean plantations to raise more sugar. Sugar plantation work was among the
most brutal and dangerous, as workers labored in oppressive heat and swampy conditions, and with dangerous tools. Brought to the New World by Christopher Columbus, sugar cane was first cultivated successfully in Louisiana around the middle of the eighteenth century. Although efforts to make sugar from the cane juice succeeded in Louisiana as early as 1760 and in Florida a few years later, until the 1790s cane was cultivated in small quantities, mainly for the manufacture of syrup and rum. The spectacular success of a wealthy Louisiana planter, Jean E´tienne Bore´, in making sugar on a substantial scale in 1795 was followed in the next years by a rapid shift of planters from indigo to sugarcane. When the United States took possession of Louisiana in 1803, there was already a small but thriving sugar industry in south Louisiana. Likewise, when the United States acquired Puerto Rico and Hawaii in 1898, sugar culture was already well established in both areas. Though slavery in the United States ended after the Civil War, sugar producers continued to keep sugar workers in slave-like conditions in parts of the South, supported by government programs. The need for cane sugar laborers was a key reason that seventeenth-century plantation owners in the Caribbean began importing slaves, and the labor-intensive character of sugar growing later encouraged planters in the U.S. South to hold large numbers of slaves. Climatic conditions in the southern United States were not as favorable for cane culture as those of the West Indies, because of shorter growing seasons and the danger of freezes. Nevertheless, as a result of the availability of enslaved workers, a protective tariff, the introduction of coldresistant cane varieties, the adoption of steam power for grinding cane, and advances in the processes of clarification and evaporation of cane juice, the cane sugar industry grew rapidly in the years prior to the Civil War. Major improvements were made in the manufacture of sugar, including the introduction in the 1820s of steam power for crushing cane and the invention in the 1840s by Norbert Rillieux, a Louisiana Creole, of a multiple-effect system for evaporating cane juice, which replaced the open kettle boilers and revolutionized sugar manufacture. Although cane was grown for syrup mainly on small farms in South Carolina, Georgia, Florida, Alabama, Mississippi, Louisiana, Arkansas, and Texas, only on the large plantations in south Louisiana and Texas was a successful sugar industry established. In 1850, on plantations worked by slaves, the southern states produced almost 114,000 tons of cane sugar, approximately one-half of the sugar consumed in the United States. Prior to 1861, most Louisiana cane sugar was shipped to cities throughout the Mississippi Valley and the East Coast, and much of it was consumed in the form of raw sugar. Refiners in eastern cities imported raw sugar from the West Indies and, by a refining process of melting the sugar, clarifying the juice in boneblack filters, and centrifugal drying, produced a dry, white sugar.

Beets, the other principal source for the sugar industry, have only in the twentieth century become a widespread source, though attempts at making beet sugar date centuries back. Sugar beets, which probably grew wild in Asia, were cultivated at an early time in Egypt and southern Europe. A German chemist, Andreas Marggraf, demonstrated in 1747 that sugar from beets was identical with cane sugar. Early in the nineteenth century, when France was cut off from overseas sugar supplies, Napoleon Bonaparte established the sugar beet industry. Although the industry declined with Napoleon’s downfall, it gradually revived, spreading first to Germany and then to much of the rest of Europe. One reason that the beet sugar industry was established so slowly in the United States is the large amount of hand labor required in growing beets; because of where beets grew, their growers could not rely on enslaved labor. In the late nineteenth and early twentieth centuries, the cultivation of sugar beets spread throughout the central and western states from the Great Lakes to California, and in both cane and beet processing, large expensive central mills came to dominate the manufacture of sugar. Four small beet sugar factories were constructed between 1838 and 1856, but all failed. The first successful one was established by E. H. Dyer at Alvarado, California (twenty-two miles east of San Francisco), in 1870 and operated through 1967. The next successful plants were established in Watsonville, California (1888); Grand Island, Nebraska (1890); and Lehi, Utah (1891). During the 1870s, Maine and Delaware offered a bonus for beet sugar manufactured within their limits, and factories destined to operate only a few years were built at Portland and Wilmington, respectively. The Portland factory inaugurated the practice of contracting with farmers for a specific acreage of sugar beets that would be raised from seed furnished by the company. This plan of operation, adapted from French practices, has persisted to the present. Despite the activity in Maine and Delaware, production in the United States has tended to concentrate in irrigated areas in the West. By 1910 more beet than cane sugar was produced in the continental United States. In 1920 the output exceeded one million tons, and in 1972 it was about 3.5 million tons, which was more than one-fourth of the sugar consumed in the United States. In the 1970s, some sixty plants were producing beet sugar in eighteen states, with more than one-third of the total factory capacity located in California and Colorado. During the 1930s, studies began on the mechanization of growing and harvesting beets. Since World War II, mechanical devices have replaced much of the hand-cutting of cane, as machines for planting, cultivating, and harvesting beets—all requiring specialized technological changes—were developed by the beginning of World War II, and their adoption was hastened by shortages of hand labor during the war and by postwar prosperity. By the 1960s, the refining branch of the sugar industry was dominated by large corporations and was con-
centrated in coastal cities, especially New York, New Orleans, Savannah, Baltimore, Philadelphia, Boston, and San Francisco. Refiners process raw sugar from Louisiana, Florida, Hawaii, Puerto Rico, and foreign countries. Refined sugar was marketed in more than one hundred varieties of grades and packaging to meet highly specialized demands. Per capita sugar consumption in the United States increased rapidly during the twentieth century and by the 1970s had been stabilized at about one hundred pounds per year. Although sugar production in Texas ended in the 1920s, a thriving modern sugar industry emerged in Florida south of Lake Okeechobee. Since 1934 the U.S. government has assisted the sugar industry, which has a powerful lobby. Until the late twentieth century, sugar growers had access to extremely lowpaid, non-unionized immigrant workers through a federal “guest worker” program for the industry. In the early twenty-first century, the sugar industry was receiving $1.6 billion from the U.S. government. Rather than making direct payments to growers, as the Agriculture Department does in other industries, the department gives sugar processors short-term loans, and maintains high domestic prices by strictly limiting imports. Critics of this policy note that sugar consumers pay two to three times the world market price. In fiscal year 2000, domestic growers grew more than the government-set limit, and the government had to spend $465 million to buy their excess sugar and cover the cost of processors’ loan forfeitures. According to the Center for Responsive Politics, which tracks political campaign contributions, the sugar industry contributes more than one-third of the money that crop production and food processing interests spend on political campaigns. The industry has a growing U.S. market; sugar consumption has practically doubled in the past century, from 86 pounds per U.S. citizen to nearly 160. However, the fortunes of the U.S. sugar industry may change in the wake of the North American Free Trade Agreement, as Mexican imports are likely to flood the U.S. market beginning in fiscal 2004.

BIBLIOGRAPHY

Hall, Michael R. Sugar and Power in the Dominican Republic: Eisenhower, Kennedy, and the Trujillos. Westport, Conn.: Greenwood Press, 2000.
Melendy, H. Brett. Hawaii, America’s Sugar Territory, 1898–1959. Lewiston, N.Y.: Edwin Mellen Press, 1999.
Mintz, Sidney Wilfred. Sweetness and Power: The Place of Sugar in Modern History. New York: Penguin Books, 1986.
Roberts, Paul. “The Sweet Hereafter.” Harper’s Magazine 299, 1794 (November 1999): 54.
Rodrigue, John C. Reconstruction in the Cane Fields: From Slavery to Free Labor in Louisiana’s Sugar Parishes, 1862–1880. Baton Rouge: Louisiana State University Press, 2001.
Sandiford, Keith Albert. The Cultural Politics of Sugar: Caribbean Slavery and Narratives of Colonialism. New York: Cambridge University Press, 2000.

Woloson, Wendy A. Refined Tastes: Sugar, Confectionery, and Consumers in Nineteenth-Century America. Baltimore: Johns Hopkins University Press, 2002.

Wayne D. Rasmussen
J. Carlyle Sitterson / d. b.
See also Agricultural Machinery; Agricultural Price Support; Agriculture, Department of; Diets and Dieting; Plantation System of the South; Slave Trade; Subsidies.

SUMMIT CONFERENCES, U.S. AND RUSSIAN, the occasions for heads of state or government to meet directly in what is often termed “personal diplomacy.” While summits are often depicted as the opportunity for top leaders to reach breakthroughs on difficult issues their subordinates have been unable to resolve through negotiation, more often agreements signed at summit meetings are the culmination of traditional diplomatic work. Summits offer participants a chance to evaluate their counterparts in person, and allow leaders to impress domestic and international audiences with their peacemaking ability or diplomatic prowess, although the expectations they raise for dramatic progress can easily be disappointed. Every president since Franklin D. Roosevelt has met with the Soviet or Russian leadership. Although each summit meeting was marked by circumstances specific to the historical moment, one can speak roughly of four phases: wartime meetings of the Allied leaders to plan strategy during World War II; a continued multilateral approach to dealing with crucial international issues in the Dwight D. Eisenhower years; a shift to bilateral discussions of nuclear arms limitation in the 1960s through the 1980s; and the attempt to forge a new relationship in the post–Cold War era.

Allied Conferences in World War II
The first wartime summit took place from 28 November to 1 December 1943, when President Roosevelt met with Soviet Premier Joseph Stalin and British Prime Minister Winston Churchill at Tehran. Stalin pressed the Anglo-Americans to begin the promised cross-channel attack on German-held Europe, and promised to enter the war against Japan once Germany was defeated. Roosevelt proposed the creation of a postwar international organization to keep the peace, dominated by the “Four Policemen” (including China). From 4 to 11 February 1945, the three met again at the Russian Black Sea resort of Yalta. Stalin consented to a four-power occupation of Germany (including a French force) and reaffirmed his promise to enter the war against Japan. But the central issue was the postwar fate of Eastern Europe, especially Poland. Stalin soon violated the Yalta agreement to assure representative government in Poland, made without provision for enforcement. This led Roosevelt’s detractors to charge him with “betrayal” and to link the name of Yalta, like that of Munich, to appeasement, although the Yalta accords reflected the reality of the positions Allied armies had reached on the ground.

After Roosevelt’s death and Germany’s surrender, President Harry S. Truman traveled to Potsdam to meet Stalin and Churchill (replaced during the conference by Clement Attlee after his victory in British elections) from 17 July to 2 August 1945. They carved Germany into four occupation zones and settled on a policy of modest reparations and the rebuilding of Germany’s basic infrastructure, rather than seeking the country’s deindustrialization or dismemberment.

The sharpening of the Cold War after World War II brought a ten-year halt to U.S.-Soviet summit meetings. Summits were discredited in the minds of critics who believed that an ailing Roosevelt had been manipulated by a crafty Stalin at Yalta, and that there was nothing to be gained by personal diplomacy with untrustworthy rivals. Summits on International Issues The freeze began to thaw from 18 to 23 July 1955, when President Dwight D. Eisenhower met Premier Nikolai Bulganin and Communist Party chief Nikita Khrushchev, along with Prime Minister Anthony Eden of Britain and French Prime Minister Edgar Faure at Geneva in the first East-West summit of the Cold War. Neither Eisenhower’s proposal for an “open skies” inspection plan permitting Americans and Soviets to conduct aerial reconnaissance over one another’s territory, nor the Soviet proposal for mutual withdrawal of forces from Europe, made any headway. However, after a decade of no meetings, many welcomed the lessening of international tension associated with the “spirit of Geneva.” This was followed by the “spirit of Camp David,” when Khrushchev visited Eisenhower at the presidential retreat in Maryland from 25 to 27 September 1959 and retracted his ultimatum demanding a final settlement of the status of Berlin. The thaw proved short-lived. Two weeks before a planned summit in Paris on 16 to 17 May 1960, an American U-2 spy plane was shot down deep inside Soviet airspace. Khrushchev used the opening of the summit to denounce American aggression and then walked out. When Khrushchev met the new President John F. Kennedy in Vienna on 3 and 4 June 1961, the two leaders eyed each other grimly. They agreed to avoid superpower confrontation over the civil war in Laos, but made no progress toward a proposed ban on nuclear weapons testing, and clashed over the fate of Berlin. The outbreak of the Six-Day War in the Middle East prompted an emergency session of the United Nations General Assembly in New York, which Soviet Premier Alexei Kosygin planned to attend. Kosygin and President Lyndon B. Johnson met halfway between New York and Washington at Glassboro, New Jersey, from 23 to 25 June 1967. The Soviet premier called for American withdrawal from Vietnam and Israeli withdrawal from Egypt, and Johnson focused on nuclear issues. However, Kosygin had
been given little power to negotiate by the Politburo, and no agreements were signed. De´tente and Nuclear Arms Talks President Richard M. Nixon and his National Security Adviser Henry A. Kissinger sought to use negotiations with the Soviet Union to arrange an acceptable exit from the Vietnam War in exchange for improved relations. After Nixon’s historic visit to China in 1972, Soviet leaders invited him to Moscow, where talks held from 22 to 30 May 1972, resulted in the signing of two agreements marking the beginning of “de´tente”: a treaty limiting each country to the construction of two Anti-Ballistic Missile (ABM) systems, and an agreement limiting long-range land-based and submarine-based ballistic missiles, later known as the SALT I (Strategic Arms Limitation Talks) treaty. Communist Party Secretary Leonid Brezhnev visited the U.S. from 18 to 25 June 1973, where he and Nixon signed a number of minor agreements regarding agriculture, transportation, and trade. A meeting in Moscow from 27 June to 3 July 1974, held in the shadow of the Watergate scandal and under pressure from conservative opponents of arms control, brought no further progress on strategic arms limitations, although the ABM treaty was amended to reduce the number of ABM systems permitted from two to one. After Nixon’s resignation, President Gerald R. Ford met Brezhnev at Vladivostok on 23 and 24 November 1974, where they agreed on the outlines for a SALT II agreement. From 30 July to 2 August 1975, the two leaders met again in Helsinki during a signing ceremony of the Conference on Security and Cooperation in Europe. Cooling relations brought about by the collapse of Saigon, superpower rivalry in Angola, and trade disputes lessened the possibility of progress toward a second SALT agreement, as did the upcoming American elections, in which Ford avoided all references to “de´tente” to resist challenges from the right. The unpopularity of de´tente continued to grow during President Jimmy Carter’s term. By the time he met Brezhnev in Vienna from 15 to 18 June 1979, relations with the Soviet Union had deteriorated over trade restrictions, Third World conflicts, U.S. rapprochement with China, and human rights. The two leaders were able to sign a SALT II agreement but Senate conservatives opposed the treaty and the Soviet invasion of Afghanistan in December ended any hope of ratification. President Ronald Reagan’s first term was marked by remilitarization and a heightening of Cold War tensions that sowed fears that the superpowers might be sliding toward nuclear war, creating a mass antiwar movement in the United States and Europe. In response, Reagan met the new, reformist Soviet leader, Mikhail Gorbachev, at a “getacquainted summit” at Geneva from 19 to 21 November 1985, where despite a lack of agreement on nuclear arms reductions, the two leaders established warm personal re-
lations. They met again in Reykjavik, Iceland, on 11 and 12 October 1986, and agreed to reduce intermediate-range nuclear missiles, but deadlocked over Reagan’s devotion to space-based missile defense. At a third meeting in Washington from 7 to 10 December 1987, the two leaders signed the Intermediate-Range Nuclear Forces (INF) Treaty, requiring the elimination of all U.S. and Soviet INF missiles. A fourth meeting in Moscow from 20 May to 2 June 1988, was more notable for the media images of Reagan strolling through the heart of what he had formerly called “the Evil Empire” than for the minor arms control agreements signed, and a final meeting in New York on 7 December was largely ceremonial. End of the Cold War The rapid pace of political change in Eastern Europe in 1989 led President George H. W. Bush to hold a shipboard meeting with Gorbachev off Malta on 2 and 3 December 1989. Although no agreements were signed, statements of goodwill indicated, as Gorbachev put it, that the era of the Cold War was ending. This was reinforced at a Washington summit from 31 May to 3 June 1990, when agreements on a range of issues including trade and chemical weapons were signed in an atmosphere of cooperation not seen since the height of de´tente. Gorbachev was invited to a Group of Seven meeting in London on 17 and 18 July 1991, where he and Bush agreed to sign a START (Strategic Arms Reduction Talks) treaty at a full summit in Moscow on 30 and 31 July. The Moscow meeting proved to be the last superpower summit, as the Soviet Union collapsed at the end of the year. Summits after the Cold War Between 1992 and 2000, Presidents George Bush and Bill Clinton met more than twenty times with Russian Presidents Boris Yeltsin or Vladimir Putin at bilateral summits or individually at multilateral venues. Talks on further nuclear arms reductions and securing the former Soviet arsenal and nuclear materials were a feature of many of the summits. At a Moscow meeting on 2 and 3 January 1993, Bush and Yeltsin signed the START II Treaty, promising to reduce each country’s nuclear arsenal to between 3,000–3,500 warheads within ten years. Yeltsin used various summit meetings to argue unsuccessfully against the eastward expansion of the North Atlantic Treaty Organization, and Clinton often pressed Yeltsin and Putin to seek a peaceful resolution to Russia’s conflict with its secessionist province of Chechnya. Another regular feature of the discussions was the attempt by the Russian leaders to obtain better trade relations and economic aid from Western countries and institutions, and American pressure to link such concessions to structural reform of Russia’s economy. The diminished drama and increased frequency of the meetings compared with the Cold War years confirmed the extent to which relations between the two countries had normalized by the end of the twentieth century.

BIBLIOGRAPHY

Boyle, Peter G. American-Soviet Relations: From the Russian Revolution to the Fall of Communism. New York: Routledge, 1993.
Garthoff, Raymond L. The Great Transition: American-Soviet Relations and the End of the Cold War. Washington, D.C.: Brookings Institution, 1994.
LaFeber, Walter. America, Russia, and the Cold War, 1945–1996. 8th ed. New York: McGraw-Hill, 1997.
Paterson, Thomas G. American Foreign Relations. 5th ed. Boston: Houghton Mifflin, 2000.
Weihmiller, Gordon R. U.S.-Soviet Summits: An Account of East-West Diplomacy at the Top, 1955–1985. Lanham, Md.: University Press of America, 1986.

Max Paul Friedman
See also Arms Race and Disarmament; Cold War.

SUMPTUARY LAWS AND TAXES, COLONIAL. The term “sumptuary laws” usually refers to regulations of food, clothing, morals, amusements, church attendance, and Sabbath observance. Sumptuary laws existed in all of the colonies. They included general colonial statutes, local regulations, applications of common law to local situations, and fixed customs of the people in different colonies. Custom and practice were as much a part of the total laws of a community as were the formal statutes, although their enforcement was different. The blue laws of Connecticut were the best known of the sumptuary laws. They were originally compiled by the Loyalist and Anglican clergyman Samuel A. Peters and published in England in his General History of Connecticut (1781). For many years people accepted or denounced this account of the Connecticut colonial code. In 1898 Walter F. Prince published in the Report of the American Historical Association for 1898, a detailed analysis of the Peters laws based on careful research. He found that one-half did exist in New Haven and more than fourfifths existed in one or more of the New England colonies. Others, however, were inventions, exaggerations, misunderstandings, or the result of copying from other erroneous writers on New England history. Different kinds of sumptuary laws predominated in different times and places. Some laws prohibited wearing gold decorations, lace, hatbands, ruffles, silks, and similar materials when one’s station in life did not warrant such expensive clothing. These were most common during the seventeenth century and prevailed in many colonies. In 1621, for example, authorities sent directives to Virginia reserving for council members the right to wear fine apparel. Massachusetts also had very detailed laws regulating dress. Many colonies enforced such laws by fine, although in Massachusetts the wearer might have his assessed valuation raised to £300 in addition to a fine. Laws against sexual immorality were also similar in all the colonies, although in the southern colonies they were di-
rected particularly against relations between whites and blacks. The most widespread sumptuary laws governed religious life. Laws against Sabbath breaking were common to all colonies, and most colonies mandated church attendance by law. Enforcement was probably stricter in New England than elsewhere, mainly because of the structure of government in each town, which depended upon cooperation between ecclesiastical and secular authorities to enforce both religious and civil regulations. Whereas most colonies taxed residents to support the local church and its minister, New England colonies (except Rhode Island) went even further to regulate religious life by prescribing doctrinal uniformity by law. In the seventeenth century, Massachusetts punished Quakers and drove them from the colony, and four were hanged for persistent return. Authorities also punished Baptists with beatings and imprisonment, and many alleged witches were sentenced to imprisonment or hanging in the latter half of the seventeenth century. Yet with all this reputation for harshness, there were far fewer death penalties provided by law in New England than in the English statutes of the same time. Further, after the implementation of religious toleration following the Glorious Revolution (1688), even the strictest colonists could no longer ban other religious groups from their midst. BIBLIOGRAPHY

Brown, Kathleen M. Good Wives, Nasty Wenches, and Anxious Patriarchs: Gender, Race, and Power in Colonial Virginia. Chapel Hill: University of North Carolina Press, 1996.
Gildrie, Richard P. The Profane, the Civil and the Godly: The Reformation of Manners in Orthodox New England, 1679–1749. University Park: Pennsylvania State University Press, 1994.
Hoffer, Peter Charles. Law and People in Colonial America. Baltimore: Johns Hopkins University Press, 1998.

O. M. Dickerson / s. b.
See also Blue Laws; Colonial Society; Manners and Etiquette; Massachusetts Bay Colony; Religious Liberty; Virginia; Witchcraft.

SUMTER, FORT, situated on a sandbar, commands the sea approach to Charleston, South Carolina. On the night of 26 December 1860, Maj. Robert Anderson, Union commander at Charleston, removed his garrison from Fort Moultrie, on Sullivan’s Island, to a better defensive position at Fort Sumter. At 4:30 on the morning of Friday, 12 April 1861, the Confederate batteries opened fire on Fort Sumter. On 13 April, after a bombardment of thirty-four hours, Anderson surrendered; the Civil War had begun. In April 1863, Fort Sumter, then garrisoned by Confederates, repelled an attack by a Union fleet. In August, the siege of Fort Sumter by Union forces began and lasted for 567 days; the Confederates never surrendered. The fort was eventually abandoned in February 1865 and later made a national monument.

BIBLIOGRAPHY

Current, Richard N. Lincoln and the First Shot. Philadelphia: Lippincott, 1963.
Donald, David H. Lincoln. New York: Simon and Schuster, 1995.
McPherson, James M. Battle Cry of Freedom. New York: Oxford University Press, 1988.

DuBose Heyward / a. r.
See also Charleston; Charleston Harbor, Defense of; Confederate States of America; Secession; South Carolina.

SUN BELT comprises the states of the South and the Southwest. The term was coined to describe both the warm climate of these regions and the rapid economic and population growth that have been characteristic since the 1960s. The Sun Belt stretches approximately from Virginia south to Florida and west to California but also includes western mountain states, such as Colorado and Utah, that have experienced similar economic growth. Historically, most of the nation’s population and economic power was based in the Northeast and the upper Midwest. The Southeast had a smaller population, a less robust economy, and hot, humid summers that many northerners considered uncomfortable. Much of the Southwest was settled later and remained sparsely populated well into the twentieth century because of its remote location and an inhospitable desert climate that regularly reached triple-digit temperatures in summer. With the advent of air conditioning, however, year-round comfort became possible in both regions. A shift from northeastern dominance was evident by the early 1970s. The term “New South” came into use to describe economic progress and social changes in the Southeast. California and oil-rich Texas had established themselves as thriving economies, and newer regions of prosperity had begun to emerge throughout the West. This pattern intensified in following decades as many states in the North lost industries, population, and representation in Congress. The Sun Belt attracted domestic and international businesses for many reasons, including lower energy costs and nonunion wages, state policies favorable to business, and, in the West, proximity to the increasingly important Pacific Rim nations. A national emphasis on developing domestic fuel sources in the early 1970s stimulated growth in Texas, Colorado, and other states. The lifestyles and natural beauty of Sun Belt states also attracted many newcomers. As populations grew, southern and western states gained increasing political and economic power. All seven winners of U.S. presidential elections between 1964 and 2000 were from the Sun Belt, reflecting the increased representation in Congress of key states like Texas, Arizona, and Florida, which helped Republicans win majority representation in Congress during the 1990s. Southern culture and values became influential, such as the nationwide popularity of
country and western music. Hispanic cultures of the Southwest and Florida gained prominence. The Sun Belt also faced difficult issues, including social problems that many migrants had hoped to escape. Despite areas of prosperity, the Southeast continued to have many sections of poverty. Texas and other energy-oriented states experienced a steep, if temporary, economic decline in the mid-1980s because of a fall in oil prices. California suffered serious economic recession and social stresses in the late 1980s and early 1990s, which caused a significant migration of businesses and residents to nearby states. The impacts of growth and development became matters of urgent concern as many Sun Belt communities experienced suburban sprawl, congestion, and pollution, along with an erosion of their traditional regional characteristics and identities. These trends provoked many controversies, which continued into the 1990s. Some people opposed the changes, but others saw them as positive signs of progress and prosperity. Nationally, experts predicted that the economic growth and increasing influence of the Sun Belt marked a permanent change in the demographic, economic, and political structure of the nation.

BIBLIOGRAPHY

Bogue, Donald J. The Population of the United States: Historical Trends and Future Projections. New York: Free Press, 1985.
De Vita, Carol J. America in the 21st Century: A Demographic Overview. Washington, D.C.: Population Reference Bureau, 1989.
Wilson, Charles Reagan, and William Ferris, eds. Encyclopedia of Southern Culture. Chapel Hill: University of North Carolina Press, 1989.

John Townes / c. w.
See also Air Conditioning; Climate; Demography and Demographic Trends; Energy Industry; Migration, Internal; Rust Belt.

SUN DANCE. The term “sun dance” is an anthropological invention referring to a number of ceremonies on the Great Plains that were characterized by considerable internal complexity. The Lakota sun dance, wiwanyag wachipi, may be translated as “dance looking at the sun.” By contrast, some have translated the Blackfoot Okan as “sacred sleep.” The central ritual of the Mandans, the Okipa, was not a sun dance at all but rather a complex ceremony that took place in an earth lodge on the central dance plaza of the village and focused a great deal of its energy on animal dances and animal renewal. By the middle of the nineteenth century there were approximately twenty-five rituals identified as “sun dances” spread across the Great Plains. On the Northwestern Plains, the development of these rituals was imbedded in a history of migrations that brought peoples with different cultural backgrounds into
closer proximity. These groups became the horse-mounted nomads that fired the imagination of Europeans and Americans alike. Among these groups the ritual known as the sun dance became richly developed and imagined.

As a consequence of increased cultural interactions, mid-nineteenth-century Plains sun dances featured a number of common elements. Almost all of the rituals included a lodge constructed around a specially selected center pole. There were preparatory sweat lodge rituals that often continued through the four- to eight-day ceremony. A central altar became the focus of many of the ceremonies, and a sacred bundle or bundles was transferred from a previous sponsor to an individual or family sponsor for the year. Male dancers were pierced on both sides of their chest and tethered to the center pole by means of skewers attached to leather thongs; during some point in the ritual they also might drag buffalo skulls tethered to skewers imbedded in the flesh of their backs. Participants actively sought and often experienced powerful visions that were life transforming. Animal-calling rituals and pervasive buffalo symbolism focused on ensuring that the buffalo would continue to give themselves to the people as food. Sexual intercourse sometimes took place between women who had ritually become buffalo and men who had also assumed this role, establishing a tie of kinship between the humans and the buffalo people. Dancing, body painting, and complex color symbolism created multiple symbolic references that interacted with the central symbols of the ritual. Finally, the ritual enactment as a whole was believed to renew the world, the animals, the plants, and the people. Despite these similarities, when looked at from within, the rituals of the various groups were identified with symbolic boundaries that made them unique peoples. Important creator figures, such as the Sun (in the case of one Blackfoot tradition), special culture heroes, and other important predecessors were believed to have brought the sun dance to the people. From this perspective it was their special ritual pathway to powers that would sustain them and reinforce their identity in relation to others who had similar ceremonies. Because of the considerable cultural interaction on the Plains, cultural interchange became important in the development of these rituals, but traditions of origin tended to constitute them as unique to the experience of each people.

For Strength and Visions. Edward S. Curtis’s 1908 photograph shows an Absaroka, or Crow, Indian participating in one of the grueling Plains rituals known as the sun dance. Library of Congress

BIBLIOGRAPHY

Holler, Clyde. Black Elk’s Religion: The Sun Dance and Lakota Catholicism. Syracuse, N.Y.: Syracuse University Press, 1995.
Mails, Thomas E. Sundancing: The Great Sioux Piercing Ritual. Tulsa, Okla.: Council Oaks Books, 1998.
Spier, Leslie. “The Sun Dance of the Plains Indians.” Anthropological Papers of the American Museum of Natural History. Vol. 16. New York: The Trustees, 1921.
Yellowtail, Thomas. Yellowtail: Crow Medicine Man and Sun Dance Chief. Norman: University of Oklahoma Press, 1991.

Howard L. Harrod
See also Indian Dance; Indian Religious Life; Tribes: Great Plains.

SUNDAY SCHOOLS first appeared in American cities in the 1790s. Following the example of British reformers, American organizers hoped to provide basic literacy training to poor children and adults on their one free day. Typical of these schools were those begun in Philadelphia in 1791 by the First Day Society, a group of clerics and merchants who paid local schoolmasters to teach “persons of each sex and of any age . . . to read and write,” using the Bible as the central text. By 1819 the last First Day school had closed, and by 1830 Sunday schools of this type had virtually disappeared from the American scene, although traces of their pattern remained visible for decades in “mission” Sunday schools found in impoverished urban neighborhoods, in rural areas lacking permanent churches, and among newly freed African Americans during Reconstruction. A new-style Sunday school arose in their place, taught by volunteer teachers (a majority of them women) and providing a specifically evangelical Protestant curriculum. By 1832, nearly 8 percent of free children were attending such schools; in Philadelphia alone, the figure was almost 30 percent.

Evangelical Sunday schools grew rapidly as Protestant clergy and lay people molded them into key elements in an institutional network designed to make the new nation Protestant. (Although some Catholic and Jewish congregations established Sunday schools, the institution itself never assumed the significance it acquired in Protestant religious education.) New ideas about children’s needs and potential also fueled their growth, as did congregations’ embrace of Sunday schools and the development of common schools in urban areas. Indeed, during the nineteenth century, Sunday schools and public schools grew in tandem, developing a complementary relationship. Sunday school societies played important parts in the schools’ proliferation. The American Sunday School Union, a cross-denominational national organization founded in Philadelphia in 1824, was the largest of these, publishing curricular materials and children’s books and sponsoring missionaries to remote regions. Denominational agencies, such as the Methodist Episcopal Sunday School Union (1827) and the Sunday School Board of the African Methodist Episcopal Zion Church (1884), followed suit. After the Civil War, denominational interests came into increasing conflict with the American Sunday School Union, especially in the area of teacher training and lesson writing. Gradually, denominational organizations and teachers’ conventions became the organizations of choice, and the American Sunday School Union’s preeminence declined. It was at a national Sunday school teachers’ convention in 1872 that delegates and publishers adopted plans for a system of “uniform lessons,” standardizing the Biblical texts studied each week but permitting each denomination to shape the lessons’ contents. And the origins of the Chautauqua Movement idea can be traced to a Sunday school teachers’ summer institute organized by the Methodist bishop John Heyl Vincent in 1873. In the twentieth century, Sunday schools were primarily church institutions, recruiting the next generations of members. Although teaching remained volunteer labor performed mostly by women, the work of managing became professionalized, many congregations hired directors of religious education, and new agencies took on the tasks of multiplying the number of Sunday schools and shaping teachers’ preparation. By the turn of the twenty-first century, Sunday school attendance had declined overall. Nevertheless, Sunday schools remain a significant institutional tool for the religious training of succeeding generations, as many a child could testify.

BIBLIOGRAPHY

Boylan, Anne M. Sunday School: The Formation of an American Institution, 1790–1880. New Haven, Conn.: Yale University Press, 1988.
McMillen, Sally G. To Raise Up the South: Sunday Schools in Black and White Churches, 1865–1915. Baton Rouge: Louisiana State University Press, 2001.
Seymour, Jack L. From Sunday School to Church School: Continuities in Protestant Church Education in the United States, 1860–1929. Washington, D.C.: University Press of America, 1982.

Anne M. Boylan
See also Protestantism.

SUPERCONDUCTING SUPER COLLIDER (SSC), a federally financed project abandoned in 1993 that would have been capable of accelerating subatomic particles to energy levels forty times that previously achieved by researchers. For reasons of national prestige and international economic competitiveness, the Ronald Reagan administration in 1982 encouraged U.S. high-energy scientists to devise a challenging national accelerator project. Physicists responded with plans for the most ambitious particle accelerator ever attempted, a superconducting super collider. It was to be a proton collider far more energetic than existing ones, employing the superconducting magnetic technology recently developed at the Fermi National Laboratory in Illinois. The primary justification for the machine was a search for particles known as Higgs bosons. The machine was to produce forty TeV protons (where one TeV, or tera-electron volt, is 1 trillion electron volts). This determined the size (a fifty-four-mile-long ring) and the projected cost ($4.4 billion). Federal funding for the machine required justification. Support from the Texas congressional delegation and the promise of $1 billion toward the project from the state of Texas led to the decision to build the accelerator in Waxahachie, Texas, rather than near Fermilab. In the autumn of 1993 the House of Representatives, faced with a more than doubled price tag, voted overwhelmingly to kill the project. By then $2 billion had been spent, the superconducting magnets had been tested, one-third of the ring had been excavated, and two teams of a thousand physicists and engineers from around the world were working out detailed designs of the two enormous particle detectors to observe and analyze proton collisions in the TeV energy range.

BIBLIOGRAPHY

Kevles, Daniel J. The Physicists: The History of a Scientific Community in Modern America. 2d ed. Cambridge, Mass.: Harvard University Press, 1995.
Trefil, James. “Beyond the Quark: The Case for the Super Collider.” New York Times Magazine (30 April 1989): 24.
Weinberg, Steven. Dreams of a Final Theory. New York: Pantheon, 1992.

a. r.
See also Cyclotron; Physics: Nuclear Physics.

SUPERFUND, officially the Comprehensive Environmental Response, Compensation, and Liability Act of 1980, began as a $1.6 billion, five-year program cre-
ated by Congress to expedite cleanup of the nation’s worst hazardous waste sites. National concern over the release of hazardous wastes buried beneath the residential community at Love Canal in western New York State prompted its passage. The term also refers to the Superfund Amendment and Reauthorization Act (SARA) of 1986, which comprehensively revised the original act and added $9 billion to the fund. In response to a 1984 tragedy in Bhopal, India, in which three thousand people died and hundreds of thousands were reportedly affected by exposure to deadly methyl isocyanate gas that leaked from a Union Carbide plant, Congress included provisions in SARA requiring corporations to inform host communities of the presence of dangerous materials and to develop emergency plans for dealing with accidental releases of such materials. From the beginning Superfund met with harsh, and often justified, criticism. President Ronald Reagan’s commitment to reduce government regulation of industry undermined the effectiveness of the legislation. At the end of five years the money was gone, and only six of the eighteen hundred hazardous waste sites identified at that time had been cleaned up. Another eighteen thousand suspected sites remained to be investigated. A new provision reauthorized the program to continue until 1994. Legal disputes had mired those willing to restore sites, and in 1994 the legislation was not reauthorized. Instead, the program has continued to function with special appropriated funding while Congress negotiates how to make the program more effective and efficient.

BIBLIOGRAPHY

Anderson, Terry, ed. Political Environmentalism: Going Behind the Green Curtain. Stanford, Calif.: Hoover Institution Press, 2000.
LaGrega, Michael D., Phillip L. Buckingham, and Jeffrey C. Evans. Hazardous Waste Management. Boston: McGraw Hill, 2001.
Vaughn, Jacqueline. Environmental Politics: Domestic and Global Dimensions. New York: St. Martin’s Press, 2001.

John Morelli / f. h.
See also Environmental Protection Agency; Times Beach.

SUPERMARKETS AND SUPERSTORES. See Retailing Industry.

SUPERSONIC TRANSPORT. In late 1962, the governments of France and Great Britain announced their intention to jointly develop a supersonic transport (SST) named the “Concorde.” Anxious that the United States not trail the Europeans in the SST market as it had in the case of jet airliners, President John F. Kennedy, in a June 1963 speech at the Air Force Academy, called for a jointly funded government-industry program to design and build an American SST. The specifications, drawn from a government feasibility study, called for a passenger capacity of 300 and a cruising speed from 2.5 to 3 times that of sound—both better than Concorde’s. Boeing and Lockheed, two of the three major commercial jet manufacturers, produced full-sized mockups for a 1967 design competition. Boeing’s design was heavier and more complex but promised slightly better performance and a significantly more impressive, futuristic look. It won, but engineers later abandoned its most advanced features as they struggled to build a plane light enough to operate profitably. The final design, the 2707-300, mirrored Concorde in both appearance and performance. Opposition to the SST project emerged on multiple fronts during the late 1960s. Environmentalists warned of catastrophic damage to the ozone layer. Homeowners along flight routes rebelled against the prospect of routine sonic booms. Members of Congress objected to the use of public funds for a commercial venture. Boeing officials worried, privately, that the SST might bankrupt the company if it failed. Dismayed by rising costs, mounting opposition, and unfulfilled promises, Congress cancelled the SST program in 1971.

BIBLIOGRAPHY

Ames, Mary E. “The Case of the U.S. SST: Disenchantment with Technology.” In her Outcome Uncertain: Science and the Political Process. Washington, D.C.: Communications Press, 1978. Ties the cancellation to shifts in public opinion.
Horwitch, Mel. Clipped Wings: The American SST Conflict. Cambridge, Mass.: MIT Press, 1982. The definitive history.

A. Bowdoin Van Riper
See also Aircraft Industry; Boeing Company.

SUPPLY-SIDE ECONOMICS is based on the premise that high tax rates hurt the national economy by discouraging work, production, and innovation. President Ronald Reagan’s adoption of supply-side economics as the underlying theory for his economic policy in the 1980s represented a major shift in U.S. economic thinking. Supply-side theory was far from new, however, its basic ideas dating back to the early-nineteenth-century works of Jean-Baptiste Say and David Ricardo. It had been ignored in the United States since the New Deal, because of the demand-side theories of the British economist John Maynard Keynes, who believed in raising income and reducing unemployment by expanding demand even if the government does so through deficit spending. In the 1980s, supply siders found an audience looking for an alternative to deficit-oriented, demand-side policies. Arthur B. Laffer popularized the idea. He argued that cutting taxes, especially those of high income groups, would increase government revenues, because lower tax rates would produce more incentives for business and individuals to work and less reason for them to avoid taxes whether through non-productive investments in tax shelters or outright tax avoidance. Cutting taxes would result
in more jobs, a more productive economy, and more government revenues. This theory fit nicely into the conservative political agenda, because it meant less interference with the economy and, when combined with spending cuts and deficit reduction, smaller government. Supply-side economics dominated the administration of President Ronald Reagan, who instituted major tax cuts in 1981 and 1986, reducing the top U.S. rate from 70 percent to roughly 33 percent. However, Congress did not reduce federal spending to compensate for the reduced revenue, with the result that deficits soared to record levels. In the view of some advocates, the failure of Congress to adopt a balanced-budget amendment that would have controlled federal spending to match the tax cuts meant that supply-side theories were not really tried. Cutting taxes remained an important goal for subsequent Republican administrations, but by 2001, few argued that tax cuts would increase government revenue. Rather, tax cuts were a way to stimulate the economy, rein in government spending, and return the budget surplus to its rightful owners. The legacy of supply-side economics has been more political than economic. In the mid-1990s, Republican House Speaker Newt Gingrich observed that supply-side economics has “relatively little to do with economics and a great deal to do with human nature and incentives.” It contributed to the larger debate about the respective roles of government, individuals, and incentives in U.S. society as the nation faced a global economy.

BIBLIOGRAPHY

Canto, Victor A., Douglas H. Joines, and Arthur B. Laffer. Foundations of Supply-Side Economics: Theory and Evidence. New York: Academic Press, 1983.
Thompson, Grahame. The Political Economy of the New Right. London: Pinter, 1990.
Wilber, Charles K., and Kenneth P. Jameson. Beyond Reaganomics: A Further Inquiry Into the Poverty of Economics. Notre Dame, Ind.: University of Notre Dame Press, 1990.
Winant, Howard A. Stalemate: Political Economic Origins of Supply-Side Policy. New York: Praeger, 1988.

Brent Schondelmeyer / c. p.
See also Debt, Public; Economics; Keynesianism; Radical Right; Taxation.

SUPREME COURT. The Supreme Court is the final judicial authority in the U.S. system of government. Designated in Article III of the U.S. Constitution to have jurisdiction over all cases “arising under” the Constitution, the Court has the power to hear cases on appeal from the Federal appellate courts and the highest courts of each state. The Constitution also provides that the Court may act as a trial court in a limited number of cases: “Cases affecting Ambassadors, other public Ministers and Consuls, and those in which a State shall be Party.” Though
the Supreme Court is the final judicial authority in American government, it is not necessarily the final legal or political authority in the political system. While litigants may never appeal Supreme Court decisions to a superior court, disputes may proceed in other branches of government after a Supreme Court ruling. Congress and state legislatures may effectively alter or negate Supreme Court decisions involving statutory interpretation by amending or clarifying statutes, and may nullify constitutional decisions by amending the Constitution pursuant to Article V of the Constitution. Several factors are important to understanding the Court’s role in American democracy, including: the continuing nature of the Court’s relationship to Congress, the Executive Branch, and state governments; the influence of political and economic history on the Court; the intellectual underpinnings of Supreme Court decisions; and the internal dynamics of the Court as a distinct institution. Finally, the ambiguity of many key provisions of the Constitution is a source of both limits and power, for it creates the need for an authoritative voice on the Constitution’s meaning and simultaneously makes such interpretations open to contestation. Created at the crossroads of law and politics, the Supreme Court’s history is a history of controversy. In addition to the possibility of legislative alteration of Supreme Court decisions, the formal relationships the Constitution establishes between the Court and the other branches of the national government affect the Court’s power. First, the President appoints each justice to the Court, subject to Senate confirmation. Second, Supreme Court justices, like all federal judges, serve for life, absent impeachment by the House of Representatives and removal by the Senate. Third, Congress controls the number of justices that serve on the Court at any given time. At various points in U.S. history, the Court has had as few as five justices and as many as ten. Since 1865, however, the number has held steady at nine, including one chief justice. Fourth, Congress controls the Court’s operational budget, though actual compensation to the justices “shall not be diminished during [the Justices] Continuance in office” (Article III, Section 1). Fifth, the Constitution gives Congress power over much of the Court’s appellate jurisdiction. These and other overlapping Constitutional functions of each branch of government have led scholars to proclaim that the three branches of government are “separate institutions, sharing powers.” Beyond constitutional overlap, the special institutional nature of the Supreme Court is important. For example, the Court lacks the power to decide cases unless the proper parties to a lawsuit bring the case to the Court. The Court also lacks the ability to implement its decisions of its own accord, having to rely upon the executive branch to carry out its will. As Alexander Hamilton wrote in Federalist 78, the Framers firmly expected that the Supreme Court would have “no influence over either the sword or the


purse," would thus be "the least dangerous" of the three branches of government.

Marshall and the Establishment of Judicial Power
Though constrained, the Supreme Court has grown in stature and power since the time of the founding. This growth would have been nearly impossible without the deft political thinking and imaginative judicial mind of John Marshall, who served as Chief Justice from 1801 to 1835. The Constitution is unclear about the Court's power to declare acts of Congress unconstitutional and therefore void. Marshall resolved the matter in 1803, ruling in Marbury v. Madison that the Court did indeed possess this power. The historical circumstances and reasoning of the case dramatically illustrate the complex nature of judicial power discussed above. Marbury arose during the tense transfer of power from the Federalist administration of John Adams to the Democratic-Republican administration of Thomas Jefferson in the wake of the 1800 election. Just before leaving office, Adams appointed William Marbury as a justice of the peace in Washington, D.C.—one of several new judgeships created by the departing Federalist Congress trying to maintain a Federalist presence in government. After assuming office, however, Jefferson and his Secretary of State, James Madison, refused to deliver Marbury's commission to him. Seeking the judgeship, Marbury took his claim directly to the Supreme Court. Marshall confronted a conundrum: if he and the Court ordered Jefferson to give Marbury his commission, Jefferson would surely refuse to obey, making the still fledgling Court appear weak in the face of executive power. Worse, Congress could have impeached Marshall. If the Court declined to support Marbury, however, it would appear to be afraid of Jefferson. Writing for the Court, Marshall dodged having to order Jefferson to deliver the commission by holding that the Constitution did not give the Court the power to hear such cases except on appeal from a lower court. However, he went on to hold that the provision of the Judiciary Act of 1789 purporting to give the Court original jurisdiction over such cases was unconstitutional. Thus, Marshall avoided a potentially crippling conflict with the President while simultaneously establishing a broad power that the Court could use in the future. It would be more than fifty years before the Court declared another act of Congress unconstitutional, in the infamous Dred Scott decision. The issue of states' power in relation to the national government was the most important issue the Court confronted before the Civil War. The Marshall Court was instrumental in increasing the power of the national government over the states. In two controversial decisions, Fletcher v. Peck (1810) and Martin v. Hunter's Lessee (1816), the Court declared that the Constitution gave it the power to review the constitutionality of the acts of state legislatures and the decisions of state supreme courts, respectively. And in McCulloch v. Maryland (1819) and

Gibbons v. Ogden (1824), the Court interpreted the "necessary and proper" and commerce clauses of Article I to give Congress broad regulatory power over the economy. The Marshall Court was also committed to protecting vested economic interests through the contracts clause of Article I (see Dartmouth College v. Woodward, 1819). Under the leadership of Chief Justice Roger B. Taney (1836–1864), the Court was somewhat more deferential to the states, giving them more room to regulate commerce on their own and to impair the obligations of contracts for public policy reasons (Cooley v. Board of Wardens, 1851; Charles River Bridge v. Warren Bridge, 1837). As racial and sectional divisions came to the fore at midcentury, the Taney Court found itself at the center of the gathering storm. In 1857 the Court handed down an infamous decision that helped make the Civil War inevitable. Dred Scott v. Sandford held that African Americans did not constitute "citizens" and that the first of Henry Clay's three Great Compromises—the Missouri Compromise—was unconstitutional. The Civil War also tested the power of the president of the United States to manage the country effectively. In the Prize Cases (1863) and Ex Parte Milligan (1866), respectively, the Court found that the president could unilaterally establish a shipping blockade and seize property from "non-enemies" during a time of insurrection, but that the president could not impose martial law upon the citizens and suspend the writ of habeas corpus.

The Era of Economic Rights and Limited Government
The North's victory in the Civil War had two major consequences: the end of slavery and the unleashing of corporate development in the United States—pitting the regulatory power of governments against the interests of business and the private sector. With some exceptions, the Court showed more concern for the rights of business than for the plight of African Americans. The Reconstruction Era following the Civil War allowed the Court to interpret the recently ratified Thirteenth, Fourteenth, and Fifteenth Amendments to the Constitution. In 1875, Congress passed a Civil Rights Act providing for full access to public accommodations, regardless of race. The Supreme Court, however, found that such legislation exceeded Congress' power, which only extended to "the subject of slavery and its incidences" (Civil Rights Cases, 1883). Beyond striking down legislation passed to integrate American society on the basis of race, the Court in this period also upheld legislation designed to segregate American society on the basis of race. In 1896, the Court denied a Fourteenth Amendment Equal Protection challenge to the State of Louisiana's statute mandating racial segregation on trains (Plessy v. Ferguson). Some modern-day commentators point to these Reconstruction Era Court decisions regarding race as the nadir of the intellectual rigor of the Court.



Lochner v. New York epitomizes another controversial area for constitutional scholars. In 1905, the Court invalidated a New York law that regulated the maximum hours for bakers, finding that the law violated the "right to contract." Critics have pointed out that there is no textual right to contract listed in the Constitution. The Court subsequently overturned Lochner, but the case poses a perennial constitutional problem: how can the Ninth Amendment and the idea of non-enumerated rights find legitimacy with an unelected judiciary? More simply, what nontextual rights are in the Constitution and how does anyone—including the Court—know what they are? The Supreme Court has employed two different tacks in discovering non-enumerated rights in the Constitution. During the so-called "Lochner era," it used the due process clause of the Fourteenth Amendment. In Meyer v. Nebraska (1923) and Pierce v. Society of Sisters (1925), for example, the Court found respectively that state laws limiting the teaching of foreign languages to children and requiring children to attend public rather than private schools violated due process guarantees, which encompass freedom "from bodily restraint, . . . to contract, to engage in any of the common occupations of life, to acquire useful knowledge, to marry, establish a home and bring up children, [and] to worship [a deity] according to the dictates of [one's] own conscience." All of these aspects of liberty are "essential to the orderly pursuit of happiness by free men" and as such are protected by the Constitution under a doctrine called substantive due process. Whereas the Court used substantive due process to limit the reach of state regulatory power, it used a restrictive interpretation of the commerce clause to limit the regulatory power of Congress in the decades before the New Deal. These cases illuminate the interactive nature of the relationship between the branches of government discussed above. The Court ruled in Hammer v. Dagenhart (1918) and A.L.A. Schechter Poultry Corp. v. United States (1935), respectively, that Congress lacked the power to pass legislation regulating child labor and to delegate the regulation of agriculture, coal mining, and textiles to the executive branch. Because the power of Congress was critical to the success of President Franklin Delano Roosevelt's New Deal programs, FDR responded to these and other decisions with a radical proposal. The president proposed expanding the number of justices on the Court to fifteen in the hope of garnering a majority that would permit Congress to pass New Deal legislation. Though Congress did not enact the plan, two justices on the Court abruptly changed their views on the commerce clause in a series of momentous decisions, including National Labor Relations Board v. Jones & Laughlin Steel (1937, which permitted Congress to regulate private employment practices) and Steward Machine Co. v. Davis (1937, which held that Congress may sometimes exact taxes that have the effect of regulations). These famous changes in voting patterns came to be known as the "Switch in Time that Saved Nine."


The Civil Rights/Civil Liberties Era
After the New Deal crisis was resolved and the nation emerged victorious from World War II, the Court embarked on an extraordinary expansion of civil liberties and civil rights, especially under the leadership of Chief Justice Earl Warren (1953–1969). No case was more important in this regard than Brown v. Board of Education (1954), in which the Court overruled Plessy and declared that racial segregation in public schools violates the Equal Protection clause. Though it took several years before federal courts and the executive branch began enforcing the principles of Brown in a meaningful way, the decision was the springboard for later decisions that extended equal protection rights to women, gays and lesbians, aliens, children born out of wedlock, and other minorities. In the late 1960s and 1970s, the Court authorized massive integration plans for school districts; these decisions were controversial because they embroiled the federal courts in overseeing complicated institutions, a job that critics claimed lay beyond courts' capacity. Controversy also arose with the emergence of the second form of substantive due process, as discussed above. In Griswold v. Connecticut (1965), the Court struck down a law criminalizing the use of contraceptive devices on the basis of a "right to privacy" in the Constitution, which it discovered not in the due process clause, but rather in the penumbras formed by emanations from the text of the First, Third, Fourth, Fifth, and Ninth Amendments. When it proceeded to hold in the controversial decision of Roe v. Wade (1973) that the right to privacy protects a woman's right to have an abortion, the Court placed the right to privacy back into the Fourteenth Amendment's due process clause. Recently, however, the Court has revived the "textual" discovery of rights in Saenz v. Roe (1999). The Court in Saenz found that one component of the non-enumerated right to travel is derived from the Privileges or Immunities Clause of the Fourteenth Amendment. The Warren Court also accelerated the application of the Bill of Rights to the states. Originally, the Bill of Rights was intended to protect individuals only from the actions of the federal government (Barron v. Baltimore, 1833). Nevertheless, in 1925 the Court ruled that because freedom of speech is a fundamental liberty protected by the due process clause of the Fourteenth Amendment, it is enforceable against state and local governments as well (Gitlow v. New York). By the 1960s, the Court had "incorporated" other clauses of the First Amendment to apply to the states. The incorporation of the Fourth, Fifth, and Sixth Amendments coincided with the Warren Court's so-called "criminal rights revolution," which generated great controversy in the context of the increasing crime rates and cultural upheavals of the sixties. Though appointed by the Republican President Eisenhower, Warren presided over what has been characterized as the most liberal period in the Court's history. The Court's rulings in Mapp v. Ohio (1961, holding that evidence


obtained in violation of the Fourth Amendment must be excluded from trial), Gideon v. Wainwright (1963, holding that the Sixth Amendment requires states to provide counsel to indigent defendants), and Miranda v. Arizona (1966, requiring police to warn suspects of their rights in custodial interrogations) greatly expanded the rights of the criminally accused. With Justice William Brennan leading the way, the Warren Court also dramatically liberalized the First Amendment law of free speech and press. Before the late 1950s, speech could generally be punished if it had a "tendency" to cause violence or social harm. Building on the famous dissenting free speech opinions of Justices Oliver Wendell Holmes and Louis Brandeis earlier in the century, the Warren Court provided substantially more freedom for such controversial expression as pornography, vibrant (even vicious) criticism of public officials, hate speech, and offensive speech. In brief, modern speech doctrine protects expression unless it constitutes hardcore pornography ("obscenity"), libel, threats, or speech that is likely to trigger imminent violence. (See, for example, New York Times v. Sullivan, 1964; Brandenburg v. Ohio, 1969.)

Recent Trends: Consolidation, and the New Substantive Due Process and Federalism
After Warren left the Court, President Nixon—who had campaigned against the liberalism of the Warren era—nominated the more conservative Warren Burger in the hope of ending the reign of judicial liberalism. But under Chief Justices Burger (1969–1986) and William Rehnquist (1986 to the present), the Court has generally consolidated the liberties of the Warren Era rather than radically reversing course. Though the Court has cut back some Fourth and Fifth Amendment rights, limited the reach of affirmative action (Adarand Constructors, Inc. v. Pena, 1995), and limited the scope of desegregation of the schools and the equal protection clause (see, for example, Freeman v. Pitts, 1992; Washington v. Davis, 1976), it has also maintained the fundamental right to an abortion (Planned Parenthood of Southeastern Pennsylvania v. Casey, 1992), expanded the protection of free speech (R.A.V. v. St. Paul, 1992), and reaffirmed the Miranda decision (Dickerson v. United States, 2000).

The Burger Court retreated from its effort to reinforce the states' rights provisions of the Tenth Amendment, but the Rehnquist Court has revived the doctrine of federalism under the aegis of the commerce clause. From the time of the New Deal until near the end of the twentieth century, the Court had regularly accorded an ever-increasing amount of power to Congress. The Supreme Court upheld congressional power under the commerce clause to regulate such things as wheat production for home usage and public accommodations on the basis of race (Wickard v. Filburn, 1942; Heart of Atlanta Motel, 1964). Since 1995, however, a seismic shift has occurred in the Court's jurisprudence regarding congressional power. The Court began what is called "the new federalism" by curtailing Congress's power to prohibit the possession of weapons near schools (United States v. Lopez, 1995). In Printz v. United States (1997), it ruled that Congress may not force state executive officers to enforce federal gun control legislation. In United States v. Morrison (2000), the Court struck down a federal law that provided civil remedies for victims of gender-motivated attacks. And in Board of Trustees v. Garrett (2001), the Court concluded that Congress did not have the authority to hold states liable for violations of the Americans with Disabilities Act. This change in the Court's jurisprudence was not entirely unforeseeable. With seven of the nine justices appointed by Republican presidents, the more curious question is why the five most conservative justices waited so long to construct the new federalism. The five justices who formed the majority in each of the cases mentioned above (Rehnquist, Antonin Scalia, Clarence Thomas, Anthony Kennedy, and Sandra Day O'Connor) had all served together since 1991, yet the Court's more conservative turn, and the frequency with which the conservative bloc voted together, did not begin in earnest until 1995.

These same five justices also became crucial in Bush v. Gore (2000), the case that resolved the 2000 presidential election and is already one of the most controversial cases in the Court's history. The Court issued a stay, 5–4, mandating that the State of Florida stop counting presidential ballots on December 9, 2000. The five justices, along with Justices Souter and Breyer in part, ruled in the per curiam opinion that such counting absent uniform statewide standards violated the Equal Protection Clause of the Fourteenth Amendment and that all counting efforts had to have been completed by December 12, 2000, the same day the Court issued the opinion and three days after the Court halted the counting of the ballots.

BIBLIOGRAPHY

Ackerman, Bruce. We the People, Volume I: Foundations. Cambridge, Mass.: Harvard University Press, 1991. Amar, Akhil Reed. The Bill of Rights: Creation and Reconstruction. New Haven, Conn.: Yale University Press, 1998. Bell, Derrick A. And We Are Not Saved: The Elusive Quest for Racial Justice. New York: Basic Books, 1989.

Bickel, Alexander. The Least Dangerous Branch: The Supreme Court at the Bar of Politics. 2nd ed. New Haven, Conn.: Yale University Press, 1986. Clayton, Cornell W., and Howard Gillman, eds. Supreme Court Decisionmaking: New Institutionalist Approaches. Chicago: University of Chicago Press, 1999. Ely, John Hart. Democracy and Distrust: A Theory of Judicial Review. Cambridge, Mass.: Harvard University Press, 1980. Griffin, Stephen M. American Constitutionalism: From Theory to Practice. Princeton, N.J.: Princeton University Press, 1999.



Horwitz, Morton J. The Transformation of American Law, 1780–1860: The Crisis of Legal Orthodoxy. New York: Oxford University Press, 1992. Kutler, Stanley I. Judicial Power and Reconstruction Politics. Chicago: University of Chicago Press, 1968. McClosky, Robert G. The American Supreme Court. 3d ed. Chicago: University of Chicago Press, 2000. Neustadt, Richard E. Presidential Power: The Politics of Leadership. New York: Wiley, 1960. O'Brien, David M. Storm Center: The Supreme Court in American Politics. New York: Norton, 2000. Rosenberg, Gerald N. The Hollow Hope: Can Courts Bring About Social Change? Chicago: University of Chicago Press, 1991. Thayer, James B. "The Origin and Scope of the American Doctrine of Constitutional Law." Harvard Law Review 7 (1893): 129.

Donald A. Downs, Martin J. Sweet See also Constitution of the United States; Judicial Review; Judiciary; Separation of Powers; and vol. 9: Women in Industry (Brandeis Brief).

SUPREME COURT PACKING BILLS are congressional measures designed to alter the composition and also the direction of the Supreme Court of the United States. By changing the number of justices, Congress can alter the majority that decides cases, "packing" the Court for one side or another. The Constitution does not fix the number of Supreme Court justices, and under Article III, Section 2, Congress has the authority to alter the number of justices on the Court. The seminal Judiciary Act of 1789 fixed the number of justices at six. Since 1789, Congress has increased the number from time to time for a variety of reasons, including increasing efficiency and decreasing the justices' workload. Congress has also altered the number of justices to produce desired results in Supreme Court cases. Congress changed the number of justices from six to five in 1801 during the contentious and politicized prelude to the presidential transition from John Adams to Thomas Jefferson. The Judiciary Act of 1801 was an attempt to pack the courts, including the Supreme Court, with Federalist judges after Adams's party lost the executive and legislative branches to Jefferson's Democratic-Republicans in the 1800 election. The act, passed by the Federalist-controlled lame-duck Congress, created fifty-eight new judgeships and sought to prevent the incoming Democratic-Republican president, Thomas Jefferson, from packing the Court himself by reducing the number of Supreme Court justices from six to five. In 1802, Congress, controlled by Jefferson's allies, repealed the Act of 1801, restoring the number of justices to six. During post–Civil War Reconstruction, Congress changed the number of Supreme Court justices to preserve the Reconstruction program favored by the


Packing the Court. Clifford Berryman's 1937 drawing, with the label "New Deal Plan for Enlarged Supreme Court," mocks the proposal of a politically frustrated President Franklin D. Roosevelt (depicted here talking to Harold L. Ickes, his head of the Public Works Administration). © Library of Congress/Corbis

dominant Radical Republicans in Congress. In 1866, Congress reduced the number of justices from ten (the number Congress had determined just three years earlier, in 1863) to seven in order to prevent President Andrew Johnson from packing the Court with new justices who might overrule the congressionally approved Reconstruction program. Congress again increased the number to nine in 1869, once Johnson, who barely survived impeachment, was out of office. Since 1869, the number of Supreme Court justices has remained constant at nine. Attempts to pack the Supreme Court, however, have come from presidents as well as from Congress. Presidential court packing is seen as part of the presidential appointment of Supreme Court justices and still occurs today. In choosing an appointee, the president will consider the potential appointee's legal philosophy and interpretive ideology, as well as political party affiliation. The president's packing power, however, is limited by the opportunities to fill vacancies that arise during the president's term (a function of the retirement or death of sitting justices) and by the Senate's advice and consent to the president's appointee of choice. The most recognized court-packing bill is President Franklin Delano Roosevelt's (FDR) proposal of 1937. When various aspects of Roosevelt's New Deal legislation found their way to the Supreme Court, four conservative justices (who came to be known derisively as the "Four Horsemen") solidly opposed FDR's attempt to expand the


scope and power of the federal government, particularly over the depressed American economy. Two justices, Chief Justice Charles Evans Hughes and Justice Owen J. Roberts, were swing votes and tended to vote with those who opposed the New Deal legislation. The result was that the Court struck down eight out of ten major programs proposed by FDR, many by narrow majorities. In February 1937, FDR announced his proposal to alter the composition of the judiciary, citing inefficiency and backlogged dockets as the reasons necessitating the change. The proposal would have affected the American federal judicial system from top to bottom, but its primary goal was to pack the Supreme Court with justices he would appoint. His plan would have authorized the president to appoint an additional judge or justice for every sitting judge or justice who had served more than ten years and had failed to retire within six months after reaching seventy years of age. At the time, the proposal would have authorized FDR to appoint as many as six new justices to the Supreme Court. The proposal, the subject of tense debates, never made it out of committee, and Congress as a whole never voted on it. FDR and congressional New Deal supporters, however, still received their desired result. With the two 1937 cases of West Coast Hotel v. Parrish and National Labor Relations Board v. Jones & Laughlin Steel Corporation, Justice Roberts changed his voting tendencies and began voting in favor of upholding sweeping New Deal legislation. Roberts denied that his "switch" was influenced by FDR's court-packing proposal. There are many other viable explanations, but the saying "the switch in time that saved nine" emerged as the characterization of the court-packing events of 1937. The Court's new willingness to support President Roosevelt's favored legislation took the wind out of the sails of his court-packing plan.

Congress's attempts to pack the court have had more effect on the Supreme Court than presidential packing. Court-packing bills have been designed to result in congressional control of the Supreme Court, which provides the most significant check on congressional legislation and action. Congressional control of the Supreme Court would disrupt the balance of powers and the system of checks and balances revered as fundamental to the system of government in the United States. The unsuccessful effort by FDR and his Democratic allies in Congress was the last major concerted attempt by a president and Congress to alter the number of justices on the Supreme Court and thus change the direction of American law and life.

BIBLIOGRAPHY

Abraham, Henry J. The Judiciary: The Supreme Court in the Governmental Process. 10th ed. New York: New York University Press, 1996. Leuchtenburg, William E. The Supreme Court Reborn: The Constitutional Revolution in the Age of Roosevelt. New York: Oxford University Press, 1995. Rehnquist, William H. The Supreme Court. New York: Knopf, 2001.

Jacob E. Cooke, Akiba J. Covitz, Esa Lianne Sferra, Meredith L. Stewart

SURFING. Riding a surfboard across the face of a breaking wave was once the preserve of ancient Polynesian islanders, but in the twentieth century it became something enjoyed by millions of people the world over. Modern surfing has spread well beyond its more recent Hawaiian, American, and Australian origins, becoming a global phenomenon of such magnitude that every minute of every day will find somebody, somewhere, trying to catch a wave. The talented and photogenic few are paid to surf by a multibillion-dollar surfing industry. For the rest, it is an obsessive hobby, a statement of identity, and even a spiritual pursuit. Surfing originated sometime between 1500 b.c. and a.d. 400 among the oceanic island cultures of Polynesia. From there, it spread to the Sandwich (Hawaiian) Islands, where it was witnessed by the British explorer Captain James Cook in 1778. The missionaries who followed in Cook's wake discouraged the practice to such an extent that it had practically vanished by the end of the nineteenth century. It was revived early in the 1900s by young Hawaiians and promoted by the local tourist industry and Alexander Hume Ford, who founded the Hawaiian Outrigger Canoe Club in 1908 in Honolulu. The Hawaiian surfers Duke Kahanamoku and George Freeth traveled to America and Australia to take part in exhibitions that helped spread surfing beyond Hawaii's shores. The accessibility of the sport was limited by the athletic demands of the heavy redwood boards that were the Hawaiian norm. Only with the invention of lighter board materials in the 1940s and 1950s did surfing become more appealing to the general public. Surfing subcultures appeared in Hawaii, California, and Australia, developing a distinctive language, fashion, attitude, and lifestyle that gradually filtered into mainstream popular culture. The 1960s saw the emergence of glossy surfing magazines, surf music, and surf clothing and equipment companies, along with the release of numerous surf-related movies, all of which led to a huge increase in the surfing population. New inventions such as wetsuits, leashes, and more maneuverable short boards only added to surfing's worldwide popularity. Large national organizations were created to organize the sport and to hold competitions, leading eventually to a professional circuit that is funded, principally, by the surf industry and media sponsorship. The original extreme sport, surfing continues to push its boundaries. The development of tow-in surfing technology allows big-wave surfers to ride offshore waves that are more than sixty feet high.

Surfing Pioneer. Teenager Isabel Letham catches a wave, c. 1917. Duke Kahanamoku picked the locally known bodysurfer out of a crowd in 1915 to be the first person in Australia to ride a Hawaiian-style surfboard.

BIBLIOGRAPHY

Finney, Ben, and James Houston. Surfing: The History of the Ancient Hawaiian Sport. San Francisco: Pomegranate Books, 1996. Kampion, Drew. Stoked: A History of Surf Culture. Santa Monica, Calif.: General Publishing, 1997. Young, Nat. The History of Surfing. Angourie, Australia: Palm Beach Press, 1998.

Rick Dodgson See also Honolulu; Sports.

SURPLUS, FEDERAL. Federal budgets have varied considerably over our history. As indicated in the accompanying table, our first and second centuries were very different in terms of budget surpluses. Through our first 134 years, surpluses were the norm; in the last 75 years, they have been rare. A number of factors affect our budgetary history. The most pronounced are wars and economic recessions. However, philosophical and partisan values and beliefs also were critical.


TABLE 1
Years in Surplus or Deficit, 1792–2000

Years          Years in Surplus    Years in Deficit
1792–1800             5                    4
1801–1825            17                    8
1826–1850            16                    9
1851–1875            17                    8
1876–1900            19                    6
1901–1925            14                   11
1926–1950             8                   17
1951–1975             5                   20
1976–2000             3                   22

SOURCE: U.S. Dept of Commerce, 1970, pp. 1104–1105.

For example, when the Jeffersonian Democrats defeated the Federalists in 1800, their philosophy of a limited national government replaced the Federalists' more activist role. As a result, the budget deficits that Treasury Secretary Alexander Hamilton was willing to run to provide federal services were replaced by Thomas Jefferson's desire to run surpluses to reduce the total public debt. His (and James Madison's) Secretary of the Treasury, Albert Gallatin, was able to produce surpluses in ten of fourteen years from 1801 to 1814, with the exceptions coming primarily during and after the War of 1812. Because Democrats were in power for most of the period prior to the Civil War, there were thirty-three years of budget surpluses from 1801 to 1850. The total public debt was essentially eliminated by 1835, and because both the Democrats, and especially the Whigs, believed in high protective tariffs, revenue consistently outpaced spending. The answer to what to do with the excess funds came in the form of the Deposit Act of 1836, which required the federal government to distribute any surplus over $5 million to the states in proportion to their electoral votes (hence their population). That act was short-lived because of major economic downturns commencing with the Panic of 1837. That recessionary period resulted in deficit spending during six of the next seven years. Surpluses returned by 1844 and were common until the military buildup prior to and during the Civil War. The war resulted in eight years of deficits and an unimaginable increase in government spending and government debt. The latter rose from $64 million in 1860 to $2.8 billion in 1866. That debt was partly paid for by issuing war bonds, authorized by Lincoln's hard-driving Secretary of the Treasury, Salmon P. Chase. Tariff revenue declined substantially during the war. Thus, to help finance the war effort and to begin paying off the debt, Congress passed the first income tax in 1862, which remained in effect until its repeal in 1872. Because of the income tax, the nation was able to return to surpluses in 1866. The fiscally conservative


nature of the post–Civil War period, for both Republicans and Democrats, led to continuous surpluses from 1866 to 1894. As in the pre–Civil War period, surpluses eventually ended due to the nation's economic woes, which in the 1890s were much worse than anything the country had experienced before. Because of the enormity of the debt built up during the Civil War, the total debt had only been reduced to $1 billion by 1894 from the $2.8 billion that existed in 1866. Spending in the 1890s on the Spanish-American War and veterans' pension legislation contributed to five straight years of deficits at the end of the nineteenth century. A mixed pattern of deficits and surpluses marked the period beginning in 1900 and ending with U.S. entry into World War I in 1917. The war again produced a historic level of debt that was completely beyond anything previously imagined. It stood at $24 billion by 1921. However, the income tax had been reenacted in 1913 and was expanded dramatically during the war. By the end of the war it accounted for 56 percent of federal revenues. Using that tax engine, Secretary of the Treasury Andrew Mellon was able to produce surpluses in every year he was in power, which included three presidential administrations from 1921 to 1932. As the surpluses were acquired, Mellon would subsequently support reductions in taxes, with income tax rate reduction the primary target. Because of the strength of the economy during this period, even as tax rates were reduced, more taxes were collected and more surpluses created. Those surpluses ended with the depression, and with the willingness of Franklin Roosevelt to use the federal government to help alleviate the nation's suffering—and to run deficits while doing so. Later, during his twelve years in office, Roosevelt embraced the philosophy of John Maynard Keynes, which lent academic endorsement to the concept of deficit spending in times of recession. The combination of growing expenditures for New Deal programs; the philosophical acceptance of Keynesian deficit spending; and wars in Europe, Korea, and Vietnam, along with a cold war with Russia, created a phenomenal period of growth in U.S. government, and with it an omnipresent budget deficit. Beginning in the 1980s both the origin of deficits and the drive for budget surpluses took on a new dimension. For the first time in budget history, deficits were an unintended consequence of a major peacetime tax reduction occurring in 1981. The influence of Keynesian macroeconomic theory had waned in policy circles and for a short period was replaced by a supply-side theory in which tax reductions were viewed as the major engine of capital formation, and therefore economic growth. Some of the empirical evidence supporting the supply-side theory included the results of the surplus-generating actions of the Mellon era in the 1920s. Supply-siders argued that cutting tax rates would ultimately increase tax collections. The debate over supply-side theory continues, but the short-term effects are not subject to debate. Deficits exploded,

reaching $290 billion in 1992. The political result was a consistent effort to return the nation to surpluses. That goal dominated legislative and presidential politics from 1982 to 1997. Spending restraints were imposed and peacetime tax increases were enacted for the first time since 1931. With the government having achieved the goal of returning to surpluses by the 1998 fiscal year, politics seemed to be returning to a prior period, as political pressures supporting tax reductions (accomplished in early 2001) and pent-up spending demands crowded the political agenda.

Studenski, Paul, and Herman E. Krooss. Financial History of the United States. New York: McGraw-Hill, 1952. U.S. Department of Commerce, Bureau of the Census. Historical Statistics of the United States, Colonial Times to 1970. New York: Basic Books, 1976. Witte, John. The Politics and Development of the Federal Income Tax. Madison: University of Wisconsin Press, 1985.

John Witte See also Debt, Public; Deposit Act of 1836; Supply-Side Economics; Taxation.

SURROGATE MOTHERHOOD is the process by which a woman bears a child for another couple, typically an infertile couple. There are two kinds of surrogate motherhood. In traditional surrogacy, the mother is artificially inseminated with sperm from the father or with sperm from a donor, if the father is infertile. In gestational surrogacy, sperm is taken from the father (or from a donor) and the egg is taken from the mother, fertilization happens in vitro, and the embryos are then implanted into the surrogate mother’s uterus. Thus, the surrogate mother is not genetically related to the child. For over one hundred years artificial insemination was used as a way of managing male infertility that kept the family intact and allowed children to be born to a married couple. Artificial insemination was generally kept secret. Couples did not tell friends, family, or the children themselves that donor sperm was used, thus maintaining the fiction of biological paternity. Though stories of surrogate motherhood, often with familial surrogates, date back two thousand years, in 1976 the lawyer Noel Keane arranged the first formal agreement between a couple and a surrogate mother in the United States. The marketing of “surrogacy” developed as a solution to female infertility. Brokers entered the scene, hiring women to become pregnant via artificial insemination with the sperm of the husband of the infertile woman. In 1986 surrogacy came to national attention with the case of “Baby M.” In this case, the woman hired as a surrogate, Mary Beth Whitehead, later refused to relinquish the child. After a protracted court battle, in



which Whitehead's parental rights were stripped and then restored, the hiring couple won custody of the baby, but Whitehead remained the legal mother with visitation rights. Since the 1980s, advances in technology have increased the use of gestational surrogacy. As it has become more common, there has been an increase in the number of Latin American, Asian American, and African American surrogates. The Center for Surrogate Parenting (CSP) estimates a cost of $56,525 for traditional surrogacy, in which artificial insemination is used, and a cost of $69,325 if another woman's egg is used. Approximately $15,000 of these fees is paid to the surrogate herself for the time and sacrifice of the pregnancy. When surrogacy agreements first surfaced in the mid-1970s, there was no payment for surrogate motherhood, and it tended to involve middle-class and blue-collar couples, with friends and sisters helping each other. Once payment became the norm, the demographic changed: "the majority of the couples remain largely upper-middle-class people, whereas the majority of the surrogates are working class women" (Ragoné, Surrogate Motherhood, p. 194). In 2002, most states had no specific laws regarding surrogate motherhood. While many states do not uphold surrogacy contracts, all states recognize birth certificates and adoption certificates from other states, making surrogate services available to anyone with the money to hire them. That surrogacy has become a business has not meant that contracting couples do not value the surrogate or that the surrogate does not care about the child or the couple. Very careful screening—approximately 95 percent of potential surrogates are rejected—ensures that situations similar to that of Mary Beth Whitehead do not happen. Surrogates are chosen for their commitment. In the only ethnographic study of surrogacy, Helena Ragoné found that couples adopted one of two strategies in dealing with their surrogate. "Egalitarians" wanted to maintain a relationship with the surrogate mother and did not see her as a means to an end. Since in all of Ragoné's cases the children were still quite young, it is difficult to know how this would play out. "Pragmatists" simply dropped the relationship with the surrogate, taking the child as theirs, and considering the payment sufficient acknowledgement of the role of the surrogate.

Ragoné, Helena. Surrogate Motherhood: Conception in the Heart. Boulder, Colo.: Westview Press, 1994. Rothman, Barbara Katz. Recreating Motherhood: Ideology and Technology in a Patriarchal Society. New York: Norton, 1989.

See also Adoption; Childbirth and Reproduction; Children’s Bureau; Family; Foster Care.


SURROUNDED, THE, a novel by the Native American author D'Arcy McNickle, was first published in 1936 by Dodd, Mead of New York City and republished in 1978 by the University of New Mexico Press. The economically stressed nation of the 1930s may not have been as ready to consider the tragic losses of American Indian peoples as it would be in the 1970s, despite the reforms of John Collier, President Franklin D. Roosevelt's commissioner of Indian affairs, who crafted the Indian Reorganization Act of 1934, and the rave review of Oliver LaFarge, the Pulitzer Prize–winning novelist. The Surrounded depicts the many ways in which law constrained American Indians: laws that established the reservation system during the nineteenth century; the law of the Catholic Church missions that took as their task educating Salish and other traditional tribal people out of their "savagery"; and laws that prohibited the practice of Native religions. In the novel, the limitations imposed on the Salish-Spanish protagonist, Archilde, and his people lead to his demise after he becomes a fugitive in an effort to protect his mother from being prosecuted for killing a game warden. In his many professional roles—Bureau of Indian Affairs official, scholar, anthropologist, writer, and founder of national organizations—McNickle devoted his life to drawing attention to the ways in which tribal peoples were "surrounded."

McNickle, D’Arcy. Native American Tribalism: Indian Survivals and Renewals. New York: Oxford University Press, 1973. Parker, Dorothy R. Singing an Indian Song: A Biography of D’Arcy McNickle. Lincoln: University of Nebraska Press, 1992. Purdy, John Lloyd. Word Ways: The Novels of D’Arcy McNickle. Tucson: University of Arizona Press, 1990.

Kathryn W. Shanley See also Indian Policy, U.S.: 1830–1900, 1900–2000; Indian Reorganization Act; Tribes: Great Plains.

SURVEY ACT of 1824, enacted by Congress sixteen years after Treasury Secretary Albert Gallatin's 1808 "Report on Roads, Canals, Harbors, and Rivers" had generated interest in national internal improvements. The act authorized the president, with the aid of army engineers, to conduct surveys of such canal and turnpike routes as would serve an important national interest. Presidents Madison and Monroe had vetoed earlier efforts to appropriate money for such purposes, because each president thought an amendment to the Constitution was necessary to authorize federal expenditures for the construction of roads or canals. But the Supreme Court's decision in Gibbons v. Ogden (1824) regarding the scope of Congressional power over interstate commerce cleared the way for President Monroe to sign this bill. Congress repealed the act in 1838.


BIBLIOGRAPHY

Hill, Forest G. Roads, Rails and Waterways: The Army Engineers and Early Transportation. Westport, Conn.: Greenwood Press, 1977. Larson, John L. “ ‘Bind the Republic Together’: The National Union and the Struggle for a System of Internal Improvements.” Journal of American History 74 (September 1987): 363–387. Malone, Laurence J. Opening the West: Federal Internal Improvements before 1860. Westport, Conn.: Greenwood Press, 1998.

L. W. Newton / c. p. See also Canals; Engineers, Corps of; River and Harbor Improvements; Roads.

SURVEYING. Using little more than a compass and a 66-foot chain, early American surveyors set out to chart the United States of America. Surveys determine boundaries, chart coastlines and navigable streams and lakes, and provide for mapping of land surfaces. Much of this work done in the early days of the United States used rudimentary, although not necessarily inefficient, equipment. For instance, surveyors set a 2,000-mile line for the transcontinental railroad in the 1860s without the benefit of maps, aerial views, or precise knowledge of topographical features. A century later, when surveyors set the line for Interstate 80 using everything their predecessors had not, the route followed the railroad's route almost exactly. The primary tool used by surveyors in North America from the 1600s through the end of the 1800s was a

"Gunter's chain," which measured 66 feet and usually had 100 swiveled links. A retractable steel tape to replace the chain was patented in 1860 by W. H. Paine of Sheboygan, Wisconsin. Surveyors relied on the compass to set the direction of their chain. Goldsmith Chandlee, a notable clock and instrument maker, built a brass foundry in Winchester, Virginia, in 1783 and made the most advanced surveying compasses of his day. The biggest breakthrough in surveying technology came in England in 1773, when Jesse Ramsden invented the circular dividing engine, which allowed the manufacture of precise scientific and mathematical instruments. The first American to develop a capability for the mechanical graduation of instruments was William J. Young. Young built the first American transit in Philadelphia in 1831, replacing the heavier, more inconvenient theodolite, which measures horizontal and vertical angles. The transit has a telescope that can be reversed in direction on a horizontal axis. The transit built by Young differs little from the transit used in the early twenty-first century. The increased demand for accuracy in railroad construction, civil engineering, and city surveys led to the rapid acceptance of the transit. An influx of tradesmen from the Germanic states in the 1830s and 1840s provided a means of manufacturing precision instruments in volume. To help with mathematical calculations, surveyors began experimenting with a number of nonelectric calculators, including Thacher's Calculating Instrument, patented in 1881, which was the equivalent of a 360-inch-long slide rule precise to 1:10,000. Slide rules replaced

Makeshift Facilities. Photographers accompanying surveyors in the West had to process their film in a tent (as shown here) or wagon. Library of Congress



calculating instruments, calculators replaced slide rules, and computers have replaced calculators. America's original thirteen colonies, as well as a few states such as Texas and Kentucky, were originally surveyed by metes and bounds, the process of describing boundaries by courses and distances between fixed points and landmarks. On 7 May 1785, Congress adopted the Governmental Land Surveys, which provided for the "rectangular system," which measured distances and bearings from two lines at right angles and established the system of principal meridians, which run north and south, and base lines, running east and west. Under the Northwest Ordinance of 1787, Ohio served as the experimental site for the new public lands surveying system. The lessons learned culminated in the Land Ordinance of 1796, which determined the surveying and numbering scheme used to survey all remaining U.S. public lands. The first government-sanctioned survey was the Survey of the Coast, established in 1807 to mark the navigational hazards of the Atlantic Coast. Under Superintendent Ferdinand Hassler, the survey used crude techniques, including large theodolites, astronomical instruments, plane table topography, and lead line soundings to determine hydrography. Despite these rudimentary methods, the survey achieved remarkable accuracy.

By the time the Coast Survey was assigned to map Alaska's coast, after Alaska was acquired in 1867, technological advancements had provided new kinds of bottom samplers, deep-sea thermometers, and depth lines. A new zenith telescope determined latitude with greater accuracy, and the telegraph provided a means of determining longitudinal differences by flashing time signals between points. Inland, surveys were more informal. Often under sponsorship from the Army, explorers such as Meriwether Lewis and William Clark, Zebulon Pike, and Stephen H. Long went out on reconnaissance missions, gathering geographic, geologic, and military information. After the Civil War (1861–1865), westward migration created a need for detailed information about the trans-Mississippi West. Congress authorized four surveys named after their leaders: Clarence King, F. V. Hayden, John Wesley Powell, and George M. Wheeler. In addition to topography and geography, these surveys studied botany, paleontology, and ethnology. The U.S. Geological Survey was formed in 1879 and began mapping in the 1880s, relying on the chain-and-compass method of surveying. By the early 1900s, surveyors were working with plane tables equipped with telescopic alidades with vertical-angle arcs, allowing lines of survey to be plotted directly from the field.

Surveying Camp. This 1912 photograph shows a noon camp of surveyors in the southwestern part of the Jornada Range Reserve, New Mexico. National Archives and Records Administration



Leveling instruments have been used since 1896 to set permanent elevation benchmarks. Aerial photography came into use as a survey tool following World War I (1914–1918), and photogrammetry was widely used by the 1930s. Today, satellites enable surveyors to use tools as sophisticated as the global positioning system (GPS), which can eliminate the need for a line-of-sight survey.

Cazier, Lola. Surveys and Surveyors of the Public Domain, 1785–1975. Washington, D.C.: U.S. Department of the Interior, Bureau of Land Management, 1993. Thompson, Morris M. Maps for America: Cartographic Products of the U.S. Geological Survey and Others. Reston, Va.: U.S. Government Printing Office, 1979. "Virtual Museum of Surveying." Ingram-Hagen & Co.; updated June 2002. Available at http://www.surveyhistory.org

T. L. Livermore See also Geography; Geological Survey, U.S.; Geological Surveys, State; Geology; Geophysical Explorations.

SUSQUEHANNA COMPANY. See Wyoming Valley, Settlement of.

SUSSEX CASE. On 24 March 1916, a German submarine attacked the English Channel steamer Sussex. The United States, regarding this action as a violation of the pledge given by the German government in the 1915 Arabic case, responded that unless Germany stopped using submarines against passenger and freight vessels, it would sever diplomatic relations. The German government gave the necessary assurances, but with the qualification that the United States should require Great Britain to abandon the blockade of Germany. The United States refused to accept the German qualification. Consequently, when Germany renewed submarine warfare on 1 February 1917, the United States severed relations. BIBLIOGRAPHY

Safford, Jeffrey J. Wilsonian Maritime Diplomacy, 1913–1921. New Brunswick, N.J.: Rutgers University Press, 1978. Terraine, John. The U-boat Wars, 1916–1945. New York: Putnam, 1989.

Bernadotte E. Schmitt / a. e. See also Great Britain, Relations with; Lusitania, Sinking of the; World War I; World War I, Navy in.

SUTTER'S FORT. In 1841 John Sutter (1803–1880) established a fort in California's Sacramento Valley as the trade and commercial center of his New Helvetia colony. It contained a central building constructed of adobe bricks, surrounded by a high wall with bastions on opposite corners to guard against attack. Built around the interior of the wall were the workshops and stores that produced all goods necessary for New Helvetia to function as a self-supporting community. Sutter's Fort housed a kitchen, able to serve up to two hundred workers and visitors a day; carpenter and blacksmith shops; a bakery and blanket factory; a general store and jail; and rooms that Sutter provided free to the region's new immigrants. Sutter's Fort is most often associated with James Marshall's discovery of gold in 1848, but the ensuing gold rush resulted in the destruction of the fort and its resources by miners and fortune hunters, and in the financial ruin of John Sutter. Sutter left New Helvetia in 1850, and Sutter's Fort fell into disrepair. When restoration efforts began in 1890, the central building was all that remained. The fort has been reconstructed and restored and is now maintained and administered as a California State Park.

Sutter's Fort. This 1847 wood engraving shows how it looked a short time before the gold rush ruined Sutter and doomed the fort. Library of Congress

BIBLIOGRAPHY

Gudde, Erwin G. Sutter's Own Story: The Life of General John Augustus Sutter and the History of New Helvetia in the Sacramento Valley. New York: Putnam, 1992. The original edition was published in 1936. Lewis, Oscar. Sutter's Fort: Gateway to the Gold Fields. Englewood Cliffs, N.J.: Prentice-Hall, 1966. Payen, Louis A. Excavations at Sutter's Fort, 1960. Sacramento: State of California, Department of Parks and Recreation, Division of Beaches and Parks, Interpretive Services, 1961. Facsimile reprint, Salinas, Calif.: Coyote Press, n.d.

Brenda Jackson See also Gold Rush, California.

SWARTHMORE COLLEGE, chartered by the state of Pennsylvania in 1864, was founded by the Hicksite Quakers, who split from the orthodox branch of the Society of Friends in 1827. The name derived from Swarthmoor Hall, the home of George Fox, the English founder of Quakerism. Swarthmore’s governors were required to



be Quakers until the college became nominally nonsectarian soon after 1900, although the Quaker influence continued. The college was coeducational from the start. Its first graduating class in 1873 consisted of one man and five women. Early Swarthmore was shaped by a struggle between liberals, who wanted an urban location, moderate social rules, and a quality collegiate education; and traditionalists, who envisioned a rural institution, a “guarded” education to preserve Quaker traditions, and preparatory work for students not ready for college. The choice of a semirural location eleven miles southwest of Philadelphia was the first of several compromises, but tensions continued. The eighteen-year administration of the traditionalist Edward Magill, who replaced Edward Parrish as president in 1871, was marked by debates over social rules, teacher training, and precollege work. By 1890 rules were relaxed and the departments of teacher training and preparatory work were eliminated. The curriculum, already strong in science and engineering, was enriched by the expansion of electives and the creation of endowed professorships. During the presidency of Joseph Swain, from 1902 to 1921, Swarthmore joined the collegiate mainstream, boasting nationally competitive sports teams and an array of extracurricular activities. An honors system developed by President Frank Aydelotte between 1920 and 1940 featured seminars modeled on Oxford tutorials and earned Swarthmore a national reputation for academic excellence. The turmoil of the 1960s, the proliferation of new academic fields, expanded overseas study, and increased work in the performing arts brought change to the honors program but did not alter Swarthmore’s position among the nation’s top liberal arts colleges. Swarthmore grew steadily, supported by private donations and gifts from educational foundations. When its main building, later called Parrish, was rebuilt after a disastrous fire in 1881, students named the college newspaper the Phoenix to celebrate the rise of the college from the ashes. Between 1920 and 2001 an endowment of $3 million increased to $1 billion, and a student body of 500 grew to more than 1,400. In 2001 approximately one of three undergraduates was a person of color, while another 7 percent represented more than forty other countries. At the beginning of the twenty-first century the college’s seventeen thousand living alumni included three Nobel Prize winners plus many leading scientists, academics, professionals, and social activists.

BIBLIOGRAPHY

Clark, Burton R. The Distinctive College: Antioch, Reed, and Swarthmore. Chicago: Aldine, 1970. Leslie, W. Bruce. Gentlemen and Scholars: College and Community in the “Age of the University,” 1865–1917. University Park: Pennsylvania State University Press, 1992.


Walton, Richard J. Swarthmore College: An Informal History. Swarthmore, Pa.: Swarthmore College, 1986.

Robert C. Bannister See also Education, Higher: Colleges and Universities.

SWEATSHOP refers to both a workplace and a labor system. Sweated work is undesirable, unhealthy, and undemocratic. Sweated labor is characterized by harsh conditions, long hours, very low wages, and job insecurity, and it often takes place in illegal and temporary workplaces. Sweatshops are often small, temporary garment "shops." Historically, however, sweated workers have often toiled in their own homes, in a system called homework that frequently involved child labor. Sweated industries tend to be those with intense competition and often seasonal production, requiring little capital outlay, almost no technological innovation, and a constant supply of cheap, unskilled labor. The sweatshop is an extreme example of what economists call "flexible specialized production." The three key elements are the avoidance of fixed costs, a fixed labor force, and fixed rules. By being flexible, producers can adjust supply to demand quickly, cutting the risk of long-term investment. They can expand to meet new demand and contract during downturns. Producers avoid union rules and legal regulations and restrictions that set wages, benefits, and conditions by working in hidden shops and moving frequently. Sweated labor systems shift the social responsibility of production onto society at large. They create a secondary labor market, which often involves the most vulnerable of workers: immigrants (often illegal), young women, and the undereducated. Sweatshop labor systems are most often associated with garment and cigar manufacturing of the period 1880–1920. Sweated labor can also be seen in laundry work, green grocers, and most recently in the "day laborers," often legal or illegal immigrants, who landscape suburban lawns. Sweatshops became visible through the public exposure given to them by reformers in the late nineteenth and early twentieth centuries in both England and the United States. In 1889–1890, an investigation by the House of Lords Select Committee on the Sweating System brought attention to the problem in Britain. In the United States the first public investigations came as a result of efforts to curb tobacco homework, which led to the outlawing of the production of cigars in living quarters in New York State in 1884. In an effort to eliminate these inhumane conditions, reformers focused on three principal areas: support of labor unions, a more active state that better regulated the economy, and an informed consumer (the national consumers' movement). Until the late twentieth century, it was assumed that the federal minimum wage and maximum hours

of 1938, part of larger New Deal social and economic reforms, had curtailed sweatshops in the United States. Unfortunately, America rediscovered its sweatshops. In August 1995, federal agencies raided a compound of several apartments in El Monte, California. These residences functioned as a large-scale sweatshop. There, seventy-two illegal Thai immigrants lived and worked in inhumane conditions, sewing sixteen hours a day on garments for several nationally prominent retailers. Discoveries of additional sweatshops led reformers, unionists, and student activists to revive the antisweatshop movement through organizations such as the Union of Needletrades, Industrial and Textile Employees (UNITE) and Students Against Sweatshops.

Sweatshop. Puerto Rican garment workers operate sewing machines in New York City. Arte Publico Press

BIBLIOGRAPHY

Boris, Eileen. Home to Work: Motherhood and the Politics of Industrial Homework in the United States. Cambridge, U.K.: Cambridge University Press, 1994.

Green, Nancy L. Ready-to-Wear and Ready-to-Work. Durham, N.C.: Duke University Press, 1997.

Ross, Andrew, ed. No Sweat: Fashion, Free Trade, and the Rights of Garment Workers. New York: Verso, 1997.

Storrs, Landon R. Y. Civilizing Capitalism. Chapel Hill: University of North Carolina Press, 2000.

Richard A. Greenwald

See also Business, Regulation of; Homework; Piecework; Wages and Hours of Labor, Regulation of.

SWEDENBORGIAN CHURCHES, or the Churches of the New Jerusalem, follow the teachings of Emanuel Swedenborg, an eighteenth-century Swedish scientist and theologian. At the end of a distinguished scientific career, Swedenborg began experiencing an ability to converse with spirits and angels and turned his attention to the relation between the spiritual and material worlds. His theological beliefs included a spiritual and allegorical method of interpreting scripture, a belief that his own spiritual revelations took part in the ongoing second coming of Christ, and an understanding of the afterlife as a continuation of each individual’s freely chosen spiritual path. Swedenborg denied the orthodox doctrine of the Trinity along with original sin, vicarious atonement, and bodily resurrection. In 1783, followers of Swedenborg began meeting in London, where Swedenborg’s



books had been published. After reading Swedenborg’s Heaven and Hell on his voyage to Philadelphia, James Glen, a planter, introduced Swedenborg to the New World with a series of lectures in 1784. Swedenborg’s ideas then spread to Boston and New York and through the missionary work of the tree-planter and Swedenborgian “Johnny Appleseed” ( John Chapman), to parts of the Middle West. By 1817, when the General Convention of the Church of the New Jerusalem was established, the church had about 360 members in nine states. Membership grew quickly after 1850, reaching its peak of about 10,000 in 1899. During this time period, Swedenborgian thought appealed to many American intellectuals, including transcendentalists Ralph Waldo Emerson and Bronson Alcott, members of Owenite and Fourierist utopian communities, and spiritualists. The small size of the church belied its sizable cultural influence, since many prominent devotees of Swedenborg, including Henry James Sr., did not join the church. In 1890, as a result of a disagreement concerning the divine authority of Swedenborg’s writings, the General Church of the New Jerusalem broke away from the General Convention. In 1999, the General Church had about 5,600 members, and the General Convention had about 2,600 members. BIBLIOGRAPHY

Block, Marguerite Beck. The New Church in the New World: A Study of Swedenborgianism in America. New York: Henry Holt, 1932.

Meyers, Mary Ann. A New World Jerusalem: The Swedenborgian Experience in Community Construction. Westport, Conn.: Greenwood Press, 1983.

Molly Oshatz

SWIFT V. TYSON, 41 U.S. (16 Peters.) 1 (1842). This Supreme Court decision interpreted the Judiciary Act of 1789’s requirement that the federal courts follow the “laws” of the states. Justice Story, for the unanimous Court, held that judicial decisions regarding matters of general commercial jurisprudence were not “laws,” but only “evidence of law,” and were not binding under the 1789 act. Thus, where no statutes addressed the issue, a federal court sitting in New York could ignore the decisions of New York courts and rule in accordance with those of other states. Swift was reversed by Erie Railroad Co. v. Tompkins, 304 U.S. 64 (1938). BIBLIOGRAPHY

Freyer, Tony Allan. Harmony and Dissonance: The Swift and Erie Cases in American Federalism. New York: New York University Press, 1981.

Stephen B. Presser

SWIMMING. The origins of swimming are lost in the murk of prehistory, but humans probably developed the


skill after watching animals “dog paddle.” Swimmers appear in artwork on Egyptian tombs, in Assyrian stone carvings, in Hittite and Minoan drawings, and in Toltec murals. Ancient gladiators swam while training, and Plato believed that a man who could not swim was uneducated. Contemporaries reported that both Julius Caesar and Charlemagne were strong swimmers. The first swimming races of which there is a record were held in Japan in 36 b.c., but England was the first modern society to develop swimming as a competitive sport. In the nineteenth century, the British competed in the breaststroke and the sidestroke, both modifications of the “dog paddle.” They were generally more interested in endurance than speed, and viewed swimming the English Channel as the supreme test. While Europeans employed the breaststroke and sidestroke, natives of the Americas, West Africa, and some Pacific Islands used variations of the crawl. Europeans got their first glimpse of this new stroke in 1844, when a group of American Indians was invited to London to compete. Flying Gull bested Tobacco by swimming 130 feet in an unheard-of 30 seconds. One observer noted that the Indians “thrashed the water violently” and compared their arm action to the “sails of a windmill.” The British were impressed with the natives’ speed, but they considered their style uncivilized. The overhand stroke was finally introduced to Britain in the 1870s by J. Arthur Trudgen, who noticed indigenous people using the technique during a trip to South America. Upon his return, he began teaching this approach to others. As British swimmers began combining the Trudgen overhand with the breaststroke frog kick, the focus of competition began to shift from distance to speed. Trudgen had failed to notice the natives’ use of the flutter kick, but this was not lost on another British swimmer, Frederick Cavill. In 1878, Cavill immigrated to Australia, where he taught swimming and built pools. During a trip to the Solomon Islands near the turn of the century, Cavill closely watched Pacific Islanders swimming. Noting the way they combined the overhand stroke with kicking action, he taught this new method to his six sons and other British e´migre´s. His sons, in turn, carried the “Australian crawl” back to England and the United States. The American swimmer Charles Daniels improved on the “Australian crawl” by timing his kick to his armstroke. Using the “American crawl,” Daniels won the United States’s first Olympic gold medal in 1904. Although the Greeks did not include swimming in the ancient Olympics, a freestyle competition was part of the first modern games held in 1896. (Freestyle meant that any stroke was allowed.) In 1900, the backstroke was added, as well as three unusual swimming events: an obstacle course, a test of underwater swimming, and a 4,000meter event. Only the backstroke competition was retained. By 1904, the crawl was becoming the dominant

freestyle stroke, so the breaststroke was made a separate event. The first American swimmer to achieve national fame was Duke Kahanamoku, a native Hawaiian who won three gold medals and two silvers in the 1912, 1920, and 1924 Olympics. Kahanamoku used six flutter kicks for each cycle of his arms, a technique that is now considered the classic freestyle form. In 1924, the twenty-year-old Johnny Weissmuller beat Kahanamoku, achieving international celebrity. In a decade of racing, Weissmuller set twenty-four world swimming records, won five Olympic gold medals, and never lost a race of between 50 yards and a half-mile. Weissmuller achieved even greater fame, however, when he went on to Hollywood to play Tarzan on the silver screen. Women were excluded from Olympic swimming until 1912 because they were considered too frail to engage in competitive sports. In the 1910s, however, the newly formed Women's Swimming Association of New York gave women an opportunity to train for competition. Gertrude Ederle, the daughter of a delicatessen owner, began setting world records in distances of between 100 and 800 meters. Wanting to win fame for her swimming club, in 1926 she became the first woman to swim the English Channel. The nineteen-year-old's time of 14 hours and 31 minutes broke the existing men's record, and Ederle returned home to a ticker-tape parade. The first American woman to win an Olympic swimming title was Ethelda Bleibtrey, who captured three gold medals in 1920.

The early twentieth century also saw a boom in leisure swimming. Americans had been going to the beach for seaside recreation ever since railroads made public beaches more accessible in the late nineteenth century. The first municipal pool in the United States was built in Brookline, Massachusetts, in 1887, and by the 1920s many cities and some wealthy homeowners had installed pools. Leisure swimming had health as well as social benefits; President Franklin D. Roosevelt swam regularly to strengthen legs weakened by paralysis, while President John F. Kennedy swam to strengthen his back muscles.

Swimsuit Scandal. Several women talk with a police officer after disobeying a prohibition on what many people regarded as skimpy bathing suits in Chicago, 1922. © UPI/Corbis-Bettmann

Beginning in the 1930s, women’s swimsuits became increasingly streamlined and revealing. (Fabric rationing during World War II [1939–1945] led to the introduction of the two-piece bathing suit, and the “bikini”—named for a U.S. nuclear testing site in the South Pacific—debuted in 1946.) Pin-up girls and starlets appeared in bathing attire, and in 1944 swimming champion Esther Williams made a splash in the film Bathing Beauty. Williams’s appearance in a string of Hollywood swimming movies in the 1940s and 1950s helped popularize synchronized swimming. Hollywood was not alone in turning a camera on swimmers. In 1934, Iowa University coach Dave Armbruster first filmed swimmers in order to study their strokes. To speed his breaststrokers, Armbruster developed a double overarm recovery known as the “butterfly.” An Iowa swimmer, Jack Seig, paired this with a “dolphin kick,” in which his body undulated from the hips to the toes. The butterfly was so exhausting that it was initially considered a novelty, but swimmers using the overhand stroke began dominating breaststroke races. In 1953, the butterfly was finally recognized as a separate competitive stroke. The final years of the twentieth century were golden for American swimmers. Mark Spitz, a butterfly and freestyle racer, garnered seven gold medals and seven world records in the 1972 Munich Olympics, the most ever in a single Olympiad. In 1992, freestyler Matt Biondi matched Spitz’s career record of 11 Olympic medals (The only other Olympian to win 11 medals was shooter Carl Osburn). In the 1980s, Tracy Caulkins became the only American swimmer ever to hold U.S. records in every stroke; she won three gold medals at the Olympics in 1984. Competing in the 1992, 1996, and 2000 Games, Jenny Thompson won ten butterfly and freestyle medals, including eight golds, the most ever captured by a woman. BIBLIOGRAPHY

Gonsalves, Kamm, Herbert, ed. The Junior Illustrated Encyclopedia of Sports. Indianapolis, Ind.: Bobbs-Merrill, 1970.

USA Swimming Official Web site. Home page at http://www.usa-swimming.org.

Yee, Min S., ed. The Sports Book: An Unabashed Assemblage of Heroes, Strategies, Records, and Events. New York: Holt, Rinehart, and Winston, 1975.

Wendy Wall



SYMBIONESE LIBERATION ARMY, a violent revolutionary group that espoused vaguely Marxist doctrines and operated in California from 1973 to 1975, undertaking a highly publicized campaign of domestic terrorism. Their 1973 assassination of the Oakland superintendent of schools, Marcus Foster, brought them to national attention. They became even more notorious the following year when they kidnapped Patricia Hearst, a wealthy newspaper heiress. In a bizarre twist, Hearst joined her captors and became an active revolutionary. A shootout with the Los Angeles police in May 1974 left six of the radicals dead, but they continued to operate throughout 1975. Subsequently, the group dissolved, as its members ended up dead, captured, or in hiding. In 1999 the SLA was once again in the headlines with the arrest of Kathleen Soliah, one of its fugitive members. She ultimately pleaded guilty to charges of aiding and abetting a plot to plant bombs in police vehicles. As of 2002, she and three other former SLA members were facing murder charges stemming from a 1975 bank robbery. BIBLIOGRAPHY

Bryan, John. This Soldier Still at War. New York: Harcourt Brace Jovanovich, 1975.

Patricia Hearst, Revolutionary. Before becoming the unlikely poster girl for the short-lived Symbionese Liberation Army, she was an heiress kidnapped by the group in 1974. AP/Wide World Photos


Hearst, Patricia. Every Secret Thing. Garden City, N.Y.: Doubleday, 1982.

Daniel J. Johnson

See also Kidnapping; Terrorism.

SYMPHONY ORCHESTRAS. While Americans have enjoyed music making since their earliest days, colonial cities at first had insufficient size and disposable income to support orchestras. By the 1750s, however, Boston, Philadelphia, and Charleston had orchestras. In the early national period, music making assumed roles that involved more than mere entertainment. In Lexington, Kentucky, for example, an orchestra was developed as a means of competing with rival city Louisville in the hope that stronger levels of culture would attract entrepreneurs and trade. In Boston, the Handel and Haydn Society was founded in 1815. It maintained regular concerts and quickly became a center for the city's culture. This was the first music organization prominently to conceive and use its influence in explicitly conservative ways: to maintain established traditions and to discourage what were seen as corrupting "modern" trends. German immigrants in the 1840s and 1850s sparked the formation of orchestras and festivals in many cities. In 1842, the New York Philharmonic Society was established. In 1878, a second orchestra, the New York Symphony, emerged. The two were rivals until they merged in 1928, although the New York music public had shown it could support two full orchestras. While there was a highbrow-lowbrow dichotomy in nineteenth-century America, the popularity of symphony orchestras, and opera as well, touched many beyond the wealthy classes, especially among immigrant groups, including German and Italian Americans. Grand orchestra concerts were a rage in mid-nineteenth-century America. While the proceeds were good and the mainstream public was satisfied, some critics and serious music lovers noted the often middling (or worse) quality of the music making. In an age when corporations were eclipsing many older means of providing goods and services, the organization of the American symphony orchestra began to evolve from individual entrepreneurialism toward corporate forms. An example is the Boston Symphony, founded in 1881. The investment banker Henry L. Higginson, an ardent music lover, was impatient with the ragtag nature and substandard performances of American musical organizations. Higginson used his considerable financial power to hire the best conductors and musicians and bind them to contracts that restricted their outside activities and demanded regular rehearsals; he simply paid everyone enough to make the arrangement irresistible. While Higginson's corporate order restricted musicians' freedom, musically it worked wonders. Other cities followed suit, and the United States witnessed the establishment of many of its major orchestras in the generations after Higginson founded the Boston Symphony.


World War I interrupted not the quality but the character of American symphony orchestras. Before 1917, Austro-German traditions had utterly dominated the repertoire of orchestras. Also, conductors and personnel were largely German. The war changed this. Repertoire turned to French, Russian, and American composers, and while Austro-German music quickly reemerged in programming, it never again reached its position of prewar dominance. More starkly, personnel shifted markedly. Some German orchestra members returned home and never came back. War hysteria pressured several conductors. Frederick Stock of the Chicago Symphony had to resign his post for the war’s duration. Two conductors— Ernst Kunewald of Cincinnati and Karl Muck of Boston—were investigated by the Justice Department and arrested under suspicion of subversion and spying. Both spent the war in an internment camp and were subsequently compelled to accept deportation. Despite personnel shifts during the war, the quality of music making never flagged. Orchestras’ popularity continued to grow, and in the 1920s many—especially the Boston Symphony of Serge Koussevitsky—began to champion works by American composers. This put the symphony orchestras more to the center of the major aesthetic issues among modern American artists, critics, and audiences. The heat of the debates here, combined with the increasing presence of music making among the general population with the proliferation of records and radio, made the symphony orchestras of the nation a central part of the country’s cultural life in the 1920s and 1930s. Radio was especially important in maintaining this presence during the depression, when many smaller orchestras folded. The New Deal Works Progress Administration’s music project helped too, as it sponsored many touring symphony orchestras and presented public concerts for minimal prices. Most famously, in 1937, the National Broadcasting Company (NBC) also began live radio concerts. Not content with the best orchestras in New York City or anywhere else, NBC president Robert Sarnoff hired conductor Arturo Toscanini to put together a hand-picked orchestra. The NBC orchestra concerts became a Sunday afternoon mainstay for millions of households. Many still think it was the greatest orchestra ever assembled. Walt Disney added further to the symphony’s visibility in the cultural life of the nation when he hired Leopold Stokowski and the Philadelphia Orchestra for the animated movie Fantasia (1940). After World War II, orchestras continued to flourish, especially with the breakdown of barriers that had prevented Jews, African Americans, and women from playing in significant numbers. The orchestra became a perfect neutral ground for the rise of anyone with musical talent. Indeed, to prevent bias, conductors often auditioned people from behind screens. Progress took some time, but talent won in the end. Just as radio had boosted the musical presence of the symphony among virtually all levels of the American mu-

sic public, television would do much the same in the 1950s and 1960s. Here the Columbia Broadcasting System's production of Leonard Bernstein's innovative Young People's Concerts with the New York Philharmonic was pivotal in introducing new generations to the symphony. Still, it was with the television generation and with the general economic prosperity of the era that Americans began gravitating steadily toward genres of music other than the symphonic. Alternative musical forms and other entertainment in general had always been available, but a significant line seemed to be crossed in the 1970s, as in most cities the weekend symphony concert seemed less and less to be the central event in communities' cultural lives that it had once been. In this regard, the American symphony orchestra closed the last quarter of the twentieth century on less sure footing than it had once enjoyed. The cities with the greatest symphonic traditions, like Boston, New York, Philadelphia, Cleveland, and Chicago, never felt significantly imperiled, although even they occasionally experienced labor strife and financial pinches. The orchestras of other cities became more seriously troubled, and by the early twenty-first century the fate of the symphony orchestra as a mainstay in the cultural life of most American cities had ceased to be the certainty it once was.

Arian, Edward. Bach, Beethoven, and Bureaucracy: The Case of the Philadelphia Orchestra. University: University of Alabama Press, 1971.

Johnson, H. Earle. Symphony Hall, Boston. New York: Da Capo Press, 1979.

Kupferberg, Herbert. Those Fabulous Philadelphians: The Life and Times of a Great Orchestra. New York: Scribners, 1969.

Mueller, John Henry. The American Symphony Orchestra: A Social History of Musical Taste. Bloomington: Indiana University Press, 1951.

Mussulman, Joseph A. Music in the Cultured Generation: A Social History of Music in America, 1870–1900. Evanston, Ill.: Northwestern University Press, 1971.

Otis, Philo Adams. The Chicago Symphony Orchestra: Its Organization, Growth, and Development, 1891–1924. Freeport, N.Y.: Books for Libraries Press, 1972.

Swoboda, Henry, comp. The American Symphony Orchestra. New York: Basic Books, 1967.

Alan Levy

See also Music Festivals; Music Industry; Music: Classical, Early American.

SYNDICALISM, or revolutionary industrial unionism, originated in France but has been identified in the United States with the Industrial Workers of the World (IWW), founded in 1905. The IWW sought strong, centralized unions, while French syndicalists preferred smaller unions. Both opposed action through existing governments.



Syndicalists sought to establish a producers’ cooperative commonwealth, with socially owned industries managed and operated by syndicats, or labor unions. Emphasizing class struggle, they advocated direct action through sabotage and general strikes. Opponents, criticizing the movement for militant actions, opposing political government, and condoning violence, secured antisyndicalist laws in several states. The syndicalist movement waned after World War I when many former adherents joined Communist, Trotskyite, or other Socialist groups.


BIBLIOGRAPHY

Kimeldorf, Howard. Battling for American Labor: Wobblies, Craft Workers, and the Making of the Union Movement. Berkeley: University of California Press, 1999.

Gordon S. Watkins / c. w.

See also Communist Party, United States of America; Industrial Workers of the World; Labor.

T TABERNACLE, MORMON. This unique Salt Lake City auditorium, built by the Church of Jesus Christ of Latter-day Saints between 1864 and 1867 at a cost of about $300,000, was designated a National Historic Landmark in 1970 and a National Civil Engineering Landmark in 1971. Its interior is 150 feet wide, 250 feet long, and 80 feet high, and accommodates nearly 8,000 people. The Tabernacle’s most distinctive feature is a nine-foot-thick tortoise-like roof, designed by a bridge-builder and constructed without nails. A network of lattice arches, resting on buttresses in the outside walls but with no interior support, forms this remarkable dome. Timbers were fastened together with wooden dowels. Split timbers were bound with rawhide that, as it dried, contracted and held them tight.

supervise the adjustment of the Philippine Islands’ government from military command to civil rule. The fivemember commission assumed legislative authority on 1 September 1900, less than two years after Spain ceded the Philippines to the United States following the SpanishAmerican War of 1898. On 4 July 1901, William Howard Taft, president of the commission, became the Philippines’ first civilian governor.

The tabernacle is notable also for its outstanding acoustics and its famous organ, which by the early twentyfirst century contained over 11,600 pipes. In 1994 the Organ Historical Society cited it as “an instrument of exceptional merit, worthy of preservation.”

On 1 September 1901, three Filipinos were appointed to the Taft Commission, and each American member became an executive department head. However, unstable economic conditions became a catalyst for the creation of a Filipino resistance movement dedicated to achieving immediate independence. To quell growing opposition, the United States promulgated a Sedition Law on 4 November 1901, making the advocacy of independence punishable by death or long imprisonment.

The first meeting in the Tabernacle was a general conference of the church in 1867. These semiannual gatherings were held there until 1999, after which they were transferred to the new and more spacious conference center. In the early twenty-first century the building continued to be used for organ recitals, concerts, religious services, and various public functions. As the home of the renowned Mormon Tabernacle Choir, it also hosted regular Sunday broadcasts over the CBS radio and television networks. BIBLIOGRAPHY

Anderson, Paul L. “Tabernacle, Salt Lake City.” In Encyclopedia of Mormonism. Edited by Daniel H. Ludlow et al. New York: Macmillan, 1992. Grow, Stewart L. A Tabernacle in the Desert. Salt Lake City, Utah: Deseret Book Company, 1958.

The commission defined its mission as preparing the Filipinos for eventual independence, and focused on economic development, public education, and the establishment of representative institutions. The commission went on to establish a judicial system, organize administrative services, and create a legal code that included laws regarding health, education, agriculture, and taxation.

In July 1902, a legislature was established that included a popularly elected Lower House and the Taft Commission, which was also known as the Second Philippine Commission. Five years later, the reorganization went into effect and elections for the assembly took place, but franchise was limited to owners of substantial property who were also literate in English or Spanish. After considerable Filipino lobbying and the capture of resistance leader Emilio Aguinaldo, the TydingsMcDuffie Act was passed. It provided for a ten-year period of “Commonwealth” status, beginning in 1935. On 4 July 1946, the United States granted the Philippines complete independence.

James B. Allen James T. Scott See also Latter-day Saints, Church of Jesus Christ of.

TAFT COMMISSION. President William McKinley appointed the Taft Commission on 16 March 1900 to

TAFT-HARTLEY ACT (1947). Passed by Congress over the veto of President Harry Truman, the TaftHartley Act enacted a number of significant amendments



to the National Labor Relations Act of 1935. The 1935 law, known as the Wagner Act, may have been the most radical legislation of the twentieth century, recognizing and giving federal protection to workers' rights to organize, to form unions, to engage in strikes and other "concerted activities," including picketing, and to bargain collectively with their employers. The Wagner Act overturned a vast body of older, judge-made laws, which had enshrined, as a right of private property, employers' freedom to refuse to deal with unions or union workers. Now, the Wagner Act required them to bargain collectively with employees, and it forbade them to interfere with workers' new statutory rights. No longer could employers punish or fire pro-union employees or avoid independent unions by creating company-dominated unions; and no longer could they refuse to bargain in good faith with the unions that workers chose to represent them. What was more, the 1935 legislation created a new federal agency, the National Labor Relations Board (NLRB), to supervise union elections and bring "unfair labor practices" charges against employers who violated the Act. The enforcement tools at the Board's disposal were never formidable; nonetheless, the spectacle of federal support behind vigorous industrial union drives both emboldened workers and enraged much of the business community and its supporters in Congress. During the dozen years since Congress passed the Wagner Act, the labor movement had quintupled in size, reaching roughly 15 million members, or 32 percent of the nonfarm labor force. A substantial majority of the workforce of such key industries as coal mining, railroads, and construction belonged to unions. Thus, by 1947 organized labor had become "Big Labor" and a mighty power in the public eye, and the complaints of business that the National Labor Relations Act was a one-sided piece of legislation began to resonate. The Act safeguarded workers' rights and enshrined collective bargaining but provided no protection for employers or individual employees against the abuses or wrongdoing of unions.

Changes after World War II

The end of World War II (1939–1945) saw a massive strike wave, which helped turn public opinion against "Big Labor." Thus, when the Republicans won both houses of Congress in the 1946 elections, new federal labor legislation was almost inevitable. Indeed, in the decade preceding 1946, well over 200 major bills setting out to amend the Wagner Act had been introduced in Congress. These bills had rehearsed the main themes of the complex and lengthy Taft-Hartley Act. The gist of Taft-Hartley, according to its proponents, was to right the balance of power between unions and employers. The Wagner Act, they claimed, was tilted toward unions; Taft-Hartley would protect employers and individual workers. For the latter, the new law contained provisions forbidding the closed shop and permitting states to outlaw any kind of union security clauses in collective agreements.


Already, several states, led by Florida and Arkansas, had adopted so-called right-to-work measures, outlawing any form of union security—not only the closed shop, but also contract provisions that required workers who declined to join the union to pay their share of expenses for bargaining and processing grievances. By sanctioning right-to-work statutes, Taft-Hartley did not injure "Big Labor" in the industrial heartland, so much as help thwart union advance in traditionally anti-union regions like the South and the prairie states.

The Taft-Hartley Act Brings Changes

For employers, the Act created a list of union "unfair labor practices," where previously the Wagner Act had condemned only employer practices. Taft-Hartley also greatly expanded the ability of both employers and the Board to seek injunctions against unions, thus undermining some of the protections against "government by injunction" that labor had won in the 1932 Norris-LaGuardia Act. It gave employers the express right to wage campaigns against unions during the period when workers were deciding and voting on whether to affiliate with a union. Previous Board policy generally had treated these processes as ones in which workers ought to be free to deliberate and decide free from employer interference. The new law also banned secondary boycotts and strikes stemming from jurisdictional disputes among unions. These provisions chiefly affected the older craft unions of the American Federation of Labor, whose power often rested on the capacity for secondary and sympathetic actions on the part of fellow union workers outside the immediate "unfair" workplace. By contrast, Taft-Hartley's anticommunist affidavit requirement, like its sanction for right-to-work laws, fell most heavily on the Congress of Industrial Organizations (CIO). The statute required that all union officials seeking access to NLRB facilities and services sign an affidavit stating that they were not communists. The requirement rankled because it implied that unionists were uniquely suspect. The law did not require employers or their agents to swear loyalty, but it did demand that the representatives of American workers go through a demeaning ritual designed to impugn their patriotism or they would be unable to petition the Board for a representation election or to bring unfair labor practice cases before it. Finally, the Act changed the administrative structure and procedures of the NLRB, reflecting congressional conservatives' hostility toward the nation's new administrative agencies, exercising state power in ways that departed from common-law norms and courtlike procedures. Thus, the Act required that the Board's decision making follow legal rules of evidence, and it took the Board's legal arm, its general counsel, out of the Board's jurisdiction and established it as a separate entity. The CIO's general counsel, for his part, warned that by establishing a list of unfair union practices and by imposing on the NLRB courtlike fact-finding, the new law


would plunge labor relations into a morass of legalistic proceedings. Already under the Wagner Act, employers had found that unfair labor practice cases stemming from discrimination against union activists, firings of union-minded workers, and the like could all be strung out for years in the nation's appellate courts, rendering the Act's forthright endorsement of unionization a hollow one.

Since the late 1930s the NLRB itself had been retreating from its initially enthusiastic promotion of industrial unionism. Now, with Taft-Hartley, the Board or the independent legal counsel, who might be at odds with the Board, would have even more reason to maintain a studied "neutrality" toward union drives and collective versus individual employment relations, in place of Wagner's clear mandate in behalf of unionism. The great irony, the CIO counsel went on to say, was that so-called conservatives, who had made careers out of criticizing the intrusion of government authority into private employment relations, had created a vast and rigid machinery that would "convert . . . [federal] courts into forums cluttered with matters only slightly above the level of the police court."

And so it was. Despite its restrictions on secondary actions and jurisdictional strikes, Taft-Hartley did little to hamper the established old craft unions, like the building trades and teamsters, whose abuses had prompted them; but it went a long way toward hampering organizing the unorganized or extending unions into hostile regions of the nation, and it helped make the nation's labor law a dubious blessing for labor.

BIBLIOGRAPHY

Millis, Harry A., and Emily C. Brown. From the Wagner Act to Taft-Hartley: A Study of National Labor Policy and Labor Relations. Chicago: University of Chicago Press, 1950.

Tomlins, Christopher. The State and the Unions: Labor Relations, Law, and the Organized Labor Movement in America, 1880–1960. New York: Cambridge University Press, 1985.

Zieger, Robert H. The CIO: 1935–1955. Chapel Hill: University of North Carolina Press, 1995.

William E. Forbath

TAFT-KATSURA MEMORANDUM (29 July 1905), a so-called agreed memorandum exchanged between Secretary of War William Howard Taft, speaking for President Theodore Roosevelt, and Prime Minister Taro Katsura of Japan. The memorandum invoked Japanese-American cooperation "for the maintenance of peace in the Far East." Thus ornamented, it expressed an approval by the United States of Japanese suzerainty over Korea and a disavowal by Japan of "any aggressive designs whatever on the Philippines." Roosevelt assured Taft afterward that his "conversation with Count Katsura was absolutely correct in every respect," thus emphatically approving the agreement, which remained secret until 1925.

BIBLIOGRAPHY

Esthus, Raymond A. Theodore Roosevelt and Japan. Seattle: University of Washington Press, 1966.

Minger, Ralph E. "Taft's Missions to Japan: A Study in Personal Diplomacy." Pacific Historical Review 30 (1961).

Samuel Flagg Bemis / a. g.

See also Diplomacy, Secret; Diplomatic Missions; Japan, Relations with.

TAFT-ROOSEVELT SPLIT. When Republican President William Howard Taft took office in 1909 he did so with the support of his reform-minded predecessor Theodore Roosevelt. Within a year, however, Progressive reformers in Congress complained that the administration had allied itself with the conservative Congressional establishment. The reformers, known as Insurgents and led by Senator Robert M. La Follette of Wisconsin, took particular exception to Taft's controversial firing of Gifford Pinchot in January 1910. Pinchot, head of the Forest Service and a leading conservationist, had been a longtime friend of Roosevelt's and his firing became a rallying point for Progressives. On his return from a year-long trip to Africa, Roosevelt consulted with Pinchot and other Progressive leaders and plotted a political comeback. In a speech in Kansas in August 1910, Roosevelt attacked Taft's conservatism and proposed a sweeping program of reforms he called the "New Nationalism." At the 1912 Chicago convention, Roosevelt contested for the Republican nomination, but conservative party leaders defiantly renominated Taft. Outraged by the conservatives' heavy-handed tactics, Roosevelt organized the Bull Moose Progressive Party, and became its candidate for president. The split between Roosevelt and Taft allowed the Democratic candidate, Woodrow Wilson, to win the presidency with only about 42 percent of the vote.

BIBLIOGRAPHY

Broderick, Francis L. Progressivism at Risk: Electing a President in 1912. New York: Greenwood, 1989.

Harbaugh, William H. The Life and Times of Theodore Roosevelt. New York: Oxford University Press, 1975.

Mowry, George E. Theodore Roosevelt and the Progressive Movement. Madison: University of Wisconsin Press, 1946.

Edgar Eugene Robinson / a. g.

See also Bull Moose Party; Conservation; Conventions, Party Nominating; Elections, Presidential: 1912; Progressive Movement; Progressive Party, Wisconsin; Republican Party.

TAILHOOK INCIDENT. The Tailhook Association, named for the arresting gear on carrier-based aircraft, is a private group of navy and marine aviators. During the association’s 1991 annual convention in Las Vegas, eighty-three women, many of them naval officers, alleged



that they had been sexually assaulted passing through a hotel hallway filled with male officers. Secretary of the Navy H. Lawrence Garrett III and Chief of Naval Operations Adm. Frank B. Kelso II attended the convention, but both said they witnessed no improper behavior. A subsequent navy investigation was indecisive, and on 18 June 1992, Secretary Garrett asked the Defense Department's inspector general to take control of the inquiry. The next week several female victims, led by navy Lt. Paula A. Coughlin, a helicopter pilot and aide to Rear Adm. John W. Snyder, Jr., brought charges. On 26 June, Secretary Garrett resigned. Members of Congress criticized the pace of the investigation, the commitment of investigators, and the stonewalling of Tailhook members. In April 1993, the Inspector General accused 140 officers of indecent exposure, assault, and lying under oath. About fifty were fined or disciplined. Accusations in more prominent cases did not lead to court-martial convictions or even demotions. On 8 February 1994, a navy judge ruled that Admiral Kelso had misrepresented his activities at the convention and had tried to manipulate the subsequent investigation. Denying these charges, Kelso decided to retire two months early with a full pension, in return for a tribute from Secretary of the Navy John H. Dalton that stated Kelso was a man of the "highest integrity and honor." During that same week Coughlin announced her resignation, saying her career in the navy had been ruined because she had chosen to bring charges. She later received monetary awards from lawsuits against the Tailhook Association, the Hilton Hotels Corporation, and the Las Vegas Hilton Corporation.

BIBLIOGRAPHY

McMichael, William H. The Mother of All Hooks: The Story of the U.S. Navy's Tailhook Scandal. New Brunswick, N.J.: Transaction, 1997.

O'Neill, William L. "Sex Scandals in the Gender-Integrated Military." Gender Issues 16, 1/2 (Winter/Spring 1998): 64–86.

Zimmerman, Jean. Tailspin: Women at War in the Wake of Tailhook. New York: Doubleday, 1995.

Irwin N. Gertzog / c. r. p.

See also Marine Corps, United States; Navy, United States; Sexual Harassment; Women in Military Service.

TALK SHOWS, RADIO AND TELEVISION. The talk show has been an important programming format for television and radio since its earliest origins. On television, the earliest such program was Meet the Press, which first aired in 1947. The original host, Martha Rountree, was also the only woman in the program’s history to moderate discussion as politicians and other public leaders made appearances. As television’s ability to impact society grew, so did the need for expansions of the talk show format. In 1952, the Today show made its first appearance on NBC with host Dave Garroway. Soon other networks followed with similar programs, such as the

Morning Show on CBS with host Walter Cronkite. As television reached more and more homes all over the country, the talk show changed to include more entertainment and human-interest features. The Tonight Show, first with Steve Allen in 1954 and eventually Johnny Carson, established the late-night genre that remains wildly popular today. A variety of daytime talk shows have covered a number of issues with very distinct methods of delivery. Serious, issue-oriented programs like Donahue, the Oprah Winfrey Show, and Charlie Rose have been important vehicles for the discussion of important social issues. Other television talk programs have featured hosts interjecting their personal opinions to guests while fielding questions from the audience. The growth of "trash TV" began in the early 1980s with the Morton Downey, Jr. Show. These programs featured incendiary guests who would often come to blows in discussions of race, sexual preference, and infidelity. Many times the hosts themselves would become involved, as when Geraldo Rivera suffered a broken nose during a fracas in one episode of his syndicated talk program. The Jerry Springer Show became a national force in the 1990s and found itself at the center of controversy about the violence and lack of moral content on television in America. These various forms of talk shows continued to dominate afternoon television programming at the turn of the twenty-first century. Radio talk programs evolved over the years as the daily commute to and from work became a high-ratings time slot for that medium. Talk radio programs have become an important political force. Various liberal and conservative hosts voice their views in daily programs. Rush Limbaugh became one of the most well known and well paid of these political hosts, specializing in espousing conservative views and deriding then President Bill Clinton. National Public Radio, founded in 1970, serves over fifteen million listeners and provides two popular talk-news programs, All Things Considered and Morning Edition.

These programs are among the most respected in radio. Many morning radio programs are known for their comic antics and, at times, offensive humor. The Howard Stern Show became one of the first programs to shock listeners and test the limits of what could and could not be aired. This willingness to push boundaries has resulted in a large and loyal audience. Sports talk shows have become an important element of regional radio programming. These call-in talk shows allow fans of various local teams to voice concerns, ideas, and opinions about their favorite clubs. Radio talk can be on the eccentric side as well, such as the paranormal and conspiratorial discussions led by Art Bell on late-night radio.

The Dick Cavett Show. The talk show host (center) engages in conversation with heavyweight boxers Muhammad Ali (left) and Jurgen Blin, whom the once and future champion fought, and knocked out, at the end of 1971. AP/Wide World Photos

BIBLIOGRAPHY

Hirsch, Alan. Talking Heads: Political Talk Shows and Their Star Pundits. New York: St. Martin's Press, 1991.

Levin, Murray B. Talk Radio and the American Dream. Lexington, Mass.: Lexington Books, 1987.

Parish, James Robert. Let's Talk!: America's Favorite Talk Show Hosts. Las Vegas, Nev.: Pioneer Books, 1993.

Jay Parrent

See also Television Programming and Influence.

TALL STORIES is a term used in the United States to denote a comic folktale characterized by grotesque exaggeration. Although not confined to the United States, the tall story has flourished there as nowhere else and thoroughly characterizes the popular psychology that resulted from the rapid expansion of the country in the nineteenth century.

The subjects of the tall stories, or tall tales, were those things with which the tellers were familiar: weather, fauna, topography, and adventure. Long before the nation became "dust-bowl conscious," plains residents told of seeing prairie dogs twenty feet in the air digging madly to get back to the ground. In the southern highlands, astounding tales arose, such as that of the two panthers who climbed each other into the sky and out of sight, or that of David Crockett, who used to save powder by killing raccoons with his hideous grin. Tony Beaver, a West Virginia lumberman, took a day out of the calendar by arresting the rotation of the earth. A northern lumberman, Paul Bunyan, with his blue ox, Babe, snaked whole sections of land to the sawmills. Mike Fink, king of the keelboatmen, used to ride down the Mississippi River dancing Yankee Doodle on the back of an alligator. Freebold Freeboldsen, having left his team in his Nebraska field while he went for a drink, returned to find his horses eaten up by the grasshoppers, who were pitching the horses' shoes to determine which should get Freebold. Kemp Morgan, able to smell oil underground, once built an oil derrick so high that an ax falling from the crown wore out nineteen handles before it hit the ground. Pecos Bill, who according to legend dug the Rio Grande, once overpowered a Texas mountain lion, mounted him, and rode away quirting him with a rattlesnake.

Unless they were deliberately imposing on the gullibility of the tenderfoot, tall liars did not expect their audience to believe them. Sometimes they lied as a defense against assumptions of superiority. Sometimes they lied through modesty. Sometimes, finding that their listeners did not believe the truth, they lied to regain their reputations for veracity. Sometimes they lied with satiric intent. Mostly, however, they lied because they were storytellers of imagination and resource and knew how to make the time pass pleasantly. In lying, they gave the United States some of its most characteristic folklore.

BIBLIOGRAPHY

Brown, Carolyn S. The Tall Tale in American Folklore and Literature. Knoxville: University of Tennessee Press, 1987.

Dorson, Richard Mercer. Man and Beast in American Comic Legend. Bloomington: Indiana University Press, 1982.

Wonham, Henry B. Mark Twain and the Art of the Tall Tale. New York: Oxford University Press, 1993.

Mody C. Boatright / a. e.

See also Slang.

TALLMADGE AMENDMENT, a bill proposed on 13 February 1819 by Rep. James Tallmadge of New York to amend Missouri enabling legislation by forbidding the further introduction of slavery into Missouri and declaring that all children born of slave parents after the admission of the state should be free upon reaching the age of twenty-five. The bill provoked heated debate in Congress and nationwide agitation, marking the beginning of sectional controversy over the expansion of slavery. The slave section was convinced of the necessity of maintaining equal representation in the Senate. The House adopted the amendment but the Senate rejected it. The Missouri Compromise (1820) settled the issue.

BIBLIOGRAPHY

Fehrenbacher, Don E. Sectional Crisis and Southern Constitutionalism. Baton Rouge: Louisiana State University Press, 1995.

John Colbert Cochrane / c. w.

See also Antislavery; Sectionalism.

TAMMANY HALL. Founded in May 1789 by William Mooney, the Society of Saint Tammany originally began as a fraternal organization that met to discuss politics at Martling’s Tavern in New York City. Enthusiastically pro-French and anti-British, the Tammany Society became identified with Thomas Jefferson’s DemocraticRepublican Party. By 1812 the society boasted some 1,500 members and moved into the first Tammany Hall at the corner of Frankfurt and Nassau streets. In the “labyrinth


of wheels within wheels" that characterized New York politics in the early nineteenth century, Tammany was the essential cog in the city's Democratic wheel, and carried New York for Andrew Jackson and Martin Van Buren in the elections of 1828 and 1832. The adoption by the state legislature in 1826 of universal white male suffrage and the arrival each year of thousands of immigrants changed the character of New York City and of its politics. Despite some early xenophobia, the Tammany leaders rejected the nativism of the Know-Nothing Party. Realizing the usefulness of the newcomers, they led them to the polls as soon as they were eligible to vote; in turn, the new voters looked to the local Democratic district leader as a source of jobs and assistance in dealing with the intricacies of the burgeoning city bureaucracy. Upon the election of Fernando Wood as mayor in 1854, city hall became and remained a Tammany fiefdom. With the elevation of William Marcy Tweed to grand sachem of the Tammany Society in 1863, Tammany became the prototype of the corrupt city machine, and for a time its power extended to the state capital after Tweed succeeded in electing his own candidate, John Hoffman, governor. The corruption of the Tweed Ring was all pervasive. Tweed and his associates pocketed some $9 million, padding the bills for the construction of the infamous Tweed Courthouse in City Hall Park. The estimated amounts they took in graft, outright theft, real estate mortgages, tax reductions for the rich, and sale of jobs range from $20 million to $200 million. Tweed ended his spectacular career in jail, following an exposé of the ring by the New York Times and Harper's Weekly, whose famous cartoonist, Thomas Nast, lashed out at the boss week after week, depicting him in prison stripes and Tammany as a rapacious tiger devouring the city. "Honest" John Kelly turned Tammany into an efficient, autocratic organization that for several generations dominated New York City politics from clubhouse to city hall.

Tammany Hall. The headquarters in New York City, c. 1900, of the once-powerful—and sometimes exceptionally corrupt—fraternal organization and Democratic political machine. AP/Wide World Photos


Kelly’s successor as Tammany leader was Richard Croker, who was somewhat more in the Tweed mold; he took advantage of the smooth-running Kelly machine to indulge his taste for thoroughbred horses, fine wines, and high living. Croker initiated the alliance between Tammany and big business, but Charles Francis Murphy, his successor, perfected it. Contractors with Tammany connections built the skyscrapers, the railroad stations, and the docks. A taciturn former saloonkeeper who had been docks commissioner during the administration of Mayor Robert A. Van Wyck, Murphy realized that the old ways were no longer appropriate. He set about developing the so-called New Tammany, which, when it found it was to its advantage, supported social legislation; sponsored a group of bright young men like Alfred E. Smith and Robert Wagner Sr. for political office; and maintained control of the city by its old methods. Murphy died in 1924 without realizing his dream of seeing one of his young men, Al Smith, nominated for the presidency. Murphy was the last of the powerful Tammany bosses. His successors were men of little vision, whose laxity led to the Seabury investigation of the magistrates courts and of the city government. In 1932, Mayor James J. Walker was brought up on corruption charges before Governor Franklin D. Roosevelt but resigned before he was removed from office. In retaliation the Tammany leaders refused to support Roosevelt’s bid for the Democratic nomination for president, and tried to prevent Herbert H. Lehman, Roosevelt’s choice as his successor, from obtaining the gubernatorial nomination. As a result, the Roosevelt faction funneled federal patronage to New York City through the reform mayor, Fiorello La Guardia (a nominal Republican). The social legislation of the New Deal helped to lessen the hold of the old-time district leaders on the poor, who now could obtain government assistance as a right instead of a favor. Absorption of most municipal jobs into civil service and adoption of more stringent immigration laws undercut the power base of the city machines. In the 1960s the New York County Democratic Committee dropped the name Tammany; and the Tammany Society, which had been forced for financial reasons to sell the last Tammany Hall on Union Square, faded from the New York scene. BIBLIOGRAPHY

Callow, Alexander B., Jr. The Tweed Ring. New York: Oxford University Press, 1966.

Mandelbaum, Seymour J. Boss Tweed's New York. New York: Wiley, 1965.

Moscow, Warren. The Last of the Big-Time Bosses: The Life and Times of Carmine De Sapio and the Rise and Fall of Tammany Hall. New York: Stein and Day, 1971.


Mushkat, Jerome. Tammany: The Evolution of a Political Machine, 1789–1865. Syracuse, N.Y.: Syracuse University Press, 1971.

Catherine O’Dea / a. g. See also Civil Service; Corruption, Political; Democratic Party; Machine, Political; Rings, Political; Tammany Societies; Tweed Ring.

TAMMANY SOCIETIES. Organizations patterned after the New York and Philadelphia Tammany societies appeared in several states about 1810. Originally founded as fraternal organizations, Tammany societies quickly evolved into political machines that controlled local elections and offices. Rhode Island politics were controlled by a local Tammany society in 1810–11; an Ohio society played an active part in the factional struggles of Republicans in 1810–12. The first Ohio “wigwam” was authorized by a dispensation from Michael Leib, grand sachem of the Philadelphia society, although there is little other evidence of any central organization. The constitution and ritual were those of a patriotic fraternal order of a democratic character. BIBLIOGRAPHY

Mushkat, Jerome. Tammany: The Evolution of a Political Machine, 1789–1865. Syracuse, N.Y.: Syracuse University Press, 1971.

Utter, William Thomas. Ohio Politics and Politicians, 1802–1815. New York, 1929.

Eugene H. Roseboom / a. g.

See also Machine, Political; Rings, Political.

TAMPA–ST. PETERSBURG. These twin cities, located on Florida’s west coast, comprise, along with surrounding communities, metropolitan Tampa Bay. The area has been the site of successive Native American cultures as attested by numerous burial mounds. The first European to enter Tampa Bay was Panfilo de Narvaez in 1528, thus Tampa–St. Petersburg occupy the earliest site of European discovery of any metropolitan area in the United States. Hernando de Soto explored the region by land in 1539. By 1767 Seminole Indians had reached Tampa Bay; after the First Seminole War (1817–1818) Fort Brooke, which was to become the focal point for the future settlement of Tampa, was established in 1824. Tampa was platted in the early 1850s, by which time it had become the railhead for cattle bound for Cuba, where jerked beef was needed as cheap protein for slaves of the island’s burgeoning sugar industry. In 1861 Fort Brooke was occupied by Confederate troops; it was bombarded by Union vessels in 1862 and finally captured in 1864. Tampa saw little growth until the 1880s, when Henry B. Plant brought the first coordinated system of rail lines into the village (1884), thus linking Tampa with Jacksonville, Florida, and New York City. It was during this same

decade that Cubans established Tampa’s cigar industry and the area known as Ybor City, today a famous tourist destination. Tampa served as the embarkation point for U.S. troops, including Teddy Roosevelt’s Rough Riders, sailing for Cuba during the Spanish-American War of 1898. The early twentieth century saw the growth of the phosphate and citrus industries in the area, while Russian railroad entrepreneur Pyotr Dementyev brought his line onto the Pinellas Peninsula and laid out St. Petersburg, which he named for the city of his homeland. Great resort hotels noted for their fanciful architecture were constructed in both Tampa and St. Petersburg for northern guests. Both cities experienced spectacular growth at the end of the twentieth century, when the population of metropolitan Tampa Bay reached 2,395,997. BIBLIOGRAPHY

Brown, Canter. Tampa in Civil War and Reconstruction. Tampa, Fla.: University of Tampa Press, 2000.
Kerstein, Robert J. Politics and Growth in Twentieth-Century Tampa. Gainesville: University Press of Florida, 2001.
Mormino, Gary R., and George E. Pozzetta. The Immigrant World of Ybor City: Italians and Their Latin Neighbors in Tampa, 1885–1985. Urbana: University of Illinois Press, 1987.

Robert N. Lauriault See also Florida.

TAOS (rhymes with house) means "in the village." The northernmost of the Pueblo Indian villages in New Mexico, Taos was first described in 1540 by Spanish explorers. This agricultural community, distinguished by its five-story buildings, had been home to several hundred Tiwa-speaking inhabitants since at least A.D. 1200–1250. The Spanish renamed the town San Gerónimo de Taos, and Fray Pedro de Miranda built an outpost near the village in 1617. Taos participated in the Pueblo Revolt of 1680, which drove the Spaniards out of New Mexico. The community endured the reoccupation in 1692, but it rebelled again in 1696. This rebellion was quelled by Don Diego de Vargas. After 1696, Spanish authorities and their Mexican successors ruled Taos peacefully by tolerating traditional religious practices and recognizing an annual trade bazaar that attracted plains Indians eager to acquire Pueblo wares and crops. Known as the Taos Fair after 1723, the institution brought a short season of peace to the province and boosted New Mexico's economy. In 1796, Fernando Chacón granted land to seventy-three Hispanic families to settle where the present incorporated town of San Fernando de Taos is located, three miles south of the pueblo. During the Mexican era (1821–1846), Taos became important as home to many American traders, most notably Christopher "Kit" Carson. Taoseños revolted against Mexican rule in 1837 and against American rule in 1847, killing the trader Charles Bent, the first American territorial governor. Retribution led to strained relations among Anglos, Hispanos, and Taos Indians for decades to come.

By 1900, Taos had become home to the Taos school of American painters, most notably Bert Phillips and Ernest Blumenschein, who attracted many other artists in the early twentieth century, among them Mabel Dodge, Andrew Dasburg, Georgia O'Keeffe, and John Marin. Since the 1950s, Taos has become a favorite Western resort for tourists and skiers. In 1970, after a half century of legal battles, Taos Pueblo regained title to Blue Lake, a sacred site off-reservation within the nearby Carson National Forest.

Taos. A couple of Pueblo Indians look past adobe ovens (foreground) in their village, which dates back to at least c. 1200. Library of Congress

BIBLIOGRAPHY

Bodine, John J. "Taos Pueblo." In Handbook of North American Indians. Edited by William C. Sturtevant et al. Volume 9: Southwest, edited by Alfonso Ortiz. Washington, D.C.: Smithsonian Institution, 1979.
Grant, Blanche C. When Old Trails Were New: The Story of Taos. New York: Press of the Pioneers, 1934. Reprint, Albuquerque: University of New Mexico Press, 1991.
Porter, Dean A., Teresa Hayes Ebie, and Suzan Campbell. Taos Artists and Their Patrons, 1898–1950. Notre Dame, Ind.: Snite Museum of Art; distributed by University of New Mexico Press, 1999.
Simmons, Marc. New Mexico: A Bicentennial History. New York: Norton, 1977.

William R. Swagerty
See also Pueblo.

TAR. In the American colonies, tar was a by-product of land clearing and was both exported and supplied to local shipyards. In 1705 Parliament established bounties on naval stores, including tar, imported from the colonies. Following the passage of this law and subsequent acts, annual shipments of pitch and tar from the colonies to Great Britain increased from less than one thousand barrels to more than eighty-two thousand barrels. During the era of wooden ships, tar retained an important place in manufacturing and trade statistics, especially in North Carolina. In the twentieth century most of the tar produced was distilled to yield carbolic oil, naphtha, and other crude products, while pine wood tar was used in medicines and soap.

BIBLIOGRAPHY

Kilmarx, Robert A., ed. America's Maritime Legacy: A History of the U.S. Merchant Marine and Shipbuilding Industry since Colonial Times. Boulder, Colo.: Westview Press, 1979.
Shephard, James F., and Walton, Gary M. Shipping, Maritime Trade, and the Economic Development of Colonial North America. Cambridge: Cambridge University Press, 1972.

Victor S. Clark / h. s.
See also North Carolina; Shipbuilding.

TAR AND FEATHERS. Although it had long been a legal punishment in England, pouring molten tar over an offender's body and covering it with feathers was part of extralegal demonstrations in the American colonies and the United States. Perpetrators often directed this punishment against those who violated local mores—for example, loyalists during the revolutionary era, abolitionists in the antebellum South, and others judged immoral or scandalous by their communities. During the colonial period, the women of Marblehead, Massachusetts, tarred and feathered Skipper Floyd Ireson because he refused to aid seamen in distress. During the Whiskey Rebellion in Pennsylvania (1794), backcountry insurgents tarred and feathered at least twenty government agents. The practice finally vanished in the late nineteenth century.

BIBLIOGRAPHY

Friedman, Lawrence. Crime and Punishment in American History. New York: Basic Books, 1993.

Alvin F. Harlow / s. b.
See also Crime; Punishment.

TARAWA (20–24 November 1943). In the opening blow of the American offensive through the central Pacific, the Second Marine Division began landing on Betio, an islet in the Tarawa atoll, part of the Gilbert Islands, on the morning of 20 November 1943. The island's forty-five hundred Japanese defenders fought back stubbornly from behind solid fortifications. With air support and naval gunfire, the marines rooted out the Japanese defensive positions one at a time. A final Japanese counterattack was defeated on the night of 22–23 November, and the last defenders were eliminated on the 24th. Tarawa, which proved a valuable base, cost more than one thousand American lives and twice as many wounded.

BIBLIOGRAPHY

Alexander, Joseph H. Utmost Savagery: The Three Days of Tarawa. Annapolis, Md.: Naval Institute Press, 1995.
Graham, Michael B. Mantle of Heroism. Novato, Calif.: Presidio, 1993.
Gregg, Charles T. Tarawa. New York: Stein and Day, 1984.
Sherrod, Robert. Tarawa: The Story of a Battle. Fredericksburg, Tex.: Admiral Nimitz Foundation, 1973.

Stanley L. Falk / a. r. See also Gilbert Islands; Marshall Islands; World War II, Air War against Japan.

TARIFF, in the narrow sense, is a schedule of rates charged by a government on imports and exports; the term also refers to an individual duty imposed on a class of items, or to the body of laws regulating such duties, which are used either to raise revenue for the government or to protect internal industries and commerce. Tariffs go hand in hand with trade, both of which are subject to the ebb and flow of American social, political, and economic history. Scholars tend to divide the history of the tariff into three periods: the Early Republic to the Civil War (1789–1860), the Civil War to the Great Depression (1861–1930s), and from the Great Depression onward.

Tar and Feathers. The 1774 cartoon shows Bostonians protesting the tax on imported tea by tarring and feathering customs official John Malcolm and forcing him to drink from a teapot. The incident took place in late January, about a month after the Boston Tea Party. Archive Photos, Inc.

Colonial Era and the Early Republic
Much of the tariff's early history was colored by the colonists' experience with England. In an effort to earn revenue to pay for the costly French and Indian War as well as the cost of maintaining the colonies themselves, England initiated the Townshend Acts of 1767, which placed duties on certain items such as paper, glass, tea, and other goods the colonies produced in small amounts. Responding to the larger issue of taxation without representation, the colonists sought ways to avoid paying the


Townshend tariffs. Sometimes colonists refused to purchase items, and public protests were organized. As the colonies worked toward independence, most were highly suspicious of taxation in any form. For instance, the Articles of Confederation did not provide for the national government to levy any taxes. Individual states voluntarily provided monies and levied their own tariffs. This did not change until 1789 when the new Constitution granted Congress the authority to “lay and collect Taxes, Duties, Imposts, and Excises, to pay the Debts and provide for the common Defence and general Welfare of the United States.” The drafters of the Constitution included certain limitations: all taxes were required to be applied geographically equally, Congress was not allowed to place duties on exports from states, and states could not impose duties without the approval of Congress. Foreign trade and commerce was now the responsibility of the new federal government. James Madison, serving as Speaker of the House of Representatives, introduced the very first National Tariff Act on 4 July 1789. Madison’s original bill called for a tariff to raise revenue so that the new government could meet its obligations. Northern manufacturers argued for the tariff to include protectionist measures to help the young industries compete with foreign markets. So the final tariff bill combined ad valorem taxes and specific taxes. Ad valorem taxes are based on a percentage of the item’s value, while specific taxes are assigned regardless of value. For instance, a specific tax of ten cents a gallon was assigned to all imported wines in the hopes Americans would buy American wine, which in turn would aid American wine manufacturers. The first National Tariff Act was a moderate one, especially as a protective tariff. Madison tended to favor revenue tariffs while Alexander Hamilton strongly favored a high protective tariff. The nation sided with Madison until the United States and England went to war. Beginning in 1790, England started developing policies that limited U.S. trade with the rest of Europe. In response, the United States placed an embargo on England, and by 1812 the countries were enmeshed in war. With no British imports, American industries expanded rapidly in order to meet demand. When the War of 1812 ended in 1815, now President Madison was faced with a flood of English goods and war debt. Madison asked Alexander Dallas, secretary of the Treasury, to write a tariff bill to address these issues. The resulting Tariff Act of 1816 propelled protectionism to the forefront for the first time. Dallas’s bill divided imports into three classes, with the class depending on how much of the commodity was made in the United States. For instance, the first class included items manufactured in abundance, so these items were assigned a high tariff. Items not produced in the United States at all fell into the third class, which incurred a small revenue tax. The 1816 tariff increased taxes an average of 42 percent, and the U.S. market was flooded with cheap British imports. This, combined with other


economic problems, led to the panic of 1819, depression, and a reevaluation of the nation’s tariff policy. Between 1818 and 1827, tariff issues involved constitutional and sectional problems as well as economic problems. A number of tariffs were passed during these years, but the only major act was the Tariff of 1824. Supported by the middle Atlantic and western states and opposed by the South and northeastern commercial shippers, the 1824 tariff increased the duty of a number of goods including hemp, wool, lead, iron, and textiles. The tariff protected certain industries and hurt others. The duties on hemp and iron hurt shipbuilders. The focus, however, was on the wool industry. The English Parliament reduced the duty on imported wool, which meant English wool goods sold cheaply in America. In response, the Mallory Bill, designed to protect American wool, was presented to Congress; it passed the House but not the Senate. Vice President John C. Calhoun’s tie-breaking vote defeated the bill.

1828 to 1860
The Mallory Bill's narrow margin of defeat inspired protective tariff supporters, sparked debates, and led to the Tariff of 1828, called the "Tariff of Abominations" or the "Black Tariff." This tariff was a political power play. The supporters of Andrew Jackson introduced a very high protective tariff designed so that, when defeated, New England would be isolated and support would be built for Jackson's presidential bid in New York, Pennsylvania, the West, and the South. The scheme's engineers underestimated the nation's desire for a high protective tariff, and the 1828 bill passed both houses of Congress. The tariff raised the ad valorem duty on raw wool, for example, to 50 percent and added a specific duty of four cents per pound, making it a compound duty. Duties were also raised on iron, hemp, molasses, flax, distilled liquors, and slate. The South was outraged by the increases and even threatened nullification and secession. After the election of Andrew Jackson, protectionists and opponents faced off in an effort to replace the 1828 tariff. Sectional interests muddled the process, but the resulting Tariff of 1832 did not include the worst features of the 1828 tariff and lowered duties to resemble the duties of 1824. Although Virginia and North Carolina supported the bill, the rest of the South did not. South Carolina so opposed the bill that the state declared the tariffs of 1828 and 1832 "null and void." Jackson was furious and even asked Congress to approve the use of military force. A number of plans were developed, but Henry Clay's bill, the Compromise Tariff of 1833, was the strongest. Clay's bill included something for the South and the protectionists. For the South the bill expanded the list of free items and called for the reduction of ad valorem duties over 20 percent. For the protectionists the reduction was to occur over a ten-year period, gradual enough to allow industries to adjust for the change. The arithmetic of such a compound duty is sketched below.
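As a worked illustration of a compound duty (a sketch only: the twenty-cent-per-pound wool valuation is an assumed figure chosen for illustration, not a historical price), the 1828 wool schedule combined the percentage and the per-pound charge as follows:

% Minimal LaTeX sketch of a compound duty calculation (requires amsmath).
% The 20-cent valuation per pound is an assumption for illustration only.
\[
\underbrace{0.50 \times 20\,\text{cents}}_{\text{ad valorem duty}}
  + \underbrace{4\,\text{cents}}_{\text{specific duty}}
  = 14\,\text{cents},
\qquad \text{an effective rate of } \tfrac{14}{20} = 70\%.
\]

Because the specific component does not shrink with the price, such a compound duty weighs proportionately more heavily on cheaper grades of the imported good.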


Clay's compromise worked for ten years, but a general depression from 1837 to 1843 and the inability of the government to meet its expenses provided protectionists with ammunition. When President John Tyler, a Whig, called for a bill, the Whigs in Congress drew up a measure that raised duties to their 1832 rates. Items such as molasses incurred a 51 percent ad valorem duty, and railroad iron was assigned a 71 percent ad valorem duty. The Tariff of 1842 also discontinued the credit system, so payment in cash became a requirement. Despite the 1842 tariff's similarity to that of 1832, it did not elicit the same sectional problems or emotions. In 1844 prosperity returned, and the Democrats, traditional proponents of a low tariff, returned to power. President James K. Polk's secretary of the Treasury, Robert J. Walker, a firm believer in free trade, set out almost immediately to lower tariff rates. The Walker Tariff of 1846 created a new alphabetical schedule of tariffs that exclusively used ad valorem duties to raise revenue. For instance, schedule A included luxury items and had the highest duties. The 1846 tariff was very successful and made the tariff a non-issue for eleven years. In fact, the tariff issue only surfaced in 1857 because the Treasury had grown too large. The United States also entered its first reciprocity agreement in 1854. The agreement with Canada established free trade of natural products between the two countries. The agreement, however, became a casualty of the Civil War.

Civil War to 1890
With the election of Abraham Lincoln, the Republicans regained control of the government and the nation plunged into Civil War. Republicans, traditionally in favor of high protective tariffs, raised rates to unprecedented heights. The Morrill Tariff of 1861 raised ad valorem rates to the 1846 levels. Throughout the Civil War the federal government constantly needed to increase revenue. Besides tariffs the government created systems of excise taxes, an income tax, and professional licensing taxes. After the Civil War, the measures taken to meet the demands of war now produced an excess. In response, the Republicans cut most of the internal taxes and made small efforts to reduce the high protective tariffs characteristic of the post–Civil War period. Despite a depression between 1873 and 1879, the government's revenue was approximately $100 million per year. Concerned for their popularity, the Republicans decided it was in their best interest to make some effort to reduce high tariffs. First, Congress formed a Tariff Commission charged with reporting on "the establishment of a judicious tariff, or the revision of the existing tariff." President Chester Arthur appointed nine protectionists to the commission, who developed a plan to reduce the tariff an average of 25 percent. Congress, however, ignored the commission's recommendation and even made some rates higher. The 1883 tariff, called the "Mongrel Tariff," remained in effect for seven years.

1890 to 1930
Despite the election of Democrat Grover Cleveland, the party was too divided to effectively exert pressure to ensure tariff reform. The 1888 defeat of Cleveland by Benjamin Harrison and Republican majorities in the House and Senate ushered in Republican control. The McKinley Tariff of 1890 increased duties on items such as wool, dress goods, linens, lace, and cutlery and extended protection to agricultural goods in the hope of courting the votes of western farmers who might be considering a rival party. The tariff also extended the free list and reduced duties on steel rails, structural iron and steel, and copper. The 1890 tariff also introduced commercial reciprocity for the first time. Weeks after the McKinley Tariff became law, the Democrats won a majority in the House, and in the next presidential election Democrat Grover Cleveland was elected. The Democrats had plans for tariff reform, but the Harrison administration had exhausted the Treasury's surplus, causing a panic. Further, the Democrats were divided over the repeal of the Sherman Silver Purchase Act. Despite these difficulties, William L. Wilson introduced a bill that not only reduced manufactured-goods duties but also put raw materials on the free list. Once in the Senate, however, the protectionists, both Republicans and Democrats, dominated; 634 amendments were added, and the bill was renamed the Wilson-Gorman Tariff Act of 1894. However, the tariff did reduce duties to 40 percent. The Wilson-Gorman Tariff was blamed for the 1894 depression, and with the Republicans again in control of both houses and with William McKinley in the White House, the protectionists passed the Dingley Act of 1897, which imposed the highest average rate of customs duties to date. The Dingley Act remained in force (and the Republicans remained in power) for almost fifteen years, longer than any other act. By 1908 the longevity of the Dingley Tariff made the issue hot again, and Republicans decided to reduce duties in the interest of self-preservation. Both Republicans and Democrats were influenced by public dissatisfaction with increasing prices on all sorts of goods without a corresponding increase in wages. So the Republicans basically adopted a lower tariff platform in order to compete politically with the Democrats in the 1908 election. Nelson Aldrich amended the Payne Act of 1909, a moderate House bill, 847 times in the Senate. The Payne-Aldrich Tariff resulted in a decline of 2.38 percent and abandoned reciprocity. The Payne-Aldrich Tariff was hotly criticized and led to the Democrats regaining control of Congress. The Democrats' first effort, the Underwood-Simmons Act of 1913, proposed to lower duties and rates but was overshadowed by the Great War. The government did not have to raise tariffs during World War I; instead it raised most of its revenue from the income tax. Once the war ended, the Emergency Tariff of 1921, or the Fordney Emergency Tariff Bill, was developed to protect agricultural goods such as wheat, corn, meat, wool, and sugar.


At the same time, the House Ways and Means Committee was working to revise the Simmons-Underwood Tariff. After much debate and revision, the Fordney-McCumber Tariff signaled the return of high protective tariffs.

The Great Depression
The nation prospered until the stock market crash of 1929 and the Great Depression. Upon taking office, President Herbert Hoover asked Congress to create agricultural relief legislation and to increase the tariff. The result was the Smoot-Hawley Tariff of 1930, which brought rates to an all-time high. Duties increased on agricultural goods, and a number of items were removed from the free list. The 1930 tariff also reorganized the Tariff Commission and created higher salaries for commissioners. It also generated worldwide animosity and initiated a number of defensive tariffs.

Franklin D. Roosevelt made clear in his campaign he intended to break down the barriers created by the Smoot-Hawley Tariff. Roosevelt, with the help of Secretary of State Cordell Hull, developed a series of Reciprocal Trade Agreements. The first Reciprocal Trade Bill of 1934 granted Roosevelt the authority to negotiate reciprocal agreements with other nations for three years. Similar extensions were enacted until the Trade Expansion Act of 1962.

General Agreement on Tariffs and Trade
At the end of World War II, the United States set out to help rebuild Europe, America's major prewar market. In addition to a number of trade extension acts, negotiations in Geneva led to the multilateral General Agreement on Tariffs and Trade (GATT). The agreement outlined broad terms for international trade, called for tariff reduction of over 45,000 items, and included the "most favored nation" clause, which ensured all members would benefit from each other's agreements. The United States participated in GATT by executive agreements and without express approval of Congress. There were eight rounds of negotiations: Geneva (1947); Annecy, France (1949); Torquay, England (1951); Geneva (1956); Dillon (1960–1962); Kennedy (1962–1967); Tokyo (1973–1979); and Uruguay (1986–1994). The first six rounds concentrated almost solely on tariff reduction.

The last Reciprocal Trade Agreement extension expired in June 1962. President John Kennedy outlined the issues to Congress and proposed legislation to make tariff revision internally and to bargain abroad, either within or outside of GATT. The bill enacted was the Trade Expansion Act of 1962. The 1962 act set forth presidential permissions and prohibitions. For instance, the president was allowed to promote trade abroad and prevent communists from taking part in the markets of American friends. But the president was required to set ending dates and to deny most-favored-nation status to communist-dominated countries. Kennedy was assassinated just one month after signing the 1962 act, but President Lyndon Johnson carried on Kennedy's foreign trade policy and started a new round of tariff bargaining in 1964. Fifty-three GATT countries, including the United States, concluded negotiations that cut tariffs by 35 percent on more than 60,000 items.

The Tokyo Round attempted to cope with the growing depression and inflation cycle of the 1970s. It lowered the average tariff on industrial products to 4.7 percent and developed a series of non-tariff barrier agreements.

Creation of the World Trade Organization
During the 1980s and 1990s, the members of GATT felt that the nature of the international economy required a more structured and powerful international trade organization. GATT was originally established as a provisional body, but no other proposal or organization was accepted, so it remained the only organization dealing with international trade until 1 January 1995, when the World Trade Organization (WTO) was created. The Uruguay Round, which was a series of negotiations, ushered in the biggest reforms since the creation of GATT. The agenda included such items as rules for settling disputes, intellectual property, and agriculture and textiles trade reform. Talks broke down a number of times, but the group eventually came up with a number of successful moves. For instance, the Uruguay Round developed a new, more efficient dispute settlement system and a trade policy review mechanism, which called for a regular review of policies and practices. Finally, the round created the WTO. The GATT organization ceased to exist, but the GATT agreement remained in effect as GATT 1994. The WTO agreements cover goods as well as services and intellectual property. As the only international body to deal with trade, the WTO has three objectives: to aid the free flow of trade, to come to agreement through negotiation, and to settle disputes impartially. The WTO is made up of a number of different bodies, including the overseeing body called the Ministerial Conference. The 140 member governments administer the WTO, accounting for over 97 percent of world trade.

BIBLIOGRAPHY

Dobson, John M. Two Centuries of Tariffs. Washington, D.C.: United States International Trade Commission, 1976.
Kaplan, Edward S., and Thomas W. Ryley. Prelude to Trade Wars: American Tariff Policy, 1890–1922. Westport, Conn.: Greenwood Press, 1994.
Ratner, Sidney. The Tariff in American History. New York: Van Nostrand, 1972.
Wolman, Paul. Most Favored Nation. Chapel Hill: University of North Carolina Press, 1992.
World Trade Organization. Home page at http://www.wto.org

Lisa A. Ennis See also Trade Agreements; Trade, Foreign.


Cheers! The Schlitz Hotel Bar in Milwaukee, Wisc., shown here in the early twentieth century, displays a wide variety of alcoholic beverages. Library of Congress

TASK FORCE 58 was the long-range naval striking arm of the U.S. Pacific Fleet during the offensive against Japan in World War II. It became the major weapon system in the wartime and postwar U.S. Navy, replacing the battleship. During World War II the Navy created numbered fleets with subordinate numbered task organizations. In August 1943 the Navy divided the Pacific Fleet into the Third and Fifth Fleets, of which the fast carriers became Task Force 58 (TF 58). The Navy later subdivided TF 58 into task groups and they into smaller task units. This system, which allowed the Pacific Fleet to transfer ships between commands with a minimum of administrative detail, became the basis for postwar naval organization. The tasks of TF 58, which the Navy renamed Task Force 38 in 1944, increased as the war progressed. In 1944, TF 58 sought out and destroyed the Japanese fleet and naval air forces at the Battles of the Philippine Sea and of Leyte Gulf. In 1943 and 1944 it provided defensive cover and air support for the amphibious forces that captured the Gilbert, Marshall, New Guinea, Mariana, Palau, and Philippine Islands and protected the forces that neutralized Truk. In 1945 it supported the amphibious landings at Iwo Jima and Okinawa, fought off Japanese kamikaze air attacks, and struck airfields and strategic targets in Formosa and Japan. The latter-type missions also dominated fast-carrier operations in the Korean and

Vietnam Wars, during which the carriers (in far fewer numbers) composed TF 77 as part of the Seventh Fleet. BIBLIOGRAPHY

Belote, James H. Titans of the Sea. New York: Harper and Row, 1975.
Bradley, James. Flags of Our Fathers. New York: Bantam, 2000.
Cutler, Thomas J. The Battle of Leyte Gulf, 23–26 October 1944. New York: HarperCollins, 1994.
Wildenberg, Thomas. Destined for Glory. Annapolis, Md.: Naval Institute Press, 1998.

Clark G. Reynolds / e. m. See also Aircraft Carriers and Naval Aircraft; Philippine Sea, Battle of the.

TAVERNS AND SALOONS. Early New England taverns were actually private homes where the homeowner both served meals and opened rooms so travelers would have a place to stay. Taverns received travelers who came on canal boats, in stagecoaches, and by horseback. By the 1790s taverns were offering more services: if a horse needed stabling, stalls were to be had; clubs and boards of directors held meetings in their rooms; promoters of the arts used taverns for dances, stage productions, and art galleries; area residents met in taverns at the


end of the day to discuss politics, transact business, or gossip. Many stagecoach stops were at taverns, which provided workers to load and unload freight. Early post offices were often in taverns. Taverns, often the social and economic centers of communities, evolved and expanded along with the country. While always offering drink, both alcoholic (licenses to serve alcohol had to be applied for and approved by the local government) and soft, they also made newspapers available to their patrons for reading, and were used as polling places during elections. Because the building was often large enough to accommodate a group, taverns were sometimes utilized as courtrooms. In times of war, taverns were used as military headquarters. In addition, many taverns served as a basic general store, selling staples such as molasses, cloth, kitchen utensils, and spices. (Some taverns, the nicer ones, had a parlor that was set apart for ladies or others who did not wish to be seated in the main room. The furnishings were usually more formal and included a fire in the colder months.) Taverns had colorful names, the Eagle, the Star and Garter, the Bull's Eye, the Leather Bottle, the Globe, the Indian Queen, and the Mermaid Inn among them. The Mermaid opened shortly after James Simpson established Salem, Virginia, in 1802. At a time when many people were illiterate and before the practice of naming and numbering streets was common, signs were hung out to identify each tavern. Some were carved from wood and then painted; others were stone, tile, metal, and even stuffed animal heads. Taverns were commonly absentee-owned, with the tavern keeper living in the building as a tenant, much like motel managers of the current day. The lodging was undoubtedly part of the tavern keeper's compensation. By the end of the nineteenth century, taverns had died out as each area of their trade became specialized. Boardinghouses, restaurants, theaters, hotels, and saloons became stand-alone businesses. With the advent of the train, passengers and freight depots no longer had need of taverns for transfers.

Saloons
Saloons were the western version of a tavern but did not provide lodging; entertainment, however, took on a decidedly western flair. Instead of art displays, saloons offered prizefights or boxing matches. Saloons did not host formal dances; they had dance hall girls who danced with the men for a price. Many saloon keepers built stages on which short plays and variety shows were held. The Apollo Hall in Denver opened in 1859 with the saloon on the ground floor and a theater on the second floor. Denver was only a year old, but ready for variety in entertainment. Saloon talent, however, was not especially sophisticated or refined; for example, the strong woman act of Mrs. De Granville; the entrepreneur who installed a stereoscope


with obscene pictures; and the man who was hired to walk to and fro on a platform above the bar in a Cheyenne saloon for 60 hours. Saloons had liquor of the best, and the worst, qualities, depending on their location—a rich mining town or a hardscrabble settlement. Saloons had the most success in mining or cattle towns. In some of these settlements, saloons outnumbered stores and other establishments by two to one. Abilene, Kansas, with a year-round population of only 800, had eleven saloons. Abilene was on the trail of the cattle drives from Texas, thus the population would briefly increase by at least 5,000 cowboys. Regulations were few, so some saloons were open all day and night, seven days a week, especially in mining towns. Gunfights and other violence were common in both the cattle and mining towns. Saloons located in farming communities were much quieter. Farmers were usually settled inhabitants, with a name to protect and consistent hard work facing them each morning. They discussed their successes and difficulties over snacks that the barkeeper supplied as they sipped their beers. Many Americans thought that saloons and strong drink were the work of the devil (indeed, alcoholism was a major problem in the United States). Perhaps the most vociferous in that belief was Carry A. Nation, who traveled around the country preaching her temperance message, urging moderation in most things but complete abstinence of intoxicating liquor. She carried a hatchet in her underskirt, and more than once used it to destroy liquor bottles and bar equipment. Churches promoted the temperance movement, and it spread throughout the country during the last decades of the nineteenth century. BIBLIOGRAPHY

Erdoes, Richard. Saloons of the Old West. New York: Knopf, 1979.
Grace, Fran. Carry A. Nation: Retelling the Life. Bloomington: Indiana University Press, 2001.
Schaumann, Merri Lou Schribner. Taverns of Cumberland County 1750–1840. Carlisle, Pa.: Cumberland County Historical Society, 1994.

Peggy Sanders See also Stagecoach Travel; Temperance Movement.

TAX IN KIND, CONFEDERATE. See Tithes, Southern Agricultural.

TAXATION is the imposition by a government of a compulsory contribution on its citizens for meeting all or part of its expenditures. But taxation can be more than a revenue raiser. Taxes can redistribute income, favor one group of taxpayers at the expense of others, punish or reward, and shape the behavior of taxpayers through incentives and disincentives. The architects of American tax


policy have always used taxes for a variety of social purposes: upholding social order, advancing social justice, promoting economic growth, and seeking their own political gain. The need for new revenues has always set the stage for pursuing social goals through taxation, and the need for new revenues has been most intense during America's five great national crises: the political and economic crisis of the 1780s, the Civil War, World War I, the Great Depression, and World War II. In the process of managing each of these crises, the federal government led the way in creating a distinctive tax regime—a tax system with its own characteristic tax base, rate structure, administrative apparatus, and social intention. In the United States, progressive taxation—taxation that bears proportionately more heavily on individuals, families, and firms with higher incomes—has always enjoyed great popularity. Progressive taxation has offered a way of reconciling republican or democratic ideals with the high concentrations of wealth characteristic of capitalist economic systems. During national crises, political leaders have been especially intent on rallying popular support. Consequently, the powerful tax regimes associated with great national crises have each had a significant progressive dimension.

The Colonial Era and the American Revolution, 1607–1783
Before the American Revolution, taxation was relatively light in the British colonies that would form the United States. Public services, such as education and roads, were limited in scale, and the British government heavily funded military operations. In 1763, after the expensive Seven Years' War, the British government initiated a program to increase taxes levied on Americans, especially through "internal" taxes such as the Stamp Act (1765) and the Townshend Acts (1767). But colonial resistance forced the British to repeal these taxes quickly, and the overall rate of taxation in America remained low until the outset of the Revolution, at least by contemporary British standards. Tax rates and types of taxation varied substantially from colony to colony, and even from community to community within particular colonies, depending on modes of political organization and the distribution of economic power. British taxing traditions were diverse, and the various colonies and local communities had a rich array of institutions from which to choose: taxes on imports and exports; property taxes (taxes on the value of real and personal assets); poll taxes (taxes levied on citizens without any regard for their property, income, or any economic characteristic); excise (sales) taxes; and faculty taxes, which were taxes on the implicit incomes of people in trades or businesses. The mix varied, but each colony made use of virtually all of these different modes of taxation.

Fighting the Revolution forced a greater degree of fiscal effort on Americans. At the same time, the democratic forces that the American Revolution unleashed energized reformers throughout America to restructure state taxation. Reformers focused on abandoning deeply unpopular poll taxes and shifting taxes to wealth as measured by the value of property holdings. The reformers embraced "ability to pay"—the notion that the rich ought to contribute disproportionately to government—as a criterion to determine the distribution of taxes. The reformers were aware that the rich of their day spent more of their income on housing than did the poor and that a flat, ad valorem property levy was therefore progressive. Some conservative leaders also supported the reforms as necessary both to raise revenue and to quell social discord. The accomplishments of the reform movements varied widely across the new states; the greatest successes were in New England and the Middle Atlantic states.

During the Revolution, while state governments increased taxes and relied more heavily on property taxes, the nascent federal government failed to develop effective taxing authority. The Continental Congress depended on funds requisitioned from the states, which usually ignored calls for funds or responded very slowly. There was little improvement under the Articles of Confederation. States resisted requisitions and vetoed efforts to establish national tariffs.

The Early Republic, 1783–1861
The modern structure of the American tax system emerged from the social crisis that extended from 1783 to the ratification in 1788 of the U.S. Constitution. At the same time that the architects of the federal government forged their constitutional ideas, they struggled with an array of severe fiscal problems. The most pressing were how to finance the revolutionary war debts and how to establish the credit of the nation in a way that won respect in international financial markets. To solve these problems, the Constitution gave the new government the general power, in the words of Article 1, section 8, "To lay and collect Taxes, Duties, Imposts, and Excises." The Constitution, however, also imposed some restrictions on the taxing power. First, Article 1, section 8, required that "all Duties, Imposts and Excises shall be uniform throughout the United States." This clause prevented Congress from singling out a particular state or group of states for higher rates of taxation on trade, and reflected the hope of the framers that the new Constitution would foster the development of a national market. Second, Article 1, section 9, limited federal taxation of property by specifying that "No Capitation, or other direct, Tax shall be laid, unless in Proportion to the Census." The framers of the Constitution never clearly defined "direct" taxation, but they regarded property taxes and "capitation" or poll taxes as direct taxes. The framers' goals were to protect the dominance of state and local governments in property taxation, and to shield special categories of property, such as slaves, against discriminatory federal taxation.


As the framers of the Constitution intended, property taxation flourished at the state and local levels during the early years of the Republic. Most of the nation’s fiscal effort was at these levels of government, rather than at the federal level, and the property tax provided most of the funding muscle. Differences persisted among states regarding the extent and form of property taxation. Southern states remained leery of property taxation as a threat to the land and slaves owned by powerful planters. These states also had the most modest governments because of limited programs of education and internal improvements. One southern state, Georgia, abandoned taxation altogether and financed its state programs through land sales. Northern states, in contrast, generally expanded their revenue systems, both at the state and local levels, and developed ambitious new property taxes. The reformers who created these new property taxes sought to tax not just real estate but all forms of wealth. They described the taxes that would do this as general property taxes. These were comprehensive taxes on wealth that would reach not only tangible property such as real estate, tools, equipment, and furnishings but also intangible personal property such as cash, credits, notes, stocks, bonds, and mortgages. Between the 1820s and the Civil War, as industrialization picked up steam and created new concentrations of wealth, tax reformers tried to compel the new wealth to contribute its fair share to promoting communal welfare. By the 1860s, the general property tax had, in fact, significantly increased the contributions of the wealthiest Americans to government. At the federal level, a new tax regime developed under the financial leadership of the first secretary of the Treasury, Alexander Hamilton. His regime featured tariffs—customs duties on goods imported into the United States—as its flagship. Tariffs remained the dominant source of the government’s revenue until the Civil War. To establish precedents for future fiscal crises, Hamilton wanted to exercise all the taxing powers provided by Congress, including the power to levy “internal” taxes. So, from 1791 to 1802, Congress experimented with excise taxes on all distilled spirits (1791); on carriages, snuff manufacturing, and sugar refining (1794); and with stamp duties on legal transactions, including a duty on probates for wills (1797)—a first step in the development of the federal estate tax. In addition, in 1798 Congress imposed a temporary property tax, apportioned according to the Constitution, on all dwelling houses, lands, and large slave holdings. Excise taxes proved especially unpopular, and the tax on spirits touched off the Whiskey Rebellion of 1794. President George Washington had to raise 15,000 troops to discourage the Pennsylvania farmers who had protested, waving banners denouncing tyranny and proclaiming “Liberty, Equality, and Fraternity.”


In 1802, the administration of President Thomas Jefferson abolished the Federalist system of internal taxation, but during the War of 1812, Congress restored such taxation on an emergency basis. In 1813, 1815, and 1816, Congress enacted direct taxes on houses, lands, and slaves, and apportioned them to the states on the basis of the 1810 census. Congress also enacted duties on liquor licenses, carriages, refined sugar, and even distilled spirits. At the very end of the war, President James Madison's secretary of the Treasury, Alexander J. Dallas, proposed adopting an inheritance tax and a tax on incomes. But the war ended before Congress acted.

The Era of Civil War and Modern Industrialization, 1861–1913
The dependence of the federal government on tariff revenue might have lasted for at least another generation. But a great national emergency intervened. The Civil War created such enormous requirements for capital that the Union government had to return to the precedents set during the administrations of Washington and Madison and enact a program of emergency taxation. The program was unprecedented in scale, scope, and complexity. During the Civil War, the Union government placed excise taxes on virtually all consumer goods, license taxes on a wide variety of activities (including every profession except the ministry), special taxes on corporations, stamp taxes on legal documents, and taxes on inheritances. Each wartime Congress also raised the tariffs on foreign goods, doubling the average tariff rate by the end of the war. And, for the first time, the government levied an income tax. Republicans came to the income tax as they searched for a way to hold popular confidence in their party in the face of the adoption of the new regressive levies—taxes that taxed lower income people at higher rates than the wealthy. Republicans looked for a tax that bore a closer relationship to "ability to pay" than did the tariffs and excises. They considered a federal property tax but rejected it because the allocation formula that the Constitution imposed meant taxing property in wealthy, more urban states at lower rates than in poorer, more rural states. The Republican leadership then took note of how the British Liberals had used income taxation in financing the Crimean War as a substitute for heavier taxation of property. They settled on this approach, and the result was not only an income tax but a graduated, progressive tax—one that reached a maximum rate of 10 percent. This was the first time that the federal government discriminated among taxpayers by virtue of their income. The rates imposed significantly higher taxes on the wealthy—perhaps twice as much as the wealthy were used to paying under the general property tax. By the end of the war, more than 15 percent of all Union households in the northeastern states paid an income tax.

After the Civil War, Republican Congresses responded to the complaints of the affluent citizens who had accepted the tax only as an emergency measure. In 1872,


Congress allowed the income tax to expire. And, during the late 1860s and early 1870s, Republican Congresses phased out the excise taxes, except for the taxes on alcohol and tobacco. Republicans, however, kept the high tariffs, and these constituted a new federal tax regime. Until the Underwood-Simmons Tariff Act of 1913 significantly reduced the Civil War rates, the ratio between duties and the value of dutiable goods rarely dropped below 40 percent and was frequently close to 50 percent. On many manufactured items the rate of taxation reached 100 percent. The system of high tariffs came to symbolize the commitment of the federal government to creating a powerful national market and to protecting capitalists and workers within that market. The nationalistic symbolism of the tariff in turn reinforced the political strength of the Republican Party.

After the Civil War, continuing industrialization and the associated rise of both modern corporations and financial capitalism increased Democratic pressure to reform the tariff. Many Americans, especially in the South and West, came to regard the tariff as a tax that was not only regressive but also protective of corporate monopolies. One result was the enactment, in 1894, of a progressive income tax. But in 1895 the Supreme Court, in Pollock v. Farmers' Loan and Trust Company, claimed, with little historical justification, that the architects of the Constitution regarded an income tax as a direct tax. Since Congress had not allocated the 1894 tax to the states on the basis of population, the tax was, in the Court's view, unconstitutional. Another result of reform pressure was the adoption in 1898, during the Spanish-American War, of the first federal taxation of estates. This tax was graduated according to both the size of the estate and the degree of relationship to the deceased. The Supreme Court upheld the tax in Knowlton v. Moore (1900), but in 1902 a Republican Congress repealed it.

State and local tax policy also began to change under the pressure of industrialization. The demand of urban governments for the funds required for new parks, schools, hospitals, transit systems, waterworks, and sewers crushed the general property tax. In particular, traditional self-assessment of property values proved inadequate to expose and determine the value of intangible property such as corporate stocks and bonds. Rather than adopt rigorous and intrusive new administrative systems to assess the value of such property, most local governments focused property taxation on real estate, which they believed they could assess accurately at relatively low cost. Some states considered following the advice of the reformer Henry George and replacing the property tax with a "single tax" on the monopoly profits embedded in the price of land. Farm lobbies, however, invariably blocked such initiatives. Instead, after 1900, state governments began replacing property taxation with special taxes, such as income taxes, inheritance taxes, and special corporate taxes. Beginning in the 1920s, state governments would continue this trend by adding vehicle registration fees, gasoline taxes, and general sales taxes.

The Establishment of Progressive Income Taxation, 1913–1929
Popular support for progressive income taxation continued to grow, and in 1909 reform leaders in Congress from both parties finally united to send the Sixteenth Amendment, legalizing a federal income tax, to the states for ratification. It prevailed in 1913 and in that same year Congress passed a modest income tax. That tax, however, might well have remained a largely symbolic element in the federal tax system had World War I not intervened. World War I accelerated the pace of reform. The revenue demands of the war effort were enormous, and the leadership of the Democratic Party, which had taken power in 1912, was more strongly committed to progressive income taxes and more opposed to general sales taxes than was the Republican Party. In order to persuade Americans to make the financial and human sacrifices for World War I, President Woodrow Wilson and the Democratic leadership of Congress introduced progressive income taxation on a grand scale. The World War I income tax, which the Revenue Act of 1916 established as a preparedness measure, was an explicit "soak-the-rich" instrument. It imposed the first significant taxation of corporate profits and personal incomes and rejected moving toward a "mass-based" income tax—one falling most heavily on wages and salaries. The act also reintroduced the progressive taxation of estates. Further, it adopted the concept of taxing corporate excess profits. Among the World War I belligerents, only the United States and Canada placed excess-profits taxation—a graduated tax on all business profits above a "normal" rate of return—at the center of wartime finance. Excess-profits taxation turned out to generate most of the tax revenues raised by the federal government during the war. Thus, wartime public finance depended heavily on the taxation of income that leading Democrats, including President Wilson, regarded as monopoly profits and therefore ill-gotten and socially hurtful.

During the 1920s, three Republican administrations, under the financial leadership of Secretary of the Treasury Andrew Mellon, modified the wartime tax system. In 1921 they abolished the excess-profits tax, dashing Democratic hopes that the tax would become permanent. In addition, they made the rate structure of the income tax less progressive so that it would be less burdensome on the wealthy. Also in 1921, they began to install a wide range of special tax exemptions and deductions, which the highly progressive rates of the income tax had made extremely valuable to wealthy taxpayers and to their surrogates in Congress. The Revenue Acts during the 1920s introduced the preferential taxation of capital gains and a variety of deductions that favored particular industries, deductions such as oil- and gas-depletion allowances.


The tax system nonetheless retained its "soak-the-rich" character. Secretary Mellon led a struggle within the Republican Party to protect income and estate taxes from those who wanted to replace them with a national sales tax. Mellon helped persuade corporations and the wealthiest individuals to accept some progressive income taxation and the principle of "ability to pay." This approach would, Mellon told them, demonstrate their civic responsibility and help block radical attacks on capital.

The Great Depression and New Deal, 1929–1941
The Great Depression—the nation's worst economic collapse—produced a new tax regime. Until 1935, however, depression-driven changes in tax policy were ad hoc measures to promote economic recovery and budget balancing rather than efforts to seek comprehensive tax reform. In 1932, to reduce the federal deficit and reduce upward pressure on interest rates, the Republican administration of President Herbert Hoover engineered across-the-board increases in both income and estate taxes. These were the largest peacetime tax increases in the nation's history. They were so large that President Franklin D. Roosevelt did not have to recommend any significant tax hikes until 1935.

Beginning in 1935, however, Roosevelt led in the creation of major new taxes. In that year, Congress adopted taxes on wages and the payrolls of employers to fund the new social security system. The rates of these taxes were flat, and the tax on wages provided an exemption of wages over $3,000. Thus, social security taxation was regressive, taxing lower incomes more heavily than higher incomes. Partly to offset this regressive effect of federal taxation, Congress subsequently enacted an undistributed profits tax. This was a progressive tax on retained earnings—the profits that corporations did not distribute to their stockholders. This measure, more than any other enactment of the New Deal, aroused fear and hostility on the part of large corporations. Quite correctly, they viewed Roosevelt's tax program as a threat to their control over capital and their latitude for financial planning. In 1938, a coalition of Republicans and conservative Democrats took advantage of the Roosevelt administration's embarrassment over the recession of 1937–1938 to gut and then repeal the tax on undistributed profits.

World War II, 1941–1945: From "Class" to "Mass" Taxation
President Roosevelt's most dramatic reform of taxation came during World War II. During the early phases of mobilization, he hoped to be able to follow the example of Wilson by financing the war with taxes that bore heavily on corporations and upper-income groups. "In time of this grave national danger, when all excess income should go to win the war," Roosevelt told a joint session of Congress in 1942, "no American citizen ought to have a net income, after he has paid his taxes, of more than $25,000." But doubts about radical war-tax proposals grew in the face of the revenue requirements of full mobilization. Roosevelt's military and economic planners, and Roosevelt himself, came to recognize the need to mobilize greater resources than during World War I. This need required a general sales tax or a mass-based income tax. In October of 1942, Roosevelt and Congress agreed on a plan: dropping the general sales tax, as Roosevelt wished, and adopting a mass-based income tax that was highly progressive, although less progressive than Roosevelt desired. The act made major reductions in personal exemptions, thereby establishing the means for the federal government to acquire huge revenues from the taxation of middle-class wages and salaries. Just as important, the rates on individuals' incomes—rates that included a surtax graduated from 13 percent on the first $2,000 to 82 percent on taxable income over $200,000—made the personal income tax more progressive than at any other time in its history. Under the new tax system, the number of individual taxpayers grew from 3.9 million in 1939 to 42.6 million in 1945, and federal income tax collections leaped from $2.2 billion to $35.1 billion. By the end of the war, nearly 90 percent of the members of the labor force submitted income tax returns, and about 60 percent of the labor force paid income taxes, usually in the form of withheld wages and salaries.

In making the new individual income tax work, the Roosevelt administration and Congress relied heavily on payroll withholding, the information collection procedures provided by the social security system, deductions that sweetened the new tax system for the middle class, the progressive rate structure, and the popularity of the war effort. Americans concluded that their nation's security was at stake and that victory required both personal sacrifice through taxation and indulgence of the corporate profits that helped fuel the war machine. The Roosevelt administration reinforced this spirit of patriotism and sacrifice by invoking the extensive propaganda machinery at its command. The Treasury, its Bureau of Internal Revenue, and the Office of War Information made elaborate calls for civic responsibility and patriotic sacrifice.

Cumulatively, the two world wars revolutionized public finance at the federal level. Policy architects had seized the opportunity to modernize the tax system, in the sense of adapting it to new economic and organizational conditions and thereby making it a more efficient producer of revenue. The income tax enabled the federal government to capitalize on the financial apparatus associated with the rise of the modern corporation to monitor income flows and collect taxes on those flows. In the process, progressive income taxation gathered greater popular support as an equitable means for financing government. Taxation, Americans increasingly believed, ought to redistribute income according to ideals of social justice and thus express the democratic ideals of the nation.


The Era of Easy Finance, 1945 to the Present
The tax regime established during World War II proved to have extraordinary vitality. Its elasticity—its ability to produce new revenues during periods of economic growth or inflation—enabled the federal government to enact new programs while only rarely enacting politically damaging tax increases. Consequently, the World War II tax regime was still in place at the beginning of the twenty-first century.

During the 1970s and the early 1980s, however, the regime weakened. Stagnant economic productivity slowed the growth of tax revenues, and the administration of President Ronald Reagan sponsored the Economic Recovery Tax Act of 1981, which slashed income tax rates and indexed the new rates for inflation. But the World War II regime regained strength after the Tax Reform Act of 1986, which broadened the base of income taxation; the tax increases led by Presidents George H. W. Bush and William J. Clinton in 1991 and 1993; the prolonged economic expansion of the 1990s; and the increasing concentration of incomes received by the nation’s wealthiest citizens during the buoyant stock market of 1995–2000. Renewed revenue growth first produced significant budgetary surpluses and then, in 2001, it enabled the administration of President George W. Bush to cut taxes dramatically. Meanwhile, talk of adopting a new tax regime, in the form of a “flat tax” or a national sales tax, nearly vanished.

At the beginning of the twenty-first century, the overall rate of taxation, by all levels of government, was about the same in the United States as in the world’s other modern economies. But the United States relied less heavily on consumption taxes, especially value-added taxes and gasoline taxes, and more heavily on social security payroll taxes and the progressive income tax.

BIBLIOGRAPHY

Becker, Robert A. Revolution, Reform, and the Politics of American Taxation, 1763–1783. Baton Rouge: Louisiana State University Press, 1980. Sees conflict within the colonies and states as an important part of the American Revolution.
Beito, David T. Taxpayers in Revolt: Tax Resistance during the Great Depression. Chapel Hill: University of North Carolina Press, 1989. A neoconservative approach to the history of taxation during the New Deal era.
Brownlee, W. Elliot. Federal Taxation in America: A Short History. Washington, D.C., and Cambridge, U.K.: Wilson Center Press and Cambridge University Press, 1996. Includes a historiographical essay.
Brownlee, W. Elliot, ed. Funding the Modern American State, 1941–1995: The Rise and Fall of the Era of Easy Finance. Washington, D.C., and Cambridge, U.K.: Cambridge University Press, 1996.
Fischer, Glenn W. The Worst Tax? A History of the Property Tax in America. Lawrence: University Press of Kansas, 1996. The best single volume on the history of property taxation.
Jones, Carolyn. “Class Tax to Mass Tax: The Role of Propaganda in the Expansion of the Income Tax during World War II.” Buffalo Law Review 37 (1989): 685–737.
King, Ronald Frederick. Money, Time, and Politics: Investment Tax Subsidies in American Democracy. New Haven, Conn.: Yale University Press, 1993. Stresses a post–World War II victory for a “hegemonic tax logic” based on the needs of American capitalism.
Leff, Mark. The Limits of Symbolic Reform: The New Deal and Taxation, 1933–1939. Cambridge, U.K.: Cambridge University Press, 1984. Interprets President Franklin Roosevelt’s interest in progressive taxation as symbolic rather than substantive.
Ratner, Sidney. Taxation and Democracy in America. New York: Wiley, 1967. The classic interpretation of the expansion of income taxation as a great victory for American democracy.
Stanley, Robert. Dimensions of Law in the Service of Order: Origins of the Federal Income Tax, 1861–1913. New York: Oxford University Press, 1993. Regards the income tax as an effort to preserve the capitalist status quo.
Stein, Herbert. The Fiscal Revolution in America. Rev. ed. Washington, D.C.: AEI Press, 1990. Explores the influence of “domesticated Keynesianism” on fiscal policy, including the Kennedy-Johnson tax cut of 1964.
Steinmo, Sven. Taxation and Democracy: Swedish, British, and American Approaches to Financing the Modern State. New Haven, Conn.: Yale University Press, 1993. A model study in comparative political economy applied to international tax policy.
Steuerle, C. Eugene. The Tax Decade: How Taxes Came to Dominate the Public Agenda. Washington, D.C.: Urban Institute, 1992. The best history of the “Reagan Revolution” in tax policy.
Wallenstein, Peter. From Slave South to New South: Public Policy in Nineteenth-Century Georgia. Chapel Hill: University of North Carolina Press, 1987. The best fiscal history of a single state.
Witte, John F. The Politics and Development of the Federal Income Tax. Madison: University of Wisconsin Press, 1985. The leading history of the income tax from a pluralist point of view.
Zelizer, Julian E. Taxing America: Wilbur D. Mills, Congress, and the State, 1945–1975. Cambridge, U.K.: Cambridge University Press, 1998. Interprets the powerful chair of the House Ways and Means Committee as a reformer.

W. Elliot Brownlee

See also Budget, Federal; Capitation Taxes; Debts, Revolutionary War; Excess Profits Tax; Hamilton’s Economic Policies; Inheritance Tax Laws; Negative Income Tax; Poll Tax; Pollock v. Farmers’ Loan and Trust Company; Revenue, Public; Revolution, American: Financial Aspects; Sales Taxes; Social Security; Stamp Act; Tariff.

“TAXATION WITHOUT REPRESENTATION” was at the center of the ideological underpinnings of the American Revolution. Resistance to the practice originated with the establishment of parliamentary supremacy in England, especially during the seventeenth century, when “no taxation without representation” was asserted as every Englishman’s fundamental right. Colonial leaders also struggled during the seventeenth century to establish their provincial assemblies’ sole power to tax within the colonies. When Parliament attempted to raise revenues



in the colonies after 1763, colonial leaders vigorously protested, arguing that their rights as Englishmen guaranteed that, since colonists were not directly represented in Parliament, only their representatives in the colonial assemblies could levy taxes.

Taxes are not to be laid on the people but by their consent in person or by deputation . . . these are the first principles of law and justice and the great barriers of a free state, and of the British constitution in part. I ask, I want no more—Now let it be shown how ’tis reconcilable with these principles or to many other fundamental maxims of the British constitution, as well as the natural and civil rights which by the laws of their country all British subjects are entitled to, as their best inheritance and birthright, that all the northern colonies, who are without legal representation in the house of Commons should be taxed by the British parliament.

SOURCE: James Otis (Massachusetts lawyer and pamphleteer), “The Rights of the British Colonies Asserted and Proved,” Boston, 1764.

BIBLIOGRAPHY

Bailyn, Bernard. The Ideological Origins of the American Revolution. Enlarged ed. Cambridge, Mass.: Belknap Press, 1992.
Greene, Jack P. The Quest for Power: The Lower Houses of Assembly in the Southern Royal Colonies, 1689–1776. Chapel Hill: University of North Carolina Press, 1963.
Morgan, Edmund Sears. The Birth of the Republic, 1763–89. Chicago: University of Chicago Press, 1956.

Aaron J. Palmer

See also Assemblies, Colonial; Colonial Policy, British; Stamp Act; Sugar Acts; Taxation; and vol. 9: Massachusetts Circular Letter; Patrick Henry’s Resolves; Stamp Act.

TAYLOR V. LOUISIANA, 419 U.S. 522 (1975). Billy Taylor, accused of rape, appealed to the Supreme Court claiming that a state law exempting women from jury duty infringed on his Sixth Amendment right to be tried by an impartial jury. The Court ruled in Taylor’s favor, invalidating all state laws restricting jury duty on the basis of gender. In Louisiana women could be called for jury service only if they filed a written declaration of their willingness to serve. As a result, most Louisiana juries, including the one that convicted Taylor, were all-male. Louisiana’s practice was similar to one that the Court had unanimously upheld in Hoyt v. Florida (1961), a decision that sustained the Florida law as a reasonable concession to women’s family responsibilities. The Court had implied acceptance of states’ exclusion of women from grand


juries as recently as 1972. But Taylor ended special treatment for women. The Court quoted from an earlier case: “Who would claim that a jury was truly representative of the community if all men were intentionally and systematically excluded from the panel?” In Duren v. Missouri (1979) the Court extended its ruling in Taylor to a Missouri law allowing women to be exempted from jury service on the basis of their gender. Since Taylor, jury duty has been a responsibility shared equally by men and women. BIBLIOGRAPHY

Baer, Judith A. Women in American Law. New York: Holmes and Meier, 1985–1991.

Judith A. Baer / a. r.

See also Civil Rights and Liberties; Jury Trial; Women’s Rights Movement: The 20th Century.

TEA, DUTY ON. Tea coming to colonial America was subject to British import and excise or inland duties. The import duties were practically fixed at 11.67 percent, and the inland duties varied from four shillings to one shilling plus 25 percent ad valorem. The Revenue Act of 1767, which levied a duty of three pence per pound, stirred resentment against Britain and became the center of political resistance. Despite an attempted boycott against its importation, Americans would have their tea; between 1767 and 1774, more than 2 million pounds were imported and the American duty was paid. In 1773 the East India Company was permitted to export tea directly to America and set up wholesale markets in Boston, New York, Philadelphia, and Charleston. This created a de facto monopoly, precipitating agitation across the colonies not unlike that over the sale of stamps. There had been no change in the tax since 1767, but the tea ships with their loads of taxed freight became a symbol of taxation tyranny. Tories claimed the tea was coming in without any tax being paid. Whigs exposed the subterfuge. In the ensuing newspaper and pamphlet warfare, Alexander Hamilton won his first reputation as a political writer. Every tea ship was turned back or had its tea destroyed, unless its cargo was landed under an agreement that it would not be sold (see Boston Tea Party). After 1774 the Association enforced a boycott on most English imports. Some tea filtered through, was entered at the customshouses, and had the regular duty paid on it. BIBLIOGRAPHY

Brown, Richard D. Revolutionary Politics in Massachusetts: The Boston Committee of Correspondence and the Towns, 1772– 1774. Cambridge, Mass.: Harvard University Press, 1970.

O. M. Dickerson / a. r.

See also Smuggling, Colonial; Taxation; Tea Trade, Prerevolutionary.


TEA TRADE, PREREVOLUTIONARY. The Dutch in mid-seventeenth-century New Amsterdam were the first people in North America to drink tea. The habit caught on more slowly among the British colonists who succeeded the Dutch. Although tea was available to seventeenth-century British colonists—William Penn quite likely carried tea with him when he arrived in Pennsylvania in 1682, and the first license to sell tea in Boston was issued in 1690—it was not until after 1720 that the consumption of tea blossomed in British North America. By mid-century, nowhere in the Western world, other than Great Britain, was tea consumption more prevalent than along the eastern seaboard of North America. In 1774, approximately 90 percent of the affluent households in Massachusetts owned items associated with tea, such as teacups and teapots. Perhaps 50 percent of middling people and 42 percent of poor people also owned tea-making equipment on the eve of the American Revolution. By 1760, tea ranked third, behind textiles and ironware, among the goods colonists imported from Britain.

Like other goods imported into the colonies, tea was embedded in the British mercantile system of trade. The East India Company, which held a monopoly on the trade, shipped tea from China to London where wholesalers purchased it at auctions and then distributed it internally or exported it. The British government raised revenue through high import duties and heavy excise taxes on tea. Because of extensive smuggling, especially between 1723 and 1745 when taxes were at their highest, there is no way to measure accurately the amount of tea imported by the North American colonies. The illegal trade in tea, much of it from Holland, must have been sizeable, given that almost every ship the British seized or examined for smuggling included tea in its cargo.

The tea trade became a major point of contention between Britain and its American colonies in 1767, when tea was listed among the Townshend Duties. The nonimportation movement, which arose in response to the new duties, significantly reduced the quantity of tea entering the colonies. In New York and Philadelphia, the amount of tea imported from England fell from 494,096 pounds in 1768 to just 658 pounds in 1772. Exports to New England also declined from 291,899 pounds in 1768 to 151,184 pounds in 1772. When Parliament repealed the Townshend Duties in 1770, it retained the tax on tea as a symbol of the right and power of Parliament to tax the colonies.

The struggle over the tea trade climaxed in 1773 when Parliament passed the Tea Act, lowering the tax on tea and enabling the financially troubled East India Company to export tea directly to North America. Parliament anticipated that the Tea Act would lower tea prices in America and increase profits for the East India Company. British colonists, however, interpreted the Tea Act as an attempt by the British government to force them to accept Parliament’s right to tax them. In 1773, attempts to

bring tea into the colonies resulted in a series of “tea parties” in Annapolis, Boston, New York, Philadelphia, and Charleston. The efforts of revolutionaries to halt the tea trade never fully succeeded, however. In 1775, the British exported 739,569 pounds of tea to the colonies. BIBLIOGRAPHY

Scott, J. M. The Great Tea Venture. New York: Dutton, 1965.
Smith, Woodruff D. “Complications of the Commonplace: Tea, Sugar, and Imperialism.” Journal of Interdisciplinary History 23, no. 2 (1992): 259–278.

Krista Camenzind

See also Tea, Duty on.

TEACHER CORPS, created by the Higher Education Act of 1965. Senators Gaylord A. Nelson of Wisconsin and Edward M. Kennedy of Massachusetts proposed the legislation, and President Lyndon B. Johnson gave the idea a name. This program grew out of the same Great Society optimism that fueled Head Start and Volunteers in Service to America. During its seventeen-year life, the corps conducted more than 650 projects in cities, small towns, and rural areas, focusing on educational innovation. The first broad concern of the Teacher Corps was to improve education for the disadvantaged. In the mid-1960s, policymakers likened it to the Peace Corps— idealistic young people would bring energy and commitment to schools in blighted urban areas and poor rural communities. The corps encouraged graduates of liberal arts colleges and members of minority groups to join. The perspectives of these nontraditional teachers led to curricular innovation in individual instruction and multicultural education. A second innovation was in teacher training. After eight weeks of training, interns spent two years engaged simultaneously in university study, work-study in the schools, and work in communities, which included afterschool recreation activities, home visits, and health programs. During its last years, the Teacher Corps was more concerned with in-service training for teachers already in schools, focusing on professional development and innovations among veteran teachers. Cooperation among educators was important to the Teacher Corps. The Department of Health, Education and Welfare provided funds. At the state level, college and university teachers instructed interns and consulted with local schools. School districts and community groups then utilized the interns. Controversy surrounded the Teacher Corps from the beginning. The corps threatened the traditional rights of the states in educational matters, and issues of trust and authority simmered beneath the surface of relations between teachers and interns, school districts and universities, and the national office and local educators. Com-



munity groups were concerned about being shuffled aside. By the late 1970s, the mission of the corps became difficult to define and its varied constituents hard to satisfy. In an effort to cut back federal involvement in education, President Ronald Reagan officially eliminated the corps as part of the 1981 Education Consolidation and Improvement Act. It ceased operations in 1983. BIBLIOGRAPHY

Bernstein, Irving. Guns or Butter: The Presidency of Lyndon Johnson. New York: Oxford University Press, 1996.
Dallek, Robert. Flawed Giant: Lyndon Johnson and His Times, 1961–1973. New York: Oxford University Press, 1998.
Kaplan, Marshall, and Peggy L. Cuciti, eds. The Great Society and Its Legacy: Twenty Years of U.S. Social Policy. Durham, N.C.: Duke University Press, 1986.
Unger, Irwin. The Best of Intentions: The Triumphs and Failures of the Great Society under Kennedy, Johnson, and Nixon. New York: Doubleday, 1996.

Christine A. Ogren / a. e.

See also Education; Great Society; Peace Corps.

TEACHER TRAINING in the United States began in 1794 when the Society of Associated Teachers was formed in New York City to establish qualifications for teachers in that city. The Free School Society, established in 1805, also in New York City, began training teachers using public funds and organized a teacher-training course. In 1885, Brown University began to offer students courses in pedagogy, establishing one of the first university-level departments of education. When the study of teaching methods began to receive recognition as a valid program in the twentieth century, the certification standards for teachers increased throughout the United States.

By the end of the twentieth century, almost all American teachers received preservice training in institutions of higher education with programs that complied with state guidelines for certification. These institutions usually have separate schools or departments of education, and prospective teachers are education majors. Nearly every teacher holds a bachelor’s degree, and the vast majority have additional credits, with more than half holding one or more advanced degrees. Many states require graduate education for permanent licensure. Education students must take courses in pedagogical techniques, and prospective secondary teachers need a specified number of credit hours in the specific subject they plan to teach. Training includes a student teaching requirement, a period of classroom teaching under the supervision of a certified teacher. States vary in their course content and credit requirements. Since the 1980s, the expanding role of computers in the classroom has made familiarity with high technology almost mandatory for teachers, and organizations such as the National Teacher Training Institute offer them instruction on how best to integrate new technology into lesson plans.

Critics of teacher training programs cite an overemphasis on methods and psychological studies, the neglect of academic subjects, the need for accountability to ensure that training and certification are based less on academic credits and more on ability to function in the classroom, and the lack of uniform requirements among states. A Nation at Risk, the 1983 report of the National Commission on Excellence in Education, appointed by President Ronald Reagan, alerted the American public to the need to attract high-quality teaching candidates and to improve their training.

By the mid-1990s, most states offered alternative routes to certification to mid-career people and liberal arts graduates via programs that provide on-the-job supervision. On the federal level, the Troops to Teachers program helps qualified retired servicemen and servicewomen begin second careers as teachers in public schools. The Teach for America program, supported by private, corporate, and government donations, trains recent college graduates at summer institutes. Program participants then teach for at least two years in rural and urban low-income areas.

BIBLIOGRAPHY


Britzman, Deborah P. Practice Makes Practice: A Critical Study of Learning to Teach. Albany: State University of New York Press, 1991.
Edwards, Elizabeth. Women in Teacher Training Colleges, 1900–1960: A Culture of Femininity. New York: Routledge, 2000.
Leavitt, Howard B., ed. Issues and Problems in Teacher Education: An International Handbook. New York: Greenwood Press, 1992.

Myrna W. Merron / a. e.

See also Carnegie Foundation for the Advancement of Teaching; Education, Higher: Women’s Colleges; Peabody Fund; Smith-Hughes Act; Teacher Corps.

TEACHERS’ LOYALTY OATH. Since 1863, nearly two-thirds of the states have adopted loyalty oaths for teachers. Some oaths prohibit membership in subversive groups and the teaching of subversive doctrines, and others ask for sweeping disclaimers of past beliefs and associations. The early Cold War years following World War II produced a bumper crop of such oaths. In Cramp v. Board of Public Instruction of Orange County, Florida (1961), the Supreme Court struck down all-encompassing oaths infringing on First Amendment rights to freedom of thought and expression, but affirmed the constitutionality of generic teachers’ oaths to uphold state and federal constitutions in Knight v. Board of Regents of University of State of New York (1967).


BIBLIOGRAPHY

Reutter, E. Edmund, Jr., and Robert R. Hamilton. The Law of Public Education. 2d ed. Mineola, N.Y.: Foundation Press, 1976.

Samuel H. Popper / c. w.

See also Pierce v. Society of Sisters.

TEAMSTERS. See International Brotherhood of Teamsters.

TEAPOT DOME OIL SCANDAL. In October 1929, Albert B. Fall, the former Secretary of the Interior under President Warren G. Harding, was convicted of accepting bribes in the leasing of U.S. Naval Oil Reserves in Elk Hills, California, and Teapot Dome, Wyoming. They were leased to private oil barons Edward L. Doheny and Harry F. Sinclair, respectively. Though the reserves had been set aside in 1912 for the Navy in case of war, responsibility for the reserves had been passed to the Department of the Interior at the outset of Harding’s administration in 1921. Responding to the concerns of conservationists and many in business, Montana Senator Thomas J. Walsh opened hearings in October 1923 to investigate the competitive bidding practices Fall used for the leases. Walsh’s investigations eventually revealed that Doheny and Sinclair had together given Fall approximately $404,000 (about $4 million in 2000) either as loans or as investments in Fall’s New Mexico cattle ranch while he was serving in the cabinet. All three men faced charges of bribery and conspiracy to defraud the U.S. government; the Supreme Court canceled the leases in 1927. Sinclair was acquitted of conspiracy and bribery charges in 1928, and Doheny was acquitted in 1930. In a juridical paradox, the court ruled that regardless of Sinclair’s and Doheny’s intentions, Fall had, in fact, accepted the loans and investments as bribes and had been influenced by them. He was convicted in 1929 for accepting bribes and was imprisoned from 1931 to 1932. The political fallout of the scandal was enormous. Though Calvin Coolidge managed to hold on to the White House for the Republicans in 1924 by placing most of the blame on Fall and Harding (who died in office in 1923), the party faced charges of corruption through the 1950s. Moreover, Doheny’s prominence and associations in the Democratic Party seemed to spread the corruption to all aspects of politics in the 1920s.

BIBLIOGRAPHY

Davis, Margaret L. Dark Side of Fortune: Triumph and Scandal in the Life of Oil Tycoon Edward L. Doheny. Berkeley: University of California Press, 1998.
Stratton, David H. Tempest over Teapot Dome: The Story of Albert B. Fall. Norman: University of Oklahoma Press, 1998.

Eric S. Yellin

See also Scandals.

TECHNOCRACY MOVEMENT of the 1930s advocated the radical reorganization of American society around the principles of advanced technology. William Henry Smyth, an inventor and social reformer from California, first coined the term “technocracy” in 1919. Engineer Howard Scott revived the idea of a technological society during the economic depression that swept the United States in the 1930s. Scott believed that “technocrats” familiar with modern machinery could automate production, distribute industrial wealth, increase consumption, and spark a national economic recovery. Scott also argued that technocrats could apply their skills to remake the nation’s financial system and prevent future depressions. They could set a product’s value by the amount of energy consumed in production and redesign a monetary system based on “energy certificates” good for a certain amount of consumption. In Scott’s utopia, the government would also provide each citizen an annual income of $20,000 in exchange for a minimum amount of work.

To lay the groundwork for his technological society, Scott and a group of coworkers conducted an energy survey of North America from office space provided by Columbia University. Although their efforts fueled public interest, they also attracted the scornful denunciation of professional economists, and the Technocracy Movement essentially ended in 1933. However impractical Scott’s technocracy may have been, his theories highlighted the impact of machines on society and the pervasive economic inequality of the 1930s. Technocrats ultimately stimulated discussion of the nation’s economic problems and probably helped create a climate favorable for increasing federal involvement in the economy.

BIBLIOGRAPHY

Akin, William. Technocracy and the American Dream. Berkeley and Los Angeles: University of California Press, 1977.
Noble, David F. Forces of Production: A Social History of Industrial Automation. New York: Knopf, 1984.
Scott, Howard. Introduction to Technocracy. New York: J. Day Co., 1933.

Harris Gaylord Warren / e. m.

See also Automation; Great Depression; Share-the-Wealth Movements; Townsend Plan.

TECHNOLOGY. See Industrial Research and individual industries.



TECUMSEH’S CRUSADE. At the end of the French and Indian War in 1763, France gave up its claims to its vast North American empire. France abandoned not only its settlements but also generations of economic, military, and political alliances with hundreds of thousands of American Indians. Forced to redefine their economies and polities, many Algonquian communities throughout the Ohio River valley and southern Great Lakes began negotiating with the British to assume many of the lost opportunities for trade, tribute, and protection. Slowly, the British assumed many of the former roles of the French and established trading outposts and forts throughout Algonquian territories.

It was within this mutually constructed Anglo-Algonquian world that the young Shawnee warrior, Tecumseh, was raised. Witnessing the erosion of British strength following the American Revolution, the Shawnee and other Great Lakes groups increasingly faced the advancing American nation by themselves. Bloody conflicts between American settlers and Shawnee, Delaware, Miami, and Wyandot communities, among others of the Algonquian group, became commonplace in the late eighteenth and nineteenth centuries.

Despite the increased conflicts and pressures from American settlers, Algonquians and other Indian powers, including the Cherokee in Kentucky and Tennessee, continued to control the fertile lands to the Mississippi. Following the Louisiana Purchase of 1803, however, American settlers, surveyors, and politicians increasingly coveted the lands between the Ohio and Mississippi River. Many, including Thomas Jefferson, believed that Indians had either to adopt American farming economies or be removed permanently from American society, an idea of exclusion at odds with more than a century of Indian-white relations in the region. Conflicts continued to escalate in the early 1800s, and Algonquian communities had already begun taking up arms against American settlements when Britain again fought the United States in the War of 1812.

Organized around the military and political leadership of Tecumseh, Shawnee and other Indian communities had also recently begun a series of cultural reforms to spiritually revive and energize their communities. Under the influence of Tecumseh’s brother, Tenskwatawa, also known as the Prophet, this religious movement facilitated Tecumseh’s military and political efforts to organize Indian communities throughout the Great Lakes and into the South into a broad confederacy against the Americans. Known for his impassioned oratory and strategic vision, Tecumseh, with the aid of the British in Canada, guided the confederacy through a series of battles with American forces under the leadership of the Indiana territorial governor William Henry Harrison. Facing overwhelming military odds, particularly the lack of supplies, and unable to get non-Algonquian groups, such as the Cherokee and Iroquois, to fully support the confederacy’s efforts, Tecumseh’s aspirations for an overarching Indian union capable of withstanding American aggression ended on 5 October 1813, when he perished at the Battle of the Thames. As the British sued for peace and the confederacy dissolved, Shawnee and other Great Lakes Indian communities became displaced from their homelands in Ohio, Michigan, Indiana, and Illinois to lands west of the Mississippi.


BIBLIOGRAPHY

Edmunds, R. David. The Shawnee Prophet. Lincoln: University of Nebraska Press, 1983.
Sugden, John. Tecumseh’s Last Stand. Norman: University of Oklahoma Press, 1985.

Ned Blackhawk

See also Indian Policy, Colonial; Indian Policy, U.S.: 1775–1830; Indian Removal; Thames, Battle of the; Wars with Indian Nations: Colonial Era to 1783.

TEEPEE. See Tipi.

TEHERAN CONFERENCE. From 28 November to 1 December 1943, President Franklin D. Roosevelt, Prime Minister Winston Churchill, and Marshal Joseph Stalin met at Teheran, the capital of Iran, to coordinate Western military plans with those of the Soviet Union. Most important of all, the “big three” drew up the essential victory strategy in Europe, one based on a cross-channel invasion called Operation Overlord and scheduled for May 1944. The plan included a partition of Germany, but left all details to a three-power European Advisory Commission. It granted Stalin’s request that Poland’s new western border should be at the Oder River and that the eastern one follow the lines drafted by British diplomat Lord Curzon in 1919. The conference tacitly concurred in Stalin’s conquests of 1939 and 1940, these being Estonia, Latvia, Lithuania, and a slice of Finland. Stalin reiterated his promise, made in October 1943 at Moscow, to enter the war against Japan upon the defeat of Germany, but he expected compensation in the form of tsarist territories taken by Japan in 1905. On 1 December 1943, the three powers issued a declaration that welcomed potential allies into “a world family of democratic nations” and signed a separate protocol recognizing the “independence, sovereignty, and territorial integrity” of Iran. BIBLIOGRAPHY

Eubank, Keith. Summit at Teheran. New York: Morrow, 1985.
Mayle, Paul D. Eureka Summit: Agreement in Principle and the Big Three at Teheran, 1943. Newark: University of Delaware Press, 1987.


Sainsbury, Keith. The Turning Point: Roosevelt, Stalin, Churchill, and Chiang Kai-shek, 1943: The Moscow, Cairo, and Teheran Conferences. Oxford: Oxford University Press, 1985.

Justus D. Doenecke

See also World War II.

TELECOMMUNICATIONS. The history of telecommunications is a story of networks. Alexander Graham Bell on his honeymoon wrote of a “grand system” that would provide “direct communication between any two places in [a] city” and, by connecting cities, provide a true network throughout the country and eventually the world (Winston, Media Technology, p. 244). From the telegraph to the telephone to e-mail, electronic communication has extended farther and reached more people with increasing speed. The advent of the Internet in combination with a satellite system that covers the entire surface of the earth has brought us closer to the “global village” envisioned by Marshall McLuhan in the 1960s.

The variety of media included under the umbrella of “telecommunications” has expanded since the early twentieth century. The term was adopted in 1932 by the Convention Internationale des Telecommunications held in Madrid (OED). At this point, the telegraph, the telephone, and the radio were the only widely used telecommunications media. The United States, the point of origin for only one of these three (Bell’s telephone), soon came to dominate the telecommunications industries. The Radio Corporation of America (RCA) was created in 1919, three years before Britain’s British Broadcasting Corporation (BBC). By 1950, the American Telephone and Telegraph Company (AT&T) provided the best telephone service in the world. American television led the way after World War II (1939–1945). Then, in the early 1980s, a new device was introduced: the personal computer. Although not intended as a tool for telecommunications, the personal computer became in the 1990s the most powerful means of two-way individual electronic communication, thanks to a network that goes far beyond any “grand system” dreamed of by Bell. The network we now call the Internet gives a person with a computer and an Internet connection the ability to send not only words, but graphs, charts, audio signals, and pictures, both still and moving, throughout the world.

Most telecommunications networks were created for specific purposes by groups with vested interests. The telegraph network was created to make scheduling trains possible. Telephones were first primarily for business use. The grandfather of the Internet, ARPANET, was commissioned by the Department of Defense in 1969 to develop a military communication network that could withstand a nuclear attack. In general, the U.S. Congress has chosen to allow these networks to remain under private control with a modicum of regulation, in contrast to governments in Europe and Britain, which have turned these networks into public utilities. In the case of the Internet, we see the control moving from the military to the private sector, and Congress grappling with how to regulate “objectionable” communications such as pornography.

The Telegraph
The first practical means of electronic communication was the telegraph. The science on which it is based was over a century old when the sudden development of the railway system in the 1830s, first in England, then in America, made it necessary to communicate the movement of trains rapidly. The interconnection of the various technologies, one breeding the need for another, is well illustrated. But while the telegraph was developed with this one purpose in mind, the potential uses of the new device were soon recognized, and information other than that dealing with train schedules began to flow across the wires. In 1844, the Democratic National Convention’s nominee for vice president declined via telegraph, though the Convention, not trusting the new device, had to send a group from Baltimore to Washington, D.C., for face-to-face confirmation. Here we see an early example of the evolution of trust in these new networks.

While battles were waged over ownership, the technology continued to expand its influence as the stock market and the newspaper business, both in need of rapid transmission of information, began using the ever-expanding network. As with later technologies, there was debate in Congress over governmental control. Congress’ decision was to let the private sector compete to exploit this new technology. That competition ended with the adoption of one specific “code,” and Samuel Morse emerged as the Bill Gates of the telegraph.

The Telephone and the Fax
Telegraphy required training in Morse code on the part of both sender and receiver, so this form of telecommunication remained primarily a means of communication for business and for urgent personal messages sent from a public place to another public place. Bell’s telephone, invented in 1876, brought telecommunication into the home, although the telephone remained primarily a business tool until after World War II, when telephones became common in American homes. AT&T, formed in 1885, held a virtual monopoly on U.S. telephonic communication until 1982. The Justice Department forced the separation of Western Union from the company in 1913. At this point an AT&T vice president, Nathan Kingsbury, wrote a letter to the U.S. Attorney General, which came to be called the “Kingsbury Commitment.” It formed the basis of AT&T’s dominance of telephone service until 1982, when the Justice Department insisted that AT&T be severed into seven “Baby Bells” who each provided local service to a region.



The control that AT&T maintained probably contributed to the quality of phone service in the United States, but it also squelched some developments. For example, until 1968, only equipment leased from AT&T could be hooked to their network. Thus the facsimile machine (the fax), originally developed in the nineteenth century as an extension of telegraphy, did not come into use until after the 1968 FCC order forcing Bell to allow users to hook non-Bell equipment to the AT&T network. Factors other than technology often determine the evolution of telecommunications.

Walkie-Talkies. Al Gross shows two early models of walkie-talkies that he invented, precursors of more sophisticated forms of modern telecommunications. AP/Wide World Photos

Radio and Television
Radio and television are quite different from the telegraph and telephone: they communicate in one direction and “broadcast” to many listeners simultaneously. The Italian Guglielmo Marconi, working in England in 1896, patented his wireless system and transmitted signals across the Atlantic in 1901. By 1919 RCA was formed, and in 1926, it created the National Broadcasting Company (NBC). The radio was a common household appliance by the time of President Franklin Delano Roosevelt’s fireside chats in 1933, and its effect on the public was demonstrated inadvertently by Orson Welles in his radio drama based on H. G. Wells’s novel The War of the Worlds. Many people accepted the fictional tale of an invasion from Mars as fact and panicked. In 1939, NBC began broadcasting television signals, but television broadcasting was halted until after World War II ended in 1945. Both radio and television altered many aspects of American society: home life, advertising, politics, leisure time, and sports. Debates raged over television’s impact on society. Television was celebrated as an educational panacea and condemned as a sad replacement for human interaction.


The Internet
Like the Interstate Highway System, which carries a different kind of traffic, the Internet began as a Cold War postapocalypse military project in 1969. ARPANET was created to develop a means of effective communication in the case of a nuclear war. The Advanced Research Project Agency (ARPA), created in 1957 in response to the launch of Sputnik, advanced the case that such a network was necessary, illustrating again that necessity (or at least perceived necessity) is the mother of invention. Paul Baran, a RAND researcher studying military communications for the Air Force, wrote in 1964, “Is it time now to start thinking about a new and possibly non-existent public utility, a common user digital data communication plant designed specifically for the transmission of digital data among a large set of subscribers?”

As the ARPANET expanded, users developed software for sending electronic mail, soon dubbed e-mail, then just plain email. By 1973, about three-fourths of the traffic on this network connecting many research universities consisted of email. The network expanded to include other universities and then other local area networks (LANs). Once these local area networks became connected to one another, this new form of communication spread rapidly. In 1982, a protocol was developed that would allow all the smaller networks to link together using the Transmission Control Protocol (TCP) and the Internet Protocol (IP). Once these were adopted on various smaller “internets,” which connected various LANs, “the Internet” came into being. Just as railroad companies had to adopt a common gauge of track to make it possible to run a train across the country, so the various networks had to adopt a common protocol so that messages could travel throughout the network. Once this happened, the Internet expanded even more rapidly.

This electronic network, often dubbed “the information superhighway,” continued to expand, and in the early 1990s, a new interface was developed that allowed even unsophisticated users of personal computers to “surf the Internet”: the World Wide Web. With this more friendly access tool came the commercialization of this new medium.
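What adopting a common protocol means in practice can be suggested by a brief illustrative sketch in Python; the local address, port number, and message text below are arbitrary examples chosen for illustration and are not details from this entry. Two programs can exchange data only because both follow the same TCP/IP conventions.

# Illustrative only: a minimal TCP exchange on a single machine.
# The address, port, and message are arbitrary; any two hosts that
# agree on TCP/IP could do the same across a network.
import socket
import threading

HOST, PORT = "127.0.0.1", 9009  # hypothetical local address and port

server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server.bind((HOST, PORT))
server.listen(1)

def accept_one_message():
    # Accept a single connection and print whatever the peer sends.
    conn, _ = server.accept()
    with conn:
        print("received:", conn.recv(1024).decode())

listener = threading.Thread(target=accept_one_message)
listener.start()

with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as client:
    client.connect((HOST, PORT))
    client.sendall(b"hello over TCP/IP")

listener.join()
server.close()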


The Access Issue
Access has been a key issue throughout the history of telecommunications. The term “universal service,” coined in 1907 by Bell Chief Executive Officer Theodore Vail, came to mean, by midcentury, providing all Americans affordable access to the telephone network. There were still rural areas without electrical and telephone service in the mid-twentieth century (the two networks often sharing the same poles for stringing wires overhead), but by the end of the century, about 94 percent of all homes had phones (notable exceptions being homes in poverty zones such as tribal lands and inner-city neighborhoods). In the final decade of the twentieth century, cell phones became widely available, though they were not adopted as quickly in the United States as elsewhere. This new and alternative network for telephonic communication makes possible wireless access from remote sites, so that villages in central Africa, for example, can have telephone access to the world via satellite systems. In the United States, subscribers to cell phone services increased from about 5 million in 1990 to over 100 million in 2000, while average monthly bills were cut in half.

Despite the fact that access to the Internet expanded much faster than did access to earlier networks, there was heated political debate about the “digital divide” separating those who have such access from the have-nots. This points to the importance of this new form of telecommunication, which combines personal communication technology with information access. Thus, federal programs in the 1990s promoted Internet access to public schools and libraries. While 65 percent of public high schools had Internet access in 1995, the figure reached 100 percent by 2000. Once connected to this vast network, the computer becomes not only an educational tool but also a means of communication that can change the world. In 1989 news from Tiananmen Square protesters came out of China via email.

The Merging of the Media
By the mid-1990s, the impact of the Internet, the new digital technologies, the satellite systems, and fiber-optic cables was felt throughout the world of telecommunications. Radio stations began “web casting,” sending their signals out over the Internet so listeners around the world could tune in. By the turn of the twenty-first century, not only pictures but also entire movies could be downloaded from the Internet. As use of computers increased, the digital format became increasingly important, and by the end of the century digital television was a reality, though not widely in use. A variety of mergers by telecommunications companies increased the need for government oversight. Congress grappled with regulation of this ever-expanding field that knows no borders or nationality. The Telecommunications Act of 1996 extended the quest for “universal service” to “advanced telecommunications services,” but other attempts to regulate content on the Internet tended to be rejected by the courts as unconstitutional.

Effect of Medium on the Message
If television produced a generation that was more comfortable with the image than with the word, computers turned a later generation back to the word, and to new symbols as well. Marshall McLuhan in the 1960s said that “the medium is the message.” The phenomenon of the medium affecting the communication process is well illustrated by the development of the “emoticon” in e-mail, chat room, and instant messenger communications. Emoticons came about when email and Internet users discovered that the tone of their messages was often missed by receivers, who sometimes became offended when a joking tone was not inferred. Thus, the emoticon was proposed in 1979, first as a simple -) and then the more elaborate :-) to suggest tone, and soon this and other tone indicators came into widespread use.

Too often we limit ourselves to “just the facts” when considering technology, but its impact on the social sphere is important. Just as the automobile changed employment patterns (with rural residents commuting into the city) and architecture (creating the garage as a standard part of homes), so the telephone ended the drop-in visit and created telemarketing. It draws us closer electronically while distancing us physically. We are still debating the impact of the television, which seems to alter some family patterns extensively, and already we are discussing “Internet addiction.” Telecommunications remains an expanding and changing field that alters us in ways we might fail to recognize.

BIBLIOGRAPHY

Baran, P. “On Distributed Communication Networks.” IEEE Transactions on Communications Systems (1 March 1964).
“Digital Divide, The.” CQ Researcher 10, no. 3 (Jan. 28, 2000): 41–64.
Jensen, Peter. From the Wireless to the Web: The Evolution of Telecommunications, 1901–2001. Sydney: University of New South Wales Press, 2000.
Lebow, Irwin. Information Highways and Byways: From the Telegraph to the 21st Century. New York: IEEE Press, 1995.
Lubar, Steven D. InfoCulture: The Smithsonian Book of Information Age Inventions. Boston: Houghton Mifflin, 1993.
McCarroll, Thomas. “How AT&T Plans to Reach Out and Touch Everyone.” Time 142 (July 5, 1993): 44–46.
Mitchell, William J. City of Bits: Space, Place, and the Infobahn. Cambridge, Mass.: MIT Press, 1995. Available at http://mitpress2.mit.edu/e-books/City_of_Bits/
Winston, Brian. Media Technology and Society: A History: From the Telegraph to the Internet. New York: Routledge, 1998.

William E. King

See also AT&T.

TELECOMMUNICATIONS ACT of 1996 represented a bipartisan effort to overhaul the nation’s telecommunications laws. It encouraged deployment of new



telecommunications technologies and promoted competition among providers of local telephone service, between local and long distance telephone companies, and among providers of cable and broadcast television programming. The act in large part replaced the Communications Act of 1934, which was enacted at a time when technology was less mature and telephone and broadcast telecommunications were largely compartmentalized.

The Telecommunications Act significantly restructured local and national telephone markets. Local phone service had long been considered a natural monopoly. States typically awarded an exclusive franchise in each local area to a specific carrier, which would own and operate the infrastructure of local telephone service. By the early 1990s, however, technological advances made competition among local carriers seem possible, and the act ended the regime of state-sanctioned monopolies. The act prohibited states from enforcing laws impeding competition and imposed on existing local carriers a number of obligations intended to encourage competition. The most prominent of these duties was the requirement that such carriers share their networks with potential competitors. In exchange for opening local markets to competition, the act removed restrictions that prevented local carriers from providing long distance telephone service. By removing limitations on competition in both the local and long distance markets, the act made it possible for telephone companies to offer integrated long distance and local telephone service to the public.

The act also made significant changes to the regulation of cable and broadcast television in order to encourage competition. One of the more important changes was the authorization of telephone companies to provide cable television services. The act also eliminated the regulation of cable television rates, except for basic broadcast service, and liberalized prior restrictions on “over-the-air” television broadcasters that limited the number of broadcast stations that any one entity could own.

In addition to its provisions encouraging competition, the act contained controversial rules regarding obscenity, indecency, and violence on cable and broadcast television and on the Internet. It pressured television networks to establish a rating system for their programs, and required manufacturers of television sets to install “V-chips,” circuitry that would allow viewers to block violent or sexual programming. The act also required cable television operators that provide channels dedicated to sexually oriented programming either to completely scramble the channel or to limit the channel’s programming to nighttime hours. In 2000, the Supreme Court struck down this latter provision in United States v. Playboy Entertainment Group, saying that the provision regulated speech protected by the First Amendment and that the provision was not the least restrictive means of protecting children from inadvertently viewing offensive programming. The most contentious of the act’s provisions sought to protect minors from “indecent” communications on


the Internet through the use of criminal penalties for people sending such communications or for providers of Internet service that knowingly facilitated such communications. This part of the act was controversial in part because it regulated not only obscene communications, which do not receive constitutional protection under the First Amendment, but also those that were “patently offensive as measured by contemporary standards.” The Supreme Court struck down this provision the year after it was passed. In the 1997 case of Reno v. American Civil Liberties Union, the Court explained that the regulation of nonobscene communication was a content-based restriction of speech and that such a restriction merited careful scrutiny. The Court, troubled by the possible “chilling effect” the provisions would have on free speech, held that the provisions were too broad, vague, and undefined to survive constitutional challenge.

BIBLIOGRAPHY

Huber, Peter W., Michael K. Kellogg, and John Thorne. The Telecommunications Act of 1996: Special Report. Boston: Little, Brown, 1996.
Krattenmaker, Thomas G. “The Telecommunications Act of 1996.” Connecticut Law Review 29 (1996): 123–174.
Wiley, Richard E., and R. Clark Wadlow, eds. The Telecommunications Act of 1996. New York: Practising Law Institute, 1996.

Kent Greenfield

See also Censorship, Press and Artistic.

TELEGRAPH. The word “telegraph” originally referred to any device that facilitated long-distance communication. Although various means of “telegraphing” began thousands of years ago, it was not until the early nineteenth century that the concept of using electrical devices took root. By that time, Alessandro Volta had developed the battery, Hans Christian Oersted had discovered the relationship between electrical current and magnetism, and Joseph Henry had discovered the electromagnet. Combining these new technologies into a reliable communication system was to be the work of Massachusetts-born artist Samuel F. B. Morse.

Morse worked with partners Alfred Vail and Leonard Gale to design his electromechanical device, which Morse described as the “Recording Telegraph.” In 1837, Morse’s newly patented telegraph featured a dot-and-dash code to represent numbers, a dictionary to turn the numbers into words, and a set of sawtooth type for sending signals. Morse demonstrated his telegraph at a New York exhibition a year later with a model that used a dot-dash code directly for letters instead of the number-word dictionary. “Morse code” was to become standard throughout the world. The dots or dashes, created from an interruption in the flow of electricity, were recorded on a printer or interpreted orally.
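The logic of that letter code can be suggested with a short illustrative sketch in Python; the handful of dot-dash mappings shown are standard International Morse conventions supplied here for illustration rather than details drawn from this entry, and the sample message is the 1844 demonstration message mentioned later in the entry.

# Illustrative only: a minimal encoder using a few standard International
# Morse letter mappings.
MORSE = {
    "A": ".-", "D": "-..", "G": "--.", "H": "....",
    "O": "---", "R": ".-.", "T": "-", "U": "..-", "W": ".--",
}

def encode(message):
    # Letters become dot-dash groups; words are separated by " / ".
    words = message.upper().split()
    return " / ".join(
        " ".join(MORSE[ch] for ch in word if ch in MORSE) for word in words
    )

print(encode("What hath God wrought"))
# Prints: .-- .... .- - / .... .- - .... / --. --- -.. / .-- .-. --- ..- --. .... -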


In 1844, Congress funded $30,000 for the construction of an experimental telegraph line that was to run the forty miles between Washington, D.C., and Baltimore. From the Capitol in Washington, Morse sent the first formal message on the line to Baltimore, “What hath God wrought?” Rapid advances in telegraph use followed. Small telegraph companies began operations throughout the United States, including American Telegraph Company, Western Union Telegraph Company, New York Albany and Buffalo Electro-Magnetic Telegraph Company, Atlantic and Ohio Telegraph Company, Illinois and Mississippi Telegraph Company, and New Orleans and Ohio Telegraph Company. In 1861, Western Union built its first transcontinental telegraph line. The first permanently successful telegraphic cable crossing the Atlantic Ocean was laid five years later. The invention of “duplex” telegraphy by J. B. Stearns and “quadruplex” telegraphy by Thomas A. Edison in the 1870s enhanced the performance of the telegraph by allowing simultaneous messages to be sent over the same wire.

All rapid long-distance communication within private and public sectors depended on the telegraph throughout the remainder of the nineteenth century. Applications were many: Railroads used the Morse telegraph to aid in the efficiency and safety of railroad operations, the Associated Press to dispatch news, industry for the transmission of information about stocks and commodities, and the general public to send messages. The telegraph’s military value was demonstrated during the Civil War (1861–1865) as a way to control troop deployment and intelligence. However, the rival technologies of the telephone and radio would soon replace the telegraph as a primary source of communication. Until the mid-1970s, Canada used Morse telegraphy, and Mexico continued with the system for its railroads up to 1990. However, the telegraph is no longer widely used, save by a small group of enthusiasts. Although radiotelegraphy (wireless transmission using radio waves) is still used commercially, it is limited in the United States to just a few shore stations that communicate with seafaring ships. Telephones, facsimile machines, and computer electronic mail have usurped the Morse model of long-distance communication.

Samuel F. B. Morse. A painting of the artist with his invention, the telegraph, which transmitted messages using his “Morse code” of dots and dashes. Library of Congress

BIBLIOGRAPHY

Bates, David Homer, and James A. Rawley. Lincoln in the Telegraph Office: Recollections of the United States Military Telegraph Corps During the Civil War. Lincoln: University of Nebraska Press, 1995.
Blondheim, Menahem. News Over the Wires: The Telegraph and the Flow of Public Information in America, 1844–1897. Cambridge, Mass.: Harvard University Press, 1994.
Gabler, Edwin. The American Telegrapher: A Social History, 1860–1900. New Brunswick, N.J.: Rutgers University Press, 1988.
Jolley, E. H. Introduction to Telephony and Telegraphy. London: Pitman, 1968.

Kym O’Connell-Todd

See also Telecommunications; Western Union Telegraph Company.

TELEPHONE. The telephone, a speech transmission device, dates from 1876, the year Alexander Graham Bell patented his “Improvements in Telegraphy.” Many inventors had been experimenting with acoustics and electricity, among them Thomas Edison, Emil Berliner, and Elisha Gray. Each of these men, as well as Bell’s assistant Thomas Watson, contributed modifications that resulted in the telephone we recognize today. Technology has advanced, but the fundamental principles remain the same.

When the Bell Telephone Company formed to market its product in 1877, the telegraph was the reigning telecommunication service. Coast-to-coast communication had been possible since 1861, and 2,250 telegraph offices spanned the country. Earlier that year, Western Union had been offered the Bell patent but refused it, only to buy telephone technology from others. Although Bell held the patent for the device, 1,730 other companies were making telephones. In 1882, the American Bell Telephone Company won a court judgment against Western Union and gained controlling interest in the company, an event that paved the way for modern telephone systems. In 1885, Bell formed a subsidiary, American Telephone & Telegraph (AT&T), which provided a network to which Bell-licensed companies could connect. For the first time, long-distance calling became possible.

As the twentieth century progressed, the importance of telephone service in the daily lives of Americans increased. The Bureau of the Census estimated that in 1920, 35 percent of households had telephones. Fifty years later the figure had risen to 90.5 percent. The Bell System manufactured and installed all telephone equipment and provided all the services. As a national monopoly, it had regulated rates. It was often written that Bell was the best telephone system in the world. The 1877 technology start-up had become the largest privately owned industry in United States history, with more than 1 million employees and $152 billion in assets in 1983. However, as the 1960s drew to a close, complaints of poor service and of “Ma Bell’s” monopoly attracted government attention. In 1974, the Department of Justice filed an antitrust suit against AT&T that culminated in a 1984 court order that deregulated the industry. The Bell System had lost its empire, but its pioneering engineers left an indelible mark on the world.

Bell Telephone announced the first transcontinental telephone service at the San Francisco World’s Fair in 1915. Radiotelephone service to other countries and ships at sea was available after 1927. A transatlantic telephone cable was laid in 1956. The transmission of calls by microwave began soon after World War II (1939–1945), and Bell Laboratories initiated satellite communications with the launch of Telstar in 1962.

The Bell System invention that had the most dramatic impact on the world was the transistor. Unveiled in 1948, it made small electronic devices possible. The transistor was vital to the development of hearing aids, portable radios, and the personal computer. AT&T introduced modems for data transmission between computers over telephone lines in 1958. A Department of Defense computer network project begun in 1969 (ARPANET) developed into the Internet by 1992, and the popular World Wide Web appeared in 1994. By 2001, 143 million Americans, more than half the population, were communicating online, sending data and audio and video transmissions. Eighty percent of them relied on telephone dial-up connections.

Alexander Graham Bell. A photograph of the inventor (seated) demonstrating his telephone, patented in 1876. U.S. National Aeronautics and Space Administration

BIBLIOGRAPHY

Grosvenor, Edwin, and Morgan Wesson. Alexander Graham Bell: The Life and Times of the Man Who Invented the Telephone. New York: Abrams, 1997.

Gwanthmey, Emily, and Ellen Stern. Once Upon a Telephone: An Illustrated Social History. New York: Harcourt Brace, 1994.

Katz, James Everett. Connections: Social and Cultural Studies of the Telephone in American Life. New Brunswick, N.J.: Transaction, 1999.

Noll, A. Michael. Introduction to Telephones and Telephone Systems. Norwood, Mass.: Artech House, 1999.

Christine M. Roane See also AT&T; Internet; Telecommunications.

TELEPHONE CASES. Alexander Graham Bell’s 1876 patent on the telephone had barely been filed before a legion of other inventors surfaced to claim rights to the invention. The Western Union Telegraph Company purchased the rights to Amos E. Dolbear’s and Thomas A. Edison’s telephone inventions and began manufacturing and installing telephones. The Bell Telephone Company brought suit and prevailed in court in 1879. Over the next two decades, the holders of the Bell patent battled more than six hundred lawsuits. Daniel Drawbaugh, an obscure Pennsylvania mechanic who claimed to have had a workable instrument as early as 1866, came nearest to defeating them, losing his Supreme Court appeal in 1887 by a vote of only four to three. The government sought from 1887 to 1897 to annul the patent, but failed. BIBLIOGRAPHY

Harlow, Alvin F. Old Wires and New Waves: The History of the Telegraph, Telephone, and Wireless. New York: D. Appleton-Century, 1936.

Alvin F. Harlow / a. r.


See also AT&T; Industrial Research; Patents and U.S. Patent Office; Telecommunications.

TELEVANGELISM. As television became a staple of American culture in the second half of the twentieth century, a growing number of Protestant preachers embraced the new mass medium to deliver their messages. Catholics, too, took to the airwaves, most famously in the person of Bishop Fulton J. Sheen, who used the new medium to demonstrate the compatibility of American culture and Catholic faith.

Televangelism emerged after World War II as an outgrowth of evangelicalism, a type of Protestant religion based on the idea that people needed to open their hearts and redirect their wills toward Christ, not only to secure an eternal place in heaven, but also to better their lives on earth. While evangelicals point to the New Testament story of Jesus commissioning disciples as the origin of their movement, modern evangelicalism emerged in eighteenth-century Britain and North America in the context of a burgeoning market economy. Preachers skilled at awakening religious feelings in their audiences used open-air stages to promote their beliefs and to enact the emotional process of repentance for sin and heartfelt commitment to God. The foremost evangelical predecessor of televangelists was the Anglican preacher George Whitefield, an actor before his conversion, whose combination of religious fervor, theatrical flair, and marketing genius made him the most celebrated figure in America in the decades preceding the American Revolution. One of the first entrepreneurs to cultivate publicity for his performances through the fast-growing newspaper medium, Whitefield drew large audiences to his sermons, which included tearful reenactments of the lives of biblical characters. These gatherings, where rich and poor, slave and free, men and women rubbed shoulders, exerted a democratizing force, although Whitefield himself never condemned the institution of slavery and was a latecomer to the cause of American independence.

As evangelicalism developed in America, African Americans contributed elements of African religious tradition, such as spirit possession, call and response, and the five-tone musical scale, to the repertoire of evangelical performance. In nineteenth century America evangelicalism was often associated with social reform, especially antislavery, education, and temperance. In the early twentieth century, however, evangelicalism became increasingly tied to conservative politics, fundamentalist interpretations of the Bible, and hostility to liberal forms of Protestant theology and social reform. When Billy Graham began to make use of television in the 1950s, evangelicalism was almost as closely identified with anticommunism as it was with personal salvation.

Jim and Tammy Faye Bakker. Their “PTL (Praise the Lord) Club” was one of the most popular televangelist programs until the late 1980s, when Bakker confessed to adultery and then was convicted of defrauding his contributors. AP/Wide World Photos

The most famous televangelist of the twentieth century, Graham turned from radio to television to broadcast his message. Combining fervent preaching, heart-melting music, and personal testimonies from successful people, Graham’s crusades traveled around the country and eventually around the world, carrying the evangelical mix of religious outreach, theatrical entertainment, and creative entrepreneurship to new levels of sophistication. Graham’s evident personal integrity and continual prayers for the spiritual guidance of political leaders led to his visibility as a respected public figure and to his role as counselor to several American presidents. Televangelism boomed in the 1970s and 1980s, when the Federal Communications Commission (FCC) changed its policy of mandating free time for religious broadcasts to allow stations to accept money for religious programs. This regulatory change inspired more than a few preachers to use television as a means of funding their ministries. Oral Roberts sought funds for the development of the City of Faith Medical and Research Center in Tulsa, Oklahoma, by introducing the concept of “seed faith,” a means by which viewers might reap miracles from God in their own lives by donating to Roberts’s ministry. In The Hour of Power, broadcast from the Crystal Cathedral in Garden Grove, California, Robert Schuller preached about the power of positive thinking, offering viewers the chance to purchase membership in his Possibility Thinkers Club along with a mustard seed cross as a sign of their faith. Pat Robertson’s success in introducing a talk-show format to showcase interviews with people testifying to the power of faith led to the purchase of his own network, the Christian Broadcasting Network (CBN), which funded his bid for the Republican presidential nomination in 1988. Televangelists’ power to generate money contributed to the formation of conservative political constituencies, like Jerry Falwell’s Moral Majority and the Christian Coalition led by Robertson and Ralph Reed, which influenced public policy and political rhetoric in the United States. At the same time the money in televangelism stim-

ulated various forms of corruption and scandal, leading to deepening distrust of televangelists on one hand and to more rigorous forms of accounting on the other. In the 1990s and the early years of the twenty-first century televangelism grew along with communications technology and the increasing pluralism of American religious life. Satellite, cable, and Internet technologies offered new opportunities for evangelical outreach and made increasingly sophisticated forms of presentation readily available. This technological expansion fostered the development of niche programming—shows devoted to biblical prophecy, for example—as well as the extension of televangelism’s mix of entertainment, self-promotion, and missionary outreach to other groups—for example, Catholics advocating devotion to Mary through dramatic reenactments of their own piety. As televangelism diversified, the distinctively Protestant character of its message blurred. Televangelism’s success compromised Protestant evangelicalism’s exclusive claim to salvation. BIBLIOGRAPHY

Alexander, Bobby C. Televangelism Reconsidered: Ritual in the Search for Human Community. Atlanta, Ga.: Scholars Press, 1994.

Balmer, Randall. Mine Eyes Have Seen the Glory: A Journey into the Evangelical Subculture in America. Expanded ed. New York: Oxford University Press, 1993.

Schmidt, Rosemarie, and Joseph F. Kess. Television Advertising and Televangelism: Discourse Analysis of Persuasive Language. Philadelphia: J. Benjamins Publishing, 1986.

Schultze, Quentin J. Televangelism and American Culture: The Business of Popular Religion. Grand Rapids, Mich.: Baker Book House, 1991.

Stout, Harry S. The Divine Dramatist: George Whitefield and the Rise of Modern Evangelicalism. Grand Rapids, Mich.: Eerdmans, 1991.

Amanda Porterfield See also Evangelicalism and Revivalism; Protestantism; Religion and Religious Affiliation; Television: Programming and Influence.

TELEVISION

This entry includes 2 subentries:
Programming and Influence
Technology

PROGRAMMING AND INFLUENCE
By 1947, the American Broadcasting Company (ABC), Columbia Broadcasting System (CBS), the Du Mont Network, and the National Broadcasting Company (NBC) had started regularly scheduling television programs on a small number of stations. Many more channels soon commenced operations, and a TV boom began. By 1960 just under 90 percent of all households had one or more sets. Because most channels had network affiliation agree-


ments—96 percent of all stations in 1960—the networks dominated the medium for over thirty years. (Du Mont ceased operations in 1955.) Especially in the evening, when most Americans watched TV, consumers very likely viewed a network program. In the late 1940s, relatively few advertisers were prepared to follow the American radio model of producing and underwriting the cost of shows. Within a few years, however, and often by accident, the networks and a few advertisers developed individual programs that sparked interest in the medium. This, in turn, encouraged more companies to advertise on TV. At first, television betrayed both a class and regional bias. The coaxial cable permitting simultaneous network telecasts did not reach Los Angeles, the center of the nation’s motion picture industry and home to most popular entertainers, until September 1951. As a result, most network shows originated from New York. And programs tended to have a New York accent. At the same time, programmers often confused their own, more cosmopolitan, tastes with those of viewers. Network executives assumed audiences wanted culturally ambitious fare, at least some of the time. Some simply believed the TV audience was more educated and well-to-do, despite studies indicating little class bias to set ownership. In the 1950s, television relied on a variety of program types or “genres.” The first was the variety program, telecast live with a regular host. Milton Berle and Ed Sullivan starred in two of the most durable variety hours. Individual sponsors produced “dramatic anthologies,” original dramas aired live. Although many TV plays were uneven or pretentious, some proved memorable, notably Marty, which was later remade as a feature film starring Ernest Borgnine. Other program types came from network radio: the dramatic series, situation comedy, and quiz (later game) show. They relied on one of radio’s oldest objectives: create a consumer habit of tuning to a specific program every day or week. (Many closed with the admonition, “Same time, same station.”) CBS, of the four networks, adhered most dutifully to this model of programming. The success of CBS’s situation comedy I Love Lucy (1951–1957) confirmed the network’s strategy. More tellingly, repeats of episodes proved almost as popular. This greatly undermined another broadcast industry “rule”: that audiences always wanted original programming, even in the summer when replacement series heretofore had been offered. By the late 1950s, most series were filmed. They had an additional advantage over the live telecast. They could not only be rerun in the summer but then rented or “syndicated” for re-airing by individual stations in the United States and overseas. Lucy, it should be noted, was the single most rerun series in the history of television. TV’s dependency on film accelerated in the late 1950s. ABC banked heavily on filmed action/adventure


series—first westerns, then detective dramas—many of which gained large followings. CBS and NBC quickly seized on the trend. During the 1958–1959 season, seven of the ten most popular programs, according to the A. C. Nielsen ratings service, were westerns. Most were considerably more sophisticated than television’s earliest westerns, such as Hopalong Cassidy and The Lone Ranger, which were plainly aimed at pre-adolescents. The new “adult” westerns and detective series also possessed higher production values. The large audiences especially for westerns also indicated a change in the television audience, as TV spread into smaller cities and towns in the South and West. Filmed programming satisfied small-town audiences, which, as movie exhibitors had long known, greatly preferred westerns over nightclub comedy or original drama. By the end of the 1950s, the economics of television had become clear. Networks and stations derived most of their revenues from the sale of time to advertisers. Indeed, the stations that the networks owned were their most profitable properties. Producing successful programs was far more risky—too much for most stations to do extensively. Most new television series failed. Yet a popular program could be a moneymaker in syndication. With this prospect in mind, as well as a wish to wrest control from advertisers, the networks gradually began producing more of their own programming. Government regulations, however, severely restricted network participation in entertainment programming in the 1970s and 1980s. News programming was the great laggard in early TV. The networks produced fifteen-minute early evening weekday newscasts and telecast special events, including the national party conventions and presidential inaugurations. Informational shows were considered “loss leaders,” presented to satisfy TV critics and federal regulators. The Federal Communications Commission (FCC) assigned TV licenses, including the limited number that the agency permitted the networks to own. The FCC expected each license holder to devote a small proportion of its schedule to “public interest” programming, including news. Under no pressure to win audiences, news program producers had great latitude in story selection. That said, TV news personnel tended to be political centrists who took their cues from colleagues working at the prestigious newspapers. For all its shortcomings, early television news had one great journalist, Edward R. Murrow of CBS. Revered for his radio coverage of World War II, Murrow coproduced and hosted the documentary series See It Now, beginning in 1951. Although widely praised and courageous in its treatment of domestic anti-Communism, See It Now never won a large audience. His less critically admired interview program Person to Person, was far more popular and, indeed, anticipated similar, more celebritycentered efforts by Barbara Walters of ABC several decades later.

In the early 1960s, NBC and CBS began pouring more of their energies into their early evening newscasts, lengthening them from fifteen to thirty minutes in 1963. (ABC did not do so until 1967 and waited another decade before investing substantially in news.) The early evening newscast strategy reflected the “habit” rule of broadcasting, while proving very profitable. Although audiences did not equal those for entertainment shows later in the evening, the nightly newscasts drew enough viewers to interest advertisers. Similarly successful was NBC’s Today show, which premiered in 1952. Aired in the early morning for two hours, Today offered a mix of news and features. ABC eventually developed a competitor, Good Morning America. In the late 1950s and 1960s, all three networks occasionally produced documentaries, usually an hour long, that explored different public issues. Although they rarely had impressive ratings, documentaries mollified critics and regulators dismayed by the networks’ less culturally ambitious programming. The opportunity costs (the value of goods or services that one must give up in order to produce something) of airing documentaries, however, grew with heightened advertiser demand for popular series in the late 1960s. The networks quietly reduced their documentary production. Although most TV critics were dismayed, the FCC, which had earlier encouraged such programming, said nothing. Partly relieving the networks of their former obligations was the Public Broadcasting Service (PBS), created by Congress in 1969. Although chronically underfinanced, PBS managed to produce some public affairs and informational programming, once the preserve of the commercial networks. The commercial network documentary had all but vanished by 1980. In its place came a new type of news show. CBS’s 60 Minutes, which debuted in 1968, was the trendsetter. The documentary’s great weaknesses, according to 60 Minutes producer Don Hewitt, was its slow pacing. Largely because of its devotion of an hour or more to one “serious” issue like German unification, it bored the majority of viewers. Hewitt wanted to make news programming engaging. “Instead of dealing with issues we [will] tell stories,” he remarked (Richard Campbell, 60 Minutes and the News, p. 3). And he determined to mix it up. On 60 Minutes, no single topic would absorb more than a quarter hour. The topics covered, in turn, would vary to attract as many in the audience as possible. It came to be known as the first TV “magazine” and eventually, 60 Minutes nurtured a large following. Indeed, it became the first news program to compete successfully with entertainment series in evening prime time. All three networks found airing newsmagazines irresistible. They were considerably cheaper than entertainment programming and the network could own and produce the program, and not pay fees to an independent company. (At the time, the FCC limited network ownership of entertainment programs.) This meant higher


profits, even if a 60 Minutes imitator accrued smaller ratings than a rival entertainment series. The tone of network news changed over time. In the 1950s and early 1960s, TV news programs tended to be almost stenographic. A network newscast report on a cabinet secretary’s speech was largely unfiltered. This approach had several explanations. Excessively critical coverage might upset federal regulators. Then, too, broadcast news people tended to share in many of the assumptions of newsmakers, especially in regards to the Cold War with the Soviet Union. Television’s coverage of America’s involvement in Vietnam, especially during the escalation of U.S. participation (1963–1967), was hardly hostile. Nor was TV’s combat footage especially graphic. Still, the inability of the U.S. military to secure South Vietnam, despite repeated claims of progress, shattered the Cold War consensus while fostering a new skepticism toward those in power. So did the attempts by the Nixon administration to cover up scandals associated with the Watergate break-in of 1972. The networks did not cover the Watergate affair as searchingly as some newspapers, the Washington Post or Los Angeles Times, for example. Yet the scandals further damaged relations between government officials and network TV news correspondents. But correspondents had not become leftist ideologues, as many conservatives assumed; network reporters’ politics remained strikingly centrist. Rather, TV correspondents tended to mediate government news more warily—regardless of which party controlled the executive branch. Network TV news also became more correspondentcentered. The reporter’s interpretation of an announcement—not the announcement itself—dominated most network news accounts.

Still, in times of grave national crisis, network newscasters self-consciously assumed a special role. After the assassination of John F. Kennedy in 1963 and the resignation of Richard M. Nixon in 1974, television journalists sought to reassure and unite the nation. The sociologist Herbert J. Gans dubbed this the “order restoration” function of the national news media. The terrorist attacks of September 2001 prompted a similar response, as well as demonstrations of patriotism not seen on television news since the early Cold War.

Local news programming became especially important to individual stations. Stations initially aired news programs as a regulatory concession. Most followed the networks in expanding their newscasts from fifteen minutes in the 1960s. They were of growing interest to advertisers, and became the single most profitable form of local programming. Stations extended the length and frequency of their newscasts. Production values and immediacy increased as stations switched from film to videotape for their stories. As the competition among stations for ratings grew, the news agenda changed. Little time went to serious issues—which were often difficult to capture visually—as opposed to features, show-business news, and, in larger markets, spectacular fires and crimes.

Sporting events had long been a convenient means of filling the schedule. Because their audiences were disproportionately male, however, most sports telecasts could not command the same ratings as popular entertainment series, except for the championship series in baseball and the National Football League (NFL). Moreover, in airing sporting contests, television played favorites. Football proved to be the most “telegenic” sport, and began luring viewers on Sunday afternoons, which had long been considered a time when people would not watch television. Professional football broke another rule by achieving ratings success in prime time, with the debut of Monday night NFL telecasts on ABC in 1970. Cable television in the 1980s and 1990s created more outlets devoted to sports. With a cable connection, subscribers could improve their TV’s reception and greatly increase their programming choices. In the 1980s, the non-cable viewer could select from seven channels; the cable home had thirty-three. More and more consumers preferred to have more options, which multiplied in the 1990s. In the late 1980s, cable reached about half of all households. A decade later, just under 70 percent of all homes had cable. Although cable offered an extraordinary range of choices, viewer preferences were strikingly narrow. Channels playing to certain, specialized tastes enjoyed the greatest success. Eight of the fifteen most watched cable telecasts the week of 17–23 December 2001 were on Nickelodeon, which programmed exclusively for young children. Professional wrestling and football programs placed five shows that week.

With cable’s spread, the networks saw their share of the evening audience fall from 90 percent in the mid-1970s to just over 60 percent twenty years later. The network early evening newscasts suffered even larger declines. The creation of all-news cable channels, beginning with the Cable News Network (CNN) in 1980, ate away at the authority of the network news programs. Still, CNN’s effects should not be overstated. Except during a national crisis, relatively few watched CNN. Entertainment cable channels actually posed the larger problem. The availability of such channels gave viewers alternatives to the newscasts they had not previously had.

All in all, cable had contradictory effects on the networks. News producers, anxious to retain audiences, made their newscasts’ agenda less serious and more fixated on scandal (a trend also explained by the end of the Cold War). At the same time, entertainment programs, similarly losing viewers to cable, became more daring. This was not because cable programs, with a few exceptions on pay cable services, violated moral proprieties. Many cable channels aired little other than reruns of network programs and old feature films. For the networks, however, only a more relaxed standard could hold viewers, especially younger ones. While still voluntarily honoring some moral strictures, television series handled violence and sexual relations with a realism unimaginable a generation


earlier. Old prohibitions against the use of profanity and nudity were partially relaxed. No network hurried this trend along more enthusiastically than Fox. Formed in 1986, Fox carried a number of comedies, action dramas, and reality shows (When Good Pets Go Bad ), some of which consciously crossed mainstream boundaries of good taste. Fox owner Rupert Murdoch, an Australian publisher of tabloid newspapers, lacked the self-conscious sensibility of his older rivals. Fox’s rise coincided with the relaxation of federal regulations. Between the 1920s and 1970s, the relative scarcity of on-air channels justified government oversight of broadcasting. The radio spectrum only permitted so many stations per community. With cable eliminating this rationale, the FCC in the 1980s systematically deregulated broadcasting. In the late twentieth century, television license holders aired news programs to make money, not to please federal officials. Congress approved this course, and the 1996 Telecommunications Act weakened remaining FCC rules limiting the number of stations that networks and others could own. Institutional Impacts of Television The nation’s established mass media—radio, films, and newspapers—reacted differently to television’s sudden presence in the American home. Radio felt the effects first, as audiences for radio programs, particularly in the evening, dropped sharply in the first half of the 1950s. Radio’s relative portability allowed some recovery, especially with the development of the transistor. Then, too, in the 1950s, most Americans only owned one television. Those unhappy with what another family member insisted on watching could listen to a radio elsewhere in the house. Moreover, radio could be a diversion for those doing the dishes or cleaning a room. At the same time, radio listening while driving became much more common as more automobiles were equipped with radios, and the percentage of Americans who owned cars increased. In addition, some radio stations broke with an older industry tradition by targeting a demographic subgroup of listeners, specifically, adolescents. Stations hired disc jockeys who continuously played rock and roll music. Television stations and networks could only offer a few programs tailored to teens. Advertisers prized their parents more. Radio, in that regard, anticipated the direction of television’s competitors after the 1960s. Radio stations continued to narrow their formats by age, race, and politics. Television presented an enormous challenge to the film industry. Theater attendance dropped sharply in the late 1940s and early 1950s. however, box office receipts were declining even before television arrived in many communities. With marginal theaters closing, the studios responded by reducing the number of movies produced per year. To compete with TV, more films had elaborate special effects and were produced in color. (Not until 1972 did most homes have color televisions.) The collapse of film censorship in the mid-1960s gave Hollywood an-

other edge: violence and sexual situations could be portrayed with an unprecedented explicitness that TV producers could only envy. Although most large studios at first resisted cooperating with the television networks, by the mid-1950s virtually every movie company was involved in some TV production. With some exceptions, most of Hollywood’s initial video work resembled the old “B” movie, the cheaper theatrical release of the 1930s and 1940s produced as the second feature for a twin billing or for the smaller theaters, most of which had ceased operations in the late 1950s. In the late 1960s, motion picture firms began producing TV movies, that is, two-hour films specifically for television. At first, they were fairly cheaply mounted and forgettable. But a few had enormous impact. ABC’s Roots, telecast in 1977, chronicled the history of an African American family and prompted a new appreciation for family history. Although the TV films remained popular through the 1980s, higher costs caused the networks to lose their enthusiasm for the genre, which all but disappeared from the small screen in the 1990s. No major mass medium responded more ineffectively to the challenge of television than newspapers. For more than two decades, newspaper publishers refused to regard TV as a threat to their industry. Indeed, the diffusion of television did not initially affect newspaper circulation. In the long run, however, TV undermined the daily newspaper’s place in American life. As “baby boomers,” those Americans born between 1946 and 1963, reluctantly entered adulthood, they proved less likely to pick up a paper. If they did, they spent less time reading it. Publishers belatedly responded by making their papers more appealing to a generation raised with television. They shortened stories, carried more pictures, and used color. Assuming, not always correctly, that readers already knew the headlines from television, editors insisted that newspaper stories be more analytical. Yet they were losing the war. The more interpretive journalism failed to woo younger readers, while many older readers deemed it too opinionated. Although Sunday sales were fairly stable, daily circulation per household continued to drop. Like many newspaper publishers, America’s political class only slowly recognized television’s impact. John F. Kennedy’s video effectiveness during the 1960 presidential campaign, however, changed many minds, as did some powerful television political spots by individual candidates later in the decade. TV advertising became an increasingly common electoral weapon, even though its actual impact was debated. Nevertheless, to candidates and their consultants, the perception that television appeals could turn an election mattered more than the reality. And, as the cost of television spots rose, so did the centrality of fundraising to politicians. TV, in that regard, indirectly contributed to the campaign finance problem besetting both political parties by making their leaders more dependent on the monies of large corporations and their political action committees.


Advertisers of goods and services, and not political candidates, were far and away commercial television’s greatest patrons. (Political campaigns accounted for 7 percent of all advertising spending—print as well as video— in 1996.) During TV’s first decade, sponsors had great power. They likely underwrote entire programs, and often involved themselves in aspects of the production. They sought product placement on the set, and sometimes integrated the middle commercial into the story. They also censored scripts. For example, a cigarette manufacturer sponsoring The Virginian forbade a cast member from smoking a cigar on camera.

In the early 1960s, sponsors lost their leverage. The involvement of some in the rigging of popular quiz shows had embarrassed the industry. Members of Congress and others insisted that the networks, and not sponsors, have the ultimate authority over program production (a power the networks themselves had long sought). Concomitantly, more advertisers wanted to enter television, creating a seller’s market. Then, too, as the costs of prime time entertainment series rose, so did the expense of sole sponsorship. Advertisers began buying individual spots, as opposed to entire programs. The new economics of television, even more than the fallout over the quiz scandals, gave the networks sovereignty over their schedules. Yet the entry of so many more potential sponsors, demanding masses of viewers, placed added pressure on the networks to maximize their ratings whenever possible. Networks turned away companies willing to underwrite less popular cultural programming, such as The Voice of Firestone, because more revenue could be earned by telecasting series with a wider appeal.

The popularity of cable in the 1980s and 1990s marked a new phase in advertiser-network relations. The “niche marketing” of cable channels like MTV and Nickelodeon greatly eased the tasks of advertising agencies’ media buyers seeking those audiences. The networks, on the other hand, confronted a crisis. Although willing to continue to patronize network programs, advertisers made new demands. These did not ordinarily involve specific production decisions, whether, for instance, a character on a sitcom had a child out of wedlock. Instead, media buyers had broader objectives. No longer did they focus exclusively on the size of a program’s audience; they increasingly concerned themselves with its composition. A dramatic series like Matlock had a large audience, but a graying one. Friends and Melrose Place, on the other hand, drew younger viewers. Advertisers assumed that younger consumers were far more likely to try new products and brands. Increasingly in the 1990s, the demographics of a series’ audience determined its fate. This left viewers not in the desired demographic group in the wilderness of cable.

BIBLIOGRAPHY

Balio, Tino, ed. Hollywood in the Age of Television. Boston: Unwin Hyman, 1990.

Baughman, James L. The Republic of Mass Culture: Journalism, Filmmaking, and Broadcasting in America since 1941. 2d ed. Baltimore: Johns Hopkins University Press, 1997.

Bernhard, Nancy E. U.S. Television News and Cold War Propaganda, 1947–1960. Cambridge, U.K.: Cambridge University Press, 1999.

Bogart, Leo. The Age of Television: A Study of Viewing Habits and the Impact of Television on American Life. 3d ed. New York: Frederick Ungar, 1972.

Hallin, Daniel C. We Keep America on Top of the World: Television Journalism and the Public Sphere. London and New York: Routledge, 1994.

———. The “Uncensored War”: The Media and Vietnam. New York: Oxford University Press, 1986.

Mayer, Martin. About Television. New York: Harper and Row, 1972. The best, most thoughtful journalistic account of the television industry before the cable revolution.

O’Connor, John E., ed. American History/American Television: Interpreting the Video Past. New York: Frederick Ungar, 1983.

Stark, Steven D. Glued to the Set: The Sixty Television Shows and Events That Made Us Who We Are Today. New York: Free Press, 1997.

James L. Baughman See also Celebrity Culture; Mass Media; Talk Shows, Radio and Television; Televangelism; Videocassette Recorder.

TECHNOLOGY
Television is the process of capturing photographic images, converting them into electrical impulses, and then transmitting the signal to a decoding receiver. Conventional transmission is by means of electromagnetic radiation, using the methods of radio. Since the early part of the twentieth century, the development of television in the United States has been subject to rules set out by the federal government, specifically the Federal Communications Commission (FCC), and by the marketplace and commercial feasibility.

Early Developments
Image conversion problems were solved in the latter part of the nineteenth century. In 1873 the English engineer Willoughby Smith noted the photoconductivity of the element selenium: its electrical resistance fluctuated when exposed to light. This discovery started the search for a method to change optical images into electric current, and simultaneous developments in Europe eventually led to a variety of mechanical, as opposed to electronic, methods of image transmission. In 1884 the German engineer Paul Nipkow devised a mechanical scanning system using a set of revolving disks in a camera and a receiver. This converted the image by transmitting individual images sequentially as light passed through small holes in the disk. These were then “reassembled” by the receiving disk. The scanner, called a Nipkow disk, was used in experiments in the United States by Charles F. Jenkins and in England by John L. Baird to


create a crude television image in the 1920s. Jenkins began operating in 1928 as the Jenkins Television Corporation near Washington, D.C., and by 1931 nearly two dozen stations were in service, using low-definition scanning based on the Nipkow system. In the 1930s, American Philo T. Farnsworth, an independent inventor, and Vladimir K. Zworykin, an engineer with Westinghouse and, later, the Radio Corporation of America (RCA), were instrumental in devising the first workable electronic scanning system. Funding, interference from competitors, and patent issues slowed advances, but Farnsworth came out with an “image dissector,” a camera that converted individual elements of an image into electrical impulses, and Zworykin developed a similar camera device called the iconoscope. Although Zworykin’s device was more successful, in the end collaboration and cross-licensing were necessary for commercial development of television. By 1938, electronic scanning systems had overtaken or, in some cases, incorporated elements of, mechanical ones. Advancements made since the early 1900s in the United States, Europe, and Russia by Lee De Forest, Karl Ferdinand Braun, J. J. Thomson, A. A. Campbell Swinton, and Boris Rosing contributed to the commercial feasibility of television transmission. Allen B. DuMont’s improvements on the cathode-ray tube in the late 1930s set the standard for picture reproduction, and receivers (television sets) were marketed in New York by DuMont and RCA. The cathode-ray tube receiver, or picture tube, contains electron beams focused on a phosphorescent screen. The material on the screen emits light of varying intensity when struck by the beam, controlled by the signal from the camera, reproducing the image on the tube screen in horizontal and vertical lines—the more lines, the more detail. The “scene” changes at around the rate of 25 to 30 complete images per second, giving the viewer the perception of motion as effectively as in motion pictures. Early Commercial Broadcasting In 1939, the National Broadcasting Company in New York provided programming focused on the New York World’s Fair. During the 1930s, RCA president David Sarnoff, a radio programming pioneer, developed research on programming for television, which was originally centered on public events and major news stories. In late 1939, the FCC adopted rules to permit the collection of fees for television services, in the form of sponsored programs. In the industry, the National Television Systems Committee (NTSC) was formed to adopt uniform technical standards. Full commercial program service was authorized by the FCC on 1 July 1941, with provisions that the technical standard be set at 525 picture lines and 30 frames per second. After more than forty years of experimentation, television was on the brink of full commercial programming by the beginning of World War II (1939–1945). After World War II, a television

broadcasting boom began and the television industry grew rapidly, from programming and transmitting (“airing”) to the manufacturing of standardized television sets. Color Television The development of color television was slower. Color television used the same technology as monochromatic (black and white), but was more complex. In 1940, Peter Goldmark demonstrated a color system in New York that was technically superior to its predecessors, going back to Baird’s 1928 experiments with color and Nipkow disks. But Goldmark’s system was incompatible with monochromatic sets. The delay in widespread use of color television had more to do with its compatibility with monochromatic systems than with theoretical or scientific obstacles. By 1954, those issues had been resolved, and in 1957 the federal government adopted uniform standards. For most Americans, however, color televisions were costprohibitive until the 1970s. The Future of Television The last three decades of the twentieth century were filled with as many exciting advancements in the industry as were the first three: Projection televisions (PTVs) were introduced, both front- and rear-projection and with screens as large as 7 feet; videotape, which had been used by broadcasters since the 1950s, was adapted for home use, either for use with home video cameras or for recording programmed broadcasting (by the 1980s videocassette recorders—VCRs—were nearly as common as TVs); cable television and satellite broadcasting began to make inroads into the consumer market; and in the early 2000s, digital videodiscs (DVDs) began to replace videotape cassettes as a consumer favorite. Also in the 1970s, advancements were made in liquid crystal display (LCD) technology that eventually led to flatter screens and, in the 1990s, plasma display panels (PDPs) that allowed for screens over a yard wide and just a few inches thick. The 1990s brought about a revolution in digital television, which converts analog signals into a digital code (1s and 0s) and provides a clearer image that is less prone to distortion (though errors in transmission or retrieval may result in no image at all, as opposed to a less-thanperfect analog image). First developed for filmmakers in the early 1980s, high-definition television (HDTV) uses around 1,000 picture lines and a wide-screen format, providing a sharper image and a larger viewing area. Also, conventional televisions have an aspect ratio of 4:3 (screen width to screen height), whereas wide-screen HDTVs have an aspect ratio of 16:9, much closer to that of motion pictures. Since the late 1980s, the FCC has been aggressively advocating the transition to digital television, largely because digital systems use less of the available bandwidth, thereby creating more bandwidth for cellular phones. Based on technical standards adopted in 1996, the FCC ruled that all public television stations must be digital by


May 2003, considered by many to be an overly optimistic deadline. As with the development of color television, the progress of HDTV has been hampered by compatibility issues. The FCC ruled in 1987 that HDTV standards must be compatible with existing NTSC standards. By 2000, however, the focus for the future of HDTV had shifted to its compatibility and integration with home computers. As of 2002, HDTV systems were in place across the United States, but home units were costly and programming was limited. BIBLIOGRAPHY

Ciciora, Walter S. Modern Cable Television Technology: Videos, Voice, and Data Communications. San Francisco: Morgan Kaufmann, 1999.

Federal Communications Commission. Home page at http://www.fcc.gov.

Fisher, David E. Tube: The Invention of Television. Washington, D.C.: Counterpoint, 1996.

Gano, Lila. Television: Electronic Pictures. San Diego, Calif.: Lucent Books, 1990.

Trundle, Eugene. Guide to TV and Video Technology. Boston: Newnes, 1996.

Paul Hehn See also Electricity and Electronics; Telecommunications.
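The Technology entry above describes scanning: an image is broken into horizontal lines, transmitted sequentially, and reassembled at the receiver, with the 1941 NTSC standard fixing 525 picture lines and 30 frames per second. As a minimal sketch of that idea, assuming a toy image and helper names invented here purely for illustration, the following Python fragment serializes an image line by line, rebuilds it, and works out the line rate and aspect ratios implied by the figures quoted in the entry.

# Illustrative sketch of raster scanning: serialize an image line by line
# (as a camera scanner does) and rebuild it at the "receiver."
# The tiny image, constants, and function names are illustrative assumptions.

NTSC_LINES = 525          # picture lines per frame, per the 1941 NTSC standard
NTSC_FRAME_RATE = 30      # complete images per second

def scan(image):
    """Flatten a 2-D image (a list of rows) into a serial signal of samples."""
    signal = []
    for row in image:          # top to bottom, like horizontal scan lines
        signal.extend(row)     # left to right within each line
    return signal

def reassemble(signal, width):
    """Rebuild the image from the serial signal, one scan line at a time."""
    return [signal[i:i + width] for i in range(0, len(signal), width)]

if __name__ == "__main__":
    tiny_image = [
        [0, 1, 1, 0],
        [1, 0, 0, 1],
        [0, 1, 1, 0],
    ]
    serial = scan(tiny_image)
    assert reassemble(serial, width=4) == tiny_image

    # Rough arithmetic implied by the NTSC figures cited in the entry:
    print("lines per second:", NTSC_LINES * NTSC_FRAME_RATE)      # 15750
    # Aspect ratios mentioned in the entry: conventional 4:3 vs. HDTV 16:9.
    print("4:3 =", round(4 / 3, 3), " 16:9 =", round(16 / 9, 3))

The 15,750 lines-per-second figure is simply the product of the two NTSC numbers quoted in the entry; actual broadcast signals interleave the lines of each frame and add synchronization intervals, details omitted from this sketch.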

TELLER AMENDMENT, a disclaimer on the part of the United States in 1898 of any intention “to exercise sovereignty, jurisdiction or control” over the island of Cuba when it should have been freed from Spanish rule. It was proposed in the Senate by Henry M. Teller of Colorado and adopted on 19 April as an amendment to the joint resolution declaring Cuba independent and authorizing intervention. Spain declared war on the United States five days later. By August, the United States had expelled Spanish forces from the island. Despite Teller’s amendment, the United States intervened in Cuban internal affairs deep into the twentieth century.

LaFeber, Walter. The New Empire: An Interpretation of American Expansion, 1860–1898. Ithaca, N.Y.: Cornell University Press, 1963.

Perez, Louis A., Jr. Cuba and the United States: Ties of Singular Intimacy. Athens: University of Georgia Press, 1990.

Julius W. Pratt / a. g. See also Cuba, Relations with; Imperialism; Spain, Relations with; Spanish-American War.

TEMPERANCE MOVEMENT. The movement to curb the use of alcohol was one of the central reform efforts of American history. From earliest settlement, consumption of alcohol was a widely accepted practice in America, and while drunkenness was denounced, both


distilled and fermented beverages were considered nourishing stimulants. In 1673 the Puritan divine Increase Mather condemned drunkenness as a sin, yet said “Drink in itself is a good creature of God, and to be received with thankfulness.” Alcohol was not prohibited but rather regulated through licensing. Growth of the Temperance Movement The half century after independence witnessed both a gradual change in attitudes toward alcoholic beverages and an increase in alcohol production and consumption. A pamphlet by the prominent Philadelphia physician Benjamin Rush entitled An Inquiry into the Effects of Spirituous Liquors on the Human Mind and Body, published in 1784, was an early voice denouncing the harmful effects of distilled liquors. The first temperance society of record was formed in Litchfield County, Connecticut, in 1789 by prominent citizens convinced that alcohol hindered the conduct of their businesses. In 1813 the Massachusetts Society for the Suppression of Intemperance was formed by society’s elites—clergymen, town officials, and employers—“to suppress the too free use of ardent spirits, and its kindred vices, profaneness and gambling, and to encourage and promote temperance and general morality,” as its constitution proclaimed. There was good reason for the concern of these early temperance advocates. The newly opened western lands in Pennsylvania, Tennessee, and Kentucky were producing grain more easily transported if converted to whiskey. Cheaper than rum, whiskey soon flooded the market. Estimates are that between 1800 and 1830 the annual per capita consumption of absolute alcohol among the drinking-age population (fifteen and older) ranged from 6.6 to 7.1 gallons. By 1825 the forces of evangelical Protestantism mobilized for the temperance crusade. In that year, the Connecticut clergyman Lyman Beecher preached six sermons warning of the dangers of intemperance to a Christian republic. The next year sixteen clergy and laypersons in Boston signed the constitution of the American Society for the Promotion of Temperance. The reformers sensed divine compulsion to send out missionaries to preach the gospel of abstinence from the use of distilled spirits. Using an effective system of state, county, and local auxiliaries, the American Temperance Society (ATS) soon claimed national scope. Voluntary contributions enabled it to support agents who visited every part of the country, striving to affiliate all temperance groups with the national society. By 1831 the ATS reported over 2,200 known societies in states throughout the country, including 800 in New England, 917 in the Middle Atlantic states, 339 in the South, and 158 in the Northwest. The efforts of the ATS were aimed at the moderate drinker to encourage total abstinence from distilled liquor. By the late 1830s, the national organization, now called the American Temperance Union, was attempting to distance itself from antislavery reformers to placate southern temperance societies, sponsor legislation against


Temperance Cartoon. This woodcut, c. 1820, blames rum for such evils as murder, fever, and cholera. © corbis

the liquor traffic, and adopt a pledge of total abstinence from all intoxicants, the teetotal pledge. However, each of these efforts sparked internal division and external opposition, which, along with the 1837 panic and ensuing depression, weakened the reform movement. Interest in temperance revived with the appearance of the Washingtonian movement in 1840. Six tipplers in Baltimore took the abstinence pledge, formed a temperance organization named after the first president, and began to spread the temperance gospel. Aimed at inebriates rather than moderate drinkers, Washingtonian meetings featured dramatic personal testimonies of deliverance from demon rum akin to the revival meetings of the Second Great Awakening, as well as other social activities to replace the conviviality of the tavern. Orators such as John B. Gough and John H. W. Hawkins toured the country, including the South, lecturing on temperance. The Washingtonian impulse was strong but short-lived, owing to lack of organization and leadership. The enthusiasm generated by the Washingtonians was captured and institutionalized by the Sons of Temperance, a fraternal organization formed in 1842 by some Washingtonians concerned about the frequency of backsliding. They proposed an organization “to shield us from the evils of Intemperance; afford mutual assistance in case of sickness; and elevate our character as men.” A highly structured society requiring dues and a total abstinence pledge, the Sons introduced a new phase of the temper-

ance movement, the fraternal organization with secret handshakes, rituals, ceremonies, and regalia. The organization spread rapidly and all but a few states had Grand Divisions of the Sons by 1847. The peak year of membership was 1850, when the rolls listed over 238,000 members. At the same time the Sons of Temperance was flourishing, Father Theobald Mathew, the well-known Irish Apostle of Temperance, undertook a speaking tour through the United States. Between July 1849 and November 1851, he traveled the country administering the temperance pledge to several hundred thousand people, many of them Irish Americans. His tour illustrated some of the dynamics affecting the temperance movement. Upon his arrival in America, Mathew was greeted by William Lloyd Garrison, who pressured him to reaffirm an abolition petition Mathew had signed some years earlier. Seeking to avoid controversy, and aware that he planned to tour the South, Mathew declined despite Garrison’s public insistence. Word of the affair reached Joseph Henry Lumpkin, chairman of the Georgia State Temperance Society, who had invited Mathew to address the state temperance convention. Despite his insistence that temperance was his mission, Mathew’s acknowledgement of his abolition sentiments led Lumpkin to withdraw his invitation to address the convention. Nonetheless, Mathew did successfully tour the South.


Carry Nation. The hatchet-wielding nemesis of saloons all over Kansas, and then across the country, in the first decade of the twentieth century. Library of Congress

During the antebellum era the temperance message was spread widely through the printed word. Weekly and monthly journals appeared devoted solely to temperance, while many religious periodicals carried news of the reform movement. Songs, poems, tracts, addresses, essays, sermons, and stories found their way into print, and temperance literature became a common part of the cultural landscape. Fiction like Timothy Shay Arthur’s Ten Nights in a Bar-Room, and What I saw There (1854), portrayed the pain and shame experienced by drunkards and their families, as well as the joy of a life redeemed from demon rum. Temperance was trumpeted as the means to both social and domestic tranquility and individual economic advancement. The ATS was among the first of voluntary benevolent reform organizations of the antebellum era to admit women, who participated in significant numbers. Women both joined men’s societies and formed their own auxiliaries. According to the ideology of the day, woman’s presumed superior moral influence, exercised mainly in the domestic sphere, added moral weight to the temperance cause. Also, women along with children were the main


victims of alcoholic excess in the form of domestic violence and economic deprivation. From Moral to Legal Reform By the late 1830s some temperance reformers were ready to abandon moral suasion (urging individuals to abstinence by personal choice) in favor of legal suasion (employing the coercion of law). In 1838 and 1839 temperance workers circulated petitions asking state legislatures to change license laws regulating liquor traffic. Some petitions sought to prohibit liquor sales in less-thanspecified quantities ranging from one to twenty gallons. Others sought local option laws allowing communities to regulate liquor sales. While petition campaigns occurred in most states, they were usually unsuccessful. After the revival of temperance interest in the 1840s, a second prohibition effort occurred in the next decade. The state of Maine, under the efforts of the merchant Neal Dow, passed a prohibitory statute in 1851 outlawing the manufacture and sale of intoxicants. The Maine Law became a model for state campaigns throughout the country. During the early years of the 1850s temperance was


one of the issues along with nativism, slavery, and the demise of the Whig Party that colored state political campaigns. A number of states passed prohibitory laws, though most were declared unconstitutional or repealed by 1857. Despite the failure of these efforts, temperance had proven the most widespread reform of the antebellum era. Following the Civil War, the Prohibition Party was formed in Chicago in 1869, and began nominating presidential candidates in 1872, though it languished in the shadow of the major parties. Perhaps more important was the emergence of greater involvement of women in the temperance cause with the appearance of the Woman’s Christian Temperance Union in 1874. Annual per capita consumption of absolute alcohol had dropped sharply during the 1830s and 1840s and remained relatively stable at one to two gallons through most of the second half of the century. As America shifted from a rural to urban culture, drinking patterns shifted as well, away from whiskey to beer, a more urban beverage, now readily available owing to technological developments like pasteurization and refrigeration. Saloons became familiar fixtures of the urban landscape, and for temperance workers, the symbol of alcohol’s evil. The WCTU, largely a collection of Protestant women, adopted a confrontational strategy, marching in groups to the saloon and demanding that it close. Under the leadership of Frances Willard, who led the organization for two decades, the WCTU embraced a wide variety of reforms, including woman’s suffrage, believing that only by empowering women in the public sphere could alcohol be eliminated and the home protected. The WCTU became the largest temperance and largest women’s organization prior to 1900. Building on the women’s efforts to keep the alcohol issue before the public, the Anti-Saloon League was formed by evangelical Protestant men in 1895. Attacking the saloon was its method; its aim was a dry society. The Anti-Saloon League worked through evangelical denominations, winning statewide victories over the next two decades. Its crowning success was the passage of the Eighteenth Amendment in 1919, ushering in the Prohibition era that ran from 1920 to 1933. BIBLIOGRAPHY

Blocker, Jack S., Jr. American Temperance Movements: Cycles of Reform. Boston: Twayne, 1989.
Bordin, Ruth. Women and Temperance: The Quest for Power and Liberty, 1873–1900. Philadelphia: Temple University Press, 1981.
Dannenbaum, Jed. Drink and Disorder: Temperance Reform in Cincinnati from the Washingtonian Revival to the WCTU. Urbana: University of Illinois Press, 1984.
Hampel, Robert L. Temperance and Prohibition in Massachusetts, 1813–1852. Ann Arbor, Mich.: UMI Research Press, 1982.
Krout, John Allen. The Origins of Prohibition. New York: Knopf, 1925.
Rorabaugh, W. J. The Alcoholic Republic: An American Tradition. New York: Oxford University Press, 1979.

Tyrrell, Ian R. Sobering Up: From Temperance to Prohibition in Antebellum America, 1800–1860. Westport, Conn.: Greenwood Press, 1979.

Douglas W. Carlson
See also Prohibition.

TEN-FORTIES, gold bonds issued during the Civil War that were redeemable after ten years and payable after forty years. Authorized by Congress on 3 March 1864 to allow greater freedom in financing the Civil War, their low 5 percent interest made them unpopular; less popular, at any rate, than the earlier “five-twenties,” bonds with more flexible terms issued by the U.S. Treasury under the direction of financier Jay Cooke. Bond sales declined rapidly and forced the Treasury to rely more heavily on short-term loans, bank notes, greenback issues, and taxes. BIBLIOGRAPHY

Rein, Bert W. An Analysis and Critique of the Union Financing of the Civil War. Amherst, Mass.: Amherst College Press, 1962.
Reinfeld, Fred. The Story of Civil War Money. New York: Sterling Publishing, 1959.

Chester M. Destler / a. r.
See also Cooke, Jay, and Company; Greenbacks; National Bank Notes; War Costs.

TENEMENTS. The New York City Tenement House Act of 1867 defined a tenement as any rented or leased dwelling that housed more than three independent families. Tenements were first built to house the waves of immigrants that arrived in the United States during the 1840s and 1850s, and they represented the primary form of urban working-class housing until the New Deal. A typical tenement building was from five to six stories high, with four apartments on each floor. To maximize the number of renters, builders wasted little space. Early tenements might occupy as much as 90 percent of their lots, leaving little room behind the building for privies and water pumps and little ventilation, light, or privacy inside the tenement. With a large extended family and regular boarders to help pay the rent, which could otherwise eat up over half of a family’s income, a tenement apartment might house as many as ten to twelve people at a time. These tenement residents often also worked in the building in such occupations as cigar rolling and garment making. From the beginning, reformers attacked tenement conditions. In New York City, early attempts at reform included fire-prevention measures, the creation of a Department of Survey and Inspection of Buildings in 1862, and the founding of the Metropolitan Board of Health in 1866. Meanwhile, city tenements were getting increasingly crowded: by 1864, approximately 480,400 of New York City’s more than 700,000 residents lived in some 15,300 tenement buildings.

New York State passed a Tenement House Law on 14 May 1867, the nation’s first comprehensive housing reform law. It established the first standards for minimum room size, ventilation, and sanitation. It required fire escapes and at least one toilet or privy (usually outside) for every twenty inhabitants. However, enforcement was lax. An 1879 amendment to the 1867 legislation required more open space on a building lot and stipulated that all tenement rooms open onto a street, rear yard, or air shaft. The measure was designed to increase ventilation and fight diseases, such as tuberculosis, that ravaged tenement neighborhoods. To meet the standards of the 1879 law, builders designed the “dumbbell tenement” with narrow airshafts on each side to create a dumbbell-like shape from above. Despite slightly better fireproofing and ventilation, reformers attacked these buildings as only a limited improvement on existing conditions. In 1890, Jacob Riis’s How the Other Half Lives rallied middle-class reformers to the cause of improving tenement life. His photos and essays drew attention to the health and housing problems of tenement neighborhoods.

Tenement. In this photograph by Jacob Riis, c. 1890, a woman does handiwork while smoking a pipe in her cramped tenement room in New York. © Bettmann/Corbis

The most significant New York State law to improve deteriorating tenement conditions was the Tenement Act of 1901, promoted by a design competition and exhibition held by the Charity Organization Society in 1900. By that time, the city’s Lower East Side was home to the most densely populated buildings on earth. The neighborhood’s Tenth Ward had a population of 69,944, approximately 665 people per acre. The 1901 legislation, opposed by the real estate industry on the grounds that it would discourage new construction, improved tenement buildings. The law mandated better lighting and fireproofing. Most important, it required that privies be replaced with indoor toilet facilities connected to the city sewers, with one toilet for every two apartments.

Beginning in the New Deal era, reformers’ strategies changed. Drawing on a tradition of “model tenements” and new government interest in housing construction, reformers designed public housing projects. Their plans emphasized open space, much as an earlier generation had passed laws to provide more light and fresh air for urban working-class families. The imposed standards, however, often created new problems. Building closures and slum clearance displaced many working-class families, while new high-rise public housing often fell victim to segregation and neglect. Although reformers continued to attack working-class living conditions, social pressures sustained many of the problems of poverty and overcrowding.


Slums Breed Crime. A U.S. Housing Authority poster showing police making arrests outside a tenement building. National Archives and Records Administration


BIBLIOGRAPHY

Bauman, John F., Roger Biles, and Kristin M. Szylvian. From Tenements to the Taylor Homes: In Search of an Urban Housing Policy in Twentieth-Century America. University Park: Pennsylvania State University Press, 2000.
Day, Jared N. Urban Castles: Tenement Housing and Landlord Activism in New York City, 1890–1943. New York: Columbia University Press, 1999.
Ford, James. Slums and Housing, with Special Reference to New York City: History, Conditions, Policy. Cambridge, Mass.: Harvard University Press, 1936.
Hall, Peter. Cities of Tomorrow: An Intellectual History of Urban Planning and Design in the Twentieth Century. New York: Blackwell, 1988.
Lower East Side Tenement Museum. Home page at http://www.tenement.org.
Plunz, Richard. A History of Housing in New York City. New York: Columbia University Press, 1990.
Riis, Jacob. How the Other Half Lives: Studies among the Tenements of New York. New York: Scribners, 1890.

Mark Ladov
See also Poverty; Public Health; Urbanization; and vol. 9: In the Slums.

TENNESSEE. Since its founding, Tennessee has traditionally been divided into three sections: East Tennessee, Middle Tennessee, and West Tennessee. East Tennessee includes part of the Appalachian Mountains, which stretch from Alabama and Georgia northward through East Tennessee to New England; the Great Valley, which is to the west of the Appalachians, slanting northeastward from Georgia through Tennessee into Virginia; and the Cumberland Plateau, which is to the west of the Great Valley, slanting from northeastern Alabama through Tennessee into southeastern Kentucky. The people of East Tennessee are often called “Overhills,” because Tennessee was once part of North Carolina and was west over the mountains from the rest of North Carolina. Both the Cumberland Plateau and the Great Valley are fertile and ideal for growing many different crops; the Great Valley is well watered. The Tennessee Appalachian Mountains are rugged, with numerous small valleys occupied by small farms. The people of East Tennessee were from their first settlement an independent-minded group who valued hard work and self-reliance. Middle Tennessee extends from the Cumberland Plateau westward to the Highland Rim. The people who live on the Highland Rim are often called “Highlanders.” The lowlands include the Nashville Basin, are well watered, and are noted for their agriculture, especially for cotton and tobacco. The Highland Rim features many natural wonders, including many caves and underground streams. Situated between East Tennessee and West Tennessee, Middle Tennessee has sometimes seemed to be a divided culture. Before the Civil War, it had more slaves

than East Tennessee, but fewer than West Tennessee, and it tended to favor the small farm tradition of the east rather than the plantation system of the west. It was divided on its support for outlawing slavery, but after Reconstruction its politics were controlled by a political spoils system run by Democrats who controlled Tennessee until the 1970s. West Tennessee lies in the Gulf Coastal Plain, a region that stretches northward from the Gulf of Mexico to Illinois along the Mississippi River. It was in this region that many local Native Americans made their last efforts to retain their remaining lands by petitioning the federal government for help. Land speculators of the early 1800s created towns and plantations throughout the area, and they brought with them the slave culture of North Carolina. Historians differ on the exact numbers, but between 40 percent and 60 percent of the people who lived in West Tennessee were slaves during the antebellum period. The plantations were notoriously cruel. Tennessee is nicknamed the “Big Bend State” because of the unusual course of the Tennessee River. It flows southwest from the Appalachian Mountains through the Great Valley into Alabama. There, it bends northwestward, reenters Tennessee at Pickwick Lake, and flows north along the western edge of the Highland Rim into Kentucky, eventually joining the Ohio River. During the 1930s, the United States government established the Tennessee Valley Authority (TVA), a project to provide jobs for people who had lost their jobs during the Great Depression and intended to control flooding and to provide hydroelectricity to Tennessee and its neighbors. It was controversial, with many criticizing it as a waste of money, and others insisting that it was destroying Tennessee’s environment. The TVA built dams on the Tennessee River and the Cumberland River, creating new lakes and reservoirs, as well as a system of over 650 miles of waterways that boats used to ship products around the state. Tennessee is bordered on the north by Kentucky; along its northeastern border is Virginia. Its eastern boundary is along the western border of North Carolina. Its southern border extends along the northern borders of Georgia, Alabama, and Mississippi. Its western border is met by Arkansas in the south and Missouri in the north. Prehistory Tennessee has a complex ancient past; there is evidence throughout the state of numerous cultures that have come and passed in the regions now within its borders. Over 100,000 years ago, people crossed into North America from northeastern Asia. Traces of these earliest peoples are hard to find, partly because the glaciers of an ice age about 11,000 years ago would have destroyed their remains. Tennessee offers tantalizing hints as to what some of these migrants were like, because in some of Tennessee’s caves are the remains of ancient cave dwellers. In West Tennessee there are caves that hold evidence of ancient fishermen. This evidence may represent several dif-


ferent cultures, but each seems to have practiced a religion. Their cave dwellings contain spearheads as well as fishhooks, and they may have hunted the big game of the Great Plains such as mammoths, camels, and giant bison. About 9000 b.c., nomadic peoples known as Paleo-Indians began crossing North America. They were primarily hunters; in the Great Plains they hunted the large land mammals that roved in herds across grasslands. In Tennessee they would have hunted the same animals until the great forests covered much of Tennessee around 7000 b.c. They likely hunted bison and deer in these forests. Their spear points suggest that several different cultural groups of Paleo-Indians crossed the Mississippi into and through Tennessee. Around 5000 b.c., another group of people, whom archaeologists call Archaic Indians, may have begun migrating into Tennessee. The first Archaic Indians of the Midwest made a significant technological advance over the Paleo-Indians by developing the atlatl, a handheld device with a groove in which to hold a spear. It enabled a person to throw a spear with far greater force and accuracy than by throwing a spear with a bare hand. Archaeological remains from about 2000 b.c. show signs of people settling throughout Tennessee. They began making pottery that increased in sophistication over the next few thousand years. Homes were made of log posts with walls of clay. Communities enlarged and engaged in public works projects to clear land, plant crops, and build places of worship. Pottery was commonplace and was used for cooking, carrying, and storage. These ancient peoples began a practice that has puzzled and fascinated archaeologists: they built mounds, sometimes seven stories high. Few of the mounds that survive have been explored by scientists, but those that have reveal people to have been buried in them, sometimes just one, sometimes many. They have been mummified and have carved animals and people, as well as food, placed around them, indicating a belief in an afterlife in which game and food would be wanted, at least symbolically. That the different cultures who built these mounds had priests is clear, and clear also is that they had


a highly developed government that would have included many villages and towns. By about a.d. 800, maize had become a crop, probably brought to Tennessee from central Mexico. By this time, the people in Tennessee were ancestors of modern Native Americans. They continued the mound-building tradition and were farmers. They lived in villages consisting of people related by blood, but they may have insisted that people marry outside their villages much as the Native Americans did when the first Europeans explored Tennessee. Their governments probably consisted of federations of villages, governed by a high chief. When Hernando de Soto explored southern and western Tennessee in 1540, the peoples had undergone much turmoil for more than a century. The Mound Builders had been driven away, exterminated, or absorbed into invading tribes. Three language groups were represented in the area: Algonquian, Iroquoian, and Muskogean. Among the Iroquoian group were the Cherokees, who had probably migrated from the north into Tennessee. They had a settled society that claimed East and Middle Tennessee as their territory. The Iroquois Confederacy to the north claimed the Cherokees’ territory, but the Cherokees resisted them. The Muskogean cultural tribes, Creeks and Chickasaws, claimed the rest of Tennessee, with the Creeks contesting the Cherokees for some of Middle Tennessee. The Chickasaws of western Tennessee were very well organized, with strong leadership and excellent military skills. The capital of the Cherokees was Echota (aka Chota), a city that was declared “bloodless,” meaning no fighting was allowed. Weapons were not allowed either. It was a place created by the Native Americans to settle their disputes through diplomacy, and in the Cherokee and Creek tribes, in particular, skilled diplomats were awarded honors equal to those of skilled warriors. Each village had a main house (“town house”) where religious ceremonies took place. Villages consisted of clay houses, usually gathered around the main house. By the late 1600s, the Native Americans had horses, cattle, pigs, and chickens, imports from Europe. They farmed their lands and hunted wild game in their forests, but the Cher-


okees were fast developing livestock farming. Some of the Shawnees, of the Algonquian language group, had moved into the Cumberland Valley to escape the Iroquois Confederacy. The Cherokees and Creeks viewed them as interlopers, but the Shawnees had little choice; the Iroquois Confederacy sometimes settled its differences with its neighbors with genocide. In 1714, the Cherokee, Creek, and Iroquois Confederates drove the Shawnees out; the Shawnees sought sanctuary in the Ohio Valley. War with the Iroquois Confederacy often seemed imminent for the Cherokees, Creeks, and Chickasaws, but during the 1700s new threats came to preoccupy those Native Americans. Land By 1673, the French were trading to the north of Tennessee and had antagonized the Chickasaws to the point that the Chickasaws killed Frenchmen on sight. That year, a Virginian, Abraham Wood, commissioned explorer John Needham to visit the Cherokees west of the Appalachian Mountains in what is now Tennessee. John Needham visited the Cherokees twice, and he was murdered by them. His servant Gabriel Arthur faced being burned alive so bravely that his captors let him live. In 1730, Alexander Cuming of North Carolina led an expedition across the Appalachians to make the acquaintance with the Cherokees of the Great Valley. He impressed the Native Americans with his boldness and eloquence, as well as his numerous weapons, and the chiefs agreed to affiliate themselves with England. Cuming took Cherokee representatives to England, where they were well treated. Among them was Attakullakulla (meaning “Little Carpenter”), who, upon returning home, became a great diplomat who several times prevented bloodshed. In 1736, the French, having nearly wiped out the Natchez tribe, invaded West Tennessee with intention of eradicating the Chickasaws. The Chickasaws were forewarned by English traders and decisively defeated the French invaders. The French built Fort Assumption where Memphis now stands, as part of their effort to control the Chickasaws. They failed. In another war in 1752, the Chickasaws again beat the French. These victories of the Chickasaws were important for all the Native Americans in Tennessee, because a 1738 epidemic had killed about 50 percent of the Cherokees, leaving them too weak to guarantee the safety of their neighbors. From that time on, Cherokee politics were chaotic, with different chiefs gaining ascendancy with very different views at various times, making Cherokee policies wildly swing from one view to another. During the Revolutionary War (1775–1783), some Cherokees allied themselves with Shawnees, Creeks, and white outlaws, and tried to retake East Tennessee. They were defeated by American forces under the command of Colonel Evan Shelby, who drove them into West Tennessee. In 1794, the Native Americans of Tennessee united in a war against the United States and were utterly de-

feated; they became subject to American rule. These battles had been over possession of land, and in 1794, the land belonged to the United States. On 1 July 1796, Tennessee became the sixteenth state of the United States, taking its name from Tenasie, the name of a Cherokee village. Civil War In the 1850s, the matter of slavery was a source of much conflict in Tennessee. The settlers in the east wanted it outlawed. Slave owners ignored laws regulating slavery and turned West Tennessee into a vast land of plantations worked by African American slaves. Literacy was forbidden to the slaves, and they were not even allowed to worship God, although they often did in secret. Newspapers and politicians campaigned against slavery in Tennessee, but others defended slavery with passion. On 9 February 1861, Tennessee held a plebiscite on the matter of secession, with the results favoring remaining in the Union 69,387 to 57,798. The governor of Tennessee, Isham G. Harris, refused to accept the results and committed Tennessee to the Confederacy. He organized another plebiscite on whether Tennessee should become an independent state, with independence winning 104,913 to 47,238. He declared that the result meant Tennessee should join the Confederacy. The Confederacy made its intentions clear by executing people accused of sympathizing with the Union. Over 135,000 Tennesseans joined the Confederate army; over 70,000 joined the Union army, with 20,000 free blacks and escaped slaves. The surprise attack at Shiloh on 6–7 April 1862 seemed to give the Confederacy the upper hand in Tennessee, but the Union troops outfought their attackers. After the Stone’s River Battle (near Murfreesboro) from 31 December 1862–2 January 1863, the Union dominated Tennessee. Over 400 battles were fought in Tennessee during the Civil War. The lands of Middle and West Tennessee were scourged. Fields became massive grounds of corpses, farms were destroyed, trees were denuded, and Tennessean refugees clogged roads by the thousands. On 24 July 1866, Tennessee, which had been the last state to join the Confederacy, became the first former Confederate state to rejoin the United States. Prior to readmission, on 25 February 1865, Tennessee passed an amendment to its constitution outlawing slavery. Schools were soon accepting African Americans as well as whites and Native Americans, but in December 1866, the Ku Klux Klan was founded at Pulaski, Tennessee. Directed by “Grand Cyclops” Nathan Bedford Forrest, it murdered African Americans and white people who were sympathetic to them, raped white female schoolteachers for teaching African Americans, burned schools, and terrified voters, in resistance to Reconstruction. Segregated Society By 1900, Tennessee had a population of 2,020,616. It was racially segregated. In a decades-long effort to deny edu-


cation to African Americans, the state managed to create an illiteracy rate among whites and blacks that was the third worst in the nation. Under the direction of Governor Malcolm R. Patterson (1907–1911), in 1909, the state enacted a general education bill. When the United States entered World War I (1914–1918), thousands of Tennesseans volunteered, and Tennessee contributed the greatest American hero of the war, Sergeant Alvin York from Fentress County in northern Middle Tennessee. In 1918, the soft-spoken farmer and his small squad captured 223 Germans in the Argonne Forest; the sight of a few Americans leading hundreds of captured Germans to American lines was said to have been astonishing. By 1920, the state population was 2,337,885. On 18 August of that year Tennessee ratified the Nineteenth Amendment to the Constitution of the United States, which gave women the vote. In 1923, Governor Austin Peay reorganized state government, eliminating hundreds of patronage positions, while consolidating government enterprises into eight departments. In 1925, the infamous Scopes “Monkey Trial” was held in Dayton, Tennessee. A new state law said that evolution could not be taught in Tennessee schools, but John Scopes taught it anyway and was charged with violating the law. Two outsiders came to try the case, atheist Clarence Darrow for the defense and three-time presidential candidate William Jennings Bryan for the prosecution. The trial was broadcast to the rest of the country by radio, and when Scopes was convicted and fined, the impression left was that Tennessee was home to ignorance and bigotry enforced by law, an image it had not completely escaped even at the turn of the twenty-first century. A man who did much to counter the image was statesman Cordell Hull, from Overton County, west of Fentress, in northern Middle Tennessee. He served in the U.S. House of Representatives and Senate and was Franklin Roosevelt’s secretary of state. He helped create the “Good Neighbor Policy” that helped unify the nations of the New World, and he was important to the development of the United Nations. He received the 1945 Nobel Peace Prize for his work. Civil Rights By 1950, Tennessee’s population was 55 percent urban. The cities controlled most of the state’s politics, and they were becoming more cosmopolitan. Getting a bit of a head start in desegregating schools, the University of Tennessee admitted four African Americans to its graduate school in 1952. On the other hand, Frank Clement was elected governor on the “race platform,” insisting that there would be no racial integration in Tennessee. Many other politicians would “play the race card” during the 1950s and 1960s, and many of these politicians, including Clement, would change their minds as the civil rights movement changed the way politics were conducted. Memphis State University began desegregating in 1955, after the United States Supreme Court ruling in 1954 that


segregating the races was unconstitutional. In 1956, Clement called out the National Guard to enforce desegregation of schools in Clinton. Even so, schools elsewhere were bombed or forced to close by white supremacists. By 1959, African Americans were staging well-organized nonviolent protests in Nashville in an effort to have stores and restaurants desegregate. Meanwhile, the U.S. government, under a 1957 civil rights law, sued Democratic Party local organizations for their exclusion of African Americans from voting and holding office. Slowly, desegregation took hold in Tennessee; it took until 1965 for Jackson to begin desegregating its restaurants. In 1968, in Memphis, sanitation workers went on strike and Martin Luther King Jr., the preeminent figure in the civil rights movement, came to the city to help with negotiations. On 4 April 1968, King was shot to death by James Earl Ray. Modern Era In the 1970s, Tennessee made a remarkable turnaround in its image. With the election of Winfield Dunn as governor in 1971, the state for the first time since Reconstruction had a Republican governor and two Republican senators. This notable shift in political fortunes marked the coming of the two-party system to Tennessee, which had a positive effect on the politics and society of the state. If Democrats were to hold on to power, they needed African Americans as new allies. In 1974, the state’s first African American congressman, Harold Ford of Memphis, was elected. The Democrats remained dominant in the state, but the competition with Republicans was lively and encouraged the participation of even those who had been disenfranchised, poor whites as well as African Americans, as recently as 1965. Among the most notable politicians of the 1980s and 1990s was Albert Gore Jr., son of a powerful United States senator and widely expected to be a powerful politician himself. In 1988 he ran for the presidential nomination of the Democratic Party, and he served from 1993 to 2001 as vice president of the United States under his longtime friend President Bill Clinton. His cosmopolitan views and his work for environmental causes helped to change how outsiders viewed Tennesseans. By 2000, Tennessee’s population was just under 5,500,000, an increase from 1990’s 4,896,641. Although the urban population was larger than the rural one, there were 89,000 farms in Tennessee. The TVA had doubled the amount of open water in Tennessee from 1930 to 1960, and the several artificial lakes and streams became prime attractions for recreation in the 1990s; the state also had some of the most beautiful woodlands in the world. Memphis became a regional center for the arts, as well as a prime retail center for northern Mississippi, in addition to Tennessee; Nashville developed the potential for its music industry to be a magnet for tourists, and by the 1990s many a young musician or composer yearned to live there.

BIBLIOGRAPHY

Alderson, William T., and Robert H. White. A Guide to the Study and Reading of Tennessee History. Nashville: Tennessee Historical Commission, 1959.
Corlew, Robert E. Tennessee: A Short History. 2d ed. Revised by Stanley J. Folmsbee and Enoch Mitchell. Knoxville: University of Tennessee Press, 1981.
Dykeman, Wilma. Tennessee: A Bicentennial History. New York: Norton, 1975.
Hull, Cordell, and Andrew H. T. Berding. The Memoirs of Cordell Hull. New York: Macmillan, 1948.
Kent, Deborah. Tennessee. New York: Grolier, 2001.
State of Tennessee home page. Available at http://www.state.tn.us.
Van West, Carroll. Tennessee History: The Land, the People, and the Culture. Knoxville: University of Tennessee Press, 1998. A wealth of information and opinion.

Kirk H. Beetz
See also Appalachia; Cherokee; Creek; Cumberland Gap; Iroquois; Shawnee.

TENNESSEE, ARMY OF. When General Braxton Bragg reorganized the Army of Mississippi on 20 November 1862, he renamed it the Army of Tennessee. After fighting at Stone’s River, the army spent the summer campaigning in Middle Tennessee. Aided by Virginia troops, the army won an outstanding victory at Chickamauga. After mounting an inconclusive siege at Chattanooga that ended in defeat, the army retreated into northern Georgia. Leadership was in flux: William J. Hardee replaced Bragg; Joseph E. Johnston replaced Hardee. Despite Johnston’s rather successful efforts to slow Sherman’s march toward Atlanta, Jefferson Davis replaced Johnston with John B. Hood. After several tough battles, the army left Atlanta and moved into Tennessee, where it experienced defeats at Franklin and Nashville. Richard Taylor replaced Hood and retreated into Mississippi. After moving east to challenge Sherman, the army fought its last battle at Bentonville and surrendered soon afterward.

BIBLIOGRAPHY

Daniel, Larry J. Soldiering in the Army of Tennessee: A Portrait of Life in a Confederate Army. Chapel Hill: University of North Carolina Press, 1991.
McPherson, James M. What They Fought For, 1861–1865. New York: Doubleday Anchor, 1994. A brilliant explanation of motivation, human nature, and military necessity.
———. Ordeal by Fire: The Civil War and Reconstruction. 3d ed. Boston: McGraw-Hill, 2001.

Donald K. Pickens
See also Chickamauga, Battle of.

TENNESSEE RIVER, formed by the confluence of the Holston River and the French Broad River, near Knoxville, Tennessee, follows a serpentine course into northern Alabama and from there northward to the Ohio River at Paducah, Kentucky. The length of the main stream is 652 miles, and the total drainage area is 40,569 square miles. Called for a time the Cherokee River, it was used extensively by Indians on war and hunting expeditions, especially by the Cherokees, some of whose towns were located along the branches of the river in southeast Tennessee. In the mid-eighteenth century, the Tennessee Valley played an important part in the Anglo-French rivalry for the control of the Old Southwest that culminated in the French and Indian War. The river was also an important route for migration of settlers into the Southwest after that war.

Use of the river for navigation was handicapped by the presence of serious obstructions, especially the Muscle and Colbert shoals at the “Great Bend” in northern Alabama. The problem of removing or obviating the obstructions to navigation has been a perennial one that has received spasmodic attention from the federal government as well as from the states of Tennessee and Alabama, including a grant of public lands to Alabama in 1828 for the construction of a canal, and several subsequent surveys and appropriations. In the twentieth century, discussion of the river shifted from navigation to power production and flood control. During World War I, construction of the Wilson Dam and nitrate plants at the Muscle Shoals initiated a nationwide controversy over the question of public or private ownership and operation of power facilities. Since the New Deal created the Tennessee Valley Authority (TVA) in 1933, the river has been the subject of an extensive program involving navigation and flood control, fertilizer experimentation, and the production and sale of electric power, all of which fueled the social and economic transformation of the Tennessee Valley. The river has been made into a chain of reservoirs, or lakes, held back by nine major dams. As a result of TVA improvements, freight traffic on the Tennessee, which had been one million tons in 1933, had reached twenty-seven million tons per year by the early 1970s. In 1985, the 234-mile Tenn-Tom waterway opened, connecting the river’s Pickwick Lake to the Tombigbee River at Demopolis, Alabama.

BIBLIOGRAPHY

Colignon, Richard A. Power Plays: Critical Events in the Institutionalization of the Tennessee Valley Authority. Albany: State University of New York Press, 1997.
Davidson, Donald. Tennessee: The Old River, Frontier to Secession. Knoxville: University of Tennessee Press, 1978.
Droze, Wilmon Henry. High Dams and Slack Waters: TVA Rebuilds a River. Baton Rouge: Louisiana State University Press, 1965.
Vanderwood, Paul J. Night Riders of Reelfoot Lake. Memphis, Tenn.: Memphis State University Press, 1969.

S. J. Folmsbee / h. s.
See also River Navigation; Rivers; and vol. 9: Power.


TENNESSEE VALLEY AUTHORITY. The Tennessee Valley Authority (TVA), a federal corporation responsible for power generation in the Tennessee Valley, serves roughly 8.3 million people through 158 municipal and cooperative power distributors. TVA furnishes power to an 80,000-square-mile area, including the state of Tennessee and parts of Kentucky, Virginia, North Carolina, Georgia, Alabama, and Mississippi, thus making the corporation one of America’s largest electrical power producers. Born of President Franklin D. Roosevelt’s innovative solution to help stimulate the area’s economy during the Great Depression, the TVA development began after World War I (1914–1918). A government-owned dam and nitrate-producing facility at Muscle Shoals, on the Tennessee River in northwestern Alabama, became the seedling of the audacious experiment. Nebraska Senator George W. Norris hoped at the time to build more dams similar to the Wilson Dam at Muscle Shoals, bringing public control to the Tennessee River. Almost singlehandedly, Norris held the dam in government ownership until President Roosevelt’s vision expanded it into a broader concept of multipurpose development and regional planning. On 18 May 1933, Congress responded to Roosevelt’s prodding and enacted the Tennessee Valley Act. TVA was to be more than a flood control and power agency. It was seen as having a wide mandate for economic development, recreation, reforestation, and the production of fertilizer. But the agency was in sad shape at its start. The best timber had already been cut, the land had been farmed too long, and crop yields were declining.

Controversy also surrounded TVA. Private utilities fought the agency’s power policies, and an internal feud between Chairman Arthur Morgan and directors David Lilienthal and Harcourt Morgan unsettled TVA’s direction until 1938. Nevertheless, the agency pushed forward. By 1941, it operated eleven dams with six more under construction, and it was selling low-cost electric power to 500,000 consumers throughout six states. TVA technicians developed fertilizers, and 25,000 demonstration farms taught local citizens the benefits of more scientific farming. Additionally, the agency helped replant forests, controlled forest fires, and improved habitat for wildlife. During World War II (1939–1945), 70 percent of TVA power went to defense industries, among them the Oak Ridge atomic project. At the war’s end, TVA had completed a 652-mile navigation channel, becoming the largest electricity supplier in the United States. Attacked for being too radical, TVA also found itself criticized for being too conciliatory to established interests and ideas. Director Lilienthal claimed that TVA practiced “grassroots democracy” by reaching out in a massive educational effort to involve the rural population of the valley. However, critics saw mostly manipulation in this approach. The 1960s saw unprecedented growth in the Tennessee Valley. At the same time, TVA began building nuclear plants as a new source of power. The agency survived reproach from both conservatives and environmentalists and, by the early 1970s, claimed an impressive record. In 1972, an estimated $395 million in flood damages had been averted by TVA dams. Power revenues came to $642 million, of which TVA returned $75 million to the U.S. Treasury. Along with industrial customers, 2 million residential consumers used TVA power. The TVA manages an integrated, technically advanced system of dams, locks, and reservoirs in the Tennessee River watershed. The balanced system facilitates navigation, controls flooding, and provides hydropower to benefit users. As of 2002, it included three nuclear generating plants, eleven coal-fired plants, twenty-nine hydraulic dams, five combustion turbine plants, a pumpedstorage plant, and roughly 17,000 miles of transmission lines, making the TVA the largest public power system in the nation. TVA’s generation mix consisted of 63 percent coal, 31 percent nuclear, and 6 percent hydroelectric.

Tennessee Valley Authority. Created by President Franklin Roosevelt to provide low-cost electricity and to stimulate economic development in the Tennessee Valley region during the Great Depression, the TVA also ran demonstration farms, such as the one shown here, to instruct farmers on the optimal use of fertilizers. In the photo, the healthy crops were treated with phosphate, while the barren crops were left untreated. © Franklin Delano Roosevelt Library


The agency serves 158 local municipal and cooperative power distributors that deliver power to homes and businesses within the seven-state area. Also involving itself in technical assistance to communities and industries, TVA conducts economic research and analysis, industrial research, and conceptual site engineering and architectural services. It also provides technical and financial support to small and minority-owned businesses as well as working with regional industrial development associations to recruit new industry and develop strategies for creating jobs.


Preserving wildlife habitat, TVA oversees more than 122,000 acres of public land designated for natural-resource management. Forty percent of it is administered by other agencies, while the remainder falls under TVA management. The agency launched the Natural Heritage Project in 1976, with the help of the Nature Conservancy, to analyze and manage biodiversity on TVA lands and to improve compliance with federal environmental regulations. The project monitors threatened and endangered plant and animal species in the TVA service area. Since its beginnings, the Natural Heritage Project has supplied environmental data on TVA activities ranging from transmission-line construction to economic development. TVA has also developed a land-use system of 10,700 acres classified as TVA Natural Areas. The specified sites are designated as Habitat Protection Areas, Small Wild Areas, Ecological Study Areas, or Wildlife Observation Areas and include limitations on activities that could endanger important natural features. Throughout the Tennessee Valley, TVA operates roughly 100 public recreation facilities, including campgrounds, day-use areas, and boat-launching ramps.

BIBLIOGRAPHY

Callahan, North. TVA: Bridge Over Troubled Waters. South Brunswick, N.J.: A. S. Barnes, 1980.
Chandler, William U. The Myth of TVA: Conservation and Development in the Tennessee Valley, 1933–1983. Cambridge, Mass.: Ballinger, 1984.
Conkin, Paul K., and Erwin C. Hargrove. TVA: Fifty Years of Grass-Roots Bureaucracy. Chicago: University of Illinois Press, 1983.

Creese, Walter L. TVA’s Public Planning: The Vision, The Reality. Knoxville: The University of Tennessee Press, 1990.
Tennessee Valley Authority Web Site. Home page at http://www.tva.com/.

Kym O’Connell-Todd
See also New Deal; and vol. 9: Power.

TENNIS, or more properly, lawn tennis, derives from the ancient game of court tennis. It was introduced in the United States shortly after Major Walter Clopton Wingfield demonstrated a game he called Sphairistike at a garden party in Nantclwyd, Wales, in December 1873. Formerly, some historians believed that Wingfield’s game of Sphairistike, played on an hourglass-shaped court, was first brought to America by way of Bermuda. In 1875 Mary Ewing Outerbridge, an American, obtained a set of tennis equipment from British officers stationed there and her brother, A. Emilius Outerbridge, set up a court on the grounds of the Staten Island Cricket and Baseball Club in New York City, the home of the first national tournament in September 1880. However, Outerbridge was preceded by Dr. James Dwight (often called the father of American lawn tennis) and F. R. Sears Jr., who played the first tennis match in the United States at Nahant, Massachusetts, in August 1874. The present scoring system of 15, 30, 40, games, and sets became official at the first Wimbledon (England) Championship in 1877. In 1881, the newly formed U.S. National Lawn Tennis Association (USNLTA) (the “National” was dropped in 1920, the “Lawn” in 1975) hosted the first official tennis championship in the United States at the Newport Casino


in Rhode Island. Richard D. Sears of Boston won the tournament, a feat he repeated annually through 1887.

From the Late Nineteenth to the Mid-Twentieth Century
Although tennis was initially confined mainly to the Northeast, by the 1880s and 1890s it was spreading throughout the United States, with tournaments and clubs organized in Cincinnati, Atlanta, New Orleans, Seattle, San Francisco, and Chicago, which was awarded the national doubles championships in 1893 as part of the World’s Columbian Exposition there. The first Davis Cup matches, between the United States and Great Britain, were held at the Longwood Cricket Club in Brookline, Massachusetts, in 1900. The cup donor, Dwight F. Davis, was a native of St. Louis but was at Harvard when he put up the cup, as were Malcolm Whitman and Holcombe Ward, also members of the first Davis Cup team. At that time, there were 44 tennis clubs in the United States; by 1908, there were 115. Like golf, tennis was most popular among America’s economic and cultural elite. African Americans, Jews, and recent immigrants were usually excluded from the private clubs where tennis thrived. From its introduction in the United States, tennis greatly appealed to both sexes, yet women were initially forbidden from playing in public tournaments. American clubs, like those in Europe, often assigned female players different venues and imposed confining styles of dress that limited their range of motion. Nevertheless, the United States has consistently produced some of the strongest women players in tennis history. The English-born Californian May Sutton was national champion in 1904, and in 1905 became the first American to win at Wimbledon. Hazel Hotchkiss’ volleying style of attack allowed her to win forty-three national titles. She was also the donor of the Wightman Cup, sought annually since 1923 by British and American women’s teams. Fifty years later, Billie Jean King, winner of four U.S. titles, would defeat the aging Bobby Riggs in what was called the Battle of the Sexes, a landmark event in the histories of both tennis and feminism. In 1916 the USNLTA funded a series of programs and clinics to develop the skills of budding tennis players and promote the sport on a wider scale. As a result, the following decades saw numerous American players receive worldwide acclaim. Over the course of his career, William T. Tilden II won seven U.S. titles and three Wimbledon championships. Beginning in 1923, Helen Wills won the first of seven U.S. women’s championships and ultimately triumphed at Wimbledon for a record eight times. Her match at Cannes in 1926 with Suzanne Lenglen, six-time Wimbledon champion, was the most celebrated women’s contest in the history of the game. A decade later Don Budge, the first player to complete the coveted “grand slam” by winning at Wimbledon, the U.S. Open, the French Open, and the Australian Open, regained the Davis Cup for the United States in 1937 after a period of French and English domination.


Grand Slam Winner. In 1938 Don Budge (in near court, opposite Fred Perry) became the first person to win a tennis Grand Slam, taking the titles of the Australian Open, French Open, Wimbledon, and the U.S. Open in the same year. © Corbis

Following World War II, the development of young tennis players continued under the auspices of the Tennis Educational Association. School physical education instructors were trained to teach tennis, while inner-city programs attempted to spread tennis to underprivileged youths. At the same time, the American Tennis Association became an outlet for aspiring African American players, including Althea Gibson, who in 1950 became the first African American to participate in the U.S. Open.

Radical Innovations
The late 1960s saw revolutionary changes in tennis, both in the United States and worldwide. Until that time, the sport’s most prestigious competitions were open exclusively to amateurs. However, in 1968 the International Lawn Tennis Federation sanctioned open tournaments, permitting amateurs to compete against professionals. This shift had a profound impact on both professional and amateur tennis. New promoters and commercial sponsors came into the game and the schedule of tournaments was radically revised and enlarged. The prize money available for professional players increased dramatically, with tennis superstars such as Rod Laver, Jimmy Connors, Arthur Ashe, Billie Jean King, and Chris Evert earning hundreds of thousands of dollars a year by the mid-1970s. Top players no longer struggled to earn a living under the rules governing amateur status; as a result, the mean age of competitive players rose sharply, as many found they could earn more playing tennis than in other careers. Matches were also increasingly televised, especially after 1970, when the introduction of the “sudden death” tiebreaker made it possible to control the length of matches.

Improvements in racket technology further revolutionized the sport of tennis during the 1960s and 1970s. Steel, aluminum, and graphite rackets soon replaced the traditional wooden designs. Over the next two decades, wood and metal rackets gave way to stronger and lighter synthetic materials, while conventional head sizes disappeared in favor of intermediate and oversized racket heads, first introduced by Prince Manufacturing in 1976. Competitive techniques and styles of play were greatly affected by the new racket technology. The two-handed backhand, popularized during the 1970s, proved ideally suited to the new, larger racket heads and became a staple of the competitive game. The new racket technology was clearly responsible for a greater reliance on power in both men’s and women’s competitive tennis throughout the 1990s.

U.S. Dominance
During the last three decades of the twentieth century, the United States remained the single most important source of world-class players. Between 1974 and 1999, Jimmy Connors, John McEnroe, Jim Courier, Pete Sampras, and Andre Agassi held the world’s top men’s ranking for a combined sixteen years. In the same period, Americans Billie Jean King, Chris Evert, Monica Seles, and Lindsay Davenport held the top women’s ranking for a total of ten years, with Martina Navratilova, a naturalized American, adding another seven. Since the late 1970s, when an estimated thirty-two to thirty-four million Americans played tennis, the popularity of the sport has been in decline. Although interest in tennis experienced a resurgence during the early 1990s, by the decade’s end only 17.5 million Americans were actually playing the sport. Particularly underrepresented have been Americans of color, despite the success and influence of such players as Michael Chang and Venus and Serena Williams. Nevertheless, tennis remains a multibillion-dollar industry worldwide, with top tournaments frequently hosting record crowds.

BIBLIOGRAPHY

Collins, Bud, and Zander Hollander, eds. Bud Collins’ Modern Encyclopedia of Tennis. Farmington Hills, Mich.: Gale, 1994.
Gillmeister, Heiner. Tennis: A Cultural History. New York: New York University Press, 1998.
Parsons, John. The Ultimate Encyclopedia of Tennis: The Definitive Illustrated Guide to World Tennis. London: Carlton Books, 1998.
Phillips, Caryl. The Right Set: A Tennis Anthology. New York: Vintage, 1999.
Sports Illustrated 2002 Sports Almanac. New York: Bishop Books, 2001.

Allison Danzig
David W. Galenson
John M. Kinder
See also Sports.

TENURE OF OFFICE ACT, passed by Congress in 1867 over President Andrew Johnson’s veto, was designed to restrict greatly Johnson’s appointing and removing power. When Johnson attempted to remove Secretary of War Edwin M. Stanton, the Radical Republican Congress proceeded with its long-laid plans for the impeachment and trial of the president. As Stanton was not a Johnson appointee, the act could not be applied to him. Passed during, and as part of, the struggle between Johnson and Congress over Reconstruction, sections of the act were repealed early in Ulysses S. Grant’s first administration; the rest of the act was repealed 5 March 1887.

BIBLIOGRAPHY

Benedict, Michael Les. A Compromise of Principle: Congressional Republicans and Reconstruction, 1863–1869. New York: Norton, 1974.
Kutler, Stanley I. Judicial Power and Reconstruction. Chicago: University of Chicago Press, 1968.
McKitrick, Eric L. Andrew Johnson and Reconstruction. Chicago: University of Chicago Press, 1960.
Thomas, Benjamin P., and Harold M. Hyman. Stanton: The Life and Times of Lincoln’s Secretary of War. New York: Knopf, 1962.

Willard H. Smith / a. g.
See also Impeachment Trial of Samuel Chase; Liberal Republican Party; Stalwarts; Wade-Davis Bill.

TERMINATION POLICY. After World War II, pressure in Congress mounted to reduce Washington’s authority in the West, end the reservation system, and liquidate the government’s responsibilities to Indians. In 1953 the House of Representatives passed Resolution 108, proposing an end to federal services for thirteen tribes deemed ready to handle their own affairs. The same year, Public Law 280 transferred jurisdiction over tribal lands to state and local governments in five states. Within a decade Congress terminated federal services to more than sixty groups, including the Menominees of Wisconsin and the Klamaths of Oregon, despite intense opposition by Indians. The effects of the laws on the Menominees and the Klamaths were disastrous, forcing many members of the tribes onto public assistance rolls. President John F. Kennedy halted further termination in 1961, and Presidents Lyndon B. Johnson and Richard M. Nixon replaced termination with a policy of encouraging Indian selfdetermination with continuing government assistance and services. After years of struggle the Menominees and Klamaths succeeded in having their tribal status restored in 1973 and 1986, respectively. BIBLIOGRAPHY

Fixico, Donald Lee. Termination and Relocation: Federal Indian Policy, 1945–1960. Albuquerque: University of New Mexico Press, 1986.


Peroff, Nicholas C. Menominee Drums: Tribal Termination and Restoration, 1954–1974. Norman: University of Oklahoma Press, 1982.

Frank Rzeczkowski
See also Bureau of Indian Affairs.

TERRITORIAL GOVERNMENTS. The Constitution empowers Congress to govern the territory of the United States and to admit new states into the Union. However, territorial governments in the United States predate the Constitution. The Congress of the Confederation enacted the Northwest Ordinance of 1787 for the region north of the Ohio River and westward to the Mississippi. Under its terms the territories could look forward to eventual statehood on terms of equality with the original states. As modified by congressional enactments after the adoption of the Constitution in 1789, the Ordinance set forth the general framework of government for the territories that ultimately achieved statehood, beginning with Tennessee in 1796 and ending, most recently, with Alaska and Hawaii in 1959. The Ordinance provided for three stages of government. Congress established each territorial government by way of an organic act, a federal law serving as a temporary constitution. In the initial or “district” stage, the president, with the consent of the Senate, appointed a governor, a secretary, and three judges. The governor served as head of the militia and superintendent of Indian affairs. He was authorized to establish townships and counties, appoint their officials, and, in conjunction with the judges, adopt laws for the territory. The second stage began when the territory attained a population of at least 5,000 free adult males. The inhabitants could then establish a legislature consisting of a house of representatives elected for two years and a legislative council appointed by the president to serve for five years. The house and council would choose a nonvoting delegate to Congress. The governor enjoyed the authority to convene, adjourn, and dissolve the legislature, and could exercise a veto over legislative enactments. Congress retained the power to nullify the acts of territorial legislatures. Finally, when the total population of a territory reached 60,000, it could petition Congress for admission into the Union. Admission was not automatic; indeed, the process often became entangled in struggles between partisan or sectional interests. For example, in the decades preceding the Civil War, Congress balanced the admission of each free state with the admission of a slave state. Once it decided to admit a territory, Congress would pass an enabling act authorizing the people of the territory to adopt a permanent state constitution and government. Over the course of the nineteenth century Congress further modified the pattern set forth in the Ordinance. For instance, in later territories the governor, secretary,


and judges were appointed for four years, and the electorate chose the members of the council and the nonvoting congressional delegate, who served for two-year terms. The governor shared appointive power with the council, and a two-thirds vote of the legislature could override his veto. The legislature apportioned itself, fixed the qualifications for suffrage, and organized judicial districts. Most local officials were elected. Legislative and gubernatorial acts were still subject to the approval of Congress. Judicial power was placed in supreme, superior, district, probate, and justice-of-the-peace courts. The turn of the twentieth century ushered in a new period in territorial governance. In 1898, the United States won the Spanish-American War and took sovereignty over the Philippines, Puerto Rico, and Guam. It established governments in these territories that borrowed elements from the Ordinance, but varied widely according to local circumstances. For instance, under Puerto Rico’s Organic Act, passed by Congress in 1900, the president appointed the governor and legislative council, while the electorate chose the members of a lower legislative chamber. Yet in Guam, Congress did not even pass an organic act until 1950; until then, the navy administered the territory. The acquisition of these islands triggered a nationwide debate over whether Congress had an obligation to admit all U.S. territories into statehood eventually, or whether it could govern some territories as colonies indefinitely. Opposition to statehood for the former Spanish colonies was based in part on the view that their inhabitants were too different, racially and culturally, from the American mainstream. In the rhetoric of the time, the question was whether the Constitution “followed the flag” to the new territories. In the Insular Cases of 1901, the U.S. Supreme Court held that it did not. Distinguishing for the first time between incorporated and unincorporated territories, the Court explained that all territories acquired prior to 1898 (along with Hawaii, which became a U.S. territory in 1898) had been incorporated into the United States, while the new territories remained unincorporated. According to the Court, the decision whether to incorporate a territory was entirely up to Congress. The incorporated/unincorporated distinction had two consequences. First, unincorporated territories were not considered to be on a path to statehood. Second, in legislating for incorporated territories, Congress was bound by all constitutional provisions not obviously inapplicable, but in the unincorporated territories, Congress was bound to observe only the “fundamental” guarantees of the Constitution. Neither the Court nor Congress attempted to specify precisely what these fundamental guarantees included. Later, the Supreme Court decided that such guarantees as the right to a trial by jury and an indictment by grand jury were not among these fundamental rights, but most provisions of the Bill of Rights were held applicable. At the turn of the twenty-first century, the U.S. had five territories, none of which was incorporated: the Commonwealth of Puerto Rico, the Commonwealth of the


Northern Mariana Islands (CNMI), the U.S. Virgin Islands, Guam, and American Samoa. Although federal laws generally apply in the territories, and their inhabitants are U.S. citizens (or, in American Samoa, U.S. nationals), they cannot vote in presidential elections and do not have senators or representatives in the federal government. Instead, they elect nonvoting delegates to Congress, except for the CNMI, which simply sends a representative to Washington, D.C. The Departments of War, State, Interior, and the Navy have all played a role in the administration of territories. In 1873, Congress conferred upon the Department of the Interior statutory jurisdiction over territorial governments, but after 1898, Guam was assigned to the Navy Department, and the Philippines and Puerto Rico to the War Department. In 1934 President Franklin D. Roosevelt created by executive order the Division of Territories and Island Possessions within the Department of the Interior. In 1950 this division became the Office of Territories. In the early 2000s it was known as the Office of Insular Affairs. BIBLIOGRAPHY

Eblen, Jack Ericson. The First and Second United States Empires: Governors and Territorial Government, 1784–1912. Pittsburgh, Pa.: University of Pittsburgh Press, 1968. Farrand, Max. The Legislation of Congress for the Government of the Organized Territories of the United States, 1789–1895. Newark, N.J.: Baker, 1896. Leibowitz, Arnold H. Defining Status: A Comprehensive Analysis of United States Territorial Relations. Dordrecht, Netherlands: Nijhoff, 1989. Van Cleve, Ruth G. The Office of Territorial Affairs. New York: Praeger, 1974.

Christina Duffy Burnett See also Territories of the United States.

TERRITORIAL SEA is a belt of coastal waters subject to the territorial jurisdiction of a coastal state. The territorial jurisdiction of the coastal state extends to the territorial sea, subject to certain obligations deriving from international law, the most significant of which is the right of innocent passage by foreign ships. The distinction between the territorial sea, in effect an extension of exclusive coastal-state sovereignty over its land mass, and the high seas, a global commons beyond the reach of any state’s jurisdiction, dates at least to the early eighteenth century in Europe. A limit to the territorial sea of three nautical miles from the coast was accepted by many countries until the latter part of the twentieth century, including by the United States, which claimed a three-mile territorial sea dating from the beginning of the republic. A United Nations–sponsored conference in 1958 adopted four major multilateral agreements on the law of the sea but failed to secure an international agreement on a compromise limit to the territorial sea. The United States, along with

other maritime powers such as the United Kingdom, Japan, and the Netherlands, argued for the traditional three-mile limit so as to preclude coastal-state encroachments into the navigational freedoms of the high seas. A second UN conference convened in 1960 was similarly unsuccessful. The Third United Nations Conference on the Law of the Sea, initiated in 1973, adopted a major new multilateral convention in Montego Bay, Jamaica, in 1982. That agreement confirmed the emerging trend toward a twelve-mile limit. Although the United States is not a party to the 1982 convention, President Reagan in December 1988 claimed a twelve-mile territorial sea on behalf of the United States. According to the Montego Bay convention, which has emerged as the international standard even for those states not party to it, measurement of the territorial sea from convoluted shorelines may be made from baselines connecting headlands. Baselines are also used for bays and estuaries with headlands not over twenty-four miles apart, between outer points of coastal island chains that enclose internal waters, and for historic bays to which territorial claims have been established by long and uncontested use. The territorial sea is now but one component of a larger international legal regime governing the interests of coastal states in their adjacent waters. The United States, like many states, claims limited jurisdiction in a “contiguous zone” of twelve additional miles beyond the territorial sea to enforce customs, fiscal, immigration, and sanitary laws, and to punish violations of its laws committed in its territory or territorial sea. U.S. courts have supported the arrest of smugglers hovering beyond territorial waters with the intent to violate customs laws. Legislation authorizing a four-league customs-enforcement zone was protested by other countries, but during Prohibition several countries agreed by treaty to arrests within a one-hour sailing distance from shore. Many countries, following President Harry S. Truman’s proclamation in 1945, have claimed jurisdiction over continental shelves extending off their coasts. This form of jurisdiction extends to the seabed and not the water column above it, primarily for the purpose of exploiting resources such as oil and gas. The extent of the continental shelf may vary, depending on the shape of the sea floor. “Exclusive economic zones,” which govern the use of the water column primarily for the purposes of fishing, may extend up to 200 nautical miles from a coastal state’s baseline. In 1983 President Reagan claimed an exclusive economic zone of 200 nautical miles on behalf of the United States. BIBLIOGRAPHY

Jessup, Philip C. The Law of Territorial Waters and Maritime Jurisdiction. New York: Jennings, 1927. McDougal, Myres S., and William T. Burke. The Public Order of the Oceans: A Contemporary International Law of the Sea. New Haven, Conn.: New Haven Press, 1987.

David A. Wirth See also International Law.


TERRITORIES OF THE UNITED STATES are those dependencies and possessions over which the United States exercises jurisdiction. Until the end of the nineteenth century, American experience was almost exclusively directed to the creation of territorial governments within the continental United States. The Northwest Ordinance of 1787 set the precedent that territorial status was a step on the path to statehood, during which time residents of the territories maintained their citizenship and their protections under the Constitution. Alaska and Hawaii, admitted in 1959, were the last of the territories to become states and the only exceptions to the pattern of contiguity with existing states and territories. Although new states were admitted, in the twentieth century the United States entered an era when the appropriate destiny of its territorial acquisitions was not necessarily statehood. For the Spanish possessions ceded to the United States in 1898, the peace treaty did not include the promise of citizenship found in earlier treaties of annexation. Subject only to the limitations of the Constitution, Congress was free to determine the political status and civil rights of the inhabitants. In the Insular Cases, decided in 1901, the Supreme Court held that Congress could distinguish between incorporated and unincorporated territories and that the full guarantees and restraints of the Constitution need not be applied to the latter. Congress


uniformly chose to treat its new acquisitions as unincorporated territories and so enjoyed a flexibility not present in the earlier pattern of territorial government. In common with other dependencies Puerto Rico was initially subject to military control, although this period was brief. Its inhabitants became U.S. citizens in 1917. Civil government with a gradual broadening of self-rule culminated in an act of Congress in 1950 that authorized Puerto Rico to formulate and adopt its own constitution, which came into effect in 1952. While commonwealth status is not the equivalent of statehood and did not terminate U.S. authority, the agreement that neither Congress nor the president should annul Puerto Rican legislation guaranteed the commonwealth the maximum degree of autonomy accorded to any of the territories. The Virgin Islands were purchased from Denmark in 1917 and citizenship was conferred in 1927. By the early 2000s, the islands had become a popular vacation destination. Guam did not attract significant attention until World War II, after which it became the site of major military installations. Guamanians became citizens in 1950, framed and adopted a constitution in 1969, and since 1970 have elected their governor as well as members of the legislature.


American Samoa became a distinct entity in 1899 and remained under the administration of the U.S. Navy until 1951. In 1960 a constitution was formulated with Samoan participation and was then accepted and promulgated by the secretary of the Interior. With the exception of Guam, islands of the Caroline, Marshall, and Mariana groups have been held by the United States as trust territories under the United Nations since 1947. The trust agreement charges the United States with the development of the islands toward “self-government or independence.” BIBLIOGRAPHY

Carr, Raymond. Puerto Rico: A Colonial Experiment. New York: New York University Press, 1984. Stevens, Russell L. Guam U.S.A.: Birth of a Territory. Honolulu: Tongg Publishing, 1956. Taylor, Bette A. The Virgin Islands of the United States: A Descriptive and Historical Profile. Washington, D.C.: Congressional Research Library, Library of Congress, 1988.

Robert L. Berg / a. g. See also Caroline Islands; Guantanamo Bay; Marshall Islands; Midway Islands; Paris, Treaty of (1898); Pribilof Islands; Samoa, American; Spain, Relations with; Spanish-American War; Teller Amendment.

TERRORISM is a political tactic that uses threat or violence, usually against civilians, to frighten a target group into conceding to certain political demands. The term “terrorism” was first used to describe the state terrorism practiced by the French revolutionaries of 1789–1795. Through kangaroo courts, executions by guillotine, and violent repression of political opponents, the revolutionaries tried to frighten the population into submission. Two great terrorist states of the twentieth century, Nazi Germany and Stalinist Russia, also practiced the threat and use of violence to keep their own citizens in line. In the nineteenth century, terrorist tactics were adopted by individuals and groups that used assassinations, bombings, and kidnappings to undermine popular support for what the terrorists saw as unjust policies or tyrannical governments. Terrorist acts were first committed on a wide scale in the United States during the latter part of the nineteenth century. On 4 May 1886, an anarchist bomb killed eight policemen during a demonstration in Chicago’s Haymarket Square, and on 16 September 1920, an anarchist bomb hidden in a wagon on Wall Street killed thirty people and seriously injured more than two hundred. Although anarchist violence received the most newspaper coverage during this period, the white supremacist Ku Klux Klan (KKK) was the most important terrorist group in the United States from its founding in 1866 until the 1960s. The KKK used marches, beatings, and lynchings to intimidate

Terrorism in Alabama. Officials examine the destruction at the Sixteenth Street Baptist Church in Birmingham after a bomb killed four black girls attending Sunday school on 15 September 1963; the last conviction of the white supremacists responsible did not take place until nearly forty years later. AP/Wide World Photos

African Americans who wished to vote or otherwise participate in the political process. Beginning in the late 1960s, extreme-left groups like the Weathermen engaged in kidnapping and bombings to protest the Vietnam War, while groups like the Symbionese Liberation Army engaged in armed actions against civilians or the police, hoping thereby to provoke a “people’s revolution.” These groups disappeared in the 1970s and 1980s only to be replaced by extreme-right terrorist organizations. On 19 April 1995 a truck bomb exploded outside the Alfred P. Murrah federal building in Oklahoma City, destroying the building and killing 168 people. An act of domestic terrorism, the Oklahoma City Bombing was the worst terrorist attack in U.S. history at the time. Testifying before the U.S. Senate in 1998, FBI Director Louis J. Freeh stated that, “The current domestic terrorist threat primarily comes from right-wing extremist groups, including radical paramilitary [militia] groups, Puerto Rican terrorist groups, and special interest groups.” The period after 1960 saw the rise of international terrorist attacks on Americans in the Middle East and in Latin America. The most dramatic instance of terrorism during this period was the 4 November 1979 attack by


Iranian students on the United States Embassy in Teheran, when sixty-six Americans were taken hostage; fifty-two of them were held until their release on 20 January 1981. According to the U.S. State Department, seventy-seven U.S. citizens were killed and 651 injured in international terrorist attacks between 1995 and 2000. By the mid-1970s, international terrorists began to carry out operations on American soil. On 24 January 1975, the Puerto Rican Armed National Liberation Front killed four people when bombs exploded at the Fraunces Tavern in New York City. Eleven months later, on 29 December 1975, a bomb exploded in the TWA terminal at La Guardia Airport, killing eleven. No group ever claimed responsibility. The next major incident occurred on 26 February 1993, when a truck bomb exploded in the basement of New York’s World Trade Center, killing six and injuring more than a thousand. At his 1997 trial, bombing mastermind Ramzi Yousef stated, “I support terrorism so long as it was against the United States government and against Israel.” On 11 September 2001, in the most murderous terrorist attack American history had yet witnessed, almost three thousand people were killed. Nineteen Middle Eastern terrorists hijacked four airplanes; one crashed into the Pentagon, two destroyed the twin towers of New York City’s World Trade Center, and one, possibly headed for the White House, crashed in a wooded area of Pennsylvania. Although the hijackers left no message, they were clearly motivated by hatred of the United States and by a desire to force a change in American policy in the Middle East. The enormity of the attack pushed terrorism to the top of the American political agenda, with President George W. Bush declaring “war on terror” in his 20 September 2001 address to a joint session of Congress. President Bush predicted that this new war could last for years or even decades. The World Trade Center attack also led to a major change in the way the United States deals with terrorism. Before 11 September 2001, the United States followed a police-justice model whereby police and intelligence agencies identified and apprehended terrorists and then turned them over to the justice system. After those attacks, however, the Bush Administration adopted a preemptive-war model, whereby the United States intends to strike at individual terrorists or terrorist groups anywhere in the world and has threatened to use all means necessary, from special forces to massive military force, to attack what it identifies as “terrorist states” that support international terrorism. The adoption of this model led President Bush in his 29 January 2002 State of the Union address to talk about Iran, Iraq, and North Korea together as an “axis of evil” and to threaten military action against Iraq. This statement led to much uneasiness among allies of the United States, who feared that the administration’s war on terrorism signaled a move toward unilateralism in U.S. foreign policy and the destabilization of international relations.


BIBLIOGRAPHY

Harmon, Christopher. Terrorism Today. Portland, Ore.: Frank Cass, 2000. Laqueur, Walter. The New Terrorism: Fanaticism and the Arms of Mass Destruction. New York: Oxford University Press, 2000. Wilkinson, Paul. Terrorism and the Liberal State. New York: New York University Press, 1986.

Harvey G. Simmons See also 9/11 Attack; and vol. 9: George W. Bush, Address to a Joint Session of Congress and the American People (As Delivered Before Congress), 20 September 2001.

TEST LAWS. Although the national government had used loyalty tests before the Civil War and Reconstruction, those eras witnessed an attempt to establish criteria of loyalty. Both Abraham Lincoln and Andrew Johnson considered loyalty oaths and disloyalty proceedings to be an integral part of war and reconstruction policy. Despite constant pressure from Congress, Lincoln maintained control of loyalty proceedings in the federal government. He did, however, have to compromise particularly in the case of the “ironclad oath.” This oath required every federal officeholder to swear that he had “never voluntarily borne arms against the United States” or given any aid to those so doing or held any office “under any authority or pretended authority in hostility to the United States.” Furthermore, each individual had to swear that he had “not yielded a voluntary support to any pretended government, authority, power, or constitution within the United States, hostile or inimical thereto. . . .” In 1864, Congress broadened the scope of the oath to include its own membership, which would effectively bar returning reconstructed state delegations. On 24 January 1865, Congress extended the oath to lawyers practicing in federal courts. Under Johnson the issue of loyalty oaths became critical to Radical Republican policy. In Missouri and West Virginia, for example, adoption of the ironclad oath was fundamental to Radical Republican control. Both the federal and state oaths created serious constitutional difficulties, however. Opponents raised various constitutional challenges to the oaths, and in 1866, the Supreme Court heard Cummings v. Missouri and Ex Parte Garland, the former a challenge to the state law and the latter a challenge to the federal test-oath act of 1865. The decisions in these two cases had been preceded in December 1866 by Ex Parte Milligan, which some Republicans had interpreted as dangerous to their ideas of reconstruction. The decisions rendered in the Cummings and Garland test-oath cases did not allay their suspicions. On 14 January 1867, the Supreme Court invalidated the test oath of 1865 because the oath provision was a bill of attainder and an ex post facto law. Because of these decisions, Radical Republicans mounted various legislative proposals for curbing what

they felt was abuse of judicial power. The Radical Republicans asserted the right of the legislative branch to decide “political” questions, which included the barring of “conspirators” and “traitors” from practicing in federal courts. Meanwhile, in 1867, the Court in Mississippi v. Johnson rejected an attempt to have it rule on the constitutionality of congressional Reconstruction, arguing that an injunction in this case would interfere in the legitimate political functions of the legislative and executive branches. The Court’s decision in 1868 to hear arguments in Ex Parte McCardle did lead to congressional action curtailing the Court’s jurisdiction in all cases arising under the Habeas Corpus Act of 1867. The Court’s acquiescence in this restriction of its power of judicial review, together with its acceptance in Texas v. White (1869) of Congress’s right to guarantee republican governments in the states, obviated any further threats to the Court at this time.

The test oath itself was modified in 1868 for national legislators, who now had only to swear to future loyalty. In 1871, Congress further modified the oath for all former Confederates to a promise of future loyalty. Finally, in 1884, Congress repealed the test-oath statutes.

BIBLIOGRAPHY

Foner, Eric. A Short History of Reconstruction, 1863–1877. New York: Harper and Row, 1990. Kutler, Stanley I. Judicial Power and Reconstruction. Chicago: University of Chicago Press, 1968. Sniderman, Paul M. A Question of Loyalty. Berkeley: University of California Press, 1981.

Joseph A. Dowling / a. e.

TET OFFENSIVE. In the spring of 1967, the communist Vietcong leadership began planning a nationwide offensive aimed at destroying the South Vietnamese government and forcing the Americans out of the Vietnam War. The communists were concerned about the growing U.S. military presence in Vietnam and their own mounting losses. The Vietcong believed that South Vietnam was ripe for revolution and saw the Saigon government as the weak link in the Allied war effort. The Politburo in Hanoi, in conjunction with leaders of the Vietcong, developed a plan for an all-out attack to take place during the Tet holiday at the end of January 1968. The communists expected that a general offensive, aimed primarily at South Vietnamese military and government installations, would encourage a majority of the citizens to turn against the Saigon government. The combination of military action and popular revolution would sweep away the Saigon regime, put in its place a procommunist slate of leaders, and thus force the United States to withdraw from the war. The communists christened their attack the Tong Cong Kich–Tong Khoi Nghia, or TCK–TKN (General Offensive–General Uprising) plan. The first phase of TCK–TKN began in the fall of 1967 with a series of attacks in western Vietnam near the

borders with Laos and Cambodia. These attacks were designed to draw allied forces away from urban centers in the eastern part of the country, and gave the communists more opportunity to infiltrate troops and stockpile supplies near dozens of key cities and towns. The allied leaders detected signs of an imminent enemy offensive that would likely take place around the Tet holiday but concluded that the thrust would be limited to the three northern provinces of South Vietnam. In the early morning hours of 30 January 1968, the communists in the mid-northern section of South Vietnam began their offensive one day early, apparently the result of a miscommunication with Hanoi. They attacked nine cities, including Da Nang, Nha Trang, Pleiku, and Kontum, which gave allied forces partial warning before the main offensive began in the early morning hours of the thirty-first. The communists, however, still managed to achieve a large measure of tactical surprise. Approximately 84,000 communist soldiers attacked Saigon and five of the largest urban centers, thirty-six of forty-four



Tet Offensive. The Cholon area of Saigon is hit by two 750-pound bombs during the shelling of the South Vietnamese capital in early 1968. © Corbis

provincial capitals, and at least sixty-four of 242 district capitals. The communists wreaked havoc and caused confusion, but were soon overcome by the weight of American firepower and the surprisingly able resistance of the South Vietnamese army. With the exception of the city of Hué and the marine base at Khe Sanh, two battles that persisted until March, the offensive collapsed within the first week. As many as 45,000 Vietcong and North Vietnamese army soldiers perished in the offensive, and the popular uprising failed to materialize. However, the offensive caused significant political turmoil in the United States and strengthened the hand of those who wanted to limit or extinguish the American role in Vietnam. BIBLIOGRAPHY

Davidson, Phillip B. Vietnam at War: The History, 1946–1975. Novato, Calif.: Presidio Press, 1988. Karnow, Stanley. Vietnam: A History. New York: Viking, 1983. Oberdorfer, Don. Tet! The Turning Point in the Vietnam War. Garden City, N.Y.: Doubleday, 1971.

Erik B. Villard See also Vietnam War.

TEXAN EMIGRATION AND LAND COMPANY, also known as the Peters’ Colony Company, introduced 2,205 families into north central Texas between 1841 and 1848 as part of the basic settlement of seventeen present-day counties, which include the cities of Dallas, Fort Worth, and Wichita Falls. Organized by W. S. Peters and associates of Louisville, Kentucky, and Cincinnati, Ohio, the company entered into contract with the Republic of Texas on 9 November 1841. The Republic of Texas distributed free land on its northern Indian frontier in parcels of 640 acres, while the company furnished the

colonists with log cabins, rifles, and ammunition. Acrimonious disputes arose when other settlers, acting independently, moved into land unoccupied but promised to the company, and claimed homesteads by preemption. The only organized opposition in Texas to annexation in 1845 came from agents of the company, who feared abrogation of their colonization contract. Conflicts waxed after annexation, leading to two armed raids by settlers, in 1848 and 1852, on company headquarters at Stewartsville, Collin County. Land title claims were quieted only in 1853, when a law was passed granting settlers the right to land actually occupied as a homestead. The company was then compensated in part with a tract of unoccupied public land in west Texas. BIBLIOGRAPHY

Connor, Seymour V. Kentucky Colonization in Texas: A History of the Peters Colony. Baltimore: Clearfield, 1994.

Sam H. Acheson / a. r. See also Annexation of Territory; Land Claims; Land Companies; Texas Public Lands.

TEXAS. The varied geography of Texas has helped to shape its history. The eastern third of the state’s 266,807 square miles is mostly humid woodlands, much like Louisiana and Arkansas. A broad coastal plain borders the Gulf of Mexico. Much of southwest and far-west Texas is semiarid or arid desert, and west-central Texas northward through the Panhandle marks the southernmost part of the Great Plains. The central and north-central regions of the state are mostly gently rolling prairies with moderate rainfall. Moving from northeast to southwest, the major rivers are the Red, Sabine, Trinity, Brazos, Colorado, Guadalupe, Nueces, and Rio Grande; none has ever proven very suitable for navigation. The state is generally flat, with the exception of the Hill Country region west of the Austin–San Antonio area and the Davis Mountains of far west Texas.

The First Texans
Prior to the arrival of Europeans, Texas was home to a diverse collection of native peoples. Most numerous of these were the Hasinai branch of the Caddo Indians in east Texas, an agricultural society related to the mound-building cultures of the Mississippi Valley. Along the upper and central Gulf Coast ranged the nomadic Karankawas, and south Texas was home to various hunter-gatherers collectively known as Coahuiltecans. The Apaches were the dominant Plains nation, following the great herds of bison. Numerous small groups, including the Jumanos of southwest Texas and the Tonkawas of central Texas, lived in various parts of the state.

Spanish Texas
Europeans first viewed Texas in 1519, when an expedition led by the Spaniard Alonso Álvarez de Pineda mapped the


Gulf Coast from Florida to Mexico. In 1528 survivors of the Pánfilo de Narváez expedition, which had previously explored parts of Florida, washed ashore in the vicinity of Galveston Island during a storm. Only four men survived the first few months, including Álvar Núñez Cabeza de Vaca, whose memoir became the first published account of Texas. After more than seven years of harrowing adventure, the castaways finally made their way back to Mexico in 1536. The tales of Cabeza de Vaca and his companions inspired the expedition of Francisco Vázquez de Coronado, who entered the Texas Panhandle from New Mexico in 1541. Although he failed in his search for gold, Coronado was the first European to see Palo Duro Canyon and to encounter the Apache Indians. In 1542, while Coronado was crossing the Panhandle, an expedition led by Luis de Moscoso Alvarado was entering east Texas from Louisiana. Moscoso perhaps reached as far as the Brazos River before returning to the Mississippi. When Coronado and Moscoso failed to find riches in Texas, Spain abandoned its efforts to explore or exploit Texas. For the next 140 years, Spain would claim the vast region, but only when the French suddenly appeared on the scene did Texas again become a priority. In 1684 René Robert Cavelier, Sieur de La Salle, sailed from France with the intention of establishing a colony at the mouth of the Mississippi River. Overshooting his target by 400 miles, he landed instead at Matagorda Bay. At a well-concealed point at the head of the bay, he built a crude camp commonly known as Fort Saint Louis. Beset by disease, disunity, and hostile Indians, the settlement lasted only four years, with La Salle being killed by his own men in 1687. But the ill-fated French venture alerted the Spanish to the dangers of losing Texas, and La Salle unintentionally became the impetus for the creation of a permanent Spanish presence in Texas. Between 1684 and 1689 Spain dispatched five sea and six land expeditions to locate and expel La Salle. Finally, in 1689 a party led by Alonso de León found the ruins of La Salle’s settlement. The French were gone, but Spain was now determined to establish a presence in east Texas among the Hasinai. The following year the Spanish established Mission San Francisco de los Tejas in present-day Houston County. However, floods, disease, and poor relations with the Indians caused the Franciscan missionaries to abandon the effort in 1693. Spain tried to move back into east Texas beginning in 1716, eventually founding six missions and a presidio there. In 1718 Martín de Alarcón, the governor of Coahuila and Texas, founded a mission and presidio on the San Antonio River in south central Texas to serve as a halfway station between the east Texas missions and the Rio Grande. In time, the San Antonio complex would become the capital and principal settlement of Spanish Texas. Spain’s second effort in east Texas proved little more successful than the first, and by 1731 most of the missions

in the east had been abandoned, leaving Spain with only a token presence in the area. Missions and presidios founded in other parts of Texas in the mid-1700s, such as the Mission San Sabá near present-day Menard, met with disease, Indian attack, or other problems and were all short-lived. In 1773, following an inspection tour by the Marqués de Rubí, the crown ordered the abandonment of the remaining east Texas settlements. Spain had acquired Louisiana from France in 1763 and no longer needed Texas as a buffer to French expansion. Some of the east Texas settlers resisted being resettled in San Antonio and eventually returned to east Texas, founding the town of Nacogdoches. By the late eighteenth century, then, Spanish Texas essentially consisted of San Antonio, Nacogdoches, and La Bahía (later renamed Goliad), which had been founded on the lower Texas coast in 1722. At its height around 1800, the non-Indian population of Spanish Texas numbered perhaps 4,000. When the United States acquired the Louisiana Territory in 1803, Spain found itself with an aggressive new neighbor on its northern frontier. Over the next two decades Anglo-American adventurers known as “filibusters” launched repeated expeditions into Texas, with the intention of detaching it from New Spain. Two filibusters, Augustus Magee (1813) and James Long (1819, 1821), joined with Mexican revolutionary José Bernardo Gutiérrez de Lara to invade Texas from the United States. A Spanish royalist army crushed the rebels near San Antonio at the battle of Medina River and unleashed a reign of terror across Texas. By the time Mexico won its independence from Spain in 1821, the non-Indian population of Texas stood at no more than 3,000.

Mexican Texas
Hispanic Texans, or Tejanos, had supported the movement for Mexican independence, and they likewise endorsed the creation of a federal republic in the 1820s. Long neglected by Mexico City, many of these hardy settlers realized that trade with the United States held the best promise for prosperity. Therefore, when a bankrupt American businessman named Moses Austin proposed establishing a colony of 300 American families in 1821, his plan met with widespread support and gained the approval of Spanish authorities. Austin died before launching his colony, but his son, Stephen F. Austin, inherited the project and became Texas’s first empresario (colonization agent). Austin’s colony encompassed parts of nearly forty present-day Texas counties along the lower watersheds of the Brazos and Colorado Rivers. By 1834 some 15,000 Anglos lived in Texas, along with 4,000 Tejanos and 2,000 African American slaves.

The Texas Revolution
Relations between the Texan settlers and the Mexican government began to sour in 1830, when the Mexican congress passed a law intended to weaken Anglo influence in the state. Among other provisions, the Law of 6 April 1830 placed Mexican troops in East Texas and canceled


all empresario contracts, although Austin and one other empresario were later exempted from the ban. Over the next five years, clashes between settlers and Mexican soldiers occurred repeatedly, often over customs regulations. Anglos demanded free trade, repeal of the 1830 law, and separate statehood for Texas apart from Coahuila, to which it had been joined for administrative purposes since 1824. Matters came to a head in 1835, when President Antonio López de Santa Anna abandoned federalism altogether, abolished the 1824 constitution, and centralized power in his own hands. Anglo Texans, joined by some Tejanos, resisted Santa Anna; hostilities commenced at Gonzales on 2 October 1835. One month later, the Texans declared a provisional state government loyal to the 1824 constitution. In February 1836 a Mexican army of several thousand commanded by Santa Anna arrived in San Antonio, where they found the old Alamo mission held by approximately 200 defenders. After a thirteen-day siege, Santa Anna’s soldiers stormed the mission on March 6, killing all the defenders, including James Bowie, William Barret Travis, and David Crockett. Shortly thereafter, James Fannin surrendered a force of about 400 volunteers at Goliad, who were subsequently executed at Santa Anna’s order. On March 2 a convention at Washington-on-the-Brazos declared independence and authorized Sam Houston to take command of all remaining troops in Texas. On 21 April 1836, following a six-week retreat across Texas, Houston’s army attacked one division of the Mexican army at San Jacinto and won a stunning victory. Some 800 Mexican troops were killed or wounded and that many more captured, while Texan deaths numbered fewer than ten. Santa Anna was captured the next day and ordered his remaining troops from Texas. Independence was won.

The Republic of Texas
In September 1836 Sam Houston was elected president of the Republic of Texas. He faced a daunting task in rebuilding the war-torn country, securing it against reinvasion from Mexico and hostile Indians, achieving diplomatic recognition from the world community, and developing the economy. Over the next decade the record on all of these matters was mixed at best. Twice in 1842 Mexican armies invaded and briefly occupied San Antonio. On the western frontier the Comanche Indians (immigrants to Texas in the mid-1700s) terrorized settlers with their brilliant horsemanship and fierce warrior code. In east Texas the Republic waged a brutal war of extermination against the Cherokees (also recent immigrants), driving the survivors into what is now Oklahoma. The Republic also undertook imprudent ventures such as the 1841 Santa Fe Expedition, intended to open a trade route between Texas and New Mexico, which resulted instead in the capture and imprisonment of nearly 300 Texans by Mexico. The wars against the Indians and the Santa Fe Expedition can largely be laid at the doorstep of Mirabeau B. Lamar, who replaced Houston as president in 1838 and


believed in a sort of Texan version of Manifest Destiny. Under Lamar, the national debt rose from $1 million to $7 million and the currency depreciated drastically. Typical of Lamar’s grandiose thinking was his action in moving the capital to Austin, a new village on the far western frontier. Exposed to Indian and Mexican attacks and difficult to reach, the new capital was a luxury that the republic could scarcely afford, but Lamar envisioned its future as the centrally located seat of a vast Texan empire. By the time Houston returned to office in 1841, the financial condition of the republic made annexation by the United States critically important. Texans almost unanimously desired annexation, but concerns about slavery effectively prevented American action. In 1844, though, pro-annexation candidate James K. Polk captured the Democratic presidential nomination. When Polk won the election, the outgoing president, John Tyler, viewed it as a mandate for annexation. Having previously failed to gain Senate approval for a treaty of annexation, Tyler resorted to the tactic of annexing Texas by means of a congressional joint resolution requiring only simple majorities in both houses of Congress. It succeeded, and Texas officially entered the Union on 29 December 1845. The new state retained ownership of its vast public domain; it


also retained its massive public debt. The new constitution reflected the strong Jacksonian political leanings of most Texans, creating a government with limited powers. The Republic had enjoyed considerable success on one front: In a decade the population had grown from about 40,000 to nearly 140,000. The Republic had made land available practically free to immigrants from the United States, and it also resurrected the empresario system to attract immigrants from the United States and Europe. In the last years of the Republic, some 10,000 colonists from Kentucky, Indiana, Illinois, and Ohio settled in the Peters colony in northeast Texas; about 7,000 Germans came to a grant in the Hill Country; and approximately 2,000 French Alsatians settled in Henri Castro’s colony southwest of San Antonio. These immigrants gave Texas a more ethnically diverse population than most other southern states.

Statehood, Disunion, and Reconstruction
Immigration notwithstanding, after annexation Texas drew closer to the states of the Deep South, primarily due to the growth of slavery and the cotton economy. The enslaved population grew from 38,753 in 1847 to 182,566 in 1860. Cotton production increased from 58,000 bales in 1849 to 431,000 bales in 1859. As part of the Compromise of 1850, Texas surrendered its claims to parts of what are now New Mexico, Colorado, and Wyoming (thus assuming its modern boundaries) in return for federal assumption of its public debt. Texas thus enjoyed its most prosperous decade of the nineteenth century. By 1860 Texas mirrored its fellow southern states economically and politically. Following Lincoln’s election and the secession of the Deep South states, the state legislature called a secession convention and, over the strong opposition of Governor Sam Houston, voted to secede from the Union. Texas voters ratified the convention’s decision by a three-to-one margin. About 60,000 Texans served the Confederacy, many of them in the eastern theatre of the war. Hood’s Brigade and Terry’s Rangers were among the better-known Texas units. On 19 June 1865, a date celebrated by black Texans as “Juneteenth,” Union occupation troops under Gen. Gordon Granger landed at Galveston and declared the state’s slaves free. Texas’ experiences in Reconstruction were typically southern. The state underwent Presidential Reconstruction in 1865 through 1866, resulting in the election of state and local governments dominated by former rebels, including Governor James Throckmorton, a former Confederate general. Black Codes returned African Americans to a condition of quasi-servitude. When Congress took over the Reconstruction process in 1867, black males were enfranchised, many former Confederate officeholders were removed (including Governor Throckmorton), and the Reconstruction process began anew. With African Americans voting, the Republican Party rose to power. The Republican Constitution

of 1869 gave the new governor, Edmund J. Davis, and the legislature sweeping new authority. Davis, a former judge who had lived in Texas since the 1840s, had served in the Union Army and championed the rights of blacks. His administration created a system of public education for children of both races; established a state police force to help protect the lives and property of all citizens; and worked to attract railroads to Texas using government subsidies. The measures galvanized the Democratic opposition, and in 1872 the Democrats recaptured the state legislature. In December 1873 the Democrat Richard Coke, a former Confederate officer, defeated Davis and “redeemed” Texas from Republican rule. The triumphant Democrats undid virtually all of the Republican programs, and in 1876 they ratified a new state constitution that returned the state to its Jacksonian, limited-government, white-supremacist roots.

Texas in the Gilded Age and the Progressive Era
The 1870s marked the beginning of the longest agricultural depression in the state’s history. Cotton prices declined steadily through the 1880s and 1890s; land prices and interest rates rose. By century’s end a majority of white farmers had joined African Americans in the ranks of tenants and sharecroppers, trapped in a vicious spiral of debt and dependence. In 1900 half of Texas farmers worked on rented farms. Railroads finally came to Texas. The Missouri, Kansas, and Texas Railroad connected Texas to northern markets in 1872; by 1882 the Texas and Pacific and the Southern Pacific gave Texas east-west transcontinental connections. But the transportation revolution had come at a heavy price: The legislature had lured rail companies to Texas by granting them 32 million acres of the public domain. One bright spot in the mostly bleak economic picture of the late nineteenth century was the growth of the cattle industry. The Spanish had first brought hardy longhorns to Texas in the 1700s. By the end of the Civil War millions of the animals roamed wild across the open grasslands south of San Antonio. Between 1866 and 1885, five million of these cattle were driven northward, first to Sedalia, Missouri, and later to a succession of railheads in Kansas. Thereafter the cattle industry declined precipitously. The arrival of railroads and the advance of the farming frontier ended the great overland cattle drives, confining cattle raising to ranches large and small. By this time, years of overgrazing had damaged the range and weakened herds. Then, in 1885 through 1886, two years of severe drought and an unprecedented blizzard killed thousands of cattle and drove many small operators out of business. Only the largest and most efficient ranches, such as the million-acre King Ranch in South Texas, survived. As the farmers’ depression deepened, complaints mounted against the established political parties, the railroads, and foreign capitalists. Many ordinary farmers


sought relief from self-help organizations such as the Patrons of Husbandry (popularly called the Grange) and the Farmers’ Alliance. In 1891 Alliancemen founded the People’s, or Populist, party. Between 1892 and 1896 the Populists competed vigorously with the Democrats, promising to rein in the monopolistic practices of railroads and large corporations, reform the nation’s monetary system, and provide affordable credit for struggling farmers. The rise of Populism spurred the state Democrats to embrace limited reforms such as a railroad commission, which became a reality under Governor James S. Hogg (1891–1895). But Populism required far more government action than most Texans could stomach, and the party’s willingness to appeal for African American votes further tainted it in the eyes of many whites. After 1896 Populism faded, but many of its ideas would resurface in progressivism and the New Deal. In the aftermath of Populism, the Democratic Party sponsored electoral “reforms” that largely disfranchised blacks. Foremost among these, the 1902 poll tax also effectively eliminated large numbers of poor whites from politics. Middle-class white Texans embraced certain progressive reforms, such as woman’s suffrage, prohibition, prison reform, and the commission plan of city government, but many elements of Texas progressivism were


aimed at limiting the influence of northern and foreign capital in the state’s economy. Changes in banking and insurance laws, designed to give Texas-owned companies competitive advantages, constituted much of what passed for progressivism in the state.

The Emergence of Modern Texas
The twentieth century began with two history-altering events. The first, a massive hurricane, devastated Galveston in September 1900, costing 6,000 lives in one of the worst natural disasters in U.S. history. But the other event ultimately overshadowed even that tragedy. On 10 January 1901 the greatest oil gusher in history blew in at Spindletop, near Beaumont. Texas immediately became the center of the world’s petroleum industry. Hundreds of new oil firms came into existence; some, like Texaco, became huge. Perhaps more important than the oil itself was the subsequent growth of the refining, pipeline, oil-tool, and petrochemical industries, which transformed the Gulf Coast into a manufacturing center, creating jobs and capital for investment. Growth of these industries, along with the discovery of massive new oil fields in east and west Texas, caused the Texas economy to modernize and begin diverging from the southern pattern of poverty and rurality.


As the economy modernized, however, Texas politics lagged behind. Governor James Ferguson, elected in 1914, three years later faced charges of corruption and suffered impeachment and a ban from future office holding. Undeterred, Ferguson ran his wife, Miriam, successfully twice, in 1924 and 1932, promising “two governors for the price of one.” Most historians consider the Fergusons demagogues and an embarrassment to the state, characterizations that likewise applied to Governor W. Lee “Pappy” O’Daniel, a Fort Worth flour merchant who was elected governor in 1938 on a platform based on “the Ten Commandments and the Golden Rule.” Progressive Democrats, such as the New Dealer James V. Allred (governor from 1935 to 1939), were rare in Texas. World War II transformed Texas. In 1940 a majority of Texans still lived in rural areas, and sharecroppers plowing cotton fields behind mules were still everyday sights. But the war drew hundreds of thousands of rural Texans into the military or into good-paying manufacturing jobs. By 1950 a majority of Texans lived in urban areas. Farms had mechanized and modernized. Much of this prosperity was due to federal spending, and for the first time the U.S. government was spending more in Texas than the state’s citizens paid in federal taxes. Texas cities, which had always been relatively small, began to grow rapidly. By 1960 Houston boasted a population of 938,219, followed by Dallas’s 679,684 and San Antonio’s 587,718. The Texas economy boomed in the 1970s, when world oil prices skyrocketed. The boom ended in 1983 and bottomed out in 1986. The oil “bust” plunged the state into a near-depression, as thousands of oil companies and financial institutions failed. Unemployment soared, and state tax revenues declined by 16 percent. But in the long run the crisis may have benefited the state, for it forced the economy to diversify and become less oil-dependent. In the 1990s Texas became a center of the “high-tech” revolution, with dramatic growth in electronics, communications, and health care–related industries. Population growth resumed. The 2000 census revealed that Houston, Dallas, and San Antonio had grown respectively to about 2 million, 1.2 million, and 1.1 million people. Even more dramatic was suburban growth; the greater Dallas–Fort Worth metropolitan area grew faster than any other large metropolitan area in the nation in the 1990s, with 5.2 million people by 2000, larger than 31 states. Overall, Texas passed New York to become the country’s second-largest state, with a population of nearly 21 million. Much of this growth was fueled by Hispanic immigrants, who made up 32 percent of the Texas population in 2000. As the economy modernized, so did Texas politics. The Civil Rights Movement enfranchised African Americans and Hispanics, who heavily favored liberal Democrats, including Texan Lyndon B. Johnson. This drove many conservative white voters into the Republican Party. In 1978, William P. Clements, Jr., became the first Re-

publican elected to the governorship since Reconstruction. Two other Texas Republicans, George H. W. Bush and his son, George W. Bush, claimed the nation’s highest office in 1988 and 2000, respectively. Democrats continued to dominate politics in the large cities, but at the state level the Republican revolution was completed in 1998, when Republicans held every statewide elective office. Texas, then, entered the twenty-first century very much in the mainstream of American life and culture. Texans continued to take pride in their state’s colorful history, and many non-Texans persisted in thinking of Texas as the land of cowboys and oil tycoons. But as a modern, diverse, urban, industrial state, Texas had become more like the rest of the nation and less like the rough-and-tumble frontier of its legendary past. BIBLIOGRAPHY

Barr, Alwyn. Reconstruction to Reform: Texas Politics, 1876–1906. Austin: University of Texas Press, 1971. Buenger, Walter L. Secession and the Union in Texas. Austin: University of Texas Press, 1984. Calvert, Robert A., Arnoldo De León, and Gregg Cantrell. The History of Texas. 3rd ed. Wheeling, Ill.: Harlan Davidson, 2002. Campbell, Randolph B. An Empire for Slavery: The Peculiar Institution in Texas, 1821–1865. Baton Rouge: Louisiana State University Press, 1989. Cantrell, Gregg. Stephen F. Austin, Empresario of Texas. New Haven, Conn.: Yale University Press, 1999. Chipman, Donald E. Spanish Texas, 1519–1821. Austin: University of Texas Press, 1992. Hogan, William R. The Texas Republic: A Social and Economic History. Norman: University of Oklahoma Press, 1946. Lack, Paul D. The Texas Revolutionary Experience: A Social and Political History, 1835–1836. College Station: Texas A&M University Press, 1992. Moneyhon, Carl H. Republicanism in Reconstruction Texas. Austin: University of Texas Press, 1980. Montejano, David. Anglos and Mexicans in the Making of Texas, 1836–1986. Austin: University of Texas Press, 1987. Smith, F. Todd. The Caddo Indians: Tribes at the Convergence of Empires, 1542–1854. College Station: Texas A&M University Press, 1995. Spratt, John S. The Road to Spindletop: Economic Change in Texas, 1875–1901. Dallas, Tex.: Southern Methodist University Press, 1955.

Gregg Cantrell See also Alamo, Siege of the; Dallas; El Paso; Explorations and Expeditions, Spanish; Fort Worth; Galveston; Houston; Mexican-American War; “Remember the Alamo”; and vol. 9: Memories of the North American Invasion; Mexican Minister of War’s Reply to Manuel de la Peña y Peña; Message on the War with Mexico; The Story of Enrique Esparza.


TEXAS NAVY. The southwestern borderlands were a serious barrier in 1836 to Mexico’s attempts to crush the Texas revolution. Although Mexican President Antonio López de Santa Anna’s advisers warned him to establish a Mexican Gulf fleet to protect the flow of seaborne military supplies along the coast before launching an overland campaign, Santa Anna refused to wait. In the meantime, the Texans, with only four small armed ships, seized control of the Gulf and disrupted Mexican supply routes throughout the war. By the summer of 1837, however, Mexico had blockaded Texas and many residents feared a sea invasion. In 1838, France’s navy fortuitously blockaded Mexico and destroyed its fleet. Alarmed by French withdrawal in 1839, President Mirabeau B. Lamar committed Texas to a naval program. By 1840, the new fleet consisted of an eleven-gun steamer, a twenty-two-gun flagship, and five smaller but effective men-of-war. The collapse of Texan James Treat’s peace negotiations with Mexico caused Lamar to enter into a de facto alliance with the state of Yucatán, then fighting for independence from the Mexican union. As allies of Yucatán, the Texas navy captured Tabasco and, as late as the spring of 1843, fought engagements with new Mexican steam warships built and commanded by the British. The Texas fleet kept Mexico busy and saved the young republic from re-invasion. By 1843, United States annexation was close at hand, and the president of Texas, Sam Houston, recalled the navy because he believed it was too expensive and was jeopardizing his diplomacy. After annexation, the remaining ships in the Texas navy became the property of the U.S. government. BIBLIOGRAPHY

Francaviglia, Richard V. From Sail to Steam: Four Centuries of Texas Maritime History, 1500–1900. Austin: University of Texas Press, 1998. Hill, Jim D. The Texas Navy: In Forgotten Battles and Shirtsleeve Diplomacy. Austin, Tex.: State House Press, 1987. Montejano, David. Anglos and Mexicans in the Making of Texas, 1836–1986. Austin: University of Texas Press, 1987.

Jim Dan Hill / e. m. See also Annexation of Territory; Armored Ships; Mexican-American War; Mexico, Relations with.

TEXAS PUBLIC LANDS. The terms of annexation agreed between the Republic of Texas and the United States in 1845 made Texas the only state aside from the original thirteen colonies to enter the Union with control over its public lands. The state has since disposed of these lands in various ways. It sold land to settlers through various preemption acts and granted land as compensation for war service, bonuses for construction of railroads and other public works, payment for the construction of the state capitol, and support for education. By the Compro-


mise of 1850, Texas also ceded claims to lands that now lie in other states. At the end of the nineteenth century, Texas had no unappropriated public lands left. BIBLIOGRAPHY

Miller, Thomas L. The Public Lands of Texas, 1519–1970. Norman: University of Oklahoma Press, 1972. Morgan, Andrea Gurasich. Land: A History of the Texas General Land Office. Austin: Texas General Land Office, 1992.

W. P. Ratchford / c. p. See also Land Grants: Overview.

TEXAS RANGERS. In 1823 Stephen F. Austin hired ten men he called “rangers” to conduct a raid against the Indians. On 24 November 1835 the Texas legislature created a police force of three companies, fifty-six men each, known as Texas Rangers. Their numbers and reputation rose and fell, influenced by threats to the Texas Republic and governmental economy. Organized along military lines, the rangers had no uniforms in the nineteenth century. Later they began to wear suits with the ubiquitous cowboy hat. Rangers served in the Texas Revolution as scouts, but their numbers remained small. In December 1838 Mirabeau B. Lamar, president of the Republic, added eight companies. Until the Mexican-American War the rangers were Indian fighters. During his second presidency of Texas, Sam Houston used 150 rangers under the command of Captain John Coffee Hays to protect the frontier from Indian raids, and the rangers gained a reputation for toughness and dedication to duty. After Texas became a state, from 1848 to 1858, the rangers had no official duties since the United States controlled the border and the frontier. In January 1858 Senior Captain John S. “Rip” Ford led attacks on Indians from the Red River to Brownsville. During the Civil War and Reconstruction the rangers contributed little to law and order, but subsequently they pacified the border with Mexico and stopped various feuds in the state. Between 1890 and 1920 the state legislature dramatically reduced the number of rangers. The Mexican Revolution changed the situation. Reacting to Pancho Villa’s raid on Columbus, New Mexico, rangers killed approximately five thousand Hispanics from 1914 to 1919. Shocked, the state legislature set new standards of recruitment and professionalism. In the 1920s the rangers dealt with riots, labor strikes, the Ku Klux Klan, and oil strikes. The Great Depression marked a low point in the organization’s history. Because the rangers supported her opponent in the Democratic primary, Miriam A. “Ma” Ferguson fired all forty-four rangers. The new force was only thirty-two men. In 1935 legislators created the Texas Department of Public Safety and administratively combined the rangers, the highway patrol, and a state crime lab. The five com-
panies of rangers were restored, and qualifying examinations and behavioral standards were instituted. Between 1938 and 1968 Colonel Homer Garrison Jr. shifted the rangers’ focus to detective work. During that time, in response to World War II, fears of sabotage, the civil rights movement, and urbanization, the number of Rangers increased. After 1968 the rangers worked closely with local police and improved their recruitment, training, and scientific methods. By 1993 the ninety-nine officers included two women, and by 1996 Texas had 105 rangers. BIBLIOGRAPHY

Gillett, James B. Six Years with the Texas Rangers, 1875–1881. New Haven, Conn.: Yale University Press, 1925. A classic autobiography.
Procter, Ben. Just One Riot: Episodes of the Texas Rangers in the Twentieth Century. Austin, Tex.: Eakin Press, 1991. A brief scholarly evaluation.
Webb, Walter Prescott. The Texas Rangers: A Century of Frontier Defense. 2d ed. Austin: University of Texas Press, 1965. The major source for understanding the Rangers’ behavior and their resulting reputation.

Donald K. Pickens See also Texas.

TEXAS V. WHITE, 7 Wallace 700 (1869), was an attempt by the Reconstruction governor of Texas to prevent payment on federal bonds disposed of by the secessionist state government in payment of supplies for the Confederacy. The Supreme Court acknowledged the governor’s competence to sue on the ground that Texas was now, and had never ceased to be, a member of “an indestructible Union”; hence the ordinance of secession was void. But the Court denied the power of the secessionist government to dispose of state property for purposes of rebellion. The decision was overruled in 1885 in Morgan v. United States. BIBLIOGRAPHY

Hyman, Harold M. The Reconstruction Justice of Salmon P. Chase: In re Turner and Texas v. White. Lawrence: University Press of Kansas, 1997.
Hyman, Harold M., and William M. Wiecek. Equal Justice under Law: Constitutional Development, 1835–1875. New York: Harper and Row, 1982.

Harvey Wish / a. r. See also Civil War; Confederate States of America.

TEXTBOOKS constitute the de facto curriculum in many disciplines. Especially at the secondary level, where 85 percent of the nation’s students take courses before graduation, American history is a controversial area because of disputes over content and interpretation. U.S.
history texts include the study of continental geography, political history, economic development, social history, and diverse cultures. Private corporations provide textbooks to state and local governments for a profit, an arrangement that differs from that prevailing in most industrialized countries, where the national government creates the curriculum and publishes textbooks. The total domestic market for instructional materials was an estimated $5 billion in 1992, of which more than $2 billion represented elementary and high school materials. Because the public-school systems of Texas and California buy so many textbooks, many corporations tailor the contents of their publications to meet the interests and needs of schools in those two states. Since 1970 there have been considerable changes in textbooks, especially in U.S. history and social studies because of the influence of social history, revisionism, and multiculturalism on curriculum composition. Publishers expended considerable effort to make texts redress earlier omissions. Nevertheless, the state-level controversies of the late 1980s and early 1990s in California and New York showed that textbook publishers remained beset by the demands of special-interest groups, including ethnic activists, feminists, the disabled, environmentalists, homosexuals, and religious groups, all of whom desire favorable and prominent treatment. Such pressures make it difficult for publishers to balance academic integrity against market requirements. Several federal court cases in the 1980s reflect the perennial disputes over textbook censorship, content, and interpretation. Challenges have arisen over biology, health, literature, and history texts. Three significant federal cases originated in local complaints that textbooks promote secular humanism (Smith v. Board of School Commissioners of Mobile County, 1986), atheism (Mozert v. Hawkins County Public Schools, 1987), and the theory of evolution (Aguillard v. Edwards, 1987). Textbooks remain useful and efficient devices for learning in all formal subjects, offering organized, convenient sequences of ideas and information for structured teaching and learning. In the 1990s schools at all levels began to experiment with CD-ROMs and other video technologies as curriculum supplements. The classroom use of CD-ROM reference works, electronic atlases, and on-line databases continues to grow, but it is far from certain that such media will supplant textbooks.

BIBLIOGRAPHY

Altbach, Philip G., Gail P. Kelly, Hugh G. Petrie, and Lois Weiss, eds. Textbooks in American Society: Politics, Policy, and Pedagogy. Albany: State University of New York Press, 1991.
Apple, Michael W., and Linda K. Christian-Smith, eds. The Politics of the Textbook. New York: Routledge, 1991.
DelFattore, Joan. What Johnny Shouldn’t Read: Book Censorship in America. New Haven, Conn.: Yale University Press, 1992.
Jenkinson, Edward B. Censors in the Classroom: The Mind Benders. Carbondale: Southern Illinois University Press, 1979.

Gilbert T. Sewall / a. e. See also American Legion; Education; Educational Technology; McGuffey’s Readers; New England Primer; Publishing Industry; School, District.

TEXTBOOKS, EARLY. Bibles, almanacs, embroidered samplers, and broadsheets were the most common textual materials in most colonial homes. Children used hornbooks to learn to read short phrases and proverbs. A hornbook consisted of a wooden paddle holding a piece of printed text that was covered with a layer of transparent cow’s horn to protect the text. As schools proliferated in New England, most used a version of The New England Primer, copied from English texts, and most schoolbooks were imported from England. After the Revolution, the schoolteacher Noah Webster lobbied for copyright legislation to protect his book, A Grammatical Institute of the English Language, later renamed The American Spelling Book, which he began marketing in 1783. He supplemented the speller with a grammar (1784) and a reader (1785), and by 1804, more than 1.5 million copies of his books had been sold. Webster’s books met the new nation’s need for a distinctly American product. He standardized American English spelling and grammar, and his books emphasized nationalism and patriotism. By the time Webster died in 1843, 24 million copies of his books had been sold. Schoolbooks were a popular product as the nation expanded and public schools were established. In 1840 various publishers sold 2.6 million schoolbooks. In 1837, William McGuffey’s Eclectic Reader was published, directed at the burgeoning western market. Truman and Smith Publishing Company in Cincinnati, Ohio, offered the job of compiling reading selections for four graded readers to Catharine Beecher, who had authored other texts, as well as coauthoring Primary Geography for Children with her sister Harriet Beecher Stowe. Beecher was too busy establishing the Western Female Institute in Cincinnati, and recommended McGuffey, an experienced educator. McGuffey gathered previously published pieces for the first edition and did little actual work on later editions. The McGuffey readers were revised numerous times, with all new material at three different points. Major editions were published in 1836 (7 million copies sold), 1857 (40 million sold), 1879 (60 million sold), and 1890–1920 (15 million sold). As the century wore on, schoolbooks made fewer references to religion and more to honesty and self-reliance. Charity to others was extolled, as well as respect for authority. Illustrations grew more important as printing technology became more sophisticated, and by the 1880s the books were heavily illustrated, usually showing children and animals in idealized pastoral or natural settings.


McGuffey’s First Reader. This woodcut illustrates the first lesson in the 1836 edition of the enormously popular and long-lived series. © Bettmann/Corbis

Rural organizations such as the Farmers’ Alliance and National Grange began challenging the reliance on textbooks. The Grange lobbied for more vocational training, practical knowledge, and science, and less rote memorization. Grange-sponsored schools were established in southern states, Michigan, and California. The Grange advocated free textbooks for children and urged states to buy books in bulk to save money. In 1890 the Farmers’ Alliance charged textbook publishers with creating a “Textbook Trust,” claiming the American Book Company (publisher of the McGuffey books) controlled the market and prices. Schoolbook publishers responded to local critics because they were subject to community approval; high school and college texts were not. By the end of the century, John Dewey, author of School and Society (1899), led progressive educational reforms, urging hands-on learning rather than complete reliance on texts.

TEXTILES

BIBLIOGRAPHY

Apple, Michael W., and Linda K. Christian-Smith, eds. The Politics of the Textbook. New York: Routledge, 1991.
Tanner, Daniel, and Laurel Tanner. History of the School Curriculum. New York: Macmillan, 1990.

Laurie Winn Carlson See also Hornbook; McGuffey’s Readers; New England Primer; Webster’s Blue-Backed Speller.

TEXTILES. Textile production played a crucial part in the American industrial revolution, the establishment of organized labor, and the technological development of this country. Once, textile production was simple enough that the entire process could and did take place in the home. Now, textiles represent a complex network of interrelated industries that produce fiber, spin yarns, fabricate cloth, and dye, finish, print, and manufacture goods.

Products and Services About 35 percent of U.S. manufactured cloth is intended for apparel, 16 percent for home furnishings, and 24 percent for floor coverings. The remaining 25 percent is used in industrial textiles, which include sports equipment, conveyer belts, filtration materials, and agricultural and construction materials. So-called geotextiles are used for earth stabilization and drainage as well as reinforcement in roads and bridges. The aerospace industry uses industrial textiles in the nose cones of space shuttles, and medicine uses textiles as artificial arteries and dissolving stitches. Fiber Producers Until the early twentieth century, all textiles were derived from plants or animals. The invention of a process for regenerating cellulose from wood chips and cotton linters into a usable fiber marked the beginning of research, development, and innovation. Many of today’s textile producers started as chemical companies.

Woolen Mill. Female workers examine lengths of fabric and mark any imperfections at this Boston textile mill, 1912. © Corbis


Producers of natural fibers are dependent on raw materials and often held hostage to nature. It is not easy for them to quickly increase or decrease output based on consumer demand. Most producers sell their fiber to mills or wholesalers for resale and seldom have any direct involvement after the fiber is sold. Trade organizations like Cotton Incorporated and the American Wool Council have been established to support producers by providing educational materials, helping with public relations, and assisting with advertising. Manufactured fibers can be made from regenerated natural materials, or they can be synthesized from chemicals. Because many of these processes may be petroleumbased, such producers may be affected by events concerning the oil industry. The American Fiber Manufacturers Association is the primary association for the manufactured fiber industry. Manufactured fibers can be sold as unbranded fiber, where the fiber producer has no further involvement; trademarked fiber, where the fiber producer has some control over the quality of the fabric; or licensed trademarked fiber, where the fiber producer sets standards that must be met by the fabric manufacturer. An advantage of trademarked or licensed trademarked fiber is that the fabric manufacturers and, ultimately, the garment manufacturers, can capitalize on advertising and brand recognition. Origins in America The American colonies were viewed as rich deposits of natural resources for Europe, and the colonists were considered as a consumer pool. Because Holland and France were producing their own wool, England was forced to look west for a new market. England encouraged the culture of flax, hemp, and silk in the colonies, but only if it aided English industries. Though the colonists were capable of producing cloth through spinning and weaving, they found no real necessity to do so as long as cloth could be imported. Problems arose in the Massachusetts colony when the French captured supply ships. The lack of sufficient warm clothing in an inhospitable climate created great hardship in the northern settlements. The Massachusetts colony recognized the need to be as self-sufficient as possible. It encouraged the development of raw materials and the manufacture of wool and linen cloth. A bounty was offered to weavers as inducement, and the coarse linen they produced was the first officially recorded American-produced textile. In 1638, twenty families arrived in Massachusetts from Yorkshire, a wool-producing district in England. Five years later, they began the manufacture of cloth, establishing the textile industry in America. Although they worked primarily in wool, they also spun and wove flax and cotton. The mill they established continued in production into the nineteenth century. With increasing concern over the availability of goods, in 1645 the Massachusetts colony instructed the public to preserve and increase their flocks of sheep, make woolen cloth, and
advise friends and family still in England to emigrate and bring as many sheep with them as possible. By the beginning of the eighteenth century, there were a quarter of a million colonists. Textile production had become important enough to pose a threat to English merchants and manufacturers. The English enacted restrictions that detailed what goods could be exported to the colonies and by whom, and what items could be exported from the colonies and where. This only served to instill a greater sense of defiance among the colonists. George Washington was a great supporter of homespun American cloth and maintained a weaving house on his Mount Vernon estate, as did Thomas Jefferson at Monticello. Imported textiles became very unpopular, especially after the 1765 Stamp Act. England retaliated for colonial disobedience by disallowing the exportation of any textile goods, machinery, or equipment to the colonies. The American army suffered terribly during the Revolution because of lack of proper clothing. The freedom won by the former colonists allowed the textile industry to develop. Industry Pioneers George Cabot founded the first integrated American textile mill in Beverly, Massachusetts, in 1787. His mill handcarded fiber, spun yarn, and wove cloth, all under one roof. The company produced a variety of cotton fabrics until the early 1800s. Samuel Slater may be considered the father of the American industrial revolution. English by birth, he trained for seven years in a textile mill, and left England in 1789 at age twenty-one. Settling in Rhode Island, he built the first successful water-powered spinning mill in Pawtucket in 1793. Francis Cabot Lowell, nephew of George Cabot, visited English textile mills and committed the workings of the power loom to memory. Upon his return, he worked with the inventor Paul Moody at Waltham, Massachusetts, to develop the first American power loom. George Corliss contributed to steam engine design and succeeded in making Providence, Rhode Island, the center of steam engine manufacture in the 1850s. First used as a source of alternate power during the dry season, steam slowly replaced water as an energy source. It allowed a mill owner to build in a populous area without regard for waterpower. How the Industry Developed Cloth production is a two-part process: spinning fiber into yarn, and weaving yarn into cloth. A mechanized spinning frame was invented in England in 1764 that could spin eight spools of yarn at once. Within a few years, it was improved to spin 100 spools simultaneously. Richard Arkwright improved upon the original design so that all steps occurred in one machine. It was in the factory of his partner, Jedediah Strutt, that Samuel Slater was trained. Slater opened Slater Mill in 1793 with money from Providence investors. His organizational methods


Spinning Jenny. A 1765 engraving of James Hargreaves’s revolutionary new invention, the mechanized spinning frame. © Corbis

became the blueprint for successors in the Blackstone River Valley. Based on mills smaller than those used in Massachusetts, his plan was ideal for small rural mill villages. Seven more mills opened by 1800, and there were 213 by 1815. The mills flourished in areas where the rocky terrain made farming unsuitable. The year after Slater opened his mill, Eli Whitney patented a machine that would lead to the revival of the declining practice of slavery and ultimately contribute to the causes of the Civil War. In 1790, there were 657,000 slaves in the southern states. In 1793, 187,000 pounds of cotton was harvested. Because one slave was able to clean only one pound of cotton fiber per day, the crop hardly was worth the trouble. Whitney’s cotton gin, however, could process fifty pounds a day, enabling the harvest to grow to six million pounds in 1795. The business of slavery grew as well, so that in 1810 there were 1.3 million slaves and 93 million pounds of cotton harvested. Cotton became the largest U.S. export and textiles the most important industry before the Civil War. Weavers could not keep up with the abundance of yarn being produced by the mechanized mills. This problem was solved when Francis Cabot Lowell and Paul Moody created their more efficient power loom and spinning apparatus in 1813 in Lowell’s Waltham mill. With a dependable loom, weaving could now keep apace of spinning. Soon mills began to dot the rivers of New England. The fully integrated mill marked the shift from a rural, agrarian society to a manufacturing economy. Shortly after his death, Lowell’s associates began to develop an area north of Boston where the Merrimack River and Paw-
tucket Falls had the waterpower to operate dozens of mills. Named for Lowell, the planned community was set up in 1823 and incorporated in 1826. By 1850 almost six miles of canals flowed through Lowell, drove the waterwheels of 40 mill buildings, and powered 320,000 spindles and almost 10,000 looms, operated by more than 10,000 workers. The period from 1820 to 1860 saw the rapid development of many more factories. New England became the nation’s textile center. In 1825, there were 16,000 mills in Maine, New Hampshire, Vermont, and New York. By 1850, there were 60,000 mills in the United States. New England alone had 896 power-driven mills, almost 500 of which were in northern Massachusetts, patterned after Lowell’s Waltham mill. Virtually all mills were fully mechanized by the early part of the nineteenth century. Initially powered by water, the mills eventually switched to steam, then electricity. By 1910, the Lowell mills were using hydroelectricity. The Civil War dramatically changed production. The cotton harvest shrunk to 200,000 bales in 1864, and after the war the western states began producing cotton. The South was faced with the need to reinvent itself and began to build spinning and weaving mills. Its lower wages, lower rate of unionization, and openness to new technology induced many northern mills to relocate southward in the years between the world wars. Chemistry began to play an important part in the textile industry in the mid-nineteenth century when synthetic dyes were discovered. These were followed in 1891


LOWELL MILL GIRLS Beginning in 1823, girls from farms and local villages were recruited to work in the Lowell mills for a few years before they left for marriage or other reasons. Most were between fifteen and thirty years old and worked an average of three years. They lived in dormitories and boarding houses with strict rules of curfew and moral conduct. In 1834, 800 young female mill workers went on strike to protest wage cuts, claiming the cuts threatened their economic independence. The Lowell Female Labor Reform Association was formed in 1844, the first organization of working women to try to bargain collectively for better conditions and higher pay. The economic downturn of the 1850s led to lower pay and longer hours, and as a result, immigrant Irish women replaced American farm girls. In the late nineteenth century, women held nearly two-thirds of all textile jobs in Lowell.

by the development of regenerated cellulose, the first manmade fiber. The first plant for manufacturing “artificial silk” in America opened in 1910. Later named rayon (1924), the fabric was followed by acetate and triacetate, also cellulose derivatives. Chemical companies set up research and development labs in the race to find new fibers. DuPont established an experimental lab for the purpose of pure scientific research in 1928. Directed by Dr. Wallace Hume Carothers, the lab conducted work on polyesters but abandoned the project to pursue what would become known as nylon. After several years of development, the fiber was presented to consumers in the form of women’s stockings. In 1940, when they became available to the general public, nylon stockings earned more than $3 million in profit in seven months, completely covering the cost of research and development. Nylon stockings ceased production during World War II when nylon was needed for parachutes, ropes, and tents. British scientists picked up Carothers’s work on giant molecules and further developed polyesters. DuPont bought the appropriate patent and opened the first U.S. plant to produce Dacron polyester in 1953. Subsequent developments include manufactured fibers for protection, high performance, durability, strength, and ease of care. Other important chemical contributions are finishes on traditional fabrics for wrinkle resistance, shrinkage control, and color fastness. Technological developments include computer-aided design (CAD) and computer-aided manufacture (CAM). CAD equipment is used in the design of yarns and fabrics and the development of coloration. Prints can easily be manipulated, and designs can be
reconfigured in seconds. CAM is used for designing factory layouts and in textile production processes like the control of looms and robotics. Computers are invaluable in communications and for tracking inventory. Concern for the impact of manufacturing on the environment led to the development of so-called environmentally improved textile products. One such product is lyocell, regenerated cellulose produced using a nontoxic solvent. Organic cotton and naturally colored cottons are being cultivated, and natural dyes have sparked interest. Attention is also being given to recycling materials such as old carpets as well as other used textile products into new materials. Plastic soda bottles are being processed into fiberfill, polar fleece, and geotextiles. Statistics By the end of the twentieth century, there were approximately 75,000 woolgrowers in the United States, active in almost every state, and 35,000 cotton growers, mainly in the South. Textiles were also being manufactured in almost all states, with the largest concentrations in Georgia, North Carolina, and South Carolina. According to the U.S. Department of Commerce and the Bureau of Labor Statistics there were 5,117 companies, with 6,134 plants, in 1997. The companies employed 541,000 workers in 2000, but within a few years 177,000 jobs had been lost and more than 215 mills had closed. Though the industry income was $57.8 billion in 2000, shipments and exports soon dropped as the strength of the U.S. dollar against faltering Asian economies allowed for a surge of inexpensive imported textiles and clothing. Changes in Business and Commerce The textile industry has undergone significant changes in business practices in several key areas. Labor relations, trade practices, product labeling, product safety, and environmental and antipollution measures have been subjects of public scrutiny and federal legislation. Employee and Labor Practices Once farmers gave up rural self-sufficiency, they had to adapt to a mill whistle rather than the rhythm of nature. Life was difficult and unhealthy with long hours and poor conditions. Respiratory disease was common and there was always the danger of losing a limb in the machinery. The mills were cold and drafty in the winter and stifling in the summer, as well as dirty and noisy. Physical abuse occurred and it was not uncommon for mill owners to take advantage of workers. When labor was scarce, conditions improved, but conditions declined again when more workers became available. Samuel Slater developed a management style that became known as the Rhode Island system. He hired entire families, who often lived in company housing, shopped in the company store, and attended company schools and churches. It was a clever means of control because bad behavior on one worker’s part could get the entire family
fired. Work was ten to twelve hours a day, six days a week. Sunday was for church and for children to learn basic reading, writing, and arithmetic. Though the mill complex did provide a measure of convenience for the workers, it was actually a way for the owner and investors to regulate every aspect of the workers’ lives. Paid by the mill owner, teachers and ministers preached the party line. By 1830, 55 percent of Rhode Island mill workers were children earning less than $1 a week. Children on farms worked equally long hours, and so for poor families, millwork was seen as an improvement. Textile machines lent themselves to child labor because they were simple enough for unskilled children to operate under adult supervision. By 1900, 92 percent of southern textile workers lived in mill villages. By 1908, fewer than 7 percent had a living situation with anything more than a simple privy. Some villages had a rule that a family had to have one employee for each room in the house, further ensuring child entry into the workforce. School was discouraged so that children would have no option but to enter mill life. Schools were free to seventh grade, then charged tuition after that. Between 1880 and 1910 about one-fourth of southern cotton mill workers were under sixteen, having entered the mills full-time by age twelve. The Fair Labor Standards Act of 1938 finally regulated child labor. In the 1890s, the National Union of Textile Workers held meetings throughout the Carolina Piedmont, organizing ninety-five locals by 1900. Unions continued to organize workers and in 1929 a wave of strikes began in Elizabethton, Tennessee. Thousands of mill workers walked out and stayed out three months even in the face of intimidation and the murder of Ella May Wiggins, organizer of the Gastonia, North Carolina, strike. Though hunger forced the workers back with only minor concessions from the owners, the stage was set for later protest. In an effort to stimulate recovery from the 1929 stock market crash and the depression that followed, President Franklin D. Roosevelt signed the National Industrial Recovery Act (NIRA) into law in 1933. Under NIRA, a Cotton Textile Board was established to enforce a code of fair competition in the industry, limit destructive price competition, prevent overproduction, and guarantee mill hands a minimum wage. Unfortunately, the Board was controlled by mill owners, who used the minimum wage as the maximum and laid off even more workers. The 1934 General Textile Strike led to the eventual abandonment of the mill village system. Twenty thousand Alabama workers walked out, demanding a minimum of $12 for a thirty-hour week and reinstatement of fired union members. The unrest spread, and when the United Textile Workers (UTW) called for a general strike an estimated 400,000 workers walked out, making it the largest labor conflict in American history. The governors of South Carolina, North Carolina, and Georgia called out
the militias and the national guards to support the mill owners. Financial need forced workers back and the UTW called off the strike three weeks later. Many workers were fired and blacklisted. In the early 1960s, African Americans made up fewer than 2 percent of textile industry employees. Although the industry was very competitive and most jobs were largely unskilled, it chose to overlook this source of labor. Integration occurred through the enforcement of the federal Civil Rights Act of 1964. Prospects In the 1980s, half a million jobs moved overseas in the search for cheap labor, and in the next decades jobs continued to be lost and mills shut down. Legislative efforts have been made to protect the American textile industry, which will also need continuing innovation and technological advances in order to survive. BIBLIOGRAPHY

Collier, Billie J., and Phyllis G. Tortora. Understanding Textiles, 6th ed. Upper Saddle River, N.J.: Prentice Hall, 2001.
Hall, Jacquelyn Dowd, et al. Like a Family: The Making of a Southern Cotton Mill World. Chapel Hill: University of North Carolina Press, 1987.
Harris, J., ed. Textiles, 5000 Years: An International History and Illustrated Survey. New York: Abrams, 1993.
Kinnane, Adrian. DuPont: From the Banks of the Brandywine to Miracles of Science. Wilmington, Del.: DuPont, 2002.
Little, Frances. Early American Textiles. New York: Century Co., 1931.
Minchin, Timothy J. Hiring the Black Worker: The Racial Integration of the Southern Textile Industry, 1960–1980. Chapel Hill: University of North Carolina Press, 1999.
Tortora, Phyllis G., and Robert S. Merkel, eds. Fairchild’s Dictionary of Textiles, 7th ed. New York: Fairchild, 1996.

Christina Lindholm See also Industrial Revolution; Labor; Labor Legislation and Administration; Mill Streams; Slavery; Strikes; United Textile Workers; and vol. 9: Mill Worker’s Letter on Hardships in the Textile Mills.

THAMES, BATTLE OF THE. The American effort to reclaim the upper Great Lakes, lost to the British in August 1812, was led by Gen. William Henry Harrison, who established Fort Meigs (above Toledo) as an advance base, and Capt. Oliver Hazard Perry, who built the fleet that, on 10 September, won the Battle of Lake Erie. Harrison’s troops, convoyed by Perry’s fleet, pursued British Gen. Henry A. Procter’s forces into the interior of Ontario. The Americans engaged and defeated Procter and his Indian allies, led by Tecumseh, a few miles east of Thamesville on 5 October 1813. Harrison’s victory added to the future president’s reputation as a military hero and restored American dominance in the Northwest.


BIBLIOGRAPHY

Morison, Samuel E. “Old Bruin”: Commodore Matthew C. Perry, 1794–1858. Boston: Little, Brown, 1967.
Peterson, Norma Lois. The Presidencies of William Henry Harrison and John Tyler. Lawrence: University Press of Kansas, 1989.
Skaggs, David Curtis. A Signal Victory: The Lake Erie Campaign, 1812–1813. Annapolis, Md.: Naval Institute Press, 1997.
Sugden, John. Tecumseh’s Last Stand. Norman: University of Oklahoma Press, 1985.
———. Tecumseh: A Life. New York: Holt, 1998.

M. M. Quaife / a. r. See also “Don’t Give Up the Ship”; Ghent, Treaty of; Great Lakes Naval Campaigns of 1812; Lake Erie, Battle of; Tecumseh’s Crusade; Tippecanoe, Battle of; War of 1812.

THANKSGIVING DAY. Thanksgiving Day, a national holiday imitated only by Canadians, was first established as an annual event by Abraham Lincoln in a proclamation dated 3 October 1863. Expressing hope amidst the continuing Civil War, it was a response to the campaign of Sarah Josepha Hale, editor of Godey’s Lady’s Book, to nationalize an autumn festival already observed by most of the states. Sporadic days of thanksgiving had been previously appointed by national leaders, such as those honoring military victories during the American Revolution, the Whiskey Rebellion, and the War of 1812, and one by George Washington to celebrate the new Constitution on 26 November 1789. The origin of the holiday is rooted in New England practices of prayer and feasting, most symbolically enacted by the three-day harvest celebration in 1621 between the Pilgrim settlers of Plymouth Colony and ninety Wampanoag, an event briefly mentioned in the histories written by Plymouth governors William Bradford and Edward Winslow. This First Thanksgiving has been widely promoted since the late nineteenth century as a source of national origins. The types of public events during Thanksgiving have changed over time and have included church services, shooting matches, and—in nineteenth-century cities—military parades, masquerades, child begging, and charity banquets. Persisting public activities include games between football rivals (beginning in 1876) and spectacular commercially sponsored parades, such as the Macy’s parade in New York City starting in 1924. President Franklin Delano Roosevelt changed the traditional observance from the last to the penultimate Thursday in 1939 (a year when November had five Thursdays) to extend the holiday shopping season. The controversy surrounding the alteration, however, led to a congressional resolution in 1941 that fixed the official holiday as the fourth Thursday in November. The heavy volume of travel over the four-day weekend originated in the nineteenth-century tradition of homecoming, when urban residents returned to celebrate their rural roots and feast on native foods such as turkey (which is such a central symbol that the holiday is sometimes called Turkey Day).

BIBLIOGRAPHY

Appelbaum, Diana Karter. Thanksgiving: An American Holiday, An American History. New York: Facts On File, 1984.
Myers, Robert J. Celebrations: The Complete Book of American Holidays. Garden City, N.Y.: Doubleday, 1972.
Pleck, Elizabeth. “The Making of the Domestic Occasion: The History of Thanksgiving in the United States.” Journal of Social History 32 (1999): 773–789.

Timothy Marr See also Holidays and Festivals.

THEATER in America started as ritual performance by Native Americans and then, upon the arrival of the first white, Spanish settlers, became another sort of ritual, based on medieval European Christian morality plays. For many years, theater was outlawed in Colonial America, although the proscription hardly called a halt to performances. As everywhere, theater ranged between high and low: early “high” theater attempted to duplicate what was going on in Europe and included rewritten (“improved”) Shakespeare and other, mostly British dramas, including School for Scandal by Richard Brinsley Sheridan. “Low” theater included riverboat shows, vaudeville, minstrel shows, and Wild West shows. It was not until the late eighteenth century that an authentic “American” voice began to emerge in the theater. This voice continued to develop throughout the nineteenth century and found itself being embraced on the world stage during the twentieth century. Early American Theater While there are no records of the earliest Native American performances, Indian rituals were noted by the early white settlers. Native Americans performed most of their theatrical pieces in honor of various gods or to celebrate changes in seasons, harvests, hunts, battles, and so on. Among the many performances were the summer and winter rituals of the Pueblo Indians. Pueblo dramas included the Deer Dance, Buffalo Dance, Corn Dance, Raingod Dance, and the Eagle Dance. Variations on Native American performance were later played out many times with white settlers in rituals and ceremonies focused around treaties and other meetings. These dramas included gift giving, dances, and speeches. Later, Indians— and cowboys—became stock characters in performances ranging from melodramas to vaudeville. In “Wild West” shows of the nineteenth century, Indian rituals were recreated for white audiences in the eastern United States and in Europe. The first recorded white colonial performances were morality plays performed by missionaries for Spanish soldiers in Florida in 1567. These plays were intended to show the supremacy of the Spaniards’ religion and its ul-
timate triumph in the New World. Although no record of the actual play exists, it can be assumed that it took the stylized and ritualistic form of medieval drama. In Colonial days, theater was looked down upon by many of the Puritanical white settlers, so it was not until 1665 that the first play performed in English was recorded. Ye Bare and Ye Cub was performed by three men in Accomack County, Virginia. Apparently someone was offended by the offering, or simply by the idea of theater, because the players were sued. After the play was performed in court, the performers were found “not guilty of fault.” Quakers were especially opposed to theatrical performances and had laws passed against them in most of the colonies, beginning with William Penn’s in Pennsylvania. Proscriptions against theater were not passed in Virginia, and that is likely why it became the home of the first professional American theater, the Company of Comedians, led by entrepreneur Lewis Hallam. Hallam’s troupe of provincial players arrived from England in 1752. Like most of the companies to follow, the Company of Comedians was run by an actor/manager. After performing Shakespeare in Williamsburg, Virginia, Hallam built the first theater in New York City in 1753 and in Charleston in 1754. Hallam’s fare also included such English staples as Restoration drama, farce, and operetta. His company played Philadelphia and toured the South and eventually moved to Jamaica, where Hallam died. While in Jamaica, Hallam’s wife married another theater producer, David Douglass, who had founded theaters in Philadelphia and New York. Under Douglass, the company moved back to the States, calling itself the American Company. Hallam’s son, Lewis Hallam the Younger, often performed opposite his mother and proved to be a talented comic. In 1767, Hallam played the lead in the first professional American drama, Thomas Godfrey’s Prince of Parthia. In 1775, theater was again banned, this time by the Continental Congress. While the ban was routinely ignored, it did put off professional theater producers—including David Douglass, who moved back to Jamaica— and fostered more amateur performances, especially those featuring patriotic themes. Theater in the Early United States After the Revolutionary War (1775–1783), the American Company returned to New York City and when David Douglass died, Hallam took over and produced what is widely believed to be the first important American play, one written by a Harvard-educated lawyer and army officer, Royall Tyler. Tyler’s play, The Contrast, debuted in New York in March 1787. The characters in The Contrast include a Revolutionary War veteran and a man deemed a natural nobleman. The leading character, Jonathan, was the first in a long line of “Yankees” to grace the American stage. Tyler made comparisons between American and British attitudes that favored the American. In addition to its themes of patriotism and the belief that love con-
quers all, Tyler’s play is filled with references to the fashions and topics of the time. The Contrast was an instant hit that was also performed in Baltimore, Philadelphia, and Boston and has seen revivals up to the twenty-first century. During the early nineteenth century, touring groups continued to play a large role in American theater, and English actors were often imported to headline local productions. Among the more popular players were Edmund Kean and Junius Brutus Booth (father of actor Edwin Booth and actor/Lincoln assassin John Wilkes Booth). At this time, actors often specialized in one or two roles that they were known for. The American-born actor credited with innovating a truly American style of acting was Edwin Forrest. After playing second leads to Edmund Kean, Forrest eventually became a leading man and played throughout the East, South, and Midwest. Forrest was an athletic actor who was a natural for heroic and rebellious roles. He found his greatest fame as star of Metamora; or, The Last of the Wampanoags (1829), a play that he found by sponsoring a contest for a tragedy, “of which the hero . . . shall be an aboriginal of this country.” Forrest played the Indian Metamora throughout his career, and the success of the play caused many other dramas featuring the noble savage to be entered into the American repertory. For the most part, when Black Americans were portrayed, it was not as noble persons but as buffoons. The 1840s saw the rise of minstrelsy, in which mostly white, but also black, performers sang and danced while made up in blackface, achieved by smearing coal on the face. Minstrel shows remained popular until the early twentieth century. Also wildly popular in midcentury were “Tom Shows,” melodramatic productions based on Harriet Beecher Stowe’s 1852 novel, Uncle Tom’s Cabin. Other forms of diversion included vaudeville, which boasted such performers as Eddie Foy, W. C. Fields, and Sophie Tucker. P. T. Barnum sponsored singing tours by the “Swedish Nightingale,” Jenny Lind, and opened the American Museum (1842) in New York City where he exhibited such freakish attractions as “Tom Thumb” and the Siamese twins Chang and Eng. Barnum, along with James A. Bailey, founded the Barnum and Bailey Circus in 1881. Wild West shows were in vogue, especially Buffalo Bill’s Wild West Show, organized by former Pony Express rider William Frederick Cody in 1883. Cody’s Cowboy and Indian show toured throughout the United States and Europe. Showboats were also a popular venue for all manner of entertainment from vaudeville to Shakespeare. Theater of the Gilded Age The last thirty years of the 1800s, often referred to as the “Gilded Age,” were dominated by melodrama. Many Civil War plays were produced; they often focused on romances between Northern and Southern lovers but skirted the political issues of the war. Nonetheless, American theater was edging ever closer to the realistic style of
performance that would come to dominate it in the twentieth century. A trend in late-nineteenth-century drama, attributed largely to California-born manager/playwright/producer David Belasco, was to greatly enhance the production values of a play. Belasco built enormous and spectacular three-dimensional sets that he deemed naturalistic. Belasco was among the forerunners of a small group of producers who were breaking away from the romantic style of acting that marked the nineteenth century as well. These producer/directors encouraged actors to perform in a naturalistic style that suited the actors’ own personalities. By 1888, it was estimated that there were more than 2,400 professional actors in the United States. A few earned as much as $100,000 a year—a tremendous amount at the time. Among the highly paid actors were many who came from theatrical families, including descendants of the Booths, the Davenports, the Jeffersons, and the Drew-Barrymores (Lionel, Ethel, and John Barrymore all worked on the New York stage in the early twentieth century). Lesser-known performers were often badly treated; sometimes no pay was given for weeks or even months of rehearsal. Thus, in 1894, the Actors’ Society of America, later Actors’ Equity, was formed to negotiate standard contracts for actors. Even before this, other stage employees organized unions. The number of actors grew to around 15,000 at the turn of the twentieth century. Along with the increase in actors came an increase in acting schools. Among the first was the Lyceum Theatre School, founded in New York City in 1884 and renamed the American Academy of Dramatic Arts in 1892. The American Academy of Dramatic Arts remains perhaps the most prestigious acting school in the country. In the mid-nineteenth century, stock companies rose in number and often traveled. The opening of the first transcontinental railroad in 1869 meant that productions could travel to the West Coast. Soon companies stopped developing a large number of new plays and instead produced long runs of a single, popular play that they often took on tour. By the early 1870s, there were about 50 resident stock companies in the country. In 1886, a group of booking agents and managers formed a partnership known as the Theatrical Trust (or Syndicate). For approximately thirty years, the Syndicate controlled virtually all bookings at professional theaters. Over 1,700 theaters were available to touring productions in 1905, according to Julius Cahn’s Official Theatrical Guide, making the Syndicate’s sphere of influence very great indeed. By the turn of the twentieth century, resident stock companies were nearly nonexistent. A challenge to the Syndicate’s authority came from independent producer David Belasco, who wanted to stage a play set in Japan at the 1904 World’s Fair in St. Louis and was blocked by the Syndicate. Belasco booked a theater anyway and, typically, the Syndicate mounted a
rival play on the same topic as Belasco’s. Even an antitrust suit, filed after the Sherman Antitrust Act of 1890 became law, failed to loosen the Syndicate’s grip. What did finally stop the Syndicate was another group of theatrical monopolists, the New York–based Shubert brothers—Lee, Sam S., and Jacob J. The Shuberts, who initially worked with the Syndicate, eventually joined forces with David Belasco, actress Minnie Maddern Fiske, and others to overturn it. The nineteenth century did see some accomplished American playwrights, including Edward Harrigan, William Dean Howells, and Steele MacKaye. However, the time and country that produced such memorable writers in other genres as Walt Whitman, Emily Dickinson, and Henry David Thoreau failed to nurture a truly great playwright until the twentieth century. Theatre in the Early Twentieth Century The early twentieth century mostly saw a continuation of commercialization and lack of originality in the theater. Melodrama, with subjects ranging from historical to romantic to Western to mystery, remained the form most often performed. Touring ceased to be the main way in which plays were presented and stock companies again formed. The continuing prosperity of America was reflected in the theater, and by 1912 there were some 8,000 theaters in America. By then, activities were focused in New York, especially off Times Square. Many of the theaters built during the boom of the 1920s were still used in 2002. With the exception of some suffragist actresses, there were very few performers involved in political causes. However, in the Chicago slums, Jane Addams and Ellen Gates Starr recognized the possibilities of theater as a force for social good and opened Hull House in 1889 as an alternative entertainment for impoverished youth. Similar theaters followed, including the Henry Street Settlement in New York. As more and more of the theatergoing public became exposed to the work of such groundbreaking European playwrights as Henrik Ibsen, Anton Chekhov, and George Bernard Shaw, a small but active theater intelligentsia was formed that looked for more sophisticated plays. In the teens, “Little Theaters” began to open around the country. Some of these were formed for the purpose of offering standard commercial fare at cut rates, but many were formed with a higher purpose in mind—to produce serious, realist drama. These little theaters, including Chicago’s Little Theatre, New York’s Neighborhood Playhouse and Washington Square Players, and the Cleveland Playhouse featured work by both contemporary European and American playwrights and were modeled after European art theaters such as the Moscow Art Theatre and Dublin’s Abbey Theatre. American performances by these two theater companies and others greatly influenced the style of acting in America further toward naturalism.


In Massachusetts, the Provincetown Players were developing the early short sea plays (set on the sea) of the only American playwright ever to win a Nobel Prize (1936), Eugene O’Neill. O’Neill was the son of James O’Neill, a famous actor who felt he had squandered his talent playing mostly one role, in The Count of Monte Cristo, throughout his career. The plays were taken to New York and the Provincetown Players began a tradition of developing plays out of town before a New York opening. O’Neill was the first of many great American playwrights to work in the twentieth century. He is credited with first perfecting the realist voice of the American stage. During the 1930s, the Great Depression brought a far greater interest in political theater. Such groups as the International Ladies Garment Workers Union put on plays, and even the government got into the act through the federally sponsored and ill-fated Federal Theatre Project, which attempted to put 13,000 theater people on the government payroll. Meanwhile, the unions were represented by playwright Clifford Odets in his Waiting for Lefty on the legitimate stage. Lillian Hellman and Thornton Wilder were among the other prominent playwrights of the time. The postwar 1940s were also a fascinating time for theater. It was then that the heartbreaking dramas of Mississippi playwright Tennessee Williams, The Glass Menagerie (1945) and A Streetcar Named Desire (1947), were staged. Marlon Brando, who studied the Stanislavski System of acting originated at the Moscow Art Theatre and taught at The Actors Studio (opened 1947), became an overnight sensation after starring in A Streetcar Named Desire. His intimate performance not only led to a long film career but also had a great influence on the way American actors performed. Arthur Miller debuted works that deal with government corruption (All My Sons, 1947), the alienation of modern man (Death of a Salesman, 1949), and manipulation of public opinion through the House Un-American Activities Committee hearings of the early 1950s (The Crucible, 1953). In 1947, Julian Beck and Judith Malina formed the Living Theatre, an experimental theater devoted to producing avant-garde plays that promoted the ideals of pacifism and anarchy. The 1940s also saw the development of the American musical, starting with Oklahoma (1943), written by Richard Rodgers and Oscar Hammerstein and choreographed by Agnes DeMille. Other musicals included Brigadoon (1947) and My Fair Lady (1956), by the team of Alan Jay Lerner and Frederick Loewe, and West Side Story (1957) by Leonard Bernstein and Arthur Laurents, and later, Sweeney Todd (1979), by Stephen Sondheim. The musical was to become the most American of theatrical genres; immense productions began to dominate the large theaters in New York by the 1950s and continue to do so.

Theatre in the Late Twentieth Century The Civil Rights Movement, the war in Vietnam, and the other upheavals of the 1960s provided a rich time for theater. Playwrights including Amiri Baraka (then LeRoi Jones) championed the Black Arts Movement with such in-your-face plays as Dutchman (1964), in which a white woman stabs a black man on a subway. David Rabe wrote about Vietnam in Sticks and Bones (1971). The 1960s also saw the first of many plays dealing openly with homosexuality. The Boys in the Band premiered in 1968. Later plays to deal with the subject included Larry Kramer’s The Normal Heart (1985) and Tony Kushner’s Pulitzer Prize–winning two-part epic, Angels in America (1991, 1993). The 1960s also ushered in the work of Neil Simon, probably the most popular writer of comedies in the late twentieth century. Among other important playwrights of the last part of the century, the California-born and -raised Sam Shepard writes plays about those who, like himself, rejected the mores of polite society; Christopher Durang lampoons the Catholic church that he was raised in; and Marsha Norman writes of a woman so disconnected she is planning suicide (’night Mother, 1982). Performance artists such as Karen Finley, whose work dealt with her own sexuality, Anna Deavere Smith, who explores social issues such as Black-Jewish relationships, and performer/musician Laurie Anderson rose to prominence in the 1980s. Many of these performances were produced Off Broadway, including the New York Shakespeare Festival, founded in 1954 by Joseph Papp for the purpose of mounting Shakespeare productions in Central Park that were free and open to the public each summer. When Papp died in 1991, the innovative African American director George C. Wolfe became director of the festival. Papp also produced the surprise hit hippie musical of 1967, Hair, at his not-for-profit Public Theater. Hair was then moved to Broadway and the profits used for other, less commercial productions. Broadway is still dominated by musicals and revivals of musicals, and it has seen a tremendous decline since the 1980s, largely because of escalating costs in mounting a production. In the 1950s, a grand musical such as My Fair Lady might have cost half a million dollars to produce, and tickets were less than ten dollars each. By the end of the twentieth century, costs soared so that a musical such as The Lion King (1997) could cost $15 million to produce and a ticket could cost up to $100. Broadway budgets and ticket prices have long provided much of the momentum for Off Broadway and later for even smaller—less than 100-seat—houses called Off Off Broadway. Greenwich Village’s Caffe Cino, founded in 1958 by Joe Cino, is generally thought to be the birthplace of Off Off Broadway, but Off Off Broadway’s most enduring and important producer is Ellen Stewart of Café La Mama, which was founded in 1962 and renamed the La Mama Experimental Theater Club. Stewart is known for giving fresh voices a place in her theater, not because
she likes the script—she often does not read scripts in advance—but rather because she has a good feeling about the person bringing an idea for a production to her. Off and Off Off Broadway venues, in addition to many regional theaters including Steppenwolf in Chicago, Magic Theater in San Francisco, and repertory companies including Yale Repertory Theater, American Conservatory Theater in San Francisco, Missouri Repertory Theater, and Chicago’s Goodman Theater, are thought by many to be the most exciting places to view theater in the twenty-first century.

Blum, Daniel. Great Stars of the American Stage: A Pictorial Record. New York: Greenberg, 1952.
Brustein, Robert. Reimagining American Theatre. New York: Hill and Wang, 1991.
Henderson, Mary C. Theater in America: 250 Years of Plays, Players, and Productions. New York: Abrams, 1996.
Hischak, Thomas S. American Theatre: A Chronicle of Comedy and Drama, 1969–2000. New York: Oxford University Press, 2001.
Londre, Felicia Hardison, and Daniel J. Watermeier. The History of North American Theater: From Pre-Columbian Times to the Present. New York: Continuum Publishing, 1998.

Lorca Peress contributed information on Off Off Broadway.

Rebekah Presson Mosby See also Dance, Indian; Death of a Salesman, The; Music: Theater and Film; Wild West Show.

THEME PARKS. See Amusement Parks.

THEOCRACY IN NEW ENGLAND. This term was applied to the political regimes established in the Massachusetts Bay and New Haven colonies. These colonies were not theocracies in the traditional sense—that is, clergy did not establish or run their political systems. In both colonies, there was a clear separation of church and state. In Massachusetts, for instance, clergy were forbidden to hold public office, and both colonies maintained separate systems of political and religious leadership. But it was also the case that these political and religious systems were mutually reinforcing, and that early leaders hoped that every institution of their societies—the family, the church, and the magistracy—would function in concert to maintain a pious society based on Calvinist theology and religious practice. For this reason some have applied the term “theocracy” to seventeenth-century New England. Colonial leaders deliberately intended to create a Bible Commonwealth, a society in which the fundamental law would be the revealed Word of God, and God would be regarded as the supreme legislator. Thus, John Winthrop announced the program before the settlement, “For the worke wee haue in hand, it is by a mutuall consent . . . to seeke out a place of Cohabitation and Consorteshipp under a due forme of Government both ciuill and ecclesiastical”; the “due forme” was that enacted in the Bible. John Cotton later argued that the New England colonies, having a clear field before them, were duty bound to erect a “Theocracy . . . as the best forme of government in the commonwealth, as well as in the Church.” Consequently, the political theory assumed that the colonies were based on the Bible and that all specific laws would show biblical warrant. The governments of the two colonies were founded on the theory that God had ordained all society as a check on depraved human impulses and, therefore, that all politics should ideally fulfill God’s will. Hence, Winthrop explained in 1645 that after people entered a body politic, they gained the freedom to do only that “which is good, just and honest”—in other words, only that which God demands. The purpose of the state was to enforce God’s will, and to ensure that every member of society would observe God’s laws.

BIBLIOGRAPHY

Foster, Stephen. The Long Argument: English Puritanism and the Shaping of New England Culture, 1570–1700. Chapel Hill: University of North Carolina Press, 1991.
Gildrie, Richard P. The Profane, the Civil, and the Godly: The Reformation of Manners in Orthodox New England, 1679–1749. University Park: Pennsylvania State University Press, 1994.
Miller, Perry. The New England Mind: The Seventeenth Century. New York: Macmillan, 1939.
Noll, Mark A. A History of Christianity in the United States and Canada. Grand Rapids, Mich.: Eerdmans, 1992.

Perry Miller / s. b. See also Cambridge Agreement; Massachusetts Bay Colony; New England Way; New Haven Colony.

THEOSOPHY is defined by its expounders as a religion-philosophy-science brought to America by "messengers of the guardians and preservers of the ancient Wisdom-Religion of the East." Its founder was an eccentric Russian noblewoman, Helena P. Blavatsky. In July 1848, at age sixteen, she was married to a forty-one-year-old government official. She ran away after three months to Constantinople and joined a circus. After extensive travels in the Far East, where she claimed to have received instruction from "Sages of the Orient," she came to New York City on 7 July 1873 and, two years later, with William Q. Judge, Henry Steel Olcott, and fifteen others, formed the Theosophical Society.

The purpose of the organization was to further a universal brotherhood of humanity without distinction of race, color, sex, caste, or creed; to further the study of the ancient scriptures and teachings such as the Brahmanical, Buddhist, and Zoroastrian; and to investigate the "unexplained laws of Nature" and the psychic and spiritual powers latent in man.


At first, the theosophists displayed an interest in spiritualism but later repudiated it, stating that spiritistic phenomena "were but a meagre part of a larger whole." Later, Madame Blavatsky formed what she termed an "esoteric section," which was a select group of promising students gathered to study the more profound teachings of theosophy. Madame Blavatsky left the United States in 1878 and formed theosophical societies in England and India, which recognized her leadership until her death in 1891.

The teachings of theosophy hold universal brotherhood to be a fact in nature on which its philosophy and religion are based. Theosophy proclaims a "Deific Absolute Essence, infinite and unconditioned . . . from which all starts, and into which everything returns." Man has an immortal soul, but the soul is a tenant of many different bodies in many different lives. Every soul must become perfect before the next stage of existence can be entered upon, and those who go forward most rapidly must wait for all. For this, many reincarnations are necessary. Theosophy accepts the miracles of Jesus but denies their supernatural character, holding that they were accomplished through natural laws.

As of 2001, there were 130 theosophical study centers and theosophical societies—known as lodges—in the United States.

BIBLIOGRAPHY

Campbell, Bruce F. Ancient Wisdom Revised: A History of the Theosophical Movement. Berkeley: University of California Press, 1980.
Greenwalt, Emmet A. The Point Loma Community in California, 1897–1942: A Theosophical Experiment. Berkeley: University of California Press, 1955.
Washington, Peter. Madame Blavatsky's Baboon: Theosophy and the Emergence of the Western Guru. London: Secker and Warburg, 1993.

William W. Sweet / f. b.

See also Asian Religions and Sects; Cults; New Age Movement; Utopian Communities.

THINK TANKS are policy-oriented research organizations that provide expertise to government. By the year 2000, an estimated 1,200 nongovernment think tanks of various descriptions, with varied focuses on social and economic issues and varied sources of funding, were at work in the United States. Of the major think tanks, only the Brookings Institution (1916) and the Carnegie Endowment for International Peace (1910) were founded before World War II. The American Enterprise Institute was founded in 1943.

Although think tanks are ostensibly nonpartisan, in many instances they function as extensions of state power, gaining and losing influence with changes in governments and shifts in the ideological climate of the country. In other cases, think tanks function more independently,

questioning and monitoring state strategies and structures. (For example, the Rand Corporation, founded in the aftermath of World War II, was created to monitor and evaluate Air Force programs, before it became an independent research organization in the 1950s.)

The course of the Brookings Institution reflects the kinds of changes that can occur in shifting ideological currents. Founded as the Institute for Government Research in 1916 and reorganized in 1927 by the St. Louis philanthropist Robert Brookings, the Brookings Institution sought to bring nonpartisan expertise to policy questions of the day. During the 1930s, however, the institution, under its first president, Harold Moulton, became a major critic of many New Deal programs, including the National Recovery Administration, the Agricultural Adjustment Administration, securities regulation, and Keynesian economic policy. Following World War II, Moulton warned repeatedly that the government had drifted into "uncharted seas, if not state socialism," and called for an end to "regimentation."

In response to the new postwar environment and the reluctance of foundations to fund an institution they perceived as ineffective and out of touch, Robert Calkins, former head of the General Education Fund at the Rockefeller Foundation, agreed to become president of Brookings. Calkins reorganized the institution and recruited social scientists with liberal credentials and government experience. This new group had close ties with government and, unlike the devotees of the earlier nonpartisan ideal, aligned themselves closely with presidential administrations. In 1965, when Calkins retired, the Brookings Institution was representative of mainstream Keynesian economic thinking, and its growing influence was reflected in renewed foundation support, especially from the Ford Foundation. Under Calkins's successor, Kermit Gordon, Brookings's reputation as a liberal Democratic think tank was well entrenched. Under Gordon, the Brookings Institution became a major center for policy innovation in welfare, health care, education, housing, and taxation policy.

In 1976, the board of trustees appointed Bruce MacLaury to head the institution. A former regional Federal Reserve banker and Treasury official, MacLaury successfully courted business support, increased corporate representation on the board of trustees, and moved the institution toward a more moderate ideological stance. By the 1970s, the Brookings Institution confronted competition from other major policy research institutions, especially the American Enterprise Institute and the Heritage Foundation, both viewed as conservative research institutions close to the Republican party.

The American Enterprise Institute (AEI), founded in 1943 as the American Enterprise Association (AEA), illustrates the experience of a conservatively oriented research institution that expressed deep ambivalence about the post–World War II policy consensus. The key figure

behind the establishment of the AEA was Lewis Brown, chairman of Johns-Manville Corporation. From the start, the AEA reflected a conservative bias. In 1954, A. D. Marshall, head of General Electric, assumed the institution's presidency and immediately hired William Baroody and W. Glenn Campbell, both staff economists at the U.S. Chamber of Commerce, to head the research program. Under their guidance, AEA was gradually built into a modern research institute under its new name, the American Enterprise Institute. Principal support came from the Lilly Endowment, the Scaife Fund, and the Earhart and Kresge Foundations, as well as major corporate sponsors. The institution's reputation was enhanced when the Nixon administration called a number of AEI associates to government positions. The AEI also emerged as a successful proponent of economic deregulation.

In 1977, William Baroody retired and his son, William Baroody Jr., took over the presidency of the institution. To improve its standing in the academic community, the AEI assembled an impressive staff including Melvin Laird, William Simon, Robert Bork, Michael Novak, and Herbert Stein. The tenure of William Baroody Jr., however, ended abruptly in the summer of 1987, when an increasingly restive board of trustees forced his resignation because of cost overruns and declining revenues. Baroody's successor, Christopher DeMuth, bolstered the conservative orientation of the institute by bringing on board several former Reagan administration officials with strong rightist reputations.

The founding of the Heritage Foundation in 1973 revealed a new ideological climate in the analysis of public knowledge. Founded by Edwin Feulner and Paul Weyrich to provide rapid and succinct legislative analysis on issues pending before Congress, the Heritage Foundation sought to promote conservative values and demonstrate the need for a free market and a strong defense. The Heritage Foundation's articulation of conservative values in social policy, education, and government activities placed it at the forefront of New Right activity. The Heritage Foundation remained relatively small in its early years, but the election of Ronald Reagan to the presidency in 1980 enhanced the institution's prestige. By the mid-1980s the Heritage Foundation had established a solid place in the Washington world of think tanks as a well-organized, efficient, and well-financed research organization that called for the turning over of many government tasks to private enterprise, a strong defense, and a cautious approach to Russia and China.

During the 1960s, 1970s, and 1980s, a myriad of other think tanks emerged in Washington representing a range of ideological positions and specialized policy interests, including the left-oriented Institute for Policy Studies (1963) and the libertarian-oriented Cato Institute (1977). Think tanks concerned with national security included the Center for Strategic and International Studies (1962) and the Center for National Security Studies

(1962), affiliated with the American Civil Liberties Union. The Urban Institute (1968) focused on domestic social, welfare, and family policy, while the National Women's Law Center (1972) worked on policies that affect women, especially reproductive rights, employment, and education. The Institute for International Economics (1981) became a major center for international economic and monetary policies, especially from a free-trade perspective. The traditionalist-oriented Ethics and Public Policy Center (1976) provided analysis of public policies related to religious issues.

BIBLIOGRAPHY

Critchlow, Donald T. The Brookings Institution, 1916–1952. DeKalb: Northern Illinois University Press, 1985.
Dickson, Paul. Think Tanks. New York: Atheneum, 1971.
Edwards, Lee. The Power of Ideas: The Heritage Foundation at Twenty-Five Years. Ottawa, Ill.: Jameson Books, 1997.
Friedman, John S., ed. The First Harvest: An Institute for Policy Studies Reader, 1963–1983. Washington, D.C.: Grove, 1983.
Lagemann, Ellen Condliffe. The Politics of Knowledge: The Carnegie Corporation, Philanthropy, and Public Policy. Middletown, Conn.: Wesleyan University Press, 1989.
Ricci, David. The Transformation of American Politics: The New Washington and the Rise of Think Tanks. New Haven, Conn.: Yale University Press, 1993.
Smith, James Allen. The Idea Brokers: Think Tanks and the New Policy Elite. New York: Free Press, 1991.

Donald T. Critchlow

THIRD PARTIES. The American political system has rarely been kind to third parties. No third party has won a presidential election in over a century. From the point of view of the two major parties, minor parties have functioned more as irritants or sideshows than as serious rivals. Parties such as the Libertarian Party, the American Vegetarian Party, the nativist Know-Nothing Party, and the agrarian Populist parties have been most valuable as safety valves for alienated voters, and as sources of new ideas, which, if they become popular, the major parties appropriate. In the historian Richard Hofstadter's classic formulation: "Third parties are like bees: once they have stung, they die."

Hofstadter explains this phenomenon by claiming that the major parties champion patronage, not principle. A better explanation is more structural, and more benign. The "first past the post" nature of most American elections selects the candidate with the most votes even without a majority. Marginal parties that woo a consistent minority languish. On the presidential level, the "winner take all" rules for most states in the electoral college further penalize third parties by diffusing their impact. In 1992, Ross Perot received over 19 million votes, 18.8 percent of the popular vote, but no electoral votes, and, thus, no power. As a result, although there is nothing mandating it in the Constitution—and the Framers abhorred

parties—since the 1830s a two-party system has been the norm in American politics.

The classic American third party is identified with an issue, or a cluster of issues. The searing antebellum slavery debate spawned various third parties. James G. Birney ran with the antislavery Liberty Party in 1840 and 1844; former president Martin Van Buren won over 10 percent of the popular vote—but no electoral votes—with the Free Soil Party in 1848. By 1860, the antislavery Republican Party had captured the presidency, although with less than 40 percent of the popular vote in a rare four-way race. Some historians consider the Republican Party America's only successful third party. Others argue that the party debuted as a new major party assembled from old ones, not as a minor party that succeeded.

Third Parties after the Civil War
The century and a half following the Civil War witnessed an extraordinarily stable rivalry between the Republicans and the Democrats. Throughout, third parties erupted sporadically, commanded attention, made their mark politically, rarely gained much actual power, and then disappeared. In the late nineteenth century, the agrarian Populist protest movement produced a Greenback Party and the People's Party. The 1892 platform of the People's Party heralded the reorientation in government power that shaped the twentieth century. "We believe that the power of the government—in other words of the people—should be expanded," the platform thundered. Some of the more radical Populist schemes proposing public ownership of the railroads, the telegraph, and the telephone failed. But many other proposals eventually became integrated into American political life, such as a national currency, a graduated income tax, the (secret) Australian ballot, and the direct election of United States senators. In 1892, James B. Weaver of the People's Party won more than a million popular votes and 22 electoral votes. That year Populists sent a dozen congressmen to Washington, while securing governors' chairs in Kansas, North Dakota, and Colorado.

George Wallace. The segregationist Alabama governor, who was a third-party candidate for president in 1968; his 1972 bid for the Democratic presidential nomination ended when an assassination attempt left him disabled. Library of Congress

In the early twentieth century, the Socialist, Socialist Workers, and Socialist Labor parties helped radical Americans, particularly many immigrants, express frustration while staying within America's political boundaries. Typically, the perennial Socialist Party candidate, Eugene V. Debs, won hundreds of thousands of votes in 1904, 1908, 1912, and 1920, but not even one electoral vote. The only formidable third-party challenge from that era was a fluke. In 1912, the popular former president Theodore Roosevelt fought his handpicked protégé, President William Howard Taft, for the Republican nomination. When Taft won, Roosevelt ran as a Progressive. Thanks to Roosevelt, the Progressive Party won 88 electoral votes, and became the only modern third party to come in second for the presidency. Twelve years later, "Fighting Bob" Robert M. La Follette's Progressive campaign won only the electoral votes of his home state, Wisconsin. Still, as with the Populists, many Progressive ideas became law, such as woman's suffrage, prohibition of child labor, and a minimum wage for working women.

Third Parties in the Modern Era
In the latter half of the twentieth century, third parties were even more transitory and often had even fewer infrastructures. In 1948, Southerners rejecting the Democratic turn toward civil rights bolted the party to form the Dixiecrats or States' Rights Democratic Party. Their candidate Strom Thurmond won 1,169,063 popular votes and 39 electoral votes from various Southern states. That same year former Vice President Henry Wallace's breakaway party from the left side of the Democratic coalition, the Progressive Party, won 1,157,172 votes scattered in the North and Midwest, but no electoral votes. Twenty years later, civil rights issues again propelled a Southern breakaway party, with George Wallace's American Independent Party winning almost 10 million votes and 46 electoral votes.

In the modern era, the most attention-getting third-party revolts cast a heroic independent voice against mealymouthed and hypercautious major-party nominees. In 1980, veteran Congressman John Anderson broke away from the Republican Party, after distinguishing himself in

the Republican primaries as a straight shooter. In 1992 and 1996 billionaire businessman Ross Perot bankrolled his own campaign and party, targeting the deficit. And in 2000, the long-time reformer Ralph Nader mounted a third-party effort that did not even win five percent of the popular vote, but whose more than 90,000 votes in Florida may have thrown the election to George W. Bush.

In an era of cynicism and political disengagement, public opinion polls show that Americans claim they would like to see a third party as an alternative. At the state and local level, some third parties have lasted, most notably New York City's Liberal and Conservative Parties and Minnesota's Farmer-Labor Party. In the 1980s, the Libertarian Party advanced in Alaska, and in the 1990s, Connecticut and Maine, among others, had independent governors, while Vermont had an independent-socialist congressman. Still, these are mere shooting stars in the American political universe. As their predecessors did, modern, consumer-oriented Americans approve of third parties in principle, but rarely in practice.

BIBLIOGRAPHY

Hofstadter, Richard. The Age of Reform: From Bryan to F. D. R. New York: Random House, 1955.
Polakoff, Keith I. Political Parties in American History. New York: Wiley, 1981.
Reichley, James. The Life of the Parties: A History of American Political Parties. New York: Free Press, 1992.
Rosenstone, Steven J. Third Parties in America: Citizen Response to Major Party Failure. Princeton, N.J.: Princeton University Press, 1984.

Gil Troy

See also Machine, Political; Political Parties.

THIRTY-EIGHTH PARALLEL. As World War II in the Pacific neared its end in August 1945, the United States began to dismantle the Japanese Empire and return conquered nations to indigenous rule. The United States had given little thought to the Korean peninsula before Japan’s surrender. The region played a very minor role in America’s strategic plan during the war, and many observers expected that the Soviet Union would assume the postwar occupation duties. However, Joseph Stalin’s clear ambition to dominate Eastern Europe and parts of the Middle East convinced U.S. policymakers to limit Soviet influence in the Far East. The Soviet Union had 1.6 million battle-hardened troops on the Manchurian and Korean borders, but the United States wagered that Stalin would accept an American role in the postwar occupation of Korea if President Harry Truman’s administration moved quickly. When the Soviet Twenty-fifth Army entered Korea on 10 August and moved as far south as Pyongyang, the United States realized how little time it had to act. That same evening, the War Department instructed two U.S.

Army officers, Colonel Dean Rusk and Colonel Charles H. Bonesteel III, to design an occupation plan for Korea. They proposed a demarcation line at the thirty-eighth parallel, with the Soviets handling the postwar occupation duties in the north and the Americans administering the southern half. The choice was based on expediency—they were forced to make a quick decision and the thirty-eighth parallel was clearly marked on most maps of Korea. The decision was also made for bureaucratic convenience: the thirty-eighth parallel divided the country into two halves of roughly the same size (the northern part being slightly larger—48,000 square miles as opposed to 37,000). However, it did not take into account the economic differences or such factors as demography and geography. As a result, the northern half included virtually all of the industrial facilities and mineral wealth, while the southern sphere incorporated most of the agricultural land and a majority of the population. The thirty-eighth parallel was designed to be a political border, but not a permanent one, and thus it did not take into account military defensibility.

The United States immediately forwarded the occupation plan to Stalin and the Soviets accepted on 16 August. With the rapid ascent of the Cold War, however, the thirty-eighth parallel soon became a de facto international boundary between an emergent communist state led by Kim Il-sung in the north and a pro-Western autocratic state headed by Syngman Rhee in the south.

BIBLIOGRAPHY

Hickey, Michael. The Korean War: The West Confronts Communism, 1950–1953. Woodstock, N.Y.: Overlook Press, 1999.
Sandler, Stanley. The Korean War: No Victors, No Vanquished. Lexington: University of Kentucky Press, 1999.
Stueck, William Whitney. The Korean War: An International History. Princeton, N.J.: Princeton University Press, 1995.

Erik B. Villard

See also Korean War.

THIRTY-HOUR WEEK. In 1932, Senator Hugo Black (D-Alabama) introduced the Thirty-Hour Work Week Bill, 72nd Congress, to “prohibit, in interstate or foreign commerce, all goods produced by establishments where workers were employed more than five days a week or six hours a day.” Black hoped that this bill, drafted by the American Federation of Labor, would create 6 million jobs. The Senate passed the bill on 6 April 1933, by a vote of 53–30. President Franklin Delano Roosevelt privately expressed doubts, and the bill remained in House of Representatives committees for five years. When the Fair Labor Standards Act became law in 1938, the thirty-hour work week provision was not included.


BIBLIOGRAPHY

Hunnicutt, Benjamin Kline. Work Without End: Abandoning Shorter Hours for the Right to Work. Philadelphia: Temple University Press, 1988.

Kelly Boyer Sagert

See also American Federation of Labor–Congress of Industrial Organizations; Great Depression.

THOMAS CONFIRMATION HEARINGS. On 28 June 1991, Thurgood Marshall, the first African American to serve on the Supreme Court, sent his resignation to President George H. W. Bush. Three days later, the president nominated Clarence Thomas, another African American, to fill the vacancy. But while Marshall had been a leading liberal on the Court and a champion of minorities and the poor, Thomas held much more conservative views. Born into a poor Georgia family, he graduated from Yale Law School in 1974 and rose in Missouri legal and political circles until moving to Washington, D.C., with Senator John Danforth. Thomas headed the Equal Employment Opportunity Commission from 1982 until

1990, when President Bush named him to the Court of Appeals for the District of Columbia. The earliest phase of the nomination hearings centered on Thomas's political, social, and judicial views, his critics inquiring particularly into his lukewarm attitudes toward affirmative action and other social programs aimed at minority groups. The nominee maintained a discreetly noncommittal attitude when questioned on such controversial matters as abortion. The Senate Judiciary Committee was evenly divided and, in late September, sent the nomination forward with no recommendation.

Before the full Senate could act, however, an explosive new element was injected into the debate. Anita Hill, a young African American law professor at the University of Oklahoma, alleged that while working for Thomas she had been harassed by him. She charged that Thomas repeatedly addressed crude and sexually explicit remarks to her and made persistent and unwanted sexual advances. Hill made her allegations in televised testimony before the Senate Judiciary Committee, and much of the nation was transfixed by her dramatic charges and by Thomas's vehement and angry denial of every allegation (he called the proceedings "a high-tech lynching"). Both principals were highly articulate and both seemed honest, but it was clear that one of them was not telling the truth. Proponents of Thomas felt that the Hill testimony was part of an orchestrated campaign to discredit a conservative. Opponents, on the other hand, believed that Hill's charges were convincing and damaging and that the Judiciary Committee (made up entirely of white males) was insensitive at best and harshly aggressive toward Hill at worst—Senators Arlen Specter, Orrin Hatch, and Alan Simpson came in for particular criticism because of their overtly hostile attitude toward Hill.

On 15 October 1991, the full Senate voted to confirm Thomas by a vote of 52–48, the narrowest confirmation vote of the twentieth century. Thomas took the oath of office on 23 October. One of the results of the hearings was a heightened consciousness of the problem of sexual harassment and a greater willingness on the part of many women to reveal their own experiences and, in some instances, to bring formal charges.

Anita Hill. The law professor testifies before the Senate Judiciary Committee that Clarence Thomas sexually harassed her while she worked for him at the Equal Employment Opportunity Commission. Thomas angrily denied the accusations and was narrowly confirmed to a seat on the U.S. Supreme Court. Associated Press/World Wide Photos

BIBLIOGRAPHY

Danforth, John. Resurrection: The Confirmation of Clarence Thomas. New York: Viking, 1994.
Hill, Anita. Speaking Truth to Power. New York: Doubleday, 1997.
Mayer, Jane, and Jill Abramson. Strange Justice: The Selling of Clarence Thomas. Boston: Houghton Mifflin, 1994.

United States Senate. Nomination of Judge Clarence Thomas to be Associate Justice of the Supreme Court of the United States: Hearings before the Committee on the Judiciary. 4 volumes. Washington, D.C.: Government Printing Office, 1991.

David W. Levy

See also Confirmation by the Senate.


THREE MILE ISLAND, the site of the worst civilian nuclear power program accident in the United States, is located in the Susquehanna River near Harrisburg, Pennsylvania. In the early 1970s, Metropolitan Edison built two reactors on Three Mile Island for commercial energy production. On 28 March 1979, a faulty valve allowed water coolant to escape from Metropolitan Edison's second reactor, Unit 2, during an unplanned shutdown. A cascade of human errors and technological mishaps resulted in an overheated reactor core with temperatures as high as 4,300 degrees and the accidental release of radiation into the atmosphere. Plant operators struggled to resolve the situation. Press reporters highlighted the confusion surrounding the accident, while Governor Richard L. Thornburgh of Pennsylvania and President Jimmy Carter visited the stricken plant, urging the nation to remain calm. On 30 March, state officials evacuated pregnant women and preschool children from the immediate area as a safety measure. On 2 April, temperatures decreased inside the Unit 2 reactor, and government officials declared the crisis over on 9 April.

A commission authorized by President Carter investigated the calamity. Government analysts calculated that, at the height of the crisis, Unit 2 was within approximately one hour of a meltdown and a significant breach of containment. The lessons learned at Three Mile Island led to improved safety protocols and equipment overhauls at commercial reactors across the country. Three Mile Island also contributed to rising public anxiety over the safety of nuclear energy, anxieties fueled by the coincidental release of The China Syndrome, a fictional movie about the cover-up of a nuclear plant accident, just twelve days before the disaster at Three Mile Island. The Three Mile Island accident became a rallying cry for grassroots antinuclear activists. Wary of sizable cost overruns and public resistance, electrical utilities shied from constructing new nuclear plants in the years that followed. Over an eleven-year period, the cleanup of Three Mile Island's severely damaged reactor cost in excess of $1 billion.

Three Mile Island. An aerial view of the nuclear facility near Harrisburg, Pa. AP/Wide World Photos

BIBLIOGRAPHY

Cantelon, Philip L., and Robert C. Williams. Crisis Contained: The Department of Energy at Three Mile Island. Carbondale: Southern Illinois University Press, 1982.
President's Commission on the Accident at Three Mile Island. Report of the President's Commission on the Accident at Three Mile Island: The Need for Change: The Legacy of TMI. Washington, D.C.: Government Printing Office, 1979.
Stephens, Mark. Three Mile Island. New York: Random House, 1980.

Robert M. Guth
John Wills

See also Electric Power and Light Industry; Nuclear Power; Nuclear Regulatory Commission.

THRESHER DISASTER. Launched in July 1960 and commissioned in August 1961, the USS Thresher was the lead boat for a revolutionary class of "hunter-killer attack" submarines designed to destroy Soviet ballistic missile submarines. A strong steel hull, although thinner than that of most submarines, permitted the Thresher to withstand greater damage and operate significantly deeper than its counterparts. Its advanced design incorporated a reduced-profile conning tower to increase maneuverability while providing maximum stealth and a highly sensitive, bow-mounted sonar array to detect the enemy at greater distances. The Thresher's torpedo room was located aft of the conning tower, and the tubes angled upward to utilize SUBROC, or submarine rocket, torpedoes. More importantly, a nuclear reactor provided the submarine its power and extended its operational range.

During an exhaustive two-year sea trial period, the Thresher suffered an unanticipated reactor shutdown and a collision with a tugboat in addition to the usual "shakedown" problems. After additional tests, the submarine began a nine-month overhaul in August 1962. On 9 April 1963, the Thresher returned to sea and initiated a series of routine dives to "test depth," or maximum safe operating depth, estimated at approximately 1,000 feet. On 10 April the crew reported "minor difficulties" at 700 feet and attempted an emergency surface. The Thresher never reappeared, sinking in 8,500 feet of water, with all 129 men aboard killed. The submarine imploded from the extreme pressure at these depths, leaving only small fragments of wreckage to be located or recovered.

Tests conducted at the time of the accident (and again in the 1980s) revealed that the nuclear reactor had remained intact and that an environmental disaster had been averted. The ensuing inquiry into the navy's first loss of a nuclear-powered vessel suspected improperly brazed piping joints as leading to the Thresher's demise, but the official cause remained unknown. The disaster sobered proponents of a nuclear navy, who subsequently instituted the SUBSAFE

program to review nuclear submarine construction and operations to ensure that safety would keep pace with technological innovation.

BIBLIOGRAPHY

Duncan, Francis. Rickover and the Nuclear Navy: The Discipline of Technology. Annapolis, Md.: Naval Institute Press, 1990.
Polmar, Norman. Death of the Thresher. Philadelphia: Chilton, 1964.
Rockwell, Theodore. The Rickover Effect: How One Man Made a Difference. Annapolis, Md.: Naval Institute Press, 1992.

Derek W. Frisby

See also Navy, United States; Nuclear Power.

THRIFT STAMPS. During World War I the American government turned to thrift stamps as one means of financing the war effort while instilling traditional values. War expenses totaled $33 billion, and the Treasury Department sold approximately $21 billion worth of Liberty bonds to meet the nation’s new financial demands. However, to encourage thrift and support for the war effort among elements of the population who could not afford even the smallest bond, valued at fifty dollars, the Treasury Department was authorized to issue Thrift Stamps and War Savings Stamps. This revenue measure was often targeted at immigrants and school children. In many localities, public school teachers were authorized to implement the program and teach children the values of patriotism and saving. Thrift Stamps cost twenty-five cents each, and when sixteen were collected they could be exchanged for War Savings Stamps or Certificates, which bore interest compounded quarterly at four percent and were tax free. War Savings Stamps could be registered at any post office, insuring the owner against loss, and sold back to the government through any post office with ten days written notice. The conditions placed on the program made it popular with small investors. The campaign began on 2 January 1918 and closed at the year’s end. When the War Savings Stamps matured on 1 January 1923, the Treasury Department promised to pay the sum of five dollars for each certificate. In little more than a year over $1 billion was raised in this campaign, fulfilling its ideological and financial purposes. BIBLIOGRAPHY

Kennedy, David. Over Here: The First World War and American Society. New York: Oxford University Press, 1980.
Wynn, Neil A. From Progressivism to Prosperity: World War I and American Society. New York: Holmes and Meier, 1986.

Ron Briley

See also Savings Bonds.

Defiance at Tiananmen Square. A lone protester stands up to the tanks during the government crackdown on the mostly peaceful, student-led demonstrations in Beijing. Years later, Time magazine named him one of the century's top twenty "leaders and revolutionaries" for his inspiring action. © Corbis

TIANANMEN SQUARE PROTEST. On 15 April 1989, students held a vigil in Beijing's Tiananmen Square that commemorated the death of Hu Yaobang, a progressive leader who had sought reforms in China. They demanded freedom and empowerment for a young generation. The vigil became an ongoing protest in the square on 4 May and gave rise to a prodemocracy movement throughout China. Calling for a change in government through political liberalization and an end to official corruption, the demonstrators displayed Lady Liberty, meant to resemble the Statue of Liberty in New York Harbor and signaling a desire for an open way of life. Although the situation was far from a civil war, the scope of the largely nonviolent opposition to the government was very broad. While the movement earned support for its agenda and sympathy abroad through wide international media coverage, the most potent challenge to the legitimacy and authority of the Communist Party since Mao Tse-tung's 1949 victory against the Nationalists was crushed at Tiananmen Square by military force on 3 and 4 June 1989, seven weeks after it had begun. Hundreds of protesters and bystanders were presumed dead, thousands wounded and imprisoned.

From documents smuggled out of China and published in the United States, it appears that factional struggles among China's leaders and the fear of international shame delayed military action. President George H. W. Bush, acting upon public outrage, imposed minor diplomatic sanctions, but he subordinated human rights concerns to U.S. business interests, prompting

Bill Clinton to denounce him as "coddling dictators" during the 1992 presidential campaign. In turn, however, Clinton's policies followed the pattern of engaging the Chinese commercially, claiming that trade and openness would facilitate political reforms. This policy was embodied in the ongoing grant of most-favored-nation trade status to China, the jailing of human rights activists notwithstanding.

BIBLIOGRAPHY

Nathan, Andrew, and E. Perry Link, eds. The Tiananmen Papers. 2001.
Wang, James C. F. Contemporary Chinese Politics: An Introduction. Englewood Cliffs, N.J.: Prentice-Hall, 2000.

Itai Sneh

See also China, Relations with.

TICONDEROGA, CAPTURE OF (1775). The French fort built in October 1755 by the Marquis de Lotbinière, commanding the route between lakes Champlain and George, fell into English hands during the French and Indian War after Sir Jeffrey Amherst's successful siege in 1759. The English renamed it Fort Ticonderoga, New York. In 1775, Massachusetts revolutionaries hatched a plan to capture Fort Ticonderoga to obtain cannon for the siege of Boston. Early in the morning of 10 May, Ethan Allen, Benedict Arnold, and eighty-three men crossed Lake Champlain in two boats. The expedition passed through the ruined walls and, without bloodshed, quickly subdued the sleepy garrison of two officers and forty-three men.

On 5 December, Henry Knox arrived at Ticonderoga to supervise the moving of fourteen mortars and coehorns, two howitzers, and forty-three cannons. The guns were taken in groups by water to Fort George, at the southern end of Lake George; on sleds drawn by oxen and horses to Claverack, New York; and thence east through the mountains to Cambridge, Massachusetts. By 24 January 1776, Gen. George Washington was able to use these cannons to force the British from Boston.

In 1777, the British moved to recapture Fort Ticonderoga. British Gen. John Burgoyne's army of more than nine thousand was opposed by Gen. Arthur Saint Clair with about twenty-five hundred men. The British dragged cannon up Sugar Hill (Mount Defiance), which commanded the fort from the southwest. Shortly after midnight on 6 July, Saint Clair wisely retreated southward along the eastern shore of Lake Champlain, leaving the fort to the British.

Fort Ticonderoga. This photograph shows a cannon facing Lake Champlain; the small British garrison was asleep when Americans crossed the lake and seized the fort, only three weeks after the Revolution began. © Lee Snider/Corbis

BIBLIOGRAPHY

Billias, George Athan. George Washington's Opponents: British Generals and Admirals in the American Revolution. New York: Morrow, 1969.
Bird, Harrison. March to Saratoga: General Burgoyne and the American Campaign, 1777. New York: Oxford University Press, 1963.
French, Allen. The Taking of Ticonderoga in 1775. Cambridge, Mass.: Harvard University Press, 1928.
Gilchrist, Helen Ives. Fort Ticonderoga in History. [Fort Ticonderoga? N.Y.]: Printed for the Fort Ticonderoga Museum, [192-?].
Hargrove, Richard J. General John Burgoyne. Newark: University of Delaware Press, 1983.

Edward P. Alexander / a. r.

See also Boston, Siege of; Bunker Hill, Battle of; Burgoyne's Invasion; Oriskany, Battle of.

TIDELANDS, lands lying under the sea beyond the low-water limit of the tide but considered within the territorial waters of a nation. The U.S. Constitution does not specify whether ownership of these lands rests with the federal government or with individual states. Perhaps because little commercial value was attached to tidelands, ownership was never firmly established, but the states generally proceeded as if they were the owners.

The value of tidelands increased when it became known that vast oil and natural gas deposits lay within their limits and that modern technology made retrieval of these minerals commercially profitable. The first offshore oil well began production in 1938 in shallow water in the Gulf of Mexico one mile off the Louisiana coast; in 1947, a second well began to operate off the coast of Terrebonne Parish, also in Louisiana. In that same year, the Supreme Court ruled, in United States v. California, that the federal government and not the states owned the tidelands. The decision meant the loss of untold millions

of dollars in taxes and leasing fees by the states. The states whose tidelands were suspected of containing minerals objected strongly to the decision. The issue became important in the 1952 presidential campaign. The Republican candidate, Dwight D. Eisenhower, pledged legislation that would restore the tidelands to the states. Eisenhower won the election, and, in 1953, Congress passed two acts that fulfilled his campaign promise. The Submerged Lands Act extended state ownership to three miles from their actual coastline—except for Florida and Texas, which received ownership of the tidelands to within 10.5 miles of their coastlines. The Outer Continental Shelf Lands Act gave the United States paramount rights from the point where state ownership leaves off to the point where international waters begin.

The 1953 acts did not end all controversy, however. The Submerged Lands Act, in particular, was so badly drawn up that state taxes and leasing fees had to be put in escrow pending final resolution of the numerous lawsuits that emerged. The Supreme Court finally decided the issue on 31 May 1960 when it ruled that Mississippi, Alabama, and Louisiana owned the rights to the offshore lands for a distance of 3.5 miles, and Texas and Florida owned rights to tidelands within three leagues, or approximately 10.5 miles, from their coastline boundaries (United States v. States of Louisiana, Texas, Mississippi, Alabama, and Florida). In the case of Texas, the claim to special boundary limits had been recognized by Congress in the Treaty of Guadalupe Hidalgo of 1848, which ended the Mexican-American War. The ruling for Florida was based on congressional approval of Florida's claims when the state reentered the Union after the Civil War. Although the other Gulf states objected to what they considered preferential treatment for Florida and Texas, no new legislation resulted.

In 1963, the U.S. Justice Department settled the last of the tidelands controversies by ruling that the 1953 act gave control to the states of islands near the shore that were created after the states had been admitted to the Union.

BIBLIOGRAPHY

Bartley, Ernest R. The Tidelands Oil Controversy: A Legal and Historical Analysis. Austin: University of Texas Press, 1953.
Galloway, Thomas D., ed. The Newest Federalism: A New Framework for Coastal Issues. Wakefield, R.I.: Times Press, 1982.
Marshall, Hubert R., and Betty Zisk. The Federal-State Struggle for Offshore Oil. Indianapolis, Ind.: Published for the Inter-University Case Program by Bobbs-Merrill, 1966.

Thomas Robson Hay / c. w.

See also Constitution of the United States; Guadalupe Hidalgo, Treaty of; Natural Gas Industry; Territorial Sea.

TIDEWATER is a term commonly used to designate that portion of the Atlantic coastal plain lying east of the points in rivers reached by oceanic tides. This region, the

first to be occupied by settlers from the Old World, slowly became an area of comparative wealth. Merchants and shippers in the towns, and planters growing tobacco, rice, indigo, and cotton, dominated the tidewater population. Since the tidewater coastal area is so narrow in New England, the terminology is more applicable elsewhere, particularly to the middle and southern Atlantic regions that were initially British colonies and the later states of the federal Union.

First to settle and establish themselves economically, socially, and politically, tidewater region inhabitants secured control of the government. Almost inevitably, they used the machinery of government for their own benefit and in accordance with their own traditions and ideals, and they resisted any efforts to weaken their control. Nonetheless, the later population—composed largely of small farmers who moved out into the Piedmont region—found this governmental domination both unfair and injurious. A serious and long-standing sectional conflict resulted. Sometimes, as in the case of Bacon's Rebellion of 1676 in Virginia, the Paxton riots of 1764 in Pennsylvania, and the Regulator movement of 1768–1771 in North Carolina, the conflict resulted in open warfare. At times, manipulation and compromise kept violence down. Less violence accompanied the separation of West Virginia from the rest of Virginia in 1863 when the western counties, which had remained loyal to the Union, formed their own state. On all occasions, however, the serious conflict in ideals and interest had to be taken into consideration. The political history of the colonies, and later the states, can only be interpreted adequately in the light of this conflict.

The tidewater element of the population maintained control of the government largely by a device of disproportional representation that operated widely from Pennsylvania to Georgia. Another device was restricted suffrage, wherein a heavy property qualification benefited the wealthy of the tidewater region while it disadvantaged the poorer inhabitants of the interior. Using these devices to control the legislatures, the tidewater element pursued policies in regard to the Indians, debts, and taxes that were of most benefit to the tidewater population and therefore often injurious to the up-country population.

BIBLIOGRAPHY

Kars, Marjoleine. Breaking Loose Together: The Regulator Rebellion in Pre-Revolutionary North Carolina. Chapel Hill: University of North Carolina Press, 2002.
Kolp, John Gilman. Gentlemen and Freeholders: Electoral Politics in Colonial Virginia. Baltimore: Johns Hopkins University Press, 1998.
Lee, Wayne E. Crowds and Soldiers in Revolutionary North Carolina: The Culture of Violence in Riot and War. Gainesville: University Press of Florida, 2001.
Williams, John Alexander. West Virginia: A History. Morgantown: West Virginia University Press, 2001.

Alfred P. James / a. e.

See also Fall Line; Indigo Culture; Insurrections, Domestic; Paxton Boys; Regulators; Rice Culture and Trade; Tobacco Industry; Sectionalism.

TILL, EMMETT, LYNCHING OF. Emmett Louis Till was murdered in the Mississippi Delta on 28 August 1955, making the fourteen-year-old Chicagoan the best-known young victim of racial violence in the nation's history. Visiting relatives shortly before he would have started the eighth grade, Till entered a store in Money, in Leflore County, and as a prank behaved suggestively toward Carolyn Bryant, the twenty-one-year-old wife of the absent owner, Roy Bryant. This breach of racial etiquette soon provoked Roy Bryant and his half brother, J. W. Milam, to abduct Till from his relatives' home, pistol-whip and then murder him, and finally to dump the corpse into the Tallahatchie River.

Bryant and Milam were prosecuted in the early autumn. Despite forthright testimony by the victim's mother, Mamie Till, a jury of twelve white men quickly acquitted the defendants. The verdict was widely condemned even in the southern white press, and more sharply in the black press and the foreign press. The brutality inflicted upon a guileless teenager exposed the precarious condition that blacks faced—especially in the rural South—as did no other episode. Such violence in defense of racial supremacy and white womanhood helped to inspire the civil rights movement in the early 1960s.

Emmett Till. Murdered at fourteen in a small Mississippi town because white men believed he had whistled at a white woman. © Corbis

BIBLIOGRAPHY

Whitfield, Stephen J. A Death in the Delta: The Story of Emmett Till. New York: Free Press, 1988.

Stephen J. Whitfield

See also Lynching.

TILLMANISM. Tillmanism, which was strongest during the years 1890 through 1918, was an agrarian movement in South Carolina led by “Pitchfork Ben” Tillman (1847–1918) and characterized by violent white supremacy, the lionization of farmers, and hostility toward northern business interests and the aristocratic southern leadership. The traditional interpretation claims that the movement embraced a legitimate populism that helped the rural poor (Tillman helped found Clemson University, for example), even if it was marred by racism. Recent scholarship, however, argues that Tillman’s agrarian rhetoric was a crass tactic to gain control of Democratic Party machinery. In the current view farmers received little material benefit from Tillman’s policies and suffered from the backward social system that white supremacy created. BIBLIOGRAPHY

Kantrowitz, Stephen D. Ben Tillman & the Reconstruction of White Supremacy. Chapel Hill: University of North Carolina Press, 2000.

Jeremy Derfner

See also White Supremacy.

TIMBER CULTURE ACT. An 1870s weather hypothesis suggested that growing timber increased humidity and perhaps rainfall. Plains country residents urged the federal government to encourage tree planting in that area, believing trees would improve the climate. Also, 1870 government land regulations dictated that home seekers in Kansas, Nebraska, and Dakota could acquire only 320 acres of land. To encourage tree planting and increase the acreage open to entry, Congress passed the Timber Culture Act in 1873, declaring that 160 acres of additional land could be entered by settlers who would devote forty acres to trees. Some 10 million acres were granted under this act, but fraud prevented substantial tree growth. The act was repealed in 1891.

Gates, Paul W. History of Public Land Law Development. Washington, D.C.: Public Land Law Review Commission, 1968.

Paul W. Gates / f. b.

See also Forestry; Great Plains; Land Speculation.


TIME. The first issue of Time magazine appeared on 3 March 1923. The magazine was founded by the twenty-four-year-old Yale graduates Briton Hadden and Henry Luce.

They created a distinctive newsweekly that was "Curt, Clear, and Complete" to convey "the essence of the news" to the "busy man." Emphasizing national and international politics, Time contained brief articles that summarized the significant events of the week. Its authoritative and omniscient tone was created through the technique of "group journalism," in which the magazine was carefully edited to appear the product of a single mind. Time peppered its articles with interesting details and clever observations. It sought to make the news entertaining by focusing on personality. In its first four decades, over 90 percent of Time's covers featured a picture of an individual newsmaker. In 1927, the magazine began its well-known tradition of naming a "man of the year," making aviator Charles Lindbergh its first selection. Time's formula proved successful, particularly in appealing to better-educated members of the white middle class. By the end of the 1930s, circulation neared one million and its journalistic innovations were much imitated—newspapers were adding week-in-review sections and former Time employees launched Newsweek in 1933.

Particularly after Hadden's death in 1929, Time reflected the empire-building vision of Henry Luce. Beginning in the 1930s, Luce expanded the operations of Time, Inc. In 1930, he created Fortune, a business magazine widely read by the nation's economic leaders. In 1936, he created Life, a vastly popular magazine that summarized the weekly news events through pictures and had a seminal influence on the development of photojournalism. Luce also launched "The March of Time," both a radio program and a newsreel. Luce became a well-known advocate of the global expansion of American power and influence. In a famous 1941 Life editorial, Luce called for an "American Century" in which the United States would "accept wholeheartedly our duty and our opportunity as the most powerful and vital nation in the world and . . . exert upon the world the full impact of our influence, for such purposes as we see fit and by such means as we see fit." Luce's essay anticipated America's leadership of the capitalist world in the Cold War years, while his publications helped promote his patriotic, internationalist, and procapitalist views.

In the Cold War years, Time's reporting of the news reflected Luce's anticommunism. Throughout the 1940s, Time contained flattering portraits of the Chinese dictator Chiang Kai-shek and urged greater U.S. effort to prevent the victory of Mao Zedong and communism in China. The magazine's support of Cold War principles is clearly represented in a 1965 Time essay declaring the escalating battle in Vietnam to be "the right war, in the right place, at the right time." The Cold War years were a time of great expansion for Time, as it became America's most widely read news magazine, reaching a circulation of over four million by the end of the 1960s.

After Luce's death in 1967, Time made a number of changes to its distinctive journalistic style. In response to the growing influence of television news, Time granted bylines to writers, expanded its article lengths, shifted its focus from personality to issues, and added opinion pieces. However, much of Time's original journalistic vision of a news summary delivered in an authoritative and entertaining tone persisted, not just in Time, but also in the news media as a whole.

Meanwhile, Time, Inc., continued to expand. In the 1970s, Time acquired a large stake in the developing field of cable television. In 1989, it merged with Warner Brothers to become Time Warner. In 2001, it merged with America Online to become the gigantic media conglomerate AOL Time Warner, with large operations in television, publishing, music, film, and the Internet. Thus, even as the journalistic vision of the original Time had lost its distinctiveness, Luce's plan to make Time the cornerstone of a media empire was far more successful than his wildest expectations at the magazine's founding.

BIBLIOGRAPHY

Baughman, James L. Henry R. Luce and the Rise of the American News Media. Boston: Twayne, 1987.
Elson, Robert T. Time, Inc.: The Intimate History of a Publishing Enterprise, 1923–1941. New York: Atheneum, 1968.
———. The World of Time, Inc.: The Intimate History of a Publishing Enterprise, 1941–1960. New York: Atheneum, 1973.
Herzstein, Robert. Henry R. Luce: A Political Portrait of the Man Who Created the American Century. New York: Scribners, 1994.
Prendergast, Curtis, with Geoffrey Colvin. The World of Time: The Intimate History of a Changing Enterprise, 1960–1980. New York: Atheneum, 1986.

Daniel Geary

See also Magazines.

TIMES BEACH, a town in Missouri, came to national attention in December 1982, when Environmental Protection Agency (EPA) officials learned that soil samples taken from the town's dirt roads contained dioxin, a toxic chemical by-product, in concentrations hundreds of times higher than levels considered safe for human exposure. The EPA found that a contractor hired by Times Beach to control dust on its dirt roads had sprayed them with waste automotive oil mixed with industrial sludge from a defunct chemical company. The EPA purchased all the property in Times Beach and permanently evacuated its 2,000 residents. The buyout was the first under the Superfund program.

BIBLIOGRAPHY

Posner, Michael. “Anatomy of a Missouri Nightmare.” Maclean’s 96 (April 4, 1983): 10–12.


Times Square at Night. The Great White Way still shines despite the Great Depression, in this photograph by Irving Underhill from the early 1930s. Library of Congress

Switzer, Jacqueline Vaughn. Environmental Politics: Domestic and Global Dimensions. New York: St. Martin’s Press, 1994.

John Morelli / c. w. See also Chemical Industry; Conservation; Environmental Protection Agency; Hazardous Waste; Superfund.

TIMES SQUARE in New York City, formerly Longacre Square and often referred to as the “Great White Way” because of the Broadway theaters’ lights that illuminate the district, is formed by the intersection of three streets—Broadway, Seventh Avenue, and Forty-second Street. It was renamed for the New York Times building erected at the opening of the twentieth century. By the 1920s the neighborhood became a concentrated entertainment district of theaters, vaudeville, cabarets, bars, and restaurants. The 1929 stock market crash took its toll on the area and many businesses that once attracted a well-heeled clientele turned to seamier forms of entertainment. In particular, pornographic movie houses, “peep shows,” and the flesh trade gradually infested the district. By the 1960s the drug trade added an additional element of danger to the neighborhood. However, the area was never totally deserted by legitimate forms of entertainment and Broadway shows always guaranteed the retention of a certain flow of legitimate commercial traffic into the area. During the 1990s, New York City began a slow but steady push for its revitalization. In the early 2000s


that process, sometimes referred to as "Disneyfication," was nearly complete and the district was a mecca for family-oriented tourism and entertainment.

BIBLIOGRAPHY

Rogers, W. G. Carnival Crossroads: The Story of Times Square. Garden City, N.Y.: Doubleday, 1960. Stone, Jill. Times Square: A Pictorial History. New York: Collier, 1982. Taylor, William, ed. Inventing Times Square: Commerce and Culture at the Crossroads of the World. New York: Russell Sage Foundation, 1991.

Faren R. Siminoff See also Broadway.

TIMUCUA. In the sixteenth century, prior to contact with the Spanish, around 200,000 Timucuans lived in what is today northern Florida and southern Georgia. Approximately thirty-five distinct chiefdoms divided the area politically. Their language, Timucua, is so strikingly different from other southeastern languages that some linguists have argued that the Timucuans may have originated in Central or South America, but archaeological evidence, some of it 5,000 years old, seems to undermine these claims.


Each of the chiefdoms consisted of two to ten villages, with lesser villages and leaders paying tribute to higher-status chiefs. Both men and women served as chiefs. Before contact with the Spanish, Timucuans lived in close proximity to wetlands, and supported themselves by hunting, fishing, and gathering. Because of their rapid demise after contact with the Spanish, little is known about Timucuan culture and lifeways. Archaeologists in recent decades have begun to fill in sorely needed details about diet, burial practices, and political structures. Contact with the Spanish brought sweeping changes to Timucua country as each of the thirty-five chiefdoms received its own Franciscan mission between 1595 and 1630. The presence of Spanish missions brought about more than just religious change; the once locally oriented Timucuans were drawn into the broader struggle for empire on a global scale. In the missions, Timucuans built churches, forts, and barracks for the Spanish; they also raised pigs and sheep and grew corn. Indians grew the food and provided the labor that allowed the Spanish to dominate the Southeast throughout the seventeenth century. At the same time, disease imported from Europe wreaked havoc on Timucuan peoples. Epidemics caused severe depopulation: by the 1650s, only around 2,000 Timucuans remained. Although their population declined drastically, mission life provided some measure of stability for Timucuans. This stability was short-lived, however. The founding of English colonies at Jamestown (1607) and Charles Town (1670) renewed conflict between Spain and Britain, and Carolina slavers and allied Native groups continually raided the Spanish missions for captives. When Spain evacuated Florida following the Seven Years’ War, all of the remaining Timucuans were taken to Cuba. The last full-blooded Timucuan died in Cuba in 1767. BIBLIOGRAPHY

Milanich, Jerald T. The Timucua. Cambridge, Mass.: Blackwell, 1996. ———. Laboring in the Fields of the Lord: Spanish Missions and Southeastern Indians. Washington, D.C.: Smithsonian Institution Press, 1999. ———. “The Timucua Indians of Northern Florida and Southern Georgia.” In Indians of the Greater Southeast: Historical Archaeology and Ethnohistory. Edited by Bonnie G. McEwan. Gainesville: University Press of Florida, 2000. Worth, John. The Timucuan Chiefdoms of Spanish Florida. 2 vols. Gainesville: University Press of Florida, 1998.

Matthew Holt Jennings See also Tribes: Southeastern.

Tin Pan Alley. A photograph by G. D. Hackett of several music-publishing companies in buildings on Twenty-eighth Street in Manhattan. Getty Images

TIN PAN ALLEY, a phrase probably coined early in the 1900s, described the theatrical section of Broadway in New York City that housed most publishers of popular songs. As the music-publishing industry moved from the area around Twenty-eighth Street and Sixth Avenue to Thirty-second Street and then to the area between Forty-second and Fiftieth streets, the name "Tin Pan Alley" moved with it. The term suggests the tinny quality of the cheap, overabused pianos in the song publishers' offices. As the songwriting and music-publishing industry moved to other parts of the city, and to other cities as well, Tin Pan Alley became a term applied to the industry as a whole.

BIBLIOGRAPHY

Furia, Philip. The Poets of Tin Pan Alley: A History of America's Great Lyricists. New York: Oxford University Press, 1990. Jasen, David. Tin Pan Alley: The Composers, the Songs, the Performers and their Times: The Golden Age of American Popular Music from 1886–1956. New York: D. I. Fine, 1988. Tawa, Nicholas. The Way to Tin Pan Alley: American Popular Song, 1866–1910. New York: Schirmer Books, 1990.

Stanley R. Pillsbury / h. r. s. See also Broadway; Music Industry.

TINIAN (from 24 July to 1 August 1944). The invasion of Tinian by American forces was necessary to secure the occupation of its neighbor Saipan, captured the previous


month. Landing beaches on northern Tinian were chosen to take advantage of field artillery based on Saipan. On the morning of 24 July, following several days of bombardment, the Fourth Marine Division came ashore and pushed rapidly inland, surprising the Japanese force of 8,000. Reinforcements from the Second and Fourth Marine Divisions landed on 25 July and swept to the southern tip by 1 August, killing most of the Japanese garrison. American casualties were 328 killed and 1,571 wounded. Tinian became a major U.S. Army Air Forces base for the strategic bombardment of Japan.

BIBLIOGRAPHY

Crowl, Philip. Campaign in the Marianas. Washington, D.C.: Office of the Chief of Military History, Dept. of the Army, 1960. Hoffman, Carl W. The Seizure of Tinian. Washington, D.C.: Historical Division, Headquarters, U.S. Marine Corps, 1951. Hoyt, Edwin P. To the Marianas: War in the Central Pacific, 1944. New York: Van Nostrand Reinhold, 1980.

Philip A. Crowl / a. r.
See also Air Power, Strategic; Marine Corps, United States; Philippine Sea, Battle of the; Saipan; Trust Territory of the Pacific; World War II; World War II, Air War against Japan.

TIPI, a conical skin tent best known from the Plains Indians but with historical roots among the indigenous people of the Arctic. All tipis have a central fire hearth, an east-facing entrance, and a place of honor opposite the door. Plains tipis are actually tilted cones, with a smokehole with controllable flaps set off to one side, and an interior lining for ventilation and insulation. Tipi covers historically were bison hide, but modern tipis use canvas. Plains tipis use either a three- or a four-pole framework overlain with additional poles as needed. Covers are stretched over the poles, staked, or weighted down with stones. Tipis were an excellent adaptation for hunting and gathering peoples who needed a light, transportable, yet durable residence.

Tipis. Young Man Afraid of His Horse, an Oglala chief of the Lakota tribe (part of the Sioux confederation), stands in front of several of these Plains Indian dwellings. National Archives and Records Administration

BIBLIOGRAPHY

Davis, Leslie, ed. From Microcosm to Macrocosm: Advances in Tipi Ring Investigation and Research. Plains Anthropologist 28–102, pt. 2, Memoir 19 (1983). Twenty-three papers investigate tipi use on the Great Plains from prehistory to the period after Euroamerican contact. Heavily illustrated. Laubin, Reginald, and Gladys Laubin. The Indian Tipi: Its History, Construction, and Use. Norman: University of Oklahoma Press, 1977. Originally published in 1955, it is the most complete book on the tipi available and contains excellent illustrations throughout.

Larry J. Zimmerman See also Tribes: Great Plains.

TIPPECANOE, BATTLE OF (7 November 1811). In response to pressure from white settlers, the Shawnee leader Tecumseh organized a confederacy of Native American tribes in the Indiana and Michigan territories. The crisis came in the summer of 1811, when Tecumseh, after renewing his demands on Gen. William Henry Harrison, governor of the Indiana Territory, at Vincennes, departed to rally the tribes of the Southwest to the confederacy. Urged on by the frantic settlers, Harrison decided to strike first. On 26 September Harrison advanced with 1,000 soldiers on the Indian settlement of Prophetstown, along Tippecanoe Creek, 150 miles north of Vincennes. He spent most of October constructing Fort Harrison at Terre Haute, resuming his march on 28 October. With the town in sight, Harrison yielded to belated appeals for a conference. Turning aside, he encamped on an elevated site a mile from the village. Meanwhile the Native American warriors, a mile away, were stirred to a frenzy by the appeals of Tecumseh's brother Tenskwatawa ("the Prophet"). Shortly before dawn (7 November), they drove in Harrison's pickets and furiously stormed the still-sleeping camp. Harrison's soldiers deflected the attack with a series of charges, attacked and razed the Indian town on 8 November, and began the retreat to distant Fort Harrison.

Although Tippecanoe was popularly regarded as a great victory and helped Harrison's political fortunes, the army had struck an indecisive blow. With almost one-fourth of his followers dead or wounded, Harrison retreated to Vincennes, where the army was disbanded or scattered. During the War of 1812, federal troops would again do battle with Tecumseh, who had formed an alliance with the British.

BIBLIOGRAPHY

Bird, Harrison. War for the West, 1790–1813. New York: Oxford University Press, 1971. Edmunds, R. David. The Shawnee Prophet. Lincoln: University of Nebraska Press, 1983. ———. Tecumseh and the Quest for Indian Leadership. Boston: Little, Brown, 1984. Peterson, Norma L. The Presidencies of William Henry Harrison and John Tyler. Lawrence: University Press of Kansas, 1989.

M. M. Quaife / a. r. See also Indian Policy, U.S., 1775–1830; Indiana; Shawnee; Tecumseh, Crusade of; Thames, Battle of the; “Tippecanoe and Tyler Too!”; War Hawks; War of 1812.

"TIPPECANOE AND TYLER TOO!" was the campaign slogan of the Whigs in 1840, when William Henry Harrison, the hero of the Battle of Tippecanoe, and John Tyler were their candidates for the presidency and vice-presidency, respectively. The party cry typified the emotional appeal of the Whig canvass. Deliberately avoiding issues, its supporters wore coonskin caps, built campaign log cabins in almost every town of consequence, and freely dispensed hard cider to the voters, who were persuaded that Harrison had saved the country from untold Indian atrocities. Few American political slogans have been such unadulterated demagoguery.

BIBLIOGRAPHY

Gunderson, Robert G. The Log-Cabin Campaign. Lexington: University of Kentucky Press, 1957. Varon, Elizabeth. “Tippecanoe and the Ladies, Too.” Journal of American History 82 (September 1995).

Irving Dilliard / l. t. See also Elections; Elections, Presidential: 1840; Whig Party.

TITANIC, SINKING OF THE. On 10 April 1912 the White Star Line's royal mail steamer Titanic, a ship many considered unsinkable, set sail on its maiden voyage from Southampton, England, with stops at Cherbourg, France, and Queenstown, Ireland. On board were many of the most wealthy and influential people in early twentieth-century society and hundreds of emigrants. On 14 April, at 11:40 p.m., the Titanic, some four hundred miles from the coast of Newfoundland, hit an iceberg on its starboard side. Shortly after midnight the crew was instructed to prepare the lifeboats and to alert the passengers. The lifeboats had capacity for only one-half of the passengers, and some of the boats left the ship not fully loaded. At 2:20 a.m. the Titanic disappeared. Although the Titanic sent out distress calls, few vessels carried wireless radios, and those that did staffed them only during daytime hours. The eastbound liner Carpathia, some fifty miles away, responded to the Titanic's signals and began taking on survivors. The Carpathia rescued 705 people, but 1,523 died.

Five days after the sinking, the White Star Line chartered a commercial cable company vessel, the Mackay-Bennett, to search the crash area for bodies. Ultimately three other ships joined the search, and 328 bodies were recovered. To aid in identification, hair color, weight, age, birthmarks, jewelry, clothing, and pocket contents were recorded. Nevertheless 128 bodies remained unidentified. Amid calls for an investigation of the tragedy, hearings began in the United States and in England. Neither inquiry blamed the White Star Line, but both issued a series of recommendations, including lifeboats for all passengers, lifeboat drills, a twenty-four-hour wireless, and an international ice patrol to track icebergs.


The Titanic story evolved into a major cultural phenomenon. The fascination began with the initial newspaper reports, which, while exaggerating stories of supposed heroism, led to the erection of countless memorial plaques, statues, fountains, and buildings in both England and the United States. After this initial outpouring of grief, interest in the Titanic lagged, but following the publication in 1955 of Walter Lord's A Night to Remember, additional books and films about the tragedy appeared. Robert Ballard's discovery of the wrecked Titanic in 1985 and the subsequent publication in 1987 of his book, The Discovery of the Titanic, brought a deluge of Titanica. Included in this flood were video games, CD-ROMs, classical music scores, documentaries, and traveling exhibits of artifacts, mementos, and memorabilia from the ship. In 1997 a Broadway musical was staged and James Cameron directed an epic film. The discovery also revealed new information: it was not a long gash but a strategically placed hull puncture that sank the ship. This information in turn raised speculation about the strength and reliability of the steel and rivets used in its construction and renewed questions about the vessel's speed, iceberg warnings, the conduct of the crew and certain first-class passengers, treatment of third-class passengers, and the ship on the horizon.

The Titanic saga seems unending. It continually fascinates as a microcosm of the Edwardian world of the early twentieth century. Its passengers, like John Jacob Astor, Benjamin Guggenheim, Isidor and Ida Straus, and Charles Thayer, held the wealth and status of today's rock music, entertainment, and sports figures. The Titanic story has something for everyone—the ultimate shipwreck, strictures against overconfidence in technology, the results of greed and rampant capitalism, and what-ifs and might-have-beens. The Titanic, if sinkable in reality, remains unsinkable in cultural memory and imagination.

Titanic. Survivors in a lifeboat reach the Carpathia and safety. Library of Congress

BIBLIOGRAPHY

Ballard, Robert D., with Rick Archbold. The Discovery of the "Titanic." New York: Warner, 1987. Biel, Steven. Down with the Old Canoe: A Cultural History of the "Titanic" Disaster. New York: Norton, 1996. Eaton, John P., and Charles A. Haas. "Titanic": Destination Disaster. New York: Norton, 1987. ———. "Titanic": Triumph and Tragedy. New York: Norton, 1995. Lord, Walter. A Night to Remember. New York: Holt, 1955. ———. The Night Lives On. New York: Morrow, 1986. Lynch, Don, and Ken Marschall. "Titanic": An Illustrated History. Toronto: Madison Press, 1992.

John Muldowny See also Disasters; Shipbuilding; Transportation and Travel.


TITHES, SOUTHERN AGRICULTURAL, were an expedient of the Confederate congress for securing subsistence for its armies. Because taxes collected in the depreciated Confederate currency did not bring in enough revenue, the ten percent levy in kind was adopted on 24 April 1863, to tap the resources of the 7 or 8 million Confederate farms. Yeoman farmers were especially burdened under the system, and their grievances, which exacerbated preexisting class tensions, eroded Confederate morale in the final months of war. But the revenues produced by the tithes were indispensable in sustaining the southern war effort.

BIBLIOGRAPHY

Escott, Paul D. After Secession: Jefferson Davis and the Failure of Confederate Nationalism. Baton Rouge: Louisiana State University Press, 1978.

Francis B. Simkins / a. r. See also Civil War; Confederate States of America; Excess Profits Tax; Taxation; Union Sentiment in the South.

TITLES OF NOBILITY. A title of nobility grants special privileges to an individual at the expense of the rest of the people. The U.S. Constitution prohibits both federal and state governments from granting titles of nobility, and prohibits federal officials from accepting them, but does not prohibit private citizens from accepting them. No case regarding titles of nobility has reached the Supreme Court, but the issue has been raised at the trial level, where plaintiffs usually argue that the privileges of government officials or agents amount to constructive titles of nobility. This position is supported by an 1871 U.S. Attorney General ruling (13 Ops. Atty. Gen. 538) that a commission making someone a diplomatically accredited representative of a foreign government, with the special immunities diplomats enjoy, would constitute a title of nobility. The courts have, however, tended to


avoid ruling on this argument, preferring to interpret "titles" narrowly, as the English titles, duke, marquis, earl, count, viscount, or baron, rather than consider the privileges that would create functional equivalents. In feudal tradition, those titles usually brought land or the income from land and special legal privileges; they also required special duties to the king. It should be noted that both Constitutional prohibitions occur in the same sections with the prohibitions against ex post facto laws and bills of attainder, which are the obverse of titles of nobility, the legislative imposition of legal disabilities on persons without due process. The founders allowed for minor privileges and the kinds of disabilities that come with general regulations and taxes, but sought to exclude both extremes: great privileges and great deprivations of rights.

BIBLIOGRAPHY

Blackstone, William. Commentaries on the Laws of England. Book I, Part I, Section IV. Ed. St. George Tucker. 1803. Available from http://www.constitution.org/tb/tb-1104.htm. Bouvier, John. Bouvier's Law Dictionary. 1856. Entry on "Nobility." Available from http://www.constitution.org/bouv/bouvier_n.htm. The Federalist, Nos. 39, 44, 69, 84, 85. Available from http://www.constitution.org/fed/federa00.htm. Segar, Simon. Honores Anglicani: or, Titles of Honour. The temporal nobility of the English nation . . . have had, or do now enjoy, viz. dukes, marquis. London, 1712. Virginia Ratifying Convention, June 17, 1788, Speech by Edmund Randolph. Available from http://www.constitution.org/rc/rat_va_14.htm.

Jon Roland See also Constitution of the United States.

TOBACCO AS MONEY. Because of the scarcity of specie, Virginia, Maryland, and North Carolina used tobacco as currency throughout most of the colonial period. In 1619 the Virginia legislature "rated" high-quality tobacco at three shillings and in 1642 made it legal tender. Nearly all business transactions in Maryland, including levies, were conducted in terms of tobacco. North Carolina used tobacco as money until the outbreak of the Revolution. Sharp fluctuations in tobacco prices led Virginia in 1727 to adopt a system of "tobacco notes," certificates issued by inspectors of government warehouses. The obvious weakness of tobacco as currency—notably, lack of portability and variability of value—became more apparent with time, and it was abandoned in the second half of the eighteenth century.

BIBLIOGRAPHY

Breen, T. H. Tobacco Culture. Princeton, N.J.: Princeton University Press, 2001.

Hugh T. Lefler / a. r.

See also Barter; Colonial Commerce; Cotton Money; Maryland; Tobacco Industry.

TOBACCO INDUSTRY. Tobacco in the form of leaf, snuff, chew, smoking tobacco, cigars, and factory-made cigarettes has often been called the United States' oldest industry. Since its introduction to Europeans by American Indians, no other agricultural crop has been more thoroughly entwined with the history of the United States than the growing, processing, and manufacturing of tobacco. In addition, no one product has enjoyed deeper ties to the colonization of the New World and to the expansion of international trade between the New World and Europe, Asia, and the Middle East over the last four centuries. The prospect of farming tobacco and selling it to England brought the earliest British colonists to Virginia and Maryland, and at the end of the twentieth century U.S. companies such as Philip Morris and RJR Nabisco continued to dominate the international cigarette market and stood among the most profitable transnational corporations. U.S. tobacco growing, manufacturing, distribution, marketing, and sales contributed $15 billion in wages to some 660,000 American workers.

For many centuries tobacco has been identified with the New World, especially the United States. In the form of the mass-produced cigarette, U.S. tobacco became the virtual international symbol of American modernity. Indeed, students of the industry have argued that the advent of machine-made cigarettes in the 1880s helped inaugurate in the United States the modern era of mass consumer products, mass advertising and promotion, and the professionally managed modern corporation.

However, the last half of the twentieth century saw the U.S. tobacco industry come under pressure from the demonstrated health hazards of smoking and the subsequent steady decline in smoking in the United States and other highly industrialized nations. In response, the industry aggressively pursued expansion into markets in Asia, Eastern Europe, and Africa, prompting the World Health Organization to accuse tobacco manufacturers of fomenting a tobacco epidemic. Equally worrisome for the industry, at century's end the growth of class-action lawsuits, the publication of documents revealing corporate manipulation of the political and legal process and the willful distortion and suppression of scientific findings, and the rise of government antitobacco measures further clouded the future of the domestic tobacco market. Cigarette makers faced the prospect of being demoted to the status of a rogue industry in the eyes of the U.S. citizenry.

Early History: Production and Consumption
Most modern tobacco consumption derives from Nicotiana tabacum, which is a species of nightshade plant. The general consensus is that the tobacco plant originated in South America and was spread by American Indians to North America and the South Pacific and Australia. The arrival of Europeans in the New World introduced them


to tobacco, and by the early seventeenth century commercial tobacco became a driving force of colonization in North America and the Caribbean. The Jamestown colony in Virginia owed its very survival to tobacco. A cash crop requiring very intensive labor from planting to harvesting to curing, its cultivation created a demand for conscripted labor, first in the form of indentured European servants on family farms and soon afterward in the form of African slave labor on large landholdings. Two types of tobacco leaf were grown, principally for pipe smoking and, later on, snuff. They were both dark varieties: the more expensive leaf grown in Virginia and the stronger, cheaper orinoco leaf grown in Maryland. In England, demand for tobacco rapidly grew and by 1628 the Chesapeake colonies exported 370,000 pounds annually to England, procuring substantial tax revenues for the state, which overcame early Crown hostility to tobacco cultivation and consumption. Tobacco farming spread quickly to North Carolina, South Carolina, Kentucky, Tennessee, and Georgia. It also extended to two other regions in which cigar (Cuban) leaf cultivation would come to dominate in the nineteenth century: the Northeast (Pennsylvania, New York, Connecticut, and Massachusetts) and, later, the Midwest (Ohio, Illinois, Wisconsin, Minnesota, and Missouri). In 1700 exports of raw leaf from the British Chesapeake colonies reached 37 million pounds and by the outbreak of the American Revolution in 1776 upward of 100 million pounds. At the end of the eighteenth century, the main producers of tobacco were the United States, Brazil, and Cuba. After a decline following the American Revolution, U.S. production rebounded, but only slowly due to the Napoleonic Wars (1799 through 1815) and the War of 1812. Production then rose sharply to 434 million pounds in 1860 and, after a drop due to the Civil War, resumed its growth, averaging 660 million pounds in 1900 through 1905, of which one-half was consumed domestically. From 1945 to the 1980s, U.S. annual production averaged two billion pounds.

Sorting Tobacco. African American workers, mostly women, sort tobacco at the T. B. Williams Tobacco Company in Richmond, Va., c. 1899. Library of Congress


Throughout most of their history, Americans overall and men in particular remained the heaviest consumers of tobacco worldwide, principally in the form of chewing and smoking tobacco. Europeans consumed tobacco by smoking it in clay pipes until the eighteenth century, when manufactured snuff became dominant starting in Spain. While chewing tobacco was rare in Europe, it was quite popular in the United States among men and remained so up to the early twentieth century. Pipe smoking was also popular among men and some women in the nineteenth century. Women also used snuff. It was taken by New York society women and by women of all classes in the South. In Europe, pipe smoking made a comeback in the nineteenth century at the expense of snuff, but was soon forced to accommodate the new vogues for cigar and cigarette smoking popular both there and in North America.

These shifts in consumption patterns stemmed in part from the development in the nineteenth century of new, lighter leaves of the bright and the burley varieties, which were more suitable for chewing and cigarette smoking and competed with the dark leaf grown in Virginia and Maryland. By the end of the nineteenth century, the bulk of U.S. tobacco production had shifted away from low-lying areas of Maryland and Virginia to the Virginia–North Carolina Piedmont region and to Kentucky, where the bright and the burley varieties flourished. By 1919 bright accounted for 35% of the U.S. tobacco crop, burley for 45%.

Industrializing Tobacco and the Rise of the Cigarette
Until 1800 tobacco manufacturing proper was largely carried out in Europe. Initially, U.S. factories were dispersed in the tobacco-growing regions of Virginia, North Carolina, Tennessee, Kentucky, and Missouri, which used slave labor. New York, a center of snuff production, was the exception. Manufacturing of tobacco also thrived among planters who prepared tobacco for chew. After the Civil War, the introduction of steam-powered shredding and cigarette machines and pressures stemming from the rise of national markets led to the concentration of tobacco manufacturing in that sector. Cigar manufacturing underwent a similar evolution somewhat later. Cigars first became popular in the United States after the Mexican-American War, and their manufacture was fairly dispersed in cigar leaf-growing regions. However, by 1905 the greatest centers of cigar manufacturing were Philadelphia, New York, Boston, Cincinnati, Chicago, Baltimore, Richmond, Tampa, and Key West.

In the United States, the convenience and simplicity of smoking cigarettes made from the bright variety of tobacco was discovered by Union and Confederate troops alike during the Civil War. Ready-made cigarettes using mixtures of bright and burley tobacco allowed U.S. manufacturers to develop cheaper brands. U.S. cigarette production boomed between 1870 and 1880, rising from 16


million cigarettes (compared to 1.2 billion cigars) annually to over 533 million, reaching 26 billion by 1916. The growth of the U.S. population between 1880 and 1910 and the decline of chewing tobacco due to antispitting ordinances further expanded the market for cigarettes. With this growth arose new aggressive methods of packaging (striking colors, designs, logos, brand names), promoting (gifts, picture cards, free samples, discounts and rebates to jobbers, retailers, etc.), and advertising (newspapers, billboards, posters, handbills, endorsements) cigarettes to an emerging national market.

In 1881 James Bonsack patented a new cigarette-making machine that turned out over 120,000 cigarettes per day. Until then, factory workers rolled up to 3,000 cigarettes a day. The Bonsack machines made the fortune of James B. Duke, who adopted them in 1884. By securing exclusive rights over Bonsack machines and devoting 20% of his sales revenues to advertising, Duke helped create a mass national market, which he soon dominated. By 1889 W. Duke and Sons had become the world's leading manufacturer of cigarettes, with 40% of the U.S. market. That same year Duke pressured his rivals into forming the American Tobacco Company with Duke as president. The trust did not own any tobacco farms, and employed its considerable leverage to depress the price of tobacco leaf. This unequal relationship to the detriment of growers reached a crisis point forty years later during the Great Depression, necessitating the tobacco price support program of 1933—still in place at the end of the twentieth century—which rescued tobacco growers, many of them tenant farmers, from certain ruin. The trust also proceeded to absorb important rivals as well as manufacturers of chew, snuff, smoking tobacco, and cigars including R.J. Reynolds, P. Lorillard, Liggett and Myers, the American Snuff Company, and the American Cigar Company.

The geometric increase in cigarette production spurred the trust to make a major innovation in modern corporate practices: to seek outlets in foreign markets (not controlled by state monopolies), often by buying local companies outright (United Kingdom, Japan) and later by setting up factories abroad (China). American Tobacco Company's incursion into Britain provoked British companies to form a cartel, Imperial Tobacco. In turn, in 1902 Imperial Tobacco formed a joint company, but with minority interest, with American Tobacco called the British-American Tobacco Company (BAT). Together the U.S. and U.K. cartels exploited overseas markets while withdrawing from each other's domestic market. At the turn of the century, upward of one-third of the U.S. trust's cigarettes were exported, and 54 percent of these, or 1.2 billion, went to China alone. By 1910, the year before its demise, the trust accounted for 75 percent of U.S. tobacco production of all kinds. In 1911, the Supreme Court found the American Tobacco Company in violation of the Sherman Antitrust Act and ordered its breakup into four major companies: the American Tobacco Company, Liggett and Myers, R.J. Reynolds, and P. Lorillard.

In 1900 machine-made cigarettes still accounted for only 3 to 4 percent of U.S. tobacco leaf production. Their greatest growth lay ahead: by 1940 the figure had risen to 50 percent (or 189 billion cigarettes) and by 1970 to 80 percent (or 562 billion cigarettes). In 1913 the newly independent R.J. Reynolds launched Camels, the "first modern cigarette." An innovative blend of burley and Turkish tobacco backed by a massive publicity campaign, Camels were quickly imitated by American Tobacco's Lucky Strike and Liggett and Myers' revamped Chesterfield cigarettes (in 1926 Lorillard jumped in with its Old Gold brand). All three brands stressed their mildness and directed their appeal to men and women alike. Between them the three brands enjoyed 65 to 80 percent market share through the 1940s. The 1920s saw the "conversion" of many tobacco consumers to the cigarette in the United States, United Kingdom, Europe, China, and Japan. Between 1920 and 1930, U.S. cigarette consumption doubled to 1,370 cigarettes per capita.

Smoking and Health
As in the previous century, war was to prove a boon to U.S. tobacco, especially cigarettes. The rations of American soldiers and sailors included tobacco. With each world war, U.S. consumption of tobacco jumped and that of cigarettes soared, leaping 57 percent between 1916 and 1918 and 75 percent between 1940 and 1945. Per capita consumption in the United States reached almost 3,500 per year by 1945, a rate matched only by the United Kingdom and Canada. It would be twenty years before nations in continental Europe and East Asia would achieve similar rates.

By 1955 in the United States, 57 percent of men and 28 percent of women smoked. A veritable culture of cigarette smoking had arisen. It was a culture of glamour, style, and modern individualism featured and promoted in fashion magazines and Hollywood films. It would appear that the widespread movement by women to adopt cigarettes began prior to advertising campaigns actively directed at them and coincided with the culmination of the suffragette movement's drive to obtain the right to vote. Commentators openly associated cigarettes with women's emancipation. Estimates vary, but by 1929 around 16 percent of women smoked cigarettes, a figure that rose to 25 to 35 percent in the late 1940s and peaked at around 30 to 35 percent in the early 1960s.

Ever since King James I's denunciation of tobacco in the seventeenth century as detrimental to one's health and character, tobacco had been the object of recriminations by politicians, religious leaders, heads of industry, and social commentators. At the very moment cigarettes took off as a popular consumer product in the 1880s and 1890s, antismoking crusaders were waging successful campaigns banning the sale or consumption of tobacco in seventeen states, but their success was short-lived: World War I undid most of the legislation. Prior to World War II, cases of lung cancer were relatively rare in the United States, the United Kingdom, and Canada, the heaviest-smoking countries, but rates in men were rising fast, prompting


Processing Tobacco. African American men process tobacco at the T. B. Williams Tobacco Company in Richmond, Va., c. 1899. Library of Congress

medical researchers to initiate the first statistical studies of the disease. Results of early studies in the United States and the United Kingdom appeared in 1950 just as the Federal Trade Commission was castigating the tobacco industry for making false health claims for their products. Reports of other studies followed over the next two years resulting in a health scare that precipitated a temporary 10 percent drop in consumption. The industry responded in two ways: by promoting filtered-tipped cigarettes (42 percent of all cigarettes by 1956 through 1960) and mentholated brands, which they claimed to be less harsh and harmful; and by questioning the validity of the studies, a tactic it would pursue with each unfavorable new scientific finding up through the 1990s, especially through its Council for Tobacco Research and the industry’s lobbying arm, the Tobacco Institute. Meanwhile, tobacco in the United States, as in many countries, because of its economic importance, the substantial tax revenues it contributed to federal and state coffers ($3 billion in 1964 and $13.4 billion in 1998), and its campaign contributions, benefited from its special status as neither a food nor a drug and thus escaped formal government regulation as to its effects. Under pressure from health organizations, the government published in 1964 a landmark report of the Surgeon General warning the American public of the dangers of smoking. It was the first in a long series of Surgeon General reports that reviewed existing studies on tobacco-


related diseases and, beginning in the 1980s, on women and smoking, nicotine addiction, modified cigarettes, cessation, secondhand smoke, youth initiation, and smoking among racial and ethnic minority groups.

The political and economic picture of the domestic market for the tobacco industry had changed. In 1965, the industry had to work vigorously to keep the new cigarette warning labels watered down, and in 1970 the industry withdrew all radio and television ads voluntarily in order to eliminate free broadcast time awarded by the Federal Communications Commission starting in 1967 for antismoking public service announcements. Segregation of smokers in airplanes and other forms of public transportation began in 1972 and was extended to public buildings in Arizona (1974) and Minnesota (1975). New studies on the dangers of secondhand smoke in the 1980s and 1990s galvanized the antismoking movement to pressure federal, state, and local governments to ban smoking completely in public buildings, public transportation, stores, theaters, and schools, establish smoking sections in workplaces and restaurants, and, in the case of California, ban smoking in all indoor public areas including workplaces, restaurants, and bars. U.S. cigarette consumption began to decline. Men's and women's rates had already dropped from 52 and 34 percent, respectively, in 1965 to 44 and 32 percent in 1970 and to 38 and 29 percent by 1980. By 1990, the rates had dropped precipitously to 28 and 23, respec-


tively, and by 1999 to 26 and 22 percent. Per capita cigarette consumption peaked in the early 1970s at around 4,000 and steadily dropped from 1980 (3,850) to 1999 (2,000). Meanwhile, tobacco-related diseases (lung cancer, emphysema, coronary heart disease, stroke) became the leading preventable causes of death—over 400,000 deaths in 1990. For women, the number of deaths due to lung cancer surpassed those due to breast cancer in 1987.

Industry adjusted by offering low-tar and low-nicotine cigarettes (a 40 percent drop in yields between 1967 and 1981), cheaper brands, and promotion gimmicks such as coupons and giveaways, and by systematically opposing growing legal challenges. In a changing market, one company that rose to preeminence was Philip Morris, thanks to its innovative marketing. Its market share surpassed that of previous leader R.J. Reynolds in 1983, and it also took the lead in industry sponsorship of cultural institutions, concerts, and national sporting events. To cover declining U.S. sales, it exploited a traditional outlet for U.S. cigarettes somewhat neglected since World War II: overseas markets. With the help of the U.S. Trade Representative and North Carolina Senator Jesse Helms in 1986, Philip Morris, along with R.J. Reynolds, forced open East Asian markets previously dominated by state monopolies, and in the 1990s snapped up privatized state-run companies in former communist countries in Eastern Europe. By the end of the century, Philip Morris held 50 percent of the U.S. cigarette market followed by R.J. Reynolds (23 percent), Brown & Williamson (12 percent), and Lorillard (10 percent).

Although faced with a changing market, leading U.S. cigarette manufacturers remained among the nation's most profitable companies. In the 1980s and 1990s they repositioned themselves by diversifying into the beverage and food industry (Nabisco, Kraft Foods), blurring their corporate identities. In 2002 Philip Morris executives proposed renaming the parent company from Philip Morris Companies, Inc., to Altria Group, Inc. The threat of successful lawsuits resulted in the Master Settlement Agreement signed on 23 November 1998 with forty-six state attorneys general. This agreement stipulated payment of $206 billion to states over twenty-five years, reined in industry promotion practices, especially those targeting youth, and provided $5.5 billion over ten years in aid to vulnerable tobacco growers. To cover the settlement's costs the industry increased prices forty-five cents per pack and Philip Morris announced a 16 percent cut in its U.S. workforce. Down from a high of 75,000 in 1955, in 1990 cigarette manufacturing in the United States directly employed 41,000 people; the number dropped to 26,000 by 1999. In 1999 through 2000, debt-ridden RJR Nabisco sold off its overseas tobacco operations to Japan Tobacco and its food products company to Philip Morris, and spun off its domestic tobacco operations as R.J. Reynolds Tobacco. Finally, at decade's end India had moved ahead of the United States in total leaf and cigarette production (behind China), and the United States fell behind Brazil and Zimbabwe in the export of tobacco leaf while remaining ahead of the United Kingdom and the Netherlands in cigarette exports. U.S. tobacco leaf production, exports, and employment are expected to continue to fall as domestic consumption declines and as productivity, competition from cheaper foreign leaf, and the growth in off-shore manufacturing by U.S. companies increase.

BIBLIOGRAPHY

Brandt, Allan M. "The Cigarette, Risk, and American Culture." Daedalus 119 (1990): 155–177. Centers for Disease Control and Prevention. Tobacco Information and Prevention Source. Available at http://www.cdc.gov/tobacco. On-line access to U.S. data including Surgeon General reports. Cox, Howard. The Global Cigarette: Origins and Evolution of British American Tobacco, 1880–1945. New York: Oxford University Press, 2000. Glantz, Stanton A., John Slade, Lisa A. Bero, Peter Hanauer, and Deborah E. Barnes. The Cigarette Papers. Berkeley: University of California Press, 1996. Goodman, Jordan. Tobacco in History: The Cultures of Dependence. New York: Routledge, 1993. Most complete international history. Jacobstein, Meyer. The Tobacco Industry in the United States. New York: Columbia University Press, 1907. Reprint, New York: AMS, 1968. Important early statistics. Klein, Richard. Cigarettes Are Sublime. Durham, N.C.: Duke University Press, 1993. The significance of cigarettes in modern culture. Kluger, Richard. Ashes to Ashes: America's Hundred-Year Cigarette War, the Public Health, and the Unabashed Triumph of Philip Morris. New York: Knopf, 1996. Prize-winning history of U.S. industry's marketing and political strategies. Parker-Pope, Tara. Cigarettes: Anatomy of an Industry from Seed to Smoke. New York: New Press, 2001. Lively short account. Rabin, Robert L., and Stephen D. Sugarman, eds. Regulating Tobacco. New York: Oxford University Press, 2001. Recent U.S. developments including the 1998 Master Settlement Agreement. Robert, Joseph C. The Story of Tobacco in America. Chapel Hill: University of North Carolina Press, 1967. Schudson, Michael. "Symbols and Smokers: Advertising, Health Messages, and Public Policy." In Smoking Policy: Law, Politics, and Culture. Edited by Robert L. Rabin and Stephen D. Sugarman. New York: Oxford University Press, 1993. Tobacco Control Archives. Available at http://www.library.ucsf.edu/tobacco. Important review of secret industry documents.

Roddey Reid See also American Tobacco Case; Tobacco and American Indians; Tobacco as Money.

TOCQUEVILLE. See Democracy in America.


The Today Show. Before 1952, network television programming did not start until 10:00 a.m. That changed when NBC president Sylvester Weaver created a two-hour show called the Today show that ran from 7:00 until 9:00 a.m. each day. Fifty years later, in 2002, the show was still successful and had spawned copycat shows on each of the other major networks. Bryant Gumbel, here interviewing former President Richard Nixon in 1990, was one of the show's most popular hosts; he ended a fifteen-year stint as host of Today in 1997. © AP/Wide World Photos

TODAY. In 1952, no network television programming was scheduled earlier than 10:00 a.m. (EST). NBC president Sylvester "Pat" Weaver created Today with the idea that people might watch TV early in the morning before going to work and sending their children off to school. The two-hour show, running from 7:00 a.m. to 9:00 a.m. (EST), was designed to unfold in small modular segments, with the expectation that few viewers would watch from beginning to end. News, interviews, feature stories, and weather were combined in an informal style by friendly hosts. Today went on the air on 14 January 1952 and has remained there with relatively minor changes ever since. It was not until 1954 that another network, CBS, scheduled a program, The Morning Show, in the same time slot, and it was not until the 1970s, when Good Morning, America was introduced on ABC, that any program challenged the ratings dominance of Today. Fifty years after the beginning of Today, all early morning network shows were essentially copies of it. Today replaced the daily newspaper as a first source of information for millions of Americans at the start of each day, providing news and weather reports as well as discussions of books, trends, and other cultural and domestic topics.

From 1952 to 1961, the Today team included Dave Garroway, Betsy Palmer, Jack Lescoulie, Frank Blair, and, for a few years of comic relief, a chimpanzee named J. Fred Muggs. In 1961, the news department at NBC took over production of the show, and the lead host position went successively to John Chancellor (1961–1962), Hugh Downs (1962–1971), and Frank McGee (1971–1974). Barbara Walters became the first woman to co-host the show, which she did from 1974 to 1976. Walters was paired with a series of co-hosts until Jim Hartz got the permanent job. In 1976, Walters and Hartz were replaced by Tom Brokaw (1976–1981) and Jane Pauley (1976–1989). Subsequent hosts included Bryant Gumbel (1982–1997), Deborah Norville (1989–1991), Katie Couric (1991– ), and Matt Lauer (1997– ).

BIBLIOGRAPHY

Kessler, Judy. Inside Today: The Battle for the Morning. New York: Villard, 1992. Metz, Robert. The Today Show: An Inside Look at Twenty-five Tumultuous Years. Chicago: Playboy Press, 1977.

Robert Thompson See also Television: Programming and Influence.

TOHONO O’ODHAM. See Akimel O’odham and Tohono O’odham.

TOLEDO, the fourth largest city in Ohio in the early twenty-first century, began in 1680 as a French trading post. Ceded to the British in 1763, it became part of the U.S. Northwest Territory in 1787. Canals and railroads helped establish Toledo as a major inland port and center of industry. During the Progressive Era, Toledo won national recognition for urban reform. Historically, Toledo has been a major producer of glass and automotive products, but these industries declined, and from 1970 to 2000 employment in the Toledo metropolitan area decreased markedly. During this same period, population declined from 383,062 to 313,619, although city leaders question the accuracy of the 2000 federal census. Toledo has experienced other problems. A 1967 race riot caused extensive property damage, injuries, and arrests. Public schools were closed for several weeks in 1976 and 1978 because of teacher strikes. In July 1979 a bitter dispute between the city government and police and firemen led to a two-day general strike and costly arson fires. In the 1980s and 1990s, Toledo sought to emphasize its strong medical, cultural, and higher educational institutions. New downtown buildings and the Portside festival marketplace along the Maumee River were indicative of business leaders' commitment to the city.

BIBLIOGRAPHY

Jones, Marnie. Holy Toledo: Religion and Politics in the Life of “Golden Rule” Jones. Lexington: University Press of Kentucky, 1998.

TOLL BRIDGES AND ROADS

Korth, Philip A., and Margaret R. Beegle. I Remember Like Today: The Auto-Lite Strike of 1934. East Lansing: Michigan State University Press, 1988. McGucken, William. Lake Erie Rehabilitated: Controlling Cultural Eutrophication, 1960s–1990s. Akron, Ohio: University of Akron Press, 2000.

John B. Weaver / a. e. See also Boundary Disputes Between States; Canals; Great Lakes; Labor; Michigan, Upper Peninsula of; Northwest Territory; Ohio; Railroads.

TOLERATION ACTS provided for varying degrees of religious liberty in the American colonies. In New England, where the Congregational Church enjoyed legal establishment, the law required taxpayers to support the Puritan churches. Strong dissent in Massachusetts and Connecticut during the early eighteenth century resulted in legal exemptions for Quakers, Baptists, and Episcopalians. Rhode Island was the exception in New England, granting full freedom of worship. The middle colonies offered broad religious liberty. William Penn's Charter of 1682 provided for freedom of conscience to all Pennsylvanians who believed in God. Later, however, royal pressure forced the legislature to restrict liberties for Jews and Catholics. The New Jersey proprietors offered religious liberty in order to attract settlers. In New York, although the Anglican Church enjoyed official establishment, the realities of religious diversity and local control resulted in de facto religious liberty for most denominations.

The Anglican Church was stronger in the southern colonies and often encroached on dissenters' religious practice, particularly in Virginia and Maryland. Virginian evangelicals met with resistance, as did Maryland Catholics, although the latter enjoyed protection under the Toleration Act of 1649. Georgia's royal charter (1732) confirmed religious liberty for all except Catholics. In North Carolina, Anglicans maintained tenuous power. The American Revolution reinforced the doctrine of individual liberty, including religious freedom. Most state constitutions framed in this era sanctioned freedom of conscience to some extent. Local religious establishment continued in many states (until Massachusetts separated church and state in 1833). The Northwest Ordinance (1787) extended religious liberty to the Northwest Territory. The First Amendment of the federal Constitution forbade Congress to abridge the free exercise of religion.

BIBLIOGRAPHY

Bonomi, Patricia U. Under the Cope of Heaven: Religion, Society, and Politics in Colonial America. New York: Oxford University Press, 1986. Curry, Thomas J. The First Freedoms: Church and State in America to the Passage of the First Amendment. New York: Oxford University Press, 1986. Hall, Timothy L. Separating Church and State: Roger Williams and Religious Liberty. Urbana: University of Illinois Press, 1998. Isaac, Rhys. The Transformation of Virginia, 1740–1790. Chapel Hill: University of North Carolina Press, 1982.

Shelby Balik
Winfred T. Root
See also Dissenters; First Amendment; Maryland; Massachusetts Bay Colony; Religion and Religious Affiliation; Religious Liberty; Virginia.

TOLL BRIDGES AND ROADS, a system that developed as a means of transportation improvement in the face of limited public funding. Local, colonial, and state governments, burdened by debt, chartered private turnpike and bridge companies with the authority to build, improve, and charge tolls. While toll bridges appeared in New England by 1704, the first toll roads of the turnpike era were in Virginia, which authorized tolls on an existing public road in 1785, and Pennsylvania, which chartered the sixty-two-mile Philadelphia and Lancaster Turnpike in 1792.

Toll Rates. Photographed in 1941 by Jack Delano, this sign on the New Hampshire side of a bridge across the Connecticut River near Springfield, Vt., lists rates ranging from "foot passengers" and bicyclists to automobiles and motorcycles, as well as vehicles drawn by various numbers of "horses or beasts." Library of Congress


From 1790 to 1850, private toll roads were the nation’s primary land-based venue of transportation. More than four hundred turnpikes, many paved to the French engineer Pierre-Marie Tresaguet’s specifications, facilitated trade and communication with and movement to western areas. By the 1830s, toll canals, toll bridges over such major rivers as the Connecticut, and between 10,000 and 20,000 miles of private toll roads composed the heart of the national transportation network. However, lack of profitability and competition from railroads, which hauled heavy freight at relatively low costs and high speeds, led most turnpike and bridge authorities to dissolve by 1850, leaving their roads and bridges to public ownership. A second wave of toll bridges and roads began in the 1920s and 1930s. In 1927, Congress ruled that federal highway aid could be used for toll bridges built, owned, and operated by the states. Bridge authorities in New York, California, and elsewhere thereafter sold revenue bonds amortized by tolls to finance new bridges, including the George Washington, Triborough, and San Francisco–Oakland Bay. But restrictions on using federal aid for toll road construction, coupled with heavy traffic on existing roads and cash-strapped treasuries, led states to create special authorities that designed and built new toll roads, which like the bridges were financed through revenue bonds. These limited-access, high-speed roads included New York City–area parkways (beginning in 1926), Connecticut’s Merritt Parkway (1937), the Pennsylvania Turnpike (1940, first to accommodate trucks), and the Maine Turnpike (1947, the first postwar toll road). The 1956 Federal-Aid Highway Act, authorizing the interstate highway system, allowed toll bridges, roads, and tunnels to join the system if they met interstate standards and contributed to an integrated network. However, tolls were prohibited on all interstates beyond the 2,447 miles of toll expressways operating or under construction in 1956; additionally, federal highway aid could not be used for new toll facilities except for high-cost bridges and tunnels and extensions to existing toll roads. These policies discouraged new toll road construction, until 1987 and 1991 legislation made federal highway aid available for noninterstate public and private toll roads, and allowed the imposition of tolls on federally funded noninterstate highways. Toll roads constructed thereafter included the private Dulles Greenway in Virginia and the public E-470 beltway near Denver. In the 1990s, Houston, San Diego, and Orange County, California, introduced high-occupancy toll, or HOT, lanes on otherwise free highways, permitting solo drivers to access carpool lanes by paying a toll; critics charged that this traffic management strategy created a road system stratified by class. In 2000, the Federal Highway Administration listed 4,927 miles of toll roads and 302 miles of toll bridges and tunnels nationwide. BIBLIOGRAPHY

American Public Works Association. History of Public Works in the United States, 1776–1976. Chicago: American Public Works Association, 1976.


Gómez-Ibáñez, José A., and John R. Meyer. Going Private: The International Experience with Transport Privatization. Washington, D.C.: Brookings Institution, 1993. Klein, Daniel B. "The Voluntary Provision of Public Goods? The Turnpike Companies of Early America." Economic Inquiry 28 (October 1990): 788–812.

Jeremy L. Korr See also Interstate Highway System; Transportation and Travel.

TOLLS EXEMPTION ACT, an act of Congress, 24 August 1912, exempting American vessels in coast-wise traffic from the payment of tolls on the Panama Canal. The Hay-Pauncefote Treaty of 1901 had provided that the canal should be free and open to the ships of all nations without discrimination, so the act raised a serious moral and legal question. President Woodrow Wilson, on 5 March 1914, eloquently requested repeal as a matter of sound diplomacy and international good faith. Prominent Republicans seconded his efforts, and the act was repealed a few weeks later. Congress, however, expressly denied any relinquishment of the right to grant exemptions to coastwise shipping. BIBLIOGRAPHY

Collin, Richard H. Theodore Roosevelt’s Caribbean: The Panama Canal, the Monroe Doctrine, and the Latin American Context. Baton Rouge: Louisiana State University Press, 1990.

W. A. Robinson / c. w. See also Hay-Pauncefote Treaties; Panama Canal; Panama Canal Treaty.

TOMAHAWK appears to derive from the Algonquian tamahawk, or cutting utensil. The earliest English reference to the word came from John Smith, who indicated that it could mean “axe” or “war club.” Over time the term came to denote metal trade hatchets rather than other forms. Tomahawks were among the most popular items Europeans brought to the fur trade. Innumerable varieties developed, from simple hand-forged tomahawks to those elaborately inlaid with precious metals; some featured a spike or hammer head opposite the blade. Spontoon tomahawks had a spearlike blade, suitable for war, not woodcutting. One of the most popular types was the pipe tomahawk, featuring a pipe bowl opposite the blade and a handle drilled through to allow for smoking. Metal trade tomahawks became much prized throughout North America, and were widespread in eastern North America by 1700. Their spread coincided with growth in the fur and hide trade. Tomahawks coexisted with older forms of clubs and hybrid weapons well into the nineteenth century. While very popular with both Indians and white settlers, tomahawks and other hand weapons were increasingly reduced to a ceremonial role in Native American life by the advent of repeating firearms in the mid-nineteenth century. Symbolically, tomahawks remain synonymous with North American Indian warriors and warfare. BIBLIOGRAPHY

Hartzler, Daniel D., and James A. Knowles. Indian Tomahawks and Frontiersman Belt Axes. Baltimore: Windcrest, 1995. Peterson, Harold L. American Indian Tomahawks. Rev. ed. New York: Heye Foundation, 1971.

Robert M. Owens See also Indian Trade and Traders.

O.K. Corral, Tombstone. The site of the most famous shootout in the history of the West: Wyatt Earp, his brothers Virgil and Morgan, and John “Doc” Holliday against Ike and Billy Clanton, Frank and Tom McLaury, and Billy Claiborne, 26 October 1881. © Corbis

TOMBSTONE. A former silver boomtown located east of the San Pedro River valley in southeastern Arizona, Tombstone is some twenty-five miles north of the Mexican border. Prospector Ed Schieffelin, who discovered silver nearby in 1877, named the site as he did because of remarks by soldiers at Camp Huachuca that the only thing he would find was his tombstone. Large-scale silver production began in 1880. The district yielded about $30 million over the next thirty years and about $8 million thereafter. Politics, feuds, greed, and conflicting town lot claims produced violence that culminated in the 26 October 1881 shootout near the O.K. Corral between the Earp brothers and Doc Holliday on one side and the

Clantons and McLaurys on the other. Labor strife and flooding curtailed mining operations in the mid-1880s. Despite extensive efforts to pump water out of the underground shafts, nearly all the mines were abandoned by 1911. Tombstone’s population declined from 5,300 in 1882 to 849 in 1930. In 1929 the Cochise County seat was moved from Tombstone to Bisbee. With the publication of Walter Noble Burns’s Tombstone, an Iliad of the Southwest (1927) and Stuart Lake’s Wyatt Earp, Frontier Marshal (1931), along with the institution of the town’s first Helldorado Days celebration in 1929, Tombstone capitalized on its notoriety as The Town Too Tough to Die. Subsequent books, motion pictures, and television shows have enhanced its reputation as a place where legends of the Old West were played out. The town became a national historic landmark in 1962, and is a major tourist attraction. Its population was 1,504 in 2000. BIBLIOGRAPHY

Marks, Paula Mitchell. And Die in the West: The Story of the O.K. Corral Gunfight. New York: Morrow, 1989. Shillingberg, William B. Tombstone, A.T.: A History of Early Mining, Milling, and Mayhem. Spokane, Wash.: Arthur H. Clark, 1999.

Bruce J. Dinges Rufus Kay Wyllys See also Mining Towns; Silver Prospecting and Mining.


Heeeree’s Johnny! Beginning in 1954, NBC tried to create a late-night talk show that would keep viewers up after the local news. For ten years a number of hosts tried to make the show work, with only Jack Paar (1957–1962) achieving any real success. All this changed in October 1962 when Johnny Carson became the host of The Tonight Show with Johnny Carson. For thirty years Carson ruled late-night television with his comfortable comedic style that featured a nightly monologue, topical humor and sketches, and guests from the entertainment industry. Carson is shown here behind his familiar desk interviewing Frank Sinatra in 1976. © AP/Wide World Photos

TONIGHT. “The Tonight Show,” the generic title used to describe the many iterations of NBC TV’s late-night comedy-talk show, was originally developed by Sylvester “Pat” Weaver, the president of NBC in the early 1950s. Tonight! was the initial title. It ran from 1954 through 1956, hosted by Steve Allen. In its final year, Allen shared hosting duties with the comedian Ernie Kovacs. In 1957, the title was changed to Tonight! America After Dark, with Jack Lescoulie as host. Unlike Allen’s show, which emphasized comic sketches and music, Tonight! America After Dark concentrated on news, interviews, and live remote broadcasts, much like the Today program. After six months, Al Collins replaced Lescoulie and a month later the format was overhauled once again. The Jack Paar Show debuted in July 1957 in a format that emphasized interviews and “desk comedy.” Paar also conducted political crusades on the air, supporting Fidel Castro’s revolution in Cuba, broadcasting from the Berlin Wall, and including presidential candidates John F. Kennedy and Richard M. Nixon among his guests in 1960. When Paar left the show in March 1962, guest hosts filled in on the re-titled The Tonight Show until October of that year. From October 1962 through May 1992, Johnny Carson established and sustained The Tonight Show Starring Johnny Carson as an American institution. The format of his show, an opening comic monologue—often about news events—followed by interviews, occasional comic pieces, musical performances, and chats with the audience, would be copied by nearly all of the late-night talk shows that followed. Carson retired in 1992 and NBC awarded the vacated position to Jay Leno, who had been a frequent guest host since 1987, and re-titled it The Tonight Show with Jay Leno. David Letterman, angry that he had been passed up for the job, left his NBC program and moved to CBS to compete with Leno. Since the Jack Paar era, the program has enjoyed an important place in American culture. BIBLIOGRAPHY

Carter, Bill. The Late Shift: Letterman, Leno, and the National Battle for the Night. New York: Hyperion, 1994. Metz, Robert. The Tonight Show. Chicago: Playboy Press, 1980.

Robert Thompson See also Television: Programming and Influence.

TONKIN GULF RESOLUTION. On 2 August 1964, the USS Maddox, engaged in an electronic spying operation in the Tonkin Gulf, was involved in a firefight with North Vietnamese PT boats. On 4 August, the Maddox was apparently attacked again in international waters. Although that second attack was never confirmed, President Lyndon B. Johnson informed the American people that he was retaliating against North Vietnam’s aggression by ordering air attacks on its military installations and that he was also asking Congress for its support in the form of a congressional resolution. Drafted weeks earlier by the executive, this resolution was designed to grant the president the authority he desired to protect and defend American interests in Southeast Asia. Managing the Senate floor debate on behalf of the administration was Senator J. William Fulbright of Arkansas, a respected member of that body who also was a good friend of the president. He sought to quell existing doubts about the seemingly open-ended nature of the resolution by informing several skeptical colleagues that the president sought no wider war in Southeast Asia. According to Fulbright, that was the president’s intent and the nature of his policy. Thus, given the strong public support for the president’s action and congressional unwillingness to challenge his authority, Congress passed the resolution on 7 August 1964 with only two dissenting votes in the Senate. The resolution charged that North Vietnam had attacked American ships lawfully present in international waters, which was part of a systematic campaign of aggression it has been waging against its neighbors. Congress approved and supported “the determination of the President, as Commander in Chief, to take all necessary measures to repel any armed attack against the forces of the United States and to prevent further aggression.” In addition, it also authorized the president “to take all necessary steps, including the use of armed force, to assist


any [SEATO] member or protocol state . . . requesting assistance in defense of its freedom.”

President Johnson believed passage of the resolution had given him the necessary legal authority to take whatever action he deemed appropriate in Vietnam. But as disillusionment with the war widened and deepened, and as more information surfaced about provocative American actions in the gulf prior to the alleged incident involving the Maddox, Congress grew increasingly unhappy with how it had been deceived by the president in August 1964. Consequently, it repealed the resolution, which became invalid in 1971. President Richard M. Nixon, disregarding Congress’s action, continued to wage war in Vietnam while acting in his capacity as commander in chief.

BIBLIOGRAPHY

Hess, Gary. Presidential Decisions for War: Korea, Vietnam, and the Persian Gulf. Baltimore: Johns Hopkins University Press, 2001. Mirsky, Jonathan. “The Never Ending War.” New York Review of Books (25 May 2000).

William C. Berman

TOPEKA CONSTITUTION. The movement for statehood launched by free-state Kansans in opposition to the proslavery territorial government was inaugurated in the late summer of 1855, when a “people’s” assembly at Topeka called an election for members of a constitutional convention. Thirteen of the delegates were natives of southern states, ten of New York and Pennsylvania, eight of the Old Northwest, four of New England, and two of foreign countries. They chose James H. Lane, a popular sovereignty Democrat as president. The constitution contained standard provisions for the forms and functions of government. The bill of rights prohibited slavery and declared invalid indentures of blacks executed in other states. The service of free blacks in the militia was prohibited, but the fundamental question of admitting them to Kansas was referred to the voters along with the constitution and a general banking law. After the Constitution was ratified, Lane went to Washington, D.C., to petition Congress for Kansas statehood. On 4 March 1856, the legislature assembled at Topeka and elected U.S. senators. The House of Representatives passed a bill 3 July 1856, to admit Kansas under the Topeka Constitution, although five days later the Senate substituted its own measure authorizing a constitutional convention. The Senate’s actions terminated the ambitions laid out in the Topeka Constitution. BIBLIOGRAPHY

Fehrenbacher, Don E. Sectional Crisis and Southern Constitutionalism. Baton Rouge: Louisiana State University Press, 1995. Rawley, James A. Race and Politics: “Bleeding Kansas” and the Coming of the Civil War. Philadelphia: Lippincott, 1969.

Wendell H. Stephenson / h. s. See also Border Ruffians; Kansas; Kansas-Nebraska Act; Sectionalism.

TORIES. See Loyalists.

TORNADOES. A product of an unusually powerful thunderstorm, a tornado is a naturally occurring atmospheric vortex of air spiraling at a very high speed, usually about 250 miles per hour or more, forming a funnel, and extending from the ground to the base of a convective cloud. The shape of the funnel depends on air pressure, temperature, moisture, dust, rate of airflow in the vortex, and whether the air in the tornado’s core is moving upward or downward. A tornado can also have multiple vortices. Double vortices are often produced when the upper vortex turns in the direction opposite to the circular motion of the lower vortex. Because of all these factors, very few tornadoes look like true funnels. Tornadoes cause one-fifth of natural-disaster losses each year in the United States. The most intense tornadoes can toss a car a halfmile or shatter a house. However, about 80 percent of tornadoes are weak and cause no more damage than severe winds. A tornado can last fewer than 10 seconds or more than two hours. Tornadoes can occur singly or in swarms. There is no agreement among experts on any single theory of tornado formation. The typical tornado has ground contact for about six miles, marking a path up to 500 feet wide. Tornadoes travel as fast as 35 to 60 miles per hour. The average number of tornadoes in the United States ranges between 700 and 800 per year, exceeding 1,000 in some years, most notably 1973, 1982, 1990, and 1992. Tornadoes occur most frequently in Texas, followed by Oklahoma and Kansas. Most tornado fatalities happen in the deep South and generally total fewer than 100 per year, although 350 people died in the 1974 tornado that swept through Alabama, Georgia, Tennessee, Kentucky, and Oklahoma on 3 and 4 April. Although tornadoes have been reported in every state, most occur in the Gulf States and in the Midwest. The west-to-east airflow across the United States is interrupted by the Rocky Mountains, which push the air currents upward; they fall suddenly as they reach the Great Plains. If moisture-laden air is pulled in from the Gulf of Mexico and meets the high dry air over the plains, that confluence creates the conditions for a tornado. Tornado season begins in early spring in the deep South and progresses northward, with two-thirds of tornadoes occurring from March to June. Tornadoes are most likely to form in late afternoon, but they can occur at any time of day on any day of the year.


The National Severe Storms Forecast Center in Kansas City, Missouri, is responsible for issuing warnings of approaching tornadoes. Tornado predictions are based on meteorological conditions in combination with unusual patterns on the weather radar. Although the approach of a tornado can be forecast only 50 percent of the time, warnings have become important in reducing the death toll. BIBLIOGRAPHY

Eagleman, Joe R. Severe and Unusual Weather. New York: Van Nostrand Reinhold, 1983. Grazulis, Thomas P. The Tornado: Nature’s Ultimate Windstorm. Norman: University of Oklahoma Press, 2001.

Mary Anne Hansen See also Disasters; Great Plains; Meteorology; Midwest.

TORPEDO WARFARE. Robert Whitehead’s self-propelled torpedo—a cigar-shaped weapon with an explosive charge and powered by a small engine—became standard in all major navies by the 1870s. Torpedoes increased rapidly in speed, range, and explosive power. By the eve of World War I, torpedoes effectively ranged near 7,000 yards with top speeds over 40 knots. The largest torpedoes had bursting charges of 700 pounds of explosive. Until about 1900 torpedo boats—small, very fast vessels designed for torpedo attacks—were the principal torpedo carriers. As protection against such vessels large warships acquired batteries of quick-firing guns, and in the 1890s began to rely on a new type of warship, the torpedo boat destroyer, for protection. By the outbreak of World War I, the destroyer, grown to about 1,000 tons, had largely usurped the torpedo boat. Submarines, however, have made the greatest use of torpedoes. During World War I, German submarines sank more than 11 million tons of British shipping, forced the British fleet to operate with extraordinary precaution, and nearly won the war for the Central Powers.

Tornado. This is the oldest known photograph of a tornado, taken 28 August 1884, twenty-two miles southwest of Howard, S.D. National Oceanic and Atmospheric Administration/Department of Commerce


Between the wars, airplanes added a new dimension to torpedo warfare. In World War II a small force of British “swordfish” torpedo planes put half the Italian battle fleet out of action at Taranto harbor in 1940, and in 1941 Japanese torpedo planes helped cripple the American fleet at Pearl Harbor. In World War II, German U-boats sank millions of tons of Allied shipping before the Allies finally won the long Battle of the Atlantic. In the Pacific, American submarines devastated the Japanese merchant marine, accounting for 28 percent of all Japanese naval shipping sunk during the war. BIBLIOGRAPHY

Gannon, Robert. Hellions of the Deep: The Development of American Torpedoes in World War II. University Park: Pennsylvania State University Press, 1996. Gray, Edwyn. The Devil’s Device: Robert Whitehead and the History of the Torpedo. Annapolis, Md.: Naval Institute Press, 1991.

Ronald Spector / c. w. See also Munitions; Ordnance; Submarines.

TOURISM. From sunbathers at Myrtle Beach to Civil War buffs at Gettysburg, Americans travel to many different destinations for a variety of reasons. Today, tourism plays an integral role in American economy, society, and culture. The Travel Industry Association of America reported that in 2001 tourism generated 7.8 million American jobs and revenues in excess of $545 billion. Yet tourism is relatively new. In less than two hundred years, touring has changed from the activity of a small elite to a mass phenomenon spurred by a thriving economy, improved transportation, national pride, and an increased desire to escape the pressures of modern life. Before the 1820s, Americans rarely traveled for pleasure. In the next two decades, however, the fruits of industrialization created the necessary environment for tourism, as more Americans possessed the time, money, and opportunity for recreational travel. With the invention of the steamboat and increased use of railroads after 1830, Americans could travel faster, more inexpensively, and in relative comfort. For most of the nineteenth century, Americans traveled in pursuit of improved health, sublime scenery, and social opportunities. Large spas sprang up in upstate New York and the Valley of Virginia, where the elite could “take” the waters. Americans also traveled the country searching for picturesque wonders. Popularized by the British, the “picturesque” tourist sought sublime scenes that astonished by their grandeur, beautiful vistas that soothed through pastoral serenity, and landscapes that intrigued by their quaintness. Favorite destinations included the White Mountains of New Hampshire, the villages along the Hudson River, and most of all, Niagara Falls. The historian John Sears has shown that such journeys

Yosemite National Park. In this 1922 photograph, naturalist A. F. Hall shows visitors a giant sequoia knocked down by a storm in 1919, when it was 996 years old; the labels on tree rings indicate when historic events took place during the life of the California tree. © Bettmann/Corbis

became sacred pilgrimages as tourists found spiritual renewal gazing on the power and beauty of the divine in nature. A popular itinerary, the “fashionable tour,” combined health and the picturesque as visitors steamed up the Hudson River to Albany, traveled west along the Erie Canal stopping at the Ballston or Saratoga Springs, and ended up at Niagara Falls. Popular guidebooks such as Theodore Dwight’s The Northern Traveller (1825) showed tourists where to visit, how to get there, and what to experience. In turn, trips became a sign of status for the individuals and of cultural identity for their new nation. After the Civil War, attention focused on Florida and the West. Northerners gathered to winter in Jacksonville, a semitropical Eden according to a multitude of guidebooks from the 1870s and 1880s. Popular excursions included a cruise down the St. John’s River and a visit to America’s oldest city, St. Augustine. Even more people flocked to the state after the oil tycoon Henry M. Flagler constructed a railroad along Florida’s eastern coast and built a string of luxury hotels including the lavish 1,150room Royal Poinciana Hotel in Palm Beach, completed in 1894 and at the time the largest wooden structure in


the world. Henry B. Plant used similar methods to lure tourists to the state’s Gulf Coast. The West, however, attracted visitors more out of curiosity than climate. The completion of the transcontinental railroad in 1869 and luxurious Pullman Palace cars enticed visitors to California. Visitors to the West marveled at the wonders of Yosemite and Pike’s Peak and stayed in luxury resorts such as the Hotel Del Monte in Monterey. Americans increasingly viewed the West as a mythic, golden land. Railroads busily promoted this image in guidebooks and pamphlets while travel agents, such as the Raymond and Whitcomb Company, helped smooth the journey westward. During the late nineteenth and early twentieth centuries, preservation groups worked on several popular sites. In 1860, the Mount Vernon Ladies Association purchased and restored George Washington’s Virginia home

and in the process spurred similar efforts that rescued such sites as the Hermitage and Jamestown Island. Cities and states created chambers of commerce and tourism boards that urged patriotic citizens to “see America first.” The federal government responded to pressures for preservation and conservation by establishing Yellowstone as a national park in 1872. Later, the National Parks Act of 1916 established the National Park Service (NPS), whose mission was to conserve the scenery, natural and historic objects, and wildlife of America for future generations. In the decades after World War I, the automobile spurred a great expansion of tourism. By 1930, 23 million Americans owned cars, and middle-class Americans traveled the country staying at hotels, motels, and campgrounds. Federal legislation earmarked large sums for roads, highways, and turnpikes, including the scenic Blue Ridge Parkway. During the Great Depression close to $4 billion was spent by the Works Progress Administration (WPA) to build, repair, or improve 651,087 miles of highway and 124,031 bridges. The WPA also issued guidebooks for several states and key cities through the Federal Writers Program. After 1945, America tourism experienced phenomenal growth. Most Americans enjoyed a two-week vacation that had been denied them during the years of depression and war. As Americans’ disposable income rose, so did the promotion of tourism. Major destinations included cities, ski resorts, and national parks. Several cities revitalized their downtown areas to attract tourists. San Antonio’s Riverwalk and Baltimore’s Inner Harbor are but two examples. And beginning with the 1955 opening of Disneyland in Anaheim, California, there has been phenomenal growth in theme parks with attendance totaling more than 163 million in 1998. After the attacks of 11 September 2001, air travel plummeted and domestic tourism suffered, though by spring 2002 the World Trade Organization had announced that recovery was well underway. BIBLIOGRAPHY

Aron, Cindy S. Working at Play: A History of Vacations in the United States. New York: Oxford University Press, 1999. Brown, Dona. Inventing New England: Regional Tourism in the Nineteenth Century. Washington, D.C.: Smithsonian Institution, 1995. Cocks, Catherine. Doing the Town: The Rise of Urban Tourism in the United States, 1850–1915. Berkeley: University of California Press, 2001. Sears, John F. Sacred Places: American Tourist Attractions in the Nineteenth Century. New York: Oxford University Press, 1989. Shaffer, Marguerite. See America First: Tourism and National Identity, 1880–1940. Washington, D.C.: Smithsonian Institution, 2001.

Rebecca C. McIntyre See also Amusement Parks; National Park System; Recreation; Transportation and Travel.

Nantucket. A poster created by Ben Nason, c. 1938, for the New Haven Railroad (one of a series of seven he made, which the railroad distributed into the 1950s), advertising the popular island vacation spot south of Cape Cod, Mass.; passengers changed from the train to the ferry at Woods Hole. © Swim Ink/Corbis

Barge. African American refugees transport their household belongings along a canal in Richmond, Va., at the end of the Civil War. Library of Congress

TOWBOATS AND BARGES. The deficiencies of railroad transportation during World War I led to the Transportation Act of 1920, which created the Inland Waterways Corporation (1924) and its Federal Barge Line. The completion of the nine-foot channel of the Ohio River in 1929 was followed by similar improvements on the Mississippi and its tributaries and the Gulf Intra-Coastal Canals. Each improvement marked a giant step by the U.S. Army Engineers (Corps of Engineers) in promoting inland waterways development. Private capital followed these improvements with heavy investments in towboats and barges. In the years before World War II, towboat power soared steadily from 600 to 1,200 to 2,400 horsepower. The shift from steam to diesel engines cut crews from twenty or more on steam towboats to an average of eleven to thirteen on diesels. By 1945 fully 50 percent of the towboats were diesel; by 1955, the figure was 97 percent. Meanwhile the paddlewheel had given way to the propeller, the single propeller to the still-popular twin propeller; the triple propeller became fairly common during the 1960s. In 1974 the Valley Line added the 10,500-horsepower triple-screw W. J. Barta to its fleet of twenty-one towboats and 750 barges. Capable of handling forty barges with a capacity of 50,000 tons, the W. J. Barta transported twenty-two times the record-breaking 9,266 cotton bales carried

by the Henry Frank in 1881. By the end of the twentieth century, 10,500-horsepower towboats were common on the Mississippi. The pilothouse is the key to modern towboat expansion. Electronics are everywhere: main control panels, radar, computers, communication systems, and circuit television scanners that monitor the entire boat for the pilot, who can communicate with pilots of approaching boats. The pilot is in telephone communication with the numerous marine services that have sprung up to cut out barges from a tow while it is under way, thus saving time and money. Some towboats have thrusters (like the bowboats of rafting days) that aid the pilots in passing other large tows, negotiating sharp bends, passing bridges, or entering locks. Traffic on the Mississippi system climbed from 211 million short tons to more than 330 million between 1963 and 1974. The growth in river shipping did not abate in the final quarter of the century. Traffic along the Upper Mississippi rose from 54 million tons in 1970 to 112 million tons in 2000. The change from riveted to welded barges, the creation of integrated barges, and the innovation of double-skinned barges have led to improved economy, speed, and safety. Shipping on Mississippi barges became substantially less expensive than railroad transport, but at a cost to taxpayers. Barge traffic is the most heavily subsidized form of transport in the United States. A report in 1999 revealed that fuel taxes cover only 10 percent of the annual $674 million that the U.S. Army


Corps of Engineers spends building and operating the locks and dams of the Mississippi River. BIBLIOGRAPHY

Clay, Floyd M. History of Navigation on the Lower Mississippi. Washington, D.C., 1983. Petersen, William J. Towboating on the Mississippi. Washington, D.C.: National Waterways Study, U.S. Water Engineer Water Resource Support Center, Institute for Water Resources, 1983.

William J. Petersen / a. r. See also Bargemen; Engineers, Corps of; Inland Lock Navigation; Inland Waterways Commission; Lakes-to-Gulf Deep Waterway; Mississippi River; River Navigation; Waterways, Inland.

TOWER COMMISSION. Appointed by President Ronald Reagan in November 1986, the Tower Commission investigated allegations that the administration sold arms to Iran in exchange for U.S. hostages in Lebanon and then diverted money from the arms sales to the Nicaraguan contras, which violated congressional legislation. Headed by former Senator John Tower, the commission also was charged with proposing changes in the National Security Council (NSC) to prevent any such action in the future. Its 1987 report concluded that members of the NSC staff were responsible for the secret diversion of funds and that President Reagan was out of touch with the actions of his own government in the White House. BIBLIOGRAPHY

Koh, Harold Hongju. The National Security Constitution: Sharing Power After the Iran-Contra Affair. New Haven, Conn.: Yale University Press, 1990. United States. President’s Special Review Board. The Tower Commission Report: The Full Text of the President’s Special Review Board. New York: Times Books, 1987.

Katy J. Harriger / a. g. See also Iran-Contra Affair; Political Scandals; Special Prosecutors.

TOWN GOVERNMENT or township government is the lowest level of general-purpose local government in the northeastern and midwestern states. Generally the jurisdiction of towns or townships extends only to areas outside of incorporated cities. Towns were the principal units of local government in colonial New England, providing schools, poor relief, roads, and other necessary services. The town meeting, an assembly of all enfranchised townspeople, was the primary decision-making body, but over the course of the colonial period the elected selectmen seemed to grow increasingly important in determining town policy. Towns or townships also existed in New


York, New Jersey, and Pennsylvania, though in these middle colonies counties played a greater role in local government than in New England. The southern colonies did not develop townships. This basic geographic pattern persisted throughout the following centuries. In the northernmost states towns were most significant. In the middle swath of states they existed but were less important, and they were foreign to the southern half of the nation. During the nineteenth century the trans-Appalachian states stretching from Ohio to the Dakotas adopted township government. In each of these states township officials were responsible for roads, cemeteries, and poor relief. Moreover they ensured that farmers maintained their fences and impounded stray livestock. States could assign other tasks to townships as well. In Ohio township clerks were authorized to record livestock brands, and Kansas lawmakers empowered townships to eliminate prairie dogs. In New York, New Jersey, Michigan, Illinois, Wisconsin, and Nebraska the chief township officer was the supervisor, and the country governing boards were composed of the supervisors from each township. The township supervisor was therefore both a township and a county official. By the first half of the twentieth century many observers believed the town or township was obsolete. In New England the town meeting seemed ill suited to the larger towns with thousands of voters. Many indifferent townspeople failed to exercise their right to participate, abdicating decision making to the random few hundred people who attended the meetings. In 1915, in response to this situation, Brookline, Massachusetts, adopted the representative town meeting. All town voters could attend these meetings and express their views, but only elected representatives could vote on the issues. By the last decade of the twentieth century forty-two Massachusetts towns, seven Connecticut towns, and one Vermont community had adopted the representative town meeting. To preserve a semblance of broad-based democracy, these assemblies usually included over two hundred voting representatives. Another alternative to the town meeting was the town council. This was a small legislative body comparable to a city council, and it often hired a town manager with duties corresponding to those of a city manager. In other words, under the town council plan a town was governed like a city but retained the title of town. In 1945 Bloomfield became the first Connecticut community to opt for this plan, and in 1971 Agawam was the first Massachusetts town to embrace council rule. By the 1990s twenty-nine Massachusetts towns operated under the council plan. In a majority of New England towns the traditional town meeting survived, though only a minority of voters attended. Lauded as bastions of direct democracy, town meetings actually appeared to be prime examples of democratic apathy. Meanwhile, most students of local government were growing increasingly critical of township rule outside of New England. They condemned the townships as obso-


lete remnants of a horse-and-buggy era and urged abolition of these unnecessary units. Responding to this academic assault, township officials mobilized, organizing state associations that promoted townships as paragons of grassroots democracy. In some states lawmakers reduced the authority of townships. In 1929 Iowa’s legislature deprived townships of their responsibility for local roads, and in the early 1930s Indiana shifted all authority over local roads and drainage ditches to its counties. Most of the midwestern states, however, authorized new responsibilities, such as zoning and fire protection. By the close of the twentieth century towns or townships in a number of states could exercise virtually the full range of powers of an incorporated city. Despite pronouncements that town or township governments were outmoded in an increasingly metropolitan world, they proved especially important in states with large suburban populations. In both New York and Michigan, for example, the town was the chief unit of local government for a rapidly rising number of suburbanites. Whereas in 1960 only 36 percent of New Yorkers lived in towns, by 1990 this figure was up to 47 percent. In Michigan the share of the state’s population living in townships and dependent on their services as opposed to those of cities rose from 42 percent in 1990 to 48 percent in 2000. Rather than passing from the scene or surviving as obsolete relics, the New England town and townships elsewhere in the Northeast and Midwest adapted to the political realities of the twentieth century and remained vital elements of local government. BIBLIOGRAPHY

Hendrickson, J. P. “Grass-roots Government: South Dakota’s Enduring Townships.” South Dakota History 24, no. 1 (1994): 19–42. Sly, John Fairfield. Town Government in Massachusetts (1620– 1930). Cambridge, Mass.: Harvard University Press, 1930. Zimmerman, Joseph F. The Massachusetts Town Meeting: A Tenacious Institution. Albany: State University of New York at Albany, 1967.

Jon C. Teaford See also County Government; Local Government.

TOWNSEND PLAN, a plan for an Old-Age Revolving Pension, prompted one of the most astonishing social movements of the New Deal period. Combining the American traditions of pressure politics, reform by monetary manipulation, and evangelical utopianism, the Townsend Plan was one of the most popular of several such movements produced by the social distress and insecurity following the panic of 1929. Dr. Francis E. Townsend, the plan’s originator, announced it on 1 January 1934, and speedily enrolled millions of supporters. As embodied in a bill endorsed by Townsend, the plan entitled all people sixty years of age or over who had been U.S. citizens for at least five years, to an annuity of up to

Pension Plan. In the turmoil following the stock market crash of 1929, Francis E. Townsend (shown here) developed the Old Age Revolving Pension that would have paid U.S. citizens age sixty and older $200 per month. The plan, which would have been funded with a national 2 percent sales tax, was never enacted, although it caused a stir and gained some support. Library of Congress

$200 a month, provided they did not earn any money, and spent all of each month’s annuity, within the United States, by the fifth day of the following month. To finance the plan, advocates sought to raise $20 billion annually through a general sales tax of two percent. The plan’s authors regarded it as no mere old-age pension but rather, a solution to virtually all U.S. economic ills, including budget deficits. Appealing largely to the lower middle class during a period of great social unrest, the leaders defended the profit system as central to progress and denounced tendencies toward collectivism. Its disciplined voters were instrumental in electing to Congress several outspoken opponents of the New Deal. However, Congress repeatedly rejected bills putting forth the Townsend Plan, mainly because critics, including economists, charged that such a high sales tax would cause wholesale inflation. The Townsend Plan movement died out during the beginnings of economic recovery in the late 1930s. BIBLIOGRAPHY

Bennett, David H. “The Year of the Old Folks’ Revolt.” American Heritage 16, no. 1 (1964).


Dorman, Morgan J. Age before Booty; an Explanation of the Townsend Plan. New York: Putnam, 1936. Mitchell, Daniel J. B. “Townsend and Roosevelt: Lessons from the Struggle for Elderly Income Support.” Labor History 42, no. 3 (2001).

C. Vann Woodward / m. b. See also New Deal; Old Age; Pension Plans; Share-the-Wealth Movements; Social Security.

TOWNSHEND ACTS, four Parliamentary acts imposed on the American colonists (1767). They take their name from Charles Townshend, chancellor of the Exchequer and head of the British government at the time they were enacted. The first law, the Suspending Act, suspended the New York assembly until it complied with the provisions of the Quartering Act of 1765, which required colonies to supply British troops with shelter and supplies. This law resulted from General Thomas Gage’s decision to concentrate troops in central reserves in New York City, from which they might be dispatched as needed. This decision imposed an unforeseen financial burden on that colony, and the New York assembly refused to appropriate funds for additional quarters in New York City because they thought there was still ample room in the barracks at Albany. The second act was the Revenue Act, which levied import duties on lead, paper, glass, and tea—all of which colonists could import legally only from Great Britain. This revenue was earmarked to support royal officials in the colonies, including judges and governors, who had relied previously on local assemblies for their salaries. Many colonists feared that this system would put these officials beyond all local control while increasing their dependence upon the British ministry for their positions and pay. Resistance to the Revenue Act took the form of agitation, nonimportation agreements, open evasion of the duties, and the promotion of American manufactures. This act marked the second time that the British government had regulated colonial commerce to raise revenue (the first was the Sugar Act of 1764). All other commercial laws had been intended to protect some industry within the empire. British leaders like Sir William Pitt and Edmund Burke assailed the Revenue Act as anticommercial. Instead of encouraging British industry, they argued that it discouraged English manufactures and encouraged competing industries in the colonies. A board of customs commissioners, established by the third Townshend Act, assumed responsibility for collecting the new taxes. The board was stationed at Boston and retained complete control over all American customs. It was empowered to reorganize customs, regulate or close ports of entry, appoint customs officers, hire coastguard vessels and provide them with search warrants, and take other measures necessary to enforce the revenue


laws. Townshend revenues and seizures of goods would pay for this new system. Enforcement officers met with resistance from many colonists, including those who seized the Liberty and burned the Gaspée. Such actions led the customs commissioners to ask for troops, so forces headed in September 1768 from New York to Boston, where they were quartered in the city. Friction between civilians and the soldiers resulted, and all but two regiments were withdrawn in 1769. One of these was involved in the Boston Massacre (1770), after which all troops were withdrawn. Finally, the fourth Townshend Act repealed the inland duties on tea in England and permitted it to be exported to the colonies free of all British taxes. The uproar over the Townshend Acts subsided after Parliament repealed all duties except that on tea in 1770. The controversy reemerged a few years later, however, when protests over the Tea Act led to the Boston Tea Party in 1773. BIBLIOGRAPHY

McCusker, John J., and Kenneth Morgan, eds. The Early Modern Atlantic Economy. New York: Cambridge University Press, 2000. Middlekauff, Robert. The Glorious Cause: The American Revolution, 1763–1789. New York: Oxford University Press, 1982. Nash, Gary B. The Urban Crucible: The Northern Seaports and the Origins of the American Revolution. Cambridge, Mass.: Harvard University Press, 1986.

Shelby Balik O. M. Dickerson See also Billeting; Boston Tea Party; Colonial Assemblies; Colonial Policy, British; Gaspée, Burning of the; Navigation Acts; Parliament, British; Quartering Act; and vol. 9: The Pennsylvania Farmer’s Remedy; Townshend Revenue Act.

TOXIC SHOCK SYNDROME (TSS), a rare, sometimes fatal disease that caused widespread panic among women during the early 1980s when the Centers for Disease Control (CDC) and other public health organizations linked the growing number of cases of TSS with the increasing popularity of high-absorbency tampons. The earliest reported cases of TSS occurred among seven children in 1978 and were linked with the presence of Staphylococcus aureus. Symptoms of the disease include vomiting, diarrhea, high fever, and sunburnlike rash. Fatalities among early TSS patients were around 8 percent. In 1980, 890 cases were reported to the CDC, 812 of which were among women whose illness coincided with the start of their menstrual periods. When the Utah Department of Health collected information suggesting that women with TSS had used a particular tampon brand, Rely, the CDC devised a study to examine tampon brand use. The study found that 71 percent of a test group of women with TSS had used Rely tampons. On 22 September 1980, Procter and Gamble recalled all Rely tampons on the market and all tampon manufacturers subsequently


lowered the absorbency of their tampons. The Food and Drug Administration began requiring that all tampon packages carry information on TSS, advising women to use tampons with the minimum absorbency needed and to change tampons frequently. Though the scare associated menstruating women with TSS, the disease has been reported in men, children, and older women and in conjunction with surgery, influenza, sinusitis, childbirth, intravenous drug use, cuts, boils, abscesses, insect bites, and the use of contraceptive sponges, cervical caps, and diaphragms. BIBLIOGRAPHY

Donawa, Maria E., et al. “Toxic Shock Syndrome: Chronology of State and Federal Epidemiologic Studies and Regulatory Decision-Making.” Public Health Reports 99 (1984). Etheridge, Elizabeth W. Sentinel for Health: A History of the Centers for Disease Control. Berkeley: University of California Press, 1992. Sapolsky, Harvey M., ed. Consuming Fears: The Politics of Product Risks. New York: Basic Books, 1986.

Suzanne White Junod / f. b. See also Centers for Disease Control and Prevention; Epidemics and Public Health; Microbiology; Women’s Health.

TOXIC SUBSTANCE CONTROL ACT (TSCA), signed by President Gerald Ford on 11 October 1976, gives the Environmental Protection Agency (EPA) the power to track industrial chemicals produced in the United States. The act grew out of federal legislation originally proposed in 1971 by the President’s Council on Environmental Quality. The council’s report, “Toxic Substances,” identified a need for legislation to identify and control chemicals whose manufacture, processing, distribution, or disposal was potentially dangerous and yet was not adequately regulated under other environmental statutes. Both houses of Congress passed legislation in the Ninety-second and Ninety-third sessions, but the controversy over the scope of chemical screening stalled final passage of legislation until 1976. Under the act the EPA screens industrial chemicals and can require the reporting or testing of those chemicals which may pose an environmental or human health hazard. The act also grants the EPA the authority to ban the manufacturing or import of industrial chemicals which pose an unreasonable risk. The EPA is further responsible for tracking the thousands of new industrial chemicals that are developed each year with either unknown or dangerous characteristics and for controlling them as necessary to protect human health and the environment. Manufacturers and processors of chemicals may be required under the act to conduct and report the results of tests to determine the effects of potentially dangerous chemicals on living things.

BIBLIOGRAPHY

Bergeson, Lynn L. “TSCA: The Toxic Substances Control Act.” Chicago: Section of Environment, Energy, and Resources, American Bar Association. Davies, J. Clarence. “Determining Unreasonable Risk Under the Toxic Substances Control Act.” Washington, D.C.: Conservation Foundation, 1979. Druley, Ray M., and Girard L. Ordway. The Toxic Substances Control Act. Rev. ed. Washington, D.C.: Bureau of National Affairs, 1981.

Shira M. Diner See also Consumer Protection; Environmental Protection Agency; Food and Drug Administration.

TOYS AND GAMES have always reflected the attitudes, humor, and imagination of the culture and times that created them. Toys have a unique cross-generational appeal that can capture the fancy of not only children, but adults as well. More than a few grown-ups have complex model railroads, proudly displayed action figure collections, or a coveted doll saved from their childhood. As toy historians Athelstan and Kathleen Spilhaus write, “A toy’s appeal lies in the form and shape, the beauty of line, the color and detail, the charm of miniaturization, and the humor of caricature. Some toys amuse use with their jerky antics; others add beauty to our lives with their grace and rhythm. Many do things we can’t do in real life, thereby keeping us in touch with fantasy.” Toys and Games through History While some toys, such as mechanical, lithographed tin, or electronic toys, obviously could not have been introduced without simultaneous technological or production advances, once introduced a toy or game is not necessarily limited to its era of origin. Evidence shows that ancient Egyptian children played with simple, wooden dolls; Roman children played with marbles; stuffed dolls and animals date to antiquity and remain popular; chess seems to date from ancient Chinese and Egyptian dynasties; and little boys can still buy a bag of classic green soldiers just as their daddies did decades ago. Colonial American children played with games and toys like marbles, dice (often carved from bone or antlers on the frontier), and stuffed dolls. Mothers, again especially on the frontier, made many stuffed dolls out of an old stocking or worn pant leg. The child of a more wellto-do urban merchant might play with a doll depicting a peddler, replete with items for sale. Such accessorized dolls foreshadowed by 150 years such modern dolls as Barbie and G.I. Joe. Paper dolls and paper cut-outs have long been popular. In the eighteenth century, dolls—hand painted with watercolor—typically reflected women at household chores. In the United States, that corresponded with republican ideals of womanly virtue. Late in the nineteenth century, Jules Verne’s books had popularized science fiction, boys could cut out lithographed paper


shapes that they glued together to make fanciful vessels. Punch-out paper dolls with attachable, tabbed clothing have remained a favorite with children. Eighteenth-century adults fancied thought-provoking, strategic games like chess or checkers. Enlightenment thought encouraged exercising the brain for logical reasoning in any way possible.The eighteenth century also saw the production of many moving figures called “automata.” These complex, often life-size models were the work of imaginative and skilled toy makers. However, because of their complexity and the limitations of preindustrial production, automata were never produced in sufficient numbers to classify as toys. Nevertheless, they captured the imagination of the age. One particularly fascinating automata was a Parisian monkey who deftly painted a portrait of Benjamin Franklin, then sat back to admire his work. The eighteenth century, which spawned Frederick the Great, the Royal British Army, and the American and French Revolutions, not surprisingly also generated great interest in lead or “tin” soldiers. Johann Gottfried Hilpert, a Prussian, standardized soldier production and had a team of women to paint them. Children and adult collectors could amass whole armies of tin soldiers, complete with infantrymen, grenadiers, lancers, cavalrymen, artillerists, and commanders on horseback. Industrialized production methods, including assembly-line processes, plus improved transportation in the mid-nineteenth century revolutionized toy making and toy consumption. Now manufacturers could quickly make toys and distribute them to a wide customer base. The automata of the eighteenth century gave way to the mechanical toys of the nineteenth. Popular mechanical toys included monkeys painting pictures, figures that walked, and horse-drawn wagons that rolled as the horse’s legs moved back and forth. A post–Civil War toy of Robert E. Lee depicted him on horseback; another more creative one depicted Union General Ulysses S. Grant raising a cigar to his lips, then puffing out real smoke! While some earlier toys had moved by means of flowing water or sand, the new mechanicals reflected the industrialization of the time. Most were made of tin, detailed with colorful lithographs that accurately depicted rivets in a ship’s hull or stripes on a tiger’s back. Manufacturers fitted the molded tin around metal armatures, articulated with movable joints. Inside the armatures were energy storage devices. Most used springs that, when wound tight, stored kinetic energy. When a key or switch allowed the spring to relax, the released energy moved the toy. Others used clockworks—gears and springs wound by a key. The late nineteenth century was also the era of the cast-iron bank. The most popular banks also moved via the same type of mechanism as toys. Some depicted animals jumping through hoops, or, again, drinking from bottles or mugs. Placement of a coin in a bank character’s


hand or mouth usually activated the bank. Any motion always ended with the deposit of the coin. American toys have always exhibited an air of nationalism. Banks and toys began depicting American heroes, especially from the Revolution, around the nation’s centennial in 1876. Post–Civil War toys, whether wooden pull toys, tin mechanical toys, or paper doll cutouts, often depicted Civil War soldiers from both the North and the South. As the United States Navy modernized into steampowered, steel-clad fleets with rotating gun turrets, toys reflected the change; wind-up, rolling destroyers with smokestacks and ramming prows became popular. In 1916, as the United States prepared to enter World War I (1914–1918), a mechanical toy showed Uncle Sam kicking Kaiser Wilhelm II in the seat of the pants. American toys also depicted open racism and prejudice. One mechanical toy was an African American dancing on a platform, which was not overtly marked but latently resembled a slave auction block. Other black toys, often named “Uncle Tom,” shuffled, danced, or played instruments. The Twentieth Century The early twentieth century saw toy makers introduce the names that would become classic. In 1902, after seeing a cartoon that depicted the immensely popular Theodore Roosevelt with a bear cub he had found on a hunting trip, Morris Michtom of Brooklyn’s Ideal Toy Company introduced the stuffed “teddy bear.” Picked up by a variety of other companies, teddy bears have been in production ever since. Another enduring stuffed toy hit the markets in 1915 when Johnny Gruelle introduced Raggedy Ann. Gruelle followed her up with Raggedy Andy for boys, and a series of illustrated books to keep them before the public. Americans’ fascination with industry and building translated to toys. In 1913, A. C. Gilbert introduced the Erector Set. The next year Charles Pajeau followed suit with Tinker Toys. And in 1916, John Lloyd Wright— famed architect Frank Lloyd Wright’s son—introduced Lincoln Logs, which one of his father’s creations helped inspire. As of 2002, all were still on the market. The age of movie/toy tie-ins arrived in 1928 when Walt Disney created the animated Mickey Mouse in a cartoon short. Soon, stuffed Mickeys were on the market, accompanied by his animated sidekicks Donald Duck, Goofy, and the rest. Most large malls boast a Disney Store with toys and merchandise from all of Disney’s movies. The Great Depression of the 1930s saw families seeking durable games they could play together. In 1936, Parker Brothers introduced perhaps one of the most enduring games of all time, Monopoly. The game fit the mood of the country exactly. People had little money, but by playing Monopoly they could be rich with play money. They could buy lucrative Atlantic City properties and monopolize utilities and railroads. If they made a misstep,


they could get tossed in jail—just as they wished would happen to many of the “captains of industry” whom many Americans blamed for the depression. Two other phenomena once again revolutionized the toy industry after World War II (1939–1945). They were the introduction of a suitable plastic for toys, and the baby boom. The plastic known as polystyrene actually first appeared in 1927. But World War II perfected the use of plastics in industry. That, coupled with a postwar prosperity and the largest new generation in American history, set the stage for a toy boom. The classics of the age were born. In 1952, the Hassenfeld Brothers—Hasbro—of Providence, Rhode Island, introduced an unlikely toy: plastic eyes, noses, ears, and lips that kids could stick into potatoes and other vegetables or fruits. Hasbro called the odd toy Mr. Potato Head, and it quickly caught on. In the 1960s, Hasbro marketed Mr. Potato Head with plastic potatoes, and even plastic carrots, green peppers, and french fries. A Mrs. Potato Head appeared, but the spuds lost their appeal by the 1970s. In 1995, however, Pixar’s movie Toy Story repopularized Mr. Potato Head. Television and movie tie-ins created new toy markets in the 1950s. Disney’s Mickey Mouse Club spurred a demand for mouse-ear hats, as did Disney’s Davy Crockett series a demand for coonskin caps. Disney’s Zorro encouraged little boys to ask for black plastic swords tipped with chalk so they could slash a “Z” on sidewalks, trees, and buildings. In 1959, Mattel introduced Barbie, the most popular plastic doll of all time. Mattel engineered a marketing coup with Barbie, by offering not only the doll but a range of accessories as well. Changes of clothes, purses, gloves, shoes—no Barbie was complete without a decent wardrobe, and a Barbie box to carry it in. Soon Barbie had a boyfriend, Ken, and a sister, Skipper. Barbie was born into the suburban housewife era and has lived through the hippie age of the 1960s, the do-your-own-thing era of the 1970s, and the flamboyant 1980s. While feminists have decried that Barbie, with her exaggerated hourglass figure, is sexist, foisting upon young girls an image of womanhood that is hard to achieve, she has nevertheless endured. In 1965, Hasbro took the social risk of introducing a doll for boys—G.I. Joe. At almost a foot tall, Joe was loosely based on a new television series called The Lieutenant. In reality, Joe arrived in a year when the United States was celebrating the twentieth anniversary of its victory in World War II. Joe represented a time before the Cold War when Americans were victorious on the battlefield. Not that six-year-old boys cared, but Joe won the favor of their parents, and that was half the battle. Joe also followed Barbie’s marketing scheme, by offering accessories like M-1 Rifles, hand-grenades, dress blues, and jungle camouflage. Boys could even outfit Joe in enemy uniforms; but they were enemies from the “good ol’

days”—Germans and Japanese—not the North Vietnamese or Vietcong of the 1960s. Indeed, Joe would suffer, as would all Americans, from United States involvement in Vietnam. As victory eluded the United States there and things military faded from fashion in the wake of war protests, Joe changed from a soldier to an adventurer. In 1970, Hasbro began marketing Joes as the “Adventure Team.” Bewhiskered Joes drove All-Terrain Vehicles instead of Jeeps, and hunted for stolen mummies and white tigers. Joe became anemic as the United States began to doubt itself on the battlefield. By the mid-1970s, Joe had faded away. He returned, however, as a smaller action figure in the early 1980s to battle an elite group of terrorists known as Cobra. In the late 1990s, Hasbro returned the original G.I. Joe to the markets. The target audience was grown-up baby boomers who once played with the original. Toy cars had long been around. Jack Odell introduced finely crafted miniatures he called Matchbox Cars in 1952, and in 1957 Tonka trucks and cars hit the market. Larger-scale, metal vehicles with free-rolling wheels, Tonkas were virtually indestructible. Mattel revolutionized the market once again in 1966 with the introduction of Hot Wheels. The cars had low-friction wheels and used gravity to speed them down strips of yellow track that boys could attach to tabletops then run down to the floor. The cars depicted stock autos, like Mustangs and Camaros, and fanciful show and concept cars. The release of George Lucas’s Star Wars in 1977 brought new interest in miniature play figures, popularly called “action figures.” In the early twenty-first century, toy buyers can expect tie-in toys to hit the shelves a month or more before their associate movie, but Star Wars arrived in the summer of 1977 with no affiliated toys. Not until the next year did figures of Luke Skywalker, Darth Vader, and the rest appear. Compared with later action figures, the original Kenner Star Wars figures are simple, yet toy collectors highly prize them. In the 1980s, virtually every kid-oriented movie from E.T. to Beetlejuice had action figure/toy tie-ins. The 1980s also saw reverse tie-ins, when toy manufacturers contracted animation studios to produce cheap half-hour cartoons to support toys. He-Man, She-Ra, G.I. Joe, and Teenage-Mutant Ninja Turtles capitalized on such marketing strategy. Electronics have also made their mark on toys. In 1972, Magnavox introduced Odyssey, the first video game that could be hooked into a television. Atari followed in 1976 with Pong. While home video games boomed briefly, they faded quickly in the early 1980s as large, coin-fed games in arcades attracted players. In 1983, Nintendo pried the home video game market back open with games like Super Mario Brothers. Now video games are a staple for both televisions and computers.



BIBLIOGRAPHY

The History of Toys and Games. Available at http://www.historychannel.com/exhibits.toys.
Ketchum, William C., Jr. Toys and Games. Washington, D.C.: Cooper-Hewitt Museum of the Smithsonian Institution, 1981.
Miller, G. Wayne. Toy Wars: The Epic Struggle between G.I. Joe, Barbie, and the Companies that Make Them. New York: Times Books, 1998.
Spilhaus, Athelstan, and Kathleen Spilhaus. Mechanical Toys: How Old Toys Work. New York: Crown Publishers, 1989.

R. Steven Jones

See also G.I. Joe; Barbie Doll; Middlebrow Culture; “Monopoly”; “Scrabble”; Vacation and Leisure.

TRACK AND FIELD athletics in the United States had multiple origins in the early- to mid-nineteenth century. British models were most influential. Scottish immigrants formed Caledonian Clubs in many American cities, and through these the tradition of Highland Games (also called Caledonian Games) brought track and field competition to the East Coast through the mid-1870s. Boston, for example, held its first Highland Games in 1842. In 1849 English long-distance runners demonstrated their sport to large American crowds. Another important thread, older and harder to trace, is the Native American running and games traditions. One of the first American runners to compel English athletes’ notice was Louis “Deerfoot” Bennett, a Seneca Indian who ran in England in 1862, dressed for effect in wolfskin and a feathered headband. Yet another venue for organized competition was county and state fairs. As in England, social class distinguished the structures that contained and sponsored track and running events. Caledonian Club events tended to invite all comers, no matter what race or ethnicity. Other British imports, such as the races called “pedestrians,” were often largely working-class events. One of the first American pedestrians was held in 1835 at the Union racetrack in New York. Runners competed to cover ten miles in less than an hour. (One out of nine entrants achieved this goal.) Another type of pedestrian was the “six day go as you please” staged in several cities in the mid-nineteenth century. These were endurance events characterized by betting and by the rough informality of that era’s urban spectacles. One race in Boston in the mid-1880s was run indoors by contestants from a wide variety of social backgrounds who had coaches and stood to win some money. A final category was the women’s walking contest, quite popular in the 1870s. Often lucrative for the winners, these marathon contests, involving thousands of quartermile track circuits per meet, disappeared in the 1880s and are barely remembered today. By the late-nineteenth century the other pedestrians had also shriveled because of


widespread corruption and the increasing attraction of more elitist and “legitimate” competitions. Collegiate and club track and running competitions eventually overwhelmed more populist events. For these athletes, amateur status was a badge of honor. In the 1880s and 1890s, the athletic club model caught on among American elites. These clubs varied from social clubs with fine athletic facilities to clubs primarily for amateur athletes, but in America’s gilded age, most clubs developed membership policies defined by income and social prestige. The New York Athletic Club (NYAC) was founded in 1868, and the Boston Athletic Association in 1887. By the late nineteenth century, most American cities had amateur athletic clubs, and the international aspirations of the American clubs were captured in the first AmericanBritish meet held at Travers Island, New York, in June 1895, in which the NYAC hosted its London counterpart. On the collegiate scene, perhaps due to their relative age and their links to elite preparatory schools with track programs and to the city athletic clubs, northeastern universities nurtured many outstanding amateur track and field athletes at the turn of the century. The growth of organized collegiate sports partly reflected middle-class concerns about the fate of rugged manliness in an urban, electrified world. The Intercollegiate Association of Amateur Athletics was founded in 1876. By the 1880s, track and field events encompassed the 100- and 220-yard sprints, the quarter-, half-, and mile runs, hurdles, the broad jump, long jump, pole vault, shot put, 56-pound throw, and hammer throw, and sometimes the half-mile walk. (The marathon would be an Olympic addition.) In 1896 a fourteen-man team sponsored by the Boston Athletic Association traveled to Athens for the first modern Olympic Games. The young Americans won nine of the twelve track and field events. By the 1912 games, United States track athletes had put the Olympics on their calendars and continued their impressive record of victories. The remarkable Carlisle Indian School graduate, Jim Thorpe, won both the pentathlon and decathlon. The 1912 team also included several African American members, as had the 1908 team. The development of American track and field has reflected the evolution of various groups’ access to social competition in general. Into the early twentieth century, American white men dominated the track and field events sponsored and fostered by the white athletic clubs and the white-dominated colleges. Yet African Americans competed in track and field from its American beginnings, largely through venues that paralleled those of white male athletes. Most black track athletes, as in baseball and other sports, functioned in segregated settings. The “colored” YMCAs nurtured athletic skills and organizational knowledge. American blacks also founded urban athletic clubs to foster recreation and competition; in fact, like whites of various ethnic and class groupings, African Americans fully participated in the club movement of the late nineteenth century. Limited community resources hampered these


clubs, and members usually had to use public facilities for their activities. Black colleges, founded after the Civil War, offered a crucial staging ground for black athletes. After initial hesitation to commit their scarce resources to athletics, by the 1890s college administrators were backing a varsity movement. More public resources might have come their way through the Second Morrill Act of 1890, except that southern white state legislators diverted funds intended for black land-grant colleges to white uses. Even in those years, the outstanding competitive skills of individual black men occasionally emerged. A few black athletes were able to participate in white-controlled events like the Highland Games. A few black students attended white colleges and universities, sometimes only after being required to graduate from a black college. These included outstanding athletes like Amherst’s W. T. S. Jackson, the University of Pennsylvania’s J. B. Taylor, Howard Smith, and Dewey Rogers, and Harvard’s N. B. Marshall and Ted Cable (a graduate of Andover Academy). Other venues for blacks to compete against whites included the military, where black units could field competitors against white units’ teams. American meets and teams contained increasing numbers of black American world-class athletes, including of course Jesse Owens, whose winning performance offered an ironic commentary on the Third Reich’s racial philosophy in the 1936 Berlin Olympic Games. In the mid-1890s college women began testing their skill in track and field events. Vassar College held the first of forty-two consecutive women’s field days in 1895. For thirty years, women track athletes strove against the physical educators’ received wisdom, which echoed cultural repression of women’s physical exertion on the grounds that women were incapable of extended exercise. In the early 1920s, track and field boomed as a sport for college women, then fell victim by the 1930s to social fears of the “mannish” and unnatural (read: “lesbian”) female types who might thrive in sports so dependent on “masculine” strength and speed (rather than the grace and agility one could read into gymnastics, skating, and even tennis and golf, which had their own social cachet). Colleges were not the only breeding ground for women (or men) track athletes. Though access to good tracks, coaches, and practice time made a difference in results, one could compete for relatively little money in events sponsored by the Amateur Athletic Union and thus qualify for distinction. While the blight on female track athletics hit colleges first, non-collegiate athletes continued to compete and draw audiences into the 1930s. There was room in public regard for Mildred “Babe” Didrikson, who gained celebrity in the 1931 nationals by breaking the world’s record for the 80-meter hurdles and achieved Olympic distinction in 1932. (In the longer run, her blunt speech and avoidance of dresses seemed to confirm stereotypes of women athletes.) Didrikson and many other non-collegiate women athletes were sponsored by indus-

trial leagues, part of the “welfare capitalism” movement of the 1920s.

[Photograph: Jesse Owens. The track and field phenomenon, in midair during a broad jump in 1936, the year he won four gold medals and broke world records at the Olympic Games in Berlin. AP/Wide World Photos]

As female participation in track and field became culturally complicated, black women emerged as the individuals able to withstand the stigma of speed, endurance, and strength to compete in national and international meets. Alice Coachman was the first black woman to win an Olympic gold medal in the high jump, in London in 1948. Wilma Rudolph won Americans’ hearts with her Olympic performance in 1960, when she won three gold medals; and she was only one member of an Olympic women’s squad dominated by black collegiate athletes. (The entire relay team was from Tennessee State University.) Since the 1960s a host of black American women athletes have starred on the world stage of Olympic competition, including Evelyn Ashford, Valerie Brisco-Hooks, Gail Devers, Florence Griffith Joyner, Jackie Joyner-Kersee, Marion Jones, and Wyomia Tyus. Black men have matched black women’s track and field brilliance in the last fifty years. Again, a partial list includes Bob Beamon, Leroy Burrell, Milt Campbell, Lee Evans, Carl Lewis, Michael Johnson, Edwin Moses, and



Mike Powell. The bitter side of African American success is the continuing social and “scientific” conversation about whether there are physiological causes of black athletic domination. Besides linking to a long Euro-American history of slandering black Africans and their descendants as more animalistic and primitive than whites, this debate implies that blacks may have to work less hard and thus deserve less credit for their athletic achievements.

As with other sports, track and field’s twentieth century has been characterized by both technical and technological developments contributing to progressively faster, longer, higher results. Technological improvements encompass the materials used in equipment, including shoes and clothing, as well as timing, starting, and measurement methods. There have also been illegitimate technological developments, notably the use of drugs, particularly anabolic steroids, to enhance physical development and performance. Technical improvements include training regimes, nutritional knowledge, and research toward systematizing and enhancing the psychosocial aspects of training and competition. The final major development has been the erosion of distinctions between amateur and professional athletic status. Endorsements and sponsorships from corporations and other organizations allow outstanding track athletes to enhance and extend their careers. Many other professional athletes may earn far more, but professionalization has contributed to the visibility and democratization of track and field.

BIBLIOGRAPHY

Ashe, Arthur R., Jr. A Hard Road to Glory: A History of the African-American Athlete 1619–1918. Volume I. New York: Amistad, 1993.
Cahn, Susan K. Coming On Strong: Gender and Sexuality in Twentieth-Century Women’s Sport. New York: Free Press, 1994.
Chalk, Ocania. Black College Sport. New York: Dodd, Mead, 1976.
Guttmann, Allen. Women’s Sports: A History. New York: Columbia University Press, 1991.
McNab, Tom. The Complete Book of Track and Field. New York: Exeter Books, 1980.
Rader, Benjamin G. American Sports: From the Age of Folk Games to the Age of Televised Sports. 3rd ed. Englewood Cliffs, N.J.: Prentice Hall, 1996.
Riess, Steven A. City Games: The Evolution of American Urban Society and the Rise of Sports. Urbana: University of Illinois Press, 1989.
Tricard, Louise Mead. American Women’s Track and Field: A History, 1895 through 1980. Jefferson, N.C.: McFarland, 1996.

Mina Carson

See also Olympic Games, American Participation in; Sports.


TRADE AGREEMENTS. When two or more nations wish to establish or modify economic relations and set tariffs on international commerce they enter into a trade agreement. Any authorized government official may negotiate such an agreement, but all participating governments must formally ratify the proposed treaty before it becomes effective. As a result, domestic political forces and interest groups exert considerable influence over the provisions of any trade agreement. The United States negotiated few trade agreements in the eighteenth and nineteenth centuries. Domestic political pressures determined how high or low import taxes (tariffs) would be. From the earliest debates in the First Congress, some political leaders favored low tariffs designed to raise revenue while others favored much higher rates to protect domestic producers from foreign competition. Lower rates generally prevailed through the 1850s, but protectionist tariffs were sponsored by the dominant Republican party during and after the Civil War. To encourage particular types of trade within the forbiddingly high post–Civil War tariff structure some leaders favored bilateral trade agreements in which each nation agreed to reduce rates in return for reciprocal reductions. In the 1870s the United States signed a reciprocal trade agreement with the then-independent Hawaiian government that gave Hawaiian sugar exporters tariff-free access to the U.S. market. In the early 1890s Secretary of State James G. Blaine negotiated reciprocal trade agreements that softened the effect of the highly protectionist McKinley Tariff Act of 1890, but the 1894 WilsonGorman Tariff Act made such agreements impossible. With the exception of the Underwood Act, which passed in 1913 but never went into effect because of World War I, protectionist rates remained until the Great Depression, when it appeared that the nation’s high import duties were not only detrimental to world trade but also might be harmful to the domestic economy. In the election of 1932 the Democrats came to power on a program involving “a competitive tariff ” for revenue and “reciprocal trade agreements with other nations.” Cordell Hull, President Franklin D. Roosevelt’s secretary of state, was the driving force behind congressional action in getting the Trade Agreements Act made law on 12 June 1934. The Reciprocal Trade Agreements Act of 1934 permitted reduction of trade barriers by as much as half in return for reductions by another nation. Moreover, the new act, in form an amendment to the 1930 Tariff Act, delegated to the president the power to make foreigntrade agreements with other nations on the basis of a mutual reduction of duties, without any specific congressional approval of such reductions. The act limited reduction to 50 percent of the rates of duty existing then and stipulated that commodities could not be transferred between the dutiable and free lists. The power to negotiate was to run for three years, but this power was renewed for either two or three years periodically until replaced by the Trade Expansion Act of 1962. In the late


1930s and 1940s U.S. negotiators arranged a great number of bilateral trade agreements. In fact, between 1934 and 1947 the United States made separate trade agreements with twenty-nine foreign countries. The Tariff Commission found that when it used dutiable imports in 1939 as its basis for comparison, U.S. tariffs were reduced from an average of 48 percent to an average of 25 percent during the thirteen-year period, the imports on which the duties were reduced having been valued at over $700 million in 1939. Although Congress gave the State Department the primary responsibility for negotiating with other nations, it instructed the Tariff Commission and other government agencies to participate in developing a list of concessions that could be made to foreign countries or demanded from them in return. Each trade agreement was to incorporate the principle of “unconditional mostfavored-nation treatment.” This requirement was necessary to avoid a great multiplicity of rates. During World War II the State Department and other government agencies worked on plans for the reconstruction of world trade and payments. They discovered important defects in the trade agreements program, and they concluded that they could make better headway through simultaneous multilateral negotiations. American authorities in 1945 made some far-reaching proposals for the expansion of world trade and employment. Twentythree separate countries then conducted tariff negotiations bilaterally on a product-by-product basis, with each country negotiating its concessions on each import commodity with the principal supplier of that commodity. The various bilateral understandings were combined to form the General Agreement on Tariffs and Trade (GATT), referred to as the Geneva Agreement, which was signed in Geneva on 30 October 1947. This agreement did not have to be submitted to the U.S. Senate for approval because the president was already specifically empowered to reduce tariffs under the authority conferred by the Trade Agreements Extension Act of 1945. After 1945 Congress increased the power of the president by authorizing him to reduce tariffs by 50 percent of the rate in effect on 1 January 1945, instead of 1934, as the original act provided. Thus, duties that had been reduced by 50 percent prior to 1945 could be reduced by another 50 percent, or 75 percent below the rates that were in effect in 1934. But in 1955 further duty reductions were limited to 15 percent, at the rate of 5 percent a year over a three-year period, and in 1958 to 20 percent, effective over a four-year period, with a maximum of 10 percent in any one year. In negotiating agreements under the Trade Agreements Act, the United States usually proceeded by making direct concessions only to so-called chief suppliers— namely, countries that were, or probably would become, the main source, or a major source, of supply of the commodity under discussion. This approach seemed favorable to the United States, since no concessions were ex-

tended to minor supplying countries that would benefit the chief supplying countries (through unconditional most-favored-nation treatment) without the latter countries first having granted a concession. The United States used its bargaining power by granting concessions in return for openings to foreign markets for American exports. Concessions to one nation through bilateral negotiations were often extended to all others through the mostfavored-nation principle. Many international agreements included a clause stating that the parties would treat each other in the same way they did the nation their trade policies favored the most. If in bilateral negotiations the United States agreed to reduce its import duties on a particular commodity, that same reduction was automatically granted to imports from any nation with which the United States had a most-favored-nation arrangement. The high tariff walls surrounding the United States were gradually chipped away through bilateral agreements that established much lower rates for all its major trading partners. From the original membership of twenty-three countries, GATT had expanded by the mid-1970s to include more than seventy countries, a membership responsible for about four-fifths of all the world trade. During the numerous tariff negotiations carried on under the auspices of GATT, concessions covering over 60,000 items had been agreed on. These constituted more than two-thirds of the total import trade of the participating countries and more than one-half the total number of commodities involved in world trade. With the expiration on 30 July 1962, of the eleventh renewal of the Reciprocal Trade Agreements Act, the United States was faced with a major decision on its future foreign trade policy: to choose between continuing the program as it had evolved over the previous twenty-eight years or to replace it with a new and expanded program. The second alternative was chosen by President John F. Kennedy when, on 25 January 1962, he asked Congress for unprecedented authority to negotiate with the European Common Market for reciprocal trade agreements. The European Common Market had been established in 1957 to eliminate all trade barriers in six key countries of Western Europe: France, West Germany, Italy, Belgium, the Netherlands, and Luxembourg. Their economic strength, the increasing pressure on American balance of payments, and the threat of a Communist aid and trade offensive led Congress to pass the Trade Expansion Act of 1962. This act granted the president far greater authority to lower or eliminate American import duties than had ever been granted before, and it replaced the negative policy of preventing dislocation by the positive one of promoting and facilitating adjustment to the domestic dislocation caused by foreign competition. The president was authorized, through trade agreements with foreign countries, to reduce any duty by 50 percent of the rate in effect on 1 July 1962. Whereas the United States had negotiated in the past on an item-by-item, rate-by-



rate basis, in the future the president could decide to cut tariffs on an industry, or across-the-board, basis for all products, in exchange for similar reductions by the other countries. In order to deal with the tariff problems created by the European Common Market, the president was empowered to reduce tariffs on industrial products by more than 50 percent, or to eliminate them completely when the United States and the Common Market together accounted for 80 percent or more of the world export value. The president could also reduce the duty by more than 50 percent or eliminate it on an agricultural commodity, if he decided such action would help to maintain or expand American agricultural exports. After Kennedy’s death, President Lyndon B. Johnson pushed through a new round of tariff bargaining that culminated in a multilateral trade negotiation known as the Kennedy Round. The agreement, reached on 30 June 1967, reduced tariff duties an average of about 35 percent on some 60,000 items representing an estimated $40 billion in world trade, based on 1964 figures, the base year for the negotiations. As a result of the tariff-reduction installments of the Kennedy Round, by 1973 the average height of tariffs in the major industrial countries, it is estimated, had come down to about 8 or 9 percent. Although both Johnson and President Richard M. Nixon exerted pressure on Congress to carry some of the trade expansion movements of the Kennedy Round further, Congress resisted all proposals. Since 1934 U.S. trade negotiations have been an executive responsibility, but Congress has maintained a strong interest in both procedures and outcomes. In the 1960s and 1970s it called upon the U.S. Tariff Commission to identify “peril points,” where reduction of specific duties might cause serious damage to U.S. producers or suppliers. Other federal legislation provided for relief measures if increased imports cause injury to a domestic industrial sector. The crisis in foreign trade that developed in 1971–1972 was the result of stagnation as well as of an unprecedented deficit in the U.S. balance of payments. Some pressure groups from both industry and labor tried to revive the protectionism that had flourished before 1934, but they had had small success except on petroleum imports by the mid-1970s. The world’s acceptance of more liberal trade agreements has had different effects on U.S. producers. Most likely to benefit are those engaged in the nation’s traditionally export-oriented agricultural sector. Production costs are relatively low in the U.S. heartland, so a freer market tends to benefit domestic agricultural exporters. At the same time, labor-intensive industries, such as textiles, electronics, and automobiles, have suffered from the gradual reduction of import restrictions. U.S. wage rates range far higher than comparable rates in certain countries that have built very efficient textile mills and fabrication plants for electronic devices and appliances. The recovery of the U.S. auto industry in the 1990s, however, demonstrated that increasing the use of industrial robot-


ics and automated assembly lines can help undermine the cost advantage of foreign manufacturers. As more liberal trade agreements promote competition among producers, each nation is likely to develop stronger and weaker economic sectors that complement those of its global trading partners. The ultimate trade agreement is one in which all national barriers disappear. The European Union (formerly the European Economic Community) represents an approximation of that goal, as does the 1993 North American Free Trade Agreement (NAFTA) among the United States, Canada, and Mexico. NAFTA cancels all major barriers to exchange of goods and services among the participants, leaving the GATT structure in control of imports and exports outside the free-trade area.

BIBLIOGRAPHY

Grinspun, Ricardo, and Maxwell A. Cameron, eds. The Political Economy of North American Free Trade. New York: St. Martin’s Press, 1993.
McCormick, Thomas J. America’s Half Century: United States Foreign Policy in the Cold War and After. Baltimore: Johns Hopkins University Press, 1995.
McKinney, Joseph A., and M. Rebecca Sharpless, eds. Implications of a North American Free Trade Region: Multi-Disciplinary Perspectives. Waco, Texas: Baylor University, 1992.

John M. Dobson
Sidney Ratner / a. g.

See also European Union; General Agreement on Tariffs and Trade; North American Free Trade Agreement; Reciprocal Trade Agreements; Trade, Foreign.

TRADE DOLLAR. The currency law of 1873 created a special silver dollar, weighing 420 grains instead of the standard 412.5 grains, ostensibly to encourage trade with China, but more probably to provide a market for domestic silver producers (see Crime of 1873). The bulk of the 36 million pieces coined went to China, but at least 6 million were forced into circulation in the United States, despite the fact that after 1876 they were no longer legal tender. Many were bought at a discount and paid out at par to immigrant laborers who were forced to take the loss. Hoping to force the government to buy new silver, the silver interests delayed government redemption of the coins until 1887.

BIBLIOGRAPHY

Carothers, Neil. Fractional Money: A History of the Small Coins and Fractional Paper Currency of the United States. New York: Wiley, 1930.
Schwarz, Ted. A History of United States Coinage. San Diego, Calif.: Barnes, 1980.

Neil Carothers / a. r.

See also China Trade; Currency and Coinage; Free Silver; Money; Silver Prospecting and Mining.


TRADE, DOMESTIC. Trade can be defined as engaging in an exchange of goods and services. For trade to take place, there must be at least two parties with different wants and needs. These people may not be able to produce the goods or services alone and seek others who can do so. People need the basics to survive, such as food, clothing, and shelter. They may want large houses, fashionable clothing, and exotic food. Specialization is extremely important to trade. Each worker or company focuses on producing a type of service or product, creating interdependency. Whether they are consumers, workers, producers, or the government itself, everyone benefits from trade in various ways.

HIGHLIGHTS OF THE DEVELOPMENT OF DOMESTIC TRADE, 1492–2002

1492–1607, The New World Evolves
The Old World (Europe) and the New World begin to blend together as the pioneers bring livestock and other necessities for survival. Fishing starts as a dominant industry in the North.

1607–1783, The Colonial Era
Native Americans taught the colonists to raise new crops, including corn, squash, potatoes, and tobacco. Ships were sent from Europe full of luxuries to trade. The first permanent trading post was established.

1784–1860, A New Nation
The Constitution was adopted. A national currency and a banking system were developed.

1861–1865, The Civil War
The North was thriving economically as a result of war expenditures. The South was just barely surviving. Its agricultural lands had been turned into battlefields, and what little was produced was not being marketed because the North blockaded the Southern ports.

1865–1889, Reconstruction
The development of the railroads connecting the Middle West and the South to the West made trading much faster and more profitable. Also, the Great Lakes became a hub of commerce between the East and the Middle West.

1890–1913, The Progressive Era
The creation of the Federal Trade Commission brought control and regulation to interstate trade. For the first time, more people were employed by industry than working on farms.

1914–1928, World War I and the Jazz Age
The Eighteenth Amendment was ratified, prohibiting the manufacture of alcoholic beverages for sale. Illegal industries evolved, creating an active trade in wines and liquor. America enjoyed prosperity and demanded more goods and services.

1929–1939, The Great Depression
The stock market crashed in October 1929. A domino effect occurred in the nation’s economy, and consumer spending collapsed. Trade was nearly at a standstill.

1941–1945, World War II
Demand for goods and services was once again placed on need only, although government spending on the war effort helped bring the country out of the Great Depression. Production for consumer goods declined, and production for war necessities increased.

1945–Present, The Modern Era
Since World War II, goods increasingly have been mass-produced to meet the needs and wants of consumers. Companies seek lower production costs and less time in the production process. America has seen a shift from domestic production to foreign production. Some American companies have bought land in foreign countries, built factories there, and used their labor because it is cheaper. America is helping the underdeveloped countries to develop as a result of trade. The technological revolution beginning in the mid-1980s has brought even faster production methods and a new set of industries. Consumer spending is at an all-time high. Trade is evolving on a worldwide basis.

Workers benefit from securing jobs with companies that are expanding and desire labor. A producer or company can grow as a result of the demand for its product or service. Consumers are able to make better choices because they create the demand for products and services and indirectly create competition among producers. The government benefits because of the taxes on producers, workers, and consumers. Since colonial times, trade has contributed greatly to the standard of living of the United States, which is among the highest in the world.

Early Trade, 1492–1783
The people who came to America did so to seek freedom of religion, freedom of political views, and economic op-



portunity. Great Britain still had control over the colonies. As the American colonies were being settled, trade became a means of survival. The Native Americans assisted the colonists in growing food. They introduced them to potatoes, corn, and tobacco, which the colonists in turn traded for goods from Europe. Indigenous people had their settlements either near waterways or near trails they had created. The colonists used both the waterways and the trails as transportation routes to conduct trade. As America was being explored, trade was evolving. To make trade easier, trading posts were set up in towns. A popular one, called the Aptucxet Trading Post, was founded by the Pilgrims in 1627. It has often been referred to as “the cradle of American commerce.” Fur, lumber, luxury goods, and food were just a few things that were traded. The Native Americans as well as the Pilgrims used the trading post, exchanging beaver skins for blankets, guns, hatchets, and rum. As the colonial trade grew, hostilities developed with Britain and also among the colonists themselves over trade issues. As the colonists were prospering, Britain was losing money as a result of its war with France over territories in North America. Britain’s unsuccessful efforts to tax the colonists helped spark the Revolution. During the war, little trade took place outside the colonies. People became more self-sufficient and interdependent, or they just did without things that they were used to having. By 1762 merchants had been complaining that there was no central bank in the colonies. A central bank was starting to evolve by the end of the Revolution. The signing of the Constitution in 1787 created a strong government that supported Americans who were trying very hard to maintain an economy based on domestic trade, with an emphasis on agriculture.

A New Nation, 1783–1860
In the late 1700s, the newly discovered Ohio Valley waterways made inland trade easier for New England and the middle and southern colonies. The steamboat made a successful appearance in the Ohio Valley in 1811. It was mainly used to get crops to market areas. The “river cities,” including Cincinnati, Louisville, Saint Louis, and New Orleans, became trading hubs as manufacturing developed along the waterways. By the 1820s there was a central bank and a national currency. In almost every large town and new city, banks were being built. The opening of the Erie Canal in 1825, connecting Lake Erie to the Hudson River, also furnished a new outlet for the Northwest traffic. New York City was no longer just a market but also a commercial center for trade. Waterways continued to be important, but landlocked towns began to prosper from trade as a result of the railroads, which spread quickly from Baltimore to Wisconsin. Most of the northern Atlantic Coast turned to manufacturing as the railroad continued to grow.


Civil War, 1861–1865
At the time of the Civil War, the South was producing mostly agricultural products, with an emphasis on cotton and tobacco as major commodities for trade. The North cut off the South’s markets when the war started, and its trade was almost at a standstill. Also, with so many men in the army, it was impossible for the women left behind to operate the transportation systems to get the products to markets. The South was mostly self-sufficient and poor once again. By 1863, the North was seeking profits on goods related to the war, such as the high demand for army uniforms. The North was thriving economically and making quick money.

Reconstruction, 1865–1889
The two decades from 1870 to 1890 marked the development of railroads connecting the Middle West and the South to the West. People migrated to the West in search of a profitable economic future. New settlements sprang up quickly along with the manufacturing and agricultural trade. With the development of a more sophisticated transportation system came a demand for material possessions. The Great Lakes were transformed to provide the needs and wants of commerce between the East and the Middle West.

The Progressive Era, 1890–1914
The Progressive Movement started in about 1890 as a protest against the excesses of the preceding century and the corruption in government at the time. One result of this movement was more effective regulation on business and trade. Journalists (some of whom became known as muckrakers) exposed the sins of corporate giants like Standard Oil Corporation to the socially conscious public. The turn of the century saw the establishment of large corporations and trusts, which attempted to control both supply and demand of their product category and to exercise authority over newly organized labor unions. These new corporations controlled vast amounts of money and resources, which they used for expansion, competition with foreign business, political influence, control of large blocks of stock, and the pooling of patents. The creation of the Federal Trade Commission under President Woodrow Wilson brought control and regulation to interstate trade. Cities were growing explosively because of their role as centers for great industrial corporations. Cities had the money and the employment, and to them came the vast armies of workers, the expanding railway systems, and the crowded and often unhealthy factories. Department stores flourished and became centers of shopping, enabled by improvements in transportation, such as tramways and motorcars. For the first time in United States history, there were more people employed by industry than working on farms.


In spite of the move to the cities, agricultural trade also grew. The shift on the farm from manual labor to machines allowed for the expansion in commercial farming. The expanding population in the cities provided a large market for the farmers’ products, but there was still enough left to sell to foreign countries. The number of farms in the United States tripled between 1860 and 1910, from 2 million to 6 million. At the turn of the century, America was a land of abundance. Supplies of many natural resources surpassed those of the rest of the world. By this time, there was a well-established trade both domestically and internationally in iron, steel, coal, cotton, corn, and wheat.

World War I and the Jazz Age, 1914–1928
In 1917 the United States entered World War I, and from that experience first became a major world power. During the brief period that the country was involved in the war, the shortage of men at home meant that there were plenty of jobs available and full employment. Lucrative government contracts meant a full workload. Immediately following the war, there was a glut of returning veterans seeking work. The end of wartime contracts meant fewer jobs, business owners attempted to

drive down wages and to break unions in an effort to maintain profits, and the unions began to revolt. Scarcity of available money stalled the shift to a consumer goods economy. In 1919 the Eighteenth Amendment was ratified, prohibiting the manufacture of alcoholic beverages for sale. It spawned an illegal cottage industry that resulted in an active trade in wines and liquors. In addition, the period gave rise to a new industry: the underworld. Trading in alcohol and the demand for union busters gave illegal activity new strongholds. During the 1920s, America enjoyed an era of prosperity, and big business regained control. New wealth brought more leisure time and the growth of the entertainment business: organized sports, silent films, radio, mass-oriented magazines, and recorded music. Corporations issued stock publicly on a wide scale, and millions of Americans were able to buy stock in these giant companies.

The Great Depression, 1929–1939
All of this affluence came to a crashing end when the stock market collapsed in October 1929. Just prior to the crash in 1929, the Gross National Product was $87 billion; four



years later, it had shrunk to $41 billion. Every day, factories closed and banks and businesses failed. In 1930 the jobless numbered 7 million; by 1932 the number had risen to 15 million out of a total workforce of 45 million. Farmers were also hurt, as thousands lost their land and homes through foreclosure. In the South, the collapse of the export market for cotton and tobacco and problems with overproduction led to looting and riots in 1931. American businessmen found that the overseas markets for their goods were drying up because the depression was becoming global. In 1933 President Franklin Delano Roosevelt led Congress to enact a wide variety of emergency economic and social legislation called the New Deal, which brought some relief to the ailing country. The Securities and Exchange Commission was created in 1934. Its purpose was to police corporations that were issuing new securities. The National Recovery Administration was created in 1933 to establish codes for fair competition and to guarantee workers the right to form unions. Minimum wages and maximum work hours were established, and the Social Security system was created to provide relief to the elderly and infirm. During this decade, Hollywood became the movie capital of the world. With the sensational boom in “talk-


ing” movies, an industry was born that would supply entertainment to the country and abroad. Isolationism was the U.S. foreign policy in Roosevelt’s first term in office, a policy in opposition to America’s efforts to regulate international currency and trade.

World War II, 1941–1945
All of this changed on 7 December 1941 when Japan bombed American ships in Pearl Harbor and America entered World War II. Some say that this marked the end of the depression. World War II saw the beginning of what came to be called the military industrial complex. This alliance between government and big business led to unprecedented production records; manufacturing production in 1943 doubled over the year before, as thousands of previously civilian businesses shifted into manufacturing items for war. With so many men in the armed forces, there were new job opportunities for women. Farmers were also affected, as the increasing mechanization of equipment led to a 35 percent rise in output in those years—enough food for civilians, American armed forces, and allies like England and the Soviet Union. At home, Americans moved to cities to find work and to consolidate families whose men had gone to war. Manufacturing plants in the north brought African Amer-


icans from the south to work. Job opportunities existed for this group of workers, but the results were crowded working conditions, inadequate housing and transportation, and urban blight. Volatile racial tensions erupted, and in Detroit, Michigan, they produced one of the bloodiest riots in history.

The Modern Era, 1945 to the Present
At the close of World War II, the United States was the most powerful nation in the world, both politically and economically. The GI Bill provided $15 billion for veterans’ college educations and low-cost mortgages for new homes in the suburbs. Factories and businesses sprang up in record numbers, and women in the workforce added to the new affluence of families. America entered the consumer culture in the 1950s, and this trend continued throughout the rest of the century. The 1960s were a period of revolt and change in America. The 1970s continued this radical movement, as the baby boom generation (born just after World War II) came of age and provided new areas of consumer demand for punk rock music, drugs, hippie fashion, and vegetarian cuisine. The late 1970s saw massive inflation, and the Arab oil embargo had a profound effect on the cost of living as oil prices soared for business, home, and auto. The American automobile business lost its position of domestic dominance as more fuel-efficient cars from Japan became popular. This business suffered badly until the end of the 1980s, when cooperative deals between American and Japanese automobile manufacturers resulted in Japanese auto plants being built in the United States. Shortly after the end of World War II, the General Agreement on Tariffs and Trade (GATT) was established, reducing world tariffs and allowing for increased imports. Revisions to GATT in December 1993 provided for the elimination of all quotas on clothing and textiles on 1 January 2005. Also in 1993, the North American Free Trade Agreement (NAFTA) was signed. This agreement set new guidelines for cooperative trade between the United States, Mexico, and Canada. Since the 1970s a sharp rise in low-cost imports has led to the decline of industry in America, and the United States has lost its place as a world leader in manufacturing. Between 1980 and 1991, the number of workers in the manufacturing sector fell by 2 million. At the same time, United States workers were increasingly finding employment in the service sector in areas like health care, computer programming, insurance, food service, banking and finance, and entertainment. Multinational companies became the vogue, and mergers and acquisitions and joint ventures defined the business landscape. Domestic trade in the last decades of the twentieth century was strong in the areas of automobiles, housing, computers, and environmental and health-related products. The computer business was the fastest-growing in-

dustry in the United States between 1973 and the late 1990s, and the cellular telephone business boomed. As manufacturing in the United States declined and large corporations suffered, there was a sharp increase in small businesses (those with 500 or fewer employees). At the turn of the twenty-first century, 95 percent of all businesses in the United States were classified as small businesses.

BIBLIOGRAPHY

Badger, Anthony J. The New Deal: The Depression Years 1933–40. New York: Hill and Wang, 1989.
Batra, Ravi. The Myth of Free Trade: A Plan for America’s Economic Revival. New York: Scribners, 1993.
Boyer, Paul S., ed. The Oxford Companion to United States History. New York: Oxford University Press, 2001.
“Doubtless as Good”: Growing Pains. Available at www.americanhistory.si.edu/doubtless
Nash, Gerald D. The Great Depression and World War II: Organizing America, 1933–1945. New York: St. Martin’s Press, 1979.
An Outline of American History. Available at www.usinfo.state.gov/usa/infousa/facts/history
Samhaber, Ernst. Merchants Make History: How Trade Has Influenced the Course of History Throughout the World. Transl. E. Osers. New York: John Day, 1964.
Sitkoff, Harvard. Postwar America: A Student Companion. New York: Oxford University Press, 2000.
Streissguth, Tom. An Eyewitness History: The Roaring Twenties. New York: Facts on File, 2001.
Uschan, Michael V. A Cultural History of the United States Through the Decades: The 1940s. San Diego, Calif.: Lucent Books, 1999.

Donna W. Reamy
Rosalie Jackson Regni

TRADE, FOREIGN. The United States throughout its history has been relatively self-sufficient; yet foreign trade has, since the colonial period, been a dominant factor in the growth of the nation. The colonies were founded basically for the purpose of commerce: the shipment of products, particularly raw materials, to the mother country and the sale of finished goods from the shops of England in the colonies. Even had colonial plans not been centered around the welfare of Englishmen at home, the results could scarcely have been different. The Atlantic coast is particularly suited to commerce on the high seas. Deep harbors in the North and bays, indentations, and rivers and smaller streams from New York southward provided excellent ports for loading and unloading the ships of the day. Moreover, the settlements, clustered around the places where the ships came in or scattered along the rivers and creeks, were almost completely isolated from each other. As late as 1794 it took a week (under the most favorable conditions) to make the trip by coach from Boston to New York. Although the



seas were infested with privateers and pirates (and the distinction was sometimes a thin one) and the ships were small and the journey long, the hazards of overland trading were still greater and the returns more uncertain. Foreign trade was primarily in outgoing raw materials and incoming manufactured goods during the colonial period. Simple economic necessity had turned the colonists to agriculture. When surplus food production became possible, economic specialization appeared. Dictated by climatic and soil conditions, as well as by a host of other factors, production in each section determined the course of its commerce. The trade of the colonies south of Pennsylvania was chiefly with England. Ships from British ports called at the wharves of plantations along the rivers of Maryland and Virginia for tobacco and the next year returned with goods ordered from the shops of London and other cities. Furs, skins, naval stores, and small quantities of tobacco made up the early cargoes that went out from the Carolinas, but after 1700 rice quickly gained the lead as the most important export. Before the middle of the century indigo had become a profitable crop not only because it offered employment for the slaves when they were not busy in the rice fields but also because the demand for the dye in England had induced Parliament to vote a bounty. On the eve of the Revolution indigo made up by value about 35 percent of the exports of South Carolina. The commerce of New England and the middle colonies ran counter to economic plans of empire. Grain, flour, meat, and fish were the major products of Pennsylvania and New Jersey and the colonies to the north. Yet shipment of these materials to England endangered long-established interests of Englishmen at home. Although small amounts of naval stores, iron, ship timbers, furs, whale oil and whalebone, oak and pine plank, and staves, barrels, and hoops went off to London, other markets had to be sought in order to obtain means of paying for the large amounts of goods bought in England. The search for sales brought what is often referred to as the triangular trade. Southern Europe, Africa, and the West Indies bought 75 percent of the exports of New England and more than 50 percent of those of New York and Pennsylvania. On the eve of the Revolution the middle colonies were shipping annually to southern Europe more than 500,000 bushels of wheat and more than 18,000 tons of bread. Fish, meat, grain, ship timbers, lumber, and materials for barrels, kegs, and casks also went out in large quantities from Pennsylvania, New York, and New England. Rum was exchanged in Africa for slaves, and the slaves in turn sold in the West Indies for specie or for more molasses for New England rum distilleries. These islands, in fact, provided an inexhaustible market for fish, meat, foodstuffs, and live animals, as well as pearl ash, potash, cut-out houses, lumber, and finished parts for making containers for sugar, rum, and molasses. Corn,


wheat, flour, bread, and vegetables found their greatest outlet in the islands. Unfortunately the sellers of raw materials—the colonists—were almost always in debt to the manufacturers of finished goods—the British. Carrying charges by English shipowners ate up the favorable balance of the southerners, and the debts of the planters became virtually hereditary. Northern commercial men, selling more than they bought everywhere except in England, gained enough specie to settle their accounts in London with reasonable promptness. The persistent drainage of money to the mother country, however, was a significant factor in the discontent that developed in America. Although the Revolution did not destroy American trade, even with the British, the former colonies obviously lost their preferred position in the world of commerce and also the protection of the powerful empire fleet. British trade regulations of 1783 (emphasized by further regulations in 1786–1787) closed the ports of the West Indies to the ships of the new nation and protected others by heavy tonnage duties. Only Sweden and Prussia agreed to reciprocity treaties. Yet this critical postwar period was far less discouraging than it is sometimes pictured to be. Varying tariffs in the ports and hostile action and counteraction among the states did keep commerce in perpetual uncertainty and prevented retaliation against European discriminations, but trade went on either in traditional channels or in new markets. Shipping interests in the new Congress secured legislation favoring Americanowned ships. The tonnage registered for foreign trade increased in the years 1789–1810 from 123,893 to 981,000, and imports and exports in American bottoms jumped roughly from about 20 percent to about 90 percent. The Napoleonic Wars turned production forces to military goods, drove merchant ships from the seas, and pushed prices upward rapidly. Although many ships were seized, American merchant captains and the nation prospered until President Thomas Jefferson, seeking to maintain peace, induced Congress in 1807 to pass the Embargo Act. Exports dropped from $108.3 million to $22.4 million within a year; imports fell from $138.5 million to $56.9 million. Repeal of the embargo brought some revival, but other restrictions and the war against England drove exports to $6.9 million in 1814 and imports to $12.9 million. Foreign trade in the years between 1815 and 1860, though fluctuating often, moved generally upward. Agricultural products made up the major part of the exports. Cotton led all the rest—production mounted from about 200,000 bales in 1821 to more than 5 million in 1860, 80 percent of which was sold abroad. Great Britain and France were the two greatest purchasers, but Germany, Austria, Belgium, Holland, and Russia bought appreciable quantities. The West Indies and South America took large amounts of grain and flour, and English demands increased steadily after the repeal of the corn laws in 1846. Tobacco, rice, meat, and meat products, as well as lumber,


naval stores, barrels and kegs, staves, and hoops moved out in large quantities. Cottons, woolens, silks, iron, cutlery, china, and a miscellany of other items made up the bulk of the incoming cargoes. But the glory of the clipper ship was being obscured by the iron-hulled steamers that came from the British shipyards; the day of the whalers was ending even before oil began to flow out of the first well at Titusville, Pa., in 1859. As the nation became increasingly industrialized between the Civil War and World War II, domestic production and domestic trade were its basic concerns. Railroads knit marketing centers together and economic specialization reached maturity. Agriculture, spreading into the West, increased each year its outpouring of foodstuffs; and industry, entrenched behind a high protective tariff, grew with astounding rapidity. The American merchant marine declined rapidly as investors turned their dollars into railroads and other industrial ventures at home. The percentage of foreign trade carried in American bottoms decreased from 66.5 percent in 1860 to 7.1 percent in 1900. That did not mean, however, any lessening in total ocean commerce. The value of exports and imports combined rose from $686,192,000 in 1860 to $4,257,000,000 in 1914. Cotton, wheat, flour, and other farm products continued to move out in ever-larger amounts, but it was obvious that agriculture was losing out to manufactured goods. The changing nature of exports and imports clearly revealed the fact that Europe

was becoming each year relatively less important in American foreign trade. Shipments to and from Asia, Oceania, Africa, Canada, and Latin America were growing rapidly.

World War I restored temporarily the supremacy of Europe as a consumer of American agricultural products. But new goods also made up large portions of the cargoes—chemicals, explosives, firearms, special woods for airplane propellers, barbed wire, and a host of other needs of fighting forces. The value of exports and imports more than doubled during the war. The huge purchases of the Allies were based on government credits in the United States, and the slow growth for the next decade was financed largely by American loans.

The economic structure fell apart in 1929. Prices declined sharply everywhere; world credit and world finance broke down; foreign exchange transactions were curtailed or taken over completely by government in many places; and the principal powers sought to maintain themselves by hiding behind high tariffs, trade licenses, and fixed quotas. The United States had for a decade been shutting itself off from the world. The climax was reached in the Smoot-Hawley Tariff of 1930, which brought retaliatory restrictions from other nations. Foreign trade of the nation dropped to $2.9 billion in 1932. The slow climb upward to $6.6 billion in 1940 was in part the result of the insistence of Secretary of State Cordell Hull that reciprocity agreements rather than trade restrictions were essentials in commercial revival. By authority of the Reciprocal



Trade Agreements Act of 1934 he made a series of executive agreements with foreign nations by which he encouraged American trade, and, by applying the most-favored-nation clause, spread the gains widely over the world.

In the war years 1941–1945 more than $50 billion in goods went out of American ports, and $17 billion came in. But about $32.9 billion of the exports were lend-lease war materials to fighting allies and no payment was expected. That was a startling change in the customary creditor-debtor relationship following World War I, but the experiences of that war dictated the decision.

The whole international economic structure was, in fact, undergoing a basic revolution. By the end of the war production facilities had roughly doubled; the nature of the outpouring products had changed astoundingly; and the people of the nation in general and the agricultural and industrial working force in particular had not only found new homes but also new wants and new hopes. Tired of rationing and eager for a new world, Americans were at the end of the war impatient with the delays in transforming the industrial plants from war goods to peace goods and intolerant of any threats of wage cuts.

But reconstruction in the nation was slow. Shelves were long empty and shortages of many essentials developed. Europe was paralyzed, and multilateral trade had all but ended. Fearful of communism and convinced that hunger must be eliminated if traditional nations were to be reestablished and if new ones were to be created on the principle of freedom of choice, the United States initiated (1947) the Marshall Plan, which, as proposed by U.S. Secretary of State George C. Marshall, provided $12 billion in aid for the economic recovery of Europe. Already American loans, credits, grants, and relief—private and public—had exceeded that amount by several billion dollars. The plan was not envisioned as a relief program but as a cooperative venture that would restore, or create, economic well-being for all. On 3 April 1948, President Harry S. Truman signed the European Recovery Act, which, through the Economic Cooperation Administration, headed by Paul G. Hoffman and a European coordinating body, the Organization for European Economic Cooperation, gave out through sixteen national offices in Europe and a mission in China at least $17 billion over a four-year period.

Machinery for regulating international monetary and trade relations had already been established by the end of the 1940s. The International Monetary Fund (IMF) and the International Bank for Reconstruction and Development (the World Bank) had been created at a meeting in Bretton Woods, N.H., in 1944. The General Agreement on Tariffs and Trade (GATT), with authority to agree on tariff rates in the free world, floundered for a while but became firmly established by late 1947.


If the 1940s were years of destruction and reconstruction, the 1950s were, throughout the free world, years of growth and of adjustments in a transition from a basically nationalistic thinking concerning tariffs and trade to a basically international philosophy of freedom of world commerce from deadening restrictions. The experiences of the Great Depression and World War II turned thoughts earnestly toward free trade. Led by the social philosophers and economists, the movement gained remarkable headway, even among political leaders.

Conscious of the disadvantages of small and sometimes jealous countries in building an industrial structure and in bargaining with great nations, such as the United States and the Soviet Union, Europe turned to unity. Assuming an independent stance, although still drawing appreciable amounts of U.S. aid, France, Belgium, West Germany, Luxembourg, Italy, and the Netherlands in 1957 formed the European Economic Community (EEC), most often referred to as the Common Market. Since the primary purpose of the organization was to improve the economy of all the members by throwing a common barrier around the whole and harmonizing restrictions within, various interests in the United States, especially farmers, were deeply concerned. Within three years after the formation of the Common Market, Great Britain, Sweden, Norway, Denmark, Austria, Switzerland, and Portugal formed the European Free Trade Association (EFTA). (Finland became an associate member in 1961.) With the United States and Canada, the groupings came to be called the Atlantic Community. But not all was harmony in the new economic community. The mercantilists quarreled with the tariff reformers everywhere, and in the United States there was opposition to shifting control of tariff rates from Congress to an international body.

The decade of the 1960s was at times a period of bitter controversy. President John F. Kennedy early in 1962 requested Congress to delegate some of its authority over tariffs to the executive department so that he might make revisions at home and might, in the meetings of GATT, bargain for ends that would further the trade of all of the countries involved. The Trade Expansion Act of 1962 granted much authority to the president, notably the power to reduce tariffs on a linear basis by as much as 50 percent on a most-favored-nation basis. American delegates and officials of the Common Market, who were determined to assert themselves politically and economically, gathered in Geneva in 1964, in what is called the Kennedy Round of the GATT discussions. The ministers of the various countries had met the year before in a somewhat vain effort to work out ground rules for the proceedings. Agreements concerning rates on even the simplest industrial groups were troublesome to reach, and reductions in agricultural tariffs were arrived at—if at all—only with great difficulty. After nearly four years of controversy, the meeting adjourned with average


tariff rates lowered to somewhere between 35 and 40 percent. Many industrialists and laborers in the United States, wholly dissatisfied, returned to protectionism. Members of the Common Market were unhappy in many ways also, but obviously pleased that they possessed the power to challenge the United States.

The foreign trade of the United States had undergone profound changes. The great surpluses that had marked U.S. world commerce from the 1870s began in the 1950s a decline that reached significant proportions in the 1960s. The great steel empire that Andrew Carnegie and Henry Clay Frick had done much to make the wonder of the industrial world was crumbling because of new mills and less costly labor in other countries. Freighters put into ports on the Atlantic, the Pacific, and the Gulf and even traveled down the Saint Lawrence Seaway to Cleveland, Detroit, and Chicago to unload finished industrial goods in the heart of America. As Europe and other countries of the free world made a remarkable recovery from the war years, products from their new plants poured into the stream of international commerce. Between 1960 and 1967 finished goods in U.S. imports increased 150 percent. Steel, automobiles, textiles, and electronic goods led the new imports. Incoming steel, until 1957, was insignificant in amount and had grown only to some 3 million tons in 1960. But by 1967 shipments had reached 11.5 million tons, and the next year reached 18 million. In 1971 steel imports amounted to 18.3 million tons—nearly 18 percent of all steel sold in the nation that year, when total employment in American mills also fell to its lowest point since 1939.

Competing steelmaking plants, although new, were not appreciably more efficient than those of the United States. Basically the steel problem was too much steel. Production facilities over the world were far in excess of need; yet Japan, for instance, although having to both bring in raw materials and send its finished product to faraway markets, ever increased its output. Even production in Mexico, South Korea, Spain, and the Philippines, for example, grew steadily as capacity outside the United States doubled in the 1960s. American steelmakers were both unwilling and unable to bargain in the marketplace. They blamed cheap labor (the European advantage, they asserted, was about $20 a ton; the Japanese roughly twice that amount) and liberal governmental assistance in the form of border taxes, license requirements, special levies, quotas, export rebates, hidden subsidies, value added tax, and other monetary and legislative provisions for hindering exports from the United States and encouraging exports to the United States. They turned to Congress for help. Both European and Japanese producers agreed to limit further shipments of steel for the next three years to an annual growth of 2.5 percent.

The automobile industry was turned topsy-turvy also. Large British and French cars—once popular as prestige vehicles—steadily declined among American imports as small European cars, encountering little American competition, began to appear in ever-larger numbers in the United States. Only in the export of trucks, buses, and automotive parts and equipment did the United States keep the unfavorable trade to a reasonable limit in the automotive field.

Textile and footwear manufacturers, too, protested the loss of markets because of competing goods from other countries, especially Japan. Some agreements were reached concerning shipments of cotton cloth into the United States, but the whole field of synthetic fibers remained open. Between 1965 and 1969 American imports of man-made fiber textile increased from 79 million pounds to 257 million pounds. During the same period imports of wearing apparel of man-made fibers grew from 31 million pounds to 144 million pounds. The number of imported sweaters rose from 501,000 dozen in 1965 to about 6.9 million dozen in 1969. Imports of footwear were increasing also: 96 million pairs in 1965; 202 million pairs in 1969. In the first four months of 1970, one-third of the demand for footwear was being met by foreign shops.

Electronic goods in foreign trade added appreciably to the deficit in the United States. Between 1963 and 1970 such imports, by value, mostly from Japan, increased at the annual rate of 32 percent. By 1970 they accounted for 37 percent of the television sets, 63 percent of the phonographs, 92 percent of the radios, and 96 percent of the tape recorders sold in the United States—though some of the parts were made in American plants or in American-owned foreign plants. Even the developing countries exported local products, including tropical fruits and novelties, and such substantial products as special steels.

The basic problem in American foreign trade in the early 1970s was that imports had increased more rapidly than exports. Building on the foundation of American aid after World War II, and to an appreciable extent on borrowed American technology, Europe and parts of Asia performed an industrial miracle and captured markets over the world, especially in the United States, with their well-made goods. Moreover, the United States, suffering from persistent inflation and its consequent high prices, could not effectively compete in world markets. Imports were cheap in comparison with domestic prices, and foreign goods flowed freely into the ports. Many industrialists and wage earners in the United States resented the economic penalties they thought the changing foreign trade situation had brought. During the 1960s ever-increasing numbers of U.S. corporations and individuals set up factories throughout the world. Some said they were fleeing behind the protective walls that prevented Americans from selling in many world markets; others said they were escaping the high wages at home that choked them out of world competition; a few said they were getting away from the irresponsible American workmen. Discontent in the nation continued to grow, and American industrialists and laborers and a great num-



ber of other citizens, convinced that the whole international experiment had been a failure, turned to protection. Arguments by theoretical scholars and realistic statisticians that free trade had created more jobs than it had destroyed and that a return to the old order would bring economic tragedy were unconvincing to factory owners with limited markets or to men without jobs. American foreign trade was involved not only in the complex industrial world but also in the even more complex monetary world. The annual unfavorable balance of payments, sometimes of several billion dollars, made it difficult for the nation to pay its bills. The merchandise exchange was with few exceptions favorable to the United States; it was the balance of payments that embarrassed the nation. Military commitments in Europe and elsewhere, the Vietnam War, heavy expenditures of American tourists abroad, shipping charges, and a host of other payments left the nation each year through the 1960s and at the beginning of the 1970s heavily indebted. This debt steadily increased the claims on the gold reserves of the United States and brought an ever-growing doubt concerning the dollar. The essential monetary difficulty was not so much the problem of gold as it was the problem of adjusting the existing monetary system to the needs of the new international situation and the overvalued dollar—the only currency in the free world with a fixed value based on a specific amount of gold. (The designers of the IMF at Bretton Woods had set up that standard with all other currencies having a parity relation to it. There was a modest permissible variation in the rate of exchange.) If, however, the unit value of any currency became too cheap or too expensive in terms of other currencies, it could be devalued or revalued upward to be realistically realigned with the dollar. In the 1960s most of the currencies of the major countries had become greatly undervalued in terms of the dollar, notably the West German mark and the Japanese yen. Thus imports were temptingly cheap in American ports, and exports discouragingly costly in foreign markets. Through the 1960s U.S. imports grew twice as fast as exports, and the small trade surplus fell each year far short of meeting the persistent foreign debt. Dollar claims piled up in Europe. In 1968 additional reserves (often referred to as paper gold) were provided by the creation of Special Drawing Rights issued by the IMF. But the imbalance continued. There was no lack of suggested remedies: devalue the dollar; increase the price of gold; widen the parity margin; float all currencies; desert gold altogether. Each proposal stirred some doubts, and each one presented a plethora of known and unknown difficulties. As the 1970s began, there was no question that the dollar was under tremendous pressure. The impending crunch came in August 1971, when higher interest rates in the United States, rumors of revaluations, and a growing American deficit, swelled by strikes and threatened strikes, poured a flood of unwanted


dollars into Europe. Speculators, corporations, commercial banks, and other holders, protecting themselves from changes in currency values, began to scurry out from under their surplus dollars. They returned to the United States $4 billion in the second week of August. The nation at the time held only $13 billion in its gold reserve against some $60 billion in short-term obligations. On 15 August President Richard M. Nixon closed the door on gold redemptions and levied a 10 percent surtax on dutiable imports. The drastic action, it was hoped, would force Japan and the major European countries to revalue their currencies upward, remove some of their manifold barriers to United States trade, and share the costs of American military forces stationed abroad. Despite many fears that the action might disrupt the monetary world, the situation cleared appreciably, although the bitternesses that had long existed did not disappear.

By February 1972 the monetary situation had begun to deteriorate rapidly. Fearful that Congress, dissatisfied with promised trade concessions from the Common Market, Canada, and Japan, would severely amend the devaluation proposal, Europe began to enact currency controls. American foreign trade throughout the year remained the largest in the world, but exports made no appreciable gains on imports. The surtax, soon removed, had not lessened appreciably the amount of goods coming into American ports. Tariff walls had come down, but other barriers had gone up. The dollar, devalued again in February 1973 and further deteriorated through the succeeding currency float, continued to decline relative to the currencies of the Common Market and Japan. A gasoline shortage developed with the oil embargo of October 1973, and by the early months of 1974 the economic situation was recognized by even the most optimistic as a full-blown depression, with further unemployment but no end to inflation. Quarrels in the free world intensified as the United States established détente with the Soviet Union and offered a friendly hand to China. Sales of grain to the Soviets, reductions in military and other world expenditures, augmented returns from foreign investments, and other favorable factors pushed the balance of trade substantially in favor of the United States by the beginning of 1976.

Between the 1970s and the mid-1990s the U.S. post–World War II dominance of world trade came to an end. Major changes in transportation, finance, corporate structures, and manufacturing restructured the global economy, erasing the significance of international economic boundaries. Whole industries in the United States were largely eliminated, unable to compete effectively against cheaper and often better imports. In labor-intensive industries, such as textiles, shoes, and assembly work, the competition came from low-wage developing countries; in the automobile, steel, and electronics industries it came from technological innovators abroad who developed new products and efficient manufacturing.


The United States continued to be the world’s largest internal economic market, but this did not isolate the United States from international trade, as it willingly imported goods and services and eagerly exported goods and know-how. The United States sought a role in the global economy, as evidenced by the North American Free Trade Agreement (NAFTA) and major revisions in the General Agreement on Tariffs and Trade (GATT). NAFTA, which became effective in January 1994, created a major regional trading block including Canada, the United States, and Mexico. This far-reaching agreement reduced tariffs over a fifteen-year period, eased cross-border transportation, and opened Mexico to U.S. and Canadian investments, even in banking and state-owned energy monopolies. Labor unions opposed NAFTA, asserting that corporations would transfer jobs and plants to Mexico because of lower wages and lax environmental regulations. GATT negotiations were protracted. Revisions were negotiated by three presidents—Ronald Reagan, George Bush, and Bill Clinton—who supported cutting tariffs among 123 nations. The GATT agreement known as the Uruguay Round reduced tariffs by 40 percent, cut agricultural subsidies, extended patent protection, and set out rules on global investment. Disputes were to be resolved by the World Trade Organization (WTO), a powerful arbitration board that would decide whether a nation’s domestic laws violated the agreement. The arguments for trade liberalization through NAFTA and GATT were the classic economic arguments of comparative advantage originally articulated by the early nineteenth-century British economist David Ricardo. The idea was simple—nations should specialize in products they can produce cheaper or better. Deciding what products that would mean for the United States was problematic. The last U.S. trade surplus (U.S. exports exceeding imports) occurred in 1975, when the nation enjoyed a $12.4 billion surplus. By 1984, however, the United States was posting $100 billion-plus trade deficits each year, reaching a record $166 billion in 1994. The trade deficit is a summary statistic for a more complicated set of relationships that includes country-to-country deficits and surpluses and differences between economic sectors. In 1993, for example, the United States ran a trade surplus of $12.8 billion for foods, feed, and beverages but had large deficits in automotive vehicles ($50 billion) and consumer goods ($79.4 billion). U.S. productivity lost its advantage when other industrializing nations used new technologies and lower wages to gain access to the vast U.S. market, outcompeting domestic manufacturers. A television-addicted nation sat glued in front of foreign-produced sets. The last U.S. television factory—operated by Zenith Electronics Corporation in Springfield, Mo.—closed in 1992, leaving more than 1,300 workers jobless when production shifted to Mexico. Japan cut into several consumer markets— electronics, cameras, computers, automobiles. Japanese brand names became household words: Sony, Mitsubishi,

Toyota, Nissan, Honda, Hitachi, Mazda, Sharp, Canon, Panasonic. The United States turned to quotas to stem Japanese imports. Japan responded by opening plants in the United States that employed U.S. workers but still diverted dollars abroad. Labor unions urged the public to “buy American,” but identifying the products was far from easy. A car “made in America” contained components built in more than a dozen countries on three continents and ultimately assembled in a U.S.-based factory. Was the car American made? A General Motors executive in 1952 testified before Congress: “What is good for the country is good for General Motors.” The reasoning no longer held as GM moved jobs and facilities to East Asia or to Mexican plants along the U.S. border. Displaced from well-paying jobs, U.S. workers and managers found reentering the workforce difficult. Even in a growing economy, new jobs paid less. The Census Bureau found that workers who left or were laid off between 1990 and 1992 saw their weekly wages fall 23 percent when they regained employment, often without health insurance and other benefits. The international economy developed an infrastructure of transportation, financing, and communications that made the movement of money, information, and goods easier and cheaper. Corporations moved money around the world to finance trade, protect against currency fluctuations, or to seek higher returns on investments. Meanwhile U.S. individual and institutional investors looked overseas for investments, often financing enterprises that competed against U.S.-based companies. Huge amounts of capital that otherwise might have been invested in domestic companies found its way abroad into emerging markets in Eastern Europe, Latin America, and the Pacific Rim. Money could be moved instantaneously around the world. Capital’s loyalties were not to governments or domestic economies but to the best rate of return. Making the United States competitive internationally was easier to advocate than accomplish. It put pressures on corporations to reduce employment and improve production. Problems created in the United States by trade liberalization remained largely unaddressed by the end of the 1990s. For the public, trade liberalization was complicated and confusing, with contradictions that were difficult to explain or accept. If trade agreements were good for the nation, why were jobs lost and industries hurt? In 1994 the United States displaced Japan as the world’s most competitive economy, based on an annual index by the World Economic Forum. The international economy subjected the U.S. labor force to new economic pressures—job insecurity, stagnant wages for nonskilled labor, and fewer company-sponsored benefits, particularly health insurance. U.S. wage rates were substantially lower than those in Germany and Japan, but within the United States something else occurred—a long-term trend of widening income inequality between the nation’s rich and the poor and middle classes. Meanwhile, the domestic



economy was transforming itself, moving from an industrial age to the information age, one for which many workers were ill prepared. The questions were how many high-tech, well-paying jobs could the economy realistically create and how could those stuck in low-wage jobs in the growing service sector support themselves and their families.

BIBLIOGRAPHY

Audley, John J. Green Politics and Global Trade: NAFTA and the Future of Environmental Politics. Washington, D.C.: Georgetown University Press, 1997.
Buchanan, Patrick J. The Great Betrayal: How American Sovereignty and Social Justice Are Sacrificed to the Gods of the Global Economy. Boston: Little, Brown, 1998.
Dunkley, Graham. The Free Trade Adventure: The WTO, the Uruguay Round, and Globalism, a Critique. New York: St. Martin’s Press, 2000.
Ferguson, Niall. The Cash Nexus: Money and Power in the Modern World, 1700–2000. New York: Basic Books, 2002.
Friedman, Thomas L. The Lexus and the Olive Tree: Understanding Globalization. New York: Anchor Books, 2000.
Hufbauer, Gary C., and Jeffrey J. Schott et al. NAFTA: An Assessment. Washington, D.C.: Institute for International Economics, 1993.
Krugman, Paul R. The Age of Diminished Expectations: U.S. Economic Policy in the 1990s. Cambridge, Mass.: MIT Press, 1990.
Pomfret, Richard W. T. International Trade: An Introduction to Theory and Policy. Cambridge, Mass.: Blackwell, 1991.
Yergin, Daniel. Commanding Heights: The Battle between Government and the Marketplace That Is Remaking the Modern World. New York: Simon and Schuster, 1999.

James A. Barnes Brent Schondelmeyer / a. g. See also Dollar Diplomacy; Foreign Investment in the United States; Imperialism; Latin America, Relations with; Most-Favored-Nation Principle; Pan-American Union; Reciprocal Trade Agreements; South America, Relations with; Treaties, Commercial.

TRADE UNION EDUCATIONAL LEAGUE (TUEL). Established in Chicago (1920) under the leadership of William Z. Foster, TUEL grew from left-wing labor activists’ efforts to build a progressive union movement. Used by the Communist Party to strengthen leftist forces inside the American Federation of Labor (AFL), it advocated industrial unionism, rank-and-file influence, support for the Soviet Union, and the formation of a labor party as a preliminary step in the establishment of a workers’ republic. It created linkages with some mainstream union leaders and assumed leadership of strikes in northeastern textile and garment industries (1926–1928). Conservative AFL opposition and the Communist Party shift toward revolutionary dual unions undermined TUEL, which disbanded in 1929.

BIBLIOGRAPHY

Barrett, James R. William Z. Foster and the Tragedy of American Radicalism. Urbana: University of Illinois Press, 1999.
Klehr, Harvey. The Heyday of American Communism: The Depression Decade. New York: Basic Books, 1984.

James J. Lorence See also Communist Party, United States of America; Trade Unions.

TRADE UNION UNITY LEAGUE (TUUL) was founded in Cleveland (1929) as the Communist Party’s vehicle for union activity during the Great Depression. Its establishment resulted from the party’s decision to create a revolutionary alternative to the American Federation of Labor. Although independent radical unions had appeared before 1929, dual unionism accelerated the development of separate progressive labor organizations. Led by William Z. Foster and Jack Johnstone, TUUL gained strength in the needle trades, textiles, and coal mining. It led strikes in textiles and coal, including dramatic but unsuccessful stoppages in Gastonia, North Carolina, and Harlan County, Kentucky. With the shift to the Popular Front in 1934, TUUL was dissolved.

BIBLIOGRAPHY

Cochran, Bert. Labor and Communism: The Conflict That Shaped American Unions. Princeton, N.J.: Princeton University Press, 1977.

James J. Lorence See also Communist Party, United States of America.

TRADE UNIONS are associations that represent the collective interests of their employee-members in bargaining and negotiating with large employers. Trade unions generally seek to obtain higher wages, reduced working hours, and improved working conditions for employees. In addition, trade unions seek to improve workplace safety and to obtain increased benefits, such as health insurance, pensions, and disability insurance, for employees. Unions also look to protect the employment security of their members, largely by negotiating to implement seniority rules and to eliminate “at-will” employment contracts under which non-union employees traditionally have been subject to dismissal without cause. Although trade unions did not obtain legal recognition until the 1930s, laborers first began organizing to bargain collectively with employers long before obtaining such recognition. 1780s–1880s In addition to being the cradle of American liberty, the city of Philadelphia also served as the cradle of American labor activism. In 1786, Philadelphia printers staged America’s first labor strike, successfully procuring a $6 per week


minimum wage. In 1792, Philadelphia shoemakers formed America’s first labor association, which lasted for one year before disbanding. In 1834, representatives from various separate trade unions convened at the National Trades’ Union (NTU) Convention, in New York City. The NTU convention, which marked the first substantial effort to create a national labor organization in the United States, set goals for the labor movement that included obtaining legal recognition for trade unions in every American jurisdiction, organizing unorganized workers, establishing universal free public education for children and adults, and eliminating child labor. Some NTU members sought to pursue their goals through political channels by creating a separate political party. A successor to the NTU was formed in 1866, when the National Labor Union (NLU) brought together national trade organizations, local trade unions, city trade assemblies, and other reform-minded groups. The NLU’s progressive agenda included equal pay for equal work regardless of race or gender, an eight-hour work day, and arbitration. Three years later, in 1869, Philadelphia tailors formed the Noble Order of the Knights of Labor (KoL), an organization that included skilled and unskilled labor and promoted arbitration over strikes. Inspired by the socialist movement, the KoL proposed to replace capitalism with workers’ cooperatives. In the following decades, however, these organizations went into decline. First, in 1872, the NLU dissolved after local issues came to overshadow national efforts. Then, a decade later, the KoL lost influence and membership after loosely organized labor was implicated in Chicago’s violent Haymarket Riot of 1886. 1880s–1930s: Labor Gains Momentum In 1886, a KoL splinter group formed the American Federation of Labor (AFL), electing cigar-maker Samuel Gompers as its first president (1886–1924, except 1895). The AFL organized skilled craftsmen by trade, but excluded unskilled workers. Stressing economic rather than political goals, the AFL under Gompers promoted the use of labor strikes and boycotts, and emphasized the need for written contracts with employers. The AFL’s focus was national; Gompers discouraged involvement with local or international issues. Gompers worked within existing political parties, dampening support for a separate labor party. In the early twentieth century, a series of statutes enacted by Congress secured legal protection for labor organizing and union activity. In 1914, the Clayton Antitrust Act made clear that peaceful combinations of workers in labor organizations were not criminal conspiracies. In 1932, the Norris-LaGuardia Act stripped federal judges of power to enjoin strikes, making it easier for workers to strike and picket. The National Labor Relations Act of 1935 (Wagner Act or NLRA) recognized the right of workers to organize and bargain collectively. The NLRA

also created the National Labor Relations Board (NLRB), whose three members were charged with supervising union elections and stopping employers’ unfair labor practices. In 1935, President John L. Lewis of the United Mine Workers of America urged the AFL to begin organizing unskilled industrial workers, in addition to skilled workers. When the AFL refused, Lewis formed the Committee on Industrial Organization (CIO) within the AFL. By late 1938, however, the CIO ratified its own constitution (becoming the Congress of Industrial Organization), and split from the AFL. During Lewis’s tenure as the CIO’s first president (1936–1940), unskilled steel and automobile production workers were organized. 1939–1945: War Economy After Pearl Harbor, the AFL and CIO promised to refrain from utilizing labor strikes for the duration of the war. Without the power to strike, workers lost their most important tool to offset employer power. Further, accelerated wartime productivity increased workplace accidents and injuries. To support workers, President Franklin D. Roosevelt created a 12-member National War Labor Board in 1942, with four members each representing business, organized labor, and government. No constituency was satisfied. Workers disliked the Little Steel Formula of 1942, which restricted wage increases in order to check inflation. Business leaders chafed under Board rulings that presumed new workers at union plants to be union members, and that required employers to terminate workers who failed to pay union dues. Labor, however, remained loyal to Roosevelt, hopeful that their loyalty would pay off politically at the war’s end. Japan’s surrender in August 1945 ended the AFL-CIO No-Strike Pledge, and was followed by a six-month tidal wave of strikes. 1945–1960: Gains in Collective Bargaining, Stability, Affluence In the postwar period, labor unions consolidated successes including the institutionalization of collective bargaining, the development of employee benefits packages, and the adoption of grievance procedures and unionsponsored seniority systems for individual employment decisions. These union successes improved the lot of nonunion workers as well. Per capita U.S. wages rose 45 percent in the 1940s, and 56 percent in the 1950s. For many, the urgency of the worker’s struggle diminished. At the same time, new postwar legislation sought to limit union power. The 1947 Taft-Hartley Act gave individual workers a right to refuse union membership (striking a blow against “closed shop” facilities). It also required unions to provide advance notice of strikes; reauthorized federal courts to enjoin strikes affecting national health or safety for eighty days; restricted unions’ financial contributions to political candidates; defined unfair



labor and union practices; outlawed mass picketing; and neutralized the NLRB’s former labor advocacy position. Labor leaders responded to Taft–Hartley by intensifying political action. Both the AFL and the CIO backed the Democratic Party, effectively ending any lingering support for a separate labor party. In the late 1940s, labor unions began expunging communists from their ranks. In 1952, staunch anticommunist George Meany became head of the AFL. Three years later, to increase labor’s clout, Meany and CIO president Walter Reuther orchestrated an AFL-CIO merger. While Meany assumed the new joint AFL-CIO presidency, Reuther continued to serve as United Auto Worker (UAW) president until his death in 1970. In 1957, Congress enacted the Landrum-Griffin Act to control union corruption, while the AFL-CIO expelled the 1.5 million-member Teamsters Union for corruption. Between 1957 and 1988, three Teamster presidents were convicted and sentenced to prison terms for corruption (Dave Beck, Jimmy Hoffa, and Roy Williams). The Teamsters Union was not readmitted to the AFL-CIO until 1987. 1960s–1970s: Labor Looks Conservative and Bureaucratic In 1962, President John F. Kennedy issued an executive order encouraging union representation and collective bargaining on behalf of federal employees. Consequently, union membership ballooned among public sector employees during the 1960s. However, with the AFL-CIO and the Teamsters serving as the public face of the labor movement, unions’ liberal image changed. In particular, these organizations’ pro–Vietnam War positions caused declines in new union membership among America’s youth. The AFL-CIO also was widely perceived in the 1960s as being insufficiently supportive of civil rights. In particular, unions suffered from a dearth of African American union officials and from ongoing segregation and unequal treatment in the locals. In 1960, Brotherhood of Sleeping Car Porters president A. Philip Randolph (then the only African American AFL-CIO official) formed the Negro American Labor Council (NALC) in order to advance the interests of African American laborers. In 1966, however, Randolph resigned from NALC after its public criticisms of the AFL-CIO intensified. The labor movement’s public reputation was also marred in 1964, when it was revealed that Teamsters’ pension funds had been loaned by union officials to organized crime figures. The ensuing scandal caused the downfall of Teamsters’ president Jimmy Hoffa, who began serving a thirteen-year federal prison term in 1967, but remained president of the Teamsters Union until 1971. Differences between AFL head Meany and UAW (and former CIO) head Reuther on issues of civil rights, political activity, funding of organizing activities, and even-


tually Vietnam, all led to the UAW’s thirteen-year withdrawal from the AFL-CIO from 1968 to 1981. In 1972, the pro-war AFL-CIO declined to endorse pro-labor Democratic presidential candidate George McGovern, because of McGovern’s antiwar stance. Even while the established organs of organized labor were facing difficult times, however, at least one new union was gaining strength in the 1960s and 1970s. During that period, the United Farm Workers of America (UFWA), led by Cesar Chavez, organized Hispanic and Filipino migrant farm workers in California and Arizona. Utilizing both labor strikes and boycotts, the UFWA eventually won collective bargaining agreements from California grape and lettuce growers. In 1971, the UFWA joined the AFL-CIO. 1980–Present In 1981, organized labor suffered a major setback when President Ronald Reagan responded to a federal air traffic controllers strike by firing the striking employees. By illustrating the ability of employers to recruit replacement workers, this episode chilled unions from calling for future labor strikes. Instead, unions in the 1980s and 1990s looked increasingly to legislatures for protection in such areas as minimum wage, family and medical leave, workplace safety, and pension protection. However, organized labor suffered a major legislative defeat in 1994 when the North American Free Trade Agreement was implemented despite heavy union lobbying against it. Since then, however, unions have successfully sponsored campaigns for a Living Wage, which have been enacted by several local governments throughout the United States. BIBLIOGRAPHY

Bernstein, Irving. The Lean Years. Cambridge, Mass.: Houghton Mifflin, Riverside, 1960. Bernstein, Irving. A Caring Society: The New Deal, the Worker, and the Great Depression. Boston: Houghton Mifflin, 1985. Craver, Charles B. Can Unions Survive?: The Rejuvenation of the American Labor Movement. New York: New York University Press, 1993. Frankfurter, Felix, and Nathan Greene. The Labor Injunction. New York: MacMillan, 1930. Geoghan, Thomas. Which Side are You On? Being for Labor when Labor is Flat on its Back. New York: Plume, 1992. Goldfield, Michael. The Decline of Organized Labor in the United States. Chicago: Chicago University Press, 1987. Zieger, Robert H. American Workers, American Unions, 2d ed. Baltimore: Johns Hopkins University Press, 1994.

Linda Dynan See also American Federation of Labor–Congress of Industrial Organizations; International Brotherhood of Teamsters; United Automobile Workers of America; and vol. 9: Ford Men Beat and Rout Lewis; The Pullman Strike and Boycott.


TRADE WITH THE ENEMY ACTS. England’s common law, supplemented by orders-in-council and acts of Parliament, governed restriction of trade with the enemy as a means of economic coercion and domestic conservation. During the French and Indian War (1756–1763) these prohibitions, with the revival of the Molasses Act of 1733, threatened to disrupt the interdependent commerce between the food-producing English colonies and the sugar- and rum-producing French West Indies. Colonists thereupon evaded embargoes by fraudulently sailing cargoes to the enemy’s Caribbean ports in “flags of truce,” ships licensed ostensibly to exchange prisoners. An indirect trade also developed through such neutral ports as Curaçao, Saint Eustatius, and Montecristi, until neutral ships thus involved were captured and condemned under the Rule of War of 1756. The revolutionary embargoes and nonconsumption agreements against England were more effective than the English restrictions of trade largely because of the energy of American committees directed by the Continental Congress and reinforced by local embargo laws.

During the Franco-American “misunderstanding” of 1798–1800 and the War of 1812, Congress proscribed trading with the enemy as part of military policy, although imported war materials from the enemy country were opportunistically permitted. The president had authority to limit and suspend operation of the law.

In the Mexican-American War no restrictions on enemy trading existed. When the enemy’s ports and customshouses had been captured, President James K. Polk not only raised the blockade but encouraged imports into Mexico in order to collect duties to finance the army of occupation.

During the Civil War both belligerents employed the commercial weapon to some extent. The North blockaded southern ports and imposed an embargo; at the same time the Treasury had authority to purchase southern cotton and to license limited trade. Meanwhile the Confederacy prohibited trade with Northerners, and various states ordered further embargoes on cotton exports.

During World War I the country adopted extensive measures to prevent enemy trading and to enforce the Allied blockade of Germany. They included executive proclamations, the Espionage Act, and the Trading with the Enemy Act of 6 October 1917. The latter act carefully defined and almost completely prohibited such trade.

At the outset of World War II, under the auspices of the Trading with the Enemy Act, Congress renewed and enlarged presidential power to seize any property “belonging to or held for, by, on account of, or on behalf of, or for the benefit of, an enemy or ally of an enemy.” Courts consistently upheld its provisions as a necessary means to conduct economic warfare despite its broad scope and sweeping application.

After 1950 Congress extended the Trading with the Enemy Act to situations that had not hitherto been deemed applicable, such as the Korean “emergency” of 1950 and the extended embargo against China and North Korea that thereafter became part of the cold war arsenal. Embargoes proscribed trade with Cuba in 1963 and with North Vietnam in 1964. As involvement in Vietnam increased, the National Liberation Front, the Vietcong, and the Liberation Red Cross faced the act’s restrictions.

In 1969 President Richard M. Nixon opened the door to trade with China, and by 1975 there was a steady trade in nonrestricted goods between China and the United States, setting a precedent that in the future the Trading with the Enemy Act would apply only to “hot war” adversaries.

BIBLIOGRAPHY

Berman, Harold J., and John R. Garson. “United States Export Controls—Past, Present, and Future.” Columbia Law Review 67 (1967).
Eichengreen, Barry J., ed. Transatlantic Economic Relations in the Post–Cold War Era. New York: Council on Foreign Relations Press, 1998.
Lourie, Samuel Anatole. “The Trading with the Enemy Act.” Michigan Law Review 42 (1943).
———. “‘Enemy’ Under the Trading with the Enemy Act and Some Problems of International Law.” Michigan Law Review 42 (1943).

Eric L. Chase Harold W. Chase / c. w. See also Blockade; China Trade; Committees of Correspondence; Smuggling, Colonial; Trade, Foreign.

TRADEMARKS are words or symbols used on goods to indicate source. Merchants and artisans have used trademarks for centuries; the medieval trademark not only allowed artisans to take credit for their work but also permitted guilds to control quality. English common law (the law of court decisions rather than statutes) protected trademarks beginning in the seventeenth century. The colonists brought this law with them from England. George Washington, in 1772, sought to protect the mark “G. Washington” for use on flour. The purpose of trademark law was to prevent consumer deception as to source. This meant that trademarks were local and goods-specific, as was most trade at the time. A trademark of “Washington’s” for flour in Virginia would not extend to “Washington’s” on silverware or to “Washington’s,” even on flour, outside Virginia. Through the nineteenth century, trade became less local, and a system of federal registration was created in 1870. This system, held unconstitutional in 1879 in a series of Supreme Court decisions collectively known as the Trademark Cases, was replaced in 1881, and then in 1905 with a federal trademark registration statute restricted to marks used in interstate commerce, thus rendering it constitutionally valid. The federal scheme became more im-



portant during the twentieth century with the rise of truly national commerce and advertising combined with the judiciary’s generous views as to what constitutes interstate commerce. Today, trademark law is increasingly governed by the federal Lanham Act, passed in 1946 and amended in 1988, though state law remains important. Unlike patents and copyrights, trademarks have no fixed duration. Trademarks are more valuable than ever; some, like Coca-Cola, are certainly worth tens of millions of dollars.

BIBLIOGRAPHY

Chisum, Donald S., and Michael A. Jacobs. Understanding Intellectual Property Law. New York: Matthew Bender, 1996. McCarthy, J. Thomas. McCarthy on Trademarks and Unfair Competition. 4th ed. St. Paul, Minn.: West Group, 1998.

John A. Kidwell See also Copyright; Intellectual Property.

TRADING COMPANIES played an important part in colonial American settlement. Six incorporated British companies established settlements: the Virginia Company at Jamestown (1606), the London and Bristol Company at Sagadahoc (1610), the Council for New England at Newfoundland (1620), the Bermuda Company at Bermuda (1622), the Massachusetts Bay Company at Salem (1629), and the Old Providence Company at Old Providence (1630). The Dutch used a similar organization to plant their settlement in New Netherland at New Amsterdam.

There were two types of trading companies: joint-stock and associates. Joint-stock companies were legally incorporated by the crown by royal charter. They were run by a treasurer and an executive council from headquarters named in the charter. They resembled a modern corporation in selling shares to stockholders, whose liability was limited to their specific investments and who met quarterly in “general courts.” A company’s charter gave it title to a specific territory and a legal monopoly to trade in that region, and it granted the company governmental powers over any settlements in its territory. The company also had control over the natives and authority to defend its settlements and trade from foreign aggression. The colonists themselves lived under a complex of rules and regulations that originated with both company officers and the settlers participating in colonial governments.

All the ships, storehouses, and livestock bought from company funds were company property. Individual colonists might also own private property, and these holdings were subject to taxation to raise money for the colony. The land was a common stock belonging to the stockholders until disposed of by grant to settlers or investors. Practically, there was no way a stockholder in England could share in this common stock except by emigrating to the colony. Trading privileges belonged to the stockholders of the home company.


Limited partnerships called associates—less formal arrangements than joint-stock companies—were another common type of trading company. Such companies were not fully incorporated, and their territorial grants came from some legally incorporated company. The London Company used this device in the settlement of Virginia, where associates settled Berkeley Hundred and many other regions. In return for a title to a specified tract of land, associates agreed to transport a certain number of settlers to a given area and establish them within a limited time. The London Company issued forty-four such grants, including one to the group of settlers that came to be known as the Pilgrims, which they never used because they landed in Plymouth instead.

Another company of associates, the Dorchester Company (1624), received a grant in what later became Massachusetts and established a settlement at Salem. This company, with its grant, was finally merged into the Massachusetts Bay Company, which was incorporated under royal charter. Because the charter stipulated no place for the company’s offices and meetings, the officers moved the company and charter to America, where the company became the basis of a commonwealth, and the “general court” assumed governmental power. The company’s trading division maintained headquarters in London until 1638. The Massachusetts Bay Company furnished a model for later settlements in Rhode Island and Connecticut, the governments of which were similar to that of the joint-stock companies.

BIBLIOGRAPHY

Andrews, K. R., N. P. Canny, and P. E. H. Hair, eds. The Westward Enterprise: English Activities in Ireland, the Atlantic, and America, 1480–1650. Detroit, Mich.: Wayne State University Press, 1979.
Carr, Lois Green, Philip D. Morgan, and Jean B. Russo, eds. Colonial Chesapeake Society. Chapel Hill: University of North Carolina Press, 1988.
Clark, Charles E. The Eastern Frontier: The Settlement of Northern New England, 1610–1763. New York: Knopf, 1970.
Cook, Jacob Ernest, et al., eds. Encyclopedia of the North American Colonies. New York: Scribners, 1993.

O. M. Dickerson / s. b. See also Colonial Charters; Colonial Settlements; Council for New England; General Court, Colonial; Hundred; Local Government; Massachusetts Bay Colony; Pilgrims; Plymouth Colony; Virginia Company of London.

TRADING POSTS, FRONTIER. British, French, and Dutch traders established some of the earliest North American trading posts in the seventeenth century as trade between Indians and European fur trappers increased. While Europeans engaged in the enterprise for profits to be realized from the sale of sought-after furs, Indians exchanged pelts for desired items such as guns and ammunition, blankets, copper kettles, silver, glass beads, and


cloth. Though often no more than a collection of dilapidated cabins, frontier trading posts served as the commercial centers of the frontier, built on or near waterways to expedite both the shipment of furs and pelts downriver, and the return of supplies and trade items upriver.

Under the leadership of Samuel de Champlain, the French established trading posts at Acadia in 1604–05 and Quebec in 1608. In 1609, English sailor Henry Hudson, employed by the Dutch East India Company, claimed the Hudson River valley for the Dutch. Forts Orange (the present site of Albany, New York) and Amsterdam were established as trading posts shortly thereafter. Some of the earliest English trading post records date to 1662, when ten pounds of tobacco were traded for furs to make a hat. Britain’s Hudson’s Bay Company was granted exclusive trade rights to the Hudson Bay watershed in 1670 and for one hundred years enjoyed trade dominance in North America. The fur trade moved into the Great Lakes region in the late seventeenth and early eighteenth centuries, and in 1715 the French established a principal trading post at Michilimackinac on Lake Michigan, near the site of the mission station established by Père Marquette in 1668. A group of independent traders formed the North West Company in 1784 and began to establish trading posts throughout the interior regions of North America, eventually reaching the Pacific Coast.

The XY Company organized in 1798 but found competition with the North West Company too fierce; the two merged in 1804. This merger provided the Hudson’s Bay Company with its greatest competition, and in 1821 the North West and Hudson’s Bay Companies combined, retaining the name of the latter. The American Fur Company, established in 1808 by John Jacob Astor, was the largest American trading company and dominated the fur trade in the United States through its numerous trading posts until its dissolution in 1850. The American fur trade, along with the number of frontier trading posts, increased dramatically after 1803 as the Louisiana Purchase opened vast western territories to exploration, trade, and settlement. In the early-to-midnineteenth century, A. P. Chouteau, West Point–educated son of French trader Pierre Chouteau, acted as general manager of his family’s four trading posts, all located near St. Louis and in the Upper Missouri River valley. The Chouteaus obtained furs and pelts from the Osage, Comanche, and Kiowa, among others, and supplied their posts with goods imported from Europe and Asia. The Hudson’s Bay Company controlled the fur trade in the Northwest from its headquarters, located at the mouth of the Columbia River. Fort Vancouver, under the leadership of post factor John McLoughlin, was the grandest and most self-supporting of the trading posts in


the West. As fur trade brigades were dispatched to remote areas for weeks and months at a time, the lumber produced at the company mill and the fruits and vegetables raised on the company farm were shipped north to Russian posts in the Aleutians, west to the Hawaiian Islands, and around Cape Horn to England. Fort Vancouver served as the depot for all the Hudson’s Bay Company activities in the Northwest from 1824 until 1860, when the company ceased operations in the United States and its territories. Built in 1834 on the LaRemay’s (Laramie) River, Fort William was another of the early western trading posts. William Sublette and his partner, Robert Campbell, undercut prices offered by the competing Rocky Mountain Fur Company, secured the Indian trade, and became quite prosperous. Though Fort William lacked the opulence and grandeur of Fort Vancouver, it provides a better representation of the era’s trading posts; its rectangular stockade, built from cottonwood logs with elevated blockhouses on two corners and over the main entrance, was typical of most eighteenth- and nineteenth-century western posts. In 1824, the U.S. government established Fort Gibson on the Arkansas River to protect settlers against Indian attack. The fort included a sutler’s store; this addition of government merchants began a series of events that permanently altered frontier trade. In the years that followed, the federal government obtained several abandoned frontier trading posts to serve as military posts. In 1850, the army moved into a trading post established by the North West Company in 1820 at The Dalles on the Columbia River and in 1855 it purchased Fort Pierre Chouteau in Dakota Territory. Trappers and traders held a variety of views regarding the consumption of alcohol at frontier trading posts. While Britain’s Hudson’s Bay Company officers occasionally partook of a glass of wine, they banned other forms of alcohol from their trading posts, insisting that consumption caused Indians to become aggressive and fight amongst themselves, rather than paying due diligence to trapping. The French considered themselves primarily trappers, and not traders. They married Indian women, adopted aspects of Indian culture, and, unconcerned with the “evils of alcohol,” indulged in large quantities of food and drink as the opportunity presented itself. Alcohol was the most popular item offered in trade by the American companies since most Indians preferred to trade with the British for their finely tooled goods. Alcohol became an American trade staple and so critical to the American fur trade that the proceeds generated by its sales to Indians, and to trappers at the annual rendezvous, represented most if not all trade company profits. Trapping became more difficult as settlement moved further westward and fur-bearing animal populations diminished; at the same time, it became less important to traders. Frontier trading posts began to resemble the general stores of the East, with homesteaders and farmers, many of them women, numbering among the traders. Al-


though the fur trade continued in parts of the West into the 1870s, by the 1840s most frontier trading posts had been replaced by traditional mercantile establishments and thus rendered obsolete. BIBLIOGRAPHY

Chittenden, Hiram Martin. The American Fur Trade of the Far West: A History of the Pioneer Trading Posts and Early Fur Companies of the Missouri Valley and the Rocky Mountains and of the Overland Commerce with Santa Fe. 3 vols. 1902. Reprint, 2 vols. Stanford, Calif: Academic Reprints, 1954. McNitt, Frank. The Indian Traders. Norman: University of Oklahoma Press, 1962. Morrison, Dorothy Nafus. Outpost: John McLoughlin and the Far Northwest. Portland: Oregon Historical Society Press, 1999. Northwest Forts and Trading Posts. Tacoma: Washington State Historical Society, 1968. Roberts, Robert B. Encyclopedia of Historic Forts: The Military, Pioneer, and Trading Posts of the United States. New York: Macmillan, 1987. Rorabaugh, W. J. The Alcoholic Republic: An American Tradition. New York: Oxford University Press, 1979. Trennert, Robert A., Jr. Indian Traders on the Middle Border: The House of Ewing, 1827–54. Lincoln: University of Nebraska Press, 1981. White, Richard. The Middle Ground: Indians, Empires, and Republics in the Great Lakes Region, 1650–1815. Cambridge, U.K.: Cambridge University Press, 1991.

Brenda Jackson See also Fur Companies; Fur Trade and Trapping; Indian Trade and Traders; Missouri River Fur Trade; Pacific Fur Company.

TRADING STAMPS. See Thrift Stamps.

TRAIL DRIVERS, cowboys who moved cattle, typically in herds of about 2,500, from a home range to a distant market or another range. The typical outfit consisted of a boss, who might or might not be the owner of the herd; ten to fifteen hands, each of whom had a string of from five to ten horses; a horse wrangler (or remudero), who drove and herded the cow ponies; and a cook. The men drove and grazed the cattle most of the day, herding them by relays at night. Most considered ten or twelve miles a good day’s drive, as the cattle had to thrive along the route. Wages for a trail driver were about $40 a month. The trail drivers’ code presupposed that no matter what the hazards, hardships, or physical torture, a man would stay with his herd as loyally as a captain stays with his ship at sea. BIBLIOGRAPHY

Hunter, J. Marvin, compiler and ed. The Trail Drivers of Texas: Interesting Sketches of Early Cowboys. Austin: University of Texas Press, 1985. The original edition was published in San Antonio, Texas: Jackson Printing, 1920–1923.

J. Frank Dobie / c. w.

“TRAIL OF BROKEN TREATIES.” A central protest event of the Red Power activist period of the 1970s, the “Trail of Broken Treaties” was organized by members of the American Indian Movement (AIM) to bring national attention to Native grievances. The “trail” began on the West Coast in the late summer of 1972 as an automobile caravan composed of Indians from across the country who intended to demonstrate their concerns in Washington, D.C. As it proceeded east, the caravan stopped by reservations and urban Indian communities to drum up support, recruit participants, conduct workshops, and draft an agenda for Indian policy reform. The caravan arrived in Washington, D.C., in the early days of November, just before the 1972 presidential election, a time considered ideal for anyone seeking media coverage. As it traveled across the country, the caravan grew, numbering several hundred when it arrived in the capital. Initially the group was orderly, but when housing for the protesters disintegrated, the original goals of the organizers shifted from meetings and demonstrations to a weeklong occupation of the Bureau of Indian Affairs building. The occupation was reported on the front pages of the New York Times and many other newspapers. The publicity drew attention to Indian rights and provided a platform for the protesters to present their “20-Point Program” to increase the role of tribes in the formation of Indian programs. The “self-determination” federal legislation of the mid-1970s that shifted more local control to recognized tribes should be understood against the backdrop of the Red Power protest era, especially the Trail of Broken Treaties and the protests it inspired. Another important outcome of the Trail of Broken Treaties and the other protests of the era was a surge of Native pride and consciousness. For example, the Lakota author Mary Crow Dog describes the response to militant Indians such as those in the American Indian Movement: The American Indian Movement hit our reservation like a tornado, like a new wind blowing out of nowhere, a drumbeat from far off getting louder and louder. . . . I could feel this new thing, almost hear it, smell it, touch it. Meeting up with AIM for the first time loosened a sort of earthquake inside me. (pp. 74–75) BIBLIOGRAPHY

Crow Dog, Mary, and Richard Erdoes. Lakota Woman. New York: Grove Weidenfeld, 1990. Josephy, Alvin M., Jr., Joane Nagel, and Troy Johnson, eds. Red Power. 2d ed. Lincoln: University of Nebraska Press, 1999.

Joane Nagel

See also Wounded Knee (1973).

TRAIL OF TEARS, most closely associated with the Cherokees, is perhaps the most well known injustice done to Native Americans during the removal period of the 1830s. Historically, the Cherokees occupied lands in several southeastern states including North Carolina and Georgia. Acting under the Removal Act of 1830, federal authorities sought to win the tribe’s agreement to exchange tribal lands for a reservation in the West. In 1835, approximately 500 Cherokees, none of them elected officials of the Cherokee nation, gathered in New Echota, Georgia, and signed a treaty ceding all Cherokee territory east of the Mississippi to the United States in exchange for $5 million and new homelands in Indian Territory (modern Oklahoma). Though a majority of the tribe protested this illegal treaty, it was ratified—by a single vote— by the U.S. Senate on 23 May 1836. In May 1838, federal troops and state militia units supervised by General Winfield Scott rounded up the Cherokees who refused to accept the New Echota agreement and held them in concentration camps until they were sent west in groups of approximately 1,000 each. Three groups left that summer, traveling 800 miles from Chattanooga by rail, boat, and wagon, primarily on the water route. In November, with river levels too low for navigation and with inadequate clothing and supplies, twelve more groups traveled overland, under close military supervision and primarily on foot, in spite of roads rendered impassable by autumn rains and the subsequent onset of winter. By March 1839, all survivors had arrived in their new home. Of the 15,000 Cherokees who began the journey, about 4,000—a fifth of the total Cherokee population—perished along the route. Though local and state governments along with private organizations and individuals made some efforts to recognize this tragic event in American history, it was not until 1987 that Congress designated the Trail of Tears as a National Historic Trail under the supervision of the National Park Service.

BIBLIOGRAPHY

Anderson, William L., ed. Cherokee Removal: Before and After. Athens: University of Georgia Press, 1991. Hoig, Stan. Night of the Cruel Moon: Cherokee Removal and the Trail of Tears. New York: Facts on File, 1996. National Park Service. Certification Guide: Trail of Tears National Historic Trail. Santa Fe, N. Mex.: National Park Service, 1994. Perdue, Theda, and Michael D. Green, eds. The Cherokee Removal: A Brief History with Documents. Boston: Bedford Books, 1995.


Michael Sherfy


See also Cherokee; Cherokee Nation Cases; Indian Land Cessions; Indian Removal; Indian Territory; Removal Act of 1830.




Trailer Park Community. A mobile home park in Gillette, Wyo. © Corbis

TRAILER PARKS began to appear in the 1920s as roads improved and Americans enjoyed a fascination with motoring and highway travel as leisure pursuits. Trailers were originally designed for recreational uses such as family camping or adventure. Industry pioneers began designing vehicles for their own families, and soon found themselves manufacturing and selling house trailers. Informal sites where motorists towing house trailers could park and live in a community of other travelers were formed independently, and as the number of trailer campers increased, the need for specially designated campgrounds arose. These were originally established as free municipal facilities but they soon became fee facilities in order to discourage the poor and limit users to a tourist population.


Tourists were not the only people using house trailers, however, and by 1936 an estimated one million people were living in them for part or all of the year. Eventually the industry split; house trailers produced for travel became recreational vehicles (RVs) while mobile homes were produced specifically as dwellings. During World War II a boom occurred in trailer living among military and construction workers who followed jobs and assignments. The postwar housing crisis perpetuated the popularity of trailers as dwellings. In the 1950s, trailer parks evolved into communities intended for permanent dwelling rather than as tourist parks, while RV campgrounds replaced the original trailer parks. Trailer parks are frequently associated with tornadoes. That is because mobile homes are destroyed more easily and, therefore, in greater numbers than more structurally substantial houses. BIBLIOGRAPHY

Santiago, Chiori. "House Trailers Have Come a Long Way, Baby." Smithsonian 29, no. 3 (June 1998): 76–85. Thornburg, David A. Galloping Bungalows: The Rise and Demise of the American House Trailer. Hamden, Conn.: Archon Books, 1991. Wallis, Allan D. Wheel Estate: The Rise and Decline of Mobile Homes. New York: Oxford University Press, 1991.

Deirdre Sheets See also Housing; Transportation and Travel.


Jesse James. One of Quantrill’s Raiders, a Confederate guerrilla band in the Civil War (this photograph is from 1864, when he was seventeen), he was subsequently the legendary leader of a gang of robbers (first of banks, then of trains) in the Midwest, until “that dirty little coward” Robert Ford—as a sympathetic “ballad” of the day put it—killed him for a bounty in 1882.

TRAIN ROBBERIES were more frequent in the United States than anywhere else in the world in the latter half of the nineteenth century. Vast stretches of sparsely inhabited country permitted robbers to escape undetected; carelessness and lack of adequate security on trains also made robberies easier. The robbery of $700,000 from an Adams Express car on the New York, New Haven, and Hartford Railroad, the first train robbery on record, occurred in 1866. That same year, the four Reno brothers stole $13,000 in their first train holdup. They went on to stage a number of bold bank and train robberies in southern Indiana and Illinois before the Pinkerton Detective Agency, just coming into prominence, tracked them down

in 1868. Vigilantes executed three of the four brothers before their cases came to trial. The Farringtons operated in 1870 in Kentucky and Tennessee. Jack Davis of Nevada, after an apprenticeship robbing stagecoaches in California, started operations at Truckee, California, by robbing an express car of $41,000. Train robberies peaked in 1870. The colorful and daring Jesse James gang began to operate in 1873 near Council Bluffs, Iowa. No other robbers are so well known; legends and songs were written about their deeds. For nine years they terrorized the Midwest, and trainmen did not breathe freely until an accomplice shot Jesse, after which his brother Frank retired to run a Wild West show. Sam Bass in Texas, the Dalton boys in Oklahoma, and Sontag and Evans in California were other robbers with well-known records. After 1900 the number of holdups declined conspicuously. BIBLIOGRAPHY

DeNevi, Don. Western Train Robberies. Millbrae, Calif.: Celestial Arts, 1976. Pinkerton, William Allan. Train Robberies, Train Robbers, and the “Holdup” Men. New York: Arno Press, 1974. The original edition was published in 1907.

Carl L. Cannon / c. w.

TRANS-APPALACHIAN WEST. The Trans-Appalachian West is the region west of the Appalachian Mountains and east of the Mississippi River. It stretches from the U.S. border with Canada down to the Gulf of Mexico. Originally blanketed with coniferous and deciduous forests, it was home to numerous Native American groups. The United States gained control of the region after the Treaty of Paris (1783), which ended the American Revolution. Treaties with the local Indian populations resulted in a flood of settlement over the next seventy years. The region's economy has been based on both agriculture and manufacturing. Nine states were formed out of the region and it is home to over 65 million people. Polly Fry See also Paris, Treaty of (1783).

TRANSCENDENTALISM was a movement for religious renewal, literary innovation, and social transformation. Its ideas were grounded in the claim that divine truth could be known intuitively. Based in New England and existing in various forms from the 1830s to the 1880s, transcendentalism is usually considered the principal expression of romanticism in America. Many prominent ministers, reformers, and writers of the era were associated with it, including Ralph Waldo Emerson (1803– 1882), Henry David Thoreau (1817–1862), Margaret Fuller (1810–1850), Theodore Parker (1810–1860), Bron-

son Alcott (1799–1888), and Orestes Brownson (1803– 1876). Various organizations and periodicals gave the movement shape. The earliest was the so-called “Transcendental Club” (1836–1840), an informal group that met to discuss intellectual and religious topics; also important was the “Saturday Club,” organized much later (1854). Many transcendentalists participated in the utopian communities of Brook Farm (1841–1848; located in West Roxbury, Massachusetts), founded by George Ripley (1802–1880) and his wife, Sophia Dana Ripley (1803–1861), and the short-lived Fruitlands (1843–1844; located in Harvard, Massachusetts), founded by Alcott. A number of transcendentalist ministers established experimental churches to give their religious ideas institutional form. The most important of these churches were three in Boston: Orestes Brownson’s Society for Christian Union and Progress (1836–1841); the Church of the Disciples (founded 1841), pastored by James Freeman Clarke (1810–1888); and Theodore Parker’s Twenty-Eighth Congregational Society (founded 1845–1846). The most famous transcendentalist magazine was the Dial (1840–1844), edited by Fuller and then by Emerson; other major periodicals associated with the movement included the Boston Quarterly Review (1838–1842), edited by Brownson, and the Massachusetts Quarterly Review (1847–1850), edited by Parker. Transcendentalism emerged from Unitarianism, or “liberal Christianity”—an anti-Calvinist, anti-Trinitarian, anticreedal offshoot of Puritanism that had taken hold among the middle and upper classes of eastern Massachusetts. The founders of transcendentalism were Unitarian intellectuals who came of age, or became Unitarians, in the 1820s and 1830s. From Unitarianism the transcendentalists took a concern for self-culture, a sense of moral seriousness, a neo-Platonic concept of piety, a tendency toward individualism, a belief in the importance of literature, and an interest in moral reform. They looked to certain Unitarians as mentors, especially the great Boston preacher William Ellery Channing. Yet transcendentalists came to reject key aspects of the Unitarian worldview, starting with their rational, historical Christian apologetic. The Unitarian apologetic took as its starting point the thesis of the British philosopher John Locke that all knowledge, including religious knowledge, was based on sense data. The Unitarians were not strict Lockeans; under the influence of the Scottish “Common Sense” philosophers, notably Thomas Reid and Dugald Stewart, they held that some fundamental knowledge could be known intuitively—for example, that certain things were morally right and wrong, and that the world that human senses perceive in fact exists. Nonetheless, Unitarians held that only “objective” evidence could prove Jesus had delivered an authoritative revelation from God. They believed they had found such evidence in the testimony, provided in the Gospels, of Jesus’ miracles. The Unitarians valued the historical study of Gospel accounts, in order to prove them “genuine” and therefore credible.


Transcendentalists rejected as “sensual” and “materialistic” Unitarianism’s Lockean assumptions about the mind, and were inspired instead by German philosophical idealism. Its seminal figure, Immanuel Kant, argued that sense data were structured by the mind according to certain “transcendental” categories (such as space, time, and cause and effect), which did not inhere in the data, but in the mind itself. The transcendentalists liked the Kantian approach, which gave the mind, not matter, ultimate control over the shape of human experience. The name of their movement was derived from Kant’s philosophical term. Yet the transcendentalists, unlike Kant but like other Romantics (and, to an extent, the Common Sense philosophers), held that religious knowledge itself could be intuitively known. According to this view, people could tell “subjectively” that Jesus had given a revelation from God, because his doctrine was self-evidently true and his life self-evidently good. The transcendentalist apologetic turned out to have radical implications. Because transcendentalists believed religious truth could be known naturally, like any other truth, they tended to reject the idea of miraculous inspiration as unnecessary and to dismiss as false the claim made for the Bible that it had unique miraculous authority. Transcendentalists still respected Jesus, but the more radical of them, like Emerson in his Divinity School Address (1838), and Parker in Discourse on the Transient and Permanent in Christianity (1841), attacked the miracle stories in the Gospels as pious myths. Such attacks were highly controversial; theologically conservative Unitarians accused the transcendentalists of being infidels and atheists. Meanwhile, the transcendentalists began to see religious value in sacred writings beyond the Bible, including those of Buddhists, Hindus, and Muslims. The transcendentalists became pioneers in the American study of comparative religion. Another implication of intuitionism had to do with the role of the artist. The transcendentalists believed all human inspiration, whether biblical or not, drew from the same divine source. They did not hold religious inspiration to be mundane, like artistic and intellectual inspiration; rather, they held that artistic and intellectual inspiration, like religious inspiration, were divine. The artist, in particular the poet, gained new importance to the transcendentalists as a potential prophet figure, and poetry as a potential source of divine revelation. Emerson was being characteristically transcendentalist when in his first book, Nature (1836), he sought to achieve wholly honest, beautiful, and original forms of expression. In his address “American Scholar” (1837), meanwhile, he called on American writers to stop imitating foreign models; actually, the transcendentalists promoted American interest in foreign Romantic writers, especially Samuel Taylor Coleridge (1772–1834), Thomas Carlyle (1795–1881), and Johann Wolfgang von Goethe (1749–1832). Intuitionism also affected the transcendentalist approach to social and political problems. Transcendental-


Ralph Waldo Emerson. The profoundly influential nineteenth-century essayist, poet, lecturer, abolitionist, and leading light of transcendentalism. © Bettmann/Corbis

ists believed laws should be disobeyed if moral intuition held them to be unjust. Thoreau famously argued this point in his essay “Civil Disobedience” (1848; also called “Resistance to Civil Government”). He here advised individuals to disobey unjust laws so as to prevent their personal involvement in evil. More broadly, the transcendentalists held that inspiration was blunted by social conformity, which therefore must be resisted. This is a theme of Emerson’s essay “SelfReliance” (1841) and Thoreau’s book Walden (1854). When approaching the education of children, the transcendentalists advocated innovative methods that supposedly developed a child’s innate knowledge; Alcott tried out transcendentalist methods at his famous experimental Boston school in the mid-1830s. Elizabeth Palmer Peabody (1804–1894), who later played a major role in bringing the European kindergarten to America, described Alcott’s approach in her Record of a School (1835), as did Alcott himself in his Conversations with Children on the Gospels (1836). Transcendentalists also came to criticize existing social arrangements, which they thought prevented individual spiritual development. There were calls and attempts to change what were seen as oppressive economic structures. Orestes Brownson, in his Boston Quarterly Review


articles on the “Laboring Classes” (1840), advocated abolition of inherited private property. George and Sophia Ripley, with others, tried to make Brook Farm a place with no gap between thinkers and workers. Eventually, the Farmers adopted a system inspired by the French socialist Charles Fourier, who believed that in a properly organized society (one he planned in minute detail), people could accomplish all necessary social work by doing only what they were naturally inclined to do. Margaret Fuller, meanwhile, criticized the lack of educational, political, and economic opportunities for women of the era. In the famous series of “conversations” she led for women (1839–1844), Fuller set out to encourage their intellectual development, and in her Woman in the Nineteenth Century (1846), issued a famous manifesto in favor of women’s rights. She came to embody many of the principles she advocated, and became a significant literary critic and journalist, as well as a participant in the Roman Revolution of 1848. The transcendentalists saw slavery as inherently wrong because it crushed the spiritual development of slaves. They protested against slavery in various ways and a few of them, most notably Parker, became leaders of the abolitionist movement. Finally, the transcendentalists laid great value on the spiritual value of nature; Thoreau, particularly, is regarded as a principal forerunner of the modern environmental movement. Transcendentalism has always had its critics. It has been accused of subverting Christianity; of assessing human nature too optimistically and underestimating human weakness and potential for evil; of placing too much emphasis on the self-reliant individual at the expense of society and social reform. Yet even those hostile to transcendentalism must concede that American literature, religion, philosophy, and politics have been shaped by the movement in profound ways. BIBLIOGRAPHY

Capper, Charles, and Conrad E. Wright, eds. Transient and Permanent: The Transcendentalist Movement and Its Contexts. Boston: Massachusetts Historical Society, 1999. Miller, Perry, ed. The Transcendentalists: An Anthology. Cambridge, Mass.: Harvard University Press, 1950. Packer, Barbara, “The Transcendentalists.” In The Cambridge History of American Literature. Edited by Sacvan Bercovitch. Vol. 2: Prose Writing 1820–1865. New York: Cambridge University Press, 1995.

Dean Grodzins See also Individualism; Philosophy; Romanticism; Utopian Communities; Walden.

TRANSCONTINENTAL RAILROAD, BUILDING OF. The Transcontinental Railroad was the result of the U.S. commitment to Manifest Destiny and its burgeoning industrial might. Long distances and slow transportation hampered contact between eastern and western commercial centers. Both the United States government and entrepreneurs sought faster transportation to link the two sections. For a decade after 1850, Congress studied possible transcontinental routes, but arguments over sectionalism and slavery blocked all plans. Not until after the South seceded and the Civil War had begun could Congress pass an effective transcontinental plan, the Pacific Railroad Act of 1862. It called for two railroad companies to complete the transcontinental line. The railroad would be a "land-grant railroad," meaning that the government would give each company 6,400 acres of land and up to $48,000 for every mile of track it built. The money capitalized the project, and the railroads could use the land to entice settlers to the West, who in turn would need the railroads to haul freight. But Congress, afraid to fund a project that would never be completed, wrote a caveat into the act: the railroads had to complete the project by July 1, 1876, or they would forfeit the land, money, and all of the constructed track.

The Union Pacific Railroad, a corporation formed for the venture, would build the eastern half of the line starting in Nebraska. The Central Pacific Railroad, owned by a group of California entrepreneurs including Collis Huntington and Leland Stanford, would build the western half. Preliminary work began, even as the nation still fought the Civil War. Surveyors and engineers had to scout and map workable routes. After the war, several army generals served as engineers on the project. They included Grenville Dodge, a favorite general of Ulysses S. Grant and William T. Sherman, who became the Union Pacific's chief engineer.

Work progressed rapidly after the Civil War. The project attracted many former soldiers, both Union and Confederate, as well as Irish and Chinese immigrants. The Central Pacific quickly had to tackle the rugged Sierras in California. Rather than go over or around them, engineers chose to go through them. But such a plan required tons of dynamite and someone to set the charges. The Chinese were often willing to do the hazardous work for less pay than other Americans, and they became a backbone of the Central Pacific work crew. Men working on both lines braved the extremes of heat and cold, hostile Native Americans, and disease as they advanced. The two railroads reached northern Utah at about the same time, and the work crews passed by each other, for no one had decided where the rails were to join. Government engineers stepped in and selected Promontory Point, Utah, for the connection. In a ceremony that included the driving of a symbolic golden railroad spike, the two lines linked on May 10, 1869, seven years ahead of schedule.

BIBLIOGRAPHY

Billington, Ray Allen, and Martin Ridge. Westward Expansion: A History of the American Frontier. Albuquerque: University of New Mexico Press, 2001.

R. Steven Jones


See also Central Pacific–Union Pacific Race; Land Grants for Railways.

TRANSPLANTS AND ORGAN DONATION. Transplantation (grafting) is the replacement of a failing organ or tissue by a functioning one. Transplantation was a dream in antiquity. The Hindu deity Ganesha had his head replaced by an elephant’s head soon after birth (RigVeda, 1500 b.c.). In the Christian tradition Saints Cosmas and Damian (fl. 3rd century a.d.) are famous for replacing the diseased leg of a true believer with the leg of a darkskinned Moor, thereby becoming the patron saints of physicians and surgeons. Transplantation may be from the same person (autologous), from the same species (homologous—the allograft can come from a genetically identical twin, genetically close parent or sibling, living unrelated person, or cadaver) or from a different species (xenotransplant). Human tissues carry highly specific antigens, which cause the immune system to react to “foreign” materials. An antigen is a substance that when introduced into an organism evokes the production of substances—antibod-

ies—that destroy or neutralize the antigen. Grafts of a person’s own tissue (such as skin grafts) are therefore well tolerated. Homologous grafts are plagued by attempted rejection by the recipient human. The biological acceptability of the graft is measured by tissue typing of the donor and recipient using the human leucocyte antigen, or HLA, panels. The closer the match between the donor and the recipient, the greater the chance of graft acceptance and function. Xenotransplantation is as yet entirely experimental because of tissue rejection and the possibility of transmitting animal diseases to the human recipient. Organ transplantation has two sets of problems. The first relate to the recipient: the magnitude of the procedure and the intricacies of the surgical technique, the avoidance of rejection (acute or chronic) of the grafted tissue because of antigens in the tissue, and temporary and long-term suppression of the recipient’s immune processes, with resulting infections and cancers. The second set of problems relates to the graft itself: the source of the graft and its collection, preservation, and transport to the recipient. Associated problems are ethical and economic, including the expense of the procedure and the cost of long-term monitoring and support of the patient.

Transcontinental Railroad. The Union Pacific and the Central Pacific ceremonially link up at Promontory Summit, Utah, just north of the Great Salt Lake, on 10 May 1869; the spot is now the Golden Spike National Historic Site. National Archives and Records Administration


Many of the technical problems associated with transplantation are gradually being overcome, and solutions are being constantly improved. Obtaining donor organs and distributing them equitably remain critical problems. Transplantation is well established for skin, teeth, bone, blood, bone marrow, cornea, heart, kidney, liver, and to a lesser extent for the lung, pancreas, and intestines. On occasion two transplants are combined, such as heart and lung or pancreas and kidney. Grafting an individual’s own skin was well known to the ancient Hindus and has been widely used in the Western world since the middle of the nineteenth century. Skin grafting is a major resource in treating large wounds and burns. Artificially grown skin analogues and frozen pigskin can temporarily meet massive immediate needs. Blood transfusion was attempted in the seventeenth century in France and England but was abandoned because of adverse reactions, including death. The identification of blood types in the early twentieth century and the discovery of methods of separating and preserving blood and its components have made transfusion a common and effective therapy. An important side effect of World War II and later conflicts has been improvement in all aspects of blood transfusion—collection, preservation, and delivery. The recognition of HLA types was based largely on the practices of blood transfusion and skin grafting. Transplantation of bone marrow and stem cells (precursors from which blood cells develop) is used to treat patients with malignancies of the blood and lymphatic system, such as the leukemia and lymphoma. Donor cells may be from the patient or from antigenmatched donor(s). Usually the patient’s bone marrow (with the stem cells) is totally destroyed by chemotherapy, sometimes with whole body irradiation afterward. Donor cells are then introduced into the body, with the expectation that they will take over the production of new blood cells. The commonest organ transplanted is the kidney. The first successful kidney transplant was done in 1954 in the United States between identical twins; before immunosuppressive procedures were developed, twins were the most successful donors. Transplantation between twins evokes the least immune reactions as the HLA types of twins are identical or nearly so. In 2001, about 14,000 kidney transplants were performed in the United States, 63 percent using kidneys obtained from cadavers. Patient survival using cadaveric donor kidneys is more than 90 percent at 1 year after surgery, and 60 to 90 percent at 5 years. For living donor kidneys, survival is above 98 percent at 1 year and 71 to 98 percent at 5 years. Corneal transplants have a high rate of success because the cornea does not have blood vessels and hence is not highly antigenic. Cadaver corneas can be successfully preserved and stored in eye banks for delivery as needed. More than 30,000 corneas are grafted each year in the United States. More than 5,000 liver transplantations were done in the United States in 2001. Some of these transplants were

portions of livers from living donors. In living adult liver donors, significant surgical complications and even a few deaths have raised some questions about the procedure. Though this is a controversial procedure, the great demand for donor livers will certainly keep this practice going. The heart is the fourth most common organ replaced. The first heart transplantation was done in South Africa in 1967; the high risk made it very controversial at the time. In 2000, almost 2,200 heart transplants were performed in the United States. Graft rejection remains a problem, and immunosuppresion (with its attendant dangers) has to be continued lifelong. If patients do not have other significant diseases, they return to near-normal functioning. More than 80 percent of patients function satisfactorily 1 year after surgery, and 60 to 70 percent at 5 years. At any given time, thousands of patients are waiting for donated organs. With progressive technical improvement in keeping seriously ill patients alive and making transplantation less risky, the need for organs continues to rise. Bioengineering is the application of engineering principles to biology—this includes the artificial production of cells and organs, or that of equipment that can perform functions of organs such as the kidneys or the heart. Bioengineered cells and tissues are a promising field in transplantation. Bioengineered skin is widely used for short-term coverage. Bioengineered corneas appear to be promising. Primitive heart-muscle cells (myoblasts) are being transplanted into diseased hearts, chondrocytes or cartilage cells are being cultured for use in degenerated joints, and there is considerable interest in xenografts. Since 1968 a Uniform Anatomical Gift Act allows adults to donate their organs for transplantation after death. In every state, some form of donor card is associated with driver’s licenses, and health care providers in most states are required to ask permission for postmortem organ procurement. (In some European countries consent for organ donation is presumed.) The United Network for Organ Sharing (UNOS) was established in 1977 to coordinate the distribution of kidneys and later other organs nationally and to maintain a registry of persons awaiting transplant. The UNOS generally prefers that donated organ(s) be used in the local community. All transplant centers are required to join the network and abide by its rules. By May 2002, UNOS membership included 255 Transplant Centers, 156 Histocompatibility Laboratories, and 59 Operating Organ Procurement Organizations. With all these efforts, the shortage of organs persists.

BIBLIOGRAPHY

Cooper, David K. C., and Robert P. Lanza. Xeno: The Promise of Transplanting Animal Organs into Humans. Oxford and New York: Oxford University Press, 2000. Fox, Renée C., and Judith P. Swazey, with the assistance of Judith C. Watkins. Spare Parts: Organ Replacement in American Society. New York: Oxford University Press, 1992.


Lock, Margaret M. Twice Dead: Organ Transplants and the Reinvention of Death. Berkeley: University of California Press, 2002. Munson, Ronald. Raising the Dead: Organ Transplants, Ethics, and Society. Oxford and New York: Oxford University Press, 2002. Murray, Joseph E. Surgery of the Soul: Reflections on a Curious Career. Canton, Mass.: Science History Publications, 2001. Parr, Elizabeth, and Janet Mize. Coping With an Organ Transplant: A Practical Guide to Understanding, Preparing For, and Living With an Organ Transplant. New York: Avery, 2001. United States Congress, House Committee on Commerce. Organ and Bone Marrow Transplant Program Reauthorization Act of 1995: Report (to Accompany S. 1324). Washington, D.C.: U.S. Government Printing Office, 1996. United States Congress, House Committee on Commerce, Subcommittee on Health and the Environment. Organ Procurement and Transplantation Network Amendments of 1999: Report Together with Dissenting Views (to Accompany H.R. 2418). Washington, D.C.: U.S. Government Printing Office, 1999. United States Congress, House Committee on Government Reform and Oversight, Subcommittee on Human Resources. Oversight of the National Organ Procurement and Transplantation Network: Hearing Before the Subcommittee on Human Resources of the Committee on Government Reform and Oversight. 105th Cong., 2nd sess., 8 April 1998. Washington, D.C.: Government Printing Office, 1998. Youngner, Stuart J., Renée C. Fox, and Laurence J. O'Connell, eds. Organ Transplantation: Meanings and Realities. Madison: University of Wisconsin Press, 1996.

Internet Sources
For current national statistical data, see the Web sites of the Scientific Registry of Transplant Recipients, http://ustransplant.org/annual.html, and the United Network for Organ Sharing, http://www.unos.org/frame_default.asp. For general information for patients, updated regularly, see http://www.nlm.nih.gov/medlineplus.

Ranes C. Chakravorty See also Medicine and Surgery.

TRANSPORTATION, DEPARTMENT OF (DOT) was established by an act of Congress (P.L. 89-670) on 15 October 1966, and formally opened for business on 1 April 1967. It consists of the Office of the Secretary and fourteen Operating Administrations, each of which has statutory responsibility for the implementation of a wide range of regulations, both at its headquarters in Washington, D.C., and at the appropriate regional offices.

Mission
The Department's mission is to develop and coordinate policies that provide an efficient and economical national transportation system, with due regard for its impact on safety, the environment, and national defense. For example, DOT regulates safety in the skies, on the seas, and


on the roads and rails. The department regulates consumer and economic issues regarding aviation and provides financial assistance for programs involving highways, airports, mass transit, the maritime industry, railroads, and motor vehicle safety. It writes regulations carrying out such disparate statutes as the Americans with Disabilities Act and the Uniform Time Act. It promotes intermodal transportation (utilizing different modes of transportation for one trip) and implements international trade and transportation agreements. The Structure The Office of the Secretary (OST) oversees the formulation of America’s national transportation policy, including the promotion of intermodalism and safety. The office includes the secretary, the deputy secretary, one under secretary, five assistant secretaries, and the office of the general counsel. Four of the assistant secretaries are appointed by the president and confirmed by the Senate. The fifth, the assistant secretary for administration, has a career civil service appointee at its helm. In addition to the general counsel, these four offices include Aviation and International Affairs, Budget and Financial Management, Governmental Affairs, and Transportation Policy. The Operating Administrations, which are responsible for implementing the department’s mission, include: (1) the United States Coast Guard (USCG); (2) the Federal Aviation Administration (FAA); (3) the Federal Highway Administration (FHWA); (4) the Federal Motor Carrier Safety Administration (FMCSA); (5) the Federal Railroad Administration (FRA); (6) the National Highway Traffic Safety Administration (NHTSA); (7) the Federal Transit Administration (FTA); (8) the Maritime Administration (MARAD); (9) the Saint Lawrence Seaway Development Corporation (SLSDC); (10) the Research and Special Programs Administration (RSPA); and (11) the Transportation Security Administration (TSA). Each is headed by a presidential appointee who is subject to Senate confirmation. The Bureau of Transportation Statistics (BTS), the Transportation Administrative Service Center (TASC), and the Surface Transportation Board (STB) provide specialized functions. DOT’s Immediate Pre-History From the outset, the drive behind the establishment of a Department of Transportation was to develop a viable national transportation policy. When DOT was formed, the federal government had no less than thirty-four agencies and functions to handle the nation’s transportation programs. The need to nationalize these programs under a single roof gained steady adherence in the years following the Civil War; and since that time, members of Congress attempted to pass legislation resolving this issue on ninetytwo occasions. The first Hoover Commission (1947– 1949), as part of its mandate to reorganize the executive branch, proposed to put all the transportation functions under the Department of Commerce, which President Harry S. Truman did—to almost no one’s satisfaction.


President Dwight Eisenhower’s Advisory Committee on Governmental Organization proposed a cabinet-level Department of Transportation and Communications; however, this proposal faced several political obstacles. Consequently, when the retiring administrator of the stillindependent Federal Aviation Agency proposed to President Lyndon Baines Johnson the establishment of a Department of Transportation, the staff of the Bureau of the Budget, who had been working on proposals to reorganize the executive branch, seized upon his proposal. Noting that “America today lacks a coordinated [intermodal] transportation system,” Johnson agreed, and within two years, in October 1966, the Department of Transportation became a reality. From Many to One As such, DOT proved valuable in the development of a national transportation policy, particularly during the administrations of Presidents Gerald Ford and George H. W. Bush. During President Ronald Reagan’s administration, not only had the maritime administration successfully been brought into the Department, but DOT had managed to withstand serious efforts to pry away the Coast Guard and the FAA as well. It even saw commercial space transportation and residual functions of the Civilian Aeronautics Board (CAB) and the Interstate Commerce Commission (ICC) become significant parts of the mix. Guided by President William Clinton’s National Performance Review (the Reinventing Government Initiative) and Congress’s passage of the Chief Financial Officers Act of 1990 and the Government and Performance Results Act of 1993, the Department pursued a “One DOT” management strategy, replete with customer service proposals, strategic planning, and performance appraisals. As an example, NHTSA adopted as its slogan, “People Saving People.” Following the 11 September 2001 attacks on the World Trade Center and the Pentagon, on 19 November 2001, Congress passed the Aviation and Transportation Security Act, which established the Transportation Security Administration (TSA), responsible for securing all modes of transportation in the United States. BIBLIOGRAPHY

Burby, John F. The Great American Motion Sickness; or, Why You Can’t Get From There to Here. Boston: Little, Brown, 1971. Davis, Grant Miller. The Department of Transportation. Lexington, Mass.: D.C. Heath Lexington, 1970. Hazard, John L. Managing National Transportation Policy. Westport, Conn.: Eno Foundation for Transportation Policy, 1988. Whitnah, Donald Robert. U.S. Department of Transportation: A Reference History. Westport, Conn.: Greenwood Press, 1998.

R. Dale Grinder See also Transportation and Travel.

TRANSPORTATION ACT OF 1920, also known as the Esch-Cummins Act. The U.S. government took over and ran the railroads from 26 December 1917 to 1 March 1920. During the period of government operation, the tracks were obliged to carry a heavy volume of traffic with little attention to replacements or ordinary maintenance. This was more a result of circumstances than the fault of the government; nevertheless, the railroads were in a deplorable condition when, after a little more than two years, they were returned to private operation. As a result, some remedial legislation was imperative. The Transportation Act of 28 February 1920 was the result. The Senate bill, introduced by Sen. Albert B. Cummins, and the House bill, proposed by Rep. John Jacob Esch, required a conference committee to produce a compromise measure, which became effective on 1 March, a little more than three months after President Woodrow Wilson returned the railroads to private operation. To help the railroads financially, the bill authorized consolidations, established a six-month guarantee period, and authorized extensive loans for a variety of purposes. Congress provided for arbitration without power of enforcement and established voluntary adjustment boards to settle labor disputes. These provisions were to be enforced by the Railroad Labor Board, consisting of nine members and having national jurisdiction. Hotly contested in Congress, the Transportation Act of 1920 engendered controversy for years thereafter. Advocates contended that favorable terms were necessary to avoid paralysis of the national transportation system; detractors claimed that railroads and financial interests had dictated terms to their own advantage. BIBLIOGRAPHY

Berk, Gerald. Alternative Tracks: The Constitution of American Industrial Order, 1865–1917. Baltimore: Johns Hopkins University Press, 1994. Himmelberg, Robert F., ed. Business-Government Cooperation, 1917–1932: The Rise of Corporatist Policies. Vol. 5 of Business and Government in America since 1870. New York: Garland, 1994.

W. Brooke Graves / c. w. See also Railroad Administration, U.S.; Railroads; Transportation and Travel.

TRANSPORTATION AND TRAVEL. Travel in the United States for most of its history was arduous. Nineteenth-century transportation systems, notably the railroad, improved travel between and within cities, but most Americans could go only as far as their horses could carry them. The vast country remained largely inaccessible to all but the most intrepid pioneer or explorer. The United States was transformed in the twentieth century into the most mobile society in human history. Americans traveled far more often and covered many


more miles during the twentieth century compared not only to their ancestors but also to their contemporaries in other countries. The principal agent of the unprecedented ability to travel was a transportation system based on near universal ownership of private motor vehicles.

Pullman Car. The interior of a sleeping car on the Chicago, Milwaukee, St. Paul and Pacific Railroad. Getty Images

Colonial Era
Most settlers lived near a body of water, so water transportation was the usual means of travel. A short journey down a river could be undertaken by canoe, while a longer-distance trip across a protected bay or sound could be made by shallop, sloop, schooner, or other small sailboat. For those who could afford a horse, land-based travel could be accomplished by riding on a trail traced initially by deer, buffalo, and other animals. Otherwise, a traveler had to set off on foot. Colonists moved goods between east coast cities by mule and packhorse. In the west, fur traders, farmers, and settlers widened footpaths by riding horses or attaching horses to wagons.

Colonial road building officially started in 1639, when the Massachusetts General Court directed that each town lay out roads connecting it with adjacent villages. Roads were built by other colonial governments, but the condition of these dirt roads was generally poor and money was not available to maintain and improve them. Operating horse-drawn passenger vehicles was difficult during much of the colonial era because of the poor roads. The first regular stagecoach route was inaugurated on 8 March 1759, between New York City and Philadelphia, and by the end of the colonial period a network of services connected the larger towns. A covered wagon service known as the "flying machine," operated by John Mercereau during the 1770s, was advertised as a miracle of speed because it covered the 100-mile distance between New York City and Philadelphia in only a day and a half, and it had a reputation for sticking precisely to a published timetable.

Nineteenth-Century Transportation
Through the nineteenth century, the top transportation objective in the United States was to open routes between eastern population centers and sparsely inhabited territories to the west. In the first quarter century after independence, construction of roads across the Appalachian Mountains received priority. As American settlement pushed further westward during the nineteenth century, first water and then rail transport emerged as leading forms of transport.


Turnpikes. To stimulate road construction during the last decade of the eighteenth century and the first decade of the nineteenth, states chartered private companies to build, operate, and maintain turnpikes, so named because poles armed with pikes were turned to allow travelers to pass through after paying. The first turnpike, between Philadelphia and Lancaster, Pennsylvania, was chartered in 1790, begun in 1792, and completed in 1794. The sixty-two-mile road was thirty-seven feet wide, paved with stone, and covered with gravel. Its high quality and financial success generated interest from hundreds of companies in turnpikes. By 1811 New York had chartered 137 companies, which constructed 1,400 miles of roads, and Pennsylvania had 2,380 miles of road built by 102 companies. High tolls discouraged using the turnpikes to transport bulky products at a profit. Some turnpikes were built with state and federal government aid. Most prominent was the Cumberland Road or National Pike, authorized by Congress in 1806. Financing was arranged through an agreement in which states exempted from taxation for five years federal land sold to settlers in return for the federal government agreeing to appropriate 5 percent of the proceeds from the land sales for building the road. The first 130-mile stretch of the National Pike from Cumberland, Maryland, west to Wheeling, West Virginia, was completed in 1818. The National Pike was an engineering marvel, eighty feet wide, with bridges across streams. Its most distinctive feature was a thirty- to forty-foot-wide center track made not of dirt but of the new macadam technology, a teninch layer of compacted small stones. The route reached what proved to be its westward terminus at Vandalia, Il-


linois, in 1852 and was not extended further west to Jefferson City, Missouri, as planned, because water and rail had by then emerged as better choices for long-distance travel. Canals. Movement of people and especially goods by barge was much cheaper than by road because a horse could drag a load that was fifty times heavier on water than across land. But water travel to the west from east coast population centers was impractical because navigable rivers, especially in the Northeast, such as the Delaware and Hudson, flowed generally north-south. Water routes to the west were opened through canals, a technique already widely used in Great Britain. New York State under the leadership of Governor DeWitt Clinton authorized construction of the Erie Canal in 1817 to connect the Hudson River with Lake Erie. Forty feet wide and four feet deep, the Erie Canal rose over 500 feet through 83 locks. The first 15 miles between Utica and Rome were opened in 1819, the entire 363-mile canal between Troy (Albany) and Buffalo on 26 October 1825. With the opening of the Erie Canal, transporting a ton of freight between New York City and Buffalo took eight days instead of twenty and cost about $10 instead of $100. Cities in Upstate New York along the route, such as Syracuse, Rochester, and Buffalo, thrived, and New York City surpassed Philadelphia as the country’s most populous city and most important seaport. In the Midwest, the state of Ohio in 1825 authorized two canals to connect Lake Erie with the Ohio River, including the Ohio and Erie, completed in 1832 between Portsmouth and Cleveland, and the Miami and Erie between Cincinnati and Toledo, substantially finished in 1835, though not completely until 1845. In Indiana, the Wabash and Erie Canal, begun in 1832 and completed in 1843, connected Evansville on the Ohio River with Toledo and the Miami and Erie Canal near the Ohio-Indiana state line. The United States had 3,326 miles of canals in 1840, and 3,698 in 1850. Canals were built and financed mostly by states because private individuals lacked sufficient capital. But states overreached, constructing canals that could never generate enough revenue to pay off the loans. Inability to repay canal construction loans was a major contributor to the panic of 1837, the worst economic depression of the nineteenth century. Subsequent nineteenth-century transportation improvements would be financed by private speculators. Robert Fulton first demonstrated the practicability of steam power in 1807 when he sailed his boat the Clermont 150 miles up the Hudson River from New York City to Albany in thirty-two hours. On the western rivers such as the Ohio and Mississippi, flat-bottomed two-deck steamboats quickly became the cheapest means for longdistance hauling of large quantities of goods. The 1,200mile journey up the Mississippi from New Orleans to St. Louis could be completed in four days. More than 1,000

steamboats plied the Mississippi and its tributaries during the 1850s. Railroads. It was the railroad that first succeeded in knitting together a unified coast-to-coast transportation network for the United States. The first railroad in the United States was the Baltimore and Ohio. Given the honor of placing the first rail, on 4 July 1828, was the Maryland native Charles Carroll, who as the country’s only surviving signer of the Declaration of Independence symbolically linked the political revolution of the eighteenth century with the industrial revolution of the nineteenth. The first 13 miles, between Baltimore and Ellicott City, Maryland, opened in 1830, and by 1835 the B&O had 135 miles of track. Other early-1830s U.S. rail lines included New York’s Mohawk and Hudson and South Carolina’s Charleston and Hamburg, a 136-mile route, then the world’s longest. U.S. railroad mileage grew rapidly through the nineteenth century: 23 miles of track in 1830, 2,818 miles in 1840, 9,021 miles in 1850, 30,626 miles in 1860, 52,914 miles in 1870, 93,296 miles in 1880, 163,597 miles in 1890, and 193,346 miles in 1900. Rail companies succeeded in digging through the Appalachians and bridging the Mississippi during the 1850s. Barely a decade later, the first transcontinental railroad was completed.


Congress created the Union Pacific Railroad Company in 1862 for the purpose of building a road from Nebraska west to California. Meanwhile, Sacramento, California, merchants organized the Central Pacific Railroad to build eastward. To encourage rapid construction, the two railroads were granted ownership of ten square miles of federal land for every mile of track laid, raised in 1864 to twenty square miles. They also received subsidies of $16,000 for every mile of track laid in the plains, $32,000 in the foothills, and $48,000 in the mountains. The two lines met at Promontory Point, Utah, on 10 May 1869, where Leland Stanford, a California grocer and Central Pacific investor, drove in the last spike, made of California gold. Several other transcontinental railroads were quickly constructed, also backed by generous grants of public land. Rail-based transportation systems were also built within cities during the nineteenth century to help ease congestion resulting from rapid growth. Horse-drawn streetcars were widely used beginning in the 1850s until replaced by electric streetcars during the 1880s and 1890s. In larger cities, elevated railroads were constructed beginning in the 1870s, and the first underground railroad (subway) opened in Boston in 1897. After a half-century of rapid construction, the United States had 40 percent of the world’s total rail mileage in

1900. For every 10,000 inhabitants, the United States had 27 miles of tracks, compared to 4.8 miles in Europe and 1.3 miles in the rest of the world. For every 100 square miles of territory, the United States had 9.6 miles of tracks, compared to only 5.1 miles in Europe and 0.3 miles in the rest of the world.

Train service made possible rapid movement between major cities in the late nineteenth century, but for the majority of Americans who still lived in rural areas railroads offered little service because they stopped infrequently between major cities. The routes of the main rail lines controlled the fate of rural communities. The rural and small-town stations where the trains did stop were like pearls strung along the railroad line. Around the stations economic and social activity bustled. Beyond a ten- to twelve-mile radius of the stations, most farmers lived in isolation, able to reach the outside world only by riding horses over dirt trails. In 1900, the United States still had 30 million horses, an average of more than one per household.

Seven groups—Vanderbilt, Pennsylvania, Morgan, Gould, Moore, Harriman, and Hill—controlled two-thirds of U.S. rail service in 1900. To most Americans, the railroad owners were hated and feared robber barons insensitive to the public interest. As monopolies, U.S. railroads paid more attention to wealthy riders willing to pay high prices for a luxurious ride than to average Americans eager to travel but unable to afford a ticket. The railroad was ripe for a challenge from a viable alternative for intercity travel. In the twentieth century, that viable alternative turned out to be the private motor vehicle.

Twentieth Century

Nineteenth-century transportation improvements made it possible for groups of Americans to travel long distances together with relative speed and comfort. The twentieth century brought personal and affordable travel to each individual American. Itinerary and departure time were determined by the stagecoach, steamboat, or railroad operator during the nineteenth century. During the twentieth century, the motor vehicle enabled individuals to decide for themselves where and when to travel.

Motor vehicles. The Duryea Motor Wagon Company, organized by brothers J. Frank and Charles E. Duryea in Chicopee Falls, Massachusetts, was the first company in the United States to manufacture automobiles in volume, thirteen in 1896. Duryea had gained fame by winning a race through the streets of Chicago on 28 November 1895, the first important event involving motor vehicles in U.S. history. Because motor vehicles quickly captured the public imagination for their speed and performance, early producers assumed that the market was primarily for high-end recreation and leisure purposes. Early vehicles were purchased as novelty items, akin to motorized bicycles, and with an average cost of about $2,000 only wealthy

people could afford them. Motor vehicles in fact were known as pleasure cars until World War I, when the motor vehicle industry launched a successful campaign to call them passenger cars instead, because “pleasure” sounded unpatriotic in the midst of a world war. The Ford Motor Company, organized in 1903 by Henry Ford, led the transformation of the motor vehicle from a toy into an indispensable tool of daily life. Ford believed that desire to own motor vehicles was universal, limited only by their high cost, and that once in possession of them Americans would find them extremely useful and practical. To build cars cheaply, Ford pioneered such production methods as offering only one model, designing an easy-to-build car, standardizing parts, placing machines in a logical sequence in the factory, assigning a very specialized job to each worker, and above all bringing the tasks to the workers along a continuously moving assembly line. Sales of the Ford car, known as the Model T, increased from 13,840 in 1909, its first year of production, to a peak of 1.4 million in 1924. When production ended in 1927, the Model T cost only $290, and Ford had sold more than 15 million of them over eighteen years. During the 1910s and 1920s, half of the world’s motor vehicles were Ford Model Ts. General Motors overtook Ford as the leading motor vehicle producer in the 1920s by offering a wide variety of vehicles with styling changed every year. GM stimulated sales through readily available low-interest loans and

Intercity Passenger Miles by Type of Transport
2000 (4,726 billion passenger miles): Motor Vehicle 87%; Air 10%; Bus 3%; Rail 0.01%
1950 (556 billion passenger miles): Motor Vehicle 87%; Rail 5%; Bus 5%; Air 3%
SOURCE: U.S. Department of Transportation, Bureau of Transportation Statistics.

increased profits through innovative financial management practices. At the onset of the Great Depression in 1929, the number of motor vehicles in the United States was nearly as great as the number of families, at a time when possession of a motor vehicle was extremely rare in the rest of the world. Through the first half of the twentieth century, the United States accounted for more than three-fourths of the world's production and sales of motor vehicles. As a result of a high car ownership rate, the United States had a very different transportation system for much of the twentieth century than anywhere else in the world. As early as 1930, the U.S. Census Bureau reported that one-fourth of U.S. cities with more than 10,000 inhabitants had no public transit and so depended entirely on cars for transportation. Even in the country's largest cities, most trips were being made by car in 1930.

Use of motor vehicles had been limited during the first two decades of the twentieth century by poor road conditions. The first inventory of U.S. roads by the Office of Public Roads Inquiry in 1904 found only 153,662 miles of roads with any kind of surfacing. The 1916 Federal Aid Road Act appropriated $75 million over five years to pay half of the cost of building rural post roads, with states paying the remaining half. In 1921 the amount was increased to $75 million per year. The amount of surfaced roads in the United States increased from 257,291 miles in 1914 to 521,915 miles in 1926. The Federal Highway Act of 1921 called for designation of a national highway system of interconnected roads. The complete national system of 96,626 miles was approved in 1926 and identified by the U.S. highway numbers still in use. The first limited-access highway—the Pennsylvania Turnpike—opened in 1940. The Interstate Highway Act of 1956 called for construction of 44,000 miles of limited-access highways across the United States. The federal government paid for 90 percent of the cost to construct the highways. Most of the miles of interstate highways were constructed to connect cities, but most of the dollars were spent inside cities.

Construction of new highways could not keep pace with increased motor vehicle usage during the second half of the twentieth century. Between 1950 and 2000, the number of Americans nearly doubled and the number of roads doubled, but the number of vehicles more than quadrupled and the number of miles driven more than quintupled. As a result, the United States had more motor vehicles than licensed drivers in 2000.

The federal government played an increasing role in the design of safer, cleaner, more efficient motor vehicles, especially during the 1960s and 1970s. The National Traffic and Motor Vehicle Safety and Highway Safety Acts of 1966 mandated safety features, such as seat belts. A cabinet-level Department of Transportation was established in 1967 to coordinate and administer overall transportation policy. The 1970 Clean Air Act specified reductions in polluting emissions. The 1975 Energy Policy and Conservation Act specified minimum fuel efficiency. However, improvements in passenger car safety and fuel efficiency during the late twentieth century were offset by Americans' preference for purchasing trucks instead.

Aviation. The federal government was crucial in shaping the role of aviation in the U.S. transportation system during the 1920s and 1930s. After the Wright Brothers' first successful manned flight in 1903, airplanes were flown primarily for entertainment and military purposes until 15 May 1918, when Army pilots started daily airmail service between New York and Washington. The Post Office—then a cabinet-level federal department—was authorized under the 1925 Kelly Act to award private aviation companies contracts to carry mail on the basis of competitive bidding. Because carrying airmail accounted for 90 percent of airline revenues during the 1920s, carriers with contracts were the ones to survive the industry's initial shakeout and then evolve into the dominant passenger-carrying services during the 1930s.


The first privately contracted airmail routes started on 15 February 1926, from Detroit to Chicago and Cleveland, and the federal government stopped flying its own airmail planes on 31 August 1927. Regularly scheduled passenger service started in 1926, when airmail contractors first provided a limited number of seats in their planes. Aviation companies started carrying cargo that year as well.

Ton-Miles of Freight by Type of Transport
2000 (3 trillion ton-miles): Truck 45%; Rail 33%; Water 22%; Air 0.04%
1950 (1 billion ton-miles): Rail 65%; Truck 18%; Water 17%; Air 0.03%
SOURCE: U.S. Department of Transportation, Bureau of Transportation Statistics.

The Civil Aeronautics Board (CAB, originally called the Civil Aeronautics Authority), created under the 1938 Civil Aeronautics Act, certified airlines as fit to fly, specified pairs of cities between which they could fly passengers, and regulated their fares. The Federal Aviation Administration (FAA), established in 1958, regulated safety and other standards for aircraft, airports, and pilots. Passenger service grew rapidly during the 1950s and 1960s, especially after the introduction of large jet-engine planes capable of flying more people longer distances at higher speeds. Between 1938 and 1978, the number of passengers increased from 1 million to 267 million and revenue passenger miles increased from 533 million to 219 billion. Routes supported by the Post Office back in the 1920s formed the backbone of the passenger services certified during the next half-century of government regulation. The federal government dramatically restructured the industry in 1978 through passage of the Airline Deregulation Act. Airlines could now fly wherever they wished inside the United States and charge passengers whatever they wished. The CAB—its regulatory role rendered obsolete—was disbanded in 1984, and its safety oversight functions transferred to the FAA. Deregulation set off a wave of airline acquisitions, mergers, and bankruptcies during the 1980s and 1990s. A handful of surviving airlines dominated the U.S. air system by terminating most point-to-point flights between pairs of cities and instead concentrating most flights in and out of a few hub airports. Regional airlines fed more passengers from smaller cities into the hubs. As a result, most airports offered nonstop flights to fewer cities but one-stop flights (via transfer at a hub) to many more cities. Low-cost airlines filled in gaps in the hub-and-spokes system by offering inexpensive flights between pairs of underserved airports. In the first two decades of deregulation, U.S. air travel increased even more rapidly than in the past—from 275 million passengers in 1978 to 466 million in 1990 and 666 million in 2000, and from 219 billion passenger miles in 1978 to 458 billion in 1990 and 693 billion in 2000. Free to charge passengers whatever they wished, airlines employed sophisticated yield management models to constantly change fares for particular flights depending on demand. Surviving nineteenth-century transportation systems. Squeezed between motor vehicles for shorter distances and airplanes for longer distances, the railroads lost nearly


all of their intercity passengers during the second half of the twentieth century. The handful of remaining intercity passenger routes were taken over in 1971 by Amtrak, with federal financial support. Most of Amtrak's 22 million passengers in 2000 were traveling between the large cities in the Northeast. Amtrak operated some suburban commuter rail lines, although most were transferred to local or regional public authorities.

Railroads and truck companies shared about evenly in the growth of freight handling during the first half of the twentieth century, but after completion of the interstate highway system trucks captured virtually all of the growth while railroads stagnated. Conrail was created by the federal government in 1976 to take over a number of bankrupt freight-hauling lines, including the Penn Central, the nation's largest when it was created in 1968 through the merger of the Pennsylvania and New York Central railroads.

Within urban areas, rail-based transit enjoyed a modest revival in the late twentieth century, especially in construction of new subway and streetcar (now called light rail) lines. The 1991 Intermodal Surface Transportation Efficiency Act and 1998 Transportation Equity Act enabled state and local governments to fund a mix of highway and transit improvements.

On major inland waterways, such as the Mississippi and Ohio Rivers, the federal government widened and straightened river channels and constructed locks and dams to make shipping by barge faster and safer. Dredging operations permitted oceangoing vessels to reach inland cities such as Tulsa, Oklahoma. In 2000, the United States had about 25,000 miles of navigable inland channels, not including the Great Lakes. Movement of freight became much easier in the late twentieth century by packing goods in containers that could be moved easily from ship to rail to truck.

Into the Twenty-first Century

The United States entered the twenty-first century with the prospect that travel would be slower and more difficult than during the twentieth century. After the terrorist attacks of 11 September 2001, in which four airplanes were used as weapons against the World Trade Center and the Pentagon, strict security checks instituted at U.S. airports increased total travel time and made more Americans afraid to fly. On the ground, roads and bridges deteriorated at a faster rate than they could be repaired, while motor vehicle usage continued to increase. As a result, driving time between and within cities increased.

BIBLIOGRAPHY

Air Transport Association of America. Annual Report. Washington, D.C.: Air Transport Association of America, published annually.
Davies, R. E. G. Fallacies and Fantasies of Air Transport History. McLean, Va.: Paladwr, 1994.
Dunbar, Seymour. A History of Travel in America. 4 vols. Indianapolis: Bobbs-Merrill, 1915.
Flink, James J. The Automobile Age. Cambridge, Mass.: MIT Press, 1988.
Hilton, George W., and John F. Due. The Electric Interurban Railways in America. Stanford, Calif.: Stanford University Press, 1960.
Jarrett, Philip, ed. Modern Air Transport: Worldwide Air Transport from 1945 to the Present. London: Putnam Aeronautical Books, 2000.
Rubenstein, James M. Making and Selling Cars: Innovation and Change in the U.S. Automotive Industry. Baltimore: Johns Hopkins University Press, 2001.
Taylor, George Rogers. The Transportation Revolution, 1815–1860. New York: Rinehart, 1951.
Vance, James E., Jr. The North American Railroad: Its Origin, Evolution, and Geography. Baltimore: Johns Hopkins University Press, 1995.
Vranich, Joseph. Derailed: What Went Wrong and What to Do about America's Passenger Trains. New York: St. Martin's Press, 1997.
Womack, James P., Daniel T. Jones, and Daniel Roos. The Machine That Changed the World. New York: Rawson, 1990.

James M. Rubenstein


See also Air Transportation and Travel; Airline Deregulation Act; Amtrak; Cumberland Road; Erie Canal; Federal-Aid Highway Program; Federal Aviation Administration; Interstate Highway System; Railroads; Railways, Interurban; Railways, Urban, and Rapid Transit; Transportation, Department of.

TRAVELING SALESMEN are representatives of business firms who travel through assigned territories to solicit orders for future deliveries of their employers' goods and services. Unlike peddlers or canvassers, they seek orders from other business firms and public institutions rather than from individual consumers or households. Also, unlike peddlers and other itinerant merchants, they usually sell from samples or descriptions of their products rather than carry goods for immediate delivery. Although itinerant dealers, such as seagoing and overland traders, emerged early in American economic life, traveling salesmen were virtually nonexistent before the mid-nineteenth century. Thin, sparsely developed market areas, small-scale manufacturing, and the lack of branded merchandise lines provided little incentive for the use of salesmen. Wholesale merchants, who maintained their own contacts with suppliers, dominated trade. Retailers either made periodic buying trips to major wholesale centers to replenish their inventories or patronized local wholesale jobbers.

After 1840 manufacturers began to take the initiative and send salesmen in search of customers. This change resulted from (1) the growth of market opportunities as America became more urban; (2) transportation improvements that reduced travel time and expense; and (3) growth in manufacturing capacity and the consequent need to sell greater quantities of goods. The pioneer traveling salesmen cannot be precisely identified, in part because of definitional problems. For example, should a company owner or partner who made an occasional visit to distant customers or agents be classified as a traveling salesman? A Wilmington, Del., railway equipment manufacturing company, Bonney and Bush (subsequently Bush and Lobdell and then Lobdell Car Wheel Company), employed a traveling agent starting in the 1830s and added additional salesmen in the 1850s. Scovill Manufacturing Company of Waterbury, Conn., which made brassware, experimented with a traveling salesman from 1832 to 1835 but did not finally adopt that selling method until 1852. The Rogers Brothers Silverware Company and other metalworking firms also began using traveling salesmen in the early 1850s.

Many states and municipalities, acting at the behest of their local wholesalers, imposed costly licensing requirements on traveling salesmen entering their jurisdictions. These barriers proved ineffective and eventually were declared unconstitutional in Robbins v. Taxing District (1887). The U.S. Census reported 7,262 traveling salesmen in 1870 and 223,732 in 1930, but these figures may represent only one-half to one-third of the true numbers. The number of salesmen undoubtedly increased after 1930.

Salesmanship in the twentieth century became increasingly professionalized and scientific. The early sales management textbooks of the 1910s and 1920s were followed by a steadily expanding stream of books, periodicals, and college courses; more careful planning of salesmen's itineraries; attention to new methods of testing, selecting, and training salesmen; and experimentation with various commission and salary methods of compensation.

Traveling salesmen continued to ply their trade well into the twenty-first century. But while salesmen selling to businesses thrived—selling drugs to doctors or cosmetics to salons, for example—the door-to-door salesman became extremely rare. The rise of the two-income family after the 1970s deprived door-to-door salesmen of daytime access to customers. In addition, the rise of the Internet as a marketing device and the trend toward gated and policed suburban subdivisions created steep barriers for door-to-door work. In many parts of the United States by the end of the twentieth century, the door-to-door salesman was a thing of the past.

BIBLIOGRAPHY

Hollander, S. C. "Nineteenth Century Anti-Drummer Legislation." Business History Review 38 (1964).
Moore, Truman E. The Traveling Man: The Story of the American Traveling Salesman. Garden City, N.Y.: Doubleday, 1972.
Porter, Glenn, and Harold C. Livesay. Merchants and Manufacturers: Studies in the Changing Structure of Nineteenth-Century Marketing. Baltimore: Johns Hopkins Press, 1971.
Spears, Timothy B. 100 Years on the Road: The Traveling Salesman in American Culture. New Haven: Yale University Press, 1995.

Stanley C. Hollander / l. t.; a. r. See also Advertising; Consumerism; Marketing.

TREASON. Traditionally, treason was betrayal of the state, which, in most countries, meant the monarch. A person who commits treason is a traitor. However, the framers of the U.S. Constitution chose to adopt a restricted definition of treason, making it the only term defined in the body of the Constitution. James Wilson was the principal author of the provision (Art. III, Sec. 3):

Treason against the United States, shall consist only in levying War against them, or in adhering to their Enemies, giving them Aid and Comfort. No person shall be convicted of Treason unless on the Testimony of two witnesses to the same overt Act, or on confession in open Court. The Congress shall have Power to declare the Punishment of Treason, but no Attainder of Treason shall work Corruption of Blood, or Forfeiture except during the Life of the Person attainted.

Their reason for defining treason was the common English practice of charging political opponents with a capital offense, often on weak evidence, under the doctrine of “constructive treason.” A classic case was the trial of Algernon Sidney, beheaded in 1683 for plotting against the king. The case against him was based largely on passages from his treatise, Discourses Concerning Government, which was not even published until after his death, in 1698. The term treason was familiar in the common law before it was used in the Statute of 25 Edward III (1350), from which the Constitution derives its language concerning the levying of war and adhering to enemies, giving them aid and comfort. However, the Constitution’s treason clause contains no provision analogous to that by which the Statute of Edward III penalized the compassing (intending) of the king’s death, since in a republic there is no monarch and the people are sovereign. Charges of treason for compassing the king’s death had been the main instrument used in England for the most drastic, “lawful” suppression of political opposition or the expression of ideas or beliefs distasteful to those in power. The Statute of 7 William III (1694) introduced the requirement of two witnesses to the same or different overt acts of the same treason or misprision (concealment) of treason, made several exceptions to what could be considered treason, and protected the right of the accused to have copies of the indictment and proceedings against him, to have counsel, and to compel witnesses—privileges not previously enjoyed by those accused of common law crimes. This statute served as a model for colonial treason statutes. The first major cases under the U.S. Constitution arose from an 1807 conspiracy led by Aaron Burr, who had served as vice president under Thomas Jefferson in 1801–1805. The conspirators planned to seize parts of Mexico or the newly acquired Louisiana Territory. Burr and two confederates, Bollman and Swartwout, were charged with treason. Chief Justice John Marshall opened the door for making actions other than treason a crime in Ex parte Bollman when he held that the clause does not prevent Congress from specifying other crimes of a subversive nature and prescribing punishment, so long as Congress is not merely attempting to evade the restrictions of the treason clause. But he also stated, “However flagitious [villainous] may be the crime of conspiring to subvert by force the government of our country, such conspiracy is not treason. To conspire to levy war, and actually to levy war, are distinct offences. The first must be brought into open action by the assemblage of men for a purpose treasonable in itself, or the fact of levying war cannot have been committed. So far has this principle been carried, that . . . it has been determined that the actual enlistment of men to serve against the government does not amount to levying of war.” On the basis of these considerations and because no part of the crime charged had been committed in the District of Columbia, the Court held that

TREASON TRIALS
Ex parte Bollman, 4 Cr. (8 U.S.) 75 (1807).
United States v. Burr, 4 Cr. (8 U.S.) 469 (1807).
Annals of Congress, Tenth Congress, First Session, Senate, Debate on Treason and Other Crimes, 1808.
Wharton's State Trials of the United States (Philadelphia, 1849), and Lawson's American State Trials (17 volumes, St. Louis, 1914–1926), trials of Thomas Wilson Dorr (1844) and of John Brown (1859).
Cramer v. United States, 325 U.S. 1 (1945).
Haupt v. United States, 330 U.S. 631 (1947).
Kawakita v. United States, 343 U.S. 717 (1952).
United States v. Rosenberg, 195 F.2d 583 (2d Cir.), cert. denied, 344 U.S. 889 (1952).

Bollman and Swartwout could not be tried in the District and ordered their discharge. Marshall continued by saying, "the crime of treason should not be extended by construction to doubtful cases."

Burr was acquitted on 1 September 1807, after an opinion rendered by Chief Justice Marshall in U.S. v. Burr that further defined the requirements for proving treason. The Court held that Burr, who had not been present at the assemblage of men on Blennerhassett Island, could be convicted of advising or procuring a levying of war only upon the testimony of two witnesses to his having procured the assemblage, but the operation was covert and such testimony was unobtainable. Marshall's opinion made it extremely difficult to convict someone of levying war against the United States unless the person participated in actual hostilities.

The Burr and Bollman cases prompted the introduction in 1808 of a Senate bill to further define the crime of treason. The debate on that bill, which was rejected, provides insight into the original understanding of the treason clause: its purpose was to guarantee nonviolent political controversy against suppression under the charge of treason or any other criminal charge based on its supposed subversive character, and there was no constitutional authority to evade the restriction by creating new crimes under other names.

Before 1947, most cases that were successfully prosecuted were not federal trials but rather state trials for treason, notably the trials of Thomas Wilson Dorr (1844) and John Brown (1859) on charges of levying war against the states of Rhode Island and Virginia, respectively. After the Civil War, some wanted to try Southern secessionists for treason, and the former Confederate


president Jefferson Davis was charged with treason in U.S. v. Jefferson Davis. The constitutional requirement in Art. III Sec. 2 Cl. 3 that an offender be tried in the state and district where the offense was committed would have meant trying Davis in Virginia, where a conviction was unlikely, so the case was dismissed. Although the United States government regarded the activities of the Confederate States as a levying of war, the president’s Amnesty Proclamation of 25 December 1868 pardoned all those who had participated on the Southern side. Since the Bollman case, the few treason cases that have reached the Supreme Court have been outgrowths of World War II and charged adherence to enemies of the United States and the giving of aid and comfort. In the first of these, Cramer v. United States, the issue was whether the “overt act” had to be “openly manifest treason” or whether it was enough, when supported by the proper evidence, that it showed the required treasonable intention. The Court in a five to four opinion by Justice Jackson took the former view, holding that “the two witness principle” barred “imputation of incriminating acts to the accused by circumstantial evidence or by the testimony of a single witness,” even though the single witness in question was the accused himself. “Every act, movement, deed, and word of the defendant charged to constitute treason must be supported by the testimony of two witnesses.” The Supreme Court first sustained a conviction of treason in 1947 in Haupt v. United States. Here it was held that although the overt acts relied upon to support the charge of treason (defendant’s harboring and sheltering in his home his son who was an enemy spy and saboteur, assisting him in purchasing an automobile and in obtaining employment in a defense plant) were all acts that a father would naturally perform for a son, this fact did not necessarily relieve them of the treasonable purpose of giving aid and comfort to the enemy. In Kawakita v. United States, the petitioner was a native-born citizen of the United States and also a national of Japan by reason of Japanese parentage and law. While a minor, he took the oath of allegiance to the United States, went to Japan for a visit on an American passport, and was prevented from returning to this country by the outbreak of war. During World War II he reached his majority in Japan, changed his registration from American to Japanese, showed sympathy with Japan and hostility to the United States, served as a civilian employee of a private corporation producing war materials for Japan, and brutally abused American prisoners of war who were forced to work there. After Japan’s surrender, he registered as an American citizen, swore that he was an American citizen and had not done various acts amounting to expatriation, and returned to this country on an American passport. The question whether, on this record, Kawakita had intended to renounce American citizenship was peculiarly one for the jury, said the Court in sustaining conviction, and the jury’s verdict that he had not so


intended was based on sufficient evidence. An American citizen, it continued, owes allegiance to the United States wherever he may reside, and dual nationality does not alter the situation. This case is notable for extending U.S. criminal jurisdiction to the actions of U.S. civilian citizens abroad, which would have originally been considered unconstitutional.

World War II was followed by the Cold War, which resulted in political prosecutions of several persons for treason and other charges on dubious evidence. The trials of the Axis broadcasters—Douglas Chandler, Robert H. Best, Mildred Gillars as "Axis Sally," Iva Ikuko Toguri d'Aquino as "Tokyo Rose" (later pardoned by President Ford when it was revealed she had been a double agent for the Allies)—and the indictment and mental commitment of Ezra Pound, muddied the jurisprudence of the treason clause. Their actions provided no significant aid or comfort to an enemy and were not committed within the territorial jurisdiction of the United States. In United States v. Rosenberg, the Court held that in a prosecution under the Espionage Act for giving aid to a country (not an enemy), an offense distinct from treason, neither the two-witness rule nor the requirement as to the overt act was applicable. However, no constitutional authority for the Espionage Act itself was proven.

BIBLIOGRAPHY

Chapin, Bradley. The American Law of Treason: Revolutionary and Early National Origins. Seattle: University of Washington Press, 1964.
Elliot, Jonathan. Debates in the Several State Conventions on Adoption of the Federal Constitution. Philadelphia, 1836, p. 469 (James Wilson).
Hurst, James Willard. The Law of Treason in the United States: Collected Essays. Westport, Conn.: Greenwood Publishing, 1971.
Kutler, Stanley I. The American Inquisition: Justice and Injustice in the Cold War. New York: Hill and Wang, 1982.

Jon Roland

See also Arnold's Treason; Civil Rights and Liberties; Davis, Imprisonment and Trial of; Rosenberg Case.

TREASURY, DEPARTMENT OF THE. At its inception in 1789 the U.S. Treasury Department quickly came to dominate the executive branch. Alexander Hamilton, the first secretary of the treasury, became a virtual prime minister in the Washington administration. Although the department's role diminished somewhat under future secretaries, it remained the cabinet branch most central to the operation of the federal government.

The administrative reach of the Treasury Department is enormous. Early Congresses mandated that the department create and oversee the U.S. Customs Service, the Internal Revenue Service, the U.S. Mint, the Coast Guard, and the First Bank of the United States, along with other responsibilities. This oversight meant that Secretary Hamilton and his successors guided fiscal policy and influenced foreign trade; collected and disbursed the revenue of government; maintained the stability of the national currency; were responsible for funding the national debt; made the lion's share of federal appointments in the new nation; influenced the development of American manufacturing; and policed America's territorial waters. The Treasury Department's reach and authority changed over the next two centuries, depending on both the forcefulness and personality of the incumbent secretary and the addition or subtraction of a particular responsibility. But within the cabinet framework of constitutional executive power, the department always remained at the center of domestic policy, foreign and domestic commerce, and national fiscal oversight.

Hamilton's goals were twofold: shift the equilibrium between states' rights and federal authority created by the Constitution to the advantage of national power, and diversify the American economy, making it more balanced by augmenting dependence on agriculture with strong encouragement of elite commercial and manufacturing interests. He achieved both goals and in doing so he set the terms for a national debate that endured into the twentieth century. He funded the national debt, a product of the Revolution, in a way that immediately shifted power away from the states; he created the First Bank of the United States in 1791, both consolidating federal control over fiscal policy and stimulating foreign trade. His 1791 Report on Manufactures established a standard that engaged the federal government on behalf of elite-led industrialization over the next quarter-century and beyond. Using the Treasury Department as the instrument of his will in implementing his vision of a strong diversified economy in a nation led by a landed and moneyed gentry, he also renewed the ideological debate over the very shape of the republic established by the American Revolution.

Nineteenth Century

Even when Hamilton's enemy Thomas Jefferson became president in 1801, and despite his agrarian and democratic rhetoric—echoed faithfully by his secretary of the treasury, Albert Gallatin—Hamiltonian economic reforms, and the ideology behind them, endured. Gallatin served two presidents (Jefferson and James Madison) for fourteen years with great ability. But his department legacy, sometimes in conflict with his own and his administrations' principles, was to implement and solidify Hamilton's vision of America. Gallatin's 1810 Report on Manufactures to Congress, building on earlier submissions, encouraged American industrial development. President James Madison completed his own version of Hamiltonian treasury policies in 1816, when he signed bills chartering the Second Bank of the United States and introducing America's first protective tariff. These measures


had the cumulative effect of strengthening the Treasury Department’s hand in shaping government policy. Under Andrew Jackson’s strong executive leadership, the Treasury Department was at the forefront in 1830s attempts to reverse Hamiltonian policy. Treasury secretaries Louis McLane and especially Roger Taney carried the banner in assaulting what Jacksonian Democrats saw as entrepreneurial excess and economic elitism. The Second Bank of the United States in particular was seen to drain federal control of fiscal policy, foreign and domestic commerce, and even westward expansion of America’s farmers. It was Roger Taney who drafted Jackson’s famous and ideologically crucial 1832 message vetoing the recharter of the Second Bank of the United States. And it was the Treasury Department that ultimately inherited the residue of bank power over American fiscal and economic policy when the Independent Treasury was legislated in 1840. It was the Treasury Department that issued the Specie Circular of 1836, granting enormous financial leverage to Jackson’s state-oriented “Pet Banks,” meant to oversee the financing of a more rapid and democratic agrarian expansion into the west. So the department during the 1830s became the chief executive means of implementing populist, agrarian Jacksonian Democracy. Civil War. The practical result of Jacksonian policies, however, was to unwittingly open the door to unrestrained free enterprise and industrial expansion. Jacksonian ideology finally undermined the Treasury Department’s role in controlling economic development. Until the Civil War restored its centrality, the department shared the fate of the executive branch as a whole as its power to exercise leadership dwindled under the weak presidents who presided through the 1850s. Abraham Lincoln’s secretary of the treasury, Salmon P. Chase, was an able administrator and politically a powerful and well-placed Republican Party leader. Facing the crisis of the Civil War, he quickly moved to restore the fiscal health of the department and find the revenue and credit needed to prosecute the war. He used the power of the department to collect the new taxes mandated by Congress; and he restored the authority of a Customs Service fractured by secession. Chase also used Treasury Department guarantees to borrow large amounts of money from private capital sources to finance the war until his tax policies could kick in; and via the National Bank Acts of 1863 and 1864, drafted by him and passed at his urging, he reformed the nation’s truncated banking system by, among other things, eliminating competition for borrowed funds by taxing the state banks to the breaking point. The national banks already chartered were forced by the new laws to invest one-third of their capital in government bonds to help finance the war, a provision made possible by the weakening of the state banks and thus the elimination of competition from that source. But Chase’s restoration of the Treasury Department to something near its former eminence was short-lived and did not survive much beyond the Civil War. In the


post-war nineteenth century the department shared the fate of a weakened presidency in general, and it mostly failed to exercise much fiscal or economic restraint on the Gilded Age.

Twentieth Century

Progressive Era. Efforts to correct the economic, political, and social excesses of the late nineteenth century also began the process of restoring the Treasury Department to its earlier eminence in directing domestic executive policies. It remained at the center of government for much of the twentieth century. The Federal Reserve Act of 1913, part of the Progressive reform package delivered by Woodrow Wilson, was the most important piece in the puzzle. William McAdoo was secretary of the treasury from 1913 to 1918, and he oversaw both its complicated passage through Congress and its implementation. The fact that he was Wilson's son-in-law did not hurt his leverage. The Federal Reserve Act created a new and original banking system. While after the 1960s the Federal Reserve Board created by the act achieved a greater degree of autonomy, the board started life under the Progressives as very much the creature of the Treasury Department. Both the secretary and the comptroller of the treasury were voting members of the board, as were six regional directors appointed by the president of the United States. For at least a half-century the secretary of the treasury wielded immense de facto authority over economic policy, interest rates, currency (via federal reserve notes), and commercial paper through his ability to move the Federal Reserve Board. Even later, more conservative administrations fell in line as bankers admitted that the Federal Reserve introduced a financial stability to the nation that it had not seen since the tenure of Alexander Hamilton.

Progressive impetus also achieved final ratification of the Sixteenth Amendment, making constitutional the passage of a graduated federal personal income tax. This opened the door to a resurgence of Treasury Department authority. First introduced in the waning days of Teddy Roosevelt's administration, the amendment was ratified in 1913, in time for the Wilson administration to implement a graduated income tax. While it did not dramatically result in a "soak the rich" policy, it did increase the amount of federal funds overseen by the Treasury Department, and it significantly increased the bureaucracy within the department through the revitalization of the Internal Revenue Service, which dated back to 1791.

In general, as federal oversight of economic and social conditions increased in the twentieth century, the Treasury Department's role in that oversight increased as well. This became evident immediately following the Progressive Era. Both the politically conservative 1920s and the dramatically liberal 1930s made clear the strong resurgence of the Treasury Department at the center of domestic policy. Two powerful secretaries oversaw this


comeback: Andrew Mellon and Henry Morgenthau. The ideological divide between the two men was immense. Secretary from 1921 to 1932, Mellon successfully lobbied Congress to reduce taxes for the wealthy, whether they be individuals or corporations viewed in law as individual entities. In an era of fiscal speculation, Mellon’s department gave corporate America free rein in generating stock market–oriented wealth. Mellon espoused the theory of “trickle down economics,” which held that wealth at the top would filter down to the lower classes. So the secretary was instrumental in gutting even the modest graduation of the new income tax, and he almost entirely removed the tax burdens the Progressives had imposed on the well-to-do. Mellon’s popularity soared until 1929. He was seen as the architect of the theory, best enunciated by President Calvin Coolidge, that “the business of government is business.” The secretary of the treasury was the spokesman for the “New Prosperity” of paper profits generated by Wall Street, gains that fueled the Roaring Twenties mentality of easy wealth for the upper-middle classes and new rich. And Mellon was, with President Herbert Hoover, the fall guy on whom the Great Depression of the 1930s was blamed after the stock market collapse of 1929. The Great Depression and the New Deal. The depression discredited the conservative economic leadership of the 1920s. Under the New Deal, which began with Franklin Delano Roosevelt’s election in 1932, Secretary of the Treasury Henry Morgenthau oversaw the strongly liberal policy of government intervention on the side of labor and the small farmer. He was no less an icon of what a modern secretary of the treasury should be for the rural and urban working classes than Mellon was for capitalists and entrepreneurs. Secretary from 1934 to 1945, Morgenthau was one of a cadre of important liberals responsible for the legislation that transformed Mellon’s policy of government hands off free enterprise to a dramatically new policy of welfare capitalism, invoking vast government control over the private sector of the economy. Some argue that the New Deal destroyed real capitalism in America; others claim that FDR and his administration saved capitalism from failing completely. However one reads the record, the Treasury Department was at the center of New Deal domestic policy. In drafting most depression-era legislation, Morgenthau was secondary to New Dealers like FDR guru Harry Hopkins, Labor Secretary Frances Perkins, and Brain Truster Raymond Moley, but the Treasury Department was central to its revolutionary implementation. And in a few areas, the treasury did find legislative solutions as well. In 1935, for example, Morgenthau came up with the plan to find the vast funding needed to secure passage of the Social Security Act. He convinced the president to levy a payroll tax to build up the trust fund that made Social Security a self-financed old-age benefit independent of congressional budgeting. Initially, he argued in the midst of vast unemployment, financing would come

from taxes only on those working, a de facto elite in the 1930s. The secretary was a key player too in shaping legislation for a graduated corporate income tax that included those holding companies far removed legally from their profitable corporate subsidiaries. When recovery foundered in 1937, Morgenthau was instrumental in convincing FDR to move more conservatively in the economic sector, advice that FDR heeded in launching what historians now call "the second New Deal." There followed a renewed increase in deficit spending as the administration once again pumped money into public spending designed to increase employment. The Treasury Department was at the center of this "second New Deal," as it had been at the center of the first. The radical reforms of the first New Deal were consolidated, insuring that "welfare capitalism" (conservatives were already calling it the welfare state) would remain a permanent part of twentieth-century economic and social policy. For twelve critical years the treasury remained front and center in guiding domestic policy. Even post–World War II Republican secretaries were unwilling (or unable) to undo New Deal economic and social reform.

In the years between the New Deal and Lyndon Johnson's Great Society newly created cabinet-level departments, especially Health, Education, and Welfare and Housing and Urban Development, siphoned off some of the near monopoly the treasury had exercised over domestic affairs. In the 1980s Ronald Reagan's conservative policies cut taxes and returned to the massive deficit spending that marked the New Deal. This effort thrust the treasury once more into the center of executive branch oversight; the arrival of Robert Rubin, secretary of the treasury beginning in 1993 during Bill Clinton's administration, cemented the treasury's central role in domestic policy yet again.

Balanced budgets and free trade. Rubin first pushed successfully to balance the budget, largely by means of more disciplined spending, trimming back federal bureaucracy, and tax reform that increased revenue by imposing more taxes on the well-to-do and corporate America. These were openly acknowledged to implement Rubin's vision. The Treasury Department then moved to restore American and global economic health by moving America rapidly toward free trade through NAFTA (North American Free Trade Agreement); expansion of most-favored-nation status for China; and closer cooperation with the European Union as it moved toward full economic integration. The Treasury Department, working with the Federal Reserve (still nominally under its jurisdiction), oversaw a long period of prosperity lasting into the new millennium. It did this while both keeping inflation in check and balancing the budget. Like a few earlier secretaries, Rubin, who served through nearly two terms under Bill Clinton, won enormous public confidence in personal terms. Thus his support and that of his department translated into bipartisan popular and congressional support for any policies they espoused. Rapidly rising


stock markets, dynamic expansion of stock investments by the public at large, growing employment opportunities, massive gains in the new high-tech economy, and low inflation all contributed to sustaining perhaps the longest period of uninterrupted prosperity in the nation's history. One major result of America's domination of a new global economy was to elevate the Treasury Department to virtually the same supreme driving force in government in the 1990s that it had enjoyed two centuries earlier under the aegis of Alexander Hamilton.

BIBLIOGRAPHY

Elkins, Stanley M., and Eric McKitrick. The Age of Federalism. New York: Oxford University Press, 1993.
Kennedy, David M. Freedom from Fear: The American People in Depression and War, 1929–1945. New York: Oxford University Press, 1999.
Link, Arthur. Woodrow Wilson and the Progressive Era, 1910–1917. New York: Harper, 1954.
Sellers, Charles G. The Market Revolution: Jacksonian America, 1815–1846. New York: Oxford University Press, 1991.
Walston, Mark. The Department of the Treasury. New York: Chelsea House, 1989.

Carl E. Prince

See also Bank of the United States; Coast Guard, U.S.; Crédit Mobilier of America; Customs Service, U.S.; Debts, Revolutionary War; Gilded Age; Granger Movement; Great Society; Railroads; Tariff.

TREATIES, COMMERCIAL. From its earliest years, the United States' foreign policy has focused as much on commercial interests as on all other concerns (including military) combined. This focus comes from what Americans and their government have perceived as their needs and from the way America views its role in international affairs. The Revolution itself was motivated in part by English restrictions on foreign trade by the American colonies. One of the United States' very first treaties was the Treaty of Amity and Commerce of 1778, which opened American ports and markets to French traders and opened French ports and markets to Americans. France's colonial markets had great value to American merchants as sources of raw materials to manufacture into goods for sale, not only in America but overseas. This treaty led to navigation treaties with several European powers, eventually opening markets in the Far East that proved very profitable in the late 1700s and early 1800s. There was already a global economy, and commercial treaties were becoming highly complicated agreements among several nations at once, often with each new treaty requiring adjustments to old ones. Sometimes the State Department or the president concluded treaties known as executive agreements. The Senate occasionally challenged these executive agreements, arguing that the Constitution required formal Senate confirmation of commercial treaties; these were called formal accords. This vagueness between executive agreements and formal accords often made commercial treaty negotiations difficult because foreign countries could not tell whether years of hard negotiations with the president or State Department would be accepted by the Senate. In 1936, in United States v. Curtiss-Wright Export Corporation, the Supreme Court tried to clarify the distinctions between executive agreements and formal accords and affirmed that the president had the authority to make commercial treaties without always needing the Senate's approval. This decision was controversial, especially when the United States gave "most-favored-nation" status to communist Hungary in 1978 and to communist China annually from the late 1980s through the early 2000s.

During the 1800s, the creation of commercial treaties was haphazard because of America's conflicting impulses to isolate itself from foreign affairs and to create new markets for its goods. Americans also believed that free trade was a liberating force that would bring political freedom and would raise the standard of living for America's trading partners. For example, when Commodore Matthew Perry sailed warships to Japan to pressure Japan into making a commercial treaty with the United States, America regarded it as doing the Japanese people a good turn by opening their country to benefits of a modern economy. By the 1920s it was clear that having a trading agreement with the United States was good for a country; nations as disparate as Japan and Argentina were creating wealth for themselves by selling consumer goods to Americans. By 1923 the principle of most-favored-nation status became a permanent part of American foreign policy: It clarified the trading rights of American commercial partners, making it easier to negotiate economic ventures with American companies.

In the second half of the twentieth century, the United States participated in four sweeping commercial agreements: the World Bank, the International Monetary Fund (IMF), the General Agreement on Tariffs and Trade (GATT), and the North American Free Trade Agreement (NAFTA). Americans believed that it was in everybody's best interest to improve the economies of impoverished nations. The World Bank, to which the United States was by far the major contributor, was intended to make long-term loans to build private industries, and the International Monetary Fund, with the United States again the major contributor, was created to loan governments money to stabilize their economies and to help them promote economic growth. The IMF became very controversial in the 1990s, because some people saw it as creating a global economy (they were about three hundred years too late) that would lead to international corporations oppressing the peoples of the world. GATT was intended to eliminate the trade barriers presented by tariffs. It recognized that economies can change, and it provided a mechanism, called the "round" of negotiations, for changing the treaty to meet changing times. The first round took place in Geneva in 1947


and focused on coordinating tariffs to help nations devastated by World War II. A single round could last for years, and no wonder: the first round alone covered more than 45,000 trade agreements. GATT is probably the supreme achievement of twentieth-century commercial treaties, generating more wealth for its member nations through free trade than any other treaty America was party to. NAFTA was a response to creation of the European Union and efforts among Southeast Asian countries to form a trading block. By eliminating trade barriers among its members, the European Union created a powerful economic machine in which member nations could coordinate and finance large industrial enterprises and challenge America for world dominance in foreign trade. The United States and Canada already had a free trade agreement that allowed shipping across their borders almost without impediment. Negotiated mainly during the administration of George Bush the elder (1989–1993), NAFTA sought to include all of North America in a single economic engine that would be unmatched in its resources. Mexico readily participated in negotiations with Canada and the United States, but the nations of Central America, most of which were in social upheaval, did not, although President Bush envisioned that both Central America and South America would be included in the future. NAFTA required adjustments to GATT, because it affected almost every trading partner’s treaties with the United States. It was to be a formal accord, requiring the Senate’s consent, and passage was a tricky business in 1993–1994; the new president, Bill Clinton, had said he opposed NAFTA during his campaign. This brought into play an interesting characteristic of American treaty negotiations: the promise to treaty partners that subsequent presidential administrations will honor agreements made by previous ones. This consistency has been upheld by presidents since Thomas Jefferson, and President Clinton persuaded the Senate to approve NAFTA. Kirk H. Beetz BIBLIOGRAPHY

Appleton, Barry. Navigating NAFTA: A Concise User's Guide to the North American Free Trade Agreement. Rochester, N.Y.: Lawyer's Cooperative Publishing, 1994.
MacArthur, John R. The Selling of "Free Trade": NAFTA, Washington, and the Subversion of American Democracy. New York: Hill and Wang, 2000.
Morrison, Ann V. "GATT's Seven Rounds of Trade Talks Span More than Thirty Years." Business America 9 (7 July 1986): 8–10.
Wilson, Robert R. United States Commercial Treaties and International Law. New Orleans, La.: Hauser Press, 1960.

TREATIES WITH FOREIGN NATIONS. In international usage the term “treaty” has the generic sense of “international agreement.” Rights and obligations, or

status, arise under international law irrespective of the form or designation of an agreement. In constitutional usage, however, treaties are sometimes distinguished from less formal agreements by special requirements for negotiation or ratification, limitations of subject matter, or distinctive effects in domestic law. The U.S. Constitution distinguishes treaties from other agreements and compacts in three principal ways. First, only the federal government can conclude a "Treaty, Alliance, or Confederation." States can make an "Agreement or Compact" with other states or with foreign powers but only with the consent of Congress (Article I, section 10). Second, treaties are negotiated and ratified by the president, but he or she must obtain the advice and consent of the Senate, two-thirds of the senators present concurring (Article II, section 2, clause 2). President George Washington understood this provision to include Senate advice during both treaty negotiation and ratification. He attempted to consult with the Senate at an executive council concerning a proposed Indian treaty, but after a frustrating experience he declared that he "would be damned" if he ever did that again. Washington's successors sought the advice and consent of the Senate only after treaty negotiations, during the period of ratification. Third, the Constitution distinguishes international treaties from "agreements and compacts" by making treaties part of the supreme law of the land that judges in every state are bound to enforce (Article VI, clause 2). The U.S. Supreme Court has on occasion asserted that it may nullify unconstitutional treaties, but it has never done so. International treaties are generally obligatory after signature and before formal ratification. In the United States, however, this is only true when a treaty is designated as "self-executing." Otherwise, under U.S. law, treaties are sent to Congress for legislative ratification and implementation.

Early American Treaties
After declaring independence from Great Britain in 1776, the United States concluded fifteen treaties before the ratification of the U.S. Constitution in 1789. These early treaties reflected the problems of political decentralization at the time. Commissioners appointed largely ad hoc by the Continental Congress negotiated the treaties, and the agreements were subject to a very uncertain ratification process. Between 1776 and 1781 the assent of all states voting was required for treaty approval, with nine states constituting a quorum. After the creation of the Articles of Confederation in 1781, nine of the thirteen states had to approve each treaty. These provisions posed many difficulties for America's nascent diplomats, operating without an established foreign service or a reliable framework of legislative support. At critical moments, the Continental Congress often skirted its stated rules to obtain desired treaty ratification. The Treaty of Alliance with France in 1778—a

vitally important part of America's revolutionary struggle against Great Britain—obtained congressional ratification with a vote recorded as unanimous. Yet the representatives of two states were certainly absent from the vote. Two more states may also have failed to ratify the treaty. Proponents of the alliance with France disguised the absence of required consent for the treaty by depicting a vote of eight states, rather than the necessary nine, as a unanimous congressional voice. Often employing similar procedures, the Continental Congress ratified the Treaty of Paris in 1783, which ended the war with Great Britain on very favorable terms for Americans. London acknowledged American independence and conceded the new nation free navigation of the Mississippi River, the key inland waterway for north-south commerce and communication. Americans also concluded a series of commercial treaties around this same time with the Netherlands (1782), Sweden (1783), Prussia (1785), and Morocco (1786). In 1788 the United States concluded a formal consular convention with France, assuring high diplomatic standing for American representatives in Paris. After 1789, treaty making under the U.S. Constitution focused upon assuring American economic independence, freedom from entanglement in the Napoleonic Wars that convulsed the European continent, and territorial expansion in North America. In 1794 John Jay negotiated a treaty with Great Britain—Jay's Treaty—that sought to reduce growing tensions between the Americans and their former colonial masters. U.S. citizens objected to British restrictions on American trade with London's adversaries, especially France, and they found the British impressment of captured American sailors into British military service deeply offensive. Jay's Treaty did not prohibit London's continued attacks on American shipping, but it did secure the final withdrawal of British troops from a string of occupied western forts around the Great Lakes. The treaty also opened U.S. trade with British-controlled India and the West Indies. Many Americans, including former secretary of state Thomas Jefferson, opposed the Jay Treaty as too deferential to Britain. They demanded a stronger assertion of American neutral shipping rights. Recognizing that Jay had done the best he could from a position of U.S. weakness, President Washington personally pushed the treaty through the Senate, barely gaining ratification. The debate about the Jay Treaty began a long history of domestic controversy over the necessary and acceptable compromises required by the vagaries of international politics. Jay's "realism" was pragmatic, but it contradicted many of America's stated ideals. Thomas Pinckney followed Jay's work by negotiating a treaty with Spain in 1795, known as Pinckney's Treaty. Under this agreement Spain granted the United States access to the Mississippi River—especially the port of New Orleans, under Spanish control—and the territories around the river. The Spanish also promised to help

curb Indian attacks on American settlements. In return, the United States promised to respect Spanish holdings in North America. The Pinckney Treaty offered the United States unprecedented access to western and southern territories, and it consequently avoided the controversies surrounding the Jay Treaty. The Senate ratified the Pinckney Treaty with minimal debate. The Jay and Pinckney Treaties set precedents for American diplomatic efforts in the early republic. In each case a group of elite American representatives negotiated with their foreign counterparts in search of an agreement that would assure stability in European-American relations and U.S. domination on the North American continent. President Thomas Jefferson's treaty with Napoleon Bonaparte in 1803 accomplished both ends. Despite his revulsion at the despotism of the French emperor, Jefferson purchased the vast Louisiana Territory from Napoleon at the cost of $15 million. The new lands—828,000 square miles—provided room for America to grow and expand westward relatively free from the warfare that convulsed Europe at the time. Jefferson's distrust of a strong central government did not stop him from concluding a treaty that doubled the size of the United States and asserted a presidential right to transform the shape of the country. The Treaty of Ghent, signed in 1814 at the conclusion of America's ill-considered War of 1812 with Great Britain, acknowledged U.S. predominance in North America. It also marked the end of Anglo-American hostilities. The so-called "special relationship" between leaders in Washington and London—based on general amity, trust, and cooperation—began in nascent form with the signing of this treaty. Great Britain continued to assert a right of impressment over American shipping, but after 1814 London rarely exercised this prerogative. The United States, in return, pledged not to attack British-controlled Canada, as it had during its struggle for independence and during the War of 1812. Treaties negotiated by the U.S. government between 1814 and 1848, including the Webster-Ashburton Treaty of 1842 and the Oregon Boundary Treaty of 1846, secured further expansion of American territorial holdings without jeopardizing British claims in Canada. The Treaty of Guadalupe Hidalgo, signed at the conclusion of the Mexican-American War in 1848, provided the United States with possession of present-day California, Arizona, Nevada, and Utah, as well as parts of New Mexico, Colorado, and Wyoming. In return the administration of President James K. Polk paid Mexico a paltry $15 million and promised not to annex any further Mexican territory, despite contrary pressures from many American citizens. By the middle of the nineteenth century America had, through warfare and treaty making, established itself as a colossal land power stretching from the Atlantic to the Pacific Ocean. The nation's asserted Manifest Destiny to dominate the continent reflected racial, religious, and cultural assumptions of American superiority that found

their way into the treaties of the period. Time and again, American leaders asserted their right to expand. Time and again, they laid claim to territories they had never before controlled. The non-Americans—Indians, Mexicans, and others—who resided in many of the new U.S. territories received little voice in the treaties negotiated during this period.

Treaties and American Overseas Expansion
After the conclusion of the Civil War in 1865, U.S. treaties focused on expanding American economic, political, and cultural interests outside of North America. In 1867 Secretary of State William Henry Seward secured a treaty with Russia, which agreed to sell the territory of Alaska to the United States for $7.2 million. Seward foresaw that this northern "icebox" would provide important natural resources and help extend American economic interests across the Pacific Ocean. The U.S. Senate almost rejected this treaty, as it rejected many of Seward's other expansionist schemes. Nonetheless, the Alaska treaty created a precedent for American overseas expansion that would slowly reach its fruition around the end of the nineteenth century. Following a few short months of warfare with the overextended and declining Spanish Empire, at the end of 1898 the United States secured the Treaty of Paris with Madrid's representatives. By the terms of this treaty, Spain vacated its colony in Cuba, acknowledging America's sphere of influence in the area. The Spanish also ceded Puerto Rico, Guam, and the Philippine archipelago to the United States. With Senate approval in early 1899, these islands became America's first extended foreign colonies. The provisions for American occupation of the Philippines allowed President William McKinley to wage forty-one months of brutal ground warfare against a native Filipino resistance. By 1902, when the American counterinsurgency forces asserted nearly complete control over the archipelago, as many as twenty thousand Filipino rebels had died opposing American colonialism. Some 4,200 Americans also perished in this battle to enforce U.S. occupation under the terms of the Treaty of Paris. Many Americans, the so-called anti-imperialists, opposed U.S. military activities in the Philippines, but President McKinley acted with the legitimacy provided by the treaty with Spain. Following the Panamanian Revolution of 1903, the administration of President Theodore Roosevelt used a similar tack. Secretary of State John Hay negotiated the Hay-Bunau-Varilla Treaty with the newly created state of Panama in the same year. The treaty granted the United States the right to construct and operate an isthmian canal linking the Caribbean Sea with the Pacific Ocean. When completed in 1914, the fifty-one-mile canal allowed ships to travel between the Atlantic and Pacific Oceans, saving the thousands of miles required to circumnavigate South America before the existence of this passage. The new transport route greatly facilitated trade between the productive eastern seaboard of the United States and the large markets of Asia. The Hay-Bunau-Varilla Treaty allowed for American construction of and control over the Panama Canal. After many subsequent treaty revisions—the most significant in 1977—the Panamanian government attained sovereignty over the canal zone in 2000. The treaties negotiated by the United States with Russia, Spain, and Panama after the Civil War indicated that American interests had extended far beyond the North American continent and its established trading routes with Europe. An industrializing nation that had reached the end of its western frontier looked overseas for new markets and strategic possessions. American foreign expansion occurred primarily through treaties negotiated with declining empires (Spain), established states seeking new allies (Russia), and new regimes subject to foreign pressure (Panama). U.S. imperialism in the late nineteenth and early twentieth centuries was relatively costless for Americans because their leaders attained so much at the negotiating table.

Multilateral Treaties and a Liberal International Order
In the aftermath of World War I, many Americans sought new mechanisms for building international cooperation and averting future military conflicts. President Woodrow Wilson called for a new kind of diplomacy that rejected the competing alliances, autocracies, and arms races of old. Instead, he argued that only what he called a League of Nations could promise free trade, collective security, and long-term international stability. America had entered World War I to "make the world safe for democracy," according to Wilson, and he sought a peace treaty at the close of hostilities that carried this vision to fruition. At the Paris Peace Conference in 1919 Wilson pressured his allied counterparts—particularly Georges Clemenceau of France, Vittorio Orlando of Italy, and David Lloyd George of Great Britain—to formulate a treaty that emphasized European reconstruction and cooperation rather than war revenge. The American president succeeded only in part, but he did manage to include a covenant creating a League of Nations in the final treaty authored largely by France, Italy, Great Britain, and the United States. On 28 June 1919 the defeated leaders of Germany signed the treaty at the Château de Versailles outside Paris, the site of more than five months of heated negotiations on the text of what became known as the Versailles Treaty. In the next year, rancorous Senate debate resulted in the rejection of the Versailles Treaty by the United States. Despite President Wilson's tireless public speeches on behalf of the treaty, isolationists, led by Republican senator Henry Cabot Lodge, managed to depict Wilson's League of Nations as a harmful encumbrance that would embroil Americans in additional overseas difficulties. Lodge and his colleagues added numerous reservations to the treaty that would restrict American participation in the League.

On 19 March 1920 these reservations and the Versailles Treaty itself failed to receive the necessary two-thirds majority in the Senate. An odd collection of Republican isolationists and Democratic supporters of Wilson’s original proposal had prohibited American participation in a nascent liberal international order. Through the 1920s and 1930s this isolationist sentiment spurned official U.S. alliance with foreign powers. Washington did, however, enter into a series of multilateral treaties aimed at naval disarmament (the Washington Treaty of 1921 and the London Treaty of 1930) and outlawing war (the Kellogg-Briand Pact of 1928). These treaties had few enforcement mechanisms, but they sought to guarantee a peaceful and open climate for American businesses that were then expanding their activities overseas. World War II illustrated the shortcomings in these platitudinous treaties. When Germany, Italy, and Japan began to pursue militaristic policies in the early 1930s, the international community lacked the legal mechanisms and political will to react with necessary force. Without American participation, the League of Nations was a very weak reed. Without forceful penalties for treaty violations, the fascist powers were not deterred from attacking neighboring states. During the course of World War II, many Americans vowed to correct the mistakes of the past. President Franklin Roosevelt made it clear that the war would only end with the unconditional surrender of the fascist powers and the creation of a new series of international treaties that guaranteed, with force, the kind of liberal international order envisioned by Wilson. In particular, Roosevelt called for a United Nations that would include a Security Council of the great powers, capable of employing force for collective security. The United Nations Charter, signed in San Francisco on 26 June 1945, made this vision into a reality. In contrast to its rejection of the League of Nations in 1920, on 28 July 1945 the U.S. Senate approved the United Nations Charter by a vote of 89 to 2. A series of arrangements for multilateral economic cooperation came to fruition around this same time. The Bretton Woods agreements of 1944 stand out because they created the International Bank for Reconstruction and Development (the World Bank) and the International Monetary Fund (IMF), both designed to regulate and support capitalist wealth creation across the globe. At the center of these new international institutions, the United States took on an unprecedented role as the primary financier for global economic exchanges. Unlike the UN Charter, the groundbreaking Bretton Woods agreements were not handled as treaties, but rather as economic legislation, in the U.S. House of Representatives and Senate. At the time, international economics did not attract the same high political attention as issues of military security. Cold War hostilities between the United States and the Soviet Union distorted American multilateralism. Af-

ter 1945 U.S. treaties focused on building collective security alliances with states imperiled by communist infiltration and possible invasion. The North Atlantic Treaty Organization (NATO), created in 1949, provided for mutual security and close military cooperation among Western Europe, Canada, and the United States. Each government pledged that it would regard an attack on one member as an attack on all. By approving NATO the Senate affirmed a new bipartisan anticommunist consensus in the United States. In place of prior isolationist urgings, American politicians firmly committed themselves to the containment of communism abroad through extensive and long-term U.S. military and economic intervention. The creation of NATO marked the end of American isolationism by treaty. In the 1950s the United States extended the NATO precedent to other areas of the world. In 1954 it joined Great Britain, France, Australia, New Zealand, the Philippines, Thailand, and Pakistan in the creation of the Southeast Asia Treaty Organization (SEATO). America also concluded a mutual defense treaty with the Guomindang government of Taiwan in late 1954. SEATO and the Taiwan treaty pledged their signatories to cooperate for mutual defense against communist threats. The treaties also obligated the governments to promote free markets and democracy. SEATO commitments contributed to increasing American intervention in Southeast Asia after 1954. This was particularly true in Indochina, where the United States became the chief sponsor of an anticommunist South Vietnamese government. Belligerent autocrats in South Vietnam—as well as in Taiwan and Pakistan—used their nations’ treaties with the United States to call upon American military support for anticommunist warfare rather than domestic development and democratization. The failure of U.S. military activities in Vietnam between 1965 and 1975 proved that SEATO and other treaties had misdirected American policies. In the aftermath of the Vietnam War, the United States shied away from new treaties of defensive alliance. Instead, American officials focused on arms control in their attempts to reduce tensions with the Soviet Union. In 1972 the two superpowers signed the Strategic Arms Limitation Treaty (SALT I), which for the first time limited the future construction of nuclear weapons delivery systems. It also included an Anti-Ballistic Missile (ABM) Treaty that prohibited the two governments from building more than two missile defense sites. In 1974 they reduced this limit to one missile defense site for each nation. SALT II, signed in 1979, pledged the two superpowers to additional limits on nuclear weapons delivery systems. President James Earl Carter withdrew the treaty from Senate consideration after the Soviet invasion of Afghanistan in December 1979, but his successor, Ronald Reagan, voluntarily followed through on the SALT II limitations. Despite Reagan’s assertion that the Soviet Union was an “evil empire,” he pressed forward with arms

control negotiations. The world was too dangerous to do otherwise, and the treaties of the 1970s had attracted strong support among citizens, intellectuals, and policymakers. Reagan negotiated the most far-reaching arms control treaties of any American president. The Intermediate-Range Nuclear Forces (INF) Treaty of 1988 eliminated an entire group of weapons for the first time: the intermediate-range and shorter-range nuclear missiles stationed by both superpowers in Europe. In 1991 Reagan's successor, George H. W. Bush, concluded negotiations for the Strategic Arms Reduction Treaty (START I), which reduced both American and Russian nuclear arsenals by 30 percent. These treaties contributed to the peaceful end of the Cold War. In the post–Cold War world, where America's vision of a liberal international order appears triumphant, U.S. leaders have proven unsure about future treaty negotiations. Presidents William Jefferson Clinton and George W. Bush have pursued policies embracing both American unilateralism and international cooperation. President Bush, for example, withdrew from the ABM Treaty in 2001 while he was managing an international coalition of states fighting terrorist influences in Afghanistan and other areas. American presidents often prefer to act without the restrictions and senatorial oversight of treaty negotiations. This is likely to remain true in the twenty-first century, but future leaders will surely rely on treaties to affirm serious and long-standing political, military, and economic commitments abroad.

BIBLIOGRAPHY

Bundy, McGeorge. Danger and Survival: Choices about the Bomb in the First Fifty Years. New York: Random House, 1988.
Cooper, John Milton, Jr. Breaking the Heart of the World: Woodrow Wilson and the Fight for the League of Nations. New York: Cambridge University Press, 2001.
Dallek, Robert. Franklin D. Roosevelt and American Foreign Policy, 1932–1945. New York: Oxford University Press, 1979.
Gaddis, John Lewis. Strategies of Containment: A Critical Appraisal of Postwar American National Security Policy. New York: Oxford University Press, 1982.
Garthoff, Raymond L. Détente and Confrontation: American-Soviet Relations from Nixon to Reagan. Rev. ed. Washington, D.C.: Brookings Institution, 1994.
———. The Great Transition: American-Soviet Relations and the End of the Cold War. Washington, D.C.: Brookings Institution, 1994.
LaFeber, Walter. The American Search for Opportunity, 1865–1913. New York: Cambridge University Press, 1993.
McMahon, Robert J. The Limits of Empire: The United States and Southeast Asia since World War II. New York: Columbia University Press, 1999.
Perkins, Bradford. The Creation of a Republican Empire, 1776–1865. New York: Cambridge University Press, 1993.

C. H. McLaughlin
Jeremi Suri

See also Bretton Woods Conference; Cold War; Ghent, Treaty of; Guadalupe Hidalgo, Treaty of; Internationalism; Jay’s Treaty; Louisiana Purchase; Manifest Destiny; North Atlantic Treaty Organization; Paris, Treaty of (1783); Paris, Treaty of (1898); Pinckney’s Treaty; Southeast Asia Treaty Organization; Strategic Arms Limitation Talks; United Nations; Versailles, Treaty of.

TREATIES WITH INDIANS. See Indian Treaties.

TREATIES, NEGOTIATION AND RATIFICATION OF. A treaty is a formal agreement between two or more countries. Article II, Section 2, of the Constitution gives the president the "Power, by and with the Advice and Consent of the Senate, to make Treaties, provided two thirds of the Senators present concur." Although the drafters of the Constitution intended for the president and the Senate to collaborate in negotiating and ratifying treaties, throughout U.S. history the responsibility for treaty making has rested with the chief executive. In the United States, only the federal government can make treaties with other nations. Article I, Section 10, of the Constitution provides that "No State shall enter into any Treaty, Alliance, or Confederation" nor "without the Consent of Congress . . . enter into any Agreement or Compact with another State, or with a foreign Power." There are five stages in arriving at a treaty. In the first stage, the president prepares instructions about the terms of the treaty. The president assigns a representative to negotiate the agreement with counterparts from the other nation or nations, and the president then signs the draft of the treaty. In the second stage, the president submits the treaty to the Senate for its consideration. The Senate can consent to the treaty, reject it, block it by tabling it, or consent with reservations. If the Senate consents, the president proceeds to the third stage, known as ratification. In the fourth stage, the president exchanges ratifications with the co-signing country. The U.S. Department of State and American diplomats abroad typically handle this step. In the fifth and final stage, the president proclaims the treaty the law of the land. If the Senate refuses to consent to the treaty, the process is halted and the president cannot ratify the agreement. Or, if the Senate attaches reservations or amendments to the treaty, the president may accept or reject them. Before 1945, the Senate approved without change roughly seventy-two percent of the treaties submitted to it. Since World War II, however, presidents have evaded Senate oversight of treaty making by entering into what are called "executive agreements" with foreign nations. These agreements do not have the force of law but are generally binding on the courts while they are in effect, which is the term in office of the president who made them. Executive agreements have varied widely in importance. Some have concerned inconsequential matters, such

as adjusting the claim of an individual citizen. Or they have involved routine diplomacy, such as recognizing a government. However, many of America's most significant international accords have been in the form of executive agreements. Examples are the Open Door Notes (1899) concerning American trade in China, the exchange of American destroyers for access to British military bases (1940), and the Yalta and Potsdam agreements at the close of World War II. Since World War II, the number of executive agreements has far exceeded the number of treaties. From time to time, Congress has tried to limit the president's ability to enter into such agreements. Sen. John W. Bricker of Ohio launched the most ambitious attempt to curtail what he perceived as a usurpation of power by the executive branch. In 1953, Bricker proposed a constitutional amendment that would give Congress the power to "regulate all Executive and other agreements with any foreign power or international organization." The amendment failed in the Senate by one vote in February 1954 and was never passed.

Trade and Territory
For much of American history, U.S. treaty making has primarily involved two areas of interest: the promotion of overseas business and the acquisition of land across North America. During the early decades of the Republic, American leaders sought trade and territory while dealing with the unfinished business of the Revolutionary War. The most important treaties signed by the United States in the eighteenth century were Jay's Treaty (1794) and the Pinckney Treaty (1795), which established peaceful relations with Britain and Spain. Despite the formal end to warfare between the United States and England (Treaty of Paris, 1783), relations between the revolutionary upstart and the Mother Country remained poor. In 1794, President Washington appointed John Jay to negotiate a settlement of American and British grievances to avert another war. Under the terms of the Jay Treaty, signed on 19 November 1794, the central source of friction was removed when Britain agreed to cede control of military forts on the northwestern frontier to the United States. The United States agreed to grant England most-favored-nation trading status. Under the Pinckney Treaty with Spain, the border separating the United States and Spanish Florida was established at the thirty-first parallel. The United States also gained vital trading rights along the Mississippi River in addition to the right of deposit at the port of New Orleans. By virtue of these two treaties, the United States could reasonably expect to extend its grasp as far west as the Mississippi and south to include Florida. During the next half century, those territorial aims were exceeded as

a result of the Louisiana Purchase (1803) and later annexation treaties. In the Louisiana Purchase, considered one of history's greatest land deals, President Thomas Jefferson spent $15 million to buy 828,000 square miles of North American land from Napoleon Bonaparte, the French emperor. With the purchase, the United States doubled in size and extended its domain west to the Rocky Mountains, north to Canada, and south to the Gulf of Mexico. The deal was signed on 30 April 1803. President James Monroe's secretary of state, John Quincy Adams, subsequently obtained Florida from Spain in the Adams-Onís Treaty, signed on 22 February 1819. In 1844, President John Tyler signed a treaty with the leaders of the breakaway Republic of Texas to bring that former Mexican territory into the Union. As a result of anti-slavery opposition in the Senate, the treaty was rejected. However, Tyler was able to obtain Texas the following year when Congress approved the annexation through a joint resolution. Because Mexico did not accept the loss of Texas, war broke out between the neighboring countries. These hostilities were ended with the Treaty of Guadalupe Hidalgo (2 February 1848), which not only secured the American claim to Texas but added California and New Mexico to the United States. With the ratification of the Gadsden Purchase Treaty with Mexico in 1854, America gained the southernmost areas of Arizona and New Mexico. Secretary of State William Seward's Alaska Purchase Treaty with Russia in 1867 was the United States' last major land deal. In the years preceding the Civil War, the United States expanded its overseas trade via treaties with England, Russia, Siam, China, Hawaii, and Japan. To increase business in Latin America, U.S. leaders signed a series of accords establishing the right to trade across the strategic Isthmus of Panama and eventually build a canal in one of the Central American countries.

Overseas Expansion and International Agreements
At the end of the nineteenth century, the United States became one of the world's great powers by virtue of its growing overseas commerce. Having become a great power, the United States began acting more like one. The naval fleet was vastly enlarged, and efforts were made to obtain territory abroad. Many of the treaties signed by the U.S. during this period resulted from this new imperialism. Some of these treaties were hotly debated in the Senate, reflecting the limits to popular support for the notion of American colonial expansion. In the peace treaty signed by the United States and Spain after t