The United States in the World Wars
Kara Dixon Vuic, Texas Christian University
Wars dominated American history in the twentieth century. The century began and ended with the U.S. military engaged in wars on other continents. Men faced the possibility of compulsory military service for much of the century, while military service shaped notions of citizenship for all Americans, whether they served or not. A wartime economy—even in times of peace—framed American business, labor, and politics. Military service played a crucial role in civil rights movements for African Americans, women, gays and lesbians, and other marginalized groups. This seminar will consider the broad influence of wars and the military on U.S. society, politics, and culture through a focused look at American involvement in World War I and World War II. The readings reflect recent trends in the history of the wars, especially those that connect military history to social, gender, and cultural history.