“Another Brick in the Wall”: American Children Go to War

“WHAT ARE YOU DOING? Have you started a garden? Are you helping to win the war? Everybody must work and work hard. Soldiers and sailors cannot fight without the help of the rest of us…When the count is made, on the roll of National Service, will you be PLUS one, or MINUS one?”[1]

American men fought, but the entire nation went to war in World War I. War altered all aspects of American life. Some civil liberties, like freedom of the press, were restricted in the name of national security. The draft created soldiers out of male citizens. The federal government gained unprecedented oversight of domestic production, foreign trade, and other economic areas vital to mobilizing and maintaining the war effort. Defeating Germany required the work and sacrifices of every American, a charge they were reminded of at every turn.

Originally intended as a liaison between the government and the press, George Creel’s Committee on Public Information (CPI) soon turned to spreading the gospel of patriotism. The passage above is typical of media at the time; what is a bit atypical (at least to modern eyes) is that this article was published in Boy’s Life: The Boy Scouts’ Magazine. The effort to “draw children to military values and service” led directly to what Dr. Ross F. Collins termed the “militarization of American childhood.”[2]

Magazines such as Boy’s Life, St. Nicholas Magazine for Boys and Girls, and American Boy linked children’s activities directly to the success of the soldiers fighting overseas. They encouraged planting gardens, buying liberty bonds, volunteering for the American Red Cross, and other activities. Idle American hands were not just the devil’s workshop, but the enemy’s. The magazines depicted war as a patriotic and heroic duty, soldiers as valiant adventurers, and the men, women, and children back home as the first line of national defense in the soldiers’ absence.

Even vocabulary and syntax changed to reflect the urgency of the message. Sentences became shorter and often shifted into direct second-person address, something unheard of in today’s more detached third-person journalistic style. For example, an article on pest control in home gardens begins, “The war is on. You have enlisted as a gardener…Mobilize your forces. Get a store of ammunition (arsenate of lead and the other poisons), get a machine gun or two (hand sprayers), and post your guards…”[3] Children listened: growing, saving, doing, and doing without as directed. An article entitled “How the Boy Scouts Help in the War” even described a group of Boy Scouts who formed the Boy Scout Emergency Coast Patrol and volunteered for service at home “to protect their mothers and sisters.”[4]

Of course, most of American children’s education in war did not come from their voluntary consumption of popular media. Wholesale militarization could only be achieved through compulsory education on the virtues of war. The lessons in patriotism and civic duty taught at home were reinforced at school. A former educator himself, President Woodrow Wilson understood the power of the captive audience of a classroom. The CPI began publishing a bimonthly newsletter, National School Service, to instruct teachers on how to teach the war and, more importantly, to emphasize the importance of every citizen’s support, regardless of age. “There may be those who have doubts as to what their duty in this crisis is,” wrote Herbert Hoover in the inaugural edition of the newsletter, “but the teachers cannot be of them.”[5]

The newsletter urged teachers to encourage their students to participate in many of the activities celebrated in the aforementioned children’s magazines. “War savings stamps, food and fuel economy, the Red Cross, (and) the Liberty Loan, are not intrusions on school work,” explained one article. “They are unique opportunities to enrich and test not knowledge, but the supreme lesson of intelligent and unselfish service.”[6] This idea dovetailed neatly with the president’s belief in the subordination of individual agendas and gains to those of society as a whole, the need for the interest of every citizen to be “consciously linked with the interest of his fellow citizens, (his) sense of duty broadened to the scope of public service.”[7] Like their students, teachers listened, but they faced a stiff penalty if they refused. “Teachers who remained neutral concerning patriotism could be fired, as ten were in New York City, of hundreds in many incidents across the nation,” explained Collins in Children, War, and Propaganda.[8]

“It is not the object of this periodical to carry the war into the schools. It is there already…There can be but one supreme passion for our America; it is the passion for justice and right…for a world free and unfearful,” explained Guy Stanton Ford, director of Civic and Educational Publications for the CPI.[9] Wilson echoed Ford’s ideas, saying, “it is not an army that we must shape and train for war; it is a nation.”[10] War mobilization militarized every aspect of American life; childhood was no exception. “Children were exhorted to sacrifice individuality for the group and pleasure for work; in essence, to take on the previous ‘adult’ role of responsible worker,” wrote Andrea McKenzie in “The Children’s Crusade: American Children Writing War.”[11] Once this facet of innocence was lost, it could not be restored. Children did not fight in World War I, but their childhood experiences would shape their response when their nation called them to the front lines in the 1940s. The “greatest generation” came of age believing their parents’ war was also theirs. They prepared from childhood to serve in their own.


[1] “How the Boy Scouts Help in the War,” Boy’s Life: The Boy Scouts’ Magazine 1, no. 1 (June 1917), 42, Boyslife.org Wayback Machine, accessed October 31, 2016, http://boyslife.org/wayback/.

[2] Ross F. Collins, “This is Your War, Kids: Selling World War I to American Children,” North Dakota State University PubWeb, accessed November 1, 2016, https://www.ndsu.edu/pubweb/~rcollins/436history/thisisyourwarkids.pdf, 3; 24.

[3] “The Second Phase of the War,” Boy’s Life: The Boy Scouts’ Magazine 1, no. 1 (June 1917), 34, Boyslife.org Wayback Machine, accessed October 31, 2016, http://boyslife.org/wayback/.

[4] “How the Boy Scouts Help in the War,” 7.

[5] Herbert Hoover, “Hoover Commends Teachers,” National School Service 1, no. 1 (September 1, 1918), 1, National School Service 1918-1919, Kindle edition. 

[6] Guy Stanton Ford, “National School Service,” National School Service 1, no. 1 (September 1, 1918), 8, National School Service 1918-1919, Kindle edition.

[7] Henry A. Turner, “Woodrow Wilson and Public Opinion,” The Public Opinion Quarterly 21, no. 4 (Winter, 1957-1958): 505-520, accessed October 31, 2016, http://www.jstor.org/stable/2746762, 507.

[8] Ross F. Collins, Children, War, and Propaganda (New York: Peter Lang, Inc., 2011), accessed October 31, 2016, http://www.childrenwarandpropaganda.com.

[9] Ford, “National School Service,” 9.

[10] Woodrow Wilson, quoted in National School Service, 9.

[11] Andrea McKenzie quoted in Ross F. Collins, “This is Your War, Kids: Selling World War I to American Children,” 10.

Sources Cited

Boy’s Life: The Boy Scouts’ Magazine 1, no. 1 (June 1917). Boyslife.org Wayback Machine. Accessed October 31, 2016. http://boyslife.org/wayback/.

Collins, Ross F. Children, War, and Propaganda. New York: Peter Lang, Inc., 2011. Accessed October 31, 2016. http://www.childrenwarandpropaganda.com.

——. “This is Your War, Kids: Selling World War I to American Children.” North Dakota State University PubWeb. Accessed November 1, 2016. https://www.ndsu.edu/pubweb/~rcollins/436history/thisisyourwarkids.pdf.

National School Service 1, no. 1 (September 1, 1918). National School Service 1918-1919. Kindle edition.

Turner, Henry A. “Woodrow Wilson and Public Opinion.” The Public Opinion Quarterly 21, no. 4 (Winter 1957-1958): 505-520. Accessed October 31, 2016. http://www.jstor.org/stable/2746762.

Truth, Justice, and the American Way: Japanese-American Internment in the “Good War”

Like President Woodrow Wilson before him, Franklin Delano Roosevelt urged the American people to avoid scapegoating citizens of German, Italian, and Japanese descent as the United States entered war. After the Japanese attack on Pearl Harbor on December 7, 1941, however, FDR’s reassurance that “the only thing we have to fear is fear itself” rang hollow.[1]

As the nation went to war with the Empire of Japan, distrust and fear spread to Americans of Japanese descent. On Valentine’s Day, 1942, General John DeWitt warned the president of the threat posed by Japanese Americans in the wake of Pearl Harbor, saying, “In the war in which we are now engaged, racial affinities are not severed by migration. The Japanese race is an enemy race.”[2] The lack of concrete evidence supporting DeWitt’s claims did not discredit them. Internment supporters like California Attorney General Earl Warren construed it as proof that attacks were imminent, believing the absence of fifth column activity was “the most ominous sign in our whole situation…Our day of reckoning is bound to come.”[3] Five days later, bowing to opinions like those of DeWitt and Warren, Roosevelt signed Executive Order 9066, creating “military areas” from which Japanese-American men, women, and children could be excluded and interned. The reckoning had come, but only for a very specific portion of the population. Over one hundred thousand people of Japanese descent, the majority of them American citizens, were forcibly removed from their homes and relocated to internment camps for the duration of the war. Their belongings, livelihoods, and civil liberties were forfeited because local military officials believed their presence near military bases and installations in the West presented a threat to national security. Fear took precedence over the protection of constitutionally guaranteed civil liberties.

The Japanese attack on Pearl Harbor did not create the racism towards the Japanese-American community; it gave it teeth. Historian Roger Daniels argues the discrimination against Americans of Japanese descent began with the settling of the West in the mid-to-late nineteenth century. Racism towards Chinese settlers flowed naturally to the Japanese because they shared similar physical characteristics, and competition for limited land and employment opportunities heightened conflicts between ethnic groups. Discriminatory laws governing land ownership and restricting other ways Japanese-Americans could participate in society left them on the periphery, creating and bolstering barriers to their integration. During World War II, this outsider status would be offered as “evidence” of their unwillingness to assimilate into American society.

At the same time Japanese-Americans were being interned in the name of national security, scientists were forging a weapon that would change the world. On July 16, 1945, J. Robert Oppenheimer and the other members of the Manhattan Project (the code name for the effort tasked with developing the weapon) were among the first to witness the explosion of an atomic bomb in the desert near Alamogordo, New Mexico. Even in this barren and desolate space, the raw power of the bomb and its 40,000-foot mushroom cloud could not be denied. Oppenheimer later said the moment brought to mind a line from the Hindu scripture the Bhagavad Gita: “Now I am become Death, the destroyer of worlds.” They knew the bomb would be destructive. They knew it was a power unlike any other known. They also knew it was developed for one reason: to demonstrate an awe- and fear-inspiring weapon that would finally put an end to the world war.

Less than a month later, on August 6, the U.S. dropped the bomb on the Japanese city of Hiroshima. Why particular cities were chosen has inspired much debate. The bombs were meant to achieve two main goals: 1) force unconditional surrender by the Japanese, and 2) demonstrate the weapon’s power to the world. Much of Japan had already been bombed; Tokyo was largely destroyed, so bombing it would not showcase the bomb’s power dramatically enough. Other strategic targets introduced further issues: no one knew what would happen if an atomic bomb exploded near arms dumps or other munitions stores. Hiroshima was a military target, but also small and compact enough to be completely destroyed by one bomb. This was the horribly eloquent demonstration the United States was looking for. A second atomic bomb was dropped on the shipbuilding city of Nagasaki three days later.

But why drop a second bomb? Again, this is something historians continue to debate. The simplest answer, and the answer repeated by many of the soldiers and airmen who participated, was that the official instruction was to drop two bombs. The military had developed two bombs, the order was to drop two bombs, so two bombs were dropped. We also have to remember that the Empire of Japan did not surrender after Hiroshima; one atomic bomb had not achieved the objective. Even after the bombing of Nagasaki, Japanese war council members disagreed about whether the war should continue. In the end, Emperor Hirohito gave permission for unconditional surrender on August 14, 1945.

The United States was no longer at war, but the nation would fight about many of the decisions made during the war, like the internment of American citizens and the deployment of atomic weapons, for years to come. The dropping of the atomic bomb separated the history of human warfare into before Hiroshima and after Hiroshima. There was no turning back. Nuclear armament became central to domestic security and a path to national legitimacy. Before mutually assured destruction was a tagline or cornerstone, it was a harsh realization. Albert Einstein described it best when he said, “I know not with what weapons World War III will be fought, but World War IV will be fought with sticks and stones.”

This was not enough to stop the proliferation and development of new weapons. In 1952, the United States developed and tested a hydrogen bomb over 1,000 times more powerful than the bomb dropped on Hiroshima. Though no nation has used a nuclear weapon in war since, the threat remains. North Korea tested a hydrogen (thermonuclear) bomb in September 2017, and it continues to test-launch missiles, including intercontinental ballistic missiles that can carry thermonuclear warheads. The status of Russia’s arms program and questions about Putin’s willingness to use its nuclear arsenal also give pause. India and Pakistan, enemies whose next war seems only to await a new flashpoint, are both nuclear powers. Policy makers and international leaders hope the threat of a fourth world war fought with sticks and stones will be enough to convince current leaders that nuclear war, though powerful and final, is not the best way to settle their differences.

The legacy of Japanese-American internment was not international like that of the bombings of Hiroshima and Nagasaki, but it did cause many ripples at home. Reverend Emery Andrews, writing in 1943, said “future historians will record this evacuation–this violation of citizenship rights–as one of the blackest blots on American history; as the time that democracy came the nearest of being wrecked.”[4] Andrews was correct, but acknowledgment took over forty years. In the 1980s, federal courts vacated the convictions at the heart of the wartime decisions Korematsu v. United States (1944) and Hirabayashi v. United States (1943). The original decisions affirmed the President’s (and Congress’s) ability to curtail constitutional rights to protect the nation in a time of war. Korematsu II vacated the conviction of Fred Korematsu, but it did not vacate the case law. The precedent still stands and can be, and is, used to justify taking away citizens’ constitutional rights during a national “emergency.” In 1988, President Ronald Reagan officially apologized for WWII internment and signed the Civil Liberties Act, which granted monetary reparations to internment survivors and family members. The speeches and checks did not fully atone for what Japanese-Americans experienced during the war, but the effort did silence much of the discussion on the topic.

A terrorist attack on American soil that recalled Pearl Harbor in destruction and effect reopened the debate. On September 11, 2001, two hijacked planes crashed into the World Trade Center in New York City, bringing down both towers and killing thousands. A third hijacked plane crashed into the Pentagon outside Washington, D.C., and a fourth crashed in Pennsylvania, its target presumed to be the White House. Though the attacks were planned and perpetrated by al Qaeda, a militant terrorist organization disavowed by the majority of Muslims worldwide, Arab- and Muslim-Americans became scapegoats for the tragedy. Anti-Muslim hate crimes skyrocketed between 2000 and 2001, increasing by 1,700 percent.[5]

Historians did not miss the similarities between how Japanese-Americans were perceived and treated after Pearl Harbor and how Arab- and Muslim-Americans were perceived and treated after 9/11. Many expressed concern that the United States continually follows the same pattern of uniting behind a common hatred in the name of domestic security. Recasting the past as a warning to the present, historians focused on how the combination of unchecked racism and the expansion of presidential power allowed Japanese-American citizens’ constitutional rights to be sacrificed in the name of national security.

The American memory of WWII tends to focus on our successes, not our failures. The defeat of the Axis powers and the liberation of Nazi concentration camps amplify the good in the “good war” but overshadow the ways the United States distorted some democratic ideals and practices to achieve them. Historians agree Japanese-American internment was “our worst wartime mistake,” yet America tends to consolidate national unity through common hate in times of crisis. Americans have tried hard to forget the internment of citizens of Japanese descent during World War II, leading many historians to fear it could, and will, happen again. Though the “military necessity” of internment was never proven, and a federal commission later concluded it was based on “race prejudice, war hysteria and a failure of political leadership,” the legal precedent stands.[6] As historian John Dower wrote, “war hates and race hates do not go away; rather, they (go) elsewhere.”[7]

KMS 2019

Sources Cited

Daniels, Roger. Prisoners Without Trial: Japanese Americans in World War II (Hill and Wang Critical Issues). Rev. ed. New York: Hill and Wang, 2004.

Dower, John W. War Without Mercy: Race and Power in the Pacific War. New York: Pantheon Books, 1986.

Gee, Harvey. “Habeas Corpus, Civil Liberties, and Indefinite Detention during Wartime: From Ex Parte Endo and the Japanese American Internment to the War on Terrorism and Beyond.” The University of the Pacific Law Review 47 (2016): 792-838. Accessed October 31, 2016. HeinOnline.

Giannis, Joshua. “The Court, The Constitution, and Japanese-American Internment.” Stanford Journal of East Asian Affairs (Summer 2011): 87-96. Accessed October 31, 2016. https://web.stanford.edu/group/sjeaa/journal111/Japan4.pdf.

Khan, Mussarat, and Kathryn Ecklund. “Attitudes Toward Muslim Americans Post 9/11.” Journal of Muslim Mental Health VII, no. 1 (2012): 1-16. Accessed December 29, 2016. http://hdl.handle.net/2027/spo.10381607.0007.101.

“Only Thing We Have To Fear Is Fear Itself: FDR’s First Inaugural Address,” History Matters, accessed December 6, 2016, http://historymatters.gmu.edu/d/5057/.

Malkin, Michelle. In Defense of Internment: The Case for ‘Racial Profiling’ in World War II and the War on Terror. Washington, D.C.: Regnery Publishing, Inc., 2004.

Muller, Eric L. American Inquisition: The Hunt for Japanese American Disloyalty in World War II. Chapel Hill: University of North Carolina Press, 2007.

——. “Indefensible Internment: There was no good reason for the mass internment of Japanese Americans during WWII.” Reason.com (Dec. 1, 2004), accessed December 21, 2016, http://reason.com/archives/2004/12/01/indefensible-internment.

Oluwu, Dejo. “Civil liberties versus military necessity: lessons from the jurisprudence emanating from the classification and internment of Japanese-Americans during World War II.” The Comparative and International Law Journal of Southern Africa 43, no. 2 (July 2010): 190-212. Accessed December 5, 2016. http://www.jstor.org/stable/23253161.

Raico, Ralph. “American Foreign Policy: The Turning Point, 1898-1919.” The Independent Institute (February 1, 1995). Accessed October 21, 2016. http://www.independent.org.

Reeves, Richard. Infamy: The Shocking Story of Japanese American Internment in World War II. New York: Henry Holt and Company, 2015.

Robinson, Greg. A Tragedy of Democracy: Japanese Confinement in North America. New York: Columbia University Press, 2009.

——. By Order of the President: FDR and the Internment of Japanese Americans. Cambridge: Harvard University Press, 2001.

——. “A Critique of Michelle Malkin’s ‘In Defense of Internment’, Part Two.” Modelminority.com, (August 8, 2004), accessed December 21, 2016, https://www.web.archive.org/web/20080919020738/http://modelminority.com/article849.html.

Shaffer, Robert. “Opposition to Internment: Defending Japanese American Rights During World War II.” Historian 61, no. 3 (Spring 1999): 597-618. Accessed November 21, 2016. EBSCOHost.


[1] “Only Thing We Have To Fear Is Fear Itself: FDR’s First Inaugural Address,” History Matters, accessed December 6, 2016, http://historymatters.gmu.edu/d/5057/.

[2] John DeWitt quoted in Richard Reeves, Infamy: The Shocking Story of Japanese American Internment in World War II (New York: Henry Holt and Company, 2015), 41.

[3] Earl Warren quoted in Daniels, 37.

[4] Robert Shaffer, “Opposition to Internment: Defending Japanese American Rights During World War II,” Historian 61, no. 3 (Spring 1999): 597-618, accessed November 21, 2016, EBSCOHost, 598.

[5] Mussarat Khan and Kathryn Ecklund, “Attitudes Toward Muslim Americans Post 9/11,” Journal of Muslim Mental Health VII, no. 1 (2012): 1-16, accessed December 29, 2016, http://hdl.handle.net/2027/spo.10381607.0007.101, 2.

[6] Oluwu, 207.

[7] John W. Dower, War Without Mercy: Race and Power in the Pacific War (New York: Pantheon Books, 1986), 311.

White Christmas, Blueberry Pie, Yellow Peril

In Michael Curtiz’s 1954 film White Christmas, two WWII Army veterans conspire to reunite their entire unit to show their appreciation for the general who led them during the war. Though the story begins in Monte Cassino, Italy, on Christmas Eve, 1944, White Christmas is not a war movie: the war images are muted, the edges softened. It is a movie that reflects what Americans wanted to remember about the war (with singing, dancing, and large set pieces added in for good measure). Home Front U.S.A.: America During World War II author Allan M. Winkler washes his discussion of post-Pearl Harbor America in the same shades of sepia, forcing us to wonder whether he is writing about the war or about what Americans most wanted to remember about the war.

The Japanese attack on Pearl Harbor on December 7, 1941, shocked Americans out of their complacency. Neither the ocean nor pledges of isolationism could keep the country out of the conflict. Winkler writes the attack fostered the “sense of shared purpose” long missing from American popular support for the Allies.[1] Roosevelt’s Office of War Information (created June 1942) capitalized on this wellspring of patriotism by framing the war as supporting “American values and portraying Americans as they wanted to be seen.”[2] As the OWI proclaimed the glories of what the boys abroad were fighting for, including baseball and homemade blueberry pie, the Office of Civilian Defense (1941) stepped up its efforts to educate Americans on their own defense. War bond drives, victory gardens, and other programs allowed those who were not serving overseas to fight the war where they were: every bond, turnip, and scrap of rubber was another strike against the enemy.

Winkler waxes rhapsodic about the war efforts that brought the country together, but we should not lose sight of the fact that post-Pearl Harbor America was united against the Japanese, not united behind the Allies. He writes Pearl Harbor “brought a sense of unity to all Americans,” a generalization that is not supported by historical fact.[3] Racism and prejudice played a deciding role in identifying the “they” who did this to “us.” German Americans and Italian Americans were openly discriminated against, but thousands of Japanese American citizens were forcibly removed to internment camps based on unsubstantiated claims they posed a threat to national security. (Historians have yet to identify credible evidence for the “military necessity” behind Japanese internment, but the nation did not apologize or attempt reparations until the 1980s.) “Anti-Japanese hysteria gripped the home front,” explained Dolores Flamiano in “Japanese American Internment in Popular Magazines: Race, Citizenship, and Gender in World War II Photojournalism.” “Wartime internment stories were rife with racial slurs and stereotypes, which most readers apparently accepted or at least tolerated.”[4] Popular music reflected this trend, with songs like “We’re Gonna Find a Feller Who Is Yeller and Beat Him Red, White, and Blue” sharing air time with Irving Berlin’s “White Christmas.”[5]

Winkler does include the unfair treatment of Japanese Americans during the war in his chapter on “Outsiders and Ethnic Groups,” but his failure to even allude to this development when discussing post-Pearl Harbor war mobilization and propaganda underscores a disconcerting paradox. “For all of the hardships, the United States fared well in World War II,” he writes, but “(n)ot all Americans fared well.”[6] The attack on Pearl Harbor united the country, but in mobilizing American patriotism, it also mobilized American hate. Pearl Harbor’s legacy was a united nation, but a divided people. We entered the war secure in our belief that we would help democracy prevail overseas while playing fast and loose with the civil liberties of entire groups of citizens at home. Historians played into the myth, content to ignore the constitutional concerns raised by Japanese internment until the 1970s and 1980s. American historical memory of the war and the war era became increasingly sepia-toned as we allowed the recollection of our hatred of all things yellow (and black) to fade.

The acknowledgment of our wartime errors does not diminish our successes, but refusal to recognize these errors perpetuates division. Our collective memory of WWII should include Mom’s homemade pie and women’s baseball games, but also the barbed wire surrounding the camps at Manzanar. We are a country that found its place in the world amid dreams of white Christmases, hopes of returning to the comforts of home and blueberry pie, and insecurities that preyed upon our fear and led us to act counter to everything for which we said we stood. To borrow the phrase, the World War II generation can be the “greatest generation” without being understood as a perfect generation.

[1] Allan M. Winkler, Home Front U.S.A.: America During World War II (Wheeling: Harlan Davidson, Inc., 2012), 31.

[2]Ibid., 35.

[3]Ibid., 31.

[4] Dolores Flamiano, “Japanese American Internment in Popular Magazines: Race, Citizenship, and Gender in World War II Photojournalism,” Journalism History 36, no. 1 (Spring 2010): 23-35, accessed October 31, 2016, EBSCOHost, 23; 24.

[5]Winkler, 39.

[6]Ibid., 55; 56.

Sources Cited

Flamiano, Dolores. “Japanese American Internment in Popular Magazines: Race, Citizenship, and Gender in World War II Photojournalism.” Journalism History 36, no. 1 (Spring 2010): 23-35. Accessed October 31, 2016. EBSCOHost.

Winkler, Allan M. Home Front U.S.A.: America During World War II. Wheeling: Harlan Davidson, Inc., 2012.

Kate Murphy Schaefer, 2019

Arming the “Boys”: The Women’s Munition Reserve Seven Pines Bag Loading Plant, Penniman, Virginia

Updated 7/16/21

Mobilization for World War I allowed women previously unheard-of opportunities to take on non-traditional roles. Some served abroad as nurses and yeomen; others took up the ploughshares the men had traded for swords by working on family farms and with the Women’s Land Army. Traditional activities like sewing and knitting also took on new importance as the items were shipped overseas. Women also took over the factory jobs left open by the citizens turned soldiers, helping keep the American war machine rolling.

Beginning in 1915, the DuPont chemical company directed all its manufacturing and production towards the war effort. Social crisis tends to trump political scruples, so the company’s recent antitrust troubles did not hinder its consolidation of a monopoly over American munitions production. DuPont Plant #37, located on the York River near Williamsburg, Virginia, was a shell-filling plant. The company’s Women’s Munition Reserve Seven Pines Bag Loading Plant was located near what is now Sandston, Virginia.

The Seven Pines and Penniman plants paid relatively high wages for hazardous work. Workers were responsible for loading TNT into ammunition shells and bagging gunpowder for shipment. Despite the potential danger, hundreds of men and women flocked to the plants in search of employment. A village quickly grew up around Plant #37 as DuPont constructed 230 houses to entice its workers to live nearby. The village, which became known as Penniman, grew to a population of 10,000–20,000.[1]

Female workers made up most of the workforce at Seven Pines. Women from all walks of life were represented, and it was not unusual for middle- and lower-class women to sew and fill powder bags side-by-side with Virginia’s First Lady, Marguerite Davis.[2] Fashion norms also relaxed a bit as a concession to the war effort. Long skirts were impractical in factories, particularly factories filled with flammable and potentially explosive materials. DuPont issued trousers to the women munitions workers of Seven Pines. To maintain propriety and keep the clothing suitably feminine, the women wore “womanalls” and “trouserettes” as they “stuffed one shell for the Kaiser.”[3]

The inscription on the metal badge housed at the VAARNG Mullins Armory in Richmond, Virginia, reads “WOMEN’S MUNITION RESERVE SEVEN PINES BAG LOADING PLANT.” Badges issued for other DuPont munitions plants took similar forms. Plant badges served several purposes: some practical, some rather grisly. As metal withstands an explosion better than flesh, numbered badges could help identify a worker killed in a plant accident. It is probable the “68” in the middle of the badge was the identification number for a female worker.


Figure 1. Women’s Munition Reserve Seven Pines Bag Loading Plant badge. Photo by author.

Plant badges provided a different kind of protection for male workers. Being branded a “slacker,” a man who did not serve or work towards the war effort, was almost as bad as being German. Wearing a factory badge showed the community you were doing your part.

The Seven Pines plant officially opened in October 1918 with a “Liberty Day” celebration. Less than a month later, the Armistice rendered the plant’s work unnecessary. The Richmond-Fairfield Railway Company bought the properties originally constructed for the workers, building the foundation for a suburb of Richmond: Sandston. Affordable housing and access to jobs in the city allowed workers to find employment as the nation shifted to a post-war economy.

While the Sandston community survived, the community around DuPont Plant #37 did not. The Spanish flu epidemic that raged across the nation also took its toll in Penniman. The local hospital could not keep up with the number of sick men, women, and children who entered its doors. Local coroners and casket makers also struggled to keep up with the dead. When Plant #37 closed its doors, the surviving families left in search of employment, in some cases taking their DuPont-constructed houses with them by floating them down the river. By the mid-1920s, Penniman had disappeared. The Women’s Munition Reserve badge is tangible evidence of a place that can no longer be found on a Virginia map. While most of the women’s individual stories also disappeared, this material evidence preserves the story of another way Virginia women broke through gender boundaries to support their country and their Commonwealth.


Figure 2. Frederic H. Spiegel, 1918. Library of Virginia Special Collections Archive.

Notes

[1] Martha W. McCartney, James City County: Keystone of the Commonwealth (James City County, Virginia: Donning Company Publishing, 1997).

[2] “Virginia Women and the First World War: Records and Resources at the Library of Virginia,” Library of Virginia Archival and Information Services, accessed August 27, 2018, https://www.lva.virginia.gov/public/guides/WomenofWWI.pdf, 2.

[3] Ibid.

Special thank you to the readers who sent me revisions for information that was unclear in my original post. I appreciate your dedication to keeping historians accountable as we endeavor to tell the truth about the past as much as possible.

Book Review: Fly Girls: How Five Daring Women Defied All Odds and Made Aviation History

O’Brien, Keith. Fly Girls: How Five Daring Women Defied All Odds and Made Aviation History. Boston: Houghton Mifflin Harcourt, 2018.

In Fly Girls: How Five Daring Women Defied All Odds and Made Aviation History, Keith O’Brien reminds the reader that only one of the names of the women he profiles will be familiar: Amelia Earhart. Even then, Earhart is known more for her defeat by the air than her conquest of it. The “friendly sky” described by modern commercial airlines is in reality a jealous mistress: aviators who do not give her total attention, or who fail to decipher the roles changing conditions play in flight patterns and aircraft performance, will not enjoy her company very long. The race to tame the sky claimed many lives, male and female.

One of the most poignant episodes in the book comes when an interviewer asks Earhart why she wants to fly. “Why do men ride horses?” she replies. She seems stunned by the idea that women could not share the thirst for adventure felt by men. By the end of the chapter, Earhart’s contribution to her famous 1928 transatlantic flight would be reduced to that of ballast, with several male aviators claiming it would have been better if she had been left behind and two hundred gallons of fuel loaded in her place. Aviator instruction and training for men and women were the same: both had to complete the same education and tasks to earn flying licenses. Flying while female, however, was often seen as a bigger liability than flying while intoxicated. The various commercial schemes women undertook to get into the cockpit also made them appear more interested in fame and fortune than in flying.

Fly Girls lends new lyrics to a familiar tune. As women’s history gains readers and, with them, profitability, we can expect to see many more histories of forgotten women in male-dominated spaces. Women made important contributions to early aviation and would continue to make contributions as pilots, mechanics, and engineers during the world wars. They laid the groundwork for pilots like Tammie Jo Shults, the former Navy fighter pilot who landed a Southwest plane after it lost an engine shortly after takeoff earlier this year. O’Brien’s book also reminds us that for every Shults and Earhart, there are thousands of female pilots who never make it into the papers.

KMS

June 2018

Burning Down the House: Putting American Women in their Place Following WWII

World War II changed a multitude of things, but not American gender norms and stereotypes. The war reinforced the differences between men and women and deepened the power struggle. Allan M. Winkler drew a direct correlation between women’s involvement in the war effort and the development of the women’s rights movement, but this tells only part of the story.[1] It was not participation, but the gender-based barriers and limits to women’s participation in the war effort, that reinvigorated the women’s civil rights movement. “Utilizing American woman power was a matter of military expediency,” wrote Michaela M. Hampf in “‘Dykes’ or ‘Whores’: Sexuality and the Women’s Army Corps in the United States during World War II.”[2] Expediency does not connote acceptance or appreciation, a distinction that followed women throughout the war. “Opponents to even a temporary participation of women felt that not only the efficiency of the military was threatened, but also the traditional system of male dominance and the roles of female homemaker and male breadwinner,” continued Hampf.[3] In other words, women who did not stick to hearth and home were seen as more likely to burn down the house than to keep the home fires burning. The response to the possible subversion of traditional gender roles was an increased effort to keep women in their place.

One effective way to reinforce the traditional structure was to play up the differences between men and women by highlighting the ways women could never measure up to the ideal represented by American manhood. Low wages and low expectations concerning the duration of female employment were blatant reminders of women’s worth in the workplace relative to their male counterparts; other reminders were less transparent. Articles on industry beauty contests, fashion shows, and “war fashion tips for feminine safety” shared pages with war reports in the newsletters of a Pacific Northwest shipyard, for example.[4] These articles framed women workers as both “helpless” and “glamorous,” two decidedly undesirable traits in workers meant to keep the economy and the war effort on track.[5]

Media depictions took contradictory representations of women even further. Women were celebrated in images like “Rosie the Riveter,” but were also prominent in posters warning soldiers of venereal disease, “penis propaganda” that implied any woman could present a threat to manhood.[6] Male promiscuity was excused, accepted, and even expected, but female promiscuity threatened the health of American society and of its fighting men. The “virgin/whore binary” (coined by Lisa Wade in her essay for Sociological Images) was not limited to factory work or propaganda.[7] Women who served in military capacities had to be careful not to be too ambitious lest they be branded as lesbians, prostitutes, or a combination of both. Linda Grant DePauw noted more work has been published on military prostitution than on women who served as combat soldiers during the war.[8] The relative lack of research on women’s combat service compared to their illicit sexual service preserves the hypersexualized “otherness” of women in war, reminding us historians are not immune from the social norms and cultural mores of the environment in which they research and write.

Participation in the WWII workforce did not magically give women agency, nor did it open society’s eyes to their worth and abilities. If it had, there would have been no need for the women’s civil rights movement. Society does not change on its own, and the process is brutal. Some women simply could not reconcile the “new sense of self” and “self-reliance” fostered by working outside the home with the societal expectation that they would “cheerfully leap back to home” when the men returned from war.[9] As Dellie Hahne told Studs Terkel in an interview for his “The Good War”: An Oral History of World War II, “a lot of women said, Screw that noise. ’Cause they had a taste of making their own money, a taste of spending their own money, making their own decisions.”[10] As the hands that rocked the cradle learned they could handle many other tasks, women were not content to go back to how things were. The war had changed them, but it was up to them to change their world.

[1] Allan M. Winkler, “The World War II Home Front,” History Now: The Journal of the Gilder Lehrman Institute, The Gilder Lehrman Institute of American History, accessed December 12, 2016, https://www.gilderlehrman.org/history-by-era/world-war-ii/essays/world-war-ii-home-front.

[2] Michaela M. Hampf, “‘Dykes’ or ‘Whores’: Sexuality and the Women’s Army Corps in the United States during World War II,” Women’s Studies International Forum 27 (2004): 13-30, accessed December 14, 2016, EBSCOHost, 13.

[3] Ibid., 16.

[4] Jane Marcellus, “Bo’s’n’s Whistle: Representing ‘Rosie the Riveter’ on the Job,” American Journalism 22, no. 2 (2005): 83-108, accessed November 28, 2016, EBSCOHost, 94.

[5] Ibid.

[6] See http://www.cnn.com/2015/08/25/health/wwii-vd-posters-penis-propaganda.

[7] Lisa Wade, “The Virgin/Whore Binary in World War II Propaganda,” Sociological Images, June 15, 2011, accessed December 15, 2016, https://thesocietypages.org/socimages/2011/06/15/the-virginwhore-binary-in-world-war-ii-vd-propaganda/.

[8] Linda Grant DePauw, Battle Cries and Lullabies: Women in War from Prehistory to the Present (Norman: University of Oklahoma Press, 1998), 262.

[9] Winkler, 352; Terkel, 120.

[10] Studs Terkel, “The Good War”: An Oral History of World War II (New York: The New Press, 2011), Kindle edition.

And the Oscar Goes to: Hattie McDaniel and the Original #OscarsSoWhite

On February 29, 1940, African-American actress, singer, and entertainer Hattie McDaniel won an Academy Award for her portrayal of Mammy in Gone With The Wind. Though 1939 also saw the premieres of movies like The Wizard of Oz and Mr. Smith Goes to Washington, GWTW earned thirteen nominations and eight awards, including Best Picture, Best Director, Best Actress, and McDaniel’s Best Supporting Actress accolade. Given the racism and discrimination rampant in the United States in the 1940s, the decision to award Best Supporting Actress to an African-American woman seemed to be a tremendous step forward.

It was.

It also wasn’t.

Born in 1893, Hattie McDaniel began performing when she was in high school as part of a troupe called The Mighty Minstrels. By the time she was in her 20s, she was performing on the radio, the first African-American woman to do so. The performing life did not pay well, and McDaniel often worked as domestic help to make ends meet. After moving to Los Angeles, she was cast as an extra in a Hollywood musical. After earning her Screen Actors Guild (SAG) card, McDaniel went on to small roles in I’m No Angel, The Little Colonel, Judge Priest, and Show Boat. She worked with and became friends with many of the major stars of the day, including Shirley Temple, Henry Fonda, Clark Gable, and Olivia de Havilland. Her relationships with the latter two helped her win the role of Mammy in Gone With The Wind.

McDaniel’s acting ability was never in doubt. Mammy was the soul of Margaret Mitchell’s novel and of the film adaptation. The reception of the film and its actors, however, demonstrated the black soul of American racism. None of the African-American actors were able to attend the film’s opening night at Atlanta’s Loew’s Grand Theater. Jim Crow also showed up at the Oscar ceremony the following year. GWTW producer David O. Selznick had to petition for McDaniel to be able to attend the ceremony at the Ambassador Hotel. She and her date ended up sitting at a table at the back of the room, separate from her GWTW costars. It was easier to award McDaniel one of the top acting awards in the nation than to find a place in American society for an African-American woman. She could be a star, but not an equal.

McDaniel also faced censure from the African-American community, which saw GWTW and the character of Mammy as romanticizing the Old South and slavery. Critics chided her for taking roles as slaves and servants, saying she was preserving the stereotypes that fueled discrimination against black Americans. McDaniel disagreed, arguing African-American women did not have the luxury of choosing their roles if they wanted to continue to work (and to eat). “The only choice permitted us is either to be servants for $7 a week or to portray them for $700 a week,” she said.[1] McDaniel believed “a woman’s gifts will make room for her,” but we cannot forget for a moment that a woman is rarely in control of the room’s location or its conditions.[2]

Almost eighty years later, the Academy Awards, and the United States, still struggle with diversity. The 2015 Awards earned the hashtag #OscarsSoWhite when the Academy of Motion Picture Arts and Sciences gave all twenty major acting nominations to white actors, the first such occurrence since 1998. American society is diverse, but depicting and honoring that diversity continues to be difficult. It is hard to believe we can still celebrate the “first black,” “first Asian,” “first Hispanic,” “first LGBTQ,” “first woman” (the list goes on and on) anything in the year 2018, but that is our reality and our society.

Tonight’s 90th Academy Awards ceremony is not without its own firsts: the first female cinematography nominee (Rachel Morrison for Mudbound); nominations for African-American director/comedian/actor Jordan Peele (Get Out) and for female director Greta Gerwig (Lady Bird); and of course, the first Oscars since Harvey Weinstein was dethroned by industry leaders finally taking the sexual assault and abuse allegations against him seriously. 2018 seems to be the year Hollywood is at least ready to listen to disenfranchised voices, but that does not mean the path ahead is certain. Some have criticized the film Call Me By Your Name, the story of a young man’s summer affair with his father’s research assistant, as promoting sexual promiscuity and underage sexual relations. This criticism is especially interesting during a cinematic season that also saw the opening of Fifty Shades Darker, the second film in a trilogy that regularly substitutes softcore pornography for plot and character development. Others criticize Guillermo del Toro’s The Shape of Water for not going far enough in its development of its disabled characters, namely the protagonist, Elisa.[3] The Oscars, like society itself, are perpetually caught in a game of one step forward, one step back, not far enough…wait, too far. It is only by stretching boundaries that we will ever arrive at a new, more equitable normal.

McDaniel said “we respect sincerity in our friends and acquaintances, but Hollywood is willing to pay for it.”[4] Perhaps the best way forward is to keep reminding Hollywood, and other sources of American power, that they will only get what they pay for. We must also remember that we, the consumers, get what we pay for. Hollywood films what sells. If we continue to demand film art that is inclusive, and put our money where we say our priorities lie, #OscarsSoWhite can become part of history, not a recurring pattern. Race and gender shape art, but do not and cannot determine its worth. Perhaps we must also keep reminding them we have the receipts.

kms

[1] https://www.inspiringquotes.us/author/9148-hattie-mcdaniel.

[2] Ibid.

[3] See “What ‘The Shape of Water’ Gets Wrong About Disability,” at http://www.cbc.ca/radio/day6/episode-379-populism-in-italy-s-elections-greenland-s-ice-melt-the-shape-of-water-ode-to-cds-and-more-1.4555633/what-the-shape-of-water-gets-wrong-about-disability-1.4555657.

[4] https://www.inspiringquotes.us/author/9148-hattie-mcdaniel.

F-Bomb Field Trip: U.S. Army Women’s Museum in Fort Lee, Virginia

Two statues guard the entrance of the U.S. Army Women’s Museum at Fort Lee, Virginia: Pallas Athena, the Greek goddess of wisdom and war, and a female American soldier, the personification of those attributes. The USAWM was originally part of Fort McClellan, Alabama, but moved to Virginia after that base closed in 1999. It opened at Fort Lee in 2001, but it was only five years ago that the museum unveiled the first statue of a female soldier displayed at an American military installation. This timeline parallels women’s fight both to participate in the U.S. military and to be recognized for their participation. The museum does a very good job of establishing that women have always been involved in American wars; it was official recognition of their contributions that trailed behind.

The museum begins and ends with a large tree adorned with replicas of dog tags left behind by fallen female soldiers. One electronic exhibit allows the visitor to select the names of individual soldiers and pull up their pictures, a short biography, and a service record. Sacrifice is key to the USAWM: from the “unofficial” soldiers of the Revolutionary and Civil Wars to the WACs of WWII and the combat soldiers of Desert Storm and after, female sacrifice has been essential to American military success.

As described on its website, the USAWM is a “repository of artifacts and archives,” but also “an educational institution.” The curators have done a fantastic job integrating elements that will keep younger visitors interested and entertained. A theater in a small alcove explains the role of Walt Disney animation in the war effort and shows several WWII-era Donald Duck cartoons produced during the time. There is also an area that allows children to try on the various caps/head gear, uniform pieces, and arms mentioned and depicted in the exhibits. Kids can also take home free coloring pages as a souvenir.

The USAWM is also an important resource for historians and researchers. Appointments to view the archives or explore the museum’s oral histories of service members can be made on the museum website. The archive holds over 1.5 million documents, including books, photographs, scrapbooks, and correspondence. The museum also allows visitors to take home copies of the U.S. Army Center of Military History’s books on the Women’s Army Corps (WAC) by historians Bettie J. Morden and Mattie E. Treadwell. As military history has long been the domain of men, finding sources about military women written by military women is refreshing, to say the least. Morden’s and Treadwell’s works would be extremely useful to students and researchers interested in investigating women’s participation in the army from 1942 to 1978. Treadwell’s Special Studies text provides additional information on mid-century American interpretations of gender and war and reflects on how these interpretations shaped what military women were and were not allowed to do during wartime.

The U.S. Army Women’s Museum is easy to overlook, but well worth a visit. At the time of our visit (March 1, 2018), several exhibits were under construction, including redesign of a gallery and an expansion of one area. I’m definitely planning a return visit to see the new pieces and check out the U.S. Army Quartermaster Museum next door.

kms

The United States Army Women’s Museum / 2100 Avenue A / Fort Lee, Virginia / www.awm.lee.army.mil / 804-734-4327 / Tues. through Sat., 1000-1700

Today in Women’s History: The U.S. Supreme Court Rules the 19th Amendment Constitutional

“The right of citizens of the United States to vote shall not be denied or abridged by the United States or by any State on account of sex. Congress shall have power to enforce this article by appropriate legislation.”

The U.S. Constitution is, and is intended to be, a living document, but that does not mean changes are easy. Constitutional amendments are hard-fought and hard-won, and the debates they spur often inspire strange political alliances. The long fight for female suffrage is the story you know; the awkward alliances the woman’s suffrage movement formed against the 15th amendment are not widely publicized (for obvious reasons).

The 19th amendment to the Constitution, ensuring “(t)he right of citizens of the United States to vote shall not be denied or abridged by the United States or by any State on account of sex,” was ratified in 1920. The amendment was the result of a battle for women’s rights in the United States that had lasted over seventy years. The 1848 Seneca Falls Convention laid out the women’s movement’s battle strategies and goals, but the changing political landscape would often thwart the suffragettes’ plans.

The Civil War divided the nation; Reconstruction would end up dividing the woman’s rights movement. The proposed 15th amendment stated suffrage “shall not be denied…on account of race.” This gave white and non-white men the right to vote, but because it did not specify that suffrage could not be denied based on sex, women were again left out. Woman’s rights leaders Elizabeth Cady Stanton and Susan B. Anthony bristled at the idea and withdrew their political support for the amendment. “If that word ‘male’ be inserted,” wrote Stanton, “it will take us at least a century to get it out.”[1]

“That word” was not included, but the implication was enough to bar women from voting. The women’s movement split into two groups: Anthony and Stanton formed the National Woman Suffrage Association (NWSA), while Lucy Stone and others who supported the ratification of the 15th amendment formed the American Woman Suffrage Association (AWSA). Disagreements over ideology and methodology hampered the movement’s efficiency and its ability to present a united message regarding woman’s rights. Further complicating matters, some groups interpreted the NWSA’s anti-15th amendment stance as evidence of racism within the movement. Though the NWSA’s connection to groups that supported racial discrimination was tenuous at best, it damaged the organization’s image and message. It was not a particularly effective way to court the thousands of African-American women who also wanted civil rights as American citizens, to say the least.

As is often the case, an American war was the ultimate impetus for American social change. Women’s contributions to mobilization for World War I finally convinced male leaders and politicians that women’s participation could not be ignored. Anyone who gave so much for their country, and made do with so little, deserved the civil rights too long denied them. (Of course, women’s protests and other forms of mobilization for suffrage also forced politicians’ hands.) “I regard the concurrence of the Senate in the constitutional amendment proposing the extension of the suffrage to women as vitally essential to the successful prosecution of the great war of humanity in which we are engaged,” said President Woodrow Wilson in an address to Congress in 1918.[2] Congress agreed, passing the 19th amendment in 1919. It was ratified the following year.

The final challenge to the amendment’s constitutionality came in the 1922 Supreme Court case Leser v. Garnett. In the original case, lawyer Oscar Leser sued to have two women removed from Maryland voting rolls, arguing women did not have the right to vote in Maryland because the state had not ratified the 19th amendment. Justice Louis Brandeis, writing for the Court, ruled that women’s suffrage applied to all American women regardless of whether their state had ratified the amendment. Ratification put the law on the books, but the 1922 decision in Leser v. Garnett ensured it was a law women could use.

kms

[1] Elizabeth Cady Stanton quoted in Akhil Reed Amar, America’s Constitution: A Biography (New York: Random House, 2005), 394.

[2] Woodrow Wilson quoted in “Women’s Suffrage and WWI,” National Park Service, accessed February 27, 2018, https://www.nps.gov/articles/womens-suffrage-wwi.htm.