“Spinning” Jennie

This is a story about patriotism, war, slut-shaming, and dough. 

Lots of dough. 

On July 3, 1863, Mary Virginia Wade was shot while kneading dough in her kitchen. She was the only “direct” civilian casualty of the Battle of Gettysburg, a stray bullet entering her shoulder, piercing her heart, and lodging in her corset. 

Gennie lost her life first, her name directly after. “Gennie” became “Jennie” as newspapers rushed to get the story into print. Any sympathy for Wade’s death was fleeting. Within weeks her “heroism” came under fire from members of her own community, with one individual telling the newspaper “her sympathies were not as much for the Union as they should have been.”[i] It didn’t take long for rumors of “late-night visitors” to circulate. A bullet destroyed her body, but gossip killed her reputation.

In one of the odder details of the story, what Gennie actually had her hands in, not what she was suspected of having her hands in, proved critical to settling her legacy. On July 4, Wade’s mother made 15 loaves of bread for Union soldiers using the dough her daughter had kneaded. That kneaded dough, offered as evidence of Gennie’s “service to the Union cause,” earned her mother a government pension.

It took over 30 years for Gettysburg to come to terms with its famous casualty. The Gennie Wade memorial was erected in 1900. An American flag flies perpetually over her statue, making her one of only two American women awarded this distinction (the other is Betsy Ross). The restored Wade house is now a museum that also offers ghost tours. “You never know what might later show up in your photographs,” touts one website. “Many who have done so have come to find inexplicable paranormal objects, possibly the disembodied spirit of Jennie Wade.”[ii] Wade’s tragedy is also marketed as a potential cure for the fiancée-deprived: according to “local lore,” if you put your finger in one of the bullet holes in the door, “you will become engaged not long after.”[iii]

The tour guides aren’t sure how to answer questions about Wade’s manufactured reputation as a prostitute and Confederate sympathizer. Disembodied spirits, unlike disembodied “prostitutes,” are easy to market in brochures.

Marble martyrs of men who fought for and against the United States dot the landscape of Gettysburg, their bravery and devotion unquestioned. Rumor still haunts Mary Virginia Wade. She is a curiosity, a commodity- she sells ghost tours and t-shirts. Spinning Gennie’s legacy for one cause or another continues to be an extremely profitable endeavor.

Following the Civil War, “the dead became what their survivors chose to make them,” wrote historian Drew Gilpin Faust.[iv] The commoditization of Gennie Wade is a good example of the malleability of history. It also proves the United States still doesn’t know how to process the Civil War. Statues, emblems, and flags come to represent a history made and remade on a whim. A stray bullet transforms a woman into a corpse; history turns that woman into a martyr, then a harlot, then a treasured patriot.

How we choose to spin the past reveals more truth about the living than it ever does the dead. 

KMS

January 26, 2021


[i] Margaret S. Creighton, The Colors of Courage: Gettysburg’s Forgotten History (New York: Basic Books, 2006), 195.

[ii] “The Jennie Wade House,” Civil War Ghosts.com, accessed July 18, 2020, https://civilwarghosts.com/the-jennie-wade-house/.

[iii] Ibid.

[iv] Drew Gilpin Faust, This Republic of Suffering: Death and the American Civil War. (New York: Alfred A. Knopf, 2008), 269.

Image taken from American Battlefield Trust: https://www.battlefields.org/learn/biographies/mary-virginia-jennie-wade

Truth, Justice, and the American Way: Japanese-American Internment in the “Good War”

Like President Woodrow Wilson before him, Franklin Delano Roosevelt urged the American people to avoid scapegoating citizens of German, Italian, and Japanese descent as the United States entered the war. After the Japanese attack on Pearl Harbor on December 7, 1941, however, FDR’s reassurance that “the only thing we have to fear is fear itself” rang hollow.[1]

As the nation went to war with the Empire of Japan, distrust and fear spread to Americans of Japanese descent. On Valentine’s Day, 1942, General John DeWitt warned the president of the threat posed by Japanese Americans in the wake of Pearl Harbor, saying, “In the war in which we are now engaged, racial affinities are not severed by migration. The Japanese race is an enemy race.”[2] The lack of concrete evidence supporting DeWitt’s claims did not discredit them. Internment supporters like California Attorney General Earl Warren construed the very absence of fifth column activity as proof that attacks were imminent, calling it “the most ominous sign in our whole situation…Our day of reckoning is bound to come.”[3] Five days later, bowing to public opinion and to views like those of DeWitt and Warren, Roosevelt signed Executive Order 9066, creating “military areas” from which thousands of Japanese-American men, women, and children could be removed and interned for the duration of the war. The reckoning had come, but only for a very specific portion of the population. Over one hundred thousand American citizens were forcibly removed from their homes and relocated to internment camps. Their belongings, livelihoods, and civil liberties were forfeited because local military officials believed their presence near military bases and installations in the West presented a threat to national security. Fear took precedence over the protection of constitutionally guaranteed civil liberties.

The Japanese attack on Pearl Harbor did not create racism towards the Japanese-American community; it gave that racism teeth. Historians such as Roger Daniels argue the discrimination against Americans of Japanese descent began with the settling of the West in the mid-to-late nineteenth century. Racism towards Chinese settlers flowed naturally to the Japanese because they shared similar physical characteristics, and competition for limited land and employment opportunities heightened conflicts between ethnic groups. Discriminatory laws governing land ownership and restricting other ways Japanese-Americans could participate in society left them on the periphery, creating and bolstering barriers to their integration. During World War II, this outsider status would be offered as “evidence” of their unwillingness to assimilate into American society.

At the same time Japanese-Americans were being interned in the name of national security, scientists were forging a weapon that would change the world. On July 16, 1945, Robert Oppenheimer and the other members of the Manhattan Project (the code name for the effort tasked with developing the weapon) were among the first to witness the explosion of an atomic bomb in the desert near Alamogordo, New Mexico. Even in this barren and desolate space, the raw power of the bomb and its 40,000-foot mushroom cloud could not be denied. Oppenheimer later said the blast brought to mind a line from the Hindu scripture the Bhagavad Gita: “Now I am become Death, the destroyer of worlds.” They knew the bomb would be destructive. They knew it was a power unlike any other known. They also knew it was developed for one reason: to demonstrate an awe- and fear-inspiring weapon that would finally put an end to the world war.

Less than a month later, on August 6, the U.S. dropped the bomb on the Japanese city of Hiroshima. Why that city, and later Nagasaki, was chosen has inspired a great deal of debate. The bombs were meant to achieve two main goals: 1) force unconditional surrender by the Japanese, and 2) demonstrate the weapon’s power to the world. Much of Japan had already been bombed by the Allies; Tokyo was already largely destroyed, so bombing it would not showcase the new weapon’s power dramatically enough. Other strategic targets introduced even more issues: no one knew what would happen if an atomic bomb exploded near suspected missile sites or arms dumps. Hiroshima was a military target but also small and compact enough to be completely destroyed by one bomb. This was the horribly eloquent demonstration the United States was looking for. A second atomic bomb was dropped on the shipbuilding city of Nagasaki three days later.

But why drop a second bomb? Again, this is something historians continue to debate. The simplest answer, and the answer repeated by many of the soldiers and airmen who participated, was that the official instruction was to drop two bombs. The military had developed two bombs, the order was to drop two bombs, so two bombs were dropped. We also have to remember that the Empire of Japan did not surrender after Hiroshima; one atomic bomb had not achieved the objective. Even after the bombing of Nagasaki, Japanese war council members disagreed about whether the war should continue. In the end, Emperor Hirohito gave permission for unconditional surrender on August 14, 1945.

The United States was no longer at war, but the nation would fight about many of the decisions made during the war, like the internment of American citizens and the deployment of atomic weapons, for years to come. The dropping of the atomic bomb separated the history of human warfare into before Hiroshima and after Hiroshima. There was no turning back. Nuclear armament became central to domestic security and a path to national legitimacy. Before mutually assured destruction was a tagline or a cornerstone of policy, it was a harsh realization. Albert Einstein described it best when he said, “I know not with what weapons World War III will be fought, but World War IV will be fought with sticks and stones.”

This was not enough to stop the proliferation and development of new weapons. In 1952, the United States developed and tested a hydrogen bomb over 1,000 times more powerful than the one exploded over Hiroshima. Though no other country has used a nuclear weapon in war since, the threat remains. North Korea tested a hydrogen (thermonuclear) bomb in September 2017, and it continues to test-launch everything from short-range projectiles to intercontinental ballistic missiles that can carry thermonuclear warheads. The status of Russia’s arms program and questions about Putin’s willingness to use its nuclear arsenal also give pause. India and Pakistan, rivals for whom war seems only to await a new flashpoint, are both nuclear powers. Policy makers and international leaders hope the threat of a fourth world war fought with sticks and stones will be enough to convince current leaders that nuclear war, though powerful and final, is not the best way to settle their differences.

The legacy of Japanese-American internment was not international like that of the bombings of Hiroshima and Nagasaki, but it did cause many ripples here at home. Reverend Emery Andrews, writing in 1943, said “future historians will record this evacuation–this violation of citizenship rights–as one of the blackest blots on American history; as the time that democracy came the nearest of being wrecked.”[4] Andrews was correct, but the recognition took over forty years. In the 1980s, federal courts vacated the convictions handed down under the wartime decisions in Korematsu v. United States and Hirabayashi v. United States. The original decisions affirmed the President’s (and Congress’s) ability to curtail constitutional rights to protect the nation in a time of war. Korematsu II vacated the conviction of Fred Korematsu, but it did not vacate the case law. The precedent still stands and can be, and is, used to justify taking away citizens’ constitutional rights during a national “emergency.” In 1988, President Ronald Reagan officially apologized for WWII internment and signed the Civil Liberties Act, which granted monetary reparations to internment survivors and family members. The speeches and checks did not fully atone for what Japanese-Americans experienced during the war, but the effort did silence much of the discussion on the topic.

A terrorist attack on American soil that recalled Pearl Harbor in destruction and effect reopened the debate. On September 11, 2001, two hijacked planes crashed into the World Trade Center in New York City, bringing down both towers and killing thousands. A third hijacked plane crashed into the Pentagon just outside Washington, D.C., and a fourth crashed in Pennsylvania, its target presumed to be the White House. Though the attacks were planned and perpetrated by the militant Islamic group al Qaeda, a terrorist organization disavowed by the majority of Muslims worldwide, Arab- and Muslim-Americans became scapegoats for the tragedy. Anti-Muslim hate crimes skyrocketed between 2000 and 2001, increasing by 1700 percent.[5]

Historians did not miss the similarities between how Japanese-Americans were perceived and treated following the attack on Pearl Harbor and how Arab- and Muslim-Americans were perceived and treated following 9/11. Many expressed concern that the United States continually follows the same pattern of uniting behind a common hatred in the name of domestic security. Recasting the past as a warning to the present, historians focused on how the combination of unchecked racism and the expansion of presidential power allowed Japanese-American citizens’ constitutional rights to be sacrificed in the name of national security.

The American memory of WWII tends to focus on our successes, not our failures. The defeat of the Axis powers and the liberation of Nazi concentration camps amplify the good in the “good war,” but overshadow the ways the United States distorted some democratic ideals and practices to achieve them. Historians agree Japanese-American internment was “our worst wartime mistake,” yet America still tends to consolidate national unity through common hate in times of crisis. Americans have tried hard to forget the internment of citizens of Japanese descent during World War II, leading many historians to fear it could, and will, happen again. Though the “military necessity” of internment was never proven and a federal commission later concluded it was based on “race prejudice, war hysteria and a failure of political leadership,” the legal precedent stands.[6] As historian John Dower wrote, “war hates and race hates do not go away; rather, they (go) elsewhere.”[7]

KMS 2019

Sources Cited

Daniels, Roger. Prisoners Without Trial: Japanese Americans in World War II (Hill and Wang Critical Issues). New York: Hill and Wang, rev. ed., 2004.

Dower, John W. War Without Mercy: Race and Power in the Pacific War. New York: Pantheon Books, 1986.

Gee, Harvey. “Habeas Corpus, Civil Liberties, and Indefinite Detention during Wartime: From Ex Parte Endo and the Japanese American Internment to the War on Terrorism and Beyond.” The University of the Pacific Law Review 47 (2016): 792-838, accessed October 31, 2016, HeinOnline.

Giannis, Joshua. “The Court, The Constitution, and Japanese-American Internment,” Stanford Journal of East Asian Affairs (Summer 2011): 87-96, accessed October 31, 2016, https://web.stanford.edu/group/sjeaa/journal111/Japan4.pdf.

Khan, Mussarat and Kathryn Ecklund, “Attitudes Toward Muslim Americans Post 9/11,” Journal of Muslim Mental Health VII, no. 1 (2012): 1-16, accessed December 29, 2016, http://hdl.handle.net/2027/spo.10381607.0007.101.

“Only Thing We Have To Fear Is Fear Itself: FDR’s First Inaugural Address,” History Matters, accessed December 6, 2016, http://historymatters.gmu.edu/d/5057/.

Malkin, Michelle. In Defense of Internment: The Case for ‘Racial Profiling’ in World War II and the War on Terror. Washington, D.C.: Regnery Publishing, Inc., 2004.

Muller, Eric L. American Inquisition: The Hunt for Japanese American Disloyalty in World War II. Chapel Hill: University of North Carolina Press, 2007.

——. “Indefensible Internment: There was no good reason for the mass internment of Japanese Americans during WWII.” Reason.com (Dec. 1, 2004), accessed December 21, 2016, http://reason.com/archives/2004/12/01/indefensible-internment.

Oluwu, Dejo. “Civil liberties versus military necessity: lessons from the jurisprudence emanating from the classification and internment of Japanese-Americans during World War II,” The Comparative and International Law Journal of Southern Africa 43, no. 2 (July 2010): 190-212, accessed December 5, 2016, http://www.jstor.org/stable/23253161.

Raico, Ralph. “American Foreign Policy: The Turning Point, 1898-1919,” The Independent Institute (February 1, 1995), accessed October 21, 2016, http://www.independent.org.

Reeves, Richard. Infamy: The Shocking Story of Japanese American Internment in World War II. New York: Henry Holt and Company, 2015.

Robinson, Greg. A Tragedy of Democracy: Japanese Confinement in North America. New York: Columbia University Press, 2009.

——. By Order of the President: FDR and the Internment of Japanese Americans. Cambridge: Harvard University Press, 2001.

——. “A Critique of Michelle Malkin’s ‘In Defense of Internment’, Part Two.” Modelminority.com, (August 8, 2004), accessed December 21, 2016, https://www.web.archive.org/web/20080919020738/http://modelminority.com/article849.html.

Shaffer, Robert. “Opposition to Internment: Defending Japanese American Rights During World War II,” Historian 61, no. 3 (Spring 1999): 597-618, accessed November 21, 2016, EBSCOHost.

 

[1] “Only Thing We Have To Fear Is Fear Itself: FDR’s First Inaugural Address,” History Matters, accessed December 6, 2016, http://historymatters.gmu.edu/d/5057/.

[2] John DeWitt quoted in Richard Reeves, Infamy: The Shocking Story of Japanese American Internment in World War II (New York: Henry Holt and Company, 2015), 41.

[3] Earl Warren quoted in Daniels, 37.

[4] Robert Shaffer, “Opposition to Internment: Defending Japanese American Rights During World War II,” Historian 61, no. 3 (Spring 1999): 597-618, accessed November 21, 2016, EBSCOHost, 598.

[5] Mussarat Khan and Kathryn Ecklund, “Attitudes Toward Muslim Americans Post 9/11,” Journal of Muslim Mental Health VII, no. 1 (2012): 1-16, accessed December 29, 2016, http://hdl.handle.net/2027/spo.10381607.0007.101, 2.

[6] Oluwu, 207.

[7] John W. Dower, War Without Mercy: Race and Power in the Pacific War (New York: Pantheon Books, 1986), 311.

White Christmas, Blueberry Pie, Yellow Peril

In Michael Curtiz’s 1954 film, White Christmas, two WWII Army veterans conspire to reunite their entire unit to show their appreciation for the general who led them during the war. Though the story begins in Monte Cassino, Italy, on Christmas Eve, 1944, White Christmas is not a war movie: the war images are muted, the edges softened. It is a movie that reflects what Americans wanted to remember about the war (with singing, dancing, and large set pieces added in for good measure). Home Front U.S.A.: America During World War II author Allan M. Winkler washes his discussion of post-Pearl Harbor America in the same shades of sepia, forcing us to wonder whether he is writing about the war or about what Americans most wanted to remember about the war.

The Japanese attack on Pearl Harbor on December 7, 1941, shocked Americans out of their complacency. Neither the ocean nor pledges of isolationism could keep the country out of the conflict. Winkler writes the attack fostered the “sense of shared purpose” long missing from American popular support for the Allies.[1] Roosevelt’s Office of War Information (created June 1942) capitalized on this wellspring of patriotism by framing the war as supporting “American values and portraying Americans as they wanted to be seen.”[2] As the OWI proclaimed the glories of what the boys abroad were fighting for, including baseball and homemade blueberry pie, the Office of Civilian Defense (1941) stepped up its efforts to educate Americans on their own defense. The sale of war bonds, victory gardens, and other programs allowed those who were not serving overseas to fight the war where they were: every bond, turnip, and scrap of rubber was another strike against the enemy.

Winkler waxes rhapsodic about the war efforts that brought the country together, but we should not lose sight of the fact that post-Pearl Harbor America was united against the Japanese, not united behind the Allies. He writes Pearl Harbor “brought a sense of unity to all Americans,” a generalization that is not supported by historical fact.[3] Racism and prejudice played a deciding role in identifying the “they” who did this to “us.” German Americans and Italian Americans were openly discriminated against, but thousands of Japanese American citizens were forcibly removed to internment camps based on unsubstantiated claims they posed a threat to national security. (Historians have yet to identify credible evidence for the “military necessity” behind Japanese internment, but the nation did not apologize or attempt reparations until the 1980s.) “Anti-Japanese hysteria gripped the home front,” explained Dolores Flamiano in “Japanese American Internment in Popular Magazines: Race, Citizenship, and Gender in World War II Photojournalism.” “Wartime internment stories were rife with racial slurs and stereotypes, which most readers apparently accepted or at least tolerated.”[4] Popular music reflected this trend, with songs like “We’re Gonna Find a Feller Who Is Yeller and Beat Him Red, White, and Blue” sharing air time with Irving Berlin’s “White Christmas.”[5]

Winkler does include the unfair treatment of Japanese Americans during the war in his chapter on “Outsiders and Ethnic Groups,” but his failure to even allude to this development in discussing post-Pearl Harbor war mobilization and propaganda underscores a disconcerting paradox. “For all of the hardships, the United States fared well in World War II,” he writes, but “(n)ot all Americans fared well.”[6] The attack on Pearl Harbor united the country, but in mobilizing American patriotism, it also mobilized American hate. Pearl Harbor’s legacy was a united nation, but a divided people. We entered the war secure in our belief that we would help democracy prevail overseas while playing fast and loose with the civil liberties of entire groups of citizens at home. Historians played into the myth, content to ignore the constitutional concerns raised by Japanese internment until the 1970s and 1980s. American historical memory of the war and the war era became increasingly sepia-toned as we allowed the recollection of our hatred of all things yellow (and black) to fade.

The acknowledgment of our wartime errors does not diminish our successes, but refusal to recognize those errors perpetuates division. Our collective memory of WWII should include Mom’s homemade pie and women’s baseball games, but also the barbed wire surrounding the camps at Manzanar. We are a country that found its place in the world amid dreams of white Christmases, hopes of returning to the comforts of home and blueberry pie, and insecurities that preyed upon our fear and led us to act counter to everything for which we said we stood. To borrow the phrase, the World War II generation can be the “greatest generation” without being understood as a perfect generation.

[1] Allan M. Winkler, Home Front U.S.A.: America During World War II (Wheeling: Harlan Davidson, Inc., 2012), 31.

[2]Ibid., 35.

[3]Ibid., 31.

[4] Dolores Flamiano, “Japanese American Internment in Popular Magazines: Race, Citizenship, and Gender in World War II Photojournalism,” Journalism History 36, no. 1 (Spring 2010): 23-35, accessed October 31, 2016, EBSCOHost, 23, 24.

[5]Winkler, 39.

[6]Ibid., 55; 56.

Sources Cited

Flamiano, Dolores. “Japanese American Internment in Popular Magazines: Race, Citizenship, and Gender in World War II Photojournalism,” Journalism History 36, no. 1 (Spring 2010): 23-35, accessed October 31, 2016, EBSCOHost.

Winkler, Allan M. Home Front U.S.A.: America During World War II. Wheeling: Harlan Davidson, Inc., 2012.

Kate Murphy Schaefer, 2019

Fighting on the “Kitchen Front”

Bullets, mortars, potatoes? Victory in World War II required the help of every British citizen, and British food. Though the government tried to avoid becoming embroiled in another world war, the British Empire declared war on Germany in September 1939. With the men abroad once again, women re-entered the factories, took up plows and axes in the Women’s Land Army and Timber Corps, and a select few even went behind enemy lines as intelligence operatives. The number of women who fought on the “Kitchen Front” was even larger. British wartime food programs demonstrate wars are not won solely by soldiers in the field. Civilians, especially women, played an important role in continuing the fight. The British Empire went to war, and so did its food.

Napoleon Bonaparte famously said, “an army marches on its stomach,” but this also applies to civilians. One lasting lesson from the previous war was the importance of planning for the possible interruption of supply lines. With its Navy deployed to other fronts, Britain could easily be cut off from the rest of Europe, and from access to food. Mindful of the bread queues that wound around the nation during WWI, the government reconstituted the Ministry of Food and in 1940 appointed Frederick Marquis, Lord Woolton, as Minister of Food. Tasked with ensuring the soldiers at the front and those at home were fed, Woolton attacked the problem from several angles.

One of the Ministry’s most important programs involved food rationing. The government began planning a ration system in 1936, but it was not instituted until 1940. Each citizen received a ration book that allowed them to buy a set amount of each rationed food. The January rations limited bacon, sugar, and butter; meat was added to the list in March; and the list of rationed foods expanded to include cooking fat, cheese, jam, tea, and milk by July. Another system, the points system, was more flexible, giving each citizen a certain number of points to be used on any (and any amount) of the points-rationed foods. The ration system ensured food was distributed equitably and meant everyone, from fishmonger to Royal Family, had the same access to food and paid the same prices.

The “Dig for Victory” campaign encouraged citizens to grow their own food to supplement their diet. “This is a food war,” said Woolton. “The battle on the kitchen front cannot be won without help from the kitchen garden.”[1] Home gardening became a patriotic duty, not just a pastime. Gardens cropped up in allotments throughout the country. Other campaigns encouraged eliminating as much food waste as possible and saving oils and fats so they could be used to manufacture explosives.

In addition to overseeing rationing programs and encouraging domestic efforts to grow food and eliminate food waste, the Ministry of Food offered guidance to help home cooks compensate for the dietary changes by promoting nutritious unrationed foods. Cartoon characters Potato Pete and Dr. Carrot were central to marketing these foods to children, appearing on posters and singing songs on the wireless. The tunes were catchy, and the lyrics were even catchier:

Potatoes new, potatoes old

Potato (in a salad) cold

Potatoes baked or mashed or fried

Potatoes whole, potatoes pied

Enjoy them all, including chips

Remembering spuds don’t come in ships![2]

Government-sponsored radio programs also targeted the children’s mothers. “The Kitchen Front” radio spot gave home cooks recipes and ideas for stretching their family’s rations. Carrot roly-poly, carrot marmalade, carrot candy, carrot sausage roll: the final products did not always live up to the recipes’ creativity, but British home cooks’ ingenuity in rising to the challenge was unmatched.

The “Kitchen Front” also helped spread one of the British government’s most pervasive and abiding propaganda campaigns. The German Air Force, the Luftwaffe, began an extensive aerial bombardment campaign to break the will of British citizens and force the government to sue for peace. The Ministry of Information began a blitz of its own, spreading the half-truth that British pilots had exceptional night vision because of the amount of carrots they ate. The vitamins in carrots do support vision health, but a pilot would likely turn orange before he ate enough carrots for significant improvement. The intended message was that British pilots would have no problem responding to German air raids, but the claims soon spread to the home front.

“My first three nights of ambulance driving…were fraught with anxiety that I should have to give up the job through not being able to see well enough,” said Margaret Grant in her February 28, 1941 broadcast. “I resorted to my food chart for guidance, and after taking a large glass of milled carrot and sliced tomato each day for a week, I drove with ease and comfort.”[3] Another host made similar remarks during her August 9, 1941 broadcast, calling carrots “good blackout food.”[4] There is no way to confirm how many Germans were actually fooled by the campaign, but carrots have been linked to better eyesight ever since.

Coping with food rationing and other social changes was difficult, but also helped bring communities together. The Ministry of Food affirmed the value of women’s contributions and the importance of every single citizen in war. Men left for the front lines, but the civilians at home manned the Kitchen Front.

[1] Lord Woolton quoted in K. Annabelle Smith, “A WWII Propaganda Campaign Popularized the Myth that Carrots Help You See in the Dark,” Smithsonian Magazine, August 13, 2013, accessed October 9, 2018, https://www.smithsonianmag.com/arts-culture/a-wwii-propaganda-campaign-popularized-the-myth-that-carrots-help-you-see-in-the-dark-28812484/.

[2] Song lyrics quoted in “History Cookbook,” CookIt!, accessed October 4, 2018, http://cookit.e2bn.org/historycookbook/20-97-world-war-2-Food-facts.html.

[3] Margaret Grant, “The Kitchen Front, 28th February 1941,” in “The Kitchen Front World War Two Recipes and Commentary,” from the UK National Archives and transcribed by the World Carrot Museum.

[4] Mrs. Hudson, “The Kitchen Front, 9 August 1941,” in “The Kitchen Front World War Two Recipes and Commentary,” from the UK National Archives and transcribed by the World Carrot Museum.


Arming the “Boys”: The Women’s Munition Reserve Seven Pines Bag Loading Plant, Penniman, Virginia

Updated 7/16/21

Mobilization for World War I allowed women previously unheard-of opportunities to take on non-traditional roles. Some served abroad as nurses and yeomen; others took up the ploughshares the men had traded for swords by working on family farms and with the Women’s Land Army. Traditional activities like sewing and knitting also took on new importance as the items were shipped overseas. Women also took over the factory jobs left open by the citizens turned soldiers, helping keep the American war machine rolling.

Beginning in 1915, DuPont chemical company directed all its manufacturing and production towards the war effort. Social crisis tends to trump political scruples, so the company’s recent antitrust troubles did not hinder its consolidation of a monopoly over American munitions production. DuPont Plant #37, located on the York River near Williamsburg, Virginia, was a shell-filling plant. The company’s Women’s Munition Reserve Seven Pines Bag Loading Plant was located near what is now Sandston, Virginia.

The Seven Pines and Penniman plants paid relatively high wages for hazardous work. Workers were responsible for loading TNT into ammunition shells and bagging gunpowder for shipment. Despite the potential danger, hundreds of men and women flocked to the plants in search of employment. A village quickly grew up around Plant #37 as DuPont constructed 230 houses to entice its workers to live nearby. It became known as Penniman, and its population grew to between 10,000 and 20,000.[1]

Female workers made up most of the workforce at Seven Pines. Women from all walks of life were represented, and it was not unusual for middle- and lower-class women to sew and fill powder bags side-by-side with Virginia’s First Lady, Marguerite Davis.[2] Fashion norms also relaxed a bit as a concession to the war effort. Long skirts were impractical in factories, particularly in factories filled with flammable and potentially explosive materials. DuPont issued trousers to the women munitions workers of Seven Pines. To maintain propriety and keep the clothing suitably feminine, the women wore “womanalls” and “trouserettes” as they “stuffed one shell for the Kaiser.”[3]

The inscription on the metal badge housed at the VAARNG Mullins Armory in Richmond, Virginia, reads “WOMEN’S MUNITION RESERVE SEVEN PINES BAG LOADING PLANT.” Badges issued for other DuPont munitions plants took similar forms. Plant badges served several purposes: some practical, some rather grisly. As metal withstands an explosion better than flesh, numbered badges could help identify a worker killed during a plant accident. It is probable the “68” in the middle of the badge was the identification number for a female worker.


Figure 1. Women’s Munition Reserve Seven Pines Bag Loading Plant badge. Photo by author.

Plant badges provided a different kind of protection for male workers. Being branded a “slacker,” a man who did not serve or did not work towards the war effort, was almost as bad as being German. Wearing a factory badge showed the community you were doing your part.

The Seven Pines plant officially opened in October 1918 with a “Liberty Day” celebration. Less than a month later, the Armistice rendered the plant’s work unnecessary. The Richmond-Fairfield Railway Company bought the properties originally constructed for the workers, building the foundation for a suburb of the city of Richmond: Sandston. Affordable housing and access to jobs in the city allowed workers to find employment as the nation shifted to a post-war economy.

While the Sandston community survived, the community around DuPont Plant #37 did not. The Spanish flu epidemic that raged across the nation also took its toll in Penniman. The local hospital could not keep up with the number of sick men, women, and children who entered its doors. Local coroners and casket makers also struggled to keep up with the dead. When Plant #37 closed its doors, the surviving families left in search of employment, in some cases taking their DuPont-constructed houses with them by floating them down the river. By the mid-1920s, Penniman had disappeared. The Women’s Munition Plant badge is tangible evidence of a place that can no longer be found on a Virginia map. While most of the women’s individual stories also disappeared, this material evidence preserves the story of another way Virginia women broke through gender boundaries to support their country and their Commonwealth.


Figure 2. Frederic H. Spiegel, 1918. Library of Virginia Special Collections Archive.

Notes

[1] Martha W. McCartney, James City County: Keystone of the Commonwealth (James City County, Virginia: Donning Company Publishing, 1997).

[2] “Virginia Women and the First World War: Records and Resources at the Library of Virginia,” Library of Virginia Archival and Information Services, accessed August 27, 2018, https://www.lva.virginia.gov/public/guides/WomenofWWI.pdf, 2.

[3] Ibid.

Special thank you to the readers who sent me revisions for information that was unclear in my original post. I appreciate your dedication to keeping historians accountable as we endeavor to tell the truth about the past as much as possible.

Book Review: They Fought Alone

Glass, Charles. They Fought Alone: The True Story of the Starr Brothers, British Secret Agents in Nazi-Occupied France. New York: Penguin Press, 2018.

Spies and spycraft have long captured the imagination. Agent 007. Emma Peel and John Steed. Even Maxwell Smart and Agent 99. Espionage is intriguing, especially when we are privy to the tricks of the trade. It is entertaining, especially when we know our heroes and heroines will be saved from a dastardly fate at the very last second and live to spy another day. Real life is not like the movies. James Bond actor Roger Moore once explained the difference, saying, “You can’t be a real spy and have everybody in the world know who you are and what your drink is.” Transparency is a luxury that gets spies killed.

In They Fought Alone: The True Story of the Starr Brothers, British Secret Agents in Nazi-Occupied France, Charles Glass tells the story of two brothers who gathered intelligence for the British during WWII. Adolf Hitler’s September 1939 invasion of Poland prompted Britain’s declaration of war, but that declaration was not an impediment to his march across Europe. Belgium, France, Luxembourg, and the Netherlands were the next to fall. Planning to “set Europe ablaze” by bolstering local resistance movements and gathering the intelligence needed to win the war against Nazi Germany, the British created the Special Operations Executive (SOE) in July 1940. Taking his title from Colonel Maurice Buckmaster’s memoir, Glass highlights the experiences and contributions of two SOE spies: George and John Starr. “Theirs would be a lonely struggle, cut off from the wives and children they loved, deprived of the comradeship of a regular military unit, and on their own behind enemy lines,” writes Glass.

If male spies “fought alone,” where did that leave their female counterparts? According to Glass, George Starr disliked female SOE agent Odette Sansom from the beginning, complaining he was “put in charge of three bloody women” for his first overseas assignment. Starr later complained Sansom made unwelcome sexual advances, implying her espionage was more horizontal than on the up and up. The British and French governments took a different view of Sansom, however, awarding her the George Cross and making her a Chevalier de la Légion d’honneur.

Sansom survived torture and imprisonment in a Nazi concentration camp. She fought the enemy, but also the prejudices and biases of her own side. Some female spies did use sex appeal to their advantage in gathering intelligence about enemy troop movements and plans, but we cannot say all female spies employed “sexpionage” because of shortcomings in character and moral standing. Sex can be a tool, just like encryption machines and short-wave radios, and the spy who does not use all the tools at her disposal is not that effective a spy at all.

A biography tells the truth about history as it was perceived by its subject. Glass is correct to include Starr’s biases against Sansom in his book because they were a true part of Starr’s experience in the SOE. They Fought Alone should be read in conjunction with other histories of the SOE, including some of the recent works that focus on the women who served. This will give the most balanced view of the men and women who fought alone during WWII. The Allies fought against the tyranny of the Axis powers but held onto their own prejudices. The only thing more dangerous than being a spy at war was being a female spy at war. The enemy inherently distrusted you because you were an enemy; your own side inherently distrusted you because you were a woman.

KMS, 22 August 2018

 

Book Reviews: How Fascism Works by Jason Stanley and On Tyranny by Timothy Snyder

Stanley, Jason. How Fascism Works: The Politics of Us and Them. New York: Random House Publishing, 2018.

Snyder, Timothy. On Tyranny: Twenty Lessons from the Twentieth Century. New York: Tim Duggan Books, 2017.

When and if fascism comes to America it will not be labeled “made in Germany”; it will not be marked with a swastika; it will not even be called fascism; it will be called, of course, “Americanism…” The high-sounding phrase “the American way” will be used by interested groups intent on profit, to cover a multitude of sins against the American and Christian tradition, such sins as lawless violence, tear gas and shotguns, denial of civil liberties

Halford E. Luccock, “Keeping Life Out of Confusion,” September 11, 1938[1]

The past doesn’t repeat itself, but it often rhymes.

Attributed to Mark Twain

History’s propensity to rhyme has always intrigued me; more recently it keeps me up at night. You can ignore the individual warning signs only so long before they coalesce into a sustained feeling of dread. We can hope, as Martin Luther King, Jr. did, for the moral arc of the universe to bend towards justice, but more often human nature bends towards its baser instincts. Racism and prejudice are more prevalent than ever. In 2017, a politician used FDR’s policy of Japanese internment during WWII to justify a travel ban targeting Muslim-majority countries. The quest to “Make America Great Again” is a policy of regression to eras when only white was right, women knew and were kept in their place, and anyone who didn’t fit the norm was criticized and ostracized.

Some historians see even more disturbing overtones in the current rhyme. In their recent books, Timothy Snyder and Jason Stanley sound a haunting alarm about the overwhelming similarities between current American domestic and foreign policies and those of fascist states in the 1930s and 1940s.

 

Snyder and Stanley take different approaches to the topic, but each is effective, to the point, and as my Dad would say, scary as all get out. Snyder’s On Tyranny: Twenty Lessons from the Twentieth Century reads as a handbook for resisting authoritarianism. “History does not repeat,” he writes, “but it does instruct.” (Snyder, 9).

Stanley, on the other hand, does not shrink from making pointed comparisons between Mussolini’s Italy, Hitler’s Germany, and Trump’s America. How Fascism Works: The Politics of Us and Them is an indictment of the ways in which the President’s open nativism, racism, and sexism have opened the floodgates for others to openly express the same. Stanley does not place the blame for America’s woes solely on the President, however. He also indicts the whole of the President’s administration and the Republican-majority Congress for aiding and abetting the presidential view of “Americanism.” Stanley’s narrative on the Republican response (or lack thereof) to the President’s boast to “grab ‘em by the p***y” should chill to the bone anyone who supports women’s rights.

Stanley’s and Snyder’s cogent use of historical evidence and documents makes their arguments even more persuasive. Stanley’s understanding of Mein Kampf probably rivals that of the author himself. Historians and lay readers who support the current Administration’s policies will no doubt say that when you go looking for unicorns, you will find them. We still cannot afford to ignore the patterns Stanley and Snyder reveal. We cannot afford to ignore the ways in which our current rhyme fits nicely into the goose-stepped past. We cannot afford not to ask which “American way” we are dedicated to returning to: “Give me your tired, your poor, your huddled masses yearning to breathe free,” or “I know nothing but my Country, my whole Country, and nothing but my Country.”

KMS

16 July 2018

Cartoon from University of California San Diego Library Digital Collections, https://library.ucsd.edu/dc/object/bb4164680v

[1] Halford E. Luccock, “Keeping Life Out of Confusion,” The New York Times, 11 September 1938, 15.

Book Review: Fly Girls: How Five Daring Women Defied All Odds and Made Aviation History

O’Brien, Keith. Fly Girls: How Five Daring Women Defied All Odds and Made Aviation History. Boston: Houghton Mifflin Harcourt, 2018.

In Fly Girls: How Five Daring Women Defied All Odds and Made Aviation History, Keith O’Brien reminds the reader that only one of the names of the women he profiles will be familiar: Amelia Earhart. Even then, Earhart is known more for her defeat by the air than for her conquest of it. The “friendly sky” described by modern commercial airlines is in reality a jealous mistress: aviators who do not give her total attention, or who fail to decipher the roles changing conditions play on flight patterns and aircraft, will not enjoy her company very long. The race to tame the sky claimed many lives, male and female.

One of the most poignant episodes in the book comes when an interviewer asks Earhart why she wants to fly. “Why do men ride horses?” she replies. She seems stunned by the idea that women could not share the thirst for adventure felt by men. By the end of the chapter, Earhart’s contribution to that flight would be reduced to that of ballast, with several male aviators claiming it would have been better if she had been left behind and two hundred gallons of fuel loaded in her place. Aviator instruction and training for men and women were the same: they had to complete the same education and tasks to earn flying licenses. Flying while female, however, was often seen as a bigger liability than flying while intoxicated. The various commercial schemes women undertook to be able to get in the cockpit also made them appear to be more interested in fame and fortune than in flying.

Fly Girls lends new lyrics to a familiar tune. As women’s history gains readers and, with them, profitability, we can expect to see many more histories of forgotten women in male-dominated spaces. Women made important contributions to early aviation and would continue to make contributions as pilots, mechanics, and engineers during the world wars. They laid the groundwork for pilots like Tammie Jo Shults, the former Navy fighter pilot who safely landed a Southwest plane after it lost one of its engines earlier this year. O’Brien’s book also reminds us that for every Shults and Earhart, there are thousands of female pilots who never make it into the papers.

KMS

June 2018

Book Review: Princess: The Early Life of Queen Elizabeth II by Jane Dinsmoore

Dinsmoore, Jane. Princess: The Early Life of Queen Elizabeth II. Lanham, Maryland: Rowman and Littlefield Publishing Group, 2018.

The only thing we commoners like better than a royal wedding is a royal scandal. The magazines and newspapers that oohed and aahed over the wedding of Prince Harry and Meghan Markle saw no irony in publishing snarky articles about supposed rows with her new family members a week later. The British royal family lives in a gilded cage, and for all the riches, pomp, and splendor, we would do well to remember they are also people. In Princess: The Early Life of Queen Elizabeth II, author Jane Dinsmoore allows us to see the world’s longest-reigning monarch as just that: a regular person born into unbelievable and sometimes overwhelming responsibility.

Born Elizabeth Alexandra Mary Windsor to the Duke and Duchess of York in 1926, Princess Elizabeth was known as Lillibet by close family members. She was third in line for the throne, but this could all change if her uncle David finally married and produced an heir. She loved horses, participating in Girl Guides activities (the British version of America’s Girl Scouts), and putting up with the theatrics of her little sister, Margaret. She lived a charmed life as the apple of her parents’ eye, and if she begrudged sharing them with their royal duties, she said little. Ten years later, everything changed. With King Edward VIII’s abdication of the throne, Lillibet’s father became king and she became the heir presumptive. The princess began learning statecraft at one of the most difficult points in British history: the abdication threatened the monarchy as an institution while the coming war with Germany would test the monarchy’s place in governing the country.

Dinsmoore pulls from interviews, memoirs, and other writings, and her writing sometimes resembles a day planner more than a narrative, but her attention to detail is phenomenal. Elizabeth II’s childhood and adulthood could be seen as a type of school for scandal, perhaps preparing her for the issues that would crop up with her children’s and grandchildren’s marriages. George VI’s handling of the continued machinations of Edward VIII and Wallis Simpson (made Duke and Duchess of Windsor after his abdication) and his discovery of Mountbatten designs on the monarchy (introduced with Elizabeth’s relationship with Prince Philip of Greece) no doubt impacted how Queen Elizabeth would deal with her children’s affairs, failed marriages, divorces, and remarriages.

“When I was a little boy I read about a fairy princess, and there she is,” wrote American President Harry Truman, but there is so much more to Elizabeth Windsor’s story. The Queen Elizabeth seen during the Trooping of the Colour, royal weddings, celebrations, and memorial ceremonies is also the woman who battled insecurity and loved fiercely. She was once a young girl, a young wife, a young mother. The beautiful grounds of Buckingham Palace and Balmoral Castle were once torn apart by Luftwaffe airstrikes, their lights dimmed and dining tables bare as the royal family stayed true to the austerity measures they asked of their people. When we go looking for fairy stories, we will find them. The truth is harder to locate and often harder to take. Dinsmoore’s Princess Elizabeth is a girl hoping to meet the expectations of her family and nation while also wanting to make her own mark on it all. Perhaps she was not that different from any young woman on the cusp of taking the world by storm.

KMS

Book Review: A World on Edge by Daniel Schönpflug

Schönpflug, Daniel. A World on Edge: The End of the Great War and the Dawn of a New Age. New York: Henry Holt and Co. Metropolitan Books, 2018.

Daniel Schönpflug’s book, A World on Edge: The End of the Great War and the Dawn of a New Age (the English translation of Kometenjahre. 1918: Die Welt im Aufbruch), begins with an ominous image: an effigy of Kaiser Wilhelm strung up between two New York City streets. With the Great War at an end, the survivors had to learn how to navigate the world it created. Some, like the man represented in effigy, faded into the background, while others used the lessons of protracted war and fractured peace to claim the spotlight.

Abandoning the traditional focus on disarmament, redeployment, and reparations, Schönpflug constructs his history of the post-WWI period using the stories and experiences of people who lived it. He tells the stories of former political figures (Crown Prince Wilhelm of Prussia, Matthias Erzberger) alongside those of rising political stars (Harry S. Truman, Nguyen Tat Thanh), and trades stories of fading revolutionary movements (T.E. Lawrence and the Middle East) with those just beginning to catch flame (Nguyen Tat Thanh in Vietnam, Terence MacSwiney in Ireland). Each point has a counterpoint, but there is also commonality in the lives lived after Armistice. Russian White Army soldier Marina Yurlowa speaks of the same type of battle fatigue expressed by U.S. doughboy Alvin York. Artists Walter Gropius and George Grosz hoped their art would help them make sense of the new world; Gropius found purpose in construction, while Grosz saw only nothingness. The men would become leaders of the Bauhaus and Dadaist artistic movements, respectively.

Schönpflug’s inclusion of women (Virginia Woolf, journalist Louise Weiss, Moina Michael, the aforementioned Marina Yurlowa) was a welcome surprise. Their stories share equal space with those of the men, a huge departure from many historical treatments that relegate women’s wartime and post-war experiences to a separate “women’s” chapter. The inclusion of a female soldier is especially heartening, as Russian historiography has only recently restored a place for armed women in its history.

The author’s fresh approach to the post-war period does not preclude him from exploring the ways the Treaty of Versailles laid the groundwork for the rise of National Socialism. Schönpflug prefers to stoke a slow burn, showing the reader how individuals could go from elation over the end of armed hostilities to disillusionment over the world the war made. Nations and individuals alike placed their hopes in salvation through Wilsonian diplomacy and the League of Nations. The United States’ rejection of the League and the League’s resulting failure would lead them to different ideas and different saviors.

Kierkegaard wrote “life can only be understood backwards, but it must be lived forwards.” Schönpflug’s history reminds us that “forwards” contains multiple directions, and people understand their present in multiple ways. For some, the swinging Kaiser represented the freedom to break free from old traditions, ideas, and constraints. Others found the peace did not live up to its promise and inclined towards despair. All agreed a world begotten by violence would not easily shake the lessons of its cradle. Post-war Europe was on the edge of a new world; the next few years would determine whether it remained mired in the ashes or rose like a phoenix.

kms 2018

Burning Down the House: Putting American Women in their Place Following WWII

World War II changed a multitude of things, but not American gender norms and stereotypes. The war reinforced the differences between men and women and deepened the power struggle. Allan M. Winkler drew a direct correlation between women’s involvement in the war effort and the development of the women’s rights movement, but this only tells part of the story.[1] It was not participation, but the gender-based barriers and limits to women’s participation in the war effort that reinvigorated the women’s civil rights movement. “Utilizing American woman power was a matter of military expediency,” wrote Michaela M. Hampf in “‘Dykes’ or ‘Whores’: Sexuality and the Women’s Army Corps in the United States during World War II.”[2] Expediency does not connote acceptance or appreciation, a distinction that followed women throughout the war. “Opponents to even a temporary participation of women felt that not only the efficiency of the military was threatened, but also the traditional system of male dominance and the roles of female homemaker and male breadwinner” continued Hampf.[3] In other words, women who did not stick to hearth and home were seen as more likely to burn down the house than to keep the home fires burning. The response to the possible subversion of traditional gender roles was an increased effort to keep women in their place.

One effective way to reinforce the traditional structure was to play up the differences between men and women by highlighting the ways in which women could never measure up to the ideal represented by American manhood. Low wages and low expectations concerning the duration of female employment were blatant reminders of women’s worth in the workplace relative to their male counterparts; other reminders were less transparent. Articles on industry beauty contests, fashion shows, and “war fashion tips for feminine safety” shared pages with war reports in the monthly newsletters of a New England shipyard, for example.[4] These articles framed women workers as both “helpless” and “glamorous,” two decidedly undesirable traits in workers meant to keep the economy and the war effort on track.[5]

Media depictions took contradictory representations of women even further. Women were depicted in images like “Rosie the Riveter,” but were also prominent in posters warning soldiers of venereal disease, “penis propaganda” that implied any woman could present a threat to manhood.[6] Male promiscuity was excused, accepted, and even expected, but female promiscuity threatened the health of American society and of its fighting men. The “virgin/whore binary” (coined by Lisa Wade in her essay for Sociological Images) was not limited to factory work or propaganda.[7] Women who served in military capacities had to be careful not to be too ambitious lest they be branded as lesbians, prostitutes, or a combination of both. Linda Grant DePauw noted more work has been published on military prostitution than on women who served as combat soldiers during the war.[8] The relative lack of research on women’s combat service compared to their illicit sexual service preserves the hypersexualized “otherness” of women in war, reminding us historians are not immune from the social norms and cultural mores of the environment in which they research and write.

Participation in the WWII workforce did not magically give women agency, nor did it open society’s eyes to their worth and abilities. If it had, there would have been no need for the women’s civil rights movement. Society does not change on its own, and the process is brutal. Some women simply could not reconcile the “new sense of self” and “self-reliance” fostered by working outside of the home with the societal expectation that they would “cheerfully leap back to home” when the men returned from war.[9] As Dellie Hahne told Studs Terkel in an interview for his “The Good War:” An Oral History of World War II, “a lot of women said, Screw that noise. ‘Cause they had a taste of making her own money, a taste of spending their own money, making their own decisions.”[10] As the hands that rocked the cradle learned they could handle many other tasks, they were not content to go back to how things were. The war had changed them, but it was up to them to change their world.

[1] Allan M. Winkler, “The World War II Homefront,” History Now: The Journal of the Gilder Lehrman Institute, The Gilder Lehrman Institute of American History, accessed December 12, 2016, https://www.gilderlehrman.org/history-by-era/world-war-ii/essays/world-war-ii-home-front.

[2] Michaela M. Hampf, “‘Dykes’ or ‘Whores’: Sexuality and the Women’s Army Corps in the United States during World War II,” Women’s Studies International Forum 27 (2004): 13-30, accessed December 14, 2016, EBSCOhost, 13.

[3] Ibid., 16.

[4] Jane Marcellus, “Bo’s’n’s Whistle: Representing ‘Rosie the Riveter’ on the Job,” American Journalism 22, no. 2 (2005): 83-108, accessed November 28, 2016, EBSCOhost, 94.

[5] Ibid.

[6] See http://www.cnn.com/2015/08/25/health/wwii-vd-posters-penis-propaganda.

[7] Lisa Wade, “The Virgin/Whore Binary in World War II VD Propaganda,” Sociological Images, June 15, 2011, accessed December 15, 2016, https://thesocietypages.org/socimages/2011/06/15/the-virginwhore-binary-in-world-war-ii-vd-propaganda/.

[8] Linda Grant DePauw, Battle Cries and Lullabies: Women in War from Prehistory to the Present. (Norman: University of Oklahoma Press, 1998), 262.

[9] Winkler, 352; Terkel, 120.

[10] Studs Terkel, “The Good War:” An Oral History of World War II. (New York: The New Press, 2011). Kindle edition.

The Last Witch of Parkland

On March 15, 1895, Michael Cleary burned his wife Bridget alive. He claimed his real wife had been taken by the fairies and a changeling left in her place. After days of folk remedies (including dousing her with urine and force-feeding her herbal concoctions) and attempts to coax the fairy to leave by exposing it to the lit hearth (in other words, burning Bridget with the flames), Cleary finally poured paraffin oil on her smoldering clothing, setting her aflame. The media descended on Cleary’s trial in a frenzy, digging into the details of the witness statements and “evidence” of the supernatural at work in modern times.

There was more to this fairy story, however. “The overwhelming message of the fairy legends is that the unexpected may be guarded against by careful observance of society’s rules,” explained Angela Bourke in her 1999 book, The Burning of Bridget Cleary.[1] To Bourke, Bridget presented a more potent challenge to her local society than the supernatural ever could. A trained dressmaker who owned her own Singer sewing machine and raised her own chickens, she was an educated tradeswoman who earned her own money. Her clientele brought her into contact with men and women in higher social classes, and through them, new ideas about what she wanted and expected from life. A woman who could support herself financially could not be as easily controlled by a husband or by society in general. Add the fact that she had not performed her wifely duty of bearing a child to carry on the Cleary name, and Bridget was a dangerous anomaly within the social norms of her community.

News coverage of battered spouses always seems to turn up warning signs far too late, and Bridget’s story is no different. A few months before she was killed, Bridget confided in her aunt Mary Kennedy about her troubles at home, saying “He’s making a fairy of me now, and an emergency…he thought to burn me three months ago.”[2] Bridget could have been speaking figuratively, saying her husband was disappointed in her and wished she would revert to the naïve, uneducated woman he married. It also could have been a literal cry for help, voicing her fears that her husband planned to harm her physically. History does not allow us to say with certainty which of these possibilities is true, but we do know Michael Cleary justified burning his wife to death because she was a “fairy.”

Cleary went to jail for fifteen years and his wife became the “last witch of Ireland,” a neat label that both sold papers and kept the public from developing too much empathy for the woman. Bridget Cleary was not a witch. At most she was a victim of the supernatural, or at least a victim of a society that used the supernatural as a cover for forcibly bringing women into line with accepted conventions.

One hundred years later, we pat ourselves on the back for disdaining the supernatural. We say we don’t burn witches, but that’s not exactly true. Modern society retains its own system of rules and punishments to regulate female behavior, rules that more often than not contradict those it holds for males. Our worst censure is reserved for women who defy convention: the ones who speak when they are supposed to be silent, rage when they are supposed to be resigned, act when they are supposed to be accepting. We don’t burn women at the stake; we roast them on social media. There is a reason the slang term for putting someone in their place with a well-timed insult is a “burn.”

The survivors of the school shooting at Marjory Stoneman Douglas High School in Parkland, Florida, on February 14 have come under fire for their response to the massacre. It defies the resigned “thoughts and prayers” that bolster the status quo. Channeling their grief and anger into action, the teenagers built one of the most powerful and compelling challenges to the American gun lobby in recent memory, if not ever. The sincerity of their message, spoken and shouted through tears, is difficult to deny, so detractors took aim at the messengers themselves. NRA leaders and other opponents of gun control insisted the teens were too young to be so poised and must therefore be mouthpieces for adult anti-gun/anti-Second Amendment groups already in place.

The worst insults seem to be reserved for Emma Gonzalez, a young woman whose words are as cutting as her hair is close-cropped. She called B.S., so Leslie Gibson, a Republican candidate for Maine’s House of Representatives, referred to her as a “skinhead lesbian” on Twitter. Outrage over Gibson’s comments forced him to drop out of the race, but branding Miss Gonzalez in this manner shows modern America has its own answer to the Irish changeling fairy tale. Women must look and act a certain way to be accepted and must parrot the approved message if they are to be respected. Her haircut is not threatening in itself. Her sexual orientation, whatever it may be, has absolutely no bearing on her stance on gun control. Gibson may have attacked her classmates for their message, but he refused to hear Gonzalez because of her appearance and his interpretation of her sexuality. A non-white female with the courage to stand up to established adult politicians and the strength to stay on message as she attended a month of friends’ funerals and memorial services? Threatening does not begin to describe the woman. Neither does powerful. She again did the unthinkable at the March 24 March for Our Lives rally in Washington, D.C., by staying silent. For six long minutes and twenty interminable seconds, Gonzalez stood on the stage, spending most of that time saying nothing as tears dripped down her face. She weaponized silence, bringing the crowd to its feet and her detractors to their knees. The gun control crusader was without speech but had the last word.

In looking to history for lessons, we must remember we will sometimes see things we don’t want to see, including the fact that repeated “thoughts and prayers” are historically ineffective at keeping history from repeating itself. That prejudice, hate, and fear make words like “lesbian” (and “Pocahontas,” for that matter) slurs and insults. That over a hundred years of experience, growth, and technology cannot keep us from behaving in the same ways as our “backward” ancestors did when confronted by change and challenge. We don’t burn young women as witches anymore, but we are very keen to crush the spirits of women and men who refuse to conform to societal expectations.

Describing the Cleary case in 1901, historian Michael J. McCarthy bemoaned the fact that the “events took place, not in Darkest Africa, but in Tipperary; not in the ninth or tenth, but at the close of the nineteenth century.”[3] Another century has passed. When will America stop burning its “witches,” or at least accept the fact that we aren’t as enlightened and modern as we would have others believe?

[1] Angela Bourke, The Burning of Bridget Cleary (New York: Penguin Books, 1999), 34.

[2] Bridget Cleary quoted in Bourke, 75.

[3] “Bridget Cleary burned to death,” excerpted from Michael J. McCarthy, Five Years in Ireland, 1895-1900, posted in Library Ireland, accessed March 25, 2018, http://www.libraryireland.com/articles/Burning-Bridget-Cleary/.

KMS 2018

Taking it to the Streets: International Women’s Day, March 8, 1917

One hundred one years ago today, thousands of Russian women took to the streets to protest high prices and food scarcity. “Down with high prices” and “down with hunger,” they shouted. Their voices did not go unheard. Thousands joined them the next day as a labor strike broke out. On March 9 (February 24 according to the Russian calendar), approximately 200,000 workers filled Petrograd. Their new battle cry? Down with the tsar.

In Hemingway’s The Sun Also Rises, one character describes going bankrupt as happening “gradually and then suddenly.”[1] “Gradually and then suddenly” is also an extremely apt way to characterize the 1917 February Revolution in Russia. The revolutionary spark kindled by the massacre of Father Gapon’s followers on Bloody Sunday in 1905 was temporarily dimmed by Nicholas II’s creation of the Duma and other assorted attempts at reform. The next twelve years saw steady economic decline, rampant inflation, military setbacks and defeats in World War I, and a continued increase in the people’s distrust of and dissatisfaction with their autocratic government. All of these factors contributed to the February Revolution, but what caused it to occur at that specific time? Why not in January, or the previous December? The revolution needed a flashpoint, and that came in the form of a loaf of bread. Anyone who wants to identify the roots of the February 1917 Revolution need look no further than what was on (and more importantly, what was not on) Russian dinner tables. More than allegiance to any revolutionary dogma or nationalist feeling, Russians of every class and creed shared the experience of persistent food insecurity. Food scarcity does not link to the entire revolutionary movement in a straight line, but it is both a common theme and a symbol of the problems within the Russian government, military, and people themselves.

“It all began with bread,” wrote historian Orlando Figes in his social history of the Revolution.[2] As the country mobilized for war, the majority of the nation’s food production was earmarked for sustaining the millions of men (and women) serving at the front (and rear).[3] Even this was not enough, as soldiers complained of the lack of provisions, arms, and other necessities. “In Ivov, before the eyes of 28 thousand soldiers, five people were flogged for leaving their courtyard without permission to buy white bread,” wrote soldier A. Novikov.[4]

Food insecurity was even worse on the home front. As peasant farmers realized they could not buy enough food to support their families, they turned to farming subsistence crops like potatoes and oats instead of traditional grains. In the cities, workers had money to buy food, but near-constant shortages meant there was no food to buy. “We will soon have a famine,” wrote Maxim Gorky to his wife, Ekaterina. “I advise you to buy ten pounds of food and hide it.”[5] Most would not be able to make such preparations. “They say: work calmly, but we are hungry and we cannot work,” wrote a group of female workers in June 1915. “They say there is no bread. Where is it then? Or is it only for the Germans that the Russian land produces?”[6]

Everyone seemed to recognize Russia’s situation was dire except the tsar. While his country starved, the “little father” of Russia dined in style. Describing a typical meal at Tsar Nicholas II’s table, Alexander Mosolov wrote of “soup…followed by fish, a (game or chicken) casserole, vegetables, sweets, (and) fruit.”[7] The ruling family washed down this abundance of food with “madeira, white and red wines for breakfast… and different wines served at lunch, as is the custom everywhere else in the civilised (sic) world.”[8] There is no more powerful demonstration of the tsar’s disengagement from the people he ruled than the royal family enjoying the finest Bordeaux while their subjects could barely scrape together enough food to keep themselves alive. Nero is said to have fiddled while Rome burned, but the Romanovs did feast while the Russian people starved.

The people were hungry, the army was in disarray, and the government seemed out of touch at best, but the situation was still not quite ripe for revolution. The people needed a common cause they could rally behind. This cause crystallized in the bread lines of Petrograd. Figes described the Petrograd bread lines as almost “a sort of political forum or club, where rumours, information, and views were exchanged.”[9] As they recognized their common experiences and concerns, the people began to organize. Put quite simply by Figes, “(t)he February Revolution was born in the bread queue.”[10] Organized civil disobedience took a more violent turn as bread shortages led to bread riots. Strikes and walkouts in factories increased the number of people demonstrating in the streets, making it ever more difficult for the police to regain control. The tsarist government fell, the Romanov Dynasty ended, and a Provisional Government made up of leaders of the Duma was left to pick up the pieces. It should be no mystery why the Bolsheviks captured the imagination of the people. Their promises of peace, land, and bread neatly summed up the needs of every Russian soldier, farmer, and worker, man or woman, child or adult.

When compared to other causes of the Revolution—World War I, failed reforms, tsarist incompetence—bread seems insignificant. Lack of bread, however, is extremely significant. The Russian government could not meet the needs of its people. Hundreds of thousands died at the front lines and at home while the Duma struggled against a tsar who had no understanding of his country’s issues or impending demise. The 1917 February Revolution in Russia continued a legacy of protest that included the 1789 women’s bread riots in revolutionary France and the bread riots in Richmond, Virginia (then capital of the Confederacy) in 1863. Food insecurity inspired women to speak, and gave them a message to which their societies listened.

One hundred one years later, women still wait in bread lines, walk miles for clean water, and struggle to care for their families. Equal pay and equal rights continue to escape even the most modern, “civilized” nations. It is easier to create hashtags and slogans than real change. On this International Women’s Day, we recognize the women who spoke up and walked out. We salute the women who continue to refuse to let the status quo determine their present and stifle their future.

 

[1] Ernest Hemingway, The Sun Also Rises (1926; New York: Scribner, 2006), book 2, chap. 13.

[2] Orlando Figes, A People’s Tragedy: The Russian Revolution 1891-1924. (New York: Penguin Books, 2006), 298.

[3] Ibid.

[4] A. Novikov to A. I. Ivanova, Moscow, in “Excerpts from Soldiers’ Letters, Intercepted by Censors, 1915-1917,” in Russia in War and Revolution, 1914-1922: A Documentary History, ed. Jonathan Daly and Leonid Trofimov, accessed June 6, 2016, http://www.snhu-media.snhu.edu/files/course-repository/graduate/his/his630, 12.

[5] Maxim Gorky quoted in Figes, 300.

[6] “Proclamation of Kostroma women workers to soldiers, June 1915,” in Russia in War and Revolution, 1914-1922: A Documentary History, 11.

[7] Alexander Mosolov, “At the Emperor’s Court, Book IV” in At the Court of the Last Tsar, accessed June 14, 2016, http://www.alexanderpalace.org/mossolov.

[8] Ibid.

[9] Figes, 300.

[10] Ibid.

F-Bomb Field Trip: U.S. Army Women’s Museum in Fort Lee, Virginia

Two statues guard the entrance of the U.S. Army Women’s Museum at Fort Lee, Virginia: Pallas Athena, the Greek goddess of wisdom and war, and a female American soldier, the personification of those attributes. The USAWM was originally part of Fort McClellan, Alabama, but moved to Virginia after that base closed in 1999. It opened at Fort Lee in 2001, but only five years ago did it become the first museum at an American military installation to display a statue of a female soldier. This timeline parallels women’s fight both to participate in the U.S. military and to be recognized for their participation. The museum does a very good job of establishing the fact that women have always been involved in American wars; it was official recognition of their contributions that trailed behind.

The museum begins and ends with a large tree adorned with replicas of dog tags left behind by fallen female soldiers. One electronic exhibit allows the visitor to select the names of individual soldiers and pull up their pictures along with a short biography and service record. Sacrifice is a central theme of the USAWM: from the “unofficial” soldiers of the Revolutionary and Civil Wars to the WACs of WWII and the combat soldiers of Desert Storm and after, female sacrifice was essential to American military success.

As described on its website, the USAWM is a “repository of artifacts and archives,” but also “an educational institution.” The curators have done a fantastic job integrating elements that will keep younger visitors interested and entertained. A theater in a small alcove explains the role of Walt Disney animation in the war effort and shows several WWII-era Donald Duck cartoons. There is also an area that allows children to try on the various caps, headgear, uniform pieces, and arms mentioned and depicted in the exhibits. Kids can also take home free coloring pages as a souvenir.

The USAWM is also an important resource for historians and researchers. Appointments to view the archives or explore the service member oral histories can be made on the museum website. The archive holds over 1.5 million items, including books, photographs, scrapbooks, correspondence, and other documents. The museum also allows visitors to take home copies of the U.S. Army Center of Military History’s books on the Women’s Army Corps (WAC) by historians Bettie J. Morden and Mattie E. Treadwell. As military history has long been the domain of men, finding sources about military women written by military women is refreshing to say the least. Morden’s and Treadwell’s works would be extremely useful to students and researchers interested in investigating female participation in the army from 1942 to 1978. Treadwell’s Special Studies text provides additional information on mid-century American interpretations of gender and war and reflects on how these interpretations shaped what military women were and were not allowed to do during wartime.

The U.S. Army Women’s Museum is easy to overlook, but well worth a visit. At the time of our visit (March 1, 2018), several exhibits were under construction, including a gallery redesign and the expansion of another area. I’m definitely planning a return visit to see the new pieces and check out the U.S. Army Quartermaster Museum next door.

kms

The United States Army Women’s Museum / 2100 Avenue A / Fort Lee, Virginia / www.awm.lee.army.mil / 804-734-4327 / Tues. through Sat., 1000-1700

Today in Women’s History: The U.S. Supreme Court Rules the 19th Amendment Constitutional

“The right of citizens of the United States to vote shall not be denied or abridged by the United States or by any State on account of sex. Congress shall have power to enforce this article by appropriate legislation.”

The U.S. Constitution is, and is intended to be, a living document, but that does not mean changes are easy. Constitutional amendments are hard-fought and hard-won. The debates they spur often inspire strange political alliances. The long fight for female suffrage is the story you know, but the woman’s suffrage movement’s awkward stand against the 15th amendment is not widely publicized (for obvious reasons).

The 19th amendment to the Constitution, ensuring “(t)he right of citizens of the United States to vote shall not be denied or abridged by the United States or by any State on account of sex,” was ratified in 1920. The amendment was the result of a more than seventy-year battle for women’s rights in the United States. The 1848 Seneca Falls Convention laid out the women’s movement’s battle strategies and goals, but the changing political landscape would often thwart the suffragists’ plans.

The Civil War divided the nation; Reconstruction would end up dividing the woman’s rights movement. The proposed 15th amendment stated suffrage “shall not be denied…on account of race.” This gave white and non-white men the right to vote, but as it did not specify that suffrage could not be denied based on sex, women were again denied the right. Woman’s rights leaders Elizabeth Cady Stanton and Susan B. Anthony bristled at the idea and withdrew their political support for the amendment. “If that word ‘male’ be inserted,” Stanton had warned during the debates over the 14th amendment, “it will take us at least a century to get it out.”[1]

“That word” did not appear in the 15th amendment, but the implication was enough to bar women from voting. The women’s movement split into two groups: Anthony and Stanton formed the National Woman Suffrage Association (NWSA), while Lucy Stone and others who supported the ratification of the 15th amendment formed the American Woman Suffrage Association (AWSA). Disagreements over ideology and methodology hampered the movement’s efficiency and its ability to present a united message regarding woman’s rights. Further complicating matters, some groups interpreted the NWSA’s anti-15th amendment stance as evidence of racism within the movement. Though the NWSA’s connection to groups that supported racial discrimination was tenuous at best, it damaged the organization’s image and message. It was not a particularly effective way to court the thousands of African-American women who also wanted civil rights as American citizens, to say the least.

As is often the case, an American war was the ultimate impetus for American social change. Women’s contributions to mobilization for World War I finally convinced male leaders and politicians that women’s participation could not be ignored. Anyone who gave so much for their country, and made do with so little, deserved the civil rights too long denied them. (Of course, women’s protests and other forms of mobilization for suffrage also forced politicians’ hands.) “I regard the concurrence of the Senate in the constitutional amendment proposing the extension of the suffrage to women as vitally essential to the successful prosecution of the great war of humanity in which we are engaged,” said President Woodrow Wilson in an address to Congress in 1918.[2] Congress agreed, passing the 19th amendment in 1919. It was ratified the following year.

The final challenge to the amendment’s constitutionality came in the 1922 Supreme Court case Leser v. Garnett. In the original case, lawyer Oscar Leser sued to have two women removed from Maryland voting rolls, arguing women did not have the right to vote in Maryland because the state had not ratified the 19th amendment. Justice Louis Brandeis, writing for the Court, ruled that women’s suffrage applied to all American women regardless of whether or not their own state had ratified the amendment. Ratifying the amendment put the law on the books, but the 1922 decision in Leser v. Garnett ensured it was a law that women could use.

kms

[1] Elizabeth Cady Stanton quoted in Akhil Reed Amar, America’s Constitution: A Biography (New York: Random House, 2005), 394.

[2] Woodrow Wilson quoted in “Women’s Suffrage and WWI,” National Park Service, Accessed February 27, 2018, https://www.nps.gov/articles/womens-suffrage-wwi.htm.

Publications List

A list of my published work, including links.

“Female Spies of the SOE,” Historic UK History Magazine, April 2018.

“Truth to Power: First Ladies that Shaped Presidencies and Policies: Abigail Adams,” History is Now, June 2018.

“Land Girls and Lumber Jills,” Historic UK History Magazine, June 2018.

“Truth to Power: First Ladies that Shaped Presidencies and Policies: Eleanor Roosevelt,” History is Now, July 2018. 

“Female Spies in the Irish War of Independence,” History Today, August 2018.

“Truth to Power: First Ladies that Shaped Presidencies and Policies: The Two Mrs. Wilsons,” History is Now, June 2018.

“The Sinking of the Lancastria,” Historic UK History Magazine, November 2020.

“Another Brick in the Wall:” American Children Go to War

“WHAT ARE YOU DOING? Have you started a garden? Are you helping to win the war? Everybody must work and work hard. Soldiers and sailors cannot fight without the help of the rest of us…When the count is made, on the roll of National Service, will you be PLUS one, or MINUS one?”[1]

American men fought, but the entire nation went to war in World War I. War altered all aspects of American life. Some civil liberties like freedom of the press were restricted in the name of national security. The draft created soldiers out of male citizens. The federal government gained unprecedented oversight over domestic production, foreign trade, and other economic areas that impacted mobilization and maintenance of the war effort. Defeating Germany required the work and sacrifices of every American- a charge they were reminded of at every turn.

Originally intended to be a go-between for the government and the press, George Creel’s Committee on Public Information soon turned to spreading the gospel of patriotism. The passage above is typical of media during the time; what is a bit atypical (at least to modern eyes) is that this article was published in Boy’s Life: The Boy Scouts’ Magazine. The effort to “draw children to military values and service” led directly to what Dr. Ross F. Collins termed the “militarization of American childhood.”[2]

Magazines such as Boy’s Life, St. Nicholas Magazine for Boys and Girls, and American Boy linked children’s activities directly to the success of the soldiers fighting overseas. They encouraged planting gardens, buying victory bonds, volunteering for the American Red Cross, and other activities. Idle American hands were not just the devil’s workshop, but the enemy’s. The magazines depicted war as a patriotic and heroic duty, soldiers as valiant adventurers, and men, women, and children back home as the first line of national defense in their absence.

Even vocabulary and syntax changed to reflect the urgency of the message. Sentences became shorter and often addressed the reader directly in the second person, something unheard of in today’s more detached third-person journalistic style. For example, an article on pest control in home gardens begins, “The war is on. You have enlisted as a gardener…Mobilize your forces. Get a store of ammunition (arsenate of lead and the other poisons), get a machine gun or two (hand sprayers), and post your guards…”[3] Children listened: growing, saving, doing, and doing without as directed. An article entitled “How the Boy Scouts Help in the War” even described a group of Boy Scouts who created and volunteered for service at home “to protect their mothers and sisters” in the Boy Scout Emergency Coast Patrol.[4]

Of course, most of American children’s education in war did not come from their voluntary consumption of popular media. Wholesale militarization could only be achieved through compulsory education on the virtues of war. The lessons in patriotism and civic duty taught at home were reinforced at school. A former educator himself, President Woodrow Wilson understood the power of the captive audience of a classroom. The CPI began publishing a bimonthly newsletter, National School Service, to instruct teachers on how to teach the war, and more importantly, to emphasize the importance of every citizen’s support, regardless of age. “There may be those who have doubts as to what their duty in this crisis is,” wrote Herbert Hoover in the inaugural edition of the newsletter, “but the teachers cannot be of them.”[5]

The newsletter urged teachers to encourage their students to participate in many of the activities celebrated in the aforementioned children’s magazines. “War savings stamps, food and fuel economy, the Red Cross, (and) the Liberty Loan, are not intrusions on school work,” explained one article. “They are unique opportunities to enrich and test not knowledge, but the supreme lesson of intelligent and unselfish service.”[6] This idea dovetailed neatly with the president’s belief in the subjugation of individual agendas and gains to those of society as a whole, the need for the interest of every citizen to be “consciously linked with the interest of his fellow citizens, (his) sense of duty broadened to the scope of public service.”[7] Like their students, teachers listened, but they faced a stiff penalty if they refused. “Teachers who remained neutral concerning patriotism could be fired, as ten were in New York City, of hundreds in many incidents across the nation,” explained Collins in Children, War, and Propaganda.[8]

“It is not the object of this periodical to carry the war into the schools. It is there already…There can be but one supreme passion for our America; it is the passion for justice and right…for a world free and unfearful,” explained Guy Stanton Ford, director of Civic and Educational Publications for the CPI.[9] Wilson echoed Ford’s ideas, saying “it is not an army that we must shape and train for war; it is a nation.”[10] War mobilization militarized every aspect of American life; childhood was no exception. “Children were exhorted to sacrifice individuality for the group and pleasure for work; in essence, to take on the previous ‘adult’ role of responsible worker,” wrote Andrea McKenzie in “The Children’s Crusade: American Children Writing War.”[11] Once this facet of innocence was lost, it was not and could not be restored. Children did not fight in World War I, but their childhood experiences would shape their response when their nation called them to the front lines in the 1940s. The “greatest generation” came of age believing their parents’ war was also theirs. They prepared from childhood to serve in their own.


[1] “How the Boy Scouts Help in the War,” Boy’s Life: The Boy Scout’s Magazine 1, no. 1 (June 1917), 42, Boyslife.org Wayback Machine, accessed October 31, 2016, http://boyslife.org/wayback/.

[2] Ross F. Collins, “This is Your War, Kids: Selling World War I to American Children,” North Dakota State University PubWeb, accessed November 1, 2016, https://www.ndsu.edu/pubweb/~rcollins/436history/thisisyourwarkids.pdf, 3; 24.

[3] “The Second Phase of the War,” Boy’s Life: The Boy Scout’s Magazine 1, no. 1 (June 1917), 34, Boyslife.org Wayback Machine, accessed October 31, 2016, http://boyslife.org/wayback/.

[4] “How Boy Scouts Help in the War,” 7.

[5] Herbert Hoover, “Hoover Commends Teachers,” National School Service 1, no. 1 (September 1, 1918), 1, National School Service 1918-1919, Kindle edition. 

[6] Guy Stanton Ford, “National School Service,” National School Service 1, no. 1 (September 1, 1918), 8, National School Service 1918-1919, Kindle edition.

[7] Henry A. Turner, “Woodrow Wilson and Public Opinion,” The Public Opinion Quarterly 21, no. 4 (Winter, 1957-1958): 505-520, accessed October 31, 2016, http://www.jstor.org/stable/2746762, 507.

[8] Ross F. Collins, Children, War, and Propaganda, (New York: Peter Lang, Inc., 2011), accessed October 31, 2016, http://www.childrenwarandpropaganda.com.

[9] Ford, “National School Service,” 9.

[10] Woodrow Wilson, quoted in National School Service, 9.

[11] Andrea McKenzie quoted in Ross F. Collins, “This is Your War, Kids: Selling World War I to American Children,” 10.

Sources Cited

Boy’s Life: The Boy Scout’s Magazine 1, no. 1 (June 1917). Boyslife.org Wayback Machine. Accessed October 31, 2016. http://boyslife.org/wayback/.

Collins, Ross F. Children, War, and Propaganda. New York: Peter Lang, 2011. Accessed October 31, 2016. http://www.childrenwarandpropaganda.com.

——. “This is Your War, Kids: Selling World War I to American Children.” North Dakota State University PubWeb. Accessed November 1, 2016. https://www.ndsu.edu/pubweb/~rcollins/436history/thisisyourwarkids.pdf.

National School Service 1, no. 1 (September 1, 1918). National School Service 1918-1919. Kindle edition.

Turner, Henry A. “Woodrow Wilson and Public Opinion.” The Public Opinion Quarterly 21, no. 4 (Winter 1957-1958): 505-520. Accessed October 31, 2016. http://www.jstor.org/stable/2746762.