“Make America Great Again,” “A Future to Believe In,” and Competing Philosophies of American History
The current U.S. election has been a consistent stream of embarrassing statements, extremist rhetoric, radical political stances, glaring hypocrisies, and nonstop media coverage that often comes off as an uncritical infomercial for Hillary Clinton and/or Donald Trump, the presumptive presidential nominees of the Democratic and Republican parties. I rarely discuss contemporary politics on this website, and I don’t want to wade too deeply into those depths with this post.
One theme from this election that interests me, however, is the degree to which political change from the status quo is necessary to ensure future prosperity for the United States. Bernie Sanders and Trump maintain two remarkably contrasting political platforms, but they’re also the two loudest advocates for a political revolution that completely dismantles the vaguely defined Washington “establishment” and puts a totally new order of governance into place. Meanwhile, other candidates like Clinton and the now-departed John Kasich often speak in more moderate terms about incremental change, compromise, and the toning down of heated rhetoric.
Trump’s campaign slogan, “Make America Great Again,” has clearly resonated with a good number of Americans who, for various reasons, feel like they are falling behind economically while also watching their moral values and ways of life being destroyed in a twenty-first century culture war. The slogan offers itself as a great title for a manifesto in support of a conservative revolution. But what, exactly, does it mean when we call for America to become great again? Are we not great now? What greatness are we trying to recover? Who are we trying to take the country back from? When in American history was this country ever truly great for all? What does “Make America Great Again” say about how we view the whole of U.S. history?
In his 2015 publication Fighting Over the Founders: How We Remember the American Revolution, historian Andrew M. Schocket argues that the memory of the American Revolution and the development of U.S. history hold inherent “political and cultural implications” for how we view the world today. In sum, how we view the country’s founding can say a lot about how we view the role of politics and government in our lives today. Schocket distinguishes between those who view the American Revolution from an “essentialist” viewpoint and those who view it from an “organicist” viewpoint.
The essentialists argue that American history has only one discernible meaning that offers us clear lessons for navigating the contemporary world, and that any other interpretation or act of “historical revisionism” that diverges from the clear, God-ordained version of American history is flawed. This version of history emphasizes the importance of “private property, capitalism, traditional gender roles, and Protestant Christianity,” according to Schocket, and it views the U.S. Constitution as a perfect or near-perfect document that promotes freedom and liberty for all. The essentialists also assume that our contemporary U.S. government has strayed from its glorious founding ideals, and that the great future struggle of American society lies in restoring our political life to harmony with the constitutional order that existed during the nation’s founding and early formation, an era they regard as the greatest and most free in our history. The essentialist vision is all about making America great again.
The “organicist” version of American history differs from the essentialist one in several ways. The organicists argue that there is no single, fully accurate version of American history that can be learned without interpreting the facts of the past. They believe that there are many ways to interpret this history and that appreciating the various interpretations people form about the past allows for a more holistic and accurate understanding of American history. For example, Schocket explains that “you might insist that white Virginians revolted primarily because they wanted to keep their slaves, and I might insist that white Virginians revolted primarily because they resented British governance, and we could both have a legitimate claim to be debated.” The organicists also refrain from glorifying the American Revolution and the nation’s early years too much, instead stressing the contrast between the ideals of the founders and the sometimes destructive policies they implemented in practice. The great future struggle of American society for the organicists isn’t so much about returning the country’s government to a state of perfect alignment with its glorious past (which in their minds is a contested belief subject to debate) as it is about improving upon that past by achieving the ideals of freedom, liberty, and equality through good governmental practices in the present. The organicist vision is perhaps best articulated through Sanders’ campaign slogan, “A Future to Believe In.” (It bears repeating, however, that the Sanders campaign platform of a “political revolution” to accomplish these ideals is contested, and I doubt all organicist-minded thinkers would agree with the necessity of such a revolution.)
I suspect that most Americans fall somewhere along the essentialist-organicist spectrum. I don’t believe the Constitution or American history as a whole can be understood through one uniform narrative devoid of interpretation, and my training as a historian stresses the importance of understanding multiple perspectives and interpreting history through both primary and secondary sources. I also tend to agree with Ulysses S. Grant when he wrote (more or less) that it’s impossible for a contemporary society to live solely by the rules set by people hundreds of years ago. In these regards I find myself aligned more with the organicist interpretation. I can embrace some essentialist beliefs, such as the idea that the nation’s Constitution and republican form of government have promoted freedom and liberty for many Americans, but I would argue that the challenge of enhancing everyone’s freedoms is a never-ending project that requires constant debate and discussion about the best path forward. There will never be a point when we wave a “Mission Accomplished” banner because we’ve successfully implemented a perfect form of republican governance throughout the United States. Moreover, asking an essentialist-type question like “What would Jefferson do?” is not very useful. Instead, we should follow the lead of historian David Sehat and ask, “What is the common good today?”
I believe that cultural and political critiques don’t need to offer workable solutions in order to be valid. The act of criticizing is valuable in and of itself. I remember one time, for example, when a National Park Service official visited my place of employment and argued that “if you come to me with problems without offering solutions, you’re just whining and complaining.” I thought at the time and still believe today that that line of thinking is absolute crap. A problem doesn’t go away because there are no foreseeable solutions. Sometimes problems require teamwork, dialogue, and extended time for workable solutions to be implemented. Demanding that the critic bear the responsibility of solving the problem at hand is, in reality, a subtle defense of the status quo.
I mention this belief because we historians are a criticizing people. We interrogate the meaning of anything and everything, and we formulate interpretations of past and present events in ways that can elicit heated debate between members of the profession and between historians and their many publics.
Some of the most interesting and passionate conversations within the historical community occur when new films, performing arts pieces, and historical literature about the past are released and gain widespread popularity beyond the boundaries of the profession. Whenever something like Steven Spielberg’s Lincoln is released to critical acclaim, historians are always quick to throw their voices into the discussion and wag their fingers about historical inaccuracies and potential problems with the interpretive thrust of these cultural artifacts. Oftentimes they present thoughtful critiques that refrain from offering workable solutions that would enhance the historical accuracy of a given production, and that’s okay! But I must admit that I sometimes wonder what good these sorts of critiques really do for anyone besides making the reviewer look like a grumpy curmudgeon. Don’t historians realize that media like film, theater, and children’s books are not the same as academic scholarship and therefore require a different form of communicating the stuff of history to audiences? What would these historians do if they were tasked with writing a film, play, or piece of literature? How would they interpret something like the American Civil War in ninety minutes as opposed to four hundred pages?
The latest examples of historians-as-cultural-critics are taking place around Lin-Manuel Miranda’s hit Broadway musical Hamilton and Ramin Ganeshram’s children’s book A Birthday Cake for George Washington.
Hamilton focuses on the life of Alexander Hamilton and the politics of early American history. The show has consistently sold out on Broadway and is slated to earn hundreds of millions of dollars as it prepares to tour theaters across the country. In recent months, however, historians have been pushing back against some of the musical’s themes and interpretations. I see Lyra D. Monteiro’s review in The Public Historian as a catalyst in pushing these critiques towards a larger discussion with Hamilton’s viewing audience. In the musical Miranda employs people of color to depict the founding fathers, partly as a way of showing how contemporary Americans of all backgrounds have the power to take ownership of American history. Monteiro, however, rightly points out that no actual people of color from the time period are depicted in the musical, and that while Hamilton is billed as “the history of Americans then, interpreted by Americans today,” such a distinction is actually hurtful in that it suggests no people of color were around during the Revolutionary Era. She also takes issue with the themes of individualism and the glorification of the American Dream that are prevalent in the musical. Meanwhile, William Hogeland and David Waldstreicher take issue with the musical’s portrayal of Hamilton as a leading progressive thinker, Jason Allen calls the musical “a color-blind Stockholm Syndrome,” Nancy Isenberg argues that Hamilton’s arch-nemesis Aaron Burr was actually not that bad a guy, and the front page of the New York Times on April 11th includes an extended discussion with other historians who have weighed in on the musical’s accuracy.
A Birthday Cake for George Washington was pulled from the shelves in January by its publisher, Scholastic, after intense criticism of the ways it allegedly depicted slavery in a benign fashion. Ganeshram discussed the pulling of her book in the Huffington Post, arguing that she wrote it under the “reasonable assumption that understanding the overarching horror and criminality of slavery was a given — and that parents and educators would share that context in a way that was most appropriate for their young listener,” but the essay has not brought the book back to the shelves. One of the most vocal critics of the book was living history interpreter Michael Twitty, who, writing in The Guardian, argued that “our society has poorly dealt with slavery in relation to our children,” and that A Birthday Cake for George Washington represents a larger truth about America’s inability to deal with its history of slavery. Curiously, however, Twitty acknowledges that while he knows Ganeshram personally, he has never talked with her about the book, nor has he even read the book itself. And while Twitty is certainly right to point out that we need to do a better job of discussing slavery, especially with young children, his failure to explain how he proposes to solve this problem leaves readers wondering how future authors can improve upon the messages conveyed in A Birthday Cake for George Washington. Perhaps we really don’t have a solid blueprint for discussing slavery with children, which in turn opens the door for historians to start discussing solutions for writing better historical stories about slavery rather than constantly critiquing each children’s book that comes out about the topic.
Again, I think it’s important that historians contribute their voices to larger conversations about the ways history is depicted in popular media, film, and literature, but I also wonder if and how we can add legitimacy to our viewpoints by going beyond the “historians say ______ is inaccurate” model. Historical interpretations in an artistic, entertainment-based medium are not going to meet the exacting standards of someone used to having books published by an academic press or someone working in a professional public history setting for a living. Historians should acknowledge that and act accordingly when critiquing popular media.
I am currently working my way through The Way We Never Were: American Families and the Nostalgia Trap by historian Stephanie Coontz. It’s a very provocative book that challenges a lot of our preconceived notions about family structures in U.S. history, and Coontz convincingly argues that the concept of “traditional family values” is really an invention of contemporary politics rather than anything rooted in historical fact.
Coontz points out that a common and persistent myth in current political discourse is that families today are suffering from the effects of modern “rootlessness”: the belief that families today are more mobile and transient than they used to be, that children generally have more fractured relationships with their parents and grandparents, and that children are being raised less by their parents and more by surrounding influences such as television, the internet, popular media, friends, and other community members. Coontz challenges this interpretation with a stunning fact that I had never seen before:
Families are not more mobile and transient than they used to be. In most nineteenth-century cities, both large and small, more than 50 percent–and often 75 percent–of the residents in any given year were no longer there ten years later. People born in the twentieth century are much more likely to live near their birthplace than were people in the nineteenth century (14).
She goes on to suggest that families today actually have stronger bonds than those of the nineteenth century. Grandparents are living longer and forging stronger relationships with their grandchildren, visits with relatives have increased, and only four percent of children today do not live with either parent, compared to ten percent in 1940 and perhaps an even higher share in the nineteenth century. Furthermore, enslaved people in the nineteenth century often saw their families broken apart, while family struggles and employment structures like apprenticeships for white families also demonstrate how communities and outside factors have always played an integral role in raising children.
This discussion got me thinking about the sorts of identities and allegiances nineteenth-century Americans would have forged for themselves.
There is a school of thought that argues that more people identified with and considered themselves citizens of a state before aligning with the United States as a whole, especially before the Civil War. Confederate General Robert E. Lee is seen as the archetypal figure for this line of thinking. At the outbreak of war, with Virginia choosing to side with the Confederacy, Lee asserted that “I have been unable to make up my mind to raise my hand against my native state, my relations, my children & my home . . . & never desire again to draw my sword save in defence of my State.” Despite his years of service to the U.S. Army, Lee’s first allegiance was to Virginia and, by extension, his family. In his mind he had little agency in the matter, since a choice to fight for the Union would be the ultimate betrayal of his primary allegiance. The novelist Shelby Foote infamously crystallized this state-allegiance theory for millions of viewers in Ken Burns’s documentary The Civil War:
Before the war, it was said ‘the United States are’—grammatically it was spoken that way and thought of as a collection of independent states. And after the war it was always ‘the United States is,’ as we say today without being self-conscious at all. And that sums up what the war accomplished. It made us an ‘is.’
While this theory is compelling, I think there is room to question its accuracy.
Lee’s life experiences before the war represent an aberration from those of most nineteenth-century Americans. He grew up in a prosperous, stable family with deep roots in his native state, and those roots were solidified even more when he married into the Custis family. Most families had neither the wealth nor the state roots of Lee’s family in the years before the Civil War. While it’s true that Lee’s army career took him to places far away from Virginia such as St. Louis and Texas, those travels initially strengthened his allegiance to the Union, not his state. As the late historian Elizabeth Brown Pryor pointed out, Lee commented in 1857 that his patriotism extended to the whole country and that it “contained no North, no South, no East no West, but embraced the broad Union, in all its might & strength, present & future.” That sentiment clearly contradicts his later statements at the outbreak of war. Moreover, a number of Lee relatives felt differently about their allegiances and eagerly signed up to fight for the United States against secession, and other notable Virginians like George Thomas and Winfield Scott had no qualms about maintaining their commissions in the U.S. Army and their allegiance to the Union.
Shelby Foote’s assertion is also questionable. Andy Hall analyzed nineteenth century publications using Google Ngram and discovered that while “United States are” and “United States is” were used interchangeably during the early years of the Republic, the 1840s witnessed a sharp spike in the use of the term “United States is,” which may be indicative of wartime passions and calls for unity during the Mexican-American War. These calls were often led by nationalists North and South like Henry Clay, Thomas Hart Benton, and Daniel Webster. But beyond the written word we may also question how nineteenth-century Americans could have developed such strong allegiances to a state if they were so geographically mobile. And what about the millions of immigrants who came to the United States in the years before the Civil War? Did they emigrate out of an allegiance and identification with a particular city or state within the country, or did they come because of a belief in American ideals and a love of the whole Union?
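For readers curious about the mechanics behind an Ngram comparison like Hall’s, here is a minimal sketch in Python. The yearly frequencies below are invented placeholder values for illustration only, not actual Google Books data; the function simply reports the first year in which one phrase’s relative frequency overtakes the other’s.

```python
def first_overtake_year(freq_is, freq_are):
    """Return the first year in which the frequency of the first phrase
    exceeds the second's, or None if it never does. Both arguments are
    dicts mapping year -> relative frequency (e.g., per million words)."""
    for year in sorted(freq_is):
        if year in freq_are and freq_is[year] > freq_are[year]:
            return year
    return None

# Hypothetical sample frequencies (NOT real Ngram values), shaped to
# illustrate the pattern Hall describes: rough parity in the early
# Republic, then a spike in "United States is" during the 1840s.
freq_is = {1800: 0.8, 1820: 0.9, 1840: 1.1, 1846: 2.4, 1860: 2.9}
freq_are = {1800: 1.0, 1820: 1.0, 1840: 1.2, 1846: 1.5, 1860: 1.4}

print(first_overtake_year(freq_is, freq_are))  # → 1846
```

Run against a real Ngram export, a function like this would let you pinpoint the moment “United States is” began to predominate in print, rather than relying on Foote’s impressionistic before-and-after claim.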
Nineteenth-century Americans were a mobile people. In an age of cheap, federally subsidized land, ever-developing transportation and communication technology, and rapid westward expansion, many Americans moved from place to place in search of communities and infrastructures that gave them the best chance at maintaining a stable family and economic life. It’s not evident to me that they would have automatically identified with a state more so than a local community, a city, or the whole Union. Their allegiances may have been multiple and enduring, but for many Americans their love of Union was paramount.
I always said, blacks need to stop bringing up slavery all the time. It was a long time ago. Why can’t they just move on and forget about it? But then they wanted to move on and get rid of these confederate statues, and I was all like, “Things that happened a long time ago are still important. You shouldn’t forget about them!”
The above quote comes from a really funny piece of satire that a friend shared with me from The Push Pole, a website based out of southern Louisiana. Its title seems apt for the times: “Thousands of History Buffs Magically Appear After City Council Votes to Remove Confederate Monuments.” The piece is funny because it’s rooted in a partial truth about the complex and contradictory ways Americans often choose to remember their history: “Never Forget” is applied selectively, extending to the historical events and people we care about, but when it comes to historical things we consider overblown or simply not worth caring about, “we need to move on” becomes the default response. (See Andrew Joseph Pegoda’s essential essay on “Never Forget” for more thoughts on the subjective nature of the term.)
The taking down or altering of some public statues, monuments, and memorials honoring the Confederacy sparked a vigorous debate in 2015 about the place of Confederate iconography in America’s commemorative landscape and whether or not some of these icons–particularly the ones in places of public governance, public schools, town squares, and the like–should remain in their places of honor. The online discussion took place through blog posts, newspaper op-eds, and thousands upon thousands of comments. While some of these discussions were productive and enlightening, we were also treated to excessive and misleading cries of “erasing history” (a flawed argument to make when analyzing public iconography), poor analogies that compared changes to Confederate iconography to the ISIS-led destruction of Middle Eastern history, and emotion-filled hysterics that often said more about the politics of the present than any actual grasp of historical knowledge. And while folks got emotionally heated about Confederate icons, other historical artifacts such as this 19th-century Virginia slave cabin are being demolished, or in other cases face potential demolition in the near future, amid near silence both on and offline.
What is the point of preserving symbolic icons that commemorate historic events and people if the actual historical artifacts that tangibly represent those events and people–letters, historic homes, battlefields, and other material objects–disappear? What would happen if some of that energy expended on debating iconography went towards preserving local history, Civil War battlefields, slave cabins, historic cemeteries, material artifacts, or archival records?
You and I can write blog posts or comment on newspaper articles until our fingers break off, but none of it really matters unless we get involved in our local communities and work towards convincing our neighbors of the importance of preserving history. Contact your local officials and tell them why public funding is important for ensuring a future grounded in an honest, responsible understanding of the past. Tell them to support historic preservation efforts in your area. Tell them that it’s important to support history education initiatives in the K-12 classroom such as National History Day, as well as humanities programs in community colleges, four-year colleges, and universities. Tell them to support local institutions like historical societies, museums, and archival repositories. Join a preservation group like the Civil War Trust or the National Trust for Historic Preservation. Go visit a nearby National Historic Site. Attend a historical reenactment. Ask questions and be willing to listen and learn about the past, even if it’s difficult and unpleasant.
If you live in a community where a statue, monument, or memorial is currently garnering controversy, read up on relevant scholarship about the historical event being commemorated and why a symbolic icon was erected to preserve the memory of that event. Honestly consider whether or not that symbolic icon should remain in a place of honor in your community. If town hall meetings or other events are taking place about the history in your area, go to them. Listen to the perspective of other community members and express your own thoughts as well. Work towards becoming an active member of your community and an advocate for history.
If 2015 marks the beginning of a renewed conversation about history and memory in American society, let us use 2016 as a starting point for a renewed effort towards advancing the importance of supporting, preserving, and educating people about the history that is all around us. Get off the message boards and get to work in your community.
Cheers to a great new year.
As I went about my day this morning I came across a Facebook post from a friend that nearly made my eyes roll out of my head. The post linked to a sixth-grade-quality “listicle” entitled “21 Things About America That Most Americans Don’t Realize.” I typically ignore these sorts of things, especially lists like this one that provide absolutely no evidence to back up their so-called historical facts. But a commenter on said friend’s post remarked about his pleasure in seeing number five, which is posted above, and I had to jump into the debate. I am no expert in 17th-century history and I probably should have stayed out of the fray, but there’s a larger point about American history that needs to be made here.
Blacks have their own troubling role in slavery’s legacy. African elites in Western and Central Africa willingly helped advance the slave trade, as Henry Louis Gates explains in the New York Times. And there were some free blacks who owned slaves in America. Anthony Johnson is perhaps the most well-known black slaveholder. Born in Africa but sent to Virginia in 1621, Johnson worked as an indentured servant until around 1635. By the 1650s he was a well-to-do property owner with his own indentured servants. In 1653, one of those servants, John Casor, sued Johnson and argued that his term of service had expired. The courts found in 1655 that Johnson still “owned” Casor and ordered that he be returned to Johnson immediately. This case was the first in which a court found that a person who had not committed a crime could be legally held in lifelong servitude–enslavement–under British law. This is where claims emerge that Anthony Johnson was America’s first slaveholder.
The reality of the situation is much more complex, of course.
With regards to British law, the first legally enslaved African was John Punch, who was sentenced to lifetime servitude after attempting to run away to Maryland at some point in either late 1639 or early 1640, fifteen years before Johnson took ownership of Casor. Sociologist Rodney D. Coates of Miami University, in an analysis of the racialization of early American law cases, accurately concludes that “John Punch’s name should go down in history as being the first official slave in the English colonies” since he was the first one to be legally enslaved through English law (333). Anthony Johnson was not the first slaveholder in America and, by extension, the first slaveholder in America was not black. It’s also puzzling why so many biographies of Johnson (see here, here, and here) would omit such an important historical fact if it is actually true that he was the first American slaveholder.
This debate over who was the first slaveholder in America exposes the sorts of biases Americans have when it comes to understanding their own history.
All too often we Americans are taught in our White Anglo-Saxon Protestant history textbooks that “American history” began with the Virginia Company’s settlement at Jamestown, Virginia, in 1607, followed by the Pilgrims’ Mayflower Compact and settlement at Plymouth, Massachusetts, in 1620. History textbooks portray the growth of what eventually became the United States as spreading from East to West, with America’s origins in the thirteen colonies followed by steady expansion westward. But settlement patterns in the present-day United States actually started in the opposite direction! According to sociologist and historian James Loewen, “people…discovered the Americas and settled it from west to east. People got to the Americas by boat from northeastern Asia or by walking across the Bering Strait during an ice age. Most Indians in the Americas can be traced by blood type, language similarity, and other evidence to a very small group of first arrivals…either way, afoot or by boat, evidence suggests that people entered Alaska first” (20). Moreover, following Columbus’s “discovery” of the New World in 1492, Spanish colonists settled in places like present-day New Mexico, Texas, and Florida before other European settlers went to Virginia, Maryland, and Massachusetts. When they came to the New World, these Spanish colonists enslaved its indigenous populations and later began importing African slaves following Ferdinand and Isabella’s approval of African slavery in 1501. St. Augustine, Florida, was a hub for the Spanish slave trade. And it was all legal!
Clearly there were people living in the Americas thousands of years before any Europeans came over, although we often ignore that reality. For example, even though my hometown of St. Louis, Missouri, is celebrating 2014 as the 250th anniversary of the city’s “founding,” there was an advanced society (Cahokia) right in our backyard hundreds of years before 1764 that was at one point larger than London. After 1492, non-British Europeans colonized the Americas and traded slaves long before the British came over. To suggest that Anthony Johnson was the first legal slaveholder in what would eventually become the United States is utter poppycock, no matter what any viral internet garbage tries to tell you.
Over the past couple of weeks I have been participating in an online seminar (a “webinar”) called “Co-Creating Narratives in Public Spaces” that is being co-hosted by the National Park Service and the Museum Studies program at George Washington University. Yesterday’s webinar focused on “Relevance, Diversity, and Inclusion” within the National Park Service. I shared some thoughts and participated in a good dialogue with several other scholars on Twitter, and I feel like I’ve gotten a lot out of the event so far. I would like to make a brief note, however, on the use of the term “changing demographics” and what, exactly, it means when we talk about changing demographics in the United States.
One of the primary questions we discussed yesterday was the following:
Is the shift toward more inclusive narratives more than a reflection of–or a response to–the changing demographics of America?
This question is based on a faulty premise by suggesting that the notion of “changing demographics” is a relatively new one in American society.
Even though the presenters at the webinar took pains to argue that their use of the term “changing demographics” referred to broad social changes in the U.S.–an aging Baby Boom Generation leaving the workplace for retirement, an increasing number of women in positions of power, and recent debates about the role of sexuality in American society–it was obvious to me that “changing demographics” was mostly associated with the changing racial/ethnic demographics of the country brought on by immigration. As Joel Kotkin remarks in Smithsonian Magazine, “Immigration will continue to be a major force in U.S. life . . . the United States of 2050 will look different from that of today: whites will no longer be in the majority. The U.S. minority population, currently 30 percent, is expected to exceed 50 percent before 2050. No other advanced, populous country will see such diversity.”
When it comes to race and ethnicity, yes, the United States is certainly becoming more diverse. But the United States has always been diverse, no matter what context you place on the term “changing demographics.” This nation’s demographics have been in constant flux since at least 1776, and probably before then. Men, women, young and old people, people with disabilities, people who identify as LGBTQ, immigrants, slaves, Europeans, Indians, Africans, Asians, and Hispanics have always lived in the United States and been a part of its history. The shift towards more inclusive narratives in interpretive history should not take place because of today’s “changing demographics” but because much of the interpretive history told in this country has never accounted for the demographic changes that have always been a part of the American experience.
Moreover, the shift towards more inclusive narratives needs to happen because the need for accurate history is equally if not more important than any notion of “changing demographics.” Inclusiveness and accuracy go hand-in-hand. When Park Rangers at Gettysburg told visitors in the 1960s that the American Civil War was about “states’ rights,” they undoubtedly alienated any African Americans who may have visited the park. But ultimately they interpreted history that was simply inaccurate. When we leave out the role of minorities, women, and other unacknowledged groups from American history, we are telling inaccurate history. Richard Sandell argues in Museums, Prejudice and the Reframing of Difference that audiences at museums and other cultural institutions view these places as sources of knowledge and information akin to newspapers, television, or libraries. People come to public history sites seeking knowledge and information that addresses the questions they consider important. Public history sites are resource centers where people go to make sense of their world. We are obligated to do our absolute best to provide them accurate history, and we do a disservice to the historical record when we don’t strive for inclusive narratives that highlight the experiences of ALL Americans.
In sum, I believe that the shift towards inclusive narratives is both reflective of and a reaction to the history of changing demographics in the United States that cultural institutions have only recently acknowledged. In creating inclusive and accurate narratives, we must also strive to tell stories, plural, that shed light on the American experience rather than focusing on a futile effort to tell one grand narrative that purports to speak for all of us.
The question of when, exactly, the United States became a truly unified nation has dominated the discussions of scholars seeking to explain the origins of American nationalism. Robert Penn Warren famously argued in 1961 that the United States could not be considered a nation until the blood-spilling of the American Civil War ended in 1865. The American Revolution, according to Warren, “did not create a nation except on paper . . . [The United States] became a nation, only with the Civil War.” Others argue, however, that a nation did in fact exist before the Civil War. Hans Kohn points out that pre-war tariff policies that favored the development of American commercial interests over European ones along with a national thirst for westward expansion demonstrate that the roots of American nationhood were established well before the outbreak of the Civil War in 1861.
Regardless of when the United States became a unified country, almost everyone agrees that the Civil War altered Americans’ relationship to their nation, both politically and culturally. Even though bloody civil wars don’t necessarily bring about a stronger sense of nationalism in their aftermath, popular depictions of the Civil War in American history and memory have framed the deaths of 750,000 Americans as a necessary sacrifice for bringing together a young, fractious nation. The philosopher William James in 1910 argued that few Americans would change their nation’s history if given the opportunity: “Ask all our millions, north and south, whether they would vote now . . . to have our war for the Union expunged from history, and the record of a peaceful transition to the present substituted for that of its marches and battles, and probably hardly a handful of eccentrics would say yes.” Some contemporary historians define the nature of Civil War death in mythical terms. Charles P. Roland’s popular An American Iliad connects the American Civil War to the Greco-Trojan war of Greek mythology: “More than a century ago the American people engaged in a great sectional conflict that reenacted all of the heroism and sacrifice, all of the cruelty and horror, of the Greco-Trojan War. The Union victory . . . forever changed the course of American history and thereby of world history.”
These comments reflect a particular way of viewing the United States that conceives of military action as the defining characteristic of American nationalism. Warfare, more than any other political, economic, social, or cultural factor, brings Americans together into an “imagined community” whose citizens are willing to die in battle to defend the rights and freedoms of fellow citizens thousands of miles away from their own homes.
The shocking death toll of the American Civil War demanded reflection, interpretation, and explanation from those who survived the war. To address these pressing demands of memory, Americans created new rituals they believed would maintain and strengthen the relationship between living and dead, what Union General John A. Logan described as a “solemn trust.” Although the practice of decorating graves has disputed origins, the call of Union veterans in the Grand Army of the Republic (GAR) in 1868 for all communities to decorate the graves of their local Civil War dead marked the official beginning of Memorial Day in the United States.
Memorial Day ritualizes the living’s “solemn trust” with the dead by annually reserving time on the American calendar for remembering and reflecting upon those who have died to preserve the United States and its freedoms. These rituals and observances are as much for us the living as they are for the dead. Each year we take time to justify to ourselves our belief that the dead did not die in vain and that we are a better nation because of their deaths. We also remind ourselves of the obligations we have to our fallen friends and loved ones and use Memorial Day to speak on behalf of those people. Indeed, the dead don’t have the chance to speak on Memorial Day; we speak for the dead and mold them to fit our own visions and beliefs.
The GAR played its own role in defining society’s memory of the war by annually reminding audiences in Memorial Day speeches of the righteousness of preserving the Union (and, in some speeches, the righteousness of destroying slavery). For example, Indiana veteran George W. Spahr argued in his 1893 Memorial Day speech that all Americans should be “consoled by the fact that we are no longer a doubtful confederation of States; that we are no longer a compact of colonies existing at the will and pleasure of the parties to the combine.” Above all, Spahr believed this unified nationalist spirit was born through the efforts of men whose “self-sacrifice” provided a tangible example of patriotism and love of country.
Our nation’s dead deserve a place in our collective memory and a debt of thanks that will never be fully paid. Memorial Day helps us pay a part of that debt back and reminds us of our fellow citizens who are willing to die so that we may continue to live in comfort. But lurking under our “thank the servicemen and women” sentiments lie difficult questions that this nation must continually address about the nature of military action and nationalism.
According to the historian Susan-Mary Grant:
Americans . . . have been unwilling to concede that violence rather than voluntarism played a central role in their national development. Consequently, as far as the creation of the American nation is concerned, the subject of war is approached obliquely. The American way of war, in short, is almost always presented in quasi-mystical terms that support the national idea of freedom and equality for all . . . [and] downplay the extent and the implications of violence within the nation (189, 191).
In sum, Memorial Day is often framed as a day for remembering death, but less often is it a day for remembering the act of killing. George Spahr focused on the “self-sacrifice” of Union soldiers in his Memorial Day speech, but he omitted the fact that the federal government resorted to a forceful military draft in 1863 to maintain the Union war effort. He and countless veterans were also forced to deal with the memories of wartime killing on a daily basis. As historian Reid Mitchell points out, Julia Ward Howe’s song “Battle Hymn of the Republic” includes the lyric “let us die to make men free,” not “let us kill to make men free,” a convenient sidestepping of what soldiers are actually tasked to do in the military. Any reflection on the nation’s dead requires us to analyze the nature of war itself and ask why our elected leaders sometimes choose to rely on warfare to ostensibly preserve and even enhance our American democracy.
We should do our absolute best to avoid warfare in the future. Memorial Day should be a day for reflection, thanks, and critical discussion about the state of our nation, but not a day for unquestioningly glorifying the military. Mississippi Senator John Sharp Williams said as much to a group of Confederate veterans in his 1904 Memorial Day speech:
No matter how bright the uniform, how loud ‘the shouting of captains,’ how splendid the deeds of valor, how inspiring the clangor of fife and drums, there is nothing more disgusting, nothing more detestable, and nothing more in the history of the world has been so dangerous and destructive as the puerile thirst for military fame and the schoolboy love for ‘glory’ and a strenuous life.
Given our questionable military interventions since the attacks of September 11, 2001, and our seeming inability to care for veterans once they return home, let us hope that this nation’s future is not dominated by constant warfare and the deaths of our best and brightest citizens. We owe it to ourselves and those who have died in service of the United States to promote peace at all times.