How History and Memory Converge to Make Sense of The Past

Photo Credit: https://sites.google.com/a/worth.org.uk/worth-school-activities/history-society

History is the process by which individuals and societies make sense of the past. Although they are often used interchangeably, the terms “History” and “The Past” are not synonymous. “The Past” is the verified, factual information we know about past events in human history. We know, for example, that the Declaration of Independence was written in 1776. “History,” however, is the process by which we document, contextualize, and interpret the meaning of a particular event. Why was the Declaration of Independence written? Who wrote it? What was going on in the world at the time of its writing? What social, economic, religious, and political forces inspired the document’s author? What were the consequences of its publication? These are the types of questions historians ask when researching and interpreting “The Past” to make an informed historical argument about something like the writing of the Declaration of Independence.

Memory plays a necessary and crucial role in creating history. “Memory” is the process by which individuals and societies choose to remember (and forget) their pasts. Memories are created after an event has taken place and take the form of oral recollection, art, public iconography, and many other expressions of personal reflection. How did Thomas Jefferson remember his role in writing the Declaration years later? What did members of the Continental Congress think of the event? How did citizens of the colonies remember hearing about the Declaration of Independence? What monuments, statues, markers, and plaques were created to commemorate the event? What messages did these icons attempt to convey to viewers about the Declaration? How is the Declaration remembered by society today? These are the types of questions historians and memory scholars ask when researching how present-day conditions simultaneously shape and are shaped by past events. History and memory intersect to tell us what happened in the past, and what it means for us today.

What are the distinctions between history and memory? Is there a distinction between the two? Scholars disagree on this question, but I think there are distinctions, albeit subtle ones.

Take the case of a veteran’s recollection of a wartime experience twenty years after a significant battle. The truthfulness of that soldier’s recollection may not be fully verifiable against the evidence created at the time the battle originally took place. His or her recollection may contradict the official battle report written at the time (“The Past”), or it may include details that were previously omitted. Sometimes the recollection may even unintentionally confuse or invent crucial details with the passage of time. Nevertheless, the veteran’s memory exists as a “personal truth” for him or her: an individual process by which the soldier copes with, comprehends, and understands their experiences in that battle. The tricky task for the historian is to determine whether the veteran’s recollection should be incorporated into the body of evidence being used to interpret the history of that battle. Is the recollection reliable? Does it help advance the story? Does it help or hinder the historian’s effort to make sense of The Past?

Historian Jonathan Hansen argues that history advances through hypothesis while memory evolves over time but never really advances. I like that description because memories of a given event will change over time (a new personal reflection or the erection of a new monument, for example) but those memories may not be verifiable in the same way a historical fact can be through a hypothesis.

Much of what we understand about The Past is based on memory, which simultaneously informs and muddles the historical process. As such, the concept of “Truth” does exist within the historical process, but it takes multiple forms. The International Coalition of Sites of Conscience defines four different forms of “Truth”: forensic truth (the factual, verifiable past), personal truth (a personal memory), social truth (a collectively held truth as expressed through art, public iconography, political speeches, etc.), and healing truth (a collective process of historical reckoning, such as South Africa’s Truth and Reconciliation Commission).

The above description is how I understand the distinctions between The Past, history, and memory. These three phenomena constantly interact and shape each other, leading to the creation of individual and collective understandings of past events that in many cases contain multiple truths for us to learn from.

Cheers


How Historians and Musicians Receive Similar Training in College

Yours Truly Performing at Off Broadway in St. Louis. Photo Credit: Rick Miller Photography

Over the years numerous friends and family, knowing that I studied history in college and now work as a public historian for a living, have come to me with a range of questions about people and events from the past. I think more often than not I have failed to give them a satisfactory answer to their questions. That’s because in most cases they’ve asked questions about time periods of which I have only a basic and limited understanding. As fascinating as I find the Roman Empire, the Medieval Era, the Great Depression and the New Deal, and other periods in history, I just don’t have the specialized knowledge to give an accurate, informative answer in most cases. And yet oftentimes these questions are prefaced with a comment like, “you’re a historian, so you should be able to help me…”

The reality is that most professional historians specialize in a particular time period, and that time period can be quite small in scope depending on the individual historian’s interests. I think non-historians sometimes assume that the primary goal of studying history is the accumulation of facts. As historian David McKenzie pointed out on Twitter, historical knowledge for many is “simply cramming facts into one’s head to be spit out at a moment’s notice.” While learning facts and establishing historical accuracy are certainly important facets of any history degree program, there are many other elements of good historical practice. These include (but are not limited to) the ability to search for and interpret the larger context surrounding a particular event, the need to understand change over time, the importance of crafting solid research questions, the talent to be a good reader, writer, and speaker, and the training needed to become well-versed in both primary and secondary source material of a particular, specialized historical era.

When I struggle to answer my friends’ and family’s questions, I point out that historians are in some ways similar to musicians. My area of expertise is nineteenth century U.S. history–particularly the Civil War Era–and that is my “musical instrument,” so to speak. You wouldn’t say “oh, you’re a musician! Go over and play that guitar” without first asking that musician what instrument they play and if they could play guitar. And just because a musician can play guitar doesn’t mean they can play tuba or do a freestyle rap on the spot. The situation is similar with historians. I can talk about the Battle of Shiloh or the Civil Rights Act of 1866, but I’d have a more difficult time giving a detailed answer about, say, D-Day or the Civil Rights Act of 1964. As much as I’d love to give detailed answers and remarkable facts about every event in human history, the limits of human intelligence require a more specific and concentrated focus.

Music education students in college are required to learn how to play a string instrument, a brass/woodwind instrument, and sing in a choir regardless of their prior expertise. They also learn music theory and develop an ability to read sheet music whether it’s in treble clef or bass clef (or alto clef!). As future teachers of band, orchestra, and choir in a K-12 setting, this training prepares them to help students learn how to play an instrument, read sheet music, and perform together in an organized creation of musical sound. History students at the undergrad level receive a similar curriculum in that they take courses in U.S., European, and World history during their training. They receive a broad instruction that enables them to educate younger students about a wide swath of human history. But like the musician with a specific instrument that they specialize in and perform with in concerts, the historian finds a time period to specialize in and contribute to through public talks, the creation of scholarship, and, in my case as a public historian, by interpreting history to a wide range of publics.

Cheers

 

NCPH 2018: Where Do We Go From Here?

Last week I attended the 2018 Annual Meeting of the National Council on Public History in Las Vegas. It was my fifth straight NCPH conference and my first time in Las Vegas, which in itself was quite a treat as I took some time to take in the city’s sights and sounds. As a pretty active member of NCPH I ended up spending lots of time during the conference in committee meetings and planning for my own presentation in the session, “Rewiring Old Power Lines: The Challenge of Entrenched Narratives.” I did have the chance, however, to attend a range of sessions during the conference. Overall I enjoyed my experience and left with a lot of satisfaction about my participation in NCPH. I do have questions and concerns moving forward, however. What follows are three thoughts about the conference and the state of public history:

What is the meaning of the term “Community”?: One of the strong points of attending NCPH conferences is that presenters are constantly exploring ways to bring the ideals and values of public history to new audiences. Every year there seems to be passionate discussion about three different questions:

1. How to rewrite narratives to incorporate the perspectives of previously marginalized historical actors in interpretive programs.

2. How to bring new audiences to public history sites, particularly young people and people of color.

3. How to establish a community-oriented culture of inclusion and equity at public history sites.

I believe these concerns are fundamental to the public history profession, and they’ll always play an important role in how the field defines itself. Effectively addressing these questions is a great challenge without clear answers. I confess, however, that this year I felt like some of the conversations I heard were akin to listening to a song on repeat. Sometimes I felt like asking, “okay, these concerns are valid, but I’ve heard these same thoughts for the past five years. What are we actually doing to push the field in a new direction?”

Part of the problem, I think, is that the term “community” is sometimes thrown around in an irresponsible way. Like the term “general public,” there really is no such thing as a “community.” There are only “communities,” and any discussion about “meeting the needs of the local community” really should be pluralized. Take, for example, my hometown of St. Louis. St. Louis County has 90 municipalities, all of which have their own histories and present-day needs. St. Louis city is a separate legal entity from the county and has its own neighborhoods and ethnic enclaves. Nearby Jefferson County and especially St. Charles County have experienced explosive suburban growth over the past twenty years. Several counties in Illinois also have a close association with St. Louis. All of these areas fall under the term “St. Louis Metro Area,” but as a public historian I can’t really talk about “meeting the needs of the St. Louis community.” The area is too big and the population too diverse to be described in historical terms as a single community. Ferguson, Chesterfield, and Affton are all in St. Louis County, for example, but have different histories and different needs. In reality there are some communities in St. Louis that are well served by their public history institutions and others that are not. So when we talk about meeting the needs of a local population, we need to start from the premise that there are many communities in a given locality we should be reaching out to and serving.

Concerns about Mid-Level Professionals: I think NCPH has done a wonderful job of making its annual meeting a welcoming place for graduate students and new professionals. Both groups benefit from a mentoring program, a special outing the first night of the conference, the Speed Networking session, and an environment that is friendly to new attendees. In general I think students and new professionals get a lot out of the NCPH Annual Meeting.

As I experience my own transition away from the term “new professional,” however, I’ve been thinking more and more about mid-level professionals and what the organization is doing to meet their needs. Those of us who have been in the field between four and ten years are most likely still in the field because we were fortunate to find jobs to support ourselves. But what happens when you’ve got your foot in the door with that entry-level position but can’t move up? I am greatly concerned about the number of mid-level professionals I spoke with who are struggling to find career growth and new opportunities to put their skills to practice. For many, there is no career track to speak of. Throughout the conference I thought about a member of my graduate school cohort who had left the public history field for a job in sales a couple of weeks earlier. Another NCPH conference-goer who recently retired mentioned that his position isn’t being filled. I also admit to my own concerns about my future in public history.

What can NCPH do to help mid-level professionals find the career growth they seek? I’m not sure, but it’s my hope that the Professional Development Committee (of which I am co-chair) along with other NCPH committees can begin discussing strategies for the future. Additionally, while I could not attend working group 2, “Negotiating Power Lines: Economic Justice and the Ethics of Public History,” the tweets from that panel were fascinating and hint at some interesting ideas about promoting better pay for public historians.

Props to South African Public History: A significant highlight of the conference was having the chance to attend session 36, “South African Recovery from Cruel Pasts: Using Creative Arts to Visualize Alternatives.” Members of Rhodes University’s Isikhumbuzo Applied History Unit came all the way from South Africa to present at the conference, and it was a real treat. Historian Julia Wells and historians/performers Masixole Heshu and Phemelo Hellemann discussed the 1819 Battle of Grahamstown and efforts by the Applied History Unit to bring this history to life through creative arts, including poetry, storytelling, and pantsula dancing, a type of dancing invented in South Africa in the 1950s and 1960s. Dancers Azile Cibi and Likhaya Jack demonstrated pantsula dancing for all participants, and for the first time at a conference I ended up dancing myself! They also demonstrated scenes from a play the Applied History Unit developed to portray the story of “Pete,” a native South African who was able to save his mother, a POW during the Battle of Grahamstown, and return her from bondage (true story). The session was utterly fascinating from start to finish.

(Another thing I noticed about this session was that the presenters went into the crowd, introduced themselves, and thanked each audience member for attending their session. I was struck by the kindness of this small act and think presenters at future history conferences should embrace this practice).

All in all, NCPH 2018 was a great time and I look forward to next year’s meeting. For now, it’s back to the grind.

Cheers

Exploring the Past Turns 5

Photo Credit: Pinterest https://www.pinterest.com/explore/helicopter-cake/

January 1 marks the fifth anniversary of creating Exploring the Past. Establishing an online presence to share thoughts, ideas, and scholarship with interested readers and to network with other history scholars has been immensely rewarding for me on a personal and professional level. I initially created this website as an avenue to work on my writing skills while I was a graduate student at IUPUI and to contemplate (in a public setting) what studying history meant to me. I continue to write here for those same reasons, but as a professional public historian I’ve also worked to discuss challenges I face in my work and to contribute to larger conversations within the field about fair employment practices, “public engagement,” and interpreting difficult histories.

Through this blog I’ve written more than 400 posts and have received thousands of comments, most of which came from real people and were positive in nature. I’ve developed strong real-life and online friendships, have been offered speaking and writing gigs, and have felt a sense of personal accomplishment from this blog. Most notably for this year, through this blog I was offered a regular writing position at the Journal of the Civil War Era‘s blog Muster, which has put me in contact with some of the finest Civil War scholars in the field and has challenged me to become a better writer.

What guides me in my public writing is the belief that historians should make their work accessible in content, style, and location. Historians will continue writing in long-form mediums like books and journal articles because the field needs “slow scholarship” – scholarship that requires time for comprehensive research, thinking, and revision, oftentimes over several years. But blogging is a unique art form in and of itself: the ability to break down a complex topic into 100 to 1,200 words is a challenge not easily accomplished even by the best historians. History blogging oftentimes reaches an audience much broader than the one reached by books and journal articles, and it forces writers to put their best foot forward when making an argument that will reach an audience beyond the confines of the academy or the museum. I consider my public writing an extension of my work as a public historian, and it offers me a chance to discuss topics that I may not get to discuss in my regular job.

I believe 2017 was a major year of growth for me as a historian, intellectual, and scholar. I gave several talks, including one you can see here in which I discussed controversial public monuments; I wrote a journal article on Missouri Congressman John Richard Barret that now looks to be published next year; I was elected to the Board of the Missouri Council for History Education; I made huge strides at work, where I’ve taken on increased responsibilities, including developing education programs for schools and senior groups, running teacher workshops, and conducting historical research; and I wrote five online essays that in my belief constitute some of my best writing.

Conversely, my personal success was accompanied on this blog by a good number of negative, personally insulting, and trollish comments – more than the previous four years combined. I attribute part of this development to the internet in general, where efforts to improve the public discourse are Sisyphean in nature, but I also believe it’s reflective of this blog’s growing readership. If a post shows up on Google and ends up being shared by a few people who may love or hate what you have to say, you’ll quickly find out that people from all parts of the globe will find your writings, for better or worse.

What was particularly strange for me was the number of negative comments on blog posts that I wrote several years ago. There is no such thing as a perfect writer, and improving one’s writing is a process that takes years. There has been a noticeable movement among Twitter users to delete old tweets that could be harmful in the present, and more than a few times I have contemplated deleting old blog posts here that no longer reflect my thinking (and there are a good number of them here). I have made mistakes over the past five years and it would be easy to remove them. At the same time, however, I believe this blog is in some ways a tangible story of my growth and development as a historian. It is a personal archive of sorts, and I choose to leave it as is not just for others but for myself.

2018 will start with lots of exciting projects and I look forward to seeing what happens from here. As always, thank you for your readership and support over the past five years.

Cheers

Rothenbucher v. Grant, Dent, and Long (1859)

I’ve been working on a research project in collaboration with the Missouri State Archives, and in the course of that research the folks at the archives came across an 1859 court case involving Ulysses S. Grant and his father-in-law that I had never seen before. I wish I could say that the court case provides groundbreaking insights into Grant’s experiences while living in St. Louis (1854-1859), but instead it adds more confusion and mystery to that story.

On August 11, 1858, Philip Rothenbucher loaned $200 to Grant, his father-in-law Frederick Dent, and Harrison Long, with whom I’m unfamiliar. The promissory note states that “Twelve months after date we, or either of us” promise to pay the loan back at ten percent interest. A year went by and no one had paid back the $200, so Rothenbucher sued in the St. Louis County Circuit Court on September 6, 1859. Rothenbucher wrote a testimony and produced the promissory note signed by Grant, Dent, and Long. Apparently no one on the defense appeared in court, and on September 7 Rothenbucher was awarded $222.40 (the original $200 plus $22.40 in interest, roughly 11 percent of the note).
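For readers who want to check the figures, a quick calculation shows that simple interest at the note’s ten percent rate nearly accounts for the judgment amount (the dates and dollar figures come from the court record; the 365-day day-count convention is my assumption, and the small remainder may reflect court costs):

```python
# Sketch of the interest arithmetic in Rothenbucher v. Grant, Dent, and Long.
# Assumes simple annual interest on a 365-day year; figures from the record.
from datetime import date

principal = 200.00          # amount of the promissory note
annual_rate = 0.10          # ten percent interest stipulated in the note

loan_date = date(1858, 8, 11)       # date the note was signed
judgment_date = date(1859, 9, 7)    # date Rothenbucher was awarded damages

days_elapsed = (judgment_date - loan_date).days  # 392 days
interest = principal * annual_rate * days_elapsed / 365
total_due = principal + interest

print(f"Days elapsed:     {days_elapsed}")
print(f"Interest accrued: ${interest:.2f}")   # $21.48
print(f"Total due:        ${total_due:.2f}")  # $221.48, just shy of the $222.40 awarded
```

The $0.92 gap between the computed $221.48 and the $222.40 judgment is unexplained in the record itself; this is only an illustration of how close the award tracks the note’s stated terms.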

But here’s where things get weird.

The St. Louis County Sheriff reported that he successfully executed a writ of summons to Dent and Long to appear in court, but that “the other defendent [sic] U S Grant not found in my County.” Dent and Long were therefore held responsible for the $222.40 due to Rothenbucher while Grant was dismissed from the case. I suppose this outcome was also possible because the original note states that “we, or either of us” would figure out a way to pay back the debt. What’s weird to me is that Grant was still in St. Louis in September 1859. In fact, he wrote a letter to his father on August 20 reporting that he was waiting to hear back from a Board of Commissioners appointed to select the next St. Louis County engineer, and another to his father on September 23 stating that his application for county engineer had been rejected and that he was unsure about his future in St. Louis. The last letter in Grant’s hand from St. Louis was written in February 1860 (See The Papers of Ulysses S. Grant, Volume 1, pages 350-355 to see these letters).

So where was Grant in early September 1859? I am stumped. In any case, this lawsuit further reinforces the fact that Grant was badly impoverished and in debt by the time his family left St. Louis for Galena, Illinois. Probably no one involved in this case could have expected that Grant would be president ten years later.

Here are the files from the court record. Some of the pages are hard to read:

On Compromise and the Coming of the Civil War

The essence of all politics is the art of compromise. The success or failure of a nation-state’s policy goals lies in the ability of its political actors–some of whom may have vastly different interests–to negotiate and sometimes compromise on preferred ideals in the interest of crafting intelligent policy that promotes the greater good. Compromise, of course, doesn’t always lead to positive outcomes. As the philosopher Avishai Margalit beautifully argues in On Compromise and Rotten Compromises:

We very rarely attain what is first on our list of priorities, either as individuals or as collectives. We are forced by circumstances to settle for much less than what we aspire to. We compromise. We should, I believe, be judged by our compromises more than by our ideals and norms. Ideals may tell us something important about what we would like to be. But compromises tell us who we are. (5)

Superficially, it sounds silly to ask whether compromises are good or bad, much like asking whether bacteria are good or bad: we cannot live without bacteria, though sometimes we die because of bacteria. Yet that asymmetry makes the question about the goodness and the badness of bacteria, as well as those of compromise, worth asking. We have ten times as many bacteria in our bodies as we have cells, and many of those are vital for our existence. A small number of bacteria are pathologic and cause disease, and with the proper treatment, we may get rid of them. Similarly, compromises are vital for social life, even though some compromises are pathogenic. We need antibiotics to resist pathogenic bacteria, and we need to actively resist rotten compromises that are lethal for the moral life of a body politic. (7)

This description captures one of the most fundamental quandaries of human existence: when should individuals and groups make compromises on ideals to accomplish an objective, and when is refusing to compromise the better option of the two? Studying history is a worthwhile endeavor for considering the ramifications of political compromise on the health of a nation-state and its people.

It was with this conception of compromise in mind that I read historian Carole Emberton’s fine essay in the Washington Post and Caleb McDaniel’s in The Atlantic today on the breakdown of compromise efforts leading up to the Civil War. White northerners and southerners forged successful compromises (at least in the minds of those seeking political union between the sections) on the issue of slavery from the nation’s founding. As the country acquired new western territory through conquest and purchase in the years before the Civil War, debates continually sprang up about whether the institution of slavery would accompany the white American settlers moving westward. In hindsight, various compromise efforts like the 1820 Missouri Compromise, the Compromise of 1850, and others were really measures to appease the proslavery south, but they nonetheless allowed the Union to be maintained for nearly eighty years after its founding.

It’s worth asking students of the Civil War to consider how compromise over slavery was possible in 1850 but not in 1860. My answer would be that the Republican Party’s successful entrance into electoral politics changed the game. The Republicans explicitly organized as a party in 1854 on the principle that slavery should be banned in the western territories and left open for free labor (for some Republicans, this meant only free white labor). Although Abraham Lincoln acknowledged that constitutionally speaking slavery could not be touched where it already existed in the south, his personal hatred of slavery was well known and feared by proslavery fire-eaters who saw his election as a step toward federal governance dominated by northern anti-slavery convictions: in other words, an administration hostile to the south’s economic, political, and social interest in keeping African Americans enslaved.

President-elect Lincoln was willing to compromise to the extent that he offered support to the first proposed 13th Amendment guaranteeing the federal government’s protection of slavery in the states where it already existed, but he refused to compromise on the question of slavery’s westward expansion, drawing a line in the sand and arguing that he had been elected on the belief that the west should be for free labor. Compromising on this question would sacrifice the Republican Party’s core principle of existence. Likewise, many white Southern Democrats argued that talk of disunion could be quelled if the federal government passed legislation guaranteeing the right to bring their slave property west with them. They refused, however, to make any further compromises short of these new guarantees from the federal government. As Emberton argues, “it was slavery, and the refusal of Southern slaveholders to compromise on slavery, that launched the Civil War.”

Cheers

Why Claiming that a Writer is “Biased” is Usually Meaningless

In the great lexicon of “Commonly-Used Words that Mean Absolutely Nothing in Contemporary Discourse,” the term “biased” is perhaps the most meaningless of all. Go through a few Amazon book reviews of recent historical scholarship and you will undoubtedly read reviews that don’t actually engage with the book’s content but claim that the author is “biased.” Scroll through social media and view discussions about essays in online news sources, and sure enough you’ll see people complaining about bias.

Complaining that a writer has a bias is more often than not a completely meaningless gesture that simply intends to end discussion about a particular topic. Rather than engaging the writer’s argument, claiming bias shifts the argument toward questions about the writer’s motivations. And more often than not, this exercise is speculative; the critic really doesn’t know anything about the writer’s motivations, scholarship, or personal experiences. If you cannot explain those motivations or clearly explain what the author is biased for or against, then claiming “bias” is meaningless.

I’ve experienced claims of “bias” in my own writing on this website. One of the most popular essays I’ve written here explores Ulysses S. Grant’s relationship with slavery before the Civil War. As you can see in the comments of that essay, several readers claimed that I was “biased,” overly generous to Grant, and that I wouldn’t be so generous to Robert E. Lee. While I’ve mentioned Lee in passing in various essays here, I have never made him a featured subject and have never discussed his relationship with slavery, so there’s no proof I would actually treat Lee differently from Grant. The claims against me are speculative in nature, based on feelings rather than evidence of how I would treat that case. In reality, these claims say more about the reader than my scholarship and are a perfect example of why claiming “bias” is meaningless.

All writers approach their subjects with biases shaped by past life experiences, education, and political motivations. Having biases is in fact perfectly natural. The burden of proof in determining whether those biases irreparably damage the writer’s argument falls onto the critic, however, and thinking about bias claims this way actually makes the task of convincingly arguing that an author is biased all the more difficult. Even when a writer’s bias is unmistakable, as in the case of Dinesh D’Souza’s relentless distortion of the history of the Ku Klux Klan to support his hatred of the Democratic Party, focusing on the writer’s arguments is a far better course of action than speculating about his or her personal motivations.

Focus on the game, not the players.

Cheers