The National Council on Public History’s 2017 Annual Meeting has concluded and I’m back home doing my thing. There were more than 800 registrants at this year’s meeting who undoubtedly had a range of experiences during the conference, but on a personal level it was a true pleasure seeing old friends, making new ones, and having the chance to participate in important conversations about the state of the field.
In thinking about the conference’s theme since coming home–“The Middle: Where Have We Come From, Where Are We Going?”–my mind keeps going back to two sets of questions I have about the role of authority within the field. One concerns public historians and the publics they work with; the other concerns public historians and the people who employ them.
Regarding the former set of questions, I was struck by how various sessions grappled with whether public historians should cede or assert their authority in these situations. To cite one example, presenters analyzing controversial monuments in the United States and Argentina admitted that, beyond researching the monuments and presenting their findings, the correct path forward was far from clear. Do historians conclude by presenting their findings and avoid making declarative statements one way or the other, or do they use their authority to advocate for a particular position that may or may not reflect the viewpoint of a majority of a local community’s residents? If historians take a position, whose voices within the community do they choose to amplify, and why? More specifically, since community members already have a voice regardless of whether or not public historians are there, whose voices do we choose to use our privilege and platform in service of?
Additionally, are there times when further dialogue over something like the presence of a controversial monument is unnecessary and public historians must take political action to achieve a larger goal? How useful is it for public historians to keep discussing so-called “counter-monuments” and contextual markers for something like the Liberty Place Monument when local residents in that community are ready to take that monument down?
In “Touring Sites of Nostalgia and Violence: Historical Tourism and Memory in Germany, Poland, Turkey, and the United States,” a session I had the privilege of moderating, historical authority in the visitor experience at sites of violence was a central question. Erica Fagan of the University of Massachusetts Amherst explored the use of Instagram at Holocaust sites like Auschwitz and Dachau and mused on the extent to which historians should moderate these posts, arguing that these sites need a social media presence to dispel historical myths and falsehoods. Yagmur Karakaya of the University of Minnesota assessed several museum exhibits in Turkey that romanticized the historical legacy of the Ottoman Empire. She made connections between the exhibit content and the rhetoric of the current Erdoğan administration in promoting its own goals, wondering if there was a role for public historians to offer a more balanced and less nationalistic portrayal of the Ottoman past. And Amanda Tewes explored Calico Ghost Town, a small historic site in San Bernardino, California, that is entirely volunteer-run and is probably better described as a theme park than a historic site. Volunteers engage in battle reenactments and glorify the mythic western white miner who drank heavily, carried a gun, and asserted his individualism and masculinity. Meanwhile, the actual history of Chinese laborers in the area and Calico’s peaceful, relatively non-violent culture are completely ignored.
Assessing the correct relationship between public historians and their publics is not a new concern, and NCPH 2017 continued a long conversation within the field about this topic. Unfortunately, I believe we all too often use buzzwords like “shared authority,” “giving groups a voice,” “community,” “radical history,” and “relevance” without thinking critically about what, exactly, we mean by these terms. This is something I warned about after last year’s conference, but I still think it’s a problem within the field. Moreover, while I won’t get into specifics here, I think we sometimes run the risk of taking too much credit for capturing the stories of disaffected groups who, once again, already have their own voices regardless of our presence. And when we do that, we come off as condescending and patronizing at best.
With regards to my second set of questions–the relationship between public historians and the people who employ them–it was obvious from the beginning that this conference was very much inward-looking, focused on questions of employment and financial support for the long-term health of the field. To be sure, I am of the opinion that the humanities have struggled to maintain support since Socrates died for asking too many questions. But circumstances change over time, and with our current political moment marked by hiring freezes, potential budget cuts, and an increasingly politicized culture not just at the federal level but also at the state and local levels, it is safe to say that grad students about to hit the job market and new professionals in entry-level jobs are wondering about finding work and establishing career tracks. What happens when institutions face severe cuts and education is the first thing to go? What are the implications when the number of public history programs increases in times of economic uncertainty?
We are not sure what’s next and we all admitted it at the conference.
So, in sum, I think the big challenge for the field of public history continues to revolve around authority: Asserting our value as historians who enlighten, challenge, and inspire our many publics to understand and learn from studying history, but also using our positions to give those many publics a platform to share their experiences, stories, and perspectives about the past without us dominating the process.
Oh, also: I did a workshop on starting a walking tour business with Jeff Sellers and Elizabeth Goetsch, and it was probably one of the best experiences I have ever had at an NCPH conference.
President Donald Trump went out of his way yesterday to honor the 250th birthday of Andrew Jackson in Tennessee, which in turn has amplified continued online conversation about who in American history is deserving of honor through public ceremony and monumentation. Writer Shaun King was quick to declare that “no President who ever owned human beings should be honored” and that “slavery was a monstrous system. Everybody who participated in it was evil for having done so. Period. No exceptions.”
Some of the most difficult work in public history right now, in my opinion, centers on the nature of public commemoration and understanding how societies choose to remember their past. These are difficult conversations to have, and the boundary lines between “good” and “bad” are arbitrary and poorly defined. King’s argument is provocative and worth considering. Generally speaking, I agree that owning slaves was a choice and that participating in the institution of slavery is inexcusable. But once you read the story of Ulysses S. Grant, our last President to have been a slaveholder, you might conclude that King’s argument is simplistic and not a very satisfying resolution to the question of who is and who isn’t worthy of public honor.
Now, I make my living educating people about General Grant’s life and times, so it could be easy for a reader to claim that I am “biased” or that I am a Grant apologist. I would reject that claim. All I can say is that I have my views about Grant but that those views have been developed through years of vigorous study of the man based on the best historical scholarship around. I don’t approach my job with the intention of portraying Grant as a hero or a sinner to visitors, but rather seek to humanize his experiences and increase understanding of his beliefs, motivations, and actions within the context of 19th century history.
Ulysses S. Grant lived in St. Louis from 1854 to 1859. For most of that time he worked as a farmer and lived with his family at White Haven, his in-laws’ slave plantation in south St. Louis County. During this time Grant somehow obtained one slave, William Jones (see here for a more detailed essay I wrote about Grant’s relationship to slavery). We don’t know how or why he obtained Jones, nor do we know for how long he owned him. We do know, however, that he freed Jones in March 1859 before leaving St. Louis, something many other slaveholding Presidents never did with their enslaved people. That was the extent of Grant’s personal experience in slaveholding. Unfortunately for historians, Grant didn’t leave any letters before the war stating one way or the other how he felt about the institution as a whole. It appears that Grant never challenged slavery’s presence in America or considered the politics and philosophy of slavery in writing before the war.
Something changed in Grant’s mind during the Civil War, however. He embraced emancipation as a war aim and welcomed black troops into his ranks; by the end of the war, roughly one out of every seven troops under his command was black. During the initial phases of Reconstruction, Grant came to believe that President Andrew Johnson’s policies towards the South were too lenient and that the freedpeople deserved more protection against violence, black codes, and overt discrimination by whites. After the Memphis Massacre of 1866, in which 46 African Americans were killed, Grant called upon the federal government to arrest and prosecute the perpetrators, but no prosecutions ever took place. When Grant was elected President in 1868, he immediately called upon Congress and the states to ratify the 15th Amendment, which prevented states from denying men the vote based on their race. On March 30, 1870, he delivered a message to Congress in which he declared that the 15th Amendment was the most significant act in U.S. history and a repudiation of the 1857 Dred Scott Supreme Court decision:
It is unusual to notify the two Houses of Congress by message of the promulgation, by proclamation of the Secretary of State, of the ratification of a constitutional amendment. In view, however, of the vast importance of the fifteenth amendment to the Constitution, this day declared a part of that revered instrument, I deem a departure from the usual custom justifiable. A measure which makes at once 4,000,000 people voters who were heretofore declared by the highest tribunal in the land not citizens of the United States, nor eligible to become so (with the assertion that “at the time of the Declaration of Independence the opinion was fixed and universal in the civilized portion of the white race, regarded as an axiom in morals as well as in politics, that black men had no rights which the white man was bound to respect”), is indeed a measure of grander importance than any other one act of the kind from the foundation of our free Government to the present day.
In 1871 Grant responded to the emergence of the Ku Klux Klan by using the KKK Act to shut down the group. That year he also used his Third Annual State of the Union Address to call upon Brazil, Cuba, and Puerto Rico to abolish slavery. He repeated the theme in his Fourth Address, stating that the Spanish Empire’s continuation of slavery in Cuba was “A terrible wrong [that] is the natural cause of a terrible evil. The abolition of slavery and the introduction of other reforms in the administration of government in Cuba could not fail to advance the restoration of peace and order. It is greatly to be hoped that the present liberal Government of Spain will voluntarily adopt this view.” In future addresses he spoke out against other white supremacist groups in the South like the White League and Red Shirts who continued to commit acts of violence and sometimes outright massacres against African Americans in the South. And during his post-Presidency world tour, Grant stated to Otto von Bismarck about the Civil War that “As soon as slavery fired upon the flag it was felt, we all felt, even those who did not object to slaves, that slavery must be destroyed. We felt that it was a stain to the Union that men should be bought and sold like cattle.”
Frederick Douglass spoke often about Grant and was a dedicated supporter of his Presidency. At one point he stated that “Ulysses S. Grant, the most illustrious warrior and statesman of modern times, the captain whose invincible sword saved the republic from dismemberment, made liberty the law of the land. A man too broad for prejudice, too humane to despise the humblest, too great to be small at any point. In him, the negro found a protector . . .” and recalled in his 1881 book Life and Times of Frederick Douglass that:
My confidence in General Grant was not entirely due to the brilliant military successes achieved by him, but there was a moral as well as military basis for my faith in him. He had shown his single-mindedness and superiority to popular prejudice by his prompt cooperation with President Lincoln in his policy of employing colored troops, and his order commanding his soldiers to treat such troops with due respect. In this way he proved himself to be not only a wise general, but a great man, one who could adjust himself to new conditions, and adopt the lessons taught by the events of the hour. This quality in General Grant was and is made all the more conspicuous and striking in contrast with his West Point education and his former political associations; for neither West Point nor the Democratic party have been good schools in which to learn justice and fair play to the Negro (433-435).
Is Grant someone who should never be honored, as Shaun King suggests?
My biggest issue with King’s argument is that it assumes that people in the past never changed their thinking over time and that a former slaveholder like Ulysses S. Grant could never come to realize that holding humans in bondage was wrong. Grant was far from a saint: his ownership of William Jones was inexcusable, his General Orders No. 11 during the war expelling Jews from his lines was inexcusable, and his Indian policy during his Presidency was well-intentioned but flawed. But are there not actions he took in his life that were commendable and worth honoring?
My biggest takeaway from this whole discussion is that we as a society should focus on understanding before honoring. I would rather see President Trump read a book about Andrew Jackson than stage a big ceremony honoring the man (who, to be sure, has a horrid record as a slaveholder, racist, and Indian fighter, and is someone I wouldn’t be comfortable honoring). I would like for Americans to go to historic sites with the intention of understanding the life and times of historic figures. I would like for people to appreciate complexity, nuance, and the basic idea that people–then and now–often hold evolving and contradictory views towards politics.
I suppose my historical training has soured me on the idea of “heroes” as a general approach to appreciating history. I admire the words of the Declaration of Independence, but I haven’t forgotten that the author of those words raped Sally Hemings. I admire Washington’s words about entangling alliances and the importance of Union, but I haven’t forgotten that he too was a slaveholder. I think Jackson was right on the South Carolina Nullification Crisis, but I won’t forgive him for the Trail of Tears or his violent slaveholding. I think Grant was wrong for being a slaveholder, but I appreciate the efforts he undertook as President to protect the rights of all, and I appreciate that he came around to believe that slavery was a moral evil. I appreciate moments in history when right triumphed over wrong and people in the past took principled stands for positions that protected the rights of all Americans, but I never forget that people in the past were humans, not gods, and that even the best humans have their flaws. And I never forget that American freedom was first established in this country on a co-existence with and acceptance of slavery.
A few months ago a friend of mine gave me a copy of Suhi Choi’s recent book about the Korean War and how it has been remembered in both the United States and (South) Korea. Choi, a communications professor at the University of Utah, employs public history techniques throughout the book to analyze oral histories she conducted with victims of the No Gun Ri massacre, media accounts of the massacre, and various monuments that have been erected in both countries to commemorate the war as a whole. I enjoyed reading the book for its content and arguments, but what I enjoyed the most was its brevity. Clocking in at 115 pages of main text and five chapters, the book was a quick read (with the exception of some jargon-y passages throughout) yet thoroughly researched and intellectually stimulating. The book’s shortness reminded me of the Southern Illinois University Press “Concise Lincoln Library” series that has published numerous short studies on various aspects of Abraham Lincoln’s life that are typically between 100 and 150 pages long.
While I acknowledge that different historical topics require studies of varying length and depth (I currently have one book on my nightstand that is more than 800 pages long), I find myself increasingly supportive of the idea that academic histories, generally speaking, should be shorter and more concise than they typically are now. I am no expert on publishing books with an academic press, but I’ve been told by those who’ve been through the process that presses normally don’t accept anything less than 75,000 words, or roughly 250 to 300 pages. That makes sense because most PhD dissertations end up being about that length, but I think there should be some sort of system in place to encourage and publish more scholarship that would be more appropriately covered in a study between 100 and 150 pages.
As a scholar who regularly reads books from academic publishers, I crave the analysis, interpretation, and detailed research that such books offer to their readers. As a reader, however, I am more likely to go back to a short book and read it again in the future, whereas with a longer book I feel less inclined to read it in full or go back to read it a second time. It’s important for me to read as many print books as possible to get a more comprehensive understanding of historical topics that fascinate me, but the presence of thoughtful online essays and history blogs has changed how I read and reduced the amount of time I dedicate to reading full-length print books. I admit that nowadays page length plays an extremely important role in determining what I read next. 150 pages is more often compelling to me than 500 pages.
Over the past few days a good number of historians have been sharing an article from the Washington Post that ostensibly confirms what many of us in the field already know: history is relevant, important, and worth studying. The article, “In Divided America, History is Weaponized to Praise or Condemn Trump,” points out that thousands upon thousands of Americans on social media are using history–or, more appropriately, their understanding of history–to make arguments to “support or oppose” the current administration’s actions. Moreover, the article provocatively claims that the President’s election has “certainly revived interest in U.S. history.” Many historians on social media are applauding these developments.
I don’t buy it.
While I agree that in our current moment we are seeing more online conversations that invoke historical figures and events, it’s worth asking a number of questions about this development. History is a tool that can be used to better understand where we came from and how we got to where we are now. Are we really engaging in conversations that actually strive to utilize historical thinking to understand what happened in the past, or have we simply turned basic historical facts into superficial rhetorical weapons to make political arguments about today? How productive is it to use history to debate government policy or predict how current policy will work in the long run? How useful is it to cite historical examples when the record is so vast as to justify any sort of political ideology or belief?
If there’s so much interest in history, why is the National Endowment for the Humanities facing the possibility of being cut completely from the federal budget? Why do colleges and universities continually trim down the budgets and staffing of history departments? Why is there a decline in students majoring in history? Why do high schools so frequently hire history teachers based on a candidate’s ability to coach a sports team and not because of their ability to educate students about the discipline? Why is visitor attendance to historic sites in a state of decline? Why do I have friends on Facebook who will simultaneously tell me that they enjoy reading history but that pursuing a liberal arts degree is “stupid” because such degrees are “fake” and “useless” on the job market?
Senator Ted Cruz recently argued that “The Democrats are the party of the Ku Klux Klan . . . The Klan was founded by a great many Democrats.” While it’s factually true that the KKK was founded by Southern Democrats after the Civil War, anyone who has even a cursory understanding of U.S. history knows that the Republican and Democratic party platforms have changed, evolved, and in some cases flipped from what they were 140 years ago. But then again, Senator Cruz isn’t making this statement in the interest of understanding the context and complexity of history, in this case the Reconstruction era. He doesn’t care that the second wave of the KKK that emerged following the theatrical release of The Birth of a Nation in 1915 recruited many of its members from the Republican party, so much so that in Indiana the KKK essentially took over the state Republican party and the State House in the 1924 state election. He doesn’t care that in 1890, amid a growing wave of black voter disenfranchisement initiatives throughout the South, the Republican party sold out its black constituents by giving up on the Lodge Bill, which would have allowed for federal oversight of federal elections and given circuit courts the ability to investigate voter fraud and disenfranchisement and to ensure fair elections. The Republican party gave up on this bill so that it could get Southern support for a different bill that would raise tariff rates, the party’s primary concern at the time. He doesn’t care that racism has been a staple of U.S. history and something widely supported by Americans of all political persuasions.
Senator Cruz doesn’t care about any of this because he is only concerned about using history as a weapon to praise his buddies and condemn his enemies. He wants to portray contemporary Democrats as bigots, racists, and ideological descendants of the KKK Democrats of the 1870s. He doesn’t care about the history.
It’s a shame that so many politicians on all sides of the political spectrum so often resort to weaponizing history.
A few days before the Washington Post article was published, Northwestern University history professor Cameron Blevins wrote what is to my mind the best essay of 2017 so far. He warns of the dangers of using history to predict the future and calls upon historians to consider the ways history might be counter-productive to understanding the complexities of today’s politics. You must read this essay – it is fantastic.
In sum, I think we historians still have a long way to go before we can declare victory in our effort to expose our students and the public more broadly to the joys and benefits of studying history. And I would argue that the value of studying history is not that it provides “answers” to contemporary problems or a solid blueprint for effective government policy in the future, but that it trains us how to interpret source material, appreciate change over time, and ask better questions about our world, both then and now.
The online publication Indian Country Media Network recently reported on a public history site that engaged in a public hanging reenactment of a Native American man this past summer. The article garnered some attention among public historians on social media, many of whom expressed serious concerns about the appropriateness of this program. The Westmoreland County Historical Society attempted to defend its program with the following statement on Facebook:
I decided to respond to this statement. Here’s my email to the organization:
To whom it may concern at the Westmoreland Historical Society,
This message is in response to your statement on Facebook about a recent program your institution hosted in which living history performers reenacted an eighteenth century court case, including the gruesome hanging of the Native American Mamachtaga. I am a public historian who occasionally participates in living history programs, and I heard about this particular program through social media. While I respect your dedication to educating visitors about eighteenth century American history—particularly complex legal cases that involve thorny issues of race, gender, and indigenous rights—I have serious concerns about your institution continuing to engage in this mock hanging and similar reenactments in the future.
I found your Facebook statement in defense of your institution’s program to be inadequate. The statement appears to fundamentally misunderstand what many critics of this program are saying. Several Facebook users who commented on your statement also missed the point. The gist of your statement is that your institution worked very hard to present a historically accurate program; your team engaged in primary source research and worked to provide context for the “historic political climate and social attitudes as well.” That’s good, but it’s not enough.
By focusing your statement on the historical accuracy of the program, you seem to suggest that your critics must either have a problem with the idea of doing any program on the Mamachtaga case, or that they can’t handle the idea of a historical education program that focuses on the “bad” parts of history. In any case, the reasoning goes, the program was historically accurate, so what’s the problem? That is a mistaken argument. Few professionals in the public history field would have any problem with doing a program about Mamachtaga or similar cases like his. The problem that many of us have with your program is the lack of consideration of how the story is being told and the interpretive medium in which it is being told.
Interpretive programs take many shapes and forms, including tours led by trained guides, public lectures, video presentations, historical markers, digital presentations, living history programs, and other mediums. As educators, we want these programs to foster understanding and appreciation for history and the role it plays in our daily lives. But what educational value does a hanging reenactment offer the visitors who come to your site? What is it that you want your visitors to take away from this program?
There have been numerous controversial living history programs about slavery in recent years that you may have heard about. In 2011, the Jefferson National Expansion Memorial in St. Louis, Missouri, hosted a “slave auction” reenactment on the steps of the Old Courthouse, and Conner Prairie Interactive History Park in Fishers, Indiana, hosts an award-winning living history program about the Underground Railroad entitled “Follow the North Star.” These programs have received both praise and criticism, which I think is fair. The problem with any program that attempts to literally “reenact” a historical experience like slavery, genocide, the Holocaust, or in this case a hanging, is that such an experience cannot be accurately recreated for a modern audience, which in turn trivializes it. Furthermore, such a program runs the risk of de-humanizing the historical figures the performers are attempting to portray, and it can be emotionally hurtful to people who are watching, regardless of their particular background.
What makes the Old Courthouse slave auction and “Follow the North Star” largely successful, however, is that they incorporate pre- and post-event activities that allow people a chance to become emotionally prepared for what they are about to see and then share their feelings with a trained professional in a facilitated dialogue setting afterwards. Rather than coming to the site unprepared and leaving with a bunch of bottled-up emotions, these dialogue activities allow people to unwind and feel welcome in an environment that promotes learning and inclusiveness. Attendance at these activities is mandatory for visitors who want to participate in the main event. Your statement does not indicate whether or not such activities were incorporated into your program.
I have no doubt that the Westmoreland Historical Society is dedicated to conveying accurate history to its audiences. But at the end of the day, making sure that living history programs are historically accurate is only half the challenge of creating a successful program. Audience, setting, and interpretive medium must also be considered, and I believe the Mamachtaga program failed to account for these considerations. I believe a living history program focused on the hanging of a Native American or anyone else distracts from the history you want to impart to your audiences and is ultimately a program in poor taste. American history is drenched in the blood of victims of state violence, whether that be the largest mass hanging in U.S. history after the Dakota Uprising in 1862, the Memphis Massacre of 1866, the East St. Louis riots of 1917, or countless other instances of bloodshed. Must we reenact these stories in a literal fashion in order to attract visitors and dollars? In the future, I hope you reconsider the merits of doing a mock hanging and consider other ways of bringing American history to life.
The Washington Post recently published an article about an ongoing debate between economic historians and historians of capitalism (the two are not the same) about the role of slavery in the U.S. economy before the Civil War, particularly the relationship between slavery and capitalism. This debate has been taking place for a number of years, from what I can gather, but I find the Post’s handling of this extended conversation to be mildly annoying.
Generally speaking, the historians of capitalism argue that the two were intimately related and that slavery thrived and expanded in the U.S. precisely because of capitalism. Sven Beckert and Seth Rockman have recently argued that the sheer number of enslaved people throughout the South, combined with Northern (and British) capital investment in the institution, renders “an unclear line of demarcation between a capitalist North and a slave South, with consequences for how we understand North and South as discrete economies—and whether we should do so in the first place.” In the Post article we hear from Edward Baptist, another historian of capitalism, who argues that the torture of enslaved people was foundational to slavery’s growth and expansion, forcing them to produce at higher and higher rates to meet the increased demand for slave-picked cotton during the first half of the nineteenth century.
Economic historians, on the other hand, generally caution that collapsing the distinctions between Northern and Southern economies runs the risk of complicating our ability to explain how the Civil War came about. If the institution of slavery was so strongly supported in the North, then how do you explain the rise of popular anti-slavery parties in the North during the 1840s and 1850s that campaigned on the argument that slavery was a threat to the value of one’s labor and a less efficient production system than one based on free labor principles? How do you explain the origins of a bloody civil war between the two sections if their economic systems were so intimately connected? Where do discussions over sectional disagreements about economic policies like tariffs, taxes, public land sales, and government involvement in infrastructure projects fit within the capitalist historians’ focus? Furthermore, in responding to Baptist, Alan Olmstead argues that new seed technologies accelerated cotton production and played the most crucial role in fostering slavery’s growth, not slave torture.
I don’t propose to offer any concrete answers to this discussion other than to say that I find the Post’s framing of the issue unproductive. Must historians’ explanation for slavery’s growth in the United States–an incredibly complex topic that could take a lifetime to study–be whittled down to a single cause: torture or seeds? Isn’t it more plausible to suggest that the ideas of the various camps (and probably others) can coexist and complement each other? I think so. Increased cotton production in the South by enslaved labor before the Civil War was possible because of political and economic policies (national, state, and local), social practices, scientific and religious beliefs, and a strong law enforcement/police state that allowed this state of affairs to flourish and grow.
I do not mean to suggest that historians must give equal weight to all factors when explaining a particular historical event or topic; weighing these factors is part of the fun in debating these issues. Whenever possible, I think the quantification of empirical evidence gives historians grounds to put more weight behind their claims for one particular factor over another. But historians should always strive for complexity and nuance rather than the either-or proposition the Post would have us accept on this topic. When the goal becomes over-simplification and monocausal explanations for complex historical processes, I think we end up doing more harm than good to the historical record.
In looking back at this recent and torturous U.S. Presidential election, I believe the blatant and irresponsible sharing of fake news, inaccurate memes, and outright propaganda, combined with a general lack of civility and informed online conversation, contributed in some way to Donald Trump’s electoral victory. I do not mean to suggest that there were no other factors that contributed to this particular outcome or that people on the left side of the political spectrum don’t also share fake news and stupid memes – they do. But evidence is mounting that fake and inaccurate news–particularly pro-Trump news–is widespread on social media and that many people, regardless of political preference, take misinformation seriously if it lines up with their own personal and political views. Facebook is especially bad in this regard. The chances are good that many voters who are also Facebook users went to the polls and made their decisions based partly on false information gleaned from articles shared on their news feeds.
Professor Mike Caulfield’s particularly sobering analysis of fake articles created by a fake paper, the “Denver Guardian,” that spread like wildfire across Facebook demonstrates how easy it is to get duped by someone with an agenda and basic computing skills. Friends and family that I care about have also engaged in this sharing of fake news on Facebook, which I find deeply troubling. Facebook has evolved into a news-sharing website without creating a mechanism for effectively separating fact from fiction, and at the end of the day the site isn’t fun anymore. I haven’t checked my account since the election.
As a historian and educator I have stressed on this website the importance of teaching not just historical content in the classroom but also historical methods. When we teach both content and methods, we convey to students the idea that history is not just a mess of names, dates, and dead people, but also a process that enables students to conduct research, interpret reliable primary and secondary source documents, and ultimately become better writers, readers, and thinkers in their own lives. I think that now more than ever these skills need to be taught not just for their utility in understanding the past but also for parsing the vast multitudes of information that bombard our social media feeds on a daily basis. Historians have much to contribute to contemporary society, and they should lead the way in accomplishing this important work. When we learn to think historically, we enable ourselves to become more informed citizens who can participate in electoral politics with an understanding of the issues at hand and how our system of government operates.
I am interested in hearing from history teachers about what methods, tools, and practices they employ when teaching students how to distinguish between reliable and unreliable sources and how to interpret these sources to construct informed arguments and narratives. Sam Wineburg’s scholarship has been instrumental in my own thinking about these topics, and I believe everyone should listen to or read his keynote address at the 2015 meeting of the American Association for State and Local History. I have also utilized historian Kalani Craig’s guide on the 5 “Ps” of reading primary sources, which is equally relevant when assessing sources on contemporary topics.
What has worked for you when teaching others how to assess and interpret documentary sources? Please let me know in the comments.
There’s a lot of buzz within the public history and museum fields about Franklin Vagnone and Deborah Ryan’s new book, Anarchist’s Guide to Historic House Museums. I’d been waiting for a while to have a chance to read the book, and I finally got around to it this week. Overall the book aims to challenge standard practices at historic house museums with regard to interpretation, education, and preservation, and it will definitely provoke new conversations within the field about how and why historic house museums are important for understanding and appreciating the past.
I finished Anarchist’s Guide feeling underwhelmed. While I found the book’s appendices useful for researching visitor feedback and evaluating a given site’s standard practices, I felt like most of Anarchist’s Guide’s conclusions were neither revolutionary, radical, nor original. I might expand upon these thoughts in a future blog post. Nevertheless, I do agree with one central argument made by Vagnone and Ryan that should be repeated to all house museum professionals: historic house museums are first and foremost about the people, past AND present, who occupy the house’s space. As Vagnone argues, “the breath of a house is the living that takes place within it, not the structure or its contents” (21). Hear! Hear!
Within the National Park Service–at least among those of us who work at historic homes–there is a running joke about the dreaded “furniture tour.” You arrive for the tour, and as the guide accompanies the group room by room, the focus falls almost exclusively on the room’s furniture pieces and the minute details of each one that no one will remember when the tour concludes: what year this chair was produced, what state this table came from, how thankful we tourists should be for the good museum professionals who’ve preserved all this furniture for us today. What often goes missing from these tours is the humanity of it all. Why is any of this furniture important? Who are the people who owned this furniture, and why did they buy it? What is so important about this house, and why should we continue preserving it? Why should we care about this place today?
To be sure, there is an important place for material culture analyses at historic homes. A gifted interpreter can take a historic artifact and tell nuanced stories about the people who owned it and that artifact’s cultural, economic, and political history. Who built this artifact? Why was this artifact valuable at the time, and why did the owner purchase it in the first place? What can this artifact tell us about the times in which its owner lived? When historical artifacts act as tools towards the end goal of better understanding and appreciating the past and the people who lived in it, visitors leave with a greater sense of empathy for the humanity of the past. Conversely, tours end up becoming boring and stale when historical artifacts become ends in themselves, reinforcing the idea that the study of history is primarily one of rote memorization and filling the “empty” minds of visitors with dates and facts.
The situation at my own workplace is somewhat unusual in regard to historic artifacts. At the Ulysses S. Grant National Historic Site we have no original furniture inside the historic White Haven estate. While the structure itself is still mostly original today, the lack of original furniture disappoints some visitors. This feeling is understandable, and by no means do I consider such a sentiment misplaced or silly. We all visit historic homes partly because we are curious to see what they look like inside, and at first blush an empty room is nothing to be too excited about. But I take pains to point out to visitors that the National Park Service didn’t choose to preserve this particular house because it was old or because of the way it was designed, but because of the people who lived in it. The house, to paraphrase Vagnone, breathes life because of the people who were there during its 170-year existence as a private residence and the people who still visit it as a National Historic Site today. If the house and its original structural elements were to be completely destroyed tomorrow, the National Park Service would continue to oversee the site and tell the stories of the people who lived there, even if there was nothing original to actually see. But if people stopped coming to the site and the house became nothing more than an empty historic structure, what would be the point of the NPS staying to preserve it? It wouldn’t matter if each room had an abundance of historic artifacts – no one would be there to see them.
A historic house without any people in it breathes no life. Anyone who holds a leadership position at a historic house museum ought to remember that when designing interpretive programs or explaining to stakeholders why their particular site is important and worth preserving.
I read a really interesting article today on Aeon from Stanford University history professor Caroline Winterer about the American Revolution, the creation of the U.S. Constitution, and Enlightenment ideals. The underlying thesis of the article is partly rooted in the idea that Americans today have mythologized and flattened the legacies of the country’s various constitutional framers in ways that diminish the complexity of their thinking and their basic humanity. That’s not necessarily a new or bold thesis, but the way Winterer approaches this conclusion is new to me. Most notably, she points out that the British philosopher John Locke–an imposing intellectual figure in the minds of many of the country’s framers–believed that while knowledge was obtained not through a divine God but through the five senses (empiricism) and language, the extent to which humans could trust their senses to provide an objective understanding of reality was very much uncertain. Similarly, since language was man-made and not the creation of God, the meanings ascribed to any word were subject to interpretation and merely “arbitrary signs that represent ideals.” What this meant for the framers, according to Winterer, was that a feeling of uncertainty was a constant companion in their efforts to create a functioning government and a civil society:
In fact, the American founders were uncertain about many things. They were uncertain about politics, nature, society, economics, human beings and happiness. The sum total of human knowledge was smaller in the 18th century, when a few hardy souls could still aspire to know everything. But even in this smaller pond of knowledge, and within a smaller interpretive community of political actors, the founders did not pretend certainty on the questions of their day. Instead they routinely declared their uncertainty.
While I freely admit that I am no expert in early American history, this interpretation strikes me as largely correct. The effort to create a constitution based on laws and not kings or divine providence was bold, ambitious, and fraught with uncertainty, which is why the framers established a process for amending the constitution to improve it in the future. But this article also got me thinking about the ways we teach history to middle school and high school students and why we need to make the idea of uncertainty a central element in teaching students how to think historically.
It is easy to look back at past events in hindsight and diagnose certain events as “inevitable.” Civil War historian Gary Gallagher, for example, often points out how easy it is to see the U.S. military’s victory at Gettysburg and conclude that this battle clearly led to an inevitable victory over the Confederates in the Civil War. But by understanding the sense of uncertainty people felt as events happened in real time, within circumstances often beyond their control, we can better empathize with the ways people in the past understood and reacted to the contingencies of their lives and their times. And perhaps we can teach students to embrace uncertainty in their own lives rather than seeing it as something to fear.
Growing up during the No Child Left Behind era meant that most of my history classes emphasized standardized tests, most of which were exclusively multiple choice. The tests and lessons I encountered emphasized rote memorization of facts, which in turn portrayed the study of history as an exercise in the mastery of information and the people of the past as all-knowing figures who in many cases were certain of the consequences of their actions (especially those who fought in the American Revolution and helped create the Constitution). By focusing on the importance of critically analyzing primary and secondary sources, making reasoned interpretations based on the available evidence for a particular historical event, and making evidence-based arguments through written, oral, and digital means, history teachers can perhaps bring the uncertainty of the past (and the present!) to the forefront of building historical thinking skills.
It’s reassuring to know that there are enlightened people like Wisconsin Senator Ron Johnson who are in positions of power and have the ability to set education policy in this country.
Senator Johnson says that the “tenured professors in the higher education cartel” are working to keep college costs high and not doing enough to embrace digital technology like Blu-ray discs, the internet, and the world wide web in the classroom – a classroom that he believes should have fewer teachers, replaced with what he calls “destructive technology.”
Johnson: We’ve got the internet – you have so much information available. Why do we have to keep paying different lecturers to teach the same course? You get one solid lecturer and put it up online and have everybody available to the knowledge for a whole lot cheaper? But that doesn’t play well to tenured professors in the higher education cartel. So again, we need destructive technology for our higher education system.
WISPOLITICS: But online education is missing some facet of a good –
Johnson: Of course, it’s a combination, but prior to me doing this crazy thing [of being in the Senate] . . . I was really involved on a volunteer basis in an education system in Oshkosh. And one of the things we did in the Catholic school system was we had something called “academic excellence initiative.” How do you teach more, better, easier?
One of the examples I always used – if you want to teach the Civil War across the country, are you better off having, I don’t know, tens of thousands of history teachers that kind of know the subject, or would you be better popping in 14 hours of Ken Burns Civil War tape and then have those teachers proctor based on that excellent video production already done? You keep duplicating that over all these different subject areas.
Where do you even start with this nonsense?
- Digital technology–more specifically education technology–is not a panacea that automatically enhances classroom learning. In 1922, Thomas Edison predicted that “the motion picture is destined to revolutionize our educational system and in a few years it will supplant largely, if not entirely, the use of textbooks.” That “revolution,” of course, never came about, partly because any sort of technology used in the classroom is merely a tool for achieving the larger goal of learning. Technology is not an end in and of itself, and watching a documentary is no more effective than listening to someone drone on forever at the front of a classroom. It’s how you use those tools that matters, and the best teachers put a range of tools–from pens and pencils to computers and tablets–to work in fostering a positive learning environment.
- Jonathan Rees has blogged for several years about MOOCs and ed tech and has a book coming out on the subject. Mr. Johnson ought to read it.
- Ken Burns is a wonderful filmmaker and producer, but his PBS series is not the definitive word on the history of the American Civil War. It’s been twenty-plus years since the documentary came out. It is dated and has a few questionable interpretations. Again, teaching history or any subject doesn’t mean popping in a movie and having students take notes. Pairing the documentary with other works of scholarship–written and on film–and analyzing how historians have interpreted the war and constructed narratives about the history of the war is a better start. Having students learn from a trained professional how to find, analyze, and interpret primary sources…that’s also a good start. And having a teacher facilitate dialogue through guided questions or some other thoughtful activity after the film holds more potential for learning than watching a video from “a solid lecturer” after watching a fourteen-hour documentary.
- Ron Johnson sounds like he hasn’t set foot in a college in forty years. Tenure basically doesn’t exist for most young faculty members anymore. The “higher education cartel,” if any such thing exists, has bought into Senator Johnson’s rhetoric and has actively worked to implement austerity measures while relying more on part-time contingent faculty, especially since the 2008 recession. College doesn’t consist of professors constantly lecturing their students anymore. Higher education is not an Orwellian propaganda machine where students read Das Kapital and dream about Cultural Marxism all day and then party all night. We should be investing more in public education rather than advocating for “destructive technology” or busting up some make-believe “higher education cartel.”
You can’t make this stuff up.