The National Council on Public History’s 2017 Annual Meeting has concluded and I’m back home doing my thing. There were more than 800 registrants at this year’s meeting who undoubtedly had a range of experiences during the conference, but on a personal level it was a true pleasure seeing old friends, making new ones, and having the chance to participate in important conversations about the state of the field.
In thinking about the conference’s theme since coming home–“The Middle: Where Have We Come From, Where Are We Going?”–my mind keeps going back to two sets of questions I have about the role of authority within the field. One concerns the relationship between public historians and the publics they work with; the other concerns the relationship between public historians and the people who employ them.
Regarding the former set of questions, I was struck by how various sessions grappled with whether public historians should cede or assert their authority in their work. To cite one example, several presenters analyzing controversial monuments in the United States and Argentina admitted during the conference that beyond researching the monuments and presenting their findings, they saw no clear path forward for navigating where to go in the future. Do historians conclude by presenting their findings and avoid making declarative statements one way or the other, or do they use their authority to advocate for a particular position that may or may not reflect the viewpoint of a majority of a local community’s residents? If historians take a position, whose voices within the community do they choose to amplify, and why? More pointedly, since community members already have a voice regardless of whether or not public historians are there, whose voices do we choose to use our privilege and platform in service of?
Additionally, are there times when further dialogue over something like the presence of a controversial monument is unnecessary and public historians must start taking political action to achieve a larger goal? How useful is it for public historians to keep discussing so-called “counter-monuments” and contextual markers for something like the Liberty Place Monument when local residents in that community are ready to take that monument down?
In “Touring Sites of Nostalgia and Violence: Historical Tourism and Memory in Germany, Poland, Turkey, and the United States,” a session I had the privilege of moderating, historical authority in the visitor experience at sites of violence was a central question. Erica Fagan of the University of Massachusetts Amherst explored the use of Instagram at Holocaust sites like Auschwitz and Dachau and mused on the extent to which historians should moderate these posts, arguing that these sites need a social media presence to dispel historical myths and falsehoods. Yagmur Karakaya of the University of Minnesota assessed several museum exhibits in Turkey that romanticize the historical legacy of the Ottoman Empire. She drew connections between the exhibit content and the rhetoric of the current Erdoğan administration in promoting its own goals, wondering whether there is a role for public historians to offer a more balanced and less nationalistic portrayal of the Ottoman past. And Amanda Tewes explored Calico Ghost Town, a small, entirely volunteer-run historic site in San Bernardino County, California, that is probably better described as a theme park than a historic site. Volunteers engage in battle reenactments and glorify the mythic western white miner who drank heavily, carried a gun, and asserted his individualism and masculinity. Meanwhile, the actual history of Chinese laborers in the area and Calico’s peaceful, relatively non-violent culture is completely ignored.
Assessing the correct relationship between public historians and their publics is not a new endeavor, and NCPH 2017 continued a long conversation within the field on this topic. Unfortunately, I believe we all too often use buzzwords like “shared authority,” “giving groups a voice,” “community,” “radical history,” and “relevance” without thinking critically about what, exactly, we mean by these terms. This is something I warned about after last year’s conference, but I still think it’s a problem within the field. Moreover, while I won’t get into specifics here, I think we sometimes run the risk of taking too much credit for capturing the stories of disaffected groups who, once again, already have their own voices regardless of our presence. And when we do that, we come off as condescending and patronizing at best.
With regard to my second set of questions–the relationship between public historians and the people who employ them–it was obvious from the beginning that this conference was very much inward-looking, focused on questions of employment and financial support for the long-term health of the field. To be sure, I am of the opinion that the humanities have struggled to maintain support since Socrates died for asking too many questions. But circumstances change over time, and with our current political moment marked by hiring freezes, potential budget cuts, and an increasingly politicized culture not just at the federal level but also at the state and local levels, it is safe to say that grad students about to hit the job market and new professionals in entry-level jobs are wondering about finding work and establishing career tracks. What happens when institutions face severe cuts and education is the first thing to go? What are the implications when the number of public history programs increases in times of economic uncertainty?
None of us is sure what comes next, and we all admitted it at the conference.
So, in sum, I think the big challenge for the field of public history continues to revolve around authority: Asserting our value as historians who enlighten, challenge, and inspire our many publics to understand and learn from studying history, but also using our positions to give those many publics a platform to share their experiences, stories, and perspectives about the past without us dominating the process.
Oh, also: I did a workshop on starting a walking tour business with Jeff Sellers and Elizabeth Goetsch, and it was probably one of the best experiences I have ever had at an NCPH conference.
Over the past few days a good number of historians have been sharing an article from the Washington Post that ostensibly confirms what many of us in the field already know: history is relevant, important, and worth studying. The article, “In Divided America, History is Weaponized to Praise or Condemn Trump,” points out that thousands upon thousands of Americans on social media are using history–or, more appropriately, their understanding of history–to make arguments to “support or oppose” the current administration’s actions. Moreover, the article provocatively claims that the President’s election has “certainly revived interest in U.S. history.” Many historians on social media are applauding these developments.
I don’t buy it.
While I agree that in our current moment we are seeing more online conversations that invoke historical figures and events, it’s worth asking a number of questions about this development. History is a tool that can be used to better understand where we came from and how we got to where we are now. Are we engaging in conversations that genuinely strive to utilize historical thinking to understand what happened in the past, or have we simply turned basic historical facts into superficial rhetorical weapons for making political arguments about today? How productive is it to use history to debate government policy or predict how current policy will work in the long run? How useful is it to cite historical examples when the record is so vast that it can justify any sort of political ideology or belief?
If there’s so much interest in history, why is the National Endowment for the Humanities facing the possibility of being cut completely from the federal budget? Why do colleges and universities continually trim down the budgets and staffing of history departments? Why is there a decline in students majoring in history? Why do high schools so frequently hire history teachers based on a candidate’s ability to coach a sports team and not because of their ability to educate students about the discipline? Why is visitor attendance to historic sites in a state of decline? Why do I have friends on Facebook who will simultaneously tell me that they enjoy reading history but that pursuing a liberal arts degree is “stupid” because such degrees are “fake” and “useless” on the job market?
Senator Ted Cruz recently argued that “The Democrats are the party of the Ku Klux Klan . . . The Klan was founded by a great many Democrats.” While it’s factually true that the KKK was founded by Southern Democrats after the Civil War, anyone with even a cursory understanding of U.S. history knows that the Republican and Democratic party platforms have changed, evolved, and in some cases flipped from what they were 140 years ago. But then again, Senator Cruz isn’t making this statement in the interest of understanding the context and complexity of history, in this case the Reconstruction era. He doesn’t care that the second wave of the KKK that emerged following the theatrical release of The Birth of a Nation in 1915 recruited many of its members from the Republican party, so much so that in Indiana the KKK essentially took over the state Republican party and the State House in the 1924 state election. He doesn’t care that in 1890, amid a growing wave of black voter disenfranchisement initiatives throughout the South, the Republican party sold out its black constituents by giving up on the Lodge Bill, which would have allowed for federal oversight of federal elections and given circuit courts the ability to investigate voter fraud and disenfranchisement and to ensure fair elections. The Republican party gave up on this bill so that it could get Southern support for a different bill that would raise tariff rates, the party’s primary concern at the time. He doesn’t care that racism has been a staple of U.S. history and something widely supported by Americans of all political persuasions.
Senator Cruz doesn’t care about any of this because he is only concerned about using history as a weapon to praise his buddies and condemn his enemies. He wants to portray contemporary Democrats as bigots, racists, and ideological descendants of the KKK Democrats of the 1870s. He doesn’t care about the history.
It’s a shame that so many politicians across the political spectrum so often resort to weaponizing history.
A few days before the Washington Post article was published, Northwestern University history professor Cameron Blevins wrote what is to my mind the best essay of 2017 so far. He warns of the dangers of using history to predict the future and calls upon historians to consider the ways history might be counterproductive to understanding the complexities of today’s politics. You must read this essay – it is fantastic.
In sum, I think we historians still have a long way to go before we can declare victory in our effort to expose our students and the public more broadly to the joys and benefits of studying history. And I would argue that the value of studying history is not that it provides “answers” to contemporary problems or a solid blueprint for effective government policy in the future, but that it trains us how to interpret source material, appreciate change over time, and ask better questions about our world, both then and now.
One of the reasons I enjoy blogging is that it gives me a chance to hash out thoughts, ideas, and theories that may not be fully developed in my mind. Blogging for me is as much about asking questions about how and why we study history as it is writing essays that aim to inform readers on a given historical topic that I’ve studied. Indeed, asking questions about the fundamental theories that underlie the act of historical thinking and the intellectual contours of the profession is a necessary challenge all historians must address. In doing so, we better position ourselves to sharpen our methodological tools while simultaneously improving upon the ways we explain the importance of studying history to the rest of society. Doing a better job of answering the question “why study history?” has been a central challenge of my career as a public historian so far, and I’ve thankfully learned a lot not just by reading books but also by blogging out my ideas and receiving constructive feedback from thoughtful readers.
With my last post I delved into the importance of having “historical perspective” when analyzing current events. Does it help to have historical perspective? If the answer is yes, then how so? My thoughts were shaky and I had no conclusive answers. Thankfully a number of commenters stepped in and offered some brilliant thoughts.
From Christopher Graham of the American Civil War Museum:
I think the comparisons of better/worse are not the right way to frame the questions and leads us to dumb debates over better/worse and that’s not very good history.
What historical perspective should be teaching us–aside from the overwhelming complexity that defies a better/worse narrative–is how this process of historical change works. We should be asking–where is the intentionality that represents tradition and systems, and where does contingency and the unexpected that shape sensibilities and events intersect with it? How does that inevitably make things different–not necessarily better or worse, just different. And how do we identify those historical processes at work in current events? The answer reveals that we should be looking widely for motivations for change, should be ready to accept the unexpected, and that it is a dynamic process.
And from Andrew McGregor of Purdue University:
I think a lot of folks who talk about historical perspective, talk in terms of compare and contrast, which, to me, isn’t really what history and historical perspective is about. I think one of the problems that your are wrestling with here is that questions like “how did we get here?” (which are an important question to ask!) are inherently teleological. Similarly comparing and contrasting almost always involved some sort of value judgment (progress or declension). Neither approach is very emotive or humanistic (to deBoer’s point), which forces us to rethink how we understand and tell the history of “victims” (for lack of a better word). I think historical perspective works best, when we are use it to understand and get inside of moments, ideas and arguments, cultures, to better understand the lineage of people’s experiences, creating what might be termed a historical empathy built through examples and understandings of the past. This is much more easily done we analyzing how and why people make certain arguments about the Confederate flag or the R*dskins mascot, but when talking about structures and processes (like criminal justice and policing) we sometimes lose that humanness in how we tell, explain, or understand history. I’ll stop rambling there, but I think my overall point here is that we need to be conscientious of keeping a humanistic historical perspective instead of falling into lazy patterns of analysis that are often flawed.
Both of these comments redirected my thinking on historical perspective in a new direction. It’s perfectly natural for us to compare and contrast the conditions of contemporary society with those of past societies – it’s all we can really do, since we can’t predict what the future will bring. But in focusing my thoughts on comparing past and present through a better-or-worse dichotomy, I failed to grasp all the different and dynamic ways historical thinking challenges us to assess the present beyond a simple progress/declension narrative. Historical thinking includes all that Christopher and Andrew mention in their comments: finding the intersection of intentionality and contingency, analyzing change over time, and exploring ideas, cultures, and experiences in a way that goes beyond making subjective judgments as to whether things are better or worse today.
Many of us who study history do so in part because we are curious to see how our current society came to be. When discussing education, economic policy, or foreign policy, it helps to see how policies, theories, laws, and ideas have evolved over time. There is a seductive quality to historical thinking. Sometimes, for better or worse, it leads us to believe that studying the past can offer us stability, order, and a better understanding of the world. We should be cautious, however, about drawing hard and fast conclusions about what the past can really teach us about the present or what it can do in terms of mapping out a foundation for future policy. Likewise, we should be very cautious about drawing comparisons between historical events and contemporary politics. I’ve been seeing a lot of these types of articles lately. But of all the reasons one may be inclined to oppose Donald Trump’s presidential bid, I don’t think Zachary Taylor’s rise to the Whig party’s presidential nomination in 1848 as an “outsider” candidate and the subsequent fall of the Whig party in the 1850s is one that would prevent many people from voting for Trump, even if there are some similarities between the 1848 and 2016 elections. (It also bears pointing out that Ulysses S. Grant was very much a political “outsider” when he accepted the Republican party’s nomination in 1868, and the party turned out to be just fine with him at the helm. So it seems there is no accepted wisdom when it comes to judging outsider candidates based on historical precedent.)
Ultimately I think there is a very fine line between studying history for the sake of understanding changes over time and how things came to be, and studying history as a means of forming future policy. I often get lost in the gray area of the intersection of history and politics when thinking about the importance of historical thinking as a way of making sense of the world. I do think there are some connections to be made between past and present. Here in the United States I don’t believe it’s a mere coincidence that the states where the harshest anti-LGBTQ legislation has been passed are also the states that most ardently supported Jim Crow laws and resisted the Civil Rights Movement in the recent past. And yet at the same time I understand that the people of the past were not like us. There’s nothing suggesting that today’s society will act a certain way because of what happened in the past. I don’t believe that history repeats itself. But how we understand the past is contested in part because we disagree about the historical connections and comparisons that make sense for explaining the world today. As a society debates its history and competing interpretations vie for the most compelling understanding of the past based on available evidence, politics fills the void left by an uncertain, incomplete, and inchoate understanding of the role of history in shaping present circumstances.
Are there any “lessons of history” to be gleaned from studying the past? There are a few that come to mind for me.
One “lesson” is that humans are complex beings and nothing is predictable. Historical precedents often give us imperfect answers for solving contemporary problems. At the same time, while I value the contributions of social scientists in politics and economics, I tend to look upon their predictive models with great skepticism because they favor generalizations that dismiss statistical outliers over complex interpretations that take a more holistic view of societal thinking, which is what historians try to do most of the time. That does not mean social science has nothing to contribute, only that there will be incorrect predictions at times and human behaviors that go beyond numbers and trends. Nate Silver screws up sometimes.
The other lesson is that “progress” is a double-edged sword that always comes with a trade-off. The invention of standardized time in the 19th century provided order to an industrializing world and ensured more efficiency and larger production capabilities in a capitalist economy, but it also made people slaves to the clock and killed many workers who buckled under this unforgiving economic structure. The development of the world wide web, the internet, and smartphone technology quite literally gives us the world at our fingertips, allowing us the chance to access tens of thousands of books, articles, and bits of information that people in the past would never have had access to. And yet at the same time we have become addicted to our phones. The internet is full of misinformation that spreads like wildfire through social media and poorly written memes. Whether or not we are truly smarter than those who lived before this technology is very much an open question. And, as Evgeny Morozov has so convincingly demonstrated, the internet doesn’t make us freer and in fact can be used to prop up authoritarian governments. I subscribe to Walter Benjamin’s theory of progress and his conception of history as a storm that we crash into while we have our backs to the future.
In the end I like what historian Ian Beacock has to say about history and contemporary politics. To wit:
Do we need to banish history from our public life? Of course not. But we ought to think more carefully about how we put it to use. Appeals to the past are most valuable, and do most to strengthen our democratic culture, when they help us see more potential futures: by showing events to be contingent and complex, turning us away from simplistic models and easy answers, and reminding us of the terrific, terrifying creativity that drives human behavior. In practice, that means we should spend less time trying to find the perfect single equivalence between Trump and politicians past and more time reflecting on broader patterns. More than particular historical analogies, we need historical thinking.
One of the most unfortunate and widely-accepted ideas about historical thinking is that “history is written by the victors.” This talking point asserts that the truth of the past is not shaped by reasoned interpretive historical scholarship or a factual understanding of the past, but by the might of political and cultural leaders on the “winning” side of history who have the power to shape historical narratives through school textbooks, public iconography, movies, and a range of other mediums. To be sure, these mediums are powerful venues for establishing political ideologies and shaping personal assumptions about the way the world works. And it’s definitely true that governmental or “official” entities can and do exploit this power to achieve their own ends. In his book Remaking America: Public Memory, Commemoration, and Patriotism in the Twentieth Century, historian John Bodnar discusses the concept of “official cultural expressions” that aim to shape how people remember the past. These expressions originate from social leaders and official authorities who seek to shape society’s historical understanding in ways that promote “social unity, the continuity of existing institutions, and loyalty to the status quo” (13). In other words, those in power have an interest in maintaining their power, and a “useable past” that conforms to their vision of present-day conditions can function as a strong tool in upholding their status.
It is a mistake, however, to assume that only the “winners” of history have the power to manipulate the past to attain their present-day goals. This is especially the case in an age where the internet wields enormous potential for a person from any walk of life to build a powerful platform for spouting their beliefs and opinions. We must do away with this fiction that history is only written by the winners. (I know that “Winners” is a vague and ill-defined term in this context, but I will set aside any long-winded attempt at a definition for this post).
There may be no stronger example of “losers” writing widely accepted historical narratives than those who have advocated for the Lost Cause interpretation of the American Civil War. The central argument of the Lost Cause, of course, is that the Confederacy was morally and constitutionally right in its effort to secede from the United States. But loss is central to the Lost Cause in that many of its advocates argue that the Confederacy was doomed from the very beginning of the war because United States forces had superior resources and military forces to overwhelm it. Although the historical record demonstrates that there were several moments during the war when the Confederacy appeared to be on the brink of victory, the narrative power of young men patriotically putting their lives on the line for a doomed yet noble cause still appeals to a great number of Americans today.
In the years after the Civil War, Lost Cause advocates grabbed their pens and their pocketbooks in an effort to win the memory battle over the meaning of the nation’s bloodiest conflict. In 1866 Confederate General Daniel Harvey Hill established The Land We Love, a magazine that glorified Southern literature and agrarianism and provided a platform for Confederate veterans to publish their reminiscences of battle. From 1884 to 1887 the popular Century Magazine published its famous Battles and Leaders of the Civil War, which included lengthy articles from both United States and Confederate military leaders about the war. Former Confederate political leaders like Jefferson Davis and Alexander Stephens wrote autobiographies and histories of the Confederacy that reflected their version of events. Many history textbooks in schools throughout the country, but especially those in former Confederate states, taught a Lost Cause version of the war that glorified the Confederacy. Later on, a number of motion pictures like The Birth of a Nation and Gone With the Wind further extended the Lost Cause’s reach. And for roughly fifty years (1880–1930) countless millions of dollars were spent, through both donations and public tax revenues, to support the erection of monuments glorifying the Confederacy all across the South (and elsewhere, I’m sure).
All of these expressions of memory and historical interpretation were readily accepted by many if not most Americans all over the country after the war. The “losers” succeeded in writing a history that gained popular acceptance in American society. And the Lost Cause interpretation of the war is readily available for those looking to study it today. Anyone can go online and read Davis, Stephens, and many other Lost Cause materials on Google Books or HathiTrust. Anyone can find the Declarations of Secession written by the various Southern states that chose to explain their reasoning for embracing disunion.
History is written by everybody, not just the “winners.” It’s true that there have been times in history when “official narratives” have aimed to eradicate alternate historical interpretations that didn’t fully conform to the desires of the nation-state, the Church, or what have you. But the bigger point that is equally true is that historical counter-narratives always exist to subvert “victors” history, both orally and in print. “History is written by the victors” is a lazy argument that is usually deployed in the absence of historical evidence to defend claims about the past. This is why it was so ironic to me when I heard the complaint that “history is written by the victors” when the city of New Orleans decided to take down their Confederate statues in December. Clearly that’s not a true statement once you see how former Confederates and their supporters succeeded in shaping NOLA’s commemorative landscape for more than 150 years following the end of the Civil War.
The Atlantic has posted an essay by Alia Wong on U.S. history textbooks in K-12 classes that is worth reading. The essay focuses on the recent discovery of a ridiculous claim in a history textbook published by McGraw Hill suggesting that African slaves brought to the American colonies from the 1600s to the 1800s were “immigrants” to this land who somehow came here of their own free will. You would think that twenty years after the “textbook wars” of the 1990s and the critically acclaimed publication of James Loewen’s Lies My Teacher Told Me: Everything Your American History Textbook Got Wrong, textbook companies like McGraw Hill would be more careful about the claims they make in these textbooks, but I suppose that is asking too much when a group like the Texas Board of Education wields so much power in determining what gets into history textbooks around the country. You often hear George Santayana’s abused quote about people who don’t remember the past being doomed to repeat it, but it seems that there are times when people who do remember the past, and in some cases actively participated in that past, are actually more doomed to repeat it.
There is a bigger problem than bad history textbooks in U.S. classrooms, however, and that is bad history teachers. To wit:
Compared to their counterparts in other subjects, high-school history teachers are, at least in terms of academic credentials, among the least qualified. A report by the American Academy of Arts & Sciences on public high-school educators in 11 subjects found that in the 2011-12 school year, more than a third—34 percent—of those teaching history classes as a primary assignment had neither majored nor been certified in the subject; only about a fourth of them had both credentials. (At least half of the teachers in each of the other 10 categories had both majored and been certified in their assigned subjects.)
In fact, of the 11 subjects—which include the arts, several foreign languages, and natural science—history has seen the largest decline in the percentage of teachers with postsecondary degrees between 2004 and 2012. And it seems that much of the problem has little to do with money: The federal government has already dedicated more than $1 billion over the last decade to developing quality U.S.-history teachers, the largest influx of funding ever, with limited overall results. That’s in part because preparation and licensing policies for teachers vary so much from state to state.
A recent report from the National History Education Clearinghouse revealed a patchwork of training and certification requirements across the country: Only 17 or so states make college course hours in history a criterion for certification, and no state requires history-teacher candidates to have a major or minor in history in order to teach it.
“Many [history teachers] aren’t even interested in American history,” said Loewen, who’s conducted workshops with thousands of history educators across the country, often taking informal polls of their background and competence in the subject. “They just happen to be assigned to it.”
A bad history textbook in the hands of a good teacher can be turned into a useful instrument for teaching students about the construction of historical narratives, the differences between history and memory, and, of course, the factually correct historical content. A bad history teacher can lead students towards a lifetime hatred of history, regardless of how factually correct their textbook is.
I did not know that 34 percent of history teachers had neither majored nor been certified in history, nor did I know that only about 17 states require college coursework in history for certification, but I can safely say that Loewen’s observations about people being “assigned” to teach history are true. They often have “coach” in their title.
I do not mean to suggest that all coaches are bad teachers or lack historical knowledge. My initial inspiration for studying history in college was sparked in large part by a Western Civilization teacher during my senior year of high school who also happened to coach football and basketball. But that was the thing; every student viewed him as a teacher who also happened to coach, rather than as a coach who also happened to teach history. And unfortunately there were several coaches at my high school who were simply unfit to teach history.
Is there a lack of qualified history teachers in the United States for our K-12 schools, or does the problem lie in a lack of opportunities for qualified history teachers to find gainful employment in K-12 schools?
Addendum: If you’re a teacher who is frustrated with the quality of your history textbook, I highly recommend that you take advantage of The American Yawp, a free online history textbook that is collaboratively written by some of the best and brightest historians in the country. It is designed for a college classroom but I have no doubt that high school students, especially those in AP classes, could use it to their advantage.
Over the past few weeks the New York Times has rekindled a longstanding debate among scholars and educators over the role of lecturing in the college classroom. Back in September Annie Murphy Paul suggested that college lectures are “a specific cultural form that favors some people while discriminating against others, including women, minorities and low-income and first generation college students. This is not a matter of instructor bias; it is the lecture format itself . . . that offers unfair advantages to an already privileged population.” This month Molly Worthen responded with a defense of the traditional lecture, arguing that “lectures are essential for teaching the humanities’ most basic skills: comprehension and reasoning, skills whose value extends beyond the classroom to the essential demands of working life and citizenship.”
Both essays make good points that I agree with. Since I adhere to the idea that knowledge is constructed and that people rely on prior knowledge when making connections to new intellectual content, I can see Paul’s argument that poor and minority students who attended inferior schools during their youth can be at a disadvantage in a lecture-based college classroom. Conversely, I can also agree with Worthen that lectures expose students to content experts who have a platform to share their knowledge beyond the confines of a TV soundbite or YouTube video. I also agree with her that lectures can challenge students to synthesize information and take good notes.
I do not approach this conversation as an experienced college professor, but as a certified social studies teacher who had a cup of coffee in the middle/high school teaching world a few years ago and as a current interpreter for the National Park Service, where a parallel discussion is taking place about whether interpreters should play the role of “sage on the stage” or “guide by the side” during visitor interactions. These jobs have allowed me to participate in and facilitate learning experiences through a wide range of mediums. These experiences inform my opinion that lectures can be an effective tool for generating effective learning experiences, but only if they are used within reason, at appropriate times. Furthermore, it’s not productive to look at lectures and active learning as either/or propositions. Educators should be well-versed in a range of teaching methods, and I believe most critics of the lecture format are asking professors to expand their pedagogical vocabulary rather than asking them to fully abolish the traditional lecture course, as Worthen suggests.
Before I advance my arguments further, we should pause and ask what, exactly, constitutes a lecture. Derek Bruff of Vanderbilt University offers a useful distinction between educators who incorporate discussion and interaction throughout their lectures and others who engage in what he calls “continuous exposition,” which is completely devoid of any student interaction and is really just a monologue. The “continuous exposition” was a staple of my undergraduate education, and it was a real drag most of the time. I had a number of professors who lectured for the entire period and then, with five minutes left, would ask if anyone had questions. In my five years in undergrad I don’t think a single student ever asked a question during those five-minute windows, largely because most students wanted to get out of class by that point and understood that any sort of real, substantive Q&A with the professor would require much more than five minutes. A more active approach to lecturing–or a wholly different approach altogether–would have yielded more feedback from students if these professors truly cared about that feedback.
Another consideration is how much emphasis is given to the lecture in evaluating a student’s performance in a given class. In a continuous exposition lecture, the student’s grade is tied almost exclusively to his or her ability to recite in written form what the professor says during the lecture. This too is a problem in my mind because it places too much emphasis on rote memorization and recitation of content at the expense of training students to think about interpretation, analysis, and the process of drawing informed conclusions. I like Andrew Joseph Pegoda’s “high stakes quizzing” approach, which places much more emphasis on assigned readings outside the classroom, frequent quizzes that challenge students to draw conclusions about their readings, and classroom discussions about those readings that are guided–but not exclusively directed–by the professor. This approach invites thoughtful student interaction while also allowing the professor the option to step back or jump into the discussion as necessary.
Yet another consideration in this discussion is reconciling the underlying tension between disciplinary knowledge and educational theory in educating future teachers. Most of my history professors were primarily focused on teaching content and used the continuous exposition model to convey that content, but my education professors stressed that we could only lecture for ten minutes to our future students and that we would have to utilize other active learning methods for the bulk of our classroom experiences (these education professors, ironically enough, often had a tendency to lecture for more than an hour to us). Historian and educator Fritz Fischer, writing in the June 2011 issue of Historically Speaking, explains that:
My students and I struggle with trying to connect the world of academic history with the world of pedagogical training. On the one hand, they were told by the educational theorists to create a “student centered” classroom and to rely on clever and fun classroom activities such as jigsaws and Socratic debates. These activities were often intellectually vapid, devoid of historical content or an understanding of historical context. On the other hand, they sat in their introductory history courses and listened to lectures about THE story of the past. Some of the lectures might have been engaging, interesting, and powerful, but were they really reflective of what historians do, and could they be at all helpful in a K-12 history classroom? How were my students to reconcile these worlds? (15)
The best way to reconcile these worlds, in my opinion, is to embrace a balanced approach to teaching that values lecturing not as the ultimate means for obtaining knowledge but as a tool within a larger arsenal that includes the use of other methods such as classroom discussions, group projects, classroom websites and blogs, and assignments that challenge students to develop and communicate arguments through written and oral form.
The challenge, of course, is designing these student-centered activities in ways that incorporate both content and disciplinary process. Bruce A. Lesh offers some great examples of implementing a balanced teaching approach in the middle/high school history classroom in his book “Why Won’t You Just Tell Us the Answer?”: Teaching Historical Thinking in Grades 7-12. In one case he challenges students to envision themselves as public historians who are tasked with documenting historical events through the creation of a historical marker. Students work on a given topic and are tasked with doing historical research, writing the text for this historical marker, and then explaining their methods and interpretations in both written form and during classroom discussion. This is a perfect example of an intellectually rigorous but student-centered approach to teaching historical thinking and content. It allows students a platform to contribute their own knowledge to the learning process, but it also allows the teacher to facilitate conversation and act as a content expert when necessary. Furthermore, it’s an activity that can be catered to students of all ages, whether they’re in elementary school or college.
So, while I don’t think educators need to fully discard the lecture, I think they should take the time to ensure they use it with proper care and with students’ learning journeys in mind.
P.S. I meant to include a link to Josh Eyler’s post “Active Learning Is Not Our Enemy,” which is very good and worth reading, but forgot. I owe a debt to Josh for sparking some of my own thoughts in this essay.
Over the past week I’ve been reading Lying About Hitler: History, Holocaust, and the David Irving Trial by Richard J. Evans, a British historian noted for his scholarship on modern German history. The book is fantastic, a must-read for its discussions of Nazi history and the nature of historical research and interpretation.
Evans wrote the book after serving as chief historical adviser and witness for the defense team in a famous libel suit that took place in a British court in the late 1990s and early 2000s. The suit was brought by David Irving, a now-largely discredited “historian” of World War II and Nazi history who was accused by author Deborah Lipstadt of being a Holocaust denier in a book she wrote about the topic. Lipstadt and her publisher, Penguin Books UK, enlisted Evans’s help in determining whether or not Lipstadt’s claims about Irving were true and, if so, how Irving manipulated the historical record to exonerate Adolf Hitler and minimize the horrors of the Holocaust. These challenges were particularly difficult for the defense team because British libel laws assume that accusers/plaintiffs in these cases are acting in good faith, which essentially places the burden of proof on the defense instead of the plaintiff. Evans and several PhD candidates spent more than a year and a half researching Irving’s works, primary source documents, and other relevant historiography, and Evans himself spent several days in the witness box during the trial.
While the case was primarily concerned with determining whether or not Irving had manipulated the historical record to promote his political agenda (and NOT whether or not the Holocaust had occurred in the first place), it proved to be an interesting one for the entire historical enterprise because it also raised important questions about truth, objectivity, and the boundaries of reasonable interpretation in historical scholarship. The entire profession, in a sense, was on trial. One journalist at the trial–responding to Irving’s claims that the memories of thousands of Holocaust survivors were subject to dismissal because of the victims’ delusional thinking and a vast conspiracy by the Jewish community to perpetuate falsehoods about the Holocaust–remarked that:
It is history itself which is on trial here, the whole business of drawing conclusions from evidence. If Irving is able to dismiss the testimony of tens of thousands of witnesses, where does that leave history? If we can’t know this, how can we know that Napoleon fought at Waterloo or that Henry VIII had six wives? How can we know anything? . . . If we start to doubt corroborated facts, how can we prevent ourselves being swallowed up in doubt, unable to trust anything we see? It might all be a conspiracy, a legend, a hoax. This is the bizarre, never-never world inhabited by David Irving. Now the court has to decide: is this our world too? (195)
In the course of researching and testifying at the trial Evans uncovered instance after instance in which Irving intentionally manipulated historical evidence by selectively choosing, altering, and misquoting documents, falsifying quantitative data, and relying on primary sources that were universally declared by trained historians to be forgeries and/or deliberate falsehoods. Evans presented substantial evidence suggesting that Irving had consistently argued in books, interviews, and talks that Hitler neither knew about nor ordered violence against Jews during Kristallnacht or their total extermination during World War II; that gas chambers were never used to kill Jews during the war; that the figure of six million Jews killed was a deliberate exaggeration perpetuated in part by the Jewish community (Irving placed the number of Jews killed around 100,000, most of whose deaths he attributed to disease at the concentration camps); and that the bombing of Dresden by Allied forces in 1945 had actually killed upwards of 250,000 Germans instead of the roughly 25,000 that most contemporary officials reported at the time and most historians accept as an appropriate figure today. Downplaying the total number of Jews killed by the Nazis and playing up the total number of Germans killed at Dresden, of course, allowed Irving to argue that the conduct of Allied forces during the war towards Germany was harsher and more brutal than Nazi actions towards European Jews. Evans proves without a doubt in Lying About Hitler that all of these claims are absolute bunk.
The court found in favor of Deborah Lipstadt and the defense team in 2000. She had not committed an act of libel when she claimed that Irving was a Holocaust denier, and it was determined that Irving had in fact manipulated the historical record to justify his antisemitic and racist views.
Evans neatly summarizes some of the central issues this case raised for the historical enterprise in the last chapter of Lying About Hitler. He asks two questions:
- “What are the boundaries of legitimate disagreement among historians?”
- “How far do historians’ interpretations depend on a selective reading of the evidence, and where does selectivity end and bias begin?”
Evans argues that while historians frame their questions from a range of perspectives and disciplinary approaches, they are obligated to read historical evidence “as fully and fairly as they can.” Using Joseph Goebbels’s diary as a case study, he asserts that it is useless to cherry-pick quotes from the diary to support an argument when another historian could pick other quotes and potentially refute your argument. “What a professional historian does,” Evans argues, “is to take the whole of the source in question into account, and check it against other relevant sources, to reach a reasoned conclusion that will withstand critical scrutiny by other historians who look at the same material . . . Argument between historians is limited by what the evidence allows them to say” (248-250). He then uses a metaphor that I find extremely convincing to reinforce his points:
Suppose we think of historians like figurative painters sitting at various places around a mountain. They will paint it in different styles, using different techniques and different materials, they will see it in a different light or from a different distance according to where they are, and they will view it from different angles. They may even disagree about some aspects of its appearance, or some of its features. But they will all be painting the same mountain. If one of them paints a fried egg, or a railway engine, we are entitled to say that she or he is wrong; whatever it is that the artist has painted, it is not the mountain. The possibilities of legitimate disagreement and variation are limited by the evidence in front of their eyes. An objective historian is simply one who works within these limits. They are limits that allow a wide latitude for differing interpretations of the same document or source, but they are limits all the same (250).
Hats off to Dr. Evans’s important work in this case, both for the victims of the Nazi Holocaust and for the historical enterprise as a whole.
A few weeks ago The Bitter Southerner published a nice essay on Civil War reenacting. The author asserts that reenacting is in a period of transition as some enthusiasts push for a more holistic understanding of Civil War history, one that strives for “deeper, truer purposes” by explaining why this war was fought in the first place. Moreover, the author argues that “Civil War reenactments are as popular now as they have ever been,” so therefore historical reenactors–and the entire public history field–can and should find ways to use reenacting to disseminate a better understanding of the Civil War to the public.
I don’t buy the argument that Civil War reenacting–or any sort of reenacting save for maybe World War II–is as popular as ever. While it’s my understanding that the Civil War Sesquicentennial did bring out a large number of reenactors to commemorate the 150th anniversary of various battles like Shiloh and Gettysburg and General Lee’s surrender at Appomattox Court House, what’s notable about the Sesquicentennial is how many reenactors who grew up during the Centennial years of the 1960s chose to wear their outfits for the last time with the end of the 150th this year. I highly doubt that the so-called “millennial” generation is going to maintain the popularity of Civil War or any historical reenacting in the future. Another article in the Peoria Journal Star seems to confirm my skepticism.
The Journal Star article focuses on Fort Crevecoeur in Illinois, which was first established in the late seventeenth century. The fort doesn’t have the popularity of a Civil War site, but during the 1980s it held reenactments of its days as a trading post that drew crowds of more than 3,000 visitors. Today only around 15 or 20 reenactors still participate in events at the fort. The site has expanded its offerings and now hosts a French and Indian War reenactment, but concerns remain about the future of these events and even the site itself.
Three reasons are offered in the Journal Star for explaining why historical reenactments aren’t popular with younger people today:
- Kids are more interested in technology “and air conditioning” than spending the day outside at a reenactment.
- Kids don’t have the money to invest in reenacting.
- Older reenactors don’t want to be around young people and look upon them “with disdain.”
I think explanation two–that kids [i.e. their parents] don’t have the money–is the most plausible. Time is an important factor as well, however. Organized sports, to take one example, play a much larger role in many children’s lives than they did in the 1960s or 70s. Kids also have more homework than ever before. And while older folks can definitely be grumpy sometimes, I think those sentiments are reflective of a basic fact of human nature: young people like hanging out with other young people during their free time. Ditto to older people. And both groups can unfortunately look at each other with too much skepticism.
The bit about kids being more interested in technology than being outside is a red herring that distracts us from questioning why this site’s current programming is not attracting visitors. Sure, kids are inside more often and some teens are spending as much as seven and a half hours a day consuming media, but who’s to blame for that? The kids are being raised by parents who attended those reenactments as children in the 1980s and are choosing not to return. Adults are spending more time inside consuming media on computers, phones, and televisions too. Kids who spend all day online are doing so because their parents allow them to, and because parents often indulge in the same types of behavior.
Is the future of public history doomed because of digital technology, or is there an opportunity for public historians to thoughtfully incorporate at least some semblance of digital technology to enhance their programming? If you agree with me that the latter course is a better one, then you’ll see that throwing our hands up and doing nothing but moaning about kids and their phones will most certainly hasten this field’s demise. Let’s stop making excuses. Michael Twitty, a historical reenactor who often portrays himself as an enslaved cook at events and is quoted in the aforementioned Bitter Southerner piece, is right when he says that “my job is to bring to life what the life of an enslaved person looked like so that you can take a picture of it with your iPhone and share this knowledge.”
Historical reenacting is suffering in part, I think, because the way we teach history to students in the classroom is changing, both in content and method. What I call the “dates, dead people, and dust” approach to history education is slowly going away and being replaced with teaching methods that embrace nuance and context and incorporate primary sources alongside select readings from academic historians. Memorizing a bunch of facts for a forty-question multiple choice test is not an effective way of teaching history, and it seems like more teachers are beginning to challenge their students to interpret the past using primary and secondary evidence and to then make compelling arguments in written and oral form. Moreover, the discussion topics in history classrooms, for better or worse, are moving away from heroic narratives of great white men, dramatic battles, and military tactics towards questions of politics, economics, social practices, group identities, and history as a process of connected events with implications for the present instead of a series of unconnected events on a timeline from a bygone, irrelevant era.
Given these changes in the classroom, we should be unsurprised when historical reenactments that privilege dry lectures, minute facts about obscure historical artifacts, and narratives that literally whitewash the past don’t attract the attention of young or diverse audiences.
Do I think historical reenacting is ineffective or a waste of time? Absolutely not! I’ve done historical reenacting myself (poorly). When I lived in Indiana I participated in Conner Prairie’s “Follow the North Star” program and was profoundly moved by the entire experience. But the reenacting alone wasn’t what impressed me. The program gave me a chance to actively participate in the event itself and contribute my own voice through a post-event facilitated dialogue run by a professional. The dialogue challenged all participants to reflect on their own experiences during the event and then comment on the role of history in shaping our society today. Plus they gave us resources to learn more about slavery, the Underground Railroad, and Indiana history at the very end. It was reenacting with emotion, passion, intelligence, active participation, and respect for the past. We need more of that in public history if we want reenacting to play an important educational role in the future.
Yesterday Vox published an essay from scholar Sarah A. Chrisman about her and her husband’s ongoing effort to live their everyday lives in the Victorian era of the 1880s and 1890s. Literally. They live in a house built in 1888, exclusively use historical technologies like iceboxes and mechanical clocks (although I guess the internet is fair game?), wear Victorian-style clothing in and out of the house, and bathe with a bowl and pitcher every morning. They are going full-out with this experiment.
While these two clearly love Victorian culture and use it as a way of bonding with each other, I think this project is wacky. I saw the original article through a tweet from Slate writer Rebecca Onion. Several people commented on that tweet that the project was not historically accurate and that it was “pure caucasity,” an experiment undertaken by a well-to-do white couple who consciously chose to live their lives as wealthy Victorian elites and not, say, poor working-class immigrants who might die at age 40 of dysentery or consumption. I jokingly remarked on Facebook that it was the apotheosis of hipster divinity, by which I meant that the hipster ideal of bringing back historical aesthetics, fashion sensibilities, and technologies into the counterculture of modern society was taken to a new level by Chrisman and her husband. There is also an irony in forgoing modern technologies and lamenting the excesses of free market capitalism today while celebrating a period in which capitalism was arguably at its harshest and most unregulated, all while using Victorian technologies that would have been cutting edge at the time.
Onion was quick to write a must-read, brilliant critique of this Victorian living experiment. I don’t propose to completely repeat that critique here, but I want to focus on what I believe is Chrisman’s very naive understanding of the relationship between the past (what actually happened), history (what we say about what happened in the past and the narratives we form to tell those stories), and the role of interpretation in shaping our understanding of the past.
Chrisman suggests that secondary sources (documents created about a historical event after the fact) can lead to misunderstanding and confusion about the past. That’s certainly true, but she goes even further by relying exclusively on primary source materials written during the Victorian Era to determine the accuracy of her living experiment. She states this argument in the following paragraph:
The artifacts in our home represent what historians call “primary source materials,” items directly from the period of study. Anything can be a primary source, although the term usually refers to texts. The books and magazines the Victorians themselves wrote and read constitute the vast bulk of our reading materials — and since reading is our favorite pastime, they fill a large percentage of our days. There is a universe of difference between a book or magazine article about the Victorian era and one actually written in the period. Modern commentaries on the past can get appallingly like the game “telephone”: One person misinterprets something, the next exaggerates it, a third twists it to serve an agenda, and so on. Going back to the original sources is the only way to learn the truth.
There are a few problems here. One is that no matter how many historical artifacts a person possesses from any given time period, that person is not living her life within the social, political, and economic conditions of that period. Victorian women, for example, did not have the right to vote and were often relegated to a life of domestic homemaking (believed at the time to be their “natural” sphere in life). Many also had limited opportunities to obtain a quality education. These are challenges that Chrisman does not have to deal with today. An idealized and imagined past of iceboxes and mechanical clocks can only take you so far in recreating an accurate, “real” past.
There is also an assumption in this passage that only secondary sources are prone to misinterpretation, exaggeration, and agenda-filled commentaries. On the contrary, primary sources are just as prone to the same issues as secondary sources and can do much to distort “the truth.” Elizabeth Varon’s Appomattox: Victory, Defeat, and Freedom at the End of the Civil War perfectly demonstrates how easy it is for primary sources to lead readers in the wrong direction. Varon points out that in the aftermath of General Lee’s surrender to General Grant there was much confusion among contemporaries about the meaning of the Appomattox surrender terms. Abolitionists and African Americans believed that Grant’s terms vindicated the cause of emancipation and laid the foundations for black citizenship, voting rights, and even racial equality in the postwar years. Conservative Northern Unionists and former Confederates, however, viewed Grant’s generous terms as a call for a more cautious approach to reconstructing the nation that would place prewar Southern political elites back into power and maintain the underpinnings of white supremacy. Added to the challenge of understanding Appomattox are the rampant misinterpretations and exaggerations of some contemporary newspapers and letter writers. Some people sympathetic to the Confederacy stated that U.S. forces outnumbered Confederate forces ten to one at the time of Lee’s surrender. That interpretation in many cases served a larger twisted agenda arguing that Grant’s victory came about solely because he had more troops and superior resources than Lee. Grant’s men and the causes they fought for were supposedly tainted because they had won an unfair fight, a war of “might over right” instead of “right over might.”
But hey, these are all primary sources, right? We just need to look at what people said at the time–no matter how disparate their views were–and get back to the actual surrender terms instead of these tainted secondary sources to uncover their meaning and find the truth, eh?
Historians are trained to research primary and secondary sources, develop interpretations of these sources using the best available evidence and theories, and construct narratives that best reflect the realities of the past. The best historical scholarship pushes us to new understandings of both past and present, but we shouldn’t kid ourselves about the ability to literally time travel into the past either through historical research or “period living.”
The historian R. G. Collingwood argued in the 1940s that historians, through hard work and intense research, could “know” Julius Caesar and put themselves in his mind, understanding “the situation in which Caesar stood, and thinking for himself what Caesar thought about the situation and the possible ways of dealing with it.” Implicit in this argument, as Sam Wineburg points out, is the assumption that “human ways of thought, in some deep and essential way, transcend time and space.”
I’m not as confident as Collingwood that we can completely get into the mindset of any historical person or time period. I actually believe the opposite is true: that the combination of limited primary sources and our distance in both time and space prevents us from truly grasping the full truth of the past in most instances. We are just trying to make sense of the past to the best of our abilities and then somehow use that knowledge to inform society as we move rapidly into an unknowable future.
I hope Chrisman and her husband enjoy their Victorian lifestyle, but please don’t lecture the rest of us about how your adventure gets you closer to some elusive “truth” of the past that we miss by not living the life of a bourgeois Victorian elite.