
On Compromise and the Coming of the Civil War

The essence of all politics is the art of compromise. The success or failure of a nation-state’s policy goals lies in the ability of its political actors–some of whom may have vastly different interests–to negotiate and sometimes compromise on preferred ideals in the interest of crafting intelligent policy that promotes the greater good. Compromise, of course, doesn’t always lead to positive outcomes. As the philosopher Avishai Margalit beautifully argues in On Compromise and Rotten Compromises:

We very rarely attain what is first on our list of priorities, either as individuals or as collectives. We are forced by circumstances to settle for much less than what we aspire to. We compromise. We should, I believe, be judged by our compromises more than by our ideals and norms. Ideals may tell us something important about what we would like to be. But compromises tell us who we are. (5)

Superficially, it sounds silly to ask whether compromises are good or bad, much like asking whether bacteria are good or bad: we cannot live without bacteria, though sometimes we die because of bacteria. Yet that asymmetry makes the question about the goodness and the badness of bacteria, as well as those of compromise, worth asking. We have ten times as many bacteria in our bodies as we have cells, and many of those are vital for our existence. A small number of bacteria are pathologic and cause disease, and with the proper treatment, we may get rid of them. Similarly, compromises are vital for social life, even though some compromises are pathogenic. We need antibiotics to resist pathogenic bacteria, and we need to actively resist rotten compromises that are lethal for the moral life of a body politic. (7)

This description captures one of the most fundamental quandaries of human existence: when should individuals and groups make compromises on ideals to accomplish an objective, and when is refusing to compromise the better option of the two? Studying history is a worthwhile endeavor for considering the ramifications of political compromise on the health of a nation-state and its people.

It was with this conception of compromise in mind that I read historian Carole Emberton’s fine essay in the Washington Post and Caleb McDaniel’s in The Atlantic today on the breakdown of compromise efforts leading up to the Civil War. White northerners and southerners forged successful compromise efforts (at least in the minds of those seeking political union between the sections) on the issue of slavery from the nation’s founding onward. As the country acquired new western territory through conquest and purchase in the years before the Civil War, debates continually sprang up about whether the institution of slavery would accompany the white American settlers moving westward. In hindsight, various compromise efforts like the 1820 Missouri Compromise, the Compromise of 1850, and others were really measures to appease the proslavery south, but they nonetheless allowed the Union to be maintained for nearly eighty years after its founding.

It’s worth asking students of the Civil War to consider how compromise over slavery was possible in 1850 but not in 1860. My answer would be that the Republican Party’s successful entrance into electoral politics changed the game. The Republicans explicitly organized as a party in 1854 on the principle that slavery should be banned in the western territories and left open for free labor (for some Republicans, this meant only free white labor). Although Abraham Lincoln acknowledged that, constitutionally speaking, slavery could not be touched where it already existed in the south, his personal hatred of slavery was well-known and feared by proslavery fire-eaters who saw his election as a step towards federal governance dominated by northern anti-slavery convictions–in other words, an administration that was hostile to the south’s economic, political, and social interest in keeping African Americans enslaved.

President-elect Lincoln was willing to compromise to the extent that he offered support to the first proposed 13th Amendment guaranteeing the federal government’s protection of slavery in the states where it already existed, but he refused to compromise on the question of slavery’s westward expansion, drawing a line in the sand and arguing that he had been elected on the belief that the west should be for free labor. Compromising on this question would sacrifice the Republican Party’s core principle of existence. Likewise, many white Southern Democrats argued that talk of disunion would subside if the federal government passed legislation guaranteeing the right to bring their slave property west with them. They refused, however, to accept any compromise that fell short of these new guarantees from the federal government. As Emberton argues, “it was slavery, and the refusal of Southern slaveholders to compromise on slavery, that launched the Civil War.”

Cheers


On Using Historical Analogies Responsibly

Is President Donald Trump like Andrew Jackson?

Wait, maybe he’s more like Andrew Johnson.

Or King George III.

Or the Founding Fathers.

Or Aaron Burr.

Or John Quincy Adams.

Or Abraham Lincoln.

Or Jefferson Davis.

Or Horace Greeley.

Or Ulysses S. Grant.

Or James K. Vardaman.

Or Theodore Roosevelt.

Or Huey Long.

Or Benito Mussolini.

Or George Patton.

Or Franklin Roosevelt.

Or George Wallace.

Or Barry Goldwater.

Or Richard Nixon.

Or Ronald Reagan.

Or Hugo Chavez.

Over the past week historians have been debating the merits of using historical analogy to educate lay audiences about the messy circumstances of our current political moment. Moshik Temkin started the discussion with an op-ed in the New York Times decrying the “historian as pundit” persona that, as can be seen above, has gotten attention within the online realm (not all of those essays were written by historians, but you get the point). Temkin expresses worries about “the rapid-fire, superficial way history is being presented, as if it’s mostly a matter of drawing historical analogies,” which in turn simplifies, trivializes, and downplays the significance of both past and present-day events. Conversely, many historians on my Twitter feed reacted negatively to Temkin’s piece, arguing that we must meet people where they are and that analogy provides opportunities for historians to demonstrate changes and continuities in American history.

Is there room to argue that both sides of this argument are a little bit right and a little bit wrong? I think so.

I do not agree with Temkin when he suggests historians should avoid appearances on TV and “quick-take notes” in a news article. Nor do I agree with the argument that we should leave analogy solely to the non-historian pundits. There are limitations to both TV and newspaper articles since they offer only small tidbits and soundbites for expressing a particular viewpoint, but they do offer historians an opportunity to demonstrate the value of the past in shaping the present. For example, my friend and fellow public historian Will Stoutamire contributed some wonderful insights to this article on the history of Arizona’s Confederate monuments. Last I heard, that particular article had been viewed something like 70,000 times over the past month. Not bad! Likewise, I agree with Julian Zelizer when he argues that:

Historians have an important role in unpacking key elements of the ways that institutions operate over time to make sense of big trends and broader forces that move beyond the particular moment within which we live. We can’t become so blinded by our concern for particularity and specificity and nuance that we lose sight of the big picture.

At the same time, however, is Temkin incorrect when he suggests that we should be wary of poor historical analogies? Is he wrong when he asserts that we should remind our audiences that a similar event or person from the past does not lead to a similar outcome in the present? Can we conclude that some of the above historical analogies are trite and unhelpful? Are there better questions we can ask about the past and how it has shaped the present? Is there room to sometimes discuss the past on its own terms without resorting to comparisons with the present? I was struck by a recent article from a senior English major who, in discussing national politics in the classroom, warned that “if authors are only worth reading insofar as they inform modern phenomena, then the entire English canon is of mere antiquarian interest and can be summarily dismissed.” If you substitute ‘history’ for ‘English,’ do we run into the same problem by downplaying huge swaths of history that don’t have an explicit relevance to current politics?

A huge shortcoming of this entire discussion, of course, is that public historians and the work they do are completely left out of the conversation. Here’s the thing: public historians work in small spaces all the time, spaces that are more often than not much smaller than the ones academics use. We don’t get sixty minutes for a lecture, 400 pages to write a book, or even a New York Times opinion piece. We get ten-minute introductions, tweets, short Facebook posts, museum exhibits that are often viewed for ten seconds or less, and other educational programming of short duration. Both Temkin and his critics leave this important work out of their discussion.

So here’s a strong middle ground from which to argue. Historians should always strive to meet people where they are in their learning journey. They ought to embrace opportunities to give talks, speak on news shows, be quoted in newspaper articles, or write op-eds for media outlets with large platforms. At the same time, they ought to use historical analogies responsibly and within the context of highlighting the importance of studying history. The past itself is interesting on its own terms, and sometimes it’s okay to discuss it without resorting to a comparison with Donald Trump. And perhaps academic historians can learn a thing or two from public historians about translating complex historical subjects into clear, accessible interpretations of the past for a wide range of audiences.

Cheers

The Public Education “Culture Wars” of the Reconstruction Era

The historiography of the Reconstruction era has been and continues to be overwhelmingly focused on questions of race, citizenship, and equal protection under the law in the years after the American Civil War. For an era of remarkable constitutional change and the dramatic transition of four million formerly enslaved people into citizens (and, for some, into voters and elected leaders), this focus is understandable. Reconstruction-era scholars almost unanimously agree today that Reconstruction was a noble but “unfinished revolution” undone by the end of military rule in the South in 1877 and an apathetic white North no longer interested in protecting black rights, which in turn allowed unrepentant, racist white Southern Democrats to overtake their state governments and impose Jim Crow laws that ushered in a long era of white political supremacy throughout the region.

The “unfinished revolution” thesis is undoubtedly true, but there is more to the story of Reconstruction than the question of black civil rights (although the importance of that story cannot be overstated). The country’s finances were in shambles, and questions emerged about the best way to pay down the federal deficit and establish sound credit; women fought for the right to vote but were denied when the 15th amendment limited suffrage to men only; Indian tribes throughout the west faced the prospect of rapid white westward expansion and a federal government that simultaneously preached peace with the tribes but did little to stop white encroachment on their lands; and immigrants, mostly from Southern and Eastern Europe, began to settle in the United States, causing a great deal of consternation among political leaders about how best to assimilate these people into American culture.

Regarding the latter issue, historian Ward McAfee’s 1998 publication Religion, Race, and Reconstruction: The Public School in the Politics of the 1870s is a masterful treatment of the role of public education during the Reconstruction era. I just finished reading the book and I learned a ton from it.

McAfee’s thesis is essentially three-pronged. The first argument is that increasing numbers of immigrants to the U.S. during Reconstruction raised a great deal of concern within the Republican Party, especially among members who had flirted with Know-Nothingism in the 1850s and held anti-immigrant and anti-Catholic prejudices. Republicans feared that these immigrants held their allegiance to the Pope above their allegiance to the U.S., and that the Catholic church kept its parishioners illiterate, superstitious, and ignorant of the larger world. In this view, these immigrants would attempt to subvert the country’s republican institutions and make America a bulwark of the Vatican. The emergence of public education during Reconstruction, therefore, was not just an effort to educate the formerly enslaved but also an effort to promote (Protestant) morals, good citizenship, and obedience to republican institutions among immigrant children ostensibly being raised on Catholic principles.

The second argument relates to the division of taxpayer funds for public schools during Reconstruction. The era’s emerging public schools often incorporated Bible readings in class without much complaint. Republicans argued that Bible readings would teach good morals to students and that these teachings were appropriate as long as they took a “nonsectarian” approach that didn’t cater to any particular denomination. Most of these readings were done out of King James Bibles originally translated by the Church of England, however, and Catholics accused public school teachers of engaging in pro-Protestant, anti-Catholic teachings. To remedy this issue, Catholics established their own private, parochial schools and called upon the federal government to ensure that state tax funds for education be equally distributed between public “Protestant” schools and private Catholic schools. Republicans led the charge against splitting these funds and undertook an effort to ban public funding for “sectarian” schools. Towards the end of Reconstruction the Republicans made this issue a centerpiece of their party platform, and in 1875 Congressman James Blaine led an unsuccessful effort to pass a constitutional amendment banning public funding for sectarian schools (although “nonsectarian” religious instruction and Bible readings could still hypothetically take place in the public school classroom). While this amendment failed, 38 of the 50 states today still have their own state “Blaine amendments” banning the funding of sectarian schools.

The third and arguably most provocative argument from McAfee is his contention that Reconstruction failed largely because of an initiative by the radical wing of the Republican Party to mandate racially integrated “mixed-race” schooling in 1874. Most Republicans were skeptical of, if not outright hostile to, racially integrated public schools (in stark contrast to their desire to have children from Protestant, Catholic, and other religious backgrounds intermingle in public schools). Massachusetts Senator Charles Sumner, however, was a dedicated proponent of racial integration in the schools and refused to compromise on the issue. When Congress began debating the merits of a new civil rights bill in 1874 that would mandate equal treatment in public accommodations, public transportation, and jury service, Sumner insisted on including a clause mandating racially integrated public schools. When news of Sumner’s demands became public, Democrats and conservative Republicans in both the North and South responded with outrage. Conservative Republicans in particular stated that while equal treatment in public facilities was acceptable, mandating mixed schools was a bridge too far. Republicans lost control of Congress after the 1874 midterm elections, and, according to McAfee, the cause of this loss was the Radical Republicans’ insistence on mandating racial integration in schools.

Prior to reading McAfee I believed that the devastating Panic of 1873 was the primary reason why Republicans lost the 1874 midterms, but McAfee presents convincing evidence that the mixed-schools initiative also contributed to those losses in a significant way. With Democratic control of Congress now assured, Reconstruction’s future was doomed. A Civil Rights Act was passed in 1875–largely in tribute to Sumner, who died in 1874–that mandated equal treatment in public facilities and jury service, but the clause mandating racial integration of public schools was removed. In any case, the Supreme Court determined in the 1883 Civil Rights Cases that parts of the Civil Rights Act of 1875 were unconstitutional because, according to the court, the 14th amendment’s guarantee of equal protection of the laws applied only to the actions of the state and not to the actions of private individuals and organizations.

Religion, Race, and Reconstruction is a fine piece of intellectual history that brings life to a long-forgotten element of Reconstruction history, and I highly recommend the book to readers of this blog.

Cheers

Historical Thinking Promotes Informed Citizenship

In looking back at this recent and torturous U.S. presidential election, I believe the blatant and irresponsible sharing of fake news, inaccurate memes, and outright propaganda, combined with a general lack of civility and informed online conversation, contributed in some way to Donald Trump’s electoral victory. I do not mean to suggest that there were no other factors that contributed to this particular outcome, or that people on the left side of the political spectrum don’t also share fake news and stupid memes–they do. But evidence is mounting that fake and inaccurate news–particularly pro-Trump news–is widespread on social media and that many people, regardless of political preference, take misinformation seriously if it lines up with their own personal and political views. Facebook is especially bad in this regard. The chances are good that many voters who are also Facebook users went to the polls and made their decisions based partly on false information gleaned from articles shared on their news feeds.

Professor Mike Caulfield’s particularly sobering analysis of fake articles created by a fake paper, the “Denver Guardian,” that spread like wildfire across Facebook demonstrates how easy it is to get duped by someone with an agenda and basic computing skills. Friends and family whom I care about have also engaged in this sharing of fake news on Facebook, which I find deeply troubling. Facebook has evolved into a news-sharing website without creating a mechanism for effectively separating fact from fiction, and at the end of the day the site isn’t fun anymore. I haven’t checked my account since the election.

As a historian and educator I have stressed on this website the importance of teaching not just historical content in the classroom but also historical methods. When we teach both content and methods, we convey to students the idea that history is not just a mess of names, dates, and dead people, but a process that enables students to conduct research, interpret reliable primary and secondary source documents, and ultimately become better writers, readers, and thinkers in their own lives. I think that now more than ever these skills need to be taught not just for their utility in understanding the past but also for parsing through the vast multitudes of information that bombard our social media feeds on a daily basis. Historians have much to contribute to contemporary society, and they should lead the way in accomplishing this important work. When we learn to think historically, we enable ourselves to become more informed citizens who can participate in electoral politics with an understanding of the issues at hand and how our system of government operates.

I am interested in hearing from history teachers about what methods, tools, and practices they employ when teaching students how to distinguish between reliable and unreliable sources and how to interpret these sources to construct informed arguments and narratives. Sam Wineburg’s scholarship has been instrumental in my own thinking about these topics, and I believe everyone should listen to or read his keynote address at the 2015 meeting of the American Association for State and Local History. I have also utilized historian Kalani Craig’s guide on the 5 “Ps” of reading primary sources, which is equally relevant when assessing sources on contemporary topics.

What has worked for you when teaching others how to assess and interpret documentary sources? Please let me know in the comments.

Cheers

The Role of Uncertainty in Historical Thinking

I read a really interesting article today on Aeon from Stanford University history professor Caroline Winterer about the American Revolution, the creation of the U.S. Constitution, and Enlightenment ideals. The underlying thesis of the article is partly rooted in the idea that Americans today have mythologized and flattened the legacies of the country’s various constitutional framers in ways that diminish the complexity of their thinking and their basic humanity. That’s not necessarily a new or bold thesis, but the way Winterer approaches this conclusion was new to me. Most notably she points out that the British philosopher John Locke–an imposing intellectual figure in the minds of many of the country’s framers–believed that knowledge was obtained not through a divine God but through the five senses (empiricism) and language, and that the extent to which humans could trust their senses to provide an objective understanding of reality was very much uncertain. Similarly, since language was man-made and not the creation of God, the meanings ascribed to any word were subject to interpretation and merely “arbitrary signs that represent ideals.” What this meant for the framers, according to Winterer, was that uncertainty was a constant companion in their efforts to create a functioning government and a civil society:

In fact, the American founders were uncertain about many things. They were uncertain about politics, nature, society, economics, human beings and happiness. The sum total of human knowledge was smaller in the 18th century, when a few hardy souls could still aspire to know everything. But even in this smaller pond of knowledge, and within a smaller interpretive community of political actors, the founders did not pretend certainty on the questions of their day. Instead they routinely declared their uncertainty.

While I freely admit that I am no expert in early American history, this interpretation strikes me as largely correct. The effort to create a constitution based on laws rather than kings or divine providence was bold, ambitious, and fraught with uncertainty, which is why the framers established a process for amending the constitution to improve it in the future. But this article also got me thinking about the ways we teach history to middle school and high school students and why we need to make the idea of uncertainty a central element in teaching students how to think historically.

It is easy to look back in hindsight and diagnose past events as “inevitable.” Civil War historian Gary Gallagher, for example, often points out how easy it is to see the U.S. military’s victory at Gettysburg and conclude that this battle clearly led to an inevitable victory over the Confederates in the Civil War. But by understanding the sense of uncertainty people felt as events happened in real time, within circumstances often beyond their control, we can better empathize with the ways people in the past understood and reacted to the contingencies of their lives and their times. And perhaps we can teach students to embrace uncertainty in their own lives rather than seeing it as something to fear.

Because I grew up during the No Child Left Behind era, most of my history classes emphasized standardized tests, most of which were exclusively multiple choice. The tests and lessons I encountered emphasized rote memorization of facts, which in turn portrayed the study of history as an exercise in the mastery of information and the people of the past as all-knowing figures who in many cases were certain of the consequences of their actions (especially those who fought in the American Revolution and helped create the Constitution). By focusing on the importance of critically analyzing primary and secondary sources, making reasoned interpretations based on the available evidence for a particular historical event, and making evidence-based arguments through written, oral, and digital means, history teachers can perhaps bring the uncertainty of the past (and the present!) to the forefront of building historical thinking skills.

Cheers

Senator Ron Johnson: Too Many History Teachers, Not Enough “Destructive Technology”

It’s reassuring to know that there are enlightened people like Wisconsin Senator Ron Johnson who are in positions of power and have the ability to set education policy in this country.

Senator Johnson says that the “tenured professors in the higher education cartel” are working to keep college costs high and not doing enough to embrace digital technology like Blu-ray discs, the internet, and the world wide web in the classroom–a classroom that he believes should have fewer teachers, their work replaced by what he calls “destructive technology.”

Johnson: We’ve got the internet – you have so much information available. Why do we have to keep paying different lecturers to teach the same course? You get one solid lecturer and put it up online and have everybody available to the knowledge for a whole lot cheaper? But that doesn’t play well to tenured professors in the higher education cartel. So again, we need destructive technology for our higher education system.

WISPOLITICS: But online education is missing some facet of a good –

Johnson: Of course, it’s a combination, but prior to me doing this crazy thing [of being in the Senate] . . . I was really involved on a volunteer basis in an education system in Oshkosh. And one of the things we did in the Catholic school system was we had something called “academic excellence initiative.” How do you teach more, better, easier?

One of the examples I always used – if you want to teach the Civil War across the country, are you better off having, I don’t know, tens of thousands of history teachers that kind of know the subject, or would you be better popping in 14 hours of Ken Burns Civil War tape and then have those teachers proctor based on that excellent video production already done? You keep duplicating that over all these different subject areas.

Where do you even start with this nonsense?

  • Digital technology–more specifically education technology–is not a panacea that automatically enhances classroom learning. In 1922, Thomas Edison predicted that “the motion picture is destined to revolutionize our educational system and in a few years it will supplant largely, if not entirely, the use of textbooks.” That “revolution,” of course, never came about, partly because any sort of technology used in the classroom is merely a tool for achieving the larger goal of learning. Technology is not an end in and of itself, and watching a documentary is no more effective than listening to someone drone on forever at the front of a classroom. It’s how you use those tools that matters, and the best teachers put a range of tools–from pens and pencils to computers and tablets–to work in fostering a positive learning environment.

 

  • Jonathan Rees has blogged for several years about MOOCs and ed tech and has a book coming out on the subject. Mr. Johnson ought to read it.

 

  • Ken Burns is a wonderful filmmaker and producer, but his PBS series is not the definitive word on the history of the American Civil War. It’s been twenty-plus years since the documentary came out; it is dated and has a few questionable interpretations. Again, teaching history or any other subject doesn’t mean popping in a movie and having students take notes. Pairing the documentary with other works of scholarship–written and on film–and analyzing how historians have interpreted the war and constructed narratives about it is a better start. So is having students learn from a trained professional how to find, analyze, and interpret primary sources. And having a teacher facilitate dialogue through guided questions or some other thoughtful activity after the film holds more potential for learning than following a fourteen-hour documentary with yet another video from “one solid lecturer.”

 

  • Ron Johnson sounds like he hasn’t set foot in a college in forty years. Tenure basically doesn’t exist for most young faculty members anymore. The “higher education cartel,” if any such thing exists, has bought into Senator Johnson’s rhetoric and actively worked to implement austerity measures while relying more on part-time contingent faculty, especially since the 2008 recession. College doesn’t consist of professors constantly lecturing their students anymore, and higher education is not an Orwellian propaganda machine where students read Das Kapital and dream about Cultural Marxism all day and then party all night. We should be investing more in public education rather than advocating for “destructive technology” or busting up some make-believe “higher education cartel.”

You can’t make this stuff up.

Cheers

Print Books Still Play an Important Role in Learning Experiences


Photo Credit: Wikipedia, David Monniaux, flickr user 007 Tanuki, and Jorge Ryan.

In my time blogging at Exploring the Past I’ve gone on a sort of mini-crusade against conventional understandings within popular media about millennials’ relationship to digital technology and the ways they acquire knowledge. See here, here, and here for examples. Common arguments in this discourse include the belief that millennials acquire knowledge about the world in fundamentally different ways than older people; that old, conventional mediums of learning such as reading books or visiting museums are of little interest to millennials; and that we educators must fundamentally overhaul our approach to working with young students. We must embrace “disruption” in order to unlock the potential of young people. In the teaching world you might hear about the incorporation of digital technology in the form of iPads, computers, and ebooks as a way of making classes more hands-on and interactive, whereas in the public history world you might hear some vague jargon-y gobbledygook about “engagement” or “meeting the needs of a new generation” to get them to visit museums, National Parks, and the like.

I don’t buy into the “disruption” hype that says we must dismantle everything and completely do away with books, textbooks, or lectures (although I agree that educators can and do abuse the lecture medium to their students’ detriment). The logic of “disruption” fits into a long history of what one scholar describes as “giddy prophecies” about new developments in media technology. Thomas Edison predicted in 1922 that “the motion picture is destined to revolutionize our educational system and . . . in a few years it will supplant largely, if not entirely, the use of textbooks.” Similar prophecies have been uttered in recent years about floppy disks, CD-ROMs, and computers.

Well, it turns out that at least a few traditional educational mediums are resilient. A forthcoming study by linguistics professor Naomi Baron asserts that 92 percent of students and millennials prefer print books over ebooks, and that print publications still play an integral role in classrooms regardless of grade level. Print publications, in other words, still serve an important educational purpose nearly 100 years after Edison predicted their eventual demise. Furthermore, millennials actually read more than older adults!

Don’t get me wrong: I support the implementation of digital technology in both formal and informal learning environments, but I’ve always believed that such implementations need to be done with an understanding that these mediums are merely tools. They need to be used carefully towards the larger goal of making our students critical thinkers who ask good questions and demonstrate sharp, analytical thinking. If an “interactive” activity doesn’t accomplish these goals, then it’s worthless in my view. Rather than debating whether or not digital technology should play a role in education (it can and should), we need to discuss which approaches with digital tools work and which ones don’t. And again, the end goal is key. I believe Sam Wineburg is mostly correct when he asserts, with regard to the history classroom, that:

I don’t think that a history class should be about things such as . . . making cute posters, or about making history “engaging.” It’s about getting students to think rigorously about the evidence. Fun is okay, but I would rather have them hate the class and come out of the class having the skills needed to be good citizens than having them enjoy themselves.

Cheers

Get Out of Your Chair and Support Historic Preservation and Education in Your Community

I always said, blacks need to stop bringing up slavery all the time. It was a long time ago. Why can’t they just move on and forget about it? But then they wanted to move on and get rid of these confederate statues, and I was all like, “Things that happened a long time ago are still important. You shouldn’t forget about them!”

The above quote comes from a really funny piece of satire that a friend shared with me from The Push Pole, a website based out of Southern Louisiana. Its title seems apt for the times: “Thousands of History Buffs Magically Appear After City Council Votes to Remove Confederate Monuments.” The piece is funny because it’s rooted in a partial truth about the complex and contradictory ways Americans often choose to remember their history: “Never Forget” is an arbitrary term that extends to historical events and people we care about, but when it comes to historical things we consider to be overblown or simply not worth caring about, “we need to move on” becomes the default response. (See Andrew Joseph Pegoda’s essential essay on “Never Forget” for more thoughts on the subjective nature of the term).

The taking down or altering of some public statues, monuments, and memorials honoring the Confederacy sparked a vigorous debate in 2015 about the place of Confederate iconography in America’s commemorative landscape and whether some of these icons–particularly the ones in places of public governance, public schools, town squares, and the like–should remain in their place of honor. The online discussion took place through blog posts, newspaper op-eds, and thousands upon thousands of comments. While some of these discussions were productive and enlightening, we were also treated to excessive and misleading cries of “erasing history” (a flawed argument to make when analyzing public iconography), poor analogies that compared changes to Confederate iconography to ISIS-led destruction of Middle Eastern history, and emotion-filled hysterics that often said more about the politics of the present than any actual grasp of historical knowledge. And while folks got emotionally heated about Confederate icons, other historical artifacts such as this 19th-century Virginia slave cabin are being demolished or, in other cases, face potential demolition in the near future, all amid near silence both online and off.

What is the point of preserving symbolic icons that commemorate historic events and people if the actual artifacts that act as tangible representations of those events and people–letters, historic homes, battlefields, and other material objects–disappear? What would happen if some of the energy expended on debating iconography went towards preserving local history, Civil War battlefields, slave cabins, historic cemeteries, material artifacts, or archival records?

You and I can write blog posts or comment on newspaper articles until our fingers break off, but none of it really matters unless we get involved in our local communities and work towards convincing our neighbors of the importance of preserving history. Contact your local officials and tell them why public funding is important for ensuring a future grounded in an honest, responsible understanding of the past. Tell them to support historic preservation efforts in your area. Tell them that it’s important to support history education initiatives in the K-12 classroom such as National History Day, as well as humanities programs in community colleges, four-year colleges, and universities. Tell them to support local institutions like historical societies, museums, and archival repositories. Join a preservation group like the Civil War Trust or the National Trust for Historic Preservation. Go visit a nearby National Historic Site. Attend a historical reenactment. Ask questions and be willing to listen and learn about the past, even if it’s difficult and unpleasant.

If you live in a community where a statue, monument, or memorial is currently garnering controversy, read up on relevant scholarship about the historical event being commemorated and why a symbolic icon was erected to preserve the memory of that event. Honestly consider whether or not that symbolic icon should remain in a place of honor in your community. If town hall meetings or other events are taking place about the history in your area, go to them. Listen to the perspective of other community members and express your own thoughts as well. Work towards becoming an active member of your community and an advocate for history.

If 2015 marked the beginning of a renewed conversation about history and memory in American society, let us use 2016 as the starting point for a renewed effort to support, preserve, and educate people about the history that is all around us. Get off the message boards and get to work in your community.

Cheers to a great new year.

Wanted in America: Good U.S. History Teachers

The Atlantic has posted an essay by Alia Wong on U.S. history textbooks in K-12 classes that is worth reading. The essay focuses on a recently discovered and ridiculous claim in a history textbook published by McGraw-Hill suggesting that African slaves brought to the American colonies from the 1600s to the 1800s were “immigrants” to this land who somehow came here of their own free will. You would think that, twenty years after the “textbook wars” of the 1990s and the publication of James Loewen’s Lies My Teacher Told Me: Everything Your American History Textbook Got Wrong to critical acclaim, textbook companies like McGraw-Hill would be more careful about the claims they make in these textbooks, but I suppose that is asking too much when a group like the Texas Board of Education wields so much power in determining what gets into history textbooks around the country. You often hear George Santayana’s abused quote about people who don’t remember the past being doomed to repeat it, but it seems that there are times when people who do remember the past–and in some cases actively participated in it–are actually more doomed to repeat it.

There is a bigger problem than bad history textbooks in U.S. classrooms, however, and that is bad history teachers. To wit:

Compared to their counterparts in other subjects, high-school history teachers are, at least in terms of academic credentials, among the least qualified. A report by the American Academy of Arts & Sciences on public high-school educators in 11 subjects found that in the 2011-12 school year, more than a third—34 percent—of those teaching history classes as a primary assignment had neither majored nor been certified in the subject; only about a fourth of them had both credentials. (At least half of the teachers in each of the other 10 categories had both majored and been certified in their assigned subjects.)

In fact, of the 11 subjects—which include the arts, several foreign languages, and natural science—history has seen the largest decline in the percentage of teachers with postsecondary degrees between 2004 and 2012. And it seems that much of the problem has little to do with money: The federal government has already dedicated more than $1 billion over the last decade to developing quality U.S.-history teachers, the largest influx of funding ever, with limited overall results. That’s in part because preparation and licensing policies for teachers vary so much from state to state.

A recent report from the National History Education Clearinghouse revealed a patchwork of training and certification requirements across the country: Only 17 or so states make college course hours in history a criterion for certification, and no state requires history-teacher candidates to have a major or minor in history in order to teach it.

“Many [history teachers] aren’t even interested in American history,” said Loewen, who’s conducted workshops with thousands of history educators across the country, often taking informal polls of their background and competence in the subject. “They just happen to be assigned to it.”

A bad history textbook in the hands of a good teacher can be turned into a useful instrument for teaching students about the construction of historical narratives, the differences between history and memory, and, of course, the factually correct historical content. A bad history teacher can lead students towards a lifetime hatred of history, regardless of how factually correct their textbook is.

I did not know that 34 percent of history teachers had neither majored nor been certified in history, nor did I know that only about 17 states make college coursework in history a criterion for certification, but I can safely say that Loewen’s observations about people being “assigned” to teach history are true. They often have “coach” in their title.

I do not mean to suggest that all coaches are bad teachers or lack historical knowledge. My initial inspiration for studying history in college was sparked in large part by a Western Civilization teacher during my senior year of high school who also happened to coach football and basketball. But that was the thing: every student viewed him as a teacher who also happened to coach, rather than as a coach who also happened to teach history. And unfortunately there were several coaches at my high school who were simply unfit to teach history.

Is there a lack of qualified history teachers in the United States for our K-12 schools, or does the problem lie in a lack of opportunities for qualified history teachers to find gainful employment in K-12 schools?

Cheers

Addendum: If you’re a teacher who is frustrated with the quality of your history textbook, I highly recommend that you take advantage of The American Yawp, a free online history textbook that is collaboratively written by some of the best and brightest historians in the country. It is designed for a college classroom but I have no doubt that high school students, especially those in AP classes, could use it to their advantage.

Taking a Balanced Approach to Lecturing and Student-Centered Learning

Bueller

Over the past few weeks the New York Times has rekindled a longstanding debate among scholars and educators over the role of lecturing in the college classroom. Back in September, Annie Murphy Paul suggested that college lectures are “a specific cultural form that favors some people while discriminating against others, including women, minorities and low-income and first-generation college students. This is not a matter of instructor bias; it is the lecture format itself . . . that offers unfair advantages to an already privileged population.” This month Molly Worthen responded with a defense of the traditional lecture, arguing that “lectures are essential for teaching the humanities’ most basic skills: comprehension and reasoning, skills whose value extends beyond the classroom to the essential demands of working life and citizenship.”

Both essays make good points that I agree with. Since I adhere to the idea that knowledge is constructed and that people rely on prior knowledge when making connections to new intellectual content, I can see Paul’s argument that poor and minority students who attended inferior schools during their youth can be at a disadvantage in a lecture-based college classroom. Conversely, I can also agree with Worthen that lectures expose students to content experts who have a platform to share their knowledge beyond the confines of a TV soundbite or YouTube video. I also agree with her that lectures can challenge students to synthesize information and take good notes.

I do not approach this conversation as an experienced college professor, but as a certified social studies teacher who had a cup of coffee in the middle/high school teaching world a few years ago and as a current interpreter for the National Park Service, where a parallel discussion is taking place about whether interpreters should play the role of “sage on the stage” or “guide by the side” during visitor interactions. These jobs have allowed me to participate in and facilitate learning experiences through a wide range of mediums. These experiences inform my opinion that lectures can be an effective tool for generating meaningful learning experiences, but only if they are used within reason, at appropriate times. Furthermore, it’s not productive to look at lectures and active learning as either/or propositions. Educators should be well-versed in a range of teaching methods, and I believe most critics of the lecture format are asking professors to expand their pedagogical vocabulary rather than to fully abolish the traditional lecture course, as Worthen suggests.

Before I advance my arguments further, we should pause and ask what, exactly, constitutes a lecture. Derek Bruff of Vanderbilt University offers a useful distinction between educators who incorporate discussion and interaction throughout their lectures and others who engage in what he calls “continuous exposition,” which is completely devoid of any student interaction and is really just a monologue. The “continuous exposition” was a staple of my undergraduate education, and it was a real drag most of the time. I had a number of professors who lectured for the entire period and then, with five minutes left, would ask if anyone had questions. In my five years in undergrad I don’t think a single student ever asked a question during those five-minute windows, largely because most students wanted to get out of class by that point and understood that any real, substantive Q&A with the professor would require much more than five minutes. A more active approach to lecturing–or a wholly different approach altogether–would have yielded more feedback from students, if these professors truly cared about that feedback.

Another consideration is how much weight the lecture carries in evaluating a student’s performance in a given class. In a continuous exposition lecture course, the student’s grade is tied almost exclusively to his or her ability to recite in written form what the professor says during the lecture. This too is a problem in my mind because it places too much emphasis on rote memorization and recitation of content at the expense of training students to think about interpretation, analysis, and the process of drawing informed conclusions. I like Andrew Joseph Pegoda’s “high stakes quizzing” approach, which places much more emphasis on assigned readings outside the classroom, frequent quizzes that challenge students to draw conclusions about their readings, and classroom discussions about those readings that are guided–but not exclusively directed–by the professor. This approach invites thoughtful student interaction while also allowing the professor the option to step back or jump into the discussion as necessary.

Yet another consideration in this discussion is reconciling the underlying tension between disciplinary knowledge and educational theory in educating future teachers. Most of my history professors were primarily focused on teaching content and used the continuous exposition model to convey that content, but my education professors stressed that we could only lecture for ten minutes to our future students and that we would have to utilize other active learning methods for the bulk of our classroom experiences (these education professors, ironically enough, often had a tendency to lecture for more than an hour to us). Historian and educator Fritz Fischer, writing in the June 2011 issue of Historically Speaking, explains that:

My students and I struggle with trying to connect the world of academic history with the world of pedagogical training. On the one hand, they were told by the educational theorists to create a “student centered” classroom and to rely on clever and fun classroom activities such as jigsaws and Socratic debates. These activities were often intellectually vapid, devoid of historical content or an understanding of historical context. On the other hand, they sat in their introductory history courses and listened to lectures about THE story of the past. Some of the lectures might have been engaging, interesting, and powerful, but were they really reflective of what historians do, and could they be at all helpful in a K-12 history classroom? How were my students to reconcile these worlds? (15)

The best way to reconcile these worlds, in my opinion, is to embrace a balanced approach to teaching that values lecturing not as the ultimate means for obtaining knowledge but as a tool within a larger arsenal that includes the use of other methods such as classroom discussions, group projects, classroom websites and blogs, and assignments that challenge students to develop and communicate arguments through written and oral form.

The challenge, of course, is designing these student-centered activities in ways that incorporate both content and disciplinary process. Bruce A. Lesh offers some great examples of implementing a balanced teaching approach in the middle/high school history classroom in his book “Why Won’t You Just Tell Us the Answer?”: Teaching Historical Thinking in Grades 7-12. In one case he challenges students to envision themselves as public historians tasked with documenting historical events through the creation of a historical marker. Students work on a given topic and are tasked with doing historical research, writing the text for the marker, and then explaining their methods and interpretations both in written form and during classroom discussion. This is a perfect example of an intellectually rigorous but student-centered approach to teaching historical thinking and content. It allows students a platform to contribute their own knowledge to the learning process, but it also allows the teacher to facilitate conversation and act as a content expert when necessary. Furthermore, it’s an activity that can be tailored to students of all ages, whether they’re in elementary school or college.

So, while I don’t think educators need to fully discard the lecture, I think they should take the time to ensure they use it with proper care and with students’ learning journeys in mind.

Cheers

P.S. I meant to include a link to Josh Eyler’s post “Active Learning Is Not Our Enemy,” which is very good and worth reading, but forgot. I owe a debt to Josh for sparking some of my own thoughts in this essay.