The historiography of the Reconstruction era has been, and continues to be, overwhelmingly focused on questions of race, citizenship, and equal protection under the law in the years after the American Civil War. For an era of remarkable constitutional change and the dramatic transition of four million formerly enslaved people into citizens (and, for some, into voters and elected leaders), this focus is understandable. Reconstruction-era scholars today almost unanimously agree that Reconstruction was a noble but “unfinished revolution” undone by the end of military rule in the South in 1877 and an apathetic white North no longer interested in protecting black rights. That retreat allowed unrepentant, racist white Southern Democrats to retake their state governments and impose Jim Crow laws that ushered in a long era of white political supremacy throughout the region.
The “unfinished revolution” thesis is undoubtedly true, but there is more to the story of Reconstruction than the question of Black civil rights (although the importance of that story cannot be overstated). The country’s finances were in shambles, and questions emerged about the best way to pay down the federal debt and establish sound credit; women fought for the right to vote but were left out when the 15th Amendment barred disenfranchisement on account of race but not sex; Indian tribes throughout the West faced the prospect of rapid white westward expansion and a federal government that preached peace with the tribes while doing little to stop white encroachment on their lands; and immigrants, many of them Catholic, continued to settle in the United States, causing a great deal of consternation among political leaders about how best to assimilate these newcomers into American culture.
Regarding the latter issue, historian Ward McAfee’s 1998 publication Religion, Race, and Reconstruction: The Public School in the Politics of the 1870s is a masterful treatment of the role of public education during the Reconstruction era. I just finished reading the book and I learned a ton from it.
McAfee’s thesis is essentially three-pronged. The first argument is that increasing numbers of immigrants to the U.S. during Reconstruction raised a great deal of concern within the Republican Party, especially among members who had flirted with Know-Nothingism in the 1850s and held anti-immigrant and anti-Catholic prejudices. These Republicans feared that Catholic immigrants held their allegiance to the Pope above their allegiance to the U.S., that the Catholic church kept its parishioners illiterate, superstitious, and ignorant of the larger world, and that these immigrants would attempt to subvert the country’s republican institutions and make America a bulwark of the Vatican. The emergence of public education during Reconstruction, therefore, was not just an effort to educate the formerly enslaved but also an effort to promote (Protestant) morals, good citizenship, and obedience to republican institutions among immigrant children ostensibly being raised on Catholic principles.
The second argument relates to the division of taxpayer funds for public schools during Reconstruction. The era’s emerging public schools often incorporated Bible readings in class without much complaint. Republicans argued that Bible readings would teach good morals to students and that such teachings were appropriate as long as they took a “nonsectarian” approach that didn’t cater to any particular denomination. Most of these readings were done out of King James Bibles originally translated by the Church of England, however, and Catholics accused public school teachers of engaging in pro-Protestant, anti-Catholic instruction. To remedy this, Catholics established their own private, parochial schools and called upon the federal government to ensure that state tax funds for education be distributed equally between the public “Protestant” schools and the private Catholic schools. Republicans led the charge against splitting these funds and undertook an effort to ban public funding for “sectarian” schools. Toward the end of Reconstruction the Republicans made this issue a centerpiece of their party platform, and in 1875 Congressman James G. Blaine led an unsuccessful effort to pass a constitutional amendment banning public funding for sectarian schools (although “nonsectarian” religious instruction and Bible readings could still hypothetically take place in the public school classroom). While this amendment failed, 38 of the 50 states today still have their own state “Blaine amendments” banning the funding of sectarian schools.
The third and arguably most provocative argument from McAfee is his contention that Reconstruction failed largely because of an initiative by the radical wing of the Republican Party to mandate racially integrated “mixed-race” schooling in 1874. Most Republicans were skeptical of, if not outright hostile to, racially integrated public schools (in stark contrast to their desire to have children from Protestant, Catholic, and other religious backgrounds intermingle in public schools). Massachusetts Senator Charles Sumner, however, was a dedicated proponent of racial integration in the schools and refused to compromise on the issue. When Congress began debating the merits of a new civil rights bill in 1874 that would mandate equal treatment in public accommodations, public transportation, and jury service, Sumner insisted on including a clause mandating racially integrated public schools. When news of Sumner’s demands became public, Democrats and conservative Republicans in both the North and the South responded with outrage. Conservative Republicans in particular stated that while equal treatment in public facilities was acceptable, mandating mixed schools was a bridge too far. Republicans lost control of Congress in the 1874 midterm elections, and, according to McAfee, the cause of this loss was the Radical Republicans’ insistence on mandating racial integration in the schools.
Prior to reading McAfee I believed that the devastating Panic of 1873 was the primary reason Republicans lost the 1874 midterms, but McAfee presents convincing evidence that the mixed-schools initiative also contributed significantly to those losses. With Democratic control of Congress now assured, Reconstruction’s future was doomed. A Civil Rights Act was passed in 1875 (largely in tribute to Sumner, who died in 1874) that mandated equal treatment in public facilities and jury service, but the clause mandating racial integration of public schools was removed. In any case, the Supreme Court determined in the 1883 Civil Rights Cases that parts of the Civil Rights Act of 1875 were unconstitutional because, according to the court, the 14th Amendment’s guarantee of equal protection of the laws applied only to the actions of the state and not to the actions of private individuals and organizations.
Religion, Race, and Reconstruction is a fine piece of intellectual history that brings life to a long-forgotten element of Reconstruction history, and I highly recommend the book to readers of this blog.
In looking back at this recent and torturous U.S. presidential election, I believe the blatant and irresponsible sharing of fake news, inaccurate memes, and outright propaganda, combined with a general lack of civility and informed online conversation, contributed in some way to Donald Trump’s electoral victory. I do not mean to suggest that there were no other factors behind this particular outcome or that people on the left side of the political spectrum don’t also share fake news and stupid memes – they do. But evidence is mounting that fake and inaccurate news–particularly pro-Trump news–is widespread on social media and that many people, regardless of political preference, take misinformation seriously if it lines up with their own personal and political views. Facebook is especially bad in this regard. The chances are good that many voters who are also Facebook users went to the polls and made their decisions based partly on false information gleaned from articles shared on their news feeds.
Professor Mike Caulfield’s particularly sobering analysis of fake articles created by a fake newspaper, the “Denver Guardian,” that spread like wildfire across Facebook demonstrates how easy it is to get duped by someone with an agenda and basic computing skills. Friends and family I care about have also shared fake news on Facebook, which I find deeply troubling. Facebook has evolved into a news-sharing website without creating a mechanism for effectively separating fact from fiction, and at the end of the day the site isn’t fun anymore. I haven’t checked my account since the election.
As a historian and educator I have stressed on this website the importance of teaching not just historical content in the classroom but also historical methods. When we teach both content and methods, we convey to students the idea that history is not just a mess of names, dates, and dead people, but also a process that enables students to conduct research, interpret reliable primary and secondary source documents, and ultimately become better writers, readers, and thinkers in their own lives. I think that now more than ever these skills need to be taught not just for their utility in understanding the past but also for parsing the vast multitudes of information that bombard our social media feeds on a daily basis. Historians have much to contribute to contemporary society, and they should lead the way in accomplishing this important work. When we learn to think historically, we enable ourselves to become more informed citizens who can participate in electoral politics with an understanding of the issues at hand and of how our system of government operates.
I am interested in hearing from history teachers about what methods, tools, and practices they employ when teaching students how to distinguish between reliable and unreliable sources and how to interpret these sources to construct informed arguments and narratives. Sam Wineburg’s scholarship has been instrumental in my own thinking about these topics, and I believe everyone should listen to or read his keynote address at the 2015 meeting of the American Association for State and Local History. I have also utilized historian Kalani Craig’s guide on the 5 “Ps” of reading primary sources, which is equally relevant when assessing sources on contemporary topics.
What has worked for you when teaching others how to assess and interpret documentary sources? Please let me know in the comments.
I read a really interesting article today on Aeon from Stanford University history professor Caroline Winterer about the American Revolution, the creation of the U.S. Constitution, and Enlightenment ideals. The underlying thesis of the article is partly rooted in the idea that Americans today have mythologized and flattened the legacies of the country’s various constitutional framers in ways that diminish the complexity of their thinking and their basic humanity. That’s not necessarily a new or bold thesis, but the way Winterer arrives at this conclusion was new to me. Most notably, she points out that the British philosopher John Locke–an imposing intellectual figure in the minds of many of the country’s framers–believed that knowledge was obtained not through a divine God but through the five senses (empiricism) and language, and that the extent to which humans could trust their senses to provide an objective understanding of reality was very much uncertain. Similarly, since language was man-made and not the creation of God, the meanings ascribed to any word were subject to interpretation and merely “arbitrary signs that represent ideals.” What this meant for the framers, according to Winterer, was that uncertainty was a constant companion in their efforts to create a functioning government and a civil society:
In fact, the American founders were uncertain about many things. They were uncertain about politics, nature, society, economics, human beings and happiness. The sum total of human knowledge was smaller in the 18th century, when a few hardy souls could still aspire to know everything. But even in this smaller pond of knowledge, and within a smaller interpretive community of political actors, the founders did not pretend certainty on the questions of their day. Instead they routinely declared their uncertainty.
While I freely admit that I am no expert in early American history, this interpretation strikes me as largely correct. The effort to create a constitution based on laws rather than kings or divine providence was bold, ambitious, and fraught with uncertainty, which is why the framers established a process for amending the Constitution to improve it in the future. But this article also got me thinking about the ways we teach history to middle school and high school students and why we need to make the idea of uncertainty a central element in teaching students how to think historically.
It is easy to look back at past events in hindsight and diagnose certain events as “inevitable.” Civil War historian Gary Gallagher, for example, often points out how easy it is to see the U.S. military’s victory at Gettysburg and conclude that this battle clearly led to an inevitable victory over the Confederates in the Civil War. But by understanding the sense of uncertainty people felt as events happened in real time, within circumstances often beyond their control, we can better empathize with the ways people in the past understood and reacted to the contingencies of their lives and their times. And perhaps we can teach students to embrace uncertainty in their own lives rather than seeing it as something to fear.
Because I grew up during the No Child Left Behind era, most of my history classes emphasized standardized tests, most of which were exclusively multiple choice. The tests and lessons I encountered emphasized rote memorization of facts, which in turn portrayed the study of history as an exercise in the mastery of information and the people of the past as all-knowing figures who in many cases were certain of the consequences of their actions (especially those who fought in the American Revolution and helped create the Constitution). By focusing on the importance of critically analyzing primary and secondary sources, making reasoned interpretations based on the available evidence for a particular historical event, and making evidence-based arguments through written, oral, and digital means, history teachers can perhaps bring the uncertainty of the past (and the present!) to the forefront of building historical thinking skills.
It’s reassuring to know that there are enlightened people like Wisconsin Senator Ron Johnson who are in positions of power and have the ability to set education policy in this country.
Senator Johnson says that the “tenured professors in the higher education cartel” are working to keep college costs high and are not doing enough to embrace digital technology like Blu-ray discs, the internet, and the world wide web in the classroom – a classroom that he believes should have fewer teachers, their places taken by what he calls “destructive technology.”
Johnson: We’ve got the internet – you have so much information available. Why do we have to keep paying different lecturers to teach the same course? You get one solid lecturer and put it up online and have everybody available to the knowledge for a whole lot cheaper? But that doesn’t play well to tenured professors in the higher education cartel. So again, we need destructive technology for our higher education system.
WISPOLITICS: But online education is missing some facet of a good –
Johnson: Of course, it’s a combination, but prior to me doing this crazy thing [of being in the Senate] . . . I was really involved on a volunteer basis in an education system in Oshkosh. And one of the things we did in the Catholic school system was we had something called “academic excellence initiative.” How do you teach more, better, easier?
One of the examples I always used – if you want to teach the Civil War across the country, are you better off having, I don’t know, tens of thousands of history teachers that kind of know the subject, or would you be better popping in 14 hours of Ken Burns Civil War tape and then have those teachers proctor based on that excellent video production already done? You keep duplicating that over all these different subject areas.
Where do you even start with this nonsense?
- Digital technology–more specifically education technology–is not a panacea that automatically enhances classroom learning. In 1922, Thomas Edison predicted that “the motion picture is destined to revolutionize our educational system and in a few years it will supplant largely, if not entirely, the use of textbooks.” That “revolution,” of course, never came about, partly because any technology used in the classroom is merely a tool for achieving the larger goal of learning. Technology is not an end in and of itself, and watching a documentary is no more effective than listening to someone drone on forever at the front of a classroom. It’s how you use those tools that matters, and the best teachers put a range of tools–from pens and pencils to computers and tablets–to work in fostering a positive learning environment.
- Jonathan Rees has blogged for several years about MOOCs and ed tech and has a book coming out on the subject. Mr. Johnson ought to read it.
- Ken Burns is a wonderful filmmaker and producer, but his PBS series is not the definitive word on the history of the American Civil War. It’s been twenty-plus years since the documentary came out. It is dated and has a few questionable interpretations. Again, teaching history or any subject doesn’t mean popping in a movie and having students take notes. Pairing the documentary with other works of scholarship–written and on film–and analyzing how historians have interpreted the war and constructed narratives about the history of the war is a better start. Having students learn from a trained professional how to find, analyze, and interpret primary sources…that’s also a good start. And having a teacher facilitate dialogue through guided questions or some other thoughtful activity after the film holds more potential for learning than watching a video from “a solid lecturer” after watching a fourteen-hour documentary.
- Ron Johnson sounds like he hasn’t set foot in a college in forty years. Tenure basically doesn’t exist for most young faculty members anymore. The “higher education cartel,” if any such thing exists, has bought into Senator Johnson’s rhetoric and has actively worked to implement austerity measures while relying more on part-time contingent faculty, especially since the 2008 recession. College doesn’t consist of professors constantly lecturing their students anymore. Higher education is not an Orwellian propaganda machine where students read Das Kapital and dream about Cultural Marxism all day and then party all night. We should be investing more in public education rather than advocating for “destructive technology” or busting up some make-believe “higher education cartel.”
You can’t make this stuff up.
In my time blogging at Exploring the Past I’ve gone on a sort of mini-crusade against conventional understandings within popular media about millennials’ relationship to digital technology and the ways they acquire knowledge. See here, here, and here for examples. Common arguments in this discourse include the belief that millennials acquire knowledge about the world in fundamentally different ways than older people; that old, conventional mediums of learning such as reading books or visiting museums are of little interest to millennials; and that we educators must fundamentally overhaul our approach to working with young students. We must embrace “disruption” in order to unlock the potential of young people. In the teaching world you might hear about the incorporation of digital technology in the form of iPads, computers, and ebooks as a way of making classes more hands-on and interactive, whereas in the public history world you might hear some vague jargon-y gobbledygook about “engagement” or “meeting the needs of a new generation” to get them to visit museums, National Parks, and the like.
I don’t buy into the “disruption” hype that says we must dismantle everything and completely do away with books, textbooks, or lectures (although I agree that educators can and do abuse the lecture medium to their students’ detriment). The logic of “disruption” fits into a long history of what one scholar describes as “giddy prophecies” about new developments in media technology. Thomas Edison predicted in 1922 that “the motion picture is destined to revolutionize our educational system and . . . in a few years it will supplant largely, if not entirely, the use of textbooks.” Similar prophecies have been uttered in recent years about floppy disks, CD-ROMs, and computers.
Well, it turns out that at least a few traditional educational mediums are resilient. A forthcoming study by linguistics professor Naomi Baron asserts that 92 percent of the students surveyed prefer print books over ebooks, and that print publications still play an integral role in classrooms regardless of grade level. Print publications, it seems, still serve an important educational purpose nearly 100 years after Edison predicted their eventual demise. Furthermore, millennials actually read more than older adults!
Don’t get me wrong: I support the implementation of digital technology in both formal and informal learning environments, but I’ve always believed that such implementations need to be done with an understanding that these mediums are merely tools. They need to be used carefully toward the larger goal of making our students critical thinkers who ask good questions and demonstrate sharp, analytical thinking. If an “interactive” activity doesn’t accomplish these goals, then it’s worthless in my view. Rather than debating whether or not digital technology should play a role in education (it can and should), we need to discuss which approaches with digital tools work and which ones don’t. And again, the end goal is key. I believe Sam Wineburg is mostly correct when he asserts, with regard to the history classroom, that:
I don’t think that a history class should be about things such as . . . making cute posters, or about making history “engaging.” It’s about getting students to think rigorously about the evidence. Fun is okay, but I would rather have them hate the class and come out of the class having the skills needed to be good citizens than having them enjoy themselves.
I always said, blacks need to stop bringing up slavery all the time. It was a long time ago. Why can’t they just move on and forget about it? But then they wanted to move on and get rid of these confederate statues, and I was all like, “Things that happened a long time ago are still important. You shouldn’t forget about them!”
The above quote comes from a really funny piece of satire that a friend shared with me from The Push Pole, a website based out of Southern Louisiana. Its title seems apt for the times: “Thousands of History Buffs Magically Appear After City Council Votes to Remove Confederate Monuments.” The piece is funny because it’s rooted in a partial truth about the complex and contradictory ways Americans often choose to remember their history: “Never Forget” is an arbitrary term that extends to historical events and people we care about, but when it comes to historical things we consider to be overblown or simply not worth caring about, “we need to move on” becomes the default response. (See Andrew Joseph Pegoda’s essential essay on “Never Forget” for more thoughts on the subjective nature of the term).
The taking down or altering of some public statues, monuments, and memorials honoring the Confederacy sparked a vigorous debate in 2015 about the place of Confederate iconography in America’s commemorative landscape and whether some of these icons–particularly the ones in places of public governance, public schools, town squares, and the like–should remain in their places of honor. The online discussion took place through blog posts, newspaper op-eds, and thousands upon thousands of comments. While some of these discussions were productive and enlightening, we were also treated to excessive and misleading cries of “erasing history” (a flawed argument to make when analyzing public iconography), poor analogies that compared changes to Confederate iconography to the ISIS-led destruction of Middle Eastern heritage, and emotion-filled hysterics that often said more about the politics of the present than any actual grasp of historical knowledge. And while folks got emotionally heated about Confederate icons, other historical artifacts such as this 19th century Virginia slave cabin are being demolished or facing potential demolition in the near future, all amid near silence both online and off.
What is the point of preserving symbolic icons that commemorate historic events and people if the actual historical artifacts that act as tangible representations of those events and people–letters, historic homes, battlefields, and other material objects–disappear? What would happen if some of the energy expended on debating iconography went toward preserving local history, Civil War battlefields, slave cabins, historic cemeteries, material artifacts, or archival records?
You and I can write blog posts or comment on newspaper articles until our fingers break off, but none of it really matters unless we get involved in our local communities and work toward convincing our neighbors of the importance of preserving history. Contact your local officials and tell them why public funding is important for ensuring a future grounded in an honest, responsible understanding of the past. Tell them to support historic preservation efforts in your area. Tell them that it’s important to support history education initiatives in the K-12 classroom such as National History Day and humanities programs in community colleges, four-year colleges, and universities. Tell them to support local institutions like historical societies, museums, and archival repositories. Join a preservation group like the Civil War Trust or the National Trust for Historic Preservation. Go visit a nearby National Historic Site. Attend a historical reenactment. Ask questions and be willing to listen and learn about the past, even if it’s difficult and unpleasant.
If you live in a community where a statue, monument, or memorial is currently garnering controversy, read up on relevant scholarship about the historical event being commemorated and why a symbolic icon was erected to preserve the memory of that event. Honestly consider whether or not that symbolic icon should remain in a place of honor in your community. If town hall meetings or other events are taking place about the history in your area, go to them. Listen to the perspective of other community members and express your own thoughts as well. Work towards becoming an active member of your community and an advocate for history.
If 2015 marks the beginning of a renewed conversation about history and memory in American society, let us use 2016 as a starting point for a renewed effort towards advancing the importance of supporting, preserving, and educating people about the history that is all around us. Get off the message boards and get to work in your community.
Cheers to a great new year.
The Atlantic has posted an essay by Alia Wong on U.S. history textbooks in K-12 classes that is worth reading. The essay focuses on a recent discovery of a ridiculous claim in a history textbook published by McGraw Hill suggesting that African slaves brought to the American colonies from the 1600s to the 1800s were “immigrants” to this land who somehow came here of their own free will. You would think that twenty years after the “textbook wars” of the 1990s, and after James Loewen’s Lies My Teacher Told Me: Everything Your American History Textbook Got Wrong was published to critical acclaim, textbook companies like McGraw Hill would be more careful about the claims they make, but I suppose that is asking too much when a group like the Texas Board of Education wields so much power in determining what goes into history textbooks around the country. You often hear George Santayana’s abused quote about those who don’t remember the past being doomed to repeat it, but it seems there are times when people who do remember the past, and in some cases actively participated in it, are even more doomed to repeat it.
There is a bigger problem than bad history textbooks in U.S. classrooms, however, and that is bad history teachers. To wit:
Compared to their counterparts in other subjects, high-school history teachers are, at least in terms of academic credentials, among the least qualified. A report by the American Academy of Arts & Sciences on public high-school educators in 11 subjects found that in the 2011-12 school year, more than a third—34 percent—of those teaching history classes as a primary assignment had neither majored nor been certified in the subject; only about a fourth of them had both credentials. (At least half of the teachers in each of the other 10 categories had both majored and been certified in their assigned subjects.)
In fact, of the 11 subjects—which include the arts, several foreign languages, and natural science—history has seen the largest decline in the percentage of teachers with postsecondary degrees between 2004 and 2012. And it seems that much of the problem has little to do with money: The federal government has already dedicated more than $1 billion over the last decade to developing quality U.S.-history teachers, the largest influx of funding ever, with limited overall results. That’s in part because preparation and licensing policies for teachers vary so much from state to state.
A recent report from the National History Education Clearinghouse revealed a patchwork of training and certification requirements across the country: Only 17 or so states make college course hours in history a criterion for certification, and no state requires history-teacher candidates to have a major or minor in history in order to teach it.
“Many [history teachers] aren’t even interested in American history,” said Loewen, who’s conducted workshops with thousands of history educators across the country, often taking informal polls of their background and competence in the subject. “They just happen to be assigned to it.”
A bad history textbook in the hands of a good teacher can be turned into a useful instrument for teaching students about the construction of historical narratives, the differences between history and memory, and, of course, the factually correct historical content. A bad history teacher can lead students towards a lifetime hatred of history, regardless of how factually correct their textbook is.
I did not know that 34 percent of history teachers had neither majored nor been certified in history, nor did I know that only about 17 states require history coursework for teacher certification, but I can safely say that Loewen’s observations about people being “assigned” to teach history are true. They often have “coach” in their title.
I do not mean to suggest that all coaches are bad teachers or lack historical knowledge. My initial inspiration for studying history in college came in large part from a Western Civilization teacher during my senior year of high school who also happened to coach football and basketball. But that was the thing: every student viewed him as a teacher who also happened to coach, rather than as a coach who also happened to teach history. And unfortunately there were several coaches at my high school who were simply unfit to teach history.
Is there a lack of qualified history teachers in the United States for our K-12 schools, or does the problem lie in a lack of opportunities for qualified history teachers to find gainful employment in K-12 schools?
Addendum: If you’re a teacher who is frustrated with the quality of your history textbook, I highly recommend that you take advantage of The American Yawp, a free online history textbook that is collaboratively written by some of the best and brightest historians in the country. It is designed for a college classroom but I have no doubt that high school students, especially those in AP classes, could use it to their advantage.
Over the past few weeks the New York Times has rekindled a longstanding debate among scholars and educators over the role of lecturing in the college classroom. Back in September Annie Murphy Paul suggested that college lectures are “a specific cultural form that favors some people while discriminating against others, including women, minorities and low-income and first generation college students. This is not a matter of instructor bias; it is the lecture format itself . . . that offers unfair advantages to an already privileged population.” This month Molly Worthen responded with a defense of the traditional lecture, arguing that “lectures are essential for teaching the humanities’ most basic skills: comprehension and reasoning, skills whose value extends beyond the classroom to the essential demands of working life and citizenship.”
Both essays make good points that I agree with. Since I adhere to the idea that knowledge is constructed and that people rely on prior knowledge when making connections to new intellectual content, I can see Paul’s argument that poor and minority students who attended inferior schools during their youth can be at a disadvantage in a lecture-based college classroom. Conversely, I can also agree with Worthen that lectures expose students to content experts who have a platform to share their knowledge beyond the confines of a TV soundbite or YouTube video. I also agree with her that lectures can challenge students to synthesize information and take good notes.
I do not approach this conversation as an experienced college professor, but as a certified social studies teacher who had a cup of coffee in the middle/high school teaching world a few years ago and as a current interpreter for the National Park Service, where a parallel discussion is taking place about whether interpreters should play the role of “sage on the stage” or “guide by the side” during visitor interactions. These jobs have allowed me to participate in and facilitate learning experiences through a wide range of media. These experiences inform my opinion that lectures can be an effective tool for creating meaningful learning experiences, but only if they are used within reason, at appropriate times. Furthermore, it’s not productive to look at lectures and active learning as either/or propositions. Educators should be well-versed in a range of teaching methods, and I believe most critics of the lecture format are asking professors to expand their pedagogical vocabulary rather than to fully abolish the traditional lecture course, as Worthen suggests.
Before I advance my arguments further, we should pause and ask what, exactly, constitutes a lecture. Derek Bruff of Vanderbilt University offers a useful distinction between educators who incorporate discussion and interaction throughout their lectures and others who engage in what he calls “continuous exposition,” which is completely devoid of any student interaction and is really just a monologue. The “continuous exposition” was a staple of my undergraduate education, and it was a real drag most of the time. I had a number of professors who lectured for the entire period and then, with five minutes left, would ask if anyone had questions. In my five years in undergrad I don’t think a single student ever asked a question during those five-minute windows, largely because most students wanted to get out of class by that point and understood that any sort of real, substantive Q&A with the professor would require much more than five minutes. Had these professors truly cared about student feedback, a more active approach to lecturing–or a wholly different approach altogether–would have yielded more of it.
Another consideration is how much emphasis is given to the lecture in evaluating a student’s performance in a given class. In a continuous exposition lecture, the student’s grade is tied almost exclusively to his or her ability to recite in written form what the professor says during the lecture. This too is a problem in my mind because it places too much emphasis on rote memorization and recitation of content at the expense of training students to think about interpretation, analysis, and the process of drawing informed conclusions. I like Andrew Joseph Pegoda’s “high stakes quizzing” approach which places much more emphasis on assigned readings outside the classroom, frequent quizzes that challenge students to draw conclusions about their readings, and classroom discussions about those readings that are guided–but not exclusively directed–by the professor. This approach invites thoughtful student interaction while also allowing the professor the option to step back or jump into the discussion as necessary.
Yet another consideration in this discussion is reconciling the underlying tension between disciplinary knowledge and educational theory in educating future teachers. Most of my history professors were primarily focused on teaching content and used the continuous exposition model to convey that content, but my education professors stressed that we could only lecture for ten minutes to our future students and that we would have to utilize other active learning methods for the bulk of our classroom experiences (these education professors, ironically enough, often had a tendency to lecture for more than an hour to us). Historian and educator Fritz Fischer, writing in the June 2011 issue of Historically Speaking, explains that:
My students and I struggle with trying to connect the world of academic history with the world of pedagogical training. On the one hand, they were told by the educational theorists to create a “student centered” classroom and to rely on clever and fun classroom activities such as jigsaws and Socratic debates. These activities were often intellectually vapid, devoid of historical content or an understanding of historical context. On the other hand, they sat in their introductory history courses and listened to lectures about THE story of the past. Some of the lectures might have been engaging, interesting, and powerful, but were they really reflective of what historians do, and could they be at all helpful in a K-12 history classroom? How were my students to reconcile these worlds? (15)
The best way to reconcile these worlds, in my opinion, is to embrace a balanced approach to teaching that values lecturing not as the ultimate means for obtaining knowledge but as a tool within a larger arsenal that includes the use of other methods such as classroom discussions, group projects, classroom websites and blogs, and assignments that challenge students to develop and communicate arguments through written and oral form.
The challenge, of course, is designing these student-centered activities in ways that incorporate both content and disciplinary process. Bruce A. Lesh offers some great examples of implementing a balanced teaching approach in the middle/high school history classroom in his book “Why Won’t You Just Tell Us the Answer?”: Teaching Historical Thinking in Grades 7-12. In one case he challenges students to envision themselves as public historians who are tasked with documenting historical events through the creation of a historical marker. Students work on a given topic and are tasked with doing historical research, writing the text for this historical marker, and then explaining their methods and interpretations in both written form and during classroom discussion. This is a perfect example of an intellectually rigorous but student-centered approach to teaching historical thinking and content. It allows students a platform to contribute their own knowledge to the learning process, but it also allows the teacher to facilitate conversation and act as a content expert when necessary. Furthermore, it’s an activity that can be tailored to students of all ages, whether they’re in elementary school or college.
So, while I don’t think educators need to fully discard the lecture, I think they should take the time to ensure they use it with proper care and with students’ learning journeys in mind.
P.S. I meant to include a link to Josh Eyler’s post “Active Learning is Not Our Enemy,” but forgot to do so. It is very good and worth reading, and I owe a debt to Josh for sparking some of my own thoughts in this essay.
A few weeks ago The Bitter Southerner published a nice essay on Civil War reenacting. The author asserts that reenacting is in a period of transition as some enthusiasts push for a more holistic understanding of Civil War history, one that strives for “deeper, truer purposes” by explaining why this war was fought in the first place. Moreover, the author argues that “Civil War reenactments are as popular now as they have ever been,” and that historical reenactors–and the entire public history field–therefore can and should find ways to use reenacting to disseminate a better understanding of the Civil War to the public.
I don’t buy the argument that Civil War reenacting–or any sort of reenacting save for maybe World War II–is as popular as ever. While it’s my understanding that the Civil War Sesquicentennial did bring out a large number of reenactors to commemorate the 150th anniversary of various battles like Shiloh and Gettysburg and General Lee’s surrender at Appomattox Court House, what’s notable about the Sesquicentennial is how many reenactors who grew up during the Centennial years of the 1960s chose to wear their outfits for the last time with the end of the 150th this year. I highly doubt that the so-called “millennial” generation is going to maintain the popularity of Civil War or any historical reenacting in the future. Another article in the Peoria Journal Star seems to confirm my skepticism.
The Journal Star article focuses on Fort Crevecoeur in Illinois, which was first established in the late seventeenth century. The fort doesn’t have the popularity of a Civil War site, but during the 1980s it held reenactments based on its days as a trading post, events that drew crowds of more than 3,000 visitors. Today the number of reenactors still participating in events at the fort has dwindled to around 15 or 20. The site has expanded its offerings and now hosts a French and Indian War reenactment, but concerns remain about the future of these events and even the site itself.
Three reasons are offered in the Journal Star for explaining why historical reenactments aren’t popular with younger people today:
- Kids are more interested in technology “and air conditioning” than spending the day outside at a reenactment.
- Kids don’t have the money to invest in reenacting.
- Older reenactors don’t want to be around young people and look upon them “with disdain.”
I think explanation two–that kids [i.e. their parents] don’t have the money–is the most plausible. Time is an important factor as well, however. Organized sports, to take one example, play a much larger role in many children’s lives than they did in the 1960s or 70s. Kids also have more homework than ever before. And while older folks can definitely be grumpy sometimes, I think those sentiments are reflective of a basic fact of human nature: young people like hanging out with other young people during their free time. Ditto for older people. And both groups can unfortunately look at each other with too much skepticism.
The bit about kids being more interested in technology than being outside is a red herring that distracts us from questioning why this site’s current programming is not attracting visitors. Sure, kids are inside more often and some teens are spending as much as seven and a half hours a day consuming media, but who’s to blame for that? The kids are being raised by parents who attended those reenactments as children in the 1980s and are choosing not to return. Adults are spending more time inside consuming media on computers, phones, and televisions too. Kids who spend all day online are doing so because their parents allow them to, and because parents often indulge in the same types of behavior.
Is the future of public history doomed because of digital technology, or is there an opportunity for public historians to thoughtfully incorporate at least some semblance of digital technology to enhance their programming? If you agree with me that the latter course is a better one, then you’ll see that throwing our hands up and doing nothing but moaning about kids and their phones will most certainly hasten this field’s demise. Let’s stop making excuses. Michael Twitty, a historical reenactor who often portrays himself as an enslaved cook at events and is quoted in the aforementioned Bitter Southerner piece, is right when he says that “my job is to bring to life what the life of an enslaved person looked like so that you can take a picture of it with your iPhone and share this knowledge.”
Historical reenacting is suffering in part, I think, because the way we teach history to students in the classroom is changing, both in content and method. What I call the “dates, dead people, and dust” approach to history education is slowly going away and being replaced with teaching methods that embrace nuance and context and incorporate primary sources alongside select readings from academic historians. Memorizing a bunch of facts for a forty-question multiple choice test is not an effective way of teaching history, and it seems like more teachers are beginning to challenge their students to interpret the past using primary and secondary evidence and to then make compelling arguments in written and oral form. Moreover, the discussion topics in history classrooms, for better or worse, are moving away from heroic narratives of great white men, dramatic battles, and military tactics towards questions of politics, economics, social practices, group identities, and history as a process of connected events with implications for the present instead of a series of unconnected events on a timeline from a bygone, irrelevant era.
Given these changes in the classroom, we should be unsurprised when historical reenactments that privilege dry lectures, minute facts about obscure historical artifacts, and narratives that literally whitewash the past don’t attract the attention of young or diverse audiences.
Do I think historical reenacting is ineffective or a waste of time? Absolutely not! I’ve done historical reenacting myself (poorly). When I lived in Indiana I participated in Conner Prairie’s “Follow the North Star” program and was profoundly moved by the entire experience. But the reenacting alone wasn’t what impressed me. The program gave me a chance to actively participate in the event itself and contribute my own voice through a post-event facilitated dialogue run by a professional. The dialogue challenged all participants to reflect on their own experiences during the event and then comment on the role of history in shaping our society today. Plus they gave us resources to learn more about slavery, the underground railroad, and Indiana history at the very end. It was reenacting with emotion, passion, intelligence, active participation, and respect for the past. We need more of that in public history if we want reenacting to play an important educational role in the future.
This post rambles a bit. Fair warning.
Politicians have complained about the quality of public education in the United States since at least the 1910s, when all states passed laws making K-12 school attendance mandatory. Since that time our country’s leaders have essentially lathered, rinsed, and repeated the warnings echoed in the Reagan administration’s 1983 “A Nation at Risk” report: “The educational foundations of our society are presently being eroded by a rising tide of mediocrity that threatens our very future as a Nation and a people . . . Our society and its educational institutions seem to have lost sight of the basic purposes of schooling, and of the high expectations and disciplined effort needed to attain them.”
Today, President Barack Obama and Secretary of Education Arne Duncan have targeted a growing “skills deficiency” crisis in public schools. We are told that students today can’t read, write, do basic math, or think critically, nor are they prepared for college or employment in a competitive workforce. These assumptions, readily accepted as legitimate ideas in American thought about education, are repeated in internationally reputable publications like the New York Times, which unabashedly argues that the “skills deficiency” in today’s students is a “troublesome fact”: “The American work force is less educated than it needs to be at a time when most jobs in the new economy will require some college education.” At the same time we are also told that the nation’s education system continues to fall behind other nations like Finland, South Korea, and China.
Rich philanthropists like Bill Gates and Mark Zuckerberg want more students of all ages learning “practical” skills like coding. Florida Governor Rick Scott thinks liberal arts and humanities majors like anthropology, psychology, philosophy, and ethnic studies are a drain on the state’s higher education dollars, and he’d rather shift that money to “STEM” degrees (science, technology, engineering, and mathematics). Ivy Tech Community College President Tom Snyder agrees, saying that a liberal arts degree is “a poor investment” and that students must consider skills-based training in STEM fields of study.
To address these perceived shortcomings, the Obama administration continues to rely on Bush-era policy precedents, namely high-stakes testing and performance-based financial incentives, to enforce accountability and motivate students and teachers to raise test scores. The bottom-line results, however, haven’t changed since 2000.
But what if these assumptions about “skills deficiencies,” lack of STEM-based training, and a nation’s education system falling behind the rest of the industrialized world are faulty? A new study by the nonpartisan Economic Policy Institute suggests that our assumptions need to be reevaluated:
The chart in that study shows that today’s labor market is not stifled by a lack of skilled workers entering the workforce. Rather, a “broad-based lack of demand for workers” is preventing skilled workers from obtaining gainful employment. That is also (and especially!) the case in STEM-based employment fields, where the U.S. Census Bureau reports that only one in four STEM graduates with a bachelor’s degree has a STEM job, and where the industry struggles with too much supply and stagnant wages for those in the field. And it turns out that U.S. students have struggled with standardized tests for at least fifty years, according to Diane Ravitch. To wit:
It is worth noting that American students have never received high scores on international tests. On the first such test, a test of mathematics in 1964, senior year students in the US scored last of twelve nations, and eighth-grade students scored next to last. But in the following fifty years, the US outperformed the other eleven nations by every measure, whether economic productivity, military might, technological innovation, or democratic institutions. This raises the question of whether the scores of fifteen-year-old students on international tests predict anything of importance or whether they reflect that our students lack motivation to do their best when taking a test that doesn’t count toward their grade or graduation.
I’m sure there are plenty of students entering college who struggle with basic comprehension skills. I do not, however, subscribe to the idea that there are too many people in college or that the majority of those coming out of school are unprepared to enter the workforce. Only one-third of today’s 27-year-olds (yours truly included) have at least a bachelor’s degree, contrary to those who assert that a college degree is today’s equivalent of a high school diploma. Some also argue that there’s a “college dropout epidemic” in U.S. higher education. I find President Obama’s proposal to make community college free to all students intriguing, although many questions still need to be addressed, including whether students should automatically have their tuition covered or whether there should be performance-based strings attached. The working conditions of teachers in K-12 and higher education should also be questioned, although I’d be shocked if they actually were.
We need to understand, however, that deeper structural problems within today’s labor market prevent many young people from obtaining gainful, full-time employment. For example, long-tenured Baby Boomer generation employees are retiring from the workforce and their jobs are simply going away rather than being filled by a younger person. Full-time jobs are becoming part-time jobs without benefits. Real wages for most workers have stagnated since 2000.
Getting an education is important for the simple reason that the experience of learning is valuable in and of itself, regardless of employment prospects. No one can take your education away from you. Liberal arts/humanities degrees will be relevant as long as we desire to learn more about the human condition and think critically about it. But gone are the days in which “go to school” was enough to guarantee career success. Getting a college degree and obtaining “skills” don’t guarantee much of anything in today’s labor market, regardless of field. Finding employment these days seems to be based on who you know and where you come from rather than where you’re going and how much potential you have.