I read a really interesting article today on Aeon by Stanford University history professor Caroline Winterer about the American Revolution, the creation of the U.S. Constitution, and Enlightenment ideals. The underlying thesis of the article is partly rooted in the idea that Americans today have mythologized and flattened the legacies of the country’s various constitutional framers in ways that diminish the complexity of their thinking and their basic humanity. That’s not necessarily a new or bold thesis, but the way Winterer approaches this conclusion was new to me. Most notably, she points out that the British philosopher John Locke–an imposing intellectual figure in the minds of many of the country’s framers–believed that knowledge was obtained not through a divine God but through the five senses (empiricism) and language, and that the extent to which humans could trust their senses to provide an objective understanding of reality was very much uncertain. Similarly, since language was man-made and not the creation of God, the meanings ascribed to any word were subject to interpretation and merely “arbitrary signs that represent ideals.” What this meant for the framers, according to Winterer, was that uncertainty was a constant companion in their efforts to create a functioning government and a civil society:
In fact, the American founders were uncertain about many things. They were uncertain about politics, nature, society, economics, human beings and happiness. The sum total of human knowledge was smaller in the 18th century, when a few hardy souls could still aspire to know everything. But even in this smaller pond of knowledge, and within a smaller interpretive community of political actors, the founders did not pretend certainty on the questions of their day. Instead they routinely declared their uncertainty.
While I freely admit that I am no expert in early American history, this interpretation strikes me as largely correct. The effort to create a constitution based on laws and not kings or divine providence was bold, ambitious, and fraught with uncertainty, which is why the framers established a process for amending the Constitution to improve it in the future. But this article also got me thinking about the ways we teach history to middle school and high school students and why we need to make the idea of uncertainty a central element in teaching students how to think historically.
It is easy to look back at past events in hindsight and diagnose them as “inevitable.” Civil War historian Gary Gallagher, for example, often points out how easy it is to see the U.S. military’s victory at Gettysburg and conclude that this battle clearly led to an inevitable victory over the Confederates in the Civil War. But by understanding the sense of uncertainty people felt as events happened in real time, within circumstances often beyond their control, we can better empathize with the ways people in the past understood and reacted to the contingencies of their lives and their times. And perhaps we can teach students to embrace uncertainty in their own lives rather than seeing it as something to fear.
Because I grew up during the No Child Left Behind era, most of my history classes emphasized standardized tests, nearly all of them exclusively multiple choice. The tests and lessons I encountered emphasized rote memorization of facts, which in turn portrayed the study of history as an exercise in the mastery of information and the people of the past as all-knowing figures who in many cases were certain of the consequences of their actions (especially those who fought in the American Revolution and helped create the Constitution). By focusing on the importance of critically analyzing primary and secondary sources, making reasoned interpretations based on the available evidence for a particular historical event, and making evidence-based arguments through written, oral, and digital means, history teachers can perhaps bring the uncertainty of the past (and the present!) into the forefront of building historical thinking skills.
The Atlantic has posted an essay by Alia Wong on U.S. history textbooks in K-12 classes that is worth reading. The essay focuses on the recent discovery of a ridiculous claim in a history textbook published by McGraw-Hill suggesting that African slaves brought to the American colonies from the 1600s to the 1800s were “immigrants” to this land who somehow came here of their own free will. You would think that twenty years after the “textbook wars” of the 1990s, and after James Loewen’s Lies My Teacher Told Me: Everything Your American History Textbook Got Wrong was published to critical acclaim, textbook companies like McGraw-Hill would be more careful about the claims they make in these textbooks, but I suppose that is asking too much when a group like the Texas Board of Education wields so much power in determining what gets into history textbooks around the country. You often hear George Santayana’s abused quote about people who don’t remember the past being doomed to repeat it, but it seems that there are times when people who do remember the past–and in some cases actively participated in that past–are actually more doomed to repeat it.
There is a bigger problem than bad history textbooks in U.S. classrooms, however, and that is bad history teachers. To wit:
Compared to their counterparts in other subjects, high-school history teachers are, at least in terms of academic credentials, among the least qualified. A report by the American Academy of Arts & Sciences on public high-school educators in 11 subjects found that in the 2011-12 school year, more than a third—34 percent—of those teaching history classes as a primary assignment had neither majored nor been certified in the subject; only about a fourth of them had both credentials. (At least half of the teachers in each of the other 10 categories had both majored and been certified in their assigned subjects.)
In fact, of the 11 subjects—which include the arts, several foreign languages, and natural science—history has seen the largest decline in the percentage of teachers with postsecondary degrees between 2004 and 2012. And it seems that much of the problem has little to do with money: The federal government has already dedicated more than $1 billion over the last decade to developing quality U.S.-history teachers, the largest influx of funding ever, with limited overall results. That’s in part because preparation and licensing policies for teachers vary so much from state to state.
A recent report from the National History Education Clearinghouse revealed a patchwork of training and certification requirements across the country: Only 17 or so states make college course hours in history a criterion for certification, and no state requires history-teacher candidates to have a major or minor in history in order to teach it.
“Many [history teachers] aren’t even interested in American history,” said Loewen, who’s conducted workshops with thousands of history educators across the country, often taking informal polls of their background and competence in the subject. “They just happen to be assigned to it.”
A bad history textbook in the hands of a good teacher can be turned into a useful instrument for teaching students about the construction of historical narratives, the differences between history and memory, and, of course, the factually correct historical content. A bad history teacher can lead students towards a lifetime hatred of history, regardless of how factually correct their textbook is.
I did not know that 34 percent of history teachers had neither majored nor been certified in history, nor did I know that only about 17 states require college coursework in history for teacher certification, but I can safely say that Loewen’s observations about people being “assigned” to teach history are true. They often have “coach” in their title.
I do not mean to suggest that all coaches are bad teachers or lack historical knowledge. My initial inspiration for studying history in college was sparked in large part by a Western Civilization teacher during my senior year of high school who also happened to coach football and basketball. But that was the thing; every student viewed him as a teacher who also happened to coach, rather than as a coach who also happened to teach history. And unfortunately there were several coaches at my high school who were simply unfit to teach history.
Is there a lack of qualified history teachers in the United States for our K-12 schools, or does the problem lie in a lack of opportunities for qualified history teachers to find gainful employment in K-12 schools?
Addendum: If you’re a teacher who is frustrated with the quality of your history textbook, I highly recommend that you take advantage of The American Yawp, a free online history textbook that is collaboratively written by some of the best and brightest historians in the country. It is designed for a college classroom but I have no doubt that high school students, especially those in AP classes, could use it to their advantage.
When I completed my undergraduate studies in 2011, I did so without the benefit of taking a historical methods class. There was no requirement to take one when I started my studies five years earlier, and I was grandfathered out of the requirement when it changed while I was still in the history program. Sure, I learned a lot of historical content and got better at writing papers while in undergrad, but I lacked the philosophical, theoretical, and methodological foundations necessary for thinking critically about the bigger questions that dominate a historian’s thoughts: what is historical objectivity, and can it be achieved? Is the past a foreign country, and if so, when do we find ourselves back in our native homeland? Can we separate the past and the present? What does it mean to think historically? What is truth? Is there such a thing as multiple truths? Who owns history? What is the importance of history in understanding the human condition?
I don’t necessarily have clear answers to all of these questions today, but my graduate training–especially my studies in public history and museum studies–did much to raise my awareness of the need to always keep the “Why?” questions of history in close proximity to the how, what, and where questions that surround any historical inquiry. Books from the likes of Martha Howell and Walter Prevenier, John Lewis Gaddis, and Michel-Rolph Trouillot exposed me to the importance of source criticism, the power structures that shape the documents we use in our research, and the artistic and scientific qualities of historical study. More generally, these books showed me the importance of historical thinking as a way of understanding our contemporary world. Quite frankly, I am obsessed with talking and writing about historical methods these days.
You can only learn so much and be a student for so long before you must move on from graduate school, however. I would surmise that some history graduates probably do away with their historical methods books after graduation, and I don’t blame them. But now that I’m a practicing public historian I think it’s more important than ever to keep pushing myself to think about the “Why?” questions of history because the public audiences I work with ask those sorts of questions all the time. People often come in with a specific conception of history as facts, dates, dead people, and dust, which challenges me to find ways to teach them about history as an interpretive act that is continually up for questioning and revision. Many visitors also ask me questions about things they learned growing up and whether or not they were true. And every once in a while I get questions about the evidence and methods I use in crafting my own interpretations of nineteenth-century history.
A good history teacher emphasizes process and method in addition to content. Public historians should also strive to teach their audiences–most of whom don’t engage with the stuff of history on a regular basis–process and method in addition to content whenever possible so that they are empowered to start their own exploration into the past.
Now that I’m out of school and able to read any book I want, I follow a simple method for ensuring my continued growth as a nineteenth-century historian, interpreter, and educator. For every two books I read about history, I read one book about “method,” whether that be historical methods, philosophy of history, public history, museum practices, educational theories, or something along those lines. For example, I just finished reading Edward Baptist’s The Half Has Never Been Told: Slavery and the Making of American Capitalism and Brian Matthew Jordan’s Marching Home: Union Veterans and Their Unending War. Now I am reading Nina Simon’s The Participatory Museum.
To be sure, the best history books teach us much about historical methods! Broadly defined, what I mean by “history book” is a work of scholarship in which the central thesis contains an argument about historical content and our understanding of the past, whereas “method” scholarship focuses primarily on discussing the specific practices scholars and practitioners employ when they interpret the past.
It’s hard to be a good historian if you omit reading one or the other. Having a lot of knowledge about educational theories and interpretive practices is important, but it’s hard to be an effective communicator if you don’t have any historical knowledge to communicate. Likewise, it’s great to have a lot of historical knowledge, but if you don’t know how to effectively communicate that knowledge to your audiences, you will struggle as a teacher and/or public historian.
My last essay on the recent events in Ferguson, Missouri elicited positive feedback and, unsurprisingly, pushback and criticism. When I shared the essay on Twitter, a fellow North St. Louis county native by the name of Alan R. Knight (whom I’ve never met in person or previously interacted with online) tweeted me more than thirty times expressing his belief that I “should be more responsible” when discussing this topic. He provided a laundry list of grievances that never addressed the content of my essay, but instead conveyed a peculiar theory for explaining the economic and social issues currently plaguing North St. Louis county. According to Mr. Knight, many of these problems revolve around the teaching of slavery in history classrooms. In teaching slavery, north county educators are preaching “hatred,” “propaganda,” “victimization,” and “slander” to the area’s African American population in an attempt to teach them to hate the United States, rely on the state and federal government for welfare handouts, and give votes and power to the Democratic party (“democrat slavers,” according to Mr. Knight). He says we live in a fully equal society and that blacks are completely at fault for any “racist hatred” against them.
Most rational readers, I hope, can easily see the ridiculousness and silliness of these claims.
There are plenty of history teachers around the United States who teach this country’s history of slavery and choose not to associate with the Democratic party. Over nine-tenths of all entitlement benefits in the U.S. go to elderly, disabled, or working households – not working-age people who simply refuse to work. Mr. Knight’s blaming of “racist hatred” on the victims of racism rather than actual racists is nothing new within the so-called “race conversation” in America. As I’ve argued repeatedly, teachers are often seen as the sole influence in a child’s upbringing when in reality schools are merely one part of a larger community effort to raise a child. And the idea of a fully equal society becoming reality in social practice is most likely impossible because the precise definition of what constitutes “equality” constantly changes over time as new questions force society to reconsider the boundaries of individual freedom, fair play, and equal protection under the law. This is not to suggest that equality doesn’t exist in some capacity or that the United States has not experienced great advances in economic, social, and political equality during its history. Far from it. It’s safe to say I am probably more content living under the boundaries of equality in 2014 than if I were to live under the boundaries of equality from 1860. It just means there will never be a time when we’ll all shake hands, say “everything’s equal!” and dispose of our laws, justice systems, and lawyers.
Mr. Knight, however, challenged me on a philosophical level to consider the role of slavery in the history curriculum. What is the importance of teaching slavery in a U.S. history class, regardless of grade level?
As countless historians, scholars, and citizens have argued, the worst aspects of U.S. history–slavery, Indian extermination and western expansion, segregation, Jim Crow laws, lynching, imperialism, and mass incarceration–are not merely blips along the road to American democracy as we understand it today. They were fundamental building blocks in its growth, and you cannot honestly describe this nation’s history without addressing them. As Ta-Nehisi Coates argued earlier this year, “to celebrate freedom and democracy while forgetting America’s origins in a slavery economy is patriotism à la carte.” Our nation’s Capitol was literally built with slave labor, for crying out loud.
Teaching slavery is not a form of propaganda or victimization, nor should its existence in the U.S. history curriculum be a partisan talking point in which parties debate whether or not it should be in the curriculum in the first place. Slavery is a part of our history whether we like it or not. Teaching our students of its wrongs illuminates the vast gulf between democratic principles and democratic practices. It also exposes the difficulty of finding a balance between liberty and order in a republican democracy.
It’s also true that we should acknowledge the history of antislavery and the eventual emancipation of all slaves with the ratification of the 13th Amendment in 1865. Many heroes in U.S. history have put their lives on the line to right serious wrongs and promote peace, justice, and freedom. These people deserve our recognition, remembrance, and public commemoration. But how do students come to understand the challenges these people faced if you don’t first expose them to the wrongs and inequalities of the society in which they lived? How do students develop a genuine appreciation for abolitionists like William Lloyd Garrison, Frederick Douglass, Theodore Weld, or the Grimke sisters without exposing them to the history of slavery or the fact that the abolitionist movement was very small and almost universally hated throughout the country during the antebellum era? To focus only on what we today consider “a good fight against inequality” without discussing those inequalities in depth is to put the cart before the horse in historical thinking and teaching. Talking only about “good history” is boring and uninspiring to students. It seems to me that if we want our students to feel like empowered citizens who can help make positive changes in our communities, we should expose them to this nation’s historical failures and the ongoing fight to make society more just, humane, and equitable. We’ve come a long way, but we’ve still got a long way to go.
I didn’t respond to all of Mr. Knight’s grievances, but for those interested you will find part of our twitter conversation below.
.@alanrknight74 Sure, we have come a long way. But you are in fantasyland if you really think we live in a post-racial, fully equal society.
— Nick Sacco (@NickSacco55) November 27, 2014
.@alanrknight74 That’s some loaded language right there. What do you mean by “US societal fabric”?
— Nick Sacco (@NickSacco55) November 27, 2014
Midterms say otherwise MT “@alanrknight74 Division by race & class taught in schools, creates ignorance & hopelessness, keeps Dems in power”
— Nick Sacco (@NickSacco55) November 27, 2014
Teaching slavery not a form of propaganda or victimization. It’s called U.S. history, and it helped build this nation. @alanrknight74
— Nick Sacco (@NickSacco55) November 27, 2014
No good history educator teaches equality as non-existent or makes any student ashamed of themselves b/c of history. @alanrknight74
— Nick Sacco (@NickSacco55) November 27, 2014
Went 2 same schools, highly doubt that. Why not both? “@alanrknight74 rhetoric focused on inequality & injustice, not a path to opportunity”
— Nick Sacco (@NickSacco55) November 27, 2014
Started in Hazelwood but later moved out to St. Charles. No one “left out” talk about career advancement or path 2 success. @alanrknight74
— Nick Sacco (@NickSacco55) November 27, 2014
.@alanrknight74 Yes, Ferguson + Jennings whole different ballgame. Lot of the uplift + guidance you refer to must start at home + community.
— Nick Sacco (@NickSacco55) November 27, 2014
.@alanrknight74 Social woes don’t all fall on Dems, nor shld schools shy away from tough history or say that equality is fully achieved, but
— Nick Sacco (@NickSacco55) November 27, 2014
.@alanrknight74 school is merely one part of a larger relationship btwn homes, communities, schools, etc. Need to reinforce each other.
— Nick Sacco (@NickSacco55) November 27, 2014
Regardless of how any individual Missouri voter feels about the results of the 2014 midterm election, all voters in the Show-Me State have much to celebrate in their collective rejection of Constitutional Amendment 3, a poorly written, special interest-funded initiative that would have hurt the state’s schools, teachers, and students. The initiative was largely funded by Rex Sinquefield, a St. Louis philanthropist who has never taught a K-12 class and who put $1.2 million of his own money into a lobbying group called “Teach Great” to promote the measure. Despite these efforts the amendment was rejected by a 3-to-1 margin, and not a single county had a majority of its voters support it. In rejecting this amendment, Missouri voters took a principled stand against the elimination of teacher tenure and the implementation of additional standardized testing to measure teacher performance in the state’s public schools. In the essay that follows I will outline why such a program would never have worked had voters approved this atrocious legislation.
The full text of the proposed amendment can be read at the Missouri Secretary of State’s office here. Key sections of the text include Section 3e, which eliminates teacher tenure by requiring all schools to enter into contracts with teachers for only three years at most, and Section 3f, which stipulates that all schools enter into a “standards based performance evaluation” that “shall be based upon quantifiable student performance data as measured by objective criteria.”
The desire to eliminate teacher tenure largely stems from popular misunderstandings of what tenure entails. Many people think tenure means a lifetime appointment without threat of termination, which will ostensibly breed laziness and incompetence in the classroom. In reality, a tenured Missouri teacher can be terminated at any time for one of six clearly listed violations, including incompetence, insubordination, immoral conduct, and felony conviction. All tenure does is ensure that a teacher who has worked for five years in the same district no longer has to rely on a year-to-year contract to maintain their employment status with that district. Moreover, if a tenured teacher is terminated from their contract, they are entitled to a hearing before the district’s school board, whereas non-tenured teachers can be dismissed without cause or a hearing. That’s it.
A standards based performance evaluation relying heavily on quantifiable student performance data is fraught with all sorts of evaluative difficulties and uncertainties. To demonstrate this point we can think about how such a system would work for medical doctors. There are many types of doctors out there, but let us specifically consider Dr. Gregory House, my favorite TV doctor.
Dr. House and his team are diagnosticians who care for patients with a range of medical issues. Some are minor, others are life-threatening. To ensure that Dr. House and his team are providing good medical care to their patients, we might say that a “standards based performance evaluation using quantitative data” about patient responses to their medical diagnoses would be most appropriate for assessing these doctors’ performances. This data may give us insights into how long people stayed at the hospital, what sorts of ailments they suffered from, how many people died under the doctors’ care, etc. On the face of it these suggestions seem like fair, objective measures for analyzing Dr. House’s team, just as a standards based evaluation looks fair to teachers.
Few people, however, would fail to acknowledge that the patients under Dr. House’s care have specific socio-economic and socio-cultural backgrounds that do not easily fall under a quantitative measure of assessment. How a person takes care of himself or herself–diet, exercise, sleep habits–influences the body’s ability to respond to a doctor’s treatment (in addition to genetics, which we are still learning about). Economic inequalities may prevent someone from getting appropriate medical treatment during the early stages of their ailment. Some people simply have a negative perception of doctors and/or medicine and opt out of treatment until it’s too late. All of these factors play a role in the doctor-patient-medicine relationship, and the effectiveness of Dr. House’s diagnostics team cannot be simplified into quantifiable numbers and Excel spreadsheets about the number of people who died under their care. Is it really fair to base Dr. House’s pay on whether or not he can save the life of a person who smoked two packs a day, didn’t have ready access to good healthcare throughout their life, and only sought medical help when it was too late to do anything about an inoperable form of cancer?
Just like Dr. House’s patients, k-12 students come from specific socio-economic and socio-cultural backgrounds. Those backgrounds can do much to shape how they will perform in a classroom long before a teacher can do anything to help them out. Poor parenting, broken homes, abusive families, a parental disinterest in education, poor nutrition, crumbling neighborhoods, and a lack of social services or extracurricular activities for kids all act as potential barriers to success in the classroom. Indeed, a teacher’s influence in shaping his or her students’ education is probably overstated in most cases because it takes a community to raise a child and overcome these problems. Grades, standardized tests, and quantifiable data don’t account for these contingencies. A group of students who get ‘C’s on their standardized tests may have received those grades after a teacher spent long hours helping them overcome years of perpetual ‘D’s and ‘F’s. Yet Mr. Sinquefield’s pet legislation would base that teacher’s future salary, retention, promotion, demotion, or dismissal on the fact that her students got ‘C’s. Is that really fair?
When devising an evaluation it is absolutely essential to develop your research questions before determining the tools and methods you plan to implement. There is room for both qualitative and quantitative methods in teacher evaluations, but school administrators must first ask themselves what, exactly, they want to learn about their teachers and students. Public school districts throughout Missouri undoubtedly face different economic, cultural, and political challenges within their local communities, and the process of evaluating teachers should be largely shaped by individual districts, their school boards, and local residents who are most attuned to these circumstances. Limited assistance from the Missouri Department of Elementary and Secondary Education can be helpful as well. And we must always fight to ensure that children from impoverished and/or abusive backgrounds are getting the help they need outside the classroom so that they can succeed in the classroom. Amendment 3 doesn’t address these issues, and Missouri voters threw it in the trash where it rightfully belongs.
Regular readers of Exploring the Past know that I occasionally muse on the state of higher education in the United States and what the academy might hold for someone like me in the future. I wish I had the ability to look into a crystal ball and envision this future, but I instead find myself facing a dilemma when it comes to whether or not I should pursue my history Ph.D. On the one hand, I’ve worked extremely hard to position myself for a full-time, permanent status placement with the National Park Service, an agency whose mission and values I care deeply about. Now that I’ve earned that position (for which I’m extremely grateful), it seems that getting “experience in the field” and not pursuing a Ph.D. at present makes the most sense for my career development, and I’ve been told as much by some of my teachers. On the other hand, I received a lot of support from other teachers during graduate school who encouraged me to strongly consider the Ph.D. path and pursue it as soon as possible. By pursuing the Ph.D. now, I could also position myself for potential employment within higher education in addition to possibly furthering my public history career.
I love teaching history in both formal and informal learning settings, and I hope to do more of both in my career. But it can be mentally overwhelming thinking about the unknown contingencies that will shape where I go and what I do through the course of my career. It’s important not to discount any avenue of opportunity at this point, and I’ve been doing my best to get a feel for what I might expect if I were to pursue my Ph.D. Unfortunately, the research I’ve done so far indicates that my prospects don’t look good if I pursue this path.
Overeducated, Unemployed: There are too many Ph.D. candidates in all fields of study. Hope College English professor William Pannapacker suggests that this glut of Ph.Ds (especially in the humanities) stems from schools’ reliance on cheap teaching labor: “It’s my view that higher education in the humanities exists mainly to provide cheap, inexperienced teachers for undergraduates so that [a] shrinking percentage of tenured faculty members can meet an ever-escalating demand for specialized research.” These schools, according to Pannapacker, don’t really care about the employment status of their students once they graduate.
Because there is a glut of Ph.Ds on the academic job market, schools set the terms of employment in their favor. Amid severe funding cuts for public colleges and universities and rising costs for non-academic ventures (more on that in a moment) since the 2008 Great Recession, a race to the bottom has ensued within academia. More than 50 percent of all faculty in today’s schools are part-time. Some faculty voluntarily choose to be part-time because they have full-time employment outside academia, are retired from the workplace and choose to teach occasionally, or simply prefer this sort of schedule. But the vast majority of these faculty members do not work outside the academy and are placed in a position where they frantically run from school to school looking for classes to teach. Adjunct faculty members have no job security, no health benefits, and make an average of $2,700 per course (which means that a person teaching four classes per semester, or eight over a full year, would make $21,600 annually before taxes). There are professors on food stamps.
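The adjunct pay figure above is simple multiplication; here is a minimal sketch of the arithmetic, where the four-courses-per-semester, two-semester teaching load is an assumption taken from the sentence above rather than from any underlying survey data:

```python
# Back-of-the-envelope check of the adjunct pay figures cited above.
# Assumed load (illustrative only): four courses per semester, two
# semesters per year, i.e. eight courses annually.
pay_per_course = 2700        # average adjunct pay per course, in dollars
courses_per_semester = 4
semesters_per_year = 2

annual_gross = pay_per_course * courses_per_semester * semesters_per_year
print(annual_gross)  # 21600 -- the pre-tax figure cited above
```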
Efforts to find new Ph.D.s employment opportunities outside the academy ("alt-ac") are still in their infancy, and the transferability of academic skills and training to alternative careers remains an open question. The Boston Globe recently published an essay about a "quiet crisis" in the science community, where recent Ph.D.s have been increasingly forced into low-paying postdoctoral apprenticeships thanks to cuts in higher education employment and federal funding. (And, while I'm at it, I should point out that STEM graduates at every level are struggling to find employment, contrary to the popular belief that the U.S. lacks a sufficient number of young, competent science, technology, engineering, and mathematics professionals.)
And then there are those who simply can't find employment. The graph above, taken from a recent study by the American Historical Association, shows that fewer than 50 percent of new history Ph.D.s reported finding "definite employment" following the completion of their degree, while about 40 percent said they were still "seeking employment." (Here's a collection of data studies from the AHA on history programs, employment, and students.)
What is the cost of pursuing a Ph.D.?: Many–if not most–Ph.D. students live on a monthly or yearly stipend that ostensibly covers the cost of living. Schools pay their students to work as teaching assistants, researchers, or in a range of other jobs for roughly twenty hours a week. These stipends total around $10,000-$12,000 per year, low enough that some students have been forced to take public assistance to help pay the bills while in school. A select few are lucky enough to get their tuition fully covered in addition to their yearly stipend, but most students must take out loans to cover rising tuition costs and usage fees while in school. The average debt burden for graduate students today is $60,000. Moreover, this debt doesn't account for the years of lost income that accompany any full-time pursuit of a Ph.D., which can take between five and ten years to complete. University of Iowa President Sally Mason recently attributed at least half of this student debt to so-called "'lifestyle debts' caused by students buying things like iPhones, iPads, and laptops" (see link above), reflecting her clear ignorance of these serious problems.
Where does the money go?: The sad thing about this rising debt is that so much of the tuition increase at U.S. schools reflects rising costs completely detached from the academic classroom. Contrary to popular assumption, colleges and universities are not raising tuition because of rising faculty costs. That money is actually going toward fancy campus centers, sports stadiums, dorms, and 9,000-square-foot presidents' residences. And we can't forget the huge growth in the number of higher education administrators, who claim an ever-larger share of the budgetary pie. From 1987 to 2012 the number of administrators in colleges and universities more than doubled, with 517,636 administrators and professional employees added to payrolls during that period.
I’m not one for big freak-outs and alarmist rhetoric, but the above information definitely sobers my perspective whenever I start thinking about furthering my education, and I have a bad feeling that things will either stay the same or get worse in the future.
I just finished reading Karen and Barbara Fields’ fine 2012 publication Racecraft: The Soul of Inequality in American Life. Like Ta-Nehisi Coates (who discusses the book here and here) I found that some of the book’s essays were more difficult to read than others, and there were times when I did not understand the arguments being made. But the main idea of the book–that the concept of “race” in America is an invention of racism–really challenges me to reconsider how the language American society uses to discuss “race” is oftentimes embedded with racist connotations, even if those doing the talking have good intentions.
The Fieldses point out in the introduction to Racecraft that Martin Luther believed in witchcraft: men stealing milk by thinking of cows; a mother getting asthma because of a mean glare from a neighbor; demons appearing at one's deathbed. In each of these cases, witchcraft became a strategy by which Luther explained his everyday experiences and the world around him. Witchcraft became an ideology for Luther, as it did for many people for hundreds of years before and after his time. But at some point the ideology of witchcraft no longer sufficed as an explanation for the everyday experiences of people's lives. Today we would never attribute someone's asthma to witchcraft, much less suggest that witchcraft is the direct cause of anything in society. The illusion of witches produced the concept of witchcraft in Luther's time; the Fieldses argue that the illusion of race produces the concept of racecraft in our own time. The comparison is significant not because both concepts are silly superstitions, but because both have been so widely accepted as plausible explanations of fundamental truths about human nature.
The concept of "race" was originally conceived by Enlightenment-era thinkers as a way of classifying and categorizing people and justifying transatlantic slavery on the basis of skin color and ancestry. But "race" as a biological concept has been discredited by the scientific community; there is no genetic basis for race. (See Jason Kelly's brief list of resources on scientific racism here and information from the PBS program "Race – The Power of an Illusion" here.) What constitutes "black" in the United States is not the same as what constitutes "black" in Brazil or Colombia. People who fall outside the U.S. racial paradigm–say, a Muslim immigrant–often aren't assigned a "race" within that paradigm.
Racecraft, according to the Fieldses, fails to explain inequality because it mutes class inequality. They cite a recent case in which a white electrician in Ohio fell on hard times after the 2008 Great Recession and was forced to rely on government aid. The electrician went out at midnight to buy groceries and, in a fit of disgust, had no qualms about expressing in a New York Times article his exasperation with seeing large "crowds of midnight [food-stamp] shoppers once a month when benefits get renewed…Generally, if you're up at the hour and not working, what are you into?" The Fieldses point out that even though the electrician was out at midnight with food stamps and ostensibly conducting legitimate business, "he assumed that the people in the crowds were not on legitimate business…Racism tagged the midnight shoppers as 'into' something unsavory because they appeared to be out of work; racecraft concealed the truth that the electrician and the midnight shoppers suffer under the same regime of inequality" (269-270).
Racecraft also uses "race" to explain racism, when the opposite is actually the case. The Fieldses point out that while scholars and school teachers often argue that "legal segregation was based upon race," legal segregation was actually based upon racism and the false science that justified the classifying of "races." Thomas Jefferson justified slavery in his Notes on the State of Virginia on the scientifically "factual" basis that the ancestry of slaves in America made them biologically inferior, but in reality racism sustained slavery's justification and expansion in the United States. In sum, "racism becomes something Afro-Americans are, rather than something racists do" (97).
In chapter three, Barbara Fields pushes the race-racism paradigm to its outer boundaries by suggesting that the notion of “racial tolerance” masks racial discrimination through “good intentions.” In response to an author who argued that the refusal of cab drivers to stop for black passengers was attributable to “intolerance,” Fields clearly outlines the problematic nature of “tolerance”:
Tolerance itself, generally surrounded by a beatific glow in American political discussion, is another evasion born of the race-racism switch. Its shallowness as a moral or ethical precept is plain. (“Tolerate thy neighbor as thyself” is not quite what Jesus said…). As a political precept, tolerance has unimpeachably anti-democratic credentials, dividing society into persons entitled to claim respect as a right and persons obliged to beg tolerance as a favor. The curricular fad for “teaching tolerance” underlines the anti-democratic implications. A teacher identifies for the children’s benefit characteristics (ancestry, appearance, sexual orientation, and the like) that count as disqualifications from full and equal membership in human society. These, the children learn, they may overlook, in an act of generous condescension–or refuse to overlook, in an act of ungenerous condescension. Tolerance thus bases equal rights on benevolent patronization rather than democratic first principles, much as a parent’s misguided plea that Jason “share” the swing or seesaw on a public playground teaches Jason that his gracious consent, rather than another child’s equal claim, determines the other child’s access (104-105).
There’s a lot to digest in Racecraft, but it’s well worth the read and I learned much. While race is a biological fallacy, discrimination and double standards against people based on their ancestry–racism–is very real. The logic of racecraft–the illusion of race–masks inequality through false scientific explanations for why some people are supposedly inferior to others. Racecraft also challenges me to consider how public historians, museum practitioners, and classroom educators approach ideas of tolerance, equality, and understanding in their work with students.
To teach the principles of historical thinking in a classroom without the aid of primary source documentation is the equivalent of teaching someone to play guitar without giving them an instrument to practice on. During the G.W. Bush "No Child Left Behind" era (and no doubt before that), education leaders in the United States preached the gospel of standardized testing. Through the use of history textbooks, pre-written tests (usually multiple-choice scantron forms without any written essay questions), and pre-written classroom activities, a generation of historically-informed youth would acquire a correct and appreciative view of the nation's past, which in turn would promote good citizenship and a healthy obedience to democratic values. As a high schooler in the early 2000s I was frequently treated to long-winded lectures about supposedly "important" dates, dead people, and dust; a barrage of multiple-choice tests; and assigned readings in history textbooks that would put the worst insomniac into a deep sleep. Primary sources–the "musical instruments of history"–were nowhere to be found in my high school education. My own teaching experiences in 2011 and 2012 were equally frustrating once I realized how little control I had over the design of my unit plans.
The No Child Left Behind (and President Obama's "Race to the Top") framework for teaching K-12 history is now being challenged by some historians and educators. The College Board recently drafted a new framework for teaching Advanced Placement U.S. History courses that shifts the focus from rote memorization of factual information to the critical analysis and interpretation of primary source documentation. These proposed changes call for teaching both historical content and historical process. They also emphasize a broad view of history showing that our nation's past is subject to multiple interpretations and perspectives.
If we adhere to the belief that history is a complex landscape composed of many viewpoints, however, the place of United States history within that landscape becomes more ambiguous than the NCLB framework would have us believe. The nationalist leanings of the American state–built largely on the foundations of a shared national history and the mythical stories we teach each other about that history–might be placed on infirm foundations. Beliefs in American exceptionalism could be replaced by a crisis of patriotism. The heroic can be challenged and criticized. Obedience to the social status quo transitions to questioning, dissent, and potential civil disobedience.
Unsurprisingly, there are critics who are concerned about teaching a complex form of American history that places our heroes, our “good wars,” and our heritage in limbo. Stanley Kurtz says the College Board’s revisions are “an attempt to hijack the teaching of U.S. history on behalf of a leftist political ideological perspective.” The Texas State Board of Education accuses the College Board of encouraging a “disdain for American principles.” And a Jefferson County, Colorado, School Board Member named Julie Williams is proposing that a new nine-member committee be formed to inspect U.S. history textbooks in the Jefferson County School District because, according to her, “I don’t think we should encourage kids to be little rebels. We should encourage kids to be good citizens” (high school students in the district are now protesting these school board proposals. Who says kids don’t care about history?).
Is there a better way to teach history, expose students to its “truths,” and remove its politics from the classroom?
One idea that is gaining steam throughout the country calls for the complete removal of history textbooks from the history curriculum. Public schools in Nashville, Tennessee, are removing textbooks from the classroom in favor of websites, “interactive” videos, and primary source documentation, all of which are being implemented through $1.1 million in funds for the 2014-2015 academic year. Historian and educator Fritz Fischer argues (but with a dose of skepticism) that these changes are welcome because “not relying on traditional history books cuts down on the potential for ‘textbook wars’ where residents object to certain conclusions.” Stephanie Wager of the Iowa Department of Education concurs, arguing that “you don’t really need to have the traditional textbook.” If we simply remove these politicized textbooks from the classroom, we can focus on primary sources and let students make their own conclusions from the historical evidence presented to them.
I agreed with this perspective a year ago, but I don’t agree with getting rid of history textbooks (or at least a selection of secondary-source readings) now. Here’s why:
For one, the notion that students will automatically learn more and prefer the use of fancy digital tools and “interactive” materials rather than print books is based on the faulty logic that today’s students are “digital natives” who are more comfortable using digital technology than older people who did not grow up around this technology. I addressed those claims here.
Secondly, removing secondary sources from the classroom prevents students from learning about the interpretive nature of history and how our understanding of the past is constantly revised as new concerns in the present prompt new questions about the past. Jim Grossman is right when he argues that revisionism is fundamental to historical inquiry, and we lose that critical component of the historian's toolbox when we simply throw primary sources at students without showing them how historians interpret and sometimes disagree about the meaning of those documents. If primary sources are the "musical instrument" with which historians conduct their performances, secondary sources are the "technique" we employ to help us competently perform with our musical instruments.
Thirdly, primary source documents are laced with their own biases, speculative claims, faulty memories, and political agendas. If you don’t believe that, just imagine what sorts of primary sources historians of the early 2000s will have at their disposal one hundred years from now. The best contemporary historical scholarship provides us strategies for assessing the reliability of a primary source, and that scholarship should be an integral part of the classroom experience. Again, just giving students the “facts” without giving them a framework for critically thinking about those “facts” does little to advance their own understanding of history’s complexities.
History is political and always will be. The United States has plenty of accomplishments to be proud of, but an unquestioning self-congratulatory narrative of progress doesn’t tell the whole story of this nation’s history. And it’s boring! We need to teach both content and process in the history classroom. We need more primary sources in the classroom, but we also need more secondary sources that do a better job of providing students with a framework for interpreting those primary sources. And we need to show students how the very nature of American identity and citizenship has changed over time, which means taking a critical look at both the good AND bad in American history.
As a public historian, I am constantly challenged by the need to conduct interpretive tours that provide accessible historical knowledge to people of all ages. I’ve been thinking about this challenge a lot lately because I feel like I have a lot of room for improvement when it comes to working with children who participate in my interpretive tours at the Ulysses S. Grant National Historic Site. Most of my training as an educator and historian has revolved around teaching history to people from about age twelve and up, and my Missouri teacher certification covers grades 5-12.
Interpretation guru Freeman Tilden argued in his 1957 publication Interpreting Our Heritage that “interpretation addressed to children (say, up to the age of twelve) should not be a dilution of the presentation to adults, but should be a fundamentally different approach. To be at its best it will require a separate program.” Tilden’s argument may sound fairly basic and obvious, but it’s surprisingly easy for interpreters to overlook the needs of children when giving their presentations. I’ve been on plenty of tours where interpreters only address the adults in their audiences without even acknowledging the presence of the children there.
When I worked for the Capitol Tour Office at the Indiana State House, the educational focus of my tours was clear cut. Sometimes I gave tours to groups of upwards of 100 fourth graders, while at other times I led tours for senior groups, government officials, and other adult groups. Most of the time I had a clear idea of who my audience was and what I needed to do as an educator to make my tours inclusive and accessible. Generally speaking, I made the student tours "interactive" through visuals and active questioning that challenged students to recall their prior knowledge of Indiana history and civics. During adult tours I usually gave a more straightforward presentation with time for audience questions throughout.
Life's a little more difficult at ULSG because you simply cannot predict who is going to come through the door for your tours. What do you do when you have a group of five adults and five children, three of whom are under age twelve? How do you make this tour inclusive for both the adults and the children? Do you address the group as if they were all adults and simply sprinkle in a few questions for the children, or do you compose a completely separate program geared largely around the children? Since I have a ten-minute limit for my interpretations, should I cut important historical context about Grant's life and his family's history so that I can focus on asking more questions and defining complex topics like slavery on tours with many children? Is there a good way to talk delicately about slavery and secession with groups that include children under twelve? Tilden's principle of separate children's programming falls short when wrestling with these interpretive challenges, because the answer oftentimes cannot be whittled down to "do a separate program for the best results."
I am curious to hear from other public historians, museum practitioners, and educators about their own experiences working with children under twelve in an interpretive setting. I’d also like to find relevant scholarship that addresses these questions. A search of the database for the National Council on Public History’s quarterly publication The Public Historian on JSTOR unfortunately yielded no results for any scholarship about working with children. Larry Beck and Ted Cable’s 2002 publication Interpretation for the 21st Century: Fifteen Guiding Principles for Interpreting Nature and Culture dedicates an entire chapter to “interpreting throughout the lifespan,” including children, teenagers, and seniors, but I found that chapter remarkably unhelpful to me; the authors focus exclusively on providing ideas for introducing kids to nature without any mention of the need to connect kids to history and culture. I find those omissions indicative of just how difficult it is for interpreters to make history (especially if it’s intangible history) relevant and accessible to young audiences.
Every audience I work with is unique and presents an interpretive challenge for me. Although I have experience working with children under twelve, I believe I still have a lot of work to do before I can get to a point where I feel comfortable working with groups that are mixed-age or mostly composed of children under age twelve.
[Addendum: I should add that one book I find quite helpful when it comes to museum education and educational theories is Deborah L. Perry's What Makes Learning Fun? Principles for the Design of Intrinsically Motivating Museum Exhibits. I highly recommend purchasing it.]