I am presenting a paper about the Grand Army of the Republic, Department of Indiana, and their memories of the Civil War at the 34th Annual Meeting of the Indiana Association of Historians on Saturday, March 8. The paper is titled “‘This Will Be Our History and Our Glory:’ Civil War Memories and the Grand Army of the Republic, Department of Indiana.” I will share some of my arguments from the paper in future blog posts. Below is my paper abstract:
In the only scholarly study of the Grand Army of the Republic, Department of Indiana, historian James H. Madison* declared in 2003 that Hoosier Civil War veterans remembered the conflict in a way that “created silences that denied the central essence of the war.” These veterans, argues Madison, reflected the racial attitudes of late-nineteenth-century Indiana, where racism, segregation, and violence against African Americans occurred all too frequently. As products of this racist society, the collective memories of Indiana GAR veterans by the fiftieth anniversary of the Battle of Gettysburg in 1913 allegedly reflected an active “forgetting” of the role of slavery, race, and emancipation in the nation’s deadliest war. More recent scholarship from Barbara A. Gannon and Caroline E. Janney challenges the notion that GAR veterans “forgot” these divisive issues, but both of these studies examine the Grand Army of the Republic on a national scale, raising questions about the applicability of these scholars’ theories to the local context of Indiana.
How did Indiana GAR members remember the Civil War? This study analyzes speeches, newspaper articles from the Indianapolis veteran-published American Tribune, and actions of Indiana’s Civil War veterans from 1880-1918 to argue that the members of the Indiana GAR remembered their role in destroying slavery, often intertwining the goals of Union and emancipation together in their interpretations of the conflict. Nevertheless, Hoosier veterans remained largely silent about the imposition of Jim Crow laws and legalized segregation throughout the country. This paper is part of a larger Master’s thesis and is a compilation of original research and scholarly synthesis.
*James H. Madison, “Civil War Memories and ‘Pardnership Forgittin’,’ 1865–1913,” Indiana Magazine of History 99, no. 3 (September 2003): 198–230.
My Twitter buddy and fellow historian Andrew Joseph Pegoda writes a thoughtful essay outlining his conception of historical inquiry and the history of the historical profession itself. I left a comment on the essay, but would like to expand upon those thoughts here.
Andrew posits that there are two different definitions of history. History (with a big H) consists of the study of the past using evidence, resources, and historical methods to construct narratives and interpretations about what happened. history (little h), according to Andrew, is “everything that has ever happened, didn’t happen, everything that has been thought, etc., from less than a millisecond ago.” In sum, little h history largely consists of historical facts. Abraham Lincoln signed the Emancipation Proclamation on January 1, 1863. Ulysses S. Grant was born on April 27, 1822. The United States was originally composed of 13 states. And so on.
Andrew continues by lamenting the state of historical literacy in the United States, suggesting that many students today are resistant to seriously analyzing their own myths about the past. He argues that public schools, politicians, and history museums have manipulated little h history to suit their own political agendas, creating a mythic past that glorifies America’s alleged exceptionalism while downplaying its own complicated and sometimes dark history of slavery, segregation, nativism, and violence against indigenous populations. In this regard, Andrew’s arguments resemble those of Indiana University professor John Bodnar, who suggests that political and cultural leaders create “official memories” of past events through monuments, memorials, and other symbols as a way of establishing a consensus view of history within society. This consensus history aims to unify a society’s historical understanding of the past–sometimes through myth–as a way of maintaining the political, cultural, and social status quo.
There are many reasons why people mistake the myths they acquire growing up for big H history that relies on evidence and interpretation for understanding. One reason for this confusion, I would argue, is that the study of history itself has established its own myths over time, myths that historians have helped to perpetuate in their own commentaries about historical methods, the nature of truth, and what exactly constitutes “history” from a content perspective. Indeed, there remains a popular perception of historians working individually in an archive, writing a book, or teaching a class, using evidence obtained from research to objectively report on “how things were” all through the process. Even though public history and the digital humanities have recently emerged as serious and important additions to the field, many outside the walls of academia would be surprised to encounter historians working collaboratively on a digital humanities and/or book project, creating education programs in a public history setting, or working to address contemporary problems and enliven neighborhoods through historic preservation. Historians are often viewed as solitary reactionaries, struggling to pick up the broken pieces of the past while political and cultural elites create their own history in the present. They have struggled to convey the relevance of their field of study to public audiences because those audiences’ very conception of the historian’s craft is often rooted in a mythic understanding of history as it was practiced in the nineteenth century.
As Martha Howell and Walter Prevenier point out, the field of history became professionalized in the nineteenth century, especially towards the end of the century. Academic universities in Europe and later the United States began to include history as a curricular requirement and a field for research and publication. This professionalization led to what Howell and Prevenier describe as the “golden age of text editing and source publication” (41). Many of these scholars strove to make history a more scientific field of inquiry and sought to remove any rendering of the past that suggested a personal bias, a subjective interpretation, or a mythic understanding of the past. The German historian Leopold von Ranke, for example, argued that historians should strictly limit their studies to provable facts and empirically testable material. This “positivist” view of historical methodology conceived of history as “a rendering of the past strictly on its own terms, without grand theory about social systems, causality, or purpose” (88).
The rise of professionalized history can also be tied to the rise of nineteenth century nationalism. For all of their talk about objective truth, many historians during this period contradicted themselves by participating in national projects that aimed to record and celebrate the history of their nation’s past. Howell and Prevenier point to many European projects such as the Monumenta Germaniae Historica in Germany (1819) and The Recueil des Historiens de la France (1899) as scholarly endeavors aimed at romanticizing the past in order to foster social cohesion and national identity in the present. These projects, of course, were hardly objective or strictly limited to historical “facts.”
In the same way that myths provide comfort and understanding about historical events, the mythic nineteenth century view of the discipline of history puts the past in a neat box. Here in the United States, the contradiction of nineteenth century positivist history still seems to have a hold on society’s understanding of historical facts. In my public history work I’ve had countless conversations with people who tell me they just want “the facts” of history without the interpretation. At the same time, however, they express discomfort over any analysis of United States history that might disrupt their own nationalist sentiments and subjective concept of American identity. History, in this view, falls outside the issues of today, allowing for the past to become a source of comfort (which it can and should be, within reason) rather than a source for questions about today. This nineteenth century view also privileges certain types of history (political, military, economic) over others (race, gender, class, material culture), which might explain why books about war, politics, and economics dominate the bookshelves of Barnes & Noble. It might also explain why the field of history continues to have a strong gender and racial imbalance that leans towards white males.
Leopold von Ranke’s conception of history posits that the study of history has an “endpoint” defined by the discovery and confirmation of factual evidence to explain the past in a single narrative. I view history as a field of inquiry with no “endpoint,” instead representing a lifelong journey of questioning, revising, and interpreting historical sources that often yields a number of competing perspectives about what actually happened. When we get too comfortable with our understanding of the past, it’s time to start asking more questions of our sources. Convincing young students and the rest of society of the power of history requires us to move our historical methods and thinking beyond the nineteenth century.
I am feeling on top of the world at this very moment. This morning I submitted a draft of my master’s thesis to my committee in anticipation of my formal thesis defense on Tuesday, March 11. I still have a long way to go in the entire process, but I feel like a huge load of researching, writing, and editing has been lifted off my shoulders. The draft is longer than I anticipated–141 pages, which is almost double the minimum requirement for a history master’s thesis at IUPUI–but I honestly believe that every question I ask and every interpretation I provide has an important purpose within the study (I’ll share more about it in future posts). I’ve been very fortunate to have a committee of professors who have taken an active interest in my topic and who have already read rough drafts of all my chapters multiple times. Having three different perspectives throughout the process has allowed for a wide range of questions and comments on revising my work, and their prompt attention to my research has given me ample time to craft what I hope will be an important addition to the study of Civil War memory in Indiana and the entire Grand Army of the Republic.
When I first started graduate school in August 2012 I hadn’t put much thought into my thesis topic. Indianapolis was a new city for me, and I didn’t know a whole lot about the history of the state, although I knew that I wanted to do something Civil War-related. Within days of moving to the city I visited the Indiana Soldiers’ and Sailors’ Monument in downtown Indianapolis, and this visit prompted questions about the nature of Civil War memory in the Hoosier state. What really struck me about the monument at the time was its location. The very definition of Indianapolis’s cardinal directions has been shaped by the monument’s location in the geographical center of the city. Everything west of the monument is western Indianapolis, everything east is eastern Indianapolis, and so on. City designer Alexander Ralston platted a circle in the middle of the city in 1821 with the intention of placing the Governor’s mansion in this circle (the mansion was built poorly, however, and no governor ever lived there, ironically enough). My interpretation of this design is that Ralston aimed to place Indiana’s chief executive in the center of the capital city as a way of reinforcing notions of good governance, “progress,” westward expansion, and American patriotism in the Hoosier state.
With the end of the Civil War, however, calls were made to turn the circle into a commemorative monument to Indiana’s Civil War dead, and in 1887 the Grand Army of the Republic finally persuaded the Indiana General Assembly to appropriate $200,000 to build the monument. By placing the monument in the geographic center of the capital city, Indiana now defined itself as a state whose very foundations were built on its collective remembrance of the past. Later legislation banning the construction of any buildings within the circle taller than the monument reinforced this idea by ensuring that future commercial developments would never overshadow the state’s memories of its past and its war dead.
Even though my thesis did not have enough space to address the construction of the Indiana Soldiers’ and Sailors’ Monument, this August visit to the monument sparked my interest in looking at the Indiana veterans who were so adamant about constructing a monument that reflected their memories of war.
Following this visit, I outlined my plan for researching, writing, and editing:
September 2012 – April 2013: In September I decided that the Indiana GAR was going to be my topic. For the next seven months I dedicated myself to doing research and developing research questions for the thesis. I also presented a few rough drafts of ideas I had at conferences at Ohio University and the University of Indianapolis, which gave me the opportunity to get some feedback from professors outside of IUPUI. Finally, I started blogging a bit about my research here at Exploring the Past (which is under the “Grand Army of the Republic” category to the right).
May 2013 – December 2013: Early in the process I made a vow to have all three chapters written by the end of 2013. In May I had a formal prospectus defense in which I outlined my ideas for each chapter and compiled a list of primary and secondary sources I intended to use throughout the study. This prospectus defense was successful, and I began writing shortly thereafter. Over the summer I wrote two chapters and continued to conduct research for my third chapter, which looks at the relationship between the Indiana GAR’s desire for “patriotic instruction” for young children and the rise of public education in the state during the 1890s and 1900s. I wrote the third chapter in my free time during the Fall semester and completed it in December.
January 2014 – Present: Since the turn of the new year I have focused on writing an introduction and conclusion while also making extensive edits to the entire document. Now that I’ve turned in a draft of the whole product, I will focus on making edits, tying up loose ends, and preparing for my thesis defense. I am also presenting a paper about the Indiana GAR at the Indiana Association of Historians conference at Anderson University on March 8th. At some point in March or April I will have the thesis reviewed by an editor at the IUPUI graduate office, followed by the eventual publication of the study in book form. Thanks to IUPUI’s commitment to open access policies, my master’s thesis will also be available online to the whole world through the university library’s ScholarWorks Repository. Hopefully others will read it besides my family.
What do you think about the process of writing a master’s thesis? Any recommendations for those looking to get a head start on their own studies? Be sure to leave a comment if you have ideas.
Since 1939, the economy of St. Louis, Missouri, has benefited from a healthy number of job opportunities provided by the aerospace industry. Starting with McDonnell Aircraft, which became McDonnell Douglas in 1967, and finally Boeing in 1997, aeronautics has brought a range of talented individuals into the city, including aircraft pilots, aerospace engineers, and machinists who build all of this cutting-edge technology.
The machinists who work at Boeing’s plant in St. Louis are a part of the International Association of Machinists, District 837. This union negotiates on behalf of its workers for wages, periodic wage increases, signing bonuses, health benefits, overtime premiums, safety standards, and pension plans with the Boeing Company. Additionally, the union provides legal aid to workers who may be mistreated by the company in some capacity. To be sure, I don’t have the knowledge to analyze the nature of the IAM’s relationship with Boeing today. Nevertheless, I think it’s fair to say that the machinists who work for Boeing are a critical element in the success of Boeing as a company, and that the skills these people have developed require years of education, training, and professional development.
But let’s take a step back for just a second. Let’s envision a situation where protests arise against the arrangement between IAM and Boeing. Concerns emerge about the quality of the machinists’ work and the amount of money being paid to them, and Boeing ends up losing out on contracts with various governments and private companies. A wealthy citizen and/or corporation expresses public concerns about the state of the aerospace industry in the United States and argues that the IAM is largely to blame for this sordid state of affairs, going so far as to say that the situation is a threat to national security. Other countries are experiencing tremendous growth in the aerospace industry and are actively looking for more talent to help enhance their own capacity for development in aerospace, this wealthy citizen/corporation argues, and it’s time to reform the system. The wealthy citizen/corporation invests money in a new program named “Build for America,” which aims to create a sort of national machinists corps, a talent base from which to recruit competent machinists. The members of this machinist corps would need no prior experience in manufacturing. All they would need is a college degree and a willingness to work at least two years manufacturing airplanes at the Boeing plant. Once their two years were up, these “Build for America” corps members could decide to go back and receive more training in manufacturing, or they could move on to other career endeavors. Leadership at Boeing loves this deal and the amount of money it will save the company, so they decide to cut their ties with IAM and create a bargaining agreement in which the wealthy citizen/corporation helps run the “Build for America” program and provides machinist labor for Boeing.
The situation I’ve just described above, of course, is absolutely ridiculous. While leadership at Boeing may or may not enjoy working with IAM, they realize that the relationship between the company and its labor force is crucial to its success. The machinists who work at Boeing practice a highly specialized craft that requires a lot of talent, experience, and dedication. One cannot simply walk onto this sort of job without any manufacturing experience and master the entire practice of building aerospace technology within two years.
When it comes to teaching and public education, however, the situation I’ve described above captures the essence of the current battle between teachers unions, school administrations, and wealthy education reformers around the United States. Philanthropists like Bill Gates and the Walton family have come out against teachers unions, arguing that these unions are in the business of protecting bad teachers. These bad teachers are the root of public education failures throughout the country; taxpayers spend too much money on education, teachers make too much money for the work they do, and children all over the country drop out of school because the teachers and schools they attended failed them. As Davis Guggenheim suggests in the popular film Waiting for Superman, perhaps the reason public education fails so many students is not because of the struggling families, impoverished communities, and lack of quality jobs that surround these students’ lives, but the teachers themselves and the unions that protect them.
To address these problems, wealthy philanthropists have poured millions into charter schools and programs like “Teach for America,” where recent college graduates with no prior experience in teaching agree to work in impoverished public schools for two years. An implicit assumption made by Gates, the Walton family, et al., is that teachers somehow have the power to overcome all circumstantial issues outside the classroom and lead their students out of poverty. By getting rid of the teachers unions that promote tenure policies for teachers and protections for teachers who are unfairly accused of misconduct, school administrators will be able to run their schools like businesses, firing the “bad” teachers, hiring “good” ones in their place, and using standardized tests to measure success in the classrooms. In their attempts to obtain the highest possible test scores for the charter schools, these school administrators have even resorted to “firing” poor performing students. As Diane Ravitch points out, “some charter schools ‘counsel out’ or expel students just before state testing day. Some have high attrition rates, especially among lower-performing students.” Geoffrey Canada of the Harlem Children’s Zone, for example, kicked out his first class of middle school students because they did not meet the test score expectations of the school’s board of trustees.
And now there is news out of Newark, New Jersey, that the state education department is looking into possibly firing upwards of 700 experienced public education teachers from the Newark Public School system and replacing them with more than 300 Teach for America candidates; candidates with no teaching experience who frequently leave the communities where they teach when their terms are up. This possible purge of public education teachers, in my opinion, reflects what Andrew Hartman has pointed out as Teach for America’s “hidden agenda.” Rather than working to fix poverty, segregation, violence, crime, and a myriad of factors outside the classroom, the Charter School/Teach For America/education reform movement incorrectly believes that simply eliminating teachers unions and privatizing education will magically solve the problem of public education in the United States.
Boeing would never fire 700 experienced machinists and replace them with 370 college graduates who lacked any manufacturing skills. So how is it logical to fire 700 experienced teachers and replace them with 370 college graduates who lack any teaching skills? What’s the point of getting a teaching degree?
To be sure, teachers unions are not above criticism (look no further than the “Rubber Rooms” controversy in New York City for an example of a ridiculous and wasteful teachers union initiative). But I am tired of reading articles in which teachers–who have spent years getting advanced degrees, gaining experience in the classroom, and investing themselves in the communities where they work–are blamed solely for the education failures of their students. I am tired of wealthy philanthropists and bureaucratic politicians who have never spent a day in a classroom telling teachers that they should be “seen,” but not “heard” when it comes to issues in education. I am tired of seeing laws enacted that make it easier for business leaders who have never spent a day in the classroom to become school superintendents. I am tired of reading articles from teachers who plead with the public not to blame them for the failures of their students because they have no room to actually teach. I think we can do better.
Whenever I get a chance to visit a museum–whether it be related to history, art, science, or even a children’s museum–I expect to be challenged intellectually. I believe the best museums are the ones with lots of questions, lots of arguments, lots of discussions, and lots of opportunities for visitors’ preconceptions about the world to be enhanced, challenged, questioned, and/or proven wrong. That’s what learning is about, right? Indeed, having lots of exhibits, artifacts, and informational labels will mean nothing if there is no effort at interpretation. The who, what, where, and how informational questions are important for building a foundation for learning within a museum setting, but visitors also want the why questions. Why does this matter? Why are we here? Why this and not that? Why should we care? Without making an interpretive argument, museums become dull and boring. I would rather go to a museum that made terrible arguments than to one that made no arguments at all.
University of Leicester Museum Studies professor Richard Sandell’s 2005 publication Museums, Prejudice, and the Reframing of Difference points out that all too often museum professionals are hesitant to take strong positions or convey “messages” to their audiences. A range of explanations account for this hesitancy, including concerns about the possibility of museums being viewed as places of indoctrination or sectarianism, the possibility that donors and other stakeholders are uncomfortable with museum messages, and even self-doubts within museum staff about the power of cultural institutions to enact meaningful change in society.
While acknowledging that different museums have different missions, goals, and objectives to accomplish, Sandell criticizes museums that make strong claims to impartiality and objectivity. “These concerns about partiality,” argues Sandell, “are used as an excuse to avoid engaging with social issues and acknowledging that museums of all kinds . . . embody particular moral standpoints” (196). In actuality, museums that strive to make no interpretations, arguments, or commentaries about the present are themselves still making a sort of interpretation through their own silences. Rather than striving for objectivity in museum exhibits and programs, Sandell suggests that museums should strive for fairness: places that promote inclusiveness, open and honest discussion, social interaction, and understanding between different people and groups in society.
Part of the reason why museums (especially art and history museums) have been struggling over the past ten to fifteen years, in my opinion, is that audiences are tired of purely information-based experiences in museums. If I just want information, why can’t I look that up on Wikipedia or do a Google search instead of traveling to a museum? It is true that museums today face more competition for attention and leisure time than they did in the 1960s and 1970s. Nevertheless, museum practitioners should be working to convince society that museums are truly beneficial places (socially, intellectually, mentally, even physically) that are really worth the time to visit in person or online. This work of convincing society of the importance of museums means that practitioners and professionals need to work towards creating experiences that give audiences the ability to be active users within the museum setting, not passive visitors merely seeking objective information.
The recently deceased cultural theorist and sociologist Stuart Hall’s analyses of culture and media production provide important insights for museum educators looking to move beyond a purely informational educational experience. Hall and many of his cohorts in the 1980s and 1990s argued that consumers of media information–whether through radio, television, or the internet today–create their own processes for understanding messages conveyed to them. Rather than being passive recipients of a static message, media audiences themselves contribute to the construction of their own messages that could be radically different from the intended message a media outlet attempts to convey. In the process of consuming media messages, people bring with them their prior experiences and knowledge of the world. They are not empty vessels waiting to be filled with passive information. Describing Hall’s “encoding-decoding” model, Sandell points out that students of media studies have shifted their inquiries from “what media messages do to people, to what they mean to them” (11).
What does this have to do with museums? Above all, it’s important to remember that museum visitors bring with them their prior experiences and knowledge. Visitors use these tools to construct their own messages from exhibits and programs, just like they do with media messages. True, the intended message of a museum may turn out to be radically different from the message an audience member interprets during their experience, and these misinterpreted messages could be potentially difficult for museums that attempt to promote diversity, inclusiveness, and/or the healing of historical wounds from past injustices such as slavery, segregation, or genocide. Nevertheless, from where I stand, museums that convey messages are better than ones that convey no message. For museums that profess to be objective, everything is black or white. There is no room for discussion, no room for questions, no room for multiple perspectives or interpretations of museum messages. Look at an exhibit, get your label text information, go home. Museums that convey messages, by contrast, acknowledge that they are not neutral spaces. By promoting fairness instead of objectivity, these museums acknowledge the agency and intelligence of their audiences. Indeed, these museums enhance their own capacity for growth by understanding that museums are shaped by their audiences just as much as audiences are shaped by the museums they visit.
I’ve come across a lot of thought-provoking reads lately. Here’s a roundup:
In the Classroom
- David Cutler says it’s time to say goodbye to history textbooks. Yours truly was almost brought to tears reading this wonderful essay.
- Catherine Gobron asks, “is compulsory schooling necessary?” Gobron says no, arguing that “making people do things they don’t like encourages people to dislike the thing you are making them do, even when that thing is fun or valuable. If a person finds a road worth crossing, they’ll cross it.”
- Laura Miller writes a thought-provoking essay on the tendency of readers to unjustly blame their own comprehension struggles on the authors and scholars whose works they are reading. Miller suggests that readers blame others for their difficulties because they don’t want to feel ignorant or have a sense of intellectual insecurity.
- Chris Conrad writes a heartfelt and inspirational essay on the shortcomings of history education in the United States. He points out that non-European histories (Asian, African, South American, etc.) are only taught when they have a connection to white European history, and that history is more than facts. “We have a fetish toward facts, numbers, and statistics and demonstrate a fear of questioning and theorizing,” argues Conrad, and we must work to not view history as a monolithic narrative of progress, but as a complex story with many perspectives and a wide range of ups and downs.
- In an essay that parallels Conrad’s, Abigail Zuger argues that the “single best answer syndrome” has infiltrated the world of medical education.
- Fellow scholar and Twitter/Facebook buddy Andrew Joseph Pegoda writes about teaching writing intensive courses for undergrads.
- The Washington Post details the story of an 11-year-old dying boy who was forced to take a standardized test in Florida. Sadly, the boy (Ethan Rediske) passed away on February 7.
History, Science, Technology, and Philosophy
- Simon Critchley writes a beautiful essay on “the Dangers of Certainty,” arguing that “all knowledge, all information that passes between human beings, can be exchanged only within what we might call ‘a play of tolerance,’ whether in science, literature, politics or religion.”
- How to train your mind to think critically and form your own opinions.
- Noam Cohen analyzes the efforts of Wikipedia to create programming that makes it easier for mobile users to make edits to the world’s largest reference website.
- According to a recent study by the National Science Foundation, one in four Americans thinks the Sun goes around the Earth. Oy vey.
- My friend Joshua Hedlund argues that the recent Bill Nye/Ken Ham debate over evolution and creationism demonstrated that scientists are not as open-minded or “progressive” as Bill Nye would have them be, nor is their evidence as strong as it may seem. An interesting and thought-provoking read.
- Eric Foner reflects on teaching history and the need to question American Exceptionalism. A quote is necessary to give readers a taste of the brilliance of this interview:
I would say the most pernicious idea about history that is widely shared here is the idea of American exceptionalism: That our history is so different from that of every other country that we don’t have to know anything about anybody else. With that comes, “We are better than everybody else. We have a unique mission in the world.” In a way it alerts us to issues of liberty and human rights around the world—but in a globalized, interconnected world, that notion of exceptionalism has little real validity. It leads to parochialism, self-satisfaction, and a lack of interest in the rest of the world. If I had one idea I’d like everyone to abandon, it’s American exceptionalism, that we are exempt from the processes of history that affect everybody else. That we have a superior position in the world that gives us the right to tell other countries what they ought to do without listening to an international dialogue.
Yesterday was a pretty exciting day for me, as I had the privilege of giving my first professionally invited talk/speaking engagement. The history department at my alma mater (Lindenwood University) invited me to address the undergrads in the department about public history and applying for graduate school, and I think the presentation went well. I aimed to keep the speech between 35 and 45 minutes with some space for questions and answers at the end. I also encouraged questions throughout the talk in an attempt to make it more of a conversation than a monologue. The crowd was pretty quiet during the talk, but I got good questions at the end, and I’d like to think the students came away with a better understanding of what public history is.
I broke up the talk into three sections:
1. My career transition from the “high school teacher track” to public history.
2. A definition of public history (see here for an essay I wrote that captures my ideas on the subject) and what sorts of jobs people can get in public history.
3. Tips and tricks for applying to graduate school (see here for more).
Here are a couple questions I received at the end of my talk and how I answered them:
1. What is the most important question to consider when applying for graduate school? In my opinion, you need to figure out your budget for graduate school before going any further in the application process. For each program that interests you, make sure to study what sort of financial aid is available (fellowships, scholarships, stipends, grants, loans, etc.) and figure out how you can get as much aid as possible. For me, it was extremely important to find a program that gave me the ability to go to school without taking out any additional loans. I have some outstanding loans from undergrad that will need to be addressed in the near future, and I realized that I didn’t get into history to become a millionaire. While I was willing to pay some money out-of-pocket for grad school, it was imperative not to put myself in a hole financially, and my general mantra towards advanced degrees is that you should make the school work to recruit you, not the other way around.
Nevertheless, I also realized that by setting these boundaries during my school search, I eliminated solid programs that could have done a lot to influence my career. I have several public history acquaintances at American University in Washington, D.C. who have told me they received no financial aid whatsoever for their graduate degrees. The cost of school, combined with the cost of living in the D.C. metro area, is astronomical, most definitely in the five digits and beyond. With that cost, however, come some pretty amazing opportunities to work with cultural institutions like the Smithsonian, the Library of Congress, or the National Park Service. You as an individual must weigh the pros and cons of each program you apply to and determine the best fit for you financially while also keeping an eye towards your future earnings if you were to pursue a career in public history.
2. What’s the difference between museum studies and public history? There is a lot of overlap between the two fields and I had trouble distinguishing the two when I initially started my search for a graduate school. Generally speaking, public historians are trained in historical methods, historical research, and communicating the stuff of history to a public audience. The medium in which public historians communicate with their audiences could be in a museum, but it could also take place in a historical society, archive, library, film production, a historic preservation project, or a classroom. Museum studies students do not receive training in historical methodology; rather, they receive specific training for work within a museum setting. This training includes exhibit design, label writing, curation, object-based learning, and museum education. Museum studies students are trained for work in a wide range of museums, including history museums, art museums, children’s museums, and science museums. Public historians are not necessarily qualified to work in all of these types of museums, but they have opportunities outside the museum walls.
3. Do I need an advanced degree to pursue a career in public history? Generally speaking, you need at least an M.A. to work in public history. Granted, I have no doubt that there are small historical societies and museums willing to take applicants with just a B.A., and there are part-time jobs that do not require an advanced degree. In my own experience, however, most of the jobs I’ve read about and applied for require an advanced degree, and oftentimes they want a portfolio or some other tangible example of work you’ve done in the field.
4. What is the nature of your Master’s thesis? Is it related to public history? My master’s thesis does not cover a public history topic. I was not required to pick one, although my study of the Grand Army of the Republic, Department of Indiana, analyzes historical memory and briefly discusses the role of monuments in conveying messages to public audiences about collective memory. This is not the standard in every program, however. Some public history programs require a public history-related thesis, while others require no thesis and opt to have students complete a portfolio. I highly recommend that prospective students ask program directors about the nature of these projects and seriously consider whether they’d be willing to complete a thesis or portfolio of some sort. All too often, students complete their coursework but struggle to finish their final project. Those pursuing a Ph.D. sometimes end up with an ABD “degree” (all but dissertation). You don’t want to do that.
If you have a question about public history or applying for graduate school, feel free to leave a comment below.
For most of my life I’ve never been a big science person. I’ve always struggled with science, and ever since I can remember I’ve never had good grades in the subject. This state of affairs is largely my fault because of my attitude towards the subject and my narrow, tunnel-vision argument that usually went along the lines of “I’m a history student, I don’t like numbers, science isn’t relevant to my studies.” In sum, I did badly not because I was unintelligent; I did badly because I was a numbskull. Fortunately, a light went on in my head around the time I turned 25 in 2012, and now I realize that science is awesome, not to mention extremely relevant to how I practice my own craft as a historian. This semester I’ve been fortunate enough to sit in on a class that covers the History of Science and Technology Since 1750. Partly a history course and partly a science course, the class has already exposed me, through three weeks of invigorating and challenging reads, to influential thinkers like David Hume and Karl Popper, and has challenged me to consider what we mean when we talk about “doing science.”
On Tuesday, February 4, a well-publicized debate took place between scientist Bill Nye and Creation Museum founder Ken Ham over the following question: “Is creation a viable model of origins in today’s modern scientific era?” The history of science and technology class I’m taking meets on Tuesday nights, but by pure coincidence (or providence, depending on how you see things), inclement weather in the Indianapolis area led to the cancellation of class. So I tuned in to watch what turned out to be an interesting and entertaining debate, helped in large part by the lively discussion on Twitter via the #creationdebate hashtag.
A lot of people on Twitter and Facebook, however, questioned the need for a debate, and a lot of post-debate essays I’ve read seem to be rather negative in tone. Nye and Ham talked past each other, Bill Nye wasn’t entertaining enough, and so on. So it goes. While I understand those who say that Nye shouldn’t have even acknowledged Ham’s creationist ideas by agreeing to debate him, I believe it’s always important to reinforce to society what it is scientists do, why they practice science, and why we should encourage people to utilize scientific thinking in their daily lives, even if they don’t make their living in a scientific field.
The debate was nearly three hours long and covered a lot of ground. I just want to focus on a rather curious argument that Ken Ham continually referred to throughout the debate. That argument, according to Ham, is that there are two distinct “types” of science that scientists practice: “observational” and “historical” science.
“Observational” science, Ham argues, refers to science that takes place in the present: it can be observed, tested, hypothesized about, and repeated. “Historical” science, on the other hand, refers to science that lacks “evidence,” cannot be observed, and is essentially unknowable. Ham asserts, for example, that we can’t know how the Grand Canyon formed because we weren’t there to observe it.
The problem with this distinction is that all science is observational, and all science is historical. What happened thousands and millions of years ago is observational, and we have the evidence and technology to observe, hypothesize, and test this type of science. Science that happened last week, yesterday, and twenty minutes ago is historical, observable, and testable. I’ve never heard scientists refer to two “types” of science before, and the observational/historical dichotomy appears to be a figment of Ham’s imagination, not something readily embraced by the scientific community.
Even if we were to accept the idea of a difference between “historical” and “observational” science, at what point would we agree on a paradigm shift from historical to observational science? When do we cross into observable territory with regards to scientific inquiry? Noah’s Ark? What observable evidence exists to support that claim? If the answer is the Bible, then we need to ask whether or not the Bible can be used as scientific evidence.
I see two crucial reasons for analyzing paradigm shifts and changes over time. For one, we want to outline precise moments in history in which a change–political, social, cultural, scientific, etc.–took place. Second and equally important, we want to “think historically” in the sense that we need to understand and appreciate the fact that we today are merely observers of the past, not participants in it. By doing this exercise, we are able to better place our lives in time. “The past is a foreign country,” as David Lowenthal argues, and we must work to learn the “language” of what life was like in the past. Of course, exactly when “modern” society emerged is a bone of contention among historians today. For our purposes here, however, it will suffice to point out that people thousands of years ago did not act or think like us, and part of thinking historically involves an effort to use historical evidence to uncover how these people made sense of and created their own realities. Ken Ham’s model, in my opinion, doesn’t help us analyze the observational or historical aspects of science or even history itself.
Over the past year and a half in graduate school I’ve developed a much better appreciation of how cultural artifacts and ideas (whether conveyed through books, digital resources, material objects, symbols, and/or oral communication) feed off each other and give life to a social system. The “things” we typically utilize on a regular basis–television, the internet, Starbucks coffee, restaurants, sneakers, our homes, the buildings where we go to school and work, etc.–are cultural artifacts that can tell us much about political power structures, social meanings, and how people in a society construct a reality for themselves. I’ve learned that you really can’t take anything at “face value” without asking how and why an artifact or idea conveys meaning to people, nor can you look at an artifact or idea without asking about the larger context in which the artifact or idea was created. For scholars (especially those in fields like history, anthropology, sociology, literature, and cultural studies), the notion of artifacts and ideas whose meanings are socially constructed is probably not new or ground-breaking. Nevertheless, I would argue that one of the many reasons why the humanities is such an important facet of scholarly study (and why more people need to take the humanities seriously) is that the humanities requires people to ask questions about the very nature of truth, justice, citizenship, and power in society, challenging us to consider not only the possibilities but also the limitations that society puts on us in our capacity for growth as humans.
Even though I’ve watched tons of Super Bowls in the past, I finally realized with this most recent game that the Super Bowl itself is a valuable cultural artifact that provides insights into American values and society. Indeed, I believe Super Bowl Sunday is perhaps the most “American” day in the entire calendar, even more so than Memorial Day or the Fourth of July. Even with the profusion of cable/satellite television, Netflix, and the internet, it’s simply amazing to imagine 111 million people in the United States–roughly one out of every three Americans–sitting down on a couch to engage in one common televised spectacle. What other event on America’s commemorative landscape captures the collective attention of so many people?
The Super Bowl this year was significant to me because it strongly reinforced the fact that this one silly game is perhaps the strongest conveyor of America’s civil religion to the rest of the world. To be sure, some scholars assert that there is no such thing as a “civil religion,” but I like Richard T. Hughes’s definition, which describes civil religion as the “myths that America lives by . . . by which we affirm the meaning of the United States.” Ritual expressions of patriotism such as the national anthem, the raising of the American flag, and the creation of national monuments and cemeteries, I would argue, all fall under the category of “civil religion” because they purport to tell us through symbols what it means to be an American and instill a nearly religious faith in what we perceive to be “the American way.” Myths function in much the same way in that they purport to tell us stories about the past and who we are today. Wildly one-sided and simplified stories about the Founding Fathers, American wars, or abstract notions of “progress” can also fall into this notion of mythologized civil religion.
So how does the Super Bowl convey ideas about America’s civil religion? For one, the game itself broadcasts to the world the values that bring 111 million Americans to their TVs: sex, violence, commercialism, materialism, and above all, American exceptionalism. We see sexualized images of women throughout the game, we put aside serious concerns about the long-term effects of football concussions while watching men smash their heads into each other, we laugh at silly commercials that cost millions to produce, and we wrap it all in the American flag. We see videos of servicemen and women saying hello to their loved ones back home (although we don’t take time to think about why they’ve been placed thousands of miles away from home, because we’ve got a Bruno Mars song to catch), we see American flags waving all over the stadium, we see businesses attempting to connect commerce with patriotism, and we get angry when Coca-Cola makes a multilingual commercial that offends our cultural sensibilities. By using the flag for these symbolic purposes, we argue that the Super Bowl is America. This is what we stand for, this is who we are, this is why we are an exceptional people. Football is our game.
The winner for the biggest expression of American exceptionalism and civil religion this year goes to Chrysler. In a much-publicized commercial featuring music star Bob Dylan, the company asks a silly but serious question: “Is there anything more American than America?”
The video goes on to create its own mythic story about automobile production and “progress” in America, making references to the “authenticity” and “originality” of American automotive ingenuity. “Detroit made cars, and cars made America.” Without American cars, we just aren’t America anymore! When you spend your money on American goods, you show American pride and even patriotism. As the video description states:
When it’s made here, it’s made with the one thing you can’t import from anywhere else, American pride. The all-new 2015 Chrysler 200. America’s Import.
Sex, violence, commercialism, materialism, American exceptionalism: America’s civil religion, or at least a part of it.
On a somewhat related sidenote, I have to laugh at all the criticism Bob Dylan has received for appearing in this commercial. Countless friends have remarked that he’s a “sellout,” a “phony,” a has-been who’s given in to corporate interests. This line of thinking is humorous to me because we as a society have asked for these types of commercials. So many people talk about watching the Super Bowl because “I want to watch the commercials.” For a few hours on Super Bowl Sunday, we gladly ask America’s corporate powers to become theatrical entertainers so that we can laugh, find new things to buy, have something to talk about on Monday, pat ourselves on the back, and vaguely thank those in the service for giving us our freedom and this blessed game. To meet “the people’s” criteria, companies like Chrysler exploit the symbolism of the American flag and hire entertainers like Bob Dylan because that’s what we’ve asked for. If you were in Dylan’s shoes, would you really turn down all that money for two minutes of work?
Bob Dylan has more punk, rock ’n’ roll, and soul in his left pinky finger than most of us have in our entire bodies. He sure as hell doesn’t care what we think about his commercial, and his bank account isn’t losing money for every retweet or status update complaining about him. At the end of the day, are we really angry with Bob Dylan, or are we actually angry with ourselves? Is Bob Dylan the sellout, or are we (myself included) the actual sellouts?
I recently came across a Huffington Post article reporting that the television show The Simpsons aired a scene in 2005 (in the episode “Bonfire of the Manatees”) in which the two teams in tomorrow’s Super Bowl appeared on a television with a “final score” of 19-14 in favor of the Denver Broncos.
Last year on this blog I predicted that the Ravens would beat the 49ers 35-31; the actual outcome was 34-31 in favor of the Ravens. This year I predict that the Broncos will win 24-21 over the Seahawks.
I’m a big sports fan and I’ve always enjoyed football, even though I never cared to play the sport myself, and I now see several friends who did play years ago going through their own pains today. Nevertheless, I’ve learned a lot about the dark side of football over the past year, and it’s hard to ignore the devastating physical and emotional toll the game takes on those who play it. Mike Webster’s death in 2002 at the age of 50 was one of the first notable cases in which football undoubtedly played a leading role in a player’s untimely demise. More recently, Junior Seau committed suicide in May 2012. No doubt there are other horror stories as well. My friend Joshua Hedlund has decided to stop watching the Super Bowl altogether, and the more I read stories like those of Webster and Seau, the more I wonder if that might be the route I go in the future as well.