Category: Readings

These are the blog posts on assigned readings. Due on SUNDAY!

Teaching the Digital Native

Let’s face it: using technology is difficult, and not everyone navigates the digital world equally well.

The readings for week 13 complicate the trope of “digital natives” that Marc Prensky described in 2001. According to Prensky, the “digital natives” of the millennial generation learn better with online tools because they have always used such tools and are most fluent in this environment. Danah Boyd points out that this depends on the context they bring to learning.

Students bring a variety of experience levels to undergraduate and graduate history courses, and professors should not presume a level field. Allison Marsh had problems getting her graduate public history students to engage with digital material as she had hoped they would. Even when they completed the assignments, the results were rarely successful public history exhibits. And these are skills that they will certainly use in almost all public history careers.

Our Clio 1 class also represents this diversity of experience levels, and success with some of the digital tools in our class has varied (this doesn’t necessarily track with a student’s generation). However, I’m truly looking forward to the final projects of my classmates because I know they’ll produce something creative and interesting.

As Dan Cohen has written, digital history is precisely what sets GMU apart from other History PhD programs in Virginia. “Pragmatic and prescient,” the use of the digital in teaching is seen as an inevitable solution, sometimes to the problem of relevancy in the digital age. We need to teach students “where they are”- online. And online is also where those teaching the discipline are, even if they don’t always admit it.

In looking at some of Mills Kelly’s “hoax” courses, students learned most successfully when they brought their enthusiasm for the digital but were taught rigorously to think historically. Some may (and did) have problems with Kelly’s teaching methodology, but no one disputes that the students genuinely worked hard on their hoaxes, learning history in the process. Students also brought this excitement with them to Nicolas Trepanier’s historical gaming courses. The key in both cases was leveraging the excitement for the digital, whether creating or playing, to teach skills that are not inherently digital at all: evaluating sources, approaching historiographical interpretations, and analyzing arguments.

The common thread that I take away from these readings is that the experience of students should not be taken for granted. Given enough support for students at all levels, instructors can use their excitement for digital and social tools to encourage thinking historically. If you make history fun while teaching its core ethics, students will learn how to think historically and how to produce history in the digital age. Though the intrinsic “digital native” does not exist in the wild, I’m all for using the tools that students or members of the public (in the case of public history projects) find most engaging. I was excited to see that the National Park Service is interpreting history with mobile phones on the Mall and elsewhere. If the public can think critically about their history, online or off, it’s a win for our work.

Publishing My MA Thesis Online

In January 2014, I uploaded an article-length draft of my master’s thesis on school desegregation in Roanoke, Virginia to Academia.edu. It was a tough decision for me to make: will college or high school students plagiarize me? By putting this online, will scholarly journals or essay collections pass on it? Will it receive any readership, or merely eliminate me from consideration for more august publications?

I had read the AHA’s statement on the embargoing of dissertations (it had come out in July 2013), and I feared rejection from journals because my work was publicly available. Obviously, publication of an MA thesis is not as critical as turning a dissertation into a monograph, an act on which careers can sometimes be made (or ended). Yet the same issues felt like they were at play. Ultimately, I took a deep breath and hit the “publish” button.

It would be nice if this story fit into the narrative described by Rebecca Anne Goetz (and published on the blog of Johns Hopkins University Press). But no bigshots at preeminent journals have found my thesis and demanded its publication. Still, the paper has received over 300 views. I can tell from search keywords that many of the viewers are members of the public or even students (former and current) of the Roanoke schools I wrote about. I’ve been contacted by a French novelist who sets her historical fiction during the Civil Rights Era and by a Virginia Tech education professor who used my piece with public school teachers in Roanoke.

It’s too early to call my decision a success (JAH, call me!). I currently believe it’s been a qualified success, precisely in ways I did not expect. Academia.edu has a reputation as the “Facebook” of scholars (one of many), so I expected historians in my field (“my audience”) to find the paper and perhaps comment. In reality, members of the public with an interest in the history of schooling in Roanoke read my paper, and the primary academic interest I received came from outside of my field, in the education department.

I think this speaks to the importance of open access and creative commons licensing, as work online will inevitably be used in ways that the creator did not expect. Academia.edu allows me to keep the rights to my publication, while still pushing it to the public.

I do respect the position taken by AHA President William Cronon, that the suggested embargo is meant to protect “our most vulnerable colleagues.” Cronon is sincerely attempting to help junior faculty, but I find more compelling Trevor Owens’ argument for a bizarro AHA in which the goals of the scholarly society align with the public good rather than with a traditional model of publishing books.

I still worry that my decision to go open access was too naive, but in the meantime, someone is searching for my work on Google.

Peer Review in a Digital Age

This week’s readings touched on many aspects of publishing digital scholarship, from the formal (Ayers and Thomas’s work) to the more informal (blogs and Twitter). One of the underlying tensions between works produced for a digital environment and traditional scholarship is a perceived difference in authority. In one of the earliest class readings, Gertrude Himmelfarb argued that “[e]very source appearing on the screen has the same weight and credibility,” with digital scholarship standing in contrast to the rigorous system of peer review used by academic journals and publishing houses.

As described in Cohen and Rosenzweig, Hitchcock, and the authors of Writing History in the Digital Age, the cost of publication and distribution on the web approaches zero. Thus the professional machinery that limited traditional scholarship to the best and brightest work that would fit in a physical journal or university press monograph is no longer constrained by resources or the cost of production. Yet the premier journals are still limited to what would traditionally fit in the “dead tree” version: 3-5 articles, book reviews, and front and back matter.

Digital publications are not constrained in these ways, but they often lack the bedrock of many professional journals: a system of peer review. Online practitioners recognize this gap, however, and systems are beginning to emerge to address the problem of “authority.” As such, online peer review exists on a continuum of approaches.

Joan Troyano described the peer review process used by the Journal of Digital Humanities. The reviewers were usually graduate students or the journal’s editors, not necessarily scholars in the fields being discussed. This is a “peer” review, but not review by one’s peers in a particular academic discipline. Without the stamp of approval from a discipline’s gatekeepers, the perception of “authority” is diminished, even as access for the public far exceeds that of gated articles open only to academics at universities. Yet the JDH attracted “an astounding” level of interest for a virtually new journal. And I think Joan’s description of the material as a “middle state” serves a very useful purpose: the editors “collect the good,” which generates interest and hopefully can press those publications toward a final level of peer review before publication in a discipline’s most prestigious journals. It’s a very effective way to “filter” the flood of online publications.

The peer review system for Writing History in the Digital Age provided an interesting mix of newer open methods and traditional “experts” (who even had the opportunity to send anonymous comments). As a result, Writing History in the Digital Age might have a level of polish that exceeds the JDH (to be fair, the projects have very different aims). I also think its authors are correct that in the digital environment authority is much more fluid and based on reputation. Gertrude Himmelfarb is wrong that authority is thrown out the window on the web, but historians are right to worry about authority. In an online world of billions, a dozen views carry little authority.

In the digital world, reputation is just as powerful, but also more fleeting. Within the “publish-then-filter” model described by Cummings and Jarrett, the usefulness of the “filter” determines its value for the reader. We trust the New York Times to “filter” the most important news and conversations for us, just as I trust William Cronon to “filter” the best environmental history and news on his Twitter feed.

The other pole of peer review for digital projects is represented by articles like Ed Ayers and William Thomas’s “The Differences Slavery Made,” which went through a formal peer review process and was published in the AHR. It had all the “authority” and professional recognition of traditional scholarship, but some scholars did not treat the web companion like the published article, declining to interact with it. Though Ayers and Thomas had high aspirations of changing the form of scholarship, the traditional peer review and gatekeeping made their work very recognizable to the profession: a good and a bad outcome.

Digital scholarship needs all of these levels of peer review, but I believe the most successful peer review will mirror best practices on the web, where “expert” users are identified as such but all users are free to comment on a work. A system similar to Writing History in the Digital Age’s could plausibly work for journals in some fields if given a chance.

Why Play the Past?

This week we discussed gaming in history scholarship. This is not the “history of video games,” but the use of gaming to teach and explain history. But why are we having this discussion in the first place? If all the readings argue that games can play a role in historical scholarship, why use games instead of traditional forms of narrative?

Trevor Owens argues that games have a large audience. This is certainly true: video games are a multi-billion dollar industry that continues to grow aggressively. Owens’s most-read blog posts (by a large margin) are on Colonization and Fallout 3. Yet I wonder whether those scores of users are the board-game and mod-focused players historians could speak to, or whether they are simply interested in playing a “fun” game. This necessarily limits the potential topics that a historical game could address (“playing” slavery would not be appropriate).

Owens, Adam Chapman, and the other authors also focus on video games’ “potential to make explicit and operationalize the models of change over time.” These are the “rules” which guide historical agents and events. Chapman argues that the rule-making nature of video games allows users to internalize the “rules” of the fictional process while simultaneously understanding how the imaginary world of Civilization relates to the real historical world as described in Kennedy’s Rise and Fall of the Great Powers. The synergy between the virtual world (and its algorithmic rules based on real-world interactions between civilizations) and the historical world as explained by Kennedy’s monograph succeeds in this instance, but only if the user privileges the “form” (the algorithms) over the “content” (the game’s outcomes).

The game’s rules must not be so rigid as to limit natural exploration of “the world,” nor so loose as to let ahistorical interpretations or events reign. If Hitler wins World War II in Call of Duty but the gamer learns something about the realities of modern mechanized warfare, does the educator consider it a success? On the most basic level, no. The historical game must walk a fine line between being ahistorical and being too predetermined, a problem for The Lost Museum, whose users figured out “the rules” by rote without exploring the historical realities behind them.

There is definitely a place for video games in historical education, if only as a way to reach students and adults and provide a unique, immersive presence in the historical world. I primarily attribute my early passion for history to a string of fantastic teachers in eighth through twelfth grade. But I would be remiss not to mention the long hours playing Caesar, Civilization, and Colonization on my desktop computer during middle school and high school. The immersive experiences with Dutch colonies, Praetorians, and the impact of gunpowder on society certainly shaped my historical imagination.

The problem is that these games are huge commercially driven enterprises with large staffs of developers, designers, and artists, and big bankrolls. Achieving that level of immersion is orders of magnitude more difficult than the levels which The Lost Museum and Pox and the City attempted and achieved. So I think Chapman, Owens, Joshua Brown, and the Pox creators are correct that great historical scholarship can be created. Historical gaming just needs the proper topic and its own massive “Valley of the Shadow” project to prove the concept.

Crowdsourced History: Cathedral or Bazaar?

This week’s readings discussed crowdsourcing and how it affects historical projects. Every large-scale digital history project which invites input from the public must make a conscious decision: will the project’s leaders vet all contributions, or will they allow the chaos of the crowds to lead the way? To take the analogy of Eric Raymond’s The Cathedral and the Bazaar, will the architects watch closely as the public carefully place the bricks to create an immaculate cathedral (Transcribe Bentham), or will the crowd create a chaotic but successful infrastructure based on a few guidelines (Wikipedia)? Raymond, a free and open-source software (FOSS) advocate, unsurprisingly concludes that the bazaar-style method was a great success. While Wikipedia’s success has shown that chaos can work at certain scales and for certain aims, it’s not appropriate for all projects.

Roy Rosenzweig’s article offered one of the most direct comparisons of projects with vastly different levels of authority. Directly comparing Wikipedia’s biographies with the top-down American National Biography Online, it’s clear that the professionally written articles are of a higher quality than Wikipedia’s. Yet at a cost of millions to create, and with very high access fees, ANBO is not nearly as efficient in creating a reference as Jimmy Wales’s bizarre bazaar of knowledge.

For certain historical projects, a distributed model is appropriate. Rosenzweig suggests that the survey textbook would be an ideal candidate for an open source, distributed creation. And History Matters continues to be a successful and freely available source (though I don’t know how close its production was to Roy’s suggested model of open contribution with a central vetting committee).

Trevor Owens also touches on authority in crowdsourced projects and assuages the concerns of cultural resource professionals: these projects have a long history in non-digital forms, and the “crowds” are actually the engaged and knowledgeable amateurs whom professionals are trying to reach. This makes the relinquishing of authority more acceptable in many contexts, like NARA’s Citizen Archivist, the NYPL’s menus, and even Transcribe Bentham.

Yet in the Bentham transcription project, the required quality of transcription was extremely high. The transcriptions would be published in online or even physical formats, so the project directors required near perfection of the texts. Because of this high bar, a close review of each transcription was needed, and this ultimately made the project fairly inefficient in how its paid employees’ time was spent. For many, the project failed because staff moderating the volunteers produced fewer documents than if they had simply spent their time transcribing. Had a Wikipedia style of authority directed transcription review, perhaps the project would have produced more work. Giving up some authority (to volunteer editors) might have increased quantity with a possible reduction in quality. But would the Bentham Papers then achieve the high quality expected of this type of project? Likely yes, but even the perception of diminished quality might make the project less well regarded among the academics who would use it. Here again, relinquishing authority would cause concern further down the professional historical pipeline.

In last week’s readings and this week’s, the relinquishing of authority has become a major litmus test for public history professionals. For Carl Smith, “serious history” online came from historians and museum professionals only. Yet Cleveland Historical succeeded because its creators reached out to the community for oral history stories, giving up some of the authority inherent in top-down projects (though individual stories still received proper vetting). In collecting material for the Hurricane Digital Memory Bank, freedom from “the Authorities” was required to earn trust, and this was only achieved by giving participants complete control over their stories. In this case, the HDMB projected its authority by creating a safe place for the public to entrust their stories. History on the web need not create the cathedral, but it needs more structure than the bazaar. A project’s magic is in the proper balance.

Virtual Museums

This week’s readings focused on using digital technologies to enhance web-based museums. This is an area where I knew very little about the theory and practice behind museum studies and digital curation. Some of the most interesting articles described how users are engaging with public history “in the wild.”

In Mark Tebeau’s description of Cleveland Historical, 20% of people accessed the web via a mobile device (though Tebeau cited 2010 and 2013 articles). It seems that number might be as high as 55% now, though the statistics I found vary. With the movement towards mobile, the impact of Cleveland Historical grows. And it’s already a very powerful site, in my opinion. The intersection of thoughtful digital technology, local oral histories, an intuitive map-based arrangement of stories, and useful features like “tours” makes Cleveland Historical stand out. It could be that I just love local history and oral history (having worked on this oral history project), but it seems very effective as a public history tool. Students can learn about their city, and tourists or local residents can delve more deeply into its neighborhood histories. The focus on the aural and the spatial also sets it apart from many digital history projects, which prioritize the visual.

Another success for Cleveland Historical was its SEO outreach. For any digital endeavor, historical or not, being found in a search result is paramount to a project’s success. If the public can’t find your work, then it won’t impact the public. I did some Google searches using various “stories” or Cleveland areas as search terms, and Cleveland Historical generally popped up at the top of the results. This has long been a strength of Wikipedia, and it is a requirement for successful digital projects.

Cleveland Historical did have one drawback, though this is slightly unfair because it falls somewhat outside of CH’s scope: the stories are generally not very academic. Each “story” provides interesting narratives about local areas with engaging oral histories, but they don’t add up to any overarching argument. As Wyman and others point out, most digital projects either delve into many topics shallowly or few topics deeply. Cleveland Historical chooses breadth, which is not to say it isn’t “serious history.”

It’s unclear whether Carl Smith would place Cleveland Historical into the category of “serious history,” but he argues that his own project on the Great Chicago Fire would certainly qualify. The project had grand ambitions, and might have been unique in the early days of the web, but the curation of its content did not impress me. The essays, galleries, and libraries seemed a chore to wade through. I do acknowledge that when I focused on the actual content, the prose and photos were interesting and enjoyable to read or view. It simply did not succeed as a web project in either the 1996 or 2011 version.

For me, Melissa Terras and Tim Sherratt had the most interesting articles. The Cleveland and Chicago local history projects were good or bad in expected ways, judged against Wyman et al.’s best practices and how Lindsay’s “virtual tourists” might engage with the content. How historical content is used on Sherratt’s Trove, and the variety of most-accessed objects that Terras describes, were far more unexpected.

In some ways, Sherratt rehashes the long-standing good actor/bad actor uses of history that have always existed in digital or analog formats. But by using trackback link technology, he can truly suss out every use of Trove on the web. Unfortunately, the bad uses of history (by climate change deniers, for example) threatened to overshadow the positive uses. Yet I think Sherratt is correct that institutions need to trust in the overall “good will” of the audience. His powerful “Faces of White Australia” flows from the Australian National Archive’s open policies on content use. The popular digital objects described by Terras also flow from similarly open policies. Though her article is more of a catalogue of popular cultural heritage sites, it is ultimately the sharing web that underlies many of the most popular uses. Without the social web, I never would have known about Wales’s most wanted dog.

[Image: dog with a pipe in its mouth]

The Scale of Digital

The question of scale is one which all historians must consider in their work. Is the project a microhistory that explains larger movements with representative (or unique) examples? Does it explain the grand movements of history across time and space from a distant perspective instead? All of the projects in this week’s readings engage in the question of scale by allowing users to freely navigate between near and distant readings of their subject. Ed Ayers and Scott Nesbit explain how digital history projects can make the leap between the micro and the macro with their work on Visualizing Emancipation.

Ayers and Nesbit successfully move from the overarching legal and political complexities influencing abolition on a national scale to how emancipation played out among individual slaves in the Shenandoah Valley. I felt their article was very clear and powerful in unpacking the web project. The map was also interesting, though less powerful in its explanation of space and time. There were so many individual events occurring during the time lapse (movements of the Union Army without associated emancipation events, for example) that it was difficult to follow the argument made so successfully in their article. The map did allow searching the events’ metadata, which was useful for viewing material related to particular towns or people.

The other projects, ORBIS and Digital Harlem, successfully displayed the power of scale. ORBIS is essentially an argument about scale, literally modeling the time it took information or goods to move around the Roman Empire. By including the farthest cities when analyzing a network, the scale of the empire becomes clear. Yet beyond the empire’s scale, few other insights are available. It could be useful for a scholar of military history to know the exact time a letter took between London and Rome when examining decisions in a military campaign, or to compare the “networks” of different cities in the hinterlands of the Empire. But I suspect the accuracy a scholar would require isn’t served by the best guesses of ORBIS’s creators. I do admit that the map functions look really attractive and would be very useful in educational contexts with students at a variety of levels.

I thought Digital Harlem also navigated the issues of scale with elegance. By showing clusters of interaction, users can see both larger historical patterns in Harlem and the individual interactions themselves. By including the other uses of each address, certain aspects of the neighborhood were quite successfully recreated. Both the individual scale and larger time scales can be explored.

The question of scale has also been at the forefront of my mind as David Armitage and Jo Guldi advocate for longer timescales and deeper narratives so that historians can reclaim their place in public intellectual discussion. They are correct that historians no longer have the ear of policy makers as we once did: columnists for major papers include economists like Paul Krugman, but few to no historians. In The History Manifesto, Armitage and Guldi see longer time and space scales as a way to influence policy discussions. By examining the long term, we can avoid “short-termism” (nice Google Ngram usage in the introduction). Their argument links well to the strengths of digital history, which allow shifting between large timescales without obscuring the individual, as economists are prone to do.

Connecting the Disconnected

Our reading assignments have generally been grouped into a few themes each week: web presence vs. internet history, old vs. new definitions of digital history, and practical vs. theoretical digitization. This week offered a wider variety of material, though each piece falls along the axis of case study, practical tutorial, or theoretical essay. Lauren Klein’s article on James Hemings and his presence/silence in the letters of Thomas Jefferson encompasses all three categories.

The article provides a history, via network analysis, of someone who was virtually off the network. The history of slavery is filled with similar stories of reclaiming a voice silenced by that institution’s systemic repression, but this is a stark case emanating from the digital world. James Hemings’ story was not the kind of existence history should ignore: he held a skilled position with which he won his freedom, was literate, and made his way into the historical record. Yet he was almost invisible within the Papers of Thomas Jefferson. Klein finds his story anyway. This is an interesting work of digital history theory because the important part of Klein’s work dismantles the structure of the digital archive, moving from the piece of correspondence (the letter or document) down to the word, in this case the name James Hemings. We have gone from dismantling the archive in the first weeks to dismantling the document. While dismantling the archive with keyword searching can be dangerous given false negatives and poor OCR, Klein uses the individual names in letters to provide a view of Jefferson’s slave/free black network in a rigorous, reproducible way.
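As an aside on those false negatives, here is a minimal sketch of fuzzy matching, one common way to soften OCR noise in keyword search. This is my own illustration, not Klein’s method; the sample letter text and the `fuzzy_find` helper are invented:

```python
# A minimal sketch (not Klein's code) of fuzzy keyword search, which can
# mitigate the false negatives that poor OCR introduces. The sample text
# and helper are hypothetical illustrations.
import difflib

def fuzzy_find(target, text, cutoff=0.8):
    """Return words in `text` that approximately match `target`,
    catching OCR variants like 'Hemmings' or 'Heming5'."""
    words = text.split()
    return difflib.get_close_matches(target, words, n=10, cutoff=cutoff)

letter = "James Hemmings prepared the dinner; Heming later wrote to Jefferson."
print(fuzzy_find("Hemings", letter))
# ['Hemmings', 'Heming'] -- both variants survive, unlike an exact search
```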

The article also provides a case study, because finding the invisible in the archives, even digital archives, is a prototypical task of the historian, especially when the “archive” is the papers of the writer of the Declaration of Independence and represents a classic American archive. A common historical task is bringing a silent historical actor to the fore: a slave, an oppressed political actor, or a common worker. Traditionally, historians find these stories in the liminal spaces of the archives. Klein finds Hemings’ story in those same liminal spaces, but by a digital method. The article veers into a tutorial because she provides the exact steps she took (search for Heming papers > named entity recognition on 51 JH letters > network analysis). Presumably, the Python code could easily be made available, making it a reproducible case study.
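Because the steps are so clearly enumerated, something like the following could reproduce the pipeline’s shape. This is a hedged sketch, not Klein’s actual code: spaCy and NetworkX are stand-in tools, the NER output depends on the model, and the two sample “letters” are invented placeholders:

```python
# A rough sketch of the search > NER > network-analysis pipeline Klein
# describes, with spaCy and NetworkX as stand-ins for her actual tools.
# (Requires: python -m spacy download en_core_web_sm)
import spacy
import networkx as nx
from itertools import combinations

nlp = spacy.load("en_core_web_sm")

letters = [  # hypothetical stand-ins for the 51 Hemings letters
    "Thomas Jefferson asked James Hemings to train his successor.",
    "James Madison wrote to Thomas Jefferson about the Paris household.",
]

G = nx.Graph()
for text in letters:
    doc = nlp(text)
    people = {ent.text for ent in doc.ents if ent.label_ == "PERSON"}
    # Link every pair of people named in the same letter.
    for a, b in combinations(sorted(people), 2):
        w = G.get_edge_data(a, b, default={"weight": 0})["weight"]
        G.add_edge(a, b, weight=w + 1)

# Centrality hints at who anchors the correspondence network.
print(nx.degree_centrality(G))
```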

I also appreciated that Klein’s network visualizations were simpler than Kaufman’s analysis of the Kissinger memcons. It takes maturity to map only what is necessary for your argument and no more. According to Scott Weingart, if the visualization is not advancing your argument, it is not appropriate. And here is where Klein succeeds: her arc diagram does not include the entire Jefferson papers, but the subsets which advance her argument, either the elites of Virginia or those who mention James Hemings. The diagrams effectively show the interaction in Jefferson’s world between different classes, which would be found more or less easily using traditional historical methods.

The other article which particularly interested me was Johanna Drucker’s essay on graphical display in the humanities. It resonated with me because it provides such an interesting antidote to the traditional view of digital data, arguing instead that the digital humanities embrace capta. Drucker feels that the often ambiguous nature of the humanities, problematizing by design, should be represented similarly in our graphical displays. A similar argument was made against quantitative historians, so it is one we should bear in mind. Historians and digital humanists should never forget that one of our important disciplinary strengths is to keep the complexity of history at the forefront. The History Manifesto has encouraged me to compare our work to that of economists, who are so present in the public discourse. An important distinction when considering the complexity of life: economists work with data; humanists work with capta.

Week 5: Theorizing Text Mining

This week’s readings either used text mining as part of their research methodology or provided theories on its best practices and uses. What most struck me about text mining, and Ted Underwood’s treatment of it, was the connection to complex mathematical algorithms. These methods are incredibly complicated (word frequencies can tell us more about a given text than we might intuitively imagine, even its genre and author), yet they are applied in a seemingly simple manner: our searches are lumped into a list by “relevance,” which has been determined in a complicated way that is opaque to the researcher. This opacity of research methods was echoed in last week’s readings as well, though those using the techniques in their own work described their workflows much more clearly.
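To make that opacity concrete, here is a toy version of one common relevance scheme, tf-idf weighting with cosine similarity. Real search engines are far more elaborate, and the three sample documents are invented, but it shows how much math hides behind a “simple” ranked list:

```python
# A toy illustration of one common relevance scheme (tf-idf plus cosine
# similarity); real search engines use far more elaborate methods.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

docs = [  # hypothetical corpus
    "the cotton trade in Houston expanded rapidly",
    "railroad companies advertised Houston land",
    "Victorian science organized itself into fields",
]
vectorizer = TfidfVectorizer()
matrix = vectorizer.fit_transform(docs)

# Score each document against the query and print a ranked list.
query = vectorizer.transform(["Houston railroad"])
scores = cosine_similarity(query, matrix).ravel()
for score, doc in sorted(zip(scores, docs), reverse=True):
    print(f"{score:.2f}  {doc}")
```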

Ted Underwood described text mining of documents as usually accomplishing one of two tasks: 1. finding patterns in the documents that become the literary/historical argument in and of themselves, or 2. using the technique as an exploratory tool to highlight clues where additional work using traditional analysis would prove useful. Gibbs and Cohen’s, Blevins’s, and Kaufman’s articles generally fall into the first category. I felt that Blevins provided the strongest argument based on text mining. He answered the “so what” question by grounding his work in theories of “producing space” and “imagined geographies” in Houston, and he explained his research process in enough detail that readers can adequately question his methods and research decisions.

Gibbs and Cohen’s article was particularly detailed in explaining its methodologies, apt since it was part of a journal issue revolving around new digital techniques and “distant reading.” Their article even describes the failure of certain terms to show statistically significant change over time, for various reasons. In a traditionally researched narrative, the author wouldn’t describe letters that weren’t relevant to their search, but it’s important that digital historians describe not only their successes but also where a technique failed them in a surprising way. For example, they searched for instances of scientific “fields” to see when the Victorians began to consider them organized bodies of knowledge. This didn’t happen until very late in the 19th century, so there weren’t really enough entries to find a pattern, though the nearly 400 examples could provide new knowledge if given a traditional reading.
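A minimal sketch of the kind of trend measurement behind such claims, the relative frequency of a term by year, might look like this. The titles are invented placeholders, not Gibbs and Cohen’s actual corpus:

```python
# A minimal sketch of tracking a term's relative frequency over time,
# the kind of trend line behind Gibbs and Cohen's argument. The titles
# below are invented placeholders, not their data.
from collections import Counter

titles_by_year = {
    1860: ["on the field of natural science", "sermons on faith"],
    1890: ["the field of physics", "fields of organized knowledge"],
}

for year, titles in sorted(titles_by_year.items()):
    words = " ".join(titles).lower().split()
    counts = Counter(words)
    freq = (counts["field"] + counts["fields"]) / len(words)
    print(f"{year}: {freq:.3f} relative frequency of 'field(s)'")
```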

Micki Kaufman’s website is a very interesting example of the digital history project. She applies essentially the whole toolkit of text mining techniques to the body of Kissinger-related memos and summaries of telephone conversations. Some of these techniques were just exploratory, for example creating the 40 categories into which the documents might fit. (As a sidenote: is the default number of MALLET categories 40? Is there any reason for this coincidence in Kaufman’s and Nelson’s work?) While Kaufman had some of the most stunning visual elements, her work suffered from the lack of a uniting theme. It ultimately was more exploratory than argumentative.
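For readers unfamiliar with topic modeling, here is a sketch of what such a run looks like, using gensim’s LDA as a stand-in for MALLET (which Kaufman actually used). The tokenized “documents” are invented, and the number of topics is purely the researcher’s choice, which is exactly why the 40 in both Kaufman’s and Nelson’s work invites the question above:

```python
# A sketch of LDA topic modeling in the spirit of Kaufman's MALLET runs,
# using gensim as a stand-in (she used MALLET itself). The tokenized
# "documents" are invented placeholders for memcon text.
from gensim import corpora
from gensim.models import LdaModel

documents = [
    ["vietnam", "bombing", "ceasefire", "negotiation"],
    ["china", "visit", "normalization", "summit"],
    ["soviet", "detente", "arms", "limitation"],
]

dictionary = corpora.Dictionary(documents)
corpus = [dictionary.doc2bow(doc) for doc in documents]

# num_topics is the researcher's choice; 3 fits this toy corpus, where
# Kaufman's and Nelson's real corpora used 40.
lda = LdaModel(corpus, num_topics=3, id2word=dictionary, passes=10)
for topic_id, words in lda.print_topics():
    print(topic_id, words)
```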

Rob Nelson’s “Mining the Dispatch” provides some methodologies to review what is in the Richmond Dispatch during the Civil War and generally falls into the latter category of exploratory text mining projects. By assigning a variety of categories to the articles, he can see what sort of information Richmonders were reading and the changes to the frequency of those types of articles over time.

I would be curious to see use of Cameron Blevins’ techniques on the Dispatch in providing an “imagined geography” of the City of Richmond. I wonder if there would be much more familiarity with places south of the Mason-Dixon line than immediately north, even if those areas were geographically much closer (Baltimore, Maryland vs. Montgomery, Alabama). As the digital history field develops, hopefully we can sub in successful methods, like Blevins’, on different bodies of documents. This could lead to more standard digital workflows (as opposed to standard tools) which I think could benefit the field as a whole.

Reproducible Research

This week’s readings concerned databases and elegant, reproducible methods of getting historical knowledge out of them. Many of the themes were very familiar to me, echoing a graduate seminar on local archives I took in the past. The questions of authority, how to extract knowledge, and the role and actual work of the historian were at play in archives both digital and physical.

Tim Hitchcock’s chapter covered a number of interesting topics, but I fixated on his casting of the historian as an archives “expert.” The historian would toil at the archive, expertly pulling substantive examples from a huge collection of non-relevant material to support their work. The work would then be published, launching others to build upon the nuggets of information mined from the opaque archive. In the digital archive, anyone can search and unearth their own information, democratizing the historical process (which most agree is one benefit of the digital).

But I wondered where this leaves the professional historian. If the public doesn’t require someone with the peculiar expertise to determine the “correct” evidence, what is the point of our discipline and its requisite training? This was answered by Lara Putnam in “The Transnational and the Text-Searchable.” Putnam shows that just as historians once had to search the liminal spaces for their research topics, reading “against the grain” of the “historia patria” at the heart of many national archives’ missions, they must still do so in their digital searches. Searching online can provide the documents, but it often doesn’t provide the context or the local knowledge of a physical archive.

The hazards of new digital methods were also echoed in McDaniel, Mussell, Nicholson, and Spedding. This is where the professional historian can provide value over the amateur. In gleaning the meaning hidden in the cracks of the databases, understanding the technologies underlying the documents (the limits of OCR, for example), and providing reproducible research methods, a professional historian can still provide value in the pursuit of the past.

One of McDaniel’s points about reproducible research methods was to cite the electronic version of a document if you actually used the electronic version. In her 2014 article, Lara Putnam does so, but still has broken links. The link is broken because she mistyped the link to this page, and the reader might not be able to find the object without conducting their own search. This speaks to one of the major problems with citing electronic objects: they might not exist at that address in perpetuity, unlike a physical book with an LC call number. So this is the other place where professional historians will need to evolve our methods, to provide truly reproducible research that ensures our peers or the public can follow us through the evidence we find.
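A small sketch of the kind of pre-publication check that would catch a mistyped link, using the requests library; the URLs below are placeholders, not Putnam’s actual citations:

```python
# A small sketch of checking cited URLs before publication, the kind of
# step that would catch a mistyped link. The URLs are placeholders.
import requests

citations = [
    "https://example.org/archive/item/123",
    "https://example.org/this-link-is-mistyped",
]

for url in citations:
    try:
        # HEAD fetches only headers, so the check stays lightweight.
        resp = requests.head(url, allow_redirects=True, timeout=10)
        status = resp.status_code
    except requests.RequestException as exc:
        status = f"error ({exc.__class__.__name__})"
    print(f"{status}  {url}")
```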

But the readings don’t quite address a reproducible workflow from start to finish, a set of instructions: 1. go to X database, 2. use Y search term, 3. organize the data in Z way. One of the strengths of computers is that they run a set of instructions the same way given the same parameters and inputs, so digital history should leverage this ability (a sketch of what such a scripted workflow might look like follows below). It would definitely look more like William Turkel’s vision of history and be barely recognizable to those who rely exclusively on physical documents in archives for their research. Yet the databases *are* different, and we need to recognize that fact and leverage it where it’s useful: to search across traditional collections, as in Nicholson’s media culture history, while staying wary of the possible pitfalls of divorcing documents from their sense of place, as Putnam points out. Manovich is correct that databases utilize a sense of narrative distinct from traditional history, so the discipline can’t continue to pretend we work the same way we always have.
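As a hedged illustration of that recipe, parameterized so a peer could rerun it exactly: the database URL and the JSON response shape are hypothetical, since any real archive’s API would differ:

```python
# A sketch of the start-to-finish recipe imagined above: one function,
# fixed parameters, same output every run. The database URL and JSON
# fields are hypothetical; a real archive's API would differ.
import csv
import requests

def run_workflow(database_url, search_term, out_path):
    """1. go to X database  2. use Y search term  3. organize as CSV."""
    resp = requests.get(database_url, params={"q": search_term}, timeout=30)
    resp.raise_for_status()
    records = resp.json()["results"]               # hypothetical response shape
    records.sort(key=lambda r: r.get("date", ""))  # step 3: organize
    with open(out_path, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=["date", "title", "url"])
        writer.writeheader()
        for r in records:
            writer.writerow({k: r.get(k, "") for k in ("date", "title", "url")})

run_workflow("https://example.org/api/search", "imagined geography", "results.csv")
```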