Category: hist696

Voices of the 1969 Santa Barbara Oil Spill

Following a series of cascading mistakes and accidents, a well on the Union Oil Company's Platform A blew out on January 28, 1969. The blowout spilled about 100,000 barrels of oil along the shores of Santa Barbara, California, according to Coast Guard estimates, the largest spill in California's history.

Beaches of Santa Barbara Harbor, 1969. By Antandrus at en.wikipedia [Public domain], via Wikimedia Commons.
The spill caused massive damage to the coastal areas of Santa Barbara County, but it also contributed to increased environmental activism. The passage of the National Environmental Policy Act and the first Earth Day both occurred in the wake of the oil spill. Commentators have specifically noted the increase in grassroots activism on environmental issues as a result of the spill. The Santa Barbara News-Press was particularly vociferous in its opposition to offshore oil development and penned many editorials demanding changes in how the Union Oil Company and the Department of the Interior conducted operations.

Readers responded similarly, penning numerous letters to the editor issuing demands about oil drilling, corporate welfare, and protection of the environment. One useful artifact of these letters is that most include the writer's address. In a sense, each letter is anchored to the place and time in which its author penned it.

Using letters to the editor of the News-Press transcribed by Darren Hardy, along with letters to the Los Angeles Times and New York Times that I transcribed, I intend to map the geography of protest about the 1969 Santa Barbara Oil Spill. After topic modeling the letters with MALLET, one frequently occurring topic was characterized by the following words:

 oil drilling federal beaches offshore ocean fish government birds historical point children wildlife channel pollution secretary damages coastline leases

I interpret this topic as expressing a local concern with pollution, rooted in Santa Barbarans' anger over the fouling of their channel and coastline. This topic also describes nature more specifically than another, more general "nature" topic, which included "rivers," "lakes," and "planet." Using CartoDB (which is essentially Google Maps on steroids), I mapped both the locations of the letters and how strongly the local protest topic appeared in each one.
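For readers curious about the mechanics, here is a minimal sketch of how per-letter topic weights from a MALLET doc-topics file might be joined to geocoded letter addresses to produce a CSV that CartoDB can ingest. This is not my exact pipeline: the file names, column layout, and the choice of "topic_7" as the local protest topic are all hypothetical placeholders.

```python
# Minimal sketch: join MALLET doc-topic proportions to geocoded letters
# for upload to CartoDB. File names and columns are hypothetical.
import pandas as pd

# A doc-topics file (from MALLET's --output-doc-topics option): doc id,
# source path, then one proportion column per topic. Adjust if your
# MALLET version emits topic/proportion pairs instead.
doc_topics = pd.read_csv("doc_topics.txt", sep="\t", header=None, comment="#")
n_topics = doc_topics.shape[1] - 2
doc_topics.columns = ["doc_id", "source"] + [f"topic_{i}" for i in range(n_topics)]

# Hypothetical hand-built table: one row per letter with its newspaper,
# writer's address, and latitude/longitude from geocoding.
letters = pd.read_csv("letters_geocoded.csv")  # columns: source, paper, lat, lon

# Keep only the "local protest" topic weight and join on the source file name.
LOCAL_PROTEST_TOPIC = "topic_7"  # whichever topic matches the word list above
merged = letters.merge(doc_topics[["source", LOCAL_PROTEST_TOPIC]], on="source")
merged = merged.rename(columns={LOCAL_PROTEST_TOPIC: "local_protest_weight"})

# CartoDB can style points by this numeric column once the CSV is uploaded.
merged.to_csv("letters_for_cartodb.csv", index=False)
```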

Expand the map to fullscreen for best viewing and be sure to zoom into the Santa Barbara area.

 

The offshore oil platforms that existed in 1969 are shown in red (data from the Bureau of Ocean Energy Management). I expected most of the opposition to come from those living near the sites of environmental harm, such as coastal beaches, but some of the strongest voices came from wealthy neighborhoods like Westside Santa Barbara. Letters from Ithaca, New York, and southern Connecticut had strong probabilities of the "local protest" topic, which I didn't expect. With further geographic analysis, additional insights could be gleaned from the letters as they're situated in space and time.

One of the prime limiting factors in this project was the relatively small number of letters reviewed for topic modeling. I analyzed twenty-six letters: fourteen from the Santa Barbara News-Press, six from the Los Angeles Times, and six from the New York Times, to capture both regional and national perspectives. Twenty-six letters was a decent number for creating an interesting map, but not enough to generate quality topics. In the future, using natural language processing, more letters to the editor, and additional geographic analysis, historical topics could be examined for what an author's location might contribute to their political writings. As this was a sampling of letters, gaps in perspective remain, and ideally a conservative paper would help flesh out the different perspectives on offshore drilling that existed nationally.

Additional analysis of the letters in Voyant:

 

Teaching the Digital Native

Let's face it: using technology is difficult, and not everyone navigates the digital world equally well.

The readings for week 13 complicate the trope of "digital natives" that Marc Prensky described in 2001. According to Prensky, the "digital natives" of the millennial generation learn better with online tools because they have always used such tools and learn best and most fluently in this environment. Danah Boyd points out that this depends on the context they bring to learning.

Students bring a variety of experience levels to undergraduate and graduate history courses, and professors should not presume a level playing field. Allison Marsh found that her graduate public history students did not engage with digital material as she had hoped they would. Even when they completed the assignments, the results were rarely successful public history exhibits. And these are skills they will certainly use in almost all public history careers.

Our Clio 1 class also represents this diversity of experience levels, and success in using some of the digital tools in our class has varied (this doesn't necessarily track with the generation of the student). However, I'm truly looking forward to my classmates' final projects because I know they'll produce something creative and interesting.

As Dan Cohen has written, digital history is precisely what sets GMU apart from other history PhD programs in Virginia. "Pragmatic and prescient": using the digital in teaching is seen as an inevitable solution, sometimes to the problem of relevancy in the digital age. We need to teach students "where they are": online. And online is also where those teaching the discipline are, even if they don't always admit it.

Looking at some of Mills Kelly's "hoax" courses, students learned most successfully when they brought their enthusiasm for the digital but were taught rigorously to think historically. Some may (and did) have problems with Kelly's teaching methodology, but no one disputes that the students genuinely worked hard on their hoaxes, learning history in the process. Students also brought this excitement with them to Nicolas Trepanier's historical gaming courses. The key in both cases was leveraging the excitement for the digital (creating or playing) to teach skills that are not inherently digital at all: evaluating sources, approaching historiographical interpretations, and analyzing arguments.

The common thread that I take away from these readings is that the experience of students should not be taken for granted. Assuming enough support for students at all levels, use their excitement for digital and social tools to encourage thinking historically. If you make history fun while teaching its core ethics, students will learn how to think historically and how to produce history in the digital age. Though the intrinsic "digital native" does not exist in the wild, I'm all for using the tools that students, or members of the public in the case of public history projects, find most engaging. I was excited to see that the National Park Service is interpreting history with mobile phones on the Mall and elsewhere. If the public can think critically about their history, online or off, it's a win for our work.

Debate over Digital Scholarship

Ben Brands and I led the class discussion about digital scholarship. We aimed to provoke sharp discussion early in the class with our most controversial question: 1. What does Tim Hitchcock mean when he says “the book is dead?” Is he right or overstating the case?

While we did provoke discussion, the class did not divide into the pro-book and anti-book factions that I expected. The discussion was measured in tone and concerned Hitchcock's themes about the structures of authority inherent in the book's form. While I had hoped the first question would inspire some fireworks to get discussion going, we did effectively address some of the structures of authority that Hitchcock described. Our discussion also ranged to the physical constraints of books and how digital scholarship might overcome them.

Though the first question did not engage the class in the way I expected, it was still an effective question. One of my questions which fell flat was: 5. William Thomas discusses the “fundamental components of scholarship” as evidence, engagement with prior scholarship, and a scholarly argument while his collaborator Edward Ayers identifies a broader concept of scholarship as that which “contributes, in a meaningful and enduring way, to an identifiable collective and cumulative enterprise.” How are these two views of scholarship related, and which is more valid and applicable to digital scholarship? What issues do each raise for digital scholarship?

I asked what I thought was an interesting question, and indeed a question that is fundamental to our discipline: what counts as history? The class's response: *crickets*. I think the question failed to elicit much discussion because it was too long, too amorphous, and too wordy. When discussing the much more concrete criteria for an AHR digital article award, the class engaged with how argument and evidence interact in traditional versus digital scholarship. They needed a little more structure in which to discuss the meanings of evidence and scholarship. I'm not sure that everyone in the class was convinced that non-linear arguments could exist, let alone be effective. I share that skepticism until there are more, and more convincing, examples of non-linear digital arguments.

I think this class falls at a useful place in the syllabus because we've had a chance to explore all the tools that make up digital scholarship. The week's articles provide context for where all of these tools might fit into the historical discipline. One counterargument is that by learning more about digital scholarship earlier in the semester, we would have written our blogs more effectively by harnessing the hypertextuality of the web. Ultimately, the placement is useful because it links many of the themes of the surrounding weeks, particularly the traditional disciplinary forces pushing against digital history.

Publishing My MA Thesis Online

In January 2014, I uploaded an article-length draft of my master's thesis on school desegregation in Roanoke, Virginia, to Academia.edu. It was a tough decision for me to make: will college or high school students plagiarize me? By putting this online, will scholarly journals or essay collections pass on it? Will it receive any readership, or will it merely eliminate me from consideration for more august publications?

I had read the AHA's statement on the embargoing of dissertations (it had come out in July 2013), and I feared rejection from journals because my work was publicly available. Obviously, publication of an MA thesis is not as critical as turning a dissertation into a monograph, an act on which careers can sometimes be made (or ended). Yet the same issues felt like they were at play. Ultimately, I took a deep breath and hit the "publish" button.

It would be nice if this story fit the narrative described by Rebecca Anne Goetz (and published on the blog of Johns Hopkins University Press). But no bigshots at preeminent journals have found my thesis and demanded its publication. Still, the paper has received over 300 views. I can tell from search keywords that many of the viewers are members of the public or even students (former and current) of the Roanoke schools I wrote about. I've been contacted by a French novelist who sets her historical fiction during the Civil Rights Era and by a Virginia Tech education professor who used my piece with public school teachers in Roanoke.

It's too early to call my decision a success (JAH, call me!). I currently believe it's been a qualified success, exactly in ways I did not expect. Academia.edu has a reputation as the "Facebook" of scholars (one of many), so I expected historians in my field ("my audience") to find the paper and perhaps comment. In reality, members of the public with an interest in the history of schooling in Roanoke read my paper. The primary academic interest I did receive came from outside my field, in the education department.

I think this speaks to the importance of open access and creative commons licensing, as work online will inevitably be used in ways that the creator did not expect. Academia.edu allows me to keep the rights to my publication, while still pushing it to the public.

I do respect the position taken by AHA President William Cronon, that the suggested embargo is meant to protect "our most vulnerable colleagues." Cronon is sincerely attempting to help junior faculty, but I find more compelling Trevor Owens' argument for a bizarro AHA in which the goals of the scholarly society align with the public good rather than with a traditional model of publishing books.

I still worry that my decision to go open access was too naive, but in the meantime, someone is searching for my work on Google.

Peer Review in a Digital Age

This week's readings touched on many aspects of publishing digital scholarship, from the formal (Ayers' and Thomas' work) to the more informal (blogs and Twitter). One of the underlying tensions between works produced for a digital environment and traditional scholarship is a perceived difference in authority. In one of the earliest class readings, Gertrude Himmelfarb argued that "[e]very source appearing on the screen has the same weight and credibility." Digital scholarship, in this view, stands in contrast to the rigorous system of peer review used by academic journals and publishing houses.

As described by Cohen and Rosenzweig, Hitchcock, and the authors of Writing History in the Digital Age, the cost of publication and distribution on the web approaches zero. Thus the professional machinery that limited traditional scholarship to the best and brightest work that could fit in a physical journal or university press monograph is no longer constrained by resources or production costs. The premier journals, however, are still limited to what would traditionally fit in the "dead tree" version: three to five articles, book reviews, and front and back matter.

Digital publications are not constrained in such ways, but they often lack what is the bedrock of many professional journals: a system of peer review. This lack of peer review is recognized by online practitioners, and systems are beginning to be put in place to address the problem of "authority." As such, online peer review exists on a continuum of approaches.

Joan Troyano described the peer review process used by the Journal of Digital Humanities. The reviewers were usually graduate students or the journal's editors, not necessarily scholars in the fields being discussed. This is a "peer" review, but not one by one's peers in a particular academic discipline. Without the stamp of approval of a discipline's gatekeepers, the perception of "authority" is diminished, even as access for the public far exceeds that of gated articles open only to academics at universities. Yet the JDH did attract "an astounding" level of interest for a virtually new journal. And I think Troyano's description of the material as a "middle state" serves a very useful purpose. The editors "collect the good," which generates interest and hopefully can press those publications toward a final level of peer review before publication in a discipline's most prestigious journals. It's a very effective way to "filter" the flood of online publications.

The peer review system for Writing History in the Digital Age provided an interesting mix of newer open methods and traditional "experts" (who even had the opportunity to send anonymous comments). As a result, Writing History in the Digital Age might have a level of polish that exceeds the JDH (to be fair, the projects have very different aims). I also think its authors are correct that in the digital environment authority is much more fluid and based on reputation. Gertrude Himmelfarb is wrong that authority is thrown out the window on the web, but historians are right to worry about authority. In an online world of billions, a dozen views carry little authority.

In the digital world, reputation is just as powerful, but also more fleeting. Within the "publish-then-filter" model described by Cummings and Jarrett, the usefulness of the "filter" determines its value for the reader. We trust the New York Times to "filter" the most important news and conversations for us, just as I trust William Cronon to "filter" the best environmental history and news on his Twitter feed.

At the other pole of peer review for digital projects are articles like Ed Ayers and William Thomas' "The Differences Slavery Made." It went through a formal peer review process and was published by the AHR. It had all the "authority" and professional recognition of traditional scholarship, but some scholars did not treat the web companion project like the published article, and simply did not interact with it. Though Ayers and Thomas had high aspirations of changing the form of scholarship, the traditional peer review and gatekeeping made their work very recognizable to the profession, a good and bad outcome.

Digital Scholarship needs all of these levels of peer review, but I believe that the most successful peer review will mirror the best practices on the web where “expert” users are described as such, but all users are open to comment on work. A system similar to Writing History in the Digital Age could plausibly work for journals in some fields if given a chance.

Why Play the Past?

This week we discussed gaming in history scholarship: not the "history of video games," but using gaming to teach and explain history. But why are we having this discussion in the first place? If all the readings argue that games can play a role in historical scholarship, why use games instead of traditional forms of narrative?

Trevor Owens argues that games have a large audience. This is certainly true. Video games are a multi-billion dollar industry that continues to grow aggressively. Owens's most-read blog posts (by a large margin) are on Colonization and Fallout 3. Yet I wonder whether those scores of users are the board game and mod-focused players that historians could speak to, or whether they were simply interested in playing a "fun" game. This necessarily limits the potential topics that a historical game could address ("playing" slavery would not be appropriate).

Owens, Adam Chapman, and the other authors also focus on video games' "potential to make explicit and operationalize the models of change over time." These are the "rules" which guide historical agents and events. Chapman argues that the rule-making nature of video games allows users to internalize the "rules" of the fictional process while simultaneously understanding how the imaginary world of Civilization relates to a real historical world as described in Kennedy's Rise and Fall of the Great Powers. The synergy between the virtual world (and its algorithmic rules based on real-world civilizational interactions) and the historical world as explained by Kennedy's monograph succeeds in this instance, but only if the user privileges the "form" (algorithms) over the "content" (video game outcomes).

The game's rules must not be so rigid as to limit natural exploration of "the world," nor so loose as to allow ahistorical interpretations or events to reign. If Hitler wins World War II in Call of Duty, but the gamer learns something about the realities of modern mechanized warfare, does the educator consider it a success? On the most basic level, no. The historical game must walk a fine line between being ahistorical and being too predetermined, a problem for the Lost Museum, whose users simply figured out "the rules" by rote without exploring the historical realities on which they were based.

There is definitely a place for video games in historical education, if only as a way to reach students and adults and provide a unique immersive presence in the historical world. I primarily attribute my early passion for history to a string of fantastic teachers in eighth through twelfth grade. But I would be remiss not to mention the long hours playing Caesar, Civilization, and Colonization on my desktop computer during middle school and high school. The immersive experiences with Dutch colonies, Praetorians, and the impact of gunpowder on society certainly shaped my historical imagination.

The problem is that these games are huge commercially driven enterprises with large staffs of developers, designers, and artists, and big bankrolls. Achieving that level of immersion is orders of magnitude more difficult than what The Lost Museum and Pox and the City attempted and achieved. So I think Chapman, Owens, Joshua Brown, and the Pox creators are correct that great historical scholarship can be created. Historical gaming just needs the proper topic and its own massive "Valley of the Shadow" project to prove the concept.

Crowdsourced History: Cathedral or Bazaar?

This week's readings discussed crowdsourcing and how it affects historical projects. Every large-scale digital history project that invites input from the public must make a conscious decision: will the project's leaders vet all contributions, or will they allow the chaos of the crowd to lead the way? To take the analogy of Eric Raymond's The Cathedral and the Bazaar, will the architects watch closely as the public carefully place the bricks to create an immaculate cathedral (Transcribe Bentham), or will the crowd create a chaotic but successful infrastructure based on a few guidelines (Wikipedia)? Eric Raymond, a free and open source software (FOSS) supporter, unsurprisingly determines that the bazaar-style method was actually a great success. While Wikipedia's success has shown that chaos can work at certain scales and for certain aims, it is not appropriate for all projects.

Roy Rosenzweig's article offered one of the most direct comparisons of projects with vastly different levels of authority. Directly comparing Wikipedia's biographies with the top-down American National Biography Online, it's clear that the professionally written articles are of higher quality than Wikipedia's. Yet at a cost of millions to create, and with very high access fees, ANBO is not nearly as efficient a way of creating a reference work as Jimmy Wales' bizarre bazaar of knowledge.

For certain historical projects, a distributed model is appropriate. Rosenzweig suggests that the survey textbook would be an ideal candidate for an open source, distributed creation. And History Matters continues to be a successful and freely available source (though I don’t know how close its production was to Roy’s suggested model of open contribution with a central vetting committee).

Trevor Owens also touches on authority in crowdsourced projects and assuages the concerns of cultural resource professionals by noting that these projects have a long history in non-digital forms and that the "crowds" are actually the engaged and knowledgeable amateurs whom professionals are trying to reach. This makes the relinquishing of authority more acceptable in many contexts, like NARA's Citizen Archivist, the NYPL menus project, and even Transcribe Bentham.

Yet in the Bentham transcription project, the required quality of transcription was extremely high. The transcriptions would be published in online or even physical formats, so the project directors required near perfection of the texts. Because of this high bar, a close review of each transcription was needed, and this ultimately made the project fairly inefficient in how its paid employees' time was spent. For many, the project failed because staff moderating the volunteers produced fewer documents than they would have by simply transcribing documents themselves. Had a Wikipedia-style model of authority directed the transcription review, perhaps the project would have produced more work. Giving up some authority (to volunteer editors) might have increased quantity with a possible reduction in quality. Yet would the Bentham Papers still achieve the high quality expected of this type of project? Likely yes, but even the perception of diminished quality might make it less well regarded among the academics who would use it. Here again, relinquishing authority would cause concern further down the professional historical pipeline.

In last week's readings and this week's, the relinquishing of authority has become a major litmus test for public history professionals. For Carl Smith, "serious history" online came from historians and museum professionals only. Yet Cleveland Historical succeeded because its creators reached out to the community for oral history stories, giving up some of the authority inherent in top-directed projects (though individual stories still received proper vetting). In collecting material for the Hurricane Digital Memory Bank, freedom from "the Authorities" was required to earn trust, and this was achieved only by giving participants complete control over their stories. In this case, the HDMB projected its authority by creating a safe place for the public to entrust their stories. History on the web need not create the cathedral, but it needs more structure than the bazaar. A project's magic is in the proper balance.

Curating Digital Objects

It seems simple: upload a half dozen objects to an online exhibit. Yet there are so many more decisions to consider. Categorizing each object, navigating the online copyright landscape, and properly presenting the objects each required thoughtful decision-making.

I had initially planned to upload transcribed letters to the editor of the Santa Barbara News-Press. These "voices" and my analysis of them will be my final Clio 1 project. But who owns a transcription of a letter published in a newspaper? The original letter writer? The newspaper? The transcriber? I requested permission from the transcriber, Darren Hardy, with the thought that he "must have" navigated this complex legal environment to post his versions online. In reality, he likely has not, because neither the News-Press nor the letter writers (or their heirs) have bothered to protect their rights in court. But copyright law is outside my area of expertise and beyond the scope of this post.

After determining that, with Darren's permission, I would post away, I attempted to upload each XML file to Omeka. Unfortunately, XML is not a permitted file type in Omeka. What the CMS giveth, the CMS taketh away. After attempting to transform the XML files into something more useful, I decided to upload some of my own personal photographs, though I would keep the "Santa Barbara Oil Spill" name for future work on that project. The photos came from a tour of Monticello that I chaperoned back in 2008 and represent a sort of prototypical public history experience.
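For anyone hitting the same roadblock, here is a minimal sketch of one way the XML transcriptions could be flattened into plain text that Omeka will accept. The folder names and the assumption that the markup can simply be stripped are hypothetical; real TEI-style transcriptions might need more careful handling.

```python
# Minimal sketch: flatten XML transcriptions into plain-text files that
# Omeka will accept as uploads. Paths and element handling are hypothetical.
import xml.etree.ElementTree as ET
from pathlib import Path

def xml_to_text(xml_path: Path) -> str:
    tree = ET.parse(xml_path)
    # itertext() walks every element and yields its text content in order,
    # effectively stripping all markup.
    chunks = (chunk.strip() for chunk in tree.getroot().itertext())
    return " ".join(chunk for chunk in chunks if chunk)

Path("letters_txt").mkdir(exist_ok=True)
for xml_file in Path("letters_xml").glob("*.xml"):
    plain = xml_to_text(xml_file)
    (Path("letters_txt") / (xml_file.stem + ".txt")).write_text(plain, encoding="utf-8")
```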

Uploading my own material allowed for easier metadata creation. I didn't need to sort the uploader (me), letter writers, newspaper editors, and transcribers into "creator," "publisher," "contributor," or "source" categories. Yet there were still difficulties. Does a photograph of an object get a "language"? (I decided no.) A photo with English text? (Yes.) Dates needed to be properly formatted (YYYY-MM-DD), and consistency across items was key.
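As a small illustration of that consistency work, here is a hedged sketch of normalizing assorted date strings to the YYYY-MM-DD form the Dublin Core date field expects. The input formats shown are hypothetical examples, not necessarily the ones in my records.

```python
# Minimal sketch: normalize assorted date strings to ISO 8601 (YYYY-MM-DD)
# for the Dublin Core date field. The input formats are hypothetical.
from datetime import datetime

KNOWN_FORMATS = ["%B %d, %Y", "%m/%d/%Y", "%Y-%m-%d"]

def to_iso(date_string: str) -> str:
    for fmt in KNOWN_FORMATS:
        try:
            return datetime.strptime(date_string, fmt).strftime("%Y-%m-%d")
        except ValueError:
            continue
    raise ValueError(f"Unrecognized date: {date_string!r}")

print(to_iso("June 14, 2008"))  # -> 2008-06-14
```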

Once the items were described and (thankfully) uploaded, I needed to place them into an exhibit. This was also slightly trickier to navigate in Omeka than I expected. Fortunately, all the work of adding images and metadata made the exhibit-building much simpler once I knew which buttons created pages and how to associate them with objects. I organized the pages into a vaguely narrative format, though the user can navigate them in any order.

My Omeka exhibit is here.

Virtual Museums

This week’s readings focused on using digital technologies to enhance web-based museums. This is one area where I knew very little about the theory and practice behind museum studies and digital curation. Some of the most interesting articles described how users are engaging with public history “in the wild.”

In Mark Tebeau's description of Cleveland Historical, 20% of people accessed the web via a mobile device (though Tebeau cited 2010 and 2013 articles). It seems that number might be as high as 55% now, though the statistics I found vary. With the movement toward mobile, the impact of Cleveland Historical grows. And it's already a very powerful site, in my opinion. The intersection of thoughtful digital technology, local oral histories, an intuitive map-based arrangement of stories, and useful features like "tours" makes Cleveland Historical stand out. It could be that I just love local history and oral history (having worked on this oral history project), but it seems very effective as a public history tool. Students can learn about their city, and tourists or local residents can delve more deeply into its neighborhood histories. The focus on the aural and the spatial also sets it apart from many digital history projects, which prioritize the visual.

Another success for Cleveland Historical was its SEO outreach. For any digital endeavor, historical or not, being found in a search result is paramount to a project's success. If the public can't find your work, then it won't impact the public. I did some Google searches using various "stories" or Cleveland areas as search terms, and Cleveland Historical generally popped up at the top of the results. This has long been a strength of Wikipedia, and it is a requirement for successful digital projects.

Cleveland Historical did have one drawback, and this is slightly unfair because it is somewhat outside of CH’s scope, but the stories are generally not very academic. Each “story” provides interesting narratives about local areas with engaging oral histories, but they don’t seem to add up to any overarching argument. As Wyman and others point out, most digital projects either delve into many topics shallowly or few topics deeply. Cleveland Historical chooses breadth, which is not to say it isn’t “serious history.”

It's unclear whether Carl Smith would place Cleveland Historical in the category of "serious history," but he argues that his project on the Great Chicago Fire would certainly qualify. The project had grand ambitions, and it might have been unique in the early days of the web, but the curation of its content did not impress me. The essays, galleries, and libraries seemed a chore to wade through. I do acknowledge that when I focused on the actual content, the prose and photos were interesting and enjoyable to read or view. It simply did not succeed as a web project in either the 1996 or 2011 version.

For me, Melissa Terras and Tim Sherratt had the most interesting articles. The Cleveland and Chicago local history projects were good or bad in expected ways, judged against the best practices of Wyman and others and the ways Lindsay's "virtual tourists" might engage with the content. How historical content is used on Sherratt's Trove, and the variety of most-accessed objects that Terras describes, were far more unexpected.

In some ways, Sherratt rehashes the long-standing good actor/bad actor uses of history that have always existed in digital or analog formats. But by using trackback link technology, he can truly suss out every use of Trove on the web. Unfortunately, the bad uses of history (by climate change deniers, for example) threatened to overshadow the positive uses. Yet I think Sherratt is correct that institutions need to trust in the overall "good will" of the audience. His powerful "Faces of White Australia" flows from the Australian National Archives' relatively open content-use policies. The popular digital objects described by Terras also flow from similarly open policies. Though her article is more of a catalogue of popular cultural heritage sites, it is ultimately the sharing web that underlies many of the most popular uses. Without the social web, I never would have known about Wales' most wanted dog.

Dog with a pipe in its mouth

Mapped Judges

Last week, I worked to create a network of non-white judges. I theorized that federal judges would come from the same networks of schools, often historically Black colleges and universities, due to segregated schooling in the US South. I was incorrect: there were few such networks, and over 200 different schools were represented among the 368 judges.

This week, I decided to map the birthplaces and schools of non-white federal judges to determine whether additional patterns might arise.

For best results, I'd recommend zooming into the map or maximizing it in a new window and viewing the United States or particular US regions (though internationally born judges do provide some interesting data points). I simply added the birthplace columns (city, state) and school. I gave these a spectrum of colors (five ranges) to indicate change over time by birth year: colleges received red shades and birthplaces yellow shades to distinguish them from one another.
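The five ranges could also be produced in code rather than by hand. A minimal pandas sketch, with a hypothetical CSV and column name:

```python
# Minimal sketch: bin judges' birth years into five ranges for map styling.
# The CSV and its column names are hypothetical.
import pandas as pd

judges = pd.read_csv("nonwhite_federal_judges.csv")  # includes a "birth_year" column

# Five equal-width bins across the observed birth years; each bin gets a
# label that can drive the color ramp in the mapping tool.
judges["birth_year_range"] = pd.cut(
    judges["birth_year"], bins=5,
    labels=["earliest", "early", "middle", "late", "latest"],
)
judges.to_csv("judges_with_ranges.csv", index=False)
```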

I expect the higher proportion of African American judges born in the South mirrors population concentrations, and likewise for Hispanic judges born in Texas and California. One interesting fact exposed by the map is the great distance that many judges seem to have traveled between birthplace and school. It would be interesting to see whether the same holds for white judges.

Unfortunately, Google doesn't provide this type of functionality, beyond manually calculating the distance for each judge.
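Outside of Google's tool, though, those distances could be computed directly from coordinates. A minimal sketch using the haversine formula; the latitude and longitude column names are hypothetical and assume the birthplaces and schools have already been geocoded.

```python
# Minimal sketch: great-circle distance (in miles) between each judge's
# birthplace and school, assuming both have already been geocoded.
import math

import pandas as pd

def haversine_miles(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in miles."""
    r = 3958.8  # Earth's mean radius in miles
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlam / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

judges = pd.read_csv("judges_geocoded.csv")  # birth_lat, birth_lon, school_lat, school_lon
judges["birth_to_school_miles"] = judges.apply(
    lambda row: haversine_miles(
        row["birth_lat"], row["birth_lon"], row["school_lat"], row["school_lon"]
    ),
    axis=1,
)
print(judges["birth_to_school_miles"].describe())
```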

Mapping the judges was incredibly easy, because I'd done the work last week of cleaning up the data first. I simply loaded the CSV into Google, played with the settings for a few minutes, and chose the light political basemap to highlight the data. Presto! A decent map.

But what makes this easy (the lack of options) also creates some mapping problems. The college networks are further de-emphasized because each college receives only one "pin," even though Howard, for example, has at least twelve federal judge alumni. Birthplaces suffer from the same issue (e.g., judges born in New York, NY). Using a method to "weight" pins by the number of alumni or hometowns would provide a more accurate representation of the patterns.
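A minimal sketch of the weighting I have in mind: collapsing the judges table to one row per school with an alumni count, which a mapping tool could then use to scale each marker. The file and column names are hypothetical.

```python
# Minimal sketch: one weighted point per school instead of overlapping pins.
# Column names are hypothetical.
import pandas as pd

judges = pd.read_csv("judges_geocoded.csv")  # school, school_lat, school_lon, ...

school_counts = (
    judges.groupby(["school", "school_lat", "school_lon"])
    .size()
    .reset_index(name="alumni_count")
)
# A mapping tool can now scale marker size or color by alumni_count,
# so Howard's dozen-plus judges no longer collapse into a single pin.
school_counts.to_csv("schools_weighted.csv", index=False)
```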

The school data was also more suspect than the birthplace data. I'm unsure why Brown University, for example, is plotted over the University of Virginia. Why Google placed Jersey State College over State College, Pennsylvania, is more obvious, but it is difficult to fix without manually changing the data.

Ultimately, Google provides an easy and effective method when the data exists and is in the proper format. A project for Clio 3 also illustrates the benefits of previously cleaned data and of building on the work of others.

I'm attempting to analyze and possibly map decisions of the Indian Claims Commission. Fortunately, some fantastic geographers have come before me and georectified Charles Royce's maps (1899) of Indian land title ceded to the federal government. Nicole Smith first created the georectified maps, sharing them as shapefiles. Matthew McCarthy, a George Mason University Geography Department graduate student, expanded upon her work, adding significant geographic analysis. His maps also include some helpful features like present-day reservations. My map is currently far more primitive (and inaccurate in some ways), though I hope it will provide the bones on which future textual analysis might rest:

Using more powerful tools (the open source QGIS), I've created a custom map that is not as "pretty" as Google's but will provide a more robust platform for future work. And with good data (provided by kind souls), it's already effective in showing patterns of western migration and Indian land cession.
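For readers who prefer code to a desktop GIS, here is a rough sketch of the same overlay done with GeoPandas rather than QGIS. The shapefile names are placeholders for the georectified Royce cession layers and present-day reservation boundaries mentioned above.

```python
# Minimal sketch: overlay georectified Royce cession polygons with
# present-day reservations. The shapefile names are placeholders.
import geopandas as gpd
import matplotlib.pyplot as plt

cessions = gpd.read_file("royce_cessions.shp")              # georectified Royce polygons
reservations = gpd.read_file("present_day_reservations.shp")

# Put both layers in the same coordinate reference system before plotting.
reservations = reservations.to_crs(cessions.crs)

fig, ax = plt.subplots(figsize=(12, 8))
cessions.plot(ax=ax, color="tan", edgecolor="gray", alpha=0.6)
reservations.plot(ax=ax, facecolor="none", edgecolor="red")
ax.set_title("Indian land cessions (Royce) and present-day reservations")
plt.savefig("cessions_map.png", dpi=150)
```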