Author: petercarrjones

Peter Carr Jones is a historian living in Washington, DC.

Creative Commons History

The following essay argues that public media outlets (NPR, ProPublica, etc.) should publish more material under Creative Commons licenses. NPR receives much of its funding from member stations, which pay subscription fees to broadcast its programs, so using CC licenses for all output would gut that revenue. It makes sense, however, to get as many eyeballs as possible on breaking news and investigative pieces, because these are worth more in reputation and prestige.

Our academic work can also fall into this category, because it is ostensibly worth more to gain the prestige and reputation that come from having your ideas spread widely than to keep them on your own site. There is still a time and place for submitting to peer-reviewed journals and trying to publish your dissertation with a university press; the particular prestige of that type of scholarly communication is distinct from the kind provided by the crowd. I’m just suggesting that your witty 1,000-word piece on a topical subject might have an unintended life if you set it up correctly.

Also, I wanted to exercise my internet right to publish under a CC BY-NC-ND 4.0 license.

Why Some Public Media Content Should Be Creative Commons Licensed

By Melanie Kramer

This essay is published under an Attribution-NonCommercial-NoDerivatives 4.0 International (CC BY-NC-ND 4.0) license. Feel free to copy and redistribute it, with attribution and a link back to the original source.

In February 2015, NPR’s Danny Zwerdling published a four-part investigative series on nurses who had been injured on the job. The series was published on the NPR website and distributed through NPR’s API to member station websites.

On the NPR homepage, Danny’s investigative work was given the same treatment as other NPR and member station pieces published the same day. This means a quick news spot from The Two-Way or a human-interest piece on ice sculptures appeared on the NPR homepage for roughly the same length of time as Danny’s work, which took over a year to put together. If you glanced at the NPR homepage several days later, you were likely to miss Danny’s work altogether.

NPR does have an API, but its terms stipulate that use is for “personal, non-commercial use, or for noncommercial online use by a nonprofit corporation which is exempt from federal income taxes under Section 501(c)(3) of the Internal Revenue Code.”

This means Danny’s lengthy investigation could not be republished by a for-profit alternative weekly or by a nursing organization. It could not be picked up by a wire service or distributed by a foreign news organization. Danny’s piece has won awards and inspired local stations to localize coverage, but did it have the greatest possible impact online?

I was thinking about Danny’s pieces when I saw that ProPublica encourages anyone to steal its stories, which are published under a Creative Commons license and frequently picked up by other publications. As Richard Tofel and Scott Klein write:

In the last year, republication of ProPublica stories under our CC license has increased dramatically. Through November, we’ve recorded more than 4.2 million page views this year for authorized reprints of our work, which is up 77 percent over the same period in 2011, and is the equivalent of an additional 29 percent on top of the traffic to our own web site.

Among the literally thousands of sites that have reprinted ProPublica stories in 2012 alone are Ars Technica, the Atlantic Wire, CBS News, the Charlotte Observer, the Chronicle of Higher Education, the Cleveland Plain Dealer, Foreign Policy, the Houston Chronicle, the Huffington Post, the Las Vegas Sun, the Los Angeles Times, the Miami Herald, MinnPost, Minnesota Public Radio, Mother Jones, MSNBC, Nature, NBCNews.com, the Newark Star-Ledger, the New Haven Register, the New York Daily News, Nieman Journalism Lab, the San Jose Mercury News, Scientific American, the Seattle Times, Slate, Talking Points Memo, the Tampa Bay Times, the Trentonian, USA Today, the Utne Reader, Wired and Yahoo News.

Why do we do this? ProPublica’s mission is for our journalism to have impact, that is for it to spur reform. Greater reach – the widest possible audience – doesn’t equate to impact, but it can help, and certainly doesn’t hurt. So we encourage it. And, of course, we started in 2008 with almost no audience or reputation at all, and needed – and still need – to increase the circle of people who know us, and our work. CC helps us achieve that goal.

ProPublica also makes sure it can track the impact of any stories distributed under its Attribution-NonCommercial-NoDerivatives 4.0 International Creative Commons license. This matters because it allows ProPublica to keep track of how its stories are being disseminated and to include numbers from republished material in reports to funders and its board. As Tofel and Klein note:

We created a simple JavaScript beacon that we call Pixel Ping. We designed it, working with developers at DocumentCloud, to be as lightweight as possible and so that it doesn’t violate, either in spirit or letter, the privacy policies of the sites that republish our work. Pixel Ping simply counts the number of times our stories are read on sites that republish them. It doesn’t collect any information about visitors, and it neither sets nor reads browser cookies. It’s open source, too.
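Tofel and Klein’s description suggests a design roughly like the following. This is a hedged sketch in the spirit of Pixel Ping, not ProPublica’s actual code; every name here (buildPingUrl, firePing, the endpoint URL) is invented for illustration:

```javascript
// Build the counting URL for a republished story. Kept as a pure function
// so it can be exercised without a browser.
function buildPingUrl(endpoint, storyKey) {
  return endpoint + "?key=" + encodeURIComponent(storyKey);
}

// In the browser, the republishing page fires the beacon by loading a 1x1
// image. No cookies are set or read, and nothing about the visitor is
// collected; the server simply increments a counter for storyKey.
function firePing(endpoint, storyKey) {
  var img = new Image();
  img.src = buildPingUrl(endpoint, storyKey);
}
```

The key design choice is that the beacon counts story reads, not readers, which is what keeps it within the spirit of republishers’ privacy policies.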

I am not advocating that NPR or member stations release all of their content under a Creative Commons license. This would destroy NPR’s business model, which is currently based, in part, on member stations paying dues to broadcast its programming. But for breaking news, which often garners a national audience, and for investigative pieces, which are designed to have impact and spur change, doesn’t it make sense for the stories to reach the biggest audience possible?

I say breaking news because member stations often provide in-depth, well-reported coverage of local breaking news events that are of interest to a national audience, and they don’t always receive the audience they should. Take, for instance, St. Louis Public Radio’s ongoing coverage of Ferguson: it is well reported, in-depth, and covers all of the angles.

If St. Louis Public Radio offered its Ferguson coverage under a restricted Creative Commons license with the same stipulations ProPublica uses, then other stations, and even other news organizations, could have spread St. Louis’s coverage beyond the audience of its own website. People around the country would have learned about St. Louis’s coverage this way and potentially wanted to support the station for providing some of the best coverage around. St. Louis would also have been able to track its reach and impact in a new way.

For breaking news that attains national significance and for investigative pieces that require a great deal of time, capital, and labor, it makes sense to think of audiences beyond public media’s own websites. And it fits within the mission of public media: to provide programs and services that inform, educate, enlighten, and enrich the public, wherever that public happens to be.

Digitization and the Historical Profession

If last week was about defining digital history as a methodology, a frequent theme in this week’s readings was how historians use digital methods in everyday research tasks without thinking through the choices involved. Last week was about characterizing what “digital” historians “do” and why they do it. This week’s readings frequently reminded me how different historical work looks now than it did twenty years ago, and how unacknowledged these differences are among those who don’t define themselves as “digital historians.”

Ian Milligan’s work on Canadian historians identified many ways in which historians were not thinking about how online access affected their research choices. Newspapers that were not online were ignored significantly more often, which even prioritized Toronto’s history over that of cities whose newspapers had not been digitized. Historians have long acknowledged the danger of prioritizing the metropole at the expense of regional voices and perspectives, yet here was another example happening in twenty-first-century historical work. [1] Ricky Erway and Jennifer Schaffner of OCLC have argued that a focus on maximum digitization (access over preservation, quantity over quality) would allow institutions to see which types of material get the most researcher eyeballs, which can help drive future investment and reduce the impact that lack of access has on certain research. [2]

Milligan noted that historians likely missed some examples due to a reliance on full-text search. As Simon Tanner, Rosenzweig, and Cohen have noted, OCR is far from perfect, and digitized text can be created in a variety of ways. Other scholars have noted the error rates of computer programs, particularly when creating OCR text from historic newspapers. Bob Nicholson and Nicole Maurantonio argue that using digital databases requires, and should foster, a different mindset than merely using full-text search to replace the old microfilm method of visually scanning in chronological order. Sarah Werner and Marlene Manoff provide further examples of how the digital is very different from the physical: technology can both obscure the meaning of physical objects and illuminate texts in new ways. [3]

This week’s readings also attacked the authority of historical professionals as the arbiters of public historical work. Besides the previously discussed criticism from Milligan, Roy Rosenzweig and many of the authors writing on crowdsourcing demonstrated the skills and abilities that the educated public can bring to cultural heritage and knowledge projects, from Wikipedia to the Papers of the War Department. As long as guidelines were clear, project goals aligned with crowdsourcing best practices, and cultural professionals were willing to cede some flexibility and discretion to “the crowd,” engagement and labor could be brought to traditional cultural heritage institutions accustomed to knowledge moving in only one direction, from professionals to the public. Sheila Brennan and Mills Kelly described this well as History Web 1.5: not as fully unmediated as Web 2.0, but more so than traditional historical practice.

I expect we will discuss many problems similar to those alluded to by Milligan in next week’s readings on search and databases, but professional historians need to learn that even when they are not “doing digital history,” they must approach digital sources with respect for the public’s contributions and a new appreciation for how research in digital sources differs from research in paper or even microfilm.

References

1. Ian Milligan, “Illusionary Order: Online Databases, Optical Character Recognition, and Canadian History, 1997–2010,” The Canadian Historical Review 94, no. 4 (2013): 540–69.
2. Ricky Erway and Jennifer Schaffner, for OCLC Programs and Research, “Shifting Gears: Gearing Up to Get Into the Flow,” 2007.
3. Marlene Manoff, “The Materiality of Digital Collections: Theoretical and Historical Perspectives,” Portal: Libraries and the Academy 6, no. 3 (2006): 311–25.

What Is Digital History?

Today is the #dayofdh2015. It’s a day to celebrate the extraordinary in digital humanities: your projects, pretty maps, cool visualizations, amazing findings. It’s also a day to explain the mundane tasks of DH, i.e., what we actually spend our time doing: filling out grant proposals, munging data, updating Java so that MALLET works. The idea is that our non-digital colleagues will see more of the former than the latter and come to value our methodologies.

So what is digital history? For me, there are several binaries in which I recognize digital history, each of which deserves elaboration but which I will only briefly mention now. The first is micro vs. macro viewpoints. Successful digital history almost always zooms in and out between these poles of evidence: from the single story (a name plucked from the record via full-text search) to the overarching (historical forces marching across maps over time). Digital history most successfully displays macro-level analysis while not missing the human stories that ground great histories. The second binary is, unfortunately, a part of digital history: the commercial vs. the open. The historical and archival disciplines have ceded vast quantities of our history web to corporations, from ProQuest to Google. These collections and tools offer varying levels of access, but they are certainly not in the public domain. This is both a threat and a call to action to maintain our discipline’s role as an open and democratic source of historical knowledge.

The third binary describes the visions of digital history I discussed first: the utopian (DH will “save the humanities”) and the mundane (DH is merely a set of tools for making stuff, tools with their own perils of power relationships).

As Tom Scheinfeldt has noted in his post on the Dividends of Difference, digital humanities comes from the computational linguistics and cliometrics world of Father Busa and “Time on the Cross.” It also comes from a radical history of collecting stories from below using the tools of oral history and folk collecting. This collecting was technological, archival, public, collaborative, political, and networked. Both of these branches are simultaneously utopian and mundane approaches to scholarship.

I also approach digital humanities with a slightly different influence. Given my time at CHNM, my “DH” is digital history, and I define this work as the use of computers in researching, presenting, or teaching history. It’s a deceptively broad definition because its goal is broad: using modern tools in pursuit of past experience.

In this vein, it’s a forward-looking methodology. Digital history is, as Cameron Blevins noted during a recent AHA talk, “in a perpetual future tense.” Though Blevins meant his discussion as a call to action, to use new tools to challenge accepted historical narratives, I think there’s something in DH’s utopian genes that will always push the methodology’s envelope in its roles of research, presentation, and teaching.

I use the term methodology deliberately. There are some aspects of disciplinary formation common to digital humanities: I use similar tools and speak the same language as my literary colleagues, and we have common goals and approaches in our work. Yet I believe my fundamental questions and mission align with history the discipline; the digital is a way to achieve them. Though we use different approaches, Eric Foner and I both attempt to “democratize history” (he more successfully than I).

This definition is not meant to exclude those who fall in the liminal spaces between disciplines. Andrew Prescott’s work crosses the boundaries of science, museums, literature, and archives to illuminate the human condition. Prescott has held many different positions within these disciplines, but his work is rooted in materiality. Will Thomas, Ed Ayers, Dan Cohen, and Roy Rosenzweig all describe fantastic visions of what digital history can create, but each vision, from Ayers’s historic virtual worlds (1999) to Thomas’s call to merge the past with the present (2015), is rooted in a materiality that is foundational to history.

So if you’ll accept my premise that digital history is a perpetually forward-thinking methodology, I hope you’ll also accept that this requires grinding work of the “non-DH” variety. So, as Roy and Dan have urged, we should “sit down in front of our computers” on this #dayofdh2015 and “get to work.”

Designing From a Template

This week our class’s design assignment was due. In some ways, this page should build on our previous preliminary, typography, and image assignments, and I certainly employed many of those skills. I manipulated images to make my work more cohesive, making several the same size and resolution to add consistency to my design. Likewise, I used a serif font, Libre Baskerville, to affect an older, historic feel.

What was completely new to me was adapting a complex, responsive layout to my own needs. This is not a simple copy-and-paste job. It was fairly simple to sub in my historical brewing content, but much more difficult to add sections and content not already included. Even finding the relevant CSS was difficult, as seven different CSS files came with the free layout from HTML5 UP. With the jQuery and JavaScript littering my page, it felt like moving from tee-ball to the major leagues. Despite the complexity, I now feel much more comfortable with some of the building blocks of HTML/CSS: anchor links within a page, padding/margins/alignment, and CSS fonts.

I also learned several new techniques while deploying my layout effectively. Several image features actually came from a font called Font Awesome; it took me a while to figure out why there were no icons in the images folder, but this technique will be very useful in my future work. I also learned about CSS gradients so that I could change the color of the layout to match my header image. I didn’t expect to learn these new techniques this week, but reading a professional developer’s code forced me to understand some of the state of the art in contemporary web design. I have no doubt this will improve my historic web pages.

My comment on Mason’s blog.

 

User Interaction

Josh Brown and the Lost Museum are of a time in which user interaction looked significantly different than on today’s web. Back then, Flash could provide some interaction on slow-loading and clunky websites. Online gaming was in its infancy, and the major ways people interacted with websites were through “guest books” or the very young social web, where Myspace was king. In this world, the Lost Museum was novel and somewhat sophisticated. Yet Brown’s self-critiques ring true: the site lacked the unrestrained and immersive world of Myst, its inspiration. Now even virtual worlds like Second Life are passé. It seems the historical web has had trouble interacting with public audiences compared to other disciplines. There have been marginally successful “crowdsourced” projects, and commercial sites like Ancestry.com are popular. Yet historical websites frequently lag behind in allowing users to interact with sources, maps, or documents.

One area in which new technology could provide some of this needed interactivity is JavaScript charting libraries like D3. Where jQuery makes websites scroll smoothly and look more professional (while reducing accessibility), D3 gives users the opportunity to fully explore a project’s data. This is used to great effect in Lincoln Mullen’s maps of slavery’s expansion. The article provides a clear argument, but users can still interact with the underlying data by changing dates, hovering over specific counties, or playing with a variety of other options. This becomes a model for more active learning because users can see how changing the options affects the displayed populations of enslaved and free black people across time and place. Plus it just looks cool!
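The hover behavior described above can be sketched with a few lines of D3. This is an illustrative example, not the code behind Mullen’s map; the selectors, field names, and tooltip markup are all invented, and the wiring assumes D3 (v6+) is loaded on the page:

```javascript
// Pure helper: build tooltip text for a county. Kept free of DOM code so it
// can be tested without a browser. The field names are hypothetical.
function tooltipText(county) {
  var pct = (100 * county.enslaved / county.total).toFixed(1);
  return county.name + ": " + pct + "% enslaved";
}

// Browser-only wiring: show a tooltip when the user hovers a county path.
// Assumes the map's counties are <path class="county"> elements with data
// bound via D3, and a #tooltip element exists on the page.
function wireTooltips(svgSelector) {
  d3.select(svgSelector)
    .selectAll("path.county")
    .on("mouseover", function (event, d) {
      d3.select("#tooltip").text(tooltipText(d)).style("opacity", 1);
    })
    .on("mouseout", function () {
      d3.select("#tooltip").style("opacity", 0);
    });
}
```

Binding the tooltip to the same data that drives the map is what makes this kind of chart explorable rather than static.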

My comment on Ben’s site.

Accessible Web

How accessible is my page? Before this week’s readings, I knew only the very basics of web accessibility. Alt text on images helps screen readers. Section 508 compliance is a thing; that’s the link frequently at the bottom of government websites? OK, so I knew basically nothing. The readings provided great context on making the web accessible to the disabled community. Historians’ goals often include telling the stories of disadvantaged communities, so ensuring access for all audiences is crucial to our ethics.

But the most impactful part of the readings was actually submitting my website to the WAVE Web Accessibility Tool. The tool showed places where I didn’t use alt text or had other site design issues. I hadn’t considered that linking to some PDF files could leave content inaccessible or overly complicated for screen readers. I also hadn’t thought about the small icons that appear next to the author’s name on each post; WordPress adds these automatically based on the theme but doesn’t include alt text. I will need to explore how to add alt text to these images to make my site more accessible.
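A quick audit like WAVE’s alt-text check can even be run by hand in the browser console. This is a minimal sketch (the function name is my own); it works on plain objects so it can be tested outside a browser, but on a live page you would pass it `document.querySelectorAll("img")`:

```javascript
// List the sources of images that lack usable alt text (missing, empty,
// or whitespace-only). Accepts any iterable of objects with src/alt.
function imagesMissingAlt(imgs) {
  return Array.from(imgs)
    .filter(function (img) { return !img.alt || !img.alt.trim(); })
    .map(function (img) { return img.src; });
}
```

Anything the function returns is an image a screen reader will either skip or announce unhelpfully.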

By and large, I did a fair job of including alt text for my images. Yet it’s an aspect of my site I’ll need to stay vigilant about to ensure the widest possible access to my writing. Submitting my own work was far more helpful than the readings’ theoretical examples because it gave me real-world cases from my own site.

Comment on Rob’s blog. 

 

Eric’s Portfolio Image Page

One of the strengths of Eric’s page is that his images complement a really fascinating story. The recolorization of the black-and-white nightclub scene makes sense given the rarity of color photographs taken in nightclubs. Eric’s sense of light versus darkness gives the photo an interesting chiaroscuro effect. My favorite aspect is the woman’s red dress, which really pops from the darker background. The man’s blue jeans are a nice contrast to the red but, on further reflection, look a little too blue to be real. It’s possible my current screen (a Mac) plays a role in this impression; on a quick initial view on my Android phone, the jeans looked realistic.

His touch-ups on the photograph of the bartender are very well done. Eric successfully removes the “noise” on the right side of the photograph, which lets the man (and his reflection in the mirror), the focal point of the photograph, shine through. The additional contrast added to the man’s face is successful and makes him much less pale, though it might go a little too far and appear almost burned (too much burn tool?).

The engraving is a nice find given the time period. I also work on the twentieth century and cheated slightly by making an “engraving-style” image. Because Eric’s chosen engraving is a small detail from another image, displaying it at a large size on the website draws attention to the low quality of the original. Given the context of the website and the other similarly sized images, it would look out of place as a tiny picture, but I think the image works better at a smaller scale.

On the overall page, the color palette matches the feel of disco, and the colors contrast well with one another. The alignment and design are tasteful, though perhaps the text could be a little larger for readability.

Comments on Nathan’s Image Page

Our Clio 2 class has been busily building up our Photoshop skills and this week we showed off some of the techniques we’ve been mastering through a variety of image assignments. Since there was no reading assignment, we were tasked to review the work of one of our classmates. I am reviewing Nathan Michalewicz’s page.

The first aspect that struck me was how fantastic his recolorization turned out. The colors are slightly faded, making the image appear exactly like an old color photograph from the 1960s or a painting from the nineteenth-century Realist school. The colors chosen seem appropriate for the time period and, more importantly, are added with care and precision, down to the gold buttons. The man’s skin is a little pale, but not enough to detract from the overall effect. The decisions on cropping and repairing the photo are also well chosen.

The woodcut of Murad III is also effectively manipulated. Removing the artifacts of the archive (stamps and borders) helps the viewer imagine the Sultan without the materiality of the woodcut. The tan background blended with the black-and-white image is a very effective technique; that, plus the vignette effect, makes the graphic well suited to the earth tones of Nathan’s page design.

I have very few critiques of Nathan’s photo-editing choices. Besides the overly pale skin of the man with the chair, I thought the vignette effect reduced the contrast of the lines and cross-hatching on the woodcut of Murad and was ultimately less powerful than the original. I think a smaller vignette/fading diameter would achieve the same blending effect with more impact.

A few other small nits secondary to the image assignment. First, the text over the very complex background images might be a little too busy; a single color or simpler background might work better on this particular page. Second, the copy could use one more pass for typographical errors. Again, this is not related to the images, but correctness always helps the page’s authority. Finally, the page title (not just the header) should say “image,” not “type.”

The Vintage Web: What’s New is Old

Our reading load has begun to taper off in anticipation of more assignments due in the coming weeks: the image and design assignments, plus the looming final project. In light of this, my viewing of Lynda.com and messing around with Photoshop have increased proportionally. Speaking of practical Photoshop techniques, I enjoyed Cameron Moll’s “vintage” suggestions and intend to employ them in my upcoming assignments. When I saw the “2004” date on his posts, I was extremely skeptical. Fortunately, neither the advice nor the design was stale (I wish I could say the same for his links to “other experts,” all broken). The sharpened blur and “machine wash cycle” are really great image effects that should work well in my design (maybe a fake vintage beer label similar to Moll’s example?).

For the image and design assignments, I think I’m going to re-design a website for my side gig/hobby: beer history consulting.

I created a website banner using this photo from the incredible and historic Schlenkerla Brewery in Bamberg, Germany.

Schlenkerla Brewery Mural
A mural of gnomes brewing beer at the Schlenkerla Brewery in Bamberg, Germany.

Here’s another picture to give you an idea of the scale of this incredible mural. Of gnomes. Making beer.

Peter Jones with Schlenkerla Brewery Mural
A photo with the author at Schlenkerla.

My original attempt wasn’t bad, but with my new skills and more work, I think I’ll be able to make a much more interesting header. I really like the colors in the banner, so hopefully I can use these earthy yellows and reds (I know, everyone in class is using earth tones).

If anyone has other design suggestions, please leave them in the comments.

Comment on Alyssa’s blog.

Ethics of Photography

It was fascinating to read Errol Morris’s series of posts on the ethics of photojournalism with reference to FSA photographers. Morris looked at Arthur Rothstein’s and Walker Evans’s most controversial photos from nearly as many angles as Edward Steichen photographed his cups and saucers.

Migrant Mother Photo
In one of the most famous FSA photos, Dorothea Lange likely directed her subject to a degree. How much does this affect how we view it? Photo at LOC

Morris engaged the photos and photographers in an attempt to get at their core ethics and answer the question: what does photographic evidence mean? The article contrasted interestingly with the spirit behind the Lynda.com tutorials on digital photo editing. Morris admits that photoshopping images is only the newest in a long line of techniques for using photography for political purposes. Interestingly, the Lynda.com tutorials, especially the one on photo restoration, do not try to let an editor change a photo for their own purposes but to present what the original photographer would have wanted to convey. By stripping away the damage time has done to a photograph, the editor can extract the essence of the original.

Morris’s series also impressed on me how important a sense of ethics around web images is to any historical website I create. Rothstein and Evans were trying to tell particular stories, though they felt they presented their subjects fairly. Many in America strongly disagreed, and as a result Rothstein, in particular, was discredited. As Morris concludes: “It should not be lost on any of us that these controversies are still with us.” It’s an important lesson for all historians on the web.

Comment on Steve’s Post.