Spatializing Photographic Archives, a project funded by the National Endowment for the Humanities, entailed the release of open-source software for recovering the 3D geometry of a location from photographs taken from diverse angles (and even at different times), and a case study of Richard Misrach’s landscape photography demonstrating the value of this approach for scholars.
We’ve now completed an extensive and carefully illustrated White Paper for this NEH-sponsored project; a PDF (26.5 MB) is available here.
The White Paper describes the open-source software tool we’ve developed and our reasons for wanting to forge a new approach to making digital tools for scholars. It also examines the implications of our approach for photography. After examining the history of landscape photography in the American West, we show how, by stepping outside the photographic frame and unfreezing a photograph’s frozen instant, we can reveal many hidden aspects of photography and create new kinds of works.
Read Full Post Here
Last week, the Walker Art Center launched a major website redesign, which museum geeks are hailing as “a potential paradigm shift for institutional websites” (Seb Chan) and an “earth-shaking game changer” (Museumnerd). Here’s what I see: a website as a unique core offering–alongside, but not subservient to, the physical institution. Walkerart.org is not about the Walker Art Center. It is the Walker Art Center, in digital form.
The new site resembles an online newspaper, featuring articles written by Walker staff alongside stories from the greater world of art reporting on the web. While there is a tight menu of Walker Art Center offerings at the top (Visit, Exhibitions & Events, Media, Collections, Join), the rest of the website is a digitally-based panoply of content broadly related to the Walker’s mission. It is an online experience about contemporary art that goes beyond the Walker’s walls.
And it breaks a lot of conventional rules about museum homepages….
What the Walker has done is commit to a unique online approach–not just for one program or microsite, but for their homepage. They took their vision of the institution as an idea hub, looked at comparable sites online that achieve that vision, and adopted and adapted the journalistic approach to their goals.
Read Full Post Here
Read an interview with Walker Art Center team here.
Editors’ Note: The University of Michigan, host of the Humanities, Arts, Science, and Technology Advanced Collaboratory (HASTAC) conference held last week, has released videos of the keynote addresses. A round-up of the conference is available here, and a full list of keynote videos is available here.
KEYNOTE VIDEO: Cathy Davidson, Now You See It: The Future of Learning in a Digital Age
- Cathy N. Davidson, John Hope Franklin Humanities Institute Professor of Interdisciplinary Studies, Duke University; Introduction by Daniel Herwitz, Director of Institute for the Humanities, University of Michigan; Duration: 01:15:00.
KEYNOTE VIDEO: Daniel Atkins, Cyberinfrastructure
- Daniel Atkins, Associate Vice President for Research Cyberinfrastructure, University of Michigan; Introduction by Margaret Hedstrom, University of Michigan; Duration: 00:40:00
KEYNOTE VIDEO: Pochoda, McPherson, Cohen, and Nash, The Future of Digital Publishing
- Phil Pochoda (chair), University of Michigan; Tara McPherson, University of Southern California: Editor of the born-digital, multimedia journal, Vectors: Journal of Culture and Technology in a Dynamic Vernacular; Dan Cohen, George Mason University: Author/editor of several books on scholarly digital publishing, and influential blogger at dancohen.org; and Richard Eoin Nash: Founder of Cursor, a site to coordinate a portfolio of online, membership publishing communities, including Red Lemonade, a pop-lit-alt-cult operation. Duration: 01:19:00.
KEYNOTE VIDEO: Leach, Digital Technologies in the Civilizing Project of the Global Humanities
- James A. Leach, Chairman of the National Endowment for the Humanities; Introduction by Cathy N. Davidson, Duke University; Duration: 00:25:00.
KEYNOTE VIDEO: Vaidhyanathan, The Technocultural Imagination
- Siva Vaidhyanathan, University of Virginia; Introduction by Paul Courant, Library Dean, University of Michigan; Duration: 01:08:00.
KEYNOTE VIDEO: Greenberg, Data, Code, and Research at Scale
- Joshua M. Greenberg, Director, Alfred P. Sloan Foundation’s Digital Information Technology program; Introduction by Dan Cohen, George Mason University; Duration: 00:47:00.
See Full List of Keynote Videos Here.
A common problem in search and exploration interfaces is the vocabulary problem: the great variety of words that different people use to describe the same concept. For people exploring a text collection, this makes search difficult. They can think of only a limited number of queries to describe a concept, and so may miss many instances that use different words. This is an important issue for humanities scholars. Often, the very first step of a literary analysis is to comb through text, trying to find thought-provoking examples to study later.
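To make the vocabulary problem concrete, here is a minimal sketch of one common remedy, synonym-based query expansion. The synonym table and toy documents below are invented for illustration, and this is not a description of how WordSeer itself is implemented:

```python
# A toy synonym table. In practice this might come from a thesaurus
# or a lexical database; these entries are purely illustrative.
SYNONYMS = {
    "grief": {"grief", "sorrow", "mourning", "woe"},
    "king": {"king", "monarch", "sovereign", "crown"},
}

# A toy document collection, keyed by document id.
DOCUMENTS = {
    "doc1": "the monarch spoke of his sorrow",
    "doc2": "a tale of woe and mourning",
    "doc3": "the merchant counted his coins",
}


def expand(term):
    """Return the term together with any known synonyms."""
    return SYNONYMS.get(term, {term})


def search(term):
    """Return ids of documents containing the term or one of its synonyms."""
    variants = expand(term)
    return sorted(
        doc_id
        for doc_id, text in DOCUMENTS.items()
        if variants & set(text.split())
    )
```

Without expansion, a query for “grief” matches nothing in these toy documents, even though two of them are plainly about grief; with expansion, both are found.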
In this post, I give an example of how our project WordSeer, a text analysis environment for humanities scholars, can be used to overcome this problem. Here, I’ll be using an instance of WordSeer running on the complete works of Shakespeare from the Internet Shakespeare Editions. It’s live, so you can follow along with this example on the web at wordseer.berkeley.edu/shakespeare.
You can read the post after the jump, or just watch this video.
Read Full Post Here.
Watch Video Here.
Just as THATCamp challenges attendees to set and steer the agenda, Startup Weekend leaves a lot up to the participants, who have 54 hours to pitch a product idea (typically tech-related), form teams, validate their idea, develop a business model, and put together a demo and a longer pitch.
Some might wonder what entrepreneurship training has to do with the digital humanities (DH), but I believe that the two communities have much in common and can learn from each other…. While DH projects typically don’t form companies and don’t aim to make a profit, most do need to consider how to define their value, find users and sustain themselves. To get off the ground, DH projects go through a process similar to a start-up: identifying a need and potential solution, drafting project plans, putting together a team, building a prototype, iterating on that prototype, and disseminating the product (whether a tool, collection, model, publication, or large-scale research). Both the DH and lean startup communities have embraced similar principles, such as agile development, user-focused design, open source software, and iteration. In a broader sense, I believe that DH brings the spirit of entrepreneurship–taking risks, experimenting, building something that serves a need, innovating, tolerating failure–to the humanities. We can see this spirit manifested in the NEH’s Digital Humanities Start-Up grants, the many digital humanities “Labs” (a term also used frequently by startups and tech companies), and One Week One Tool, which was inspired by “crash ‘startup’ or ‘blitz weekends’.” In a sense, many DH centers serve as startup incubators, providing the know-how and support to help get an idea off the ground.
Events like Startup Weekend could address a need in the DH community for more training in successfully launching projects. Often graduate training in the humanities does not prepare people for the complexities of getting a major DH project started and keeping it going. Such training is now being offered at the Digital Humanities Summer Institute (taught by Lynne Siemens, a professor in U Victoria’s school of public administration who does research in entrepreneurship and academic team development), at THATCamp workshops (such as Sharon Leon’s Introduction to Project Management in Digital Humanities), as part of DH educational programs such as UVA’s Praxis Program, and in publications such as Sharon Leon’s Project Management for Humanists: Preparing Future Primary Investigators. I think Startup Weekend offers another compelling model for providing training in a fast, fun, and experiential way.
Read Full Post Here.
Searching for information might seem like one of the most routine and commonplace activities of university life. However, as students work within an information environment that is increasingly open and dynamically changing, research assignments also represent a complex and potentially daunting task, and one that is fraught with embedded social and cultural processes and relationships.
The Ethnographic Research in Illinois Academic Libraries (ERIAL) Project was a two-year study of student research practices involving a collaborative effort of five Illinois universities…. Using a mixed-methods approach that integrated nine qualitative research techniques and included over 600 participants, the ERIAL project sought to gain a better understanding of undergraduates’ research processes based on first-hand accounts of how they obtained, evaluated, and managed information for their assignments.
…Search algorithms can thus reveal or conceal information depending on the skills of the user. Unfortunately, the students who participated in the ERIAL project did not appear to adequately understand conceptually how information is organized or how search works. Of all the students who were asked, none could correctly explain how a search in Google (or any other search engine) works or organizes results…. This lack of “algorithmic literacy” potentially renders students vulnerable to the disciplinary power contained in search systems, and leaves them subjects, rather than agents, of algorithmic culture.
…One challenge for educators and librarians is to balance facilitating ease of use with a conceptual understanding of how search works. Search shouldn’t be magic; it’s only when its processes and algorithmic culture are demystified that our students become empowered to use it effectively.
Read Full Post Here.
There are lots of tools out there that aggregate existing information and even organize it for users to interpret. Since the early HyperCities, GIS tools, for instance, have been very much the rage among humanists who wish to add geographical and census data to enhance the “lived experience” of a text. But there are fewer tools that actually build an archive of live interpretation–as opposed to facts layered and ready for interpretation–around a stable text. And that’s where what I call “Reading with the Stars” comes in.
Last year I attended a presentation by Reinhard Engels (Harvard University Libraries) in which he demonstrated a deep zoom widget he was working on called “Highbrow.” Using important texts such as the Bible, the Divine Comedy, and Shakespeare (First Folio), Reinhard’s widget brought these texts together with some of their more famous commentaries. A spike graph at the top of the screen showed viewers where the text had received more (or less) comment, and scrolling down into the text allowed viewers to see specific comments on the Bible from a range of well-known thinkers such as St. Augustine and Sir Thomas More. Highbrow offered viewers a snapshot of the text’s reading history through the lens of established experts. As a teacher of ENGL 372, a large undergraduate lecture class required for our major at WSU that focuses on the Transatlantic 19th century, I wondered whether the static archive Highbrow creates might be transformed into a dynamic archiving tool for student comments around major texts….
Read Full Post Here
And this brings me to the danger inherent in Culturomics. First, machine-readable texts do not, and will never, represent the totality of the human experience. What about paintings, illustrations, and photographs, statues and figurative art, architecture, music, material culture, and ecology? What about oral history? What about economic, statistical and demographic evidence? Although there are millions upon millions of books, magazines, newspapers, and other printed material, they represent only the visible, privileged, literate tip of a vast store of human culture.
Even more troubling, texts lie. “There is no document of civilization,” said Walter Benjamin, “which is not at the same time a document of barbarism.” One of the great insights of the “New Social History” was the need to rub documents against the grain. Text mining usually rubs with the grain, merely reproducing the endemic biases and structured incompleteness of the written past. The graph can only replicate the lie.
This is not to say that Culturomics is hopelessly biased and needs to be discarded. On the contrary, it is precisely this kind of utopian enthusiasm – the dream that we can actually develop a more total vision of human culture – that is needed to keep history afloat. Large scale text mining is simply wonderful. Like all great inventions, though, it can be used for good or for ill. And it makes sense, I think, to guard against the naive assumption that all of human culture or history can be reduced to a computational algorithm.
Read Full Post Here
The CUNY Digital Humanities Initiative has released video from two recent events:
Digital Humanities in the Library, November 18, 2011
- Ben Vershbow (NYPL) on “NYPL Labs: Hacking the Library”
Digital Humanities in the Classroom, October 18, 2011
- Mark Sample, “Building and Sharing When You’re Supposed to be Teaching”
- Shannon Mattern, “Beyond the Seminar Paper: Setting New Standards for New Forms of Student Work”
Watch the videos here.