
On Monday, November 28th and Tuesday, November 29th, Dave Lester (Creative Lead for MITH) and I (Web Developer for MITH) attended the Institut für Dokumentologie und Editorik (I-D-E) workshop on Tools for Digital Scholarly Editions, held at the University of Cologne, Germany. Alexander Czmiel of the University of Berlin, a fellow of the I-D-E, had invited Dave Lester, and later me, to the workshop to give a report on the Text and Image Linking Environment (TILE) project. In addition, I helped organize a report on my work with the Interedition team on interoperable annotation clients. This was MITH’s first appearance at the University of Cologne, which houses multiple Digital Humanities institutions and research affiliates: the I-D-E, as mentioned before, the Cologne Center for eHumanities (CCeH), and the International Center for Archival Research (ICARus).

After arriving at the workshop location, a rather new and improved space on the Cologne campus, we met with Alexander Czmiel and his colleague, Patrick Sahle. As participants filtered in, Patrick and Alex started things off by introducing themselves and their mission: to establish a global community of Digital Humanities programmers focused on working together on scholarly editions. Their aim is similar to that of Interedition, which focuses on establishing programmer networks to prevent duplication of effort and tools. To paraphrase Joris Van Zundert, lead for the Interedition project, “You don’t need two collation engines.” Indeed, as each of the approximately 40 participants introduced themselves, we found many intersections in our projects and in our research. To help us work together, the workshop was run as an unconference, giving each of us room to pool ideas and brainstorm ways forward. The focus then quickly shifted to what, exactly, each of us is working on and how our projects differ or overlap.

To aid with this, we gave and listened to a series of 15-20 minute presentations on specific projects. Each was an interesting take on the idea of linking data together with marked-up text or image files. Here’s a list of them to peruse:

The result of these presentations was an agreement that institutions across the Digital Humanities share the same needs: SVG annotation, exporting and importing TEI and annotation data, and automatically linking lines and text together in a seamless fashion. Obviously, there is no single interface currently available (nor will there ever be, in my opinion) that can manage all of this, so arguing over which platform to consolidate our efforts on was deemed pointless. Instead, we were encouraged to split the workshop into sub-groups focused on the development and institutional planning goals that fit what we wanted out of the event.
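To make that shared need a bit more concrete, here is a minimal Python sketch of the sort of text-and-image linkage these tools aim to produce: a TEI facsimile zone marking a rectangular region on a page image, with a transcribed line pointing back at it. This is purely illustrative; the element and attribute names come from standard TEI, but the coordinates, file names, and identifiers are invented, and no project at the workshop stores its data in exactly this form.

```python
# Illustrative only: a TEI facsimile zone tied to a line of transcription.
# Coordinates, file names, and identifiers are invented for this sketch.
import xml.etree.ElementTree as ET

TEI = "http://www.tei-c.org/ns/1.0"
XML_ID = "{http://www.w3.org/XML/1998/namespace}id"
ET.register_namespace("", TEI)

# Image side: a surface with one rectangular zone (the "SVG-style" region
# an annotation tool might trace around a line of writing).
facsimile = ET.Element(f"{{{TEI}}}facsimile")
surface = ET.SubElement(facsimile, f"{{{TEI}}}surface")
ET.SubElement(surface, f"{{{TEI}}}graphic", url="page1.jpg")
ET.SubElement(surface, f"{{{TEI}}}zone",
              {XML_ID: "zone-line-1",
               "ulx": "120", "uly": "340", "lrx": "980", "lry": "395"})

# Text side: a line-beginning marker whose @facs attribute points at the zone.
lb = ET.Element(f"{{{TEI}}}lb", {"n": "1", "facs": "#zone-line-1"})

print(ET.tostring(facsimile, encoding="unicode"))
print(ET.tostring(lb, encoding="unicode"))
```

Exporting and importing records along these lines, or equivalent stand-off annotations, is essentially the capability the group agreed every institution needs.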

As with the presentations, there were too many engaging arguments and fascinating ideas brought up in these sub-sessions to report them all in a single blog post. What I can write about are the proceedings of the two sessions I was involved in: the Text and Image Linking group and the Annotation Framework sub-group.

“Text and Image Linking” started off with a more in-depth description of the TILE project and what it does. Moritz Wissenbach then expanded on how he took the code from TILE and integrated it into his own project, the Faust Edition, thus creating a quasi-interoperable connection between the projects. From there, we argued that small collaborations such as this can build a kind of network of data between projects that works around institutional and project boundaries. Robert Casties of DigiLib asked some very interesting questions, such as why we chose TILE as a framework and whether our collective data could support the massive amount of metadata his users require. Marco Petris of the CATMA project in Hamburg had similar concerns about such a collaboration: What if collaborating means making an unspoken contract and forces a specific agenda on a project otherwise averse to the ideas of something like TILE? Are we simply fitting a square peg into a round hole, to use a (poor) analogy? How do these smaller connections affect the global Digital Humanities community? In order to continue these conversations and (hopefully) reach some kind of consensus or possible white paper, Ulrike Henny from Cologne established the Text and Image Linking e-mail list: textimagelinking at uni-koeln dot de.

“Annotation Frameworks” drew a very similar set of participants, many of whom carried over from the Text and Image Linking sub-group. As a representative of MITH, I spoke about our work collaborating with the Open Annotation Collaboration (OAC) to develop an open-source and extensible framework for annotations. This, we believe, would allow other users and programmers to take our saved annotations and use them in their own work, since the OAC specifications are so flexible and open. Robert Casties continued his argument that a universal model for data exchange, as proposed at the Darmstadt bootcamp for Interedition and OAC, would not fully realize some goals for metadata. Moritz and I argued that it could encompass everything a researcher might need, provided we as programmers remain transparent about our code and our research. Again, the final conclusion was that we needed to clarify and continue our arguments online and through e-mail.
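To give a sense of what an annotation in such a framework might look like, here is a small Python sketch of a body/target record: a textual note attached to a region of a page image. The body/target split reflects the general OAC approach, but the specific property names, the xywh-style region selector, and the URIs below are my own illustration rather than the OAC specification or MITH’s actual stored format.

```python
# Illustrative body/target annotation record; property names, selector
# syntax, and URIs are invented for this sketch, not taken from OAC or MITH.
import json
import uuid

def make_annotation(note_text, image_uri, region):
    """Attach a textual note (body) to a rectangular image region (target)."""
    x, y, w, h = region
    return {
        "@id": f"urn:uuid:{uuid.uuid4()}",
        "type": "Annotation",
        "body": {"type": "TextualBody", "value": note_text},
        "target": {
            "source": image_uri,
            "selector": f"xywh={x},{y},{w},{h}",  # region as x, y, width, height
        },
    }

anno = make_annotation(
    "Possible scribal correction in line 3.",
    "http://example.org/images/page1.jpg",
    (120, 340, 860, 55),
)
print(json.dumps(anno, indent=2))
```

The appeal of an open model like this is exactly the point made above: because the record only points at its sources, another project can reuse the same annotations without adopting our interface.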

This conclusion was shared by many of the sub-groups, which all presented brief, five-minute overviews of their findings. Central to many of the arguments was that we are all similar in goals, yet very different in use cases, funding, and programming time. What we really need, as the “DH Community Planning” sub-group reported, is a common outlet for collaborating and exchanging ideas outside of the workshop. Joris Van Zundert made an eloquent defense of sticking with a path of collaboration and working around institutional boundaries. This led to much discussion on the issue of “What to do next”. Dave Lester now plans to head up an initiative to get Interedition Bootcamps split between the United States and Europe (I’ll let him explain that more). Several institutional partners vowed to use the Stanford University e-mail list for DH community planners. On the development side, we all promised to work hard in our own off-work time to develop prototypes and working proofs of concept of interoperable tools in order to get more funding for projects such as Interedition.

For my MITH development time, this conference has led to several engaging opportunities. The team behind TextGridLab, a software framework created at the University of Goettingen in Germany, is interested in connecting their own image and text linking tool to TILE’s. It is possible that this collaboration could serve as a working example of the kind of outcome we want from the Text and Image Linking group. Several of us are also interested in taking a closer look at an automatic line recognition system used by developers Daniel Hochstrasser and Tobias Rindlisbacher of Zurich. As for the Text and Image Linking e-mail list, I’ve already contributed my own thoughts there, and Robert Sanderson of OAC and the Shared Canvas project has gotten more users engaged in talking through that medium. MITH and I are very excited about the connections made and look forward to future collaborative conferences such as this one. It goes to show that it takes only a relatively small effort to create interesting and powerful tools for the DH community.

Editor's note: This post originally appeared at Grant Dickie's blog on January 4, 2012.