Academic libraries find themselves embracing new roles in at least two key areas:
- Publishing. More academic libraries are entering the world of scholarly publishing by creating or expanding services. About half the respondents in a recent survey had (or were developing) library publishing services in order to support change in scholarly publication. Three-quarters of the respondents indicated they published journals, while half indicated they were publishing monographs and/or conference proceedings. . . .
- Data curation. Funding agencies including the National Science Foundation (NSF) and National Institutes of Health (NIH) now have requirements that promote open access to the underlying data gathered during grant-funded research projects. . . . Some academic libraries are already creating services that help campus researchers comply with the requirements to create the plans and to archive and share the data once it is gathered, while many more are preparing to “embrace the role of data curator to remain relevant and vital to our scholars.”
We describe work-in-progress on the design and methodology of the Dynamic Linked Data Observatory: a framework to monitor Linked Data over an extended period of time. The core goal of our work is to collect frequent, continuous snapshots of a subset of the Web of Data that is interesting for further study and experimentation, with an aim to capture raw data about the dynamics of Linked Data.
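The idea of "frequent, continuous snapshots" of a fixed seed set of URIs can be illustrated with a minimal sketch. This is not the observatory's actual code or API; the function names, record layout, and the stubbed fetcher below are all assumptions made for illustration — a real run would dereference each URI over HTTP and persist the responses.

```python
import time

def snapshot(uris, fetch, store):
    """Take one timestamped snapshot of a fixed seed list of URIs.

    fetch(uri) returns the document retrieved for that URI (or None on
    failure); store(record) persists one snapshot record. Both are passed
    in by the caller, so the sketch stays independent of any HTTP library.
    All names here are illustrative, not the observatory's actual design.
    """
    taken = time.time()
    records = []
    for uri in uris:
        body = fetch(uri)  # dereference the URI (stubbed in the usage below)
        record = {"uri": uri, "taken": taken, "body": body}
        records.append(record)
        store(record)
    return records

# Usage with a stubbed fetcher; a monitoring framework would call
# snapshot() on a schedule (e.g. daily/weekly) over the same seed list
# so that successive snapshots can be compared to measure change.
seeds = ["http://example.org/resource/1", "http://example.org/resource/2"]
log = []
snap = snapshot(seeds, fetch=lambda uri: "<rdf/>", store=log.append)
```

Keeping the seed list fixed across runs is what makes the snapshots comparable: the dynamics of interest fall out of diffing the same URI's body across timestamps.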
The White House Office of Science and Technology Policy has released “Fact Sheet: Big Data Across the Federal Government.”
Here’s an excerpt:
Below are highlights of ongoing Federal government programs that address the challenges of, and tap the opportunities afforded by, the big data revolution to advance agency missions and further scientific discovery and innovation.
Digital Curation and Preservation Bibliography 2010: “If you’re looking for a reading list that will keep you busy from now until the end of time, this is your one-stop shop for all things digital preservation.”—“Digital Preservation Reading List,” Preservation Services at Dartmouth College weblog, February 21, 2012.
Our friends at the UK JISC have just issued a very useful new report titled “Value and Benefits of Text Mining,” which looks at some of the early applications of text mining, particularly in the context of the scholarly literature, and the technical, economic, and legal barriers to large-scale use of text mining technologies. Of particular interest is the analysis of text mining as a means of accelerating innovation and discovery across a wide range of sectors and some of the related economic considerations. The coverage of copyright here actually includes a discussion of possible future changes in copyright law (perhaps in the context of the Hargreaves review of intellectual property law that is under discussion in the UK) to facilitate text mining technologies.
The report can be found at
This report briefly presents the findings and recommendations of the “Library Publishing Services: Strategies for Success” project which investigated the extent to which publishing has now become a core activity of North American academic libraries and suggested ways in which further capacity could be built.
“Library Publishing Services: Strategies for Success: Final Research Report” by James L. Mullins, Catherine Murray-Rust et al.
Humanities and the social sciences have traditionally been disciplines aligned closely with the institutional library and its resources and services. Increasingly, in my conversations with librarians, there is a concern that while the library as a space remains popular, this masks a growing distance between the services the library provides and the needs and expectations of researchers (to say nothing of undergrads).
As subjects like digital humanities find themselves transformed by their engagement with technology, is the library facing the threat of redundancy?
So, I wanted to explore some of the roles that libraries might have in the Digital Humanities.
As input into the development, design, and improvement of the HathiTrust Research Center (HTRC), recipients of Google’s Digital Humanities Grants were interviewed to identify issues encountered during their projects. This project was guided by the following goals:
– Increase empirical understanding of how to identify materials for use by scholars.
– Increase empirical understanding of how to provide better access to materials for use by scholars.
– Identify meaningful characteristics of content that affect identification, retrieval, and other parameters.
– Identify data preprocessing and transformation issues encountered by scholars.
– Provide input to inform the architecture of the HTRC related to representation of collections, faceted browsing, identifiers, etc.
Yesterday I live-tweeted the Digital Publishing Forum “Measuring the Reader” at University College London.
I storified my tweets and added a brief introduction.
I really enjoyed it. The next one is on 21 March and will be “concerned with the huge changes in reference publishing over the last decade and how it has gone digital in a spectacular way, making a space for trusted resources in spite of Wikipedia, and working with librarians to enable discovery and reach new audiences. Our three speakers will tell the real story, warts and all, and look to future opportunities as well as threats.”
You can find more information about UCL’s Centre for Publishing on their web site, and follow them on Twitter @uclpublishing.
The SURFfoundation has released Users, Narcissism and Control—Tracking the Impact of Scholarly Publications in the 21st Century.
Here’s an excerpt:
This report explores the explosion of tracking tools that have accompanied the surge of web-based information instruments. Is it possible to monitor in ‘real time’ how new research findings are being read, cited, used, and transformed into practical results and applications? And what are the potential risks and disadvantages of the new tracking tools? This report aims to contribute to a better understanding of these developments by providing a detailed assessment of the currently available novel tools and methodologies. A total of 16 quite different tools are assessed.