Sunday, June 16, 2013

Interesting stuff - for an Assessment-Nerd

I recently received the feed of latest articles from Evidence-based Library & Information Practice (EBLIP), which included conference papers from the 2010 Library Assessment conference, as well as a few original articles & research summaries. (Official Disclosure: I am a peer-reviewer for EBLIP.)  Here is the official scope statement:
EBLIP is a peer reviewed, open access journal published quarterly by the University of Alberta Learning Services, using the OJS Software. The purpose of the journal is to provide a forum for librarians and other information professionals to discover research that may contribute to decision making in professional practice. EBLIP publishes original research and commentary on the topic of evidence based library and information practice, as well as reviews of previously published research (evidence summaries) on a wide number of topics.
Of course, as a librarian with a background in research, I was drawn to this journal waaaaay back at my previous library. Now I regularly read the articles, looking for gems and ideas for my own professional interests. This issue, I was not disappointed. From the 2010 Library Assessment conference, there is a heavy emphasis on ARL assessment products, including LibQUAL+® and its sister product, ClimateQUAL®, but I was more interested in the articles on such topics as assessing special collections, faculty dissatisfaction with their libraries' journal collections, and linking outcomes with library resources and instruction. While the conference papers are a bit old (OK, maybe not in "LIS time", but I'm still on "biomedical time"), I still found them quite useful. Here is my take on a few of the key articles, features and evidence summaries.

Still Bound for Disappointment? Another Look at Faculty and Library Journal Collections by Jennifer Rutner (Columbia University) and Jim Self (Univ. of Virginia)
Research Questions:  Are faculty at other ARL institutions all dissatisfied with their libraries' journals?  “Given the substantial investment in journals at ARL libraries, why are faculty at these institutions consistently dissatisfied with their library’s journal collections?”
What they did: Analyzed LibQUAL+ data from over 20 ARL libraries, and interviewed faculty at Columbia to find out more specifically why they are dissatisfied.
Take-home message: Faculty at many ARL libraries are not satisfied with their libraries' journals, but not necessarily for the reasons you think. Many of the issues brought up by Columbia faculty, at least, were technical in nature (e.g. poor "automated responses from library systems", clunky search interfaces) and fall under the umbrella of User Experience. C'mon, folks - if we start working together, we can make the user experience better.

Linking Information Seeking Patterns with Purpose, Use, Value, and Return On Investment of Academic Library Journals by Donald King & Carol Tenopir.
Research Questions: How are purposes of scholarly reading, information seeking behaviors, aspects of use, and positive outcomes or value all related?  How can we use this relationship to demonstrate our value (e.g. with an ROI)?
What they did: This is part of the IMLS-funded MAXDATA project, which surveyed faculty at 5 universities using the "critical incident" method, asking respondents to think about their most recent information need while completing the survey. Questions covered the purpose of that need, how the information was used, and the value they placed on it. Respondents were also asked what they would have done if the last article they used had not been available - buy it? Spend how much time looking for it? Based on these responses and previous research, the authors calculated an ROI for the academic libraries.
Take-home message: Faculty read a lot of articles; they get most of their articles from library resources; the most recent articles are used the most; and most articles from library resources are used electronically. Most of this is not really new - King & Tenopir have been researching use and access for years. A few interesting results: faculty spent more time reading articles from library resources than from their own subscriptions (probably because more research articles came from the library), and faculty spent more time browsing than searching for each article. Faculty would spend about 13 hours a year browsing, searching and obtaining articles if the library didn't provide them, which works out to about $3500/faculty in cost savings provided by the library, or a 3.6:1 ROI.
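For the assessment-nerds, the reported figures imply a per-faculty library cost as well. A minimal back-of-the-envelope sketch (the cost figure below is my own back-calculation from the article's $3500 value and 3.6:1 ratio, not a number the authors report):

```python
# Rough ROI arithmetic implied by the King & Tenopir figures.
# $3500/faculty is the reported annual value of library-provided articles;
# the per-faculty library cost is back-calculated from the 3.6:1 ROI (my assumption).
value_per_faculty = 3500          # reported cost savings per faculty, $/year
roi_ratio = 3.6                   # reported return on investment

implied_cost = value_per_faculty / roi_ratio
print(f"Implied library cost per faculty: ${implied_cost:,.0f}/year")   # about $972/year
print(f"ROI check: {value_per_faculty / implied_cost:.1f}:1")           # 3.6:1
```

In other words, at those numbers every library dollar spent per faculty member returns about $3.60 in saved faculty time and avoided article purchases.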

Value of Libraries: Relationships Between Provision, Usage, and Research Outcomes by Michael Jubb, Ian Rowlands and David Nicholas out of England.
Research Question(s): Another attempt to derive a measure of the value of libraries via the articles they provide. This time, the link is more outcomes-based. Specifically, what is the relationship among faculty's use of articles, institutional expenditures on journals, and faculty's research productivity?
What they did: They mined the Web logs of ScienceDirect and Oxford Journals to capture information search and use behaviors, then interviewed faculty and librarians at selected universities. This data was combined with previously gathered COUNTER usage data from a variety of British academic institutions.
Take-home message: Cost per download is going down as usage of ejournals increases. The size of the institution does not necessarily predict usage. Expenditures on journals drive usage, but usage does NOT drive expenditures. More interestingly, research success drives use. They were unable, however, to find any outcome that usage drives - not even research productivity.

There were several articles on the development and use of assessment systems and methods, including the University of Wollongong's "Library Cube", U Penn's MetriDoc, and ARL's Balanced Scorecard.

However, these are relatively older items (2010). Two articles reporting recent research caught my eye - one was a citation analysis study for collection development, and the other was an attempt to provide a method for assessing special collections.

A Citation Analysis of the Classical Philology Literature: Implications for Collection Development by Gregory Crawford from Penn State.
Research Question(s): What are the citation patterns in the classical philology literature and how have these changed over time?
What they did: Examined every citation in every article published in two specific years (1986 & 2006) of one journal prominent in the field, noting age, format, language, and length. This was compared with study results from the 1950s.
Take-home message: Citation patterns have not changed a whole lot over the last 50 years. Citations are about the same age (24-25 years), which is not too surprising given that the topics are 2500-3500 years old. The share of journal citations hasn't changed much either (28-29%). One interesting change has been an increase in the percentage of book citations (55% in 1956 to 68% in 2006), due primarily to reductions in citations to Festschriften and dissertations. More journal titles were cited in 2006 than in 1986 - the author suggests that researchers were "casting a wider net" - though this corresponds with an increase in journal titles in all fields. Eighteen journals made up the top-10 lists across the three years studied, with four titles appearing in all three years and four in two of the three. This suggests modest stability of the literature. This article will be of much interest when we look at our classical studies collection.

Data-Driven Decision Making: An Holistic Approach to Assessment in Special Collections Repositories by Melanie Griffin and Barbara Lewis of the University of South Florida and Mark Greenberg of Western Washington University.
Research Question(s): How can all aspects of special collections be assessed to enable better decision-making?
What they did: Used Web site usage, patron surveys, usability studies, and Web analytics to answer a series of questions regarding staffing needs, staff-training needs, customer needs assessment and technical needs assessment.
Take-home message: It takes a village - of measures, at least - to assess a library of any kind. While the title says "holistic", the approach is more akin to triangulation or, essentially, comprehensive assessment. The assessment focused on operations and marketing decisions; it did not attempt to get at the impact or outcomes of using their collections. Adding that would make it truly holistic.

Finally, there were a number of evidence summaries - critical reviews of published literature - and several of these caught my attention as well.

Well, reading all of these articles will fill my morning train commute next week.
