Monday, December 3, 2012

Perusing interesting ideas on a Sunday morning

I've developed a new routine these last few weeks of watching TED Talks on my TV (the convergence of TV and the Internet in action) and reading miscellaneous blog entries on Sunday mornings while my better half enjoys riding his motorcycle on the nearly empty streets. Of course, my dog would rather I be taking him to the park, but his fun will come later. This is my time to think and learn. And here is what I've discovered this week:

If we want to help people, Shut Up and Listen! Ernesto Sirolli discusses the seemingly obvious lesson learned from decades of failed Western aid projects in African nations. For instance, after his project taught some villagers in a fertile valley how to raise Italian tomatoes, all the fruits of their labor were lost to the hippos. Sirolli sums up why the villagers never told him about the hippos: because he never asked.

Sirolli takes this lesson and applies it to his own NGO, which nurtures entrepreneurship in developing countries. His method is to approach a potential client with no problems to address and no solutions in mind, but rather to listen to what the client wants to accomplish and what solutions he or she has in mind. While addressed to those in NGOs, this lesson should be (and sometimes is) applied to librarianship. Consider collection development. Rather than prescribing the kinds of materials for a particular subject or collection, we should be listening to, or at least paying attention to, what our readers (faculty and students) use and need. They may not know the specific items, but they know what they want to accomplish. What we can provide is the knowledge of publishing and the literature, as well as of storage and retrieval, that enables the selection of, and access to, the most useful materials and helps them succeed.

From The Scholarly Kitchen, I've read an essay that cautions against relying too extensively on "metrics" or even "altmetrics" to measure the quality of scholarly communication. The author, Kent Anderson, offers alternatives to metrics, or "alt2metrics," that address the problem that our current quantitative measures (e.g., impact factor, Eigenfactor) are dismissed by the very sources of scholarly communication, the academic researchers: "More often than not, academics and researchers are dismissive of metrics, as they’ve seen how, once you poke at them, they end up being relatively blunt measures with little nuance or depth." Anderson then describes what he believes are the factors researchers actually pay attention to when evaluating research, namely:

  • Brand (of journal)
  • Authorship (within the journal)
  • Results 
  • Sponsorship

My thought, however, is that these are exactly the kinds of "signals" that the metrics are trying to quantify. Impact factor is closely associated with the brand of a journal, as well as its sponsorship, while the h-index and citation-network measures capture author productivity, impact, and networking. Kent is advocating that we (those interested in measuring the quality of science) not ignore the "primary, original signals of value" that "scientists rely on every day to guide them and their searches for information." This does make sense, but I'm beginning to believe that the Uncertainty Principle applies to many, many more things than physics: the moment we turn a signal into a metric, the act of measuring starts to change the behavior of those being measured. I'm sure I'm not the only one who's come to this conclusion...
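
(For the quantitatively inclined, here is a minimal sketch, in Python, of what two of the metrics I mentioned actually compute. The function names and all the numbers are mine, invented purely for illustration; this is a toy illustration of how blunt these measures are, not anyone's production formula.)

    # A rough sketch of two common bibliometrics, using made-up data.

    def impact_factor(citations_to_prior_two_years, citable_items_prior_two_years):
        """Two-year impact factor: citations received this year to items
        published in the previous two years, divided by the number of
        citable items published in those two years."""
        return citations_to_prior_two_years / citable_items_prior_two_years

    def h_index(citation_counts):
        """Largest h such that the author has h papers cited at least h times."""
        ranked = sorted(citation_counts, reverse=True)
        h = 0
        for rank, count in enumerate(ranked, start=1):
            if count >= rank:
                h = rank  # at least `rank` papers have `rank` or more citations
            else:
                break
        return h

    # A hypothetical journal: 1,200 citations this year to its articles
    # from the previous two years, of which there were 400 citable items.
    print(impact_factor(1200, 400))       # -> 3.0

    # A hypothetical author with six papers and these citation counts:
    print(h_index([25, 8, 5, 3, 3, 1]))   # -> 3

Notice how much each number throws away: the impact factor says nothing about which articles drew the citations, and the h-index flattens a whole publication record into a single integer. Which, I suspect, is exactly why researchers fall back on brand, authorship, results, and sponsorship.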
