Vol. 43 No. 2 — March 2023

METRICS MASHUP

What Do We Need to Know?
Info Pro Competencies and Competence
by Elaine M. Lasda


I have been thinking lately about competencies for librarians and information professionals, especially those related to measurement and evaluation of research impact. This pondering led me down a rabbit hole regarding some key definitions. Seeking clarity on the distinctions between “competency” and “competence,” I consulted my institution’s subscription to the Oxford English Dictionary (OED Online, December 2022).

COMPETENCY DEFINED

Not surprisingly, the OED provides many iterations of meaning for both terms. For my purposes, some definitions of competence were not relevant; for example, a medical definition says, “Ability (of a valve or sphincter) to function normally.” Taken as a whole, the definitions of competence mainly relate to adequacy: “an adequate supply” or “capacity to deal adequately with a subject.” Another sense of “competence” is that of being comfortable, of living with ease, or having sufficient means.

Words like “adequate” and “sufficient” crop up in the definition of competency as well. In fact, in some circumstances, the two terms are considered synonyms. The nuanced difference falls within this definition of “competency”: “sufficiency of qualification; capacity.” For my purposes, competence is a sense of adequate function and comfort, while competency conveys a sense of adequate qualifications. Really, no surprises here.

What about the concept of “adequate”? Check out this OED definition: “Satisfactory, but worthy of no stronger praise or recommendation; barely reaching an acceptable standard; just good enough.” Thus, if we as info pros are to master any sort of competencies in our field, we are barely reaching an acceptable standard. I can see that there ought to be a bare minimum benchmark, but I was always under the impression that having competencies indicated a greater level of proficiency or qualification. Lesson learned, so I reframed and moved on to my original question.

ARE OUR COMPETENCIES CURRENT?

Devoting time and attention to this topic is worthwhile, as we hear from many corners of library land. Providing support for research impact metrics is a value-add that we can provide for our organizations. Swedish researchers Fredrik Åström and Joacim Hansson rightly point out that bibliometric expertise can garner librarians and libraries influence and status within their organizations (“How Implementation of Bibliometric Practice Affects the Role of Academic Libraries,” Journal of Librarianship and Information Science, v. 45, n. 4, 2013: pp. 261–352; doi.org/10.1177/0961000612456867). One passage in their article particularly struck me:

“Traditionally, librarianship has to a large extent been focused on issues concerning the acquisition and organization of library collections, as well as searching and retrieving information for the users of the library. However, with online access to both search tools and the information per se, it is easier for users to do many of the search- and retrieval-related tasks themselves. At least to some extent, it could seem that some of the services related to one of the core competencies of librarianship are no longer in high demand by users of the library” (italics mine).

An overarching consideration regarding the increased reliance on evaluation metrics in the research ecosystem is how much better and more nuanced the metrics, and the resulting analytics, have become since I ran my first citation search using ISI databases on DIALOG decades ago. With AI techniques such as natural language processing, sentiment analysis, and text mining, we can expect the categorization of cited references and altmetric sources to become both more accurate and easier to obtain.
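To make that kind of categorization concrete, here is a minimal sketch of rule-based classification of citation contexts. The categories and cue words are my own illustrative assumptions, not any vendor's actual method; real systems would rely on trained NLP models rather than keyword matching.

```python
# Minimal sketch: rule-based categorization of citation contexts.
# The categories and cue phrases below are illustrative assumptions;
# production tools use trained language models, not keyword lists.

SUPPORTIVE_CUES = {"confirms", "consistent with", "supports", "builds on"}
CRITICAL_CUES = {"contradicts", "fails to", "overlooks", "disputes"}

def categorize_citation(context: str) -> str:
    """Assign a coarse category to the sentence surrounding a citation."""
    text = context.lower()
    if any(cue in text for cue in SUPPORTIVE_CUES):
        return "supportive"
    if any(cue in text for cue in CRITICAL_CUES):
        return "critical"
    return "neutral"

if __name__ == "__main__":
    sample = "Our results are consistent with Smith et al. (2019)."
    print(categorize_citation(sample))  # -> supportive
```

Even a toy classifier like this suggests why accuracy is the sticking point: a citation can be hedged, negated, or merely perfunctory, which is exactly where the more sophisticated AI approaches promise improvement.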

In many organizations, librarians have planted their flags firmly in bibliometric support and other valuable areas of the evolving research ecosystem. Growth areas adding value in research circles include expertise on new forms of scholarly communication, advocacy for researchers’ rights regarding copyright and licensing of research output, contributions to evidence synthesis projects, support for digital scholarship/digital humanities, and leverage of AI and new technologies. You can probably think of more.

ADEQUACY OF SKILL SETS

These issues taken in total require a seismic shift in librarian competencies. We need to develop a concept of adequacy for all these areas for librarians and info pros in the research enterprise. There is a fair amount of literature attempting to delineate appropriate definitions of competencies in this area.

Here are two notable examples: the 2021 Competencies Model for Bibliometrics, posted on The Bibliomagician blog (thebibliomagician.wordpress.com/competencies), and the July 30, 2021, post by Barbara S. Lancho Barrantes, Hannelore Vanhaverbeke, and Silvia Dobre (thebibliomagician.wordpress.com/2021/07/30/the-new-competencies-model-for-bibliometrics) on the new competencies model. The competencies are specific, detailed, and largely focused on compliance with the U.K.’s REF (Research Excellence Framework) and other European research evaluation standards. These competencies are helpful guideposts, but for librarians in some research enterprises, even these basic-level statements might be overkill for the generalist. Here are some examples:

Under Knowledge in the Field, Entry Level, you’ll find “Support different stakeholders with general enquiries about bibliometrics i.e., how to analyse publication activity or find potential co-authors,” and “Help researchers identify suitable places to publish.”

Under Technical Skills, Entry Level, you’ll find “Prepare reports and presentations that include some interpretations and visualisations of the bibliometric data.”

For my institution at least, these competencies would be more intermediate than entry-level. To me, “entry-level” means someone fielding the reference question intake (walk-in, chat-in, email-in, and so forth) knows where to start someone off with their question. The three competencies above would be the domain of librarians with subject matter expertise in the research domain. I doubt, however, that my experience is generalizable to all libraries and information services.

Some LIS programs and iSchools have curricula that do not appropriately reflect newer skills needed in the field. Most of us learn about citation analysis, research evaluation, metrics, and indicators on the job, not through graduate-level coursework. Resources such as the LIS-BIBLIOMETRICS list; blogs like The Bibliomagician, Aaron Tay’s Musings About Librarianship, and occasional posts on The Scholarly Kitchen; higher-education trade publications; groups such as the Bibliometrics and Research Impact Community (BRIC) in Canada; events that include the Bibliometrics Symposium held by the National Institutes of Health; and training sessions and conference programming offered by ALA, SLA, and others represent ground-level efforts to this end. These strategies are informative but in no way standardized. What skills essential for the recent M.S.I.S. graduate would constitute an “adequate” level for bibliometrics and research impact?

IS EASIER REALLY BETTER?

Another issue to consider is what Anne K. Krüger and Sabrina Petersohn dub the “de-skilling” of citation searching by database vendors (“‘I Want to Be Able to Do What I Know the Tools Will Allow Us to Do’: Practicing Evaluative Bibliometrics Through Digital Infrastructure,” Research Evaluation, October 2022; doi.org/10.1093/reseval/rvac009).

De-skilling refers to efforts by product developers to simplify the citation searching process and to include more on-the-fly analytical tools that users can ostensibly use to obtain the desired metric output easily and seamlessly. Providers/vendors, the features and capabilities of resources and tools, and end users thus combine to form the overall infrastructure, which drives how the tools are used. I suppose my personal conception of what is “barely adequate” in the bibliometric world would be the ability to search these ever more “simplified” platforms. This is problematic, however, because errors in records and metadata render the auto-generated metrics messy and inaccurate. A more skilled librarian could address dirty data issues by exporting the data, cleaning it up, and calculating more accurate metrics, as the sketch below illustrates. Would that then be the baseline?
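As a rough illustration of that export-clean-recalculate workflow, here is a minimal sketch in Python. The file name (records.csv) and column header (“Times Cited”) are hypothetical stand-ins for whatever a given platform exports, and the “cleaning” shown is only the simplest case; real cleanup also involves deduplication, author disambiguation, and metadata repair.

```python
import csv

def h_index(citation_counts: list[int]) -> int:
    """Largest h such that h papers each have at least h citations."""
    counts = sorted(citation_counts, reverse=True)
    h = 0
    for rank, cites in enumerate(counts, start=1):
        if cites >= rank:
            h = rank
        else:
            break
    return h

def load_counts(path: str) -> list[int]:
    """Read citation counts from a hypothetical CSV export.

    The only cleaning here is dropping blank or non-numeric values;
    real-world cleanup is considerably more involved.
    """
    counts = []
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            raw = (row.get("Times Cited") or "").strip()
            if raw.isdigit():
                counts.append(int(raw))
    return counts

if __name__ == "__main__":
    print("h-index:", h_index(load_counts("records.csv")))
```

The point is not the few lines of code but the judgment behind them: knowing which records are duplicates, which metadata fields lie, and which metric is appropriate is exactly the skill that platform “simplification” does not replace.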

STANDARD ADEQUACY

The more I dug in, the less I felt confident that a standardized, baseline, “adequate” competency for all librarians and info pros is definable. This points to yet another competency: keeping current. The sands of what constitutes an adequate level of competence in bibliometric and research evaluation are shifting rapidly as our data analysis capabilities become ever more sophisticated. What I did derive from this thought exercise is a preliminary list of questions I think should be considered when evaluating what knowledge, skills, and abilities are adequate in each of our research support settings.

  • What resources are available for info pros to use in their settings?
  • What kinds of services and outputs are required by patrons/users?
  • Is the service advisory, or does it merely deliver output? In other words, does your service provide context and recommendations to go along with the output, or does it leave interpretation to the user?
  • “If you build it, they will come.” Would developing services currently not offered make sense in each information center setting?
  • How should info pros best understand the institutional applications and misapplications of evaluative metrics in their organization?
  • How should info pros advocate for responsible use and educate stakeholders in the nuances of metrics? What level of advocacy or education is necessary?
  • What types of statistical and data science skills are necessary for data cleaning, manipulation, and visualization?
  • How will your organization educate staff on new trends such as sentiment/text analysis, network analysis, and new ways of viewing citation data, including the sentiment of citations?
  • How will your organization educate staff on how metrics reflect changes in an increasingly open research ecosystem?

A POSSIBLE WAY FORWARD

After this process of reading and mulling, I am less certain there could be boilerplate language for “adequate” research evaluation and impact metrics competencies. Like any planning endeavor, libraries and information centers would do well to develop an organization-specific evaluation of resources such as tools and staffing, gather the needs and concerns of enterprise stakeholders, and consider how to optimize growth and advocacy opportunities. Graduate schools of LIS and iSchools could teach to the broader concepts, such as the uses and goals of citation analysis and research impact, ethics, and responsible use. If our traditional competencies are of diminishing use, it is essential for schools to integrate content related to research evaluation strategies into coursework for emerging librarians and info pros.



Elaine M. Lasda (elasda@albany.edu) is coordinator for scholarly communication and associate librarian for research impact and social welfare, University at Albany, SUNY.

Comments? Email Marydee Ojala (marydee@xmission.com), editor, Online Searcher.