Mastering Open Access Metrics

As with their subscription-based counterparts, the metadata attached to Open Access articles are essential to measuring their impact in the academic world.

Open Access (or OA) is a result of the digital publishing revolution. When print was augmented or replaced with interconnected digital documents, scholars, researchers, and librarians naturally turned increasingly to the open and ostensibly free Web. They also questioned the role of traditional, subscription-based gatekeepers—for both the published results and the data themselves. As this series has shown, the open-versus-subscription debate is complex and difficult. One issue is clear, however: for digitally published articles, the quality and consistency of connected metadata are essential. Being able to find, evaluate, cite, and track the right article—among many thousands of others—is possible only with a consistent approach to “card catalog” definitions and how they are presented and measured.

Traditional journals have always had basic, relatively consistent metadata, of course. Well before the digital era, bibliographic information was the mainstay for librarians and researchers. However, digital media not only invite a richer metadata landscape, they require it. Journal articles that are no longer bound volumes on a shelf but binary code on a disk simply need more help if they are to remain useful to humans.

Because Open Access is a largely digital phenomenon, OA journals have tended to be more proactive in adopting metadata and metrics. “Open Access publishers embraced digital metadata standards early on, and have introduced new concepts that traditional publishers have adopted,” said Anneliese Taylor, Assistant Director, Scholarly Communications & Collections at the University of California, San Francisco Library. “From the perspective of openness, these identifiers [such as the Open Researcher & Contributor ID or ORCID for journal authors] enable the free and easy transfer of information. An ORCID allows researchers to find other articles by an author, and the more open the publication, the easier it makes the research process.”
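To make that point concrete, here is a minimal sketch of the kind of author lookup an ORCID iD enables, using ORCID’s public v3.0 REST API. The iD shown is a placeholder, and the response parsing assumes the documented “group”/“work-summary” JSON structure; treat the details as illustrative rather than definitive.

```python
# A minimal sketch: listing the works attached to an ORCID iD via the
# public ORCID v3.0 REST API. The iD below is a placeholder; the parsing
# assumes the documented "group"/"work-summary" response structure.
import requests

orcid_id = "0000-0002-1825-0097"  # placeholder example iD

resp = requests.get(
    f"https://pub.orcid.org/v3.0/{orcid_id}/works",
    headers={"Accept": "application/json"},
    timeout=10,
)
resp.raise_for_status()

# ORCID clusters duplicate records of the same work into "groups";
# the first summary in each group suffices for a simple title listing.
for group in resp.json().get("group", []):
    summary = group["work-summary"][0]
    title = summary["title"]["title"]["value"]
    ext_ids = (summary.get("external-ids") or {}).get("external-id", [])
    doi = next(
        (e["external-id-value"] for e in ext_ids
         if e["external-id-type"] == "doi"),
        None,
    )
    print(title, "-", doi or "no DOI")
```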

Metadata Basics

ORCIDs are one of several major types of metadata used by both subscription-based and OA journals. Another is the Digital Object Identifier, or DOI. For scholarly content, the nonprofit group Crossref is (among other things) the official registrar of this ISO standard. A DOI is the “persistent identifier” for each article’s digital content, including abstracts, related objects, and physical assets or files. As a series of linked identifiers maintained by Crossref, DOIs are applied to unpublished drafts (preprints) as well as to articles accepted for peer review and final published articles.

Another significant metadata development is Crossref’s “CrossMark” service, which allows publishers to send standardized “change metadata” as part of their normal DOI deposits, or as updates. These records note the existence of updates or new versions, the occurrence of challenges or concerns, or, rarely, outright retractions. Such data are displayed when the reader clicks the CrossMark icon on the HTML or PDF version of the article. CrossMark article status updates are also available via an API and can thus be propagated to other systems, which would make it possible for publishers and platforms to expose these updates (including digital citations) automatically. At present, however, change metadata are not automatically passed to digital citations in other articles.

Other common metadata types are not as open as ORCIDs and DOIs. Institutional identifiers, for example, are growing in importance—not only for researchers but also for governmental and funding entities, many of which operate under an Open Access mandate. Currently, a database of institutional identifiers is maintained by a for-profit data and research company, Ringgold, and these identifiers are proprietary in nature. Crossref and others are exploring the eventual creation of an open source registry of research institutions and departments, which would require an open metadata approach.

A related type of metadata, the funding identifier, is of particular interest to Open Access publishing. The increased demand for transparency—including about the source of funding for previous research—has led to Crossref’s Open Funder Registry.
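As a rough illustration of how these identifiers fit together, the sketch below queries the public Crossref REST API for a single DOI and prints its registered title, funder metadata, and any update notices of the kind surfaced through CrossMark. The DOI is a placeholder, and the field names are based on Crossref’s documented works schema; this is a sketch, not a complete client.

```python
# A minimal sketch: fetching a DOI's registered metadata from the public
# Crossref REST API. The DOI below is a placeholder; field names follow
# Crossref's documented works schema.
import requests

doi = "10.5555/12345678"  # placeholder example DOI

resp = requests.get(f"https://api.crossref.org/works/{doi}", timeout=10)
resp.raise_for_status()
work = resp.json()["message"]

print("Title:  ", work["title"][0])
print("Type:   ", work["type"])

# Funder metadata of the kind registered in the Open Funder Registry.
for funder in work.get("funder", []):
    print("Funder: ", funder.get("name"), funder.get("award", []))

# CrossMark-style change metadata: a correction or retraction record
# carries an "update-to" list naming the DOI(s) it updates.
for update in work.get("update-to", []):
    print("Updates:", update.get("DOI"), f"({update.get('type')})")
```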

Article-Level Metrics: Impact & Usage Metadata

Beyond the structural basics, publishers are increasingly involved with usage data that indicate an article’s use and reputation. Both traditional and Open Access publishers increasingly count an article’s citation frequency as an impact indicator. Other factors, however, such as the number of page views, downloads, or even social media mentions, can also indicate potential impact and affect the author’s reputation.

Taylor noted the potential impact of this usage data. “For Open Access articles on many OA platforms, for example, one can find the number of HTML views or PDF downloads, social network shares, and occasionally their citation instances as well,” she said. “It’s a way to know the impact a particular article has. An article may have been published in a prestigious journal like Nature, but maybe very few people read it.” Article quality is not guaranteed if all you know is where it was published, she noted.

Those looking for a tidy solution will probably be disappointed, however, according to John Chodacki, Director of the University of California Curation Center. “You can’t neatly roll up and evaluate everything in an apples-to-apples manner. What article-level metrics can do is tell a story,” he said. “It combines different types of metrics about social reach and citations and usage. You need alternative ways of measuring as well as traditional ones—using data that are as atomized as possible.” Chodacki noted that the aim of such metrics is to “go below the journal level” when evaluating article impact. “Rather than simply measuring the value or reputation of the container, this will help us evaluate the value of the research itself.”

Librarians evaluating the quality of individual articles can begin to use these data, despite the lack of a concise roll-up number or score. Chodacki pointed to the existence of citation and social media indices, but said that librarians need to start with a different premise. “Instead of saying generically ‘Let me help you,’ we need to ask, ‘What kind of help do you want?’ whether that be focused on discovering social reach, short-term/long-term impact, or other factors.” He concluded with the need for an open clearinghouse of bibliometric data: “We need to look at the problem differently, and that requires infrastructure.”

Chodacki currently chairs the Crossref Event Data initiative, whose aim is to better understand how scholarly content is shared and consumed. Clearly, there is still a lot of mapping work to do.
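For readers who want to see what such event-level data look like in practice, the sketch below queries the Crossref Event Data API for events that reference a given article. The DOI is a placeholder, and the query parameters and response fields are assumptions based on the public Event Data query API documentation; verify them before relying on the details.

```python
# A minimal sketch: retrieving Event Data "events" (tweets, Wikipedia
# references, blog links, etc.) that point at a DOI. The DOI below is a
# placeholder; parameter and field names follow the public query API docs.
import requests

doi = "10.5555/12345678"  # placeholder example DOI

resp = requests.get(
    "https://api.eventdata.crossref.org/v1/events",
    params={"obj-id": f"https://doi.org/{doi}", "rows": 25},
    timeout=30,
)
resp.raise_for_status()

# Each event links a subject (a tweet, a Wikipedia page, ...) to the
# article via a relation such as "discusses" or "references".
for event in resp.json()["message"]["events"]:
    print(event["source_id"], event["relation_type_id"], event["occurred_at"])
```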