
Measure for Measure: Altmetrics (June 2017): For Institutions

By Beth Juhl

For Institutions

Founded by medical genetics researcher Euan Adie in 2011, the London-based company Altmetric (and its suite of web-based tools) provides services to institutions, publishers, and funders as well as individuals. Altmetric generates an Attention Score for a publication by factoring in a growing collection of sources, including mainstream news reports, blog posts, Wikipedia mentions, saves to online reference managers (e.g., Mendeley), reviews and recommendations on post-publication peer-review sites such as Publons and F1000, social media mentions, and posts to YouTube or discussion sites such as Reddit. The Altmetric Support portal (and Knowledge Base, in its Solutions section) provides a chronology and complete list of sources. The Attention Score is computed from the number of mentions, the type of source in which the mentions occur (for example, mainstream news outlets rank more highly than blogs), and the author of each mention (with prominent persons in the field ranking more highly). The score is then presented graphically as the Altmetric donut, a colorful representation of the categories in which impact was measured. Through its Altmetric Explorer web apps and various Badge products, the company provides real-time metrics to institutions, publishers, and funders. Altmetric provides the Badge service free of charge to universities for use in institutional repositories and makes research output data available on request to academic librarians. Altmetric Badges for Books were introduced in 2016 to track book mentions and citations based on ISBNs or book- and chapter-level DOIs.
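The scoring logic described above can be illustrated in toy form. Altmetric's actual algorithm and weights are proprietary; the weights, source names, and prominence multipliers below are hypothetical, chosen only to show how mention counts, source type, and author prominence might combine into a single rounded score.

```python
# Purely illustrative weights: higher-ranked source types (news) count more
# than lower-ranked ones (blogs, tweets). These are NOT Altmetric's real values.
SOURCE_WEIGHTS = {
    "news": 8.0,
    "blog": 5.0,
    "wikipedia": 3.0,
    "tweet": 0.25,
}

def attention_score(mentions):
    """Combine mentions into one score.

    mentions: list of (source_type, author_prominence) pairs, where
    author_prominence is a multiplier >= 1.0 for influential authors
    (a hypothetical stand-in for Altmetric's author weighting).
    """
    total = sum(
        SOURCE_WEIGHTS.get(source, 1.0) * prominence
        for source, prominence in mentions
    )
    return round(total)

# One news story, one blog post, and two tweets (one from a prominent account):
# 8 + 5 + 0.5 + 0.25 = 13.75, rounded to 14.
print(attention_score([("news", 1.0), ("blog", 1.0), ("tweet", 2.0), ("tweet", 1.0)]))
```

The point of the sketch is only the structure: each mention contributes a source-dependent weight scaled by who made it, and the aggregate is reported as a single integer.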

PlumX metrics from Plum Analytics do not compute a single numeric score but instead report on more than sixty types of research output (so-called “artifacts”) and the activities related to those artifacts, grouped into five categories: usage (e.g., downloads, plays, or WorldCat holdings), captures (exports and saves, bookmarks, subscribers), mentions (blog posts, reviews, links), social media (likes, tweets, shares), and citations (from Crossref, PubMed, Scopus, SSRN, etc.). Plum Analytics matches activity to these artifacts and aggregates it into dashboards, documenting trends and reach at the artifact, researcher, departmental, or organizational level. The company sells its PlumX Dashboards product to institutions, where staff can curate researcher and departmental profiles and manage the publications associated with those individuals and groups. Its PlumX metrics allow institutions to embed reports directly into institutional repositories or web pages. It also offers a reporting object not unlike the Altmetric donut: a sunburst-shaped graphical representation of the categories and venues in which the research makes the most impact. Other tools in the Plum Analytics suite report on grant activities, present benchmark comparisons, or offer funding opportunities matching an organization’s research profile.
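The five-category reporting model can be sketched as a simple tally. The activity names and the mapping below are illustrative only (Plum Analytics' actual artifact taxonomy is far larger, covering sixty-plus output types); the sketch shows the key design difference from Altmetric: activities are counted per category rather than collapsed into one score.

```python
from collections import Counter

# Hypothetical mapping from activity types to the five PlumX categories
# named in the article. Illustrative, not Plum Analytics' real taxonomy.
CATEGORY = {
    "downloads": "usage", "plays": "usage", "holdings": "usage",
    "exports": "captures", "bookmarks": "captures", "subscribers": "captures",
    "blog_posts": "mentions", "reviews": "mentions", "links": "mentions",
    "likes": "social_media", "tweets": "social_media", "shares": "social_media",
    "crossref": "citations", "pubmed": "citations", "scopus": "citations",
}

def plumx_summary(activities):
    """Tally one artifact's activity counts into the five categories.

    activities: dict mapping an activity type (e.g. "downloads") to a count.
    Returns a dict of category -> total; no single score is computed.
    """
    totals = Counter()
    for activity, count in activities.items():
        totals[CATEGORY.get(activity, "other")] += count
    return dict(totals)

# An article with downloads, tweets, Scopus citations, and reviews:
print(plumx_summary({"downloads": 120, "tweets": 30, "scopus": 4, "reviews": 2}))
```

Dashboards at the researcher or departmental level would, under this sketch, simply sum these per-artifact tallies across a profile's publications.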

Not content to simply purchase off-the-shelf solutions, leading research universities, originally in the UK and now worldwide, are working with Elsevier to develop a framework of research metrics that can be shared and compared across institutions. The pilot Snowball Metrics project has produced two Snowball Metrics Recipe Books, the latest with two dozen scholarly communication measures such as research output and citations, public engagement, and altmetrics based on Web of Science, Scopus, Google Scholar, and other sources.