New Platform Released—Altmetric for Institutions
Altmetrics (alternative metrics) emerged in 2010 as a new category of scholarly impact measurement. Since then, the field has become “a fluid area of research and practice, in which various alternative and traditional measures of personal and scholarly impact can be explored and compared simultaneously.”
In January 2014, librarians Robin Chin Roemer and Rachel Borchardt reviewed the growth of altmetrics and highlighted some key developments.
For example, the Public Library of Science (PLoS) was one of the first publishers to give its authors access to article-level metrics, while Mendeley, an altmetrics-enabled citation management and networking tool, offers users unique data about article readership. Single-user tools like ImpactStory have also emerged as a way for researchers to capture their impact through altmetrics data channels. Likewise, entities like Altmetric.com have developed apps that demonstrate how altmetric data can enhance bibliometric data through integration with Scopus and other websites. More recently, higher-level altmetrics tools like PlumX (part of EBSCO since January 15, 2014) have emerged that summarize and compare the impact and quality not only of individuals but of research centers, departments, and institutions around the world.
Activity in this area continues apace, with interesting competitive and cooperative interactions among some of the altmetrics providers. Just last week, Plum Analytics, an EBSCO company, announced a new way to visualize the data in PlumX called Plum Print. This week brings a draft white paper from the National Information Standards Organization (NISO) summarizing Phase I of its Alternative Assessment Metrics (Altmetrics) Project, open for public comment through July 18, 2014. Also this week comes the announcement of a new product offering from Altmetric.
Altmetric for Institutions
Altmetric was founded by Euan Adie in 2011. Adie had previously worked on Postgenomic.com, an open source scientific blog aggregator founded in 2006. Adie says, “Interested in taking the ideas from Postgenomic forward we entered an altmetrics app into Elsevier’s Apps for Science competition and ended up winning. The prize money helped us to grow from an evenings & weekends project into a full-fledged product: the first standalone version of the Altmetric Explorer was released in February 2012. In July 2012 we took on additional investment from Digital Science. Our users now include some of the world’s leading journals, funders and institutions.” [Note: Digital Science is a division of Macmillan Science & Education.]
Altmetric for Institutions is a just-announced web-based application that allows users to search, monitor, and report on the online attention surrounding published research. Institutional users can browse all of the papers Altmetric has ever picked up a mention for, and view research published specifically by their institution. They can define search filters, create custom groups, and set up regular reporting to help them track and analyze the attention their research is receiving. The company worked closely with Cambridge University on the development of the new institutional platform; two other development partners, one in Australia and one in the U.S., were not named.
Juergen Wastl, Research Strategy Officer at University of Cambridge, commented: “Altmetric for Institutions has so far proved a very useful tool for monitoring and collating information on the attention that our published research is receiving. It is invaluable in that we can analyse and report on data that would otherwise take weeks or months to collate—and in doing so can better support not only our Departments and Faculties but also our research networks and initiatives. Working with Altmetric as a development partner has been a great experience in a cooperative spirit, and we will work towards introducing this platform to our wider faculty.”
Altmetric for Institutions is now available for trialling—visit http://www.altmetric.com/institutional-edition.php to explore the platform with sample data, or email info@altmetric.com if you would like to establish a trial incorporating your institution’s data. Pricing has not yet been determined. A web demo will be offered at 4pm BST on Friday June 20, 2014 (11am EDT, 10am CDT).
I interviewed Euan Adie by phone and email for the details. Here’s our Q&A about the company, its products, target markets, and issues concerning altmetrics.
ATG: What prompted the development of a tool like Altmetric—was there a specific pain point or a problem to be solved for you personally?
EA: Yes. I started out as a computational biologist, working in medical genetics. I wrote software, and the crazy thing about that is you don’t get any recognized credit as a researcher for just writing software. Instead, you have to write an application note, which is essentially a screenshot and a download link, get that published, and hope that people cite it.
This was around 2005, and a memo was going round the lab I worked in: because of the research assessment exercise the British government was about to run, we were told to publish fewer incremental advances in low-impact journals and more “big stories,” covering a year or eighteen months’ worth of work, in high-impact titles.
That struck me as a terrible disservice to researchers: high-impact journals publish very particular kinds of research. In fact, the impact factor itself reflects a very particular kind of impact, namely scholarly use. What about all the other kinds of positive impact or influence that research can have?
ATG: Describe the market for your version of Altmetric Explorer before introducing this new Altmetric for Institutions. What does this institutional version offer that’s new?
EA: To date we have worked mostly with publishers, providing them with data and easy-to-embed badges that they can showcase alongside the articles on their sites to give authors and readers a better idea of the attention the research has received. As part of this we created the Altmetric Explorer, which enabled publishers to easily view the data for all of their articles, compare journals, and report on this data internally and to their editorial boards.
We offered Explorer accounts for free to academic librarians, and almost immediately began to get requests for additional functionality. These suggestions and more have been incorporated into the new institutional version of the platform. To summarize, we have added:
- The ability to import data on who is publishing what (and your organizational structure) from a CRIS (current research information system) or institutional repository.
- Group functionality—summary level reports and article data can be viewed at the departmental or institutional level—or users can create custom reports based on the data set they wish to see.
- Data for individual authors—in the first version of the Explorer this was not easy to do—but now you can view the altmetrics data for papers published by specific authors within your institution, great for when somebody comes to you as a librarian for help figuring this altmetrics stuff out. We’ll shortly be offering the capability to search by ORCID iD as well.
- Summary level reporting—our new summary screen gives an overview of the attention by source, a map overview of location, and a chart which lists articles in order of the amount of attention they have received.
ATG: Who do you expect to use Altmetric for Institutions and for what purposes?
EA: Our experience with our development partners to date has shown us that Altmetric for Institutions is relevant to a few different groups across the organization.
- Librarians (research support librarians in particular) can help their researchers discover more about the broader impact their papers are having, or easily identify notable articles for suggested reading.
- Research administration officers can use the new platform to track the wider impact that research published by their institution is having. They can identify key influencers, monitor individual departments, and use the data to provide additional evidence for research assessment exercises or grant applications.
- Communications offices can use it to help tell the story of their institution and to identify successes they might not otherwise hear about. Altmetric for Institutions helps to close the loop between the researcher, publisher, and comms office, and provides valuable content to help promote the attention their research attracts. Just as they might use a media monitoring service such as Meltwater to track mentions of their institution, they can use our platform to track the attention around their papers specifically, collated and disambiguated.
- And finally, researchers themselves can use the platform to monitor the attention that their research is getting, provide evidence of impact, and add valuable insights to their CVs or personal profile pages.
ATG: What types of data do you track and how often is it updated?
EA: We scan all of our sources for any mentions of unique identifiers, such as DOIs, PubMed or arXiv IDs, or handles (handle.net). Our sources include:
- Social media—Twitter, Facebook, Google+, LinkedIn, Sina Weibo, Reddit
- Mainstream news—global news outlets such as the BBC and the New York Times, but also local papers and specialist titles like Scientific American and New Scientist
- Blogs—a manually curated list that ensures we keep out spam
- Policy documents—our newest addition; we now track where articles are mentioned in policy documents to show the real-life application of the research
- Peer-review sites—such as Publons, Peerage of Science, or PubPeer
- Faculty of 1000
- YouTube (here we track mentions in the description of the video—not any spoken reference to an article)
- CiteULike/Mendeley—these do not contribute to the Altmetric score, but we provide reader counts.
Social media and news mentions update roughly hourly, and most of the other sources at least daily.
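To make the mechanics concrete, here is a minimal sketch of what this kind of identifier scanning might look like, assuming simplified regular expressions. The patterns and function names are illustrative assumptions, not Altmetric’s actual implementation.

```python
import re

# Illustrative patterns for common scholarly identifiers. These are
# deliberately simplified; a production system handles many more edge cases.
PATTERNS = {
    "doi": re.compile(r"\b10\.\d{4,9}/[-._;()/:A-Za-z0-9]+"),
    "arxiv": re.compile(r"\barXiv:\d{4}\.\d{4,5}(v\d+)?", re.IGNORECASE),
    "pmid": re.compile(r"\bPMID:?\s*\d{1,8}\b", re.IGNORECASE),
    "handle": re.compile(r"\bhdl\.handle\.net/\d+(\.\d+)*/\S+"),
}

def extract_identifiers(text):
    """Return every recognized identifier found in a piece of text."""
    found = []
    for kind, pattern in PATTERNS.items():
        for match in pattern.finditer(text):
            found.append((kind, match.group(0)))
    return found

# A hypothetical tweet mentioning a paper by handle and PubMed ID:
tweet = "Great read: http://hdl.handle.net/10150/12345 (PMID: 23456789)"
print(extract_identifiers(tweet))
# [('pmid', 'PMID: 23456789'), ('handle', 'hdl.handle.net/10150/12345')]
```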
ATG: How does Altmetric handle disambiguation?
EA: We look for a unique identifier on each version of a paper, for example the DOI, PubMed ID, or arXiv ID. At the point where we find two of these identifiers mentioned together, we are able to reconcile all of the information for that article.
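Conceptually, this is a clustering problem: each identifier starts in its own cluster, and co-mentions merge clusters. Below is a hedged sketch using a simple union-find structure; the class name, identifiers, and merge rule are hypothetical illustrations of that idea, not Altmetric’s actual code.

```python
# Minimal sketch of identifier reconciliation: when two identifiers are seen
# together (e.g., a landing page listing both a DOI and a PubMed ID), merge
# them into one cluster representing a single article.
class ArticleClusters:
    def __init__(self):
        self.parent = {}

    def find(self, ident):
        """Return the canonical representative of an identifier's cluster."""
        self.parent.setdefault(ident, ident)
        while self.parent[ident] != ident:
            self.parent[ident] = self.parent[self.parent[ident]]  # path halving
            ident = self.parent[ident]
        return ident

    def seen_together(self, a, b):
        """Record that identifiers a and b refer to the same article."""
        self.parent[self.find(a)] = self.find(b)

clusters = ArticleClusters()  # the identifiers below are made up
clusters.seen_together("doi:10.1234/example.5678", "pmid:23456789")
clusters.seen_together("pmid:23456789", "arxiv:1406.0001")

# All three now resolve to one cluster, so a mention of any of them can be
# counted against the same paper.
assert clusters.find("doi:10.1234/example.5678") == clusters.find("arxiv:1406.0001")
```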
ATG: How many customers are using the various versions of your service?
EA: We currently have around 50 publisher customers using one product or another, including Elsevier, Wiley, Nature Publishing Group, and the Royal Society of Chemistry. We receive around 7 million calls to our API each day, and have so far tracked mentions for more than 2 million papers. We’re working with 7 of the top 10 journals (as ranked in Google Scholar), up from 3 this time last year, so we’ve come a long way quite quickly!
ATG: How do you compare to your competitors, such as Plum Analytics (an EBSCO company), ImpactStory, and others?
EA: Altmetrics is a relatively new field, and all three companies are learning and making improvements on a daily basis. We’ve always been friendly with ImpactStory (a not-for-profit venture that gives researchers the opportunity to create an ‘altmetrics CV’), and in fact we now provide some of their social media data. They’re doing some really great things, especially around tracking more than just articles and datasets, and I’d actually say they’re more of a complement than a competitor.
Plum’s doing a brilliant job getting out there and talking to institutions about altmetrics, which is good for awareness of the field as a whole.
Altmetric is primarily a data company, and that’s our strength—altmetrics data is our specialty, which is how we’ve ended up being trusted in the publisher world.
ATG: What improvements are planned for the future?
EA: We will continue to develop and improve our offering for both publishers and institutions. We’ll be expanding our policy document coverage and adding more advanced reporting functionality, as well as the ability to search for authors by ORCID.
There are always more sources to track popping up, too!
ATG: What about data quality and consistency among Altmetric providers?
EA: I think it’s important to remember that consistency between providers for a given metric is only possible if the providers all first agree on what that metric is and where the data behind it comes from.
There’s no one standard alternative metric; that’s kind of the point of the field. The kinds of consistency issues that I think we as a community need to figure out are more about individual sources like, say, Twitter. In theory, if you ask how often an article has been tweeted you should get a consistent answer from anybody counting.
But actually there are all sorts of assumptions involved, and those assumptions aren’t necessarily easy to make. For example, should you include any tweets of the preprint version of the article? What about the version in the author’s institutional repository?
From an author’s perspective, you’d probably want the total tweet count for the article no matter what platform it is hosted on. From a publisher’s perspective, you’d possibly only want the count from your journal.
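To make the ambiguity concrete, here is a small hypothetical worked example; the version labels and counts are invented. The point is simply that two providers can report different “tweet counts” for the same article while both counting correctly.

```python
# Hypothetical tweet counts for one article, keyed by which version was linked.
tweets_by_version = {
    "journal": 40,     # tweets linking to the publisher's version of record
    "preprint": 25,    # tweets linking to the preprint server copy
    "repository": 5,   # tweets linking to the institutional repository copy
}

# An author-centric provider might count every version...
author_view = sum(tweets_by_version.values())

# ...while a publisher-centric provider counts only its own copy.
publisher_view = tweets_by_version["journal"]

print(author_view, publisher_view)  # 70 40 -- same article, different answers
```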
These are the kinds of questions we need to come together as a community, both tool providers and users, to answer. A good example of how it’s being addressed is the NISO altmetrics standards process that kicked off about six months ago.
ATG: Are people going to get hired (or fired) on the basis of their Altmetric score?
EA: We hope not (and strongly discourage this). The score does not indicate research quality; it is merely an indication of the volume and type of attention an article has received. Looking into the original mentions and underlying data may provide useful insight for tenure or hiring committees, but the number alone must never be the basis of such a decision.
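For illustration only, a composite attention score of this general kind can be sketched as a weighted sum over mention types. The weights and source types below are invented for the sketch; they are not Altmetric’s actual algorithm, which also takes into account who is mentioning the article.

```python
# Invented weights for the sketch -- NOT Altmetric's real scoring model.
ILLUSTRATIVE_WEIGHTS = {"news": 8, "blog": 5, "tweet": 1}

def attention_score(mentions):
    """Weighted sum over mention counts by source type."""
    return sum(ILLUSTRATIVE_WEIGHTS.get(source, 0) * count
               for source, count in mentions.items())

# Two very different attention profiles can yield the same number, which is
# one reason the score alone says nothing about research quality.
print(attention_score({"news": 2, "tweet": 4}))  # 20
print(attention_score({"tweet": 20}))            # 20
```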
ATG: How do you see the field of altmetrics developing in the future?
EA: I think altmetrics is a couple of years away from being mainstream, which may make the name even more confusing than it already is. (“Alternative” isn’t the right word, really, as this kind of data is complementary to citations, and “metrics” implies that the most important thing is the numbers, when actually the metrics are a way into the richer, qualitative data—which policy documents are citing the research? Who is tweeting about it? And so on.)
Partly this is just a question of uptake. Google Scholar has a list of “top publications”—journals ranked by their 5 year h-index. Last September we were working with 2 of the top 10, now we’re working with 7 of them. More researchers are seeing altmetrics data than ever before, and using it in different ways: we serve almost half a million “donut” visualizations a day, and I think we’ve given out more than a thousand librarian accounts.
What’s still missing is some standardization, though this is developing through initiatives like the project NISO is running. We need more community ownership of ideas and wishlists, too. Funders and publishers can continue to drive the altmetrics agenda, but I’d like to see more individual researchers and librarians becoming involved and pushing for what they think is important. That’s kind of the point, after all: everybody has a different view of impact, so let’s capture as many as possible and see where we can take things!
More Information on Altmetrics
- The National Information Standards Organization (NISO) has released a draft white paper summarizing Phase I of its Alternative Assessment Metrics (Altmetrics) Project. The white paper is open for public comment through July 18, 2014, and is available, with a link to an online commenting form, on the NISO Altmetrics Project webpage, along with the detailed output documents, recordings from each of the meetings, and related information resources.
- altmetrics14: expanding impacts and metrics, a workshop at the ACM Web Science Conference 2014; June 23, 2014, Bloomington, IN. (Euan Adie is one of the keynote speakers.)
- Altmetrics: A Manifesto, by J. Priem, D. Taraborelli, P. Groth, and C. Neylon (v.1.0), 26 October 2010: “altmetrics is the creation and study of new metrics based on the Social Web for analyzing, and informing scholarship.”
- Keeping Up With… Altmetrics, by Robin Chin Roemer and Rachel Borchardt, January 2014, in ACRL’s Keeping Up With… series
- Robin Chin Roemer and Rachel Borchardt. “From Bibliometrics to Altmetrics: Keeping Up with A Changing Scholarly Landscape.” College & Research Libraries News 73, no. 10 (2012): 596-600.
- Altmetrics as new indicators of scientific impact, by Donatella Gentili, Chiara Rebuffi, and Annarita Barbaro. JEAHIL 10, no. 1 (2014): 3-6.
- Going beyond bibliometric and altmetric counts to understand impact, by Kristi Holmes
- Altmetrics: A librarian’s outlook on potential applications, by Jenny Delasalle
- Altmetrics tell a story, but can you read it?, by Mike Taylor, Research Specialist, Elsevier Labs
For an interesting study on data quality and consistency using APIs across three altmetrics providers, see:
- Zahedi, Zohreh; Fenner, Martin; Costas, Rodrigo (2014): How consistent are altmetrics providers? Study of 1000 PLOS ONE publications using the PLOS ALM, Mendeley and Altmetric.com APIs. figshare.
Paula J. Hane is a freelance writer and editor covering the library and information industries. She was formerly Information Today, Inc.’s news bureau chief and editor of NewsBreaks. Her email address is paulajeanhane@gmail.com.