By: Nancy K. Herther, writer, consultant and former librarian with the University of Minnesota Libraries
Altmetrics began as little more than trivia, fodder for conversation at an academic mixer. However, with so much influence now derived from information and its usage, no one is ignoring Wikipedia’s potential as a source of alternative metrics. Today metrics are required in all aspects of evaluation: grant proposals, tenure decisions, hiring, public policy, funding and more. Wikipedia data offers important new insights for understanding the implementation and impact of policies, trending information that increasingly influences all types of policy, planning and rewards.
“Since journal publication has historically been a major means of academic communication, the impact factor or h-index based on bibliometrics has been mainly used as an indicator to measure the impact,” notes a 2017 article in Performance Measurement and Metrics. “However, in recent days, researchers are using social media such as blogs and Twitter for not only discussing and recommending research issues among researchers but also exploring ideas and collecting research information. In addition, researchers are sharing bibliographic information in their fields of interest by using web-based reference management tools. As these activities are done online, many traces from research are left on online platforms.” Now even Wikipedia articles are being combed for altmetric data of potential value.
WIKIPEDIA DATA AND ALTMETRICS
“Wikipedia is a useful benchmark for online accessibility of public scholarship in that it provides immediate, freely available information to today’s diverse global public seeking online answers to questions and relief from problems,” note scholars in a 2010 article, titled “Wikipedia as Public Scholarship: Communicating Our Impact Online,” in the Journal of Applied Communication Research.
“Within those millions of [Wikipedia] articles exist hundreds of thousands of links to academic research,” noted Altmetric’s Cat Williams in a 2015 blog post. “Publishers, institutions and researchers are increasingly moving to leverage the exposure and traffic that a reference on Wikipedia can generate for their content. Although the value and relevance of that traffic is sometimes debated, for their part Wikipedia enforce strict editorial guidelines to try and ensure that quality and standards are consistent across articles, and that undue bias isn’t shown towards over-zealous posters.”
“There’s a long standing connection between Wikipedia and altmetrics,” Williams continued. “Dario Taraborelli at the Wikimedia Foundation is one of the authors of the altmetrics manifesto, and he’s doing a lot of great work with groups like CrossRef on helping to make Wikipedia data available for altmetrics tools and research projects, and to help set standards around its use and presentation.”
“The Internet is one of the most common sources of information. Researchers can make their findings readily available online by posting web-based white papers and by creating hyperlinks to scholarly resources for already developed web pages,” notes Sarah J. Tracy in her 2013 book, Qualitative Research Methods: Collecting Evidence, Crafting Analysis, Communicating Impact. “Anyone is allowed to post and edit on Wikipedia, and this publicly and freely accessible encyclopedia serves to directly assist employees, community members, journalists, students, and scholars. Although many scholars have been ambivalent and even derogatory about Wikipedia as a credible source, phrases like ‘wiki-it’ suggest that open sources of its kind are here to stay” (p. 308).
An interesting 2019 collaboration by researchers from Yahoo and Spotify tested the potential for using “Wikipedia (text in foot-note articles in particular), which provides a small set of labeled examples, enabling unsupervised and semi-supervised methods for sentiment classification.”
IMAN TAHAMTAN ON WIKIPEDIA’S POTENTIAL METRICS ROLE
Iman Tahamtan is a Ph.D. Candidate in the School of Information Sciences, College of Communication and Information (CCI), at the University of Tennessee. He has researched the potential of Wikipedia and other social media resources as altmetrics and shares his perspectives with ATG readers:
NKH: I know that your recent co-authored article with Lutz Bornmann focused on the possibilities and applications of social media metrics (altmetrics) for research evaluation. From this research, and the work that you have published, could you reflect on the needs and potentials that you see for alternative metrics in science? Are you surprised by the global excitement/involvement?
IT: I think the scientific community needs to think of designing field-based metrics rather than universal metrics that are used for all disciplines. For instance, we still do not have reliable metrics for the arts and humanities. Regarding alternative metrics, I would say each discipline (in some cases each sub-discipline) should have its own altmetrics.
For instance, in measuring the societal impact of science, in psychology a ‘change in behavior’ might be considered a societal impact, while in the medical sciences ‘curing a disease’ or ‘developing a vaccine’ could be considered a societal impact. Today, most research groups around the world working on science metrics focus on indicators that can be used for most (if not all) disciplines, which in my view really does not work. Disciplines are different and thus require different metrics for their evaluation.
So, yes, there is a need to design new metrics in science. The global involvement in designing alternative metrics is really great, as many of the scholars working in this area do not have a library and information science background. Each day I see articles published in this area by those with backgrounds in chemistry, economics, environmental science, etc. This shows the extent of the whole scientific community’s involvement in finding more reliable and less biased metrics.
NKH: Social media have exploded in the past ten years. Today we have multimillionaire teen ‘influencers’ as well as bots and efforts to twist truth, impact elections and opinions, create fear, and sometimes provoke hate and crimes. Facebook is Facebook, Instagram is Instagram. These mega-systems are private companies that attract everyone and cover everything, and there is no effective oversight or quality control of any real value today, despite efforts by governments to force some types of responsibility on social media companies. Can social media ever be ‘tamed,’ restrained or managed in a way that makes scholars more comfortable exploring topics for research purposes?
IT: This is a great question. I am currently dealing with removing bots in my research using some already existing algorithms. I believe social media sites can be managed or restricted by designing algorithms that are capable of identifying low-quality, dangerous (hate or crime posts) and false information, and filtering them out or flagging them (as Twitter does for tweets) in some way. This would also help people have more control over what they (want to) see on social media.
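The flag-rather-than-remove approach Tahamtan describes can be sketched in a few lines. The function below is purely illustrative, not any specific algorithm from his research; the heuristics and thresholds (link counts, posting rate, text repetitiveness) are invented for the example, and production systems rely on trained classifiers far beyond rules like these.

```python
# Illustrative sketch of flagging (not silently removing) posts that look
# bot-like or low-quality. All heuristics and thresholds here are invented
# for demonstration; real platforms use trained classifiers.

def flag_post(text: str, links: int, posts_last_hour: int) -> list:
    """Return a list of reasons the post was flagged (empty = not flagged)."""
    reasons = []
    if links > 3:                 # link spam is a common bot signal
        reasons.append("excessive links")
    if posts_last_hour > 30:      # superhuman posting rate
        reasons.append("bot-like posting rate")
    words = text.lower().split()
    # Very low vocabulary diversity suggests repetitive, spam-like content.
    if words and len(set(words)) / len(words) < 0.3:
        reasons.append("repetitive content")
    return reasons

# An ordinary post passes; a spammy one collects several flags.
print(flag_post("An interesting new study on altmetrics", 1, 2))   # []
print(flag_post("buy now buy now buy now buy now", 5, 50))
```

Returning reasons instead of a boolean is what makes flagging workable in practice: the platform (or the reader) can decide what to do with each flag, which matches the idea of giving people more control over what they see.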
NKH: Clearly, social media is a force, a communications medium that needs to be taken seriously and has value for researchers and their institutions. At the same time, research evaluation is a serious field of study and as scientists, who wants to create bad science, promote bad data or questionable results? And we all know that science doesn’t always follow a straight line or bell curve, but often has ‘fits and starts,’ dead-ends and other results. Can valid scientific evaluation really exist in this environment?
IT: We found in our research (that you pointed to in your email) that scientific evaluation using social media data cannot really be valid or useful, because social media data can only show the popularity of research, or how interested people are in its topic. A reason why it cannot really be valid: as you know, alternative metrics (e.g., social media mentions) measure the impact of scientific knowledge on society. The assumption here is that research with more social media mentions has more societal impact. The issue is that we cannot really say whether the research has been shared on social media by the public or by the scientific community. If a research paper is shared by scholars, can we still say it has had an impact on society? What kind of impact? We cannot really say whether a research paper being shared on social media is an indication of impact. As I said, it can only show the popularity of the research. I don’t think valid scientific evaluation can exist in this environment, but other people may have a different perspective.
NKH: To me, we seem to be entering a new phase, what some are calling Altmetrics 2.0. Given the research and publication that you’ve done, how do you see the acceptance of these new models of valuation?
IT: I have not heard about Altmetrics 2.0, but if it is going to be based on social media mentions, we would face the same issues and challenges. Overall, I don’t believe in using social media to measure the impact of science, whatever name we give it. I believe each discipline or scientific field should have its own framework for measuring the societal impact of science: e.g., librarians should come up with a framework for measuring the impact of their research, public health experts should have their own methods for measuring impact, etc.
NKH: In one recent research article, the authors admitted that “the very emergence of social media, for example, has heralded a new age for the public dissemination of scientific knowledge. It therefore comes as no surprise that ‘altmetrics,’ an endeavor to quantitatively represent mentions and interactions on social media platforms such as Twitter or Facebook, have been proposed as a means to evaluate the societal impact of research ex post. Yet despite the consensus over their potential for impact assessment, the jury is still out as to what kind of impact altmetrics scores actually reflect.” How do you think this might change? How strong is the resistance by academics as well as by administrators/institutions?
IT: I do agree with this statement by the authors of this research. Lutz Bornmann, the second author, and Robin are my co-authors on some papers. Lutz is a frontrunner in this area. I think that, after hundreds of studies in this area, the scientific community has still not been able to reach a consensus on whether social media mentions should be used in measuring impact, and this is likely to remain the case in the future. We are all aware of the limitations of social media mentions.
I am not sure about the resistance, but as far as I am aware, even most of the scholars who have proposed such alternative metrics believe they are not good measures for evaluating science. Also, such metrics (based on social media data) are not used in faculty promotion at most institutions. They are still using citation counts and the h-index as the two criteria for evaluating faculty.
NKH: One exciting aspect of all this new research, for me, is the openness of the inquiries and open questioning of sometimes the very assumptions involved in the presented research. For example, in one article, coauthors note that “however, there are also other empirical findings suggesting a contrary conclusion. Wooldridge and King (2019), for example, used the same data set as Bornmann et al. (2019) but other methods, and concluded that “the work presented in this study provides direct evidence, for the first time, of a correlation between expert peer review of the societal impact of research and altmetric data from the publications defining the underpinning research” (p. 281). Against the backdrop of these contradicting results, it is necessary to advance further empirical investigations about the correlation between assessments of the societal impact of research and altmetrics scores.” Would you agree?
IT: Measuring societal impact using social media data and calculating its correlation with altmetrics scores has been done many times in previous research. If (in my opinion) using social media data to measure societal impact is invalid, then further investigations of the correlation between assessments of the societal impact of research and altmetrics scores are also invalid.
NKH: This is incredibly exhilarating. It shows a strong international effort to create a ‘new science’ that might not only open new doors to supporting research, but also develop new methodologies and collaborative endeavours and improve the support, understanding and progress of science across the globe for the future. You’re invested in this area; how do you see the future potential of altmetrics? What new areas or new questions are you hoping will be opened up in this effort?
IT: As I noted, I believe altmetrics should shift its focus from social media mentions to methodologies that are discipline-based. Does this make measuring science from a global perspective difficult? Yes, but it would produce reliable metrics in each field. Social media mentions have largely failed to measure societal impact, and I have no doubt scholars are trying to find other methods, but I believe those would also fail if we try to find metrics that work for all scientific fields. Why? Because LIS is different from public health, or economics, etc. LIS needs its own metrics, as other disciplines do. Or we could think of designing metrics that combine altmetrics, citation counts and their variants, and other metrics that might be designed in the future.
Overall, I would say metrics researchers should collaborate with scholars from each scientific community to design discipline-based metrics. Researchers in Library Science, e.g., can design a framework for measuring the impact of science in Library Science.
EXCITING NEW AREAS OF RESEARCH ARE OPENING
One exciting aspect of all this new research is the openness of the inquiries and the questioning of even the very assumptions behind the presented research. We are seeing a strong international effort to create a ‘new science’ and new metrics that might open new doors to supporting research, developing new methodologies, establishing new types of collaborative endeavours, and improving the support, understanding, and progress of science across the globe. With Wikipedia taking on an increasing role in information dissemination, the options for using Wiki data for analysis and assessment seem more and more reasonable.
Nancy K. Herther is a research consultant and writer who recently retired from a 30-year career in academic libraries.