Don’s Conference Notes: Academic Publishing in Europe (APE) 2021

Feb 22, 2021


By Donald T. Hawkins (Freelance Editor and Conference Blogger)

The APE conference was started in 2005 by Arnoud de Kemp, formerly a top manager at Springer-Verlag. It has been held in mid-January in Berlin, Germany since its founding and is one of the first conferences held each year. APE 2021, the 16th conference in the series, was held virtually on January 12-13 due to the COVID-19 pandemic and attracted about 700 attendees. This year and in the future, the APE conferences will be organized by the Berlin Institute for Scholarly Publishing and livestreamed and hosted by Morressier.

Opening Keynote

Prof. Dr. Dorothea Wagner

The theme of APE 2021 was “The New Face of Trust”, so it is appropriate that Prof. Dr. Dorothea Wagner, Chair of the German Council of Science and Humanities, entitled her keynote presentation “Open and Autonomous: The Basis for Trust in Science”. 

She began by noting that COVID has had a historic effect on science and has been a game changer; never before has the reaction to a devastating catastrophe been so dependent on scientific research. Science-related news has been much in the headlines during 2020. What does this new level of attention mean for science as a system?  2020 laid bare the strengths and weaknesses of our societies and our system of science, and we should not miss the resulting opportunity to learn.  

One of the lessons we have already learned is that complex and globally connected societies are vulnerable to new kinds of crises. Not all kinds of crises can be anticipated in detail, so resilience—the ability to react to unanticipated challenges—is critical. Only a resilient system can help society. Wagner likened trust to a “transmission belt” connecting science and society.


Science is a social subsystem that makes claims, puts them to the test, and infers conclusions from them. It cannot play its role in society if it is not trusted. In a society without trust, scientific endeavor is little more than wheel spinning. Fortunately, the experiences of 2020 showed that trust in science is still widespread. 

What does trust have to do with scientific publications? Science is a system of collective knowledge production which can only exist through collaborations, so the system of publication is a critical infrastructure for science. Publications are the primary medium through which science has an impact on society; therefore, quality assurance remains central in academic publishing. Reliability requires provisions for long-term archiving, versioning, and referencing. Methodical skepticism is one of the prime virtues of a scientist. Transparency of the process leading to publications is how we can have trust in science. A public discourse on the purpose of academic publishing is needed. Publication, criticism, and revision lead to continuous improvement of the knowledge base.

Science education is an indispensable basis for trust. Scientists and publishers are responsible for working together with communicators and journalists. If politicians and the public have a good understanding of how science works, they will feel more comfortable with controversies in science and will not fall prey to exaggerated claims.

The question of trust casts new light on the autonomy of science; a major reason for mistrust in science is that scientists may be too dependent on funders for their research. The real danger of predatory publishing is not that it might spread low-quality work, but that publishing becomes an end in itself, so mere quantities of publications become important in promotion decisions, etc. Some profit-minded publishers may take advantage of such indicators and believe that science is ruled by the “publish or perish” imperative, which casts doubt on the whole system.

Another issue of trust is plagiarism, which is among the most prominent scandals in scientific publication. Plagiarism violates the standards of good scientific practice because it undermines the system of rewards on which reputations are built. It also reveals the motives of people in the system, which may be stronger than those enforcing good scientific practice. It is therefore important to take measures to prevent plagiarism and sanction those involved.

In 2020 we have seen not only increasing trust in science but also less mistrust of it, because the pandemic has demonstrated how strongly devoted most scientists are to improving the public good. It is therefore important to think not only about how we can make publishing better, but also about what the mechanisms of publishing reveal about ourselves, so that we don’t spoil the image that science has gained during the pandemic.

The transition to open access is an opportunity to create a system which maximizes the quality of scientific publications as well as a public discourse on publishing and science as an opportunity to increase knowledge for the public good. 

Keynote 2: Reinvention or Return to ‘Normal’? Scholarly Communications at a Crossroads

Lauren Kane, Chief Strategy Officer, Morressier and President, Society for Scholarly Publishing (SSP), said that there is no mistaking 2020 for just another year. It was a watershed moment for scholarly communications, and never have they been more relevant or important.

The past is prologue, and our community has no shortage of ideas, business models, and ways to share knowledge. The question now becomes, “Where do we go from here?” Do we return to existing structures and norms, or do we now have an opportunity for rethinking and reinvention? 

Three major trends are shaping our industry:

  1. The Road to OA
    We are morally aligned with OA, but discussions continue on how it can be practically achieved. We understand it is not free, but what are the limitations of the article processing charge (APC)? Should OA be mandated, or should it remain voluntary? Some societies worry that trends toward mandates could harm their sustainability and impede the access they wish to extend. Another concern is that OA is not equitable access: open is not really open unless it is globally inclusive. The most notable thing to come out of these discussions was collaboration and coalition building on all sides of the OA issue. Most organizations have realized that they cannot deal with these issues in isolation.
  2. The Expanding Research Lifecycle
    We have gained an appreciation for access beyond the published article: preprints, conference presentations, poster sessions, and data—i.e. all the literature that was once lumped together as “grey literature”. Most societies had ample room to grow when this content was available in print, but there was also recognition of the need for more transparency in the research process and the need to rethink peer review to eliminate gender and racial biases.
  3. Societies Examine Their Purpose
    Should societies’ journals have mass appeal or should they be aimed at a specific technical community? Should they be open or only accessible by subscription? Should they be available to a broad audience or a specific narrow one? How should meetings be convened in a digital age and be made more inclusive and accessible? Being a nonprofit and achieving sustainability are very thorny issues. Should societies be involved in politics or policies? Should societies take a position with their government or administration, or should they remain neutral? Could this jeopardize their nonprofit status?

Enter 2020

Things that seemed so daunting before 2020 are now seen more in perspective when compared with the monumental challenges posed by public health and financial issues. These are some of the effects on publishing:

  • Financial and budgetary pressures exist throughout the industry,
  • Personal and professional considerations may collide,
  • In-person meetings, which are important for scholarly exchange, have been cancelled,
  • Concern about loss of advancement and the ability to communicate effectively has increased, and
  • Progress in innovation (such as OA) has been stalled.

2020 has caused significant health and emotional strains, with home schooling, home caregiving, and working from home. Everyone has been living very different lives than they did the year before; however, it is encouraging that many societies have refused to let 2020 be a lost year and have used it as a moment in time to go forward. This industry has a history of risk aversion and slow change, but there are positive lessons to be learned in this major shift. In response to a common adversary and despite geographical barriers, we have seen unanimity, collaboration, and regard for one another, and have been acting like a close-knit community. Virtual meetings have become accessible to people who could never travel to events. Researchers shut out of their laboratories have worked on dormant papers.

It is now a time to experiment and explore new directions. For example, profits of many societies from virtual meetings have increased because they did not have to pay the costs of venues. Double-blind peer review is making the review process more open and transparent. This year could be either an outlier or a true inflection point and an opportunity for reinvention. Losses will be acutely felt throughout the industry, and some organizations will be unable to recover. There is legitimate concern for the arts and humanities. We could see a return to silos as groups try to protect what is theirs.

So where will we go? Here are some possibilities:

  • A reversion probably will not happen because challenges and opportunities are irrevocably linked.  
  • Scholarly communication is evolving because it must.  
  • New revenue models and increased dissemination to existing and new audiences will occur. 
  • We will benefit from the experimentation now taking place with OA. 
  • Those that succeed in the future will do so in partnership with others. 

Keynote 3: Opening Doors to Discovery: How Partnerships are Key to Advancing Open Science

Frank Vrancken Peeters, CEO, Springer Nature, said that open science matters because it facilitates global collaboration and is the most efficient way to disseminate information to the global public. The road to open science has been long.


Although open science gives more visibility to authors, its implementation has taken several decades. We need to speed up the process because the number of new publications is growing by about 8% per year. Many preprints of articles on COVID have been made freely available, and COVID has become one of the most sought-after topics of all time.

We must embrace openness as a key method and work together as partners because OA is a major route to open science. Gold OA offers the simplest, most open, and most sustainable route to OA and open science. Publishers have a duty to care for the scientific record and must work together with researchers. Researchers must continue to publish in journals that they know and trust and that support OA, because OA is changing the way that they collaborate.

Transformative agreements are very successful because researchers do not have to find funding. They enable all researchers to publish OA regardless of their field. Under these agreements, both libraries and funders contribute to APCs. Springer Nature has formed a pilot partnership with ResearchGate, which is extremely popular with researchers (it has a registered user base of about 17 million people), to make 41 of its journals available with 3 years of data. The partnership has created a seamless experience for users, who have been very positive, so the pilot will be expanded to 3,000 journals in 2021. COVID has closed gaps between lay people and scientists, so sharing data is important. Openness is a key tool in the research process and, along with transparency, is critical in gaining trust.

Panel Discussions

Restoring Trust in Published Research

  • You can usually find what you are looking for, but putting a challenge in the way of an author does not help their performance.  We need to improve research, recognize that everyone is fallible, value teamwork, and conduct local peer review.
  • Peer review does not live up to a promise of quality assurance. We may need to compare the quality of papers in peer reviewed journals with those in journals that are not peer reviewed.  There may not be much difference. Another panelist took the opposite view and said that peer review does live up to its promise because it helps in trusting the information published and helps authors improve their articles. 
  • Post-publication peer review where anybody can comment on an article is a good process. Merely having something published does not mean that it is of high quality. If a paper is rejected and re-submitted to another journal, most authors do not make many of the changes suggested in previous reviews because they do not have a way to see them. 

Creating a Level Playing Field for the Global South

  • Funding issues for access to OA journals are a problem. It is important to recognize that researchers work not only for their personal interests but also because it is their job. Policymakers in some countries do not understand the difference.
  • Research in the Global South is growing, and researchers want to have their work noticed by their peers. Many in the Global South prefer to publish in international journals, but there is also a role for local journals. Some universities require researchers to publish in local journals. Researchers from the Global South are rarely on editorial boards because publishing in local journals does not give them recognition. We need to work together to make all journals prestigious. It is also important to make sure to publish in widely understood languages.
  • Some journals have APCs equivalent to several months of a researcher’s salary so many of them cannot decide where to publish.

OA and the Value of Selectivity

  • Many researchers on social media have outspoken views about OA, open science, and preprints, but many that are not on social media have only a limited knowledge about OA. A Taylor & Francis survey conducted in 2019 found that use of Gold OA is growing; self-archiving occurrence is low; and 40% of researchers would be unwilling to submit their research to a Gold OA journal that charges for publication.
  • Currently, the average APC of a society journal is $5,000, which is not sustainable. Cost should not be a barrier to dissemination.
  • We need to embrace all publication routes and have a diverse publication landscape and sustainable funding models that meet the needs of both researchers and publishers.
  • Preprints are good for researchers. Many are not updated until the Version of Record is issued. 
  • OA mandates are here to stay and are expanding, but researcher awareness of their implications is low. 

Day 1 Closing Plenaries

Financial Transparency and the Cost of Quality

Alison Mudditt, CEO of the Public Library of Science (PLoS), discussed selectivity at PLoS and noted that 15 years ago, OA was a publishing model that combined access with research. Publishing in highly selective journals remains very important. 

Users want transparency about prices and value, but publishers must understand how stakeholders determine value, so PLoS has launched a community action program. Its actual costs of publication are about 50% higher than the APCs charged; they include the broad services provided—marketing, ethical issues, policies, new open science practices, preprints, etc. Alternative models are being developed to move away from the consequences of the APC; we need to ensure that costs are fairly and appropriately balanced, and we must be transparent about prices. 

In the future, publication costs will shift from APCs to modified flat fees based on the publication history of the institution. The flat fees will be capped at the amount of revenue that PLoS takes plus a 10% margin to cover costs of running the journals.  PLoS will need to prove that it can generate enough revenue via a partnership model that will eliminate APCs for members. The focus will be on how to make selectivity valuable to researchers. Transparency has to be a cornerstone for building trust. These are ethical questions as much as business ones.

Crisis in Communication: The Functions and Future of Selective Journals

James Butcher, VP of Journals, Nature and BMC Portfolios, Springer Nature, said that there are 2 types of journals:

  1. Journals of record, in which authors describe the techniques and processes used in their research, and 
  2. Newspaper journals edited primarily for readers.

The functions of the Nature journals are to filter, enhance, and amplify. About half of submitted peer-reviewed articles are accepted. Enhancing means making the paper better through peer review and what the editors add. (Nature has 200 editors with Ph.D.s and 300 people working on support.) The ability to amplify an author’s manuscript can have a huge impact; one article was accessed 60,000 times in its first month! Authors have the choice of publishing OA with an APC of €9,000 (about US$10,900).

Beyond the Paper, the Data, and Then a Bit Further: Capturing More of the Research Workflow

This session was introduced by David Crotty, Editorial Director, Journals Policy, Oxford University Press and Editor-in-Chief of SSP’s Scholarly Kitchen. He said that as we abandon the constraints of the print era, opportunities arise to add to the depth of knowledge presented. The open data movement is the first and most obvious enhancement to the scholarly record, providing valuable resources both for reuse and reproducibility purposes.

Registered Reports

Dr. David Mellor, Director of Policy Initiatives, Center for Open Science (COS), Charlottesville, VA, said that registered reports are published regardless of the outcome of the research. They undergo two stages of peer review. Journals that adopt registered reports accept articles about the methods used, and editors decide whether they are credible. Reviewers ask how the authors will demonstrate that the work will be done competently and completely. If they are satisfied with the authors’ reply, the article is conditionally accepted. When the work is completed and the final results are submitted, the reviewers judge whether the conclusions are supported by the data. The advantage of this approach is that the results are more reproducible and credible, and null results can be published. The first review allows improvements and adjustments to be incorporated into the article. There is no evidence that registered reports are cited less frequently than conventional articles.

Reporting Research Methodologies and Reagents

Maryann Martone, Professor Emerita of Neuroscience, University of California, San Diego, said that science has a foundation of reproducibility, although some disciplines, such as biology, have variable systems and are hard to reproduce. Some systems are affected by the tools used, or even the operating system of the researcher’s computer. Many details of the research are in repositories and do not appear in the published article; as a result, there is rarely enough detail to reproduce the experiment exactly. Method sections of many papers are terse, and some researchers use protocols from others who have used protocols from still others, etc.

Martone suggested that one solution to these problems is to put the details in a Supplemental Data section of the article and then link data about materials used via a “research identifier” in the article. Those identifiers can then be put into a “research identification network”. Materials and Methods sections could be described not only as text but also as recipes (“These are the steps that I used”) without a lot of unnecessary details. Protocols.io is a platform that manages these like any other research object.

The better we are at specifying things, the better our articles will be. We must prepare to share and share properly, and pay attention to where the data will go, especially when someone on the project leaves. As a result, negative results will start to rise in value. 

Publishing a Complete Record of a Research Project

Scott Fraser, Provost Professor, Director of Science Initiatives, University of Southern California, noted that data management is integral for transparent and reproducible research, and metadata is collected automatically as the experiment is proceeding. Scientific data management practices can capture imaging data by value preservation, validation, and by increasing the value of previously curated data. Data should be FAIR (Findable, Accessible, Interoperable, and Reusable). The conclusions of this study are:

  • The field is ready for imaging informatics.
  • Robust tools are available to translate formats and can be used for scientific data management, which should be used throughout the collection, analysis, and archiving of data.
  • DOIs must be in all figure legends. 

Day 2 Keynote

Yvonne Campfens

In her keynote address, “From Complexity to Transparency: How the OA Switchboard is building a cost-effective collaborative Infrastructure Solution for an OA-driven scholarly Communications Landscape”, Yvonne Campfens, Executive Director, OA Switchboard, described the current OA landscape and how the OA Switchboard builds trust by addressing challenging topics in the transition to OA. There is a widespread belief that research results will be better if they are made openly available to the community. Open is better for science and brings advantages to the world. OA business models are becoming more diverse. Funders and research institutions are paying for OA centrally and are expanding their requirements about how research results should be published.

As background, she noted that there are 2 types of trust: practical, which means delivering what you promise—dependability, reliability, and competence; and emotional, which means trusting that people are supporting you even if there are no official rules. 

The following challenges are standing in the way of a faster transition to OA:

  • Redistribution of money in the system: a shift from subscription payments to OA service payments, from the acquisitions budget to the publishing services budget, and from reading institutions to producing institutions. Smaller publishers may not be included when deals are being made. Libraries may direct authors to publish in certain journals.
  • Transparency is good for research, but not all research is OA. Transparency is needed for trust in commercial publishers and for the requirements that researchers and grantees be compliant with mandates from their funders and employers. It is needed for accountability to those who sign contracts (mostly publishers and institutions), and for comparing values, and it is needed to prevent money from being wasted in the system. 
  • Prohibitive costs: Many people say that it is very expensive to make the transition to OA. Libraries formerly just bought subscriptions, but OA involves many transaction processes that must be handled. The web of buyers and sellers is very complex, and many lawyers and senior managers must get involved. But there have been successes; for example, banks collaborated to develop the SWIFT messaging system, which manages costs, and the whole industry has benefitted. 
  • Heated topics: Recognizing that these challenges exist is a challenge in its own right, and it stands in the way of the transition to OA.

The OA Switchboard is a central information exchange hub that is based on collaboration. A neutral nonprofit intermediary is indispensable to ensure authoritative data, thus freeing authors and researchers from administrative issues, financial negotiations, and settlements. Ideally, users do not see anything of the Switchboard because it works behind the scenes as an independent intermediary, ensuring the exchange of OA information so that financial settlements can be done. It does not get involved in billing, invoicing, collecting funds, etc. and is for funders, research institutions, and academic publishers. Costs of the system will be covered by all 3 entities equally. 

The software is very simple and was developed in only 13 weeks. It is a communication hub and does not build a database but simply validates the messages and routes them to the right recipients. There is no opportunity to take the data and use it for other purposes. Only 2 types of messages are sent to the funder institution: “Does this publication meet your requirements?” and “Do you have central funding available to cover the publication charges?”
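The validate-and-route design described above can be illustrated with a minimal sketch. This is purely hypothetical: the message types, field names, and callback interface here are invented for illustration and are not the actual OA Switchboard schema or API.

```python
# Hypothetical sketch of a validate-and-route message hub, loosely modeled on
# the description of the OA Switchboard above. All names are illustrative.

VALID_TYPES = {"eligibility-enquiry", "funding-enquiry"}

def route_message(message, recipients):
    """Validate a message envelope and forward the body to its addressee.

    The hub keeps no copy of the message: it only checks the envelope and
    forwards the body, mirroring the "no database" design described above.
    """
    if message.get("type") not in VALID_TYPES:
        raise ValueError(f"unknown message type: {message.get('type')}")
    addressee = message.get("to")
    if addressee not in recipients:
        raise ValueError(f"unknown recipient: {addressee}")
    # Forward the body and return the recipient's reply; nothing is stored.
    return recipients[addressee](message["body"])

# Usage: a funder endpoint that answers an eligibility enquiry.
recipients = {
    "funder-x": lambda body: {"meets_requirements": body["journal_is_oa"]},
}
reply = route_message(
    {"type": "eligibility-enquiry", "to": "funder-x",
     "body": {"journal_is_oa": True}},
    recipients,
)
```

The key design point mirrored here is that the hub validates and routes but never persists, which is what makes the "no opportunity to take the data and use it for other purposes" claim credible.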

The Switchboard builds trust by providing transparency, accountability, compliance, efficiency, cost efficiency, and simplicity. It is expected to launch in 2021, and the first priority will be to work with the launch customers on their priorities and on integration with partner systems.

New Dotcoms To Watch

This traditionally popular APE session consisted of presentations by representatives from 6 startups in scholarly communication.

  • Sciscore.com: A reproducibility crisis in science is responsible for a considerable amount of money being wasted because of delays and repetition in the research process. Sciscore can help by evaluating whether authors have appropriately addressed key subjects in their research and then making a “rigor adherence table”. It uses 55 algorithms to analyze the data and makes 2 tables to identify rigor and resources. Using the tables, institutions can make decisions to increase reproducibility of results published by their employees, and funders can assess whether reproducibility criteria have been met.
  • Labforward.io: It has become more difficult to make discoveries because productivity has declined over the last decade by a factor of 5.  Scientists in the U.S. spend $28 billion annually on basic biomedical research that cannot be successfully repeated. More than half of their published research is not reproducible (i.e. wasted). Making discoveries is a tedious and complex process requiring planning, preparation, execution, documentation, and optimization, and much of it is happening in silos. Labforward has developed an electronic lab notebook that helps scientists manage their data across the production life cycle.
  • Iris.ai: We are living in a world where knowledge is growing exponentially, which results in a similar growth of scientific articles. How do we find the information that we need? Can we trust citations as one of the ways to navigate all this knowledge? Citation bias is well known. Keyword query systems work well for domain experts who know exactly what they are looking for, but they do not work with unknown unknowns or interdisciplinary searches. The AI revolution has just begun; can we trust a machine to help us navigate the world’s research? Here are key points to answer these questions: (1) a machine cannot be based on keyword searches; (2) it cannot be built on the citation system; (3) we must be able to prove that a machine would perform better than a human; and (4) there must be human-machine collaboration. Iris.ai conducts a semi-automatic literature review using AI technology to explore, and then focus and narrow the results to those articles of interest. It can deal with a variety of different types of content. The results are proven and can be trusted.
  • MagmaLearning.com: We tend to forget things very quickly (the last book or article read, etc.). Magma’s ARI9000 app is powered by machine learning that learns from user input and improves with experience. Learning is a highly personal process, varying from one person to another, so it is necessary to get the right content at the right moment. Feedback is important to keep learners motivated and let content creators gain insight into readers’ experiences. Personalized learning adapts to diversity and promotes inclusion.
  • Scholarcy.com: It is widely recognized that there are too many articles published and not enough time to read them all. Over 3 million papers were published in 2008, and preprint servers have grown by over 300% since 2015. Researchers typically spend 45 minutes reading an article and read 250 articles a year. Reading and understanding the information after it has been discovered is what matters; a major risk and universal challenge for researchers is that important findings are being missed. We tend to rely on skim reading to keep up, but the danger with that is that it is easy to miss important information. Scholarcy reads and distills information into “summary flashcards” and does the skim reading for the user so that it is easy to determine how useful an article is for research. The time to read a paper is reduced from 45 minutes to 15. Libraries are also using Scholarcy to generate plain language summaries of articles.
  • Researcher-app is addressing the same problem as Scholarcy. It helps researchers stay up to date by creating a personalized feed of articles taken from 15,000 journals. Users select the journals they want to follow and receive articles on their chosen device. Output can be synchronized with reference managers like Mendeley and Zotero. So far, it is being used by 14 million scientists and researchers. 

Collaborations Built on Trust

This session focused on the humanities and social sciences (HSS), which are rapidly becoming digital by using OA and other publishing formats. 

Prof. Dr. Andreas Fickers, Director, Luxembourg Centre for Contemporary and Digital History, University of Luxembourg, described some of the issues he has faced in developing the Journal of Digital History, such as working with authors and editorial board members. Periodic workshops were held, especially with authors planning to write articles for the initial issues. The “Jupyter Notebooks” open source system for creating and sharing documents containing code, equations, visualizations, and text has proven useful in this project. It is important to think about how to visualize the content; the future is in scalable reading and the ability to interact with the data. 

Ros Pyne, Director, Open Access Books, Springer Nature, discussed the future of the monograph, which she called “long-form scholarship in the digital age”. In HSS, monographs do more than just report the results of research; they are part of the research and provoke debate, shift paradigms, and provide a focal point for research.

OA is a critical feature of scholarly work, and mid-length academic monographs can include multimedia. AI may have a role in producing books. Although print will be with us for a significant time, machine-generated books are starting to appear. E-books are increasingly important, but more can be done to improve the e-book reading experience. One study found that OA books are downloaded 10 times more than non-OA books, and OA seems to be revealing a latent readership, particularly in low-income countries.

The main challenges to publishing an OA book are the inability to find funds, low awareness of OA, perceptions of lower quality, and lack of a willingness to pay. The OAPEN OA books toolkit, a freely available resource, will help academic book authors better understand OA book publishing and promote and increase trust in OA books. It has been well received and has enabled authors to publish their works and make them available to the greatest possible readership.

‘Subscribe to Open’ as an Alternative Path to OA Transition

Dr. Kamran Naim, Head of Open Science, CERN, asked why alternative approaches to OA need to be considered. He noted that although there is an increased momentum toward OA, APC-supported models have the potential of being exclusionary. Most APCs are high (some can be as much as $6,000) and are a major impediment to researchers in developing countries. Some disciplines have no funding for those purposes, and waivers are not sustainable. Some approaches to OA not funded by APCs have been tried; the Subscribe to Open (S2O) model begun by Annual Reviews and now in its 7th year of operation shows promise.  It provides an adequate incentive for institutions to participate, ensures that institutions will continue to do so, and preserves the vendor/customer relationship.

Typically, an S2O offer gives participants a 5% discount on the subscription price of a journal. Selecting S2O is therefore attractive for institutions, and the offer targets current subscribers. It is a subscription and is guaranteed to continue only if all participants continue it. If an institution declines to participate, it is charged the full price of the subscription, with no other discounts available. The S2O offer is not a donation (which many institutions are forbidden to make) and avoids the need for collective action. It does not use revenue targets, which might undermine the model, and it uses existing procurement processes. The model is stable, recurs annually, and remains a demand-based procurement model.
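The offer logic described above can be captured in a toy sketch; the function names, the 5% default, and the all-or-nothing participation rule are illustrative simplifications of the model as presented, not code from any real S2O implementation:

```python
def s2o_price(list_price: float, participates: bool, discount: float = 0.05) -> float:
    """Toy model of a Subscribe to Open offer: a participating subscriber
    receives a discount on the subscription price; an institution that
    declines pays the full list price, with no other discounts available."""
    price = list_price * (1 - discount) if participates else list_price
    return round(price, 2)

def journal_is_open(subscribers: list[bool]) -> bool:
    """The journal is published open access for the year only if all
    current subscribers continue to participate."""
    return all(subscribers)

# A participating institution pays the discounted price...
assert s2o_price(1000.0, participates=True) == 950.0
# ...a declining one pays full price, and any defection ends the OA offer.
assert s2o_price(1000.0, participates=False) == 1000.0
assert journal_is_open([True, True, False]) is False
```

The design point the sketch makes concrete is that S2O stays a demand-based purchase: each institution's price depends only on its own choice, while the open-access outcome depends on everyone's.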

Collaborative Open Access: The National Contact Point Open Access Germany (OA2020-DE)

Dirk Pieper, Director, University Library, Bielefeld, Germany, said that the Contact Point was founded by the Alliance of Science Organizations in Germany in 2017 and will conclude in July 2021. It has issued reports on publications by academic institutions, APC charges, and the cost of OA transition; created business models for OA transformation in HSS and for books; and conducted classes for libraries, publishers, and intermediaries. A major lesson learned from this work is that librarians need to trust the stakeholders, financial mechanisms, workflows, and outcomes of OA transformation. Trust is also needed in large-scale OA and transformative agreements.

Climate Action. Influencing Policy and Tackling Real-World Challenges – How Can Scholarly Collaboration Support Rapid Action?

Climate affects everyone, and the earth’s climate systems are undergoing unprecedented changes. This session discussed an integrated approach among all information stakeholders, including researchers, publishers, policymakers, and society.

Climate Action – What the Data Tells Us

Dr. Lewis Collins, Editor-in-Chief, One Earth (published by Cell Press), said that climate is a difficult problem: a complex, tangled mess of issues. Science is good at solving well-defined challenges, but climate change is a complex array of them. It is happening, and we need urgent interdisciplinary action to avert a crisis. The target is to limit global warming to under 2 degrees Celsius.

Journals and publishers have a duty to advance our understanding of climate change, to make the science heard and not just questioned, and then to encourage collaboration. Every article published should aspire to inform climate policy globally. In response, Cell Press has launched One Earth, a journal focused on today’s challenges. Elsevier published a report on research trends around the Sustainable Development Goals (SDGs), which showed that climate research is growing but still needs a more local and social-science focus. The response to the COVID pandemic has demonstrated our united power to address climate change.

Combining Policy, Science and Publishing

Dr. Joanna Depledge, Former Editor, Climate Policy, and Research Fellow, Centre for Environment, Energy and Natural Resource Governance (CEENRG), University of Cambridge, noted that Climate Policy, founded in 2000 and published by Taylor & Francis, has become the foremost international peer-reviewed journal on all aspects of responding to climate change, and it has experienced dramatic growth.

Its authors are mission-oriented and motivated to reach policymakers, practitioners, and negotiators. The subject is inherently interdisciplinary; the journal’s core areas are proposing new policy approaches, analyzing existing proposals and claims, and improving understanding of the drivers shaping climate policies. The rapid growth in articles has led to information overload, so free access to articles is provided for a limited period, and the journal is promoted on social media. The geographic distribution of submissions is dominated by western and northern countries and has remained virtually constant over the past five years, yet effectively addressing climate change will require action in the Global South.

What Does the Research Community Need from Stakeholders?

According to Dr. Andrew Kelly, Portfolio Manager at Taylor & Francis (T&F) Group, there is a desire for research on the “grand challenges of our time”. T&F conducted a survey to find out what makes authors choose their areas of research and the journals in which they publish. 90% of the respondents indicated that their research currently contributes to solving real-world problems. Younger researchers may be more likely to enter these areas of research, and responses from China-based researchers differed from those from the rest of the world. We can support researchers by ensuring that there are channels to reach policymakers, and then reward them for investing time and resources in these activities. In this year of the pandemic, the main concern is working in silos; we need to facilitate collaboration, and publishers can foster interaction with the general public.

The SDG Publishers Compact

Sherri Aldis, Chief, UN Publishing, United Nations (UN), and Dr. Michiel Kolman, Chair, Inclusive Publishing and Literacy Committee, International Publishers Association (IPA), said that the UN has developed 17 SDGs for everyone everywhere to change the world.

COVID is more than a health crisis; for example, 17 million people have been pushed into poverty, and at one point 90% of all students were out of school.

Publishers can be agents of change through their publications; the SDG Publishers Compact, developed by the UN and IPA, aims to reach the SDGs during 2020-2030, which has been designated as the “Decade of Action”. Publishers are encouraged to sign the Compact; a list of the actions to which signatories commit is on its website.

Panel Discussion: Balancing the Need for Rapid Sharing With the Need for Rigorous Evaluation – the Role of Preprints and Peer Review

Moderated by Magdalena Skipper, Editor-in-Chief, Nature

Preprints have existed for a long time (at least since the 1960s), and they are here to stay. Because of the need for rapid information sharing during the COVID pandemic, the rate of preprint adoption has dramatically increased. It is especially crucial within the health and clinical sciences that scientific findings be appropriately scrutinized before being made public; peer review therefore remains essential. Peer review has gone through a number of models and experiments and has become the norm. Many scientists hate it, but we cannot do without it.

Panel members gave the following brief accounts of their views. 

Over 50 preprint servers exist covering an extensive range of subjects. A major emphasis seems to be on biomedical and health sciences, but preprint servers are not exclusively limited to those disciplines. 

  • Preprints speed up science and allow faster dissemination within the scientific community. They also expose data that may not survive peer review. The nature of preprints is changing, especially as a result of the pandemic: ethics has emerged as a review criterion. Some people feel that they have “the answer” to the problem and that it is urgent. Others think preprints should be shared only with researchers, not the world, which would be very difficult to do. There is no preprint business model. medRxiv, a server for health science preprints, accepts original research articles, which are first screened to ensure that there would be no risk in posting them before peer review. It has experienced large growth in submissions in the last year.
  • Springer Nature uses several methods of peer review and is trying to normalize the use of preprints to facilitate sharing through early dissemination. Authors are encouraged to share preprints, and every author is provided with access to In Review, a server where preprints by Springer Nature authors are hosted while the articles are being reviewed. Preprints are in HTML and are given DOIs, thus becoming part of the permanent scholarly record. When an article is published, it is linked to its preprint. In Review is available for 414 Springer Nature journals.
  • Review Commons organizes high-quality peer review before an article is submitted to a journal. Participating journals and servers include PLoS, the European Molecular Biology Organization (EMBO), bioRxiv, and medRxiv. The Commons addresses efficiency, quality, and transparency by avoiding the delays incurred when an article is resubmitted to successive journals. A consortium of publishers and reviewers has agreed to use Review Commons to produce and to consider journal-independent reviews. Quality is improved by focusing only on the science discussed in the article, and transparency is achieved by publishing reviews from all journals as they occur. Review Commons does not make any editorial decisions. When an article is published, the full set of reviews is published with it. The system has been successful; 17 journals have accepted manuscripts and their reviews, and authors have strongly supported this journal-agnostic and science-focused process. Preprints are aggregated using an automated curation process.
  • Preprints minimize editorial bias and give researchers the ability to share their findings, including incremental results, methods developments, etc. Their challenges are limited checks before posting, lack of clarity about what has or has not been checked, lack of underlying source data, delays in seeing the full reviewed article, and missing links between the preprint and the peer-reviewed article. Post-publication of peer reviews can solve these problems: when an article is published, the author has the option of publishing the reviews, and the names of the reviewers, along with it. The author can also revise the article in response to the reviewers’ comments. This raises the question: what does publication mean? Traditionally, it has meant that the article has passed peer review; with post-publication of reviews, it has come to have several different meanings.

It is important that peer review history remain public, even for articles that are not ultimately published; publishing it allows tracking of the research process.

Final Session: The Fourth Paradigm: Data-Intensive Scientific Discovery: More Than 10 Years Later

The title of this session was taken from The Fourth Paradigm: Data Intensive Scientific Discovery (Microsoft Research, 2009), which is a collection of essays on how science can be enhanced by sharing data.  It is now a good time to look back at some of the developments in data science that have occurred in the last 10 years. 

Recycle the Waste! 

Prof. Dr. Claudia Draxl, Physics Department, Humboldt-Universität zu Berlin

Research paradigms have evolved over the centuries; today, we are at the beginning of the fourth paradigm, data-intensive science.

The publication culture is changing, so we need new tools to extrapolate from data and to find the right data. We need a lot of data to feed our algorithms because science has become very complex.

After a research project is completed, it is customary to publish an article about it. Articles may contain figures, equations, structures, or a few numbers. But we will never be able to provide all the information in an article that we have collected. Who would want to write such a paper? Who would want to read it? Is some of the information trash? Trash comes in several possible forms: 

  • Recipes for syntheses that were not successful, or that are kept unpublished and secret because of intellectual property or competitive considerations,
  • Incomplete metadata that may have simply been handwritten in lab notebooks and then discarded when the research was finished,
  • Candidate materials that were not useful for the research and were discarded, but which may be very useful for other research, and
  • Everything else that is not put in the published article.

Publishing only success stories leads to bias; we also need to publish failures because they can increase our understanding. Some materials are not trash but are not needed for current research, so the Novel MAterials Discovery (NOMAD) repository was established in 2014 to host data about them. It now contains more than 100 million calculations from all over the world. The data sets are published like manuscripts, so they have DOIs and can be downloaded under a CC BY license. They are categorized and can be searched. Trash has therefore been turned into a gold mine!

In the research of tomorrow, the data challenges will be enormous. Interoperability is the biggest issue. We need benchmarks to determine the quality of the data. Trust levels and a FAIR data structure are urgently needed as well. 

Draxl concluded with her vision for the library of tomorrow.

It will have books of course, but also research journals, a connection to the data that were used in the published articles, interactive tools for the community to use, and centralized metadata for exploring the data found in the journals.

STM Research Data Year 2020 – A Review 

Dr. James Milne, President, ACS Publications, and Chairman of the Board of the International Association of Scientific, Technical and Medical Publishers (STM) said that data has been essential to science throughout history. How can we maximize its value?  

When powered by AI, data science can add magic. It can revolutionize the way science works, tap into unknown unknowns, test hypotheses against vast amounts of data and create new ones, do research, run labs, and write research articles. Publishers and researchers must work collaboratively. STM publishers have had a long-standing commitment to and a crucial role in FAIR data. We must also change how we evaluate and incentivize researchers, which is a key to moving forward faster. Since 2014 there has been a massive growth in the number of articles linked to data sets. 

The State of Open Data Report 2020 confirms that publishers still have a key role in data sharing. Surveys show that researchers get satisfaction from sharing data and should get credit for it, and they rely on publishers when they need help in making their data openly available. At the 2020 APE conference, STM launched the Research Data Year to assist publishers in supporting authors who share their data. At the start, 21 publishers participated; between them they publish over 13,000 journals. Progress of the program is tracked on a dashboard on the STM website. We cannot work alone in this: we must start early in the manuscript submission process to encourage researchers to provide their data, and we must also work with funders to show them the benefits of researchers sharing their data.

There is still more work to be done. 

European Open Science Cloud (EOSC)

Prof. Dr. Karel Luyben, President, EOSC Association, said that we are moving towards open data and FAIR data to advance science. Skills, metrics, and rewards are used in this process. 

Data includes any digital output of a research project, such as identifiers, standards, code, and metadata, and it must be as open and as FAIR as possible. The EOSC is moving towards a “web of FAIR research data” and services related to that data. It will be a federation of existing services in a virtual space for science producers and consumers. It is starting in Europe but is envisioned to grow into a worldwide organization. Boundary conditions for EOSC are:

  • Core funding for EOSC from the EU,
  • Inclusiveness of all stakeholders, and
  • Core following the subsidiarity principle.

We must realize that countries have different structures for their data, which must be accommodated. The EOSC should be hardware agnostic. Core functions going forward are:

  • Develop and govern the federating core,
  • Manage the compliance framework,
  • Manage trusted certification, 
  • Manage the Authentication and Authorization Infrastructure (AAI),
  • Manage persistent identifier (PID) policies,
  • Reach out to stakeholders,
  • Monitor services and transactions,
  • Manage EOSC trademarks, and
  • Contribute to EU policies.

The overriding principle is that EOSC is being developed for researchers. It will succeed if and only if it follows a multi-stakeholder approach, and it must ensure that research artifacts be as open as possible (although there will be circumstances where they must be closed). Data must be accessible by people as well as by machines in order to deliver services to scientists. 

The EOSC Governance created the EOSC Association, which now has 187 members.

All partners will commit to transparency and openness. EOSC will be active until the end of 2030.

————————————————

The next APE will take place January 10-12, 2022, in Berlin.

Donald T. Hawkins is an information industry freelance writer based in Pennsylvania. In addition to blogging and writing about conferences for Against the Grain, he blogs the Computers in Libraries and Internet Librarian conferences for Information Today, Inc. (ITI) and maintains the Conference Calendar on the ITI website (http://www.infotoday.com/calendar.asp). He is the Editor of Personal Archiving: Preserving Our Digital Heritage, (Information Today, 2013) and Co-Editor of Public Knowledge: Access and Benefits (Information Today, 2016). He holds a Ph.D. degree from the University of California, Berkeley and has worked in the online information industry for over 50 years.
