Dr. Aleksandra Sokolowska, UZH Computational Scientist and Astrophysicist, Head of Research and Analytics at ETH Zurich Spinoff Validity Labs AG, @alex_sokolowska
There is a lot of confusion around blockchain and distributed ledger technologies (DLTs)1, hence the dichotomy between their “enthusiasts” and “sceptics”. Unlike machine learning, which easily speaks to our imagination through intelligent machines exceeding human capabilities in large-scale classification, diagnostics or predictive tasks, blockchain networks do not make superhumans. They do, however, provide an unregulated2 digital framework for direct interactions between human and non-human actors, and they address an issue that to this day leads to serious real-world conflicts: the transparency of deals and the immutability of their records. Because they enable more direct, participatory marketplaces, the prices of digital goods can be set more fairly, bringing some equilibrium to predominantly monopolistic ecosystems.
In the context of interactions, the scientific ecosystem is in dire need of a revolutionary solution to its multitude of problems. At its core lie broken incentives for all stakeholders – hiring committees, researchers, funding organizations, and publishers. The first three wrongly rely on the journal impact factor as a measure of the quality of an individual researcher’s work. As a result, a large number of first-author publications in high-impact-factor journals becomes the target. Research funding falls far short of demand, so researchers facing weak prospects of permanent employment inflate our vault of knowledge with an indigestible 2.5 million scientific publications a year. Needless to say, the publishing business is booming at this rate and has no incentive to interfere. The crisis is nevertheless becoming increasingly apparent, with reports of the key-performance-indicator culture ruining scientific research, for instance by rendering results irreproducible – at rates as high as 90% in the case of cancer research papers. If this persists, the reputation and position of journals will be undermined.
In addition, the model in which scientists evaluate someone else’s work for free, as an absolutely essential step in a handsomely paid publishing process, is meeting with a backlash from the stretched-thin scientific community. Newcomers have already been experimenting with a new revenue model that, on top of this, tackles two unsolved problems: it enables reviews of the peer reviews as well as post-publication peer review (e.g. Science Matters). Among other consequences, the lack of a system integrating the latter lets omissions persist, such as the continued citation of retracted papers.
A number of initiatives have already harnessed the advantages of DLTs in an attempt to mitigate issues of the scientific ecosystem, e.g. by: 1) timestamping various research outputs – registration of an idea, data collection, data analysis, interpretation, and peer review – ensuring that every researcher involved in a project gets appropriate credit for their input; 2) crowdfunding research projects and paying for reviews via cryptocurrencies; 3) tokenizing research outputs; 4) broadcasting to the network which lab equipment, operated by whom, conducted an experiment; 5) trading otherwise discarded experimental data over a decentralized network. In addition, the inherent properties of the peer-to-peer architecture will in the near future allow researchers to use resources such as the idle storage and computational power of peers at a better price than what tech giants offer today.
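The first use case, timestamping, rests on a simple mechanism: instead of publishing the artifact itself, one anchors its cryptographic hash on the ledger, proving that a specific author held that exact content at a specific moment. A minimal sketch, with all names hypothetical and no specific platform assumed:

```python
import hashlib
import time

def fingerprint(data: bytes) -> str:
    """A content hash that uniquely identifies a research artifact."""
    return hashlib.sha256(data).hexdigest()

def make_timestamp_record(data: bytes, author: str) -> dict:
    """A record that could be written to a ledger as proof that
    `author` possessed this exact content at this moment."""
    return {
        "sha256": fingerprint(data),
        "author": author,
        "unix_time": int(time.time()),
    }

# Example: timestamping a raw dataset before sharing it.
record = make_timestamp_record(b"raw telescope data, run 42", "A. Researcher")
```

Since only the hash is published, the data itself can stay private until the researcher chooses to reveal it; anyone can later recompute the hash and verify the claim of priority.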
Another application, the disintermediation of various services in the scientific ecosystem, is particularly promising. Universities and their libraries, which cover annually increasing publishing fees, are beginning to reconsider their position of weakness and to look into other solutions. A decentralized consortium of such universities – a DAO (decentralized autonomous organization) – would constitute an unprecedented advance over how knowledge is maintained by academia today.
The social implications of this infrastructure would be profound: DAO smart contracts could be used for transparent governance, voting and financial transactions, simplifying bureaucratically heavy processes. As a distributed system, this organization would be democratic. Protocols for who stores what data could be developed – e.g. storage could be assigned based on the PIs’ affiliations. Each university would have free access to the research data in the network. The attribution of who did what would be straightforward, with equipment broadcasting to the trusted network when needed, ensuring that everybody who ends up on a project’s author list did their fair share.
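The governance side of such a DAO can be illustrated with a toy model: one member university, one vote, and a proposal that takes effect once a majority approves. This is a sketch of the logic a smart contract might encode, not any existing contract; all names are hypothetical:

```python
class ConsortiumDAO:
    """Toy model of DAO governance: one member, one vote,
    and a proposal passes once a simple majority approves it."""

    def __init__(self, members):
        self.members = set(members)
        self.votes = {}  # proposal -> set of members voting in favour

    def vote(self, member: str, proposal: str) -> None:
        # On a real ledger, membership would be enforced by the contract itself.
        if member not in self.members:
            raise PermissionError("only member universities may vote")
        self.votes.setdefault(proposal, set()).add(member)

    def passed(self, proposal: str) -> bool:
        # Simple majority of all members.
        return len(self.votes.get(proposal, set())) > len(self.members) / 2

dao = ConsortiumDAO(["UZH", "ETH", "EPFL"])
dao.vote("UZH", "host-dataset-X")
dao.vote("ETH", "host-dataset-X")
```

The point of putting such rules on a ledger rather than in a bylaw document is that every vote and every outcome is recorded transparently and executed automatically, with no administrative intermediary.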
In such an “academy on a ledger”, the funds that today constitute the publishers’ profit could be turned into salaries for highly trained PhD drop-outs, who could now serve as paid curators of the content (i.e. reviewers and researchers performing replication studies). In this future world, we would no longer add papers to journals – we would maintain our collective body of knowledge and write to the “ledger” only what the curators have verified.
Much like in the two AI winters of the ’60s and ’80s, today’s blockchain capabilities are limited while expectations have been raised far too high by the industry-driven b-s factor3. The technology is in its infancy, with unsolved scalability issues and a user experience that scares people off. This is the right time for experimentation; however, the motivation behind any implementation should always be critically evaluated. In the case of DLTs, the reason for their use should be clear: to protect the commons from rent-seeking intermediaries.
Meanwhile, as an undoubtedly better, decentralized way of handling human knowledge is cooking, the scientific publishing industry must keep up with research innovation and provide adequate means of scholarly communication. Such a new way should not impair research quality, should not incentivize quantity over quality, and should provide the reader with quick insights in the era of global research, as well as easy access to any data necessary to critically evaluate someone’s work at any point, be it pre- or post-publication. Blockchain aside, this should be today’s number one goal.
1Simply put, blockchain combines existing concepts from cryptography and distributed systems, allowing a user to sign transactions digitally. A transaction can be purely financial – transferring money from A to B – or can additionally contain a smart contract, a piece of code performing tasks on the network. On top of this, a global consensus mechanism ensures that all parties connected to the network have the same understanding of the ledger.
2debates on the regulation of blockchain and DLTs are ongoing
3pardon my French
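To make footnote 1 concrete: the immutability that the article relies on comes from each block committing to the hash of its predecessor, so that altering any past transaction changes every later hash. A minimal sketch (digital signatures and the consensus mechanism are deliberately omitted):

```python
import hashlib
import json

def block_hash(block: dict) -> str:
    """Deterministic hash of a block's contents."""
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

def append_block(chain: list, transactions: list) -> None:
    """Each new block stores the hash of the previous one, so tampering
    with any earlier transaction breaks the whole chain after it."""
    prev = block_hash(chain[-1]) if chain else "0" * 64
    chain.append({"prev_hash": prev, "transactions": transactions})

chain = []
append_block(chain, [{"from": "A", "to": "B", "amount": 10}])
append_block(chain, [{"from": "B", "to": "C", "amount": 4}])
```

In a real network the same chain is replicated across all peers, and consensus decides whose version of the next block is appended; this sketch shows only the tamper-evident linking.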