The increasingly digital workflow in science has made it possible to share almost every aspect of the research cycle, from pre-registered analysis plans and study materials to the data and analysis code that produce the reported results. Although the growing availability of research output is a positive development, most of this digital information is stored in formats that make it difficult to find, access, and reuse. A major barrier is the lack of a framework to concisely describe every component of research in a machine-readable format: a grammar of science.
Software
We are developing software to generate and process machine-readable descriptions of research. These descriptions are designed to facilitate archiving studies, pre-registering analysis plans, finding variables and measures used in other research, conducting meta-analyses, finding and reusing datasets in other ways, and assessing research outputs against best practices.
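To make this concrete, a study description of this kind could be represented as structured data that software can generate, archive, and query. The sketch below is a hypothetical Python example; the schema and its field names (name, hypotheses, variables, analyses) are assumptions for illustration, not the project's actual format.

    import json

    # A hypothetical machine-readable study description. The schema
    # (name, hypotheses, variables, analyses) is illustrative only.
    study = {
        "name": "Example Study",
        "hypotheses": [{
            "id": "H1",
            "description": "Self-esteem is higher in the treatment group.",
            "criteria": [{"analysis": "A1", "result": "p.value",
                          "operator": "<", "comparator": 0.05}],
        }],
        "variables": [
            {"name": "self_esteem", "type": "float",
             "description": "Mean score on a self-esteem scale"},
            {"name": "group", "type": "string",
             "levels": ["treatment", "control"]},
        ],
        "analyses": [{"id": "A1", "type": "t-test",
                      "dv": "self_esteem", "iv": "group"}],
    }

    # Serialize to JSON so the description can be archived and shared.
    print(json.dumps(study, indent=2))

    # A simple query: list the measured variables, the kind of lookup
    # that could support finding and reusing measures across studies.
    print([v["name"] for v in study["variables"]])

Because every component is named and typed, the same description could support pre-registration (the hypotheses and planned analyses are explicit) as well as later search and meta-analysis (the variables and measures are queryable).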
Our current focus is MetaCheck, a tool that screens scientific manuscripts to identify potential issues or areas for improvement and guides researchers in adopting best practices. It can also process large numbers of papers for metascientific enquiry.
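As a rough illustration of what automated screening can look like, the sketch below flags two common reporting issues in manuscript text. Both checks, and the function screen_manuscript, are hypothetical simplifications, not MetaCheck's actual implementation, which would require far more robust parsing than regular expressions.

    import re

    def screen_manuscript(text: str) -> list[str]:
        """Run simple best-practice checks on manuscript text.

        A hypothetical sketch, not MetaCheck's actual checks.
        """
        issues = []

        # Flag imprecise p-value reporting such as "p < .05" or
        # "p < 0.05", where exact p-values are preferred.
        if re.search(r"p\s*[<>]\s*0?\.05", text):
            issues.append("Imprecise p-value reporting; report exact p-values.")

        # Flag a missing data availability statement.
        if not re.search(r"data (are|is) available|data availability",
                         text, re.IGNORECASE):
            issues.append("No data availability statement found.")

        return issues

    print(screen_manuscript("We found an effect (p < .05) on well-being."))
    # Both issues are flagged for this toy snippet.

Checks of this kind are cheap to run at scale, which is what makes the same machinery useful both for guiding individual authors and for metascientific analyses of large samples of papers.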
Funding
This project is partially funded by the TDCC Research Transparency Check Project, co-led by René Bekkers and Daniël Lakens. Aspects of the project are also funded by a Vici grant to Daniël Lakens.
Core Team
Scienceverse is a collaborative project led by Lisa DeBruine (University of Glasgow) and Daniël Lakens (Eindhoven University of Technology). Other contributors are Cristian Mesquida (postdoc, TUE), Jakub Werner (RA, TUE), Hadeel Khawatmy (RA, TUE), Lavinia Ion (RA, TUE), René Bekkers (Transparency Check co-PI, VU Amsterdam), and Max Littel (RA, VU).
Publications
Lakens, D., & DeBruine, L. M. (2021). Improving transparency, falsifiability, and rigour by making hypothesis tests machine readable. Advances in Methods and Practices in Psychological Science, 4(2). https://doi.org/10.1177/2515245920970949 (more accessible preprint version)