Designing Review and Evaluation for Open Scholarship

(Collaborative work with Philip N. Cohen, Professor of Sociology at the University of Maryland, College Park, and CREOS Visiting Scholar Emeritus.)

 

The scholarly communications ecosystem is evolving rapidly. However, the absence of systematic, comparable, longitudinal evidence on the performance of the system’s components, together with the complexity of publication and review practices, makes it difficult to characterize specific changes, to evaluate their short-term impact, and to develop reliable, generalizable conclusions about their broader application. The broad objective of this area of research is to inform the design and application of review and evaluation in open scholarship and meta-research.

 

Recent publications and presentations:

Information about the vetting process for a specific publication, or for a publishing venue in general, remains difficult to find, daunting to compare at scale, and impossible for anyone (with the possible exception of a journal’s editorial board) to fully assess. Obtaining a better view will require systematic, comparable, and scalable methods of describing peer review, as sketched below.
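
One way to make such descriptions systematic and comparable is a shared, machine-readable schema for review processes. The Python sketch below is a minimal illustration of that idea; every field name is a hypothetical example of an attribute such a schema might record, not part of any published standard.

```python
from dataclasses import dataclass, asdict
import json

@dataclass
class ReviewProcessRecord:
    """A structured description of one venue's peer-review process.

    All field names are hypothetical illustrations of the kind of
    systematic, comparable attributes such a record might capture;
    they are not a published standard.
    """
    venue: str
    review_model: str               # e.g. "single-anonymized", "open"
    reviewers_per_submission: int
    reports_published: bool         # are review reports made public?
    reviewer_identities_public: bool

    def to_json(self) -> str:
        return json.dumps(asdict(self))

# Two hypothetical venues, described in the same schema so they can be
# compared field-by-field at scale.
a = ReviewProcessRecord("Journal A", "single-anonymized", 2, False, False)
b = ReviewProcessRecord("Journal B", "open", 3, True, True)

print(a.to_json())
for field in ("review_model", "reports_published"):
    print(field, ":", getattr(a, field), "vs", getattr(b, field))
```

Because every venue is described with the same fields, such records could be aggregated and compared at scale rather than read one policy page at a time.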

 

    • The Scholarly Knowledge Ecosystem: Challenges and Opportunities for the Field of Information. (SocArXiv Preprint)

      In this article, we draw on major reports from a cross-section of disciplines concerned with large-scale scientific information ecosystems to characterize the most significant research challenges and the most promising approaches. We explore two themes that emerge across research areas: the need to align research approaches and methods with core ethical principles, and the promise of approaches that are transdisciplinary and cross-sectoral.

 

    • The Market for Peer Review (Draft for ASAPBio Preprint Sprint)

      This white paper characterizes the problem of ‘market failure’ in peer review. We identify characteristics of peer review as an economic ‘good’ that lead to poor market operation, provide a general checklist for developing replicable and measurable market interventions to improve peer review, and illustrate this approach by outlining a conceptual scheme for allocating peer review using tokens; a minimal sketch of such a scheme follows below.
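
The white paper describes the token scheme only conceptually. The sketch below is a hypothetical rendering of how the basic accounting could work, under assumed rules that are ours, not the paper’s: a completed review earns one token, and a submission costs three (roughly the number of reviews a submission consumes).

```python
class TokenLedger:
    """Hypothetical token accounting for a peer-review 'market'.

    Assumed rules (illustrative only): each completed review earns
    REVIEW_REWARD tokens; each submission costs SUBMISSION_COST tokens,
    so authors must review roughly in proportion to what they submit.
    """
    REVIEW_REWARD = 1
    SUBMISSION_COST = 3  # ~3 reviews are consumed per submission

    def __init__(self):
        self.balances: dict[str, int] = {}

    def credit_review(self, reviewer: str) -> None:
        """Credit a reviewer for one completed review."""
        self.balances[reviewer] = self.balances.get(reviewer, 0) + self.REVIEW_REWARD

    def try_submit(self, author: str) -> bool:
        """Debit a submission; refuse if the author lacks tokens."""
        if self.balances.get(author, 0) < self.SUBMISSION_COST:
            return False
        self.balances[author] -= self.SUBMISSION_COST
        return True

ledger = TokenLedger()
for _ in range(3):
    ledger.credit_review("alice")   # alice completes three reviews...
print(ledger.try_submit("alice"))   # ...and can now submit one paper: True
print(ledger.try_submit("bob"))     # bob has reviewed nothing: False
```

Pricing a submission at several review-credits is what would make such a scheme self-balancing: authors collectively must supply about as many reviews as their submissions demand.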

 

In this presentation, we review exemplars in health and medical research to identify the range of ways that complex human systems resist reliable inference and blunt ad hoc interventions. We then examine a successful intervention in open science and show how the evaluation and generalization of its success are vulnerable to the same threats to inference found in public health. We conclude by discussing how methods designed to increase the reliability of health-systems interventions can be applied to evaluating the practices of open science and scholarly communication.


(Presentation for the 2019 BITSS Annual Meeting and discussion panel at the Unlocking the File Drawer Workshop)
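
As a concrete illustration of the kind of evaluation design this argument points toward, the sketch below computes a difference-in-differences estimate, one standard way health-systems research guards against secular trends when judging an intervention. The scenario and all numbers are invented for illustration.

```python
# A minimal, self-contained illustration of one evaluation design that
# guards against a common threat to inference (secular trends):
# difference-in-differences. All numbers are made up.

def diff_in_diff(pre_t: float, post_t: float,
                 pre_c: float, post_c: float) -> float:
    """Change in the treated group minus change in the control group."""
    return (post_t - pre_t) - (post_c - pre_c)

# Hypothetical scenario: share of articles with open data at journals
# that adopted a data-sharing policy (treated) vs. comparable journals
# that did not (control) over the same period.
effect = diff_in_diff(pre_t=0.10, post_t=0.35, pre_c=0.10, post_c=0.20)
print(f"Estimated policy effect: {effect:+.2f}")  # +0.15
```

Subtracting the control group’s change removes field-wide trends that would otherwise be misattributed to the policy, which is exactly the class of inference threat the presentation highlights.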