Search Results

You are looking at 1–10 of 11 items for

  • Author or Editor: Matthew S. Mayernik
Matthew S. Mayernik, Mohan K. Ramamurthy, and Robert M. Rauber
Full access

Matthew S. Mayernik, Mohan K. Ramamurthy, and Robert M. Rauber
Full access

Matthew S. Mayernik, Mohan K. Ramamurthy, and Robert M. Rauber
Full access

Matthew S. Mayernik, Mohan K. Ramamurthy, and Robert M. Rauber
Full access
Matthew S. Mayernik, Sarah Callaghan, Roland Leigh, Jonathan Tedds, and Steven Worley

Abstract

Peer review holds a central place within the scientific communication system. Traditionally, research quality has been assessed by peer review of journal articles, conference proceedings, and books. There is strong support for the peer review process within the academic community, with scholars contributing peer reviews with little formal reward. Reviewing is seen as a contribution to the community as well as an opportunity to polish and refine understanding of the cutting edge of research. This paper discusses the applicability of the peer review process for assessing and ensuring the quality of datasets. Establishing the quality of datasets is a multifaceted task that encompasses many automated and manual processes. Adding research data into the publication and peer review queues will increase the stress on the scientific publishing system, but, if done with forethought, will also increase the trustworthiness and value of individual datasets, strengthen the findings based on cited datasets, and increase the transparency and traceability of data and publications.

This paper discusses issues related to data peer review—in particular, the peer review processes, needs, and challenges related to the following scenarios: 1) data analyzed in traditional scientific articles, 2) data articles published in traditional scientific journals, 3) data submitted to open access data repositories, and 4) datasets published via articles in data journals.

Full access
Matthew S. Mayernik, Mohan K. Ramamurthy, and Robert M. Rauber
Full access

Matthew S. Mayernik, Mohan K. Ramamurthy, and Robert M. Rauber
Full access

Matthew S. Mayernik, Mohan K. Ramamurthy, and Robert M. Rauber
Full access

Matthew S. Mayernik, Mohan K. Ramamurthy, and Robert M. Rauber
Full access
Douglas C. Schuster, Matthew S. Mayernik, Gretchen L. Mullendore, and Jared W. Marquis

Abstract

It has become common for researchers to make their data publicly available to meet the data management and accessibility requirements of funding agencies and scientific publishers. However, many researchers face the challenge of determining what data to preserve and share, and where to do so. This can be especially challenging for those who run dynamical models, which can produce complex, voluminous outputs, and who may not have considered which outputs need to be preserved and shared as part of the project design. This manuscript presents findings from the NSF EarthCube Research Coordination Network project titled "What About Model Data? Best Practices for Preservation and Replicability" (https://modeldatarcn.github.io/). These findings suggest that if the primary goal of sharing data is to communicate knowledge, most simulation-based research projects only need to preserve and share selected model outputs along with the full simulation experiment workflow. One major result of this project has been the development of a rubric designed to provide guidance on which simulation outputs need to be preserved and shared in trusted community repositories to achieve the goal of knowledge communication. This rubric, along with use cases for selected projects, provides scientists with guidance on data accessibility requirements during the research planning process, allowing for more thoughtful development of data management plans and funding requests. Publishers can also refer to the rubric when setting expectations for data accessibility at publication.

Open access