1 Matching Annotation
  1. Jul 2018
    1. On 2013 Jun 18, Jonathan Dugan commented:

      This work describes ongoing efforts that are closely related to current efforts here at PLOS Labs looking at open, structured methods and tools to evaluate published works by peers.

      I thought the section on "Providing Appropriate Incentives" was both an excellent review of the issues and the most important part of making next-generation evaluation happen. Specifically, "there simply isn't any meaningful incentive to contribute" is a core problem both for current peer review and for novel systems of peer evaluation.

      In terms of what happens to traditional pre-publication review, my opinion is that the existing peer review process is essential for fair reputation management in researchers' professional work, and our work on this problem is predicated on a hybrid model that preserves peer review in its current form for the foreseeable future. Any future evaluation system that "might ultimately obviate the need for conventional journals" would have to be built on Internet-scale identity systems and on robust, resilient professional reputation systems, neither of which is close to being produced at this point. Many unsolved issues remain in online identity and reputation, and resolving them will likely require changes in how researchers and the public (think generations of people) understand these core concepts before online systems can succeed. A new system that "obviates" peer review would need to be measurably and obviously better for science and for the researchers using it. The hybrid model referred to in the paper, with novel methods working alongside existing publishers that use journals and peer review to encode professional reputation, has many extremely positive features.


      This comment, imported by Hypothesis from PubMed Commons, is licensed under CC BY.
