12 Matching Annotations
  1. Jul 2018
    1. On 2014 May 09, Greg Lennon commented:

This article is certainly worthy of discussion, not least because of the credibility and experience of its authors. I hope that future publications will provide benchmarking and analysis of systems outside the US, examining which aspects work better (or worse) than the current American system, to give context for the proposals in this publication.


      This comment, imported by Hypothesis from PubMed Commons, is licensed under CC BY.

    2. On 2014 May 06, Melissa Vaught commented:

In this perspective, Alberts et al. reference a white paper published by the American Society for Biochemistry and Molecular Biology (ASBMB), "Toward a Sustainable Biomedical Research Enterprise." In part, the white paper provided a basis for a discussion hosted by the ASBMB Public Affairs Advocacy Committee at Experimental Biology 2014. My report from the session provides a summary and includes a link to a collection of tweets posted by myself and others from the session.

Both Jeremy Berg (president of ASBMB) and Paula Stephan (a professor of economics) highlighted the pronounced increase in PhD production in recent years. Alberts et al. state here, "The goal of the next set of recommendations is to gradually reduce the number of entrants into PhD training in biomedical science—producing a better alignment between the number of entrants and their future opportunities..." The message from these groups seems clear: in the biomedical sciences, the influx of funds was met by expansion of PhD programs, and it's now time to scale back PhD production. Yet in this and another session, I heard some faculty push back against the idea that their departments should train fewer PhD students. This dissonance emphasizes the importance of a question raised by the ASBMB white paper: "Are there approaches that could estimate how many Ph.D.-, M.S.-, and B.S.-level scientists are needed for the American biomedical research enterprise?"

Finally, I would like to raise an issue that I have found missing from the discussions thus far. Diversity of the biomedical workforce is also out of balance (see, for example, NSF data on Women, Minorities, and Persons with Disabilities in Science & Engineering). I think it's important to consider how proposed changes might disproportionately affect underrepresented minorities, and how to ensure that we continue to improve diversity in science while moving toward a sustainable research enterprise.



    3. On 2014 Aug 14, Jim Woodgett commented:

While Alberts et al. describe the situation in the USA, they might look to their north, where the equivalent of the NIH (albeit with about one-thirtieth of the budget), the Canadian Institutes of Health Research, is grappling with several of the issues raised by the authors. The "funding reforms" are documented here (http://www.cihr-irsc.gc.ca/e/44761.html) and attempt to address a slow but steady decline (a slow train wreck) in confidence in adjudication, caused by a six-year flat-lined budget and further exacerbated by the predictable reactions of scientists to the increased pressures.

Some of their proposed changes are interesting (virtual review, 7-year programs, etc.), but it is a huge simultaneous experiment with no controls, transitional funding, or Plan B. I encourage researchers in other jurisdictions to follow the progression of the CIHR reforms and to learn from their outcomes. Clearly, the current ecosystem of science is not sustainable without either additional investment or the elimination of some of the perverse incentives that drive poor decisions and planning.



    4. On 2014 Aug 14, David Colquhoun commented:

I think that the problems are stated very well in this article. See also 'The Mismeasurement of Science' (http://www.dcscience.net/?p=186) and other writing by Peter Lawrence of the Laboratory of Molecular Biology. But the solutions that are suggested are not as convincing as the statement of the problem. I fear that Alberts et al. have failed to grasp the nettle. My suggestions are as follows (based on 'Open access, peer review, grants and other academic conundrums': http://www.dcscience.net/?p=487).

(1) Limit the number of papers that an individual can publish. This would increase quality, reduce the impossible load on peer reviewers, and reduce costs. It would also encourage the best people to do experiments themselves, rather than presiding over an army of serfs.

      (2) Limit the size of labs so that more small groups are encouraged. This would increase both quality and value for money.

(3) Fund more (and therefore smaller) grants; these are essential for innovation and productivity.

(4) Move towards self-publishing on the web, so that the cost of publishing becomes very low rather than the present extortionate costs. It would also mean that negative results could be published easily and that methods could be described in proper detail.

      (5) Peer-review is less than satisfactory even for the most glamorous journals. At the bottom end of the market it is utterly ineffective. The solution to that is open post-publication peer review. Every paper should be followed by a comments section.

(6) Stop using metrics as a substitute for reading papers. The use of metrics is corrupting science. Citation counting is inadequate (see http://www.dcscience.net/?p=182 ), and altmetrics are plain silly (http://www.dcscience.net/?p=6369 ).



    5. On 2014 Apr 28, Kenneth D Gibbs commented:

The potential downward impacts of the current system are real. In speaking to recent biomedical science Ph.D. graduates, issues with science funding in general, and the climate created by current funding pressures in particular, came up as reasons for some to leave academic science and/or science altogether. I would point readers to my recent work published in CBE Life Sciences Education: "What Do I Want to Be with My PhD? The Roles of Personal Values and Structural Dynamics in Shaping the Career Interests of Recent Biomedical Science PhD Graduates". The link is here: http://www.lifescied.org/content/12/4/711.full



    6. On 2014 Apr 18, Allison Stelling commented:

      L Charles Murtaugh: "reduce the temptation to package all results into "stories" that compromise messy scientific reality for the sake of superficially compelling narrative"

Well put. As compelling (and useful, when used properly) as stories and narratives are, they can also gloss over important complexities that are part and parcel of the realities of lab work. Storytelling will always have a place in any human endeavor, but the degree to which it is currently emphasized in science can cause damaging hype and overselling.

I think there are elements of pre-publication peer review that should be kept. It is usually helpful to have senior, experienced scientists give feedback prior to presenting one's work to the Academy. (I'd quite prefer to be told that I am completely wrong about something in a semi-private fashion before I present it to the world!) However, post-publication peer review has great potential to put many eyes on data and produce robust applications built upon firmly tested foundations.

bioRxiv may be the start of an inversion in biomedical publishing. I like the idea, but I also know there is much invested in the current distribution system for high-quality biomedical science. (It would be interesting to see a similar site geared towards clinical trial data.)

A recent Nature editorial, 'Credit where credit is due', suggests new authorship metrics as well. Such systems may help clarify exactly who did what in the increasingly team-oriented work of life science. Biology needs teamwork, yes, but it is also important to acknowledge (and reward) all the individual contributions.



    7. On 2014 Apr 18, L Charles Murtaugh commented:

I found Dr. Bishop's comments interesting and provocative. In particular, I quite like the idea that granting agencies should "require grantholders to write up the findings from a funded project before they are eligible to apply for further funding." This could be a great use for preprint repositories like bioRxiv. The wider use of such repositories, as a means of disseminating data from funded projects, could also reduce the temptation to package all results into "stories" that compromise messy scientific reality for the sake of superficially compelling narrative. I don't know that the entire journal system needs to be jettisoned -- I sometimes joke that the future of scientific publishing will be to dump a stack of TIFF files onto a server and issue a tweet -- but some new approach is needed to ensure access to data from publicly funded projects and, in turn, to ensure some level of reward for those who generate that data. Making data release a mandatory part of grant applications would serve these ends.



    8. On 2014 Apr 17, Allison Stelling commented:

      The other issue is that the entire United States biomedical endeavor is severely underfunded, by about an order of magnitude. Putting a man on the moon was cheap, easy, and fast compared to "curing" cancer. If we are truly at war with death, we should fund it like one.

However, simply throwing money at the problem will solve little. Self-reinforcing feedback loops of money, power, and prestige are endemic in many American systems. They result in the inefficient allocation of resources and concentrate what should be fairly evenly distributed. We need a fully transparent accounting of where the money is going before more resources are pumped into the system.

We need to get universities out of professional research and allow PIs to train their students instead of desperately writing grants that, more and more, will not get funded. We need to set up institutes to absorb the excess of biomedical "trainees" and provide them with stable work. We need more lab technicians and staff scientists to perform replications and verify discoveries.

We must parallel-process the quest for cures; after all, science must happen anywhere, in any language, or it is not science. It will not be one single person, department, scientific field, institute, city, state, or nation that finds such "cures". It will take all of us working together, and societies that are educated and understand why this is so very necessary.



    9. On 2014 Apr 17, Dorothy V M Bishop commented:

Thanks to Alberts et al. for opening up this discussion. While I agree with many of their points, I was rather disappointed in their proposed solutions. Their recommendations fall into three categories:

      a) Improving predictability of scientific budgets

      b) Changing the ways in which early-career scientists are funded, and in particular removing financial incentives for institutions to treat them as scientific serfs

c) Increasing the quantity and quality of reviewing of grants

I felt that (a) would be unworkable in a world subject to unpredictable political and economic forces; (b) seemed worth serious consideration; but (c) seemed unlikely to achieve much and risked worsening a problem scientists already have: finding time to think and do productive work<sup>1</sup>.

      I was disappointed that the authors said very little about how we might tackle the malaise that affects the top echelons of science in many institutions, which they describe clearly in their first section: "pressure to rush into print, cut corners, exaggerate findings, overstate significance". They recognise the 'reproducibility crisis'<sup>2</sup> but it's not clear that their recommendations do anything to tackle it.

As I noted in a blogpost last year<sup>3</sup>: "I don’t believe anyone goes into science because they want to become rich and famous: we go into it because we are excited by ideas and want to discover new things. But just as bankers seem to get into a spiral of greed whereby they want higher and higher bonuses, it’s easy to get swept up in the need to prove yourself by getting more and more grants, and to lose sight of the whole purpose of the exercise – which should be to do good, thoughtful science. We won’t get the right people staying in the field if we value people solely in terms of research income, rather than in terms of whether they use that income efficiently and effectively."

A depressing example of the consequences is here: a postdoc describing leaving the field because of pressure to distort findings<sup>4</sup>.

I was pleased to see Alberts et al. suggesting that funders should take into account the amount of funding already awarded when considering a proposal. That's a step in the right direction. But I would go further: require grantholders to write up the findings from a funded project before they are eligible to apply for further funding. Require pre-registration of research protocols, to prevent cherry-picking of results<sup>5</sup>. Alberts et al. want science to be slower and more thoughtful: I think to achieve that aim we need to change the incentives for those at the top.

      References

      <sup>1</sup> http://deevybee.blogspot.co.uk/2013/09/evaluate-evaluate-evaluate.html

      <sup>2</sup> http://www.nature.com/news/independent-labs-to-verify-high-profile-papers-1.11176

      <sup>3</sup> http://deevybee.blogspot.co.uk/2013/05/the-academic-backlog.html

      <sup>4</sup> http://anothersb.blogspot.com/2014/04/dear-academia-i-loved-you-but-im.html

      <sup>5</sup> http://deevybee.blogspot.co.uk/2014/01/why-does-so-much-research-go-unpublished.html


