55 Matching Annotations
  1. Dec 2018
    1. .

      Total publishing income from Conservation Biology and Conservation Letters together would be $464,000 (324K + 140K) minus what's taken by Wiley.

      Compared to the reported publishing income of $678,000 for 2017, that would be at least $200,000 less, reducing the publishing surplus by half, from $400,000 to $200,000.
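A quick sanity check on the figures above (a minimal sketch; all inputs are the estimates quoted in these annotations, and the share taken by Wiley is unknown and not subtracted):

```python
# Combined publishing income estimate for the two journals, before the
# (unknown) share taken by Wiley. Figures are the estimates quoted above.
cons_bio = 324_000       # est. annual publishing income, Conservation Biology
cons_letters = 140_000   # est. annual publishing income, Conservation Letters
reported_2017 = 678_000  # publishing income reported for 2017

combined = cons_bio + cons_letters
shortfall = reported_2017 - combined
print(combined, shortfall)  # 464000 214000
```

The ~$214,000 gap is where the "at least $200,000 less" figure comes from.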

    2. ,

      As a very rough estimate, assuming that publishing income ($678,000 as reported for 2017) is mostly derived from author charges for Conservation Biology (ca. $300,000 as per above) and Conservation Letters ($140,000, see calculation below*), plus subscription revenue, all minus what's taken by Wiley:

      Depending on what % is taken by Wiley, income from author charges would be 65% or less of total publishing income, and income from subscriptions 35% or more.

      (with the acknowledgement that this is comparing 2017 revenue with a sample of 2018 publication data).

      *The Dec 2018 issue of Conservation Letters contained 17 articles/reviews at $1,850 = $31,450. Extrapolated to a full year, with 20% of charges waived: approx. $140,000 in author charges.
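A minimal sketch of the two rough estimates above. The six-issues-per-year schedule is my own assumption; with it, the Conservation Letters annualization lands somewhat above the ~$140,000 quoted, so these are ballpark figures only:

```python
# Conservation Letters: annualizing the Dec 2018 issue (assumptions:
# $1,850 per article, 17 articles, six issues/year, 20% of charges waived).
per_issue = 17 * 1_850                       # $31,450 for the Dec 2018 issue
letters_annual = per_issue * 6 * (1 - 0.20)  # ~$151,000, ballpark of ~$140,000

# Share of author charges in total publishing income, before any Wiley
# deduction (ca. $300,000 + $140,000 against the $678,000 reported for 2017).
share = (300_000 + 140_000) / 678_000
print(per_issue, round(letters_annual), round(share * 100, 1))  # 31450 150960 64.9
```

The ~65% share is what the "65% or less" claim above is based on.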

    3. .

      Using the calculation in the comments above (based on no page charges and waived color charges for OA articles):

      The December 2018 issue could have raised more than $65,000 from author charges. If we assume that the December 2018 issue is typical of the year and that 25% of charges are waived, then Conservation Biology raises nearly $300,000 a year (before subscription revenue).
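A sketch of that annualization, under stated assumptions: the $65,050 issue total from the calculations in these annotations, 25% of charges waived, and an assumed bimonthly schedule of six issues per year:

```python
# Annualizing Conservation Biology author charges from one issue.
per_issue = 65_050   # Dec 2018 issue total (scenario with OA color charges waived)
issues_per_year = 6  # assumed bimonthly schedule
waived = 0.25        # share of charges assumed waived

annual = per_issue * issues_per_year * (1 - waived)
print(round(annual))  # 292725, i.e. "nearly $300,000 a year"
```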

    4. If we assume no waivers were granted for the December 2018 issue, the page charges would have generated $37,950, the color charges $28,000, and the APCs $21,000, for a total of $86,950 paid by authors.

      Working from the information that page charges are not levied for OA articles (see previous comment):

      page charges would have generated $26,550, the color charges $28,000, and the APCs $21,000, for a total of $75,550 paid by authors.

      If it is assumed that color charges are waived on request for OA articles, as journal policies allow:

      page charges would have generated $26,550, the color charges $17,500, and the APCs $21,000, for a total of $65,050 paid by authors.

      (7 OA articles with a total of 75 pages, of which 15 color pages)
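The three scenarios above can be reproduced from per-unit fees. The unit counts below are my own back-calculations from the quoted dollar totals ($150 per page, $700 per color page, and an assumed $3,000 APC, i.e. $21,000 / 7 articles); note that the derived 76 OA pages differs by one from the 75 quoted above:

```python
# December 2018 issue of Conservation Biology: author charges under the
# three waiver scenarios discussed above. Unit counts are back-calculated
# from the quoted totals, not taken from the issue itself.
PAGE_FEE, COLOR_FEE, APC = 150, 700, 3_000

total_pages = 253       # assumed: $37,950 / $150
total_color_pages = 40  # assumed: $28,000 / $700
oa_articles = 7
oa_pages = 76           # assumed: ($37,950 - $26,550) / $150
oa_color_pages = 15

def issue_total(pages, color_pages, apcs):
    """Total author charges for one issue."""
    return pages * PAGE_FEE + color_pages * COLOR_FEE + apcs * APC

# Scenario 1: no waivers at all
no_waivers = issue_total(total_pages, total_color_pages, oa_articles)
# Scenario 2: no page charges for OA articles
no_oa_page_fees = issue_total(total_pages - oa_pages, total_color_pages, oa_articles)
# Scenario 3: additionally, color charges waived for OA articles
oa_color_waived = issue_total(total_pages - oa_pages,
                              total_color_pages - oa_color_pages, oa_articles)
print(no_waivers, no_oa_page_fees, oa_color_waived)  # 86950 75550 65050
```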

    5. fee for color pages is $700

      Interestingly, authors can decide to only have color in the online publication, and b/w in the print publication, in which case there are no color charges. (https://onlinelibrary.wiley.com/page/journal/15231739/homepage/forauthors.html#Charges)

      It would be interesting to see how many authors choose this option.

    6. Page charges are assessed at the rate of $150 per page with provision for reductions or waivers if the author does not have the means to pay at this rate

      There are also no page charges for open access papers (https://onlinelibrary.wiley.com/page/journal/15231739/homepage/forauthors.html#Charges)

  2. Oct 2018
    1. v

      Perhaps even more important/pressing: for those cases where it is unambiguously defined as posing a problem for Plan S implementation (e.g. Germany), it would be important to make explicit what this would mean for compliance with Plan S (e.g. if the EU signs on, what this would mean for grantees from Germany).

      Regarding AAUP, whether or not the wording 'full freedom in research and in the publication of the results' encompasses choice of publication venue is up for debate...

    2. ,

      citation needed ;-)

    3. .

      On the other hand, it could be argued that this would encourage the eventual flipping (soft-flipping) of journals (if submissions to the subscription journal were to taper off) and thus would ease the transition. One condition would be that the open journal is open to all authors (so no 'Nature Plan S') :-)

    4. Support and encourage the application of DORA and other declarations on good research evaluation practice no later than 2020.

      And also apply these practices as funders, fully in line with the commitment (in the preamble of Plan S) of the coalition funders to 'fundamentally revise the incentive and reward system of science' (starting from DORA).

    5. FOAA cost transparency prop

      Although it should be noted that the proposed APC cap in TTOA (not exceeding €1,400 on average) is considerably higher than that set in the FOAA principles themselves (max. $1,000).

    6. build cost transparency into t

      Fully seconded. This is very important.

    7. The implementation phase of this principle should therefore not only focus on publication fees, but also support no-(author)-fee initiatives.

      There is nothing in Plan S that focuses solely on publication fees, and the preamble specifically states no preference for any given business model (excluding hybrid). To what extent this approach will hold in the implementation is indeed still an open question.

    8. ,

      Important point: selectively favouring existing solutions might in the longer run inhibit innovation and the potential for larger change. So perhaps there should be a balance between supporting existing and new initiatives?

    9. excellent reputation. Establishing the quality

      Would be good to qualify and specify this: e.g. is this meant in terms of journal quality in technical/objective terms (as described above), or also as reputation/quality with authors in terms of trust (in infrastructure and editorial board), reach, and (perceived) quality of papers published previously?

    10. technical infrastructural criteria and quality of service criteria: editorial responsiveness, added value of peer review, language and copy editing, indexing, preservation

      Additional criteria (with some overlap): TOP guidelines https://cos.io/our-services/top-guidelines/, open peer review, open licenses, platform stability, machine readability, XML, configurable papers, reader-side formatting, diversity, commenting, multiple-language abstracts (cf. https://twitter.com/jeroenbosman/status/1053207440999751680)

    11. Evaluating research quality at the journal level

      Agreed, but I think the issue here is quality of the platform/journal, not quality of the published research itself.

    12. .

      It's not stated that this is to be developed from scratch; it could build on DOAJ as well, possibly with funds provided to increase its capacity?

    13. , some publishers allow authors to “keep” their copyright while demanding that authors sign over all exclusive ri

      I think this is excluded by the phrase 'with no restrictions', but clarification would indeed be welcome.

    14. .

      While not black and white, this is a really important point that deserves ample consideration, discussion and commitment.

    15. this might incentivize publishers to hasten the transition of their journals to full gold OA

      I would be really interested to learn a bit more detail on the reasoning behind this.

    16. Alternatively, Plan S can choose to focus on articles and reports, but make it explicit that data have to be freely accessible in the context of Open Science.

      I would support the latter, because it emphasizes the importance of the issue without complicating implementation of Plan S by extending it to data (with different infrastructure and license issues). But that's more from a practical than from a conceptual view on research outputs, and may be too shortsighted :-)

    1. establish a European Open Science Cloud, which will make eventually access to data, data sharing, and data re-use possible.

      This really is an aside here, but if this implies that any future data/data services for a future monitor should all/only be coming from EOSC that would be a worrying limitation in itself.

    2. It is not the intention to provide the ultimate Monitor and a research assessment tool, but solely a tool which helps us to assess how open science practices evolve and provide insights on how we could possibly foster such open science practices with public policy

      Even if a monitoring tool is not set out to be used as a research assessment tool, chances are it will be used as such, either directly or by institutions emulating or modeling (parts of) the monitor for their own monitoring activities. Therefore, the EC could be considered to have a responsibility to provide as equitable and transparent an instrument as possible.

    3. the evaluation is based solely on the information provided in the submitted tender

      Unless I am reading this the wrong way: does this statement, in the context of this and other answers, mean that any COI or other problem not mentioned in the tender is excluded from the evaluation, and, if pointed out later, is dismissed because it was not part of the evaluation? That seems like circular reasoning, which leaves no ground for appeal on such grounds.

    4. Art. II.10.2 of the Contrac

      This refers to a document (apparently the contract itself) that is, to my knowledge, not publicly available.

      It would be very helpful if the Commission could point to the location of this document, if it is indeed publicly available.

    5. ave been identified in the tender, as requested in the tender specifications and made known to the European Commission

      This raises several questions. Since, per the contract award notification, the share of work allocated to the subcontractor was estimated to be 10%, the reason the subcontractor(s?) were identified in the tender 'as requested' must be that their capacity was necessary to fulfil the selection criteria? If that was the case, the question is then why their identity did not need to be revealed to the public in the tender award notification.

    6. and its drivers

      As an aside, it is as of yet unclear if and how the monitor project will address this issue, and (again) what the role of the subcontractor will be in realizing that aspect of the monitor.

    7. There will be no exclusive dependence on a single subcontractor the consortium

      It is still unclear what this means in the current implementation, with Elsevier being the only subcontractor that we know of, and many current indicators solely depending on Elsevier data sources.

    8. The Monitor should be seen as a collaborative initiative.

      As of today (Oct 3) there has been no response given to the individual comments, and how they will be dealt with.

    9. the Monitor is one of the various ‘tools’ available to guide us how to promote and incentivise Open Science. The Commission never takes its decisions on a single tool such as the Open Science Monitor

      This is irrelevant to the question at hand.

    10. The consortium is neither fully dependent on Elsevier, nor does Elsevier determine which indicators the Monitor bases itself on.

      This is still the crux: at the moment, the consortium does rely fully on Elsevier for a large part of the current indicators (pending any future decisions, which are unknown as of yet), and the role of Elsevier in determining the indicators and/or other aspects of the monitor is not documented anywhere that is publicly accessible.

    11. Elsevier as a subcontractor provides in this context only a service to the contractor and has therefore no authority to decide on the construction of the Monitor

      While this reply answers the question of who is ultimately responsible (the consortium as contractor), it does not rule out an undue role of Elsevier in affecting those decisions, nor does it provide the requested transparency on what the role of the subcontractor will be.

    12. subcontractors (including Elsevier)

      Apparently there are more subcontractors, whose capacity is each deemed necessary to fulfil the selection criteria of the tender? (see this comment: https://hyp.is/ArDG3scjEeiKGKvmCp3cqQ/docdrop.org/static/drop-pdf/Annex-to-letter-to-Jon-Tennant-1--BbPfR)

      Who are they?

    13. Therefore, the evaluation report of the public procurement procedure can be requested in accordance with the rules and procedures set out in this Regulation.

      This is an option that might be worth considering.

  3. Jul 2018
  4. May 2018
    1. I originally made comments through Hypothes.is, partly because the commenting system on the website had some problems. These have now been solved. To keep all comments in one place for the organizers, all comments below have now also been entered as copies in the website commenting system. For transparency and preservation, I am leaving the Hypothes.is comments up as well.

    2. Number of scientific projects on Github

      Number of GitHub projects archived on Zenodo

    3. Number of code projects with DOI

      Software citations in DataCite

    4. Please add here trends not currently captured above, and suggest possible indicators and sources for them.

      Only include indicators that are themselves open, so data can be reused and results can be reproduced

    5. Please add here trends not currently captured above, and suggest possible indicators and sources for them.
      • preregistrations (OSF, clinicaltrials.gov, aspredicted.org)
      • journals accepting preregistered reports
      • journals practicing open peer review (names published, reports published)
      • institutions with open science aspects included in T&P policies
      • institutions and publishers that are DORA signatories
      • publishers/journals that do not advertise the impact factor on journal or article webpages
      • journals accepting preprints
      • preprints published (Crossref, DataCite; both with caveats)
      • journals being transparent about the breakdown of APC costs [to be continued]
    6. % of journals with open code policy*

      Additional indicator: the number of publishers/journals that have adopted the TOP Guidelines (including the level of adoption and actual implementation where possible). Source: https://cos.io/our-services/top-guidelines/

    7. Number of Journals with policies on data sharing* (Source: Vasilevsky et al, 2017)

      Additional indicator: the number of publishers/journals that have adopted the TOP Guidelines (including the level of adoption and actual implementation where possible). Source: https://cos.io/our-services/top-guidelines/

  5. Mar 2018
  6. Feb 2018
    1. The VSNU has created an official response form for comments on this draft version: http://www.vsnu.nl/publieke-consultatie-nederlandse-gedragscode-wetenschappelijke-integriteit.html

      (see also: http://www.vsnu.nl/publieke-consultatie-nederlandse-gedragscode-wetenschappelijke-integriteit.html for more information about the consultation round).

      Anyone who additionally wants to make her/his response(s) publicly accessible can also post them here via Hypothes.is.

      (NB: this is an openly available web annotation tool, not an initiative of the VSNU or any of the other organizations involved.)

      Responses via Hypothes.is are part of the public domain (CC0).

    2. This draft version is intended for consultation.

      The VSNU has created an official response form for comments on this draft version: http://www.vsnu.nl/publieke-consultatie-nederlandse-gedragscode-wetenschappelijke-integriteit.html

      (see also: http://www.vsnu.nl/publieke-consultatie-nederlandse-gedragscode-wetenschappelijke-integriteit.html for more information about the consultation round).

      Anyone who additionally wants to make her/his response(s) publicly accessible can also post them here via Hypothes.is.

      (NB: this is an openly available web annotation tool, not an initiative of the VSNU or any of the other organizations involved.)

      Responses via Hypothes.is are part of the public domain (CC0).

  7. Oct 2017
    1. Tweets

      tweets do have an identifier

    2. Software tools

      (you've probably discussed this at length...) should these be together? Or should the tools themselves be under entity (with their code being an object)?

    3.  Research objects
      • research proposals
      • preregistrations
      • reviews
      • comments
      • translations
      • single observations
    4. RRID

      for publishers? (I probably should look into RRID more)

    5. Organizations

      GRID?

  8. Sep 2017
    1. the voice of scholarly publishing

      This is catchy-sounding, but STM represents one quite specific stakeholder group in scholarly publishing, and there are many other 'voices'.

    2. including any and all metadata

      Not clear if this refers to 'modification' or 'extraction'. If the latter, that will be an interesting aspect in and of itself.