71 Matching Annotations
  1. Jul 2018
    1. ‘we should be able to improve research if we reward scientists specifically for adopting behaviours that are known to improve research’.

      Very Skinnerian!

    2. We interpreted that several of the documents pointed to a disconnect between the production of research and the needs of society (i.e., productivity may lack translational impact and societal added value).

      Again, without a strong statement in the Principles of the Scholarly Commons regarding the compact between researchers and society, none of the principles or rules will make sense, as they don't address this issue.

    3. A burgeoning number of scientific leaders believe the current system of faculty incentives and rewards is misaligned with the needs of society and disconnected from the evidence about the causes of the reproducibility crisis and suboptimal quality of the scientific publication record.

      Good quote

  2. Apr 2018
    1. For example, funding agencies may be more interested in performance measures related to the translation of team research findings to practical applications, whereas team researchers may use the number of publications produced and the amount of grant funding obtained to gauge the success of a team science endeavor. In addition, the method of evaluation and metrics of success may vary at different points during the team research project. Short-term measures may include indicators of synergistic output, whereas long-term measures may be related to the impact of the research on the evolution of a discipline or the development of public policy.

      Complexities of measuring effectiveness of team science from different perspectives.

  3. Feb 2018
    1. 5. INNOVATION AND COMMERCIAL RESEARCH BASED ON THE USE OF FACTS, DATA, AND IDEAS SHOULD NOT BE RESTRICTED BY INTELLECTUAL PROPERTY LAW

      Should make sure this is prominently featured in the commons materials as it is an important point that CC-NC is not commons compliant.

    2. 3. LICENSES AND CONTRACT TERMS SHOULD NOT RESTRICT INDIVIDUALS FROM USING FACTS, DATA AND IDEAS Generally, licences and contract terms that regulate and restrict how individuals may analyse and use facts, data and ideas are unacceptable and inhibit innovation and the creation of new knowledge and, therefore, should not be adopted. Similarly, it is unacceptable that technical measures in digital rights management systems should inhibit the lawful right to perform content mining.

      Link to this in the Matrix. Critically missing in most discussions of open access.

  4. Oct 2017
    1. Similarly, references can be initially tagged as mixed citations, but at a later stage, Texture can connect to third party APIs (e.g. DataCite, CrossRef) and convert them into fully structured element citations.

      Very nice example; can we even do better than this?
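
      A rough sketch of the kind of lookup this implies, assuming the public CrossRef REST API, the requests library, and a DOI already extracted from the mixed citation (the field selection is illustrative, not Texture's actual implementation):

      ```python
      import requests

      def structure_citation(doi: str) -> dict:
          """Resolve a DOI via the public CrossRef REST API and return a few
          structured fields that could populate an element citation."""
          resp = requests.get(f"https://api.crossref.org/works/{doi}", timeout=10)
          resp.raise_for_status()
          work = resp.json()["message"]
          return {
              "doi": work.get("DOI"),
              "title": (work.get("title") or [None])[0],
              "container": (work.get("container-title") or [None])[0],
              "year": (work.get("issued", {}).get("date-parts") or [[None]])[0][0],
              "authors": [f"{a.get('family', '')}, {a.get('given', '')}"
                          for a in work.get("author", [])],
          }

      # Usage with any resolvable DOI, e.g. structure_citation("10.1038/sdata.2016.18")
      ```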

    1. Another reason APCs would rise is that the money flowing into the current system from outside the academic research community – i.e., journal subscriptions from industry – is estimated to be about 25 percent of the total. In a “pay-to-publish model,” systemic costs would need to be borne by the academic research community rather than shared with industry. This is because the costs of publishing in a gold OA system are covered entirely by those who publish articles – the academic research community – and not spread among consumers including the commercial sector, which accesses large amounts of research but publishes comparatively little. These two points have not been addressed in discussions to date but need to be worked through if gold open access is to be a viable, long-term solution globally.

      This is an interesting point that I have not heard raised before. Would like to have this verified.

    1. A way of conceptualizing our way out of a single provider solution by a powerful first-mover is to think about datasets as public resources, with attendant public ownership interests.

      Support for the commons, even with sensitive patient data.

  5. Aug 2017
    1. The commons is a collectively evolving protocol language for sharing scholarly information, and anyone can build on that protocol to the extent they wish.

      This is a very good definition, and one we should keep in mind.

    1. Both groups have committed themselves to “fair” open access principles, which, among other things, “strongly recommend” that journal owners be “fully” non-profit. “A for-profit company accountable only to shareholders”, the statement pointedly stresses, “is not compatible” with these principles.

      I'm not sure that this is entirely true, but the companies have to be accountable to the scholarly community first, and their shareholders second. That's why we've been trying to articulate core principles that any system should adhere to: scholarlycommons.org

    2. The first set of challenges, around sustainable funding for a non-profit infrastructure, has a viable answer: the key is to redirect the billions – even a fraction of those billions – that libraries currently spend on subscriptions to the new, scholar-run platforms. These dollars are crucial, too, to underwrite an OA future for the university presses and scholarly societies.
    3. Perhaps the steepest obstruction is our own well-earned cynicism. The university, with its audit culture and industry “partnerships”, is already so entangled with corporate values that its “non-profit” status strikes many of us as hollow. What difference does it make, by extension, to bring scholarly publishing back into the fold, when the “fold” itself is shot through with market thinking?

      Fighting for the heart and soul of scholarly communications.

    1. A strong version of moral rights even gives the author the right to retract a work from publication and to enjoin any further publication or duplication.

      But not in the scholarly commons, although I think we did recognize that this was a problem. The persistence requirement of FAIR means they cannot remove all traces of the work.

    2. Authors also have the right to not be attributed if they no longer wish to be associated with the work.

      Interesting, to Dan's point. But does the Commons allow this?

    3. Box 1. Layers of Copyrights in Databases

      Very useful information about copyrights and data

    4. Academic researchers and their offices of sponsored projects should carefully review drafts of sponsored research agreements and clinical trial agreements to ensure they do not inappropriately restrict a researcher’s right to disseminate the results of the scientific research they have conducted. A researcher should ensure that the agreements do not permit commercial sponsors to revise, delete, or suppress information generated by the researcher. The terms and timing of disclosing research results that are trade secrets should be incorporated into the sponsored research agreements, not negotiated at the time of publication

      Things to keep in mind when negotiating a research agreement.

    5. For example, if a researcher collaborates with a pharmaceutical company, the researcher may be contractually bound to suppress the release of research data until the sponsor has developed a patentable product.

      Commercial partnerships

    6. so long as the information has been subject to reasonable measures to keep it secret.

      Certainly a barrier to data publishing!

    7. First, the source of all intellectual property rights is national law. Certain international treaties harmonize intellectual property owners’ rights but leave the users’ rights to vary by country. Second, certain countries have added protection beyond what the treaties require. Specifically, the members of the European Union, candidate countries in Eastern Europe, Mexico [2], and South Korea have created a specialized database right that applies to certain databases created or maintained within their borders. These laws regulate uses of these databases only within their borders.

      Important to recognize that national laws may interfere with free exchange of data.

    8. legal uncertainty interferes with the productive reuse of research data.

      Good quote.

    1. Researchers are usually expected to obtain informed consent for people to participate in research and for use of the information collected. Where possible, consent should also take into account any future uses of data, such as the sharing, preservation and long-term use of research data. At a minimum, consent forms should not preclude data sharing, such as by promising to destroy data unnecessarily.

      Recommendations about informed consent.

    2. Research data—even sensitive and confidential data—can be shared ethically and legally if researchers pay attention, from the beginning of research, to three important aspects:
       • when gaining informed consent, include provision for data sharing
       • where needed, protect people's identities by anonymising data
       • consider controlling access to data

      Prepare ahead if you are collecting sensitive information.
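
      A minimal sketch of the "anonymise before sharing" step (the column names are hypothetical, and salted hashing is pseudonymisation rather than true anonymisation, so real de-identification would need more than this):

      ```python
      import csv
      import hashlib

      # Hypothetical column names; adjust to the actual data-collection schema.
      DIRECT_IDENTIFIERS = {"name", "email", "postcode"}
      SALT = "project-specific secret, kept out of the shared files"

      def pseudonymise(participant_id: str) -> str:
          """Replace a participant ID with a salted, one-way hash."""
          return hashlib.sha256((SALT + participant_id).encode("utf-8")).hexdigest()[:12]

      def prepare_for_sharing(in_path: str, out_path: str) -> None:
          """Write a copy of a CSV dataset with direct identifiers dropped and the
          (hypothetical) participant_id column pseudonymised."""
          with open(in_path, newline="") as src, open(out_path, "w", newline="") as dst:
              reader = csv.DictReader(src)
              fields = [f for f in reader.fieldnames if f not in DIRECT_IDENTIFIERS]
              writer = csv.DictWriter(dst, fieldnames=fields)
              writer.writeheader()
              for row in reader:
                  kept = {k: v for k, v in row.items() if k not in DIRECT_IDENTIFIERS}
                  kept["participant_id"] = pseudonymise(kept["participant_id"])
                  writer.writerow(kept)
      ```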

    1. At the core, we have a fundamental issue of what “counts”, and what counts will clearly depend on the community doing the counting. This is the central social and political issue on which disagreements on the status of preprints are based. We will never agree on a universal definition because communities naturally value different things

      In our discussions of the scholarly commons, we left out "credit and reward" for precisely this reason.

    2. But they all illustrate the same central point. There exists no universal standard of when an output is considered as part of the formal scholarly record. Rather, it is determined by particular groups in particular contexts

      Good quote for the Scholarly Commons section on credit as well

    3. On the origin of nonequivalent states: How we can talk about preprints [version 1; referees: 2 approved]

      Relevant for "Don't publish, release!"

  6. Jul 2017
  7. May 2017
    1. We propose to consider the following 'levels' for FAIRports, or actually Data Objects contained in them (in other words, one FAIRport could contain Data Objects with different 'levels of FAIRness') (see figure).
       Level 1: Each Data Object has a PID and intrinsic FAIR metadata (in essence 'static').
       Level 2: Each Data Object has 'user defined' (and updated) metadata to give rich provenance in FAIR format of the data, what happened to it, what it has been used for, can be used for etc., which could also be seen as rich FAIR annotations.
       Level 3: The Data Elements themselves in the Data Objects are 'technically' also FAIR, but not fully Open Access and not Reusable without restrictions (for instance Patient data or Proprietary data).
       Level 4: The metadata as well as the data elements themselves are fully FAIR and completely public, under well defined license. (Non-licensed data considered 'public' by their owner will still be excluded from integration projects by, for instance, Pharmaceutical companies.)

      We just switched our numbering for levels of compliance for the commons, with 1 = best and 4 = least compliant. I don't know if these are used anywhere else, but we should probably be consistent. Perhaps we should switch to a non-numbering system so as to avoid confusion.
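
      One way to avoid that confusion, sketched here purely as an illustration (the level names are invented and no agreed mapping to the commons decision trees is implied), is to pass around named levels rather than bare integers:

      ```python
      from enum import Enum

      class FairLevel(Enum):
          """Named FAIRness levels, so nothing depends on which end of a 1-4 scale is 'best'."""
          PID_AND_INTRINSIC_METADATA = "intrinsic"   # FAIRport level 1
          RICH_PROVENANCE_METADATA = "provenance"    # FAIRport level 2
          FAIR_BUT_RESTRICTED_ACCESS = "restricted"  # FAIRport level 3
          FULLY_FAIR_AND_PUBLIC = "public"           # FAIRport level 4

      # The FAIRport numbering quoted above (1 = least open, 4 = fully public)...
      FAIRPORT_NUMBER = {level: i for i, level in enumerate(FairLevel, start=1)}
      # ...runs opposite to the commons numbering (1 = best, 4 = least compliant);
      # the simple reversal below is only a placeholder, not an agreed mapping.
      COMMONS_NUMBER = {level: 5 - i for level, i in FAIRPORT_NUMBER.items()}
      ```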

    1. If you publish, you're a publisher. We are a global community of members, large and small, who publish scholarly content in all disciplines and in many formats. We define publishing broadly; your business model is your business.

      Cross map this in the Scholarly Commons

    1. That said, the default for maximal FAIRness should be that the data themselves are made available under well-defined conditions for reuse by others (panel E).

      level 2 in the commons decision trees

    2. The FAIR principles, although inspired by Open Science, explicitly and deliberately do not address moral and ethical issues pertaining to the openness of data

      Which is why the Commons explicitly includes Open as the first principle. It is exactly to address the moral and ethical issues.

  8. Apr 2017
    1. Is there any existing infrastructure organisation that satisfies our principles? ORCID probably comes the closest, which is not a surprise as our conversation and these principles had their genesis in the community concerns and discussions that led to its creation. The ORCID principles represented the first attempt to address the issue of community trust which have developed in our conversations since to include additional issues. Other instructive examples that provide direction include Wikimedia Foundation and CERN.
    2. Even with the best possible governance structures, critical infrastructure can still be co-opted by a subset of stakeholders or simply drift away from the needs of the community. Long term trust requires the community to believe it retains control. Here we can learn from Open Source practices. To ensure that the community can take control if necessary, the infrastructure must be “forkable.” The community could replicate the entire system if the organisation loses the support of stakeholders, despite all established checks and balances. Each crucial part then must be legally and technically capable of replication, including software systems and data.
    3. An organisation that is both well meaning and has the right expertise will still not be trusted if it does not have sustainable resources to execute its mission.
    4. If an infrastructure is successful and becomes critical to the community, we need to ensure it is not co-opted by particular interest groups. Similarly, we need to ensure that any organisation does not confuse serving itself with serving its stakeholders.
    5. Transparent operations – achieving trust in the selection of representatives to governance groups will be best achieved through transparent processes and operations in general (within the constraints of privacy laws).
    6. Stakeholder Governed – a board-governed organisation drawn from the stakeholder community builds more confidence that the organisation will take decisions driven by community consensus and consideration of different interests.
  9. Feb 2017
    1. Open access advocates might be concerned about some of these directions, but my sense is that many of these scientists and librarians remain largely focused on trying to compete with, or at least influence, scientific publishing.

      I think this is correct. The Scholarly Commons group wants to define a different model, in which the components that already exist form a coordinated ecosystem that supports open science. The question is, are we too late? Or can Elsevier just be one option?

    1. To enable this magic, an app that people can use to annotate regions in web pages is necessary but not sufficient. You also need an API-accessible service that enables computers to create and retrieve annotations. Even more fundamentally, you need an open web standard that defines how apps and services work not only with atomic resources named and located by URLs, but also segments of interest within them.

      Requirements to make this vision fully functional. Lays the foundation for the type of granular provenance that we're envisioning for next generation scholarship.
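
      The open web standard described here is presumably the W3C Web Annotation Data Model. A minimal sketch of what such an annotation looks like (the target URL and quoted text are invented for illustration), expressed as a Python dict:

      ```python
      import json

      # A single annotation in the W3C Web Annotation Data Model: it targets a
      # segment of interest inside a resource, not just the resource's URL.
      annotation = {
          "@context": "http://www.w3.org/ns/anno.jsonld",
          "type": "Annotation",
          "body": {
              "type": "TextualBody",
              "value": "Good quote for the Scholarly Commons.",
              "format": "text/plain",
          },
          "target": {
              "source": "https://example.org/article",   # the atomic resource
              "selector": {                               # the segment within it
                  "type": "TextQuoteSelector",
                  "exact": "when a measure becomes a target, it ceases to be a good measure",
              },
          },
      }

      print(json.dumps(annotation, indent=2))  # serialize for an annotation service's API
      ```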

  10. Oct 2016
    1. UC San Diego will transform California and a diverse global society by educating, generating and disseminating knowledge and creative works, and engaging in public service

      UCSD's mission

    2. The founders of UC San Diego aspired to establish an experimental campus, one that would define the future of education and research at a public university. The success of this vision is unparalleled: UC San Diego is recognized as one of the top public research universities in the country and one of the top twenty universities in the world.
  11. Sep 2016
    1. The simplest way to deposit a manuscript on bioRxiv is to upload a single PDF including the text and any figures/tables.

      Are PDFs FAIR? I would argue not; they are not particularly useful for machines.

    1. describe experimental work that is not performed in accordance with the relevant ethical standards for research using animals or human subjects.

      Open and inclusive requirements do NOT preclude the enforcement of ethical standards in research. All publishers of the commons must enforce these.

    2. Authors retain copyright and choose from several distribution/reuse options under which to make the article available (CC-BY, CC-BY-NC, CC-BY-ND, CC-BY-NC-ND, or no reuse). By posting on bioRxiv, authors explicitly consent to text mining of their work (e.g., by search engines or researchers).

      Not compliant with FAIR; again, suspect this is entirely due to publisher requirements.

    1. nor grant arXiv the right to grant any specific rights to others.

      Not fully reusable, probably because authors want to publish in journals. The issue of redistribution is unresolved. We know that being unable to redistribute products that use open-source software inhibits reuse of that software, but does the same hold for narrative works?

    1. arXiv is an openly accessible, moderated repository for scholarly papers in specific scientific disciplines.

      Do such things exist in the commons? Domain specificity is allowed but not as a condition of entering the commons. But content can be aggregated and enhanced, according to Madrid, on behalf of user communities.

    2. The endorsement system verifies that arXiv contributors belong to the scientific community in a fair and sustainable way that can scale with arXiv's future growth.

      Is this consistent with the principles of the Commons?

    1. 2. A fundamentally new approach towards optimal reuse of research data. Data sharing and stewardship is the default approach for all publicly funded research. This requires definitions, standards and infrastructures.

      Open by default

    1. No one can read everything.  We rely on filters to make sense of the scholarly literature, but the narrow, traditional filters are being swamped. However, the growth of new, online scholarly tools allows us to make new filters; these altmetrics reflect the broad, rapid impact of scholarship in this burgeoning ecosystem. We call for more tools and research based on altmetrics.

      Scholarly Commons: No reward system should be built into the commons, but the ability to track metrics is native to the Commons.

    1. We reaffirm the principle that only the intrinsic merit of the work, and not the title of the journal in which a candidate’s work is published, will be considered in appointments, promotions, merit awards or grants.

      Business models for the Commons.

    2. Community standards, rather than copyright law, will continue to provide the mechanism for enforcement of proper attribution and responsible use of the published work, as they do now.

      A good point for the Commons, and a good explanation of CC0: it doesn't mean that good scholarship can dispense with proper attribution.

    3. Open access is a property of individual works, not necessarily journals or publishers.

      Important point for the Commons.

    4. A complete version of the work and all supplemental materials, including a copy of the permission as stated above, in a suitable standard electronic format is deposited immediately upon initial publication in at least one online repository that is supported by an academic institution, scholarly society, government agency, or other well-established organization that seeks to enable open access, unrestricted distribution, interoperability, and long-term archiving (for the biomedical sciences, PubMed Central is such a repository).

      Very similar again to Berlin.

    5. The author(s) and copyright holder(s) grant(s) to all users a free, irrevocable, worldwide, perpetual right of access to, and a license to copy, use, distribute, transmit and display the work publicly and to make and distribute derivative works, in any digital medium for any responsible purpose, subject to proper attribution of authorship[2], as well as the right to make small numbers of printed copies for their personal use.

      Very similar to Berlin declaration, but covers printed works. Does the Scholarly Commons have anything to say about that? I think not.

    1. A complete version of the work and all supplemental materials, including a copy of the permission as stated above, in an appropriate standard electronic format is deposited (and thus published) in at least one online repository using suitable technical standards (such as the Open Archive definitions) that is supported and maintained by an academic institution, scholarly society, government agency, or other well-established organization that seeks to enable open access, unrestricted distribution, interoperability, and long-term archiving.

      Defines the roles of contributors and publishers in the Commons.

    2. Open access contributions must satisfy two conditions: The author(s) and right holder(s) of such contributions grant(s) to all users a free, irrevocable, worldwide, right of access to, and a license to copy, use, distribute, transmit and display the work publicly and to make and distribute derivative works, in any digital medium for any responsible purpose, subject to proper attribution of authorship (community standards will continue to provide the mechanism for enforcement of proper attribution and responsible use of the published work, as they do now), as well as the right to make small numbers of printed copies for their personal use.

      Definition of open access.

    3. Our mission of disseminating knowledge is only half complete if the information is not made widely and readily available to society.

      "Open by default" principle

  12. Aug 2016
    1. Important questions are (i) whether preprint submissions should be screened for adherence to scientific and ethical standards and (ii) how to handle data that raise ethical concerns or that contravene national policy or guidance.

      Something that the commons will have to consider.

    2. Posting preprints has the added benefit of democratizing the flow of information and making it available to all investigators across the globe, while allowing journals to make their own judgments of appropriateness and interest after peer review. Publicly available preprints provide an opportunity for authors to obtain feedback beyond the few scientists who see the manuscript during peer review.

      Benefits of the "don't publish, release" model envisioned by the Commons.

    1. In the future, the most valuable science institutions will be closely linked to the people and places whose urgent problems need to be solved; they will cultivate strong lines of accountability to those for whom solutions are important; they will incentivize scientists to care about the problems more than the production of knowledge. They will link research agendas to the quest for improved solutions — often technological ones — rather than to understanding for its own sake. The science they produce will be of higher quality, because it will have to be. The current dominant paradigm will meanwhile continue to crumble under the weight of its own contradictions, but it will also continue to hog most of the resources and insist on its elevated social and political status.
    2. The Artemis Project is different from science-as-usual in many ways. It is small, collaborative, and focused not on producing good science for its own sake, nor on making a profit, but on solving a problem. It takes its research agenda from patient advocates, not from scientists.

      Again, something for the Scholarly Commons.

    3. The beautiful lie insists that scientists ought to be accountable only to themselves. Marqusee’s advice to his staff was precisely the contrary: “Have no constituency in the research community, have it only in the end-user community.”

      Very relevant for the scholarly commons.

  13. Jul 2016
    1. But in fact, there were plenty of other researchers going into West Africa to do genetic sequencing. A lot of those scientists were simply waiting on prestigious journals to publish their findings.

      I cannot believe that a scientist would do this in the face of a global health crisis. Can anyone restore my faith in science?

    2. They don't coordinate with people fighting the epidemic on the ground — don't even share their discoveries for months, if ever.

      That is a crime, if you ask me.

    1. Notably, too, many did their landmark work in places that some might regard as off the beaten path of science (Alicante, Spain; France’s Ministry of Defense; Danisco’s corporate labs; and Vilnius, Lithuania). And, their seminal papers were often rejected by leading journals—appearing only after considerable delay and in less prominent venues. These observations may not be a coincidence: the settings may have afforded greater freedom to pursue less trendy topics but less support about how to overcome skepticism by journals and reviewers.

      This is a very important point for the Scholarly Commons.

    1. PLoS ONE supporters have a ready answer: start by making any core text that passes peer review for scientific validity alone open to everyone; if scientists do miss the guidance of selective peer review, then they can use recommendation tools and filters (perhaps even commercial ones) to organize the literature — but at least the costs will not be baked into pre-publication charges.

      The model promoted by the Scholarly Commons

  14. Jun 2016
    1. This is often summarized more pithily as “when a measure becomes a target, it ceases to be a good measure.”

      Good quote for the Commons.