1,755 Matching Annotations
  1. Apr 2017
    1. Data citations are formal ways to ground the research findings in a manuscript, upon their supporting evidence, when that evidence consists of externally archived datasets.

      Nice definition of data citation

  2. Mar 2017
    1. We propose the creation of an international coalition whose mission is to collectively support those core data resources deemed essential to the work of life science researchers, educators, and innovators worldwide. Through this coalition, funders of the life sciences should commit to the long-term shared responsibility to sustain the open access to core data resources because of their value to the global life science community and adhere to the oversight principles outlined above.

      Recommendation for international funding for sustainability of core data resources.

    1. Slides and posters can be shared on the Biocuratio

      Test

    2. 85. Repurpos.us: A fully open and expandable drug repurposing portal

      Uses Wikidata. Website is just a prototype but available.

    3. 43. NaviCom: A web application to create interactive molecular network portraits using multi-level omics data Inna Kuperstein, Maturin Dorel, Eric Viara, Emmanuel Barillot and Andrei Zinovyev
    1. Top tags 50

      Filter your annotations by tag. Or, search by any field or keyword in the search bar.

    2. ORCID: 0000-0002-8406-3871

      Add your ORCID to your Hypothesis profile

    3. Annotate the URL with a page note

    4. Annotations: 2565

      Take notes, curate, discuss, correct

    1. This study demonstrates for the first time that there exists an association between cytokines and cognition in patients with breast cancer, prior to any treatment.
    2. Results of the primary multivariable analyses and subsequent post-hoc analyses found that higher TNF, as measured by sTNF-RII, predicted reduced memory performance in patients.
    1. Yet there is widespread expectation that access to research data should be supported by the government or academic institutions and be free to the research community.

      Our current funding model presumes that the infrastructure for conducting research will reside at my University. But that is only mostly true. We also have to support national and international infrastructures. So I think that a portion of indirects needs to go to national infrastructure. My university has no problem charging me a "communications tax" on every one of my grants.

    2. However, with thousands of grants per year and data sets ranging from megabytes or less to terabytes or more, not all federally funded data can be realistically hosted within a public repository.

      An underappreciated point. When I served on study sections way back in the day when these digital repositories were being set up, it was clear that the agencies had no rational strategy for determining when a public repository was warranted.

    3. accelerate research discovery

      Perhaps

    4. In early 2008, Google announced that it would begin to support open-source scientific data sets. By the end of the year, the project was shut down for business reasons (7).

      Google Data reference

    5. Digital data are ephemeral, and access to data involves infrastructure and economic support. In order to support the downloading of data from federally funded chemistry experiments, astronomy sky surveys, social science studies, biomedical analyses, and other research efforts, the data may need to be collected, documented, organized in a database, curated, and/or made available by a computer that needs maintenance, power, and administrative resources. Access to data requires that the data be hosted somewhere and managed by someone. Technological and human infrastructure supporting data stewardship is a precondition to meaningful access and reuse, as “homeless” data quickly become no data at all.

      Data lifecycle in a nutshell

    6. So who pays for data infrastructure?

      The (literally) million dollar question

    1. To wit

      I would also add that when you delete the main comment, the replies remain. I'm assuming this will be fixed with moderation, in that a moderator could delete everything. But this has been a major pain for groups that are using H to comment on drafts on web pages. I think people are expecting a "Resolve" function that would delete replies too.

    1. data refereed?

      In all truth, I'm not sure what I meant by this. Recommend to remove.

    2. Were permissions obtained to share the data within regulatory requirements?

      Actually, this is now redundant given the mini-decision tree for deciding whether one has permission to share. So I think it should be: Have the data been prepared according to any regulatory requirements for making them open?

    3. Is it sensitive data that cannot be openly shared?

      Think this should be reworded or at least linked to an explanation: Are the data sensitive (i.e., will making the data open potentially harm research subjects or researchers, beyond concerns of tenure and promotion)?

    4. Scholarly Commons Subworking group

      Need to put a link to the working group. But for the attribution, how do we want to handle this? We have ORCIDs for people and IDs for organizations. What about working groups and task forces?

    5. it

      We are not consistent about referring to the specifically scholarly object. Recommend that we do so.

    6. No

      Could link to materials on expectations about sharing of physical samples, although the commons itself is digital.

    7. Do you have the rights to make it open?

      Feedback from Daniel: The question mark should be here. Also, it shouldn't be "it", but probably "data".

    1. no one country alone can afford to effectively monitor the entire Earth
    2. Dating back to formation of the International Meteorological Organization in 1873, nations have recognized the importance of sharing environmental observation data for the development of lifesaving weather forecasts in neighboring nations.

      Weather data sharing goes way back.

  3. Feb 2017
    1. Although tiling also occurs in human protoplasmic astrocytes, the spatial domains of these cells are more overlapping and the boundaries are not respected by processes from varicose and interlaminar projecting astrocytes ( Oberheim et al., 2009 and Sosunov et al., 2014).

      Well how about that.

    2. Astrocytes appear to remain in a restricted region tangential to their site of embryonic origin

      Different from oligodendrocytes

    1. However, a recent study using the severed axons of cultured peripheral sensory neurons showed that metabolically labelled, newly synthesized proteins are trafficked to the plasma membrane [44], thus suggesting that the functional equivalent of RER and Golgi apparatus exists in these axons.

      Is this the same case in dendrites for Golgi?

    2. These observations were consistent with an overlooked electron microscopy study from 1970

      That's great.

    3. axon as a passive transmitter of information

      When has an axon ever been characterized as a passive transmitter of information?

    4. Second, the ectopic presence of proteins in other parts of the cell during protein transport is avoided.

      Interesting idea

    1. We also found that axonal mRNA translation continues in adulthood, when regulators of neurotransmission and axon survival are locally translated.

      Adult

    2. It is not known, however, which mRNAs are axonally translated and which specific aspects of visual circuit assembly they affect.

      Again, they are talking about development.

    3. A rational interpretation of these results is that specific subsets of mRNAs are coordinately translated when required whereas most axonally localized mRNAs remain translationally repressed. Thus, to understand the function of axonal mRNA translation, it is important to carry out a comprehensive and unbiased global analysis of the mRNAs that are specifically translated in the axonal compartment in vivo.

      Consistent with ribosomes being excluded from the axon.

    1. stochastic optical reconstruction microscopy

      Make sure that this is in the methods ontology

    2. exclusively in presynaptic interneurons, most likely in axons.

      Most likely in axons?

    3. Despite decades of research, the molecular basis of long-term changes in neurotransmitter release remains unsolved.

      That's a remarkable statement.

    1. immunopanning

      Have not heard this term: "Immunopanning is essentially an immunoprecipitation (IP) of cells using an antibody immobilized to a solid surface, like a cell culture plate. Conventionally, an IP is performed using small agarose or magnetic beads (~50 to 150μm in size) conjugated to an antibody or Protein A/G, and can pull down individual proteins, protein complexes, and/or nucleic acid complexes." From Biology Stack Exchange

    2. These aberrant cells simultaneously displaying both astrocytic and microglial markers are surprising because such a phenotype has not been described previously in any neurodegenerative disease.

      I wonder if they are in SMA?

    3. months without undergoing replicative senescence.

      Like a tumor

    4. aberrant astrocyte-like cell (AbA cells)

      Add to NIF ontology

    5. Taken together, studies suggest a model in which mutant SOD1 induces primary motor neuron damage and distal axonopathy, making glial cells active contributors to neuronal loss, thereby driving disease progression

      When glia go bad...

    1. Open access advocates might be concerned about some of these directions, but my sense is that many of these scientists and librarians remain largely focused on trying to compete with, or at least influence, scientific publishing.

      I think this is correct. The Scholarly Commons group wants to define a different model where all these components that exist now define a different, coordinated ecosystem that supports open science. The question is, are we too late? Or can Elsevier just be one option?

    1. To enable this magic, an app that people can use to annotate regions in web pages is necessary but not sufficient. You also need an API-accessible service that enables computers to create and retrieve annotations. Even more fundamentally, you need an open web standard that defines how apps and services work not only with atomic resources named and located by URLs, but also segments of interest within them.

      Requirements to make this vision fully functional. Lays the foundation for the type of granular provenance that we're envisioning for next generation scholarship.
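
      A minimal sketch of what such a machine-readable annotation could look like, loosely following the W3C Web Annotation data model; the note text, target URL, and selector strings below are illustrative placeholders, and the service that would receive it is hypothetical rather than any specific API.

      ```python
      import json

      # A W3C Web Annotation-style payload (sketch): the target names both the whole
      # resource (its URL) and a segment of interest within it via a TextQuoteSelector.
      # All concrete values here are placeholders for illustration.
      annotation = {
          "@context": "http://www.w3.org/ns/anno.jsonld",
          "type": "Annotation",
          "body": {
              "type": "TextualBody",
              "value": "Granular provenance starts with addressable segments.",
              "format": "text/plain",
          },
          "target": {
              "source": "https://example.org/some-article",
              "selector": {
                  "type": "TextQuoteSelector",
                  "exact": "segments of interest",
                  "prefix": "but also ",
                  "suffix": " within them",
              },
          },
      }

      # An API-accessible service would accept a document like this via HTTP POST
      # (endpoint and authentication omitted here) and return it with an identifier
      # that other computers can later retrieve.
      print(json.dumps(annotation, indent=2))
      ```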

  4. Jan 2017
    1. The potential of BBD is hindered (similar to that of money) when it accumulates, but does not flow.

      Love this quote!

    2. Currently, a full video record of a rodent's lifespan (~24 months) in a standard cage (millimeter resolution), at VGA resolution, day and night (infrared LED illumination) and lossy, yet sufficient, compression, can be acquired with a $50 webcam and stored on a hard drive.

      A standard life span in a very small and artificial space.
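
      As a rough sanity check on the storage claim above, here is a back-of-the-envelope Python sketch; the ~250 kbit/s average bitrate for heavily compressed VGA video is my assumption, not a figure from the article.

      ```python
      # Back-of-the-envelope storage estimate for continuous VGA video over a
      # rodent lifespan (~24 months). The bitrate is an assumed value for lossy,
      # low-quality VGA footage, not one taken from the paper.
      months = 24
      seconds = months * 30 * 24 * 3600      # ~62 million seconds of recording
      bitrate_bps = 250_000                  # assumed average bitrate, bits per second

      total_bytes = bitrate_bps * seconds / 8
      print(f"~{total_bytes / 1e12:.1f} TB")  # roughly 2 TB: fits on one commodity hard drive
      ```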

    3. Fortunately, driven by consumer interest in documenting the behavior of their children, cats and extreme sports mishaps, this technology has become much more accessible.

      Thank you cats (and children)

    4. It is the unifying space in which genes, neural structure, neural function, body plan, physical constraints and environmental effects converge.

      Very poetic!

    5. natural selection acts on behavior

      Very important point.

    6. In their pursuit of a tractable problem, neuroscientists have tended to reduce the complexity of behavior by favoring highly constrained experimental preparations that allow them to focus on the complexity of the brain itself.

      Agreed. They ignore what the circuits they study are designed to do.

    1. to fight for LA

      LA doesn't even know you are coming and I suspect won't care when it finds out. Big mistake.

    1. The take-home message I want to convey here is that we should not forget that when humans learn, you do not need to feed thousands of training examples, you just feed 3, or even 1, and you have a new concept or category.

      Same is true of animals. This occurred to me in graduate school when scientists were doing lesion studies on memory where an animal had to have hundreds of training trials to learn a task. In nature, you don't get hundreds of trials. At the same time, I had a new puppy. He scratched on the door and I let him in and that was all it took for him to do it again.

    1. Third, inhibitory projections to pyramidal neurons in mature neocortex appeared to be different from the developing neocortex. In this study, we did not identify any connection from BPCs to L23 or L5 pyramidal neurons or from BTCs to L5 pyramidal neurons (Figs. 4 and 6C and fig. S14), but these types of connections have been frequently identified in the developing neocortex (5)

      Wow

    2. Interneuron-targeting interneurons also received very little input from local pyramidal neurons and specific inhibitory inputs, very different from other groups of interneurons (Fig. 5, C and D). ISIs thus may act primarily as disinhibitors of the local cortical microcircuit and may be primarily controlled by long-range inputs.

      Interesting

    3. Maturation of GABAergic interneurons takes longer than for pyramidal cells, and their continuous development throughout adolescence into adulthood often further obscures our understanding

      I didn't realize that. But as Gordon said a long time ago, just a consideration of their diversity tells you where nature put her emphasis.

    4. Despite specific connection patterns for each cell type, we found that a small number of simple connectivity motifs are repeated across layers and cell types defining a canonical cortical microcircuit.

      the canonical circuit

    1. Figure 2.

      <40% Source?

    2. In fact, we find that, in supragranular cortical layers, 5-HT3AR interneurons are the predominant interneuron population.

      Interesting.

    3. Gonchar et al., 2007

      Read this

    4. (McBain and Fisahn, 2001;

      Read this

    5. Activation of both excitatory and inhibitory neurons establishes a simple disynaptic circuit that provides powerful, local feedforward inhibition because of stronger and faster TC excitation to FS interneurons than to excitatory neurons, which are mainly spiny stellate and star pyramidal neurons

      Again, something to consider.

    6. TC afferents directly contact both excitatory and inhibitory neurons in L4 and deep L3 in the cortex

      This is usually ignored in the canonical circuit.

    7. We used a mouse line expressing EGFP under the control of the 5Htr3aR-promoter (5HT3aR-BACEGFP ) provided by the GENSAT project at Rockefeller University.

      Should insert an RRID here.

    8. In fact, nearly all interneurons that do not express parvalbumin (PV) or somatostatin (SST) are 5-HT3AR expressing.

      So not basket neurons?

    9. Buzsáki et al., 2004)

      Read this.

    10. The profound influence of these neuromodulators on the cortex, likely in part stems from their preferential targeting of inhibitory cortical interneurons releasing GABA (Beaulieu and Somogyi, 1991; Smiley and Goldman-Rakic, 1996).

      I didn't realize that.

    1. Prospective data

      I find that positioning the cursor right over the text, particularly at the end of the text, and then selecting, lets me annotate.

    2. No

      You can annotate a hyperlinked box but it takes some skill

    3. Do you have the rights to make it open?

      I want to modify this section to include an "info box" that explains who has to give permission before it can be released: permission from colleagues? Permission from institution? Permission from regulatory agencies? Permission from human subjects?

    4. What type of data will you be working with?

      I will rephrase this as a Y/N question: Do the data already exist?

  5. Dec 2016
    1. Soaking the foot in a hypertonic solution once at the time an abscess is drained may also have some merit. However, in order for any solution to draw infection or clean up the affected area, the solution must be hypertonic which can be achieved by saturating it with a salt. To make a saturated solution, MgSO4 (Epsom salts) is added to hot water until no more salt can be dissolved.

      So soaking initially is a good thing.

    2. Whichever poultice is used, it must be held in place with a bandage. An ideal foot bandage is a medium-sized disposable diaper covering the enclosed medication. For more padding, use multiple diapers. For a sweating effect, use plastic-covered diapers and duct tape. For more breathing, use non-plastic covered diapers and gauze bandage.

      Instructions. No mention of duct tape!

    3. The poultice provides a warm moist hydroscopic environment which stays in contact with the foot twenty-four hours a day but does not have the detrimental effects of continuous soaking.

      Support for goo

    4. Chronic foot soaking for an abscess can actually prolong the healing process. In many cases, the pocket resulting from the accumulation of exudate from the abscess will be prevented from draining and drying up as the softened structures of the chronically soaked foot compress the affected area.

      Interesting; according to the rest of the article, there is little research on its therapeutic benefits.

    1. Practices and principles of notice and consent continue to be necessary, but are no longer sufficient. Campus data governance mechanisms must address the uses and reuses of data about individuals in our community

      Main point of paper.

    2. Few “big data” predictive analytics run afoul of current law and policy, for example, even as the expectations of our community may delineate different boundaries.

      Is being "uncomfortable" enough to govern policy?

    3. One must first discover the existence of a database or other store of data. Discovery may require a directory or index of campus data resources. No such directory currently exists, and to create and maintain one is a complex and expensive process. If individuals inside or outside the University can determine easily where information about individuals exists, that directory itself becomes a target for cyberattack, for legal discovery, and for all manner of research and administrative questions. Conversely, if no inventory of available data resources on campus exists, it is difficult to draw boundaries around what data must be governed.

      This really is a problem for open data that is even remotely sensitive. Centralization increases efficiency but also vulnerability.

    4. information entrusted

      This is an interesting word in the context; to what extent is the university entrusted with much of the type of data we are talking about here?

    5. These properties characterize the need for governance mechanisms focused on appropriate use of data rather than on particular data elements or owners, thus assuring that University principles are applied to data governance.

      Agree. Can't really break this down.

    6. Data that may not appear to be sensitive at the time of collection, such as student traffic to a course website, may become extremely rich when combined with other data such as a student’s grades, medical records, library usage, food purchases, and social media habits.

      Basically, impossible to predict the value of any bit of data in the era of big data analytics.

    7. Opting out of data collection is rarely an option in university environments, because access to basic university and public services depends upon myriad uses of data for teaching, learning, library services, travel, personnel, payroll, recreation, and well beyond.

      Plus students may not feel free to do so.

    8. and consent, that data collected for one purpose should not be used for another purpose without the express permission of the person who is the data subject.

      This is where we run into trouble these days, particularly with open data.

    9. Yet a fundamental concern that led to the creation of this Task Force is the growing number of data-generating activities involving our students, faculty, and staff that fall outside of IRB purview.

      Scope: this type of data falls through current regulations.

    10. whether or not those data meet any existing legal or policy definition of personally identifiable information (PII).

      It's not all about PII

    11. Initial set of principles by which governance decisions can be considered, a UCLA data governance structure, and necessary surrounding processes.

      Purpose of report

    12. “Big data” analytics and the routine sharing of data with a diverse array of third parties have accelerated concerns for reconsidering our policy, funding, and technology frameworks for the appropriate use and protection of data

      Current academic data challenges

    1. Moreover, each of the above-mentioned tools can be used by some brain scientists, but most tools are designed for data scientists, so the learning curve can be incredibly steep.

      Very true.

    2. We are proposing to design, build, and deploy an instance of “cloud neuroscience,” meaning that the data, the code, and the analytic results all live in the cloud together. Cloud neuroscience can be thought of as an operating system, a set of programs that run on it, a file system that stores the data, and the data itself, all designed to run in a scalable fashion and to be accessible from anywhere.

      Looks very similar in intent to the NIH Data Commons.

    1. The editor looks at the title of the paper and sends it to two friends whom the editor thinks know something about the subject. If both advise publication the editor sends it to the printers. If both advise against publication the editor rejects the paper.

      That's where the columnist got this. But it wasn't correctly presented.

    2. The final step was, in my mind, to open up the whole process and conduct it in real time on the web in front of the eyes of anybody interested. Peer review would then be transformed from a black box into an open scientific discourse. Often I found the discourse around a study was a lot more interesting than the study itself. Now that I have left I am not sure if this system will be introduced.

      Good quote to have for Hypothes.is

  6. Nov 2016
    1. None of the control regions showed this degree of preservation—the thickness of all of them in superagers was intermediate between young adults and typical older adults, or what we call partial preservation.

      Interesting (if true)

    1. Ghost authors: This phrase is used in two ways. It usually refers to professional writers (often paid by commercial sponsors) whose role is not acknowledged. Although such writers rarely meet ICMJE criteria, since they are not involved in the design of studies, or the collection or interpretation of data, it is important to acknowledge their contribution, since their involvement may represent a potential conflict of interest. The term can also be used to describe people who made a significant contribution to a research project (and fulfil the ICMJE criteria) but are not listed as authors. The ICMJE guidelines clearly condemn this practice and state that ‘All persons designated as authors should qualify for authorship, and all those who qualify should be listed.’
    1. These reasonable concerns aside, pre-registration does present a means for reducing the frequency at which exploratory research masquerades as having an a priori design

      Benefits of pre-registration

    1. - In this paper, we have

      Is something missing here?

    2. “Effect size and confidence interval”.

      Hypothes.is has direct linking, so you could technically embed a link to the relevant section.

    1. For example, advisors and administrators used an analysis of courses in which students consistently performed poorly to target supplemental instruction, and to inform a redesign of introductory math courses. GSU has also developed a program targeting small, conditional grants to students who are highly likely to complete a degree but would be forced to drop out for a semester because of a small amount due to the bursar.

      Nice use of data to improve outcomes, particularly the latter.

    2. Although rarely as sophisticated as Ithaca’s efforts, such review of social media information is quite common: a 2015 Kaplan Test Prep Survey of 397 admissions officers found that 40 percent of admissions officers visit applicants’ social media profiles, often to verify information presented on their applications

      Interesting statistic

    3. Some institutions, especially community colleges, do not have their own IRB and therefore rely on approval by their president or defer to collaborators instead.

      Hadn't really thought of that, but probably should have.

    4. Even when they are known, most education researchers do not have the computer science expertise required to implement them. It is also the case that significant computing power, not accessible to every researcher, is needed to process massive datasets.

      Big data and data science challenge

    5. For example, it is frequently difficult for researchers to gain access to datasets collected by other researchers or institutions for other purposes.

      Or even, apparently, at their own.

  7. Oct 2016
    1. one to supply their salary

      At some point, we have to examine the role of soft money and indirect costs on this entire structure.

    2. knows the pressures are largely self-generated.

      Yes, that is the sad fact. Much of what ails science is entirely self-generated.

    3. but from chasing an ideal of what makes a good scientist

      Yes. People are expected to achieve the mythic ideal of the scientist, particularly in terms of individual contributions.

    4. Old scientists

      Why I want to establish the "knowledge trust" to buy out the contracts of older scientists so they can mentor younger scientists and not compete with them.

    5. today’s senior investigators experienced a more comfortable trajectory in science

      I would say that the bulge of the baby boom certainly did; those of us at the tail and after, not so much.

    6. “Frankly, the job of being a principal investigator and running a lab just looks horrible,” wrote one neuroscientist from the United States.

      And that is really sad

    7. with little administrative support

      Pretty much no administrative support. Same with many senior scientists.

    8. “The number of people is at an all-time high, but the number of awards hasn’t changed,”

      Bingo. I knew that doubling the NIH budget was a huge mistake; completely mishandled.

    9. But are young scientists whining — or drowning?

      A bit of both, I suppose.

    1. “There has never been more arrogance and smugness” than in today’s self-congratulatory scientific culture, he asserts.

      I would also agree with that statement.

    2. Silicon Valley entrepreneur Roman Ormandy, for example, has criticized the brain-as-processor model. “The more neural research progresses, the clearer it becomes that brain is vastly more complex than we thought just a few decades ago,” Ormandy has noted.

      I think so.

    3. If my mind is running on another computer, it is no longer me.”

      Great quote. And what about the microbiome?

    1. UC San Diego will transform California and a diverse global society by educating, generating and disseminating knowledge and creative works, and engaging in public service

      UCSD's mission

    2. The founders of UC San Diego aspired to establish an experimental campus, one that would define the future of education and research at a public university. The success of this vision is unparalleled: UC San Diego is recognized as one of the top public research universities in the country and one of the top twenty universities in the world.
    1. Our review found that 4% of the 271 journal articles assessed did not report the number of animals used anywhere in the methods or the results sections [5].

      That's actually better than I would have thought.

  8. Sep 2016
    1. MF: Comparative Political Studies conducted this pilot as a special issue of the journal devoted to results-free review. As special issue editors, we understand that the current editors do not intend to make results-free review a mandatory practice, nor will it be allowed as a regular manuscript submission track. The Journal of Experimental Political Science plans to make results-free review a standing manuscript submission track (see forward to Lin and Green 2016 here).  

      But I'm wondering whether editors could help here by deliberately soliciting negative results in some areas, that is, actively ferreting out the other 99 studies through special issues.

    2. Second, contrary to fears that greater emphasis on transparency creates more incentives for clever research designs and methodological perfection, reviewers placed an overwhelming emphasis on theoretical consistency and substantive importance. In this regard, results-free review worked better than we could have hoped in incentivizing theory and research design over narrow concerns about novelty of methodology or empirical causal identification. Atheoretical work stood very little chance of publication in our pilot: referees consistently asked for meaningful tests of theoretically important hypotheses.

      This is a very interesting result. In my own journal, Brain and Behavior, where I try very hard to publish negative results when they are submitted, I had to do something similar to avoid 8000 papers on "this gene is not involved in that". The negative results had to be theoretically grounded (or what passes for theory in biomedicine).

    1. Likewise, Cambridge mathematician Tim Gowers argues that researchers should get recognition for advancing science broadly through informal idea sharing — rather than only getting credit for what they publish. "We’ve gotten used to working away in private and then producing a sort of polished document in the form of a journal article," Gowers said. "This tends to hide a lot of the thought process that went into making the discoveries. I'd like attitudes to change so people focus less on the race to be first to prove a particular theorem, or in science to make a particular discovery, and more on other ways of contributing to the furthering of the subject."

      The argument for credited annotations.

    1. The simplest way to deposit a manuscript on bioRxiv is to upload a single PDF including the text and any figures/tables.

      Are PDFs FAIR? I would argue not. Not particularly useful for machines.

    1. describe experimental work that is not performed in accordance with the relevant ethical standards for research using animals or human subjects.

      Open and inclusive requirements do NOT preclude the enforcement of ethical standards in research. All publishers of the commons must enforce these.

    2. Authors retain copyright and choose from several distribution/reuse options under which to make the article available (CC-BY, CC-BY-NC, CC-BY-ND, CC-BY-NC-ND, or no reuse). By posting on bioRxiv, authors explicitly consent to text mining of their work (e.g., by search engines or researchers).

      Not compliant with FAIR; again, suspect this is entirely due to publisher requirements.

    1. nor grant arXiv the right to grant any specific rights to others.

      Not fully reusable, probably because authors want to publish in journals. The issue of redistribution is unresolved. We know that being unable to redistribute products that use open-source software inhibits reuse of that software, but does the same hold for narrative works?

    1. arXiv is an openly accessible, moderated repository for scholarly papers in specific scientific disciplines.

      Do such things exist in the commons? Domain specificity is allowed but not as a condition of entering the commons. But content can be aggregated and enhanced, according to Madrid, on behalf of user communities.

    2. The endorsement system verifies that arXiv contributors belong to the scientific community in a fair and sustainable way that can scale with arXiv's future growth.

      Is this consistent with the principles of the Commons?

    1. 2. A fundamentally new approach towards optimal reuse of research data. Data sharing and stewardship is the default approach for all publicly funded research. This requires definitions, standards and infrastructures.

      Open by default

    1. When reporting: › published information: please provide the complete reference (authors lists with initials, journal, volume, page range and year) or a PubMed identifier › unpublished information: please provide the name of the people associated with a particular item of information Note that you don't have to be the original author to submit us updates, and can therefore report anybody's work.

      Is UniProt using ORCID identifiers?

    1. The UniProt Consortium UniProt: a hub for protein information Nucleic Acids Res. 43: D204-D212 (2015)

      Would be good to participate in the Data Citation Pilot Project and the Resource Identification Initiative.

    1. New South Wales (NSW)

      Demonstration of inserting a figure:

    2. However, there has been surprisingly little research exploring the actual use of primary healthcare services around the time of hospitalisation, which requires linkage of primary care and hospital data for individuals.

      Well, that is surprising.

    3. Setting 266 950 participants in the 45 and Up Study, New South Wales (NSW) Australia

      Demonstration of annotation

    1. No one can read everything.  We rely on filters to make sense of the scholarly literature, but the narrow, traditional filters are being swamped. However, the growth of new, online scholarly tools allows us to make new filters; these altmetrics reflect the broad, rapid impact of scholarship in this burgeoning ecosystem. We call for more tools and research based on altmetrics.

      Scholarly Commons: No reward system should be built into the commons, but the ability to track metrics is native to the Commons.

    1. We reaffirm the principle that only the intrinsic merit of the work, and not the title of the journal in which a candidate’s work is published, will be considered in appointments, promotions, merit awards or grants.

      Business models for the Commons.

    2. Community standards, rather than copyright law, will continue to provide the mechanism for enforcement of proper attribution and responsible use of the published work, as they do now.

      A good point for the Commons. Good explanation of CC0, i.e., it doesn't mean that good scholarship can skip proper attribution.

    3. Open access is a property of individual works, not necessarily journals or publishers.

      Important point for the Commons.

    4. A complete version of the work and all supplemental materials, including a copy of the permission as stated above, in a suitable standard electronic format is deposited immediately upon initial publication in at least one online repository that is supported by an academic institution, scholarly society, government agency, or other well-established organization that seeks to enable open access, unrestricted distribution, interoperability, and long-term archiving (for the biomedical sciences, PubMed Central is such a repository).

      Very similar again to Berlin.

    5. The author(s) and copyright holder(s) grant(s) to all users a free, irrevocable, worldwide, perpetual right of access to, and a license to copy, use, distribute, transmit and display the work publicly and to make and distribute derivative works, in any digital medium for any responsible purpose, subject to proper attribution of authorship[2], as well as the right to make small numbers of printed copies for their personal use.

      Very similar to Berlin declaration, but covers printed works. Does the Scholarly Commons have anything to say about that? I think not.

    1. Open access contributions include original scientific research results, raw data and metadata, source materials, digital representations of pictorial and graphical materials and scholarly multimedia material.

      Definition of research objects

    2. A complete version of the work and all supplemental materials, including a copy of the permission as stated above, in an appropriate standard electronic format is deposited (and thus published) in at least one online repository using suitable technical standards (such as the Open Archive definitions) that is supported and maintained by an academic institution, scholarly society, government agency, or other well-established organization that seeks to enable open access, unrestricted distribution, interoperability, and long-term archiving.

      Defines the roles of contributors and publishers in the Commons.

    3. Open access contributions must satisfy two conditions:The author(s) and right holder(s) of such contributions grant(s) to all users a free, irrevocable, worldwide, right of access to, and a license to copy, use, distribute, transmit and display the work publicly and to make and distribute derivative works, in any digital medium for any responsible purpose, subject to proper attribution of authorship (community standards, will continue to provide the mechanism for enforcement of proper attribution and responsible use of the published work, as they do now), as well as the right to make small numbers of printed copies for their personal use.

      Definition of open access.

    4. Our mission of disseminating knowledge is only half complete if the information is not made widely and readily available to society.

      "Open by default" principle

  9. Aug 2016
    1. Posting of units of information smaller than research papers should be encouraged, as long as the data reporting allows others to replicate the work. The value of sharing review articles and commentaries as preprints is less clear.

      Negative results.

    2. Important questions are (i) whether preprint submissions should be screened for adherence to scientific and ethical standards and (ii) how to handle data that raise ethical concerns or that contravene national policy or guidance.

      Something that the commons will have to consider.

    3. However, the majority of attendees felt that peer review orchestrated through journals still plays a valuable role in providing in-depth analysis and scrutiny. Surveys also show that many researchers believe that their own work often improves through the peer-review process (7).

      Two-tier system. No reason not to have two rounds of review.

    4. Benefits.

      Didn't see the publishing of negative results as a benefit, but I think it would be.

    5. Next, preprints provide reviewers with an opportunity to see—in real time—reactions from the community and how the researcher responds to these

      Very critical, then, that preprints provide for feedback.

    6. Preprints enable reviewers to assess an applicant's ideas by scrutinizing the research findings, rather than using the journal name (or its impact factor) as a proxy for quality. Funders are keen to uphold the principle that funding decisions should be based on the merit of the research.

      Very true.

    7. They have taken a step toward embracing a new culture of science communication by posting a preprint this year, most of them for the first time.

      Leading by doing. The question is whether they will continue to do so or whether this is a one-off.

    8. Some suggested that the archive could be flooded with weak papers meant only to assert priority. But decades of experience have demonstrated that scientists do not post poor-quality work on arXiv because of the impact on their reputations; we expect professional biologists to behave similarly.

      We are so concerned about weak science getting out there; but our current model leads to plenty of weak science getting out there.

    9. Posting preprints has the added benefit of democratizing the flow of information and making it available to all investigators across the globe, while allowing journals to make their own judgments of appropriateness and interest after peer review. Publicly available preprints provide an opportunity for authors to obtain feedback beyond the few scientists who see the manuscript during peer review.

      Benefits of the "don't publish, release" model envisioned by the Commons.

    1. These differences between lots and the possibility that the lot that we used was subject to precipitation during the experiment raises the possibility that the differences between our results and those of Erschbamer et al. are due to differences between lots of PD168393. For example, it is possible that the compound precipitated somewhat in our experiments, perhaps clogging the intra-spinal cannulae whereas the lot used by Erschbamer did not precipitate. Erschbamer et al. do not indicate the lot number for the PD168393 that they used in their paper, but subsequent discussions reveal that the lot number for the original study was B61311. According to the manufacturer, the differences in the numbering indicate that lot B61311 and D00069257 were synthesized at different times, whereas lot D00069257 and D00078517 were synthesized at the same time but packaged at different times. Unfortunately, the company has been unable or unwilling to provide material from previous batches, so it has not been possible to carry out direct head-to-head comparisons between different lots.

      Importance of providing more complete descriptions of reagents used.

    2. There are major differences in the physicochemical properties of commercially available EGFR inhibitors.
    1. “There’s not a lot of research on how best to socially, emotionally, environmentally, support Alzheimer’s patients, that might ameliorate their own anxiety, their own stress — maybe the disease, as horrible as it is, would be less horrible through a better care structure, but we do very little research on that.”

      I think so. Or there may be approaches, like Virtual Reality, that let them live in the world they think they live in.

    2. In the future, the most valuable science institutions will be closely linked to the people and places whose urgent problems need to be solved; they will cultivate strong lines of accountability to those for whom solutions are important; they will incentivize scientists to care about the problems more than the production of knowledge. They will link research agendas to the quest for improved solutions — often technological ones — rather than to understanding for its own sake. The science they produce will be of higher quality, because it will have to be. The current dominant paradigm will meanwhile continue to crumble under the weight of its own contradictions, but it will also continue to hog most of the resources and insist on its elevated social and political status.
    3. The Artemis Project is different from science-as-usual in many ways. It is small, collaborative, and focused not on producing good science for its own sake, nor on making a profit, but on solving a problem. It takes its research agenda from patient advocates, not from scientists.

      Again, something for the Scholarly Commons.

    4. So Visco and her colleagues decided that NBCC would shoulder that burden and start managing the science itself. “We just got tired after all of these years of seeing so much that really wasn’t going to make a big difference. We have no interest in the continued funding of these things.”

      Similar to the Spinal Muscular Atrophy Foundation.

    5. In short, he made his research accountable to the end-user rather than to his scientific peers.

      Great story.

    6. The beautiful lie insists that scientists ought to be accountable only to themselves. Marqusee’s advice to his staff was precisely the contrary: “Have no constituency in the research community, have it only in the end-user community.”

      Very relevant for the scholarly commons.

    7. The scientific knowledge necessary to solve these sorts of problems would never be spontaneously generated by “the free play of free intellects.”

      Research for a purpose

    8. If people expect scientific research — even basic, long-term research — to contribute to a larger goal, there must be some mechanism of accountability for driving it toward that goal

      Which funding agencies like NIH have never attempted to do, for the most part.

    9. “The data that we now generate overwhelm our abilities of interpretation, and the attempts of the new discipline of ‘systems biology’ to address this shortfall have to date produced few insights into cancer biology beyond those revealed by simple, home-grown intuition.”

      That's quite a remarkable statement.

    10. given the limits of predictive science to solve problems in complex natural phenomena

      That is the key

    11. Especially through mapping of the human genome, which pretty much everyone agrees has been an incredibly powerful catalyst for scientific research while generating unfathomably huge amounts of data but, at best, yielding modest health care benefits.

      When the measure of success is how many more papers get published, then we see many projects as successful.

    12. The difficulty with this way of doing science is that for any large body of data pertaining to a complex problem with many variables, the number of possible causal links between variables is inestimably larger than the number a scientist could actually think up and test. For example, researchers have identified more than a hundred variables that may influence obesity, from genes to education to job stress to how fast you eat to whether you were breastfed.

      The problem with big data unconstrained.

    13. Sometimes the problem is not that it is hard to come up with facts, but that it is all too easy. This is why science almost never provides a solution to politically controversial issues. Usually it does the opposite, providing peer-reviewed and thus culturally validated truths that can be selected and assembled in whatever ways are necessary to support the position and policy solution of your choice. If this observation seems implausible or overstated, consider that after forty years of research on the risks and benefits of mammograms, their effectiveness is more contested than ever; similarly, after more than twenty-five years and $15 billion of research to assess the safety of the Yucca Mountain nuclear waste repository site in Nevada, nothing has resulted beyond political gridlock. In neither case does the science add up to a unitary truth. What we have, instead, is trans-science that “weaves back and forth across the boundary between what is and what is not known and knowable.”

      Sad but true

    14. If both scientific research and political debates over such questions seem to drag on endlessly, surely one reason is that we have the wrong expectations of science. Our common belief is that scientific truth is a unitary thing — there is one fact of the matter, that’s why the light always goes on when I flip the switch. But trans-scientific questions often reveal multiple truths, depending in part on what aspects of an issue scientists decide to do research on and how they go about doing that research.

      Could also be said of science, but at least there is an answer there. These larger questions are almost always contextual.

    15. In fact, the great thing about trans-science is that you can keep on doing research; you can, as Fitzpatrick says, create “the sense that we’re gaining knowledge when we’re not gaining knowledge,” without getting any closer to a final or useful answer.

      That is scary.

    16. Fitzpatrick thinks that such reasoning is not justified when using mouse brains to model human neurodegenerative disease.

      The lack of successful translation into treatments supports this.

    17. But in the meantime, the paper has been cited in about 500 other papers, many of which may have been cited multiple times in turn. In

      Argument for the reproducibility channel

    18. it got more and more money,

      No article about what has gone wrong with science has yet addressed the effects of indirect costs on driving this entire process.

    19. “science is so large that many of us begin to worry about the sheer mass of the monster we have created.” In his book Little Science, Big Science (1963), Price noted presciently that the number of scientists was growing so fast that it could only lead to a “scientific doomsday” of instability and stress, and that exponential growth of the scientific enterprise would bring with it declining scientific originality and quality, as the number of truly great scientists was progressively drowned out by the much more rapidly increasing number of merely competent ones.

      Prescient, but elitist as usual. Competent scientists play a role here.

    20. scientific publishing industry exists not to disseminate valuable information

      A little harsh. It is actually publishers now (admittedly to find a new business model) that are pushing for more data to be published.

    21. The professional incentives for academic scientists to assert their elite status are perverse and crazy, and promotion and tenure decisions focus above all on how many research dollars you bring in, how many articles you get published, and how often those articles are cited in other articles.

      Really good synopsis follows.

    22. You will need forty hours a week to perform teaching and administrative duties, another twenty hours on top of that to conduct respectable research, and still another twenty hours to accomplish really important research.... Make an important discovery, and you are a successful scientist in the true, elitist sense in a profession where elitism is practiced without shame.... Fail to discover, and you are little or nothing.

      Good quote

    23. But if there is nothing by which to measure scientific progress outside of science itself, how can we know when our knowledge is advancing, standing still, or moving backwards?

      It is what allowed the merit system in science to be corrupted.

    24. Vannevar Bush’s beautiful lie makes it easy to believe that scientific imagination gives birth to technological progress, when in reality technology sets the agenda for science, guiding it in its most productive directions and providing continual tests of its validity, progress, and value. Absent their real-world validation through technology, scientific truths would be mere abstractions.
    25. If, as Visco says, “at some point you really have to save a life,” it will be a technology, perhaps a vaccine or drug, that does the job.

      I tend to agree with this. It is already.

    26. The National Science Foundation funded basic research into this new, technology-created realm, including grants to two graduate students in computer science at Stanford University who wanted to understand how best to navigate the novel and expanding landscape of digital information.

      I wonder why NSF doesn't crow about this more?

    27. The electronics industry and semiconductor physics progressed hand-in-hand not because scientists, working “in the manner dictated by their curiosity for exploration of the unknown,” kept lobbing new discoveries over the lab walls that then allowed transistor technology to advance, but because the quest to improve technological performance constantly raised new scientific questions and demanded advances in our understanding of the behavior of electrons in different types of materials.

      Does this hold across fields? Certainly, CRISPR arose from very basic research.

    28. “And what? And what is there to show? You want to do this science and what?” “At some point,” Visco says, “you really have to save a life.”

      The ultimate impact

    29. What seemed to drive many of the scientists was the desire to “get above the fold on the front page of the New York Times,” not to figure out how to end breast cancer.

      I'm afraid that I cannot refute this, although I think many are driven by a desire to help.