  1. Mar 2020
    1. we decided to use PDMS-based polymer matrices

      PDMS-based material was used in this experiment because there are known uses for this material in other sustained-release products.

  2. Feb 2020
    1. the caged arms had a significantly higher fracture force than V-shaped arms (65.6 ± 7.5 N, n = 6 versus 51.7 ± 5.8 N, n = 6; P < 0.05, Student’s t test).

      The caged arms' fracture force was significantly higher than that of the V-shaped arms. The three-point bend test shows that it takes a greater force to break the caged arms than to break the V-shaped arms.
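
      As a rough sketch of how such a comparison works, here is a two-sample Student's t test in Python. The raw measurements below are hypothetical stand-ins, since the paper reports only summary statistics (65.6 ± 7.5 N versus 51.7 ± 5.8 N, n = 6 each).

      ```python
      from scipy import stats

      # Hypothetical fracture-force measurements in newtons (illustrative only).
      caged = [58.2, 71.3, 66.0, 75.1, 60.5, 62.5]
      v_shaped = [48.0, 55.2, 44.9, 58.1, 50.3, 53.7]

      t_stat, p_value = stats.ttest_ind(caged, v_shaped)
      print(f"t = {t_stat:.2f}, p = {p_value:.3f}")  # p < 0.05 indicates a significant difference
      ```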

    2. the caged arms made using a thermoplastic polymer,

      The arms were made of a thermoplastic polymer, a material that becomes pliable when heated and hardens again upon cooling.

    3. mechanical properties of V-shaped arms

      Different forces were applied to the material. The physical reactions to those forces were noted in order to determine the mechanical properties of this material.

      Examples of mechanical properties: strength, toughness, brittleness, etc.

    4. an outer sleeve made of a rigid polymer that provides mechanical integrity (structural polymer)

      The polymer making up the outer sleeve is for structural support of the dosage form.

    1. picogram analyses of Pb-Sr isotopes of fluid inclusions

      A picogram is a unit of mass equivalent to one-trillionth (10⁻¹²) of a gram. A picogram analysis is done by measuring a sample at the picogram scale and then using other analytical techniques on this sample to get meaningful information.

    2. all diamonds show typical sublithospheric features

      In order to confirm the sublithospheric features, the authors characterized the structure of these diamonds using a technique called cathodoluminescence imaging. With this technique, the light emitted by a sample bombarded with electrons can be measured.

    3. compositions of basalts provide information

      Measurement of isotopic content of primitive basaltic rocks has been a useful method to understand the exact chemical composition of these old rocks. Learn more in this article from Eos explaining the importance of using these techniques. https://eos.org/features/isotope-geochemists-glimpse-earths-impenetrable-interior

    4. The carbon isotope compositions of the diamonds

      The carbon isotope compositions were measured using a stable isotope ratio mass spectrometer. Watch this video to get an idea of how this instrument works in a science laboratory: https://www.youtube.com/watch?v=SHbzEwMt-1s

  3. Nov 2019
    1. injected with 4×10⁵ Sa1N tumor cells

      The experiment was repeated with a much larger number of tumor cells injected. This is to ensure that the effects seen rely only on the treatment course and not on other factors.

    2. All control mice injected subcutaneously with 1×10⁶ Sa1N cells

      The authors injected groups of 5 mice each with Sa1N cells, which cause fibrosarcoma.

  4. Oct 2019
    1. treated with anti-CTLA-4

      A third group of mice received anti-CTLA-4 antibodies as treatment.

    2. in vivo administration of antibodies to CTLA-4

      The authors injected mice with antibodies that bind CTLA-4.

  5. Sep 2019
    1. on the silicon dioxide (SiO2) substrate by means of a solution process method (figs. S1 and S2), providing a partial coverage on the SiO2 surface (22).

      The authors deposited a few drops of graphene in a colloidal liquid state on a SiO2 substrate, giving a partial, non-uniform coverage of graphene once the solution evaporated. Simultaneously, they also deposited a solution containing nanodiamonds to get nanodiamond particles on the SiO2 surface.

    2. We demonstrate our observation of stable macroscale superlubricity while sliding a graphene-coated surface against a DLC-coated counterface

      The authors designed and performed superlubricity experiments by sliding DLC-coated stainless steel balls against a graphene surface. However, after analyzing the initial test results, they needed to modify the design by also incorporating nanodiamonds into the system. They anticipated that the nanodiamonds could act as nano-ball bearings, thereby enhancing the mechanical strength of graphene and contributing to superlubricity.

    3. Atomistic simulations

      Molecular dynamics is a computer simulation method that allows for prediction of the time evolution of a system of interacting particles such as atoms and molecules.

    1. by using a two-chamber place preference test

      The authors hypothesized that because the ZI to PVT projection promotes intake of foods that are pleasurable to eat, and also makes mice overcome their aversion to light in order to eat that food, stimulation of this pathway is itself pleasurable or rewarding for the mice.

      They tested this by placing the animals in a box with two identical compartments. The mice were able to freely move around the box. On one side of the chamber the mice received stimulation of their ZI-PVT neurons, whereas the stimulation was turned off when the mice were on the other side.

    2. optogenetic stimulation

      A technique that uses light to control the activity of cells, most commonly neurons, in living animals. The cells are genetically modified to express ion channels that are sensitive to light. Shining light on the neurons changes their activity, allowing scientists to understand the role of the neuron in a given behavior or physiological process.

    3. Anterograde AAV-ChIEF-tdTomato labeling

      Infection of the neurons with tdTomato-tagged AAV allows the projection of the ZI GABA axons to be visualized.

      The authors used this method to determine where in the brain these neurons project to.

    4. we injected Cre recombinase–inducible adeno-associated viruses (AAV) expressing the optogenetic channelrhodopsin-like ChIEF fused with a tdTomato reporter [AAVdj-CAG-DIO-ChIEF-tdTomato (driven by the CAG promoter) (10, 11)] bilaterally into the rostral ZI of vesicular GABA transporter (VGAT)–Cre mice that express Cre recombinase in GABA neurons

      To target a neuron population of interest, e.g. those that express GABA, scientists use genetically modified viruses (AAVs) to deliver proteins into the brain (such as optogenetic tools).

      This is achieved by using two tools: 1) a mouse line that expressed the enzyme Cre recombinase in a specific population of neurons (e.g. those that express the GABA transporter VGAT) and 2) an AAV that expresses an optogenetic protein only in the presence of Cre. The AAV is injected into the brain region of interest in the Cre mice. This AAV has a tdTomato tag which allows the injection site to be visualized under a fluorescent microscope.

      For further information on these tools, see this video on how optogenetics is used in mice.

      The ZI in both hemispheres of the brain was injected with the AAV (bilaterally), with the region lying towards the front of the brain (rostral) being targeted. The optogenetic tool used (ChIEF) activates neurons when blue light is shone on the cells.

  6. Aug 2019
    1. Given that APC−/− tumors can efficiently transport both glucose and fructose, we sought to determine the metabolic fate of glucose and fructose using 13C isotopic tracing. We isolated tumors from APC−/− mice and exposed them to four different labeling conditions for 10 min ex vivo: 13C-glucose (labeled at all six carbons), 13C-fructose (labeled at all six carbons), 13C-glucose + unlabeled fructose, and 13C-fructose + unlabeled glucose.

      To study glucose and fructose metabolism in tumors, the authors traced the breakdown of these molecules.

      1. The scientists labeled glucose and fructose with a heavy carbon isotope (13C, which is stable rather than radioactive) that can be traced even as the molecule is broken down.
      2. They incubated tumor tissues with labeled-glucose, labeled-fructose or a mix of labeled-glucose + fructose, or a mix of labeled-fructose + glucose. These tumor tissues absorb the sugars and metabolize them. Note: Adding a mixture of sugars to the tumor allows the scientists to determine how the metabolic pathways are related.
      3. The metabolic products are then identified in the lab to trace the pathway by which tumors break down sugars.
    2. on a tumor tissue microarray containing 25 cases of human colon tumors ranging from early-stage adenomas to metastatic carcinoma (fig. S5B)

      In order to investigate tumor metabolism in human tissues, the scientists used a tissue microarray, in which tiny samples of human tumors or tissues are arrayed together and studied. They compared metabolism between 25 different tumors, of varying severity, and normal human intestinal cells.

    3. Given these findings, we hypothesized that fructose in the intestinal lumen might be efficiently transported and metabolized by tumors located in the distal small intestine and colon.

      The authors wanted to test whether tumors near the end (distal) of the intestines or in the colon consume fructose, since fructose can reach much higher concentrations in the colon than glucose can. Approach: They marked glucose and fructose molecules with 13C (a heavy, stable carbon isotope), which can be traced as the sugars are broken down and metabolized. This way, if they find 13C from fructose and glucose in a tumor, they can conclude that it metabolizes both sugars.

    4. Indeed, we found that fructose concentration was significantly increased in the colonic lumen (4.4 mM at peak 30 min) in WT mice after an oral bolus of HFCS (fig. S4A), consistent with impaired fructose uptake in the small intestine.

      The scientists repeated experiments from previous work to validate their methods. They fed mice (oral bolus) high fructose corn syrup and then, 30 minutes later (to allow for digestion), measured fructose levels in the colon. Similar to previous studies, they found elevated fructose in the colon, suggesting the transporters in the small intestine were saturated and allowed fructose to pass through unabsorbed.

    5. To uncouple the metabolic effects caused directly by HFCS from those caused by HFCS-induced obesity, we treated APC−/− mice with a restricted amount (400 μl of 25% HFCS) of HFCS daily via oral gavage starting the day after tamoxifen injection (referred to as the HFCS group).

      Here they did a two-part experiment: 1) They tested whether high fructose corn syrup itself induced metabolic dysfunction by giving mice a limited amount of high fructose corn syrup so that they did not become obese.

      2) To test the effects of high fructose corn syrup on colorectal tumor formation and growth, the authors compared tumor characteristics between mice fed different amounts of high fructose corn syrup. First, they treated all mice with tamoxifen to activate tumor formation. Then they divided them into three groups: HFCS, mice treated with a limited amount of high fructose corn syrup to prevent obesity; WB, mice that had high fructose corn syrup mixed into their water so they consumed a lot of it; and Con, a control group with no high fructose corn syrup administered. They then compared tumor formation and characteristics to look at the effects of high fructose corn syrup.

    6. We first determined the physiological effects of HFCS administered to APC−/− and wild-type (WT) mice

      The scientists were first interested in looking at how high fructose corn syrup affects an entire mouse, and compare the effects on normal mice and their genetically modified mouse (APC -/-). They did this by mixing high fructose corn syrup into their water and allowing them to drink as much as they wanted (ad libitum). They monitored the mice's weight over time.

    7. To untangle the link between sugar consumption, obesity, and cancer, we mimicked SSB consumption in a genetically engineered mouse model of intestinal tumorigenesis. In this model, the adenomatous polyposis coli (APC) gene is deleted in Lgr5+ intestinal stem cells upon systemic tamoxifen injection (Lgr5-EGFP-CreERT2; APCflox/flox, hereafter APC−/− mice) (11, 12).

      The scientists need a mouse model that will develop intestinal tumors so that they can study the effects of sugar-sweetened beverages on the tumor. They manipulated the mouse genes so that after injecting a drug (tamoxifen), a gene in the intestine is deleted and tumors begin to form. With this genetically engineered mouse, the scientists can induce tumor formation in the mouse, then track tumor size and metabolism to look at the effects of high fructose corn syrup in the diet.

    1. tetrodotoxin, which prevents Na+ influx elicited by veratridine, prevented the effects of depolarization

      The sodium influx caused by veratridine was cancelled out by tetrodotoxin exposure. Tetrodotoxin blocks the influx of sodium ions, thereby preventing depolarization. Hence, in the presence of both drugs, substance P rises much as it does in untreated controls.

    2. one-way analysis of variance

      Used to compare the means of two or more groups.

      Here, the authors used this test to compare the activity of substance P across the following groups: (1) control, (2) in the presence of tetrodotoxin, (3) in the presence of veratridine, and (4) in the presence of both tetrodotoxin and veratridine.

      Read more at Khan Academy.
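
      As a minimal sketch of how a one-way ANOVA across these four groups could be run in Python (the substance P values below are hypothetical placeholders, not the paper's data):

      ```python
      from scipy import stats

      # Hypothetical substance P levels for each treatment group (illustrative only).
      control      = [10.1, 9.8, 10.5, 10.2]
      ttx          = [10.0, 9.7, 10.3, 10.4]
      veratridine  = [4.1, 3.8, 4.5, 4.0]
      ttx_plus_ver = [9.9, 10.2, 9.6, 10.1]

      f_stat, p_value = stats.f_oneway(control, ttx, veratridine, ttx_plus_ver)
      print(f"F = {f_stat:.2f}, p = {p_value:.4f}")  # small p: at least one group mean differs
      ```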

    3. Control

      These are explants obtained from the nucleus locus ceruleus. The explants are placed in a nutrient medium, and no drugs are given to this group. This group serves as a comparison for the groups treated with the drugs.

    4. Autoradiography

      The last step in the dot blot, in which the material of interest is detected using a radioactive probe. Here, radioactive probes that hybridize to proenkephalin mRNA were observed at three time points: 0 days, 1 day, and 3 days.

    5. The medullary explants exhibited a 50-fold rise in [Leu]enkephalin within 4 days, after a 2-day lag period, and continued increasing through 7 days, the longest time examined. In contrast, TH activity remained constant throughout, while PNMT decreased 60 percent in the first 4 hours, maintaining a stable plateau thereafter.

      Medullary explants were obtained from adult rats to understand the mechanism behind the different transmitter expression. 

      The tyrosine hydroxylase (TH) activity stayed consistent throughout the seven-day period. TH enzyme is responsible for the synthesis of catecholamines. The consistent TH activity indicates that there is no change related to the expression of catecholamines in this experiment. 

      Next, the PNMT enzyme is responsible for adrenergic expression. PNMT exhibited a 60% decrease in four hours and was maintained at that level for the remainder of the time period.

      The [leu]enkephalin (opiate expression) showed a 50-fold increase in four days and continued to increase until seven days. There is a continuous increase in the expression of opiates as opposed to catecholamines.

    6. immunocytochemical reactivity

      Technique used to mark a target of interest using an antibody-based test. In this study, tyrosine hydroxylase and dopamine β-hydroxylase are the target molecules of interest. Antibodies against tyrosine hydroxylase and dopamine β-hydroxylase are used to detect these molecules.

    7. Depolarization with veratridine completely blocked the increase of substance P

      Explants were depolarized with veratridine, which prevented the rise in substance P levels seen in untreated explants.

    8. explanted superior cervical ganglia

      The ganglia were removed from the animal and transferred to a nutrient medium. These explants were maintained in the medium for six months to one year. At several time points, explants were taken and assayed for substance P activity.

    9. in culture

      The neurons are dissected from the animal and grown in a dish. The dish contains supplemental factors and a medium that mimics the composition of the fluid inside the animal.

    10. grown in dissociated cell culture

      Neurons are separated from the animal through mechanical or enzymatic disruption. The separated neurons are transferred to a dish or culture plate. The neurons are maintained in the dish.

  7. Jul 2019
    1. The multiplet nuclei capture rate was comparable to single-cell RNA-seq analysis using the 10× platform

      In order to see if they were accidentally capturing more than one nucleus at a time, the authors mixed nuclei from mouse and human samples prior to running snRNA-seq. If they saw mouse RNA mixed with human RNA, this meant there was a multiplet (more than one nucleus was captured).

      However, they found very low rates of multiplets, meaning that the experiment was working well.

    2. We aimed to gain insight into cell type–specific transcriptomic changes by performing unbiased single-nucleus RNA sequencing (snRNA-seq) (4) of 41 postmortem tissue samples

      The authors wanted to see if particular cell types have different gene expression in autistic brains. They also examined two different brain areas to see if there are regional differences.

    3. We generated 104,559 single-nuclei gene expression profiles—52,556 from control subjects and 52,003 from ASD patients (data S2)—and detected a median of 1391 genes and 2213 transcripts per nucleus, a yield that is in agreement with a recent snRNA-seq

      The authors profiled roughly equal numbers of nuclei from controls and ASD patients. The median gene number is lower than the transcript number because a gene can have multiple transcript forms (called isoforms).

    4. 10× Genomics platform

      The 10X Genomics system is a platform that isolates single nuclei and generates "libraries" (collections of RNA fragments that can be used to identify particular RNAs) from each nucleus.

    5. To compare changes in ASD to those in patients with sporadic epilepsy only, we generated additional snRNA-seq data from eight PFC samples from patients with sporadic epilepsy and seven age-matched controls (data S1)

      The authors wanted to make sure that any effects they were seeing were specific to ASD, and not epilepsy, so they included patients with epilepsy alone as an additional control.

    6. (fig. S1A; P > 0.1, Mann-Whitney U test)

      This means that the control and ASD subjects didn't differ in age, sex, or RNA quality. This is important because results can be biased by uncontrolled factors (e.g., what if there are more females in one group, and the effect you're seeing is really due to sex?).

      The Mann-Whitney U test is a statistical test that the authors used to show that there are no significant differences between the controls and ASD subjects.
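
      As a minimal sketch of how such a comparison could be run in Python (the ages below are hypothetical placeholders, not the study's data):

      ```python
      from scipy import stats

      # Hypothetical subject ages for the two groups (illustrative only).
      control_ages = [14, 22, 35, 41, 19, 27]
      asd_ages     = [15, 24, 33, 40, 20, 26]

      u_stat, p_value = stats.mannwhitneyu(control_ages, asd_ages, alternative="two-sided")
      print(f"U = {u_stat:.1f}, p = {p_value:.3f}")  # p > 0.1 suggests the groups are well matched
      ```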

  8. Jun 2019
    1. veratridine depolarization

      Veratridine is a drug that increases sodium influx. The authors used it to cause depolarization.

    2. histofluorescence

      Fluorescent markers are used to label catecholamines in the neurons, which are then visualized using fluorescence microscopy.

    1. Locomotor sensitization

      A technique used to measure the movement (locomotor activity) of an animal in an open field box. With repeated administration of a drug, animals can show an increase in locomotor activity, which is a sign of sensitization.

      Photobeams are placed on the walls of the box to record the movements of the animal. The mice can explore and get used to the test area of the open field.

      The animal is tracked for the distance covered in centimeters using an automated video system. The experiment is repeated on day 1 and on each day following the injection of cocaine (on days 8, 9, 10, 11). The total distance covered by the animal is recorded for each day.

      Watch the technique here at: https://www.jove.com/video/53107/assessment-cocaine-induced-behavioral-sensitization-conditioned-place

    2. CPP measures an animal’s preference for a particular place in its environment, which develops as that place becomes consistently associated with a rewarding stimulus and assumes some of the rewarding effects of the stimulus

      For this experiment, a three-chamber apparatus is constructed, with the animal given access to two of the chambers. There are 3 phases to this experiment: pre-conditioning, conditioning, and testing.

      Pre-conditioning: The animal can freely move between the chambers. For each animal, the initially preferred chamber is noted. This was done on days 7 and 8 of the experimental protocol.

      Conditioning: The animals receive saline in the preferred chamber and cocaine in the less preferred chamber. This was done on days 9, 10, and 11 of the experimental protocol.

      Testing: On day 12 of the experiment, the animals are allowed to have access to both sides of the chamber for 30 minutes. The time spent in the preferred chamber and the less preferred chamber is recorded.

      View the video to learn more about the conditioned place preference protocol. The protocol is describing how to measure craving in animals using morphine as the drug of preference. https://www.jove.com/video/58384/a-conditioned-place-preference-protocol-for-measuring-incubation

    3. long-term potentiation

      Slice electrophysiology is a technique that is widely used to study synaptic plasticity. Brain slices containing the nucleus accumbens region were obtained from the mice.

      For this technique, stimulating and recording electrodes are needed. Recording electrodes measure the electrical activity of the neurons in the area. Stimulating electrodes are used to stimulate dendrites in the brain region, eliciting a response that can be recorded via the recording electrode. The stimulus is given at a rate of 1 per minute.

      The stimulating electrode was placed in the nucleus accumbens, and the recording electrode was placed near the stimulating electrode.

      The amplitude or the size of the response can be calculated from each stimulus. Baseline values were obtained. After that, an LTP stimulus was given, and the post-LTP data was collected. The data were normalized to baseline values.

      Watch this video on how LTP is done in the hippocampus, a brain region involved in memory: https://www.jove.com/video/2330/preparation-acute-hippocampal-slices-from-rats-transgenic-mice-for

    4. high-frequency stimulation

      The neurons are activated at a high frequency (100 Hz). The protocol used here: four trains of 100 Hz tetanus, 3 minutes apart.

    5. FosB expression

      The chromatin immunoprecipitation (ChIP) technique is used to examine the chromatin modifications that regulate FosB expression.

      Briefly, the brain tissue was fixed with formaldehyde to crosslink DNA-binding proteins to the DNA. The DNA was sheared into small fragments, some of which contain the bound proteins. Using antibodies specific to histones H3 and H4, the DNA-protein complexes were isolated. The proteins were digested, and the DNA was released. The specific DNA sequences of interest were then amplified to see if they precipitated with the antibody.

      Watch the video here: https://www.jove.com/science-education/5551/chromatin-immunoprecipitation

    6. PCR

      mRNA was isolated from the brain tissue using the Trizol reagent. The RNA was then reverse transcribed into complementary DNA (cDNA), which was amplified using primers of interest. The fold difference in mRNA over control values is calculated and compared across the groups.

      Check the video here on the technique: https://www.youtube.com/watch?v=0MJIbrS4fbQ

    7. Immunoblots

      Protein was extracted from the tissue. Proteins were separated by molecular weight on an SDS-PAGE gel and then transferred to a nitrocellulose membrane.

      Antibodies to H3 and H4 were applied to the membrane to detect the bands of interest.

      Watch the technique here: https://www.jove.com/science-education/5065/the-western-blot

    8. SAHA

      The drug was administered directly to the nucleus accumbens of the mice.

      In order to do so, the coordinates of the nucleus accumbens were obtained from a mouse brain atlas. The mouse was placed in a stereotaxic frame, and a cannula was guided into the nucleus accumbens using these coordinates; the drug was injected through the cannula every day for 7 days.

    9. HDAC activity

      Nuclear fractions were obtained from the mice using a nuclear extraction kit, and HDAC activity was measured with an assay kit.

    10. 14 days after stopping 7 days of nicotine treatment

      The mice received 7 days of nicotine or water, and then the drug was withdrawn for 14 days. Cocaine was administered after this 14-day withdrawal period.

    11. To investigate further the duration of the priming effect of nicotine

      What is the duration of nicotine exposure that is needed to obtain the priming response we see in these animals?

      Does nicotine need to be given closer to another drug or separated a few days apart?

    12. To test further the idea that histone acetylation and deacetylation are key molecular mechanisms for the effect of nicotine on the response to cocaine, we conducted two sets of experiments, one genetic and one pharmacological

      Next, the authors tested the idea of histone acetylation by using a low dosage of theophylline, an HDAC stimulator. In contrast to SAHA, the theophylline should decrease the response to cocaine.

    13. we asked whether we could simulate the effect of nicotine by specifically inhibiting deacetylases with the HDAC inhibitor suberoylanilide hydroxamic acid

      If nicotine is inhibiting HDAC activity, then by using an HDAC inhibitor, we should be able to mimic the effects of nicotine on LTP and FosB expression. This hypothesis was tested by using SAHA, an HDAC inhibitor.

    14. histone deacetylase (HDAC) activity directly in the nuclear fraction of cells in the striatum

      To address this question, histone deacetylase (HDAC) activity was measured directly in the striatum.

    15. Does the hyperacetylation produced by nicotine result from activation of one or more acetylases or from the inhibition of deacetylases?

      The authors next addressed whether the acetylation of residues is due to increased activation of acetylases or to inhibition of deacetylases.

    16. we used immunoblotting and examined the extent of chromatin modifications in the whole striatum of mice chronically treated with nicotine

      The authors observed the acetylation levels of H3 and H4 after 7 days of nicotine treatment in striatum tissue using chromatin immunoprecipitation and immunoblotting.

    17. whether nicotine enhances FosB expression in the striatum by altering chromatin structure at the FosB promoter and, if so, does it magnify the effect of cocaine?

      The authors asked the question: does nicotine increase FosB expression by altering the chromatin structure at FosB promoter?

    18. we gave cocaine (30 mg/kg) in two protocols: for 24 hours or 7 consecutive days followed by 24 hours of treatment with nicotine

      The animals were given cocaine (30 mg/kg) either for 24 hours or for 7 consecutive days, followed by 24 hours of nicotine treatment. FosB mRNA levels were then measured.

    19. does nicotine pretreatment followed by cocaine increase the response to cocaine, whereas the reverse order of drug treatment does not?

      These experiments were performed to determine whether nicotine pretreatment followed by cocaine increases the response to cocaine, while the reverse order of drug treatment does not.

    20. We treated mice with nicotine (50 μg/ml) in the drinking water for either 24 hours (Fig. 1A) or 7 days

      Nicotine was added to the drinking water for the mice.

      7 days of treatment: The mice drank the nicotine-containing water for 7 days. For the next 4 days, the mice received a cocaine injection intraperitoneally (an injection into the body cavity) once per day, with continued exposure to nicotine-containing water.

      24 hours of treatment: The mice were exposed to the nicotine-containing water for 24 hours; for the next 4 days, the mice received a cocaine injection (once per day) intraperitoneally, with continued exposure to nicotine-containing water.

    1. we expected the intervention to be particularly beneficial for women tending to endorse the gender stereotype.

      The authors predicted that the effect of the values affirmation intervention would be greater for women who more strongly endorse the gender stereotypes.

    2. We predicted a reduced gender gap in performance for women who completed the values affirmation.

      The authors' main prediction was that the gender gap in performance would be smaller among women who completed the values affirmation than among those who did not.

    3. We tested whether values affirmation would reduce the gender achievement gap in a 15-week introductory physics course for STEM majors.

      The authors tested whether using a values affirmation intervention in their college physics course could reduce the performance gap between men and women.

    4. The values-affirmation intervention used in this study involves writing about personally important values (such as friends and family). The writing exercise is brief (10 to 15 min) and is unrelated to the subject matter of the course.

      The key variable in this experiment is whether a student experiences the values affirmation intervention.

      In the values affirmation intervention, students briefly write about a value they find personally important.

    1. To examine interaction dynamics across sites and to test their association with environmental variables, we calculated the dissimilarity (interaction turnover) between pairs of networks, using data limited to species present in the networks.

      The authors compared the similarities and differences between species' interactions across different locations around the island, taking into account the unique environments of each site.

  9. May 2019
    1. To test whether activation of the VGATZI-PVT inhibitory pathway leads to body weight gain, we selectively photostimulated this pathway for only 5 min every 3 hours over a period of 2 weeks.

      The authors hypothesize that because stimulation of the ZI to PVT pathway evokes a large increase in food intake in a short amount of time, long-term stimulation should lead to weight gain.

    2. To test the time course and efficiency of optogenetic activation of VGATZI-PVT inhibitory inputs to evoke feeding, we used a laser stimulation protocol of 10 s ON (20 Hz) followed by 30 s OFF for more than 20 min to study ZI axon stimulation in PVT brain slices and feeding behavior. Stimulation of ZI axons with this protocol hyperpolarized and inhibited PVT glutamatergic neurons each time the light was activated (Fig. 3A). Mice immediately started feeding for each of the 30 successive trials of ZI axon laser stimulation (Fig. 3B and movie S4). The mean latency to initiate feeding was 2.4 ± 0.6 s when we used laser stimulation of 20 Hz (Fig. 3C).

      The authors followed an optogenetic protocol by intermittently turning on the stimulation light for 10 seconds followed by 30 seconds of no stimulation. With each 10 seconds of light on, they measured how long it took for the mice to begin eating.

    3. We crossed VGAT-Cre mice with vGlut2-GFP mice in which neurons expressing vesicular glutamate transporter (vGlut2) were labeled with green fluorescent protein (GFP) to study whether ZI GABA neurons release synaptic GABA to inhibit PVT glutamate neurons (16, 17).

      The authors bred two different mouse lines together: one parent expressed Cre in VGAT-positive neurons and the other parent expressed a protein that emits green fluorescence (GFP) in vGlut2-positive neurons.

      The researchers then used the offspring of this cross to record from GFP-positive cells in a slice and ask whether VGAT cells in the ZI provide input to these neurons.

    4. Laser stimulation (1 to 20 Hz) evoked depolarizing currents in ZI ChIEF-tdTomato–expressing VGAT neurons tested with whole-cell recording in brain slices, displaying a high-fidelity correspondence with stimulation frequency (Fig. 1B).

      The authors recorded the activity of the ChIEF-expressing neurons in brain slices using electrodes. Stimulating the slice with blue light activated the ChIEF-expressing neurons, causing them to fire in the same pattern with which they were stimulated (i.e., high fidelity). Hz (hertz) refers to the number of times the light flashes per second; 20 Hz corresponds to 20 flashes of light per second, which caused the neurons to fire 20 times per second.

      This virtual lab demonstrates electrophysiological recordings of neurons: Neurophysiology Virtual Lab

    5. Cre recombinase–dependent rabies virus–mediated monosynaptic retrograde pathway tracing in vGluT2–Cre recombinase mice

      The authors identified the neurons that lie upstream and provide input to PVT neurons.

      They targeted excitatory PVT neurons using vGluT2-Cre mice and used a modified rabies virus that traffics into neurons that provide input to the starting population of cells.

    6. food intake was measured when food was put in a brightly illuminated chamber in a two-chamber light-or-dark conflict test

      Mice were placed in a chamber with two compartments—one with no lights and one brightly illuminated. Mice are innately averse to light and so will usually spend more time in the unlit compartment.

    7. After mice were partially fasted with only 60% of the normal food available during the preceding night, laser stimulation (20 Hz, 10 min ON followed by 10 min OFF, two times) of ChIEF-expressing PVT vGluT2 neurons reduced food intake (Fig. 4, F to H).

      The authors gave the mice a small amount of food to eat overnight, which meant that they were hungry during the experiment. Therefore, control mice commenced eating with short latency at the onset of the stimulation protocol.

    8. To explore the neuronal pathway postsynaptic to the VGATZI-PVT axon terminals, we injected Cre-inducible AAV-ChIEF–tdTomato selectively into the PVT of vGlut2-Cre mice (Fig. 4A and fig. S8A).

      The authors assessed the role of the neurons downstream (postsynaptic) of the ZI neurons that project to the PVT. They examined how food intake was affected when PVT excitatory neurons were optogenetically stimulated.

      Given that GABA is an inhibitory neurotransmitter, the PVT neurons would normally be inhibited when the ZI to PVT projection is active. Thus, when the PVT neurons are stimulated, food intake should decrease. Indeed, this is what the authors found.

    9. To test whether ZI GABA neurons exert long-term effects on energy homeostasis, we microinjected AAV-flex-taCasp3-TEVp, which expresses caspase-3 (24), into the ZI of VGAT-Cre mice to selectively ablate ZI GABA neurons (fig. S7).

      The authors selectively killed ZI GABA neurons by using an AAV to express a caspase in these neurons. Caspase-3 is an enzyme that induces cell death.

    10. A chemo-genetic designer receptor exclusively activated by designer drugs (DREADD) was used to test the hypothesis that silencing the cells postsynaptic to ZI GABA axons, the PVT glutamate neurons, would enhance food intake. We injected Cre-inducible AAV5-hSyn-HA-hM4D(Gi)-IRES-mCherry coding for the clozapine-N-oxide (CNO) receptor into the PVT of vGlut2-Cre mice (25, 26) (fig. S9, A and B).

      Silencing of neurons in the PVT that receive input from ZI GABAergic neurons should increase food intake given that these neurons are inhibited by ZI GABA neurons, which increase food intake.

      The authors used a chemogenetic approach in which a modified (DREADD) receptor is expressed in the neurons using AAVs. The receptor is activated specifically by a synthetic drug (CNO) that has no other biological effect.

      The authors used this approach over an optogenetic method to silence the neurons as currently available optogenetic tools for inhibition are not very efficient.

    11. To confirm that PVT vGlut2 neurons were killed by the virus-generated caspase-3, we injected the Cre-dependent reporter construct AAV-tdTomato simultaneously with AAV-flex-taCasp3-TEVp to corroborate that reporter-expressing neurons were absent after selective caspase expression. With coinjection, little tdTomato expression was detected, whereas many cells were detected with injections of AAV-tdTomato by itself, consistent with the elimination of vGluT2 neurons in the PVT (fig. S10, A to D).

      To confirm that the caspase virus was killing cells, a tdTomato reporter, which makes the cells red under a fluorescent microscope, was injected at the same time as the caspase virus.

      The authors found that few tdTomato cells were present in mice that also received the caspase, compared to control mice that were injected with the tdTomato only. Thus, the caspase virus efficiently killed the PVT neurons.

    1. Let us assume that the genetic code is a simple one and ask how many bases code for one amino acid.

      In other words, how many bases in a row translate into one amino acid?

      Let's do a thought experiment (which is considerably cheaper than a laboratory experiment):

      Assume that each amino acid is coded for by two bases in a row. The code would have one of four different bases in the first position of the code (A, G, C, T) and one of four different bases for the second. How many combinations of pairs would be possible?

      For example: (1) A A (2) A G (3) A C (4) A T (5) G A (6) G G (7) G C (8) G T …

      If you continued to write out every combination, you would come up with 16 possible pairs of bases. However, that's four short of the 20 natural amino acids. This is a good sign that two bases is not enough to code for all possible amino acids (and, in fact, we now know that it takes three bases in a row).

      How many combinations would be possible if the code were a grouping of three bases?
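
      One way to check this counting argument is to enumerate the combinations directly, as in this short Python sketch:

      ```python
      from itertools import product

      bases = "AGCT"
      for k in (1, 2, 3):
          codes = ["".join(p) for p in product(bases, repeat=k)]
          print(f"{k} base(s): {len(codes)} combinations")
      # 1 base:   4 combinations (too few for 20 amino acids)
      # 2 bases: 16 combinations (still four short)
      # 3 bases: 64 combinations (more than enough)
      ```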

    2. The crucial experiment is to put together, by genetic recombination, three mutants of the same type into one gene

      This "frame shift" experiment tests whether the bases are read in singlets, pairs, or triplets.

      https://ghr.nlm.nih.gov/primer/illustrations/frameshift.jpg

    3. These mutations are believed to be due to the addition or subtraction of one or more bases from the genetic message. They are typically produced by acridines, and cannot be reversed by mutagens which merely change one base into another. Moreover, these mutations almost always render the gene completely inactive, rather than partly so.

      By incorporating acridines into genetic material, Crick and coworkers produced mutations in DNA. These mutations led to the addition or deletion of one or more bases in the genetic message.

    1. the Force and Motion Conceptual Evaluation (FMCE)

      A secondary outcome measure was student scores on the Force and Motion Conceptual Evaluation. Because this test is administered throughout the country, the authors can compare their findings to the normal results for the population.

    2. Students in the control group selected their least important values from the same list and wrote why these values might be important to other people.

      Students in the control group spent time writing about a value that was not important to their identity. This ensures that any difference between the values affirmation group and the control group is due to the act of self-reflection and affirmation, and not simply a result of general writing.

    3. As part of an online survey typically given in the course (week 2), students also indicated their endorsement of the stereotype that men perform better than women in physics.

      Level of endorsement of the stereotype that men perform better than women in physics is a moderating variable. Its value influences how much the values affirmation intervention affects course performance.

    4. attempts to reduce identity threat in authentic classroom contexts have been limited

      One of the key features of this study is that it takes place in an authentic college classroom, rather than being an artificial, one-time laboratory experiment.

    1. We performed a genome-wide screen for loci affecting overall body size in six species of Darwin’s finches that primarily differ in size and size-related traits: the small, medium, and large ground finches, and the small, medium, and large tree finches (Fig. 1, A and B, and table S1)

      The investigators chose to use samples from six species of finches. They then screened the entire genome of these species to look for genetic variants in different individuals. The objective was to see if variation at any given location was associated with size and/or size-related traits.

    2. We constructed a maximum-likelihood phylogenetic tree on the basis of this ~525-kb region

      The genome-wide fixation index scan found that a region around 525 kb in size showed the most striking differences.

      Lamichhaney and colleagues constructed another phylogenetic tree based on the alignment of this region in all of the samples.

    3. We constructed a maximum-likelihood phylogenetic tree on the basis of all 180 genome sequences (Fig. 1C)

      The nucleotide alignment of the variable positions from all 180 samples (60 birds plus 120 from previous study) allowed the scientists to generate a phylogeny using software FastTree.

      See here to learn more about FastTree and its maximum-likelihood method.

    4. We sequenced 10 birds from each of the six species (total 60 birds) to ~10× coverage per individual, using 2 × 125–base pair paired-end reads. The sequences were aligned to the reference genome from a female medium ground finch (12).

      Lamichhaney and colleagues sequenced a total of 60 birds. Sequencing is a technique used to actually read the DNA.

      For a history of DNA sequencing and assembly, this resource from HHMI BioInteractive is a great tool.

      This video shows specifically how Illumina Sequencing Technology works.

      If a finch genome is 1 Gbp (one billion base pairs), sequencing "to ~10x coverage per individual" would mean that you obtain 10 Gbp of sequencing data.

      "Using 2 x 125-base pair paired-end reads" refers to the fact that the fragments sequenced were sequenced from both ends and not just one. Refer to the videos above for more information.

    5. We then genotyped individuals of the Daphne population of medium ground finches that succumbed or survived during the drought of 2004–2005.

      Genotyping establishes a genetic profile for each individual finch. With birds, this can typically be done using plucked feathers, blood, or eggshell membranes.

      The researchers here used blood samples that were collected on FTA paper and then stored at -80°C. FTA paper is treated to bind and protect nucleic acids from degrading.

      This type of scanning is used to identify specific gene markers that are highly variable. Researchers wanted to identify a locus that showed beak size variation.

    1. Compared with REPAIRv1, REPAIRv2 exhibited increased specificity, with a reduction from 18,385 to 20 transcriptome-wide off-targets with high-coverage sequencing (125x coverage, 10 ng of REPAIR vector transfected)

      To more rigorously compare the off-target activity of two systems, the authors performed sequencing with higher coverage.

      Recall that earlier they used 12.5x coverage. Here, they used 125x coverage.

      Why is this important? Cellular genes are expressed at different levels which leads to a different number of individual mRNA molecules. The more abundant a particular molecule is, the easier it is to detect it at a given coverage. When you increase the coverage, you have a chance to catch molecules that are less abundant in the cell.

      This is exactly what happened in the experiment with REPAIRv1. At 125x coverage, the authors detected 18,385 off-target edits across the transcriptome (a number on the order of the ~20,000 protein-coding genes in our genome). By contrast, the REPAIRv2 system was far more specific, producing roughly 1,000 times fewer off-targets.

    2. We further explored motifs surrounding off-targets for the various specificity mutants

      Inspired by other explorations into 3' and 5' motifs, the authors looked at transcripts with off-target effects—specifically at two nucleotides surrounding the edited adenosine.

    3. A majority of mutants either significantly improved the luciferase activity for the targeting guide or increased the ratio of targeting to nontargeting guide activity, which we termed the specificity score

      The authors looked at two characteristics of the modified protein variants.

      First, they tested whether the mutant had changed its editing activity. This was calculated by looking at the restoration of the Cluc luciferase signal. While the authors didn't necessarily want increased editing activity, they wanted to avoid a loss of editing activity. 

      However, higher catalytic activity sometimes leads to more off-target effects. Therefore, the authors calculated the ratio between the Cluc signal in targeting and non-targeting conditions, i.e., the specificity score. The higher the score, the more specific a mutant variant was.
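
      In code, the specificity score is simply this ratio. A minimal sketch (the luminescence readings below are hypothetical):

      ```python
      def specificity_score(targeting_signal: float, nontargeting_signal: float) -> float:
          """Ratio of on-target (targeting guide) to background (nontargeting guide) signal."""
          return targeting_signal / nontargeting_signal

      # Hypothetical Cluc luminescence readings (arbitrary units).
      print(specificity_score(targeting_signal=5200.0, nontargeting_signal=130.0))  # 40.0
      ```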

    4. we generated an RNA-editing reporter on Cluc by introducing a nonsense mutation [W85X (UGG→UAG)],

      To create the reporter, the researchers "broke" the gene for the Cluc luciferase by introducing a mutation in the UGG codon, changing it to UAG (a nonsense mutation, which signals the ribosome to stop translation). Since this codon was positioned in the beginning of the Cluc transcript, no luciferase was synthesized in the cell.

      The A (adenosine) in the UAG codon was the target for RNA editing. Both Cas13b-mediated RNA recognition and ADAR-mediated editing were required to remove the stop codon at the beginning of the Cluc transcript, which would restore Cluc expression.

      This means that Cluc luminescence would only be seen where editing (both targeting and base conversion) was successful.

    5. We next characterized the interference specificity of PspCas13b and LwaCas13a across the mRNA fraction of the transcriptome.

      The next question was to understand the specificity of Cas13 across the whole cellular transcriptome (the portion of the genome that is transcribed). In the previous experiments, the researchers looked at the expression of only one target (unmodified or modified). Here, they broadened their focus from a single target to the entire transcriptome.

      To do that, the authors transfected the cells with a Cas13 enzyme (LwaCas13a or PspCas13b), a gRNA to target Gluc, and a plasmid containing the gene for Gluc.

      The control cells got an irrelevant gRNA instead of the gRNA targeting Gluc. As an additional control for comparison, the authors used shRNA-mediated knockdown of Gluc in a parallel cell culture.

      After 48 hours the researchers collected the cells, extracted mRNA, and determined the sequences and the number of copies for each transcript.

    6. We transfected HEK293FT cells with either LwaCas13a or PspCas13b, a fixed guide RNA targeting the unmodified target sequence, and the mismatched target library corresponding to the appropriate system.

      The cells were transfected with a nuclease, a gRNA for the non-mutated target site and a whole library with all possible plasmid variants. After 48 hours the researchers collected the cells, extracted RNA, and determined which mutated sequences from the library were left uncleaved.

      The transcript with the unmodified sequence was depleted most efficiently so that its level was the lowest after the cleavage. The levels of all other sequences with substitutions decreased to a lesser extent or did not decrease at all. The better Cas13 cut the sequence, the higher the depletion of this sequence was.

      The authors then compared the sequences by their "depletion scores."

    7. Sequencing showed that almost all PFS combinations allowed robust knockdown

      Substitutions in the PFS motifs did not affect how well Cas13a and Cas13b found and cut the target sequences. As a result, sequences with such substitutions were depleted as successfully as the control sequence, which was unmodified.

    8. To more rigorously define the activity of PspCas13b and LwaCas13a, we designed position-matched guides tiling along both Gluc and Cluc transcripts and assayed their activity using our luciferase reporter assay.

      To figure out which parts of the Gluc or Cluc RNA molecules were the best targets for Cas13, the authors generated a series of gRNA guides where each guide was shifted one to several nucleotides relative to the previous one. This is called tiling.

      In this way, the guides together could cover the whole sequence or a part of a sequence that a researcher was interested in. See figure 4A and 4C or figure 5A for a visual of how the guides were "tiled."

    9. Therefore, we tested the interference activity of the seven selected Cas13 orthologs C-terminally fused to one of six different localization tags without msfGFP.

      The authors took the seven selected Cas13 orthologs from the previous part of the study and replaced the msfGFP domain at the C-terminus of each ortholog with one of six different localization sequences. They then tested Gluc knockdown in the same way they previously tested LwaCas13a.

    10. We transfected human embryonic kidney (HEK) 293FT cells with Cas13-expression, guide RNA, and reporter plasmids and then quantified levels of Cas13 expression and the targeted Gluc 48 hours later

      To directly compare the effectiveness of Cas13a, b, and c orthologs, the authors transfected cells with two luciferases, Cas13 and two different gRNAs targeting Gluc luciferase.

      They measured Gluc luciferase activity. Reduced Gluc luciferase activity indicated interference from the Cas13 ortholog and successful targeting by the gRNA.

      They determined the expression of Cas13 to see whether Gluc knockdown depended on the quantity of Cas13 rather than on the specific ortholog.

    11. Here, we describe the development of a precise and flexible RNA base editing technology using the type VI CRISPR-associated RNA-guided ribonuclease (RNase) Cas13

      In this article, the authors describe how they created a system that can edit RNA molecules. They used a Cas13 protein fused to an adenosine deaminase. The Cas13 protein recognizes specific sequences on an RNA molecule, and the deaminase edits bases, converting A to I (which is functionally read as a G).

      The authors improved the targeting specificity (accuracy) and editing rate (precision) by Cas13 and deaminase mutagenesis, and determined the sequences for which this system is most effective. They showed one application of this technology by correcting a series of disease-causing mutations at the cellular level.

    1. We evaluated the collective scrolling and tribological behavior of many individual graphene patches and created a density distribution of their tribological state in order to assess their contribution to the observed friction

      The authors conducted theoretical simulations to investigate the sliding behavior of an ensemble of graphene sheets and to elucidate the macroscale scrolling phenomenon. To explore the mesoscopic friction behavior, they calculated the number density (number of particles per unit volume) of the patches as a function of the coefficient of friction and time, grouping the friction coefficients collected over the ensemble of graphene patches.

    2. we performed a large-scale MD simulation for an ensemble of graphene-plus-nanodiamonds present between DLC and the underlying multilayered graphene substrate (fig. S8).

      To understand the transition from nanoscale friction to the macroscopic superlubric condition observed in experiments, the authors simulated a mesoscopic scenario. They created and analyzed an ensemble (an assembly of systems) of graphene patches and nanodiamonds between the DLC and the underlying graphene substrate subjected to sliding friction.

    3. We have simulated the effects of surface chemistry and considered the role of defects

      To understand the role of defects in superlubricity, the authors performed computer simulations by introducing double vacancies and Stone-Wales defects on graphene sheets. Studies were conducted in both dry and humid environments.

    4. DLC-nanodiamond-graphene system in a humid environment

      Having observed experimentally the effect of humidity on friction, the authors extended their studies. They performed computer simulations to further analyze the interaction between water molecules and graphene in the DLC-nanodiamond-graphene system in a humid environment.

  10. Apr 2019
    1. This prototype includes a MOF-801 layer (packing porosity of ~0.85, 5 by 5 by 0.31 cm, containing 1.34 g of activated MOF), an acrylic enclosure, and a condenser

      Since this paper was published, the authors refined and optimized the device and tested it under desert conditions with record high efficiency.

      See "Related Content" tab for: Kim, Hyunho, et al. "Adsorption-based atmospheric water harvesting device for arid climates." Nature communications 9.1 (2018): 1191.

    2. Experiments were performed in a RH-controlled environmental chamber interfaced with a solar simulator.

      To test the material in a laboratory setup, the authors use an enclosed chamber in which conditions such as humidity, temperature, and solar illumination can be regulated. This guarantees control over the experimental conditions and reproducibility.

    3. activated (solvent removal from the pores) by heating at 150°C under vacuum for 24 hours

      Heating under reduced pressure lowers the boiling point of liquids. This allows all the solvent and water molecules trapped in the MOF to evaporate easily, emptying all the cavities before starting the experiment.

    1. We carried out more detailed analysis of the wear track that

      In order to understand the wear properties of the graphene-nanodiamond compound after the sliding experiments, the authors performed electron microscopy studies which can reveal the structure of the material in the wear debris.

    2. Raman analysis

      Raman spectroscopy is a chemical analysis technique capable of probing the chemical structure, crystallinity, and molecular interactions of materials.

    3. The contact area normalized with respect to the initial value at t = 0 is ~1 (22), as shown in Fig. 4C

      The authors defined the contact area as the area of graphene atoms within the range of chemical interaction of the DLC tip atoms. The normalized contact area is the contact area at any time t divided by the initial contact area at t = 0 (when the graphene patches are fully expanded).

    4. To further explore the superlubricity mechanism, we performed molecular dynamics (MD) simulations (table S1)

      To elucidate the mechanism of graphene nanoscroll formation and the origin of the superlubric state, the authors conducted computer simulation studies.

    5. Our experiments suggest that the humid environment

      To investigate the effect of environmental conditions on nanoscale friction and superlubricity, the authors conducted experiments in humid air in place of dry nitrogen.

    1. computed by subtracting the climatological temperature value (17) for the month in which the profile was measured

      For example, if a temperature profile was taken on Feb. 19th, for each depth in that profile the authors subtracted out the average value for all Februaries over the last 50 years for each depth point. This removes the seasonal temperature changes from the data-set, allowing the authors to focus on the long term variability instead.
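
      A minimal sketch of this anomaly calculation in Python, using hypothetical temperatures at a single depth:

      ```python
      import pandas as pd

      # Hypothetical monthly temperatures (degrees C) at one depth.
      temps = pd.Series(
          [10.2, 12.5, 10.4, 12.9],
          index=pd.to_datetime(["1960-02-15", "1960-08-10", "1961-02-19", "1961-08-05"]),
      )

      # Climatology: the long-term mean for each calendar month.
      climatology = temps.groupby(temps.index.month).transform("mean")
      anomaly = temps - climatology  # seasonal cycle removed
      print(anomaly)
      ```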

    2. computed the contribution to the vertically integrated field shown in Fig. 3B from each 500-m layer

      By examining the ocean in distinct depth increments of 500m each, the authors aimed to determine where most of the cooling and heating of the North Atlantic is occurring.

    3. computed as the time derivative of heat content

      The authors calculated how quickly heat is being stored in the ocean (the heat storage rate) from the change in heat content between successive periods: from 1955-59 to 1970-74, and from 1970-74 to 1988-92.

      By integrating, or combining, the temperature data for all depths between 0 and 300 m and between 0 and 3000 m, the authors aimed to examine where the heat is going: into the surface ocean or the deep ocean.
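
      The underlying calculation can be sketched as follows (the constants are typical seawater values and the anomaly array is hypothetical; this is not the authors' code):

        import numpy as np

        RHO = 1025.0  # seawater density, kg/m^3 (typical value)
        CP = 3990.0   # specific heat of seawater, J/(kg K) (typical value)

        layer_thickness = 500.0  # m; six layers span 0-3000 m
        t_anom = np.array([0.30, 0.18, 0.10, 0.06, 0.03, 0.02])  # mean anomaly (K) per layer

        # vertically integrated heat content anomaly over 0-3000 m, J/m^2
        heat_content = RHO * CP * np.sum(t_anom * layer_thickness)

        # the heat storage rate is then the difference in heat content between two
        # periods divided by the time separating them, giving W/m^2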

    4. running 5-year composites

      The authors combine 5 years of data into one value, typically by averaging all values. A running composite (sometimes known as a moving or running average) is calculated by creating a series of averages of different subsets of the full data set in order to compare to the original data set.

      Calculating a running composite is a common technique used with time series data in order to smooth out short-term fluctuations and highlight longer-term trends or cycles.

      For more information on how to calculate a moving average: http://www.statisticshowto.com/moving-average/
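
      A minimal sketch of a running composite in Python (toy numbers, not the study's data):

        import numpy as np

        def running_composite(values, window=5):
            # average each block of `window` consecutive yearly values
            kernel = np.ones(window) / window
            return np.convolve(values, kernel, mode="valid")

        yearly = np.array([14.1, 14.3, 14.0, 14.4, 14.6, 14.2, 14.5])
        print(running_composite(yearly))  # each output value averages 5 years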

    5. Computation of the anomaly fields was similar to our earlier work (7), but some procedures were changed

      The authors calculated temperature anomalies by subtracting the seasonal temperature cycle from monthly data values.

      Every now and then, a shipboard temperature instrument can malfunction or be used incorrectly, recording temperatures far higher or far lower than what is realistic. These values need to be excluded to accurately study the oceans. To make sure such errors did not affect the study, the authors only kept data points that fell within a particular range, with cutoffs at the high and low ends.

      Unlike their previous work, the authors used a less strict cutoff for when data values were considered good enough to use in their analysis. This is because they found that some large-scale temperature features were mistakenly being flagged as "bad" data under the stricter cutoff, despite those features being real and measurable events in the ocean.

    6. yearly and year-season objectively analyzed temperature

      Using data available in the World Ocean Database, Levitus et al. looked at both annual temperature data and average season temperatures within each year (for winter, spring, summer, fall).

      However, because temperature changes over the course of a year due to the changing of the seasons (summers are warm, winters are cold), this seasonality must be taken into account when studying the overall change in ocean temperatures. To "objectively analyze" the data, the natural seasonal temperature cycle was subtracted from each data point in order to focus on the trends over time.

      For each monthly temperature data point, the long-term average temperature for that calendar month was subtracted. The difference between the data point and the monthly average is called an anomaly.

    7. Using these data, yearly, objectively analyzed, gridded analyses of the existing data were prepared and distributed (7) for individual years for the period 1960 to 1990.

      The authors averaged monthly oceanographic data acquired by ship-of-opportunity and research vessels into annual temperature measurements for every year from 1960 to 1990.

      Scientists measure locations on Earth using longitude (180° E ↔ 180° W) and latitude (90° N ↔ 90° S).  Lines of longitude and latitude cross to create a grid across the planet.

      For this study, Levitus et al. combined temperature data for every 1° longitude by 1° latitude area of the ocean. Where multiple ships frequented the same location, those multiple data points were averaged into one value for each matching depth.
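
      Conceptually, the gridding step amounts to binning observations into 1-degree cells and averaging. A rough Python sketch (toy observations, hypothetical column names):

        import numpy as np
        import pandas as pd

        # toy shipboard observations: latitude, longitude (degrees), depth (m), temp (deg C)
        obs = pd.DataFrame({
            "lat":   [43.2, 43.7, 43.4, 50.1],
            "lon":   [-30.5, -30.9, -30.2, -20.8],
            "depth": [0, 0, 0, 0],
            "temp":  [14.1, 14.5, 14.3, 10.2],
        })

        # assign each observation to a 1-degree by 1-degree grid cell
        obs["lat_bin"] = np.floor(obs["lat"])
        obs["lon_bin"] = np.floor(obs["lon"])

        # average all measurements that fall in the same cell at the same depth
        gridded = obs.groupby(["lat_bin", "lon_bin", "depth"])["temp"].mean()
        print(gridded)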

    1. unconstrained canonical correspondence analysis

      Refers to a statistical method that searches for multivariate relationships between two data sets.

      This method is most often used in genetics and ecological sciences. Learn more about why, how, and when to use it here.

    1. To investigate PFS constraints on REPAIRv1, we designed a plasmid library that carries a series of four randomized nucleotides at the 5′ end of a target site on the Cluc transcript

      Though the authors had already characterized PFS preferences for Cas13b, they needed to check that fusing Cas13b to ADAR did not change its targeting efficiency and specificity. The researchers also wanted to confirm how the PFS constrains RNA editing. This matters because DNA base editors are limited by the PAM requirements of Cas9 and Cpf1; RNA editing is potentially more flexible because, with permissive PFS rules, nearly any site in the transcriptome can be targeted. Therefore, it was important to check the PFS constraints again.

    2. We mutated residues in ADAR2DD(E488Q) previously determined to contact the duplex region of the target RNA (Fig. 6A) (19).

      The researchers mutated amino acids in ADAR involved with binding to the RNA target or catalytic deamination to test whether they affected deamination.

  11. Mar 2019
    1. A gradual increase in homozygosity was then observed over the next five generations (Fig. 2D), as expected from the small number of breeding pairs

      The authors measured genetic diversity through the inbreeding coefficient and average nucleotide diversity, finding that the pattern of decreasing genetic diversity was as expected for an inbreeding population.

    2. the allele frequency

      The authors measure allele frequencies to see if they are changing, which is an indication of continuing natural selection.

    3. more detailed morphological analysis

      The authors analyzed body size and several components of bill size to give a more complete comparison of the Big Bird lineage to both of its parental species. Results of these analyses are found in Figure 3.

    1. To probe the functional connectivity between these layer V projection neurons and STN in the PD animals, we conducted a separated-optrode experiment in anesthetized animals in which the fiber-optic and recording electrodes were placed in two different brain regions in Thy1::ChR2 animals

      Based upon previous findings that the cortex and STN are connected, the investigators wanted to know if driving M1 layer V neurons had an effect on STN neuronal firing and subsequent behavioral output. So they placed an optrode over M1 and a recording electrode in the STN.

    2. we used Thy1::ChR2 transgenic mice (22, 23) in which ChR2 is expressed in projection neurons, and we verified that in Thy1::ChR2 line 18, ChR2-YFP is excluded from cell bodies in the STN but is abundant in afferent fibers

      Thy1 (thymocyte differentiation antigen 1) is expressed in the axonal projections of mature neurons. When its promoter is used to control ChR2 expression, the protein is expressed in projection neurons rather than in the somata of local neurons.

    3. The STN is a predominantly excitatory structure (30) embedded within an inhibitory network. This anatomical arrangement enables a targeting strategy for selective STN inhibition (Fig. 1B), in which enhanced NpHR (eNpHR) (21) is expressed under control of the calcium/calmodulin-dependent protein kinase IIα (CaMKIIα) promoter, which is selective for excitatory glutamatergic neurons and not inhibitory cells, fibers of passage, glia, or neighboring structures

      Since the subthalamic nucleus is excitatory, meaning the neurons within release the neurotransmitter glutamate, selectively targeting this region can be accomplished using the calcium/calmodulin-dependent protein kinase IIα (CaMKIIα) promoter. Placing a gene downstream of the CaMKIIα promoter will cause the gene to be selectively expressed only in excitatory neurons.

      The authors placed the gene sequence for halorhodopsin under the control of the CaMKIIα promoter and were able to selectively inhibit the firing of excitatory glutamatergic neurons in the subthalamic nucleus.

    1. Dot blot analysis

      A quantitative method to detect mRNA levels in the explants. Here, mRNA levels of proenkephalin are measured using this method.

      Learn more with this video from Abnova.

    2. Denervation

      A technique used to separate or eliminate a particular nerve supply to specific area(s) in the nervous system.

    1. We performed a parametric study, including varying the packing porosity (0.5, 0.7, and 0.9) and layer thickness (1, 3, 5, and 10 mm), and determined the time and amount of harvestable water for a solar flux of 1 sun

      The authors examined different parameters to optimize the water harvesting properties of the material. The porosity and thickness of the material are assumed to be the parameters having the largest effect because they determine the amount of water that can be adsorbed for a chosen compacted adsorbent layer.
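
      The parameter sweep itself is straightforward to organize in code. A minimal sketch, where run_harvesting_model is a hypothetical stand-in for the authors' heat-and-mass-transfer model:

        from itertools import product

        def run_harvesting_model(porosity, thickness_mm, solar_flux):
            # hypothetical placeholder, not real physics: returns (water in g, time in h)
            return porosity * thickness_mm * 0.1, thickness_mm / solar_flux

        for porosity, thickness in product([0.5, 0.7, 0.9], [1, 3, 5, 10]):
            water_g, time_h = run_harvesting_model(porosity, thickness, solar_flux=1.0)
            print(porosity, thickness, water_g, time_h)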

    1. Dirichlet multinomial mixtures

      Refers to a computational technique that models microbial metagenomic data by representing it as a frequency matrix of the number of times each taxon is observed in each sample.
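
      The input to such a model is simply a samples-by-taxa count matrix. A toy example of that data structure in Python (made-up counts):

        import pandas as pd

        counts = pd.DataFrame(
            {"Bacteroides": [120, 3, 45],
             "Prevotella": [2, 210, 30],
             "Faecalibacterium": [60, 15, 90]},
            index=["sample_1", "sample_2", "sample_3"])
        # a Dirichlet multinomial mixture is then fit to this matrix to cluster
        # samples into a small number of community types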

    2. To achieve a balance between number of phenotypes of interest and rates of false discovery, a stepwise approach was applied.

      The researchers wanted to ensure that the phenotypes of interest were not falsely correlated with microbiota genera. To avoid overfitting, they reduced the phenotypic data in different stages. By approaching the problem in a series of distinct stages, the researchers could determine the most important variables that affect the abundance and diversity of gut microbiota.

  12. Feb 2019
    1. Participants indicated whether it was the duty of the government to enforce regulations that would minimize the casualties in such circumstances, whether they would consider the purchase of an AV under such regulations, and whether they would consider purchasing an AV under no such regulations

      In the final experiment, the researchers presented participants with a scenario in which an AV would have to sacrifice passengers to minimize overall casualties. They were then asked whether governments should regulate autonomous vehicles so that they are programmed to minimize casualties; whether they would buy a vehicle that was subject to those regulations; and whether they would buy a vehicle that was not regulated in that way.

    2. Study four (n = 267 participants) offers another demonstration of this phenomenon. Participants were given 100 points to allocate between different types of algorithms, to indicate (i) how moral the algorithms were, (ii) how comfortable participants were for other AVs to be programmed in a given manner, and (iii) how likely participants would be to buy an AV programmed in a given manner.

      In Study 4, participants were given a "budget" of 100 points to assign to different algorithms, which was a way for researchers to look at their priorities.

      For example, if there were three algorithms, a participant could choose to allocate 20 points to the first one, 30 points to the second, and 50 to the third. This allowed the authors to directly compare how participants felt about the different algorithms presented in Study 4.

      There were three algorithms, and participants had a separate 100-point budget for each of the following questions, allocated across the algorithms:

      1. How moral is this algorithm relative to the others?
      2. If self-driving cars were programmed with this particular algorithm over the others, how comfortable would you be with it?
      3. How likely would you buy a self-driving car programmed with this algorithm?
    3. In study two (n = 451 participants), participants were presented with dilemmas that varied the number of pedestrians’ lives that could be saved, from 1 to 100.

      Participants read scenarios in which the self-driving vehicle would sacrifice its single passenger to save pedestrians, with the number of pedestrians ranging from one to 100.

      The participants were asked which situation (saving the passenger versus some number of pedestrians) would be the most moral choice.

    4. The last item in every study was an easy question (e.g., how many pedestrians were on the road) relative to the traffic situation that participants had just considered. Participants who failed this attention check (typically 10% of the sample) were discarded from subsequent analyses.

      With the last question in each study, the authors were checking if the participant was still paying attention. If the participant got the question wrong, their answers were discarded because the authors assumed that the participant was not actively engaged.

      Attention checks can help make sure that the data you are collecting is higher quality. For example, participants could start speeding through the survey without thinking through their answers, which results in data that is not informative or useful.

    5. Amazon Mechanical Turk (MTurk) platform

      Amazon Mechanical Turk (MTurk) is a website that lets users earn money for doing small tasks. It has become increasingly popular for researchers, who use it to reach a wide audience online.

      With MTurk, a large and diverse set of data can be collected in a short period of time.

    6. our scenarios did not feature any uncertainty about decision outcomes

      The scenarios presented in the study always assumed that someone would be killed by the autonomous vehicle. The authors did not look into situations where the outcome was uncertain, like if the passenger had a greater chance of survival than a pedestrian.

    7. But would people approve of government regulations imposing utilitarian algorithms in AVs, and would they be more likely to buy AVs under such regulations?

      In the final two studies, the authors wanted to discover if people approve of governments legally enforcing utilitarian algorithms in AVs, and whether people would want to buy AVs if these regulations existed.

    8. Regression analyses (see supplementary materials) showed that enthusiasm for self-driving cars was consistently greater for younger, male participants. Accordingly, all subsequent analyses included age and sex as covariates.

      Regression analyses allow you to understand how different things may or may not be related. For example, how does the number of views for a YouTube video change based on the video's content?

      In this study, the authors used regression to investigate how factors like gender, age, and religion might be related to someone's enthusiasm about self-driving cars.

      This is important for finding potential covariates in the experiment. Covariates are characteristics of the people in an experiment, like age, gender, and income. These variables cannot be controlled like experimental ones, but they can affect the outcome of an experiment and bias your results in an unexpected way. They can also reveal interesting trends in society.

      By determining the covariates that could sway the experiment, scientists can make the model more accurate. Here, the authors determined that the experiments should account for age and gender.
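
      A minimal sketch of such a regression in Python (toy data and hypothetical variable names, not the authors' analysis code):

        import pandas as pd
        import statsmodels.formula.api as smf

        # toy stand-in for the survey data (hypothetical values)
        df = pd.DataFrame({
            "enthusiasm": [70, 85, 40, 55, 90, 35],
            "age":        [25, 22, 60, 45, 30, 58],
            "sex":        ["m", "m", "f", "m", "f", "f"],
        })

        # regress enthusiasm on the demographic covariates; C() marks sex as categorical
        model = smf.ols("enthusiasm ~ age + C(sex)", data=df).fit()
        print(model.params)  # coefficients show how each covariate relates to enthusiasm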

    9. broad public exposure and a disproportionate weight in individual and public decisions about AVs

      Even though the situations where AVs would sacrifice their passengers are rare, these are emotional events that would have a large influence on the public despite their infrequency.

    10. Consider, for example, the case displayed in Fig. 1A, and assume that the most common moral attitude is that the AV should swerve. This would fit a utilitarian moral doctrine (11), according to which the moral course of action is to minimize casualties. But consider then the case displayed in Fig. 1C. The utilitarian course of action, in that situation, would be for the AV to swerve and kill its passenger, but AVs programmed to follow this course of action might discourage buyers who believe their own safety should trump other considerations.

      In Fig. 1C, the car must decide between killing its own passenger or several pedestrians. A utilitarian viewpoint, which calls for the choice resulting in the "most good," would be to sacrifice one (the passenger) to save many.

      However, people who prioritize their own safety will not want to buy a car that's programmed this way because the result could be that their car would sacrifice them.

    1. morphological measurements and whole genome sequencing

      The authors used bill size and body mass (morphological measurements) and genetic data.

      The genetic data used was the entire DNA sequence of each bird (whole-genome sequencing), not just particular sections of DNA, as had been done in the past.

    2. analysis

      The phylogenetic tree analysis consists of scanning the genomes of all 46 members of the Big Bird lineage and comparing them to the genomes of 180 other finches from Daphne Major for which there are genome data.

      A computer program is used to generate many phylogenetic trees, and mathematical parameters are used to ensure that the data are properly interpreted in the program. Then, a statistical test is used to evaluate the reliability of each split in the tree.

    1. Relationships between temperature and insect population growth rates drive logistic population increases of insects during each crop’s growing season, and they also scale the fractional survival rate of insects over the rest of the year (14), termed the diapause survival, ϕo.

      This sentence says that the relationship between temperature and insect population growth drives both patterns: warmer temperatures during the growing season produce larger insect populations, while lower temperatures during the rest of the year reduce survival. The dormancy that insects (and some other animals) enter until environmental conditions become more favorable for survival is called diapause, and the fraction that survives it is the diapause survival, ϕo.
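
      A minimal sketch of this kind of population model (illustrative parameter values, not the paper's calibrated ones):

        def grow_season(n, r, carrying_capacity, days):
            # discrete-time logistic growth over the crop's growing season
            for _ in range(days):
                n += r * n * (1 - n / carrying_capacity)
            return n

        phi_o = 0.2   # diapause survival fraction (hypothetical value)
        n = 100.0     # starting population
        for year in range(5):
            n = grow_season(n, r=0.1, carrying_capacity=1e6, days=120)  # warm season
            n *= phi_o  # rest of the year: only a fraction survives diapause
            print(year, round(n))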

  13. Jan 2019
    1. If the NPF-NPFR system were to function generally to signal the state of the Drosophila reward system, NPF levels should be increased by rewarding experiences other than mating, such as exposure to intoxicating levels of ethanol. To test this hypothesis, we exposed virgin males to ethanol vapor using an exposure paradigm previously shown to be rewarding (three 10-min exposures spaced by 1 hour) (4).

      Throughout this paper, the researchers have been making the claim that activity of the NPF system signals the fly's "reward state." According to this theory, when the fly experiences something rewarding (such as mating), the amount of NPF in the brain and the activity of associated neurons increase, which in turn leads to a reduced urge to seek additional rewards (such as alcohol). In support of this, they have shown that there is an increased amount of NPF in the flies' brains following mating. However, if their theory were true, then other rewarding experiences should have the same effect. That is, the consumption of alcohol should also increase NPF levels in the brain, since alcohol is also a source of reward (just like mating).

      To test this hypothesis, the researchers decided to measure NPF levels after exposure to both air and an alcohol vapor (the former being neutral/not rewarding, and the latter being rewarding). If their theory about NPF generally signalling reward were true, one would expect to see increased levels of NPF in the brain following exposure to alcohol (just like was observed following mating in Figure 2A).

    2. development of a preference for the odor associated with these experiences would imply that flies found the events rewarding

      You might be wondering why this step is necessary, especially for alcohol, since the researchers have already shown that flies consume more alcohol-laden food. However, in the alcohol/food consumption paradigm, it is impossible to completely dissociate the contributions of hunger and of the calories in alcohol from the objective enjoyment of the experience. Here, what is being tested is the preference for an odor (which is otherwise neutral) that has been paired in memory with the experience of alcohol consumption, independent of any confounding factors. Further, the actual test phase occurs 24 hours after the alcohol exposure, when the flies are fully sober. For these reasons, this type of odor-paired conditioning assay is a much better indicator of whether an experience or substance is objectively rewarding for a fly.

    3. We propose that the activity of the NPF-NPFR system may be a neural representation of the state of the Drosophila reward system. If so, experiences that change NPF-NPFR activity should promote behaviors that restore the system to its normal state. In this model, sexual deprivation would create an NPF deficit that increases reward-seeking behavior such as ethanol consumption. Conversely, successful copulation would create a NPF surfeit that reduces reward seeking.

      Here, the researchers put forth their theory for the mechanism driving all of the effects they have observed so far. This mechanism is based on the concept of homeostasis, which basically states that there is an "ideal state" for one's internal physiological make-up. Any time the body deviates from the ideal state, it will attempt to correct itself and return to this state. The researchers thus theorize that the amount of NPF in the brain is a marker for the "reward state."

      If the fly mates repeatedly, it experiences something highly rewarding, leading to an increase in NPF beyond "normal" levels. The fly's body thus wants to return to a normal level of NPF. To do this, it must forgo alternate rewards such as alcohol. This would explain why mated flies do not have a preference for alcohol. Sexually deprived flies, on the other hand, experience a lack of reward, which leads to a decrease in NPF beyond normal levels. To compensate for this, when these flies are exposed to a second possible reward in the form of alcohol, they consume it excessively. This would explain why sexually deprived, virgin flies show an alcohol preference.

    4. heads of males subjected to different sexual experiences: rejected-isolated, virgin-grouped, and mated-grouped

      Since past work in Drosophila, mice, and C. elegans has shown that altering NPF/NPY can affect how organisms react to alcohol, the researchers in this study hypothesized that sexual deprivation might lead to changes in levels of NPF in the flies' brains, which in turn caused them to consume more alcohol. To test this, they extracted the brains from male flies with three distinct sexual histories (mated, virgin, and sexually rejected), and measured their NPF levels.

    5. Second, we tested the effect of artificial activation of NPF neurons by expressing the heat-activated cation channel dTRPA1 (21) under NPF-GAL4 control (22).

      In order to further confirm this causal relationship, the authors used the protein dTRPA1 (which activates specific neurons at a high temperature). Specifically, they inserted dTRPA1 into all of the same neurons that also contained NPF. Therefore, at a low temperature there would be no effect, but at a high temperature all of the neurons related to NPF would be activated. If the relationship between NPF and alcohol preference were indeed causal, then artificially increasing the activity of NPF-related neurons should reduce preference for alcohol.

    6. Finally, we sought to establish whether there was a role for the repellant chemosensory cue

      Since cis-vaccenyl acetate (cVA) is used by females to turn away unwanted mating attempts by males, it is only released by females who have already mated. Virgin females, who may still be looking to mate, do not release this chemical cue. This difference raises the possibility that the increased preference for alcohol amongst the rejected males could be because they smelled this chemical cue (which the mated male flies would not have smelled).

    7. males were exposed individually to decapitated virgin females on the same schedule as the rejected-isolated cohort, using a protocol that results in courtship suppression

      To test this possibility, they exposed the virgin male flies to dead female flies. These virgin males were still exposed to the visual cue of a female fly (especially the rear half of the body where mating occurs). However, they were neither able to mate, nor were they exposed to active rejection by a female.

    8. we compared males that differed in sexual experience but not in housing conditions—that is, mated and virgin males that were both group-housed.

      For the first follow-up experiment, the researchers ensured that both the mated male flies and the virgin male flies were housed in large groups (consisting exclusively of other male flies). This was done to see if social isolation (as the virgin male flies in the initial experiment might have experienced) might have been the cause for the alcohol preference.

    9. We compared males trained with either mated females (rejected-isolated) or decapitated virgin females (11). Both groups endured sexual deprivation (lack of copulation), but only the former was exposed to cVA.

      To test the possibility that cis-vaccenyl acetate (cVA) was the driving factor behind the alcohol preference amongst rejected males, the researchers conducted a follow up experiment in which one group of males was exposed to mated females, and the other was exposed to dead virgin females. Both of these groups thus experienced sexual rejection. However, the group exposed to the dead virgin females would not be exposed to the cVA cue, whereas the group exposed to mated females would. If there were a difference in alcohol preference between these two cohorts of male flies, then one could conclude that the preference for alcohol was not caused by sexual deprivation specifically, but instead due to exposure to cVA.

    10. We used two distinct sexual experiences to generate two cohorts of male flies.

      To test the effect of sexual experience on alcohol consumption, the researchers first had to create two groups of male flies, each with a different sexual history. They did this by allowing one group to mate freely with multiple different females for four days. This became the "mated-grouped cohort." The second group was only exposed to female flies who had already mated. Female flies who have already mated once will not mate a second time until they have laid their eggs. For this reason, the second group of males experienced repeated sexual rejection by the mated females for four days. This became the "rejected-isolated cohort." These two cohorts, (one mated and the other sexually rejected) could then be tested for the differences between them.

    11. ethanol preference index

      A calculated score, from –1 to 1, that indicates how much the flies preferred the alcohol-spiked food relative to the regular food. A number above zero means the flies ate more food with alcohol in it, and a number below zero means they ate more regular, nonalcoholic food.
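
      A common way to compute this kind of two-choice index, sketched in Python (an assumed form; the paper's methods give the exact definition):

        def ethanol_preference_index(ethanol_food, plain_food):
            # (E - P) / (E + P): +1 = only ethanol food eaten, -1 = only plain food
            return (ethanol_food - plain_food) / (ethanol_food + plain_food)

        print(ethanol_preference_index(0.6, 0.4))  # 0.2, a mild ethanol preference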

    12. food with or without 15% ethanol supplementation

      Drosophila are typically fed a jellylike substance in the lab consisting largely of sugar. In this experiment, the flies were given a choice between just a regular tube of food or a tube that contained the food mixed with alcohol.

    13. two-choice preference assay

      A behavioral model in which an organism is offered a choice between two alternatives. Typically, the organism is allowed to freely choose amongst these alternatives, and its choices are used to define whether or not the organism has a preference for one alternative over another.

    1. Furthermore, we speculated that different transcriptome states could also potentially alter the number of off-targeting events.

      Different cell types specialized to performing particular functions require different proteins for their work. Therefore, they have distinct transcriptomes.

      Moreover, a cell's transcriptome changes depending on where it is in the cell cycle and the conditions and needs of the surrounding environment.

      As a result, off-targets of the REPAIR system may vary from one cell type to another, reflecting changes in the presence and abundance of individual transcripts.

    2. REPAIRv1 off-target edits were predicted to result in numerous variants, including 1000 missense base changes (fig. S13C), with 93 events in genes related to cancer processes

      Earlier the authors only counted off-targets and did not look at the effect of adenosine modification on the transcript function. Here, they determined the position of the modification and the possible impact on the amino acid choice if the edit was in the protein coding region.

    3. We selected a subset of these mutants (Fig. 6B) for transcriptome-wide specificity profiling by next-generation sequencing.

      The recovery of the Cluc signal is an easy way to get the first look at the activity of the mutants. However, a single transcript cannot reflect changes in the whole transcriptome.

    4. we tested 17 single mutants with both targeting and nontargeting guides, under the assumption that background luciferase restoration in the nontargeting condition would be indicative of broader off-target activity.

      It would be too expensive to test all the mutants by RNA sequencing to find whole-transcriptome off-targets. Instead, the authors looked at a single transcript for Cluc luciferase.

      The authors assumed that if luciferase activity was restored even with non-targeting guide RNAs, it would indicate broader off-target editing by the ADAR2 fusion.

    5. To reduce the size, we tested a variety of N-terminal and C-terminal truncations of dCas13

      AAVs can accommodate up to 4.7 kb, meaning dCas13b plus the extra regulatory sequences is too big to fit in one virus. To solve this problem, the authors truncated (shortened) dCas13b by removing sections of the protein with unnecessary functions.

      The dCas13b nuclease is composed of different domains (parts), some of which mediate RNA binding and others RNA cleavage. The RNA cleavage is mediated by two well-defined HEPN domains located close to the N- and C-termini of the protein. It is still not fully clear which domains mediate binding.

      However, the REPAIRv1 system needs only the RNA binding ability of dCas13b. Therefore, the researchers were able to remove sections of dCas13b containing HEPN while retaining its binding function.

    6. AAV vectors have a packaging limit of 4.7 kb

      Almost the entire AAV genome can be replaced with a desired construct of up to 4.7 kb. This limits how much genetic material the virus can carry, so some essential viral genes are supplied on an additional construct or constructs (a so-called helper plasmid).

      Learn more about AAV at Addgene.

    7. We then tested the ability of REPAIRv1 to correct 34 different disease-relevant G→A mutations

      The researchers chose 34 disease causing mutations to test. Each mutation was a G to A substitution that changed the amino acid sequence of the protein and disrupted normal function.

    8. Using guide RNAs containing 50-nt spacers

      The authors chose three 50-nt spacers to target two genes carrying disease-relevant mutations. As determined in the previous step, 50-nt spacers showed higher rates of editing, but also more off-target effects, than 30-nt spacers.

    9. To demonstrate the broad applicability of the REPAIRv1 system for RNA editing in mammalian cells, we designed REPAIRv1 guides against two disease-relevant mutations

      After the authors had generated successful REPAIRv1-mediated editing of mRNA for an exogenous gene (i.e., a gene introduced from outside the organism), they chose two endogenous genes (i.e., native to the organism) to further explore the power of REPAIRv1.

      They tested Cas13b nuclease activity in the same way as they did with the exogenous gene.

    10. we modified the linker between dCas13b and ADAR2DD(E488Q)

      The dCas13b and ADAR2 protein parts are joined together via a stretch of amino acids called a linker. There are more than a thousand linker variants in different multi-domain proteins. The sequence of a linker can influence protein folding and stability, as well as functional properties of individual domains. Therefore, the choice of linker is an important step in protein design.

      The authors tested linkers of different length and flexibility and found that shorter and more flexible variants produced the best results.

    11. To validate that restoration of luciferase activity was due to bona fide editing events, we directly measured REPAIRv1-mediated editing of Cluc transcripts via reverse transcription and targeted next-generation sequencing.

      The ADAR deaminase can introduce changes not only into the target adenosine but also in the surrounding adenosine bases. To check how specific the dCas13-ADAR2 protein was in targeting the correct adenosine base, the researchers sequenced edited Cluc transcripts and determined all positions with A to I substitutions.

      Further, they used tiling gRNA molecules in order to determine the influence of the spacer length and the mismatch distance on off-target editing.

    12. position-matched guides

      The authors directly compared the RNA knockdown efficiency of two technologies, Cas13 cleavage and RNA interference (RNAi). Both technologies require guide RNA (gRNA) molecules for targeted recognition. Cas13 is directed by a gRNA, while RNAi complex uses a molecule termed shRNA.

      Different parts of an RNA molecule can be more or less accessible to guide molecules. Therefore, gRNA and shRNA target sequences have to be selected so that they are close to each other, i.e., position-matched, allowing a fairer comparison between Cas13 and RNAi.

    13. Cas13b from Prevotella sp. P5-125 (PspCas13b) and Cas13b from Porphyromonas gulae (PguCas13b) C-terminally fused to the HIV Rev nuclear export sequence (NES), and Cas13b from Riemerella anatipestifer (RanCas13b) C-terminally fused to the mitogen-activated protein kinase NES

      Why did the authors test several NESs from different proteins?

      Over 200 NESs from different proteins have been described. Each NES is around 10 amino acids long and has a unique structure. The variation between NESs means that they have different effects on export efficiency and protein stability in different environments.

    14. To engineer a PspCas13b lacking nuclease activity (dPspCas13b, referred to as dCas13b hereafter), we mutated conserved catalytic residues in the HEPN domains and observed loss of luciferase RNA knockdown

      The authors hypothesized that even without nuclease activity, Cas13b would still be able to recognize target molecules directed by a gRNA.

      Mutations of the key part of the protein responsible for RNA cleavage eliminated Cas13b's catalytic ability. The next step was to test whether the mutated Cas13b was still able to find the target sequences, even if it couldn't cleave them.

    15. We found that LwaCas13a and PspCas13b had a central region that was relatively intolerant to single mismatches

      The sequences with substitutions in the middle part (between 12 and 26 nucleotides) had the lowest depletion scores. A lower depletion score means that the nuclease was not as successful at cleaving the RNA sequence (i.e. depleting expression).

    16. To characterize the interference specificities of PspCas13b and LwaCas13a, we designed a plasmid library of luciferase targets containing single mismatches and double mismatches throughout the target sequence and the three flanking 5′ and 3′ base pairs

      To figure out how specifically the Cas13 orthologs would target RNA, they created a "library" (collection) of sequences with one or two mutations in the gRNA target site for the Gluc gene. The idea was to see how different a sequence could be and still be recognized by the nuclease.

      Remember that the goal is to find a highly specific nuclease. This means that ideally the nuclease would not recognize sequences with mutations.

    17. without msfGFP

      The authors ran the assay without msfGFP to determine if any of the orthologs do not require stabilization domains. This is important because orthologs that do not require these domains could be used when an experimental construct needs to be small.

    18. For each Cas13 ortholog, we designed PFS-compatible guide RNAs, using the Cas13b PFS motifs derived from an ampicillin interference assay

      Cas13a prefers to recognize target sequences when a spacer (target sequence) is surrounded by specific PFS motifs.

      To determine which PFS sequences Cas13b orthologs prefer, the researchers used an ampicillin interference assay. They transformed bacterial cells with two plasmids: One contained a Cas13b ortholog while the second plasmid included an ampicillin resistance gene and a gRNA against this gene with randomly generated 5' and 3' PFS around the target sequence. If the PFS led to the Cas13b ortholog targeting the correct gene, bacterial cells would die because they would lose ampicillin resistance when the gene was cleaved.

      The authors collected and analyzed cells that survived to determine which PFS sequences were irrelevant for targeting. They then subtracted these PFSs from the starting pool to determine which PFSs are preferred by Cas13b orthologs.

    19. To assay interference in mammalian cells, we designed a dual-reporter construct expressing the independent Gaussia (Gluc) and Cypridina (Cluc) luciferases under separate promoters, allowing one luciferase to function as a measure of Cas13 interference activity and the other to serve as an internal control.

      To monitor the effect of Cas13 nuclease activity, the researchers constructed a vector that contained genes for two different luciferases. These luciferases use different substrates (source materials) to generate light.

      One luciferase was used to measure Cas13 interference, and the other was used as a control. In cells where Cas13 did not interfere, both luciferases would emit light. In cells with interference, however, activity from one of the luciferases would decrease.

    1. A genome-wide PCA analysis

      Analyzing multiple whole genomes at the same time is so complex that scientists needed to simplify the data to make it easier to see patterns and differences between the genomes. The mathematical tool that they used was principal component analysis (PCA), which reduces the complexity of the data while retaining most of its variance.

      PCA works by finding principal components (e.g., PC1, PC2, PC3), which are essentially the directions in which the data has the largest spread.

      In this case, PC1 and PC2 are studied.
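
      A minimal sketch of a genotype PCA in Python (random toy data standing in for real genotype calls, not the authors' pipeline):

        import numpy as np
        from sklearn.decomposition import PCA

        rng = np.random.default_rng(0)
        # one row per dog, one column per variant (0/1/2 copies of the alternate allele)
        genotypes = rng.integers(0, 3, size=(20, 1000)).astype(float)

        pca = PCA(n_components=2)
        coords = pca.fit_transform(genotypes)  # PC1 and PC2 coordinates for each dog
        print(pca.explained_variance_ratio_)   # fraction of the spread each PC captures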

    2. sequenced and analyzed 59 hypervariable mtDNA fragments from ancient dogs spread across Europe, and we combined those with 167 modern sequences

      The d-loop sequences of mitochondrial DNA are considered to be mutational hotspots (places where mutations appear to happen more frequently than others), and therefore looking at this region can provide important information about how evolution occurred.

      Modern dog sequences and seven of the 59 ancient d-loop sequences were already available on a public database. The rest of the sequences were generated from ancient DNA samples by polymerase chain reaction (PCR). The sequences were then compared to each other using DomeTree, a program that creates haplogroup phylogenetic trees based on mitochondrial DNA.

    3. we used the radiocarbon age of the Newgrange dog to calibrate the mutation rate for dogs

      The team of scientists used radiocarbon dating techniques to estimate the dog's age at 4,700 to 4,900 years.

      Since they knew how old the Newgrange dog was, they were able to estimate the time at which the Newgrange dog and the Portuguese village dogs last had a common ancestor, and figure out when they diverged from each other.

      Based on the assumption that a new generation of dogs was born every 3 years, the scientists were able to calculate the mutation rate.
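
      The generation count is simple arithmetic; the rest of the calibration is more involved. A back-of-the-envelope sketch (the divergence term here is hypothetical):

        age_years = 4800        # approximate radiocarbon age of the Newgrange dog
        generation_time = 3     # assumed years per dog generation
        generations = age_years / generation_time  # = 1600 generations

        # with d = per-site divergence accumulated since the common ancestor
        # (hypothetical value from the sequence comparison), the per-generation
        # mutation rate would follow as:
        # rate = d / generations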

    4. we defined Western Eurasian and East Asian “core” groups (Fig. 1A), supported by the strength of the node leading to each cluster (12).

      The information represented in Figure 1A was used to define the two core groups. Placement in a group required very high bootstrap values (>90), indicating strong statistical support for the node leading to each cluster.

      The Western Eurasian core group consisted of all modern breeds and Portuguese village dogs. The East Asian core group consisted of the Shar-Pei; village dogs from China, Tibet, and Vietnam; and Tibetan Mastiffs.

    5. We used principal components analysis (PCA), D statistics, and the program TreeMix (12) to further test this pattern.

      Three tools for mining the data for ancestry information.

      Principal components analysis (PCA) simplifies the data. The number of variables is reduced but the trends and patterns are retained.

      The D statistic detects admixture events; it is used to detect gene flow between closely related species (a minimal example appears after this list).

      TreeMix is a genetics computer program used to estimate ancestral relationships. It can detect population splits and migration events.
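
      For a sense of how a D statistic works, here is a minimal sketch of Patterson's D computed from genome-wide counts of the two discordant site patterns, ABBA and BABA (hypothetical counts, not the study's values):

        def d_statistic(abba, baba):
            # D near 0 suggests no gene flow; |D| clearly above 0 suggests admixture
            return (abba - baba) / (abba + baba)

        print(d_statistic(10500, 9500))  # 0.05 with these made-up counts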

    6. radiocarbon dated

      This technique allows scientists to estimate the age of a plant or animal based on the amount of carbon-14 that is present at the time of measurement. Carbon-14, also known as radiocarbon, is a weakly radioactive isotope of carbon that decays over time. The age of samples up to 60,000 years old can be estimated.

      To learn more about radiocarbon dating, check out this video from Scientific American.
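
      As a rough illustration, the conventional radiocarbon age follows from the measured fraction of carbon-14 remaining, using the standard Libby mean life of 8,033 years:

        import math

        def radiocarbon_age(fraction_remaining):
            # conventional age in years from the surviving 14C fraction
            return -8033 * math.log(fraction_remaining)

        print(radiocarbon_age(0.55))  # ~4,800 years, roughly the Newgrange dog's age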

    7. CanineHD 170,000 (170 K)

      A type of genetic test that covers 170,000 single-position variations in the genome of a dog. It is essentially a chip, upon which the genetic material of the dog of interest is placed. It contains thousands (170,000 in this case) of probes—short DNA sequences that can stick to the complementary sequence in the sample, if that matching DNA variant is present. Each interaction can be recorded to easily measure the presence of a large number of genes at the same time.

    8. (28x) of an ancient dog dated to ~4800 calendar years before the present (12) from the Neolithic passage grave complex of Newgrange (Sí an Bhrú) in Ireland.

      The scientists isolated DNA from a portion of the temporal bone in the dog's skull.

      They made a library of single-stranded DNA sequences: smaller pieces of DNA that together represent the entire genome of the dog. The DNA library is sequenced on a machine that can read the order of the bases (As, Ts, Gs, and Cs) that make up the genome of the particular dog being studied.

      Check out this video from TED-Ed on how to sequence the human genome (it also applies to the dog genome).

    9. sequences from European dogs (from 14,000 to 3000 years ago)

      They sequenced the d-loop of mitochondrial DNA, an area where mutations happen more often than other parts of mitochondrial DNA.

      Seven of the d-loop sequences were already available on a public database, and the others were generated from DNA in bone samples. A very small amount of bone was ground to a fine powder. The scientists were very careful to make sure contamination of the samples did not occur. Once the cells in the bone sample were broken open and the DNA was isolated from other parts of the cell, the DNA could be amplified by polymerase chain reaction (PCR) and then sequenced.

      Check out this video from Khan Academy to learn how DNA is amplified by PCR.

    10. ancient

      In this case, the DNA specimens are ~14,000 to 3000 years old.

  14. Dec 2018
    1. Ecdysozoa (molting animals) is a major protostome clade (Figure 1) proposed by Aguinaldo et al. (1997) that includes the large phyla Arthropoda (Figure 3) and Nematoda (both of tremendous ecological, economic, and biomedical importance) and their close relatives (Tardigrada [water bears, 1150 species], Nematomorpha [351 species], Onychophora [velvet worms; 182 species], Kinorhyncha [179 species], Loricifera [30 species], and Priapulida [19 species]). Ecdysozoans are characterized by their ability to molt the cuticle during their life cycle, and for having reduced epithelial ciliation, which requires locomotion via muscular action. They include segmented or unsegmented, acoelomate, pseudocoelomate, or coelomate animals; many have annulated cuticles and a mouth located at the end of a protrusible or telescopic proboscis, and some lack circular musculature (i.e., Nematoda, Nematomorpha, Tardigrada). Here we restrict the proposed sampling to the noninsect and nonnematode ecdysozoans. The clade includes the only animals (loriciferans) thought to complete their life cycles in anoxic environments (Danovaro et al. 2010). This group is also relevant for studies of extreme cell size reduction. Other than the mentioned arthropod and nematode genomes, no genome is available for any member of the Ecdysozoa.

      This is one of the constraints placed on researchers who plan to contribute to this database: the proposed sampling is restricted to non-insect, non-nematode ecdysozoans.

    2. Many of the roughly 70 invertebrate species whose genomes have been sequenced belong to the Arthropoda or Nematoda, although the number of other invertebrate genomes continues to grow (e.g., Olson et al. 2012; Takeuchi et al. 2012; Zhang et al. 2012; Simakov et al. 2013; Tsai et al. 2013; Flot et al. 2013). We propose to focus on noninsect/nonnematode phyla, and specifically on an important group of currently neglected arthropods, the crustaceans.

      The authors propose a focus for researchers who wish to contribute to the database: currently neglected groups of arthropods, specifically the crustaceans.

    3. GIGA has adopted a set of standards and best practices to help ensure that genomic resources, data, and associated metadata are acquired, documented, disseminated, and stored in ways that are directly comparable among projects and laboratories. These data should be easily and equitably shared among GIGA members and the broader scientific community, and GIGA will obey appropriate laws and regulations governing the protection of natural biodiversity. Briefly, all genome projects will report on a set of parameters that will allow assessment of genome assembly, annotation, and completeness (e.g., NG50, N50 of contigs and scaffolds, number of genes, assembled vs. estimated genome size) (Jeffery et al. 2013). Detailed descriptions of these standards and compliant protocols will be posted on the GIGA Web site. These will be revised periodically to facilitate the establishment and maintenance of current best practices common to many invertebrate genome and transcriptome sequencing projects and to help guide the researcher in selecting and assessing genomes for further analyses. The following recommendations summarize minimal project-wide standards designed to accommodate the large diversity of invertebrates, including extremely small and rare organisms, as well as those that live in close association with other organisms.

       Permissions: GIGA participants must comply with treaties, laws, and regulations regarding acquisition of specimens or samples, publication of sequence data, and distribution or commercialization of data or materials derived from biological resources. Participants must acquire all necessary permits required for collection and transport of biological materials prior to the onset of the work. The CBD recognizes the sovereignty of each nation over its biological resources, and under the auspices of the CBD, many nations and jurisdictions rigorously regulate the use and distribution of biological materials and data. GIGA participants must be aware of these regulations and respect the established rights of potential stakeholders, including nations, states, municipalities, commercial concerns, indigenous populations, and individual citizens, with respect to any materials being collected, to all derivatives and progeny of those materials, and to all intellectual property derived from them. GIGA participants must also familiarize themselves with the conservation status of organisms to be sampled and any special permits that may be required (e.g., CITES). Moreover, GIGA participants should collect in ways that minimize impacts to the sampled species and their associated environments.

       Field collection and shipping: Methods for field collection and preservation of specimens and tissues should be compatible with recovery of high-quality (e.g., high molecular weight, minimally degraded) genomic DNA and RNA (Dawson et al. 1998; Riesgo et al. 2012; Wong et al. 2012). Many reagents commonly used for tissue and nucleic acid preservation (e.g., ethanol, dry ice) are regulated as hazardous and/or flammable materials. These reagents may be restricted from checked and carry-on luggage and may require special precautions for shipping or transport. GIGA participants should contact the appropriate airline carriers or shippers for information regarding safe and legal shipment of preserved biological materials. When possible, multiple samples will be collected so that extractions can be optimized and samples resequenced as technologies improve. Specimens of known origin (i.e., field-collected material) will be favored over specimens of unknown origin (e.g., material purchased from the aquarium trade). Collection data will include location (ideally, with GPS coordinates) and date, and also other data such as site photographs and environmental measurements (e.g., salinity) when relevant.

       Selection and preparation of tissues: It is often advisable to avoid tissues that may contain high concentrations of nucleases, foreign nucleic acids, large amounts of mucus, lipid, fat, wax, or glycogen or that are insoluble, chitinous, or mineralized. To obtain the highest quality material for sequencing or library construction, it may be preferable to extract nucleic acids from living or rapidly preserved tissue from freshly sacrificed animals, from gametes or embryos, or from cell lines cultivated from the target organism (Ryder 2005; Rinkevich 2011; Pomponi et al. 2013). When appropriate, select tissues or life history stages that will avoid contamination by symbionts, parasites, commensal organisms, gut contents, and incidentally associated biological and nonbiological material. Whenever possible, DNA or RNA will be sequenced from a single individual because many taxa display sufficient polymorphism among individuals to complicate assembly. Similarly, heterozygosity can also hinder assembly: inbreeding may be used to reduce heterozygosity (Zhang et al. 2012) or, when crossings are impossible (for instance in asexual species), haplotypes may have to be assembled separately (Flot et al. 2013).

       Quantity and Quality: The quantity of DNA or RNA required for sequencing varies widely depending on the sequencing platform and library construction methods to be used and should be carefully considered. Recent consensus from the G10KCOS group of scientists suggests that at least 200–800 µg of high-quality genomic DNA is required to begin any project because of the requirement for large insert mate-pair libraries (Wong et al. 2012). However, these minimum quantities are expected to decline with improving technology. DNA quality can be assessed by size visualizations and 260/280 nm ratios. Quality of RNA will be checked using RNA integrity number (RIN > 7 is preferred); however, these values have been shown to appear degraded in arthropods due to artifacts during quantification (Schroeder et al. 2006; Winnebeck et al. 2010).

       Taxonomic identity: The taxonomic identity of source organisms must be verified. Whenever possible, consensus should be sought from expert systematists, supportive literature, and sequence analysis of diagnostic genes (see next section).

       Voucher specimens: As a prerequisite for inclusion as a GIGA sample, both morphological and nucleic acid voucher specimens must be preserved and deposited in public collections, and the associated accession numbers must be supplied to the GIGA database. Photographs should be taken of each specimen and cataloged along with other metadata. The GIGA Web site lists cooperating institutions willing to house voucher specimens for GIGA projects, such as the Smithsonian Institution or the Ocean Genome Legacy (http://www.oglf.org).

       Documentation of projects, specimens, and samples: Unique alphanumeric identification numbers (GIGA accession numbers) will be assigned to each GIGA project and to each associated specimen or sample used as a source of genome or transcriptome material for analysis. A single database with a web interface will be established to accommodate metadata for all specimens and samples. Metadata recording will also aim to coordinate and comply with previously established standards in the community, such as those recommended by the Genomics Standards Consortium (http://gensc.org/; Field et al. 2011).

       Sequencing Standards: Standards for sequencing are platform and taxon specific, and sensitive to the requirements of individual sequencing facilities. For these reasons, best practices and standards will be established for individual applications. Coverage with high-quality raw sequence data is a minimal requirement to obtain reliable assemblies. An initial sequencing run and assembly will be used to estimate repeat structure and heterozygosity. These preliminary analyses will make it possible to evaluate the need for supplemental sequencing, with alternative technologies aimed at addressing specific challenges (e.g., mate-pair sequencing to resolve contig linkage). Moreover, all raw sequence reads generated as part of a GIGA project will be submitted to the NCBI Sequence Read Archive.

       Sequence Assembly, Annotation, and Analyses: Because assemblies vary widely in quality and completeness, each assembly should be described using a minimum set of common metrics that may include: (1) N50 (or NG50) length of scaffolds and contigs (see explanation of N50 in Bradnam et al. 2013), (2) percent gaps, (3) percent detection of conserved eukaryotic genes (e.g., the Core Eukaryotic Genes Mapping Approach; Parra et al. 2007), (4) statistical assessment of assembly (Howison et al. 2013), (5) alignment to any available syntenic or physical maps (Lewin et al. 2009), and (6) mapping statistics of any available transcript data (Ryan 2013). The current paucity of whole invertebrate genome sequence projects can pose problems for gene calling, gene annotation, and identification of orthologous genes. In cases where the genome is difficult to assemble, we recommend that genome maps be developed for selected taxa via traditional methods or new methods (e.g., optical mapping of restriction sites) to aid and improve the quality of genome assembly (Lewin et al. 2009) and that GIGA genome projects be accompanied by transcriptome sequencing and analysis when possible. Such transcriptome data will assist open reading frame and gene annotation and are valuable in their own right.

      GIGA aims to set standards for how data are acquired, processed, and entered into its system. These standards will be revised as better methods emerge and are designed to cover a large variety of invertebrates. This standardization will make the resulting bank of information more accessible and usable.

    4. We also recognize the existence and formation of other recent genome science initiatives and coordination networks and will synchronize efforts with such groups through future projects. Because GIGA is an international consortium of scientists, agencies, and institutions, we will also abide by the rules of global funding agencies for data release (e.g., those proposed by the Global Research Council; http://www.globalresearchcouncil.org). We are aware that different nations have different constraints and regulations on the use of biological samples. Given the international nature of GIGA, we will work to ensure that national genomic legacies are protected and will consult with the pertinent governmental agencies in the countries from which samples originate. We will deposit sequence data in public databases (e.g., GenBank), as well as deposit DNA vouchers in publically accessible repositories (e.g. Global Genome Biodiversity Network, Smithsonian). GIGA is an inclusive enterprise that invites all interested parties to join the effort of invertebrate genomics. We will attempt to capture the impact of the effort in the wider scientific and public arenas by following relevant publications and other products that result from GIGA initiatives.

      The project coordinators will also join with other recent genome science initiatives and coordination networks to work together on future projects of common interest. They recognize that different countries place constraints on the use of biological samples, so the pertinent government agencies in the countries where samples originate will be consulted to ensure that national genomic legacies are protected.

    5. GIGA embraces a transparent process of project coordination, collaboration, and data sharing that is designed to be fair to all involved parties. The ENCODE project may be emulated in this regard (Birney 2012). We are committed to the rapid release of genomic data, minimizing the period of knowledge latency prior to public release while protecting the rights of data product developers (Contreras 2010). The data accepted as part of GIGA resources will undergo quality control steps that will follow preestablished and evolving standards (see Standards section) prior to data release. Efforts such as those of Albertin et al. (2012) have addressed data sharing issues relevant to GIGA and other large-scale genomics consortia.

      Once data have been submitted by researchers, they will go through a quality control process to make sure they meet the quality and accuracy standards the project requires, and to avoid publishing duplicate data.

    6. but there are currently no published genomes for the other 21 invertebrate phyla. We examined current phylogenetic hypotheses and selected key invertebrate species that span the phylogenetic diversity and morphological disparity on the animal tree of life (see Supplementary Material). New invertebrate genome data can reveal novel sequences with sufficient phylogenetic signal to resolve longstanding questions.

      The genome sequences available at the time were reviewed, and the missing phyla were noted. Species from these missing areas of the so-called "tree of life" were added to the list of species to be sequenced so that the data could be used to further other research.

    7. Therefore, the geographic scope of the project in terms of participation, taxa collected, stored, and sequenced, data analysis and sharing, and derived benefits, requires global partnerships beyond the individuals and institutions represented at the inaugural workshop. Because international and interinstitutional cooperation is essential for long-term success, the new GIGA Web site will be used to foster cooperative research projects. For now, the GIGA Web site can serve as a community nexus to link projects and collaborators, but it could also eventually expand to host multiple shared data sets or interactive genome browsers. The broad scope of GIGA also necessitates growth in the genomics-enabled community overall. Sequencing and analyzing the large amount of resulting data pose significant bioinformatic and computational challenges and will require the identification and creation of shared bioinformatics infrastructure resources.

      The information gathered from experimentation will be unified by being submitted to the GIGA website, allowing researchers from all over the world to access it at a moment's notice. The website will also be used to host cooperative research projects that scientists from around the world can participate in and contribute data to.