- Nov 2017
-
lantern.northwestern.pub
-
Thus, glymphatic CSF influx is sharply suppressed in conscious alert mice as compared with naturally sleeping or anesthetized littermates.
There is less CSF circulation in the brain when a mouse is awake.
-
Ketamine/xylazine anesthesia significantly increased influx of CSF tracer in all mice analyzed
The scientists confirmed that awake mice showed reduced influx of CSF compared to anesthetized mice.
Together with the previous experiment, this finding compellingly demonstrates that arousal state has a powerful effect on the influx of CSF through a mouse's brain.
-
- Oct 2017
-
www.scienceintheclassroom.org
-
Interestingly, although all five sgRNAs conferred resistance to PLX, only the best shRNA achieved sufficient knockdown to increase PLX resistance
Not only did the authors find that GeCKO more consistently identified candidate genes, they found that the genes that were identified were better at providing PLX resistance.
-
Lower P values for the GeCKO versus shRNA screen indicate better scoring consistency among sgRNAs
The authors compared the top 100 hits in each screen and found that the sgRNA screen using GeCKO was more effective.
-
Our highest-ranking genes included previously reported candidates NF1 and MED12 (18, 19) and also several genes not previously implicated in PLX resistance, including neurofibromin 2 (NF2), Cullin 3 E3 ligase (CUL3), and members of the STAGA histone acetyltransferase complex (TADA1 and TADA2B)
The RIGER algorithm identified NF1 (neurofibromin, a tumor suppressor protein that inhibits RAS), MED12 (which helps regulate transcription of RNA polymerase II-dependent genes), NF2 (the gene for merlin, which helps control cell shape, movement, and communication), CUL3 (which plays a key role in the polyubiquitination and degradation of proteins as a component of the E3 ubiquitin ligase), TADA1 (a subunit of STAGA, a chromatin-modifying protein complex), and TADA2B (which promotes transcription by organizing HAT activity). All were implicated in PLX resistance.
-
These candidates yield new testable hypotheses regarding PLX resistance mechanisms
The authors were able to identify new candidate genes for PLX resistance, and suggest that further research be done on how these genes confer this resistance.
-
lentiCRISPRs abolished EGFP fluorescence in 93 ± 8% (mean ± SD) of cells after 11 days
The efficacy of gene knockout by lentiCRISPR transduction was high: the sgRNAs abolished EGFP fluorescence in 93 ± 8% of cells in the experiment.
Further, lentiCRISPR transduction was more effective than transduction of cells with lentiviral vectors expressing EGFP-targeting shRNA, which produced an incomplete knockdown of EGFP.
-
-
www.scienceintheclassroom.org
-
To answer the second part of the question, we studied the specific protein positions targeted by selection in SXL. Significant evidence of adaptive selection was found at 15 codons (p ≤ 0.05) predominantly located at N- and C-terminal regions (Fig. 4a). These results are consistent with functional studies showing that the sex-specific properties of extant Drosophila SXL depend on its global structure, and that modifications at N- and C-terminal domains of SXL in the drosophilid lineage represented coevolutionary changes determining the appropriate folding of SXL to carry out its sex-specific function (Ruiz et al. 2013)
The SXL protein controls sex determination and dosage compensation in Drosophila. Its sex-specific properties are said to depend on its global structure rather than on any specific domain. This implies that the modifications at the N- and C-terminal domains represent co-evolutionary changes that determine the folding SXL needs to carry out its sex-determination and dosage-compensation functions. - Jake Barbee
-
Two major conclusions can be drawn from these results: first, the diversification of Sxl in dipterans seems to have been driven by episodes of adaptive selection involving amino acid replacements at specific codons in terminal protein domains. Second, the recruitment of Sxl into sex-specific roles required bursts of adaptive selection during the evolution of dipterans and most importantly in the common ancestor of drosophilids, probably taking advantage of its preexisting role as a general splicing factor (Ruiz et al. 2003; Serna et al. 2004).
Bursts of adaptive selection, involving amino acid replacements at specific codons in the terminal protein domains, are what drove the diversification of Sxl. These bursts were required for Sxl to be recruited into its sex-specific roles in dipterans, and these early changes underlie the consistency of Sxl seen today. -Elder
-
SXL (the top component of the Drosophila sex determination cascade) constitutes the slowest evolving sex determining protein in drosophilids (Fig. 2b) as well as a slow evolving protein in other insect species (see Table 2 for details). However, there is still the possibility that such a high degree of conservation is a result of the lack of sex-specific functions in insects other than Drosophila (Cline et al. 2010; Sánchez 2008). Two approaches were followed in order to explore this scenario: first, the analysis of SXL in non-drosophilid insects revealed an evolutionary rate of 0.95 × 10−3 substitutions/site per MY (Table 2), constituting a much lower rate than the one estimated for Drosophila (2.80 × 10−3 substitutions/site per MY)
This slow rate suggests the success of the SXL gene: because the evolutionary rate is slow, the need for change is limited. The continued success of this gene means natural selection maintains it with very few changes to its sequence. -Elder
-
-
journals.biologists.com
-
Our data suggest that sailfish are not able to achieve the extremely high speeds claimed by earlier studies (Barsukov, 1960; Lane, 1941). These speed assessments (approximately 35 m s−1) are based on fishermen’s records of hooked specimens and are most likely overestimations.
Here the author ties the goal of this study with the data collected. The goal of this study was to test whether earlier studies were correct in their estimations of maximum swimming speeds in sailfish. The author describes that the data collected disproves earlier studies and provides explanation as to why the data of the prior studies may have been skewed. -Kyrsten
-
Based on the estimated absolute speeds, sailfish appear to be the fastest of the four species investigated here, however, they were also 50-80 cm longer than the other three species and maximum speed is known to increase with fish length (Wardle, 1975). Using a length-speed relationship based on burst swimming performance of various species (Videler, 1993), we found that the size-corrected speed performance is highest in little tunny and barracuda, followed by dorado and sailfish (Fig. 2D).
The goal of this study was to determine whether earlier studies were correct in their determination of maximum swimming speeds in sailfish as well as comparing sailfish speeds to other large marine predators. Here the author explains that compared to the other predators, the speed performance in sailfish based on size was the lowest. This is supported with Figure 2D. -Kyrsten
-
The calculated maximum attainable swimming speeds for the four species expressed in m s−1 (A) and in Lf s−1
From graph A it can be seen that the sailfish had the highest maximum swimming speed. Next was the Barracuda, then the Little tunny. The Dorado had the lowest max swimming speed.
Mikaela
-
-
www.scienceintheclassroom.org
-
Table 2
By quantifying behavioral tolerance and intolerance of male-female transisthmian pairs, the researchers are able to compare interactions between closely related and distantly related species of snapping shrimp.
For example, as Figure 1 and Table 2 show, more closely related species have higher behavioral compatibility than more distantly related species. Although compatibility is not an indicator of producing viable offspring, it suggests that closely related species may share similar behaviors or similar niches. (JP)
-
Fig. 1 Single most parsimonious phylogenetic tree constructed on the basis of mtDNA sequences with PAUP (18). Transitions were given one-quarter the weight of transversions (based on the fourfold greater abundance of transitions than transversions in our data), and trees were rooted by the P7-P7'-C7 clade. Taxon codes are as in Table 1.
Figure 1 is a visual representation of the relationships among differing species of snapping shrimp. This diagram was made using the mitochondrial DNA sequences of the organisms and the PAUP program, which estimates the level of relatedness between the sibling species of snapping shrimp.
Comparing Table 1 with Figure 1, species that are more closely related, such as P2 and C2, had a lower mean mtDNA divergence value (6.6), whereas species that are more distantly related, such as P7' and C7, had a higher value (19.7). (JP)
-
-
www.scienceintheclassroom.org
-
Fig. 7 On this figure of electrosensory pathways in the gymnotiform fish (modified from Carr and Maler, 1986), we have labeled regions where the proposed computations for high-frequency electrolocation might be implemented.
Using results from video, electric images, and BEM simulations, the author was able to depict the electrosensory pathways in the fish and identify where the fish has receptor organs that sense electric fields. Knowing where these electroreceptors are, the author can determine where the proposed electrolocation computations might be carried out. -Michelle Oriana Gomez-Guevara
-
-
www.scienceintheclassroom.org
-
our findings suggest that the overexploitation of spawning aggregations can fundamentally alter the natural predator-prey equilibrium, limiting foraging options for reef sharks within aggregation sites.
An inverted biomass pyramid can be sustainable, but fish spawning aggregations play an important role in whether it remains so.
- D.N.B.
-
These observations confirmed that hundreds of sharks actively feed on a large variety of prey (at least 14 fish species; Figures 4 and S3). In particular, sharks feed aggressively on the large number of groupers present during spawning aggregations in June and July [13]. Shark abundance and residency times both increase when camouflage groupers (Epinephelus polyphekadion) arrive from the surrounding reef area to spawn
There is a clear correlation between the populations of prey and predators: when the camouflage groupers aggregate in order to spawn, the resulting increase in prey attracts the sharks. MSARS, WT & YS
-
Overall, sharks showed different degree of residency (mean ± SEM = 42.21% ± 7.75% of days present in the pass; range = 2.1%–95.9%; Table S3), with three transient (<20% residency), six semi-resident (20%–70% residency), and four highly resident (>70% residency) sharks (Figure S2).
The study showed that, overall, most of the sharks observed (10 of 13) spent at least 20% of their days in the pass, with only three classified as transient. YS & WT
-
-
scienceintheclassroom.org
-
Scaleless dragons show an irregular skin surface with the initiation of some dermoepidermal undulations of the skin (Fig. 3G), indicating that this phenomenon does not fully require the presence of anatomical placodes.
Mutant dragons displayed evidence of scale patterns, despite the absence of fully formed scales. This shows that there are factors other than the anatomical placode at play in scale formation.
-
-
www.scienceintheclassroom.org
-
Results
The authors used 5 measures for replication success to check to what extent the 100 original studies could be successfully replicated.
-
The overall replication evidence is summarized in Table 1
The authors wanted to know if successfully replicable studies differed from studies that could not be replicated in a systematic way. As the criterion for replication success, they used their first approach (significance testing). They found that studies from social psychology and studies from social psychology journals were less likely to replicate than those stemming from cognitive psychology and cognitive psychology journals. Moreover, studies were more likely to replicate if the original study reported a lower p-value and a larger effect size, and if the original finding was subjectively judged to be less surprising. However, successfully replicated studies were not judged to be more important for the field or conducted by more experienced research teams.
-
Moreover, correlational evidence is consistent with the conclusion that variation in the strength of initial evidence (such as original P value) was more predictive of replication success than was variation in the characteristics of the teams conducting the research (such as experience and expertise).
From this study, we can conclude that the efforts made to reduce the influence of the researchers involved in the replication studies on replication success seemed merited. Importantly, replication success was less influenced by the characteristics of the replication team than by how strong the results of the original study were. Stronger evidence for the effect the original study investigated was related to a successful replication.
-
The disadvantage of the descriptive comparison of effect sizes is that it does not provide information about the precision of either estimate or resolution of the cumulative evidence for the effect.
For the fourth measure, the authors combined the original and replication effect sizes and calculated a joint estimation of the effects. They wanted to see how many of the studies that could be analyzed this way would show an effect that was significantly different from zero if the evidence from the original study and that of the replication study was combined.
Results showed that 68% of the studies analyzed this way indicated that an effect existed. In the remaining 32% of the studies, the effect found in the original study, when combined with the data from the replication study, could no longer be detected.
-
A complementary method for evaluating replication is to test whether the original effect size is within the 95% CI of the effect size estimate from the replication.
A second way to evaluate replication success was to compare the effect sizes of the original studies and those of the replication studies using confidence intervals (CIs). The authors checked whether the original study's effect size fell within the 95% CI computed from the replication study's effect size, that is, within the range of plausible estimates of the true effect.
Using this measure, they found that slightly fewer than half the replications were successful.
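This CI-based criterion can be sketched in a few lines of Python. The function and the numbers below are purely illustrative (not taken from the paper), and the CI uses a simple normal approximation:

```python
def replication_ci_check(original_es, replication_es, replication_se):
    """Return True if the original effect size falls inside the 95% CI
    of the replication effect size (normal approximation)."""
    lower = replication_es - 1.96 * replication_se
    upper = replication_es + 1.96 * replication_se
    return lower <= original_es <= upper

# Hypothetical numbers for illustration:
# original effect 0.45; replication effect 0.20 with standard error 0.10
# -> 95% CI is roughly [0.004, 0.396], so the original effect is outside it
print(replication_ci_check(0.45, 0.20, 0.10))
```

By this criterion the hypothetical replication above would count as a failure, mirroring how slightly fewer than half the real replications succeeded.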
-
A straightforward method for evaluating replication is to test whether the replication shows a statistically significant effect (P < 0.05) with the same direction as the original study.
The authors first checked how many replications "worked" by analyzing how many replication studies showed a significant effect with the same direction (positive or negative) as the original studies.
Of the 100 original studies, 97 showed a significant effect. Based on the statistical power of the replication studies, the authors expected around 89 successful replications. However, only 35 studies were successfully replicated.
-
-
www.scienceintheclassroom.org
-
As reported previously (5), branching species were more susceptible to hurricane damage than were massive heads (Fig. 4B). In an extreme example, at a depth of 6 m on Monitor Reef on the West Fore Reef (Fig. 1, location C, and Fig. 3) the planar living areas of branching Acropora spp. were reduced by up to 99 percent (Table 2, rows 1 to 3), whereas colonies of foliaceous and encrusting Agaricia agaricites were reduced by only 23 percent (Table 2, row 6).
The shape of corals has a strong influence on how much they are damaged because shape directly affects how much force flowing water exerts on the coral's skeleton. Thinly branching corals are especially susceptible to breaking, whereas corals with thicker, leaflike, encrusting, and head forms experience much less mechanical stress during a storm.
-
- Sep 2017
-
-
Social capital, as fraught and divided as the literature may be, is ultimately just a concept; and a relatively underdeveloped one, at that. It is neither benevolent nor vengeful, and perhaps what the literature describes as social capital activation says more about the theoretical dispositions of the researchers than the concept itself.
I would love to see you submit this to the new DigiSoc@VCU Medium.com page. You would need to strengthen the conclusions. Your discussion is very powerful up until the end; then it falls a bit flat. What are the implications of this new form of digital social capital? Are there other ways of mapping and measuring it? How can SNA advance this literature? What research questions come out of the alt-right study? These are the types of questions you should explore in your conclusions.
-
-
www.scienceintheclassroom.org
-
R2 in GWAS models ranging on average from 9% for the SNP set associated with survival in England to 24% for the SNP set associated with survival in Finland
The R2 value indicates how much of the variation in a trait can be explained by the genetic markers identified. If R2 is less than 100%, then other locations in the genome or environmental factors are also influencing the trait being analysed.
In this case the SNPs associated with survival in Finland were much more important in influencing plant survival than the SNPs associated with survival in England.
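The R2 statistic being discussed can be illustrated with a short calculation. The function below computes R2 as the fraction of variance in an observed trait explained by a model's predictions; the survival scores and predictions are toy numbers invented for illustration, not data from the study:

```python
def r_squared(observed, predicted):
    """Fraction of variance in the observed trait explained by the
    model's predictions: R^2 = 1 - SS_residual / SS_total."""
    mean_obs = sum(observed) / len(observed)
    ss_tot = sum((y - mean_obs) ** 2 for y in observed)
    ss_res = sum((y - p) ** 2 for y, p in zip(observed, predicted))
    return 1 - ss_res / ss_tot

# Toy data: plant survival scores vs. predictions from a SNP-based model
observed = [0.2, 0.5, 0.9, 0.4, 0.7]
predicted = [0.3, 0.5, 0.8, 0.5, 0.6]
print(round(r_squared(observed, predicted), 2))  # -> 0.86
```

An R2 of 0.86 in this toy example would mean 86% of the variation is explained; the paper's values of 9% and 24% show how much weaker real GWAS predictions typically are.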
-
-
www.scienceintheclassroom.org
-
we found the complexes to be roughly twice the size of a single polysome
The largest protein in this experiment, KDM5B, took the longest to translate.
-
there was little interaction between the two, providing direct evidence that the vast majority of KDM5B polysomes act independently of one another. However, a small fraction (~5%) of KDM5B polysomes formed complexes that co-moved for hundreds of seconds
Most polysomes translated the FLAG tagged and HA tagged proteins independently; however, a few interacted with one another, suggesting that translation of some proteins might be co-regulated.
-
yielding a single consistent elongation rate of 10 ± 2.3 aa/sec, fairly close to what has been measured using genome-wide ribosomal profiling (5.6 aa/sec)
The translational elongation rate was consistent for each of the translated proteins, at about 10 ± 2.3 amino acids per second (roughly 8 to 12 aa/sec).
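The elongation rate follows from simple arithmetic: the protein's length divided by how long a ribosome dwells on the mRNA. The sketch below uses hypothetical numbers chosen only to illustrate the relationship (a 1,500-aa construct and a 150-second dwell time are not values from the paper):

```python
def elongation_rate(protein_length_aa, dwell_time_s):
    """Average elongation rate in amino acids per second:
    total residues synthesized divided by the ribosome's dwell time."""
    return protein_length_aa / dwell_time_s

# Hypothetical example: a 1,500-aa protein whose ribosomes take ~150 s
# to run off yields 10 aa/sec, matching the reported 10 +/- 2.3 aa/sec.
print(elongation_rate(1500, 150))  # -> 10.0
```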
-
Importantly, for all constructs, the correlation vanished at times greater than the dwell time. This implies initiation is random, so there is no memory between initiation events, similar to what was observed for transcriptional initiation (12) and in contrast to bursting
Like the process of transcription, the start of the translation process is random and there is no memory of previous initiations.
-
This suggests that the polysomes we imaged are organized in a more globular shape rather than an elongated shape, consistent with recent atomic force microscopy images
Polysomes are organized in a globular (ball-like) shape rather than an elongated structure.
-
Despite these trends, there was huge variability in mobility between mRNA, so that we sometimes saw rapidly moving KDM5B polysomes (up to 6 μm2/s) as well as nearly immobile H2B polysomes (~ 0.01 μm2/s;
Polysomes showed a lot of variation in movement through the cell. Proteins that were moving into the nucleus generally moved by diffusion, while beta-actin polysomes showed a slower pattern of movement, suggesting that the newly translated protein might be interacting with other cellular proteins.
-
detected translation sites are polysomes that can contain as few as 1 ribosome every 900 mRNA bases or as many as 1 ribosome every 200 mRNA bases
After tracking mRNAs undergoing translation, the authors concluded that polysome density is variable: there can be as few as one ribosome per 900 mRNA bases or as many as one ribosome per 200 mRNA bases.
-
We therefore estimate there are 3.1 ± 0.5 nascent peptide chains per beta-actin translation site, 2.1 ± 0.4 per H2B site, and 5.1 ± 0.9 per KDM5B site
The results suggested that there are multiple ribosomes per RNA being translated. The number of ribosomes/mRNA varied from 3-5 and seemed to correlate with the length of the mRNA.
-
Together these data suggested the co-moving spots were indeed translation sites.
Because the spots became brighter after addition of cycloheximide (which allows more ribosomes to load onto each mRNA) and because there were fewer spots after addition of puromycin, it can be concluded that the fluorescent spots in the cell are sites of translation.
-
Within minutes of drug addition, the number of co-moving spots dropped exponentially
With the addition of puromycin, the number of fluorescent spots decreased, suggesting that the spots were sites of translation.
-
- Aug 2017
-
www.scienceintheclassroom.org
-
The highly ambitious RCP2.6 scenario seems to be the only possible pathway toward more limited impacts. Only the coldest RCP2.6L simulations, which correspond broadly to the 1.5°C target of the Paris Agreement, allow ecosystem shifts to remain inside the limits experienced during the Holocene.
The RCP2.6 scenario, the pathway that is likely to limit warming to the 2.0°C threshold targeted by the Paris Agreement, would result in changes that are just within the most extreme variation of the Holocene. This pathway is considered highly ambitious as it would require achieving even more emissions reductions than are currently proposed.
The pathway developed by the authors to meet the 1.5° threshold, RCP2.6L, is the only one that would keep the ecosystem changes within the Holocene variation.
This analysis indicates that for the Mediterranean region, there will likely be significantly larger impacts if the global temperature increase reaches the threshold of 2.0°C rather than being limited to 1.5°C.
-
Our analysis shows that, in approximately one century, anthropogenic climate change without ambitious mitigation policies will likely alter ecosystems in the Mediterranean in a way that is without precedent during the past 10 millennia. Despite known uncertainties in climate models, GHG emission scenarios at the level of country commitments before the UNFCCC Paris Agreement will likely lead to the substantial expansion of deserts in much of southern Europe and northern Africa
The current scenario for global greenhouse gas emissions, taking into account all the voluntary emission reductions targets set by countries as part of the United Nations Framework Convention on Climate Change, matches best with RCP4.5. This analysis shows that the climate change associated with that pathway would likely result in changes in the Mediterranean ecosystems by the end of the century that go far beyond what was seen in the past 10,000 years.
-
Simulated warm mixed forests also extend to the Atlantic Ocean in the west of France, indicating the inability of BIOME4 to distinguish Atlantic pine forests
Even though the Atlantic pine forests and the Mediterranean vegetation types are different, the BIOME4 model is not able to sort the two types into separate categories.
-
-
www.science.org
-
J. Lewis et al., Science 293, 1487 (2001).
Lewis and colleagues made a genetically modified mouse with a mutated APP gene and a mutated tau gene.
These double mutants had plaques like other hAPP mice, but they also had tangles that were more severe than in other mice, showing that problems with amyloid and tau can interact and cause symptoms of Alzheimer’s disease.
-
We found no adverse effects of tau reduction on health or cognition in mice, and the evidence that even partial tau reduction robustly protected mice from Aβ and excitotoxic agents highlights its potential benefits.
This work started as a side project and eventually became a pillar of Dr. Roberson’s later work on mouse models of Alzheimer’s disease. It has been credited with "sparking the research community’s interest in the role of tau in the pathogenesis of Alzheimer’s disease.”
Read more in Neurology Today:
-
tau modulates sensitivity to excitotoxins and may be involved in regulating neuronal activity
Lowering tau even in normal mice (those without the hAPP gene, which would not get Alzheimer’s disease) was beneficial in making mice less prone to seizures when injected with a seizure-inducing drug.
This means that in healthy mice, normal tau may be involved in modulating excitation in neurons, reducing the overall risk of seizures.
(Seizures are a result of abnormally high electrical activity in the brain that occurs when certain neurons are excessively excited.)
-
at a dose that was not lethal to mice without hAPP (P < 0.05). Tau reduction prevented this effect, as no hAPP/Tau+/– or hAPP/Tau–/– mice died. Seizures in hAPP/Tau+/– and hAPP/Tau–/– mice were less severe and occurred at longer latencies than in hAPP/Tau+/+ mice
Like humans with Alzheimer’s disease, many mice used as models of the disease have a tendency to get seizures, including the hAPP mice used in this study.
However, hAPP mice without tau or with reduced tau had fewer seizures than those with the normal amount of tau.
-
Given that tau reduction prevented behavioral deficits but not neuritic dystrophy, these may represent parallel, rather than causally linked, disease manifestations, or tau reduction may act downstream of neuritic dystrophy.
Eliminating tau in hAPP mice prevented memory and cognitive problems, but it didn’t prevent the neurons from becoming damaged and impaired.
The removal of tau may therefore act in parallel with neuritic dystrophy, or downstream of it, to improve cognition in these mice.
-
Despite the differences in their behavior, hAPP/Tau+/+, hAPP/Tau+/–, and hAPP/Tau–/– mice had similar amounts of neuritic dystrophy
All hAPP mice, independent of the amount of tau, had neurons whose axons and dendrites were withering away in regions near amyloid plaques.
These results show that this type of damage in Alzheimer’s disease can happen even without tau being present.
-
In our study, reduction of endogenous, wild-type tau protected hAPP mice against Aβ-dependent cognitive impairments, and this did not involve the elimination of a large pool of tau with typical AD-associated modifications.
Reducing tau in hAPP mice protected those mice against the cognitive impairments seen in Alzheimer’s disease.
The authors looked at hAPP mice with normal amounts of tau to see how the tau became abnormal later in life. They compared these changes with abnormal tau measured in other mouse models of Alzheimer’s disease.
The connection between tau and amyloid-β in this model isn’t clear and may involve multiple types of tau or tau stored in various locations (or pools) within the cell.
-
Thus, the beneficial effects of reducing tau were observed without detectable changes in Aβ burden, suggesting that tau reduction uncouples Aβ from downstream pathogenic mechanisms.
The authors examined the brains of hAPP mice with different levels of tau. They found that, at all levels of tau, the amount of amyloid plaques and the amount of floating amyloid-β were the same in all hAPP mice.
Thus, the reduction of tau, which had improved memory, didn’t alter the amyloid-β or plaque levels, suggesting that amyloid-β or plaque levels alone are not responsible for memory loss.
-
Thus, tau reduction prevented major Aβ-dependent adverse effects in hAPP mice.
hAPP mice are known to die young, as a result of the induced Alzheimer’s disease. But if hAPP mice had a reduced amount of tau, they lived much longer.
Additionally, hAPP mice with normal levels of tau were hyperactive even into middle age, whereas hAPP mice with no tau were not.
-
tau reduction gene dose-dependently ameliorates Aβ-dependent water maze learning and memory deficits
Mice with the hAPP gene usually do poorly in the water maze because of their impaired memory. However, hAPP mice that have only half the amount of normal tau did remember where the platform was after additional training.
hAPP mice with no tau performed similar to normal, healthy mice, showing good memory on this test.
Thus, removal of all the normal tau seems to prevent memory problems. Even removing just half the normal tau already has some memory benefits.
-
Thus, tau reduction can block Aβ- and excitotoxin-induced neuronal dysfunction and may represent an effective strategy for treating Alzheimer's disease
Evidence from this study and following studies showing the importance of lowering tau has led researchers to target tau with their experimental new treatments, including vaccines that use the body’s own immune system to fight off the problematic tau proteins.
http://www.medicaldaily.com/alzheimers-disease-tau-protein-vaccine-391883
Almost 10 years after this paper was published, no good treatments for Alzheimer’s disease exist, but more and more of the new drugs in clinical trials are targeting the tau protein, as a result of early laboratory research such as this.
-
-
www.scienceintheclassroom.org
-
Thresholds for tissue damage from underwater sonar require ~100 kPa (53) and result from many cycles of bubble growth and collapse over tens of seconds of continuous wave excitation. Tissue damage in this setting is due to the negative pressure rather than exposure to a single compression pulse. These considerations indicate that direct tissue damage resulting from transmission of the blast shock wave through the brain is unlikely.
The negative phase of a blast wave follows the initial shock, and sucks items back toward the center of the blast source. Here, the authors consider the possibility that this negative phase could contribute to injury.
They use underwater sonar as a standard. Tissue damage results from repeated negative pressure of ~100 kilopascal generated by sonar waves. Because the blast pressure wave used in this experiment is only a single compressive wave and the negative pressure is far below the positive pressure, the authors discount negative pressure as a possible cause of injury.
-
Thresholds for positive pressures are not well characterized but are likely to exceed 40 MPa because positive pressures commonly used in clinical shock wave lithotripsy are not associated with significant, if any, tissue damage (52).
Lithotripsy is a procedure that uses sound waves to break apart kidney stones without damaging normal tissue. These waves have positive pressures of around 40 megapascal. Because the waves created by blast pressure are much smaller in amplitude than those used in lithotripsy, it is unlikely they would cause damage.
-
it is notable that mature NFTs were not detected in the cortex or hippocampus of blast-exposed mice.
Mouse tau protein is resistant to pathogenic aggregation like that seen in humans. Therefore, it is not surprising that the authors did not observe neurofibrillary tangles (NFTs) in the brains of blast-exposed mice.
However, the authors were surprised to see pretangles (early-stage NFTs) and elevated levels of abnormally phosphorylated tau protein in the brains of blast-exposed mice. The observation of both of these abnormalities in wild-type mice strongly supports the link between blast exposure and CTE.
Alternatively, the authors could have used genetically modified (transgenic) mice that overexpress human tau protein. These mouse strains show aggressive tau protein aggregation leading to mature NFTs.
-
Although PTP recovered by 1 month after blast, the magnitude of LTP 1 hour after tetanus was significantly reduced at both postblast time points
Regardless of the magnitude of the signal immediately after the blast, the post-tetanic response in the blast-exposed mice falls off sharply over the 1-hour period following the stimulus. This means that the brain pathways and processes by which new memories are formed are impaired in blast-exposed mice.
-
we found that the magnitude of posttetanic potentiation (PTP) immediately after application of theta-burst stimulation (TBS) was significantly less at the 2-week time point (Fig. 6E; P < 0.05, repeated-measures multifactorial ANOVA).
Post-tetanic potentiation (PTP) is the change in potential that occurs after a tetanus, or a series of stimuli that occur in a short period. It is thought to be involved in synaptic plasticity, which is important to learning and memory.
The authors observed that blast-exposed mice had a reduced potential that is consistent with impaired hippocampal function. The deficit remained for at least 1 month.
-
Phosphorylated tau (CP-13) immunostaining in superficial layers of the cerebral cortex 2 weeks after exposure to a single blast. Increased accumulation of phosphorylated tau in the brains of blast-exposed mice was confirmed by quantitative immunoblot analysis (Fig. 5).
This is an unexpected and important finding. Wild-type C57BL/6 mice with murine tau protein do not normally develop the neurofibrillary pretangles seen in this figure panel. The authors confirmed this finding in Figure 5.
-
Brains from blast-exposed mice also exhibited enhanced somatodendritic phosphorylated tau CP-13 immunoreactivity in neurons in the superficial layers of the cerebral cortex (Fig. 3J) that was not observed in the brains of sham-blast control mice
The authors also observed this in the human cases in Figure 1. This is a surprising result, because the CP-13 antibody detects hyperphosphorylated human tau protein (not mouse), and phosphorylated tau abnormalities had not been observed in nontransgenic mouse models.
Because the authors saw CTE-linked tau abnormalities in the mouse model but not in mice in the control group, they concluded that blast exposure is linked to brain abnormalities associated with CTE.
-
In contrast, brains from blast-exposed mice showed marked neuropathology by immunohistological analysis (Fig. 3, H, J, L, Q, N, S, and T)
Using more sensitive methods (light microscopy, electron microscopy, protein immunoblotting, neurophysiology, and behavioral testing), the authors found that a single blast exposure was sufficient to cause significant brain injury, CTE-related abnormalities, and cognitive deficits in mice.
-
We did not detect delayed blast-induced ICP transients in either preparation over recording times up to 100 ms. These observations indicate that blast wavefront transmission in the mouse brain is mediated without significant contributions from thoracovascular or hydrodynamic mechanisms.
There were no significant differences between the pressures recorded in the brains of living mice and decapitated mice. This led the authors to reject the hypothesis that there is a thoracic-vascular contribution to TBI (the "water hammer" effect).
This result also suggested there must be a different way that blast exposure causes brain injury, which motivated the authors' investigation of other hypotheses.
-
These results identify common pathogenic determinants leading to CTE in blast-exposed military veterans and head-injured athletes and additionally provide mechanistic evidence linking blast exposure to persistent impairments in neurophysiological function, learning, and memory.
The authors report several findings linking blast exposure to CTE:
-A correlation between blast exposure and CTE in human brains (identical to the CTE seen in athletes' brains).
-A causal link between blast exposure and CTE in a mouse model.
-A causal link between blast exposure and cognitive problems commonly seen in blast-exposed military veterans.
-Prevention of cognitive problems in mice by restricting head movement during a blast.
-
-
www.scienceintheclassroom.org www.scienceintheclassroom.org
-
As with failures to expand northward or into cooler areas, land-use changes do not relate to range losses from bumblebee species’ southern or warm thermal limits
We can see this in Table S3 in the Supplementary Materials.
-
model selection includes a small continental effect; intercept for Europe, 1459 m (366 SE); North America, 1074 m (340 SE) (Fig. 2)]. Europe’s mountainous areas are oriented predominantly east-west, potentially inducing more pronounced upslope shifts.
A continental effect indicates that the shifts in elevation are predictably different depending on which continent the species is on.
Positioning of the mountains could cause this.
-
bumblebee species’ range losses from their historical southern limits have been pronounced in both Europe and North America, with losses growing to ~300 km in southern areas on both continents (Fig. 1C).
As we can see in Figure 1C, species in southern areas have seen their southern range limits shift by as much as 300 km on average.
-
These failures to track climate change occur in parallel in regions that differ in their intensities of human land use (e.g., Canada and northern Europe), which had no direct or interaction-based effect in any statistical model (Table 1)
Canada and northern Europe have very different histories of land use. If land use were an important factor behind declines in bumblebee ranges, then there likely would have been a difference in bumblebees' ability to track climate change between North America and Europe.
The authors did not see this, and conclude that land-use is likely not an important factor driving these range declines.
-
These locally important effects do not “scale up” to explain cross-continental shifts along bumblebee species’ thermal or latitudinal limits. The timing of climate change–related shifts among bumblebee species underscores this observation: Range losses from species’ southern limits and failures to track warming conditions began before widespread use of neonicotinoid pesticides (figs. S2 and S3)
The authors conclude that while neonicotinoids and other pesticides do kill bumblebees, they are likely not responsible for the large-scale declines in range sizes that are observed.
-
Observed losses from species’ southern or warm boundaries in Europe and North America, and associated phylogenetic signals, are consistent with ancestral limitations of bumblebees’ warm thermal tolerances and evolutionary origins
Bumblebees evolved in cool, temperate conditions, and did not need to evolve tolerances to high temperatures. Considering this, it's not surprising that they are dying out from the regions of their ranges that reach the hottest temperatures.
-
-
www.scienceintheclassroom.org www.scienceintheclassroom.org
-
Western assay of induced SX4 cells showed the presence of Venus only in the membrane fraction and not in the cytoplasmic fraction, suggesting efficient membrane localization of Tsr-Venus.
The E. coli strain engineered by the authors is behaving as they predicted.
-
- Jul 2017
-
scienceintheclassroom.org scienceintheclassroom.org
-
including these variables does not substantively affect our findings.
The authors concluded that writing skills do not affect the chance of receiving grant funding from NIH.
-
- Jun 2017
-
www.scienceintheclassroom.org www.scienceintheclassroom.org
-
(iv) Microstimulation-evoked vocalizations suggest that deep-layer but not superficial-layer cortical activity is sufficient to trigger vocalizations.
Microstimulation of deep layers of the somatosensory cortex is sufficient to trigger tickling-associated vocalizations.
-
Tickling-evoked calls are not simple reflexes in response to touch.
The response to tickling is different than the general response to being touched. It is dependent on the type of touch, the location of touch, and the emotional state of the rat.
-
The increase of vocalizations after initial tickling (Fig. 1, C and D) and anxiogenic suppression of tickling-evoked calls (Fig. 3, B and C) support Darwin’s idea that “the mind must be in a pleasurable condition” for ticklish laughter (4).
There are two pieces of evidence that show the link between mood and ticklishness in rats:
First, the initial round of tickling seems to predispose rats to both play behaviors and to an increased tickling response in subsequent interactions.
Second, anxiety suppresses the rats' responses to tickling.
-
Similar to USVs, neuronal firing rates increased more during tickling than during gentle touch on the trunk (Fig. 2D).
Neural activity in the trunk cortex was greater during tickling than during gentle touch.
-
(iii) The strong call-related activation of the trunk somatosensory cortex points to an involvement in tickling-evoked vocalizations. Call-related firing in the somatosensory cortex is much stronger than call-evoked activity in the auditory cortex (18).
There is a strong correlation between the production of tickle-related USVs and activity in the somatosensory cortex.
-
(i) We found that tickling can evoke intense neuronal activity in the somatosensory cortex (Fig. 2). Moreover, play behavior, which induces anticipatory vocalizations in rats (Fig. 1, F to H) (17) and humans (23), evoked neuronal activity similar to the activity evoked by tickling (fig. S2E).
Tickling and the anticipation of tickling are both associated with activity in the somatosensory cortex.
-
When microstimulation was directly preceded by tickling, more USVs were evoked
"Priming" the rats by tickling them prior to microstimulation increases the number of USVs produced.
-
Although rats had no interaction with the experimenter, they emitted USVs (Fig. 4H, top).
Stimulation of somatosensory neurons (in the absence of tickling) is sufficient to cause the production of USVs.
This shows a direct causal link between the increased neural activity and the increased production of USVs.
-
Furthermore, the effect was more prominent in layers 4 and 5a than in the superficial layers (Fig. 4F).
Neural firing rates in the deeper layers of the cortex have a stronger relationship to USVs than rates in the surface layers of the cortex.
-
The activity of trunk somatosensory neurons was correlated with USV emissions: Neurons increased their firing rate before and during USV emissions (Fig. 4, C and D, Before versus On).
The firing rate of neurons is highest just before and during USV production. This shows that these two events are correlated.
-
Remarkably, neuronal responses were also observed during hand-chasing phases, when rats were not touched by the experimenter (Fig. 2B and fig. S2E).
There was also increased activity in the somatosensory cortex during hand-chasing, despite the fact that the rats were not being touched.
This was interesting, because the primary function of the somatosensory cortex is to detect touch.
-
Play behavior (rat chasing experimenter’s hand; Fig. 1F) also evoked USVs (Fig. 1, G and H, and movie S3) (17).
Rats produce USVs during play behavior, even if they are not currently being tickled.
-
Tickling the ventral trunk evoked the largest number of USVs (Fig. 1D) and the largest fraction of combined USVs (Fig. 1E).
The ventral (belly) region of the rat is more ticklish than the dorsal (back) or tail regions.
-
-
www.scienceintheclassroom.org www.scienceintheclassroom.org
-
Results
The authors used 5 measures for replication success to check to what extent the 100 original studies could be successfully replicated.
-
- May 2017
-
www.scienceintheclassroom.org www.scienceintheclassroom.org
-
Transparency and Openness Promotion (TOP) Guidelines (http://cos.io/top) (37)
Nosek and colleagues summarize eight standards for transparency and openness in research: citation standards, data accessibility, accessibility of analytic code, availability of research materials (such as participant instructions), transparency of design and analysis, pre-registration of studies, pre-registration of analysis plans, and replication.
They argue that journals should require and enforce adherence to transparency guidelines, and that the submission of replication studies, in particular in the Registered Report format, should be an option.
-
The present results suggest that there is room to improve reproducibility in psychology. Any temptation to interpret these results as a defeat for psychology, or science more generally, must contend with the fact that this project demonstrates science behaving as it should.
The fifth and final conclusion of the paper addresses the big-picture takeaway of the results.
On one hand, the authors recognize that research is a process where new ideas have to be explored and sometimes might turn out not to be true. Maximum replicability is therefore not desirable, because it would mean that no more innovations are being made.
On the other hand, the authors also conclude that there is room for improvement: stronger original evidence and better incentives for replications would form a stronger foundation for psychological research.
-
Nonetheless, collectively, these results offer a clear conclusion: A large portion of replications produced weaker evidence for the original findings
Because there is some uncertainty about how exactly the replication success rate in psychological research should be determined, the authors go about the interpretation of the results of this study very conservatively.
This very careful interpretation of the data is that the replication studies yielded largely weaker evidence for the effects studied than the original studies.
-
No single indicator sufficiently describes replication success, and the five indicators examined here are not the only ways to evaluate reproducibility.
It is difficult to say exactly how many original studies were successfully replicated. The precise conclusions drawn from this paper depend a lot on which of the 5 measures used to determine replication success you think is the most appropriate measure. The results of one measure indicate a replication success rate as low as 36%, while another measure suggests a success rate of 68%. Perhaps some researchers would even say that another measure not included in this study would have made it possible to draw more meaningful conclusions. The scientific community has so far not agreed on what measure should be used to evaluate replication success rates.
Moreover, there are other limitations of this approach to studying reproducibility (see the paragraph "Implications and limitations") that make it difficult to generalize the findings of this study not only to psychological research, but to other disciplines. It is also difficult to evaluate from the findings of this study whether the evidence indicates that a specific effect is true or does not exist.
Therefore, the first conclusion of this paper is that all interpretations of the data are only an estimation of how reproducible psychological research is, not an exact answer.
-
In addition to the quantitative assessments of replication and effect estimation, replication teams provided a subjective assessment of replication success of the study they conducted.
Finally, the authors used the subjective rating as an indicator of replication success. Out of 100 replication teams, only 39 reported that they thought they had replicated the original effect.
-
Comparing the magnitude of the original and replication effect sizes avoids special emphasis on P values. Overall, original study effect sizes (M = 0.403, SD = 0.188) were reliably larger than replication effect sizes (M = 0.197, SD = 0.257),
With this third measure for replication success, the authors further compared the sizes of the original and replicated effects. They found that the original effect sizes were larger than the replication effect sizes in more than 80% of the cases.
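The "original effect larger than replication effect" comparison above can be sketched as a simple paired count. The summary statistics are quoted from the paper; the paired values below are hypothetical, used only to show how the fraction would be computed:

```python
# Hypothetical paired effect sizes: each pair is
# (original study effect, replication effect).
pairs = [(0.45, 0.20), (0.38, 0.41), (0.52, 0.18),
         (0.30, 0.05), (0.61, 0.33), (0.25, -0.02)]

# Count the pairs in which the original effect was larger.
larger = sum(orig > rep for orig, rep in pairs)
fraction = larger / len(pairs)
print(f"Original effect larger in {fraction:.0%} of pairs")
```

With these illustrative pairs the original effect is larger in 5 of 6 cases (~83%), in line with the "more than 80%" figure reported in the study.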
-
-
www.scienceintheclassroom.org www.scienceintheclassroom.org
-
Within any zone, the amount and type of damage inflicted upon sessile organisms was greatly influenced by their shapes, sizes, and mechanical properties. Damage to gorgonians, corals, and sponges ranged from partial to complete mortality (20) and was caused by abrasion, burial, and the tearing or fracture of tissue and skeleton. The fate of detached colonies and fragments, and thus the ultimate consequences to populations, of Hurricane Allen, varied widely between taxa.
The way that physical forces affected corals depended heavily on their form, because shape, size, and mechanical properties determine how an organism responds to mechanical forces. Different species have different shapes and sizes, so the impact of the storm varied by species.
-
Shallow fore-reef areas were generally more severely damaged than deep ones. We see this most directly by comparing the same species on the same reefs at different depths. For example, head corals were more frequently toppled in sand channels in 10 m of water than in 14 m (Table 1, compare rows 5 and 6; x2 for numbers toppled and not toppled after Hurricane Allen = 4.75, P < .05).
The energy within waves is released most violently in shallow waters. Corals at greater depths were less likely to be damaged.
-
-
scienceintheclassroom.org scienceintheclassroom.org
-
Our results, together with recent reports showing brain calcification in microcephalic fetuses and newborns infected with ZIKV (10, 14) reinforce the growing body of evidence connecting congenital ZIKV outbreak to the increased number of reports of brain malformations in Brazil.
The authors report that their results confirm previous evidence connecting the ZIKV outbreak in Brazil to an increase in cases of microcephaly.
-
Other studies are necessary to further characterize the consequences of ZIKV infection during different stages of fetal development.
The authors used models that allowed them to study early stages of brain development. They suggest there is more work to be done to determine the effects of ZIKV infection on later stages of fetal development.
-
Our results demonstrate that ZIKV induces cell death in human iPS-derived neural stem cells, disrupts the formation of neurospheres and reduces the growth of organoids (fig. S2), indicating that ZIKV infection in models that mimics the first trimester of brain development may result in severe damage.
Summarizing the results, the authors conclude that in their model of early brain development, ZIKV causes severe damage.
-
These results suggest that the deleterious consequences of ZIKV infection in human NSCs, neurospheres and brain organoids are not a general feature of the flavivirus family.
Because DENV2 did not reduce cell growth or affect morphology, the researchers concluded that those effects are unique to ZIKV and not characteristic of the flavivirus family (to which both DENV2 and ZIKV belong).
-
ZIKV induced caspase 3/7 mediated cell death in NSCs
Zika virus induces the expression of caspase 3/7, indicating a cell is preparing to die.
Dengue virus 2, on the other hand, did not increase caspase 3/7.
-
reduced by 40% when compared to brain organoids under mock conditions
Brain organoids infected with Zika virus were, on average, 40% smaller.
-
-
scienceintheclassroom.org scienceintheclassroom.org
-
Although our findings show that NIH grants are not awarded purely for previous work or elite affiliations and that reviewers contribute valuable insights about the quality of applications, mistakes and biases may still detract from the quality of funding decisions.
Summing up all the results, the authors conclude that previous work and elite affiliations do not "close the door" for new ideas in research.
-
The large value-added for predicting tail outcomes suggests that peer reviewers are more likely to reward projects with the potential for a very high-impact publication and have considerable ability to discriminate among strong applications.
The authors' findings suggest that peer reviewers are good at identifying innovative and ground-breaking projects.
-
peer reviewers add value by identifying the strongest research proposals.
The authors show that peer review scores are good predictors of scientific productivity when differences in field of research, year, and applicant qualifications are removed. This suggests that peer reviewers have the necessary expertise to choose good applicants.
-
the grant with a 1-SD worse score is predicted to have 7.3% fewer future publications and 14.8% fewer future citations
The authors conclude here that regardless of gender, ethnicity, or institutional prestige, when the peer-review score worsens by one standard deviation, there is a corresponding decrease in the number of publications and citations an author can expect.
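The percentage effects quoted above translate into concrete counts with simple arithmetic. A minimal sketch; the baseline publication and citation counts below are hypothetical:

```python
# Predicted output for a grant whose peer-review score is 1 SD worse,
# using the percentage effects reported in the paper.
baseline_pubs = 10.0    # hypothetical baseline publication count
baseline_cites = 100.0  # hypothetical baseline citation count

pubs_1sd_worse = baseline_pubs * (1 - 0.073)    # 7.3% fewer publications
cites_1sd_worse = baseline_cites * (1 - 0.148)  # 14.8% fewer citations
print(round(pubs_1sd_worse, 2), round(cites_1sd_worse, 1))
```

So a grant starting from 10 expected publications and 100 expected citations would be predicted to yield about 9.3 publications and 85 citations with a score one standard deviation worse.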
-
Controlling for publication history attenuates but does not eliminate the relationship
Again, controlling for the variable of a PI's research background does not eliminate the relationship the authors originally found.
-
Controlling for cohort and field effects does not attenuate our main finding.
The authors' adjustments to control for various external effects did not change their original findings.
-
This variation in citations underscores the potential gains from being able to accurately screen grant applications on the basis of their research potential.
The authors found that there is a lot of variation in the research output of projects that receive grants. They conclude that it would be useful to find a way to accurately screen applications to determine their potential.
-
NIH is the world’s largest funder of biomedical research (12). With an annual budget of approximately $30 billion, it supports more than 300,000 research personnel at more than 2500 institutions (12, 13). A funding application is assigned by topic to one of approximately 200 peer-review committees (known as study sections).
Based on an analysis conducted by the authors, biomedical research is valued highly by individuals, governments, foundations, and corporations. Research is seen as a source of more effective treatments and preventive measures and as a route to policy, new commercial products, and economic development.
As a result, investments in biomedical research are among the highest of any research sector.
-
- Apr 2017
-
www.scienceintheclassroom.org www.scienceintheclassroom.org
-
The availability of a neuropathologically validated murine model with correspondence to human CTE is expected to open new avenues for investigation of mechanisms, biomarkers, and risk factors relevant to blast-related brain injury
In addition to the observations and findings of this study, the physical instruments and experimental protocols that were developed will likely be very useful for further research and development of diagnostics, therapies, rehabilitation strategies, and preventative measures.
The work here will help military service personnel and others at risk for developing neurotrauma-related disease.
-
APOE (apolipoprotein E) genotype (77)], history of previous head trauma, innate inflammatory responsivity, neuropsychiatric comorbidity, age and gender,
All of these factors are known to affect how susceptible a patient is to mild forms of neurotrauma and related effects, as well as how those conditions are expressed.
-
even greater pathogenic potential
The authors suggest that blast exposure may be more damaging than sports-related impact because they were able to observe substantial neural abnormalities after a single blast exposure.
In the case of impact-induced head injury, mild cases typically require repetitive impacts to cause neural abnormalities.
-
Although it is possible that high-frequency components (>100 kHz) could lead to localized focusing due to reverberation and constructive interference, the pressure amplitudes we measured were far below tissue damage thresholds.
A possible alternative hypothesis is that the structure of the head and the brain would cause the blast wave to bounce around inside the brain, causing damage.
The authors' observations of roughly equal pressures both inside and outside the skull suggest that this is not a likely cause of injury. Furthermore, the magnitude of the pressure wave is far below the threshold that causes damage to tissues.
-
ICP dynamics recorded during blast exposure revealed blast-induced pressure transients in the hippocampus that were coincident with and comparable in amplitude, waveform, and impulse to FFP measurements outside the cranium. This finding is consistent with the head acting as a lumped element for which the blast-induced external pressure differential equilibrates within ~100 μs.
The results of this study suggest that pressure from the blast wave is not enough to cause brain injury. The data and calculations show that the brain rapidly adapts to the changing pressure, and that the pressure inside and outside the skull during the blast wave is comparable.
Thus, blast wave pressure alone is not responsible for TBI.
-
Notably, within this small controlled case series, the effects of blast exposure, concussive injury, and mixed trauma (blast exposure and concussive injury) were indistinguishable.
This statement is based on the authors' findings from the human case studies. They observed identical patterns in head-injured athletes and blast-exposed military veterans, which has important implications for our understanding of traumatic brain injury and CTE.
-
Head immobilization during blast exposure eliminated blast-related impairments in hippocampal-dependent learning acquisition (Fig. 7D; P > 0.20, repeated-measures ANOVA with post hoc Scheffe test compared to sham-blast controls) and restored blast-related memory retention deficits to normal levels (Fig. 7E; P > 0.20, one-way ANOVA with post hoc Scheffe test), supporting the conclusion that head acceleration is necessary for behavioral learning impairments.
To test the "bobblehead effect," the authors performed the same experiments on mice whose heads had been held still during blast exposure and mice whose heads were allowed to move freely.
Mice whose heads were allowed to move freely were exposed to both the blast wave (the pressure of the blast) and the blast wind (the wind following a blast wave). Mice whose heads were restrained were only exposed to the blast wave.
The authors hypothesized that blast neurotrauma and the resulting cognitive deficits were the result of the acceleration of the head during a blast (i.e., the blast wind), meaning that mice whose heads were restrained would not show any abnormalities.
The results reported here support the authors' hypothesis that head acceleration is a major contributor to traumatic brain injury following blast exposure.
-
Pathologically swollen, edematous, and often highly vacuolated astrocytic end-feet were observed in association with dysmorphic capillaries marked by pathologically thickened, tortuous basal lamina and abnormal endothelial cells with irregularly shaped nuclei (Fig. 4L and figs. S11 to S16)
Astrocytes support blood vessels and interact with other types of brain cells to control the blood-brain barrier (BBB), a membrane that protects the brain from disruptive changes in the blood.
Using the electron microscopy images in Figure 4 and Supplementary Figures 11 to 16, the authors saw that a single blast exposure can damage not only the smallest blood vessels but also the astrocytic "feet" that wrap around and encase them.
Because these blood vessels are critical for gas and fluid exchange, damage to them can disrupt many metabolic processes.
-
However, blast-exposed mice did show decreased choline acetyltransferase immunoreactivity in motor neurons in the cervical cord (fig. S10D) and cranial nerve XII (fig. S10F) when compared to sham-blast controls (fig. S10, C and E), suggesting loss of central cholinergic inputs.
Choline acetyltransferase (ChAT) is an enzyme that produces the neurotransmitter acetylcholine.
Reduced ChAT concentrations are associated with Alzheimer's disease and amyotrophic lateral sclerosis (ALS).
Decreased ChAT following blast TBI suggests similarities between Alzheimer's, ALS, and TBI.
-
Activated perivascular microglia were observed throughout the brain in blast-exposed mice and were especially notable in the cerebellum
The authors also observed other key signs of CTE, such as the presence of activated microglia near small blood vessels (perivascular microgliosis).
This finding is consistent with traumatic microvascular injury and reactive microgliosis, which are signs of chronic inflammation in the brain.
-
Although the reflected and transmitted shock waves are large (~2.5 times greater than the 77-kPa incident overpressure), the ~7-μs traversal time of the skull-brain transmitted wave is short enough to allow rapid equilibration across the skull. Thus, the head acts acoustically as a “lumped element”
The authors also calculated how the blast pressure wave would travel through the brain tissue. They found that, as with the skull, the pressure wave moved too quickly through the brain to cause injury.
The authors conclude that the head may be treated as a "lumped element," meaning that the blast pressure wave travels through it as if it were a single body.
-
The pressure differential associated with this traversal has an insignificant effect on skull displacement due to the short time interval.
The authors next investigated how the blast pressure wave interacted with the mice's skulls. They calculated the speed of the shockwave by taking pressure measurements with multiple sensors at different locations and measuring the time delay between recordings. Using the distance between sensors and the time delay, they were able to record the "speed" of the wave as it traveled down the shock tube.
By comparing the wave speed with the width of the mice's heads, the authors concluded that the wave traveled too quickly to create a pressure gradient large enough to cause injury. Based on these results, they were able to rule out skull compression as a cause of TBI.
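The speed-from-time-delay calculation described above can be sketched directly. The sensor spacing and delay below are hypothetical illustration values (chosen to give a speed in the range of a shock front in air), and the head width is a rough mouse-scale figure, not a measurement from the paper:

```python
# Estimate shock-front speed from two pressure sensors along the
# shock tube, then the time for the wavefront to cross the head.
sensor_spacing_m = 0.10   # hypothetical: 10 cm between sensors
time_delay_s = 250e-6     # hypothetical: 250 us between recordings

wave_speed = sensor_spacing_m / time_delay_s   # m/s
head_width_m = 0.010                           # rough ~1 cm mouse head
traversal_time_us = head_width_m / wave_speed * 1e6

print(f"wave speed ~ {wave_speed:.0f} m/s")
print(f"head traversal ~ {traversal_time_us:.0f} us")
```

Even with these rough numbers the wavefront crosses the head in tens of microseconds, illustrating why the authors argue the pressure differential equilibrates too quickly to injure tissue.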
-
persistent hippocampal-dependent learning and memory deficits that persisted for at least 1 month and correlated with impaired axonal conduction and defective activity-dependent long-term potentiation of synaptic transmission.
Mice that were exposed to blasts had memory and learning problems. Additionally, axons in their brain showed reduced function and they had defective activity-dependent long-term potentiation, which is thought to be important to learning and memory.
In this case, these defects were all correlated but not necessarily caused by one another.
-
-
www.scienceintheclassroom.org www.scienceintheclassroom.org
-
adapt to a permanently altered environment
Some of the environmental damage caused by human actions is too far along to be repaired, including species loss and, to some extent, climate change. Because we cannot undo some of our environmental alterations, we must adapt to them. We already know a great deal about how to adapt to climate change, and in many places around the world, particularly cities, communities are making the adaptive changes needed to deal with extreme weather and other climate change–related impacts.
-
Fundamental behavioral changes are thus needed to stop damaging the natural world
Human society is on an unsustainable trajectory. Because environmental degradation is ultimately a human behavior problem, human behavior must change. Adjustments in resource consumption must happen at both an individual level as well as across societies. The industrial systems and infrastructure that currently serve human needs must change as well, to dramatically lower their impact on natural systems.
-
Further psychological research needs to elucidate how to accelerate the adoption of ecologically-grounded worldviews
Generally people build their own worldview over time through trial and error, listening to elders, going to school and religious services, and general experience. There has been very little experimental research done about how people can transform quickly from an unsustainable worldview, such as the modern-industrial worldview, to a more ecologically-grounded worldview.
-
change the larger systems that drive so much of human behavior
We are surrounded by systems that define how we behave, including neighborhoods, households, schools, religions, and the economy.
The government is one example of a system that defines rules about how groups of people act (for instance, when and for how long people can water their lawns), interact together (whether roads are built to accommodate pedestrians, bikes, buses, trains, and cars), and how we share resources (such as community composting sites).
It is important to remember that a system such as a government is run by people, so individual people need to start the process of changing the systems by voting, running for office, or through work as government employees.
-
humans can move toward a sustainable society by creating conditions that motivate environmentally responsible collective action
The situation is a powerful driver of human behavior. Therefore we must create situations that drive behavior toward sustainability rather than against it. This means that, among other things, situations, tasks, and activities have to engage people's needs to feel competent, have choices, and to feel like they belong.
-
Additional research is needed to understand how to enhance the pace and depth of worldview change
There is very little research studying how to change worldviews. Yet, people in developed countries will not be able to make ecologically appropriate decisions if they keep thinking the same way they do now.
There are many tools that psychologists have developed for other purposes that can be applied to facilitating worldview change. For instance, educational psychologists, who are experts about how people learn, and industrial/organizational psychologists, who regularly develop employee training programs, can apply what they know about teaching knowledge, skills, and abilities to teaching worldview principles.
-
they must internalize an ecologically-grounded worldview and integrate it into the vision they set for others
There are physical laws that govern how ecosystems work. Only when leaders better understand this will they be able to make sustainable decisions. Because the path from ecosystem to organization is often long, this connection between individuals' decisions and the state of the planet is very hard to see. Therefore, decision-makers have to add "understanding how ecosystems work" to the knowledge, skills, and abilities they develop to do their jobs.
-
Though psychological research has examined what motivates people to volunteer and cooperate for social causes, or mobilize around political campaigns, the results have yet to be applied to collective efforts for conservation.
Psychologists have studied individuals' motivations for volunteering for many decades. The reasons people volunteer, or join collective efforts to help others, include:
- To do something new
- To feel good about themselves
- To learn a new skill
- To help others, the community, or the larger world
- To live their values, in particular religious values
-
explains why individuals are unwilling to surrender the convenience of a personal car or to spend money on energy efficiency measures that not only save money in the long run but also help curb greenhouse gas emissions
An important insight from evolutionary psychology is that all human beings carry with them certain inherited behavioral tendencies. One of these is a tendency to be short-term thinkers.
According to Van Vugt, Griskevicius, and Schultz (2014), early humans were more likely to survive if they focused on immediately available benefits rather than thinking about possible future needs. In contrast, solving complex environmental issues such as climate change requires reducing, or even eliminating, current fossil fuel–intensive behaviors (for example, commuting alone in a personal car) in order to ensure a healthy climate in the future.
-
Experiencing the self as separate from nature is the foundation of humanity’s damaged relationship to planetary resources
Before modern times, people lived in small groups, hunting and foraging for food. By necessity, they were very tuned in to their surroundings, and likely believed that other species were their neighbors and family.
Over the past couple of centuries, humans have become more and more disconnected from the natural world, treating it as something that exists for us—as resources to be used, abused, and discarded.
-
human action to radically transform
Human beings throughout most of the world need to work together to dramatically change the way we meet our everyday needs for food, clean water, energy, shelter, and transportation.
-
how to activate ecologically-compatible engagement, especially leadership, for the collective work needed to become more sustainable
There is very little published research about how to help organizational leaders (CEOs, religious clergy, elected officials) understand the importance of their role in creating sustainable systems.
Additionally, with just a few exceptions, most research about developing a realistic understanding of the earth's systems is aimed at children. For adults there is an additional challenge of having to unlearn assumptions that are deeply embedded in the way they think about the world.
Since working toward sustainability of large systems, including organizations, will lead to faster and more expansive change, more research needs to be done on how to effectively increase the knowledge and motivation of these leaders.
-
psychologists need to move beyond targeting individuals’ private sphere choices, and focus on how to foster collective action
The environmental degradation we face is much larger than what individual change can effectively address. To solve issues such as climate change, loss of biodiversity, and other global environmental issues, we must change the larger systems that meet our needs for energy, food, water, transportation, and goods. These systems are not going to change on their own; instead, it will take the efforts of many, many people working together to demand change and to be persistent in those demands.
-
Conforming to norms promoting sustainable behavior may actually feel threatening to individuals whose identity is perceived to be at odds with being “green.”
One's identity and social group affiliations appear to be among the most important determinants of whether a person takes pro-environmental action. If an individual does not identify as someone who cares about the environment, then it feels very uncomfortable and awkward to take actions typical of environmentalists. In addition, if one's social group does not express environmental concern, then taking environmental action in front of them can feel very risky as they may notice and express disapproval.
-
Community-Based Social Marketing (CBSM)
Community-Based Social Marketing (CBSM) programs have been applied around the world to change behaviors that are most environmentally impactful: agricultural practices, transportation, home energy use, water use, overall resource consumption and waste, and toxic chemical production and use.
-
humans must experience and better understand their profound interdependence with the planet
When humans do not interact directly with the natural world it is hard to know that we completely depend on it to live. For instance, when we buy food at the supermarket, we may not think about where that food was grown, how long it took to grow, and what kind of soil and weather conditions were required. We may not think about how drought and changes in climate are likely to change the food that is available to us. A direct experience with nature such as caring for some food plants helps us understand how long it takes and how challenging it is to grow what we need to eat.
-
- Jan 2017
-
scienceintheclassroom.org scienceintheclassroom.org
-
As reported previously (5), branching species were more susceptible to hurricane damage than were massive heads (Fig. 4B). In an extreme example, at a depth of 6 m on Monitor Reef on the West Fore Reef (Fig. 1, location C, and Fig. 3) the planar living areas of branching Acropora spp. were reduced by up to 99 percent (Table 2, rows 1 to 3), whereas colonies of foliaceous and encrusting Agaricia agaricites were reduced by only 23 percent (Table 2, row 6),
The shape of a coral has a strong influence on how much damage it sustains, because shape directly affects how much force flowing water exerts on the coral's skeleton. Branching corals have large, flat, exposed areas that are especially susceptible to breaking, while corals with leaf-like, encrusting, and head forms experience much less mechanical stress during a storm.
-
Within any zone, the amount and type of damage inflicted upon sessile organisms was greatly influenced by their shapes, sizes, and mechanical properties. Damage to gorgonians, corals, and sponges ranged from partial to complete mortality (20) and was caused by abrasion, burial, and the tearing or fracture of tissue and skeleton. The fate of detached colonies and fragments, and thus the ultimate consequences to populations, of Hurricane Allen, varied widely between taxa.
How physical forces affected corals depended heavily on their form, because shape and size determine how much mechanical stress an organism experiences. Since different species have different shapes and sizes, the storm's impact varied from species to species.
-
Shallow fore-reef areas were generally more severely damaged than deep ones. We see this most directly by comparing the same species on the same reefs at different depths. For example, head corals were more frequently toppled in sand channels in 10 m of water than in 14 m (Table 1, compare rows 5 and 6; χ2 for numbers toppled and not toppled after Hurricane Allen = 4.75, P < .05).
The energy within waves is released most violently in shallow waters. Corals at greater depths were less likely to be damaged.
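The χ2 statistic quoted above tests whether toppling is independent of depth in a 2×2 table of counts. A minimal sketch of that arithmetic, using invented counts (the paper's Table 1 values are not reproduced here), computing a Pearson chi-square with no continuity correction:

```python
# 2x2 Pearson chi-square test of independence, computed by hand.
# Rows: depth (10 m, 14 m); columns: toppled, not toppled.
# The counts below are hypothetical, for illustration only.
observed = [[12, 8],   # 10 m: toppled, not toppled
            [5, 15]]   # 14 m: toppled, not toppled

row_totals = [sum(row) for row in observed]
col_totals = [sum(col) for col in zip(*observed)]
n = sum(row_totals)

# Expected count under independence: (row total * column total) / grand total
chi2 = 0.0
for i in range(2):
    for j in range(2):
        expected = row_totals[i] * col_totals[j] / n
        chi2 += (observed[i][j] - expected) ** 2 / expected

print(round(chi2, 2))  # 5.01: the test statistic
print(chi2 > 3.84)     # True: 3.84 is the critical value at P = .05, df = 1
```

A statistic above the critical value, as in the paper's comparison, means toppling frequency and depth are unlikely to be independent.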
-
-
scienceintheclassroom.org scienceintheclassroom.org
-
Moreover, correlational evidence is consistent with the conclusion that variation in the strength of initial evidence (such as original P value) was more predictive of replication success than was variation in the characteristics of the teams conducting the research (such as experience and expertise).
Third, from this study we can conclude that the precautions the authors took to prevent replication success from depending on which team conducted the replication were quite successful: there is no evidence that characteristics of the replication team influenced the outcomes of the replication attempts.
Instead of team-specific differences, it was characteristics of the original studies that systematically separated successfully replicated studies from non-replicable ones. Therefore, the fourth conclusion of this paper is that original studies which showed stronger evidence for the effect they investigated were also more likely to be successfully replicated.
-
The present results suggest that there is room to improve reproducibility in psychology. Any temptation to interpret these results as a defeat for psychology, or science more generally, must contend with the fact that this project demonstrates science behaving as it should.
The fifth and final conclusion of the paper concerns what psychologists and other researchers should take from these results for their overall research practices. The conclusion is mixed. On the one hand, the authors recognize that research is a process in which new ideas have to be explored and sometimes turn out not to be true. Maximum replicability is therefore not desirable, because it would mean that no new ground is being broken. On the other hand, the authors also conclude that there is room for improvement: stronger original evidence and better incentives for replications would put progress in psychological research on a stronger foundation.
-
Nonetheless, collectively, these results offer a clear conclusion: A large portion of replications produced weaker evidence for the original findings
Because there is some uncertainty about how exactly the replication success rate in psychological research should be determined, the authors interpret the results of this study very conservatively. This careful interpretation of the data, and the second conclusion of this study, is that the replication studies yielded largely weaker evidence for the effects studied than the original studies did.
-
No single indicator sufficiently describes replication success, and the five indicators examined here are not the only ways to evaluate reproducibility.
The discussion section begins with a cautionary sentence reminding the reader that it is difficult to say exactly how many original studies were successfully replicated. The precise conclusions drawn from this paper depend heavily on which of the five measures used to determine replication success you consider most appropriate. One measure indicates a replication success rate as low as 36%, while another suggests a success rate of 68%. Perhaps some researchers would even say that a measure not included in this study would have made it possible to draw more meaningful conclusions. The scientific community has so far not agreed on which measure should be used to evaluate replication success rates.
Moreover, there are other limitations to this approach to studying reproducibility (see paragraph "Implications and limitations"), which make it difficult to generalize the findings of this study to psychological research in general, or even to other disciplines. It is also difficult to evaluate from the findings in this study whether the evidence indicates a specific effect is true or does not exist.
Therefore, the first conclusion of this paper is that all interpretations of the data are only an estimation of how reproducible psychological research is, not an exact answer.
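To make concrete why the headline rate depends on the indicator chosen, here is a sketch comparing two simple success criteria on the same set of replications. The data and both criteria are invented for illustration; the actual project used around 100 replications and five indicators:

```python
# Each replication: (replication_p_value, original_effect_inside_replication_CI)
# Hypothetical data, for illustration only.
replications = [
    (0.03, True), (0.20, True), (0.001, True), (0.45, False),
    (0.04, True), (0.60, True), (0.12, False), (0.02, True),
    (0.75, False), (0.30, True),
]

# Criterion 1 (strict): the replication itself reached P < .05.
rate_significant = sum(p < 0.05 for p, _ in replications) / len(replications)

# Criterion 2 (lenient): the original effect size lies inside the
# replication's confidence interval.
rate_ci = sum(in_ci for _, in_ci in replications) / len(replications)

print(rate_significant)  # 0.4
print(rate_ci)           # 0.7
```

The same ten studies yield a 40% or a 70% "success rate" depending on the criterion, which is exactly the ambiguity the authors caution about.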
-
-
scienceintheclassroom.org scienceintheclassroom.org
-
which is 25 to 40% of the estimated CO2 released to the atmosphere from arctic fresh waters (4, 9). The remaining CO2 released to the atmosphere is likely generated in soil waters and transferred directly to surface waters (3, 4) or is respired by bacteria
The carbon dioxide (CO<sub>2</sub>) released from surface water as calculated in this paper accounts for only 25% to 40% of the estimated CO<sub>2</sub> release from all arctic freshwater.
Other possible sources of CO<sub>2</sub> include CO<sub>2</sub> released directly from water in the ground and CO<sub>2</sub> produced by organisms living in the sediment at the bottom of lakes, rivers, and streams.
-
- Nov 2016
-
scienceintheclassroom.org scienceintheclassroom.org
-
Our results, together with recent reports showing brain calcification in microcephalic fetuses and newborns infected with ZIKV (10, 14) reinforce the growing body of evidence connecting congenital ZIKV outbreak to the increased number of reports of brain malformations in Brazil.
The authors report that their results confirm previous evidence connecting the ZIKV outbreak in Brazil to an increase in cases of microcephaly.
-
Our results demonstrate that ZIKV induces cell death in human iPS-derived neural stem cells, disrupts the formation of neurospheres and reduces the growth of organoids (fig. S2), indicating that ZIKV infection in models that mimics the first trimester of brain development may result in severe damage.
Summarizing the results, the authors conclude that in their model of early brain development, ZIKV causes severe damage.
-
These results suggest that the deleterious consequences of ZIKV infection in human NSCs, neurospheres and brain organoids are not a general feature of the flavivirus family.
Because DENV2 did not reduce cell growth or affect morphology, the researchers concluded that those effects are unique to ZIKV and not characteristic of the flavivirus family (to which both DENV2 and ZIKV belong).
-
ZIKV induced caspase 3/7 mediated cell death in NSCs
Zika virus activates caspase 3/7, indicating a cell is preparing to die.
Dengue virus 2, on the other hand, did not increase caspase 3/7 activity.
-
-
scienceintheclassroom.org scienceintheclassroom.org
-
peer reviewers are more likely to reward projects with the potential for a very high-impact publication and have considerable ability to discriminate among strong applications
The authors' findings suggest that peer reviewers are good at identifying innovative and ground-breaking projects.
-
peer reviewers add value by identifying the strongest research proposals
The authors show that peer review scores are good predictors of scientific productivity when differences in field of research, year, and applicant qualifications are removed. This suggests that peer reviewers have the necessary expertise to choose good applicants.
-
the grant with a 1-SD worse score is predicted to have 7.3% fewer future publications and 14.8% fewer future citations
The authors conclude that, after accounting for gender, ethnicity, and institutional prestige, a grant whose peer-review score is one standard deviation worse is predicted to produce about 7.3% fewer future publications and 14.8% fewer future citations.
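The quoted percentage effects can be read as simple multiplicative changes in predicted output. A sketch of that arithmetic, where the baseline publication and citation counts are hypothetical and only the 7.3% and 14.8% figures come from the paper:

```python
# Predicted output for a grant whose score is 1 SD worse than a baseline grant.
# Baseline values are hypothetical; the percentage effects are from the paper.
baseline_publications = 10.0
baseline_citations = 300.0

pub_effect_per_sd = -0.073   # 7.3% fewer publications per 1-SD-worse score
cite_effect_per_sd = -0.148  # 14.8% fewer citations per 1-SD-worse score

def predicted(baseline, effect, sds_worse=1):
    # Apply the multiplicative effect once per standard deviation of score.
    return baseline * (1 + effect) ** sds_worse

print(round(predicted(baseline_publications, pub_effect_per_sd), 2))  # 9.27
print(round(predicted(baseline_citations, cite_effect_per_sd), 1))    # 255.6
```

So a hypothetical 10-publication, 300-citation baseline grant drops to roughly 9.3 publications and 256 citations when its score is one standard deviation worse.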
-
including these variables does not substantively affect our findings
The authors concluded that writing skill does not affect the chance of receiving grant funding from the NIH.
-
This variation in citations underscores the potential gains from being able to accurately screen grant applications on the basis of their research potential
The authors found that there is a lot of variation in the research output of projects that receive grants. They conclude that it would be useful to find a way to accurately screen applications to determine their potential.
-
NIH is the world’s largest funder of biomedical research (12). With an annual budget of approximately $30 billion, it supports more than 300,000 research personnel at more than 2500 institutions (12, 13). A funding application is assigned by topic to one of approximately 200 peer-review committees (known as study sections).
Based on an analysis conducted by the authors, biomedical research is valued highly by individuals, governments, foundations, and corporations. Research is seen as a source of more effective treatments and preventive measures and as a route to policy, new commercial products, and economic development.
As a result, investments in biomedical research are the highest of all sectors.
-
- Sep 2016
-
scienceintheclassroom.org scienceintheclassroom.org
-
J. Lewis et al., Science 293, 1487 (2001).
Lewis and colleagues made a genetically modified mouse with a mutated APP gene and a mutated tau gene.
These double mutants had plaques like other hAPP mice, but they also had tangles that were more severe than in other mice, showing that problems with amyloid and tau can interact and cause symptoms of Alzheimer’s disease.
-
We found no adverse effects of tau reduction on health or cognition in mice, and the evidence that even partial tau reduction robustly protected mice from Aβ and excitotoxic agents highlights its potential benefits.
This work started as a side project and eventually became a pillar of Dr. Roberson's later work on mouse models of Alzheimer's disease. It has been credited with "sparking the research community's interest in the role of tau in the pathogenesis of Alzheimer's disease."
Read more in Neurology Today:
-
Our findings raise the possibility that tau reduction could protect against AD and other neurological conditions associated with excitotoxicity
This work was covered by AlzForum, a leading advocacy group for Alzheimer’s research and patients.
Read more at AlzForum:
http://www.alzforum.org/news/research-news/app-mice-losing-tau-solves-their-memory-problems
-
tau modulates sensitivity to excitotoxins and may be involved in regulating neuronal activity
Lowering tau even in normal mice (those without the hAPP gene, which would not get Alzheimer's disease) made the mice less prone to seizures when injected with a seizure-inducing drug.
This means that in healthy mice, normal tau may be involved in modulating excitation in neurons, reducing the overall risk of seizures.
(Seizures are a result of abnormally high electrical activity in the brain that occurs when certain neurons are excessively excited.)
-
at a dose that was not lethal to mice without hAPP (P < 0.05). Tau reduction prevented this effect, as no hAPP/Tau+/– or hAPP/Tau–/– mice died. Seizures in hAPP/Tau+/– and hAPP/Tau–/– mice were less severe and occurred at longer latencies than in hAPP/Tau+/+ mice
Like humans with Alzheimer’s disease, many mice used as models of the disease have a tendency to get seizures, including the hAPP mice used in this study.
However, hAPP mice without tau or with reduced tau had fewer seizures than those with the normal amount of tau.
-
Given that tau reduction prevented behavioral deficits but not neuritic dystrophy, these may represent parallel, rather than causally linked, disease manifestations, or tau reduction may act downstream of neuritic dystrophy.
Eliminating tau in hAPP mice prevented memory and cognitive problems, but it didn’t prevent the neurons from becoming damaged and impaired.
Tau removal may therefore improve cognition in these mice through a mechanism that is separate from, or downstream of, this structural damage.
-
Despite the differences in their behavior, hAPP/Tau+/+, hAPP/Tau+/–, and hAPP/Tau–/– mice had similar amounts of neuritic dystrophy
All hAPP mice, independent of the amount of tau, had neurons whose axons and dendrites were withering away in regions near amyloid plaques.
These results show that this type of damage in Alzheimer’s disease can happen even without tau being present.
-
In our study, reduction of endogenous, wild-type tau protected hAPP mice against Aβ-dependent cognitive impairments, and this did not involve the elimination of a large pool of tau with typical AD-associated modifications
Reducing tau in hAPP mice protected those mice against the cognitive impairments seen in Alzheimer’s disease.
The authors looked at hAPP mice with normal amounts of tau to see how the tau became abnormal later in life. They compared these changes with abnormal tau measured in other mouse models of Alzheimer’s disease.
The connection between tau and amyloid-β in this model isn’t clear and may involve multiple types of tau or tau stored in various locations (or pools) within the cell.
-
Thus, the beneficial effects of reducing tau were observed without detectable changes in Aβ burden, suggesting that tau reduction uncouples Aβ from downstream pathogenic mechanisms
The authors examined the brains of hAPP mice with different levels of tau. They found that, at all levels of tau, the amount of amyloid plaques and the amount of floating amyloid-β were the same in all hAPP mice.
Thus, the reduction of tau, which had improved memory, didn’t alter the amyloid-β or plaque levels, suggesting that amyloid-β or plaque levels alone are not responsible for memory loss.
-
Thus, tau reduction prevented major Aβ-dependent adverse effects in hAPP mice
hAPP mice are known to die young, as a result of the induced Alzheimer’s disease. But if hAPP mice had a reduced amount of tau, they lived much longer.
Additionally, hAPP mice with normal levels of tau were hyperactive even into middle age, whereas hAPP mice with no tau were not.
-
tau reduction gene dose-dependently ameliorates Aβ-dependent water maze learning and memory deficits.
Mice with the hAPP gene usually do poorly in the water maze because of their impaired memory. However, hAPP mice that have only half the amount of normal tau did remember where the platform was after additional training.
hAPP mice with no tau performed similar to normal, healthy mice, showing good memory on this test.
Thus, removal of all the normal tau seems to prevent memory problems. Even removing just half the normal tau already has some memory benefits.
-
Thus, tau reduction can block Aβ- and excitotoxin-induced neuronal dysfunction and may represent an effective strategy for treating Alzheimer's disease
Evidence from this study and following studies showing the importance of lowering tau has led researchers to target tau with their experimental new treatments, including vaccines that use the body’s own immune system to fight off the problematic tau proteins.
http://www.medicaldaily.com/alzheimers-disease-tau-protein-vaccine-391883
Almost 10 years after this paper was published, no good treatments for Alzheimer’s disease exist, but more and more of the new drugs in clinical trials are targeting the tau protein, as a result of early laboratory research such as this.
-
-
scienceintheclassroom.org scienceintheclassroom.org
-
Similarly, Ctnnb1 andEdar, two other placode markers, also show marked differences in expression between wild-type and scaleless dragons. In both phenotypes, expression of these two genes is first ubiquitous across the whole epidermis before becoming restricted to the placodes in wild-type individuals only (Fig. 4B). These results indicate that expression of each of these three placode markers in reptiles is similar to the expression dynamic of the corresponding genes in mammals (27) and birds (20, 28,39). On the other hand, the absence of an anatomical placode in scaleless dragons coincides with the inability of signaling pathways to pattern the skin, similar to what is observed in mice deficient in Eda/Edar(40).
The authors demonstrate that reptiles require not only the signaling pathway but also the anatomical placode to develop scales properly.
-
generates a new splice donor site (gt) 42 bases upstream of the wild-type donor site, thus generating a 14–amino acid deletion in the corresponding transcript (Fig. 3D).
The insertion of the jumping DNA caused a mutation. This mutation affected splicing, a process in which the RNA transcript is edited before it is translated into protein.
This incorrect splicing resulted in a shortened non-functional protein.
-
argued homologous to those characterized in chicken
Because feather development in chickens and scale development in lizards follow similar patterns, the two structures are likely related by common descent.
-
This set of new results coherently and conclusively indicates that most skin appendages in amniotes are homologous; that is, they all evolved from a shared common ancestor that exhibited appendages developing from an anatomical placode and expressing a set of signaling molecules still involved in the development of scales, hairs, and feathers of extant species.
The placode is crucial in the formation of all the different skin appendages (hair, feathers, scales), and all of them develop with the same molecules.
This shows that they are homologous, that is, inherited from a shared common ancestor.
-
These results reveal a new evolutionary scenario where hairs, feathers, and scales of extant species are homologous structures inherited, with modification, from their shared reptilian ancestor’s skin appendages already characterized by an anatomical placode and associated signaling molecules.
The authors have found that the placode and the placode's signalling molecules are highly conserved - meaning it is found across many species.
This study shows that the placode is responsible for the formation of many different structures in the various species, such as feathers, scales and hair.
-
homozygous for a codominant mutation
Homozygous means that an individual carries two copies of the same allele.
Codominant means that when two different alleles are present in the same individual, both are expressed.
In this case, a scaleless mutant must carry two copies of the mutant allele. With only one copy, scales form only partially.
-
similar phenotypes in other vertebrates because of impairments of the EDA receptor (EDAR; a member of the TNF family) (18) or its ligand EDA, indicating a conserved role of this pathway in reptiles as well
Similar disease symptoms can be observed in other animals with a mutation in the same gene.
-
Scaleless dragons show an irregular skin surface with the initiation of some dermoepidermal undulations of the skin (Fig. 3G), indicating that this phenomenon does not fully require the presence of anatomical placodes.
Even though the mutant does not form scales, as a result of a nonfunctional protein, its skin still shows the beginnings of a scale pattern.
This means that elements outside the anatomical placode also contribute to the formation of scales.
-
-
scienceintheclassroom.org scienceintheclassroom.org
-
This key experiment showed that eels never (10 of 10 trials for each of two eels) followed a doublet with an attack volley without a “mechanosensory echo” from the prey, but attacked in response to the stimulator-generated fish twitch
Eels first check whether prey is alive by sending two rapid electric pulses (a doublet). Only after feeling the fish's movement, which the doublet induces, does the eel attack with a volley of much stronger shocks.
-
These experiments suggest that the electric eel’s strong electric organ discharge remotely activates motor neuron efferents of its prey, although this activation could occur anywhere between the spinal cord and the presynaptic side of the neuromuscular junction.
The fish responded the same way regardless, so the eel's shock must activate the fish's motor neurons directly, somewhere between the spinal cord and the neuromuscular junction, rather than acting through the brain.
-
Overall, this study reveals that the electric eel has evolved a precise remote control mechanism for prey capture, one that takes advantage of an organisms’ own nervous system.
This study revealed that eels, through evolution by natural selection, have developed a very efficient system of prey detection and capture. The eel's electrical discharges act much like a remote control: its pulses hijack the prey's own nervous system from a distance, triggering massive involuntary muscle contraction that freezes the fish in place.
-
This result indicates that fish are immobilized by massive, involuntary muscle contraction.
The eel's electric shock contracts the fish's muscles, which paralyses it.
-
These findings indicate that fish motor neuron activation is required to induce tetanus in prey
The eel's shock affects the nervous system (motor neurons) of the fish.
-
-
scienceintheclassroom.org scienceintheclassroom.org
-
Smaller deletions (also unique to the individual family) (table S5) were closest to CNTN3, encoding BIG-1, an immunoglobulin super-family protein that stimulates axon outgrowth (32); RNF8, encoding a RING finger protein that acts as a ubiquitin ligase and transcriptional co-activator (33); and SCN7A (amid a cluster of voltage-gated sodium channels that also includes SCN1A, SCN2A, SCN3A, and SCN9A) on 2q
The authors identified small deletions in the genome of the family in genes coding for various proteins that are highly expressed in the brain. However, it has not been proven that all these deletions play a role in the disease.
-
The relatively reduced M/F ratio of affected children and the reduced rate of linked de novo CNVs in the consanguineous sample (not significantly different from rates in control) both suggest that consanguineous pedigrees with autism are enriched for autosomal recessive causes
In consanguineous families, the authors observed a higher proportion of affected females among the affected children, as well as a reduced rate of de novo CNVs.
This could mean that families with related parents present more recessive causes for autism. The authors will now try to determine what these causes are and where they are located on the genome.
-
rates of inherited CNVs (some potentially causative) were high in both the SNP and BAC arrays, ranging in size from 1.4 kb to 3.9 Mb (tables S2 and S3), overall rates of de novo CNVs that segregated with ASD were 0% in consanguineous multiplex (0 of 42 patients) and 1.9% in consanguineous simplex families (1 of 52 patients), which were considerably lower than reported for nonconsanguineous families: 1.28% in the HMCA overall
After the experiments, the data were analyzed to determine, for each subject, the rate of de novo copy number variants (CNVs) relative to a reference set of affected and unaffected subjects, as well as the rate of CNVs inherited from the parents.
The data showed that de novo CNVs were much less frequent in consanguineous families (both simplex and multiplex), while the rate of inherited CNVs was high.
-
An increased role for inherited factors in autism families with shared ancestry was also suggested by a low rate of de novo CNVs
The low rate of de novo copy number variants suggests that the abnormalities in autistic patients' genomes come from defective genes that were already present in the parents, rather than from healthy genes that mutated before or shortly after fertilization.
-
A second >300 kbp, linked, homozygous deletion (again not present in >2000 individuals other than this family) is closest to PCDH10 on 4q28 (Fig. 2 and table S5), which encodes a cadherin superfamily protein essential for normal forebrain axon outgrowth
A second large deletion was identified on chromosome 4, closest to PCDH10, which codes for a protein essential for neuron, and more specifically forebrain axon, growth.
-
The deletion completely removes c3orf58, which encodes an uncharacterized protein with a signal peptide that localizes to the Golgi (28). Moreover, the deletion is near the 5′ region of NHE9, such that only 60 to 85 kbp upstream of the transcription initiation site is spared
This deletion completely removes a protein-coding gene called c3orf58, along with a large section of possible regulatory regions of another gene called NHE9.
The authors focused on one pedigree in this paragraph: the family of patient AU-3101. The deletion removes an entire gene, coding for a protein present in the Golgi apparatus, as well as an upstream region of the gene coding for NHE9, an ion exchanger.
-
we were surprised to see that several consanguineous pedigrees showed large, rare, inherited homozygous deletions within linked regions, some of which are very likely causative mutations (Figs. 1 and 2 and table S5). Such deletions were present in 5 of 78 consanguineous pedigrees (6.4%) and ranged in size from 18 thousand base pairs (kbp) to > 880 kbp
The authors noticed that in some cases there were deletions of long sequences present on both chromosomes of the affected individuals. Some of these large homozygous deletions are very likely causative mutations.
-
nonoverlapping between families, consistent with genetic heterogeneity,
The abnormalities in autistic patients' chromosomes were not shared between families. This suggests that autism can have many different genetic origins.
-
the M/F ratio of affected individuals was typical, at 4.8:1 (115 males: 24 females). However, in consanguineous, multiplex pedigrees, the M/F ratio was 2.6:1 (34 males: 13 females) (fig. S1), compared to 7.4:1 (81 males: 11 females) for the other categories of families (i.e., nonconsanguineous and consanguineous simplex)
The authors observed that the M/F ratio in consanguineous multiplex families (2.6:1) was nearly half that of the HMCA overall (4.8:1) and roughly a third of that in the other family categories (7.4:1). In other words, the proportion of affected females was substantially higher in consanguineous multiplex pedigrees.
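As a quick sanity check on these figures, the reported M/F ratios can be converted into the fraction of affected children who are female. This is a minimal sketch of the arithmetic; the conversion and labels are ours, while the ratios come from the excerpt above.

```python
def fraction_female(m_per_f: float) -> float:
    """Given an M/F ratio of m_per_f males per female, return the female fraction."""
    return 1.0 / (m_per_f + 1.0)

# M/F ratios of affected children, as reported in the excerpt
ratios = {
    "HMCA overall": 4.8,
    "consanguineous multiplex": 2.6,
    "nonconsanguineous and consanguineous simplex": 7.4,
}

for label, ratio in ratios.items():
    print(f"{label}: {fraction_female(ratio):.1%} female")
```

Running this shows roughly 17% female overall, about 28% in consanguineous multiplex families, and about 12% in the other categories, which is why a lower M/F ratio is read as an enrichment for recessive (autosomal) causes.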
-
Our finding that deletions of genes regulated by neuronal activity or regions potentially involved in regulation of gene expression in autism suggests that defects in activity-dependent gene expression may be a cause of cognitive deficits in patients with autism.
The authors found that large deletions of genes regulated by neuronal activity were associated with autism symptoms. This is an interesting finding, because autism symptoms often emerge later in development.
-
-
scienceintheclassroom.org scienceintheclassroom.org
-
Although our findings show that NIH grants are not awarded purely for previous work or elite affiliations and that reviewers contribute valuable insights about the quality of applications, mistakes and biases may still detract from the quality of funding decisions.
Taken together, these results show that previous work and elite affiliations do not "close the door" to new ideas in research.
-
-
www.scienceintheclassroom.org www.scienceintheclassroom.org
-
Pharmacotherapies that increase hippocampal BDNF may prove to be efficacious treatments for fear disorders characterized by extinction impairments
One of the goals of fear learning research is to be able to improve therapies for people suffering from fear and anxiety disorders.
The fear extinction process is impaired in many of these patients, so identifying molecular targets may aid in drug development.
-
it is possible that BDNF treatment may lead to partial reversal of conditioning-induced changes
Changes that occur following fear conditioning, such as a reduction in hippocampal BDNF, may be corrected by BDNF treatment.
-
Because BDNF facilitates NMDA receptor currents (11, 12), exogenously applied BDNF may simulate extinction by inducing bursting in the IL mPFC
The effects that infused BDNF has on extinction are mediated by NMDA receptors in the IL mPFC.
Activation of these receptors may stimulate neurons in this region, and this could explain why infusion of BDNF results in reduced freezing even when the animals do not go through extinction training.
-
Our results provide further support for the importance of this pathway in extinction and extend these findings by identifying BDNF as a key molecular mediator
This research furthers our understanding of fear learning by describing extinction using both a systems level approach (hippocampus to IL-mPFC pathway) and a molecular mechanism (BDNF-mediated NMDA activation).
-
We were able to pharmacologically induce extinction with a single infusion of BDNF into the hippocampal-infralimbic pathway, a key projection for extinction memory
When infused into the hippocampus, BDNF can induce extinction in the absence of extinction training.
This suggests that activation of this pathway is a key component of the fear extinction process.
-
which suggests that the IL mPFC is the primary site of action for hippocampal BDNF
Blocking BDNF activity in the IL mPFC inhibits the effects of BDNF infused into the hippocampus.
-
BDNF infused into the hippocampus reduced fear, as measured by both freezing [main effect of drug F2,21 = 4.715, P = 0.020, post hoc P = 0.013 comparing SAL(IL) + SAL(Hipp) to SAL(IL) + BDNF(Hipp)] (Fig. 3C) and conditioned suppression of food seeking (fig. S4)
The fact that BDNF infusion into the hippocampus results in similar fear reduction to infusion into the IL mPFC supports the hypothesis that the hippocampus supplies BDNF to the IL mPFC.
-
NMDA receptors are necessary for BDNF-induced reductions in fear
The role of NMDA receptors in fear acquisition has been previously established.
By showing that rats receiving an NMDA antagonist do not respond to BDNF, the authors show that these receptors are also required for BDNF-induced fear reduction.
-
indicating that BDNF left the original fear memory intact
Based on the fact that unsignaled footshocks can reinstate freezing after extinction, the authors conclude that BDNF does not degrade the original memory.
-
The lack of effect on conditioning and open-field anxiety suggests that BDNF infusions did not decrease amygdala activity nonspecifically
When testing the behavioral effects of a treatment, it is important to rule out nonspecific interaction with other regions of the brain that may similarly affect behavior through an alternative route.
The amygdala is known to be involved in fear acquisition, so if the BDNF was nonspecifically acting on the amygdala it would be apparent during conditioning or open-field anxiety testing.
-