Olympic exchange theory
Here is a debunk: Titanic Switch Theory conclusions and references
This summer, unprecedented wildfires also burned in Greenland, Alaska, and Siberia, contributing to ice melt as their soot and smoke traveled across the Arctic.
It is not unprecedented. Large areas of the Arctic burn every year and this is part of the natural cycle. See: ‘Arctic On Fire’ Is Normal, Part Of Nature - Moose, Bears, Voles, Foxes, Owls, Birds Of Prey All Depend On Wild Fires
Democratic People's Republic of Korea
North Korea itself rejects communism.
“There are two ways of looking at a place: There is what it calls itself, and there is what analysts or journalists want to say a place is,” Owen Miller, who lectures in Korean history and culture at London’s School of Oriental and African Studies (SOAS), told Newsweek.
“On neither of those counts is North Korea Communist. It doesn’t call itself Communist—it doesn’t use the Korean word for Communist. It uses the word for socialism but decreasingly, less and less over the decades.”
The state’s official ideology is juche, a Sino-Korean word used in both North and South Korea that roughly translates as “independence, or the independent status of a subject,” according to Miller.
“Juche is enshrined in North Korea’s constitution, explicated in thousands of propaganda texts and books, while teachers indoctrinate North Korean children with the ideology at an early age.
The concept evolved in the 1950s, in the wake of the Korean War, as North Korea sought to distance itself from the influence of the big socialist powers: Russia and China. However, the concept has a more profound resonance for North Koreans, alluding to the centuries when Korea was a vassal state of the Chinese.
“When Kim Il Sung started using the word, he was using [it] to refer to this sense of injured pride, going back decades and much further, hundreds of years under Chinese control. He is saying North Korea is going to be an independent nation in the world, independent of other nations,” Miller says.”
The accelerating capacity build-out is changing the power sector landscape. Wind and solar will contribute 24% of power supply by 2040 compared with 7% today. Although the competitiveness is improving, there are practical limitations to reaching a fuel mix comprised of 50% or greater share for solar and wind. We see growth in energy storage to almost 600 GW. But without long-duration storage, on a much higher scale, high solar and wind yields negative prices and grid shape, design and stability issues.
This does not mention that pumped hydro is the preferred long-duration storage for solar and wind energy, and that most countries have plenty of hydro storage available at very low cost, far less than batteries.
The EU alone has enough potential for developing future pumped hydro storage for 123 TWh, according to a 2013 report. That is enough to store the entire output of 5,000 gigawatts of power stations for one day.
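As a quick sanity check on that arithmetic (a minimal sketch; the 123 TWh and 5,000 GW figures are the ones quoted above):

```python
# How long could 123 TWh of pumped hydro storage supply
# 5,000 GW of continuous output?
storage_twh = 123      # EU pumped hydro potential (2013 report, quoted above)
output_gw = 5_000      # assumed continuous power draw

hours = storage_twh * 1_000 / output_gw   # 1 TWh = 1,000 GWh
print(f"{hours:.1f} hours, i.e. about {hours / 24:.2f} days")
# -> 24.6 hours, i.e. about 1.02 days
```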
This is how the UK plans to do it, and we have a fully worked-out transition plan through to 2050, which is how the UK government knew it could commit to it. The EU has many projects either in place or proposed, for both extra pumped hydro storage and long-range UHVDC power lines.
China already uses hydro storage on a large scale. The US can definitely do it. Australia can do it too: it has plenty of potential hydro storage to cover the peak demand for solar power.
The Australian National University has found 22,000 potential pumped hydro sites in Australia - so many that they say you only need to use the best 0.1% of them, and can afford to be choosy.
It's not just conventional hydro. The Australian company Genex is building solar power on the site of a disused gold mine, using pumped hydro for power storage in the old mine shafts: water is pumped up and down between the lower and upper galleries of the mine. See video.
There's an especially good synergy if you have solar panels floating on top of the lakes behind hydro electric dams.
The potential for solar panels floating on hydro-electric dams is vast. If 10% of the available surface area were used worldwide for solar power, it would produce a total of 5.211 million gigawatt hours of power a year. That's 5,211 terawatt hours.
Where Sun Meets Water: Floating Solar Market Report
There are many other ways to do it. Carbon-zero transport is likely to involve a transition to electric vehicles. Once nearly everyone is running them, owners can use a fraction of the car's battery charge to buy electricity from the grid when it is in surplus and cheap, and sell it back at a profit when it is in demand, making money from their cars while parked, as the sketch below illustrates.
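As a rough illustration of that buy-low, sell-high idea (a minimal sketch; all the numbers are hypothetical):

```python
# Vehicle-to-grid arbitrage sketch with made-up illustrative numbers.
# The owner reserves part of the EV battery to buy cheap off-peak power
# and sell it back at the peak price.
battery_kwh = 60              # assumed EV battery size
tradable_fraction = 0.2       # share of capacity offered to the grid
buy_price = 0.05              # $/kWh off-peak (illustrative)
sell_price = 0.20             # $/kWh peak (illustrative)
round_trip_efficiency = 0.90  # charge/discharge losses

tradable_kwh = battery_kwh * tradable_fraction
profit = (tradable_kwh * round_trip_efficiency * sell_price
          - tradable_kwh * buy_price)
print(f"Daily arbitrage profit: ${profit:.2f}")  # -> $1.56
```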
Other methods include solar thermal storage already used in solar plants in deserts.
See also my article Do renewables for power generation take up more land area than fossil fuels? Well - not really!, which has more links to follow up on peaking power.
https://astrosociety.org/edu/publications/tnl/23/23.html
Broken link; it is in archive.org though. Here is the quote:
Marsden continued to refine his calculations, and discovered that he could trace Comet Swift-Tuttle's orbit back almost two thousand years, to match comets observed in 188 AD and possibly even 69 BC. The orbit turned out to be more stable than he had originally thought, with the effects of the comet's jets less pronounced. Marsden concluded that it is highly unlikely the comet will be 15 days off in 2126, and he called off his warning of a possible collision. His new calculations show Comet Swift-Tuttle will pass a comfortable 15 million miles from Earth on its next trip to the inner solar system. However, when Marsden ran his orbital calculations further into the future, he found that, in 3044, Comet Swift-Tuttle may pass within a million miles of Earth, a true cosmic "near miss."
Marsden's prediction, and later retraction, of a possible collision between the Earth and the comet highlight the fact that we will most likely have century-long warnings of any potential collision, based on calculations of orbits of known and newly discovered asteroids and comets. Plenty of time to decide what to do.
https://web.archive.org/web/20130402063233/https://astrosociety.org/edu/publications/tnl/23/23.html
an active supervolcano
It is not a supervolcano. Its VEI (Volcanic Explosivity Index) is variously estimated as 6 or 7; a supervolcano has a VEI of at least 8. This is just taken from the title of the New Scientist article - NS does tend to use hyperbole (exaggeration for emotional effect) sometimes.
This is a recent paper labeling it as VEI 7
Pan, B., de Silva, S.L., Xu, J., Chen, Z., Miggins, D.P. and Wei, H., 2017. The VEI-7 Millennium eruption, Changbaishan-Tianchi volcano, China/DPRK: New field, petrological, and chemical constraints on stratigraphy, volcanology, and magma dynamics. Journal of Volcanology and Geothermal Research, 343, pp.45-59.
This 2016 paper calls it VEI ≤ 6:
Despite its historical and geological significance, relatively little is known about Paektu, a volcano that has produced multiple large (volcanic explosivity index ≤ 6) explosive eruptions, including the ME, one of the largest volcanic events on Earth in the last 2000 years
The explosive ME deposited 23 ± 5 km3 dense rock equivalent (DRE) of material emplaced in two chemically distinct phases in the form of ash, pumice, and pyroclastic flow deposits
Iacovino, K., Ju-Song, K., Sisson, T., Lowenstern, J., Kuk-Hun, R., Jong-Nam, J., Kun-Ho, S., Song-Hwan, H., Oppenheimer, C., Hammond, J.O. and Donovan, A., 2016. Quantifying gas emissions from the “Millennium Eruption” of Paektu volcano, Democratic People’s Republic of Korea/China. Science advances, 2(11), p.e1600913. Press release
Study provides new evidence about gas emissions from ancient North Korean volcanic eruption
USGS definition of a supervolcano:
The term "supervolcano" implies a volcanic center that has had an eruption of magnitude 8 on the Volcano Explosivity Index (VEI), meaning that at one point in time it erupted more than 1,000 cubic kilometers (240 cubic miles) of material. Eruptions of that size generally create a circular collapse feature called a caldera.
The NS article is just using hyperbole for a more dramatic headline for emotional effect
Andy Coghlan (15 April 2016). "Waking supervolcano makes North Korea and West join forces". NewScientist. Retrieved 17 May 2019.
In your own supervolcano article it is listed as a VEI 7, a "super eruption" as your page puts it, but not quite a supervolcano. The page itself explains that a supervolcano is 8 or more.
However difficult climate diplomacy might be, we should not shy away from clearly stating that nothing has been achieved as long as we cannot see an effect on global measurements. Therefore, the political response has been inadequate.
China is the key here. Half of the recent increase in CO2 emissions is due to China. It is rapidly industrializing and rapidly decarbonizing at the same time. At present it is still adding coal power, but it is on the point of a massive switch-over. It may already have passed from increasing to level emissions; it's hard to tell, and last year could turn out to be the peak. It will certainly peak before 2030 and probably before 2025. They recently said they will increase their pledge in 2020 and give a road map towards carbon zero at that point.
For more background and cites see my
Many other countries are already rapidly decreasing their emissions; the UK is an example. We are on pretty much a straight line down to 20% of 1990 levels by 2050, so it is not too much of an ask to dip that down a bit more to reach 0% by 2050.
But we can say that the currently observed warming is very much consistent with our understanding of the climate system, which goes back to the end of the 19th century. Back then, a Swedish physicist named Svante Arrhenius already calculated the expected degree of warming without the use of computer models, and his estimate is still very much in line with our current understanding.
He was the first to develop the idea of global warming due to CO2, but if his prediction was the same as ours it is just coincidence: he used it to explain the exit from the ice ages, and he didn't have the modern understanding, e.g. of the Milankovitch cycles and all the complexities of the interactions involved. Up to the 1970s scientists were still thinking in terms of climate following a random walk back to a future ice age.
https://www.sciencehistory.org/distillations/magazine/future-calculations
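For reference, the logarithmic relationship between CO2 and equilibrium warming that Arrhenius pioneered is, in its standard modern form (a textbook formula, not taken from the article above):

```latex
\Delta T_{\mathrm{eq}} \;=\; S \cdot \frac{\ln\left(C/C_{0}\right)}{\ln 2}
```

where S is the equilibrium climate sensitivity (the warming per doubling of CO2), C is the CO2 concentration and C_0 the pre-industrial baseline. The modern range for S, mostly 1.5 to 4.5°C, comes up again below.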
What we often hear is that increased levels of greenhouse gases will make such extreme heat events more likely, but that a direct link cannot be proven. In my view, this level of precaution is unfounded.
There is pretty robust research suggesting that we have more and stronger heat waves in a warming world.
This is my paraphrase from the IPCC 2018 report Chapter 3
The number of exceptionally hot days is set to increase most in the tropics; extreme heat waves emerge early and are expected to be widespread already at 1.5°C of global warming.
1.5°C instead of 2°C means around 420 million fewer people frequently exposed to extreme heatwaves and 65 million fewer exposed to exceptional heat waves assuming constant vulnerability (i.e. they don't migrate to avoid them)
This is my summary of a recent paper:
They retropredicted that in 2000, 37% of the population had an increased susceptibility to dying due to a heat wave (a 1 in 10,000 risk if you don't take adequate precautions); the actual figure was 30.6%. From that they forward-predicted 47.6% (±9.6%) at 1.5°C and 53.7% (±8.7%) at 3°C.
It's not possible to say statistically whether any particular heat wave was caused by warming, but the predictions that the numbers affected per year will increase are reasonably robust.
The new models are producing much scarier projections on temperature increase
There was some recent research suggesting we may need to adjust the long-term climate sensitivity, the response over thousands of years. But there is such a vast range of values there that it's not likely one study significantly changes it. We need to wait for the next high-level review.
This is a Carbon Brief article on it from last year; you can see how the projections vary from below 1.5°C through to above 6°C, mostly between 1.5 and 4.5°C, for the response to doubling CO2 levels. But that's about how much it responds in equilibrium over thousands of years, e.g. after all the ice in Greenland and Antarctica that is unstable long-term melts.
And though there is still a huge range, the outliers are more clearly outliers than they used to be, there is a better understanding of the science and it is expected to be in the middle.
https://www.carbonbrief.org/explainer-how-scientists-estimate-climate-sensitivity
There is ongoing work, mostly for embedded Linux systems, to support 64-bit time_t on 32-bit architectures, too
The kernel is now just about fixed, with the end in sight; then it is over to the glibc library writers, and only after that can application writers update to be 2038-compatible:
2019 update: Approaching the kernel year-2038 end game By Jonathan Corbet January 11, 2019
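For anyone unfamiliar with the underlying problem, here is a minimal illustration (Python for convenience; the affected systems are C programs built with a 32-bit signed time_t):

```python
# A 32-bit signed time_t counts seconds since 1970-01-01 UTC and
# overflows at 2**31 - 1 seconds: the "year 2038 problem".
from datetime import datetime, timezone

last_valid = 2**31 - 1  # largest value a 32-bit signed time_t can hold
print(datetime.fromtimestamp(last_valid, tz=timezone.utc))
# -> 2038-01-19 03:14:07+00:00

# One second later the counter wraps to -2**31, i.e. back to 1901:
print(datetime.fromtimestamp(-2**31, tz=timezone.utc))
# -> 1901-12-13 20:45:52+00:00
```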
Strangelets are small pieces of strange matter, perhaps as small as nuclei. They would be produced when strange stars are formed or collide, or when a nucleus decays
An excellent cite here for strangelets is the LHC safety review in 2011. It also gives additional details that would be useful for the article and includes a short summary of the state of current research on strangelet production. The supplement to the review describes how the LHC confirmed the emerging picture, which is that strange matter does not form at high energies
Also, just as ice cubes are not produced in furnaces, the high temperatures expected in heavy-ion collisions at the LHC would not allow the production of heavy nuclear matter, whether normal nuclei or hypothetical strangelets.
Review of the Safety of LHC Collisions LHC Safety Assessment Group, 2011
Implications of LHC heavy ion data for multi-strange baryon production, LHC Safety Assessment Group, Sept 26, 2011
“Limiting global warming to 1.5°C would require rapid, far-reaching and unprecedented changes in all aspects of society.”
It did not, however, say that those changes are sacrifices or that they make society worse. The IPBES report goes into it far more: these are changes with huge social benefit, a circular sustainable economy, local people more involved in decisions about what they do, valuing, preserving and restoring nature's services. And reducing pollution too, and with electric cars a quieter world (though they may need to add noises to electric cars to make them easier to hear). Like the post I shared a while back.
“It is no coincidence that the deepest and most protracted recessions in recent decades have taken hold in countries that experienced booms.”
It is possible to have endless economic growth. For instance, increasing economic growth does not need to lead to increasing energy use, and often goes along with reduced energy use.
Plainly enough then, the moneyed world’s worries are beginning to sound a lot like those voiced by advocates of the Green New Deal and campaigners of Fridays for the Future and the Extinction Rebellion
There are big savings to be made for sure. One estimate is that we save $20 trillion by 2100 if we limit warming to 1.5°C compared to 2°C
Burke, M., Davis, W.M. and Diffenbaugh, N.S., 2018. Large potential reduction in economic damages under UN mitigation targets. Nature, 557(7706), p.549.
Even if it only wiped out all but 3.5 billion, it would wipe out half of today’s human population. Human die off at even this less extreme scale would put the politically popular cause of economic growth in sharp reverse.
This is not happening. The population is leveling off over much of the world due to prosperity not scarcity.
There is inefficient agriculture, in Africa especially, that could produce ten times as much food if it were optimized like the best agriculture in the US and China. Most of the predicted population increase is in Africa. Worldwide our population is leveling off, no longer growing exponentially - not due to scarcity, as used to be predicted, but due to prosperity. Reduced child mortality, better education, and greater equality of women in decision-making all play a role here. With fewer children dying, parents have smaller families because they know their children have a good chance of surviving to adulthood.
Japan is one of the nations with the most rapidly declining populations. We are already at, or close to, peak child: we have roughly the same number of children as a decade ago, and the world population continues to grow mainly because of the extraordinary advances in health care. Worldwide we live ten years longer than 50 years ago. For some countries it is 20 years longer; for China it is a truly remarkable 30-year increase in life expectancy from 1960 to 2010.
This expands on some of that, and goes into other things such as resources and energy return on energy invested for fossil fuels compared to renewables
Debunked: Soon we won’t be able to feed everyone because the world population is growing so quickly
“I think it’s extremely unlikely that we wouldn’t have mass death at 4C. If you have got a population of nine billion by”
This is a very out of date quote from 2009. "Warming will 'wipe out billions'". scotsman.com.
It is not what modern research is finding, we can feed everyone through to 2100 on all the scenarios.
I summarize some of this research here: We can feed everyone through to 2100 and beyond
this term is used to refer to any Eastern Orthodox Christian
This explains how in the Eastern Orthodox church all Christians think of themselves as servants of God, and how Jesus was as well. Being a Servant
the existential threat posed by climate change.
This is from journalist exaggerations and junk science. The IPCC's own worst case is a scenario where we can still feed everyone through to 2100 but with reduced food security.
See Box 8 of chapter 3 of the 2018 IPCC report
I summarize it as:
This is one of their scenarios from chapter 3 of the 2018 report. There is nothing remotely like extinction or end of civilization in this scenario. We can still feed everyone as well, though with less food security. It is still a world with much of our natural world still here, the majority of the species survive, not a desert. However it is a world we would not want to head for, with the corals nearly all gone, many areas of the world facing problems, severe loss of biodiversity and increasing rather than decreasing world poverty by 2100.
[The IPCC’s own worst case climate change example - a 3°C rise by 2100](https://www.quora.com/q/duzzmyeobxjljrpq/The-IPCC-s-own-worst-case-climate-change-example-a-3-C-rise-by-2100)
“Revolution or collapse — in either case, the good life as we know it is no longer viable.”
Renewables do work and we can have a fossil fuel free economy. Many countries are already rapidly transitioning to a largely renewables based civilization.
[Do renewables for power generation take up more land area than fossil fuels? Well - not really!](https://www.science20.com/robert_walker/ipcc_did_not_say_12_years_or_18_months_to_save_planet_no_scientific_cliff_edge_should_they_challenge_it_or_who) Yes, climate change can be beaten by 2050. Here’s how.
key tipping points
This article does not mention any tipping points. The IPCC found that there are no climate tipping points up to 2°C and beyond, except the West Antarctic and Greenland ice melt, and those unfold over many thousands of years.
Although Britain boosted the Paris Agreement in June by committing to net zero carbon emissions by 2050, the country, preoccupied by Brexit, is far from on a climate war footing.
We do not need to be on a climate war footing -- that can lead to autocratic decisions and mistakes. It's what sociologists call a "wicked problem" - solutions are complex and need to be done carefully.
But we are working on this. See my.
The UK public are pretty much fed up about Brexit and the politicians know they have to find a solution. Maybe we leave this autumn, maybe we have a general election, one way or another we'll be resolving this.
But much else is happening in the UK; it's not just Brexit!
Likewise, a push led by France and Germany for the European Union to adopt a similar target was relegated to a footnote at a summit in Brussels after opposition from Poland, the Czech Republic and Hungary.
Poland, the Czech Republic and Hungary face major issues in transitioning fast because of their coal industries.
This is something to fix in future deals. The European Green Deal will address these issues, making it easier for them to transition. That's a proposal by Ursula von der Leyen, new president of the European Commission, and we now have a "Green Wave" in European politics that will make a big difference going into the future.
In October, the U.N.-backed Intergovernmental Panel on Climate Change (IPCC) warned emissions must start falling next year at the latest to stand a chance of achieving the deal’s goal of holding the global temperature rise to 1.5 degrees Celsius.
They did not say that. They said that there are many ways to stay within 1.5°C. The easiest way to do it, the "Low Energy Demand" scenario, involves a 45% reduction in emissions by 2030 tapering down to zero emissions by 2050. But there are many alternatives.
That foretaste of a radically hotter world underscored what is at stake in a decisive phase of talks to implement the 2015 Paris Agreement, a collective shot at avoiding climate breakdown.
These are often misunderstood. It is not about lethal heat: already 30% of the world was experiencing these heat waves every year. On the 3°C path that we are on now, this is projected to increase to a little over 50%. It's not a big deal in the sense that you do not die of a heat wave, not if you take the right precautions.
existential threat posed by climate change.
This is from journalist exaggerations and junk science. The IPCC's own worst case is a scenario where we can still feed everyone through to 2100 but with reduced food security.
See Box 8 of chapter 3 of the 2018 IPCC report
I summarize it as:
This is one of their scenarios from chapter 3 of the 2018 report. There is nothing remotely like extinction or end of civilization in this scenario. We can still feed everyone as well, though with less food security. It is still a world with much of our natural world still here, the majority of the species survive, not a desert. However it is a world we would not want to head for, with the corals nearly all gone, many areas of the world facing problems, severe loss of biodiversity and increasing rather than decreasing world poverty by 2100.
[The IPCC’s own worst case climate change example - a 3°C rise by 2100](https://www.quora.com/q/duzzmyeobxjljrpq/The-IPCC-s-own-worst-case-climate-change-example-a-3-C-rise-by-2100)
Timeline of carbon capture and storage
It has not been updated for a decade (as of 2019). There is lots of newer material in https://en.wikipedia.org/wiki/Carbon_capture_and_storage
The total carbon capture capacity of the facility is 800,000 tonnes per year.
They plan to expand to capture 2.3 million tonnes per year by 2025 and 5 million tonnes per year before 2030. They say they are commercially self-sustaining, with no government subsidies.
This is the first time that ALMA has ever observed the surface of a star and this first attempt has resulted in the highest-resolution image of Betelgeuse available.
This is about a decade out of date. There is a higher resolution image from 2009
The Spotty Surface of Betelgeuse Credit: Xavier Haubois (Observatoire de Paris) et al.
The figure in the paper itself is this one:
The paper is here:
Haubois, X., Perrin, G., Lacour, S., Verhoelst, T., Meimon, S., Mugnier, L., Thiébaut, E., Berger, J.P., Ridgway, S.T., Monnier, J.D. and Millan-Gabet, R., 2009. Imaging the spotty surface of Betelgeuse in the H band. Astronomy & Astrophysics, 508(2), pp.923-932.
There are other images of similar resolution. This is a paper from 2018:
Ariste, A.L., Mathias, P., Tessore, B., Lèbre, A., Aurière, M., Petit, P., Ikhenache, N., Josselin, E., Morin, J. and Montargès, M., 2018. Convective cells in Betelgeuse: imaging through spectropolarimetry. Astronomy & Astrophysics, 620, p.A199.
He lost the title of "World's Shortest Man"
He is still in the record as the shortest mobile man
Khagendra Thapa Magar is currently listed on the Guinness World Records site as the shortest man living (mobile)
Junrey Balawing is listed as the shortest man - living (non-mobile)
Balawing is the world's shortest man alive
He is in the record as the shortest non-mobile man
Khagendra Thapa Magar is currently listed on the Guinness World Records site as the shortest man living (mobile)
Junrey Balawing is listed as the shortest man - living (non-mobile)
Well known potentially hazardous asteroids are normally only a hazard on a time scale of hundreds of years
Many are only potentially hazardous on a timescale of thousands or millions of years. For example, Swift-Tuttle's first chance of impact is a small one, 1 in a million, in 4479.
For the alternative formulation, where X is the number of trials up to and including the first success, the expected value is E(X) = 1/p.
No cite is given; the calculation can be done as in the derivation below.
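A minimal derivation (standard probability, not from the cited page), writing q = 1 - p:

```latex
E(X) \;=\; \sum_{k=1}^{\infty} k\,(1-p)^{k-1}\,p
      \;=\; p \sum_{k=1}^{\infty} k\,q^{k-1}
      \;=\; p\,\frac{d}{dq}\!\left(\frac{1}{1-q}\right)
      \;=\; \frac{p}{(1-q)^{2}}
      \;=\; \frac{1}{p}.
```

The middle step uses the geometric series \(\sum_{k \ge 0} q^{k} = 1/(1-q)\) for \(0 \le q < 1\), differentiated term by term.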
Thus a wet bulb temperature of 35 °C (95 °F) is the threshold beyond which the body is no longer able to adequately cool itself
Confusingly, it doesn't explain that wet bulb temperatures differ significantly from the heat index. The US heat index is roughly the "perceived heat", how warm it feels, and is used mainly for public outreach such as heat wave warnings, rather than scientific research.
Sadly, because the conversion depends on radiant heat (as well as humidity), there is no systematic way to convert one to the other. It gives an idea of what the temperature feels like - but how well you can tolerate it may depend on the amount of humidity and how much of the perceived heat is due to radiant heat.
A wet bulb temperature of 33°C (92°F) corresponds very roughly to a heat index of around 57°C (135°F) in the absence of radiant heat.
But with radiant heat the heat index can increase relative to those values and be larger than you’d expect from the wet bulb temperature by over 7°C (11°F) for indoor conditions and over 11°C (18°F) for outdoor conditions with direct sunlight.
Iheanacho, I., 2014. Can the USA National Weather Service Heat Index Substitute for Wet Bulb Globe Temperature for Heat Stress Exposure Assessment?.
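For reference, the heat index itself is computed from air temperature and relative humidity using the Rothfusz (1990) regression adopted by the NWS. A minimal sketch of the core formula (the official NWS procedure adds further adjustments near the edges of the regression's valid range):

```python
def heat_index_f(temp_f: float, rel_humidity: float) -> float:
    """NWS heat index via the Rothfusz (1990) regression.

    temp_f: air temperature in degrees Fahrenheit (valid roughly >= 80 F)
    rel_humidity: relative humidity in percent
    """
    t, rh = temp_f, rel_humidity
    return (-42.379 + 2.04901523 * t + 10.14333127 * rh
            - 0.22475541 * t * rh - 6.83783e-3 * t * t
            - 5.481717e-2 * rh * rh + 1.22874e-3 * t * t * rh
            + 8.5282e-4 * t * rh * rh - 1.99e-6 * t * t * rh * rh)

# e.g. 95 F at 60% relative humidity feels like roughly 113 F
print(round(heat_index_f(95, 60)))  # -> 113
```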
Because of this factor, it was once believed that the highest heat index reading actually attainable anywhere on Earth was approximately 71 °C (160 °F)
Says who??
believe that a limited convention is possible.
So does James Kenneth Rogers, Attorney at Osborn Maledon, P.A. (more about him), cited later in this article.
He uses various arguments against an unlimited convention, mainly that it would defeat the purpose of the Convention Clause, because States would be reluctant to call such a convention.
However, he acknowledges that the Philadelphia Convention in 1787 (not called under Article V) went beyond its own remit. He then says that the main protection is that 3/4 of the States have to support any amendments made by such a convention, which would include a majority of at least 22 out of the at least 34 States that called the convention in the first place (at most 12 in total can be against any amendment).
For more details: later annotation in page
The fact that Congress has not called such a convention, and that courts have rejected all attempts to force Congress to call a convention, has been cited as persuasive evidence that Paulsen's view is incorrect
Rogers’ legal opinion about Paulsen’s argument is misparaphrased in this article. He does not use the fact that no convention has been held yet as a reason to suppose that a convention has to be limited; indeed he acknowledges that the Philadelphia Convention in 1787 (although not called under Article V) exceeded its mandate, and cites this as a reasonable concern.
He uses other arguments against an unlimited convention, mainly that it would defeat its purpose, because States would be reluctant to call such a convention out of fear of what other things it might decide. He also says that if they thought this would happen, the States would immediately rescind their applications, so preventing the convention, something Idaho has already done.
He does say that if the convention were unlimited then all existing applications could be aggregated together to call a single convention to discuss them all, but he does not use the fact that this has not happened as an argument that such a convention is impossible. However, he says that the main protection is that 3/4 of the States have to support any amendments made by any such convention. This would include a majority of the original 34 or more States that called for it (at most 12 of them could refuse to ratify).
The paper's arguments run as follows.
It gives the example of the Philadelphia Convention of 1787, which exceeded its mandate of revising the Articles of Confederation, to show that there are well-founded concerns about whether a modern convention with a limited mandate could exceed its original scope.
It says it would be difficult for a government to intervene, as a constitutional convention could conceivably claim independent authority.
However, it goes on to say that any amendments have to be ratified by 3/4 of the States. So if the convention proposes extra amendments, they would only be accepted if ratified by 38 States. This would mean that most of the States that originally requested the convention would also ratify them, thus legitimizing its actions.
(This is the maths: 38 out of 50 have to ratify, so up to 12 could refuse; and a convention requires 2/3 of 50, i.e. 34 States, to be initiated. If the ones that don't ratify are all from the original States, that would leave 22 ratifying out of the original 34, or over 64% of them.)
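In symbols (the same arithmetic as the parenthetical above):

```latex
\underbrace{\left\lceil \tfrac{2}{3} \cdot 50 \right\rceil}_{\text{States to call}} = 34,
\qquad
\underbrace{\left\lceil \tfrac{3}{4} \cdot 50 \right\rceil}_{\text{States to ratify}} = 38,
\qquad
34 - (50 - 38) = 22,
\qquad
\tfrac{22}{34} \approx 64.7\%.
```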
The section concludes that
The second argument—that the States have no power beyond initiating a convention—is partially correct. They do, however, have indirect authority to limit the convention. Congress’s obligation to call a convention upon the application of two thirds of the States is mandatory, so it must call the convention that the States have requested. Thus, Congress may not impose its own will on the convention. As argued above, the purpose of the Convention Clause is to allow the States to circumvent a recalcitrant Congress. The Convention Clause, therefore, must allow the States to limit a convention in order to accomplish this purpose. The prospect of a general convention would raise the specter of drastic change and upheaval in our constitutional system. State legislatures would likely never apply for a convention in the face of such uncertainties about its results, especially in the face of a hostile national legislature. [73] States are far more likely to be motivated to call a convention to address particular issues. If the States were unable to limit the scope of a convention, and therefore never applied for one, the purpose of the Convention Clause would be frustrated.
A related concern is whether States’ applications that are limited to a particular subject should be considered jointly regardless of subject or tallied separately by subject matter to reach the two-thirds threshold necessary for the calling of a convention. [74] This is an important question because if all applications are considered jointly regardless of subject matter, Congress may have the duty to call a convention immediately based on the number of presently outstanding applications from states on single issues. [75]
If the above arguments about the States’ power to limit a convention are valid, however, then applications for a convention for different subjects should be counted separately. This would ensure that the intent of the States’ applications is given proper effect. An application for an amendment addressing a particular issue, therefore, could not be used to call a convention that ends up proposing an amendment about a subject matter the state did not request be addressed. [76]
Footnotes:
73. These fears, however, are mitigated by the States’ own powers over ratification.
74. Paulsen, supra note 3, at 737–43.
75. Id. at 764. Paulsen counts forty-five valid applications as of 1993.
76. If it were established that applications on different topics are considered jointly when determining if the two-thirds threshold has been reached, states would almost certainly rescind their outstanding applications to prevent a general constitutional convention. Some states have already acted based on fears of a general convention. For example, in 1999 the Idaho legislature adopted a resolution rescinding all of its outstanding applications for a constitutional convention. S.C.R. 129, 1999 Leg. (Idaho 1999). Georgia passed a similar resolution in 2004. H.R. 1343, Gen. Assemb. 2004 (Ga. 2004). Both resolutions were motivated by a fear that a convention could exceed its scope and propose sweeping changes to the Constitution.
pdf here
.[29]
There is a formatting error in all these Rogers bookmarks. It should link to this cite: pdf here
Rogers 2007.
There is a formatting error in all these Rogers bookmarks. It should link to this cite: pdf here
Rogers 2007, p. 1007
There is a formatting error in all these Rogers bookmarks. It should link to this cite: pdf here
" (PDF).
pdf here
Rogers, 2007 & 1014–20.
Rogers & 2007 1008.
Rogers & 2007 1009.
Rogers & 2007 1010.
Rogers, 2007 & 1010–20.
Rogers, 2007 & 1014–19.
There is a formatting error in all these Rogers bookmarks. It should link to this cite: pdf here
; some studies have reported that in adult humans about 700 new neurons are added in the hippocampus every day
2019 study finds thousands of young neurons in brain tissue through to the ninth decade of life.
By utilizing highly controlled tissue collection methods and state-of-the-art tissue processing techniques, the researchers found thousands of newly formed neurons in 13 healthy brains from age 43 up to age 87 with a slight age-related decline in neurogenesis (about 30% from youngest to oldest).
Old Brain, New Neurons? Harvard University press release
New neurons in red in brain tissue from a 68-year-old. Original paper: Moreno-Jiménez, E.P., Flor-García, M., Terreros-Roncal, J., Rábano, A., Cafini, F., Pallas-Bazarra, N., Ávila, J. and Llorens-Martín, M., 2019. Adult hippocampal neurogenesis is abundant in neurologically healthy subjects and drops sharply in patients with Alzheimer’s disease. Nature Medicine, 25(4), p.554.
This all sounds like a recipe for a completely unworkable set of complex requirements that once again will favor big companies with deep pockets and big legal departments.
The draft says the opposite.
Even if consumer rules, data protection rules, as well as contract rules, have converged across the EU, in today's regulatory environment, only the big platform companies can grow and survive. A fragmented market with divergent rules is difficult to contest for newcomers, and the absence of clear, uniform, and updated rules in areas such as illegal content online is dissuasive for new innovators. This is a major strategic weakness for the EU in the digital economy and increases reliance on non-EU services for essential services used by all citizens on a daily basis.
The lack of legal clarity also entails a regulatory disincentive for platforms and other intermediaries to act proactively to tackle illegal content, as well as to adequately address harmful content online, especially when combined with the issue of fragmentation of rules addressed above. As a consequence, many digital services avoid taking on more responsibility in tackling illegal content, for fear of becoming liable for content they intermediate. This leads to an environment in which especially small and medium-sized platforms face a regulatory risk which is unhelpful in the fight against online harms in the broad sense. At the same time, when companies do take measures against potentially illegal content, they have limited legal incentives for taking appropriate measures to protect legal content.
Service interoperability. Where equivalent services exist, the framework should take account of the emerging application of existing data portability rules and explore further options for facilitating data transfers and improve service interoperability - where such interoperability makes sense, is technically feasible, and can increase consumer choice without hindering the ability of (in particular, smaller) companies to grow.
There is not even the slightest suggestion of making things easier for big companies at the expense of small. Instead the ambition is to make it easier for smaller companies and innovative startups, including native EU startups, to compete on a level playing field in the EU with the US tech giants.
The EU has no overwhelming motivation to protect US tech giants and has often shown it is willing to stand up against them.
The draft itself is here
That's a classic: affirming that general monitoring is prohibited, while bringing in rules for proactive automated filtering technologies -- aka general monitoring.
It doesn't say that the filtering would be required. It says
specific provisions governing algorithms for automated filtering technologies -- where these are used -- should be considered, to provide the necessary transparency and accountability of automated content moderation
If it is like the copyright directive on this point, what they are saying is that if content is moderated it has to be transparent to the user. For instance, you'd have a right to get the online provider to explain why your content was taken down and to address it. As an example, I have some animated GIFs to help panicked people breathe slowly that I used to add to my posts. For no apparent reason Facebook now blocks them, including one I made myself that I know is absolutely fine, and ones by DestressMonday, who say it is okay to use them. There is currently no way to even contact them and get an answer as to why they block this content. This would give me a way to do that, and they would have to explain what happened or sort it out. Article 13/17 has this right to redress:
- Member States shall provide that online content-sharing service providers put in place an effective and expeditious complaint and redress mechanism that is available to users of their services in the event of disputes over the disabling of access to, or the removal of, works or other subject matter uploaded by them.
That would surely be included amongst the rules governing proactive upload filters, which currently operate without any regulation and without any requirement to explain to anyone what they do.
a binding "Good Samaritan provision" would encourage and incentivise proactive measures, by clarifying the lack of liability as a result of Such measures
The companies already take proactive measures. E.g. Facebook blocks many sites so that you can't even link to them. With the New Zealand terrorism video, Facebook stopped 1.5 million copies of it before they were seen, blocked 1.2 million before upload, and shared 90 digital signatures with other companies so they could do the same. Facebook tweet; see also Update on New Zealand | Facebook Newsroom
The important point in this paragraph is not the proactive moderating, which is already happening, but that they are saying they will clarify the lack of liability for companies that moderate proactively (similar to Section 230 in the US, which we don't have in the EU).
Finally, a binding "Good Samaritan provision" would encourage and incentivise proactive measures, by clarifying the lack of liability as a result of such measures, on the basis of the notions already included in the Illegal Content Communication.
attacks on the fundamental principles underlying the open Internet that began with the Copyright Directive.
The copyright directive is motivated by the wish to stop copyright violations, not to attack the openness and freedom of the internet. It has many provisions to protect digital freedom.
I am talking about its stated objectives. But if it is law then the law is what you see, not any hidden objective. Lawyers and judges don't use hidden objectives to decide legal cases but just go by what the law says.
If things like upload filters and the imposition of intermediary liability become widely implemented as the result of legal requirements in the field of copyright, it would only be a matter of time before they were extended to other domains.
Non sequitur; that doesn't follow. Indeed, if wide implementation of a law leads to problems, it would have the opposite effect.
If some law is implemented and is widely accepted as a good law with sensible provisions, it can be a model for other new laws. If it turns out to be problematic, new laws would face a lot of opposition based on the experience of the previous one. Problematic laws can also be modified or repealed.
was nicknamed the annus confusionis ("year of confusion")
It was actually called the annus confusionis ultimus, or the "last year of confusion". Also, the primary source says it was 443 days.
The primary source here is Macrobius, in his Saturnalia 1, 14, 3, circa 400 AD.
He first describes various Roman theories about when intercalation (the insertion of leap days or months) started, the earliest being an idea from Varro that it was a very ancient law inscribed on a bronze column around 472 BCE.
He then goes on to say (page 165 and page 166):
There was a time when intercalation was entirely neglected out of superstition, while sometimes the influence of the priests, who wanted the year to be longer or shorter to suit the tax farmers,298 saw to it that the number of days in the year was now increased, now decreased, and under the cover of a scrupulous precision the opportunity for confusion increased.299
But Gaius Caesar took all this chronological inconsistency, which he found still ill-sorted and fluid, and reduced it to a regular and well-defined order;300 in this he was assisted by the scribe Marcus Flavius, who presented a table of the individual days to Caesar in a form that allowed both their order to be determined and, once that was determined, their relative position to remain fixed.301
When he was on the point of starting this new arrangement, then, Gaius Caesar let pass all the days still capable of creating confusion: when that was done, the final year of confusion was extended to 443 days.302 Then imitating the Egyptians, who alone know all things concerned with the divine, he undertook to reckon the year according to the sun, which completes its course in 365¼ days.303
The source says
Caesar called 46 BC the ultimus annus confusionis ("The final year of confusion")
Roman wits, however, called it the annus confusionis ("Year of Confusion").
Your ref 2 says
We think of the calendar as a universal measure of time. It's like a perfect grid that can be extended endlessly into the future. There's a website that tells me my birthday in the year 2128 will fall on a Monday.
But in antiquity, calendars were simply ways of organizing religious festivals, the terms of contracts, and other social arrangements. People knew calendars could be shifted and manipulated-even for political reasons. Priests and officials "kept" the time, and different calendars were in use throughout the world. Calendar time simply wasn't as fixed back then. An ancient calendar was more like a schedule, subject to change and revision.
So Caesar's reform was all the more remarkable. As both high priest and dictator of Rome, he had the authority to impose a whole new scheme on the Roman world. Cicero joked that this man now wished to control the very stars, which rose according to his new calendar as if by edict. Caesar's calendar still needed some minor adjustments, but Europe never got another jumbo year like 46 BC. And to this day, we are still marching along on Caesar's time.
The Longest Year in History, by University of Houston scholar Richard Armstrong
Here is an academic secondary source
... our seasons come always at very nearly the same time, as fixed by our calendar, so much so that if there is any variety, we remark on it, and say that spring is late, or autumn early, this year. It needs some little historical knowledge and imagination to remind us of a time when it was not so; when months were lunar, many days were named and not numbered, and the year had so little to do with the seasons that it was quite possible for November or December to arrive before the summer was well over. Yet this was the case in the greatest civilizations of classical antiquity until a comparatively late date. For Rome, the year which we call 46 B.C. is called by Macrobius the last year of the muddled reckoning, annus confusionis ultimus, and it was 445 days long, so much had the nominal dates got behind the real ones; with the next year began the Julian reckoning, albeit with sundry boggles on the part of the Roman officials who did not quite understand it, and long delays before the whole Western world adopted it.
Footnote: Macrobius, Saturnalia 1, 14, 3: no one, except moderns who should know better, ever calls it the annus confusionis simply.
Rose HJ. The Pre-Caesarian Calendar: Facts and Reasonable Guesses. The Classical Journal. 1944 Nov 1;40(2):65-76.
"This is further evidence that temperatures will keep rising until government policies that decrease greenhouse gas emissions are actually implemented," emphasized Green.
We need to do lots more but we are already doing a lot, it's not inaction.
Heat waves are increasing in duration and frequency, while smashing records.
Heat waves are expected to increase in duration and frequency, but we can cope with them too. They don't kill you unless you neglect the right precautions: e.g. running in hot weather without drinking enough water, not looking after infants properly to protect them from heat, or old people not taking care of themselves or not being looked after in hot weather.
Since 1961, Earth's glaciers lost 9 trillion tons of ice. That's the weight of 27 billion 747s.
Yes we are losing glacier ice. India especially needs to build in climate resilience for this.
Greenland — home to the second largest ice sheet on Earth — is melting at unprecedented rates.
Greenland ice is melting more this year, but last year it actually increased because of an unusual amount of snowfall in the winter of 2017-18. You need to be wary of looking at particular months or years and generalizing from them. Yes, the world is warming, but it is as the scientists predicted; it's not something unexpected. There are plus sides too: the world has become greener as a result of CO2 fertilization. As they say there, this is an El Niño year, so it's not surprising that we get record temperatures this year, though we can also expect to get records anyway.
Warming climes have doubled the amount of land burned by wildfires in the U.S. over the last 30 years, as plants and trees, notably in California, get baked dry.
The fires in California are part of the natural order; there have been wildfires there since long before humans got to America, and part of the problem is that forests have been left to grow on and on without the regular controlled burns the American Indians used to do. It needs careful forest management to reduce the risk. A warming world has more risk from wildfires, but the fires themselves can be prevented and controlled in the same way they are anyway in dry weather.
Its mission is to use biologically-detailed digital reconstructions and simulations of the mammalian brain to identify the fundamental principles of brain structure and function
Most neuroscientists think this is impossible with current knowledge.
This annotation paraphrases parts of the article in BBC Futures, Will we ever ... simulate the human brain?, which is cited as a summary of the issues on page 9 of the 2015 mediation report on the Blue Brain project
A billion dollar project claims it will recreate the most complex organ in the human body in just 10 years. But detractors say it is impossible. Who is right?
Is it even possible to build a computer simulation of the most powerful computer in the world – the 1.4-kg (3 lb) cluster of 86 billion neurons that sits inside our skulls?
The very idea has many neuroscientists in an uproar, and the HBP’s substantial budget, awarded at a tumultuous time for research funding, is not helping
The problem is that though neuroscientists have built neural nets since the 1950s, the vast majority treat each neuron as a single abstract point.
Markram wants to treat each neuron as a complex entity, together with the active genes that switch on and off inside it, the 3,000 synapses that let each neuron connect with its neighbours, the ion channels (molecular gates) that let it build up a voltage by moving charged particles in and out across membrane borders, and the electrical activity.
Critics say that even building a single neuron model in this way is fiendishly difficult. Then we have even less knowledge about how these cells connect.
Markram's idea was to do a complete inventory of which genes are switched on in which cells in which parts of the brain, the "single-cell transcriptome", and then, based on that, he thinks he can recreate the electrical behaviour of each cell and how the neurons' branches grow from scratch.
Eugene Izhikevich from the Brain Corporation thinks we should be able to build a network with the connectivity and anatomy of a real brain, but that it would just be a fantastically detailed simulation of a dead brain in a vat - that it would not be possible to simulate an active brain.
Markram himself says that his aim is not to build a brain that could act like us.
“People think I want to build this magical model that will eventually speak or do something interesting,” says Markram. “I know I’m partially to blame for it – in a TED lecture, you have to speak in a very general way. But what it will do is secondary. We’re not trying to make a machine behave like a human. We’re trying to organise the data.”
Chris Eliasmith from University of Waterloo, Canada, told BBC Futures:
“The project is impressive but might leave people baffled that someone would spend a lot of time and effort building something that doesn’t do anything,”
He is involved in the IBM brain simulation called SyNAPSE which also doesn't do very much. He says
“Markram would complain that those neurons aren’t realistic enough, but throwing a ton of neurons together and approximately wiring them according to biology isn’t going to bridge this gap,”
Grandparents
This leaves out his maternal grandparents, Malcolm (1866–1954) and Mary MacLeod (née Smith; 1867–1963)
Ruttan Walker told me she often uses grief as a way to process her emotions about climate. "We have to acknowledge that we've changed our planet. We've made it more dangerous and we've done harm," she said.
We are also doing many good things. E.g.
24 ways the world is getting better - good news journalists rarely share
even Wallace-Wells
Wallace-Wells is just a journalist, and gets the science wrong in a big way. It's not "even him": he way over-exaggerates, based on sources that already exaggerate, and on junk science.
For many reactions by scientists to his article that later became his book, see
Scientists explain what New York Magazine article on "The Uninhabitable Earth" gets wrong
and see my:
Deep Adaptation: A Map for Navigating Climate Tragedy
This non peer reviewed 'paper' relies on the worst kind of non peer reviewed junk science.
The author makes a big thing about it failing peer review, as if that gave it additional credentials. But it was failed because it didn't meet the academic standards set by the journal, not because of the views presented. It is by a sociologist, and it uses another non peer reviewed article as its main source on the science, which is a big no-no in publishing: relying on another article rejected from peer review. That article in turn uses the now largely disproved clathrate gun hypothesis.
It wasn't published because it fails even the most basic criteria for a peer reviewed academic paper. Even without looking at the ideas, relying on non peer reviewed material for some of its main points is already a major no-no, especially when that material is used to support claims outside the author's own expertise.
Among them are the notorious melting sea ice exaggerator Peter Wadhams and the ecologist and the-end-is-nigh-fabulist Guy McPherson.
Guy McPherson's work is junk science. E.g. he relies on Sam Carana's absurd use of a quadratic to extrapolate!
To see how bizarre that is, bear in mind that a quadratic always goes to infinity in both directions, future and past; the sketch after the link below illustrates the pitfall.
[Absurd Blog Post At Arctic News Predicts 10 °C Rise In Global Temperatures By 2026 - QUADRATIC To EXTRAPOLATE!](https://www.science20.com/print/237501)
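To make the point concrete, here is a minimal sketch with made-up illustrative numbers (not the Arctic News data): fit a quadratic to a few decades of gently rising values and it will confidently "predict" runaway heat in the future, and, equally confidently, a hot distant past.

```python
import numpy as np

# Made-up illustrative anomalies (NOT real data): a mild upward trend
# with a tiny curvature term, standing in for 40 years of readings.
t = np.arange(40)                    # years since 1980
anomaly = 0.01 * t + 0.0001 * t**2   # reaches ~+0.54 C at t = 39 (2019)

quad = np.poly1d(np.polyfit(t, anomaly, deg=2))  # quadratic fit

# A quadratic diverges to +infinity in BOTH directions, so extrapolating
# it is meaningless:
print(round(quad(2019 - 1980), 2))  # 0.54 -- matches the fitted data
print(round(quad(2100 - 1980), 2))  # 2.64 -- confident-looking runaway future
print(round(quad(1800 - 1980), 2))  # 1.44 -- "1800 was warmer than today"!
```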
"I think it's clear that emissions will come down to zero and stabilize the climate sometime this century. But taking 50 years to do that will yield a different world than if we do it in 20 years. It's up to us to decide which of these worlds we want to live in."
According to the IPCC, taking 50 years to do it, with zero emissions by 2070, means we stay within 2°C. Taking 30 years, with zero by 2050, keeps us within 1.5°C. Both those targets have a wide range, and the global temperature changes a lot from year to year, but the main thing is that we can stay within around 1.5°C. The IPCC also did an example worst case, which was dismantling the Paris agreement and then acting far too late, in the 2030s, in a non-coordinated way. We are not on that path. That path gives 3°C by 2100 with little margin to improve on it.
It is a future to avoid, but it is not human extinction or collapse of civilization. We can still feed everyone by 2100, but it is a less biodiverse world to leave to the next generation, with many problems, reversing some of this century's progress towards sustainability goals, reducing food security and increasing global poverty, flooding, storms, drought etc.
The IPCC’s own worst case climate change example - a 3°C rise by 2100
Models that use the status quo—a.k.a. "not changing anything" as a baseline—show that we're headed off a cliff in terms of planetary habitability.
The linked-to paper misunderstands many things. China is rapidly industrializing and at the same time rapidly increasing its renewables percentage. But until it has a very high percentage of renewables it can't stop its increase in CO2 emissions. The peak is going to happen; they pledged to peak emissions before 2030 and it's generally accepted they will. Until then, about half of the recent increase is due to China. They have the largest renewables industry in the world, and it developed from almost nothing in a few years. They have to increase their pledges, but they first needed the technology in place before this was possible.
This does not mean it is impossible to stay within 1.5°C. The curves for all the scenarios are almost identical until 2030. We are on the 3°C path now by the unconditional pledges and policies, and are on track to improve on them.
in May, an Australian think tank called climate change "a near- to mid-term existential threat to human civilization."
This had no scientific credibility: it was written by a couple of businessmen, with no scientific peer review.
Michael Mann, respected climate scientist at Pennsylvania State University, calls their report
"overblown rhetoric, exaggeration, and unsupportable doomist framing":
Richard Betts, Professor, Met Office Hadley Centre & University of Exeter, put it like this:
The “report” is not a peer-reviewed scientific paper. It’s from some sort of “think tank” who can basically write what they like. The report itself misunderstands / misrepresents science, and does not provide traceable links to the science it is based on so it cannot easily be checked (although someone familiar with the literature can work it out, and hence see where the report’s conclusions are ramped-up from the original research).
One of their central points misread an analysis of heat waves. The analysis identified conditions in which vulnerable people, such as the old and infants, face around a 1 in 10,000 risk of dying if they do not take precautions. The report took this to mean that
"Thirty-five percent of the global land area, and 55 percent of the global population, are subject to more than 20 days a year of lethal heat conditions, beyond the threshold of human survivability,"
The paper was calibrated so that 30% of us faced "deadly heat" conditions in 2000, and we obviously did not see 30% of the world population die in 2000. They totally misunderstood the paper.
Another example: they read a figure of 1 billion people at risk from a sea level rise of 20 meters, way beyond anything possible this century or for centuries into the future, and said this was the number of people who would be climate migrants by 2050. With sea level rise you can also stay where you are if you build sea walls, and even the highest level of rise this century, with "business as usual" and a worst case scenario, is 2.5 meters.
There are many mistakes like this. They weren't scientists, were clearly not used to reading science papers, and can't have run the report past anyone who understands climate science to check it.
did the one from May about how 1 million species are on track to go extinct due to human-caused environmental degradation, assuming we don't change our course and stop generating greenhouse gases (alongside other forms of environmental havoc).
First, the million species was just an extrapolation of the IUCN Red List to insects and to minute and microscopic sea creatures. It didn't mean that more species are at risk than the IUCN Red List already says.
They analysed the reasons we continue to lose species to extinction, and found a solution. Not only that, they found a solution that is practical, feasible, makes economic sense, and has preliminary government interest too, to the extent that over 100 governments were happy to sign off on their conclusions as a “summary for policy makers”.
This was the central point they wanted to present to the world. They did the most rigorous analysis ever done of such a situation involving experts in social sciences and economics as well as the natural sciences. From their analysis it has become clear, not only why we are losing so many species, not only how to fix it, but that we all benefit too, individuals, businesses, governments.
If we want it, this is a future world we can choose for ourselves by the actions we do right now. What’s more, we have time to act too. My short summary of their central message is
“Make Biodiversity Great Again, We Know How to do it”.
Let's Save A Million Species, And Make Biodiversity Great Again, UN Report Shows How
Last year's UN report on humanity's probable failure to stop warming short of the 1.5 degree Celsius threshold had a similar message
That is what journalists said, not the report and not the co-chairs. For example, this is what Jim Skea said:
“The key message is that we can keep global warming below 1.5 degrees °C. It is possible within the laws of physics and chemistry. But it will require huge transitions in all sorts of systems, energy, land, transportation, but what the report has done is to send out a clear message to the governments that it is physically possible, it is now up to them to decide whether they want to take up the challenge.”
They said we can do it, that we can stay within 1.5 C, and they outlined four different ways to do it. The easiest is to reduce emissions to zero by 2050, and this is the best for GDP growth as well. It's not more expensive; it's less expensive to do this than inaction.
As David Wallace-Wells noted in his 2019 bestseller The Uninhabitable Earth
No, we are not headed for an uninhabitable world, or collapse of civilization. Many of these stories are based on this sensationalist book, which exaggerates climate change scenarios. Sometimes the author is referred to as a "climate scientist", but he is not. He is a general interest journalist who says himself that he has only been interested in climate change for one to two years.
He wrote a journalistic article which he then expanded into a book on the topic, and it has been criticized by climate scientists as exaggerated and full of mistakes. He knows how to write engaging and imaginative prose, but he doesn't know how to evaluate climate change research.
Jetsun Milarepa (Tibetan: རྗེ་བཙུན་མི་ལ་རས་པ, Wylie: rje btsun mi la ras pa, 1028/40–1111/23)[1] was a Tibetan siddha, who famously was a murderer as a young man then turned to Buddhism to become an accomplished buddha despite his past. He is generally considered as one of Tibet's most famous yogis and poets, serving as an example for the Buddhist life. He was a student of Marpa Lotsawa, and a major figure in the history of the Kagyu school of Tibetan Buddhism.[1]
This is a hagiography completed in 1488, three and a half centuries after his death. It was written by an inspirational poet and nyönpa or "religious madman", Gtsang-smyon He-ru-ka. It is a classic of Tibetan literature, but it is not a biography. The article only presents this later account.
The earliest known account of his life is strikingly different. It is attributed to Milarepa's principal disciple, Gampopa, though it is probably lecture notes taken down by one of his students.
In this earliest account, he is not a murderer. There is no mention of him killing anyone with black magic, or of his trial constructing towers under Marpa. It's his mother who dies when he is young, not his father.
Andrew Quintman, whose thesis and subsequent book were about Milarepa's life, hasn't attempted to deduce his "real life", though he does say there is good evidence that Milarepa existed as a historical figure.
For an expanded version of this article with cites, and discussion of the earlier accounts, see Milarepa
10,000 are passing inside the orbit of Neptune on any given day.
A better cite, as the Sky at Night episode is no longer available to watch:
The hunt is now on for more 'Oumuamua-like objects. Extrapolating from this one discovery, there ought to be some 10,000 of them passing through our Solar System inside the orbit of Neptune.
'Oumuamua: 'space cigar's' tumble hints at violent past - BBC News
Breaching a 'carbon threshold' could lead to mass extinction
If it is true, then it plays out over a timescale of around 10,000 years, with a slightly reduced amount of CO2 being sequestered in the oceans due to acidification. The mass extinctions he studies have many different hypotheses about why they happened.
It is a simple model where he just looks at inputs to the ocean, changing the rate at which carbon is added and removed. These "toy models" often leave out important things that would change the picture. He tries to get a kind of uniform explanation of many past extinctions. So at present it is leading edge science, and speculative.
He published a previous paper in 2017. It has only had 21 cites on Google Scholar, most not that related to the topic.
This is his 2019 press release
http://news.mit.edu/2019/carbon-threshold-mass-extinction-0708
This is his 2017 press release http://news.mit.edu/2017/mathematics-predicts-sixth-mass-extinction-0920
and paper
https://advances.sciencemag.org/content/advances/3/9/e1700906.full.pdf
It says at the end:
> Any spike would reach its maximum after about 10,000 years. Hopefully that would give us time to find a solution."
There would be lots we could do with 10,000 years, including perhaps technological ways to directly remove CO2 from the water. There isn't that much on this yet, but there are a few projects - enough that one can suppose our future civilization may well be able to do it, and it is possible we are on the point of doing it already.
This is direct extraction of CO2 from seawater
This is a similar idea, using hydrogen from seawater + CO2 to make methanol - the CO2 is 125 times more concentrated in the sea, which may make it easier to extract than from the atmosphere.
https://arstechnica.com/science/2019/06/creative-thinking-researchers-propose-solar-methanol-island-using-ocean-co%E2%82%82/ https://www.pnas.org/content/116/25/12212
You can also use seawater plus scrap iron and aluminium to make carbonates from CO2 from some other source (a power station?) that counteract the acidity of the oceans. This traps the CO2 as a mineral called dawsonite, a natural component of Earth's crust.
https://www.youtube.com/watch?v=GsCZm5QPxO8
https://www.sciencedaily.com/releases/2018/06/180625192825.htm
Given the current annual production rate for aluminium and its recycling rate, this technology could be used to mineralise 20–45 million tonnes of carbon dioxide per annum, which would make it the third largest carbon-dioxide-utilising chemical process. Analysis of the energetics of the electrochemical mineralisation shows it is 33% more energy efficient to use waste aluminium this way rather than to recycle it. A similar analysis for using non-recycled scrap steel suggests this could capture 822 million tonnes of carbon dioxide, enough to negate the effect of refineries worldwide. Even if the required electrical energy came from a coal-fuelled power station, the overall process using either aluminium or steel is carbon negative, with more carbon dioxide being mineralised than would be released by the power station
Paper here: https://onlinelibrary.wiley.com/doi/pdf/10.1002/cssc.201702087
The report estimated 86,000 casualties, including 3,500 fatalities, 715,000 damaged buildings, and 7.2 million people displaced, with two million of those seeking shelter, primarily due to the lack of utility services. Direct economic losses, according to the report, would be at least $300 billion
This is not modeling a single event. The cite itself explains that it is for all three segments of the fault hypothetically rupturing as a single fault of magnitude 7.7. In actuality it would be three separate earthquakes.
The combined rupture of all three segments simultaneously is designed to approximate the sequential rupture of all three segments over time. The magnitude of Mw7.7 is retained for the combined rupture.
It also explains that these are mainly minor injuries.
"Nearly 86,000 total casualties are expected for the 2:00AM event. A large portion of these casualties are minor injuries, approximately 63,300, though 3,500 fatalities are also expected."

It goes on to explain that these are immediate deaths from building and bridge damage only:

"Those estimates include casualties resulting from structural building and bridge damage only. Therefore, the estimates do not include injuries and fatalities related to transportation accidents, fires, or hazmat exposure. This section deals only with injuries. Fatalities are addressed under mortuary services. The injuries and casualties estimated by the model are only for those that occur at the time of the event. The model does not provide for increases in these numbers that occur post event. For example, those that sustain injuries may die later, or injuries incurred as a result of response activities may result in fatalities."

Under mortuary services it has a table which breaks down the 3,500 fatalities by state.
That’s for the eight states of Missouri, Arkansas, Illinois, Indiana, Kentucky, Tennessee, Alabama and Mississippi. Total population 43 million, according to the 2000 data they were using.
Most are in Tennessee, which had a population of 5.69 million and gets 1,319 of the fatalities in this scenario.
By comparison, the US yearly death rate is 8.1 per thousand, so for Tennessee that is about 46,000 deaths a year.
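As a quick sanity check on that comparison:

\[ 5.69 \times 10^{6}\ \text{people} \times \frac{8.1\ \text{deaths}}{1000\ \text{people per year}} \approx 46{,}000\ \text{deaths per year} \]

so the scenario's 1,319 immediate fatalities in Tennessee are around 3% of one normal year's deaths there.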
In the conclusion it says
“Some impacts may be mitigated by retrofitting infrastructure in the most vulnerable areas. By addressing infrastructure vulnerability prior to such a catastrophic event, the consequences described in this report may be reduced substantially. The resource gaps and infrastructure damage described in this analysis present significant unresolved strategic and tactical challenges to response and recovery planners. It is highly unlikely that the resource gaps identified can be closed without developing new strategies and tactics and expanded collaborative relationships.”
A 2015 study suggested that the AMOC has weakened by 15-20% in 200 years
This doesn't seem to have been updated since 2015. The IPCC report in 2018 (chapter 3) says
It is more likely than not that the Atlantic Meridional Overturning Circulation (AMOC) has been weakening in recent decades, given the detection of the cooling of surface waters in the North Atlantic and evidence that the Gulf Stream has slowed since the late 1950s. There is only limited evidence linking the current anomalously weak state of AMOC to anthropogenic warming. It is very likely that the AMOC will weaken over the 21st century. The best estimates and ranges for the reduction based on CMIP5 simulations are 11% (1–24%) in RCP2.6 and 34% (12–54%) in RCP8.5 (AR5). There is no evidence indicating significantly different amplitudes of AMOC weakening for 1.5°C versus 2°C of global warming.
Hoegh-Guldberg, O., Jacob, D., Taylor, M., Bindi, M., Brown, S., Camilloni, I., Diedhiou, A., Djalante, R., Ebi, K., Engelbrecht, F. and Guiot, K., 2018. Impacts of 1.5 ºC global warming on natural and human systems. Chapter 3, section 3.3.8
Apart from a few asteroids whose densities have been investigated,[6] one has to resort to enlightened guesswork.
The field has moved on a lot since then. This gives an idea of the range of density values for NEOs:
So, for a 200 meter asteroid the density is between 1 and 3 g/cm³, most likely around 1.75 or so, but with a small chance of an iron meteorite at 6 to 7 g/cm³. For a 20 meter asteroid it's a similar range, but with two very likely densities of around 2.2 and around 2.8 (just going by eye from that diagram). The 100 meter size range is similar.
It is based on this analysis of density by structural type and composition.
Although numerous studies point to resistance to some of Mars conditions, they do so separately, and none has considered the full range of Martian surface conditions, including temperature, pressure, atmospheric composition, radiation, humidity, oxidizing regolith, and others, all at the same time and in combination.[230] Laboratory simulations show that whenever multiple lethal factors are combined, the survival rates plummet quickly.[21]
The researchers are of the view that their work strongly supports the possibility that terrestrial microbes most likely can adapt physiologically to live on Mars
"This work strongly supports the interconnected notions (i) that terrestrial life most likely can adapt physiologically to live on Mars (hence justifying stringent measures to prevent human activities from contaminating / infecting Mars with terrestrial organisms); (ii) that in searching for extant life on Mars we should focus on "protected putative habitats"; and (ii) that early-originating (Noachian period) indigenous Martian life might still survive in such micro-niches despite Mars' cooling and drying during the last 4 billion years"
de Vera, Jean-Pierre; Schulze-Makuch, Dirk; Khan, Afshin; Lorek, Andreas; Koncz, Alexander; Möhlmann, Diedrich; Spohn, Tilman (2014). "Adaptation of an Antarctic lichen to Martian niche conditions can occur within 34 days". Planetary and Space Science. 98: 182–190. Bibcode:2014P&SS...98..182D. doi:10.1016/j.pss.2013.07.014. ISSN 0032-0633.
Currently, the surface of Mars is bathed with radiation, and when reacting with the perchlorates on the surface, it may be more toxic to microorganisms than thought earlier.[11][12] Therefore, the consensus is that if life exists —or existed— on Mars, it could be found or is best preserved in the subsurface, away from present-day harsh surface processes.
This is the old view from around 2007. Nowadays the surface is also thought to be of interest for the search for present day life on Mars.
Cites here are from "A new analysis of Mars “special regions”: findings of the second MEPAG Special Regions Science Analysis Group (SR-SAG2)." 2014
(see section 2.1, page 891)
Finding 2-1: Modern martian environments may contain molecular fuels and oxidants that are known to support metabolism and cell division of chemolithoautotrophic microbes on Earth
(See section 3.6, "Ionizing radiation at the surface", page 891 of [1].)
Finding 3-8: From MSL RAD measurements, ionizing radiation from GCRs at Mars is so low as to be negligible. Intermittent SPEs can increase the atmospheric ionization down to ground level and increase the total dose, but these events are sporadic and last at most a few (2–5) days. These facts are not used to distinguish Special Regions on Mars.
Over a 500-year time frame, the martian surface could be estimated to receive a cumulative ionizing radiation dose of less than 50 Gy, much lower than the LD90 (lethal dose where 90% of subjects would die) for even a radiation-sensitive bacterium such as E. coli (LD90 of ~200–400 Gy)
(See section 3.7, "Polyextremophiles: combined effects of environmental stressors", of [1].)
Finding 3-9: The effects on microbial physiology of more than one simultaneous environmental challenge are poorly understood. Communities of organisms may be able to tolerate simultaneous multiple challenges more easily than individual challenges presented separately. What little is known about multiple resistance does not affect our current limits of microbial cell division or metabolism in response to extreme single parameters.
All citing:
Rummel, J.D., Beaty, D.W., Jones, M.A., Bakermans, C., Barlow, N.G., Boston, P.J., Chevrier, V.F., Clark, B.C., de Vera, J.P.P., Gough, R.V. and Hallsworth, J.E., 2014. A new analysis of Mars “special regions”: findings of the second MEPAG Special Regions Science Analysis Group (SR-SAG2).
The search for evidence of habitability, taphonomy (related to fossils), and organic compounds on Mars is now a primary NASA and ESA objective.
Since the article is about Life on Mars it should surely mention the first of NASA’S four science goals:
Goal I: determine if Mars ever supported life
- Objective A: determine if environments having high potential for prior habitability and preservation of biosignatures contain evidence of past life.
- Objective B: determine if environments with high potential for current habitability and expression of biosignatures contain evidence of extant life."
From: Hamilton, V.E., Rafkin, S., Withers, P., Ruff, S., Yingst, R.A., Whitley, R., Center, J.S., Beaty, D.W., Diniega, S., Hays, L. and Zurek, R., Mars Science Goals, Objectives, Investigations, and Priorities: 2015 Version.
There is an almost universal consensus among scholars that the Exodus story is best understood as myth
This is an oversimplification: it's possible that some of the population of Israelites did come from Egypt, possibly many thousands of them, and that the story has elements from the experiences of those who did.
"While there is a consensus among scholars that the Exodus did not take place in the manner described in the Bible, surprisingly most scholars agree that the narrative has a historical core, andthat some of the highland settlers came, one wayor another, from Egypt "
In this, I am not referring to the various traditions of Israel’s interaction with Egypt resulting from the era of Egyptian control inCanaan or from some relations with the Hyksos,which found their way into the Bible, but to the possibility that there was a group which fled Egypt, and brought this story of Exodus with it. Though the size of this group is debated, most of the above scholars agree that it was in the range of a few thousands, or even hundreds (some give it more weight, e.g., Hoffmeier 1997). Still, despite the limited size of this group, it appears that during the process of Israel’s ethnogenesis its story became part of the common history of all the Israelites. Most of those who accept some historical core for the story of the Exodus from Egypt, date it to the thirteenth century, at the time of Ramses II, while others dateit to the twelfth century, during the time of Ramses III.
Archaeology does not really contribute to the debate over the historicity or even historical background of the Exo-dus itself, but if there was indeed such a group, it contributed the Exodus story to that of all Israel. While I agree that it is most likely that there was such a group, I must stress that this is based on an overall understanding of the developmentof collective memory and of the authorship of the texts (and their editorial process). Archaeology, unfortunately, cannot directly contribute(yet?) to the study of this specific group of Israel’s ancestors.
So was this Exodus group also Merneptah’s Israel, or at least part of it? Clearly, if there was an Exodus in the thirteenthcentury this group of people could have been part of Merneptah’s Israel. However, despite the assumed significance of this group (the Exodus as a "national" epic, more below), it is likely that this group was incorporated at a later stage, only after Merneptah’s time, or at least that it was distinct from Merneptah’s Israel. After all,although this group clearly brought with it some of what became the history of Israel, it wasn’tMerneptah’s Israel, or any "Israel" for that matter. While many scholars agree that the Exodus group brought with it YHWH as a new deity, the name Israel has the component "El," rather than "Ya" or "Yahu." Thus, Israel could [have] preceded the arrival of the Exodus group, and it is likely that the latter was not Israel’s "core"group.
See also the Wikipedia article
There is no indication that the Israelites ever lived in Ancient Egypt, and the almost universal consensus among scholars is that the Exodus story is best understood as myth.
This is the same oversimplification: again, it's possible that some of them did come from Egypt, and that the story has elements from the experiences of those who did. See the quotes and discussion above.
In 2012 the Danish government adopted a plan to increase the share of electricity production from wind to 50% by 2020,[6] and to 84% by 2035
In 2019 they committed to aim for a 70% reduction in CO2 emissions by 2030, and net zero emissions by 2050. New Danish government puts climate change centre stage
Carbon mineralization
The COSIA Carbon XPRIZE Challenge is a competition to convert CO2 into the products with the highest net value, from either a coal or gas power plant. In April 2018, ten finalists were given $5 million to demonstrate their technologies at large scale in the real world. The winners get a $7.5 million grand prize, announced in March 2020.
Five of the ten are focused on carbon mineralization technology. One of them is a team from Aberdeen that hopes to use CO2 capture to make the entire concrete industry carbon negative.
The Carbon Capture Machine precipitates the CO2 into calcium and magnesium carbonates (much like stalactites in caves) as a carbon negative replacement for ground calcium carbonate (GCC), which is needed for concrete. If this works on a commercial scale it can decarbonize the concrete industry, responsible for 6% of the world's annual CO2 emissions. If they can make it commercially viable, GCC has a market value of $20 billion. [Carbon Upcycling](http://www.co2upcycling.com/) makes new CO2ncrete from CO2 and chemicals, competing directly with the $400 billion concrete industry, in places like California with a carbon tax and a mandate for low carbon building materials.
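Schematically, carbonate precipitation of this kind is the familiar reaction (a generic example of CO2 mineralization, not necessarily the team's exact process chemistry):

\[ \mathrm{CO_2 + Ca(OH)_2 \rightarrow CaCO_3\!\downarrow + H_2O} \]

with the analogous reaction producing magnesium carbonate.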
[CarbonCure Technologies](https://www.carboncure.com/) injects CO2 into wet concrete while it is being mixed. They are already in commercial use, with 100 installations across the US, retrofitting concrete plants for free and then charging a licensing fee. It may take up to 20 years to be used at scale for reinforced concrete, because that is the durability testing period needed.
For more on this see Between a Rock and Hard Place: Commercializing CO2 Through Mineralization
Asteroid impact wiped out the dinosaurs
This image shows far, far too large an impact. Its title is "Planetoid crashing into primordial Earth", by Don Davis.
For the asteroid that killed the dinosaurs, a far better image is this one:
Asteroid that created Chicxulub crater
(not sure of artist)
Or this one, by the same artist, Don Davis
titled Chicxulub impact site
Where the Tibetan highlanders live, the oxygen level is only about 60% of that at sea level
Not cited, could cite:
"At 4000 meters, every lungful of air only has 60% of the oxygen molecules that people at sea level have," said co-author Cynthia Beall of Case Western Reserve University.
Ethiopians and Tibetans thrive in thin air using similar physiology, but different genes
Its footprints are distinctive
The extremely short limbs make it impossible for this frog to hop, although it can walk (Boulenger 1907). ...
This frog has extensive webbing on its feet, in contrast to other members of the genus Breviceps. Carruthers and Passmore (1978) conjecture that the foot webbing enables traction on loose sand, as the frog moves about on the surface of its sand dune habitat at night (based on its distinctive tracks).
https://amphibiaweb.org/cgi/amphib_query?where-genus=Breviceps&where-species=macrops
The small area of sand dunes often gets a lot of fog, which supplies moisture in an otherwise arid and dry region.
Uncited sentence. Useful cite:
Voucher specimens held in museum collections were examined, and demonstrate the northernmost locality in Lüderitz, Namibia, with all 11 localities in white sandy habitat where coastal fog exceeds 100 days per year. The most southerly record from active searches was just south of Kleinzee in South Africa. A new threat to this species is housing development in prime coastal sand dunes.
Channing, A. and Wahlberg, K., 2011. Distribution and conservation status of the desert rain frog Breviceps macrops. African journal of herpetology, 60(2), pp.101-112.
Contemporary analysis of historical data from the last 11 millennia[12] matches with the indigenous Saptarishi Calendar.[13] The length of the transitional periods between each Yuga is unclear, and can only be estimated based on historical data of past cataclysmic events. Using a 300 year (10% of the length of a particular yuga) period for transitions, Kali Yuga has either ended recently in the past 100 to 200 years, or is to end soon sometime in the next 100 years.
Most Hindus would say that the Kali Yuga ends thousands of years into our future. An earlier version of this page gave the conventional view:
The Kali Yuga began approximately five thousand years ago, and it has a duration of 432,000 years, leaving us with 427,000 years until the end of the present age.
https://en.wikipedia.org/w/index.php?title=Kali_Yuga&oldid=889598580#10,000_year_%22Golden_Age%22
This edit is based on an article on the Graham Hancock website, which would not normally be regarded as a reliable source on Wikipedia.
This is about him: https://en.wikipedia.org/wiki/Graham_Hancock
This is the article they give as a source. https://grahamhancock.com/dmisrab6/
may indeed pose an existential risk for Earth and its inhabitants.
It is NOT an existential risk. An existential risk means a risk of human extinction, or of severely impacting the habitability of Earth for future generations. This is talking about objects possibly up to 100 meters in diameter. On land an object that size can create a small crater, up to a kilometer or so in diameter. The heat and blast wave would mean you have to evacuate a city, and perhaps much of the region around it for tens of kilometers.
If it lands in the ocean it's harmless. It is not big enough even to cause a normal-height tsunami unless it hits very close to shore, and on land it only harms you if it hits very close to an urban area. At this size there are no global effects, and only 1% of Earth's surface is urban.
But the hypothesis here is for a comet break-up, not an iron meteorite. From this cite:
Although our data show that large Taurids have porous and fragile structure, objects of tens or hundreds of meters in size pose a hazard to the ground even if they have low intrinsic strength.
Simulations show that for an asteroid that breaks up in an airburst like the Tunguska one, there are barely any waves at all and no tsunami.
For details, see this comment I made on my article Did you know, NASA have NEVER issued any ASTEROID ALERT - most likely future warning: Tiny Asteroid to Splosh Harmlessly in Pacific Ocean, which is based on "Near and far-field hazards of asteroid impacts in oceans", March 2019, in Acta Astronautica.
But the Taurid swarm, a dense cluster within the Taurid meteoroid stream, and through which the Earth periodically passes, changes the odds significantly and gives a possible reason for the unlikely occurrence that a once per 1000-year event occurred just over a century ago. If the hypothesized might of the Taurid swarm is successfully proven, this also heightens the possibility of a cluster of large impacts over a short period of time.
More about this hypothesis: it is due to Napier and Clube, and is based on the idea that the Taurids are the result of a large comet, 100 km in diameter, that broke up ten to twenty thousand years ago. It is not at all proven, and this will be a test of it.
The paper says
The June-August 2019 encounter of the TSC provides a unique opportunity to identify additional NEOs of the swarm, helping to either substantiate or refute the giant comet hypothesis of Clube & Napier (1984) and the Taurid Complex hypothesis of Asher (1992). Dedicated surveys will at the very least be able to place limits on the NEO density near the swarm centre.
This hypothesis proposes that a giant comet (of order one hundred kilometres – comparable to large KBOs) fragmented in the inner solar system of order 10-20 ka ago, producing a complex of dust and small bodies (including 2P/Encke and associated asteroids) still present today.
It is a controversial hypothesis, dating back to Clube and Napier's books
Clube and his colleagues argue that the Taurids’ range of orbits indicates they were all shed by a huge comet, originally 100 miles across or more, that entered the inner solar system some 20,000 years ago. The comet’s orbit took it inside that of Mercury, close to the sun. By 10,000 years ago it was desiccated and brittle, and since then big chunks have been breaking off each time it passes the sun. One of those chunks, Clube thinks, is a comet called Encke. But the core object itself may still be out there. We suspect that the source of the Taurids is in an orbit similar to Encke’s, going round the sun every 3.39 years, says Clube. We think we’re on the verge of finding it.
... It’s quite possible some of the June events fit in with a single object, but I think Victor may have turned it into a bit of a conspiracy theory, says Brian Marsden of the Harvard-Smithsonian Center for Astrophysics.
Astrophysicists Say The Taurids Meteor Shower Could Send Dangerous Rocks to Earth - Discovery June 9, 1992
They never found that big source object. But the idea still continues.
Although most astronomers are not convinced by Clube and Napier's giant comet hypothesis, there is a lot of interest in searching this Taurid stream for asteroids. In a previous paper, the authors searched the database for asteroids that could match the properties of their new sub-branch of the southern branch of the Taurids. They found two of them: 2015 TX24 (1.40±0.51 km) and 2005 UR, which is also hundreds of meters in diameter.
Spurný, P., Borovička, J., Mucke, H. and Svoreň, J., 2017. Discovery of a new branch of the Taurid meteoroid stream as a real source of potentially hazardous bodies. Astronomy & Astrophysics, 605, p.A68.
According to Western Meteor Physics Group data analysis, the Earth will approach within 30,000,000 km of the center of the Taurid swarm this summer, the closest such encounter since 1975. The calculations also show that this will be the best viewing time of the Taurid swarm until the early 2030s.
The swarm passes well below Earth's orbit. The red objects in this illustration show the larger objects according to their hypothesis, from their figure 1.
study, published by arXiv
The study is here
Clark, D.L., Wiegert, P. and Brown, P.G., 2019. The 2019 Taurid resonant swarm: prospects for ground detection of small NEOs. Monthly Notices of the Royal Astronomical Society: Letters, 487(1), pp.L35-L39.
There was a small but definite risk, about one in 500, that its orbit, and Earth's, might coincide on 21 September 2030
This means the odds are 500 to 1 against it happening, so of course it is no surprise if it is later removed from the list. Level 1 is "normal". Such encounters are normally removed from the list as the date gets closer, and of course they are not expected to hit; it means they are most likely by far to miss.
The new prediction is unlikely to be withdrawn, however
It has been withdrawn; the latest predictions show the first possible impact in 2069.
https://cneos.jpl.nasa.gov/sentry/details.html#?des=2000%20SG344
approximately 15,000 light years from Earth
It is not known how far away it is. If it is in the Outer arm it is around 16,000 light years away (5 kpc), and if it is in the Perseus arm it is half that distance, 8,000 light years away. There is a supernova remnant that may be associated with it, at a distance of around 800 parsecs or 2,600 light years. The McGill survey estimates 2 kpc, or about 6,500 light years. In short, there is considerable uncertainty about its distance.
The line of sight intercepts the Perseus and Outer arms of the Galaxy, at distances of ∼2.5 and ∼5 kpc, respectively. In this paper, we assume the distance d = 5 kpc. In addition, there exists a supernova remnant (SNR) G160.9+2.6, ∼80′ north of SGR 0501+4516 (Gaensler & Chatterjee 2008; Göğüş et al. 2010). The distance and age of the SNR were estimated as 800±400 pc and 4000–7000 years (Leahy & Tian 2007). Göğüş et al. (2010) proposed that SGR 0501+4516 could be associated with G160.9+2.6.
Mong, Y.L. and Ng, C.Y., 2018. X-Ray Observations of Magnetar SGR 0501+4516 from Outburst to Quiescence. The Astrophysical Journal, 852(2), p.86.
For the McGill distance see table 7 of:
Olausen, S.A. and Kaspi, V.M., 2014. The McGill magnetar catalog. The Astrophysical Journal Supplement Series, 212(1), p.6.
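For reference, the light year figures above follow directly from the parsec estimates, using \(1\ \text{pc} \approx 3.26\) light years:

\[ 5\ \text{kpc} \approx 16{,}300\ \text{ly}, \quad 2.5\ \text{kpc} \approx 8{,}150\ \text{ly}, \quad 2\ \text{kpc} \approx 6{,}500\ \text{ly}, \quad 800\ \text{pc} \approx 2{,}600\ \text{ly} \]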
As of March 2016[update], 23 magnetars are known
As of June 2019, 29 are known.
possibly until 1550 BC
There doesn't seem to be any way to get 1550 from the cite. It is more like 2200 BC; the date works out at about 2240 BC ± 40 years.
We report the youngest radiocarbon determination so far for an identified species of Antillean sloth, 4190 ± 40 yr BP
Radiocarbon "yr BP" ages are counted back from AD 1950, not from the paper's publication date, so 4190 yr BP is about 2240 BC (see the arithmetic below).
Another sloth bone, the youngest mentioned, is still not 1550; it seems to be about 1765 BC ± 50 yr:
Although Woods [1989] reported a “whole bone” date of 3715 ± 50 yr bp for unspecified sloth remains recovered at Trou Wòch Sa Wo in southern Haiti, five different sloth species have been recovered from this cave [MacPhee et al., 2000] and there is thus no way of relating this date to a single taxon as we have done here. In any case, the accuracy of this age estimate should be confirmed, minimally by AMS dating of individual, systematically identified elements.
That works out at about 1765 BC, and there is no obvious way to get 1550 from this either.
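Using the BP convention, and ignoring radiocarbon calibration as the original dates here do:

\[ 1950 - 4190 = -2240 \;\Rightarrow\; \text{c. 2240 BC}, \qquad 1950 - 3715 = -1765 \;\Rightarrow\; \text{c. 1765 BC} \]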
Cite is
MacPhee, R.D., Iturralde-Vinent, M.A. and Vázquez, O.J., 2007. Prehistoric sloth extinctions in Cuba: Implications of a new “last” appearance date. Caribbean Journal of Science, 43(1), pp.94-99.
That is equivalent to the current rate of total US emissions, every year until 2100
This is equivalent to only about a third of total US emissions through to 2100. The current rate of US emissions from the energy sector is 5.28 gigatons a year cite
and has been over 5 gigatons a year since 1988
cite. At 5 gigatons of CO2 a year for 80 years, the total US contribution by 2100 with "business as usual" is 400 gigatons.
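As a quick check on the arithmetic:

\[ 5\ \text{Gt CO}_2/\text{yr} \times 80\ \text{yr} = 400\ \text{Gt}, \qquad \frac{130\text{–}150\ \text{Gt}}{400\ \text{Gt}} \approx \frac{1}{3} \]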
Around 10% of the carbon that does defrost will probably be released as CO2, amounting to 130-150 billion tonnes
Table 2.2 of chapter 2 of the 2018 IPCC report gives about 180 gigatons of CO2 per tenth of a degree, which would make this a possible extra increase of less than a tenth of a degree C.
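Taking that IPCC figure at face value, a rough linear scaling gives:

\[ \frac{130\text{–}150\ \text{Gt CO}_2}{180\ \text{Gt CO}_2\ \text{per}\ 0.1\,^{\circ}\text{C}} \approx 0.07\text{–}0.08\,^{\circ}\text{C} \]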
These models often do not include the mitigating effects of vegetation, including peat growth and the growth of vegetation such as forests. A 2018 review came to the conclusion that with RCP 8.5 ("business as usual") it is possible that substantial net losses do not begin until after 2100, and that with RCP 4.5, roughly similar to the path we are on, the permafrost could have a negative feedback effect all the way through:
"Despite model uncertainties, the results of this study indicate that, under climate change trajectories resulting from little or no mitigation effort, such as the RCP8.5 climate we considered, the northern permafrost region would likely act as a source of soil carbon to the atmosphere, but substantial net losses would not occur until after 2100. Under climate change trajectories resulting from more aggressive mitigation, such as the RCP4.5 climate we considered, our analysis indicates that the northern permafrost region could act as a net sink for carbon. These results have significant implications for climate mitigation policies, as they indicatethat effective mitigation policies could attenuate the negative consequences of the permafrost–carbon feedback that are likely to occur under policies that result in little or no mitigation.""
The reason the IPCC don't include these effects is that they are not yet adequately modeled. It is possible that once we know enough to model them accurately, the permafrost may even turn out to be carbon negative through to 2100, or even indefinitely under mitigation scenarios.
A recent Nature comment says that rapid collapse could roughly double this minor effect of the permafrost on Earth's climate by 2300.
Mercury is also entering the food chain, thanks to thawing permafrost. The Arctic is home to the most mercury on the planet. The US Geological Survey estimates there’s a total of 1,656,000 tonnes of mercury trapped in polar ice and permafrost: roughly twice the global amount in all other soils, oceans, and atmosphere. Natali explains that, “mercury often binds up with organic material in places where you have high organic matter content… organism’s bodies don’t remove it, so it bio-accumulates up the food web. Permafrost is almost the perfect storm – you have a lot of mercury in permafrost, it is released into wetland systems, those are the right environment for organisms to take them up, and then [it] heads up the food web. That’s a concern for wildlife, people, and the commercial fishing industry.”
The mercury is bound up in sediment in rivers, found up to 2.8 km downstream of the permafrost slumps. Though the levels are high, because the mercury is bound up in sediment the researchers do not know if it is of any concern. The press release says:
However, because the mercury is locked within sediments, the scientists are unsure as to whether this mercury could be consumed by organisms in the area and whether this mercury poses any threat to the security of northern food webs.
These results highlight the need for further research on mercury cycling in regions experiencing active permafrost thaw, as well as studies examining if and how this mercury might enter food webs in surrounding ecosystems.
Record levels of mercury released by thawing permafrost in Canadian Arctic
https://www.ualberta.ca/science/science-news/2018/december/mercury-permafrost-arctic-climate-change
But she adds that studies of animal populations actually suggest that, “warmer temperatures also increase the prevalence of viruses and disease, so we’re seeing a lot more caribou and reindeer becoming more sickly as a result of this warming climate… it is just not an environment that is suited to thrive at these warmer temperatures
That is probably this press release from NOAA: 2018 Arctic Report Card: Reindeer and caribou populations continue to decline. The numbers of caribou fluctuate widely from year to year; however, this year's decline is likely due in part to climate change. Only one of the herds is near its historic high. For reindeer, the population in Norway has been stable since 2002.
Only the two populations outlined in gold have no decline. The total population has declined by 56%, and some of the herds have no recovery in sight at present.
About 54% of the variability is due to climate indicators. The main ones are the indicators of plant growth in October (warming growing degree days) and June (plant growing degree days).
These are often beneficial to caribou, but when you have multiple warm summers you get increased drought, flies and parasites, and perhaps increased susceptibility to pathogens from heat stress.
In Canada, barren-ground caribou became nationally recognized as "threatened", and two herds of Eastern migratory caribou are "Endangered". In Russia, which has many wild reindeer sub-species, the declines are especially among the island, forest and mountain reindeer.
Natali also says that many areas are experiencing “Tundra browning”: the higher temperatures lead surface water to evaporate into the atmosphere, causing plants to die off.
There has been a large increase in biomass in the Arctic regions. It is a mix of large areas of greening and smaller areas of browning. It suggests there are processes delaying green up. See this chart:
Epstein et al (2012) found an average circumpolar increase in aboveground tundra biomass of 19.8% between 1982 and 2010. This increase was accentuated in the mid- to southern tundra subzones (20–26% increase), yet it was substantially less in the more northern tundra (2–7%).
...
Decline in greenness has recently been detected especially during the last 3–4 years
Changes in timing of spring snow melt, permafrost degradation, killing frosts due to mid-winter or early-spring snow melt, or vegetation shift from graminoids to deciduous shrubs are all possible reasons for arctic tundra browning.
For the boreal forest, remote sensing studies continue to support the “browning” of forest vegetation (1982–2008) with increasing drought stress as the most probable driver. However, this reduction in photosynthesizing vegetation appears to be related to the fractions of evergreen trees and deciduous trees on the landscape – with greater declines in evergreen-dominated areas (Miles and Esau, 2016). Changes towards greening or browning appear here as well highly variable, both in time and space.
The changing colors of the Arctic: from greening to browning
But methane and CO2 are not the only things being released from the once frozen ground. In the summer of 2016, a group of nomadic reindeer herders began falling sick from a mysterious illness. Rumours began circling of the “Siberian plague”, last seen in the region in 1941. When a young boy and 2,500 reindeer died, the disease was identified: anthrax. Its origin was a defrosting reindeer carcass, a victim of an anthrax outbreak 75 years previously. The 2018 Arctic report card speculates that, “diseases like the Spanish flu, smallpox or the plague that have been wiped out might be frozen in the permafrost.” A French study in 2014 took a 30,000 year-old virus frozen within permafrost, and warmed it back up in the lab. It promptly came back to life, 300 centuries later. (To read more, see BBC Earth’s piece on the diseases hidden in ice.)
Anthrax is a special case because its spores can withstand long periods of dormancy. It is also easy to contain and treat. There are other microbes that can survive in a dormant state, like anthrax. If you happened to revive one of those, infection is possible, but it would still never have met any modern antibiotics, so it is not likely to start a pandemic.
Viruses probably can't survive at all, except some very huge ones that infect marine creatures. The bubonic plague is not at all likely to survive, because it needs a living host, and to thaw out a frozen body and be infected by it would be a very unusual situation.
Any ancient microbes are going to have no antibiotic resistance, probably never having even encountered penicillin. So ancient diseases are unlikely to be a problem, especially since we are exposed to them from time to time anyway; mammoths and the like have occasionally been washed out of permafrost for ages.
There have never been any instances of archaeologists working on melting permafrost being infected by ancient viruses.
Adding to this apocalyptic vision, in 2016 the Doomsday Vault – a sub-permafrost facility in Arctic Norway, which safeguards millions of crop seeds for perpetuity – was breached with meltwater
The vault was NOT breached with meltwater. Some melt water got into the entrance tunnel, which was never meant to be watertight, and this has happened just about every year since it was constructed.
This is how Cary Fowler, who helped create the seed vault, described this clickbait story to Popular Mechanics cite
“Flooding is probably not quite the right word to use in this case. In my experience, there’s been water intrusion at the front of the tunnel every single year.”
“The tunnel was never meant to be watertight at the front, because we didn’t think we would need that. What happens is, in the summer the permafrost melts, and some water comes in, and when it comes in, it freezes. It doesn’t typically go very far.”
“If there was a worst case scenario where there was so much water, or the pumping systems failed, that it made its way uphill to the seed vault, then it would encounter minus 18 [degrees celsius] and freeze again. Then there’s another barrier [the ice] for entry into the seed vault,”
Hege Njaa Aschim explained in an email to Popular Science:
“This fall—October 2016—we had extreme weather in Svalbard with high temperatures and a lot of rain—very unusual. This caused water intrusion into the tunnel leading to the seed vault. The seeds and the vault was never at risk. This was no flooding, but more water than we like. So we are doing measures to improve and secure the entrance and tunnel,”
Turns out the Svalbard seed vault is probably fine
2017
The U.S. Energy Information Administration says that it rose by 2.8% in 2018, but projects that it will decrease in 2019 and 2020. The increase in 2018 was due to a 10% increase in emissions from natural gas, and preliminary data makes it 0.4% below the record set in 2017. The high energy consumption in 2018 was largely due to air conditioning demand in the warm weather; the winter months were also colder. 2019 and 2020 are expected to be milder, which is what leads to the reduced forecast for them. Estimates of industrial production growth and GDP growth also factor into their prediction: a slowdown in GDP but faster industrial growth in 2019, then a slowdown in industrial growth in 2020, as industrial production tends to be more energy intensive than the rest of the economy.
U.S. energy-related CO2 emissions increased in 2018 but will likely fall in 2019 and 2020
The seasonal low of 1,078.96 feet (328.87 m) in 2017 is close to that experienced in 2014, safely above the drought trigger for now.[27] However, that level is still 36 feet (11 m) below the seasonal low experienced in 2012 and the lake is projected to begin falling again in 2018.[28]
The seasonal low was 1,076 feet in 2018. However, there was a big snow melt in 2019 feeding Lake Powell. As of June 17 it is around 5 feet above its 2017 level: lake level chart
Colorado had 134% of its normal snowfall in winter 2018, Utah 138%, Wyoming 116%.
As a result Lake Powell is expected to rise 50 feet in 2019, a gain of 12 million acre-feet, compared with only 4.6 million last year. It expects to release 9 million acre feet from Powell to Mead for the fifth consecutive year.
Elephant Butte, a reservoir on the Rio Grande in New Mexico, will also be replenished, from 10% to around 30% of capacity.
It is too early to say if this is the end of the decade-long drought phase. However, it is enough that Arizona, the state with the lowest priority rights to the water from Lake Mead, is no longer expected to have to cut its share in 2020. That shortage may now be put off until after 2021.
Snowmelt fills Colorado River and other waterways in U.S. Southwest, easing drought fears, Denver Post, June 13, 2019
Drought and water usage issues
Out of date - it doesn't mention the new drought contingency plan. All seven states signed a drought contingency plan on May 20, 2019, which runs through 2026, involving voluntary reductions in water taken from Lake Mead if the levels get low.
One sticking point was the Salton Sea in California, which formed as a result of a failed canal project in the early twentieth century and is now an important stop for migrating birds. The Imperial Irrigation District needed extra funding of $200 million to help preserve this sea before it would agree to a reduction. But the Metropolitan Water District was able to pledge most of California's voluntary water cuts, and saved the plan.
They next need to work on a longer term contingency plan for the next 50 years.
Interior and states sign historic drought agreements to protect Colorado River - press release by the US government under "Reclamation, Managing water in the West"
Under the drought plan, states voluntarily would give up water to keep Lake Mead on the Arizona-Nevada border and Lake Powell upstream on the Arizona-Utah border from crashing. Mexico also has agreed to cuts
The drought contingency plan takes the states through 2026, when existing guidelines expire. The states already are preparing for negotiations that will begin next year for new guidelines.
The Imperial Irrigation District was written out of California's plan when another powerful water agency, the Metropolitan Water District, pledged to contribute most of the state's voluntary water cuts.
Imperial had said it would not commit to the drought plan unless it secured $200 million in federal funding to help restore a massive, briny lake southeast of Los Angeles known as the Salton Sea.
Felicia Fonseca, US official declares drought plan done for Colorado River, Phys.org, March 20, 2019
For more background
Despite signs of interstate cooperation, the decline of Lake Mead isn’t near being solved, Michael Hiltzik Feb 08, 2019, Los Angeles Times.
Setbacks
This doesn't mention the first three rocket failures, though they are covered in Falcon 1.
"And the reason that I ended up being the chief engineer or chief designer, was not because I want to, it's because I couldn't hire anyone. Nobody good would join. So I ended up being that by default. And I messed up the first three launches. The first three launches failed. Fortunately the fourth launch which was – that was the last money that we had for Falcon 1 – the fourth launch worked, or that would have been it for SpaceX."
Elon Musk (28 September 2017), Making Life Multiplanetary | 2017 International Astronautical Congress
if a small region of the universe by chance reached a more stable vacuum, this 'bubble' would spread.
[this section needs cites, I have them when I have time] Should explain there are two ways that a false vacuum can collapse. It can do it through energetically pushing it over the barrier. This was only possible in the early universe and conditions do not exist for this today. Or it can happen through quantum tunneling. An example to explain quantum tunneling - in principle a ping pong ball inside a vault could spontaneously find itself outside of it just because of quantum position undertainty, without having to move through the walls. Given infinite time and enough vaults and ping pong balls eventually this has to happen but it is not a realistic possibility on normal timescales.
On the molecular scale, quantum tunneling events can and do happen, and may help explain how we can smell, and how birds are able to sense the Earth's weak magnetic field well enough to navigate.
The quantum tunneling of the false vacuum collapse is more like the ping pong ball analogy: it is extremely unlikely. The latest estimates, based on the properties of the Higgs boson and the top quark, are that the odds are googols to one against it having happened at any time from the Big Bang through to the present.
However, in the extreme conditions of the early universe it should have happened, back when the entire observable universe was compressed into a space far smaller than a proton (the nucleus of a hydrogen atom) and at extreme temperatures - through the first method of being pushed over the barrier, rather than tunneling through it.
John Ellis has suggested that this likely means we need new physics to explain why the universe survived that early stage. The alternative he mentions is that we are surrounded by true vacuum in all directions and happen to be one of the few exceedingly unlikely patches of false vacuum in an infinite universe. Given infinite space-time, even the most improbable things happen somewhere - but he doesn't think this is a likely hypothesis (see 47:40 into this video).
New physics could explain this in more than one way.
One suggestion that's been made is that when a Higgs field collapses it could create a new universe with its own space and time, rather than expanding into ours.
There is also the possibility that the Higgs field was stabilized in the early universe by an interaction with gravity, which would explain why the universe survived to date cite. That still leaves us with the question of why it is so close to the boundary between stable and unstable, when it could be completely stable.
Existential threat
This section is way out of date; most of it seems to have been written before the discovery of the Higgs boson in 2012 or shortly after it. Enough is now known for a rigorous calculation (assuming that there is no new physics to be found, of course).
The authors of the peer-reviewed paper update their abstract on arxiv.org from time to time. The original published value was a future lifetime of the universe of \(10^{139}\) years, with 95% confidence of more than \(10^{58}\) years.
As of version 4 of their paper, revised 2nd May 2018, it is now \(10^{161}\) years, with 95% confidence of more than \(10^{65}\) years.
The odds that we have encountered a vacuum collapse already, or that one is on its way (in total over the lifetime of the universe to date), used to be between \(10^{107}\) to 1 against and \(10^{718}\) to 1 against.
They now say it is between \(10^{367}\) to 1 against and \(10^{1124}\) to 1 against.
Andreassen, A., Frost, W. and Schwartz, M.D., 2018. Scale-invariant instantons and the complete lifetime of the standard model. Physical Review D, 97(5), p.056006.
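As a sanity check on how a lifetime converts into odds - a naive sketch, not the paper's own method, which integrates the nucleation rate over our past light cone and propagates the parameter uncertainties - treat vacuum decay as a Poisson process with mean lifetime \(\tau\). The probability of at least one decay in a time \(t \ll \tau\) is \(1 - e^{-t/\tau} \approx t/\tau\). With \(\tau \approx 10^{161}\) years and \(t \approx 1.4 \times 10^{10}\) years since the Big Bang, that is of order \(10^{-151}\), i.e. odds of order \(10^{151}\) to 1 against. This doesn't reproduce the paper's quoted range; it only shows the mechanics of turning an enormous lifetime into enormous odds.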
Joseph Lykken has said that study of the exact properties of the Higgs boson could shed light on the possibility of vacuum collapse
This has now been done in the 2018 paper in Physical Review D by Andreassen, Frost and Schwartz cited above.
They argue that due to observer selection effects, we might underestimate the chances of being destroyed by vacuum decay because any information about this event would reach us only at the instant when we too were destroyed.
Their argument has been completely misunderstood. They are discussing any natural catastrophic events that could destroy Earth. This is a paper from 2005 well before the discovery of the Higgs.
They mention three possibilities: that a cosmic radiation collision event triggers the collapse of the Earth into a black hole, its conversion into strange matter, or the false vacuum collapse of the entire universe.
The observer selection effect here is just that we exist; it applies to all three scenarios and is not specific to the false vacuum collapse. Indeed it would apply to any other scenario that could destroy Earth, or change it in such a way as to make it impossible for humans to evolve. cite
Given that life on Earth has survived for nearly 4 billion years (4 Gyr), it might be assumed that natural catastrophic events are extremely rare. Unfortunately, this argument is flawed because it fails to take into account an observation-selection effect, whereby observers are precluded from noting anything other than that their own species has survived up to the point when the observation is made.
If it takes at least 4.6 Gyr for intelligent observers to arise, then the mere observation that Earth has survived for this duration cannot even give us grounds for rejecting with 99% confidence the hypothesis that the average cosmic neighbourhood is typically sterilized, say, every 1,000 years. The observation-selection effect guarantees that we would find ourselves in a lucky situation, no matter how frequent the sterilization events
The researchers estimated from their observations that there are nearly two Jupiter-mass rogue planets for every star in the Milky Way
A later 2017 study cast doubt on this result. It used a larger population of microlensing events and found at most one Jupiter-mass rogue planet for every four stars in the Milky Way.
Mróz, P., Udalski, A., Skowron, J., Poleski, R., Kozłowski, S., Szymański, M.K., Soszyński, I., Wyrzykowski, Ł., Pietrukowicz, P., Ulaczyk, K. and Skowron, D., 2017. No large population of unbound or wide-orbit Jupiter-mass planets. Nature, 548(7666), p.183.
It also signified the disappearance of an entire mammal family of river dolphins (Lipotidae)
Potentially confusing - may give impression it means extinction of all river dolphins worldwide. There are two other families of river dolphins.
Clearer as
"It also signified the disappearance of one entire river dolphin mammal family (Lipotidae), leaving only two extant families of river dolphins"
expected to be the first UHVDC cable in the United States
Now on hold. [https://www.windpowermonthly.com/article/1460152/hurdles-kill-off-uss-first-hvdc-line-20-years]
When the temperature is below the freezing point of water, the dew point is called the frost point, as frost is formed rather than dew.
Though popular accounts of meteorology sometimes suggest this, the dew point and frost point differ. The dew point is the temperature at which the air reaches 100% relative humidity with respect to a liquid water surface. The frost point is the slightly higher temperature at which it reaches 100% relative humidity with respect to an ice surface. The distinction normally doesn't matter much, but it is important for processes in clouds: where both are possible, growth of icy particles is favoured over water droplets, because the frost point is at a higher temperature than the dew point.
I am summarizing what the meteorologist Jeff Haby explains here
"The dew point is the temperature at which the air is saturated with respect to water vapor over a liquid surface. When the temperature is equal to the dewpoint then the relative humidity is 100%. The common ways for the relative humidity to be 100% is to 1) cool the air to the dewpoint, 2) evaporate moisture into the air until the air is saturated, 3) lift the air until it adiabatically cools to the dew point. "The frost point is the temperature at which the air is saturated with respect to water vapor over an ice surface. It is more difficult more water molecules to escape a frozen surface as compared to a liquid surface since an ice has a stronger bonding between neighboring water molecules. Because of this, the frost point is greater in temperature than the dew point. This fact is important to precipitation growth in clouds. Since the vapor pressure is less over an ice surface as compared to a supercooled liquid surface at the same temperature, when the relative humidity is 100% with respect to water vapor the relative humidity over the ice surface will be greater than 100%. Thus, precipitation growth is favored on the ice particles."
Omega Point
This article is very poor. Teilhard de Chardin's theory is an attempt by a devout Catholic scientist to reconcile religious ideas of the love of God and of teleology - that our life and world have a purpose - with scientific understanding. To leave that aspect out is to miss the entire point of his work. The theory is very influential in Christian theology generally, and especially in Catholic theology, not just in the twentieth century but through to the present.
This article attempts to treat it as a purely scientific theory stripping away all religious elements. It cites mainly critics who ridicule the idea that religion is relevant to science and the idea that our universe may have any teleology or purpose. There would be the same problems writing an article about Christian ideas of the Resurrection that ignored the theological context. This approach is not used in other articles on Christian theology in Wikipedia.
Rather than annotate particular points in this article I think it is best to just direct the reader to the entry on him in the French Wikipedia, which is much better, written as theology, as well as some summaries of his work by other authors.
The Omega Point is a dynamic concept created by Pierre Teilhard de Chardin, who gave it the name of the last letter of the Greek alphabet: Omega.
For Teilhard, the Omega Point is the ultimate point of the development of complexity and consciousness towards which the universe tends. According to his theory, set out in The Future of Man and The Human Phenomenon, the universe is constantly evolving towards ever higher degrees of complexity and consciousness, the Omega Point being the culmination but also the cause of this evolution. In other words, the Omega Point exists in a supremely complex and supremely conscious way, transcending the universe in the making.
For Teilhard the Omega Point evokes the Christian Logos, that is Christ, in that it draws all things to itself and is, according to the Nicene Creed, "God born of God, Light born of Light, true God born of true God", with the indication "and through him all things were made".
Subsequently this concept was taken up by other authors, such as John G. Bennett (1965) or Frank Tipler (1994).
The Omega Point has five attributes, which Teilhard details in The Human Phenomenon (1955):
It has always existed - only in this way can the evolution of the universe towards higher levels of consciousness be explained.
It must be personal - a person, and not an abstract idea. The growing complexity of matter has not only led to higher forms of consciousness, but to greater personalization, of which humans are the highest form of the "personalization" of the universe; they are fully "individualized", free centres of activity. It is in this sense that man is said to have been made in the image of God, who is the highest form of personality. Teilhard de Chardin expressly maintains that at the Omega Point, when the universe becomes one through unification, we will not see the elimination of persons but their super-personalization. Personality will be infinitely richer. Indeed, the Omega Point unites creation, and as it unites it, the universe becomes more complex and its consciousness increases. The universe evolves towards forms of ever greater complexity, consciousness and finally, with man, personality, because God, who draws the universe towards himself, is a person.
It must be transcendent - the Omega Point is not the result of complexity and consciousness; it exists before the evolution of the universe, because it is the cause of the universe's evolution towards greater complexity, consciousness and personality. This essentially means that the Omega Point lies outside the framework within which the universe evolves, for it is by its attraction that the universe tends towards it.
It must be independent - without the limitations of space and time.
It must be irreversible - that is, it must be possible to reach it.
This is how the idea is described by Linda Sargent Wood as summarized by Oxford Scholarship Online
Merging Catholicism and science, Teilhard asserted that evolution was God's ongoing creative act, that matter and spirit were one, and that all was converging into one complete, harmonious whole. Though controversial, his organismic ideas offered an alternative to reductionistic, dualistic, mechanistic evolutionary views. They satisfied many who were looking for ways to reconnect with nature and one another; who wanted to revitalize and make personal the spiritual part of life; and who hoped to tame, humanize, and spiritualize science. In the 1960s many Americans found his book The Phenomenon of Man and other mystical writings appealing. He attracted Catholics seeking to reconcile religion and evolution, and he proved to be one of the most inspirational voices for the human potential movement and New Age religious worshipers. Outlining the contours of Teilhard's holistic synthesis in this era of high scientific achievement helps explain how some Americans maintained a strong religious allegiance.
Wood, L.S., 2012. A More Perfect Union: Holistic Worldviews and the Transformation of American Culture after World War II. Oxford University Press.
“Only where someone values love more highly than life, that is, only where someone is ready to put life second to love, for the sake of love, can love be stronger and more than death. If it is to be more than death, it must first be more than mere life. But if it could be this, not just in intention but in reality, then that would mean at the same time that the power of love had risen superior to the power of the merely biological and taken it into its service. To use Teilhard de Chardin’s terminology, where that took place, the decisive complexity or “complexification” would have occurred; bios, too, would be encompassed by and incorporated in the power of love. It would cross the boundary—death—and create unity where death divides. If the power of love for another were so strong somewhere that it could keep alive not just his memory, the shadow of his “I”, but that person himself, then a new stage in life would have been reached. This would mean that the realm of biological evolutions and mutations had been left behind and the leap made to a quite different plane, on which love was no longer subject to bios but made use of it. Such a final stage of “mutation” and “evolution” would itself no longer be a biological stage; it would signify the end of the sovereignty of bios, which is at the same time the sovereignty of death; it would open up the realm that the Greek Bible calls zoe, that is, definitive life, which has left behind the rule of death. The last stage of evolution needed by the world to reach its goal would then no longer be achieved within the realm of biology but by the spirit, by freedom, by love. It would no longer be evolution but decision and gift in one.”
Orthodoxy of Teilhard de Chardin: (Part V) (Resurrection, Evolution and the Omega Point)
His views have also been seen as relevant to modern transhumanists, who want to apply technology to overcome our human limitations. Some of them think that his ideas foreshadowed this.
A movement known as transhumanism wants to apply technology to overcome human limitations. Followers believe that computers and humans may combine to form a “super brain,” or that computers may eventually exceed human brain capacity. Some transhumanists refer to that future time as the “Singularity.” In his 2008 article “Teilhard de Chardin and Transhumanism,” Eric Steinhart wrote that:
Teilhard de Chardin was among the first to give serious consideration to the future of human evolution.... [He] is almost certainly the first to describe the acceleration of technological progress to a singularity in which human intelligence will become super intelligence.
Teilhard challenged theologians to view their ideas in the perspective of evolution and challenged scientists to examine the ethical and spiritual implications of their knowledge. He fully affirmed cosmic and biological evolution and saw them as part of an even more encompassing spiritual evolution toward the goal of ultrahumans and complete divinity. This hypothesis still resonates for some as a way to place scientific fact within an overarching spiritual view of the cosmos, though most scientists today reject the notion that the Universe is moving toward some clear goal.
Pierre Teilhard de Chardin: Paleontologist, Mystic and Jesuit Priest - Khan Academy
By Tom Butler-Bowdon
In a nutshell: By appreciating and expressing your uniqueness, you literally enable the evolution of the world.
For Teilhard humankind was not the centre of the world but the ‘axis and leading shoot of evolution’. It is not that we will lift ourselves above nature but, in our intellectual and spiritual quests, dramatically raise its complexity and intelligence. The more complex and intelligent we become, the less of a hold the physical universe has on us, he believed.
Just as space, the stars and galaxies expand ever outwards, the universe is just as naturally undergoing ‘involution’ from the simple to the increasingly complex; the human psyche also develops according to this law. ‘Hominisation’ is what Teilhard called the process of humanity becoming more human, or the fulfilment of its potential. ... Teilhard said as humanity became more self-reflective, able to appreciate its place in space and time, its evolution would start to move by great leaps instead of a slow climb. In place of the glacial pace of physical natural selection, there would be a supercharged refinement of ideas that would eventually free us of physicality altogether. We would move irresistibly toward a new type of existence, at which all potential would be reached. Teilhard called this the ‘omega point’.
Book review: The Phenomenon of Man by Pierre Teilhard de Chardin
An eruption of the giant Yellowstone Caldera is one of the triggering events in the end-of-the-world spectacle 2012. And that was basically a documentary, right?
For those who worry about supervolcanoes: a new study shows that it would take anywhere from centuries to thousands of years for a magma chamber to change state to permit a supervolcano eruption, and none of the volcanoes that volcanologists monitor show any signs of this process. We would also have plenty of warning before an eruption, as the ground would be lifted not just by meters but by tens to hundreds of meters - a process that would most likely take many human lifetimes, according to a study at the University of Illinois.
We would have ample warning of a supervolcano eruption - centuries, or thousands of years
Hey, do you like soccer? Of course you don't. You're going to like it even less after the accidental creation of strangelets that cause the Earth to be smashed into a hyper-dense sphere the size of a soccer field. Good luck using your hands then, pal.
Short summary: the LHC is harmless because the same collisions happen in our atmosphere thousands of times a day at far higher energies. Whatever it may do is happening in our atmosphere already and clearly doesn’t harm us.
It is very unlikely that the LHC can create mini black holes. If it can then the mini black holes are expected to disintegrate immediately in a burst of Hawking radiation. Whether or not that’s true, our universe has natural particle accelerators larger than our solar system, which accelerate particles to immense energies and fire them in all directions including towards the Earth. There is no way that our tiny accelerators on Earth can match their energies. Indeed, THOUSANDS OF COLLISIONS each MORE THAN SEVEN MILLION TIMES MORE ENERGETIC than the highest levels that LHC can produce happen in our atmosphere EVERY SINGLE DAY from natural collisions with cosmic radiation particles.
Debunked: the Large Hadron Collider will create mini black holes that will destroy the world
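The "seven million times" figure is a lab-frame comparison. Assuming cosmic rays of around \(10^{20}\) eV (the highest ever recorded was about three times that) against the LHC's collision energy of 13 TeV \(= 1.3 \times 10^{13}\) eV, the ratio is \(10^{20} / (1.3 \times 10^{13}) \approx 7.7 \times 10^{6}\). Comparisons in the centre-of-mass frame give smaller ratios, but the safety argument only needs natural collisions at least as energetic as the LHC's, which cosmic rays comfortably provide.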
And according to apocalypse enthusiasts, it would knock every satellite from the sky, tear holes in the ozone layer, bombard every living thing on the planet with cancer-causing cosmic rays for hundreds of years, and lead to mass extinctions. Entire portions of the Earth could end up uninhabitable!
This is nonsense. You are talking about a possible increase in UV, and possibly satellites glitching for a few hours until the storm is over. There is no risk from ionizing radiation at ground level; our atmosphere protects us with the equivalent of 10 meters thickness of water.
There is no risk at all of areas of Earth becoming uninhabitable to humans. If we get a magnetic pole reversal, then from studies of previous reversals and from modeling, the main risk is increased UV as a result of the ozone layer being damaged by repeated solar storms. It would mean that you need to wear more sunblock cream on sunny days.
We are not yet in a magnetic pole reversal, and if we are headed for one, it’s likely to start in earnest decades to centuries to perhaps thousands of years in the future and take at least a century or two to complete.
The main risk from a solar storm is of GPS satellites glitching for hours, recovering once it is over - and of power cuts. We used to think that the power cuts could be severe, costing getting on for a trillion dollars, with power out for months or longer. But the grid is more resilient than those early studies suggested. The effect is now thought to be similar to a major hurricane, perhaps up to $100 billion in costs for the US, with only local, short-term power cuts. This is because modern step-up transformers are more robust than the few older ones still in the grid, and they are usually multiply redundant. It would have no effect at all on consumer electronics such as laptops or cellphones - the article confused magnetic storms with EMP pulse effects.
Debunked: Earth’s magnetic poles to swap and make parts of Earth uninhabitable - NOT
mass extinction
We do not face mass extinction either. Of all the ecosystems, only the corals are at risk of near-complete extinction in the worst case. That would be sad, but hardly a blip compared with the mass extinctions of the past. We start with an unusually biodiverse world compared to previous eras, because of the separation of land into many continents, with habitats ranging from tropical deserts all the way to polar ice sheets.
We risk reducing biodiversity and many extinctions, but we also know how to stop it: end perverse agricultural subsidies, reduce food waste, and promote sustainable agriculture and a circular economy.
actual impending doom of global climate change
This is not an impending doom at all. For the worst case see the IPCC’s example worst case climate change scenario.
Progress
Needs a Criticism section.
The main criticism is that it is not helping us understand how the human brain itself works. From the article by Frégnac et al. in Nature in 2014:
Contrary to public assumptions that the HBP would generate knowledge about how the brain works, the project is turning into an expensive database-management project with a hunt for new computing architectures. [1] The problem is that it is not founded on knowledge of how neurons are connected in the real brain, or on experimental data. Most importantly, there are no formulated biological hypotheses for these simulations to test. Instead it is an attempt to simulate, using many hardware neurons, something the researchers hope will resemble the way a human brain works, without any experimental data to guide the experimentation. [1]
The revised plan advances a concept in which in silico experimentation becomes a “foundational methodology for understanding the brain” [1]
Shortly after the project started, many European neuroscientists signed an open letter raising the following issues [3]:
a) That the project could not provide understanding of the brain without corrective loops between hypotheses and experimental facts, and is not guided by any precise hypotheses to be tested and checked.
b) That the model is overoptimistic and wrong. The project should either abandon the neurological research and focus only on technological advances, or be split into two projects covering the two areas.
c) That it is too expensive, siphoning away important funds from other fundamental research.
d) That it is excessively big, with unclear coordination mechanisms.
This collective sent the “Open message to the European Commission concerning the Human Brain Project” on July 7, 2014.
A mediation report was published in 2015. This upheld most of the criticisms.
The mediation committee summarized the disagreement as [2]:
The goal of reconstructing the mouse and human brain in silico and the associated comprehensive bottom-up approach is viewed by one part of the scientific community as being impossible in principle or at least infeasible within the next ten years, while another part sees value not only in making such simulation tools available but also in their development, in organizing data, tools and experts
They recommended that the goals should be less ambitious saying:
Issue: Public announcements by the HBP leadership and by the EC overstated objectives and the possible achievements of the HBP. Unrealistic expectations were raised, such as tools for predictive simulation of the human brain to enable understanding of brain function or to support diagnosis and therapy of neurodegenerative diseases within the course of the project. This resulted in a loss of scientific credibility of the HBP.
Recommendation: The HBP and the EC should clearly and openly communicate the project’s sharpened mission and objectives. Furthermore, the HBP should systematically create and use opportunities to enter a constructive scientific discourse with the science community, with science policy makers and with the interested public. Ultimately the reputation of the HBP in the science community will rest on the publication of convincing scientific results and the generation of widely used IT platforms. ...
They recommend that it be split into three or four subprojects, with PIs who have a strong neuroscience background:
Issue: The absence of systems and cognitive neuroscience subverts the multi-scale and multi-perspective ambitions of the HBP to integrate and validate various approaches to a unifying modelling and simulation platform. It also impairs the validation of other IT platforms developed in the HBP regarding the value added to future neuroscience research and clinical practice
Recommendation: The SPs (and the constituent WPs) suggested in the FPA should be consolidated and integrated with a set of new cross-cutting horizontal tasks to form a matrix-type project structure. These cross-cutting activities should be organized in at least 3-4 WPs to address challenging problems of systems and cognitive neuroscience which are led by PIs with a strong scientific background in the respective areas. These WPs should be aggregated to a new cross-cutting subproject “Cognitive and Systems Neuroscience: CSN”.
[1] Frégnac, Y. and Laurent, G., 2014. Neuroscience: Where is the brain in the Human Brain Project?. Nature News, 513(7516), p.27.
[2] Human Brain Project Mediation Report, Mediator Prof. Dr. Wolfgang Marquardt, March 2015
[3] This is a paraphrase of the relevant section of: Why do we foster and grant wrong innovative scientific methods? The neuroscientific challenge, Jordi Vallverdu, Autonomous University of Barcelona
Blue Brain Project
Page is five years out of date and doesn't include a criticism section. French Wikipedia summarizes the criticisms like this:
This project, along with the resulting Human Brain Project, is heavily criticized for a variety of reasons, including the scientific strategy adopted and the high cost involved. Launched in 2013, the HBP faced a number of criticisms, including the lack of evidence from Blue Brain [5]. However, in October 2015 the Blue Brain Project team published in Cell an article describing a simulation of a rat brain, covering 30,000 neurons and 40 million synapses - which did not stop criticism of the overall unrealism of the project [5]. Blue Brain (French Wikipedia)
[5] Kai Kupferschmidt, "Virtual rat brain fails to impress its critics", Science, October 16, 2015, Vol. 350, no. 6258, pp. 263-264. DOI: 10.1126/science.350.6258.263
See also Theil, S., 2015. Why the Human Brain Project Went Wrong—and How to Fix It. Scientific American, 313(4), pp.36-42.
Frégnac, Y. and Laurent, G., 2014. Neuroscience: Where is the brain in the Human Brain Project?. Nature News, 513(7516), p.27.
by reverse-engineering mammalian brain circuitry
One of the main criticisms of this project is that neuroscientists do not have a detailed "map of connections between neurons within and across brain areas that could guide simulations"
From the beginning, neuroscientists pointed out that large-scale simulations make little sense unless constrained by data, and used to test precise hypotheses. In fact, we lack, among other resources, a detailed 'connectome', a map of connections between neurons within and across brain areas that could guide simulations. There is no unified format for building functional databases or for annotating data sets that encompass data collected under varying conditions. Most importantly, there are no formulated biological hypotheses for these simulations to test
Frégnac, Y. and Laurent, G., 2014. Neuroscience: Where is the brain in the Human Brain Project?. Nature News, 513(7516), p.27.
commonly recognized conditions such as delusional infestation
Note for comparison, the lede of the French Wikipedia article on Morgellons:
"Morgellons" or "Morgellon's disease" is a controversial medical condition reported in the United States in 2002. It is characterized by dermatological lesions where the patient perceives inert or organic fibers, included or protruding from the skin.
It is most often considered a form of delusional parasitosis (Ekbom syndrome), a factitious disorder, or a collective syndrome of psychogenic origin. Some authors, however, argue that Morgellons disease is a true somatic disease of infectious origin, linked among other things to Lyme disease.
some people have linked Morgellons "to another illness viewed skeptically by most doctors, chronic Lyme disease"
The Mayo Clinic also refers to this research, saying cite:
The research on Morgellons by multiple groups over decades has yielded conflicting results. Multiple studies report a possible link between Morgellons and infection with Borrelia spirochetes. These results contradict [the CDC study]

One of the main proponents of this hypothesis is Marianne J. Middelveen, MDes, a veterinary microbiologist from Alberta, Canada. She made a connection with a disease of cattle called bovine digital dermatitis, which has similar symptoms - and in that case it is well established that there are microfilaments of keratin and collagen which form beneath the skin.
She analysed filaments that form beneath the skin of sufferers, and found that these too are made of keratin and collagen. She also found spirochetes, which are usually associated with Lyme disease in humans.
The main paper is Middelveen, M.J., Bandoski, C., Burke, J., Sapi, E., Filush, K.R., Wang, Y., Franco, A., Mayne, P.J. and Stricker, R.B., 2015. Exploring the association between Morgellons disease and Lyme disease: identification of Borrelia burgdorferi in Morgellons disease patients. BMC dermatology, 15(1), p.1.
She, along with a dozen or so other researchers, publishes one or two papers a year on this topic.
For more background and cites, see also Mystery Of Morgellons - Disease Or Delusion - Scientific Hypothesis Of Connection With Lyme Disease (originated as Wikipedia article but on rejection from Wikipedia, rewritten in less encyclopedic tone as a blog post)
Here is Marianne Middelveen talking about her research https://youtu.be/IaxdRvesVfM
French Wikipedia summarizes her research as:
Since 2011, some scientists have been trying to demonstrate the reality of the disease, defining it as a filamentous dermatitis linked to Lyme disease and publishing one to two articles per year.
The main author of these publications is Marianne J. Middelveen, a veterinarian, who draws a parallel between bovine digital dermatitis (Mortellaro disease) in animals and Morgellons in humans [13]. The bovine disease is a skin infection associated with various pathogens, including spirochaetes and treponemes. In animals the disease is contagious and can present thread-like papules.
Middelveen's hypothesis is that Morgellons is a human equivalent of bovine digital dermatitis. She regularly publishes work showing an association between Morgellons and Lyme disease [14][15].
FOOTNOTES
[13] Marianne J. Middelveen and Raphael B. Stricker, "Filament formation associated with spirochetal infection: a comparative approach to Morgellons disease", Clinical, Cosmetic and Investigational Dermatology, Vol. 4, 2011, pp. 167-177 (ISSN 1178-7015, PMID 22253541, PMCID PMC3257881, DOI 10.2147/CCID.S26183, accessed March 4, 2019)
[14] Marianne J. Middelveen, Cheryl Bandoski, Jennie Burke and Eva Sapi, "Exploring the association between Morgellons disease and Lyme disease: identification of Borrelia burgdorferi in Morgellons disease patients", BMC Dermatology, Vol. 15, no. 1, February 12, 2015, p. 1 (ISSN 1471-5945, PMID 25879673, PMCID PMC4328066, DOI 10.1186/s12895-015-0023-0, accessed March 4, 2019); summary in English, "morgellons and lyme", on imedecin.com
[15] Jyotsna S. Shah and Raphael B. Stricker, "Detection of tick-borne infection in Morgellons disease patients by serological and molecular technologies", Clinical, Cosmetic and Investigational Dermatology, November 9, 2018 (accessed March 4, 2019)
In 2008, The Washington Post reported that Internet discussions about Morgellons include many conspiracy theories about the cause, including biological warfare, nanotechnology, chemtrails and extraterrestrial life.
The Washington Post article predates the CDC study. It is a summary by a journalist of an internet search of Morgellons discussion blogs.
Other points in the Washington Post article not mentioned here:
Robert Bransfield is a New Jersey psychiatrist who studies the connection between infection and mental illness. ... "This isn't delusional," Bransfield says. "Delusions are quite variable. So, one person might have a delusion that the FBI is sending out messages to his dentures. Someone else has a delusion that their next-door neighbor is stealing their mail. But the people who have Morgellons all describe it in the same way. It doesn't have the variability you would see in delusions." ... And what of mass hysteria? Could Morgellons be, in a very real sense, nothing more than an Internet virus that has taken hold in susceptible minds? "I do see suggestion with the Internet, but it's hard to explain it on that alone," Bransfield says. "You can see the fibers. The fibers can't be mass hysteria. You see people describe this who don't have a computer," he says. "It's puzzling. It's hard to make sense out of it. But it's there. "
It also covers the very early stages of the research into a possible connection with Borrelia burgdorferi
At about the same time, Leitao, who was trained as a biologist and worked as an electron microscopist; along with Ginger Savely, a nurse practitioner; and Raphael Stricker, a hematologist, published a paper in the American Journal of Clinical Dermatology reporting that 79 out of 80 Morgellons patients they studied also were infected with Borrelia burgdorferi, the tick-borne bacteria that cause Lyme disease.
Morgellons Research Foundation
As the article says, the Morgellons Research Foundation was the primary patient advocacy group in the 2000s.
However it does not mention its successor, the Charles Holman Morgellons Disease Foundation, a 501(c)(3) nonprofit organization committed to "advocacy and philanthropy in the battle against Morgellons Disease"
This organization holds an annual three-day conference on Morgellons for researchers to discuss their findings. Its main subject of study is a possible connection with chronic Lyme disease.
There are many peer reviewed papers on the topic. It is minority view science but not fringe.
All attempts to add a mention of this research to Wikipedia, either on this article or as a separate article (even one not linked to by it), have been removed. Here are some of the cites.
Middelveen, Marianne J; Burugu, Divya; Poruri, Akhila; Burke, Jennie; Mayne, Peter J; Sapi, Eva; Kahn, Douglas G; Stricker, Raphael B (2013). "Association of spirochetal infection with Morgellons disease". F1000Research. doi:10.12688/f1000research.2-25.v1. ISSN 2046-1402.
Middelveen, Marianne J; Bandoski, Cheryl; Burke, Jennie; Sapi, Eva; Filush, Katherine R; Wang, Yean; Franco, Agustin; Mayne, Peter J; Stricker, Raphael B (2015). "Exploring the association between Morgellons disease and Lyme disease: identification of Borrelia burgdorferi in Morgellons disease patients". BMC Dermatology 15 (1). doi:10.1186/s12895-015-0023-0. ISSN 1471-5945.
Middelveen, Marianne J; Stricker, Raphael B (2011). "Filament formation associated with spirochetal infection: a comparative approach to Morgellons disease". Clinical, Cosmetic and Investigational Dermatology.
Middelveen, Marianne J; Burke, Jennie; Sapi, Eva; Bandoski, Cheryl; Filush, Katherine R; Wang, Yean; Franco, Agustin; Timmaraju, Arun; Schlinger, Hilary A; Mayne, Peter J. "Culture and identification of Borrelia spirochetes in human vaginal and seminal secretions". F1000Research.
Middelveen, Marianne J; Rasmussen, Elizabeth H; Kahn, Douglas G; Stricker, Raphael B. "Morgellons Disease: A Chemical and Light Microscopic Study". Journal of Clinical & Experimental Dermatology Research.
Shah, Jyotsna S. "Morgellons Disease – Chronic Form Of Borrelia Infection?"
Middelveen, Marianne J; Stricker, Raphael B. "Morgellons disease: a filamentous borrelial dermatitis". International Journal of General Medicine, Volume 9, DovePress.
Morgellons is poorly understood but the general medical consensus is that it is a form of delusional parasitosis in which individuals have some form of actual skin condition that they believe contains some kind of fibers.
The cites for this sentence all precede the big 2012 CDC study. Although most say it is a form of delusional parasitosis, one of them says: "The cause, transmission, and treatment are unknown."
Whether Morgellons disease is a delusional disorder or even a disease has been a mystery for more than 300 years. Symptoms of Morgellons include crawling and stinging sensations, feeling of “bugs” and/or fiber-like material beneath the skin, disabling fatigue, and memory loss. The cause, transmission, and treatment are unknown.
Simpson, L; Baier, M (August 2009). "Disorder or delusion? Living with Morgellons disease". Journal of Psychosocial Nursing and Mental Health Services. 47 (8): 36–41. doi:10.3928/02793695-20090706-03. PMID 19681520.
Morgellons is poorly characterized but the general medical consensus is that it is a form of delusional parasitosis; the sores are the result of compulsive scratching, and the fibers, when analysed, turn out to originate from textiles.
The CDC just said they were not able to conclude whether it represents a new condition or wider recognition of delusional parasitosis, and called it an "unexplained dermopathy".
We were not able to conclude based on this study whether this unexplained dermopathy represents a new condition, as has been proposed by those who use the term Morgellons, or wider recognition of an existing condition such as delusional infestation, with which it shares a number of clinical and epidemiologic features.
Pearson, M.L., Selby, J.V., Katz, K.A., Cantrell, V., Braden, C.R., Parise, M.E., Paddock, C.D., Lewin-Smith, M.R., Kalasinsky, V.F., Goldstein, F.C. and Hightower, A.W., 2012. Clinical, epidemiologic, histopathologic and molecular features of an unexplained dermopathy. PLoS One, 7(1), p.e29908.

The Mayo Clinic describes it like this:
Morgellons disease: Managing an unexplained skin condition
Morgellons disease is an uncommon, poorly understood condition characterized by small fibers or other particles emerging from skin sores. People with this condition often report feeling as if something were crawling on or stinging their skin.
Some doctors recognize the condition as a delusional infestation and treat it with cognitive behavioral therapy, antidepressants, antipsychotic drugs and counseling. Others think the symptoms are related to an infectious process in skin cells. Further study is needed.
This does not amount to a medical consensus that it is delusional parasitosis. Other cites given later in this article predate the CDC study.
CDC investigation
This section doesn't mention criticisms of the CDC study.
The main problem they faced is the low prevalence of the condition, only 3.65 cases per 100,000. Their four-year study found only 41 patients with the condition, at a cost of more than ten thousand dollars per patient. They also didn't select patients who self-diagnosed as having Morgellons, so it is possible that some of the patients they studied did not think they had the condition.
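Both numbers are easy to check. At a prevalence of 3.65 per 100,000, even a study population of a million people would be expected to yield only \(3.65 \times 10^{-5} \times 10^{6} \approx 37\) cases, which is why four years of work produced just 41 clinically evaluated patients. And assuming the widely reported total cost of the CDC study of roughly $600,000 (a figure not given above, so treat it as an assumption), the per-patient cost comes out at \(600{,}000 / 41 \approx \$14{,}600\), consistent with "more than ten thousand dollars per patient".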
Harry Schone summarizes these criticisms in one of the sections of his University College London thesis
"It is indeed true that the CDC were being cautious, that they found no positive evidence for the claims made by Morgellons sufferers, but it does not mean that the study can go without critical appraisal. Although expensive and lengthy, the research only clinically evaluated 41 people. Furthermore, since the population was selected by criteria other than self-identification it has been argued by critics of the study that some of those included did not have or even consider themselves to have Morgellons. The validity of these criticisms may rest on somewhat pedantic points, but what is certainly true is that an awful lot of reading between the lines has been passed off as something more substantial."
See Learning from Morgellons, Harry Quinn Schone, Masters thesis for UCL (University College London).
This is a masters thesis rather than a PhD; however, UCL is one of the most prestigious universities in Europe, and the thesis summarizes the concerns of other researchers.
No parasites or mycobacteria were detected in the samples collected from any patients. Most materials collected from participants' skin were composed of cellulose, likely of cotton origin
In their 2015 paper, Middelveen and her co-researchers describe technological limitations of the CDC study which could explain why it didn't find the spirochetes that they are able to identify in Morgellons patients:
"The search for spirochetal pathogens in that study was limited to Warthin-Starry staining on a small number of tissue samples and commercial two-tiered serological Lyme disease testing as interpreted by the CDC Lyme surveillance criteria. It should be noted that only two of the patients in our study group were positive for Lyme disease based on the CDC Lyme surveillance criteria and yet Borrelia spirochetes were readily detectable in this group of 25 MD patients."
They attribute their success in detecting Borrelia burgdorferi and closely related spirochetes to several factors.
A 2015 review
This article doesn't seem to have had much editing since 2015. The hypothesis of a multifactorial issue is now the scientific consensus.
For more cites see the annotation on the last sentence of the lede
A 2019 review concludes that the collapse develops through a sequence of steps.
First, climate change, agrochemical use or inadequate food decreases the strength of the colonies.
Then there are faults in bee management, including depriving bees of too much honey and replacing it with sugary food, inadequate treatment against Varroa, wintering on honey contaminated with insecticides, etc.
The colonies then become more easily infected, often with nosemosis, Lotmaria infection and American foulbrood infection. Finally Varroa spreads, due to lack of adequate measures to prevent it.
STANIMIROVIĆ, Z., GLAVINIĆ, U., RISTANIĆ, M., ALEKSIĆ, N., JOVANOVIĆ, N., VEJNOVIĆ, B. and STEVANOVIĆ, J., 2019. LOOKING FOR THE CAUSES OF AND SOLUTIONS TO THE ISSUE OF HONEY BEE COLONY LOSSES. Acta Veterinaria-Beograd, 69(1), pp.1-31.
"... for bee feeding. The use of supplements with sugar syrup should not be avoided, since they provide sufficient amino acids, peptides, micro- and macroelements which are absent from pure sugar syrup [18]. The use of supplements may prevent energetic, immune and oxidative stress in bees, and thus prevent losses in apiaries [129, 171-174]. The presence of a young, healthy bee queen in the hive guarantees the development of healthy bee colonies and successful beekeeping [131, 175]. Suitable pathogen control in hives, primarily of the bee mite V. destructor, with effective, registered varroacides is also a prerequisite for maintaining bee colonies in a good health condition. In addition, a strong link was detected between colony losses and beekeepers' education and training: professionals were capable of keeping colonies free from diseases, unlike hobbyists [12, 70]. Professionals promptly detected symptoms, especially those of American foulbrood or Varroa infestation, and timely applied control measures, contributing to the survival of their colonies. This was the first time that scientists focused attention on the impact of apiculturists and beekeeping practices on colony losses. The same authors commented that the introduction of a bee killer, the Varroa mite, to Europe at the beginning of the 1980s did not result in increased colony losses. This was explained by the fact that beekeepers efficiently adopted measures to combat the mite [12].

CONCLUSIONS: Scientific consensus has been reached that colony losses (CCD) are a multifactorial issue [3, 4, 6], which follows various conditions, but, according to our observations, it develops through a sequence of steps. Firstly, various non-specific factors (e.g. climate changes, agrochemisation and inadequate food) decrease the strength of the colonies; apitechnical faults (depriving bees of too much honey and a consecutive addition of large quantities of sugary food, inadequate treatments of colonies mainly against V. destructor, high stress and exhausting of bees, wintering colonies on honey contaminated with pesticides - sunflower honey, bad timing for wintering the colonies etc.). Such colonies easily become eligible for bacterial, microsporidial and trypanosomal infections. Manifested nosemosis combined with Lotmaria infection and latent American foulbrood infection additionally exhaust bee colonies and impair the immune system of the bee [176-179]. Finally, inadequate anti-varroa strategies lead to significant health problems in bees and the spread of viruses for which Varroa is a vector and/or activator. The whole process is a path prepared for the manifestation of virus infections."
A large amount of speculation has surrounded a family of pesticides called neonicotinoids as having caused CCD.
This page seems to have not been updated much since 2013. There is scientific consensus now that it is multifactorial.
Bee populations are declining in the industrialized world, raising concerns for the sustainable pollination of crops. Pesticides, pollutants, parasites, diseases, and malnutrition have all been linked to this problem. We consider here neurobiological, ecological, and evolutionary reasons why bees are particularly vulnerable to these environmental stressors. Central-place foraging on flowers demands advanced capacities of learning, memory, and navigation. However, even at low intensity levels, many stressors damage the bee brain, disrupting key cognitive functions needed for effective foraging, with dramatic consequences for brood development and colony survival.
Klein, S., Cabirol, A., Devaud, J.M., Barron, A.B. and Lihoreau, M., 2017. Why bees are so vulnerable to environmental stressors. Trends in Ecology & Evolution, 32(4), pp.268-278.
The good news is that the past decade has seen plenty of progress in understanding the mystery of Colony Collapse Disorder. The bad news is that we now recognise it as a complex problem with many causes, although that doesn’t mean it is unsolvable.
For all bees, foraging on flowers is a hard life. It is energetically and cognitively demanding; bees have to travel large distances to collect pollen and nectar from sometimes hard-to-find flowers, and return it all to the nest. To do this they need finely tuned senses, spatial awareness, learning and memory.
Anything that damages such skills can make bees struggle to find food, or even get lost while trying to forage. A bee that cannot find food and make it home again is as good as dead.
Because of this, bee populations are very vulnerable to what we call “sublethal stressors” – factors that don’t kill the bees directly but can hamper their behaviour.
Ten years after the crisis, what is happening to the world’s bees?
Colonies are often challenged by multiple stressors, which can interact: for example, pesticides can enhance disease transmission in colonies. Colonies may be particularly vulnerable to sublethal effects of pathogens and pesticides since colony functions are compromised whether a stressor kills workers, or causes them to fail at foraging. Modelling provides a way to understand the processes of colony failure by relating impacts of stressors to colony-level functions.
Barron, A.B., 2015. Death of the bee hive: understanding the failure of an insect society. Current Opinion in Insect Science, 10, pp.45-50.
Syngenta together with Bayer is challenging this ban in court.
The case was dismissed by the European court on 22nd May 2018 cite
83 million in 2014
90 million in 2017, data from FAOSTAT
Alternative shipping routes
Way out of date, last updated in 2012. As of 2019 the pipelines include:
A third of the world’s liquefied natural gas and almost 20% of total global oil production passes through the strait,
Also, about 30% of seaborne traded oil, more than 85% of that for Asia, mainly Japan, India, South Korea and China [details](https://www.reuters.com/article/us-yemen-security-oil-factbox/factbox-middle-east-oil-gas-shipping-risks-alternative-routes-idUSKBN0MM2E720150326)
2017: 17.2 million bpd; first half of 2018: 17.4 million bpd (sea-borne crude and condensate) [details](https://www.reuters.com/article/us-iran-oil-factbox/strait-of-hormuz-the-worlds-most-important-oil-artery-idUSKBN1JV24O)
Most of the crude exported from Saudi Arabia, Iran, the United Arab Emirates, Kuwait and Iraq passes through it. It is also the route for nearly all the liquefied natural gas (LNG) from lead exporter Qatar. cite
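These figures are consistent with each other. Assuming world oil production of roughly 100 million bpd (the approximate 2018 level - my figure, not one given above), the Hormuz flow is \(17.4 / 100 \approx 17\%\) of total production, matching the "almost 20%" claim; the share of seaborne traded oil is higher (about 30%) because only part of world production moves by sea.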
largest crude oil export line.
Currently running at much lower than capacity, at 80,000 to 90,000 bpd; some reports say much lower [details]
Rare 'superflares' could one day threaten Earth
Large magnetic storms can happen; true superflares can't - that was disproved a while back. Ours is the wrong kind of star for that. A superflare star needs a stronger magnetic field than our sun, with poles that spin more rapidly than the equator - ours is the other way around.
Can superflares happen on our sun - latest research says no
The study must be about the larger magnetic storms, similar to the Carrington event in the nineteenth century. Superflares can be up to three times brighter than that one - not bright enough for the light to be harmful.
They don't harm you directly. They can cause GPS to glitch for a few hours. They can cause power cuts, but these are much less severe than was estimated a few years ago; the step-up and step-down transformers are more resilient than previously thought.
My post here goes into details of effects of solar storms with cites, towards the end:
one million animal and plant species were threatened with extinction.
This figure got far more attention than it deserved. It is one of many figures in the IPBES media release, and the press highlighted it as the main finding. Their actual main finding was that these species can be prevented from going extinct.
It includes vulnerable species, with a 10% chance of extinction in 100 years, which means about half would be lost in 600 years. It also includes those with a stable population of fewer than 1,000 mature individuals (plus various other alternative inclusion criteria). These can be conserved. Vulnerable species often move to least concern, as happened with the humpback whale in 2008.
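To spell out the 600-year figure: if a vulnerable species has a 10% chance of extinction per century, and the centuries are treated as independent, the fraction surviving after six centuries is \(0.9^{6} \approx 0.53\), so roughly half would be lost in 600 years.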
They assume that, across most groups of species, 25% are threatened with extinction. It's a reasonable guess based on the 27% figure for the relatively few species that have been assessed for the IUCN Red List.
For insects, they expect the fraction is lower, but unlikely to be less than 10%. 10% of the 5.5 million insect species is 550,000, and 25% of the remaining 2.5 million species is 625,000. The imprecision of the estimates means there is no point in being more precise than one million. The figure covers 8 million estimated eukaryotes - that includes minute creatures in seas, rivers and soil, many needing a microscope to see. The 2011 paper which is its source said that at the time, 86% of existing species had not yet been described.
The 2011 paper estimates 298,000 plant species, so a quarter of those would be 74,500 plants that either have a 10% chance of extinction in 100 years or have a world population of fewer than 1,000 mature individuals.
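Putting those back-of-the-envelope numbers together (a sketch only - the species totals are the 2011 paper's estimates, and the 10% and 25% fractions are the assumptions just described):

```python
# The arithmetic behind the "one million species" figure.
total_eukaryotes = 8.0e6             # estimated species (2011 paper)
insects = 5.5e6                      # estimated insect species
others = total_eukaryotes - insects  # 2.5 million non-insects

threatened = 0.10 * insects + 0.25 * others
print(f"threatened species: {threatened:,.0f}")    # 1,175,000 -> "about one million"

# Higher plants can only ever be a small slice of that million:
plants = 298_000
print(f"threatened plants: {0.25 * plants:,.0f}")  # 74,500
```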
You can read the science behind the IPBES assessment here: A million threatened species? Thirteen questions and answers by Dr Andy Purvis, one of the authors who calculated the figure.
I wrote up the IPBES report as Let's Save A Million Species, And Make Biodiversity Great Again, UN Report Shows How
Researchers say their analysis of all documented plant extinctions in the world shows what lessons can be learned to stop future extinctions.
Most plants are far less vulnerable to total extinction than animals, because plant seeds can be preserved in cold, dry conditions for centuries, even millennia.
Some tropical plants, such as mangoes, can't be preserved in this way, as their seeds do not survive drying and cooling. But most plants can be.
The Svalbard Global Seed Vault is an extra backup, sufficient to restore all the world's agriculture even if there were no other seeds left in the entire world. Not just the main crops, but numerous wild varieties as well. Our seed crops are not going to go extinct. It currently has just short of a million samples, and will eventually store 4.5 million varieties of crops.
For seeds generally, the Millennium Seed Bank is the largest seed bank in the world, with an aim to hold 25% of the world's seeds by 2020. It already has seeds of all the UK's plants, apart from a few that are either too rare to collect seeds from, or whose seeds can't be preserved.
The number is based on actual extinctions rather than estimates, and is twice that of all bird, mammal and amphibian extinctions combined.
This is what the researchers would expect given that there are far more plant species overall. The New Scientist article about this research makes this point:
The number of plant extinctions is much greater than the number for birds, mammals, and amphibians. That’s what researchers would expect, given there are more plant species overall
Humans have driven nearly 600 plant species to extinction since 1750s
Almost 600 plant species have been lost from the wild in the last 250 years, according to a comprehensive study.
Okay - they are confusing things a bit here by not giving the total number of plant species there are, which in 2016 was 391,000, with 2,000 new species discovered every year.
How many plant species are there in the world? Scientists now have an answer
So that's about 0.15% lost in 250 years (roughly 600 out of 391,000 known species).
one million animal and plant species were threatened with extinction.
This figure got far more attention than it deserved. They didn't present it as a significant finding; it was just a back-of-the-envelope calculation anyone could have done.
It includes vulnerable species, which includes those with a 10% chance of extinction in 100 years, and those with a stable population of fewer than 1,000 mature individuals (plus various other alternative inclusion criteria).
With good conservation, vulnerable species often move from vulnerable to 'least concern', as happened with the humpback whale in 2008.
The basis of their calculation was a guess that, across most groups of species, 25% are threatened with extinction, based on the 27% figure for the relatively few species that have been assessed for the IUCN Red List.
They said that for insects there is good reason to suppose it is lower, but it's unlikely to be less than 10%. Around 5.5 million of those 8 million species are insects, so, with that conservative estimate of 10%, that's about 0.5 million insect species threatened (including vulnerable).
So, it's based on a rough extrapolation to the eight or so million species of eukaryotes - that includes insects and minute sea creatures, many probably needing a microscope to see.
Most species haven’t been assessed at all. The 2011 paper which may be their source said that at the time, 86% of existing species had not yet been described.
Obviously not many of those million species can be higher plants; there just aren't enough of them. The 2011 paper which may be their source estimates 298,000 plant species, so a quarter of those would be 74,500 plants that either have a 10% chance of extinction in 100 years or have a world population of fewer than 1,000 mature individuals (+ various other alternative inclusion criteria).
They give many more figures in the Media Release
I wrote up the IPBES report as Let's Save A Million Species, And Make Biodiversity Great Again, UN Report Shows How
5G
You might think this is the place to look for material on whether there are any possible dangers of 5g. However, in an eccentric decision, editors of this article remove any sections on the topic, as here: Removed Dangers of 5g, and their explanation on the talk page here.
Wikipedia does have a separate article on the topic of Mobile phone radiation and health, but material from that article is not permitted here and this page doesn't link to it. Sadly that article also has almost nothing on 5g and health.
In 2011 the WHO / International Agency for Research on Cancer (IARC) classified radio frequency EM fields as possibly carcinogenic to humans (Group 2B).
If there is a risk, it's a tiny one, of a certain type of brain cancer. You can take measures to avoid the risk, such as not holding a cellphone near your head while downloading large files.
IARC classifies radiofrequency electromagnetic fields as possibly carcinogenic to humans
Mayo clinic puts the situation like this:
The bottom line? For now, no one knows if cellphones are capable of causing cancer. Although long-term studies are ongoing, to date there's no convincing evidence that cellphone use increases the risk of cancer. If you're concerned about the possible link between cellphones and cancer, consider limiting your use of cellphones — or use a speaker or hands-free device that places the cellphone antenna, which is typically in the cellphone itself, away from your head.
A couple of hundred scientists have signed an appeal to the WHO saying it needs further investigation - not 5g particularly, but cell phones and wifi generally.
The evidence isn't very good yet but they think there is enough to be worth investigating on a precautionary level.
These scientists are concerned about a very minute risk of cancer. Though too small to be noticed directly, even a few deaths in a million would be something to take precautions to prevent.
There is a lot of conspiracy theory nonsense on the topic, however. See for instance
another massive planet hiding out in the Kuiper belt.
If it exists it is the size of Mars, half the diameter of Earth. Planet 9 is the size of Neptune, four times the diameter of our Earth.
Rogue planets are planets that have been ejected from the systems for one reason or another and have traveled close enough to the Sun to be caught in its gravity. More than mysteriously changing orbits of KBO’s, rogue planets can present a danger to other planets in the system. In simulations, rogue planets get kicked out of the solar system 60 percent of the time, but in 10 percent of cases, these rogue planets destroy another planet on the way out.
This is not a rogue planet; if it exists, it has been circling way beyond Neptune for billions of years, so none of this applies. It can't even get as close to the sun as Neptune.
Could Destroy Earth
Can't destroy Earth
Rogue Planet
Not a rogue planet
rendering it totally uninhabitable in around five billion years.
For a detailed future timeline see my Once the sun dies and the solar system goes dark, would Europa (if it had life) continue to provide a liquid ocean and a biosphere?
However, in an extreme case of future planning, University of Glasgow space engineer Dr Matteo Ceriotti has proposed some radical solutions to help Earth change orbit.
For a more thorough account see his Wandering Earth: rocket scientist explains how we could move our planet
The rest of this article is reasonable, if sometimes a little sketchy in the details.
Another possibility is to shield Earth with giant shades in space.
Scientists know the Sun will one day change from its current state to a red giant, a star far larger than its current size.
Just to say that conditions remain comfortable for humans through to around 100 million years from now, when a moist greenhouse is a possibility. That's about as long as it took for humans to evolve from tiny shrew sized creatures clambering around trees at the time of the dinosaurs. Debunked: Climate change will make the world too hot for humans
Man-made climate change is an immediate concern, meaning a global warming apocalypse appears increasingly likely.
Not true, see my Climate Change Will NOT End Human Civilization By 2050, 'Overblown Rhetoric And Unsupportable Doomist Framing' Says Michael Mann
Earth EXODUS: Plan to alter Earth’s ORBIT to escape being eaten by dying Sun
This is for billions of years into the future!
Deinococcus radiodurans also failed to grow under low atmospheric pressure, under 0 °C, or in the absence of oxygen
This is not surprising - it shows that radiodurans is an obligate aerobe. This doesn't mention the surprising result mentioned in this cite that S. liquefaciens was able to grow under these conditions.
A more accurate summary of the source would be something like this:
In other simulations, Serratia liquefaciens strain ATCC 27592 was able to grow at 7 mbar, 0°C, in CO2-enriched anoxic atmospheres. This was surprising, as it is a generalist that occurs in many terrestrial niches, not an extremophile. Two extremophiles, Deinococcus radiodurans strain R1 and Psychrobacter cryohalolentis strain K5, were both unable to grow in anoxic conditions (making them obligate aerobes), and R1 was also unable to grow below 0°C or at 7 mbar.
Source says:
Only Serratia liquefaciens strain ATCC 27592 exhibited growth at 7 mbar, 0°C, and CO2-enriched anoxic atmospheres ... The growth of S. liquefaciens at 7 mbar, 0°C, and CO2-enriched anoxic atmospheres was surprising since S. liquefaciens is ecologically a generalist that occurs in terrestrial plant, fish, animal, and food niches.
Even the hardiest cells known could not possibly survive the cosmic radiation near the surface of Mars since Mars lost its protective magnetosphere and atmosphere
This is based on earlier papers that studied dormant life, because back then they thought that the present day Mars surface was too cold and dry for life, but that life could have survived in dormant form from times when the axial tilt varied, the atmosphere thickened and water briefly flowed on Mars. Any such life would be buried deep, because the cumulative effects of ionizing radiation over millions of years can sterilize anything.
However, if the life can continue to grow, reproduce and heal itself, then over 500 years even E. coli, one of our most radiosensitive microbes, is reduced by only 90%. Levels are similar to the interior of the ISS, and are not lethal to microbes unless they are dormant for long periods.
From the MSL RAD measurements, ionizing radiation levels from cosmic radiation are so low as to be negligible. The intermittent solar storms increase the dose only for a few days, and the Martian surface provides enough shielding that the total dose from solar storms is less than double that from cosmic radiation. Over 500 years the Mars surface would receive a cumulative dose of less than 50 Gy, far less than the dose at which 90% of even a radiation sensitive bacterium such as E. coli would die (LD90 of ~200-400 Gy). These facts are not used to distinguish Special Regions on Mars.
Cite here
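To get a rough feel for those numbers, here's a minimal sketch. The dose rate is my assumption based on published MSL RAD surface measurements (roughly 0.21 mGy per day), and the log-linear survival model with a single D10 value is a deliberate simplification:

```python
# Cumulative Mars surface dose over 500 years vs. E. coli radiation tolerance.
# Assumed values - not taken from the article being discussed:
dose_rate = 0.076   # Gy per year (~0.21 mGy/day, MSL RAD ballpark)
years = 500
d10 = 300           # Gy for a 90% kill; midpoint of the ~200-400 Gy LD90 range

total_dose = dose_rate * years   # ~38 Gy, under the 50 Gy quoted above
log_kills = total_dose / d10     # fraction of one 90% reduction
surviving = 10 ** (-log_kills)

print(f"500-year dose: {total_dose:.0f} Gy")
print(f"surviving fraction: {surviving:.0%}")  # ~75%, even for a radiosensitive microbe
```

Even doubling the assumed dose rate to allow for solar storms, as above, leaves the 500-year total well under one LD90.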
NASA have the search for extant life as one of their two top priorities for searching for life on Mars.
A special region on Mars, for the purposes of planetary protection, is a region classified by COSPAR where terrestrial organisms are likely to propagate, or interpreted to have a high potential for the existence of extant Martian life forms.
See
A major goal is to preserve the planetary record of natural processes by preventing human-caused microbial introductions, also called forward contamination.
Uncited assertion. Not the main goal. A few microbes on the Martian surface would not obscure the planetary record in the frozen regolith. The main goal is to protect future science experiments, so that they don't find Earth microbes when searching for extant Mars organisms.
Cassie Conley, NASA's planetary protection officer, puts it like this: https://youtu.be/qk-Ycp5llEI
Third video on overview page of the NASA Office of Planetary Protection.
“So we have to do all of our search for life activities, we have to look for the Mars organisms, without the background, without the noise of having released Earth organisms into the Mars environment”
This entire section is OR and SYNTHESIS. NASA have the search for extant life as one of their two top priorities for searching for life on Mars. See https://encyclopediaofastrobiology.org/wiki/Protecting_Mars_special_regions_with_potential_for_life_to_propagate
link the Deccan Traps eruption to the asteroid impact that created the nearly antipodal Chicxulub crater
This theory of antipodal focusing is disproved, according to the intro of the cited source. The source says that the Deccan Traps were separated from the Chicxulub crater by an epicentral distance of ~130°, so they were not antipodal (which would be 180°). It also says that a Chicxulub-size impact doesn't appear capable of generating a large mantle melting event in any case. The cite says this disproves the antipodal theory.
Instead, the new idea presented in the cite is that the impact generated a magnitude 9 earthquake worldwide, and this caused volcanism to increase everywhere, through a now well established process by which nearby earthquakes can trigger increased volcanism. The Deccan Traps started well before the impact, due to a "plume head" rising through the mantle, something that happens every 20-30 million years, but after the impact the eruptions sped up and the chemistry changed.
Cite says:
The possibility that an impact at Cretaceous-Paleogene time caused Deccan volcanism has been investigated since the discovery of the iridium anomaly at Cretaceous-Paleogene boundary, with an emphasis on antipodal focusing of seismic energy. However, the Deccan continental flood basalts were not antipodal to the 66 Ma Chicxulub crater at the time of the impact, but instead separated by an epicentral distance of ~130°. Also, a Chicxulub-size impact does not in any case appear capable of generating a large mantle melting event. Thus, impact-induced partial melting could not have caused the initiation of Deccan volcanism, consistent with the occurrence of Deccan volcanism well before Cretaceous-Paleogene/Chicxulub time.
Instead, Deccan volcanism is widely thought to represent the initial outburst of a new mantle plume “head” at the beginning of the Réunion hotspot track
Accompanying press release from UC Berkeley says:
Michael Manga, a professor in the same department, has shown over the past decade that large earthquakes – equivalent to Japan’s 9.0 Tohoku quake in 2011 – can trigger nearby volcanic eruptions. Richards calculates that the asteroid that created the Chicxulub crater might have generated the equivalent of a magnitude 9 or larger earthquake everywhere on Earth, sufficient to ignite the Deccan flood basalts and perhaps eruptions many places around the globe, including at mid-ocean ridges.
“It’s inconceivable that the impact could have melted a whole lot of rock away from the impact site itself, but if you had a system that already had magma and you gave it a little extra kick, it could produce a big eruption,” Manga said.
Similarly, Deccan lava from before the impact is chemically different from that after the impact, indicating a faster rise to the surface after the impact, while the pattern of dikes from which the supercharged lava flowed – “like cracks in a soufflé,” Renne said – are more randomly oriented post-impact.
“There is a profound break in the style of eruptions and the volume and composition of the eruptions,” said Renne. “The whole question is, ‘Is that discontinuity synchronous with the impact?’”
Another cite here: "The Conversation", which is written by academics and is WP:RS.
Our observations suggest the following sequence of events at the end of the Cretaceous period. Just over 66 million years ago, the Deccan Traps start erupting – likely initiated by a plume of hot rock rising from the Earth’s core, similar in some ways to what’s happening beneath Hawaii or Yellowstone today, that impinged on the side of India’s tectonic plate. The mid-ocean ridges and dinosaurs continue their normal activity.
About 250,000 years later, Chicxulub hits off the coast of what will become Mexico. The impact causes a massive disruption to the Earth’s climate, injecting particles into the atmosphere that will eventually settle into a layer of clay found across the planet. In the aftermath of impact, volcanic activity accelerates for perhaps tens to hundreds of thousands of years. The mid-ocean ridges erupt large volumes of magma, while the Deccan Traps eruptions flood lava across much of the Indian subcontinent.
clathrate gun hypothesis
The lede doesn't mention the major literature review by the USGS in December 2016, which concluded that evidence is lacking for the original hypothesis; the similar conclusion by the Royal Society in 2017, that there is a relatively limited role for climate feedback from dissociation of the methane clathrates; or the CAGE research group (Centre for Arctic Gas Hydrate, Environment and Climate), which concluded that the methane hydrates formed over 6 million years ago and have been slowly releasing methane for 2 million years, independent of warm or cold climate. Details here: Clathrate gun hypothesis
A 2018 published review concluded that the clathrate gun hypothesis remains controversial, but that better understanding is vital.
This is NOT their conclusion. Their conclusion was that it is unlikely. Quotes from the paper:
"Although the clathrate gun hypothesis remains controversial (21), a good understanding of how environmental change affects natural CH4 sources is vital in terms of robustly projecting future fluxes under a changing climate."
Then later:
"Nevertheless, it seems unlikely that catastrophic, widespread dissociation of marine clathrates will be triggered by continued climate warming at contemporary rates (0.2◦C per decade) during the twenty-first century"
They did, however, urge caution about extraction of methane clathrates as a fuel, as this could lead to leaks of methane:
"As discussed previously (Section 4.1), the stability of CH4 clathrate deposits may already be at risk from climate change. Accidental or deliberate disturbance, due to fossil fuel extraction, has the potential for extremely high fugitive CH4 losses to the atmosphere."
For details with more cites, see Clathrate gun hypothesis
It was estimated in November 2015 that only 18 Jews remain in Syria
Cite has no mention of the number of Jews left in Syria or the number 18.
More recently, in September 2016, the last Jews of Aleppo were rescued hence ending that last Jewish presence in Aleppo
There may be Jews left in Syria, according to the Jerusalem Post.
First, one member of the family rescued in 2016 is still there, because she was married to a Muslim man and signed conversion papers, though she says she didn't really convert.
According to Motti Kahana, who engineered the operation, there are no Jews left in Aleppo, aside from one member of the Halabi family, Linda, whose immigration to Israel was denied, citing her conversion to Islam – the source of a dispute between the Jewish Agency and Kahana.
The latter still sends kosher food to the woman, and maintains that though she signed conversion papers – which is required by Syrian law when marrying a Muslim – she did not really convert.
Also, another family in Aleppo claims they are Jewish and is asking for aliyah.
A family from war-torn Aleppo is appealing to the State of Israel for refuge, citing their Jewish heritage, Army Radio reported on Sunday.
“There is nobody who can help us to get out of this place,” said 30-year-old Razan (real name withheld) in an audio recording translated from Arabic into Hebrew and aired on the radio station. “We are asking that the Israeli government does not abandon us, but helps us get out of here to another country. I ask that the government demands from the entire world to do this. All my love and loyalty is to this religion [Judaism].”
Experts say there are still some Jews remaining in other parts of Syria. Elizabeth Tzurkov, a Syria researcher at Israeli think tank the Forum for Regional Thinking, told Army Radio: “... a number of Syrians have approached me who are descendants of Jewish women, who converted to Islam or who did not convert, and inquired how they can move to Israel.”
The typical tidal range in the open ocean is about 0.6 metres (2 feet)
In the open ocean it can be anything from 0 to around a meter; 0.6 meters is on the high side.
Good map of tidal ranges here https://www.researchgate.net/figure/Global-tidal-ranges-C-2015-NASA-Goddard-Space-Flight-Center-NASA-Jet-Propulsion_fig3_317370107
Perigean spring tide
NOAA FAQ about Perigean spring tides is a useful source too
Mr. Woods' book examines the occurrences of coastal flooding through history. What he discovered is that coastal flooding did occur when there was a strong onshore wind, such as a hurricane or nor'easter, which occasionally occurred at the same time as a "perigean spring tide."
The problem has been that a number of people have misinterpreted the information presented in this book to mean that coastal flooding would occur whenever the "perigean spring tides" occur. This has led to articles published in various media sources that incorrectly predict widespread coastal flooding at the times of the "perigean spring tides," causing needless concern.
Most people who live along the coastline know that coastal flooding can occur whenever there are strong onshore winds, whether there is a "perigean spring tide" or not. Additionally, this flooding will be worse if the storm strikes around the time of high tide rather than around the time of low tide.
But in ALL cases, it is the storm winds which cause the coastal flooding, not the tides. Coastal flooding is the result of meteorology (the weather) not astronomy (normal tidal fluctuations). All astronomical considerations are accounted for in the NOS tide and tidal current predictions. https://co-ops.nos.noaa.gov/faq2.html#15
The state is required to obtain at least 33% of its electricity from renewable resources by 2020, and 50% by 2030, excluding large hydro
Out of date. The cited page now says that under SB 100 California is required to produce 60% renewables by 2030, and all electricity from carbon-free sources by 2045.
The cite is to a preprint, not a WP:RS. 81 is likely a typo for 18. The impactor size is most often given as 10-15 km:
"Asteroids striking the Earth typically [Minton and Malhotra, 2010] have an impactor density of 2680 kg/m3and an impact velocity of 20 km/s.Assuming these properties, modern scaling relations indicate that a 10–15 km diameter projectile [Collins et al., 2008] created the 170 km diameter Chicxulub crater"
Parkos, D., Alexeenko, A., Kulakhmetov, M., Johnson, B.C. and Melosh, H.J., 2015. NOx production and rainout from Chicxulub impact ejecta reentry. Journal of Geophysical Research: Planets, 120(12), pp.2152-2168
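For a feel of how such scaling relations work, here's a minimal sketch using the simplified pi-group scaling from Collins, Melosh & Marcus (2005) - not necessarily the exact relations the paper used, and the 45° impact angle and target density are my assumed values:

```python
import math

# Simplified crater scaling (Collins, Melosh & Marcus 2005, Earth Impact Effects).
# Assumed inputs - illustrative only:
L = 14_000                # projectile diameter, m (middle of the 10-15 km range)
rho_i = 2680              # impactor density, kg/m^3 (Minton and Malhotra, 2010)
rho_t = 2700              # target density, kg/m^3 (assumed crustal rock)
v = 20_000                # impact velocity, m/s
g = 9.81                  # surface gravity, m/s^2
theta = math.radians(45)  # assumed impact angle

# Transient crater diameter, in metres:
D_tc = (1.161 * (rho_i / rho_t) ** (1/3) * L**0.78 * v**0.44
        * g**-0.22 * math.sin(theta) ** (1/3))

# Final rim diameter for a complex crater (D_c ~ 3.2 km on Earth), in km:
D_fr_km = 1.17 * (D_tc / 1000) ** 1.13 / 3.2 ** 0.13

print(f"transient crater: {D_tc/1000:.0f} km")  # ~84 km
print(f"final crater: {D_fr_km:.0f} km")        # ~150 km
```

With a 15 km projectile the final diameter comes out nearer 160 km, so a 10–15 km impactor is broadly consistent with the ~170 km Chicxulub crater.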