726 Matching Annotations
  1. Nov 2019
    1. Fractional cloud coverage tends to be highest around sunrise and lowest during the afternoon. The thinning (and in some cases the breakup) of the overcast during the daytime is due to the absorption of solar radiation just below the cloud tops (see Fig. 4.30

      Diurnal timing of albedo affects the climatic timescale energy balance. Might it change with climate?

    2. evaporation of the drizzle drops in the subcloud layer absorbs latent heat. The thermodynamic impact of the downward, gravity-driven flux of liquid water is an upward transport of sensible heat, thereby stabilizing the layer near cloud base

      Stabilization by drizzle

    3. In contrast, cooling from above drives closed cell convection

      Albedo of the Earth depends scarily much on this delicate bistable (two-regime) solution for PBL-top clouds!

    4. The area indicated by the hatched region represents the total amount of heat input into the bottom of the boundary layer from sunrise until time t1

      Conserved variable with height diagram, subject to a stability limit. Energy flux "fill the area" game.

    5. The stable boundary layer near the ground consumes TKE, resulting in weak and sporadic turbulence there

      Surface friction consumes it; stability merely prevents downward turbulent transport of TKE that would refresh the slowed winds.

    6. Bowen ratio over the oceans decreases with increasing sea surface temperature. Typical values range from around 1.0–0.5 along the ice edge to less than 0.1 over the tropical oceans where latent heat fluxes dominate

      Sometimes I see "evaporative fraction" EF used. That is clearer in its meaning.
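
      The two measures are directly related; with surface sensible heat flux H and latent heat flux LE, a quick sketch of the identity:

      ```latex
      \mathrm{Bo} = \frac{H}{LE}, \qquad
      \mathrm{EF} = \frac{LE}{H + LE} = \frac{1}{1 + \mathrm{Bo}}
      ```

      So the quoted ice-edge values Bo ≈ 1.0–0.5 correspond to EF ≈ 0.5–0.67, and tropical Bo < 0.1 to EF > 0.9.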

  2. Oct 2019
    1. FIG. 7. The horizontal distribution of (a),(b) ω, (c),(d) ω_D, (e),(f) ω_Q, and (g),(h) ω_aqg at 500 hPa on day 0. The white dashed contour lines denote a heavy-rainfall area (day-0 precipitation greater than 20 mm day−1 in ECN and greater than 12 mm day−1 in the SUS

      Composite omega patterns add up to the total pretty well, on average

    2. To separate these signals, we decompose each meteorological variable into an EPE-related background component (means from day −13 to day −4 and from day +4 to day +13) and an EPE-related synoptic-scale component (the differences between the total and the background component)

      This should have been mentioned sooner; this is something like the 10th use of the words "background" and "synoptic".
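
      A minimal sketch of the quoted decomposition, assuming a daily series and an index pointing at the event day (`decompose` is a hypothetical helper name, not from the paper):

      ```python
      def decompose(series, day0_index):
          """Background = mean over days -13..-4 and +4..+13 relative to
          the event day; synoptic = total minus background (as quoted)."""
          offsets = list(range(-13, -3)) + list(range(4, 14))
          window = [series[day0_index + d] for d in offsets]
          background = sum(window) / len(window)
          synoptic = series[day0_index] - background
          return background, synoptic
      ```

      With 27 days of data centered on the event, the two 10-day flanks define the background and the residual is the synoptic-scale anomaly.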

    1. The expression for ω can be cast in various forms that, although mathematically equivalent, are open to markedly different interpretation.

      QG omega equation: all forms

    1. MERRA-2 data collections

      2D datasets may be browsed at https://goldsmr4.gesdisc.eosdis.nasa.gov/dods/ 3D datasets at https://goldsmr5.gesdisc.eosdis.nasa.gov/dods/

      To use a data collection, you must make a NASA Earthdata account. You can put your USERNAME and PASSWORD in the URL like this: dods://USERNAME:PASSWORD@goldsmr5.gesdisc.eosdis.nasa.gov/dods/M2I3NVAER

      Enter that in IDV's Dashboard: Data Choosers tab, General --> URLs, URL box. The simpler software Panoply will also accept it.

      Or, if you use the plain URL, IDV or Panoply or other software should prompt you for credentials.
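
      A small sketch of assembling the credentialed DODS URL described above (`merra2_dods_url` is a hypothetical helper; the hostnames and the dataset name M2I3NVAER come from these notes, and percent-encoding guards against special characters in the password):

      ```python
      from urllib.parse import quote

      def merra2_dods_url(username, password, dataset, three_d=True):
          """Build a dods:// URL for a MERRA-2 GrADS Data Server collection.

          3D collections live on goldsmr5, 2D collections on goldsmr4.
          """
          host = ("goldsmr5.gesdisc.eosdis.nasa.gov" if three_d
                  else "goldsmr4.gesdisc.eosdis.nasa.gov")
          user = quote(username, safe="")
          pw = quote(password, safe="")
          return f"dods://{user}:{pw}@{host}/dods/{dataset}"

      # Example with placeholder credentials:
      url = merra2_dods_url("USERNAME", "PASSWORD", "M2I3NVAER")
      ```

      Whether embedded credentials are honored depends on the client; the plain-URL-plus-prompt route above is the fallback.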

    1. the nature of turbulence is strongly modulated by heat fluxes and drag (momentum fluxes) at the surface.

      back to a more narrative exposition, after the formalism-heavy section 9.1

    2. t* is of order 15 min, which corresponds to the turnover time for the largest convective eddy circulations, which extend from the Earth’s surface all the way up to the capping inversion

      turnover timescales

    3. Zeroth-order closure. In this case, neither Eq. (9.10) nor (9.11) is retained. Instead, the mean flow state is parameterized directly. This approach, called similarity theory

      Zeroth order closure directly assumes the large scale solution

    4. analogy with radiative fluxes in the expression for radiative heating rates (4.52)

      CONVERGENCE of the statistical heat flux is a heating rate by the scales being treated statistically

    5. advection of TKE by the mean wind, M is mechanical generation of turbulence, B is buoyant generation or consumption of turbulence, Tr is transport of turbulence energy by turbulence itself, and ε is the viscous dissipation rate.

      treating TKE as a continuous scalar that is advected, generated by shear, generated by buoyancy, diffused, and dissipated.
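
      Schematically, with the symbols as quoted (TKE per unit mass ē, and the material derivative D/Dt including advection by the mean wind), the budget reads:

      ```latex
      \frac{D\bar{e}}{Dt} = M + B + Tr - \varepsilon
      ```

      Exact signs and term groupings follow the text's own equation; this is just the bookkeeping shape of "advected, generated by shear, generated by buoyancy, diffused, and dissipated."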

    6. for any half-hour period, there is a well-defined mean temperature and velocity; the range of temperature and velocity fluctuations measured is bounded (i.e., no infinite values); and a statistically robust standard deviation of the signal about the mean can be calculated. That is to say, the turbulence is not completely random; it is quasi-random

      How long is long enough?

  3. Sep 2019
    1. local and non-local energy diffusion across the wavenumbers, with all Fourier modes feeling a sort of thermal bath described by a Gibbs-ensemble

      another idea of saturation or equilibrium

    2. based on the idea that only the mean flux, ϵ_in, plays a statistical role in the inertial range [2]. In such a case, Kolmogorov derived the celebrated −5/3 power law

      yes it starts with the classic
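
      The classic result, in its usual form (C the Kolmogorov constant, ϵ_in the mean energy flux through the inertial range):

      ```latex
      E(k) = C\,\epsilon_{in}^{2/3}\,k^{-5/3}
      ```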

    3. Up until now, all manipulations leading to the global and to the scale-by-scale energy balances (8) and (15)–(17) are exact. In order to proceed further we need to make some assumptions.

      framework only, so far

    4. The presence of a cascade requires that there exists a range of scales where all terms on the RHS of (15) are vanishingly small.

      peculiar logical / lexical construct

  4. Aug 2019
    1. The prescribed radiative cooling rate is at the default value (−1.5 K/day) in the first control simulation (BASE). It is reduced by half to −0.75 K/day throughout the troposphere in HEAT

      Small domain, so little cooling, must be quite intermittent

  5. Jun 2019
    1. Equation (6) illustrates that the magnitude of effective buoyancy depends on the local second derivative of B, rather than the simple sign and magnitude of B

      Ohh 3D inverse Laplacian of HORIZONTAL Laplacian
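
      Schematically, the structure the note points at: the effective buoyancy β is the 3D inverse Laplacian applied to the horizontal Laplacian of the Archimedean buoyancy B (signs and boundary conditions as in the paper's Eq. (6)):

      ```latex
      \nabla^2 \beta = \nabla_h^2 B
      \quad\Longrightarrow\quad
      \beta = \nabla^{-2}\!\left(\nabla_h^2 B\right)
      ```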

  6. www.mv.helsinki.fi
    1. long-enduring disintegration of science into specialties, we need to re-integrate scientific reasoning into a new holistic worldview.

      THERE IT IS! The zeitgeist hunger as I see it: Re-synthesis of reductionism's bits. With necessary and actually very useful approximations (essentials, not fundamentals!). Maybe the information age helps make it possible, and "semi-objective" (interpersonally shareable) if not unique (dominant Master Narratives). Infodynamics bookkeeping tools and concepts help. Multivocality of unifying narratives is an increasingly undeniable facet of the greater truth of the world.

    2. problems of science are not so much in Nature itself but in our own thoughts about Nature

      Yes. Denying ourselves the power of causality narratives, especially high-level ones like (nonunique) teleology stories, because they are misinterpreted as attribution of those to nature.

    3. the nature and necessity of explanations, and, above all, the fundamental, but often unrecognized, obstacles to obtaining them

      More reading I ought to do... when will I ever find time to write?

    4. Modern science has convinced us that nothing that is obvious is true and that everything that is magical, improbable, extraordinary, gigantic, microscopic, heartless, or outrageous is scientific

      modernity was radical in its 1920s day, now is an edifice or institution to be transcended

    5. not random but teleological, yet without a preset goal

      teleology is a strategy to better use the frail human mind (an aspect of epistemology), not a characteristic (or not) of nature!

    6. Everything is evolving

      Yes but most of the action is in cyclostationary orbits we can learn a lot from, and perhaps then apply to the slight secular trends.

    7. Given that today that fundamental scientific questions about time, space, matter, life, and consciousness remain unanswered, more precise measurements will not help. Instead, we need to unearth and reexamine those of our beliefs from which the questions stem.

      awkward, has grammatical error - reframe

    8. Mathematics is the language of expressing natural laws, but a correct syntax, as such, is no guarantee of the truth of a script

      Indeed: I consider it an accounting system. Surprising implications can sometimes be derived from familiar premises, but a sequence of equalities ("derivation") often just looks like 0=0 repeated again and again to prop up a narrative, sometimes with some sleight of hand.

    9. The quantum is understood as the elemental constituent of everything that exists. It is postulated that this could be the underlying reason why all processes are essentially alike

      Really, is that the key? Reeks of old-school "modern" physicist mindset.

      Isn't it a higher level law of order in time (infodynamics), not the identicality of the hyper-reductionist's "atom", that makes the macro-regularities?

    10. The theories themselves influence

      "filter bubbles" are the new "turtles all the way down"!

      Bayesian filter divergence (nonuniqueness) might be the infodynamic paradigm?

    11. I do not pretend to master modern science in all of its complexity

      Nor can any human any more, which is part of the profundity of the times. We need "statistical infodynamics" (and rational inattention theory from economics) because nobody can ever again know it all.

    12. the book draws oncommon sense, onpractical wisdom, and oneveryday experience

      Yes, the outdoor peasant mind meeting the stifling temple mind of entombed-truths science!

    13. this theory of nonequilibrium thermodynamics

      Mention this sooner, before the excitable praise of its novelty? I was inspired by a Prigogine book back in high school, before I could understand it.

    14. understanding and insight. In fact, this newly revealed natural law of time makes sense

      But did it require a slight redefining of "understanding", "sense", and "insight"? Post-modern, without throwing away the power of the modern? I shall read on.

    15. explained by the fact that everything that exists is comprised of the same elemental constituents, quanta of light

      Ugh, reductionism run wild. I thought this book would be about the re-synthesis, a gesture toward a useful and rigorous science of patterns and entities, building girders strong enough to perch above reductionism's abyss of microdeterminism as the be all and end all of scientific description?

    16. As surprising as it may be, common patterns are ubiquitous

      Quite an artifact of an authorial mind deep in unfamiliar terrain, here in paragraph 2. Isn't the second phrase a tautology? I guess "common" has not been defined, and is used in an uncommon meaning here (to mean parallelism between distant layers of abstraction). If this is as true as tautology, who could find it surprising?? What mindset is presupposed of the reader here?

    1. the principle of least action does not, in itself, describe the trajectory of a planet or the course of a river, the free-energy principle will need to be unpacked carefully in each sphere of its application

      principle of least action in information theory

    2. the aim of philosophy “is to understand how things in the broadest possible sense of the term hang together in the broadest possible sense of the term.”

      i drink, and i know things

    3. Dark-Room agents can only exist if they can exist. The tautology here is deliberate, it appeals to exactly the same tautology in natural selection (Why am I here? – because I have adaptive fitness: Why do I have adaptive fitness? – because I am here).

      like the Rotunno-Klemp-Weisman theory of squall lines: it doesn't predict them, it explains them if they happen to exist

    4. Shannon set out this framework, with its beautifully simple, core idea of equating generation of information with reduction of uncertainty (i.e., “surprise”).

      shannon summary

    1. the boom of research relying on FEP just highlights there is room for deductive systematization and physics-first approaches in life science theorizing

      It's all good if activity and thinking are stimulated.

    2. free-energy theorists attempt an enormous variety of derivations from FEP. In that sense, FEP may be said to play the role of a first principle

      a putative or postulated principle perhaps

    3. the whole point of [FEP] is to unify all adaptive autopoietic and self-organizing behavior under one simple imperative; avoid surprises and you will last longer [...]

      or in other words the survivors we see because they last longer have avoided nasty surprises

    4. Free-energy theorists may simply reject the idea that adequate scientific representation of life science phenomena must target the component parts and operations and internal organization of mechanisms.

      back to what comprises "explanatory power"

    5. Mechanists would therefore conclude that FEP lacks explanatory power

      explanatory power for mechanists is something a teleological principle does not comprise

    6. constrain the possible structures and configurations that might perform those operations; but they are equally keen to emphasize that structural decompositions into modeled components within a mechanism can also constrain the possible functions and configurations performed

      like the maximum entropy production principle might constrain either the thing that does the job, or how well the job can get done

    7. By redescribing systems’ capacities in terms of their functional properties and dispositions, functional analysis offers scientists a way to tackle the target phenomenon. But it also offers the potential for prediction and explanation

      sounds like my JMSJ paper topic

    8. apparently at odds with mechanists’ emphasis that life science phenomena should be explained by appeal to mechanisms, and that adequate strategies for explanation in the life sciences should involve decomposing these mechanisms into component parts and operations

      reductionism vs. teleology

    9. Stipulative definitions, like living system as an attracting set in a phase space or adaptive behavior as behavior that reduces average surprise, provide the bridge principles that connect theoretical predicates from different disciplines, and that allow free-energy theorists to attempt the deductions needed to claim reductions of other principles to FEP

      Yes, "stipulative definitions": a nice word for self-fulfilling definitions?

    10. Since attracting sets are subsets of classically predefined phase spaces, organicists deny the assumption that all living systems’ characteristic behavior is aptly represented with an attracting set

      Yes if time is long, then the state space (Markov blanket) expands to include all other organisms (in lifetime) and species (in evolutionary time), and we are back to the game theory of contingent history.

    11. recognition) density q(Ψ, μ)

      Is this a mapping? (is that what density is?) between the unknowable \Psi and some smaller (perhaps categorical recognition) perception vector \mu? But why isn't that just encompassed in M?

    12. internal parameters μ

      μ was not defined. Also angle brackets <> (subscript q) are not defined.

      Is the word "variational" carrying a load here? Do I need to study that word?

    13. likelihood p(D = d_{t+1} | Ψ = ψ_{t+1}, A = a_t, M) and prior density p(Ψ = ψ_{t+1} | M), which jointly specify the generative model “entailed by” the system’s phenotype

      phenotype here is used very broadly to include M (which is not indexed as a function of time)

    14. the surprise of sampling some sensory outcome (or experiencing some sensory state) can be represented with the negative log probability: −log p(D = d_{t+1} | a_t, M). This measure quantifies the improbability

      "surprise" is a nice term for improbability, or I like "missing information" from A Farewell to Entropy book.
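
      A one-line sketch of the quoted measure (natural log; `surprise` is a hypothetical helper name):

      ```python
      from math import log

      def surprise(p):
          """Negative log probability: rarer outcomes carry more surprise."""
          return -log(p)

      # A certain outcome carries zero surprise; an improbable one, much more.
      print(surprise(1.0), surprise(0.5), surprise(0.01))
      ```

      Averaging surprise over outcomes gives the Shannon entropy, i.e. the "missing information."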

    15. “are confined to a bounded subset of states and remain there indefinitely”

      but the Markov blanket grows and grows with time for systems with memory, redefining the state space actually, so this statement ending in "indefinitely" sounds way too woolly to be satisfying.

    1. As a necessary condition for the reaction to occur at constant temperature and pressure, ΔG must be smaller than the non-PV (e.g. electrical) work, which is often equal to zero (hence ΔG must be negative)

      Sign of dG determines if a thing will happen spontaneously
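
      A minimal numeric illustration of the sign rule, assuming constant T and p with no non-PV work; the melting-ice numbers (ΔH ≈ 6010 J/mol, ΔS ≈ 22 J/mol/K) are standard textbook values:

      ```python
      def delta_g(delta_h, delta_s, temp):
          """Gibbs free energy change: dG = dH - T*dS (J/mol)."""
          return delta_h - temp * delta_s

      # Ice melting is spontaneous (dG < 0) only above ~273 K:
      print(delta_g(6010.0, 22.0, 298.0))  # negative: melts in a warm room
      print(delta_g(6010.0, 22.0, 263.0))  # positive: stays frozen in a freezer
      ```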

    1. Raymond Hide, whom I had met by chance during the last year of my PhD, told me of recent work by climatologist Paltridge [12] showing that the properties of Earth’s climate could be derived using the Principle of Maximum Entropy Production!

      maximum entropy production principle - early citation

    1. In addition, this transition condition for the cellular automaton adds a stochastic component in its evolution, compared with a strictly Boolean ruleset

      cumulus game of life stochastic game
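
      A toy sketch of the idea, assuming nothing about the paper's actual ruleset: a Conway-style cellular automaton on a torus where each cell obeys the Boolean rule only with probability p_follow (with p_follow = 1 it reduces to the deterministic rule):

      ```python
      import random

      def step(grid, p_follow=1.0, rng=random):
          """One CA update; each cell follows the Boolean rule with
          probability p_follow, else keeps its current state."""
          ny, nx = len(grid), len(grid[0])
          new = [row[:] for row in grid]
          for j in range(ny):
              for i in range(nx):
                  # count live neighbors with periodic (torus) wrapping
                  n = sum(grid[(j + dj) % ny][(i + di) % nx]
                          for dj in (-1, 0, 1) for di in (-1, 0, 1)
                          if (dj, di) != (0, 0))
                  target = 1 if (n == 3 or (grid[j][i] and n == 2)) else 0
                  if rng.random() < p_follow:
                      new[j][i] = target
          return new
      ```

      With p_follow = 1.0 a blinker oscillates with period 2; lowering p_follow injects the stochastic component the quote describes.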

    1. Conditioning for the common history by replacing time-delayed mutual information by a variant of Eq. (4) resolves this apparent paradox.

      eliminate synchronization (infinite velocity of information flow)

    2. example, take a bivariate time series (see Fig. 3) of the breath rate and instantaneous heart rate of a sleeping human suffering from sleep apnea

      sleep apnea example. Heart rate ramps up to gasping episodes.

    3. Either one can study transfer entropy as a function of the resolution, or one can fix a resolution for the scope of a study.

      Here is the way I have wanted to measure macro-entropy without having it dominated by the micro (thermodynamic) entropy.

    4. transfer entropy behaves like mutual information. If computationally feasible, the influence of a known common driving force Z may be excluded by conditioning the probabilities under the logarithm to z_n as well

      mutual information between including vs. excluding dependence of p on other processes
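
      A self-contained sketch of transfer entropy for discrete series with history length 1 (the usual definition, not necessarily the paper's exact normalization; conditioning on a common driver z_n as the quote suggests would follow the same counting pattern with one more coordinate):

      ```python
      from collections import Counter
      from math import log2

      def transfer_entropy(x, y):
          """T_{Y->X} in bits: extra predictability of x[n+1] from y[n],
          beyond what x[n] already provides (history length 1)."""
          n = len(x) - 1
          triples = Counter(zip(x[1:], x[:-1], y[:-1]))   # (x1, x0, y0)
          pairs_xy = Counter(zip(x[:-1], y[:-1]))         # (x0, y0)
          pairs_xx = Counter(zip(x[1:], x[:-1]))          # (x1, x0)
          singles = Counter(x[:-1])                       # x0
          te = 0.0
          for (x1, x0, y0), c in triples.items():
              p_joint = c / n
              p_full = c / pairs_xy[(x0, y0)]             # p(x1 | x0, y0)
              p_marg = pairs_xx[(x1, x0)] / singles[x0]   # p(x1 | x0)
              te += p_joint * log2(p_full / p_marg)
          return te

      # If x just copies y with a one-step lag, information flows y -> x only:
      # transfer_entropy(x, y) is near 1 bit, transfer_entropy(y, x) near 0.
      ```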

    1. preferred path forward for advancing physics in EMC models is one in which innovations from the community are socialized and introduced through strong collaborative working relationships between developers/scientists at EMC and those in the broader community

      ok, but how?

    2. In the numerical modeling community it is not uncommon for developers to be confounded when model innovations that look better “on paper”, or seem to perform better in different modeling frameworks, do not increase overall skill when implemented in complex, highly nonlinear modeling systems

      indeed

  7. May 2019
    1. GCM) is nudged towards 6-h reanalyses. The nudging is applied either in the whole tropical band or in a regional summer monsoon domain

      a monsoon nudging domain

    1. using the nudging/relaxation methodology first outlined in Klinker and Sardeshmukh (1992) and used subsequently by others including Douville et al. (2011) and Hall et al. (2013)

      nudging references

    1. shallow CAPE measures the integrated buoyancy for undiluted parcels only up to the midtroposphere

      This gives control of the convection to lower troposphere (bottom heavy) adiabatic cooling by the 2nd vertical mode of w, but is not labeled an "inhibition" effect. Instead, a QE story about these simple algebraic equations' closure was preferred.

  8. Apr 2019
    1. standard deviation equal to current uncertainty estimates for radiosonde vertical profiles

      Here is the key: radiosonde "uncertainty" sets the magnitude, while vertical structure is here

    2. Changes to MP produce a similar order-of-magnitude response in convective hydrologic cycle, dynamics, and latent heating as changes to IC

      How can microphysics and thermodynamics be compared, if the changes are incommensurate (different units, etc.)?

    1. Non-rotated PCs (left panels) and rotated components (right panels) for all variables at the Ranai upper-air sounding site.

      About 5 DOFs in the vertical for all fields

    1. Any comprehensive theory for the mesoscale spectrum should account for the presence of intermittent but very broad band forcing of the mesoscale by latent heating

      all scales convect

    2. resonant triad interaction of two IGWs with a balanced vortex mode, in which the vortex catalyzes the transfer of energy from large- to small-wavelength IGWs

      very specific mechanism

    3. Moist processes primarily enhance the divergent part of the spectrum, which has a relatively shallow spectral slope that resembles −5/3

      moist compared to dry

    1. forcing of and acts at all the scales, and unlike the classical turbulence theory, there is not a well-defined inertial subrange here. As also suggested by Waite and Snyder (2009), it is possible that the mesoscale kinetic energy spectrum does not arise from a cascade process.

      All scales convect, no inertial subrange or scale-local cascade is indicated. So where does -5/3 come from?

    2. The buoyancy production generated by moist convection, while mainly injecting energy in the upper troposphere at small scales, could also contribute at larger scales, possibly as a result of the organization of convective cells into mesoscale convective systems.

      A weak statement of the result as I see it, which is that all scales convect.

    1. instead of an error cascade from smaller to larger scales (upscale growth), errors in more complex flows tend to grow uniformly at all scales (up-magnitude growth)

      all scales convect perhaps

    1. The partitioning algorithm depends on the horizontal distribution of simulated reflectivity and vertical velocity, using three criteria (Steiner et al. 1995): intensity of reflectivity, peakedness (excess of reflectivity over a background value), and area within an intensity-dependent radius around a convective grid. At each time step of integration, the modified WRF code can automatically identify whether each grid point is categorized into convective, stratiform, or other

      Texture of simulated reflectivity, based on algorithms for observed reflectivity.
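
      A toy sketch of the texture-based partition, assuming illustrative settings: the peakedness curve is shaped like the Steiner et al. (1995) one, but the radii here are fixed counts of grid points and the numbers are not the published tuning.

      ```python
      def partition(dbz, bg_radius=5, conv_radius=1, intensity=40.0):
          """Label each grid point 'convective', 'stratiform', or 'other'
          from a 2D reflectivity field (dBZ), using the three quoted
          criteria: intensity, peakedness over background, and a radius
          of convective influence around each convective seed."""
          ny, nx = len(dbz), len(dbz[0])

          def box(j, i, r):
              return [(jj, ii)
                      for jj in range(max(0, j - r), min(ny, j + r + 1))
                      for ii in range(max(0, i - r), min(nx, i + r + 1))]

          seed = [[False] * nx for _ in range(ny)]
          for j in range(ny):
              for i in range(nx):
                  cells = box(j, i, bg_radius)
                  bg = sum(dbz[jj][ii] for jj, ii in cells) / len(cells)
                  # peakedness threshold shaped like Steiner et al. (1995)
                  if bg < 0:
                      peak = 10.0
                  elif bg < 42.43:
                      peak = 10.0 - bg * bg / 180.0
                  else:
                      peak = 0.0
                  if dbz[j][i] >= intensity or dbz[j][i] - bg >= peak:
                      seed[j][i] = True

          labels = [["stratiform" if dbz[j][i] > 0 else "other"
                     for i in range(nx)] for j in range(ny)]
          for j in range(ny):
              for i in range(nx):
                  if seed[j][i]:
                      for jj, ii in box(j, i, conv_radius):
                          labels[jj][ii] = "convective"
          return labels
      ```

      A single strong cell embedded in weak echo gets labeled convective, along with a small surrounding patch, while the rest of the echo is stratiform.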