347 Matching Annotations
  1. Apr 2022
    1. José A. Estarelles says: 19 April 2022 at 11:42 am Hello, I am the translator of this work by Wolfgang Smith. I have to say that it is not true that the author is unaware of, or ignores, decoherence, nor what some want to draw from it, as his 2019 reply to Dr. McAndrew, «To Be or Not To Be an Apple» (philos-sophia.org/be-or-not-be-apple), shows: "The famous 'measurement problem', … despite famous claims to the contrary, has in fact not been resolved to this day: neither the tour de force of Bohmian mechanics, nor the 'many worlds' theory, nor 'decoherence', nor any other of the [physics community's] proposals has yet managed to resolve that dilemma." In short, the collapse of the state vector is by no means an "obsolete problem", nor one of merely "historical interest". It is a fully live question, however uncomfortable it may be for many. Smith goes deeper into decoherence in chapters 2 and 4 of his book "The Vertical Ascent" (2020), which can largely be consulted on the author's website: "The Tripartite Wholeness" (philos-sophia.org/the-tripartite-wholeness) and "Lost in Math: The Particle Physics Quandary" (philos-sophia.org/particle-physics-quandary), where, as in another apt comment here, Sabine Hossenfelder is also mentioned. That said, and leaving aside the challenge that decoherence itself poses for the totalist interpretation of quantum physics, I suppose you will see why this review of the book seems to me devoid of critical value, though I am grateful for the interest shown. Reply Francisco R. Villatoro says: 19 April 2022 at 1:5
      • ATTENTION!!!
      • WELL SAID!: "[...] I suppose you will see why this review of the book seems to me devoid of critical value, though I am grateful for the interest shown."
    2. Alf says: 21 April 2022 at 7:48 pm As a curiosity in the history of physics: since when, and why (I mean empirical evidence or mathematical formalisms), have R processes (such as the collapse of the state vector) been rendered obsolete? Reply Francisco R. Villatoro says: 22 April 2022 at 12:59 am Alf, the pioneers' idea was that the collapse of the wave function was a dynamical mechanism distinct from unitary evolution. The path-integral formalism of quantum mechanics and the many-worlds interpretation gradually changed the mindset of most physicists. During the 1970s the idea that the only dynamics needed is unitary evolution, and that collapse is an unnecessary artifice, slowly took hold. By the end of the 1970s, quantum mechanics textbooks began to omit the measurement problem as something every physicist had to know, and collapse stopped being mentioned in most of those textbooks, which discussed only unitary evolution. By the 1980s collapse had been relegated to specialized monographs and to research articles on physics beyond quantum mechanics. Even so, the idea that collapse is an obsolete concept (besides being unnecessary) did not prevail until the beginning of the 21st century. Reply Alf says: 22 April 2022
      • ATTENTION!!!
      • @Francis: CITATION NEEDED!!! REFERENCES???
    3. joan says: 18 April 2022 at 3:17 pm What do you think of this book: EL ENIGMA CUANTICO: ENCUENTROS ENTRE LA FISICA Y LA CONCIENCIA, by BRUCE ROSENBLUM and FRED KUTTNER Reply Francisco R. Villatoro says: 19 April 2022 at 9:32 am Joan, the book by Rosenblum and Kuttner seems to start well with its discussion of the history leading up to EPR and the Bell inequalities, but then, when it reaches the crux of the "quantum enigma", it sinks into misery; instead of discussing what is known about the subject it skids off into the topic of consciousness and ends up deceiving the reader, who expected a discussion of the "quantum enigma" and instead finds a metaphysical discussion related neither to the title nor to the rest of the book. In my opinion, a well-written, easy-to-read book, but hardly recommendable for anyone who wants to learn something about quantum physics.
      • OK
    4. Francisco R. Villatoro says: 15 April 2022 at 9:48 am Javi, the collapse of the wave function does not exist; it is an obsolete concept. Unitary evolution applies both to the measuring system and to the measured system throughout the whole measurement process; when the measuring system is so complicated that it cannot be described on an equal footing with the measured system, one resorts to updating the state of the measured system once the measurement has concluded, in order to make the evolution of the two compatible (this is what is called projection of the state after the measurement). If this update is done correctly, future measurements will give results compatible with it; if it is done incorrectly, there will be discrepancies, which will signal that it was done badly. There is no mystery in any of this.
      • SEE Fernando's reply
      • Fernando says: 16 April 2022 at 5:40 am
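
      A minimal NumPy sketch (my own, not taken from the comment) of the textbook "state update after measurement" that Villatoro describes: a qubit in superposition is measured in the computational basis, the state is projected and renormalized, and a repeated measurement then agrees with the first, which is the comment's point about a correctly performed update.

      ```python
      # Minimal sketch (not from the comment itself): Born-rule sampling plus the
      # textbook projection/update of a qubit state after a measurement.
      import numpy as np

      rng = np.random.default_rng(0)

      # |psi> = (|0> + |1>)/sqrt(2)
      psi = np.array([1.0, 1.0], dtype=complex) / np.sqrt(2)

      # Projectors onto the two measurement outcomes
      P = [np.diag([1.0, 0.0]).astype(complex), np.diag([0.0, 1.0]).astype(complex)]

      def measure_and_update(state):
          """Sample an outcome with the Born rule, then project and renormalize."""
          probs = [np.real(state.conj() @ Pk @ state) for Pk in P]
          outcome = rng.choice([0, 1], p=np.array(probs) / sum(probs))
          new_state = P[outcome] @ state
          new_state /= np.linalg.norm(new_state)   # renormalize after projection
          return outcome, new_state

      first, psi = measure_and_update(psi)
      second, psi = measure_and_update(psi)        # repeat the measurement
      print(first, second)  # the second outcome always reproduces the first
      ```
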
    5. Wolfgang Smith, «El enigma cuántico. Descubriendo la clave oculta», Sekotia (2021) [188 pp.], translated by José Antonio Estarelles
      • SEE
      Smith W. The quantum enigma: finding the hidden key. 3rd rev. ed. Hillsdale, N.Y: Sophia Perennis; 2005. 156 p. ISBN: 978-1-59731-007-9, 978-1-59731-038-3
    6. Fernando says: 16 April 2022 at 5:40 am Hello, Francis, First of all, congratulations on your undoubtedly extraordinary work on this blog, which is worthy of praise. Secondly, and now regarding the topic addressed here, I was surprised by your categorical assertion about the nonexistence of collapse, because while it is true that this is a position held by many physicists working in the Quantum Decoherence programme, it is no less true that it is not shared by the whole scientific community. In fact, strictly speaking, it is a current of thought not yet generally backed by the mathematical and experimental results derived from that research programme. To go no further, within that mathematical framework the collapse of the system-apparatus is only apparent: although, as a consequence of the environment, the reduced density operator of the system-apparatus ends up with the characteristic form of a mixed state, the density operator of the global system-apparatus-environment remains that of a pure state, so the global system is still in a superposition state and, therefore, the measured observable is still not well defined according to the quantum formalism. Moreover, by taking the partial trace over the environment states in order to do calculations with the reduced density operator of the system-apparatus, we are not taking into account the global phase coherence between the system, the apparatus and the environment, so the decoherence mechanism never affects that phase. Consequently, the system-apparatus-environment is still in a superposition state, and nothing entitles us to claim otherwise. That is why one cannot claim that a real collapse of anything has taken place, nor that Decoherence solves the problem of the definite reading of the measuring apparatus. I do know, however, that the concept of envariance introduced by Zurek looks promising enough that it might resolve the problems the aforementioned reduced density operator leads to, but I do not know whether he has managed to do so. If he has, I would be grateful if you could send me a paper on it so I can be fully up to date. And if you had the time and felt like it, it would be nice to discuss it privately, physicist to physicist.
      • OK: categorical assertion! Only the "pro"-decoherence camp
      • Who are you, Fernando?
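
      A toy two-qubit illustration (mine, not Fernando's) of the bookkeeping he describes: after a measurement-like interaction, tracing out the "environment" leaves a mixed-looking reduced density operator with no off-diagonal coherence, even though the global state remains pure. It illustrates the partial-trace step only; it takes no side on whether decoherence settles the measurement problem.

      ```python
      # Toy illustration: the reduced state looks mixed while the global state stays pure.
      import numpy as np

      ket0 = np.array([1.0, 0.0], dtype=complex)
      ket1 = np.array([0.0, 1.0], dtype=complex)

      # System starts in (|0> + |1>)/sqrt(2); a CNOT-like interaction correlates it
      # with one "environment" qubit, giving (|00> + |11>)/sqrt(2).
      psi = (np.kron(ket0, ket0) + np.kron(ket1, ket1)) / np.sqrt(2)

      rho_global = np.outer(psi, psi.conj())      # pure state of system + environment

      # Partial trace over the environment qubit (second tensor factor).
      rho4 = rho_global.reshape(2, 2, 2, 2)       # indices: (s, e, s', e')
      rho_sys = sum(rho4[:, e, :, e] for e in range(2))

      purity = lambda rho: np.real(np.trace(rho @ rho))
      print(purity(rho_global))    # 1.0  -> global state is still pure
      print(purity(rho_sys))       # 0.5  -> reduced state is maximally mixed
      print(np.round(rho_sys, 3))  # diag(0.5, 0.5): off-diagonal coherence is gone
      ```
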
    7. Enrique says: 17 April 2022 at 5:40 pm Hello Juan, even though I fully agree with almost all of your argument, quantum field theory does not solve what is known as the measurement problem, which is illustrated, very simply, by the double-slit experiment: when you do not measure which slit it goes through, the electron behaves as a wave; when you do measure, it behaves as a corpuscle. It is precisely decoherence that points out that the measuring apparatus, being an object external to the system we have considered and, moreover, made up of an enormous number of particles, breaks the coherence of the system, preventing the interference from occurring and, therefore, preventing us from seeing the electron as a wave.
      • I DON'T UNDERSTAND
    8. Enrique says: 16 April 2022 at 11:40 am I am a physicist. I have spent more than 20 years trying to convince colleagues and friends that decoherence is the correct interpretation of the measurement problem in quantum mechanics. The biggest objection I have met in getting others to accept this idea is not its coherence or logic, but that neither textbooks, nor popular science books, nor most scientific articles talk about anything other than the collapse of the wave function or, at best, the loss of information from the density matrix (the von Neumann interpretation). I have come to the conclusion that popular science books talk about wave-function collapse because it lets them say "strange" things about physics, and that sells. On the other hand, the great quantum mechanics textbooks were written 40 or more years ago, by authors who had studied the foundations of quantum mechanics 40 years before that, when the collapse of the wave function was unquestioned. In my opinion we are 10 to 20 years away from seeing popular science books that include decoherence as the solution to the measurement problem. The only thing I ever saw was, in 1995, a brief sketch of what decoherence is (barely a few lines) in the book "The Quark and the Jaguar" by the great Murray Gell-Mann. After that, nothing.
      • OK
    1. He continues by comparing open works to quantum mechanics, and he arrives at the conclusion that open works are more like Einstein's idea of the universe, which is governed by precise laws but seems random at first. In such open works the artist arranges the work carefully so that it can be re-organized by another while still keeping the original voice or intent of the artist.

      Is physics open or closed?

      Could a play, made in a zettelkasten-like structure, be performed in a way so as to keep a consistent authorial voice?

      What potential applications does the idea of opera aperta have for artificial intelligence? Can it be created in such a way as to give an artificial brain a consistent "authorial voice"?

  2. Mar 2022
    1. Melvin Vopson has proposed an experiment involving particle annihilation that could prove that information has mass, and by Einstein's mass-energy equivalence, information is also energy. If true, the experiment would also show that information is one of the states of matter.

      The experiment doesn't need a particle accelerator, but instead uses slow positrons at thermal velocities.

      Melvin Vopson is an information theory researcher at the University of Portsmouth in the United Kingdom.

      A proof that information has mass (or is energy) may explain the idea of dark matter. Vopson's rough calculations indicate that 10^93 bits of information would explain all of the “missing” dark matter.

      Vopson's 2022 AIP Advances paper indicates that the smallest theoretical digital bits, presuming they are stable and exist on their own, would become the smallest known building blocks of matter.

      The width of digital bits today is between 10 and 30 nanometers. Smaller physical bits could mean more densely packed storage devices.


      Vopson proposes that a positron-electron annihilation should produce energy equivalent to the masses of the two particles. It should also produce an extra dash of energy: two infrared, low-energy photons of a specific wavelength (predicted to be about 50 microns), as a direct result of erasing the information content of the particles.

      The mass-energy-information equivalence principle Vopson proposed in his 2019 AIP Advances paper assumes that a digital information bit is not just physical, but has a “finite and quantifiable mass while it stores information.” This very small mass is 3.19 × 10^-38 kilograms at room temperature.

      For example, if you erase one terabyte of data from a storage device, it would decrease in mass by 2.5 × 10^-25 kilograms, a mass so small that it can only be compared to the mass of a proton, which is about 1.67 × 10^-27 kilograms.
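
      A quick back-of-the-envelope check of the figures quoted above, assuming room temperature T ≈ 300 K and 1 TB = 8 × 10^12 bits: Landauer's bound of k_B T ln 2 per bit, converted to mass via E = mc², reproduces both numbers.

      ```python
      # Back-of-the-envelope check of the figures quoted above
      # (assumes T = 300 K and 1 TB = 8e12 bits).
      import math

      k_B = 1.380649e-23     # J/K
      c   = 2.99792458e8     # m/s
      T   = 300.0            # room temperature, K (assumed)

      E_bit = k_B * T * math.log(2)   # Landauer bound per bit, ~2.9e-21 J
      m_bit = E_bit / c**2            # ~3.2e-38 kg, matching Vopson's figure

      m_TB = 8e12 * m_bit             # ~2.5e-25 kg for one erased terabyte
      print(f"m_bit = {m_bit:.2e} kg, m_TB = {m_TB:.2e} kg")
      ```
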

      In 1961, Rolf Landauer first proposed the idea that a bit is physical and has a well-defined energy. When one bit of information is erased, the bit dissipates a measurable amount of energy.

    1. First, what do I mean by meta-stable? It’s in reference to an article by Scott H Young - 7 Rules for Staying Productive Long-Term. In it, Scott describes a concept in physics where something is stable, but small perturbations can cause it to break, leaving it unable to go back to its starting position after a small push. The example he gives is a pendulum perfectly balanced at the top; when pushed, it will not return to its starting point. Contrast this with a pendulum’s other stable point, the bottom. When pushed, it’ll return to that starting point easily.

      Something is meta-stable if a small perturbation can knock it out of its state and it cannot return to its starting point afterwards.

      example: pendulum
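
      A tiny numerical sketch (mine, not from Scott Young's article) contrasting the pendulum's two balance points: the same small push leaves it oscillating close to the bottom, but sends it permanently away from the inverted balance that the article uses as its example.

      ```python
      # Small numerical sketch: the same small push applied at the pendulum's two
      # equilibria. Near the bottom it oscillates around the start; near the top
      # (the precariously balanced position) it runs away and never comes back.
      import math

      def max_excursion(theta0, omega0=0.05, g_over_L=1.0, dt=1e-3, t_max=20.0):
          """Integrate theta'' = -(g/L) sin(theta) after a small initial push omega0."""
          theta, omega, worst = theta0, omega0, 0.0
          for _ in range(int(t_max / dt)):
              omega += -g_over_L * math.sin(theta) * dt
              theta += omega * dt
              worst = max(worst, abs(theta - theta0))
          return worst

      print(max_excursion(0.0))        # bottom: stays within ~0.05 rad of the start
      print(max_excursion(math.pi))    # top: winds up many radians away, no return
      ```
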

    1. Ryan Usher says: January 22, 2022 at 6:19 pm “…I don’t at all understand why Quanta chose to cover this.” This is something I find myself wondering as well, and it’s not the first time Quanta has fallen prey to this kind of hype–and I characterize it in that way to be generous to Quanta in the face of my cynicism. So I decided to waste a couple of hours to come up with the following:
      • SEE
    1. gravitational microlensing

      Gravitational microlensing

      • gravitational wave approaching the earth is interrupted by a black hole, signal gets modified
  3. Feb 2022
  4. Dec 2021
    1. gravity on the poles is a bit larger
      • BEWARE!
      • is not "gravity"; it is the RESULTING "acceleration" on the massive (IMPORTANT) object: gravity + rotation
      • BESIDES: there are 2 factors:
        • distance to center: pole < equator
        • rotation: pole=0 < equator
    1. Physics Hamiltonian for the zeros of the Riemann Zeta function (2016) General Relativity and Cosmology: Unsolved Questions and Future Directions (2016) There are no particles, there are only fields (2012) Would Bohr be born if Bohm were born before Born? (2007) The holographic solution - why general relativity must be understood in terms of strings (2004) Measurement of subpicosecond time intervals between two photons by Interference (1987) Bertlmann’s socks and the nature of reality (1981) More is different (1972) On the Einstein Podolsky Rosen Paradox (1964) Deterministic Nonperiodic Flow (1962) There's Plenty of Room at the Bottom (1959) Forms of Relativistic Dynamics (1949) What is life? (1944) Can Quantum-Mechanical Description of Physical Reality Be Considered Complete? (1935) Possible Existence of a Neutron (1932) On the electrodynamics of moving bodies (1905)

      see

  5. Nov 2021
    1. “Because physicists started out with the imaginary, unstable cube as their model instead of the real-world stable tetrahedron, they got into all these imaginary numbers and other complicated and completely unnecessary mathematics. It would be so much simpler if they started out with the tetrahedron, which is nature’s best structure, the simplest structural system in Universe.

      (Just as an aside, to remember later when you’re studying physics in school, I want to point out that the tetrahedron is also equivalent to the quantum unit of physics, and to the electron.)”

  6. Oct 2021
  7. Sep 2021
  8. Aug 2021
  9. Jul 2021
    1. We may assume that Anaximander somehow had to defend his bold theory of the free-floating, unsupported earth against the obvious question of why the earth does not fall. Aristotle’s version of Anaximander’s argument runs like this: “But there are some who say that it (namely, the earth) stays where it is because of equality, such as among the ancients Anaximander. For that which is situated in the center and at equal distances from the extremes, has no inclination whatsoever to move up rather than down or sideways; and since it is impossible to move in opposite directions at the same time, it necessarily stays where it is.” (De caelo 295b10ff., DK 12A26) Many authors have pointed to the fact that this is the first known example of an argument that is based on the principle of sufficient reason (the principle that for everything which occurs there is a reason or explanation for why it occurs, and why this way rather than that).

      principle of sufficient reason

      : for everything which occurs there is a reason or explanation for why it occurs, and why this way rather than that

      The first example in Western culture is that of Anaximander explaining why the Earth does not fall.

    2. These observations were made with the naked eye and with the help of some simple instruments as the gnomon. The Babylonians, in particular, were rather advanced observers. Archeologists have found an abundance of cuneiform texts on astronomical observations. In contrast, there exists only one report of an observation made by Anaximander, which concerns the date on which the Pleiades set in the morning. This is no coincidence, for Anaximander’s merits do not lie in the field of observational astronomy, unlike the Babylonians and the Egyptians, but in that of speculative astronomy. We may discern three of his astronomical speculations: (1) that the celestial bodies make full circles and pass also beneath the earth, (2) that the earth floats free and unsupported in space, and (3) that the celestial bodies lie behind one another. Notwithstanding their rather primitive outlook, these three propositions, which make up the core of Anaximander’s astronomy, meant a tremendous jump forward and constitute the origin of our Western concept of the universe.

      Anaximander practiced speculative astronomy instead of just observational astronomy and in so doing, he dramatically changed the cosmological outlook of Western culture.

  10. Jun 2021
    1. The Internet, an immeasurably powerful computing system, is subsuming most of our other intellectual technologies. It’s becoming our map and our clock, our printing press and our typewriter, our calculator and our telephone, and our radio and TV.

      An example of technological progress subsuming broader things and abstracting them into something larger.

      Most good mathematical and physical theories exhibit this sort of behaviour. Cross reference Simon Singh's The Big Bang.

  11. May 2021
    1. The largest collection of Isaac Newton's papers has gone digital, committing to open-access posterity the works of one of history's greatest scientists. Among the works shared online by the Cambridge Digital Library are Newton's own annotated copy of Principia Mathematica and the 'Waste Book,' the notebook in which a young Newton worked out the principles of calculus.

      I've annotated something about Isaac Newton's Waste Book for calculus before (possibly in Cambridge's Digital Library itself), but just in case, I'm making a note of it here again so it doesn't get lost.

      In my own practice, I occasionally use small notebooks to write temporary notes into before transferring them into other digital forms. I generally don't throw them away, but they're essentially waste books in a sense.

  12. Apr 2021
  13. Feb 2021
    1. Say, for instance, a hypothetical self-driving car is sold as being the safest on the market. One of the factors that makes it safer is that it “knows” when a big truck pulls up along its left side and automatically moves itself three inches to the right while still remaining in its own lane. But what if a cyclist or motorcycle happens to be pulling up on the right at the same time and is thus killed because of this safety feature?

      I think that an algorithm that's "smart" enough to move away from a truck is also "smart" enough to know that it cannot physically occupy the same space as the motorcycle.

  14. Dec 2020
  15. Oct 2020
    1. History has a second lesson. Even though beauty was arguably a strong personal motivator for many physicists, the problems that led to breakthroughs were not merely aesthetic misgivings – they were mathematical contradictions. Einstein, for example, abolished absolute time because it was in contradiction with Maxwell’s electromagnetism, thereby creating special relativity. He then resolved the conflict between special relativity and Newtonian gravity, which gave him general relativity. Dirac later removed the disagreement between special relativity and quantum mechanics, which led to the development of the quantum field theories which we still use in particle physics today.
    2. My conclusion from this long line of null results is that when physics tries to rectify a perceived lack of beauty, we waste time on problems that aren’t really problems. Physicists must rethink their methods, now – before we start discussing whether the world needs a next larger particle collider or yet another dark matter search.

    1. Their findings indicate that the set of all quantum field theories forms a unique mathematical structure, one that does indeed pull itself up by its own bootstraps, which means it can be understood on its own terms.

      What kind of structure? Group? Ring? Other?

    1. The notion that counting more shapes in the sky will reveal more details of the Big Bang is implied in a central principle of quantum physics known as “unitarity.” Unitarity dictates that the probabilities of all possible quantum states of the universe must add up to one, now and forever; thus, information, which is stored in quantum states, can never be lost — only scrambled. This means that all information about the birth of the cosmos remains encoded in its present state, and the more precisely cosmologists know the latter, the more they can learn about the former.
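
      A tiny numerical illustration (mine, not from the article) of the "unitarity" statement above: a random unitary scrambles a state's amplitudes, yet the outcome probabilities still sum to one and the original state is exactly recoverable by applying the inverse, i.e. information is scrambled, not lost.

      ```python
      # Unitarity in miniature: probabilities stay normalized and evolution is reversible.
      import numpy as np

      rng = np.random.default_rng(1)

      # Random pure state of a toy "universe" with 8 basis states.
      psi = rng.normal(size=8) + 1j * rng.normal(size=8)
      psi /= np.linalg.norm(psi)

      # Random unitary from the QR decomposition of a random complex matrix.
      Q, _ = np.linalg.qr(rng.normal(size=(8, 8)) + 1j * rng.normal(size=(8, 8)))

      evolved = Q @ psi
      print(np.sum(np.abs(evolved) ** 2))                # ~1.0: probabilities still add to one
      print(np.linalg.norm(Q.conj().T @ evolved - psi))  # ~0.0: the initial state is recoverable
      ```
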
    1. This is, I thought, little more than an analogy with Boyle’s Law, one of the most striking early successes of the scientific revolution, which holds that the pressure and volume of a fixed amount of gas are inversely proportional.  Release the contents from a steel cylinder into a balloon and the container expands.  But it still contains no more gas than before.  Something like that must have been in the mind of the first person who first spoke of “inflating” the currency. From there it was a short jump to the way that classical quantity theory relies on the principle of plenitude – the age-old assumption, inherited from Plato, that there can be nothing truly new under the sun, that the collection of goods of “general price level” were somehow fixed.
  16. Sep 2020
  17. Jul 2020
  18. Jun 2020
    1. tunnel splitting

      What exactly is tunnel splitting? The essay mentions this various times without clearly explaining what it is, and whenever I search it up online multiple things are said about it.

      From what I could understand, quantum spin tunnel splitting makes it so that the magnetization of a system can "switch between states with opposite magnetization that are separated by an energy barrier much larger than thermal energy". But what exactly happens during tunnel splitting? And why are there different ones mentioned in the article? [magnetization tunneling, zero-field tunnel splitting, ground-state tunnel splitting]

      I also understand that this phenomenon "defies classical physics" because of magnetization switching. How is that possible?

  19. May 2020
  20. Apr 2020
  21. Mar 2020
    1. Always bearing in mind that when we speak of sphericity, the electron must not be pictured as a little ball: it is an elementary particle, hence unstructured and indivisible, and by "shape" we really mean the symmetry of its interactions with external fields, with other charges.
  22. Jan 2020
    1. Since water is denser than air and the reflection is diffuse, a lot of light is internally reflected, thereby increasing the probability of absorption at the surface.

      The light is reflected back inside the water, because of the total internal reflection:

      • water is denser than air
      • angle of incidence is greater than the so-called critical angle

    2. This is because the light now has a layer of water to go through. And due to the reflectance of water, not all light at the air-liquid-interface (border between air and water) goes through the water. Some of it is reflected.

      Wet things become darker because of the water layer's reflectance, which doesn't let all of the light transmit through it.

      The probability of light getting transmitted is: 1 - R1 (reflectance at the air-liquid interface)
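
      Quick numbers behind these notes, assuming n_water ≈ 1.33 and n_air = 1.0: Snell's law gives the critical angle for total internal reflection at the water-air interface, and the normal-incidence Fresnel formula gives R1, so the transmitted fraction is 1 - R1.

      ```python
      # Critical angle for total internal reflection (water -> air) and the
      # normal-incidence Fresnel reflectance R1 at the air-liquid interface.
      import math

      n_water, n_air = 1.33, 1.0                            # assumed refractive indices

      theta_c = math.degrees(math.asin(n_air / n_water))    # ~48.8 degrees
      R1 = ((n_water - n_air) / (n_water + n_air)) ** 2     # ~0.02 at normal incidence
      print(f"critical angle ~ {theta_c:.1f} deg, transmitted fraction ~ {1 - R1:.2f}")
      ```
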

    3. There are two types of reflection (two ways the wave can be thrown back). Specular Diffuse

      Two types of reflection:

      1. specular - light leaves the surface at the same angle it hits it
      2. diffuse - hitting light is scattered into all angles when reflected
  23. Dec 2019
    1. removing

      We're not removing overlapping regions, right? Rather, we are merging overlapping ROIs into one single ROI so as to incorporate the reality that they overlap. The image illustrates that, and not a removal of the intersection of two ROIs (which is what this language implies).
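
      A minimal sketch (the names and box representation are mine, not from the annotated article) of the "merge overlapping ROIs into one single ROI" reading described above: overlapping boxes are repeatedly fused into their union.

      ```python
      # Minimal sketch: fuse any two overlapping boxes (x1, y1, x2, y2) into their union.
      def overlaps(a, b):
          return a[0] < b[2] and b[0] < a[2] and a[1] < b[3] and b[1] < a[3]

      def union(a, b):
          return (min(a[0], b[0]), min(a[1], b[1]), max(a[2], b[2]), max(a[3], b[3]))

      def merge_rois(rois):
          rois = list(rois)
          merged = True
          while merged:
              merged = False
              for i in range(len(rois)):
                  for j in range(i + 1, len(rois)):
                      if overlaps(rois[i], rois[j]):
                          rois[i] = union(rois[i], rois[j])
                          del rois[j]
                          merged = True
                          break
                  if merged:
                      break
          return rois

      print(merge_rois([(0, 0, 10, 10), (5, 5, 15, 15), (30, 30, 40, 40)]))
      # -> [(0, 0, 15, 15), (30, 30, 40, 40)]
      ```
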

  24. Nov 2019
    1. Quantum Realism: A virtual reality would be subject to virtual time, where each processing cycle is one "tick." Every gamer knows that when the computer is busy the screen lags—game time slows down under load. Likewise, time in our world slows down with speed or near massive bodies, suggesting that it is virtual. So the rocket twin only aged a year because that was all the processing cycles the system busy moving him could spare. What changed was his virtual time.

      Thought exercise. Modern "Zen koan".

  25. Jul 2019
  26. Feb 2019
    1. Deep learning approach based on dimensionality reduction for designing electromagnetic nanostructures

      Applying deep learning to physics has gradually become a much-touted research approach. This paper is a representative piece of such work: the researchers apply deep learning techniques to the analysis, design, and optimization of electromagnetic nanostructures, not only greatly reducing the computational complexity of analysis and design, but also yielding new design options (for example, the paper designs a brand-new reconfigurable optical metasurface based on phase-change materials). The authors have bundled the related software into a freely available toolkit, which should help advance research on electromagnetic nanostructures.

  27. Dec 2018
  28. Nov 2018
    1. Opening the black box of deep learning

      This paper from Shanghai University leans toward filler: it does not put forward any actual mathematical or theoretical framework, and overall it remains too phenomenological in its attempt to give a physical-theory explanation of DL... Skimmed it; the arguments and their justification are rather forced.

    2. DeepSphere: Efficient spherical Convolutional Neural Network with HEALPix sampling for cosmological applications

      Convolution over data carrying directional information, implementing a so-called 3D convolution; this is very meaningful for applications to cosmic microwave background (CMB) data in astronomy.

  29. Sep 2018
    1. The establishing of this mutual relationship between technology and physics is correct. But it remains a merely historiological establishing of facts and says nothing about that in which this mutual relationship is grounded. The decisive question still remains: Of what essence is modern technology that it thinks of putting exact science to use?

      It seems that the author is establishing a circular relationship between modern physics and technology: modern physics would not be possible if technology did not allow us to study it, nor would technology advance if modern physics were not studied. So, essentially, modern technology is a method of revealing modern physics to us? Therefore, modern technology is also a revealing, just as modern physics is a revealing, and they mutually allow each other to do so?

  30. Apr 2018
  31. Mar 2018
  32. Oct 2017
    1. You have probably heard about the hunt for dark matter, a mysterious substance thought to permeate the universe, the effects of which we can see through its gravitational pull. But our models of the universe also say there should be about twice as much ordinary matter out there, compared with what we have observed so far.

      Two separate teams found the missing matter – made of particles called baryons rather than dark matter – linking galaxies together through filaments of hot, diffuse gas.

  33. Sep 2017
    1. similarities between the two

      It would be interesting to discuss these similarities. From my non-chemist brain, I see more pattern and regularity in the chemical than in the social. Reminds me of NDTyson quote that physics is easy and sociology is hard because of the nonlinearity of human behavior.

  34. Apr 2017
    1. In 2013, François Englert and Peter Higgs were awarded the Nobel Prize in Physics for the development of the Higgs mechanism.

      'The Nobel Prize in Physics 2013 was awarded jointly to François Englert and Peter W. Higgs "for the theoretical discovery of a mechanism that contributes to our understanding of the origin of mass of subatomic particles, and which recently was confirmed through the discovery of the predicted fundamental particle, by the ATLAS and CMS experiments at CERN's Large Hadron Collider"'

      https://www.nobelprize.org/nobel_prizes/physics/laureates/2013/

    2. we recommend using the term 'transformation' instead of 'decay', as this more accurately describes the physical process

      OH MY GOD YES! <3

      The term "decay" when applied to non-macro phenomena is terribly misleading for anyone who isn't a physicist.

      "Decay" has several meanings (see the Wikipedia page), but it would not be foolish to assume that the term is commonly associated with things like decomposition or biological decays. Even in physics, an orbital decay is a gradual process.

      "Transformation" is much more applicable, and is a term I've used myself over the last few years instead of "decay" in this context.

    3. Matter particles can be divided into three groups: quarks (q) and antiquarks (q̄); electrically charged leptons (ℓ) and antileptons (ℓ̄); neutrinos (ν) and antineutrinos (ν̄). Gluons (g) couple to colour charge, which only quarks, antiquarks, and gluons themselves have.

      Typically, though, the matter particles (fermions) are grouped into two, depending on whether they interact with the colour charge (quarks) or not (leptons, which include both the electrically charged leptons and neutrinos).

      However, the division into three groups, as shown here, is helpful!

  35. Feb 2017
    1. The following is a statement of the laws of physics, not just my own personal opinion. "When power is Variable, Power controls airspeed." "When power is fixed, Pitch controls airspeed." In general, airplanes go where you point them, and go as fast as the power dictates. This is the easiest way to fly, and it works in all airplanes.
  36. Jan 2017
  37. Nov 2016
  38. Oct 2016
  39. Jul 2016
    1. Page 187 On hyper authorship

      "hyper authorship” is an indicator of "collective cognition" in which the specific contributions of individuals no longer can be identified. Physics has among the highest rates of coauthorship in the sciences and the highest rates of self archiving documents via a repository. Whether the relationship between research collaborators (as indicated by the rates of coauthorship) and sharing publications (as reflected in self archiving) holds in other fields is a question worth exploring empirically.

    1. I always found it incredible. He would start with some problem, and fill up pages with calculations. And at the end of it, he would actually get the right answer! But he usually wasn’t satisfied with that. Once he’d gotten the answer, he’d go back and try to figure out why it was obvious. And often he’d come up with one of those classic Feynman straightforward-sounding explanations. And he’d never tell people about all the calculations behind it. Sometimes it was kind of a game for him: having people be flabbergasted by his seemingly instant physical intuition, not knowing that really it was based on some long, hard calculation he’d done.

      Straightforward intuition isn't just intuition.

  40. Jun 2016
    1. Actually, I didn’t need Holmesian deductions to conclude that Aad et al. aren’t using a conventional definition of authorship. It’s widely known*** that at least two groups in experimental particle physics operate under the policy that every scientist or engineer working on a particular detector is an author on every paper arising from that detector’s data. (Two such detectors at the Large Hadron Collider were used in the Aad et al paper, so the author list is the union of the “ATLAS collaboration” and the “CMS collaboration”.) The result of this authorship policy, of course, is lots of “authorships” for everyone: for the easily searchable George Aad, for instance, over 400 since 2008.

      Physicists' authorship models

    1. The HEP research community is thus characterized by high levels of internal scrutiny, mutual trust—witness, for instance, the institutionalized practice of relying upon, and citing, preprints—and peer tracking, such that it is not susceptible to systematic fraud. Contrary

      physicists live in a very trustful, observant world; they also do a lot of internal, pre-referee review

    2. The answer probably has to do with the relative intensity of socialization and oral communication (Traweek, 1992, pp. 120–123), along with the character of the organizational structures and value systems, which define collaborations in large-scale, high-energy physics and biomedical research.

      Why is there less soul-searching about hyper-authorship in HEP? disciplinary differences

    3. This article (a) begins with a brief, historical overview of scholarly publishing, focusing on the role of the author and the constitution of trust in scientific communication; (b) offers an impressionistic survey and analysis of recent developments in the biomedical literature; (c) explores the extent to which deviant publishing practices in biomedical publishing are a function of sociocognitive and structural characteristics of the discipline by comparing biomedicine with high energy physics, the only other field which appears to exhibit comparable hyperauthorship tendencies; and (d) assesses the extent to which current trends in biomedical communication may be a harbinger of developments in other disciplines

      Great overview of what is going to happen in article:

      1. History of authorship
      2. Survey of state of biomedicine
      3. "extent to which deviant publishing practices in biomedical publishing are a function of sociocognitive and structural characteris-tics of the discipline by comparing biomedicine with high energy physics, the only other field which appears to exhibit comparable hyperauthorship tendencies"
      4. Assess extent to which biomedical trends may foreshadow trends in other fields.
    1. Combined Measurement of the Higgs Boson Mass in pp Collisions at √s = 7 and 8 TeV with the ATLAS and CMS Experiments

      ATLAS Collaboration, CMS Collaboration, G. Aad, B. Abbott, J. Abdallah, O. Abdinov, R. Aben, et al. 2015. “Combined Measurement of the Higgs Boson Mass in $pp$ Collisions at $\sqrt{s}=7$ and 8 TeV with the ATLAS and CMS Experiments.” Physical Review Letters 114 (19): 191803. doi:10.1103/PhysRevLett.114.191803.

      This is the 5000+ author physics paper

      Note a) that they actually credit the authorship to the collaborations on the byline; and b) that they have two plus pages of secondary affiliations!

  41. Apr 2016
  42. Feb 2016
  43. Jan 2016
  44. Dec 2015
    1. Pronunciations for hexadecimal numbers:<br> 0xB3 "bibbity-three"<br> 0xF5 "fleventy-five"<br> 0xDB "dickety-bee"

      BZARG is the work of Tim Babb, who lives in the San Francisco Bay Area, and is Lighting Optimization Lead for Pixar Animation Studios.

      This blog focuses primarily on graphics, physics, programming, and probably some philosophy and fiction

  45. Oct 2015
    1. In a landmark study, scientists at Delft University of Technology in the Netherlands reported that they had conducted an experiment that they say proved one of the most fundamental claims of quantum theory — that objects separated by great distance can instantaneously affect each other’s behavior.

      The researchers describe their experiment as a “loophole-free Bell test” in a reference to an experiment proposed in 1964 by the physicist John Stewart Bell as a way of proving that “spooky action at a distance” is real.

    2. the strongest evidence yet to support the most fundamental claims of the theory of quantum mechanics about the existence of an odd world formed by a fabric of subatomic particles, where matter does not take form until it is observed and time runs backward as well as forward.
    1. "It's intriguing that you've got general relativity predicting these paradoxes, but then you consider them in quantum mechanical terms and the paradoxes go away," says University of Queensland physicist Tim Ralph. "It makes you wonder whether this is important in terms of formulating a theory that unifies general relativity with quantum mechanics."
  46. Jun 2015
    1. Schrödinger thought that the Greeks had a kind of hold over us—they saw that the only way to make progress in thinking about the world was to talk about it without the “knowing subject” in it. QBism goes against that strain by saying that quantum mechanics is not about how the world is without us; instead it’s precisely about us in the world. The subject matter of the theory is not the world or us but us-within-the-world, the interface between the two.
  47. Oct 2014
    1. The discovery of the Higgs Boson has been a testament to the coherence of the Standard Model of Physics, but the way in which this boson interacts with a fundamental class of particles known as leptons has yet to be explored, due to the rarity with which the Higgs Boson decays into leptons and the event's similarity to the decay of other particles such as the Z Boson. By creating a machine learning model to accurately determine whether a Higgs Boson is decaying into tau particles (a type of lepton) within a particle accelerator, physicists will be able to explore the nature of the Higgs's interaction with leptons. Our model serves to contribute to the work of others involved in the Higgs Boson Machine Learning Challenge, a crowdsourced effort to generate a satisfactory classification model to be used at CERN's facilities, where the Higgs Boson is studied.

      Really interesting abstract.
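
      A minimal sketch of the kind of signal-versus-background classifier the abstract describes. It assumes the public Kaggle "Higgs Boson Machine Learning Challenge" training.csv with its EventId / Weight / Label columns; the file name, columns, and model choice are assumptions here, not the authors' actual pipeline.

      ```python
      # Minimal sketch of a Higgs -> tau tau signal/background classifier on the
      # assumed Kaggle challenge training file (not the annotated paper's model).
      import pandas as pd
      from sklearn.ensemble import GradientBoostingClassifier
      from sklearn.model_selection import train_test_split
      from sklearn.metrics import roc_auc_score

      df = pd.read_csv("training.csv")                       # assumed file name
      X = df.drop(columns=["EventId", "Weight", "Label"])    # kinematic features
      y = (df["Label"] == "s").astype(int)                   # 1 = signal event

      X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)
      clf = GradientBoostingClassifier().fit(X_train, y_train)
      print("ROC AUC:", roc_auc_score(y_test, clf.predict_proba(X_test)[:, 1]))
      ```
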