Sunday, 17 March 2024

Universality of Radiation with Blackbody as Reference


One of the unresolved mysteries of classical physics is why the radiation spectrum of a material body depends only on temperature and frequency and not on the physical nature of the body, an intriguing example of universality. Why is that? The common answer, given by Planck, is statistics of energy quanta, an answer however without clear physics, based on ad hoc assumptions which cannot be verified experimentally, as shown by this common argumentation. 

I have pursued a path without statistics, based on clear physics, as Computational Blackbody Radiation: near-resonance in a wave equation with small radiative damping as outgoing radiation, subject to external forcing $f_\nu$ depending on frequency $\nu$. This model gives the following radiance spectrum $R(\nu ,T)$ (with more details here), characterised by a common temperature $T$, a radiative damping parameter $\gamma$, and a parameter $h$ defining a high-frequency cut-off. Radiative equilibrium with incoming = outgoing radiation is found to satisfy:

  • $R(\nu ,T)\equiv\gamma T\nu^2 =\epsilon f_\nu^2$ for $\nu\leq\frac{T}{h}$,
  • $R(\nu ,T) =0$ for $\nu >\frac{T}{h}$,
where $0<\epsilon\le 1$ is a coefficient of absorptivity = emissivity, while frequencies above cut-off $\frac{T}{h}$ cause heating. The radiation can thus be described by the coefficients $\gamma$, $\epsilon$ and $h$ and the temperature scale $T$. 
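As a minimal numerical sketch of this spectrum (with made-up illustrative values of $\gamma$ and $h$, not taken from the post):

```python
import numpy as np

def radiance(nu, T, gamma=1.0, h=1.0):
    """Sketch of R(nu, T) = gamma*T*nu^2 below the cut-off nu <= T/h, zero above.
    gamma and h are illustrative placeholder values."""
    nu = np.asarray(nu, dtype=float)
    return np.where(nu <= T / h, gamma * T * nu**2, 0.0)

nu = np.linspace(0.0, 2.0, 9)
print(radiance(nu, T=1.0))   # quadratic growth up to nu = T/h = 1, zero beyond
```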

Here $\epsilon$ and $h$ can be expected to depend on the physical nature of the body, with a blackbody defined by $\epsilon =1$ and $h$ minimal thus with maximal cut-off. 

Let us now consider possible universality of the radiation parameter $\gamma$ and temperature $T$.

Consider then two radiating bodies 1 and 2 with different characteristics $(\gamma_1,\epsilon_1, h_1, T_1)$ and $(\gamma_2,\epsilon_2, h_2, T_2)$, which when brought into radiative equilibrium will satisfy (assuming here for simplicity that $\epsilon_1=\epsilon_2$):
  • $\gamma_1T_1\nu^2 = \gamma_2T_2\nu^2$ for $\nu\leq\frac{T_2}{h_2}$ 
  • assuming $\frac{T_2}{h_2}\leq \frac{T_1}{h_1}$ 
  • and for simplicity that 2 reflects frequencies $\nu > \frac{T_2}{h_2}$.    
If we choose body 1 as reference, to serve as an ideal reference blackbody, defining a reference temperature scale $T_1$, we can then calibrate the temperature scale $T_2$ for body 2 so that 
  • $\gamma_1T_1= \gamma_2T_2$,
and thus effectively assign temperature $T_1$ and damping $\gamma_1$ to body 2 by radiative equilibrium, with body 1 acting as a reference thermometer. Body 2 will then mimic the radiation of body 1 in radiative equilibrium, and a form of universality with body 1 as reference is achieved, independently of $\epsilon_1$ and $\epsilon_2$.
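As a small sketch of this calibration step (the numbers for $\gamma_1$, $\gamma_2$ and $T_1$ are made up for illustration):

```python
# Reference body 1 (ideal blackbody) and body 2 with different radiative damping.
gamma1, T1 = 1.0, 300.0      # reference temperature scale (illustrative values)
gamma2 = 2.5                 # body 2 has a different damping parameter

# Calibrate the temperature scale of body 2 so that gamma1*T1 = gamma2*T2.
T2 = gamma1 * T1 / gamma2

# Below the common cut-off the two spectra gamma*T*nu^2 then coincide.
nu = 0.7
assert abs(gamma1 * T1 * nu**2 - gamma2 * T2 * nu**2) < 1e-9
print(T2)   # temperature assigned to body 2 on its own scale
```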

The analysis indicates that the critical quality of the reference blackbody is maximal cut-off (and equal temperature of all frequencies), and not necessarily maximal absorptivity = emissivity = 1. 

Universality of radiation is thus a consequence of radiative equilibrium with a specific reference body in the form of a blackbody acting as reference thermometer.  

Note that the form of the radiation law $R(\nu ,T)= \gamma T\nu^2$ reflects that the radiative damping term in the wave equation is given by $-\gamma\frac{d^3}{dt^3}$, with a third order time derivative as the universal expression for radiation from oscillating electric charges according to Larmor.

In practice body 1 is represented by a small piece of graphite inside a cavity with reflecting walls representing body 2, with the effect that the cavity will radiate like graphite independently of its form or wall material. Universality is thus reached by mimicking a reference, viewed as an ideal blackbody, which is perfectly understandable, and not by some mysterious deep inherent quality of blackbody radiation. Without the piece of graphite the cavity may radiate with different characteristics and universality may be lost.

We can compare with the many local currencies calibrated to the dollar as a common universal reference.  
  • All dancers who mimic Fred Astaire dance like Fred Astaire, but not all dancers dance like Fred Astaire.     
PS1 The common explanation of the high-frequency cut-off is that high frequencies have low probability, which is not physics, while I suggest that high frequencies cannot be represented because of finite precision, which can be physics.  

PS2 Note that a high-frequency cut-off increasing with temperature gives a 2nd Law expressing that energy is radiated from warm to cold and to no degree from cold to warm, thus acting like a semi-conductor allowing an electrical current only if the voltage difference is above a certain value.

Thursday, 14 March 2024

Geopolitics of Mathematical Physics


Swedish Prime Minister offering to build 1000 Swedish Tank90.

Let us identify some leading scientist/period/country in the history of mathematical physics as the foundation of science:

  • Archimedes: Greece 
  • Song Dynasty: China 
  • Galileo: Renaissance Italy
  • Leibniz and Euler: Calculus Scientific Revolution Holy Roman Empire
  • Newton: Calculus Scientific Revolution England 
  • Cauchy and Fourier: Scientific/French Revolution France
  • Maxwell: Industrial Revolution England 
  • Boltzmann: statistical mechanics German Empire 
  • Hilbert: mathematics German Empire 
  • Planck: radiation German Empire
  • Einstein: relativity German Empire, Weimar Republic
  • Schrödinger: quantum mechanics atomic physics Weimar Republic  
  • Oppenheimer: atomic bomb USA
  • Feynman: subatomic physics USA
  • Gell-Mann, Weinberg: subatomic physics USA
  • Witten: string theory subsubatomic physics USA
  • China? India? Third World?
We see that geopolitical dominance comes along with leading science. No wonder, since science gives industrial and military power. We can follow a shift of power from the Holy Roman Empire (Germany) to France/England, back to the German Empire and the Weimar Republic, followed by the USA, until now China, India and the Third World are ready to step in.  

We can also follow a development from the macroscopics of engines to the microscopics of atoms to the string theory of today, along with an expansion from our Solar system to the cosmology of the Universe. 

Modern physics is viewed to have started with Planck's derivation of his radiation law based on statistical mechanics, as a break-away from the rationality of the classical deterministic mechanics serving the scientific/industrial revolution. After 20 years of incubation it developed into the new quantum mechanics of atomic physics, with the atomic bomb as triumph. 

Modern physics thus in 1900 took a step from deterministic to statistical mechanics, from rationality to irrationality, apparently breaking with the tradition of rational mathematical thinking, from Archimedes to Hilbert, which had led the way into the 20th century. 

Questions:
  1. Can the roots of the observed irrationality of the 20th century, with two World Wars, and today a 21st century on its way to a third, be traced to the irrationality of modern physics? 
  2. Will the dominance of the USA persist with the present focus on string theory?
  3. Will the rationalism of ancient China lead to a takeover?
A more detailed perspective on modern physics politics is given in Dr Faustus of Modern Physics.

2nd Law or 2-way Radiative Heat Transfer?

In the present discussion of the 2nd Law of Thermodynamics, let us go back to this post from 2011 exposing the late 18th century battle between 1-way transfer (Pictet) and 2-way transfer (Prevost) of heat energy by radiation, which plays a central role in climate science today. 

In 1-way transfer a warm body heats a colder body in accordance with the 2nd Law. 

In 2-way heat transfer both warm and cold bodies are viewed to heat each other, but the warm body heats more, so there is a net transfer of heat energy from warm to cold. But a cold body heating a warm body violates the 2nd Law of Thermodynamics, and so there is something fishy here. 

Yet the basic mathematical model of radiative heat transfer, in the form of Schwarzschild's equation, involves 2-way transfer of heat energy, in apparent violation of the 2nd Law. 

I have discussed this situation at length on this blog with tags such as 2nd law of thermodynamics and radiative heat transfer with more on Computational Blackbody Radiation.

If you worry about the 2nd Law, you can ask yourself how 2-way radiative heat transfer is physically possible, when it appears to violate the 2nd Law? What is false here: 2nd Law or 2-way heat transfer?

What is your verdict? 


Tuesday, 12 March 2024

Philosophy of Statistical Mechanics?

Collapsed pillars of a modern building.

Let us continue with the post Three Elephants of Modern Physics taking a closer look at one of them. 

We learn from the Stanford Encyclopedia of Philosophy the following about Statistical Mechanics (SM): 

  • Statistical Mechanics is the third pillar of modern physics, next to quantum theory and relativity theory
  • Its aim is to account for the macroscopic behaviour of physical systems in terms of dynamical laws governing the microscopic constituents of these systems and probabilistic assumptions.
  • Philosophical discussions in statistical mechanics face an immediate difficulty because unlike other theories, statistical mechanics has not yet found a generally accepted theoretical framework or a canonical formalism. 
  • For this reason, a review of the philosophy of SM cannot simply start with a statement of the theory’s basic principles and then move on to different interpretations of the theory.
This is not a very good start, but we continue learning: 
  • Three broad theoretical umbrellas: “Boltzmannian SM” (BSM), “Boltzmann Equation” (BE), and “Gibbsian SM” (GSM).
  • BSM enjoys great popularity in foundational debates due to its clear and intuitive theoretical structure. Nevertheless, BSM faces a number of problems and limitations
  • There is no way around recognising that BSM is mostly used in foundational debates, but it is GSM that is the practitioner’s workhorse.
  • So what we’re facing is a schism whereby the day-to-day work of physicists is in one framework and foundational accounts and explanations are given in another framework.
  • This would not be worrisome if the frameworks were equivalent, or at least inter-translatable in relatively clear way...this is not the case.
  • The crucial conceptual questions (concerning BE) at this point are: what exactly did Boltzmann prove with the H-theorem?
This is the status today of the third pillar of modern physics, formed by Boltzmann 1866-1906 and Gibbs 1902: still without a generally accepted theoretical framework, despite 120 years of deep thinking by the sharpest brains of modern physics. 

Is this something to worry about? If one of the pillars apparently is shaky, what about the remaining two pillars? Who cares?

Recall that SM was introduced to rationalise the 2nd Law of Thermodynamics stating irreversibility of macroscopic systems based on deterministic reversible exact microscopics. This challenge was taken up by Boltzmann facing the question: If all components of a system are reversible, how can it be that the system is irreversible? From where does the irreversibility come? The only way forward Boltzmann could find was to replace exact determinism of microscopics by randomness/statistics as a form of inexactness. 
 
In the modern digital world the inexactness can take the form of finite precision computation performed with a certain number of digits (e.g. single or double precision). Here the microscopics is deterministic up to the point of keeping only a finite number of digits, which can have more or less severe consequences for macroscopic reversibility. This idea is explored in Computational Thermodynamics, offering a 2nd Law expressed in the physical quantities of kinetic energy, internal energy, work and turbulent dissipation, without need to introduce any concept of entropy.
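As a minimal sketch of this idea (using the Chirikov standard map as a stand-in example, not a model taken from Computational Thermodynamics): the map below is exactly reversible in exact arithmetic, but stepping forward and then undoing the steps in double precision does not return the initial state, because rounding errors are amplified by the dynamics.

```python
import math

K, N = 6.0, 500                      # kick strength (chaotic regime) and number of steps

def forward(theta, p):
    """One step of the Chirikov standard map (exactly reversible in exact arithmetic)."""
    p = p + K * math.sin(theta)
    theta = (theta + p) % (2 * math.pi)
    return theta, p

def backward(theta, p):
    """Exact inverse of forward() in exact arithmetic."""
    theta = (theta - p) % (2 * math.pi)
    p = p - K * math.sin(theta)
    return theta, p

theta0, p0 = 1.0, 0.5                # arbitrary initial state
theta, p = theta0, p0
for _ in range(N):
    theta, p = forward(theta, p)
for _ in range(N):                   # undo the steps in reverse order
    theta, p = backward(theta, p)

# Rounding to ~16 digits at every step, amplified by the dynamics, destroys
# reversibility: the recovered state is far from the initial (theta0, p0).
print(abs(theta - theta0), abs(p - p0))
```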

Replacing SM by precisely defined finite precision computation gives a more solid third pillar. But this is new and not easily embraced by analytical/theoretical mathematicians/physicists not used to thinking in terms of computation, with Stephen Wolfram as a notable exception.  

PS1 To meet the criticism that the Stosszahlansatz underlying the H-theorem, stating that particles before collision are uncorrelated, simply assumes what has to be proved (irreversibility), Boltzmann argued:
  • But since this consideration has, apart from its tediousness, not the slightest difficulty, nor any special interest, and because the result is so simple that one might almost say it is self-evident I will only state this result.
Convincing?

PS2 Connecting to the previous post, recall that the era of quantum mechanics was initiated in 1900 by Planck introducing statistics of "energy quanta", inspired by Boltzmann's statistical mechanics, to explain observed radiation spectra, opening the door to Born's statistical interpretation in 1927 of the Schrödinger wave function as the "probability of finding an electron" at some specific location in space and time, which is the textbook wisdom still today. Thus the pillar of quantum mechanics is also weakened by statistics. The remaining pillar of relativity is free of statistics, but also of physics, and so altogether the three pillars offer a shaky foundation for modern physics. Convinced? 

Monday, 11 March 2024

The 2nd Law as Radiative Heat Transfer


The 2nd Law of Thermodynamics states that heat energy $Q$, without forcing, is transferred from a body of temperature $T_1$ to a body of temperature $T_2$ with $T_1>T_2$, by conduction according to Fourier's Law if the bodies are in contact: 

  • $Q =\gamma (T_1-T_2)$ 

and/or by radiation according to Stefan-Boltzmann-Planck's Law if the bodies are not in contact, as radiative heat transfer:

  • $Q=\gamma (T_1^4-T_2^4)$        (SBP)
where $\gamma > 0$.
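A small numerical sketch of the two transfer laws (the value of $\gamma$ and the temperatures are made-up illustrations; the point is only the sign and direction of $Q$):

```python
def q_conduction(T1, T2, gamma=1.0):
    """Fourier's law: Q = gamma * (T1 - T2)."""
    return gamma * (T1 - T2)

def q_radiation(T1, T2, gamma=1.0):
    """Stefan-Boltzmann-Planck form (SBP): Q = gamma * (T1^4 - T2^4)."""
    return gamma * (T1**4 - T2**4)

T1, T2 = 300.0, 280.0          # illustrative temperatures with T1 > T2
print(q_conduction(T1, T2))    # > 0: heat flows from warm to cold
print(q_radiation(T1, T2))     # > 0 as well; the sign fixes the direction
```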

The energy transfer is irreversible since it has a direction from warm to cold with $T_1>T_2$. It is here possible to view conduction as radiation at close distance and thus reduce the discussion to radiation. 

We can thus view the 2nd Law to be a consequence of (SBP), at least in the case of two bodies of different temperature: There is an irreversible transfer of heat energy from warm to cold. 

Proving the 2nd Law for radiation can thus be seen to boil down to proving (SBP). This was the task taken on by the young Max Planck, who after a long tough struggle presented a proof in 1900, which he however was very unhappy with, since it, like Boltzmann's H-theorem from 1872, was based on statistical mechanics and not on classical deterministic physical mechanics.

But it is possible to prove (SBP) by replacing statistics with an assumption of finite precision computation, in the form of Computational Blackbody Radiation. Radiative heat transfer is here seen to be a deterministic threshold phenomenon, like a semi-conductor allowing heat transfer only one-way from warm to cold. 

Another aspect of radiation is that it is impossible to completely turn off or block by shielding of some sort. This connects to the universality of blackbody radiation, taking the same form independent of material, as shown here.

We are thus led to the following form of the 2nd Law without any statistics:
  • Radiative heat transfer from warm to cold is unstoppable and irreversible. 
The finite precision aspect here takes the form of a threshold, thus different from that operational in the case of turbulent dissipation into heat energy connecting to complexity with sharp gradients as discussed in recent posts.

PS To learn how statistical mechanics is taught at Stanford University by a world-leading physicist, listen to Lecture 1 and ask yourself if you get illuminated:
  • Statistical mechanics is useful for predictions in cases when you do not know the initial conditions nor the laws of physics.

2nd Law for Cosmology

A mathematical model of the Universe can take the form of Euler's equations for a gas supplemented with Newton's law of gravitation as stated in Chap 32 Cosmology of Computational Thermodynamics.  

Computational solutions of these equations satisfy the following evolution equations as laws of thermodynamics depending on time $t$ 

  • $\dot K(t)=W(t)-D(t)-\dot\Phi (t)$     (1)
  • $\dot E(t)=-W(t)+D(t)$,                  (2)
where $K(t)$ is total kinetic energy, $E(t)$ total internal energy (heat energy), $W(t)$ is total work, $D(t)\ge 0$ is total turbulent dissipation, $\Phi (t)$ is total gravitational energy and the dot signifies differentiation with respect to time. Adding (1) and (2) gives the following total energy balance:
  • $K(t)+E(t)+\Phi(t)= constant.$          (3)
Further (1) and (2) express an irreversible transfer of energy from kinetic to internal energy with $D(t)>0$, and so serve as a 2nd Law for Cosmology giving time a direction. Recall that the theoretical challenge is to tell/show why turbulent dissipation is unavoidable. 
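Spelled out, the step from (1) and (2) to (3) is just addition and integration in time:
  • $\dot K(t)+\dot E(t)=(W(t)-D(t)-\dot\Phi (t))+(-W(t)+D(t))=-\dot\Phi (t)$,
  • $\frac{d}{dt}(K(t)+E(t)+\Phi (t))=0$,
so that integration from $0$ to $t$ gives (3).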

Computations may start from a hot dense state at $t=0$ which is seen to expand/cool (run code) (Big Bang) to maximal size and then contract/warm back to a hot dense state (Big Crunch) (run code) in an irreversible sequence of expansions/contractions until some final stationary equilibrium state with $E(\infty )=P(\infty )$. Compare with post from 2011.


Dark Matter as Axions as 85% of All Matter?

Sabine Hossenfelder in Exploding stars made of dark matter could heat up universe informs us about some new speculations about the physics of dark matter, believed to make up 85% of all matter in the Universe, in the form of    

  • axions or axion particles 
able to form 
  • axion stars
able to explode and so able to  
  • heat surrounding gas 
which could be a detectable phenomenon. Sabine ends by asking how it is possible that physicists can be paid for this kind of speculation. 

Compare with the idea I have suggested, that matter with density $\rho (x,t)=\Delta \phi (x,t)$ is formed from a gravitational potential $\phi (x,t)$ locally in space-time with coordinates $(x,t)$, by differentiation expressed through the Laplacian differential operator $\Delta$, and that dark matter corresponds to large regions where the potential is so smooth that $\Delta \phi (x,t)$ is not large enough to create visible matter. 

It is conceivable that such large regions could concentrate gravitationally and even form stars which could explode as in the above scenario. Is anyone willing to pay for this idea? Does it make sense? 
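As a rough numerical illustration of the $\rho =\Delta\phi$ idea (a sketch with arbitrary made-up potentials, not a model from the post): a broad, slowly varying potential gives a small Laplacian ("dark"), while a sharply concentrated potential gives a large one ("visible").

```python
import numpy as np

def laplacian_1d(phi, dx):
    """Second-order finite-difference Laplacian on the interior points of a 1D grid."""
    return (phi[2:] - 2 * phi[1:-1] + phi[:-2]) / dx**2

x = np.linspace(-10.0, 10.0, 401)
dx = x[1] - x[0]

phi_smooth = np.exp(-(x / 8.0) ** 2)   # broad, slowly varying potential
phi_peaked = np.exp(-(x / 0.5) ** 2)   # sharply concentrated potential

# rho = Laplacian(phi): small for the smooth potential, large for the peaked one.
print(np.max(np.abs(laplacian_1d(phi_smooth, dx))))   # ~0.03
print(np.max(np.abs(laplacian_1d(phi_peaked, dx))))   # ~8
```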

Sunday, 10 March 2024

Three Elephants in Modern Physics

Modern theoretical physicists busy at work handling the crisis.

There are three elephants in the crisis room of modern physics:

  1. special relativity
  2. 2nd law of thermodynamics
  3. foundations of quantum mechanics 

which for many years have no longer been discussed in the physics community, not because they were long since resolved, but because they remain open problems with no progress for 100 years. 

Only crackpot amateur physicists still debate these problems, on fringe sites like the John Chappell Natural Philosophy Society. Accordingly these topics are no longer part of core theoretical physics education, since questions from students cannot be answered. This may seem strange, but it is an arrangement the physics community has agreed to live with. Discussion closed. Research submitted to leading journals on these topics will be rejected, without refereeing.

Is it possible to understand why no progress has been made? Why is there a crisis?

As concerns special relativity, the reason is that it is not a theory about real physics, but a theory about observer perceptions of "events" identified with space-time coordinates without physical meaning. This makes special relativity into a game following ad hoc rules without clear physics, pretending to describe a world which shows itself to be very, very strange. Unless of course you realise that it is just a game and not science, but then there is nothing to teach at a university.  

As concerns topics 2 and 3, the reason is the introduction of statistical mechanics by Boltzmann, followed by Planck, as a last rescue when deterministic physics seemed to fail. Again the trouble is that statistical mechanics is physics in the eyes of observers, as probabilities of collections of physical events without cause-effect, rather than specific real events with cause-effect. Like special relativity, this makes statistical mechanics into a game played according to ad hoc rules without physical meaning. 

In all three cases the observer is given a new key role, as if the world depends on observation and cannot, as in classical deterministic physics, go ahead by itself without it. This makes discussion complicated since there is no longer any common ground for all observers, and so eventually discussion dies. Nothing more to say. Everybody agrees that there is nothing to disagree about. Everything in order. 

Schrödinger, as the inventor of the Schrödinger equation for the Hydrogen atom with one electron as a deterministic classical continuum mechanical model, was appalled by Born's statistical interpretation of the multi-dimensional generalisation of his equation to atoms with several electrons, which defies physical meaning, and so gave up and turned to more fruitful pastures like the physics of living organisms.

But the elephants are there even if you pretend that they are not, and that is not a very healthy climate for scientific progress. You find my attempts to help out on this blog.  
  

Saturday, 9 March 2024

Challenges to the 2nd Law

The book Challenges to the Second Law of Thermodynamics by Capek and Sheehan starts out by describing the status of this most fundamental law of physics as of 2005:

  • For more than a century this field has lain fallow and beyond the pale of legitimate scientific inquiry due both to a dearth of scientific results and to a surfeit of peer pressure against such inquiry. 
  • It is remarkable that 20th century physics, which embraced several radical paradigm shifts, was unwilling to wrestle with this remnant of 19th century physics, whose foundations were admittedly suspect and largely unmodified by the discoveries of the succeeding century. 
  • This failure is due in part to the many strong imprimaturs placed on it by prominent scientists like Planck, Eddington, and Einstein. There grew around the second law a nearly impenetrable mystique which only now is being pierced.
The book then continues to present 21 formulations of the 2nd Law followed by 20 versions of entropy and then proceeds to a large collection of challenges, which are all refuted, starting with this background:
  • The 2nd Law has no general theoretical proof.
  • Except perhaps for a dilute gas (Boltzmann's statistical mechanics), its absolute status rests squarely on empirical evidence.  
We learn that modern physics, when confronted with the main unresolved problem of classical physics, reacted by denial and suppression as a cover-up of a failure of monumental dimensions. The roots of the present crisis of modern physics may hide here. 

Computational Thermodynamics seeks to demystify the 2nd Law as a result of finite precision computation meeting systems developing increasing complexity, like turbulence in slightly viscous flow.  

Physicists confronted with proving the 2nd Law. 



Friday, 8 March 2024

2nd Law vs Friction

Perpetual motion is viewed to be impossible because according to the 2nd Law of Thermodynamics (see recent posts)

  • There is always some friction.     (F)
Is this true? It does not appear to be true in the microscopics of atoms in stable ground state, where electrons apparently move around (if they do) without losing energy, seemingly without friction, see Real Quantum Mechanics.

But is it true in the macroscopics of many atoms or molecules, such as that of fluids? A viscous fluid in motion meets a flat plate boundary with a skin friction depending on the Reynolds number $Re\sim \frac{1}{\nu}$ with $\nu$ the viscosity, as follows based on observation:


We see that the skin friction coefficient $c_f$ in a laminar boundary layer tends to zero as $Re$ tends to infinity (or the viscosity tends to zero), which is supported by basic mathematical analysis. 

We also see that there is a transition from a laminar to a turbulent boundary layer for $Re > 5\times 10^5$, with a larger turbulent skin friction coefficient $c_f\approx 0.002$ decaying only very slowly with increasing $Re$, with no observations of $c_f<0.001$. 
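For orientation, standard textbook flat-plate correlations (Blasius for the laminar case, Schlichting's empirical formula for the turbulent case; these numbers are not taken from the post) show the very different decay rates:

```python
import math

def cf_laminar(Re):
    """Blasius flat-plate correlation: C_f = 1.328 / sqrt(Re)."""
    return 1.328 / math.sqrt(Re)

def cf_turbulent(Re):
    """Schlichting's empirical turbulent flat-plate correlation: C_f = 0.455 / (log10 Re)^2.58."""
    return 0.455 / math.log10(Re) ** 2.58

for Re in (1e6, 1e8, 1e10):
    # Laminar friction decays quickly toward zero; turbulent friction only very slowly.
    print(f"Re = {Re:.0e}:  laminar {cf_laminar(Re):.5f}   turbulent {cf_turbulent(Re):.5f}")
```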

The transition to a turbulent boundary layer is the result of inherent instability of the motion of a fluid with small viscosity, which is supported by mathematical analysis, an instability which cannot be controlled in real life, see Computational Turbulent Incompressible Flow.

We thus get the message from both observation and mathematical analysis that (F) is true as concerns skin friction: skin friction in a real fluid does not tend to zero with increasing $Re$, because there is always transition to a turbulent boundary layer with $c_f>0.001$.  

Laminar skin friction is vanishing in the limit of infinite $Re$, but not turbulent skin friction.

We can thus rationalise the 2nd Law for fluids as the presence of unavoidable skin friction from turbulent motion resulting from uncontrollable instability, which does not tend to zero with increasing $Re$, reflecting the presence of a non-zero limit of turbulent dissipation in the spirit of Kolmogorov.  

Connecting to finite precision computation discussed in recent posts, we understand that computational resolution down to physical scales is not necessary, which makes turbulent flow computable without turbulence modelling. 

In bluff body computations (beyond drag crisis) it is possible to effectively set skin friction to zero as a slip boundary condition thus avoiding having to resolve turbulent boundary layers. In pipe flow skin friction cannot be set to zero.

PS1 A hope of vanishingly small laminar skin friction has been nurtured in the fluid mechanics community, but the required control of instability has proved impossible, as an expression of the 2nd Law. 

PS2 One may ask if motion without friction is possible at all? Here we face the question of what motion is, connecting to wave motion as an illusion. In the Newtonian gravitation of a Platonic ideal world there is no dissipation/friction, but in the real world there is: the Moon is slowly receding from the Earth because of energy loss from tidal motion. What then about the supposed motion of photons at the speed of light, seemingly without energy loss from friction? Is this motion an illusion, with necessary energy loss to receive a light signal? Computational Blackbody Radiation says yes! The 2nd Law can thus be formulated: 
  • There is no motion without friction. Photons in frictionless motion are an illusion.
Do you buy this argument? Can you explain why? Instability? What is a photon in motion without friction? Illusion?

Why is there always some friction?