Re-writing Feynman’s Lectures?

I have a crazy new idea: a complete re-write of Feynman’s Lectures. It would be fun, wouldn’t it? I would follow the same structure—but start with Volume III, of course: the lectures on quantum mechanics. We could even re-use some language—although we’d need to be careful so as to keep Mr. Michael Gottlieb happy, of course. 🙂 What would you think of the following draft Preface, for example?

The special problem we try to get at with these lectures is to maintain the interest of the very enthusiastic and rather smart people trying to understand physics. They have heard a lot about how interesting and exciting physics is—the theory of relativity, quantum mechanics, and other modern ideas—and spend many years studying textbooks or following online courses. Many are discouraged because there are really very few grand, new, modern ideas presented to them. The problem is whether or not we can make a course which would save them by maintaining their enthusiasm.

The lectures here are not in any way meant to be a survey course, but are very serious. I thought it would be best to re-write Feynman’s Lectures to make sure that even the least intelligent would be able to encompass almost everything that is in the lectures. 🙂

This is the link to Feynman’s original Preface, so you can see how it compares. 🙂

Now that sounds like a project, doesn’t it?

Jean Louis Van Belle, 22 May 2020

Post scriptum: It looks like we made Mr. Gottlieb and/or MIT very unhappy already: the link above does not work for us anymore (see what we get below). That’s very good: it is always nice to start a new publishing project with a little controversy. 🙂 We will have to use the good old paper print edition. We recommend you buy one too, by the way. 🙂 I think they are just a bit over US$100 now. Well worth it!

To put the historical record straight, the reader should note we started this blog before Mr. Gottlieb brought Feynman’s Lectures online. We actually wonder why he would be bothered by us referring to it. That’s what classical textbooks are for, aren’t they? They create common references to agree or disagree with, and why put a book online if you apparently don’t want it to be read or discussed? Noise like this probably means I am doing something right here. 🙂

[Screenshot: the message we now get when trying to follow the link above]

Joseph Larmor and the ring current model of an electron

 

Joseph Larmor is surely not among the more famous participants in the Solvay Conferences. He only joined the 1921 Conference, together with Charles Glover Barkla and others, and his one and only substantial intervention there is limited to some remarks and questions following a presentation by H.A. Lorentz on the Theory of Electrons, during which Lorentz highlighted all of the issues with what was then supposed to be our understanding of what an electron actually is (which, in my not-so-humble view, is still pretty much the state of our understanding today).

I find his one intervention (and Lorentz’ reply to it) very interesting though, and so that’s why I am writing about it here. I am not aware of any free online English translations of the proceedings of the Solvay Conferences (nor of any translation of Lorentz’ paper in particular) but you may be luckier than me when googling: if you find it, please do let me know. In the meanwhile, I am happy to freely translate part of Larmor’s rather short intervention after Lorentz’ presentation from French to English:

“I understand that Mr. Lorentz was given the task to give an overview of how electrons behave inside of an atom. That requires an overview of all possible theories of the electron. That is a highly worthwhile endeavor which, in itself, would already justify the holding of this Conference. However, Mr. Lorentz might have paid more attention to the viewpoint that the electron has some structure, and that its representation as a simple distribution of electric charge can only be provisional: electrons explain electricity, but electricity does not explain electrons. However, the description of an electron in terms of a charge distribution is, for the time being, all we can imagine. In the past, we thought of the atom as an indivisible unit – a fundamental building block – and we imagined it as a swirling ring. That idea is gone now, and the electron has now taken the place of the atom as an indestructible unit. All we can know about it, is how it influences other bodies. If this influence is transmitted all across the aether, we need to be able to express the relations between the electron and the aether[1], or its force field in the space that surrounds it. It may have other properties, of course, but physics is the science that should analyze the influence or force of one body upon others.

The question we should raise here is whether or not an electron formed by a perfectly uniform current ring can grab onto the aether in a physical sense, and how it does so if its configuration does not change.” (Joseph Larmor, 1921, boldface and italics added)

Larmor then talks about the (possible) use of the energy-momentum tensor to address the latter question. That is a very technical discussion, which is of no concern to us here. Indeed, the question of how to use tensors to model how an electron would interact with other charges or how it would create an electromagnetic field is, effectively, a rather standard textbook topic now and, in case you’d be interested, you can check my blog on it or, else, (re-)read Chapters 25, 26 and 27 of Feynman’s Lectures on electromagnetism.

What grabbed my attention here was, effectively, not the technicality of the question in regard to the exact machinery of the electromagnetic force or field. It was Larmor’s description of the electron as a perpetual or persistent current ring (the French reference to it is this: un électron formé par un courant annulaire parfaitement uniforme), and his language on it, which indicates he thought of it as a rather obvious and natural idea! Hence, Parson’s 1915 toroidal ring model – the precursor to Schrödinger’s Zitterbewegung model and modern-day ring current models – was apparently pretty well established at the time! In fact, Rutherford’s lecture on the Structure of the Atom at the 1921 Conference further confirms this, as he also talks about Parson’s électron annulaire (ring electron) and the apparent magnetic properties of the electron (I will talk about Rutherford’s 1921 Solvay lecture in my next post).

Larmor’s belief that the electron was not pointlike should, of course, not surprise us in light of his rather famous work on the precession of the magnetic moment of an electron, but I actually wasn’t aware of Joseph Larmor’s own views in regard to its possible reality. I am only guessing here, but his rather strong views on its reality may explain why the scientific committee – which became increasingly dominated by scientists in favor of the Bohr-Heisenberg interpretation of physical reality (basically saying we will never be able to understand it) – did not extend an invitation to Larmor to attend the all-important Solvay conferences that would follow the 1921 Conference and, most notably, the 1927 Conference that split physicists between realists and… Well… Non-realists, I guess. 🙂

Lorentz’ immediate reaction to Larmor mentioning the idea of a swirling ring (in French: un anneau tourbillon), which is part of his reply to Larmor’s remarks, is equally interesting:

“There is a lot to be said for your view that electrons are discontinuities in the aether. […] The energy-momentum formulas that I have developed should apply to all particles, with or without structure. The idea of a rotating ring [in French: anneau tournant] has a great advantage when trying to explain some issues [in the theory of an electron]: it would not emit any electromagnetic radiation. It would only produce a magnetic field in the immediate space that surrounds it. […]” (H.A. Lorentz, 1921, boldface and italics added)

Isn’t that just great? Lorentz’ answer to Larmor’s question surely does not solve all of the problems relating to the interpretation of the electron as a current ring, but it does answer the very basic question which proponents of modern quantum mechanics usually advance when talking about the so-called failure of classical physics: electrons in an atomic orbital should radiate their energy away, but they do not. Let me actually quote from Feynman’s Lectures on Quantum Mechanics here: “Classically, the electrons would radiate light and spiral in until they settle down right on top of the nucleus. That cannot be right.”

Surely You’re Joking, Mr. Feynman! Here is the answer of the classical quantum theorists: superconducting rings of electric current do not radiate their energy out either, do they?

[1] Larmor believed an aether should exist. We will re-quote Robert B. Laughlin here: “The word ‘ether’ has extremely negative connotations in theoretical physics because of its past association with opposition to relativity. This is unfortunate because, stripped of these connotations, it rather nicely captures the way most physicists actually think about the vacuum. […] The modern concept of the vacuum of space, confirmed every day by experiment, is a relativistic ether. But we do not call it this because it is taboo.”

On the concept of the aether, we can also usefully translate part of Lorentz’ answer to Larmor: “As for the aether, even the physicists who still talk about it have stripped the concept of anything it might have in common with matter. I was a believer in an immobile aether myself but I realize that, because of relativity, we cannot talk about any force acting on the aether. However, I still think of the aether as the seat of electromagnetic energy (in French, le siège de l’énergie électromagnétique). Now, we can all think of the components of the energy-momentum tensor like we want, but if we think of some of them being real in some sense, then all of them should be real in the same sense.”

Post scriptum: I should really stop duplicating posts between this and my other blog site on physics. Hence, I beg the readers who want to keep following me to do so on my ideez.org site. I think I’ll devote it to a historical analysis of how useful and not-so-useful ideas in physics have evolved over the past hundred years or so, using the proceedings of the Solvay Conferences as the material for analysis.

The self-appointed science gurus

Sean Carroll recently tweeted this:

[Screenshot of Sean Carroll’s tweet]

I couldn’t help giving him a straight answer. I actually like Sean Carroll, but I hate how he and others – think of John Gribbin, for example – appoint themselves as the only ‘gurus’ who are entitled to say something about grand theories or other ‘big ideas’: everyone else (read: all non-believers in QFT) is casually dismissed as a ‘crackpot scientist’.

In fact, a few weeks earlier he had sent out a tweet promoting his own ideas on the next ‘big ideas’, so I couldn’t help reminding him of the tweet above. 🙂

[Screenshot of Sean Carroll’s earlier tweet]

This is funny, and then it isn’t. The facts are these:

  1. The ‘new physics’ – the quantum revolution – started almost 100 years ago but doesn’t answer many fundamental questions (simply think about explaining spin and other intrinsic properties of matter-particles here).
  2. Geniuses like Einstein, Lorentz, Dirac and even Bell had serious doubts about the approach.
  3. Historical research shows theories and scientists were severely biased: see Dr. Consa’s review of quantum field theory in this regard.

I am very sorry, Dr. Carroll. You are much smarter than most – and surely much smarter than me – but here you show you are also plain arrogant. :-/ It’s this arrogance that has prevented a creative way out of the mess that fundamental physics finds itself in today. If you find yourself in a hole, stop digging!

The last words of H.A. Lorentz

I talked about the Solvay Conferences in my previous post(s). The Solvay Conference proceedings are a real treasure trove. Not only are they very pleasant to read, but they also debunk more than one myth or mystery in quantum physics!

It is part of scientific lore, for example, that the 1927 Solvay Conference was a sort of battlefield on the new physics between Heisenberg and Einstein. Surprisingly, the papers and the write-ups of the discussions reveal that Einstein hardly intervened. They also reveal that ‘battlefield stories’ such as Heisenberg telling Einstein to “stop telling God what to do” or – vice versa – Einstein declaring “God doesn’t play dice” are what they are: plain gossip or popular hearsay. Neither Heisenberg nor Einstein ever said that—or not on the occasion of the 1927 Solvay Conference, at least! Instead, we see very nuanced and very deep philosophical statements—on both sides of the so-called ‘divide’ or ‘schism’.

From all the interventions, that of the Dutch scientist Hendrik Antoon Lorentz stands out. The original is in French and I know (most of) my readers don’t get French, so I have freely translated it into English below.

It is all very weird, emotional and historical. H.A. Lorentz – clearly the driving force behind those pre-WW II Solvay Conferences – died a few months after the 1927 Conference. In fact, the 1927 conference proceedings include both the sad announcement of his demise as well as his interventions—such was the practice of actually physically printing stuff at the time.

Here it is (the original is, of course, in French):

GENERAL DISCUSSION OF THE NEW IDEAS PRESENTED.

Causality, Determinism, Probability.

Intervention by Mr. Lorentz:

“I would like to draw attention to the difficulties one encounters in the old theories. We want to form a representation of the phenomena, to form an image of them in our mind. Until now, we have always wanted to form these images by means of the ordinary notions of space and time. These notions are perhaps innate; in any case, they have developed through our personal experience, through our daily observations. For me, these notions are clear, and I confess that I cannot form any idea of physics without them. The image I want to form of the phenomena must be absolutely sharp and definite, and it seems to me that we can only form such an image within this system of space and time.

For me, an electron is a corpuscle which, at a given instant, is at a definite point in space, and if I have the idea that at a following moment this corpuscle is somewhere else, I must think of its trajectory, which is a line in space. And if this electron meets an atom and penetrates it, and if, after several adventures, it leaves that atom, I forge myself a theory in which this electron preserves its individuality; that is to say, I imagine a line along which this electron passes through that atom. It may well be, obviously, that this theory would be very difficult to develop, but a priori it does not seem impossible to me.

I imagine that, in the new theory, we still have these electrons. It is possible, of course, that in the new theory, once it is well developed, it will be necessary to suppose that these electrons undergo transformations. I am quite willing to admit that the electron dissolves into a cloud. But then I would try to find out on what occasion this transformation occurs. If one wanted to forbid me such an inquiry by invoking a principle, that would bother me very much. It seems to me that one may always hope to do later what we cannot yet do at this moment. Even if we abandon the old ideas, we can always keep the old names. I would like to preserve this ideal of earlier times: to describe everything that happens in the world by means of sharp images. I am ready to accept other theories, on condition that one can translate them into clear and sharp images.

For my part, although I am not yet familiar with the new ideas that I now hear expressed, I could picture these ideas as follows. Take the case of an electron that meets an atom; suppose the electron leaves the atom and that, at the same time, a light quantum is emitted. One must consider, in the first place, the wave systems that correspond to the electron and to the atom before the collision. After the collision, we will have new wave systems. These wave systems can be described by a function ψ defined in a space with a great number of dimensions, which satisfies a differential equation. The new wave mechanics will operate with this equation and will establish the function ψ before and after the collision.

Now, there are phenomena which teach us that there is something else besides these waves, namely corpuscles; one can, for example, carry out an experiment with a Faraday cylinder; we must therefore take into account the individuality of the electrons and also of the photons. I think I would find that, in order to explain the phenomena, it suffices to assume that the expression ψψ* gives the probability that these electrons and photons exist in a given volume; that would be enough for me to explain the experiments.

But the examples given by Mr. Heisenberg teach me that I would thereby have reached everything that experiment allows me to reach. Now, I think that this notion of probability should come at the end, and as a conclusion, of the theoretical considerations, and not as an a priori axiom, though I am quite willing to admit that this indeterminacy corresponds to the experimental possibilities. I could always keep my deterministic faith for the fundamental phenomena, of which I have not spoken. Could a deeper mind not be aware of the motions of these electrons? Could one not keep determinism by making it the object of a belief? Must one necessarily elevate indeterminism to a principle?”

The emphasis, for me, is on that last question. Freely paraphrased, it amounts to this:

Why should we elevate determinism or – as Born and Heisenberg do – its opposite (indeterminism) to a philosophical principle?

What a beautiful statement! Lorentz died of a very trivial cause: erysipelas, commonly known as St Anthony’s fire. :-/

Where things went wrong, exactly!

As mentioned in my previous post, Oliver Consa traces all of the nonsense in modern physics back to the Shelter Island (1947), Pocono (1948) and Oldstone (1949) Conferences. However, the first Solvay Conference that was organized after WW II was quite significant too. Niels Bohr and Robert Oppenheimer pretty much dominated it: Bohr did so by providing the introductory lecture ‘On the Notions of Causality and Complementarity’, while Oppenheimer’s ‘Electron Theory’ set the tone for subsequent Solvay Conferences—most notably the one that would consecrate quantum field theory (QFT), which was held 13 years later (1961).

Indeed, the discussion between Oppenheimer and Dirac on the ‘Electron Theory’ paper in 1948 seems to be where things might have gone wrong—in terms of the ‘genealogy’ or ‘archaeology’ of modern ideas, so to speak. In fact, both Oppenheimer and Dirac make rather historic blunders there:

  1. Oppenheimer uses perturbation theory to arrive at some kind of ‘new’ model of an electron, based on Schwinger’s new QFT models—which, as we now know, do not really lead anywhere.
  2. Dirac, however, is just too stubborn: he simply keeps defending his indefensible electron equation—which, of course, also doesn’t lead anywhere. [It is rather significant that he was not invited to the next Solvay Conference.]

It is, indeed, very weird that Dirac does not follow through on his own conclusion: “Only a small part of the wave function has a physical meaning. We now have the problem of picking out that very small physical part of the exact solution of the wave equation.”

That very small physical part is, of course, the ring current or Zitterbewegung electron: the one trivial solution he thought was so significant in his 1933 Nobel Prize lecture. The other parts of the solution are, effectively, bizarre oscillations, which he refers to as ‘run-away electrons’.

It’s nice to sort of ‘get’ this. 🙂

Tracing good and bad ideas

Today I decided to look for the original Solvay Conference papers, which were digitized by the libraries of the Free University of Brussels: here is the link to them. I quickly went through the famous 1927 and 1930 Conferences (Einstein did not attend the 1933 Conference, nor the 1921 Conference) but, to my great consternation, there is no trace of those so-called ‘heated discussions’ between Heisenberg and Einstein.

A few critical questions here and there, yes, but I don’t see anything even vaguely resembling an ‘ardent debate’ or a so-called ‘Bohr-Einstein controversy’. Am I mistaken—or am I missing something?

The fact that it’s all in French is quite interesting, and may explain why Einstein’s interventions are rare (I am not sure of the language that was used: the physicists then were multi-lingual, weren’t they?). The remarks of the French physicist Léon Brillouin, for example, are quite interesting but not widely known, it seems.

Funny remarks like Heisenberg telling Einstein ‘to stop telling God what to do’ are surely not there! Are they folklore? Would anyone know whether these remarks are documented somewhere? I am just trying to trace those historical moments in the evolution of thought and science… 🙂

Things like this make me think a great deal of the ‘controversy’ between old (classical) and new (quantum) physics is actually just hype rather than reality. One of my readers sent me this link to a very interesting article in the LA Times in this regard. It’s a quick but very worthwhile read, showing it’s not only physics that suffers from ‘the need to sell’ real or non-existent results: here is the link—have a look!

In fact, I realize I am still looking for some kind of purpose for my new site. Perhaps I should dedicate it to research like this—separating fact from fiction in the history of ideas?

PS: I just checked the Wikipedia article on Heisenberg’s quotes and it seems Heisenberg’s “stop telling God what to do” is, effectively, disputed! Interesting but – in light of its frequent use – also quite shocking, I would think.

PS 2: I jotted down the following based on a very quick scan of these Solvay Conferences:

Dr. Oliver Consa starts his scathing history of the sorry state of modern-day physics as follows:

“After the end of World War II, American physicists organized a series of three transcendent conferences for the development of modern physics: Shelter Island (1947), Pocono (1948) and Oldstone (1949). These conferences were intended to be a continuation of the mythical Solvay conferences. But, after World War II, the world had changed. The launch of the atomic bombs in Hiroshima and Nagasaki (1945), followed by the immediate surrender of Japan, made the Manhattan Project scientists true war heroes. Physicists were no longer a group of harmless intellectuals; they had become the powerful holders of the secrets of the atomic bomb.”[1]

Secrets that could not be kept, of course. The gatekeepers did their best, however. Julius Robert Oppenheimer was, effectively, one of them. The history of Oppenheimer – father of the atomic bomb and prominent pacifist at the same time – is well known.

It is actually quite interesting to note that the Solvay Conferences continued after WW II and that Niels Bohr and Robert Oppenheimer pretty much dominated the very first post-WW II Solvay Conference, which was held in 1948. Bohr did so by providing the introductory lecture ‘On the Notions of Causality and Complementarity’[2], while Oppenheimer’s ‘Electron Theory’ set the tone for subsequent Solvay Conferences—most notably the one that would consecrate quantum field theory (QFT), which was held 13 years later (1961).[3]

Significantly, Paul Dirac is pretty much the only one asking Oppenheimer critical questions. As for Albert Einstein, I find it rather strange that – despite him being a member of the scientific committee[4] – he hardly intervenes in the discussions. It makes me think he had actually lost interest in the development of quantum theory.

Even more significant is the fact that Dirac was not invited to the 1951 Solvay Conference, nor even mentioned in its proceedings.

[1] Oliver Consa, Something is rotten in the state of QED, February 2020.

[2] See the 1948 Solvay Conference report on the ULB’s digital archives.

[3] Institut international de physique Solvay (1962). La théorie quantique des champs: douzième Conseil de physique, tenu à l’Université libre de Bruxelles du 9 au 14 octobre 1961.

[4] Einstein was a member of the Solvay scientific committee from the very first conference (1911) – representing, in typical style, a country (Austria, not Germany) rather than an institution or just being a member in some personal capacity – till 1948. He was not a member of the 1951 scientific committee. The reason might well be age or a lack of interest, of course: Einstein was 72 years old in 1951, and would die four years later (1955).

The difference between a theory and an explanation

That’s a weird title, isn’t it? It’s the title of a fun paper (fun for me, at least—I hope for you too, of course), in which I try to show where quantum mechanics went wrong, and why and when the job of both the academic physicist and the would-be student of quantum mechanics turned into calculating rather than explaining what might or might not be happening.

Modern quantum physicists are, effectively, like economists modeling input-output relations: if they are lucky, they get some kind of mathematical description of what goes in and what goes out of a process or an interaction, but the math doesn’t tell them how stuff actually happens.

So this paper of ours talks about that—in a very detailed way, actually—and then we bring the Zitterbewegung electron model and our photon model together to provide a classical explanation of Compton scattering of photons by electrons so as to show what electron-photon interference might actually be: two electromagnetic oscillations interfering (classically) with each other.

The whole thing also offers some reflections on the nature of the Uncertainty Principle.

Here is the link on the academia.edu site! In case you do not have an academia.edu account, here’s the link to the paper on Phil Gibbs’ alternative science site.

Enjoy! 🙂 When everything is said and done, the mystery of quantum mechanics is this: why is an electron an electron, and why is a proton a proton? 🙂

PS: I am sure you think my last statement is nonsensical. If so, I invite you to think again. Whoever can explain the electron-proton mass ratio will be able to explain the difference between the electromagnetic and strong force. In other words, he or she will be able to connect the electromagnetic and the strong ‘sector’ of a classical interpretation of quantum mechanics. 🙂

Explaining the Lamb shift in classical terms

The corona-virus is bad, but it does have one advantage: more time to work on my hobby! I finally managed to have a look at what the (in)famous Lamb shift may or may not be. Here is the link to the paper.

I think it’s good. Why? Well… It’s that other so-called ‘high-precision test’ of mainstream quantum mechanics (read: quantum field theory), but I found it’s just like the rest: ‘Cargo Cult Science.’ [I must acknowledge a fellow amateur physicist and blogger for that reference: it is, apparently, a term coined by Richard Feynman!]

To All: Enjoy and please keep up the good work in these very challenging times!

🙂

Mainstream QM: A Bright Shining Lie

Last night, I got this email from a very bright young physicist: Dr. Oliver Consa. He is someone who – unlike me – does have the required PhD credentials in physics (I only have a drs. title in economics) – and the patience that goes with it – to make some more authoritative statements in the weird world of quantum mechanics. I recommend you click the link in the email (copied below) and read the paper. Please do it!

It is just 12 pages, and it is all extremely revealing. Very discomforting, actually, in light of all the other revelations on fake news in other spheres of life.

Many of us – and, here, I just refer to those who are reading my posts – sort of suspected that some ‘inner circle’ in the academic circuit had cooked things up: the Mystery Wallahs, as I refer to them now. Dr. Consa’s paper shows our suspicion is well-founded.

QUOTE

Dear fellow scientist,

I send you this mail because you have been skeptical about Foundations of Physics. I think that this new paper will be of your interest. Feel free to share it with your colleagues or publish it on the web. I consider it important that this paper serves to open a public debate on this subject.

Something is Rotten in the State of QED
https://vixra.org/pdf/2002.0011v1.pdf

Abstract
“Quantum electrodynamics (QED) is considered the most accurate theory in the history of science. However, this precision is based on a single experimental value: the anomalous magnetic moment of the electron (g-factor). An examination of QED history reveals that this value was obtained using illegitimate mathematical traps, manipulations and tricks. These traps included the fraud of Kroll & Karplus, who acknowledged that they lied in their presentation of the most relevant calculation in QED history. As we will demonstrate in this paper, the Kroll & Karplus scandal was not a unique event. Instead, the scandal represented the fraudulent manner in which physics has been conducted from the creation of QED through today.”  (12 pag.)

Best Regards,
Oliver Consa
oliver.consa@gmail.com

UNQUOTE

A theory of matter-particles

Pre-scriptum (PS), added on 6 March 2020: The ideas below also naturally lead to a theory about what a neutrino might actually be. As such, it’s a complete ‘alternative’ Theory of Everything. I uploaded the basics of such a theory on my academia.edu site. For those who do not want to log on to academia.edu, you can also find the paper on my author’s page on Phil Gibbs’ site.

Text:

We were rather tame in our last paper on the oscillator model of an electron. We basically took some philosophical distance from it by stating we should probably only think of it as a mathematical equivalent to Hestenes’ concept of the electron as a superconducting loop. However, deep inside, we feel we should not be invoking Maxwell’s laws of electrodynamics to explain what a proton and an electron might actually be. The basics of the ring current model can be summed up in one simple equation:

c = a·ω

This is the formula for the tangential velocity. Einstein’s mass-energy equivalence relation and the Planck-Einstein relation explain everything else[1], as evidenced by the fact that we can immediately derive the Compton radius of an electron from these three equations, as shown below:

a = c/ω = c·ħ/E = c·ħ/(m·c²) = ħ/(m·c) ≈ 0.386 pm

The reader might think we are just ‘casually connecting formulas’ here[2], but we feel we have a full-blown theory of the electron here: simple and consistent. The geometry of the model is visualized below. We think of an electron (and a proton) as consisting of a pointlike elementary charge – pointlike but not dimensionless[3] – moving about at (nearly) the speed of light around the center of its motion.

[Figure: the geometry of the ring current (Zitterbewegung) model]

The relation works perfectly well for the electron. However, when applying the a = ħ/mc radius formula to a proton, we get a value which is about 1/4 of the measured proton radius: about 0.21 fm, as opposed to the 0.83-0.84 fm charge radius which was established by Professors Pohl, Gasparian and others over the past decade.[4] In our papers on the proton radius[5], we motivated the 1/4 factor by referring to the energy equipartition theorem and assuming energy is, somehow, equally split over the electromagnetic field energy and the kinetic energy in the motion of the zbw charge. However, the reader must have had the same feeling as we had: these assumptions are rather ad hoc. We, therefore, propose something more radical:

When considering systems (e.g. electron orbitals) and excited states of particles, angular momentum comes in units (nearly) equal to ħ, but when considering the internal structure of elementary particles, (orbital) angular momentum comes in an integer fraction of ħ. This fraction is 1/2 for the electron[6] and 1/4 for the proton.
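A quick numerical aside first. The little Python sketch below is mine (it is not part of the original papers): it just checks the radius numbers quoted above – a = ħ/mc for the electron and the proton – against the measured proton charge radius (about 0.84 fm), using rounded CODATA values.

```python
# Quick numerical check of a = ħ/(m·c) for the electron and the proton.
hbar = 1.054571817e-34    # reduced Planck constant (J·s)
c    = 2.99792458e8       # speed of light (m/s)
m_e  = 9.1093837015e-31   # electron mass (kg)
m_p  = 1.67262192369e-27  # proton mass (kg)
r_p_measured = 0.84e-15   # measured proton charge radius (m), approximate

a_e = hbar / (m_e * c)    # Compton radius of the electron
a_p = hbar / (m_p * c)    # same formula applied to the proton

print(f"a_e = {a_e:.3e} m (about 0.386 pm)")
print(f"a_p = {a_p:.3e} m (about 0.21 fm)")
print(f"measured radius / a_p = {r_p_measured / a_p:.1f}")  # roughly 4
```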

Let us write this out for the proton radius:

[Formula: the proton radius, written out using the 1/4 fraction of ħ]

What are the implications for the assumed centripetal force keeping the elementary charge in motion? The centripetal acceleration is equal to a_c = v_t²/a = a·ω². It is probably useful to remind ourselves how we get this result, so as to make sure our calculations are relativistically correct. The position vector r (which describes the position of the zbw charge) has a horizontal and a vertical component: x = a·cos(ωt) and y = a·sin(ωt). We can now calculate the two components of the (tangential) velocity vector v = dr/dt as v_x = −a·ω·sin(ωt) and v_y = a·ω·cos(ωt) and, in the next step, the components of the (centripetal) acceleration vector a_c: a_x = −a·ω²·cos(ωt) and a_y = −a·ω²·sin(ωt). The magnitude of this vector is then calculated as follows:

a_c² = a_x² + a_y² = a²·ω⁴·cos²(ωt) + a²·ω⁴·sin²(ωt) = a²·ω⁴ ⇒ a_c = a·ω² = v_t²/a
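For what it is worth, this little derivation can also be verified symbolically. The check below (using Python’s sympy package) is just an aside of mine, not part of the original argument:

```python
# Symbolic check that the centripetal acceleration has magnitude a·ω² = v_t²/a.
import sympy as sp

a, w, t = sp.symbols('a omega t', positive=True)
x = a * sp.cos(w * t)          # horizontal component of the position vector
y = a * sp.sin(w * t)          # vertical component of the position vector

vx, vy = sp.diff(x, t), sp.diff(y, t)    # velocity components
ax, ay = sp.diff(vx, t), sp.diff(vy, t)  # acceleration components

a_c = sp.sqrt(sp.trigsimp(ax**2 + ay**2))  # magnitude of the acceleration
v_t = sp.sqrt(sp.trigsimp(vx**2 + vy**2))  # magnitude of the tangential velocity

print(a_c)                      # prints a*omega**2
print(sp.simplify(v_t**2 / a))  # prints a*omega**2 as well
```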

Now, Newton’s force law tells us that the magnitude of the centripetal force will be equal to:

F = m_γ·a_c = m_γ·a·ω²

The m_γ factor is, once again, the effective mass of the zbw charge as it zitters around the center of its motion at (nearly) the speed of light: it is half the electron mass.[7] If we denote the centripetal force inside the electron as F_e, we can relate it to the electron mass m_e as follows:

F_e = m_γ·a·ω² = (m_e/2)·(ħ/(m_e·c))·(m_e·c²/ħ)² = m_e²·c³/(2ħ)

Assuming our logic in regard to the effective mass of the zbw charge inside a proton is also valid – and using the 4E = ħω and a = ħ/4mc relations – we get the following equation for the centripetal force inside of a proton:
[Formula: the centripetal force inside the proton]

How should we think of this? In our oscillator model, we think of the centripetal force as a restoring force. This force depends linearly on the displacement from the center, and the (linear) proportionality constant is usually written as k. Hence, we can write F_e and F_p as F_e = −k_e·x and F_p = −k_p·x respectively. Taking the ratio of both, so as to have an idea of the respective strength of both forces, we get this:

F_p/F_e = k_p/k_e = (m_γ,p·a_p)/(m_γ,e·a_e)

The a_p and a_e here are the centripetal accelerations – not the radii. The equation above seems to tell us that the centripetal force inside of a proton gives the zbw charge inside – which is nothing but the elementary charge, of course – an acceleration that is four times what might be going on inside the electron.
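To get a feel for the magnitudes involved, here is a quick numerical evaluation of the electron formula above – F_e = m_γ·a·ω², with m_γ = m_e/2, a = ħ/(m_e·c) and ω = m_e·c²/ħ. The number (about 0.1 N, which is enormous at this scale) is just my own back-of-the-envelope check; it is not a value quoted in the text:

```python
# Numerical evaluation of the centripetal force inside the electron:
# F_e = m_γ·a·ω², with m_γ = m_e/2, a = ħ/(m_e·c) and ω = m_e·c²/ħ.
hbar = 1.054571817e-34   # J·s
c    = 2.99792458e8      # m/s
m_e  = 9.1093837015e-31  # kg

m_gamma = m_e / 2              # effective mass of the zbw charge
a       = hbar / (m_e * c)     # Compton radius of the electron
omega   = m_e * c**2 / hbar    # Planck-Einstein frequency (E = ħ·ω)

F_e = m_gamma * a * omega**2
print(f"F_e = {F_e:.3f} N")    # about 0.106 N
print(f"m_e²·c³/(2ħ) = {m_e**2 * c**3 / (2 * hbar):.3f} N")  # same number
```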

Nice, but how meaningful are these relations, really? If we think of the centripetal or restoring force as modeling some elasticity of spacetime – the gut intuition behind far more complicated string theories of matter – then we may think of distinguishing between a fundamental frequency and higher-level harmonics or overtones.[8] We will leave our reflections at that for the time being.

We should add one more note, however. We only talked about the electron and the proton here. What about other particles, such as neutrons or mesons? We do not consider these to be elementary because they are not stable: we think they are not stable because the Planck-Einstein relation is slightly off for them, which causes them to disintegrate into what we’ve been trying to model here: stable stuff. As for the process of their disintegration, we think the approach that was taken by Gell-Mann and others[9] is not productive: inventing new quantities that are supposedly being conserved – such as strangeness – is… Well… As strange as it sounds. We, therefore, think the concept of quarks confuses rather than illuminates the search for a truthful theory of matter.

Jean Louis Van Belle, 6 March 2020

[1] In this paper, we make abstraction of the anomaly, which is related to the zbw charge having a (tiny) spatial dimension.

[2] We had a signed contract with the IOP and WSP scientific publishing houses for our manuscript on a realist interpretation of quantum mechanics (https://vixra.org/abs/1901.0105), which was shot down by this simple comment. We have basically stopped trying to convince mainstream academics from that point onwards.

[3] See footnote 1.

[4] See our paper on the proton radius (https://vixra.org/abs/2002.0160).

[5] See reference above.

[6] The reader may wonder why we did not present the ½ fraction in the first set of equations (the calculation of the electron radius). We refer him or her to our previous paper on the effective mass of the zbw charge (https://vixra.org/abs/2003.0094). The 1/2 factor appears when considering orbital angular momentum only.

[7] The reader may not be familiar with the concept of the effective mass of an electron but it pops up very naturally in the quantum-mechanical analysis of the linear motion of electrons. Feynman, for example, gets the equation out of a quantum-mechanical analysis of how an electron could move along a line of atoms in a crystal lattice. See: Feynman’s Lectures, Vol. III, Chapter 16: The Dependence of Amplitudes on Position (https://www.feynmanlectures.caltech.edu/III_16.html). We think of the effective mass of the electron as the relativistic mass of the zbw charge as it whizzes about at nearly the speed of light. The rest mass of the zbw charge itself is close to – but also not quite equal to – zero. Indeed, based on the measured anomalous magnetic moment, we calculated the rest mass of the zbw charge as being equal to about 3.4% of the electron rest mass (https://vixra.org/abs/2002.0315).

[8] For a basic introduction, see my blog posts on modes or on music and physics (e.g. https://readingfeynman.org/2015/08/08/modes-and-music/).

[9] See, for example, the analysis of kaons (K-mesons) in Feynman’s Lectures, Vol. III, Chapter 11, section 5 (https://www.feynmanlectures.caltech.edu/III_11.html#Ch11-S5).

Mr. Feynman and boson-fermion theory

I’ve been through chapter 4 of Feynman’s Lectures on Quantum Mechanics (the chapter on identical particles) at least a dozen times now—probably more. This and the following chapters spell out the mathematical framework and foundations of mainstream quantum mechanics: the grand distinction between fermions and bosons, symmetric and asymmetric wavefunctions, Bose-Einstein versus Maxwell-Boltzmann statistics, and whatever else comes out of that—including the weird idea that (force) fields should also come in lumps (think of quantum field theory here). These ‘field lumps’ are then thought of as ‘virtual’ particles that, somehow, ‘mediate’ the force.

The idea that (kinetic and/or potential) energy and (linear and/or angular) momentum are being continually transferred – somehow, and all over space – by these ‘messenger’ particles sounds like medieval philosophy to me. However, to be fair, Feynman does not actually present these more advanced medieval ideas in his Lectures on Quantum Mechanics. I have always found that somewhat strange: he was about to receive a Nobel Prize for his path integral formulation of quantum mechanics and other contributions to what has now become the mainstream interpretation of quantum mechanics, so why wouldn’t he talk about it to the students for whom he wrote these lectures? In contrast, he does include a preview of Gell-Mann’s quark theory, although he does say – in a footnote – that “the material of this section is longer and harder than is appropriate at this point” and he, therefore, suggests to skip it and move to the next chapter.

[As for the path integral formulation of QM, I would think the mere fact that we have three alternative formulations of QM (matrix, wave-mechanical and path integral) would be sufficient to suspect there’s something wrong with these theories: reality is one, so we should have one unique (mathematical) description of it.]

In any case, I am probably doing too much Hineininterpretierung here. Let us return to the basic stuff that Feynman wanted his students to accept as a truthful description of reality: two kinds of statistics. Two different ways of interaction. Two kinds of particles. That’s what post-WW II gurus such as Feynman – all very much inspired by the ‘Club of Copenhagen’—also known as the ‘Solvay Conference Club’ – want us to believe: interactions with ‘Bose particles’ – this is the term Feynman uses in this text of 1963 – involve adding amplitudes with a + (plus) sign. In contrast, interactions between ‘Fermi particles’ involve a minus (−) sign when ‘adding’ the amplitudes.

The confusion starts early on: Feynman makes it clear he actually talks about the amplitude for an event to happen or not. There are two possibilities: two ‘identical’ particles either get ‘swapped’ after the collision or, else, they don’t. However, in the next sections of this chapter – where he ‘proves’ or ‘explains’ the principle of Bose condensation for bosons and then the Pauli exclusion principle for fermions – it is very clear the amplitudes are actually associated with the particles themselves.

So his argument starts rather messily—conceptually, that is. Feynman also conveniently skips the most basic ontological or epistemological question here: how would a particle ‘know‘ how to choose between this or that kind of statistics? In other words, how does it know it should pick the plus or the minus sign when combining its amplitude with the amplitude of the other particle? It makes one think of Feynman’s story of the Martian in his Lecture on symmetries in Nature: what handshake are we going to do here? Left or right? And who sticks out his hand first? The Martian or the Earthian? A diplomat would ask: who has precedence when the two particles meet?

The question also relates to the nature of the wavefunction: if it doesn’t describe anything real, then where is it? In our mind only? But if it’s in our mind only, how come we get real-life probabilities out of it, and real-life energy levels, or real-life momenta, etcetera? The core question (physical, epistemological, philosophical, esoterical or whatever you’d want to label it) is this: what’s the connection between these concepts and whatever it is that we are trying to describe? The only answer mainstream physicists can provide here is blabber. That’s why the mainstream interpretation of physics may be acceptable to physicists, but not to the general public. That’s why the debate continues to rage: no one believes the Standard Model. Full stop. The intuition of the masses here is very basic and, therefore, probably correct: if you cannot explain something in clear and unambiguous terms, then you probably do not understand it.

Hence, I suspect mainstream academic physicists probably do not understand whatever it is they are talking about. Feynman, by the way, admitted as much when writing – in the very first lines of the introduction to his Lectures on Quantum Mechanics – that “even the experts do not understand it the way they would like to.”

I am actually appalled by all of this. Worse, I am close to simply stopping talking or writing about it altogether. I have only kept going because a handful of readers send me a message of sympathy from time to time. I then feel I am actually not alone in what often feels like a lonely search for what a friend of mine refers to as ‘a basic version of truth.’ I realize I am getting a bit emotional here – or should I say: upset? – so let us get back to Feynman’s argument again.

Feynman starts by introducing the idea of a ‘particle’—a concept he does not define – not at all, really – but, as the story unfolds, we understand this concept somehow combines the idea of a boson and a fermion. He doesn’t motivate why he feels like he should lump photons and electrons together in some more general category, which he labels as ‘particles’. Personally, I really do not see the need to do that: I am fine with thinking of a photon as an electromagnetic oscillation (a traveling field, that is), and of electrons, protons, neutrons and whatever composite particle out there that is some combination of the latter as matter-particles. Matter-particles carry charge: electric charge and – who knows – perhaps some strong charge too. Photons don’t. So they’re different. Full stop. Why do we want to label everything out there as a ‘particle’?

Indeed, when everything is said and done, there is no definition of fermions and bosons beyond this magical spin-1/2 and spin-1 property. That property is something we cannot measure: we can only measure the magnetic moment of a particle, and any assumption on its angular momentum assumes you know the mass (or energy) distribution of the particle. To put it more plainly: do you think of a particle as a sphere, a disk, or what? Mainstream physicists will tell you that you shouldn’t think that way: particles are just pointlike. They have no dimension whatsoever – in their mathematical models, that is – because all that experimentalists actually measure are scattering or charge radii, and these show that the assumption of an electron or a proton being pointlike is plain nonsensical.

Needless to say, besides the perfect scattering angle, Feynman also assumes his ‘particles’ have no spatial dimension whatsoever: he’s just thinking in terms of mathematical lines and points—in terms of mathematical limits, not in terms of the physicality of the situation.

Hence, Feynman just buries us under a bunch of tautologies here: weird words are used interchangeably without explaining what they actually mean. In everyday language and conversation, we’d think of that as ‘babble’. The only difference between physicists and us commoners is that physicists babble using mathematical language.

[…]

I am digressing again. Let us get back to Feynman’s argument. So he tells us we should just accept this theoretical ‘particle’, which he doesn’t define: he just thinks about two of these discrete ‘things’ going into some ‘exchange’ or ‘interaction’ and then coming out of it and going into one of the two detectors. The question he seeks to answer is this: can we still distinguish what is what after the ‘interaction’?

The level of abstraction here is mind-boggling. Sadly, it is actually worse than that: it is also completely random. Indeed, the only property of this mystical ‘particle’ in this equally mystical thought experiment of Mr. Feynman is that it scatters elastically with some other particle. However, that ‘other’ particle is ‘of the same kind’—so it also has no other property than that it scatters equally elastically from the first particle. Hence, I would think the question of whether the two particles are identical or not is philosophically empty.

To be rude, I actually wonder what Mr. Feynman is talking about here. Every other line in the argument triggers another question. One should also note, for example, that this elastic scattering happens at a perfect angle: the whole argument of adding or subtracting amplitudes effectively depends on the idea of a perfectly measurable angle here. So where is the Uncertainty Principle here, Mr. Feynman? It all makes me think that Mr. Feynman’s seminal lecture may well be the perfect example of what Prof. Dr. John P. Ralston wrote about his own profession:

“Quantum mechanics is the only subject in physics where teachers traditionally present haywire axioms they don’t really believe, and regularly violate in research.” (1)

Let us continue exposing Mr. Feynman’s argument. After this introduction of this ‘particle’ and the set-up with the detectors and other preconditions, we then get two or three paragraphs of weird abstract reasoning. Please don’t get me wrong: I am not saying the reasoning is difficult (it is not, actually): it is just weird and abstract because it uses complex-number logic. Hence, Feynman implicitly requests the reader to believe that complex numbers adequately describe whatever it is that he is thinking of (I hope – but I am not so sure – he was trying to describe reality). In fact, this is the one point I’d agree with him on: I do believe Euler’s function adequately describes the reality of both photons and electrons (see our photon and electron models), but then I also think +i and −i are two very different things. Feynman doesn’t, clearly.

It is, in fact, very hard to challenge Feynman’s weird abstract reasoning here because it all appears to be mathematically consistent—and it is, up to the point of the tricky physical meaning of the imaginary unit: Feynman conveniently forgets the imaginary unit represents a rotation of 90 degrees and that we, therefore, need to distinguish between the two directions of rotation so as to include the idea of spin. However, that is my interpretation of the wavefunction, of course, and I cannot use it against Mr. Feynman’s interpretation because his and mine are equally subjective. One can, therefore, only credibly challenge Mr. Feynman’s argument by pointing out what I am trying to point out here: the basic concepts don’t make any sense—none at all!

Indeed, if I were a student of Mr. Feynman, I would have asked him questions like this:

“Mr. Feynman, I understand your thought experiment applies to electrons as well as to photons. In fact, the argument is all about the difference between these two very different ‘types’ of ‘particles’. Can you please tell us how you’d imagine two photons scattering off each other elastically? Photons just pile on top of each other, don’t they? In fact, that’s what you prove next. So they don’t scatter off each other, do they? Your thought experiment, therefore, seems to apply to fermions only. Hence, it would seem we should not use it to derive properties for bosons, should we?”

“Mr. Feynman, how should an electron (a fermion – so you say we should ‘add’ amplitudes using a minus sign) ‘think’ about what sign to use for interaction when a photon is going to hit it? A photon is a boson – so its sign for exchange is positive – so should we have an ‘exchange’ or ‘interaction’ with the plus or the minus sign then? More generally, who takes the ‘decisions’ here? Do we expect God – or Maxwell’s demon – to be involved in every single quantum-mechanical event?”

Of course, Mr. Feynman might have had trouble answering the first question, but he probably would not hesitate to produce some kind of rubbish answer to the second: “Mr. Van Belle, we are thinking of identical particles here. Particles of the same kind, if you understand what I mean.”

Of course, I obviously don’t understand what he means, but I can’t tell him that. So I’d just ask the next logical question to try to corner him:

“Of course, Mr. Feynman. Identical particles. Yes. So, when thinking of fermion-on-fermion scattering, what mechanism do you have in mind? At the very least, we should be mindful of the difference between Compton versus Thomson scattering, shouldn’t we? How does your ‘elastic’ scattering relate to these two very different types of scattering? What is your theoretical interaction mechanism here?”

I can actually think of some more questions, but I’ll leave it at this. Well… No… Let me add another one:

“Mr. Feynman, this theory of interaction between ‘identical’ or ‘like’ particles (fermions and bosons) looks great but, in reality, we will also have non-identical particles interacting with each other—or, more generally speaking, particles that are not ‘of the same kind’. To be very specific, reality sees many electrons and many photons interacting with each other—not just once, at the occasion of some elastic collision, but all of the time, really. So could we, perhaps, generalize this to some kind of ‘three- or n-particle problem’?”

This sounds like a very weird question, which even Mr. Feynman might not immediately understand. So, if he didn’t shut me up already, he may have asked me to elaborate: “What do you mean, Mr. Van Belle? What kind of three- or n-particle problem are you talking about?” I guess I’d say something like this:

“Well… Already in classical physics, we do not have an analytical solution for the ‘three-body problem’, but at least we have the equations. So we have the underlying mechanism. What are the equations here? I don’t see any. Let us suppose we have three particles colliding or scattering or interacting or whatever it is we are trying to think of. How does any of the three particles know what the other two particles are going to be: a boson or a fermion? And what sign should they then use for the interaction? In fact, I understand you are talking amplitudes of events here. If three particles collide, how many events do you count: one, two, three, or six?”

One, two, three or six? Yes. Do we think of the interaction between three particles as one event, or do we split it up as a triangular thing? Or is it one particle interacting, somehow, with the two others, in which case we have two events, taking into account this weird plus-or-minus sign rule for interaction?

Crazy? Yes. Of course. But the questions are logical, aren’t they? I can think of some more. Here is one that, in my not-so-humble view, shows how empty these discussions on the theoretical properties of theoretical bosons and theoretical fermions actually are:

“Mr. Feynman, you say a photon is a boson—a spin-one particle, so its spin state is either 1, 0 or −1. How come photons – the only boson that we actually know to exist from real-life experiments – do not have a spin-zero state? Their spin is always up or down. It’s never zero. So why are we actually even talking about spin-one particles, if the only boson we know – the photon – does not behave like it should behave according to your boson-fermion theory?” (2)

Am I joking? I am not. I like to think I am just asking very reasonable questions here—even if all of this may sound like a bit of a rant. In fact, it probably is, but so that’s why I am writing this up in a blog rather than in a paper. Let’s continue.

The subsequent chapters are about the magical spin-1/2 and spin-1 properties of fermions and bosons respectively. I call them magical, because – as mentioned above – all we can measure is the magnetic moment. Any assumption that the angular momentum of a particle – a ‘boson’ or a ‘fermion’, whatever it is – is ±1 or ±1/2, assumes we have knowledge of some form factor, which is determined by the shape of that particle and which tells us how the mass (or the energy) of a particle is distributed in space.
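To illustrate that point with a small worked example (mine, and purely illustrative): for one and the same mass, radius and angular frequency, the angular momentum L = I·ω differs by the form factor in the moment of inertia: 1 for a ring or hoop, 1/2 for a disk, 2/5 for a solid sphere. Any claim that L equals ħ or ħ/2 therefore presupposes an assumption about how the mass is distributed.

```python
# Angular momentum L = I·ω for different assumed shapes (same m, r and ω):
# the 'form factor' k in I = k·m·r² encodes the assumed mass distribution.
form_factors = {"ring/hoop": 1.0, "disk": 1.0 / 2.0, "solid sphere": 2.0 / 5.0}

m, r, omega = 1.0, 1.0, 1.0   # arbitrary units: only the ratios matter here
for shape, k in form_factors.items():
    I = k * m * r**2          # moment of inertia
    L = I * omega             # angular momentum
    print(f"{shape:12s}: L = {L:.2f} · m·r²·ω")
```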

Again, that may sound sacrilegious: according to mainstream physicists, particles are supposed to be pointlike—which they interpret as having no spatial dimension whatsoever. However, as I mentioned above, that sounds like a very obvious oxymoron to me.

Of course, I know I would never have gotten my degree. When I did the online MIT course, the assistants of Prof. Dr. Zwiebach also told me I asked too many questions: I should just “shut up and calculate.” You may think I’m joking again but, no: that’s the feedback I got. Needless to say, I went through the course and did all of the stupid exercises, but I didn’t bother doing the exams. I don’t mind calculating. I do a lot of calculations as a finance consultant. However, I do mind mindless calculations. Things need to make sense to me. So, yes, I will always be an ‘amateur physicist’ and a ‘blogger’—read: someone whom you shouldn’t take very seriously. I just hope my jokes are better than Feynman’s.

I’ve actually been thinking that getting a proper advanced degree in physics might impede understanding, so it’s good I don’t have one. I feel these mainstream courses do try to ‘brainwash’ you. They do not encourage you to challenge received wisdom. On the contrary, it all very much resembles rote learning: memorization based on repetition. Indeed, more modern textbooks – I looked at my son’s, for example – immediately dive into the hocus-pocus—totally shamelessly. They literally start by saying you should not try to understand and that you should just get through the math and accept the quantum-mechanical dogmas and axioms! Despite the appalling logic in the introductory chapters, Mr. Feynman, in contrast, at least has the decency to try to come up with some classical arguments here and there (although he also constantly adds that the student should just accept the hocus-pocus approach and the quantum-mechanical dogmas and not think too much about what it might or might not represent).

My son got high marks on his quantum mechanics exam: a 19/20, to be precise, and so I am really proud of him—and I also feel our short discussions on this or that may have helped him to get through it. Fortunately, he was doing it as part of getting a civil engineering degree (Bachelor’s level), and he was (also) relieved he would never have to study the subject-matter again. Indeed, we had a few discussions and, while he (also) thinks I am a bit of a crackpot theorist, he does agree “the math must describe something real” and that “therefore, something doesn’t feel right in all of that math.” I told him that I’ve got this funny feeling that, 10 or 20 years from now, 75% (more?) of post-WW II research in quantum physics – most of the theoretical research, at least (3) – may be dismissed as some kind of collective psychosis or, worse, as ‘a bright shining lie’ (title of a book I warmly recommend – albeit on an entirely different topic). Frankly, I think many academics completely forgot Boltzmann’s motto for the physicist:

“Bring forward what is true. Write it so that it is clear. Defend it to your last breath.”

[…]

OK, you’ll say: get real! So what is the difference between bosons and fermions, then? I told you already: I think it’s a useless distinction. Worse, I think it’s not only useless but it’s also untruthful. It has, therefore, hampered rather than promoted creative thinking. I distinguish matter-particles – electrons, protons, neutrons – from photons (and neutrinos). Matter-particles carry charge. Photons (and neutrinos) do not. (4) Needless to say, I obviously don’t believe in ‘messenger particles’ and/or ‘Higgs’ or other ‘mechanisms’ (such as the ‘weak force’ mechanism). That sounds too much like believing in God or some other non-scientific concept. [I don’t mind you believing in God or some other non-scientific concept – I actually do myself – but we should not confuse it with doing physics.]

And as for the question of what my theory of interaction would be? It’s just the classical theory: charges attract or repel, and one can add electromagnetic fields—all in respect of the Planck-Einstein law, of course. Charges have some dimension (and some mass), so they can’t take up the same space. And electrons, protons and neutrons have some structure, and physicists should focus on modeling those structures, so as to explain the so-called intrinsic properties of these matter-particles. As for photons, I think of them as an oscillating electromagnetic field (respecting the Planck-Einstein law, of course), and so we can simply add them. What causes them to lump together? Not sure: the Planck-Einstein law (being in some joint excited state, in other words) or gravity, perhaps. In any case: I am confident it is something real—i.e. not Feynman’s weird addition or subtraction rules for amplitudes.

However, this is not the place to re-summarize all of my papers. I’d just sum them up by saying this: not many physicists seem to understand Planck’s constant or, what amounts to the same, the concept of an elementary cycle. And their unwillingness to even think about the possible structure of photons, electrons and protons is… Well… I’d call it criminal. :-/

[…]

I will now conclude my rant with another down-to-earth question: would I recommend reading Feynman’s Lectures? Or recommend that youngsters take up physics as a subject of study?

My answer in regard to the first question is ambiguous: yes, and no. If you’d push me on this, I’d say: more yes than no. I do believe Feynman’s Lectures are much better than the modern-day textbook that was imposed on my son during his engineering studies and so, yes, I do recommend the older textbooks. But please be critical as you go through them: do ask yourself the same kind of questions that I’ve been asking myself while building up this blog: think for yourself. Don’t go by ‘authority’. Why not? Because a lot of what labels itself as science may be nonsensical. As nonsensical as… Well… All that goes on in national and international politics at the moment, I guess. 🙂

In regard to the second question – should youngsters be encouraged to study physics? – I’d say what my father told me when I was hesitating to pick a subject for study: “Do what earns respect and feeds your family. You can do philosophy and other theoretical things on the side.”

With the benefit of hindsight, I can say he was right. I’ve done the stuff I wanted to do—on the side, indeed. So I told my son to go for engineering – rather than pure math or pure physics. 🙂 And he’s doing great, fortunately !

Jean Louis Van Belle

Notes:

(1) Dr. Ralston’s How To Understand Quantum Mechanics is fun for the first 10 pages or so, but I would not recommend it. We exchanged some messages, but then concluded that our respective interpretations of quantum mechanics are very different (I feel he replaces hocus-pocus by other hocus-pocus) and, hence, that we should not “waste any electrons” (his expression) on trying to convince each other.

(2) It is really one of the most ridiculous things ever. Feynman spends several chapters on explaining spin-one particles to, then, in some obscure footnote, suddenly write this: “The photon is a spin-one particle which has, however, no “zero” state.” Of all his jokes, I think this is his worst. It just shows how ‘rotten’ or ‘random’ the whole conceptual framework of mainstream QM really is. There is, in fact, another glaring inconsistency in Feynman’s Lectures: in the first three chapters of Volume III, he talks about adding wavefunctions and the basic rules of quantum mechanics, and it all happens with a plus sign. In this chapter, he suddenly says the amplitudes of fermions combine with a minus sign. If you happen to know a physicist who can babble his way out of this inconsistency, please let me know.

(3) There are exceptions, of course. I mentioned very exciting research in various posts, but most of it is non-mainstream. The group around Herman Batelaan at the University of Nebraska and various ‘electron modellers’ are just a few of the many examples. I contacted a number of these ‘particle modellers’. They’re all happy I show interest, but are themselves puzzled as to why their research doesn’t get all that much attention. If it’s a ‘historical accident’ in mankind’s progress towards truth, then it’s a sad one.

(4) We believe a neutron is neutral because it has both positive and negative charge in it (see our paper on protons and neutrons). As for neutrinos, we have no idea what they are, but our wild guess is that they may be the ‘photons’ of the strong force: if a photon is nothing but an oscillating electromagnetic field traveling in space, then a neutrino might be an oscillating strong field traveling in space, right? To me, it sounds like a reasonable hypothesis, but who am I, right? 🙂 If I had to define myself, it would be as one of Feynman’s ideal students: someone who thinks for himself. In fact, perhaps I would have been able to entertain him as much as he entertained me—and so, who knows, I like to think he might actually have given me some kind of degree for joking too ! 🙂

(5) There is no (5) in the text of my blog post, but I just thought I would add one extra note here. 🙂 Herman Batelaan and some other physicists wrote a Letter to Physical Review Letters back in 1997. I like Batelaan’s research group because – contrary to what you might think – most of Feynman’s thought experiments have actually never been done. So Batelaan – and some others – actually did the double-slit experiment with electrons, and they are doing very interesting follow-on research on it.

However, let me come to the point I want to mention here. When I read these lines in that very serious Letter, I didn’t know whether to laugh or to cry:

“Bohr’s assertion (on the impossibility of doing a Stern-Gerlach experiment on electrons or charged particles in general) is thus based on taking the classical limit for ħ going to 0. For this limit not only the blurring, but also the Stern-Gerlach splitting vanishes. However, Dehmelt argues that ħ is a nonzero constant of nature.”

I mean… What do you make of this? Of course, ħ is a nonzero constant, right? If it were zero, the Planck-Einstein relation wouldn’t make any sense, would it? What world were Bohr, Heisenberg, Pauli and others living in? A different one than ours, I guess. But that’s OK. What is not OK is that these guys were ignoring some very basic physical laws and just dreamt up – I am paraphrasing Ralston here – “haywire axioms they did not really believe in, and regularly violated themselves.” And they didn’t know how to physically interpret the Planck-Einstein relation and/or the mass-energy equivalence relation. Sabine Hossenfelder would say they were completely lost in math. 🙂

The Mystery Wallahs

I’ve been working across Asia – mainly South Asia – for over 25 years now. You can google the exact meaning, but my definition of a wallah is someone who deals in something: it may be a street vendor, or a handyman, or anyone who brings something new. I remember I was one of the first to bring modern mountain bikes to India, and they called me a gear wallah—because they were absolutely fascinated by the number of gears I had. [Mountain bikes are now back to a 2 by 10 or even a 1 by 11 set-up, but I still like those three plateaux in front on my older bikes—and, yes, my collection is becoming way too large but I just can’t do away with it.]

In any case, let me explain the title of this post. I stumbled on the work of the research group around Herman Batelaan in Nebraska. Absolutely fascinating ! Not only did they actually do the electron double-slit experiment, but their ideas on an actual Stern-Gerlach experiment with electrons are quite interesting: https://digitalcommons.unl.edu/cgi/viewcontent.cgi?article=1031&context=physicsgay

I also want to look at their calculations on momentum exchange between electrons in a beam: https://iopscience.iop.org/article/10.1088/1742-6596/701/1/012007.

Outright fascinating. Brilliant ! […]

It just makes me wonder: why is the outcome of this 100-year-old battle between mainstream hocus-pocus and real physics so undecided?

I’ve come to think of mainstream physicists as peddlers in mysteries—whence the title of my post. It’s a tough conclusion. Physics is supposed to be the King of Science, right? Hence, we shouldn’t doubt it. At the same time, it is kinda comforting to know the battle between truth and lies rages everywhere—including inside of the King of Science.

JL

The ultimate electron model

A rather eminent professor in physics – who has contributed significantly to solving the so-called ‘proton radius puzzle’ – advised me to not think of the anomalous magnetic moment of the electron as an anomaly. It led to a breakthrough in my thinking of what an electron might actually be. The fine-structure constant should be part and parcel of the model, indeed. Check out my last paper ! I’d be grateful for comments !

I know the title of this post sounds really arrogant. It is what it is. Whatever brain I have has been thinking about these issues consciously and unconsciously for many years now. It looks good to me. When everything is said and done, the function of our mind is to make sense. What’s sense-making? I’d define sense-making as creating consistency between (1) the structure of our ideas and theories (which I’ll conveniently define as ‘mathematical’ here) and (2) what we think of as the structure of reality (which I’ll define as ‘physical’).

I started this blog reading Penrose (see the About page of this blog). And then I just put his books aside and started reading Feynman. I think I should start re-reading Penrose. His ‘mind-physics-math’ triangle makes a lot more sense to me now.

JL

PS: I agree the title of my post is excruciatingly arrogant but – believe me – I could have chosen an even more arrogant title. Why? Because I think my electron model actually explains mass. And it does so in a much more straightforward manner than Higgs, or Brout–Englert–Higgs, or Englert–Brout–Higgs–Guralnik–Hagen–Kibble, Anderson–Higgs, Anderson–Higgs–Kibble, Higgs–Kibble, or ABEGHHK’t (for Anderson, Brout, Englert, Guralnik, Hagen, Higgs, Kibble, and ‘t Hooft) do. [I am just trying to attribute the theory here using the Wikipedia article on it.] 

A common-sense interpretation of (quantum) physics

This is my summary of what I refer to as a common-sense interpretation of quantum physics. It’s a rather abstruse summary of the 40 papers I wrote over the last two years.

1. A force acts on a charge. The electromagnetic force acts on an electric charge (there is no separate magnetic charge) and the strong force acts on a strong charge. A charge is a charge: a pointlike ‘thing’ with zero rest mass. The idea of an electron combines the idea of a charge and its motion (Schrödinger’s Zitterbewegung). The electron’s rest mass is the equivalent mass of the energy in its motion (mass without mass). The elementary wavefunction represents this motion.

2. There is no weak force: a force theory explaining why charges stay together must also explain when and how they separate. A force works through a force field: the idea that forces are mediated by virtual messenger particles resembles 19th century aether theory. The fermion-boson dichotomy does not reflect anything real: we have charged and non-charged wavicles (electrons versus photons, for example).

3. The Planck-Einstein law embodies a (stable) wavicle. A stable wavicle respects the Planck-Einstein relation (E = h·f) and Einstein’s mass-energy equivalence relation (E = m·c²). A wavicle will, therefore, carry energy but it will also pack one or more units of Planck’s quantum of action. Planck’s quantum of action represents an elementary cycle in Nature. An elementary particle embodies the idea of an elementary cycle.

4. The ‘particle zoo’ is a collection of unstable wavicles: they disintegrate because their cycle is slightly off (the integral of the force over the distance of the loop and over the cycle time is not exactly equal to h).

5. An electron is a wavicle that carries charge. A photon does not carry charge: it carries energy between wavicle systems (atoms, basically). It can do so because it is an oscillating field.

6. An atom is a wavicle system. A wavicle system has an equilibrium energy state. This equilibrium state packs one unit of h. Higher energy states pack two, three,…, n units of h. When an atom transitions from one energy state to another, it will emit or absorb a photon that (i) carries the energy difference between the two energy states and (ii) packs one unit of h.

7. Nucleons (protons and neutrons) are held together because of a strong force. The strong force acts on a strong charge, for which we need to define a new unit: we choose the dirac but – out of respect for Yukawa – we write one dirac as 1 Y. If Yukawa’s function models the strong force correctly, then the strong force – which we denote as FN – can be calculated from the Yukawa potential:

[Formula F1: the nuclear force FN as derived from the Yukawa potential]

This function includes a scale parameter a and a nuclear proportionality constant υ0. Besides its function as an (inverse) mathematical proportionality constant, it also ensures the physical dimensions on the left- and the right-hand side of the force equation are the same. We can choose to equate the numerical value of υ0 to one.
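
For the reader who wants to see what such a Yukawa-based force looks like, here is a generic sketch – with the scale parameter a and the nuclear constant υ0 as defined above, but with a normalization of my own choosing, so the exact factors may well differ from the F1 formula:

```latex
% Sketch of a Yukawa-type potential and the force derived from it.
% The 4\pi normalization is my own assumption; a is the scale parameter, \upsilon_0 the nuclear constant.
U(r) = -\frac{g_N^2}{4\pi\,\upsilon_0}\,\frac{e^{-r/a}}{r}
\qquad\Longrightarrow\qquad
F_N(r) = -\frac{\mathrm{d}U}{\mathrm{d}r}
       = -\frac{g_N^2}{4\pi\,\upsilon_0}\left(\frac{1}{r^2}+\frac{1}{a\,r}\right)e^{-r/a}
```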

8. The nuclear force attracts two positive electric charges. The electrostatic force repels them. These two forces are equal at a distance r = a. The strong charge unit (gN) can, therefore, be calculated. It is equal to:

[Formula F2: the strong charge unit gN]
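
The logic behind it is easy enough to sketch: equate the magnitude of that nuclear force to Coulomb’s law at r = a and solve for gN. With the same assumed normalization as in the sketch above, one gets something like this (e is Euler’s number here, not the elementary charge):

```latex
% Sketch of the logic behind F2 (normalization assumed as above; e = Euler's number).
\frac{g_N^2}{4\pi\,\upsilon_0}\,\frac{2}{a^2}\,e^{-1} = \frac{q_e^2}{4\pi\,\varepsilon_0\,a^2}
\qquad\Longrightarrow\qquad
g_N^2 = \frac{e}{2}\,\frac{\upsilon_0}{\varepsilon_0}\,q_e^2
```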

9. Nucleons (protons or neutrons) carry both electric and strong charge (qe and gN). A kinematic model disentangling both has not yet been found. Such a model should explain the magnetic moment of protons and neutrons.

10. We think of a nucleus as a wavicle system too. When going from one energy state to another, the nucleus emits or absorbs neutrinos. Hence, we think of the neutrino as the photon of the strong force. Such changes in energy states may also involve the emission and/or absorption of an electric charge (an electron or a positron).

Does this make sense? I look forward to your thoughts. 🙂

[…]

Because the above is all very serious, I thought it would be good to add something that will make you smile. 🙂

[Cartoon: Saint Schrödinger – as long as the tomb is closed, Jesus is both dead and alive.]

The Charge Conservation Principle and Pair Production

The creation of an electron-positron pair out of a highly energetic photon – the most common example of pair production – is often presented as an example of how energy can be converted into matter. Vice versa, electron-positron annihilation then amounts to the destruction of matter. However, if John Wheeler’s concept of ‘mass without mass’ is correct – or if Schrödinger’s trivial solution to Dirac’s equation for an electron in free space (the Zitterbewegung interpretation of an electron) is correct – then what might actually be happening is probably simpler—but also far more intriguing.

John Wheeler’s intuitive ‘mass without mass’ idea is that matter and energy are just two sides of the same coin. That was Einstein’s intuition too: mass is just a measure of inertia—a measure of the resistance to a change in the state of motion. Energy itself is motion: the motion of a charge. Some force over some distance, and we associate a force with a charge. Not with mass. In this interpretation of physics, an electron is nothing but a pointlike charge whizzing about some center. It’s a charge caught in an electromagnetic oscillation. The pointlike charge itself has zero rest mass, which is why it moves about at the speed of light.[1]

This electron model is easy and intuitive. Developing a similar model for a nucleon – a proton or a neutron – is much more complicated because nucleons are held together by another force, which we commonly refer to as the strong force.

In regard to the latter, the reader should note that I am very hesitant to take the quark-gluon model of this strong force seriously. I entirely subscribe to Dirac’s rather skeptical evaluation of it:

“Now there are other kinds of interactions, which are revealed in high-energy physics and are important for the description of atomic nuclei. These interactions are not at present sufficiently well understood to be incorporated into a system of equations of motion. Theories of them have been set up and much developed and useful results obtained from them. But in the absence of equations of motion these theories cannot be presented as a logical development of the principles set up in this book. We are effectively in the pre-Bohr era with regard to these other interactions.”[2]

I readily admit he wrote this in 1967 (so that’s a very long time ago). He was reacting, most probably, to the invention of a new conservation law (the conservation of strangeness, as proposed by Gell-Mann, Nishijima, Pais and others) and the introduction of many other ad hoc QCD quantum numbers to explain why this or that disintegration path does or does not occur. It was all part of the Great Sense-Making Exercise at the time: how to explain the particle zoo?[3] In short, I am very reluctant to take the quark-gluon model of the strong force seriously.

However, I do acknowledge that the experimental discovery that pairs of matter and anti-matter particles can be created out of highly energetic photons may well be the most significant discovery in post-WW II physics. Dirac’s preface to the 4th edition of the Principles of Quantum Mechanics summarized this as follows:

“In present-day high-energy physics, the creation and annihilation of charged particles is a frequent occurrence. A quantum electrodynamics which demands conservation of the number of charged particles is, therefore, out of touch with physical reality. So I have replaced it by a quantum electrodynamics which includes creation and annihilation of electron-positron pairs. […] It seems that the classical concept of an electron is no longer a useful model in physics, except possibly for elementary theories that are restricted to low-energy phenomena.”

Having said this, I think it’s useful to downplay Dr. Dirac’s excitement somewhat. Our world is governed by low-energy phenomena: if our Universe was created in a Big Bang – some extremely high-energy environment – then it happened 14 billion years or so ago, and the Universe has cooled down since. Hence, these high-energy experiments in labs and colliders are what they are: high-energy collisions followed by disintegration processes. They emulate the conditions of what might have happened in the first second – or the first minute, perhaps (surely not the first day or week or so) – after Creation.[4]

I am, therefore, a bit puzzled by Dr. Dirac’s sentiment. Why would he think the classical concept of an electron is no longer useful? An electron is a permanent fixture. We can create and destroy it in our high-energy colliders, but that doesn’t mean it’s no longer useful as a concept.

Pair production only happens when the photon is fired into a nucleus, and the generalization to ‘other’ bosons ‘spontaneously’ disintegrating into a particle and an anti-particle is outright pathetic. What happens is this: we fire an enormous amount of electromagnetic energy into a nucleus (the equivalent mass of the photon has to match the mass of the electron and the positron that are being produced) and, hence, we destabilize the stable nucleus. However, Nature is strong. The strong force is strong. Some intermediate energy state emerges, but Nature throws the spanner out of the works again. The end result is that all can be analyzed, once again, in terms of the Planck-Einstein relation: we have stable particles, once again. [Of course, the positron finds itself in the anti-Universe and will, therefore, quickly disappear in the reverse process: electron-positron annihilation.]
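
Just to fix ideas, the energy bookkeeping is easy to check. This is my own back-of-envelope illustration, nothing more:

```python
# A simple energy-bookkeeping check (my own illustration): for pair production, the photon must
# at least carry the combined rest energy of the electron and the positron it produces.
m_e_c2 = 0.51099895   # electron (and positron) rest energy in MeV
threshold = 2 * m_e_c2
print(threshold)      # ≈ 1.022 MeV: below this photon energy, no electron-positron pair can appear
```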

No magic here. And – surely – no need for strange QCD quantum numbers.

Jean Louis Van Belle, 28 July 2019

[1] Erwin Schrödinger stumbled upon the Zitterbewegung interpretation of an electron when he was exploring solutions to Dirac’s wave equation for free electrons. It’s worth quoting Dirac’s summary of it: “The variables give rise to some rather unexpected phenomena concerning the motion of the electron. These have been fully worked out by Schrödinger. It is found that an electron which seems to us to be moving slowly, must actually have a very high frequency oscillatory motion of small amplitude superposed on the regular motion which appears to us. As a result of this oscillatory motion, the velocity of the electron at any time equals the velocity of light. This is a prediction which cannot be directly verified by experiment, since the frequency of the oscillatory motion is so high and its amplitude is so small. But one must believe in this consequence of the theory, since other consequences of the theory which are inseparably bound up with this one, such as the law of scattering of light by an electron, are confirmed by experiment.” (Paul A.M. Dirac, Theory of Electrons and Positrons, Nobel Lecture, December 12, 1933)

[2] P. A. M. Dirac, The Principles of Quantum Mechanics, Oxford University Press, 4th revised edition, Chapter XII (Quantum Electrodynamics), p. 312.

[3] Feynman’s 1963 Lecture on K-mesons (http://www.feynmanlectures.caltech.edu/III_11.html#Ch11-S5) is an excellent summary of the state of affairs at the time. The new colliders had, effectively, generated a ‘particle zoo’, and it had to be explained. We think physicists should first have acknowledged that these short-lived particles should, perhaps, not be associated with the idea of a (fundamental) particle: they’re unstable. Transients, at best. Many of them are just resonances.

[4] I use the term ‘Creation’ as an absolutely non-religious concept here: it’s just a synonym of the presumed ‘Big Bang’. To be very clear on this, I am rather appalled by semi-scientific accounts of the creation of our world in terms of the biblical week.

Smoking Gun Physics

The nature of the Higgs particle

The images below visualize what is generally referred to as the first ‘evidence’ for the Higgs boson: (1) two gamma rays emerging from the CERN LHC CMS detector, and (2) the tracks of four muons in the CERN LHC ATLAS detector. These tracks result from the collision between two protons that hit each other at a velocity of 99.99999 per cent of the speed of light – which corresponds to a combined energy of about 7 to 8 TeV.[1] That’s huge. After the ‘discovery’ of the Higgs particle, the LHC was shut down for maintenance and an upgrade, and the protons in the LHC can now be accelerated to energies up to 7 TeV – which amounts to 14 TeV when they crash into each other. However, the higher energy level has only produced more of the same so far.[2]

We put ‘evidence’ and ‘discovery’ between inverted commas because the Higgs particle is (and, rest assured, will forever remain) a ghost particle only: we cannot directly observe it. Theoretical physicists and experimentalists agree these traces are just signatures of the long-awaited God particle. It was long-awaited indeed: the title of the six-page ‘leaflet’ explaining the award of the 2013 Nobel Prize in Physics to François Englert and Peter Higgs is: “Here, at last!”[3] The long wait for it – and CERN’s excellent media team – may explain why the Nobel Physics Committee and the Royal Swedish Academy of Sciences were so eager to award a Nobel Prize for this ! So we should ask ourselves: what’s the hype, and what’s the physics? And does the physics warrant the hype?

The facts are rather simple. We cannot directly observe the Higgs particle because it is just like all of the other ‘particles’ that come out of these collisions: they are too short-lived to leave a permanent trace. Indeed, when two protons hit each other at these incredible velocities, then all that’s left is debris flying around. This debris quickly disintegrates into more debris – until we’re left with what we’re used to: real particles, like electrons or protons. Things that don’t disintegrate.

The energy of the debris (the gamma rays or the muons) coming out of ‘Higgs events’ tells us the mass of the Higgs particle must be about 125 GeV/c². Besides its mass, it does not seem to have any other properties: no spin, no electric charge. It is, therefore, known as a scalar boson. In everyday language, that means it is just some (real) number. Newton had already told us that mass, as a measure of inertia, is just some real positive number—and Einstein taught us energy and mass are equivalent.

Interpreting the facts is tough. I am just an amateur physicist and so my opinion won’t count for much. However, I can’t help feeling Higgs’ theory just confirms the obvious. For starters, we should be very hesitant to use the term ‘particle’ for the Higgs boson because its lifetime is of the order of 10⁻²² s. Think of it as the time an electron needs to go from one electron orbital to another. Even at the speed of light – which an object with a rest mass of 125 GeV/c² cannot aspire to attain – a particle with such a lifetime cannot travel more than a few tens of femtometers: about 3×10⁻¹⁴ m, to be precise. That’s not something you would associate with the idea of a particle: a resonance in particle physics has the same lifetime.
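
For those who want to check that figure, the back-of-envelope calculation is trivial. Here is my own little script:

```python
# Back-of-envelope check (my own): how far can something with a lifetime of the order of
# 1e-22 s travel, even if it were to move at the speed of light?
c = 299792458.0       # speed of light (m/s)
lifetime = 1e-22      # order of magnitude of the Higgs boson's lifetime (s)
print(c * lifetime)   # ≈ 3e-14 m, i.e. a few tens of femtometres
```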

That’s why we’ll never see the Higgs boson—just like we’ll never see the W± and Z bosons whose mass it’s supposed to explain. Nor will any of us ever see a quark or a gluon: physicists tell us the signals that come out of colliders such as the LHC or, in the 1970s and 1980s, that came out of the PETRA accelerator in Hamburg, the Positron-Electron Project (PEP) at the Stanford Linear Accelerator Center (SLAC), and the Super Proton-Antiproton Synchrotron at CERN, are consistent with the hypothesis that the strong and weak forces are mediated through particles known as bosons (force carriers) but – truth be told – the whole idea of forces being mediated by bosons is just what it is: a weird theory.

Are virtual particles the successor to the aether theory?

Maybe we should first discuss the most obvious of all bosons: the photon. Photons are real. Of course, they are. They are, effectively, the particles of light. They are, in fact, the only bosons we can effectively observe. In fact, we’ve got a problem here: the only bosons we can effectively observe – photons – do not have all of the theoretical properties of a boson: as a spin-1 particle, the theoretical values for its angular momentum are ±ħ or 0. However, photons don’t have a zero-spin state. Never. This is one of the things in mainstream quantum mechanics that has always irked me. All courses in quantum mechanics spend like two or three chapters on why bosons and fermions are different (spin-one versus spin-1/2), but when it comes to the specifics – real-life stuff – then the only boson we actually know (the photon) turns out to not be a typical boson because it can’t have zero spin. [Physicists will, of course, say the most important property of bosons is that you can keep piling bosons on top of bosons, and you can do that with photons. Bosons are supposed to like to be together, because we want to keep adding to the force without limit. But… Well… I have another explanation for that. It’s got to do with the fact that bosons don’t – or shouldn’t – carry charge. But I don’t want to start another digression on that. Not here.]

So photons – the only real-life bosons we’ve ever observed – aren’t typical bosons. More importantly, no course in physics has ever been able to explain why we’d need photons in the role of virtual particles. Why would an electron in some atomic orbital continuously exchange photons with the proton that holds it in its orbit? When you ask that question to a physicist, he or she will start blubbering about quantum field theory and other mathematical wizardry—but he or she will never give you a clear answer. I’ll come back to this in the next section of this paper.

I don’t think there is a clear answer. Worse, I’ve started to think the whole idea of some particle mediating a force is nonsense. It’s like the 19th-century aether theory: we don’t need it. We don’t need it in electromagnetic theory: Maxwell’s Laws – augmented with the Planck-Einstein relation – will do. We also don’t need it to model the strong force. The quark-gluon model – according to which quarks change color all of the time – does not come with any simplification as compared to a simpler parton model:

  1. The quark-gluon model gives us (at least) two quarks[4], two anti-quarks and nine gluons, so that adds up to 13 different objects.
  2. If we just combine the idea of a parton – a pointlike carrier of properties – with… Well… Its properties – the possible electric charges (±2/3 and ±1/3) and the possible color charges (red, green and blue) – we’ve got 12 partons, and such ‘parton model’ explains just as much.[5]
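
As a little counting exercise of my own, the parton tally of the second point looks like this:

```python
from itertools import product

# A toy count (my own illustration) of the 'partons' mentioned above: a pointlike carrier
# of one of the possible electric charges and one of the possible colour charges.
electric_charges = ["+2/3", "-2/3", "+1/3", "-1/3"]
colour_charges = ["red", "green", "blue"]
partons = list(product(electric_charges, colour_charges))
print(len(partons))   # 4 × 3 = 12 different objects
```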

I also don’t think we need it to model the weak force. Let me be very clear about my intuition/sentiment/appreciation—whatever you want to call it:

We don’t need a Higgs theory to explain why W/Z bosons have mass because I think W/Z bosons don’t exist: they’re a figment of our imagination.

Why do we even need the concept of a force to explain why things fall apart? The world of unstable particles – transient particles as I call them – is a different realm altogether. Physicists will cry foul here: CERN’s Super Proton-Antiproton Synchrotron produced evidence for W+, W– and Z bosons back in 1983, didn’t it?

No. The evidence is just the same as the ‘evidence’ for the Higgs boson: we produce a short-lived blob of energy which disintegrates in no time (10⁻²² s or 10⁻²³ s is no time, really) and, for some reason no one really understands, we think of it as a force carrier: something that’s supposed to be very different from the other blobs of energy that emerge while it disintegrates into jets made up of other transients and/or resonances. The end result is always the same: the various blobs of energy further dis- and reintegrate as stable particles (think of protons, electrons and neutrinos[6]). There is no good reason to introduce a bunch of weird flavor quantum numbers to think of how such processes might actually occur. In reality, we have a very limited number of permanent fixtures (electrons, protons and photons), hundreds of transients (particles that fall apart) and thousands of resonances (excited states of the transient and non-transient stuff).

You’ll ask me: so what’s the difference between them then?

Stable particles respect the E = h·f = ħ·ω relation—and they do so exactly. For non-stable particles – transients – that relation is slightly off, and so they die. They die by falling apart into more stable configurations, until we are left with stable particles only. As for resonances, they are just that: some excited state of a stable or a non-stable particle. Full stop. No magic needed.[7]
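
To put a number on that idea of an elementary cycle, here is what the Planck-Einstein relation gives when applied to the rest energy of the most stable particle we know, the electron. This is just my own illustration:

```python
# Illustration (my own numbers): the Planck-Einstein relation applied to the electron's rest
# energy gives the frequency of its presumed elementary cycle (the Zitterbewegung frequency).
h = 6.62607015e-34        # Planck's constant (J·s)
c = 299792458.0           # speed of light (m/s)
m_e = 9.1093837015e-31    # electron rest mass (kg)
E = m_e * c**2            # rest energy: E = m·c²
f = E / h                 # Planck-Einstein: E = h·f
print(E / 1.602176634e-19)   # ≈ 511 keV (in eV)
print(f)                     # ≈ 1.24e20 Hz
```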

Photons as bosons

Photons are real and, yes, they carry energy. When an electron goes from one state to another (read: from one electron orbital to another), it will absorb or emit a photon. Photons make up light: visible light, low-energy radio waves, or high-energy X- and γ-rays. These waves carry energy and – when we look real close – they are made up of photons. So, yes, it’s the photons that carry the energy.

Saying they carry electromagnetic energy is not the same as saying they carry the electromagnetic force itself. A force acts on a charge: a photon carries no charge. If photons carry no charge, then why would we think of them as carrying the force?

I wrote I’ve always been irked by the fact that photons – again, the only real-life bosons we’ve ever observed – don’t have all of the required properties of the theoretical force-carrying particle physicists invented: the ‘boson’. If bosons exist, then the bosons we associate with the strong and weak force should also not carry any charge: color charge or… Well… What’s the ‘weak’ charge? Flavor? Come on guys ! Give us something we can believe in.

That’s one reason – for me, at least – why the idea of gluons and W/Z bosons is nonsensical. Gluons carry color charge, and W/Z bosons carry electric charge (except for the Z boson – but we may think of it as carrying both positive and negative charge). They shouldn’t. Let us quickly review what I refer to as a ‘classical’ quantum theory of light.[8]

If there is one quantum-mechanical rule that no one ever doubts, it is that angular momentum comes in units of ħ: Planck’s (reduced) constant. When analyzing the electron orbitals for the simplest of atoms (the one-proton hydrogen atom), this rule amounts to saying the electron orbitals are separated by an amount of physical action that is equal to h = 2π·ħ. Hence, when an electron jumps from one level to the next – say from the second to the first – then the atom will lose one unit of h. The photon that is emitted or absorbed will have to pack that somehow. It will also have to pack the related energy, which is given by the Rydberg formula:

E = (1/n1² – 1/n2²)·ER, with the Rydberg energy ER ≈ 13.6 eV

To focus our thinking, let us consider the transition from the second to the first level, for which 1/1² – 1/2² is equal to 0.75. Hence, the photon energy should be equal to (0.75)·ER ≈ 10.2 eV. Now, if the total action is equal to h, then the cycle time T can be calculated as:

T = h/E ≈ (6.626×10⁻³⁴ J·s)/(10.2 eV) ≈ 0.4×10⁻¹⁵ s

This corresponds to a wave train with a length of (3×10⁸ m/s)·(0.4×10⁻¹⁵ s) ≈ 122 nm. That is the size of a large molecule and it is, therefore, much more reasonable than the length of the wave trains we get when thinking of transients using the supposed Q of an atomic oscillator.[9] In fact, this length is the wavelength of the light (λ = c/f = c·T = h·c/E) that we would associate with this photon energy.
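
For those who want to verify the arithmetic, here is a quick check of these two numbers, using standard values for h, c and the Rydberg energy. It is just a little script of my own:

```python
# Quick numerical check (my own) of the two numbers above, using standard constants.
h = 6.62607015e-34     # Planck's constant (J·s)
c = 299792458.0        # speed of light (m/s)
eV = 1.602176634e-19   # one electronvolt in joule
E_R = 13.6 * eV        # Rydberg energy
E_photon = 0.75 * E_R  # 2 -> 1 transition: (1/1² - 1/2²) = 0.75, so E ≈ 10.2 eV
T = h / E_photon       # cycle time if the photon packs one unit of h
print(T)               # ≈ 4.05e-16 s, i.e. about 0.4 femtoseconds
print(c * T * 1e9)     # ≈ 121.6 nm – the wavelength of the emitted light
```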

We should quickly insert another calculation here. If we think of an electromagnetic oscillation – as a beam or, what we are trying to do here, as some quantum – then its energy is going to be proportional to (a) the square of the amplitude of the oscillation – and we are not thinking of a quantum-mechanical amplitude here: we are talking about the amplitude of a physical wave – and (b) the square of the frequency. Hence, if we write the amplitude as a and the frequency as ω, then the energy should be equal to E = k·a²·ω², and the k in this equation is just a proportionality factor.

However, relativity theory tells us the energy will have some equivalent mass, which is given by Einstein’s mass-energy equivalence relation: E = m·c². This equation tells us the energy of a photon is proportional to its mass, and the proportionality factor is c². So we have two proportionality relations now, which (should) give us the same energy. Hence, k·a²·ω² must be equal to m·c², somehow.

How should we interpret this? It is, obviously, very tempting to equate k and m, but we can only do this if c² is equal to a²·ω² or – what amounts to the same – if c = a·ω. You will recognize this as a tangential velocity formula. The question is: the tangential velocity of what? The a in the E = k·a²·ω² formula that we started off with is an amplitude: why would we suddenly think of it as a radius now? Because our photon is circularly polarized. To be precise, its angular momentum is +ħ or –ħ. There is no zero-spin state. Hence, if we think of this classically, then we will associate it with circular polarization.
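
If we take the c = a·ω identification at face value and combine it with the Planck-Einstein relation E = ħ·ω, the ‘amplitude’ a comes out as the reduced wavelength λ/2π. For the 10.2 eV photon of the earlier example, that gives about 19 nm. Again, this is just my own illustration under those two assumptions:

```python
# What the identification c = a·ω would imply for the 10.2 eV photon discussed earlier
# (my own illustration, combining c = a·ω with the Planck-Einstein relation E = ħ·ω).
hbar = 1.054571817e-34   # reduced Planck constant (J·s)
c = 299792458.0          # speed of light (m/s)
eV = 1.602176634e-19     # one electronvolt in joule
E = 10.2 * eV            # photon energy
omega = E / hbar         # angular frequency from E = ħ·ω
a = c / omega            # 'radius' from c = a·ω
print(a)                 # ≈ 1.94e-8 m, i.e. about 19 nm – the reduced wavelength λ/2π
```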

However, these properties do not make it a boson or, let me be precise, these properties do not make it a virtual particle. Again, I haven’t seen a textbook – advanced or intermediate level – that answers this simple question: why would an electron in some stable atomic orbital – in which it does not emit or absorb any energy – continuously exchange virtual photons with the proton that holds it in its orbit?

What would that photon look like? It would have to have some energy, right? And it would have to pack some physical action, right? Why and how would it take that energy – or that action (I like the German Wirkung much better in terms of capturing that concept) – away from the electron orbital? In fact, the idea of an electron orbital combines the idea of the electron and the proton—and their mutual attraction. The physicists who imagine those virtual photons are making a philosophical category mistake. We think they’re making a similar mistake when advancing the hypothesis of gluons and W/Z bosons.

Conclusions

We think the idea of virtual particles, gauge bosons and/or force-carrying particles in general is superfluous. The whole idea of bosons mediating forces resembles 19th century aether theory: we don’t need it. The implication is clear: if that’s the case, then we also don’t need gauge theory and/or quantum field theory.

Jean Louis Van Belle, 21 July 2019

[1] We took this from the above-mentioned leaflet. A proton has a rest energy of 938 MeV (0.938×10⁹ eV), more or less. An energy equal to 4 TeV (the tera- prefix implies 12 zeroes) therefore implies a Lorentz factor that is equal to γ = E/E0 = 4×10¹²/0.938×10⁹ ≈ 4.3×10³. Now, we know that 1 – β² = 1 – v²/c² = 1/γ² ≈ 5.5×10⁻⁸, so 1 – β ≈ 2.7×10⁻⁸: the proton’s velocity differs from the speed of light by only a few parts in 10⁸, which is consistent with the 99.99999 per cent quoted above.
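
The arithmetic is easy to check. Here is my own little script:

```python
# Check (my own) of the Lorentz factor for a 4 TeV proton.
E = 4e12           # proton energy (eV)
E0 = 938.272e6     # proton rest energy (eV)
gamma = E / E0
beta = (1 - 1 / gamma**2) ** 0.5
print(gamma)       # ≈ 4.3e3
print(1 - beta)    # ≈ 2.7e-8, i.e. a velocity of about 99.999997 per cent of c
```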

[2] To be fair, the high-energy collisions also resulted in the production of some more short-lived ‘particles’, such as new variants of bottomonium: mesons that are supposed to consist of a bottom quark and its anti-matter counterpart.

[3] See: https://www.nobelprize.org/uploads/2018/06/popular-physicsprize2013-1.pdf. Higgs’ theory itself – on how gauge bosons can acquire non-zero masses – goes back to 1964 and was put forward by three individual research groups. See: https://en.wikipedia.org/wiki/1964_PRL_symmetry_breaking_papers.

[4] We write ‘at least’ because we are only considering u and d quarks here: the constituents of all stable or fairly stable matter (protons and neutrons, basically).

[5] See: Jean Louis Van Belle, A Realist Interpretation of QCD, 16 July 2019.

[6] If we think of energy as the currency of the Universe, then you should think of protons and electrons as bank notes, and neutrinos as the coins: they provide the change.

[7] See: Jean Louis Van Belle, Is the Weak Force a Force?, 19 July 2019.

[8] This is a very much abbreviated summary. For a more comprehensive analysis, see: Jean Louis Van Belle, A Classical Quantum Theory of Light, 13 June 2019.

[9] In one of his Lectures (I-32-3), Feynman thinks about the Q of a sodium atom, which emits and absorbs sodium light, of course. Based on various assumptions – assumptions that make sense in the context of the blackbody radiation model but not in the context of the Bohr model – he gets a Q of about 5×10⁷. Now, the frequency of sodium light is about 500 THz (500×10¹² oscillations per second). Hence, the decay time of the radiation is of the order of 10⁻⁸ seconds. So that means that, after some 5×10⁶ oscillations, the amplitude will have died down by a factor of 1/e ≈ 0.37. That seems to be very short, but it still makes for 5 million oscillations and, because the wavelength of sodium light is about 600 nm (600×10⁻⁹ meter), we get a wave train with a considerable length: (5×10⁶)·(600×10⁻⁹ meter) = 3 meter. Surely You’re Joking, Mr. Feynman! A photon with a length of 3 meter – or longer? While one might argue that relativity theory saves us here (relativistic length contraction should cause this length to reduce to zero as the wave train zips by at the speed of light), this just doesn’t feel right – especially when one takes a closer look at the assumptions behind it.
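
For what it is worth, here is a quick order-of-magnitude check of these numbers. It is a script of my own, which takes the decay time as Q/ω rather than rounding it down to 10⁻⁸ s; whichever way one rounds, the conclusion is the same: a wave train several metres long.

```python
import math

# Order-of-magnitude check (my own) of Feynman's sodium-light numbers, taking the decay time as Q/ω.
Q = 5e7            # Feynman's Q for the sodium atom (Lectures I-32-3)
f = 500e12         # frequency of sodium light (Hz)
lam = 600e-9       # wavelength of sodium light (m)
t = Q / (2 * math.pi * f)   # decay time t = Q/ω
n = f * t                   # number of oscillations within one decay time
print(t)           # ≈ 1.6e-8 s
print(n)           # ≈ 8e6 oscillations
print(n * lam)     # ≈ 4.8 m: a wave train of a few metres, in any case
```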

The virtuality of virtual particles

I’ve done what I promised to do – and that is to start posting on my other blog. On quantum chromodynamics, that is. But I think this paper deserves wider distribution. 🙂

The paper below probably sort of sums up my views on quantum field theory. I am not sure if I am going to continue to blog. I moved my papers to an academia.edu site and… Well… I think that’s about it. 🙂

[The paper is embedded here as a series of page images.]