Over the past years, I have written a number of papers on physics—mostly exploratory, sometimes speculative, always driven by the same underlying discomfort.
Not with the results of modern physics. Those are extraordinary. But with the ordering of its explanations.
We are very good at calculating what happens. We are less clear about why some things persist and others do not.
That question—why stability appears where it does—has quietly guided much of my thinking. It is also the thread that ties together a new manuscript I have just published on ResearchGate:
This post is not a summary of the manuscript. It is an explanation of why I wrote it, and what kind of work it is meant to enable.
Not a new theory — a different starting point
Let me be clear from the outset.
This manuscript does not propose a new theory. It does not challenge the empirical success of the Standard Model. It does not attempt to replace quantum field theory or nuclear phenomenology.
What it does is much more modest—and, I hope, more durable.
It asks whether we have been starting our explanations at the wrong end.
Instead of beginning with abstract constituents and symmetries, the manuscript begins with something far more pedestrian, yet physically decisive:
Persistence in time.
Some entities last. Some decay. Some exist only fleetingly as resonances. Some are stable only in the presence of others.
Those differences are not cosmetic. They shape the physical world we actually inhabit.
From electrons to nuclei: stability as a guide
The manuscript proceeds slowly and deliberately, revisiting familiar ground:
the electron, as an intrinsically stable mode;
the proton, as a geometrically stable but structurally richer object;
the neutron, as a metastable configuration whose stability exists only in relation;
the deuteron, as the simplest genuinely collective equilibrium;
and nuclear matter, where stability becomes distributed across many coupled degrees of freedom.
At no point is new empirical content introduced. What changes is the interpretive emphasis.
Stability is treated not as an afterthought, but as a physical clue.
Interaction without mysticism
The same approach is applied to interaction.
Scattering and annihilation are reinterpreted not as abstract probabilistic events, but as temporary departures from equilibrium and mode conversion between matter-like and light-like regimes.
Nothing in the standard calculations is altered. What is altered is the physical picture.
Wavefunctions remain indispensable—but they are treated as representations of physical configurations, not as substitutes for them.
Probability emerges naturally from limited access to phase, geometry, and configuration, rather than from assumed ontological randomness.
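To illustrate what "limited access to phase" can mean in the simplest possible setting, here is a toy simulation of my own (not taken from the manuscript): an ensemble of fully deterministic oscillators whose phases we simply do not know already produces a sharp statistical law.

```python
# Toy illustration (mine, not the manuscript's): deterministic oscillators with
# unknown phases produce statistical regularities that look like "probability".
import numpy as np

rng = np.random.default_rng(seed=1)          # stands in for our ignorance of phase
phases = rng.uniform(0, 2 * np.pi, 100_000)

# Each oscillator is fully deterministic: x = cos(phase). We simply do not
# know the phase of any individual oscillator when we sample it.
x = np.cos(phases)

# The sampled values follow the arcsine density 1/(pi*sqrt(1 - x^2)):
# a sharp statistical law emerging purely from limited access to phase.
hist, edges = np.histogram(x, bins=20, range=(-1, 1), density=True)
centers = 0.5 * (edges[:-1] + edges[1:])
theory = (np.arcsin(edges[1:]) - np.arcsin(edges[:-1])) / (np.pi * np.diff(edges))
for c, h, t in zip(centers, hist, theory):
    print(f"x = {c:+.2f}   observed density = {h:.3f}   arcsine law = {t:.3f}")
```

Nothing in this little script is quantum mechanical, of course; the point is only that statistical regularity does not require ontological randomness.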
Why classification matters
The manuscript ultimately turns to the Particle Data Group catalogue.
The PDG tables are one of the great achievements of modern physics. But they are optimized for calculation, not for intuition about persistence.
The manuscript proposes a complementary, stability-first index of the same data:
intrinsically stable modes,
metastable particle modes,
prompt decayers,
resonances,
and context-dependent stability (such as neutrons in nuclei).
Nothing is removed. Nothing is denied.
The proposal is simply to read the catalogue as a map of stability regimes, rather than as a flat ontology of “fundamental particles”.
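To make the idea tangible, here is a toy sketch of my own (not the manuscript's) of what a stability-first index could look like in code. The lifetimes are rough order-of-magnitude values and the thresholds are placeholders, not proposed cut-offs.

```python
# Toy sketch of a stability-first index. Lifetimes are illustrative
# order-of-magnitude values; thresholds are placeholders, not proposed cut-offs.
from dataclasses import dataclass
from typing import Optional

@dataclass
class Mode:
    name: str
    lifetime_s: Optional[float]   # None = no observed decay
    context: str = ""             # e.g. "stable only inside a nucleus"

def stability_regime(m: Mode) -> str:
    if m.context:
        return f"context-dependent ({m.context})"
    if m.lifetime_s is None:
        return "intrinsically stable"
    if m.lifetime_s > 1e-10:
        return "metastable particle mode"
    if m.lifetime_s > 1e-20:
        return "prompt decayer"
    return "resonance"

catalogue = [
    Mode("electron", None),
    Mode("proton", None),
    Mode("free neutron", 8.8e2),
    Mode("bound neutron", None, context="stable inside many nuclei"),
    Mode("muon", 2.2e-6),
    Mode("neutral pion", 8.5e-17),
    Mode("Z boson", 3e-25),
]

for m in catalogue:
    print(f"{m.name:14s} -> {stability_regime(m)}")
```

The categories simply mirror the list above; the real work, deciding where the physical boundaries lie, is exactly what the manuscript leaves open.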
A programme statement, not a conclusion
This manuscript is intentionally incomplete.
It does not contain the “real work” of re-classifying the entire PDG catalogue. That work lies ahead and will take time, iteration, and—no doubt—many corrections.
What the manuscript provides is something else:
a programme statement.
A clear declaration of what kind of questions I think are still worth asking in particle physics, and why stability—rather than constituent bookkeeping—may be the right place to ask them from.
Why I am sharing this now
I am publishing this manuscript not as a final product, but as a marker.
A marker of a line of thought I intend to pursue seriously. A marker of a way of reading familiar physics that I believe remains underexplored. And an invitation to discussion—especially critical discussion—on whether this stability-first perspective is useful, coherent, or ultimately untenable.
Physics progresses by calculation. It matures by interpretation.
This manuscript belongs to the second category.
If that resonates with you, you may find the full text of interest.
Some time ago, I found myself involved in what can best be described as an intellectual fallout with a group of well‑intentioned amateur researchers. This post is meant to close that loop — calmly, without bitterness, and with a bit of perspective gained since.
One of the more sensible people in that group bothered to push an interesting article onto my desk, and so I want to talk about that one here.
Gary Taubes, CERN, and an unexpected reinforcement
It’s an article by Gary Taubes on the discovery of the W and Z bosons at CERN, later incorporated into his book Nobel Dreams. Far from undermining my position, the article did the opposite: it reinforced the point I had been trying to make all along.
Taubes does not engage in ontology. He does not ask what W and Z bosons are in a metaphysical sense. Instead, he describes what was measured, how it was inferred, and how fragile the boundary is between evidence and interpretation in large‑scale experimental physics.
This connects directly to an earlier piece I published here:
Let me restate the central point, because it is still widely misunderstood:
Criticizing the ontologization of W/Z bosons (or quarks and gluons) is not the same as denying the reality of the measurements that led to their introduction.
The measurements are real. The detector signals are real. The conservation laws used to infer missing energy and momentum are real. What is not forced upon us is the metaphysical leap that turns transient, unstable interaction states into quasi‑permanent “things.”
Stable vs. unstable states — a distinction we keep blurring
My own work has consistently tried to highlight a distinction that I find increasingly absent — or at least under‑emphasized — in mainstream physics discourse:
Stable states: long‑lived, persistent, and directly accessible through repeated measurement
Unstable or intermediate states: short‑lived, inferred through decay products, reconstructed statistically
W and Z bosons belong firmly to the second category. So do quarks and gluons in their confined form. Treating them as ontologically equivalent to stable particles may be pragmatically useful, but it comes at a conceptual cost.
It is precisely this cost that I criticize when I criticize mainstream physics.
Not because mainstream physics is “wrong.” But because it has become too comfortable collapsing epistemology into ontology, especially in its public and pedagogical narratives.
Why this matters now
There is another reason this distinction matters, and it is a forward‑looking one.
The probability that something radically new — in the sense of a fundamentally novel interaction or particle family — will be discovered in the coming decades is, by most sober assessments, rather low. What we will have, however, is:
More precise measurements
Larger datasets
Longer baselines
Better statistical control
In that landscape, progress will depend less on naming new entities and more on bridging what has already been measured, sometimes decades ago, but never fully conceptually digested.
That is where I intend to focus my efforts in the coming years.
Not by founding a new church. Not by declaring metaphysical revolutions. But by carefully working at the interface between:
what was actually measured,
what was legitimately inferred,
and what we may have too quickly reified.
Closing note
If there is one lesson I take — from the past dispute, from Taubes, from the history of CERN or fundamental physics in general — it is this:
Physics progresses best when it remains modest about what it claims to be about.
Measurements first. Interpretation second. Ontology, if at all, only with restraint.
That stance may be unsatisfying to those looking for grand narratives. But it is, I believe, the only way to keep physics from quietly turning into metaphysics while still wearing a lab coat.
Living Between Jobs and Life: AI, CERN, and Making Sense of What We Already Know
For decades (all of my life, basically :-)), I’ve lived with a quiet tension. On the one hand, there is the job: institutions, projects, deliverables, milestones, and what have you… On the other hand, there is life: curiosity, dissatisfaction, and the persistent feeling that something fundamental is still missing in how we understand the physical world. Let me refer to the latter as “the slow, careful machinery of modern science.” 🙂
These two are not the same — obviously — and pretending they are has done physics no favors (think of geniuses like Solvay, Edison or Tesla here: they were considered to be ‘only engineers’, right? :-/).
Jobs optimize. Life explores.
Large scientific institutions are built to do one thing extremely well: reduce uncertainty in controlled, incremental ways. That is not a criticism; it is a necessity when experiments cost billions, span decades, and depend on political and public trust. But the price of that optimization is that ontological questions — questions about what really exists — are often postponed, softened, or quietly avoided.
And now we find ourselves in a new historical moment.
The Collider Pause Is Not a Crisis — It’s a Signal
For decades, theoretical physics could lean on an implicit promise: the next machine will decide. Higher energies, larger datasets, finer resolution — always just one more accelerator away. That promise is now on pause.
Which means something important:
We can no longer postpone understanding by outsourcing it to future experiments.
Why CERN Cannot Do What Individuals Can
CERN is a collective of extraordinarily bright individuals. But this is a crucial distinction:
A collective of intelligent people is not an intelligent agent.
CERN is not designed to believe an ontology. It is designed to:
build and operate machines of unprecedented complexity,
produce robust, defensible measurements,
maintain continuity over decades,
justify public funding across political cycles.
Ontology — explicit commitments about what exists and what does not — is structurally dangerous to that mission. Not because it is wrong, but because it destabilizes consensus.
Within a collective:
someone’s PhD depends on a framework,
someone’s detector was designed for a specific ontology,
someone’s grant proposal assumes a given language,
someone’s career cannot absorb “maybe the foundations are wrong.”
So even when many individuals privately feel conceptual discomfort, the group-level behavior converges to: “Let’s wait for more data.”
That is not cowardice. It is inevitability.
We Are Drowning in Data, Starving for Meaning
The irony is that we are not short on data at all.
We have:
precision measurements refined to extraordinary accuracy,
anomalies that never quite go away,
models that work operationally but resist interpretation,
concepts (mass, spin, charge, probability) that are mathematically precise yet ontologically vague.
Quantum mechanics works. That is not in dispute. What remains unresolved is what it means.
This is not a failure of experiment. It is a failure of sense-making.
And sense-making has never been an institutional strength.
Where AI Actually Fits (and Where It Doesn’t)
I want to be explicit: I still have a long way to go in how I use AI — intellectually, methodologically, and ethically.
AI is not an oracle. It does not “solve” physics. It does not replace belief, responsibility, or judgment.
But it changes something fundamental.
AI allows us to:
re-analyze vast datasets without institutional friction,
explore radical ontological assumptions without social penalty,
apply sustained logical pressure without ego,
revisit old experimental results with fresh conceptual frames.
In that sense, AI is not the author of new physics — it is a furnace.
It does not tell us what to believe. It forces us to confront the consequences of what we choose to believe.
Making Sense of What We Already Know
The most exciting prospect is not that AI will invent new theories out of thin air.
It is that AI may help us finally make sense of experimental data that has been sitting in plain sight for decades. In practice, that means we can:
test radical but coherent ontologies against known data,
separate what is measured from how we talk about it,
revisit old results without institutional inertia.
This does not guarantee progress — but it makes honest failure possible. And honest failure is far more valuable than elegant confusion.
Between Institutions and Insight
This is not an AI-versus-human story.
It is a human-with-tools story.
Institutions will continue to do what they do best: build machines, refine measurements, and preserve continuity. That work is indispensable.
But understanding — especially ontological understanding — has always emerged elsewhere:
in long pauses,
in unfashionable questions,
in uncomfortable reinterpretations of existing facts.
We are entering such a pause now.
A Quiet Optimism
I do not claim to have answers. I do not claim AI will magically deliver them. I do not even claim my current ideas will survive serious scrutiny.
What I do believe is this:
We finally have the tools — and the historical conditions — to think more honestly about what we already know.
That is not a revolution. It is something slower, harder, and ultimately more human.
And if AI helps us do that — not by replacing us, but by challenging us — then it may turn out to be one of the most quietly transformative tools science has ever had.
Not because it solved physics.
But because it helped us start understanding it again.
Every few years, a paper comes along that stirs discomfort — not because it is wrong, but because it touches a nerve. Oliver Consa’s “Something is rotten in the state of QED” is one of those papers.
It is not a technical QED calculation. It is a polemic: a long critique of renormalization, historical shortcuts, convenient coincidences, and suspiciously good matches between theory and experiment. Consa argues that QED’s foundations were improvised, normalized, mythologized, and finally institutionalized into a polished narrative that glosses over its original cracks.
This is an attractive story. Too attractive, perhaps. So instead of reacting emotionally — pro or contra — I decided to dissect the argument with a bit of help.
At my request, an AI language model (“Iggy”) assisted in the analysis. Not to praise me. Not to flatter Consa. Not to perform tricks. Simply to act as a scalpel: cold, precise, and unafraid to separate structure from rhetoric.
This post is the result.
1. What Consa gets right (and why it matters)
Let’s begin with the genuinely valuable parts of his argument.
a) Renormalization unease is legitimate
Dirac, Feynman, Dyson, and others really did express deep dissatisfaction with renormalization. “Hocus-pocus” was not a joke; it was a confession.
Early QED involved:
cutoff procedures pulled out of thin air,
infinities subtracted by fiat,
and the philosophical hope that “the math will work itself out later.”
It did work out later — to some extent — but the conceptual discomfort remains justified. I share that discomfort. There is something inelegant about infinities everywhere.
b) Scientific sociology is real
The post-war era centralized experimental and institutional power in a way physics had never seen. Prestige, funding, and access influenced what got published and what was ignored. Not a conspiracy — just sociology.
Consa is right to point out that real science is messier than textbook linearity.
c) The g–2 tension is real
The ongoing discrepancy between experiment and the Standard Model is not fringe. It is one of the defining questions in particle physics today.
On these points, Consa is a useful corrective: he reminds us to stay honest about historical compromises and conceptual gaps.
2. Where Consa overreaches
But critique is one thing; accusation is another.
Consa repeatedly moves from:
“QED evolved through trial and error” to “QED is essentially fraud.”
This jump is unjustified.
a) Messiness ≠ manipulation
Early QED calculations were ugly. They were corrected decades later. Experiments did shift. Error bars did move.
That is simply how science evolves.
The fact that a 1947 calculation doesn’t match a 1980 value is not evidence of deceit — it is evidence of refinement. Consa collapses that distinction.
b) Ignoring the full evidence landscape
He focuses almost exclusively on:
the Lamb shift,
the electron g–2,
the muon g–2.
Important numbers, yes — but QED’s experimental foundation is vastly broader:
scattering cross-sections,
vacuum polarization,
atomic spectra,
collider data,
running of α, etc.
You cannot judge an entire theory on two or three benchmarks.
c) Underestimating theoretical structure
QED is not “fudge + diagrams.” It is constrained by:
Lorentz invariance,
gauge symmetry,
locality,
renormalizability.
Even if we dislike the mathematical machinery, the structure is not arbitrary.
So: Consa reveals real cracks, but then paints the entire edifice as rotten. That is unjustified.
3. A personal aside: the Zitter Institute and the danger of counter-churches
For a time, I was nominally associated with the Zitter Institute — a loosely organized group exploring alternatives to mainstream quantum theory, including zitterbewegung-based particle models.
I would now like to distance myself from that association.
Not because alternative models are unworthy — quite the opposite. But because I instinctively resist the dynamics such groups tend to develop:
strong internal identity,
suspicion of outsiders,
rhetorical overreach,
selective reading of evidence,
and occasional dogmatism about their own preferred models.
If we criticize mainstream physics for ad hoc factors, we must be brutal about our own.
Alternative science is not automatically cleaner science.
4. Two emails from 2020: why good scientists can’t always engage
This brings me to two telling exchanges from 2020 with outstanding experimentalists: Prof. Randolf Pohl (muonic hydrogen) and Prof. Ashot Gasparian (PRad).
Both deserve enormous respect, and I will not reveal the email exchanges themselves (out of respect, GDPR rules, or whatever). Both exchanges revealed to me the true bottleneck in modern physics — not intelligence, not malice, but sociology and bandwidth.
a) Randolf Pohl: polite skepticism, institutional gravity
Pohl was kind but firm:
He saw the geometric relations I proposed as numerology.
He questioned applicability to other particles.
He emphasized the conservatism of CODATA logic.
Perfectly valid. Perfectly respectable. But also… perfectly bound by institutional norms.
His answer was thoughtful — and constrained. (Source: ChatGPT analysis of emails with Prof. Dr. Pohl)
b) Ashot Gasparian: warm support, but no bandwidth
Gasparian responded warmly:
“Certainly your approach and the numbers are interesting.”
But: “We are very busy with the next experiment.”
Also perfectly valid. And revealing: even curious, open-minded scientists cannot afford to explore conceptual alternatives.
Their world runs on deadlines, graduate students, collaborations, grants.
(Source: ChatGPT analysis of emails with Prof. Dr. Gasparian)
The lesson
Neither professor dismissed the ideas because they were nonsensical. They simply had no institutional space to pursue them.
That is the quiet truth: the bottleneck is not competence, but structure.
5. Why I now use AI as an epistemic partner
This brings me to the role of AI.
Some colleagues (including members of the Zitter Institute) look down on using AI in foundational research. They see it as cheating, or unserious, or threatening to their identity as “outsiders.”
But here is the irony:
AI is exactly the tool that can think speculatively without career risk.
An AI:
has no grant committee,
no publication pressure,
no academic identity to defend,
no fear of being wrong,
no need to “fit in.”
That makes it ideal for exploratory ontology-building. In my case, that means things like:
real-space modeling without metaphysical inflation,
EM + relativity as a unified playground,
photons, electrons, protons, neutrons as geometric EM systems.
This is not a replacement for science. It is a tool for clearing conceptual ground, where overworked, over-constrained academic teams cannot go.
6. So… is something rotten in QED?
Yes — but not what you think.
What’s rotten is the mismatch between:
the myth of QED as a perfectly clean, purely elegant theory, and
the reality of improvised renormalization, historical accidents, social inertia, and conceptual discomfort.
What’s rotten is not the theory itself, but the story we tell about it.
What’s not rotten:
the intelligence of the researchers,
the honesty of experimentalists,
the hard-won precision of modern measurements.
QED is extraordinary. But it is not infallible, nor philosophically complete, nor conceptually finished.
And that is fine.
The problem is not messiness. The problem is pretending that messiness is perfection.
7. What I propose instead
My own program — pursued slowly over many years — is simple:
Bring physics back to Maxwell + relativity as the foundation.
Build real-space geometrical models of all fundamental particles.
Reject unnecessary “forces” invented to patch conceptual holes.
Hold both mainstream and alternative models to the same standard: no ad hoc constants, no magic, no metaphysics.
And — unusually — use AI as a cognitive tool, not as an oracle.
Let the machine check coherence. Let the human set ontology.
If something emerges from the dialogue — good. If not — also good.
But at least we will be thinking honestly again.
Conclusion
Something is rotten in the state of QED, yes — but the rot is not fraud or conspiracy.
It is the quiet decay of intellectual honesty behind polished narratives.
The cure is not shouting louder, or forming counter-churches, or romanticizing outsider science.
The cure is precision, clarity, geometry, and the courage to say:
Let’s look again — without myth, without prestige, without fear.
If AI can help with that, all the better.
— Jean Louis Van Belle (with conceptual assistance from “Iggy,” used intentionally as a scalpel rather than a sycophant)
Post-scriptum: Why the Electron–Proton Model Matters (and Why Dirac Would Nod)
A brief personal note — and a clarification that goes beyond Consa, beyond QED, and beyond academic sociology.
One of the few conceptual compasses I trust in foundational physics is a remark by Paul Dirac. Reflecting on Schrödinger’s “zitterbewegung” hypothesis, he wrote:
“One must believe in this consequence of the theory, since other consequences which are inseparably bound up with it, such as the law of scattering of light by an electron, are confirmed by experiment.”
Dirac’s point is not mysticism. It is methodological discipline:
If a theoretical structure has unavoidable consequences, and
some of those consequences match experiment precisely,
then even the unobservable parts of the structure deserve consideration.
This matters because the real-space electron and proton models I’ve been working on over the years — now sharpened through AI–human dialogue — meet that exact criterion.
They are not metaphors, nor numerology, nor free speculation. They force specific, testable, non-trivial predictions:
a confined EM oscillation for the electron, with its radius fixed by the Planck–Einstein relation;
a “photon-like” orbital speed for its point-charge center;
a distributed (not pointlike) charge cloud for the proton, enforced by mass ratio, stability, form factors, and magnetic moment;
natural emergence of the measured discrepancy;
and a geometric explanation of deuteron binding that requires no new force.
None of these are optional. They fall out of the internal logic of the model. And several — electron scattering, Compton behavior, proton radius, form-factor trends — are empirically confirmed.
Dirac’s rule applies:
When inseparable consequences match experiment, the underlying mechanism deserves to be taken seriously — whether or not it fits the dominant vocabulary.
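As a tiny concreteness check on the first item in that list, here is my own arithmetic (not a derivation from the papers), assuming only that the electron's rest energy is an oscillation obeying the Planck–Einstein relation E = ħω and that the point charge circles at (close to) the speed of light, so that ω = c/a:

```python
# Minimal numerical check (my arithmetic, under the stated assumptions):
# E = m_e*c^2 = hbar*omega together with omega = c/a forces a = hbar/(m_e*c).
from scipy.constants import hbar, m_e, c, e

a = hbar / (m_e * c)             # implied radius, ~3.86e-13 m (reduced Compton radius)
omega = c / a                    # implied angular frequency, ~7.76e20 rad/s
E_MeV = m_e * c**2 / e / 1e6     # electron rest energy, ~0.511 MeV

print(f"implied radius    a     = {a:.4e} m")
print(f"implied frequency omega = {omega:.4e} rad/s")
print(f"rest energy             = {E_MeV:.3f} MeV")
```

The point is only that the number is forced: once those two assumptions are granted, the radius has nowhere else to go.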
This post is not the place to develop those models in detail; that will come in future pieces and papers. But it felt important to state why I keep returning to them — and why they align with a style of reasoning that values:
geometry,
energy densities,
charge motion,
conservation laws,
and the 2019 SI foundations of h, c, and e over metaphysical categories and ad-hoc forces.
Call it minimalism. Call it stubbornness. Call it a refusal to multiply entities beyond necessity.
For me — and for anyone sympathetic to Dirac’s way of thinking — it is simply physics.
Every now and then a question returns with enough insistence that it demands a fresh attempt at an answer. For me, that question has always been: can we make sense of fundamental physics without multiplying entities beyond necessity? Can we explain light, matter, and their interactions without inventing forces that have no clear definition, or particles whose properties feel more like placeholders than physical reality?
Today, I posted a new paper on ResearchGate that attempts to do exactly that:
It is the result of an unusual collaboration: myself and an artificial intelligence (“Iggy”), working through the conceptual structure of photons, electrons, and protons with the only tool that has ever mattered to me in physics — Occam’s Razor.
No metaphysics. No dimensionless abstractions. No “magical” forces.
Just:
electromagnetic oscillations,
quantized action,
real geometries in real space,
and the recognition that many so-called mysteries dissolve once we stop introducing layers that nature never asked for.
The photon is treated as a linear electromagnetic oscillation obeying the Planck–Einstein relation. The electron as a circular oscillation, with a real radius and real angular momentum. The proton (and later, the neutron and deuteron) as systems we must understand through charge distributions, not fictional quarks that never leave their equations.
None of this “solves physics,” of course. But it does something useful: it clears conceptual ground.
And unexpectedly, the collaboration itself became a kind of experiment: what happens when human intuition and machine coherence try to reason with absolute precision, without hiding behind jargon or narrative?
The result is the paper linked above. Make of it what you will.
As always: no claims of authority. Just exploration, clarity where possible, and honesty where clarity fails.
If the questions interest you, or if the model bothers you enough to critique it, then the paper has succeeded in its only purpose: provoking real thought.
Over the past few weeks — and more intensely these past mornings — I’ve returned to two of my earliest texts in the Lectures on Physics series: the first on quantum behavior, and the second on probability amplitudes and quantum interference. Both have now been updated with new annexes, co-authored in dialogue with ChatGPT-4o.
This wasn’t just a consistency check. It was something more interesting: an exercise in thinking with — not through — a reasoning machine.
The first annex (Revisiting the Mystery of the Muon and Tau) tackles the open question I left hanging in Lecture I: how to interpret unstable “generations” of matter-particles like the muon and tau. In the original paper, I proposed a realist model where mass is not an intrinsic property but the result of oscillating charge or field energy — a stance that draws support from the 2019 revision of SI units, which grounded the kilogram in Planck’s constant and the speed of light. That change wasn’t just a technicality; it was a silent shift in ontology. I suspected that much at the time, but now — working through the implications with a well-tuned AI — I can state it more clearly: mass is geometry, inertia is field structure, and the difference between stable and unstable particles might be a matter of topological harmony.
The second annex (Interference, Identity, and the Imaginary Unit) reopens the deeper riddle at the heart of quantum mechanics: why probability amplitudes interfere at all. This annex is the child of years of irritation — visible in earlier, sharper essays I published on academia.edu — with the lazy mysticism that often surrounds “common phase factors.” The breakthrough, for me, was to fully accept the imaginary unit i not as a mathematical trick but as a rotation operator. When wavefunctions are treated as oriented field objects, not just complex scalars, interference becomes a question of geometric compatibility. Superpositions and spin behavior can then be reinterpreted as topological effects in real space. This is where I think mainstream physics got lost: it started calculating without explaining.
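For readers who want the bare arithmetic behind that reinterpretation, here is the standard two-path amplitude calculation (a generic textbook toy, nothing specific to the annex): multiplying by exp(i·δ) rotates one amplitude relative to the other, and the interference term is simply the cosine of that relative angle.

```python
# Standard two-path amplitude arithmetic (illustration only): exp(1j*delta)
# acts as a rotation, and the interference term in the intensity is
# 2*|a1|*|a2|*cos(delta), i.e. a statement about relative orientation.
import numpy as np

a1 = 1.0 / np.sqrt(2)                              # amplitude along path 1
for delta in np.linspace(0, np.pi, 5):             # relative phase between the paths
    a2 = (1.0 / np.sqrt(2)) * np.exp(1j * delta)   # path 2, rotated by delta
    intensity = abs(a1 + a2) ** 2
    predicted = 1.0 + np.cos(delta)                # |a1|^2 + |a2|^2 + interference term
    print(f"delta = {delta:4.2f} rad   |a1 + a2|^2 = {intensity:.3f}   1 + cos(delta) = {predicted:.3f}")
```

Whether one then reads the phase as mere bookkeeping or as a real geometric orientation is exactly the interpretive question the annex takes up.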
ChatGPT didn’t invent these ideas. But it helped me phrase them, frame them, and press further on the points I had once hesitated to formalize. That’s what I mean when I say this wasn’t just a cleanup job. It was a real act of collaboration — a rare instance of AI not just paraphrasing or predicting, but amplifying and clarifying an unfinished line of human reasoning.
They mark, I think, a modest turning point. From theory and calculation toward something closer to explanation.
And yes — for those following the philosophical side of this project: we did also try to capture all of that in a four-panel comic involving Diogenes, a turtle, and Zeno’s paradox. But that, like all things cartooned by AI, is still a work in progress. 🙂
Post Scriptum (24 June 2025): When You Let the Machine Take the Pen
In the spirit of openness: there’s been one more development since publishing the two annexes above.
Feeling I had taken my analytical skills as far as I could — especially in tackling the geometry of nuclear structure — I decided to do something different. Instead of drafting yet another paper, I asked ChatGPT to take over. Not as a ghostwriter, but as a model builder. The prompt was simple: “Do better than me.”
The result is dense, unapologetically geometric, and proposes a full zbw-based model for the neutron and deuteron — complete with energy constraints, field equations, and a call for numerical exploration. If the earlier annexes were dialogue, this one is delegation.
I don’t know if this is the end of the physics path for me. But if it is, I’m at peace with it. Not because the mystery is gone — but because I finally believe the mystery is tractable. And that’s enough for now.
Over the past years, I’ve been working — quietly but persistently — on a set of papers that circle one simple, impossible question: What is the Universe really made of?
Not in the language of metaphors. Not in speculative fields. But in terms of geometry, charge, and the strange clarity of equations that actually work.
Here are the three pieces of that arc:
🌀 1. Radial Genesis Radial Genesis: A Finite Universe with Emergent Spacetime Geometry This is the cosmological capstone. It presents the idea that space is not a stage, but an outcome — generated radially by mass–energy events, limited by time and light. It’s an intuitive, equation-free narrative grounded in general relativity and Occam’s Razor.
⚛️ 2. Lectures on Physics: On General Relativity (2) Lectures on GRT (2) This one is for the mathematically inclined. It builds from the ground up: tensors, geodesics, curvature. If Radial Genesis is the metaphor, this is the machinery. Co-written with AI, but line by line, and verified by hand.
🌑 3. The Vanishing Charge The Vanishing Charge: What Happens in Matter–Antimatter Annihilation? This paper is where the mystery remains. It presents two possible views of annihilation: (1) as a collapse of field geometry into free radiation, (2) or as the erasure of charge — with geometry as the by-product. We didn’t choose between them. We just asked the question honestly.
Why This Arc Matters
These three papers don’t offer a Theory of Everything. But they do something that matters more right now: They strip away the fog — the inflation of terms, the myth of complexity for complexity’s sake — and try to draw what is already known in clearer, more beautiful lines.
This is not a simulation of thinking. This is thinking — with AI as a partner, not a prophet.
So if you’re tired of being told that the Universe is beyond your grasp… Start here. You might find that it isn’t.
A researcher I was in touch with a few years ago sent me a link to the (virtual) Zitter Institute: https://www.zitter-institute.org/. It is a network and resource center for non-mainstream physicists who successfully explored – and keep exploring, of course – local/realist interpretations of quantum mechanics by going back to Schrödinger’s original and alternative interpretation of what an electron actually is: a pointlike (but not infinitesimally small) charge orbiting around in circular motion, with:
(i) the trajectory of its motion being determined by the Planck-Einstein relation, and
I started exploring Schrödinger’s hypothesis myself about ten years ago – as a full-blown alternative to the Bohr-Heisenberg interpretation of quantum mechanics (which I think of as metaphysical humbug, just as Einstein and H.A. Lorentz did at the time) – and consistently blogged and published about it: here on this website, and then on viXra, Academia and, since 2020, ResearchGate. So I checked out this new site, and I see the founding members added my blog site as a resource to their project list.
[…]
I am amazingly pleased with that. I mean… My work is much simpler than that of, say, Dr. John G. Williamson (CERN/Philips Research Laboratories/Glasgow University) and Dr. Martin B. van der Mark (Philips Research Laboratories), who created the Quantum Bicycle Society (https://quicycle.com/).
So… Have a look – not at my site (I think I did not finish the work I started) but at the other resources of this new Institute: it looks like this realist and local interpretation of quantum mechanics is no longer non-mainstream… Sweet ! It makes me feel the effort I put into all of this has paid off ! 😉 Moreover, some of my early papers (2018-2020) are listed as useful papers to read. I think that is better than being published in some obscure journal. 🙂
I repeat: my own research interest has shifted to computer science, logic and artificial intelligence (you will see that the recent papers on my RG site are all about that now). It is just so much more fun and it also lines up better with my day job as a freelance IT project manager. So, yes, it is goodbye – but I am happy I can now refer all queries about my particle models and this grand synthesis between old and new quantum mechanics to the Zitter Institute.
It’s really nice: I have been in touch with about half of the founding members of this Institute over the past ten years – casually or in a more sustained way, while discussing this or that 2D or 3D model of an electron, proton, or neutron – and they are all great and amazing researchers because they look for truth in science and are very much aware of this weird tendency of modern-day quantum scientists to turn their ideas into best-sellers perpetuating myths and mysteries. [I am not only thinking of the endless stream of books from authors like Roger Penrose (the domain for this blog was, originally, reading Penrose rather than reading Feynman) or Brian Greene here, but also of what I now think of as rather useless MIT or edX online introductions to quantum physics and quantum math.]
[…]
Looking at the website, I see the engine behind it: Dr. Oliver Consa. I was in touch with him too. He drew my attention to remarkable flip-flop articles such as William Lamb’s anti-photon article (an article which everyone should read, I think: unfortunately, you have to pay for it) and remarkable interviews with Freeman Dyson. Talking of the latter (whom I think of as “the Wolfgang Pauli of the third generation of quantum physicists” because he helped so many others to get a Nobel Prize – Dyson himself never got one, by the way), this is one of those interviews you should watch: just four years before he would die of old age, Freeman Dyson plainly admits that QED and QFT are a totally unproductive approach: a “dead end”, as Dyson calls it.
So, yes, I am very pleased and happy. It makes me feel my sleepless nights and hard weekend work on this over the past decade have not been in vain ! Paraphrasing Dyson in the above-mentioned video interview, I’d say: “It is the end of the story, and that particular illumination was a very joyful time.” 🙂
Thank you, Dr. Consa. Thank you, Dr. Vassallo, Dr. Burinskii, Dr. Meulenberg, Dr. Kovacs, and – of course – Dr. Hestenes – who single-handedly revived the Zitterbewegung interpretation of quantum mechanics in the 1990s. I am sure I forgot to mention some people. Sorry for that. I will wrap up my post here by saying a few more words about David Hestenes.
I really admire him deeply. Moving away from the topic of high-brow quantum theory, I think his efforts to reform K-12 education in math and physics are even more remarkable than the new space-time algebra (STA) he invented. I am 55 years old, so I know all about the small and pleasant burden of helping kids with math and statistics in secondary school and at university: the way teachers have to convey math and physics to kids now is plain dreadful. I hope it will get better. It has to. If the US and the EU want to keep leading in research, then STEM education (Science, Technology, Engineering, and Mathematics) needs a thorough reform.
I have not posted in a while, and that is because I find the format of a video much easier to express my thoughts. Have a look at my YouTube channel ! Also, for the more serious work, I must refer to my ResearchGate page. Have fun thinking things through ! 🙂
Last year’s (2022) Nobel Prize in Physics went to Alain Aspect, John Clauser, and Anton Zeilinger “for experiments with entangled photons, establishing the violation of Bell inequalities and pioneering quantum information science.”
I did not think much of that award last year. Proving that Bell’s No-Go Theorem cannot be right? Great. Finally! I think many scientists – including Bell himself – already knew this theorem was a typical GIGO argument: garbage in, garbage out. As the young Louis de Broglie famously wrote in the introduction of his thesis: hypotheses are worth only as much as the consequences that can be deduced from them, and the consequences of Bell’s Theorem did not make much sense. As I wrote in my post on it, Bell himself did not think much of his own theorem until, of course, he got nominated for a Nobel Prize: it is a bit hard to say you got nominated for a Nobel Prize for a theory you do not believe in yourself, isn’t it? In any case, Bell’s Theorem has now been experimentally disproved. That is – without any doubt – a rather good thing. 🙂 To save the face of the Nobel committee here (why award work that disproves something you would have given an award for a few decades earlier?): Bell would have gotten a Nobel Prize, but he died from a brain hemorrhage before he could, and Nobel Prizes reward the living only.
As for entanglement, I repeat what I have written many times already: the concept of entanglement – for which these scientists got a Nobel Prize last year – is just a fancy word for the simultaneous conservation of energy, linear and angular momentum (and – if we are talking matter-particles – charge). There is no ‘spooky action at a distance’, as Einstein derogatorily described it when the idea was first mentioned to him. So, I do not see why a Nobel Prize should be awarded for rephrasing a rather logical outcome of photon experiments in metamathematical terms.
Finally, the Nobel Prize committee writes that this has made a significant contribution to quantum information science. I wrote a paper on the quantum computing hype, in which I basically ask this question: qubits may or may not be better devices than MOSFETs to store data – they are not, and they will probably never be – but that is not the point. How does quantum information change the two-, three- or n-valued or other rule-based logic that is inherent to the processing of information? I wish the Nobel Prize committee could be somewhat more explicit on that because, when everything is said and done, one of the objectives of the Prize is to educate the general public about the advances of science, isn’t it?
However, all this ranting of mine is, of course, unimportant. We know that it took the distinguished Royal Swedish Academy of Sciences more than 15 years to even recognize the genius of an Einstein, so it was already clear then that their selection criteria were not necessarily rational. [Einstein finally got a well-deserved Nobel Prize, not for relativity theory (strangely enough: if there is one thing on which all physicists agree, it is that relativity theory is the bedrock of all of physics, isn’t it?), but for a much less-noted paper on the photoelectric effect – in 1922: 17 years after his annus mirabilis papers had made a killing not only in academic circles but in the headlines of major newspapers as well, and 10 years after a lot of fellow scientists had first nominated him for it (1910).]
Again, Mahatma Gandhi never got a Nobel Peace Prize (so Einstein should consider himself lucky to get some Nobel Prize, right?), while Ursula von der Leyen might be getting one for supporting the war with Russia, so I must remind myself of the fact that we do live in a funny world and, perhaps, we should not be trying to make sense of these rather weird historical things. 🙂
Let me turn to the main reason why I am writing this indignant post. It is this: I am utterly shocked by what Dr. John Clauser has done with his newly gained scientific prestige: he joined the CO2 coalition! For those who have never heard of it, it is a coalition of climate change deniers. A bunch of people who:
(1) vehemently deny the one and only consensus amongst all climate scientists, which is that the average temperature on Earth has risen by about two degrees Celsius since the Industrial Revolution, and
(2) say that, if climate change were real (God forbid!), then we can reverse the trend by easy geo-engineering. We just need to use directed energy or whatever to create more white clouds. If that doesn’t work, then… Well… CO2 makes trees and plants grow, so it will all sort itself out.
[…]
Yes. That is, basically, what Dr. Clauser and all the other scientific advisors of this lobby group – none of whom have any credentials in the field they are criticizing (climate science) – are saying, and they say it loud and clear. That is weird enough already. What is even weirder is that – to my surprise – a lot of people are actually buying such nonsense.
Frankly, I have not felt angry for a while, but this thing triggered an outburst of mine on YouTube, in which I state clearly what I think of Dr. Clauser and other eminent scientists who abuse their saint-like Nobel Prize status in society to deceive the general public. Watch my video rant, and think about it for yourself. Now, I am not interested in heated discussions on it: I know the basic facts. If you don’t, I listed them here. Look at the basic graphs and measurements before you would want to argue with me on this, please! To be clear on this: I will not entertain violent or emotional reactions to this post or my video. Moreover, I will delete them here on WordPress and also on my YouTube channel. Yes. For the first time in 10 years or so, I will exercise my right as a moderator of my channels, which is something I have never done before. 🙂
[…]
I will now calm down and write something about the mainstream interpretation of quantum physics again. 🙂 In fact, this morning I woke up with a joke in my head. You will probably think the joke is not very good, but then I am not a comedian and so it is what it is and you can judge for yourself. The idea is that you’d learn something from it. Perhaps. 🙂 So, here we go.
Imagine shooting practice somewhere. A soldier fires at some target with a fine gun, and then everyone looks at the spread of the hits around the bullseye. The quantum physicist says: “See: this is the Uncertainty Principle at work! What is the linear momentum of these bullets, and what is the distance to the target? Let us calculate the standard error.” The soldier looks astonished and says: “No. This gun is no good. One of the engineers should check it.” Then the drill sergeant says this: “The gun is fine. From this distance, all bullets should have hit the bullseye. You are a miserable shooter and you should really practice a lot more.” He then turns to the academic and says: “How did you get in here? I do not understand a word of what you just said and, if I do, it is of no use whatsoever. Please bugger off asap!”
This is a stupid joke, perhaps, but there is a fine philosophical point to it: uncertainty is not inherent to Nature, and it also serves no purpose whatsoever in the science of engineering or in science in general. Everything in Nature is deterministic. Statistically deterministic, but deterministic nevertheless. We may not know the initial conditions of the system, and that translates into seemingly random behavior, but if there is a pattern in that behavior (a diffraction pattern, in the case of electron or photon diffraction), then the conclusion should be that there is no such thing as metaphysical ‘uncertainty’. In fact, if you abandon that principle, then there is no point in trying to discover the laws of the Universe, is there? Because if Nature is uncertain, then there are no laws, right? 🙂
To underscore this point, I will, once again, remind you of what Heisenberg originally wrote about uncertainty. He wrote in German and distinguished three very different ideas of uncertainty:
(1) The precision of our measurements may be limited: Heisenberg originally referred to this as an Ungenauigkeit.
(2) Our measurement might disturb the position and, as such, cause the information to get lost and, as a result, introduce an uncertainty in our knowledge, but not in reality. Heisenberg originally referred to such uncertainty as an Unbestimmtheit.
(3) One may also think the uncertainty is inherent to Nature: that is what Heisenberg referred to as Ungewissheit. There is nothing in Nature – and also nothing in Heisenberg’s writings, really – that warrants the elevation of this Ungewissheit to a dogma in modern physics. Why? Because it is the equivalent of a religious conviction, like God exists or He doesn’t (both are theses we cannot prove: Ryle labeled such hypotheses as ‘category mistakes’).
Indeed, when one reads the proceedings of the Solvay Conferences of the late 1920s, 1930s and immediately after WW II (see my summary of it in https://www.researchgate.net/publication/341177799_A_brief_history_of_quantum-mechanical_ideas), then it is pretty clear that none of the first-generation quantum physicists believed in such dogma and – if they did – that they also thought what I am writing here: that it should not be part of science but part of one’s personal religious beliefs.
So, once again, I repeat that this concept of entanglement – for which John Clauser got a Nobel Prize last year – is in the same category: it is just a fancy word for the simultaneous conservation of energy, linear and angular momentum, and charge. There is no ‘spooky action at a distance’, as Einstein derogatorily described it when the idea was first mentioned to him.
Let me end by noting the dishonor of Nobel Prize winner John Clauser once again. Climate change is real: we are right in the middle of it, and it is going to get a lot worse before it gets any better – if it is ever going to get better (which, in my opinion, is a rather big ‘if‘…). So, no matter how many Nobel Prize winners deny it, they cannot change the fact that the average temperature on Earth has already risen by about 2 degrees Celsius since 1850. The question is not: is climate change happening? No. The question now is: how do we adapt to it – and that is an urgent question – and, then, the question is: can we, perhaps, slow down the trend, and how? In short, if these scientists from physics or the medical field or whatever other field they excel in are true and honest scientists, then they would do mankind a great favor not by advocating geo-engineering schemes to reverse a trend they actually deny is there, but by helping to devise and promote practical measures that allow communities affected by natural disasters to better recover from them.
So, I’ll conclude this rant by repeating what I think of all of this. Loud and clear: John Clauser and the other scientific advisors of the CO2 coalition are a disgrace to what goes under the name of ‘science’, and this umpteenth ‘incident’ in the history of science or logical thinking makes me think that it is about time the Royal Swedish Academy of Sciences did some serious soul-searching when, amongst the many nominations, it selects its candidates for a prestigious award like this. Alfred Nobel – one of those geniuses who regretted that his great contribution to science and technology was (also) (ab)used to increase the horrors of war – must have turned in his grave many times by now…
I wrote a post with this title already, but this time I mean it in a rather personal way: my last paper – with the same title – on ResearchGate sums up rather well whatever I achieved, and also whatever I did not explore any further because time and energy are lacking: I must pay more attention to my day job nowadays. 🙂
I am happy with the RG score all of my writing generated, the rare but heartfelt compliments I got from researchers with far more credentials than myself (such as, for example, Dr. Emmanouil Markoulakis of Nikolaos, whose input led me to put a paper on RG with a classical explanation of the Lamb shift), various friendly but not necessarily always agreeing commentators (one of them commenting here on this post: a good man!), and, yes, the interaction on my YouTube channel. But so… Well… That is it, then! 🙂
As a farewell, I will just quote from the mentioned paper – The End of Physics (only as a science, of course) – hereunder, and I hope that will help you to do what all great scientists would want you to do, and that is to think things through for yourself. 🙂
Brussels, 22 July 2023
Bohr, Heisenberg, and other famous quantum physicists – think of Richard Feynman, John Stewart Bell, Murray Gell-Mann, and quite a few other Nobel Prize winning theorists[1] – have led us astray. They swapped a rational world view – based on classical electromagnetic theory and statistical determinism – for a mystery world in which anything is possible, but nothing is real.
They invented ‘spooky action at a distance’ (as Einstein derogatorily referred to it), for example. So, what actually explains that long-distance interaction, then? It is quite simple. There is no interaction, and so there is nothing spooky or imaginary or unreal about it: if by measuring the spin state of one photon, we also know the spin state of its twin far away, then it is – quite simply – because physical quantities such as energy and momentum (linear or angular) will be conserved if no other interference is there after the two matter- or light-particles were separated.
Plain conservation laws explain many other things that are being described as ‘plain mysteries’ in quantum physics. The truth is this: there are no miracles or mysteries: everything has a physical cause and can be explained.[2] For example, there is also nothing mysterious about the interference pattern and the trajectory of an electron going through a slit, or one of two nearby slits. An electron is pointlike, but it is not infinitesimally small: it has an internal structure which explains its wave-like properties. Likewise, Mach-Zehnder one-photon interference can easily be explained when thinking of its polarization structure: a circularly polarized photon can be split in two linearly polarized electromagnetic waves, which are photons in their own right. Everything that you have been reading about mainstream quantum physics is, perhaps, not wrong, but it is highly misleading because it is all couched in guru language and mathematical gibberish.
Why is it that mainstream physicists keep covering this up? I am not sure: it is a strange mix of historical accident and, most probably, the human desire to be original or special, or the need to mobilize money for so-called fundamental research. I also suspect there is a rather deceitful intention to hide truths about what nuclear science should be all about, and that is to understand the enormous energies packed into elementary particles.[3]
The worst of all is that none of the explanations in mainstream quantum physics actually works: mainstream theory does not have a sound theory of signal propagation, for example (click the link to my paper on that or – better, perhaps – this link to our paper on signal propagation), and Schrödinger’s hydrogen model is a model of a hypothetical atom modelling orbitals of equally hypothetical zero-spin electron pairs. Zero-spin electrons do not exist, and real-life hydrogen only has one proton at its center, and one electron orbiting around it. Schrödinger’s equation is relativistically correct – even if all mainstream physicists think it is not – but the equation includes two mistakes that cancel each other out: it confuses the effective mass of an electron in motion with its total mass[4], and the 1/2 factor which is introduced by the m = 2meff substitution also takes care of the doubling of the potential that is needed to make the electron orbitals come out alright.
Worse still, mainstream quantum physicists never accurately modeled what they should have modeled: the hydrogen atom as a system of a real proton and a real electron (no hypothetical, infinitesimally small and structureless spin-zero particles). If they had done that, they would also be able to explain why hydrogen atoms come in molecular H2 pairs, and they would have a better theory of why two protons need a neutron to hold together in a helium nucleus. Moreover, they would have been able to explain what a neutron actually is.[5]
[1] John Stewart Bell was nominated for a Nobel Prize, but died from a brain hemorrhage before he could receive it for his theorem.
[2] The world of physics – at the micro-scale – is already fascinating enough: why should we invent mysteries?
[3] We do not think these energies can be exploited any time soon. Even nuclear energy is just binding energy between protons and neutrons: a nuclear bomb does not release the energy that is packed into protons. These elementary particles survive the blast: they are the true ‘atoms’ of this world (in the Greek sense of ‘a-tom’, which means indivisible).
[4] Mass is a measure of the inertia to a change in the state of motion of an oscillating charge. We showed how this works by explaining Einstein’s mass-energy equivalence relation and clearly distinguishing the kinetic and potential energy of an electron. Feynman first models an electron in motion correctly, with an equally correct interpretation of the effective mass of an electron in motion, but then substitutes this effective mass by half the electron mass (meff = m/2) in an erroneous reasoning process based on the non-relativistic kinetic energy concept. The latter reasoning also leads to the widespread misconception that Schrödinger’s equation would not be relativistically correct (see the Annexes to my paper on the matter-wave). For the trick it has to do, Schrödinger’s wave equation is correct – and then I mean also relativistically correct. 🙂
[5] A neutron is unstable outside of its nucleus. We, therefore, think it acts as the glue between protons, and it must be a composite particle.
I have been thinking about my explanation of dark matter/energy, and I think it is sound. It solves the last asymmetry in my models, and explains all. So, after a hiatus of two years, I bothered to make a podcast on my YouTube channel once again. It talks about everything. Literally everything !
It makes me feel my quest for understanding of matter and energy – in terms of classical concepts and measurements (as depicted below) – has ended. Perhaps I will write more but that would only be to promote the material, which should promote itself if it is any good (which I think it is).
I should, by way of conclusion, say a few final words about Feynman’s 1963 Lectures now. When everything is said and done, it is my reading of them which had triggered this blog about ten years ago. I would now recommend Volume I and II (classical physics and electromagnetic theory) – if only because it gives you all the math you need to understand all of physics – but not Volume III (the lectures on quantum mechanics). They are outdated, and I do find Feynman guilty of promoting rather than explaining the hocus-pocus around all of the so-called mysteries in this special branch of physics.
Quantum mechanics is special, but I do conclude now that it can all be explained in terms of classical concepts and quantities. So Gell-Mann’s criticism of Richard Feynman is, perhaps, correct: Mr. Feynman did, perhaps, make too many jokes – and that gets annoying, because he must have known that some of what he suggests does not make sense – even if I would not go as far as Gell-Mann, who said: “Feynman was only concerned about himself, his ego, and his own image !”
So, I would recommend my own alternative series of ‘lectures’. Not only are they easier to read, but they also embody a different spirit of writing. Science is not about you: it is about thinking for yourself and deciding what is truthful and useful, and what is not. So, to conclude, I will end by quoting Ludwig Boltzmann once more:
“Bring forward what is true.
Write it so that it is clear.
Defend it to your last breath.”
Ludwig Boltzmann (1844 – 1906)
Post scriptum: As for the ‘hocus-pocus’ in Feynman’s Lectures, we should, perhaps, point once again to some of our early papers on the flaws in his arguments. We effectively put our finger on the arbitrary wavefunction convention, the (false) boson-fermion dichotomy, the ‘time machine’ argument that is inherent to his explanation of the Hamiltonian, and so on. We published these things on Academia.edu before (also) putting our (later) papers on ResearchGate, so please check there for the full series. 🙂
Post scriptum (23 April 2023): Also check out this video, which was triggered by someone who thought my models amount to something like a modern aether theory, which is definitely not the case: https://www.youtube.com/watch?v=X38u2-nXoto. 🙂 I really think it is my last reflection on these topics. I need to focus on my day job, sports, family, etcetera again ! 🙂
I had not touched physics since April last year, as I was struggling with cancer, and finally went in for surgery. The surgery solved the problem, but physical and psychological recovery was slow, so I was in no mood to work on mathematical and physical questions. Now I am going through my ResearchGate papers again. I am starting with those that get a fair amount of downloads and – I am very pleased to see this – those are the papers that deal with very fundamental questions, and that lay out the core of an intuition that is more widely shared now: physicists are lost in contradictions and will not get out of this fuzzy situation until they solve them.
[Skeptical note here: I note that those physicists who bark loudest about the need for a scientific revolution are, unfortunately, often those who obscure things even more. For example, I quickly went through Hossenfelder’s Lost in Math (and I also emailed her to highlight all that zbw theory can bring), but she did not even bother to reply and, more generally, shows no sign of being willing to go back to the roots, which are the solutions that were presented during the early Solvay conferences but that – because of some weird twist in the history of science, and despite the warnings of intellectual giants such as H.A. Lorentz, Ehrenfest, or Einstein (and also Dirac or Bell in the latter half of their lives) – were discarded. I have come to the conclusion that it is simply not fashionable for modern-day scientists to admit that all of the mysteries were actually solved a long time ago.]
The key observation or contradiction is this: the formalism of modern quantum mechanics treats all particles – stable or unstable – as point objects: they are supposed to have no internal structure. At the same time, a whole range of what used to be thought of as intermediate mental constructs or temporary classifications – think of quarks here, or of the boson-fermion dichotomy – has acquired ontological status. We lamented that in one of our very first papers (titled: the difference between a theory, a calculation and an explanation), which has few formulas and is, therefore, a much easier read than the others.
Some of my posts on this blog were far more scathing and, therefore, not suitable for writing out in papers. See, for example, my Smoking Gun Physics post, in which I talk much more loudly (but also less scientifically) about the ontologization of quarks and of all those theoretical force-carrying particles that physicists have invented over the past 50 years or so.
My point of view is clear and unambiguous: photons and neutrinos (both of which can be observed and measured) will do. The rest (mainly the analysis of decay and of the chain of reactions after high-energy collisions) can be analyzed using scattering matrices and other classical techniques (on that, I did write a paper highlighting the proposals of people more enlightened than me, like Bombardelli, 2016, even if I think researchers like Bombardelli should push back to basics even more than they do). By the way, I should probably go much further in my photon and neutrino models, but time has prevented me from doing so. In any case, I did update and put an older paper of mine online, with some added thoughts on recent experiments that seem to confirm neutrinos have some rest mass. That is only to be expected, I would think. Have a look at it.
[…]
This is a rather lengthy introduction to the topic I want to write about for my public here, which is people like you and me: (amateur) physicists who want to make sense of all that is out there. So I will give a small summary of an equation I was never interested in: Dirac’s wave equation. Why my lack of interest before, and my renewed interest now?
The reason is this: Feynman clearly never believed Dirac’s equation added anything to Schrödinger’s, because he does not even mention it in his Lectures – which, I believe, are still truly seminal today, even if they do not go into all of the stuff mainstream quantum physicists now believe to be true (which is, I repeat, all of the metaphysics around quarks and gluons and force-carrying bosons and all that). So I did not bother to dig into it.
However, when revising my paper on de Broglie’s matter-wave, I realized that I should have analyzed Dirac’s equation too, because I do analyze Schrödinger’s wave equation there (which makes sense), and also comment on the Klein-Gordon wave equation (which, just like Dirac’s, does not make much of an impression on me). Hence, I would say my renewed interest is only there because I wanted to tidy up a little corner in this kitchen of mine. 🙂
I will stop rambling now, and get on with it.
Dirac’s wave equation: concepts and issues
We should start by reminding ourselves what a wave equation actually is: it models how waves – sound waves, or electromagnetic waves, or – in this particular case – a ‘wavicle’ or wave-particle – propagate in space and in time. As such, it is often said that wave equations model the properties of the medium (think of properties such as elasticity, density, permittivity or permeability here) but, because we no longer think of spacetime as an aether, quantum-mechanical wave equations are far more abstract.
I should insert a personal note here. I do have a personal opinion on the presumed reality of spacetime. It is not very solid, perhaps, because I oscillate between (1) Kant’s intuition that space and time are mental constructs only, which our mind uses to structure its impressions (we are talking science here, so I should say: our measurements), and (2) the idea that the 2D or 3D oscillations of pointlike charges within, say, an electron, a proton or a muon-electron must involve some kind of elasticity of the ‘medium’ that we commonly refer to as spacetime (I’d say that is more in line with Wittgenstein’s philosophy of reality). I should look it up, but I think I do talk about the elasticity of spacetime on one or two occasions in the papers that deal with internal forces in particles, or in those in which I dig deep into the potentials that may or may not drive these oscillations. I am not sure how far I go there. Probably too far. But if properties such as vacuum permittivity or permeability are generally accepted, then why not think of elasticity? However, I did try to remain very cautious when it comes to postulating properties of the so-called spacetime vacuum, as evidenced by what I write in one of the papers referenced above:
“Besides proving that the argument of the wavefunction is relativistically invariant, this [analysis of the argument of the wavefunction] also demonstrates the relativistic invariance of the Planck-Einstein relation when modelling elementary particles.[1] This is why we feel that the argument of the wavefunction (and the wavefunction itself) is more real – in a physical sense – than the various wave equations (Schrödinger, Dirac, or Klein-Gordon) for which it is some solution. In any case, a wave equation usually models the properties of the medium in which a wave propagates. We do not think the medium in which the matter-wave propagates is any different from the medium in which electromagnetic waves propagate. That medium is generally referred to as the vacuum and, whether or not you think of it as true nothingness or some medium, we think Maxwell’s equations – which establishes the speed of light as an absolute constant – model the properties of it sufficiently well! We, therefore, think superluminal phase velocities are not possible, which is why we think de Broglie’s conceptualization of a matter particle as a wavepacket – rather than one single wave – is erroneous.[2]“
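To make the invariance claim in that quote concrete: the argument of the elementary wavefunction is, in the usual notation, θ = (E·t – p·x)/ħ, i.e. the product of the energy-momentum four-vector and the spacetime four-vector divided by ħ – and such four-vector products come out the same in every inertial reference frame.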
The basic idea is this: if the vacuum is true nothingness, then it cannot have any properties, right? 🙂 That is why I call the spacetime vacuum, as it is being modelled in modern physics, a so-called vacuum. 🙂
[…] I guess I am rambling again, so I should get back to the matter at hand – and quite literally so, because we are effectively talking about real-life matter here. To be precise, we are talking about Dirac’s view of an electron moving in free space. Let me add the following clarification, just to make sure we understand exactly what we are talking about: free space is space without any potential in it – no electromagnetic, gravitational or other fields you might think of.
In reality, such free space does not exist: it is just one of those idealizations which we need to model reality. All of real-life space – the Universe we live in, in other words – has potential energy in it: electromagnetic and/or gravitational potential energy (no other potential energy has been convincingly demonstrated so far, so I will not add to the confusion by suggesting there might be more). Hence, there is no such thing as free space.
What am I saying here? I am just saying that it is not bad to remind ourselves that Dirac’s construction is theoretical from the outset. To me, it feels like trying to present electromagnetism while making full abstraction of the magnetic side of the electromagnetic force. That is all I am saying here. Nothing more, nothing less. No offense to the greatness of a mind like Dirac’s.
[…] I may have lost you as a reader just now, so let me try to get you back: Dirac’s wave equation. Right. Dirac develops it in two rather dense sections of his Principles of Quantum Mechanics, which I will not try to summarize here. I want to make it easy for the reader, so I will limit myself to an analysis of the very first principle(s) which Dirac develops in his Nobel Prize Lecture. It is this (relativistically correct) energy equation:
E² = m₀²c⁴ + p²c²
This equation may look unfamiliar to you but, frankly, if you are familiar with the basics of relativity theory, it should not come across as weird or unfathomable. It is one of the many basic ways of expressing relativity theory, as evidenced by the fact that Richard Feynman introduces this equation in the very first volume of his Lectures on Physics, and in one of its more basic chapters: just click on the link and work your way through it, and you will see it is just another rendering of Einstein’s mass-energy equivalence relation (E = mc²).
The point is this: it is now very easy to understand Dirac’s basic energy equation – the one he uses to then go from variables to quantum-mechanical operators and all of the other mathematically correct hocus-pocus that results in his wave equation. Just write W (the total energy, mc²) instead of E, and then divide everything by c²:
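W² = m₀²c⁴ + p²c²  ⇒  W²/c² – p² – m₀²c² = 0

That is nothing but the energy equation above, rewritten in the W-notation of Dirac’s Nobel Prize Lecture.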
So here you are. All the rest is the usual hocus-pocus: we substitute classical variables by operators, and then we let them operate on a wavefunction (wave equations may or may not describe the medium, but wavefunctions surely do describe real-life particles), and then we have a complicated differential equation to solve. And – as we made abundantly clear in this and other papers (one that you may want to read is my brief history of quantum-mechanical ideas, because I had a lot of fun writing that one, and it is not technical at all) – when you do that, you will find nonsensical solutions, except for the one that Schrödinger pointed out: the Zitterbewegung electron, which we believe corresponds to the real-life electron.
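(The substitution in question is the standard one, by the way: the energy becomes the operator iħ·∂/∂t and the momentum becomes –iħ·∇, both acting on the wavefunction ψ.)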
I will wrap this up (although you will say I have not done my job yet) by quoting a passage and some comments from my de Broglie paper:
Prof. H. Pleijel, then Chairman of the Nobel Committee for Physics of the Royal Swedish Academy of Sciences, dutifully notes this rather inconvenient property in the ceremonial speech for the 1933 Nobel Prize, which was awarded to Heisenberg for nothing less than “the creation of quantum mechanics”[1]:
“Matter is formed or represented by a great number of this kind of waves which have somewhat different velocities of propagation and such phase that they combine at the point in question. Such a system of waves forms a crest which propagates itself with quite a different velocity from that of its component waves, this velocity being the so-called group velocity. Such a wave crest represents a material point which is thus either formed by it or connected with it, and is called a wave packet. […] As a result of this theory, one is forced to the conclusion to conceive of matter as not being durable, or that it can have definite extension in space. The waves, which form the matter, travel, in fact, with different velocity and must, therefore, sooner or later separate. Matter changes form and extent in space. The picture which has been created, of matter being composed of unchangeable particles, must be modified.”
This should sound very familiar to you. However, it is obviously not true: real-life particles – electrons or atoms traveling in space – do not dissipate. Matter may change form and extent in space a little – such as, for example, when we force particles through one or two slits[2] – but not fundamentally so![3]
We repeat it again, in very plain language this time: Dirac’s wave equation is essentially useless, except for the fact that it actually models the electron itself. That is why only one of its solutions makes sense, and that is the very trivial solution which Schrödinger pointed out: the Zitterbewegung electron, which we believe corresponds to the real-life electron. 🙂 It just goes through space and time like any ordinary particle would, but its trajectory is not given by Dirac’s wave equation. In contrast, Schrödinger’s wave equation (with or without a potential being present: in non-free or free space, in other words) does the trick and – against mainstream theory – I dare say, after analysis of its origins, that it is relativistically correct. Its only drawback is that it does not incorporate the most essential property of an elementary particle: its spin. That is why it models electron pairs rather than individual electrons.
We can easily generalize to protons and to other elementary or non-elementary particles. For a deeper discussion of Dirac’s wave equation (which is what you probably expected), I must refer, once again, to Annex II of my paper on the interpretation of de Broglie’s matter-wave: it is all there, really, and – glancing at it all once again – the math is actually quite basic. In any case, paraphrasing Euclid’s reply to King Ptolemy, I would say that there is no royal road to quantum mechanics. One must go through its formalism and, far more importantly, through its history of thought. 🙂
To conclude, I would like to return to one of the remarks I made in the introduction. What about the properties of the vacuum? I will remain cautious and, hence, not answer that question. I prefer to let you think about this rather primitive classification of what is relative and what is not, and about how the equations in physics mix the two. 🙂
[1] To be precise, Heisenberg got a postponed prize from 1932. Erwin Schrödinger and Paul A.M. Dirac jointly got the 1933 prize. Prof. Pleijel acknowledges all three in more or less equal terms in the introduction of his speech: “This year’s Nobel Prizes for Physics are dedicated to the new atomic physics. The prizes, which the Academy of Sciences has at its disposal, have namely been awarded to those men, Heisenberg, Schrödinger, and Dirac, who have created and developed the basic ideas of modern atomic physics.”
[2] The wave-particle duality of the ring current model should easily explain single-electron diffraction and interference (the electromagnetic oscillation which keeps the charge swirling would necessarily interfere with itself when being forced through one or two slits), but we have not had the time to engage in detailed research here.
[3] We will slightly nuance this statement later, but we will not fundamentally alter it. We think of matter-particles as an electric charge in motion. Hence, as it acts on a charge, the nature of the centripetal force that keeps the particle together must be electromagnetic. Matter-particles, therefore, embody wave-particle duality. Of course, it makes a difference whether this electromagnetic oscillation, and the electric charge, move through a slit or in free space. We will come back to this later. The point to note is: matter-particles do not dissipate. Feynman actually notes this at the very beginning of his Lectures on quantum mechanics, when describing the double-slit experiment for electrons: “Electrons always arrive in identical lumps.”
[1] The relativistic invariance of the Planck-Einstein relation emerges from other problems, of course. However, we see the added value of the model here in providing a geometric interpretation: the Planck-Einstein relation effectively models the integrity of a particle here.
After a long break (more than six months), I have started to engage in a few conversations again. I also looked at the 29 papers on my ResearchGate page, and I realize some of them need to be re-written or re-packaged so as to ensure a good flow. Also, some of the approaches were more productive than others (some did not lead anywhere at all, actually), and I need to point those out. I have been thinking about how to approach this, and I think I am going to produce an annotated version of these papers, with comments and corrections as mark-ups. Re-writing or re-structuring all of them would require too much work.
The mark-up of those papers will probably be based on some ‘quick-fire’ remarks (a succession of thoughts triggered by one and the same question) which came out of the conversation below, so I thank these thinkers for having kept me in the loop of a discussion I had followed but not reacted to. It is an interesting one, on the question of ‘deep electron orbitals’ (read: whether orbitals of negative charge inside of a nucleus exist and, if so, how one can model them). If one could solve that question, one would have a theoretical basis for what is referred to as low-energy nuclear reactions. That field was formerly known as cold fusion, but it got a bit of a bad name because a number of crooks spoiled it, unfortunately.
PS: I leave out the family names of my correspondents in the exchange below so they cannot be bothered. One of them, Jerry, is a former American researcher at SLAC. Andrew – the key researcher on DEPs – is a Canadian astrophysicist, and the third one – Jean-Luc – is a rather prominent French scientist in LENR.
From: Jean Louis Van Belle Sent: 18 November 2021 22:51 Subject: Staying engaged (5)
Oh – and needless to say, Dirac’s basic equation can, of course, be expanded using the binomial expansion – just like the relativistic energy-momentum relation – and one can then ‘cut off’ the third-, fourth-, etc.-order terms and keep only the first- and second-order terms. Perhaps it is equations like that which kept you puzzled (I should check your original emails). In any case, this way of going about energy equations for elementary particles is a bit like what is done in perturbation theory, in which – as Dirac complained – one randomly selects terms that seem to make sense and discards others because they do not seem to make sense. Of course, Dirac criticized perturbation theory much more severely than this – and rightly so. 😊 😊 JL
From: Jean Louis Van Belle Sent: 18 November 2021 22:10 Subject: Staying engaged (4)
Also – I remember you had some questions on an energy equation – I am not sure which one – but I found that Dirac’s basic equation (the one from which he derives the ‘Dirac’ wave equation) is essentially useless because it incorporates linear momentum only. As such, it repeats de Broglie’s mistake, which is to interpret the ‘de Broglie’ wavelength as something linear. It is not: these frequencies and wavelengths are orbital frequencies and orbital circumferences. So anything you would want to do with energy equations that are based on that leads nowhere – in my not-so-humble opinion, of course. To illustrate the point, compare the relativistic energy-momentum relation and Dirac’s basic equation in his Nobel Prize lecture (I hope the subscripts/superscripts get through your email system so they display correctly):
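W² = pr²c² + m²c⁴ (the relativistic energy-momentum relation, written with W for the total energy and pr for the momentum)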
Divide the above by c² and re-arrange, and you get Dirac’s equation: W²/c² – pr² – m²c² = 0 (see his 1933 Nobel Prize Lecture).
So that cannot lead anywhere. It’s why I totally discard Dirac’s wave equation (it has never yielded any practical explanation of a real-life phenomenon anyway, if I am not mistaken).
Cheers – JL
From: Jean Louis Van Belle Sent: 18 November 2021 21:49 Subject: Staying engaged (3)
Just on ‘retarded sources’ and ‘retarded fields’ – I have actually tried to think about the ‘force mechanism’ inside of an electron or a proton (what keeps the pointlike charge in its geometric orbit around a center of mass?). I thought long and hard about some kind of model in which the charge radiates out a sub-Planck field whose ‘retarded effects’ arrive ‘just in time’ at the other side of the orbital (or at whatever other point on the orbital) so as to produce the desired ‘course correction’. I discarded it completely: I am now just happy that we have ‘reduced’ the mystery to this ‘Planck-scale quantum-mechanical oscillation’ (in 2D or 3D orbitals) without the need for an ‘aether’, or quantized spacetime, or ‘virtual particles’ actually ‘holding the thing together’.
Also, a description in terms of four-vectors (scalar and vector potential) does not immediately call for ‘retarded time’ variables and all that, so that is another reason why I think one should somehow make the jump from E-B fields to the scalar and vector potential, even if the math is hard to visualize. If we want to ‘visualize’ things, Feynman’s discussion of the ‘energy’ and ‘momentum’ flow in https://www.feynmanlectures.caltech.edu/II_27.html might make sense, because I think analyses in terms of Poynting vectors are relativistically correct, aren’t they? It is just an intuitive idea…
Cheers – JL
From: Jean Louis Van Belle Sent: 18 November 2021 21:28 Subject: Staying engaged (2)
But so – in the shorter run – say, the next three to six months – I want to sort out those papers on ResearchGate. The one on de Broglie’s matter-wave (interpreting the de Broglie wavelength as the circumference of a loop rather than as a linear wavelength) is the one that gets the most downloads, and rightly so. The rest is a bit of a mess – mixing all kinds of things I tried, some of which worked, while other things did not. So I want to ‘clean’ that up… 😊 JL
From: Jean Louis Van Belle Sent: 18 November 2021 21:21 Subject: Staying engaged…
Please do include me in the exchanges, Andrew – even if I do not react, I do read them, because I do need some temptation and distraction. As mentioned, I wanted to focus on building a credible n = p + e model (for free neutrons, but probably more focused on a Schrödinger-like D = p + e + p Platzwechsel model, because the deuteron nucleus is stable). But I will not do that the way I studied the zbw model of the electron and the proton (which I believe is sound now) – that is, by not getting enough sleep. I want to do it slowly now. I find a lot of satisfaction in the fact that I think there is no need for complicated quantum field theories (fields are quantized, but in a rather obvious way: field oscillations – just like matter-particles – pack Planck’s quantum of (physical) action, which – depending on whether you freeze time or position as a variable – expresses itself as a discrete amount of energy or, alternatively, as a discrete amount of momentum), nor is there any need for this ‘ontologization’ of virtual field interactions (sub-Planck scale) – the quark-gluon nonsense.
Also, it makes sense to distinguish between an electromagnetic and a ‘strong’ or ‘nuclear’ force: the electron and proton have different form factors (2D versus 3D oscillations, but that is a bit of a non-relativistic shorthand for what might be the case) and, in addition, there is clearly a much stronger force at play within the proton – a force whose strength is of the same ‘scale’ as the force that gives the muon-electron its rather enormous mass. So that is my ‘belief’, and the ‘heuristic’ models I build (a bit of ‘numerology’, according to Dr Pohl’s rather off-hand remarks) support it sufficiently to make me feel at peace about all these ‘Big Questions’.
I am also happy I figured out the inconsistencies around 720-degree symmetries (they are just the result of a non-rigorous application of Occam’s Razor: if you use all possible ‘signs’ in the wavefunction, then the wavefunction may represent matter- as well as anti-matter particles, and this 720-degree weirdness dissolves). Finally, the kind of ‘renewed’ S-matrix programme for analyzing unstable particles (adding a transient factor to the wavefunctions) makes sense to me, but even the easiest set of equations looks impossible to solve – so I may want to dig into the math of that if I feel like having endless amounts of time and energy (which I do not – but, after this cancer surgery, I know I will only die on some ‘moral’ or ‘mental’ battlefield twenty or thirty years from now – so I am optimistic).
So, in short, the DEP question does intrigue me – and you should keep me posted – but I will only look at it to see if it can help me with that deuteron model. 😊 That is the only ‘deep electron orbital’ I actually believe in. Sorry for that last note.
Cheers – JL
From: Andrew Sent: 16 November 2021 19:05 To: Jean-Luc; Jerry; Jean Louis Subject: Re: retarded potential?
Dear Jean-Louis,
Congratulations on your new position. I understand your present limitations, despite your incredible ability to be productive. They must be even worse than those imposed by my young kids and my age. Do you wish for us to not include you in our exchanges on our topic? Even with no expectation of your contributing at this point, such emails might be an unwanted temptation and distraction.
Dear Jean-Luc,
Thank you for the Wiki-Links. They are useful. I agree that the 4-vector potential should be considered. Since I am now considering the nuclear potentials as well as the deep orbits, it makes sense to consider the nuclear vector potentials to have an origin in the relativistic Coulomb potentials. I am facing this in my attempts to calculate the deep orbits from contributions to the potential energies that have a vector component, which non-rel Coulomb potentials do not have.
For example: do we include the losses in Vcb (e.g., from the binding energy BE) when we make the relativistic correction to the potential; or, how do we relativistically treat pseudo-potentials such as that of the centrifugal force? We know that, for equilibrium, the average forces must cancel. However, I’m not sure that it is possible to write out a proper expression for “A” to fit such cases.
Best regards to all,
Andrew
_ _ _
On Fri, Nov 12, 2021 at 1:42 PM Jean-Luc wrote:
Dear all,
I totally agree with the sentence of Jean-Louis, which I put in bold in his message, about the vector potential and the scalar potential being combined into a 4-vector potential A for representing the EM field in covariant formulation. So the representation of the EM field by the 4-vector A has been developed extensively, as JL wishes, within the framework of QED.
We can see the reality of the vector potential in the Aharonov-Bohm effect: https://en.wikipedia.org/wiki/Aharonov-Bohm_effect. In fact, we can see that the vector potential contains more information than the E and B fields. Best regards
Jean-Luc – On 12/11/2021 at 05:43, Jean Louis Van Belle wrote:
Hi All – I’ve been absent from the discussion, and will remain absent for a while. I’ve been juggling a lot of work – my regular job at the Ministry of Interior (I got an internal promotion/transfer, and am now working on police and security sector reform) plus consultancies on upcoming projects in Nepal. In addition, I am still recovering from my surgery – I got a bad flu (not C19, fortunately) and it set back my immune system, I feel. I have a bit of a holiday break now (combining the public holidays of 11 and 15 November in Belgium with some days off to bridge, so I have a rather nice super-long weekend – three in one, so to speak).
As for this thread, I feel like it is not ‘phrasing’ the discussion in the right ‘language’. Thinking in terms of E-fields and retarded potentials is thinking in terms of 3D potentials, separating out space and time variables without using the ‘power’ of four-vectors (the four-vector potential, and four-vector spacetime). It is important to remind ourselves that we are measuring fields in continuous space and time (but, again, this is relativistic spacetime – so us visualizing a 3D potential at some point in space is what it is: we visualize something because our mind needs that – wants that). The fields are discrete, however: a field oscillation always packs one unit of Planck’s quantum of action, and that quantum of action combines energy and momentum: we should not think of energy and momentum as truly ‘separate’ (discrete) variables, just like we should not think of space and time as truly ‘separate’ (continuous) variables.
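(In formula form, that is nothing but the Planck-Einstein and de Broglie relations written out: E = h·f = h/T and p = h/λ, so h = E·T = p·λ – one and the same quantum of action, read either as a discrete amount of energy over a cycle time or as a discrete amount of momentum over a wavelength.)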
I do not quite know what I want to say here – or how I should further work it out. I am going to re-read my papers. I think I should further develop the last one (https://www.researchgate.net/publication/351097421_The_concepts_of_charge_elementary_ring_currents_potential_potential_energy_and_field_oscillations), in which I write that the vector potential is more real than the electric field. That idea – and the scalar potential – should be developed further, and probably it is the combined scalar and vector potential that is the ’real’ thing, not the electric and magnetic field. Hence, illustrations like the one below – in terms of discs and cones in space – probably do not go all that far in terms of ‘understanding’ what is going on… It’s just an intuition…
Cheers – JL
From: Andrew Sent: 23 September 2021 17:17 To: Jean-Luc; Jerry; Jean Louis Subject: retarded potential?
Dear Jean-Luc,
Because of the claim that gluons are tubal, I have been looking at the disk-shaped E-field lines of the highly relativistic electron and comparing them to the retarded potential, which, based on timing, would seem to give a cone rather than a disk (see figure). This makes a difference when we consider a deep-orbiting electron. It even impacts the selection of the model for the impact of an electron when considering diffraction and interference.
Even if the field appears to be spreading out as a cone, the direction of the field lines is that of a disk from the retarded source. However, how does it interact with the radial field of a stationary charge?
Do you have any thoughts on the matter?
Best regards,
Andrew
_ _ _
On Thu, Sep 23, 2021 at 5:05 AM Jean-Luc wrote:
Dear Andrew, Thank you for the references. Best regards, Jean-Luc
On 18/09/2021 at 17:32, Andrew wrote: This might have useful thoughts concerning the question of radiation decay to/from EDOs. Quantum Optics: Electrons see the quantum nature of light (Ian S. Osborne). We know that light is both a wave and a particle, and this duality arises from the classical and quantum nature of electromagnetic excitations. Dahan et al. observed that all experiments to date in which light interacts with free electrons have been described with light considered as a wave (see the Perspective by Carbone). The authors present experimental evidence revealing the quantum nature of the interaction between photons and free electrons. They combine an ultrafast transmission electron microscope with a silicon-photonic nanostructure that confines and strengthens the interaction between the light and the electrons. The “quantum” statistics of the photons are imprinted onto the propagating electrons and are seen directly in their energy spectrum. Science, abj7128, this issue p. 1324; see also abl6366, p. 1309
A few days ago, I mentioned I felt like writing a new book: a sort of guidebook for amateur physicists like me. I realized that it is actually fairly easy to do. I have three very basic papers – one on particles (both light and matter), one on fields, and one on the quantum-mechanical toolbox (amplitude math and all of that). But then there is a lot of nitty-gritty to be written about the technical stuff, of course: self-interference, superconductors, the behavior of semiconductors (as used in transistors), lasers, and so many other things – and all of the math that comes with it. However, for that, I can refer you to Feynman’s three volumes of lectures, of course. In fact, I should: it’s all there. So… Well… That’s it, then. I am done with the QED sector. Here is my summary of it all (links to the papers on Phil Gibbs’ site):
The last paper is interesting because it shows statistical indeterminism is the only real indeterminism. We can, therefore, use Bell’s Theorem to prove our theory is complete: there is no need for hidden variables, so why should we bother about trying to prove or disprove they can or cannot exist?
Jean Louis Van Belle, 21 October 2020
Note: As for the QCD sector, that is a mess. We might have to wait another hundred years or so to see the smoke clear up there. Or, who knows, perhaps some visiting alien(s) will come and give us a decent alternative for the quark hypothesis and quantum field theories. One of my friends thinks so. Perhaps I should trust him more. 🙂
As for Phil Gibbs, I should really thank him for being one of the smartest people on Earth – and for his site, of course. Brilliant forum. Does what Feynman wanted everyone to do: look at the facts, and think for yourself. 🙂
I have a crazy new idea: a complete re-write of Feynman’s Lectures. It would be fun, wouldn’t it? I would follow the same structure—but start with Volume III, of course: the lectures on quantum mechanics. We could even re-use some language—although we’d need to be careful so as to keep Mr. Michael Gottlieb happy, of course. 🙂 What would you think of the following draft Preface, for example?
The special problem we try to get at with these lectures is to maintain the interest of the very enthusiastic and rather smart people trying to understand physics. They have heard a lot about how interesting and exciting physics is—the theory of relativity, quantum mechanics, and other modern ideas—and spend many years studying textbooks or following online courses. Many are discouraged because there are really very few grand, new, modern ideas presented to them. The problem is whether or not we can make a course which would save them by maintaining their enthusiasm.
The lectures here are not in any way meant to be a survey course, but are very serious. I thought it would be best to re-write Feynman’s Lectures to make sure that most of the above-mentioned enthusiastic and smart people would be able to encompass (almost) everything that is in the lectures. 🙂
This is the link to Feynman’s original Preface, so you can see how my preface compares to his: same-same but very different, they’d say in Asia. 🙂
[…]
Doesn’t that sound like a nice project? 🙂
Jean Louis Van Belle, 22 May 2020
Post scriptum: It looks like we made Mr. Gottlieb and/or MIT very unhappy already: the link above does not work for us anymore (see what we get below). That’s very good: it is always nice to start a new publishing project with a little controversy. 🙂 We will have to use the good old paper print edition. We recommend you buy one too, by the way. 🙂 I think they are just a bit over US$100 now. Well worth it!
To set the historical record straight, the reader should note we started this blog before Mr. Gottlieb brought Feynman’s Lectures online. We actually wonder why he would be bothered by us referring to it. That’s what classical textbooks are for, aren’t they? They create common references to agree or disagree with, and why put a book online if you apparently don’t want it to be read or discussed? Noise like this probably means I am doing something right here. 🙂
Post scriptum 2: Done ! Or, at least, the first chapter is done ! Have a look: here is the link on ResearchGate and this is the link on Phil Gibbs’ site. Please do let me know what you think of it—whether you like it or not or, more importantly, what logic makes sense and what doesn’t. 🙂
I talked about the Solvay Conferences in my previous post(s). The Solvay Conference proceedings are a real treasure trove. Not only are they very pleasant to read, but they also debunk more than one myth or mystery in quantum physics!
It is part of scientific lore, for example, that the 1927 Solvay Conference was a sort of battlefield over the new physics between Heisenberg and Einstein. Surprisingly, the papers and the write-up of the discussions reveal that Einstein hardly intervened. They also reveal that ‘battlefield stories’ such as Heisenberg telling Einstein to “stop telling God what to do” or – vice versa – Einstein declaring “God doesn’t play dice” are what they are: plain gossip or popular hearsay. Neither Heisenberg nor Einstein ever said that—or not at the occasion of the 1927 Solvay Conference, at least! Instead, we see very nuanced and very deep philosophical statements—on both sides of the so-called ‘divide’ or ‘schism’.
Of all the interventions, that of the Dutch scientist Hendrik Antoon Lorentz stands out. I know (most of) my readers don’t read French, so I have rendered it in English below.
It is all very weird, emotional and historical. H.A. Lorentz – clearly the driving force behind those pre-WW II Solvay Conferences – died a few months after the 1927 Conference. In fact, the 1927 conference proceedings contain both the sad announcement of his demise and his interventions—such was the practice of actually physically printing stuff at the time.
Here it is (my translation from the French):
GENERAL DISCUSSION OF THE NEW IDEAS PUT FORWARD.
Causality, Determinism, Probability.
Intervention by Mr. Lorentz:
“I would like to draw attention to the difficulties one encounters in the old theories. We want to form a representation of the phenomena, to form an image of them in our mind. Until now, we have always wanted to form these images by means of the ordinary notions of time and space. These notions may be innate; in any case, they have developed through our personal experience, through our daily observations. For me, these notions are clear and, I confess, I cannot form any idea of physics without them. The image I want to form of the phenomena must be absolutely sharp and definite, and it seems to me that we can only form such an image within this system of space and time.
For me, an electron is a corpuscle which, at a given instant, is at a determinate point in space, and if I have the idea that at a following moment this corpuscle is somewhere else, I must think of its trajectory, which is a line in space. And if this electron meets an atom and penetrates it, and if, after several adventures, it leaves that atom, I construct a theory in which this electron preserves its individuality; that is to say, I imagine a line along which this electron passes through that atom. It may well be, of course, that this theory would be very difficult to develop, but a priori it does not seem impossible to me.
I imagine that, in the new theory, we still have these electrons. It is possible, of course, that in the new theory, once well developed, it will be necessary to suppose that these electrons undergo transformations. I am quite willing to admit that the electron dissolves into a cloud. But then I would want to find out on what occasion this transformation occurs. If one wanted to forbid me such an inquiry by invoking a principle, that would trouble me greatly. It seems to me that one may always hope to do later what we cannot yet do at the moment. Even if one abandons the old ideas, one can still keep the old terminology. I would like to keep that ideal of earlier days, of describing everything that happens in the world by means of sharp images. I am ready to accept other theories, on condition that they can be translated into clear and sharp images.
For my part, although I am not yet familiar with the new ideas that I now hear expressed, I could picture these ideas as follows. Take the case of an electron that meets an atom; suppose that this electron leaves the atom and that, at the same time, a quantum of light is emitted. One must consider, first of all, the wave systems that correspond to the electron and to the atom before the collision. After the collision, we shall have new wave systems. These wave systems can be described by a function ψ, defined in a space of a large number of dimensions, which satisfies a differential equation. The new wave mechanics will operate with this equation and will determine the function ψ before and after the collision.
Now, there are phenomena which teach us that there is something more than these waves, namely corpuscles; one can, for example, do an experiment with a Faraday cylinder; one must therefore take into account the individuality of the electrons, and of the photons as well. I think I would find that, to explain the phenomena, it suffices to admit that the expression ψψ* gives the probability that these electrons and photons exist in a determinate volume; that would be enough for me to explain the experiments.
But the examples given by Mr. Heisenberg teach me that I would thereby have reached everything that experiment allows me to reach. Now, I think that this notion of probability should come at the end, and as a conclusion, of the theoretical considerations, and not as an a priori axiom, though I am quite willing to admit that this indeterminacy corresponds to the experimental possibilities. I could always keep my deterministic faith for the fundamental phenomena, of which I have not spoken. Could a more profound mind not be aware of the motions of these electrons? Could one not keep determinism by making it the object of a belief? Must one necessarily elevate indeterminism to a principle?”
Note the last three questions especially. A freer rendering of the final one:
Why should we elevate determinism or – as Born and Heisenberg do – its opposite (indeterminism) to a philosophical principle?
What a beautiful statement ! Lorentz died of a very trivial cause: erysipelas, commonly known as St Anthony’s fire.
As mentioned in my previous post, Oliver Consa traces all of the nonsense in modern physics back to the Shelter Island (1947), Pocono (1948) and Oldstone (1949) Conferences. However, the first Solvay Conference that was organized after WW II was quite significant too. Niels Bohr and Robert Oppenheimer pretty much dominated it: Bohr by providing the introductory lecture ‘On the Notions of Causality and Complementarity’, Oppenheimer with his ‘Electron Theory’ paper, which set the tone for subsequent Solvay Conferences—most notably the one that would consecrate quantum field theory (QFT), which was held 13 years later (1961).
Indeed, the discussion between Oppenheimer and Dirac on the ‘Electron Theory’ paper in 1948 seems to be where things might have gone wrong—in terms of the ‘genealogy’ or ‘archaeology’ of modern ideas, so to speak. In fact, both Oppenheimer and Dirac make rather historic blunders there:
Oppenheimer uses perturbation theory to arrive at some kind of ‘new’ model of an electron, based on Schwinger’s new QFT models—which, as we now know, do not really lead anywhere.
Dirac, however, is just too stubborn: he simply keeps defending his indefensible electron equation—which, of course, also doesn’t lead anywhere. [It is rather significant that he was no longer invited to the next Solvay Conference.]
It is, indeed, very weird that Dirac does not follow through on his own conclusion: “Only a small part of the wave function has a physical meaning. We now have the problem of picking out that very small physical part of the exact solution of the wave equation.”
It’s the ring current or Zitterbewegung electron, of course: the one trivial solution he thought was so significant in his 1933 Nobel Prize lecture… The other parts of the solution are, effectively, bizarre oscillations, which he refers to as ‘run-away electrons’.
Corona-virus is bad, but it does have one advantage: more time to work on my hobby ! I finally managed to have a look at what the (in)famous Lamb shift may or may not be. Here is the link to the paper.
I think it’s good. Why? Well… It’s that other so-called ‘high-precision test’ of mainstream quantum mechanics (read: quantum field theory), but I found it’s just like the rest: ‘Cargo Cult Science.’ [I must acknowledge a fellow amateur physicist and blogger for that reference: it is, apparently, a term coined by Richard Feynman!]
To All: Enjoy and please keep up the good work in these very challenging times !
Yesterday night, I got this email from a very bright young physicist: Dr. Oliver Consa. He is someone who – unlike me – does have the required Dr. and PhD credentials in physics (I have a drs. title in economics) – and the patience that goes with them – to make more authoritative statements about the weird world of quantum mechanics. I recommend you click the link in the email (copied below) and read the paper. Please do it!
It is just 12 pages, and it is all extremely revealing. Very discomforting, actually, in light of all the other revelations on fake news in other spheres of life.
Many of us – and, here, I just refer to those who are reading my post – sort of suspected that some ‘inner circle’ in the academic circuit had cooked things up: the Mystery Wallahs, as I refer to them now. Dr. Consa’s paper shows our suspicion is well-founded.
QUOTE
Dear fellow scientist,
I send you this mail because you have been skeptical about Foundations of Physics. I think that this new paper will be of your interest. Feel free to share it with your colleagues or publish it on the web. I consider it important that this paper serves to open a public debate on this subject.
Abstract “Quantum electrodynamics (QED) is considered the most accurate theory in the history of science. However, this precision is based on a single experimental value: the anomalous magnetic moment of the electron (g-factor). An examination of QED history reveals that this value was obtained using illegitimate mathematical traps, manipulations and tricks. These traps included the fraud of Kroll & Karplus, who acknowledged that they lied in their presentation of the most relevant calculation in QED history. As we will demonstrate in this paper, the Kroll & Karplus scandal was not a unique event. Instead, the scandal represented the fraudulent manner in which physics has been conducted from the creation of QED through today.” (12 pag.)