Re-reading What We Already Know

On PDG data, big science, and why simplicity still matters

For reasons I still find slightly amusing (it is better to be amused than annoyed, isn’t it?), old blog posts here (readingfeynman.org) or early papers on platforms such as vixra.org and academia.edu periodically resurface in “top reads” lists — sometimes many years after publication.

I would now describe several of those texts as typical “angry young man” papers. However, I still consider most of their core claims to be true. And, as mentioned above, the papers still resonate with readers, even if I now distance myself from how they were written and framed.

That tells me two things. First, there is still genuine interest in careful, foundational thinking about physics. Second, the web (and increasingly AI agents crawling it) has a habit of flattening intellectual trajectories into caricatures: mainstream or outsider, orthodox or heretic.

I have looked at those caricatures of me, and I want to be very clear about where I stand.

1. I am deeply mainstream in one crucial sense: I trust measurements. I trust large-scale experimental infrastructure. I trust the Particle Data Group (PDG), CERN, and the decades of work that went into producing the numbers we now take for granted. I am not hostile to “big science” — on the contrary, I consider projects like CERN or ITER to be among the most impressive collective achievements of modern civilization. If society is going to spend large sums of money on something, I much prefer it to be on instruments that extend human knowledge rather than on instruments designed to destroy.

2. At the same time, I am comfortable being an outsider: I do not believe that theoretical sophistication excuses us from repeatedly asking what is actually grounded in experiment, and what is added later as interpretive scaffolding.

These two positions are not contradictory. Historically, they have gone together.

Think of Maxwell, who unified electric and magnetic phenomena not by adding complexity, but by simplifying and re-ordering – using mathematical advances – what was already known. Think of Lorentz and Einstein, whose work on relativity led to the insight that gravitation need not be treated as a force at all. Think of Schrödinger and Dirac, who demonstrated that the same wave equations could describe light-like as well as matter-like phenomena without reifying every mathematical symbol into a physical object.

Progress, more often than not, comes from simplifying, not from proliferating entities.


A Minimal Experimental Core

That is the spirit in which I recently published a new working paper on ResearchGate:
Re-reading PDG particle listings through a Minimal Experimental Core (MEC).

The idea is almost embarrassingly simple. Take PDG particle listings — the most mainstream source imaginable — and re-present them using only quantities that are directly observable:

  • rest energy,
  • lifetime,
  • electric charge,
  • magnetic moment where available,
  • branching ratios understood as empirical event frequencies.

What I deliberately leave out at the primary level are non-observable quantum numbers and symmetry labels that require additional theoretical assumptions to interpret. Not because they are “wrong”, but because they are interpretive rather than measured.
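
To make that ordering concrete, here is a minimal sketch of what such a record might look like in code. It is my own illustration, not the data format used in the paper: the class name, the field names and the rounded numbers are assumptions of mine, and anyone who wants actual figures should go to the PDG listings themselves.

  from dataclasses import dataclass, field
  from typing import Dict, Optional

  @dataclass
  class MECRecord:
      """One particle, described only by directly observable quantities.

      Quantum numbers and symmetry labels (isospin, parity, strangeness, ...)
      are deliberately left out at this primary level: they are interpretive
      rather than measured.
      """
      name: str
      rest_energy_mev: float                   # rest energy, in MeV
      lifetime_s: Optional[float]              # mean lifetime, in seconds (None if stable)
      charge_e: float                          # electric charge, in units of e
      magnetic_moment: Optional[float] = None  # where available
      branching_fractions: Dict[str, float] = field(default_factory=dict)
      # branching ratios read as empirical event frequencies

  # Illustrative entry with rounded values (check the PDG listings for current figures).
  muon = MECRecord(
      name="mu-",
      rest_energy_mev=105.658,
      lifetime_s=2.197e-6,
      charge_e=-1.0,
      magnetic_moment=1.001166,  # g/2 for the muon, rounded
      branching_fractions={"e- nu_mu anti-nu_e": 1.0},  # essentially 100%
  )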

The result is not an alternative theory. It is a different ordering of the same facts. And that re-ordering is surprisingly instructive.

When one looks at leptons, pions, and kaons in this way, certain patterns become obvious long before any model is invoked: differences in stability, sharp asymmetries in branching ratios, and cases where phase space alone clearly does not determine outcomes. None of this is new — but seeing it without the usual conceptual overlays changes how one thinks about explanation.
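
One small, self-contained illustration of what I mean, again with rounded values and naming conventions of my own choosing rather than anything taken from the paper: the charged pion has more kinematic room to decay to an electron than to a muon, and yet the electron mode shows up in only about one event in ten thousand.

  # Rounded, illustrative values in the spirit of the PDG listings
  # (check the current listings before relying on these numbers).
  lifetimes_s = {
      "mu": 2.197e-6,   # muon
      "pi": 2.603e-8,   # charged pion
      "K":  1.238e-8,   # charged kaon
  }

  pi_branching = {
      "mu nu": 0.9999,  # dominant mode
      "e nu":  1.2e-4,  # strongly suppressed despite the larger phase space
  }

  # Stability differences jump out as plain lifetime ratios ...
  print("lifetime(mu) / lifetime(pi) ~", round(lifetimes_s["mu"] / lifetimes_s["pi"]))

  # ... and the pi -> e nu versus pi -> mu nu asymmetry shows that phase
  # space alone does not determine outcomes.
  print("BR(pi -> e nu) / BR(pi -> mu nu) ~", pi_branching["e nu"] / pi_branching["mu nu"])

There is no new physics in that little script; it simply lines up the same facts so that the asymmetry is impossible to miss.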


On big machines and global context

There is another reason I care about this kind of work.

We are entering a period in which fewer and fewer actors can afford to build the next generation of large experimental facilities. Europe (through CERN) and the United States remain central producers of high-quality collider and detector data. China, for geopolitical and economic reasons, may or may not build its own next “big thing” — and if it doesn’t, it will have to be content, like the rest of the world, with the data already produced.

That reality makes something very clear: we will spend the coming decades re-reading existing data. Carefully. Repeatedly. From new angles.

In that context, methodological clarity is not a luxury. It is a necessity.


AI, co-thinking, and intellectual hygiene

This brings me to one last point.

The paper I mentioned was written in close AI–HI (human intelligence) co-thinking. I am not shy about that. Used properly, AI is not a generator of answers but a powerful tool for enforcing intellectual hygiene: forcing one to clarify terms, separate observation from explanation, and resist the temptation to smuggle assumptions into language.

If some AI systems currently reduce my online presence to that of a “lonely outlier”, then the best response is not complaint, but better signal: careful writing, explicit methodology, and visible alignment with the experimental foundations of physics.

That is what this work is meant to be.

Not a provocation.
Not a manifesto.
Just a careful re-reading of what we already know — and an invitation to do so again, together.
