# Quantum math: the rules – all of them! :-)

Pre-script (dated 26 June 2020): Our ideas have evolved into a full-blown realistic (or classical) interpretation of all things quantum-mechanical. In addition, I note the dark force has amused himself by removing some material. So there is no use in reading this. Read my recent papers instead. 🙂

Original post:

In my previous post, I made no compromise, and used all of the rules one needs to calculate quantum-mechanical stuff:

However, I didn’t explain them. These rules look simple enough, but let’s analyze them now. They’re simple and not so simple at the same time, indeed.

[I] The first equation uses the Kronecker delta, which sounds fancy but it’s just a simple shorthand: δij = δji is equal to 1 if i = j, and zero if i ≠ j, with i and j representing base states. Equation (I) basically says that base states are all different. For example, the angular momentum in the x-direction of a spin-1/2 particle – think of an electron or a proton – is either +ħ/2 or −ħ/2, not something in-between, or some mixture. So ⟨ +x | +x ⟩ = ⟨ −x | −x ⟩ = 1 and ⟨ +x | −x ⟩ = ⟨ −x | +x ⟩ = 0.
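The orthonormality rule is easy to play with numerically. Here is a minimal sketch for a spin-1/2 particle, assuming the standard convention of writing the x-direction states as vectors in the z-basis (the specific vectors below are that convention, not something stated above):

```python
import numpy as np

# Base states are orthonormal: <i|j> = delta_ij.
# Standard convention: |+x> and |-x> written out in the z-basis.
plus_x = np.array([1, 1]) / np.sqrt(2)    # |+x>
minus_x = np.array([1, -1]) / np.sqrt(2)  # |-x>

def braket(chi, phi):
    """<chi|phi>: conjugate the bra, then take the dot product."""
    return np.vdot(chi, phi)

print(np.isclose(braket(plus_x, plus_x), 1))    # True: <+x|+x> = 1
print(np.isclose(braket(minus_x, minus_x), 1))  # True: <-x|-x> = 1
print(np.isclose(braket(plus_x, minus_x), 0))   # True: base states don't overlap
```

The same two lines of arithmetic work for any orthonormal basis, which is the whole point of equation (I).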

We’re talking base states here, of course. Base states are like a coordinate system: we settle on an x-, y- and z-axis, and a unit, and any point is defined in terms of an x-, y- and z-number. It’s the same here, except we’re talking ‘points’ in four-dimensional spacetime. To be precise, we’re talking constructs evolving in spacetime. To be even more precise, we’re talking amplitudes with a temporal as well as a spatial frequency, which we’ll often represent as:

a·e^(−i·θ) = a·e^(−i·(ω·t − k∙x)) = a·e^(−(i/ħ)·(E·t − p∙x))

The coefficient in front (a) is just a normalization constant, ensuring all probabilities add up to one. It may not be a constant, actually: perhaps it just ensures our amplitude stays within some kind of envelope, as illustrated below.

As for the ω = E/ħ and k = p/ħ identities, these are the de Broglie equations for a matter-wave, which the young Louis de Broglie jotted down as part of his 1924 PhD thesis. He was inspired by the fact that the E·t − p∙x factor is an invariant four-vector product (E·t − p∙x = pμxμ) in relativity theory, and noted the striking similarity with the argument of any wave function in space and time (ω·t − k∙x) and, hence, couldn’t resist equating both. De Broglie was inspired, of course, by the solution to the blackbody radiation problem, which Max Planck and Einstein had convincingly solved by accepting that the ω = E/ħ equation holds for photons. As he wrote it:
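As a quick numerical illustration of the de Broglie relations, consider a slow electron. The speed below is just an assumed, illustrative value; the point is that the resulting wavelength comes out at the atomic scale:

```python
import math

# de Broglie relations: omega = E/hbar and k = p/hbar.
# Illustrative, non-relativistic electron; the speed is an assumption.
hbar = 1.054571817e-34   # J*s
m_e = 9.1093837015e-31   # electron mass, kg

v = 1.0e6                # m/s (about 0.3% of c, so non-relativistic is fine)
p = m_e * v              # momentum
E = p**2 / (2 * m_e)     # kinetic energy

k = p / hbar             # spatial frequency (rad/m)
omega = E / hbar         # temporal frequency (rad/s)
wavelength = 2 * math.pi / k

print(f"lambda = {wavelength:.2e} m")   # roughly 7e-10 m: atomic scale
```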

“When I conceived the first basic ideas of wave mechanics in 1923–24, I was guided by the aim to perform a real physical synthesis, valid for all particles, of the coexistence of the wave and of the corpuscular aspects that Einstein had introduced for photons in his theory of light quanta in 1905.” (Louis de Broglie, quoted in Wikipedia)

Looking back, you’d of course want the phase of a wavefunction to be some invariant quantity, and the examples we gave in our previous post illustrate how one would expect energy and momentum to impact its temporal and spatial frequency. But I am digressing. Let’s look at the second equation. However, before we move on, note that minus sign in the exponent of our wavefunction: a·e^(−i·θ). The phase turns clockwise. That’s just the way it is. I’ll come back to this.
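The sense of rotation can be checked in two lines: for a small positive phase θ, e^(−i·θ) = cos(θ) − i·sin(θ) sits just below the real axis, so the phase has turned clockwise as θ grows.

```python
import cmath

# For theta just above zero, e^{-i*theta} has a negative imaginary part:
# the point has moved below the real axis, i.e. clockwise.
theta = 0.1
z = cmath.exp(-1j * theta)
print(z.imag < 0)   # True: just below the real axis
print(z.real > 0)   # True: still close to 1
```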

[II] The φ and χ symbols do not necessarily represent base states. In fact, Feynman illustrates this law using a variety of examples, including both polarized as well as unpolarized beams, or ‘filtered’ as well as ‘unfiltered’ states, as he calls it in the context of the Stern-Gerlach apparatuses he uses to explain what’s going on. Let me summarize his argument here.

I discussed the Stern-Gerlach experiment in my post on spin and angular momentum, but the Wikipedia article on it is very good too. The principle is illustrated below: an inhomogeneous magnetic field – note the direction of the gradient ∇B = (∂B/∂x, ∂B/∂y, ∂B/∂z) – will split a beam of spin-one particles into three beams. [Matter-particles with spin one are rather rare (Lithium-6 is an example), but three states (rather than two only, as we’d have when analyzing spin-1/2 particles, such as electrons or protons) allow for more play in the analysis. 🙂 In any case, the analysis is easily generalized.]

The splitting of the beam is based, of course, on the quantized angular momentum in the z-direction (i.e. the direction of the gradient): its value is either ħ, 0, or −ħ. We’ll denote these base states as +, 0 or −, and we should note they are defined in regard to an apparatus with a specific orientation. If we call this apparatus S, then we can denote these base states as +S, 0S and −S respectively.

The interesting thing in Feynman’s analysis is the imagined modified Stern-Gerlach apparatus, which – I am using Feynman’s words here 🙂 – “puts Humpty Dumpty back together.” It looks a bit monstrous, but it’s easy enough to understand. Quoting Feynman once more: “It consists of a sequence of three high-gradient magnets. The first one (on the left) is just the usual Stern-Gerlach magnet and splits the incoming beam of spin-one particles into three separate beams. The second magnet has the same cross section as the first, but is twice as long and the polarity of its magnetic field is opposite the field in magnet 1. The second magnet pushes in the opposite direction on the atomic magnets and bends their paths back toward the axis, as shown in the trajectories drawn in the lower part of the figure. The third magnet is just like the first, and brings the three beams back together again, so that they leave the exit hole along the axis.”

Now, we can use this apparatus as aÂ filterÂ by inserting blocking masks, as illustrated below.

But let’s get back to the lesson. What about the second ‘Law’ of quantum math? Well… You need to be able to imagine all kinds of situations now. The rather simple set-up below is one of them: we’ve got two of these apparatuses in series now, S and T, with T tilted at the angle α with respect to the first.
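For spin one, the amplitudes ⟨jT|iS⟩ between the S-states and the T-states are fixed by the tilt angle. Here is a sketch using the standard spin-one rotation amplitudes (Feynman tabulates these later in the same chapter); the specific matrix below assumes a rotation axis for which the amplitudes come out real, which is an assumption about the geometry:

```python
import numpy as np

# Amplitudes <jT|iS> for a spin-one particle, with T tilted by `alpha`
# relative to S. Standard spin-one rotation amplitudes; real and
# orthogonal for this (assumed) choice of rotation axis.
def amplitudes(alpha):
    c, s, r2 = np.cos(alpha), np.sin(alpha), np.sqrt(2)
    # rows: +T, 0T, -T ; columns: +S, 0S, -S
    return np.array([
        [(1 + c) / 2, -s / r2, (1 - c) / 2],
        [s / r2,       c,      -s / r2],
        [(1 - c) / 2,  s / r2, (1 + c) / 2],
    ])

U = amplitudes(0.3)
# Each column gives three probabilities that add up to 1: a particle in a
# definite S-state must come out of T in one of the three T-states.
print(np.allclose((U**2).sum(axis=0), 1))  # True
print(np.allclose(U @ U.T, np.eye(3)))     # True: the matrix is unitary
```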

I know: you’re getting impatient. What about it? Well… We’re finally ready now. Let’s suppose we’ve got three apparatuses in series, with the first and the last one having the very same orientation, and the one in the middle being tilted. We’ll denote them by S, T and S’ respectively. We’ll also use masks: we’ll block the 0 and − states in the S-filter, like in that illustration above. In addition, we’ll block the + and − states in the T apparatus and, finally, the 0 and − states in the S’ apparatus. Now try to imagine what happens: how many particles will get through?

[…]

Just try to think about it. Make a drawing or something. Please!

[…]

OK… The answer is shown below. Despite the filtering in S, the +S particles that come out do have an amplitude to go through the 0T-filter, and so the number of atoms that come out will be some fraction (α) of the number of atoms (N) that came out of the +S-filter. Likewise, some other fraction (β) will make it through the +S’-filter, so we end up with βαN particles.
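In numbers, and using the standard spin-one rotation amplitudes as an assumed geometry for the tilt, the two fractions are just the absolute squares of the two amplitudes involved:

```python
import numpy as np

# Sequential masked filters, treated as independent probabilistic events.
# From the standard spin-one rotation amplitudes (assumed geometry):
# <0T|+S> = sin(a)/sqrt(2), and the amplitude from 0T into +S' (with S'
# parallel to S) has the same magnitude.
a = 0.3                                  # tilt angle of T (illustrative)
N = 1_000_000                            # particles out of the +S filter

alpha = abs(np.sin(a) / np.sqrt(2))**2   # P(+S -> 0T)
beta = abs(np.sin(a) / np.sqrt(2))**2    # P(0T -> +S')

print(round(beta * alpha * N))           # the expected beta*alpha*N count
```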

Now, I am sure that, if you’d tried to guess the answer yourself, you’d have said zero rather than βαN but, thinking about it, it makes sense: it’s not because we’ve got some angular momentum in one direction that we have none in the other. When everything is said and done, we’re talking components of the total angular momentum here, aren’t we? Well… Yes and no. Let’s remove the masks from T. What do we get?

[…]

Come on: what’s your guess? N?

[…] You’re right. It’s N. Perfect. It’s what’s shown below.
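Here is the arithmetic behind that N, again under the same assumed spin-one amplitudes: with T wide open you sum the three amplitude products, not the three probabilities, and the sum collapses to ⟨+S|+S⟩ = 1.

```python
import numpy as np

# Wide-open T: sum amplitudes over all three T-channels.
a = 0.3                                 # tilt angle of T (illustrative)
c, s, r2 = np.cos(a), np.sin(a), np.sqrt(2)

to_T = np.array([(1 + c) / 2, s / r2, (1 - c) / 2])  # <iT|+S>, i = +, 0, -
back = to_T.conj()                                    # <+S|iT> = <iT|+S>*

total = np.sum(back * to_T)     # sum over i of <+S|iT><iT|+S>
print(np.isclose(total, 1))     # True: everything gets through
```

The cos² and sin² terms cancel exactly, whatever the tilt angle, which is why the wide-open filter changes nothing.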

Now, that should boost your confidence. Let’s try the next scenario. We block the 0 and − states in the S-filter once again, and the + and − states in the T apparatus, so the first two apparatuses are the same as in our first example. But let’s change the S’ apparatus: let’s close the + and − states there now. Now try to imagine what happens: how many particles will get through?

[…]

Come on! You think it’s a trap, don’t you? It’s not. It’s perfectly similar: we’ve got some other fraction here, which we’ll write as γαN, as shown below.

Next scenario: S has the 0 and − gates closed once more, and T is fully open, so it has no masks. But, this time, we set S’ so it selects the 0-state. What do we get? Come on! Think! Please!

[…]

The answer is zero, as shown below.

Does that make sense to you? Yes? Great! Because many think it’s weird: they think the T apparatus must ‘re-orient’ the angular momentum of the particles. It doesn’t: if the filter is wide open, then “no information is lost”, as Feynman puts it. Still… Have a look at it. It looks like we’re opening ‘more channels’ in the last example: the S and S’ filters are the same, indeed, and T is fully open, while it selected for 0-state particles before. But no particles come through now, while with the 0-channel only, we had γαN.
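You can watch the cancellation happen in the numbers (same assumed spin-one amplitudes as before): the three products ⟨0S|iT⟩⟨iT|+S⟩ are individually non-zero, but they interfere to exactly zero.

```python
import numpy as np

a = 0.3                                 # tilt angle of T (illustrative)
c, s, r2 = np.cos(a), np.sin(a), np.sqrt(2)

to_T = np.array([(1 + c) / 2, s / r2, (1 - c) / 2])  # <iT|+S>
from_T = np.array([-s / r2, c, s / r2])               # <0S|iT>

terms = from_T * to_T           # the three amplitude products
print(terms)                    # three non-zero numbers...
print(np.isclose(terms.sum(), 0))  # ...that add up to exactly zero
```

Blocking a T-channel removes one of the three terms, and the cancellation is gone: that is why masking T can let particles through where the wide-open T lets none.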

Hmm… It actually is kinda weird, wouldn’t you agree? Sorry I had to talk about this, but it will make you appreciate that second ‘Law’ now: we can always insert a ‘wide-open’ filter and, hence, split the beams into a complete set of base states – with respect to the filter, that is – and bring them back together, provided our filter does not produce any unequal disturbances on the three beams. In short, the passage through the wide-open filter should not result in a change of the amplitudes. Again, as Feynman puts it: the wide-open filter should really put Humpty Dumpty back together again. If it does, we can effectively apply our ‘Law’:

For an example, I’ll refer you to my previous post. This brings me to the third and final ‘Law’.

[III] The amplitude to go from state φ to state χ is the complex conjugate of the amplitude to go from state χ to state φ:

⟨ χ | φ ⟩ = ⟨ φ | χ ⟩*

This is probably the weirdest ‘Law’ of all, even if I should say, straight from the start, we can actually derive it from the second ‘Law’, and the fact that all probabilities have to add up to one. Indeed, a probability is the absolute square of an amplitude and, as we know, the absolute square of a complex number is also equal to the product of itself and its complex conjugate:

|z|² = |z|·|z| = z·z*

[You should go through the trouble of reviewing the difference between the square and the absolute square of a complex number. Just write z as a + ib and calculate (a + ib)² = a² − b² + 2abi, as opposed to |z|² = a² + b². Also check what it means when writing z as r·e^(iθ) = r·(cosθ + i·sinθ).]
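Python’s complex type makes the difference tangible:

```python
# Square vs. absolute square of a complex number z = a + ib:
z = 3 + 4j
print(z**2)               # (-7+24j): a^2 - b^2 + 2abi, still complex
print(abs(z)**2)          # 25.0: a^2 + b^2, always real and non-negative
print(z * z.conjugate())  # (25+0j): z times its conjugate z* gives |z|^2
```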

Let’s apply the probability rule to a two-filter set-up, i.e. the situation with the S and the tilted T filter which we described above, and let’s assume we’ve got a pure beam of +S particles entering the wide-open T filter, so our particles can come out in either of the three base states with respect to T. We can then write:

|⟨ +T | +S ⟩|² + |⟨ 0T | +S ⟩|² + |⟨ −T | +S ⟩|² = 1

⇔ ⟨ +T | +S ⟩⟨ +T | +S ⟩* + ⟨ 0T | +S ⟩⟨ 0T | +S ⟩* + ⟨ −T | +S ⟩⟨ −T | +S ⟩* = 1

Of course, we’ve got two other such equations if we start with a 0S or a −S state. Now, we take the ⟨ χ | φ ⟩ = ∑ ⟨ χ | i ⟩⟨ i | φ ⟩ ‘Law’, substitute +S for both χ and φ, and sum over all base states i with regard to T. We get:

⟨ +S | +S ⟩ = 1 = ⟨ +S | +T ⟩⟨ +T | +S ⟩ + ⟨ +S | 0T ⟩⟨ 0T | +S ⟩ + ⟨ +S | −T ⟩⟨ −T | +S ⟩

These equations are consistent only if:

⟨ +S | +T ⟩ = ⟨ +T | +S ⟩*,

⟨ +S | 0T ⟩ = ⟨ 0T | +S ⟩*,

⟨ +S | −T ⟩ = ⟨ −T | +S ⟩*,

which is what we wanted to prove. One can then generalize to any states φ and χ. However, proving the result is one thing. Understanding it is something else. One can write down a number of strange consequences, which all point to Feynman’s rather enigmatic comment on this ‘Law’: “If this Law were not true, probability would not be ‘conserved’, and particles would get ‘lost’.” So what does that mean? Well… You may want to think about the following, perhaps. It’s obvious that we can write:

|⟨ φ | χ ⟩|² = ⟨ φ | χ ⟩⟨ φ | χ ⟩* = ⟨ χ | φ ⟩*⟨ χ | φ ⟩ = |⟨ χ | φ ⟩|²

This says that the probability to go from the φ-state to the χ-state is the same as the probability to go from the χ-state to the φ-state.
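Both statements (the conjugate symmetry and the equal probabilities) are easy to check numerically for two arbitrary complex states:

```python
import numpy as np

rng = np.random.default_rng(1)

def random_state(n=3):
    """An arbitrary normalized complex state vector."""
    v = rng.standard_normal(n) + 1j * rng.standard_normal(n)
    return v / np.linalg.norm(v)

phi, chi = random_state(), random_state()

amp_cf = np.vdot(chi, phi)   # <chi|phi> (vdot conjugates its first argument)
amp_fc = np.vdot(phi, chi)   # <phi|chi>

print(np.isclose(amp_cf, np.conj(amp_fc)))         # True: Law [III]
print(np.isclose(abs(amp_cf)**2, abs(amp_fc)**2))  # True: same probability
```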

Now, when we’re talking base states, that’s rather obvious, because the probabilities involved are either 0 or 1. However, if we substitute for +S and −T, or some more complicated states, then it’s a different thing. My gut instinct tells me this third ‘Law’ – which, as mentioned, can be derived from the other ‘Laws’ – reflects the principle of reversibility in spacetime, which you may also interpret as a causality principle, in the sense that, in theory at least (i.e. not thinking about entropy and/or statistical mechanics), we can reverse what’s happening: we can go back in spacetime.

In this regard, we should also remember that the complex conjugate of a complex number in polar form, i.e. a complex number written as r·e^(iθ), is equal to r·e^(−iθ), so the argument in the exponent gets a minus sign. Think about what this means for our a·e^(−i·θ) = a·e^(−i·(ω·t − k∙x)) = a·e^(−(i/ħ)·(E·t − p∙x)) function. Taking the complex conjugate of this function amounts to reversing the direction of t and x which, once again, evokes that idea of going back in spacetime.
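That reading can be verified in one line: conjugating the wavefunction gives the same number as flipping the signs of t and x.

```python
import cmath

# conj(e^{-i(omega*t - k*x)}) equals e^{-i(omega*(-t) - k*(-x))}:
# conjugation is the same as reversing t and x. Values are illustrative.
omega, k, t, x = 2.0, 3.0, 0.7, 0.4
z = cmath.exp(-1j * (omega * t - k * x))
z_flipped = cmath.exp(-1j * (omega * (-t) - k * (-x)))
print(cmath.isclose(z.conjugate(), z_flipped))   # True
```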

I feel there’s some more fundamental principle at work here, on which I’ll try to reflect a bit more. Perhaps we can also do something with that relationship between the multiplicative inverse of a complex number and its complex conjugate, i.e. z⁻¹ = z*/|z|². I’ll check it out. As for now, however, I’ll leave you to do that, and please let me know if you’ve got any inspirational ideas on this. 🙂

So… Well… Goodbye for now. I’ll probably talk about the Hamiltonian in my next post. I think we really did a good job in laying the groundwork for the really hardcore stuff, so let’s go for that now. 🙂

Post Scriptum: On the Uncertainty Principle and other rules

After writing all of the above, I realized I should add some remarks to make this post somewhat more readable. First thing: not all of the rules are there, obviously! Most notably, I didn’t say anything about the rules for adding or multiplying amplitudes, but that’s because I wrote extensively about that already, and so I assume you’re familiar with that. [If not, see my page on the essentials.]

Second, I didn’t talk about the Uncertainty Principle. That’s because I didn’t have to. In fact, we don’t need it here. In general, all popular accounts of quantum mechanics have an excessive focus on the position and momentum of a particle, while the approach in this and my previous post is quite different. Of course, it’s Feynman’s approach to QM really. Not ‘mine’. 🙂 All of the examples and all of the theory he presents in his introductory chapters in the Third Volume of Lectures, i.e. the volume on QM, are related to things like:

• What is the amplitude for a particle to go from spin state +S to spin state −T?
• What is the amplitude for a particle to be scattered, by a crystal, or from some collision with another particle, in the θ direction?
• What is the amplitude for two identical particles to be scattered in the same direction?
• What is the amplitude for an atom to absorb or emit a photon? [See, for example, Feynman’s approach to the blackbody radiation problem.]
• What is the amplitude to go from one place to another?

In short, you read Feynman, and it’s only at the very end of his exposé that he starts talking about the things popular books start with, such as the amplitude of a particle to be at point (x, t) in spacetime, or the Schrödinger equation, which describes the orbital of an electron in an atom. That’s where the Uncertainty Principle comes in and, hence, one can really avoid it for quite a while. In fact, one should avoid it for quite a while, because it’s now become clear to me that simply presenting the Uncertainty Principle doesn’t help all that much to truly understand quantum mechanics.

Truly understanding quantum mechanics involves understanding all of these weird rules above. To some extent, that involves dissociating the idea of the wavefunction from our conventional ideas of time and position. From the questions above, it should be obvious that ‘the’ wavefunction does actually not exist: we’ve got a wavefunction for anything we can and possibly want to measure. That brings us to the question of the base states: what are they?

Feynman addresses this question in a rather verbose section of his Lectures titled: What are the base states of the world? I won’t copy it here, but I strongly recommend you have a look at it. 🙂

I’ll end here with a final equation that we’ll need frequently: the amplitude for a particle to go from one place (r1) to another (r2). It’s referred to as a propagator function, for obvious reasons (one of them being that physicists like fancy terminology!), and it looks like this:

⟨ r2 | r1 ⟩ = e^((i/ħ)·p∙r12)/r12

The shape of the e^((i/ħ)·(p∙r12)) function is now familiar to you. Note the r12 in the argument, i.e. the vector pointing from r1 to r2. The p∙r12 dot product equals |p|∙|r12|·cosθ = p∙r12·cosθ, with θ the angle between p and r12. If the angle is zero, then cosθ is equal to 1. If the angle is π/2, then it’s 0, and the function reduces to 1/r12. So the angle θ, through the cosθ factor, sort of scales the spatial frequency. Let me try to give you some idea of what this looks like by assuming the angle between p and r12 is zero, so we’re looking at the space in the direction of the momentum only and |p|∙|r12|·cosθ = p∙r12. Now, we can look at the p/ħ factor as a scaling factor, and measure the distance x in units defined by that scale, so we write: x = p∙r12/ħ. The function then reduces to (p/ħ)·e^(i∙x)/x = (p/ħ)·cos(x)/x + i·(p/ħ)·sin(x)/x, and we just need to take the absolute square of this to get the probability. All of the graphs are drawn hereunder: I’ll let you analyze them. [Note that the graphs do not include the p/ħ factor, which you may look at as yet another scaling factor.] You’ll see – I hope! – that it all makes perfect sense: the probability quickly drops off with distance, both in the positive as well as in the negative x-direction, while it goes to infinity when very near. [Note that the absolute square, using cos(x)/x and sin(x)/x, yields the same graph as squaring 1/x, obviously!]
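The key fact in the paragraph above can be reproduced in a few lines: along the direction of p, with x = p∙r12/ħ, the amplitude is proportional to e^(i∙x)/x, and its absolute square is exactly 1/x², so the oscillation drops out and only the fall-off with distance remains.

```python
import numpy as np

x = np.linspace(0.5, 10, 100)       # scaled distance, avoiding x = 0
amp = np.exp(1j * x) / x            # e^{ix}/x, the propagator sketch
prob = np.abs(amp)**2               # |e^{ix}|^2 = 1, so this is 1/x^2

print(np.allclose(prob, 1 / x**2))  # True: probability falls off as 1/x^2
```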

Some content on this page was disabled on June 16, 2020 as a result of a DMCA takedown notice from The California Institute of Technology. You can learn more about the DMCA here:
