We climbed a mountain – step by step, post by post. 🙂 We have reached the top now, and the view is gorgeous. We understand Schrödinger's equation, which describes how amplitudes propagate through space-time. It's the quintessential quantum-mechanical expression. Let's enjoy the view now, and deepen our understanding by introducing the concept of (quantum-mechanical) *operators*.

**The operator concept**

We'll introduce the operator concept using Schrödinger's equation itself and, in the process, deepen our understanding of it a bit. You'll remember we wrote it as:

However, you’ve probably seen it like it’s written on his bust, or on his grave, or wherever, which is as follows:

It's the same thing, of course. The 'over-dot' is Newton's notation for the time derivative. In fact, if you click on the picture above (and zoom in a bit), you'll see that the craftsman who made the stone grave marker mistakenly also carved a dot above the *psi* (ψ) on the right-hand side of the equation – but then someone pointed out his mistake, and so the dot on the right-hand side isn't painted. 🙂 The thing I want to talk about here, however, is the H in that expression above, which is, obviously, the following **operator**:

That's a pretty monstrous operator, isn't it? It is what it is, however: an *algebraic* operator (it operates on a *number* – albeit a *complex* number – unlike a *matrix* operator, which operates on a *vector* or another matrix). As you can see, it actually consists of *two* other (algebraic) operators:

- The ∇² operator, which you know: it's a *differential* operator. To be specific, it's the *Laplace* operator, which is the divergence (**∇**·) of the gradient (**∇**) of a function: ∇² = **∇**·**∇** = (∂/∂x, ∂/∂y, ∂/∂z)·(∂/∂x, ∂/∂y, ∂/∂z) = ∂²/∂x² + ∂²/∂y² + ∂²/∂z². This too operates on our complex-valued wavefunction ψ, and yields some other complex-valued function, which we then multiply by −ħ²/2m to get the first term.
- The V(x, y, z) 'operator', which – in this particular context – just means: "multiply by V". Needless to say, V is the *potential* here, and so it captures the presence of external force fields. Also note that V is a real number, just like −ħ²/2m.
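To make the Laplacian concrete, here's a minimal numerical sketch (the one-dimensional grid and the Gaussian test function are my own choices, not from the text): we approximate the second derivative with the standard three-point finite-difference stencil and check it against the analytic result.

```python
import numpy as np

# 1D grid (the 3D Laplacian just adds the same stencil along y and z)
x = np.linspace(-5, 5, 1001)
dx = x[1] - x[0]

psi = np.exp(-x**2)                      # Gaussian test function
# Three-point stencil: (psi[i-1] - 2*psi[i] + psi[i+1]) / dx^2
lap = (np.roll(psi, 1) - 2*psi + np.roll(psi, -1)) / dx**2

analytic = (4*x**2 - 2) * np.exp(-x**2)  # d2/dx2 of exp(-x^2), by hand

# Maximum deviation on the interior points is small (the stencil's
# truncation error); the two endpoints wrap around and are excluded.
print(np.max(np.abs(lap[1:-1] - analytic[1:-1])))
```

Multiplying such a result by −ħ²/2m is then exactly the first term of the operator above.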

Let me say something about the dimensions here. On the left-hand side of Schrödinger's equation, we have the product of ħ and a time derivative (*i* is just the imaginary unit, so that's just a (complex) number). Hence, the dimension there is [J·s]/[s] (the dimension of a time derivative is something expressed *per second*). So the dimension of the left-hand side is *joule*. On the right-hand side, we've got two terms. The dimension of that second-order derivative (∇²ψ) is something expressed per *square meter*, but then we multiply it by −ħ²/2m, whose dimension is [J²·s²]/[J/(m²/s²)]. [Remember: m = E/c².] So that reduces to [J·m²]. Hence, the dimension of (−ħ²/2m)∇²ψ is *joule*. And the dimension of V is *joule* too, of course. So it all works out. In fact, now that we're here, it may be useful to remind you of that heat diffusion equation we discussed when introducing the basic concepts involved in vector analysis:

That equation illustrated the *physical* significance of the Laplacian. We were talking about the flow of *heat* in, say, a block of metal, as illustrated below. The q in the equation above is the *heat per unit volume*, and the **h** in the illustration below was the heat flow vector (so it's got *nothing* to do with Planck's constant), which depended on the material, and which we wrote as **h** = −κ**∇**T, with T the temperature, and κ (*kappa*) the *thermal conductivity*. In any case, the point is the following: we let the Laplacian *operate* on the temperature (i.e. a *scalar* function), and its product with some constant (just think of replacing κ by −ħ²/2m) gives us the *time derivative* of q, i.e. the heat per unit volume.

In fact, we know that q is proportional to T, so if we'd choose an appropriate temperature scale – i.e. choose the zero point such that q = k·T (your physics teacher in high school would refer to k as the (volume-)*specific heat capacity*) – then we could simply write:

∂T/∂t = (κ/k)·∇²T

From a *mathematical* point of view, that equation is just the same as ∂ψ/∂t = (i·ħ/2m)·∇²ψ, which is Schrödinger's equation for V = 0. In other words, you can – and actually *should* – also think of Schrödinger's equation as describing the *flow* of… Well… *What?*

Well… Not sure. I am tempted to think of something like a probability *density* in space, but ψ represents a (complex-valued) *amplitude*. Having said that, you get the idea – I hope! 🙂 If not, let me *paraphrase* Feynman on this:

"We can think of Schrödinger's equation as describing the diffusion of a probability amplitude from one point to another. In fact, the equation looks something like the diffusion equation we introduced when discussing heat flow, or the spreading of a gas. But there is one main difference: the imaginary coefficient in front of the time derivative makes the behavior completely different from the ordinary diffusion such as you would have for a gas spreading out. Ordinary diffusion gives rise to real exponential solutions, whereas the solutions of Schrödinger's equation are complex waves."

That says it all, right? 🙂 In fact, Schrödinger's equation – as discussed here – was actually derived when describing the motion of an electron along a line of atoms, i.e. for motion in *one direction* only, but you can visualize what it represents in three-dimensional space. The *real* exponential functions Feynman refers to are exponential *decay* functions: as the energy is spread over an ever-increasing volume, the amplitude of the wave becomes smaller and smaller. That may be the case for complex-valued exponentials as well. The *key* difference between a real- and a complex-valued exponential decay function is that a *complex* exponential is a cyclical function. Now, I quickly *googled* to see how we could visualize that, and I like the following illustration:
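Feynman's point about real versus complex exponentials can be sketched numerically (a toy comparison of my own, with the diffusion constants set to 1): a Fourier mode of the heat equation decays with a *real* exponential, while the corresponding free-particle Schrödinger mode picks up a *complex* exponential, so its magnitude never decays – it just cycles.

```python
import numpy as np

k, t = 2.0, np.linspace(0, 3, 7)   # wavenumber of the mode, sample times

# Heat equation dT/dt = lap(T): the mode exp(ikx) evolves with a REAL
# exponential exp(-k^2 t) -> diffusion damps it away
heat_mode = np.exp(-k**2 * t)

# Free Schrodinger equation (with hbar/2m set to 1): the same mode evolves
# with a COMPLEX exponential exp(-i k^2 t) -> a pure phase rotation
qm_mode = np.exp(-1j * k**2 * t)

print(heat_mode[-1])       # much smaller than 1: the mode has decayed
print(np.abs(qm_mode))     # all ones: the amplitude's magnitude is constant
```

That constant magnitude is exactly the 'complex wave' behavior in the quote above.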

The dimensional analysis of Schrödinger's equation is also quite interesting because… Well… Think of it: that heat diffusion equation incorporates the same dimensions: temperature is a measure of the average energy of the molecules. That's really something to think about. These differential equations are not only *structurally* similar but, in addition, they all seem to describe some *flow of energy*. That's pretty deep stuff: it relates amplitudes to energies, so we should think in terms of Poynting vectors and all that. But… Well… I need to move on, and so I *will* move on – so you can re-visit this later. 🙂

Now that we've introduced the *concept* of an operator, let me say something about notations, because that's quite confusing.

**Some remarks on notation**

Because it's an operator, we should actually use the *hat* symbol – in line with what we did when we were discussing *matrix* operators: we'd distinguish the matrix (e.g. A) from its use as an operator (Â). You may or may not remember we do the same in statistics: the *hat* symbol is supposed to distinguish the *estimator* (â) – i.e. some *function* we use to *estimate* a parameter (which we usually denoted by some Greek symbol, like α) – from a specific *estimate* of the parameter, i.e. the *value* (a) we get when applying â to a specific *sample* or *observation*. However, if you remember the difference, you'll also remember that *hat* symbol was quickly forgotten, because the *context* made it clear what was what, and so we'd just write a(x) instead of â(x). So… Well… I'll be sloppy as well here, if only because the *WordPress* editor only offers very few symbols with a hat! 🙂

In any case, this discussion on the use (or not) of that *hat* is irrelevant. In contrast, what *is* relevant is to realize that this *algebraic* operator H here is *very* different from that other quantum-mechanical Hamiltonian operator we discussed when dealing with a *finite* set of base states: *that* H was the Hamiltonian *matrix*, but used in an 'operation' on some state. So we have the *matrix* operator H, and the *algebraic* operator H.

Confusing?

Yes and no. First, we've got the context again, and so you always *know* whether you're looking at continuous or discrete stuff:

- If your 'space' is continuous (i.e. if states are to be defined with reference to an *infinite* set of base states), then it's the algebraic operator.
- If, on the other hand, your states are defined by some finite set of *discrete* base states, then it's the Hamiltonian matrix.

There's another, more fundamental, reason why there should be no confusion. In fact, it's the reason why physicists use the same symbol H in the first place: despite the fact that they *look* so different, these two operators (i.e. H the algebraic operator and H the matrix operator) are actually *equivalent*. **Their interpretation is similar**, as evidenced by the fact that both are referred to as the *energy operator* in quantum physics. The only difference is that one operates on a (state) *vector*, while the other operates on a continuous *function*. It's just the difference between *matrix mechanics* as opposed to *wave mechanics*, really.

But… Well… I am sure I've confused you by now – and probably very much so – and so let's start from the start. 🙂

**Matrix mechanics**

Let’s start with the easy thing indeed: matrix mechanics. The matrix-mechanical approach is summarized in that set of Hamiltonian equations which, by now, you know so well:

If we have *n* base states, then we have *n* equations like this: one for each *i* = 1, 2,… *n*. As for the introduction of the Hamiltonian, and the other subscript (*j*), just think of the description of a *state*:

So… Well… Because we had used *i* already, we had to introduce *j*. 🙂

Let's think about |ψ⟩. It is the *state* of a system, like the *ground state* of a hydrogen atom, or one of its many *excited* states. But… Well… It's a bit of a weird term, really. It all depends on what you want to measure: when we're thinking of the *ground* state, or an *excited* state, we're thinking energy. That's something else than thinking about its position in space, for example. Always remember: a state is defined by a set of *base* states, and so those base states come with a certain *perspective*: when talking *states*, we're only looking at some *aspect* of reality, really. Let's continue with our example of *energy* states, however.

You know that the *lifetime* of a system in an excited state is usually short: some *spontaneous* or *induced* emission of a quantum of energy (i.e. a *photon*) will ensure that the system quickly returns to a less excited state, or to the ground state itself. However, you shouldn't think of that here: we're looking at *stable* systems here. To be clear: we're looking at systems that have some *definite* energy – or so we think: it's just because of the quantum-mechanical uncertainty that we'll always measure some different value. Does that make sense?

If it doesn’t… Well… Stop reading, because it’s only going to get even more confusing. Not my fault, however!

*Psi*-chology

The ubiquity of that ψ symbol (i.e. the Greek letter *psi*) is really something *psi*-chological 🙂 and, hence, *very* confusing, really. In *matrix mechanics*, our ψ would just denote a *state* of a *system*, like the energy of an electron (or, when there's only one electron, our hydrogen *atom*). If it's an electron, then we'd describe it by its orbital. In this regard, I found the following illustration from Wikipedia particularly helpful: the green orbitals show *excitations* of *copper* (Cu) orbitals on a CuO₂ plane. [The two big arrows just illustrate the principle of X-ray spectroscopy, so it's an X-ray *probing* the structure of the material.]

So… Well… We'd write ψ as |ψ⟩ just to remind ourselves we're talking of some *state* of the *system* indeed. However, quantum physicists always want to confuse you, and so they will also use the *psi* symbol to denote something else: they'll use it to denote a very *particular* C_i *amplitude* (or *coefficient*) in that |ψ⟩ = ∑|i⟩·C_i formula above. To be specific, they'd replace the base states |i⟩ by the continuous position variable x, and they would write the following:

C_i = ψ(i = x) = ψ(x) = C_ψ(x) = C(x) = ⟨x|ψ⟩

In fact, that’s just like writing:

φ(p) = ⟨ mom p | ψ ⟩ = ⟨p|ψ⟩ = C_φ(p) = C(p)

What they're doing here is (1) reduce the '*system*' to a '*particle*' once more (which is OK, as long as you know what you're doing) and (2) basically state the following:

If a particle is in some state |ψ⟩, then we can associate some *wavefunction* ψ(x) or φ(p) with it, and that wavefunction will represent the *amplitude* for the system (i.e. our particle) to be at x, or to have a momentum that's equal to p.

So what's wrong with that? Well… Nothing. It's just that… Well… Why don't they use χ(x) instead of ψ(x)? That would avoid a lot of confusion, I feel: one should *not* use the same symbol (*psi*) for the |ψ⟩ **state** and the ψ(x) **wavefunction**.

*Huh?* Yes. Think about it. The point is: the *position* or the *momentum*, or even the *energy*, are **properties** of the system, so to speak, and, therefore, it's really confusing to use the same symbol *psi* (ψ) to describe (1) the *state* of the system, in general, versus (2) the *position wavefunction*, which describes… Well… Some *very particular* aspect (or 'state', if you want) of the same system (in this case: its *position*). There's no such problem with φ(p), so… Well… Why don't they use χ(x) instead of ψ(x) indeed? I have only one answer: *psi*-chology. 🙂

In any case, there's nothing we can do about it and… Well… In fact, that's what this post is about: it's about how to describe certain *properties* of the system. Of course, we're talking quantum mechanics here and, hence, *uncertainty*, and, therefore, we're going to talk about the *average* position, energy, momentum, etcetera that's associated with a particular *state* of a system, or – as we'll keep things *very* simple – the properties of a 'particle', really. Think of an electron in some orbital, indeed! 🙂

So let’s now look at that set of Hamiltonian equations once again:

Looking at it *carefully* – so just look at it once again! 🙂 – and thinking about what we did when going from the discrete to the continuous setting, we can now understand we should write the following for the continuous case:

Of course, combining Schrödinger's equation with the expression above implies the following:

Now, how can we relate that integral to the expression on the right-hand side? I'll have to disappoint you here, as it requires a lot of math to transform that integral. It requires writing H(x, x') in terms of rather complicated functions, including – you guessed it, didn't you? – Dirac's delta function. Hence, I assume you'll believe me if I say that the matrix- and wave-mechanical approaches *are* actually equivalent. In any case, if you'd want to check it, you can always read Feynman yourself. 🙂

Now, I wrote this post to talk about quantum-mechanical *operators*, so let me do that now.

**Quantum-mechanical operators**

You know the concept of an operator. As mentioned above, we should put a little *hat* (^) on top of our Hamiltonian operator, so as to distinguish it from the matrix itself. However, the difference is usually quite clear from the context. Our operators were all matrices so far, and we'd write the matrix elements of, say, some operator A, as:

A_ij ≡ ⟨ i | A | j ⟩

The whole matrix itself, however, would usually *not* act on a *base* state but… Well… Just on some more general state ψ, to produce some new state φ, and so we'd write:

| φ ⟩ = A | ψ ⟩

Of course, we'd have to *describe* | φ ⟩ in terms of the (same) set of base states and, therefore, we'd expand this expression into something like this:

You get the idea. I should just add one more thing. You know this important property of amplitudes: the ⟨ ψ | φ ⟩ amplitude is the *complex conjugate* of the ⟨ φ | ψ ⟩ amplitude. It's got to do with time reversibility, because the complex conjugate of e^(−iθ) = e^(−i(ω·t−k·x)) is equal to e^(iθ) = e^(i(ω·t−k·x)), so we're just reversing the x- and t-direction. We write:

⟨ ψ | φ ⟩ = ⟨ φ | ψ ⟩*

Now, what happens if we want to take the complex conjugate when we insert a matrix? When writing ⟨ φ | A | ψ ⟩ instead of ⟨ φ | ψ ⟩, this rule becomes:

⟨ φ | A | ψ ⟩* = ⟨ ψ | A† | φ ⟩

The *dagger* symbol denotes the *conjugate transpose*, so A† is an operator whose matrix elements are equal to A†_ij = A_ji*. Now, it may or may not happen that the A† matrix is actually equal to the original A matrix. In that case – and *only* in that case – we can write:

⟨ ψ | A | φ ⟩ = ⟨ φ | A | ψ ⟩*

We then say that A is a 'self-adjoint' or 'Hermitian' operator. That's just a definition of a property, which the operator may or may not have – but many quantum-mechanical operators are actually Hermitian. In any case, we're well armed now to discuss some *actual* operators, and we'll start with that *energy* operator.
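The conjugate-transpose rule, and the Hermitian property, are easy to verify numerically – a small sanity check of my own, with a randomly chosen matrix and randomly chosen states:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4
A = rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))   # a generic operator
phi = rng.normal(size=n) + 1j * rng.normal(size=n)           # |phi>
psi = rng.normal(size=n) + 1j * rng.normal(size=n)           # |psi>

bra = lambda v: v.conj()              # <v| is the complex conjugate of |v>

# <phi|A|psi>* should equal <psi|A-dagger|phi>, with A-dagger = conj transpose
lhs = (bra(phi) @ A @ psi).conjugate()
rhs = bra(psi) @ A.conj().T @ phi
print(np.isclose(lhs, rhs))           # True

# A Hermitian operator is its own conjugate transpose: A + A-dagger always is
H = A + A.conj().T
print(np.allclose(H, H.conj().T))     # True
```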

**The energy operator (H)**

We know the state of a *system* is described in terms of a set of *base* states. Now, our analysis of *N*-state systems showed we can always describe it in terms of a *special* set of base states, which are referred to as the **states of definite energy** because… Well… Because they're associated with some *definite* energy. In that post, we referred to these energy levels as E_**n** (**n** = I, II,… N). We used boldface for the subscript **n** (so we wrote **n** instead of n) because of these *Roman* numerals. With each energy level, we could associate a base state, of *definite energy* indeed, that we wrote as |**n**⟩. To make a long story short, we summarized our results as follows:


- The energies E_I, E_II,…, E_**n**,…, E_N are the **eigenvalues** of the Hamiltonian matrix H.
- The state vectors |**n**⟩ that are associated with each energy E_**n**, i.e. the set of vectors |**n**⟩, are the corresponding **eigenstates**.

We'll be working with some more subscripts in what follows, and these Roman numerals and the boldface notation are somewhat confusing (if only because I don't want you to think of these subscripts as *vectors*), so we'll just denote E_I, E_II,…, E_**n**,…, E_N as E_1, E_2,…, E_i,…, E_N, and we'll *number* the states of definite energy accordingly, also using some Greek letter so as to clearly distinguish them from all our *Latin* letter symbols: we'll write these states as |η_1⟩, |η_2⟩,… |η_N⟩. [If I say 'we', I mean Feynman, of course. You may wonder why he doesn't write |E_i⟩, or |ε_i⟩. The answer is: writing |E_i⟩ would cause confusion, because this state will appear in expressions like |E_i⟩E_i, so that's the 'product' of a state (|E_i⟩) and the associated *scalar* (E_i). Too confusing. As for using η (*eta*) instead of ε (*epsilon*) to denote something that's got to do with *e*nergy… Well… I guess he wanted to keep the *resemblance* with the **n**, and the Ancient Greeks apparently did use this η letter for a sound like '*e*', so… Well… Why not? Let's get back to the lesson.]

Using these base states of definite energy, we can write the state of the system as:

|ψ⟩ = ∑ |η_i⟩·C_i = ∑ |η_i⟩⟨η_i|ψ⟩ over all i (i = 1, 2,…, N)

Now, we didn't talk all that much about what these base states actually *mean* in terms of measuring something, but you'll believe me if I say that, when *measuring* the energy of the system, we'll always measure one *or* the other E_1, E_2,…, E_i,…, E_N value. We'll never measure something in-between: it's *either/or*. Now, as you know, measuring something in quantum physics is supposed to be destructive but… Well… Let us *imagine* we could make a thousand measurements to try to determine the *average* energy of the system. We'd do so by counting the number of times we measure E_1 (and of course we'd denote that number as N_1), E_2, E_3, etcetera. You'll agree that we'd measure the average energy as:

However, measurement is destructive, and we actually *know* what the *expected value* of this 'average' energy will be, because we know the *probabilities* of finding the system in a particular base state. That probability is equal to the *absolute* square of that C_i coefficient above, so we can use the P_i = |C_i|² formula to write:

⟨E_av⟩ = ∑ P_i·E_i over all i (i = 1, 2,…, N)

Note that this is a rather general formula. It's got nothing to do with quantum mechanics: if A_i represents the *possible* values of some quantity A, and P_i is the probability of getting that value, then (the expected value of) the average A will also be equal to ⟨A_av⟩ = ∑ P_i·A_i. No rocket science here! 🙂 But let's now apply our quantum-mechanical formulas to that ⟨E_av⟩ = ∑ P_i·E_i formula. [Oh – and I apologize for using the same angle brackets ⟨ and ⟩ to denote an expected value here – sorry for that! But it's what Feynman does – and other physicists! You see: they don't *really* want you to understand stuff, and so they often use very confusing symbols.] Remembering that the absolute square of a complex number equals the product of that number and its complex conjugate, we can re-write the ⟨E_av⟩ = ∑ P_i·E_i formula as:

⟨E_av⟩ = ∑ P_i·E_i = ∑ |C_i|²·E_i = ∑ C_i*·C_i·E_i = ∑ ⟨ψ|η_i⟩⟨η_i|ψ⟩·E_i = ∑ ⟨ψ|η_i⟩·E_i·⟨η_i|ψ⟩ over all i

Now, you know that Dirac's *bra-ket* notation allows numerous manipulations. For example, what we could do is take out that 'common factor' ⟨ψ|, and so we may re-write that monster above as:

⟨E_av⟩ = ⟨ψ| ∑ |η_i⟩·E_i·⟨η_i|ψ⟩ = ⟨ψ|φ⟩, with |φ⟩ = ∑ |η_i⟩·E_i·⟨η_i|ψ⟩ over all i

**Huh?** Yes. Note the difference between |ψ⟩ = ∑ |η_i⟩·C_i = ∑ |η_i⟩⟨η_i|ψ⟩ and |φ⟩ = ∑ |η_i⟩·E_i·⟨η_i|ψ⟩. As Feynman puts it: φ is just some '*cooked-up*' state which you get by taking each of the base states |η_i⟩ in the amount E_i·⟨η_i|ψ⟩ (as opposed to the ⟨η_i|ψ⟩ amounts we took for ψ).

I know: you're getting tired and you wonder why we need all this stuff. Just hang in there. We're almost done. I just need to do a few more unpleasant things, one of which is to remind you that this business of the energy states being *eigenstates* (and the energy levels being *eigenvalues*) of our Hamiltonian matrix (see my post on *N*-state systems) comes with a number of interesting properties, including this one:

H|η_i⟩ = E_i|η_i⟩ = |η_i⟩E_i

Just think about what's written here: on the left-hand side, we're multiplying a matrix with a (base) state vector, while on the right-hand side we're multiplying that state vector with a *scalar*. So our |φ⟩ = ∑ |η_i⟩·E_i·⟨η_i|ψ⟩ sum now becomes:

|φ⟩ = ∑ H|η_i⟩⟨η_i|ψ⟩ over all i (i = 1, 2,…, N)

Now we can manipulate that expression some more so as to get the following:

|φ⟩ = H ∑ |η_i⟩⟨η_i|ψ⟩ = H|ψ⟩

Finally, we can re-combine this now with the ⟨E_av⟩ = ⟨ψ|φ⟩ equation above, and so we get the fantastic result we wanted:

⟨E_av⟩ = ⟨ ψ | φ ⟩ = ⟨ ψ | H | ψ ⟩

**Huh?** Yes! **To get the average energy, you operate on |ψ⟩ with H, and then you multiply the result with ⟨ψ|.** It's a beautiful formula. On top of that, the new formula for the average energy is not only pretty but also useful, because now we don't need to say anything about any particular set of base states. We don't even have to know all of the possible energy levels. When we have to calculate the average energy of some system, we only need to be able to describe the *state* of that system in terms of *some* set of base states, and we also need to know the Hamiltonian matrix for *that* set, of course. But if we know that, we can calculate its average energy.
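The whole chain ⟨E_av⟩ = ∑ P_i·E_i = ⟨ψ|H|ψ⟩ can be checked numerically. Here's a sketch of my own, with a randomly generated Hermitian matrix standing in for an actual physical Hamiltonian:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 5
M = rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))
H = M + M.conj().T                      # Hermitian, so the E_i are real

E, eta = np.linalg.eigh(H)              # eigenvalues E_i, eigenstates |eta_i>
                                        # (columns of eta)
psi = rng.normal(size=n) + 1j * rng.normal(size=n)
psi /= np.linalg.norm(psi)              # normalize: probabilities sum to 1

C = eta.conj().T @ psi                  # C_i = <eta_i|psi>
P = np.abs(C)**2                        # P_i = |C_i|^2

E_av_1 = np.sum(P * E)                  # sum of P_i * E_i
E_av_2 = (psi.conj() @ H @ psi).real    # <psi|H|psi>
print(np.isclose(E_av_1, E_av_2))       # True: the two expressions agree
```

Note that we never had to pick the energy basis by hand: the second expression only uses H and |ψ⟩, which is exactly the point made above.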

You'll say that's not a big deal because… Well… If you know the Hamiltonian, you know everything, so… Well… Yes. You're right: it's less of a big deal than it seems. Having said that, the whole development above is very interesting because of something else: we can easily *generalize* it to other physical measurements. I call it the 'average value' operator idea, but you won't find that term in any textbook. 🙂 Let me explain the idea.

**The average value operator (A)**

The development above illustrates how we can relate a *physical observable*, like the (average) energy (E), to a quantum-mechanical operator (H). Now, the development above can easily be generalized to *any observable that would be proportional to the energy*. It's perfectly reasonable, for example, to assume the angular momentum – as measured in some direction, of course, which we usually refer to as the z-direction – would be proportional to the energy, and so it's easy to define a new operator L_z: the operator of the z-component of the angular momentum L. [I know… That's a bit of a long name but… Well… You get the idea.] So we can write:

⟨L_z⟩_av = ⟨ ψ | L_z | ψ ⟩

In fact, further generalization yields the following grand result:

If a physical observable A is related to a suitable quantum-mechanical operator Â, then the average value of A for the state | ψ ⟩ is given by:

⟨A⟩_av = ⟨ ψ | Â | ψ ⟩ = ⟨ ψ | φ ⟩, with | φ ⟩ = Â | ψ ⟩

At this point, you may have second thoughts, and wonder: *what* state | ψ ⟩? The answer is: it doesn't matter. It can be any state, *as long as we're able to describe it in terms of a chosen set of base states*. 🙂

OK. So far, so good. The next step is to look at how this works for the continuous case.

**The energy operator for wavefunctions (H)**

We can start thinking about the *continuous* equivalent of the ⟨E_av⟩ = ⟨ψ|H|ψ⟩ expression by first expanding it. We write:

You know the continuous equivalent of a sum like this is an integral, i.e. an *infinite* sum. Now, because we've got *two* subscripts here (i and j), we get the following *double* integral:

Now, I did take my time to walk you through Feynman's derivation of the energy operator for the *discrete* case, i.e. the operator when we're dealing with *matrix mechanics*, but I think I can simplify my life here by just copying Feynman's succinct development:

Done! Given a wavefunction ψ(x), we get the average energy by doing that integral above. Now, the quantity in the braces of that integral can be written as that operator we introduced when we started this post:

So now we can write that integral much more elegantly. It becomes:

⟨E⟩_av = ∫ ψ*(x)·H·ψ(x) dx

You'll say that doesn't look like ⟨E_av⟩ = ⟨ ψ | H | ψ ⟩! It does. Remember that ⟨ ψ | = | ψ ⟩*. 🙂 *Done!*

I should add one qualifier, though: the formula above assumes our wavefunction has been normalized, so all probabilities add up to one. But that's a minor thing. The only thing left to do now is to generalize to three dimensions. That's easy enough. Our expression becomes a *volume* integral:

⟨E⟩_av = ∫ ψ*(r)·H·ψ(r) dV

Of course, dV stands for d*Volume* here, not for any potential energy, and, of course, once again we assume all probabilities over the volume add up to 1, so all is normalized. *Done!* 🙂
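Here's a numerical sketch of that ∫ ψ*·H·ψ dx integral (my own toy example, not Feynman's: the harmonic-oscillator ground state in natural units, where the average energy should come out as ħω/2 = 0.5):

```python
import numpy as np

# Natural units: hbar = m = omega = 1, so V(x) = x^2/2 and the ground state
# is psi(x) = pi^(-1/4) * exp(-x^2/2), already normalized.
x = np.linspace(-10, 10, 4001)
dx = x[1] - x[0]
psi = np.pi**-0.25 * np.exp(-x**2 / 2)

# H psi = -(1/2) psi'' + V psi, with psi'' from a finite-difference stencil
d2psi = (np.roll(psi, 1) - 2*psi + np.roll(psi, -1)) / dx**2
H_psi = -0.5 * d2psi + 0.5 * x**2 * psi

E_av = np.sum(psi.conjugate() * H_psi) * dx   # the integral psi* H psi dx
print(E_av)                                   # close to 0.5 = hbar*omega/2
```

The same code with a ψ that is *not* an energy eigenstate would still give a perfectly good average energy – that's the whole point of the formula.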

We're almost done with this post. What's left is the *position* and *momentum* operator. You may think this is going to be another lengthy development but… Well… It turns out the analysis is remarkably simple. Just stay with me a few more minutes and you'll have earned your degree. 🙂

**The position operator (x)**

The thing we need to solve here is really easy. Look at the illustration below as representing the probability density of some particle being at *x*. Think about it: what’s the average position?

Well? What? The (expected value of the) average position is just this simple integral: ⟨x⟩_av = ∫ x·P(x) dx, over the whole range of possible values for x. 🙂 That's all. Of course, because P(x) = |ψ(x)|² = ψ*(x)·ψ(x), this integral now becomes:

⟨x⟩_av = ∫ ψ*(x)·x·ψ(x) dx

That looks *exactly* the same as ⟨E⟩_av = ∫ ψ*(x)·H·ψ(x) dx, and so we can look at x as an operator too!

*Huh?* Yes. It's an extremely simple operator: it just means "multiply by x". 🙂

I know you're shaking your head now: is it *that* easy? It is. Moreover, the 'matrix-mechanical equivalent' is equally simple but, as it's getting late here, I'll refer you to Feynman for that. 🙂
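It really is that easy. As a quick numerical illustration (my own toy example: a Gaussian wavefunction centered at x = 2), sandwiching "multiply by x" between ψ* and ψ gives exactly the average position:

```python
import numpy as np

x = np.linspace(-10, 14, 4001)
dx = x[1] - x[0]

# Normalized Gaussian wavefunction centered at x0 = 2
x0, sigma = 2.0, 1.5
psi = (2*np.pi*sigma**2)**-0.25 * np.exp(-(x - x0)**2 / (4*sigma**2))

# <x> = integral of psi*(x) * x * psi(x) dx -- "multiply by x" in the middle
x_av = np.sum(psi.conjugate() * x * psi) * dx
print(x_av)    # close to 2.0, the center of the probability density
```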

**The momentum operator (p_x)**

Now we want to calculate the average momentum of, say, some electron. What integral would you use for that? […] Well… *What?* […] It's easy: it's the same thing as for x. We can just substitute p for x in that ⟨x⟩_av = ∫ x·P(x) dx formula, so we get:

⟨p⟩_av = ∫ p·P(p) dp, over the whole range of possible values for p

Now, you might think the rest is equally simple, and… Well… It actually *is* simple, but there's one additional thing in regard to the need to normalize stuff here. You'll remember we defined a *momentum* wavefunction (see my post on the Uncertainty Principle), which we wrote as:

φ(p) = ⟨ mom p | ψ ⟩

Now, in the mentioned post, we related this *momentum* wavefunction to the particle's ψ(x) = ⟨x|ψ⟩ wavefunction – which we should actually refer to as the *position* wavefunction, but everyone just calls it *the* particle's wavefunction, which is a bit of a misnomer, as you can see now: a wavefunction describes some *property* of the system, and so we can associate *several* wavefunctions with the same system, really! In any case, we noted the following there:

- The two probability density functions, φ(p) and ψ(x), look pretty much the same, but the *half-width* (or standard deviation) of one was inversely proportional to the half-width of the other. To be precise, we found that the constant of proportionality was equal to ħ/2, and wrote that relation as follows: σ_p = (ħ/2)/σ_x.
- We also found that, when using a regular normal distribution function for ψ(x), we'd have to normalize the probability density function by inserting a (2πσ_x²)^(−1/2) in front of the exponential.

Now, it's a bit of a complicated argument, but the upshot is that we cannot just write what we usually write, i.e. P_i = |C_i|² or P(x) = |ψ(x)|². No. We need to put a normalization factor in front, which combines the two factors I mentioned above. To be precise, we have to write:

P(p) = |⟨p|ψ⟩|²/(2πħ)

So… Well… Our ⟨p⟩_av = ∫ p·P(p) dp integral can now be written as:

⟨p⟩_av = ∫ ⟨ψ|p⟩·p·⟨p|ψ⟩ dp/(2πħ)

So that integral is totally like what we found for ⟨x⟩_av and so… We could just leave it at that, and say we've solved the problem. In that sense, it *is* easy. However, having said that, it's obvious we'd want some solution that's written in terms of ψ(x), rather than in terms of φ(p), and that requires some more manipulation. I'll refer you, once more, to Feynman for that, and I'll just give you the result:

So… Well… It turns out that the momentum operator – which I tentatively denoted as p_x above – is *not* as simple as our position operator (x). Still… It's not *hugely* complicated either, as we can write it as:

p_x ≡ (ħ/i)·(∂/∂x)
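We can convince ourselves numerically that this (ħ/i)·(∂/∂x) operator does what it should. Here's a sketch of my own (with ħ set to 1): a Gaussian wave packet modulated by a plane-wave factor e^(i·k₀·x) has average momentum ħ·k₀, and that's what the sandwiched derivative gives us.

```python
import numpy as np

# Wave packet with a definite average momentum: Gaussian envelope times a
# plane-wave factor exp(i*k0*x). With hbar = 1, <p> should equal k0.
hbar, k0, sigma = 1.0, 3.0, 1.0
x = np.linspace(-12, 12, 4001)
dx = x[1] - x[0]
psi = (2*np.pi*sigma**2)**-0.25 * np.exp(1j*k0*x - x**2 / (4*sigma**2))

# Apply p_x = (hbar/i) d/dx using a centered finite difference
dpsi = (np.roll(psi, -1) - np.roll(psi, 1)) / (2*dx)
p_av = (np.sum(psi.conjugate() * (hbar/1j) * dpsi) * dx).real
print(p_av)    # close to 3.0 = hbar * k0
```

The derivative picks the i·k₀ out of the plane-wave factor, and the ħ/i in front turns that into ħ·k₀ – which is just the de Broglie relation p = ħk showing up numerically.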

Of course, the *purists* amongst you will, once again, say that I should be more careful and put a *hat* wherever I'd need to put one so… Well… You're right. I'll wrap this all up by copying Feynman's overview of the operators we just explained, and so he *does* use the fancy symbols. 🙂

Well, folks – that's it! Off we go! You know all about quantum physics now! We just need to work ourselves through the exercises that come with Feynman's *Lectures*, and then you're ready to go and bag a degree in physics somewhere. So… Yes… That's what *I* want to do now, so I'll be silent for quite a while. Have fun! 🙂

Some content on this page was disabled on June 16, 2020 as a result of a DMCA takedown notice from The California Institute of Technology. You can learn more about the DMCA here:

https://en.support.wordpress.com/copyright-and-the-dmca/
