Argument :: Against a Deprecated Quantum Model (E-mail)


VERY IMPORTANT NOTICE: First and foremost, this argument was made early in my undergraduate career. If I were to rewrite it today, it would be very different, and perhaps far more accurate. I still support the SENTIMENT of the article, though I would like to refine the content.

For example, certain references to particles going forward and backward in time can probably be removed. A more sophisticated understanding of the mathematical structures used in physics (Hermitian matrices, etc.) does not require this sort of thinking, and relying on it is a dangerous way to lead people into other fundamental misunderstandings.

Also note that this argument does not invalidate the Heisenberg uncertainty principle (HUP). The HUP is the fundamental principle behind all of quantum mechanics, and nearly all of quantum mechanics can be reduced to consequences of it. This argument only attempts to correct a popular misunderstanding about the HUP.

During a particular lecture in a Philosophy 101 class I was taking as a General Education Curriculum (GEC) class my senior year, the professor, whom I hold in the highest esteem, made a comment about the uncertainty principle which I felt rested on a quantum model that has since been deprecated. Earlier in the quarter, during a recitation, the recitation instructor also made a comment which seemed to be directly related to this older model.

I also notice that, for simplicity's sake, this older model is still taught in introductory physics classes rather than asking students to make the mental leap into the newer paradigm, which provides a much better explanation of quantum phenomena.

This e-mail to my philosophy professor turned into a document well-structured enough for me to feel it was publishable here. I invite you to read it to help you, too, fold the more modern model into your own understanding of quantum mechanics.

You may also find this e-mail interesting because it comments not only on the uncertainty principle but also on the nature of time itself and on how we observe particles which naturally travel back in time.

While I present these arguments here, they are not my own; I am simply trying to put in a short document some concepts that giants before me have expressed in much larger works.
 
 

From: Ted Pavlic
To: Neil Tennant
Subject: PHIL 101: Comment on Uncertainty
Date: Saturday, November 02, 2002 9:53 PM

Professor Tennant,

    In the last Philosophy 101 lecture, you talked a great deal about
uncertainty and accepting it rather than finding a Cartesian band-aid to
soften or dismiss it. You talked about this being the most modern view and
used examples in science to justify why modern philosophers and scientists
have been convinced by their observations that even our most trusted truths
may be very questionable. During this justification, you mentioned the
uncertainty principle in quantum mechanics. My comment from here forward is
on the use of this principle.

    I intend to show that this is a deprecated principle which is only used
at an introductory level to try to explain quantum interactions; the more
modern view of quantum mechanics is based on a much more probabilistic
paradigm which, when accepted entirely, dismisses any need for an
uncertainty principle. I will also briefly discuss the impact of this on
other quantum generalizations which oversimplify the model yet are
frequently used [perhaps dangerously] in philosophical conversations. In
particular, because of a discussion on the spontaneous generation and
destruction of matter between any two points of space brought up in my
Philosophy 101 recitation some time ago, I intend to speak very briefly on
the relationship of time and the probabilistic paradigm. In particular, I
wish to point out some errors which are made when considering the time
horizon which we naturally observe (which perhaps supports the idea that
the world we [perhaps falsely] believe to exist is governed only by our
observations).

    As I do not intend to make this an extremely lengthy comment, I wish to
cite as a reference Richard Feynman's _QED: The Strange Theory of Light and
Matter_. Most of what I will discuss here is covered in this very short
book. While the book primarily centers on quantum electrodynamics (QED), it
includes a fairly complete section explaining the similarities between QED
and the rest of physics. I typically recommend this book to those who are
either just entering a study of quantum physics (or a related study, like
physical chemistry) or who simply want a better understanding of popular
physics. It is a short and fairly well-written book (actually a collection
of four lectures), and it covers some important concepts like Feynman
diagrams.

    Again, as I do not mean to make this message too much longer, I will try
to jump straight into the parts of my argument which are related to your
lecture material. In particular, you mentioned the role of uncertainty when
considering single- and double-slit problems and the diffraction patterns
that are created by them. Before the twentieth century, this sort of
phenomenon was explained by electromagnetic wave theory, and this theory
explained our observations quite well until Einstein and Planck came along
and created what came to be known as quantum physics, which postulated
photons. This is where introductory material starts to diverge from the
more modern model of quantum physics. The introductory overgeneralization
would typically say that these photons have wave-like behavior in certain
situations. Some might call them "wavicles." We know that the main carrier
of light must be some sort of tiny corpuscle, but we have trouble explaining
the wave-like behavior under certain constraints. Many people do not realize
this, but the idea of tiny light particles (going against the grain of EM
wave theory) had been postulated long before either Einstein or Planck.
Newton understood that the more sensible model of light had to be
particle-based, but he had too much trouble with its apparent duality to
ever come up with a theory as elegant as the one that was postulated with
quantum mechanics. Regardless, the writings of Newton on this subject are
very valuable and show him to be an amazing thinker.

    Now, the more modern view of quantum mechanics treats photons as
particles which carry a probability (more precisely, a probability
amplitude) with them which has both a magnitude and a phase. When photons
with equivalent magnitudes and opposite phases "intersect," their
probabilities subtract to zero and no photon is detected. Keeping this in
mind, understand that light does not travel in straight lines from one
point to another. Light travels in all directions through all possible
curves and paths from one place to another. In the end, our observations
are of where the probabilities "add up," which typically is along a
straight-line path. When light is forced through inhomogeneous space, its
probabilities cancel in such a way that curved paths add up or perhaps
multiple paths show up.
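
    In symbols, as a minimal formalization of this rule (the notation here
is my own and is not taken from the lecture or from Feynman's book): each
possible path k contributes a complex amplitude

        A_k = r_k e^{i \phi_k}

with magnitude r_k and phase \phi_k, and the probability of detecting the
photon is

        P = \left| \sum_k A_k \right|^2 .

Two paths with equal magnitudes and opposite phases, A_2 = -A_1, sum to
zero, which is exactly the cancellation described above.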

    Keeping all of this in mind, consider a single slit (this example is
motivated by one made on page 55 of Feynman's book). If the slit is wide
enough, enough probable paths can make their way through the slit to cancel
out the non-straight paths. The light will appear to travel in a straight
path. However, in this model, light is traveling along every path, but as
the paths intersect, the probabilities cancel and the other paths cannot be
observed. Now, if the slit is squeezed small enough, important "cancellation
paths" are cut out and probability starts to "add up" at locations other
than the "straight line" path.
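
    To make this concrete, here is a small numerical sketch (my own
illustration, not from Feynman's book; the wavelength, slit widths, and
distances are assumed values). Each point across the slit is treated as one
"path," every path gets a unit-magnitude amplitude whose phase is set by
the path length to a position on a far screen, and the amplitudes are
summed. Narrowing the slit removes the paths that would have cancelled the
off-axis contributions, so the probability spreads out:

# A rough numerical sketch of the single-slit case (illustrative values only).
import numpy as np

wavelength = 500e-9                        # 500 nm light (assumed)
k = 2 * np.pi / wavelength                 # wavenumber
screen_distance = 1.0                      # slit-to-screen distance, meters (assumed)
screen_x = np.linspace(-0.02, 0.02, 2001)  # detector positions along the screen

def normalized_probability(slit_width, n_paths=400):
    """Add one unit-magnitude complex amplitude per path through the slit."""
    slit_points = np.linspace(-slit_width / 2, slit_width / 2, n_paths)
    # Path length from each point in the slit to each point on the screen.
    r = np.sqrt(screen_distance**2 +
                (screen_x[:, None] - slit_points[None, :])**2)
    amplitudes = np.exp(1j * k * r)        # unit magnitude, phase ~ path length
    total = amplitudes.sum(axis=1)         # "add the arrows" over all paths
    p = np.abs(total)**2                   # probability ~ squared magnitude
    return p / p.max()

wide = normalized_probability(200e-6)      # 200-micron slit
narrow = normalized_probability(20e-6)     # 20-micron slit

# The half-maximum width of the central peak is much larger for the narrow slit.
for label, p in [("wide slit  ", wide), ("narrow slit", narrow)]:
    above_half = screen_x[p > 0.5]
    width_mm = 1e3 * (above_half.max() - above_half.min())
    print(label, "central peak width ~ %.1f mm" % width_mm)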

    I quote Feynman:

=====
This is an example of the "uncertainty principle": there is a kind of
"complementarity" between knowledge of whre light goes between the blocks
and where it goes afterwards--precise knowledge of both is impossible. I
would like to put the uncertainty principle in its historical place: When
the revolutionary ideas of quantum physics were first coming out, people
still tried to understand them in terms of old-fashioned ideas (such as,
light goes in straight lines). But at a certain point the old-fashioned
ideas would begin to fail, so a warning was developed that said, in effect,
"Your old-fashioned ideas are no damn good when . . ." If you get rid of all
the old-fashioned ideas and instead use the ideas that I'm explaining in
these lectures--adding _arrows_ for all the ways an event can happen--there
is no need for an uncertainty principle!
=====

    Feynman calls this an example of the uncertainty principle because as we
restrict a light's path more and more to confirm it really is going in a
straight line, we see that it stops going in a straight line. However, this
really is a band-aid over a much better model -- a model where light (and
eventually everything) is just a function of probability. It turns out that
the three orthogonal dimensions and time are all parameterized by this
probability.

    Now consider the two-slit case (this time, I motivate this example by
one starting on page 79 of Feynman's book). The same method of vector
addition of probabilities with magnitude and phase can be applied to this
case, and the expected interference pattern can be predicted. In this case,
I wish to point out the necessity of considering all possible paths when
computing the total probability of observed light. When adding up all
possible probable cases to find out where the probabilities add up and
cancel out, it is necessary only to consider the probability of light
reaching the detector on the far side of the slits, which accepts light as
it comes through the slits. As that detector moves up and down, the
probability oscillates, tracing out the interference pattern. Now, one
might say it would be interesting to put photon detectors (which can be
built) in one or both of the slits to see which slit the light actually
travels through. When the experiment is turned on and data are taken, one
finds that with the detectors in place, the interference pattern
disappears! The probability stays constant on the other side of the slits.

    This very weird result might be explained sophomorically as a result of
the uncertainty principle -- we're not "allowed" to determine that much
about the setup. However, this is actually a result of changing the
probability function. Before, it was important to consider only the
probability of light hitting the detector on the side of the slits opposite
the light source. Now, there are more (actually different) probabilistic
outcomes that need to be considered. Now, we know that light has the option
of not only hitting the original detector, but going through one slit
detector and hitting the original detector or going through the other slit
detector and hitting the original detector. By isolating these two cases,
interference disappears; the end result is the simple sum of the two
single-slit probabilities. To put it another way, in order to have
interference, we must accept the probability that photons hitting the
interference detector could have come from *EITHER* of the two slits.
Knowing EXACTLY which slit the photons traveled through destroys the
necessary conditions for interference.
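
    Again as a small numerical sketch (my own illustration, with assumed
dimensions, not taken from Feynman's book): with no slit detectors, the two
paths are indistinguishable, so their complex amplitudes are summed before
squaring; with which-slit detectors, the two outcomes are distinguishable,
so the probabilities themselves are summed and the interference (cross)
term never appears:

# A rough numerical sketch of the two-slit case (illustrative values only).
import numpy as np

wavelength = 500e-9                   # 500 nm light (assumed)
k = 2 * np.pi / wavelength
slit_separation = 100e-6              # 100-micron slit separation (assumed)
L = 1.0                               # slit-to-screen distance, meters (assumed)
x = np.linspace(-0.02, 0.02, 1001)    # detector positions along the screen

# One unit-magnitude complex amplitude per slit; phase set by the path length.
r1 = np.sqrt(L**2 + (x - slit_separation / 2)**2)
r2 = np.sqrt(L**2 + (x + slit_separation / 2)**2)
a1 = np.exp(1j * k * r1)
a2 = np.exp(1j * k * r2)

p_no_detectors = np.abs(a1 + a2)**2                # add amplitudes, then square
p_slit_detectors = np.abs(a1)**2 + np.abs(a2)**2   # add probabilities directly

print("no slit detectors:   min %.2f  max %.2f"
      % (p_no_detectors.min(), p_no_detectors.max()))
print("with slit detectors: min %.2f  max %.2f"
      % (p_slit_detectors.min(), p_slit_detectors.max()))
# The first case oscillates between roughly 0 and 4 as the detector moves up
# and down the screen; the second is flat at 2 -- the interference is gone.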

    At this point I have completed most of my argument. Briefly, I wish to
summarize what has been said thus far to support the final short portion of
my argument:

*) Quantum electrodynamics models light as photons, which are probabilistic
corpuscles with phasor probability representors.

*) These probabilities can "cancel out" and "add up" due to their phase
characteristics (by way of vector addition). Because they have phasor
representors which are easily modeled as complex exponentials (which break
down into real and imaginary sinusoidal components), observed
characteristics of light may make it appear to follow wave-like behavior
(see the identity written out just after this list). Most
electromagneticians can use this approximation with very little danger.

*) Likewise, because of how these phasor representors "add up" and "cancel
out," there is an apparent "uncertainty principle" which is observed as
alternate "paths" are destroyed. Rather than thinking of this as an
uncertainty which says that we never can know the exact single path light
travels, it is a better model to say that light travels through every path,
and as the number of possible paths is limited, the paths which cancel other
paths are removed and their conjugates show up.

*) In order to calculate the final true probability function of an event, it
is necessary to consider all possible "paths" which lead up to that event.
The probabilities of each of these paths can be summed to form the true
probability of the event.
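
    The wave-like appearance and the apparent "uncertainty" both fall out of
one identity (again in my own notation, not Feynman's): for two paths with
amplitudes A_1 = r_1 e^{i \phi_1} and A_2 = r_2 e^{i \phi_2},

        \left| A_1 + A_2 \right|^2 = r_1^2 + r_2^2
            + 2 r_1 r_2 \cos(\phi_1 - \phi_2).

The cosine cross term is the "wave-like" interference; when the two paths
are made distinguishable, the cross term drops out and only the classical
sum r_1^2 + r_2^2 remains.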

    This concludes the bulk of my argument. I wish finally to show that it
is perhaps imprudent to suggest that matter is spontaneously created and
destroyed in such a way that over time the amount of matter in the universe
averages out. I will use what has been discussed previously in this message
to support my claims. I will restrict this argument to quantum
electrodynamics and then weakly expand it to cover the rest of quantum
physics.

    Up to this point I have talked only about the various "paths" of
photons. I have said that in order to find out the probability of light
hitting a detector, it is necessary to consider all possible "paths" to that
detector and sum the probabilities. This same trait goes for electrons (and
soon, all matter). An electron can move along a path like a photon, but it
can also absorb or emit a photon and change its momentum when it does one of
these two things.

    When considering an initial position and a final position of an
electron, in order to calculate the most probable path of the electron, it
is necessary to consider all possible paths. When photons were considered,
this was easy. Now that electrons are the case being analyzed, it is
necessary to consider not only the different direct paths, but it is
necessary to consider that the electron may emit a photon, change its
momentum, and head to the final position.

    This becomes more interesting when two electrons are considered. The
photons which are emitted can be absorbed by the second electron, which
changes its momentum as well. The photons which are exchanged in this manner
are often called "virtual photons." The quantum tendency for electrons to
emit or absorb these photons is what we observe at a larger scale as
"charge."

    You may have noticed that up to this point I have not really brought in
the subject of time. I mentioned that time is just as much governed by
probability as the three orthogonal dimensions, but I left things at that.
This was primarily because photons have no mass. That is, mass, like charge,
is observed at a larger scale as an entity's response to time. If an entity
does not exist in time (or experiences no time . . . or is exogenous to
time), it has no mass. At this point I am going to ask you to try to think
very abstractly about these concepts. As I said very early in this e-mail,
a real understanding of quantum physics requires a complete paradigm shift.
Traditional scientific education introduces things like "matter" and "time"
as absolutes -- the axioms at the bottom of the scientific pyramid. In the
quantum paradigm, these are simply probabilistic tendencies toward certain
behaviors. Things that have "matter" tend to behave a certain way under
certain conditions. In the quantum paradigm, the causal structure does not
move in only one direction in time; things that have not occurred "yet" can
influence the behavior of things which are just happening now. This is not
a fantastical fiction; this is a model that has been proposed and refined
over many years and seems to make very accurate predictions about the
behavior of the external world.

    Now, consider those two electrons traveling in space near each other.
Note that a photon that is emitted from one electron and absorbed by the
other could just as well be thought of as being emitted by the second
electron and absorbed by the first; there is no difference between the two
events. Likewise, it is probably easy to picture an electron emitting a
photon which travels forward in time to impact the momentum of another
electron that absorbs it, but it probably is a stretch to consider the
second electron emitting a photon BACK in time; however, it is perfectly
correct to describe that event in both ways -- with the photon going
backward or forward in time. Likewise, it is impossible to know "which way"
the photon went as our time horizon pushes forward, since photons
"experience" no time. A "backward moving" photon looks exactly the same as
a "forward moving" one.

    However, it is not only photons which travel back in time. Going back to
the single-electron example, I want to use another Feynman example (from
page 97 of his book). When considering the scattering of light as it enters
a region with electrons, three cases come to mind. In the first case, an
electron encounters a photon, changes its momentum, and later emits a
photon and continues on in its original direction. Both the electron and
photon continue in paths parallel to their original paths. This is probably
the easiest case to visualize.

    Now consider a more interesting case. Imagine an electron emitting a
photon moving parallel to the original photon (changing its own momentum to
do so) and then **LATER** encountering the original photon which entered
the scattering medium. The original photon will be absorbed and will change
the momentum of the electron back to its original momentum, after which it
will travel parallel to its original path. Nothing travels "back" in time
in this case, but a past event is influenced by a future one. That is,
causality is reversed.

    Now, in the most interesting case, not only do we explore the same sort
of causal reversal, but we see that matter can travel backward in time too
(with some interesting consequences). Imagine a photon entering this medium
and an electron already in it. At some time in the future, the electron
emits a photon and changes its momentum so much that it travels backward in
time, where it meets up with the entering photon and joins with it to
become a forward-moving electron again. Now remember that the property of
matter involves how entities behave under the influence of time. This
"backward moving" electron is observed in the forward-moving time direction
as a positron. That is, it is an entity with the exact same amount of
matter in it but with an opposite charge. Also remember that charge relates
to how easily a massive entity emits or absorbs photons.

    Now consider a world (like the one in which we make our observations)
where time can only be viewed in one direction. Consider this "time horizon"
constantly moving in one direction. Now use this forward moving time horizon
to look at the above three cases. The first two cases do not appear to show
anything interesting, but the third case appears to show a photon
"spontaneously" "splitting" into a positron and an electron. That positron
appears to fire off to collide with another electron which produces a
photon. This can be and has been observed in a laboratory. In fact,
colliding two photons together can easily create positrons which can be kept
in magnetic fields for weeks.

    This sort of spontaneous matter generation is nothing more than matter
changing momentum in the future and coming back in time. Our time horizon
limits "how" we can observe these things, but if we have an appropriate
model which allows causation to exist going in both directions of time, it
is easy to explain why and how these particles appear to pop into existence.

    Now, I promised to extend this past QED to the other side of quantum
physics. That is, most of this note explains phenomena which are coupled
with Coulombic attraction and repulsion; that is, it explains the realm of
physics involving electromagnetics. However, these same ideas are being
applied to the other forces in nature, and as they are studied further and
further, new particles which play the same roles as electrons, positrons,
and photons are being discovered. This is where quarks, muons, neutrinos,
tau particles, gluons, and W particles come into play. Just as positrons
pop into existence by moving backward in time, these particles do the same
thing in order to fulfill some possible "path" from one natural state to
another.

    And with all of this, it should be quite clear why the amount of matter
in the universe does average out over time. Matter which comes into
existence in one time comes *FROM* another time. A saturation in one time is
a depletion in another. Also, as these effects really only become extremely
important at small distances, the time frames which are interacting in this
way are typically very close to each other (from our observational stance)
so the increase in matter at one time is very quickly offset by the decrease
in another.

    And so hopefully through all of this I have at least started to convince
you that the "uncertainty principle" is just something that is typically
used to help model some of our strange observations, but if you are able to
accept our universe as one that is completely based upon probability, then
there is no need to consider this sort of principle; the effects of such a
principle are easily folded into the new model.

    Now, I am not arguing that uncertainty does not exist. You should notice
that this model embraces it; it does not claim to know anything about how
things really are. It is based completely on which things are possible and
on which of those possibilities remain when all possible things are allowed
to exist alongside each other.

    The only other things I wish to point out that I did not talk about are
gravity (which is not explained by anything above) and the speed of light
(and its constancy). I will not attempt to explain gravity; a new unified
theory exists which does a good job of postulating the existence of other
things (like diallel lines) which help to explain how gravity folds into
this whole theory. My comment about the speed of light involves its
constancy at small distances. There is actually a very small probability
that light can travel faster, as well as slower, than the speed of light.
At large distances, the contributions of these two probabilities cancel
each other out. However, at very small distances this characteristic
becomes very important, and light does start to travel faster than the
speed of light. I do not wish to comment much further on this, but I did
want to bring it up.

    Finally, I do want to try to make a small comment on "existence." While
I say here that certain theories postulate the existence of certain
particles, I think it is well understood that these theories make no
attempt to say, "these particles actually exist." These theories simply
posit these particles in order to model the behavior of the universe. An
animist, perhaps, might form other theories which can just as accurately
explain the behavior of the world around them in more spiritual terms.
Neither the scientist nor the animist would claim that either theory was
more "correct" than the other; they both serve to model a universally
unknown world around them (in fact, most people I encounter tend to group
animism and science together in some form of harmony with each other and
with "nature").

    Anyway, I was uncomfortable with your use of the uncertainty principle,
especially when it came to the diffraction patterns you spoke of, so I
wanted to send a message trying to clarify a more modern quantum model.
Personally, I am an electrical engineer and have little need for the
precision of QED, but these sorts of concepts not only help me be more
comfortable with my own approximations, but they also help me to explain
some more complicated interactions to my students.

    All the best --
    Ted Pavlic
    pavlic.3@osu.edu
 
