Quantum mechanics

Quantum mechanics is the branch of physics relating to the very small. 

It results in what may appear to be some very strange conclusions about the physical world. At the scale of atoms and electrons, many of the equations of classical mechanics, which describe how things move at everyday sizes and speeds, cease to be useful. In classical mechanics, objects exist in a specific place at a specific time. However, in quantum mechanics, objects instead exist in a haze of probability; they have a certain chance of being at point A, another chance of being at point B and so on.

Three revolutionary principles

Quantum mechanics (QM) developed over many decades, beginning as a set of controversial mathematical explanations of experiments that the math of classical mechanics could not explain. It began at the turn of the 20th century, around the same time that Albert Einstein published his theory of relativity, a separate mathematical revolution in physics that describes the motion of things at high speeds. Unlike relativity, however, the origins of QM cannot be attributed to any one scientist. Rather, multiple scientists contributed to a foundation of three revolutionary principles that gradually gained acceptance and experimental verification between 1900 and 1930. They are:

Quantized properties: Certain properties, such as position, speed and color, can sometimes only occur in specific, set amounts, much like a dial that “clicks” from number to number. This challenged a fundamental assumption of classical mechanics, which said that such properties should exist on a smooth, continuous spectrum. To describe the idea that some properties “clicked” like a dial with specific settings, scientists coined the word “quantized.”

Particles of light: Light can sometimes behave as a particle. This was initially met with harsh criticism, as it ran contrary to 200 years of experiments showing that light behaved as a wave, much like ripples on the surface of a calm lake. Like those ripples, light bounces off walls and bends around corners, and the crests and troughs of its waves can add up or cancel out. Added wave crests result in brighter light, while waves that cancel out produce darkness. A light source can be thought of as a ball on a stick being rhythmically dipped in the center of a lake. The color emitted corresponds to the distance between the crests, which is determined by the speed of the ball’s rhythm.

Waves of matter: Matter can also behave as a wave. This ran counter to the roughly 30 years of experiments showing that matter (such as electrons) exists as particles.

Quantized properties?

In 1900, German physicist Max Planck sought to explain the distribution of colors emitted over the spectrum in the glow of red-hot and white-hot objects, such as light-bulb filaments. When making physical sense of the equation he had derived to describe this distribution, Planck realized it implied that combinations of only certain colors (albeit a great number of them) were emitted, specifically those that were whole-number multiples of some base value. Somehow, colors were quantized! This was unexpected because light was understood to act as a wave, meaning that values of color should be a continuous spectrum. What could be forbidding atoms from producing the colors between these whole-number multiples? This seemed so strange that Planck regarded quantization as nothing more than a mathematical trick. According to Helge Kragh in his 2000 article in Physics World magazine, “Max Planck, the Reluctant Revolutionary,” “If a revolution occurred in physics in December 1900, nobody seemed to notice it. Planck was no exception …” 

Planck’s equation also contained a number that would become very important to the later development of QM; today, it’s known as “Planck’s constant.”
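
To get a feel for the sizes involved, here is a minimal Python sketch that uses Planck’s constant to compute the energy of a single quantum of light; the 540 THz frequency (roughly green light) is simply an illustrative value, not a figure from Planck’s own work.

    # Minimal sketch: the energy of one quantum of light, E = h * f.
    # The 540 THz frequency (roughly green light) is an illustrative value.
    PLANCK_CONSTANT = 6.626e-34  # joule-seconds

    def quantum_energy(frequency_hz: float) -> float:
        """Energy carried by a single quantum of light at the given frequency."""
        return PLANCK_CONSTANT * frequency_hz

    green_light_hz = 5.4e14  # ~540 THz
    print(f"one quantum of green light: {quantum_energy(green_light_hz):.2e} J")

    # In Planck's picture, an emitter at this frequency can give up energy
    # only in whole-number multiples of that amount:
    for n in range(1, 4):
        print(f"n = {n}: {n * quantum_energy(green_light_hz):.2e} J")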

Quantization helped to explain other mysteries of physics. In 1907, Einstein used Planck’s hypothesis of quantization to explain why the temperature of a solid changed by different amounts if you put the same amount of heat into the material but changed the starting temperature.
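
One way to see what Einstein did is a minimal numerical sketch of his 1907 model of a solid, in which the vibrating atoms can hold energy only in quantized steps; the heat capacity then depends on temperature, so the same heat input warms the solid by different amounts depending on where you start. The 300 K “Einstein temperature” below is an assumed illustrative value, not a figure from Einstein’s paper.

    import math

    # Sketch of Einstein's 1907 model: molar heat capacity of a solid whose
    # atoms vibrate with quantized energies. C_V = 3R * x^2 * e^x / (e^x - 1)^2,
    # with x = THETA_E / T.
    R = 8.314        # gas constant, J/(mol*K)
    THETA_E = 300.0  # assumed "Einstein temperature" of the solid, in kelvin

    def einstein_heat_capacity(temperature_k: float) -> float:
        """Molar heat capacity of the solid at the given temperature."""
        x = THETA_E / temperature_k
        return 3 * R * x**2 * math.exp(x) / (math.exp(x) - 1)**2

    for t in (50, 150, 300, 1000):
        print(f"T = {t:4d} K  ->  C_V = {einstein_heat_capacity(t):.1f} J/(mol*K)")

    # At high temperatures C_V approaches the classical value 3R (about 24.9);
    # at low temperatures it drops toward zero, as experiments had shown.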

Since the early 1800s, the science of spectroscopy had shown that different elements emit and absorb specific colors of light called “spectral lines.” Though spectroscopy was a reliable method for determining the elements contained in objects such as distant stars, scientists were puzzled about why each element gave off those specific lines in the first place. In 1888, Johannes Rydberg derived an equation that described the spectral lines emitted by hydrogen, though nobody could explain why the equation worked. This changed in 1913 when Niels Bohr applied Planck’s hypothesis of quantization to Ernest Rutherford’s 1911 “planetary” model of the atom, which postulated that electrons orbited the nucleus the same way that planets orbit the sun. According to Physics 2000 (a site from the University of Colorado), Bohr proposed that electrons were restricted to “special” orbits around an atom’s nucleus. They could “jump” between special orbits, and the energy produced by the jump caused specific colors of light, observed as spectral lines. Though quantized properties were invented as a mere mathematical trick, they explained so much that they became the founding principle of QM.
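
As a worked illustration of the spectral lines in question, the sketch below evaluates Rydberg’s formula for hydrogen’s visible (Balmer) series, in which the electron jumps down to the second “special” orbit; the constants are standard textbook values rather than figures drawn from the sources above.

    # Rydberg's formula for hydrogen: 1/lambda = R * (1/n1^2 - 1/n2^2).
    # For the visible (Balmer) lines, the electron jumps down to orbit n1 = 2.
    RYDBERG = 1.097e7  # Rydberg constant for hydrogen, in 1/m

    def hydrogen_line_nm(n1: int, n2: int) -> float:
        """Wavelength (nm) of the light emitted by a jump from orbit n2 to n1."""
        inverse_wavelength = RYDBERG * (1 / n1**2 - 1 / n2**2)
        return 1e9 / inverse_wavelength

    for n2 in (3, 4, 5, 6):
        print(f"jump {n2} -> 2: {hydrogen_line_nm(2, n2):.0f} nm")

    # Prints roughly 656, 486, 434 and 410 nm -- hydrogen's familiar spectral
    # lines, each corresponding to one allowed jump between quantized orbits.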

Particles of light?

In 1905, Einstein published a paper, “Concerning an Heuristic Point of View Toward the Emission and Transformation of Light,” in which he envisioned light traveling not as a wave, but as some manner of “energy quanta.” This packet of energy, Einstein suggested, could “be absorbed or generated only as a whole,” specifically when an atom “jumps” between quantized vibration rates. This would also apply, as would be shown a few years later, when an electron “jumps” between quantized orbits. Under this model, Einstein’s “energy quanta” contained the energy difference of the jump; when divided by Planck’s constant, that energy difference determined the color of light carried by those quanta. 

With this new way to envision light, Einstein offered insights into the behavior of nine different phenomena, including the specific colors that Planck described being emitted from a light-bulb filament. It also explained how certain colors of light could eject electrons off metal surfaces, a phenomenon known as the “photoelectric effect.” However, Einstein wasn’t wholly justified in taking this leap, said Stephen Klassen, an associate professor of physics at the University of Winnipeg. In a 2008 paper, “The Photoelectric Effect: Rehabilitating the Story for the Physics Classroom,” Klassen states that Einstein’s energy quanta aren’t necessary for explaining all of those nine phenomena. Certain mathematical treatments of light as a wave are still capable of describing both the specific colors that Planck described being emitted from a light-bulb filament and the photoelectric effect. Indeed, in Einstein’s controversial winning of the 1921 Nobel Prize, the Nobel committee only acknowledged “his discovery of the law of the photoelectric effect,” which specifically did not rely on the notion of energy quanta.
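
For readers who want the usual quantum bookkeeping of the photoelectric effect, it is commonly summarized as K_max = hf − φ: one quantum’s energy, minus the metal’s “work function” φ, sets the maximum energy of the ejected electron. The sketch below assumes a work function of about 2.3 eV (roughly the value quoted for sodium) purely for illustration.

    # Photoelectric effect in the energy-quanta picture: an electron is ejected
    # only when a single quantum carries more energy than the metal's work function.
    PLANCK_EV = 4.136e-15   # Planck's constant in eV*s
    WORK_FUNCTION_EV = 2.3  # assumed work function (roughly that of sodium)

    def max_electron_energy_ev(frequency_hz: float) -> float:
        """Maximum kinetic energy of an ejected electron: h*f minus the work function."""
        return PLANCK_EV * frequency_hz - WORK_FUNCTION_EV

    for label, f in (("red light, 450 THz", 4.5e14), ("violet light, 750 THz", 7.5e14)):
        k = max_electron_energy_ev(f)
        result = f"electrons ejected with up to {k:.2f} eV" if k > 0 else "no electrons ejected"
        print(f"{label}: {result}")

    # Even bright red light ejects no electrons, while dim violet light does --
    # the threshold behavior that made the particle picture so attractive.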

Roughly two decades after Einstein’s paper, the term “photon” was popularized for describing energy quanta, thanks to the 1923 work of Arthur Compton, who showed that light scattered off electrons changed in color. This showed that particles of light (photons) were indeed colliding with particles of matter (electrons), thus confirming Einstein’s hypothesis. By now, it was clear that light could behave both as a wave and a particle, placing light’s “wave-particle duality” into the foundation of QM.
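
Compton’s result is commonly written as a wavelength shift, Δλ = (h / m_e c)(1 − cos θ), where θ is the scattering angle; the sketch below evaluates it for a few angles using standard constants, not figures from Compton’s paper.

    import math

    # Compton shift: light scattered off an electron emerges with a longer
    # wavelength, delta_lambda = (h / (m_e * c)) * (1 - cos(theta)).
    H = 6.626e-34    # Planck's constant, J*s
    M_E = 9.109e-31  # electron mass, kg
    C = 2.998e8      # speed of light, m/s

    def compton_shift_pm(theta_degrees: float) -> float:
        """Wavelength increase, in picometers, for light scattered through theta."""
        shift_m = (H / (M_E * C)) * (1 - math.cos(math.radians(theta_degrees)))
        return shift_m * 1e12

    for angle in (45, 90, 180):
        print(f"scattering angle {angle:3d} deg: shift = {compton_shift_pm(angle):.2f} pm")

    # At 90 degrees the shift equals the "Compton wavelength," about 2.43 pm --
    # so small that the change in color is only measurable for X-rays.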

Waves of matter?

Since the discovery of the electron in 1897, evidence that all matter existed in the form of particles was slowly building. Still, the demonstration of light’s wave-particle duality made scientists question whether matter was limited to acting only as particles. Perhaps wave-particle duality could ring true for matter as well? The first scientist to make substantial headway with this reasoning was a French physicist named Louis de Broglie. In 1924, de Broglie used the equations of Einstein’s theory of special relativity to show that particles can exhibit wave-like characteristics, and that waves can exhibit particle-like characteristics. Then in 1925, two scientists, working independently and using separate lines of mathematical thinking, applied de Broglie’s reasoning to explain how electrons whizzed around in atoms (a phenomenon that was unexplainable using the equations of classical mechanics). In Germany, physicist Werner Heisenberg (teaming with Max Born and Pascual Jordan) accomplished this by developing “matrix mechanics.” Austrian physicist Erwin Schrödinger developed a similar theory called “wave mechanics.” Schrödinger showed in 1926 that these two approaches were equivalent (though Austrian-born physicist Wolfgang Pauli sent an unpublished result to Jordan showing that matrix mechanics was more complete).
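
De Broglie’s relation assigns any moving object a wavelength λ = h / (m v). As a rough illustration of why matter waves only show up at atomic scales, the sketch below compares an electron moving at a typical atomic speed with a thrown baseball; both speeds are assumed, illustrative values.

    # De Broglie's matter-wave relation: wavelength = h / (mass * speed).
    H = 6.626e-34  # Planck's constant, J*s

    def de_broglie_wavelength_m(mass_kg: float, speed_m_s: float) -> float:
        """Wavelength associated with a particle of the given mass and speed."""
        return H / (mass_kg * speed_m_s)

    electron = de_broglie_wavelength_m(9.109e-31, 2.2e6)  # electron at ~2,200 km/s
    baseball = de_broglie_wavelength_m(0.145, 40.0)       # 145 g baseball at 40 m/s

    print(f"electron: {electron:.2e} m  (comparable to the size of an atom)")
    print(f"baseball: {baseball:.2e} m  (far too small to ever detect)")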

The Heisenberg-Schrödinger model of the atom, in which each electron acts as a wave (sometimes referred to as a “cloud”) around the nucleus, replaced the Rutherford-Bohr model. One stipulation of the new model was that the ends of the wave that forms an electron must meet. In “Quantum Mechanics in Chemistry, 3rd Ed.” (W.A. Benjamin, 1981), Melvin Hanna writes, “The imposition of the boundary conditions has restricted the energy to discrete values.” A consequence of this stipulation is that only whole numbers of crests and troughs are allowed, which explains why some properties are quantized. In the Heisenberg-Schrödinger model of the atom, electrons obey a “wave function” and occupy “orbitals” rather than orbits. Unlike the circular orbits of the Rutherford-Bohr model, atomic orbitals have a variety of shapes, ranging from spheres to dumbbells to daisies.
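
Hanna’s point about boundary conditions can be made concrete with the simplest textbook case, a “particle in a box” (used here as a stand-in, not the atomic problem itself): requiring the electron’s wave to fit a whole number of half-wavelengths between the walls restricts its energy to E_n = n² h² / (8 m L²). The 0.1-nanometer box width below is an assumed, atom-sized value.

    # Particle in a box: the boundary condition (the wave must vanish at the walls)
    # allows only whole numbers of half-wavelengths, so the energy is discrete:
    # E_n = n^2 * h^2 / (8 * m * L^2).
    H = 6.626e-34    # Planck's constant, J*s
    M_E = 9.109e-31  # electron mass, kg
    L = 1e-10        # assumed box width of 0.1 nm, roughly atom-sized

    def energy_level_ev(n: int) -> float:
        """Energy of the n-th allowed standing wave, in electronvolts."""
        return n**2 * H**2 / (8 * M_E * L**2) / 1.602e-19

    for n in range(1, 4):
        print(f"n = {n}: {energy_level_ev(n):.1f} eV")

    # Only these discrete energies satisfy the boundary condition;
    # anything in between is simply not allowed.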

In 1927, Walter Heitler and Fritz London further developed wave mechanics to show how atomic orbitals could combine to form molecular orbitals, effectively showing why atoms bond to one another to form molecules. This was yet another problem that had been unsolvable using the math of classical mechanics. These insights gave rise to the field of “quantum chemistry.”

The uncertainty principle

Also in 1927, Heisenberg made another major contribution to quantum physics. He reasoned that since matter acts as waves, some properties, such as an electron’s position and speed, are “complementary,” meaning there’s a limit (related to Planck’s constant) to how precisely both can be known at the same time. Under what would come to be called “Heisenberg’s uncertainty principle,” it was reasoned that the more precisely an electron’s position is known, the less precisely its speed can be known, and vice versa. This uncertainty principle applies to everyday-size objects as well, but is not noticeable because the lack of precision is extraordinarily tiny. According to Dave Slaven of Morningside College (Sioux City, IA), if a baseball’s speed is known to within a precision of 0.1 mph, the maximum precision to which it is possible to know the ball’s position is 0.000000000000000000000000000008 millimeters.
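
That baseball figure can be checked directly from the usual form of the principle, Δx · Δp ≥ ħ/2, where Δp is the mass times the speed uncertainty and ħ is Planck’s constant divided by 2π; the 145-gram mass below is the standard baseball mass, assumed here for the check.

    # Check of the baseball example against delta_x * delta_p >= hbar / 2.
    HBAR = 1.055e-34          # reduced Planck's constant, J*s
    MASS_KG = 0.145           # standard baseball mass (assumed), ~145 g
    DELTA_V = 0.1 * 0.44704   # 0.1 mph converted to m/s

    delta_p = MASS_KG * DELTA_V               # uncertainty in momentum
    delta_x_mm = HBAR / (2 * delta_p) * 1000  # best possible position precision, in mm

    print(f"best possible position precision: {delta_x_mm:.0e} mm")
    # Prints about 8e-30 mm, matching the figure quoted above --
    # utterly unobservable for an everyday object.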

Onward

The principles of quantization, wave-particle duality and the uncertainty principle ushered in a new era for QM. In 1927, Paul Dirac applied a quantum understanding of electric and magnetic fields, giving rise to the study of “quantum field theory” (QFT), which treated particles (such as photons and electrons) as excited states of an underlying physical field. Work in QFT continued for a decade until scientists hit a roadblock: Many equations in QFT stopped making physical sense because they produced results of infinity. After a decade of stagnation, Hans Bethe made a breakthrough in 1947 using a technique called “renormalization.” Here, Bethe realized that all of the infinite results traced back to two phenomena (specifically “electron self-energy” and “vacuum polarization”), and that the observed values of electron mass and electron charge could be used to absorb them, making the infinities disappear.

Since the breakthrough of renormalization, QFT has served as the foundation for developing quantum theories about the four fundamental forces of nature: 1) electromagnetism, 2) the weak nuclear force, 3) the strong nuclear force and 4) gravity. The first insight provided by QFT was a quantum description of electromagnetism through “quantum electrodynamics” (QED), which made strides in the late 1940s and early 1950s. Next was a quantum description of the weak nuclear force, which was unified with electromagnetism to build “electroweak theory” (EWT) throughout the 1960s. Finally came a quantum treatment of the strong nuclear force using “quantum chromodynamics” (QCD) in the 1960s and 1970s. The theories of QED, EWT and QCD together form the basis of the Standard Model of particle physics. Unfortunately, QFT has yet to produce a quantum theory of gravity. That quest continues today in the studies of string theory and loop quantum gravity.
