Quantum Statistics
7.1 The Gibbs Factor
In the derivation of the Boltzmann factor, energy exchange between a small system and a reservoir was allowed, but particle exchange was not. Now, we modify the derivation to allow particle exchange. This is crucial for understanding systems where the number of particles can fluctuate, such as in chemical reactions or adsorption processes.
The ratio of probabilities for two different microstates is given by:

P(s_2)/P(s_1) = e^{S_R(s_2)/k} / e^{S_R(s_1)/k} = e^{[S_R(s_2) - S_R(s_1)]/k}

The exponent contains the change in the entropy of the reservoir as the system goes from state 1 to state 2.
From the reservoir's viewpoint, the change is infinitesimal, so we invoke the thermodynamic identity:

dS_R = (1/T)(dU_R + P dV_R - \mu dN_R)
This equation relates the change in entropy to changes in energy, volume, and particle number. Each term provides insight into how the reservoir responds to changes in the system.
Any energy, volume, or particles gained by the reservoir must be lost by the system. This conservation principle is fundamental to the derivation.
Throwing away the P dV term (assuming the volume change is zero or negligible) and keeping the \mu dN term, the change in entropy of the reservoir can be written:

S_R(s_2) - S_R(s_1) = -(1/T)[E(s_2) - E(s_1) - \mu N(s_2) + \mu N(s_1)]

Here the change in entropy is expressed in terms of the system's energy and particle-number changes, weighted by temperature and chemical potential; the minus signs appear because whatever the reservoir gains, the system loses.
Plugging this expression into the probability ratio, we get:

P(s_2)/P(s_1) = e^{-[E(s_2) - \mu N(s_2)]/kT} / e^{-[E(s_1) - \mu N(s_1)]/kT}

The ratio of probabilities is again a ratio of simple exponential factors, each a function of the temperature of the reservoir and the energy of the corresponding microstate; but now each factor also depends on the number of particles in the system in state s.
This new exponential factor is called a Gibbs factor:

Gibbs factor = e^{-[E(s) - \mu N(s)]/kT}
The Gibbs factor accounts for both energy and particle number fluctuations, making it essential for systems with variable particle numbers.
For an absolute probability, we need a constant of proportionality in front of the exponential:

P(s) = (1/\mathcal{Z}) e^{-[E(s) - \mu N(s)]/kT}

The quantity \mathcal{Z} is called the grand partition function, or the Gibbs sum:

\mathcal{Z} = \sum_s e^{-[E(s) - \mu N(s)]/kT}

The sum runs over all possible states s, including all possible values of N, which makes \mathcal{Z} a comprehensive descriptor of the system's statistical behavior. If more than one type of particle can be present in the system, the \mu N term becomes a sum over species, \sum_i \mu_i N_i(s), and each subsequent equation is modified in a similar way. For two types of particles, A and B, the Gibbs factor becomes:

e^{-[E(s) - \mu_A N_A(s) - \mu_B N_B(s)]/kT}
This is the extension of the Gibbs factor for multiple species, each with its own chemical potential and particle number.
Carbon Monoxide Poisoning Example
An adsorption site on a hemoglobin molecule, the protein that carries oxygen in the blood, is a good example. A single hemoglobin molecule has four adsorption sites, each of which can carry one O2 molecule. Consider just one of the four sites and pretend that it is completely independent of the other three; this simplification makes the system easy to model.
If oxygen is the only molecule that can occupy the site, the system has just two possible states: unoccupied and occupied. The energies of these two states are 0 and \varepsilon, with \varepsilon \approx -0.7 eV. The negative sign indicates that the bound state is energetically favorable.
The grand partition function for this single-site system has just two terms:

\mathcal{Z} = 1 + e^{-(\varepsilon - \mu)/kT}
The chemical potential is relatively high in the lungs, where oxygen is abundant, but is much lower in the cells where the oxygen is used. This gradient drives the diffusion of oxygen from the lungs to the cells.
Near the lungs, the blood is in approximate diffusive equilibrium with the atmosphere, an ideal gas in which the partial pressure of oxygen is about 0.2 atm. The chemical potential can then be calculated from the ideal-gas formula:

\mu = -kT \ln( V Z_{int} / N v_Q )

which relates the chemical potential to the volume, internal partition function, number of particles, and quantum volume.

The temperature is body temperature, T \approx 310 K, so kT \approx 0.027 eV.

This gives \mu \approx -0.6 eV.

The probability of any given site being occupied is then:

P(occupied) = e^{-(\varepsilon - \mu)/kT} / (1 + e^{-(\varepsilon - \mu)/kT}) \approx 98%
If there is also some carbon monoxide present, there are three states available to the site: unoccupied, occupied by O2, and occupied by CO. The grand partition function is:

\mathcal{Z} = 1 + e^{-(\varepsilon - \mu)/kT} + e^{-(\varepsilon' - \mu')/kT}

Here \varepsilon' is the (negative) energy of a bound CO molecule and \mu' is the chemical potential of CO in the environment. CO is less abundant than oxygen; if it is 100 times less abundant, then its chemical potential is lower by roughly kT \ln 100 \approx 0.12 eV, so \mu' is roughly -0.72 eV. This lower abundance works against CO's binding.

CO is also more tightly bound to the site than oxygen, with \varepsilon' \approx -0.85 eV. This stronger binding affinity is why carbon monoxide is so dangerous.

The probability of the site being occupied by an oxygen molecule therefore drops to:

P(O2) = e^{-(\varepsilon - \mu)/kT} / \mathcal{Z}

far below the 98% found in the absence of CO.
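The competition between the two Gibbs factors can be checked numerically. A minimal sketch, treating the round values quoted in the example as assumed inputs (all energies in eV):

```python
import math

kT = 8.617e-5 * 310            # thermal energy at body temperature, eV

# Round illustrative values from the example (assumed, in eV):
eps_o2, mu_o2 = -0.7, -0.6     # O2 binding energy and chemical potential
eps_co, mu_co = -0.85, -0.72   # CO: more tightly bound, 100x less abundant

g_o2 = math.exp(-(eps_o2 - mu_o2) / kT)   # Gibbs factor for bound O2
g_co = math.exp(-(eps_co - mu_co) / kT)   # Gibbs factor for bound CO

p_o2_alone = g_o2 / (1 + g_o2)            # no CO present: ~0.98
p_o2_with_co = g_o2 / (1 + g_o2 + g_co)   # CO competes for the site: ~0.24

print(p_o2_alone, p_o2_with_co)
```

Even though CO never reaches high concentrations, its larger Gibbs factor crowds oxygen out of a substantial fraction of sites.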
7.2 Bosons and Fermions
An important application of Gibbs factors is to quantum statistics, the study of dense systems in which two or more identical particles have a reasonable chance of wanting to occupy the same single-particle state. This is particularly relevant at low temperatures and high densities.
In this situation, the partition function for a system of N indistinguishable, noninteracting particles,

Z = Z_1^N / N!,

breaks down, because the counting factor of N! is correct only when the particles are always in different single-particle states. The indistinguishability of the particles then requires a more careful treatment.

Consider a system containing two noninteracting particles, either of which can occupy any of five single-particle states. If the two particles are distinguishable, then each has five available states and the total number of system states is 5 \times 5 = 25. If the two particles are indistinguishable, equation 7.16 would predict Z = 25/2 = 12.5, which can't be right: the correct count is 15 (ten states with the particles in different single-particle states, plus five with both in the same state). The classical N! prescription fails when particles are indistinguishable and have a reasonable chance of occupying the same state.
If the particles are indistinguishable, all that matters is the number of particles in any given state. This is a key concept in quantum statistics.
Some types of particles can share a state with another of the same species (bosons), others can’t (fermions). This distinction leads to different statistical behaviors.
The number of identical bosons in a given state is unlimited. Examples of bosons include photons and gluons.
Many types of particles cannot share a state with another particle of the same type—these particles are fermions. Electrons, protons, and neutrons are fermions.
If the particles in the example are identical fermions, then the five system states with both particles in the same single-particle state are not allowed, so Z is only 10, not 15. The Pauli exclusion principle significantly restricts the possible states for fermions.
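The counts 25, 15, and 10 are easy to verify by brute-force enumeration. A quick sketch for two particles in five states:

```python
from itertools import product

states = range(5)   # five single-particle states, two particles

# Distinguishable particles: ordered pairs of single-particle states.
distinguishable = len(list(product(states, repeat=2)))

# Indistinguishable bosons: only the multiset of occupied states matters.
bosons = len({tuple(sorted(p)) for p in product(states, repeat=2)})

# Fermions: additionally forbid double occupancy (Pauli exclusion).
fermions = len({tuple(sorted(p)) for p in product(states, repeat=2)
                if p[0] != p[1]})

print(distinguishable, bosons, fermions)   # 25 15 10
```

Note that 25/2 = 12.5 falls between the boson and fermion counts: the N! correction over-counts the doubly occupied states and under-counts nothing else.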
The rule that two identical fermions cannot occupy the same state is called the Pauli exclusion principle. This principle is fundamental to the structure of atoms and the properties of materials.
Particles with integer spin (0, 1, 2, etc.) are bosons, while particles with half-integer spin (1/2, 3/2, etc.) are fermions. Spin is a fundamental property that dictates the statistical behavior of particles.
Condition Where It Doesn't Matter Whether Particles Are Bosons or Fermions
When the number of available single-particle states is much greater than the number of particles:
Z_1 >> N
the chance of any two particles wanting to occupy the same state is negligible. In this regime, classical statistics provide a good approximation.
For a nonrelativistic ideal gas, the single-particle partition function is Z_1 = Z_{int} V / v_Q, where Z_{int} is some reasonably small number and v_Q is the quantum volume:

v_Q = ( h / \sqrt{2 \pi m k T} )^3

essentially the cube of the average de Broglie wavelength; it provides a measure of when quantum effects become significant.

The condition Z_1 >> N then translates to:

V/N >> v_Q
This says that the average distance between particles must be much greater than the average de Broglie wavelength. When this condition holds, the wave functions of the particles do not significantly overlap.
For the air we breathe, the average distance between molecules is about 3 nm while the average de Broglie wavelength is less than 0.1 nm, so this condition is definitely satisfied. In everyday conditions, air molecules behave classically.
This condition depends not only on the density of the system, but also on the temperature and the mass of the particles, both of which enter through v_Q.

If the gas is sufficiently dense, or v_Q is sufficiently large (low temperature or small particle mass), then the wavefunctions begin to touch and overlap. At this point it starts to matter whether the particles are fermions or bosons; quantum statistics become essential.
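The margin by which air satisfies V/N >> v_Q can be checked directly. A rough sketch, using approximate values for an N2 molecule at room conditions (the specific numbers are assumptions for illustration):

```python
import math

h = 6.626e-34    # Planck constant, J s
k = 1.381e-23    # Boltzmann constant, J/K
m = 4.65e-26     # mass of an N2 molecule, kg (taken as representative of air)
T = 300.0        # room temperature, K
P = 1.013e5      # atmospheric pressure, Pa

v_Q = (h / math.sqrt(2 * math.pi * m * k * T))**3   # quantum volume, m^3
V_over_N = k * T / P                                 # volume per molecule, m^3

ratio = V_over_N / v_Q
print(ratio)   # of order 10^6-10^7: V/N >> v_Q, so air is safely classical
```

Lowering T or raising the density shrinks this ratio; when it approaches 1, Boltzmann statistics can no longer be trusted.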
The Distribution Functions
When a system violates the condition Z_1 >> N, so that we cannot treat it using the methods of Chapter 6, we can use Gibbs factors instead. Gibbs factors provide a more accurate description when quantum effects are significant.
Consider a “system” consisting of one single-particle state, rather than a particle itself. Thus the system will consist of a particular spatial wavefunction (and, for particles with spin, a particular spin orientation). This approach focuses on the occupancy of individual quantum states.
When the state is unoccupied, its energy is 0; when it is occupied by n particles, its energy is n\varepsilon, where \varepsilon is the energy of the single-particle state.

The probability of the state being occupied by n particles is:

P(n) = (1/\mathcal{Z}) e^{-n(\varepsilon - \mu)/kT}

where \mathcal{Z} is the grand partition function, the sum of the Gibbs factors over all allowed n.
Fermions
If the particles in question are fermions, then n can only be 0 or 1, so the grand partition function is:

\mathcal{Z} = 1 + e^{-(\varepsilon - \mu)/kT}

The occupancy of the state, that is, the average number of particles in it, is:

\bar{n} = \sum_n n P(n) = P(1) = e^{-(\varepsilon - \mu)/kT} / (1 + e^{-(\varepsilon - \mu)/kT}) = 1 / (e^{(\varepsilon - \mu)/kT} + 1)

This important formula is called the Fermi-Dirac distribution; I'll call it \bar{n}_{FD}:

\bar{n}_{FD} = 1 / (e^{(\varepsilon - \mu)/kT} + 1)
The Fermi-Dirac distribution describes the probability of a fermion occupying a given energy state.
The Fermi-Dirac distribution goes to zero when \varepsilon >> \mu, and goes to 1 when \varepsilon << \mu: states with energy much less than \mu tend to be occupied, while states with energy much greater than \mu tend to be unoccupied. This behavior reflects the Pauli exclusion principle.

A state with energy exactly equal to \mu has a 50% chance of being occupied, while the width of the fall-off from 1 to 0 is a few times kT. The chemical potential \mu sets the location of the step, and the thermal energy kT broadens it.
Bosons
If instead the particles in question are bosons, then n can be any nonnegative integer, so the grand partition function is a geometric series:

\mathcal{Z} = \sum_{n=0}^{\infty} e^{-n(\varepsilon - \mu)/kT} = 1 / (1 - e^{-(\varepsilon - \mu)/kT})

(Since the Gibbs factors cannot keep growing without limit, \mu must be less than \varepsilon, and therefore the series converges.) The chemical potential is constrained to ensure convergence of the partition function.

Meanwhile, the average number of particles in the state is:

\bar{n} = \sum_n n P(n) = (1/\mathcal{Z}) \sum_n n e^{-n(\varepsilon - \mu)/kT}

Abbreviate x = (\varepsilon - \mu)/kT. Then:

\bar{n} = -(1/\mathcal{Z}) \, \partial\mathcal{Z}/\partial x = (1 - e^{-x}) \cdot \frac{e^{-x}}{(1 - e^{-x})^2} = \frac{1}{e^x - 1}

For bosons, therefore:

\bar{n} = 1 / (e^{(\varepsilon - \mu)/kT} - 1)

This important formula is called the Bose-Einstein distribution; I'll call it \bar{n}_{BE}:

\bar{n}_{BE} = 1 / (e^{(\varepsilon - \mu)/kT} - 1)
The Bose-Einstein distribution describes the average number of bosons occupying a given energy state.
Like the Fermi-Dirac distribution, the Bose-Einstein distribution goes to zero when \varepsilon >> \mu. Unlike the Fermi-Dirac distribution, however, it goes to infinity as \varepsilon approaches \mu from above. This divergence is what makes Bose-Einstein condensation possible.
Boltzmann Statistics
For particles obeying Boltzmann statistics, the probability of any single particle being in a certain state of energy \varepsilon is:

P(s) = (1/Z_1) e^{-\varepsilon/kT}

If there are N independent particles in total, the average number in this state is:

\bar{n} = N P(s) = (N/Z_1) e^{-\varepsilon/kT}

The chemical potential for such a system is \mu = -kT \ln(Z_1/N), so the average occupancy can be written:

\bar{n}_{Boltzmann} = e^{-(\varepsilon - \mu)/kT}

When \varepsilon - \mu is sufficiently greater than kT, so that this exponential is very small, we can neglect the 1 in the denominator of either the Fermi-Dirac distribution or the Bose-Einstein distribution, and both reduce to the Boltzmann distribution; in this classical limit the quantum distributions converge.

The precise condition for the three distributions to agree is that the exponent (\varepsilon - \mu)/kT be much greater than 1. This condition indicates when quantum effects are negligible.
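The convergence of the three distributions is easy to see numerically. A short sketch tabulating the occupancy as a function of x = (\varepsilon - \mu)/kT:

```python
import math

def n_fd(x):      # Fermi-Dirac occupancy, x = (eps - mu)/kT
    return 1.0 / (math.exp(x) + 1.0)

def n_be(x):      # Bose-Einstein occupancy (requires x > 0)
    return 1.0 / (math.exp(x) - 1.0)

def n_boltz(x):   # Boltzmann occupancy
    return math.exp(-x)

for x in (0.1, 1.0, 5.0, 10.0):
    print(f"{x:5.1f}  {n_fd(x):.6f}  {n_be(x):.6f}  {n_boltz(x):.6f}")
# At x = 0.1 the three differ wildly; by x = 10 they are indistinguishable.
```

The ordering is always n_BE > n_Boltzmann > n_FD: bosons are "encouraged" and fermions "discouraged" from multiple occupancy relative to the classical count.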
We now know how to compute the average number of particles occupying a single-particle state, whether the particles are fermions or bosons, in terms of the energy of the state, the temperature, and the chemical potential. Energy, temperature, and chemical potential are the key parameters determining state occupancy.
7.3 Degenerate Fermi Gases
Consider a "gas" of fermions at very low temperature. The fermions could be helium-3 atoms, protons and neutrons in an atomic nucleus, electrons in a white dwarf star, or neutrons in a neutron star. Degenerate Fermi gases exist in a variety of extreme environments.
The familiar example, though, is the conduction electrons inside a chunk of metal. Understanding degenerate Fermi gases is crucial for describing the behavior of electrons in metals.
“Very low temperature” means that Boltzmann statistics do not apply. At these temperatures, quantum effects dominate.
Zero Temperature
At T = 0, the Fermi-Dirac distribution becomes a step function; this simplification allows easy calculation of ground-state properties.

All single-particle states with energy less than \mu are occupied, while all states with energy greater than \mu are unoccupied. In this context \mu is also called the Fermi energy, denoted \varepsilon_F:

\varepsilon_F \equiv \mu(T = 0)
The Fermi energy is a fundamental property of degenerate Fermi gases.
When a gas of fermions is so cold that nearly all states below are occupied while nearly all states above are unoccupied, it is said to be degenerate. Degeneracy implies that the system's behavior is dominated by quantum statistics.
The value of \varepsilon_F is determined by the total number of electrons present: the Fermi energy depends directly on the particle density of the system.

Because \bar{n}_{FD} = 1/2 when \varepsilon = \mu, identifying the Fermi energy with the chemical potential at T = 0 makes perfect physical sense: the occupancy drops from 1 to 0 exactly at \mu, with all the electrons packed into the lowest-energy states.
Calculating Fermi Energy
We now calculate the total energy and pressure of the electron gas, assuming that the electrons are free particles confined inside a box of volume V = L^3. This model simplifies the analysis of electrons in metals.
The definite-energy wavefunctions of a free electron inside a box are just sine waves. Free electrons are modeled as particles in a box.
For a one-dimensional box the allowed wavelengths and momenta are (as before):

\lambda_n = 2L/n,    p_n = h/\lambda_n = hn/2L

In a three-dimensional box these equations apply separately to the x, y, and z directions, so:

p_x = h n_x / 2L,    p_y = h n_y / 2L,    p_z = h n_z / 2L

where (n_x, n_y, n_z) is a triplet of positive integers; these quantum numbers define the allowed states in the box.

The allowed energies are therefore:

\varepsilon = |\vec{p}|^2 / 2m = (h^2 / 8mL^2)(n_x^2 + n_y^2 + n_z^2)
Each allowed vector \vec{n} = (n_x, n_y, n_z) corresponds to a point in n-space with positive integer coordinates. Each lattice point actually represents two states, since for each spatial wavefunction there are two independent spin orientations; spin degeneracy doubles the number of available states.

The energy of any state is proportional to the square of its distance from the origin in n-space, n^2 = n_x^2 + n_y^2 + n_z^2.
The occupied region of n-space is essentially an eighth of a sphere, with some radius n_{max}; this geometric picture helps visualize the distribution of occupied states.

The Fermi energy is the energy at the surface of this sphere:

\varepsilon_F = h^2 n_{max}^2 / 8mL^2

The total number of occupied states is twice the volume of this eighth-sphere (because of the two spin orientations):

N = 2 \cdot (1/8) \cdot (4/3)\pi n_{max}^3 = \pi n_{max}^3 / 3

Combining these two equations gives the Fermi energy as a function of N and the volume V = L^3 of the box:

\varepsilon_F = (h^2 / 8m)(3N / \pi V)^{2/3}
This equation shows that the Fermi energy depends on N and V only through the electron density N/V.

If you plug in some numbers, you'll find that the Fermi energy for conduction electrons in a typical metal is a few electron-volts.

The average energy of the electrons is (3/5)\varepsilon_F, a substantial fraction of the Fermi energy.

The temperature that an ordinary gas would have to have in order for its thermal energy kT to equal \varepsilon_F is called the Fermi temperature:

T_F \equiv \varepsilon_F / k
The Fermi temperature is a characteristic temperature scale for degenerate Fermi gases.
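"Plugging in some numbers" can be sketched as follows, using the conduction-electron density of copper (about 8.47 \times 10^{28} m^{-3}) as an assumed illustrative value:

```python
import math

h = 6.626e-34        # Planck constant, J s
m_e = 9.109e-31      # electron mass, kg
k_B = 1.381e-23      # Boltzmann constant, J/K
n = 8.47e28          # conduction-electron density of copper, m^-3 (assumed)

# eps_F = (h^2 / 8m) * (3N / pi V)^(2/3), with N/V = n
eps_F = (h**2 / (8 * m_e)) * (3 * n / math.pi)**(2 / 3)
eps_F_eV = eps_F / 1.602e-19     # convert J to eV
T_F = eps_F / k_B                # Fermi temperature, K

print(eps_F_eV, T_F)   # roughly 7 eV and 8e4 K
```

Since T_F is tens of thousands of kelvins, a metal at room temperature is very much in the degenerate regime kT << \varepsilon_F.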
The pressure of a degenerate electron gas follows from P = -(\partial U / \partial V)_{S,N}, with U = (3/5) N \varepsilon_F:

P = (2/5)(N \varepsilon_F / V) = 2U / 3V

This degeneracy pressure arises from the quantum mechanical confinement of the electrons: compressing the gas raises \varepsilon_F and hence the energy.
Degeneracy pressure is what keeps matter from collapsing under the huge electrostatic forces that try to pull electrons and protons together. This pressure is crucial for the stability of white dwarf and neutron stars.
The bulk modulus, the change in pressure when the material is compressed divided by the fractional change in volume, is:

B = -V (\partial P / \partial V)_T = (10/9)(U/V)

This formula actually agrees with experiment, within a factor of 3 or so, for most metals; the bulk modulus measures the material's resistance to compression.
Small Nonzero Temperatures
One property of a Fermi gas that cannot be calculated at T = 0 is the heat capacity, since this is a measure of how the energy of the system depends on T.

At temperature T, a particle would typically like to acquire a thermal energy of roughly kT; this thermal energy smears the sharp step in the occupancy.

The only electrons that can actually acquire some thermal energy are those that are already within about kT of the Fermi energy; these can jump up into unoccupied states above \varepsilon_F. Only electrons near the Fermi level contribute significantly to the thermal properties.
The additional energy is roughly (number of affected electrons) \times (energy gained by each) \sim (N kT/\varepsilon_F)(kT).

So the total energy of a degenerate Fermi gas for kT << \varepsilon_F is (with the exact coefficient supplied by the Sommerfeld expansion below):

U = (3/5) N \varepsilon_F + (\pi^2/4) N (kT)^2 / \varepsilon_F

From this result we can easily calculate the heat capacity:

C_V = (\partial U / \partial T)_V = \frac{\pi^2 N k^2 T}{2 \varepsilon_F}

Notice that the heat capacity is linear in T and goes to zero at T = 0, as required by the third law. This behavior is characteristic of degenerate Fermi gases.
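To see how small this electronic heat capacity is, the following sketch evaluates C/Nk at room temperature for a metal with an assumed Fermi energy of 7 eV and compares it with the classical equipartition value 3/2:

```python
import math

k = 8.617e-5      # Boltzmann constant, eV/K
eps_F = 7.0       # assumed Fermi energy of a typical metal, eV
T = 300.0         # room temperature, K

# Electronic heat capacity per electron, in units of k:
# C/(N k) = (pi^2 / 2) * kT / eps_F
c_per_k = (math.pi**2 / 2) * k * T / eps_F

print(c_per_k, c_per_k / 1.5)   # ~0.018, i.e. about 1% of the classical 3/2
```

This suppression by a factor of order kT/\varepsilon_F explains why the electronic contribution to a metal's heat capacity is negligible at room temperature, yet dominates at liquid-helium temperatures where the lattice contribution (\propto T^3) dies off faster.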
The Density of States
To better visualize the behavior of a Fermi gas at small nonzero temperatures, we introduce the density of states, a crucial tool for understanding the properties of Fermi gases.
Changing variables from n to \varepsilon, the energy integral for a Fermi gas at zero temperature becomes:

U = \int_0^{\varepsilon_F} \varepsilon \, g(\varepsilon) \, d\varepsilon

The quantity g(\varepsilon) has a nice interpretation: it is the number of single-particle states per unit energy, called the density of states. For a free electron gas:

g(\varepsilon) = \frac{\pi (8m)^{3/2}}{2h^3} V \sqrt{\varepsilon} = \frac{3N}{2\varepsilon_F^{3/2}} \sqrt{\varepsilon}

The density of states quantifies the number of available states at a given energy. To get the total number of electrons, just integrate it up to the Fermi energy:

N = \int_0^{\varepsilon_F} g(\varepsilon) \, d\varepsilon

But what if T is nonzero? Then we need to multiply g(\varepsilon) by the probability of a state with that energy being occupied, that is, by the Fermi-Dirac distribution function, and integrate all the way up to infinity, since any state could conceivably be occupied:

N = \int_0^{\infty} g(\varepsilon) \, \bar{n}_{FD}(\varepsilon) \, d\varepsilon

And to get the total energy of all the electrons:

U = \int_0^{\infty} \varepsilon \, g(\varepsilon) \, \bar{n}_{FD}(\varepsilon) \, d\varepsilon
The Sommerfeld Expansion
The method for calculating the chemical potential and total energy of a free electron gas in the limit kT << \varepsilon_F is the Sommerfeld expansion, a systematic way to approximate integrals involving the Fermi-Dirac distribution.
Start with the integral for N:

N = \int_0^{\infty} g(\varepsilon) \, \bar{n}_{FD}(\varepsilon) \, d\varepsilon = g_0 \int_0^{\infty} \varepsilon^{1/2} \, \bar{n}_{FD}(\varepsilon) \, d\varepsilon,  where  g_0 \equiv 3N / 2\varepsilon_F^{3/2}

Use integration by parts, trading \varepsilon^{1/2} for (2/3)\varepsilon^{3/2} and \bar{n}_{FD} for its derivative, which is sharply peaked at \varepsilon = \mu:

N = (2/3) g_0 \int_0^{\infty} \varepsilon^{3/2} \left( -\frac{d\bar{n}_{FD}}{d\varepsilon} \right) d\varepsilon

Now change the integration variable to x = (\varepsilon - \mu)/kT:

N = (2/3) g_0 \int_{-\mu/kT}^{\infty} (\mu + xkT)^{3/2} \, \frac{e^x}{(e^x + 1)^2} \, dx

Since kT << \varepsilon_F, the lower limit can be extended to -\infty with negligible error.

Expand (\mu + xkT)^{3/2} in powers of xkT/\mu and integrate term by term; the factor e^x/(e^x + 1)^2 is even in x, so the odd terms vanish.

After assembling the pieces:

N = g_0 \left[ (2/3)\mu^{3/2} + (\pi^2/12)(kT)^2 \mu^{-1/2} + \cdots \right]

Then the final results, to second order in kT/\varepsilon_F:

\mu = \varepsilon_F \left[ 1 - \frac{\pi^2}{12}\left(\frac{kT}{\varepsilon_F}\right)^2 + \cdots \right],    U = \frac{3}{5} N \varepsilon_F + \frac{\pi^2}{4} N \frac{(kT)^2}{\varepsilon_F} + \cdots
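The Sommerfeld result for \mu can be checked against a direct numerical evaluation of the integral for N. A sketch working in units where \varepsilon_F = 1, solving N(\mu) = N by bisection (the grid size and cutoff are arbitrary choices):

```python
import math

def n_fd(eps, mu, t):
    """Fermi-Dirac occupancy; energies in units of eps_F, t = kT/eps_F."""
    return 1.0 / (math.exp((eps - mu) / t) + 1.0)

def electron_count(mu, t, emax=5.0, steps=8000):
    # N/N_total = (3/2) * integral_0^inf sqrt(e) * n_FD(e) de  (eps_F = 1),
    # evaluated by the midpoint rule on [0, emax].
    de = emax / steps
    return 1.5 * sum(math.sqrt((i + 0.5) * de) * n_fd((i + 0.5) * de, mu, t) * de
                     for i in range(steps))

def mu_numeric(t):
    lo, hi = 0.0, 2.0              # bisect electron_count(mu, t) = 1
    for _ in range(50):
        mid = 0.5 * (lo + hi)
        if electron_count(mid, t) < 1.0:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

t = 0.1                            # kT in units of eps_F
mu_num = mu_numeric(t)
mu_som = 1.0 - (math.pi**2 / 12.0) * t**2
print(mu_num, mu_som)              # agree to better than 1e-3
```

The agreement degrades as t grows, as expected for an expansion in powers of kT/\varepsilon_F; note also that \mu indeed falls below \varepsilon_F at nonzero temperature.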
7.4 Blackbody Radiation
We now analyze the electromagnetic radiation inside some "box" (like an oven or kiln) at a given temperature. Blackbody radiation is a fundamental phenomenon in thermal physics.
The Ultraviolet Catastrophe
In classical physics, we treat electromagnetic radiation as a continuous “field” that permeates all space. Classical physics fails to correctly describe the high-frequency behavior of blackbody radiation.
Inside a box, we can think of this field as a combination of various standing-wave patterns. The electromagnetic field can be decomposed into a superposition of standing waves.
Each standing-wave pattern behaves as a harmonic oscillator with some frequency f; each mode of the electromagnetic field acts as a harmonic oscillator.

Each electromagnetic standing wave has two degrees of freedom, so classical equipartition assigns it an average thermal energy of 2 \times (kT/2) = kT.

Since the total number of oscillators in the electromagnetic field is infinite, the total thermal energy should also be infinite.

This absurd prediction is called the ultraviolet catastrophe.
The Planck Distribution
In quantum mechanics, a harmonic oscillator can't have just any amount of energy; measured up from the ground state, its allowed energy levels are:

E_n = 0, \, hf, \, 2hf, \, \ldots
Energy is quantized in quantum mechanics, resolving the ultraviolet catastrophe.
The partition function for a single oscillator is therefore a geometric series:

Z = 1 + e^{-\beta h f} + e^{-2\beta h f} + \cdots = \frac{1}{1 - e^{-\beta h f}}

The average energy is:

\bar{E} = -\frac{1}{Z} \frac{\partial Z}{\partial \beta} = \frac{hf}{e^{\beta h f} - 1}

where \beta = 1/kT.

If we think of the energy as coming in "units" of hf, then the average number of units of energy in the oscillator is:

\bar{n}_{Pl} = \frac{1}{e^{hf/kT} - 1}

This formula is called the Planck distribution.
The Planck distribution gives the average occupancy of each mode of the field, and from it the spectrum of blackbody radiation follows.
According to the Planck distribution, short-wavelength modes of the electromagnetic field, with hf » kT, are exponentially suppressed. This suppression resolves the ultraviolet catastrophe.
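The exponential suppression is easy to see by evaluating the Planck distribution at a few values of hf/kT. A minimal sketch:

```python
import math

def n_planck(x):
    """Average number of energy units hf in one mode, with x = hf/kT."""
    return 1.0 / (math.exp(x) - 1.0)

for x in (0.1, 1.0, 10.0):
    print(x, n_planck(x))
# Low-frequency modes (hf << kT) hold many units of energy (~kT/hf);
# high-frequency modes (hf >> kT) are exponentially suppressed.
```

It is this cutoff at hf ~ kT that keeps the sum over modes finite and resolves the ultraviolet catastrophe.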
Photons
“Units” of energy in the electromagnetic field can also be thought of as particles, called photons. Photons are the quanta of electromagnetic radiation.
They are bosons, so the number of them in any "mode" or wave pattern of the field ought to be given by the Bose-Einstein distribution:

\bar{n}_{BE} = \frac{1}{e^{(\varepsilon - \mu)/kT} - 1}

with \varepsilon = hf. Comparison with the Planck distribution (equation 7.72) therefore requires \mu = 0: the chemical potential of photons is zero.
One way to see why \mu = 0 for photons uses the free energy. Because the free energy must attain the minimum possible value at equilibrium with T and V held fixed, and the number of photons is not conserved, N will take on whatever value minimizes F. Hence:

\mu = (\partial F / \partial N)_{T,V} = 0

A second way uses the condition for chemical equilibrium. Photons (\gamma) are created and absorbed by electrons in the walls of the box:

e \leftrightarrow e + \gamma

As we saw earlier, the equilibrium condition for such a reaction is the same as the reaction equation, with the name of each species replaced by its chemical potential:

\mu_e = \mu_e + \mu_\gamma,  and therefore  \mu_\gamma = 0
Summing over Modes
The Planck distribution tells us how many photons occupy any single "mode" (or "single-particle state") of the electromagnetic field. To compute the total energy or the total number of photons in the box, we must sum over all modes.