Microcanonical Ensemble: Complete Statistical Mechanics Guide

Mastering Entropy Calculations, Chemical Potential, Gibbs Paradox, and Sackur-Tetrode Equation

📜 Historical Background

The development of statistical mechanics and the microcanonical ensemble concept transformed our understanding of thermodynamics:

  • Ludwig Boltzmann (1870s): Developed statistical interpretation of entropy
  • J. Willard Gibbs (1902): Formulated ensemble theory and resolved Gibbs paradox
  • Otto Sackur & Hugo Tetrode (1912): Derived the entropy formula for ideal gases
  • Max Planck: Contributed to quantum statistics and constant determination

These developments established the foundation of statistical mechanics, connecting microscopic physics with macroscopic thermodynamics.

Introduction to Microcanonical Ensemble

🔬 What is the Microcanonical Ensemble?

The microcanonical ensemble describes an isolated system with fixed energy \(E\), volume \(V\), and number of particles \(N\). All microstates accessible to the system are equally probable, according to the fundamental postulate of statistical mechanics.

This ensemble is particularly useful for understanding the foundations of statistical mechanics and deriving fundamental thermodynamic relations.

💡 Key Insight

The microcanonical ensemble represents systems that are completely isolated from their environment - no energy, volume, or particle exchange. This makes it the simplest ensemble conceptually, though often mathematically challenging for calculations.

Entropy in Microcanonical Ensemble

🌊 Boltzmann Entropy Formula

The entropy of a system in the microcanonical ensemble is given by Boltzmann's famous formula:

\[ S = k_B \ln \Omega \]

where \( \Omega \) is the number of microstates accessible to the system with energy between \(E\) and \(E + \delta E\), and \(k_B\) is Boltzmann's constant.
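As a concrete illustration of the formula, here is a minimal numerical sketch for a toy isolated system: \(N\) two-state spins with exactly half of them up, so that \(\Omega = \binom{N}{N/2}\). The model and the use of `math.lgamma` to evaluate \(\ln \Omega\) without overflow are illustrative choices, not part of the text above.

```python
import math

k_B = 1.380649e-23  # J/K

def boltzmann_entropy(ln_omega):
    """S = k_B ln(Omega), taking ln(Omega) directly to avoid overflow."""
    return k_B * ln_omega

# Toy isolated system: N two-state spins with exactly N/2 spins up,
# so Omega = C(N, N/2); lgamma gives ln(N!) for large N.
N = 10**6
ln_omega = math.lgamma(N + 1) - 2 * math.lgamma(N // 2 + 1)

S = boltzmann_entropy(ln_omega)
print(S, N * k_B * math.log(2))  # S approaches N k_B ln 2 for large N
```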

🧮 Entropy Derivation for Ideal Gas

Step 1: Basic Entropy Expression

For a system with indistinguishable particles in the microcanonical ensemble:

\[ \sigma = N k_B \left[ \ln \left( \frac{V}{N} \left( \frac{2\pi m k_B T}{h^2} \right)^{3/2} \right) + \frac{5}{2} \right] \]

Step 2: Thermal Wavelength Definition

\[ \lambda = \frac{h}{\sqrt{2\pi m k_B T}} \]

This is the thermal de Broglie wavelength associated with gas molecules at temperature \(T\).

Step 3: Simplified Entropy Expression

\[ \sigma = N k_B \left[ \ln \left( \frac{V}{N \lambda^3} \right) + \frac{5}{2} \right] \]
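A short numerical sketch of the thermal-wavelength definition; the helium and argon masses and the temperature of 300 K are assumed example inputs:

```python
import math

h = 6.626e-34    # J s
k_B = 1.381e-23  # J/K

def thermal_wavelength(m, T):
    """Thermal de Broglie wavelength: lambda = h / sqrt(2 pi m k_B T)."""
    return h / math.sqrt(2 * math.pi * m * k_B * T)

# Lighter particles and lower temperatures give longer wavelengths.
for name, m in [("He", 6.65e-27), ("Ar", 6.63e-26)]:
    print(name, thermal_wavelength(m, 300.0))  # ~5.0e-11 m and ~1.6e-11 m
```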

Chemical Potential Calculations

⚗️ What is Chemical Potential?

Chemical potential \( \mu \) is the change in system energy when a particle is added at constant entropy and volume, \( \mu = \left( \frac{\partial E}{\partial N} \right)_{S,V} \). Equivalently, in the microcanonical ensemble:

\[ \mu = -T \left( \frac{\partial S}{\partial N} \right)_{E,V} \]

It quantifies the "escaping tendency" of particles from a system and plays a crucial role in phase equilibria and chemical reactions.

Derivation of Chemical Potential

🧮 Chemical Potential Derivation

Step 1: Starting with Entropy Expression

For an ideal gas in the microcanonical ensemble:

\[ \sigma = N k_B \left[ \ln \left( \frac{V}{N \lambda^3} \right) + \frac{5}{2} \right] \]

Step 2: Differentiating with Respect to N

\[ \frac{\partial \sigma}{\partial N} = k_B \left[ \ln \left( \frac{V}{N \lambda^3} \right) + \frac{5}{2} \right] + N k_B \left[ \frac{1}{\frac{V}{N \lambda^3}} \cdot \left( -\frac{V}{N^2 \lambda^3} \right) \right] \]

Step 3: Simplifying the Expression

\[ = k_B \left[ \ln \left( \frac{V}{N \lambda^3} \right) + \frac{5}{2} - 1 \right] \]
\[ = k_B \left[ \ln \left( \frac{V}{N \lambda^3} \right) + \frac{3}{2} \right] \]

Step 4: Chemical Potential Formula

The chemical potential follows from

\[ \mu = -T \left( \frac{\partial S}{\partial N} \right)_{E,V} \]

The derivative in Steps 2 and 3 was taken at constant \(T\) rather than constant \(E\). At fixed \(E = \frac{3}{2} N k_B T\), the temperature falls as particles are added, and the resulting change of \(\lambda \propto T^{-1/2}\) contributes an extra \(-\frac{3}{2} k_B\) that cancels the \(\frac{3}{2}\) of Step 3:

\[ \left( \frac{\partial S}{\partial N} \right)_{E,V} = k_B \ln \left( \frac{V}{N \lambda^3} \right) \]

Equivalently, one may use \( \mu = \left( \frac{\partial F}{\partial N} \right)_{T,V} \), where \( \left( \frac{\partial U}{\partial N} \right)_{T,V} = \frac{3}{2} k_B T \) supplies the same cancellation.

Step 5: Final Simplified Form

\[ \mu = -k_B T \ln \left( \frac{V}{N \lambda^3} \right) = k_B T \ln \left( \frac{N}{V} \lambda^3 \right) \]
\[ = k_B T \ln \left( n \lambda^3 \right) = k_B T \ln \left( \frac{n}{n_Q} \right) \]
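As a check on the cancellation in Step 4, the following sketch (assuming SymPy is available) writes the Sackur-Tetrode entropy in the microcanonical variables energy, volume, and particle number (the energy is called `U` in the code), differentiates with respect to \(N\) at fixed energy and volume, and confirms that \(\mu = -T(\partial S/\partial N)_{E,V}\) reduces to \(k_B T \ln(n\lambda^3)\):

```python
import sympy as sp

kB, h, m, T, U, V, N = sp.symbols('k_B h m T U V N', positive=True)

# Sackur-Tetrode entropy in terms of the microcanonical variables U (total energy), V, N
S = N * kB * (sp.log(V / N * (4 * sp.pi * m * U / (3 * N * h**2))**sp.Rational(3, 2))
              + sp.Rational(5, 2))

# mu = -T (dS/dN) at fixed U, V; afterwards substitute U = (3/2) N k_B T
mu = (-T * sp.diff(S, N)).subs(U, sp.Rational(3, 2) * N * kB * T)

lam = h / sp.sqrt(2 * sp.pi * m * kB * T)   # thermal de Broglie wavelength
expected = kB * T * sp.log(N / V * lam**3)  # k_B T ln(n lambda^3)

print(sp.simplify(sp.expand_log(mu - expected, force=True)))  # -> 0
```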

Quantum Density Interpretation

🔍 Quantum Concentration

The quantum concentration \( n_Q \) is the particle density at which the mean interparticle spacing equals the thermal de Broglie wavelength:

\[ n_Q = \frac{1}{\lambda^3} = \left( \frac{2\pi m k_B T}{h^2} \right)^{3/2} \]

When \( n > n_Q \), quantum effects become important as wavefunctions overlap significantly.

📊 Physical Interpretation

The chemical potential \( \mu = k_B T \ln(n/n_Q) \) shows that:

  • When \( n < n_Q \) (dilute gas), \( \mu < 0 \)
  • When \( n = n_Q \), \( \mu = 0 \)
  • When \( n > n_Q \) (dense system), \( \mu > 0 \)

This reflects how "willing" particles are to join the system based on density.

Gibbs Paradox

⚖️ The Entropy Mixing Paradox

Gibbs paradox highlights an unexpected behavior in entropy when mixing identical versus different gases, resolved by correctly accounting for particle indistinguishability in quantum statistics.

Mixing of Different Gases

🧮 Entropy of Mixing Different Gases

Step 1: Initial and Final States

Consider two different ideal gases \((N_1, V_1, T)\) and \((N_2, V_2, T)\) allowed to mix by removing a partition:

\[ \text{Final volume: } V = V_1 + V_2 \]

Step 2: Entropy Change Calculation

\[ \Delta S = S_{\text{final}} - S_{\text{initial}} \]
\[ = \left[ S_1(N_1, V, T) + S_2(N_2, V, T) \right] - \left[ S_1(N_1, V_1, T) + S_2(N_2, V_2, T) \right] \]

Step 3: Using Ideal Gas Entropy

\[ S = N k_B \left[ \ln \left( \frac{V}{N \lambda^3} \right) + \frac{5}{2} \right] \]
\[ \Delta S = N_1 k_B \ln \left( \frac{V}{V_1} \right) + N_2 k_B \ln \left( \frac{V}{V_2} \right) \]

Step 4: For Equal Volumes and Particles

\[ V_1 = V_2 = \frac{V}{2}, \quad N_1 = N_2 = N \]
\[ \Delta S = N k_B \ln 2 + N k_B \ln 2 = 2N k_B \ln 2 \]

Mixing of Identical Gases

🧮 Entropy of Mixing Identical Gases

Step 1: Initial and Final States

Consider two portions of the same ideal gas, \((N_1, V_1, T)\) and \((N_2, V_2, T)\), allowed to mix:

\[ \text{Final state: } N = N_1 + N_2, \quad V = V_1 + V_2 \]

Step 2: Entropy Change Calculation

\[ \Delta S = S(N, V, T) - \left[ S(N_1, V_1, T) + S(N_2, V_2, T) \right] \]

Step 3: Using Correct Entropy Formula

\[ S = N k_B \left[ \ln \left( \frac{V}{N \lambda^3} \right) + \frac{5}{2} \right] \]
\[ \Delta S = (N_1 + N_2) k_B \ln \left( \frac{V}{N_1 + N_2} \right) - N_1 k_B \ln \left( \frac{V_1}{N_1} \right) - N_2 k_B \ln \left( \frac{V_2}{N_2} \right) \]

Step 4: For Equal Volumes and Densities

\[ V_1 = V_2 = \frac{V}{2}, \quad N_1 = N_2 = N, \quad \frac{N_1}{V_1} = \frac{N_2}{V_2} = \frac{N}{V} \]
\[ \Delta S = 2N k_B \ln \left( \frac{V}{2N} \right) - N k_B \ln \left( \frac{V/2}{N} \right) - N k_B \ln \left( \frac{V/2}{N} \right) \]
\[ = 2N k_B \ln \left( \frac{V}{2N} \right) - 2N k_B \ln \left( \frac{V}{2N} \right) = 0 \]
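A small numerical sketch of both mixing calculations, using purely illustrative values of \(N\), \(V\), and \(\lambda\):

```python
import math

k_B = 1.381e-23  # J/K

def S_ideal(N, V, lam):
    """Sackur-Tetrode entropy: S = N k_B [ln(V / (N lam^3)) + 5/2]."""
    return N * k_B * (math.log(V / (N * lam**3)) + 2.5)

N, V_half, lam = 1e22, 1e-3, 1.6e-11  # illustrative numbers only

# Different gases: each species expands from V/2 into the full volume V
dS_diff = 2 * (S_ideal(N, 2 * V_half, lam) - S_ideal(N, V_half, lam))

# Identical gases: the final state is a single gas of 2N particles in V
dS_same = S_ideal(2 * N, 2 * V_half, lam) - 2 * S_ideal(N, V_half, lam)

print(dS_diff, 2 * N * k_B * math.log(2))  # equal: 2 N k_B ln 2
print(dS_same)                             # ~0 (up to floating-point error)
```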

Resolution of the Paradox

🔍 Quantum Resolution

The resolution of Gibbs paradox lies in the quantum mechanical principle of particle indistinguishability. For identical particles, we must divide the classical phase space volume by \(N!\) to avoid overcounting microstates:

\[ \Omega_{\text{correct}} = \frac{\Omega_{\text{classical}}}{N!} \]

This leads to the correct entropy expression that resolves the paradox.

💡 Key Insight

The Gibbs paradox shows that the naive classical counting of microstates makes entropy non-extensive and predicts a spurious entropy of mixing even for identical gases. The resolution requires the quantum mechanical indistinguishability of identical particles (the \(1/N!\) correction), which restores extensivity and was not understood in classical thermodynamics.

Sackur-Tetrode Equation

📐 Complete Entropy Expression

The Sackur-Tetrode equation provides the exact expression for the entropy of a monatomic ideal gas:

\[ S = N k_B \left[ \ln \left( \frac{V}{N} \left( \frac{2\pi m k_B T}{h^2} \right)^{3/2} \right) + \frac{5}{2} \right] \]

This equation satisfies all thermodynamic requirements and correctly accounts for quantum effects through Planck's constant \(h\).

🧮 Sackur-Tetrode Derivation

Step 1: Phase Space Volume

The number of microstates for \(N\) particles in volume \(V\) with energy less than \(E\):

\[ \Omega(E) = \frac{1}{N!} \frac{V^N}{h^{3N}} \frac{(2\pi m E)^{3N/2}}{\left( \frac{3N}{2} \right)!} \]

Step 2: Using Stirling's Approximation

\[ \ln N! \approx N \ln N - N \]
\[ \ln \left( \frac{3N}{2} \right)! \approx \frac{3N}{2} \ln \left( \frac{3N}{2} \right) - \frac{3N}{2} \]

Step 3: Entropy Calculation

\[ S = k_B \ln \Omega \]
\[ = k_B \left[ -N \ln N + N + N \ln V - 3N \ln h + \frac{3N}{2} \ln (2\pi m E) - \frac{3N}{2} \ln \left( \frac{3N}{2} \right) + \frac{3N}{2} \right] \]

Step 4: Simplifying and Using \(E = \frac{3}{2} N k_B T\)

\[ S = N k_B \left[ \ln \left( \frac{V}{N} \right) + \frac{3}{2} \ln \left( \frac{2\pi m k_B T}{h^2} \right) + \frac{5}{2} \right] \]
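The derivation can be spot-checked numerically. The sketch below evaluates \(k_B \ln \Omega\) directly from the phase-space count of Step 1 (using `math.lgamma` for the factorials rather than Stirling's approximation) and compares it with the closed-form result; the argon-like parameters are assumptions chosen only for illustration.

```python
import math

k_B, h = 1.381e-23, 6.626e-34  # J/K, J s

def ln_Omega(E, V, N, m):
    """ln of Omega(E) = V^N (2 pi m E)^{3N/2} / (N! h^{3N} (3N/2)!),
    with lgamma used in place of Stirling's approximation."""
    return (N * math.log(V) + 1.5 * N * math.log(2 * math.pi * m * E)
            - 3 * N * math.log(h)
            - math.lgamma(N + 1) - math.lgamma(1.5 * N + 1))

def sackur_tetrode(T, V, N, m):
    """Closed form: S = N k_B [ln(V/N (2 pi m k_B T / h^2)^{3/2}) + 5/2]."""
    return N * k_B * (math.log(V / N * (2 * math.pi * m * k_B * T / h**2)**1.5) + 2.5)

# Argon-like gas; both routes should agree for large N
T, V, N, m = 273.0, 0.0224, 6.022e23, 6.63e-26
E = 1.5 * N * k_B * T

print(k_B * ln_Omega(E, V, N, m), sackur_tetrode(T, V, N, m))  # ~153 J/K each
```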

Statistical Interpretation of Entropy

🔢 Boltzmann's Statistical Entropy

Boltzmann's entropy formula connects thermodynamics with statistical mechanics:

\[ S = k_B \ln W \]

where \(W\) is the number of microstates corresponding to a given macrostate. This formula is engraved on Boltzmann's tombstone as a testament to its fundamental importance.

📈 Information Theory

Shannon entropy in information theory is directly analogous to thermodynamic entropy, measuring uncertainty or information content.

🌌 Cosmology

Entropy plays a crucial role in understanding the arrow of time and the evolution of the universe from low to high entropy states.

🔬 Nanotechnology

Statistical mechanics provides the foundation for understanding fluctuations and thermodynamics at the nanoscale.

Relation Between Thermodynamic and Statistical Entropy

⚖️ Connecting Microscopic and Macroscopic

The statistical definition of entropy must be consistent with classical thermodynamics. For an ideal gas, we can verify this consistency:

🧮 Consistency Check

Step 1: Thermodynamic Definition

\[ dS = \frac{1}{T} dU + \frac{P}{T} dV - \frac{\mu}{T} dN \]

Step 2: For Ideal Gas

\[ U = \frac{3}{2} N k_B T \]
\[ P = \frac{N k_B T}{V} \]
\[ \mu = k_B T \ln \left( \frac{N}{V} \lambda^3 \right) \]

Step 3: Substituting into the Thermodynamic Relation

With \( dU = \frac{3}{2} k_B \left( N\,dT + T\,dN \right) \):

\[ dS = \frac{1}{T} \left( \frac{3}{2} N k_B\,dT + \frac{3}{2} k_B T\,dN \right) + \frac{N k_B T}{V T}\,dV - \frac{k_B T \ln \left( \frac{N}{V} \lambda^3 \right)}{T}\,dN \]
\[ = \frac{3}{2} N k_B \frac{dT}{T} + N k_B \frac{dV}{V} + k_B \left[ \ln \left( \frac{V}{N \lambda^3} \right) + \frac{3}{2} \right] dN \]

Step 4: Integrating to Recover Sackur-Tetrode

This differential can be integrated to recover the Sackur-Tetrode equation, confirming consistency between thermodynamic and statistical definitions.
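A finite-difference sketch of this consistency check, with assumed argon-like parameters: the numerical partial derivatives of the Sackur-Tetrode entropy should reproduce the coefficients of \(dT\), \(dV\), and \(dN\) in Step 3.

```python
import math

k_B, h = 1.381e-23, 6.626e-34
m = 6.63e-26  # argon-like mass in kg (illustrative)

def S(T, V, N):
    """Sackur-Tetrode entropy."""
    return N * k_B * (math.log(V / N * (2 * math.pi * m * k_B * T / h**2)**1.5) + 2.5)

T, V, N = 300.0, 0.025, 6.0e23
eps = 1e-6

# Central-difference estimates of the partial derivatives of S(T, V, N)
dS_dT = (S(T * (1 + eps), V, N) - S(T * (1 - eps), V, N)) / (2 * T * eps)
dS_dV = (S(T, V * (1 + eps), N) - S(T, V * (1 - eps), N)) / (2 * V * eps)
dS_dN = (S(T, V, N * (1 + eps)) - S(T, V, N * (1 - eps))) / (2 * N * eps)

lam = h / math.sqrt(2 * math.pi * m * k_B * T)

print(dS_dT, 1.5 * N * k_B / T)                         # coefficient of dT: C_V / T
print(dS_dV, N * k_B / V)                               # coefficient of dV: P / T
print(dS_dN, k_B * (math.log(V / (N * lam**3)) + 1.5))  # coefficient of dN
```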

Phase Space and Entropy

🌌 Phase Space Volume

In classical statistical mechanics, the microcanonical ensemble considers all microstates within a thin energy shell in phase space:

\[ \Omega(E) = \frac{1}{h^{3N} N!} \int_{E \leq H(\mathbf{p},\mathbf{q}) \leq E+\delta E} d^{3N}p \, d^{3N}q \]

The factors \(h^{3N}\) and \(N!\) ensure the correct quantum mechanical counting of states in the classical limit.

Sample Problem: Entropy Calculation

Calculate the entropy of 1 mole of argon gas at standard temperature and pressure (STP: T = 273 K, P = 1 atm).

Given:
\[ N = N_A = 6.022 \times 10^{23} \]
\[ T = 273 \, \text{K} \]
\[ P = 1.013 \times 10^5 \, \text{Pa} \]
\[ m = 6.63 \times 10^{-26} \, \text{kg} \quad \text{(mass of argon atom)} \]
\[ k_B = 1.381 \times 10^{-23} \, \text{J/K} \]
\[ h = 6.626 \times 10^{-34} \, \text{J·s} \]
Volume from ideal gas law:
\[ V = \frac{N k_B T}{P} \]
\[ = \frac{6.022 \times 10^{23} \times 1.381 \times 10^{-23} \times 273}{1.013 \times 10^5} \]
\[ = 0.0224 \, \text{m}^3 \]
Thermal wavelength:
\[ \lambda = \frac{h}{\sqrt{2\pi m k_B T}} \]
\[ = \frac{6.626 \times 10^{-34}}{\sqrt{2\pi \times 6.63 \times 10^{-26} \times 1.381 \times 10^{-23} \times 273}} \]
\[ = 1.67 \times 10^{-11} \, \text{m} \]
Entropy using Sackur-Tetrode:
\[ S = N k_B \left[ \ln \left( \frac{V}{N \lambda^3} \right) + \frac{5}{2} \right] \]
\[ = 6.022 \times 10^{23} \times 1.381 \times 10^{-23} \left[ \ln \left( \frac{0.0224}{6.022 \times 10^{23} \times (1.67 \times 10^{-11})^3} \right) + \frac{5}{2} \right] \]
\[ = 8.314 \left[ \ln (7.96 \times 10^{6}) + 2.5 \right] \]
\[ = 8.314 \left[ 15.89 + 2.5 \right] \]
\[ \approx 153 \, \text{J/K} \]
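The same calculation as a short script, using the values given above:

```python
import math

k_B, h, N_A = 1.381e-23, 6.626e-34, 6.022e23

T, P, m = 273.0, 1.013e5, 6.63e-26  # argon at STP

V = N_A * k_B * T / P                           # volume from the ideal gas law
lam = h / math.sqrt(2 * math.pi * m * k_B * T)  # thermal wavelength
S = N_A * k_B * (math.log(V / (N_A * lam**3)) + 2.5)

print(V, lam, S)  # ~0.0224 m^3, ~1.67e-11 m, ~153 J/K
```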
Sample Problem: Chemical Potential

Calculate the chemical potential of nitrogen gas at room temperature (T = 300 K) and atmospheric pressure.

Given:
\[ T = 300 \, \text{K} \]
\[ P = 1.013 \times 10^5 \, \text{Pa} \]
\[ m = 4.65 \times 10^{-26} \, \text{kg} \quad \text{(mass of N₂ molecule)} \]
\[ k_B = 1.381 \times 10^{-23} \, \text{J/K} \]
\[ h = 6.626 \times 10^{-34} \, \text{J·s} \]
Number density:
\[ n = \frac{P}{k_B T} \]
\[ = \frac{1.013 \times 10^5}{1.381 \times 10^{-23} \times 300} \]
\[ = 2.45 \times 10^{25} \, \text{m}^{-3} \]
Thermal wavelength:
\[ \lambda = \frac{h}{\sqrt{2\pi m k_B T}} \]
\[ = \frac{6.626 \times 10^{-34}}{\sqrt{2\pi \times 4.65 \times 10^{-26} \times 1.381 \times 10^{-23} \times 300}} \]
\[ = 1.90 \times 10^{-11} \, \text{m} \]
Chemical potential:
\[ \mu = k_B T \ln (n \lambda^3) \]
\[ = 1.381 \times 10^{-23} \times 300 \times \ln \left( 2.45 \times 10^{25} \times (1.90 \times 10^{-11})^3 \right) \]
\[ = 4.143 \times 10^{-21} \times \ln \left( 1.68 \times 10^{-7} \right) \]
\[ = 4.143 \times 10^{-21} \times (-15.6) \]
\[ = -6.46 \times 10^{-20} \, \text{J} \]
\[ \approx -0.40 \, \text{eV} \]
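And the corresponding script for this chemical-potential example, using the values given above:

```python
import math

k_B, h = 1.381e-23, 6.626e-34

T, P, m = 300.0, 1.013e5, 4.65e-26  # N2 at room temperature and 1 atm

n = P / (k_B * T)                               # number density
lam = h / math.sqrt(2 * math.pi * m * k_B * T)  # thermal wavelength
mu = k_B * T * math.log(n * lam**3)

print(n, lam, mu, mu / 1.602e-19)  # ~2.45e25 m^-3, ~1.90e-11 m, ~-6.5e-20 J, ~-0.40 eV
```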

Frequently Asked Questions

Why is the N! factor needed in the microcanonical ensemble?

The N! factor (Gibbs factor) is essential for correctly counting microstates in systems of indistinguishable particles. In quantum mechanics, identical particles are fundamentally indistinguishable - there's no way to label them or track individual identities.

Without dividing by N!, we would overcount the number of distinct microstates by counting permutations of identical particles as different states. This correction:

  • Resolves Gibbs paradox
  • Makes entropy extensive (proportional to system size)
  • Ensures consistency with quantum statistics

The factor appears naturally in quantum statistical mechanics but must be added manually in the classical treatment.

What is the physical significance of the thermal wavelength λ?

The thermal de Broglie wavelength \( \lambda = h/\sqrt{2\pi m k_B T} \) has several important physical interpretations:

  • Quantum delocalization: Represents the spatial extent over which a particle's wavefunction is significant at temperature T
  • Quantum concentration: When \( n\lambda^3 \approx 1 \), quantum effects become important
  • Criterion for classical behavior: Systems with \( n\lambda^3 \ll 1 \) behave classically
  • Onset of quantum degeneracy: When \( n\lambda^3 \gtrsim 1 \), quantum statistics (Bose-Einstein or Fermi-Dirac) must be used

For most gases at room temperature, \( \lambda \sim 10^{-11} \) m and \( n\lambda^3 \sim 10^{-7}\)–\(10^{-6} \), justifying the classical treatment.
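A rough numerical illustration of this criterion; the N₂ parameters match the example above, and the conduction-electron density for copper (roughly 8.5 × 10²⁸ m⁻³) is an assumed textbook value used only for contrast:

```python
import math

k_B, h = 1.381e-23, 6.626e-34

def degeneracy_parameter(m, T, n):
    """n * lambda^3: << 1 means classical, >~ 1 means quantum statistics needed."""
    lam = h / math.sqrt(2 * math.pi * m * k_B * T)
    return n * lam**3

n_gas = 1.013e5 / (k_B * 300.0)                       # ideal gas at 1 atm, 300 K
print(degeneracy_parameter(4.65e-26, 300.0, n_gas))   # N2 molecules: ~1.7e-7, classical
print(degeneracy_parameter(9.11e-31, 300.0, 8.5e28))  # electrons in copper: >> 1, quantum
```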

How does the microcanonical ensemble relate to other ensembles?

The microcanonical ensemble is the foundation upon which other statistical ensembles are built:

| Ensemble | Fixed Quantities | Application |
|---|---|---|
| Microcanonical | E, V, N | Isolated systems |
| Canonical | T, V, N | Systems in a thermal bath |
| Grand Canonical | T, V, μ | Open systems |

The canonical ensemble can be derived from the microcanonical ensemble by considering a small system in contact with a large heat bath. Similarly, the grand canonical ensemble extends this to systems that can exchange both energy and particles.

All ensembles give equivalent results in the thermodynamic limit (N → ∞), but each is most convenient for different types of problems.

📚 Master Statistical Mechanics

The microcanonical ensemble provides the fundamental foundation for statistical mechanics, connecting microscopic physics with macroscopic thermodynamics. Understanding these concepts is essential for advanced studies in condensed matter physics, astrophysics, and quantum many-body systems.

Read More: Statistical Physics Notes

© House of Physics | Thermal and Statistical Physics: Microcanonical Ensemble

Based on university curriculum with additional insights from statistical mechanics literature

