It is interesting to observe that all optical materials with a positive refractive index have a value of index that is of order unity. Surprisingly, though, a deep understanding of the mechanisms that lead to this universal behavior seems to be lacking. Moreover, this observation is difficult to reconcile with the fact that a single isolated atom is known to have a giant optical response, as characterized by a resonant scattering cross section that far exceeds its physical size.
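As a rough illustration of this "giant" response (a back-of-the-envelope estimate for an ideal two-level atom driven on resonance; the specific wavelength is chosen only as a representative example), the resonant scattering cross section is

σ_sc = 3λ²/(2π),

which for λ ≈ 780 nm (the rubidium D2 line) gives σ_sc ≈ 0.29 μm². This exceeds the physical cross-sectional area of the atom, on the order of 10⁻² nm², by roughly seven orders of magnitude.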
Here, we theoretically and numerically investigate the evolution of the optical properties of an ensemble of ideal atoms as a function of density, starting from the dilute-gas limit and including the effects of multiple scattering and near-field interactions. Interestingly, despite the giant response of an isolated atom, we find that the maximum index does not grow indefinitely with increasing density but instead saturates at a limiting value of n ≈ 1.7. We propose an explanation based upon strong-disorder renormalization group theory, in which the near-field interaction combined with random atomic positions results in an inhomogeneous broadening of the atomic resonance frequencies. This limit arises purely from electrodynamics, as it occurs at densities far below those where chemical processes become important.
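To make the type of numerical calculation concrete, below is a minimal coupled-dipole sketch: classical point dipoles at random positions, driven by a plane wave and coupled through the free-space dyadic Green's function, which contains the 1/r³ near-field terms. This is a generic illustration of a multiple-scattering calculation of this kind; the atom number, density, and choice of observable are illustrative assumptions, not the actual setup used in this work.

```python
import numpy as np

# Units: wave number k = 1, eps0 = 1, natural linewidth Gamma = 1.
# All parameter values below are illustrative.
k, Gamma, Delta = 1.0, 1.0, 0.0                              # detuning Delta from resonance
alpha = -3 * np.pi * Gamma / k**3 / (Delta + 0.5j * Gamma)   # two-level atomic polarizability

rng = np.random.default_rng(0)
N, L = 200, 10.0                           # atom number, box side (in units of 1/k)
pos = rng.uniform(0.0, L, size=(N, 3))     # random atomic positions

def green(rvec):
    """Free-space dyadic Green's function G(r), including near-field terms."""
    r = np.linalg.norm(rvec)
    rhat = rvec / r
    kr = k * r
    pre = np.exp(1j * kr) / (4 * np.pi * r)
    diag = 1 + 1j / kr - 1 / kr**2         # -> transverse far field for kr >> 1
    outer = -(1 + 3j / kr - 3 / kr**2)     # -> static 1/r^3 dipole pattern for kr << 1
    return pre * (diag * np.eye(3) + outer * np.outer(rhat, rhat))

# Self-consistent dipoles: p_i = alpha [E0(r_i) + k^2 sum_{j != i} G(r_i - r_j) p_j]
A = np.eye(3 * N, dtype=complex)
for i in range(N):
    for j in range(N):
        if i != j:
            A[3*i:3*i+3, 3*j:3*j+3] = -alpha * k**2 * green(pos[i] - pos[j])

E0 = np.zeros(3 * N, dtype=complex)        # incident plane wave, x-polarized, along z
E0[0::3] = np.exp(1j * k * pos[:, 2])

p = np.linalg.solve(A, alpha * E0)         # solve for all dipole amplitudes at once

# Extinction cross section per atom via the optical theorem, in units of the
# single-atom resonant cross section sigma_0 = 6*pi/k^2 (= 3*lambda^2 / 2*pi).
sigma_ext = k * np.imag(np.vdot(E0, p))
print("sigma_ext per atom / sigma_0 =", sigma_ext / (N * 6 * np.pi / k**2))
```

Sweeping the density N/L³ and extracting an effective refractive index from, for example, the phase and attenuation of the transmitted field would turn this sketch into the kind of density scan described above; the optical-theorem observable is simply the easiest quantity to read off from the solved dipole amplitudes.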