The success of modern quantitative quantum chemistry, relative to its primitive, qualitative beginnings, can be traced to two sources: better algorithms and better computers. While both technologies continue to improve rapidly, progress is hampered by the fact that the number of significant electron repulsion integrals (ERIs) grows quadratically with the size of the molecular system. Even large gains in ERI algorithm efficiency therefore yield only modest increases in the size of tractable systems, hindering the wider application of ab initio methods to areas of, for example, biochemical significance, where semi-empirical techniques [267, 268] have already proven so valuable.
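The quadratic growth of the significant ERI count can be illustrated with a toy model. The sketch below (an illustration only, not a production integral code) places Gaussian centers on a one-dimensional chain and counts the shell pairs that survive a simple distance cutoff, a crude stand-in for Cauchy-Schwarz screening; the surviving pairs grow linearly with the number of centers, so the significant ERIs, which couple pairs of pairs, grow quadratically rather than at the formal quartic rate.

```python
# Toy model: N Gaussian centers on a 1-D chain with unit spacing.
# A distance cutoff plays the role of integral screening: only pairs
# of centers closer than `cutoff` contribute significantly.

def significant_pairs(n_centers, spacing=1.0, cutoff=5.0):
    """Count center pairs (i <= j) separated by at most `cutoff`."""
    return sum(1 for i in range(n_centers)
                 for j in range(i, n_centers)
                 if (j - i) * spacing <= cutoff)

for n in (50, 100, 200):
    pairs = significant_pairs(n)
    # Surviving pairs ~ O(N); significant ERIs ~ pairs^2 ~ O(N^2),
    # far below the formal O(N^4) count of all ERIs.
    print(n, pairs, pairs * pairs)
```

Doubling the chain length roughly doubles the surviving pair count, confirming the linear pair growth (and hence quadratic ERI growth) that the screening argument predicts.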
Thus, the elimination of quadratic scaling has been a recurring theme of quantum chemistry research throughout the 1990s and has led to the construction of many alternative algorithms that alleviate the problem. Johnson was the first to implement DFT exchange/correlation functionals whose computational cost scales linearly with system size [532]. This paved the way for the most significant breakthrough in the area, the linear scaling CFMM algorithm [1210], which led to linear scaling DFT calculations [1212]. Further breakthroughs have been made within traditional theory in the form of the QCTC [191, 192, 193] and ONX [1009, 1010] algorithms, while more radical proposals [23, 289] may lead to entirely new approaches to ab initio calculation. Investigations into the quadratic Coulomb problem have not only yielded linear scaling algorithms, but are also providing considerable insight into the significance of the various molecular energy components.
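The key idea behind CFMM-type algorithms can be sketched in a heavily simplified form. The toy code below (a 1-D monopole version, not the actual CFMM, which uses full multipole expansions of continuous charge distributions in three dimensions) groups distant charges into boxes and replaces each box by its total charge at its center of charge, so the far field is evaluated at a cost proportional to the number of boxes rather than the number of charges.

```python
# Minimal monopole sketch of hierarchical far-field grouping.
# Assumes all charges are positive so the center of charge is well defined.

def direct_potential(x, charges):
    """Exact Coulomb-like potential at point x from (position, q) pairs."""
    return sum(q / abs(x - xi) for xi, q in charges)

def monopole_potential(x, charges, box_size=4.0):
    """Approximate potential: charges grouped into boxes of width box_size,
    each box replaced by its total charge at its center of charge."""
    boxes = {}
    for xi, q in charges:
        key = int(xi // box_size)
        tot_q, moment = boxes.get(key, (0.0, 0.0))
        boxes[key] = (tot_q + q, moment + q * xi)
    return sum(tot_q / abs(x - moment / tot_q)
               for tot_q, moment in boxes.values() if tot_q != 0)

# Eight unit charges well separated from the evaluation point x = 0.
charges = [(10.0 + i, 1.0) for i in range(8)]
exact = direct_potential(0.0, charges)
approx = monopole_potential(0.0, charges)
print(exact, approx)
```

Because the evaluation point is far from the charges, the three-box monopole approximation reproduces the exact sum to well under one percent while touching only three "interactions" instead of eight; in the real CFMM this grouping is applied hierarchically, which is what produces linear scaling for the Coulomb problem.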
Linear scaling Coulomb and exchange/correlation SCF algorithms are not the end of the story: the diagonalization step has long been rate limiting in semi-empirical techniques and has been predicted to become rate limiting in ab initio approaches in the medium term [1088]. However, divide-and-conquer techniques [661, 1251, 1252, 1253] and the recently developed quadratically convergent SCF algorithm [830] show great promise for alleviating this bottleneck.
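The diagonalization bottleneck, and the leverage offered by divide-and-conquer partitioning, can be seen from a back-of-envelope cost model (an illustration under simplifying assumptions, not a real SCF code): dense diagonalization scales as O(N^3), while diagonalizing k subsystem blocks of size N/k costs k(N/k)^3 = N^3/k^2, which is linear in N once the block size is held fixed.

```python
# Flop-count caricature of dense diagonalization vs a divide-and-conquer
# scheme that diagonalizes fixed-size subsystem blocks independently.
# (Real divide-and-conquer SCF methods also need buffer regions and a
# common chemical potential; those costs are ignored here.)

def dense_diag_cost(n):
    """O(N^3) cost model for diagonalizing one dense N x N matrix."""
    return n ** 3

def dac_cost(n, block):
    """Cost of diagonalizing ceil(n/block) blocks of dimension `block`."""
    n_blocks = -(-n // block)          # ceiling division
    return n_blocks * block ** 3

for n in (100, 200, 400):
    # Dense cost grows 8x per doubling of N; block cost only 2x.
    print(n, dense_diag_cost(n), dac_cost(n, block=25))
```

Doubling the system size multiplies the dense cost by eight but the fixed-block cost by only two, which is the essential reason divide-and-conquer schemes can defer the diagonalization bottleneck.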