The success of modern quantitative quantum chemistry, relative to its primitive, qualitative beginnings, can be traced to two sources: better algorithms and better computers. While both continue to improve rapidly, progress is hampered by the fact that the number of significant ERIs grows quadratically with the size of the molecular system. Consequently, even large increases in ERI algorithm efficiency yield only moderate increases in applicability, hindering the wider application of ab initio methods to areas of potential biochemical significance where semi-empirical techniques215, 216 have already proven so valuable.
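This quadratic growth can be made concrete with a toy model: a one-dimensional chain of s-type Gaussian shells whose pair magnitudes decay exponentially with inter-center distance, subjected to a Schwarz-style product screen. The spacing, decay exponent, and threshold below are arbitrary assumptions chosen only to expose the trend; a minimal sketch follows.

```python
import numpy as np

def count_screened_eris(n, spacing=1.5, alpha=0.5, tau=1e-10):
    """Toy 1D chain of s-type Gaussian shells: count the ERIs (mu nu|la si)
    surviving a Schwarz-style product screen Q[mu,nu]*Q[la,si] >= tau,
    with pair magnitudes Q ~ exp(-alpha * R**2) decaying with distance."""
    x = spacing * np.arange(n)
    Q = np.exp(-alpha * (x[:, None] - x[None, :]) ** 2)
    q = Q[Q >= tau]                    # significant shell pairs: grows as O(n)
    return int(np.sum(np.outer(q, q) >= tau))  # surviving ERIs: grow as O(n^2)

for n in (25, 50, 100, 200):
    # the count roughly quadruples each time n doubles, i.e. quadratic growth
    print(n, count_screened_eris(n))
```

Because only a bounded neighborhood of each shell yields significant pairs, the number of pairs grows linearly; the surviving integrals, being products of two such pairs, grow quadratically.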
Thus, the elimination of quadratic scaling has been a major theme of quantum chemistry research throughout the 1990s and has prompted the construction of many alternative algorithms to alleviate the problem. Johnson was the first to implement DFT exchange/correlation functionals whose computational cost scales linearly with system size.441 This paved the way for the most significant breakthrough in the area, the linear scaling CFMM algorithm,1040 which led to linear scaling DFT calculations.1042 Further advances have been made within traditional theory in the form of the QCTC154, 155, 156 and ONX857, 858 algorithms, while more radical proposals8, 232 may lead to entirely new approaches to ab initio calculation. Investigations into the quadratic Coulomb problem have not only yielded linear scaling algorithms, but are also providing considerable insight into the significance of many molecular energy components.
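The central idea behind the multipole methods underlying CFMM can be seen in a small sketch, assuming two well-separated clusters of point charges (the cluster sizes, spread, and separation below are invented for illustration and bear no relation to any published implementation): the pairwise Coulomb sum over the two clusters is replaced by a single lowest-order multipole term.

```python
import numpy as np

rng = np.random.default_rng(0)

def cluster(center, m, spread=0.5):
    """m unit point charges jittered around a center."""
    return center + spread * rng.standard_normal((m, 3)), np.ones(m)

ra, qa = cluster(np.array([0.0, 0.0, 0.0]), 40)
rb, qb = cluster(np.array([20.0, 0.0, 0.0]), 40)

# Exact cluster-cluster Coulomb energy: O(m*n) pairwise work
exact = sum(qi * qj / np.linalg.norm(ri - rj)
            for ri, qi in zip(ra, qa)
            for rj, qj in zip(rb, qb))

# Lowest-order multipole (monopole) approximation: each well-separated
# cluster is replaced by a single point charge at its centroid, O(m+n) work
ca, cb = ra.mean(axis=0), rb.mean(axis=0)
approx = qa.sum() * qb.sum() / np.linalg.norm(ca - cb)

print(exact, approx, abs(exact - approx) / exact)  # error ~ a few parts in 1e3
```

Applied hierarchically to all well-separated groups, this replacement is what reduces the far-field Coulomb work from quadratic to linear in system size.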
Linear scaling Coulomb and exchange/correlation SCF algorithms are not the end of the story: the diagonalization step has long been rate limiting in semi-empirical techniques and has been predicted to become rate limiting in ab initio approaches in the medium term.926 However, divide-and-conquer techniques1082, 1083, 1081, 564 and the recently developed quadratically convergent SCF algorithm701 show great promise for reducing this problem.
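A back-of-the-envelope cost model shows why the eigensolve must eventually dominate once the Fock build is linear scaling. The prefactors below are invented placeholders, not measured timings, chosen only to expose the crossover.

```python
# Illustrative cost model: linear scaling Fock build vs. O(N^3) eigensolve
# performed every SCF cycle. Prefactors are assumptions for illustration.
FOCK_PER_FN = 1e-4    # assumed seconds per basis function (linear term)
DIAG_PER_OP = 1e-11   # assumed seconds per N^3 diagonalization operation

for n in (1_000, 5_000, 10_000, 50_000):
    fock, diag = FOCK_PER_FN * n, DIAG_PER_OP * n**3
    print(f"N={n:6d}  Fock ~ {fock:10.2f} s  diag ~ {diag:12.2f} s  "
          f"diag share = {diag / (fock + diag):.0%}")
```

Whatever the actual prefactors, the cubic term overtakes the linear one at some system size, which is precisely the regime that divide-and-conquer and alternative minimization schemes aim to address.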