The Hopfield model of a neural network is studied near saturation, i.e., when the number p of stored patterns increases with the size of the network N as p = αN. The mean-field theory for this system is described in detail. At low α the system possesses both a spin-glass phase and 2p dynamically stable, degenerate ferromagnetic phases. The latter have essentially full macroscopic overlaps with the memorized patterns and provide effective associative memory despite the spin-glass features. The network can retrieve patterns, at T = 0, with an error of less than 1.5% for α < αc = 0.14. At αc the ferromagnetic (FM) retrieval states disappear discontinuously. Numerical simulations show that even above αc the overlaps with the stored patterns are not zero, but the level of error precludes meaningful retrieval. The discrepancy between the statistical-mechanics results and the simulations is discussed. As α decreases below 0.05 the FM retrieval states become ground states of the system, and for α < 0.03 mixture states appear. The level of storage creates noise, akin to temperature at finite p. Replica symmetry breaking is found to be salient in the spin-glass state, but in the retrieval states it appears only at extremely low temperatures and is argued to have a very weak effect. This is corroborated by simulations. The study is extended to survey the phase diagram of the system in the presence of stochastic synaptic noise (temperature) and the effect of external fields (neuronal thresholds) coupled to groups of patterns. It is found that a field coupled to many patterns is of very limited utility in enhancing their learning. Finally, we discuss the robustness of the network to the relaxation of various underlying assumptions, as well as some new trends in the study of neural networks.
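As a rough numerical illustration of the setting summarized above (a minimal sketch, not the simulation protocol used in the paper), the following Python snippet builds a Hopfield network with the standard Hebbian couplings J_ij = (1/N) Σ_μ ξ_i^μ ξ_j^μ at a storage ratio α below αc, runs zero-temperature asynchronous dynamics, and measures the retrieval overlap m = (1/N) Σ_i ξ_i S_i. The network size N, the number of sweeps, and the helper name retrieve are arbitrary choices made here for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

N = 1000          # network size (illustrative; the theory is for N -> infinity)
alpha = 0.10      # storage ratio p = alpha * N, chosen below alpha_c ~ 0.14
p = int(alpha * N)

# p random +/-1 patterns xi^mu_i
xi = rng.choice([-1, 1], size=(p, N))

# Hebbian couplings J_ij = (1/N) sum_mu xi^mu_i xi^mu_j, with J_ii = 0
J = (xi.T @ xi) / N
np.fill_diagonal(J, 0.0)

def retrieve(S, sweeps=20):
    """Zero-temperature asynchronous dynamics: S_i <- sign(sum_j J_ij S_j)."""
    S = S.copy()
    for _ in range(sweeps):
        for i in rng.permutation(N):
            h = J[i] @ S
            S[i] = 1 if h >= 0 else -1
    return S

# Start at a stored pattern and let the network relax; the retrieval error
# is roughly (1 - m) / 2, where m is the overlap with that pattern.
S = retrieve(xi[0])
m = (xi[0] @ S) / N
print(f"alpha = {alpha:.2f}, overlap m = {m:.3f}, error = {(1 - m) / 2:.3%}")
```

At this value of α the fixed point reached from a stored pattern should retain a near-unity overlap, consistent with the few-percent error level quoted in the abstract, whereas repeating the experiment well above αc would show the overlap collapsing.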