Associative memory networks consisting of highly interconnected binary-valued cells have been used to model neural networks, and tight asymptotic bounds have been found for the information capacity of these networks. The authors derive this asymptotic information capacity using results from normal approximation theory and theorems about exchangeable random variables.
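To make concrete the kind of network the abstract describes, the following is a minimal sketch of a binary-valued associative memory with Hebbian (outer-product) storage and sign-threshold recall. The storage rule, network size, and pattern count here are illustrative assumptions, not the authors' construction or bounds.

```python
import numpy as np

rng = np.random.default_rng(0)

n = 64  # number of binary (+1/-1) cells
m = 5   # stored patterns, kept small (classical results put pattern capacity at O(n / log n))

# Store m random patterns via the Hebbian outer-product rule.
patterns = rng.choice([-1, 1], size=(m, n))
W = patterns.T @ patterns / n
np.fill_diagonal(W, 0)  # no self-connections

def recall(probe, max_steps=10):
    """Synchronously update all cells until a fixed point or the step limit."""
    state = probe.copy()
    for _ in range(max_steps):
        new_state = np.sign(W @ state)
        new_state[new_state == 0] = 1  # break ties deterministically
        if np.array_equal(new_state, state):
            break
        state = new_state
    return state

# Corrupt a stored pattern in a few positions and check that it is recovered.
probe = patterns[0].copy()
flip = rng.choice(n, size=4, replace=False)
probe[flip] *= -1
print("recovered stored pattern:", np.array_equal(recall(probe), patterns[0]))
```

With the pattern count well below capacity, as here, the corrupted probe settles back to the stored pattern; loading the network with too many patterns is exactly the regime the capacity bounds characterize.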