We explore the relationship between evolved neural network structure and function by applying graph-theoretic tools to the topology of artificial neural networks known to exhibit evolutionary increases in dynamical neural complexity. Our results suggest a synergistic convergence between network structures that emerge under physical constraints, such as wiring length and brain volume, and optimal network topologies evolved purely for function in the absence of such constraints. We observe increases in clustering coefficients together with decreases in path lengths, which jointly produce a driven evolutionary bias toward small-world networks relative to comparable networks in a passive null model. These small-world biases appear during the same periods in which evolution actively selects for increasing neural complexity, and in which the model's agents are behaviorally adapting to their environment, thereby strengthening the association between small-world network structures and complex neural dynamics. We also introduce "normalized path length", a new measure of path length that is better behaved than existing metrics for networks composed of disjoint subgraphs and disconnected nodes, and "small-world bias", a novel method of quantifying the degree of evolutionary selection for small-world networks.
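The small-world comparison above rests on two standard graph statistics: the mean local clustering coefficient and the characteristic path length, with a network counting as small-world when it combines high clustering with short paths relative to a null model. As a rough, self-contained illustration (not the paper's own code, and not its "normalized path length" measure, whose definition is not reproduced here), both statistics can be computed directly on an adjacency-set representation:

```python
from collections import deque

def clustering_coefficient(adj):
    """Mean local clustering coefficient of an undirected graph
    given as {node: set(neighbors)}. Nodes of degree < 2 contribute 0."""
    total = 0.0
    for v, nbrs in adj.items():
        k = len(nbrs)
        if k < 2:
            continue
        # Count edges among v's neighbors (each unordered pair once).
        links = sum(1 for a in nbrs for b in nbrs if a < b and b in adj[a])
        total += 2.0 * links / (k * (k - 1))
    return total / len(adj)

def avg_path_length(adj):
    """Characteristic path length: mean BFS distance over all
    connected, ordered node pairs."""
    total, pairs = 0, 0
    for src in adj:
        dist = {src: 0}
        q = deque([src])
        while q:
            u = q.popleft()
            for w in adj[u]:
                if w not in dist:
                    dist[w] = dist[u] + 1
                    q.append(w)
        for v, d in dist.items():
            if v != src:
                total += d
                pairs += 1
    return total / pairs

# Toy example: a ring lattice of 20 nodes, each linked to its
# 2 nearest neighbors on each side (the classic Watts-Strogatz substrate).
n, k = 20, 2
adj = {i: set() for i in range(n)}
for i in range(n):
    for j in range(1, k + 1):
        adj[i].add((i + j) % n)
        adj[(i + j) % n].add(i)

C = clustering_coefficient(adj)  # high clustering (0.5 for this lattice)
L = avg_path_length(adj)         # paths grow with n for a pure lattice
print(C, L)
```

Rewiring a small fraction of lattice edges at random shortens the path length sharply while leaving the clustering coefficient nearly intact; detecting an evolutionary bias toward that regime, relative to a passive null model, is what the abstract's "small-world bias" quantifies.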

Work funded by the National Academies and Keck Futures Initiative, and performed in collaboration with Olaf Sporns, Steven Williams, Xin Shuai, and Sean Dougherty.