Quantitative Biology > Molecular Networks

Abstract: Gene regulatory networks can be successfully modeled as Boolean networks. A
much-discussed hypothesis holds that such model networks best reproduce empirical
findings if they are tuned to operate at criticality, i.e., at the
borderline between their ordered and disordered phases. Critical networks have
been argued to confer a number of functional advantages, such as maximal
dynamical range, maximal sensitivity to environmental changes, and an
excellent trade-off between stability and flexibility. Here, we study the
effect of noise within the context of Boolean networks trained to learn complex
tasks under supervision. We verify that quasi-critical networks are the ones
that learn fastest (even under asynchronous updating rules)
and that the larger the task complexity, the smaller the distance to
criticality. On the other hand, when additional sources of intrinsic noise in
the network states and/or in the wiring pattern are introduced, the optimally
performing networks become clearly subcritical. These results suggest that, in
order to compensate for inherent stochasticity, regulatory and other types of
biological networks might become subcritical rather than critical, all
the more so if the task to be performed has limited complexity.
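To make the objects in the abstract concrete, here is a minimal sketch (not the authors' code) of a random Kauffman-style Boolean network with synchronous updates and the kind of intrinsic state noise discussed above. The function names (`random_boolean_network`, `step`, `flip_noise`) and the parameters `n`, `k`, and `p` are illustrative choices, not taken from the paper; for unbiased random truth tables, in-degree k = 2 is the classic critical point separating the ordered (k < 2) and disordered (k > 2) phases.

```python
import random

def random_boolean_network(n, k, seed=0):
    """Random Kauffman NK Boolean network: each of n nodes reads k
    randomly chosen inputs through a random Boolean truth table."""
    rng = random.Random(seed)
    inputs = [rng.sample(range(n), k) for _ in range(n)]
    tables = [[rng.randint(0, 1) for _ in range(2 ** k)] for _ in range(n)]
    return inputs, tables

def step(state, inputs, tables):
    """Synchronous update: every node reads its inputs simultaneously."""
    new = []
    for i in range(len(state)):
        idx = 0
        for j in inputs[i]:
            idx = (idx << 1) | state[j]  # pack the k input bits into a table index
        new.append(tables[i][idx])
    return new

def flip_noise(state, p, rng):
    """Intrinsic state noise: each node's value flips with probability p."""
    return [s ^ (1 if rng.random() < p else 0) for s in state]

# Example: evolve a noisy network for a few synchronous steps.
inputs, tables = random_boolean_network(n=8, k=2, seed=1)
rng = random.Random(0)
state = [0] * 8
for _ in range(5):
    state = flip_noise(step(state, inputs, tables), p=0.05, rng=rng)
```

Tracking how a one-bit perturbation spreads between two copies of such a network (a Derrida-style experiment) is the standard way to locate a network relative to the critical line.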