Abstract

The evolution of aposematism, a phenomenon in which prey species conspicuously advertise their unprofitability to predators, is puzzling. How could conspicuousness evolve if it simultaneously increased the likelihood that an inexperienced predator would detect the prey and presumably kill it? Antiapostatic selection, in which rare prey is predated relatively more often, is considered another major obstacle to the evolution of aposematism. However, the risk of being conspicuous at low frequencies has not been experimentally tested. We designed an experiment to test how the frequency (4%, 12%, 32%) of conspicuous aposematic prey and its dispersion type (solitary vs. aggregated) affect both the initial predation risk of the prey and the avoidance learning of predators. Wild great tits (Parus major) preyed on artificial prey in a "novel world." The relative mortality of aposematic prey was antiapostatic; thus, frequency-dependent predation was most severe at low frequencies. At all frequencies, aggregated aposematic prey survived better than solitary prey. Surprisingly, avoidance learning was not determined by a fixed number of unpalatable prey eaten: at low frequencies, predators learned to avoid the prey after eating fewer aposematic individuals. However, per-capita risk for the prey remained highest at low frequencies. Our results underscore the difficulty of the initial evolution of rare conspicuous morphs. Aggregated prey suffered less from predation, indicating a selective advantage of aggregation over solitary living for conspicuous individuals.