Animal models of human anxiety often invoke a conflict between approach and avoidance [1, 2]. In these, a key behavioral assay comprises passive avoidance of potential threat and inhibition, both thought to be controlled by ventral hippocampus [2-6]. Efforts to translate these approaches to clinical contexts [7, 8] are hampered by the fact that it is not known whether humans manifest analogous approach-avoidance dispositions and, if so, whether they share a homologous neurobiological substrate [9]. Here, we developed a paradigm to investigate the role of human hippocampus in arbitrating an approach-avoidance conflict under varying levels of potential threat. Across four experiments, subjects showed analogous behavior by adapting both passive avoidance behavior and behavioral inhibition to threat level. Using functional magnetic resonance imaging (fMRI), we observe that threat level engages the anterior hippocampus, the human homolog of rodent ventral hippocampus [10]. Testing patients with selective hippocampal lesions, we demonstrate a causal role for the hippocampus with patients showing reduced passive avoidance behavior and inhibition across all threat levels. Our data provide the first human assay for approach-avoidance conflict akin to that of animal anxiety models. The findings bridge rodent and human research on passive avoidance and behavioral inhibition and furnish a framework for addressing the neuronal underpinnings of human anxiety disorders, where our data indicate a major role for the hippocampus.

The emergence of multicellular animals was associated with an increase in phenotypic complexity and with the acquisition of spatial cell differentiation and embryonic development. Paradoxically, this phenotypic transition was not paralleled by major changes in the underlying developmental toolkit and regulatory networks. In fact, most of these systems are ancient, established already in the unicellular ancestors of animals [1-5]. In contrast, the Microprocessor protein machinery, which is essential for microRNA (miRNA) biogenesis in animals, as well as the miRNA genes themselves produced by this Microprocessor, have not been identified outside of the animal kingdom [6]. Hence, the Microprocessor, with the key proteins Pasha and Drosha, is regarded as an animal innovation [7-9]. Here, we challenge this evolutionary scenario by investigating unicellular sister lineages of animals through genomic and transcriptomic analyses. We identify in Ichthyosporea both Drosha and Pasha (DGCR8 in vertebrates), indicating that the Microprocessor complex evolved long before the last common ancestor of animals, consistent with a pre-metazoan origin of most of the animal developmental gene elements. Through small RNA sequencing, we also discovered expressed bona fide miRNA genes in several ichthyosporean species harboring the Microprocessor. A deep, pre-metazoan origin of the Microprocessor and miRNAs complies with the view that the origin of multicellular animals was not directly linked to the innovation of these key regulatory components.

Mammals maintain a nearly constant core body temperature (Tb) by balancing heat production and heat dissipation. This comes at a high metabolic cost that is sustainable only if adequate calorie intake is maintained. When nutrients are scarce or experimentally reduced, such as during calorie restriction (CR), endotherms can reduce energy expenditure by lowering Tb [1-6]. This adaptive response conserves energy, limiting the loss of body weight due to low calorie intake [7-10]. Here we show that this response is regulated by the kappa opioid receptor (KOR). CR is associated with increased hypothalamic levels of the endogenous opioid Leu-enkephalin, which is derived from the KOR agonist precursor dynorphin [11]. Pharmacological inhibition of KOR, but not of the delta or the mu opioid receptor subtypes, fully blocked CR-induced hypothermia and increased weight loss during CR independent of calorie intake. Similar results were seen in diet-induced obese (DIO) mice subjected to CR. In contrast, inhibiting KOR did not change Tb in animals fed ad libitum (AL). Chemogenetic inhibition of KOR neurons in the hypothalamic preoptic area reduced the CR-induced hypothermia, whereas chemogenetic activation of prodynorphin-expressing neurons in the arcuate or the parabrachial nucleus lowered Tb. These data indicate that KOR signaling is a pivotal regulator of energy homeostasis and can affect body weight during dieting by modulating Tb and energy expenditure.

While we do not yet understand all the functions of sleep, its critical role for normal physiology and behaviour is evident. Its amount and temporal pattern depend on species and condition. Humans sleep about a third of the day, with the longest, consolidated episode during the night. The change in lifestyle from hunter-gatherers via agricultural communities to densely populated industrialized centres has certainly affected sleep, and a major concern in the medical community is the impact of insufficient sleep on health [1, 2]. One of the causal mechanisms leading to insufficient sleep is altered exposure to the natural light–dark cycle. This includes the wide availability of electric light, attenuated exposure to daylight within buildings, and evening use of light-emitting devices, all of which decrease the strength of natural light–dark signals that entrain circadian systems [3].

The realization that senescence, the age-dependent decline in survival and reproductive performance, pervades natural populations has brought its evolutionary significance into sharper focus. However, reproductive senescence remains poorly understood because it is difficult to separate the male and female mechanisms underpinning reproductive success. We experimentally investigated male reproductive senescence in feral fowl, Gallus gallus domesticus, where socially dominant males monopolize access to females and the ejaculates of multiple males compete for fertilization. We detected signatures of senescence across multiple determinants of male reproductive success. The effect of age on status depended upon the intensity of intrasexual competition: old males were less likely to dominate male-biased groups, where competition is intense, but were as likely as young males to dominate female-biased groups. Mating and fertilization success declined sharply with male age, largely as a result of population-level patterns. These age-dependent declines translated into sexually antagonistic payoffs: old males fertilized more eggs when they were dominant, but this resulted in females suffering a drastic reduction in fertility. Thus, male senescence creates the potential for sexual conflict over mating, and the intensity of this conflict is modulated socially, by the probability of old males dominating reproductive opportunities.

Lateralized behaviors benefit individuals by increasing task efficiency in foraging and anti-predator behaviors [1–4]. The conventional lateralization paradigm suggests individuals are left or right lateralized, although the direction of this laterality can vary for different tasks (e.g. foraging or predator inspection/avoidance). By fitting tri-axial movement sensors to blue whales (Balaenoptera musculus), and by recording the direction and size of their rolls during lunge feeding events, we show how these animals differ from such a paradigm. The strength and direction of individuals’ lateralization were related to where and how the whales were feeding in the water column. Smaller rolls (≤180°) predominantly occurred at depth (>70 m), with whales being more likely to rotate clockwise around their longest axis (right lateralized). Larger rolls (>180°), conversely, occurred more often at shallower depths (<70 m) and were more likely to be performed anti-clockwise (left lateralized). More acrobatic rolls are typically used to target small, less dense krill patches near the water’s surface [5,6], and we posit that the specialization of lateralized feeding strategies may enhance foraging efficiency in environments with heterogeneous prey distributions.

As the only endemic neotropical parrot to have recently lived in the northern hemisphere, the Carolina parakeet (Conuropsis carolinensis) was an iconic North American bird. The last surviving specimen died in the Cincinnati Zoo in 1918 [1]. The cause of its extinction remains contentious: besides excessive mortality associated with habitat destruction and active hunting, its survival could have been negatively affected by its range having become increasingly patchy [2] or by exposure to poultry pathogens [3, 4]. In addition, the Carolina parakeet showed a predilection for cockleburs, an herbaceous plant that contains a powerful toxin, carboxyatractyloside, or CAT [5], which did not seem to affect them but made the birds notoriously toxic to most predators [3]. To explore the demographic history of this bird, we generated the complete genomic sequence of a preserved specimen held in a private collection in Espinelves (Girona, Spain), as well as of a close extant relative, Aratinga solstitialis. We identified two non-synonymous genetic changes in two highly conserved proteins known to interact with CAT that could underlie a specific dietary adaptation to this toxin. Our genomic analyses did not reveal evidence of a dramatic past demographic decline in the Carolina parakeet; also, its genome did not exhibit the long runs of homozygosity that are signals of recent inbreeding and are typically found in endangered species. As such, our results suggest its extinction was an abrupt process and thus likely solely attributable to human causes.

Population- and group-specific behavioral differences have been taken as evidence for animal cultures [1–10], a notion that remains controversial. Skeptics argue that ecological or genetic factors, rather than social learning, provide a more parsimonious explanation [11–14]. Work with captive chimpanzees has addressed this criticism by showing that experimentally created traditions can be transmitted through social learning [15–17]. Recent fieldwork further suggests that ecological and genetic factors are insufficient to explain the behavioral differences seen between communities, but the data are only observational [18, 19]. Here, we present the results of a field experiment [20, 21] that compared the performance of chimpanzees (P. t. schweinfurthii) from two Ugandan communities, Kanyawara and Sonso, on an identical task in the physical domain—extracting honey from holes drilled into horizontal logs. Kanyawara chimpanzees, who occasionally use sticks to acquire honey [4], spontaneously manufactured sticks to extract the experimentally provided honey. In contrast, Sonso chimpanzees, who possess a considerable leaf technology but no food-related stick use [4, 22], relied on their fingers, but some also produced leaf sponges to access the honey. Our results indicate that, when genetic and environmental factors are controlled for, wild chimpanzees rely on their cultural knowledge to solve a novel task.

Is it possible to mutate DNA during transcription? A new study shows that UV-damaged DNA is deaminated during transcription, which is a probable mechanism underlying CC tandem mutations found in the p53 gene in skin cancers.

A hallmark of developmental biology is the production of reproducible shapes through stereotyped morphogenetic events. At the cell level, growth is often highly heterogeneous, allowing shape diversity to arise. Yet how can reproducible shapes emerge from such growth heterogeneity? Is growth heterogeneity filtered out? Here, we focus on rapidly growing trichome cells in the Arabidopsis sepal, a reproducible floral organ. We show via computational modeling that rapidly growing cells may distort organ shape. However, the cortical microtubule alignment along growth-derived maximal tensile stress in adjacent cells would mechanically isolate rapidly growing cells and limit their impact on organ shape. In vivo, we observed such a microtubule response to stress and, consistently, found no significant effect of trichome number on sepal shape in the wild type and in lines with trichome number defects. Conversely, modulating the microtubule response to stress in katanin and spiral2 mutants made sepal shape dependent on trichome number, suggesting that, while mechanical signals are propagated around rapidly growing cells, the resistance to stress in adjacent cells mechanically isolates rapidly growing cells, thus contributing to organ shape reproducibility.

The theory of mimicry explains how a mimic species gains advantage by resembling a model species [1-3]. Selection for increased mimic-model similarity should then result in accurate mimicry, yet there are many surprising examples of poor mimicry in the natural world [4-8]. The existence of imperfect mimics remains a major unsolved conundrum. We propose and experimentally test a novel explanation of the phenomenon. We argue that predators perceive prey as having several traits, but that the traits differ in their importance for learning. When predators learn to discriminate prey, high-salience traits overshadow other traits, leaving them under little or no selection for similarity, and allow imperfect mimicry to succeed. We tested this idea experimentally, using blue tits as predators and artificial prey with three prominent traits: color, pattern, and shape. We found that otherwise imperfect color mimics were avoided about as much as perfect mimics, whereas pattern and shape mimics did not gain from their similarity to the model. All traits could separately be perceived and learned by the predators, but the color trait was learned at a higher rate, implying that it had higher salience. We conclude that difference in salience between components of prey appearance is of major importance in explaining imperfect mimicry.

The archaeological documentation of the development of sedentary farming societies in Anatolia is not yet mirrored by a genetic understanding of the human populations involved, in contrast to the spread of farming in Europe [1-3]. Sedentary farming communities emerged in parts of the Fertile Crescent during the tenth millennium and early ninth millennium calibrated (cal) BC and had appeared in central Anatolia by 8300 cal BC [4]. Farming spread into west Anatolia by the early seventh millennium cal BC and quasi-synchronously into Europe, although the timing and process of this movement remain unclear. Using genome sequence data that we generated from nine central Anatolian Neolithic individuals, we studied the transition period from early Aceramic (Pre-Pottery) to the later Pottery Neolithic, when farming expanded west of the Fertile Crescent. We find that genetic diversity in the earliest farmers was conspicuously low, on a par with European foraging groups. With the advent of the Pottery Neolithic, genetic variation within societies reached levels later found in early European farmers. Our results confirm that the earliest Neolithic central Anatolians belonged to the same gene pool as the first Neolithic migrants spreading into Europe. Further, genetic affinities between later Anatolian farmers and fourth to third millennium BC Chalcolithic south Europeans suggest an additional wave of Anatolian migrants, after the initial Neolithic spread but before the Yamnaya-related migrations. We propose that the earliest farming societies demographically resembled foragers and that only after regional gene flow and rising heterogeneity did the farming population expansions into Europe occur.

The evolutionary divergence of sexual signals is often important during the formation of new animal species, but our understanding of the origin of signal diversity is limited [1, 2]. Sensory drive, the optimization of communication signal efficiency through matching to the local environment, has been highlighted as a potential promoter of diversification and speciation [3]. The swordtail characin (Corynopoma riisei) is a tropical fish in which males display a flag-like ornament that elicits female foraging behavior during courtship. We show that the shape of the male ornament covaries with female diet across natural populations. More specifically, natural populations in which the female diet is more dominated by ants exhibit male ornaments more similar to the shape of an ant. Feeding experiments confirm that females habituated to a diet of ants prefer to bite at male ornaments from populations with a diet more dominated by ants. Our results show that the male ornament functions as a "fishing lure" that is diversifying in shape to match local variation in female search images employed during foraging. This direct link between variation in female feeding ecology and the evolutionary diversification of male sexual ornaments suggests that sensory drive may be a common engine of signal divergence.

The large variation in brain size that exists in the animal kingdom has been suggested to have evolved through the balance between selective advantages of greater cognitive ability and the prohibitively high energy demands of a larger brain (the "expensive-tissue hypothesis" [1]). Despite over a century of research on the evolution of brain size, empirical support for the trade-off between cognitive ability and energetic costs is based exclusively on correlative evidence [2], and the theory remains controversial [3, 4]. Here we provide experimental evidence for costs and benefits of increased brain size. We used artificial selection for large and small brain size relative to body size in a live-bearing fish, the guppy (Poecilia reticulata), and found that relative brain size evolved rapidly in response to divergent selection in both sexes. Large-brained females outperformed small-brained females in a numerical learning assay designed to test cognitive ability. Moreover, large-brained lines, especially males, developed smaller guts, as predicted by the expensive-tissue hypothesis [1], and produced fewer offspring. We propose that the evolution of brain size is mediated by a functional trade-off between increased cognitive ability and reproductive performance and discuss the implications of these findings for vertebrate brain evolution.

The impact of human mobility on northern European urban populations during the Viking and Early Middle Ages, and its repercussions in Scandinavia itself, are still largely unexplored. Our study of the demographics in the final phase of the Viking era is the first comprehensive multidisciplinary investigation that includes genetics, isotopes, archaeology, and osteology on a larger scale. This early Christian dataset is particularly important, as the earlier common pagan burial tradition during the Iron Age was cremation, hindering large-scale DNA analyses. We present genome-wide sequence data from 23 individuals from the 10th to 12th century Swedish town of Sigtuna. The data revealed high genetic diversity among the early urban residents. The observed variation exceeds the genetic diversity in distinct modern-day and Iron Age groups of central and northern Europe. Strontium isotope data suggest a mixed local and non-local origin of the townspeople. Our results uncover the social system underlying the urbanization process of the Viking world, in which mobility, comparable between males and females, was an intricate part. The inhabitants of Sigtuna were heterogeneous in their genetic affinities, probably reflecting both close and distant connections through an established network, confirming that early urbanization processes in northern Europe were driven by migration.

Lateralization is widespread throughout the animal kingdom [1-7] and can increase task efficiency via shortening reaction times and saving on neural tissue [8-16]. However, lateralization might be costly because it increases predictability [17-21]. In predator-prey interactions, for example, predators might increase capture success because of specialization in a lateralized attack, but at the cost of increased predictability to their prey, constraining the evolution of lateralization. One unexplored mechanism for evading such costs is group hunting: this would allow individual-level specialization, while still allowing for group-level unpredictability. We investigated this mechanism in group hunting sailfish, Istiophorus platypterus, attacking schooling sardines, Sardinella aurita. During these attacks, sailfish alternate in attacking the prey using their elongated bills to slash or tap the prey [22-24]. This rapid bill movement is either leftward or rightward. Using behavioral observations of identifiable individual sailfish hunting in groups, we provide evidence for individual-level attack lateralization in sailfish. More strongly lateralized individuals had a higher capture success. Further evidence of lateralization comes from morphological analyses of sailfish bills that show strong evidence of one-sided micro-teeth abrasions. Finally, we show that attacks by single sailfish are indeed highly predictable, but predictability rapidly declines with increasing group size because of a lack of population-level lateralization. Our results present a novel benefit of group hunting: by alternating attacks, individual-level attack lateralization can evolve, without the negative consequences of individual-level predictability. More generally, our results suggest that group hunting in predators might provide more suitable conditions for the evolution of strategy diversity compared to solitary life.
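The claim that predictability declines with group size when there is no population-level side bias can be illustrated with a toy calculation (not from the study; the 50/50 left/right split and the majority-guessing prey are illustrative assumptions). If each hunter is fully lateralized and the attacker is drawn at random from the group, a prey item guessing the group's majority side is always correct against a lone hunter, but its expected accuracy falls toward chance as the group grows:

```python
from math import comb

def expected_predictability(n: int) -> float:
    """Expected accuracy of a prey that guesses the majority attack side of a
    group of n fully lateralized hunters, each independently left- or
    right-lateralized with probability 0.5 (an illustrative assumption)."""
    # k = number of right-lateralized hunters; the attacker is chosen uniformly,
    # so guessing the majority side succeeds with probability max(k, n - k) / n.
    return sum(comb(n, k) * max(k, n - k) / n for k in range(n + 1)) / 2 ** n

print(expected_predictability(1))   # 1.0 (a lone hunter is fully predictable)
print(expected_predictability(3))   # 0.75
print(expected_predictability(10))  # about 0.62, approaching chance (0.5)
```

The monotone decline toward 0.5 is the group-level unpredictability the abstract invokes: individual specialization persists, but the prey cannot exploit it when the attacker's identity, and hence its side, varies from strike to strike.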

Human olfaction is sensitive but poorly encoded by language. A new study comparing horticulturalists and hunter-gatherers suggests that the strength of odor language depends on lifestyle. This work may stimulate olfactory research at the crossroads between biology and culture.

Anatolia and the Near East have long been recognized as the epicenter of the Neolithic expansion through archaeological evidence. Recent archaeogenetic studies on Neolithic European human remains have shown that the Neolithic expansion in Europe was driven westward and northward by migration from a supposed Near Eastern origin [1-5]. However, this expansion and the establishment of numerous culture complexes in the Aegean and Balkans did not occur until 8,500 before present (BP), over 2,000 years after the initial settlements in the Neolithic core area [6-9]. We present ancient genome-wide sequence data from 6,700-year-old human remains excavated from a Neolithic context in Kumtepe, located in northwestern Anatolia near the well-known (and younger) site Troy [10]. Kumtepe is one of the settlements that emerged around 7,000 BP, after the initial expansion wave brought Neolithic practices to Europe. We show that this individual displays genetic similarities to the early European Neolithic gene pool and modern-day Sardinians, as well as a genetic affinity to modern-day populations from the Near East and the Caucasus. Furthermore, modern-day Anatolians carry signatures of several admixture events from different populations that have diluted this early Neolithic farmer component, explaining why modern-day Sardinian populations, instead of modern-day Anatolian populations, are genetically more similar to the people that drove the Neolithic expansion into Europe. Anatolia's central geographic location appears to have served as a connecting point, allowing a complex contact network with other areas of the Near East and Europe throughout, and after, the Neolithic.

The processes leading up to species extinctions are typically characterized by prolonged declines in population size and geographic distribution, followed by a phase in which populations are very small and may be subject to intrinsic threats, including loss of genetic diversity and inbreeding [1]. However, whether such genetic factors have had an impact on species prior to their extinction is unclear [2, 3]; examining this would require a detailed reconstruction of a species' demographic history as well as changes in genome-wide diversity leading up to its extinction. Here, we present high-quality complete genome sequences from two woolly mammoths (Mammuthus primigenius). The first mammoth was sequenced at 17.1-fold coverage and dates to approximately 4,300 years before present, representing one of the last surviving individuals on Wrangel Island. The second mammoth, sequenced at 11.2-fold coverage, was obtained from an approximately 44,800-year-old specimen from the Late Pleistocene population in northeastern Siberia. The demographic trajectories inferred from the two genomes are qualitatively similar and reveal a population bottleneck during the Middle or Early Pleistocene, and a more recent severe decline in the ancestors of the Wrangel mammoth at the end of the last glaciation. A comparison of the two genomes shows that the Wrangel mammoth has a 20% reduction in heterozygosity as well as a 28-fold increase in the fraction of the genome that comprises runs of homozygosity. We conclude that the population on Wrangel Island, which was the last surviving woolly mammoth population, was subject to reduced genetic diversity shortly before it became extinct.

While present-day taxa are valuable proxies for understanding the biology of extinct species, it is also crucial to examine physical remains in order to obtain a more comprehensive view of their behavior, social structure, and life histories [1, 2]. For example, information on demographic parameters such as age distribution and sex ratios in fossil assemblages can be used to accurately infer socioecological patterns (e.g., [3]). Here we use genomic data to determine the sex of 98 woolly mammoth (Mammuthus primigenius) specimens in order to infer social and behavioral patterns in the last 60,000 years of the species' existence. We report a significant excess of males among the identified samples (69% versus 31%; p < 0.0002). We argue that this male bias among mammoth remains is best explained by males more often being caught in natural traps that favor preservation. We hypothesize that this is a consequence of social structure in proboscideans, which is characterized by matriarchal hierarchy and sex segregation. Without the experience associated with living in a matriarchal family group, or a bachelor group with an experienced bull, young or solitary males may have been more prone to die in natural traps where good preservation is more likely.
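The reported sex bias can be checked with an exact binomial test against a 50:50 expectation. The count below (68 males out of 98 sexed specimens) is a reconstruction from the stated 69%, not a figure taken from the paper, so the resulting p-value is only expected to land near the reported p < 0.0002:

```python
from math import comb

def binomial_two_sided_p(k: int, n: int) -> float:
    """Exact two-sided binomial test p-value against a null of p = 0.5.
    Under p = 0.5 the distribution is symmetric, so the two-sided p-value
    is twice the upper-tail probability P(X >= k)."""
    upper_tail = sum(comb(n, i) for i in range(k, n + 1)) / 2 ** n
    return min(1.0, 2 * upper_tail)

# Assumed counts reconstructed from "69% versus 31%" of 98 specimens.
p = binomial_two_sided_p(68, 98)
print(f"two-sided p = {p:.2e}")
```

A deviation this large from parity is on the order of 10^-4, far below conventional significance thresholds, which is why the male excess is treated as a real demographic signal rather than sampling noise.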

Recent results on the thermal biology of unicellular fungi provide evidence that pigmentation is an ancient adaptation for harvesting solar radiation. A new model system promises novel opportunities for quantifying radiative heat transfer and improving biophysical models.

The origins and genetic affinity of the aboriginal inhabitants of the Canary Islands, commonly known as Guanches, are poorly understood. Though radiocarbon dates on archaeological remains such as charcoal, seeds, and domestic animal bones suggest that people have inhabited the islands since the 5th century BCE [1–3], it remains unclear how many times, and by whom, the islands were first settled [4, 5]. Previously published ancient DNA analyses of uniparental genetic markers have shown that the Guanches carried common North African Y chromosome markers (E-M81, E-M78, and J-M267) and mitochondrial lineages such as U6b, in addition to common Eurasian haplogroups [6–8]. These results are in agreement with some linguistic, archaeological, and anthropological data indicating an origin from a North African Berber-like population [1, 4, 9]. However, to date there are no published Guanche autosomal genomes to help elucidate and directly test this hypothesis. To resolve this, we generated the first genome-wide sequence data and mitochondrial genomes from eleven archaeological Guanche individuals originating from Gran Canaria and Tenerife. Five of the individuals (directly radiocarbon dated to a time transect spanning the 7th–11th centuries CE) yielded sufficient autosomal genome coverage (0.21× to 3.93×) for population genomic analysis. Our results show that the Guanches were genetically similar over time and that they display the greatest genetic affinity to extant Northwest Africans, strongly supporting the hypothesis of a Berber-like origin. We also estimate that the Guanches have contributed 16%–31% autosomal ancestry to modern Canary Islanders, here represented by two individuals from Gran Canaria.

In this study, we compare the genetic ancestry of individuals from two as yet genetically unstudied cultural traditions in Estonia in the context of available modern and ancient datasets: 15 from the Late Bronze Age stone-cist graves (1200-400 BC) (EstBA) and 6 from the Pre-Roman Iron Age tarand cemeteries (800/500 BC-50 AD) (EstIA). We also included 5 Pre-Roman to Roman Iron Age Ingrian (500 BC-450 AD) (IngIA) and 7 medieval Estonian (1200-1600 AD) (EstMA) individuals to build a dataset for studying the demographic history of the northern parts of the Eastern Baltic from the earliest Mesolithic layer to modern times. Our findings are consistent with EstBA receiving gene flow from regions with strong Western hunter-gatherer (WHG) affinities and EstIA from populations related to modern Siberians. The latter inference is in accordance with Y chromosome (chrY) distributions in present-day populations of the Eastern Baltic, as well as patterns of autosomal variation in the majority of the westernmost Uralic speakers [1-5]. This ancestry reached the coasts of the Baltic Sea no later than the mid-first millennium BC; i.e., in the same time window as the diversification of west Uralic (Finnic) languages [6]. Furthermore, phenotypic traits often associated with modern Northern Europeans, like light eyes, hair, and skin, as well as lactose tolerance, can be traced back to the Bronze Age in the Eastern Baltic.

The origin of domestic dogs is poorly understood [1-15], with suggested evidence of dog-like features in fossils that predate the Last Glacial Maximum [6, 9, 10, 14, 16] conflicting with genetic estimates of a more recent divergence between dogs and worldwide wolf populations [13, 15, 17-19]. Here, we present a draft genome sequence from a 35,000-year-old wolf from the Taimyr Peninsula in northern Siberia. We find that this individual belonged to a population that diverged from the common ancestor of present-day wolves and dogs very close in time to the appearance of the domestic dog lineage. We use the directly dated ancient wolf genome to recalibrate the molecular timescale of wolves and dogs and find that the mutation rate is substantially slower than assumed by most previous studies, suggesting that the ancestors of dogs were separated from present-day wolves before the Last Glacial Maximum. We also find evidence of introgression from the archaic Taimyr wolf lineage into present-day dog breeds from northeast Siberia and Greenland, contributing between 1.4% and 27.3% of their ancestry. This demonstrates that the ancestry of present-day dogs is derived from multiple regional wolf populations.
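The recalibration logic, in which a directly dated ancient genome lets the mutation rate be read off from its "branch shortening," can be sketched with placeholder numbers (all values below are hypothetical, chosen only to illustrate the arithmetic, and are not the study's estimates):

```python
# A genome from an animal that died T years ago is "missing" the substitutions
# its lineage would have accumulated since then. Dividing that branch
# shortening by the specimen's direct radiocarbon date gives a per-year
# substitution rate, which then converts genetic divergence into time.

ancient_age_years = 35_000          # direct date of the ancient wolf specimen
branch_shortening = 1.4e-5          # hypothetical missing substitutions/site

rate_per_site_per_year = branch_shortening / ancient_age_years

# Pairwise divergence accumulates along two lineages, hence the factor of 2.
dog_wolf_divergence = 3.2e-5        # hypothetical substitutions/site
split_time_years = dog_wolf_divergence / (2 * rate_per_site_per_year)

print(f"rate  = {rate_per_site_per_year:.1e} substitutions/site/year")
print(f"split = {split_time_years:,.0f} years ago")
```

The key point of the abstract follows directly from this arithmetic: if the directly calibrated rate is slower than previously assumed, the same observed dog-wolf divergence implies an older split.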

Reduced exposure to daytime sunlight and increased exposure to electrical lighting at night leads to late circadian and sleep timing [1-3]. We have previously shown that exposure to a natural summer 14 hr 40 min:9 hr 20 min light-dark cycle entrains the human circadian clock to solar time, such that the internal biological night begins near sunset and ends near sunrise [1]. Here we show that the beginning of the biological night and sleep occur earlier after a week's exposure to a natural winter 9 hr 20 min:14 hr 40 min light-dark cycle as compared to the modern electrical lighting environment. Further, we find that the human circadian clock is sensitive to seasonal changes in the natural light-dark cycle, showing an expansion of the biological night in winter compared to summer, akin to that seen in non-humans [4-8]. We also show that circadian and sleep timing occur earlier after spending a weekend camping in a summer 14 hr 39 min:9 hr 21 min natural light-dark cycle compared to a typical weekend in the modern environment. Weekend exposure to natural light was sufficient to achieve approximately 69% of the shift in circadian timing we previously reported after a week's exposure to natural light [1]. These findings provide evidence that the human circadian clock adapts to seasonal changes in the natural light-dark cycle and is timed later in the modern environment in both winter and summer. Further, we demonstrate that earlier circadian timing can be rapidly achieved through natural light exposure during a weekend spent camping.

The function of tubular epithelial organs like the kidney and lung is critically dependent on the length and diameter of their constituent branches. Genetic analysis of tube size control during Drosophila tracheal development has revealed that epithelial septate junction (SJ) components and the dynamic chitinous luminal matrix coordinate tube growth. However, the underlying molecular mechanisms controlling tube expansion have so far remained elusive. Here, we present the analysis of two luminal chitin-binding proteins with predicted polysaccharide deacetylase activities (ChLDs). ChLDs are required to assemble the cable-like extracellular matrix (ECM) and restrict tracheal tube elongation. Overexpression of native, but not of mutated, ChLD versions also interferes with the structural integrity of the intraluminal ECM and causes aberrant tube elongation. Whereas ChLD mutants have normal SJ structure and function, the luminal deposition of ChLD requires intact cellular SJs. This identifies a new molecular function for SJs in the apical secretion of ChLD and positions ChLD downstream of the SJs in tube length control. The deposition of the chitinous luminal matrix first promotes and coordinates radial tube expansion. We propose that the subsequent structural modification of chitin by chitin-binding deacetylases selectively instructs the underlying epithelium to terminate tube elongation.