
Background: Studies on autism have tended to focus either on those with intellectual disability (ie, those with intellectual quotient [IQ] under 70) or on the group that is referred to as high-functioning, that is, those with borderline, average or above average IQ. The literature on cognition and daily functioning in autism spectrum disorder combined specifically with borderline intellectual functioning (IQ 70-84) is limited. Methods: From a representative group of 208 preschool children diagnosed with autism spectrum disorder, those 50 children in the group with borderline intellectual functioning at ages 4.5-6.5 years were targeted for follow-up at a median age of 10 years. A new cognitive test was carried out in 30 children. Parents were interviewed with a semi-structured interview together with the Vineland Adaptive Behavior Scales (n=41) and the Autism-Tics, attention-deficit/hyperactivity disorder (AD/HD) and other comorbidities inventory (A-TAC) (n=36). Results: Most children of interviewed parents presented problems within several developmental areas. According to A-TAC and the clinical interview, there were high rates of attention deficits and difficulties with regulating activity level and impulsivity. Vineland Adaptive Behavior Scales composite scores showed that at school age, a majority of the children had declined since the previous assessment at ages between 4.5 and 6.5 years. Almost half the tested group had shifted in their IQ level, to below 70 or above 84. Conclusion: None of the children assessed was without developmental/neuropsychiatric problems at school-age follow-up. The results support the need for comprehensive follow-up of educational, medical and developmental/neuropsychiatric needs, including a retesting of cognitive functions. There is also a need for continuing parent/family follow-up and support.

The present study examines the effect of language experience on vocal emotion perception in a second language. Native speakers of French with varying levels of self-reported English ability were asked to identify emotions from vocal expressions produced by American actors in a forced-choice task, and to rate their pleasantness, power, alertness and intensity on continuous scales. Stimuli included emotionally expressive English speech (emotional prosody) and non-linguistic vocalizations (affect bursts), and a baseline condition with Swiss-French pseudo-speech. Results revealed effects of English ability on the recognition of emotions in English speech but not in non-linguistic vocalizations. Specifically, higher English ability was associated with less accurate identification of positive emotions, but not with the interpretation of negative emotions. Moreover, higher English ability was associated with lower ratings of pleasantness and power, again only for emotional prosody. This suggests that second language skills may sometimes interfere with emotion recognition from speech prosody, particularly for positive emotions.

Decisions were sampled from 108 participants over 8 days using a web-based diary method. Each day, participants rated experienced regret for a decision made, as well as forecasted regret for a decision to be made. Participants also indicated to what extent they used different strategies to prevent or regulate regret. Participants regretted 30% of decisions and forecasted regret in 70% of future decisions, indicating both that regret is relatively prevalent in daily decisions and that experienced regret was less frequent than forecasted regret. In addition, a number of decision-specific regulation and prevention strategies were successfully used by the participants to minimize regret and negative emotions in daily decision making. Overall, these results suggest that regulation and prevention of regret are important strategies in many of our daily decisions.
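The two headline rates can be tallied directly from diary records. A minimal sketch, assuming each entry carries boolean flags for experienced and forecasted regret (the field names and toy data are illustrative, not from the study):

```python
def regret_rates(entries):
    """Return (experienced, forecasted) regret rates over a list of diary entries."""
    n = len(entries)
    exp_rate = sum(e["experienced"] for e in entries) / n
    fore_rate = sum(e["forecasted"] for e in entries) / n
    return exp_rate, fore_rate

# toy diary of 10 entries: 3 regretted decisions, 7 with forecasted regret,
# mirroring the 30% / 70% pattern reported in the abstract
diary = [{"experienced": i < 3, "forecasted": i < 7} for i in range(10)]
print(regret_rates(diary))  # -> (0.3, 0.7)
```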

OBJECTIVE: Memory complaints are common in patients with brain tumor, but it is difficult to map memory functions during awake surgery in order to preserve them. We therefore analyzed one of the largest data sets to date on clinical, surgical, and anatomical correlates of memory in patients with brain tumor, providing anatomical hotspots for short- and long-term memory functions. METHODS: A total of 260 patients with brain tumor (130 high-grade gliomas; 76 low-grade gliomas [LGG]; 54 meningiomas) were tested on 2 commonly used short-term memory tasks (Digit Span Forward and Corsi Spatial Span) and 2 long-term memory tasks (Narrative Memory and Delayed Recall of Rey Figure). Patients were evaluated before and immediately after surgery and (for LGG) after 4 months, and the data were analyzed by means of analysis of covariance and the voxel-based lesion-symptom mapping technique. RESULTS: As expected, patients with high-grade gliomas were already impaired before surgery, whereas patients with meningioma were largely unimpaired. Patients with LGG were unimpaired before surgery but showed a significant performance drop immediately after, with good recovery within a few months. Voxel-based lesion-symptom mapping analyses identified specific anatomical correlates for the verbal memory tasks, whereas the visuospatial tasks provided good sensitivity to cognitive damage but failed to show anatomical specificity. The anatomical hotspots identified were in line with both previous functional magnetic resonance imaging studies and clinical studies of other neurological populations. CONCLUSIONS: Verbal memory tasks revealed a set of specific anatomical hotspots that might be considered eloquent for verbal memory functions, unlike visuospatial tasks, suggesting that commonly used spatial memory tasks may not be optimal for localizing damage, despite an otherwise good sensitivity to cognitive damage.

Objective: The aim of this work is to provide an in-depth investigation of the impact of low-grade gliomas (LGG) and their surgery on patients' cognitive and emotional functioning and well-being, carried out via a comprehensive and multiple-measure psychological and neuropsychological assessment.

Patients and Methods: Fifty surgically treated patients with LGG were evaluated 40 months after surgery on their functioning over 6 different cognitive domains, 3 core affective/emotional aspects, and 3 different psychological well-being measures to obtain a clearer picture of the long-term impact of illness and surgery on their psychological and relational world. Close relatives were also involved to obtain an independent measure of the psychological dimensions investigated.

Results: Cognitive status was satisfactory, with only mild short-term memory difficulties. The affective and well-being profile was characterized by mild signs of depression, good satisfaction with life and psychological well-being, and good personality development, with patients perceiving themselves as stronger and better persons after illness. However, patients showed higher emotional reactivity, and psychological well-being measures were negatively affected by epileptic burden. Well-being was related to positive affective/emotional functioning and unrelated to cognitive functioning. Good agreement between patients and relatives was found.

Conclusions: In the long-term, patients operated on for LGG showed good cognitive functioning, with no significant long-term cognitive sequelae for the extensive surgical approach. Psychologically, patients appear to experience a deep psychological change and maturation, closely resembling that of so-called posttraumatic growth, which, to our knowledge, is for the first time described and quantified in patients with LGG.

Objective: To analyze whether the frequency of autism spectrum disorder (ASD) in a cohort of Swedish children differs between those exposed to ultrasound in the 12th week and those exposed to ultrasound in the 18th week of gestation.

Methods: The study cohort consisted of approximately 30 000 children born between 1999 and 2003 to mothers who had been randomized to a prenatal ultrasound examination at either 12 or 18 weeks' gestation as part of the framework for a study on nuchal translucency screening. The outcome measure in the present study was the rate of ASD diagnoses among the children. Information on ASD diagnoses was based on data from the Swedish social insurance agency concerning childcare allowance granted for ASD.

Results: Between 1999 and 2003, a total of 14 726 children were born to women who underwent a 12-week ultrasound examination and 14 596 to women who underwent an 18-week ultrasound examination. Of these, 181 (1.2%) and 176 (1.2%) children, respectively, had been diagnosed with ASD. There was no difference in ASD frequency between the early and late ultrasound groups.
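As a sanity check, the reported percentages and the absence of a group difference can be reproduced from the counts in the abstract. The sketch below uses a standard pooled two-proportion z-test; the choice of test is ours for illustration, not stated in the paper:

```python
from math import erf, sqrt

# counts reported in the abstract
n1, x1 = 14726, 181   # 12-week ultrasound group
n2, x2 = 14596, 176   # 18-week ultrasound group

p1, p2 = x1 / n1, x2 / n2
print(round(100 * p1, 1), round(100 * p2, 1))  # both round to 1.2 (%)

# pooled two-proportion z-test for the difference between the groups
p = (x1 + x2) / (n1 + n2)
z = (p1 - p2) / sqrt(p * (1 - p) * (1 / n1 + 1 / n2))
p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))  # two-sided normal tail
```

With these counts z is close to zero and the p-value is far above any conventional threshold, consistent with the reported null result.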

Conclusions: Women subjected to at least one prenatal ultrasound examination at either 12 or 18 weeks' gestation had children with similar rates of ASD. However, this result reflects routine care 10-15 years ago in Sweden. Today, higher-intensity ultrasound scans are performed more frequently, at earlier stages during pregnancy, and for non-medical purposes, implying longer exposure time for the fetus. This change in the use of ultrasound necessitates further follow-up study of the possible effects that high exposure to ultrasound during the gestational period has on the developing brain.

The development of tau-specific positron emission tomography (PET) tracers allows in vivo imaging of the regional load of tau pathology in Alzheimer's disease (AD) and other tauopathies. Eighteen patients with baseline investigations were enrolled in a 17-month follow-up study, including 16 with AD (10 had mild cognitive impairment and a positive amyloid PET scan, that is, prodromal AD, and 6 had AD dementia) and 2 with corticobasal syndrome. All patients underwent PET scans with [F-18]THK5317 (tau deposition) and [F-18]FDG (glucose metabolism) at baseline and follow-up, neuropsychological assessment at baseline and follow-up, and a scan with [C-11]PIB (amyloid-beta deposition) at baseline only. At the group level, patients with AD (prodromal or dementia) showed unchanged [F-18]THK5317 retention over time, in contrast to significant decreases in [F-18]FDG uptake in temporoparietal areas. The pattern of changes in [F-18]THK5317 retention was heterogeneous across patients, with qualitative differences both between the two AD groups (prodromal and dementia) and among individual patients. High [F-18]THK5317 retention was significantly associated over time with low episodic memory encoding scores, while low [F-18]FDG uptake was significantly associated over time with both low global cognition and low episodic memory encoding scores. Both patients with corticobasal syndrome had a negative [C-11]PIB scan, high [F-18]THK5317 retention with a regional distribution different from that in AD, and a homogeneous pattern of increased [F-18]THK5317 retention in the basal ganglia over time. These findings highlight the heterogeneous propagation of tau pathology among patients with symptomatic AD, in contrast to the homogeneous changes seen in glucose metabolism, which better tracked clinical progression.

Several tau PET tracers have been developed, but it remains unclear whether they bind to the same molecular target on the heterogeneous tau pathology. In this study we evaluated the binding of two chemically different tau-specific PET tracers (C-11-THK5351 and C-11-PBB3) in a head-to-head, in vivo, multimodal design. Nine patients with a diagnosis of mild cognitive impairment or probable Alzheimer's disease and cerebrospinal fluid biomarker evidence supportive of the presence of Alzheimer's disease brain pathology were recruited after thorough clinical assessment. All patients underwent imaging with the tau-specific PET tracers C-11-THK5351 and C-11-PBB3 on the same day, as well as imaging with the amyloid-beta-specific tracer C-11-AZD2184, a T1-MRI sequence, and neuropsychological assessment. The load and regional distribution of binding differed between C-11-THK5351 and C-11-PBB3 with no statistically significant regional correlations observed between the tracers. The binding pattern of C-11-PBB3, but not that of C-11-THK5351, in the temporal lobe resembled that of C-11-AZD2184, with strong correlations detected between C-11-PBB3 and C-11-AZD2184 in the temporal and occipital lobes. Global cognition correlated more closely with C-11-THK5351 than with C-11-PBB3 binding. Similarly, cerebrospinal fluid tau measures and entorhinal cortex thickness were more closely correlated with C-11-THK5351 than with C-11-PBB3 binding. This research suggests different molecular targets for these tracers; while C-11-PBB3 appeared to preferentially bind to tau deposits with a close spatial relationship to amyloid-beta, the binding pattern of C-11-THK5351 fitted the expected distribution of tau pathology in Alzheimer's disease better and was more closely related to downstream disease markers.

Strong negative reactions, physical symptoms, and behavioral disruptions due to environmental odors are common in the adult population. We investigated relationships among such environmental chemosensory responsivity (CR), personality traits, affective states, and odor perception. Study 1 showed that CR and neuroticism were positively correlated in a sample of young adults (n = 101), suggesting that persons high in neuroticism respond more negatively to environmental odors. Study 2 explored the relationships among CR, noise responsivity (NR), neuroticism, and odor perception (i.e., pleasantness and intensity) in a subset of participants (n = 40). High CR was associated with high NR. Regression analyses indicated that high CR predicted higher odor intensity ratings and low olfactory threshold (high sensitivity) predicted lower pleasantness ratings. However, neuroticism was not directly associated with odor ratings or thresholds. Overall, the results suggest that CR and odor thresholds predict perceptual ratings of odors and that high CR is associated with nonchemosensory affective traits.

Few studies have investigated long-term odor recognition memory, although some early observations suggested that the forgetting rate for olfactory representations is slower than for other sensory modalities. This study investigated recognition memory across 64 days for high- and low-familiarity odors and faces. Memory was assessed in 83 young participants on 4 occasions: immediately after encoding and 4, 16, and 64 days later. The results indicated significant forgetting for both odors and faces across the 64 days. The forgetting functions for the 2 modalities were not fundamentally different. Moreover, highly familiar odors and faces were better remembered than less familiar ones, indicating an important role of semantic knowledge in recognition proficiency for both modalities. Although odor recognition was significantly better than chance at the 64-day test, memory for the low-familiarity odors was relatively poor. Also, the results indicated that odor identification consistency across sessions, irrespective of accuracy, was positively related to successful recognition.

In normal aging, people are confronted with impairment in both socioemotional and cognitive abilities. Specifically, there are age-related declines in the inhibitory processes that regulate attention towards irrelevant material. In recent years, the intranasal administration of the neuropeptide oxytocin has mainly been related to improvements in several domains, such as emotion recognition and memory, but to date the effects of oxytocin in aging remain largely unknown. In a randomized, double-blind, placebo-controlled, within-subjects study design, we investigated whether oxytocin facilitates inhibitory processing in older adults compared to younger adults. In total, 41 older adults (51% women; age range 65-75 years) and 37 younger adults (49% women; age range 20-30 years) participated in this study two times, receiving a single intranasal dose of 40 IU of placebo and oxytocin in randomized order 45 minutes before engaging in the task. Participants were tested approximately a month apart and mostly at the same hour on both occasions. Inhibition was measured with a Go/NoGo task which included happy and neutral faces as targets (Go stimuli) and distractors (NoGo stimuli) shown on a computer screen. Participants were instructed to press a button any time they saw a target and to remain passive when encountering a distractor. Preliminary results indicate effects for happy and neutral faces, but only in the distractor condition. For happy distractors, women rejected happy faces more accurately than men did, in both the placebo and oxytocin conditions. A main effect of age was observed for the neutral distractors, where older adults were more successful in inhibiting responses than younger adults during both oxytocin and placebo treatments. We did not observe clear effects of oxytocin in any of the tasks.
In sum, our findings show that age and gender can influence inhibition, but their effects depend on the displayed emotions. This suggests that the ability to inhibit interfering distractors may remain intact regardless of age and that deficits in inhibition may be selective. The role of oxytocin in inhibition needs further investigation, since its effects may be context dependent.

People constantly evaluate faces to obtain social information. However, the link between aging and the social evaluation of faces is not well understood. Todorov and colleagues introduced a data-driven model with valence and dominance as the two main components underlying social judgments of faces. They also created a stimulus set of computer-generated faces that vary systematically along various social dimensions (e.g., Todorov et al., 2013, Emotion, 13, 724-38). We utilized a selection of these facial stimuli to investigate age-related differences in judgments of the following dimensions: attractiveness, competence, dominance, extraversion, likeability, threat, and trustworthiness. Participants rated how well the faces represented the intended social dimensions on 9-point scales ranging from not at all to extremely well. Results from 71 younger (YA; mean age = 23.42 years) and 60 older adults (OA; mean age = 69.19 years) showed that OA evaluated untrustworthy faces as more trustworthy, dislikeable faces as more likeable, and unattractive faces as more attractive compared to YA. OA also evaluated attractive faces as more attractive compared to YA, whereas YA rated likeable and trustworthy faces as more likeable and trustworthy than did OA. In summary, our findings showed that OA evaluated negative social features less negatively compared to YA. This suggests that older and younger persons may use different cues for the social evaluation of faces, and is in line with prior research suggesting an age-related decline in the ability to recognize negative emotion expressions.

We investigated how memory for faces and voices (presented separately and in combination) varies as a function of sex and emotional expression (anger, disgust, fear, happiness, sadness, and neutral). At encoding, participants judged the expressed emotion of items in forced-choice tasks, followed by incidental Remember/Know recognition tasks. Results from 600 participants showed that accuracy (hits minus false alarms) was consistently higher for neutral compared to emotional items, whereas accuracy for specific emotions varied across the presentation modalities (i.e., faces, voices, and face-voice combinations). For the subjective sense of recollection (“remember” hits), neutral items received the highest hit rates only for faces, whereas for voices and face-voice combinations anger and fear expressions instead received the highest recollection rates. We also observed better accuracy for items by female expressers, and an own-sex bias whereby female participants displayed a memory advantage for female faces and face-voice combinations. Results further suggest that this own-sex bias can be explained by recollection, rather than familiarity, rates. Overall, the results show that memory for faces and voices may be influenced by the expressions they carry, as well as by the sex of both items and participants. Emotion expressions may also enhance the subjective sense of recollection without enhancing memory accuracy.
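The accuracy measure used here (hits minus false alarms) is the standard corrected recognition score. A minimal sketch of the computation; the item counts are illustrative, not from the study:

```python
def corrected_recognition(hits, n_old, false_alarms, n_new):
    """Recognition accuracy as hit rate minus false-alarm rate."""
    return hits / n_old - false_alarms / n_new

# toy example: 30 hits out of 40 old items, 8 false alarms out of 40 new items
acc = corrected_recognition(30, 40, 8, 40)
print(acc)  # -> 0.55
```

Subtracting the false-alarm rate corrects for response bias: a participant who says "old" to everything scores 1.0 on hits but 0.0 on the corrected score.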

In recent years, the intranasal administration of the neuropeptide oxytocin has mainly been related to improvements in domains such as emotion recognition and memory, but to date the effects of oxytocin in aging remain largely unknown. A major caveat in oxytocin research is that it is almost exclusively based on young men, which may give an inadequate picture of the potential benefits of oxytocin administration. In a randomized, double-blind, placebo-controlled, within-subjects study design, we investigated whether oxytocin affects the recognition of positive and negative stimuli differently in younger and older adults. Forty-four older adults (50% women; mean age = 69.82 years) and 44 younger adults (50% women; mean age = 24.75 years) participated in this study two times, receiving a single intranasal dose of 40 IU of placebo and oxytocin in randomized order 40 minutes before engaging in the task. Participants watched short video clips in which actors displayed nine emotions: neutrality, happiness, pride, interest, relief, anger, despair, sadness, and disgust. Preliminary results indicate oxytocin-induced reductions in the response to negative emotions in bilateral fusiform gyrus (Z > 4.16, family-wise error corrected, pFWE < 0.009), hippocampus (Z > 4.53, pFWE < 0.002), insula (Z > 3.69, pFWE < 0.045), and superior temporal gyrus (Z > 4.34, pFWE < 0.008), as well as right-lateralized reductions in the amygdala (Z = 3.73, pFWE = 0.005). These findings are in line with previous studies showing decreased brain activity to negative stimuli and suggest that this mechanism is not only present in younger adults but also extends to an older population. Future studies should investigate how oxytocin impacts socioemotional and cognitive processes in older adults.

Intranasal oxytocin (OT) has previously been found to increase spirituality, an effect moderated by OT-related genotypes. This pre-registered study sought to conceptually replicate and extend those findings. Using a single dose of intranasal OT vs placebo (PL), we investigated experimental treatment effects, and moderation by OT-related genotypes on spirituality, mystical experiences, and the sensed presence of a sentient being. A more exploratory aim was to test for interactions between treatment and the personality disposition absorption on these spirituality-related outcomes. A priming plus sensory deprivation procedure that has facilitated spiritual experiences in previous studies was used. The sample (N = 116) contained both sexes and was drawn from a relatively secular context. Results failed to conceptually replicate both the main effects of treatment and the treatment by genotype interactions on spirituality. Similarly, there were no such effects on mystical experiences or sensed presence. However, the data suggested an interaction between treatment and absorption. Relative to PL, OT seemed to enhance spiritual experiences in participants scoring low in absorption and dampen spirituality in participants scoring high in absorption.
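The treatment-by-absorption pattern described above is a moderation (interaction) effect. The sketch below shows how such an interaction is estimated in an ordinary regression, on simulated data with an assumed crossover effect; effect sizes, noise level, and variable names are our assumptions, not the study's:

```python
import numpy as np

rng = np.random.default_rng(7)
n = 116  # sample size in the abstract; the data below are simulated
treatment = rng.integers(0, 2, n)        # 0 = placebo (PL), 1 = oxytocin (OT)
absorption = rng.normal(0.0, 1.0, n)     # standardized absorption score

# assumed crossover pattern: OT raises spirituality at low absorption and
# dampens it at high absorption -> negative treatment x absorption coefficient
spirituality = (0.3 * treatment
                - 0.8 * treatment * absorption
                + rng.normal(0.0, 1.0, n))

# regression with an interaction term recovers the moderation
X = np.column_stack([np.ones(n), treatment, absorption, treatment * absorption])
beta = np.linalg.lstsq(X, spirituality, rcond=None)[0]
interaction = beta[3]  # estimated treatment x absorption effect (negative here)
```

A negative interaction coefficient is exactly the "enhance at low absorption, dampen at high absorption" pattern the abstract reports.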

We present two individual-differences investigations, carried out with the aim of identifying the memory correlates of decision-making skills. The investigations were carried out on population-based Swedish samples between 25 and 80 years of age (n > 500). Study 1 showed selective relations between memory processes (i.e., semantic, episodic, and working memory) and diverse aspects of decision-making competence as measured with the A-DMC battery. The age-related declines observed in the more cognitively-demanding decision-making tasks were mediated by the age-related differences in working memory or episodic memory. Study 2 confirmed the findings even when controlling for the influence of processing speed and sensory functioning. Overall, the results showed that different memory processes fulfill different functional roles in diverse judgment and decision-making tasks.

18. Del Missier, Fabio, et al. Stockholm University, Faculty of Social Sciences, Department of Psychology, Cognitive psychology; University of Trieste, Italy.

Age-related differences in sensory functioning, processing speed, and working memory have been identified as three significant predictors of the age-related performance decline observed in complex cognitive tasks. Yet, the assessment of their relative predictive capacity and interrelations is still an open issue in decision making and cognitive aging research. Indeed, no previous investigation has examined the relationships of all three of these predictors with decision making. In an individual-differences study, we therefore disentangled the relative contribution of sensory functioning, processing speed, and working memory to the prediction of the age-related decline in cognitively demanding judgment and decision-making tasks. Structural equation modeling showed that the age-related decline in working memory plays an important predictive role, even when controlling for sensory functioning, processing speed, and education. Implications for research on decision making and cognitive aging are discussed.

Age-related decline in complex cognitive tasks has been explained by changes in sensory functioning, processing speed, and working memory. However, there is still no agreement on the relative importance of these factors, and their relative role in decision making has not been investigated. In an individual-difference study on a population-based Swedish sample of adults (N = 563, age range 30-89), we disentangled the contribution of sensory decline, processing speed, and working memory measures to age-related changes in three cognitively-demanding decision-making tasks of the Adult Decision-Making Competence Battery (Resistance to Framing, Applying Decision Rules, Under/Overconfidence). Structural equation modeling showed that working memory is a significant predictor even when the influence of sensory variables, processing speed, and education (as a control for cohort effects) is taken into account. Moreover, the effects of sensory functioning and processing speed on decision making were mediated by working memory. These findings indicate that the age-related decline in complex decision-making tasks may not be entirely explained by changes in lower-level processes, highlighting the functional role of working memory processes.
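The mediation logic here (effects of sensory functioning and speed flowing through working memory) rests on the classic decomposition of a total effect into direct and indirect parts. A minimal single-mediator sketch on simulated data, assuming plausible effect directions; all coefficients are illustrative and the full structural model in the study is richer than this:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 563  # sample size in the abstract; the data below are simulated
age = rng.uniform(30, 89, n)
wm = -0.05 * age + rng.normal(0.0, 1.0, n)              # assumed: age lowers WM
dm = 0.8 * wm - 0.002 * age + rng.normal(0.0, 1.0, n)   # assumed: WM drives DM

def ols(y, *preds):
    """Return the regression coefficients (after the intercept) of y on preds."""
    X = np.column_stack((np.ones(len(y)),) + preds)
    return np.linalg.lstsq(X, y, rcond=None)[0][1:]

(c,) = ols(dm, age)            # total effect of age on decision making
(a,) = ols(wm, age)            # path a: age -> working memory
c_prime, b = ols(dm, age, wm)  # direct effect c' and path b from the joint model
indirect = a * b               # mediated (indirect) effect of age via WM
# with a single mediator the OLS decomposition is exact: c = c' + a*b
assert abs(c - (c_prime + indirect)) < 1e-8
```

The study's claim corresponds to a large `indirect` component relative to `c_prime`: most of the age effect travels through the mediator.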

20. Del Missier, Fabio, et al. Stockholm University, Faculty of Social Sciences, Department of Psychology, Cognitive psychology; Leeds University Business School, UK; University of Trieste, Italy.

Little is known about the psychological mechanisms underlying judgments of perceived inflation as empirical evidence is sparse. In two studies, we investigated two factors that are expected to play a significant role in global judgments of perceived inflation: product accessibility and attitudes towards inflation. In Study 1 (N = 253), primed participants retrieved five products whose prices had increased (or decreased) in the past year before expressing a judgment of past inflation (versus non-primed participants with no retrieval task). In Study 2 (N = 101) participants were merely exposed to a series of products, and asked to estimate their frequency of purchase, before judging past inflation. In one condition, the prices of the majority of products had actually increased in the last year, while in another condition they had decreased. In both studies, attitudes towards inflation were also measured. Product priming consistently affected inflation judgments in the direction of an assimilation effect. Also, more negative attitudes towards inflation were associated with higher judgments of perceived inflation. Path analysis confirmed that both product accessibility and attitudes are potential bases for judgments of perceived inflation. These findings suggest that multiple psychological influences may underlie global judgments of perceived inflation.

Although a number of theoretical accounts of proactive interference (PI) in episodic memory have been proposed, existing empirical evidence does not support conclusively a single view yet. In two experiments we tested the predictions of the temporal discrimination theory of PI against alternative accounts by manipulating the presentation schedule of study materials (lists blocked by category vs. interleaved). In line with the temporal discrimination theory, we observed a clear buildup of (and release from) PI in the blocked condition, in which all the lists of the same category were presented sequentially. In the interleaved condition, with alternating lists of different categories, a more gradual and smoother buildup of PI was observed. When participants were left free to choose their presentation schedule, they spontaneously adopted an interleaved schedule, resulting again in more gradual PI. After longer delays, we observed recency effects at the list level in overall recall and, in the blocked condition, PI-related effects. The overall pattern of findings agrees with the predictions of the temporal discrimination theory of PI, complemented with categorical processing of list items, but not with alternative accounts, shedding light on the dynamics and underpinnings of PI under diverse presentation schedules and over different time scales.

This study assessed olfactory performance using the Scandinavian Odor-Identification Test (SOIT) and self-reported olfactory function; several social, cognitive, and medical risk factors at baseline; and incident dementia during the following decade.

Studies have shown that persons with schizophrenia have lower accuracy in emotion recognition compared to persons without schizophrenia. However, the impact of the complexity level of the stimuli or of the modality of presentation has not been extensively addressed. Forty-three persons with a diagnosis of schizophrenia and 43 healthy controls, matched for age and gender, were administered tests assessing emotion recognition from stimuli with low and high levels of complexity presented via visual, auditory, and semantic channels. For both groups, recognition rates were higher for high-complexity stimuli than for low-complexity stimuli. Additionally, both groups obtained higher recognition rates for visual and semantic stimuli than for auditory stimuli, but persons with schizophrenia obtained lower accuracy than controls for all presentation modalities. Persons diagnosed with schizophrenia thus did not present a complexity-specific or modality-specific deficit compared to healthy controls. The results suggest that emotion recognition deficits in schizophrenia extend beyond stimulus complexity and presentation modality and reflect a global difficulty in cognitive functioning.

We used an alternate age variable, functional biological age (fBioAge), which was based on performance on functional body measures. The aim was to examine the development of fBioAge across the adult life span, and to examine potential gender differences and genetic and environmental influences on change with age. We used longitudinal data (n = 740; chronological age (ChronAge) range 45-85 at baseline) from the Swedish Adoption/Twin Study of Aging. The rate of increase in fBioAge was twice as fast after ChronAge 75 as before. fBioAge was higher in women than in men. fBioAge was fairly equally influenced by genetic and environmental factors. Whereas the rate of ChronAge cannot vary across time, gender, or individual, our analyses demonstrate that fBioAge does capture these within- and between-individual differences in aging, providing advantages for fBioAge in the study of aging effects.

Identification of novel candidate genes for schizophrenia (SZ) and bipolar disorder (BP), two psychiatric disorders with large epidemiological impacts, is a key research area in neurosciences and psychiatric genetics. Previous evidence from genome-wide studies suggests an important role for genes involved in synaptic plasticity in the risk for SZ and BP. We used a convergent genomics approach, combining different lines of biological evidence, to identify genes involved in the cAMP/PKA/CREB functional pathway that could be novel candidates for BP and SZ: CREB1, CREM, GRIN2C, NPY2R, NF1, PPP3CB and PRKAR1A. These 7 genes were analyzed in a HapMap based association study comprising 48 common SNPs in 486 SZ, 351 BP patients and 514 control individuals recruited from an isolated population in Northern Sweden. Genetic analysis showed significant allelic associations of SNPs in PRKAR1A with SZ and of PPP3CB and PRKAR1A with BP. Our results highlight the feasibility and the importance of convergent genomic data analysis for the identification of candidate genes and our data provide support for the role of common inherited variants in synaptic genes and their involvement in the etiology of BP and SZ.

Objective: Physicians tend to demonstrate inappropriate behavior when it comes to taking care of their own health. Self-prescribing or self-treatment seems to be practiced in many countries, and self-treated illnesses are found to be more common among general practitioners. For physicians, such behavior is a threat to their own health, and as a consequence their patients might not receive optimal health care. The purpose of this study is to examine the relationship between help-seeking behavior, sickness presenteeism, exhaustion, and self-treatment among general practitioners.

Method: This cross-sectional study was conducted in 2013 among GPs employed in one City Council in Sweden using a questionnaire on health and work factors. The criterion variable “To self-diagnose and self-treat” was measured with a single item from the Physician Career Path Questionnaire (PCPQ; Fridner, 2004). Exhaustion was measured with a scale from the Oldenburg Burnout Inventory, OLBI (α = .82; Demerouti et al., 2001, 2003). “Sickness presenteeism” and “Taking vacation due to stress” were measured with single items, also from the PCPQ (Fridner, 2004). For the analyses, we used hierarchical multiple regression.

Results: Altogether 193 (63.9%) female GPs and 109 (36.1%) male GPs answered the questionnaire, a 44% response rate. Among them, 46.2% stated that they had diagnosed and treated themselves for a condition for which they would have referred a patient to a specialist. Our regression model revealed that physicians who self-treated were also significantly more often sickness present at work. In addition, exhaustion among the GPs was also included in the model.

Conclusions: This study shows that self-treatment is not an isolated behavior but occurs together with exhaustion and sickness presenteeism, indicating a potentially serious health situation that should be assessed by doctors other than the physicians themselves. This needs to be further investigated and taken into account by the National Board of Health and Welfare, county councils and medical associations, and, for future physicians, by our medical schools.

This study investigated how accumulating gains and losses, described as annual interest rates, influenced investment behavior. Investments after gains were on average greater than after losses, regardless of the gain and loss interest rates. However, greater variance of interest rates gave some weight to that variable for gains but not for losses. We also analyzed the influence of different information cues on each participant’s investments. This revealed that interest rates influenced participants very differently: some invested more with increasing gains, or with increasing losses, while others invested less. This finding explained why interest rate was a weak predictor at the group level. Furthermore, our individual analyses showed an increased sensitivity to interest rates and judged future asset accumulations when the interest rate variance was greater. Finally, subjective reports of the importance of different cues for the participants’ own investments showed only limited understanding of the cues’ influence on the investments.

The general aim of this thesis is to contribute to the understanding of how numerical information, such as asset values and interest rates, influences inexperienced investors in their investment decisions. In relation to this, I have investigated the participants’ own understanding of what information they rely on for their own decisions. I have also investigated how their willingness to wait for greater rewards is related to their investment decisions. Importantly, I have distinguished between average behavior (group behavior) and individual behavior in an attempt to better describe how different information is important for different individual investors.

On the group level the only reliable predictor of investment size was whether there was a gain or a loss during the period before the investment. However, how large the gain or loss was had no, or very limited, influence on investment size. When looking at each investor’s individual decisions, it was revealed that a substantial number of participants actually did rely on information other than only the gain/loss information, for example, the interest rates of forecasted developments of the different investment prospects. Furthermore, a substantial number of participants relied heavily on one of the cues; at least 50% of their investments were explained by the cue relied upon.

Interestingly, very few participants’ investments were influenced by their own judgments of future asset outcomes. Furthermore, the participants’ willingness to invest in funds with guaranteed gains was used as a proxy for time preference (willingness to wait for greater rewards instead of accepting lesser rewards in the present). Time preference was relevant for investments but it did not relate to judged asset outcomes. This indicates that people may be more influenced by their future-oriented preferences rather than by their future-oriented beliefs (judgments).

To conclude, these findings suggest that people use a preference-driven simplified strategy for investments and that these strategies differ substantially between individuals. This corroborates the idea of heuristic thinking, meaning that people simplify their decisions in ways that can deviate from normative value-maximizing behavior. For practical application, it is important to note the variety of strategies among individuals. This variety suggests that there is no “one size fits all” solution regarding instructions that can be given to inexperienced investors. The participants’ very limited insight into what information they relied upon is a reason for researchers and advisors to understand the individuality in strategies in greater depth.

According to Prospect theory, we judge and decide in relation to a reference point. Furthermore, it has been found that we perceive amounts differently depending on whether people are asked about percentages or actual amounts of currency. Therefore, in this study, the effects of response format (amount of SEK or percentage of assets) on long-term investment decisions were investigated. I also investigated the relation between investments and subjective judgments of asset accumulations, as well as time preference (the willingness to wait for greater rewards). Average investments were greater following gains compared to losses, but there was no statistically significant effect of response format. The gain/loss factor was the best predictor of average investments, independent of gain/loss size. Judgments of accumulated assets were weakly related to investments and time preference, but time preference was closely related to investments. I also wanted to know how participants used the information in the problems. Therefore, how important different kinds of information were for each individual participant’s investments was analyzed. This revealed that it was more common in the currency condition, compared to the percentage condition, to rely heavily on forecasted future interest rates, but also to ignore this information completely. In conclusion, information processing is very diverse, and how people are asked to invest can change what information they focus on or ignore.

Background: It is common for physicians to go to work while sick, and it is therefore important to understand the reasons behind this behavior. Previous research has shown that women and men differ in health and health-related behavior. In this study, we examine gender differences among general practitioners who work while sick.

Methods: General practitioners (GPs) working in outpatient care in a Swedish city participated in the study (n = 283; women = 63%; response rate = 41%). Data were obtained from a large web-based questionnaire about health and organization within primary care. Two questions about sickness presenteeism (going to work while sick) were included, covering lifetime and the past 12 months, along with five questions about reasons. We controlled for general health, work-family conflict and demographic variables.

Results: Women reported reasons related to “concern for others” and “workload” more strongly than men. Men reported reasons related to “capacity” and “money” more strongly than women. These differences are likely effects of gender stereotyping and differing family responsibilities.

Conclusions: Gender socialization and gender stereotypes may influence work- and health-related behavior. Because sickness presenteeism is associated with negative effects at both the individual and organizational levels, it is important that managers of health organizations understand the reasons for it, and how gender roles may influence the prevalence of sickness presenteeism and the reasons that female and male GPs give for their behavior.

Background: In clinical practice, efficient and valid functional markers are needed to detect subtle cognitive and functional decline in mild cognitive impairment (MCI). This prospective study explored whether changes in perceived challenge of certain everyday technologies (ETs) can be used to detect signs of functional change in MCI.

Methods: Baseline and five-year data from 37 older adults (mean age 67.5 years) with MCI regarding their perceived ability to use ET were used to generate Rasch-based ET item measures reflecting the relative challenge of 46 ETs. Actual differential item functioning in relation to time was analyzed based on these item measures. Data collection took place in 2008-2014.

Results: Seven (15%) of the ETs included were perceived to be significantly more challenging to use at year five than at baseline, while 39 ETs (85%) were perceived to be equally challenging to use, despite the fact that the participants' perceived ability to use ET had decreased. Common characteristics among the ETs that became more challenging to use could not be identified. The dropout rate was 43%, which limits the power of the study.

Conclusions: Changes in the perceived challenge of ETs seem to capture functional change in persons with cognitive decline. Both easier and more challenging ETs typically used at home and in society need to be addressed to capture this functional change because significant changes occurred among ETs of all challenge levels and within all types of ETs.

The present study investigates the effect of the brain-derived neurotrophic factor (BDNF) val66met polymorphism on change in olfactory function in a large-scale, longitudinal population-based sample (n = 836). The subjects were tested on a 13-item forced-choice odor identification test on two test occasions over a 5-year interval. Sex, education, health-related factors, and semantic ability were controlled for in the statistical analyses. Results showed an interaction effect of age and BDNF val66met on olfactory change, such that the magnitude of olfactory decline in the older age cohort (70–90 years old at baseline) was larger for the val homozygote carriers than for the met carriers. The older met carriers did not display larger age-related decline in olfactory function compared to the younger group. The BDNF val66met polymorphism did not affect the rate of decline in the younger age cohort (45–65 years). The findings are discussed in the light of the proposed roles of BDNF in neural development and maintenance.

The purpose was to explore interview data from young adults with long-standing pain about their experience of contacts with caregivers in a primary care setting, in order to synthesize and qualitatively analyse their reports of how they were received. Method: An emergent qualitative design was used. Open thematic research interviews were conducted with 11 young people (1 man, 10 women; aged 20–31 years) with long-term pain. The interviews were recorded, transcribed verbatim and analysed using inductive thematic content analysis. Result: The analyses resulted in three themes: distrust experienced from care staff, loneliness, and hopelessness, which together formed the main theme Young adult with long-term pain. The informants described how they struggled to live with the pain and fought with the care system to obtain help. They reportedly felt that they were not trusted and that they were given no explanation or information about why the pain spread and worsened. This left them feeling abandoned, alone and without hope concerning their pain and their feelings, and with doubts concerning their prospects. Much concern and doubt were expressed about their future work situation: whether they would be able to do the work for which they had trained, and whether they would ever get any career opportunities. Conclusion: Living with long-term pain as a young adult and experiencing mistrust in care might lead to feelings of loneliness, dependence and hopelessness, and an existence marked by suffering and dependence. The experienced mistrust confined the young adults instead of allowing growth towards an adult identity and opportunities.

Objectives: Insufficient sleep has been associated with impaired recognition of facial emotions. However, previous studies have found inconsistent results, potentially stemming from the type of static picture task used. We therefore examined whether insufficient sleep was associated with decreased emotion recognition ability in two separate studies using a dynamic multimodal task.

Methods: Study 1 used a cross-sectional design consisting of 291 participants with questionnaire measures assessing sleep duration and self-reported sleep quality for the previous night. Study 2 used an experimental design involving 181 participants where individuals were quasi-randomized into either a sleep-deprivation (N = 90) or a sleep-control (N = 91) condition. All participants from both studies were tested on the same forced-choice multimodal test of emotion recognition to assess the accuracy of emotion categorization.

Conclusions: The studies presented here involve considerably larger samples than previous studies and the results support the null hypotheses. Therefore, we suggest that the ability to accurately categorize the emotions of others is not associated with short-term sleep duration or sleep quality and is resilient to acute periods of insufficient sleep.

Previous studies have highlighted a deficit in facial emotion recognition after sleep loss. However, while some studies suggest an overall deficit in ability, others have found effects only for individual emotions, or no effect at all. The aim of this study was to investigate this relationship in a large sample and to utilise a dynamic test of emotion recognition in multiple modalities. 145 individuals (91 female, ages 18–45) participated in a sleep-deprivation experiment. Participants were randomised into one of two conditions: one night of total sleep deprivation (TSD) or normal sleep (8–9 h in bed). The following day participants completed a computerised emotion recognition test, consisting of 72 visual, audio, and audio-visual clips representing 12 different emotions. The stimuli were divided into “easy” and “hard” depending on the intensity of the emotional display. A mixed ANOVA revealed significant main effects of modality and difficulty, P < 0.001, but no main effect of condition, P = 0.31, on emotion recognition accuracy. Additionally, there was no interaction between condition and difficulty, P = 0.96, or modality, P = 0.67. This study indicates that sleep deprivation does not reduce the ability to recognise emotions. Given that some studies have found effects only for single emotions, it is possible that the effects of sleep loss are more specific than investigated here. However, it is also possible that previous findings relate to the types of static stimuli used. The ability to recognise emotions is key to social perception; this study suggests that this ability is resilient to one night of sleep deprivation.

The ability to correctly understand the emotional expression of another person is essential for social relationships and appears to be a partly inherited trait. The neuropeptides oxytocin and vasopressin have been shown to influence this ability as well as face processing in humans. Here, recognition of the emotional content of faces and voices, separately and combined, was investigated in 492 subjects, genotyped for 25 single nucleotide polymorphisms (SNPs) in eight genes encoding proteins important for oxytocin and vasopressin neurotransmission. The SNP rs4778599 in the gene encoding aryl hydrocarbon receptor nuclear translocator 2 (ARNT2), a transcription factor that participates in the development of hypothalamic oxytocin and vasopressin neurons, showed an association that survived correction for multiple testing with emotion recognition of audio–visual stimuli in women (n = 309). This study demonstrates evidence for an association that further expands previous findings of oxytocin and vasopressin involvement in emotion recognition.

Aim: This study investigated the results from the national, routine 18-month developmental surveillance at Child Healthcare Centres (CHCs) on children later diagnosed with autism spectrum disorder (ASD). Methods: Child Healthcare Centre records of 175 children, diagnosed with ASD before 4.5 years in Stockholm County, Sweden, were reviewed regarding the results of the eight-item neurodevelopmental surveillance. Results were contrasted with normative data from the general child population in Stockholm County. Results: More than one-third of the total ASD group, including half of the group with ASD and intellectual disability (ID), did not pass the required number of items, compared to one in 50 in the general child population. Of those with ASD and ID who had passed, more than one-third experienced developmental regression after 18 months of age. If the CHC surveillance had considered reported regulatory problems - crying, feeding and sleeping - then another 10% of the children with ASD and ID could have been identified during this surveillance. Conclusion: The existing CHC surveillance traced half of the group of children who were later diagnosed with ASD combined with intellectual disability. Adding an item on regulatory problems to the 18-month surveillance would have increased this number by another 10%.

To learn efficiently, many situations require people to judge what will be easy or difficult to learn, or how well it has been stored in memory. These metacognitive judgments are important to understand because they most likely guide how people behave when they learn, and consequently how much they learn. In this thesis, I focus on what is referred to as ease-of-learning (EOL) judgments, that is, judgments about how easy or difficult a material will be to learn. EOL judgments have received relatively limited attention in the metacognitive literature. Therefore, this thesis also considers for comparison the more extensively researched judgments of learning (JOL), which are judgments of how well a studied material has been learned or how likely it is to be remembered on a later memory test. I had two major aims with my research. First, I aimed to investigate how accurate EOL judgments are, that is, how well they can predict the ease of future learning, and what moderates this accuracy. More precisely, I investigated what affects EOL judgment accuracy by varying how much an item-set varies in a predictive item characteristic, as well as varying methodological aspects of the judgment situation. The second major aim was to investigate what sources of information people use to make EOL judgments and how the information is used to make metacognitive judgments. In three studies, participants made EOL judgments for word pairs (e.g., sun – warm), or single words (e.g., bucket), studied the items, and tried to recall them on memory tests. In Study II, participants also made JOLs after studying the items. To estimate the accuracy of the judgments, the judgments were correlated with recall performance on memory tests. The results of the thesis show that EOL judgments can be accurate when they are made on a to-be-learned material which varies in a predictive item characteristic (Study I and II). In some conditions, EOL judgments are even as accurate as JOLs (Study II).
Study II also supports the cue competition hypothesis, which predicts that, when people judge memory and learning, they sometimes rely less on one source of information if other information is available. Furthermore, Study III shows that processing fluency (the experience of effort associated with processing information) may be an important source of information for EOL judgments, and that people’s beliefs about available information can moderate how the information is used to make EOL judgments. Overall, the results show when EOL judgments will be accurate and when they will not, and provide evidence that people may use processing fluency to make EOL judgments even when it contradicts their beliefs. Importantly, the results also indicate that when multiple sources of information are available, information may compete for influence over metacognitive judgments.

When people begin to study new material, they may first judge how difficult it will be to learn. Surprisingly, these ease-of-learning (EOL) judgments have received little attention from metacognitive researchers so far. The aim of this study was to systematically investigate how well EOL judgments can predict actual learning, and what factors may moderate their relative accuracy. In three experiments, undergraduate psychology students made EOL judgments on, then studied, and were tested on, lists of word pairs (e.g., sun – warm). In Experiment 1, the Goodman-Kruskal gamma (G) correlations showed that EOL judgments were accurate (G = .74) when items varied enough in difficulty to allow for proper discrimination between them, but were less accurate (G = .21) when the variation was smaller. Furthermore, in Experiments 1 and 3, we showed that the relative accuracy was reliably higher when the EOL judgments were correlated with a binary criterion (i.e., whether an item was recalled or not on a test), compared with a trials-to-learn criterion (i.e., how many study and test trials were needed to recall an item). In addition, Experiments 2 and 3 indicate that other factors, such as the task used to measure the EOL judgments and whether items were judged sequentially (i.e., one item at a time in isolation from the other items) or simultaneously (i.e., each item judged while having access to all other items), do not influence EOL accuracy. To conclude, EOL judgments can be highly accurate (G = .74) and may thus be of strategic importance for learning. Further avenues for research are discussed.
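The Goodman-Kruskal gamma (G) used above as the index of relative accuracy can be sketched with its standard pairwise definition, G = (C - D) / (C + D), where C and D count concordant and discordant item pairs and ties are ignored. The judgment and recall values below are invented for illustration only:

```python
# Minimal sketch of Goodman-Kruskal gamma (G) relating EOL judgments
# to a binary recall criterion. G = (C - D) / (C + D), where C and D
# are the numbers of concordant and discordant pairs; ties are skipped.
# All data here are hypothetical, not from the study.

def gk_gamma(judgments, recalled):
    concordant = discordant = 0
    for i in range(len(judgments)):
        for j in range(i + 1, len(judgments)):
            # Positive product: the pair is ordered the same way on both
            # variables (concordant); negative: opposite order (discordant).
            product = (judgments[i] - judgments[j]) * (recalled[i] - recalled[j])
            if product > 0:
                concordant += 1
            elif product < 0:
                discordant += 1
    total = concordant + discordant
    return (concordant - discordant) / total if total else 0.0

eol_judgments = [90, 70, 60, 40, 20]  # judged ease of learning (0-100)
recall = [1, 1, 0, 1, 0]              # recalled (1) or not (0) on the test
print(round(gk_gamma(eol_judgments, recall), 2))  # -> 0.67
```

With the invented data, higher judgments mostly co-occur with recalled items, giving a moderately positive G; a perfectly predictive ordering would give G = 1.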

Processing fluency influences many types of judgments. Some metacognitive research suggests that the influence of processing fluency may be mediated by participants’ beliefs. The current study explores the influence of processing fluency and beliefs on ease-of-learning (EOL) judgments. In two experiments (Exp 1: n = 94; Exp 2: n = 146), participants made EOL judgments on 24 six-letter concrete nouns, presented in either a constant condition (high fluency) with upper-case letters (e.g., BUCKET) or an alternating condition (low fluency) with mixed upper- and lower-case letters (e.g., bUcKeT). After judging words individually, participants studied the words and completed a free recall test. Finally, participants indicated which condition they believed made the words more likely to be learned. Results showed that constant-condition words were judged as more likely to be learned than alternating-condition words, but the difference varied with beliefs. Specifically, the difference was largest when participants believed the constant condition made words more likely to be learned, followed by believing there was no difference, and then by believing the alternating condition made words more likely to be learned. Thus, we showed that processing fluency has a direct effect on EOL judgments, but the effect is moderated by beliefs.

Background: Autism spectrum disorder (ASD) is a developmental disorder with a wide variety of clinical phenotypes and co-occurrences with other neurodevelopmental conditions. Symptoms may change over time.

Aims: The aim of the present study was to prospectively follow 96 children, initially assessed for suspected ASD at an average age of 2.9 years.

Methods and procedures: All children had been identified with autistic symptoms in a general population child health screening program, and had been referred to the Child Neuropsychiatry Clinic in Gothenburg, Sweden for further assessment by a multi-professional team at Time 1 (T1). This assessment included a broad neurodevelopmental examination, structured interviews, a cognitive test and evaluations of the child's adaptive and global functioning. Two years later, at Time 2 (T2), the children and their parents were invited for a follow-up assessment by the same team using the same methods.

Outcomes and results: Of the 96 children, 76 had met and 20 had not met full criteria for ASD at T1. Of the same 96 children, 79 met full ASD criteria at T2. The vast majority of children with ASD also had other neurodevelopmental symptoms or diagnoses. Hyperactivity was observed in 42% of children with ASD at T2, and Intellectual Developmental Disorder (IDD) in 30%. Borderline Intellectual Functioning (BIF) was found in 25%, and severe speech and language disorder in 20%. The children who did not meet criteria for ASD at T2 had symptoms of, or met criteria for, other neurodevelopmental/neuropsychiatric disorders in combination with marked autistic traits. Changes in developmental profiles between T1 and T2 were common in this group of young children with ASD. The main effect of cognitive level at T1 explained more than twice as much of the variance in Vineland scores as did the ASD subtype; children with IDD had significantly lower scores than children in the BIF and average intellectual functioning (AIF) groups. Co-existence with other conditions was the rule.

Conclusions and implications: Reassessments covering the whole range of these conditions are necessary for an optimized intervention adapted to the individual child's needs.

The ability to recognize the identity of faces and voices is essential for social relationships. Although the heritability of social memory is high, knowledge about the contributing genes is sparse. Since sex differences and rodent studies support an influence of estrogens and androgens on social memory, polymorphisms in the estrogen and androgen receptor genes (ESR1, ESR2, AR) are candidates for this trait. Recognition of faces and vocal sounds, separately and combined, was investigated in 490 subjects, genotyped for 10 single nucleotide polymorphisms (SNPs) in ESR1, four in ESR2 and one in the AR. Four of the associations survived correction for multiple testing: women carrying rare alleles of the three ESR2 SNPs, rs928554, rs1271572 and rs1256030, in linkage disequilibrium with each other, displayed superior face recognition compared with non-carriers. Furthermore, the uncommon genotype of the ESR1 SNP rs2504063 was associated with better recognition of identity through vocal sounds, also specifically in women. This study demonstrates evidence for associations in women between face recognition and variation in ESR2, and recognition of identity through vocal sounds and variation in ESR1. These results suggest that estrogen receptors may regulate social memory function in humans, in line with what has previously been established in mice.

Successful retrieval from memory is a desirably difficult learning event that reduces the recall decrement of studied materials over longer delays more than restudying does. The present study was the first to test this direct testing effect for performed and read action events (e.g., light a candle) in terms of both recall accuracy and recall speed. To this end, subjects initially encoded action phrases by either enacting them or reading them aloud (i.e., encoding type). After this initial study phase, they received two practice phases, in which the same number of action phrases were restudied or retrieval-practiced (Exp. 1-3), or not further processed (Exp. 3; i.e., practice type). This learning session was followed by a final cued-recall test both after a short delay (2 min) and after a long delay (1 week: Exp. 1 and 2; 2 weeks: Exp. 3). To test the generality of the results, subjects retrieval-practiced with either noun-cued recall of verbs (Exp. 1 and 3) or verb-cued recall of nouns (Exp. 2) during the intermediate and final tests (i.e., test type). We demonstrated direct benefits of testing on both recall accuracy and recall speed. Repeated retrieval practice, relative to repeated restudy and study-only practice, reduced the recall decrement over the long delay and enhanced phrases' recall speed already after 2 min, independently of encoding type and recall test. However, a benefit of testing on long-term retention only emerged (Exp. 3) when the recall delay was prolonged from 1 to 2 weeks and different sets of phrases were used for the immediate and delayed final tests. Thus, the direct testing benefit appears to be highly generalizable even with more complex, action-oriented stimulus materials and encoding manipulations. We discuss these results in terms of the distribution-based bifurcation model.

Testing memory typically enhances subsequent re-encoding of information (“indirect” testing effect) and, as compared to restudy, it also benefits later long-term retention (“direct” testing effect). We investigated the effect of testing on subsequent restudy and 1-week retention of action events (e.g. “water the plant”). In addition, we investigated if the type of recall practice (noun-cued vs. verb-cued) moderates these testing benefits. The results showed an indirect testing effect that increased following noun-cued recall of verbs as compared to verb-cued recall of nouns. In contrast, a direct testing effect on the forgetting rate of performed actions was not reliably observed, neither for noun- nor verb-cued recall. Thus, to the extent that this study successfully dissociated direct and indirect testing-based enhancements, they seem to be differentially effective for performed actions, and may rely on partially different mechanisms.

We often need to monitor and coordinate multiple deadlines. One way to handle these temporal demands might be to represent future deadlines as a pattern of spatial relations. More specifically, we tested the hypothesis that multitasking reflects selective effects of coordinate (i.e., metric) relational processing. Participants completed two multitasking sessions under concurrent processing demands of coordinate versus categorical spatial information. We expected and observed that multitasking impairs concurrent coordinate, rather than categorical, spatial processing. In Experiment 1, coordinate-task performance was selectively decreased, while multitasking performance was equal under both load conditions. When emphasizing equal (primary/secondary) task-importance in Experiment 2, it was only multitasking performance that was selectively reduced under the coordinate-load condition. Thus, effective multitasking may partly reflect coordinate-relational processing.

47.

Larsson, Maria; Ekström, Ingrid; Sjölund, Sara; et al.

Stockholm University, Faculty of Social Sciences, Department of Psychology, Perception and psychophysics.

The objective of this study was to examine the association between performance in odor identification and future mortality in a community cohort of adults aged between 40 and 90 years. We assessed olfactory performance with a 13-item version of the Scandinavian Odor Identification Test (SOIT). The results showed that during follow-up (mean=9.4 years, standard deviation=2.23), 411 of 1774 (23.2%) participants died. In a Cox model, the association between higher SOIT score and lower mortality was highly significant (hazard ratio [HR]=0.74 per point, 95% confidence interval [CI]=0.71–0.77, p<0.001). The effect was attenuated, but remained significant, after controlling for age, sex, education, and health and cognitive variables that were also associated with an increased risk of mortality (HR=0.92, 95% CI=0.87–0.97, p=0.001). Controlling for dementia conversion prior to death did not attenuate the association between SOIT score and mortality (HR=0.92, 95% CI=0.87–0.97, p=0.001). Similar results were obtained for olfactory sensitivity as assessed by self-report. Overall, the present findings show that poor odor identification performance is associated with an increased likelihood of future mortality in middle-aged and older adults, after controlling for social, cognitive, and medical risk factors. Most importantly, controlling for the development of dementia did not attenuate the association between odor identification and mortality, suggesting that olfactory decline may mark deteriorating health even irrespective of dementia.
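To see what a per-point hazard ratio implies across larger score differences, note that under the proportional-hazards assumption of the Cox model, per-point hazard ratios multiply: a difference of k points corresponds to HR^k. The sketch below illustrates this arithmetic with the per-point estimates reported in the abstract (0.74 unadjusted, 0.92 adjusted); the function name and the 5-point example difference are illustrative choices, not part of the study.

```python
# Illustrative sketch: under proportional hazards, a per-point hazard
# ratio compounds multiplicatively over a score difference.
def relative_hazard(per_point_hr: float, point_difference: float) -> float:
    """Relative hazard associated with a given difference in SOIT score."""
    return per_point_hr ** point_difference

# Hypothetical comparison: a participant scoring 5 points higher on the SOIT.
unadjusted = relative_hazard(0.74, 5)  # unadjusted per-point HR from the abstract
adjusted = relative_hazard(0.92, 5)    # fully adjusted per-point HR
print(f"unadjusted: {unadjusted:.3f}, adjusted: {adjusted:.3f}")
```

Under the unadjusted estimate, a 5-point-higher score corresponds to roughly a fifth of the hazard; under the adjusted estimate, to about two-thirds, which conveys how strongly covariate adjustment attenuated the association.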

Autobiographical memories (AMs) are personally experienced events that may be localized in time and space. In the present work we present an overview of memories evoked by the sense of smell. Overall, research indicates that autobiographical odor memory differs from memories evoked by our primary sensory systems, sight and hearing. Here, the observed differences are presented from a behavioral and neuroanatomical perspective. The key features of an olfactory evoked AM can be summarized by the LOVER acronym: Limbic, Old, Vivid, Emotional, and Rare.

The influences of perceived odor qualities on the retention of olfactory information across the adult lifespan were examined. Young (19–36 years), young-old (60–74 years), and old (75–91 years) adults (n = 202) rated a set of unfamiliar odors across a series of perceptual dimensions (i.e., pleasantness, intensity, and irritability) at encoding. The overall results indicated that memory for unpleasant olfactory information was better than that for pleasant odors across the lifespan. Also, participants showed better retention for odors perceived with high intensity and irritability than for odors rated with low or medium scores. Interestingly, the old adults showed selective beneficial memory effects for odors rated as highly irritable. To the extent that perceptions of high irritability reflect an activation of the trigeminal sensory system, this finding suggests that older adults may use trigeminal components in odor information to compensate for age-related impairments in olfactory memory.

Repeated testing during learning often improves later memory, a phenomenon referred to as the testing effect. To clarify its boundary conditions, we examined whether the testing effect was selectively affected by covert (retrieved but not articulated) or overt (retrieved and articulated) response format. In Experiments 1 and 2, we compared immediate (5 min) and delayed (1 week) cued recall for paired associates following study-only, covert, and overt conditions, including two types of overt articulation (typing and writing). A clear testing effect was observed in both experiments, but with no selective effects of response format. In Experiments 3 and 4, we compared covert and overt retrieval under blocked and random list orders. The effect sizes were small in both experiments, but there was a significant effect of response format, with overt retrieval showing better final recall performance than covert retrieval. There were no significant effects of blocked versus random list orders on the size of the testing effect. Taken together, these findings suggest that, under specific circumstances, overt retrieval may lead to a greater testing effect than covert retrieval, but given the small effect sizes, the testing effect appears to be mainly the result of retrieval processes, with articulation adding fairly little to its magnitude in a paired-associates learning paradigm.