
This work explores the relationship between a user's choice of contraceptive option and the load of steroidal estrogens associated with that choice. Family planning data for the USA served as the basis for the analysis. The results showed that, collectively, contraceptive use in the USA conservatively averts the release of approximately 4.8 tonnes of estradiol equivalents to the environment. Contraceptive failure and the non-use of contraception account for 35% of the estrogenic load released over the course of all pregnancy events and 34% of the estrogenic load represented by all resultant legacies. A scenario analysis exploring the impact of discontinuing ethinylestradiol-based oral contraceptives revealed that this would not only result in a 1.7-fold increase in the estrogenic loading of the users, but the users would also be expected to experience undesired family planning outcomes at a rate 3.3 times higher. Additional scenario analyses in which ethinylestradiol-based oral contraceptive users were modeled as switching entirely to male condoms, diaphragms, or copper IUDs suggested that whether a higher or lower estrogenic load is associated with the switching population depends on the typical failure rates of the options adopted after discontinuation. Finally, it was estimated that, in the USA, at most 13% of the annual estrogenic load can be averted by fully meeting the contraceptive needs of the population. Therefore, while the issue of estrogen impacts on the environment cannot be addressed solely by meeting the population's contraceptive needs, a significant fraction of the estrogenic mass released to the environment can be averted by improving the degree to which those needs are met.
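The scenario comparisons above reduce to an expected-value calculation per user. The sketch below illustrates its structure only; the per-user method excretion, typical-use failure rates, and per-pregnancy load are hypothetical placeholders, not values from the study:

```python
# Illustrative comparison of annual estrogenic load per user for two
# contraceptive options. ALL numbers are hypothetical placeholders chosen
# for demonstration; only the structure of the calculation is the point.

def annual_load(method_load_ug, failure_rate, pregnancy_load_ug):
    """Expected estradiol-equivalent load (ug/user/yr): direct excretion
    attributable to the method itself plus the expected load from
    pregnancies arising through typical-use failure."""
    return method_load_ug + failure_rate * pregnancy_load_ug

pill   = annual_load(method_load_ug=6000.0, failure_rate=0.09, pregnancy_load_ug=20000.0)
condom = annual_load(method_load_ug=0.0,    failure_rate=0.18, pregnancy_load_ug=20000.0)

print(f"pill:   {pill:.0f} ug/user/yr")
print(f"condom: {condom:.0f} ug/user/yr")
```

With these placeholder inputs the higher failure rate of the barrier method partly offsets its zero direct excretion, which is the trade-off the scenario analyses probe.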

Estrogen plays vital roles in mammary gland development and breast cancer progression. It mediates its function by binding to and activating the estrogen receptors (ERs), ERα and ERβ. ERα is frequently upregulated in breast cancer and drives the proliferation of breast cancer cells. The ERs function as transcription factors and regulate gene expression. Whereas ERα's regulation of protein-coding genes is well established, its regulation of noncoding microRNAs (miRNAs) is less explored. miRNAs play a major role in the post-transcriptional regulation of genes, inhibiting their translation or degrading their mRNA. miRNAs can function as oncogenes or tumor suppressors and are also promising biomarkers. Among the miRNA assays available, microarray and quantitative real-time polymerase chain reaction (qPCR) have been extensively used to detect and quantify miRNA levels. To identify miRNAs regulated by estrogen signaling in breast cancer, their expression in ERα-positive breast cancer cell lines was compared before and after estrogen activation, using both µParaflo microfluidic microarrays and dual-labeled-probe low-density arrays. Results were validated using specific qPCR assays, applying both cyanine dye-based and dual-labeled-probe-based chemistry. Furthermore, a time-point assay was used to identify regulation over time. An advantage of the miRNA assay approach used in this study is that it enables fast screening of mature miRNA regulation in numerous samples, even with limited sample amounts. The layout, including the specific conditions for cell culture and estrogen treatment, biological and technical replicates, and large-scale screening followed by in-depth confirmation using separate techniques, ensures robust detection of miRNA regulation and eliminates false positives and other artifacts. However, mutated or unknown miRNAs, or regulation at the primary and precursor transcript level, will not be detected.
The method presented here represents a thorough investigation of estrogen-mediated miRNA regulation.

Institutions: University of Colorado School of Medicine, Oregon Health & Science University, University of Colorado School of Medicine.

Sex differences in neuronal susceptibility to ischemic injury and neurodegenerative disease have long been observed, but the signaling mechanisms responsible for those differences remain unclear. Primary dissociated embryonic neuronal culture provides a simplified experimental model with which to investigate the neuronal cell signaling involved in cell death as a result of ischemia or disease; however, most neuronal cultures used in research today are mixed sex. Researchers can and do test the effects of sex steroid treatment in mixed-sex neuronal cultures in models of neuronal injury and disease, but accumulating evidence suggests that the female brain responds to androgens, estrogens, and progesterone differently than the male brain. Furthermore, neonatal male and female rodents respond differently to ischemic injury, with males experiencing greater injury following cerebral ischemia than females. Thus, mixed-sex neuronal cultures might obscure and confound the experimental results; important information might be missed. For this reason, the Herson Lab at the University of Colorado School of Medicine routinely prepares sex-stratified primary dissociated embryonic neuronal cultures from both hippocampus and cortex. Embryos are sexed before harvesting of brain tissue, and male and female tissues are dissociated, plated, and maintained separately. Using this method, the Herson Lab has demonstrated a male-specific role for the ion channel TRPM2 in ischemic cell death. In this manuscript, we share and discuss our protocol for sexing embryonic mice and preparing sex-stratified hippocampal primary dissociated neuron cultures. This method can be adapted to prepare sex-stratified cortical cultures, and the method for embryo sexing can be used in conjunction with other protocols for any study in which sex is thought to be an important determinant of outcome.

Institutions: Corning Life Science, Corning Life Science, Corning Life Science.

Large-volume adherent cell culture is currently standardized on stacked-plate cell growth products when microcarrier beads are not an optimal choice. HYPERStack vessels allow closed-system scale-up from current stacked-plate products and deliver >2.5X more cells in the same volumetric footprint. HYPERStack vessels function via a gas-permeable material that allows gas exchange, eliminating the need for internal headspace within the vessel. Eliminating the headspace minimizes the compartment in which cell growth occurs, allowing more layers of cell growth surface area within the same volumetric footprint.
For many applications such as cell therapy or vaccine production, a closed system is required for cell growth and harvesting. The HYPERStack vessel allows cell and reagent addition and removal via tubing from media bags or other methods.
This protocol will explain the technology behind the gas-permeable material used in HYPERStack vessels, present gas diffusion results showing that the metabolic needs of cells are met, and describe closed-system cell growth protocols and various harvesting methods.

Despite rapid advances in high-throughput microscopy, quantitative image-based assays still pose significant challenges. While a variety of specialized image analysis tools are available, most traditional image-analysis-based workflows have steep learning curves (for fine tuning of analysis parameters) and result in long turnaround times between imaging and analysis. In particular, cell segmentation, the process of identifying individual cells in an image, is a major bottleneck in this regard.
Here we present an alternate, cell-segmentation-free workflow based on PhenoRipper, an open-source software platform designed for the rapid analysis and exploration of microscopy images. The pipeline presented here is optimized for immunofluorescence microscopy images of cell cultures and requires minimal user intervention. Within half an hour, PhenoRipper can analyze data from a typical 96-well experiment and generate image profiles. Users can then visually explore their data, perform quality control on their experiment, verify responses to perturbations, and check the reproducibility of replicates. This facilitates a rapid feedback cycle between analysis and experiment, which is crucial during assay optimization. This protocol is useful not just as a first-pass analysis for quality control, but may also be used as an end-to-end solution, especially for screening. The workflow described here scales to large data sets such as those generated by high-throughput screens, and has been shown to group experimental conditions by phenotype accurately over a wide range of biological systems. The PhenoBrowser interface provides an intuitive framework to explore the phenotypic space and relate image properties to biological annotations. Taken together, the protocol described here will lower the barriers to adopting quantitative analysis of image-based screens.
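PhenoRipper's segmentation-free approach summarizes foreground image blocks rather than identifying individual cells. As a rough illustration of that idea only (not PhenoRipper's actual algorithm), a minimal Python sketch might look like this; the block size, foreground threshold, and per-channel mean summary are simplifying assumptions:

```python
import numpy as np

def block_profiles(image, block=20, threshold=0.1):
    """Split a multi-channel image into non-overlapping blocks, keep
    blocks containing foreground signal, and summarize each kept block
    by its mean intensity per channel (a crude per-block profile)."""
    h, w, c = image.shape
    profiles = []
    for i in range(0, h - block + 1, block):
        for j in range(0, w - block + 1, block):
            patch = image[i:i + block, j:j + block]
            if patch.max() > threshold:          # treat as foreground block
                profiles.append(patch.reshape(-1, c).mean(axis=0))
    return np.array(profiles)
```

Comparing distributions of such block profiles across wells is one way conditions can be grouped by phenotype without ever segmenting a cell.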

Institutions: University of Maryland Eastern Shore, USDA - Agricultural Research Service, University of Maryland Eastern Shore.

Rainfall is a driving force for the transport of environmental contaminants from agricultural soils to surficial water bodies via surface runoff. The objective of this study was to characterize the effects of antecedent soil moisture content on the fate and transport of surface applied commercial urea, a common form of nitrogen (N) fertilizer, following a rainfall event that occurs within 24 hr after fertilizer application. Although urea is assumed to be readily hydrolyzed to ammonium and therefore not often available for transport, recent studies suggest that urea can be transported from agricultural soils to coastal waters where it is implicated in harmful algal blooms. A rainfall simulator was used to apply a consistent rate of uniform rainfall across packed soil boxes that had been prewetted to different soil moisture contents. By controlling rainfall and soil physical characteristics, the effects of antecedent soil moisture on urea loss were isolated. Wetter soils exhibited shorter time from rainfall initiation to runoff initiation, greater total volume of runoff, higher urea concentrations in runoff, and greater mass loadings of urea in runoff. These results also demonstrate the importance of controlling for antecedent soil moisture content in studies designed to isolate other variables, such as soil physical or chemical characteristics, slope, soil cover, management, or rainfall characteristics. Because rainfall simulators are designed to deliver raindrops of similar size and velocity as natural rainfall, studies conducted under a standardized protocol can yield valuable data that, in turn, can be used to develop models for predicting the fate and transport of pollutants in runoff.
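Mass loadings of urea in runoff, as reported above, come down to integrating concentration over the sampled runoff volume. A minimal sketch of that arithmetic, using illustrative sample values rather than data from the study:

```python
# Urea mass loading in runoff: sum of (concentration x volume) over the
# sequential sampling intervals of a simulated-rainfall event.
# Values below are illustrative placeholders, not measured data.

conc_mg_per_L = [4.2, 2.8, 1.5, 0.9]   # urea-N concentration in each runoff sample
volume_L      = [0.5, 1.2, 1.6, 1.8]   # runoff volume collected in each interval

load_mg = sum(c * v for c, v in zip(conc_mg_per_L, volume_L))
print(f"total urea-N load: {load_mg:.2f} mg")
```

Because wetter soils produced both higher concentrations and greater volumes, both factors in each product term drive the larger loadings observed.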

Institutions: University of California San Francisco, University of California San Francisco, Xradia Inc.

This study demonstrates a novel biomechanics testing protocol. The advantage of this protocol includes the use of an in situ loading device coupled to a high resolution X-ray microscope, thus enabling visualization of internal structural elements under simulated physiological loads and wet conditions. Experimental specimens will include intact bone-periodontal ligament (PDL)-tooth fibrous joints. Results will illustrate three important features of the protocol as they can be applied to organ level biomechanics: 1) reactionary force vs. displacement: tooth displacement within the alveolar socket and its reactionary response to loading, 2) three-dimensional (3D) spatial configuration and morphometrics: geometric relationship of the tooth with the alveolar socket, and 3) changes in readouts 1 and 2 due to a change in loading axis, i.e. from concentric to eccentric loads. Efficacy of the proposed protocol will be evaluated by coupling mechanical testing readouts to 3D morphometrics and overall biomechanics of the joint. In addition, this technique will emphasize the need to equilibrate experimental conditions, specifically reactionary loads, prior to acquiring tomograms of fibrous joints. It should be noted that the proposed protocol is limited to testing specimens under ex vivo conditions, and that use of contrast agents to visualize soft tissue mechanical response could lead to erroneous conclusions about tissue and organ-level biomechanics.

Laboratory-determined Phosphorus Flux from Lake Sediments as a Measure of Internal Phosphorus Loading

Authors: Mary E. Ogdahl, Alan D. Steinman, Maggie E. Weinert.

Institutions: Grand Valley State University.

Eutrophication is a water quality issue in lakes worldwide, and there is a critical need to identify and control nutrient sources. Internal phosphorus (P) loading from lake sediments can account for a substantial portion of the total P load in eutrophic, and some mesotrophic, lakes. Laboratory determination of P release rates from sediment cores is one approach for determining the role of internal P loading and guiding management decisions. Two principal alternatives to experimental determination of sediment P release exist for estimating internal load: in situ measurements of changes in hypolimnetic P over time and P mass balance. The experimental approach using laboratory-based sediment incubations to quantify internal P load is a direct method, making it a valuable tool for lake management and restoration.
Laboratory incubations of sediment cores can help determine the relative importance of internal vs. external P loads, as well as be used to answer a variety of lake management and research questions. We illustrate the use of sediment core incubations to assess the effectiveness of an aluminum sulfate (alum) treatment for reducing sediment P release. Other research questions that can be investigated using this approach include the effects of sediment resuspension and bioturbation on P release.
The approach also has limitations. Assumptions must be made with respect to: extrapolating results from sediment cores to the entire lake; deciding over what time periods to measure nutrient release; and addressing possible core tube artifacts. A comprehensive dissolved oxygen monitoring strategy to assess temporal and spatial redox status in the lake provides greater confidence in annual P loads estimated from sediment core incubations.
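The release rate obtained from a core incubation is itself a simple flux calculation: the change in water-column P mass divided by core area and incubation time. A sketch with illustrative values, not data from any particular lake:

```python
def p_flux(c_final_mg_L, c_initial_mg_L, volume_L, area_m2, days):
    """Sediment P release rate (mg P m^-2 d^-1) estimated from the change
    in water-column total P during a sediment core incubation."""
    return (c_final_mg_L - c_initial_mg_L) * volume_L / (area_m2 * days)

# e.g. TP rising from 0.02 to 0.10 mg/L in 1.0 L of water overlying a
# 0.005 m^2 core over a 10-day incubation (illustrative numbers)
rate = p_flux(0.10, 0.02, 1.0, 0.005, 10)
print(f"{rate:.2f} mg P m^-2 d^-1")
```

Scaling such an areal rate to a whole-lake internal load is exactly where the extrapolation assumptions discussed above enter.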

Institutions: National Research Council, National Research Council, University of Manchester.

Current neurophysiological research aims to develop methodologies for investigating the signal route from neuron to neuron, namely the transitions from spikes to Local Field Potentials (LFPs) and from LFPs to spikes.
LFPs have a complex dependence on spike activity, and their relation is still poorly understood1. Elucidating these signal relations would be helpful both for clinical diagnostics (e.g. stimulation paradigms for Deep Brain Stimulation) and for a deeper comprehension of neural coding strategies in normal and pathological conditions (e.g. epilepsy, Parkinson's disease, chronic pain). To this end, technical issues related to stimulation devices, stimulation paradigms, and computational analyses must be solved. Therefore, a custom-made stimulation device was developed to deliver stimuli that are well regulated in space and time and do not incur mechanical resonance. Subsequently, as an exemplification, a set of reliable LFP-spike relationships was extracted.
The performance of the device was investigated through extracellular recordings of both spike and LFP responses to the applied stimuli from the rat primary somatosensory cortex. Then, by means of a multi-objective optimization strategy, a predictive model for spike occurrence based on LFPs was estimated.
The application of this paradigm shows that the device is adequately suited to deliver high frequency tactile stimulation, outperforming common piezoelectric actuators. As a proof of the efficacy of the device, the following results were presented: 1) the timing and reliability of LFP responses well match the spike responses, 2) LFPs are sensitive to the stimulation history and capture not only the average response but also the trial-to-trial fluctuations in the spike activity and, finally, 3) by using the LFP signal it is possible to estimate a range of predictive models that capture different aspects of the spike activity.
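The authors' predictive models were estimated with a multi-objective optimization strategy; as a much simpler stand-in for the general idea of predicting spike occurrence from a filtered LFP, consider the following toy sketch, in which the kernel and threshold are arbitrary placeholders rather than fitted parameters:

```python
import numpy as np

def predict_spikes(lfp, kernel, threshold):
    """Toy spike predictor: convolve the LFP with a short kernel and
    mark the time bins where the filtered signal crosses a threshold.
    Real models would fit the kernel and threshold to recorded data."""
    filtered = np.convolve(lfp, kernel, mode="same")
    return (filtered > threshold).astype(int)

rng = np.random.default_rng(0)
lfp = rng.standard_normal(1000)          # surrogate LFP trace, one sample per bin
kernel = np.array([0.5, 0.3, 0.2])       # placeholder filter weights
spikes = predict_spikes(lfp, kernel, threshold=1.0)
```

A fitted version of such a filter-and-threshold model is one of the simplest members of the family of predictive models referred to in point 3 above.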

The glassy-winged sharpshooter (Homalodisca vitripennis) is a highly vagile and polyphagous insect found throughout the southwestern United States. These insects are the predominant vectors of Xylella fastidiosa (X. fastidiosa), a xylem-limited bacterium that is the causal agent of Pierce's disease (PD) of grapevine. Pierce's disease is economically damaging; thus, H. vitripennis have become a target for pathogen management strategies. A dicistrovirus identified as Homalodisca coagulata virus-01 (HoCV-01) has been associated with increased mortality in H. vitripennis populations. Because a host cell is required for HoCV-01 replication, cell culture provides a uniform environment for targeted replication that is logistically and economically valuable for biopesticide production. In this study, a system for large-scale propagation of H. vitripennis cells via tissue culture was developed, providing a viral replication mechanism. HoCV-01 was extracted from whole-body insects and used to inoculate cultured H. vitripennis cells at varying levels. The culture medium was removed every 24 hr for 168 hr, and RNA was extracted and analyzed by qRT-PCR. Cells were stained with trypan blue and counted to quantify cell survivability using light microscopy. Whole virus particles were extracted up to 96 hr after infection, which was the time point determined to be before total cell culture collapse occurred. Cells were also subjected to fluorescent staining and viewed using confocal microscopy to investigate viral activity on F-actin attachment and nuclei integrity. The conclusion of this study is that H. vitripennis cells are capable of being cultured and used for mass production of HoCV-01 at a suitable level to allow production of a biopesticide.

A technique for laboratory estimation of net trophic transfer efficiency (γ) of polychlorinated biphenyl (PCB) congeners to piscivorous fish from their prey is described herein. During a 135-day laboratory experiment, we fed bloater (Coregonus hoyi) that had been caught in Lake Michigan to lake trout (Salvelinus namaycush) kept in eight laboratory tanks. Bloater is a natural prey for lake trout. In four of the tanks, a relatively high flow rate was used to ensure relatively high activity by the lake trout, whereas a low flow rate was used in the other four tanks, allowing for low lake trout activity. On a tank-by-tank basis, the amount of food eaten by the lake trout on each day of the experiment was recorded. Each lake trout was weighed at the start and end of the experiment. Four to nine lake trout from each of the eight tanks were sacrificed at the start of the experiment, and all 10 lake trout remaining in each of the tanks were euthanized at the end of the experiment. We determined concentrations of 75 PCB congeners in the lake trout at the start of the experiment, in the lake trout at the end of the experiment, and in bloaters fed to the lake trout during the experiment. Based on these measurements, γ was calculated for each of 75 PCB congeners in each of the eight tanks. Mean γ was calculated for each of the 75 PCB congeners for both active and inactive lake trout. Because the experiment was replicated in eight tanks, the standard error about mean γ could be estimated. Results from this type of experiment are useful in risk assessment models to predict future risk to humans and wildlife eating contaminated fish under various scenarios of environmental contamination.
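The tank-level calculation of γ follows directly from the measured quantities: congener mass accumulated by the lake trout divided by congener mass ingested with the bloaters. A minimal sketch with illustrative numbers, not data from the experiment:

```python
def net_transfer_efficiency(trout_end_ug, trout_start_ug, prey_consumed_ug):
    """gamma for one congener in one tank: PCB mass accumulated by the
    predator over the experiment divided by the PCB mass ingested with
    prey during the same period."""
    return (trout_end_ug - trout_start_ug) / prey_consumed_ug

# illustrative tank-level masses for a single congener
gamma = net_transfer_efficiency(trout_end_ug=48.0, trout_start_ug=12.0,
                                prey_consumed_ug=60.0)
print(f"gamma = {gamma:.2f}")
```

Repeating this calculation per congener and per tank, then averaging within the active and inactive treatments, yields the congener-specific mean γ values and their standard errors.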

Selection of Aptamers for Amyloid β-Protein, the Causative Agent of Alzheimer's Disease

Authors: Farid Rahimi, Gal Bitan.

Institutions: David Geffen School of Medicine, University of California, Los Angeles, University of California, Los Angeles.

Alzheimer's disease (AD) is a progressive, age-dependent, neurodegenerative disorder with an insidious course that renders its presymptomatic diagnosis difficult1. Definite AD diagnosis is achieved only postmortem, thus establishing presymptomatic, early diagnosis of AD is crucial for developing and administering effective therapies2,3.
Amyloid β-protein (Aβ) is central to AD pathogenesis. Soluble, oligomeric Aβ assemblies are believed to affect neurotoxicity underlying synaptic dysfunction and neuron loss in AD4,5. Various forms of soluble Aβ assemblies have been described, however, their interrelationships and relevance to AD etiology and pathogenesis are complex and not well understood6. Specific molecular recognition tools may unravel the relationships amongst Aβ assemblies and facilitate detection and characterization of these assemblies early in the disease course before symptoms emerge. Molecular recognition commonly relies on antibodies. However, an alternative class of molecular recognition tools, aptamers, offers important advantages relative to antibodies7,8. Aptamers are oligonucleotides generated by in-vitro selection: systematic evolution of ligands by exponential enrichment (SELEX)9,10. SELEX is an iterative process that, similar to Darwinian evolution, allows selection, amplification, enrichment, and perpetuation of a property, e.g., avid, specific, ligand binding (aptamers) or catalytic activity (ribozymes and DNAzymes).
Despite the emergence of aptamers as tools in modern biotechnology and medicine11, they have been underutilized in the amyloid field. Only a few RNA or ssDNA aptamers have been selected against various forms of prion proteins (PrP)12-16. An RNA aptamer generated against recombinant bovine PrP was shown to recognize bovine PrP-β17, a soluble, oligomeric, β-sheet-rich conformational variant of full-length PrP that forms amyloid fibrils18. Aptamers generated using monomeric and several forms of fibrillar β2-microglobulin (β2m) were found to bind fibrils of certain other amyloidogenic proteins besides β2m fibrils19. Ylera et al. described RNA aptamers selected against immobilized monomeric Aβ4020. Unexpectedly, these aptamers bound fibrillar Aβ40. Altogether, these data raise several important questions. Why did aptamers selected against monomeric proteins recognize their polymeric forms? Could aptamers against monomeric and/or oligomeric forms of amyloidogenic proteins be obtained? To address these questions, we attempted to select aptamers for covalently-stabilized oligomeric Aβ4021 generated using photo-induced cross-linking of unmodified proteins (PICUP)22,23. Similar to previous findings17,19,20, these aptamers reacted with fibrils of Aβ and several other amyloidogenic proteins, likely recognizing a potentially common amyloid structural aptatope21. Here, we present the SELEX methodology used in production of these aptamers21.

Estrogens are a family of female sexual hormones with an exceptionally wide spectrum of effects. When rats and mice are used in estrogen research, they are commonly ovariectomized in order to ablate the rapidly cycling hormone production, with the 17β-estradiol replaced exogenously. There is, however, a lack of consensus regarding how the hormone should be administered to obtain physiological serum concentrations. This is crucial, since the 17β-estradiol level and administration method profoundly influence the experimental results1-3. We have in a series of studies characterized the different modes of 17β-estradiol administration, finding that subcutaneous silastic capsules and peroral nut-cream Nutella are superior to commercially available slow-release pellets (produced by the company Innovative Research of America) and daily injections in terms of producing physiological serum concentrations of 17β-estradiol4-6. Among the advantages of the nut-cream method, which has previously been used for buprenorphine administration7, is that when used for estrogen administration it resembles peroral hormone replacement therapy and is non-invasive. The subcutaneous silastic capsules are convenient and produce the most stable serum concentrations. This video article contains step-by-step demonstrations of ovariectomy and 17β-estradiol hormone replacement by silastic capsules and peroral Nutella in rats and mice, followed by a discussion of important aspects of the administration procedures.

Audio-based Environment Simulator (AbES) is virtual environment software designed to improve real world navigation skills in the blind. Using only audio based cues and set within the context of a video game metaphor, users gather relevant spatial information regarding a building's layout. This allows the user to develop an accurate spatial cognitive map of a large-scale three-dimensional space that can be manipulated for the purposes of a real indoor navigation task. After game play, participants are then assessed on their ability to navigate within the target physical building represented in the game. Preliminary results suggest that early blind users were able to acquire relevant information regarding the spatial layout of a previously unfamiliar building as indexed by their performance on a series of navigation tasks. These tasks included path finding through the virtual and physical building, as well as a series of drop off tasks. We find that the immersive and highly interactive nature of the AbES software appears to greatly engage the blind user to actively explore the virtual environment. Applications of this approach may extend to larger populations of visually impaired individuals.

We describe a high-throughput, high-volume, fully automated, live-in 24/7 behavioral testing system for assessing the effects of genetic and pharmacological manipulations on basic mechanisms of cognition and learning in mice. A standard polypropylene mouse housing tub is connected through an acrylic tube to a standard commercial mouse test box. The test box has 3 hoppers, 2 of which are connected to pellet feeders. All are internally illuminable with an LED and monitored for head entries by infrared (IR) beams. Mice live in the environment, which eliminates handling during screening. They obtain their food during two or more daily feeding periods by performing in operant (instrumental) and Pavlovian (classical) protocols, for which we have written protocol-control software and quasi-real-time data analysis and graphing software. The data analysis and graphing routines are written in a MATLAB-based language created to greatly simplify the analysis of large time-stamped behavioral and physiological event records and to preserve a full data trail from raw data through all intermediate analyses to the published graphs and statistics within a single data structure. The data-analysis code harvests the data several times a day and subjects them to statistical and graphical analyses, which are automatically stored in the "cloud" and on in-lab computers. Thus, the progress of individual mice is visualized and quantified daily. The data-analysis code talks to the protocol-control code, permitting the automated advance from protocol to protocol of individual subjects. The behavioral protocols implemented are matching, autoshaping, timed hopper-switching, risk assessment in timed hopper-switching, impulsivity measurement, and the circadian anticipation of food availability. Open-source protocol-control and data-analysis code makes the addition of new protocols simple.
Eight test environments fit in a 48 in x 24 in x 78 in cabinet; two such cabinets (16 environments) may be controlled by one computer.
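The analysis code described above is MATLAB-based; the underlying data model, time-stamped event records harvested and summarized automatically, can nevertheless be illustrated with a small Python sketch. The event codes below are invented for illustration and are not the system's actual codes:

```python
from collections import Counter

# Each record is (timestamp_s, event_code); a real session log would
# contain thousands of such records per mouse per day.
events = [
    (10.2, "head_entry_hopper1"),
    (11.0, "pellet_delivered"),
    (15.7, "head_entry_hopper2"),
    (16.1, "head_entry_hopper1"),
]

# Summarize head entries per hopper, the kind of per-day statistic a
# harvesting routine might compute and plot automatically.
counts = Counter(code for _, code in events if code.startswith("head_entry"))
print(counts["head_entry_hopper1"], counts["head_entry_hopper2"])
```

Keeping the raw records alongside every derived summary is what preserves the full data trail from events to published statistics.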

Localization-based super resolution microscopy can be applied to obtain a spatial map (image) of the distribution of individual fluorescently labeled single molecules within a sample with a spatial resolution of tens of nanometers. Using either photoactivatable (PAFP) or photoswitchable (PSFP) fluorescent proteins fused to proteins of interest, or organic dyes conjugated to antibodies or other molecules of interest, fluorescence photoactivation localization microscopy (FPALM) can simultaneously image multiple species of molecules within single cells. By using the following approach, populations of large numbers (thousands to hundreds of thousands) of individual molecules are imaged in single cells and localized with a precision of ~10-30 nm. Data obtained can be applied to understanding the nanoscale spatial distributions of multiple protein types within a cell. One primary advantage of this technique is the dramatic increase in spatial resolution: while diffraction limits resolution to ~200-250 nm in conventional light microscopy, FPALM can image length scales more than an order of magnitude smaller. As many biological hypotheses concern the spatial relationships among different biomolecules, the improved resolution of FPALM can provide insight into questions of cellular organization which have previously been inaccessible to conventional fluorescence microscopy. In addition to detailing the methods for sample preparation and data acquisition, we here describe the optical setup for FPALM. One additional consideration for researchers wishing to do super-resolution microscopy is cost: in-house setups are significantly cheaper than most commercially available imaging machines. Limitations of this technique include the need for optimizing the labeling of molecules of interest within cell samples, and the need for post-processing software to visualize results. We here describe the use of PAFP and PSFP expression to image two protein species in fixed cells. 
Extension of the technique to living cells is also described.
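The ~10-30 nm localization precision quoted above is commonly estimated with the Thompson-Larson-Webb expression, which combines photon shot noise, pixelation, and background noise. A sketch of that calculation; the numerical inputs are illustrative, not values from this protocol:

```python
import math

def localization_precision(s_nm, a_nm, N, b):
    """Thompson-Larson-Webb estimate of 2D localization precision:
    sigma^2 = s^2/N + a^2/(12 N) + 8 pi s^4 b^2 / (a^2 N^2)
    s: PSF standard deviation (nm), a: pixel size in sample space (nm),
    N: detected photons per molecule, b: background noise (photons/pixel)."""
    var = (s_nm**2 / N
           + a_nm**2 / (12 * N)
           + (8 * math.pi * s_nm**4 * b**2) / (a_nm**2 * N**2))
    return math.sqrt(var)

# illustrative inputs: 120 nm PSF sigma, 100 nm pixels, 500 photons, b = 2
sigma = localization_precision(s_nm=120.0, a_nm=100.0, N=500, b=2.0)
```

The dominant 1/sqrt(N) dependence is why bright, photostable labels matter so much for FPALM precision.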

Characterization of Complex Systems Using the Design of Experiments Approach: Transient Protein Expression in Tobacco as a Case Study

Authors: Johannes Felix Buyel, Rainer Fischer.

Institutions: RWTH Aachen University, Fraunhofer Gesellschaft.

Plants provide multiple benefits for the production of biopharmaceuticals including low costs, scalability, and safety. Transient expression offers the additional advantage of short development and production times, but expression levels can vary significantly between batches thus giving rise to regulatory concerns in the context of good manufacturing practice. We used a design of experiments (DoE) approach to determine the impact of major factors such as regulatory elements in the expression construct, plant growth and development parameters, and the incubation conditions during expression, on the variability of expression between batches. We tested plants expressing a model anti-HIV monoclonal antibody (2G12) and a fluorescent marker protein (DsRed). We discuss the rationale for selecting certain properties of the model and identify its potential limitations. The general approach can easily be transferred to other problems because the principles of the model are broadly applicable: knowledge-based parameter selection, complexity reduction by splitting the initial problem into smaller modules, software-guided setup of optimal experiment combinations and step-wise design augmentation. Therefore, the methodology is not only useful for characterizing protein expression in plants but also for the investigation of other complex systems lacking a mechanistic description. The predictive equations describing the interconnectivity between parameters can be used to establish mechanistic models for other complex systems.
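Software-guided DoE setup typically begins by enumerating candidate factor-level combinations before pruning them to an optimal subset. As a minimal illustration of generating a full-factorial design in Python, with invented stand-in factors and levels rather than the actual construct, growth, and incubation parameters screened:

```python
from itertools import product

# Hypothetical factors and levels for illustration only; a real DoE study
# would select these from prior knowledge of the expression system.
factors = {
    "promoter":     ["35S", "double35S"],
    "incubation_C": [22, 25],
    "plant_age_d":  [35, 42, 49],
}

# One run per combination of levels: a 2 x 2 x 3 full-factorial design.
design = [dict(zip(factors, levels)) for levels in product(*factors.values())]
print(len(design))
```

Fractional designs and step-wise augmentation, as used in the study, then trim this exhaustive enumeration to an experimentally tractable number of runs.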

Modern 3D electron microscopy approaches have recently allowed unprecedented insight into the 3D ultrastructural organization of cells and tissues, enabling the visualization of large macromolecular machines, such as adhesion complexes, as well as higher-order structures, such as the cytoskeleton and cellular organelles in their respective cell and tissue context. Given the inherent complexity of cellular volumes, it is essential to first extract the features of interest in order to allow visualization, quantification, and therefore comprehension of their 3D organization. Each data set is defined by distinct characteristics, e.g., signal-to-noise ratio, crispness (sharpness) of the data, heterogeneity of its features, crowdedness of features, presence or absence of characteristic shapes that allow for easy identification, and the percentage of the entire volume that a specific region of interest occupies. All these characteristics need to be considered when deciding on which approach to take for segmentation.
The six 3D ultrastructural data sets presented here were obtained by three different imaging approaches: electron tomography of resin-embedded, stained samples, and focused ion beam and serial block face scanning electron microscopy (FIB-SEM, SBF-SEM) of mildly stained and heavily stained samples, respectively. Four different segmentation approaches were applied to these data sets: (1) fully manual model building followed solely by visualization of the model, (2) manual tracing segmentation of the data followed by surface rendering, (3) semi-automated approaches followed by surface rendering, or (4) automated, custom-designed segmentation algorithms followed by surface rendering and quantitative analysis. Depending on the combination of data set characteristics, typically one of these four approaches outperforms the others, but depending on the exact sequence of criteria, more than one approach may be successful. Based on these data, we propose a triage scheme that categorizes both objective data set characteristics and subjective personal criteria for the analysis of the different data sets.

Mast cells play important roles in allergic disease and immune defense against parasites. Once activated (e.g., by an allergen), they degranulate, a process that results in the exocytosis of allergic mediators. Modulation of mast cell degranulation by drugs and toxicants may have positive or adverse effects on human health. Mast cell function has been dissected in detail using rat basophilic leukemia mast cells (RBL-2H3), a widely accepted model of human mucosal mast cells3-5. The mast cell granule component and allergic mediator β-hexosaminidase, which is released linearly in tandem with histamine from mast cells6, can be measured easily and reliably through reaction with a fluorogenic substrate, yielding measurable fluorescence intensity in a microplate assay that is amenable to high-throughput studies1. We have adapted this degranulation assay, originally published by Naal et al.1, for the screening of drugs and toxicants, and we demonstrate its use here.
Triclosan is a broad-spectrum antibacterial agent that is present in many consumer products and has been found to be a therapeutic aid in human allergic skin disease7-11, although the mechanism for this effect is unknown. Here we demonstrate an assay for the effect of triclosan on mast cell degranulation. We recently showed that triclosan strongly affects mast cell function2. To avoid the use of an organic solvent, triclosan is dissolved directly into aqueous buffer with heat and stirring, and the resulting concentration is confirmed by UV-Vis spectrophotometry (using ε280 = 4,200 L/M/cm)12. This protocol has the potential to be used with a variety of chemicals to determine their effects on mast cell degranulation and, more broadly, their allergic potential.
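The concentration check described above is a direct application of the Beer-Lambert law, A = ε·c·l, with the molar absorptivity quoted in the protocol (ε280 = 4,200 L/M/cm). A minimal sketch, assuming a standard 1 cm cuvette and a background-corrected absorbance reading:

```python
# Estimate triclosan concentration from a UV-Vis absorbance reading
# using the Beer-Lambert law, A = epsilon * c * l, with the molar
# absorptivity quoted in the protocol (epsilon_280 = 4,200 L/M/cm).
EPSILON_280 = 4200.0   # L * mol^-1 * cm^-1, from the protocol
PATH_LENGTH_CM = 1.0   # assumed standard cuvette path length

def triclosan_molarity(absorbance_280: float) -> float:
    """Concentration in mol/L from a background-corrected A280."""
    return absorbance_280 / (EPSILON_280 * PATH_LENGTH_CM)

# Example: an A280 of 0.042 corresponds to 10 micromolar triclosan.
c = triclosan_molarity(0.042)
print(f"{c * 1e6:.1f} uM")  # 10.0 uM
```

Readings above roughly A = 1 should be diluted before measurement, since the absorbance-concentration relationship becomes nonlinear at high optical densities.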

STORM is a recently developed super-resolution microscopy technique with up to 10 times better resolution than standard fluorescence microscopy techniques. However, because the image is acquired in a very different way from normal, by building it up molecule by molecule, there are significant challenges for users trying to optimize their image acquisition. To aid this process and give more insight into how STORM works, we present the preparation of three test samples and the methodology for acquiring and processing STORM super-resolution images with typical resolutions of 30-50 nm. By combining the test samples with the freely available rainSTORM processing software, it is possible to obtain a great deal of information about image quality and resolution. Using these metrics, it is then possible to optimize the imaging procedure, from the optics to sample preparation, dye choice, buffer conditions, and image acquisition settings. We also show examples of common problems that result in poor image quality, such as lateral drift, where the sample moves during image acquisition, and density-related problems that give rise to the 'mislocalization' phenomenon.
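A useful back-of-the-envelope for why single-molecule images can beat the diffraction limit is the photon-limited localization precision estimate (the leading term of the Thompson et al. formula, σ ≈ s/√N). The sketch below uses illustrative numbers, not values from this protocol, and deliberately ignores pixelation and background terms:

```python
import math

def localization_precision_nm(psf_sigma_nm: float, photons: float) -> float:
    """Photon-limited localization precision (leading Thompson term):
    sigma_loc ~ s / sqrt(N). Pixelation and background noise terms,
    which worsen this estimate, are ignored here."""
    return psf_sigma_nm / math.sqrt(photons)

# Illustrative: a ~150 nm PSF sigma and ~1,000 detected photons per
# switching event give a per-molecule precision of about 5 nm.
print(localization_precision_nm(150.0, 1000.0))
```

The achieved image resolution (the 30-50 nm quoted above) is coarser than this per-molecule precision because labeling density, residual drift, and mislocalization all degrade the final reconstruction.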

Getting to Compliance in Forced Exercise in Rodents: A Critical Standard to Evaluate Exercise Impact in Aging-related Disorders and Disease

Authors: Jennifer C. Arnold, Michael F. Salvatore.

Institutions: Louisiana State University Health Sciences Center.

There is growing awareness of the positive impact of exercise on several disease states with a neurobiological basis, including improvements in cognitive function and physical performance. As a result, the number of animal studies employing exercise is increasing. It is argued that one intrinsic value of forced exercise is that the investigator has control over the factors that can influence the impact of exercise on behavioral outcomes, notably the frequency, duration, and intensity of the exercise regimen. However, compliance in forced exercise regimens may be an issue, particularly if the potential confounds of employing foot-shock are to be avoided. It is also important to consider that, since most cognitive and locomotor impairments strike in aged individuals, studies of the impact of exercise on these impairments should use aged rodents with the highest possible level of compliance, to minimize the number of test subjects required. Here, the pertinent steps and considerations necessary to achieve nearly 100% compliance to treadmill exercise in an aged rodent model will be presented and discussed. Whatever particular exercise regimen is employed, our protocol should be of use to investigators who are interested in the potential impact of forced exercise on aging-related impairments, including aging-related Parkinsonism and Parkinson's disease.

Institutions: Heart Research Center Goettingen, University Medical Center Goettingen, German Center for Cardiovascular Research (DZHK) partner site Goettingen, University of Maryland School of Medicine.

In cardiac myocytes a complex network of membrane tubules - the transverse-axial tubule system (TATS) - controls deep intracellular signaling functions. While the outer surface membrane and associated TATS membrane components appear to be continuous, there are substantial differences in lipid and protein content. In ventricular myocytes (VMs), certain TATS components are highly abundant, contributing to rectilinear tubule networks and regular branching 3D architectures. It is thought that peripheral TATS components propagate action potentials from the cell surface to thousands of remote intracellular sarcoendoplasmic reticulum (SER) membrane contact domains, thereby activating intracellular Ca2+ release units (CRUs). In contrast to VMs, the organization and functional role of TATS membranes in atrial myocytes (AMs) is significantly different and much less understood. Taken together, quantitative structural characterization of TATS membrane networks in healthy and diseased myocytes is an essential prerequisite towards better understanding of functional plasticity and pathophysiological reorganization. Here, we present a strategic combination of protocols for direct quantitative analysis of TATS membrane networks in living VMs and AMs. For this, we accompany primary cell isolations of mouse VMs and/or AMs with critical quality control steps and direct membrane staining protocols for fluorescence imaging of TATS membranes. Using an optimized workflow for confocal or superresolution TATS image processing, binarized and skeletonized data are generated for quantitative analysis of the TATS network and its components. Unlike previously published indirect regional aggregate image analysis strategies, our protocols enable direct characterization of specific components and derivation of complex physiological properties of TATS membrane networks in living myocytes, with high throughput and open-access software tools.
In summary, the combined protocol strategy can be readily applied for quantitative TATS network studies during physiological myocyte adaptation or disease changes, comparison of different cardiac or skeletal muscle cell types, phenotyping of transgenic models, and pharmacological or therapeutic interventions.
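The binarization step in the workflow above can be outlined in a few lines. The sketch below builds a toy stand-in for a TATS fluorescence image (a bright tubule grid on a dimmer background) and thresholds it; real pipelines use a principled threshold such as Otsu's method and would follow with skeletonization (e.g., scikit-image's `skeletonize`), which is omitted here. All image values and the simple mean-plus-one-standard-deviation threshold are illustrative assumptions, not the study's parameters.

```python
import numpy as np

# Toy stand-in for a confocal TATS image: bright tubule "grid" on a
# dimmer background (real data come from membrane-dye fluorescence).
rng = np.random.default_rng(0)
img = rng.normal(100.0, 10.0, (64, 64))   # background intensity
img[::8, :] += 150.0                      # transverse tubule rows
img[:, ::8] += 150.0                      # axial tubule columns

# Binarize with a simple global threshold; real pipelines would use
# e.g. Otsu's method and then skeletonize the binary network.
threshold = img.mean() + img.std()
binary = img > threshold

# Simple network metric: fraction of the image occupied by tubules.
print(f"tubule area fraction: {binary.mean():.2f}")
```

Metrics such as tubule area fraction, branch counts, and network orientation are then computed on the binarized and skeletonized data, as described in the protocol above.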

Institutions: University of California, Berkeley, University of California, San Francisco.

Injuries to the tendon (e.g., wrist tendonitis, epicondylitis) due to overuse are common in sports activities and the workplace. Most are associated with repetitive, high-force hand activities. The mechanisms of cellular and structural damage due to cyclical loading are not well known. The purpose of this video is to present a new system that can simultaneously load four tendons in tissue culture. The video describes the methods of sterile tissue harvest and how the tendons are loaded onto a clamping system that is subsequently immersed in media and maintained at 37°C. One clamp is fixed while the other is moved by a linear actuator. Tendon tensile force is monitored with a load cell in series with the mobile clamp. The actuators are controlled with a LabView program. The four tendons can be repetitively loaded with different patterns of loading, repetition rate, rate of loading, and duration. Loading can continue from a few minutes to 48 hours. At the end of loading, the tendons are removed and the mid-substance is extracted for biochemical analyses. This system allows investigation of the effects of loading patterns on gene expression and structural changes in tendon. Ultimately, mechanisms of injury due to overuse can be studied, with the findings applied to treatment and prevention.
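To make the loading parameters concrete, the sketch below generates one possible command profile for the actuator: a sinusoidal tension cycle parameterized by peak force, repetition rate, and duration. This is a hypothetical illustration, not the study's LabView program, which also supports other loading patterns and rates of loading.

```python
import math

def cyclic_load(peak_force_n: float, reps_per_min: float,
                duration_s: float, sample_hz: int = 100) -> list:
    """Sampled sinusoidal tension profile rising from 0 to peak_force_n
    and back, reps_per_min times per minute. Illustrative only."""
    freq_hz = reps_per_min / 60.0
    n_samples = int(duration_s * sample_hz)
    return [0.5 * peak_force_n
            * (1.0 - math.cos(2.0 * math.pi * freq_hz * t / sample_hz))
            for t in range(n_samples)]

# 10 N peak, 30 cycles/min (0.5 Hz), for 4 s at 100 samples/s.
profile = cyclic_load(10.0, 30.0, 4.0)
print(len(profile), max(profile), min(profile))
```

The load-cell reading would then be compared against such a commanded profile to verify that the clamped tendon actually experiences the intended force cycle.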

Developmental biology, issue 4, tendon, tension


The Use of Biofeedback in Clinical Virtual Reality: The INTREPID Project

Generalized anxiety disorder (GAD) is a psychiatric disorder characterized by constant, unspecific anxiety that interferes with daily-life activities. Its high prevalence in the general population and the severe limitations it causes point to the need for new, efficient strategies to treat it. Together with cognitive-behavioral treatments, relaxation represents a useful approach for the treatment of GAD, but it has the limitation of being hard to learn. The INTREPID project aims to implement a new instrument to treat anxiety-related disorders and to test its clinical efficacy in reducing anxiety-related symptoms. The innovation of this approach is the combination of virtual reality and biofeedback, such that the virtual environment is directly modified by the biofeedback output. In this way, the patient is made aware of his or her reactions through the real-time modification of features of the VR environment. Using mental exercises, the patient learns to control these physiological parameters, and using the feedback provided by the virtual environment is able to gauge his or her success. The supplemental use of portable devices, such as PDAs or smartphones, allows the patient to perform the same exercises experienced in the therapist's office at home, individually and autonomously. The goal is to anchor the learned protocol in a real-life context, thereby enhancing the patients' ability to deal with their symptoms. The expected result is better and faster learning of relaxation techniques, and thus increased effectiveness of the treatment compared with traditional clinical protocols.
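The biofeedback-to-VR coupling described above can be sketched as a simple mapping from a physiological reading to a visual parameter of the scene. The function below is a hypothetical illustration, not part of the INTREPID system: heart rate is mapped linearly to a "fog density" between 0 (clear scene, calm) and 1 (dense fog, stressed), with the calm and stressed set points chosen arbitrarily.

```python
def fog_density(heart_rate_bpm: float,
                calm_bpm: float = 60.0,
                stressed_bpm: float = 100.0) -> float:
    """Linear map from heart rate to a scene parameter in [0, 1]:
    calm -> clear scene (0.0), stressed -> dense fog (1.0).
    Set points and the parameter itself are illustrative."""
    x = (heart_rate_bpm - calm_bpm) / (stressed_bpm - calm_bpm)
    return min(1.0, max(0.0, x))

# As the patient relaxes, heart rate falls and the scene clears,
# giving direct visual feedback on the success of the exercise.
for hr in (110.0, 90.0, 70.0, 58.0):
    print(hr, fog_density(hr))
```

In a real session this mapping would run continuously in a loop, reading from the biofeedback sensor and updating the rendered environment each frame.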

Diabetes mellitus is a major independent risk factor for increased morbidity and mortality in the hospitalized patient, and elevated blood glucose concentrations, even in non-diabetic patients, predict poor outcomes.1-4 The 2008 consensus statement by the American Association of Clinical Endocrinologists (AACE) and the American Diabetes Association (ADA) states that "hyperglycemia in hospitalized patients, irrespective of its cause, is unequivocally associated with adverse outcomes."5 It is important to recognize that hyperglycemia occurs in patients with known or undiagnosed diabetes as well as during acute illness in those with previously normal glucose tolerance.
The Normoglycemia in Intensive Care Evaluation-Survival Using Glucose Algorithm Regulation (NICE-SUGAR) study involved over six thousand adult intensive care unit (ICU) patients who were randomized to intensive glucose control or conventional glucose control.6 Surprisingly, this trial found that intensive glucose control increased the risk of mortality by 14% (odds ratio, 1.14; p=0.02). In addition, there was an increased prevalence of severe hypoglycemia in the intensive control group compared with the conventional control group (6.8% vs. 0.5%, respectively; p<0.001). From this pivotal trial and two others,7,8 Wyoming Medical Center (WMC) realized the importance of controlling hyperglycemia in the hospitalized patient while avoiding the negative impact of resultant hypoglycemia.
Despite multiple revisions of an IV insulin paper protocol, analysis of data from use of the paper protocol at WMC showed that, in terms of achieving normoglycemia while minimizing hypoglycemia, results were suboptimal. Therefore, through a systematic implementation plan, monitoring of patient blood glucose levels was switched from a paper IV insulin protocol to a computerized glucose management system. By comparing blood glucose levels under the paper protocol with those under the computerized system, it was determined that, overall, the computerized glucose management system resulted in more rapid and tighter glucose control than the traditional paper protocol. Specifically, a substantial increase in the time spent within the target blood glucose concentration range, as well as a decrease in the prevalence of severe hypoglycemia (BG < 40 mg/dL), clinical hypoglycemia (BG < 70 mg/dL), and hyperglycemia (BG > 180 mg/dL), was observed in the first five months after implementation of the computerized glucose management system. The computerized system achieved target concentrations in greater than 75% of all readings while minimizing the risk of hypoglycemia; the prevalence of hypoglycemia (BG < 70 mg/dL) with the computerized system was well under 1%.
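The outcome categories reported above follow directly from the cutoffs quoted in the text (severe hypoglycemia BG < 40 mg/dL, clinical hypoglycemia BG < 70 mg/dL, hyperglycemia BG > 180 mg/dL); the target band of 70-180 mg/dL is inferred from those cutoffs. A minimal sketch of classifying a series of readings, with the example values invented for illustration:

```python
from collections import Counter

def classify_bg(bg_mg_dl: float) -> str:
    """Bucket a blood glucose reading (mg/dL) using the cutoffs
    quoted in the text; the 70-180 target band is inferred."""
    if bg_mg_dl < 40:
        return "severe hypoglycemia"
    if bg_mg_dl < 70:
        return "clinical hypoglycemia"
    if bg_mg_dl > 180:
        return "hyperglycemia"
    return "target"

# Hypothetical readings, one per category plus two in range.
readings = [38, 65, 110, 150, 210]
counts = Counter(classify_bg(bg) for bg in readings)
print(counts)
```

Tallying such counts over all readings in a reporting period yields exactly the prevalence figures compared between the paper and computerized protocols.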

A Strategy to Identify de Novo Mutations in Common Disorders such as Autism and Schizophrenia

Authors: Gauthier Julie, Fadi F. Hamdan, Guy A. Rouleau.

Institutions: Universite de Montreal.

There are several lines of evidence supporting the role of de novo mutations as a mechanism for common disorders, such as autism and schizophrenia. First, the de novo mutation rate in humans is relatively high, so new mutations are generated at a high frequency in the population. However, de novo mutations have not been reported in most common diseases. Mutations in genes leading to severe diseases where there is a strong negative selection against the phenotype, such as lethality in embryonic stages or reduced reproductive fitness, will not be transmitted to multiple family members, and therefore will not be detected by linkage gene mapping or association studies. The observation of very high concordance in monozygotic twins and very low concordance in dizygotic twins also strongly supports the hypothesis that a significant fraction of cases may result from new mutations. Such is the case for diseases such as autism and schizophrenia. Second, despite reduced reproductive fitness1 and extremely variable environmental factors, the incidence of some diseases is maintained worldwide at a relatively high and constant rate. This is the case for autism and schizophrenia, with an incidence of approximately 1% worldwide. Mutational load can be thought of as a balance between selection for or against a deleterious mutation and its production by de novo mutation. Lower rates of reproduction constitute a negative selection factor that should reduce the number of mutant alleles in the population, ultimately leading to decreased disease prevalence. These selective pressures tend to be of different intensity in different environments. Nonetheless, these severe mental disorders have been maintained at a constant relatively high prevalence in the worldwide population across a wide range of cultures and countries despite a strong negative selection against them2. 
This is not what one would predict for diseases with reduced reproductive fitness, unless there were a high new-mutation rate. Finally, there are the effects of paternal age: the risk of disease increases significantly with paternal age, which could result from the age-related increase in paternal de novo mutations. This is the case for autism and schizophrenia3. The male-to-female ratio of the mutation rate is estimated at about 4-6:1, presumably due to the higher number of germ-cell divisions with age in males. Therefore, one would predict that de novo mutations would more frequently come from males, particularly older males4. A high rate of new mutations may in part explain why genetic studies have so far failed to identify many genes predisposing to complex diseases, such as autism and schizophrenia, and why diseases have been associated with a mere 3% of genes in the human genome. Identification of de novo mutations as a cause of a disease requires a targeted molecular approach, which includes studying parents and affected subjects. The process for determining whether the genetic basis of a disease may result in part from de novo mutations, and the molecular approach to establish this link, will be illustrated using autism and schizophrenia as examples.
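The mutation-selection balance invoked above can be made concrete with a toy recurrence: each generation, selection removes a fraction s of deleterious alleles while de novo mutation adds new ones at rate μ, so the frequency settles at μ/s instead of declining to zero. The numbers below are purely illustrative, not estimates for autism or schizophrenia.

```python
def equilibrium_freq(mu: float, s: float,
                     generations: int = 10_000, q0: float = 0.0) -> float:
    """Iterate a simple mutation-selection recurrence: each generation,
    selection removes a fraction s of deleterious alleles and de novo
    mutation adds new ones at rate mu. Converges to mu / s."""
    q = q0
    for _ in range(generations):
        q = q * (1.0 - s) + mu
    return q

mu, s = 1e-4, 0.5  # illustrative: high mutation rate, strong selection
q = equilibrium_freq(mu, s)
print(q, mu / s)   # simulated equilibrium vs analytic value
```

This is why a disease under strong negative selection can persist at a stable prevalence: the equilibrium reflects the ratio of mutational input to selective removal, which is the argument the abstract makes qualitatively.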
