The infraorder Mygalomorphae (i.e., trapdoor spiders, tarantulas, funnel-web spiders, etc.) is one of three main lineages of spiders. Comprising 15 families, 325 genera, and over 2,600 species, the group is a diverse assemblage that has retained a number of features considered primitive for spiders. Despite an evolutionary history dating back to the Lower Triassic, the group has received comparatively little attention with respect to its phylogeny and higher classification. The few published phylogenies share a common thread: a stable classification scheme for the group remains unresolved.

Venoms are chemically complex secretions, typically comprising numerous proteins and peptides with varied physiological activities. Functional characterization of venom proteins has important biomedical applications, including the identification of drug leads or probes for cellular receptors. Spiders are the most species-rich clade of venomous organisms, but the venoms of only a few species are well understood, in part because of the difficulty of collecting minute quantities of venom from small animals. This paper presents a protocol for collecting venom from spiders by electrical stimulation, demonstrated on the Western black widow (Latrodectus hesperus). The collected venom is useful for varied downstream analyses, including direct protein identification via mass spectrometry, functional assays, and stimulation of venom gene expression for transcriptomic studies. This technique has an advantage over protocols that isolate venom from whole-gland homogenates, which fail to separate genuine venom components from cellular proteins that are not secreted as part of the venom. Representative results demonstrate the detection of known venom peptides in the collected sample using mass spectrometry. The venom collection procedure is followed by a protocol for dissecting spider venom glands, with results demonstrating the characterization of venom-expressed proteins and peptides at the sequence level.

As society progresses and resources become scarcer, it is increasingly important to cultivate new technologies for engineering next-generation biomaterials with high-performance properties. The development of these new structural materials must be rapid and cost-efficient, and must involve processing methodologies and products that are environmentally friendly and sustainable. Spiders spin a multitude of fiber types with diverse mechanical properties, offering a rich source of next-generation engineering materials for biomimicry that rival the best man-made and natural materials. Because the collection of large quantities of natural spider silk is impractical, synthetic silk production can provide scientists with access to an unlimited supply of threads. If the spinning process can be streamlined and perfected, artificial spider fibers have potential uses in applications ranging from body armor, surgical sutures, ropes and cables, tires, and strings for musical instruments to composites for aviation and aerospace technology. To advance the synthetic silk production process and to yield fibers with low spin-to-spin variance in their material properties, we developed a wet-spinning protocol that integrates expression of recombinant spider silk proteins in bacteria, purification and concentration of the proteins, fiber extrusion, and a mechanical post-spin treatment. This is the first visual representation of a step-by-step process for spinning and analyzing artificial silk fibers on a laboratory scale. It also provides details for minimizing variability among fibers spun from the same spinning dope. Collectively, these methods will propel the process of artificial silk production, leading to higher-quality fibers that surpass natural spider silks.

Institutions: Yale University, Virginia Tech, The Hebrew University of Jerusalem.

The quantity and quality of detritus entering the soil determine the rate of decomposition by microbial communities as well as the rates of nitrogen (N) recycling and carbon (C) sequestration1,2. Plant litter comprises the majority of detritus3, so decomposition is assumed to be only marginally influenced by biomass inputs from animals such as herbivores and carnivores4,5. However, carnivores may influence microbial decomposition of plant litter via a chain of interactions in which predation risk alters the physiology of their herbivore prey, which in turn alters soil microbial functioning when the herbivore carcasses decompose6. A physiological stress response by herbivores to the risk of predation can change the C:N elemental composition of herbivore biomass7,8,9, because stress from predation risk increases herbivore basal energy demands; in nutrient-limited systems, this forces herbivores to shift their consumption away from the N-rich resources that support growth and reproduction toward C-rich carbohydrate resources that support heightened metabolism6. Herbivores have a limited ability to store excess nutrients, so stressed herbivores excrete N as they increase carbohydrate-C consumption7. Ultimately, prey stressed by predation risk increase their body C:N ratio7,10, making them poorer-quality resources for the soil microbial pool, likely due to lower availability of labile N for microbial enzyme production6. Thus, decomposition of carcasses of stressed herbivores has a priming effect on the functioning of microbial communities that decreases the subsequent ability of microbes to decompose plant litter6,10,11.
We present the methodology to evaluate linkages between predation risk and litter decomposition by soil microbes. We describe how to induce stress in herbivores with predation risk, measure those stress responses, and measure the consequences for microbial decomposition. We draw insights from a model grassland ecosystem comprising a hunting spider predator (Pisaurina mira), a dominant grasshopper herbivore (Melanoplus femurrubrum), and a variety of grass and forb plants9.

The internal transcribed spacer 2 (ITS2) has been used as a phylogenetic marker for more than two decades. Because ITS2 research mainly focused on the highly variable ITS2 sequence, the marker was long confined to low-level phylogenetics. However, combining the ITS2 sequence with its highly conserved secondary structure improves phylogenetic resolution1 and allows phylogenetic inference at multiple taxonomic ranks, including species delimitation2-8.
The ITS2 Database9 presents an exhaustive dataset of internal transcribed spacer 2 sequences from NCBI GenBank11, accurately reannotated10. Following annotation by profile Hidden Markov Models (HMMs), the secondary structure of each sequence is predicted. First, it is tested whether a minimum-energy-based fold12 (direct fold) results in a correct four-helix conformation. If not, the structure is predicted by homology modeling13, in which an already known secondary structure is transferred to an ITS2 sequence whose secondary structure could not be folded correctly by direct folding.
The ITS2 Database is not only a database for storage and retrieval of ITS2 sequence-structures. It also provides several tools to process your own ITS2 sequences, including annotation, structural prediction, motif detection and BLAST14 search on the combined sequence-structure information. Moreover, it integrates trimmed versions of 4SALE15,16 and ProfDistS17 for multiple sequence-structure alignment calculation and Neighbor Joining18 tree reconstruction. Together they form a coherent analysis pipeline from an initial set of sequences to a phylogeny based on sequence and secondary structure.
In a nutshell, this workbench simplifies first phylogenetic analyses to only a few mouse-clicks, while additionally providing tools and data for comprehensive large-scale analyses.
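To make the tree-reconstruction step of such a pipeline concrete, the following is a minimal, topology-only sketch of the Neighbor Joining algorithm in Python. It is an illustration under stated assumptions, not the ProfDistS implementation: branch-length estimation is omitted, and ties in the Q-criterion are broken by first occurrence.

```python
def neighbor_joining(labels, D):
    """Neighbor Joining (Saitou & Nei) on a symmetric distance matrix.

    labels: list of taxon names; D: list of lists of pairwise distances.
    Returns an unrooted tree as nested tuples (the last three nodes are
    joined at the end). Branch lengths are omitted for brevity.
    """
    nodes = list(labels)
    D = [row[:] for row in D]  # work on a copy
    while len(nodes) > 3:
        n = len(nodes)
        r = [sum(row) for row in D]  # net divergence of each node
        # Q-criterion: join the pair minimizing (n-2)*d(i,j) - r_i - r_j
        i, j = min(((a, b) for a in range(n) for b in range(a + 1, n)),
                   key=lambda ab: (n - 2) * D[ab[0]][ab[1]] - r[ab[0]] - r[ab[1]])
        new = (nodes[i], nodes[j])
        # Distance from the new internal node to every other node
        d_new = [0.5 * (D[i][k] + D[j][k] - D[i][j]) for k in range(n)]
        keep = [k for k in range(n) if k not in (i, j)]
        nodes = [nodes[k] for k in keep] + [new]
        D = [[D[a][b] for b in keep] + [d_new[a]] for a in keep]
        D.append([d_new[k] for k in keep] + [0.0])
    return tuple(nodes)
```

For an additive matrix over four taxa, the algorithm recovers the generating topology; for real ITS2 sequence-structure data, the database's own 4SALE/ProfDistS pipeline should of course be preferred.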

Single Particle Electron Microscopy Reconstruction of the Exosome Complex Using the Random Conical Tilt Method

Authors: Xueqi Liu, Hong-Wei Wang.

Institutions: Yale University.

Single particle electron microscopy (EM) reconstruction has recently become a popular tool for obtaining the three-dimensional (3D) structure of large macromolecular complexes. Compared to X-ray crystallography, it has some unique advantages. First, single particle EM reconstruction does not require crystallizing the protein sample, which is the bottleneck in X-ray crystallography, especially for large macromolecular complexes. Second, it does not require large amounts of protein. Compared with the milligrams of protein necessary for crystallization, single particle EM reconstruction needs only several microliters of protein solution at nanomolar concentrations, using the negative staining EM method. However, apart from a few macromolecular assemblies with high symmetry, single particle EM is limited to relatively low resolution (worse than 1 nm) for many specimens, especially those without symmetry. The technique is also limited by the size of the molecules under study: in general, roughly 100 kDa or larger for negatively stained specimens and 300 kDa or larger for frozen-hydrated specimens.
For a new sample of unknown structure, we generally use a heavy metal solution to embed the molecules by negative staining. The specimen is then examined in a transmission electron microscope to take two-dimensional (2D) micrographs of the molecules. Ideally, the protein molecules have a homogeneous 3D structure but exhibit different orientations in the micrographs. These micrographs are digitized and processed in computers as "single particles". Using 2D alignment and classification techniques, homogeneous molecules in the same views are clustered into classes. Their averages enhance the signal of the molecule's 2D shapes. Once the particles are assigned the proper relative orientations (Euler angles), a 3D volume can be reconstructed from the 2D particle images.
In single particle 3D reconstruction, an essential step is to correctly assign the orientation of each single particle. There are several methods to assign the view for each particle, including angular reconstitution1 and the random conical tilt (RCT) method2. In this protocol, we describe our procedure for obtaining the 3D reconstruction of the yeast exosome complex using negative staining EM and RCT. It should be noted that our protocol for electron microscopy and image processing follows the basic principle of RCT but is not the only way to perform the method. We first describe how to embed the protein sample in a layer of uranyl formate with a thickness comparable to the protein size, using a holey carbon grid covered with a layer of continuous thin carbon film. The specimen is then inserted into a transmission electron microscope to collect untilted (0-degree) and tilted (55-degree) pairs of micrographs, which are used later for processing and obtaining an initial 3D model of the yeast exosome. To this end, we perform RCT and then refine the initial 3D model using the projection matching refinement method3.
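The translational part of the 2D alignment step can be illustrated with a short FFT cross-correlation sketch. This is a simplification of what single-particle packages actually do (real alignment also handles rotation and sub-pixel shifts); it only recovers the integer shift between a particle image and a reference:

```python
import numpy as np

def align_translation(ref, img):
    """Estimate the integer (row, col) shift to apply to img (e.g. with
    np.roll) so that it best matches ref, via FFT cross-correlation."""
    cc = np.fft.ifft2(np.fft.fft2(ref) * np.conj(np.fft.fft2(img))).real
    peak = np.unravel_index(np.argmax(cc), cc.shape)
    # Shifts beyond half the box size wrap around; map them to negatives.
    shifts = [p if p <= s // 2 else p - s for p, s in zip(peak, cc.shape)]
    return tuple(shifts)
```

In practice this runs over every windowed particle against a class average, and the recovered shifts recentre the particles before classification.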

Institutions: The University of Western Ontario, Instituto de Ciencias de la Vid y el Vino, Ghent University, University of Amsterdam.

The two-spotted spider mite, Tetranychus urticae, is a ubiquitous polyphagous arthropod herbivore that feeds on a remarkably broad array of plant species, more than 150 of which are of economic value. It is a major pest of greenhouse crops, especially Solanaceae and Cucurbitaceae (e.g., tomatoes, eggplants, peppers, cucumbers, zucchini) and greenhouse ornamentals (e.g., roses, chrysanthemums, carnations), of annual field crops (such as maize, cotton, soybean, and sugar beet), and of perennial cultures (alfalfa, strawberries, grapes, citruses, and plums)1,2. In addition to the extreme polyphagy that makes it an important agricultural pest, T. urticae tends to develop resistance to a wide array of insecticides and acaricides used for its control3-7.
T. urticae is an excellent experimental organism, as it has a rapid life cycle (7 days at 27 °C) and can be easily maintained at high density in the laboratory. Methods to assay gene expression (including in situ hybridization and antibody staining) and to inactivate expression of spider mite endogenous genes using RNA interference have been developed8-10. Recently, the whole genome sequence of T. urticae has been reported, creating an opportunity to develop this pest herbivore as a model organism with genomic resources equivalent to those that already exist for some of its host plants (Arabidopsis thaliana and the tomato Solanum lycopersicum)11. Together, these model organisms could provide insights into the molecular bases of plant-pest interactions.
Here, an efficient method for quick and easy collection of a large number of adult female mites, their application on an experimental plant host, and the assessment of the plant damage due to spider mite feeding are described. The presented protocol enables fast and efficient collection of hundreds of individuals at any developmental stage (eggs, larvae, nymphs, adult males, and females) that can be used for subsequent experimental application.

Institutions: University of Texas Medical Branch.

Cryo-electron microscopy (Cryo-EM)1 is a powerful approach to investigate the functional structure of proteins and complexes in a hydrated state and membrane environment2.
Coagulation Factor VIII (FVIII)3 is a multi-domain blood plasma glycoprotein. Defects or deficiency of FVIII cause hemophilia A, a severe bleeding disorder. Upon proteolytic activation, FVIII binds to the serine protease Factor IXa on the negatively charged platelet membrane, an interaction critical for normal blood clotting4. Despite the pivotal role FVIII plays in coagulation, structural information for its membrane-bound state is incomplete5. Recombinant FVIII concentrate is the most effective drug against hemophilia A, and commercially available FVIII can be expressed in human or porcine form, both forming functional complexes with human Factor IXa6,7.
In this study, we present a combination of Cryo-EM, lipid nanotechnology, and structure analysis applied to resolve the membrane-bound structure of two highly homologous FVIII forms: human and porcine. The methodology developed in our laboratory to helically organize the two functional recombinant FVIII forms on negatively charged lipid nanotubes (LNT) is described. The representative results demonstrate that our approach is sufficiently sensitive to define the differences in helical organization between the two highly homologous proteins (86% sequence identity). Detailed protocols for the helical organization, Cryo-EM, and electron tomography (ET) data acquisition are given. The two-dimensional (2D) and three-dimensional (3D) structure analysis applied to obtain the 3D reconstructions of human and porcine FVIII-LNT is discussed. The presented human and porcine FVIII-LNT structures show the potential of the proposed methodology to resolve the functional, membrane-bound organization of blood coagulation Factor VIII at high resolution.

Drosophila melanogaster embryonic and larval tissues often contain a highly heterogeneous mixture of cell types, which can complicate the analysis of gene expression in these tissues. Thus, to analyze cell-specific gene expression profiles from Drosophila tissues, it may be necessary to isolate specific cell types with high purity and at sufficient yields for downstream applications such as transcriptional profiling and chromatin immunoprecipitation. However, the irregular cellular morphology in tissues such as the central nervous system, coupled with the rare population of specific cell types in these tissues, can pose challenges for traditional methods of cell isolation such as laser microdissection and fluorescence-activated cell sorting (FACS). Here, an alternative approach to characterizing cell-specific gene expression profiles using affinity-based isolation of tagged nuclei, rather than whole cells, is described. Nuclei in the specific cell type of interest are genetically labeled with a nuclear envelope-localized EGFP tag using the Gal4/UAS binary expression system. These EGFP-tagged nuclei can be isolated using antibodies against GFP that are coupled to magnetic beads. The approach described in this protocol enables consistent isolation of nuclei from specific cell types in the Drosophila larval central nervous system at high purity and at sufficient levels for expression analysis, even when these cell types comprise less than 2% of the total cell population in the tissue. This approach can be used to isolate nuclei from a wide variety of Drosophila embryonic and larval cell types using specific Gal4 drivers, and may be useful for isolating nuclei from cell types that are not suitable for FACS or laser microdissection.

The zebrafish model has emerged as a relevant system to study kidney development, regeneration and disease. Both the embryonic and adult zebrafish kidneys are composed of functional units known as nephrons, which are highly conserved with other vertebrates, including mammals. Research in zebrafish has recently demonstrated that two distinctive phenomena transpire after adult nephrons incur damage: first, there is robust regeneration within existing nephrons that replaces the destroyed tubule epithelial cells; second, entirely new nephrons are produced from renal progenitors in a process known as neonephrogenesis. In contrast, humans and other mammals seem to have only a limited ability for nephron epithelial regeneration. To date, the mechanisms responsible for these kidney regeneration phenomena remain poorly understood. Since adult zebrafish kidneys undergo both nephron epithelial regeneration and neonephrogenesis, they provide an outstanding experimental paradigm to study these events. Further, there is a wide range of genetic and pharmacological tools available in the zebrafish model that can be used to delineate the cellular and molecular mechanisms that regulate renal regeneration. One essential aspect of such research is the evaluation of nephron structure and function. This protocol describes a set of labeling techniques that can be used to gauge renal composition and test nephron functionality in the adult zebrafish kidney. Thus, these methods are widely applicable to the future phenotypic characterization of adult zebrafish kidney injury paradigms, which include, but are not limited to, nephrotoxicant exposure regimes and genetic methods of targeted cell death such as the nitroreductase-mediated cell ablation technique. Further, these methods could be used to study genetic perturbations in adult kidney formation and could also be applied to assess renal status during chronic disease modeling.

Microorganisms are present on all inanimate surfaces, creating ubiquitous sources of possible contamination in the laboratory. Experimental success relies on the ability of a scientist to sterilize work surfaces and equipment as well as prevent contact of sterile instruments and solutions with non-sterile surfaces. Here we present the steps for several plating methods routinely used in the laboratory to isolate, propagate, or enumerate microorganisms such as bacteria and phage. All five methods incorporate aseptic technique, or procedures that maintain the sterility of experimental materials. Procedures described include (1) streak-plating bacterial cultures to isolate single colonies, (2) pour-plating and (3) spread-plating to enumerate viable bacterial colonies, (4) soft agar overlays to isolate phage and enumerate plaques, and (5) replica-plating to transfer cells from one plate to another in an identical spatial pattern. These procedures can be performed at the laboratory bench, provided they involve non-pathogenic strains of microorganisms (Biosafety Level 1, BSL-1). If working with BSL-2 organisms, these manipulations must take place in a biosafety cabinet. Consult the most current edition of the Biosafety in Microbiological and Biomedical Laboratories (BMBL) as well as Material Safety Data Sheets (MSDS) for Infectious Substances to determine the biohazard classification as well as the safety precautions and containment facilities required for the microorganism in question. Bacterial strains and phage stocks can be obtained from research investigators, companies, and collections maintained by particular organizations such as the American Type Culture Collection (ATCC). It is recommended that non-pathogenic strains be used when learning the various plating methods. By following the procedures described in this protocol, students should be able to:
● Perform plating procedures without contaminating media.
● Isolate single bacterial colonies by the streak-plating method.
● Use pour-plating and spread-plating methods to determine the concentration of bacteria.
● Perform soft agar overlays when working with phage.
● Transfer bacterial cells from one plate to another using the replica-plating procedure.
● Given an experimental task, select the appropriate plating method.
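The enumeration arithmetic behind the pour-plating and spread-plating objectives is worth making explicit. The sketch below shows the standard viable-count calculation (colonies divided by dilution factor and plated volume); the specific numbers in the usage note are illustrative, not from this protocol.

```python
def cfu_per_ml(colony_count, dilution_factor, volume_plated_ml):
    """Viable-cell concentration (CFU/mL) of the original culture.

    colony_count:      colonies counted on one plate (ideally 30-300)
    dilution_factor:   total dilution of the plated sample, e.g. 1e-6
    volume_plated_ml:  volume spread or poured onto the plate, in mL
    """
    return colony_count / (dilution_factor * volume_plated_ml)
```

For example, 150 colonies on a plate spread with 0.1 mL of a 10^-6 dilution corresponds to about 1.5 x 10^9 CFU/mL in the undiluted culture.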

Institutions: University of California, Riverside, University of São Paulo - USP, ISCA Technologies.

An inexpensive, noninvasive system that could accurately classify flying insects would have important implications for entomological research and would allow the development of many useful applications in vector and pest control for both medical and agricultural entomology. Accordingly, the last sixty years have seen many research efforts devoted to this task. To date, however, none of this research has had a lasting impact. In this work, we show that pseudo-acoustic optical sensors can produce superior data; that additional features, both intrinsic and extrinsic to the insect's flight behavior, can be exploited to improve insect classification; that a Bayesian classification approach allows classification models to be learned efficiently and to be very robust to over-fitting; and that a general classification framework allows an arbitrary number of features to be incorporated easily. We demonstrate these findings with large-scale experiments that dwarf all previous work combined, as measured by the number of insects and the number of species considered.
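The Bayesian classification idea can be sketched with a single feature, wingbeat frequency. This is a minimal illustration, not the authors' framework: the class-conditional means and standard deviations below are invented for the example, and the real system combines many intrinsic and extrinsic features.

```python
import math

def gaussian_pdf(x, mean, std):
    """Density of a normal distribution at x."""
    return math.exp(-0.5 * ((x - mean) / std) ** 2) / (std * math.sqrt(2 * math.pi))

def classify(wingbeat_hz, classes, priors=None):
    """Maximum a posteriori species for one wingbeat-frequency observation.

    classes: {species: (mean_hz, std_hz)} -- Gaussian class-conditionals.
    priors:  {species: prior probability}; defaults to uniform. Extrinsic
    features (e.g. time of day) would enter the model through the priors.
    """
    if priors is None:
        priors = {s: 1.0 / len(classes) for s in classes}
    posterior = {s: priors[s] * gaussian_pdf(wingbeat_hz, m, sd)
                 for s, (m, sd) in classes.items()}
    return max(posterior, key=posterior.get)
```

Because each feature contributes a likelihood factor, adding features to such a model is multiplicative rather than structural, which is what makes the framework easy to extend.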

We demonstrate methods for the detection of architectural distortion in prior mammograms of interval-cancer cases based on analysis of the orientation of breast tissue patterns in mammograms. We hypothesize that architectural distortion modifies the normal orientation of breast tissue patterns in mammographic images before the formation of masses or tumors. In the initial steps of our methods, the oriented structures in a given mammogram are analyzed using Gabor filters and phase portraits to detect node-like sites of radiating or intersecting tissue patterns. Each detected site is then characterized using the node value, fractal dimension, and a measure of angular dispersion specifically designed to represent spiculating patterns associated with architectural distortion.
Our methods were tested with a database of 106 prior mammograms of 56 interval-cancer cases and 52 mammograms of 13 normal cases, using the features developed for the characterization of architectural distortion, pattern classification via quadratic discriminant analysis, and validation with the leave-one-patient-out procedure. According to the results of free-response receiver operating characteristic analysis, our methods can detect architectural distortion in prior mammograms, taken 15 months (on average) before clinical diagnosis of breast cancer, with a sensitivity of 80% at about five false positives per patient.
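The orientation analysis underlying the first step can be sketched with a small Gabor filter bank. This is a simplified stand-in for the full method (no phase portraits, node maps, or multiscale filtering); the kernel size, sigma, and wavelength below are illustrative choices, and theta here is the direction of the sinusoidal modulation, i.e. the normal to the oriented structure.

```python
import numpy as np

def gabor_kernel(theta, size=15, sigma=3.0, wavelength=6.0):
    """Real-valued Gabor kernel with modulation along direction theta."""
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    xr = x * np.cos(theta) + y * np.sin(theta)   # rotated coordinate
    envelope = np.exp(-(x ** 2 + y ** 2) / (2 * sigma ** 2))
    return envelope * np.cos(2 * np.pi * xr / wavelength)

def dominant_orientation(patch, n_angles=18):
    """Modulation angle (radians, in [0, pi)) of the strongest response."""
    thetas = [k * np.pi / n_angles for k in range(n_angles)]
    responses = [abs(np.sum(patch * gabor_kernel(t))) for t in thetas]
    return thetas[int(np.argmax(responses))]
```

Applied patch-by-patch, such a bank yields the orientation field whose node-like radiating or intersecting patterns the phase-portrait analysis then detects.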

Investigators have long been interested in the human propensity for the rapid detection of threatening stimuli. However, until recently, research in this domain has focused almost exclusively on adult participants, completely ignoring the topic of threat detection over the course of development. One of the biggest reasons for the lack of developmental work in this area is likely the absence of a reliable paradigm that can measure perceptual biases for threat in children. To address this issue, we recently designed a modified visual search paradigm similar to the standard adult paradigm that is appropriate for studying threat detection in preschool-aged participants. Here we describe this new procedure. In the general paradigm, we present participants with matrices of color photographs, and ask them to find and touch a target on the screen. Latency to touch the target is recorded. Using a touch-screen monitor makes the procedure simple and easy, allowing us to collect data in participants ranging from 3 years of age to adults. Thus far, the paradigm has consistently shown that both adults and children detect threatening stimuli (e.g., snakes, spiders, angry/fearful faces) more quickly than neutral stimuli (e.g., flowers, mushrooms, happy/neutral faces). Altogether, this procedure provides an important new tool for researchers interested in studying the development of attentional biases for threat.

Many researchers, across incredibly diverse foci, are applying phylogenetics to their research questions. However, many of these researchers are new to the topic, which presents inherent problems. Here we compile a practical introduction to phylogenetics for nonexperts. We outline, in a step-by-step manner, a pipeline for generating reliable phylogenies from gene sequence datasets. We begin with a user guide for similarity search tools via online interfaces as well as local executables. Next, we explore programs for generating multiple sequence alignments, followed by protocols for using software to determine best-fit models of evolution. We then outline protocols for reconstructing phylogenetic relationships via maximum likelihood and Bayesian criteria, and finally describe tools for visualizing phylogenetic trees. While this is by no means an exhaustive description of phylogenetic approaches, it does provide the reader with practical starting information on key software applications commonly utilized by phylogeneticists. We envision this article serving as a practical training tool for researchers embarking on phylogenetic studies, and also as an educational resource that could be incorporated into a classroom or teaching lab.
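A small worked example helps demystify the "model of evolution" step for newcomers. The sketch below computes the Jukes-Cantor (JC69) corrected distance between two aligned sequences; it is a teaching illustration of the simplest substitution model, not a substitute for the model-selection software covered in the pipeline.

```python
import math

def jc69_distance(seq_a, seq_b):
    """Jukes-Cantor (JC69) corrected distance between two aligned sequences.

    Columns containing a gap in either sequence are skipped. p is the
    observed proportion of differing sites; the correction
    d = -(3/4) * ln(1 - 4p/3) accounts for multiple hits at one site.
    """
    pairs = [(a, b) for a, b in zip(seq_a, seq_b) if a != '-' and b != '-']
    p = sum(a != b for a, b in pairs) / len(pairs)
    if p >= 0.75:  # correction undefined at or beyond saturation
        return float('inf')
    return -0.75 * math.log(1 - 4 * p / 3)
```

Identical sequences give a distance of 0, and the corrected distance always exceeds the raw proportion of differences, which is the point of the correction.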

Modern spiders spin high-performance silk fibers with a broad range of biological functions, including locomotion, prey capture and protection of developing offspring 1,2. Spiders accomplish these tasks by spinning several distinct fiber types that have diverse mechanical properties. Such specialization of fiber types has occurred through the evolution of different silk-producing glands, which function as small biofactories. These biofactories manufacture and store large quantities of silk proteins for fiber production. Through a complex series of biochemical events, these silk proteins are converted from a liquid into a solid material upon extrusion.
Mechanical studies have demonstrated that spider silks are stronger than high-tensile steel 3. Analyses to understand the relationship between the structure and function of spider silk threads have revealed that spider silk consists largely of proteins, or fibroins, that have block repeats within their protein sequences 4. Common molecular signatures that contribute to the incredible tensile strength and extensibility of spider silks are being unraveled through the analyses of translated silk cDNAs. Given the extraordinary material properties of spider silks, research labs across the globe are racing to understand and mimic the spinning process to produce synthetic silk fibers for commercial, military and industrial applications. One of the main challenges to spinning artificial spider silk in the research lab involves a complete understanding of the biochemical processes that occur during extrusion of the fibers from the silk-producing glands.
Here we present a method for isolating the seven different silk-producing glands from the cob-weaving black widow spider: the major and minor ampullate glands [which manufacture dragline and scaffolding silk] 5,6, tubuliform [synthesizes egg case silk] 7,8, flagelliform [unknown function in cob-weavers], aggregate [makes glue silk], aciniform [synthesizes prey-wrapping and egg case threads] 9, and pyriform [produces attachment disc silk] 10. The approach is based on anesthetizing the spider with carbon dioxide gas, separating the cephalothorax from the abdomen, and microdissecting the abdomen to obtain the silk-producing glands. Once separated, these tissues can be used to retrieve macromolecules for distinct biochemical analyses, including quantitative real-time PCR, northern and western blotting, and mass spectrometry (MS or MS/MS) to identify new silk protein sequences or to search for proteins that participate in the silk assembly pathway; alternatively, the intact tissue can be used for cell culture or histological experiments.

Perceptual and Category Processing of the Uncanny Valley Hypothesis' Dimension of Human Likeness: Some Methodological Issues

Authors: Marcus Cheetham, Lutz Jancke.

Institutions: University of Zurich.

Mori's Uncanny Valley Hypothesis1,2 proposes that the perception of humanlike characters such as robots and, by extension, avatars (computer-generated characters) can evoke negative or positive affect (valence) depending on the object's degree of visual and behavioral realism along a dimension of human likeness (DHL) (Figure 1). But studies of affective valence of subjective responses to variously realistic non-human characters have produced inconsistent findings 3, 4, 5, 6. One of a number of reasons for this is that human likeness is not perceived as the hypothesis assumes. While the DHL can be defined following Mori's description as a smooth linear change in the degree of physical humanlike similarity, subjective perception of objects along the DHL can be understood in terms of the psychological effects of categorical perception (CP) 7. Further behavioral and neuroimaging investigations of category processing and CP along the DHL and of the potential influence of the dimension's underlying category structure on affective experience are needed. This protocol therefore focuses on the DHL and allows examination of CP. Based on the protocol presented in the video as an example, issues surrounding the methodology in the protocol and the use in "uncanny" research of stimuli drawn from morph continua to represent the DHL are discussed in the article that accompanies the video. The use of neuroimaging and morph stimuli to represent the DHL in order to disentangle brain regions neurally responsive to physical human-like similarity from those responsive to category change and category processing is briefly illustrated.
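The morph continua used to represent the DHL can be illustrated in simplified form. The sketch below shows only the pixel cross-dissolve component of constructing a continuum between two endpoint images; the actual stimuli in such studies are produced with dedicated morphing software that also warps geometry between corresponding facial landmarks.

```python
import numpy as np

def morph_continuum(img_a, img_b, n_steps):
    """n_steps images linearly interpolated between img_a and img_b.

    Simplification: a pure cross-dissolve, (1-t)*A + t*B for t in [0, 1].
    True morphing additionally interpolates shape (landmark warping), so
    this sketch only conveys the idea of equal steps along a continuum.
    """
    ts = np.linspace(0.0, 1.0, n_steps)
    return [(1.0 - t) * img_a + t * img_b for t in ts]
```

Equally spaced physical steps along such a continuum are exactly what categorical perception experiments contrast with the observer's (often step-like) subjective percept.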

Diffusion tensor imaging (DTI) techniques provide information on the microstructural processes of cerebral white matter (WM) in vivo. The present applications are designed to investigate differences in WM involvement patterns across brain diseases, especially neurodegenerative disorders, by use of different DTI analyses in comparison with matched controls.
DTI data analysis is performed in several ways, i.e. voxelwise comparison of regional diffusion-direction-based metrics such as fractional anisotropy (FA), together with fiber tracking (FT) accompanied by tractwise fractional anisotropy statistics (TFAS) at the group level, in order to identify differences in FA along WM structures and define regional patterns of WM alterations at the group level. Transformation into a stereotaxic standard space is a prerequisite for group studies and requires thorough data processing to preserve directional inter-dependencies. The present applications show optimized technical approaches for preserving quantitative and directional information during spatial normalization in group-level data analyses. On this basis, FT techniques can be applied to group-averaged data in order to quantify metrics as defined by FT. Additionally, applying DTI methods, i.e. comparing differences in FA maps after stereotaxic alignment, in a longitudinal analysis at the individual-subject level reveals information about the progression of neurological disorders. Further quality improvement of DTI-based results can be obtained during preprocessing by controlled elimination of gradient directions with high noise levels.
In summary, DTI is used to define a distinct WM pathoanatomy of different brain diseases by the combination of whole brain-based and tract-based DTI analysis.
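As a minimal illustration of the central metric above, fractional anisotropy can be computed from the three eigenvalues of the diffusion tensor at a voxel; the eigenvalues in the example call are hypothetical:

```python
import math

def fractional_anisotropy(l1, l2, l3):
    """Compute FA from the three diffusion-tensor eigenvalues of a voxel.
    FA ranges from 0 (fully isotropic diffusion) to 1 (diffusion restricted
    to a single axis, as along a coherent white-matter tract)."""
    mean = (l1 + l2 + l3) / 3.0
    num = math.sqrt((l1 - mean) ** 2 + (l2 - mean) ** 2 + (l3 - mean) ** 2)
    den = math.sqrt(l1 ** 2 + l2 ** 2 + l3 ** 2)
    if den == 0:
        return 0.0
    return math.sqrt(1.5) * num / den

# Hypothetical eigenvalues (in 10^-3 mm^2/s) for a strongly anisotropic WM voxel.
fa = fractional_anisotropy(1.7, 0.3, 0.3)
```

Voxelwise comparison then amounts to computing this quantity per voxel in standard space and testing group differences at each location.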

Institutions: Emory University School of Medicine, Brigham and Women's Hospital and Massachusetts General Hospital.

Adapted tango dancing improves mobility and balance in older adults and in other populations with balance impairments. It is composed of very simple step elements and involves movement initiation and cessation, multi-directional perturbations, and varied speeds and rhythms. Focus on foot placement, whole-body coordination, and attention to partner, path of movement, and aesthetics likely underlies adapted tango's demonstrated efficacy for improving mobility and balance. In this paper, we describe the methodology for disseminating the adapted tango teaching methods to dance instructor trainees and for implementation of adapted tango by the trainees in the community for older adults and individuals with Parkinson's Disease (PD). Efficacy in improving mobility (measured with the Timed Up and Go, tandem stance, Berg Balance Scale, gait speed, and 30 sec chair stand), as well as safety and fidelity of the program, are maximized through targeted instructor and volunteer training and a structured, detailed syllabus outlining class practices and progression.

Institutions: Alberta Health Services / Calgary Laboratory Services / University of Calgary.

Staphylococcal Cassette Chromosome mec (SCCmec) typing is an important molecular tool for understanding the epidemiology and clonal strain relatedness of methicillin-resistant Staphylococcus aureus (MRSA), particularly with the emerging outbreaks of community-associated MRSA (CA-MRSA) occurring worldwide. Traditional PCR typing schemes classify SCCmec by targeting and identifying the individual mec and ccr gene complex types, but require many primer sets and multiple individual PCR experiments. We designed and published a simple multiplex PCR assay for quick screening of major SCCmec types and subtypes I to V, and later updated it as new sequence information became available. This simple assay targets individual SCCmec types in a single reaction, is easy to interpret, and has been used extensively worldwide. However, because of the sophisticated nature of the assay and the large number of primers present in the reaction, there is the potential for difficulties when adapting it to individual laboratories. To facilitate the process of establishing an MRSA SCCmec assay, here we demonstrate how to set up our multiplex PCR assay and discuss some of the vital steps and procedural nuances that make it successful.

Modern 3D electron microscopy approaches have recently allowed unprecedented insight into the 3D ultrastructural organization of cells and tissues, enabling the visualization of large macromolecular machines, such as adhesion complexes, as well as higher-order structures, such as the cytoskeleton and cellular organelles in their respective cell and tissue context. Given the inherent complexity of cellular volumes, it is essential to first extract the features of interest in order to allow visualization, quantification, and therefore comprehension of their 3D organization. Each data set is defined by distinct characteristics, e.g., signal-to-noise ratio, crispness (sharpness) of the data, heterogeneity of its features, crowdedness of features, presence or absence of characteristic shapes that allow for easy identification, and the percentage of the entire volume that a specific region of interest occupies. All these characteristics need to be considered when deciding on which approach to take for segmentation.
The six different 3D ultrastructural data sets presented were obtained by three different imaging approaches: electron tomography of resin-embedded, stained samples, and focused ion beam and serial block face scanning electron microscopy (FIB-SEM and SBF-SEM) of mildly and heavily stained samples, respectively. For these data sets, four different segmentation approaches have been applied: (1) fully manual model building followed solely by visualization of the model, (2) manual tracing segmentation of the data followed by surface rendering, (3) semi-automated approaches followed by surface rendering, or (4) automated custom-designed segmentation algorithms followed by surface rendering and quantitative analysis. Depending on the combination of data set characteristics, typically one of these four categorical approaches outperforms the others, but depending on the exact sequence of criteria, more than one approach may be successful. Based on these data, we propose a triage scheme that categorizes both objective data set characteristics and subjective personal criteria for the analysis of the different data sets.
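The triage idea can be sketched as a simple decision function over data-set characteristics. The characteristic names and thresholds below are hypothetical stand-ins for the criteria a lab would calibrate for its own data, not the scheme proposed in the paper:

```python
def choose_segmentation_approach(snr, has_characteristic_shapes, roi_fraction):
    """Map data-set characteristics to one of the four categorical approaches.
    snr: signal-to-noise ratio; has_characteristic_shapes: whether features
    have easily identified shapes; roi_fraction: fraction of the volume
    occupied by the region of interest. Thresholds are illustrative only."""
    if snr < 5 and not has_characteristic_shapes:
        # Noisy data with ambiguous features: build the model fully by hand.
        return "manual model building"
    if snr < 5:
        return "manual tracing + surface rendering"
    if has_characteristic_shapes and roi_fraction < 0.05:
        # Clean data with distinctive, sparse features: automation pays off.
        return "automated custom algorithm + quantitative analysis"
    return "semi-automated + surface rendering"
```

The point of such a triage is not the exact thresholds but forcing an explicit decision before investing segmentation effort, since more than one approach may succeed for a given data set.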

In this paper we present an automated system, based mainly on computed tomography (CT) images, consisting of two main components: midline shift estimation and an intracranial pressure (ICP) pre-screening system. To estimate the midline shift, an estimation of the ideal midline is first performed based on the symmetry of the skull and anatomical features in the brain CT scan. Then, the ventricles are segmented from the CT scan and used as a guide for the identification of the actual midline through shape matching. These processes mimic the measuring process used by physicians and have shown promising results in evaluation. In the second component, additional features related to ICP are extracted, such as texture information and blood amount from the CT scans; other recorded features, such as age and injury severity score, are also incorporated to estimate the ICP. Machine learning techniques, including feature selection and classification with methods such as Support Vector Machines (SVMs), are employed to build the prediction model using RapidMiner. The evaluation of the prediction shows the potential usefulness of the model. The estimated midline shift and predicted ICP levels may be used as a fast pre-screening step to help physicians decide whether to recommend for or against invasive ICP monitoring.
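The first step of the midline-shift component, estimating the ideal midline from skull symmetry, can be sketched on a toy binary mask. The grid and mirror-symmetry score below are simplified stand-ins for the skull- and anatomical-feature matching the paper describes:

```python
def ideal_midline(mask):
    """Pick the column about which a binary skull mask is most
    mirror-symmetric. mask is a list of rows of 0/1 values."""
    width = len(mask[0])
    best_col, best_score = 0, -1.0
    for col in range(1, width - 1):
        span = min(col, width - 1 - col)  # how far we can reflect
        score = 0
        for row in mask:
            for d in range(1, span + 1):
                if row[col - d] == row[col + d]:
                    score += 1
        normalized = score / span  # wider spans compare more pixel pairs
        if normalized > best_score:
            best_score = normalized
            best_col = col
    return best_col

# Toy 4x7 "skull" mask, symmetric about column 3.
skull = [
    [0, 1, 1, 1, 1, 1, 0],
    [1, 1, 0, 0, 0, 1, 1],
    [1, 1, 0, 0, 0, 1, 1],
    [0, 1, 1, 1, 1, 1, 0],
]
midline_col = ideal_midline(skull)
```

The midline shift would then be the offset between this ideal column and the actual midline recovered from the segmented ventricles.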

Layers of Symbiosis - Visualizing the Termite Hindgut Microbial Community

Authors: Jared Leadbetter.

Institutions: California Institute of Technology - Caltech.

Jared Leadbetter takes us for a nature walk through the diversity of life resident in the termite hindgut - a microenvironment containing 250 different species found nowhere else on Earth. Jared reveals that the symbiosis exhibited by this system is multi-layered and involves not only a relationship between the termite and its gut inhabitants, but also involves a complex web of symbiosis among the gut microbes themselves.

Fear conditioning is a widely used paradigm in non-human animal research to investigate the neural mechanisms underlying fear and anxiety. A major challenge in conducting conditioning studies in humans is the ability to strongly manipulate or simulate the environmental contexts that are associated with conditioned emotional behaviors. In this regard, virtual reality (VR) technology is a promising tool. Yet, adapting this technology to meet experimental constraints requires special accommodations. Here we address the methodological issues involved when conducting fear conditioning in a fully immersive 6-sided VR environment and present fear conditioning data.
In the real world, traumatic events occur in complex environments made up of many cues engaging all of our sensory modalities. For example, the cues that form the environmental configuration include not only visual elements, but also aural, olfactory, and even tactile ones. In rodent studies of fear conditioning, animals are fully immersed in a context rich with novel visual, tactile and olfactory cues. However, standard laboratory tests of fear conditioning in humans are typically conducted in a nondescript room in front of a flat 2D computer screen and do not replicate the complexity of real-world experiences. On the other hand, a major limitation of clinical studies aimed at reducing (extinguishing) fear and preventing relapse in anxiety disorders is that treatment occurs after participants have acquired a fear in an uncontrolled and largely unknown context. Thus the experimenters are left without information about the duration of exposure, the true nature of the stimulus, and the associated background cues in the environment1. In the absence of this information it can be difficult to truly extinguish a fear that is both cue- and context-dependent. Virtual reality environments address these issues by providing the complexity of the real world while allowing experimenters to constrain fear conditioning and extinction parameters, yielding empirical data that can suggest better treatment options and/or test mechanistic hypotheses.
To test the hypothesis that fear conditioning may be richly encoded and context-specific when conducted in a fully immersive environment, we developed distinct 3D virtual reality contexts in which participants experienced fear conditioning to virtual snakes or spiders. Auditory cues co-occurred with the CS to further evoke orienting responses and a feeling of "presence" in subjects2. Skin conductance response served as the dependent measure of fear acquisition, memory retention, and extinction.
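Skin conductance responses are commonly scored as the peak conductance in a post-stimulus response window minus a pre-stimulus baseline; a minimal sketch of that scoring follows. The sampling rate, window bounds, and signal values are hypothetical, not the parameters used in this study:

```python
def scr_amplitude(signal, sample_rate, stim_index,
                  window_start=1.0, window_end=4.0):
    """Score a skin conductance response (in microsiemens) as peak minus
    baseline. Baseline is the mean of the second before stimulus onset;
    the response window opens window_start s after onset and closes at
    window_end s. Negative deflections are scored as zero."""
    baseline = signal[max(0, stim_index - sample_rate):stim_index]
    base = sum(baseline) / len(baseline)
    lo = stim_index + int(window_start * sample_rate)
    hi = stim_index + int(window_end * sample_rate)
    peak = max(signal[lo:hi])
    return max(0.0, peak - base)

# Hypothetical 4 Hz recording: flat 2.0 uS baseline, stimulus at sample 8,
# then a rise to 3.0 uS inside the response window.
trace = [2.0] * 12 + [2.5, 3.0, 2.8, 2.4, 2.2, 2.1]
amplitude = scr_amplitude(trace, sample_rate=4, stim_index=8)
```

Averaging such amplitudes across trials per context then yields the acquisition, retention, and extinction measures described above.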

JoVE Visualize is a tool created to match the last 5 years of PubMed publications to methods in JoVE's video library.

How does it work?

We use abstracts found on PubMed and match them to JoVE videos to create a list of 10 to 30 related methods videos.

Video X seems to be unrelated to Abstract Y...

In developing our video relationships, we compare around 5 million PubMed articles to our library of over 4,500 methods videos. In some cases the language used in a PubMed abstract makes matching that content to a JoVE video difficult. In other cases, there is simply no content in our video library relevant to the topic of a given abstract. In these cases, our algorithms do their best to display videos with relevant content, which can sometimes result in matched videos that are only loosely related.
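The actual matching algorithm is not described here, but the general idea of ranking videos by textual similarity to an abstract can be sketched with simple bag-of-words cosine similarity. The abstract snippet and video descriptions below are hypothetical:

```python
import math
from collections import Counter

def cosine_similarity(text_a, text_b):
    """Cosine similarity between word-count vectors of two texts."""
    a = Counter(text_a.lower().split())
    b = Counter(text_b.lower().split())
    dot = sum(a[w] * b[w] for w in a)
    norm = (math.sqrt(sum(v * v for v in a.values()))
            * math.sqrt(sum(v * v for v in b.values())))
    return dot / norm if norm else 0.0

def rank_videos(abstract, video_descriptions, top_n=3):
    """Return video descriptions ranked by similarity to the abstract."""
    return sorted(video_descriptions,
                  key=lambda d: cosine_similarity(abstract, d),
                  reverse=True)[:top_n]

abstract = "spider venom collection by electrical stimulation"
videos = [
    "termite hindgut microbial community",
    "venom collection from spiders",
    "adapted tango for balance in older adults",
]
ranked = rank_videos(abstract, videos, top_n=1)
```

Production systems typically weight rarer terms more heavily (e.g., TF-IDF) so that common words do not dominate the score, which is one reason raw matches on short or unusual abstracts can be only loosely related.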