The James Webb Space Telescope will enable a wealth of new scientific investigations in the near- and mid-infrared, with sensitivity and spatial-spectral resolution greatly surpassing its predecessors. In this paper, we focus upon Solar System science facilitated by JWST, discussing the most current information available concerning JWST instrument properties and observing techniques relevant to planetary science. We also present numerous example observing scenarios for a wide variety of Solar System targets to illustrate the potential of JWST science to the Solar System community. This paper updates and supersedes the Solar System white paper published by the JWST Project in 2010 (Lunine et al., 2010). It is based both on that paper and on a workshop held at the annual meeting of the Division for Planetary Sciences in Reno, NV in 2012.

The James Webb Space Telescope is the successor to the Hubble Space Telescope. STScI and the Office of Public Outreach are committed to bringing awareness of the technology, the excitement, and the future science potential of this great observatory to the public, to educators and students, and to the scientific community, prior to its 2018 launch. The challenges in ensuring the high profile of JWST (understanding the infrared, the vast distance to the telescope's final position, and the unfamiliar science territory) require us to lay the proper background. We currently engage the full range of the public and scientific communities using a variety of high-impact, memorable initiatives, in combination with modern technologies to extend reach, linking the science goals of Webb to the ongoing discoveries being made by Hubble. We have injected Webb-specific content into ongoing E/PO programs: for example, simulating scientifically inspired yet aesthetic JWST scenes that illustrate the differences between JWST and previous missions; partnering with high-impact science communicators such as MinutePhysics to produce timely, concise content; and distributing educational materials to vast networks of schools through products like the Star Witness News.

JWST will dramatically advance our knowledge and understanding of the first generations of galaxies at z>10, their role in the re-ionization of the Universe, and the evolutionary processes that gave rise to the complexity and diversity of galaxies at the current epoch. As demonstrated by HST legacy projects like CLASH and the Hubble Frontier Fields, gravitational amplification by massive galaxy clusters can significantly extend the depth of the required observations. However, for JWST, reducing any diffuse background light will be just as crucial. We here propose Spitzer/IRAC observations of six massive cluster lenses, specifically selected as candidates for observation with JWST. By (a) quantifying the amount of intra-cluster light and (b) enabling us to improve our current lens models, the data resulting from the requested observations will be instrumental for the final selection of cluster targets that maximize the scientific returns of deep JWST observations.

The era of exoplanet characterization is upon us. For a subset of exoplanets -- the transiting planets -- physical properties can be measured, including mass, radius, and atmosphere characteristics. Indeed, measuring the atmospheres of a further subset of transiting planets, the hot Jupiters, is now routine with the Spitzer Space Telescope. The James Webb Space Telescope (JWST) will continue Spitzer's legacy with its large mirror size and precise thermal stability. JWST is poised for the significant achievement of identifying habitable planets around bright M through G stars--rocky planets lacking extensive gas envelopes, with water vapor and signs of chemical disequilibrium in their atmospheres. Favorable transiting planet systems are, however, anticipated to be rare, and their atmosphere observations will require tens to hundreds of hours of JWST time per planet. We review what is known about the physical characteristics of transiting planets, summarize lessons learned from Spitzer high-contrast exoplanet m...

Fairy tales that have been illustrated with a single image apiece are themselves generally a commercial enterprise, whose content and design must be conceived in a broadly acceptable mode in order to sell. Second, the selling process assumes a profit motive. Third, it can be shown that the single-illustration mode results in projecting an individual illustrator's vision of a tale. Fourth, when large numbers of illustrations in single-illustration mode exist in commercially produced books, the aggregate range of their content comprises the range of culturally acceptable images for a given story. (A methodology for analyzing fairy tales with multiple images will be introduced separately at the end of this article.)

The James Webb Space Telescope (JWST) is a 6.5m, segmented, IR telescope that will explore the first light of the universe after the big bang. In 2014, a major risk reduction effort related to the Alignment, Integration, and Test (AI&T) of the segmented telescope was completed. The Pathfinder telescope includes two Primary Mirror Segment Assemblies (PMSAs) and the Secondary Mirror Assembly (SMA) mounted onto a flight-like composite telescope backplane. This pathfinder allowed the JWST team to assess the alignment process and to better understand the various error sources that need to be accommodated in the flight build. The successful completion of the Pathfinder Telescope provides a final integration roadmap for the flight operations that will start in August 2015.

Declarative concepts (i.e., key terms with short definitions of the abstract concepts denoted by those terms) are a common kind of information that students are expected to learn in many domains. A common pedagogical approach for supporting learning of declarative concepts involves presenting students with concrete examples that illustrate how the…

In support of the launch of the James Webb Space Telescope, various teams at STScI (the Space Telescope Science Institute) have collaborated on how to re-structure the view of an observing program within the Astronomer's Proposal Tool (APT) to accommodate the differences between HST and JWST. For HST APT programs, the structure is visit-dominant, and there is one generic form for entering observing information that spans all instruments with their required fields and options. This can result in sometimes showing irrelevant fields to the user for a given observing goal. Also, the generation of mosaicked observations in HST requires the user to manually calculate the position of each tile within the mosaic, accounting for positional offsets and the roll of the telescope, which is a time-consuming process. Now, for JWST programs in APT, the description of the observations has been segregated by instrument and mode into discrete observing templates. Each template's form allows instrument-specific choices and displays of relevant information. APT will automatically manage the number of visits needed to perform the observation. This is particularly useful for mosaics and dithering with JWST. For example, users will select how they would like a mosaic to be tiled at the observation level, and the visits are automatically created. In this way, visits have been re-structured to be purely informational; all editing is done at the observation level. These options and concepts are illustrated to future users via the corresponding poster.
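
The tile-position bookkeeping that APT now automates (grid offsets rotated by the telescope roll and projected onto the sky) can be sketched as follows. This is an illustrative small-field approximation, not APT's actual algorithm; the function name and parameters are hypothetical.

```python
import math

def mosaic_tiles(ra0, dec0, n_cols, n_rows, step_arcsec, roll_deg):
    """Tile centers for an n_cols x n_rows mosaic: offsets laid out on a
    grid, rotated by the telescope roll angle, and applied about the
    target (ra0, dec0) in degrees. Small-field tangent-plane
    approximation; RA offsets are scaled by cos(dec)."""
    th = math.radians(roll_deg)
    tiles = []
    for j in range(n_rows):
        for i in range(n_cols):
            # grid offsets centered on the target, in arcsec
            dx = (i - (n_cols - 1) / 2) * step_arcsec
            dy = (j - (n_rows - 1) / 2) * step_arcsec
            # rotate the offset grid by the roll angle
            du = dx * math.cos(th) - dy * math.sin(th)
            dv = dx * math.sin(th) + dy * math.cos(th)
            ra = ra0 + du / 3600 / math.cos(math.radians(dec0))
            dec = dec0 + dv / 3600
            tiles.append((ra, dec))
    return tiles

# e.g. a 2x2 mosaic with 60" steps and zero roll about (150.0, 0.0)
# yields four tile centers offset +/-30" from the target position.
```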

This article describes a theoretical model that links personal characteristics with resilience to economic hardship and its psychological and interpersonal consequences. This transactional model integrates social influence and social selection perspectives concerning the relation between socioeconomic circumstances and the development of individuals and families. In addition, this article discusses methodological and conceptual issues related to investigating the effects of personal characteristics in this context. Finally, initial empirical support for some of the key predictions from the proposed model are provided using longitudinal data collected from a sample of Midwestern families. Specifically, adolescent academic achievement, self-reports of Conscientiousness, and self-reports of low Neuroticism during adolescence predicted relevant outcomes in adulthood such as less economic pressure, more satisfying romantic relationships, and less harsh parenting behaviors. These preliminary findings support the hypothesized model and extend research concerning the life course outcomes associated with personal characteristics.

We study the capability of the James Webb Space Telescope (JWST) to detect Supermassive Dark Stars (SMDS). If the first stars are powered by dark matter heating in triaxial dark matter haloes, they may grow to be very large and very bright, visible in deep imaging with JWST and even the Hubble Space Telescope (HST). We use HST surveys to place bounds on the numbers of SMDSs that may be detected in future JWST imaging surveys. We show that SMDSs in the mass range $10^6 - 10^7 M_\odot$ are bright enough to be detected in all the wavelength bands of NIRCam on JWST. If SMDSs exist at z ~ 10, 12, and 14, they will be detectable as J-band, H-band, or K-band dropouts, respectively. With a total survey area of 150 arcmin^2 (assuming a multi-year deep parallel survey with JWST), we find that typically the number of $10^6 M_\odot$ SMDSs found as H- or K-band dropouts is ~$10^5 f_{\rm SMDS}$, where $f_{\rm SMDS}$, the fraction of early DM haloes hosting DSs, is likely to be small. Differentiating SMDSs from galaxies at z > 10 would be possible with spectroscopy: the SMDS (w...
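
The dropout assignments quoted above follow from simple arithmetic: the Lyman break at a rest-frame wavelength of about 0.1216 μm redshifts into successively redder bands as z increases. A toy sketch of that selection logic; the band spans below are approximate values chosen for illustration, not any specific filter set.

```python
LYMAN_ALPHA_UM = 0.1216  # rest-frame Lyman-alpha wavelength, microns

def break_wavelength(z):
    """Observed wavelength of the Lyman break for a source at redshift z."""
    return LYMAN_ALPHA_UM * (1.0 + z)

# Approximate (illustrative) wavelength spans of the J, H, and K bands:
BANDS = {"J": (1.02, 1.35), "H": (1.40, 1.80), "K": (1.75, 2.30)}

def dropout_band(z):
    """Band in which the redshifted Lyman break lands: flux blueward of
    the break is absorbed, so the source 'drops out' of that band."""
    lam = break_wavelength(z)
    for name, (blue, red) in BANDS.items():
        if blue <= lam <= red:
            return name
    return None

# At z = 10 the break sits near 1.34 um, inside the J band, so the
# source appears as a J-band dropout; higher z shifts it to H, then K.
```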

The next generation of the Space Telescope Science Institute data management system is gearing up to provide a suite of archive system services supporting the operation of the James Webb Space Telescope. We are now completing the initial stage of integration and testing for the preliminary ground system builds of the JWST Science Operations Center, which includes multiple components of the Data Management Subsystem (DMS). The vision for astronomical science and research with the JWST archive introduces both solutions to formal mission requirements and innovation derived from our existing mission systems, along with the collective shared experience of our global user community. We are building upon the success of the Hubble Space Telescope archive systems, standards developed by the International Virtual Observatory Alliance, and collaborations with our archive data center partners. Going forward, the “one archive” architectural model presented here is designed to balance the objectives for this new and exciting mission. The STScI JWST archive will deliver high quality calibrated science data products, support multi-mission data discovery and analysis, and provide an infrastructure which supports bridges to highly valued community tools and services.

The JWST Science Working Group recently published a comprehensive, top-level review of JWST science in the journal Space Science Reviews (Gardner et al. 2006, SSR, 123, 485). That review paper gives details of the four JWST science themes, and describes the design of the observatory and ground system. Since publication, the SWG, working with members of the astronomical community, has continued to develop the science case for JWST, giving more details in a series of white papers. The white paper topics include first light, galaxy surveys, AGN, supernovae, stellar populations, and exoplanets. The white papers are in various stages of completion. In this poster, I will review recent progress.

Forests are among the most important ecosystems for the provision of hydrological services. These include water supply and water damage mitigation, in the dimensions of quantity, timing, and quality. Although the hydrological role of forests is well documented in the literature, a conceptual framework…

The forthcoming James Webb Space Telescope (JWST) will revolutionize galaxy evolution studies from the epoch of reionisation to the present day. In particular, a new era will open for mid-IR astronomy, as the JWST Mid-Infrared Instrument (MIRI) will improve the sensitivity by an order of magnitude…

The 6.5-m aperture James Webb Space Telescope (JWST) will be a powerful tool for studying and advancing numerous areas of astrophysics. Its Fine Guidance Sensor, Near-Infrared Camera, Near-Infrared Spectrograph, and Mid-Infrared Instrument will be capable of making very sensitive, high angular resolution imaging and spectroscopic observations spanning 0.7–28 μm in wavelength. These capabilities are very well suited for probing the conditions of star formation in the distant and local Universe. Indeed, JWST has been designed to detect first light objects as well as to study the fine details of jets, disks, chemistry, envelopes, and the central cores of nearby protostars. We will be able to use its cameras, coronagraphs, and spectrographs (including multi-object and integral field capabilities) to study many aspects of star forming regions throughout the Galaxy, the Local Group, and more distant regions. I will describe the basic JWST scientific capabilities and illustrate a few ways they can be applied to star formation issues and conditions, with a focus on Galactic regions.

Scheduled to begin its 10-year mission after 2018, the James Webb Space Telescope (JWST) will search for the first luminous objects of the Universe to help answer fundamental questions about how the Universe came to look like it does today. At 6.5 meters in diameter, JWST will be the world's largest space telescope. This talk reviews science objectives for JWST and how they drive the JWST architecture, e.g. aperture, wavelength range, and operating temperature. Additionally, the talk provides an overview of the JWST primary mirror technology development and fabrication status.

In 2018, exoplanet science will enter a new era with the launch of the James Webb Space Telescope (JWST). With JWST's observing power, several studies have sought to characterize how the instruments will perform and what atmospheric spectral features could theoretically be detected using transmission spectroscopy. With just two years left until launch, it is imperative that the exoplanet community begins to digest and integrate these studies into their observing plans and strategies. In order to encourage this and to allow all members of the community access to JWST simulations, we present here an open-source tool for creating observation simulations of all observatory-supported time-series spectroscopy modes. We describe our tool, PandExo, and use it to calculate the expected signal-to-noise ratio (SNR) for every confirmed planetary system, and to estimate how many hours are needed to attain an SNR of 5 on key molecular absorption bands of H2O, CH4, and CO. We end by determining the number of planets (hot Jupiters, warm Neptunes, super-Earths, etc.) that are currently attainable with JWST.
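
The hour totals behind such estimates follow from photon statistics: stacking N transits improves SNR as the square root of N. A minimal sketch of that scaling, not PandExo's actual interface; the function names and the factor-of-two out-of-transit baseline are illustrative assumptions.

```python
import math

def transits_needed(snr_single, snr_target=5.0):
    """Photon-limited SNR grows as sqrt(N) over N stacked transits,
    so reaching snr_target requires (snr_target/snr_single)^2 transits."""
    return math.ceil((snr_target / snr_single) ** 2)

def hours_needed(snr_single, transit_duration_hr,
                 overhead_factor=2.0, snr_target=5.0):
    """Total clock time: each transit is observed for its duration plus
    an out-of-transit baseline (folded into overhead_factor)."""
    n = transits_needed(snr_single, snr_target)
    return n * transit_duration_hr * overhead_factor

# e.g. a single transit giving SNR 2 on an H2O band, with a 3 hr
# transit, needs 7 transits and roughly 42 hours of clock time.
```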

The James Webb Space Telescope (JWST), NASA’s next great observatory launching in October 2018, required the development of a dozen new technologies. How will we maintain the prestige and cultural impact of Hubble as the torch passes to Webb? Emerging technologies such as augmented and virtual reality bring the viewer into the data and the concept in previously unimaginable immersive detail. Adoption of mobile devices has expanded access to information for wide swaths of the public. From software like WorldWide Telescope to hardware like the Oculus Rift, new platforms are providing new avenues for learning. If we develop materials properly tailored to this medium, we can reach more diverse audiences than ever before. STScI is pioneering some tools related to JWST for showcasing at AAS and at local events, which I highlight here.

Modeling a telescope's point spread function accurately is key to predicting its performance and extracting information from observations. WebbPSF is a flexible Python-based PSF simulation tool for JWST and WFIRST, developed at STScI. The WebbPSF-WFIRST module implements a model for the proposed Wide Field Instrument, as well as a proof-of-concept model for the Coronagraph Instrument. Since its announcement and public release at the Winter 2016 AAS, WebbPSF-WFIRST has been enhanced with the Cycle 6 design updates to the wide field instrument model. Additionally, the JupyterHub-based WFIRST Tools Server effort at STScI has provided access to these tools for dozens of users without the overhead of installing the software locally. For JWST, the optical models have been updated based on the latest test data and metrology for the instruments and the telescope flight hardware, including as-built mirror surface figures, variation between different field points, and updated optical budgets for in flight performance. WebbPSF has been checked against instrument test data from previous campaigns, and analysis of the PSF images taken during the JWST CV3 cryo-vac test campaign is currently underway.
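
WebbPSF layers considerable realism (segment figures, field dependence, instrument optics) on top of a simple core idea: in the Fraunhofer limit, the monochromatic PSF is the squared modulus of the Fourier transform of the pupil. A toy numpy sketch of that core computation, using an idealized unobscured circular pupil rather than JWST's segmented aperture; it is not WebbPSF's actual code path.

```python
import numpy as np

def simple_psf(n=256, aperture_diam=0.25):
    """Monochromatic PSF of an idealized circular pupil: the PSF is
    |FFT(pupil)|^2 (Fraunhofer diffraction), normalized to unit flux."""
    y, x = np.mgrid[-0.5:0.5:n * 1j, -0.5:0.5:n * 1j]
    pupil = (np.hypot(x, y) <= aperture_diam / 2).astype(float)
    field = np.fft.fftshift(np.fft.fft2(pupil))  # focal-plane field
    psf = np.abs(field) ** 2
    return psf / psf.sum()

psf = simple_psf()
# The brightest pixel (the Airy core) lands at the array center after
# the fftshift; a segmented pupil would add diffraction structure.
```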

Dramatic progress in exoplanetary systems imaging has occurred since the first generation of space coronagraphs on HST (NICMOS, STIS, ACS). While HST remains at the forefront of both exoplanetary and circumstellar disk science, ground-based instruments have improved by three orders of magnitude over the past decade. JWST will extend the current state of the art with a larger set of superior coronagraphs and greater sensitivity across more than a factor of 10 in wavelength, making it extraordinarily capable for detailed imaging characterization of planets and disks. We will address specific questions about nearby exoplanetary systems, while also optimizing observing strategies across the breadth of JWST’s high-contrast imaging modes, as follows: (a) Deep, multi-wavelength observations of selected nearby stars hosting known debris disks & planets. We will use the NIRCam and MIRI coronagraphs across the full range of JWST wavelengths, and perhaps MIRI MRS spatially resolved spectroscopy. Each comprehensive dataset will support a variety of investigations addressing both disk characterization and exoplanet detection & characterization. (b) Characterization of Planetary Systems around Cool M Stars. We will observe young and dusty M dwarfs, to complement observations of the closer but older M dwarf samples under consideration by other GTO groups. JWST observations will dramatically exceed HST images in their ability to address questions about the properties of dust rings, while the more favorable contrast ratios of planets relative to M dwarf hosts will enable sensitivity to relatively low mass planetary companions.

NASA's James Webb Space Telescope (JWST) is a 6.5 m diameter, segmented, deployable telescope for cryogenic IR space astronomy (~40 K). The JWST Observatory architecture includes the Optical Telescope Element (OTE) and the Integrated Science Instrument Module (ISIM) element that contains four science instruments (SI), including a guider. OSIM is a full-field, cryogenic, optical simulator of the JWST OTE. It is the "Master Tool" for verifying the cryogenic alignment and optical performance of ISIM by providing simulated point source/star images to each of the four Science Instruments in ISIM. Included in OSIM is a Pupil Imaging Module (PIM), a large-format CCD used for measuring pupil alignment. Located at a virtual stop location within OSIM, the PIM records superimposed shadow images of pupil alignment reference (PAR) targets located in the OSIM and SI pupils. The OSIM Pupil Imaging Module was described by Brent Bos et al. at SPIE in 2011, prior to ISIM testing. We have recently completed the third and final ISIM cryogenic performance verification test before ISIM was integrated with the OTE. In this paper, we describe PIM implementation, performance, and measurement results.

In many ways, WFC3's IR channel is a good indicator for what to expect with JWST. There are some differences, most of which should be beneficial to JWST. JWST's lower operating temperature will freeze out charge traps that affect WFC3; benefits should include lower dark current, lower persistence, and better reciprocity. JWST's more recent HgCdTe process has lower defect density, with the same benefits as described above. JWST also uses better indium barriers, which should result in fewer RC-type pixels. One area where more study might be beneficial is stability. The detector electronics play a significant role in determining how stable a detector system is (vs. bias drifts and photometry), and JWST's SIDECARs are completely different from WFC3's Ball electronics. Studies comparing the bias and photometric stability of WFC3 and JWST might be useful for informing data acquisition and calibration strategies for JWST.

We explore how well spectra from the James Webb Space Telescope (JWST) will likely constrain bulk atmospheric properties of transiting exoplanets. We start by modeling the atmospheres of archetypal hot Jupiter, warm Neptune, warm sub-Neptune, and cool super-Earth planets with atmospheres that are clear, cloudy, or of high mean molecular weight (HMMW). Next we simulate the λ = 1–11 μm transmission and emission spectra of these systems for several JWST instrument modes for single-transit or single-eclipse events. We then perform retrievals to determine how well temperatures and molecular mixing ratios (CH$_4$, CO, CO$_2$, H$_2$O, NH$_3$) can be constrained. We find that λ = 1–2.5 μm transmission spectra will often constrain the major molecular constituents of clear solar-composition atmospheres well. Cloudy or HMMW atmospheres will often require full 1–11 μm spectra for good constraints, and emission data may be more useful in cases of sufficiently high $F_p$ and high $F_p/F_*$. Strong temperature inversions in the solar-composition hot-Jupiter atmosphere should be detectable with 1–2.5+ μm emission spectra, and 1–5+ μm emission spectra will constrain the temperature–pressure profiles of warm planets. Transmission spectra over 1–5+ μm will constrain [Fe/H] values to better than 0.5 dex for the clear atmospheres of the hot and warm planets studied. Carbon-to-oxygen ratios can be constrained to better than a factor of 2 in some systems. We expect that these results will provide useful predictions of the scientific value of single-event JWST spectra until its on-orbit performance is known.
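
The sensitivity of transmission spectra to mean molecular weight comes from the atmospheric scale height, H = kT/(μ m_H g): spectral features modulate the transit depth by roughly a few scale heights of annulus around the planet. A back-of-the-envelope sketch with illustrative (not retrieved) parameter values:

```python
# Constants (SI)
K_B = 1.380649e-23   # Boltzmann constant, J/K
M_H = 1.6735575e-27  # hydrogen atom mass, kg

def scale_height(T, mu, g):
    """Atmospheric scale height H = kT / (mu * m_H * g), in meters."""
    return K_B * T / (mu * M_H * g)

def transmission_signal(T, mu, g, r_p, r_s, n_h=2.0):
    """Approximate transit-depth modulation spanned by n_h scale
    heights of atmosphere: delta ~ 2 * n_h * H * Rp / Rs^2."""
    return 2.0 * n_h * scale_height(T, mu, g) * r_p / r_s ** 2

# Illustrative hot Jupiter: T = 1500 K, mu = 2.3, g = 10 m/s^2,
# Rp = 7e7 m, Rs = 7e8 m. H is ~540 km and the signal is a few
# parts in 10^4. Raising mu to ~18 (an HMMW steam atmosphere)
# shrinks the signal by roughly 8x, which is why HMMW cases need
# the full wavelength range for good constraints.
```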

This paper reports on the development, manufacture, and integration of the James Webb Space Telescope's sunshield and spacecraft. Both of these JWST elements have completed design and development testing. This paper will review the basic architecture and roles of these systems. Also to be presented is the current state of manufacture, assembly, integration, and test. This paper will conclude with a look at the road ahead for each subsystem prior to integration with the integrated telescope and instrument elements at Northrop Grumman's Space Park facility in late 2017.

I present a new modeling and retrieval code for atmospheres of directly imaged exoplanets designed for use on JWST observations, extending my previous work on transiting planets. I perform example retrievals of temperature-pressure profiles, common molecular abundances, and basic cloud properties on existing lower-resolution spectra and on simulated JWST data using forward model emission spectra for planned NIRISS and NIRCam targets. From these results, I estimate the expected return on prospective JWST observations in information-theoretic terms using the mutual information metric.

The James Webb Space Telescope (JWST) is a large, infrared-optimized space telescope consisting of an Optical Telescope Element (OTE), an Integrated Science Instrument Module (ISIM), a Spacecraft, and a Sunshield. The ISIM consists of the JWST science instruments (NIRCam, MIRI, NIRSpec), a fine guidance sensor (FGS), the ISIM structure, and thermal and electrical subsystems. JWST's instruments are designed to work primarily in the infrared range of the electromagnetic spectrum, and the instruments and telescope operate at cryogenic temperatures (approximately 35 K for the instruments).

Discusses three roles of textbook illustrations--to arrest the reader's attention and arouse interest, to provide explanation and clarification of complex verbal descriptions, and to aid retention of the information presented in the text. It is recommended that illustrations be designed with their specific role(s) in mind. (EAO)

The Aperture Masked Interferometry (AMI) mode on JWST-NIRISS is implemented as a 7-hole, 15% throughput, non-redundant mask (NRM) that operates with 5-8% bandwidth filters at 3.8, 4.3, and 4.8 microns. We present refined estimates of AMI's expected point-source contrast, using realizations of noise matched to JWST pointing requirements, NIRISS detector noise, and Rev-V JWST wavefront error models for the telescope and instrument. We describe our point-source binary data reduction algorithm, which we use as a standardized method to compare different observational strategies. For a 7.5 magnitude star we report a 10-σ detection at between 8.7 and 9.2 magnitudes of contrast at separations of 100 mas to 400 mas, respectively, using closure phases and squared visibilities in the absence of bad pixels, but with various other noise sources. With 3% of the pixels unusable, the expected contrast drops by about 0.5 magnitudes. AMI should be able to reach targets as bright as M=5. There will be significant overlap between Gemini-GPI and ESO-SPHERE targets and AMI's search space, and a complementarity with NIRCam's coronagraph. We also illustrate synthesis imaging with AMI, demonstrating an imaging dynamic range of 25 at 100 mas scales. We tailor existing radio interferometric methods to retrieve a faint bar across a bright nucleus, and explain the similarities to synthesis imaging at radio wavelengths. Modest contrast observations of dusty accretion flows around AGNs will be feasible for NIRISS AMI. We show our early results of image-plane deconvolution as well. Finally, we report progress on an NRM-inspired approach to mitigate mission-level risk associated with JWST's specialized wavefront sensing hardware. By combining narrow-band and medium-band Nyquist-sampled images taken with a science camera, we can sense JWST primary mirror segment tip-tilt to 10 mas, and piston to a few nm. We can sense inter-segment piston errors of up to 5 coherence lengths of the broadest bandpass filter used.
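
The closure phases used in the contrast estimates above are robust because per-hole piston errors cancel identically around any closing triangle of baselines: each measured baseline phase is corrupted by a term (e_i - e_j), and the sum around the triangle telescopes to zero. A minimal numerical demonstration with made-up phase values:

```python
import numpy as np

rng = np.random.default_rng(0)

# True interferometric phases on the three baselines of one closing
# triangle (holes 1-2, 2-3, 3-1), in radians; values are arbitrary.
true_ph = np.array([0.3, -0.8, 0.5])

# Per-hole piston errors corrupt each baseline phase as (e_i - e_j):
e = rng.normal(scale=1.0, size=3)
measured = true_ph + np.array([e[0] - e[1], e[1] - e[2], e[2] - e[0]])

# The closure phase (sum around the triangle) cancels the pistons,
# because the (e_i - e_j) terms telescope to zero:
closure_measured = measured.sum()
closure_true = true_ph.sum()
```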

... Accredited programs prepare students for a career in academic or research health science centers, industry, or consulting. As members of the health career profession with strong communication skills, medical illustrators work closely with clients to interpret ...

Windows Presentation Foundation is Microsoft's newest API for creating Windows applications. It gives the programmer the ability to produce dazzling, graphics-rich programs easily without having to delve into the messy details of the graphics subsystem. To use this power, however, the programmer must learn new concepts for laying out pages and displaying graphics. Illustrated WPF presents these concepts clearly and visually, making them easier to understand and retain. What you'll learn: the important new concepts underlying programming in WPF, including the visual tree, the logical tree, depen…

This paper describes two cryogenic thermal switches (CTSWs) under development for instruments on the James Webb Space Telescope (JWST). The first thermal switch was designed to extend the life of the solid H2 dewar for the 6 K Mid Infrared Instrument (MIRI), while the second thermal switch is needed for contamination and over-temperature control of three 35 K instruments on the Integrated Science Instrument Module (ISIM). In both cases, differential thermal expansion (DTE) between two materials having differing coefficients of thermal expansion (CTEs) is the process that underpins the thermal switching. The patented DTE-CTSW design utilizes two metallic end-pieces, one cup-shaped and the other disc-shaped (both MIRI end-pieces are Al, while ISIM uses an Al/Invar cup and an Al disc), joined by an axially centered Ultem rod, which creates a narrow, flat gap between the cup (rim) and disc. A heater is bonded to the rod center. Upon cooling one or both end-pieces, the rod contracts relative to the end-pieces and the gap closes, turning the CTSW ON. When the rod heater is turned on, the rod expands relative to the end-pieces and the gap opens, turning the CTSW OFF. During testing from 6 to 35 K, ON conductances of 0.3-12 W/K and OFF resistances greater than 2500 K/W were measured. Of particular importance at 6 K was the Al oxide layer, which was found to significantly decrease DTE-CTSW ON conductance when the mating surfaces were bare Al. When the mating surfaces were gold-plated, the adverse impact of the oxide layer was mitigated. This paper will describe both efforts from design through model correlation.
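
The switching action rests on the elementary contraction relation ΔL ≈ α L ΔT applied to two materials with different effective CTEs: the part with the larger effective CTE shrinks more on cooldown, closing the gap. A sketch with purely illustrative numbers, not the flight Ultem/Al values (whose CTEs vary strongly with temperature at cryogenic conditions):

```python
def contraction(alpha, length, dT):
    """Linear thermal contraction dL = alpha * L * dT, with alpha taken
    as an effective mean CTE (1/K) over the temperature span dT (K)."""
    return alpha * length * dT

# Illustrative example: a 100 mm link cooled by 260 K, comparing
# effective CTEs of 20e-6 /K and 40e-6 /K (hypothetical values).
low = contraction(20e-6, 0.100, 260)   # contraction of the low-CTE part
high = contraction(40e-6, 0.100, 260)  # contraction of the high-CTE part
gap_closure = high - low               # relative motion available to
                                       # close the switch gap
```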

Near-Earth Objects (NEOs) account for a surprisingly large fraction of the Spitzer observing time devoted to Solar System science. As a community, we should think of ways to repeat that success with JWST. JWST is planning an open Early Release Science Program, with the expected deadline for letters of intent in early 2017. We can't wait for next year's DPS to develop ideas. The time is now! In order to stir up the discussion, we will present ideas for NEO observing programs that are well adapted to JWST's capabilities and limitations, based on our recent PASP paper (Thomas et al., 2016). Obvious measurement objectives would include: size and albedo from the thermal continuum (MIRI photometry); thermal inertia for objects with well-known shape and spin state (MIRI); and taxonomy through reflection spectroscopy and emission spectroscopy in the NIR and MIR, plus NIR colors for faint objects. In all cases, JWST's sensitivity will allow us to go deeper than currently possible by at least an order of magnitude. Meter-sized NEOs similar to 2009 BD or 2011 MD are easy targets for MIRI spectrophotometry! The following limitations must be kept in mind, however: JWST's large size makes it slow to move. Most problematic for NEOs is probably the resulting 'speed limit': non-sidereal tracking is supported up to a rate of 30 mas/s, and NEOs can easily move faster than that (ways to relax this constraint are under discussion). The average slew to a new target is budgeted to take 30 min, effectively ruling out many-target programs like ExploreNEOs or NEOSurvey (see D. Trilling's paper). Additionally, JWST will only observe close to quadrature, translating to large solar phase angles for NEO observations; this is familiar from other space-based IR facilities.
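
The tracking 'speed limit' bites because apparent NEO rates scale inversely with distance. A quick small-angle estimate under assumed, purely hypothetical encounter parameters:

```python
import math

AU_KM = 1.495978707e8                      # astronomical unit in km
RAD_TO_MAS = 180 / math.pi * 3600 * 1000   # radians -> milliarcseconds

def apparent_rate_mas_per_s(v_transverse_km_s, distance_au):
    """Apparent angular rate of a target moving with the given
    transverse velocity at the given distance (small-angle limit)."""
    return v_transverse_km_s / (distance_au * AU_KM) * RAD_TO_MAS

# A hypothetical NEO 0.05 au away moving 10 km/s across the line of
# sight has an apparent rate near 275 mas/s, roughly 9x the 30 mas/s
# non-sidereal tracking limit; the same object at 0.5 au would be
# comfortably trackable.
```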

The unprecedented sensitivity and angular resolution of the James Webb Space Telescope (JWST) will make it NASA’s premier space-based facility for infrared astronomy. This 6.5-meter telescope, which is optimized for observations in the near and mid infrared, will be equipped with four state-of-the-art instruments that include imaging, spectroscopy, and coronagraphy. These instruments, along with the telescope’s moving target capabilities, will enable the infrared study of solar system objects with unprecedented detail. A new white paper (Norwood et al., 2014) provides a general overview of JWST observatory and instrument capabilities for Solar System science, and updates and expands upon an earlier study by Lunine et al. (2010). In order to fully realize the potential of JWST for Solar System observations, we have recently organized 10 focus groups to explore various science use cases in more detail on topics including: Asteroids, Comets, Giant Planets, Mars, Near Earth Objects, Occultations, Rings, Satellites, Titan, and Trans-Neptunian Objects. The findings from these groups will help guide the project as it develops and implements planning tools, observing templates, the data pipeline and archives so that they enable a broad range of Solar System Science investigations. The purpose of this presentation is to raise awareness of the JWST Solar System planning, and to invite participation of DPS members with our Focus Groups and other pre-launch activities.

References: Lunine, J., Hammel, H., Schaller, E., Sonneborn, G., Orton, G., Rieke, G., and Rieke, M. 2010, JWST Planetary Observations within the Solar System, http://www.stsci.edu/jwst/doc-archive/white-papers. Norwood, J., Hammel, H., Milam, S., Stansberry, J., Lunine, J., Chanover, N., Hines, D., Sonneborn, G., Tiscareno, M., Brown, M., and Ferruit, P. 2014, ArXiv e-prints, 1403.6845.

The James Webb Space Telescope (JWST) is the scientific successor to the Hubble Space Telescope. It is a cryogenic infrared space observatory with a 25 m2 aperture (6 m class) telescope that will achieve diffraction-limited angular resolution at a wavelength of 2 um. The science instrument payload includes four passively cooled near-infrared instruments providing broad- and narrow-band imagery, coronagraphy, as well as multi-object and integral-field spectroscopy over the 0.6-5 um range. Construction, integration and verification testing is underway in all areas of the program. The JWST is on schedule for launch during 2018.

Building awareness of a NASA mission prior to launch and connecting that mission to the education community can be challenging. In order to address this challenge, the Space Telescope Science Institute's Office of Public Outreach has developed the James Webb STEM Innovation Project (SIP) - an interdisciplinary project that focuses on the engineering aspects and potential scientific discoveries of JWST, while incorporating elements of project-based learning. Students in participating schools will use skills from multiple subject areas to research an aspect of the JWST's design or potential science and create models, illustrated essays, or technology-based projects to demonstrate their learning. Student projects will be showcased during special events at select venues in the project states - thus allowing parents and community members to also be beneficiaries of the project. Currently, the SIP is being piloted in New York, California, and Maryland. In addition, we will be implementing the SIP in partnership with NASA Explorer Schools in the states of New Mexico, Michigan, Texas, Tennessee, and Iowa.

We present observations at 3.6 and 4.5 microns using IRAC on the Spitzer Space Telescope of a set of main sequence A stars and white dwarfs that are potential calibrators across the JWST instrument suite. The stars range from brightnesses of 4.4 to 15 mag in K band. The calibration observations use a similar redundancy to the observing strategy for the IRAC primary calibrators (Reach et al. 2005) and the photometry is obtained using identical methods and instrumental photometric corrections as those applied to the IRAC primary calibrators (Carey et al. 2009). The resulting photometry is then compared to the predictions based on spectra from the CALSPEC Calibration Database (http://www.stsci.edu/hst/observatory/crds/calspec.html) and the IRAC bandpasses. These observations are part of an ongoing collaboration between IPAC and STScI investigating absolute calibration in the infrared.

The assembly of galaxies involves the life cycle of mass, metal enrichment, and dust that JWST will probe. Detailed studies of nearby galaxies provide guidance for interpreting the more distant forming galaxies. JWST/MIRI will enable stellar population studies akin to work done with HST on the Local Group galaxies but over a new wavelength range. MIRI's imaging capability over nine photometric bands from 5 to 28 microns is particularly suited to surveying stars with an infrared excess and to disentangling the extinction or thermal emission from various species of dust. These dusty stellar populations include young stellar objects, evolved stars, and supernovae that are bright in the infrared. Using the rich Spitzer-IRS spectroscopic dataset and spectral classifications from the Surveying the Agents of Galaxy Evolution (SAGE)-Spectroscopic survey of over a thousand objects in the Magellanic Clouds, we calculate the expected flux densities and colors in the MIRI broadband filters for these prominent infrared sources. We use these fluxes to illustrate what JWST will see in stellar population studies of other Local Group galaxies. JWST/MIRI observations of infrared sources in Local Group galaxies will constrain the life cycle of galaxies through their dust emission. For example, how much of the interstellar dust is supplied by dying stars? Does the number of young stellar objects agree with star formation diagnostics for the galaxy? We discuss the locations of the post- and pre-main-sequence populations in MIRI color-color and color-magnitude space and examine which filters are best for identifying populations of sources. We connect these results to galaxies with existing HST data, for instance Andromeda and M33.

In this paper we investigate the opportunities provided by the James Webb Space Telescope (JWST) for significant scientific advances in the study of solar system bodies and rings using stellar occultations. The strengths and weaknesses of the stellar occultation technique are evaluated in light of JWST's unique capabilities. We identify several possible JWST occultation events by minor bodies and rings, and evaluate their potential scientific value. These predictions depend critically on accurate a priori knowledge of the orbit of JWST near the Sun-Earth Lagrange-point 2 (L2). We also explore the possibility of serendipitous stellar occultations by very small minor bodies as a by-product of other JWST observing programs. Finally, to optimize the potential scientific return of stellar occultation observations, we identify several characteristics of JWST's orbit and instrumentation that should be taken into account during JWST's development.

This artist's concept is a cutaway illustration of the Skylab with the Command/Service Module being docked to the Multiple Docking Adapter. In an early effort to extend the use of Apollo for further applications, NASA established the Apollo Applications Program (AAP) in August of 1965. The AAP was to include long duration Earth orbital missions during which astronauts would carry out scientific, technological, and engineering experiments in space by utilizing modified Saturn launch vehicles and the Apollo spacecraft. Established in 1970, the Skylab Program was an outgrowth of the AAP. The goals of the Skylab were to enrich our scientific knowledge of the Earth, the Sun, the stars, and cosmic space; to study the effects of weightlessness on living organisms, including man; to study the effects of the processing and manufacturing of materials utilizing the absence of gravity; and to conduct Earth resource observations. The Skylab also conducted 19 selected experiments submitted by high school students. Skylab's 3 different 3-man crews spent up to 84 days in Earth orbit. The Marshall Space Flight Center (MSFC) had responsibility for developing and integrating most of the major components of the Skylab: the Orbital Workshop (OWS), Airlock Module (AM), Multiple Docking Adapter (MDA), Apollo Telescope Mount (ATM), Payload Shroud (PS), and most of the experiments. MSFC was also responsible for providing the Saturn IB launch vehicles for three Apollo spacecraft and crews and a Saturn V launch vehicle for the Skylab.

The Near-Infrared Spectrograph (NIRSpec) is the work-horse spectrograph at 1-5 microns for the James Webb Space Telescope (JWST). A showcase observing mode of NIRSpec is multi-object spectroscopy with the Micro-Shutter Arrays (MSAs), which consist of a quarter million tiny configurable shutters, each 0.20 x 0.46 arcseconds in size. The NIRSpec MSA shutters can be opened in adjacent rows to create flexible and positionable spectroscopy slits on prime science targets of interest. Because of the very small shutter width, the NIRSpec MSA spectral data quality will benefit significantly from accurate astrometric knowledge of the positions of planned science sources. Images acquired with the Hubble Space Telescope (HST) provide the optimal relative astrometric accuracy for planning NIRSpec observations, 5-10 milli-arcseconds (mas). However, some science fields of interest might have no HST images, galactic fields can have moderate proper motions at the 5 mas level or greater, and extragalactic images with HST may have inadequate source information at NIRSpec wavelengths beyond 2 microns. Thus, optimal NIRSpec spectroscopy planning may require pre-imaging observations with the Near-Infrared Camera (NIRCam) on JWST to accurately establish source positions for alignment with the NIRSpec MSAs. We describe operational philosophies and programmatic considerations for acquiring JWST NIRCam pre-image observations for NIRSpec MSA spectroscopic planning within the same JWST observing Cycle.

The James Webb Space Telescope (JWST), scheduled for launch in 2018, is the successor to the Hubble Space Telescope (HST) but with a significantly larger aperture (6.5 m) and advanced instrumentation focusing on infrared science (0.6-28.0 μm). In this paper we examine the potential for scientific investigation of Titan using JWST, primarily with three of the four instruments: NIRSpec, NIRCam and MIRI, noting that science with NIRISS will be complementary. Five core scientific themes are identified: (i) surface, (ii) tropospheric clouds, (iii) tropospheric gases, (iv) stratospheric composition and (v) stratospheric hazes. We discuss each theme in depth, including the scientific purpose, capabilities and limitations of the instrument suite, and suggested observing schemes. We pay particular attention to saturation, which is a problem for all three instruments, but may be alleviated for NIRCam through the use of small detector sub-arrays - sufficient to encompass Titan, but with significantly faster readout...

The data processing and archive systems for JWST will contain a petabyte of science data, and users will have fast access to the latest calibrations through a variety of new services. With a synergistic approach currently underway in STScI science operations between the Hubble Space Telescope and James Webb Space Telescope data management subsystems (DMS), operational verification is right around the corner. Next year the HST archive will provide scientists on-demand, fully calibrated data products via the Mikulski Archive for Space Telescopes (MAST), which takes advantage of an upgraded DMS. This enhanced system, developed jointly with the JWST DMS, is based on a new Condor distributed processing system capable of reprocessing data using a prioritization queue that runs in the background. A Calibration Reference Data System manages the latest optimal configuration for each scientific instrument pipeline. Science users will be able to search and discover the growing MAST archive of calibrated datasets from these missions, along with the other multiple-mission holdings both local to MAST and available through the Virtual Observatory. JWST data systems will build upon the successes and lessons learned from the HST legacy and move us forward into the next generation of multi-wavelength archive research.

NASA’s James Webb Space Telescope (JWST), planned for operation in about five years, will have the capability to investigate – and answer – some of the most challenging questions in astronomy. Although motivated and designed to study the very early Universe, the performance of the observatory’s instruments over a very wide wavelength range will allow the world’s scientific community unequaled ability to study cosmic phenomena as diverse as small bodies in the Solar System and the formation of galaxies. As part of preparation to use JWST, a conference was held in Tucson, Arizona in 2007 that brought together astronomers from around the world to discuss the mission, other major facilities that will operate in the coming decade, and major scientific goals for them. This book is a compilation of those presentations by some of the leading researchers from all branches of astronomy. This book also includes a "pre-history" of JWST, describing the lengthy process and some of the key individuals that initiat...

The coronagraphs on the James Webb Space Telescope (JWST) will enable high-contrast observations of faint objects at small separations from bright hosts, such as circumstellar disks, exoplanets, and quasar disks. Despite attenuation by the coronagraphic mask, bright speckles in the host’s point spread function (PSF) remain, effectively washing out the signal from the faint companion. Suppression of these bright speckles is typically accomplished by repeating the observation with a star that lacks a faint companion, creating a reference PSF that can be subtracted from the science image to reveal any faint objects. Before this reference PSF can be subtracted, however, the science and reference images must be aligned precisely, typically to 1/20 of a pixel. Here, we present several such algorithms for performing image registration on JWST coronagraphic images. Using both simulated and pre-flight test data (taken in cryovacuum), we assess (1) the accuracy of each algorithm at recovering misaligned scenes and (2) the impact of image registration on achievable contrast. Proper image registration, combined with post-processing techniques such as KLIP or LOCI, will greatly improve the performance of the JWST coronagraphs.
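As a sketch of the registration step described above, the following sub-pixel alignment estimator uses FFT cross-correlation with a parabolic refinement around the correlation peak. It is a minimal illustration under simple assumptions (noiseless, circularly wrapped scene), not one of the paper's actual algorithms:

```python
import numpy as np

def register_shift(ref, img):
    """Estimate the (dy, dx) shift of `img` relative to `ref` via FFT
    cross-correlation, refined to sub-pixel precision with a parabolic
    fit around the correlation peak."""
    corr = np.fft.ifft2(np.conj(np.fft.fft2(ref)) * np.fft.fft2(img)).real
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    shift = []
    for axis, p in enumerate(peak):
        n = corr.shape[axis]
        # neighbors of the peak along this axis (with circular wrapping)
        at = lambda q: tuple(q % n if a == axis else peak[a] for a in range(2))
        c0, c1, c2 = corr[at(p - 1)], corr[peak], corr[at(p + 1)]
        denom = c0 - 2.0 * c1 + c2
        frac = 0.5 * (c0 - c2) / denom if denom != 0 else 0.0
        s = p + frac
        if s > n / 2:  # unwrap: large positive lags are negative shifts
            s -= n
        shift.append(s)
    return tuple(shift)

# Shift a smooth test scene by a whole-pixel offset and recover it
y, x = np.mgrid[0:32, 0:32]
scene = np.exp(-((y - 16) ** 2 + (x - 16) ** 2) / 20.0)
dy, dx = register_shift(scene, np.roll(scene, (3, -2), axis=(0, 1)))
print(f"recovered shift: ({dy:.2f}, {dx:.2f})")
```

Real coronagraphic data would require masking the occulted region and handling noise, but the same correlate-then-refine structure underlies many of the registration algorithms the abstract compares.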

The James Webb Space Telescope (JWST) will be a nearly ideal machine for acquiring the transmission and emission spectra of transiting exoplanets over its large wavelength range of 0.7 - 28 microns. The NIRSpec, NIRCam, nTFI, and MIRI instruments will have spectroscopic capabilities that span spectral resolutions from 20 - 3000 and can cover up to 2 - 3 octaves in wavelength simultaneously. This will allow observing multiple molecular features at once, facilitating the separation of atmospheric temperature and abundance effects on spectra. Many transiting planets will also be observable with both transmission and eclipse spectroscopy, providing further insights and constraints on planetary thermal structures and energy transport. Simulated JWST spectra of planets ranging from mini-Neptunes to gas giants will be presented. These simulations include current best estimates of actual instrument throughput, resolution, spectral range, systematic noise, and random noise terms. They show that JWST will be able to determine the atmospheric parameters of a wide variety of planets, often when observing only one or a few transit or eclipse event sequences. The thermal emission of rocky super-Earths will also be quickly detectable via mid-IR eclipse observations if such planets are found around nearby M star hosts beforehand.

There are now well over a thousand confirmed exoplanets, ranging from hot to cold and large to small worlds. JWST spectra will provide much more detailed information on the molecular constituents, chemical compositions, and thermal properties of the atmospheres of transiting planets than is now known. We explore this by modeling clear, cloudy, and high mean molecular weight atmospheres of typical hot Jupiter, warm Neptune, warm sub-Neptune, and cool super-Earth planets and then simulating their JWST transmission and emission spectra. These simulations were performed for several JWST instrument modes over 1 - 11 microns and incorporate realistic signal and noise components. We then performed state-of-the-art retrievals to determine how well temperatures and abundances (CO, CO2, H2O, NH3) will be constrained, and over what pressures, for these different planet types. Using these results, we appraise which instrument modes will be most useful for determining which properties of the different planets, and we assess how well we can constrain their compositions, C/O ratios, and temperature profiles.

The James Webb Space Telescope (JWST) is a large aperture (6.5 meter), cryogenic space telescope with a suite of near and mid-infrared instruments. JWST's primary science goal is to detect and characterize the first galaxies. It will also study the assembly of galaxies, star formation, protoplanetary systems, and the formation and evolution of planetary systems. We will review the motivations for JWST's science goals in the context of recent Hubble Space Telescope and Spitzer Space Telescope observations and review the status of the JWST Observatory.

The determination of galaxy redshifts in the James Webb Space Telescope's (JWST) blank-field surveys will mostly rely on photometric estimates, based on the data provided by JWST's Near-Infrared Camera (NIRCam) at 0.6–5.0 μm and Mid Infrared Instrument (MIRI) at λ > 5.0 μm. In this work we analyze ...
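Photometric redshift estimation of the kind discussed above amounts to template fitting. The toy example below slides a single synthetic break-SED template across a redshift grid and picks the chi-square minimum; the band wavelengths, break position, noise level, and template shape are all invented for illustration:

```python
import numpy as np

# Toy photometric-redshift estimate: fit a template SED with a single
# spectral break to broadband fluxes over a redshift grid.
bands_um = np.array([0.9, 1.5, 2.0, 2.8, 3.6, 4.4])  # hypothetical band centers
break_rest_um = 0.4                                   # rest-frame break wavelength

def template_flux(lam_um, z):
    """Flat SED rolled off blueward of the redshifted break (soft step)."""
    return 1.0 / (1.0 + np.exp((break_rest_um * (1 + z) - lam_um) / 0.1))

z_true = 3.0
rng = np.random.default_rng(1)
obs = template_flux(bands_um, z_true) + rng.normal(0, 0.02, bands_um.size)
err = 0.02

zgrid = np.linspace(0.0, 8.0, 801)
chi2 = []
for z in zgrid:
    model = template_flux(bands_um, z)
    amp = np.sum(obs * model) / np.sum(model ** 2)  # analytic best-fit scaling
    chi2.append(np.sum(((obs - amp * model) / err) ** 2))
z_phot = zgrid[int(np.argmin(chi2))]
print(f"photo-z = {z_phot:.2f} (true z = {z_true})")
```

Production photo-z codes fit libraries of templates with dust and emission-line variations, but the chi-square-over-a-grid structure is the same.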

Bitten by the gravitational-wave bug? While we await Thursday's press conference, here's some food for thought: if LIGO were able to detect gravitational waves from compact-object mergers, how could we follow up on the detections? A new study investigates whether the upcoming James Webb Space Telescope (JWST) will be able to observe electromagnetic signatures of some compact-object mergers.

Hunting for Mergers

Studying compact-object mergers (mergers of black holes and neutron stars) can help us understand a wealth of subjects, like high-energy physics, how matter behaves at nuclear densities, how stars evolve, and how heavy elements in the universe were created.

The Laser Interferometer Gravitational-Wave Observatory (LIGO) is searching for the signature ripples in spacetime identifying these mergers, but gravitational waves are squirrelly: LIGO will only be able to localize wave sources to tens of square degrees. If we want to find out more about any mergers LIGO discovers in gravitational waves, we'll need a follow-up search for electromagnetic counterparts with other observatories.

The Kilonova Key

One possible electromagnetic counterpart is kilonovae, explosions that can be produced during a merger of a binary neutron star or a neutron star-black hole system. If the neutron star is disrupted during the merger, some of the hot mass is flung outward and shines brightly by radioactive decay.

Kilonovae are especially promising as electromagnetic counterparts to gravitational waves for three reasons: they emit isotropically, so the number of observable mergers isn't limited by relativistic beaming; they shine for a week, giving follow-up observatories time to search for them; and the source location can be easily recovered.

The only problem? We don't currently have any sensitive survey instruments in the near-infrared band (where kilonova emission peaks) that can provide coverage over tens of square degrees. Luckily, we will soon have just the thing: JWST, launching in 2018! JWST's ...

The James Webb Space Telescope's segmented primary and deployable secondary mirrors will be actively controlled to achieve optical alignment through a complex series of steps that will extend across several months during the observatory's commissioning. This process will require an intricate interplay between individual wavefront sensing and control tasks, instrument-level checkout and commissioning, and observatory-level calibrations, which involves many subsystems across both the observatory and the ground system. Furthermore, commissioning will often exercise observatory capabilities under atypical circumstances, such as fine guiding with unstacked or defocused images, or planning targeted observations in the presence of substantial time-variable offsets to the telescope line of sight. Coordination for this process across the JWST partnership has been conducted through the Wavefront Sensing and Control Operations Working Group. We describe at a high level the activities of this group and the resulting detailed commissioning operations plans, supporting software tools development, and ongoing preparations activities at the Science and Operations Center. For each major step in JWST's wavefront sensing and control, we also explain the changes and additions that were needed to turn an initial operations concept into a flight-ready plan with proven tools. These efforts are leading to a robust and well-tested process and preparing the team for an efficient and successful commissioning of JWST's active telescope.

The high sensitivity and broad wavelength coverage of the James Webb Space Telescope will transform the field of exoplanet transit spectroscopy. Transit spectra are inferred from minute, wavelength-dependent variations in the depth of a transit or eclipse as the planet passes in front of or is obscured by its star, and the spectra contain information about the composition, structure and cloudiness of exoplanet atmospheres. Atmospheric retrieval is the preferred technique for extracting information from these spectra, but the process can be confused by astrophysical and instrumental systematic noise. We present results of retrieval tests based on synthetic, noisy JWST spectra, for clear and cloudy planets and active and inactive stars. We find that the ability to correct for stellar activity is likely to be a limiting factor for cloudy planets, as the effects of unocculted star spots may mimic the presence of a scattering slope due to clouds. We discuss the pros and cons of the available JWST instrument combinations for transit spectroscopy, and consider the effect of clouds and aerosols on the spectra. Aerosols high in a planet's atmosphere obscure molecular absorption features in transmission, reducing the information content of spectra in wavelength regions where the cloud is optically thick. We discuss the usefulness of particular wavelength regions for identifying the presence of cloud, and suggest strategies for solving the highly degenerate retrieval problem for these objects.
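The size of the wavelength-dependent transit-depth variations described above can be estimated from the atmospheric scale height. The sketch below uses the standard 2nH·Rp/Rs² approximation with illustrative hot-Jupiter parameters (temperature, mean molecular weight, gravity, and radii are canonical values, not from the paper):

```python
import math

K_B = 1.380649e-23   # Boltzmann constant, J/K
AMU = 1.66054e-27    # atomic mass unit, kg

def feature_amplitude_ppm(T, mu_amu, g, r_planet, r_star, n_scale_heights=5):
    """Approximate transmission-spectrum feature size: a molecular band
    probes ~n scale heights, changing the transit depth by roughly
    2 * n * H * Rp / Rs^2. Back-of-the-envelope only."""
    H = K_B * T / (mu_amu * AMU * g)  # atmospheric scale height, m
    return 2 * n_scale_heights * H * r_planet / r_star ** 2 * 1e6

# Canonical hot Jupiter around a Sun-like star (illustrative parameters)
R_JUP, R_SUN = 7.149e7, 6.957e8
amp = feature_amplitude_ppm(1500, 2.3, 10.0, R_JUP, R_SUN)
print(f"expected feature amplitude: {amp:.0f} ppm")
```

Amplitudes of several hundred ppm for H2-dominated hot Jupiters, shrinking steeply for cooler, heavier, high mean molecular weight atmospheres, are exactly why cloudy super-Earths are the hard case the abstract discusses.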

In this document, we summarize the main capabilities of the James Webb Space Telescope (JWST) for performing observations of Mars. The distinctive vantage point of JWST at the Sun-Earth Lagrange point (L2) will allow sampling of the full observable disk, permitting the study of short-term phenomena, diurnal processes (across the East-West axis) and latitudinal processes between the hemispheres (including seasonal effects) with excellent spatial resolution (0.07 arcsec at 2 μm). Spectroscopic observations will be achievable in the 0.7-5 μm spectral region with NIRSpec at a maximum resolving power of 2700, and of 8000 in the 1-1.25 μm range. Imaging will be attainable with NIRCam at 4.3 μm and with two narrow filters near 2 μm, while the nightside will be accessible with several filters in the 0.5 to 2 μm range. Such a powerful suite of instruments will be a major asset for the exploration and characterization of Mars. Some science cases include the mapping of the water D/H ratio, investigatio...

The James Webb Space Telescope is the successor to the Hubble Space Telescope. STScI and the Office of Public Outreach are committed to bringing awareness of the technology, the excitement, and the future science potential of this great observatory to the public and to the scientific community, prior to its 2018 launch. The challenges in ensuring the high profile of JWST (understanding the infrared, the vast distance to the telescope's final position, and the unfamiliar science territory) require us to lay the proper background, particularly in the area of spectroscopy. We currently engage the full range of the public and scientific communities using a variety of high impact, memorable initiatives, in combination with modern technologies to extend reach, linking the science goals of Webb to the ongoing discoveries being made by Hubble. Webbtelescope.org, the public hub for scientific information related to JWST, is now open. We have injected Webb-specific content into ongoing outreach programs: for example, partnering with high impact science communicators such as MinutePhysics to produce timely and concise content; partnering with musicians and artists to link science and art. Augmented reality apps showcase NASA’s telescopes in a format usable by anyone with a smartphone, as do visuals from increasingly affordable 3D VR technologies.

Relative to ground-based telescopes, the James Webb Space Telescope (JWST) will have a substantial sensitivity advantage in the 2.2-5 μm wavelength range where brown dwarfs and hot Jupiters are thought to have significant brightness enhancements. To facilitate high contrast imaging within this band, the Near-Infrared Camera (NIRCAM) will employ a Lyot coronagraph with an array of band-limited image-plane occulting spots. In this paper, we provide the science motivation for high contrast imaging with NIRCAM, comparing its expected performance to that of the Keck, Gemini and 30 m (TMT) telescopes equipped with Adaptive Optics systems of different capabilities. We then describe our design for the NIRCAM coronagraph that enables imaging over the entire sensitivity range of the instrument while providing significant operational flexibility. We describe the various design tradeoffs that were made in consideration of alignment and aberration sensitivities and present contrast performance in the presence of JWST's expected optical aberrations. Finally, we show an example of a configuration that can provide 10^-5 companion sensitivity at sub-arcsecond separations.

The rings that adorn the four giant planets are of prime importance as accessible natural laboratories for disk processes, as clues to the origin and evolution of planetary systems, and as shapers as well as detectors of their planetary environments. The retinue of small moons accompanying all known ring systems is intimately connected as both sources and products, as well as shepherds and perturbers, of the rings. Leading sources of data on ring systems include spacecraft such as Cassini and Voyager, but also space telescopes such as Hubble and Spitzer as well as ground-based telescopes. The James Webb Space Telescope (JWST) is being prepared for launch in 2018 to begin a planned five-year mission. JWST will have the capability to observe solar system objects as close as Mars. Although most of the hardware is already designed and under construction if not completed, work continues on the development of operations guidelines and software and the completion of calibration tasks. The purpose of this white pape...

If necessity truly is the mother of invention, then advances in lightweight space mirror technology have been driven by launch vehicle mass and volume constraints. In the late 1970s, at the start of Hubble development, the state of the art in ground-based telescopes was 3 to 4 meter monolithic primary mirrors with masses of 6,000 to 10,000 kg - clearly too massive for the planned space shuttle 25,000 kg capability to LEO. Necessity led Hubble to a different solution. Launch vehicle mass constraints (and cost) resulted in the development of a 2.4 meter lightweight eggcrate mirror. At 810 kg (180 kg/m2), this mirror was approximately 7.4% of HST's total 11,110 kg mass. And the total observatory structure, at 4.3 m x 13.2 m, fit snugly inside the space shuttle's 4.6 m x 18.3 m payload bay. In the early 1990s, at the start of JWST development, the state of the art in ground-based telescopes was 8 meter class monolithic primary mirrors (16,000 to 23,000 kg) and 10 meter segmented mirrors (14,400 kg). Unfortunately, launch vehicles were still constrained to 4.5 meter payloads and 25,000 kg to LEO or 6,600 kg to L2. Furthermore, science now demanded a space telescope with a 6 to 8 meter aperture operating at L2. Mirror technology was identified as a critical capability necessary to enable the next generation of large aperture space telescopes. Specific telescope architectures were explored via three independent design concept studies conducted during the summer of 1996 (1). These studies identified two significant architectural constraints: segmentation and areal density. Because the launch vehicle fairing payload dynamic envelope diameter is approximately 4.5 meters, the only way to launch an 8 meter class mirror is to segment it, fold it, and deploy it on orbit - resulting in actuation and control requirements. And, because of launch vehicle mass limits, the primary mirror allocation was only 1000 kg - resulting in a maximum areal density of 20 kg/m2. At the inception of
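The areal-density figures quoted above can be checked directly. This sketch reproduces the ~180 kg/m2 Hubble value and shows that a 1000 kg allocation at 20 kg/m2 indeed corresponds to an ~8 m filled aperture:

```python
import math

def areal_density(mass_kg, diameter_m):
    """Mirror mass per unit collecting area (kg/m^2), treating the
    primary as a filled circular aperture."""
    return mass_kg / (math.pi * (diameter_m / 2) ** 2)

# Figures quoted in the text
hst = areal_density(810, 2.4)        # Hubble 2.4 m, 810 kg primary
area_budget = 1000 / 20.0            # 1000 kg at 20 kg/m^2 -> area, m^2
jwst_diameter = 2 * math.sqrt(area_budget / math.pi)

print(f"HST primary: {hst:.0f} kg/m^2")
print(f"20 kg/m^2 budget supports a {jwst_diameter:.1f} m filled aperture")
```

The ~179 kg/m2 result matches the quoted 180 kg/m2, and the 50 m2 area budget works out to an 8 m class aperture, consistent with the text's derivation of the 20 kg/m2 requirement.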

The James Webb Space Telescope (JWST), due to launch in 2014, shall provide an unprecedented wealth of information in the near and mid-infrared wavelengths, thanks to its high-sensitivity instruments and its 6.5 m primary mirror, the largest ever launched into space. NIRSpec and MIRI, the two spectrographs onboard JWST, will play a key role in the study of the spectral features of Active Galactic Nuclei in the 0.6-28 micron wavelength range. This talk aims at presenting an overview of the possibilities provided by these two instruments, in order to prepare the astronomical community for the JWST era.

The advent of cryogenic space-borne infrared observatories such as the Spitzer Space Telescope has led to a revolution in the study of planets and planetary systems orbiting sun-like stars. Already Spitzer has characterized the emergent infrared spectra of close-in giant exoplanets using transit and eclipse techniques. The James Webb Space Telescope (JWST) will be able to extend these studies to super-Earth exoplanets orbiting in the habitable zones of M-dwarf stars in the near solar neighborhood. The forthcoming ground-based Extremely Large Telescopes (ELTs) will play a key role in these studies, being especially valuable for spectroscopy at higher spectral resolving powers where large photon fluxes are needed. The culmination of this work within the next two decades will be the detection and spectral characterization of the major molecular constituents in the atmosphere of a habitable super-Earth orbiting a nearby lower main sequence star.

We present the prototyping results and laboratory characterization of a narrow-band Fabry-Perot etalon flight model which is one of the wavelength-selecting elements of the Tunable Filter Imager. The latter is part of the Fine Guidance Sensor, which represents the Canadian contribution to NASA's James Webb Space Telescope. The unique design of this etalon provides the JWST observatory with the ability to image, at 30 Kelvin, a 2.2' x 2.2' portion of its field of view in a narrow spectral bandwidth of R~100 at any wavelength between 1.6 and 4.9 μm (with a gap in coverage between 2.5 and 3.2 μm). Extensive testing has resulted in a better understanding of the thermal properties of the piezoelectric transducers used as an actuation system for tuning the etalon gap. Good throughput, spectral resolution and contrast have been demonstrated for the full wavelength range.
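The R~100 bandwidth of an etalon like the one described can be illustrated with the ideal Airy transmission function. The mirror reflectivity and interference order below are chosen only to land the resolving power near 100; they are not taken from the TFI design:

```python
import numpy as np

def airy_transmission(lam, gap, reflectivity):
    """Ideal Fabry-Perot transmission vs. wavelength (normal incidence,
    unit-index gap, lossless mirrors). Illustrative only."""
    F = 4 * reflectivity / (1 - reflectivity) ** 2  # coefficient of finesse
    delta = 4 * np.pi * gap / lam                   # round-trip phase
    return 1.0 / (1.0 + F * np.sin(delta / 2) ** 2)

# Low-order etalon tuned so an interference order lands at 3.5 um
order, lam0 = 4, 3.5e-6
gap = order * lam0 / 2  # resonance condition: d = m * lambda / 2

lam = np.linspace(3.3e-6, 3.7e-6, 20001)
T = airy_transmission(lam, gap, reflectivity=0.88)
fwhm = np.ptp(lam[T > 0.5])  # full width at half maximum of the peak
print(f"resolving power R ~ {lam0 / fwhm:.0f}")
```

With 88% mirrors the reflective finesse is about 25, so a fourth-order passband gives R = m x finesse near 100, which is how a low-order, piezo-tuned gap can deliver a roughly constant R~100 across a wide wavelength range.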

The James Webb Space Telescope (JWST) will allow observations with a unique combination of spectral, spatial, and temporal resolution for the study of outer planet satellites within our Solar System. We highlight the infrared spectroscopy of icy moons and temporal changes on geologically active satellites as two particularly valuable avenues of scientific inquiry. While some care must be taken to avoid saturation issues, JWST has observation modes that should provide excellent infrared data for such studies.

In the following, we have worked to develop a flexible "observability" scale of biologically relevant molecules in the atmospheres of newly discovered exoplanets for the instruments aboard NASA's next flagship mission, the James Webb Space Telescope (JWST). We sought to create such a scale in order to provide the community with a tool with which to optimize target selection for JWST observations based on detections from the upcoming Transiting Exoplanet Survey Satellite (TESS). Current literature has laid the groundwork for defining both biologically relevant molecules and the characteristics that would make a new world "habitable", but it has so far lacked a cohesive analysis of JWST's capabilities to observe these molecules in exoplanet atmospheres and thereby constrain habitability. In developing our Observability Scale, we utilized a range of hypothetical planets (over planetary radii and stellar insolation) and generated three self-consistent atmospheric models (of different molecular compositions) for each of our simulated planets. With these planets and their corresponding atmospheres, we utilized the most accurate JWST instrument simulator, created specifically to process transiting exoplanet spectra. Through careful analysis of these simulated outputs, we were able to determine the relevant parameters that affected JWST's ability to constrain each individual molecular band with statistical accuracy, and therefore generate a scale based on those key parameters. As a preliminary test of our Observability Scale, we have also applied it to the list of TESS candidate stars in order to determine JWST's observational capabilities for any soon-to-be-detected planet in those systems.

The James Webb Space Telescope (JWST) is the successor to the Hubble Space Telescope. JWST will be an infrared optimized telescope, with an approximately 6.5 m diameter primary mirror, that is located at the Sun-Earth L2 Lagrange point. Three of JWST's four science instruments use Teledyne HgCdTe HAWAII-2RG (H2RG) near infrared detector arrays. During 2010, the JWST Project noticed that a few of its 5 micron cutoff H2RG detectors were degrading during room temperature storage, and NASA chartered a "Detector Degradation Failure Review Board" (DD-FRB) to investigate. The DD-FRB determined that the root cause was a design flaw that allowed indium to interdiffuse with the gold contacts and migrate into the HgCdTe detector layer. Fortunately, Teledyne already had an improved design that eliminated this degradation mechanism. During early 2012, the improved H2RG design was qualified for flight and JWST began making additional H2RGs. In this article we present the two public DD-FRB "Executive Summaries" that: (1) determined the root cause of the detector degradation and (2) defined tests to determine whether the existing detectors are qualified for flight. We supplement these with a brief introduction to H2RG detector arrays, and a discussion of how the JWST Project is using cryogenic storage to retard the degradation rate of the existing flight spare H2RGs.

JWST has a key goal to search for First Light objects beyond z>10. Our 110-hr JWST GTO program, 'Webb Medium-Deep Fields' (WMDF), will target both blank and lensed fields to probe both the bright and the faint ends of the galaxy luminosity function at z > 10. While a number of well-studied lensing clusters exist, not all of them are optimal for the JWST search for First Light objects, either because of their low Ecliptic latitudes (and hence high Zodiacal background) or because of their strong intra-cluster light (ICL) at the critical-curve regions corresponding to the redshifts of interest. For this reason, our WMDF candidate lensing targets will include some recently discovered, high-mass (log[M/Msun] ~ 15) galaxy clusters, which we choose either because of their high Ecliptic latitude (beta > 40 deg) or because of their extreme compactness that minimizes the impact of the ICL. As part of our effort to collect ancillary data for these new systems to finalize the target list, we propose IRAC observations for 13 of them that are lacking sufficient data. These 3.6/4.5um data will be critical for our guaranteed JWST program: (1) they will greatly facilitate the modeling of the straylight that JWST will suffer at 1--5 um (the key range to search for z>10--20 objects), a problem that has recently been identified. If left untreated, such straylight components would severely hamper the detection of faint sources in a lensing field. With the JWST observations alone, it would be difficult to separate the ICL from the straylight at the level needed. (2) The new 3.6/4.5um data will best match our deep optical imaging and spectroscopy at HST, Gemini, LBT and MMT. We will derive accurate photometric redshifts for any lensed background galaxies. We also note that these data will be highly valuable for the study of these clusters themselves before the JWST mission.

Deployable subsystems are essential to mission success of most spacecraft. These subsystems enable critical functions including power, communications and thermal control. The loss of any of these functions will generally result in loss of the mission. These subsystems and their components often consist of unique designs and applications, for which various standardized data sources are not applicable for estimating reliability and for assessing risks. In this study, a Bayesian approach for reliability estimation of spacecraft deployment was developed for this purpose. This approach was then applied to the James Webb Space Telescope (JWST) Sunshield subsystem, a unique design intended for thermal control of the observatory's telescope and science instruments. In order to collect the prior information on deployable systems, detailed studies of "heritage information" were conducted, extending over 45 years of spacecraft launches. The NASA Goddard Space Flight Center (GSFC) Spacecraft Operational Anomaly and Reporting System (SOARS) data were then used to estimate the parameters of the conjugate beta prior distribution for anomaly and failure occurrence, as the most consistent set of available data that could be matched to launch histories. This allows for an empirical Bayesian prediction of the risk of an anomaly occurring during the complex Sunshield deployment, with credibility limits, using prior deployment data and test information.
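The conjugate beta-binomial machinery described above can be sketched in a few lines. All counts here are hypothetical placeholders, not the actual SOARS-derived numbers, and `scipy` is assumed for the credibility limits:

```python
from scipy.stats import beta

# Hypothetical prior from heritage deployment records: Beta(a0, b0),
# where a0 counts anomalies and b0 counts clean deployments.
a0, b0 = 4.0, 96.0

# Conjugate update with new (hypothetical) deployment/test outcomes:
anomalies, successes = 1, 19
a, b = a0 + anomalies, b0 + successes

post_mean = a / (a + b)                      # posterior anomaly probability
lo95, hi95 = beta.ppf([0.025, 0.975], a, b)  # 95% credibility limits
```

The beta prior is conjugate to the binomial likelihood, so the posterior is again a beta distribution and the update reduces to adding observed counts to the prior parameters.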

The James Webb Space Telescope will be launched in Oct 2018 with the goal of observing galaxies in the redshift range of z = 10 - 15. As redshift increases, the age of the Universe decreases, allowing us to study objects formed only a few hundred million years after the Big Bang. This will provide a valuable opportunity to test and improve current galaxy formation theory by comparing predictions for mass, luminosity, and number density to the observed data. We have made testable predictions with the semi-analytical galaxy formation model Galacticus. The code uses Markov Chain Monte Carlo methods to determine viable sets of model parameters that match current astronomical data. The resulting constrained model was then set to match the specifications of the JWST Ultra Deep Field Imaging Survey. Predictions utilizing up to 100 viable parameter sets were calculated, allowing us to assess the uncertainty in current theoretical expectations. We predict that the planned UDF will be able to observe a significant number of objects past redshift z > 9 but nothing at redshift z > 11. In order to detect these faint objects at redshifts z = 11-15 we need to increase exposure time by at least a factor of 1.66.
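The quoted factor of ~1.66 in exposure time is consistent with the usual background-limited scaling, in which the limiting point-source flux improves as the square root of the integration time. A minimal sketch (the 0.275 mag depth gain is an illustrative number chosen to reproduce the quoted factor, not a value from the survey design):

```python
import math

def exposure_factor(delta_mag):
    """Background-limited imaging: limiting flux scales as t**-0.5,
    so reaching delta_mag fainter costs a factor 10**(0.8 * delta_mag)
    in exposure time (the required flux ratio, squared)."""
    return 10.0 ** (0.8 * delta_mag)

# Going ~0.275 mag deeper costs roughly the factor of 1.66 quoted above.
factor = exposure_factor(0.275)
```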

The James Webb Space Telescope (JWST) project is an international collaboration led by NASA's Goddard Space Flight Center (GSFC) in Greenbelt, MD. JWST is NASA's flagship observatory that will operate nearly a million miles away from Earth at the L2 Lagrange point. JWST's optical design is a three-mirror anastigmat with four main optical components: 1) the eighteen Primary Mirror Segment Assemblies (PMSAs), 2) a single Secondary Mirror Assembly (SMA), 3) an Aft-Optics Subsystem (AOS) consisting of a Tertiary Mirror and Fine Steering Mirror, and 4) an Integrated Science Instrument Module (ISIM) consisting of the various science instruments. JWST's optical system has been designed to accommodate a significant amount of alignment capability and risk, with the PMSAs and SMA having rigid-body motion available on-orbit purely for alignment purposes. However, the AOS and ISIM are essentially fixed optical subsystems within JWST, and therefore the cryogenic alignment of the AOS to the ISIM is critical to the optical performance and mission success of JWST. In support of this cryogenic alignment, an array of fiber-optic sources, known as the AOS Source Plate Assembly (ASPA), is placed near the intermediate image location of JWST (between the secondary and tertiary mirrors) during thermal-vacuum ground-test operations. The AOS produces images of the ASPA fiber-optic sources at the JWST focal surface, where they are captured by the various science instruments. In this manner, the AOS provides an optical yardstick by which the instruments within ISIM can evaluate their relative positions, and by which the alignment of the AOS to the ISIM can be quantified. However, since the ASPA is located at the intermediate image of the JWST three-mirror anastigmat design, the images of these fiber-optic sources produced by the AOS are highly aberrated, with approximately 2-3 μm RMS wavefront error.

We present observations from the Hubble Space Telescope (HST) program ``A Preparatory Program to Identify the Single Best Transiting Exoplanet for JWST Early Release Science'' for WASP-63b, one of the community targets proposed for the James Webb Space Telescope (JWST) Early Release Science (ERS) program. A large collaboration of transiting exoplanet scientists identified a set of ``community targets'' which meet a certain set of criteria for ecliptic latitude, period, host star brightness, well-constrained orbital parameters, and strength of spectroscopic features. WASP-63b was one of the targets identified as a potential candidate for the ERS program. It is an inflated planet with a large predicted signal. It will be accessible to JWST approximately six months after the planned start of Cycle 1/ERS in April 2019, making it an ideal candidate should there be any delays in the JWST timetable. Here, we observe WASP-63b to evaluate its suitability as the best target to test the capabilities of JWST. Ideally, a clear atmosphere will be best suited for benchmarking the instruments' ability to detect spectroscopic features. We use the strength of the water absorption feature at 1.4 μm to determine the presence of obscuring clouds/hazes. The results of atmospheric retrieval are presented along with a discussion of the suitability of WASP-63b as the best target to be observed during the ERS program.

I will describe the history of the universe, from the Big Bang to 2013, when the JWST is to be launched to look back towards our beginnings. I will discuss how the COBE results led to the Nobel Prize, how the COBE results have been confirmed and extended, and their implications for future observations. The James Webb Space Telescope will be used to examine every part of our history from the first stars and galaxies to the formation of individual stars and planets and the delivery of life-supporting materials to the Earth. I will describe the plans for the JWST and how observers may use it. With luck, the JWST may produce a Nobel Prize for some discovery we can only guess today.

In late 2015/early 2016, a major cryo-vacuum test was carried out for the Integrated Science Instrument Module (ISIM) of the James Webb Space Telescope (JWST). This test comprised the final cryo-certification and calibration test of the ISIM, after its ambient environmental test program (vibration, acoustics, EMI/EMC), and before its delivery for integration with the rest of the JWST observatory. Over the 108-day period of the round-the-clock test program, the full complement of ISIM flight instruments, structure, harness radiator, and electronics were put through a comprehensive program of thermal, optical, electrical, and operational tests. The test verified the health and excellent performance of the instruments and ISIM systems, proving the ISIM element's readiness for integration with the telescope. We report here on the context, goals, setup, execution, and key results for this critical JWST milestone.

The transit technique is used for the detection and characterization of exoplanets. The combination of transit and radial velocity (RV) measurements gives information about a planet's radius and mass, respectively, leading to an estimate of the planet's density (Borucki et al. 2011) and therefore to its composition and evolutionary history. Transit spectroscopy can provide information on atmospheric composition and structure (Fortney et al. 2013). Spectroscopic observations of individual planets have revealed atomic and molecular species such as H2O, CO2 and CH4 in the atmospheres of planets orbiting bright stars, e.g. Deming et al. (2013). Transit observations require extremely precise photometry. For instance, a Jupiter transit results in a ~1% brightness decrease of a solar-type star, while the Earth causes only a 0.0084% decrease (84 ppm). Spectroscopic measurements require still greater precision, at the ppm level. The Precision Projector Laboratory (PPL) is a collaboration between the Jet Propulsion Laboratory (JPL) and the California Institute of Technology (Caltech) to characterize and validate detectors through emulation of science images. At PPL we have developed a testbed to project simulated spectra and other images onto a HgCdTe array in order to assess precision photometry for transits, weak lensing, etc. for missions like JWST, WFIRST, and EUCLID. In our controlled laboratory experiment, the goal is to demonstrate the ability to extract weak transit spectra as expected for NIRCam, NIRISS and NIRSpec. Two lamps of variable intensity, along with spectral-line and photometric simulation masks, emulate the signals from a star only, from a planet only, and finally from a combination of a planet + star. Three masks have been used to simulate spectra in monochromatic light. These masks, which are fabricated at JPL, have a length of 1000 pixels and widths of 2 pixels, 10 pixels and 1 pixel to correspond respectively to the JWST instruments noted above.
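The transit depths quoted above follow directly from the ratio of projected disk areas of planet and star:

```python
# Mean/equatorial radii in km (standard values).
R_SUN_KM = 696_000.0
R_JUPITER_KM = 71_492.0
R_EARTH_KM = 6_371.0

def transit_depth(r_planet_km, r_star_km=R_SUN_KM):
    """Fractional dimming when the planet crosses the stellar disk."""
    return (r_planet_km / r_star_km) ** 2

jupiter_depth = transit_depth(R_JUPITER_KM)  # about 1%
earth_depth = transit_depth(R_EARTH_KM)      # about 84 ppm
```

Detecting atmospheric features on top of these depths requires measuring changes at a small fraction of the depth itself, hence the ppm-level photometric requirement.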

In molecular clouds above a few magnitudes of total visual extinction, some components of the molecular gas freeze out on the surfaces of dust grains. These ice mantles around dust grains are the site of complex surface chemistry that leads to the formation of simple organic molecules in these mantles. The icy surfaces also facilitate the coagulation of the dust particles, setting the stage for grain growth and ultimately the formation of planetary bodies. As part of the JWST NIRCam GTO program, we plan to observe a selection of small molecular cores using the wide-field grism spectroscopy mode of NIRCam. This poster presents the results of a preliminary study of several candidate molecular cores using UKIRT, Spitzer IRAC, IRTF SpeX, Keck MOSFIRE and Subaru MOIRCS data. After the preliminary studies, we have selected three molecular cores in different evolutionary stages for the GTO program: B68, a quiescent molecular core; LDN 694-2, a collapsing pre-stellar core; and B335, a protostellar core. All these cores are seen against a dense background of stars in the inner Galaxy and offer the opportunity for spatially well-resolved mapping of the ice feature distribution. We will obtain slitless grism spectroscopy in six filters covering the features of H2O, CO2, CO, CH3OH, and the XCN feature. Simulations using aXeSIM have shown that spectrum overlap will occur in a fraction of the spectra but will not be a prohibitive problem. Our poster will discuss the details of the observations planned in the APT system.

JWST-MIRI will have imaging and medium resolution (λ/Δλ ≈ 2000-3000) integral field spectroscopy with orders-of-magnitude improvements in sensitivity and/or spatial resolution compared with existing facilities. It will be a prime facility for astrochemical studies of gases and solids in a wide variety of objects in the next decade.

1. Introduction

Mid-infrared spectroscopy is becoming a powerful tool in astrochemistry, with studies of molecules and sources that are highly complementary to those at millimeter wavelengths. Molecules without permanent dipole moments such as CH4, C2H2 and CO2 can only be observed through their vibration-rotation transitions. Space-based missions open up the possibility to study molecules which are abundant in our own atmosphere, in particular H2O. Polycyclic Aromatic Hydrocarbons have their most prominent features at mid-infrared wavelengths, and the pure rotational transitions of the dominant molecule in the universe, H2, also occur in this band. Solid-state material is uniquely probed in the mid-infrared, including characteristic bands of ices, silicates, oxides, carbides, carbonates and sulfides. The wealth of mid-infrared spectroscopy has been demonstrated by results from the ISO satellite (see van Dishoeck & Tielens 2001, van Dishoeck 2004 for reviews), by pioneering ground-based studies (Lacy et al. 1989, Evans et al. 1990) and most recently by the Spitzer Space Telescope. Targets include molecular clouds, PDRs, shocks, deeply embedded young stellar objects, UC HII regions, protoplanetary disks, planetary atmospheres, comets, evolved stars and even entire galaxies. In addition to an inventory of gaseous and solid-state material, the lines and line ratios provide powerful diagnostics of temperatures, densities, UV field, elemental abundances, etc. Systematic variations in features from region to region allow the physical and chemical processes to be traced.

Doppler and transit surveys are finding extrasolar planets of ever smaller mass and radius, and are now sampling the domain of superEarths. Recent results from the Doppler surveys suggest that discovery of a transiting superEarth in the habitable zone of a lower main sequence star may be possible. We evaluate the prospects for an all-sky transit survey targeted to the brightest stars, which would find the most favorable cases for photometric and spectroscopic characterization using the James Webb Space Telescope. We use the proposed Transiting Exoplanet Survey Satellite (TESS) as representative of an all-sky survey. We couple the simulated TESS yield to a sensitivity model for the MIRI and NIRSpec instruments on JWST. Our sensitivity model includes all currently known and anticipated sources of random and systematic error for these instruments. We focus on the TESS planets with radii between Earth and Neptune. Our simulations consider secondary-eclipse filter photometry using JWST/MIRI, comparing the 11- and 15-micron bands to measure carbon dioxide absorption in superEarths, as well as JWST/NIRSpec spectroscopy of water absorption from 1.7-3.0 microns and carbon dioxide absorption at 4.3 microns. We find that JWST will be capable of characterizing dozens of TESS superEarths with temperatures above the habitable range, using both MIRI and NIRSpec. We project that TESS will discover about eight nearby habitable transiting superEarths, all orbiting lower main sequence stars. The principal sources of uncertainty in the prospects for JWST characterization of habitable superEarths are superEarth frequency and the nature of superEarth atmospheres. Based on our estimates of these uncertainties, we project that JWST will be able to measure the temperature, and identify molecular absorptions (water, carbon dioxide), in one to four nearby habitable TESS superEarths orbiting lower main sequence stars.
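The MIRI secondary-eclipse measurements above amount to comparing the thermal fluxes of planet and star. A crude sketch under blackbody assumptions (the radius ratio and temperatures below are hypothetical, not drawn from the TESS simulations):

```python
import math

# Physical constants (SI).
H, C, KB = 6.62607015e-34, 2.99792458e8, 1.380649e-23

def planck(lam_m, temp_k):
    """Spectral radiance B_lambda, up to constants that cancel in ratios."""
    x = H * C / (lam_m * KB * temp_k)
    return 1.0 / (lam_m ** 5 * math.expm1(x))

def eclipse_depth(rp_over_rs, t_planet, t_star, lam_um):
    """Planet/star flux ratio at one wavelength, both as blackbodies."""
    lam = lam_um * 1e-6
    return rp_over_rs ** 2 * planck(lam, t_planet) / planck(lam, t_star)

# Hypothetical warm superEarth around an M dwarf: ~160 ppm at 15 um.
depth_15um = eclipse_depth(rp_over_rs=0.05, t_planet=500.0,
                           t_star=3000.0, lam_um=15.0)
```

The contrast grows toward longer wavelengths, which is part of why the mid-infrared MIRI bands are attractive for secondary-eclipse work on cool planets.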

The JWST North Ecliptic Pole (NEP) Survey field is located within JWST's northern Continuous Viewing Zone, will span ~14′ in diameter (~10′ with NIRISS coverage) and will be roughly circular in shape (initially sampled during Cycle 1 at 4 distinct orientations with JWST/NIRCam's 4.4′×2.2′ FoV — the JWST “windmill”) and will have NIRISS slitless grism spectroscopy taken in parallel, overlapping an alternate NIRCam orientation. This is the only region in the sky where JWST can observe a clean extragalactic deep survey field (free of bright foreground stars and with low Galactic foreground extinction AV) at arbitrary cadence or at arbitrary orientation. This will crucially enable a wide range of new and exciting time-domain science, including high-redshift transient searches and monitoring (e.g., SNe), variability studies from Active Galactic Nuclei to brown dwarf atmospheres, as well as proper motions of extreme scattered Kuiper Belt and Oort Cloud Objects, and of nearby Galactic brown dwarfs, low-mass stars, and ultracool white dwarfs. We therefore welcome and encourage follow-up through GO programs of the initial GTO observations to realize its potential as a JWST time-domain community field. The JWST NEP Survey field was selected from an analysis of WISE 3.4+4.6 micron, 2MASS JHKs, and SDSS ugriz source counts and of Galactic foreground extinction, and is one of very few such ~10′ fields that are devoid of sources brighter than mAB = 16 mag. We have secured deep (mAB ~ 26 mag) wide-field (~23′×25′) Ugrz images of this field and its surroundings with LBT/LBC. We also expect that deep MMT/MMIRS YJHK images, deep 8-12 GHz VLA radio observations (pending), and possibly HST ACS/WFC and WFC3/UVIS ultraviolet-visible images will be available before JWST launches in Oct 2018.

The recent discovery of three Earth-sized, potentially habitable planets around a nearby cool star, TRAPPIST-1, has provided three key targets for the upcoming James Webb Space Telescope (JWST). Depending on their atmospheric characteristics and precise orbit configurations, it is possible that any of the three planets may be in the liquid water habitable zone, meaning that they may be capable of supporting life. We find that present-day Earth levels of ozone, if present, would be detectable if JWST observes 60 transits for innermost planet 1b and 30 transits for 1c and 1d.
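The 60- and 30-transit estimates reflect simple white-noise stacking, in which the detection significance of a fixed spectral feature grows as the square root of the number of co-added transits. A sketch (the per-transit SNR values below are hypothetical, chosen only to illustrate the scaling):

```python
import math

def transits_needed(single_transit_snr, target_snr=5.0):
    """White-noise co-adding: SNR grows as sqrt(N), so
    N = ceil((target / single)**2) transits are required."""
    return math.ceil((target_snr / single_transit_snr) ** 2)

# If one transit yields SNR ~ 0.65 on the ozone band for planet 1b
# (hypothetical), a ~5-sigma detection needs on the order of 60 transits;
# a slightly stronger per-transit signal of ~0.92 needs about 30.
n_b = transits_needed(0.65)
n_cd = transits_needed(0.92)
```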

The Mid-InfraRed Instrument (MIRI, Wright et al. 2003) on board the James Webb Space Telescope (JWST) will be the next major mid-infrared facility in space. It combines high sensitivity with medium resolution spectroscopy and subarcsecond imaging. This makes it one of the prime facilities for astrochemical studies in the next decade. Mid-infrared spectroscopy is a very powerful astrochemical tool. Molecules without permanent dipoles such as CH4, C2H2 and CO2 can only be observed through their vibration-rotation transitions, while atmospheric species, in particular H2O, require space-based facilities. PAHs and solid-state materials have prominent features in the mid-infrared, and the pure rotational transitions of the dominant molecule in the universe, H2, also occur in this band. The wealth of mid-infrared spectroscopy has been demonstrated by results from the ISO satellite (see van Dishoeck 2004 for a review), pioneering ground-based studies and most recently by the Spitzer Space Telescope. The targeted sources are extremely diverse and include objects in the local and distant universe. Variations in features allow both qualitative and quantitative studies of physical and chemical processes. MIRI consists of an imager (including low resolution (R=λ/Δλ≈100) spectroscopy and coronagraphy) and a medium resolution spectrometer (R=2000-3000) operating in the 5-28 μm wavelength range using 1024x1024 pixel Si:As arrays. The spectrometer uses four IFUs with fields ranging from 3.5 to 7 arcsec. MIRI's sensitivity, orders of magnitude higher than that of Spitzer and 8-m class ground-based telescopes, together with its spatial and spectral resolution, makes it particularly well suited for studying gases and solids in disks around young stars and in the nuclei of (starburst) galaxies. The sensitive low resolution spectrometer will be ideal for characterizing exoplanet atmospheres. MIRI is built in partnership by a European Consortium and the US.

One of the major capabilities offered by JWST is coronagraphic imaging from space, covering the near through mid-IR and optimized for the study of planet formation and the evolution of planetary systems. Planning for JWST has resulted in expectations for instrument performance, observation strategies and data reduction approaches. HST, with 20 years of coronagraphic imaging, offers some experience which may be useful to those planning for JWST. 1) Real astronomical sources do not necessarily conform to expectations. Debris disks may be accompanied by more distant material, and some systems may be conspicuous in scattered light while offering only modest IR excesses. Proto-planetary disks are not constantly illuminated, and thus a single-epoch observation of a source may not be sufficient to reveal everything about it. 2) The early expectation with NICMOS was that shallow, 2-roll observations would reveal a wealth of debris disks imaged in scattered light, and that only a limited set of PSF observations would be required. Instead, building up a library of spatially resolved disks in scattered light has proven to require alternate observing strategies, is still ongoing, and has taken far longer than expected. 3) A wealth of coronagraphic options with an instrument may not be scientifically informative unless there is a similar time investment in the acquisition of calibration data in support of the science observations. 4) Finally, no one anticipated all that could be gleaned from coronagraphic imaging. We should expect similar, unexpected, and ultimately revolutionary discoveries with JWST.

Imaging with the James Webb Space Telescope (JWST) will allow observations of the bulk of distant galaxies at the epoch of reionization. The recovery of their properties, such as age, color excess, specific star formation rate (sSFR), and stellar mass, will mostly rely on spectral energy distribution fitting.

This illustration shows a close-up of Saturn's rings. These rings are thought to have formed from material that was unable to coalesce into a moon because of tidal forces from Saturn, or from a moon that was broken up by Saturn's tidal forces.

Shown is an illustration of the Ares I concept. The first stage will be a single, five-segment solid rocket booster derived from the space shuttle program's reusable solid rocket motor. The first stage is managed by NASA's Marshall Space Flight Center in Huntsville, Alabama for NASA's Constellation program.

This paper will track my battle to ‘get conceptual’ in the production of a Grounded Theory. It will discuss early attempts at creating substantive codes through the process of open coding which, despite my best efforts, merely produced descriptive codes. It will illustrate the process by which these descriptive codes became more conceptual, earning the title of substantive code and how their presentation in essay form produced a perfect example of ‘conceptual description’. It will then describe the slow dawning of the purpose of ‘theoretical codes’ as organisers of substantive codes and the emergence of a Grounded Theory.

Proficiency in art and illustration was once considered an essential skill for biologists, because text alone often could not suffice to describe observations of biological systems. With modern imaging technology, it is no longer necessary to illustrate what we can see by eye. However, in molecular and cellular biology, our understanding of biological processes is dependent on our ability to synthesize diverse data to generate a hypothesis. Creating visual models of these hypotheses is important for generating new ideas and for communicating to our peers and to the public. Here, I discuss the benefits of creating visual models in molecular and cellular biology and consider steps to enable researchers to become more effective visual communicators.

The article discusses contemporary agrobotanical illustration as an integral part of scientific and educational work. As the basis for the analysis, we assembled a set of illustrative images that served as the test material. Visual images provide material for the development of visual thinking: students are taught to read the information presented on a worksheet, to think, and to create something new. The article considers pencil graphics produced in a computer-drawing program and processed in Photoshop, as well as mixed-media techniques that combine traditional drawing with herbarium specimens, processed in color in Photoshop. Another new direction of contemporary agroillustration is infographics. Their use in the educational and scientific process follows from the fact that infographics involve analytical processing of the quantitative data obtained during an experiment; these data in turn need to be visualized, prepared, and presented. A further new direction in contemporary agrobotanical illustration is plant images taken with X-rays. The modern level of requirements for creating images is high, and this must be taken into account when carrying out scientific experiments whose results must be demonstrated visually. Modern agroillustration can also be processed using various artificial-intelligence and pattern-recognition systems, for example the system named “Edos”.

The Thermal and Electrical Conductivity Probe on NASA's Phoenix Mars Lander detected small and variable amounts of water in the Martian soil. In this schematic illustration, water molecules are represented in red and white; soil minerals are represented in green and blue. The water, neither liquid, vapor, nor solid, adheres in very thin films of molecules to the surfaces of soil minerals. The left half illustrates an interpretation of less water being adsorbed onto the soil-particle surface during a period when the tilt, or obliquity, of Mars' rotation axis is small, as it is in the present. The right half illustrates a thicker film of water during a time when the obliquity is greater, as it is during cycles on time scales of hundreds of thousands of years. As the humidity of the atmosphere increases, more water accumulates on mineral surfaces. Thicker films behave increasingly like liquid water. The Phoenix Mission is led by the University of Arizona, Tucson, on behalf of NASA. Project management of the mission is by NASA's Jet Propulsion Laboratory, Pasadena, Calif. Spacecraft development is by Lockheed Martin Space Systems, Denver.

This concept is a cutaway illustration of the Lunar Module (LM) with detailed callouts. The LM was a two part spacecraft. Its lower or descent stage had the landing gear, engines, and fuel needed for the landing. When the LM blasted off the Moon, the descent stage served as the launching pad for its companion ascent stage, which was also home for the two astronauts on the surface of the Moon. The LM was full of gear with which to communicate, navigate, and rendezvous. It also had its own propulsion system, and an engine to lift it off the Moon and send it on a course toward the orbiting Command Module.

Scalable Vector Graphics (SVG) and CANVAS are two new tools introduced in HTML5 that you can use to add illustration and interactive animation to your Web pages. Understand why there are two different formats, how you can use them, and where they can be used today. HTML5 is the first major update to the core language of the Web in over a decade. The focus of this book is on innovations that most directly affect Web site design and multimedia integration. The companion Web site features working demonstrations and tutorial media for hands-on p

The James Webb Space Telescope will enable astronomers to obtain exoplanet spectra of unprecedented precision. The MIRI instrument in particular may shed light on the nature of the cloud particles obscuring planetary transmission spectra in the optical and near-infrared. We provide self-consistent atmospheric models and synthetic JWST observations for prime exoplanet targets in order to identify spectral regions of interest and estimate the number of transits needed to distinguish between model setups. We select targets which span a wide range in planetary temperature and surface gravity, ranging from super-Earths to giant planets, and have a high expected SNR. For all targets we vary the enrichment, C/O ratio, presence of optical absorbers (TiO/VO) and cloud treatment. We calculate atmospheric structures and emission and transmission spectra for all targets and use a radiometric model to obtain simulated observations. We analyze JWST's ability to distinguish between various scenarios. We find that in very cloud...
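The size of a transmission-spectrum feature is set largely by the atmospheric scale height, which is the quantity such model comparisons ultimately probe. As a rough back-of-the-envelope sketch (the hot-Jupiter parameters below are illustrative assumptions, not values from the paper):

```python
# Rough estimate of the per-scale-height transmission signal for a hot Jupiter.
# All planet/star parameters are illustrative assumptions, not paper values.
import math

k_B = 1.380649e-23   # Boltzmann constant, J/K
m_H = 1.6735575e-27  # hydrogen atom mass, kg
R_jup = 6.9911e7     # Jupiter radius, m
R_sun = 6.957e8      # solar radius, m

def transit_signal(T_eq, mu, g, R_p, R_star, n_H=1.0):
    """Fractional transit-depth change from n_H atmospheric scale heights."""
    H = k_B * T_eq / (mu * m_H * g)          # scale height, m
    return 2.0 * n_H * H * R_p / R_star**2   # delta(depth) ~ 2 H Rp / R*^2

# Assumed parameters: T_eq = 1500 K, mu = 2.3 (H2-rich), g = 10 m/s^2
sig = transit_signal(1500.0, 2.3, 10.0, 1.2 * R_jup, 0.9 * R_sun)
print(f"{sig * 1e6:.0f} ppm per scale height")
```

A signal of a few hundred ppm per scale height, against JWST noise floors of tens of ppm, is what makes distinguishing cloudy from clear model setups feasible in a modest number of transits.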

With a scheduled launch in October 2018, the James Webb Space Telescope (JWST) is expected to revolutionise the field of atmospheric characterization of exoplanets. The broad wavelength coverage and high sensitivity of its instruments will allow us to extract far more information from exoplanet spectra than what has been possible with current observations. In this paper, we investigate whether current retrieval methods will still be valid in the era of JWST, exploring common approximations used when retrieving transmission spectra of hot Jupiters. To assess biases, we use 1D photochemical models to simulate typical hot Jupiter cloud-free atmospheres and generate synthetic observations for a range of carbon-to-oxygen ratios. Then, we retrieve these spectra using TauREx, a Bayesian retrieval tool, using two methodologies: one assuming an isothermal atmosphere, and one assuming a parametrized temperature profile. Both methods assume constant-with-altitude abundances. We found that the isothermal approximation bi...

Simulated point spread functions (PSFs) are an essential tool in preparing for future space telescopes, supporting pre-launch science simulations, observation planning, and analysis software development. The open-source Python package WebbPSF provides simulated PSFs for all of JWST's instruments and observing modes. We present the latest updates to WebbPSF based on both updated models of the assembled telescope optics and recent cryo-test data for the science instruments. Outputs from this latest version of WebbPSF will support the JWST Exposure Time Calculator and the first calls for proposals in the year ahead, among many other uses by the community. Furthermore, the same toolkit also now provides support for simulating PSFs for both the WFI and CGI instruments planned for WFIRST.
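At its core, this kind of PSF simulation is Fourier optics: the far-field PSF is the squared modulus of the Fourier transform of the pupil. A minimal sketch with a toy circular aperture (not JWST's segmented hexagonal pupil, which WebbPSF models in detail):

```python
# Minimal Fourier-optics sketch of PSF formation: PSF = |FFT(pupil)|^2.
# Toy circular pupil for illustration; real tools model the segmented
# aperture, obscurations, and measured wavefront errors.
import numpy as np

n = 256
y, x = np.indices((n, n)) - n / 2
pupil = (np.hypot(x, y) < n / 8).astype(float)  # circular aperture mask

# ifftshift centers the pupil on the FFT origin; fftshift recenters the PSF
field = np.fft.fftshift(np.fft.fft2(np.fft.ifftshift(pupil)))
psf = np.abs(field) ** 2
psf /= psf.sum()  # normalize total energy to 1

print(psf.max())  # fraction of energy in the brightest (central) pixel
```

The result is the familiar Airy-like core plus diffraction rings; swapping in a segmented, obstructed pupil and per-segment phase maps is conceptually the same computation.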

The James Webb Space Telescope will revolutionize transiting exoplanet atmospheric science due to its capability for continuous, long-duration observations and its larger collecting area, spectral coverage, and spectral resolution compared to existing space-based facilities. However, it is unclear precisely how well JWST will perform and which of its myriad instruments and observing modes will be best suited for transiting exoplanet studies. In this article, we describe a prefatory JWST Early Release Science (ERS) program that focuses on testing specific observing modes to quickly give the community the data and experience it needs to plan more efficient and successful future transiting exoplanet characterization programs. We propose a multi-pronged approach wherein one aspect of the program focuses on observing transits of a single target with all of the recommended observing modes to identify and understand potential systematics, compare transmission spectra at overlapping and neighboring wavelength regions...

The James Webb Space Telescope's (JWST) Near Infrared Spectrograph (NIRSpec) will provide a multi-object spectroscopy (MOS) mode through the Micro-Shutter Array (MSA). Each MSA quadrant is a grid of contiguous shutters that can be configured to form slits on more than 100 astronomical targets simultaneously. The combination of JWST's sensitivity and superb resolution in the infrared and NIRSpec's full wavelength coverage over 0.6 to 5 μm will open new parameter space for studies of galaxies and resolved stellar populations alike. We describe a NIRSpec MSA observing scenario of spectroscopy of individual stars in an external galaxy, and investigate the technical challenges posed by this scenario. This use case and others, including a deep galaxy survey and observations of Galactic HII regions, are guiding development of the NIRSpec user interfaces including proposal planning and pipeline calibrations.

Depicts characteristic imaging findings of common and uncommon diseases in the pediatric age group. Will serve as an ideal diagnostic reference in daily practice. Offers an excellent teaching aid, with numerous high-quality illustrations. This case-based atlas presents images depicting the findings typically observed when imaging a variety of common and uncommon diseases in the pediatric age group. The cases are organized according to anatomic region, covering disorders of the brain, spinal cord, head and neck, chest, cardiovascular system, gastrointestinal system, genitourinary system, and musculoskeletal system. Cases are presented in a form resembling teaching files, and the images are accompanied by concise informative text. The goal is to provide a diagnostic reference suitable for use in daily routine by both practicing radiologists and radiology residents or fellows. The atlas will also serve as a teaching aid and a study resource, and will offer pediatricians and surgeons guidance on the clinical applications of pediatric imaging.

The James Webb Space Telescope (JWST) will likely revolutionize transiting exoplanet atmospheric science; however, it is unclear precisely how well it will perform and which of its myriad instruments and observing modes will be best suited for transiting exoplanet studies. We will describe a prefatory JWST Early Release Science (ERS) Cycle 1 program that focuses on testing specific observing modes to quickly give the community the data and experience it needs to plan more efficient and successful transiting exoplanet characterization programs in later cycles. We will also present a list of "community targets" that are well suited to achieving these goals. Since most of the community targets do not have well-characterized atmospheres, we have initiated a preparatory HST + Spitzer observing program to determine the presence of obscuring clouds/hazes within their atmospheres. Measurable spectroscopic features are needed to establish the optimal resolution and wavelength regions for exoplanet characterization. We will present preliminary results from this preparatory observing program and discuss their implications on the pending JWST ERS proposal deadline in mid-2017.

The NASA TESS mission will deliver hundreds of transiting exoplanet candidates orbiting bright stars. The spectrometers SOPHIE at OHP and SPIRou at CFHT will be ideal for obtaining radial velocities of these candidates, confirming their nature, and deriving the planets' masses. These measurements will be crucial to deliver the best targets for atmospheric characterization with JWST. Here, we calculate the required observing time with SOPHIE, SPIRou, and JWST for each of the TESS targets in order to prepare follow-up observations. To infer their potential for JWST, we restrict the calculations to the case of transmission spectroscopy with NIRISS. The radial velocity follow-up of the giant planets (R_p > 4 R_E) could be achieved with SOPHIE, with a median observing time of 3.47 hours per target, and a total observing time of 305 hours that includes the 80% most favorable cases. Several small planets (R_p < 4 R_E) could also be confirmed, but most of them would require an unrealistic time investment. On the other hand, SPIRou is ideally suited to the follow-up of the small planets, with a median observing time of 2.65 hours per target, and a median observing time of 4.70 hours for the terrestrial planets in the habitable zone. It would thus be timely to begin follow-up programs with SOPHIE and SPIRou before the first planet candidates are delivered by TESS.
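The observing-time budget above is driven by the stellar reflex semi-amplitude K that each planet induces. A toy sketch using the standard Keplerian formula (the planet and star parameters below are invented for illustration, not TESS targets from the paper):

```python
# Stellar radial-velocity semi-amplitude from the standard Keplerian formula:
# K = (2*pi*G/P)^(1/3) * m_p*sin(i) / ((m_s + m_p)^(2/3) * sqrt(1 - e^2)).
# Input values are hypothetical, chosen only to illustrate the calculation.
import math

G = 6.674e-11        # gravitational constant, SI
M_sun = 1.989e30     # kg
M_earth = 5.972e24   # kg
DAY = 86400.0        # s

def rv_semi_amplitude(P_days, m_p_earth, m_star_sun, inc_deg=90.0, e=0.0):
    """Return the stellar RV semi-amplitude K in m/s."""
    P = P_days * DAY
    m_p = m_p_earth * M_earth
    m_s = m_star_sun * M_sun
    return ((2 * math.pi * G / P) ** (1 / 3)
            * m_p * math.sin(math.radians(inc_deg))
            / ((m_s + m_p) ** (2 / 3) * math.sqrt(1 - e ** 2)))

# Hypothetical candidate: 10 M_E planet on a 10-day orbit around a 0.8 M_sun star
K = rv_semi_amplitude(10.0, 10.0, 0.8)
print(round(K, 2), "m/s")
```

A few m/s is comfortably measurable for SOPHIE or SPIRou on a bright star; as K drops toward the terrestrial regime, the required number of visits (and hence hours) grows rapidly, which is the trade-off the abstract quantifies.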

The Near-Infrared Spectrograph (NIRSpec) for the James Webb Space Telescope (JWST) will have a powerful multi-object spectroscopy mode using four configurable Micro-Shutter Arrays (MSAs). The contiguous MSA shutters can be opened to form slits on astronomical targets, for simultaneous spectroscopy of up to 100 sources per exposure. The NIRSpec MSA shutters are in a fixed grid pattern, and careful analysis in the observation planning process will be crucial for optimal definition of science exposures. Our goal is to maximize the number of astronomical science sources observed in the fewest number of MSA slit configurations. We are developing algorithms in the NIRSpec MSA Planning Tool (MPT) to improve the quality of planned observations using several common science observing strategies as test use cases. For example, the needs for planning extremely deep exposures on a small number of JWST discovered z > 10 galaxy candidates will differ significantly from the requirements for planning spectral observations on a representative sample of stars from a galactic star cluster catalog. In this poster, we present a high level overview of our plans to develop and optimize the MPT for the JWST NIRSpec multi-object spectroscopy mode.
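One simple way to formalize "the most sources in the fewest configurations" is greedy set cover. A minimal sketch, assuming each candidate pointing has already been reduced to the set of catalog targets it can place on open, unvignetted shutters (the pointing names and target sets below are invented, and the real MPT must also handle spectral overlap and shutter-operability constraints):

```python
# Toy greedy set-cover sketch of MSA configuration planning: repeatedly pick
# the candidate pointing that covers the most not-yet-observed targets.
# All configuration names and target IDs are invented for illustration.
def greedy_plan(configs, targets):
    """Pick configurations greedily until every observable target is covered."""
    observable = set().union(*configs.values())
    remaining = set(targets) & observable
    plan = []
    while remaining:
        best = max(configs, key=lambda c: len(configs[c] & remaining))
        plan.append(best)
        remaining -= configs[best]
    return plan

configs = {
    "pointing_A": {1, 2, 3, 4},
    "pointing_B": {3, 4, 5},
    "pointing_C": {5, 6},
    "pointing_D": {1, 6},
}
plan = greedy_plan(configs, targets=range(1, 7))
print(plan)  # covers all six targets in two configurations
```

Greedy set cover is not optimal in general, but it is a standard, fast baseline for exactly this kind of planning problem, and it illustrates why deep single-field programs and wide catalog surveys need different optimization strategies.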

The Near Infrared Camera (NIRCam) is one of the four science instruments of the James Webb Space Telescope (JWST). Its high sensitivity, high spatial resolution images over the 0.6 - 5 microns wavelength region will be essential for making significant findings in many science areas as well as for aligning the JWST primary mirror segments and telescope. The NIRCam engineering test unit was recently assembled and has undergone successful cryogenic testing. The NIRCam collimator and camera optics and their mountings are also progressing, with a brass-board system demonstrating relatively low wavefront error across a wide field of view. The flight model's long-wavelength Si grisms have been fabricated, and its coronagraph masks are now being made. Both the short (0.6 - 2.3 microns) and long (2.4 - 5.0 microns) wavelength flight detectors show good performance and are undergoing final assembly and testing. The flight model subsystems should all be completed later this year through early 2011, and NIRCam will be cryogenically tested in the first half of 2011 before delivery to the JWST integrated science instrument module (ISIM).

The James Webb Space Telescope (JWST) Primary Mirror Segment Assembly (PMSA) was required to meet NASA Technology Readiness Level (TRL) 06 requirements in the summer of 2006. These TRL06 requirements included verifying all mirror technology systems level readiness in simulated end-to-end operating conditions. In order to support the aggressive development and technology readiness schedule for the JWST Primary Mirror Segment Assembly (PMSA), a novel approach was implemented to verify the nanometer surface figure distortion effects on an in-process non-polished beryllium mirror surface. At the time that the TRL06 requirements needed to be met, a polished mirror segment had not yet been produced that could have utilized the baselined interferometric optical test station. The only JWST mirror segment available was a finished machined segment with an acid-etched optical surface. Therefore an Electronic Speckle Pattern Interferometer (ESPI) was used in coordination with additional metrology techniques to perform interferometric level optical testing on a non-optical surface. An accelerated, rigorous certification program was quickly developed for the ESPI to be used with the unfinished optical surface of the primary mirror segment. The ESPI was quickly implemented into the PMSA test program and optical testing was very successful in quantifying the nanometer level surface figure deformation changes in the PMSA due to assembly, thermal cycling, vibration, and acoustic testing. As a result of the successful testing, the PMSA passed all NASA TRL06 readiness requirements.

This image illustrates major areas of emphasis of the Skylab Program. In an early effort to extend the use of Apollo for further applications, NASA established the Apollo Applications Program (AAP) in August of 1965. The AAP was to include long duration Earth orbital missions during which astronauts would carry out scientific, technological, and engineering experiments in space by utilizing modified Saturn launch vehicles and the Apollo spacecraft. The Skylab Program, officially established in 1970, grew out of the AAP. The goals of the Skylab were to enrich our scientific knowledge of the Earth, the Sun, the stars, and cosmic space; to study the effects of weightlessness on living organisms, including man; to study the effects of the processing and manufacturing of materials utilizing the absence of gravity; and to conduct Earth resource observations. The Skylab also conducted 19 selected experiments submitted by high school students. Skylab's 3 different 3-man crews spent up to 84 days in Earth orbit. The Marshall Space Flight Center (MSFC) had responsibility for developing and integrating most of the major components of the Skylab: the Orbital Workshop (OWS), Airlock Module (AM), Multiple Docking Adapter (MDA), Apollo Telescope Mount (ATM), Payload Shroud (PS), and most of the experiments. MSFC was also responsible for providing the Saturn IB launch vehicles for three Apollo spacecraft and crews and a Saturn V launch vehicle for the Skylab.

Pattern approach to the diagnosis of lung diseases based on CT scan appearances. Guide to quick and reliable differential diagnosis. CT-pathology correlation. Emphasis on state-of-the-art MDCT. The purpose of this atlas is to illustrate how to achieve reliable diagnoses when confronted by the different abnormalities, or "disease patterns", that may be visualized on CT scans of the chest. The task of pattern recognition has been greatly facilitated by the advent of multidetector CT (MDCT), and the focus of the book is very much on the role of state-of-the-art MDCT. A wide range of disease patterns and distributions are covered, with emphasis on the typical imaging characteristics of the various focal and diffuse lung diseases. In addition, clinical information relevant to differential diagnosis is provided and the underlying gross and microscopic pathology is depicted, permitting CT-pathology correlation. The entire information relevant to each disease pattern is also tabulated for ease of reference. This book will be an invaluable tool that will enable the reader to quickly and easily reach a diagnosis appropriate to the pattern of lung abnormality identified on CT scans.

This book presents a unique approach for studying mechanisms and machines with drawings that were depicted unclearly in ancient Chinese books. The historical, cultural and technical backgrounds of the mechanisms are explained, and various mechanisms described and illustrated in ancient books are introduced. By utilizing the idea for the conceptual design of modern mechanisms, all feasible designs of ancient mechanisms with uncertain members and joints that meet the technical standards of the subjects’ time periods are synthesized systematically. Ancient Chinese crossbows (the original crossbow and repeating crossbows), textile mechanisms (silk-reeling mechanism, spinning mechanisms, and looms), and many other artisan's tool mechanisms are used as illustrated examples. Such an approach provides a logical method for the reconstruction designs of ancient mechanisms with uncertain structures. It also provides an innovative direction for researchers to further identify the original structures of mechanisms...

Conceptual Blending Theory (CBT) is rapidly emerging as a major force in cognitive science and provides a unifying umbrella framework for a range of cognitive phenomena. The present paper offers a general review of conceptual blending theory, illustrating its four-space model in order to provide a better understanding of its nature.

This article summarizes a workshop held in March 2014 on the potential of the James Webb Space Telescope (JWST) to revolutionize our knowledge of the physical properties of exoplanets through transit observations. JWST's unique combination of high sensitivity and broad wavelength coverage will enable the accurate measurement of transits with high signal-to-noise. Most importantly, JWST spectroscopy will investigate planetary atmospheres to determine atomic and molecular compositions, to probe vertical and horizontal structure, and to follow dynamical evolution, i.e. exoplanet weather. JWST will sample a diverse population of planets of varying masses and densities in a wide variety of environments characterized by a range of host star masses and metallicities, orbital semi-major axes and eccentricities. A broad program of exoplanet science could use a substantial fraction of the overall JWST mission.

A document discusses a thermal radiator design consisting of lightweight composite materials and low-emittance metal coatings for use on the James Webb Space Telescope (JWST) structure. The structure will have a Thermal Subsystem unit to provide passive cooling to the Integrated Science Instrument Module (ISIM) control electronics. The ISIM, in the JWST observatory, is the platform that provides the mounting surfaces for the instrument control electronics. Dissipating the heat generated by the control electronics away from JWST is of paramount importance so that the spacecraft's own heat does not interfere with the infrared-light gathering of distant cosmic sources. The need to have lateral control in the emission direction of the IEC (ISIM Electronics Compartment) radiators led to the development of a directional baffle design that uses multiple curved mirrorlike surfaces. This concept started out from the so-called Winston non-imaging optical concentrators that use opposing parabolic reflector surfaces, where each parabola has its focus at the opposite edge of the exit aperture. For this reason they are often known as compound parabolic concentrators or CPCs. This radiator system with the circular section was chosen for the IEC reflectors because it offers two advantages over other designs. The first is that the area of the reflector strips for a given radiator area is less, which results in a lower mass baffle assembly. Secondly, the fraction of energy emitted by the radiator strips and subsequently reflected by the baffle is less. These fewer reflections reduce the amount of energy that is absorbed and eventually re-emitted, typically in a direction outside the design emission range angle. A baffle frame holds the mirrors in position above a radiator panel on the IEC. Together, these will direct the majority of the heat from the IEC above the sunshield away towards empty space.

The high angular resolution technique of non-redundant masking (NRM) or aperture masking interferometry (AMI) has yielded images of faint protoplanetary companions of nearby stars from the ground. AMI on James Webb Space Telescope (JWST)'s Near Infrared Imager and Slitless Spectrograph (NIRISS) has a lower thermal background than ground-based facilities and does not suffer from atmospheric instability. NIRISS AMI images are likely to have 90%-95% Strehl ratio between 2.77 and 4.8 μm. In this paper we quantify factors that limit the raw point source contrast of JWST NRM. We develop an analytic model of the NRM point spread function which includes different optical path delays (pistons) between mask holes and fit the model parameters with image plane data. It enables a straightforward way to exclude bad pixels, is suited to limited fields of view, and can incorporate effects such as intra-pixel sensitivity variations. We simulate various sources of noise to estimate their effect on the standard deviation of closure phase, σ_CP (a proxy for binary point source contrast). If σ_CP < 10^-4 radians—a contrast ratio of 10 mag—young accreting gas giant planets (e.g., in the nearby Taurus star-forming region) could be imaged with JWST NIRISS. We show the feasibility of using NIRISS' NRM with the sub-Nyquist sampled F277W, which would enable some exoplanet chemistry characterization. In the presence of small piston errors, the dominant sources of closure phase error (depending on pixel sampling, and filter bandwidth) are flat field errors and unmodeled variations in intra-pixel sensitivity. The in-flight stability of NIRISS will determine how well these errors can be calibrated by observing a point source. Our results help develop efficient observing strategies for space-based NRM.
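The reason closure phase works as a contrast metric is that summing fringe phases around a closed triangle of mask holes cancels the per-hole piston errors exactly. A numerical check of that property, using an invented three-hole mask and random pistons:

```python
# Numerical check of the closure-phase identity underlying NRM/AMI:
# the measured fringe phase on baseline (i, j) is the source phase plus the
# piston difference p_i - p_j, so summing around a triangle cancels pistons.
# The three-hole mask and piston values here are invented for illustration.
import numpy as np

rng = np.random.default_rng(0)
pistons = rng.uniform(-1.0, 1.0, size=3)  # per-hole piston phase errors (rad)

def baseline_phase(i, j, intrinsic):
    """Measured fringe phase on baseline (i, j)."""
    return intrinsic[(i, j)] + pistons[i] - pistons[j]

# A point source has zero intrinsic phase on every baseline
intrinsic = {(0, 1): 0.0, (1, 2): 0.0, (2, 0): 0.0}
closure = (baseline_phase(0, 1, intrinsic)
           + baseline_phase(1, 2, intrinsic)
           + baseline_phase(2, 0, intrinsic))
print(abs(closure))  # ~0: pistons cancel identically around the triangle
```

Because the cancellation is exact for pure pistons, the residual closure-phase scatter is set by effects that do not cancel, which is why the abstract singles out flat-field errors and intra-pixel sensitivity variations as the dominant noise sources.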

Growing supermassive black holes (~10^9 M_⊙) that power luminous z > 6 quasars from light seeds—the remnants of the first stars—within a Gyr of the Big Bang poses a timing challenge. The formation of massive black hole seeds via direct collapse, with initial masses of ~10^4-10^5 M_⊙, alleviates this problem. Viable direct-collapse black hole formation sites, the satellite halos of star-forming galaxies, merge and acquire stars to produce a new, transient class of high-redshift objects, obese black hole galaxies (OBGs). The accretion luminosity outshines that of the stars in OBGs. We predict the multi-wavelength energy output of OBGs and growing Pop III remnants at z = 9 for standard and slim disk accretion, as well as high and low metallicities of the associated stellar population. We derive robust selection criteria for OBGs—a pre-selection to eliminate blue sources, followed by color-color cuts ([F090W - F220W] > 0; -0.3 < [F200W - F444W] < 0.3) and the ratio of X-ray flux to rest-frame optical flux (F_X/F_444W >> 1). Our cuts sift out OBGs from other bright, high- and low-redshift contaminants in the infrared. OBGs with predicted M_AB < 25 are unambiguously detectable by the Mid-Infrared Instrument (MIRI) on the upcoming James Webb Space Telescope (JWST). For the parameters explored here, growing Pop III remnants with predicted M_AB < 30 will likely be undetectable by JWST. We demonstrate that JWST has the power to discriminate between initial seeding mechanisms.
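The stated selection criteria translate directly into a small filter function over photometric catalogs. A minimal sketch; the candidate magnitudes and the X-ray-ratio threshold below are invented for illustration (the abstract only requires the ratio to be much greater than 1):

```python
# The OBG selection cuts from the abstract, applied to toy AB magnitudes:
# pre-select non-blue sources, require [F090W - F220W] > 0 and
# -0.3 < [F200W - F444W] < 0.3, plus a large X-ray to F444W flux ratio.
# All candidate values and the threshold choice are invented for illustration.
def is_obg_candidate(mags, fx_ratio, fx_threshold=1.0):
    """mags: dict of AB magnitudes keyed by filter name (hypothetical values)."""
    red_enough = mags["F090W"] - mags["F220W"] > 0.0      # color-color cut 1
    flat_ir = -0.3 < mags["F200W"] - mags["F444W"] < 0.3  # color-color cut 2
    xray_bright = fx_ratio > fx_threshold                  # F_X / F_444W >> 1
    return red_enough and flat_ir and xray_bright

cand = {"F090W": 27.1, "F220W": 26.0, "F200W": 25.9, "F444W": 25.8}
ok = is_obg_candidate(cand, fx_ratio=15.0)
print(ok)  # True: satisfies both color cuts and the X-ray criterion
```

Note the direction of the magnitude convention: a larger F090W magnitude than F220W magnitude means the source is fainter in the bluer band, i.e. red, which is the non-blue pre-selection in photometric terms.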

Phase retrieval, the process of determining the exit-pupil wavefront of an optical instrument from image-plane intensity measurements, is the baseline methodology for characterizing the wavefront for the suite of science instruments (SIs) in the Integrated Science Instrument Module (ISIM) for the James Webb Space Telescope (JWST). JWST is a large, infrared space telescope with a 6.5-meter diameter primary mirror. JWST is currently NASA's flagship mission and will be the premier space observatory of the next decade. ISIM contains four optical benches with nine unique instruments, including redundancies. ISIM was characterized at the Goddard Space Flight Center (GSFC) in Greenbelt, MD in a series of cryogenic vacuum tests using a telescope simulator. During these tests, phase-retrieval algorithms were used to characterize the instruments. The objective of this paper is to describe the Monte-Carlo simulations that were used to establish uncertainties (i.e., error bars) for the wavefronts of the various instruments in ISIM. Multiple retrieval algorithms were used in the analysis of ISIM phase-retrieval focus-sweep data, including an iterative-transform algorithm and a nonlinear optimization algorithm. These algorithms emphasize the recovery of numerous optical parameters, including low-order wavefront composition described by Zernike polynomial terms and high-order wavefront described by a point-by-point map, location of instrument best focus, focal ratio, exit-pupil amplitude, the morphology of any extended object, and optical jitter. The secondary objective of this paper is to report on the relative accuracies of these algorithms for the ISIM instrument tests, and a comparison of their computational complexity and their performance on central and graphical processing unit clusters. From a phase-retrieval perspective, the ISIM test campaign includes a variety of source illumination bandwidths, various image-plane sampling criteria above and below the Nyquist-Shannon
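The iterative-transform family of algorithms mentioned here descends from the Gerchberg-Saxton scheme: alternate between the pupil and image planes, imposing the known amplitude constraint in each. A minimal single-plane sketch (a toy circular pupil and synthetic data, not the ISIM focus-diverse pipeline):

```python
# Gerchberg-Saxton-style iterative-transform phase retrieval, minimal sketch.
# We synthesize an image-plane amplitude from a hidden pupil phase, then try
# to recover a consistent field by alternating amplitude projections.
# Pupil shape, hidden phase, and iteration count are illustrative choices.
import numpy as np

rng = np.random.default_rng(1)
n = 64
y, x = np.indices((n, n)) - n / 2
pupil_amp = (np.hypot(x, y) < n / 4).astype(float)  # known pupil amplitude

# "Measured" image-plane amplitude generated from a hidden smooth phase
true_phase = 0.3 * np.cos(2 * np.pi * x / n) * pupil_amp
image_amp = np.abs(np.fft.fft2(pupil_amp * np.exp(1j * true_phase)))

def image_error(field):
    """Mismatch between the field's image-plane amplitude and the data."""
    return np.linalg.norm(np.abs(np.fft.fft2(field)) - image_amp)

# Start from a random pupil phase and alternate the two constraints
field = pupil_amp * np.exp(1j * rng.uniform(-np.pi, np.pi, (n, n)))
err0 = image_error(field)
for _ in range(50):
    img = np.fft.fft2(field)
    img = image_amp * np.exp(1j * np.angle(img))      # impose measured amplitude
    field = np.fft.ifft2(img)
    field = pupil_amp * np.exp(1j * np.angle(field))  # impose pupil support
err = image_error(field)
print(err / err0)  # < 1: the image-plane mismatch shrinks over iterations
```

Production pipelines like the one described extend this idea with multiple defocused planes, broadband sources, and jitter models, and complement it with nonlinear optimization over Zernike coefficients; the alternating-projection core, however, is the same.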

A book-and-video training package provides a unique introduction to the basics of Illustrator. Adobe Illustrator is a unique design and drawing program that allows you to create and produce brilliant art for a variety of mediums. This full-color book-and-video training package deciphers even the most complex Illustrator tasks and gets you quickly up to speed using the capabilities of the newest release of Illustrator. Thirteen self-paced lessons explain how to create and produce vibrant graphics using this robust vector drawing application. The complementary lessons featured on the vid

With its integral field unit, the near-infrared spectrograph NIRSpec on JWST will allow high-resolution spectra to be measured in the 3-5 μm range with increased sensitivity over ground-based systems. This capability will considerably extend our knowledge of brown dwarfs and bright exoplanets at large separations from their host star. But because NIRSpec has no coronagraph, its contrast performance at close separations will be extremely limited. In this communication, we explore possibilities to push this limit further by comparing different observing strategies and associated post-processing techniques.

The James Webb Space Telescope (JWST) will likely revolutionize transiting exoplanet atmospheric science, due to a combination of its capability for continuous, long duration observations and its larger collecting area, spectral coverage, and spectral resolution compared to existing space-based facilities. However, it is unclear precisely how well JWST will perform and which of its myriad instruments and observing modes will be best suited for transiting exoplanet studies. In this article, we describe a prefatory JWST Early Release Science (ERS) Cycle 1 program that focuses on testing specific observing modes to quickly give the community the data and experience it needs to plan more efficient and successful transiting exoplanet characterization programs in later cycles. We propose a multi-pronged approach wherein one aspect of the program focuses on observing transits of a single target with all of the recommended observing modes to identify and understand potential systematics, compare transmission spectra at overlapping and neighboring wavelength regions, confirm throughputs, and determine overall performances. In our search for transiting exoplanets that are well suited to achieving these goals, we identify 12 objects (dubbed "community targets") that meet our defined criteria. Currently, the most favorable target is WASP-62b because of its large predicted signal size, relatively bright host star, and location in JWST's continuous viewing zone. Since most of the community targets do not have well-characterized atmospheres, we recommend initiating preparatory observing programs to determine the presence of obscuring clouds/hazes within their atmospheres. Measurable spectroscopic features are needed to establish the optimal resolution and wavelength regions for exoplanet characterization. Other initiatives from our proposed ERS program include testing the instrument brightness limits and performing phase-curve observations. The latter are a unique challenge

The James Webb Space Telescope (JWST) will likely revolutionize transiting exoplanet atmospheric science, due to a combination of its capability for continuous, long duration observations and its larger collecting area, spectral coverage, and spectral resolution compared to existing space-based facilities. However, it is unclear precisely how well JWST will perform and which of its myriad instruments and observing modes will be best suited for transiting exoplanet studies. In this article, we describe a prefatory JWST Early Release Science (ERS) Cycle 1 program that focuses on testing specific observing modes to quickly give the community the data and experience it needs to plan more efficient and successful transiting exoplanet characterization programs in later cycles. We propose a multi-pronged approach wherein one aspect of the program focuses on observing transits of a single target with all of the recommended observing modes to identify and understand potential systematics, compare transmission spectra at overlapping and neighboring wavelength regions, confirm throughputs, and determine overall performances. In our search for transiting exoplanets that are well suited to achieving these goals, we identify 12 objects (dubbed “community targets”) that meet our defined criteria. Currently, the most favorable target is WASP-62b because of its large predicted signal size, relatively bright host star, and location in JWST's continuous viewing zone. Since most of the community targets do not have well-characterized atmospheres, we recommend initiating preparatory observing programs to determine the presence of obscuring clouds/hazes within their atmospheres. Measurable spectroscopic features are needed to establish the optimal resolution and wavelength regions for exoplanet characterization. Other initiatives from our proposed ERS program include testing the instrument brightness limits and performing phase-curve observations. The latter are a unique challenge

Cognitive science research in the last twenty-five years has provided considerable evidence that reason is embodied. The neural architectures that evolved to produce perception, sensation, and bodily movement are at the heart of what we experience as rational inference, conceptualization, and meaning construction.

ESA and NASA recently selected two 5 μm cutoff Teledyne H2RG sensor chip assemblies (SCAs) for flight on the James Webb Space Telescope (JWST) Near Infrared Spectrograph (NIRSpec). These HgCdTe SCAs incorporate Teledyne's improved barrier layer design that eliminates the degradation that affected earlier JWST H2RGs (Rauscher et al. 2012a). The better indium barrier, together with other design changes, has improved the performance and reliability of JWST's SCAs. In this article, we describe the measured performance characteristics that most directly affect scientific observations, including read noise, total noise, dark current, quantum efficiency (QE), and image persistence. As part of measuring QE, we measured the quantum yield, η, as a function of photon energy, and found that it exceeds unity for photon energies E ≈ (2.6-5.2) Eg, where Eg is the HgCdTe bandgap energy. This corresponds to λ ≲ 2 μm for NIRSpec's 5 μm cutoff HgCdTe. Our measurements agree well with a previous measurement by McCullough et al. (2008) for η ≲ 1.3. For η ≳ 1.3, we find a slower increase in η with photon energy than McCullough et al. did. However, and as McCullough et al. note, their two-state model of the yield process is not valid for η ≫ 1.
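The bandgap-to-wavelength conversion behind the quoted quantum-yield threshold is simple arithmetic, E = hc/λ; a minimal sketch (all values here are back-of-envelope, not taken from the paper's measurements):

```python
# Back-of-envelope check: for a 5 um cutoff HgCdTe detector, find the
# wavelength below which photon energy exceeds 2.6x the bandgap energy.
H_C_EV_UM = 1.23984  # h*c expressed in eV*micron

lambda_cutoff_um = 5.0                # NIRSpec 5 um cutoff HgCdTe
Eg_eV = H_C_EV_UM / lambda_cutoff_um  # bandgap energy, ~0.25 eV

# Photons with E >= 2.6*Eg have wavelengths shorter than lambda_cutoff/2.6:
lambda_threshold_um = lambda_cutoff_um / 2.6
print(f"Eg = {Eg_eV:.3f} eV; quantum yield > 1 for lambda < "
      f"{lambda_threshold_um:.2f} um")
```

The ~1.9 μm result is consistent with the "λ ≲ 2 μm" statement in the abstract.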

This paper describes a novel gap gauge tool used to provide an independent check of the James Webb Space Telescope (JWST) Optical Telescope Element (OTE) primary mirror alignment. Accurate measurements of the mechanical gaps between the OTE mirror segments are needed to verify that the segments were properly aligned relative to each other throughout the integration and test of the 6.6 meter telescope. The gap between the Primary Mirror Segment Assemblies (PMSA) is a sensitive indicator of the relative clocking and decenter. Further, the gap measurements are completely independent of all the other measurements used in the alignment process (e.g. laser trackers and laser radar). The gap measurement is a challenge, however, that required a new approach. Commercial gap measurement tools were investigated; however, no suitable solution was found. The challenge of this measurement is due to the required 0.1 mm accuracy, the close spacing of the mirror segments (approximately 3-9 mm), the acute angle between the segment sides (approximately 4 degrees), and the difficult access to the blind gap. Several techniques were considered and tested before selecting the gauge presented here. This paper presents the theory, construction and calibration of the JWST gap gauge that is being used to measure and verify alignment of the OTE primary mirror segments.

The determination of galaxy redshifts in James Webb Space Telescope (JWST) blank-field surveys will mostly rely on photometric estimates, based on the data provided by JWST's Near-Infrared Camera (NIRCam) at 0.6-5.0 μm and Mid-Infrared Instrument (MIRI) at λ > 5.0 μm. In this work we analyse the impact on the derived photometric redshifts of choosing different combinations of NIRCam and MIRI broad-band filters (F070W to F770W), as well as of having ancillary data at shorter wavelengths. The zphot quality is robust at S/N >= 10, but significantly degrades at S/N <= 5. Adding MIRI photometry with a depth one magnitude brighter than the NIRCam depth allows for a redshift recovery of 83-99%, depending on SED type, and its effect is particularly noteworthy for galaxies with nebular emission. The vast majority of NIRCam galaxies with [F150W]=29 AB mag at z=7-10 will be detected with MIRI at [F560W, F770W]<28 mag if these sources are at least mildly evolved or have spectra with emission lines boosting the mid-infrared fluxes.

The high angular resolution technique of non-redundant masking (NRM) or aperture masking interferometry (AMI) has yielded images of faint protoplanetary companions of nearby stars from the ground. AMI on James Webb Space Telescope (JWST)'s Near Infrared Imager and Slitless Spectrograph (NIRISS) has a lower thermal background than ground-based facilities and does not suffer from atmospheric instability. NIRISS AMI images are likely to have 90-95% Strehl ratio between 2.77 and 4.8 micron. In this paper we quantify factors that limit the raw point source contrast of JWST NRM. We develop an analytic model of the NRM point spread function which includes different optical path delays (pistons) between mask holes and fit the model parameters with image plane data. It enables a straightforward way to exclude bad pixels, is suited to limited fields of view, and can incorporate effects such as intra-pixel sensitivity variations. We simulate various sources of noise to estimate their effect on the standard deviation of...
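The piston-dependent PSF model described above amounts to a coherent sum over mask holes; a minimal monochromatic sketch (hole positions, pistons, and field of view are invented illustration values, not the NIRISS mask geometry):

```python
import numpy as np

# Illustrative sketch of a non-redundant-mask fringe pattern with a
# per-hole piston (optical path delay) term: the image-plane intensity is
# |sum_j exp(i*(2*pi/lam)*(b_j . theta + p_j))|^2.
lam = 4.3e-6                                   # wavelength [m]
holes = np.array([[0.0, 0.0], [1.1, 0.4],      # hole centres [m], made up
                  [-0.8, 1.3], [0.5, -1.2]])
pistons = np.array([0.0, 30e-9, -20e-9, 10e-9])  # OPD per hole [m], made up

# Sky angles (radians) over a small illustrative field of view
theta = np.linspace(-1e-5, 1e-5, 128)
tx, ty = np.meshgrid(theta, theta)

# Complex amplitude: sum over holes; pistons shift the fringe phases
phase = (2j * np.pi / lam) * (
    holes[:, 0, None, None] * tx + holes[:, 1, None, None] * ty
    + pistons[:, None, None]
)
psf = np.abs(np.exp(phase).sum(axis=0)) ** 2
psf /= psf.max()  # normalised fringe pattern
```

Fitting the piston values to observed fringe phases, as the paper does in the image plane, is what allows contrast limits to be attributed to specific optical errors.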

This paper covers the steps required to complete a medical illustration in a digital format using Adobe Illustrator and Photoshop. The project example is the surgical procedure for the release of the glenohumeral joint for the condition known as 'frozen shoulder'. The purpose is to demonstrate one method which an artist can use within digital media to create a colour illustration such as the release of the glenohumeral joint. Included is a general overview as how to deal with the administration of a medical illustration commission through the experience of a professional freelance artist.

The professional training of the medical illustration student has been a subject of controversy for the last few years. Curricula of the schools accredited by the Association of Medical Illustrators have, to varying degrees, undergone change due to new demands on the practicing illustrator. The curriculum in Biomedical Illustration at The University of Texas Health Science Center at Dallas attempts to balance time-proven courses with contemporary problem-solving approaches to the training of the medical illustrator. This paper reflects the history, philosophy and methods of implementation of the course of study leading to the M.A. degree in Biomedical Communication at the University of Texas Health Science Center at Dallas.

A picture can be a powerful communication tool. However, creating pictures to illustrate patient instructions can be a costly and time-consuming task. Building on our prior research in this area, we developed a computer application that automatically converts text to pictures using natural language processing and computer graphics techniques. After iterative testing, the automated illustration system was evaluated using 49 previously unseen cardiology discharge instructions. The completeness of the system-generated illustrations was assessed by three raters using a three-level scale. The average inter-rater agreement for text correctly represented in the pictograph was about 66 percent. Since illustration in this context is intended to enhance rather than replace text, these results support the feasibility of conducting automated illustration.

Model development may seem a daunting task for the novice. The purpose of this article is to illustrate the steps of model development applied to a real-life phenomenon using an inductive theory-generating research approach. The value of the illustration is that nurse researchers can follow the application of the process as a point of departure for their own work. A logical stepwise discussion is followed for empirical theory development. The logical thought process from identifying the phenomenon to describing the model as a visual metaphor of the phenomenon is illustrated.

As part of their 2007 Chemists Celebrate Earth Day celebration, the American Chemical Society is sponsoring an illustrated haiku contest for students in grades K-12 around the theme "Recycling—Chemistry Can!"

The Centre de Recherche Astrophysique de Lyon (CRAL) has recently developed two instrument simulators for spectrographic instruments. They are based on Fourier optics, and model the whole chain of acquisition, taking into account both optical aberrations and diffraction effects, by propagating a wavefront through the instrument according to the Fourier optics concept. One simulates the NIRSpec instrument, a near-infrared multi-object spectrograph for the future James Webb Space Telescope (JWST). The other models the Multi Unit Spectroscopic Explorer (MUSE) instrument, a second-generation integral-field spectrograph for the Very Large Telescope (VLT). The two simulators have been developed in different contexts (subcontracted versus developed internally), and for very different instruments (space-based versus ground-based), which broadens CRAL's experience. This paper describes the lessons learned while developing these simulators: development methods, phasing with the project, points to focus on, getting data, interacting with scientists and users, etc.

The near- and mid-IR spectrum of many astronomical objects is dominated by emission bands due to UV-excited polycyclic aromatic hydrocarbons (PAH) and evaporating very small grains (eVSG). Previous studies with the ISO, Spitzer and AKARI space telescopes have shown that the spectral variations of these features are directly related to the local physical conditions that induce a photo-chemical evolution of the band carriers. Because of the limited sensitivity and spatial resolution, these studies have focused mainly on Galactic star-forming regions. We discuss how the advent of JWST will make it possible to extend these studies to previously unresolved sources such as nearby galaxies, and how the analysis of the infrared signatures of PAHs and eVSGs can be used to determine their physical conditions and chemical composition.

The Mid-InfraRed Instrument (MIRI) on the James Webb Space Telescope (JWST) provides measurements over the wavelength range 5 to 28.5 microns. MIRI has, within a single 'package', four key scientific functions: photometric imaging, coronagraphy, single-source low-spectral resolving power (R ~ 100) spectroscopy, and medium-resolving power (R ~ 1500 to 3500) integral field spectroscopy. An associated cooler system maintains MIRI at its operating temperature of < 6.7 K. This paper describes the driving principles behind the design of MIRI, the primary design parameters, and their realization in terms of the 'as-built' instrument. It also describes the test program that led to delivery of the tested and calibrated Flight Model to NASA in 2012, and the confirmation after delivery of the key interface requirements.

The James Webb Space Telescope (JWST) cryogenic testing requires measurement systems that both obtain a very high degree of accuracy and can function in that environment. Close-range photogrammetry was identified as meeting those criteria. Testing the capability of a close-range photogrammetric system prior to its existence is a challenging problem. Computer simulation was chosen over building a scaled mock-up to allow for increased flexibility in testing various configurations. Extensive validation work was done to ensure that the actual as-built system meets accuracy and repeatability requirements. The simulated image data predicted the uncertainty in measurement to be within specification and this prediction was borne out experimentally. Uncertainty at all levels was verified experimentally to be <0.1 mm.

Based on accumulating evidence, simulation appears to be a basic computational mechanism in the brain that supports a broad spectrum of processes from perception to social cognition. Further evidence suggests that simulation is typically situated, with the situated character of experience in the environment being reflected in the situated character of the representations that underlie simulation. A basic architecture is sketched of how the brain implements situated simulation. Within this framework, simulators implement the concepts that underlie knowledge, and situated conceptualizations capture patterns of multi-modal simulation associated with frequently experienced situations. A pattern completion inference mechanism uses current perception to activate situated conceptualizations that produce predictions via simulations on relevant modalities. Empirical findings from perception, action, working memory, conceptual processing, language and social cognition illustrate how this framework produces the extensive prediction that characterizes natural intelligence.

The purpose of the paper is to obtain insight into and provide practical advice for event-based conceptual modeling. We analyze a set of event concepts and use the results to formulate a conceptual event model that is used to identify guidelines for creation of dynamic process models and static information models. We characterize events as short-duration processes that have participants, consequences, and properties, and that may be modeled in terms of information structures. The conceptual event model is used to characterize a variety of event concepts and it is used to illustrate how events can be used to integrate dynamic modeling of processes and static modeling of information structures. The results are unique in the sense that no other general event concept has been used to unify a similar broad variety of seemingly incompatible event concepts. The general event concept can be used...

We present new methodology to use cosmic infrared background (CIB) fluctuations to probe sources at 10≲ z≲ 30 from a James Webb Space Telescope (JWST)/NIRCam configuration that will isolate known galaxies to 28 AB mag at 0.5-5 μm. At present significant mutually consistent source-subtracted CIB fluctuations have been identified in the Spitzer and AKARI data at ˜2-5 μm, but we demonstrate internal inconsistencies at shorter wavelengths in the recent CIBER data. We evaluate CIB contributions from remaining galaxies and show that the bulk of the high-z sources will be in the confusion noise of the NIRCam beam, requiring CIB studies. The accurate measurement of the angular spectrum of the fluctuations and probing the dependence of its clustering component on the remaining shot noise power would discriminate between the various currently proposed models for their origin and probe the flux distribution of its sources. We show that the contribution to CIB fluctuations from remaining galaxies is large at visible wavelengths for the current instruments precluding probing the putative Lyman-break of the CIB fluctuations. We demonstrate that with the proposed JWST configuration such measurements will enable probing the Lyman-break. We develop a Lyman-break tomography method to use the NIRCam wavelength coverage to identify or constrain, via the adjacent two-band subtraction, the history of emissions over 10≲ z≲ 30 as the universe comes out of the “Dark Ages.” We apply the proposed tomography to the current Spitzer/IRAC measurements at 3.6 and 4.5 μm, to find that it already leads to interestingly low upper limit on emissions at z≳ 30.
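The adjacent two-band subtraction underlying the proposed Lyman-break tomography rests on simple redshift arithmetic; a toy sketch (the band wavelengths are the Spitzer/IRAC values mentioned in the text, everything else is standard bookkeeping):

```python
# Toy illustration: the Lyman break at rest-frame 0.1216 um means a source
# at redshift z contributes no flux at observed wavelengths shorter than
# 0.1216*(1+z) um. Two adjacent bands therefore bracket a redshift slice
# whose emission appears in the longer band but not the shorter one.
LYMAN_UM = 0.1216  # rest-frame Lyman-alpha wavelength [um]

def lyman_redshift(obs_um):
    """Redshift above which sources drop out of a band centred at obs_um."""
    return obs_um / LYMAN_UM - 1.0

# For the 3.6 and 4.5 um IRAC bands used in the text:
z_lo, z_hi = lyman_redshift(3.6), lyman_redshift(4.5)
print(f"band-to-band dropout selects roughly {z_lo:.0f} < z < {z_hi:.0f}")
# ...which is the z >~ 30 regime constrained in the abstract.
```

With NIRCam's finer wavelength sampling, the same subtraction applied to each adjacent band pair slices the 10 ≲ z ≲ 30 range into multiple tomographic bins.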

Unique and exotic planets give us an opportunity to understand how planetary systems form and evolve over their lifetime, by placing our own planetary system in the context of the vastly different extrasolar systems that are being continually discovered by present space missions. With orbital separations that are less than one-tenth of the Mercury-Sun distance, these close-in planets provide us with valuable insights about the host stellar atmosphere and planetary atmospheres subjected to their enormous stellar insolation. Observed spectroscopic signatures reveal all spectrally active species in a planet, along with information about its thermal structure and dynamics, allowing us to characterize the planet's atmosphere. NASA's upcoming missions will give us the high-resolution spectra necessary to constrain the atmospheric properties with unprecedented accuracy. However, to interpret the observed signals from exoplanetary transit events with any certainty, we need reliable atmospheric retrieval tools that can model the expected observables adequately. In my work thus far, I have built a Markov Chain Monte Carlo (MCMC) convergence scheme, with an analytical radiative equilibrium formulation for the thermal structures, within the NEMESIS atmospheric modeling tool, to allow sufficient (and efficient) exploration of the parameter space. I also augmented the opacity tables to improve the speed and reliability of retrieval models. I then utilized this upgraded version to infer the pressure-temperature (P-T) structures and volume-mixing ratios (VMRs) of major gas species in hot Jupiter dayside atmospheres, from their emission spectra. I have employed a parameterized thermal structure to retrieve plausible P-T profiles, along with altitude-invariant VMRs. Here I show my retrieval results on published datasets of HD189733b, and compare them with both medium and high spectral resolution JWST/NIRSpec simulations. In preparation for the upcoming JWST mission, my current work
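The MCMC machinery referred to above is, at its core, a Metropolis-Hastings sampler over atmospheric parameters; a generic one-parameter sketch (this is not the NEMESIS implementation, and the target distribution here is a stand-in Gaussian):

```python
import numpy as np

# Generic Metropolis-Hastings sketch: sample a 1-D posterior proportional
# to exp(-0.5*((x - mu)/sigma)^2). In a real retrieval, x would be a vector
# of P-T and abundance parameters and log_post a forward-model likelihood.
rng = np.random.default_rng(0)

def log_post(x, mu=1.0, sigma=0.5):
    return -0.5 * ((x - mu) / sigma) ** 2

def metropolis(n_steps=20000, step=0.3, x0=0.0):
    chain = np.empty(n_steps)
    x, lp = x0, log_post(x0)
    for i in range(n_steps):
        prop = x + rng.normal(0.0, step)     # symmetric Gaussian proposal
        lp_prop = log_post(prop)
        if np.log(rng.random()) < lp_prop - lp:  # accept with ratio of posteriors
            x, lp = prop, lp_prop
        chain[i] = x
    return chain

chain = metropolis()
# after burn-in, the chain mean and std approach mu=1.0 and sigma=0.5
```

Convergence diagnostics (burn-in removal, autocorrelation, multiple chains) are what the abstract's "convergence scheme" addresses; they are omitted here for brevity.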

The Galactic Center harbors the closest supermassive black hole and contains warm, turbulent molecular clouds, dense stellar populations, and some of the most active star forming regions in the Milky Way. These unique conditions make the Galactic Center a compelling target for understanding how star formation varies with environment, how nuclear star clusters in galaxies evolve, and how supermassive black holes influence their surroundings. Detailed studies of the Galactic Center have previously been conducted with ground-based telescopes equipped with adaptive optics in pencil-beam studies. However, Galactic Center studies can be dramatically expanded with JWST's combination of large fields-of-view (FOV) and high spatial resolution in the infrared. Of particular relevance for the Galactic Center are NIRCam's suite of narrow-band imaging filters and NIRSpec's IFU spectrograph. The narrow-band imaging should provide precise astrometry, rough spectral types, and emission line maps for ~50,000 stars within a 2' x 2' FOV, while follow up IFU spectroscopy will give precise types and radial velocities for the most interesting subsets of stars. Potential results include: (1) counting the intermediate age red and yellow supergiants that will give information about the recent star formation history; (2) measuring the initial mass function below 1 Msun and studying young stellar objects in known and new young star clusters; (3) using 3D dynamics to model the kinematic evolution of the entire nuclear cluster, find hypervelocity stars, and trace the orbits of gas features and clusters in the region. Galactic Center observations with JWST will give us a more complete picture of the gas, stars, black hole, and their interactions in this dynamic region.

With a scheduled launch in 2018 October, the James Webb Space Telescope (JWST) is expected to revolutionize the field of atmospheric characterization of exoplanets. The broad wavelength coverage and high sensitivity of its instruments will allow us to extract far more information from exoplanet spectra than what has been possible with current observations. In this paper, we investigate whether current retrieval methods will still be valid in the era of JWST, exploring common approximations used when retrieving transmission spectra of hot Jupiters. To assess biases, we use 1D photochemical models to simulate typical hot Jupiter cloud-free atmospheres and generate synthetic observations for a range of carbon-to-oxygen ratios. Then, we retrieve these spectra using TauREx, a Bayesian retrieval tool, using two methodologies: one assuming an isothermal atmosphere, and one assuming a parameterized temperature profile. Both methods assume constant-with-altitude abundances. We found that the isothermal approximation biases the retrieved parameters considerably, overestimating the abundances by about one order of magnitude. The retrieved abundances using the parameterized profile are usually within 1σ of the true state, and we found the retrieved uncertainties to be generally larger compared to the isothermal approximation. Interestingly, we found that by using the parameterized temperature profile we could place tight constraints on the temperature structure. This opens the possibility of characterizing the temperature profile of the terminator region of hot Jupiters. Lastly, we found that assuming a constant-with-altitude mixing ratio profile is a good approximation for most of the atmospheres under study.
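The size of the transmission-spectrum features whose retrieval the paper tests is set by the atmospheric scale height, H = kT/(μg); a quick sketch with illustrative hot-Jupiter numbers (the parameter values are assumptions, not taken from the paper):

```python
# Standard scale-height arithmetic: H = k_B * T / (mu * g). The amplitude
# of transmission features scales with H, which is why hot, H2-dominated
# atmospheres are the easiest retrieval targets.
K_B = 1.380649e-23   # Boltzmann constant [J/K]
AMU = 1.66054e-27    # atomic mass unit [kg]

def scale_height_km(T_K, mu_amu, g_ms2):
    return K_B * T_K / (mu_amu * AMU * g_ms2) / 1e3

# hot-Jupiter-like values (assumed): T = 1500 K, H2-dominated mu = 2.3,
# surface gravity g = 10 m/s^2
H = scale_height_km(1500.0, 2.3, 10.0)
print(f"H ~ {H:.0f} km")
```

An isothermal retrieval forces a single T into this formula at all altitudes, which is one route by which the temperature approximation propagates into biased abundance estimates.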

The JWST Mid Infrared Instrument (MIRI) operates in the 5-28 μm wavelength range and can be configured for imaging, coronagraphic imaging, long-slit low-resolution spectroscopy, or medium-resolution spectroscopy with an integral field unit. SCASim is one of a suite of simulators which operate together to simulate all the different modes of the instrument. These simulators are essential for the efficient operation of MIRI, allowing more accurate planning of MIRI observations on sky or during the pre-launch testing of the instrument. The data generated by the simulators are essential for testing the data pipeline software. The simulators not only need to reproduce the behaviour of the instrument faithfully, they also need to be adaptable so that information learned about the instrument during the pre-launch testing and in-orbit commissioning can be fed back into the simulation. SCASim simulates the behaviour of the MIRI detectors, taking into account cosmetic effects, quantum efficiency, shot noise, dark current, read noise, amplifier layout, cosmic ray hits, etc. The software has benefited from three major design choices. First, the development of a suite of MIRI simulators, rather than a single simulator, has allowed MIRI simulators to be developed in parallel by different teams, with each simulator able to concentrate on one particular area. SCASim provides a facility common to all the other simulators and saves duplication of effort. Second, SCASim has a Python-based object-oriented design which makes it easier to adapt as new information about the instrument is learned during testing. Third, all simulator parameters are maintained in external files, rather than being hard coded in the software. These design choices have made SCASim highly reusable. In its present form it can be used to simulate any JWST detector, and it can be adapted for future instruments with similar, photon-counting detectors.
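The shot-noise, dark-current, and read-noise effects SCASim models can be illustrated with a minimal up-the-ramp detector sketch (this is not SCASim itself; all parameter values below are invented for demonstration):

```python
import numpy as np

# Minimal sketch of an up-the-ramp detector read with the noise sources
# named in the text: Poisson shot noise on (source + dark current) and
# Gaussian read noise added independently at each non-destructive read.
rng = np.random.default_rng(42)

def simulate_ramp(flux_e_per_s, n_reads=10, t_read=2.775,
                  dark_e_per_s=0.2, read_noise_e=14.0, shape=(32, 32)):
    """Return a (n_reads, ny, nx) cube of accumulated signal [electrons]."""
    ramp = np.zeros((n_reads,) + shape)
    accumulated = np.zeros(shape)
    for i in range(n_reads):
        # electrons collected during this read interval (shot noise)
        accumulated += rng.poisson((flux_e_per_s + dark_e_per_s) * t_read,
                                   size=shape)
        # each read samples the charge with independent read noise
        ramp[i] = accumulated + rng.normal(0.0, read_noise_e, size=shape)
    return ramp

ramp = simulate_ramp(flux_e_per_s=50.0)
# the mean signal grows roughly linearly with read number
```

Keeping such parameters in external configuration files rather than hard-coding them, as the abstract describes, is what lets measured instrument behaviour be fed back into the simulation.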

This appendix is a compendium of topical reports prepared for the Hanford Nuclear Energy Center: Status Report: Conceptual Fuel Cycle Studies for the Hanford Nuclear Energy Center; Selection of Heat Disposal Methods for a Hanford Nuclear Energy Center; Station Service Power Supply for a Hanford Nuclear Energy Center (HNEC); Impact of a Hanford Nuclear Energy Center on Ground Level Fog and Humidity; A Review of Potential Technology for the Seismic Characterization of Nuclear Energy Centers; Reliability of Generation at a Hanford Nuclear Energy Center (HNEC); Meteorological Evaluation of Multiple Reactor Contamination Probabilities for a Hanford Nuclear Energy Center; Electric Power Transmission for a Hanford Nuclear Energy Center (HNEC); The Impact of a Hanford Nuclear Energy Center on Cloudiness and Insolation; and A Licensing Review for an HNEC.

Uroradiology is an up-to-date, image-oriented reference in the style of a teaching file that has been designed specifically to be of value in clinical practice. All aspects of the imaging of urologic diseases are covered, and case studies illustrate the findings obtained with the relevant imaging modalities in both common and uncommon conditions. Most chapters focus on a particular clinical problem, but normal findings, congenital anomalies, and interventions are also discussed and illustrated. In this second edition, the range and quality of the illustrations have been enhanced, and many schematic drawings have been added to help readers memorize characteristic imaging findings through pattern recognition. The accompanying text is concise and informative. Besides serving as an outstanding aid to differential diagnosis, this book will provide a user-friendly review tool for certification or recertification in radiology.

The 2018 launch of James Webb Space Telescope (JWST), coupled with the 2017 launch of the Transiting Exoplanet Survey Satellite (TESS), heralds a new era in Exoplanet Science, with TESS projected to detect over one thousand transiting sub-Neptune-sized planets (Ricker et al, 2014), and JWST offering unprecedented spectroscopic capabilities. Sullivan et al (2015) used Monte Carlo simulations to predict the properties of the planets that TESS is likely to detect, and published a catalog of 962 simulated TESS planets. Prior to TESS launch, the re-scoped Kepler K2 mission and ground-based surveys such as MEarth continue to seek nearby Earth-like exoplanets orbiting M-dwarf host stars. The exoplanet community will undoubtedly employ JWST for atmospheric characterization follow-up studies of promising exoplanets, but the targeted planets for these studies must be chosen wisely to maximize JWST science return. The goal of this project is to estimate the capabilities of JWST’s Near InfraRed Imager and Slitless Spectrograph (NIRISS)—operating with the GR700XD grism in Single Object Slitless Spectroscopy (SOSS) mode—during observations of exoplanets transiting their host stars. We compare results obtained for the simulated TESS planets, confirmed K2-discovered super-Earths, and exoplanets discovered using ground-based surveys. By determining the target planet characteristics that result in the most favorable JWST observing conditions, we can optimize the choice of target planets in future JWST follow-on atmospheric characterization studies.

The James Webb Space Telescope (JWST) Optical Telescope Element (OTE) consists of a 6.6 m clear aperture, 18 segment primary mirror, all-reflective, three-mirror anastigmat operating at cryogenic temperatures. To verify performance of the primary mirror, a full aperture center of curvature optical null test is performed under cryogenic conditions in Chamber A at the National Aeronautics and Space Administration (NASA) Johnson Space Center (JSC) using an instantaneous phase measuring interferometer. After phasing the mirrors during the JWST Pathfinder testing, the interferometer is utilized to characterize the mirror relative piston and tilt dynamics under different facility configurations. The correlation between the motions seen on detectors at the focal plane and the interferometer validates the use of the interferometer for dynamic investigations. The success of planned test hardware improvements will be characterized by the multi-wavelength interferometer (MWIF) at the Center of Curvature Optical Assembly (CoCOA).

The James Webb Space Telescope (JWST) primary mirror (PM) is 6.6 m in diameter and consists of 18 hexagonal segments, each 1.5 m point-to-point. Each segment has a six degree-of-freedom hexapod actuation system and a radius-of-curvature (RoC) actuation system. The full telescope will be tested at its cryogenic operating temperature at Johnson Space Center. This testing will include center-of-curvature measurements of the PM, using the Center-of-Curvature Optical Assembly (COCOA) and the Absolute Distance Meter Assembly (ADMA). The COCOA includes an interferometer, a reflective null, an interferometer-null calibration system, coarse and fine alignment systems, and two displacement measuring interferometer systems. A multiple-wavelength interferometer (MWIF) is used for alignment and phasing of the PM segments. The ADMA is used to measure, and set, the spacing between the PM and the focus of the COCOA null (i.e. the PM center-of-curvature) for determination of the RoC. The performance of these metrology systems was assessed during two cryogenic tests at JSC. This testing was performed using the JWST Pathfinder telescope, consisting mostly of engineering development and spare hardware. The Pathfinder PM consists of two spare segments. These tests provided the opportunity to assess how well the center-of-curvature optical metrology hardware, along with the software and procedures, performed using real JWST telescope hardware. This paper will describe the test setup, the testing performed, and the resulting metrology system performance. The knowledge gained and the lessons learned during this testing will be of great benefit to the accurate and efficient cryogenic testing of the JWST flight telescope.

Hubble has revolutionized the field of distant galaxies through its deep imaging surveys, starting with the Hubble Deep Field (HDF) in 1995. That first deep survey revealed galaxies at redshift z~1-3 that provided insights into the development of the Hubble sequence. Each new HST instrument has explored new regimes, through the peak of star formation at z~2-3, just 2-3 billion years after the Big Bang, to our first datasets at a billion years at z~6, and then earlier to z~11. HST's survey capabilities were enhanced by a factor of 40 with ACS, and then similarly with WFC3/IR, which opened up the first billion years to an unforeseen degree. I will discuss what we have learned from the remarkable HST and Spitzer imaging surveys (HUDF, GOODS, HUDF09/12, and CANDELS), as well as surveys of clusters like the Hubble Frontier Fields (HFF). Lensing clusters provide extraordinary opportunities for characterizing the faintest earliest galaxies, but also present extraordinary challenges. Together these surveys have resulted in the measurement of the volume density of galaxies in the first billion years down to astonishingly faint levels. The role of faint galaxies in reionizing the universe is still much discussed, but there is no doubt that such galaxies contribute greatly to the UV ionizing flux, as shown by deep luminosity function studies. Together Hubble and Spitzer have also established the stellar-mass buildup over 97% of cosmic history. Yet some of the greatest surprises have come from the discovery of very luminous galaxies at z~8-11, around 400-650 million years after the Big Bang. Spectroscopic followup by Keck of some of these very rare, bright galaxies has confirmed redshifts from z~7 to z~9, and revealed, surprisingly, strong Lyα emission near the peak of reionization when the HI fraction in the IGM is high. The recent confirmation of a z=11.1 galaxy, just 400 million years after the Big Bang, by a combination of Hubble and Spitzer data, moved Hubble into JWST territory.

The James Webb Space Telescope (JWST) consists of an infrared-optimized Optical Telescope Element (OTE) that is cooled to 40 K. Adjacent to the OTE is the Integrated Science Instrument Module (ISIM). This module includes the electronics compartment, which provides the mounting surfaces and an ambient, thermally controlled environment for the instrument control electronics. Dissipating the 200 W generated within the ISIM away from the OTE is of paramount importance so that the spacecraft's own heat does not interfere with the infrared light detected from distant cosmic sources. This technical challenge is overcome by a thermal subsystem unit that provides passive cooling to the ISIM control electronics. The proposed design of this thermal radiator consists of a lightweight structure made of composite materials and low-emittance metal coatings. In this paper, we present characterizations of the coating emittance, bidirectional reflectance, and mechanical structure design that affect the performance of this passive cooling system.

The James Webb Space Telescope's Near InfraRed Imager and Slitless Spectrograph (NIRISS) contains a 7-hole non-redundant mask (NRM) in its pupil. NIRISS's Aperture Masking Interferometry (AMI) mode is useful both for science and for wavefront sensing. In-focus science-detector NRM and full-pupil images of unresolved stars can be used to measure the wavefront without any dedicated wavefront sensing hardware or any moving mirrors. Using routine science operational sequences, these images can be taken before or after any science visit. NRM fringe phases constrain Gerchberg-Saxton phase retrieval to disambiguate the algorithm's two-fold degeneracy. We summarize how consecutive masked and unmasked exposures provide enough information to reconstruct a wavefront with up to ~1-2 radians rms of error. We present our latest progress on using this approach in laboratory experiments, and discuss those results in the context of contingency for JWST segment phasing. We discuss extending our method to ground-based AO systems and future space telescopes.
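
The core of the phase-retrieval step can be sketched as a minimal Gerchberg-Saxton iteration between pupil and focal planes (a generic textbook illustration, not the NIRISS flight algorithm; the aperture and phase here are made-up placeholders, and the fringe-phase constraint that breaks the two-fold degeneracy is omitted):

```python
import numpy as np

def gerchberg_saxton(pupil_amp, focal_amp, n_iter=300, seed=0):
    """Recover a pupil-plane phase from pupil- and focal-plane amplitudes.

    Classic Gerchberg-Saxton: propagate between planes with FFTs, and in
    each plane replace the current amplitude with the measured one while
    keeping the current phase estimate.
    """
    rng = np.random.default_rng(seed)
    phase = rng.uniform(-np.pi, np.pi, pupil_amp.shape)  # random starting guess
    field = pupil_amp * np.exp(1j * phase)
    for _ in range(n_iter):
        focal = np.fft.fft2(field)
        focal = focal_amp * np.exp(1j * np.angle(focal))   # impose focal amplitude
        field = np.fft.ifft2(focal)
        field = pupil_amp * np.exp(1j * np.angle(field))   # impose pupil amplitude
    return np.angle(field)
```

Without extra constraints the recovered phase carries the algorithm's well-known sign/conjugation ambiguity; in the approach above, the NRM fringe phases supply the information to resolve it.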

The James Webb Space Telescope near-infrared camera (JWST NIRCam) has two 2.2' × 2.2' fields of view that are capable of either imaging or spectroscopic observations. Either of two R ~ 1500 grisms with orthogonal dispersion directions can be used for slitless spectroscopy over λ = 2.4-5.0 μm in each module, and shorter-wavelength observations of the same fields can be obtained simultaneously. We present the latest predicted grism sensitivities, saturation limits, resolving power, and wavelength coverage values based on component measurements, instrument tests, and end-to-end modeling. Short-wavelength (0.6-2.3 μm) imaging observations of the 2.4-5.0 μm spectroscopic field can be performed in one of several different filter bands, either in focus or defocused via weak lenses internal to NIRCam. Alternatively, the possibility of 1.0-2.0 μm spectroscopy (simultaneously with 2.4-5.0 μm) using dispersed Hartmann sensors (DHSs) is being explored. The grisms, wea...

The Mid-Infrared Instrument (MIRI) is a multipurpose imager, coronagraph, and spectrometer for the James Webb Space Telescope. It provides wavelength coverage from 5 through 28 microns and is an integral contributor to all four of JWST's primary science themes. MIRI is being developed as a partnership between NASA and ESA, with JPL providing the Focal Plane System (FPS, consisting of the detectors, control electronics, and flight software) and the cooler, and a consortium of European astronomical institutes providing the optical bench and structure. The flight FPS is being prepared for delivery to the European Consortium for its integration into the optical bench, while the cooler is nearing its Critical Design Review. We describe the capabilities of the FPS and cooler, present test results and the predicted sensitivity performance of the FPS, and update the current status of each of these systems. The research described in this poster was carried out at the Jet Propulsion Laboratory, California Institute of Technology, under a contract with the National Aeronautics and Space Administration.

This paper discusses several issues related to psychological empowerment. Its thesis is that the development of a universal and global measure of psychological empowerment may not be a feasible or appropriate goal. I begin by distinguishing between empowerment processes and outcomes. Underlying assumptions are discussed, including the notion that empowerment differs across people, contexts, and times. A nomological network that includes intrapersonal, interactional, and behavioral components is also presented. Two examples of psychological empowerment, for voluntary service organization members and for members of a mutual help organization, are described to help illustrate differences in the specific variables that may be used to measure psychological empowerment in different populations and settings.

Conceptual spaces have been proposed as topological or geometric means for establishing conceptual structures and models. This paper, after briefly reviewing conceptual spaces, focuses on the relationship between conceptual spaces and logical concept languages with operations for combining concepts to form concepts. Specifically, an algebraic concept logic is introduced, for which conceptual spaces are installed as a semantic domain, as a replacement for, or enrichment of, the traditional...

This article distills the core principles of a phenomenological research design and, by means of a specific study, illustrates the phenomenological methodology. After a brief overview of the developments of phenomenology, the research paradigm of the specific study follows. Thereafter the location of the data and the data-gathering and data-storage methods are explained. Unstructured in-depth phenomenological interviews supplemented by memoing, essays by participants, a focus group discussion, and field notes were used. The data explicitation, by means of a simplified version of Hycner's (1999) process, is further explained. The article finally contains commentary about the validity and truthfulness measures, as well as a synopsis of the findings of the study.

The practice of statistics involves analyzing data and planning data collection schemes to answer scientific questions. Issues often arise with the data that must be dealt with and can lead to new procedures. In analyzing data, these issues can sometimes be addressed through the statistical models that are developed. Simulation can also be helpful in evaluating a new procedure. Moreover, simulation coupled with optimization can be used to plan a data collection scheme. The practice of statistics as just described is much more than just using a statistical package. In analyzing the data, it involves understanding the scientific problem and incorporating the scientist's knowledge. In modeling the data, it involves understanding how the data were collected and accounting for limitations of the data where possible. Moreover, the modeling is likely to be iterative, considering a series of models and evaluating the fit of these models. Designing a data collection scheme involves understanding the scientist's goal and staying within his/her budget in terms of time and the available resources. Consequently, a practicing statistician is faced with such tasks and requires skills and tools to do them quickly. We have written this article for students to provide a glimpse of the practice of statistics. To illustrate the practice of statistics, we consider a problem motivated by some precipitation data that our relative, Masaru Hamada, collected some years ago. We describe his rain gauge observational study in Section 2. We describe modeling and an initial analysis of the precipitation data in Section 3. In Section 4, we consider alternative analyses that address potential issues with the precipitation data. In Section 5, we consider the impact of incorporating additional information. We design a data collection scheme to illustrate the use of simulation and optimization in Section 6. We conclude this article in Section 7 with a discussion.
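
The simulation-plus-optimization idea for planning a data collection scheme can be sketched generically: simulate many datasets under each candidate scheme, score each, and keep the best. This is a toy illustration of the pattern, not the paper's rain-gauge design; `simulate` and `criterion` are hypothetical stand-ins supplied by the analyst.

```python
import numpy as np

def choose_design(candidates, simulate, criterion, n_sim=500, seed=0):
    """Evaluate candidate data-collection designs by Monte Carlo simulation.

    For each candidate design, draw n_sim synthetic datasets with
    `simulate(design, rng)`, score each with `criterion` (lower is better),
    and return the design with the best average score plus all scores.
    """
    rng = np.random.default_rng(seed)
    mean_scores = []
    for design in candidates:
        scores = [criterion(simulate(design, rng)) for _ in range(n_sim)]
        mean_scores.append(float(np.mean(scores)))
    best = candidates[int(np.argmin(mean_scores))]
    return best, mean_scores
```

For example, comparing two affordable sample sizes with the squared error of the sample mean as the criterion correctly identifies the larger design as the better one.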

This paper focuses on Thomas S. Kuhn's work on taxonomic concepts and how it relates to empirical work from the cognitive sciences on categorization and conceptual development. I shall first review the basic features of Kuhn's family resemblance account and compare it to work from the cognitive sciences. I shall then show how Kuhn's account can be extended to cover the development of new taxonomies in science, and I shall illustrate this by a detailed case study that Kuhn himself mentioned only briefly in his own work, namely the discovery of X-rays and radioactivity.

The ambitious science goals of the James Webb Space Telescope (JWST) have driven spectacular advances in λc ~ 5 μm detector technology over the past five years. This paper reviews both the UH/RSC team's Phase A development and evaluation of 2Kx2K arrays exceeding the detector requirements for JWST's near-infrared instruments and also the hardware integration of these into a 4Kx4K (16 Mpxl) close-packed mosaic focal plane array housed in an Ultra Low Background test facility. Both individual first-generation 2Kx2K SCAs and 4Kx4K mosaic focal planes have been extensively characterized in the laboratory and, since September 2003, a NIR camera utilizing the 4Kx4K mosaic focal plane has been in use for nearly 100 nights at the UH 2.2 m telescope on Mauna Kea. Typical test results for the first-generation 2Kx2K arrays and their integration into 4Kx4K mosaic focal planes are reported. Demonstration of the design concepts and both array and mosaic focal plane performance in actual hardware, as described here, has provided the foundation for design iterations leading to later generations of 2Kx2K arrays and 4Kx4K mosaic focal planes. Four major technology developments leading to first-generation hardware demonstrations of both 2Kx2K SCAs and a 4Kx4K mosaic FPA are reviewed. These are: 1) improvement in test equipment and procedures to characterize the detectors against JWST requirements and goals, primarily at 37 K but with the capability to test from 30 K to 100 K; 2) optimization of λc ~ 5 μm MBE HgCdTe material on a CZT substrate for low dark current (goal of 0.003 e-/sec at 37 K) with high quantum efficiency, low cross-talk, and greatly reduced image persistence; 3) development of the 2Kx2K HAWAII-2RG multiplexer designed specifically to take full advantage of these detector characteristics for a wide range of astronomical applications (and fully compatible with an ASIC controller developed under the JWST Instrument Technology Development initiative); and 4) development of

Context. The NIRSpec instrument for the James Webb Space Telescope (JWST) can be operated in multiobject spectroscopy (MOS), long-slit, and integral field unit (IFU) mode with spectral resolutions from 100 to 2700. Its MOS mode uses about a quarter of a million individually addressable minislits for object selection, covering a field of view of ~9 arcmin². Aims: The pipeline used to extract wavelength-calibrated spectra from NIRSpec detector images relies heavily on a model of NIRSpec optical geometry. We demonstrate how dedicated calibration data from a small subset of NIRSpec modes and apertures can be used to optimize this parametric model to the necessary levels of fidelity. Methods: Following an iterative procedure, the initial fiducial values of the model parameters are manually adjusted and then automatically optimized, so that the model-predicted location of the images and spectral lines from the fixed slits, the IFU, and a small subset of the MOS apertures matches their measured location in the main optical planes of the instrument. Results: The NIRSpec parametric model is able to reproduce the spatial and spectral position of the input spectra with high fidelity. The intrinsic accuracy (1-sigma, rms) of the model, as measured from the extracted calibration spectra, is better than 1/10 of a pixel along the spatial direction and better than 1/20 of a resolution element in the spectral direction for all of the grating-based spectral modes. This is fully consistent with the corresponding allocation in the spatial and spectral calibration budgets of NIRSpec.
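
The automatic optimization step can be illustrated, in drastically simplified form, with a toy distortion model: fit an affine mapping from aperture to detector coordinates by least squares and check the residual rms against a budget. This is a generic sketch; the real NIRSpec model has many more parameters and is optimized iteratively against measured spot and line positions.

```python
import numpy as np

def fit_affine(slit_xy, det_xy):
    """Fit a toy affine model det = A @ slit + b by linear least squares.

    slit_xy, det_xy: (N, 2) arrays of matched aperture/detector positions.
    Returns the 2x2 matrix A and the 2-vector offset b.
    """
    n = len(slit_xy)
    design = np.hstack([slit_xy, np.ones((n, 1))])      # rows are [x, y, 1]
    coeffs, *_ = np.linalg.lstsq(design, det_xy, rcond=None)
    A, b = coeffs[:2].T, coeffs[2]
    return A, b

def model_rms(A, b, slit_xy, det_xy):
    """Root-mean-square residual of the fitted model, in detector pixels."""
    resid = det_xy - (slit_xy @ A.T + b)
    return float(np.sqrt(np.mean(resid**2)))
```

In practice one would compare `model_rms` against the calibration budget (e.g., a tenth of a pixel) after each optimization pass.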

The determination of galaxy redshifts in the James Webb Space Telescope's (JWST) blank-field surveys will mostly rely on photometric estimates, based on the data provided by JWST's Near-Infrared Camera (NIRCam) at 0.6-5.0 μm and Mid-Infrared Instrument (MIRI) at λ > 5.0 μm. In this work we analyze the impact of choosing different combinations of NIRCam and MIRI broadband filters (F070W to F770W), as well as having ancillary data at λ < 0.6 μm, on the derived photometric redshifts (z_phot) of a total of 5921 real and simulated galaxies with known input redshifts z = 0-10. We found that observations at λ < 0.6 μm are necessary to control the contamination of high-z samples by low-z interlopers. Adding MIRI (F560W and F770W) photometry to the NIRCam data mitigates the absence of ancillary observations at λ < 0.6 μm and improves the redshift estimation. At z = 7-10, accurate z_phot can be obtained with the NIRCam broadbands alone when S/N ≥ 10, but the z_phot quality significantly degrades at S/N ≤ 5. Adding MIRI photometry with a depth 1 mag brighter than the NIRCam depth allows for a redshift recovery of 83%-99%, depending on spectral energy distribution type, and its effect is particularly noteworthy for galaxies with nebular emission. The vast majority of NIRCam galaxies with [F150W] = 29 AB mag at z = 7-10 will be detected with MIRI at [F560W, F770W] < 28 mag if these sources are at least mildly evolved or have spectra with emission lines boosting the mid-infrared fluxes.
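
The basic machinery behind photometric redshifts is chi-square template fitting across a redshift grid. The sketch below shows the idea in minimal form (a generic illustration, not the code used in this work; the template grid, band set, and fluxes are hypothetical, and real pipelines fit many template families with priors):

```python
import numpy as np

def photo_z(fluxes, errors, templates, z_grid):
    """Brute-force photometric redshift by chi-square template fitting.

    templates: (n_z, n_bands) model fluxes, one row per trial redshift in
    z_grid. For each row the best-fit amplitude is solved analytically,
    then the redshift minimizing chi-square is returned.
    """
    w = 1.0 / errors**2
    # Optimal scaling per trial redshift: a = sum(w f t) / sum(w t^2)
    amp = (templates * fluxes * w).sum(axis=1) / (templates**2 * w).sum(axis=1)
    resid = fluxes[None, :] - amp[:, None] * templates
    chi2 = (resid**2 * w).sum(axis=1)
    return z_grid[int(np.argmin(chi2))], chi2
```

The paper's finding that extra bands (MIRI, or ancillary data below 0.6 μm) suppress low-z interlopers corresponds, in this picture, to removing secondary minima in the chi-square curve.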

The James Webb Space Telescope (JWST) is nearing its launch date of 2018, and is expected to revolutionize our knowledge of exoplanet atmospheres. In order to specifically identify which observing modes will be most useful for characterizing a diverse range of exoplanetary atmospheres, we use an information content (IC) based approach commonly used in the studies of solar system atmospheres. We develop a system based upon these IC methods to trace the instrumental and atmospheric model phase space in order to identify which observing modes are best suited for particular classes of planets, focusing on transmission spectra. Specifically, the atmospheric parameter space we cover is T = 600-1800 K, C/O = 0.55-1, [M/H] = 1-100× solar for an R = 1.39 R_J, M = 0.59 M_J planet orbiting a WASP-62-like star. We also explore the influence of a simplified opaque gray cloud on the IC. We find that obtaining broader wavelength coverage over multiple modes is preferred over higher precision in a single mode given the same amount of observing time. Regardless of the planet temperature and composition, the best modes for constraining terminator temperatures, C/O ratios, and metallicity are NIRISS SOSS+NIRSpec G395. If the target's host star is dim enough such that the NIRSpec prism is applicable, then it can be used instead of NIRISS SOSS+NIRSpec G395. Lastly, observations that use more than two modes should be carefully analyzed because sometimes the addition of a third mode results in no gain of information. In these cases, higher precision in the original two modes is favorable.
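
The information content metric used in such studies comes from the optimal-estimation formalism: the Shannon information gain compares prior and posterior parameter covariances through the Jacobian of the forward model and the measurement noise. A minimal sketch (the Jacobian and covariances here are generic placeholders; in the paper they come from an atmospheric radiative transfer model and instrument noise model):

```python
import numpy as np

def information_content(jacobian, data_cov, prior_cov):
    """Shannon information gain (in bits) of an observation.

    Posterior covariance from optimal estimation:
        S_post = (J^T S_e^-1 J + S_a^-1)^-1
    Information content:
        H = 0.5 * log2( det(S_a) / det(S_post) )
    """
    prior_inv = np.linalg.inv(prior_cov)
    data_inv = np.linalg.inv(data_cov)
    post_cov = np.linalg.inv(jacobian.T @ data_inv @ jacobian + prior_inv)
    _, logdet_prior = np.linalg.slogdet(prior_cov)
    _, logdet_post = np.linalg.slogdet(post_cov)
    return 0.5 * (logdet_prior - logdet_post) / np.log(2.0)
```

Comparing this quantity across instrument modes, at fixed observing time, is what drives conclusions such as "broader wavelength coverage beats higher precision in a single mode."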

This illustration is a cutaway of the solid rocket booster (SRB) sections with callouts. The Shuttle's two SRB's are the largest solids ever built and the first designed for refurbishment and reuse. Standing nearly 150-feet high, the twin boosters provide the majority of thrust for the first two minutes of flight, about 5.8 million pounds, augmenting the Shuttle's main propulsion system during liftoff. The major design drivers for the solid rocket motors (SRM's) were high thrust and reuse. The desired thrust was achieved by using state-of-the-art solid propellant and by using a long cylindrical motor with a specific core design that allows the propellant to burn in a carefully controlled manner. At burnout, the boosters separate from the external tank and drop by parachute to the ocean for recovery and subsequent refurbishment. The boosters are designed to survive water impact at almost 60 miles per hour, maintain flotation with minimal damage, and preclude corrosion of the hardware exposed to the harsh seawater environment. Under the project management of the Marshall Space Flight Center, the SRB's are assembled and refurbished by the United Space Boosters. The SRM's are provided by the Morton Thiokol Corporation.

In recent years, fashion illustration has emerged as a new form of painting; compared with earlier illustration, it tends to be more professional and meticulous. Fashion illustration is often used in commercial advertising, fashion magazines, packaging design, and similar applications. With its strongly contrasting, vivid colors and strong decorative effect, fashion illustration has become a vehicle for designers pursuing a feeling of fashion.

This illustration depicts a side view of the Hubble Space Telescope (HST). The HST is the product of a partnership between NASA, the European Space Agency, contractors, and the international community of astronomers. It is named after Edwin P. Hubble, an American astronomer who discovered the expanding nature of the universe and was the first to realize the true nature of galaxies. The purpose of the HST, the most complex and sensitive optical telescope ever made, is to study the cosmos from a low-Earth orbit. By placing the telescope in space, astronomers are able to collect data that is free of the Earth's atmosphere. The HST detects objects 25 times fainter than the dimmest objects seen from Earth and provides astronomers with an observable universe 250 times larger than visible from ground-based telescopes, perhaps as far away as 14 billion light-years. The HST views galaxies, stars, planets, comets, possibly other solar systems, and even unusual phenomena such as quasars, with 10 times the clarity of ground-based telescopes. The major elements of the HST are the Optical Telescope Assembly (OTA), the Support Systems Module (SSM), and the Scientific Instruments (SI). The HST is approximately the size of a railroad car, with two cylinders joined together and wrapped in a silvery reflective heat shield blanket. Wing-like solar arrays extend horizontally from each side of these cylinders, and dish-shaped antennas extend above and below the body of the telescope. The HST was deployed from the Space Shuttle Discovery (STS-31 mission) into Earth orbit in April 1990. The Marshall Space Flight Center had responsibility for design, development, and construction of the HST. The Perkin-Elmer Corporation, in Danbury, Connecticut, developed the optical system and guidance sensors. The Lockheed Missile and Space Company of Sunnyvale, California produced the protective outer shroud and spacecraft systems, and assembled and tested the finished telescope.

This illustration is an orbiter cutaway view with callouts. The orbiter is both the brains and heart of the Space Transportation System (STS). About the same size and weight as a DC-9 aircraft, the orbiter contains the pressurized crew compartment (which can normally carry up to seven crew members), the huge cargo bay, and the three main engines mounted on its aft end. There are three levels to the crew cabin. Uppermost is the flight deck where the commander and the pilot control the mission. The middeck is where the galley, toilet, sleep stations, and storage and experiment lockers are found for the basic needs of weightless daily living. Also located in the middeck is the airlock hatch into the cargo bay and space beyond. It is through this hatch and airlock that astronauts go to don their spacesuits and manned maneuvering units in preparation for extravehicular activities, more popularly known as spacewalks. The Space Shuttle's cargo bay is adaptable to hundreds of tasks. Large enough to accommodate a tour bus (60 x 15 feet or 18.3 x 4.6 meters), the cargo bay carries satellites, spacecraft, and spacelab scientific laboratories to and from Earth orbit. It is also a work station for astronauts to repair satellites, a foundation from which to erect space structures, and a hold for retrieved satellites to be returned to Earth. Thermal tile insulation and blankets (also known as the thermal protection system or TPS) cover the underbelly, bottom of the wings, and other heat-bearing surfaces of the orbiter to protect it during its fiery reentry into the Earth's atmosphere. The Shuttle's 24,000 individual tiles are made primarily of pure-sand silicate fibers, mixed with a ceramic binder. The solid rocket boosters (SRB's) are designed as an in-house Marshall Space Flight Center project, with United Space Boosters as the assembly and refurbishment contractor. The solid rocket motor (SRM) is provided by the Morton Thiokol Corporation.

The NIRSpec instrument of JWST can be operated in multi-object (MOS), long-slit, and integral field mode with spectral resolutions from 100 to 2700. Its MOS mode uses about a quarter of a million individually addressable mini-slits for object selection, covering a field of view of ~9 square arcminutes. We have developed a procedure to optimize a parametric model of the instrument that provides the basis for the extraction of wavelength-calibrated spectra from NIRSpec data, from any of the apertures and for all the modes. Here, we summarize the steps undertaken to optimize the instrument model parameters using the data acquired during the latest cryo-vacuum campaign of the JWST Integrated Science Instrument Module, recently carried out at NASA Goddard Space Flight Center. The calibrated parametric model is able to reproduce the spatial and spectral position of the input spectra with an intrinsic accuracy (1-sigma, RMS) of ~1/10 of a pixel in the spatial and spectral directions for all the modes. The overall wavelength calibration accuracy (RMS) of the model as measured on the extracted spectra is better than 1/20 of a resolution element for all of the grating-based spectral modes and at the level of 1/14 of a resolution element for the prism. These results are well within the allocations for the model in the overall spatial and spectral calibration budget of NIRSpec.

The JWST Optical Telescope Element (OTE) assembly is the largest optically stable infrared-optimized telescope currently being manufactured and assembled, and is scheduled for launch in 2018. The JWST OTE, including the 18-segment primary mirror, secondary mirror, and the Aft Optics Subsystem (AOS), is designed to be passively cooled and operate near 45 K. These optical elements are supported by a complex composite backplane structure. As a part of the structural distortion model validation efforts, a series of tests is planned during the cryogenic vacuum test of the fully integrated flight hardware at NASA JSC Chamber A. The success of the thermal-distortion test phases depends heavily on accurate temperature knowledge of the OTE structural members. However, the current temperature sensor allocations during the cryo-vac test may not have sufficient fidelity to provide accurate knowledge of the temperature distributions within the composite structure. A method based on an inverse distance relationship among the sensors and thermal model nodes was developed to improve the thermal data provided for the nanometer-scale WaveFront Error (WFE) predictions. The Linear Distance Weighted Interpolation (LDWI) method was developed to augment the thermal model predictions based on the sparse sensor information. This paper covers the development of the LDWI method using test data from the earlier Pathfinder cryo-vac tests, and the results of the notional and as-tested WFE predictions from the structural finite element model cases to characterize the accuracy of this LDWI method.
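
The inverse-distance relationship underlying LDWI can be sketched as a simple weighted interpolation from sparse sensors onto model nodes (a generic illustration of the idea, not the flight implementation; coordinates, temperatures, and the weighting exponent are made-up placeholders):

```python
import numpy as np

def idw_temperatures(node_xyz, sensor_xyz, sensor_T, power=1.0, eps=1e-12):
    """Estimate temperatures at model nodes from sparse sensor readings.

    Each node's temperature is a weighted average of all sensors, with
    weights proportional to 1/distance**power, so nearby sensors dominate.
    node_xyz: (M, 3), sensor_xyz: (N, 3), sensor_T: (N,). Returns (M,).
    """
    # Pairwise node-to-sensor distances, shape (M, N).
    d = np.linalg.norm(node_xyz[:, None, :] - sensor_xyz[None, :, :], axis=2)
    w = 1.0 / np.maximum(d, eps) ** power   # eps guards nodes on a sensor
    w /= w.sum(axis=1, keepdims=True)       # normalize weights per node
    return w @ sensor_T
```

A node coincident with a sensor recovers that sensor's reading, and a node midway between two sensors gets their average, which is the qualitative behavior wanted when filling in temperatures between sparse sensor locations.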

JWST ISIM has entered its system-level testing program at NASA Goddard Space Flight Center (GSFC). In December 2013, ISIM successfully completed the first in a series of three cryo-vacuum tests, which included two flight science instruments. Since then, efforts have been fully directed toward the CV2 test, scheduled to finish at the end of 2014. The complexity of the mission has generated challenging requirements that demand highly reliable system performance and capabilities from the Space Environment Simulator (SES) vacuum chamber. In order to satisfy the program requirements, GSFC had to develop unique structural and thermal hardware to test ISIM. Most noteworthy is a helium shroud structure and cooling system built in order to achieve operational temperatures below 20 K (-253 °C). This paper: (1) provides an overview of the integrated mechanical and thermal facility systems required to achieve the objectives of JWST ISIM testing, (2) communicates the performance and challenges of the SES during the first ISIM test, and (3) summarizes the action plan to improve the system prior to the next test.

The light-weighted design of the Optical Telescope Element (OTE) of the James Webb Space Telescope (JWST) leads to additional sensitivity to vibration from the ground, an important consideration for the measurement uncertainty of the wavefront error (WFE) in the primary mirror. Furthermore, segmentation of the primary mirror leads to rigid-body movements of segment areas in the WFE. The ground vibrations are minimized with modifications to the test facility, and by the architecture of the equipment supporting the load. Additional special test equipment (including strategically placed isolators, tunable mass dampers, and cryogenic magnetic dampers) mitigates the vibration and the response sensitivity before reaching the telescope. A multi-wavelength interferometer is designed and operated to accommodate the predicted residual vibration. Thermal drift also adds to the measurement variation. Test results of test equipment components, measurement theory, and finite element analysis combine to predict the test uncertainty in the future measurement of the primary mirror. The vibration input to the finite element model comes from accelerometer measurements of the facility with the environmental control pumps operating. One of the isolators has been built and tested to validate the dynamic performance. A preliminary model of the load support equipment and the OTE with the Integrated Science Instrument Module (ISIM) is complete. The performance of the add-on dampers has been established in previous applications, and operation of the multi-wavelength interferometer was demonstrated on a scaled hardware version of the JWST in an environment with vibration and thermal drift.

We propose a co-ordinated multi-observatory survey at the North Ecliptic Pole. This field is the natural extragalactic deep field location for most space observatories (e.g. containing the deepest Planck, WISE and eROSITA data), is in the continuous viewing zones for e.g. Herschel, HST, JWST, and is a natural high-visibility field for the L2 halo orbit of SPICA with deep and wide-field legacy surveys already planned. The field is also a likely deep survey location for the forthcoming Euclid mission. It is already a multi-wavelength legacy field in its own right (e.g. AKARI, LOFAR, SCUBA-2): the outstanding and unparalleled continuous mid-IR photometric coverage in this field and nowhere else enables a wide range of galaxy evolution diagnostics unachievable in any other survey field, by spanning the wavelengths of redshifted PAH and silicate features and the peak energy output of AGN hot dust. We argue from the science needs of Euclid and JWST, and from the comparative multiwavelength depths, that the logical ...

Stellar jets and molecular outflows are observed in association with young accreting stars and are believed to play a key role in the star formation process. In this talk I will show how current and future space missions are of crucial importance to investigate the origin of stellar jets and their link to the accretion process. Thanks to its high angular resolution (~0.1"), HST has been the first telescope allowing us to investigate the jet physics at optical/UV wavelengths down to the heart of the launching mechanism. We recently analysed a datacube of the jet emitted by the T Tauri star DG Tau, obtaining spatio-kinematical maps of the hot atomic gas in the jet and of its physical conditions (Maurri et al., submitted). These data confirm the predictions of theoretical models, including the fact that jets may extract the excess angular momentum from the system. In the last two years Herschel has further improved our comprehension of the ejection process by observing the far-infrared counterpart of fast and collimated atomic jets. PACS and HIFI observations, acquired within the GASPS (GAS in Protoplanetary Systems) Open Time Key Project (PI: B. Dent), show that T Tauri stars driving optical jets are also associated with a warm gas component emitting not only atomic ([OI], [CII]) but also molecular (high-J CO, H_2O, OH) lines. The comparison with Class 0 outflows highlights a clear evolutionary trend: the emission associated with evolved Class I/II sources is fainter and more compact, and the estimated mass loss rates and line cooling are one to two orders of magnitude lower (Podio et al., to be submitted). The arrival of JWST will fill in the gap between HST and Herschel, opening a new window in the near- and mid-infrared range at unprecedented angular resolution (down to 0.03"). This will allow resolving the emission in both atomic (e.g., [FeII]) and molecular (e.g., H_2) lines and understanding if the molecular gas is entrained by the atomic jet or launched with it.

We propose a system named ETIS (Energy-based Tree Illustration System) for automatically generating tree illustrations in the style of two-dimensional drawings, with features such as exaggerated branch curves, leaves, and flowers...

Exploring fundamental research questions, Conceptual Structures in Practice takes you through the basic yet nontrivial task of establishing conceptual relations as the foundation for research in knowledge representation and knowledge mining. It includes contributions from leading researchers in both the conceptual graph and formal concept analysis (FCA) communities. This accessible, self-contained book begins by providing the formal background in FCA and conceptual graphs. It then describes various software tools for analysis and computation, including the ToscanaJ suite. Written by the origina

The conceptual framework is alluded to in most serious texts on research, described in some and fully explained in few. However, examiners of doctoral theses devote considerable attention to exploring its function within social science doctoral vivas. A literature survey explores how the conceptual framework is itself conceptualised and explained.…

The Mid Infrared Instrument (MIRI) aboard JWST is equipped with one filter wheel and two dichroic-grating wheel mechanisms to reconfigure the instrument between observing modes such as broad/narrow-band imaging, coronagraphy and low/medium resolution spectroscopy. Key requirements for the three mechanisms, with up to 18 optical elements per wheel, include: (1) reliable operation at T = 7 K, (2) high positional accuracy of 4 arcsec, (3) low power dissipation, (4) high vibration capability, and (5) functionality at 7 K. A design comprising a cryogenic ball bearing, a central torque motor for actuation, a ratchet system with monolithic CuBe flexural pivots for precise and powerless positioning, and a magnetoresistive position sensor has been implemented. We report here the final performance and lessons learnt from the successful acceptance test program of the MIRI wheel mechanism flight models. The mechanisms have meanwhile been integrated into the flight model of the MIRI instrument, ready for a 2014 launch on an Ariane 5 rocket.

The illustrations for Tibetan sutras are coloured in two ways: in black and white, or in colours. The monotone illustrations accompany Tibetan characters and are usually engraved on woodblocks. The illustrations are often shown on the cover pages or on the two sides of the head pages of sutras; they are frequently displayed in two frames and in the middle of the end pages. In this paper, I am going to introduce the

The current method of case-based design (CBD) works well for configuration design, in which experiential design knowledge is involved. However, since a design case is confined to a certain application domain, it is difficult to apply CBD to the conceptual design process, which develops concepts to meet design specifications. Firstly, a function-factor description space is established to provide an exhibition room for all functions of design cases. Next, an approach for identifying the space state of a function factor in the description space is proposed, including the determination of the similarities between function factors of design cases. Then, a general object-oriented representation for design cases is presented by bringing the classes of function and in-out flow into the current case representation. Finally, a worked example of electro-pet design is described that illustrates the implementation of the method for case-based conceptual design based on distributed design case repositories.

This publication is composed of over 150 pages of black and white illustrations dealing with radioisotopes and their uses. These illustrations consist of charts, graphs, and pictorial representations which could be prepared as handouts or transparencies for overhead projection, or which could be used in a number of other ways for presenting such…

Data from electron microscopy, X-ray crystallography, and biophysical analysis are used to create illustrations of viruses in their cellular context. This report describes the scientific data and artistic methods used to create three illustrations: a depiction of the poliovirus lifecycle, budding of influenza virus from a cell surface, and a…

Conceptual combination research investigates the processes involved in creating new meaning from old referents. It is therefore essential that embodied theories of cognition are able to explain this constructive ability and predict the resultant behaviour. However, by failing to take an embodied or grounded view of the conceptual system, existing theories of conceptual combination cannot account for the role of perceptual, motor and affective information in conceptual combination. In the present paper, we propose the Embodied Conceptual Combination (ECCo) model to address this oversight. In ECCo, conceptual combination is the result of the interaction of the linguistic and simulation systems, such that linguistic distributional information guides or facilitates the combination process, but the new concept is fundamentally a situated, simulated entity. So, for example, a cactus beetle is represented as a multimodal simulation that includes visual (e.g., the shiny appearance of a beetle) and haptic (e.g., the prickliness of the cactus) information, all situated in the broader location of a desert environment under a hot sun, and with (at least for some people) an element of creepy-crawly revulsion. The ECCo theory differentiates interpretations according to whether the constituent concepts are destructively, or nondestructively, combined in the situated simulation. We compare ECCo to other theories of conceptual combination, and discuss how it accounts for classic effects in the literature.

Public and nonprofit organizations, entwined in the delivery of public goods and services, are in the midst of challenging economic times. In these circumstances, sound collaborative leadership may help bridge budget and program service delivery shortfalls. In this paper, we examine the administrative dynamics of mutual reliance between two prominent public and nonprofit organizations: public schools and parent-teacher groups (PTGs). We conclude that the partnership is changing as a result of external, economic forces. In essence, we are seeing a threat-rigidity response. The economic crisis may be responsible for causing PTGs to narrow their range of activities away from broader strategic issues that can be addressed through their confrontation activities and advocacy mission, towards a narrower focus on classroom activities that protect core school operations, namely instruction.

Case studies of urban squatting in the United States and the Netherlands, and the fight against sexual violence in Spain and in the Netherlands, form the empirical basis of an analysis of the features and development of autonomous and institutionalized social movements, and the interacti

Does compassion feel pleasant or unpleasant? Westerners tend to categorize compassion as a pleasant or positive emotion, but laboratory compassion inductions, which present another's suffering, may elicit unpleasant feelings. Across two studies, we examined whether prototypical conceptualizations of compassion (as pleasant) differ from experiences of compassion (as unpleasant). After laboratory-based neutral or compassion inductions, participants made abstract judgments about compassion relative to various emotion-related adjectives, thereby providing a prototypical conceptualization of compassion. Participants also rated their own affective states, thereby indicating experiences of compassion. Conceptualizations of compassion were pleasant across neutral and compassion inductions. After exposure to others' suffering, however, participants felt increased levels of compassion and unpleasant affect, but not pleasant affect. After neutral inductions, participants reported more pleasant than unpleasant affect, with moderate levels of compassion. Thus, prototypical conceptualizations of compassion are pleasant, but experiences of compassion can feel pleasant or unpleasant. The implications for emotion theory in general are discussed.

The human body has been depicted in ancient cave-paintings, in primitively sculpted figures, and through all the ages in various forms of artistic expression. The earliest medical texts were descriptive but not illustrated. Later, as it became clear that knowledge of the human body and all its systems was essential to the practice of healing, texts were accompanied by illustrations which became an integral part of the teaching process. The illustrators included artists, whose interest was primarily artistic, but who were sometimes employed by surgeons or physicians to illustrate their texts. Occasionally, the physicians or scientists accompanied their texts with their own illustrations, and in the last century, medical illustration, in its infinite variety of techniques, has been developed as a profession in its own right. As knowledge was extended, permitted by social and cultural change, as well as by technological advances, the types of illustrations have ranged from gross anatomy through dissections showing the various organ systems, histological preparations, and radiological images, right up to the computerized digital imagery that is available today, which allows both static and dynamic two- and three-dimensional representations to be transmitted electronically across the world in a matter of seconds. The techniques used to represent medical knowledge pictorially have been as varied as the illustrators themselves, involving drawing, engraving, printing, photography, cinematography and digital processing. Each new technique has built on previous experience to broaden medical knowledge and make it accessible to an ever-widening audience. This vast accumulation of pictorial material has posed considerable problems of storage, cataloguing, retrieval, display and dissemination of the information, as well as questions of ethics, validity, manipulation and reliability. This paper traces these developments, illustrating them with representative examples drawn from

The annual meeting of the Australian Institute of Medical and Biological Illustration in Melbourne in November 1998 included keynote addresses from Richard Morton and Robin Williams. Both speakers looked at the future of the medical illustration profession, and in particular the impact of new technology. This matter was also addressed by Joe Nicholls in a presentation given at the Institute of Medical Illustrators' Annual Symposium in Warwick, UK, in September 1998. This paper is a synthesis of the ideas presented by these three speakers and elaborates on common themes in their presentations.

In order to set up a conceptual data model that reflects the real world as accurately as possible, this paper firstly reviews and analyzes the disadvantages of previous conceptual data models used by traditional GIS in simulating geographic space, gives a new explanation of geographic space and analyzes its various essential characteristics. Finally, this paper proposes several detailed key points for designing a new type of GIS data model and gives a simple holistic GIS data model.

How to evaluate students' astronomy understanding is still an open question. Even though some methods and tools to help students have already been developed, the sources of students' difficulties and misunderstanding in astronomy are still unclear. This paper presents an investigation of the development of conceptual systems in astronomy by 50 engineering students as a result of their learning a general course on astronomy. A special tool called Conceptual Frameworks in Astronomy (CFA), first used in 1989, was adapted to gather data for the present research. In its new version, the tool included 23 questions, with five to six optional answers given for each question. Each of the answers was characterized by one of four conceptual astronomical frameworks: pre-scientific, geocentric, heliocentric, and sidereal or scientific. The paper describes the development of the tool and discusses its validity and reliability. Using the CFA we were able to identify the conceptual frameworks of the students at the beginning of the course and at its end. The CFA enabled us to evaluate the paradigmatic change of students following the course and also the extent of the general improvement in astronomical knowledge. It was found that the measure of the students' improvement (gain index) was g = 0.37. Approximately 45% of the students in the course improved their understanding of conceptual frameworks in astronomy and 26% deepened their understanding of the heliocentric or sidereal conceptual frameworks.
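The gain index quoted above can be made concrete. The sketch below computes Hake's normalized gain, the standard definition behind such indices, under the assumption that class-average scores are expressed as percentages; the pre/post values in the example are hypothetical, not taken from the study.

```python
def normalized_gain(pre_pct, post_pct):
    """Hake's normalized gain: the fraction of the still-available
    improvement (100 - pre) that was actually achieved (post - pre)."""
    if pre_pct >= 100:
        raise ValueError("pre-test score already at maximum")
    return (post_pct - pre_pct) / (100.0 - pre_pct)

# Hypothetical class averaging 40% before the course and 62.2% after:
# g = 22.2 / 60 = 0.37, the same magnitude as the index reported above.
print(round(normalized_gain(40.0, 62.2), 2))  # → 0.37
```

Read this way, g = 0.37 means the class realized about 37% of the improvement that was still available to it before instruction.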

The James Webb Space Telescope (JWST) offers unprecedented sensitivity, stability, and wavelength coverage for transiting exoplanet studies, opening up new avenues for measuring atmospheric abundances, structure, and temperature profiles. Taking full advantage of JWST spectroscopy of planets from 0.6 to 28 μm, however, will require many observations with a combination of the NIRISS, NIRCam, NIRSpec, and MIRI instruments. In this white paper, we discuss a new NIRCam mode (not yet approved or implemented) that can reduce the number of necessary observations to cover the 1.0-5.0 μm wavelength range. Even though NIRCam was designed primarily as an imager, it also includes several grisms for phasing and aligning JWST’s 18 hexagonal mirror segments. NIRCam’s long-wavelength channel includes grisms that cover 2.4-5.0 μm with a resolving power of R = 1200-1550 using two separate configurations. The long-wavelength grisms have already been approved for science operations, including wide field and single object (time series) slitless spectroscopy. We propose a new mode that will simultaneously measure spectra for science targets in the 1.0-2.0 μm range using NIRCam’s short-wavelength channel. This mode, if approved, would take advantage of NIRCam’s Dispersed Hartmann Sensor (DHS), which produces 10 spatially separated spectra per source at R ˜ 300. We discuss the added benefit of the DHS in constraining abundances in exoplanet atmospheres as well as its ability to observe the brightest systems. The DHS essentially comes for free (at no time cost) with any NIRCam long-wavelength grism observation, but the detector integration parameters have to be selected to ensure that the long-wavelength grism observations do not saturate and that JWST data volume downlink constraints are not violated. Combining both of NIRCam’s channels will maximize the science potential of JWST, which is a limited life observatory.

The author takes issue with the claim that the role of the medical illustrator is changing today. Not so, he says. The role is the same, and the need is as great as it ever was. Rather, some medical illustrators are changing in the desire to expand their field and become "biocommunicators". Such expansion, the author suggests, is not for everyone, and those who choose to continue in their traditional role need make no apologies. It is a vital one.

This beautifully illustrated book is the first complete handbook to visual information. Well written, easy to use, and carefully indexed, it describes the full range of charts, graphs, maps, diagrams, and tables used daily to manage, analyze, and communicate information. It features over 3,000 illustrations, making it an ideal source for ideas on how to present information. It is an invaluable tool for anyone who writes or designs reports, whether for scientific journals, annual reports, or magazines and newspapers.

The central dogma is a core concept that is critical for introductory biology and microbiology students to master. However, students often struggle to conceptualize the processes involved, and fail to move beyond simply memorizing the basic facts. To encourage critical thinking, we have designed a set of magnetic nucleotide manipulatives that allow students to model DNA structure, along with the processes of replication, transcription, and translation.

This paper aims to discuss recent approaches in the increasingly frequent use of computer tools as supports for the conceptual design phase of the architectural project. The present state of the art of software as a conceptual design tool can be summarized in two parallel tendencies. On the one hand, the main software houses are trying to introduce powerful and effective user-friendly applications into the world of building designers, applications more and more able to fit their specific requirements; on the other hand, some groups of expert users with a basic programming knowledge seem to deal with the problem of software as a conceptual design tool by means of 'scripting', in other words by self-developing codes able to solve specific and well-defined design problems. Starting with a brief historical recall and a discussion of relevant research and practical experiences, this paper investigates...

The most important characteristic of the “world filled with nonlinearity” is the existence of scale interference: disparate space–time scales interfere with each other. Thus, the effects of unknowable scales invade the world that we can observe directly. This leads to various peculiar phenomena such as chaos, critical phenomena, and complex biological phenomena, among others. Conceptual analysis and phenomenology are the keys to describe and understand phenomena that are subject to scale interference, because precise description of unfamiliar phenomena requires precise concepts and their phenomenological description. The book starts with an illustration of conceptual analysis in terms of chaos and randomness, and goes on to explain renormalization group philosophy as an approach to phenomenology. Then, abduction is outlined as a way to express what we have understood about the world. The book concludes with discussions on how we can approach genuinely complex phenomena, including biological phenomena. T...

In the conceptual design phase of pharmaceutical plants as much as 80%-90% of the total cost of a project is committed. It is therefore essential that the chosen concept is viable. In this design process, configuration and 3D models can help validate the decisions made. Designing 3D models is a complex task and requires skilled users. We demonstrate that a simple 2D/3D configuration tool can support the conceptualizing of pharmaceutical plants. The present paper reports on preliminary results from a full-scale implementation project at a Danish engineering company.

The James Webb Space Telescope (JWST) will carry four scientific instruments, one of which is the Tunable Filter Imager (TFI), an instrument within the Fine Guidance Sensor. The Dual Wheel (DW) mechanism is being designed, built and tested by COM DEV Ltd. under contract from the Canadian Space Agency. The DW mechanism includes a pupil wheel (PW) holding seven coronagraphic masks and two calibration elements and a filter wheel (FW) holding nine blocking filters. The DW mechanism must operate at both room temperature and at 35K. Successful operation at 35K comprises positioning each optical element with the required repeatability, on several thousand occasions over the five-year mission. The paper discusses the results of testing geared motors and bearings at cryogenic temperature, in particular bearing retainer design and PGM-HT material, the effects of temperature gradients across bearings, and the problems associated with cooling mechanisms down to cryogenic temperatures. The results of additional bearing tests are described that were employed to investigate an abnormally high initial torque experienced at cryogenic temperatures. The finding of these tests was that the bearing retainer and the ball/race system could be adversely affected by the large temperature change from room temperature to cryogenic temperature and also by the temperature gradient across the bearing. The DW mechanism is now performing successfully at both room temperature and at cryogenic temperature. The life testing of the mechanism is expected to be completed in the first quarter of 2010.

The chemical compositions and thermal structures of close-in planets are two of the major questions raised over the last 15+ years of exoplanet atmospheric characterization. These are fundamental questions in their own right, and answering them also has the potential to improve our understanding of the planets in the Solar System. JWST offers the opportunity to make a major advance on these topics by revealing a more complete and accurate inventory of the chemical species in exoplanet atmospheres and by precisely measuring atmospheric temperatures over a broad range of pressures. I will describe how we plan to use an Interdisciplinary Scientist GTO program to determine the compositions and thermal structures of transiting, hot giant exoplanets using dayside thermal emission measurements obtained at secondary eclipse. Our composition measurements are focused on determining absolute molecular abundances as a tracer of atmospheric metallicity and the abundance ratio of carbon to oxygen. The targets in our program have a range of masses and irradiation, which will enable us to test theories of how atmospheric metallicity varies with planet mass and how thermal structures respond to different levels of stellar forcing.

In an attempt to schematically illustrate the pastoral care intervention to scientifically minded professionals and colleagues the author developed a model that can be used as an interdisciplinary teaching tool. Within the setting of hospital ministry, the tool also provides insights into the stages of "crisis experience" and illustrates the transformational process involved in The Healing Journey. These change-processes are explained against the background of a multi-level anthropology. This approach births a Healing Journey diagram, a spiritual pain assessment tool, and a seven-phase intervention model that may be helpful in Clinical Pastoral Education.

This is Stephen Hawking's updated, expanded and illustrated edition of his celebrated work which includes the most recent developments in the field, many of which were forecast by him. At the same time, he explains his complex theories through a fresh visual dimension. Over one hundred and fifty stunning colour illustrations have been specially commissioned for this purpose to help the reader understand what have become popular mythic images of our century, but which nonetheless remain difficult, abstract ideas to grasp. It includes a new introduction written specially for this edition.

In this talk we present a series of illustrative topics in Fourier Optics that are proving valuable in the design of EDOF camera systems. They are at the level of final examination problems that have been made solvable by a student or professor having studied from one of Joseph W. Goodman's books: our tribute for his 75th year. As time permits, the four illustrative topics are: 1) Electromagnetic waves and Fourier optics; 2) The perfect lens; 3) The connection between phase delay and radially varying focal length in an asphere; and 4) Tailored EDOF designs.

of activities relevant to the attainment of strategic objectives, the validity of the strategy selected and the learning theory of truth which is manifest in a system of constructive feedback on the gap between strategic expectations and strategic realisations. The conceptual analysis is illustrated by material...

This paper states and discusses general guidelines in teaching for conceptual change. Several important factors that seem to be necessary in meeting the guidelines in normal classrooms are considered. The factors relate to the teacher, student, and the classroom climate. The guidelines are illustrated using examples drawn from a fifth-grade…

Introduction: This paper advocates Foucault's notion of pouvoir/savoir (power/knowledge) as a conceptual lens that information researchers might fruitfully use to develop a richer understanding of the relationship between knowledge and power. Methods: Three of the authors' earlier studies are employed to illustrate the use of this conceptual lens.…

This paper describes the fundamental concept of the reconciliation behind the indirect method of the statement of cash flows. A conceptual framework is presented to demonstrate how accrual and cash-basis accounting methods relate to each other and to illustrate the concept of reconciling these two accounting methods. The conceptual framework…

Optical alignment and testing of the Integrated Science Instrument Module of the James Webb Space Telescope is underway. We describe the Optical Telescope Element Simulator used to feed the science instruments with point images of precisely known location and chief ray pointing, at appropriate wavelengths and flux levels, in vacuum and at operating temperature. The simulator's capabilities include a number of devices for in situ monitoring of source flux, wavefront error, pupil illumination, image position and chief ray angle. Taken together, these functions become a fascinating example of how the first order properties and constructs of an optical design (coordinate systems, image surface and pupil location) acquire measurable meaning in a real system. We illustrate these functions with experimental data, and describe the ray tracing system used to provide both pointing control during operation and analysis support subsequently. Prescription management takes the form of optimization and fitting. Our core too...

Clear, practical guide to the diagnostic imaging of diseases of the liver, biliary tree, gallbladder, pancreas, and spleen. A wealth of carefully selected and categorized illustrations. Highlighted key points to facilitate rapid review. Aid to differential diagnosis. Radiology Illustrated: Hepatobiliary and Pancreatic Radiology is the first of two volumes that will serve as a clear, practical guide to the diagnostic imaging of abdominal diseases. This volume, devoted to diseases of the liver, biliary tree, gallbladder, pancreas, and spleen, covers congenital disorders, vascular diseases, benign and malignant tumors, and infectious conditions. Liver transplantation, evaluation of the therapeutic response of hepatocellular carcinoma, trauma, and post-treatment complications are also addressed. The book presents approximately 560 cases with more than 2100 carefully selected and categorized illustrations, along with key text messages and tables, that will allow the reader easily to recall the relevant images as an aid to differential diagnosis. At the end of each text message, key points are summarized to facilitate rapid review and learning. In addition, brief descriptions of each clinical problem are provided, followed by both common and uncommon case studies that illustrate the role of different imaging modalities, such as ultrasound, radiography, CT, and MRI.

Originally part of a symposium on educational media for the deaf, the paper discusses the use of animated sequences which illustrate linguistic principles. The work of the Computer Graphics Research Group at Ohio State University is highlighted. It has been discovered that computer-assisted instructional systems enhance learning through greater…

Discusses the use of a computer-illustrated text (CIT) with integrated software to teach electric circuit theory to college students. Examples of software use are given, including simple animation, graphical displays, and problem-solving programs. Issues affecting electric circuit theory instruction are also addressed, including mathematical…

Many factors to be appropriately addressed in moving towards energy sustainability are examined. These include harnessing sustainable energy sources, utilizing sustainable energy carriers, increasing efficiency, reducing environmental impact and improving socioeconomic acceptability. The latter factor includes community involvement and social acceptability, economic affordability and equity, lifestyles, land use and aesthetics. Numerous illustrations demonstrate measures consistent with the a...

Background: Cartoon-style illustrative renderings of proteins can help clarify structural features that are obscured by space-filling or balls-and-sticks style models, and recent advances in programmable graphics cards offer many new opportunities for improving illustrative renderings. Results: The ProteinShader program, a new tool for macromolecular visualization, uses information from Protein Data Bank files to produce illustrative renderings of proteins that approximate what an artist might create by hand using pen and ink. A combination of Hermite and spherical linear interpolation is used to draw smooth, gradually rotating three-dimensional tubes and ribbons with a repeating pattern of texture coordinates, which allows the application of texture mapping, real-time halftoning, and smooth edge lines. This free, platform-independent, open-source program is written primarily in Java, but also makes extensive use of the OpenGL Shading Language to modify the graphics pipeline. Conclusion: By programming to the graphics processor unit, ProteinShader is able to produce high-quality images and illustrative rendering effects in real time. The main feature that distinguishes ProteinShader from other free molecular visualization tools is its use of texture mapping techniques that allow two-dimensional images to be mapped onto the curved three-dimensional surfaces of ribbons and tubes with minimum distortion of the images.
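The spherical linear interpolation (slerp) mentioned in this abstract is what makes the ribbon frames rotate gradually at a constant angular rate between control points. ProteinShader itself is written in Java and GLSL; the following Python fragment is only a minimal sketch of the slerp formula on unit vectors, not the program's actual code.

```python
import math

def slerp(p0, p1, t):
    """Spherical linear interpolation between two unit vectors:
    rotates from p0 to p1 at constant angular speed as t goes 0 -> 1."""
    # Clamp the dot product to guard acos against rounding error.
    dot = max(-1.0, min(1.0, sum(a * b for a, b in zip(p0, p1))))
    theta = math.acos(dot)          # angle between the two vectors
    if theta < 1e-9:                # (nearly) parallel: nothing to rotate
        return p0
    s = math.sin(theta)
    w0 = math.sin((1.0 - t) * theta) / s
    w1 = math.sin(t * theta) / s
    return tuple(w0 * a + w1 * b for a, b in zip(p0, p1))

# Halfway between the x and y axes: the 45-degree unit vector.
mid = slerp((1.0, 0.0, 0.0), (0.0, 1.0, 0.0), 0.5)
print(tuple(round(c, 4) for c in mid))  # → (0.7071, 0.7071, 0.0)
```

Unlike ordinary linear interpolation followed by renormalization, slerp advances through the arc at a uniform rate, which is why it yields "gradually rotating" tubes rather than frames that speed up mid-segment.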

Richly illustrated, this special edition of Du Bois's seminal work includes historical woodcuts and engravings, photos and documents. Most of the photos, engravings, and documents are from the 19th and early 20th century and depict American slavery and its legacy, African-American life, and the prominent figures and events associated with the…

This paper demonstrates the benefits and application of Straussian Grounded Theory method in conducting research in complex settings where parameters are poorly defined. It provides a detailed illustration on how this method can be used to build an internationalization theory. To be specific, this paper exposes readers to the behind-the-scene work…

A classroom experiment is proposed in which students can mate a banded or spotted convict cichlid with a pink convict cichlid and observe the markings of their "children" and "grandchildren" as a way of illustrating Mendel's Laws of Dominance and Segregation. (MN)

There have been a number of papers dealing quantitatively with light refraction. Yet the conceptualization of the phenomenon that sets the foundation for a more rigorous math analysis is minimized. The purpose of this paper is to fill that gap. (Contains 3 figures.)

A major revolution in the study of metaphor occurred 30 years ago with the introduction of "conceptual metaphor theory" (CMT). Unlike previous theories of metaphor and metaphorical meaning, CMT proposed that metaphor is not just an aspect of language, but a fundamental part of human thought. Indeed, most metaphorical language arises from…

This paper presents the first observational study of an ongoing research project. The research focuses on ‘teaching conceptual design’ and on the investigation of new teaching methods and strategies. Presently, in the commonly established educational setting, students practice the role of designing

Doxing is the intentional public release onto the Internet of personal information about an individual by a third party, often with the intent to humiliate, threaten, intimidate, or punish the identified individual. In this paper I present a conceptual analysis of the practice of doxing and how it…

This article introduces a framework for conceptualizing four forms of cognitive neuroplasticity. The concepts include: (1) homologous area adaptivity; (2) cross-modal reassignment; (3) map expansion; and (4) compensatory masquerade. The limitations of each form of plasticity are presented. (Contains references.) (Author/CR)

In the present research, we introduced a conceptual framework of inclusion and subsequently used this as a starting point to develop and validate a scale to measure perceptions of inclusion. Departing from existing work on inclusion and complementing this with theoretical insights from optimal distinctiveness…

We present here an approach to conceptual querying where the aim is, given a collection of textual database objects or documents, to target an abstraction of the entire database content in terms of the concepts appearing in documents, rather than the documents in the collection. The approach is ...

Students' understanding of science develops through everyday experiences. As a result, they come to the science classroom with their own notions of how the world works. As teachers, we often must help students overcome their prior naive notions and move them toward a more scientific understanding. This process, known as conceptual change, is…

This article reviews Giyoo Hatano's ground-breaking theoretical, empirical, and methodological contributions to conceptual change research. In particular, his discovery of "vitalism" as part of children's legitimate and distinctive biology at early ages stands as a landmark. In addition, his work reinterpreted childhood "personification," changing…

Over the past decade, NASA, under a succession of rotary-wing programs, has been moving towards coupling multiple discipline analyses in a rigorous, consistent manner to evaluate rotorcraft conceptual designs. Handling qualities is one of the component analyses to be included in a future NASA Multidisciplinary Analysis and Optimization framework for conceptual design of VTOL aircraft. Similarly, the future vision for the capability of the Concept Design and Assessment Technology Area (CD&A-TA) of the U.S. Army Aviation Development Directorate also includes a handling qualities component. SIMPLI-FLYD is a tool jointly developed by NASA and the U.S. Army to perform modeling and analysis for the assessment of flight dynamics and control aspects of the handling qualities of rotorcraft conceptual designs. An exploration of handling qualities analysis has been carried out using SIMPLI-FLYD in illustrative scenarios of a tiltrotor in forward flight and a single-main-rotor helicopter in hover. Using SIMPLI-FLYD and the conceptual design tool NDARC integrated into a single process, the effects of variations of design parameters such as tail or rotor size were evaluated in the form of margins to fixed- and rotary-wing handling qualities metrics as well as the vehicle empty weight. The handling qualities design margins are shown to vary across the flight envelope due to both changing flight dynamic and control characteristics and changing handling qualities specification requirements. The current SIMPLI-FLYD capability and future developments are discussed in the context of an overall rotorcraft conceptual design process.

Our goal was to develop a comprehensive conceptual research framework on mode of delivery and to identify research priorities in this topic area through a Delphi process. We convened a multidisciplinary team of 16 experts (North Carolina Collaborative on Mode of Delivery) representing the fields of obstetrics and gynecology, neonatology, midwifery, epidemiology, psychometrics, decision sciences, bioethics, health care engineering, health economics, health disparities, and women's studies. We finalized the conceptual framework after multiple iterations, including revisions during a one-day in-person conference. The conceptual framework illustrates the causal pathway for mode of delivery and the complex interplay and relationships among patient, fetal, family, provider, cultural, and societal factors as drivers of change from intended to actual mode of delivery. This conceptual framework on mode of delivery will help put specific research ideas into a broader context and identify important knowledge gaps for future investigation.

Frequent inundation, low-quality water supply, and heavy dependence on external water sources are some key problems in Surabaya's water balance. Many aspects of urban development have aggravated these problems. To uncover the complexity of the water balance in Surabaya, a conceptual model for a water sensitive city is constructed to find the optimum solution. System dynamics modeling is utilized to assist and enrich the conceptual model. A secondary analysis of a wide range of data directs the process of building the conceptual model. Focus group discussions (FGDs) involving experts from multiple disciplines are also used to finalize the conceptual model. Based on these methods, the model has four main sub-models: flooding, land use change, water demand, and water supply. The model consists of 35 key variables illustrating challenges in Surabaya's urban water.

'Conceptual Modeling and the Lexicon' investigates the linguistic aspects of conceptual modeling, concentrating on the terminology part. The author combines theoretical ideas and empirical facts from various scientific fields, such as cognitive psychology, computer science, lexicography, and psycholinguistics…

This carefully researched monograph is a historical investigation of the illustrated Aratea astronomical manuscript and its many interpretations over the centuries. Aratus' 270 B.C.E. Greek poem describing the constellations and astrological phenomena was translated and copied over 800 years into illuminated manuscripts that preserved and illustrated these ancient stories about the constellations. The Aratea survives in its entirety due to multiple translations from Greek to Latin and even to Arabic, with many illuminated versions being commissioned over the ages. The survey encompasses four interrelated disciplines: history of literature, history of myth, history of science, and history of art. Aratea manuscripts by their nature are a meeting place of these distinct branches, and the culling of information from historical literature and from the manuscripts themselves focuses on a wider, holistic view; a narrow approach could not provide a proper perspective. What is most essential to know about this work is...

This essay is based upon a survey of reproductions in auction catalogues – from their first appearance in the early eighteenth century until their more consistent use in the second decade of the twentieth century. Examining the role of these illustrations sheds light on how auctions functioned; it was not just the works of art that were traded, but knowledge about those works of art became currency to be exchanged. In contrast to the high end engravings and photographs of luxury illustrated art books, reproductions in auction catalogues – publications produced as ephemeral marketing tools – were of noticeably lower quality. This study of the status of reproductions, therefore, investigates the evolving understanding of art knowledge, both aesthetic and economic, and the interdependence of the market and connoisseurship.

The IEE Wiring Regulations Explained and Illustrated, Second Edition discusses the recommendations of the IEE Regulations for the Electrical Equipment of Buildings for the safe selection or erection of wiring installations. The book emphasizes earthing, bonding, protection, and circuit design of electrical wirings. The text reviews the fundamental requirements for safety, earthing systems, the earth fault loop impedance, and supplementary bonding. The book also describes the different types of protection, such as protection against mechanical damage, overcurrent, and under voltage (which prevents …)

The first illustration of multiple sclerosis (MS) was by a young Scottish physician and artist, Dr Robert Carswell. Recognized as a talented illustrator by his teachers, he was encouraged to create an anatomy and pathology atlas. He spent years in the hospitals and mortuaries of Paris and Lyon painting watercolours and pen and ink drawings of patients and post mortem preparations. Of the 1034 paintings, 99 are of the brain and spinal cord, and Plate 4, figure 4.4 in the atlas (Figure 2) is of MS. Carswell indicated he saw two examples of this pathology; he had not examined either patient, but illustrated one of them. We know little about the clinical history other than that the patient was paralyzed. About 200 of the atlases were printed, and it is still regarded as one of the greatest and most beautiful of all medical books. Carswell was appointed as the first Professor of Anatomy at the North London Hospital, later renamed University College Hospital, UK, where the original copy of his great atlas is archived. Due to ill health he resigned after a few years to reside in the healthier air outside Brussels, Belgium. He was appointed physician to King Leopold, but was also noted for his care of the poor. Queen Victoria knighted him for his care of King Louis Philippe of France when he was in exile. Although English journals did not note his passing at the age of 64 years, his great atlas remains as his memorial.

The NASA developed Ares rockets, named for the Greek god associated with Mars, will return humans to the moon and later take them to Mars and other destinations. In this early illustration, the Ares I is illustrated during lift off. Ares I is an inline, two-stage rocket configuration topped by the Orion crew vehicle and its launch abort system. With a primary mission of carrying four to six member crews to Earth orbit, Ares I may also use its 25-ton payload capacity to deliver resources and supplies to the International Space Station (ISS), or to 'park' payloads in orbit for retrieval by other spacecraft bound for the moon or other destinations. Ares I uses a single five-segment solid rocket booster, a derivative of the space shuttle solid rocket booster, for the first stage. A liquid oxygen/liquid hydrogen J-2X engine, derived from the J-2 engine used on the second stage of the Apollo vehicle, will power the Ares I second stage. Ares I can lift more than 55,000 pounds to low Earth orbit. The Ares I is subject to configuration changes before it is actually launched. This illustration reflects the latest configuration as of September 2006.

The science instruments (SIs) comprising the James Webb Space Telescope (JWST) Integrated Science Instrument Module (ISIM) were tested in three cryogenic-vacuum test campaigns in the NASA Goddard Space Flight Center (GSFC)'s Space Environment Simulator (SES) test chamber. In this paper, we describe the results of optical wavefront-error performance characterization of the SIs. The wavefront error is determined using image-based wavefront sensing, and the primary data used by this process are focus sweeps, a series of images recorded by the instrument under test in its as-used configuration, in which the focal plane is systematically changed from one image to the next. High-precision determination of the wavefront error also requires several sources of secondary data, including 1) spectrum, apodization, and wavefront-error characterization of the optical ground-support equipment (OGSE) illumination module, called the OTE Simulator (OSIM), 2) F-number and pupil-distortion measurements made using a pseudo-nonredundant mask (PNRM), and 3) pupil geometry predictions as a function of SI and field point, which are complicated because of a tricontagon-shaped outer perimeter and small holes that appear in the exit pupil due to the way that different light sources are injected into the optical path by the OGSE. One set of wavefront-error tests, for the coronagraphic channel of the Near-Infrared Camera (NIRCam) Longwave instruments, was performed using data from transverse translation diversity sweeps instead of focus sweeps, in which a sub-aperture is translated and/or rotated across the exit pupil of the system. Several optical-performance requirements that were verified during this ISIM-level testing are levied on the uncertainties of various wavefront-error-related quantities rather than on the wavefront errors themselves. This paper also describes the methodology, based on Monte Carlo simulations of the wavefront-sensing analysis of focus-sweep data, used to establish

Does compassion feel pleasant or unpleasant? People tend to categorize compassion as a pleasant or positive emotion, but laboratory compassion inductions, which present another’s suffering, may elicit unpleasant feelings. Across two studies, we examined whether prototypical conceptualizations of compassion (as pleasant) differ from experiences of compassion (as unpleasant). Following laboratory-based neutral or compassion inductions, participants made abstract judgments about compassion relat...

ENFORCE is a multi-disciplinary research project addressing trust management. The research objectives include the development of a methodology for the capture and analysis of policies for security and trust management, the development of a methodology for legal risk analysis to ensure trust, as well as the development of a language suitable for the specification of trust management policies. This report documents the ENFORCE conceptual framework for trust management by clarifying the notion o...

Business processes are a key asset of every organization, and the design of business process models is a foremost concern among an organization's functions. Business processes and their proper management depend intensely on the performance of software applications and technology solutions. This paper attempts to define a new conceptual model of an IT service provider, which can be examined as an IT-focused Enterprise model, part of the Enterprise Architecture (EA) school.

Once a project obtains approval, decision makers have to consider a variety of alternative paths for completing the project and meeting the project objectives. How decisions are made involves a variety of elements including: cost, experience, current technology, ideologies, politics, future needs and desires, capabilities, manpower, timing, available information, and for many ventures management needs to assess the elements of risk versus reward. The use of high level Probabilistic Risk Assessment (PRA) Models during conceptual design phases provides management with additional information during the decision making process regarding the risk potential for proposed operations and design prototypes. The methodology can be used as a tool to: 1) allow trade studies to compare alternatives based on risk, 2) determine which elements (equipment, process or operational parameters) drives the risk, and 3) provide information to mitigate or eliminate risks early in the conceptual design to lower costs. Creating system models using conceptual design proposals and generic key systems based on what is known today can provide an understanding of the magnitudes of proposed systems and operational risks and facilitates trade study comparisons early in the decision making process. Identifying the "best" way to achieve the desired results is difficult, and generally occurs based on limited information. PRA provides a tool for decision makers to explore how some decisions will affect risk before the project is committed to that path, which can ultimately save time and money.
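The trade-study use of PRA described above can be illustrated with a toy example. The sketch below estimates mission failure rates for two design alternatives by Monte Carlo simulation; the design names and failure probabilities are hypothetical placeholders, not figures from any actual NASA model.

```python
import random

# Each subsystem is assumed to fail independently with the given
# probability per mission; a mission fails if any subsystem fails.
designs = {
    "A": [0.01, 0.02, 0.005],    # hypothetical subsystem failure probs
    "B": [0.015, 0.008, 0.004],
}

def mission_failure_rate(subsystem_probs, trials=100_000, seed=42):
    """Estimate P(at least one subsystem fails) by simulation."""
    rng = random.Random(seed)
    failures = sum(
        any(rng.random() < p for p in subsystem_probs)
        for _ in range(trials)
    )
    return failures / trials

# Comparing alternatives on risk, as in an early trade study:
rates = {name: mission_failure_rate(p) for name, p in designs.items()}
```

In a real conceptual-design PRA the event trees, dependencies, and uncertainty distributions are far richer, but the comparison pattern (rank alternatives by simulated risk before committing to a path) is the same.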

Maximizing reader insights into interior design as a conceptual way of thinking, which is about ideas and how they are formulated. The major themes of this book are the seven concepts of planning, circulation, 3D, construction, materials, colour and lighting, which covers the entire spectrum of a designer’s activity. Analysing design concepts from the view of the range of possibilities that the designer can examine and eventually decide by choice and conclusive belief the appropriate course of action to take in forming that particular concept, the formation and implementation of these concepts is taken in this book to aid the designer in his/her professional task of completing a design proposal to the client. The purpose of this book is to prepare designers to focus on each concept independently as much as possible, whilst acknowledging relative connections without unwarranted influences unfairly dictating a conceptual bias, and is about that part of the design process called conceptual analysis. It is assu...

This paper describes the electromagnetic compatibility (EMC) tests performed on the Integrated Science Instrument Module (ISIM), the science payload of the James Webb Space Telescope (JWST), at NASA's Goddard Space Flight Center (GSFC) in August 2015. By its very nature of being an integrated payload, it could be treated as neither a unit-level test nor an integrated spacecraft/observatory test. Non-standard test criteria are described, along with the non-standard test methods that had to be developed in order to evaluate them. Results are presented to demonstrate that all test criteria were met in less than the time allocated.

For many years, researchers in a range of fields have combined quantitative and qualitative methods. However, the combined use of quantitative and qualitative methods has only recently been conceptualized and defined as mixed methods research. Some authors have described the emerging field as a third methodological tradition (in addition to the qualitative and quantitative traditions). Mixed methods research combines different perspectives and facilitates the study of complex interventions or programs, particularly in public health, an area where interdisciplinarity is critical. However, the existing literature is primarily in English. By contrast, the literature in French remains limited. The purpose of this paper is to present the emergence of mixed methods research for francophone public health specialists. A literature review was conducted to identify the main characteristics of mixed methods research. The results provide an overall picture of the mixed methods approach through its history, definitions, and applications, and highlight the tools developed to clarify the approach (typologies) and to implement it (integration of results and quality standards). The tools highlighted in the literature review are illustrated by a study conducted in France. Mixed methods research opens new possibilities for examining complex research questions and provides relevant and promising opportunities for addressing current public health issues in France.

The purpose of this note is to present the conceptual groundwork for the Comprehensive Analysis of Migration Pathways (CAMP). The conceptualization process for CAMP is discussed and available techniques for implementing CAMP are examined. Disposal of contaminated dredged material in a confined disposal facility is used to benchmark conceptual development. Case studies that illustrate analysis of selected migration pathways are also described.

The modeling of concepts from a cognitive perspective is important for designing spatial information systems that interoperate with human users. Concept representations that are built using geometric and topological conceptual space structures are well suited for semantic similarity and concept combination operations. In addition, concepts that are more closely grounded in the physical world, such as many spatial concepts, have a natural fit with the geometric structure of conceptual spaces. Despite these apparent advantages, conceptual spaces are underutilized because existing formalizations of conceptual space theory have focused on individual aspects of the theory rather than the creation of a comprehensive algebra. In this paper we present a metric conceptual space algebra that is designed to facilitate the creation of conceptual space knowledge bases and inferencing systems. Conceptual regions are represented as convex polytopes and context is built in as a fundamental element. We demonstrate the applicability of the algebra to spatial information systems with a proof-of-concept application.

For over a century the arrow has appeared in illustrations of cerebral function, yet the implications of using such symbols have not been previously considered. This review seeks to outline the nature, evolution, applications and limitations of this deceptively simple graphic device when it is used to picture functions of the brain. The arrow is found to have been used in several different ways: as a means of endowing anatomical structures with functional properties; as a method of displaying neural function either in free-standing form or in a structural or spatial framework; as a device for correlating functional data with underlying brain topography; and as a technique for linking functions of the brain with the world outside and with various philosophical concepts. For many of these uses the essential feature of the arrow is its directional characteristic. In contrast to the line, it is direction that enables the arrow to display information about time, which in turn can be exploited to depict functional rather than structural data. However, the use of the arrow is fraught with difficulties. It is often unclear whether an arrow has been used to illustrate fact, hypothesis, impression or possibility, or merely to provide a decorative flourish. Furthermore, the powerful symbolic nature of the arrow can so easily confer a spurious validity on the conjectural. Increasingly now there are insuperable difficulties when attempting to illustrate complex mechanisms of brain function. In the iconography of cerebral function, therefore, arrows with all their ambiguities may in certain circumstances become superseded by more non-representational symbols such as the abstract devices of the computational neuroscientist.

This Alternative Gallery feature introduces the photographic artist Professor Richard Sawdon Smith. Professor Sawdon Smith's work stems from a fascination with representations of anatomy that has been fuelled by his experience as a hospital patient. The work has allowed him to explore ideas through the use of medical illustrations, which include early anatomical drawings, personal medical photography and facial modelling. The work highlights how such imagery can be used in the context of a patient seeking understanding and acceptance of ill health and disease, using the body as a canvas on which to translate the experience.

The purpose of this manuscript is to present an illustrative demonstration of the measurement of damped electromagnetic oscillations in an RLC circuit that is easy to set up in any physics laboratory equipped with PASCO technologies and USB Electrical PASPort sensors together with standard electrical components. The electrical voltage recorded with DATA Studio software agrees very well with simulations performed in MULTISIM software and with standard calculations from theory. Our students and instructors enjoy the experiment for its simple setup and its instructive oscillations.
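The theoretical curve against which such measurements are compared can be stated compactly. The sketch below evaluates the underdamped series-RLC voltage; the component values are illustrative placeholders, not the ones used in the experiment.

```python
import math

# Underdamped series RLC discharge: V(t) = V0 * exp(-a t) * cos(wd t),
# with damping rate a = R / (2 L) and ringing frequency
# wd = sqrt(1 / (L C) - a**2).  Component values are placeholders.
R, L, C, V0 = 10.0, 25e-3, 1e-6, 5.0    # ohms, henries, farads, volts

a = R / (2 * L)                          # damping rate, 1/s
wd = math.sqrt(1 / (L * C) - a ** 2)     # damped angular frequency, rad/s

def v(t):
    """Capacitor voltage (volts) at time t (seconds), underdamped case."""
    return V0 * math.exp(-a * t) * math.cos(wd * t)

tau = 1 / a   # envelope 1/e decay time: 2 L / R, i.e. 5 ms here
```

Fitting the recorded voltage trace to this form yields a and wd, from which R, L, and C consistency checks follow directly.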

3D printing technology can help to visualize proofs in mathematics. In this document we aim to illustrate how 3D printing can help to visualize concepts and mathematical proofs. As was already known to educators in ancient Greece, physical models make it possible to bring mathematics closer to the public. The new 3D printing technology makes the realization of such tools more accessible than ever. This is an updated version of a paper included in the book Low-Cost 3D Printing for Science, Education and Sustainable Devel...

Over the past decade quantum information theory has developed into a vigorous field of research despite the fact that quantum information, as a precise concept, is undefined. Indeed the very idea of viewing quantum states as carriers of some kind of information (albeit unknowable in classical terms), leads naturally to interesting questions that might otherwise never have been asked, and corresponding new insights. We will discuss some illustrative examples, including a strengthening of the well known no-cloning theorem leading to a property of permanence for quantum information, and considerations arising from information compression that reflect on fundamental issues.

The conceptual design of mechatronic systems is addressed under the thrust of concurrent engineering, and an enhanced conceptual design methodology describing the early design stage of mechatronic systems is presented through an example illustration of a pick-and-place robot. This methodology treats each feasible solution as a solution strategy. In the methodology, Quality Function Deployment (QFD) is used as a baseline for the analysis of the mapping from customer to engineering requirements, Axiomatic Design (AD) is adopted as a guideline to generate feasible, good design solution alternatives, and the Theory of Inventive Problem Solving (TRIZ) is applied to deal with domain conflicts in design.

Up-to-date and image-oriented for use in clinical practice. Chapters are organized by disease entity for quick reference. Includes high-quality images and schematic drawings. Radiology Illustrated: Gynecologic Imaging is an up-to-date, image-oriented reference in the style of a teaching file that has been designed specifically to be of value in clinical practice. Individual chapters focus on the various imaging techniques, normal variants and congenital anomalies, and the full range of pathology. Each chapter starts with a concise overview, and abundant examples of the imaging findings are then presented. In this second edition, the range and quality of the illustrations have been enhanced, and image quality is excellent throughout. Many schematic drawings have been added to help readers memorize characteristic imaging findings through pattern recognition. The organization of chapters by disease entity will enable readers quickly to find the information they seek. Besides serving as an outstanding aid to differential diagnosis, this book will provide a user-friendly review tool for certification or recertification in radiology.

... that: (1) The illustration be of a size not less than three-fourths or more than one and one-half, in linear dimension, of each part of any matter so illustrated; (2) The illustration be one-sided; and...

Requirements for a rotorcraft conceptual design environment are discussed, from the perspective of a government laboratory. Rotorcraft design work in a government laboratory must support research, by producing technology impact assessments and defining the context for research and development; and must support the acquisition process, including capability assessments and quantitative evaluation of designs, concepts, and alternatives. An information manager that will enable increased fidelity of analysis early in the design effort is described. This manager will be a framework to organize information that describes the aircraft, and enable movement of that information to and from analyses. Finally, a recently developed rotorcraft system analysis tool is described.

…and services. The general idea can be named embedded configuration. In this article we intend to conceptualize embedded configuration: what it is and is not. The difference between embedded configuration, sales configuration and embedded software is explained. We will look at what is needed to make embedded configuration systems, including requirements to product modelling techniques. An example with consumer electronics will illuminate the elements of embedded configuration in settings that most can relate to. The question of where embedded configuration would be relevant is discussed, and the current…

This is part four of a series on the ongoing optical modeling activities for James Webb Space Telescope (JWST). The first two discussed modeling JWST on-orbit performance using wavefront sensitivities to predict line of sight motion induced blur, and stability during thermal transients. The third investigates the aberrations resulting from alignment and figure compensation of the controllable degrees of freedom (primary and secondary mirrors), which may be encountered during ground alignment and on-orbit commissioning of the observatory. The work here introduces some of the math software tools used to perform the work of the previous three papers of this series. NASA has recently approved these in-house tools for public release as open source, so this presentation also serves as a quick tutorial on their use. The tools are collections of functions written in Matlab, which interface with optical design software (CodeV, OSLO, and Zemax) using either COM or DDE communication protocol. The functions are discussed, and examples are given.

The Theory of Inventive Problem Solving (TRIZ) is a powerful tool widely used in the engineering community. It is based on identifying a physical contradiction in a problem and, from the corresponding pair of contradicting parameters, selecting a few suitable inventive principles, narrowing down the choice and leading to a much faster solution of the problem. It is remarkable that TRIZ methodology can also be applied to scientific disciplines. Many TRIZ inventive principles can be identified post factum in various scientific inventions and discoveries. However, additional inventive principles, more suitable for scientific disciplines, should be introduced and added to standard TRIZ, and some of the standard inventive principles need to be reformulated to be better applicable to science - we call this extension Accelerating Science TRIZ (AS-TRIZ). In this short note we describe and illustrate the AS-TRIZ inventive principles via scientific examples, identifying AS-TRIZ inventive principles in discoveries and inv...

Seven core values are said to undergird the profession of occupational therapy, with empathy serving as a hallmark of one of those values: personal dignity. This inquiry explores the meaning of empathy within a practice that holds occupation at its center. The literature on empathy in both philosophy and the behavioral sciences yields cogent thoughts about the fullness of empathy and its characteristic actions. The Healing Heart, the biography of the pioneer therapist Ora Ruggles, shows the manner in which occupational therapists can be empathic in their practice. These reflections and illustrations serve to sharpen the vision of occupational therapists as persons who reach for both the hands and the hearts of others.

The paper presents a partial overview of the mathematical synthesis and the physical realization of metasurfaces, and related illustrative examples. The synthesis consists in determining the exact tensorial surface susceptibility functions of the metasurface, based on generalized sheet transition conditions, while the realization deals with both metallic and dielectric scattering particle structures. The examples demonstrate the capabilities of the synthesis and realization techniques, thereby showing the plethora of possible metasurface field transmissions and subsequent applications. The first example is the design of two diffraction-engineering birefringent metasurfaces performing polarization beam splitting and orbital angular momentum multiplexing, respectively. Next, we discuss the concept of the electromagnetic remotely controlled metasurface spatial processor, which is an electromagnetic linear switch based on destructive interferences. Then, we introduce a non-reciprocal non-gyrotropic metasurface using a pick-up circuit radiator (PCR) architecture. Finally, the implementation of all-dielectric metasurfaces for frequency dispersion engineering is discussed.

The science instruments (SIs) comprising the James Webb Space Telescope (JWST) Integrated Science Instrument Module (ISIM) were tested in three cryogenic-vacuum test campaigns in the NASA Goddard Space Flight Center (GSFC)'s Space Environment Simulator (SES) test chamber. In this paper, we describe the results of optical wavefront-error performance characterization of the SIs. The wavefront error is determined using image-based wavefront sensing, and the primary data used by this process are focus sweeps, a series of images recorded by the instrument under test in its as-used configuration, in which the focal plane is systematically changed from one image to the next. High-precision determination of the wavefront error also requires several sources of secondary data, including 1) spectrum, apodization, and wavefront-error characterization of the optical ground-support equipment (OGSE) illumination module, called the OTE Simulator (OSIM), 2) f/# and pupil-distortion measurements made using a pseudo-nonredundant mask (PNRM), and 3) pupil-geometry predictions for each SI field point tested, which are complicated because of a tricontagon-shaped outer perimeter and small holes that appear in the exit pupil due to the way that different light sources are injected into the optical path by the OGSE. One set of wavefront-error tests, for the coronagraphic channel of the Near-Infrared Camera (NIRCam) Longwave instruments, was performed using data from transverse-translation diversity (TTD) sweeps instead of focus sweeps, in which a subaperture is translated and/or rotated across the exit pupil of the system from one image to the next. Several optical-performance requirements that were verified during this ISIM Element-level testing are levied on the uncertainties of various wavefront-error-related quantities rather than on the wavefront errors themselves. This paper also gives an overview of the methodology, based on Monte Carlo simulations of the wavefront-sensing analysis
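
As a toy illustration of the Monte Carlo idea mentioned above (a hypothetical sketch, not the actual ISIM analysis pipeline; all numbers are placeholders, not JWST values), measurement noise can be propagated through a simple RMS wavefront-error estimate to quantify its uncertainty:

```python
import math
import random

def rms(values):
    """Root-mean-square of a sequence of values."""
    return math.sqrt(sum(v * v for v in values) / len(values))

def wfe_uncertainty(true_wfe_nm=50.0, noise_nm=5.0, n_trials=10_000, seed=0):
    """Monte Carlo estimate of the scatter in a measured RMS wavefront
    error when each trial adds Gaussian measurement noise.
    All parameters are illustrative placeholders."""
    rng = random.Random(seed)
    trials = [true_wfe_nm + rng.gauss(0.0, noise_nm) for _ in range(n_trials)]
    mean = sum(trials) / n_trials
    scatter = rms([t - mean for t in trials])
    return mean, scatter
```

This mirrors the idea that some requirements are levied on the uncertainty of a wavefront-error quantity rather than on the quantity itself: the Monte Carlo scatter is the deliverable.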

This work is conducted as a preliminary study for a project where individuals' conceptualizations of domain knowledge will be thoroughly analyzed across 150 subjects from 6 countries. The project aims at investigating how humans' conceptualizations differ according to different types of mother langua...

This paper presents a conceptual service architecture for adaptive mobile location services designed to be used on the next generation wireless network. The developed service architecture consists of a set of concepts, principles, rules and guidelines for constructing, deploying, and operating the mobile location services. The service architecture identifies the components required to build the mobile location services and describes how these components are combined and how they should interact. As a means of exploring the developed conceptual service architecture, an illustrative case study of a new-concept tracking service is chosen to demonstrate the applicability of the architecture. Through the case study, the service request and response processes will be illustrated. New possible service functions provided by the developed service architecture will be examined and discussed...

This book explores the popularity of American illustration from the late 1800s through the 1920s. Illustrated books, periodicals, the public consumption of illustrations, and various themes of illustration are discussed. Themes include: (1) "The Smart Set"; (2) "The Masses"; (3) "The Domestic Scene"; (4) "Town and Country"; (5) "Let Me Call You…

This book explores critical principles and new concepts in bioengineering, integrating the biological, physical and chemical laws and principles that provide a foundation for the field. Both biological and engineering perspectives are included, with key topics such as the physical-chemical properties of cells, tissues and organs; principles of molecules; composition and interplay in physiological scenarios; and the complex physiological functions of heart, neuronal cells, muscle cells and tissues. Chapters evaluate the emerging fields of nanotechnology, drug delivery concepts, biomaterials, and regenerative therapy. The leading individuals and events are introduced along with their critical research. Bioengineering: A Conceptual Approach is a valuable resource for professionals or researchers interested in understanding the central elements of bioengineering. Advanced-level students in biomedical engineering and computer science will also find this book valuable as a secondary textbook or reference.

We analyze some of the problems related to conceptual modeling in data warehousing environments. The long-term data warehousing challenge is flexibility rather than efficiency. The reason is that most warehouses will contain growing collections of historical data that are supposed to satisfy changing information needs. We show how the event-entity-relationship model (EVER) can be used for schema design and query formulation in data warehouses. Our work is based on a layered data warehouse architecture in which a global data warehouse is used for flexible long-term organization and storage of all warehouse data, whereas local data warehouses are used for efficient query formulation and answering. In order to support flexible modeling of global warehouses we use a flexible version of EVER for global schema design. In order to support efficient query formulation in local data warehouses we...

The purpose of this introduction to spintronics is to provide some elementary description of its conceptual building blocks. Thus, it is intended for a newcomer to the field. After recalling rudimentary descriptions of spin precession and spin relaxation, spin-dependent transport is treated within the Boltzmann formalism. This suffices to introduce key notions such as the spin asymmetry of the conductivities in the two-current model, the spin diffusion length, and spin accumulation. Two basic mechanisms of spin relaxation are then presented, one arising from spin-orbit scattering and the other from electron-magnon collisions. Finally, the action of a spin-polarized current on magnetization is presented in a thermodynamic framework. This introduces the notion of spin torque and the characteristic length scale over which the transverse spin polarization of conduction electrons decays as it is injected into a magnet.
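
Two of the key notions named above can be written down compactly. As a rough sketch with made-up numbers (no specific material implied), the two-current spin asymmetry and the spin diffusion length are:

```python
import math

def spin_asymmetry(sigma_up, sigma_down):
    """Conductivity spin asymmetry beta = (s_up - s_down)/(s_up + s_down)
    in the two-current model."""
    return (sigma_up - sigma_down) / (sigma_up + sigma_down)

def spin_diffusion_length(diffusion_const, spin_relax_time):
    """l_sf = sqrt(D * tau_sf): the length over which spin accumulation
    decays in a diffusive conductor."""
    return math.sqrt(diffusion_const * spin_relax_time)

# Illustrative placeholder values, not measurements of any material:
beta = spin_asymmetry(sigma_up=4.0e6, sigma_down=1.0e6)       # S/m inputs
l_sf = spin_diffusion_length(1e-3, 1e-12)                     # m^2/s, s
```

Here beta = 0.6 signals strongly spin-polarized conduction, and l_sf comes out in the tens of nanometres, the scale on which spin accumulation matters in these illustrative units.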

Purpose - The purpose of this paper is to present insights into operations strategy (OS) in practice. It outlines a conceptualization and model of OS processes and, based on findings from an in-depth and longitudinal case study, contributes to further development of extant OS models and methods, which presently mainly focus on OS content, as distinct from process issues. Design/methodology/approach - The methodology combines action research and a longitudinal single site case study of OS processes in practice. Findings - The paper conceptualises an OS process as events of dialogue and action; this conceptualization provides a useful tool for describing and analyzing real-time OS processes unfolding in practice. Research limitations/implications - The research is based on a single case, which limits the generalizability of the findings. Practical implications - The findings suggest that, in order to obtain successful...

We researched factors related to the success or failure in working relationships between free-lance medical illustrators and artist's representatives. In the fall of 1992, surveys were mailed to 230 medical illustrators; 105 (46%) completed surveys were returned. Respondents were divided into three categories: 1) medical illustrators currently represented, 2) medical illustrators previously represented, and 3) medical illustrators who had never been represented. Comparisons made among illustrators from the three groups included business practices, clientele, experience, and self-promotion techniques. These comparisons revealed notable differences and similarities between the three groups and were subsequently analyzed to identify the characteristics of medical illustrators who would benefit from professional representation.

William Cheselden was Great Britain's foremost surgeon/scientist in the first half of the 18th century. Cheselden directly challenged the Company of Barber-Surgeons' exclusive right to control dissection in London by being the first to conduct a regular series of anatomy lectures and demonstrations outside of the Company's Hall. He incorporated his lecture syllabus into a handbook of anatomy, The Anatomy of the Humane Body, which was used by students for nearly 100 years. Cheselden also wrote the text and drew the illustrations for a majestic atlas of comparative osteology, the Osteographia, or the Anatomy of the Bones. Cheselden used his superior knowledge of anatomy to reduce the morbidity and mortality associated with perineal lithotomy, one of the few operations possible in his era. Sagacious and pragmatic, Cheselden recognized that the enlightened practice of surgery beginning to take root in 18th-century London could flourish only under an autonomous body of surgeons. Cheselden used his personal funds and political skills to urge Parliament to pass legislation for the dissolution of the combined Company of Barber-Surgeons and the establishment of separate and distinct Surgeons' and Barbers' Companies. After disjoinder of the two groups on May 2, 1745, Cheselden served as one of the Wardens of the new Company of Surgeons--a predecessor of the Royal College of Surgeons of England. In 1746, Cheselden, who helped design the first Surgeons' Hall, served as the Company's Master.

This is a cutaway illustration of the Saturn V launch vehicle with callouts of the major components. The Saturn V is the largest and most powerful launch vehicle developed in the United States. It was a three-stage rocket, 363 feet in height, used for sending American astronauts to the Moon and for placing Skylab in Earth orbit. The Saturn V was designed to perform Earth orbital missions through the use of the first two stages, while all three stages were used for lunar expeditions. The S-IC (first) stage was powered by five F-1 engines, which burned kerosene and liquid oxygen to produce more than 7,500,000 pounds of thrust. The S-II (second) stage was powered by five J-2 engines, which burned liquid hydrogen and liquid oxygen and produced 1,150,000 pounds of thrust. The S-IVB (third) stage used one J-2 engine, producing 230,000 pounds of thrust, with a restart capability. The Marshall Space Flight Center and its contractors designed, developed, and assembled the Saturn V launch vehicle stages.
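
The stage thrust figures quoted above are mutually consistent, which a quick arithmetic check makes explicit (illustrative only):

```python
def per_engine_thrust(stage_thrust_lbf, n_engines):
    """Average thrust per engine implied by a stage's total thrust."""
    return stage_thrust_lbf / n_engines

# First stage: five engines sharing 7,500,000 lbf total
s_ic_engine = per_engine_thrust(7_500_000, 5)
# Second stage: five J-2 engines sharing 1,150,000 lbf total
s_ii_engine = per_engine_thrust(1_150_000, 5)
```

The second-stage result, 230,000 lbf per J-2, matches the single-J-2 figure quoted for the S-IVB third stage, a nice internal consistency check on the numbers.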

The complexity of policy-making in the NHS is such that systemic, holistic thinking is needed if the current government's plans are to be realized. This paper describes systems thinking and illustrates its value in understanding the complexity of the diabetes National Service Framework (NSF); its role in identifying problems and barriers previously not predicted; and in reaching conclusions as to how it should be implemented. The approach adopted makes use of soft systems methodology (SSM) devised by Peter Checkland. This analysis reveals issues relating to human communication, information provision and resource allocation needing to be addressed. From this, desirable and feasible changes are explored as means of achieving a more effective NSF, examining possible changes from technical, organizational, economic and cultural perspectives. As well as testing current health policies and plans, SSM can be used to test the feasibility of new health policies. This is achieved by providing a greater understanding and appreciation of what is happening in the real world and how people work. Soft systems thinking is the best approach, given the complexity of health care. It is a flexible, cost-effective solution, which should be a prerequisite before any new health policy is launched.

Safety characteristics. Several applications of this conceptual design methodology have been carried out in order to validate it. Here we will show one of the most challenging case studies: the APT73 spaceplane. Today the demand for access to space is increasing and fully reusable launch vehicles are likely to play a key role in future space activities, but up until now this kind of space system has not been successfully developed. The ideal reusable launcher should be a vehicle able to maintain physical integrity during its mission, to take off and land at any conventional airport, to be operated with a minimum maintenance effort, and to guarantee an adequate safety level. Thanks to its flexibility it should be able to enter the desired orbital plane and to abort its mission at any time in case of mishap. Moreover, considerable cost reduction could be expected only by having extremely high launch rates, comparable to today's aircraft fleets in the commercial airline business. In our opinion the solution which best meets these specifications is the Aerial Propellant Transfer spaceplane concept, the so-called "one stage and a half" space vehicle, which takes off and climbs to meet a tanker aircraft, is aerially refuelled, and then, after disconnecting from the tanker, flies on to reach orbit. The APT73 has been designed to reach Low Earth Orbit to perform two kinds of mission: 1) to release payloads; 2) to be flown as a crew return vehicle from the ISS. The concept has emerged from a set of preliminary choices established at the beginning of the project. Possible variants to the basic plan have been investigated and a trade-off analysis has been carried out in order to obtain the optimum configuration. This paper provides a technical description of the APT73 and illustrates the design challenges encountered in the development of the project.

Lakoff and Johnson pointed out: "It is agreed that what makes the contemporary theory of metaphor unique is the important distinction drawn between conceptual metaphors and metaphorical expressions." Conceptual metaphor (or metaphorical concept) and linguistic metaphor are two different concepts.

This thesis investigates the question of how conceptual frameworks influence inductive reasoning. A conceptual framework is a collection of concepts used for a particular purpose; we can think of it as a semantic environment in which observations, or evidence, are recorded, and beliefs are formed an

Conceptual congruency effects are biases induced by an irrelevant conceptual dimension of a task (e.g., location in vertical space) on the processing of another, relevant dimension (e.g., judging words' emotional evaluation). Such effects are a central empirical pillar for recent views about how the mind/brain represents concepts. In the present…

define conceptual designing as a constructive framing and re-framing activity, which is mediated by and targeted at the creation of new design concepts. Conceptual designing as an approach is valuable for addressing the fuzziness and ambiguity typical of research that explores novel areas with new partners, methods and resources. It is by no means a new phenomenon, and the main contribution of the article is the clarification of conceptual designing as a particular approach to designing and researching. The approach embraces openness, resource-construction and collaboration. We conclude that conceptual designing can be especially useful in research and design projects that bring different kinds of people, organizations, technologies and domains together into the forming of new well-founded proposals for development. The presentation of conceptual designing in this paper is written...

Ecological momentary assessment (EMA) methods, which involve collection of real-time data in subjects' real-world environments, are particularly well suited to studying tobacco use. Analyzing EMA datasets can be challenging, as the datasets include a large and varied number of observations per subject and are relatively unstructured. This paper suggests that time is typically a key organizing principle in EMA data and that conceptualizing the data as a timeline of events, behaviors, and experiences can help define analytic approaches. EMA datasets lend themselves to answering a diverse array of research questions; the research question must drive how data are arranged for analysis and the kinds of statistical models that are applied. This is illustrated with brief examples of diverse analyses applied to answer different questions from an EMA study of tobacco use and relapse.
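
The "timeline of events" organization described above can be sketched in a few lines. This is a hypothetical illustration with invented records and event labels, not the paper's actual dataset or analysis code:

```python
from datetime import datetime

# Hypothetical EMA records: (subject_id, ISO timestamp, event label)
records = [
    ("s1", "2024-01-02T09:15", "craving"),
    ("s1", "2024-01-01T21:40", "smoked"),
    ("s2", "2024-01-01T08:05", "stress"),
    ("s1", "2024-01-02T10:00", "smoked"),
]

def build_timelines(records):
    """Group unstructured EMA records into per-subject timelines,
    ordered by time -- the key organizing step before analysis."""
    timelines = {}
    for subj, ts, event in records:
        timelines.setdefault(subj, []).append(
            (datetime.fromisoformat(ts), event))
    for events in timelines.values():
        events.sort()
    return timelines

tl = build_timelines(records)
```

Once records sit on a per-subject timeline, different research questions simply select different arrangements of the same data (e.g., events preceding a lapse versus daily counts).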

The James Webb Space Telescope is on track for a launch in 2013. The author reviews the status and progress on the key hardware. The first primary mirror segments are already at MSFC for cryogenic tests, the mid IR instrument (MIRI) has already had successful tests of the engineering model, and the detectors are showing excellent performance. The author also describes the scientific objectives of the mission, with emphasis on the predicted capabilities for observing planets by the transit technique and through direct imaging. Recent direct observations of planets by HST and by adaptive optics from the ground have shown that, under favorable circumstances, much can be learned.

or emotional simulation. There are numerous scientists, such as Richard Davidson, Paul Ekman or Dan Siegel, who study the effects of our emotions on our behavior and brain functions; unfortunately there are almost no existing references regarding how the creative process of images or animated movies helps our … reality by playing with images full of emotions. These are extremely relevant in the learning process and interpretation of experiences which produce our thoughts and feelings. Emotions affect our decision making, problem solving and focus of attention, features which we work on during the creative process of an animated movie or making illustrations. We present animation, including illustration as part of the process, as a social emotional learning tool and a medium to enhance wellbeing and work on neuroplasticity, by working on aspects from cognitive neuroscience, such as attention, transportation...

As landslides continue to be a hazard that accounts for large numbers of human and animal casualties, property loss, and infrastructure damage, as well as impacts on the natural environment, it is incumbent on developed nations that resources be allocated to educate affected populations in less developed nations, and provide them with tools to effectively manage this hazard. Given that the engineering, planning and zoning, and mitigation techniques for landslide hazard reduction are more accessible to developed nations, it is crucial that such landslide hazard management tools be communicated to less developed nations in a language that is not overly technical, and provide information on basic scientific explanations of where, why and how landslides occur. The experiences of the United States, Canada, and many other nations demonstrate that landslide science education and techniques for reducing damaging landslide impacts may be presented in a manner that can be understood by the layperson. There are various methods through which this may be accomplished: community-level education, technology transfer, and active one-on-one outreach to national and local governments, and non-governmental organizations (NGOs), who disseminate information throughout the general population. The population at large can also benefit from the dissemination of landslide information directly to individual community members. The United States Geological Survey and the Geological Survey of Canada have just published and will distribute a universal landslide handbook that can be easily made available to emergency managers, local governments, and individuals. The handbook, “The Landslide Handbook: A Guide to Understanding Landslides”, is initially published as U.S. Geological Survey Circular 1325, in English, available in print, and accessible on the internet. It is liberally illustrated with schematics and photographs, and provides the means for a basic understanding of landslides, with

This study focuses on the analysis of the illustrations found in five different Geography textbooks in Romania. The analysis is based on several criteria: number, size, clarity, and pedagogical usefulness. The following conclusions have been drawn: the illustrations are numerous; most of the illustrations are too small and unclear to be efficiently used in the teaching activity; the purpose of some materials is purely illustrative; and some illustrations are overcharged with details, which prevents children from understanding them. Authors and publishing houses are advised to choose the illustrations in fourth-grade Geography textbooks more carefully.

The PHENIX Conceptual Design Report (CDR) describes the detector design of the PHENIX experiment for Day-1 operation at the Relativistic Heavy Ion Collider (RHIC). The CDR presents the physics capabilities, technical details, cost estimate, construction schedule, funding profile, management structure, and possible upgrade paths of the PHENIX experiment. The primary goals of the PHENIX experiment are to detect the quark-gluon plasma (QGP) and to measure its properties. Many of the potential signatures for the QGP are measured as a function of a well-defined common variable to see if any or all of these signatures show a simultaneous anomaly due to the formation of the QGP. In addition, basic quantum chromodynamics phenomena, collision dynamics, and thermodynamic features of the initial states of the collision are studied. To achieve these goals, the PHENIX experiment measures lepton pairs (dielectrons and dimuons) to study various properties of vector mesons, such as the mass, the width, and the degree of yield suppression due to the formation of the QGP. The effect of thermal radiation on the continuum is studied in different regions of rapidity and mass. The e-μ coincidence is measured to study charm production, and aids in understanding the shape of the continuum dilepton spectrum. Photons are measured to study direct emission of single photons and to study π⁰ and η production. Charged hadrons are identified to study the spectrum shape, production of antinuclei, and the φ meson (via the K⁺K⁻...
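
The vector-meson measurements above rest on reconstructing the pair invariant mass. As a minimal sketch of the standard kinematic relation (neglecting the lepton rest masses; the example numbers are illustrative, not PHENIX data):

```python
import math

def invariant_mass(e1, p1, e2, p2):
    """Invariant mass (GeV/c^2) of a particle pair from energies (GeV)
    and 3-momenta (GeV/c): m^2 = (E1 + E2)^2 - |p1 + p2|^2."""
    e = e1 + e2
    px, py, pz = (p1[i] + p2[i] for i in range(3))
    m2 = e * e - (px * px + py * py + pz * pz)
    return math.sqrt(max(m2, 0.0))

# Back-to-back massless leptons of 0.5 GeV each reconstruct to 1.0 GeV:
m = invariant_mass(0.5, (0.5, 0.0, 0.0), 0.5, (-0.5, 0.0, 0.0))
```

A peak in the distribution of this quantity over many pairs is what identifies a vector meson; shifts in its position or width are among the QGP signatures the CDR discusses.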

The conceptual design of the high-luminosity electron-ion collider, eRHIC, is presented. The goal of eRHIC is to provide collisions of electrons (and possibly positrons) with ions and protons at center-of-mass energies from 25 to 140 GeV, and with luminosities exceeding 10³³ cm⁻² s⁻¹. A considerable part of the physics program is based on polarized electrons, protons and ³He ions with a high degree of polarization. In eRHIC the electron beam will be accelerated in an energy recovery linac. Major R&D items for eRHIC include the development of a high-intensity polarized electron source, studies of various aspects of energy recovery technology for high-power beams, and the development of compact magnets for recirculating passes. In the eRHIC scheme the beam-beam interaction has several specific features, which have to be thoroughly studied. In order to maximize the collider luminosity, several upgrades of the existing RHIC accelerator are required. Those upgrades may include an increase of intensity as well as transverse and longitudinal cooling of the ion and proton beams.
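
The quoted 25-140 GeV range follows from the usual asymmetric-collider relation sqrt(s) ≈ 2·sqrt(E_e·E_ion) for head-on ultra-relativistic beams. The beam-energy pairs below are illustrative placeholders chosen to span that range, not the eRHIC design values:

```python
import math

def cm_energy(e_electron_gev, e_ion_gev):
    """Approximate centre-of-mass energy (GeV) for head-on collisions of
    ultra-relativistic beams: sqrt(s) ~ 2 * sqrt(E_e * E_ion)."""
    return 2.0 * math.sqrt(e_electron_gev * e_ion_gev)

# Illustrative beam-energy pairs spanning the quoted range:
low = cm_energy(2.5, 62.5)      # bottom of the range, ~25 GeV
high = cm_energy(17.5, 280.0)   # top of the range, ~140 GeV
```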

In this essay a critical review of present conceptual problems in current cosmology is provided from a more philosophical point of view. In essence, a digression on how philosophy could help cosmologists in what is strictly their fundamental endeavor is presented. We start by recalling some examples of enduring confrontations among philosophers and physicists on what the former could contribute to the everyday striving of the latter. Then, a short review of the standard Friedmann-Lemaître-Robertson-Walker (FLRW) model of cosmology is given. It seems apparent that cosmology is living a golden age with the advent of high-precision observations. Nonetheless, a critical revisiting of the direction in which it should go appears also needed, for misconceptions like "quantum backgrounds for cosmological classical settings" and "quantum gravity unification" have not been properly constructed to date. Thus, knowledge-building in cosmology, more than in any other field, should begin with visions of...

Brookhaven National Laboratory has prepared a conceptual design for a world-class user facility for scientific research using synchrotron radiation. This facility, called the "National Synchrotron Light Source II" (NSLS-II), will provide ultra-high brightness and flux and exceptional beam stability. It will also provide advanced insertion devices, optics, detectors, and robotics, and a suite of scientific instruments designed to maximize the scientific output of the facility. Together these will enable the study of material properties and functions with a spatial resolution of ~1 nm, an energy resolution of ~0.1 meV, and the ultra-high sensitivity required to perform spectroscopy on a single atom. The overall objective of the NSLS-II project is to deliver a research facility to advance fundamental science, with the capability to characterize and understand physical properties at the nanoscale, the processes by which nanomaterials can be manipulated and assembled into more complex hierarchical structures, and the new phenomena resulting from such assemblages. It will also be a user facility made available to researchers engaged in a broad spectrum of disciplines from universities, industries, and other laboratories.

A participatory system dynamics modelling approach is advanced to support conceptualization of feedback processes underlying ecosystem services and to foster a shared understanding of leverage intervention points. The process includes a systems mapping workshop and follow-up tasks aimed at the collaborative construction of causal loop diagrams. A case study developed in a natural area in Portugal illustrates how a stakeholder group was actively engaged in the development of a conceptual model depicting policies for sustaining the climate regulation ecosystem service.

What is a conceptual model? How is conceptual modeling performed in general and in specific modeling domains? What is the role of established approaches in conceptual modeling? This book addresses these issues.

Exoplanet Proxima b will be an important laboratory for the search for extraterrestrial life for the decades ahead. Here, we discuss the prospects of detecting carbon dioxide at 15 μm using a spectral filtering technique with the Medium Resolution Spectrograph (MRS) mode of the Mid-Infrared Instrument (MIRI) on the James Webb Space Telescope (JWST). At superior conjunction, the planet is expected to show a contrast of up to 100 ppm with respect to the star. At a spectral resolving power of R = 1790-2640, about 100 spectral CO2 features are visible within the 13.2-15.8 μm (3B) band, which can be combined to boost the planet atmospheric signal by a factor of 3-4, depending on the atmospheric temperature structure and CO2 abundance. If atmospheric conditions are favorable (assuming an Earth-like atmosphere), with this new application to the cross-correlation technique, carbon dioxide can be detected within a few days of JWST observations. However, this can only be achieved if both the instrumental spectral response and the stellar spectrum can be determined to a relative precision of ≤1 × 10-4 between adjacent spectral channels. Absolute flux calibration is not required, and the method is insensitive to the strong broadband variability of the host star. Precise calibration of the spectral features of the host star may only be attainable by obtaining deep observations of the system during inferior conjunction that serve as a reference. The high-pass filter spectroscopic technique with the MIRI MRS can be tested on warm Jupiters, Neptunes, and super-Earths with significantly higher planet/star contrast ratios than the Proxima system.
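The line-combining gain described in the abstract above (stacking ~100 weak CO2 features to boost the planetary signal) follows the familiar sqrt(N) scaling of co-added measurements. A minimal Monte-Carlo sketch of that scaling, not the actual MIRI/MRS cross-correlation pipeline; the function name and parameters are hypothetical:

```python
import math
import random

def stacked_snr(n_lines, depth=1.0, sigma=1.0, n_trials=2000, seed=42):
    """Monte-Carlo estimate of the S/N after co-adding n_lines noisy
    measurements of equally deep spectral features.

    Each "line" yields depth + Gaussian noise; averaging n_lines such
    measurements shrinks the noise by roughly sqrt(n_lines)."""
    rng = random.Random(seed)
    estimates = []
    for _ in range(n_trials):
        measured = [depth + rng.gauss(0.0, sigma) for _ in range(n_lines)]
        estimates.append(sum(measured) / n_lines)
    mean = sum(estimates) / n_trials
    std = math.sqrt(sum((e - mean) ** 2 for e in estimates) / (n_trials - 1))
    return mean / std

# S/N gain from stacking ~100 lines relative to a single line
gain = stacked_snr(100) / stacked_snr(1)
```

With equal-depth lines and uncorrelated noise the gain approaches sqrt(100) = 10; the factor 3-4 quoted in the abstract is smaller, presumably because the real CO2 features have unequal depths and the noise and stellar residuals are not independent between channels.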

Hubble's WFC3 has been a game changer for studying early galaxy formation in the first 700 Myr after the Big Bang. Reliable samples of sources up to z~10, which can be discovered only from space, are now constraining the evolution of the galaxy luminosity function into the epoch of reionization. Despite these efforts, the size of the highest redshift galaxy samples (z > 9 and especially z > 10) is still very small, particularly at high luminosities (L > L*). To deliver transformational results, much larger numbers of bright z > 9 galaxies are needed both to map out the bright end of the luminosity/mass function and for spectroscopic follow-up (with JWST and otherwise). One especially efficient way of expanding current samples is (1) to leverage the huge amounts of pure-parallel data available with HST to identify large numbers of candidate z ~ 9 - 11 galaxies and (2) to follow up each candidate with shallow Spitzer/IRAC observations to distinguish the bona fide z ~ 9 - 11 galaxies from z ~ 2 old, dusty galaxies. For this program we are requesting shallow Spitzer/IRAC follow-up of 20 candidate z ~ 9 - 11 galaxies we have identified from 130 WFC3/IR pointings obtained from more than 4 separate HST programs with no existing IRAC coverage. Based on our previous CANDELS/GOODS searches, we expect to confirm 5 to 10 sources as L > L* galaxies at z >= 9. Our results will be used to constrain the bright end of the LF at z >= 9, to provide targets for Keck spectroscopy to constrain the ionization state of the z > 8 universe, and to furnish JWST with bright targets for spectroscopic follow-up studies.

This article, the 3rd in a series of 5, introduces the conceptual framework for thematic mapping, a novel approach to case conceptualization. The framework is transtheoretical in that it is not constrained by the tenets or concepts of any one therapeutic orientation and transdiagnostic in that it conceptualizes clients outside the constraints of diagnostic criteria. Thematic mapping comprises 4 components: a definition, foundational principles, defining features, and core concepts. These components of the framework, deemed building blocks, are explained in this article. Like the foundation of any structure, the heuristic value of the method requires that the building blocks have integrity, coherence, and sound anchoring. We assert that the conceptual framework provides a solid foundation, making thematic mapping a potential asset in mental health treatment.

Explores the life and work of the artist, Winslow Homer, a 19th century painter and illustrator. Focuses on the exhibition, "Winslow Homer the Illustrator: His Wood Engravings, 1857-1888." Includes the itinerary for the exhibition and examples of his illustrations. (CMK)

Building a solid foundation of conceptual knowledge is critical for students in electrical engineering. This case study explores the use of simulation videos to illustrate complicated conceptual knowledge in foundational communications and signal processing courses. Students found these videos to be very useful for establishing concepts, understanding course content and increasing general knowledge in electrical engineering. We hope that the findings can help inform instructors of electrical engineering to transform their teaching practice and eventually benefit students through building a solid conceptual understanding that fosters the development of further engineering competencies.

The starting point of this paper is Vygotsky's thesis that the prerequisite of conceptual thinking, and of concepts in general, is the systematic influence exerted upon the child through inclusion in the process of education. The aim of this work is to examine the characteristics of conceptual thinking in people who have not attended school and have therefore been deprived of the formative role of education. Four different methods for examining conceptual development were used on a sample of seventeen respondents who had not attended school. The results show that in none of the four methods did the majority of respondents demonstrate mastery of concepts at the highest level of development. However, some respondents, in some tests and some individual tasks within the tests, did show characteristics of a high level of conceptual thinking development.

Presented in viewgraph form are techniques to improve the conceptual design of complex systems. The paper discusses theory of design, flexible software tools for computer aided design, and methods for enhancing communication among design teams.

Our focus is on uncertainties in the underlying conceptual framework upon which all subsequent steps in numerical and/or analytical modeling efforts depend. Experienced environmental modelers recognize the value of selecting an optimal conceptual model from several competing site models, but usually do not formally explore possible alternative models, in part due to incomplete or missing site data, as well as relevant regional data for establishing boundary conditions. The value in and approach for developing alternative conceptual site models (CSMs) is demonstrated by analysis of case histories. These studies are based on reported flow or transport modeling in which alternative site models are formulated using data that were not available to, or not used by, the original modelers. An important concept inherent to model abstraction of these alternative conceptual models is that it is "Far better an approximate answer to the right question, which is often vague, than the exact answer to the wrong question, which can always be made precise." (Tukey, 1962) The case histories discussed here illustrate the value of formulating alternative models and evaluating them using site-specific data: (1) Charleston Naval Site, where seismic characterization data allowed significant revision of the CSM and subsequent contaminant transport modeling; (2) Hanford 300-Area, where surface- and ground-water interactions affecting the unsaturated zone suggested an alternative component to the site model; (3) Savannah River C-Area, where a characterization report for a waste site within the modeled area was not available to the modelers but provided significant new information requiring changes to the underlying geologic and hydrogeologic CSMs used; (4) Amargosa Desert Research Site (ADRS), where re-interpretation of resistivity sounding data and water-level data suggested an alternative geologic model. Simple 2-D spreadsheet modeling of the ADRS with the revised CSM provided an improved
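The "simple 2-D spreadsheet modeling" mentioned for the ADRS case can be illustrated by Jacobi relaxation of steady-state heads, in which each interior cell is repeatedly replaced by the average of its neighbors, which is exactly what a grid of spreadsheet formulas computes for Laplace's equation. A minimal sketch under assumed boundary conditions (fixed heads on the left and right edges, no-flow top and bottom); all names and values are illustrative, not the actual ADRS model:

```python
def relax_heads(nrows=5, ncols=5, left=10.0, right=0.0, iters=2000):
    """Steady-state 2-D groundwater head by Jacobi relaxation.

    Start from a linear guess between the fixed left/right heads;
    each sweep replaces interior cells with the average of their four
    neighbors (no-flow top/bottom edges are handled by mirroring)."""
    grid = [[left + (right - left) * j / (ncols - 1) for j in range(ncols)]
            for i in range(nrows)]
    for _ in range(iters):
        new = [row[:] for row in grid]
        for i in range(nrows):
            for j in range(1, ncols - 1):   # left/right columns stay fixed
                up = grid[i - 1][j] if i > 0 else grid[i + 1][j]
                down = grid[i + 1][j] if i < nrows - 1 else grid[i - 1][j]
                new[i][j] = (up + down + grid[i][j - 1] + grid[i][j + 1]) / 4.0
        grid = new
    return grid

heads = relax_heads()
```

With these boundary conditions the solution is the linear head profile from 10 to 0, so the scheme is easy to check by hand; a revised CSM would change the boundary conditions or add internal features rather than the solver.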

This paper describes how object-oriented concepts can be used throughout system development for integration purposes. Based on the distinction between physical and conceptual integration, the concept of object wrapping is discussed for the integration of non-object-oriented systems. By regarding applications as high-level objects, i.e. wrapped applications, integration is achieved by modelling so-called integration relationships between these wrapped applications. While in conceptual integration re...

The Parisian Revue Illustrée (1885–1912), a middle-class periodical of broad circulation and sophisticated iconography, lets us examine the expansion of fin-de-siècle culture beyond the so-called 'petites revues', particularly in the years 1894–1903. Through this fashionable general-audience fortnightly, typically fin-de-siècle tales and songs on transient motifs, replete with Art Nouveau images and ornamentation, reached bourgeois households. The article shows the niche category the magazine occupied through its copious and exciting iconography. Using unpublished correspondence and print material culture, it throws light on the ways its editors turned the more refined parts of the magazine into deluxe photo-mechanically produced books. The study focuses on two men, René Baschet, the Revue's editor from 1889 to 1904, and Jérôme Doucet, his assistant editor from July 1897 to 1902, and two fin-de-siècle writers, Catulle Mendès and Jean Lorrain, as well as up-and-coming artists André Cahard, Henry Bellery-Desfontaines, Manuel Orazi, and Carloz Schwabe. The case shows that sophisticated illustration was a financial spur that came cheap while it supported the so-called 'decadent' writings. Further, with refined taste, numerous connections to artists, and work for a Dutch publisher, Jérôme Doucet emerges as a key figure behind the scenes.

The mercury conceptual model and its four submodels (1. Methylation, 2. Bioaccumulation, 3. Human Health Effects, and 4. Wildlife Health Effects) can be used to understand the general relationships among drivers and outcomes associated with mercury cycling in the Delta. Several linkages between important drivers and outcomes have been identified as important but highly uncertain (i.e., poorly understood). For example, there may be significant wildlife health effects of mercury on mammals and reptiles in the Delta, but there is currently very little or no information about them. The characteristics of such linkages are important when prioritizing and funding restoration projects and associated monitoring in the Delta and its tributaries.

Reviewing anatomical, physiological, and neurological standard literature for illustrations of referred visceral pain, only one type of illustration can frequently be found, which is attributed to Treves and Keith. In fact, the original illustration serving as a model for most current pictures stems from the German edition of Sir Frederick Treves' famous book "Surgical Applied Anatomy" from 1914, which was re-illustrated for didactic reasons for the German readership. Since neither Treves and Keith nor the German illustrator Otto Kleinschmidt ever published any work on referred pain, this illustration must have been adapted or copied from older sources by the illustrator. Therefore, the comprehensive systematic original works published before 1914 were reviewed, namely those of Sir Henry Head and Sir James Mackenzie. Given that the phenomenon is known in the German literature as Head's zones, the illustrations were expected to be based mainly on Head's work. However, a comparison of all available illustrations led to the conclusion that Kleinschmidt chiefly used information from Mackenzie as a model for his illustration. Due to the inexact reproduction of Mackenzie's work by the illustrator, some important features reported by the original authors were lost. These include the phenomenon of Head's maximum points, which nowadays has fallen into oblivion. Therefore, current charts, based on the illustration by Kleinschmidt from 1914, lack experimental evidence and appear to be a simplification of the observational results of both Head's and Mackenzie's original systematic works.

This article investigates the use of product models by conceptual designers. After a short introduction, abstraction applied in conceptual design is described. A model that places conceptual design in a three-dimensional space is used. Applications of conceptual design from the literature are used t

Many of the goals of research on conceptual metaphor in science education overlap with the goals of research on conceptual change. The relevance of a conceptual metaphor perspective to the study of conceptual change has already been discussed. However, a substantial body of literature on conceptual metaphor in science education has now emerged…

This paper develops a method that incorporates the public value for environmental cobenefits when a conservation buyer can purchase water quality credits based on nonmarket valuation results. We demonstrate this approach through an experiment with adult students in a classroom laboratory environment. Our application contributes to the study of individual preference and willingness to pay for cobenefits associated with the production of water quality credits in relation to the Ohio River Basin Trading Project. We use three different methods to elicit individuals' willingness to pay (WTP), including (1) a hypothetical referendum, (2) a real referendum lacking incentive compatibility, and (3) a real choice with incentive compatibility. Methodologically, our WTP estimates suggest individuals are more sensitive to the cost changes and reveal the lowest value in the real choice with incentive compatibility. Practically, we find individuals value certain cobenefits and credits as public goods. Incorporating public value toward cobenefits may improve the overall efficiency of a water quality trading market. Based on our specification of a planner's welfare function, results suggest a substantial welfare improvement after identifying an optimal allocation of a buyer's budget across credits derived from agricultural management practices producing different portfolios of cobenefits.

Item responses can be context-sensitive. Consequently, composing test forms flexibly from a calibrated item pool requires considering potential context effects. This paper focuses on context effects that are related to the item sequence. It is argued that sequence effects are not necessarily a violation of item response theory but that item response theory offers a powerful tool to analyze them. If sequence effects are substantial, test forms cannot be composed flexibly on the basis of a calibrated item pool, which precludes applications like computerized adaptive testing. In contrast, minor sequence effects do not thwart applications of calibrated item pools. Strategies to minimize the detrimental impact of sequence effects on item parameters are discussed and integrated into a nomenclature that addresses the major features of item calibration designs. An example of an item calibration design demonstrates how this nomenclature can guide the process of developing a calibrated item pool.
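The sequence effects discussed above can be made concrete with a toy Rasch-model simulation in which an item's effective difficulty increases when it is administered late in the test (a fatigue-type position effect). This is an illustrative sketch, not the paper's calibration design; all names and values are hypothetical:

```python
import math
import random

def rasch_p(ability, difficulty):
    """Rasch model: probability of a correct response."""
    return 1.0 / (1.0 + math.exp(-(ability - difficulty)))

def position_effect(drift, n_persons=4000, seed=1):
    """Proportion correct on the same item administered early
    (difficulty b = 0) versus late (difficulty b + drift)."""
    rng = random.Random(seed)
    early = late = 0
    for _ in range(n_persons):
        theta = rng.gauss(0.0, 1.0)                 # person ability
        early += rng.random() < rasch_p(theta, 0.0)
        late += rng.random() < rasch_p(theta, 0.0 + drift)
    return early / n_persons, late / n_persons

p_early, p_late = position_effect(drift=0.5)        # half a logit of fatigue
```

If such drift is substantial, a single calibrated difficulty misrepresents the item in flexibly composed forms, which is the concern the paper's calibration-design nomenclature addresses.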

Scientific knowledge is the knowledge accumulated by systematic studies and organized by general principles. Visual, verbal, numeric, and other types of representation are used to communicate scientific knowledge. Scientific illustration is the visual representation of objects and concepts in order to record and to convey scientific knowledge (Ford, 1993). There are some discussions of scientific illustrations in the history, philosophy, and sociology of science (Burri & Dumit, 2008), but little has been done on the creation of scientific illustrations by illustrators. This study focuses on the creation of scientific illustrations by illustrators. The purpose is to show how illustrators create the visual messages in communications of scientific knowledge. Through analysis of semi-structured interviews with 6 professional illustrators, creators, and art directors, it is shown that illustrators select and edit scientific information, add non-scientific information, and organize information into one visual representation of scientific knowledge. The implication of this research is to provide a new perspective on multisensory communication of scientific knowledge.

In a distributed eMarketplace, recommended product ontologies are required for trading between buyers and sellers. Conceptual clustering can be employed to build dynamic recommended product ontologies. Traditional methods of conceptual clustering (e.g., COBWEB or CLUSTER/2) do not take heterogeneous attributes of a concept into account. Moreover, the result of these methods is clusters rather than recommended concepts. A center-recommendation clustering algorithm is provided. According to the values of heterogeneous attributes, recommended product names can be selected from the clusters produced by this algorithm. This algorithm can also create the hierarchical relations between product names. The definitions of product names given by all participants are collected in a distributed eMarketplace, and recommended product ontologies are built. These ontologies include relations and definitions of product names that come from different participants in the distributed eMarketplace. Finally, a case is given to illustrate this method. The result shows that the method is feasible.
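The idea of selecting a recommended product name as a cluster "center" can be sketched with a simple attribute-overlap score: the name whose attributes are, on average, most similar to every other member names the cluster's concept. This illustrates the general approach, not the paper's actual algorithm; all product names and attributes are made up:

```python
def jaccard(a, b):
    """Similarity between two attribute sets."""
    return len(a & b) / len(a | b)

def recommend_center(products):
    """Pick the product whose attribute set is most similar, on average,
    to all other products in the cluster: the 'center' whose name is
    recommended for the concept."""
    best, best_score = None, -1.0
    for name, attrs in products.items():
        score = sum(jaccard(attrs, other)
                    for o, other in products.items() if o != name)
        if score > best_score:
            best, best_score = name, score
    return best

# One cluster of product definitions contributed by different participants
cluster = {
    "laptop": {"cpu", "ram", "screen", "battery"},
    "notebook": {"cpu", "ram", "screen", "battery", "webcam"},
    "desktop": {"cpu", "ram", "screen"},
}
```

The chosen name can then label the parent concept in the recommended ontology, with the remaining names attached as children.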

The purpose of the present article of scientific investigation lies in the deep study of the concept "socioformative teaching," due to the absence of a conceptual … Materials and methods: conceptual cartography is used as a method, following the eight axes of the analysis proposed and taking primary sources as a base. Results and conclusions: the clarification of the concept is presented along with its methodological … New studies are suggested about the methodology and its impact on the complete formation of students.

… of space to the detriment of conceptual categories for place meaning (Casey 1997). The consequences for the study of the semantics of prepositions are that the spatial uses of prepositions are described with explanatory force, even claiming psychological reality in theory. Yet McDonough et al. (2003) … As qualitative, non-specific mental representations, based in part on Mandler's (2004, 2010) and Barsalou's (1999) views on conceptual representations, the analysis proceeds to describe and explain the distinctive senses of these prepositions and their interconnections. In keeping with Sinha and Kuteva's (1995) …

Half a century ago, calls had already been made for instrumental ensemble directors to move beyond performance to include the teaching of musical concepts in the rehearsal hall. Relatively recent research, however, suggests that conceptual teaching remains relatively infrequent during rehearsals. Given the importance of teaching for long-term…

In this paper we examine American inservice teachers' instructional methods in reading comprehension following professional development (PD), illustrating the prolonged and significant gap between research and practice in the field of reading comprehension. We explore this gap from a conceptual change perspective, positing that some level of…

This paper briefly reviews existing conceptualizations of resistance in counseling children. The author posits that resistance is an "expected" aspect of all counseling and offers an alternative orientation toward client resistance based on exploring the child's "helping narratives." Two case studies illustrate the implementation of this…

This paper reflects on the socio-political conceptualization of Western Iberia, one of Rewilding Europe’s first pilot areas. By combining social theories related to Politics of Scale and Actor Network Theory, we illustrate how Western Iberia is continuously being negotiated through practices in diff

This undergraduate student paper explores usage of mixed reality techniques as support tools for conceptual design. A proof-of-concept was developed to illustrate this principle. Using this as an example, a small group of designers was interviewed to determine their views on the use of this

Objective: This article proposes and illustrates a conceptual model of medical student well-being. Method: The authors reviewed the literature on medical student stress, coping, and well-being and developed a model of medical student coping termed the "coping reservoir." Results: The reservoir can be replenished or drained by various aspects of…

Considering the strengths and limitations of both methodologies, the question to be answered in this thesis is: how valuable and compatible are the classical analytical methods in today's conceptual design environment, and can these methods complement each other? To answer these questions, this thesis investigates the pros and cons of classical analytical structural analysis methods during the conceptual design stage through the following objectives: illustrate the structural design methodology of these methods within the framework of the Aerospace Vehicle Design (AVD) lab's design lifecycle, and demonstrate the effectiveness of the moment distribution method through four case studies. This is done by considering and evaluating the strengths and limitations of these methods. In order to objectively quantify the limitations and capabilities of the analytical methods at the conceptual design stage, each case study is more complex than the one before.

This tool needs almost no user interaction. The user simply has to align both photographs and indicate the level of detail according to the distance. The rest is decided by our software. Whereas a professional illustrator needs more than 20 hours to finish a similar illustration, our software is able to do it in just a few seconds.

The teaching of physics often involves the use of illustrations that complement and assist the understanding of a particular situation or physical phenomenon. Overall, the proper use of illustrations can maximize the learning and understanding of concepts and phenomena related to the teaching of science (physics, chemistry, biology) and mathematics.

This article explores the two series of visits to the artist's studio that appeared in the famed French illustrated magazine L'Illustration in the 1850s and in 1886. An in-depth examination of both the texts and images reveals the verbal and visual tropes used to characterize the artists and their s

Presents an interview with George Littlechild, a Canadian Plains Cree artist, writer and illustrator who has created nearly 500 paintings that have been exhibited on several continents. Discusses his autobiographical "This Land is My Land" which is illustrated with his paintings and which won the Jane Addams Picture Book Award. (SG)

This article presents a vision for fostering multilingualism in schools that extends the notion of translanguaging to include the realm of multilingual curriculum theorizing. We locate our analysis at the intersection of multicultural education, multilingual education, and curriculum studies in order to conceptualize language, culture, and…

Reviews advancements in the development of conceptual frameworks for studying information behavior. Concludes that a unifying theoretical body is emerging that, beyond its user-centered core, emphasizes the contextual interplay of cognitive, social, cultural, organizational, affective, and linguistic factors and asserts that information behavior…

Biology student mastery regarding the mechanisms of diffusion and osmosis is difficult to achieve. To monitor comprehension of these processes among students at a large public university, we developed and validated an 18-item Osmosis and Diffusion Conceptual Assessment (ODCA). This assessment includes two-tiered items, some adopted or modified…

The theory of conceptual fields is a developmental theory. It has two aims: (1) to describe and analyse the progressive complexity, on a long- and medium-term basis, of the mathematical competences that students develop inside and outside school, and (2) to establish better connections between the operational form of knowledge, which consists in…

We describe options for the production of an intense photon beam at the CEBAF Hall D Tagger facility, needed for creating a high-quality secondary $K^0_L$ beam delivered to the Hall D detector. The conceptual design for the Compact Photon Source apparatus is presented.

While the conceptual change model of learning has contributed much to our understanding of how children learn science, recent criticisms of the model point out its lack of attention to motivational issues. This paper examines one such motivational construct of importance to the model: epistemic motivation. After a description of the construct, we…

This paper details the development of a taxonomy for conceptualizing teaching. This taxonomy is presented as a means to help educators understand and interpret what it is they do and continue in the process of searching and understanding. The purpose of developing a taxonomy, the basis for the dimensions--or subject matter--for the taxonomy, and…

The JWST IEC conformal shields are mounted onto a composite frame structure that must undergo qualification testing to satisfy mission assurance requirements. The composite frame segments are bonded together at the joints using an epoxy, EA 9394. The development of a test method to verify the integrity of the bonded structure in its operating environment introduces challenges in terms of requirements definition and the attainment of success criteria. Although protoflight thermal requirements were not achieved, the first attempt at exposing the structure to cryogenic operating conditions in a thermal vacuum environment resulted in bonded-joint failures during mechanical pull tests performed at 1.25 times the flight loads. Failure analysis concluded that the failure mode was adhesive cracks that formed and propagated along stress-concentrated fillets as a result of poor bond squeeze-out control during fabrication. Bond repairs were made and the structure was successfully re-tested, with an improved LN2 immersion test method used to achieve protoflight thermal requirements.

Supermassive black holes observed at high redshift $z \gtrsim 6$ could grow from direct collapse black holes (DCBHs) with mass $\sim 10^5\,M_{\odot}$, which result from the collapse of supermassive stars (SMSs). If a relativistic jet is launched from a DCBH, it can break out of the collapsing SMS and produce a gamma-ray burst (GRB). Although most of the GRB jets are off-axis from our line of sight, we show that the energy injected from the jet into a cocoon is huge, $\sim 10^{55-56}\,\rm{erg}$, so that the cocoon fireball is observed as an ultra-luminous supernova of $\sim 10^{45-46}\,\rm{erg\,s^{-1}}$ lasting $\sim 5000\,[(1+z)/16]\,\rm{days}$. Such events are detectable by future telescopes with near-infrared bands, such as $Euclid$, $WFIRST$, $WISH$, and $JWST$, up to $z \sim 20$ at a rate of $\sim 100$ events per year, providing direct evidence for the DCBH scenario.

The fossil record provides the only direct evidence of temporal trends in biodiversity over evolutionary timescales. Studies of biodiversity using the fossil record are, however, largely limited to discussions of taxonomic and/or morphological diversity. Behavioural and physiological traits that are likely to be under strong selection are largely obscured from the body fossil record. Similar problems exist in modern ecosystems where animals are difficult to access. In this review, we illustrate some of the common conceptual and methodological ground shared between those studying behavioural ecology in deep time and in inaccessible modern ecosystems. We discuss emerging ecogeochemical methods used to explore population connectivity and genetic drift, life-history traits and field metabolic rate and discuss some of the additional problems associated with applying these methods in deep time.

Within the complex set of activities that comprise the scientific method, three clusters of activities can be recognized: experimentation, mathematization, and conceptual analysis. In psychology, the first two of these clusters are well-known and valued, but the third seems less known and valued. The authors show the value of these three clusters of scientific method activities in the works of the quintessential scientist Galileo Galilei. They then illustrate how conceptual analysis can be used in psychology to clarify the grammar and meaning of concepts, expose conceptual problems in models, reveal unacknowledged assumptions and steps in arguments, and evaluate the consistency of theoretical accounts. The article concludes with a discussion of three criticisms of conceptual analysis.

The JPL Control/Structure Interaction Program is developing new analytical methods for designing micro-precision spacecraft with controlled structures. One of these, the Conceptual Design Tool, will illustrate innovative new approaches to the integration of multi-disciplinary analysis and design methods. The tool will be used to demonstrate homogeneity of presentation, uniform data representation across analytical methods, and integrated systems modeling. The tool differs from current 'integrated systems' that support design teams most notably in its support for the new CSI multi-disciplinary engineer. The design tool will utilize a three dimensional solid model of the spacecraft under design as the central data organization metaphor. Various analytical methods, such as finite element structural analysis, control system analysis, and mechanical configuration layout, will store and retrieve data from a hierarchical, object oriented data structure that supports assemblies of components with associated data and algorithms. In addition to managing numerical model data, the tool will assist the designer in organizing, stating, and tracking system requirements.

This paper introduces the functions and features of the Illustrator software, describes an approach to illustration design using Illustrator, and gives a detailed drawing workflow. Practice has shown that Illustrator can be used to design beautiful, high-quality illustrations quickly and efficiently.

Volume illustration can be used to provide insight into source data from CT/MRI scanners in much the same way as medical illustration depicts the important details of anatomical structures. As such, proven techniques used in medical illustration should be transferable to volume illustration, providing scientists with new tools to visualize their data. In recent years, a number of techniques have been developed to enhance the rendering pipeline and create illustrative effects similar to the ones found in medical textbooks and surgery manuals. Such effects usually highlight important features of the subject while subjugating its context and providing depth cues for correct perception. Inspired by traditional visual and line-drawing techniques found in medical illustration, we have developed a collection of fast algorithms for more effective emphasis/de-emphasis of data as well as conveyance of spatial relationships. Our techniques utilize effective outlining techniques and selective depth enhancement to provide perceptual cues of object importance as well as spatial relationships in volumetric datasets. Moreover, we have used illustration principles to effectively combine and adapt basic techniques so that they work together to provide consistent visual information and a uniform style.

When we list the areas of practice for medical illustrators we always include research, but how involved in research are we? The aim of this activity is to encourage your professional development, not just as a medical illustrator but also in your involvement with research, whether that is undertaking your own research, undertaking evidence-based practice (1), working as part of a research team, advising researchers on the value of medical illustration, or supporting a student undertaking a research project for their degree or postgraduate qualification.

Value systems are central to understanding consumer behavior and they are an important basis for market segmentation. This study addresses changes in individual value systems across time. First, we conceptualize main ways in which value systems may change over time. Next, we extend Kamakura and Mazz

utilized are driven by the timeline in which questions must be answered. This can range from quick "back-of-the-envelope" assessments of a configuration made in an afternoon, to more detailed tradespace explorations that can take upwards of a year to complete. A variety of spreadsheet-based tools and conceptual design codes are currently in use. The in-house developed conceptual sizing code RC (Rotorcraft) has been the preferred tool for CD activity for a number of years. Figure 2 illustrates the long-standing coupling between RC and solid modeling tools for layout, as well as a number of ad-hoc interfaces with external analyses. RC contains a sizing routine built around momentum theory for rotors, classic finite wing theory, a referred-parameter engine model, and semi-empirical weight estimation techniques. These methods lend themselves to rapid solutions, measured in seconds and minutes. The successful use of RC, however, requires careful consideration of model input parameters and judicious comparison with existing aircraft to avoid unjustified extrapolation of results. RC is in fact the legacy of a series of codes whose development started in the early 1970s, and it is best suited to the study of conventional helicopters and XV-15-style tiltrotors. Other concepts have been analyzed with RC, but it typically became necessary to modify the source code and methods for each unique configuration. Recent activity has led to the development of a new code, NASA Design and Analysis of Rotorcraft (NDARC). NDARC uses a similar level of analytical fidelity to RC, but is built on a new framework intended to improve modularity and the ability to rapidly model a wider array of concepts. Critical to achieving this capability is the decomposition of the aircraft system into a series of fundamental components which can then be assembled to form a wide array of configurations. The paper will provide an overview of NDARC and its capabilities.
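The momentum-theory methods that make codes like RC and NDARC fast reduce to closed-form relations. A hedged sketch (figures and function names are illustrative, not drawn from either code) of the classic actuator-disk hover result, $P = T^{3/2}/\sqrt{2\rho A}$:

```python
# Illustrative momentum-theory relation of the kind used in rotorcraft
# sizing codes: ideal (induced) hover power from actuator-disk theory.
# Inputs below are made-up example numbers, not RC/NDARC data.
import math

def ideal_hover_power_kw(thrust_n, rotor_radius_m, rho=1.225):
    """Ideal hover power P = T**1.5 / sqrt(2 * rho * A), in kilowatts."""
    disk_area = math.pi * rotor_radius_m ** 2   # actuator disk area, m^2
    return thrust_n ** 1.5 / math.sqrt(2.0 * rho * disk_area) / 1000.0

# A notional ~4000 kg helicopter (weight ~39240 N) with a 7 m radius rotor:
print(round(ideal_hover_power_kw(39240.0, 7.0), 1))  # ideal power in kW
```

Because thrust enters as $T^{3/2}$, doubling gross weight more than doubles ideal hover power, which is why sizing loops converge on rotor radius and installed power together.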

I. Conceptual Integration Network. Conceptual integration ("blending") is a general cognitive operation on a par with analogy, recursion, mental modeling, conceptual categorization, and framing. In blending, structure from input mental spaces is projected to a separate, "blended"

This paper describes an approach to indexing texts by their conceptual content using ontologies along with lexico-syntactic information and semantic role assignment provided by lexical resources. The conceptual content of meaningful chunks of text is transformed into conceptual feature structures...

In recent years, there have been many exchanges of perspectives and debates in the field of conceptual change. Most of the classical views on conceptual change have been criticized, and there have been recent discussions around bridging the cognitive and socio-cultural approaches in the research on conceptual change. On the other hand, researchers…

The notion of a concept implies a set of rules and ideas combining, in a common intention, actual principles for the application, structure, and form of a future artificial object. A rational concept is considered a product of "conceptual design", which is nothing more than a step-by-step investigation of the possibility of achieving a desired goal. It precedes the full-scale development oriented to project implementation. The author studies conceptual design as a process of idea evolution involving several stages (Fig. 1). One of these stages is considered, and an effort is made to generate an effective methodology for the direct search for a rational concept with due regard for system and other methods.

Physics: with illustrative examples from medicine and biology is a three-volume set of textbooks in introductory physics written at the calculus level and designed primarily for students with career objectives in the life sciences.

Modern computer graphics software has enabled the medical illustrator to render very complex anatomy by composing many different layers of drawings simultaneously. This and the author's capacity to take an "editorial" approach to compress several chronological events into a single, comprehensive two-dimensional illustration are analyzed in a step-by-step process. Through a series of images, the article provides a visual synopsis of the development of an illustration for an extensive clinical case: total sacrectomy performed through an all-posterior approach. Originally given as a slide presentation at the American Association of Neurological Surgeons Theodore Kurze Lecture in April 2011, the article provides some detailed notes on the techniques the author used to develop a comprehensive neurosurgical illustration.

Statistical adjustments to accommodate multiple comparisons are routinely covered in introductory statistical courses. The fundamental rationale for such adjustments, however, may not be readily understood. This article presents a simple illustration to help remedy this.
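The rationale that abstract alludes to can itself be shown in a few lines. A minimal sketch (function name is an assumption) of why unadjusted multiple comparisons inflate the familywise error rate, and how a Bonferroni correction restores it:

```python
# With m independent tests each at level alpha, the chance of at least one
# false positive is 1 - (1 - alpha)**m, which grows far beyond alpha.
def familywise_error(alpha, m):
    """Probability of >= 1 false positive among m independent tests."""
    return 1.0 - (1.0 - alpha) ** m

alpha, m = 0.05, 20
print(round(familywise_error(alpha, m), 3))      # ~0.642 without adjustment
print(round(familywise_error(alpha / m, m), 3))  # Bonferroni: back near 0.05
```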

Illustrations are a salient source of information in children's books, yet their effect on children's reading comprehension has been studied only through literal factual recall. The purpose of the current study was to determine the effect of illustrations on bridging inferences, an important aspect of meaning making in comprehension models. Identical short stories were presented under different illustration conditions with pictures that represented different parts of the story. Participants were 73 7- to 11-year-olds. Illustrations both facilitated and interfered with inferencing depending on the type of information depicted; however, this effect was reduced as grade increased. Additional findings were that the overall ability to make inferences increased with age and working memory was a significant predictor of this skill. Results are discussed in relation to cognitive and developmental models of comprehension.

Biology student mastery regarding the mechanisms of diffusion and osmosis is difficult to achieve. To monitor comprehension of these processes among students at a large public university, we developed and validated an 18-item Osmosis and Diffusion Conceptual Assessment (ODCA). This assessment includes two-tiered items, some adopted or modified from the previously published Diffusion and Osmosis Diagnostic Test (DODT) and some newly developed items. The ODCA, a validated instrument containing ...

In an attempt to ensure more consistent casting results and remove some of the schedule variance associated with casting, an improved casting furnace concept has been developed. The improved furnace uses the existing arc melter hardware and glovebox utilities. The furnace concept was designed around physical and operational requirements, such as a charge size of less than 30 grams, high heating rates, and minimal additional footprint. The conceptual model is shown in the report, along with a summary of how the requirements were met.

While many countries, especially the U.S., have developed and implemented environmental education programs, little information exists concerning the major environmental issues to be communicated, the concepts to be taught, or the conceptual objectives to be interpreted. Environmental education research efforts concerned with documenting program effectiveness and demonstrating goal attainment, concept acquisition, student belief and attitude shift, and skill acquisition are reported. Methods for assessing environmental education program effectiveness are described. (22 references)

Illustrated Sketches of Death Valley (1891) originated as a hastily written series of journalistic sketches of our Western borax deserts. They were written on commission to supplement their author's income. Conceived as a means of subtly promoting the borax industry, the Sketches in time won unintended recognition as a classic source on their subject. They also assumed unforeseen importance as an illustration of the role of advertising in America's changing economy.

Search engines have entered popular culture. They touch people in diverse private and public settings and thus heighten the importance of social matters such as information privacy and control, censorship, and equitable access. To fully benefit from search engines and to participate in debate about their merits, people necessarily appeal to their understandings of how they function. In this chapter we examine the conceptual understandings that people have of search engines by performing a content analysis on the sketches that 200 undergraduate and graduate students drew when asked to draw a sketch of how a search engine works. Analysis of the sketches reveals a diverse range of conceptual approaches, metaphors, representations, and misconceptions. On the whole, the conceptual models articulated by these students are simplistic; however, students with higher levels of academic achievement sketched more complete models. This research calls attention to the importance of improving students' technical knowledge of how search engines work so they can be better equipped to develop and advocate policies for how search engines should be embedded in, and restricted from, various private and public information settings.

In this paper I examine the notion and role of metaphors and illustrations in Maxwell's works in exact science as a pathway into a broader and richer philosophical conception of a scientist and scientific practice. While some of these notions and methods are still at work in current scientific research, from economics and biology to quantum computation and quantum field theory, here I have chosen to attest to their entrenchment and complexity in actual science by attempting to make some conceptual sense of Maxwell's own usage; this endeavour includes situating Maxwell's conceptions and applications in his own culture of Victorian science and philosophy. I trace Maxwell's notions to the formulation of the problem of understanding, or interpreting, abstract representations such as potential functions and Lagrangian equations. I articulate the solution in terms of abstract-concrete relations, where the concrete, in tune with Victorian British psychology and engineering, includes the muscular as well as the pictorial. This sets the basis for a conception of understanding in terms of unification and concrete modelling, or representation. I examine the relation of illustration to analogies and metaphors on which this account rests. Lastly, I stress and explain the importance of context-dependence, its consequences for realism-instrumentalism debates, and Maxwell's own emphasis on method.

Metaphor is a figure of speech in the traditional view. In modern linguistics, conceptual metaphor can be seen as a way of thinking, and it has been in common use in English from ancient times until today. It is the basis of human cognition, ways of thinking, language use, and behavior. It is such an important part of English that the application of conceptual metaphor is essential in English teaching in China. This thesis focuses on the application of conceptual metaphor in English teaching, building on a detailed introduction to conceptual metaphor theories and people's ways of comprehending sentences.

Led by the Affordable Care Act, the U.S. health care system is undergoing a transformative shift toward greater accountability for quality and efficiency. Academic health centers (AHCs), whose triple mission of clinical care, research, and education serves a critical role in the country's health care system, must adapt to this evolving environment. Doing so successfully, however, requires a broader understanding of the wide-ranging roles of the AHC. This article proposes a conceptual framework through which the triple mission is expanded along four new dimensions: health, innovation, community, and policy. Examples within the conceptual framework categories, such as the AHCs' safety net function, their contributions to local economies, and their role in right-sizing the health care workforce, illustrate how each of these dimensions provides a more robust picture of the modern AHC and demonstrates the value added by AHCs. This conceptual framework also offers a basis for developing new performance metrics by which AHCs, both individually and as a group, can be held accountable, and that can inform policy decisions affecting them. This closer examination of the myriad activities of modern AHCs clarifies their essential role in our health care system and will enable these institutions to evolve, improve, be held accountable for, and more fully serve the health of the nation.

A participatory system dynamics modelling approach is advanced to support the conceptualization of feedback processes underlying ecosystem services and to foster a shared understanding of leverage intervention points. The process includes a systems mapping workshop and follow-up tasks aimed at the collaborative construction of causal loop diagrams. A case study developed in a natural area in Portugal illustrates how a stakeholder group was actively engaged in the development of a conceptual model depicting policies for sustaining the climate regulation ecosystem service.

This article reviews the influence of key figures on the pictorial representation of anatomy and the evolution of anatomical illustration during the Middle Ages until the time of the Renaissance, based on medical history books, journals and ancient medical books. During the early period in the Middle Ages, most illustrations were traditional drawings of emblematic nature, oftentimes unrealistic, not only because the precise knowledge of anatomy was lacking but also because the objective was to elucidate certain principles for teaching purposes. Five figure-series that came down to us through ancient manuscripts and textbooks represent the best examples of such traditional illustrations. With the advent of human dissection in the 13th and 14th centuries, a significant transformation in the depiction of anatomy began to project the practice of human dissection, as we see in the works of Mondino de Luzzi, Henri de Mondeville and Guido de Vigevano. After the invention of book printing in the second half of the 15th century, the reproduction of books was commonly practised and the woodcut made multiplication of pictures easier. Peter of Abano, Hieronymous Brunschwig, Johannes de Ketham, Johannes Peyligk, Gregory Reisch, Magnus Hundt, Laurentius Phryesen and many more included several anatomical illustrations in their treatises that demonstrated the development of anatomical illustration during the later Middle Ages.

In our study we analyze a new form of illustration, the meme, which is widely used in science and in the educational process. To this end, we assembled a collection of images to catalog and describe, in order to understand the new role of the illustrative image. A meme is defined as a unit of cultural information. According to Wikipedia's definition, any idea, symbol, manner, or way of doing things, consciously or unconsciously transmitted from person to person through speech, video, writing, rituals, drawings, gestures, and so on, can be considered a meme. The term and concept of the meme were proposed by the evolutionary biologist Richard Dawkins in 1976 in his book "The Selfish Gene". The article shows illustrative examples of memes based on the artwork and paintings of the great masters and considers the uses of the created images. It traces the history of illustrative memes and the transformation of images in the modern world of science and education. In using memes in scientific work as a form of illustration, we note that visualization of the object of research develops visual thinking, which serves several functions: cognitive, communicative, and methodological. The cognitive function is manifested in the ability to understand an object through an image; the communicative, in explaining a problem or task through the image; and the methodological, in structuring research with due regard for the structure and morphology of the object.

This discussion of weather models uses maps to illustrate the differences among three types of frontal cyclones (long wave, short wave, and troughs). Awareness of these cyclones can provide clues to atmospheric conditions which can lead toward accurate weather forecasting. (AM)

This article explores the two series of visits to the artist's studio that appeared in the famed French illustrated magazine L'Illustration in the 1850s and in 1886. An in-depth examination of both the texts and images reveals the verbal and visual tropes used to characterize the artists and their spaces, linking these to broader notions of "the artist" – his moral characteristics, behaviors, and artistic practice – as well as to the politics of the art world and the (bourgeois) ideology of L'Illustration. The aim is to uncover not only the language but also the mechanics of the "mediatization" of the image of the artist in this crucial period.

This paper presents an automatic non-photorealistic rendering approach to generating technical illustrations from 3D models. It first decomposes the 3D object into a set of CSG primitives, then performs hidden surface removal based on a prioritized list in which the rendition order of the CSG primitives is sorted by depth. Each primitive is then illustrated with a pre-defined empirical lighting model, and the system mimics stroke drawing in a user-specified style. In order to modulate the illumination artistically and flexibly, the empirical lighting model is defined by three major components: parameters of multi-level lighting intensities, parametric spatial occupations for each lighting level, and an interpolation method that maps the lighting units onto the spatial occupations of the CSG primitives, instead of "pixel-by-pixel" painting. This region-by-region shading facilitates the simulation of illustration styles.
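The multi-level lighting idea can be sketched in miniature. A hedged example (level counts and thresholds are illustrative assumptions, not the paper's parameters) of quantizing a continuous Lambert term into a few discrete bands, the cel-style alternative to pixel-by-pixel shading:

```python
# Illustrative multi-level lighting: a continuous Lambertian intensity in
# [0, 1] is mapped to one of a few discrete levels per region, rather than
# shaded continuously. The four levels below are made-up values.
def quantize_intensity(n_dot_l, levels=(0.2, 0.5, 0.8, 1.0)):
    """Map a Lambert term (dot of normal and light direction) to a band."""
    n = len(levels)
    # Pick a band index by equal-width intervals over [0, 1]; clamp at top.
    idx = min(int(max(n_dot_l, 0.0) * n), n - 1)
    return levels[idx]

print(quantize_intensity(0.1))  # 0.2  (dark band)
print(quantize_intensity(0.9))  # 1.0  (bright band)
```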

Two manuscripts with colored illustrations held in French libraries were investigated. The research showed that the first, in two volumes, titled Animaux et Plantes de Chine, is held in the Library of the Muséum National d'Histoire Naturelle (MNHN), while the other, in one volume, titled Botanique Chinoise, is held in the Library of the Société Asiatique, Collège de France. Both were identified as illustrations from Ben cao gang mu (Compendium of Materia Medica) (1640 edition) by Li Shi-zhen. These pictures were copied by P. d'Incarville and are similar, respectively, to Plantes fleurs et arbres de Chine in the Bibliothèque de l'Institut de France and to Collection de Plantes Veneneuses de la Chine Gravees et Imprimees en Couleurs par les Missionnaries Jesuites in the Bibliothèque Nationale de France. The latter two manuscripts were identified as illustrations from Ben cao pin hui jing yao (Essential Collections of Materia Medica) (1700 edition).

The article offers a brief overview of the history of definitions of the social economy at the European level and of the ideological background of the concept, and highlights some key dimensions for a sociological analysis of the social economy. Although the social economy is a reality present in different forms in most human communities, the term has no universally accepted definition, either internationally or in Europe. Attempts at defining and theorizing the concept are relatively new in relation to practice. This article is a development of the conceptual framework chapter of the report "Profit to the People", from the POSDRU project Social Economy Model in Romania.

This article reports on the findings of a qualitative study in which the construction of integrity of some business leaders was explored. Data were gathered through ten in-depth interviews with six South African business leaders commended as champions of integrity. A grounded-theory approach to the data analysis elicited five themes. These themes and their interrelatedness are discussed in this article, and a conceptual framework of integrity is proposed. Integrity is conceptualised as a multifaceted and dynamic construct based on a moral foundation and inner drive that is managed by cognitive and affective processes manifesting various integrity-related behaviours.

In this paper I wish to apply implications of Conceptual Blending Theory to computer games. I will analyze chosen examples and discuss them as a result of video game innovation made possible through "conceptual blending." Conceptual blending links mental spaces in an integration network consisting of at least two input spaces, a generic space, and a blended space, together with its governing principles of composition, completion, and elaboration. With the help of these instruments I analyze computer games like Tuper Tario Tros. and Hell. The purpose of my approach is not so much to validate the ideas of Conceptual Blending Theory through another field of examples (computer games) but to name and analyze characteristics of the mentioned games with the help of a given method.


Trompe l'oeil is a French term meaning "fool the eye." When this concept is applied to the art of painting, the painted objects must look as convincingly real as the actual objects do. Since the scientific illustrator has a background in rendering subjects realistically and accurately, he or she should have no trouble moving into the trompe l'oeil fine art field. The author lists the aspects that make a trompe l'oeil painting different from a realistic scientific illustration, a still life, or a landscape painting.

The blockbuster bestseller, now in a manga edition: fully illustrated and fun to read! Beautifully illustrated by Kensuke Okabayashi, this enthralling edition of Patrick Lencioni's massive bestseller gives readers a new format in which to understand the fascinating, complex world of teams. Kathryn Petersen, Decision Tech's CEO, faces the ultimate leadership crisis: uniting a team in such disarray that it threatens to bring down the entire company. Will she succeed? Will she be fired? Will the company fail? Lencioni's gripping tale serves as a timeless reminder that leadership requires as much co

Risk analysis is a tricky procedure in which one can easily make mistakes. Indeed, although risk equations are rather general, transferring a methodology to another context or hazard type can often lead to inaccuracies or even significant errors. To illustrate this, common mistakes made with the Swiss methodology are presented, together with possible solutions. These include the following. Risk analysis for moving objects only takes the process dimension into account (e.g. the length of a road section potentially affected by a landslide), but not the object dimension (e.g. the car's length). This is a fair simplification as long as the object dimension is considerably smaller than the process dimension. However, when the object is large compared to the process (e.g. rockfalls on a train), the results will be wrong. This problem can be illustrated by considering two blocks: according to this methodology, a 1 m diameter block would be twice as likely to reach a train as a 50 cm block, which is obviously not correct. In rockfall risk analyses on roads or railways found in the literature, the block dimension is usually neglected in favour of the object dimension, which is a fair assumption in this context. However, it is possible to include both dimensions by using the sum of the lengths instead of one of them. Risk analysis is usually performed using 3 different scenarios, for 3 different ranges of return periods, namely 1-30, 30-100 and 100-300 years. In order to be conservative, the operator commonly considers the magnitude of the worst event that happens with a return period included between the class bounds, which means that the operator evaluates the magnitude reached or exceeded with a return period of 30, 100 and 300 years respectively. Then, since the magnitude corresponds to the upper bounds of the classes, risk is calculated using the frequency corresponding to these return periods and not to the middle of the class (and also subtracting the
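The sum-of-lengths correction described above is easy to demonstrate numerically. A minimal sketch (all figures invented for illustration) showing that once the train's length is counted, a 1 m block is nowhere near twice as dangerous as a 0.5 m block:

```python
# Illustrative encounter geometry: the fraction of a section over which a
# falling block and a moving object can meet uses the SUM of the process
# and object dimensions, not either one alone. Numbers are made up.
def encounter_fraction(process_len_m, object_len_m, section_len_m):
    """Fraction of a road/rail section over which block and object can meet."""
    return min((process_len_m + object_len_m) / section_len_m, 1.0)

# A 1 m block vs a 0.5 m block against a 100 m train on a 1000 m section:
print(encounter_fraction(1.0, 100.0, 1000.0))   # 0.101
print(encounter_fraction(0.5, 100.0, 1000.0))   # 0.1005 -- nearly identical,
# so doubling the block size does NOT double the risk once the train counts.
```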

A growing body of data has been gathered in support of the view that the mind is embodied and that cognition is grounded in sensory-motor processes. Some researchers have gone so far as to claim that this paradigm poses a serious challenge to central tenets of cognitive science, including the widely held view that the mind can be analyzed in terms of abstract computational principles. On the other hand, computational approaches to the study of mind have led to the development of specific models that help researchers understand complex cognitive processes at a level of detail that theories of embodied cognition (EC) have sometimes lacked. Here we make the case that connectionist architectures in particular can illuminate many surprising results from the EC literature. These models can learn the statistical structure in their environments, providing an ideal framework for understanding how simple sensory-motor mechanisms could give rise to higher-level cognitive behavior over the course of learning. Crucially, they form overlapping, distributed representations, which have exactly the properties required by many embodied accounts of cognition. We illustrate this idea by extending an existing connectionist model of semantic cognition in order to simulate findings from the embodied conceptual metaphor literature. Specifically, we explore how the abstract domain of time may be structured by concrete experience with space (including experience with culturally specific spatial and linguistic cues). We suggest that both EC researchers and connectionist modelers can benefit from an integrated approach to understanding these models and the empirical findings they seek to explain.

Environmental decision support intends to use the best available scientific knowledge to help decision makers find and evaluate management alternatives. The goal of this process is to achieve the best fulfillment of societal objectives. This requires a careful analysis of (i) how scientific knowledge can be represented and quantified, (ii) how societal preferences can be described and elicited, and (iii) how these concepts can best be used to support communication with authorities, politicians, and the public in environmental management. The goal of this paper is to discuss key requirements for a conceptual framework to address these issues and to suggest how these can best be met. We argue that a combination of probability theory and scenario planning with multi-attribute utility theory fulfills these requirements, and discuss adaptations and extensions of these theories to improve their application for supporting environmental decision making. With respect to (i) we suggest the use of intersubjective probabilities, if required extended to imprecise probabilities, to describe the current state of scientific knowledge. To address (ii), we emphasize the importance of value functions, in addition to utilities, to support decisions under risk. We discuss the need for testing "non-standard" value aggregation techniques, the usefulness of flexibility of value functions regarding attribute data availability, the elicitation of value functions for sub-objectives from experts, and the consideration of uncertainty in value and utility elicitation. With respect to (iii), we outline a well-structured procedure for transparent environmental decision support that is based on a clear separation of scientific prediction and societal valuation. We illustrate aspects of the suggested methodology by its application to river management in general and with a small, didactical case study on spatial river rehabilitation prioritization.
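The multi-attribute value aggregation at the core of the framework described above can be sketched compactly. A minimal example (weights, objectives, and scores are invented for illustration, not from the paper) using the standard additive form:

```python
# Illustrative multi-attribute value aggregation: each alternative is scored
# on normalized sub-objective value functions in [0, 1], then combined with
# stakeholder weights. All numbers below are hypothetical.
def additive_value(scores, weights):
    """Weighted additive aggregation of normalized attribute values."""
    assert abs(sum(weights) - 1.0) < 1e-9, "weights must sum to 1"
    return sum(w * v for w, v in zip(weights, scores))

# Two hypothetical river-rehabilitation alternatives, three objectives
# (assumed here to be ecology, cost, and recreation):
weights = [0.5, 0.3, 0.2]
print(round(additive_value([0.8, 0.4, 0.6], weights), 2))  # 0.64
print(round(additive_value([0.5, 0.9, 0.7], weights), 2))  # 0.66
```

The additive form is the simplest of the aggregation techniques the paper discusses; the "non-standard" aggregations it advocates testing would replace this weighted sum with other functional forms.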

We analyse a number of different externalities to identify conceptual challenges for the practical implementation of their internalisation. Three issues were identified: i) the balance between compensation and technology change and the respective effects on the nominal and real GDP; ii) the relevance and efficiency of different instruments for internalisation and compensation; and iii) implementing internalisation over large geographical and temporal distances. We find taxation to be a more relevant and efficient tool for internalisation than insurance and litigation. With increasing geographical and especially temporal distance between the benefitting actor and the victim of the external cost, the involvement of a non-governmental intermediate actor becomes increasingly necessary to provide the short-term capital required to ensure a successful implementation.

Conceptual skills enable the development of abilities necessary for controlling certain aspects of life. They include communication skills, functional literacy, and self-direction skills. The aim of this paper was to determine the acquisition of conceptual skills in persons with visual impairment. The research was conducted on a sample of 127 persons with visual impairment, 19-60 years of age. The Conceptual Skills Domain of the Adaptive Behavior Assessment System II (ABAS-II) was used to obtain data on the acquisition of conceptual skills. It was determined that age (p=0.001) and the category of visual impairment (blindness or low vision) (p=0.000) were significant factors for the acquisition of conceptual skills in persons with visual impairment. On the other hand, the time when vision loss occurred was not a significant factor for acquiring conceptual skills in persons with visual impairment (p=0.195).

The purposes of this study were to examine the diversity of the authors and illustrators in core reading series and to evaluate the opportunities and limitations of these texts in relation to the goals of multicultural education. The authors began their research concerned about whose stories are told in core reading series, and who gets to tell…

In picture books, illustrations often play a critical role in helping authors tell stories. Instruction in the elements of composition including visual, textual, and peritextual features enhances meaning for children when they are given the opportunity to become authors of their own picturebooks. This study was conducted in a fourth grade…

A statistical model for analysis of multiple time-series observation is briefly outlined. The model incorporates a change parameter corresponding to intervention or interruption of the dependent series. The additional time-series are included in the model as covariates. The practical application of the procedure is illustrated with traffic…
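The model described, a change (intervention) parameter with additional time-series included as covariates, can be sketched as a simple regression. The data below are synthetic, and a real analysis would also need to model serial correlation in the errors:

```python
# Minimal sketch of an interrupted time-series model: a level-shift
# (intervention) parameter plus a covariate series, fit by ordinary least
# squares on synthetic data. Real analyses would also model autocorrelated
# errors (e.g., ARIMA disturbances).
import numpy as np

rng = np.random.default_rng(0)
n, t0 = 60, 30                       # 60 time points, intervention at t0
t = np.arange(n)
step = (t >= t0).astype(float)       # 0 before the intervention, 1 after
covariate = rng.normal(size=n)       # an additional explanatory series
y = 10.0 + 0.1 * t - 3.0 * step + 0.5 * covariate + rng.normal(0, 0.3, n)

X = np.column_stack([np.ones(n), t, step, covariate])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
# beta recovers estimates near the true values [10.0, 0.1, -3.0, 0.5];
# beta[2] is the estimated intervention (level-change) effect
```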

An experiment illustrating stereochemical principles, like different physical properties in achiral environments, assignment of absolute stereochemistry, and the stereoisomeric relationships to differences in absolute stereochemistry is devised. A demonstration of how enantiomers have the same physical properties until placed in chiral…

A method for teaching cardiovascular physiology to veterinary students in an integrated systems program is described. The method consists of lectures that discuss illustrations projected overhead. An example of one of the teaching modules and a list of the benefits of this method are included. (JMD)

We present a technique for the illustrative rendering of 3D line data at interactive frame rates. We create depth-dependent halos around lines to emphasize tight line bundles while less structured lines are de-emphasized. Moreover, the depth-dependent halos combined with depth cueing via line width

This book is a brief guide to understanding Islam, intended to help individuals better understand Islamic culture, Muslims, and the Holy Quran. It consists of the following three chapters: (1) Some Evidence for the Truth of Islam; (2) Some Benefits of Islam; and (3) General Information on Islam. The book is extensively illustrated with photographs…

A simple in-class laboratory exercise illustrating the principles of ion exchange chromatography as a bioseparation technique is described. A protein's isoelectric point as a driving force for ion exchange chromatography is easily demonstrated by using combinations of proteins with natural color or fluorescence, such as DsRed2, enhanced green fluorescent…

Picture books can influence how children perceive those from backgrounds and cultures different from their own. Studies have been conducted examining how the text of children's literature portrays multicultural characters or characters with disabilities. However, few have looked specifically at the portrayal of characters through illustrations,…

Dear Editor, A good picture is worth a thousand words. A schematic diagram of protein domain structures with functional motifs/sites in a concise and illustrative drawing is greatly helpful for a broad readership to grasp the old and novel functions of proteins rapidly.

The single-class activity described here: (1) illustrates the importance of interdependence in groups; (2) can be used to measure group productivity and performance; (3) can encourage groups to engage in group learning; and (4) can facilitate group cohesion for newly formed groups. Students will be working in groups for the majority of their…

The devastating power of hurricanes was evident during the 2005 hurricane season, the most active season on record. This has prompted increased efforts by researchers to understand the physical processes that underlie the genesis, intensification, and tracks of hurricanes. This research aims to facilitate an improved understanding of the structure of hurricanes with the aid of visualization techniques. Our approach was developed by a mixed team of visualization and domain experts. To better understand these systems, and to explore their representation in NWP models, we use a variety of illustration-inspired techniques to visualize their structure and time evolution. Illustration-inspired techniques aid in the identification of the amount of vertical wind shear in a hurricane, which can help meteorologists predict dissipation. Illustration-style visualization, in combination with standard visualization techniques, helped explore the vortex rollup phenomena and the mesovortices contained within. We evaluated the effectiveness of our visualization with the help of six hurricane experts. The expert evaluation showed that the illustration-inspired techniques were preferred over existing tools. Visualization of the evolution of structural features is a prelude to a deeper visual analysis of the underlying dynamics.

This article illustrates how literature can bring models to life in undergraduate courses on labor market economics. The authors argue that economics instructors and students can benefit from even small doses of literature. The authors examine excerpts from five American novels: "Sister Carrie" by Theodore Dreiser (1900/2005); "The Grapes of…

Preschoolers' reasons for ranking the beauty of 30 children's book illustrations were investigated through individual interviews. Most frequent response criteria for beauty were associated with familiar objects or surroundings, action, color, clothing or accessories, water or ice, body features, and babies or small things. Children's book…

An Interactive Dichotomous Key (IDK) for 390 "taxa" of vascular plants from the Ria de Aveiro, available on a website, was developed to help teach botany to school and university students. This multimedia tool includes several links to Descriptive and Illustrated Glossaries. Questionnaires answered by high-school and undergraduate students about…

This paper introduces and illustrates Bi-relational Design (BD) as a general approach to (re)solving wicked problems. BD theorises oppositional, equipositional and para-positional approaches to problem-specific dyads (e.g., subjective/objective) based on a general consensus of research on epistemological development. These epistemic positions are…

In this article, the authors present a multilevel (or hierarchical linear) model that illustrates issues in the application of the model to data from meta-analytic studies. In doing so, several issues are discussed that typically arise in the course of a meta-analysis. These include the presence of non-zero between-study variability, how multiple…
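One concrete way to handle the non-zero between-study variability mentioned here is the DerSimonian-Laird random-effects estimator. The sketch below, with invented effect sizes and variances, is a simplified stand-in for the full multilevel model the article discusses:

```python
# Minimal DerSimonian-Laird random-effects sketch: estimates the
# between-study variance tau^2 from effect sizes and within-study variances,
# then forms the pooled estimate. The input numbers are illustrative only.
def dersimonian_laird(effects, variances):
    w = [1.0 / v for v in variances]
    ybar = sum(wi * yi for wi, yi in zip(w, effects)) / sum(w)
    q = sum(wi * (yi - ybar) ** 2 for wi, yi in zip(w, effects))
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (len(effects) - 1)) / c)     # truncated at zero
    w_star = [1.0 / (v + tau2) for v in variances]    # random-effects weights
    pooled = sum(wi * yi for wi, yi in zip(w_star, effects)) / sum(w_star)
    return tau2, pooled

tau2, pooled = dersimonian_laird([0.2, 0.5, 0.8], [0.04, 0.05, 0.04])
```

A tau2 greater than zero signals the between-study heterogeneity that motivates the multilevel treatment in the first place.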

Textbook style problems including detailed solutions introducing pharmaceutical topics at the level of an introductory chemical engineering course have been created. The problems illustrate and teach subjects which students would learn if they were to pursue a career in pharmaceutical engineering, including the unique terminology of the field,…

This manual, designed for teachers of Spanish as a second language, contains visual aids to illustrate certain grammatical concepts and related vocabulary. The manual consists of 30 tear-out blackline masters, each containing one to six separate cartoon pictures, to be used for classroom or homework activities in any sequence. Accompanying each…

This paper discusses the author's experiences developing a Java applet that illustrates how error control is implemented in the Transmission Control Protocol (TCP). One section discusses the concepts which the TCP error control Java applet is intended to convey, while the nature of the Java applet is covered in another section. The author…
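The error-control concepts such an applet conveys can be reduced to a toy stop-and-wait loop. The simulation below is a deliberately simplified sketch of the retransmission idea, not TCP's actual sliding-window algorithm:

```python
# Toy stop-and-wait sketch of the retransmission idea behind TCP error
# control: the sender resends a segment until it is acknowledged. The lossy
# "network" is simulated; real TCP uses sliding windows, cumulative ACKs,
# and adaptive retransmission timeouts.
def send_with_retransmit(segments, lose_first_attempt):
    """Deliver segments over a channel that drops every first attempt when
    lose_first_attempt is True. Returns (delivered, transmissions)."""
    delivered, transmissions = [], 0
    for seq, data in enumerate(segments):
        attempts = 0
        while True:
            transmissions += 1
            attempts += 1
            dropped = lose_first_attempt and attempts == 1
            if not dropped:                 # receiver got it; ACK returns
                delivered.append((seq, data))
                break                       # ACK received; send next segment
            # timeout expires with no ACK -> retransmit same sequence number
    return delivered, transmissions

delivered, tx = send_with_retransmit(["a", "b"], lose_first_attempt=True)
```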

Here is a volume of process equipment and terms in standard Mandarin Chinese, English, and French. As with the English/French/Spanish edition, each page illustrates a particular piece of equipment, with captions identifying the key components. Glossaries at the end of each major section include the Romanized pronunciation of the Chinese.

In a modern programming environment like e.g. MATLAB it is possible by simple commands to perform advanced calculations on a personal computer. In order to use such a powerful tool efficiently it is necessary to have an overview of available numerical methods and algorithms and to know about […]; these are illustrated by examples in MATLAB.
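As a concrete instance of the kind of numerical method such a course surveys, here is a sketch of Newton's method for root finding, written in Python rather than MATLAB:

```python
# Newton's method for root finding, one of the basic numerical methods
# such a course covers. Sketched in Python; the course itself uses MATLAB.
def newton(f, dfdx, x0, tol=1e-10, max_iter=50):
    """Iterate x <- x - f(x)/f'(x) until |f(x)| < tol."""
    x = x0
    for _ in range(max_iter):
        fx = f(x)
        if abs(fx) < tol:
            return x
        x -= fx / dfdx(x)
    raise RuntimeError("no convergence")

# sqrt(2) as the positive root of f(x) = x^2 - 2
root = newton(lambda x: x * x - 2, lambda x: 2 * x, x0=1.0)
```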

Flores-Crespo has written a timely paper, "Education, employment and human development: illustrations from Mexico". Flores-Crespo uses Amartya Sen's ideas to bring a fresh perspective to bear on the relationship between higher education and human development. Although there is growing interest in applying Sen's ideas in a range of…

Even the white paper or canvas is represented on the computer screen as digital layout […] market, Adobe Photoshop version CS4 has been selected for this study. Not that there are not […] This is a flexible technique of producing illustration in […]

This article deals with illustrations in popularizing medical discourse. This visual tool can be considered as a strategy of paraphrasing, since popularizing journalists simplify and make information more explicit by using other semiotic information related to the main scientific content. A comparable corpus composed of the online versions of two American, two French and two Italian magazines of science popularization has been analysed. The role of illustrations, as well as their usefulness in terms of comprehension of the text, has been highlighted through a socio-terminological approach (Gaudin 2003). As far as a typological distinction of the illustrations is concerned, a Peircean classification has been adopted (Fisette 2009; Lathene-Da Cunha 2013). A number of observations emerge from the comparison between the magazines, per type and per language: the article shows that, while some illustrations contribute to a better intelligibility of the article’s content, others influence readers’ sensitivity.

Crystals of naphthalene form on the surface of an acetone solution and dance about in an animated fashion illustrating surface tension, crystallization, and intermolecular forces. Additional experiments reveal the properties of the solution. Flows within the solutions can be visualized by various means. Previous demonstrations of surface motion…

It is hard to imagine tourism without the creative use of both seductive and restrictive imaginaries about peoples and places. This article presents a theoretical framework for the study of tourism imaginaries and their circulation. Where do tourism imaginaries come from, how and why are they circulated across the globe, and what kind of impact do they have on people’s lives? I discuss the multiple links between tourism and the imaginary, illustrating the overlapping but conflicting ways in w...

A thorough review of the q-space technique is presented, starting from a discussion of Fick's laws. The work presented here is primarily conceptual, theoretical and, hopefully, pedagogical. We offer the notion of molecular concentration to unify Fick's laws and diffusion MRI within a coherent conceptual framework. The fundamental relationship between diffusion MRI and Fick's laws is carefully established. The conceptual and theoretical basis of the q-space technique is investigated from first principles.
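For reference, in standard notation (not quoted from the review itself), Fick's laws and the q-space Fourier relationship between the normalized echo signal and the averaged propagator read:

```latex
% Fick's first law (flux), Fick's second law (diffusion equation), and the
% q-space relation: E(q) is the Fourier transform of the propagator P(R).
\mathbf{J} = -D\,\nabla C, \qquad
\frac{\partial C}{\partial t} = D\,\nabla^{2} C, \qquad
E(\mathbf{q}) = \int P(\mathbf{R})\, e^{\,i 2\pi \mathbf{q}\cdot\mathbf{R}}\, d\mathbf{R}
```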

This review takes a critical position with regards to Treagust and Duit's article, Conceptual Change: A discussion of theoretical, methodological and practical challenges for science education. It is proposed that conceptual change research in science education might benefit from borrowing concepts currently being developed in the sociology of emotions. It is further suggested that the study of social interaction within evolving emotional cultures is the most promising avenue for developing and extending theories about conceptual change.

Metaphors can serve as cognitive instruments. In the process of conceptualization, metaphors reflect the way people think and affect the way they perceive things. Conceptual metaphors are often used to help the layman understand abstract and complex political issues better. The author studies Martin Luther King, Jr.'s famous speech "I Have a Dream" from the perspective of cognitive linguistics, and mainly analyzes the conceptual metaphors in this political speech.

National Aeronautics and Space Administration — CADNexus proposes to develop an Integrated Variable Fidelity Conceptual Design tool. The application will enable design and analysis of unconventional and advanced...

Logic components are used to support the conceptual design. Taking the stamping die structure as the research object, several logic components are defined for conceptual die construction design. A new method, logic assembly modeling, is presented to satisfy the characteristics of the top-down die design process. Representing shapes and spatial relations in logic can provide a natural, intuitive method of developing complete computer systems for reasoning about die construction design at the conceptual stage. This method can reflect the designer's thought clearly and provide the designer with a test bed for generating alternatives and performing reasoning work at the conceptual die design stage.

Based on the fact that function-structure generating and function solving are alternant processes with mutual causality during the conceptual design phase of mechatronic systems, a conceptual design cyclic feedback solving model of a mechatronic system is put forward on the basis of mapping between function layer, effect layer, working principle layer and structure layer. The process of solving single and system functions is analysed. Key technologies of interface matching and function solving are then advanced. Finally, a computer-aided conceptual design automatic software system for mechatronic systems is developed and the conceptual design of a computerised embroidery machine is given as an example.

After a short introduction to the present SI (Système International des Unités), this poster presentation presents a digest of the basics of the proposed New SI and of its differences with respect to the present SI. This involves a review of some unresolved problems in the CCU 2016 draft, concerning: the role of the constants of physics in it and their role in the conceptual construction of this international standard; the implications for science of the New SI implementation. Consequences and new duties for the users are illustrated, including a possible hierarchy between countries that the new definition would install.

Knowledge sharing is important because individual knowledge is not transformed into organizational knowledge until it is shared. The conceptual model presents how social factors create the conditions for effective knowledge sharing. It illustrates how three dimensions of social capital interact with each other and with knowledge sharing. Social…

This paper reflects on the socio-political conceptualization of ‘Western Iberia’, one of Rewilding Europe’s first pilot areas. Drawing from Actor Network Theory and social theories discussing Politics of Scale, we illustrate how ‘Western Iberia’ is continuously being negotiated through practices in

Based on the 1979 edition of the "Petit Larousse Illustre," English or American loan words are evaluated as to their actual usage in modern French. It is concluded that anglo-americanisms have a greater impact on technical French than on everyday French. A list of 57 words that have become part of French usage is provided for pedagogical…

The diversity of circumstances and developmental outcomes among Asian American children and youth poses a challenge for scholars interested in Asian American child development. This article addresses the challenge by offering an integrated conceptual framework based on three broad questions: (a) What are theory-predicated specifications of contexts that are pertinent for the development of Asian American children? (b) What are the domains of development and socialization that are particularly relevant? (c) How can culture as meaning-making processes be integrated in conceptualizations of development? The heuristic value of the conceptual model is illustrated by research on Asian American children and youth that examines the interconnected nature of specific features of context, pertinent aspects of development, and interpretive processes.

Changes in immigration law, globalization and increased ease of transportation have transformed modern societies into culturally diverse landscapes with religious diversity, in particular, presenting both opportunities and challenges. The author proposes a conceptual framework that embraces an interpretation of public relations as a social function, a covenantal model as a theoretical ground, an expanded worldview to include tolerance as an essential defining presupposition, and expanded communicative conceptual parameters that include religion in definitions of diversity and generic principles of excellent practice. An anecdotal review of faith communities in the U.S. reveals that public relations professionals and other communicators model the conceptual framework in interfaith initiatives and that the framework would serve as a helpful foundation for guiding communication professionals toward such behaviour. The study also illustrates that socially-responsible behaviour often has a foundation of faith common across various faith traditions.

The development of a toolset, SIMPLI-FLYD ('SIMPLIfied FLight dynamics for conceptual Design') is described. SIMPLI-FLYD is a collection of tools that perform flight dynamics and control modeling and analysis of rotorcraft conceptual designs including a capability to evaluate the designs in an X-Plane-based real-time simulation. The establishment of this framework is now facilitating the exploration of this new capability, in terms of modeling fidelity and data requirements, and the investigation of which stability and control and handling qualities requirements are appropriate for conceptual design. Illustrative design variation studies for single main rotor and tiltrotor vehicle configurations show sensitivity of the stability and control characteristics and an approach to highlight potential weight savings by identifying over-design.

The environmental and economic components of a region or a nation are inextricably linked. Moreover, environmental protection technology must deal specifically with the linkages between the economy and the environment, that is, with by-products of the economy as they move from the economy to the environment or with natural resources as they move from the environment to the economy. Yet, environmental policy analyses are rarely able to focus on these linkages. The author develops a conceptual framework aimed at mitigating that inadequacy. The framework is tied to its theoretical basis in thermodynamics and is utilized to identify generic categories of environmental protection strategies, to identify some disadvantages of current strategies, and to suggest alternatives. 14 references, 4 figures.

The Sierra Toolkit computational mesh is a software library intended to support massively parallel multi-physics computations on dynamically changing unstructured meshes. This domain of intended use is inherently complex due to distributed memory parallelism, parallel scalability, heterogeneity of physics, heterogeneous discretization of an unstructured mesh, and runtime adaptation of the mesh. Management of this inherent complexity begins with a conceptual analysis and modeling of this domain of intended use; i.e., development of a domain model. The Sierra Toolkit computational mesh software library is designed and implemented based upon this domain model. Software developers using, maintaining, or extending the Sierra Toolkit computational mesh library must be familiar with the concepts/domain model presented in this report.

It has for centuries been commonly believed that syllogistic reasoning is an essential part of human rationality. For this reason, Aristotelian syllogistics has since the rise of the European university been a standard component of logic teaching. During the medieval period syllogistic validity […] implemented using the Prolog programming language as embodied in a Java implementation called Prolog+CG. The aim of the present paper is to study some interesting conceptual aspects of syllogistic reasoning and to investigate whether the CG formalism can be helpful in order to obtain a better understanding of syllogistic reasoning in general, and of the system of Aristotelian syllogisms conceived as a deductive (axiomatic) structure in particular. Some prototypes of tools for basic logic teaching have been developed using Prolog+CG, and various preliminary tests of the tools have been carried out.

In this paper we rethink the conventional ways of explaining the change process of new company formation. We base our analysis on two well-established and dominating categories of entrepreneurship models, stage-inspired models and interactive contingency models, and argue that these do not sufficiently conspire to capture the entrepreneurial start-up process as an everyday phenomenon of multi-dimensional individual, social and environmental interaction. In an effort to address this hypothesized theoretical gap, we apply ideas originating from Symbolic Interactionism to suggest a complementary conceptual model for comprehending the entrepreneurial process as an interactive construct. From here the idea of entrepreneurship as an on-going "Social Journey of Opportunity Construction" arises. We argue that this idea has potential impact on the practice of research, since it encourages scholars to step…

Presently the research on ageism is marked by numerous more or less diffuse definitions of the concept of ageism. Many studies investigate both the causes and consequences of ageism without a clear definition of the phenomenon. As a consequence the area is characterized by diverging research results which are hard to re-test and to compare. It is therefore difficult to obtain a framework on ageism. This article offers a conceptual clarification of ageism. Based on a review of the existing literature, a new definition of ageism is introduced. This definition is more explicit and complex than previous definitions. This has two purposes. Firstly, its clarity constitutes the foundation for higher reliability and validity in future research about ageism. Secondly, its complexity offers a new way of systemizing theories on ageism.

This report, which summarizes the result of preliminary conceptual design activities during Phase 1, follows the format of safety analysis report. The purpose of publishing this report is to gather all of the design information developed so far in a systematic way so that KALIMER designers have a common source of the consistent design information necessary for their future design activities. This report will be revised and updated as design changes occur and more detailed design specification is developed during Phase 2. Chapter 1 describes the KALIMER Project. Chapter 2 includes the top level design requirements of KALIMER and general plant description. Chapter 3 summarizes the design of structures, components, equipment and systems. Specific systems and safety analysis results are described in the remaining chapters. Appendix on the HCDA evaluation is attached at the end of this report.

Aesthetics and functional significance of the cerebral cortical relief gave us the idea to find out how often the convolutions are presented in fine art, and in which techniques, conceptual meaning and pathophysiological aspect. We examined 27,614 art works created by 2,856 authors and presented in art literature, and in Google images search. The cerebral gyri were shown in 0.85% of the art works created by 2.35% of the authors. The concept of the brain was first mentioned in ancient Egypt some 3,700 years ago. The first artistic drawing of the convolutions was made by Leonardo da Vinci, and the first colour picture by an unknown Italian author. Rembrandt van Rijn was the first to paint the gyri. Dozens of modern authors, who are professional artists, medical experts or designers, presented the cerebral convolutions in drawings, paintings, digital works or sculptures, with various aesthetic, symbolic and metaphorical connotation. Some artistic compositions and natural forms show a gyral pattern. The convolutions, whose cortical layers enable the cognitive functions, can be affected by various disorders. Some artists suffered from those disorders, and some others presented them in their artworks. The cerebral convolutions or gyri, thanks to their extensive cortical mantle, are the specific morphological basis for the human mind, but also the structures with their own aesthetics. Contemporary authors relatively often depict or model the cerebral convolutions, either from the aesthetic or conceptual aspect. In this way, they make a connection between neuroscience and fine art.

Recently there has been great interest in possible mediators and moderators of family history risk for alcoholism. However, previous studies have failed to employ appropriate designs and data analytic strategies to identify moderators and mediators. This article uses a large data set to illustrate such analyses. In the current data, both presumed personality risk and dispositional self-awareness were found to play moderator (rather than mediator) roles. The conceptual, methodological and data analytic implications of the mediator-moderator distinction are discussed.
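The moderator analysis described corresponds statistically to testing an interaction term. The synthetic-data sketch below (the variable names are illustrative, not the study's actual measures) shows how a moderation effect appears as the interaction coefficient in a regression:

```python
# Minimal sketch of the moderator idea: a moderator changes the *strength*
# of the risk -> outcome relation, which shows up as an interaction term.
# Synthetic data; variable names are illustrative only.
import numpy as np

rng = np.random.default_rng(1)
n = 500
risk = rng.normal(size=n)            # e.g., family history risk
moderator = rng.normal(size=n)       # e.g., dispositional self-awareness
# outcome depends on risk more strongly when the moderator is high:
outcome = (1.0 * risk + 0.5 * moderator + 0.8 * risk * moderator
           + rng.normal(0, 0.5, n))

X = np.column_stack([np.ones(n), risk, moderator, risk * moderator])
beta, *_ = np.linalg.lstsq(X, outcome, rcond=None)
# beta[3] estimates the interaction (moderation) effect, near 0.8
```

A mediator, by contrast, would be modeled as an intermediate variable on the causal path, not as an interaction term.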

We present a general method for describing the conceptual neighborhood structure of qualitative relations in temporal and spatial reasoning. This method is based on representing the topological structure of the configuration spaces in terms of incidence structures. We illustrate this approach on some of the qualitative calculi introduced in the literature, including Allen’s temporal calculus and Freksa’s directional calculus.
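As a concrete anchor for the qualitative calculi mentioned, the sketch below classifies two time intervals into one of Allen's thirteen relations from endpoint comparisons; the conceptual neighborhood structure then links relations reachable from one another by continuously deforming the endpoints:

```python
# Minimal sketch of Allen's thirteen interval relations, determined purely
# by qualitative endpoint comparisons.
def allen_relation(a, b):
    """Return Allen's relation of interval a=(a1,a2) to b=(b1,b2), with a1<a2, b1<b2."""
    (a1, a2), (b1, b2) = a, b
    if a2 < b1: return "before"
    if a2 == b1: return "meets"
    if b2 < a1: return "after"
    if b2 == a1: return "met-by"
    if a1 == b1 and a2 == b2: return "equal"
    if a1 == b1: return "starts" if a2 < b2 else "started-by"
    if a2 == b2: return "finishes" if a1 > b1 else "finished-by"
    if b1 < a1 and a2 < b2: return "during"
    if a1 < b1 and b2 < a2: return "contains"
    return "overlaps" if a1 < b1 else "overlapped-by"
```

For example, "meets" and "overlaps" are conceptual neighbors: sliding the first interval's endpoint continuously past the second's start point moves between them.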

Scientific and Technical Reports: How to Write and Illustrate provides step-by-step advice on tackling various tasks associated with report writing like gathering information, analyzing information, preparing an outline, writing a rough draft and revising. Many examples illustrate the processes involved at various steps. A stepwise approach to computer-assisted preparation of tables and various types of figures like line drawings, bar charts, histograms, flowcharts, etc., is provided. Also presented are suggestions about how to use commonly available computer programs to give visual shape to ideas, concepts, processes and cause and effect relations described in the text. Use of readability tests is explained as a screening system for checking comprehensibility of language used. Readers are alerted to some of the common pitfalls in science writing like redundancy, overuse of nouns, noun chains, excessive use of passive voice, use of overlong sentences and ambiguity. A checklist at the end of each chapter sums up...

This work, mostly based on the Semiolinguistic Theory of Discourse Analysis, intends to discuss aspects of the verbal-visual semiosis in illustrated books regarding the process of pathemization, that is, the triggering of emotions from descriptive staging. It is assumed that pathemization, of an intentional nature, is activated in the text-reader interaction due to a manifested discursive planning in (verbal and visual) representations endowed with cultural valuation prone to reactive emotions. The observed complementarity between word and image in the analyzed illustrated books allows for not only a higher density of meaning, but also for the enhancement of qualities and categories that are not always “signifiable.” The symbolization by analogy updated in the images (visual or metaphorical), or in the superposition of both, complexifies signification, allowing not only for effects of meaning, but also for felt effects, stemming from knowledge and beliefs on which representations are founded, whether verbally or visually configured.

Throughout history, illustrations had played a key role in the promotion and evolution of medicine by providing a medium for transmission of scientific observations. Due to religious prohibitions, color drawings of the human body did not appear in medieval Persia and during the Islamic Golden Age. This tradition, however, has been overlooked with the publication of the first color atlas and text of human anatomy, Tashrihi Mansuri (Mansur's Anatomy), by Mansur ibn Ilyas in the fourteenth century AD. Written in Persian and containing several vivid illustrations of the human body, this book gained widespread attention by both scholars and lay persons. In this article, a brief history of Mansur's Anatomy and an English translation of selected sections from this book regarding the heart and blood vessels are presented.

We have written a series of interactive demonstrations and simulations for introductory electricity and magnetism. These programs are written in the Java (TM) language and are delivered via the World Wide Web to students either in the classroom or at home. The combination of such interactive illustrations with the Web's hypermedia capability is of significant value in the creation of useful, network-distributable courseware. We are using these applets at Rensselaer and are evaluating their effectiveness as components of the instruction of Studio Physics II (Introduction to Electricity and Magnetism). Two of the applets allow the student to explore two-dimensional electric and magnetic fields by drawing field lines and equipotentials, evaluating divergence and curl, and calculating the loop and surface integrals of Maxwell's laws. Another applet illustrates Snell's law of refraction, and another is an optical bench with movable lenses and a movable object.
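The refraction applet just described reduces to a single relation, n1 sin(t1) = n2 sin(t2). As an illustrative sketch only (the applets themselves are Java, and this helper function is ours, not part of the courseware), the computation in Python:

```python
import math

def refract(theta1_deg, n1, n2):
    """Refraction angle (degrees) from Snell's law n1*sin(t1) = n2*sin(t2);
    returns None when total internal reflection occurs."""
    s = n1 * math.sin(math.radians(theta1_deg)) / n2
    if abs(s) > 1.0:
        return None  # no transmitted ray: total internal reflection
    return math.degrees(math.asin(s))

# Light entering glass (n ~ 1.5) from air at 30 degrees bends toward the normal:
print(refract(30.0, 1.0, 1.5))   # ~19.47 degrees
# Leaving glass at a steep 60 degrees: totally internally reflected.
print(refract(60.0, 1.5, 1.0))   # None
```

The `None` return makes the critical-angle case explicit, which is exactly the behavior such an applet has to visualize.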

It is easy to construct classical two-state systems illustrating the behavior of the short-lived and long-lived neutral $K$ mesons in the limit of CP conservation. The emulation of CP violation is trickier, but is provided by the two-dimensional motion of a Foucault pendulum. Analogies are drawn between the pendulum and observables in neutral kaon decays. An emulation of CP and CPT violation using electric circuits is also discussed.
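The two-state system at issue can be sketched numerically: in the CP-conserving limit the short- and long-lived eigenstates are equal and opposite mixtures of $K^0$ and $\bar K^0$, so a state born as $K^0$ oscillates in strangeness while both components decay at very different rates. A minimal NumPy illustration, with purely illustrative (non-physical) masses and decay rates of our own choosing:

```python
import numpy as np

# Illustrative parameters in arbitrary units: the mass splitting m_L - m_S
# drives the oscillation; K_S decays far faster than K_L.
m_S, m_L = 1.0, 1.005
g_S, g_L = 1.0, 0.002

def amplitudes(t):
    """Complex amplitudes of the K_S and K_L components of a state born as K0."""
    aS = 0.5 * np.exp((-1j * m_S - 0.5 * g_S) * t)
    aL = 0.5 * np.exp((-1j * m_L - 0.5 * g_L) * t)
    return aS, aL

def p_K0(t):
    """Probability of observing the state as K0 at time t."""
    aS, aL = amplitudes(t)
    return abs(aS + aL) ** 2

def p_K0bar(t):
    """Probability of observing the state as K0-bar at time t."""
    aS, aL = amplitudes(t)
    return abs(aS - aL) ** 2

print(p_K0(0.0))     # 1.0: the state starts as a pure K0
print(p_K0bar(0.0))  # 0.0
```

At late times only the long-lived component survives, and the two probabilities approach each other, the classical analogue of which is the slowly precessing pendulum mode.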

The main aim of this paper is to explore the link between poverty and inequality. For developing countries, there is a general consensus that high inequality can significantly dampen the impact of economic performance on poverty. In this paper, we propose a new theoretical framework linking poverty and inequality. We also show how between- and within-group inequalities, as well as inequality in income sources, can contribute to total poverty. The methodology of the paper is illustrated using the...
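The between/within-group split mentioned above is standard for additively decomposable inequality measures such as the Theil T index: total inequality equals the income-share-weighted sum of group Theil indices plus a between-group term computed from group means. A minimal sketch (the function names and toy data are ours, not the paper's framework):

```python
import numpy as np

def theil(x):
    """Theil T index of an array of positive incomes: mean of (x/mu)*ln(x/mu)."""
    x = np.asarray(x, dtype=float)
    r = x / x.mean()
    return float(np.mean(r * np.log(r)))

def theil_decomposition(groups):
    """Split total Theil T into (within, between) parts over a list of
    per-group income arrays; the two parts sum to the overall index."""
    all_x = np.concatenate(groups)
    mu = all_x.mean()
    within = between = 0.0
    for g in groups:
        share = g.sum() / all_x.sum()            # group's share of total income
        within += share * theil(g)               # inequality inside the group
        between += share * np.log(g.mean() / mu)  # inequality across group means
    return within, between

a = np.array([1.0, 1.0])   # poor group, internally equal
b = np.array([2.0, 2.0])   # rich group, internally equal
w, btw = theil_decomposition([a, b])
# w = 0 here: all measured inequality is between the two groups.
```

Because each group is internally equal in the toy data, the within term vanishes and the between term reproduces the total index exactly.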

A circular phylogeny with photos or drawings of species is named a phylogeny mandala. This is one of the ways for illustrating the Tree of Life, and is suitable to show visually how the biodiversity has developed in the course of evolution as clarified by the molecular phylogenetics. To demonstrate the recent progress of molecular phylogenetics, six phylogeny mandalas for various taxonomic groups of life were presented; i.e., (1) Eukaryota, (2) Metazoa, (3) Hexapoda, (4) Tetrapoda, (5) Eutheria, and (6) Primates.

This paper describes methods for explanatory and illustrative visualizations used to communicate aspects of Einstein's theories of special and general relativity, their geometric structure, and of the related fields of cosmology and astrophysics. Our illustrations target a general audience of laypersons interested in relativity. We discuss visualization strategies, motivated by physics education and the didactics of mathematics, and describe what kind of visualization methods have proven to be useful for different types of media, such as still images in popular science magazines, film contributions to TV shows, oral presentations, or interactive museum installations. Our primary approach is to adopt an egocentric point of view: The recipients of a visualization participate in a visually enriched thought experiment that allows them to experience or explore a relativistic scenario. In addition, we often combine egocentric visualizations with more abstract illustrations based on an outside view in order to provide several presentations of the same phenomenon. Although our visualization tools often build upon existing methods and implementations, the underlying techniques have been improved by several novel technical contributions like image-based special relativistic rendering on GPUs, special relativistic 4D ray tracing for accelerating scene objects, an extension of general relativistic ray tracing to manifolds described by multiple charts, GPU-based interactive visualization of gravitational light deflection, as well as planetary terrain rendering. The usefulness and effectiveness of our visualizations are demonstrated by reporting on experiences with, and feedback from, recipients of visualizations and collaborators.
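One core ingredient of the image-based special relativistic rendering mentioned above is the aberration of light: each incoming ray direction is remapped according to the observer's velocity, compressing the visual field toward the direction of motion. As a hedged sketch (this standalone function is illustrative and is not the authors' GPU implementation), the standard aberration formula in Python:

```python
import math

def aberrate(theta_deg, beta):
    """Apparent arrival angle (degrees, measured from the direction of motion)
    of a light ray for an observer moving at speed beta (in units of c),
    given the rest-frame angle theta_deg:
        cos(t') = (cos(t) + beta) / (1 + beta*cos(t))
    """
    c = math.cos(math.radians(theta_deg))
    return math.degrees(math.acos((c + beta) / (1 + beta * c)))

# At beta = 0.9, a ray arriving from 90 degrees off-axis in the rest frame
# appears at roughly 26 degrees: the "headlight" compression of the scene.
print(aberrate(90.0, 0.9))
```

Applying this remapping per pixel to a precomputed environment image is the essence of the image-based approach; the GPU versions in the paper additionally handle Doppler shift and intensity changes.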

Conceptual modelling is the addition of more real-world semantics to the computations performed by a computer. It is argued that in a proper engineering approach to computing, three kinds of conceptual modelling need to be distinguished, (1) modelling a software solution, (2) modelling the domain in

A new conceptual data model that addresses the geometric dimensioning and tolerancing concepts of datum systems, datums, datum features, datum targets, and the relationships among these concepts, is presented. Additionally, a portion of a related data model, Part 47 of STEP (ISO 10303-47), is reviewed and a comparison is made between it and the new conceptual data model.

A study investigated children's difficulty in learning color words and attempted to determine whether the difficulty was perceptual, conceptual, or linguistic. The subjects were 24 two-year-olds, half with knowledge of color words and half without, and a similar control group. The experimental subjects were given conceptual and comprehension tasks…

and research. Forcepad is an effort to provide a conceptual design and teaching tool in a finite-element software package. Forcepad is a two-dimensional finite-element application based on the same conceptual model as image editing applications such as Adobe Photoshop or Microsoft Paint. Instead of using...

Purpose: The main purpose of the present paper is to elaborate a conceptualization of the entrepreneurial university appropriate for developing countries: one that distinguishes among the different elements of entrepreneurial universities in developing countries and identifies those they have in common. This conceptualization…

Organizes a reading of the conceptual change literature that brings into view a collection of design specifications for a conceptual change apparatus. Analyzes one such apparatus in the particulars of a science education demonstration program produced by the Harvard-Smithsonian Private Universe Project. (Contains 114 references.) (Author/WRM)

UML class diagrams can be used as a language for expressing a conceptual model of a domain. We use the General Ontological Language (GOL) and its underlying upper level ontology, proposed in [1], to evaluate the ontological correctness of a conceptual UML class model and to develop guidelines for

The present study used the Cognitive Reconstruction of Knowledge Model (CRKM) model of conceptual change as a framework for developing and testing how key cognitive, motivational, and emotional variables are linked to conceptual change in physics. This study extends an earlier study developed by Taasoobshirazi and Sinatra ("J Res Sci…

This review takes a critical position with regard to Treagust and Duit's article, "Conceptual Change: A Discussion of Theoretical, Methodological and Practical Challenges for Science Education." It is proposed that conceptual change research in science education might benefit from borrowing concepts currently being developed in the sociology of…

We engage in a metalogue based on eight papers in this issue of "Cultural Studies of Science Education" that review the state of conceptual change research and its possible effect on the teaching and learning of science. Our discussion addresses three aspects of conceptual change research: theoretical, methodological, and practical, as we discuss…

Teaching strategies associated with a conceptual change model of science teaching were examined in a study of seventh-grade life science teachers (n=13). Greater use of conceptual change teaching strategies was associated with use of the special instructional materials, not with the training. Students in classes where teachers were provided with…

Science students find heat, temperature, enthalpy, and the energy of chemical reactions to be among the most difficult subjects. It is crucial to determine students' level of conceptual understanding of these subjects so that educators can build upon this knowledge when introducing new thermodynamics concepts. This paper reports the conceptual understanding levels of…

Based on extenics, an extensive functional information model (function, behavioral action, structure, environmental constraint) for the intelligent conceptual design of mechanical products is developed, and the mechanism for generating theoretical structure solutions is produced; the mapping relations between function and behavior and between behavior and structure are analyzed. The model is applied to the conceptual design of a filling-material system to verify its validity.

Presents a critical analysis of how cultivation (the long-term formation of perceptions and beliefs about the world as a result of exposure to media) has been conceptualized in theory and research. Analyzes the construct of television exposure. Suggests revisions for conceptualizing the existing theory and extending it. (RS)

The books telling the story of Montserrat used to contain illustrative engravings. Montserrat presents a special case, because this monastery holds an uninterrupted record of illustrated texts, or of texts with engraved images, from the miniatures of the Llibre Vermell and the first fruits of the printing press to the 19th century. Admittedly, in the 16th, 17th and 18th centuries the books exhibit their engravings on the cover or within the first sheets; only in the 19th century do illustrations appear inserted in the text so as to make the story more agreeable. The images intensify the value of the texts, since illustrations can help us understand the written word and may suggest other, clearer and more precise readings of the corresponding passages. Illustrations in books on the history of a monastery are but one part of a larger set of ideas, projects and self-presentation that the Church directs at the reader. Furthermore, the iconographic models of the Virgin Mary and the Mountain are of great interest for the history of engraving and of Catalan and Spanish devotions.