Over the last 17 years, scientists and engineers have developed synthetic gene circuits that can program the functionality, performance, and behavior of living cells. Analogous to integrated circuits that underlie myriad electronic products, engineered gene circuits can be used to generate defined dynamics, rewire endogenous networks, sense environmental stimuli, and produce valuable biomolecules.

These gene circuits hold great promise in medical and biotechnological applications, such as combating super bugs, producing advanced biofuels, and manufacturing functional materials.

To date, most circuits have been constructed through trial and error, an approach that relies heavily on a designer's intuition and is often inefficient, said University of Illinois Bioengineering Associate Professor Ting Lu. "With the increase of circuit complexity, the lack of predictive design guidelines has become a major challenge in realizing the potential of synthetic biology," said Lu, who also is affiliated with the Carl R. Woese Institute for Genomic Biology and the Physics Department at Illinois.

Researchers have turned to quantitative modeling to address this gene circuit design challenge. Typical models regard gene circuits as isolated entities that do not interact with their hosts and focus only on the biochemical processes within the circuits, noted Lu.

"Although highly valuable, the current modeling paradigm is often incapable of describing circuit behaviors quantitatively, or sometimes even qualitatively," he said. "Increasing experimental evidence has suggested that circuits and their biological hosts are intimately connected and that their coupling can impact circuit behaviors significantly."

Lu and his graduate students, Chen Liao and Andrew Blanchard, recently addressed the challenge by constructing an integrated modeling framework for quantitatively describing and predicting gene circuit behaviors. The framework uses Escherichia coli (E. coli) as a model host and consists of a coarse-grained but mechanistic description of host physiology with dynamic resource partitioning, multi-layered circuit-host coupling, and a detailed kinetic module of exogenous circuits.

The team demonstrated that, after training, the framework captured and predicted a large set of experimental data concerning the host and simple gene overexpression. For instance, they discovered that ppGpp-mediated effects are key to understanding constitutive gene expression under environmental changes, including both nutrient and antibiotic changes. The team also demonstrated the utility of the platform by applying it to examine a growth-modulating feedback circuit whose dynamics are qualitatively altered by circuit-host coupling, and to reveal the behaviors of a toggle switch across scales, from single-cell dynamics to population structure to spatial ecology.
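The qualitative flavor of such circuit-host coupling can be seen in a toy resource-partitioning model (a deliberately simplified sketch of the general idea, not the team's actual equations; the saturation form and parameter values below are assumptions): expressing a circuit protein sequesters a share of the host's gene-expression resources, which slows growth, which in turn changes how quickly the protein is diluted.

```python
# Toy circuit-host coupling model (illustrative only; not the paper's
# equations). The host grows at lam = LAM_MAX * (1 - phi), where phi is
# the fraction of expression resources sequestered by the circuit, and
# circuit protein p is removed by growth-dependent dilution.
LAM_MAX = 2.0   # maximum host growth rate (1/h), assumed
K = 1.0         # half-saturation of resource sequestration, assumed

def simulate(alpha, t_end=50.0, dt=0.01):
    """Euler-integrate circuit protein p; return (p, host growth rate)."""
    p = 0.0
    for _ in range(int(t_end / dt)):
        phi = p / (K + p)                          # resource share taken by circuit
        lam = LAM_MAX * (1.0 - phi)                # growth slows under burden
        p += dt * (alpha * (1.0 - phi) - lam * p)  # synthesis minus dilution
    return p, LAM_MAX * (1.0 - p / (K + p))

p_low, lam_low = simulate(alpha=0.5)    # weak induction
p_high, lam_high = simulate(alpha=5.0)  # strong induction
# Stronger overexpression yields more circuit protein but a slower host.
```

Raising the induction strength alpha increases circuit output while depressing growth, exactly the kind of two-way feedback that models treating circuits as isolated entities miss.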

Although Lu's framework was established using E. coli as the model host, it has the potential to be generalized for describing multiple host organisms. "For example, we found that, by varying only a single parameter, the framework successfully predicted several key host metrics, including RNA-to-protein ratio, RNA contents per cell, and mean peptide elongation rate, for Salmonella typhimurium and Streptomyces coelicolor," said Lu.

According to Lu, this work advances the quantitative understanding of gene circuit behaviors and facilitates the transformation of gene network design from trial-and-error construction to rational forward engineering. By systematically illustrating key cellular processes and multi-layered circuit-host interactions, it also pushes quantitative biology toward a better understanding of complex bacterial physiology.

As the U.S. electricity grid continues to modernize, it will bring better reliability and resilience, lower environmental impacts, greater integration of renewable energy, and new computing and communications technologies to monitor and manage the increasing number of devices that connect to the grid. However, that enhanced connectivity for grid operators and consumers also opens the door to potential cyber intrusions.

As part of the Department of Energy's (DOE's) commitment to building cyber-resilient energy delivery systems, a new project led by Lawrence Berkeley National Laboratory (Berkeley Lab) will develop tools to detect and counter certain types of cyber attacks on the grid. The project has been awarded up to $2.5 million in funding over three years by DOE, one of 20 projects for cybersecurity on the grid announced recently.

With rooftop solar panels surging in popularity in the U.S. – growing from 30,000 homes in 2006 to more than 1 million last year – Berkeley Lab's project focuses on solar inverters, devices that turn the direct current from rooftop solar panels into alternating current that is fed back onto the grid. So-called "smart inverters" can enhance overall system reliability and reduce operational costs.
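Taking the article's figures at face value (about 30,000 homes in 2006 and 1 million a decade later; the exact ten-year window is an assumption based on "last year"), the implied average annual growth rate is a one-liner to check:

```python
# Back-of-envelope growth rate for rooftop-solar homes, using the
# article's figures (endpoints assumed to span 2006-2016).
homes_2006 = 30_000
homes_2016 = 1_000_000
cagr = (homes_2016 / homes_2006) ** (1 / 10) - 1
print(f"{cagr:.0%}")  # → 42%
```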

Industry and government are now developing standards for how solar inverters communicate with the grid so that the photovoltaic (PV) modules can adjust their power levels accordingly.

"It is this standardization that presents a vulnerability," said Daniel Arnold, a Berkeley Lab researcher and engineer who is one of the leads of the project. "As we modernize the grid, our belief is that we, as a society, can enjoy all of the benefits of large amounts of distributed PV, such as reduced greenhouse gas emissions and a more resilient system, and still have a secure network that is potentially more robust to cyber intrusions than it was before the introduction of large amounts of distributed PV."

In this project, Berkeley Lab will develop algorithms that essentially use the system the same way hackers might, but send opposite signals to nullify an attack, much as noise-canceling headphones do. "If an attacker tries to manipulate the settings in a number of PV inverters, we'll observe these manipulations, then identify the settings in PV inverters that have not been hacked, and finally, dispatch the appropriate settings to the inverters deemed safe in order to counter that attack," said Arnold, a researcher in Berkeley Lab's Grid Integration Group.
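A toy version of that observe-identify-dispatch loop might look like the following (a hypothetical illustration of the concept only, not Berkeley Lab's algorithm; the even-redistribution rule is a simplification):

```python
# Hypothetical "noise-canceling" counter-dispatch: flag inverters whose
# measured output deviates from the commanded setpoint, then retune the
# trusted inverters so the feeder's total output returns to target.
def counter_attack(setpoints, measured, target_total, tol=1e-6):
    hacked = [i for i, (s, m) in enumerate(zip(setpoints, measured))
              if abs(s - m) > tol]                 # observe manipulations
    trusted = [i for i in range(len(setpoints)) if i not in hacked]
    deficit = target_total - sum(measured[i] for i in hacked)
    new = list(measured)
    for i in trusted:                              # dispatch compensation
        new[i] = deficit / len(trusted)            # spread evenly (toy rule)
    return hacked, new

hacked, new = counter_attack(
    setpoints=[1.0, 1.0, 1.0, 1.0],
    measured=[1.0, 1.0, 1.8, 0.2],  # inverters 2 and 3 manipulated
    target_total=4.0)
# The two trusted inverters are re-dispatched so the feeder total is
# back at the 4.0 target despite the manipulated units.
```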

The concept is based on watching for irregularities in the physical behavior of the grid. "There are laws that govern the way the power grid operates from a physical perspective," said Sean Peisert, a cybersecurity expert in Berkeley Lab's Computational Research Division and the principal investigator on the project. "So we leverage those insights to understand the ways in which hackers might attempt to do something to the grid."

Ultimately, the algorithms will be able to monitor the grid and provide advance warning to a utility operator of a possible emerging attack.

An end-to-end set of project partners

Berkeley Lab is teaming with a slate of partners on this project, including OSISoft, SunSpec Alliance, SolarEdge, HDPV Alliance, Power Standards Lab, Arizona State University, Siemens, the National Rural Electric Cooperative Association, and the Sacramento Municipal Utility District.

"Our partners are helping to inform what we're doing and make sure it's realistic and meaningful in a practical engineering sense," Peisert said. "We have pretty much everyone who you might want as a partner on this – from the utilities themselves to manufacturers, to the company collecting all the data from the equipment and providing it to the utilities. It's really an end-to-end set of partners."

SunSpec Alliance is a global industry alliance working on information and communication standards for distributed energy resources such as solar PV and battery-based systems. "As the distributed energy grid takes shape, cybersecurity risks are increasing," said Tom Tansy, chairman of SunSpec. "The work that will take place in this program leverages best practices and standards, developed by SunSpec and others, and takes them to the next level by providing sophisticated technology to maintain and enhance grid security."

Two additional projects for grid resiliency

Separately, Berkeley Lab is contributing to two other projects in DOE's Grid Modernization Initiative. It will partner with SLAC National Accelerator Laboratory on the Grid Resilience and Intelligence Project, or GRIP, which has been awarded up to $6 million for more than three years. GRIP will combine artificial intelligence with massive amounts of data and know-how from a dozen other partners to identify places where the electric grid may be vulnerable to disruption, shore up those spots in advance, and get things up and running faster when failures do occur.

"Berkeley Lab has pioneered the development of algorithms that can optimally manage distributed energy resources, like wind, solar and batteries, and are completely plug and play," said Arnold, who is leading the Berkeley Lab part of GRIP. "In this project, we're partnering with SLAC to deploy and test our approach in a real utility network. With these algorithms, we hope to be able to create an electric grid that can use distributed energy resources to automatically reconfigure itself to maximize reliability during normal operations or emergencies."

Berkeley Lab is also contributing to the Laboratory Value Analysis Team, led by Pacific Northwest National Laboratory. Using metrics developed by DOE's Grid Modernization Lab Consortium, the analysis team will assess and determine the economic value of the project impacts and validate the technologies and operations strategies in each project before and after deployment.

###

Lawrence Berkeley National Laboratory addresses the world's most urgent scientific challenges by advancing sustainable energy, protecting human health, creating new materials, and revealing the origin and fate of the universe. Founded in 1931, Berkeley Lab's scientific expertise has been recognized with 13 Nobel Prizes. The University of California manages Berkeley Lab for the U.S. Department of Energy's Office of Science. For more, visit http://www.lbl.gov.

DOE's Office of Science is the single largest supporter of basic research in the physical sciences in the United States, and is working to address some of the most pressing challenges of our time. For more information, please visit science.energy.gov.

ARLINGTON, Va. — Sept. 26, 2017 — Research!America's 22nd annual Advocacy Awards will pay special tribute to The Honorable John Edward Porter, Research!America Chair Emeritus, for his decades-long commitment to advancing medical and health research. Porter served as a U.S. Congressman from the 10th district in Illinois from 1980 to 2001, during which time he chaired the House Appropriations Subcommittee on Labor, Health and Human Services, Education and Related Agencies. The event will take place on Wednesday, March 14, 2018, in Washington, D.C., where Porter will receive the Research!America 2018 Legacy Award, thereafter to be named in his honor.

"John Porter is the epitome of a champion and highly influential leader for medical and health research," said The Honorable Michael N. Castle, chair, Research!America. "Throughout his long and stellar career, he has worked tirelessly to help speed discoveries in medicine and improve the lives of millions of Americans. We are pleased to celebrate his many accomplishments by honoring him with this award and know it will inspire many others to become champions for research."

Other 2018 Advocacy Award winners announced today, with additional awardees to be named in the coming weeks, are Roger I. Glass, M.D., Ph.D., director of the Fogarty International Center and associate director for international research at the National Institutes of Health (NIH), recipient of the Geoffrey Beene Builders of Science Award; Shari and Garen Staglin, founders of One Mind, One Mind Institute (IMHRO), and Bring Change2Mind, recipients of the Gordon and Llura Gund Leadership Award; EveryLife Foundation for Rare Diseases, recipient of the Paul G. Rogers Distinguished Organization Advocacy Award; and Peter J. Hotez, M.D., Ph.D., dean of the National School of Tropical Medicine and professor of pediatrics and molecular virology and microbiology at Baylor College of Medicine, recipient of the Raymond and Beverly Sackler Award for Sustained National Leadership. The annual Research!America Advocacy Awards program was established by the Board of Directors in 1996 to honor outstanding advocates for medical, health and scientific research.

"The 2018 honorees are exceptional in their commitment to bolstering our nation's scientific enterprise, both at home and abroad, leading and championing innovative research and the patient voice to raise awareness of the importance of medical research," said Mary Woolley, president and CEO, Research!America. "We applaud their achievements in advancing life-saving medical and health discoveries to improve the lives of people worldwide."

The Honorable John Edward Porter has served as a member of the Research!America board since 2001, 12 of those years as chair and now chair emeritus. Porter was a U.S. Congressman from the 10th district in Illinois for 21 years, serving on the U.S. House Committee on Appropriations and chairing the House Appropriations Subcommittee on Labor, Health and Human Services, Education and Related Agencies, which allocates funding for the NIH and other health-related federal agencies. He was instrumental in doubling the funding for the NIH over five years. In 2014, the John Edward Porter Neuroscience Research Center, a state of the art facility at the NIH, was dedicated in his name. Porter is also the recipient of the National Academy of Sciences Public Welfare Medal, the Academy's highest honor. He is a senior advisor in the international law firm Hogan Lovells LLP.

Roger I. Glass, M.D., Ph.D., who will receive the Geoffrey Beene Builders of Science Award, is director of the Fogarty International Center and associate director for international research at the NIH. In these roles, Glass oversees an extensive portfolio of grants and awards that support training of global health researchers and facilitates NIH's research and training partnerships abroad. Dr. Glass's research expertise is in the prevention of gastroenteritis from rotaviruses, noroviruses and cholera. He has maintained field studies in India, Bangladesh, Brazil, Mexico, Israel, Russia, Vietnam, China and elsewhere, and created a team of epidemiologists and virologists that spearheaded global efforts to research and introduce rotavirus vaccine worldwide. Dr. Glass is the recipient of numerous awards, including the prestigious Charles C. Shepard Lifetime Scientific Achievement Award, presented by the Centers for Disease Control and Prevention for his 30-year career as a leader in scientific research, and the 2015 Albert B. Sabin Gold Medal Award for his novel scientific research in the prevention of gastroenteritis from rotaviruses and noroviruses.

Shari and Garen Staglin, founders of the Staglin Family Vineyard, will receive the Gordon and Llura Gund Leadership Award for their commitment to accelerating cures for brain disorders through scientific research. Shari and Garen have actively given back to the community and supported charitable causes for 45 years. Their focus on brain health research is the result of their son Brandon's diagnosis of schizophrenia in 1990. Brandon is now director of marketing and communications at One Mind Institute, and his sister Shannon is president of the Staglin Family Vineyard. The Staglins founded One Mind, One Mind Institute and Bring Change2Mind to address brain disorders and stigma. For the last 23 years, their annual Music Festival for Brain Health, along with their other advocacy efforts, has raised over $280 million for brain health research.

The EveryLife Foundation for Rare Diseases has been selected to receive the Paul G. Rogers Distinguished Organization Advocacy Award. Rare diseases affect more than 30 million Americans, and there are fewer than 500 approved treatments for the 7,000 rare diseases that exist. The EveryLife Foundation was founded in 2009 to improve the regulatory process for drug development, from clinical trials to approval, by working with patient organizations, industry, academic scientists, the Food and Drug Administration (FDA), and the NIH to spur insightful scientific analysis and dialogue, expand grassroots support, and ultimately bring about key policy changes. The Foundation played a pivotal role in securing passage of the bipartisan 21st Century Cures Act, which was signed into law in 2016.

Peter J. Hotez, M.D., Ph.D., will receive the Raymond and Beverly Sackler Award for Sustained National Leadership for his far-reaching work in the areas of neglected tropical disease (NTD) research and vaccine development. Dr. Hotez is dean of the National School of Tropical Medicine at Baylor College of Medicine where he is also professor of pediatrics and molecular virology and microbiology. He serves as the director of the Texas Children's Hospital Center for Vaccine Development, where he leads a unique product development partnership for developing new vaccines for hookworm infection, schistosomiasis, Chagas disease, leishmaniasis, and SARS/MERS, diseases that affect hundreds of millions of people worldwide. In 2006 at the Clinton Global Initiative, he co-founded a Global Network for NTDs to provide access to essential medicines for hundreds of millions of people. Hotez was among the first to predict Zika's emergence in the U.S. and is recognized as an authority on vaccines. He is an outspoken leader of national efforts to educate the public about vaccines amid growing misconceptions about them, and he has appeared on BBC, CNN, Fox News and MSNBC. Hotez is founding Editor-in-Chief of PLoS Neglected Tropical Diseases and an elected member of the National Academy of Medicine.

###

For more information about the 2018 Advocacy Awards Dinner, visit http://www.researchamerica.org/advocacy_awards.

About Research!America's Advocacy Awards Dinner

The annual Research!America Advocacy Awards program was established in 1996 by the Board of Directors to honor outstanding advocates for medical, health and scientific research. Recognized individuals and organizations are those whose extraordinary leadership efforts have been effective in advancing our nation's commitment to medical, health and other scientific research. The awards dinner will take place on March 14, 2018, at the Andrew W. Mellon Auditorium in Washington, D.C. For more information, visit http://www.researchamerica.org/advocacy_awards.

About Research!America

Research!America is the nation's largest nonprofit public education and advocacy alliance working to make research to improve health a higher national priority. Founded in 1989, Research!America is supported by member organizations representing 125 million Americans. Visit http://www.researchamerica.org.

NASA's Aqua satellite provided an infrared look at Hurricane Maria's cloud top temperatures and found that the coldest cloud tops and strongest storms were east of the center, away from the U.S. However, Maria is a large hurricane. On Sept. 26, the National Hurricane Center reported that hurricane-force winds extend outward up to 105 miles (165 km) from the center and tropical-storm-force winds extend outward up to 240 miles (390 km).

The Atmospheric Infrared Sounder, or AIRS, instrument aboard NASA's Aqua satellite passed over Hurricane Maria on Sept. 25 at 2:11 p.m. EDT (1811 UTC) and analyzed the storm in infrared light. Infrared light provides scientists with temperature data, which is important when trying to understand how strong storms can be: the higher the cloud tops, the colder and stronger they are. So infrared data such as that gathered by the AIRS instrument can identify the strongest sides of a tropical cyclone. The AIRS image clearly showed the bulk of strong storms in Maria's northeastern quadrant as well as in the storms surrounding the eye.

Cloud top temperatures in thunderstorms around the eyewall were as cold as minus 63 degrees Fahrenheit (minus 53 degrees Celsius). Storms with cloud top temperatures that cold have the capability to produce heavy rainfall.

On Sept. 27, the National Hurricane Center noted "the satellite presentation of Maria has continued to slowly degrade over the past 24 hours, as deep convection is now confined to the southeastern portion of the circulation."

AIRS data also showed cooler sea surface temperatures, which are expected to help weaken Maria.

At the same time as the AIRS image, another instrument aboard Aqua, the Moderate Resolution Imaging Spectroradiometer (MODIS), took a visible-light image of Maria. In the visible image, Maria's eye was covered by high clouds.

Warnings in Effect on September 26, 2017

The National Hurricane Center (NHC) posted a Storm Surge Warning for Ocracoke Inlet to Cape Hatteras, North Carolina. A Tropical Storm Warning is in effect for Bogue Inlet to the North Carolina/Virginia border, and for the Albemarle and Pamlico Sounds.

A Storm Surge Watch is in effect for Cape Lookout to west of Ocracoke Inlet, and north of Cape Hatteras to Duck, North Carolina.

At 11 a.m. EDT on Sept. 26, the National Hurricane Center reported that tropical-storm-force winds were nearing North Carolina's Outer Banks. Storm Surge and Tropical Storm Warnings were in place from North Carolina to Virginia.

Status of Hurricane Maria

At 11 a.m. EDT (1500 UTC), the center of Hurricane Maria was located near 33.6 degrees north latitude and 73.1 degrees west longitude. That's about 175 miles (285 km) southeast of Cape Hatteras, North Carolina. The latest minimum central pressure estimated from reconnaissance aircraft data is 971 millibars.
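The quoted distance is easy to sanity-check with the haversine great-circle formula, using approximate coordinates for Cape Hatteras (about 35.2 N, 75.5 W; those coordinates are an assumption, not from the advisory):

```python
import math

# Great-circle distance from Maria's center (33.6 N, 73.1 W) to an
# approximate Cape Hatteras location, via the haversine formula.
def haversine_km(lat1, lon1, lat2, lon2, r=6371.0):
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

d = haversine_km(33.6, -73.1, 35.2, -75.5)
# d comes out near 283 km (~176 miles), consistent with the advisory's
# "about 175 miles (285 km)".
```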

Maria was moving toward the north near 7 mph (11 kph) and NHC said this general motion with some decrease in forward speed is expected, followed by a turn toward the north-northeast on Wednesday, Sept. 27. On the forecast track, the center of Maria will pass east of the coast of North Carolina during the next couple of days.

Maximum sustained winds are near 75 mph (120 kph) with higher gusts. Gradual weakening is forecast during the next couple of days, and Maria is forecast to become a tropical storm within the next day or so.

NHC Key Messages About Maria

The National Hurricane Center shared Key Messages about this large hurricane:

1. Maria is forecast to continue moving northward, paralleling the U.S. east coast for the next 36 hours, and will likely bring some direct impacts to portions of the North Carolina coast through Wednesday, where a tropical storm warning is in effect.

2. Storm surge flooding, especially along the sound side of the North Carolina Outer Banks, is expected, and a storm surge warning and watch are in effect for portions of eastern North Carolina.

3. Swells generated by Maria are affecting much of the east coast of the United States. These swells are also affecting Bermuda, the Turks and Caicos Islands, and the Bahamas. These swells are likely to cause life-threatening surf and rip current conditions.

The 3-D selfie has arrived

Computer scientists at the University of Nottingham and Kingston University have solved a complex problem that has, until now, defeated experts in vision and graphics research. They have developed technology capable of producing 3D facial reconstruction from a single 2D image – the 3D selfie.

Their new web app allows people to upload a single colour image and receive, in a few seconds, a 3D model showing the shape of their face. People are queuing up to try it and so far, more than 400,000 users have had a go. You can do it yourself by taking a selfie and uploading it to their website.

The research – 'Large Pose 3D Face Reconstruction from a Single Image via Direct Volumetric CNN Regression' – was led by PhD student Aaron Jackson and carried out with fellow PhD student Adrian Bulat both based in the Computer Vision Laboratory in the School of Computer Science. Both students are supervised by Georgios (Yorgos) Tzimiropoulos, Assistant Professor in the School of Computer Science. The work was done in collaboration with Dr Vasileios Argyriou from the School of Computer Science and Mathematics at Kingston University.

The results will be presented at the International Conference on Computer Vision (ICCV) 2017 in Venice next month.

Technology at a very early stage

The technique is far from perfect but this is the breakthrough computer scientists have been looking for.

It has been developed using a Convolutional Neural Network (CNN), a machine-learning technique from the field of artificial intelligence (AI) that gives computers the ability to learn from data without being explicitly programmed.

The research team, supervised by Dr Yorgos Tzimiropoulos, trained a CNN on a huge dataset of 2D pictures and 3D facial models. With all this information their CNN is able to reconstruct 3D facial geometry from a single 2D image. It can also take a good guess at the non-visible parts of the face.

Simple idea, complex problem

Dr Tzimiropoulos said: "The main novelty is in the simplicity of our approach which bypasses the complex pipelines typically used by other techniques. We instead came up with the idea of training a big neural network on 80,000 faces to directly learn to output the 3D facial geometry from a single 2D image."
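The "direct volumetric" idea is that the network's output is a voxel volume aligned with the input image, from which the surface is read off, rather than a mesh or a set of model parameters. The toy below (an illustration of the representation only, not the authors' code, and simplified to a depth map rather than a full face) converts a surface to a binary volume and recovers it:

```python
import numpy as np

def depth_to_volume(depth, n_z):
    """Binary volume [h, w, n_z]: voxel (i, j, z) is filled if z <= depth."""
    z = np.arange(n_z)
    return (z[None, None, :] <= depth[:, :, None]).astype(np.uint8)

def volume_to_depth(vol):
    """Read the surface back out as (number of filled voxels) - 1."""
    return vol.sum(axis=2) - 1

depth = np.array([[3, 5],
                  [7, 2]])               # tiny 2x2 "depth map"
vol = depth_to_volume(depth, n_z=10)
assert (volume_to_depth(vol) == depth).all()  # round trip recovers the surface
```

A CNN trained to regress such volumes can encode self-occluded geometry as well, which is how this kind of method can fill in the non-visible parts of the face.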

This is a problem of extraordinary difficulty. Current systems require multiple facial images and face several challenges, such as dense correspondences across large facial poses, expressions and non-uniform illumination.

Adrian Bulat said: "The method can be used to reconstruct the whole 3D facial geometry, including the non-visible parts of the face."

Their technique demonstrates some of the advances possible through deep learning – a form of machine learning that uses artificial neural networks to mimic the way the brain makes connections between pieces of information.

Dr Vasileios Argyriou, from Kingston University's Faculty of Science, Engineering and Computing, said: "What's really impressive about this technique is how it has made the process of creating a 3D facial model so simple."

What could the applications be?

Aside from the more standard applications, such as face and emotion recognition, this technology could be used to personalise computer games, improve augmented reality, and let people try on online accessories such as glasses.

It could also have medical applications – such as simulating the results of plastic surgery or helping to understand medical conditions such as autism and depression.

Aaron's PhD is funded by the University of Nottingham. His research is focused on deep learning applied to the human face. This includes 3D reconstruction and segmentation applied to the human face and body.

Adrian Bulat is a PhD student in the Computer Vision Lab. His main research interests are in the area of face analysis, human pose estimation and neural network quantization/binarization.

OSA Laser Congress highlights latest advances in solid state lasers and industrial applications

NAGOYA, JAPAN – The 2017 OSA Laser Congress will offer a comprehensive view of the latest advancements in solid state lasers and other related technology. The conference brings together a global audience of laser leaders for a comprehensive program of peer-reviewed presentations. Market-focused sessions describe the technological and engineering advancements required to move these laser technologies into commercial products.

The 2017 OSA Laser Congress will include invited plenary presentations by Robert L. Byer and Katsumi Midorikawa. Byer is the William R. Kenan, Jr. Professor of Applied Physics at Stanford University, California, USA, and Midorikawa is the director of the RIKEN Center for Advanced Photonics, Saitama, Japan. The plenary presentations will take place on 2 October, 8:00–9:30 a.m., at the Nagoya Convention Center, Nagoya, Japan.

Robert L. Byer, Stanford University, USA – "Einstein, Lasers, Black Holes and Gravitational Waves": On September 14, 2015, the two LIGO detectors nearly simultaneously detected gravitational-wave signals from two merging black holes more than one billion light-years away. Numerical-relativity models confirmed that the waveform came from black holes of 29 and 36 solar masses that merged into a final black hole of 62 solar masses, radiating more than 3 solar masses of energy as gravitational waves in less than one-fifth of a second.
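The "3 solar masses of energy" figure translates to joules through E = mc² (a back-of-envelope check using standard constants):

```python
# Energy radiated by the GW150914 merger: roughly 3 solar masses
# converted to gravitational waves, E = m * c**2.
M_SUN = 1.989e30   # solar mass, kg
C = 2.998e8        # speed of light, m/s
E = 3 * M_SUN * C ** 2
print(f"{E:.2e} J")  # → 5.36e+47 J
```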

LIGO and Advanced LIGO requirements were met and enabled by advances in solid state lasers, including a single-frequency laser oscillator and quantum-noise-limited amplification. This presentation will give the history of LIGO and the direct detection of gravitational waves.

Katsumi Midorikawa, RIKEN Center for Advanced Photonics, Japan – "High-Order Harmonics: Application and Prospects": Nearly thirty years have passed since the first observation of high-order harmonic generation (HHG). Although there has been strong interest in the related physical phenomena, many researchers at the time expected that HHG would not be useful as a practical source because of the small photon numbers associated with its low conversion efficiency. Contrary to those expectations, however, HHG is now established as a high-output coherent light source in the XUV region and the sole source of attosecond pulses. Midorikawa will share recent efforts on the generation of high harmonics and applications including ultrafast XUV science and EUV optics/mask inspection.

Additional invited speakers include:

Daichi Sumimori, Section Leader of Research and Development Dept. at NADEX LASER R&D Center of NADEX PRODUCTS Co., Ltd., Japan

Oliver Suttmann, Head of Department Production and Systems, Laser Zentrum Hannover e.V., Germany

Gijs van der Schot, Uppsala University, Sweden

Takeo Watanabe, Associate Professor, University of Hyogo, Japan

Seiei Yamamoto, Junior Supervisor, OKUMA Corporation, Japan

COLLOCATED MEETINGS

The Laser Applications Conference (LAC) is a three-day meeting focused on two main topic areas – materials processing and applications for high power lasers. Materials processing will cover advanced applications for industrial use while the applications for high power lasers will include topics such as: EUV for lithography, 16kW+ laser applications, X-Ray generation, lasers for space applications and tool making. One of the themes of this meeting will be to initiate discussions on what engineering and production advances are needed to translate promising technological advances into marketable products.

The Advanced Solid State Lasers Conference (ASSL) highlights new sources, advanced technologies, components and system design to improve the operation and application of solid state lasers. It covers the spectrum of solid state lasers from materials research to applied science and design innovations.

The OSA Laser Congress will be held 1-5 October at the Convention Center Nagoya, Nagoya, Aichi, Japan. The Congress features the latest advances in solid state laser development and related technologies for free space laser communication, laser-based sensing, and numerous industrial applications. It provides attendees with a comprehensive view of the latest technological advances as well as applications of laser technologies for industrial products and markets. In 2017, the Congress offers two collocated meetings: Advanced Solid State Lasers Conference (ASSL) and Laser Applications Conference (LAC).

About The Optical Society

Founded in 1916, The Optical Society (OSA) is the leading professional organization for scientists, engineers, students and entrepreneurs who fuel discoveries, shape real-life applications and accelerate achievements in the science of light. Through world-renowned publications, meetings and membership initiatives, OSA provides quality research, inspired interactions and dedicated resources for its extensive global network of optics and photonics experts. For more information, visit: osa.org.

Research led by PPPL provides reassurance that heat flux will be manageable in ITER

Tue, 26 Sep 2017 16:51:36 +0000

Credit: ITER

A major issue facing ITER, the international tokamak under construction in France that will be the first magnetic fusion device to produce net energy, is whether the crucial divertor plates that will exhaust waste heat from the device can withstand the high heat flux, or load, that will strike them. Alarming projections extrapolated from existing tokamaks suggest that the heat flux could be so narrow and concentrated as to damage the tungsten divertor plates in the seven-story, 23,000-ton tokamak and require frequent and costly repairs. This flux could be comparable to the heat load experienced by spacecraft re-entering Earth's atmosphere.

New findings of an international team led by physicist C.S. Chang of the U.S. Department of Energy's (DOE) Princeton Plasma Physics Laboratory (PPPL) paint a more positive picture. Results of the collaboration, which has spent two years simulating the heat flux, indicate that the width could be well within the capacity of the divertor plates to tolerate.

Good news for ITER

"This could be very good news for ITER," Chang said of the findings, published in August in the journal Nuclear Fusion. "This indicates that ITER can produce 10 times more power than it consumes, as planned, without damaging the divertor plates prematurely."

ITER spokesperson Laban Coblentz said the simulations were of great interest and highly relevant to the ITER project. He said ITER would be keen to see experimental benchmarking, performed for example by the Joint European Torus (JET) at the Culham Centre for Fusion Energy in the United Kingdom, to strengthen confidence in the simulation results.

Chang's team used the highly sophisticated XGC1 plasma turbulence computer simulation code, developed at PPPL, to create the new estimate. The simulation projected a width of 6 millimeters for the heat flux in ITER when measured in a standardized way among tokamaks, far greater than the less-than-1-millimeter width extrapolated from experimental data.

The narrow-width projections were derived from experimental data by researchers at major facilities worldwide. In the United States, these tokamaks were the National Spherical Torus Experiment before its upgrade at PPPL; the Alcator C-Mod facility at MIT, which ceased operations at the end of 2016; and the DIII-D National Fusion Facility that General Atomics operates for the DOE in San Diego.

Widely different conditions

The discrepancy between the experimental projections and simulation predictions, said Chang, stems from the fact that conditions inside ITER will be too different from those in existing tokamaks for the empirical predictions to be valid. Key differences include the behavior of plasma particles within today's machines compared with the expected behavior of particles in ITER. For example, while ions contribute significantly to the heat width in the three U.S. machines, turbulent electrons will play a greater role in ITER, rendering extrapolations unreliable.

Chang's team used basic physics principles, rather than empirical projections based on the data from existing machines, to derive the simulated wider prediction. The team first tested whether the code could predict the heat flux width produced in experiments on the U.S. tokamaks, and found the predictions to be valid.

Researchers then used the code to project the heat-flux width in a model of the ITER edge plasma. The simulation predicted the wider heat-flux width, one that should be sustainable within the current ITER design.

Supercomputers enabled simulation

Supercomputers made this simulation possible. Validating the code on the existing tokamaks and producing the findings took some 300 million core hours on Titan and Cori, two of the most powerful U.S. supercomputers, housed at the DOE's Oak Ridge Leadership Computing Facility and the National Energy Research Scientific Computing Center, respectively. A core hour is one processor, or core, running for one hour.
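To get a feel for the scale of that allocation, the core-hour arithmetic can be sketched as follows; the core count below is an illustrative assumption, not the actual machine partition used in the study:

```python
# 300 million core hours: one core running for one hour = 1 core hour.
total_core_hours = 300_000_000

# Hypothetical example: a 300,000-core slice of a supercomputer
# running the job continuously (illustrative, not the real partition).
cores = 300_000

wall_clock_hours = total_core_hours / cores   # 1000.0 hours
wall_clock_days = wall_clock_hours / 24       # ~41.7 days
print(wall_clock_hours, round(wall_clock_days, 1))
```

Even on a sizable fraction of a leadership-class machine, the campaign works out to weeks of continuous wall-clock time.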

###

Researchers from eight U.S. and European institutions collaborated on this research. In addition to PPPL, the institutions included ITER, the Culham Centre for Fusion Energy, the Institute of Atomic and Subatomic Physics at the Technical University of Vienna, General Atomics, MIT, Oak Ridge National Laboratory and Lawrence Livermore National Laboratory.

Support for this work comes from the DOE Office of Science Offices of Fusion Energy Sciences and Office of Advanced Scientific Computing Research.

PPPL, on Princeton University's Forrestal Campus in Plainsboro, N.J., is devoted to creating new knowledge about the physics of plasmas — ultra-hot, charged gases — and to developing practical solutions for the creation of fusion energy. The Laboratory is managed by the University for the U.S. Department of Energy's Office of Science, which is the largest single supporter of basic research in the physical sciences in the United States, and is working to address some of the most pressing challenges of our time. For more information, please visit science.energy.gov.

A quantum computer to tackle fundamental science problems

Tue, 26 Sep 2017 16:51:09 +0000

Credit: Quantum Nanoelectronics Laboratory, UC Berkeley

For more than 50 years, Moore's Law has reigned supreme. The observation that the number of transistors on a computer chip doubles roughly every two years has set the pace for our modern digital revolution–making smartphones, personal computers and current supercomputers possible. But Moore's Law is slowing. And even if it wasn't, some of the big problems that scientists need to tackle might be beyond the reach of conventional computers.
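The compounding behind Moore's Law is easy to sketch; the starting transistor count below is an illustrative round number, not a figure from any real chip:

```python
# Moore's Law: transistor counts double roughly every two years.
# The starting count is an illustrative round number, not a real chip.
transistors_start = 2_000
years = 50
doublings = years // 2           # one doubling per two years -> 25

projected = transistors_start * 2 ** doublings
print(f"{projected:,}")          # 2,000 * 2**25 = 67,108,864,000
```

Twenty-five doublings multiply the starting count by over 33 million, which is why the slowdown of this trend matters so much for computing.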

For the past few years, researchers at the Lawrence Berkeley National Laboratory (Berkeley Lab) have been exploring a drastically different kind of computing architecture based on quantum mechanics to solve some of science's hardest problems. With Laboratory Directed Research and Development (LDRD) funding, they've developed quantum chemistry and optimization algorithms, as well as prototype superconducting quantum processors. Recently, they proved the viability of their work by using these algorithms on a quantum processor comprising two superconducting transmon quantum bits to successfully solve the chemical problem of calculating the complete energy spectrum of a hydrogen molecule.

Now, two research teams led by Berkeley Lab staff will receive funding from the Department of Energy (DOE) to build on this momentum. One team will receive $1.5 million over three years to develop novel algorithms, compiling techniques and scheduling tools that will enable near-term quantum computing platforms to be used for scientific discovery in the chemical sciences. The other team will work closely with these researchers to design prototype four- and eight-qubit processors to run these new algorithms. This project will last five years and the researchers will receive $1.5 million for their first year of work. By year five, the hardware team hopes to demonstrate a 64-qubit processor with full control.

"Someday, universal quantum computers will be able to solve a wide range of problems, from molecular design to machine learning and cybersecurity, but we're a long way off from that. So, the question we are currently asking is whether there are specific problems that we can solve with more specialized quantum computers," says Irfan Siddiqi, Berkeley Lab Scientist and Founding Director of the Center for Quantum Coherent Science at UC Berkeley.

According to Siddiqi, today's quantum coherent computing technologies do have the requisite coherence times, logical operation fidelities and circuit topologies to perform specialized computations for fundamental research in areas such as molecular and materials science, numerical optimization and high energy physics. In light of these advances, he notes that it is timely for DOE to explore how these technologies can be integrated into the high-performance computing community. On these new projects, the Berkeley Lab teams will work with collaborators in industry and academia to build on these advances and tackle difficult DOE-mission science problems such as calculating molecular system dynamics and quantum machine learning.

"We are at the early stages of quantum computing, kind of like where we were with conventional computing in the 1940s. We have some of the hardware, now we need to develop a robust set of software, algorithms and tools to optimally utilize it to solve really hard science problems," says Bert de Jong, who leads the Computational Chemistry, Materials and Climate Group in Berkeley Lab's Computational Research Division (CRD).

He will be heading a DOE Quantum Algorithms Team consisting of researchers from Berkeley Lab, Harvard, Argonne National Lab and UC Berkeley focused on "Quantum Algorithms, Mathematics and Compilation Tools for Chemical Sciences."

"Berkeley Lab's tradition of team science, as well as its proximity to UC Berkeley and Silicon Valley, makes it an ideal place to work on quantum computing end-to-end," says Jonathan Carter, Deputy Director of Berkeley Lab Computing Sciences. "We have physicists and chemists at the lab who are studying the fundamental science of quantum mechanics, engineers to design and fabricate quantum processors, as well as computer scientists and mathematicians to ensure that the hardware will be able to effectively compute DOE science."

The key to building quantum computers that solve scientific problems beyond the reach of conventional computers is "quantum coherence." This phenomenon essentially allows quantum systems to store much more information per bit than traditional computers can.

In a conventional computer, the circuits in a processor comprise billions of transistors–tiny switches that are activated by electronic signals. The digits 1 and 0 are used in binary to reflect the on and off states of a transistor. This is essentially how information is stored and processed. When programmers write computer code, a translator transforms it into binary instructions–1s and 0s–that a processor can execute.
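The binary encoding described above is visible directly in ordinary code; a minimal sketch:

```python
# An integer is stored as a pattern of bits, each bit mirroring
# the on/off state of a transistor.
n = 42
bits = bin(n)                    # '0b101010'
print(bits)

# Reassemble the value from its individual bits:
value = sum(int(b) << i for i, b in enumerate(reversed(bits[2:])))
print(value)                     # 42
```

Every classical value, however large, is ultimately just such a string of definite 0s and 1s.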

Unlike a traditional bit, a quantum bit (qubit) can take on somewhat counterintuitive quantum mechanical properties like entanglement and superposition. Quantum entanglement occurs when pairs or groups of particles interact in such a way that the state of each particle cannot be described individually; instead the state must be described for the system as a whole. In other words, entangled particles act as a unit. Superposition occurs when a particle exists in a combination of two quantum states simultaneously.

So whereas a conventional computer bit encodes information as either 0 or 1, a qubit can be 0, 1 or a superposition of states (both 0 and 1 at the same time). A qubit's ability to exist in multiple states means that it can, for example, enable the calculation of material and chemical properties significantly faster than traditional computers. And if these qubits could be linked or entangled in a quantum computer, problems that cannot be solved today with conventional computers could be tackled.
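A single qubit's state can be modeled numerically as a pair of complex amplitudes; this toy sketch (a mathematical idealization, not how a physical transmon is programmed) shows how an equal superposition assigns probability to each measurement outcome:

```python
import math

# A qubit state is a pair of amplitudes (a, b) for the basis
# states |0> and |1>, normalized so |a|^2 + |b|^2 = 1.
# Measurement yields 0 with probability |a|^2 and 1 with |b|^2.

a = b = 1 / math.sqrt(2)            # equal superposition of 0 and 1

p0, p1 = abs(a) ** 2, abs(b) ** 2
print(round(p0, 3), round(p1, 3))   # 0.5 0.5
assert abs(p0 + p1 - 1.0) < 1e-12   # probabilities sum to one
```

Until it is measured, the qubit genuinely carries both amplitudes at once, which is what the superposition described above means in practice.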

But getting qubits into this state of quantum coherence, and then making the most of their quantum mechanical properties while they remain coherent, is still a challenge.

"Quantum computing is like playing a game of chess where the pieces and board are made of ice. As the players shuffle around the pieces, the components are melting, and the more moves you make, the faster the game will melt," says Carter. "Qubits lose coherence in a really short amount of time, so it's up to us to figure out the most useful set of moves we can make."

Carter notes that the Berkeley Lab approach of co-designing the quantum processors in close collaboration with the researchers developing quantum algorithms, compiling techniques and scheduling tools will be extremely useful for answering this question.

"Computational approaches are common across most scientific projects at Berkeley Lab. As Moore's Law is slowing down, novel computing architectures, systems, and techniques have become a priority initiative at Berkeley Lab," says Horst Simon, Berkeley Lab's Deputy Director. "We recognized early how quantum simulation could provide an effective approach to some of the most challenging computational problems in science, and I am pleased to see recognition of our LDRD initiative through this first direct funding. Quantum information science will become an increasingly important element of our research enterprise across many disciplines."

Because this field is still in its early days, there are many approaches for building a quantum computer. The Berkeley Lab-led teams will be looking into superconducting quantum computers.

To design and fabricate the next generation of quantum processors, the AQuES team will leverage the superconducting circuit facility in UC Berkeley's Quantum Nanoelectronics Laboratory while incorporating the expertise of researchers in Berkeley Lab's Accelerator Technology and Applied Physics, Materials Science and Engineering divisions. The research teams will also use the unique capabilities of two DOE facilities: the Molecular Foundry and the National Energy Research Scientific Computing Center (NERSC), both located at Berkeley Lab.

$2.5 million NSF grant to SMU will give teachers a math assessment tool to help students

Tue, 26 Sep 2017 15:55:32 +0000

Credit: Hillsman Jackson, SMU

A $2.5 million grant from the National Science Foundation to researchers at Southern Methodist University, Dallas, targets the ongoing struggle of U.S. elementary and high school students with math.

When it comes to the STEM fields of science, technology, engineering and math, research shows that U.S. students continue at a disadvantage all the way through high school and entering college.

The four-year NSF grant to the Annette Caldwell Simmons School of Education and Human Development is led by SMU K-12 math education experts Leanne Ketterlin Geller and Lindsey Perry. They will conduct research and develop an assessment system comprising two universal screening tools to measure mathematical reasoning skills for grades K-2.

"This is an opportunity to develop an assessment system that can help teachers support students at the earliest, and arguably one of the most critical, phases of a child's mathematical development," said Ketterlin Geller, a professor in the Simmons School and principal investigator for the grant developing the "Measures of Mathematical Reasoning Skills" system.

Teachers and schools will use the assessment system to screen students and determine who is at risk for difficulty in early mathematics, including students with disabilities. The measures also will help provide important information about the intensity of support needed for a given student.

Few assessments are currently available to measure the critical math concepts taught during those early school years, Ketterlin Geller said.

"Providing teachers with data to understand how a child processes these concepts can have a long-term impact on students' success not only in advanced math like algebra, but also success in STEM fields, such as chemistry, biology, geology and engineering," she said.

The 2015 National Assessment of Educational Progress (NAEP) mathematics report found that only 40 percent of U.S. fourth-grade students were classified as proficient or advanced, and those numbers did not improve between 2009 and 2015. In fact, scores on the geometry scale of the fourth-grade mathematics assessment were significantly lower in 2015 than in 2009.

Early mathematics is a more powerful predictor of future learning, including reading and mathematics achievement, than early reading ability or other factors such as attention skills, according to one 2007 study on school readiness.

Research also has found that students' early mathematics knowledge is a more powerful predictor of their future socioeconomic status at age 42 than their family's socioeconomic status as children.

Early mathematics comprises numerous skills. However, number sense — the ability to work with numbers flexibly — in addition to spatial sense — the ability to understand the complexity of one's environment — are consistently identified as two of the main components that should be emphasized in early mathematics standards and instruction, say the SMU researchers.

The Measures of Mathematical Reasoning Skills system will contain tests for both numeric relational reasoning and spatial reasoning.

"I'm passionate about this research because students who can reason spatially and relationally with numbers are better equipped for future mathematics courses, STEM degrees and STEM careers," said Perry, whose doctoral dissertation for her Ph.D. from SMU in 2016 specifically focused on those two mathematical constructs.

"While these are very foundational and predictive constructs, these reasoning skills have typically not been emphasized at these grade levels, and universal screening tools focused on these topics do not yet exist," said Perry, who is co-principal investigator.

"Since intervention in the early elementary grades can significantly improve mathematics achievement, it is critical that K-2 teachers have access to high-quality screening tools to help them with their intervention efforts," she said. "We feel that the Measures of Mathematical Reasoning Skills system can really make a difference for K-2 teachers as they prepare the next generation of STEM leaders."

The four-year project, Measuring Early Mathematical Reasoning Skills: Developing Tests of Numeric Relational Reasoning and Spatial Reasoning, started Sept. 15, 2017. It employs an iterative research design for developing formative assessments, a process that Ketterlin Geller has devoted much of her 20-year career to.

Ketterlin Geller is Texas Instruments Endowed Chair in Education and director of Research in Mathematics Education in SMU's Annette Caldwell Simmons School of Education and Human Development. She is also a Fellow with the Caruth Institute for Engineering Education in the Lyle School of Engineering.

###

Media Contact

Margaret Allen, mallen@smu.edu, 214-768-7664, @smu

http://www.smu.edu


Amid debate on reproducibility and p-values, experts gather to improve scientific evidence, rigor

Tue, 26 Sep 2017 15:55:23 +0000

ALEXANDRIA, Va. (September 26, 2017) – Following the widespread impact of its historic statement on the overuse and misinterpretation of p-values throughout the scientific community, the American Statistical Association (ASA) is convening scientists spanning many disciplines to help bolster statistical and scientific evidence, move beyond an irrational overdependence on p-values, and improve the rigor and reproducibility of research. This first-of-its-kind event, the Symposium on Statistical Inference, takes place October 11-12, 2017, in Bethesda, Maryland.

"Statistical analysis is key to conducting research in any field of science, and the challenge and pressures of reproducibility are impacting the perception and path of worthwhile research," said ASA President Barry D. Nussbaum. "While the ASA's statement addressed the inappropriate explanations and uses of p-values and significance tests, this conference is an imperative next step to not only further the dialogue, but also drive change that leads to lasting improvements in multidisciplinary research, communicating and understanding uncertainty, and decision-making."

Discussions will center on specific approaches for refining statistical and scientific evidence as it intersects with conducting, using, sponsoring, disseminating, and replicating research. Upon the conclusion of the symposium, recommendations, approaches, and guidelines will be published and made available to the entire scientific community, policymakers, members of the media and the general public.

So important is this focus to the broader scientific community that the symposium is co-sponsored by the American Psychological Association and the American Educational Research Association. Financial support for the conference has been provided by the Central Intelligence Agency (CIA), R Studio, and the ASA.

"Policymakers here and abroad look to American science as a vibrant source of insight and an agent of change that can help steer understanding and adoption of policies that better society. Having a collective and realistic understanding of what will improve research reproducibility will make science across all disciplines stronger and more meaningful," said Marcia McNutt, president of the National Academy of Sciences.

Key events:

* Steve Goodman, Associate Dean of Clinical and Translational Research; professor of Medicine and Health Research & Policy; co-founder of the Meta-Research Innovation Center (METRICS), Stanford University, and John Ioannidis, the C.F. Rehnborg Chair in Disease Prevention, professor of Medicine, Health Research and Policy, Statistics, Biomedical Data Science; co-director, METRICS; director of the PhD program in Epidemiology and Clinical Research, Stanford University, will lead the opening plenary session.

* Andrew Gelman, professor of statistics and political science; director of the Applied Statistics Center, Columbia University; Xiao-Li Meng, dean of the Graduate School of Arts and Sciences, Whipple V. N. Jones Professor of Statistics, Harvard University; and McNutt will provide the closing plenary, titled "The Radical Prescription for Change."

A full lineup of speakers and the program schedule can be found on the symposium website. Registration space has been expanded due to demand. Media can attend SSI for FREE but must pre-register by contacting Jill Talley, ASA Public Relations Manager, at jill@amstat.org.