Particle physics is to physics what big game hunting is to field biology. While theoretical physicists pore over their mathematical models, particle physicists are out in the brush with their pith helmets and shotguns, speaking softly, carrying big accelerators and blowing stuff up real good.

Five challenges still facing physics:

1) Combine general relativity and quantum theory into a single complete theory

2) Resolve the problem of observer-dependence in quantum theory

3) Determine whether the particles and forces in quantum theory can be unified

4) Explain how the constants in the standard model of particle physics are chosen by nature

5) Explain dark matter and dark energy, or, if they don't exist, determine how and why gravity is modified on large scales


A team of scientists from Purdue University and the Chinese Academy of Sciences has used CRISPR/Cas9 gene-editing technology to develop a variety of rice that produces 25-31 percent more grain and would have been virtually impossible to create through traditional breeding methods.

The team, led by Jian-Kang Zhu, a distinguished professor in the Department of Horticulture and Landscape Architecture at Purdue and director of the Shanghai Center for Plant Stress Biology at the Chinese Academy of Sciences, made mutations to 13 genes associated with the phytohormone abscisic acid, known to play roles in plant stress tolerance and suppression of growth. Of several varieties created, one produced a plant that had little change in stress tolerance but produced 25 percent more grain in a field test in Shanghai, China, and 31 percent more in a field test conducted on China's Hainan Island. Their findings were published early online today in the Proceedings of the National Academy of Sciences.

Zhu's team, which includes Purdue's Ray A. Bressan, a distinguished professor in the Department of Horticulture and Landscape Architecture, and researchers from the Chinese Academy of Sciences, silenced suites of pyrabactin resistance 1 (PYR1)/PYR1-like (PYL)/regulatory components of ABA receptor (RCAR) genes, or simply, PYL genes. These genes enhance tolerance of abiotic stresses, such as drought, soil salinity and other environmental factors, but also inhibit growth.

Since plants have evolved to create genetic redundancies, especially for traits required for survival, knocking out one gene in the PYL family might not have much effect on stress tolerance or growth since redundant genes can kick in to provide a similar function. Crafting the right knockout combination, however, led to a plant that uses just the right redundancies to hold onto its stress-tolerance characteristics but reduces the growth inhibition. "There is lots of evidence that although each PYL gene may have an individual specialty in function, by and large they also share some common functions," Zhu said. "When you remove one, others will function as a replacement."

The CRISPR/Cas9 technology allows plant breeders to quickly and accurately snip portions of DNA out of a sequence, editing the DNA code. The method allowed Zhu's team to modify multiple genes at one time, something that would have taken decades to do with traditional methods without a guarantee that the resulting plants would have the desired characteristics.

"You couldn't do targeted mutations like that with traditional plant breeding. You'd do random mutations and try to screen out the ones you don't want," Bressan said. "It would have taken millions of plants. Basically, it's not feasible. This is a real accomplishment that could not have been done without CRISPR."

Liquid crystals undergo a peculiar type of phase change. At a certain temperature, their cigar-shaped molecules go from a disordered jumble to a more orderly arrangement in which they all point more or less in the same direction. LCD televisions take advantage of that phase change to project different colors in moving images.

For years, however, experiments have hinted at another liquid crystal state—an intermediate state between the disordered and ordered states in which order begins to emerge in discrete patches as a system approaches its transition temperature. Now, chemists at Brown University have demonstrated a theoretical framework for detecting that intermediate state and for better understanding how it works.

"People understand the ordered and disordered behaviors very well, but the state where this transition is just about to happen isn't well understood," said Richard Stratt, a professor of chemistry at Brown and coauthor of a paper describing the research. "What we've come up with is a sort of yardstick to measure whether a system is in this state. It gives us an idea of what to look for in molecular terms to see if the state is present."

The research, published in the Journal of Chemical Physics, could shed new light not only on liquid crystals, but also molecular motion elsewhere in nature—phenomena such as the protein tangles involved in Alzheimer's disease, for example. The work was led by Yan Zhao, a Ph.D. student in Stratt's lab who expects to graduate from Brown this spring.

For the study, the researchers used computer simulations of phase changes in a simplified liquid crystal system that included a few hundred molecules. They used random matrix theory, a statistical framework often used to describe complex or chaotic systems, to study their simulation results. They showed that the theory does a good job of describing the system in both the ordered and disordered states, but fails to describe the transition state. That deviation from the theory can be used as a probe to identify the regions of the material where order is beginning to emerge. "Once you realize that you have this state where the theory doesn't work, you can dig in and ask what went wrong," Stratt said. "That gives us a better idea of what these molecules are doing."
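Random matrix theory makes concrete statistical predictions that are easy to reproduce numerically. As a minimal illustration (using a generic Gaussian orthogonal ensemble, not the researchers' actual liquid-crystal model), strongly coupled, disordered systems show "level repulsion" in their eigenvalue spacings, the kind of signature the theory captures and whose breakdown can flag an intermediate state:

```python
import numpy as np

rng = np.random.default_rng(42)
N = 2000

# Build a random symmetric matrix (Gaussian orthogonal ensemble, GOE).
A = rng.normal(size=(N, N))
H = (A + A.T) / 2

# Eigenvalues, keeping only the bulk of the spectrum.
E = np.linalg.eigvalsh(H)
E = E[N // 4 : 3 * N // 4]

# Ratio of adjacent level spacings: averages ~0.536 for GOE
# (level repulsion) versus ~0.386 for uncorrelated (Poisson) levels.
s = np.diff(E)
r = np.minimum(s[1:] / s[:-1], s[:-1] / s[1:])
print(round(r.mean(), 3))
```

Deviations of real simulation data from such benchmark statistics are the kind of "yardstick" Stratt describes.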

NASA's Transiting Exoplanet Survey Satellite (TESS), led by MIT, successfully completed a lunar flyby on May 17, and the science team snapped a 2-second test exposure using one of the four TESS cameras. The image reveals more than 200,000 stars.

NASA’s next planet hunter, the Transiting Exoplanet Survey Satellite (TESS), is one step closer to searching for new worlds after successfully completing a lunar flyby on May 17. The spacecraft passed about 5,000 miles from the moon, which provided a gravity assist that helped TESS sail toward its final working orbit.

As part of camera commissioning, the science team snapped a two-second test exposure using one of the four TESS cameras. The image, centered on the southern constellation Centaurus, reveals more than 200,000 stars. The edge of the Coalsack Nebula is in the upper right corner, and the bright star Beta Centauri is visible at the lower left edge. During its initial two-year search for exoplanets, TESS is expected to cover more than 400 times as much sky as shown in this image with its four cameras. A science-quality image, also referred to as a “first light” image, is expected to be released in June.

TESS will undergo one final thruster burn on May 30 to enter its science orbit around Earth. This highly elliptical orbit will maximize the amount of sky the spacecraft can image, allowing it to continuously monitor large swaths of the sky. TESS is expected to begin science operations in mid-June after reaching this orbit and completing camera calibrations.

TESS will fly in an orbit that completes two circuits around Earth every time the moon orbits once. This special orbit will allow TESS’s cameras to monitor each patch of sky continuously for nearly a month at a time.
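The arithmetic behind this 2:1 lunar resonance follows from Kepler's third law. A rough sketch (a simplified two-body estimate that ignores lunar perturbations; the constants are standard textbook values, not from the article):

```python
import math

MU_EARTH = 3.986004418e14        # Earth's gravitational parameter, m^3/s^2
SIDEREAL_MONTH = 27.32 * 86400.0  # Moon's orbital period, seconds

# TESS completes two orbits for each lunar orbit (a 2:1 resonance),
# so its period is half a sidereal month.
T = SIDEREAL_MONTH / 2

# Kepler's third law: a^3 = mu * T^2 / (4 * pi^2)
a = (MU_EARTH * T**2 / (4 * math.pi**2)) ** (1 / 3)

print(f"period: {T / 86400:.1f} days")        # ~13.7 days
print(f"semi-major axis: {a / 1e3:,.0f} km")  # ~240,000 km, about 38 Earth radii
```

That semi-major axis, roughly 240,000 km, matches the highly elliptical working orbit described above.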

Launched from Cape Canaveral Air Force Station on April 18, TESS is the next step in NASA’s search for planets outside our solar system, known as exoplanets. The mission will observe nearly the entire sky to monitor nearby, bright stars in search of transits — periodic dips in a star’s brightness caused by a planet passing in front of the star. TESS is expected to find thousands of exoplanets.
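The transit method amounts to measuring the fraction of the star's disk a planet blocks: the dip depth is simply the squared ratio of the radii. A quick sketch using rough published radii (standard values, not from the article):

```python
# Transit depth = (planet radius / star radius)^2
R_SUN = 695_700.0     # km
R_EARTH = 6_371.0     # km
R_JUPITER = 69_911.0  # km

def transit_depth(r_planet_km, r_star_km=R_SUN):
    """Fractional dip in stellar brightness during a central transit."""
    return (r_planet_km / r_star_km) ** 2

print(f"Earth-Sun:   {transit_depth(R_EARTH):.6f}")    # ~0.000084 (84 ppm)
print(f"Jupiter-Sun: {transit_depth(R_JUPITER):.4f}")  # ~0.0101 (about 1%)
```

The tiny, parts-per-million dip for an Earth-sized planet is why TESS concentrates on nearby, bright stars, where the photometry is most precise.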

NASA’s upcoming James Webb Space Telescope, scheduled for launch in 2020, will provide important follow-up observations of some of the most promising TESS-discovered exoplanets, allowing scientists to study their atmospheres.

TESS is a NASA Astrophysics Explorer mission led and operated by MIT in Cambridge, Massachusetts, and managed by NASA’s Goddard Space Flight Center in Greenbelt, Maryland. George Ricker of MIT’s Kavli Institute for Astrophysics and Space Research serves as principal investigator for the mission. Additional partners include Orbital ATK, NASA’s Ames Research Center, the Harvard-Smithsonian Center for Astrophysics, and the Space Telescope Science Institute. The TESS science instruments were jointly developed by MIT’s Kavli Institute for Astrophysics and Space Research and MIT’s Lincoln Laboratory. More than a dozen universities, research institutes, and observatories worldwide are participants in the mission.

A fundamental limit on the heat produced when erasing a bit of information has been confirmed in a fully quantum system.

One of the outstanding challenges in information processing is suppressing the energy consumed in the execution of logic operations. The Landauer principle sets a lower bound on the energy dissipated when a classical bit of information is deleted. Although experiments have approached the fundamental limit set by this principle, testing the Landauer principle in a purely quantum mechanical setting has remained an open problem.

Employing a trapped ultracold ion, physicists have now experimentally demonstrated a quantum version of the Landauer principle, i.e., an equality relating the energy cost of information erasure to the entropy change of the associated quantized environment. The experiment substantiates an intimate link between information thermodynamics and candidate quantum systems for information processing.
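The Landauer bound itself is a one-line formula: erasing one bit must dissipate at least k_B·T·ln 2 of heat. A quick calculation at room temperature:

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K (exact in the 2019 SI)

def landauer_limit(temperature_k):
    """Minimum heat dissipated when erasing one bit, in joules."""
    return K_B * temperature_k * math.log(2)

print(landauer_limit(300))  # ~2.87e-21 J at room temperature
```

Note that the bound scales linearly with temperature, which is one reason ultracold systems such as trapped ions are a natural place to probe it.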

Additional figures from Informa show the extent of gene-therapy development: Informa Pharma Intelligence's Trialtrove database records 729 gene therapies in development, of which nearly two-thirds (461) are preclinical. Those therapies have been assessed in 1,855 clinical trials, most in early phases: 657 in Phase I, 509 in Phase I/II, and 455 in Phase II. As for later development, 89 have reached Phase III, 32 are in Phase II/III, and 28 are in Phase IV.

By far the most crowded therapeutic area is oncology, accounting for two-thirds or 1,254 of the 1,855 trials. The next-largest indication is cardiovascular with 214 trials (11.5%), followed by infectious disease (6.5%).
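A quick arithmetic check of the Informa figures as quoted confirms the stated shares:

```python
# Trial counts as quoted from the Trialtrove database.
total_trials = 1855
oncology = 1254
cardiovascular = 214

print(f"oncology:       {oncology / total_trials:.1%}")        # 67.6%, about two-thirds
print(f"cardiovascular: {cardiovascular / total_trials:.1%}")  # 11.5%
```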

Clinical activity is expected to increase, in part due to the FDA’s Regenerative Medicine Advanced Therapy (RMAT) designation, created through the 21st Century Cures Act. “There are a lot of incentives for sponsors who get RMAT, including early and frequent interactions with the FDA, as well as the ability to discuss early on any potential surrogate or intermediate endpoints in their clinical trials,” Amanda Micklus, principal analyst with Informa Pharma Intelligence, told GEN. “The RMAT designation potentially could really advance the progress of these therapies through the pipeline.”

The idea of a large lake monster living in Scotland’s Loch Ness has fascinated people for decades. But no concrete evidence of “Nessie’s” existence has ever been found. Could a plesiosaur be hiding in the depths of the lake?

Now, an international team of scientists led by Neil Gemmell from the University of Otago, New Zealand, will conduct an investigation into the waters of the famous loch which could help settle the mystery once and for all. The scientists will sample the water using so-called environmental DNA (eDNA), which will enable them to identify tiny remnants of genetic material left behind by any life in the loch. This technique will allow them to create a detailed list of all the organisms living in the waters and determine whether anything unusual—a larger marine monster, for example—resides there.

Gemmell maintains that finding traces of such an animal would be extremely unlikely, but he is nevertheless intrigued by what his team might find.

"Large fish, like catfish and sturgeons, have been suggested as possible explanations for the monster myth, and we can very much test that idea and others," he said in a statement. Looking for traces of a mythological monster is not the only goal of the upcoming research, though.

"While the prospect of looking for evidence of the Loch Ness monster is the hook to this project, there is an extraordinary amount of new knowledge that we will gain from the work about organisms that inhabit Loch Ness, the U.K.'s largest freshwater body,” Gemmell said.

The project will help researchers to understand more about both native and invasive species in the loch, such as the Pacific pink salmon. Gemmell predicted that the team would uncover never-before-documented species, bacteria in particular.

Scientists have isolated the gene responsible for temperature-controlled sex determination in turtles.

Red-eared slider turtles, a common household pet, develop into male or female embryos according to their egg incubation temperature. This little understood process is also at work in the eggs of crocodiles, alligators and some lizards. Researchers are now one step closer to solving a mystery which has persisted for over 50 years.

An international team from China and the United States used a recently refined process to "knock out" the gene they suspected to be responsible for sex determination in the turtles - known as Kdm6b.

"Knockouts come in several flavors," explained Prof Blanche Capel from Duke University, an author on the study. "It usually means a genetic manipulation that deletes a gene from the genome or blocks its function." With Kdm6b blocked, over 80% of turtles incubated at the (usually male-producing) temperature of 26C shifted their development to female. Females usually only develop when eggs are incubated at 32C.

Dr Nicole Valenzuela from Iowa State University, who was not involved in the study, noted that the findings confirmed earlier predictions that such genes "are themselves turned on at a temperature that produces one sex and turned off at a temperature that produces the opposite sex".

Uncertain future

Other recent studies have suggested that rising temperatures due to climate change could be causing a female skew in turtle populations in the wild. Hatchling mortality could also increase if nests remain at high temperatures for long periods of time. Prof Capel points out that "if you're incubating at low temperature, males take almost twice as long to incubate as females."

Although interest in the term “machine learning” has heated up, interest in “robotics” (as expressed in Google Trends) has changed little over the last three years. So how much of a place is there for machine learning in robotics?

Most robots are not humanoid, and likely still won't be 10 years from now; as robots are designed for a range of behaviors in a plethora of environments, their bodies and physical abilities will reflect a best fit for those characteristics. An exception will likely be robots that provide medical or other care or companionship for humans, and perhaps service robots that are meant to establish a more personal and ‘humanized’ relationship.

Like many innovative technological fields today, robotics has been, and continues to be, influenced and in some directions steered by machine learning. According to a recent survey published by Evans Data Corporation, machine learning and robotics sit at the top of developers' priorities for 2016, with 56.4 percent of participants stating that they're building robotics apps and 24.7 percent of all developers indicating the use of machine learning in their projects.

The following overview highlights five key areas where machine learning has had a significant impact on robotic technologies, both at present and in development for future uses. Though by no means exhaustive, the summary is intended to give readers a taste of the machine learning applications that exist in robotics and to stimulate further research in these and other areas.

Researchers already know that online fake news spreads much more quickly and more widely than real news. My research has similarly found that online posts with fake medical information get more views, comments, and likes than those with accurate medical content. In an online world where viewers have limited attention and are saturated with content choices, fake information often appears to be more appealing or engaging to viewers.

The problem is getting worse: By 2022, people in developed economies could be encountering more fake news than real information. This could bring about a phenomenon that researchers have dubbed “reality vertigo” — in which computers can generate such convincing content that regular people may have a hard time figuring out what’s true anymore.

Automated detection methods, however, assume that the people who spread fake news don't change their approaches. In reality, they often shift tactics, manipulating the content of fake posts to make them look more authentic.

Context is also key. Words’ meanings can change over time. And the same word can mean different things on liberal sites and conservative ones. For example, a post with the terms “WikiLeaks” and “DNC” on a more liberal site could be more likely to be news while on a conservative site it could refer to a particular set of conspiracy theories.

Deep inside a cloud data center, a cluster of computers compares a critically ill patient's symptoms, lab results, and medical history against a vast data set gleaned from thousands of medical journals and patient records. In seconds, the cluster produces a life-saving diagnosis that might have taken doctors and specialists weeks to figure out — too late for the patient with the mystery illness.

Such a feat would have been inconceivable not so long ago; but thanks to progress in the field of machine learning, this kind of capability is approaching reality, as advances like cloud computing provide both ready access to vast data sets and the compute power that machine-learning algorithms need to crunch them. For instance, algorithms trained on hundreds of thousands of medical images are already spotting some types of cancer in less than a second, while others detect heart conditions and lung cancers. Still another AI-based system, highlighted by Scientific American, crowdsources opinions from thousands of physicians so that machine learning algorithms can improve the diagnostic accuracy of individual doctors.

Behind all this is the recent resurgence in AI's capabilities thanks to advances in the speed of graphics processing units (GPUs), a type of microprocessor normally used for image processing that, it turns out, is great for building massively-parallel neural networks. When installed in data center computers, these GPUs allow AI algorithms to run in the cloud — democratizing AI and making it available to all, even from smartphones. The neural networks let deep machine learning (ML) algorithms recognize and learn from previously indiscernible patterns in data sets too large for humans to cope with. It's this powerful combination of cloud-based GPUs and deep learning networks that, after many false starts over the decades, is finally allowing AI to take off big time.

Diagnosing diseases with the same ease as Netflix recommends movies, or Amazon's Alexa answers a question, is only one arena in which AI's effects will be felt. A dizzying array of industries are going to be hugely impacted by AI, from agriculture to hospitality, retail, manufacturing, aviation, shipping, automotive, energy, finance, and logistics.

Fueling AI's rise are a clutch of key technologies that are on the march at the same time: big data, global connectivity, robotics, the Internet of Things and cloud computing. These will help amplify the impact of AI and ML and produce what Klaus Schwab, founder of the World Economic Forum, is calling the Fourth Industrial Revolution. Following the steam-driven mechanical revolution of James Watt, the electrical revolution of Thomas Edison and Nikola Tesla, and the digital revolution of Alan Turing and John von Neumann, the fourth will bring about a transformation based on machines that appear to mimic human thinking.

This raises the question: What will human-competitive, or cognitive, machines mean for our futures, and in particular for human employment, in coming decades? Well, there are some useful clues in those previous industrial revolutions, says the London-based business consultancy Deloitte. In a novel type of study, the firm analyzed the occupations people reported in every British census since 1871. While the firm found that, as expected, technology eliminated some jobs, such as weavers replaced by automated looms and telephone operators supplanted by automatic phone exchanges, it also found that, over time, new technologies generated far more jobs than they destroyed.

"The last 144 years demonstrate that when a machine replaces a human, the result, paradoxically, is faster growth and, in time, rising employment," states Deloitte in a summary of its report.

Fashion and textiles may be among the last places you’d expect to see innovation. But university researchers in Florida have developed a fabric that can alter its color with a mild change in temperature.

Now fashion designers will be able to modify your handbag or scarf to match the rest of your outfit, thanks to a new fabric dubbed ChroMorphous.

Dr. Ayman Abouraddy, professor of optics and photonics at the College of Optics & Photonics at the University of Central Florida (CREOL), said in an interview with VentureBeat that the age of user-controlled color-changing fabric is here. “Our goal is to bring this technology to the market to make an impact on the textile industry,” he said.

With ChroMorphous, each woven thread is equipped with a micro-wire and a color-altering pigment. You can use your smartphone to change the color or pattern of the fabric on demand, as the wire can alter the temperature of the fabric in a quick and uniform way. The change in temperature is barely noticeable by touch, Abouraddy said.

Abouraddy and Josh Kaufman worked on optical technology for more than a decade at CREOL, but in the past couple of years they have veered away from that work to produce this new kind of fabric.

A supermassive black hole in the center of the ultra-luminous quasar SMSS J215728.21-360215.1 (J2157-3602 for short) devours a mass equivalent to our Sun every two days and has a total mass of roughly 20 billion solar masses, according to new research.

The newly discovered black hole, which is located approximately 12.5 billion light-years from Earth, is growing so rapidly that it shines thousands of times more brightly than an entire galaxy, owing to the friction and heat generated by all of the gas it sucks in daily.

An image from the VISTA Hemisphere Survey shows the ultra-luminous quasar SMSS J215728.21-360215.1 (center). Image credit: Wolf et al.
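The quoted brightness follows from standard accretion physics: infalling matter radiates a fraction of its rest-mass energy. A back-of-the-envelope check (assuming a typical 10% radiative efficiency; the Milky Way luminosity here is a rough literature value, not from the paper):

```python
C = 2.998e8          # speed of light, m/s
M_SUN = 1.989e30     # solar mass, kg
EFFICIENCY = 0.1     # fraction of rest mass radiated (typical for accretion disks)
L_MILKY_WAY = 4e36   # rough total luminosity of the Milky Way, watts

# One solar mass swallowed every two days.
mdot = M_SUN / (2 * 86400)           # accretion rate, kg/s
luminosity = EFFICIENCY * mdot * C**2

print(f"{luminosity:.1e} W")  # ~1e41 W
print(f"{luminosity / L_MILKY_WAY:.0f}x the Milky Way")  # tens of thousands
```

That ratio, in the tens of thousands, is consistent with the article's "thousands of times more brightly than an entire galaxy."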

“If we had this monster sitting at the center of our Milky Way Galaxy, it would appear 10 times brighter than a full moon,” said lead author Dr. Christian Wolf, from the Research School of Astronomy and Astrophysics at the Australian National University (ANU) and the ARC Centre of Excellence for All-sky Astrophysics (CAASTRO). “It would appear as an incredibly bright pin-point star that would almost wash out all of the stars in the sky. The energy emitted from this supermassive black hole was mostly UV light, but also radiated X-rays. Again, if this monster was at the center of the Milky Way it would likely make life on Earth impossible with the huge amounts of X-rays emanating from it.”

The J2157-3602 black hole was found by combining data from the recent Gaia data release 2 with data from the SkyMapper telescope at the ANU Siding Spring Observatory and NASA’s Wide-field Infrared Survey Explorer (WISE). “Large and rapidly-growing black holes are exceedingly rare, and we have been searching for them with SkyMapper for several months now,” Dr. Wolf said. “ESA’s Gaia satellite, which measures tiny motions of celestial objects, helped us find this supermassive black hole.”

Researchers from the University of Tokyo examined how neurons behave and found that they could be trained to join with one another using a 'synthetic neuron-adhesive material'.

A team of researchers in Japan has managed to recreate a tiny portion of a human brain, piece by piece, with more precision than ever before. The team connected networks of neurons, the pathways along which information travels through our brains, with remarkable accuracy.

It could be the first step toward the creation of brains in the lab that mirror our own, although millions of connected neurons are needed to perform even basic tasks. The researchers examined how neurons behave and found that they could be trained to join with one another using a 'synthetic neuron-adhesive material', which they call their microscopic plates.

When the team connected two of these artificial neuron scaffolds together, they found that they were able to transmit electrical signals between them. There are a number of practical reasons why scientists hope to realise their ambition of creating a fully-functional artificial brain, and this finding is one small step toward that goal.

Growing brains in a lab could afford us greater understanding of a range of neurological and degenerative disorders, including Alzheimer’s and Parkinson’s diseases. While the latest finding is a long way off from achieving this, it does raise some interesting philosophical and ethical questions. Would a lab grown brain be capable of thinking and, if so, does that make it a person? To become a full person in the eyes of the law, an artificial brain would need to be conscious, with definitions of what constitutes 'consciousness' still up for debate.

Scientists at the Department of Energy's Oak Ridge National Laboratory are the first to successfully simulate an atomic nucleus using a quantum computer. The results, published in Physical Review Letters, demonstrate the ability of quantum systems to compute nuclear physics problems and serve as a benchmark for future calculations.

Quantum computing, in which computations are carried out based on the quantum principles of matter, was proposed by American theoretical physicist Richard Feynman in the early 1980s. Unlike normal computer bits, the qubit units used by quantum computers store information in two-state systems, such as electrons or photons, that are considered to be in all possible quantum states at once (a phenomenon known as superposition).

"In classical computing, you write in bits of zero and one," said Thomas Papenbrock, a theoretical nuclear physicist at the University of Tennessee and ORNL who co-led the project with ORNL quantum information specialist Pavel Lougovski. "But with a qubit, you can have zero, one, and any possible combination of zero and one, so you gain a vast set of possibilities to store data."

In October 2017 the multidivisional ORNL team started developing codes to perform simulations on the IBM QX5 and the Rigetti 19Q quantum computers through DOE's Quantum Testbed Pathfinder project, an effort to verify and validate scientific applications on different quantum hardware types. Using freely available pyQuil software, a library designed for producing programs in the quantum instruction language Quil, the researchers wrote a code that was sent first to a simulator and then to the cloud-based IBM QX5 and Rigetti 19Q systems.

The team performed more than 700,000 quantum computing measurements of the energy of a deuteron, the nuclear bound state of a proton and a neutron. From these measurements, the team extracted the deuteron's binding energy—the minimum amount of energy needed to disassemble it into these subatomic particles. The deuteron is the simplest composite atomic nucleus, making it an ideal candidate for the project. "Qubits are generic versions of quantum two-state systems. They have no properties of a neutron or a proton to start with," Lougovski said. "We can map these properties to qubits and then use them to simulate specific phenomena—in this case, binding energy."
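The quantum computation has a simple classical benchmark: in a small basis, the deuteron Hamiltonian is a tiny matrix, and the variational search the team ran on qubits can be mimicked by scanning a single angle. A sketch with illustrative matrix elements (the values below are of realistic magnitude for a two-basis-state deuteron Hamiltonian but are hypothetical, not copied from the paper):

```python
import numpy as np

# Illustrative 2x2 deuteron Hamiltonian in MeV (hypothetical values).
H = np.array([[-0.44, -4.29],
              [-4.29, 12.25]])

# Exact ground-state energy: just diagonalize.
exact = np.linalg.eigvalsh(H)[0]

# Variational ansatz: |psi(theta)> = cos(theta)|0> + sin(theta)|1>,
# the same one-parameter search a VQE-style quantum run performs.
thetas = np.linspace(0, np.pi, 10_000)
states = np.stack([np.cos(thetas), np.sin(thetas)])     # shape (2, N)
energies = np.einsum("in,ij,jn->n", states, H, states)  # <psi|H|psi>
variational = energies.min()

print(round(exact, 3), round(variational, 3))  # the two agree closely
```

On real hardware, each energy evaluation in this scan becomes thousands of noisy qubit measurements, which is why the team ran over 700,000 of them.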

A challenge of working with these quantum systems is that scientists must run simulations remotely and then wait for results. ORNL computer science researcher Alex McCaskey and ORNL quantum information research scientist Eugene Dumitrescu ran single measurements 8,000 times each to ensure the statistical accuracy of their results. "It's really difficult to do this over the internet," McCaskey said. "This algorithm has been done primarily by the hardware vendors themselves, and they can actually touch the machine. They are turning the knobs."

The team also found that quantum devices become tricky to work with due to inherent noise on the chip, which can alter results drastically. McCaskey and Dumitrescu successfully employed strategies to mitigate high error rates, such as artificially adding more noise to the simulation to see its impact and deduce what the results would be with zero noise.

On August 6, 1945, Shigeru Orimen traveled from his rural home near Itsukaichi-cho to Hiroshima, where he was one of nearly 27,000 students working to prepare the city for impending U.S. airstrikes. For lunch that day, he had brought soybeans, sautéed potatoes and strips of daikon.

When the atomic bomb fell on Hiroshima at 8:16 a.m., Shigeru was among the nearly 7,200 students who perished. Three days later, his mother Shigeko would identify his body by his lunch box; the food inside had been charred to carbon, but the box itself remained intact.

Today, his lunch box and Shigeko’s testimony are part of the archives at the Hiroshima Peace Memorial Museum. The object and its story left a haunting impression on filmmakers Saschka Unseld and Gabo Arora, who co-directed a new virtual reality experience titled The Day the World Changed. Created in partnership with Nobel Media to commemorate the work of the International Campaign to Abolish Nuclear Weapons (the winner of the 2017 Nobel Peace Prize), the film premiered at the Tribeca Film Festival last week.

The immersive experience begins with an explanation of the genesis, development, and deployment of the atomic bomb and then moves to a second chapter focused on the aftermath of the attack. Audience members can walk through the ruins of the city and examine artifacts from the bombing, including Shigeru’s lunch box. In the final chapter, the piece shifts toward the present, describing the frenetic race to create new atomic weapons and the continued threat of nuclear war.

It’s hardly the only piece at Tribeca to focus on difficult topics: Among the festival’s 34 immersive titles are pieces that grapple with the legacy of racism, the threat of climate change, AIDS and the ongoing crisis in Syria. Nor is it the first VR installation to achieve popular acclaim. Last November, filmmaker Alejandro G. Iñárritu received an Oscar at the Academy’s Governors Awards for his virtual reality installation CARNE y ARENA, which captures the experience of migrants crossing the U.S.-Mexico border.

The Day the World Changed differs from these installations in a critical respect: Much of the material already exists in an archival format. Video testimony and radiated relics from the day of devastation come from the museum’s archives and photogrammetry (the creation of 3D models using photography) allowed for digital reproductions of surviving sites. In this sense, the piece shares more with the interpretive projects led by traditional documentarians and historians than the fantastical or gamified recreations that most associate with virtual reality.

What makes it different, Arora and Unseld say, is that the storytelling possibilities enabled by immersive technologies allow viewers to experience previously inaccessible locations—for example, the inside of the Atomic Dome, the Unesco World Heritage site directly underneath the explosion of the bomb that remains intact—and engage with existing artifacts in a more visceral way. The future is exciting, though there’s a certain tension given the national conversation on the dangers of technological manipulation. “You have to be very careful,” Arora says. “We think it’s important to figure out the grammar of VR and not just rely on an easy sort of way of horrifying people. Because that doesn’t last.”

The composition of the biosphere is a fundamental question in biology, yet a global quantitative account of the biomass of each taxon is still lacking. A research group has now assembled a census of the biomass of all kingdoms of life. This analysis provides a holistic view of the composition of the biosphere and makes it possible to observe broad patterns across taxonomic categories, geographic locations, and trophic modes.

A census of the biomass on Earth is key to understanding the structure and dynamics of the biosphere. However, a global, quantitative view of how the biomasses of different taxa compare with one another is still lacking. This recent study assembled the overall biomass composition of the biosphere, establishing a census of the ≈550 gigatons of carbon (Gt C) of biomass distributed among all of the kingdoms of life. The researchers found that the kingdoms of life concentrate in different locations on the planet: plants (≈450 Gt C, the dominant kingdom) are primarily terrestrial, whereas animals (≈2 Gt C) are mainly marine, and bacteria (≈70 Gt C) and archaea (≈7 Gt C) are predominantly located in deep subsurface environments. The group showed that terrestrial biomass is about two orders of magnitude higher than marine biomass and estimated a total of ≈6 Gt C of marine biota, roughly double the previous estimate. The analysis also reveals that the global marine biomass pyramid contains more consumers than producers, broadening earlier observations of inverted food pyramids. Finally, the researchers highlight that the mass of humans is an order of magnitude higher than that of all wild mammals combined, and they document the historical impact of humanity on the global biomass of prominent taxa, including mammals, fish, and plants.
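The relative shares implied by the figures quoted above can be checked with a few lines of arithmetic (a rough sketch using only the Gt C values stated in the summary; the remaining kingdoms, such as fungi and protists, make up the difference to 550 Gt C):

```python
# Biomass figures (gigatons of carbon) quoted in the study summary.
biomass_gt_c = {
    "plants": 450,
    "bacteria": 70,
    "archaea": 7,
    "animals": 2,
}
total = 550  # study's estimate for all kingdoms combined

for taxon, mass in biomass_gt_c.items():
    share = 100 * mass / total
    print(f"{taxon:>9}: {mass:>4} Gt C ({share:.1f}% of total)")
```

Plants alone account for roughly 82% of the global total, which is why the curator's note below calls them the "real" life forms on Earth.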

Dr. Stefan Gruenwald's insight:

Plants are the "real" life forms on this Earth, harvesting sunlight and converting it into storable chemical energy and biomass. Animals can be viewed as parasites of plants.

Medical physicists at Martin Luther University Halle-Wittenberg (MLU) have developed a new method that can generate detailed three-dimensional images of the body's interior. These can be used to more closely investigate the development of cancer cells. The research group published its findings in Communications Physics.

Clinicians and scientists seek to better understand cancer cells and their properties in order to provide targeted cancer treatment. Individual cancer cells are often examined in test tubes before the findings are tested in living organisms. "Our aim is to visualise cancer cells inside the living body to find out how they function, how they spread and how they react to new therapies," says medical physicist Professor Jan Laufer from MLU. He specializes in the field of photoacoustic imaging, a process that uses ultrasound waves generated by laser beams to produce high-resolution, three-dimensional images of the body's interior.

"The problem is that tumor cells are transparent. This makes it difficult to use optical methods to examine tumors in the body," explains Laufer, whose research group has developed a new method to solve this problem. First, the scientists introduce a specific gene into the genome of the cancer cells. "Once inside the cells, the gene produces a phytochrome protein, which originates from plants and bacteria, where it serves as a light sensor," Laufer says. In the next step, the researchers illuminate the tissue with short laser pulses at two different wavelengths. Inside the body, the light pulses are absorbed and converted into ultrasonic waves. These waves can then be measured outside the organism, and two images of the body's interior can be reconstructed from this data.

"The special feature of phytochrome proteins is that they alter their structure and thus also their absorption properties, depending on the wavelength of the laser beams. This results in changes to the amplitude of the ultrasound waves that are generated in the tumor cells. None of the other tissue components, for example, blood vessels, have this property—their signal remains constant," Laufer says. By calculating the difference between the two images, a high-resolution, three-dimensional image of the tumor cells is created, which is free of the otherwise overwhelming background contrast.
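The subtraction step Laufer describes can be sketched numerically: the phytochrome-labeled cells respond differently at the two wavelengths, while everything else (vessels, surrounding tissue) produces the same signal in both images, so it cancels in the difference. This toy example uses invented image shapes and signal values, not the group's actual reconstruction pipeline:

```python
import numpy as np

rng = np.random.default_rng(0)

# Background tissue and vessels: identical signal at both wavelengths.
background = rng.random((64, 64))

# Labeled tumor cells: phytochrome switches absorption state with wavelength.
tumor_mask = np.zeros((64, 64))
tumor_mask[20:30, 20:30] = 1.0

image_wl1 = background + 0.8 * tumor_mask  # high-absorption phytochrome state
image_wl2 = background + 0.2 * tumor_mask  # low-absorption phytochrome state

# Background cancels; only the wavelength-dependent tumor signal remains.
difference = image_wl1 - image_wl2
```

Pixels inside the tumor region retain a strong signal in `difference`, while the background contrast vanishes, mirroring the "free of the otherwise overwhelming background contrast" result described above.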

The method developed by the Halle medical physicists has a wide range of applications in preclinical research and the life sciences. In addition to cancer research, it can be used to observe cellular and genetic processes in living organisms.

Using genome sequences from present-day birds and reptiles, a research team has attempted to reconstruct the genome of an ancestral group of four-limbed amniotes known as diapsids, from which birds, reptiles, and dinosaurs descended.

The researchers brought together chromosome-level genome assemblies for four birds and reptiles, using bioinformatic analyses and new cytogenetic data to reconstruct ancestral genome organization patterns, adaptive genetic changes, and conserved sequences in the lineage with the help of evolutionary breakpoint regions (EBRs) and homologous synteny blocks (HSBs) gleaned from the extant genome data. With these data, the team was able to reach back some 260 million years, predicting the chromosome structure of the diapsid Eunotosaurus ancestor of birds, reptiles, dinosaurs, and pterosaurs. The findings were published in Nature Communications.

"It is fascinating to see how the knowledge we collect about extant species' genomes coupled with smart computation technologies allow us to go back in time and learn about the genomes and biology of species that existed long before humans appeared and for which we would never have a biological sample," co-senior author Denis Larkin, a comparative biomedical sciences researcher at the University of London's Royal Veterinary College, said in a statement.

Building on a prior chromosomal reconstruction study based on half a dozen extant bird genomes, the team used a "multiple-genome rearrangement and analysis" (MGRA2) analytical tool to retrace karyotypic patterns in the diapsid ancestral genome using genome assemblies for the chicken, zebra finch, mallard, Carolina anole lizard, and the grey short-tailed opossum, which represents a mammalian outgroup. The search uncovered nearly 400 multi-species HSBs, making it possible to reconstruct 19 chromosome-like "contiguous ancestral regions" with MGRA2 analyses.
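The core logic behind turning HSBs into "contiguous ancestral regions" can be illustrated with a toy example: blocks that sit next to each other in every descendant genome are inferred to have been adjacent in the ancestor. This is a drastic simplification of what MGRA2 actually does, and the block names and genomes below are invented:

```python
# Each genome is a list of chromosomes; each chromosome is an ordered list
# of homologous synteny blocks (HSBs).
genomes = {
    "chicken":     [["B1", "B2", "B3"], ["B4", "B5"]],
    "zebra finch": [["B1", "B2"], ["B3"], ["B4", "B5"]],
    "anole":       [["B2", "B1"], ["B3", "B4", "B5"]],  # reversal keeps adjacency
}

def adjacencies(genome):
    """Collect orientation-agnostic block adjacencies within each chromosome."""
    pairs = set()
    for chrom in genome:
        for a, b in zip(chrom, chrom[1:]):
            pairs.add(frozenset((a, b)))
    return pairs

# Adjacencies conserved across all species are assigned to the ancestor.
conserved = set.intersection(*(adjacencies(g) for g in genomes.values()))
print(sorted(tuple(sorted(p)) for p in conserved))
```

Chaining such conserved adjacencies is what yields chromosome-like contiguous ancestral regions; MGRA2 additionally models rearrangement histories across the phylogeny rather than taking a simple intersection.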

The researchers also evaluated genome assemblies for several other extant species such as the turkey, budgerigar, and ostrich, but did not include those genomes in their subsequent analyses due to sequence fragmentation or genome misassembly issues that cropped up when they assessed the genomes cytogenetically using fluorescence in situ hybridization.

In addition to the bioinformatic look at the genomes, the team used cross-species molecular cytogenetics approaches to profile and compare chromosomal patterns in the chicken, Carolina anole lizard, red-eared slider turtle, and spiny soft-shelled turtle. It also did pairwise genome alignments with the Evolution Highway chromosome browser to look at HSB positions across species.

Together, these data provided a peek at ancestral diapsid genomes, as well as the within- and between-chromosome rearrangements that occurred in the diverse dinosaur, reptile, and bird descendants of the diapsids.

The procedure is rare, but could potentially help many more patients who experience similar bodily damage.

To reconstruct a patient’s lost ear, doctors at William Beaumont Army Medical Center in El Paso, Texas, sculpted a new one from rib cartilage and implanted it under the tissue of the patient’s forearm to foster blood vessel growth.

The patient is Army private Shamika Burrage, who lost her ear in a car accident two years ago, reports Neel V. Patel for Popular Science. Burrage was returning from leave when her car’s front tire blew, sending the vehicle flipping across the road and ejecting her from her seat.

Burrage, now 21, spent several months in rehabilitation after the accident but sought counseling when she continued to suffer from insecurities about her appearance. “I didn’t feel comfortable with the way I looked so the provider referred me to plastic surgery,” Burrage says.

During the reconstruction process, surgeons reopened Burrage’s hearing canal to restore her hearing and implanted the vascularized ear in its rightful place. She will require two more surgeries to complete the process, but is currently faring well, according to a U.S. Army statement on the procedure.

“The whole goal is that by the time she’s done with all this, it looks good, it’s sensate, and in five years if somebody doesn’t know her they won’t notice,” says Lt. Col. Owen Johnson III, the chief of plastic and reconstructive surgery at the facility, in the statement.

Though a first for Army plastic surgeons, the procedure has long roots in medical practice, Patel reports. Since the early 20th century, doctors have reconstructed parts of ears in people with congenital deformities using a technique that involves harvesting rib cartilage from the chest, sculpting it into the shape of an ear, and implanting it under the skin where the ear is normally located.

As Patel writes, the second stage of the latest ear transplant, known as microvascular free tissue transfer, only became popular in the late 1990s. By stitching the implanted tissue to blood vessels, doctors can help it develop into “healthy, functioning tissue in a new area,” Patrick Byrne, the director of the Division of Facial Plastic and Reconstructive Surgery at Johns Hopkins University School of Medicine who pioneered this method, tells Patel.

Dueling neural networks. Artificial embryos. AI in the cloud. Welcome to our annual list of the 10 technology advances we think will shape the way we work and live now and for years to come.

Every year since 2001 the people at Technology Review have picked what they call the 10 Breakthrough Technologies. People often ask, what exactly is meant by “breakthrough”? It’s a reasonable question—some of the picks haven’t yet reached widespread use, while others may be on the cusp of becoming commercially available. What Technology Review is really looking for is a technology, or perhaps even a collection of technologies, that will have a profound effect on our lives.

For 2018, a new technique in artificial intelligence called GANs is giving machines imagination; artificial embryos, despite some thorny ethical constraints, are redefining how life can be created and are opening a research window into the early moments of a human life; and a pilot plant in the heart of Texas’s petrochemical industry is attempting to create completely clean power from natural gas—probably a major energy source for the foreseeable future.

The goal of the Google Quantum AI lab is to build a quantum computer that can be used to solve real-world problems. This strategy is designed to explore near-term applications using systems that are forward compatible with a large-scale universal error-corrected quantum computer. For a quantum processor to run algorithms beyond the scope of classical simulation, it requires not only a large number of qubits; crucially, the processor must also have low error rates on readout and logical operations, such as single- and two-qubit gates.

Google states: "The guiding design principle for this device is to preserve the underlying physics of our previous 9-qubit linear array technology, which demonstrated low error rates for readout (1%), single-qubit gates (0.1%) and most importantly two-qubit gates (0.6%) as our best result. This device uses the same scheme for coupling, control, and readout, but is scaled to a square array of 72 qubits. We chose a device of this size to be able to demonstrate quantum supremacy in the future, investigate first and second order error-correction using the surface code, and to facilitate quantum algorithm development on actual hardware."
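Why those error rates matter can be seen with a back-of-the-envelope estimate: if errors compound independently (a standard first approximation, not Google's own analysis), the probability that an entire circuit runs without error is the product of the per-operation success probabilities.

```python
# Error rates quoted above: readout 1%, single-qubit 0.1%, two-qubit 0.6%.
def circuit_fidelity(n_single, n_two, n_readout,
                     e_single=0.001, e_two=0.006, e_readout=0.01):
    """Probability a circuit completes error-free, assuming independent errors."""
    return ((1 - e_single) ** n_single
            * (1 - e_two) ** n_two
            * (1 - e_readout) ** n_readout)

# A modest circuit: 100 single-qubit gates, 50 two-qubit gates, 9 readouts.
print(f"{circuit_fidelity(100, 50, 9):.2f}")
```

Even at these low rates, fidelity decays rapidly as circuits grow, which is why error rates, not just qubit count, determine whether a processor can outrun classical simulation.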

There's a new nuclear simulator on the internet, and it's here to emphasize just how awful a 50,000KT blast would be.

For years, one of the more perversely interesting things on the internet has been Alex Wellerstein's NUKEMAP, which — true to its name — shows you the estimated damage if you dropped a nuclear weapon anywhere in the world. Now the Outrider Foundation has released its own, rather more elegant version, and we're back to blowing up our backyards.

Outrider's simulator lets you enter any location and select from a number of bomb strengths, from the 15KT Little Boy (the first nuke used in war) to the 50,000KT Tsar Bomba, which Russia tested in 1961. The simulator estimates the number of casualties and describes what would happen within the various reaches of the blast (radiation, shock wave, etc). To illustrate, we dropped a 300KT W-87, which is currently part of the US nuclear arsenal. And while much of Brooklyn is destroyed, the damage stretches into lower Manhattan and across to New Jersey as well.

So that brings us to Tsar Bomba. If the USSR were to build another Tsar Bomba and detonate the 50,000KT behemoth over the Digg offices in lower Manhattan, how bad would that be? Reader, it would be extremely bad. The simulator estimates that Tsar Bomba would kill over 7.5 million people if it were detonated in an air burst over lower Manhattan, with the heat wave reaching well into New Jersey, Connecticut and Long Island.
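A useful rule of thumb behind comparisons like these is that blast damage radii scale roughly with the cube root of yield, so a bomb 3,333 times more powerful than Little Boy has a damage radius only about 15 times larger. The sketch below uses that rule with an assumed reference radius; real simulators like NUKEMAP and Outrider's model each effect (blast, heat, radiation) separately:

```python
# Cube-root yield scaling: radius ~ yield ** (1/3).
def scaled_radius(reference_radius_km, reference_yield_kt, yield_kt):
    """Scale a known damage radius to a different yield."""
    return reference_radius_km * (yield_kt / reference_yield_kt) ** (1 / 3)

# Assume a 15 kt bomb produces heavy blast damage out to ~1.6 km, then scale:
for yield_kt in (15, 300, 50_000):
    print(f"{yield_kt:>6} kt -> {scaled_radius(1.6, 15, yield_kt):.1f} km")
```

This is why the Tsar Bomba's destruction, while enormous, is not thousands of times the size of Hiroshima's footprint.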

You can play with the simulator here and pray that this remains a thought experiment forever.

Researchers at The Johns Hopkins Kimmel Cancer Center have developed a urine test, using samples gathered during a routine procedure, that detects DNA mutations associated with urothelial cancers.

UroSEEK uses urine samples to seek out mutations in 11 genes or the presence of abnormal numbers of chromosomes that would indicate the presence of DNA associated with bladder cancer or upper tract urothelial cancer (UTUC). The researchers said the test, when combined with cytology, the gold standard noninvasive test currently used for detection, significantly enhanced early detection for patients who are considered at risk for bladder cancer and surveillance of patients who had already been treated for bladder cancer.

These findings were published online on March 20 in eLife.

"There were nearly 80,000 new cases of bladder cancer and more than 18,000 deaths in 2017," said George Netto, M.D., a senior author on the UroSEEK paper, formerly at The Johns Hopkins University and currently chair of pathology at the University of Alabama-Birmingham. "This is about using the urine to detect the cancer. UroSEEK is a method of detection that many people have tried to find that is noninvasive."

Most cancers are curable if they are detected early, and the researchers are exploring ways to use cancer gene discoveries to develop cancer screening tests to improve cancer survival. They announced the development of CancerSEEK, a single blood test that screens for eight cancer types, and PapSEEK, a test that uses cervical fluid samples to screen for endometrial and ovarian cancers.

A team of Indiana University researchers has reported the first evidence that non-human animals (rats) can replay a stream of multiple episodic memories. The study was published in the journal Current Biology.

Episodic memory is the ability to remember specific events. For example, if a person loses their car keys, they might try to recall every single step — or ‘episode’ — in their trip from the car to their current location. The ability to replay these events in order is known as ‘episodic memory replay.’

“People wouldn’t be able to make sense of most scenarios if they couldn’t remember the order in which they occurred,” said Professor Jonathon Crystal, senior author on the study.

To assess animals’ ability to replay past events from memory, Professor Crystal and colleagues spent nearly a year working with 13 rats, which they trained to memorize a list of up to 12 different odors. The rats were placed inside an ‘arena’ with different odors and rewarded when they identified the second-to-last odor or fourth-to-last odor in the list.

The scientists changed the number of odors in the list before each test to confirm the odors were identified based upon their position in the list, not by scent alone, proving the animals were relying on their ability to recall the whole list in order. Arenas with different patterns were used to communicate to the rats which of the two options was sought.
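The task structure described above can be sketched in a few lines: the rewarded answer depends only on list position counted from the end, which is why varying the list length rules out responding to scent alone. The arena-pattern labels here are invented placeholders, not the ones used in the study:

```python
# The rewarded odor is the second-to-last or fourth-to-last item in the list,
# signaled by the arena's visual pattern (labels here are illustrative).
def rewarded_odor(odor_list, arena):
    index = {"striped": -2, "dotted": -4}[arena]
    return odor_list[index]

odors = ["mint", "lemon", "cedar", "rose", "clove"]
print(rewarded_odor(odors, "striped"))  # rose (second-to-last)
print(rewarded_odor(odors, "dotted"))   # lemon (fourth-to-last)
```

Because the correct answer shifts with every change in list length, a rat can only succeed by replaying the whole sequence in order, which is the signature of episodic memory replay the researchers were testing for.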

“After their training, the animals successfully completed their task about 87% of the time across all trials,” Professor Crystal said. “The results are strong evidence the animals were employing episodic memory replay.” Additional experiments confirmed the rats’ memories were long-lasting and resistant to ‘interference’ from other memories, both hallmarks of episodic memory.

The team also ran tests that temporarily suppressed activity in the hippocampus — the site of episodic memory — to confirm the rats were using this part of their brain to perform their tasks.

Viruses are more likely to evolve in similar ways in related species—raising the risk that they will "jump" from one species to another, new research shows.

Scientists from the universities of Exeter and Cambridge compared viruses that evolved in different species and found "parallel genetic changes" were more likely if two host species were closely related. The findings suggest that when a new virus appears in a species such as chimpanzees, closely related species like humans may become vulnerable too. Such jumps, also known as host shifts, are a major source of infectious disease, with viruses such as HIV, Ebola and SARS coronavirus all thought to have jumped into humans from other species.

The researchers used deep sequencing of genomes to track the evolution of viruses in 19 species of flies. "Our findings show that when a virus adapts to one host, it might also become better adapted to closely related host species," said Dr Ben Longdon, of the University of Exeter. "This may explain in part why host shifts tend to occur between related species. However, we sometimes see the same mutations occurring in distantly related host species, and this may help explain why viruses may sometimes jump between distantly related host species. At present we know very little about how viruses shift from one host species to another, so research like this is important if we want to understand and ultimately predict emerging viral diseases."

The fruit flies used in the study were 19 species from the Drosophilidae family, which shared a common ancestor 40 million years ago.
