Tag Archives: Science News


First cloned non-human primates: Zhong Zhong and Hua Hua (Image credit: Qiang Sun and Mu-ming Poo, Institute of Neuroscience of the Chinese Academy of Sciences)

Cloned primates are here! More than two decades after the birth of Dolly the sheep, scientists have now tackled cloning mammals that are even closer to us on the evolutionary tree: macaque monkeys. What does this mean for a society that witnesses dramatic changes day by day? Computers are outperforming doctors in calling out heart abnormalities in patients; 3D-printed organs are bringing us one step closer to tissue restoration; genome sequencing has become an online product easily available to anyone curious about their ancestry, bodybuilding, or simply their taste in wine. Breakthroughs in science and technology are so prevalent in our lives that by now, we probably shouldn't be surprised by any new discovery. Yet when the two cute little cloned monkeys were born, the whole world was, once again, shaken.

Published in Cell on January 24th, 2018, a study from a group of scientists in China reported their method for generating two non-human primates that are genetically identical. To clone the two identical macaque monkeys, the scientists applied somatic cell nuclear transfer, the same method that generated Dolly in 1996. The key idea behind cloning is that a new organism, be it sheep or monkey, is generated without sexual reproduction. Asexual reproduction is not as uncommon as one would think; plenty of plants do it. For example, Bryophyllum sheds plantlets from the edges of its leaves to produce new plants. Some insects, such as ants and bees, also exploit asexual reproduction to clone a huge worker-class army. Since asexual reproduction is essentially an organism duplicating itself, the offspring are all genetically identical. Evolution, however, doesn't favor asexual reproduction, as identical offspring don't prevail in a fast-changing environment. Sexual reproduction, on the other hand, combines different sperm and eggs to create diverse offspring, some of which may survive. To combat challenges from Mother Nature, higher organisms, such as mammals, almost exclusively reproduce sexually. This is why a cloned monkey, an anti-evolution human creation, is mind-blowing.

To clone mammals, scientists came up with the idea of transferring the nucleus of a somatic cell into an enucleated egg (an egg that lacks a nucleus). Unlike germ cells (sperm and eggs), somatic cells are cells that don't get passed on to the next generation. These cells carry the full genome of an organism, which is split equally between germ cells during sexual reproduction. Each carrying half of the genome, a sperm and an egg must fuse their genetic material to make one viable embryo. Technically, then, the nucleus of a somatic cell holds all the genetic information an organism needs. Thus, by inserting a somatic cell nucleus into an egg, scientists could generate a functional embryo. But why not into a sperm? Evolution has trimmed the mammalian sperm tremendously so that it can accomplish its only job better: swim faster to fertilize the egg. As a result, not much other than the sperm's genetic information is incorporated into the fertilized egg, and the embryo relies on the cellular machinery of the egg to finish development. Using this technology, the scientists generated over 300 "fertilized" embryos. Of these, 260 were transferred to 63 surrogate mothers to finish developing. Twenty-eight surrogate mothers became pregnant, and from those pregnancies, only 2 healthy monkey babies were born. Although they were carried by different surrogate mothers, every single piece of their genetic code is the same as that of the somatic nucleus donor, a real-life demonstration of primate cloning. Followed by millions of people since their debut to the world, these two macaque superstars are living proof of a revolutionary breakthrough in science and technology.
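To put that efficiency in perspective, the figures above work out to roughly the following rates (a quick back-of-the-envelope calculation, not from the paper itself):

```python
# Success rates for somatic cell nuclear transfer in macaques,
# using the figures reported above.
embryos_transferred = 260   # embryos implanted into surrogates
surrogates = 63             # surrogate mothers used
pregnancies = 28            # surrogates that became pregnant
live_births = 2             # healthy cloned monkeys born

pregnancy_rate = pregnancies / surrogates
birth_rate = live_births / embryos_transferred

print(f"Pregnancy rate: {pregnancy_rate:.1%}")                  # 44.4%
print(f"Live births per transferred embryo: {birth_rate:.2%}")  # 0.77%
```

Under one live birth per hundred transferred embryos: that is the "extremely low success rate" the next paragraph refers to.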

Despite the extremely low success rate, this technology erects another monument in the history of mankind's creations. Carrying identical genetic information, cloned monkeys like these two can be a very powerful tool in biomedical research and disease studies. Co-author Mu-ming Poo, director of the Chinese Academy of Sciences' Institute of Neuroscience in Shanghai, said that these monkeys could be used to study complicated genetic diseases in which environmental factors also play a significant role, such as Alzheimer's and Parkinson's diseases. While there are ethical concerns about this technology and how easily it could be applied to human cloning, it is worth noting that almost all human creations (explosives, GMO food, the internet, etc.) are double-edged swords. It is up to the hand that wields the sword to decide whether to do good or bad. It is wise to be cautious with the development of new technologies, but it's also important not to constrain our creativity. After all, it is our creative minds that drive us toward creating a better life for everyone.

If you love a cheesy sci-fi movie as much as I do, the word shark probably brings a few images to mind: swimmers rushing to shore, and a huge, hungry great white ready to devour anything in its sight. You may have even started humming the iconic Jaws theme. But you might be surprised to hear that off the big screen, not all sharks are out for blood. In fact, one shark prefers a leafy green salad.

We often think of sharks as strict meat eaters, but researchers at the University of California, Irvine are turning the meat-hungry shark stereotype on its head with their (mostly) vegetarian Bonnethead sharks. The Bonnethead shark is a small type of hammerhead shark often found in the warm, shallow waters of the Northern Hemisphere. Bonnetheads get their name from their distinct shovel-like head shape.

The Bonnethead shark’s unique head shape distinguishes it from its hammerhead cousins.

Though distinct in appearance, the characteristic that makes the Bonnethead shark truly unique is its diet. Sharks are infamous meat-eaters. The Bonnethead, however, prefers its meat with a side of veggies. Studies of the Bonnethead's diet began in 2007, when large amounts of seagrass were found in the stomach of a shark in the Gulf of Mexico. For many years, it was thought the seagrass was indigestible and eaten by accident while the sharks hunted for shrimp, mollusks, and small fish in seagrass-filled waters. Recent research now suggests Bonnethead sharks can digest the seagrass they eat and could use it as a source of nutrients.

As the first seagrass-eating shark to be discovered, the Bonnethead raises many questions. Does it eat seagrass on purpose? Or is the seagrass accidentally consumed while hunting for creatures on the ocean floor? Perhaps the most puzzling question is how Bonnetheads are able to digest seagrass at all. Because Bonnethead sharks have the short intestines typical of a strict meat eater, scientists suspect bacteria living in the gut give the Bonnethead the ability to digest seagrass. More research is needed to discover which, if any, bacteria help the Bonnethead digest its dinner.

Although questions remain, one thing is certain: the Bonnethead shark is a unique and remarkable creature with much to teach its human neighbors about what constitutes a five-star meal under the sea.

Earlier this year, the U.S. government released the Climate Science Special Report. This document describes the state of the Earth’s climate, specifically focusing on the U.S. If you are someone who is interested in environmental science or policy, you may have thought about reading it. But where to start? The report contains fifteen chapters and four additional appendices, so reading it may seem daunting. We published this summary of the report to provide a brief introduction to climate change, and to provide a starting point for anyone who wants to learn more.

“Climate change” is a phrase that has become ubiquitous throughout many aspects of American and global society, but what exactly is climate change?

Like weather, climate takes into account temperature, precipitation, humidity, and wind patterns. However, while weather refers to the status of these factors on any given day, climate describes the average weather for a location over a long period of time. We can consider a climate for a specific place (for example, the Caribbean Islands have a warm, humid climate), or we can consider all of Earth’s climate systems together, which is known as the global climate.

Depending on where you live, you may have seen how weather can change from day to day. It may be sunny one day, but cool and rainy the next. Climate change differs from changes in weather because it describes long-term shifts in average weather. For example, a place with a changing climate may traditionally be warm and sunny but, over many years, become cooler and wetter. While weather may fluctuate from day to day, climate change consists of gradual changes that occur over long periods of time. Climate is viewed through a historical lens, comparing changes over many years. Though we may not notice the climate changing on a daily basis, it can have drastic effects on our everyday lives. It can impact food production, world health, and the prevalence of natural disasters.

Summary of the potential physical, ecological, social and large-scale impacts of climate change. The plot shows the impacts of climate change versus changes in global mean temperature (above the 1980-1999 level). The arrows show that impacts tend to become more pronounced for higher magnitudes of warming. Dashed arrows indicate less certainty in the projected impact and solid arrows indicate a high level of certainty.

What Causes Climate Change?

The major factor determining the Earth's climate is radiative balance. Radiation is energy transmitted into and out of the Earth's atmosphere, surface, and oceans. Incoming radiation comes mostly from the light and heat energy of the Sun. Earth can lose energy in several ways. It can reflect a portion of the Sun's radiation back into space. It can also absorb the Sun's energy, causing the planet to heat up and emit low-energy infrared radiation back toward space. The balance of incoming and outgoing radiation determines the characteristics of climate, such as temperature, humidity, wind, and precipitation. When the balance of incoming and outgoing radiation changes, the climate also changes.
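Radiative balance can be made concrete with the standard "zero-dimensional" energy-balance calculation: set absorbed sunlight equal to emitted infrared radiation and solve for temperature. The sketch below uses common textbook values (these numbers are not taken from the report itself):

```python
# Zero-dimensional radiative balance: absorbed solar radiation equals
# outgoing infrared radiation (Stefan-Boltzmann law).
# Standard textbook figures, used here only for illustration.
SOLAR_CONSTANT = 1361.0   # W/m^2, solar irradiance at Earth's distance
ALBEDO = 0.3              # fraction of sunlight reflected back to space
SIGMA = 5.67e-8           # Stefan-Boltzmann constant, W/(m^2 K^4)

# Sunlight hits a disk but is spread over the whole sphere (factor of 4),
# and the reflected fraction (albedo) is never absorbed.
absorbed = SOLAR_CONSTANT * (1 - ALBEDO) / 4

# Setting absorbed = SIGMA * T^4 and solving for T gives the effective
# temperature Earth would have with no greenhouse effect at all.
effective_temp_k = (absorbed / SIGMA) ** 0.25
print(f"Effective temperature: {effective_temp_k:.1f} K")  # roughly 255 K
```

The answer comes out around 255 K (about minus 18 degrees Celsius), far below the roughly 15 degrees Celsius actually observed; the difference is the warming supplied by the greenhouse effect described later in this report summary.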

Scientists agree that it’s extremely likely that human activity (via greenhouse gas emissions) is the dominant cause of the increase in global temperature since the mid-20th century.

There are some natural factors that can influence climate. The main ones are volcanic eruptions and the El Niño Effect. Volcanic eruptions emit clouds of particles that block the Sun’s radiation from reaching the Earth, changing the planet’s radiative balance and causing the planet to cool. The El Niño Effect is a natural increase in ocean temperature in the Pacific Ocean that leads to other meteorological effects. The increase in ocean temperature off the coast of South America leads to higher rates of evaporation, which can cause wind patterns to shift, influencing weather patterns worldwide. Together, these factors influence climate, so when they differ from the norm, they can contribute to climate change.

It is true that climate change can occur naturally and it is expected to happen slowly over long periods of time. In some cases, the climate can change for a few months or years (such as in the case of a volcanic eruption), but the effects of these events are not long-lasting. However, since the Industrial Era, the factor contributing most to climate change has been an anthropogenic driver, meaning one that is being caused by human activity. The primary cause of climate change since the Industrial Era has been the presence of greenhouse gases in the atmosphere. The main greenhouse gases are carbon dioxide (CO2), methane (CH4), and nitrous oxide (N2O). These gases are problematic because they remain in the Earth’s atmosphere for a long time after they are released. They trap much of Earth’s outgoing radiation, leading to an imbalance of incoming and outgoing radiation energy. Because the Earth’s atmosphere is holding on to all that energy while still receiving irradiation from the Sun, the planet heats up. This is called the greenhouse effect, because it is similar to what happens in a greenhouse—the Sun’s energy can get in, but the heat cannot get out. The greenhouse effect has intensified due to the greenhouse gases that are released during our modern industrial processes. This has caused the Earth’s climate to begin to change.

Who contributed to the Climate Science Special Report?

The report was written by members of the American scientific community, including (but not limited to) the National Science Foundation, the National Aeronautics and Space Administration (NASA), the US Army Corps of Engineers, and multiple universities and national labs. They analyzed data from articles in peer-reviewed scientific journals (that is, other scientists read these articles before publication to check for questionable experiments, data, or conclusions), as well as government reports, statistics, and other scientific assessments. The authors provided links to each source in citation sections at the end of each chapter. They combined everything they learned into this one comprehensive document, the Climate Science Special Report.

What can we learn from this report?

First of all, the report reveals that the Earth is getting warmer. The average global surface temperature has increased about 1.8°F (1.0°C) since 1901. This may seem like a small change, but this increase in temperature is enough to affect the global climate. Sea levels have risen about eight inches since 1900, which has led to increased flooding in coastal cities. Weather patterns have changed, with increased rainfall and heatwaves. While the increased rainfall has been observed primarily in the Northeastern U.S., the western part of the U.S. has experienced an increase in forest fires, such as those that have devastated California this year. Such changes in weather patterns can be dangerous for those who live in those areas. They can even damage infrastructure and affect agriculture, which impacts public health and food production. These changes mainly result from greenhouse gases, namely CO2, that humans have emitted into the atmosphere.

Where can I go to read the report myself?

You can find a link to the main page of the report here. There is also an Executive Summary, which was written for non-scientists. While the rest of the report contains some technical language, it is generally accessible, and contains visuals to help readers understand the data. If you are interested in gaining a better understanding of Earth’s climate and how it’s changing, we encourage you to take a look at the Climate Science Special Report to learn more.

Plastics are nearly unavoidable. From the plastic bottle of water you grab walking into a meeting to the money in your wallet, plastics are ubiquitous. However, evidence is accumulating that heavy plastic use takes a hefty toll on the environment, especially the world's oceans, which receive an estimated 4.8-12.7 million tons of plastic each year (about five bags of plastic for every foot of coastline in the world). Much of this marine plastic comes from litter that washes down storm drains into the oceans, but it can also be blown from landfills into the ocean. Marine wildlife including fish, birds, seals, turtles, and whales consume startling amounts of plastics, not only because these plastics look like dinner but because they often smell like it too. Dangers of plastics to marine animals include entanglement and intestinal perforation or blockage, which can cause nutrient starvation: marine animals starving on a stomach stuffed with plastic. Researchers estimate that 90% of sea birds and half of all sea turtles have consumed plastics.
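That "five bags per foot of coastline" image can be roughly checked with some back-of-the-envelope arithmetic. The coastline length and the amount of plastic in one grocery bag are my own assumptions here, chosen as plausible round figures:

```python
# Rough sanity check of the "five bags per foot of coastline" figure.
# Coastline length and bag capacity are assumed values, not from the study.
plastic_kg = 8.0e9        # ~8 million metric tons/yr (midpoint of 4.8-12.7)
coastline_km = 356_000    # assumed approximate global coastline length
bag_capacity_kg = 1.4     # assumed plastic mass filling one grocery bag

coastline_ft = coastline_km * 1000 / 0.3048   # kilometers -> feet
bags_per_foot = plastic_kg / bag_capacity_kg / coastline_ft
print(f"Bags of plastic per foot of coastline: {bags_per_foot:.1f}")
```

With these assumptions the estimate lands right around five bags per foot, consistent with the figure quoted above.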

Millions of tons of plastic waste winds up in the ocean each year.

More recently, the alarm has been raised about microplastics, small plastics and plastic fibers less than 5 mm in size. Microplastics can come from the degradation of larger plastics and from washing clothing containing synthetic fibers. Microplastics act like magnets for chemicals the U.S. Environmental Protection Agency (EPA) calls "Persistent Bioaccumulative and Toxic Substances" (PBTs). PBTs build up in the bodies of marine organisms and can harm us when we consume seafood. Though other potential dangers of microplastics to the environment are not yet clear, it has been shown that the decomposition of plastics can release toxic chemicals including bisphenol A (BPA), a chemical which disrupts hormone balances and may be linked to human health concerns including diabetes, behavioral disorders like ADHD, and cancer. Researchers at the University of Missouri-Columbia have shown that some of the same adverse health effects occur in fish exposed to BPA, indicating a risk to marine food chains and ecosystems. It is clear that we do not yet know the full impact of plastics in our oceans, but it is equally clear that dumping plastic waste into marine ecosystems is not without consequences.

Here are 100 ways to reduce your plastic use, ranging from reusable coffee cups to making your own deodorant to avoid plastic packaging (an idea that doesn't stink). Another way to track your plastic use is to accept the Plastic-Free Challenge, a social media challenge that lets you share your commitment to reducing your plastic footprint with all your followers. A good way to get started is to keep track of how much plastic you use and strive to reduce this amount every week. If you want to think bigger than your own plastic footprint, you can call your representatives about measures like plastic bag bans in your city and about funding research for equipping water treatment facilities to deal with microplastic-contaminated effluent. This year, I'll be making it my New Year's resolution to reduce my plastic consumption: a small change in habits that can add up. Let's face it, I was never going to make it to the gym, anyway.

The Giza pyramids. The center pyramid (tallest) is the pyramid for King Khufu.

You probably don't usually think of particle physics and the Great Pyramid of Giza as having much in common. In some ways, the two seem diametrically opposed: the Giza Pyramid is a pinnacle of past achievement, while particle physics relies on cutting-edge technology. The Great Pyramid of Giza is the only one of the seven wonders of the ancient world still standing, a tomb built over 4,500 years ago as a monument to Pharaoh Khufu. Little is known about how such a massive structure was built so long ago or what its internal structure looks like. Historians and archaeologists have been trying to uncover these mysteries for centuries. Particle physics, on the other hand, is often portrayed in movies and TV as a futuristic discipline in which scientists develop exotic weapons. As a result, when an article was published in Nature on November 2 with the title "Discovery of a big void in Khufu's Pyramid by observation of cosmic-ray muons," there was a media frenzy.

What are cosmic rays?

Though we sometimes think of science as a means to developing a more technological future, it is also importantly a means to understanding the secrets of the past. Much of the information we have about the origins of the universe and the evolution of galaxies has come from studying the fundamental building blocks of matter. For example, some particle physicists study the atomic nuclei falling toward Earth’s surface in cosmic rays. Studying this material provides insight into how galaxies are formed and how they evolved chemically.

When the protons and other nuclei in cosmic rays collide with nuclei in Earth’s atmosphere, pions are created. Pions then quickly decay and form muons. Muons have a negative charge like an electron but 207 times greater mass. The Earth’s atmosphere slows down all of the particles in the cosmic ray showers that are constantly raining down on the Earth. Because of their relatively large mass, muons are actually able to reach the surface of the Earth, unlike lighter particles. Detectors on or near the Earth’s surface can then detect these particles. In order to eliminate background noise from measurements, some particle detectors are placed deep underground in mines and caves. Learn more about what it is like to conduct scientific research in an underground laboratory in Underground Science at SNOLAB.

Using muon detection as an imaging technique

Since as early as 1955, physicists have taken advantage of the ability of muons to penetrate rock, measuring the thickness of rock formations using muon flux (the number of muons passing through an area of a detector). The concept behind these measurements is similar to x-ray imaging. An x-ray is a high-energy form of electromagnetic radiation. Due to their high energy, x-rays are not readily absorbed by soft tissue such as skin and organs but are absorbed by denser structures like bone. If an x-ray source is aimed at a human arm and a film is placed on the other side of the arm, an image of the bones is formed by the relatively low number of x-rays that reach the film behind the bones. Similarly, physicists working with cosmic rays realized that they could image large structures by taking advantage of the fact that higher-density materials (such as stone) absorb more muons than regions of lower density (such as air).
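A deliberately simplified toy model shows why a hidden cavity produces an excess of muons at the detector. Here I assume muon flux falls off exponentially with rock thickness (real attenuation depends on the cosmic-ray energy spectrum, and the attenuation length below is an invented illustrative value):

```python
import math

# Toy model of muon radiography: assume muon flux decays exponentially
# with the amount of rock traversed. This is a simplification; real
# attenuation depends on the muon energy spectrum. The attenuation
# length here is an assumed illustrative value, not a measured one.
ATTENUATION_LENGTH_M = 50.0

def relative_flux(rock_thickness_m: float) -> float:
    """Fraction of incident muons surviving a given thickness of rock."""
    return math.exp(-rock_thickness_m / ATTENUATION_LENGTH_M)

solid = relative_flux(100.0)            # line of sight through solid stone
with_void = relative_flux(100.0 - 8.0)  # same path, but with an 8 m air gap

excess = (with_void - solid) / solid
print(f"Excess muons along the void's line of sight: {excess:.0%}")  # 17%
```

In this toy picture, even a modest air gap lets noticeably more muons through than the surrounding solid stone, which is exactly the kind of excess the pyramid detectors were built to spot.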

Muon detection is a promising method for studying the pyramids because it can reveal the internal structure without destroying the pyramid or opening up sealed sections. In 1970, researchers first attempted to use muon flux to search for cavities in the Giza pyramid. By placing detectors in the subterranean chamber (label 5 in the schematic diagram below), they were able to image a region occupying approximately 19% of the pyramid's total volume. No new chambers were discovered in this region. Since then, detector technology has become sensitive to smaller amounts of muon flux. In 2015, a team of researchers from universities in Japan, France, and Egypt tried once again to use muon detection to search for cavities within the great stone structure. They tried three methods of muon detection, and this time they were successful.

First, they placed detectors inside the Queen's chamber and in the corridor outside. These measurements were taken over the course of several months so that a relatively high number of muons would be recorded, making the measurements more accurate. The measured muon signals were compared with the signals expected based on what was previously known about the structure of the pyramid. From this method, they detected an unexpected excess of muons coming from the region above the Grand Gallery. From this excess, the researchers inferred that there was an unexpected region of air inside the pyramid. The excess is about the same as the excess of muons that pass through the Grand Gallery itself, meaning the newly detected void is approximately the same size as the Grand Gallery. To verify their findings, the researchers used two additional muon-detection methods over the course of the next two years, with detectors placed in different locations. Signals from all three methods pointed to the same conclusion: there exists a void in the stone structure above and parallel to the Grand Gallery.

While this information doesn’t directly tell us why or how the pyramid was built the way that it was, it adds to our knowledge about the structure and may help archaeologists infer something about the design. Although, since the publication of the article, there has been some controversy over whether or not this is actually new information, this research still clearly demonstrates that cosmic rays may be useful for understanding not only events that happened billions of years ago but also events that happened much closer to home. And importantly, this shows that science and the humanities have something to offer each other. As scientific advances continue to shape the political and cultural landscape of our future, we will also understand more and more about our past.

Drug addiction is notoriously difficult to treat. Limited treatment options are available for those suffering from addiction, including behavioral therapy, rehabilitation programs, and medication. However, current drug addiction medications are only approved to treat opioid, tobacco, or alcohol abuse, leaving out many other drugs of abuse, such as cocaine or methamphetamine.

Yet even when patients successfully complete rehab or stick to a medication plan, there is still a risk of relapse. This can often be due to the emergence of drug cravings. For instance, a former alcoholic may see a sign for a bar they used to frequent. That sign can induce feelings of craving for alcohol, even long after the user quits or abstains from drinking. Strong cravings could lead to a relapse and a resumption of the cycle of addiction.

No pharmaceutical treatments are currently available for cocaine addiction.

However, a recent discovery may change the way we approach drug addiction treatment. Italian researchers, working alongside the National Institute on Drug Abuse (NIDA), were able to reduce drug cravings and usage in cocaine addicts for the first time using a technique called transcranial magnetic stimulation (TMS).

Long-term drug use changes how brain cells communicate with each other. Think of a drug addict's brain cells as speaking in gibberish, or unable to speak at all. Important messages aren't being sent correctly, which contributes to the negative effects of addiction.

In a TMS procedure, researchers place a figure-8-shaped magnetic coil on the patient’s head. When turned on, the coil can send electrical signals into the brain. Importantly, brain cells communicate using electricity, and the “messages” between cells depend on the strength and frequency of these signals. Researchers found that the electrical signals from TMS help change the way brain cells “speak” to each other, getting rid of the gibberish and making cells communicate normally.

TMS uses a magnetic coil to send electric signals into the brain.

In the case of drug addicts, the electrical signals from the magnetic coil are focused at a brain region called the dorsolateral prefrontal cortex (dlPFC). This is a part of the brain that handles decision making and cognitive ability, and is affected by drugs of abuse. For instance, drug addicts demonstrate lower dlPFC activity compared to non-addicted individuals during cognitive tasks.

Knowing how important this brain region is, researchers performed a study in which they stimulated the dlPFC of drug addicts using TMS. Cocaine addicts underwent either the TMS procedure or a medication regimen (as a control group). The researchers found that the cocaine users who received TMS had fewer cocaine cravings than their control counterparts. Further, the TMS group produced more cocaine-free urine samples than the control group.

The dorsolateral prefrontal cortex is affected by drug addiction.

Other studies support these results, focusing specifically on the prefrontal cortex, which appears to be a "sweet spot" for treating drug addiction. For instance, an earlier study found that daily TMS sessions, aimed more broadly at the left prefrontal cortex, reduced cocaine craving. A later study homing in on the left dlPFC found a similar reduction of craving in cocaine users.

Interestingly, the Italian TMS study was based on a rodent experiment with a very similar design. In this study, researchers allowed rats to develop a cocaine addiction and then stimulated a brain region analogous to the human dlPFC. Amazingly, the rats decreased cocaine seeking behaviors, much like their human counterparts in the TMS study. When this brain region was inhibited, or “turned off”, the rats increased their cocaine seeking.

Despite their promise, these TMS studies are just the beginning. Researchers are still a long way from developing a cure or reliable treatment for drug addiction. Like any new drug or treatment, it will be many years before TMS could be accepted as standard care for drug addicts. However, TMS has been successfully used to help patients in other ways. For instance, it has been used to help treat depression and is often used to help doctors identify damage from strokes, brain injuries, and neurodegenerative diseases. TMS holds a lot of promise and is on the cusp of being a successful drug addiction treatment. It’s only a matter of time before this stimulating idea becomes reality.

The recent forest fires have been wreaking havoc across California since early October. In fact, destructive wildfires are a frequent occurrence in the dry, western state. Such fires are generally bad news as they cause destruction of property and affect air quality. However, are they always bad? Interestingly, the answer is no.

Wildfires can be an integral part of a forest's natural cycle, and may even help its survival. One such example lies in front of our eyes in the state of North Carolina, where the longleaf pine finds its home.

Longleaf pine forests across the Southeastern United States are among the most diverse environmental systems in North America. At one point in time, they covered about ninety million acres of land, which, unfortunately, has decreased to only about three million acres. Human development and the deliberate exclusion of fires are largely responsible for this decline. Longleaf pines are adapted to fire cycles; preventing fires actually hurts the health of the forest. Native Americans recognized this relationship and rarely intervened when lightning induced fires, which were common events in the Sandhills region, a major home of the longleaf pines located in North Carolina, South Carolina, and Georgia. When the early European settlers came over, they realized the potential of pine resin for shipbuilding. Very soon, North Carolina's pine forests became a supply line of naval stores for the UK's Royal Navy. These early settlers, however, continued to burn fires like the natives and thereby contributed to the health of the ecosystem. It was only with the growth of plantation forestry that a desperate urge to eliminate fires arose.

The Sandhills region is home to about a thousand different plant species, the dominant one being the longleaf pine. With their long needles, the pines produce a bright, shiny green canopy atop massively tall trunks. Additionally, the forests support a wide variety of animals, including 160 different species of birds (among them the endangered red-cockaded woodpecker), a large number of salamanders, toads, and frogs, hognose snakes, fox squirrels, and many other species. In the twentieth century, firefighting prevented the regeneration of longleaf pines, giving non-fire-resistant species a competitive edge. That, coupled with increasing human settlement, reduced longleaf pine forest cover. In 1963, the remnants of the natural home of the longleaf pines were brought under the state parks system when Weymouth Woods was established. Since then, prescribed fires have been used systematically as a conservation tool to restore and maintain the longleaf pines.

Weymouth Woods Sandhills Nature Preserve, a North Carolina state park in Moore County near Fort Bragg, offers a great snapshot of the magnificent pine forests that once covered the southeastern United States. During my visit, I was surprised by the wide variety of wildlife I encountered within a short period along the sandy trails: a large number of dragonflies, skinks, even a moccasin, not to mention the diversity of plants that coexist in the Sandhills pine forests. If you are intrigued by the unique nature and ecology of the longleaf pines, their role in North Carolina's history, or simply take pride in being a 'Tar Heel', I definitely recommend visiting this place. You will not be disappointed in this treasure trove of nature.

The best models of how our world works are incomplete. Though they accurately describe much of what Mother Nature has thrown at us, models represent just the tip of the iceberg, and a deeper understanding awaits the endeavoring scientist. Peeling back the layers of the natural world is how we physicists seek a deeper understanding of the universe. This search pushes existing technology to its limits and fuels the innovation seen in modern-day nuclear and particle physics experiments.

This is a map of the SNOLAB facility. It’s 2 km (~1.2 miles) underground and is the deepest clean room facility in the world!

Today, many of these experiments search for new physics beyond the Standard Model, the theory physicists have accepted to describe the behavior of particles. Some physical phenomena have proven difficult to reconcile with the Standard Model, and research seeks to improve understanding of those conundrums, particularly regarding the properties of neutrinos, elusive particles which have very little mass and no electric charge, and dark matter, a mysterious cosmic ingredient that holds galaxies together but whose form is not known. The experiments pursuing these phenomena each take a different approach toward the same unknowns, resulting in an impressive diversity of techniques geared toward the same goal.

On one side of the experimental spectrum, the Large Hadron Collider smashes together high-energy protons at a rate of one billion collisions per second. These collisions have the potential to create dark matter particles or spawn interactions between particles that break expected laws of nature. On the other side of the spectrum, there is a complementary set of experiments that quietly observe their environments, patiently waiting to detect rare signals of dark matter and other new physical processes outside the realm of behavior described by the Standard Model. As the signals from new physics are expected to be rare (~1 event per year, compared to the LHC's billion events per second), the patient experiments must be exceedingly sensitive and avoid any imposter signals, or "background", that would mimic or obscure the true signal.

The quest to decrease background interference has pushed experiments underground to cleanroom laboratories set up in mine caverns. While cleanrooms reduce the chances of unwanted radioactive isotopes, like radon-222, wandering into one's experiment, mines provide a mile-thick shield from interference that would be present at the surface of Earth: particles called cosmic rays constantly pepper the Earth's surface, but very few of them survive the long journey to an underground lab.

The rate at which muons, a cosmic ray particle, pass through underground labs decreases with the depth of the lab. At the SNOLAB facility, shown in the lower right, approximately one muon passes through a square centimeter of the lab every 100 years.
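The caption's numbers imply an enormous suppression factor. As a rough check, assuming a sea-level muon flux of about one muon per square centimeter per minute (a commonly cited approximation, not a figure from this article), the reduction at SNOLAB depth works out to roughly fifty million:

```python
# Back-of-the-envelope estimate of cosmic-ray muon suppression at SNOLAB depth.
# Assumed: sea-level flux of ~1 muon per cm^2 per minute (standard rough figure).

MINUTES_PER_YEAR = 365.25 * 24 * 60

surface_rate = 1.0                             # muons per cm^2 per minute at the surface
snolab_rate = 1.0 / (100 * MINUTES_PER_YEAR)   # ~1 muon per cm^2 per 100 years underground

suppression = surface_rate / snolab_rate
print(f"Muon flux reduced by a factor of ~{suppression:.1e}")  # → ~5.3e+07
```

In other words, two kilometers of rock cuts the muon rate by a factor of tens of millions, which is what makes searches for one-event-per-year signals feasible at all.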

The form and function of modern underground experiments emerged from the collective insights and discoveries of the scientific community studying rare physical processes. As in any field of science, this community has progressed through decades of experimentation with results being communicated, critiqued, and validated. Scientific conferences have played an essential role in this process by bringing the community together to take stock of progress and share new ideas. The recent conference on Topics in Astroparticle and Underground Physics (TAUP) was a forum for scientists working to detect dark matter and study the properties of neutrinos. Suitably, the conference was held in the historic mining town of Sudbury, Ontario, home to the Creighton Mine, at the bottom of which lies SNOLAB, a world-class underground physics laboratory which notably housed the 2015 Nobel Prize winning SNO experiment. SNO, along with the Super-Kamiokande experiment in Japan’s Kamioka mine, was awarded “for the discovery of neutrino oscillations, which shows that neutrinos have mass.”

There is a natural excitement upon entering an active nickel mine, donning a set of coveralls, and catching a cage ride down into the depths; this was our entrance into the Creighton Mine during the TAUP conference. After descending an ear-popping 6800 feet in four minutes, we stepped out of the cage into tunnels of raw rock, known as drifts. From there, we followed the path taken every day by SNOLAB scientists, walking approximately one kilometer through the drifts to the SNOLAB campus. At SNOLAB, we prepared to enter the clean laboratory space by removing our coveralls, showering, and donning cleansuits. Inside, the rock walls are finished over with concrete and epoxy paint, and we walked through well-lit hallways to a number of experiments which occupy impressively large caverns, some ~100 feet high.

Physicists visiting SNOLAB get a close-up view of the DEAP-3600 and MiniClean dark matter experiments. Shown here are large tanks of water that shield sensitive liquid argon detectors located within.

Our tour of SNOLAB included visits to several dark matter experiments, including DEAP-3600 and MiniClean, which attempt to catch the faint glimmer of light produced by the potential interaction of dark matter particles with liquid argon. A stop by PICO-60 educated visitors on another captivating experiment, which monitors a volume of a super-heated chemical fluid for bubbles that would indicate the interaction of a dark matter particle and a nucleus. The tour also included the SNO+ experiment, offering glimpses of the search for a rare nuclear transformation of the isotope tellurium-130; because this transformation depends on the nature of neutrinos, its observation would further our understanding of these particles.

SNOLAB is also home to underground experiments from other fields. The HALO experiment, for instance, monitors the galaxy for supernovae by capturing neutrinos that are emitted by stellar explosions; neutrinos may provide the first warnings of supernovae as they are able to escape the confines of a dying star prior to any other species of particle. Additionally, the REPAIR experiment studies the DNA of fish kept underground, away from the natural levels of radiation experienced by all life on the surface of Earth.

The search for rare signals from new physical phenomena has pushed physicists far underground and required the development of new technologies that have been adopted by other scientific disciplines. The SNOLAB facility, in particular, has played a key role in helping physics revise its best model of the universe, and it can be expected that similar underground facilities around the world will continue to help scientists of many stripes reveal new facets of the natural world.

Alexandra Elbakyan will go down in history as the mastermind of Sci-Hub and perhaps as a champion for open-access research. Sci-Hub is an online repository of pirated research articles that enables scholars to access millions of articles free of charge. Elbakyan founded Sci-Hub in 2011 in response to the paywalls guarding many of the articles that she needed for her neuroscience graduate studies. The pirating website provides research article access for a community of scholars who could not afford to access the articles through traditional channels.

As you might imagine, Sci-Hub is surrounded by controversy. The Sci-Hub repository relies on hacked publishing websites or donated institutional login credentials to obtain research articles. You may even consider Elbakyan a modern-day Robin Hood, robbing the rich publishing giants to provide research articles to those without access, and many scientists have praised her efforts (and even donated to the cause) for advancing open-access research. However, many others (including publishers) view Sci-Hub and research article piracy as ethically wrong and have condemned her efforts.

Alexandra Elbakyan, founder of Sci-Hub, speaks at a conference at Harvard in 2010.

A recent article in PeerJ from a group at the University of Pennsylvania investigated how extensively Sci-Hub has infiltrated the databases of publishing agencies. According to Daniel Himmelstein, the lead author of the study, Sci-Hub contains 69% of all research articles in existence (based on an estimated 81.6 million articles in total), with coverage approaching 93% for disciplines such as chemistry. What's particularly fascinating is that the 56 million or so articles are all located within one repository, and they are easily accessed via a DOI (digital object identifier) search bar. The PeerJ article covers more of Sci-Hub's reach than can be summarized in this briefing, but you can also explore an interactive website about Sci-Hub's coverage that is associated with the study.
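Those figures are self-consistent: a quick back-of-the-envelope check, using only the percentage and total quoted above, recovers the "56 million or so" article count.

```python
# Check that the quoted Sci-Hub coverage figures agree with each other.
total_articles = 81.6e6  # estimated total research articles in existence
coverage = 0.69          # fraction reportedly available on Sci-Hub

in_sci_hub = total_articles * coverage
print(f"~{in_sci_hub / 1e6:.1f} million articles")  # → ~56.3 million articles
```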

Open-access research is a hot topic, and the recent lawsuits against Sci-Hub have only added fuel to the fire. While Sci-Hub has increased publicity for open-access research, the ethics of Sci-Hub's article piracy have clouded the open-access conversation. Many would agree that open-access research is important, but at what cost? Does the end result, that all people have equal access to research, justify the ethical quandary of article piracy? Alexandra Elbakyan believes that it does. Only time will tell if she is right.

Please note: Accessing Sci-Hub is illegal in the United States. The author of this article and the editors of The Pipettepen do not condone the use of Sci-Hub to access research articles. Rather, this article is only intended to provide current scientific news on open-access research.

Frog slime, although gross, might help combat some strains of the influenza virus.

Got the flu? Time to start looking for your frog prince.

Researchers at Emory University have identified a substance that kills influenza, the virus that causes seasonal flu. The influenza-killing substance, called urumin, is produced on the skin of the South Indian frog and stops influenza virus growth by causing the virus to burst open; think of smashing an egg with a hammer!

Researchers think urumin disrupts a structure on the outside of the virus. Influenza, like an egg, has an outer shell that protects the contents of the virus, the "yolk," which the virus uses to grow and replicate. Unlike an egg, the outer shell of influenza is not smooth. Instead, it is studded with small spikes. Urumin sticks to these influenza spikes, interfering with their function and causing the virus to burst open.

The influenza virus uses the spikes to stick to human cells and cause infection. Two types of spikes are found on each influenza virus: H (hemagglutinin) and N (neuraminidase). There are multiple types of H's and N's, and each virus picks one H and one N to "wear" on its outer shell, similar to the way we choose a pair of pants and a shirt to wear every day.

Cartoon of Influenza. The outside is covered in spikes, H in light blue and N in dark blue. The coils in the center contain the genetic information or “yolk” that causes the virus to replicate. From: Doug Jordan.

Surprisingly, urumin is only effective against viruses containing one H spike type, H1. This is because urumin can only stick to H1 spikes, not to N spikes or to other types of H spikes. H1 is one of only three types of H spikes known to infect humans. Shockingly, H1 viruses are responsible for some of the worst flu outbreaks in history, such as the 1918 Spanish flu pandemic that caused 50 million deaths and, more recently, the swine flu pandemic of 2009.

Destroying influenza in a lab environment is great, but what about in a living animal? In the same study, urumin treatment resulted in a 250% increase in mouse survival after influenza infection. Urumin treatment also decreased disease severity by lessening weight loss and decreasing the amount of virus in the lungs.

Although these mouse experiments are promising, it is important to point out that the mice were given urumin 5 minutes before they were infected with influenza and also received urumin every day for the rest of the infection. Because most of us do not know the exact moment we are exposed to influenza virus (the grocery store? the breakroom? the gym?), it is difficult to treat someone at the moment they are infected. Thus, more research is needed on the effectiveness of urumin when it is given days after infection, the typical time an infected person might visit their doctor.

With more research, urumin could become the promising new influenza drug researchers have been looking for, potentially reducing influenza-associated deaths and complications.