Archive for the 'General Science' Category

I know I have been hitting this topic frequently recently, but I can’t ignore a major study published in Nature. The study is not just about organic farming, but about how we use land and implications for climate change, specifically carbon sequestration. The core idea is this – when we consider land use and its impact on the climate, we also have to consider the opportunity cost of not using the land in a more useful way. This echoes a previous study by different authors I discussed five months ago, and a review article by still different authors I discussed three months ago.

There certainly does now seem to be a growing consensus that we have to think very carefully about how we use land in order to minimize any negative impact on the environment, and specifically limit carbon in the atmosphere driving climate change.

The new study essentially argues that we need to use land optimally. If land is well suited to growing corn, then we should grow corn. If it is better suited for forestation, then we should allow forests to grow there and not convert it to farmland. Forests sequester a lot more carbon than farmland, and this is a critical component of any overall strategy to mitigate climate change. The authors calculate that land use contributes “about 20 to 25 per cent of greenhouse gas emissions.”

If we put the various studies I have been discussing together, a compelling image emerges. First, we need to consider that we are already using all the best farmland to grow crops. Any expansion of our farmland will by necessity be using less and less optimal land for farming. This translates to a greater negative impact on the climate. However, our food production needs will grow by about 50% by 2050.

This is a strong argument, in my opinion, against biofuels. We need that land to grow food, not fuel – unless we can source biofuels from the ocean or industrial vats without increasing land use.

The Pew Research Center has recently published a large survey regarding Americans’ attitudes toward food, including genetic modification, food additives, and organics. There are some interesting findings buried in the data that are worth teasing out.

First, some of the top-line results. They found that 49% of Americans feel that genetically modified organisms (GMOs) are bad for health, while 44% said they were neither better nor worse, and 5% said they were better. So the public is split nearly down the middle over the health effects of GMOs. The 49% who feel that GMOs are bad for health is up from 39% when they gave the same survey in 2016 – so unfortunately, we have lost ground on this issue.

Breaking these numbers down, we find that women are a little more likely to fear GMOs as a health risk than men, 56% compared to 43%. I suspect this is due primarily to differences in how anti-GMO messages are marketed, and the general marketing of pseudoscience to women (the Goop effect). This is also significant because women are more likely to make food purchasing decisions for their families.

Even more interesting is the relationship between science knowledge and fear of GMOs – among those with a high degree of science knowledge, 38% thought GMOs had health risks, while 52% of those with a low degree of science knowledge thought so. The same pattern is seen through all the subquestions about GMOs. For example, 49% of those with a high degree of science knowledge believe GMOs have the potential to increase the global food supply, while only 20% of those with a low degree of science knowledge believe this.
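Those knowledge-gap numbers can be tabulated directly. A minimal sketch using only the percentages quoted above (the question labels and variable names are mine):

```python
# Percentages quoted above from the Pew survey (question labels are mine).
views = {
    "GMOs pose a health risk": {"high_knowledge": 38, "low_knowledge": 52},
    "GMOs can increase global food supply": {"high_knowledge": 49, "low_knowledge": 20},
}

# Gap in percentage points: high-knowledge minus low-knowledge respondents.
gaps = {q: p["high_knowledge"] - p["low_knowledge"] for q, p in views.items()}

for question, gap in gaps.items():
    print(f"{question}: {gap:+d} points")
```

A negative gap means the high-knowledge group was less likely to hold that view, a positive gap more likely.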

Scientists report in Nature the identification of two new species of Hemimastigophora, predatory protists. What makes the paper newsworthy is that the authors argue that their genetic analysis suggests Hemimastigophora, currently categorized as a phylum, should instead be its own suprakingdom.

To make sense of this let’s review the basic structure of taxonomy, the system we use to categorize all life. All known life is divided first into three domains, the bacteria, archaea, and eukaryotes. Bacteria and archaea do not have a nucleus, while eukaryotes are larger and have a nucleus which contains most of their DNA.

Eukaryotes are divided into kingdoms, including plants, animals, fungi, protozoa, and chromista (algae with a certain kind of chlorophyll). Kingdoms are then divided into phyla, which are essentially major body plans within that group.

This is a simplified overview, because there is a lot of complexity here, with suprakingdoms, subkingdoms, and further breakdowns. Further, there is a lack of consensus on how to exactly divide up these major groups. Even in the cited paper, the authors say there are 5-8 “suprakingdom level groups” within the domain eukaryotes. The number of kingdoms depends on which scheme you use, and how you interpret the existing evidence.
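As a rough mental model, the nested ranks can be sketched as a simple data structure. The rank names below are standard, but the example entries are mine, and placing Hemimastigophora at the suprakingdom level follows the authors’ proposal – it is not settled taxonomy:

```python
# Ranks in hierarchy order, top to bottom (simplified, as in the text).
ranks = ["domain", "suprakingdom", "kingdom", "phylum"]

# Example groups; Hemimastigophora's placement follows the paper's proposal.
groups = {
    "Animalia": {"domain": "Eukaryota", "kingdom": "Animalia", "phylum": "Chordata"},
    "Hemimastigophora (proposed)": {"domain": "Eukaryota",
                                    "suprakingdom": "Hemimastigophora"},
}

def lineage(name):
    """Return the known ranks for a group, in hierarchy order."""
    entry = groups[name]
    return [(rank, entry[rank]) for rank in ranks if rank in entry]

print(lineage("Hemimastigophora (proposed)"))
```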

The reason for uncertainty is that we have not yet done a full genetic analysis on every known group. Further, when we discover new species that lie outside of the existing scheme, we have to rethink how different groups are actually related.

This is one of those items that at first does not seem like a big deal, and probably won’t get much play in the mainstream media, but is actually a significant milestone. Today, the international General Conference on Weights and Measures will meet in Versailles, France, to vote on whether or not to adopt a new standard for the kilogram. This is a formality, because this change has been worked on for years and the standard is now all set to change.

I have been reading a lot recently about the history of science and technology, and one common theme is that an important core feature of our modern society is infrastructure. If, for example, there were some sort of apocalypse, what would it take to reboot society? Theoretically, we would preserve much of our knowledge in books and would not have to start from scratch. The limiting factor would likely be infrastructure. Gasoline engines won out over electric engines for cars partly (and some believe primarily) because the infrastructure for distributing gasoline was put in place before the electrical infrastructure.

Science itself also has an infrastructure, which includes standard weights and measures. This sounds boring, but being able to precisely measure something, using standardized units that every scientist around the world can use, is critically important to both science and technology. Anything that makes doing science easier reduces the cost and increases the pace of science, with incredible downstream benefits.

In 1879, Le Grand K (or the International Prototype Kilogram – IPK) was created – a cylinder of platinum and iridium that serves as the ultimate reference for 1 kilogram. This hunk of metal is kept under a double bell jar, and never touched. Even a slight fingerprint would change how much it weighs. From this original kilogram, exact copies were made and distributed to countries to serve as their national standards. Occasionally these copies are sent back to France to be compared to the original. The copies are in turn used to calibrate equipment used for precise measurement.
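The fix adopted in the redefinition is to tie the kilogram to a fixed numerical value of the Planck constant, so the unit can be realized from physics (for example, with a Kibble balance) rather than compared against an artifact. A minimal illustration of what fixing these constants buys – here, the photon frequency whose energy equals the rest energy of 1 kg, from nu = m c² / h:

```python
# Constants fixed exactly by definition under the revised SI.
h = 6.62607015e-34   # Planck constant, J*s
c = 299_792_458      # speed of light, m/s
m = 1.0              # kg

# Frequency of radiation whose photon energy equals m*c^2.
nu = m * c**2 / h    # Hz
print(f"{nu:.4e} Hz")
```

This is only an illustration of the principle; a Kibble balance actually works by balancing mechanical against electrical power.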

There’s some bad news, followed by good news, but partially countered by further bad news. The bad news is that our population is growing, and therefore our food requirements, and yet we are approaching the limits of our ability to increase crop yield with cultivation alone. Experts can quibble about whether or not we are at or near the limit, but it’s pretty clear that we are not going to double crop efficiency in the next 50 years through cultivation.

That, however, is pretty much what we need to do if we are going to meet humanity’s caloric needs. By 2050 yields will need to be 60% higher than in 2005, and needs will likely continue to rise before they stabilize. Sure, there are some gains to be made in reducing waste, but not nearly enough. And sure, we need to take steps to stabilize our population more quickly, like fighting poverty and promoting the rights of women in developing countries.
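A quick back-of-the-envelope calculation shows what that 60% target implies as a compound annual growth rate (the endpoint years are from the figures above):

```python
# 60% higher yields in 2050 than in 2005 implies this compound annual rate.
years = 2050 - 2005
annual_rate = 1.60 ** (1 / years) - 1   # about 1% per year, every year
print(f"{annual_rate:.2%} per year over {years} years")
```

Roughly one percent per year sounds modest, but sustaining it for 45 years through cultivation alone is exactly what the text argues we cannot do.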

But even under optimistic conditions – we simply need to grow more food. Further, as I recently reviewed, we are pretty much using all the good arable land available. Expanding into more land for growing food is not a good option.

So really we have one viable option if we are going to meet our food needs – genetically modifying crops. That is the good news – GMOs actually have the potential to significantly increase crop yields. One way to do that is through making photosynthesis more efficient. It turns out, there are several ways to do this.

First, there is a difference between C3 and C4 photosynthetic pathways. The C4 pathway is more efficient, and increases biomass production. Part of the efficiency is through better carbon concentration mechanisms. This pathway has independently evolved in many plants, and there are others that are part way between C3 and C4, but our major food crops all use C3.

One of the frustrating aspects of how science is reported in the mainstream media is when a complex study with very unclear results is presented with a misleading bottom line. Most people read only the headline, or perhaps the first paragraph, in order to glean the essence of a scientific study. They don’t read deep into the reporting to find the important details, or go to the study itself.

This is especially problematic when the study is of a preliminary design, or when the author’s conclusions are biased or misleading.

The most recent example of these issues is a study looking at the consumption of organic food and the risk of cancer. CNN reported the study as showing: “You can cut your cancer risk by eating organic, a new study says.” No – that is not what the study shows.

The study itself is not bad, for what it is, but it has some serious limitations that constrain the conclusions that can be drawn from it. The researchers looked at a French database of food consumption, the NutriNet-Santé Prospective Cohort Study. They had volunteers fill out food diaries for three days, report their organic food consumption, gathered demographic and other lifestyle data, and then followed them for over four years, using various methods to track the incidence of cancer.
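For readers unfamiliar with cohort statistics, the basic arithmetic such a study reports looks like this. The counts below are entirely hypothetical – they are not from NutriNet-Santé – and only illustrate how a relative risk is computed:

```python
# HYPOTHETICAL counts, for illustration only (not from the actual study).
high_organic_cases, high_organic_total = 269, 15000
low_organic_cases, low_organic_total = 360, 15000

risk_high = high_organic_cases / high_organic_total
risk_low = low_organic_cases / low_organic_total
relative_risk = risk_high / risk_low   # < 1 would suggest lower incidence
print(f"Relative risk (high vs low organic consumption): {relative_risk:.2f}")
```

The catch, as discussed above, is that a raw ratio like this says nothing about confounders – people who eat organic differ in income, smoking, exercise, and much else.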

Neanderthals were our close cousins. They are the closest species to modern humans that we know of. There are also the Denisovans, who are currently classified as a subspecies of Homo sapiens, but may eventually be classified as their own species.

Neanderthals lived from about 400,000 to 40,000 years ago. Their ancestors spread out of Africa, and Neanderthals ranged throughout Europe and Asia. When modern humans arrived later, there was some clear interbreeding going on – Europeans and Asians have about 2% Neanderthal DNA. In fact a recent study suggests that modern humans specifically retained Neanderthal genes that conveyed improved resistance to European viruses.

The first fossil specimen of a Neanderthal was discovered in 1829, although it was not recognized as such until later. The first recognized specimen was collected in 1856 in the Neander Valley in Germany – the first time an early hominid specimen was recognized for what it was. Perhaps because of the time it was discovered, our image of Neanderthals is still colored by the notions of the day. “Primitive” was synonymous with brutish and animalistic.

The organic food lobby has been successfully demonizing safe and effective biotechnology for the last two decades. Part of their strategy is to create a false dichotomy between GMO (genetically modified organisms) and non-GMO. In reality there is a continuum of methods used to alter the genetics of crops that we have been using for centuries. There are real differences among these various techniques, but they do not divide cleanly into two categories (more on this below).

The Non-GMO Project is part of this strategy. You have probably seen the labels, with the appealing butterfly reassuring consumers about the wholesomeness of products. Even when there are no GMO options for a food type, you can get the label. This scam makes money for the Non-GMO Project and is good for marketing. The goal is to eradicate any technology this non-expert and private group decides is GMO.

Now, the Information Technology and Innovation Foundation is hitting back. They are a non-profit science think tank whose purpose is to provide non-partisan science information to inform policy. They have submitted a petition to the FDA:

The Non-GMO Project food label deliberately deceives and misleads consumers in violation of the Federal Food, Drug and Cosmetic Act. ITIF petitions FDA to prohibit such labels. The “Non-GMO” Project butterfly logo and label on consumer foods and goods misleads and deceives consumers through false and misleading claims about foods, food ingredients and their characteristics related to health and safety, thus constituting misbranding under the law. ITIF therefore requests, in a Citizen’s Petition submitted to the Commissioner of Food and Drugs, that FDA issue a regulation to prohibit the use of the term “Non-GMO” on consumer foods and goods, and to require distributors of foods and goods to revise their labeling to omit any “Non-GMO” term, symbol, or claims.

This is essentially correct – one could argue that the purpose of the Non-GMO label is to confuse and misinform consumers, in order to extract more money out of them for equivalent or even inferior products.

Hurricane Florence is about to hit the Carolinas, with effects extending as far south as Georgia and as far north as Virginia. The storm peaked at Category 4, but has been downgraded to Category 2 as it approaches land. Hurricanes generally gain power over the ocean, from the energy of water evaporating off the surface, and then lose power when they hit land.

But this weakening does not mean that the storm is not dangerous. The category refers only to wind speed, but there are other factors to consider. First, this is a very large storm. Also, it is slowing down, and so may stall over the Carolinas. This means it can just sit there, dumping large amounts of water. The two dangers are storm surge and flooding. A storm surge is the rise in sea level above the normal high tide, caused by the wind. This can rapidly flood coastlines.
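For reference, the Saffir-Simpson categories behind those numbers are a pure wind-speed lookup (thresholds in mph, one-minute sustained winds) – which is exactly why a “downgrade” says nothing about rainfall or surge risk:

```python
# Saffir-Simpson Hurricane Wind Scale thresholds (mph, sustained winds).
def saffir_simpson_category(wind_mph):
    """Return the hurricane category (1-5), or 0 if below hurricane strength."""
    thresholds = [(157, 5), (130, 4), (111, 3), (96, 2), (74, 1)]
    for threshold, category in thresholds:
        if wind_mph >= threshold:
            return category
    return 0

print(saffir_simpson_category(140))   # Category 4 at its peak
print(saffir_simpson_category(100))   # Category 2 approaching land
```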

Flooding occurs when rivers overflow their banks and there is simply too much water too fast to be absorbed into the ground or flow back to the ocean. Flooding and storm surges actually cause most of the damage and deaths during large storms, not the wind itself.

Dramatic storms like Florence always seem to prompt fresh discussion about the effects of global warming and what we should be doing about it.

Here is a good summary by the Geophysical Fluid Dynamics Laboratory of the effects of AGW on tropical storms. Warming does not increase the number of storms, but it does increase the average intensity of storms and the amount of water they drop. Therefore, statistically we should be, and are, seeing more intense storms and more flooding.

Of course you cannot blame any single storm on AGW – it is a statistical effect.

A recent paper, published to arXiv, claims a method for making a superconducting material at ambient temperature and pressure. The authors, Dev Kumar Thapa and Anshu Pandey, are from the Indian Institute of Science, and have garnered a lot of attention for their paper. However, another paper was recently published on arXiv by Brian Skinner from MIT. Skinner noticed a suspicious pattern of repeating noise in two sets of data from the Thapa-Pandey paper. This could be a signature of fabricated data.

Scientific American has a good summary of the whole story, but that’s the quickie version. Skinner did not make any accusations, just published his analysis with a question to the original authors to explain the repeating pattern. Thapa and Pandey have responded only to say their results are being replicated. The rest of the physics community is not satisfied with this response, and is calling for them to send their material to outside labs for completely independent testing.

Another wrinkle to the story is that Pratap Raychaudhuri, a physicist at the Tata Institute of Fundamental Research in India, floated a hypothesis that perhaps the noise is not noise, but a signal resulting from “the natural rotation of particles within a magnetic field.” If that’s the case, then the pattern should replicate. So we are still left with the need to independently replicate the experiments.
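The logic of Skinner’s check can be sketched with synthetic data: noise from genuinely independent measurements should be essentially uncorrelated, while a copied noise pattern correlates almost perfectly. This is only an illustration of the general idea, not his actual analysis:

```python
import numpy as np

rng = np.random.default_rng(0)
signal = np.linspace(0, 1, 500)            # a smooth underlying trend

# Two honest, independent measurements: each gets its own noise.
independent_a = signal + rng.normal(0, 0.05, 500)
independent_b = signal + rng.normal(0, 0.05, 500)

# Suspicious case: the same noise pattern repeated in both data sets.
shared_noise = rng.normal(0, 0.05, 500)
copied_a = signal + shared_noise
copied_b = signal + shared_noise

def noise_correlation(x, y, trend):
    """Correlate the residuals after removing the shared trend."""
    return np.corrcoef(x - trend, y - trend)[0, 1]

print(noise_correlation(independent_a, independent_b, signal))  # near 0
print(noise_correlation(copied_a, copied_b, signal))            # near 1
```

Raychaudhuri’s counter-hypothesis matters here because a genuine physical signal, unlike random noise, would be expected to repeat – which is why only independent replication can settle it.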

The stakes here are high because so-called room temperature superconductivity is one of the holy grails of materials science. Superconductivity means that electricity can flow through the medium without resistance, and therefore with no loss of power. A room temperature superconductor could therefore transform electronics, the power grid, and anything using super powerful magnets (like MagLev trains and MRI scanners).
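The appeal is easy to quantify: resistive loss in a line scales as P = I²R, so zero resistance means zero transmission loss. A toy comparison with made-up figures:

```python
# Resistive power loss P = I^2 * R. Figures are illustrative, not real grid data.
current = 1000.0     # amperes, hypothetical line current
resistance = 5.0     # ohms, hypothetical total line resistance

loss_conventional = current**2 * resistance   # watts dissipated as heat
loss_superconducting = current**2 * 0.0       # zero resistance, zero loss

print(f"Conventional line loss: {loss_conventional / 1e6:.1f} MW")
print(f"Superconducting line loss: {loss_superconducting} W")
```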

The current dominant theory as to how superconductivity works in certain materials is that two electrons can come together to form what’s called a Cooper pair. This pair of electrons can then travel long distances in the material without resistance. However, Cooper pairs can only exist at very low temperatures. So the quest has been to find materials that will allow Cooper pairs to exist at higher and higher temperatures.