By Gregory Paul and Phil Zuckerman, Published: April 29

Long after blacks and Jews have made great strides, and even as homosexuals gain respect, acceptance and new rights, there is still a group that lots of Americans just don’t like much: atheists. Those who don’t believe in God are widely considered to be immoral, wicked and angry. They can’t join the Boy Scouts. Atheist soldiers are rated potentially deficient when they do not score as sufficiently “spiritual” in military psychological evaluations. Surveys find that most Americans refuse or are reluctant to marry or vote for nontheists; in other words, nonbelievers are one minority still commonly denied in practical terms the right to assume office despite the constitutional ban on religious tests.

Rarely denounced by the mainstream, this stunning anti-atheist discrimination is egged on by Christian conservatives who stridently — and uncivilly — declare that the lack of godly faith is detrimental to society, rendering nonbelievers intrinsically suspect and second-class citizens.

Is this knee-jerk dislike of atheists warranted? Not even close.

A growing body of social science research reveals that atheists, and non-religious people in general, are far from the unsavory beings many assume them to be. On basic questions of morality and human decency — issues such as governmental use of torture, the death penalty, punitive hitting of children, racism, sexism, homophobia, anti-Semitism, environmental degradation or human rights — the irreligious tend to be more ethical than their religious peers, particularly compared with those who describe themselves as very religious.

Consider that at the societal level, murder rates are far lower in secularized nations such as Japan or Sweden than they are in the much more religious United States, which also has a much greater portion of its population in prison. Even within this country, those states with the highest levels of church attendance, such as Louisiana and Mississippi, have significantly higher murder rates than far less religious states such as Vermont and Oregon.

As individuals, atheists tend to score high on measures of intelligence, especially verbal ability and scientific literacy. They tend to raise their children to solve problems rationally, to make up their own minds when it comes to existential questions and to obey the golden rule. They are more likely to practice safe sex than the strongly religious are, and are less likely to be nationalistic or ethnocentric. They value freedom of thought.

While many studies show that secular Americans don’t fare as well as the religious when it comes to certain indicators of mental health or subjective well-being, new scholarship is showing that the relationships among atheism, theism, and mental health and well-being are complex. After all, Denmark, which is among the least religious countries in the history of the world, consistently rates as the happiest of nations. And studies of apostates — people who were religious but later rejected their religion — find that they report feeling happier, better and liberated in their post-religious lives.

Nontheism isn’t all balloons and ice cream. Some studies suggest that suicide rates are higher among the non-religious. But surveys indicating that religious Americans are better off can be misleading, because they lump in with the non-religious many fence-sitters who are nearly as likely as not to believe in God; convinced atheists do about as well as devout believers. On numerous respected measures of societal success — rates of poverty, teenage pregnancy, abortion, sexually transmitted diseases, obesity, drug use and crime, as well as economics — high levels of secularity are consistently correlated with positive outcomes in first-world nations. None of the secular advanced democracies suffers from the combined social ills seen here in Christian America.

More than 2,000 years ago, whoever wrote Psalm 14 claimed that atheists were foolish and corrupt, incapable of doing any good. These put-downs have had sticking power. Negative stereotypes of atheists are alive and well. Yet like all stereotypes, they aren’t true — and perhaps they tell us more about those who harbor them than those who are maligned by them. So when the likes of Glenn Beck, Sarah Palin, Bill O’Reilly and Newt Gingrich engage in the politics of division and destruction by maligning atheists, they do so in disregard of reality.

As with other national minority groups, atheism is enjoying rapid growth. Despite the bigotry, the number of American nontheists has tripled as a proportion of the general population since the 1960s. Younger generations’ tolerance for the endless disputes of religion is waning fast. Surveys designed to overcome the understandable reluctance to admit atheism have found that as many as 60 million Americans — a fifth of the population — are not believers. Our nonreligious compatriots should be accorded the same respect as other minorities.

Gregory Paul is an independent researcher in sociology and evolution. Phil Zuckerman, a professor of sociology at Pitzer College, is the author of “Society Without God.”

High percentage of omega-3s in the blood may boost risk of aggressive prostate cancer

Conversely, high percentage of trans-fatty acids linked with lower risk

SEATTLE – The largest study ever to examine the association of dietary fats and prostate cancer risk has found that what's good for the heart may not be good for the prostate.

Analyzing data from a nationwide study involving more than 3,400 men, researchers at Fred Hutchinson Cancer Research Center found that men with the highest blood percentages of docosahexaenoic acid, or DHA, an inflammation-lowering omega-3 fatty acid commonly found in fatty fish, have two-and-a-half times the risk of developing aggressive, high-grade prostate cancer compared with men with the lowest DHA levels.

Conversely, the study also found that men with the highest blood ratios of trans-fatty acids – which are linked to inflammation and heart disease and are abundant in processed foods that contain partially hydrogenated vegetable oils – had a 50 percent reduction in the risk of high-grade prostate cancer. The researchers also found that omega-6 fatty acids, which are found in most vegetable oils and are linked to inflammation and heart disease, were not associated with prostate cancer risk, and that none of the fats were associated with the risk of low-grade prostate cancer.

These findings by Theodore M. Brasky, Ph.D., and colleagues in the Hutchinson Center's Public Health Sciences Division were published online April 25 in the American Journal of Epidemiology.

"We were stunned to see these results and we spent a lot of time making sure the analyses were correct," said Brasky, a postdoctoral research fellow in the Hutchinson Center's Cancer Prevention Program. "Our findings turn what we know – or rather what we think we know – about diet, inflammation and the development of prostate cancer on its head and shine a light on the complexity of studying the association between nutrition and the risk of various chronic diseases."

The researchers undertook the study because chronic inflammation is known to increase the risk of several cancers, and the omega-3 fatty acids found primarily in fish and fish oil supplements have anti-inflammatory effects. In contrast, other fats, such as the omega-6 fats in vegetable oil and trans-fats found in fast foods, may promote inflammation. "We wanted to test the hypothesis that the concentrations of these fats in blood would be associated with prostate cancer risk," Brasky said. "Specifically, we thought that omega-3 fatty acids would reduce and omega-6 and trans-fatty acids would increase prostate cancer risk."

The mechanisms behind the impact of omega-3s on risk of high-grade prostate cancer are unknown. "Besides inflammation, omega-3 fats affect other biologic processes. It may be that these mechanisms play a greater role in the development of certain prostate cancers," Brasky said. "This is certainly an area that needs more research."

Currently there is no official recommended daily allowance for omega-3 fats for adults or children, although many nutrition experts and physicians recommend 450 milligrams of omega-3 DHA per day as part of a healthy diet.

The study was based on data from the Prostate Cancer Prevention Trial, a nationwide randomized clinical trial that tested the efficacy of the drug finasteride to prevent prostate cancer. While the trial involved nearly 19,000 men age 55 and older, the data in this analysis came from a subset of more than 3,000 of the study participants, half of whom developed prostate cancer during the course of the study and half of whom did not. The clinical trial was unique in that prostate biopsy was used to confirm the presence or absence of prostate cancer in all study participants.

Among the study participants, very few took fish oil supplements – the most common non-food source of omega-3 fatty acids, which are known to prevent heart disease and other inflammatory conditions. The majority got omega-3s from eating fish.

So based on these findings, should men concerned about heart disease eschew fish oil supplements or grilled salmon in the interest of reducing their risk of aggressive prostate cancer? Brasky and colleagues don't think so.

"Overall, the beneficial effects of eating fish to prevent heart disease outweigh any harm related to prostate cancer risk," Brasky said. "What this study shows is the complexity of nutrition and its impact on disease risk, and that we should study such associations rigorously rather than make assumptions."

###

The National Cancer Institute funded this study, which also involved researchers from the University of Texas Health Science Center at San Antonio and the NCI.

At Fred Hutchinson Cancer Research Center, our interdisciplinary teams of world-renowned scientists and humanitarians work together to prevent, diagnose and treat cancer, HIV/AIDS and other diseases. Our researchers, including three Nobel laureates, bring a relentless pursuit and passion for health, knowledge and hope to their work and to the world. www.fhcrc.org

James L. McQuivey, Ph.D. is a Vice President and Principal Analyst at Forrester Research serving Consumer Product Strategy professionals.

Apple triggered much debate when it recently announced it would begin enforcing policies that add a 30% toll to any content — Kindle books, newspaper subscriptions — sold through an app on an Apple device. Apple essentially restrained publishers’ access to Apple’s customers — a huge market within the Apple ecosystem.

With this move, the company — typically known for anticipating and even causing seismic shifts in the business world — demonstrates that it is fundamentally unprepared for one of the biggest transformations we are about to experience: the end of scarcity.

Our day-to-day experience teaches us that scarcity is real. All modern business practices are built on this assumption. Some businesses depend upon it entirely. For example, high-end auction houses and low-end infomercials both remind you through various cues that if you don’t buy it now, you may not be able to ever buy it again.

But what happens if the economics of scarcity are exchanged for the economics of plenty? For those industries that provide information or experience as a primary good, scarcity is rapidly evaporating. The media business is already undergoing this change with the rise of citizen journalists, bloggers, and YouTube performers — all of whom circumvent the traditional systems that once dictated production norms and processes. Most media companies have sought to restore order by reinstating scarcity rather than celebrating its passing. It’s not a good sign of things to come.

Apple’s recent move was no different. The company imposed artificial scarcity on the relatively boundless iPhone and iPad ecosystem. It’s a restriction of publishers’ access to Apple’s existing customers.

This is no criticism of the company’s prowess nor is it a critique of the morality of the company’s strategy. I’m leveling a more devastating charge than that. I’m suggesting that the longer we postpone the inevitable shift to the economics of plenty, the longer we delay the remarkable benefits, both commercial and social, that relinquishing scarcity will provide.

We already see hints of what such a shift will do to generate value.

The New Model

Watch closely as entities of previously impossible scale become commonplace — companies like Facebook, Google, and even Twitter. These companies scaled up so quickly precisely because they are not bound by scarcity. There is no meaningful limit to how many people can benefit from Facebook, and so it acquires more customers without aggressively marketing its services. Similarly, any new initiative that Google offers the world can reach hundreds of millions of people within a few days at modest incremental cost.

This growth demonstrates how these companies operate under previously undefined rules. In their world, the costs to exploit scale fall to zero. The best ideas, no matter how small or underfunded, have the largest potential impact, and a company that gives its value away may stand to gain more value in return. As a result, companies like Facebook and Google are writing the book on how to manage the economics of plenty, even if they don’t know it.

Other industries next in line for disruption like education and health care would be wise to pay attention. Most of what they do depends on the control of information that will soon no longer be scarce. Education reformers have long predicted a world where top professors spread their knowledge across the globe through electronic tools. But the knowledge students need is not only located in those few professors’ minds. Once we digitize not just the distribution of knowledge but the production of it, the existing university system loses its raison d’etre. Why would people come to a single physical location at higher and higher costs when the knowledge it houses is no longer scarce?

It is unlikely that universities, hospitals, and other information-dependent entities will see this coming and respond appropriately. While we wait to see which companies can unshackle themselves from the assumption of scarcity, we will live deprived of the innovation the economics of plenty could inspire. I expect that today’s teens will scratch their heads at some future date and wonder why we were so hesitant to accept what will have come so naturally to them.

If you sometimes find yourself needing an open wireless network in order to check your email from a car, a street corner, or a park, you may have noticed that they're getting harder to find.

Stories like the one over the weekend about a bunch of police breaking down an innocent man's door because he happened to leave his network open, as well as general fears about slow networks and online privacy, are convincing many people to password-lock their WiFi routers.

The gradual disappearance of open wireless networks is a tragedy of the commons, with a confusing twist of privacy and security debate. This essay explains why the progressive locking of wireless networks is harmful — for convenience, for privacy and for efficient use of the electromagnetic spectrum.

We will need a political and technological "Open Wireless Movement" to reverse the degradation of this indispensable component of the Internet's infrastructure. Part of the task will simply be reminding people that opening their WiFi is the socially responsible thing to do, and explaining that individuals who choose to do so can enjoy the same legal protections against liability as any other Internet access provider.1 Individuals, including Bruce Schneier and Cory Doctorow, have laid some of the groundwork. It's time to spread the message far and wide.

But an Open Wireless Movement will also need to do technical work: we need to build new technologies to ensure that people have an easy way to share a portion of their bandwidth without affecting the performance of their own network connections, while at the same time ensuring that there is absolutely no privacy downside to running an open wireless network.

The wireless world we ought to live in

Most of us have had the experience of tremendous inconvenience because of a lack of Internet access. Being lost in a strange place with no way to find a map; having an urgent email to send with no way to do so; trying to meet a friend with no way to contact them. Even if we have data plans for our mobile phones, we've probably had these experiences in cities or countries where our phones don't have coverage or don't have coverage for less-than-extortionate prices. We may even experience this problem at home, when our Internet connection dies while we urgently need to use it.

Finding yourself in one of these binds is a bit like finding yourself parched and thirsty while everyone around you is sipping from nice tall glasses of iced water, or finding yourself cold and drenched in a rain storm because nobody will let you under their umbrella. At those moments when you are lost, or missing a deadline, or failing to meet your friend, it is almost always true that Internet data links are traveling through your body in the form of electromagnetic wireless signals — it's just that people have chosen to lock those networks so that you can't make use of them.

A tragedy of the commons

When people turn on WEP or WPA encryption for their networks deliberately, there are two common reasons: a desire to prevent their neighbors from "free riding" on their connections; and a fear that unencrypted WiFi is a security or privacy risk. Both of those reasons have a degree of legitimacy, but neither of them changes the fact that we would be better off if there were more open networks. Also, both of these problems could be solved without password locking our networks. What we need, instead, is to develop and deploy better WiFi protocols.

Let's focus on the first issue for a moment: free riding, and the traffic prioritization that would address it.

Many people would like to have the fastest network connection possible, and for that reason are reluctant to let their neighbors share their link. After all, if your neighbor is streaming music or watching YouTube videos on your WiFi, that's going to slow your traffic down a bit! But those same people would probably be willing to give up some bandwidth at home from time to time, in exchange for having free open wireless everywhere else. In other words, we'd all be better off if we all left our WiFi open, but we each benefit slightly if we close our WiFi. Our failure to work together prevents us from enjoying better, more widespread Internet access.

The best solution to this problem is to have WiFi routers which make it very easy to share a certain amount of bandwidth via an open network, but simultaneously provide an encrypted WPA2 network that gets priority over the open network. Some modern routers already support multiple networks like this, but we need a very simple, single-click or default setting to get the prioritization right.
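The prioritization such routers need can be pictured as strict-priority queueing: packets from the encrypted owner network are always served first, and the open guest network consumes only whatever capacity is left over. A toy sketch in Python (the queue names and the per-tick capacity model are illustrative, not any router's actual firmware):

```python
from collections import deque

def schedule(owner_q, guest_q, capacity):
    """Strict-priority scheduler: drain owner (encrypted) packets first;
    guest (open) traffic only uses the capacity left over."""
    sent = []
    while capacity > 0 and (owner_q or guest_q):
        q = owner_q if owner_q else guest_q
        sent.append(q.popleft())
        capacity -= 1
    return sent

owner = deque(["o1", "o2"])
guest = deque(["g1", "g2", "g3"])
print(schedule(owner, guest, 3))  # ['o1', 'o2', 'g1']
```

Under this policy the owner's traffic is never slowed by guests; guests simply absorb idle capacity, which is exactly the trade the paragraph above describes.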

Securing the Future for Open WiFi

If the problem of open WiFi were just about convincing people to share their connections, we'd be in a better situation. Enough people understand the importance of sharing that we'd have open networks more or less everywhere.

The problem that's really killing open WiFi is the idea that an unlocked network is a security and privacy risk.

This idea is only partially true. Computer security experts will argue at great length about whether WEP, WPA and WPA2 actually provide security, or just a false sense of security. Both sides are partially correct: none of these protocols will make anyone safe from hacking or malware (WEP is of course trivial to break, and WPA2 is often easy to break in practice), but it's also true that even a broken cryptosystem increases the effort that someone nearby has to go to in order to eavesdrop, and may therefore sometimes prevent eavesdropping.

It doesn't really matter that WiFi encryption is a poor defense against eavesdropping: most computer users only understand the simple message that having encryption is good, so they encrypt their network. The real problem isn't that people are encrypting their WiFi: it's that the encryption prevents them from sharing their WiFi with their friends, neighbours, and strangers wandering past their houses who happen to be lost and in need of a digital map.

We need WiFi that is open and encrypted at the same time!

Insofar as there is some privacy (and psychological) benefit to using an encrypted WiFi network, there's actually no reason why users of open WiFi shouldn't get those benefits too!

There is currently no WiFi protocol that allows anybody to join the network, while using link-layer encryption to prevent each network member from eavesdropping on the others. But such a protocol should exist. There are some technical details to work through, but they are manageable.2

In fact, this proposed protocol offers some privacy/security benefits not available in shared-passphrase WPA2, which is the strongest easy-to-deploy WiFi encryption system. Under WPA2 all the users on the network can calculate each others' session keys and eavesdrop on each other. With our suggested design, that would cease to be possible.
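To see why asymmetric key exchange closes this hole, here is a toy Diffie-Hellman sketch in Python (the tiny prime and secrets are demo values chosen for readability, nothing like real key sizes): each client derives a session key with the router that other clients cannot compute, whereas under shared-passphrase WPA2 any passphrase holder can derive every session key from the handshake.

```python
# Toy Diffie-Hellman over a tiny prime, for illustration only; real
# deployments use large groups or elliptic curves.
P, G = 2087, 5  # public parameters, known to everyone on the network

def public_key(secret):
    return pow(G, secret, P)

def shared(my_secret, their_public):
    return pow(their_public, my_secret, P)

router_priv, router_pub = 77, public_key(77)
alice_priv, alice_pub = 13, public_key(13)
bob_priv, bob_pub = 29, public_key(29)

# Each client negotiates its own session key with the router.
k_alice = shared(alice_priv, router_pub)
k_bob = shared(bob_priv, router_pub)

assert k_alice == shared(router_priv, alice_pub)  # both ends agree
assert k_bob == shared(router_priv, bob_pub)
# Alice cannot compute Bob's key without bob_priv or router_priv,
# unlike shared-passphrase WPA2 where any passphrase holder can.
assert k_alice != k_bob
```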

The Unintuitive Benefits of Open Wireless

Since 1994, the United States government has auctioned off huge portions of the electromagnetic spectrum to telecommunications companies. WiFi operates in tiny scraps of spectrum that were left over from the auctions. Similar processes have occurred in many other countries.

But WiFi networks (especially modern 802.11N networks) turn out to make inherently much more efficient use of spectrum than systems of widely spaced cell phone towers. This results from a property of wireless protocols called area spectral efficiency: basically, if your data only has to travel to a nearby router, the same frequency range can be used for someone else's data around the corner or across the street. In contrast, if your data needs to travel all the way to a cell tower, nobody else in between can use that same portion of spectrum.
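The scaling is easy to see with back-of-the-envelope numbers (an idealized model where each transmitter occupies a disc the size of its range; the figures are purely illustrative): the number of simultaneous reuses of a frequency band grows with the inverse square of transmitter range.

```python
import math

def concurrent_reuses(area_km2, range_km):
    """Rough count of simultaneous reuses of one frequency band,
    treating each transmitter as occupying a disc of its range."""
    return area_km2 / (math.pi * range_km ** 2)

city = 100.0                            # km^2 of coverage, illustrative
tower = concurrent_reuses(city, 5.0)    # ~5 km cell-tower range
wifi = concurrent_reuses(city, 0.05)    # ~50 m WiFi range
print(round(wifi / tower))              # 10000
```

Cutting the range by a factor of 100 lets the same band be reused 100² = 10,000 times as often, which is the area spectral efficiency argument in miniature.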

If we want a future where anyone can watch high definition movies or make video calls from anywhere without wires, what we need is short-range networks with routers everywhere — like the one we'd have if everyone opened their WiFi.

What Needs to be Done

EFF will be working with other organizations to launch an Open Wireless Movement in the near future. In the meantime, we're keen to hear from technologists with wireless expertise who would like to help us work on the protocol engineering tasks that are needed to make network sharing easier from a privacy and bandwidth-sharing perspective. You can write to us at openwireless@eff.org.

1. If you run an open wireless network, you may be able to receive significant legal protection from Section 230 of the CDA (against civil and state criminal liability for what others publish through the service) and Section 512 of the DMCA (against copyright claims based on what others use the service for). While these protections are not complete, EFF regularly engages in impact litigation to help ensure that these laws offer as strong protection to network operators as possible.

2. That kind of wireless network could use asymmetric cryptography to generate secret session keys for each user. The main challenge with this design is how to prevent man-in-the-middle attacks. Wireless routers do not have canonical names, so they cannot be issued certificates in the same way that, say, TLS encrypted websites are. A feasible alternative is the trust-on-first-use design employed by SSH: the first time you connect to a wireless network, you might have to assume that the router's key is correct, but you will be notified if it ever changes (which would mean that there is a new router, or the beginning or end of a man-in-the-middle attack). If you can actually see the router, you don't have to assume that the key is correct because you can check it against a number on the box itself. For other users, security could be improved by using GPS, or some other means of remembering not only the keys of each router but also whether it was expected to be present in a given location.
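The SSH-style trust-on-first-use check described in this footnote is simple to sketch in Python (the fingerprint strings and the in-memory store are stand-ins for a hash of the router's public key and a persistent database on the client):

```python
known_routers = {}  # network name -> pinned key fingerprint

def check_router(network, fingerprint):
    """Trust-on-first-use: pin the key on first contact, then compare."""
    pinned = known_routers.get(network)
    if pinned is None:
        known_routers[network] = fingerprint  # first contact: pin it
        return "trusted-on-first-use"
    if pinned == fingerprint:
        return "ok"
    return "KEY CHANGED"  # a new router, or a man-in-the-middle

print(check_router("cafe-wifi", "ab:12"))  # trusted-on-first-use
print(check_router("cafe-wifi", "ab:12"))  # ok
print(check_router("cafe-wifi", "ff:99"))  # KEY CHANGED
```

As with SSH's known_hosts file, the design accepts a small first-connection risk in exchange for detecting any later key substitution.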

Friday, April 29, 2011

April 28, 2011: More than 30 years after they left Earth, NASA's twin Voyager probes are now at the edge of the solar system. Not only that, they're still working. And with each passing day they are beaming back a message that, to scientists, is both unsettling and thrilling.

The message is, "Expect the unexpected."

"It's uncanny," says Ed Stone of Caltech, Voyager Project Scientist since 1972. "Voyager 1 and 2 have a knack for making discoveries."

Today, April 28, 2011, NASA held a live briefing to reflect on what the Voyager mission has accomplished--and to preview what lies ahead as the probes prepare to enter interstellar space.


The adventure began in the late 1970s when the probes took advantage of a rare alignment of outer planets for an unprecedented Grand Tour. Voyager 1 visited Jupiter and Saturn, while Voyager 2 flew past Jupiter, Saturn, Uranus and Neptune. (Voyager 2 is still the only probe to visit Uranus and Neptune.)

When pressed to name the top discoveries from those encounters, Stone pauses, not for lack of material but because of an embarrassment of riches. "It's so hard to choose," he says.

Stone's partial list includes the discovery of volcanoes on Jupiter's moon Io; evidence for an ocean beneath the icy surface of Europa; hints of methane rain on Saturn's moon Titan; the crazily-tipped magnetic poles of Uranus and Neptune; icy geysers on Neptune's moon Triton; planetary winds that blow faster and faster with increasing distance from the sun.

"Each of these discoveries changed the way we thought of other worlds," says Stone.

In 1980, Voyager 1 used the gravity of Saturn to fling itself slingshot-style out of the plane of the Solar System. In 1989, Voyager 2 got a similar assist from Neptune. Both probes set sail into the void.

Sailing into the void sounds like a quiet time, but the discoveries have continued.

Stone sets the stage by directing our attention to the kitchen sink. "Turn on the faucet," he instructs. "Where the water hits the sink, that's the sun, and the thin sheet of water flowing radially away from that point is the solar wind. Note how the sun 'blows a bubble' around itself."

There really is such a bubble; researchers call it the "heliosphere," and it is gargantuan. Made of solar plasma and magnetic fields, the heliosphere is about three times wider than the orbit of Pluto. Every planet, asteroid, spacecraft, and life form belonging to our solar system lies inside.

The Voyagers are trying to get out, but they're not there yet. To locate them, Stone peers back into the sink: "As the water (or solar wind) expands, it gets thinner and thinner, and it can't push as hard. Abruptly, a sluggish, turbulent ring forms. That outer ring is the heliosheath--and that is where the Voyagers are now."

The heliosheath is a very strange place, filled with a magnetic froth no spacecraft has ever encountered before, echoing with low-frequency radio bursts heard only in the outer reaches of the solar system, so far from home that the sun is a mere pinprick of light.

"In many ways, the heliosheath is not like our models predicted," says Stone.

In June 2010 Voyager 1 beamed back a startling number: zero. That's the outward velocity of the solar wind where the probe is now. No one thinks the solar wind has completely stopped; it may have just turned a corner. But which way? Voyager 1 is trying to figure that out through a series of "weather vane" maneuvers, in which V1 turns itself in a different direction to track the local breeze. The old spacecraft still has some moves left, it seems.

No one knows exactly how many more miles the Voyagers must travel before they "pop free" into interstellar space. Most researchers believe, however, that the end is near. "The heliosheath is 3 to 4 billion miles in thickness," estimates Stone. "That means we'll be out within five years or so."

There is plenty of power for the rest of the journey. Both Voyagers are energized by the radioactive decay of a plutonium-238 heat source. This should keep critical subsystems running through at least 2020.

After that, he says, "Voyager will become our silent ambassador to the stars."

Each probe is famously equipped with a Golden Record, literally, a gold-coated copper phonograph record. It contains 118 photographs of Earth; 90 minutes of the world's greatest music; an audio essay entitled Sounds of Earth (featuring everything from burbling mud pots to barking dogs to a roaring Saturn 5 liftoff); greetings in 55 human languages and one whale language; the brain waves of a young woman in love; and salutations from the Secretary General of the United Nations. A team led by Carl Sagan assembled the record as a message to possible extraterrestrial civilizations that might encounter the spacecraft.

"A billion years from now, when everything on Earth we've ever made has crumbled into dust, when the continents have changed beyond recognition and our species is unimaginably altered or extinct, the Voyager record will speak for us," wrote Carl Sagan and Ann Druyan in an introduction to a CD version of the record.

Some people note that the chance of aliens finding the Golden Record is fantastically remote. The Voyager probes won't come within a few light years of another star for some 40,000 years. What are the odds of making contact under such circumstances?

On the other hand, what are the odds of a race of primates evolving to sentience, developing spaceflight, and sending the sound of barking dogs into the cosmos?

Donald Trump is very proud of himself for forcing President Obama to release his birth certificate, ending the debate over whether he was legally fit to lead the country. But not everything the Donald has put his name behind has succeeded. TIME takes a look at some gambles that went bust.

Wednesday, April 27, 2011

The flood warning continues for the Ohio River at McAlpine Upper. At 11:00 a.m. Wednesday the stage was 30.7 feet; flood stage is 23.0 feet. Moderate flooding is occurring and moderate flooding is forecast. Forecast: the river will continue rising to near 33.0 feet by Thursday evening, then begin falling. Impact at 29.0 feet: parts of Utica in flood; the Third Street ramp off I-64 is closed; 10th and 27th Streets are closed at floodgates. Flood history: this crest compares to a previous crest of 33.0 feet on May 10, 1961.

The flood warning continues for the Ohio River at McAlpine Lower. At 11:00 a.m. Wednesday the stage was 62.5 feet; flood stage is 55.0 feet. Minor flooding is occurring and minor flooding is forecast. Forecast: the river will continue rising to near 64.9 feet by Thursday evening, then begin falling. Impact at 66.0 feet: floodgates at Riverport are closed. Flood history: this crest compares to a previous crest of 65.2 feet on March 5, 1962.

Solar power goes viral

In this diagram, the M13 virus consists of a strand of DNA (the figure-8 coil on the right) attached to a bundle of proteins called peptides — the virus coat proteins (the corkscrew shapes in the center) which attach to the carbon nanotubes (gray cylinders) and hold them in place. A coating of titanium dioxide (yellow spheres) attached to dye molecules (pink spheres) surrounds the bundle. More of the viruses with their coatings are scattered across the background. Image: Matt Klug, Biomolecular Materials Group

Researchers at MIT have found a way to make significant improvements to the power-conversion efficiency of solar cells by enlisting the services of tiny viruses to perform detailed assembly work at the microscopic level.

In a solar cell, sunlight hits a light-harvesting material, causing it to release electrons that can be harnessed to produce an electric current. The new MIT research, published online this week in the journal Nature Nanotechnology, is based on findings that carbon nanotubes — microscopic, hollow cylinders of pure carbon — can enhance the efficiency of electron collection from a solar cell's surface.

Previous attempts to use the nanotubes, however, had been thwarted by two problems. First, making carbon nanotubes generally produces a mix of two types: semiconducting tubes (which sometimes allow an electric current to flow and sometimes don't) and metallic tubes (which act like wires, allowing current to flow easily). The new research showed for the first time that the two types have different effects: the semiconducting nanotubes can enhance the performance of solar cells, but the metallic ones degrade it. Second, nanotubes tend to clump together, which reduces their effectiveness.

And that’s where viruses come to the rescue. Graduate students Xiangnan Dang and Hyunjung Yi — working with Angela Belcher, the W. M. Keck Professor of Energy, and several other researchers — found that a genetically engineered version of a virus called M13, which normally infects bacteria, can be used to control the arrangement of the nanotubes on a surface, keeping the tubes separate so they can't short out the circuits and keeping them from clumping.

The system the researchers tested used a type of solar cell known as dye-sensitized solar cells, a lightweight and inexpensive type where the active layer is composed of titanium dioxide, rather than the silicon used in conventional solar cells. But the same technique could be applied to other types as well, including quantum-dot and organic solar cells, the researchers say. In their tests, adding the virus-built structures enhanced the power conversion efficiency to 10.6 percent from 8 percent — almost a one-third improvement.
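The "almost a one-third improvement" is straightforward to verify: going from 8 percent to 10.6 percent is a relative gain of 32.5 percent. A quick sketch of the arithmetic, using only the efficiency figures reported above:

```python
# Power-conversion efficiencies reported for the dye-sensitized cells.
baseline_efficiency = 8.0   # percent, without the virus-built structures
improved_efficiency = 10.6  # percent, with the virus-assembled nanotubes

# Relative improvement = gain divided by the starting value.
relative_gain = (improved_efficiency - baseline_efficiency) / baseline_efficiency
print(f"Relative improvement: {relative_gain:.1%}")  # prints "Relative improvement: 32.5%"
```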

This dramatic improvement takes place even though the viruses and the nanotubes make up only 0.1 percent by weight of the finished cell. “A little biology goes a long way,” Belcher says. With further work, the researchers think they can ramp up the efficiency even further.

The viruses are used to help improve one particular step in the process of converting sunlight to electricity. In a solar cell, the first step is for the energy of the light to knock electrons loose from the solar-cell material (usually silicon); then, those electrons need to be funneled toward a collector, from which they can form a current that flows to charge a battery or power a device. After that, they return to the original material, where the cycle can start again. The new system is intended to enhance the efficiency of the second step, helping the electrons find their way: Adding the carbon nanotubes to the cell “provides a more direct path to the current collector,” Belcher says.

The viruses actually perform two different functions in this process. First, they possess short proteins called peptides that can bind tightly to the carbon nanotubes, holding them in place and keeping them separated from each other. Each virus can hold five to 10 nanotubes, each of which is held firmly in place by about 300 of the virus's peptide molecules. In addition, the virus was engineered to produce a coating of titanium dioxide (TiO2), a key ingredient for dye-sensitized solar cells, over each of the nanotubes, putting the titanium dioxide in close proximity to the wire-like nanotubes that carry the electrons.

The two functions are carried out in succession by the same virus, whose activity is “switched” from one function to the next by changing the acidity of its environment. This switching feature is an important new capability that has been demonstrated for the first time in this research, Belcher says.

In addition, the viruses make the nanotubes soluble in water, which makes it possible to incorporate the nanotubes into the solar cell using a water-based process that works at room temperature.

Prashant Kamat, a professor of chemistry and biochemistry at the University of Notre Dame who has done extensive work on dye-sensitized solar cells, says that while others have attempted to use carbon nanotubes to improve solar cell efficiency, “the improvements observed in earlier studies were marginal,” while the improvements by the MIT team using the virus assembly method are “impressive.”

“It is likely that the virus template assembly has enabled the researchers to establish a better contact between the TiO2 nanoparticles and carbon nanotubes. Such close contact with TiO2 nanoparticles is essential to drive away the photo-generated electrons quickly and transport [them] efficiently to the collecting electrode surface.”

Kamat thinks the process could well lead to a viable commercial product: “Dye-sensitized solar cells have already been commercialized in Japan, Korea and Taiwan,” he says. If the addition of carbon nanotubes via the virus process can improve their efficiency, “the industry is likely to adopt such processes.”

Belcher and her colleagues have previously used differently engineered versions of the same virus to enhance the performance of batteries and other devices, but the method used to enhance solar cell performance is quite different, she says.

Because the process would just add one simple step to a standard solar-cell manufacturing process, it should be quite easy to adapt existing production facilities and thus should be possible to implement relatively rapidly, Belcher says.

The research team also included Paula Hammond, the Bayer Professor of Chemical Engineering; Michael Strano, the Charles (1951) and Hilda Roddey Career Development Associate Professor of Chemical Engineering; and four other graduate students and postdoctoral researchers. The work was funded by the Italian company Eni, through the MIT Energy Initiative’s Solar Futures Program.

Shreya and Daniela started SmartPowerEd in Nov. ’09 to empower students to take on smart energy projects at their high schools. They started from nothing, were inspired by a lot of mentors, and learned as they went along. Learn a bit about their journey and how they went from watching a presentation to giving presentations!

Who started SmartPowerEd? When and why?

Shreya Indukuri & Daniela Lapidous, class of 2012 at the Harker Upper School in San Jose, CA.

June ’09: they won $5,500 to work on installing smart meters, an organic garden, and window insulating film at Harker, and to try to inspire the student body to attack climate change head-on!

October ’09: Shreya & Daniela were invited to the Governor’s Global Climate Summit hosted by Governor Schwarzenegger, as two of 25 youth climate leaders, where they presented their project to the supportive governors of CA, Washington, and Oregon. Their work was also documented as part of the U.S. portion of a UNICEF documentary on youth activism against climate change.

November ’09: Smart meters were installed at Harker. The project was a success. SmartPowerEd was founded.

Co-organized the Climate Solutions Road Tour to travel 2,400 miles across India in solar electric cars to document the best solutions to climate change, even ending up on MTV India!

Worked at the German Parliament with the Environment Committee and at the International Chamber of Commerce in Paris

Holds a dual B.A. and Master of Environmental Management from Yale University.

————————

What is SmartPowerEd? We are a network of schools implementing smart energy solutions to reduce energy costs and address climate change.

What is climate change? Climate change is accepted by top scientists as a phenomenon that is caused by human action and is warming the Earth over time. As the Earth warms, ecosystems change and are harmed, sea levels rise, and the environment is irreversibly damaged. To read more about what climate change is, click here. A great place to read about the excess carbon in the atmosphere is 350. Keep up with green news at the Huffington Post, one of many online news sources with green sections. Here are 10 quick facts on American energy use. Know it, own it, fight it!

Why is this movement for smart energy important? We are the young generation of America, and climate change will affect our future. Solutions exist, and we have the opportunity to implement them now.

The more schools that join the SmartPowerEd network, the more information can be shared and the more energy can be saved. And beyond addressing climate change, saving money on the energy costs of school buildings means more money can go toward education itself.

“We can’t solve problems by using the same kind of thinking we had when we created them.” – Albert Einstein

A massive reservoir of buried frozen carbon dioxide ice (dry ice) detected by the Mars Reconnaissance Orbiter (MRO) exchanges mass with the red planet’s atmosphere as the planet tilts on its axis, which in turn could affect the stability of liquid water and the frequency and severity of dust storms.

A newly discovered buried deposit of frozen carbon dioxide (dry ice) near Mars’ south pole contains about 30 times more carbon dioxide than previously estimated. This map colour-codes thickness estimates: red, 600 metres; yellow, about 400 metres; dark blue, less than 100 metres, tapering to zero. Image: NASA/JPL-Caltech/Sapienza University of Rome/Southwest Research Institute.

MRO made the detection using its ground penetrating radar instrument, revealing a frozen carbon dioxide ice deposit at Mars’ south pole that occupies a 12,000 cubic kilometre volume and holds 80 percent as much CO2 as today’s atmosphere. Planetary scientists already knew that a small cap of carbon-dioxide ice rested on top of the water ice there, but the newly detected deposit has about 30 times more dry ice than previously estimated.
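As a rough plausibility check (our own back-of-envelope, not part of the article): taking an assumed density for solid CO2 of about 1,600 kg/m³ and a commonly cited rough estimate of 2.5 × 10^16 kg for the mass of Mars' present-day atmosphere, a 12,000 cubic kilometre deposit works out to about 1.9 × 10^16 kg, in the right neighbourhood of the "80 percent as much CO2 as today's atmosphere" figure.

```python
# Back-of-envelope check of the reported deposit size.
# Both constants below are assumptions, not values from the article.
SOLID_CO2_DENSITY = 1600.0     # kg/m^3, approximate density of dry ice
MARS_ATMOSPHERE_MASS = 2.5e16  # kg, commonly cited rough estimate

deposit_volume_m3 = 12_000 * 1e9  # 12,000 km^3 expressed in m^3
deposit_mass_kg = deposit_volume_m3 * SOLID_CO2_DENSITY

fraction_of_atmosphere = deposit_mass_kg / MARS_ATMOSPHERE_MASS
print(f"Deposit mass: {deposit_mass_kg:.2e} kg, "
      f"about {fraction_of_atmosphere:.0%} of the atmosphere's mass")
```

With these assumed constants the result lands near 77 percent, consistent with the reported 80 percent given the uncertainties in both numbers.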

“The discovery was very surprising to me and many other scientists, although theoretical models had predicted that the CO2 atmosphere would occasionally collapse onto the poles,” Roger Phillips of Southwest Research Institute and deputy team leader for MRO’s Shallow Radar instrument tells Astronomy Now. “To actually find evidence of this is another matter, so we were very careful in our analyses to make sure we got it right.”

This cross-section view of underground layers near Mars' south pole is a radargram based on data from the Shallow Subsurface Radar (SHARAD) instrument on MRO. Researchers interpret the zone that is nearly free of radio-wave reflections (hence dark in the radargram) to be composed of frozen carbon dioxide, or "dry ice." This cross section covers a transect about 330 kilometres long in a region from about 86 degrees to 87 degrees south latitude and 280 degrees to 10 degrees east longitude, and is approximately 1.7 kilometres deep. Image: NASA/JPL-Caltech/Sapienza University of Rome/Southwest Research Institute.

Features such as collapse pits, which are known to be caused by dry ice sublimation, suggest that the ice cap is dissipating, adding gas to the martian atmosphere. “There are two types of collapse pits indicating sublimation of the dry ice,” explains Phillips. “There are isolated sublimation pits, up to four kilometres in diameter, and there are linear troughs, along which sublimation collapse pits form. There are also erosional remnants of the CO2 deposit seen in the radar data.”

Phillips adds that if you include this buried deposit, Martian carbon dioxide right now is roughly half frozen and half in the atmosphere, but at other times it can be nearly all frozen or nearly all in the atmosphere. When the dry ice is in its dissipating phase, much of the carbon dioxide enters the planet’s atmosphere, resulting in stronger winds that increase not only the frequency of dust storms but also their intensity. The researchers even speculate that, at such times, conditions might be stable enough for liquid water to exist in some locations on the red planet.

These images from orbit show an area near Mars' south pole where coalescing or elongated pits are interpreted as signs that an underlying deposit of frozen carbon dioxide, or "dry ice," has been shrinking by sublimation. The image on the left covers an area about 5.2 kilometres across, near 87 degrees south latitude, 268 degrees east longitude; the image on the right is an enlarged section of the rectangle in the left image. Image: NASA/JPL-Caltech/Univ. of Arizona.

The sublimating dry ice is likely linked to the tilt of Mars’ axis, which affects the amount of sunlight falling onto the polar regions. If Mars’ tilt and orbital parameters act to offer maximum exposure to the summer sunshine at the south pole, this could increase the atmospheric pressure by 75 percent of its current level, and modelling suggests that the mass of the planet’s atmosphere could change several-fold on timescales of 100,000 years or less.

“A tilted Mars with a thicker carbon-dioxide atmosphere causes a greenhouse effect that tries to warm the Martian surface, while thicker and longer-lived polar ice caps try to cool it,” says Robert Haberle of NASA’s Ames Research Center. “Our simulations show the polar caps cool more than the greenhouse warms. Unlike Earth, which has a thick, moist atmosphere that produces a strong greenhouse effect, Mars’ atmosphere is too thin and dry to produce as strong a greenhouse effect as Earth’s, even when you double its carbon-dioxide content.”

The royal wedding Dalek

The royal wedding is now just around the corner. Some people will be content simply watching the event on their television, but not one Doctor Who fanatic.

Chris Balcombe, 51, has spent one week decorating his Dalek to tie in with the occasion. The Doctor's nemesis has been painted red, blue and white and is covered with Union Jack flags and photos of Prince William and Kate Middleton.

Furthermore, Chris attached a mechanical grip to the Dalek so it can hold trays of food and drinks.

See below for a picture of Chris posing with his royal wedding-themed Dalek:

You may even know a few of these people. You have to put up with them, because you work with them, or they’re family, or whatever.

These birthers are people who refuse to believe a black man could possibly be their president. And like a screaming child in a shopping aisle, there’s just no reasoning with them. Like a frustrated parent, we’ve been hoping to ride it out.

Well, it’s time to stop. By even engaging them, we’ve become part of the problem.

By treating birthers like they’re rational and allowing them to continue the tantrum they started throwing after the 2008 election was over… we empower them.

They are sore losers at best, bigots at worst. And it’s time to stop enabling them.

So on May 10th, I ask you to clearly, bluntly, kindly but firmly, treat them like the petulant little infants they are.

I ask you to join me, both in your actions and by tweeting your feelings on this with the hashtag #ShunABirther – that 5/10/11 is officially “Shun A Birther” day.

Whenever you meet someone you suspect of harboring recreational bigotry and xenophobia, ask to see their intellectual papers, whether you are at the shopping mall, in a restaurant, at the office or among the remaining people on “Celebrity Apprentice.”

Ask them if they believe Barack Obama is a United States citizen. If they say yes, you continue like nothing happened.

But if they say no - if they tell you they believe Barack Obama is NOT the President because he’s not American - you raise your palm up and let them know they can talk to the hand. You INSTANTLY cease conversation. You go silent, and you shun.

You treat them like they treat fact. You ignore.

If you’re at a store, you turn your back on them for an uncomfortable beat, and exit. If you’re at a restaurant, you stop talking and when the bill comes, you tip NOTHING.

You give them silence and the respect they deserve.

May 10th. I will be asking, “Do you believe Barack Obama is a United States citizen?” If the answer is yes, the day continues. But if the answer is no, my being stunned turns into their being shunned.

These people do not deserve the time of day from us if they are still living in the 1800s. Do not talk, do not engage… simply walk away, because that birther is not your child and you don’t have to put up with the tantrum. The more embarrassing your shun, the better.

Birtherism needs to be thrown on the crapheap of history with the Klan hood. But it’ll never happen if we keep trying to talk to them like facts can change their mind, or logic can quell the undercurrent of fear and bigotry that fuels their issue.

Join me this tenth of May.

From now until then, pledge your support by sharing stories of birther stupidity and ignorance through Facebook and twitter links, with the hashtag #ShunABirther.

Then, on 5/10 - share your personal stories of how you turned your back on their ignorance. Don’t confront, just shun. Don’t debate, ignore. Don’t engage… EMBARRASS. With a smile and a pivot, make these people feel like the fools they are.

Do unto others as they would do unto The President of the United States, our Democracy and basic human civility.

Give ‘em the cold shoulder. The silent treatment. Turn your back on them. Show them the door or your hand.

On May 10, 2011… Shun a birther!

Ignorance shouldn’t be bliss. It should be embarrassing and uncomfortable.

Tuesday, April 26, 2011

Facebook looks to cash in on user data

By JESSICA GUYNN

Los Angeles Times

Published: April 25, 2011 10:52AM Updated: April 25, 2011 10:52AM

For years, the company founded by Mark Zuckerberg put little effort into ad sales, focusing instead on making its service irresistible to users. Today more than 600 million people have Facebook accounts, with the average user spending seven hours a month posting. Now the company is looking to cash in on this personal information by helping advertisers pinpoint exactly who they want to reach. Eric Risberg | Associated Press file photo

Julee Morrison has been obsessed with Bon Jovi since she was a teenager.

So when paid ads for fan sites started popping up on the 41-year-old Salt Lake City blogger’s Facebook page, she was thrilled. She described herself as a “clicking fool,” perusing videos and photos of the New Jersey rockers.

Then it dawned on Morrison why all those Bon Jovi ads appeared every time she logged onto the social networking site.

“Facebook is reading my profile, my interests, the people and pages I am ‘friends’ with and targeting me,” Morrison said. “It’s brilliant social media, but it’s absolutely creepy.”

For Facebook users, the free ride is over.

For years, the privately held company founded by Mark Zuckerberg in a Harvard dorm room put little effort into ad sales, focusing instead on making its service irresistible to users. It worked. Today more than 600 million people have Facebook accounts. The average user spends seven hours a month posting photos, chatting with friends, swapping news links and sending birthday greetings to classmates.

Now the company is looking to cash in on this mother lode of personal information by helping advertisers pinpoint exactly who they want to reach. This is no idle boast. Facebook doesn’t have to guess who its users are or what they like. Facebook knows, because members volunteer this information freely — and frequently — in their profiles, status updates, wall posts, messages and “likes.”

It is now tracking this activity, shooting online ads to users based on their demographics, interests, even what they say to friends on the site — sometimes within minutes of their typing a key word or phrase.

For example, women who have changed their relationship status to “engaged” on their Facebook profiles shouldn’t be surprised to see ads from local wedding planners and caterers pop up when they log in.

Marketers have been tracking consumers’ online habits for years, compiling detailed dossiers of where they click and roam. But Facebook’s unique trove of consumer behavior could transform it into one of the most powerful marketing tools ever invented, some analysts believe. And that could translate into a financial bonanza for investors in the seven-year-old company as it prepares for a public offering, perhaps as early as next year.

But privacy watchdogs said Facebook’s unique ability to mine data and sell advertising based on what its members voluntarily share amounts to electronic eavesdropping on personal updates, posts and messages that many users intended to share only with friends.

“Facebook has perfected a stealth digital surveillance apparatus that tracks, analyzes and then acts on your information, including what you tell your friends,” said Jeffrey Chester, executive director of the Center for Digital Democracy. “Facebook users should be cautious about whether the social networking giant ultimately has their best interests at heart.”

Facebook said it does not disclose information that would allow advertisers to identify individual users, instead filtering based on geography, age or specific interests. It also lets users control whether companies can display the users’ names to others to promote products. But any information users post on the site — hobbies, status updates, wall posts — is fair game for ad targeting.

A lot is riding on getting it right. Last year, online advertising in the U.S. grew 15 percent to $26 billion, according to the Internet Advertising Bureau.

Monday, April 25, 2011

Rev. Graham and the Signs of Armageddon

April 25, 2011

On ABC’s "This Week," the Rev. Franklin Graham was wrong when he said that earthquakes, wars and famines are occurring "with more frequency and more intensity."

The preacher, who is the son of the Rev. Billy Graham and president and CEO of the Billy Graham Evangelistic Association, discussed the prophecy of Armageddon with host Christiane Amanpour during a special Easter edition of the Sunday talk show.

Graham, April 24: I believe we are in the latter days of this age. When I say "latter days," could it be the last hundred years or the last thousand years or the last six months? I don’t know.

But the Bible, the things that the Bible predicts, earthquakes and famines, nation rising against nation, we see this happening with more frequency and more intensity.

On all three counts, the preacher is wrong. Today’s famines and armed conflicts are fewer and relatively smaller than those in the last century, and the frequency of major earthquakes has remained about the same.

Recent Famines ‘Small by Historical Standards’

First, let’s look at famines. Cormac O’Grada, an economics professor at University College Dublin who authored "Famine: A Short History," wrote that "the frequency of famines has been declining over time" and so has "famine mortality." In his book, which was published in 2009, O’Grada said "the famines that have struck since [the] 1960s have been small by historical standards," which he attributed to improved economic conditions since 1950.

In a lecture he gave last year based on his research for the book, O’Grada said:

O’Grada, June 2010: A study published a decade ago claimed that famine was responsible for 70 million deaths during the twentieth century, or more than either world war. But the previous century was probably worse, at least relatively speaking. Thirty million is a lower-bound estimate of famine mortality in India and China alone between 1870 and 1902, while a figure of ‘fifty million dead might not be unrealistic’.

Recent famines, by contrast, have been ‘small’. Since the high-profile crises of Malawi in 2002 and of Niger in 2005 famine has not been front-page news.

We e-mailed O’Grada and asked him about Graham’s claim. He replied, "The Rev. Graham is incorrect."

‘General Reduction’ in Armed Conflicts

It’s also not true that we are seeing "nation rising against nation" with "more frequency and more intensity."

The Uppsala Conflict Data Program at Uppsala University in Sweden compiles data on "armed conflicts" — which it defines as wars that result in at least 1,000 deaths and minor conflicts that result in more than 25 but less than 1,000 deaths. Its data date only to 1946 but show the number of armed conflicts topped 50 in the early 1990s. As of 2009, the number stood at 36 — which represents a slight uptick in recent years, but it’s still not as high as the early 1990s.

In a March 2009 interview, Uppsala Conflict Data Program founder Peter Wallensteen said there has been a "general reduction in the number of armed conflicts" since that peak in the ’90s:

Wallensteen, March 2009: We have also seen a general reduction in the number of armed conflicts, and that created a lot of headlines. In 1991-1992, there were about 50 armed conflicts, but by 2003, there were 29. Now the number is up to 35 again, so it’s escalated — but many of them are smaller conflicts, of which very few have really become big wars. Again, this seems to identify some kind of ability to keep conflicts smaller at least.

Earthquakes ‘Fairly Constant’

The perception that earthquakes are on the rise is not new to the folks at the United States Geological Survey. In fact, the agency has a web page that asks the question, "Are Earthquakes Really on the Increase?"

The short answer is no. It may seem like it because there are many more monitoring stations than ever before, but major earthquakes "have remained fairly constant" since the beginning of the last century.

The long answer:

USGS: We continue to be asked by many people throughout the world if earthquakes are on the increase. Although it may seem that we are having more earthquakes, earthquakes of magnitude 7.0 or greater have remained fairly constant.

A partial explanation may lie in the fact that in the last twenty years, we have definitely had an increase in the number of earthquakes we have been able to locate each year. This is because of the tremendous increase in the number of seismograph stations in the world and the many improvements in global communications. In 1931, there were about 350 stations operating in the world; today, there are more than 8,000 stations and the data now comes in rapidly from these stations by electronic mail, internet and satellite. This increase in the number of stations and the more timely receipt of data has allowed us and other seismological centers to locate earthquakes more rapidly and to locate many small earthquakes which were undetected in earlier years. The NEIC now locates about 20,000 earthquakes each year or approximately 50 per day. Also, because of the improvements in communications and the increased interest in the environment and natural disasters, the public now learns about more earthquakes.

According to long-term records (since about 1900), we expect about 17 major earthquakes (7.0 – 7.9) and one great earthquake (8.0 or above) in any given year.
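One way to see why a steady rate can still feel like an increase (our own illustration, not part of the USGS statement): if major quakes arrive at a constant average rate of about 17 per year, the count in any single year fluctuates roughly like a Poisson random variable, so swings of four or five quakes from one year to the next are entirely normal.

```python
import math

# USGS long-term average for magnitude 7.0-7.9 earthquakes per year.
MEAN_MAJOR_QUAKES = 17

# For a Poisson process, the year-to-year standard deviation is
# the square root of the mean, so roughly +/- 4 quakes.
std_dev = math.sqrt(MEAN_MAJOR_QUAKES)
print(f"Typical year-to-year spread: +/- {std_dev:.1f} quakes")

# Chance of an apparently "busy" year with 22 or more major quakes,
# even with no change in the underlying rate: 1 - P(X <= 21).
p_busy_year = 1.0 - sum(
    math.exp(-MEAN_MAJOR_QUAKES) * MEAN_MAJOR_QUAKES**k / math.factorial(k)
    for k in range(22)
)
print(f"P(22 or more in one year): {p_busy_year:.1%}")
```

The point is not the exact probability but that a busy year is unremarkable under a constant rate, which is why better detection, not more earthquakes, explains the perception.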

Of course, we take no position on whether there will be an Armageddon. But we can say that the frequency and intensity of famines, wars and earthquakes are either declining or at least not increasing.

The SERVIR-Himalaya node at ICIMOD in Nepal is the third global node in SERVIR's Regional Visualization and Monitoring System. (ICIMOD)

SERVIR-Himalaya made its successful debut in Kathmandu, Nepal, on Oct. 5, taking the stage as the third global node in the SERVIR Regional Visualization and Monitoring System. SERVIR-Himalaya expands the collaboration between NASA, the U.S. Agency for International Development, or USAID, and international partners to meet development challenges by "linking space to village."

The partners inaugurated the state-of-the-art regional monitoring system at a ribbon-cutting ceremony attended by NASA Administrator Charles Bolden and Michael Yates, senior deputy assistant administrator of USAID's Bureau for Economic Growth, Agriculture and Trade. The team at SERVIR-Himalaya's host institution, the International Centre for Integrated Mountain Development, or ICIMOD, was represented by Director General Andreas Schild and Basanta Shrestha, division head of ICIMOD's Mountain Environment and Natural Resources Information System.

Mr. Shrestha highlighted the local perspective on the Himalayan node launch: "Through the partnership with USAID and NASA on SERVIR-Himalaya, ICIMOD will be able to augment its capacity and its network of cooperative partners in the region to use Earth observation for societal benefits of the mountain communities." SERVIR features web-based access to satellite imagery, decision-support tools and interactive visualization capabilities. It puts previously inaccessible information into the hands of scientists, environmental managers and decision-makers.

Indeed, the name SERVIR comes from the Spanish verb "to serve," and SERVIR-Himalaya will serve the Hindu Kush-Himalaya region, including the partner nations of Afghanistan, Bangladesh, Bhutan, China, India, Nepal and Pakistan.

Approximately 1.3 billion people depend on the ecosystem services provided by the Himalayan mountains, yet the region is known as Earth's "third pole" because of its inaccessibility and the vast amount of water stored there in the form of ice and snow. SERVIR will integrate Earth science data from NASA satellites with geospatial information products from other government agencies to support and expand ICIMOD's focus on critical regional issues such as disaster management, biodiversity conservation, trans-boundary air pollution, snow and glacier monitoring, mountain ecosystem management and climate change adaptation.

It's no wonder that Prof. José Achache, director of the Group on Earth Observations, touted SERVIR-Himalaya as furthering the Global Earth Observing System of Systems or GEOSS. SERVIR was developed in coordination with the group, which includes more than 80 nations working together to build GEOSS to serve the needs of people the world over.

SERVIR grew out of collaboration between USAID and researchers at NASA's Marshall Space Flight Center in Huntsville, Ala.

"The SERVIR technology and our partnership with various organizations and people around the globe reflect NASA's commitment to improving life on our home planet for all people," added NASA Administrator Charles Bolden.

According to Mr. Yates, USAID’s perspective reflects a similar goal. "We are pleased to work with our partners in Nepal, and in other regions of the world, to build capacity to use satellite data and mapping technologies for making practical decisions that improve people’s lives," he said.

Since 2005, SERVIR has served the Mesoamerican region and the Dominican Republic from the Water Center for the Humid Tropics of Latin America and the Caribbean, which is based in Panama. SERVIR also has served East Africa since 2008, operating from the Regional Center for Mapping of Resources for Development in Nairobi, Kenya.

The SERVIR program is operated by the Earth Science Division's Applied Sciences Program in NASA's Science Mission Directorate in Washington. Four other NASA field centers work with the Marshall Center on the program: NASA's Goddard Space Flight Center in Greenbelt, Md., NASA's Ames Research Center in Moffett Field, Calif., NASA's Jet Propulsion Laboratory in Pasadena, Calif., and NASA's Langley Research Center in Hampton, Va.

Scientists suggest spacetime has no time dimension

Scientists propose that clocks measure the numerical order of material change in space, where space is a fundamental entity; time itself is not a fundamental physical entity. Image credit: Wikimedia Commons.

(PhysOrg.com) -- The concept of time as a way to measure the duration of events is not only deeply intuitive, it also plays an important role in our mathematical descriptions of physical systems. For instance, we define an object’s speed as its displacement per unit time. But some researchers theorize that this Newtonian idea of time as an absolute quantity that flows on its own, along with the idea that time is the fourth dimension of spacetime, is incorrect. They propose replacing these concepts of time with a view that corresponds more accurately to the physical world: time as a measure of the numerical order of change.

In two recent papers (one published and one to be published) in Physics Essays, Amrit Sorli, Davide Fiscaletti, and Dusan Klinar from the Scientific Research Centre Bistra in Ptuj, Slovenia, have described in more detail what this means.

No time dimension

They begin by explaining how we usually assume that time is an absolute physical quantity that plays the role of the independent variable (time, t, is often the x-axis on graphs that show the evolution of a physical system). But, as they note, we never really measure t. What we do measure is an object’s frequency, speed, etc. In other words, what experimentally exists are the motion of an object and the tick of a clock, and we compare the object’s motion to the tick of a clock to measure the object’s frequency, speed, etc. By itself, t has only a mathematical value, and no primary physical existence.

This view doesn’t mean that time does not exist, but that time has more to do with space than with the idea of an absolute time. So while 4D spacetime is usually considered to consist of three dimensions of space and one dimension of time, the researchers’ view suggests that it’s more correct to imagine spacetime as four dimensions of space. In other words, as they say, the universe is “timeless.”

“Minkowski space is not 3D + T, it is 4D,” the scientists write in their most recent paper. “The point of view which considers time to be a physical entity in which material changes occur is here replaced with a more convenient view of time being merely the numerical order of material change. This view corresponds better to the physical world and has more explanatory power in describing immediate physical phenomena: gravity, electrostatic interaction, information transfer by EPR experiment are physical phenomena carried directly by the space in which physical phenomena occur.”

As the scientists added, the roots of this idea come from Einstein himself.

“Einstein said, ‘Time has no independent existence apart from the order of events by which we measure it,’” Sorli told PhysOrg.com. “Time is exactly the order of events: this is my conclusion.”

In the future, the scientists plan to investigate the possibility that quantum space has three spatial dimensions, as Sorli explained.

“The idea of time being the fourth dimension of space did not bring much progress in physics and is in contradiction with the formalism of special relativity,” he said. “We are now developing a formalism of 3D quantum space based on Planck work. It seems that the universe is 3D from the macro to the micro level to the Planck volume, which per formalism is 3D. In this 3D space there is no ‘length contraction,’ there is no ‘time dilation.’ What really exists is that the velocity of material change is ‘relative’ in the Einstein sense.”

Numerical order in space

The researchers give an example of this concept of time by imagining a photon that is moving between two points in space. The distance between these two points is composed of Planck distances, each of which is the smallest distance that the photon can move. (The fundamental unit of this motion is Planck time.) When the photon moves a Planck distance, it is moving exclusively in space and not in absolute time, the researchers explain. The photon can be thought of as moving from point 1 to point 2, and its position at point 1 is “before” its position at point 2 in the sense that the number 1 comes before the number 2 in the numerical order. Numerical order is not equivalent to temporal order, i.e., the number 1 does not exist before the number 2 in time, only numerically.
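The Planck distance and Planck time mentioned above are not arbitrary: both follow from the fundamental constants. A minimal sketch, using standard values of the constants (the specific numerical values are CODATA figures, not from the article), computes the Planck length as √(ħG/c³) and the Planck time as the interval light needs to cross it:

```python
import math

# Standard physical constants in SI units (CODATA values, assumed here)
hbar = 1.054571817e-34  # reduced Planck constant, J·s
G = 6.67430e-11         # Newtonian gravitational constant, m^3 kg^-1 s^-2
c = 2.99792458e8        # speed of light in vacuum, m/s

# Planck length: the smallest step of the photon's motion in the article's picture
planck_length = math.sqrt(hbar * G / c**3)

# Planck time: the time for light to traverse one Planck length
planck_time = planck_length / c

print(planck_length)  # roughly 1.6e-35 m
print(planck_time)    # roughly 5.4e-44 s
```

On this reading, the Planck time is simply the numerical label attached to each elementary step of motion in space, rather than an interval of an independently flowing time.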

As the researchers explain, without using time as the fourth dimension of spacetime, the physical world can be described more accurately. As physicist Enrico Prati noted in a recent study, Hamiltonian dynamics (equations in classical mechanics) is robustly well-defined without the concept of absolute time. Other scientists have pointed out that the mathematical model of spacetime does not correspond to physical reality, and propose that a timeless “state space” provides a more accurate framework.

The scientists also investigated the falsifiability of the two notions of time. The concept of time as the fourth dimension of space - as a fundamental physical entity in which an experiment occurs - can be falsified by an experiment in which time does not exist, according to the scientists. An example of an experiment in which time is not present as a fundamental entity is the Coulomb experiment; mathematically, this experiment takes place only in space. On the other hand, in the concept of time as a numerical order of change taking place in space, space is the fundamental physical entity in which a given experiment occurs. Although this concept could be falsified by an experiment in which time (measured by clocks) is not the numerical order of material change, such an experiment is not yet known.

“Newton theory on absolute time is not falsifiable, you cannot prove it or disprove it, you have to believe in it,” Sorli said. “The theory of time as the fourth dimension of space is falsifiable and in our last article we prove there are strong indications that it might be wrong. On the basis of experimental data, time is what we measure with clocks: with clocks we measure the numerical order of material change, i.e., motion in space.”

How it makes sense

In addition to providing a more accurate description of the nature of physical reality, the concept of time as a numerical order of change can also resolve Zeno’s paradox of Achilles and the Tortoise. In this paradox, the faster Achilles gives the Tortoise a head start in the race. But although Achilles can run 10 times faster than the Tortoise, he can never surpass the Tortoise because, for every distance unit that Achilles runs, the Tortoise also runs 1/10 that distance. So whenever Achilles reaches a point where the Tortoise has been, the Tortoise has also moved slightly ahead. Although the conclusion that Achilles can never surpass the Tortoise is obviously false, there are many different proposed explanations for why the argument is flawed.

Here, the researchers explain that the paradox can be resolved by redefining velocity, so that the velocity of both runners is derived from the numerical order of their motion, rather than from their displacement and direction in time. From this perspective, Achilles and the Tortoise move through space only, and Achilles can surpass the Tortoise in space, though not in absolute time.

The researchers also briefly examine how this new view of time fits with how we intuitively perceive time. Many neurological studies have confirmed that we do have a sense of past, present, and future. This evidence has led to the proposal that the brain represents time with an internal “clock” that emits neural ticks (the “pacemaker-accumulator” model). However, some recent studies have challenged this traditional view, and suggest that the brain represents time in a spatially distributed way, by detecting the activation of different neural populations. Although we perceive events as occurring in the past, present, or future, these concepts may just be part of a psychological frame in which we experience material changes in space.

Finally, the researchers explain that this view of time does not look encouraging for time travelers.

“In our view, time travel into the past and future are not possible,” Sorli said. “One can travel in space only, and time is a numerical order of his motion.”