June 25, 2008

3d virtual reality adoption in the consumer marketplace, particularly the home, has been limited by three primary factors. First, the technology needed to display 3d virtual reality has been expensive and not integrated into consumer electronic devices. Second, the interfaces needed to achieve 3d have been cumbersome, specifically those associated with active stereo displays. Third, the bandwidth necessary to move highly accurate models into the home has been limited. These characteristics, coupled with little "mass" exposure to 3d virtual reality immersion, have effectively kept the home from becoming a virtual reality marketplace.

However, within the past two years there have been remarkable shifts that remove some of these technological obstacles, along with an increased interest in 3d as a viewing experience. Perhaps the greatest indicator of consumer interest in these types of environments for entertainment is the box office success of the 3d animated feature Beowulf, which grossed $82 million. [1]

From a technological standpoint, more vendors, driven by potential consumer entertainment interest, are integrating 3d virtual reality capabilities into consumer televisions, which function as the monitor for the home entertainment computer. This integration on the consumer end also comes with a maturation of tools allowing for easier creation of 3d objects. [2] Finally, broadband capability in the United States continues to increase. Current estimates place broadband penetration at about 57% of all households.[3]

May 05, 2008

High definition video content is broadcast daily into millions of American homes. While various flavors of HD exist (720p, 1080i, 1080p), there has been an undeniable trend toward higher resolution and better fidelity.

Beyond the current commodity high definition (best resolution of 1440*1080) is the world of digital cinema, also known as 4K. 4K offers a resolution of 4096*1714, averaging ~8 million pixels. [1] As Theo Mayer notes, "Whereas new formats typically represent some 50-70% improvement in resolution to the previous generation, 4K represent an over 325% leap in a single step over the just establishing 1080p (1920h X 1080v) standard. It represents an incredible 500% leap over the current DLP based SXGA+ (1400h X 1050v) format!"
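Mayer's percentages can be checked with quick arithmetic. The sketch below assumes the full DCI 4K container (4096 x 2160, about 8.8 million pixels, which lines up with the "over 325%" and "500%" figures) rather than the 4096*1714 scope crop cited above; comparing raw pixel counts reproduces his numbers:

```python
# Back-of-the-envelope pixel-count comparison for the display formats
# mentioned above. The 4K frame size is an assumption: the full DCI
# container (4096 x 2160), which matches Mayer's quoted percentages.

formats = {
    "4K (DCI container)": (4096, 2160),
    "1080p": (1920, 1080),
    "SXGA+": (1400, 1050),
}

pixels = {name: w * h for name, (w, h) in formats.items()}
base = pixels["4K (DCI container)"]

for name, count in pixels.items():
    print(f"{name:>20}: {count:,} pixels")

# Percentage increase of 4K over each older format.
for name in ("1080p", "SXGA+"):
    leap = (base / pixels[name] - 1) * 100
    print(f"4K vs {name}: {leap:.0f}% more pixels")
```

Running this yields roughly 327% more pixels than 1080p and 502% more than SXGA+, consistent with the quote.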

While 4K technologies are not yet common, the costs of both the cameras necessary for content creation and the projectors for display are rapidly declining. Within three years, we will likely see 4K in commercial movie theaters, and within six years, in our homes.

What changes will 4K technologies bring to collaboration and science? First, the ability to stream 4K real-time video will finally realize the often hyped promise of video being "just as good as being there." 4K resolution will allow our collaborations at a distance to occur at the resolution of digital cinema. As these types of collaborations become more common, they will act as an application driver for broadband networks. To date, most tests with uncompressed real-time 4K video teleconferences have relied on multiple 1 gigabit lambdas.[3]
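Some back-of-the-envelope arithmetic shows why a single gigabit lambda is nowhere near enough. The frame size, frame rate, and bit depth below are illustrative assumptions, not parameters from the cited tests, but they land in the same multi-gigabit ballpark:

```python
# Rough estimate of the bandwidth an uncompressed 4K stream demands.
import math

width, height = 4096, 2160   # full DCI 4K container (assumed)
fps = 24                     # digital-cinema frame rate (assumed)
bits_per_pixel = 30          # 10 bits per channel, RGB (assumed)

bits_per_second = width * height * fps * bits_per_pixel
gbps = bits_per_second / 1e9
print(f"Uncompressed stream: {gbps:.1f} Gbps")
print(f"1-gigabit lambdas needed: {math.ceil(gbps)}")
```

Under these assumptions an uncompressed stream runs about 6.4 Gbps, requiring seven 1 gigabit lambdas.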

Second, 4K technologies will change the nature of scientific visualization. One of the challenges of high performance computing is how to deal with the massive data sets produced. Regardless of application area, the reality is that the most plausible way to understand terabytes of data is through interactive visualization.[4] The best display technologies available today require us to "dumb down" visualizations so they can be viewed on the desktop. 4K will remove this barrier and allow much richer interactive visualizations. We will finally be able to see what we have been able to compute.

Third, our paradigm for approaching sources and displays has been a one-to-one relationship. We have done this so we can combine multivariate data without compromising resolution. 4K projectors are currently designed in a way that breaks this paradigm. Put simply, a 4K projector can be fed by multiple inputs which are then simultaneously displayed.[2] Tomorrow's scientists will be able to create desktops as large as movie screens and feed data at high definition resolution to auditoriums of colleagues.

April 29, 2008

The social web (Web 2.0), built on decentralized tools for participation, cheap data storage, and increasingly affordable broadband networks, is allowing the Internet to function as society's database. The best exemplar of this trend is Wikipedia, which boasts over 10 million articles in more than 250 languages used by 684 million users.[1] Interestingly, however, a Nature survey in 2005 revealed that while 70% of scientists had heard of Wikipedia, few used it, and only a fraction (10%) had ever edited an entry. [2]

In 2007 renowned Harvard biologist E.O. Wilson called for the creation of an Encyclopedia of Life, which would act as a gathering point for scientific research on every aspect of the biosphere.[3] The goal of this project is to create a scientifically valid online database containing information on the 1.8 million currently known species. Projected to take over 10 years, with a cost estimate of ~$110 million, the project hopes to catalyze scientific and amateur interest in the biosphere. Anyone will be able to contribute information to the site, but scientific experts will validate information before it is included on an official species page.[3] Exemplar pages, for example of the Peregrine Falcon, are now available; see http://www.eol.org/taxa/16990688

January 14, 2008

Predictions are that, without substantial new conservation efforts, America's national helium reserve will be depleted in nine years. Apart from its use in children's balloons, helium (He) has important roles in nuclear magnetic resonance, mass spectrometry, fiber optics, computer chip development, and cooling.

How irreplaceable is He? Lee Sobotka, Professor of Chemistry and Physics at Washington University in Saint Louis notes:

"When we use what has been made over the approximate 4.5 billion of
years the Earth has been ar

ound, we will run out," Sobotka said . "We
cannot get too significant quantities of helium from the sun — which
can be viewed as a helium factory 93 million miles away — nor will we
ever produce helium in anywhere near the quantities we need from
Earth-bound factories. Helium could eventually be produced directly in
nuclear fusion reactors and is produced indirectly in nuclear fission
reactors, but the quantities produced by such sources are dwarfed by
our needs."

January 05, 2008

A new era in scientific discovery is upon us as humans are augmented by robotic, computerized assistants that help them sort through the floods of data created by modern scientific instruments. The increased compute power of the modern microprocessor, teamed with breakthroughs in machine learning, is the underlying foundation of this revolution.

The scientific transformation fueled by robotic/machine discovery will occur across a number of scientific drivers, with the life sciences leading the way. Life sciences will push the boundaries due to the explosion of new data in these domains. Take, for example, gene-expression microarrays, which make it possible to measure the rate at which cells express the proteins for thousands of genes. The problem, as noted by Michael Molla, is that "the amount of data this new technology produces is more than one can manually analyze."[1]

A great example of the potential for robotic discovery can be seen in a recent experiment at Manchester University. Using advanced machine learning algorithms, scientists were able to train a computer to look for new biological markers for diagnosing Alzheimer's disease. The computerized assistant was able to scan many more samples than a human assistant could and, importantly, was able to "learn" from each new marker found.[2]
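As a minimal sketch of the general approach (train a classifier on expression profiles, then rank features as candidate markers), consider the following. The data is synthetic and the model choice, a random forest, is an illustrative assumption rather than the Manchester group's actual pipeline:

```python
# Illustrative biomarker screening sketch: fit a classifier to
# gene-expression profiles and rank genes by how strongly they help
# separate cases from controls. Synthetic data throughout.

import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
n_samples, n_genes = 200, 1000

# Expression matrix: rows are samples, columns are genes.
X = rng.normal(size=(n_samples, n_genes))
y = rng.integers(0, 2, size=n_samples)       # 0 = control, 1 = case

# Plant a weak signal in three hypothetical "marker" genes for the cases.
markers = [5, 42, 77]
X[np.ix_(y == 1, markers)] += 1.0

clf = RandomForestClassifier(n_estimators=200, random_state=0)
clf.fit(X, y)

# The highest-importance genes are candidate markers for follow-up study.
top = np.argsort(clf.feature_importances_)[::-1][:5]
print("candidate marker genes:", top)
```

The "learning" described above corresponds to retraining as each validated marker is fed back into the training set.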

Observational science will also be changed as breakthroughs in computer vision, combined with automated observation, allow scientists to have 24*7*365 surveillance of samples. For example, the Smart Vivarium project at the University of California San Diego uses machine learning and cameras to allow robotic monitoring of all of the mice used in animal experiments at UCSD. As Tony Yaksh at UCSD noted, "It allows you to look at subtle changes in behavior that we know are occurring but cannot study because of the labor intensive nature of long-term monitoring."[3]
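The core idea, a camera plus change detection standing in for a human observer, can be sketched in a few lines of OpenCV. This is a toy frame-differencing example with an assumed movement threshold and a hypothetical video file, not the Smart Vivarium's tracking system:

```python
# Toy automated activity monitor: flag frames where the scene changed
# enough to suggest animal movement. Illustrative assumptions only.

import cv2

cap = cv2.VideoCapture("cage_camera.mp4")   # hypothetical video source
ok, prev = cap.read()
prev = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    # Pixels that changed between frames are a crude proxy for movement.
    diff = cv2.absdiff(gray, prev)
    _, mask = cv2.threshold(diff, 25, 255, cv2.THRESH_BINARY)
    activity = cv2.countNonZero(mask)
    if activity > 500:                       # arbitrary movement threshold
        print("movement detected:", activity, "changed pixels")
    prev = gray

cap.release()
```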

Impacts: The automation of scientific discovery through robotic data collection and analysis is a necessity given the explosion of data produced by modern science. The practical impact of this change will be to enhance the efficiency of scientific discovery.

However, this automation will come at a cost: decreased flexibility in modern scientific experiments. Put simply, the "noise" eliminated by many robots is exactly the type of input that, in the hands of a human observer, may lead to a completely serendipitous discovery. [4]

The importance of serendipity to scientific breakthroughs should not be underestimated. As Steven Shapin explains in his review of the book The Accidental Scientists, "Many scientists, including the Harvard physiologist Walter Cannon and, later, the British immunologist Peter Medawar, liked to emphasize how much of scientific discovery was unplanned and even accidental. One of Cannon's favorite examples of such serendipity is Luigi Galvani's observation of the twitching of dissected frogs' legs, hanging from a copper wire, when they accidentally touched an iron railing, leading to the discovery of "galvanism"; another is Hans Christian Ørsted's discovery of electromagnetism when he unintentionally brought a current-carrying wire parallel to a magnetic needle."[5]

Thus, automation will allow discovery when we know what we are looking for but it will likely decrease the serendipity that has fueled some of our most important breakthroughs.

November 14, 2007

NO - REPEAT NO - WATCH OR WARNING IS IN EFFECT FOR THE STATES AND PROVINCES LISTED ABOVE.

A TSUNAMI HAS BEEN OBSERVED AT THE FOLLOWING SITES
TIME - TIME OF MEASUREMENT
AMPL - TSUNAMI AMPLITUDES ARE MEASURED RELATIVE TO NORMAL SEA LEVEL. THESE ARE NOT CREST-TO-TROUGH HEIGHTS.

EVALUATION
BASED ON THE LOCATION - MAGNITUDE - HISTORIC TSUNAMI RECORDS AND OBSERVED TSUNAMI EFFECTS THE EARTHQUAKE WAS NOT SUFFICIENT TO GENERATE A TSUNAMI DAMAGING TO CALIFORNIA/ OREGON/ WASHINGTON/ BRITISH COLUMBIA OR ALASKA. SOME OF THESE AREAS MAY EXPERIENCE NON-DAMAGING SEA LEVEL CHANGES.

PRELIMINARY EARTHQUAKE PARAMETERS
MAGNITUDE - 7.7
TIME - 0641 AKST NOV 14 2007
       0741 PST NOV 14 2007
       1541 UTC NOV 14 2007
LOCATION - 22.2 SOUTH 69.9 WEST - NORTHERN CHILE
DEPTH - 37 MILES/59 KM

THE PACIFIC TSUNAMI WARNING CENTER IN EWA BEACH HAWAII IS CLOSING ITS INVESTIGATION AND WILL ISSUE A FINAL MESSAGE FOR AREAS OUTSIDE CALIFORNIA/ OREGON/ WASHINGTON/ BRITISH COLUMBIA AND ALASKA.

September 26, 2007

"The researchers placed identical strains of
salmonella in containers and sent one into space aboard the shuttle,
while the second was kept on Earth, under similar temperature
conditions to the one in space.

After the shuttle returned, mice were given varying oral doses of the salmonella and then were watched.

After
25 days, 40 percent of the mice given the Earth-bound salmonella were
still alive, compared with just 10 percent of those dosed with the
germs from space. And the researchers found it took about one-third as
much of the space germs to kill half the mice, compared with the germs
that had been on Earth.

The researchers found 167 genes had changed in the salmonella that went to space.

Why?

"That's
the 64 million dollar question," Nickerson said. "We do not know with
100 percent certainty what the mechanism is of space flight that's
inducing these changes.""