
As eloquently stated in The Rensselaer Plan 2024, the university's strategic plan that leads us to our 200th anniversary in 2024: "the most significant transformation at Rensselaer over the past decade has been the creation of a research portfolio of a size, significance, quality, and prominence that positions us to impact Global Challenges." Indeed, over the past 12 years, Rensselaer has solidified its place among the important technological universities of the 21st century...

Research in Media, Arts, Science, and Technology facilitates new approaches to networking, advanced visualization, sensor design, haptics, and multiscale modeling and simulation, which are supported by the core capabilities of EMPAC.

Enabled by the capabilities of the CCI, Rensselaer has developed important programs in Computational Science and Engineering focused on high performance computing, big data, and data analytics, which support research and innovation across a broad front.

Our excellence in Nanotechnology and Advanced Materials builds from the fundamental understanding (experimental, theoretical, and computational) of the underlying atomic and molecular properties of a wide range of nanostructured materials. We now are developing robust, affordable, and sustainable methods for manufacturing new functional hybrid materials, and the hierarchical systems and products based upon them.

A variety of instruments have been deployed that collect all kinds of data that give scientists and researchers a “real-time” view of what’s happening in the lake as it happens. This allows them to monitor where potentially harmful impacts like road salt, nutrient runoff, contaminants and invasive species are coming from, and what the consequences might be if their presence increases.

“A key challenge for modern AI is putting back together a field that has almost splintered among these methodologies,” says James Hendler, director of the Rensselaer Polytechnic Institute for Data Exploration and Applications in Troy, New York.

When talking about robots and self-awareness, I think most people would just freak out, though some would be extremely excited and interested in these things. But I don't think freaking out is warranted here, even though a robot just passed the first self-awareness test ever.

Cyber-infrastructure, above and beneath the waves, is giving researchers a high-tech look at factors impacting Lake George water quality. The Jefferson Project is a long-term collaboration between IBM, Rensselaer Polytechnic Institute and The Fund for Lake George that has cost more than $10 million just to ramp up.

When you think of the Internet of Things, you probably don't think of lakes. But IBM, Rensselaer Polytechnic Institute, and the Fund for Lake George are using IoT technology to make New York's Lake George a "smart lake."

Academic researchers and computer giant IBM are aiming to make Lake George, a 52-kilometer-long body of water in New York state, one of the smartest lakes in the world. Late last month, scientists formally began to capture data from the first of 40 sensing platforms that will give researchers a detailed glimpse into lake behaviors such as water circulation and temperature. The information will be fed into computer models that the researchers say could help managers protect Lake George from threats such as invasive species, excessive nutrients, road salt, and pollution.

The effort, known as the Jefferson Project, involves more than 60 scientists from the Rensselaer Polytechnic Institute (RPI) in Troy, New York; the FUND for Lake George, a regional conservation group; and IBM research labs in Brazil, Ireland, Texas, and New York. The researchers are using Lake George as a test bed for an array of sophisticated “smart” sensors that will monitor 25 different variables, including biological characteristics and water chemistry and quality. The sensors will not only report data back to laboratories, often in real time, but be able to respond to changes in the lake environment. “Our sensors can look at other sensors around [them] and say, ‘I’m seeing something a little unusual, are you seeing it too?’” says RPI’s Rick Relyea, director of the Jefferson Project. “If so, the sensor can make the decision to sample more frequently or sample in a particular depth of water more. They have a great deal of intelligence.”
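The adaptive behavior Relyea describes (a sensor cross-checking its reading against neighbors and sampling faster when it sees an unexplained anomaly) can be sketched in a few lines. This is an illustrative assumption of how such logic might work, not the project's actual firmware; the function name, intervals, and z-score threshold are all hypothetical.

```python
from statistics import mean, stdev

# Assumed sampling intervals, for illustration only.
BASE_INTERVAL_S = 600   # normal operation: sample every 10 minutes
FAST_INTERVAL_S = 60    # anomaly detected: sample every minute

def next_sampling_interval(own_reading: float,
                           neighbor_readings: list[float],
                           z_threshold: float = 3.0) -> int:
    """Return the sampling interval (seconds) for the next cycle.

    If this sensor's reading deviates strongly from its neighbors'
    (z-score above the threshold), sample faster to capture the event.
    """
    if len(neighbor_readings) < 2:
        return BASE_INTERVAL_S  # not enough context to judge an anomaly
    mu = mean(neighbor_readings)
    sigma = stdev(neighbor_readings)
    if sigma == 0:
        anomalous = own_reading != mu
    else:
        anomalous = abs(own_reading - mu) / sigma > z_threshold
    return FAST_INTERVAL_S if anomalous else BASE_INTERVAL_S
```

A reading of 25 °C against neighbors clustered near 10 °C would trigger the fast interval, while a reading of 10.1 °C would not.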

The data the sensors collect will be fed to an IBM supercomputer that will help researchers develop five different computer models that will enable one of the Jefferson Project’s main goals: visualizing Lake George’s behavior. For example, using high-resolution weather forecasting technology developed by IBM, researchers will be able to see how runoff from big storms moves through the 600-square-kilometer Lake George watershed. Other models will allow researchers to examine the impact of the use of road salt on water quality, see how water circulates throughout the lake, and visualize lake food webs.

The Jefferson Project isn’t the only effort to harness new technologies to wire up and study lakes. The U.S. National Science Foundation’s National Ecological Observatory Network is using similar approaches to study the impact of climate change, land-use change, and invasive species on aquatic ecosystems. Internationally, the Global Lake Ecological Observatory Network (GLEON), a grassroots network of ecologists, IT experts, and engineers, also uses new technologies to study how lakes respond to environmental change.

The Jefferson Project isn’t the first time IBM has experimented with instrumenting a body of water, says Harry Kolar, an IBM researcher and an adjunct professor of physics at Arizona State University, Tempe. The company has helped develop many of the technologies being used at Lake George by participating in other projects, including the River and Estuary Observatory Network, an observatory system tracking the Hudson River at Denning’s Point in Beacon, New York. In 2009, IBM also launched a joint project with Ireland’s Marine Institute to monitor water quality and marine life in Ireland’s Galway Bay.

What makes the Jefferson Project different, Kolar says, is not only the smart sensors and the high frequency with which they will collect data, but how the data will be used to help inform the models. And Paul Hanson, a limnologist at the University of Wisconsin (UW), Madison, says that although the Jefferson Project is similar to other lake-monitoring projects, “they’re doing it on steroids. More variables, more frequency, and with better integration [with] models.”

Overall, researchers plan to equip the lake with 40 sensor-carrying platforms, some on land and some in the water; they have deployed 14 thus far. The platforms come in four “flavors”: vertical profilers that send instruments into the lake’s depths to monitor things such as water temperature, chlorophyll, and dissolved organic matter; weather stations that measure humidity, barometric pressure, and wind velocity; tributary stations that study water entering the lake; and acoustic Doppler profilers, underwater sensors that measure lake currents.

Kevin Rose, a postdoctoral associate at UW Madison, who is active in GLEON, says IBM’s involvement makes the Jefferson Project stand out. “Private-public partnerships are going to be a hallmark of how more research is done in the future and this is a great model to see that in action,” he says.

The ultimate test of the Jefferson Project’s value, Hanson says, will be whether local and regional officials are able to use the information to better manage and protect the body of water known as “the Queen of American Lakes.” And project director Relyea says they are aiming high. “Ultimately,” he adds, “our goal is to make this project a blueprint for understanding lakes” that can be replicated elsewhere.

The project, which is expected to run for at least 3 years, is jointly funded by the three groups; leaders say it has a total budget “in the millions,” including direct spending and in-kind contributions. Researchers expect the Jefferson Project to have all of its systems fully integrated by the end of 2016.

Over 30 years ago, Rensselaer established its field station at a donated property in the town of Bolton Landing. (The space was previously a lodge, and it still provides a place to sleep for visiting students and scientists.) This station has served as a base for long-term monitoring of Lake George, as well as other research in the area—including monitoring a number of Adirondack lakes following the acid rain regulations passed in 1990. Now, it is home to the Jefferson Project. And with IBM's technological and financial support, researchers are getting ready to take advantage of a whole new approach to studying Lake George: Big Data.

A team of researchers at the Rensselaer Polytechnic Institute, led by Christopher Carothers, director of the institute’s Center for Computational Innovations, described for The Platform how True North is finding a new life as a lightweight snap-in on each node. The snap-in can take in sensor data from the many failure-prone components inside, say, a 50,000 dense-node supercomputer (like the one coming online in 2018 at Argonne National Lab) and alert administrators (and the scheduler) of potential failures. This can minimize downtime and, more importantly, allow the scheduler to route around the likely failures, shutting down only part of a system rather than an entire rack.
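The failure-aware scheduling idea above can be illustrated with a minimal sketch: a per-node health monitor flags nodes whose telemetry predicts trouble, and the scheduler then routes new work around only those nodes. This is not IBM's or Argonne's implementation; the sensor channels, thresholds, and data structures are all assumptions for illustration.

```python
from dataclasses import dataclass

@dataclass
class Node:
    node_id: int
    temperature_c: float      # hypothetical telemetry channel
    ecc_errors_per_hour: int  # hypothetical telemetry channel
    flagged: bool = False

def monitor(nodes: list[Node],
            max_temp_c: float = 85.0,
            max_ecc_rate: int = 100) -> list[int]:
    """Flag nodes whose telemetry crosses assumed failure-risk thresholds.

    Returns the IDs of newly flagged nodes so administrators
    (and the scheduler) can be alerted.
    """
    alerts = []
    for node in nodes:
        at_risk = (node.temperature_c > max_temp_c
                   or node.ecc_errors_per_hour > max_ecc_rate)
        if at_risk and not node.flagged:
            node.flagged = True
            alerts.append(node.node_id)
    return alerts

def schedulable(nodes: list[Node]) -> list[Node]:
    """Scheduler's view: only un-flagged nodes accept new jobs."""
    return [n for n in nodes if not n.flagged]
```

The key design point is that flagging is per-node, so the scheduler sidelines only the suspect hardware instead of draining an entire rack.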

Boleslaw Szymanski, a computer scientist at the Rensselaer Polytechnic Institute in Troy, New York, said that the team's findings could provide some general guidance for how companies could better manage their brands.

A team of researchers, led by Rensselaer Polytechnic Institute professor Yuri Lvov, has found an elegant explanation for the long-standing Fermi-Pasta-Ulam (FPU) problem, first proposed in 1953, investigated with one of the world's first digital computers, and now considered the foundation of experimental mathematics.