Editor’s Note: This post, from Chicagoan John Tolva, was also posted on his personal blog. He is currently the driving force behind the forthcoming CityForward.org project.

Recently I was asked by WBEZ, the Chicago NPR affiliate, to write an essay on a topic or trend from 2009 that I would like to see carried forward “from here on out”.

What I wrote was a condensation of a year of conferences and talks informed by IBM’s Smarter Cities perspective — all with a Chicago bent. It was an interesting and ultimately enjoyable exercise, whittling down a tough subject into something to be read aloud. I’m grateful to NPR for the opportunity and their collaborative editing.

Here’s the link to the transcript and audio on NPR. The actual broadcast, I’m told, will be during All Things Considered on 1/1/2010. Pretty sure the broadcast is Chicago-only.

Here is the original essay, which gives a little more context to my screed.

This past year offered Chicagoans some unique opportunities to consider our collective identity as a city. We looked forward, dreaming of how we might remake the urban space to host the world and its Olympians in 2016. We looked backward, celebrating Burnham’s 100-year-old vision for what the city might become and, perhaps more interestingly, what it never did become. These two events both asked Chicagoans to imagine a city that did not exist, to grapple with a series of what-ifs about the built environment.

And yet, there’s another city — equally intangible — being built even as we move on from the Olympic decision and unrealized bold plans. It is a literal second city, built right atop our architecture of buildings, streets, and sewers. This is the city of data — every bit as complex and vital as our physical infrastructure, but as seemingly unreal (and unrealized) as the what-might-have-beens of Burnham’s Plan and Chicago 2016.

But what is a city of data and why should Chicago care about being one?

IT research firm Gartner notes that by the end of 2012, 20% of the (non-video) data on the Internet will originate not from humans but from sensors in the environment. If your eyes just glazed over, let’s look at this from a different angle: if Gartner is right, for every four text messages that a pedestrian sends, the sidewalk she is walking on while doing so is also sending an equivalent amount of data. The city itself is becoming part of the Internet.

This is happening already. The city is increasingly instrumented; nearly everything today can be monitored, measured, and sensed. There are billions of processors embedded in everything from structural girders to running shoes. Millions of radio frequency identification tags turn inanimate objects into addressable resources. The city is immersed in an environment of data continuously built and rebuilt from the lived experiences of its occupants. And yet, this information architecture is hardly planned, much less dreamt about or celebrated.

Consider the intersection of Michigan Avenue and Congress Parkway, what Burnham envisioned as a grand pedestrian-friendly concourse leading westward towards a towering civic center and eastward to the lakefront. This was never built, of course. (The Circle Interchange is our civic center, alas.) And yet there’s another built world, equally intangible, an infrastructure of data, overlaid on this intersection.

Three students surf the web thanks to an open Wi-Fi cloud that leaks out of a local hotel lobby.

Several GPS units in cars all update with detail about the intersection as they approach.

Sensors embedded in the water main below the street register a blockage.

Closed-circuit cameras in three different shops capture the same window shopper as he moves down Michigan towards Randolph.

An exhausted cyclist’s bike computer uploads his location and energy expenditure as he stops to use his iPhone to log into a Zipcar waiting to take him home.

The city 311 database is populated with 7 different service requests from the surrounding area, coming from phone and e-mail.

Taxis criss-cross the intersection as their fare data trails are logged locally and broadcast to dispatch.

Four different people tweet from different perspectives on the same news crawl that moves across a building’s frontage.

A bus stops to pick up passengers and bathes them in the glow of the full-color video screen running along its side.

RFID chips on pallets loaded into building docks beneath the street respond to transducers in the receivers’ doorways.

And on and on. The examples are commonplace, but together they form an infrastructure — or superstructure — a second set of interactions, invisible or barely visible, atop the interactions that we plan for and currently build for. Proprietary, public, local, remote — all manner of data continuously permeates the streetscape. And yet we scarcely think of how it plays a part in the city that we’re building, the city that we want to become.

We don’t dwell on physical city infrastructure much either — unless we’re momentarily captivated by an architectural facade or, more commonly, inconvenienced by some lapse in the expected service. And yet. We’re the city that defines architectural styles for the world, that elevates an urban planner to local celebrity, that engages in a heated debate about the merits of remaking ourselves for the Olympics. From here on out why should we not apply such passion to the next wave of digital infrastructure? It is a decision not to be made lightly or as a thought exercise: how we design our city of information is as vital to quality of life as streets, schools, and jobs.

Dan Hill, a leading urban designer in matters digital, notes that we often think of the information landscape like street furniture and road signs, as adornment or a supplement to the physical environment. But fissures in a city’s data infrastructure are as consequential as potholes. They are structural failings of a city at the most basic level, in a way that a busted piece of street art would never be.

Think of cell phone outages — “dark zones” — as potholes in the urban information landscape. Or consider GPS brownouts, such as those that cause errors in bus tracking when CTA buses enter the satellite-blocking skyscraper canyon of the Loop. But these examples are minor compared to the real issue before us: how do we proactively build a city of information that is inclusionary, robust but flexible, and reflective of a city’s unique character?

Our built structures — physical and digital — are manifestations of the patterns of human life in a city. They encode our desires, our needs, and our hopes. In some cases the permanence of the built environment inhibits or works at cross-purposes to these goals. (Think of expressways as barriers to the way people move about neighborhoods.)

We have a unique opportunity to ensure that our digital infrastructure avoids the mistakes of our physical infrastructure, to make Chicago the envy not just of building architects but of information architects.

I suggest two ways to start. First, to engage in a dialogue about this new built environment — such as we did collectively this summer — our city planners and citizenry need to be at least as conversant with the language of information architecture as we are, at a basic level, with that of physical architecture. Call it an aesthetics of data. This is as much a matter of becoming aware of what’s happening around us as of figuring out the most elegant ways of making the unseen felt, of thinking about our urban spaces the way I described the interactions at Michigan and Congress.

Second, we need to recognize that, while the power of information is the power to connect, every linkage made represents a connection not made or, at worst, a disconnection. (Think again of the unintended effects of expressways on neighborhood mobility.) Our plan for a networked urbanism should seek above all to be maximally enfranchising, lowering barriers to commerce and community.

We must take up this mantle and be active participants in the design of this networked urbanism. We must make our voice heard. From educating our elected representatives about the opportunities before us, to encouraging our youth — who increasingly live in a world of data — to think critically about their role in the urban fabric, we must embrace this challenge with the same passion embodied in our historical tradition of remarkable plans for Chicago.

Buildings that know when they need to be fixed before something breaks; sensors that tell the fire department details of a fire before they receive the emergency phone call; smart water and sewage systems that filter and recycle water…

It’s that time of year here at IBM – when we look to the future and make five predictions of technological trends that will change the way we live in the next five years. Given the current attention to making our cities smarter, for this year’s list we have focused on five innovations that will change our cities in the next five years.

Importantly, the list is intended to serve as a starting point for discussion – and debate – about the prospects for our cities and how progress can be made.

If there’s one common thread in all of the advances we see in the coming years, it’s the ability to monitor our environment with sensors and the application of analytics – complex algorithms baked into software – to make decisions based on all of that data. In reality, it’s what we’ve been talking about for the past year here on this blog, but we are just now beginning to see these efforts implemented at the city level to really change how cities work.

Analytics will predict the patterns of how diseases will spread, enable buildings to evaluate the relationships between their systems and provide real-time information to management, enable city smart grids to draw on clean energy during peak and off-peak hours, find water leaks and more efficient ways to move water, and predict emergencies before they happen to limit their impact.

While these are predictions for the future, in each case the innovation is rooted in work we are just beginning to see pop up with some of our city clients or in our labs today. We’ll spend some time over the next few weeks to go deeper into each one of these topics, sharing what’s happening now and exploring opportunities for the future.

But in the meantime, and without further ado, below is this year’s “Next 5 in 5”:

Cities will have healthier immune systems

Given their population density, cities will remain hotbeds of communicable diseases. But in the future, public health officials will know precisely when, where and how diseases are spreading – even which neighborhoods will be affected next. Scientists will give city officials, hospitals, schools and workplaces the tools to better detect, track, prepare for and prevent infections, such as the H1N1 virus or seasonal influenza. We will see a “health Internet” emerge, where anonymous medical information, contained in electronic health records, will be securely shared to curtail the spread of disease and keep people healthier.

City buildings will sense and respond like living organisms

As people move into city buildings at record rates, buildings will be built smartly. Today, many of the systems that constitute a building – heat, water, sewage, electricity, etc. – are managed independently. In the future, the technology that manages facilities will operate like a living organism that can sense and respond quickly, in order to protect citizens, save resources and reduce carbon emissions. Thousands of sensors inside buildings will monitor everything from motion and temperature to humidity, occupancy and light. The building won’t just coexist with nature – it will harness it. This system will enable repairs before something breaks, emergency units to respond quickly with the necessary resources, and consumers and business owners to monitor their energy consumption and carbon emission in real-time and take action to reduce them. Some buildings are already showing signs of intelligence by reducing energy use, improving operational efficiency, and improving comfort and safety for occupants.
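To make the “repairs before something breaks” idea concrete, here is a minimal, hypothetical sketch of the kind of rule a building-management system might run against a sensor feed: flag any reading that drifts well outside its recent baseline. The window size, tolerance, and readings below are illustrative assumptions, not a description of any real system.

```python
from collections import deque

def make_drift_detector(window=12, tolerance=0.15):
    """Flag a reading that deviates from the rolling mean of the last
    `window` readings by more than `tolerance` (as a fraction).
    Both parameters are illustrative, not tuned values."""
    history = deque(maxlen=window)

    def check(reading):
        alarm = False
        if len(history) == window:
            baseline = sum(history) / window
            alarm = abs(reading - baseline) > tolerance * baseline
        history.append(reading)
        return alarm  # True means: schedule an inspection

    return check

# Simulated hourly pump-temperature readings (°C); the spike at the end
# is the kind of early warning that would trigger preventive maintenance.
check = make_drift_detector()
readings = [40.1, 39.8, 40.3, 40.0, 39.9, 40.2, 40.1,
            40.0, 39.7, 40.2, 40.1, 39.9, 48.5]
alerts = [r for r in readings if check(r)]
```

In practice such rules would sit alongside far richer statistical models, but even this simple baseline check captures the shift from reacting to failures to anticipating them.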

Cars and city buses will run on empty

For the first time, the “E” on gas gauges will mean “enough.” Increasingly, cars and city buses no longer will rely on fossil fuels. Vehicles will begin to run on new battery technology that won’t need to be recharged for days or months at a time, depending on how often you drive. IBM scientists and partners are working to design new batteries that will make it possible for electric vehicles to travel 300 to 500 miles on a single charge, up from 50 to 100 miles currently. Also, smart grids in cities could enable cars to be charged in public places and use renewable energy, such as wind power, for charging so they no longer rely on coal-powered plants. This will lower emissions as well as minimize noise pollution. (See the Battery 500 and Bornholm electric vehicle posts for hints of what is to come.)

Smarter systems will quench cities’ thirst for water and save energy

Today, one in five people lack access to safe drinking water, and municipalities lose an alarming amount of precious water — up to 50 percent through leaky infrastructure. On top of that, human demand for water is expected to increase sixfold in the next 50 years. To deal with this challenge, cities will install smarter water systems to reduce water waste by up to 50 percent. Cities also will install smart sewer systems that not only prevent run-off pollution in rivers and lakes, but purify water to make it drinkable. Advanced water purification technologies will help cities recycle and reuse water locally, reducing energy used to transport water by up to 20 percent. Interactive meters and sensors will be integrated into water and energy systems, providing you with real time, accurate information about your water consumption so you will be able to make better decisions about how and when you use this valuable resource.
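The leak-detection idea above is commonly approached through district metering: compare the volume pumped into a neighborhood with the volume its meters actually record, and treat a persistent gap as probable leakage. The function and figures below are a hypothetical sketch of that balance check, not any real utility’s system.

```python
def leakage_rate(supplied_liters, metered_liters):
    """Fraction of supplied water unaccounted for in a metering district.
    A sustained high rate is a classic signature of leaky infrastructure."""
    if supplied_liters <= 0:
        raise ValueError("supplied volume must be positive")
    return max(0.0, (supplied_liters - metered_liters) / supplied_liters)

# Hypothetical daily totals for one district metered area:
rate = leakage_rate(supplied_liters=1_000_000, metered_liters=620_000)
# A district losing over a third of its supply would be near the worst
# cases cited above, and a priority for sensor-guided pipe inspection.
```

Real systems refine this with continuous sensor readings and minimum night-flow analysis, but the underlying accounting is this simple balance.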

Cities will respond to a crisis — even before receiving an emergency phone call

Cities will be able to reduce and even prevent emergencies, such as crime and disasters. Law enforcement agencies will turn to mathematics and analytics to analyze the right information at the right time, so that public servants can take proactive measures to head off crime. Fire departments will begin using software to potentially prevent fires from happening in the first place. Even today, scientists are beginning to look at past fires, smoke patterns and climate fluctuations to develop models that predict wildfires, in order to prevent fires and speed public evacuations when they happen.

The future of the local social web lies at the confluence of two emerging realities: Government 2.0 and Media 2.0. Here we see social networking tools, user-centered design, wikis, blogs, and mashups being used to create novel networks and platforms that enable a new civic reality: Community 2.0.

According to technologist Tim O’Reilly, Government 2.0 has several defining characteristics. Foremost is the concept of government as a platform, rather than a service, that enables the development of an ecosystem of self-governance, transparency and accountability. Whereas Government 1.0 represents power in the hands of the few, Government 2.0 represents power in the hands of the many. Citizens become participants, rather than observers, in their local government, actively engaging with civic leaders, government officials, and each other to proactively define and solve the most pressing local issues.

The social web is also leading to new models in the media industry. According to Jeff Jarvis, the hierarchical, siloed world of mainstream media is being replaced by a new news ecosystem that is ever-dependent on a network of voices and links. There is no longer one centralized, autocratic, top-down news source, but a series of linked enterprises—both large and small—that work together to report on local issues. These professional and amateur journalists do what they do best and link to the rest, creating a richer and more vibrant community of voices at the local level.

The new news ecosystem is built on an interlocking system of platforms. Imagine if you combined hyper-local versions of WikiLeaks and data.gov, with SeeClickFix, Document Cloud, Meetup, citizen journalists, and local news sites. You would end up with a new news ecosystem that is more transparent and efficient. Using these tools, news organizations with smaller editorial teams would still be able to perform the critical watchdog function that is a necessary component of a well-functioning democracy.

In Community 2.0, the relationship between government, the press, and the citizenry evolves into something that is much more transparent, engaging, and active. It is about redefining the idea of the Fourth Estate. Members of the former audience are now producers, not just consumers. Members of the former constituency are now actors, not just voters. Community 2.0 creates a living, evolving ecosystem of citizens.

Community 2.0 is open, local and vocal. At the heart of Community 2.0 are platforms like SeeClickFix, which make the traditional modes of communicating with media and the government more iterative, process-oriented and real-time. Using these tools, voices are aggregated, issues are documented, groups are organized, actions are broadcast, and citizens are more engaged in their environments. The output is a curated, crowdsourced live stream of information detailing the most topical and timely issues in a community.

Earlier this fall, The Knight Commission on the Information Needs of Communities in Democracy released its report detailing the ways that technology can be harnessed to help local communities help themselves. Using citizen-centered design, SeeClickFix has created a multi-platform social tool that allows citizens not just to complain, but to act. It enables Community 2.0, where citizens can communicate with one another to solve the most pressing issues in their cities or towns—from graffiti to crime to potholes to transit issues. It allows citizens to self-organize, crowdsource solutions to problems, reach out to relevant public officials, and work collectively to improve the quality of life in their cities and towns. The power of the tool is not limited to the few, but open to the many.

The local social web embodied in Community 2.0 is more inclusive. Web 2.0 technologies like SeeClickFix can atomize activities to reduce a primary barrier to civic engagement—time. Unlike a town hall meeting, which requires a significant time commitment, systems like SeeClickFix account and provide for different levels of engagement and make the act of contributing to the community simple. As these constraints on participation are relaxed, a wider cross-section of the community begins to contribute to the collective effort. In a networked information economy where citizens are linked together, the tragedy of the commons is replaced with the promise of the collective.

On SeeClickFix we have seen utility companies, clean-air non-profits, police chiefs, public works officials, state transit officials, business improvement districts, city council members and citizens all working together and communicating on issues to resolve them. Since no one individual typically owns the problem and no one agency typically can solve it, SeeClickFix enables these types of public/private partnerships.

Something interesting occurs within a community when people begin to experience the power of the SeeClickFix platform. As with any networked tool, the utility of the tool increases exponentially as the number of users increases. Residents start by using SeeClickFix to do simple tasks, like reporting potholes and traffic issues, or noting that they would like an issue fixed. They are alerted via email when people join the conversation which allows them to see that others in the community share their concerns. These alerts draw users back into the conversation and help to create a sense of community. The next step in the evolution from resident to member of Community 2.0 is when people use the tool to self-organize to find solutions to the problem, moving the community from complaint to action.
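The claim that a networked tool’s utility grows dramatically with its user base can be illustrated with a common back-of-the-envelope model, often attributed to Metcalfe: count the distinct user-to-user connections a network makes possible. The figures below are purely illustrative.

```python
def potential_connections(users: int) -> int:
    """Distinct user-to-user links possible among `users` members:
    each of the `users * (users - 1)` ordered pairs, counted once."""
    return users * (users - 1) // 2

# Growing the user base 10x grows the possible connections ~100x,
# which is the intuition behind "utility increases as users increase":
growth = {n: potential_connections(n) for n in (10, 100, 1000)}
```

Whether real civic networks track this curve is debated, but the superlinear shape is why each new SeeClickFix user makes the platform more valuable for everyone already on it.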

The impact of SeeClickFix is magnified through the positive feedback loop that is created via interaction between media, citizens and the government. In New Haven, issues are being uploaded on SeeClickFix. The New Haven Independent and The New Haven Register are then using the tool to source stories. Links to the issue on SeeClickFix are embedded in these stories or in the reader comments, directing others in the community to contribute to the conversation. Links to the stories are embedded in the issue report on SeeClickFix and used as rallying cries to gather support. Emails are sent to government officials, citizens, non-profits and other interested actors detailing the issue and alerting them when someone else in the community has joined the conversation. Government officials can post on the issues and thus communicate with many citizens at once. It is these multiple layers of feedback that encourage repeated engagement.

Recently, SeeClickFix was used to report increased instances of muggings occurring in a specific block in New Haven. The New Haven Independent used SeeClickFix to source leads for a story. The story linked to SeeClickFix, where the problem had first been acknowledged. Immediately, hundreds of citizens voted to have the problem “fixed.” Neighbors used the platform to organize a neighborhood watch, suggest solutions such as better lighting and an increased police presence, and arrange a meeting with the Mayor. Officials from City Council and City Hall responded to citizens’ concerns via SeeClickFix and followed up these online conversations with offline meetings to work to resolve the issue. These meetings have led to promises of increased lighting from the City and the arrest of several mugging suspects. More importantly, the neighborhood is now better connected and, therefore, better prepared to self-organize to solve issues in the future. SeeClickFix has enabled the creation of Community 2.0.

This is just one example of the way that the tool is being used in one city. Now imagine a world where we have data from every city and town across the country. Through geographically specific collaborative democracy projects, we begin to see patterns of engagement, self-organization, deliberation, and problem resolution. We can evaluate what works and what doesn’t and then share this information across the platform. Experts in one city who have discovered solutions to issues that are broadly applicable to other cities can easily disseminate this information. Web 2.0 allows for the creation of a new collective commons, where hyper-local, hyper-specific issues are hashed out in an open and transparent manner and then dispersed across the network.

In silico modeling creates images of liver inflammation and cancer that are similar to what might be seen under the microscope.

At the University of Pittsburgh’s McGowan Institute for Regenerative Medicine, researchers are using IBM technology to open up new dimensions in biological modeling. With the help of an IBM Shared University Research (SUR) Award and an IBM supercomputer, Pitt is pursuing leading-edge in silico biological research, which uses computer simulations to explore biological pathways and to test therapeutic interventions, tissue engineering, cell therapies, and artificial organs and biodevices. The results of such research could significantly reduce the cost of new drug development and shorten the treatment evaluation process – getting treatments to market faster and cheaper. These modeling techniques are similar to those used to generate the fantasy creatures of other worlds in movies such as “Lord of the Rings” and the “Star Trek” series. So instead of creating an imaginary character to fill out a battle scene, Pitt scientists are applying computational techniques to simulate, for example, inflamed liver cells morphing into cancer. That allows them to see not only how tumors develop, but how drugs or other interventions could affect disease progression.

For example, the Pitt research team has simulated liver tissue to study how a chronic hepatitis infection can lead to liver cancer, lung tissues to study viral infection and chronic obstructive pulmonary disease and skin to study how patients with spinal cord injuries develop pressure ulcers. This type of advanced modeling can help researchers better understand basic biological processes and allows them to screen drugs and determine their impact on the body to uncover the best interventions for a broad range of diseases.

Dana asks an interesting question: are IBM’s “smarter traffic” ideas an homage to Moses or Jacobs?

He immediately answers his own question with the statement:

“Looking at IBM’s highly-advertised ideas on smarter traffic, much of it is built on the idea Robert Moses called flow.”

To answer simply, IBM’s work in this area is not “built” on Moses’ notions. Our work on helping cities build smarter transportation systems is built on three simple observations about how society and technology are changing. Readers of this blog will know them as instrumentation, interconnection and intelligence. The people and systems of the world are becoming increasingly able to be aware of many more things, in many more places, much faster, and to analyse and derive timely insight and action from that awareness. That’s it.

We use these observations to build things to help achieve people’s goals. If those goals are Moses-like, or Jacobs-like, that’s fine. The community that sets them is the judge of what they are or should be.

That isn’t to say we don’t have a strong point of view, though, because we do. Our approach is about leveraging technology for the sake of a city’s citizens. We don’t advocate tolls. Or not advocate them. We don’t advocate city centre cameras. Or not advocate them. We do advocate, say, holistic multi-modal solutions for transport, leveraging technology where and when applicable (this is where my son says, Dad, you sound like Dilbert). We want progress through making the overall systems smarter. And that approach has many facets, completely dependent on the community, city or nation in question.

In the end, Dana answers his own question with exactly the right answer:

“The smartest city will find ways to support both. People and goods have to get around. But they also need destinations. Getting off the freeway and into the crowd is the challenge.”

I believe – and many of the people I work with believe – that Moses vs. Jacobs is a false dichotomy. It isn’t about one or the other. The world has moved beyond that – our problems are too complex and interconnected.

Following is a guest post from W. “RP” Raghupathi, a professor at the Schools of Business at Fordham University:

This week in New York City, Fordham University and IBM gathered leaders from academia, government, health care and the venture capital community to discuss a coming wave of new jobs that will demand candidates possess strong, analytics-related skills.

Most current job opportunities that focus on using analytic skills to gain insight into business data are simple and reactive – requiring the development of basic reports on information residing in databases and data warehouses. However, with the innovative analytic technologies now available, businesses have a great new opportunity to gain deeper insight into and understanding of the relationships and patterns within the data, allowing them to make better business decisions proactively, in real time.

This is especially true in today’s world of sustainability, in which businesses, non-profit organizations and government are driving new initiatives such as smart grid and intelligent transportation. These new opportunities are creating jobs where analytics can play a significant role. For example, smart grids require the analysis of data from different locations, as well as from within the grid, so analytics can help gain insight into where power is needed or where power might be in surplus. Likewise in transportation, analytics plays a role in building intelligence into roads and toll systems to alleviate traffic and congestion. In the healthcare industry, the trend towards personalized healthcare means that the more specific information a physician has about a patient, the higher the quality of care he or she can provide on an individualized basis. Analytics also helps in the remote monitoring of patients, where all data can be analyzed at a centralized location, enabling, for example, senior citizens’ vital signs to be monitored remotely.

Fordham’s curriculum and information technology, combined with IBM’s technology and analytics expertise made available to us through the academic initiative, have helped us not only teach the skills, but also work with deep business analytics technologies and apply them to real-world scenarios. When our students graduate, they have a solid set of skills and practical experience to take up jobs in sectors such as power, healthcare and transportation, as well as non-profit agencies, where they can apply analytics to some of the world’s most challenging business and societal issues.

We are excited about this effort and look forward to seeing positive results from this collaboration.

W. “RP” Raghupathi is Professor of Information Systems at the Schools of Business at Fordham University.

The new Business Analytics & Optimization (BAO) Study (launching today, Dec. 9th) is entitled Breakaway because the report found that organizations that were making smarter decisions and applying predictive analytics were starting to put distance between themselves and competitors.

It also found that such breakaway businesses were succeeding, even in these challenging times, by taking intelligent risks and embracing new opportunities.

This “diavlog” (video blog dialog) will also include a Q&A session, with viewers sharing questions via chat or on Twitter by including the hashtag #ibmbao in the tweet.

Finally, the webcast will also kick off the Analytics Virtual Center (AVC): a 3D, voice-enabled collaboration space intended to complement the six physical analytics centers IBM has launched this year in London (just last week), Berlin, Beijing, Tokyo, New York and Washington D.C. In fact, a group of viewers will be watching the webcast live from within this new, easy-to-use virtual complex.

Information technology is not necessarily an intuitive match with environmental concerns, but recent breakthroughs by corporate giant IBM are affecting water management in positive ways around the world. The goal, said Sharon Nunes, Vice President of IBM’s Big Green Innovations, which deals with advanced water management, is to provide agencies, utilities and private industry with precise data. A fast, accurate, local weather prediction, for example, can help clients — water management-related or otherwise — prepare for sudden storms. “Information like this helps to minimize business costs and work flow is more efficient,” Ms. Nunes said.

Last year, IBM established a Centre of Excellence for Water Management in Amsterdam, where the focus is on flood management and levee systems, and in Dublin, where a project called SmartBay monitors data about tidal flow, wave heights, temperature and phytoplankton via sensors placed throughout Galway Bay. Locally, IBM joined with New York’s Beacon Institute in 2007 to launch a monitoring network in the Hudson River that records information about oxygen content, temperature and wind speed, and assesses how these affect aquatic life. IBM sends scientists to Beacon to mentor local college students and to teach the teachers. “Eventually the plan is to place sensors along the entire length of the Hudson, down to the Harbor and up to Troy,” said Ms. Nunes. (A Hudson River sensor is pictured in the photo at right.)

“The Hudson River is the pilot river system for this groundbreaking initiative, and the 12 million people who live within its watershed will be the first beneficiaries of our work,” said John Cronin, Director and Chief Executive Officer of The Beacon Institute when the project was announced in 2007. “This new way of observing, understanding and predicting how large river and estuary ecosystems work ultimately will allow us to translate that knowledge into better policy, management and education for the Hudson River and for rivers and estuaries worldwide.”

With all the press about energy costs sinking data center budgets, along comes a data center that consumes only half the normal power. And it even offers excess heating and cooling to nearby buildings, sometimes returning spare electricity to the grid. Syracuse University, teaming with IBM and NY State just opened what is likely the world’s most energy-stingy data center.

The fact that the new data center generates all of its own electricity on site is at the top of a cascade of opportunistic energy management. There goes all the heat loss of a traditional fossil-fuel-powered generation plant, along with the transmission line and transformer losses, too. Then they proceeded to throw a net over the entire thermal profile of the place and squeeze out every BTU by capturing hot exhaust from the natural gas turbines, using it to heat buildings and cool – through absorption chillers – all those churning servers. Additional green tools include water-cooled server racks and DC power distribution – they even use outside air for free cooling during the winter.

This type of systems approach to designing efficiency into an entire operation can be an important offset to the booming growth of data centers. Many of the same principles can also work in the design of any building.

Syracuse University has made this terrific story even richer by wiring the entire place with sensors, with plans to run the center as a living laboratory. The volumes of data collected will allow researchers to progressively tweak the equations of the complex until they reach… thermal nirvana.

Building a Smarter Planet is a blog intended to provide readers with thought-provoking content and a place to talk about the issues raised within the content. It is our hope that you will feel compelled to share some of the things you see, read and hear on this blog with your friends, family and peers. We feel strongly that this blog is not going to deliver final answers to the issues raised, but that it will represent a starting point for conversation around the issues.