"Conservation groups are using technology to understand and protect our planet in an entirely new way."

"The Internet of Things (IoT) was an idea that industry always loved. It was simple enough to predict: as computing and sensors become smaller and cheaper, they would be embedded into devices and products that interact with each other and their owners. Fast forward to 2017 and the IoT is in full bloom. Because of the stakes — that every device and machine in your life will be upgraded and harvested for data — companies wasted no time getting in on the action. There are smart thermostats, refrigerators, TVs, cars, and everything else you can imagine.

Industry was first, but it isn’t the only player. Now conservationists are taking the lead.

The same chips, sensors (especially cameras) and networks being used to wire up our homes and factories are being deployed by scientists (both professional and amateur) to understand our natural world. It’s an Internet of Living Things. It isn’t just a future of efficiency and convenience. It’s enabling us to ask different questions and understand our world from an entirely new perspective. And just in time. As environmental challenges — everything from coral bleaching to African elephant poaching — continue to mount, this emerging network will serve as the planetary nervous system, giving insight into precisely what actions to take.

It’s a new era of conservation based on real-time data and monitoring. It changes our ecological relationship with the planet by changing the scales at which we can measure — we get increased granularity as well as a truly macro view of the entire planet. It also allows us to simultaneously (and without bias) measure the most important part of the equation: ourselves.

Specific and Real-Time

We have had population estimates of species for decades, but things are different now. Before, the estimates came from academic fieldwork; now we’re beginning to rely on vast networks of sensors to monitor and model those same populations in real time. Take the recent example of Paul Allen’s Domain Awareness System (DAS), which covers broad swaths of West Africa. Here’s an excerpt from the Bloomberg feature:

For years, local rangers have protected wildlife with boots on the ground and sheer determination. Armed guards spend days and nights surrounding elephant herds and horned rhinos, while on the lookout for rogue trespassers.

Allen’s DAS uses technology to go the distance that humans cannot. It relies on three funnels of information: ranger radios, animal tracker tags, and a variety of environmental sensors such as camera traps and satellites. This being the product of the world’s 10th-richest software developer, it sends everything back to a centralized computer system, which projects specific threats onto a map of the monitored region, displayed on large screens in a closed circuit-like security room.

For instance, if a poacher were to break through a geofence sensor set up by a ranger in a highly-trafficked corridor, an icon of a rifle would flag the threat as well as any micro-chipped elephants and radio-carrying rangers in the vicinity.

[video]
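The geofence trigger described in the excerpt reduces to a point-in-polygon test against a ranger-drawn boundary. A minimal sketch using the standard ray-casting method (the corridor shape and coordinates below are invented for illustration):

```python
# Hypothetical sketch of a geofence check: a GPS fix is flagged when it
# falls inside a ranger-drawn polygon. Coordinates are invented.

def point_in_polygon(lon, lat, polygon):
    """Ray-casting test: count how many polygon edges a ray going
    east from the point crosses; an odd count means 'inside'."""
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        # Does this edge straddle the horizontal line through the point?
        if (y1 > lat) != (y2 > lat):
            # Longitude where the edge crosses that line
            x_cross = x1 + (lat - y1) * (x2 - x1) / (y2 - y1)
            if lon < x_cross:
                inside = not inside
    return inside

# A made-up rectangular corridor, as (lon, lat) pairs
corridor = [(36.0, -1.0), (36.5, -1.0), (36.5, -1.5), (36.0, -1.5)]

print(point_in_polygon(36.25, -1.25, corridor))  # fix inside the fence
print(point_in_polygon(37.00, -1.25, corridor))  # fix well outside
```

A real system layers alerting, tracker fusion and map display on top, but the breach detection itself can be this small.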

These networks are being woven together in ecosystems all over the planet. Old cellphones are being turned into rainforest monitoring devices. Drones are surveying and assessing the health of koala populations in Australia. The conservation website Mongabay now has a section of its site dedicated to the fast-moving field, which it has dubbed WildTech. Professionals and amateurs are gathering in person at events like Make for the Planet and in online communities like Wildlabs.net. It’s game on.

The trend is building momentum because the early results have been so good, especially in terms of resolution. The organization WildMe is using a combination of citizen science (essentially human-powered environmental sensors) and artificial intelligence to identify and monitor individuals in wild populations. As in, meet Struddle the manta ray, number 1264_B201. He’s been sighted ten times over the course of ten years, mostly around the Maldives.

[image]
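Identifying individuals like Struddle is commonly framed as nearest-neighbor search over image embeddings: a new sighting is matched against a catalog of known animals. A toy sketch with invented vectors (a real pipeline such as WildMe's derives the embeddings from spot and fin patterns with computer vision models):

```python
# Toy sketch of individual re-identification as nearest-neighbor
# search over image embeddings. The vectors and the second catalog ID
# are invented placeholders.
import math

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

catalog = {                      # known individuals -> stored embedding
    "1264_B201": [0.9, 0.1, 0.3],
    "0881_A417": [0.2, 0.8, 0.5],
}

def identify(sighting, threshold=0.9):
    """Return the best catalog match, or flag a new individual."""
    best_id, best_sim = max(
        ((ind, cosine(sighting, emb)) for ind, emb in catalog.items()),
        key=lambda t: t[1],
    )
    return best_id if best_sim >= threshold else "new individual"

print(identify([0.88, 0.12, 0.31]))   # close to the first entry
print(identify([0.0, 1.0, 0.0]))      # matches nothing well enough
```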

The combination of precision and pervasiveness means these are more than just passive data-collecting systems. They’re beyond academic; they’re actionable. We can estimate more accurately — there are an estimated 352,271 elephants remaining in Africa — but we’re also reacting when something happens — a poacher broke a geofence 10 minutes ago.

The Big Picture

It’s not just finer detail, either. We’re also getting a better big picture than we’ve ever had before. We’re watching on a planetary scale.

Of course, advances in satellites are helping. Planet (the company) has been a major driving force. Over the past few years it has launched hundreds of small imaging satellites, creating an earth-imaging constellation with the ambition of capturing an image of every location on earth, every day. Like Google Earth, but near-real-time and searchable along the time axis. In one example of this in action, Planet caught an illegal gold mining operation in the act in the Peruvian Amazon rainforest.

[image]

It’s not just satellites, it’s connectivity more broadly. Traditionally analog wildlife monitoring is going online. Ornithology gives us a good example of this. For the past century, the study of birds has relied on amateur networks of enthusiasts — the birders — to contribute data on migration and occurrence. (For research that spans long time spans or broad geographic areas, citizen science is often the most effective method.) Now, thanks to the ubiquity of mobile phones, birding is digitized and centralized on platforms like eBird and iNaturalist. You can watch the real-time submissions and observations:

[image]

Sped up, we get the visual of species-specific migrations over the course of a year:

[animated GIF]
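Those submissions are queryable, too. A minimal sketch of pulling recent observations, following the shape of eBird's public API 2.0 (a free key is required, sent in the `X-eBirdApiToken` header; the region code and key below are placeholders):

```python
# Sketch of requesting recent observations from eBird's API 2.0.
# The region code and token are placeholders, not working credentials.
import json
import urllib.request

API_BASE = "https://api.ebird.org/v2"

def recent_observations_request(region_code, api_token, back_days=7):
    """Build a request for recent observations in a region."""
    url = f"{API_BASE}/data/obs/{region_code}/recent?back={back_days}"
    return urllib.request.Request(
        url, headers={"X-eBirdApiToken": api_token}
    )

req = recent_observations_request("US-NY", "YOUR_KEY_HERE")
print(req.full_url)

# Uncomment to actually fetch (needs a valid key and network access):
# with urllib.request.urlopen(req) as resp:
#     for obs in json.load(resp)[:5]:
#         print(obs["comName"], obs["locName"], obs["obsDt"])
```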

Human Activity

The network we’re building isn’t all glass, plastic and silicon. It’s people, too. In the case of the birders above, the human component is critical. They’re doing the legwork, getting into the field and pointing the cameras. They’re both the brawn and the (collective) brain of the operation.

Keeping humans in the loop has its benefits. It allows these networks to scale faster. Birders with smartphones and eBird can happen now, whereas a network of passive forest listening devices would take years to build (and would be much more expensive to maintain). It also makes these systems more adept at managing ethical and privacy concerns — people are involved in the decision making at all times. But the biggest benefit of keeping people in the loop is that we can watch them — the humans — too. Because as much as we’re learning about species and ecosystems, we also need to understand how we ourselves are affected by engaging with and perceiving the natural world.

We’re getting more precise measurements of species and ecosystems (a better small picture), as well as a better idea of how they’re all linked together (a better big picture). But we’re also getting an accurate sense of ourselves and our impact on and within these systems (a better whole picture).

We’re still at the beginning of measuring the human-nature boundary, but the early results suggest it will help the conservation agenda. A sub-genre of neuroscience called neurobiophilia has emerged to study the effects of nature on our brain function. (Hint: it’s great for your health and well-being.) National Geographic is sending some of their explorers into the field wired up with Fitbits and EEG machines. The emerging academic field of citizen science seems to be as concerned with the effects of participation as it is with outcomes. So far, the science is indicating that engagement in the data collecting process has measurable effects on a community’s ability to manage different issues. The lesson here: not only is nature good for us, but we can evolve towards a healthier perspective. In a world approaching 9 billion people, this collective self-awareness will be critical.

What’s next

Just as fast as we’re building this network, we’re learning what it’s actually capable of doing. As we’re still laying out the foundation, the network is starting to come alive. The next chapter is applying machine learning to help make sense of the mountains of data that these systems are producing. Want to quickly survey the dispersion of arctic ponds? Here. Want to count and classify the number of fish you’re seeing with your underwater drone? We’re building that. In a broad sense, we’re “closing the loop” as Chris Anderson explained in an Edge.org interview:

If we could measure the world, how would we manage it differently? This is a question we’ve been asking ourselves in the digital realm since the birth of the Internet. Our digital lives — clicks, histories, and cookies — can now be measured beautifully. The feedback loop is complete; it’s called closing the loop. As you know, we can only manage what we can measure. We’re now measuring on-screen activity beautifully, but most of the world is not on screens.

As we get better and better at measuring the world — wearables, Internet of Things, cars, satellites, drones, sensors — we are going to be able to close the loop in industry, agriculture, and the environment. We’re going to start to find out what the consequences of our actions are and, presumably, we’ll take smarter actions as a result. This journey with the Internet that we started more than twenty years ago is now extending to the physical world. Every industry is going to have to ask the same questions: What do we want to measure? What do we do with that data? How can we manage things differently once we have that data? This notion of closing the loop everywhere is perhaps the biggest endeavor of … [more]"

"Augmented Ecology is a research platform that tracks developments in an emerging branch of the anthropocene; the intertwining of data and media systems with ecosystems.

[image: “Heat-map for yearly migratory pattern of the Black-throated Gray Warbler on eBird”]

Mapping, visualization and tracking technologies contribute to a more detailed picture of the living and geological landscapes. They help to model, to explore, to research, to protect, to admire, exploit or conserve the natural world by extending our view. By satellite, drone, radio-tag, browser and smartphone, hidden patterns and behaviours are discovered, networks of meaning are formed and participatory science undertaken. These tools are extending our human senses, making visible the daily life of a whale not unlike the way early telescopes made the features of Saturn visible.

[image “Mapping efforts by Google Trek”]

Through epizoic media, drone ecology and satellite sensors living systems seem to be emerging as a subset of the internet of things (IoT). Perhaps this subset could be called an Internet of Organisms (IoO), at any rate it makes for a splendid looking acronym…

The augmentation of natural systems raises some new questions: What changes does the increasing level of media resolution bring to our relationship with the great outdoors and wildlife? What kinds of opportunities do they offer for interaction, research, citizen science or tourism? What is their impact on the political value of the wilderness, both as a global commons and as a refuge away from human society, government and corporate power?

The aim of this research is to highlight how technologies such as remote sensing, tagging, mapping and UAVs develop a next chapter in our ongoing history of exploration, domestication, exploitation of, and fascination for the dynamic systems we are part of.

The wired wilderness is becoming populated by data-harvesting animals, camera-traps, conservation drones, Google Trek adventurers, cyberpoachers and many other forms of machine wilderness. Perhaps Augmented Ecology can be a fieldguide to browse this weird neck of the woods? Surely these developments are worth our deliberate attention - Theun Karelse

This research was triggered by the development of an opensource smartphone application called Boskoi, for exploring and mapping the edible landscape, undertaken at FoAM. As one of the first participatory apps focussed on nature, it flushed out many issues. The issues surrounding Red List species were particularly thought-provoking and resulted in a session in FoAM’s program at the Pixelache festival in 2011 asking: ‘Is there still a private life for plants?’ (an adaptation of the title of the BBC natural history series)"

"In the years since the Mars Exploration Rovers Spirit and Opportunity first began transmitting images from the surface of Mars, we have become familiar with the harsh, rocky, rusty-red Martian landscape. But those images are much less straightforward than they may seem to a layperson: each one is the result of a complicated set of decisions and processes involving the large team behind the Rovers.

With Seeing Like a Rover, Janet Vertesi takes us behind the scenes to reveal the work that goes into creating our knowledge of Mars. Every photograph that the Rovers take, she shows, must be processed, manipulated, and interpreted—and all that comes after team members negotiate with each other about what they should even be taking photographs of in the first place. Vertesi’s account of the inspiringly successful Rover project reveals science in action, a world where digital processing uncovers scientific truths, where images are used to craft consensus, and where team members develop an uncanny intimacy with the sensory apparatus of a robot that is millions of miles away. Ultimately, Vertesi shows, every image taken by the Mars Rovers is not merely a picture of Mars—it’s a portrait of the whole Rover team, as well."

"This is an extraordinary exploitation of a mobile phone by the folks who brought the very slick Morpholioapps suite of creative apps for the iPad. While watching, I remembered very recently I caught myself looking at my Macbook Air, not even a year old yet, and thinking, “How quaint!” It is the best laptop I’ve ever used, arguably the best laptop, full stop, but, it suddenly occurred to me, it is still the legacy of the typewriter. The one thing that makes my Air great is the web.

But, the thing that makes the web great is a mobile device.

I understand that for most of our schools–all that I know of, in fact–a laptop program is still the first step. We’re just not ready yet to let go of this old technology. But even as we are building our laptop programs, we need to be having a very serious discussion about how we will implement our mobile programs, or we are going to be caught flat-footed, again. The world is going mobile:

Indeed, a laptop program doesn’t ask us to really change our pedagogy. The same one we’ve been using for 200 years works pretty well on the device, so rolling out even a 1:1 program is comparatively easy. But mobile-based teaching/learning both enables and requires a significant change in pedagogy and methodology.

I get asked all the time, “Laptop, tablet or smartphone: if you could have just one for your students, which would it be?” The answer is, without hesitation or qualification, a smartphone. My second choice would be a tablet, like the iPad. My last option would be a laptop. You just get way more leverage from a smartphone (a topic for another post.) It will be mobile technologies that we will later call the catalyst for the educational renaissance."

"Smartphones made it easy to research facts, capture images, and navigate street maps, but they haven’t brought us closer to the physical environment in which we live – until now.

Meet SCiO. It is the world's first affordable molecular sensor that fits in the palm of your hand. SCiO is a tiny spectrometer and allows you to get instant relevant information about the chemical make-up of just about anything around you, sent directly to your smartphone.

Out of the box, when you get your SCiO, you’ll be able to analyze food, plants, and medications.

For example, you can:

• Get nutritional facts about different kinds of food: salad dressings, sauces, fruits, cheeses, and much more.
• See how ripe an avocado is, through the peel!
• Find out the quality of your cooking oil.
• Know the well being of your plants.
• Analyze soil or hydroponic solutions.
• Authenticate medications or supplements.
• Upload and tag the spectrum of any material on Earth to our database. Even yourself!

These are just a few of the starter applications that you can use upon receiving your SCiO. After SCiO is released, new applications will be developed and released regularly. If you order SCiO from Kickstarter, you will get all new applications for free in the next two years.

The possibilities of SCiO applications are endless. For example, in the future you can use SCiO to measure properties of cosmetics, clothes, flora, soil, jewels and precious stones, leather, rubber, oils, plastics, and even your pet!"

"Built to entertain blind players as well as those who can see, the audio-only game’s accommodation of disabled gamers is a pleasant anomaly in the gaming industry, even though the number of gamers with disabilities is significant. The latest Americans with Disabilities report, which draws on 2010 census data, estimates that nearly fifty-seven million Americans, or roughly nineteen per cent of the population, have a disability, with over thirty-eight million suffering from what the report considers to be a “severe disability” of a physical, mental, or communicative nature. While nearly twenty million Americans “had difficulty with physical tasks relating to upper body function,” more than eight million over the age of fifteen have difficulty seeing and seven and a half million reported difficulty hearing. There is certainly overlap with the fifty-eight per cent of Americans who, according to the Entertainment Software Association, play video games; the Able Gamers Foundation, a charity organization for disabled gamers, estimates that there are thirty-three million gamers with some kind of disability.

In the nineteen-eighties, gamers like John Dutton, a quadriplegic who learned to use the Atari 2600 joystick with his mouth and chin, drew attention to the need for hardware that disabled gamers could use. In 1988, Nintendo released the NES Hands Free, a video-game controller designed explicitly for disabled gamers, which was worn like a vest. It had a chin stick for movement and a tube that players breathed in and out of to control the “A” and “B” buttons. In the nineties, attention shifted to making in-game control schemes more accessible, leading to releases like Shades of Doom, a first-person shooter for visually impaired gamers. More recently, the Call of Duty franchise, inspired by the quadriplegic professional gamer Randy Fitzgerald, introduced a special button layout for disabled gamers which makes it easier to aim, while the Able Gamers Foundation has published a guide that shows developers how to design more accessible products."

…

"“There are gamers out there who are anxious for more accessible content, and very little, if any, of it is coming from established publishers,” Astolfi said. “People with disabilities are a group that has, in general, not been targeted by major video-game releases. But as the indie game movement continues to grow, I think we’ll see more games designed specifically for this audience.”

Yet a large part of BlindSide’s success seems tied to the fact that it doesn’t feel like a game that’s been designed for disabled players. A game with no visual stimulus can be just as engrossing for players who can see as for those who cannot, it seems. “Our favorite feedback on the game was actually a negative comment,” Astolfi said. “It was a three-star review from a sighted player who said he found the game too scary.”"

"Counting Sheep: NZ Merino in an Internet of Things is a three-year research project (2011-2014) based in the School of Design, Victoria University of Wellington, New Zealand. Led by Dr Anne Galloway, our work explores the role that cultural studies and design research can play in supporting public engagement with the development and use of science and technology.

The Internet of Things is a vision for computing that uses a variety of wireless identification, location, and sensor technologies to collect information about people, places and things - and make it available via the internet. Today's farms generate and collect enormous amounts of data, and we're interested in what people can do with this information - as well as what we might do with related science and technology in the future.

Over the past two years we've travelled around the country, visiting merino stations, going to A&P shows and shearing competitions, and spending time in offices and labs, talking with breeders, growers, shearers, wool handlers, scientists, industry representatives, government policy makers and others - all so that we could learn as much as possible about NZ merino. Then we took what we learned and we started to imagine possible uses for these technologies in the future production and consumption of merino sheep and products.

This website showcases our fictional scenarios and we want to know what you think!"

"WNYC invites families, armchair scientists and lovers of nature to join in a bit of mass science: track the cicadas that emerge once every 17 years across New Jersey, New York and the whole Northeast by building homemade sensors and reporting your observations."

"I've been thinking a lot about motive & intent for the last few years. How we recognize motive &… how we measure its consequence.

This is hardly uncharted territory. You can argue easily enough that it remains the core issue that all religion, philosophy & politics struggle with. Motive or trust within a community of individuals.

…Bruce Schneier…writes:

"In today's complex society, we often trust systems more than people. It's not so much that I trusted the plumber at my door as that I trusted the systems that produced him & protect me."

I often find myself thinking about motive & consequence in the form of a very specific question: Who is allowed to speak on behalf of an organization?

To whom do we give not simply the latitude of interpretation, but the luxury of association, with the thing they are talking about …

Institutionalizing or formalizing consequence is often a way to guarantee an investment, but that often plows head-first into the subtleties of real life."

"Remote sensor connections is a feature allowing other programs to connect to Scratch. This allows it to be extended to connect to devices, access the internet, or perform other functions not possible inside Scratch. For example, JoyTail allows you to use a joystick with Scratch."
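As a sketch of how such a connection works: Scratch 1.4's remote sensor protocol has external programs connect over TCP (port 42001) and frame each message with a 4-byte big-endian length prefix, exchanging plain-text messages like `broadcast "x"` and `sensor-update "name" value`. The host and sensor names below are illustrative:

```python
# Sketch of the Scratch 1.4 remote sensor protocol: messages are
# plain text, each preceded by a 4-byte big-endian length prefix.
import socket
import struct

def frame(message: str) -> bytes:
    """Prefix a protocol message with its 4-byte big-endian length."""
    data = message.encode("utf-8")
    return struct.pack(">I", len(data)) + data

def send_sensor_update(host, name, value, port=42001):
    # Requires Scratch running with remote sensor connections enabled.
    with socket.create_connection((host, port)) as sock:
        sock.sendall(frame(f'sensor-update "{name}" {value}'))

print(frame('broadcast "hello"'))
```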

"Right now we are in the age of life-logging, recording every bit of information about a person's activities, behavior, and physicality. This behavior is also called total capture, and Facebook’s latest Timeline feature has introduced the idea of total capture to mainstream audiences. Abigail Sellen, a Principal Researcher at Microsoft Research, is critical of the modern conversation on life-logging and total capture and argues that this technical handling of memories through indexing and metadata is just not how memory works."

"Today we are used to the public sensors around us being noticeable if you know what to look for. In 20 years time this may no longer be the case, and the social implications are worth exploring. … Let's look at London, a fairly typical large capital city. London has a surface area of approximately 1570 square kilometres, and around 7.5 million inhabitants (not counting outlying commuter towns). Let us assume that our hypothetical low-power processor costs 10 euro cents per unit, in large volumes. To cover London in CPUs roughly as powerful as the brains of the Android tablet I'm reading this talk from, to a density of one per square metre, should therefore cost around €150M in 2040, or €20 per citizen. … "It has been said that the internet means the death of privacy — but internet-based tracking technologies aren't useful if you leave your computer at home and switch off your smartphone. In contrast, the internet of things — the city wallpapered from edge to edge with sensors and communicating processors — really does mean the death of privacy. You'd have to lock yourself in a faraday cage and switch off all the electrical devices near to you in order to regain any measure of invisibility. … we're going to be subjected to more monitoring than most people today can possibly imagine. … The logical end-point of Moore's Law and Koomey's Law is a computer for every square metre of land area on this planet — within our lifetimes. And, speaking as a science fiction writer, trying to get my head around the implications of this technology for our lives is giving me a headache. We've lived through the personal computing revolution, and the internet, and now the advent of convergent wireless devices — smartphones and tablets. Ubiquitous programmable sensors will, I think, be the next big step, and I wouldn't be surprised if their impact is as big as all the earlier computing technologies combined."
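The back-of-envelope figures in the talk check out. A quick sketch of the arithmetic (one €0.10 processor per square metre across London's ~1,570 km² and ~7.5 million inhabitants):

```python
# Checking the quoted estimate: cost of one cheap processor per
# square metre of London, total and per citizen.
area_m2 = 1_570 * 1_000_000      # 1,570 km^2 in square metres
unit_cost = 0.10                 # euros per processor
population = 7_500_000

total = area_m2 * unit_cost
per_citizen = total / population

print(f"total: €{total / 1e6:.0f}M, per citizen: €{per_citizen:.2f}")
```

That lands at roughly €157M total and about €21 per citizen, matching the talk's rounded "€150M, or €20 per citizen".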

"It sounds strange when you first read it: behavioural change to accommodate the invisible gaze of the machines, just in case there’s an invisible depth-camera you’re obstructing. And at the same time: the literacy to understand that when a screen is in front of a person, there might also be an optical relationship connecting the two – and to break it would be rude.

The Sensor-Vernacular isn’t, I don’t think, just about the aesthetic of the “robot-readable world“; it’s also about the behaviours it inspires and leads to.

How does a robot-readable world change human behaviour?…

Look at all the other gestures and outwards statements that the sensor-vernacular has already led to: [examples]…

Where next for such behavioural shifts? How long before, rather than waving, or shaking hands, we greet each other with a calibration pose:

Which may sound absurd, but consider a business meeting of the future… [described]"

"The Instruments of Politeness show how we might interact with context aware technology in the future.

At present we can lie about our current situation because the only transmitted information is the actual conversation and background noise. In the future mobile phones will be able to estimate our activity by evaluating multiple sensors in the device. This information will not only be used by the device itself but shared with our environment. The project 'Instruments of Politeness' allows the user to lie about his current activity.

What if we could trick the perception of our "aware" gadgets?

These two objects focus on simulating specific movement procedures. The first one converts a circular movement into a gentle linear motion as if the person was walking with the phone in their pocket. The second object creates a random movement to simulate a person dancing."

"There is increasing concern about pollution levels in the world's most ubiquitous and essential substance – air – and a new pilot project from Intel is aiming to address it via the developed world's second most ubiquitous thing: The mobile device. The Common Sense Project has developed a prototype for a new handheld mobile device equipped with an air quality sensor that helps communities record and analyze environmental data in order to become more engaged in civic matters of environmental policy and regulation."

"We are developing mobile sensing technologies that help communities gather and analyze environmental data. We hope that this hardware and software will empower everyday citizens to learn more about their environment and influence environmental regulations and policy.

We have developed various research prototypes, which are being used in studies such as a deployment on street sweepers in San Francisco and a deployment of a handheld device in West Oakland. Right now we are focusing our efforts on air quality sensing. Our hope is that our research prototypes will demonstrate the utility of embedding environmental sensors in commercial commodity devices such as mobile phones."

"Trillions of embedded systems are being unleashed into the world. What are the implications of a world filled with all these sensors and actuators? Some of the world’s most insightful designers, thinkers and entrepreneurs will address these questions, with you, at Doors of Perception 7 in Amsterdam on 14, 15, 16 November 2002. The theme is Flow: the design challenge of pervasive computing."

"Noise pollution is a serious problem in many cities. NoiseTube is a research project about a new participative approach to monitoring noise pollution involving the general public. Our goal is to extend the current usage of mobile phones by turning them into noise sensors, enabling each citizen to measure his own exposure in his everyday environment. Furthermore, each user can also participate in the creation of a collective map of noise pollution by automatically sharing his geolocalized measurements with the community.

By installing our free application on your GPS equipped phone, you will be able to measure the level of noise in dB(A) (with a precision a bit lower than a sound level meter), and contribute to the collective noise mapping by annotating it (tagging, subjective level of annoyance) and sending this geolocalized information automatically to the NoiseTube server by internet (GPRS)."
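At its core, the measurement described above is a root-mean-square signal level converted to decibels. A simplified sketch (the per-device calibration a real app like NoiseTube applies, and the frequency weighting that makes the result dB(A), are omitted here):

```python
# Simplified sketch of phone-based noise measurement: convert audio
# samples to a decibel level relative to a reference amplitude.
# Calibration and A-weighting are omitted.
import math

def rms(samples):
    """Root-mean-square amplitude of a block of samples."""
    return math.sqrt(sum(s * s for s in samples) / len(samples))

def db_level(samples, reference=1.0):
    """Level in dB relative to `reference` amplitude."""
    return 20 * math.log10(rms(samples) / reference)

# Two synthetic tones, one 50x louder than the other
quiet = [0.01 * math.sin(i / 10) for i in range(1000)]
loud = [0.5 * math.sin(i / 10) for i in range(1000)]
print(f"quiet: {db_level(quiet):.1f} dB, loud: {db_level(loud):.1f} dB")
```

A 50x amplitude ratio shows up as a ~34 dB difference, which is why the logarithmic scale is used for sound in the first place.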

"'amphibious architecture' is a new project by the new york city design studio the living. the project specifically uses water as a surface, since it is so ubiquitous in the world, yet under-explored in art and design. the project consists of two networks of floating interactive tubes that feature light beacons on top and a range of sensors below. these sensors 'monitor water quality, presence of fish, and human interest in the river ecosystem', while the lights respond and 'create feedback loops between humans, fish, and their shared environment'. 'an SMS interface allows citizens to text-message the fish, to receive real-time information about the river, and to contribute to a display of collective interest in the environment.’"

"Jing Li, a physical scientist at NASA's Ames Research Center, Moffett Field, Calif., along with other researchers working under the Cell-All program in the Department of Homeland Security’s Science and Technology Directorate, developed a proof of concept of new technology that would bring compact, low-cost, low-power, high-speed nanosensor-based chemical sensing capabilities to cell phones.

The device Li developed is about the size of a postage stamp and is designed to be plugged in to an iPhone to collect, process and transmit sensor data. The new device is able to detect and identify low concentrations of airborne ammonia, chlorine gas and methane. The device senses chemicals in the air using a "sample jet" and a multiple-channel silicon-based sensing chip, which consists of 16 nanosensors, and sends detection data to another phone or a computer via telephone communication network or Wi-Fi."

"Natural Fuse creates a city-wide network of electronically-assisted plants that act both as energy providers and as circuit breakers.

Every seemingly helpful device that a human being uses has its own carbon “footprint” which, in excess, can harm other living beings. “Natural fuse” is a micro scale CO2 monitoring & overload protection framework that works locally and globally, harnessing the carbon-sinking capabilities of plants.

Natural fuses allow only a limited amount of energy to be expended in the system; that amount is balanced by the amount of CO2 that can be absorbed by the plants that are growing in the system.

In the same way that circuit-breakers are useful for preventing excessive current use, so too can the Natural Fuse plants break the CO2 footprint “circuit”."
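
The fuse logic described here can be sketched as a simple budget check. This is a minimal illustration only — the grid emission factor and the plants' absorption capacity are invented placeholder figures, not values from the Natural Fuse project:

```python
# Illustrative "natural fuse": the CO2 the attached plants can absorb
# sets an energy budget, and the fuse trips when consumption exceeds it.
# All numbers are hypothetical.

CO2_PER_KWH = 0.5  # kg CO2 emitted per kWh consumed (assumed grid factor)

class NaturalFuse:
    def __init__(self, plant_absorption_kg):
        # CO2 (kg) the plants can sink over the budget period
        self.budget_kg = plant_absorption_kg
        self.used_kg = 0.0
        self.tripped = False

    def draw(self, kwh):
        """Request kwh of energy; True if allowed, False if the fuse trips."""
        emitted = kwh * CO2_PER_KWH
        if self.used_kg + emitted > self.budget_kg:
            self.tripped = True  # like a circuit breaker, cut the load
            return False
        self.used_kg += emitted
        return True

fuse = NaturalFuse(plant_absorption_kg=1.0)  # plants can sink 1 kg CO2
print(fuse.draw(1.5))  # emits 0.75 kg -> allowed, prints True
print(fuse.draw(1.0))  # would total 1.25 kg -> trips, prints False
```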

"And here we get to the crux of the issue: in both Hong Kong and Tokyo, the consequences of decisions made by engineers about the properties of a technical system cascaded upward not merely to the level at which they could afford or constrain individual behavior, but that at which they affected the macro-level performance of the entire subway system…and maybe even the community’s long-term well-being."

"we hope that this work goes some way towards building better spatial and gestural models of RFID, as material for designers to build better products and to take full advantage of the various ways in which spatial proximity can be used. And with this better understanding we hope to be able to discuss and design for privacy and the ‘leakage’ of data in a more rigorous way."

"Stanford-Clark has set up various systems for real-time monitoring of the Internet of Things, many of them using Twitter (he calls the resulting tweets "tweetjects"). One example got a bit of mainstream media coverage lately: a house that uses Twitter to monitor its energy consumption.

As Rory Cellan-Jones from the BBC reported recently, Stanford-Clark has installed sensors on a number of household objects - such as electricity meters and windows. From this he can monitor lighting, heating, temperature, phone and water usage. Stanford-Clark is able to turn his fountain, lights and heaters on and off by flicking switches on a web page or from a live dashboard application on his mobile phone."
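
The "tweetject" idea — a device composing short status updates from its sensor readings — can be sketched in a few lines. The field names and message format below are assumptions for illustration, not Stanford-Clark's actual output:

```python
# Compose a short "tweetject" status line from a dict of sensor readings.
# Readings are sorted by name for a stable message, then trimmed to the
# classic 140-character tweet limit.

def tweetject(readings):
    """Format sensor readings as a short status update."""
    parts = [f"{name}: {value}" for name, value in sorted(readings.items())]
    status = "House update - " + ", ".join(parts)
    return status[:140]

msg = tweetject({"power_W": 423, "temp_C": 19.5, "fountain": "off"})
print(msg)  # prints "House update - fountain: off, power_W: 423, temp_C: 19.5"
```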

"The NETLab Toolkit is a free set of software tools that enable designers to easily "sketch in hardware". With no programming at all and working in the familiar environment of Flash, designers can hook up a physical sensor (e.g. a knob) and immediately get that knob to control a motor or a video projection. The toolkit works with a wide range of sensors, wireless sensors, input from the Wii Remote, controls motors and LEDs, communicates with MIDI devices, controls sound, graphics, and video in Flash, and communicates with DMX computer controlled lighting equipment, all with a simple drag-and-drop interface (of course, programming hooks are provided as well)."

"Another post on Pachube and how easy it is to get setup and going. This post will cover the basic steps on how to get some sensors online! I will be using my project of two light sensors as an example."

"Pachube is a service that enables you to connect, tag and share real time sensor data from objects, devices, buildings and environments around the world. Here, we collect together Pachube apps that create/modulate input feeds or make use of output feeds. Sign up for Pachube here!"

"Christoph Bartneck, an industrial design professor at the Eindhoven University of Technology in the NetherlandsDenmark, teaches a course I would have killed to take: a master class to create expansion packs for Lego Mindstorms NXT. The students have to make working sensors that go beyond what Lego and third-party sensor makers (and others) already provide. Here are some samples from the students’ projects: GPS Sensor, Wireless Sensor Bridge, Optical Mouse Sensor"

"This report describes the results of a collaborative research project to develop a suite of low-tech sensors and actuators that might be useful for artists and architects working with interactive environments. With this project we hoped to consolidate a number of different approaches we had found ourselves taking in our own work and develop both a "kit-of-parts" and a more conceptual framework for producing such works."

"Ever wonder how loud a sound really is? Well now you can with Decibel Meter! Decibel Meter uses your iPhone's built in Mic to determine the intensity level of sounds it picks up, and displays it in dB SPL (Decibel Sound Pressure Level). The iPhone's Mic has the ability to pick up sounds near 0 dB SPL and up to around 110 dB SPL."

"What does the platform look like that allows digital architects to layer interaction models from massively multiplayer gaming and wii-like gestural performance onto urban-scale environments? We are building software, initially for an amusement park in Dubai, that integrates the datastreams from thousands of modular sensor nodes into a high-resolution realtime spatial model of people and objects."

"Below are a couple of sketches for Arduino Ethernet: the first can be used to share sensor data with Pachube and grab remote sensor data, so you can effectively have both "local" and "remote" sensors; and the second enables you to control Arduino Ethernet via a web page.

These sketches should make Arduino-based web-controlled home automation, and remote-responsive spaces a lot easier. The advantage of working with an ethernet shield is that you no longer need to tether the Arduino to a computer in order to access Pachube and other network services!"
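
For readers without an Arduino to hand, the upload side can be illustrated in plain Python. Pachube's v2 API accepted simple CSV updates per datastream; the endpoint and header mentioned in the comments are from memory and should be checked against the API documentation:

```python
# Build the CSV body a client might PUT to a Pachube feed: one
# "datastream_id,value" line per sensor. Sending it would be an HTTP PUT
# to api.pachube.com/v2/feeds/<feed_id>.csv with an X-PachubeApiKey
# header (feed id and key are hypothetical).

def pachube_csv(readings):
    """Build a CSV body mapping datastream ids to current values."""
    return "\n".join(f"{stream},{value}" for stream, value in sorted(readings.items()))

body = pachube_csv({"light0": 512, "light1": 301})
print(body)  # prints "light0,512" then "light1,301"
```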

"We often think of mobile technologies simply in terms of their communication capabilities, but their increasing ability to trace our movements and collect information about the spaces through which we pass, can also make it easier for people to keep track of the places and things that matter most to them. From geo-visualisations and mapping mash-ups, to the mobile geospatial web and location-based services, people’s relationships to places (and each other) are changing."

"Botanicalls opens a new channel of communication between plants and humans, in an effort to promote successful inter-species understanding.

The Botanicalls project is fundamentally about communication between plants and people. We are empowering both by inventing new avenues of interaction. Plants that might otherwise be neglected are given the ability to call, text message and email people to request assistance. People who are unsure of their ability to effectively care for growing things are given visual and aural clues using common human methods of communication."
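
The core Botanicalls decision — turning a soil-moisture reading into a human-readable plea — might look like the sketch below. The thresholds and exact wording are invented for illustration, not the project's real values:

```python
# Map a raw soil-moisture reading to the message the plant "sends".
# Thresholds assume a 0-1023 ADC scale; both are hypothetical.

DRY, MOIST = 300, 700

def plant_message(moisture):
    """Return the status message for a raw moisture reading."""
    if moisture < DRY:
        return "URGENT: Water me, please!"
    if moisture > MOIST:
        return "You have overwatered me."
    return "Thank you for watering me."

print(plant_message(150))  # prints "URGENT: Water me, please!"
```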

"Finally, something the iphone doesn't do, yet... "Interzone aims to build on the work of Sensor Planet: a Nokia Research Center initiated program on large-scale sensor networks that is interested in combining the physical and the virtual worlds through new ways of sensing." This project is a sensor that would work in conjunction with a mobile phone to provide information to the user about local air quality."

"This installation on display in Montreal, Canada by Uruguayan artist and architect Juliana Rosales focuses on the relationship between built and natural environments and the occupants of both spaces. The project uses an Arduino and custom circuitry to mo

"Welcome to Pachube, a service that enables people to tag and share real time environmental data from objects, devices and spaces around the world. The key aim is to facilitate interaction between remote environments, both physical and virtual."

"Inventor Andy Stanford-Clark has set up a twitter account for his house. This allows his home automation system to keep him updated about lighting, security, energy usage, etc. while he is away. Check out this great example of the potential for ambient i

"Air Monsters is a mobile instrument that seeks to explore the issue of air pollution. This is achieved by a fictional narrative of invisible monsters that resides in the air which metaphorically represents the air pollutants in the air."

"enables people to tag and share real time sensor data from objects, devices and spaces around the world...key aim is to facilitate interaction between remote environments, both physical and virtual. Apart from enabling direct connections between any two

"Our technology enables individuals and corporations to better understand their environment, through the use of a series of GPS-enabled sensors. We provide a set of open APIs and communication protocols to manage the data collected."

"Black Cloud curriculum is organized around an Alternate Reality game in which students track down wireless air quality sensors. These sensors are hidden in the students’ neighborhood at environmentally critical locations."

"technology and design consultancy that helps artists, designers and brands make the best use of digital technologies. Our manufacturing expertise, which includes the Arduino project, allows us to work with designers on all aspects of a project, from init

"sketch of average high street...based on here and now; none of the technology is R&D...We can’t see how the street is immersed in a twitching, pulsing cloud of data...two caricatured possible futures that can be deployed to flush out a few more issues.

"AT suggestion of Department of Homeland Security, NYC Council members have drafted legislation requiring anyone who has or uses a detector that measures chemical, biological or radioactive agents to get license from Police Department."

"The sensors are linked up and used like a rudimentary motion-capture suit -- only instead of needing a controlled environment and special cameras, microphones worn on the torso pick up beeps from the emitters to locate your limbs as you flail about."

"BUG is a new kind of device, one that's designed by you, not us. BUG is an open source, modular consumer electronics platform that makes building hardware just as easy as writing software or Web applications"

"Brandon has spent the past decade studying amphibians which he defined as the "environmental canaries in the coalmine." They act as bio-sensors. Studies have demonstrated that amphibians are declining even in protected environments."

"It is about making data architectures and systems that are as much alive as the earth they capture, that permit us to dynamically understand, manipulate, and research new propositions about our living and increasingly dying planet."