Ever wondered what the weather would look like as a painting? Now you don’t have to.

Our habitats should reflect our lives. Why doesn't the world around us react to events the same way our devices do? If we care so much about this information, let's bring it into the real world and stop trying to suck all of our data through the tiny straw that is a phone screen. We need to make more proactive devices so that we can spend more time reacting to data and less time searching for it. We need to make ambient devices that live where we do, so that our data is more accessible. Furthermore, we need to stop being so literal with our data. Take weather data, for example. When I see that list of numbers predicting the future, I feel like I'm trying to figure out what a painting looks like through a microscope. We need to explore other ways of expressing this data; what if it were a painting?

I first found out that Dark Sky provides a free API for hourly weather forecast data covering the next 168 hours. The data is high quality and accessible, but presenting it in a meaningful, digestible way has been an issue (Dark Sky on iOS, Left). The amount of data is overwhelming and very difficult to remember, but we may be able to synthesize this complex data into a digestible pattern by lowering the specificity. Instead of metaphors like "temperature is a number, chance of rain is a number, cloudiness is a number," we can combine these variables into a single color for each hour. All 168 hours presented together create a piece of art, painted by the weather forecast.
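For illustration, here is a minimal sketch of pulling and unpacking that forecast. The endpoint and field names follow the Dark Sky API as documented; the API key and coordinates are placeholders, and the function names are mine, not the project's:

```python
import json
from urllib.request import urlopen

def fetch_forecast(api_key, lat, lon):
    """Request hourly forecast data from Dark Sky.

    Passing extend=hourly bumps the default 48 hours of hourly
    data up to the full 168 hours (one week).
    """
    url = (f"https://api.darksky.net/forecast/{api_key}/{lat},{lon}"
           "?extend=hourly")
    with urlopen(url) as resp:
        return json.load(resp)

def unpack_hours(forecast):
    """Reduce each hourly entry to the three variables the Sky Matrix
    maps to color: temperature (°F), chance of rain (0-1), cloud cover (0-1)."""
    return [
        (h["temperature"],
         h.get("precipProbability", 0.0),
         h.get("cloudCover", 0.0))
        for h in forecast["hourly"]["data"]
    ]
```

From there, each `(temperature, rain, clouds)` triple becomes one LED.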

The color picker (Right) shows how the temperature and chance of rain for each hour map to a corresponding RGB value. Warmer colors indicate higher temperatures, blues indicate lower temperatures, and the greener the color, the higher the chance of rain. Cloud coverage and sunrise/sunset times are then layered in by controlling the brightness: the dimmer the pixel, the heavier the cloud coverage. These metaphors align directly with the weather we experience. We already associate warmth with red, cold with blue, and green with spring and rain; in this way, we can get a sense of the data without actually reading any numbers. Individual colors would be difficult to interpret, but many colors presented relative to one another reveal patterns that are otherwise hidden.
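The exact mapping lives in the color picker, but the idea can be sketched in a few lines. The 0–100 °F display range and the dimming factor here are my assumptions, not the project's actual values:

```python
def weather_to_rgb(temp_f, rain_chance, cloud_cover):
    """Map one hour of weather to an (R, G, B) tuple.

    temp_f: temperature in °F, clamped to an assumed 0-100 display range
    rain_chance, cloud_cover: fractions from 0.0 to 1.0
    """
    t = min(max(temp_f, 0.0), 100.0) / 100.0
    red, blue = t, 1.0 - t                    # temperature is a red/blue ratio
    green = min(max(rain_chance, 0.0), 1.0)   # rain chance drives green
    # Clouds (and, in the real device, night hours) dim the whole pixel.
    brightness = 1.0 - 0.8 * min(max(cloud_cover, 0.0), 1.0)
    return tuple(round(255 * c * brightness) for c in (red, green, blue))
```

Because the temperature splits into a red/blue ratio, 50 °F with a 50% chance of rain is the one input where all three channels come out equal.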

Try to get a sense of the weather by referencing the Sky Matrix (Right) and the RGB color picker, keeping in mind that the brightness of each LED indicates sunrise/sunset times as well as cloud coverage. A few patterns immediately emerge: the sun rises between 7 and 8 in the morning and sets between 5 and 6 in the afternoon each day. On Saturday (far right column), green indicates rain throughout the day, shifting to orange for a warm mid-day rain. It is also easy to see temperatures starting relatively cool and rising throughout the week, from blue to red.

One case that caught me off guard is when an LED shows white. Since temperature is a ratio between red and blue, an LED can only be white when the temperature is 50° with a 50% chance of rain; it is the only time all three colors can appear in equal amounts. These patterns, and others, are not immediately evident in lists of numbers, and not as digestible in traditional visualizations. This "painting" can also be taken in as a whole and interpreted in seconds, because there is no limitation of reading speed. However, it is not without flaws. Those who are colorblind will have a hard time translating data that relies on color, but there are alternate solutions for this too, which I may explore in the future...

This is just one example of how we can create devices that update our habitats, present ambient and proactive data, and question the metaphors of traditional communication. This is what can start to make the difference between reading what data says, and quickly sensing what the data actually means. I built the Sky Matrix using a Raspberry Pi, FadeCandy Driver, and a bunch of NeoPixel LEDs from Adafruit. It updates its data every 15 minutes automatically, and the current hour of the day breathes in and out to give you some context of where 'now' is on the Sky Matrix.
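The post doesn't specify the breathing timing, but the pulsing "now" pixel can be sketched as a sine-based brightness multiplier. The three-second period and the 0.2–1.0 brightness range are assumptions for illustration:

```python
import math

def breathe_scale(elapsed_s, period_s=3.0):
    """Brightness multiplier for the current-hour pixel.

    Oscillates smoothly between 0.2 and 1.0 once per period, so the
    pixel appears to breathe in and out.
    """
    phase = math.sin(2 * math.pi * elapsed_s / period_s)
    return 0.6 + 0.4 * phase

def breathe_pixel(rgb, elapsed_s):
    """Scale an (R, G, B) tuple by the breathing multiplier."""
    s = breathe_scale(elapsed_s)
    return tuple(round(c * s) for c in rgb)
```

In the real device, a loop would call something like `breathe_pixel` on the current hour's color each frame before pushing the full 168-pixel array to the FadeCandy.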

This is a past project that I realized wasn't on my site and that deserves to be here. Designed for the iPhone and iPad Air, this dual dock is the consequence of my frustrations with the current charging experience. I had a few major requirements when I started this project: use existing charging cords without modifying them, make it possible to plug in devices without looking, and don't use any moving or mechanical parts. The result is a minimalist sapele wood slab with a hidden seam that separates the two pieces and releases the two charging cords.

Touch Lamp was created out of a necessity for a great table lamp. Most lamps have cumbersome switches by which a half-asleep human can be easily frustrated, so Touch Lamp does away with them. Touch anywhere on the aluminum ring to turn the lamp on. Touch again to turn it off.

In addition to cumbersome switches, Touch Lamp does away with traditional lamp shades, which get dusty and worn, and replaces them with a glass sphere. The base is turned and sculpted from African Mahogany, and the power cord is soft, woven nylon. The form allows the power cord to be fed in from any direction relative to the power source. These features work together to fit our human habits and lifestyle, instead of forcing specific behaviors, to create the most enjoyable lamp experience possible.

A couple of weeks ago I decided to take something already very simple and make it intriguing. This pencil cup is turned out of African Mahogany and appears to be a ring floating just above the surface. The effect causes a quick double-take at a normally ordinary object, followed by an innocent curiosity about how it's made.

The first version isn't the most stable, and the weight of too many pens on one side can cause it to tip, but it's easy to imagine the base as metal wire in the next version, providing more stability and space.