Intel’s taking the lead in the new “data economy”

Intel is looking to take the lead in what it has dubbed the “data economy,” helping consumers realize and retain more value from their personal data. Antonio Regalado and Jessica Leber report at MIT Technology Review that the world’s largest computer chip maker has launched a “Data Economy Initiative.” Ken Anderson, the cultural anthropologist in charge of the project, described the initiative to them as “a multiyear study whose goal is to explore new uses of technology that might let people benefit more directly, and in new ways, from their own data.”

As part of the initiative, Intel is funding hackathons to encourage developers to experiment with personal data in new ways, Regalado and Leber note. “[Intel] has also paid for a rebellious-sounding website called We the Data,” they report, “featuring raised fists and stories comparing Facebook to Exxon Mobil.”

How Does Copyright Work in Space? (The Economist) — amazingly complex rights trail for the cover of “Space Oddity” recorded on the International Space Station. Sample: Commander Hadfield and his son Evan spent several months hammering out details with Mr Bowie’s representatives, and with NASA, Russia’s space agency ROSCOSMOS and the CSA. That’s the SIMPLE HAPPY ENDING.

Great Lessons: Evan Weinberg’s “Do You Know Blue?” (Dan Meyer) — It’s a bridge from math to computer science. Students get a chance to write algorithms in a language understood by both mathematicians and computer scientists. It’s analogous to the Netflix Prize for grown-up computer scientists.

The New York Times questions the environmental impact of data centers. Also, big data as hiring manager and inside Foursquare's data science.

The NYT investigates data center pollution, Google buys wind power

The New York Times (NYT) has conducted a year-long investigation into data centers and their environmental impact, and the first reports from the investigation were published this week. NYT writer James Glanz reports that the industry built on the tens of thousands of data centers required around the world to process the vast amounts of data produced by billions of users each day “is sharply at odds with its image of sleek efficiency and environmental friendliness.” Glanz says that through interviews and research, the NYT found data centers to be wasteful with electricity. Glanz reports:

“Online companies typically run their facilities at maximum capacity around the clock, whatever the demand. As a result, data centers can waste 90 percent or more of the electricity they pull off the grid, The Times found. To guard against a power failure, they further rely on banks of generators that emit diesel exhaust. The pollution from data centers has increasingly been cited by the authorities for violating clean air regulations, documents show. … Worldwide, the digital warehouses use about 30 billion watts of electricity, roughly equivalent to the output of 30 nuclear power plants, according to estimates industry experts compiled for The Times. Data centers in the United States account for one-quarter to one-third of that load, the estimates show.”

Glanz also notes the findings showed that only about 6 to 12% of the electricity data centers consume for servers is actually used to perform computations — the remaining 88 to 94% goes to keeping idle servers standing at the ready for surges in site activity. You can find Glanz’s full report, along with analysis and industry interviews, at the New York Times.
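The figures cited above are easy to sanity-check. A minimal back-of-the-envelope sketch, assuming a typical large nuclear plant outputs roughly 1 gigawatt (an assumption for illustration; the NYT quote gives only the totals):

```python
# Back-of-the-envelope check of the NYT's data center electricity figures.
# TOTAL_WATTS comes from the article; PLANT_OUTPUT_WATTS is an assumed
# ~1 GW for a large nuclear plant.

TOTAL_WATTS = 30e9          # worldwide data center draw, per the NYT estimate
PLANT_OUTPUT_WATTS = 1e9    # assumed output of one large nuclear plant

plants_equivalent = TOTAL_WATTS / PLANT_OUTPUT_WATTS
print(f"Equivalent plants: {plants_equivalent:.0f}")  # matches "30 nuclear power plants"

# Utilization: if only 6-12% of server electricity performs computation,
# the idle share is the complement, i.e. 88-94%.
for used_pct in (6, 12):
    idle_pct = 100 - used_pct
    print(f"{used_pct}% used for computation -> {idle_pct}% spent on idle capacity")
```

The same arithmetic underlies the article's headline waste claim: at those utilization rates, roughly nine-tenths of the power drawn from the grid does no computational work.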

Heavy data, open source strategies for businesses, and collaborating on code.

This week on O’Reilly: Jim Stogdill said data is getting heavier relative to the networks that carry it around the data center; Simon Phipps revealed open source community strategies relevant to the enterprise; and Team Geek authors Brian Fitzpatrick and Ben Collins-Sussman discussed the importance of developer collaboration.

Solving the problem of where to store huge amounts of data

This week, we look at the problem of too much government data, and companies beginning to build air-economized data centers (some in barns!). Plus: a few suggestions for pre-Strata reading on big data.

I shouldn’t have yelled at that Chinese guy so much — the post that redeemed Fake Steve Jobs in my eyes. We all know that there’s no fucking way in the world we should have microwave ovens and refrigerators and TV sets and everything else at the prices we’re paying for them. There’s no way we get all this stuff and everything is done fair and square and everyone gets treated right. No way. And don’t be confused — what we’re talking about here is our way of life. Our standard of living. You want to “fix things in China,” well, it’s gonna cost you. Because everything you own, it’s all done on the backs of millions of poor people whose lives are so awful you can’t even begin to imagine them, people who will do anything to get a life that is a tiny bit better than the shitty one they were born into, people who get exploited and treated like shit and, in the worst of all cases, pay with their lives.

Reconnoiter — holy cow web console and analytics for data centers, from the magic Theo Schlossnagle. He built the screenshots for his OSCON presentation, graphing streams of live performance data from dozens of data centers, while on a Virgin America flight.