More than 1.8 zettabytes of information will be created and stored in 2011, according to the latest IDC Digital Universe Study sponsored by EMC. That’s a mind-boggling figure, equivalent to 1.8 trillion gigabytes.
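As a quick sanity check on that conversion (using the decimal SI prefixes the study uses):

```python
# Decimal SI prefixes: 1 zettabyte = 10**21 bytes, 1 gigabyte = 10**9 bytes.
ZETTABYTE = 10**21  # bytes
GIGABYTE = 10**9    # bytes

zettabytes = 1.8
gigabytes = zettabytes * ZETTABYTE / GIGABYTE
print(f"{gigabytes:.1e} GB")  # 1.8e+12 GB, i.e. 1.8 trillion gigabytes
```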

“Before atomic timekeeping, clocks were set to the skies. But starting in 1972, radio signals began broadcasting atomic seconds and leap seconds have occasionally been added to that stream of atomic seconds to keep the signals synchronized with the actual rotation of Earth. Such adjustments were considered necessary because Earth’s rotation is less regular than atomic timekeeping. In January 2012, a United Nations-affiliated organization could permanently break this link by redefining Coordinated Universal Time. To understand the importance of this potential change, it’s important to understand the history of human timekeeping.”
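A leap second is broadcast as an extra second labeled 23:59:60 UTC at the end of June 30 or December 31, a timestamp most civil-time software cannot represent. A small Python illustration, using the real leap second of December 31, 2008:

```python
from datetime import datetime, timedelta

# Python's datetime, like most civil-time libraries, only allows
# seconds 0..59, so the broadcast timestamp 23:59:60 is unrepresentable.
try:
    datetime(2008, 12, 31, 23, 59, 60)
except ValueError as e:
    print("unrepresentable:", e)

# Civil-clock arithmetic therefore skips the inserted second: on the
# atomic (TAI) timescale, 2 seconds elapsed between these two instants.
before = datetime(2008, 12, 31, 23, 59, 59)
after = datetime(2009, 1, 1, 0, 0, 0)
print(after - before)  # 0:00:01 on the civil clock
```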

What’s interesting is that several tech companies, at least according to the American Customer Satisfaction Index, are ranked only just above Bank of America, which is listed last among the 19 companies with a satisfaction rating of 68 out of 100. In other words, the other 18 companies all have higher customer satisfaction scores than BofA.

IBM Thursday announced a breakthrough in computer memory technology that may lead to solid-state chips that can store as much data as NAND flash but with 100 times the performance and a vastly longer lifespan.

Currently, NAND flash memory products, such as SSDs, have write rates as high as 2Gbit/sec. IBM said it has produced phase-change memory (PCM) chips that can store two bits of data per cell without the data corruption problems that have plagued PCM development from the start.
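Storing two bits per cell means distinguishing four resistance levels in each cell, and the corruption problem is resistance drift blurring those levels over time. A hypothetical sketch of multilevel encoding and nearest-level decoding (the level values and drift model are invented for illustration, not IBM's actual figures):

```python
import random

# Hypothetical 2-bit-per-cell scheme: each two-bit value maps to one of
# four nominal resistance levels (arbitrary units, not real PCM values).
LEVELS = {0b00: 1.0, 0b01: 2.0, 0b10: 3.0, 0b11: 4.0}

def write_cell(bits):
    """Program a cell to its nominal level, with some write noise."""
    return LEVELS[bits] + random.uniform(-0.1, 0.1)

def read_cell(resistance):
    """Decode by picking the nearest nominal level."""
    return min(LEVELS, key=lambda bits: abs(LEVELS[bits] - resistance))

data = [0b00, 0b01, 0b10, 0b11]
cells = [write_cell(b) for b in data]
# Resistance drift: levels creep upward over time. Decoding stays correct
# only while noise plus drift is under half the spacing between levels.
drifted = [r + 0.2 for r in cells]
print([read_cell(r) for r in drifted] == data)  # True
```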

The old adage says “a picture is worth a thousand words,” but exactly which words is the question. While facial recognition and GPS-enabled cameras have made tagging digital snapshots with names and locations much easier, a team of students from Duke University and the University of South Carolina has developed a smartphone app called TagSense that takes advantage of the range of multiple sensors on a mobile phone to automatically apply a greater variety of tags to photos.

Looking to make it easier for people to search and retrieve specific photos from their ever-growing digital albums, the team set about creating an app that takes advantage of not just a smartphone’s GPS, but also its accelerometer, light sensor and microphone to provide additional details about a given photo. The app doesn’t just make use of the sensors on the phone taking the photo, but also collects information from the smartphones of subjects within the photo – with permission, of course.
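The article doesn't describe TagSense's actual rules, but the idea of turning raw sensor readings into candidate tags can be sketched with a few hypothetical heuristics (the function name, parameters, and thresholds below are all invented for illustration):

```python
def infer_tags(gps=None, accel_motion=0.0, light_lux=0.0, sound_db=0.0):
    """Hypothetical rule-based tagger in the spirit of TagSense:
    combine a phone's sensor readings into candidate photo tags.
    All thresholds here are invented for illustration."""
    tags = []
    if gps is not None:
        tags.append(f"near {gps}")          # place name from a GPS lookup
    if accel_motion > 2.0:                  # accelerometer: subject moving
        tags.append("in motion")
    tags.append("outdoors" if light_lux > 1000 else "indoors")
    if sound_db > 70:                       # microphone: loud surroundings
        tags.append("noisy environment")
    return tags

print(infer_tags(gps="Duke University", accel_motion=3.5,
                 light_lux=5000, sound_db=75))
# ['near Duke University', 'in motion', 'outdoors', 'noisy environment']
```

In the same spirit, readings gathered (with permission) from the phones of people in the frame could be passed through the same rules to tag the subjects, not just the photographer's surroundings.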