The Dressler Blog

Digital Trends

By Nathan Hunt | Published May 25, 2017

DNA Storage
I have one photograph of my grandfather. He died many years ago, when my father was only a boy. In the photograph, taken decades before my father's birth, my grandfather poses next to a pile of wreckage that was the biplane he had been flying. He is wearing the World War One uniform of the United States Army Air Corps and leans on a cane, still recovering from the crash that could have taken his life. His expression is unreadable, not because the photographer failed to capture it but because the print is now a century old. Too much data has been lost to know how my grandfather felt about his near-death experience. All data storage is temporary. Paper rots; tape degrades. Half of all films made before 1951 are gone forever because they were stored on celluloid. But scientists at Microsoft Research believe they might have a longer-lasting data storage solution – DNA. DNA as a storage medium is both dense and durable. If stored as DNA, every movie ever made would fit in a space smaller than a sugar cube. And DNA lasts. Scientists have been able to sequence DNA extracted from the tooth of a woolly mammoth – hardly optimal storage conditions. Last year Microsoft announced that it had successfully stored 200 megabytes of data in DNA strands. However, the cost of both synthesizing the strands and then reading back the sequenced DNA is currently prohibitive.
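To see where the density comes from, consider that DNA has four bases, so each base can in principle carry two bits. The toy sketch below illustrates that raw mapping; it is my own illustration, not Microsoft's actual scheme, which layers on addressing, error correction, and chemistry-driven constraints such as avoiding long runs of the same base.

```python
# Toy illustration of DNA data storage: map each 2-bit pair to a base.
# Real systems add error-correcting codes and sequence constraints;
# this only shows the raw two-bits-per-nucleotide encoding.

BASE_FOR_BITS = {0b00: "A", 0b01: "C", 0b10: "G", 0b11: "T"}
BITS_FOR_BASE = {base: bits for bits, base in BASE_FOR_BITS.items()}

def encode(data: bytes) -> str:
    """Encode bytes as a DNA strand, four bases per byte."""
    bases = []
    for byte in data:
        for shift in (6, 4, 2, 0):  # most-significant bit pair first
            bases.append(BASE_FOR_BITS[(byte >> shift) & 0b11])
    return "".join(bases)

def decode(strand: str) -> bytes:
    """Recover the original bytes from a strand produced by encode()."""
    out = bytearray()
    for i in range(0, len(strand), 4):
        byte = 0
        for base in strand[i:i + 4]:
            byte = (byte << 2) | BITS_FOR_BASE[base]
        out.append(byte)
    return bytes(out)

strand = encode(b"hi")
print(strand)                      # CGGACGGC
assert decode(strand) == b"hi"     # lossless round trip
```

Two bits per base is why the density figures sound like science fiction: a gram of DNA holds on the order of hundreds of petabytes even before clever packing.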
Why does this matter? We are reaching the end of what could be considered the first age of computing. Silicon chips and tape storage are in an advanced stage of senescence. The first age of computing was an age of human ingenuity, largely divorced from the natural world. Binary notation, while an elegant solution for early computing, is simply not used anywhere else in nature. Quantum computing, based upon the underlying quantum structure of the universe, is a more sustainable approach. Magnetized tape was an acceptable storage solution for the first age of computing, but DNA storage makes use of nature's genius to store more for longer. Neural networks are inspired by the structure of our own brains, and frictionless interfaces may free us from the artificial tyranny of the workstation. The common theme is that computing is attempting to leverage the felicities of our natural world, rather than fight against them.
In a nutshell: Innovation isn’t ending. It’s beginning.
The camera is not replacing the keyboard
Technology is wonderful. But for every true innovation that will improve people's lives, there are twenty over-hyped technology fads that are breathlessly promoted by credulous editors and experts because they are easier to understand than real innovations like machine learning, the blockchain, or quantum computing. The latest example of clueless absurdity is the increasingly common claim that "the camera is the new keyboard." Sigh. Okay, let me try to parse this. The argument goes something like this: Snapchat, Instagram, image search, Google Lens. That's pretty much it. Here's what is true: the internet has been a primarily text-based medium for a long time, and images and video are becoming more important. Here's what isn't true: the camera on your phone will replace the need to type. In certain cases, image search is superior to text search. In certain cases, social networks based on images are gaining users faster than social networks based on text. But the vast majority of digital behavior is still text-based. This is a writing and reading medium. Sure, it's getting better at using images. But the camera won't replace the keyboard any more than the urinal replaced the toilet. They're used by different audiences for different things.
Why does this matter? Again, technology is wonderful. But it is subject to periodic fits of mass delusion. If you are investing in technology or trying to accomplish a digital transformation and you get overwhelmed by faddish nonsense, your business could suffer. Please file “the camera is the new keyboard” with “voice input will replace everything” and “everything will be crowdsourced” and “the sharing economy.” These are fads. Some people will make money off these fads. Most people will waste time and energy chasing them. If your job is to attend meetings and regurgitate the latest fad, don’t let me stand in your way. But if your technology choices matter, focus on real innovations based in technological change.
In a nutshell: No. Just no.
AR/VR: the now and the soon
Clay Bavor is Google's VP of Virtual and Augmented Reality. In conjunction with last week's Google I/O conference, he wrote a Medium post on Virtual and Augmented Reality. Much of what he wrote would be familiar to regular Digital Trends readers. AR and VR are positions on a continuum, not distinct technologies. AR and VR are part of a larger movement to build immersive technologies that will reduce the friction of interaction with the digital world. Google sees its current and historical mission as inextricably bound up with the development of immersive technologies. However, Bavor's insight into the progress of AR and VR toward large-scale commercial applications is particularly interesting. He suggests that we are not close. In fact, he suggests that current immersive technologies are the equivalent of mobile phone technology in the 1980s – interesting and useful, but nowhere near what they will become. Because of Bavor's unique position, one can assume he has seen the bleeding edge of immersive technology now under development. If he thinks we have a way to go, we definitely have a way to go.
Why does this matter? The graphical user interface is stuck. The last great innovation was the touchscreen interface of the first iPhone. Our desktops and mobile devices have not changed meaningfully in close to ten years. We see differences in design sensibility, but no changes in fundamentals. Apple, which popularized the last generation of GUI advances, seems paralyzed and insecure following the death of Steve Jobs. Google has shown at best a grudging acceptance of the necessity of the interface. Amazon, a company that failed at the GUI at every stage, is bizarrely ahead of the pack with voice. My sense was that all of the big tech companies were holding their fire until immersive technologies came online. I foolishly imagined that this must mean that enterprise-scale AR was imminent. I now suspect that may have been unduly optimistic.
In a nutshell: Immersive tech is going to take longer than I thought.
