Display in contact lens

Enhancing contact lenses with electronics is a topic that has gained research interest. The article “Smart contact lenses for health and head-up displays” catalogs the litany of research projects underway in the field of contact lens enhancement. Sensimed already has a contact lens with a surface-mounted strain gauge to assess glaucoma risk. Those medical measuring lenses have lacked a display.

The previous comment points to a page that describes the Bates method for improving vision.

The Bates method is an alternative therapy aimed at improving eyesight.
Despite continued anecdotal reports of successful results, including well-publicized support by Aldous Huxley,[2] Bates’ techniques have not been shown objectively to improve eyesight,[3] and his main physiological proposition – that the eyeball changes shape to maintain focus – has consistently been contradicted by observation.[4] In 1952, optometry professor Elwin Marg wrote of Bates, “Most of his claims and almost all of his theories have been considered false by practically all visual scientists.”

Tomi Engdahl says:

In December, rumors spread that Google was finishing up a prototype on high-tech glasses known as wearable head-up displays (HUD) that could tap into Google’s cloud-based location services and detail users’ surroundings. The information would then appear as an augmented reality computer display.

According to 9to5Google, these glasses can do all that and more. The glasses will reportedly have an extremely small front-facing camera with a flash to gather information and take photos, a navigation system that is used by head-tilting to scroll and click, I/O for voice input and output, and CPU/RAM/storage hardware nearly equivalent to a generation-old Android smartphone.

Tomi Engdahl says:

Epson launched its BT-100 media viewer glasses in Japan back in November, and now the Android-powered translucent visor is available in the USA. The headset uses micro projectors to create a 960 x 540 qHD display in front of the wearer’s eyes, which appears as an 80-inch image floating 5 meters away, while still allowing them to see the world around them. The dual projectors mean that it’s able to display 3D content, too.

The headset connects to a controller with a touchpad and the standard Android buttons, along with Wi-Fi connectivity that allows you to browse the web, view media, or even (as Thomas Sohmers demonstrated earlier this month) control Parrot’s AR.Drone quadrocopter.

A long-rumored Google project, the Project Glass augmented reality glasses were unveiled today by Google on a new Google+ page. The project is specifically from Google X, the company’s “secret lab” focused on long-term projects. These early videos and images show an augmented reality concept that’s deeply integrated with all of Google’s services, with voice commands, video chat, location check-ins, maps (outside and in-store), and much more.

Google is providing the first public glimpse of the augmented-reality glasses being developed by its Google X research group under the name “Project Glass.”

Why unveil them now? The New York Times and Reuters say the Project Glass team will be testing prototypes in public. The New York Times reported previously that the glasses would go on sale this year, but the Google team doesn’t mention that possibility.

What do you think? Would you use these glasses in public, and how much would you pay for them?

Okay, this may not be the best-looking pair of glasses on the market right now, but what they lack in style, they make up for in jaw-dropping awesomeness.

This is the Epson Moverio BT-100 headset, the world’s first Android-based wearable display. You can browse the web with these glasses, watch movies, listen to music, and more.

They’re not cheap, priced at $699.99.

To start things off, the BT-100 includes Wi-Fi connectivity, so users can get online if and when service is available. The whole system runs on the Android 2.2 (FroYo) platform and includes Adobe Flash player for easy web browsing and better app speeds.

Users can toggle between 2D and 3D on a qHD screen (one quarter of a full-HD display). Using pioneering micro-projection technology and advanced imaging techniques, the perceived image that the glasses display is incredibly large: it’s equivalent to watching an 80-inch display from a little over 16 feet away.
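Claims like “an 80-inch display from 16 feet” can be sanity-checked with basic trigonometry: what matters to the eye is the angle the image subtends. A minimal sketch (the function and constants here are mine, not Epson’s):

```python
import math

INCH = 0.0254  # meters per inch

def visual_angle_deg(size_m, distance_m):
    """Angle subtended by an object of a given size at a given distance."""
    return math.degrees(2 * math.atan((size_m / 2) / distance_m))

# 80-inch diagonal viewed from 5 m (roughly 16 feet), per Epson's claim
angle = visual_angle_deg(80 * INCH, 5.0)
print(f"Perceived diagonal: {angle:.1f} degrees")  # about 23 degrees
```

Any screen/distance pair that subtends the same angle would look the same size, which is why such “virtual screen size” claims always come paired with a viewing distance.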

The sides of the glasses are semi-transparent, so users can enjoy what they’re watching but still see what’s going on in the real world if need be.

Also, 3D images can be blended into the user’s surrounding environment on a “floating” see-through display.

Google’s augmented reality eyewear is coming to disrupt your face and your business model. If you don’t even have to pull your phone out to take a photo, get directions, or message with friends, why would you need to buy the latest iPhone or spend so much time on Facebook?

It could be a year before Google eyewear reaches stores, but that’s why these and other tech companies need to strategize now. If they wait to see if the device is a hit, the world could be seeing through Google-tinted glasses by the time they adapt. Apple and Facebook’s bet might be to team up…

Cramming all the functionality into a sleek set of glasses is going to take time and effort, but the Google(x) skunkworks is on it. There are a dozen ways the product could flop, most obviously if the glasses are awkward and unstylish, but also if they’re too heavy, expensive, or fragile, or if the world is just not quite ready. Let’s forget those for a second. Say Google figures it out and the retail version of Project Glass (which may end up being called Google Eye) becomes wildly popular. How will this disrupt Apple and Facebook, and what should they do to defend themselves?

Larry Page and Sergey Brin have long had the dream of a hands-free, mobile Google, where search was a seamless process as you moved around the world. As the years progressed the vision did, too, expanding beyond search to persistent connections with the people in your life.

Google is revealing that it is taking concrete steps towards that vision with Project Glass, an augmented reality system that will give users the full range of activities performed with a smartphone – without the smartphone. Instead, you wear some sort of geeky prosthetic.

The concept video for the Glass project concentrates on the cool things you may do with it one day — create instant contact with friends, monitor feeds about weather and other info, get information about a subway station out of service, receive turn-by-turn directions on the way to a destination, snap a picture by command, even find your way to a certain tome in the labyrinthine Strand bookstore. Everything works perfectly because, well, it’s a concept video and not a depiction of something that’s actually perfected.

ScienceDaily (May 13, 2012) — Using tiny solar-panel-like cells surgically placed underneath the retina, scientists at the Stanford University School of Medicine have devised a system that may someday restore sight to people who have lost vision because of certain types of degenerative eye diseases.

Macetech is in the business of coming up with novel uses for LEDs. They were displaying a proof-of-concept pair of LED glasses at Maker Faire last weekend that’s notable for two reasons. The first is that the wearer can see relatively well through the slits in the “glasses”: the LEDs, which are so bright to the external viewer, are completely blocked by the thin strips they are mounted on. (These prototype glasses are made of sturdy cardboard.)

Folks have been clamoring for more on Google’s Project Glass, and Sergey Brin, one of Google’s co-founders, is now burying himself in the R&D department associated with its development. Recently Brin appeared on ‘The Gavin Newsom Show’ with the prototype glasses perched on his face.

However, Brin dropped a bomb when he stated that he’d like to have the glasses out as early as next year.

Google’s Project Glass product lead Steve Lee walks us through his experience with the development of the company’s sci-fi-inspired eyewear, from his team’s “hundreds of variations and dozens of early prototypes” to his vision of the future.

“Something like this has never been created before,” says Steve Lee.

Following Google’s announcement of Project Glass, Fast Company talked extensively with Lee to learn more about how his team is turning this work of science-fiction into a reality.

A retinal implant has given a brief glimpse of light to a small number of blind people, and could one day be a common treatment for vision loss due to injury or disease.

Shawn Kelly, a senior systems scientist at Carnegie Mellon University, has developed a computer chip that translates camera images into electrical pulses that the nerves inside the brain can understand. The result is vision.

The cameras are incredibly small and mounted to a pair of glasses. The digital information picked up from the camera is sent along a wire to a thin film surgically implanted in the back of the patient’s eye, between the sclera and the retina. The electrical signals stimulate the nerves in the retina, and that allows the patient to see. The system is powered via induction.

It’s a far cry from the bionic eyes of science fiction, though. The resolution is only 256 pixels total, because that’s how many electrodes can be made to fit on the back of the film.

Ordinary human vision involves approximately 1 million optic nerve fibers and more than 100 million rod and cone cells. But it is something.
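To get a rough feel for what 256 electrodes means, one can downsample an image to a 16 x 16 grid (256 = 16²). This NumPy sketch is purely illustrative, with a synthetic frame standing in for the camera feed; it is not the actual processing chain described in the article:

```python
import numpy as np

def downsample_to_grid(frame, grid=(16, 16)):
    """Average-pool a grayscale frame down to an electrode-sized grid."""
    h, w = frame.shape
    gh, gw = grid
    # Crop so the frame divides evenly, then average each block
    frame = frame[: h - h % gh, : w - w % gw]
    blocks = frame.reshape(gh, frame.shape[0] // gh, gw, frame.shape[1] // gw)
    return blocks.mean(axis=(1, 3))

# Synthetic 480x640 "camera" frame: a bright vertical bar on a dark field
frame = np.zeros((480, 640))
frame[:, 280:360] = 1.0

grid = downsample_to_grid(frame)
print(grid.shape)  # (16, 16) -- i.e. 256 "pixels"
```

Even this crude pooling makes the point: a bar survives as a couple of bright columns, while fine detail such as facial features is averaged away, which matches the article’s description of patients seeing shapes and outlines but not much more.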

A new bionic eye implant could allow blind people to recognize faces, watch TV and even read. Nano Retina’s Bio-Retina is one of two recent attempts to help patients with age-related macular degeneration, which affects 1.5 million people in the U.S.

Although a similar implant, Second Sight’s Argus II, has been on the market in Europe since last year, it requires a four-hour operation under full anesthesia because it includes an antenna to receive power and images from an external apparatus.

Because of Bio-Retina’s compact size, an ophthalmologist can insert it through a small incision in the eye in 30 minutes—potentially more appropriate for seniors. The Bio-Retina will generate a 576-pixel grayscale image. And clinical trials could begin as soon as next year.

1. DON GLASSES

Ordinary-looking glasses contain a battery, a power-delivering laser apparatus, and working lenses.

2. SHINE LASER POWER

The near-infrared laser beam, gentle enough to shine harmlessly through the eye onto the implant, provides up to three milliwatts of power to a photovoltaic cell on the eye implant.

3. CAPTURE IMAGE

4. TRIGGER NEURONS

Six hundred needle electrodes (wrapped in biocompatible silicon and sapphire to prevent the formation of scar tissue) penetrate the retina. Each electrode represents one pixel, sending pulses of electricity to stimulate the eye’s neurons, which transmit the image to the brain.

Skiers, snowboarders, and snowmobilers now have a real-time way of tracking their performance and capturing video, thanks to a new head-up display designed to fit inside a set of snow goggles.

The micro-display, used in goggles built by six different manufacturers, also incorporates GPS and Bluetooth technology, and has the ability to pair with Android smartphones for additional real-time connections. “While you’re skiing, you can take a glance with your right eye to see how fast you’re going or to look at a map,” Xichi Zheng, director of systems engineering for Recon Instruments, told us. “You can also be notified when a call comes in from your smartphone.”

New gadgets — I mean whole new gadget categories — don’t come along very often. The iPhone was one recent example. You could argue that the iPad was another. But if there’s anything at all as different and bold on the horizon, surely it’s Google Glass.

This idea got a lot of people excited when Nick Bilton of The New York Times broke the story of the glasses in February. Google first demonstrated it in April in a video.

Now, Google emphasized — and so do I — that Google Glass is still at a very, very early stage. Lots of factors still haven’t been finalized, including what Glass will do, what the interface will look like, how it will work, and so on. Google doesn’t want to get the public excited about some feature that may not materialize in the final version. (At the moment, Google is planning to offer the prototypes to developers next year — for $1,500 — in anticipation of selling Glass to the public in, perhaps, 2014.)

Google has said that eventually, Glass will have a cellular radio, so it can get online; at this point, it hooks up wirelessly with your phone for an online connection.

The biggest triumph — and to me, the biggest surprise — is that the tiny screen is completely invisible when you’re talking or driving or reading. You just forget about it completely.

Tomi Engdahl says:

New technology that will allow information, such as text messages from a mobile phone, to be projected onto a contact lens worn in the human eye has been developed by Belgian researchers.

Ghent University’s centre of microsystems technology has developed a spherical curved LCD display which can be embedded in contact lenses and handle projected images using wireless technology.

“Now that we have established the basic technology, we can start working towards real applications, possibly available in only a few years,” said Professor Herbert De Smet.

“This is not science fiction,” said Jelle De Smet, the chief researcher on the project, who believes commercial applications for the lenses will be available within five years.

“This will never replace the cinema screen for films. But for specific applications it may be interesting to show images such as road directions or projecting text messages from our smart phones straight to our eye.”

Soon, retinal implants that fit entirely inside the eye will use nanoscale electronic components to dramatically improve vision quality for the wearer, according to two research teams developing such devices.

Retinal prostheses on the market today, such as Second Sight’s Argus II, allow patients to distinguish light from dark and make out shapes and outlines of objects, but not much more.

This device was the first “bionic eye” to reach commercial markets. It contains an array of 60 electrodes, akin to 60 pixels, that are implanted behind the retina to stimulate the remaining healthy cells. The implant is connected to a camera, worn on the side of the head, that relays a video feed.

A similar implant, made by Bionic Vision Australia, has just 24 electrodes.

Google has started accepting consumer input on how Google Glass can be used in the real world.

Google is taking applications from consumers who want to take part in a trial programme for the augmented reality glasses. Previously early models of Glass were only available to developers interested in the devices.

“We’re looking for bold, creative individuals who want to join us and be a part of shaping the future of Glass,” wrote Google in a Google+ post on the announcement.

On Wednesday at the University of California, San Francisco’s Mission Bay medical campus, the Facebook CEO met up with Google cofounder and special projects head Sergey Brin.

“I can’t wait to get my own,” said Zuckerberg of Glass, while Brin tried to adjust his own pair on the Facebook CEO’s head.

In the past, Zuckerberg has been wary of commending Google, which some see as a natural rival to Facebook in the battle for corporate Silicon Valley hegemony. Late last month during Facebook’s fourth quarter earnings call, Zuckerberg noted of his company’s connection to Google: “Even though our relationship isn’t one where the companies really talk, we are able to do a bunch of things and build some great experiences.”

On Wednesday, “the companies” talked. And they talked about developing for Google Glass.

While Zuckerberg excitedly tried on the device, he made sure that few photos were snapped. Despite being the CEO of a social network of more than a billion individuals who share pictures and status updates every second, he asked that the handful of observers only take private photos of him and Brin.

From a Facebook perspective, those videos and photos on Brin’s device are likely the perfect type of content to share on their network. And with a working Facebook integration, it would presumably take the flick of a finger or the quick shift of an eye to share those memories from the spectacles to a social network, whether it be Facebook or rival Google+. Zuckerberg, though, said he had no immediate vision of an application in mind but said to Brin: “Is there anything specific you want us to be trying? If so, I want to be doing that.”

The Facebook CEO was undeterred, constantly reminding Brin of how excited he was to get his own pair. Unlike the handful of early adopters, however, Zuckerberg will not have to submit any application or pay any fee.

Google has posted a video on YouTube of a presentation Developer Advocate Timothy Jordan gave at the 2013 South by Southwest Interactive conference. In this video, he gives the audience and viewers a preview of its Google Mirror API, which is being used to help build services for its Glass project.

Glass will only use slower 802.11b/g Wi-Fi, likely to save energy and chip size by eliminating 802.11n (or 802.11ac). Perhaps most importantly, Google says the display will be the equivalent of a 25-inch HD screen viewed from about 8 feet away. That’s a good way to imagine the Glass experience.

Every move that Google has made regarding its wearable Glass computer has been closely scrutinized, particularly the staggered, picky rollouts of the device, most recently with Glass Explorer applicants. We were generally led to believe that the device would be made available to the masses later this year, but new comments from Google’s chairman indicate otherwise.

So, according to Schmidt, rather than being made available to consumers later this year as promised, most of us probably won’t get a chance to use the device until sometime in the spring of 2014.

Tomi Engdahl says:

The author behind the infographic, Martin Missfeldt, concludes that the biggest challenge Google now faces with this device is making it usable for people who wear normal glasses. Right now, Google Glass needs to be placed in front of normal glasses, which neither looks good nor feels comfortable for the user. Missfeldt suggests that Google might have to manufacture individually customized prisms for these users, but that could wind up being a super costly endeavor.

If you think Google Glass looks a little weird now, take a look at how it got its start.

During a fireside chat at Google I/O on Thursday, Isabelle Olsson, a senior industrial designer on the Glass team, showed off how Glass looked originally: a scuba mask with a computer board attached to it.

The future came crashing down on me this week at the Google I/O developer conference while I stood at a bathroom urinal.

I had just wrapped up a conversation with a man who owned a pair of Google’s Internet-connected glasses, Google Glass. He had explained that one of the gadget’s greatest features is the ability to snap a photo with a wink. “It’s amazing, you just look at something, wink your eye and it just takes a picture,” he said enthusiastically.

Everywhere I looked at the conference, people were wearing Google Glass. Hundreds of them. Maybe more than a thousand! They were on the escalator. At the coffee stations. Press lounges. Lingering in the hallways like gangs of super nerds. They looked like real people as they nibbled on M&M’s and nuts at the snack bars. Except they weren’t; these “humans” were able to take pictures with their eyes and then post them to the Internet.

The developers present who didn’t own the company’s augmented reality glasses stared at those who did with awe.

Often, Google Glass owners looked strange. Many were using their cellphones while wearing the glasses — defeating a declared purpose of the new gadget, to free you from having to look at your phone.

The US Patent and Trademark Office has published a pile of Google patent applications describing a contact lens with embedded microchips that operate a microcamera, chemical sensors that detect changes in the makeup of tears, interfaces that would allow the contacts to connect with Android devices or smart cars, and a command protocol that would let users tell the contacts what to do using a pattern of blinks.
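The patent applications give no implementation details, but the blink-command idea is easy to picture: classify each blink as short or long by its duration, then match the resulting sequence against a command table, Morse-code style. A purely hypothetical sketch (all thresholds and command names are invented here, not taken from the patents):

```python
# Hypothetical blink-pattern decoder. Thresholds and commands are
# invented for illustration; Google's patents give no such details.
COMMANDS = {
    ("short", "short"): "capture_photo",
    ("short", "long"): "read_sensor",
    ("long", "long"): "power_off",
}

def classify(duration_ms, long_threshold_ms=400):
    """Label a single blink as 'short' or 'long' by its duration."""
    return "long" if duration_ms >= long_threshold_ms else "short"

def decode(blink_durations_ms):
    """Map a sequence of blink durations to a command, or 'no_op'."""
    pattern = tuple(classify(d) for d in blink_durations_ms)
    return COMMANDS.get(pattern, "no_op")

print(decode([150, 600]))  # short blink then long blink -> "read_sensor"
```

A real implementation would also have to reject involuntary blinks, which humans produce every few seconds, presumably by requiring deliberate patterns unlikely to occur naturally.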

University of South Australia associate professor Drew Evans has created proof-of-concept work that could in the future lead to computerised contact lenses.

The conducting polymer lens is an early step into what could lead to circuitry being etched into contact lenses.

The work is a combination of the university’s Future Industries Institute research into the PEDOT polymer and lenses developed by a UK contact lens firm understood to be Contamac.

“We have been researching in this area for the last decade in the military and automotive industries, but this is the first time we have been able to bring our polymer into contact lens technology,” Evans told Vulture South.

“It is a milestone simply because the polymer is biocompatible, meaning the body finds it friendly.

“There have been companies over the last decade building circuitry into lenses but that is using materials like copper – you’d be taking a big risk to stick that in your eye.”

The PEDOT polymer was first discovered in the late 1970s by three scientists who later won the Nobel Prize in 2000. Evans’ work improves the polymer’s properties for use in wearable devices.