The Syndiant 4K UHD 0.55” LCOS Light Modulation Panel caught my eye! In more ways than one. The SYL2341 manages to provide excellent image quality with vivid colors while meeting the cost and power requirements of portable applications. This ultra-small yet high-definition technology enables a bright and rich large-screen experience in both near-eye and high-brightness projection projects. The eyewear and projector demos at their booth at CES were very impressive indeed! Using small display panels plus optics to create video eyewear has been around for a while. But given their display architecture, which integrates all-digital smart electronics onto the display panel, and the brightness and resolution they have achieved, it is worth watching this space for all sorts of potential portable immersive experiences. I will be on the lookout!

I have noted BeBop's sensing fabric for a few years now and knew that it would find its way into some valuable application spaces soon, principally being applied to wearables such as shoe insoles, sleeve controls, etc. But it was this year's featured product that Caught My Eye - an interactive input glove for VR. The fabric specification was impressive, but it's the Touch and Haptics AI that makes this a potential component for many applications. So I was very pleased to see the BeBop Sensors Forte Wireless Data Glove system, a fully featured, affordable data glove incorporating haptics and super accurate, rapid sensing for gaming and AR/VR environments for a more realistic VR experience. The hand tracking system tracks fingers and fingertips with haptics. The Forte Glove is an ultra-comfortable, one-size-fits-all glove, conducive to lengthy Augmented Reality and Virtual Reality sessions due to its light weight and open, airy design. Light and simple looking, it can also be worn inside a glove, for example. Sensing speed and accurately sensing finger movements are key features, along with tactile feedback to the fingertips. The glove provides super accurate, rapid sensing with data rates of 150 frames per second. With lag time eliminated, triggering has near instantaneous response times - perfect for the most demanding applications and games.

Looxid Labs explores the user's mind and their unspoken emotions! Using human physiological signal sensors, their VR headset monitors your eyes and brainwaves to determine your emotion, attention and interest while watching VR content. Two eye tracking cameras and six brain-wave sensors are built into the specialised VR headset. This theoretically allows the tracking of what the users themselves can't describe. Potentially valuable use cases!

This is similar to the well established MyndPlay VR in the UK, who have been doing this for a few years - but the MyndPlay brain sensors can be worn, and user response analysed, in conjunction with most existing VR headsets, and the actual VR content can be triggered to change depending on the user's emotions and reactions.

I am sometimes a skeptic about VR/AR gaming. It seems to be rather too techy and even often quite low resolution for my liking. And the storytelling and educational aspects often seem a bit lacking. But I would say that, being one who is always looking for the bigger story and valuable use cases rather than just for killing time. The Lenovo STAR WARS: Jedi Challenges technology and smartphone games application is of a different breed. Very clever how they use the smartphone at 45º in the headset in a way that lets you see the room you are in (visual see-through, not video) and then lets you play Star Wars Chess or challenge the various Star Wars characters to Light Sabre duels. But first, to qualify to fight, you need to learn how to use a light sabre! Light Sabre Training is also done well, and fun. The light sabre itself is stunning to look at and hold, and in your virtual vision and hearing it pops out just like in the movie. Lots of virtual characters and action in your living room or kitchen. The virtual Star Wars characters, and in fact everything in your line of sight, are life-sized and lifelike, and of course it is all taking place in the reality of your home setting. This is a great piece of technology, Lenovo, and a great gaming experience, Disney! Fun, fun, fun. Credit goes to the partnership with Disney (and Lucasfilm) to produce interesting, authentic and compelling experiences. Oh yes, and an amazingly reasonable price of $199 to boot. More, more, more please, Lenovo and Disney!

And in Lenovo's words:

Inspired by the Star Wars Holochess game shown in Star Wars: A New Hope, the Lenovo Mirage AR headset was built truly from the ground up to recreate an immersive augmented reality experience for Star Wars fans of all ages. Users were able to truly enjoy key moments in the Star Wars films - a new type of experience that had never existed until now. Star Wars: Jedi Challenges lets fans live their dream of battling with a Lightsaber in augmented reality and other in-game experiences through the Skywalker Lightsaber and Lenovo Mirage AR headset.

This product comprises three pieces of hardware: the Lenovo Mirage AR headset, the Lightsaber Controller and the Tracking Beacon.

* The Lenovo Mirage AR headset is an all-new augmented reality head mounted display (HMD) designed to give Star Wars fans an opportunity to recapture some of their favorite Star Wars moments.
* The Lightsaber Controller is a Lightsaber device that pairs with the Lenovo Mirage AR headset for fans to experience Star Wars in AR. The Lightsaber was intricately designed by Lucasfilm to be modeled after the Skywalker Lightsaber.
* The Tracking Beacon acts as a stable base and zero reference point for the sensors in the Lenovo Mirage AR headset.

Starting price $199 at Best Buy and also available at Lenovo.com.

MergeVR make some pretty amazing VR Goggles, but it was the Merge Cube that Caught My Eye. It is a Hologram you hold in your hand. A six-sided "virtual screen" for AR content which you can make yourself, or get on the Merge Cube Miniverse website.

The Merge Cube merges the physical and digital worlds using augmented reality technology and the powerful camera and sensors in your mobile device. According to MergeVR, it's the first object of its kind, and it's creating new ways for people to interact with technology. Now you can hold a galaxy in the palm of your hand, examine fossils and ancient artifacts like a real archaeologist, watch as a volcano erupts before your eyes, and play games in ways never possible before!

Pleased to see a very large and action-packed Humaneyes booth at CES, very much dedicated to AR/VR/3D360º capture and distribution. As with other application sectors, with their prosumer-priced camera commercially available, it is now all about relevant content, applications and widespread access to content. I was very happy to see the latest Humaneyes VUZE 3D360º camera on show, of course, as I covered it in a CES2016 WCME entry when they were also finalists and then won the Last Gadget Standing award that year.

But this year Humaneyes made some big announcements - firstly, an underwater housing and the software needed for their 3D360º camera. This is no mean feat given the difficulties of getting full-sphere capture and 3D stitching working underwater.

They also announced the selection by the National Geographic Channel of the VUZE camera to capture 3D360º footage inside the International Space Station. European Space Agency astronaut Paolo Nespoli will use a Vuze VR camera to document life on the ISS for a VR companion to National Geographic's upcoming series "One Strange Rock."

In addition, Humaneyes are also developing a hosting facility (“Humaneyes Zone”) for 3D360º content so VUZE users' content and projects can be stored and made available to others. The Humaneyes Zone is an open platform which takes VR storytelling to the next level, providing an end-to-end solution for prosumers. The site enables anyone, without any technical or programming knowledge, to quickly and easily create a VR website to showcase their VR content and to tell a story in 3D360º, whether it’s a tour of an apartment, a training session, a wedding or just showcasing a vacation.

So three biggies for Humaneyes this year - capturing 3D360º content underwater and in outer space, and hosting of 3D360º projects!

A good example of a "connected sensor" becoming a complete and very portable capability when used with a Smartphone App. The Third Eye wireless thermal attachment is a smart solution for thermal capability on the go. Paired with a Smartphone or Smartglasses, Third Eye delivers lightweight, hands-free thermal imaging - creating a multitude of use cases and putting them into the hands of anyone. The sensor can also be clipped to normal glasses to capture a first-person point of view on the Smartphone.

Third Eye turns smart glasses and mobile devices into advanced thermal vision systems for real-time viewing and analysis, or to capture a thermal video or still picture. It is even capable of live video streaming. The precise remote temperature measurements and scanning are ideal for many applications, such as in the medical, industrial and environmental sectors, plus night vision applications for professional and amateur uses such as hunting and search and rescue. Accuracy plus small size and simplicity of operation make this an enabler for many applications where thermal sensing would be valuable but previously impractical or simply not available.

Vuzix stole the show with the Best in Show AR Glasses at CES this year - the Vuzix Blade Smart Glasses. They were also declared so by all the major technology news publications and networks. A full-spec, non-tethered pair of glasses which will be targeted at the industrial/enterprise space due to the rich portfolio of applications that will already run on the Blade Smart Glasses (and the initial price). They are also real evidence that proper consumer-grade (style, price and function) AR glasses are just around the corner. Consumer application writers, start your engines! These are certainly a potential prosumer “item” now, with improvements in price and styling surely coming soon, I suspect. Vuzix also upgraded their well established M100 smart glasses, now used in many industrial applications, and showed a better, brighter M300 model complete with a suite of out-of-the-box applications for industry. Ready to turn on and deploy - as easy as Click - Connect - Collaborate.

These two products will now move us rapidly from the Smartglasses technology emphasis of the past to real-world applications and valuable use cases.

I went to BETT yesterday in London (Europe’s largest Technology in Education exhibition). Past years have been all about interactive whiteboards, then large touch screens (basically blackboard replacements and teaching aids). This year was much more about “Interactive Learning” and student involvement. BETT2018 was all about Robots, and learning how to code and having fun doing it - and being introduced into classrooms at an early age. I am 100% behind this, and at CES2018 (the Consumer Electronics Show in Las Vegas) I saw some really good examples of this too (for example ROOT). I have captured only some of the “Learning Coding through fun with Robots” examples here, but I am not sure how a teacher, or an educational curriculum decision maker, can choose from these. Lots of early stage stuff. This will no doubt mature a bit and become more focussed by next year.

Four other significant findings that also caught my eye at BETT:

1. If you follow the wearable technology, IoT, IoMe, IoCar, IoHome and AI areas as I do, you will have noticed the innovation in the Smartphone + Sensors area. I saw some interesting education tools in this area from Vernier and IDS Education (their distributor in the UK) at BETT2018: “Go Direct Sensors” that now work directly with Smartphones to let students apply technology to create real world solutions. These education curriculum tools are ready for prime time and caught my eye.

2. AR/VR/360º capture, content and storytelling. Not as predominant as I had hoped, as this is a “learn by actually being there” proposition. Devices are readily available, and the education-based content is coming, but these are early days of a big opportunity to learn in an experiential way. As with other application sectors, this is now all about relevant content and applications, and then blending that into an existing way of working. But it provides a very experiential way for students to learn - kind of like when, years ago, classes started using soundproof booths and headphones with mikes to individually learn languages. I was very happy to see the Humaneyes VUZE 3D360º camera on show, and its “Borderless Education Experience”. The VUZE camera is an affordable tool for students to create meaningful content and learn storytelling in this new medium. Humaneyes is also developing a hosting facility (“Humaneyes Zone”) for 360º content so student projects can be stored and made available to others. Well done Humaneyes for recognising the Educational sector and providing the tools for learning.

3. Also at CES2018 in Las Vegas this year there was a whole huge hall dedicated to two things - Robots (discussed above) and Drones. These technologies are progressing rapidly into seriously valuable application areas, and it is appropriate that emerging graduates learn the dynamics and skills needed to be productive in these emerging areas. The subject of Drones was not covered well at BETT, but I was very pleased to see Parrot exhibiting on this very topic with its Parrot Mambo Edu Kit. This really caught my eye and I will be looking for more at BETT 2019.

4. Innovations from Octagon Studios to learn by creating, building on their huge library of AR Cards for children of all ages to view, manipulate and interact with animals, cars, space ships, dinosaurs, etc.

I had a discussion with Hasbi Asyadiq, CEO of Octagon spinoff Assemblr World. Assemblr World is a Smartphone/Tablet App to “virtually” build things from a library of over half a million objects and then Augment the Reality of the real world with your creations. You can also work collaboratively and share your personal projects. This is potentially a learning tool to create, collaborate, share and place these virtual objects and environments into the real world around you, for viewing on your or others' smartphone or tablet screens in an "Augmented Reality" way. Looking forward to seeing this develop into a real learning tool.

This was one of the interesting discoveries that caught my eye at the JETCO Japan Pavilion. A Google Glass-like wearable display which could be carried like a pen in the pocket, and then "transformed" into a one-eye display with camera, battery and software on board. On the table was a working Vuzix M100, I suppose to show how the concept model could be used in various applications.

I found the concept interesting for someone who would use Smart Glasses off and on in the course of the day in their profession. Sort of like taking out your glasses to get a clearer, close-up look at a situation, but in this case with image capture, remote assistance and relevant information on tap.

A nice concept, but I'm not sure of the usability aspects.

I say, let's just move on quickly to the next phase - normal looking glasses with lenses that augment your vision with relevant info, and do a bit of life blogging and sharing at the same time!