
I had the privilege of presenting a keynote to the Dr. Pepper Snapple Group marketing organization today on data and the evolution of experience during their Media Masters event.

The first half of the hour-long talk explored how Epsilon Agency approaches data and data design: our work with structured data, our view on actioning data by mapping attributes to dimensions, and multiple case studies drawn from our unstructured data and machine learning work.

The presentation then shifted into the full Evolution of Experience (E^3) talk, built around Empower, Exponential and Enhance, and discussed how we will evolve from inputting into technology to our environment adapting to us.

Dr. Pepper Media Masters

Empower looks at how access to mobile technology has empowered consumers to create, amplify and influence across generations.

Exponential is about acceleration through intelligent systems: the rise of virtual assistants that can predict consumer needs and ultimately act as a proxy for the individual, forever altering the path to purchase.

Enhance is about bridging the physical and the digital: immersive computing, AR, VR and computer vision will make the user's camera intelligent, forever changing the retail experience.

The talk ends with an explanation of how we will evolve from a mobile-centric world to a new normal of voice, vision and touch experiences powered by AI, including a date when it will all converge.

It was a great crowd, and I really enjoyed the hour with such a highly engaged and interactive group.

If you are interested in having Tom speak at your event, please contact him here.

This week I had the opportunity to attend the Google I/O conference in Mountain View, California. It was an incredibly compelling event, as Google shifted its focus as a company from mobile first to AI first. This means that all of its products will be redefined and enhanced through various forms of AI.

This includes the Google Assistant, which was the star of the show. The deck goes into detail, but it is incredibly important that we begin thinking about the role the Google Assistant plays across home, smartphone, wearables, auto and, soon, AR. With the iPhone launch announced at the conference, the Assistant starts with 200 million voice-enabled devices out of the gate.

Also key to consider is the Google Assistant equivalent of an Alexa Skill, called an Action by Google. Unlike Skills, Actions can support transactions outside of Amazon and require no installation. While only a small number of Actions exist today, the ecosystem of Google Assistant-enabled devices is huge and rapidly growing.

Here is the full trend recap and analysis:

Section one covers trends tied to connection and cognition:

Vision of Ubiquitous Computing

Multi-Modal Computing

Google Assistant (Actions, Auto, Computer Vision, Wear)

Android O

Progressive Web Apps

Structured Data & Search

Section two covers all facets of immersive computing:

Immersive Computing

Daydream (Virtual Reality)

Social VR

WebVR

Visual Positioning Services

Tango (Augmented Reality)

WebAR

In addition to the attached recap, there is also a four-minute "light recap" video:

For third-party commentary, I discussed the role of Google Lens and computer vision with AdExchanger here.