The video starts with an overview of the newly launched Amazon Spark, which combines the discovery and inspiration features of Pinterest with the UX style of Instagram, built on a foundation of a shoppable, personalized feed for iOS users.

The discussion shifts to unstructured social data and Amazon’s relevance algorithms, the role of image recognition, and the power of visual discovery as a foundation for additional data exhaust and progressive profiling.

Next, the analysis reviews the role of social in Spark and the evolution of reviews, from UGC, influencers, and social currency to a theory tied to rumors of conversational commerce via a potential messaging platform. Spark could be Amazon’s answer to a scalable campaign platform driven by word of mouth.

I had the privilege of speaking at and hosting Epsilon Symposium 2017. With hundreds of clients in attendance, I was tasked with discussing the role of artificial intelligence across Epsilon and Conversant, and with teasing examples of emerging technology that my team and I are working on.

Here are the key highlights I discussed during Symposium 2017. Part of my role is evaluating and embracing the latest innovations and determining how they connect to our Epsilon and Conversant solutions.

For the last ten years I have talked about how disruption is the new normal: how emerging technology can impact consumer behavior and what it means for marketers.

Today we are at an inflection point, where we are seeing the shift from mobile first to AI first. It’s less about disruption and more about acceleration through intelligent systems.

That’s where Epsilon and Conversant’s heritage of aligning data and technology and driving innovation is the key to leveraging whatever the future may bring and meeting consumers wherever they will be.

Within the agency business, we are using machine learning to categorize the data of culture along with our data of identity to fuel our creative approach.

From a product perspective, we are also achieving harmony (pun intended) through machine learning and AI via a centralized intelligence hub for decisioning across channels.

Finally, Conversant is at the forefront of integrating AI through machine learning and image recognition to deliver world-class speed and scale: every 5 minutes, consumer actions across 160M individual profiles drive over a billion model updates.

Regardless of how the future state shifts and evolves, be it through bots becoming agents on our behalf, consumer-based journeys expanding to include system-based journeys, or a hyper-connected augmented reality future, all of those elements will be highly dependent on data and decisioning as the foundational element.

Included in the analysis is the reveal of the Xbox One X and why it’s relevant for marketers. It also briefly covers some of the game reveals, the role of gaming in connecting with Gen Z (including eSports), affinity alignment, and more.

When I think of Apple, 3 things come to mind: the industrial design of its hardware, interoperability across products, and of course millions of apps. After WWDC 2017, I need to add artificial intelligence (AI) enabled experiences, device-level privacy, and a new focus on augmented reality.

AI was the key theme of WWDC (mentioned 20 times in 2.5 hours). Apple highlighted how both machine learning and deep learning are now integrated across multiple products, from Apple Watch and Siri to facial recognition in Photos and even handwritten notes in iOS 11. AI-integrated experiences were one of the more important areas discussed during WWDC.

WWDC also saw a new hardware launch in the form of the HomePod, Apple’s entry into the smart speaker market. While Siri is integrated into the device, it remains to be seen what role it can play for brand marketers, as the skills and actions we have begun to depend on in other product ecosystems were surprisingly absent.

Apple is also investing heavily in enabling augmented reality experiences through hardware and software. With the launch of ARKit, their strategy is to empower millions of developers to take their AR building blocks and create immersive experiences that are closely mapped to the real world via world tracking for both 2D and 3D elements.

Apple is building a foundation for the future built on device-level privacy, artificial intelligence, augmented reality, and multimodal computing, evolving Siri beyond handsets into cars and the home with HomePod.

This week I had the opportunity to attend the Google I/O conference in Mountain View, California. It was an incredibly compelling event as Google shifted their focus as a company from mobile first to AI first. This means that all products will be redefined and enhanced through various forms of AI.

This includes the Google Assistant, which was the star of the show. The deck goes into detail, but it’s incredibly important that we begin thinking about the role that the Google Assistant plays across home, smartphone, wearables, auto, and soon AR. With the iPhone launch announced at the conference, Assistant reaches 200 million voice-enabled devices out of the gate.

What is also key to consider is the Google Assistant equivalent of an Alexa Skill, called an Action by Google. Actions can support transactions outside of Amazon and do not require installation. While only a small number of Actions exist today, there is a huge and rapidly growing ecosystem of Google Assistant-enabled devices.
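Conceptually, both Alexa Skills and Google Actions follow the same fulfillment pattern: the platform posts a JSON intent request to a developer’s webhook, which returns the speech to render on the device. The sketch below is a hypothetical, simplified illustration of that pattern; the intent and field names are placeholders, and real Alexa and Actions requests each use their own, richer JSON schemas.

```python
import json


def handle_intent(request: dict) -> dict:
    """Minimal fulfillment sketch: map an intent name to a spoken reply.

    Field names ("intent", "speech") are simplified placeholders, not the
    actual Alexa Skills Kit or Actions on Google schemas.
    """
    intent = request.get("intent", "unknown")
    replies = {
        "welcome": "Hi! How can I help?",
        "order_status": "Your order is on the way.",
    }
    speech = replies.get(intent, "Sorry, I didn't catch that.")
    return {"speech": speech}


# Example request, as a platform might POST it to the webhook
incoming = json.loads('{"intent": "order_status"}')
print(handle_intent(incoming)["speech"])  # prints: Your order is on the way.
```

Because the voice platform hosts the interaction model, the consumer never installs anything: invoking the Action routes the parsed intent straight to fulfillment, which is the "no installation" advantage noted above.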

Here is the full trend recap and analysis:

Section one covers trends tied to connection & cognition:

Vision of Ubiquitous Computing

Multi-Modal Computing

Google Assistant (Actions, Auto, Computer Vision, Wear)

Android O

Progressive Web Apps

Structured Data & Search

Section two covers all facets of immersive computing:

Immersive Computing

Daydream (Virtual Reality)

Social VR

WebVR

Visual Positioning Services

Tango (Augmented Reality)

WebAR

In addition to the attached recap, there is also a 4 minute “light recap” video:

For third-party commentary, I discussed the role of Google Lens and computer vision with AdExchanger here

I look forward to Facebook’s F8 developer conference each year. It’s a great opportunity to see how Facebook is prioritizing and adjusting their 10 year road map based on shifting consumer behavior and new advancements in technology.

What was fascinating about this year’s conference is the rate at which they are accelerating the convergence of technologies that connect us, immerse us in new virtual worlds, and advance innovation well beyond what we would expect from a company that identifies itself as social first.

Facebook wants to redefine how we think about reality and the not too distant future when all reality is augmented and virtual. The following provides analysis across the consumer centric filters of connection, cognition and immersion.

Connection – Trends that reimagine how we connect, enable and empower consumers

Cognition – Trends where machine based intelligence will disrupt and redefine data assets and how we work

Immersion – Trends that align technology and presence to evoke emotion, entertain and power commerce

Here are a few examples of the 15 territories analyzed, starting with:

The Camera as the First Augmented Reality Platform – Facebook understands that in order to truly create scale, the key is to empower consumers, developers and other 3rd parties to create experiences on their behalf. Consumer empowerment is powerful and will accelerate adoption and ultimately shift consumer behavior toward a new normal.

The democratization of augmented reality (AR), powered by advancing artificial intelligence (AI), has the potential to redefine advertisers’ approaches to content marketing, making it less about content and more about enabling experiences through compelling and contextually relevant effects.

Frames & AR Studio – Two sets of tools comprise the new Camera Effects Platform. The Frames Studio allows for quick creation and deployment of effects that can enhance an image, a video, or even a Facebook Live stream. This platform allows artists, creators, and brands to create frames that can be distributed using Facebook’s targeting capabilities.

The AR Studio is where it’s possible to create lightweight AR effects that can be developed and enhanced with elements such as real-time data to build highly contextual AR experiences. This is where brand marketers have an opportunity to align data + experiences.

Gaming & eSports

The convergence of gaming and video has been a massive trend over the past 24 months. 2B people play games each month, and the rise of game streams means 665M people now watch other people play games.

On Facebook people watch, play & create. Facebook’s gaming video product supports eSports (14-31% of live gaming consumption), developers, gaming entertainers and social connection for consumers of game stream content.

Gaming content is digitally native, with real-time interactivity baked in. With gaming video, the audience is more than a spectator; they participate in the experience by commenting and getting involved in the gameplay.

Messenger 2.0 – 2016 was considered the year of the bot, primarily fueled by Facebook’s Messenger beta, which accelerated the development of a bot ecosystem to further enhance the Messenger experience.

In 2017, Facebook is positioning Messenger as Messenger 2.0, with a sharp focus on integrating other services via chat extensions, which give 3rd party bots the ability to seamlessly connect services such as Spotify or Apple Music.

Facebook is also keen on driving discovery among the 100,000 bots now on the platform via the new discover tab.

Data Design & Artificial Intelligence

Facebook is focused on leveraging multiple facets of Artificial Intelligence to power their products and accelerate 3rd party ecosystems.

Facebook’s ultimate goal is to develop intelligent systems that go beyond computer vision and truly understand the world. This will then converge with their vision of an AR driven future to create a unified experience.

The Rise of Proxies – In the very near future, we as consumers will have intelligent systems serving the role of a proxy. Facebook is betting on M to first serve as a virtual assistant and eventually become a predictive service that is the foundation for their virtual computing future.

M will integrate into multiple facets of a user’s life, from sharing location to recommendations. In the near future, M can become the connection between a recommendation and an AR object-recognition action.

Virtual Reality & Facebook Spaces – Facebook officially launched Spaces for Oculus. This was first teased at F8 last year and the experience has definitely advanced from the grainy avatars from a year ago.

Facebook took research and learnings from Oculus Rooms via the Samsung Gear and refined an experience that lets your virtual avatar interact with Facebook content and friends in a virtual environment.

From virtual selfies to watching 360 video, it’s very clear that Facebook is focused on creating a new form of social interaction via a virtual environment.

The Future – Facebook took the first major step in achieving their 10 year goal of fully immersive augmented reality by launching the camera as their first augmented reality platform.

On day 2 of the conference, they outlined in detail how they view transparent glasses (deemed more socially appropriate) or some equivalent, paired with a general artificial intelligence system, enhancing our daily lives.

This includes improving memory, cognition, and recognition, and redefining how we interact with the physical world and collaborate with one another.

Here is the full recap consisting of all 15 territories analyzed, plus implications for brand marketers to consider based on the trends identified.

A big thank you to the Epsilon corporate communications team, DGC and Advertising Age judges. I am truly humbled by the inclusion with such a great list of industry innovators.

I am incredibly grateful to my data design, strategy, and innovation teams. From research, planning, data design, digital strategy, digital experience delivery, social, and innovation: a huge thank you for all that you do.

I also want to thank Richard McDonald and the Epsilon agency leadership team for your continued support. Richard, it was your vision that sold me on joining Epsilon, and it’s one of the best career decisions I have made.

Finally, a very special thank you to my amazing wife Cherlyn for supporting all the crazy hours and travel for the past 17 years.

BlackFin360 Archives

Tom Edwards, Ad Age Marketing Technology Trailblazer and Chief Digital Officer, Agency @ Epsilon, analyzes best practices and points of difference between Google Actions across the Google Assistant ecosystem and Amazon Alexa Voice Services.

In this video, Tom compares and contrasts Amazon Alexa Skills with Google Actions, discusses feature differences, outlines best practices for deploying skills and actions, and covers key points to consider before submitting for final approval.

Tom also discusses driving skill and action discovery as well as strategic thoughts tied to going beyond tactical utility towards full ecosystem considerations.

In this video, Tom analyzes the new features that are available as well as discusses topics such as the shift towards social messaging, the role of YouTube’s Uptime application and a preview towards the world of immersive co-viewing with YouTube in virtual reality.