The first speaker, David Luan @Dextro, described his company, which uses deep learning techniques to summarize and search videos by the categories and items that appear in them. The goal is a real-time, automated method for understanding how consumers view videos.

They describe each video with a salience graph showing the important themes in the video and a timeline of when concepts/items appear.
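As a rough illustration of that output format, here is a hypothetical sketch of a salience graph plus timeline. The class, field names, and numbers are all illustrative assumptions, not Dextro's actual API:

```python
from dataclasses import dataclass, field

@dataclass
class VideoSummary:
    # concept -> salience weight (how central it is to the video)
    salience: dict = field(default_factory=dict)
    # concept -> list of (start_sec, end_sec) intervals where it appears
    timeline: dict = field(default_factory=dict)

    def top_concepts(self, n=3):
        """Return the n most salient concepts, highest weight first."""
        return sorted(self.salience, key=self.salience.get, reverse=True)[:n]

summary = VideoSummary(
    salience={"hockey game": 0.9, "crowd": 0.5, "ice rink": 0.4},
    timeline={"hockey game": [(0, 120)], "crowd": [(30, 45), (90, 110)]},
)
print(summary.top_concepts(2))  # ['hockey game', 'crowd']
```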

Analysis of video is complicated because items are embedded in a context, and information needs to be summarized at the correct level: not so low as "there are ice skates, seats, lights, etc.," but at the level of understanding that this is a specific hockey game. They also aim to use motion cues to give context to the items and to segment the video into meaningful chunks.
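One classic way to segment video into chunks from motion cues is frame differencing: cut wherever consecutive frames change sharply. The sketch below is that textbook idea on toy grayscale "frames" (flat lists of pixel values), not Dextro's actual method:

```python
def segment_by_motion(frames, threshold=50):
    """Split a sequence of grayscale frames into chunks, cutting
    wherever the frame-to-frame change is large (a scene boundary)."""
    boundaries = [0]
    for i in range(1, len(frames)):
        # mean absolute pixel difference between consecutive frames
        diff = sum(abs(a - b) for a, b in zip(frames[i], frames[i - 1])) / len(frames[i])
        if diff > threshold:
            boundaries.append(i)
    boundaries.append(len(frames))
    # turn boundary indices into (start, end) chunks
    return [(boundaries[j], boundaries[j + 1]) for j in range(len(boundaries) - 1)]

# Two synthetic "scenes": three dark frames, then three bright frames
frames = [[10, 10, 10]] * 3 + [[200, 200, 200]] * 3
print(segment_by_motion(frames))  # [(0, 3), (3, 6)]
```

A real pipeline would work on decoded image tensors and likely learn the boundary detector, but the chunking logic is the same shape.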

They work with a taxonomy provided by the customer to build models around the units the customer wants.

David talked about the challenges of speeding up the computation using GPUs, and how they will eventually incorporate metadata and the soundtrack.

—

The second speaker, Sameer Maskey @FuseMachines, talked about how they use data science to improve customer service.

He talked about the treasure trove of data generated in prior customer service interactions. These can be analyzed to improve the customer experience by:

Improving the ability of customers to find solutions using self service

Empowering customer service reps with tools that anticipate the flow of the conversation

Sameer mentioned several ways that this information can assist in these tasks:

Expose information embedded in documents

Consider what the user is looking at and predict the types of questions the user will ask

Train customer service reps using previous conversations: new reps talk to the system and see how it responds.

On a call, the system automatically brings up documents that might be needed.

Three fundamental problems are important:

Data to score – ranks answers

Data to classes/labels – predict answer type

Data to cluster – cluster topics
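To make the first of these concrete, here is a toy stand-in for the "data to score" problem: ranking candidate answers against a question by word overlap. Real systems would use learned scoring models; the function and examples here are purely illustrative:

```python
def rank_answers(question, candidates):
    """Rank candidate answers by Jaccard word overlap with the
    question -- a toy scorer, not FuseMachines' actual model."""
    q = set(question.lower().split())
    def score(ans):
        a = set(ans.lower().split())
        return len(q & a) / len(q | a)  # Jaccard similarity
    return sorted(candidates, key=score, reverse=True)

candidates = [
    "Reset your router by holding the power button",
    "Billing questions are handled by our accounts team",
    "To reset your password click the forgot password link",
]
print(rank_answers("how do I reset my password", candidates)[0])
# To reset your password click the forgot password link
```

"Data to classes/labels" and "data to cluster" would swap the scorer for a classifier (predict the answer type) or a clustering step (group questions by topic) over the same interaction data.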

They currently do not have the sophistication to ask for further clarification or to start a dialog; for example, "when is my next garbage collection?" should be answered with the question, "what is your location within the city?"

—

Jake Porway @DataKind spoke about his program to use data for the greater good.

DataKind brings pro bono data scientists to non-profits to improve their understanding of data. They have had 10,000 analysts working on hundreds of projects. Projects include:

org – a Kickstarter-like site for NYC public schools soliciting online donations. They applied Semantics3 to automate the taxonomy, and can determine which types of schools ask for which categories of goods/services.

Crisis Text Line – teens text if they are in need. Note that 5% of users take up 40% of all services. They created a predictive model of when someone will become a repeat texter so they can intervene more quickly.
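A repeat-texter predictor of this kind is, at its core, a probability model over usage features. The sketch below is a toy logistic model whose features and weights are hypothetical placeholders; Crisis Text Line's actual model and features were not described:

```python
import math

def repeat_texter_probability(prior_texts, days_since_first, late_night_share):
    """Toy logistic model for 'will this person become a repeat texter?'.
    The weights are hand-set placeholders, not learned values."""
    # assumption: more prior texts and more late-night texting raise the risk
    z = -2.0 + 0.8 * prior_texts + 1.5 * late_night_share - 0.05 * days_since_first
    return 1 / (1 + math.exp(-z))  # squash score to a probability

p = repeat_texter_probability(prior_texts=4, days_since_first=10, late_night_share=0.5)
print(round(p, 2))  # 0.81
```

In practice the weights would be fit by logistic regression (or a richer model) on labeled histories, and the output probability would trigger earlier intervention above some threshold.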

GiveDirectly – money to the poorest places in Kenya & Uganda. They check thatch vs. iron roofs to determine which communities are the poorest, building a map of roof types in different communities by analyzing satellite imagery. Jake talked about the limitations of this method and how refining the specifications is part of the process.
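The core intuition is that iron roofs reflect more light than thatch, so roof type can be read off image brightness and aggregated into a per-community poverty proxy. Here is a deliberately crude sketch of that idea; the threshold and patch data are made up, and the real DataKind/GiveDirectly pipeline is more sophisticated:

```python
def classify_roof(pixel_values, brightness_threshold=128):
    """Label a roof patch 'iron' or 'thatch' by mean brightness:
    metal roofs reflect more light than thatch (crude heuristic)."""
    mean = sum(pixel_values) / len(pixel_values)
    return "iron" if mean > brightness_threshold else "thatch"

def poverty_proxy(roof_patches):
    """Share of thatch roofs in a community -- a rough poverty signal."""
    labels = [classify_roof(p) for p in roof_patches]
    return labels.count("thatch") / len(labels)

# four synthetic grayscale roof patches: one bright (iron), three dark (thatch)
patches = [[40, 50, 45], [200, 210, 190], [30, 35, 32], [60, 55, 58]]
print(poverty_proxy(patches))  # 0.75
```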

Jake said they have recently set up a centralized project group that can initiate its own projects.

—

The last speaker, Mahamoud El-Assir @Verizon, talked in very general terms about how Verizon leverages data analysis to improve customer experience. He talked about how information from the various channels and services can be used to better match advertising and advice to customer needs.

Talking to customers – a rep can consider the customer's TV, data, and equipment usage.

Supervisors coach their agents in real time based on the types of calls and how they are resolved.