Friday Faves - What We're Reading This Week - Ripple Effect Group

In Friday Faves for 8 June 2018, we have reads & listens on Mary Meeker’s Internet Trends for 2018, developing empathetic robots, looking inside the mind of a robot, how to queue, and The Future This Week.


Mary Meeker’s Internet Trends 2018

For more than a decade (since 2004, in fact), Mary Meeker, a partner at Kleiner Perkins, has been producing her Internet Trends report, touted as the most highly anticipated slide deck in Silicon Valley. This year it runs to 294 slides – a little smaller than 2017's 355 – and reviews everything from the number of internet users, to smartphone behaviours, to e-commerce, to tech companies.

The report is already being reviewed and analysed across industries and sectors – each viewing it through their own lens and adding their key takeaways. I've added a few links at the bottom of this post. There are a few selected headline topics here – but I'd recommend you explore the report through your own lens, probably more than once.

As of 2018, half the world population, or about 3.6 billion people, will be on the internet.

2017 was the first year in which smartphone unit shipments didn't grow at all. As more of the world becomes smartphone owners, growth has been harder and harder to come by.

Despite the high-profile releases of $1,000 iPhones and Samsung Galaxy Notes, the global average selling price of smartphones is continuing to decline.

China is catching up as a hub to the world’s biggest internet companies. Currently, China is home to nine of the world’s 20 biggest internet companies – five years ago, China had two and the U.S. had nine.

Interest in cryptocurrency is exploding as Coinbase’s user count has nearly quadrupled since January 2017.

Voice technology is at an inflection point due to speech recognition hitting 95% accuracy.

More people now start product searches on Amazon than on search engines.

Employees seek retraining and education from YouTube and online courses to keep up with new job requirements.

Seeing the mind of a robot in augmented reality

Nat says: Do not be fooled by the title of this shared article. In fact, what the title implies is the very reason I chose to discuss this article this week. The rhetoric surrounding our technological creations is an alarming phenomenon in its own right. We as humans do not even know what a “mind” supposedly is or what it supposedly constitutes (do we even have one!?), and yet we boldly claim to be able to see inside the mind of one of our robotic creations. The creators of this particular robot claim it is “learning” how to work with humans: the robot tests and rejects patterns of input data based on our casual speech, learning to recognise and act on instructions such as “Put the crate here”. What the researchers hope to do is visualise what the robot is “thinking” in response to such commands, with the intent of fixing or fine-tuning the outputs in order to better align robotic action with our everyday verbal commands.

However, it is important to note (as Cathy O’Neil points out) how this learning takes place, which is something that still reflects our own human biases which we have programmed into the robot’s so-called “mind”. Stating that we can see inside the robot’s mind implies that the robot is consciously aware of what it is doing and what it is learning (which it is not). Furthermore, we promote the idea of robots as standing separate from ourselves, as the article claims it wants robots to work “with humans”. What our technological creations do, however, which is something they have always done, is reveal ourselves to ourselves. It is not so much a question of how this robot can work with us, but rather a question of why we seek to create and then work with such a robot as part of our lives.

Who hates a queue?

Helen says: Clever ideas and technology have brought countless conveniences to our fingertips, and in this busy, congested world, improving the queue experience has to be high on the wish list. In January, Amazon launched its checkout-free grocery store in Seattle, Amazon Go. Using the latest in machine learning, computer vision and artificial intelligence, Amazon developed what it calls ‘just walk out technology’. All you need is an Amazon account and a smartphone, and away you go. Just swipe your phone to enter the store; everything you select from the shelves gets automatically added to your virtual trolley, and on exit you receive a digital receipt. However, we all know too well that new technology comes with teething problems, and funnily enough, Amazon’s was the long queue to get into the store!

Gary Mortimer (Queensland University of Technology) and Louise Grimmer (University of Tasmania) take an interesting look at queues: how they have evolved, their direct and indirect costs to business, their influence on design, and their impact on customer experience. Significant modifications to the layouts of customer service areas are occurring across a wide range of businesses. I hadn’t really considered the queue to be a key driver of these changes; to me it was more about automation to reduce wages. But whatever the reason, the dwindling queues at airports, supermarkets, banks and the like are a big plus.

Artificial intelligence is being trained to have empathy. Should we be worried?

The shift can be subtle or overt — from emotionally appropriate gestures from your smartphone’s voice assistant, to comforting robotics in clinical situations.

For instance, Danielle Krettek, the founder of Google’s Empathy Lab, said her work has contributed to some of the Google Assistant’s apparent ability to attune to your mood.

“When you say, ‘I’m feeling depressed’, instead of giving you a description of what depression is, it [might say], ‘you know what, a lot of people feel that. You’re not alone’,” she explained at the design conference Semi Permanent in Sydney.

But the real question we need to ask ourselves is: are we OK with a robot making us think it cares?

The move towards empathetic gestures in our technology comes as voice-activated devices are growing in popularity. The coming generation of products, driven by voice, will rely more on inference, emotion and trust, according to Ms Krettek, whose lab aims to bring “deep humanity to deep learning”. AI could be programmed to identify and express that it’s feeling your pain, but given it cannot truly “feel” anything, this might involve a degree of mimicry. If you tell your AI assistant a sad story, you don’t want it to reply in a chirpy tone.

“They [might] mimic the emotional connotation in the words that they’re using … something that humans often do automatically.”
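To make this kind of mimicry concrete, here is a deliberately simple sketch of the idea: detect negative sentiment in the user's words and choose an acknowledging reply instead of an informational one. All names and rules below are illustrative assumptions for this post, not how the Google Assistant or the Empathy Lab actually implements it.

```python
# Toy sketch: matching the reply's tone to the sentiment detected in
# the user's words. Purely illustrative, not any real assistant's logic.

NEGATIVE_CUES = {"depressed", "sad", "lonely", "anxious"}

def empathetic_reply(utterance: str) -> str:
    """Acknowledge negative feelings; otherwise answer neutrally."""
    # Normalise each word: strip surrounding punctuation, lowercase.
    words = {w.strip(".,!?'\u2019").lower() for w in utterance.split()}
    if words & NEGATIVE_CUES:
        # Acknowledge the feeling rather than define the term.
        return "A lot of people feel that. You're not alone."
    return "Here's what I found about that."

print(empathetic_reply("I'm feeling depressed"))
print(empathetic_reply("What's the weather today?"))
```

A real system would use a trained sentiment model rather than a keyword list, but the design choice is the same one Ms Krettek describes: the assistant selects the emotional connotation of its words to match yours, without feeling anything itself.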

Sydney Business Insights – The Future This Week Podcast

This week

Face recognition for a noble cause, the capability to find you in a crowd and DNA predictions. Sandra Peter (Sydney Business Insights) and Kai Riemer (Digital Disruption Research Group) meet once a week to put their own spin on news that is impacting the future of business in The Future, This Week.