Is the Robot Psychologist the Next Big AI App?

Will AI robot psychologists be commonplace one day?

Mention the phrase “robot psychologist” and meme-worthy images of automatons or perhaps human-like robot hosts fictionalized by HBO’s Westworld may come to mind. Yet part of what practicing psychologists do, such as administering certain types of psychological tests, assessments, and questionnaires, can be automated — the technical capabilities exist today. And the technology is growing exponentially more sophisticated. For example, researchers at MIT have created an artificial neural network computer model that can detect depression from natural conversation [1]. Will robot psychologists be commonplace one day? Is this even a good idea?

As more people lose their jobs to automation through applied artificial intelligence (AI), so will the demand for psychologists and other mental health professionals grow [2]. Will future demand for mental health professionals be directly proportional to increasing AI automation and workforce displacement?

Artificial intelligence will create massive job displacement: Workplace upheaval is coming. The jobs most at risk include bookkeepers, tellers, telemarketers, legal secretaries, bill and account collectors, and even postal workers [3]. As autonomous vehicle technology matures, the jobs of human drivers in the transportation and delivery industries will be at risk. Within five to ten years, AI will begin to displace people doing rote, routine tasks [4].

Forward-thinking companies and organizations across many industries are seeking to incorporate AI to remain viable in the future [5]. This is fueling a race among enterprise software providers to include AI in their product mix, which will eventually make its way to Fortune 500 companies [6]. According to Deloitte, 83 percent of organizations that are early adopters of AI have already reported achieving moderate or substantial economic benefits, which in turn may reinforce increased investment in AI automation projects [7]. By 2025, AI automation will replace 16 percent of U.S. jobs; both white-collar and blue-collar jobs will be eliminated [8].

According to the American Psychological Association, job loss “can be devastating, putting unemployed workers at risk for physical illness, marital strain, anxiety, depression and even suicide” [9]. The World Health Organization estimates that depression affects more than 300 million people worldwide, and that nearly 800,000 people die by suicide each year [10].

The McKinsey Global Institute (MGI) predicts that companies’ AI technology adoption curve over time will be S-shaped [11]. This implies that adoption will start slowly, then accelerate rapidly for a period before leveling off as the market saturates. If MGI’s prediction becomes reality, it also seems likely that future demand for mental health professionals may follow a similar growth pattern between now and 2030.
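An S-shaped adoption curve of this kind is commonly modeled with a logistic function. A minimal sketch of that shape (the midpoint year, growth rate, and saturation level below are illustrative assumptions, not MGI figures):

```python
import math

def logistic_adoption(year, midpoint=2025.0, rate=0.6, ceiling=1.0):
    """Illustrative logistic (S-shaped) adoption share for a given year.

    midpoint -- year of fastest growth (hypothetical)
    rate     -- steepness of the curve (hypothetical)
    ceiling  -- saturation level, as a fraction of the market
    """
    return ceiling / (1.0 + math.exp(-rate * (year - midpoint)))

# Slow early adoption, rapid middle growth, leveling off near saturation.
for year in (2018, 2022, 2025, 2028, 2032):
    print(year, round(logistic_adoption(year), 3))
```

The curve stays near zero early on, crosses 50 percent of the ceiling at the midpoint year, and flattens as it approaches saturation, matching the slow-fast-plateau pattern described above.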

According to April 2018 figures from the United States Department of Labor, employment of psychologists is “projected to grow 14 percent from 2016 to 2026, faster than the average for all occupations,” and job opportunities are “best for those who have a doctoral degree in an applied specialty.” Can the tasks performed by licensed mental health professionals be automated or replaced, in part or in whole, by an AI-enabled robot in the future?

As unusual a concept as it may seem, automated psychological services powered by AI technology may very well emerge within the next five to ten years. More likely, they will initially take shape as an innovative service delivered through the smartphone, rather than as a mechanized mannequin or a version of Pepper, the ubiquitous humanoid robot described by its maker, SoftBank Robotics, as “a genuine day-to-day companion.” The potential advantages of a smartphone-based psychological wellness app include lower barriers to adoption: cost, access, availability, confidentiality, privacy, adherence, and a lack of perceived stigma.

Global app revenue reached $34 billion in just the first half of 2018, of which $22.6 billion, roughly two-thirds, was generated from Apple’s App Store, according to estimates published by Sensor Tower in July 2018. To put this growth in context, third-party iPhone apps did not even exist a decade prior: the Apple App Store launched with an ecosystem of 500 apps in 2008 [12].

The average price of an app in the Apple App Store was less than a dollar (89 cents) as of September 2018, according to estimates from Statista. That is far more affordable than seeing a psychologist: a 45-minute session may cost a couple of hundred dollars, not including the time spent in transit and the waiting room. With affordability comes accessibility. Patients are subject to the schedule availability of mental health providers, but smartphones provide instant access.

There’s a caveat — the risks include compromised confidentiality and privacy. As the wave of privacy incidents in recent years has shown, with social media applications being compromised and user data leaked, current technology is imperfect when it comes to maintaining confidentiality and privacy. On one hand, people may feel comfortable getting mental health support from a non-judgmental app. On the other hand, the thought of one’s privacy being compromised may be enough to thwart potential early adopters from trying it in the first place.

And then there are the issues of bias and quality of service. AI is inherently as biased as the humans who created it. The areas of cognitive bias include data set size, data structure/collection/sources, the degree of objectivity in the data itself, the weight assigned to the data points, the absence or inclusion of indicators, and the inherent bias of the people managing the software [13]. Ultimately, the quality of the service would depend on the quality of the data, algorithm, and human management.

The larger risks are the unintended consequences, such as the increased feelings of loneliness and isolation that living in a virtual world may bring [14]. Will confiding one’s innermost feelings, perceptions, and thoughts, the very things that make us uniquely human, to a machine rather than a fellow human being trigger an existential crisis? No matter how sophisticated the technology becomes, AI sentience is a non-starter [15]. In the end, a machine cannot replace a human in expressing caring, empathy, compassion, and the feeling that one is “being heard.” Will a pioneering entrepreneur one day create an innovative robot psychologist app? It’s not a question of “if,” but “when.”