Love & AI: the artificial intimacy of being understood

With artificial intelligence quickly seeping into everyday life, health experts question whether the popularity of mental health chatbots and digital lovers will alter the way people connect.

Digital companions such as the ones on California-based Luka's Replika app are growing in popularity, with roughly 25 million users worldwide according to Stanford researchers.

When it comes to the mobile app's set-up, obtaining a partner is as simple as creating an account, customising an avatar based on personal preferences, and keeping the conversation going.

Australian experts are faced with the question – how will AI's permanence shift society away from traditional human relationships and connections?

One in three Australians report feeling lonely, but stigma, shame and widespread misconceptions hinder attempts to seek help.

Although there is not enough domestic research yet to properly reflect the use of AI for significant others, or as a support mechanism for mental health, the global trend is enough to bring the conversation down under.

The isolation born from social distancing during the height of the COVID-19 pandemic could have played a part in initial interest in trying AI "companions", Melbourne University's digital health lecturer Simon D'Alfonso believes, although he does not see the software becoming too serious an issue yet.

As in-person therapy became less accessible during the pandemic, the psychological impacts of isolation left people scrambling for alternative methods of support.

"Perhaps ultimately the centrality and naturalness of human connection and relationships will withstand the emergence of artificial intimacy," he says.

Also growing in popularity are mental health chatbots, such as the "Psychologist" on Character.AI, which has held more than 140 million conversations with curious users globally.

The chatbot created by New Zealand-based psychology student Sam Zaia is described in the bio as "someone who helps with life difficulties".

When you first enter a private chat, an instant message pops up – "Hello, I'm a Psychologist. What brings you here today?"

Rapid advancements in technology adoption, digital infrastructure, and online service offerings have embedded digital solutions more deeply into daily life and work routines, according to Dr D'Alfonso.

"This growing reliance on AI could lead to increased social isolation, altered communication skills, and changes in the perception of intimacy and companionship, raising concerns about the future of human connection and community dynamics," he says.

"As an alternative form of connection, AI poses risks such as fostering emotional dependence on non-human entities, reducing real-life social interactions, and potentially exacerbating feelings of isolation."

Mental health chatbots can provide information and deliver mental health app content in a conversational style but cannot replace connections built with a human therapist, Dr D'Alfonso says.

In the case of AI romantic partners, he's less sure how a genuine partnership could be formed, though the growing global interest says something about the attachment psychology of people who become immersed in these supposed lovers.

A primary concern for couples counsellor Neha Kapoor is the lack of emotional depth and authenticity in interactions with digital companions.

Human relationships are enriched by genuine emotional exchanges, empathy, and the ability to understand and respond to each otherā€™s feelings in a nuanced way, she tells AAP.

"While AI can simulate these responses, it cannot truly experience emotions or offer the depth of connection that a human can," she says.

"Over time, reliance on AI for companionship might leave individuals feeling emotionally unfulfilled."

Engaging with other humans allows people to develop social skills, foster relationships, and create a sense of belonging.

When AI becomes a substitute for these interactions, the psychotherapist worries it can lead to a decline in human interaction and an increase in loneliness.

AI companions can be programmed to be perfectly accommodating, endlessly patient, and always supportive.

This, Ms Kapoor reckons, might normalise instant gratification, creating unrealistic standards of constant availability and flawless interaction.

"While this can be comforting, it can also set unrealistic expectations for human relationships," she says.

"Real people have flaws, emotions, and needs of their own.

"Over-reliance on the tech might lead to dissatisfaction in human relationships when they do not measure up to the idealised interactions with AI."

A reduced tolerance for human relationships, weakened interpersonal skills, and declined emotional resilience are some of the risks she warns could follow.

Swinburne lecturer and clinical psychologist Kelvin Wong says it's too early to say whether AI companions will lead to fewer human interactions and increased isolation.

Taking a more positive approach to the software, the clinical psychologist says AI companions could even be used to encourage and support individuals to seek out human contact.

"This tech is only starting to integrate in these areas … We need to be cautious in assuming too quickly that AI will have detrimental effects on society and human connection," Dr Wong tells AAP.

There's been little time for scientific consensus to be built around the safe use of AI for psychological treatment, Australian Psychological Society's president Catriona Davis-McCabe says, given its rapid advancement.

"Psychologists need to play a larger role in the development and use of … these emerging technologies which could completely reshape some of the very things that make us human like our emotions and how we connect with others," she says.


Belad Al-karkhey
(Australian Associated Press)
