
Imaginary friend 2.0 or digital overlord?


GROWING up, many of us had imaginary friends – those little companions that existed only in our minds. They kept us company when we were alone, making us feel less lonely.

Imaginary friends helped us explore our thoughts and emotions, and even made us more creative. But, having grown up Asian myself, I remember how parents could get a bit worried.

There is this superstitious belief in our culture that if a child talks to an invisible friend, it might be something supernatural or out of this world. It is just how many of us were raised, with a mix of imagination and cultural beliefs shaping our childhood.

Now that I am a parent myself, the challenge has become even harder. One day, my daughter came home and told me that her friend had officially declared she has an artificial intelligence (AI) boyfriend.

With AI-powered conversational agents becoming part of everyday life, including in games such as AI Dungeon, Hidden Door and GPT Adventure, AI companionship has reached a whole new level. It made me wonder: Should we be worried about AI companionship? While it feels like a far cry from the imaginary friends we once had, it is clear that AI is becoming part of how people, especially children, interact and form connections.

Let us first zoom in and understand what AI-powered conversational agents, or large language models (LLMs), are. Simply put, they are advanced AI programmes designed to understand and respond to human language.

LLMs like ChatGPT, Gemini and Claude are built on sophisticated algorithms that analyse and generate text based on patterns learned from vast amounts of data, ranging from books and websites to conversations and articles.

LLMs work by breaking language down into smaller pieces called tokens, typically words or parts of words, and then predicting what comes next in a sentence or conversation based on their training.

This training involves processing large datasets to identify patterns and relationships in language, allowing the model to generate responses that are coherent and contextually appropriate. As these models are exposed to more data, they refine their ability to engage in conversations that feel personal and relevant, making their responses increasingly accurate.

Essentially, LLMs are like advanced tools that use patterns in data to understand and respond to human language in a way that seems natural and intuitive.
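To make that prediction step concrete, here is a minimal sketch using the small, openly available GPT-2 model through the Hugging Face transformers library. The choice of GPT-2 and of this library is purely illustrative; it is not one of the commercial systems named above, but the underlying idea of repeatedly guessing the next token is the same.

```python
# A minimal sketch of next-token prediction with an open model (GPT-2),
# using the Hugging Face "transformers" library (pip install transformers torch).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")

prompt = "An imaginary friend is someone who"
inputs = tokenizer(prompt, return_tensors="pt")  # split the text into tokens

with torch.no_grad():
    logits = model(**inputs).logits  # a score for every possible next token

# The model's top five guesses for the single next token after the prompt.
probs = torch.softmax(logits[0, -1], dim=-1)
top = torch.topk(probs, k=5)
for p, token_id in zip(top.values, top.indices):
    print(f"{tokenizer.decode(token_id)!r:>10}  {p:.1%}")

# Repeating that one prediction step over and over produces a whole reply.
reply = model.generate(**inputs, max_new_tokens=20, do_sample=False)
print(tokenizer.decode(reply[0], skip_special_tokens=True))
```

Each printed guess is just a probability learned from patterns in the training data; chaining many such guesses together is what makes a chatbot's replies feel fluent and personal.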

It is clear how AI companionship can resonate with many of us today. There are several notable advantages to having AI as a companion.

Firstly, they are available 24/7, providing support and interaction whenever you need it, whether you are seeking advice or simply want to chat, which is especially valuable during moments of solitude or stress.

Secondly, these AI models provide personalised responses that evolve with each interaction. They learn from your conversations and preferences, allowing them to tailor their replies to better suit your needs and interests.

This level of customisation helps ensure that the interactions are relevant and engaging, making you feel understood and valued.

On top of that, AI companions can positively impact mental health by alleviating feelings of loneliness. They provide consistent and meaningful conversations, and offer a comforting presence and support during challenging times.

While AI companionship has its advantages, it is important to consider some potential downsides. Firstly, AI companions can influence your decisions in ways that may not always be transparent. Since they are designed to provide responses based on patterns in data, they may inadvertently steer you toward certain viewpoints or solutions, potentially affecting your decision-making process.

Secondly, the pervasive presence of AI-driven interactions can subtly shape perceptions and reinforce particular narratives based on the data they were trained on, which may influence how you see the world.

Lastly, there is a risk of developing a dependency on AI interactions. The engaging and often comforting nature of these conversations might lead to excessive reliance for emotional support or entertainment, potentially impacting real-world relationships and activities.

Personally, as an old-school mom and researcher, I always approach disruptive technology with a cautious mindset. Eleven years ago, when the movie Her was released, I told my husband that such advanced AI interactions seemed impossible. Yet, this film unexpectedly prepared me for the reality we face today.

As we embrace the potential of AI companionship, it is crucial to remember that moderation is key. Being mindful of the potential downsides, such as the influence on our decisions, the shaping of our perceptions and the risk of dependency, ensures we can harness the benefits of AI while safeguarding our real-world relationships and privacy.

A balanced approach helps us enjoy the advantages of this technology without letting it undermine our genuine human connections and well-being.

The writer is the head of the Research Grant, Innovation and Research Management Centre at Universiti Tenaga Nasional.
Comments: letters@thesundaily.com