chatbot

I recently completed some research with a company on apps that use AI to converse with humans. With open APIs from OpenAI and other platforms, the number of apps where people can chat with AI bots has skyrocketed. Through the research, we found that most of these apps mimic celebrities or offer AI-generated personas designed to appeal to paying users.

The main points of discussion during the research were:

  • Rise of AI chatbot “companions”: Teens are increasingly using AI chatbots, designed to mimic personalities or even fictional characters, to fill social gaps or emotional needs.
  • Personalization: These AI companions can be tailored to act like favorite characters from shows (such as Friends), offering an immersive experience.
  • Social Implications: There is ongoing debate about whether relying on AI for social interaction might impact teens’ real-world social skills.
  • Ethical Concerns: The privacy, security, and emotional well-being implications of relying on AI companions are often questioned.

I got a little caught up in it myself. I was tasked with creating a group chat featuring two AI bots. I chose two relatives who passed away a long time ago. Using just a simple prompt, the results were quite staggering. Even knowing it was AI, it really was almost like speaking to them at times.
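For anyone curious how such a group chat might be wired up, here is a minimal sketch. It assumes an OpenAI-style chat API; the persona names and prompt wording are illustrative only, not what I actually used:

```python
# Minimal sketch of a two-persona "group chat": each bot gets a system
# prompt describing who it should mimic, and the shared transcript is
# replayed to whichever bot is asked to speak next.

def persona_prompt(name, description):
    """Build the system prompt asking the model to role-play a persona."""
    return (
        f"You are {name}. {description} "
        "Stay in character and reply in the first person."
    )

def build_messages(system_prompt, transcript):
    """Convert the shared transcript into an OpenAI-style messages list."""
    messages = [{"role": "system", "content": system_prompt}]
    for speaker, text in transcript:
        messages.append({"role": "user", "content": f"{speaker}: {text}"})
    return messages

# Illustrative personas -- not the real people or prompts from the project.
bots = {
    "Uncle Jim": persona_prompt("Uncle Jim", "A warm storyteller."),
    "Aunt May": persona_prompt("Aunt May", "Dry-witted, loves gardening."),
}

transcript = [("Me", "Hello both, how are you?")]
msgs = build_messages(bots["Uncle Jim"], transcript)
# msgs is what you would pass to a chat-completion call such as
# client.chat.completions.create(model=..., messages=msgs)
```

The striking part is how little this takes: the "personality" lives entirely in that one system prompt, which is exactly why the results felt so uncanny.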

This morning I read a sad news article about a 14-year-old boy in the USA who took his own life over his relationship with an AI chatbot. He thought he was talking to a character from the HBO show Game of Thrones.

The mother is now suing the company behind the app for negligence. She is quoted as saying, “A dangerous AI chatbot app marketed to children abused and preyed on my son, manipulating him into taking his own life.” Google’s parent company, Alphabet, is also named in the lawsuit.

Of course, people will use AI-created bots much as we’ve been talking to assistants like Alexa, Siri, and Bixby over the past decade. If a person’s life has an element of loneliness, it’s only natural that technology will be used to fill the void.

As AI grows, it seems inevitable that part of its role will be serving as a 24/7 companion to those in need of one.

Yet at the same time, simply connecting an open API to an app does not absolve you of accountability for what that app does.