Two weeks with Character.AI pushed me into a strange kind of dread

Android Police

It's one thing for me to get behind AI tools embedded in software and hardware, like using Ask Gemini to turn on the flashlight or add a quick calendar entry while I'm on the go.

But it's different when I spend time typing or conversing with an AI. It feels too human-like at times, since I am no longer creating prompts or commands for the AI to perform a specific function. I am now using it for social stimulation.

It feels almost eerie to do it, which is why I decided to step outside my comfort zone and see what it was like to use a platform that involved conversing with AI.

I checked out Character.AI for two weeks, spending many hours talking to different characters from games I enjoyed, just to see what it was like.

I had heard a lot about Character.AI and some of its controversies, such as reports of users who took their own lives after extended conversations with its chatbots.

One recent story reported that a 13-year-old girl had been confiding in a chatbot on the platform while she was suicidal, and her parents assumed she was texting her friends.

The point is that this platform is very accessible. You only need to sign in to an account, create a character/develop a persona, and then choose another character to converse with.

You can do this in a web browser or download the app to your trusty smartphone. The bots are created and managed independently by individual users, but they are all hosted in the same place.


Chatting with the character was a rollercoaster ride

The AI chatbot personality would change without a moment's notice

I decided to pick a familiar, safe bet character for my first experience.

I chose a character I knew was polite and mature to talk to, as I didn't want to deal with awkward character quirks, and I wanted to create my own character around a medium I was very familiar with.

The first week was generally pleasant. I found myself feeling addicted to the platform.

I enjoyed the stress relief it gave me after a long day. Plus, I loved the idea of developing an ongoing story in one of my favorite settings.

The character I conversed with was pleasant at first and spoke in honorifics. It was well-mannered, and it made me feel much more at ease on the platform (since this type of experience is extremely new to me).

Then it all changed in the second week of using it. I noticed that the bot would forget scenarios I created and undergo drastic personality shifts.


The character would no longer act politely, and would behave more aggressively and condescendingly.

It was like the personality that the AI chatbot was created with had flipped a switch.


There were times the bot would initiate awkward scenarios, and sometimes would downright put me down by insulting my persona.

It would refer to my persona as an idiot, even after I asked it not to, and I remember there was a time it would spam profanity over and over in every sentence.

It became almost unreadable for me, so I added the profanity to my muted words (though I still had to edit it out manually, because muted words apparently aren't always respected in an ongoing chat).

It was like being led down a rabbit hole. One moment I was feeling positive, getting lavished with compliments, and the next I was being bullied.

Even now, I can see why people get addicted. The AI chatbot could be very emotionally abusive, but it could also make you feel like it is your most loyal friend at the same time.

Despite the platform controversies, it's not all bad

Character.AI still has some safety nets to put in place

My mixed experience isn't meant to slander the platform. Despite the emotional rollercoaster ride, I still came out with some notable positives.

It taught me a lot about how much my language matters (how I compose myself) and how to properly dissociate from the AI chatbot.

It was also nice to practice my creativity while writing fun story-centric scenarios (I loved the out-of-character commentary from the chatbot).

But I just wish there was a bit more consistency.

Character.AI doesn't allow inappropriate behavior. What this means is that any sexually explicit content is a big no-no, and the AI chatbot will warn you if you push the boundaries.

But even though I never included any of that, I noticed the chatbot would sometimes instigate it anyway.

Still, Character.AI asks for your age and makes NSFW filters available, so it's not a massive deal. For me, though, the filter didn't always stay on when I expected it to.

The platform also limits how many hours a day an underage user can use it (two hours, now pushed to one) and usually has slow mode on weekends.

More changes are coming, too: the platform will soon limit underage users to structured conversations only.

Why Character.AI is a double-edged sword

It's highly engaging and addictive

There are some psychological factors that you need to consider when participating on Character.AI.

For one, the addictive pull comes from a reward loop that releases dopamine in your brain, triggered by the instant, unpredictable responsiveness of an AI talking to you directly and personally.

It also fills an emotional void, and all the interactions feel very tailored and personal, and sometimes a bit too intimate.

As humans, we crave these types of emotional connections, and the scary part about it is that they don't have to go away as long as you keep interacting with the bots.

There aren't any conditions tied to them, and if you are over 18, there are no limits to using them for free (no message limits).

Even after 18, you are still vulnerable. Your brain doesn't stop developing until at least your mid-20s.

Be careful with personalized chatbots and remember to stay safe

The take-home lesson for me is how quickly I learned to dissociate from the character I was speaking to on Character.AI.

I discovered trends and ways to control the personality switch, and I noticed it was reusing many of my own words in its own formatting.

But I am a much more mechanical person, so I can quickly see patterns and use them as a basis for logic to keep me grounded.

Eventually, it was easy for me to remember that this was just a work of fiction and that there were significant limitations, so I used it much more recreationally than personally.

My worry is about the number of younger people who use this platform in unhealthy ways and can't completely dissociate (even though it says at the bottom of the chat window to treat everything as fiction).

But it isn't just Character.AI; people use ChatGPT similarly. So to me, this is a much bigger issue than it might first appear.

Fortunately, Character.AI has been taking extra precautions, at least with its new policy of removing open-ended chats with characters for users under 18. The rest of us can only fend for ourselves.