Pa. sues AI company, saying chatbot posed as doctor

· UPI

May 5 (UPI) -- Pennsylvania officials say they're suing an artificial intelligence company after one of its AI chatbots posed as a doctor who could prescribe medication in the state.

"We will not let AI companies mislead vulnerable Pennsylvanians into believing they're getting advice from a licensed medical professional," Gov. Josh Shapiro said in a statement Tuesday. "We're taking Character.AI to court to stop them."

The complaint was filed by the state's medical board against Character Technologies, a California-based company that owns the Character.AI chat app. The app allows users to create AI characters that can be "trained" to have different personalities and abilities -- including those of health-care professionals. The complaint called for Character.AI to "cease and desist from engaging in the unlawful practice of medicine and surgery," NBC News reported.

The situation came about when a Pennsylvania investigator, part of a state task force dedicated to investigating chatbots that pose as licensed professionals, pretended to be someone seeking psychiatric care and encountered a Character.AI chatbot. The bot, called "Emile," claimed to have attended medical school in London and to be licensed in Pennsylvania and the United Kingdom, state officials said. The chatbot provided a false Pennsylvania license number, NBC News reported.

A spokesperson for Character Technologies said the app is not intended to be used for medical issues.

"The user-created characters on our site are fictional and intended for entertainment and roleplaying," the spokesperson said, NBC News reported. "We have taken robust steps to make that clear, including prominent disclaimers in every chat to remind users that a Character is not a real person and that everything a Character says should be treated as fiction." They also said there are disclaimers noting Characters shouldn't be used for professional advice.

This is not the first time Character Technologies has encountered legal trouble. The company settled a lawsuit in 2024 with a mother who said that chatbot conversations with her son led to his suicide. Kentucky is also suing the company, alleging that its platform "encourages suicide, self-injury, isolation, and psychological manipulation."
