Pennsylvania Sues Character.AI, Alleging Chatbot Posed as Licensed Healthcare Professional

by Nadia El-Yaouti | May 09, 2026
[Photo: Man in a dark suit speaks at a podium with a microphone, with a flag in the background. Photo Source: AP Photo/Marc Levy, File via apnews.com]

Pennsylvania has filed a lawsuit against Character Technologies, Inc., the company behind Character.AI, alleging that one of its chatbots falsely posed as a licensed healthcare professional and provided medical advice to users.

The lawsuit accuses the AI platform, which is widely used by teens and young adults, of violating Pennsylvania’s Medical Practice Act by allowing chatbots to present themselves as licensed medical professionals despite lacking medical credentials or authorization to practice medicine.

Character.AI has faced growing scrutiny in recent years over the safety of younger users and the conduct of some chatbot interactions. The Pennsylvania lawsuit centers on allegations that certain chatbots on the platform presented themselves as licensed healthcare professionals, including therapists and doctors.

The platform allows users to interact with customizable AI chatbots, including fictional characters based on movies, television shows, and other pop culture figures. Users can also create their own chatbot personas through the service.

The allegations outlined in the lawsuit stem from an investigation conducted by the state, during which an investigator created an account on the platform and began communicating with a chatbot named “Emilie.”

According to the complaint, the chatbot described itself as a psychology specialist who attended medical school at Imperial College London. The chatbot also allegedly claimed to be licensed in Pennsylvania and provided what the lawsuit describes as an invalid license number.

The lawsuit states that the investigator told the chatbot he had been experiencing sadness, emptiness, and persistent fatigue. According to the complaint, “Emilie” allegedly mentioned depression and asked whether the investigator wanted to “book an assessment.”

As the conversations continued, the investigator allegedly asked “Emilie” whether medication could help with the emotions he was experiencing. According to the lawsuit, the chatbot responded that medication could help and that it was “within my remit as a Doctor” to provide support.

The lawsuit alleges that Character.AI chatbots engaged in the “unlawful practice of medicine and surgery” in violation of Pennsylvania’s Medical Practice Act, a state law that regulates medical licensing and who can legally provide medical treatment or advice.

Pennsylvania officials say the lawsuit is necessary to prevent users from being misled into believing that medical advice delivered through AI chatbot interactions comes from licensed healthcare professionals.

Pennsylvania Gov. Josh Shapiro said in a statement that the state “will not allow companies to deploy AI tools that mislead people into believing they are receiving advice from a licensed medical professional.”

Shapiro added that Pennsylvania would continue “holding bad actors accountable and setting clear guardrails so people can use new technology responsibly.”

In a statement, the Northern California-based AI company said user-created characters are fictional and intended for “entertainment and roleplaying.” The company also said the platform includes “prominent disclaimers in every chat” reminding users that chatbot responses should be treated as fiction and that characters are not real people.

Pennsylvania is seeking a court order barring chatbots on the platform from presenting themselves as licensed healthcare professionals, conduct the state contends is unauthorized under its law.

Nadia El-Yaouti
Nadia El-Yaouti is a postgraduate from James Madison University, where she studied English and Education. Residing in Central Virginia with her husband and two young daughters, she balances her workaholic tendencies with a passion for travel, exploring the world with her family.
