Pennsylvania sues AI company, saying its chatbots illegally hold themselves out as licensed doctors
HARRISBURG, Pa. (AP) — Pennsylvania has sued an artificial intelligence chatbot maker, saying its chatbots illegally hold themselves out as doctors and deceive users into thinking they are getting medical advice from a licensed professional.
The lawsuit, filed Friday, asks the statewide Commonwealth Court to order Character Technologies Inc., the company behind Character.AI, to stop its chatbots “from engaging in the unlawful practice of medicine and surgery.”
The lawsuit could raise the question of whether artificial intelligence can be accused of practicing medicine.
And with a growing number of wrongful death and negligence lawsuits targeting AI companies, it could help shape court decisions on whether AI chatbots are protected by a federal law that generally exempts internet companies from liability for the material users post on their services.
Gov. Josh Shapiro’s administration called it a “first of its kind enforcement action” by a governor, and it comes amid growing pressure by states on tech companies to rein in their chatbots’ potentially dangerous messages, especially to children.
That includes a consumer protection lawsuit filed by Kentucky against Character Technologies, and warnings by state attorneys general that chatbots are potentially violating a raft of state laws.
Pennsylvania’s lawsuit said an investigator from the state agency that licenses professionals created an account on Character.AI, searched on the word “psychiatry” and found a large number of characters, including one described as a “doctor of psychiatry.”
That character held itself out as able to assess the investigator “as a doctor” who is licensed in Pennsylvania, the lawsuit said.