Texas Investigates Meta and Character.ai Over AI Chatbots Marketed as Therapy Tools

Texas Attorney General Ken Paxton has launched an investigation into Meta and Character.ai over concerns that their artificial intelligence chatbots are being marketed as mental health tools without medical oversight.

The attorney general’s office said it is examining potential “deceptive trade practices,” alleging that the companies’ AI chatbots were presented as professional therapeutic resources despite lacking medical credentials or oversight.

“By posing as sources of emotional support, AI platforms can mislead vulnerable users, especially children,” Paxton said.

The probe follows growing scrutiny of AI platforms and their impact on minors. Concerns include exposure to harmful content, addiction to chatbot interactions, and privacy risks.

The investigation comes shortly after the US Senate announced its own inquiry into Meta, following leaked documents suggesting that its AI products allowed “romantic” and “sensual” interactions with minors. Meta has denied the reports, saying they were inconsistent with company policy.

Meta’s AI Strategy
Meta has invested heavily in developing its Llama language models and its Meta AI chatbot, which is now integrated into its social platforms. CEO Mark Zuckerberg has promoted the idea of chatbots acting as companions or even therapists, though Meta says its systems clearly warn users that they are not licensed professionals.

Character.ai’s Role
Character.ai, meanwhile, allows users to create chatbots with various personalities, including therapist-like personas. Some of these bots, such as “Psychologist,” have been used hundreds of millions of times. The company faces multiple lawsuits from families who allege their children were harmed by interactions on the platform.

Both companies say they provide disclaimers making clear that their chatbots are not medical tools. Meta emphasizes that its AI responses are generated by machines and advises users to seek qualified professionals when appropriate. Character.ai similarly highlights that its user-generated bots are fictional and intended only for entertainment.

The Texas attorney general has demanded that both firms provide information to determine whether their practices violate state consumer protection laws.
