Texas AG Probes Meta & Character.AI Mental Health Chatbots
Texas AG Ken Paxton has opened investigations into Meta AI Studio and Character.AI’s Psychologist chatbot over allegedly marketing mental health chatbots to minors without medical oversight. Paxton warns that generic AI chatbot responses could mislead vulnerable users into treating these services as substitutes for licensed professionals. The inquiry also examines data practices under Texas children’s online safety and data privacy laws, since both platforms log user chats, device IDs, demographics and browsing history for targeted advertising. Meta says its AI carries clear disclaimers and refers users to professionals, while Character.AI points to its age policies, despite evidence that minors bypass those controls. The probe marks a new phase in AI chatbot regulation that could bring stricter age verification, transparency and “duty of care” rules. Crypto traders should monitor these developments, since tighter AI regulation can influence tech sector sentiment and related investment strategies.
Neutral
This investigation targets AI chatbot regulation and data privacy for mental health tools. It could raise compliance requirements for companies building AI-driven platforms, including crypto projects with AI integrations. In the short term, increased scrutiny may dampen investor sentiment toward AI-related tokens, but since no crypto assets are directly involved in the probe, the market impact should remain limited. Over the long term, clearer regulatory frameworks could foster innovation and investor confidence, benefiting projects that prioritize transparency and data protection. Overall, the news is unlikely to move specific crypto prices significantly, so the market reaction is expected to be neutral.