New York Senate Bill Would Bar AI Chatbots From Giving Legal and Medical Advice

by Nadia El-Yaouti | Mar 09, 2026

A New York Senate bill would bar artificial intelligence chatbots from presenting themselves as licensed professionals, such as lawyers, doctors, or therapists, when offering advice online. Supporters say the measure aims to prevent businesses from using AI tools in ways that could mislead consumers seeking professional guidance and potentially expose companies to legal liability.

Senate Bill S7263 would prohibit chatbots from impersonating licensed professionals when providing substantive advice or information. The legislation targets situations where artificial intelligence systems simulate human interaction and provide guidance that could otherwise require a licensed professional under state law.

Bill sponsor New York State Senator Kristen Gonzalez said the proposal is intended to address a gap in existing law governing artificial intelligence tools.

“Today, there is no law that says that a large language model cannot tell you that it is a lawyer, that it is a licensed therapist, and then give you legal advice or therapy accordingly. I think that's really concerning,” Gonzalez told Reuters.

According to details in the bill, users who are harmed could file a civil lawsuit seeking economic and non-economic damages. The legislation compares AI chatbot advice to situations where unlicensed individuals provide professional guidance, which can already expose them to liability.

Gonzalez is not the only lawmaker raising concerns about artificial intelligence and consumer protection. Over the past two years, legislators and regulators across the country have debated how AI systems should be governed as the technology becomes more widely used in online services and consumer tools.

As AI systems continue to expand across online services, the proposed bill would strengthen user protections. Platforms that embed AI chatbots into their services would not be able to rely on blanket statements that users are interacting with a “non-human chatbot” to shield themselves from liability.

While the legislation specifically addresses services offering legal advice, it would also apply to services involving other licensed professionals, including doctors, therapists, and mental health care providers.

The bill would also require services using artificial intelligence tools to "conspicuously" disclose that the information generated by those systems may be inaccurate.

OpenAI and Anthropic, two major developers in the AI industry, have been frequent participants in public discussions and industry forums about the advancement of artificial intelligence technology and its impacts on businesses and consumers. As conversational AI systems become more widely used across online platforms, lawmakers and regulators have increasingly examined whether additional rules are needed to govern how such systems interact with the public.

Some of the scrutiny surrounding AI chatbots has also emerged through litigation. Several companies developing conversational AI systems have faced lawsuits alleging that their products caused harm through inaccurate or persuasive responses. Some of those cases involve claims that chatbots interacted with minors in ways that contributed to serious harm, including wrongful death lawsuits involving the chatbot platform Character.AI.

The measure is currently on the New York Senate floor calendar after advancing through the chamber’s Internet and Technology Committee earlier in the legislative process.


Nadia El-Yaouti
Nadia El-Yaouti is a postgraduate from James Madison University, where she studied English and Education. Residing in Central Virginia with her husband and two young daughters, she balances her workaholic tendencies with a passion for travel, exploring the world with her family.