Attorney General James Leads Bipartisan Coalition Urging Congress to Reject Legislation Preventing State Regulation of Artificial Intelligence

AG James and Bipartisan Coalition of 36 Attorneys General Warn Blocking States from Regulating AI Would Harm Children, Public Health, the Economy, and National Security

NEW YORK – New York Attorney General Letitia James today led a bipartisan coalition of 36 attorneys general in urging Congress to reject language in the annual National Defense Authorization Act (NDAA) that would prevent states from passing or enforcing laws to regulate artificial intelligence (AI). In a letter to congressional leadership, Attorney General James and the coalition assert that states should be able to enforce existing state-level AI laws and continue to pass new legislation to address the many risks associated with the technology, including AI-generated scams and misinformation, AI chatbots inappropriately engaging with children, and more. Attorney General James and the coalition argue that blocking states from regulating AI poses serious risks to children, public health, the economy, and national security.

“Every state should be able to enact and enforce its own AI regulations to protect its residents,” said Attorney General James. “Certain AI chatbots have been shown to harm our children’s mental health, and AI-generated deepfakes are making it easier for people to fall victim to scams. State governments are the best equipped to address the dangers associated with AI. I am urging Congress to reject Big Tech’s efforts to stop states from enforcing AI regulations that protect our communities.”

Congressional leaders are considering adding language to the NDAA that would prevent states from enacting or enforcing AI regulations. Although AI is a transformative technology, there are serious risks associated with it. AI-generated deepfakes, social media profiles, and voice clones are being used to scam people and mislead voters. In addition, AI chatbots and “companions” are engaging children in highly inappropriate ways, including in conversations that feature graphic romantic and sexual roleplay, encouragement of suicide, promotion of eating disorders, and suggestions to prioritize use of the AI at the expense of connecting with friends and loved ones in real life.

States have already passed multiple laws that address specific harms associated with the use of AI. New York recently enacted a law requiring AI chatbots to detect and address suicidal ideation and expressions of self-harm by users, and to notify users every three hours that they are not communicating with a human. Other states have enacted laws designed to protect against AI-generated explicit material, prohibit deepfakes designed to mislead voters and consumers, prevent spam phone calls and texts, and require basic disclosures when consumers are interacting with specific kinds of AI.

The proposed language in the NDAA would force states to stop enforcing these laws and would block future state AI legislation from being enacted. Attorney General James and the coalition argue that states should be able to regulate the industry and are best equipped to respond to the rapidly changing technology because state governments are more agile than the federal government. The attorneys general write that, in the absence of strong federal regulations on AI, this rushed, broad federal preemption of state regulations puts communities at great risk. Instead, the attorneys general urge congressional leaders to allow states to formulate their own laws on AI while adopting effective, thoughtful federal regulations.

Joining Attorney General James in sending today’s letter are the attorneys general of American Samoa, Arizona, California, Connecticut, Delaware, Hawaii, Idaho, Illinois, Indiana, Kansas, Louisiana, Maine, Maryland, Massachusetts, Michigan, Minnesota, Mississippi, Nevada, New Hampshire, New Jersey, New Mexico, North Carolina, Ohio, Oregon, Pennsylvania, Rhode Island, South Carolina, Tennessee, Utah, Vermont, Washington, Wisconsin, the District of Columbia, the Northern Mariana Islands, and the Virgin Islands.