In summary

In an executive order, California’s governor pushed back on Trump administration moves against a California AI startup. At the same time, the governor moved to add further guardrails for the technology.

The next time the federal government labels a business a supply-chain risk, as the Department of Defense did last month to San Francisco-based AI tools maker Anthropic, the state of California will review that designation and make its own decision about whether to do business with that company.

That’s according to an executive order signed by Gov. Gavin Newsom on Monday. The order followed a dispute between Anthropic and the Defense Department over contract terms barring the military from using Anthropic systems for domestic mass surveillance and fully autonomous weaponry.

By designating Anthropic a supply chain risk, the Department of Defense effectively barred the startup from competing for certain military contracts and subcontracts. A judge recently issued a temporary injunction to block the designation.

The broader purpose of Newsom’s order was to place guardrails on the use of AI by state employees while at the same time encouraging them to accelerate their use of the technology. 

Many of the largest AI companies in the world are based in California, and the state also leads the nation in volume of AI regulations.

The order requires state agencies to: 

  • Develop recommendations for state contract standards relating to AI and its ability to generate child sexual abuse material, violate civil liberties and civil rights laws or infringe upon legal “protections against unlawful discrimination, detention, and surveillance.”
  • Help employees gain access to “vetted GenAI tools.”
  • Update the State Digital Strategy to identify ways generative AI can “strengthen government transparency and accountability, improve performance, and make government services easily accessible for every Californian.”
  • Develop generative AI tools that help Californians gain access to government services.
  • Issue guidance on how state employees should place watermarks on AI-generated imagery and videos.

Those mandates come at a time when more than 20 California departments and agencies are working to develop or use Poppy, a generative AI assistant for state employees, and when half a dozen state agencies are testing AI for tasks such as assisting state employees and serving homeless residents and businesses. They also come as state courts and city governments are increasing their use of the technology.

Newsom’s office said President Donald Trump and Republicans in Washington, D.C., have rolled back protections or ignored the ways AI can harm people.

“Unlike the Trump administration, California remains committed to ensuring that AI solutions adopted and deployed by [California]… cannot be misused by bad actors,” the governor’s office said in a press release announcing the order.

At the federal level, Trump has signed executive orders to discourage states from regulating AI and urged federal agencies to adopt AI to do things like reduce federal regulation and accelerate decisions made about Medicare. The White House introduced an AI policy framework last month that the president wants Congress to take up. That proposal takes a light touch approach to regulation and does not address issues related to bias, discrimination, or civil rights.

This is the second executive order signed by Newsom to address artificial intelligence. A 2023 order aimed exclusively at generative AI, the sort that powers systems like ChatGPT and Midjourney, similarly called for more use of AI by state agencies and ordered them to put guardrails in place.

Newsom’s handling of AI issues is closely watched by both union leaders, who in February pledged that they will not support his run for president without more worker protections from the technology, and big tech donors, who are pouring money into influencing California politics ahead of midterm elections this fall.

Khari Johnson is part of the tech team and is CalMatters’ first tech reporter. He has covered artificial intelligence for nearly a decade and previously worked at WIRED, VentureBeat, and Imperial Beach...