California's AI Bill Threatens To Derail Open-Source Innovation
The bill’s sweeping regulations could leave developers navigating a legal minefield and potentially halt progress in its tracks.

This month, the California State Assembly will vote on whether to pass Senate Bill 1047, the Safe and Secure Innovation for Frontier Artificial Intelligence Models Act. While proponents tout amendments made "in direct response to" concerns voiced by "the open source community," critics of the bill argue that it would crush the development of open-source AI models.
As written, S.B. 1047 would disincentivize developers from open-sourcing their models by mandating complex safety protocols and holding developers liable for harmful modifications and misuse by bad actors. The bill offers no concrete guidance on the types of features or guardrails developers could build to avoid liability. As a result, developers would likely keep their models closed rather than risk getting sued, handicapping startups and slowing domestic AI development.
S.B. 1047 defines open-source AI tools as "artificial intelligence model[s] that [are] made freely available and that may be freely modified and redistributed." The bill directs developers who make models available for public use—in other words, open-sourced—to implement safeguards to manage risks posed by "causing or enabling the creation of covered model derivatives."
California's bill would hold developers liable for harm caused by "derivatives" of their models, including unmodified copies, copies "subjected to post-training modifications," and copies "combined with other software." In other words, the bill would require developers to exercise superhuman foresight, predicting and preventing bad actors from altering or using their models to inflict a wide range of harms.
The bill gets more specific in its demands. It would require developers of open-source models to implement reasonable safeguards to prevent the "creation or use of a chemical, biological, radiological, or nuclear weapon," "mass casualties or at least five hundred million dollars ($500,000,000) of damage resulting from cyberattacks," and comparable "harms to public safety and security."
The bill further mandates that developers take steps to prevent "critical harms"—a vague catch-all that courts could interpret broadly to hold developers liable unless they build innumerable, undefined guardrails into their models.
Additionally, S.B. 1047 would impose extensive reporting and auditing requirements on open-source developers. Developers would have to identify the "specific tests and test results" that are used to prevent critical harm. The bill would also require developers to submit an annual "certification under penalty of perjury of compliance," and self-report "each artificial intelligence safety incident" within 72 hours. Starting in 2028, developers of open-source models would need to "annually retain a third-party auditor" to confirm compliance. Developers would then have to reevaluate the "procedures, policies, protections, capabilities, and safeguards" implemented under the bill on an annual basis.
In recent weeks, politicians and technologists have publicly denounced S.B. 1047 for threatening open-source models. Rep. Zoe Lofgren (D–Calif.), ranking member of the House Committee on Science, Space, and Technology, explained: "SB 1047 would have unintended consequences from its treatment of open-source models…. This bill would reduce this practice by holding the original developer of a model liable for a party misusing their technology downstream. The natural response from developers will be to stop releasing open-source models."
Fei-Fei Li, a computer scientist widely credited as the "godmother of AI," said, "SB-1047 will shackle open-source development" by making AI developers "much more hesitant to write code and collaborate" and "crippl[ing] public sector and academic AI research." A group of University of California students and faculty, along with researchers from over 20 other institutions, expanded on this point, saying that S.B. 1047 would chill "open-source model releases, to the detriment of our research" because the "onerous restrictions" in the bill would lead AI developers to "release their models under licenses."
Meanwhile, the federal government continues to tout the value of open-source artificial intelligence. On July 30, the Department of Commerce's National Telecommunications and Information Administration released a report advising the government to "refrain from immediately restricting the wide availability of open model weights in the largest AI systems" and emphasized that "openness of the largest and most powerful AI systems will affect competition, innovation and risks in these revolutionary tools." In a recent blog post, the Federal Trade Commission concluded that "open-weights models have the potential to drive innovation, reduce costs, increase consumer choice, and generally benefit the public — as has been seen with open-source software."
S.B. 1047's provisions are lengthy and ambiguous, creating a labyrinth of regulations that require developers to anticipate and mitigate future harms they neither intended nor caused. Developers would then have to submit detailed reports to the California government explaining and certifying their efforts. Even developers who try to comply in good faith may face liability due to the vague nature and broad sweep of the bill.
This bill presents itself as a panacea for harms caused by advanced AI models. Instead, it would sound a death knell for open-source AI development.