
AI company Anthropic is now backing a controversial bill known as SB 1047, which would regulate AI in California.

In a letter sent to California Gov. Gavin Newsom, Anthropic’s CEO, Dario Amodei, said the bill’s “benefits likely outweigh the costs.” However, he added that “we are not certain of this, and there are still some aspects of the bill which seem concerning or ambiguous to us.”

Anthropic’s cautious approval comes just a month after the company proposed a series of amendments to SB 1047, which Sen. Scott Wiener first introduced in the state legislature in February.

In a letter sent to state leaders in July, Anthropic called for a greater focus on deterring companies from building unsafe models rather than on enforcing stringent rules before any catastrophic incident occurs. It also suggested that companies be allowed to set their own standards for safety testing instead of adhering to regulations prescribed by the state.

An amended version of the bill published on August 19 includes several modifications. First, it limits the scope of civil penalties for violations that don’t result in harm or imminent risk, according to a post by Nathan Calvin, senior policy counsel at the Center for AI Safety Action Fund, which is a cosponsor of the bill and has been working with Anthropic since the bill was first introduced.

There are also some key language changes. Where the bill originally called for companies to demonstrate “reasonable assurance” against potential harms, it now calls for them to demonstrate “reasonable care,” which “helps clarify the focus of the bill on testing and risk mitigation,” according to Calvin. It’s also “the most common standard existing in tort liability,” he wrote.

The updated version also downsizes the new government body that would enforce AI regulations: the agency once called the Frontier Model Division is now a board, the Board of Frontier Models, housed within the existing Government Operations Agency. The board has nine members instead of five. At the same time, reporting requirements on companies have increased: they must publicly release safety reports and send unredacted versions to the state’s attorney general.

The updated bill also removes perjury penalties, eliminating criminal liability for companies and leaving only civil liability. Companies are now required to submit “statements of compliance” to the attorney general rather than “certifications of compliance” to the Frontier Model Division.

Amodei said the bill now “appears to us to be halfway between our suggested version and the original bill.” The benefits of developing publicly available safety and security protocols, mitigating downstream harms, and forcing companies to seriously question the risks of their technologies would, he wrote, “meaningfully improve” the industry’s ability to combat threats.

Anthropic bills itself as a “safety and research company” and has won some $4 billion in backing from Amazon. In 2021, a group of former OpenAI staffers, including Dario Amodei and his sister Daniela, founded the company because they believed AI would have a dramatic impact on the world and wanted to ensure the technology was aligned with human values.

Wiener was “really pleased to see the kind of detailed engagement that Anthropic brought in its ‘support if amended’ letter,” Calvin told Business Insider. “I really hope that this encourages other companies to also engage substantively and to try to approach some of this with nuance and realize that this kind of false trade-off between innovation and safety is not going to be in the long run interest of this industry.”

Other companies that the new legislation would affect have been more hesitant. OpenAI sent a letter to California state leaders this week opposing the bill; one of its key concerns is that the new regulations would push AI companies out of California. Meta has also argued that the bill “actively discourages the release of open-source AI.”


