Tech leaders agree on AI regulation but divided on how in Washington forum

A delegation of top tech leaders including Sundar Pichai, Elon Musk, Mark Zuckerberg and Sam Altman convened in Washington on Wednesday for a closed-door meeting with US senators to discuss the rise of artificial intelligence and how it should be regulated.

The discussion, billed as an “AI safety forum”, is one of several meetings between Silicon Valley, researchers, labor leaders and government and is taking on fresh urgency with the US elections looming and the rapid pace of AI advancement already affecting people’s lives and work.

The Democratic senator Chuck Schumer, who called the meeting “historic”, said that attendees loosely endorsed the idea of regulations but that there was little consensus on what such rules would look like.

Schumer said he asked everyone in the room – including more than 60 senators, almost two dozen tech executives, advocates and skeptics – whether government should have a role in the oversight of artificial intelligence, and that “every single person raised their hands, even though they had diverse views”.

Among the ideas discussed were whether there should be an independent agency to oversee certain aspects of the rapidly developing technology, how companies could be more transparent and how the US can stay ahead of China and other countries.

“The key point was really that it’s important for us to have a referee,” said Elon Musk, the CEO of Tesla and X, the social network formerly known as Twitter, during a break in the forum. “It was a very civilized discussion, actually, among some of the smartest people in the world.”

Congress should do what it can to maximize the benefits and minimize the negatives of AI, Schumer told reporters, “whether that’s enshrining bias, or the loss of jobs, or even the kind of doomsday scenarios that were mentioned in the room. And only government can be there to put in guardrails”.

Attendees also discussed the pressing need for steps to protect the 2024 US elections from disinformation becoming supercharged by AI, Schumer said.

“The issue of actually having deep fakes where people really believe that somebody, that a campaign was saying something when they were the total creation of AI” was a key concern, Schumer said, adding that “watermarking” – badging content as AI-generated – was discussed as a solution.

US Senate majority leader Chuck Schumer speaks to members of the press. Photograph: Alex Wong/Getty Images

Several AI experts and other industry leaders also attended, including Bill Gates; the Motion Picture Association CEO Charles Rivkin; the former Google CEO Eric Schmidt; the Center for Humane Technology co-founder Tristan Harris; and Deborah Raji, a researcher at University of California, Berkeley.

Some labor and civil liberties groups were also represented among the 22 attendees including Elizabeth Shuler, the president of the labor union AFL-CIO; Randi Weingarten, the president of the American Federation of Teachers; Janet Murguía, the president of UnidosUS; and Maya Wiley, the president and CEO of the Leadership Conference on Civil & Human Rights.

Sparked by the release of ChatGPT less than a year ago, businesses have been clamoring to apply new generative AI tools that can compose human-like passages of text, program computer code and create novel images, audio and video. The hype over such tools has accelerated worries over their potential societal harms and prompted calls for more transparency in how the data behind the new products is collected and used.

In his opening remarks, which Meta shared with the Guardian, Mark Zuckerberg said the company is working with academics, policymakers and civil society to “minimize the risk” of the technology while ensuring they don’t undervalue the benefits. He specifically cited work on how to watermark AI content to avoid risks such as the mass spread of disinformation.

Before the forum, representatives for the Alphabet Workers Union said that Shuler, the president of the AFL-CIO, would raise worker issues including those of AI raters – human moderators who are tasked with training, testing and evaluating results from Google Search and the company’s AI chatbot – who say they have struggled with low wages and minimal benefits.

“There are many conversations still to come and, throughout the process, the interests of working people must be Congress’ North Star,” Shuler said in a statement. “Workers are not the victims of technological change – we’re the solution.”

Meredith Stiehm, the president of the Writers Guild of America (WGA), and Randi Weingarten, the president of the American Federation of Teachers. Photograph: Shutterstock

While Schumer described the meeting as “diverse”, the sessions faced criticism for leaning heavily on the opinions of people who stand to benefit from the rapid advancements in generative AI technology. “Half of the people in the room represent industries that will profit off lax AI regulations,” said Caitlin Seeley George, campaigns and managing director at Fight for the Future, a digital rights group.

“People who are actually impacted by AI must have a seat at this table, including the vulnerable groups already being harmed by discriminatory use of AI right now,” George said. “Tech companies have been running the AI game long enough and we know where that takes us – biased algorithms that discriminate against Black and brown folks, immigrants, people with disabilities and other marginalized groups in banking, the job market, surveillance and policing.”

Some senators were critical of the private meeting, arguing that tech executives should testify in public. The Republican senator Josh Hawley said he would not attend what he said was a “giant cocktail party for big tech”.

“I don’t know why we would invite all the biggest monopolists in the world to come and give Congress tips on how to help them make more money and then close it to the public,” Hawley said.

Agencies contributed reporting
