The dome of the U.S. Capitol in Washington is seen beyond a fountain (CNS photo/Kevin Lamarque, Reuters).

Last year, following a dramatic and closely monitored Senate hearing on the risks of artificial intelligence (AI), Senate Majority Leader Chuck Schumer announced an “all-hands-on-deck” effort to get a handle on an emerging technology that critics warn could threaten the human race with extinction.

As part of this initiative, Schumer convened a series of “AI Insight Forums,” where members of Congress and tech executives—most notably, OpenAI chief executive Sam Altman, Elon Musk, Mark Zuckerberg, and former Google CEO Eric Schmidt—discussed the most pressing issues AI presents. These included the abuse of user privacy, copyright and intellectual-property infringement, short- and long-term effects on employment, and threats to national security.

“We come together at a moment of revolution, not one of weapons or of political power, but a revolution in science and understanding that will change humanity,” Schumer said, announcing the forums. “We must approach AI with the urgency and humility it deserves.”

Given the stakes, it was reasonable to expect an ambitious legislative proposal to emerge from these discussions—or, at minimum, a broad sketch of the federal regulations the AI industry itself has been calling for. But instead of proposing legislation to slow AI’s dangerous acceleration and protect American citizens from its misuse, Schumer and his hand-picked bipartisan group of senators called for $32 billion in emergency funding for research and development.
Almost immediately, experts on both the Left and the Right criticized the plan. The Wall Street Journal called it an “AI pork barrel,” noting that Congress had already passed the $280 billion CHIPS and Science Act in 2022, which allocated more than half of its funding to scientific research unrelated to microchips. Alondra Nelson, former acting director of the White House Office of Science and Technology Policy, faulted Schumer and his group for prioritizing innovation and competitiveness over regulation and the public interest, and for failing to articulate a “robust vision” for a future in which U.S. citizens are protected from the risks of AI while sharing in its benefits. Defending his plan, Schumer characterized it as merely a guide for bipartisan AI legislation, insisting that “committees do the legislating. That’s what’s always happened around here.”

The urgency of this issue, however, demands a new approach. The United States is woefully behind the rest of the developed world in addressing the risks posed by AI and other technologies. While the Schumer group was finalizing its proposal—without meaningful input from AI ethicists, policy experts, labor representatives, or civil-liberties advocates—the European Union was passing the Artificial Intelligence Act, which codifies clear requirements for those who develop or deploy AI. The act follows the General Data Protection Regulation, which the EU has enforced since 2018 to protect the privacy of internet users. Today, the United States is the only country in the G20 without a national privacy law.

More encouragingly, Democratic senator Maria Cantwell and Republican representative Cathy McMorris Rodgers recently released the American Privacy Rights Act (APRA), an ambitious proposal to limit how much information tech companies can collect from users and an important step in the decades-long fight to pass national privacy protections. But with a presidential election only a few months away, the proposal runs the risk of stalling in Congress, just as the American Data Privacy and Protection Act did before it.

If U.S. lawmakers are serious about managing AI’s risks, they should pass APRA and then, like the EU, immediately get to work enacting strict regulations. A revolution has started, as Schumer rightly noted, and we’re running out of time to do anything about it.  

Miles Doyle is Commonweal’s special projects editor.


Published in the June 2024 issue.