Policy and Accountability

AI Harms Are Already Here

And we can’t rely on the tech industry to address them.

April 3, 2023

To address the rapid development of AI technologies that have the potential to transform or upend so many aspects of our society, what we don’t need is more self-regulation by the tech industry. We don’t need the industry to set forth the values that will shape the future of these systems, or to reassure the public that they’re the good guys. What we do need, urgently, is for government to enact a full spectrum of public interest, accountability-focused regulations — ones grounded in a rights-based framework that protects people from the harms of AI systems now. (Many of those harms are clearly laid out in a statement from our friends at DAIR, responding to the “AI pause” letter.)

A starting point for what that could look like is the Blueprint for an AI Bill of Rights, which the White House Office of Science and Technology Policy put out in October 2022. If enacted and implemented with real enforcement power, this set of five principles could radically reshape the sort of risk that is currently being externalized onto society by these companies. The Blueprint makes clear that AI and other automated systems are not magical or inevitable, but the result of decisions made by people — and that accordingly, there are actions we can take to ensure that AI and other automated systems are built and deployed in ways that protect the rights of the public.

What often gets obscured by the hype and catastrophizing about AI is that a handful of private actors are in a position to determine the major directions of AI development for all of society. In that light, the letter signed by business leaders and academics asking companies to “pause” their work on more powerful AI systems so the industry can assess the risks they pose is a clever bit of rhetoric. While some of the signatories are surely motivated by genuine concern and good intentions, they fall into the trap of assuming that it is up to the tech industry to create their own benchmarks for what passes societal muster. It is not.