Who Will Shape AI in the Public Interest


This was so good I had to fire up the Linkblog again!

The current controversy over the Pentagon’s AI contracts reveals a deeper issue: governments are shaping the AI market through procurement in the wrong ways or not at all, failing to make demands that AI strengthen democracy and improve governance. As AI becomes core public infrastructure, public institutions must use their purchasing power deliberately by requiring portability, accountability, and interoperability and prioritizing use in the public interest. This post explains the public conversation we are having about public and democratic AI and how governments can buy, build, and govern AI on the public’s terms.
Public institutions have extraordinary power to shape how market players behave. In this case, the federal government appears to be misusing that power. But just as concerning is that governments at every level, including states and cities, are failing to use their purchasing power to shape AI in the public interest.

I've long argued that one of the main ways culture is set or changed in organisations and institutions is through procurement. Often thought of as a financial and risk-management process, the choices made during procurement, especially when it comes to technology, have a massive impact on culture.

The choice to use Microsoft, for example, tends to produce a closed, risk-averse culture through the way its tools work. Beyond the tools themselves, building around Microsoft as a core closes the organisation off to both other technologies and other ways of working. Choosing alternatives has its own cultural implications: Google (more collaborative), Proton (more privacy- and ethics-focused). Every choice is a cultural one, though that is rarely ever defined as part of the process.

And now, when it comes to AI, the choices are also cultural. Microsoft leads to Copilot, and perhaps to OpenAI. Beyond the ethical choices here, the lock-in and the technical moat created have implications for culture. As more people use the tools, the way the models respond, and what they highlight or don't, has real culture-shaping potential.

And beyond this, as the post highlights, institutions should shape markets. Yes, it is written from an American point of view, where institutions have much greater market-shaping potential, but it still matters here. The post highlights examples from Europe, but not the UK. Where are our market shapers, both in the public sector and the social purpose sector?

https://rebootdemocracy.ai/blog/public-ai
