The Biden administration is wading into the debate about whether the most powerful artificial intelligence systems should be “open source” or closed. Tech companies are divided on how open to make AI models.
David Gray Widder, a postdoctoral fellow at Cornell Tech, examines ethical, political and economic aspects of “open” AI systems.
Widder says:
“The ‘open versus closed’ binary obscures much. Even maximally open AI systems can provide some transparency, reusability, and extensibility, but they do not themselves ‘democratize’ access to AI or enable outside scrutiny, as both require resources concentrated in the hands of a few large companies.
“Those lobbying for ‘open’ AI have made clear how they stand to gain from this path: in earnings calls, Mark Zuckerberg has described how Meta’s choice to open source PyTorch lets the company easily productize and profit from external contributions.
“On the other hand, companies such as OpenAI that argue for closed AI systems on the basis of ‘safety’ have historically rejected other ‘burdensome’ safety mechanisms like audits, suggesting their push for restrictions is less about safety than about using regulation to entrench their position as forerunners.”