OpenAI is exploring collective decisions on AI, like Wikipedia entries

May 22 (Reuters) – ChatGPT’s creator OpenAI is testing how to gather broad input on decisions impacting its artificial intelligence, its president Greg Brockman said on Monday.

At AI Forward, an event in San Francisco hosted by Goldman Sachs Group Inc (GS.N) and SV Angel, Brockman discussed the broad contours of how the maker of the wildly popular chatbot is seeking regulation of AI globally.

One announcement he previewed is akin to the model of Wikipedia, which he said requires people with diverse views to coalesce and agree on the encyclopedia’s entries.

“We’re not just sitting in Silicon Valley thinking we can write these rules for everyone,” he said of AI policy. “We’re starting to think about democratic decision-making.”
Another idea that Brockman discussed, on which OpenAI elaborated in a blog post Monday, is that governments around the world should coordinate to ensure AI is developed safely.

Since the Nov. 30 launch of ChatGPT, generative AI technology that can spin uncannily authoritative prose from text prompts has captivated the public, making the program the fastest-growing app of all time. AI has also become a focus of concern over its ability to create deepfake pictures and other misinformation.

In assessing the path forward for AI, Brockman looked beyond Wikipedia for models. He and OpenAI said a body like the International Atomic Energy Agency (IAEA) could place restrictions on deployment, vet compliance with safety standards and track usage of computing power.
Another suggestion was a global agreement to limit the annual growth of frontier AI capabilities, or a joint global project that major governments could participate in.
OpenAI CEO Sam Altman proposed various ideas to U.S. lawmakers last week for setting guardrails for artificial intelligence, among them requiring licenses to develop the most sophisticated AI models and establishing a related governance regime. He is visiting European policymakers this week.