Participating in the Punchbowl Pop-Up Conversation on June 12, Senator John Hickenlooper (D-CO) reiterated his call for qualified third parties to audit AI systems and verify their compliance with federal laws and regulations.
Jake Sherman of Punchbowl News asked the Senator to explain his February 2024 keynote speech at the Silicon Flatirons Flagship Conference, in which he proposed AI auditing standards and laid out a framework for AI regulation.
Sherman asked, specifically, "What kind of third party do you foresee being able to oversee AI?" Senator Hickenlooper responded, "Well, I mean, it's got to be somebody." He went on, "It's great to let people say, oh well, you're going to self-regulate yourself, right? It's like letting, you know, high school kids grade their own papers." You can see the exchange here.
The Center for AI Policy (CAIP) agrees with Senator Hickenlooper. Qualified third parties are needed to audit AI systems, and this commonsense requirement should be passed into law. Although some AI companies take the initiative to assess their own risks, independent experts must audit all Generative AI systems and ensure they adhere to federal safety standards. This policy is an important part of CAIP's model legislation, which you can learn more about here.
Analyzing present and future military uses of AI
AISI conducted pre-deployment evaluations of Anthropic's Claude 3.5 Sonnet model
Slower AI progress would still move fast enough to radically disrupt American society, culture, and business