OpenMined’s Policy Lead, Lacey Strahm, outlines innovative approaches to AI oversight in response to Australia’s consultation on Responsible AI.
As Australia continues to develop its approach to responsible AI development and deployment, a crucial question emerges: how can we effectively evaluate AI systems while protecting privacy and intellectual property? In a recent response to a consultation run by the Australian Government’s Department of Industry, Science and Resources, OpenMined presents an innovative approach to AI oversight: remote auditing.
Building on Australia’s Progress
Australia has already taken significant steps toward responsible AI development, including:
- Releasing and piloting an AI Ethics Framework aligned with OECD principles
- Issuing guidance on public sector AI adoption
- Investing in a National AI Centre to promote responsible AI practices
OpenMined’s response builds on these foundations by addressing a critical challenge: facilitating external access to AI systems for evaluation and oversight.
The Hidden Challenge of AI Oversight
The development of safe and responsible AI faces two interconnected challenges around access. First, auditors need to evaluate proprietary AI systems while respecting privacy, security, and intellectual property concerns. Second, companies often struggle to assess their AI systems’ real-world impacts because relevant data exists across multiple organizations.
Consider a streaming service’s movie recommendation system. While the company can track basic engagement metrics, it might not know whether its algorithm disrupts users’ sleep patterns. Even if a sleep-tracking app holds that information, sharing it raises significant privacy and competitive concerns. This scenario illustrates how the inability to access crucial impact data can leave potential harms undetected.
The Solution: Remote Audits
OpenMined proposes remote auditing as a solution that protects all stakeholders while enabling meaningful oversight. The approach begins with AI companies hosting their systems on secure servers with specialized APIs, while third parties (such as the sleep-tracking app above) securely contribute relevant impact data. Auditors then develop and test their evaluation methods on mock data: synthetic stand-ins that mirror the structure of the real data without revealing it.
Once auditors have finalized their evaluation approach against the mock data, all parties review and approve the proposed audit methodology. The approved evaluations then run on the real data in a controlled environment, and auditors receive verified answers to specific questions without ever seeing raw or sensitive data.
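To make this flow concrete, the sketch below walks through the three stages in plain Python. Everything here is a hypothetical illustration of the workflow described above, not OpenMined’s actual API: the AuditServer class, its methods, and the late-night-usage metric are invented for this post (in practice, OpenMined builds this kind of infrastructure in its open-source PySyft library).

```python
from dataclasses import dataclass, field
from typing import Callable

Evaluation = Callable[[list[float]], float]

@dataclass
class AuditServer:
    """Hypothetical secure server run by the AI company / data holder."""
    mock_data: list[float]                  # synthetic stand-in, safe to share
    real_data: list[float]                  # sensitive; never leaves the server
    approved: dict[str, Evaluation] = field(default_factory=dict)

    def develop(self, evaluation: Evaluation) -> float:
        """Stage 1: auditors iterate freely, but only against mock data."""
        return evaluation(self.mock_data)

    def approve(self, name: str, evaluation: Evaluation) -> None:
        """Stage 2: the data holder reviews the exact code and signs off."""
        self.approved[name] = evaluation

    def run(self, name: str) -> float:
        """Stage 3: only approved code touches the real data, and only the
        aggregate answer (never raw records) is returned to the auditor."""
        if name not in self.approved:
            raise PermissionError(f"Evaluation {name!r} has not been approved.")
        return self.approved[name](self.real_data)

# Illustrative audit question: average late-night viewing minutes per user.
def mean_late_night_minutes(minutes_per_user: list[float]) -> float:
    return sum(minutes_per_user) / len(minutes_per_user)

server = AuditServer(mock_data=[10.0, 20.0, 30.0],
                     real_data=[95.0, 120.0, 88.0])
print(server.develop(mean_late_night_minutes))    # test the method on mock data
server.approve("late_night_mean", mean_late_night_minutes)
print(server.run("late_night_mean"))              # verified answer, no raw data
```

The design property that matters is the trust boundary: raw data never leaves the server, and the auditor only ever receives approved, aggregate answers.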
Lessons from the Pharmaceutical Industry’s Approach to Safety
OpenMined draws an illuminating parallel between AI evaluation and drug development. Just as drugs are first tested in laboratories before moving to clinical trials, AI systems should undergo initial testing in simulated environments. After deployment, AI systems should be carefully monitored in real-world conditions. This staged approach allows for early detection of potential issues while recognizing that the true test comes from real-world implementation.
The comparison goes deeper than process. Both drug trials and AI audits fundamentally focus on human impact rather than technical specifications. Early testing provides important indicators, but the ultimate measure of success is whether people’s lives improve when using the technology. This framework suggests the need for continuous monitoring and assessment, similar to how the pharmaceutical industry maintains ongoing oversight of drugs after market release.
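As a hypothetical sketch of what that ongoing oversight could look like, the snippet below re-runs an already-approved remote audit on a schedule and flags drift past a baseline. The baseline, threshold, and stubbed audit call are all illustrative assumptions, not prescribed values.

```python
import random

BASELINE_MINUTES = 45.0   # assumed pre-deployment baseline (illustrative)
ALERT_THRESHOLD = 1.25    # flag anything 25% above baseline (illustrative)

def run_approved_audit() -> float:
    """Stand-in for re-running an approved remote audit (see the
    AuditServer sketch above); returns only a verified aggregate."""
    return random.uniform(30.0, 70.0)   # placeholder for the real answer

def monitoring_cycle() -> None:
    # Each cycle mirrors post-market drug surveillance: take a fresh
    # real-world measurement and compare it against the baseline.
    result = run_approved_audit()
    if result > BASELINE_MINUTES * ALERT_THRESHOLD:
        print(f"ALERT: {result:.1f} min/night exceeds baseline; investigate.")
    else:
        print(f"OK: {result:.1f} min/night is within the expected range.")

monitoring_cycle()   # in practice, scheduled (e.g. nightly) after deployment
```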
Looking Forward
As Australia continues to develop its approach to responsible AI, OpenMined’s remote auditing approach offers a practical path forward that balances innovation with oversight. By addressing fundamental access challenges while protecting all stakeholders’ interests, this approach could help Australia maintain its position as a leader in responsible AI development.
Read our full response to the Australian Government here.