OpenMined is proud to announce that our flagship software, PySyft, has been profiled by the United Kingdom’s Department for Science, Innovation and Technology (DSIT) in its Portfolio of AI Assurance Techniques. This recognition highlights our ongoing commitment to building practical tools that bring greater transparency and accountability to AI systems.
PySyft enables external researchers to study proprietary algorithmic systems whilst protecting both trade secrets and private user data. Model owners load information about their production AI systems onto a secure server, where researchers can run privacy-preserving queries and receive answers without ever accessing the underlying sensitive information.
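To make that workflow concrete, here is a minimal sketch of what it can look like in code, assuming PySyft’s 0.8-series API (details vary between releases). The server name, credentials, dataset, and the average_score query are hypothetical placeholders for illustration, not part of any real deployment.

```python
import numpy as np
import syft as sy

# --- Model owner: launch a server and upload an asset ----------------------
# A local development server for demonstration; a real deployment would use a
# hosted node with its own accounts and credentials.
node = sy.orchestra.launch(name="audit-demo", dev_mode=True, reset=True)
owner = node.login(email="info@openmined.org", password="changethis")

# Register a dataset: the private data stays on the server, while a mock with
# the same shape is what external researchers see and prototype against.
dataset = sy.Dataset(
    name="recommendation-logs",
    asset_list=[
        sy.Asset(
            name="engagement-scores",
            data=np.array([0.91, 0.42, 0.77, 0.13]),  # private, never leaves the server
            mock=np.array([0.5, 0.5, 0.5, 0.5]),      # fake stand-in for researchers
        )
    ],
)
owner.upload_dataset(dataset)

# --- External researcher: propose a query for approval ---------------------
# For brevity we reuse the admin login; in practice the researcher would have
# a separate registered account.
researcher = node.login(email="info@openmined.org", password="changethis")
asset = researcher.datasets[0].assets[0]

@sy.syft_function_single_use(scores=asset)
def average_score(scores):
    # Returns an aggregate statistic only; raw records are never exposed.
    return float(scores.mean())

researcher.code.request_code_execution(average_score)

# --- Model owner: review and approve the proposed query --------------------
owner.requests[0].approve()

# --- Researcher: run the approved code against the real data ---------------
result = researcher.code.average_score(scores=asset)
print(result.get())
```

The key point is that the researcher writes and tests code against mock data, the model owner reviews exactly what will run, and only the approved, aggregated answer ever leaves the server.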
DSIT’s profile particularly showcases our work with the Christchurch Call Initiative on Algorithmic Outcomes (CCIAO), which aims to combat terrorist and violent extremist content online. Through CCIAO, OpenMined has successfully deployed PySyft at online platforms like Dailymotion and LinkedIn to demonstrate that privacy-enhancing technologies can effectively enable external oversight of AI systems while protecting proprietary information and user privacy.
We are heartened to see DSIT acknowledge the importance of privacy-preserving audit tools in building a robust AI assurance ecosystem. The recognition underscores how PySyft can address critical access challenges that typically inhibit meaningful AI governance.
We look forward to continuing our collaboration with the UK government as it works to foster a thriving AI assurance ecosystem that promotes both innovation and responsible development of AI technologies.
Read the full profile here: https://www.gov.uk/ai-assurance-techniques/openmined-privacy-preserving-third-party-audits-on-unreleased-digital-assets-with-pysyft