Researchers have launched a free tool to support the safe use of AI applications.
The University of Strathclyde is a partner in the PHAWM (Participatory Harm Auditing Workbenches and Methodologies) project, which has developed the tool to help organisations, policymakers, and the public make the most of AI apps while identifying their potential harms.
The tool aims to address the urgent need for rigorous assessments of AI risks amid the rapid expansion and adoption of the technology across a wide range of sectors. It is also designed to help support the aims of regulations such as the EU’s AI Act, introduced in 2024, which seek to balance AI innovation with protections against unintended negative consequences.
Better outcomes
The tool will enable ordinary users to conduct in-depth audits of the strengths and weaknesses of AI-driven apps. It will also actively involve audiences who are usually excluded from the audit process, including those affected by an app’s decisions, to produce better outcomes for end users.
The tool and framework are free to download from PHAWM’s website.
Principal Investigator at Strathclyde, Dr Yashar Moshfeghi, said: “AI can be an invaluable resource for processing and understanding vast amounts of often complex information, yet significant concerns remain about its safety and reliability.
“AI’s increasing prevalence means it is important that people using it, particularly those without technical knowledge of the technology, can do so with confidence and reassurance, whether for professional or personal use. The tool we have developed will help these users take full advantage of AI’s potential while minimising their exposure to its risks.”
The tool and its accompanying guiding framework have been developed through extensive workshops with the project’s partners and other stakeholders in the health and cultural heritage sectors. These are two of the four areas, along with media content and collaborative content generation, that the project, led by the University of Glasgow, was set up to investigate.
PHAWM is supported by £3.5 million in funding from Responsible AI UK (RAi UK) and brings together more than 30 researchers from seven UK universities and 28 partner organisations.
The team is also developing comprehensive training and support for certification to help organisations adopt PHAWM’s auditing tools as effectively as possible.