News

New methods to maximise AI’s potential benefits

New methods for maximising the potential benefits of predictive and generative AI are to be developed in research involving the University of Strathclyde.

The PHAWM (Participatory Harm Auditing Workbenches and Methodologies) project will also aim to minimise the technology’s potential for harm arising from bias and ‘hallucinations’, in which AI tools present false or invented information as fact.

The project will pioneer participatory AI auditing, in which non-experts, including regulators, end-users and people likely to be affected by decisions made by AI systems, will play a role in ensuring that those systems provide fair and reliable outputs.

It will develop new tools to support the auditing process in partnership with relevant stakeholders, focusing on four key use cases for predictive and generative AI, and create new training resources to help encourage widespread adoption of the tools.

The predictive AI use cases in the research will focus on health and media content, analysing data sets for predicting hospital readmissions and assessing child attachment for potential bias, and examining fairness in search engines and hate speech detection on social media.

The project is receiving funding of £3.5 million from Responsible AI UK (RAi UK) as part of a £12 million programme to address the challenges of rapid advances in artificial intelligence.

Dr Yashar Moshfeghi, of the University of Strathclyde’s Department of Computer and Information Sciences and a member of the University’s iSchool Research Group, is the principal investigator of the work package dedicated to generative AI in cultural heritage in the PHAWM project.

He said: “Democratising AI auditing takes on added significance within generative AI, especially in contexts like Cultural Heritage. Within the PHAWM project, by collaborating with stakeholders such as the National Library of Scotland, we ensure that generative AI's creative potential is harnessed responsibly.

“Through participatory auditing, we take a crucial step towards safeguarding against risks like hallucinations, historical bias arising from the data, or lack of consideration of the context in which historical events have occurred, ensuring that AI-generated content respects the integrity of our shared cultural heritage.”

Heritage

The generative AI use cases in the project will look at cultural heritage and collaborative content generation. The project will explore the potential of AI to deepen understanding of historical materials without misrepresentation or bias, and how AI could be used to write accurate Wikipedia articles in under-represented languages without contributing to the spread of misinformation.

Dr Kedar Pandya, UKRI Technology Missions Fund SRO and Executive Director at EPSRC, said: “AI has great potential to drive positive impacts across both our society and economy. This £4m of funding through the UKRI Technology Missions Fund will support projects that are considering the responsible use of AI within specific contexts. These projects showcase strong features of the responsible AI ecosystem we have within the UK and will build partnerships across a diverse set of organisations working on shared challenges.

“These investments complement UKRI’s £1bn portfolio of investments in AI research and innovation, and will help strengthen public trust in AI, maximising the value of this transformative technology.”

The PHAWM consortium will be led by the University of Glasgow. Along with Strathclyde, it will involve the Universities of Edinburgh, Sheffield, Stirling and York, King’s College London and 23 partner organisations. Dr Moshfeghi's departmental colleagues, Professor Ian Ruthven and Dr Leif Azzopardi, are also working on the project.

RAi UK is led from the University of Southampton and backed by UK Research and Innovation (UKRI), through the UKRI Technology Missions Fund and EPSRC. UKRI has also committed an additional £4m of funding to further support these initiatives.