Postgraduate research opportunity: Explainable AI for decision-making under uncertainty
Key facts
- Opens: Wednesday 6 September 2023
- Number of places: Open call
- Duration: PhDs are 3 years full-time, or 6 years part-time
Overview
The project aims to bridge the gap between complex AI models and transparent decision-making in uncertain scenarios, developing a framework that makes the decisions of AI systems easier to understand and trust. Full details are given under Project details below.

Eligibility
For entry onto our PhD programme we look for a first-class or upper second-class UK Honours degree, or overseas equivalent, in a relevant subject such as data science, operational research or computer science. We also normally expect a Master's degree in a relevant subject, or overseas equivalent. When reviewing your academic achievements, we're particularly interested in grades that relate to independent research (for example, your research project or dissertation).

Project details
The project aims to bridge the gap between complex AI models and transparent decision-making in uncertain scenarios. In various fields, AI-driven systems are becoming integral to critical decisions, yet the opacity of their inner workings poses challenges for understanding and trusting the decisions they produce, particularly in uncertain or ambiguous contexts. This project seeks to develop a framework that enhances the explainability of AI models in such scenarios, enabling stakeholders to comprehend the rationale behind decisions and facilitating their acceptance.

By combining advanced machine learning techniques with probabilistic reasoning, the project endeavours to create models that not only provide accurate predictions but also generate human-interpretable explanations for them, shedding light on the factors and uncertainties that contribute to the decision outcomes.

The outcomes of this research have far-reaching implications, including improved decision-making in fields such as healthcare diagnosis, financial risk assessment and environmental monitoring. The proposed explainable AI framework could empower domain experts, regulators and end-users to use AI-generated insights with confidence, even in the face of uncertainty. In doing so, the project strives to advance the responsible and accountable adoption of AI technologies while fostering transparency and trust in complex decision-making processes.
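As a loose illustration of the kind of pairing described above, the sketch below combines a standard ensemble model with two simple proxies: the disagreement between ensemble members as a rough measure of predictive uncertainty, and feature importances as a human-readable explanation. This is a hypothetical example using scikit-learn on synthetic data, not the project's framework; the feature names are placeholders.

```python
# Minimal sketch: pairing a prediction with an uncertainty estimate and a
# simple explanation. This is an illustrative stand-in, not the project's method.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

# Synthetic stand-in for a real decision problem (e.g. a diagnosis task).
X, y = make_classification(n_samples=500, n_features=6, n_informative=3,
                           random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

model = RandomForestClassifier(n_estimators=200, random_state=0)
model.fit(X_train, y_train)

# Uncertainty proxy: spread of the individual trees' class-1 probabilities.
per_tree = np.stack([tree.predict_proba(X_test)[:, 1]
                     for tree in model.estimators_])
mean_p, std_p = per_tree.mean(axis=0), per_tree.std(axis=0)

# Explanation proxy: which input features drove the model overall.
for name, importance in zip([f"feature_{i}" for i in range(X.shape[1])],
                            model.feature_importances_):
    print(f"{name}: importance {importance:.3f}")

# Report the most uncertain test case alongside its prediction.
i = int(std_p.argmax())
print(f"Most uncertain case: P(class 1) = {mean_p[i]:.2f} ± {std_p[i]:.2f}")
```

Ensemble disagreement and global feature importances are only crude stand-ins for the calibrated probabilistic reasoning and instance-level explanations the project targets, but they show how a prediction, its uncertainty and an explanation can be reported side by side.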
Further information
Funding details
While there is no funding in place for opportunities marked "unfunded", there are many options to help you fund postgraduate research. Visit our funding your postgraduate research pages for links to government grants, research council funding and other sources that may be available.
Apply
Number of places: Open call
To read how we process personal data, applicants can review our 'Privacy Notice for Student Applicants and Potential Applicants' on our Privacy notices web page.