Postgraduate research opportunity: Explainable AI for decision-making under uncertainty

Key facts

  • Opens: Wednesday 6 September 2023
  • Number of places: Open call
  • Duration: PhDs are 3 years full-time, or 6 years part-time

Overview

The project aims to bridge the gap between complex AI models and transparent decision-making in uncertain scenarios. In various fields, AI-driven systems are becoming integral for making critical decisions, yet the opacity of their inner workings poses challenges for understanding and trusting the decisions they produce, particularly in uncertain or ambiguous contexts. This project seeks to develop a framework that enhances the explainability of AI models in such scenarios.

Eligibility

For entry onto our PhD programme we look for a first-class or upper second-class UK Honours degree, or overseas equivalent, in a relevant data science, operational research or computer science related subject. We also normally expect a Master's degree in a relevant subject, or overseas equivalent. When reviewing your academic achievements, we're particularly interested in grades which relate to independent research (for example, your research project or dissertation).

Project Details

This project seeks to develop a framework that enhances the explainability of AI models in uncertain or ambiguous decision-making contexts, enabling stakeholders to understand the rationale behind decisions and facilitating their acceptance. By combining advanced machine learning techniques with probabilistic reasoning, the project aims to create models that not only provide accurate predictions but also generate human-interpretable explanations for those predictions, shedding light on the factors and uncertainties that contribute to the decision outcomes.

The outcomes of this research have far-reaching implications, including improved decision-making in fields such as healthcare diagnosis, financial risk assessment, and environmental monitoring. The proposed explainable AI framework could empower domain experts, regulators, and end-users to use AI-generated insights with confidence, even in the face of uncertainty. In doing so, the project strives to advance the responsible and accountable adoption of AI technologies while fostering transparency and trust in complex decision-making processes.
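
As a purely illustrative sketch of the kind of pipeline this description points at (not the project's own method), the Python snippet below assumes scikit-learn and a synthetic regression dataset standing in for a real decision problem. It uses a bootstrap ensemble as a crude stand-in for probabilistic uncertainty and permutation feature importance as a simple form of explanation; both choices are assumptions made for illustration only.

    import numpy as np
    from sklearn.datasets import make_regression
    from sklearn.ensemble import RandomForestRegressor
    from sklearn.inspection import permutation_importance
    from sklearn.model_selection import train_test_split

    # Synthetic data standing in for a real decision problem (e.g. a risk score).
    X, y = make_regression(n_samples=500, n_features=5, noise=10.0, random_state=0)
    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

    # Bootstrap ensemble: each member is trained on a resampled training set,
    # so the spread of member predictions acts as a rough uncertainty estimate.
    rng = np.random.default_rng(0)
    members = []
    for _ in range(20):
        idx = rng.integers(0, len(X_train), size=len(X_train))
        member = RandomForestRegressor(n_estimators=50, random_state=0)
        member.fit(X_train[idx], y_train[idx])
        members.append(member)

    preds = np.stack([m.predict(X_test) for m in members])  # shape (20, n_test)
    mean_pred = preds.mean(axis=0)   # point prediction
    spread = preds.std(axis=0)       # uncertainty proxy

    print(f"First test case: prediction {mean_pred[0]:.1f} +/- {spread[0]:.1f}")

    # A simple global explanation: how much does shuffling each feature degrade
    # the fit? Larger drops mean the feature matters more to the decisions.
    reference = RandomForestRegressor(n_estimators=100, random_state=0)
    reference.fit(X_train, y_train)
    importance = permutation_importance(reference, X_test, y_test,
                                        n_repeats=10, random_state=0)
    for i in np.argsort(importance.importances_mean)[::-1]:
        print(f"Feature {i}: importance {importance.importances_mean[i]:.3f}")

The ensemble spread and the importance ranking here are only rough proxies for the calibrated uncertainty estimates and richer, instance-level explanations that an explainable AI framework of the kind described above would aim to provide.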

Further information

Find more information about risk and uncertainty.

Supervisors

For information about supervisors for this project, contact sbs-pgradmissions@strath.ac.uk.

Apply

Number of places: Open call

To read how we process personal data, applicants can review our 'Privacy Notice for Student Applicants and Potential Applicants' on our Privacy Notices web page.

Programme: Management Science

  • PhD, full-time. Start date: Oct 2023 - Sep 2024
  • PhD, part-time. Start date: Oct 2023 - Sep 2024

Contact us

For further information, contact sbs-pgradmissions@strath.ac.uk.