Towards a Quantitative Evaluation Metric for XAI

The goal of your research in this thesis is to find a quantitative evaluation metric that can be used to estimate the usefulness of ML model explanations.

Required interest(s)

  • Explainable AI
  • Artificial Intelligence
  • Machine Learning

What do you get

  • A challenging assignment within a practical environment
  • € 1000 compensation, € 500 + lease car, or € 600 + living space
  • Professional guidance
  • Courses aimed at your graduation period
  • Support from our academic Research Center
  • Two vacation days per month

What you will do

  • 65% Research
  • 10% Analyze, design, realize
  • 25% Documentation

As machine learning (ML) systems take a more prominent and central role in life-impacting decisions, ensuring their trustworthiness and accountability is of utmost importance. Explanations sit at the core of these desirable attributes of an ML system. The emerging field is frequently called “Explainable AI (XAI)” or “Explainable ML.” The goal of explainable ML is to intuitively explain the predictions of an ML system while adhering to the needs of various stakeholders. Many explanation techniques have been developed with contributions from both academia and industry. However, several existing challenges have not garnered enough interest and serve as roadblocks to the widespread adoption of explainable ML.

It is difficult to determine which explainable AI (XAI) technique is most useful in a given scenario. At the moment, no proper quantitative evaluation metric exists to determine this. As a result, it is unknown whether the explanations created for a certain model are sufficient and usable.

The goal of your research is to find a quantitative evaluation metric that can be used to estimate the usefulness of ML model explanations. This should be a stable metric that could even be utilized in an automated MLOps flow, to give the best possible set of explanations for a given ML model. Your results will assist us in implementing XAI solutions at our clients.
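To make the kind of metric concrete, the sketch below shows one candidate from the XAI literature: a deletion-based fidelity score for feature-attribution explanations. It is only an illustration of the shape such a metric could take, not the metric this thesis asks you to produce. The `model.predict_proba` call assumes a scikit-learn-style classifier, and `x` and `attributions` are hypothetical stand-ins for a single input and its per-feature importance scores.

```python
import numpy as np

def deletion_fidelity(model, x, attributions, baseline=0.0, steps=10):
    """Area under the 'deletion curve': progressively replace the features the
    explanation ranks as most important with a baseline value and track how the
    model's confidence in its original prediction drops. A faster drop (smaller
    area) suggests the explanation points at genuinely influential features."""
    order = np.argsort(attributions)[::-1]          # most important feature first
    x_pert = np.array(x, dtype=float).copy()
    cls = int(np.argmax(model.predict_proba(x_pert.reshape(1, -1))[0]))
    scores = [model.predict_proba(x_pert.reshape(1, -1))[0][cls]]
    chunk = max(1, len(order) // steps)
    for i in range(0, len(order), chunk):
        x_pert[order[i:i + chunk]] = baseline       # "delete" the next chunk of features
        scores.append(model.predict_proba(x_pert.reshape(1, -1))[0][cls])
    return float(np.trapz(scores, dx=1.0 / (len(scores) - 1)))
```

In an automated MLOps flow, a score like this could in principle be computed for each candidate explanation technique on a validation set and used to rank them, although whether such a metric is stable and meaningful enough for that purpose is exactly the question this research should answer.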

About Info Support Research Center

We anticipate upcoming and future challenges and ensure our engineers develop cutting-edge solutions based on the latest scientific insights. Our research community proactively tackles emerging technologies. We do this in cooperation with renowned scientists, making sure that research teams are positioned and embedded throughout our organisation and our community, so that their insights are directly applied to our business. We truly believe in sharing knowledge, so we want to do this without any restrictions.

Sign up for this assignment

  • Accepted file types: docx, doc, txt, pdf.

Application procedure

  1. Introductory meeting

    Discuss (study) career, interests and ambitions, and an introduction to Info Support.

  2. Review

    Assessment of professional knowledge and personality (capacity, competences and motives).

  3. Selection interview

    Deepen professional knowledge and personality.

  4. Signing of the contract

    Contract offer and invitation to the contract signing moment.