Projects from Kai Biermeier, M.Sc.
TRR 318-2 - Project B06: Ethics and normativity of explainable AI
B06 investigates the normative purposes of XAI. In the first funding period we established that there are many different normative grounds for XAI. To assess them, it is necessary to take the organizational context of XAI into account. To that end, media studies will clarify the organizational context in which XAI is embedded, where this ...
Duration: 01/2026 - 06/2029
TRR 318-2 - Project B05: Co-constructing explainability with an interactively learning robot
The research focus of B05 is the double loop of training and understanding. In the training loop, the robot continuously adapts and refines its movements based on user input. The understanding loop allows the human trainer to develop a deeper comprehension of the robot's learning mechanisms through real-time interaction and explanations provided during ...
Duration: 01/2026 - 06/2029
TRR 318-2 - Project B01: A dialog-based approach to explaining machine-learning models
B01 explores how dialog-based explanations of machine learning (ML) models function in real-world organizational contexts, taking into account organizational structures, roles, and communication styles, with a focus on the predictive policing domain. An experimental evaluation showed that dialog-based explanations significantly enhance users’ understanding ...
Duration: 01/2026 - 06/2029
TRR 318-2 - Project A06: Explaining the multimodal display of stress in clinical explanations
We investigated the influence of stress and mental health conditions in explanatory settings. We determined how signals related to understanding differ intra-individually under stress and inter-individually for people with social interaction conditions. In the second phase, we will develop techniques to train clinicians to detect signs of stress ...
Duration: 01/2026 - 06/2029
TRR 318-2 - Project A04: Co-constructing duality-enhanced explanations
Technical artifacts can be explained via their Architecture (e.g., structure and mechanisms) and their Relevance (e.g., functions and goals), summarized as Duality. We will analyze human-human explanations of digital artifacts with respect to duality-related monitoring and multimodal scaffolding, and how these are tailored to explainees' (EEs') social roles, and ...
Duration: 01/2026 - 06/2029
TRR 318-2 - Project A03: Co-constructing explanations between AI-explainer and human explainee under arousal or nonarousal
We investigate how arousal affects the processing of explanations. Arousal can arise from the task, from contextual factors, or from the explanation itself. Our goal is to develop an interactive system that co-constructs an explanation that allows the explainee to understand XAI explanations when over- or under-aroused. Both the human and the ...
Duration: 01/2026 - 06/2029
TRR 318-2 - Constructing Explainability
The scope of the EU right to explanation has fueled the need to improve eXplainable Artificial Intelligence (XAI) capacities, with the aim of strengthening the rights of individuals affected by AI-based recommendations. Among other purposes, explanations serve the right to contest an AI output and protect humans from losing control. However, ...
Duration: 01/2026 - 06/2029
TRR 318-2 - Project A01: Adaptive explanation generation
Our aim is to investigate the adaptation mechanisms that enable partners to interactively construct an explanation, and then to imbue interactive XAI systems with similar capabilities. Based on results suggesting an interplay between interactive adaptivity (verbal moves) and cognitive adaptivity (partner model), we developed a computational model for generating ...
Duration: 01/2026 - 06/2029
TRR 318-2 - Project A02: Monitoring the understanding of explanations
Our goal is to build a computational architecture that monitors a human interlocutor’s moments of (non)understanding in multimodal ensembles. We will make artificial explainers (EXs) able to reason about these moments and adapt their explanatory strategy accordingly. Utilizing a large corpus of explanations (MUNDEX), we integrate rich, multimodal signals with different ...
Duration: 01/2026 - 06/2029
TRR 318-2 - Project A05: Contextualized and online parametrization of scaffolding in human–robot explanatory dialog
Negation and verbal contrast are important means of scaffolding. When negations are provided while explaining joint actions, they link present knowledge about actions to actions performed in the past. We continue to develop scaffolding strategies for dialog with a social robot in a joint action setting. Our long-term objective ...
Duration: 01/2026 - 06/2029