
Designing Explanations for AI-driven Trajectory Predictions
Raphaël Tuor
Human-IST Institute

Corresponding Author: [email protected]


Abstract

The constant increase in air traffic volume and complexity makes automated decision aids an essential tool to assist Air Traffic Controllers (ATCo) in conflict resolution tasks. In this paper, we present an overview of relevant literature in the domain of explainable AI (XAI) for trajectory predictions. We describe our explanation prototype based on Temporal Probability Density Fields represented with time instances. We present the plan of our first user study, which is intended to evaluate the effects of explanations on the ATCo's trust, satisfaction, performance, and adoption in a conflict detection task. The conflict detection explanation prototype considers the following aspects: the difficulty level of the conflict detection task, the information that needs to be displayed, and the suitable techniques to display it. Our two main contributions to the ATC research community are: (1) a simplified air traffic control simulation prototype that allows setting the difficulty level of a conflict detection task and comparing different visualization variants, and (2) a theoretical approach to the explanation of AI-driven trajectory predictions.