Human-Agent Interaction

AI systems are becoming increasingly intelligent and better able to perform tasks with little or no human support. However, this increase in behavioral complexity has come at the cost of transparency: systems perform well but are unable to explain their outputs. The field of eXplainable AI (XAI) studies and develops methods and techniques that equip intelligent systems with the ability to explain their behavior in a way that fosters human understanding and trust, which in turn leads to more effective human-AI teamwork.
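
As a purely illustrative sketch (not a specific XAI method from this work), the core idea of pairing a system's output with a human-readable justification can be expressed as follows; the rule, thresholds, and function names are hypothetical.

```python
from dataclasses import dataclass


@dataclass
class ExplainedDecision:
    """A decision bundled with a human-readable justification."""
    decision: str
    explanation: str


def assess_contact(speed_mps: float, distance_m: float) -> ExplainedDecision:
    # Hypothetical rule-based assessment: the thresholds are placeholders,
    # chosen only to show how an explanation can cite the evidence behind
    # a decision instead of returning a bare label.
    if speed_mps > 15 and distance_m < 500:
        return ExplainedDecision(
            decision="alert",
            explanation=(f"Contact is fast ({speed_mps} m/s > 15 m/s) and close "
                         f"({distance_m} m < 500 m), so it is flagged for the operator."),
        )
    return ExplainedDecision(
        decision="ignore",
        explanation=(f"Contact speed ({speed_mps} m/s) and distance ({distance_m} m) "
                     "are within normal bounds."),
    )


if __name__ == "__main__":
    result = assess_contact(speed_mps=20.0, distance_m=300.0)
    print(result.decision, "-", result.explanation)
```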

Example

For drone operators we developed an interaction prototype that allows a single operator to manage and control multiple drones. The interface monitors the sensor input from the drones and uses it to adapt what it shows, ranging from detailed information about a single drone (e.g. camera feeds) to high-level aggregated information (e.g. an overview of detected objects). The operator interacts with the drones through high-level tasking (e.g. monitor this area, stay away from that area) via a map interface and/or voice.
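
The adaptive display and high-level tasking described above can be sketched roughly as below. This is not the prototype's actual code; the drone fields, task types, command format, and the simple view-selection rule are assumptions made for illustration.

```python
from dataclasses import dataclass, field
from enum import Enum, auto


class TaskType(Enum):
    MONITOR_AREA = auto()   # "monitor this area"
    AVOID_AREA = auto()     # "stay away from that area"


@dataclass
class Drone:
    drone_id: str
    battery: float                           # state of charge, 0.0 - 1.0
    detections: list[str] = field(default_factory=list)


def choose_view(drones: list[Drone], selected: str | None) -> str:
    """Pick a display mode: a detail view when the operator selects one drone,
    otherwise an aggregated fleet overview (illustrative rule)."""
    if selected is not None:
        return f"detail view: camera feed and telemetry of {selected}"
    total = sum(len(d.detections) for d in drones)
    return f"overview: {len(drones)} drones, {total} detected objects"


def assign_task(drone: Drone, task: TaskType,
                area: tuple[float, float, float, float]) -> dict:
    """Translate a high-level operator task (issued via map or voice) into a
    command message for a drone; the message format is hypothetical."""
    return {"drone": drone.drone_id, "task": task.name, "area": area}


if __name__ == "__main__":
    fleet = [Drone("d1", 0.8, ["car"]), Drone("d2", 0.6, ["person", "truck"])]
    print(choose_view(fleet, selected=None))
    print(assign_task(fleet[0], TaskType.MONITOR_AREA, area=(52.0, 4.3, 52.1, 4.4)))
```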