People with severe physical disabilities often require assistance with daily activities. Assistive robotic arms can help them perform tasks such as eating and drinking, thereby promoting their independence and participation in society. However, controlling such systems remains challenging: conventional user interfaces are often not accessible, especially when multiple degrees of freedom (DoF) have to be controlled manually.
Shared control approaches such as the Adaptive DoF Mapping Control (ADMC) system, developed in Max's previous research, address this issue by suggesting movement options for the robotic arm that suit the current situation.
The central challenge, however, is communicating these suggestions in a way that is understandable and accessible, particularly for users with impaired visual perception. Many current systems rely exclusively on visual feedback, which quickly becomes problematic in inclusive contexts, and comparative studies of auditory and vibrotactile alternatives remain scarce.
This is precisely where the present project comes in: it systematically compares the comprehensibility, induced stress, and acceptance of three feedback modalities (visual, auditory, and vibrotactile) for adaptive robot movement suggestions. The project builds on the modular XR framework AdaptiX, which enables the flexible integration and evaluation of these modalities.
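To illustrate the kind of modular integration such a framework affords, the following is a minimal, hypothetical C++ sketch, not the actual AdaptiX API: a modality-agnostic feedback interface behind which visual, auditory, and vibrotactile channels can be swapped or combined per study condition. All names and types are illustrative assumptions.

```cpp
#include <iostream>
#include <memory>
#include <string>
#include <vector>

// A simplified movement suggestion: a label plus a confidence score.
struct MovementSuggestion {
    std::string description;  // e.g. "rotate gripper toward cup"
    double confidence;        // 0.0 .. 1.0, strength of the suggestion
};

// Common interface that every feedback modality implements.
class FeedbackModality {
public:
    virtual ~FeedbackModality() = default;
    virtual void present(const MovementSuggestion& s) const = 0;
};

// Stub implementations; a real system would drive a renderer,
// an audio engine, or vibration motors instead of printing.
class VisualFeedback : public FeedbackModality {
public:
    void present(const MovementSuggestion& s) const override {
        std::cout << "[visual] arrow overlay: " << s.description << "\n";
    }
};

class AuditoryFeedback : public FeedbackModality {
public:
    void present(const MovementSuggestion& s) const override {
        std::cout << "[audio] spoken cue: " << s.description << "\n";
    }
};

class VibrotactileFeedback : public FeedbackModality {
public:
    void present(const MovementSuggestion& s) const override {
        // Map suggestion confidence to pulse intensity (illustrative only).
        std::cout << "[haptic] pulse at intensity " << s.confidence << "\n";
    }
};

int main() {
    // Modalities can be enabled per study condition or user preference.
    std::vector<std::unique_ptr<FeedbackModality>> active;
    active.push_back(std::make_unique<VisualFeedback>());
    active.push_back(std::make_unique<VibrotactileFeedback>());

    MovementSuggestion s{"rotate gripper toward cup", 0.85};
    for (const auto& m : active) m->present(s);
}
```

Such an abstraction would let a study compare modalities without changing the rest of the pipeline: the controller emits one suggestion, and only the set of active feedback channels varies between conditions.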
With its Young Academy, TU Dortmund University supports researchers in the qualification phase in successfully acquiring external research funding.