Project Details



Bottom-Up Design of a Shape-Changing UI for Parameters Control


Physical controls are widely used by professionals such as sound engineers or aircraft pilots. In particular, knobs and sliders are the most prevalent controls in such interfaces. They have advantages over touchscreen GUIs, especially when users require quick and eyes-free control. However, the interfaces that host them (e.g., mixing consoles) are often bulky and crowded. To address this, we present the results of a formative study with professionals who use physical controllers. Based on their feedback, we propose design requirements for future parameter-control interfaces. We then introduce the design of our KnobSlider, which combines the advantages of a knob and a slider in one shape-changing device. A qualitative study with professionals shows how KnobSlider supports the design requirements and inspired new interactions and applications.

In the next step, we investigate the error-related negativity (ERN), an EEG signal elicited when a person makes or observes an error, in collaborative settings where observing another user (the executer) perform a task is typical, and explore its applicability to HCI. We first show that ERN can be detected in signals captured by commodity EEG headsets, such as the Emotiv headset, while observing another person perform a typical multiple-choice reaction-time task. We then investigate anticipation effects by detecting ERN in the time interval when the executer is reaching towards an answer. We show that this signal can be detected with both a clinical EEG device and an Emotiv headset. Our results show that online single-trial detection is possible with both headsets during tasks that are typical of collaborative interactive applications, although there is a trade-off between detection speed and the quality and price of the headset. Based on these results, we discuss several HCI scenarios for using ERN in observation tasks and collaborative settings.
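The single-trial detection described above can be illustrated with a minimal sketch: cut an EEG epoch around each response event, baseline-correct it, and score the mean amplitude in the post-response ERN window. This is not the classifier used in the study; the sampling rate, window bounds, and amplitude threshold below are illustrative assumptions, and a real pipeline would add filtering and a trained classifier.

```python
import numpy as np

FS = 128                  # assumed sampling rate in Hz (Emotiv-class headset)
ERN_WINDOW = (0.0, 0.1)   # seconds after the response event (illustrative)

def epoch(signal, event_sample, fs=FS, pre=0.2, post=0.4):
    """Cut one epoch around an event, baseline-corrected to the pre-event mean."""
    start = event_sample - int(pre * fs)
    stop = event_sample + int(post * fs)
    ep = signal[start:stop].astype(float)
    baseline = ep[: int(pre * fs)].mean()
    return ep - baseline

def ern_score(epoch_data, fs=FS, pre=0.2):
    """Mean amplitude in the ERN window; more negative suggests an error trial."""
    i0 = int((pre + ERN_WINDOW[0]) * fs)
    i1 = int((pre + ERN_WINDOW[1]) * fs)
    return epoch_data[i0:i1].mean()

def classify(epoch_data, threshold=-2.0):
    """Label a single trial as 'error' if the ERN score falls below a threshold (in µV)."""
    return "error" if ern_score(epoch_data) < threshold else "correct"
```

As a usage example, injecting a short negative deflection right after a simulated response event makes `classify` return `"error"`, while a noise-only epoch is labelled `"correct"`; in practice the threshold would be calibrated per user and per headset.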