Cognition & Control in Human-Machine Systems
We investigate how humans process information relevant to the effective control of machine systems, such as vehicles. Machines extend our physical capacity to sense and interact with our environments. For example, collision avoidance systems in an aircraft allow the pilot to be aware of fast-moving traffic before it is within range of human sight. Meanwhile, the pilot selectively relies on the information provided by the system to determine and execute the combination of actions necessary for effective maneuvering of the aircraft.
This continuous interaction between human and machine comprises a closed-loop system: information is constantly exchanged between the two, and each processes and acts on it according to its respective cognitive or control processes. Our group employs eye-tracking, motion capture, and electroencephalography (EEG) to characterize the capacity of a human operator to interact in tandem with a responsive machine system, in particular with vehicles whose control dynamics are well defined and engineered for their intended purpose. We believe that doing so will extend our current understanding of attentional processes and motor control. In addition, we are motivated to apply our findings to the development of novel and more effective interfaces for information visualization and shared control.
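To make the closed loop concrete, the following is a minimal simulation sketch in which an operator model steers a vehicle toward a moving reference. The first-order vehicle dynamics, the proportional operator gain, and the 200 ms reaction delay are illustrative assumptions for this sketch, not a model we endorse:

```python
# Minimal sketch of a closed-loop operator-vehicle system. All parameter
# values (gain, delay, dynamics) are hypothetical and for illustration only.
import numpy as np

DT = 0.01        # simulation step (s)
DELAY_S = 0.2    # assumed operator reaction delay (s)
GAIN = 2.0       # assumed operator proportional gain
T = 10.0         # simulated duration (s)

n_steps = int(T / DT)
delay_steps = int(DELAY_S / DT)

heading = 0.0                          # vehicle state (e.g., heading, deg)
error_history = [0.0] * delay_steps    # buffer implementing the reaction delay
trace = []

for k in range(n_steps):
    reference = np.sin(2 * np.pi * 0.2 * k * DT)  # target the operator tracks
    error = reference - heading
    # The operator perceives the error but only acts on it after the delay.
    error_history.append(error)
    perceived_error = error_history.pop(0)
    command = GAIN * perceived_error              # operator's control input
    heading += command * DT                       # first-order vehicle response
    trace.append((k * DT, reference, heading))
```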
Main research areas
Estimating perceptual-motor workload from EEG
The goal of this project is to extract EEG features that can reliably index the workload that an operator experiences in the domain of perceptual-motor control. Research into EEG markers of mental workload has tended to focus on aspects such as sustained attention or working memory. Here, we are motivated to estimate the operator's perceptual-motor fatigue before potentially fatal decrements in performance occur.
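As a sketch of what such a feature can look like, the mental-workload literature often reports a ratio of frontal theta to parietal alpha band power. The channel roles, sampling rate, and band limits below are illustrative assumptions, not our pipeline:

```python
# Candidate workload feature: frontal theta (4-8 Hz) over parietal alpha
# (8-13 Hz) band power. Sampling rate and band limits are assumptions.
import numpy as np
from scipy.signal import welch

FS = 250  # sampling rate (Hz), assumed

def band_power(signal, fs, lo, hi):
    """Integrate the Welch PSD estimate over [lo, hi] Hz."""
    freqs, psd = welch(signal, fs=fs, nperseg=2 * fs)
    band = (freqs >= lo) & (freqs <= hi)
    return psd[band].sum() * (freqs[1] - freqs[0])

def workload_index(frontal, parietal, fs=FS):
    """Theta/alpha ratio: rises with frontal theta, falls with parietal alpha."""
    theta = band_power(frontal, fs, 4.0, 8.0)
    alpha = band_power(parietal, fs, 8.0, 13.0)
    return theta / alpha

# Example with synthetic data (two 10 s single-channel recordings):
rng = np.random.default_rng(0)
fz = rng.standard_normal(10 * FS)  # stand-in for a frontal channel (e.g., Fz)
pz = rng.standard_normal(10 * FS)  # stand-in for a parietal channel (e.g., Pz)
print(workload_index(fz, pz))
```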
Detection and recognition during steering
High perceptual-motor demands can reduce our capacity to attend to secondary tasks. For example, we could fail to notice the sudden appearance of a crossing pedestrian, especially under severe driving conditions. In this line of research, we seek to understand how our capacity for detecting and recognizing peripheral events varies with increasing demands in the control task (e.g., instability).
Gaze control for relevant information retrieval
We move our eyes to actively select and process task-relevant information in real time. By monitoring how eye movements are coordinated during control maneuvers, we can determine which aspects of the visual scene support the operator's control capabilities. Our research in this area has two emphases. The first involves developing algorithms for estimating, filtering, and analyzing natural gaze in real time and under challenging scenarios (e.g., a cockpit environment). The second targets a fundamental understanding of how eye movements are coordinated to handle shifts in task priorities.
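One standard building block for such algorithms is velocity-threshold (I-VT) classification, which separates fixations from saccades in a gaze stream. The sketch below assumes a 250 Hz tracker and a 30 deg/s threshold; these values are illustrative, and our actual methods must cope with far noisier, mobile settings:

```python
# Velocity-threshold (I-VT) gaze classification sketch. Sampling rate and
# threshold are assumptions chosen for illustration.
import numpy as np

FS = 250            # gaze sampling rate (Hz), assumed
VEL_THRESH = 30.0   # saccade velocity threshold (deg/s), assumed

def classify_gaze(azimuth, elevation, fs=FS, thresh=VEL_THRESH):
    """Label each sample 'fixation' or 'saccade' from angular gaze velocity.

    azimuth, elevation: gaze direction in degrees, one sample per element.
    """
    # Sample-to-sample angular velocity (small-angle approximation).
    d_az = np.diff(azimuth)
    d_el = np.diff(elevation)
    velocity = np.hypot(d_az, d_el) * fs           # deg/s
    velocity = np.concatenate([[0.0], velocity])   # pad to input length
    return np.where(velocity > thresh, "saccade", "fixation")

# Example on synthetic gaze: a fixation, a fast jump, another fixation.
az = np.concatenate([np.zeros(50), np.linspace(0, 10, 5), 10 * np.ones(50)])
el = np.zeros_like(az)
labels = classify_gaze(az, el)
```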
Robust EEG measurement in mobile workspaces
EEG signals can suffer from artifacts caused by electromagnetic noise or muscle activity. These noise sources can be amplified in settings that involve heavy use of electrical equipment and voluntary user movements, such as moving-base flight simulators. Here, we seek to enable EEG recordings in such demanding workspaces by developing robust measurement paradigms and filtering algorithms.
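Two common first steps for cleaning EEG recorded near electrical equipment are a notch filter at the power-line frequency and a band-pass over the physiological range. The sketch below uses assumed frequencies and filter orders; it illustrates the idea, not the measurement paradigms developed here:

```python
# Line-noise suppression and band-pass filtering sketch for a single EEG
# channel. All cutoff values and filter orders are illustrative assumptions.
import numpy as np
from scipy.signal import iirnotch, butter, filtfilt

FS = 250          # sampling rate (Hz), assumed
LINE_HZ = 50.0    # power-line frequency (50 Hz in Europe, 60 Hz in the US)

def clean_eeg(x, fs=FS):
    """Suppress line noise, then keep 1-40 Hz (removes slow drift and
    attenuates high-frequency muscle artifact)."""
    b_notch, a_notch = iirnotch(w0=LINE_HZ, Q=30.0, fs=fs)
    x = filtfilt(b_notch, a_notch, x)
    b_band, a_band = butter(4, [1.0, 40.0], btype="bandpass", fs=fs)
    return filtfilt(b_band, a_band, x)

# Example: a 10 s synthetic channel with strong 50 Hz contamination.
t = np.arange(0, 10, 1 / FS)
raw = np.sin(2 * np.pi * 10 * t) + 2.0 * np.sin(2 * np.pi * LINE_HZ * t)
cleaned = clean_eeg(raw)
```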