PhD Student in Brain-Computer Interfaces for Virtual Reality - Zurich, Switzerland - ETH Zürich

ETH Zürich
Zurich, Switzerland

1 week ago

Written by:

Lena Schneider

beBee Recruiter


Description

PhD student in Brain-Computer Interfaces for Virtual Reality, focused on Signal Processing and User-Interface Adaptation:


100%, Zurich, fixed-term:


The Sensing, Interaction & Perception Lab at ETH Zurich is looking for a PhD student in Brain-Computer Interfaces for Virtual Reality.

The project will place a strong focus on machine learning-based signal processing, optimization for adaptive user interfaces, neuromarkers of perception, and modeling of presence and sickness, as well as on empirical evaluation with users in an online setting through BCI and Virtual Reality user studies.


Brain-Computer Interfaces are the next frontier in interactive systems, enabling users to accomplish their tasks efficiently and effectively in immersive environments such as Virtual Reality and Augmented Reality. At the same time, efficient Brain-Computer Interfaces allow us to refine our understanding of processes in cognitive neuroscience.

Brain-Computer Interfaces (BCIs) analyze users' brain activity to enable them to interact with computer systems without muscle activity.

BCIs were originally designed to assist people with severe motor disabilities; a new trend in immersive environments is their use by a wider public.

For example, through passive BCI systems, it is possible to transparently monitor users' mental states to estimate their cognitive load or their level of engagement in a task.


In this project, we will design and study novel VR systems that analyze the electrophysiological activity of a user's brain, through a passive BCI, to improve the perception of virtual environments.

The objective is to integrate into the VR system new ways of assessing the user's mental state through real-time classification of EEG data, and to determine the neural correlates of physiological markers and users' perceptions.

The ultimate goal will be to improve user immersion by increasing levels of embodiment and reducing or preventing cybersickness through real-time adaptation of virtual content to mental states.
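
For illustration only, and not as the lab's actual pipeline, the minimal sketch below shows what such a real-time loop could look like: band-power features are extracted from a short EEG window, a pre-trained classifier estimates the mental state, and a hook adapts the virtual content. The stream source (get_eeg_window), the adaptation hook (adapt_vr_content), the 250 Hz sampling rate, and the frequency bands are all assumptions.

```python
# Minimal sketch of a passive-BCI adaptation loop (illustrative only).
# Assumes an EEG stream at 250 Hz and a pre-trained scikit-learn classifier;
# get_eeg_window() and adapt_vr_content() are hypothetical placeholders.
import numpy as np
from scipy.signal import welch

FS = 250  # assumed sampling rate in Hz
BANDS = {"theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}

def band_power_features(window: np.ndarray) -> np.ndarray:
    """Compute log band power per channel for a (channels, samples) window."""
    freqs, psd = welch(window, fs=FS, nperseg=FS)
    feats = []
    for lo, hi in BANDS.values():
        mask = (freqs >= lo) & (freqs < hi)
        feats.append(np.log(psd[:, mask].mean(axis=1)))
    return np.concatenate(feats)

def adaptation_loop(classifier, get_eeg_window, adapt_vr_content):
    """Classify each incoming EEG window and adapt the VR content accordingly."""
    while True:
        window = get_eeg_window()          # e.g. the last 1 s of EEG, shape (channels, FS)
        if window is None:                 # stream ended
            break
        x = band_power_features(window).reshape(1, -1)
        state = classifier.predict(x)[0]   # e.g. "comfortable" vs. "sickness-onset"
        adapt_vr_content(state)            # e.g. narrow field of view, slow locomotion
```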


Project background:

To date, a significant challenge for Virtual Reality systems is reaching widespread adoption for everyday computing tasks.

Rather, VR systems currently remain entertainment devices, without realizing their larger potential to replace mobile computing for everyday activities.

A common explanation for this is that current VR systems fail to characterize the user's mental state during interaction and thus lack the means to adapt interactive environments accordingly.

Related studies have shown that the same VR system and the same experiences can evoke significantly varying responses in users, both in terms of perception and in terms of their physiological reactions to immersive environments.

A major challenge for current systems is "cybersickness", a phenomenon that occurs in over 60% of users.

The term refers to the set of deleterious symptoms that can occur after shorter or longer periods of virtual reality use.

Less severely, and depending on personal characteristics, users can suffer from breaks in presence and immersion due to anomalies in rendering and interaction, which can lead to a poor sense of embodiment in their virtual avatars.

All of these cases severely affect the user experience, to the detriment of pleasant interaction and thus of the success of immersive computing.

Therefore, research on these phenomena is required to tailor interaction to users' physiological reactions and adapt immersive environments in real time.


Job description:


The aim of this project is to develop intelligent and interactive software experiences that detect physiological reactions related to cybersickness/simulator sickness, breaks in presence, and embodiment.

We will develop sensing models and signal processing algorithms for the early detection of these reactions. These will be input to our optimization-based engine for real-time system and UI adaptation.
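
As a hedged sketch of what such optimization-based adaptation might involve (not the project's actual engine), the example below picks rendering parameters that minimize a hypothetical cost trading off predicted discomfort against task utility. The cost terms, parameter bounds, and the FOV/locomotion parameters themselves are assumptions for illustration.

```python
# Illustrative sketch of optimization-based UI adaptation (assumed cost model).
import numpy as np
from scipy.optimize import minimize

def adaptation_cost(params: np.ndarray, p_sickness: float) -> float:
    """Cost over (field_of_view_deg, locomotion_speed); lower is better."""
    fov, speed = params
    comfort_penalty = p_sickness * (fov / 110.0 + speed / 3.0)   # discomfort grows with FOV and speed
    utility_loss = (110.0 - fov) / 110.0 + (3.0 - speed) / 3.0   # restricting the UI hurts the task
    return comfort_penalty + 0.5 * utility_loss

def adapt_ui(p_sickness: float) -> np.ndarray:
    """Return FOV (deg) and locomotion speed (m/s) minimizing the cost."""
    result = minimize(
        adaptation_cost,
        x0=np.array([110.0, 3.0]),            # start from the unrestricted UI
        args=(p_sickness,),
        bounds=[(60.0, 110.0), (0.5, 3.0)],   # hypothetical comfort bounds
    )
    return result.x

# Example: a high predicted sickness probability narrows the FOV and slows locomotion.
print(adapt_ui(p_sickness=0.8))
```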

The work will focus on the identification, characterization, and real-time detection of neurophysiological markers associated with users' mental states. A second component of the work will focus on the real-time adaptation of the immersive environment. The work will comprise methodological and theoretical aspects.

First, we will review the state of the art on the various dimensions associated with "user experience" in VR, the factors that influence their perception, and the range of "offline" markers that have been highlighted in prior work.

From this, we will define experimental protocols for controlled empirical studies that we will conduct with state-of-the-art EEG headsets in immersive settings, where we will record electroencephalographic data associated with the different mental states of VR users.

Through an in-depth analysis, we will determine the individual neuromarkers that relate to the various dimensions of user experience, with a particular focus on the perception of presence.
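
As a minimal sketch of this kind of offline analysis (an assumed workflow, not the project's protocol), one could correlate a per-trial band-power feature with subjective presence ratings collected after each VR exposure. The sampling rate, the alpha-band choice, and the input format are assumptions.

```python
# Illustrative offline neuromarker analysis: relate per-trial alpha power to
# presence ratings (hypothetical data layout, assumed 250 Hz EEG).
import numpy as np
from scipy.signal import welch
from scipy.stats import pearsonr

FS = 250  # assumed EEG sampling rate in Hz

def alpha_power(trial: np.ndarray) -> float:
    """Mean 8-13 Hz power across channels for one (channels, samples) trial."""
    freqs, psd = welch(trial, fs=FS, nperseg=FS)
    mask = (freqs >= 8) & (freqs < 13)
    return float(psd[:, mask].mean())

def presence_correlation(trials: list[np.ndarray], ratings: np.ndarray):
    """Pearson correlation between per-trial alpha power and presence ratings."""
    powers = np.array([alpha_power(t) for t in trials])
    return pearsonr(powers, ratings)  # correlation coefficient and p-value
```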
