Intracranial Brain Signals Decoding and VR Integration Using AI
PI (Technion): Assist. Prof. Stefano Recanatesi, PhD
PI (Rambam): Dr. Moshe Hershkovitz, MD
Clinical Need and Research Challenge
Patients with epilepsy who undergo intracranial monitoring with implanted deep electrodes provide a unique opportunity to observe human brain activity with exceptional spatial and temporal resolution. However, most intracranial EEG recordings are still studied in highly controlled laboratory paradigms that capture only a limited range of human cognition and behavior. Our project addresses this limitation by studying neural activity while patients engage in immersive virtual reality (VR) tasks using an Oculus headset.
These environments allow participants to interact with dynamic scenarios and game-like tasks that more closely resemble natural behavior. By combining intracranial EEG recordings with detailed behavioral measurements from VR interactions, we aim to decode task variables directly from neural activity. Ultimately, this approach enables the development of closed-loop systems in which neural signals can dynamically interact with task environments.
Artificial Intelligence Perspective
Understanding neural activity in naturalistic settings requires computational tools capable of extracting structure from complex, high-dimensional data. To address this challenge, we develop advanced artificial intelligence models designed to decode and interpret neural signals. Our approach uses deep decoding architectures based on transformer models to extract task-relevant information from intracranial brain activity. In parallel, we investigate complementary frameworks based on latent-state models and switching dynamical systems that identify reproducible neural states and transitions during behavior. Together, these approaches provide both high-performance neural decoding and a principled framework for interpreting how brain activity evolves over time in realistic environments.
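To make the transformer-based decoding idea concrete, the sketch below illustrates the core operation, scaled dot-product self-attention over a window of intracranial features, followed by temporal pooling and a linear readout of task variables. This is a minimal NumPy illustration, not the project's actual architecture: all dimensions, weights, and the single-head design are illustrative placeholders.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(x, axis=-1):
    # Numerically stable softmax
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(X, Wq, Wk, Wv):
    # X: (T, d) window of iEEG features; single-head scaled dot-product attention
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    scores = Q @ K.T / np.sqrt(K.shape[-1])   # (T, T) pairwise time-bin affinities
    return softmax(scores, axis=-1) @ V       # contextualized features, (T, d)

# Hypothetical dimensions: 64 electrode features, 100 time bins, 4 task classes
T, d, n_classes = 100, 64, 4
X = rng.standard_normal((T, d))               # one window of intracranial activity
Wq, Wk, Wv = (rng.standard_normal((d, d)) * 0.1 for _ in range(3))
W_out = rng.standard_normal((d, n_classes)) * 0.1

H = self_attention(X, Wq, Wk, Wv)             # (T, d) contextualized time bins
logits = H.mean(axis=0) @ W_out               # pool over time, project to task variables
probs = softmax(logits)                       # decoded task-variable probabilities
```

In a trained decoder the weights would be learned from paired neural and behavioral data, and multiple attention heads and layers would be stacked; the complementary latent-state models mentioned above would instead segment `H`-like trajectories into discrete, reproducible neural states.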
Research Goals and Current Progress
Our central objective is to understand how neural activity drives human behavior during interaction with virtual environments and interactive games. More broadly, we seek to uncover the neural dynamics that support naturalistic, goal-directed behavior and to build models that link internal brain states to actions, decisions, and learning processes. We have already collected intracranial EEG recordings during VR tasks from multiple patients. This dataset provides a strong foundation for developing and validating AI models of neural decoding. Early results demonstrate the feasibility of combining immersive behavioral paradigms with intracranial recordings to study neural dynamics in conditions that more closely resemble real-world cognition.