Estimating Fear Tendency from Physiological Responses using virtual stimuli - For psychotherapy and similar applications it is critical to be able to measure responses to virtual reality, and physiology may be even more useful than subjective reporting. This is all the more important for the closed-loop applications advocated in affective computing. We induced four types of fearful stimuli in VR and systematically measured both self-reported phobia (via questionnaires) and physiology. Our expectation was that high phobia would result in a high degree of fear in the VR, accompanied by high physiological arousal. Our findings show that this is not always the case.
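The core analysis behind this kind of study is correlating a questionnaire phobia score with a physiological arousal index per participant. A minimal sketch, with entirely simulated data (the sample size, score range, and arousal measure are illustrative assumptions, not the study's actual numbers):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical data for 30 participants: a questionnaire phobia score
# (0-100) and a physiological arousal index (e.g. mean skin-conductance
# response amplitude during the VR stimulus). Values are simulated to
# illustrate the analysis, not taken from the study.
phobia = rng.uniform(0, 100, size=30)
arousal = 0.2 * (phobia / 100) + rng.normal(0.0, 0.3, size=30)

# A weak correlation here would mirror the finding that reported phobia
# does not always predict physiological arousal in VR.
r = np.corrcoef(phobia, arousal)[0, 1]
print(f"Pearson r between phobia score and arousal: {r:.2f}")
```

In practice the arousal index would be extracted from the recorded signal (skin conductance, heart rate, etc.) per stimulus type rather than simulated.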
Towards Reliable Monitoring and Machine-Induced Regulation of Stress in Everyday Life using Highly Immersive Virtual Reality (2017-2019). joyventures.com, in collaboration with Dr. Yulia Golland, School of Psychology.
A timely intervention to manage daily stress can significantly improve our physiological and psychological health and well-being. A key ingredient in most intervention methods is the measurement of daily stress, but this has proved challenging. The "gold standard" is often considered to be cortisol levels, but these are not easy to determine and have very low temporal resolution. Subjective reports have been found to correlate poorly (0.26-0.36) with cortisol levels. Therefore, there is growing interest in developing accurate measurements of stress based on neurophysiology. A sensor-based continuous measurement of stress in daily life would be a necessary building block for a wide range of applications and scenarios (often referred to as "affective computing"), including many interventions for wellness.
Our first and main goal is to make substantial progress towards accurate and reliable continuous measurement of stress using neurophysiological signals in everyday life, based on a data-driven machine learning approach. We use immersive virtual reality to simulate everyday life situations, retaining both ecological validity and experimental control. Our next goal is to harness this measurement for regulating stress. Given that we live in a digital world (soon to become even more pervasive with IoT devices), we ask: can machines take a proactive role in helping to regulate people's stress levels?
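The data-driven idea can be sketched in a few lines: extract per-window physiological features, label windows as calm or stressed, and fit a classifier. The feature choices, their distributions, and the nearest-centroid model below are illustrative assumptions standing in for the project's actual sensors and learning pipeline:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical training data: per-window physiological features
# (mean heart rate in bpm, mean electrodermal activity in microsiemens),
# labelled 0 = calm, 1 = stressed. Distributions are made up for the sketch.
calm = rng.normal([70.0, 2.0], [5.0, 0.5], size=(50, 2))
stressed = rng.normal([90.0, 6.0], [8.0, 1.0], size=(50, 2))
X = np.vstack([calm, stressed])
y = np.array([0] * 50 + [1] * 50)

# Standardise features, then classify each window by its nearest
# class centroid -- about the simplest data-driven stress detector.
mu, sigma = X.mean(axis=0), X.std(axis=0)
Xz = (X - mu) / sigma
centroids = np.array([Xz[y == c].mean(axis=0) for c in (0, 1)])

def predict(samples):
    z = (np.atleast_2d(samples) - mu) / sigma
    d = np.linalg.norm(z[:, None, :] - centroids[None, :, :], axis=2)
    return d.argmin(axis=1)

acc = (predict(X) == y).mean()
print(f"training accuracy: {acc:.2f}")
```

A real system would replace the toy features with signals recorded during the VR scenarios and validate on held-out sessions rather than training data.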
Led by media artist and researcher Daniel Landau, The Mediated-Body Lab is an art and science research laboratory aimed at creating a bridge between the humanities, arts, and sciences to study the complex relationship between body and technology. At the center of our investigation is human-machine co-evolution. We believe that addressing the complex societal and interpersonal challenges facing humans in the information age requires a multi-disciplinary effort and in-depth cross-fertilization of theory, methodologies, and practices. Typical outcomes of our work take the form of academic papers, art installations, performances, workshops, and conferences.
Key themes of investigation:
Virtual Peacemakers: Mimicry Increases Empathy in Simulated Contact with Virtual Outgroup Members
This research examined virtual–human interactions as a new form of simulated contact between members of groups in conflict. A virtual human representing an outgroup member (a Palestinian) interacted with 60 Jewish Israeli participants in an experimental study. We manipulated postural mimicry by the virtual interaction partner during a conversation about a sensitive conflict issue. Mimicry increased empathy toward the Palestinians, irrespective of participants' feelings toward the Palestinians prior to the experiment. Further, mimicked participants who reported a priori negative feelings toward Palestinians expressed more sympathy toward their Palestinian virtual interaction partner, rated themselves as close to him, and perceived the interaction as more harmonious compared to participants in a counter-mimicry condition. The results underscore the impact of mimicry on intergroup interactions, especially on individuals who harbor negative feelings toward the outgroup. The use of virtual–human interactions in obtaining this effect reveals the still widely unexplored potential of technology-enhanced conflict resolution.
Real-time fMRI control of a humanoid robot: the first-ever robotic re-embodiment using an fMRI interface.
Brain-computer interfaces (BCIs) allow for direct communication between a person's brain and a technological device, with applications in communication, control, rehabilitation, and more. Prof. Friedman was among the first to work on the integration of BCIs with virtual reality, and the first to demonstrate BCI control of a virtual avatar. The lab has also worked on embedding SSVEP-based BCIs in Unity, led by Jonathan Giron, and even on BCI control of DNA origami nanobots (in collaboration with Dr. Ido Bachelet).
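An SSVEP BCI tags each on-screen target with a distinct flicker frequency and decides which target the user is attending by finding the frequency that best explains the EEG. A minimal sketch of that detection step on synthetic single-channel EEG (the sampling rate, window length, and candidate frequencies are assumptions; production systems typically use canonical correlation analysis over multiple channels, which the simpler regression below stands in for):

```python
import numpy as np

fs = 250.0                      # sampling rate (Hz), an assumed value
t = np.arange(0, 2.0, 1 / fs)   # 2-second analysis window
targets = [8.0, 10.0, 12.0]     # candidate flicker frequencies (Hz)

# Synthetic EEG: a 10 Hz SSVEP response buried in noise.
rng = np.random.default_rng(2)
eeg = 0.5 * np.sin(2 * np.pi * 10.0 * t) + rng.normal(0.0, 1.0, t.size)

def ssvep_score(signal, freq, harmonics=2):
    """R^2 of regressing the signal onto sin/cos references at
    freq and its harmonics (a single-channel stand-in for CCA)."""
    refs = []
    for h in range(1, harmonics + 1):
        refs.append(np.sin(2 * np.pi * h * freq * t))
        refs.append(np.cos(2 * np.pi * h * freq * t))
    R = np.column_stack(refs)
    coef, *_ = np.linalg.lstsq(R, signal, rcond=None)
    resid = signal - R @ coef
    return 1 - resid.var() / signal.var()

scores = {f: ssvep_score(eeg, f) for f in targets}
detected = max(scores, key=scores.get)
print(f"detected flicker frequency: {detected} Hz")
```

In the Unity integration, a loop like this would run over a sliding EEG window and map the detected frequency back to the flickering object the user selected.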
Following our work on EEG-based BCIs, the lab moved on to real-time fMRI BCIs. In research led by Dr. Ori Cohen, in collaboration with Rafael Malach's lab at the Weizmann Institute, we demonstrated a whole-brain machine learning classification system that can be used to control avatars and humanoid robots, and we also demonstrated the application of real-time fMRI to basic science.
Recently, our focus has been on deep learning the brain – Ori Tal has demonstrated the superiority of deep learning for transfer learning across subjects, compared with other machine learning approaches.
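The intuition behind cross-subject transfer can be sketched without a deep network: pretrain a model on pooled data from several "source" subjects, then adapt it with only a handful of labelled samples from a new subject, instead of training from scratch. The toy features, per-subject shifts, and logistic-regression model below are illustrative assumptions, not the lab's actual architecture or data:

```python
import numpy as np

rng = np.random.default_rng(3)

def make_subject(shift, n=200):
    """Two-class toy 'brain data'; each subject's features are shifted."""
    X0 = rng.normal(-1.0, 1.0, size=(n // 2, 4)) + shift
    X1 = rng.normal(+1.0, 1.0, size=(n // 2, 4)) + shift
    X = np.vstack([X0, X1])
    y = np.array([0] * (n // 2) + [1] * (n // 2))
    return X, y

def train_logreg(X, y, w=None, steps=300, lr=0.1):
    """Plain gradient-descent logistic regression; w warm-starts."""
    Xb = np.hstack([X, np.ones((X.shape[0], 1))])   # add bias column
    if w is None:
        w = np.zeros(Xb.shape[1])
    for _ in range(steps):
        p = 1 / (1 + np.exp(-Xb @ w))
        w -= lr * Xb.T @ (p - y) / len(y)
    return w

def accuracy(w, X, y):
    Xb = np.hstack([X, np.ones((X.shape[0], 1))])
    return (((Xb @ w) > 0).astype(int) == y).mean()

# Pretrain on three pooled source subjects.
source = [make_subject(shift) for shift in (0.0, 0.3, -0.3)]
Xs = np.vstack([X for X, _ in source])
ys = np.concatenate([y for _, y in source])
w_pre = train_logreg(Xs, ys)

# New target subject: only 20 labelled samples for adaptation.
X_tr, y_tr = make_subject(0.5, n=20)
X_te, y_te = make_subject(0.5, n=200)

w_scratch = train_logreg(X_tr, y_tr)                   # target data only
w_transfer = train_logreg(X_tr, y_tr, w=w_pre.copy())  # warm-started

print(f"from scratch: {accuracy(w_scratch, X_te, y_te):.2f}")
print(f"transfer:     {accuracy(w_transfer, X_te, y_te):.2f}")
```

With a deep network the same pattern applies at larger scale: early layers learn subject-invariant representations from the pooled data, and only the later layers need adapting to each new subject.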