Seeing through sound is something that bats and dolphins have evolved to do over millions of years. At our institute, using the EyeMusic technology and dedicated training, blind or sighted people can learn to see through sound in a matter of weeks. The EyeMusic is a sensory substitution device (SSD) that converts visual information into audition while preserving shape, color, and location. X-axis information is conveyed through time, such that visual details on the left are heard before those on the right. Y-axis information is conveyed via pitch: higher parts of the image are rendered as higher tones than lower parts. Colors are differentiated through musical instruments. The EyeMusic, like other SSDs, has rehabilitative potential for blind individuals, especially in the effort to create a low-resolution, full-color image in their brain at a resolution of up to 1,500 pixels.
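The mapping described above (left-to-right becomes time, height becomes pitch, color becomes instrument) can be sketched in a few lines of Python. This is only an illustration of the general scheme; the durations, base pitch, pitch steps, and color-to-instrument table below are illustrative assumptions, not the EyeMusic's actual parameters.

```python
# Illustrative sketch of a visual-to-auditory soundscape mapping:
# columns are scanned left to right (x -> time), rows map to pitch
# (y -> frequency, top = high), and color selects an instrument.
# All numbers and names here are hypothetical, not EyeMusic's real values.

COLUMN_DURATION_S = 0.1          # assumed time per image column
BASE_FREQ_HZ = 220.0             # assumed pitch of the bottom row
SEMITONES_PER_ROW = 2            # assumed pitch step between rows
INSTRUMENT_FOR_COLOR = {"red": "trumpet", "blue": "piano", "white": "choir"}

def image_to_sound_events(image):
    """image: rows of color names (None = background), top row first.
    Returns (onset_seconds, frequency_hz, instrument) events."""
    events = []
    n_rows = len(image)
    for x in range(len(image[0])):           # left to right -> time
        for y, row in enumerate(image):      # top to bottom
            color = row[x]
            if color is None:
                continue
            rows_from_bottom = (n_rows - 1) - y
            freq = BASE_FREQ_HZ * 2 ** (rows_from_bottom * SEMITONES_PER_ROW / 12)
            events.append((x * COLUMN_DURATION_S, freq,
                           INSTRUMENT_FOR_COLOR.get(color, "other")))
    return events
```

Under this scheme, a diagonal line in the image becomes a rising or falling pitch sweep over time, which is what makes shapes recognizable by ear.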
In another example, crickets have evolved to hear via their legs! Can we do something similar?
In our Speech and Music to Touch project, we are developing technology and training that improve hearing via touch after only one hour of training. This can help the hearing-impaired, but also people with normal hearing trying to understand speech in a noisy environment or when the speaker’s lips are hidden behind a mask (did someone say COVID-19?). In collaboration with the World Hearing Center in Warsaw, Poland, we developed and implemented an array of audio-to-touch sensory substitution devices. The first device delivers vibrotactile stimulation, representing certain features of sound such as the speech signal, to the fingertips. This part of the body has the densest representation of Pacinian corpuscles - tactile receptors coding vibration. We have already shown in 30 individuals that the understanding of speech in a noisy environment improves when the auditory signal is complemented with a corresponding vibration. The speech-to-touch device will be further extended to provide stimulation to other parts of the body, and is compatible with a 3T MRI scanner.
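A pipeline of the kind described above can be sketched as follows: extract the slowly varying amplitude envelope of a speech signal, then use it to modulate a fixed-frequency vibration carrier for the fingertips. This is a minimal sketch under stated assumptions; the smoothing window and the 250 Hz carrier (a frequency in the Pacinian sensitivity range) are illustrative choices, not the device's actual parameters.

```python
import math

# Hypothetical audio-to-touch sketch: rectify and smooth the speech
# waveform to get its amplitude envelope, then amplitude-modulate a
# vibration carrier with it. Parameters are illustrative assumptions.

def amplitude_envelope(samples, window=50):
    """Rectify, then smooth with a causal moving average."""
    rect = [abs(s) for s in samples]
    env = []
    for i in range(len(rect)):
        lo = max(0, i - window + 1)
        env.append(sum(rect[lo:i + 1]) / (i + 1 - lo))
    return env

def envelope_to_vibration(env, sample_rate=8000, carrier_hz=250.0):
    """Modulate a carrier near the Pacinian sensitivity range."""
    return [e * math.sin(2 * math.pi * carrier_hz * i / sample_rate)
            for i, e in enumerate(env)]
```

The output samples would then drive a vibrotactile actuator, so that louder speech produces stronger fingertip vibration.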
Other audio-tactile devices include the TactileGlove, which can convert speech and music information (from fundamental frequency to sound envelope) into touch. We are also working on several new sound-to-touch devices, including a multisensory chair and a multisensory bed for music enhancement and neuro-wellness. Finally, we are in the process of creating a special multisensory (visual, touch, audio, movement, and smell) 360° "dome for neuro-wellness" (in collaboration with Simnoa Technologies and Joy Ventures).
We then use all this knowledge and our unique facilities, like the Multisensory Ambisonic Room, to develop new multisensory technologies, for example in the field of rehabilitation and neuro-wellness (e.g., by using tools from certain senses - such as hearing and touch - to produce technologies that lower anxiety).
In yet another example, snakes have evolved a far-reaching thermal infrared sense. In our new ERC project, How Experience Shapes the Human Brain: NovelExperieSENSE (2019-2023), we ask whether humans can develop such superhero-like abilities in a matter of weeks.
One direction is expanding the auditory range using auditory frequencies imperceptible to human ears, from very low (infrasound, <20 Hz) to very high (ultrasound, >20 kHz). These broad frequency ranges are perceptible to many animals, including household pets.
In this project, a person is provided with information from ever-changing maps of the environment, including parameters such as pollution, crime, and radiation levels. Users will be able to continuously perceive various properties of their surroundings, both as point information at their current location and as full spatial maps. This project will enable us to investigate how the brain represents differences stemming only from contextual information (i.e., the same environment with different degrees of pollution).
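The point-information mode described above amounts to sampling an environmental map at the user's position. A minimal sketch, assuming a simple grid representation and nearest-cell lookup (both illustrative choices, not the project's actual data model):

```python
# Hypothetical sketch: read an environmental map (e.g. pollution level)
# at the user's location, given as fractions 0..1 of the map extent.

def sample_map(grid, lat_frac, lon_frac):
    """Nearest-cell lookup in a 2-D grid of environmental values."""
    row = min(len(grid) - 1, int(lat_frac * len(grid)))
    col = min(len(grid[0]) - 1, int(lon_frac * len(grid[0])))
    return grid[row][col]
```

The returned value could then drive a continuous stimulus (e.g. vibration intensity), while the full grid supports the spatial-map mode.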
All of these technological tools, based on an understanding of human and animal brains, allow us to decipher the mysteries of the brain and map it in unprecedented ways. The main goal of the Baruch Ivcher Institute for Brain, Cognition & Technology is to use this unique approach to examine age-old, basic, but mysterious scientific questions about the brain and human civilization, such as: What has a greater impact on us - nature or nurture? Can the human brain develop senses able to perceive inputs far beyond its natural range, such as thermal infrared information (as snakes developed over the course of evolution)? Is our brain more elastic or more stable across the lifespan, and can we reverse time and make an older brain more plastic and young? And which of our senses corresponds to which daily tasks, and how are these tasks represented in maps and areas of the brain?
In collaboration with IDC Media Innovation Lab (miLab) directed by Dr. Oren Zukerman.
This field of research incorporates projects that aim to rehabilitate or improve the residual sensory capabilities of patients with different types of disabilities:
The aim is to provide multisensory feedback on body movement in order to improve proprioception and time perception: muscle activity will be recorded by electromyography (EMG) and translated into sound, light, or vibrations that scale with muscle contraction intensity. The EMG and sensory output will be built into a unique portable device that people can use to train movement and gait after stroke, as well as during exercise or relaxation.
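The core of such feedback is a mapping from EMG amplitude to stimulus intensity. A minimal sketch, assuming per-user calibration values and a simple linear mapping (both illustrative assumptions, not the device's actual design):

```python
# Hypothetical sketch: map a rectified EMG reading onto a 0..1 feedback
# intensity that scales with contraction strength. rest_level and
# max_level would come from a per-user calibration step.

def emg_to_feedback(emg_sample, rest_level, max_level):
    """Linearly scale EMG amplitude into a clamped 0..1 intensity."""
    span = max_level - rest_level
    if span <= 0:
        raise ValueError("max_level must exceed rest_level")
    intensity = (abs(emg_sample) - rest_level) / span
    return min(1.0, max(0.0, intensity))  # clamp to the output range
```

The resulting intensity could drive loudspeaker volume, LED brightness, or vibration strength interchangeably, so stronger contractions produce proportionally stronger feedback in whichever modality is used.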
In the multisensory room we are building real-life sound scenarios, such as a shop, a restaurant, city traffic, a concert, a forest, the sea, etc. This space can be used, for instance, for the rehabilitation of speech perception in noise and for sound localization in the hearing-impaired.
We are using virtual reality to study brain flexibility and to develop novel applications based on insights from our findings in different populations. For instance, we are using immersive virtual reality techniques to measure behavioral factors and to train a healthy human population to develop cross-modal links between the auditory and somatosensory systems. There are quite a few key VR experiences based on research on sensory substitution and motor performance, such as: