Prof. Ruth Feldman and Prof. Amir Amedi, two leading neuroscientists in their fields, are running cutting-edge brain studies with astounding results. While researching different areas of the brain, and different brain activities and functionalities, both are looking to solve serious issues with what could eventually be life-changing solutions.
Prof. Feldman is the Simms-Mann Professor of Developmental Social Neuroscience at Reichman University, with a joint appointment at the Yale Child Study Center. As a world leader in developmental psychology and psychopathology, her approach integrates perspectives from neuroscience, human development, philosophy, clinical practice, and the arts within an interpersonal frame and a behavior-based approach.
The Center for Developmental Social Neuroscience at the Baruch Ivcher School of Psychology is unique in studying the brain in its natural social environment, that is, during interactions between people: mainly between parents and their children, but also between couples and close friends.
Several groundbreaking "hyperscanning" studies are being conducted at the Center. Such studies, which examine two brains simultaneously during different interactions between them, are considered rare and complicated to conduct, with only a few dozen having been held around the world. Prof. Feldman's lab specializes in them.
One of these studies was recently published in NeuroImage and examined what happens in the brain during non-direct interactions between mothers and children aged 10-14. Although conducted before COVID-19, the study and its results proved applicable to the early days of the pandemic and its lockdowns, when most of our social interactions were remote. The study asks, “Can we create and maintain social bonds from afar?”
During the study, participants were asked to communicate with one another either face-to-face or through Zoom while in two separate rooms, all while connected to an EEG (electroencephalogram) machine. The study examines what happens to the brain of each participant during the interactions, while they complete different pleasant tasks. During the study, the participants also give saliva samples to measure hormone levels.
The study found that while technology allows the creation of brain synchronization between participants not sitting in the same room, it does not replace direct and personal interaction, and is actually very limited. Of 36 possible combinations of brain interactions, only one occurred during the Zoom chat between mothers and their children, while nine occurred during face-to-face communications.
According to Prof. Feldman, "Imagine you’re driving down a motorway in optimal conditions; lots of lanes, the traffic moves smoothly. Now compare it to driving on a gravel road with many bumps along the way. You will still get to the same place, but it will be much more difficult getting there. This is the difference between face-to-face interactions and non-direct interactions."
Some of the other studies Prof. Feldman and her team are working on include examining the effects of a mother's body odor on brain synchronization with her baby; a unique study that examines the effects of the father's body odor on brain synchronization with his baby; and two more in cooperation with the Baruch Ivcher School of Psychology Clinic.
According to Post-Doctoral Fellow Dr. Linoy Schwartz, "Hyperscanning studies can give a unique and live answer to the question, 'How do two brains communicate with and impact each other?' They also provide the opportunity to examine the neurological component of synchronization between people, alongside the behavioral and hormonal ones."
Many of the studies at the Center are longitudinal, examining the effects of interactions between mothers and their children over a period of about 20 years. A few of these studies use fMRI (functional MRI) machines to measure brain activity while participants perform specific tasks.
One of the most notable follows children and their mothers from Sderot and the Gaza border. The purpose is to learn how stress that arises from the security situation influences the development of children who grew up in these areas. The control group includes children who grew up in towns that are socioeconomically similar to Sderot but have not experienced the security situation.
The study’s initial analysis shows that many participants suffered from PTSD (Post Traumatic Stress Disorder) symptoms, regression, and other disorders. The interactions of mothers with their children showed patterns that were not age- or situation-appropriate. Researchers continued to meet participants until they were 15 years old and found differences in their hormone levels: cortisol, the hormone associated with stress, and oxytocin, the hormone associated with love and trust. The situation was unique in that all participants suffered from the same stressor, the security situation; yet researchers found that while some children living on the same street suffered from anxiety, others did not.
Moreover, the correlation between mothers and their children was profound. Early on, the researchers found that the interaction between mother and child meaningfully affects the physiological metrics of both, with the mother acting as a kind of buffer protecting the child. Mothers who were more resilient had more resilient children, and mothers who suffered more from anxiety had more anxious children.
The next stage of the study examined the participants' gut microbiome, and in the future the children will be scanned in an fMRI machine when they are 18 years old, in search of a neurological pattern that differentiates them from other populations.
According to Research Associate Dr. Adi Ulmer-Yaniv, who leads imaging studies at the Center, "These studies allow us to receive new insights on the long-term effects of the relationship between mothers and children on the children's development, not just behaviorally, but neurologically as well. With fMRI imaging, we can examine how social interactions shape brain regions associated with social cognition across development."
Prof. Amir Amedi, founding director of The Baruch Ivcher Institute for Brain, Cognition & Technology & The Ruth and Meir Rosental Brain Imaging Center (MRI), is a world leader in multisensory research with a multidisciplinary background in computational neuroscience (PhD 2005), brain imaging, neurology (visiting research fellow at NIH and instructor of neurology at Harvard Medical School), and music. In his lab, Prof. Amedi uses a wide range of technologies and research methods that help the team understand the interplay between nature and nurture in shaping the human brain.
The uniqueness of Prof. Amedi’s lab is two-fold, according to Dr. Amber Maimon, Research Associate and Academic Lab Manager. First is their unique methodology of “developing technologies that help us study the brain, and studying the brain so the findings help us develop new technologies”. Second is their infrastructure in the form of a multi-sensory ambisonic room, a one-of-a-kind research space that allows staff to program human senses in 360°, and then export the experiences outside of the laboratory setting to the MRI machine to examine what happens in the brain during the multi-sensory experience. The room allows the team to control sound, vision and touch in various ways and create sensory and multi-sensory experiences that cannot be created elsewhere. Their research indicates that within the brain, there are hidden connections between the senses in addition to the unhidden ones. Synesthesia, a phenomenon in which stimulation of one sense leads to another sensory experience, is an example of an unhidden connection between the senses. According to staff, there are also hidden connections between the senses, that we are not aware of in our daily experience, that shape how we perceive the world, and their research and studies are based on this understanding.
Looking to assist people with health issues and disabilities, and to enhance the capabilities of each and every one of us, the Institute is working on various solutions that will eventually become accessible to everyone who needs them. One of these solutions is “EyeMusic”, a sensory substitution device (SSD) that converts visual information into audio information while preserving shape, color, and location. It was inspired by the “echolocation” abilities of dolphins and bats, which use reflected sound to determine the location of objects. Similarly, Prof. Amedi and his team have developed a sensory substitution device that scans the visual field and converts it with a dedicated algorithm, pixel by pixel, into what is known as a soundscape. The soundscape allows blind people, after intensive training, to "see" the identity, 3D shape, location and even color of objects through sound.
When they scanned these participants in the MRI, they saw that the brain areas associated with vision in normally sighted people were activated. Since these areas in blind people are not supposed to work, having never been used, this is an astounding discovery and proof that with appropriate training and technology, even blind people can learn to "see" through a different sense, in this case sound. Even more astounding, sub-areas of the visual cortex known to respond to seen faces and body postures were also activated by the EyeMusic soundscapes of faces and body postures.
Another sensory substitution device created by Prof. Amedi and his team, in a project led by Dr. Kasia Ciesla and Dr. Adi Snir, conveys audition through tactile vibrations in the fingertips. One implementation of this device helps deaf people sense the vibrations of a sound, as well as its location.
A major project that the lab is leading, overseen by the lab's Chief Design and Technology Officer Iddo Wald together with Prof. Ben Corn, a world-leading Professor of Oncology at the Hebrew University of Jerusalem, Deputy Director of the new Shaare Zedek Cancer Center, and Head of its Department of Radiation Medicine, aims to employ what they know about the brain to influence the body, and vice versa. This project, known as "The Hope Initiative," seeks to improve medical treatment and quality of life by enriching the treatment environment with scientifically grounded, multi-sensory, neuro-wellness interventions. In the new Cancer Center, the team built a CT simulator room: a multi-sensory room with a curved LED display, a unique ambisonic audio system, and infrastructure supporting touch interfaces and interactive experiences that incorporate their custom-made tools and technologies. These include 3D glasses, designed by Iddo Wald, that convey 3D within the MRI setting. One of the many purposes of these unique multi-sensory immersive setups is to reduce the anxiety and claustrophobic feelings that many people experience during scanning or treatment. The next step of this unique cooperation is building matching environments for additional diagnostic and treatment settings, and even relaxation rooms for the hospital staff.
The late Prof. Paul Bach-y-Rita, an American neuroscientist who specialized in neuroplasticity (the field that studies the ability of neural networks in the brain to change through growth and reorganization), once said that “We see with our brains, not with our eyes”.
“And we say,” explains Dr. Maimon, “that we experience the world through the brain and not through the sensory organs. As these studies prove, the brain areas in charge of a specific sense, such as hearing or vision, can also be activated by sound or touch. These discoveries have led us to propose a paradigm shift: that the brain is divided not into senses but into tasks, and that by using technologies and perceptual learning, the plasticity of the brain can be tapped deep into adulthood, contrary to what was previously thought and accepted.”