Members of the Advanced Reality Lab team are among the leading researchers and experts in their fields. Our research projects are ground-breaking and at the forefront of academic research in virtual and augmented reality.
Virtual humans are animated, intelligent virtual agents that interact with you, typically in immersive VR. Our lab has developed a platform for virtual humans, which is being used for multidisciplinary research. Developing believable virtual humans encompasses a huge range of challenges, from photorealistic graphics and animation to high-level AI. Naturally, we focus on only a subset of these problems, and we are happy to collaborate with other labs or industry partners with complementary skills.
Some social scenarios implemented with our virtual human platform:
Deep learning based multimodal communication
Recent breakthroughs achieved by applying "deep learning" to large datasets are highly relevant to the development of virtual humans. In 2018 we started addressing some of the underlying challenges, in collaboration with Prof. Yaakov Hel-Or, Prof. Arik Shamir, Dr. Kfir Bar, and Dr. Shai Fine. The general framework is based on large multimodal datasets scraped from the web and analyzed with seq2seq methods, aimed at the creative generation of verbal and nonverbal behavior.
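To illustrate the seq2seq idea in this framework, here is a minimal sketch of an encoder-decoder forward pass: an encoder RNN compresses an input token sequence (e.g. a tokenized utterance) into a context vector, and a decoder RNN emits output tokens (e.g. nonverbal behavior codes) one at a time. All names, sizes, and weights below are illustrative toys, not the lab's actual models or data.

```python
import numpy as np

# Toy sequence-to-sequence forward pass (illustrative only: untrained
# random weights; the lab's real pipeline is not described in detail).
rng = np.random.default_rng(0)
VOCAB_IN, VOCAB_OUT, HIDDEN = 10, 8, 16

W_emb_in = rng.normal(0, 0.1, (VOCAB_IN, HIDDEN))    # input embeddings
W_enc = rng.normal(0, 0.1, (2 * HIDDEN, HIDDEN))     # encoder RNN weights
W_emb_out = rng.normal(0, 0.1, (VOCAB_OUT, HIDDEN))  # output embeddings
W_dec = rng.normal(0, 0.1, (2 * HIDDEN, HIDDEN))     # decoder RNN weights
W_proj = rng.normal(0, 0.1, (HIDDEN, VOCAB_OUT))     # output projection

def encode(tokens):
    """Run a simple tanh RNN over the input; return the final hidden state."""
    h = np.zeros(HIDDEN)
    for t in tokens:
        h = np.tanh(np.concatenate([W_emb_in[t], h]) @ W_enc)
    return h

def decode(context, max_len=5, start_token=0):
    """Greedy decoding: feed the argmax token back in at each step."""
    h, tok, out = context, start_token, []
    for _ in range(max_len):
        h = np.tanh(np.concatenate([W_emb_out[tok], h]) @ W_dec)
        tok = int(np.argmax(h @ W_proj))
        out.append(tok)
    return out

context = encode([1, 4, 2, 7])  # hypothetical tokenized verbal input
behavior = decode(context)      # hypothetical nonverbal behavior codes
print(len(behavior))
```

In practice such models are trained end to end on paired multimodal sequences; the sketch only shows the architectural shape of encoding one modality and generating another.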
On 22.9.19 we will hold a small, informal international workshop on this topic.
See also a related video demo of Half-Life 2 agents learning combat behavior using reinforcement learning (RL).
Psychological and physiological responses to virtual humans
Over the years we have carried out studies evaluating the psychological, physiological (mostly autonomic nervous system measures: skin conductance, heart rate and its derivatives, respiration, and head movements), and behavioral (mostly spoken language, both prosody and semantics) responses of VR participants to virtual humans.
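As an illustration of the kind of heart-rate derivatives used in such studies, the sketch below computes mean heart rate and RMSSD (a standard time-domain heart-rate-variability measure) from inter-beat (RR) intervals. The RR values are hypothetical, and the exact measures and tooling used in our studies may differ.

```python
import numpy as np

# Illustrative heart-rate derivatives from inter-beat (RR) intervals
# in milliseconds. Example data only; not from an actual study.

def mean_heart_rate(rr_ms):
    """Mean heart rate in beats per minute from RR intervals (ms)."""
    rr = np.asarray(rr_ms, dtype=float)
    return 60000.0 / rr.mean()

def rmssd(rr_ms):
    """Root mean square of successive RR differences (ms), a common
    time-domain heart-rate-variability measure."""
    rr = np.asarray(rr_ms, dtype=float)
    return float(np.sqrt(np.mean(np.diff(rr) ** 2)))

rr = [800, 810, 790, 805, 795]        # hypothetical RR series (~75 bpm)
print(round(mean_heart_rate(rr), 1))  # 75.0
print(round(rmssd(rr), 1))            # 14.4
```

Lower RMSSD is typically associated with higher sympathetic arousal, which is why such derivatives are informative alongside skin conductance when gauging responses to virtual humans.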
Our virtual humans have been applied to studying flirting and romantic behavior, in collaboration with Prof. Gurit Birnbaum, and conflict resolution in the context of the Israeli-Palestinian conflict, in collaboration with Dr. Beatrice Hasler.
Between 2010 and 2015 the lab took part in the EU FP7 project Beaming, aimed at social telepresence. Our role was to develop the AI proxy, which automatically controls your remote representation, either virtual or robotic. See the papers and videos below demonstrating: automatic research assistants, being in three places at the same time, automatic nonverbal translation, automatically replacing you in class, a dual-gender avatar, and more.
D. Friedman, O. Salomon, B. Hasler, Virtual substitute teacher: Introducing the concept of the classroom proxy, Proceedings of the 3rd European Immersive Education Summit (iED), pp. 186-197, London, UK, November 2013.