Novel Multimodal Interfaces for XR
New interactive and XR technologies are incorporating immersive user interfaces that combine gesture and voice recognition with existing controller inputs. However, today's state-of-the-art interfaces remain rudimentary and are not widely accessible to general consumers. At the MIT Media Lab and the MIT Center for Advanced Virtuality, we have been developing scalable, multimodal interfaces that can be easily deployed on AR and VR systems, heads-up displays for autonomous vehicles, and everyday large displays.

Vik Parthiban is a Research Affiliate at MIT and a graduate of the MIT Media Lab. He researches new technologies that can augment our daily interactions, and he previously worked as a project lead and hardware engineer in Magic Leap's Advanced Photonics division.
Programming descriptions are generated by participants and do not necessarily reflect the opinions of SXSW.