MIT.nano Immersion Lab gaming program awards seed grants to three XR research and development projects

In Augmented Reality and Virtual Reality News

April 13, 2022 – MIT.nano, MIT’s Advanced Facility for Nanoscience and Nanotechnology, has announced its next round of seed grants to support hardware and software research related to sensors, 3D/4D interaction and analysis, augmented and virtual reality (AR/VR), and games.

Grants are awarded through the MIT.nano Immersion Lab Gaming Program, a four-year collaboration between MIT.nano and NCSOFT, a digital entertainment company and founding member of the MIT.nano consortium.

The MIT.nano Immersion Lab is an immersive, two-story space dedicated to visualizing, understanding, and interacting with big data and synthetic environments. Equipped with hardware and software tools for motion capture, photogrammetry, and 4D experiments, and supported by expert technical staff, this open-access facility is available for use by any MIT student, professor, or researcher, as well as by external users.

“We are pleased to be able to continue supporting research at the intersection of the physical and the digital through this collaboration with NCSOFT,” said Brian W. Anthony, Associate Director of MIT.nano. “These projects are just a few examples of how MIT researchers are exploring how new technologies could change the way humans interact with the world and each other.”

The three projects selected to receive seed grants this year are:

Jeehwan Kim: next-generation human/computer interface for advanced AR/VR games

The most widely used user interaction methods for AR/VR games are gaze and motion tracking. However, according to Associate Professor of Mechanical Engineering Jeehwan Kim, current state-of-the-art devices fail to deliver truly immersive AR/VR experiences due to limitations in size, power consumption, noticeability and reliability.

Kim proposes a microLED/pupillary dilation-based gaze tracker and an electronic skin (e-skin)-based, controller-free motion tracker as a next-generation AR/VR human-computer interface. According to MIT.nano, Kim’s gaze tracker is more compact and consumes less power than conventional trackers. It can also be integrated into transparent screens, which could be used to develop compact AR glasses. The e-skin motion tracker adheres to human skin and accurately detects human movement, which Kim says will enable more natural human interaction with AR/VR.

Ian Condry: Innovations in Spatial Audio and Immersive Sound

Ian Condry, Professor of Japanese Culture and Media Studies, is exploring spatial sound research and technology for video games. Specifically, Condry and co-researcher Philip Tan, research scientist and creative director at the MIT Game Lab, hope to develop software that adds “the roar of the crowd” to online gaming and esports so that gamers and spectators alike can hear and participate in the sound.

According to MIT.nano, Condry and Tan will use object-based mixing technology from the MIT Spatial Sound Lab, combined with the tracking and playback capabilities of the Immersion Lab, to collect data and compare various approaches to immersive audio. Both see the project leading to fully immersive “real-life” gaming experiences with 360-degree video, or to blended games where online and in-person players can be present at the same event and interact with performers.
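
To give a rough sense of what object-based mixing means in practice, the sketch below shows a toy renderer in which each sound source (for example, a single cheering spectator) is stored as an object with its own position, and per-listener gains are computed at playback time. This is an illustrative example only, with a hypothetical function name and a simple constant-power pan law; it is not the MIT Spatial Sound Lab’s actual tooling.

```python
# Illustrative sketch only: a toy "object-based" crowd-roar mix, not the
# MIT Spatial Sound Lab's software. Each sound object carries its own
# position, and the renderer derives per-listener gains at playback time.
import math


def render_object_stereo(azimuth_deg: float, distance_m: float) -> tuple[float, float]:
    """Return (left, right) gains for one sound object relative to a listener.

    azimuth_deg: direction of the object, 0 = straight ahead, +90 = hard right.
    distance_m:  distance from the listener, used for simple 1/r attenuation.
    """
    # Constant-power pan law across the frontal arc.
    pan = max(-1.0, min(1.0, azimuth_deg / 90.0))   # -1 (left) .. +1 (right)
    angle = (pan + 1.0) * math.pi / 4.0             # 0 .. pi/2
    attenuation = 1.0 / max(distance_m, 1.0)        # clamp to avoid blow-up near 0 m
    return math.cos(angle) * attenuation, math.sin(angle) * attenuation


# Example: a cheering spectator object 30 degrees to the right, 4 m away.
left, right = render_object_stereo(30.0, 4.0)
print(f"L={left:.3f}  R={right:.3f}")
```

Because gains are derived per listener rather than baked into a fixed channel mix, each player or spectator can hear the same crowd objects from their own position in the scene.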

Robert Hupp: Immersive athlete training technology and data-driven training support in fencing

MIT assistant fencing coach Robert Hupp aims to advance fencing education, using extended reality (XR) technology and biomechanical data to improve training and practice in the sport and to help athletes maximize learning while minimizing the risk of injury.

With the seed grant, Hupp plans to develop an immersive training system for athlete self-directed learning, create a biofeedback system to support coaches, conduct scientific studies to track an athlete’s progress, and advance the current understanding of opponent interaction. The hope is that the work will impact athletics, biomechanics, and physiotherapy, and that the use of XR technology for training could expand to other sports.
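
As a loose illustration of what data-driven training support can look like, the sketch below computes one hypothetical biomechanical metric (the peak forward speed of a weapon-hand marker during a lunge) from motion-capture samples. The function name, the synthetic marker data, and the choice of metric are assumptions for illustration only, not part of Hupp’s system.

```python
# Illustrative sketch only: one kind of metric a motion-capture biofeedback
# tool might surface for a coach. Marker data below is synthetic.
import numpy as np


def peak_forward_speed(positions_m: np.ndarray, timestamps_s: np.ndarray) -> float:
    """positions_m: (N, 3) marker positions in metres; forward axis = x."""
    velocities = np.gradient(positions_m[:, 0], timestamps_s)  # m/s along forward axis
    return float(np.max(velocities))


# Hypothetical capture: a wrist marker advancing ~1 m over half a second.
t = np.linspace(0.0, 0.5, 60)                       # 60 samples over 0.5 s
x = np.sin(np.pi * t)                               # accelerates, then decelerates
fake_lunge = np.column_stack([x, np.zeros_like(t), np.full_like(t, 1.3)])
print(f"Peak forward hand speed: {peak_forward_speed(fake_lunge, t):.2f} m/s")
```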

This is the third year of seed grants awarded under the MIT.nano Immersion Lab Gaming Program. In the program’s first two calls for proposals in 2019 and 2020, 12 projects from five departments received a combined US$1.5 million in research funding. MIT.nano said the collaborative proposal selection process by MIT.nano and NCSOFT ensures that awarded projects develop advances with industrial impact and that MIT researchers are exposed to NCSOFT’s technical partners over the duration of the seed grants.

For more information on MIT.nano and its Immersion Lab program, click here.

Image/video credit: MIT.nano / YouTube


About the Author


Sam Sprigg

Sam is the founder and editor of Auganix. With a background in research and report writing, he covers news on the AR and VR industries. He is also interested in human augmentation technology as a whole, and does not limit his learning specifically to the visual experience side of things.