MIT.nano Immersion Lab Gaming Program Awards Third Annual Seed Grants

MIT.nano announced its next round of seed grants to support hardware and software research related to sensors, 3D/4D interaction and analysis, augmented and virtual reality (AR/VR), and games. Grants are awarded through the MIT.nano Immersion Lab Gaming Program, a four-year collaboration between MIT.nano and NCSOFT, a digital entertainment company and founding member of the MIT.nano consortium.

“We are pleased to be able to continue to support research at the intersection of the physical and the digital through this collaboration with NCSOFT,” said Brian W. Anthony, associate director of MIT.nano, who is also a principal investigator in mechanical engineering and the Institute for Medical Engineering and Science. “These projects are just a few examples of how MIT researchers are exploring how new technologies could change the way humans interact with the world and each other.”

The MIT.nano Immersion Lab is a two-story immersive space dedicated to viewing, understanding, and interacting with big data and synthetic environments. Outfitted with hardware and software tools for motion capture, photogrammetry, and 4D experiments, and supported by expert technical staff, this open-access facility is available to any MIT student, professor, or researcher, as well as to external users.

This year, three projects were selected to receive seed grants:

Ian Condry: Innovations in spatial audio and immersive sound

Professor of Japanese Culture and Media Studies Ian Condry explores spatial sound research and technology for video games. Specifically, Condry and co-researcher Philip Tan, researcher and creative director at the MIT Game Lab, hope to develop software that adds “crowd roar” to online gaming and esports so that players and spectators can hear, and participate in, the sound.

Condry and Tan will use object-based mixing technology from the MIT Spatial Sound Lab, combined with the tracking and playback capabilities of the Immersion Lab, to collect data and compare various approaches to immersive audio. Both see the project leading to fully immersive “real-life” gaming experiences with 360-degree video, or to blended games in which online and in-person players can be present at the same event and interact with players and performers.

Robert Hupp: Immersive athlete training technology and data-driven coaching support in fencing

Seeking to improve athletes’ training and practice experience so as to maximize learning while minimizing the risk of injury, MIT assistant fencing coach Robert Hupp aims to advance the pedagogy of fencing using extended reality (XR) technology and biomechanical data.

Hupp, who has worked with staff at the MIT.nano Immersion Lab, says preliminary data suggest that technology-assisted practice can make a fencer’s movements more compact and that practicing in an immersive environment can improve reaction techniques. He spoke about data-driven coaching support and athlete training at an MIT.nano IMMERSED seminar in September 2021.

With this seed grant, Hupp plans to develop an immersive training system for self-paced athlete learning, create a biofeedback system to support coaches, and conduct scientific studies to track an athlete’s progress and advance the current understanding of opponent interaction. He envisions the work having an impact on athletics, biomechanics, and physiotherapy, and the use of XR technology for training expanding to other sports.

Jeehwan Kim: Next-generation human-computer interface for advanced AR/VR games

The most widely used user interaction methods for AR/VR games are gaze and motion tracking. However, according to Associate Professor of Mechanical Engineering Jeehwan Kim, current state-of-the-art devices fail to deliver truly immersive AR/VR experiences due to limitations in size, power consumption, noticeability, and reliability.

Kim, who is also an associate professor of materials science and engineering, proposes a microLED/pupillary dilation (PD)-based gaze tracker and an electronic-skin-based, controller-free motion tracker for next-generation AR/VR human-computer interfaces. Kim’s gaze tracker is more compact and consumes less power than conventional trackers; it can be integrated into transparent screens and could be used to develop compact AR glasses. The e-skin motion tracker can adhere imperceptibly to human skin and accurately detect human movements, which Kim says will facilitate more natural human interaction with AR/VR.

This is the third year of seed grants awarded through the MIT.nano Immersion Lab Gaming Program. In the program’s first two calls for proposals, in 2019 and 2020, 12 projects from five departments received a combined $1.5 million in research funding. The collaborative proposal selection process by MIT.nano and NCSOFT ensures that awarded projects develop advances with industrial impact and that MIT researchers are exposed to NCSOFT’s technical partners for the duration of the seed grants.
