RIT becomes a test site for Yamaha immersive audio technology

Rochester Institute of Technology researchers are improving the way people perceive sound. Some of those improvements will be incorporated into high-tech acoustic systems developed by international entertainment company Yamaha Corp.

Sungyoung Kim's Applied and Innovative Research for Immersive Sound (AIRIS) laboratory at RIT has become a test site for exploring the future capabilities of one of Yamaha's newest technologies, the Active Field Control (AFC) system. Kim and his students will help develop improvements for the next phase of the high-tech audio system.

“I was approached by Yamaha to discuss the future of this technology. The first step was to adapt all of the Yamaha technology in our lab. That was the first phase,” said Kim, associate professor of audio engineering technology in RIT’s College of Engineering Technology, who has a background in developing immersive audio systems that enhance sound in modern spaces.

AFC technology enhances an environment by controlling reverberation and the positions of sounds and objects, deepening the acoustics of acoustically passive spaces, and creating various ambient auditory parameters that augment a site’s architecture. Often referred to as 3D audio, virtual and immersive sound is an emerging area of research and production in which companies such as Yamaha are continually striving to deliver richer, higher-quality sound across various platforms, especially for 3D classical music concerts.

“One of the existing problems in 3D musical performances is how to synchronize computer-generated parts and acoustic atmospheres with human performances. And how computers recognize music and track a human performance is a hot topic of research,” Kim said. “Composers use technology as a medium to accomplish their musical creativity. We just changed the concept.”

Part of that concept involved understanding how music and the listening environment integrate. The system allows certain acoustic parameters to be recreated to improve a live performance. It is more than an upgrade to the speakers; the system adjusts the acoustics of a modern space, for example, to make the audience feel as if they are hearing sound in settings ranging from ancient cathedrals to windblown caves, without having to be in the actual setting.

As a first experiment, Kim worked with Sihyun Uhm, a composer from the Eastman School of Music, asking that a new piece of music be rendered through the system. Computers can supply pre-recorded effects, but this system differs by placing the audience inside the soundscape. The composer intended the audience to feel, or internally visualize, two environments, a mountain and a desert, with the composition moving back and forth between the two, Kim explained.

“A lot of people try this today. With orchestras, for example, there are always changes between orchestral movements, comparable to scene changes in a play,” he said. “This new composition was only 10 minutes long. We changed the acoustic set, or the musical set, several times within the same movement, which is unique and stimulating. It’s another approach to music.”

Uhm’s composition, String Quartet No. 2, was performed at the Yamaha Ginza studio in Tokyo, Japan, last summer. Kim was joined by Hideo Miyazaki, a spatial acoustic design engineer at Yamaha, along with local musicians, company colleagues, and several alumni of RIT’s engineering technology programs living in Japan.

Four RIT students participated as assistant engineers/operators at the concert, working with the AFC system while Kim prepared for the concert.

“While students can learn many aspects of acoustics from books, the system provided a unique learning opportunity on how to virtually manipulate acoustics in real time,” said Kim, whose work is funded by two corporate grants from Yamaha. The first, “Towards an Individualized Presentation of Immersive Experience,” is a three-year grant to assess auditory selective attention, the cognitive process humans use to discriminate among sounds and environments. The second, “Investigation of Perceptual Signals Necessary for Yamaha AFC System Remote Tuning,” addresses the process of virtual connections and how engineers can tune systems virtually without entering the actual space.

“Tuning a system remotely is more technical, and it involves the concept of working in the metaverse. With people doing more virtual work remotely, there is a need for audio systems that are compatible and deliver the highest possible audio quality,” Kim said.

This audio research for Yamaha may also benefit RIT, with its multiple conference and auditorium spaces on campus. RIT can serve as an experimental setting where physically separated spaces are virtually synchronized.

“We can have one musician in one building and another musician in another building, and I want to see if they can play together,” he said. “This is the future of music in the metaverse. You can have one artist performing in Korea while another performs in the U.S., even here at RIT. In the metaverse, there should be no walls or barriers between these musicians. I think the acoustic landscape is what makes them feel like they’re in the same space.”
