Rochester Institute of Technology researchers are improving the way people experience sound. Some of these improvements will add to the high-tech sound that is being developed by international entertainment company Yamaha.
Sungyoung Kim’s Applied and Innovative Research in Immersive Sound (AIRIS) lab at RIT became a testing ground for exploring the future capabilities of one of Yamaha’s latest technologies, the Active Field Control (AFC) system. Kim and his students will help develop the next phase of improvements to the high-tech audio system.
“Yamaha approached me to discuss the future of this technology. The first step is to incorporate all of Yamaha’s technology into our lab. This is the first phase,” said Kim, associate professor of audio engineering technology in RIT’s School of Engineering and Technology, who has a background in developing immersive audio systems that enhance the sound of modern spaces.
AFC technology enhances the environment by controlling reverberation and sound/object placement, deepening the acoustics of passive spaces, and creating various auditory ambient settings to enhance the architecture of the site. Often referred to as 3D audio, virtual and immersive sound is an emerging area of research and production, with companies such as Yamaha continually seeking to deliver rich, high-quality sound on a variety of platforms, especially 3D classical concerts.
“One of the problems in 3D music performance is how to synchronize computer-generated parts and acoustic ambience with human performance. This is a hot research topic on how computers can recognize music and follow human performance,” Kim said. “Composers are using technology as a support to accomplish their musical creativity. We just changed the concept.”
Some of those concepts require an understanding of how music and the listening environment integrate. The system can recreate specific acoustic settings to enhance live performances. It is more than an improvement to the speakers: the system can adjust the acoustics of a modern space so that audiences feel as if they are hearing sound in anything from an ancient cathedral to a wind-blown cave, without having to be in the actual environment.
Photo courtesy of Sungyoung Kim
As part of the first experiment, Kim worked with composer Sihyun Uhm of the Eastman School of Music, asking her to render a new piece of music through the system. Computers can pre-record effects, but this system differs by placing the audience inside an auditory environment. Kim explained that the composer intended for the audience to perceive or imagine two environments, the mountains and the desert, with the composition moving back and forth between them.
“A lot of people are trying this today. For example, in an orchestra, the movement of the orchestra is always changing, like scene to scene in a play,” he said. “This new piece is only 10 minutes long. We change the acoustic or musical scene multiple times in one movement – it’s unique and challenging. It’s a different approach to music.”
Uhm’s composition, String Quartet No. 2, was performed live last summer at Yamaha’s Ginza studio in Tokyo, Japan, where Kim was joined by Yamaha spatial acoustics design engineer Hideo Miyazaki, local musicians, company colleagues, and several alumni of RIT’s engineering technology program.
Four RIT students attended the concert as assistant engineers/operators, working with the AFC system as Kim prepared for the concert.
“While students can learn many aspects of acoustics from books, this system provides a unique learning opportunity for how to virtually manipulate acoustics in real time,” said Kim, whose work was funded through two Yamaha corporate grants. The first project, “Personalized Representation of Immersive Experiences,” is a three-year grant to assess auditory selective attention, the cognitive process humans use to differentiate sounds and environments. The second, “Investigating the Perceptual Cues Needed to Remotely Tune a Yamaha AFC System,” concerns the process of virtual connection and how engineers can tune the system without entering the actual space.
“How to remotely adjust a system is more technical, and it involves the concept of working in the metaverse. As people do more remote virtual work, the audio system needs to be compatible and have the highest possible audio quality,” said Kim.
This Yamaha-funded audio research could also have an impact at RIT, which has multiple meeting and auditorium spaces on campus. RIT could serve as an experimental site where physically separate spaces are virtually synchronized.
“We could have one musician in one building and another musician in another, and I want to see if they could play together,” he said. “This is the future of music in the metaverse. You could have one artist perform in Korea and another in the U.S., or even at RIT. There shouldn’t be any walls or barriers between these musicians in the metaverse. I think the acoustic landscape is what makes them feel like they’re in the same space.”