After 18 long months in the dark, stage lights are shining once again in live performance venues across the United States. The performing arts industry has emerged from the COVID-19 pandemic, albeit slowly due to the Delta variant as well as a lack of consensus on prevention mandates. Still, the impacts of the pandemic have left scars on the industry that may not soon fade.
With a large chunk of the population having avoided public gatherings for more than a year, accommodations and alternative ways of distributing performances remain important. Not everyone wants to get out yet, and fears of another COVID-19 wave are keeping remote broadcasting on the minds of some AV consultants.
“Starting over a year ago, we’ve noticed that the consultants we’ve talked to are designing more and more for remote audiences via broadcast and streaming,” explained Tim Boot, director of global marketing at Meyer Sound. “It’s not so much they are designing differently for a live audience, but rather making sure all systems are compatible with either augmenting a live performance with a remote audience or, if necessary, a remote audience only.”
While these systems are still designed for a full house, they may be configured or zoned differently for a smaller audience or broadcast only. Boot also said he's seeing requests for both permanent and temporary systems for outdoor venues, or systems that can pull double duty—which often results in scaling up to address the same-sized audience over a larger area.
Immersive Is Emerging
As clients and consultants prepare for a post-COVID market, the trend toward immersive performance experiences continues to grow. “Consultants are telling us that almost every venue they are designing either will have an immersive sound capability or will have an immersive-ready infrastructure, even if all the loudspeakers don’t go in right away,” said Boot, noting that Meyer Sound’s Spacemap Go spatial sound design and live mixing tool helps designers essentially create a system blueprint.
Gino Pellicano, application manager at L-Acoustics, has a theory for why this trend toward immersion is accelerating, and it has a lot to do with the pandemic. “If we’re talking about how COVID has transformed the industry,” he offered, “despite how disruptive it was to our business, there are some interesting trends that we’re seeing. For a year, everybody was at home listening to music and livestreams of their favorite artists in their home theater or over headphones. It’s become inexpensive to deploy a good-sounding audio system at home, and as live events return, the expectations are elevated, as we’ve become accustomed to immersive audio at home. The standard is set by what listeners can experience in their homes or on the go on a daily basis, [and] live events are quickly evolving to deliver on these heightened expectations.”
“Theater audio should never be overpowering, but simply add to the experience of the production,” said Ben Escobedo, senior specialist of market development at Shure. “Audiences should not be focused on mysterious black boxes on stage with cables coming out of them, but rather that the show was immersive and moving. If an audio production is successful, it should be completely transparent to the attendee, serving as one of the many well-planned components of a magical theater experience.”
In large markets, competition for radio frequencies can be fierce, and sound designers can find themselves in the position of making the most of limited RF spectrum for wireless applications. “Here in NYC, more than ever, we are seeing elevated challenges using wireless microphones and in-ear monitors for theater,” said Escobedo. “Productions still have high channel-count demands, while RF spectrum is continually threatened.”
Shure’s solution for making efficient use of the available spectrum is Axient Digital technology, which uses encryption and digital diversity to address high channel counts and constrained spectrum while preserving transparency. The technology is used in productions like David Byrne’s Broadway hit American Utopia, in which nearly every audio source is wireless. “Just a few years ago something like this would have been very difficult, if not impossible, based on the wireless technologies available at the time,” Escobedo added.
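Escobedo’s point about constrained spectrum comes down to frequency coordination: each wireless channel must keep clear not only of every other carrier, but also of the intermodulation products those carriers generate. The sketch below illustrates the basic idea with third-order products only; it is not Shure’s coordination algorithm, and the frequencies, guard band, and function names are purely illustrative.

```python
from itertools import permutations

def im3_products(freqs_mhz):
    """Third-order intermod products (2*f1 - f2) for every ordered pair."""
    return {round(2 * f1 - f2, 3) for f1, f2 in permutations(freqs_mhz, 2)}

def is_clear(candidate_mhz, assigned_mhz, guard_mhz=0.3):
    """True if the candidate frequency keeps a guard band away from every
    assigned carrier and every IM3 product of the assigned set."""
    hazards = set(assigned_mhz) | im3_products(assigned_mhz)
    return all(abs(candidate_mhz - h) >= guard_mhz for h in hazards)

assigned = [470.125, 471.250]
# IM3 products here: 2*470.125 - 471.250 = 469.000 and
#                    2*471.250 - 470.125 = 472.375
print(is_clear(469.050, assigned))  # lands near an IM3 product
print(is_clear(474.000, assigned))  # comfortably clear
```

Real coordination tools also account for fifth-order products, TV channel occupancy, and transmitter proximity, but the principle is the same: the usable spectrum shrinks much faster than the channel count grows, which is why high-count theatrical rigs are hard.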
Beyond clearing the stage of distractions, changes are also happening in how loudspeakers deliver audio to the audience, a trend most apparent in theaters in-the-round. Boot said the challenges of creating a uniform audio experience in-the-round are twofold. Not only is tight pattern control non-negotiable, but FOH engineers mix audio without the benefit of hearing all the components at once. “That means you have to design and calibrate the system so that the coverage and response is consistent throughout the audience seating,” he said.
When considering applications such as theater in-the-round, immersive audio systems add significant value while also reducing complexity for the engineer. “If you look at using [L-Acoustics’] L-ISA for this application,” explained Pellicano, “surrounds and overheads can supplement the main reinforcement system, providing sound designers the ability to pan effects around the space in both the vertical and horizontal planes, all from the same mixing environment used to reinforce speech. Complementing this, the L-ISA room engine generates spatially dependent reverberation from each object individually, resulting in a realistic-sounding reverberant environment that can work to complement the existing room acoustics, or be used as a creative tool to simulate a different space entirely.”
Audio localization is another key component of a successful immersive audio installation. “With theatrical, achieving accurate localization to performers on stage can be very challenging in a traditional L/R or LCR system,” Pellicano added. “The last thing you want is for somebody to put on a great performance, but you’re not engaged because there’s a spatial disconnect with where the performer is on stage and where their voice is [coming] from. With L-ISA, we significantly improve localization by deploying a minimum of five arrays horizontally across the front of the stage. We call this the scene system—and with that additional horizontal resolution, the technology disappears and your focus becomes the performer rather than the loudspeakers.”
Remedies include deploying a phalanx of front-fill and under-balcony loudspeakers and mixing sound in a way that audio sources follow the action. For example, when an actor moves from stage left to stage right, their microphone output would move in the same direction, avoiding dissociation in which the actor is physically in one place while the sound of their voice comes from another place.
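At its simplest, the source-follows-actor technique described above is a pan law: as the performer’s stage position changes, the gain sent to each side of the system changes with it. The sketch below shows a basic constant-power stereo pan; it is a deliberate simplification (theatrical systems like the five-array scene system pan across many outputs), and the function name and position mapping are illustrative, not from any vendor’s API.

```python
import math

def constant_power_pan(position):
    """Constant-power pan law. `position` runs from -1.0 (full left)
    to +1.0 (full right); returns (left_gain, right_gain).
    Gains satisfy L^2 + R^2 = 1, so perceived loudness stays even
    as the source sweeps across the stage."""
    theta = (position + 1.0) * math.pi / 4.0  # map [-1, 1] -> [0, pi/2]
    return math.cos(theta), math.sin(theta)

# An actor crossing the stage: gain smoothly hands off between sides.
for pos in (-1.0, -0.5, 0.0, 0.5, 1.0):
    left, right = constant_power_pan(pos)
    print(f"pos={pos:+.1f}  L={left:.3f}  R={right:.3f}")
```

A constant-power law is preferred over a simple linear crossfade because the linear version dips about 3 dB in the center, which would make a crossing actor sound quieter mid-stage.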
“Bob McCarthy, our director of system optimization, boils it down to three fundamentals,” said Boot. “First, pick the right loudspeaker. Second, put it in the right place, and finally, point it at a listener. Of course, implementing the fundamentals can get complicated, and Bob wrote a thick book about it. But with all our electronic wizardry, we can’t ignore these fundamentals. At the end of the day, you can’t break the laws of physics.”