The Stadium Where the Scoreboard Sound Arrived First
The new LED scoreboard had built-in speakers and its own audio feed from the broadcast truck. The stadium PA had its own separate feed. Nobody synchronized them. The scoreboard processed its video and audio digitally, adding 40ms of latency. The stadium PA had 15ms of DSP processing delay. The scoreboard was closer to the upper deck seats than the PA speakers.
For fans in the upper deck, the sequence of events was: (1) see the goal on the scoreboard, (2) hear the scoreboard announcement, (3) hear the stadium PA echo, (4) hear the crowd reaction from below, (5) hear the reflection from the roof. Five temporally distinct versions of the same moment, spread across nearly a full second. It was like watching the same goal replayed five times in quick succession.
Lower-deck fans had the opposite problem: the PA arrived first, then the scoreboard sound arrived late enough to register as a distinct echo. The celebration rippled upward through the stands as each section received its version of the news at a different time, creating what one journalist described as "the wave, but for sound."
Multiple sound sources at different distances with different processing delays create a temporal mess that the Haas (precedence) effect cannot reconcile. The precedence effect fuses a leading sound with arrivals that follow within roughly 30 ms, but here the arrivals were spread far wider than that window, and there was no single "first arrival" to anchor on: which source won the race varied by seat.
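The seat-dependent race can be sketched with simple arithmetic: each source's arrival time is its processing latency plus acoustic travel time (distance over the speed of sound, about 343 m/s). The 40 ms and 15 ms latencies come from the article; the seat-to-source distances below are hypothetical, chosen only to illustrate how the "first arrival" flips between decks.

```python
# Sketch: per-seat arrival times for two sources (distances are hypothetical).
# Arrival time = processing latency + distance / speed of sound.

SPEED_OF_SOUND = 343.0  # m/s at roughly 20 °C
HAAS_WINDOW = 0.030     # ~30 ms: arrivals inside this fuse into one event

def arrival_time(processing_delay_s, distance_m):
    """Time from the shared feed to a listener's ear for one source."""
    return processing_delay_s + distance_m / SPEED_OF_SOUND

# Hypothetical seat-to-source distances, in metres
seats = {
    "upper deck": {"scoreboard": 25.0, "stadium PA": 80.0},
    "lower deck": {"scoreboard": 90.0, "stadium PA": 30.0},
}
processing = {"scoreboard": 0.040, "stadium PA": 0.015}  # from the article

for seat, distances in seats.items():
    arrivals = {src: arrival_time(processing[src], d)
                for src, d in distances.items()}
    first = min(arrivals, key=arrivals.get)
    spread = max(arrivals.values()) - min(arrivals.values())
    print(f"{seat}: first arrival = {first}, spread = {spread * 1000:.1f} ms, "
          f"fused by precedence effect: {spread <= HAAS_WINDOW}")
```

With these numbers the scoreboard wins the race in the upper deck and the PA wins in the lower deck, and both spreads land far outside the ~30 ms fusion window, so each deck hears two distinct events in a different order.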
The Moral: Every audio source in a venue must be time-aligned to every other. Use SonaVyx Transfer Function to measure the impulse response from each source at key listening positions, then apply compensating delay so their arrivals coincide.