I've been thinking about amplifier burst testing lately. The capacitor reservoir differs in every amplifier design, and it has a big impact on how an amp performs on subwoofer duty: during huge transients, the reservoir can briefly supply more power than the amp can draw from the wall outlet.
There is no standard for how long a burst-power test should last. Some argue that amps should be tested with bursts up to 500 milliseconds long, while others use 1 ms or even less, which can make an amp with a small cap bank look like it has higher peak power.
With a horizontal timescale of 50 ms per division, here are the lengths of CEA bursts at a few different frequencies:
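The reason the traces get so much longer at low frequencies is that a tone burst is a fixed number of cycles, so its duration scales inversely with frequency. A quick sketch, assuming a CEA-2010-style shaped burst of about 6.5 cycles (the cycle count here is my assumption, not taken from the graphs):

```python
# Sketch: tone-burst duration vs. frequency, assuming a CEA-2010-style
# shaped burst of ~6.5 cycles (cycle count is an assumption).
CYCLES = 6.5

def burst_duration_ms(freq_hz, cycles=CYCLES):
    """Duration of an N-cycle tone burst at freq_hz, in milliseconds."""
    return cycles / freq_hz * 1000.0

for f in (5, 10, 20, 31.5, 63):
    print(f"{f:5g} Hz burst: {burst_duration_ms(f):7.1f} ms")
```

At 5 Hz the burst runs over a second, while at 63 Hz it is over in about a tenth of that, which is exactly the difference visible in the scope shots.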
You can see how much more demanding a 5Hz burst is: its longer duration will drain a cap bank far more than a shorter burst would.
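To put rough numbers on "draining a cap bank": the energy a reservoir gives up between two rail voltages is ½C(V₁² − V₂²). Here's a sketch with made-up values (a hypothetical 40,000 µF bank on 170 V rails, covering an extra 1000 W beyond what the wall supplies; none of these figures come from a specific amp):

```python
import math

# Hypothetical numbers for illustration -- not from any specific amp.
C = 0.040         # 40,000 uF cap bank, in farads
V1 = 170.0        # starting rail voltage, volts
P_EXTRA = 1000.0  # power the caps must supply beyond the wall, watts

def rail_after(t_s, c=C, v1=V1, p=P_EXTRA):
    """Rail voltage after the caps supply p watts for t_s seconds.
    Energy drawn: p*t = 0.5*c*(v1^2 - v2^2), solved for v2."""
    return math.sqrt(v1**2 - 2.0 * p * t_s / c)

for t_ms in (1, 10, 100):
    print(f"after {t_ms:3d} ms burst: rails sag to {rail_after(t_ms / 1000):.1f} V")
```

A 1 ms burst barely touches the rails, while a 100 ms burst sags them by around 15 V, which is why short-burst ratings flatter small cap banks.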
Looking closely at the 5Hz burst, we can guess where the capacitor reservoir will start to discharge. If we figure on a 120V/30A (3600W) power outlet and an amp that is roughly 60% efficient, the amp can sustain a tone at around 2205W. Into a 5 ohm speaker load, that works out to about 105V (since 105²/5 = 2205W) before the wall power runs out of juice. Everything above 105V would have to be covered by the cap bank.
This graph has a vertical scale of 50V per division and a horizontal scale of 50 milliseconds per division.
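The arithmetic behind that 105V line, as a sketch (the outlet rating and efficiency are the figures assumed above; 2205/3600 works out to an efficiency of 61.25%):

```python
import math

OUTLET_W = 120 * 30    # 120 V / 30 A wall outlet -> 3600 W
EFFICIENCY = 0.6125    # "about 60%"; chosen to match the 2205 W figure
LOAD_OHMS = 5.0

sustained_w = OUTLET_W * EFFICIENCY              # ~2205 W sustained from the wall
wall_limit_v = math.sqrt(sustained_w * LOAD_OHMS)  # V = sqrt(P * R)

print(f"sustained power from the wall: {sustained_w:.0f} W")
print(f"voltage into {LOAD_OHMS:g} ohms before the caps take over: {wall_limit_v:.0f} V")
```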
Since we now know where to look for the transition from wall outlet power to the cap reservoir (a bit over 2 divisions, or 105V), we can zoom in on the bursts at different CEA frequencies and see how much time the cap bank has to cover.
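How much time per cycle the caps have to cover depends on how far the burst peaks above the threshold. As a sketch, assume a hypothetical 150 V peak (my number, not read off the graphs) against the ~105 V wall-power limit; for a sine, the trace spends a fixed fraction of every cycle above that line, so the absolute time the caps must carry scales directly with the period:

```python
import math

V_THRESH = 105.0  # wall-power voltage limit from the earlier estimate
V_PEAK = 150.0    # hypothetical burst peak voltage, assumed for illustration

def time_above_threshold(freq_hz, v_peak=V_PEAK, v_thresh=V_THRESH):
    """Seconds per cycle that |v_peak * sin(w*t)| exceeds v_thresh.
    In each half-cycle the sine is above the threshold for
    T/2 - 2*asin(v_thresh/v_peak)/w; both polarities double that."""
    period = 1.0 / freq_hz
    w = 2.0 * math.pi * freq_hz
    t_cross = math.asin(v_thresh / v_peak) / w  # first crossing time
    return 2.0 * (period / 2.0 - 2.0 * t_cross)

for f in (5, 10, 31.5):
    print(f"{f:5g} Hz: caps cover ~{time_above_threshold(f) * 1000:.1f} ms per cycle")
```

With these assumed numbers, a 5 Hz cycle asks the cap bank to carry roughly 100 ms per cycle, versus about 16 ms at 31.5 Hz, and that is before counting the multiple cycles in a burst.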
Single-digit frequencies get pretty stressful for an amplifier, no matter what topology it uses.
The question is which burst frequency is the most logical to use for testing an amplifier's capabilities. Stay tuned: I'll post some real-world scope shots of transient waveforms from movie content, zoomed in on their durations, to see whether that helps pick a burst frequency that most closely matches the stresses some of the coolest movie bass hits inflict.