Originally Posted by xtremerevolution
See my response in that thread. I never said ALL amplifiers are 50% efficient. Say you're 86% efficient at 1000W, you're still drawing 1160 watts, or 84A at 13.8V. Considering that the alternator is not going to be running at its peak and considering more components in your car require electricity than just your amplifiers, my logic still holds true. Feel free to provide some concrete evidence if you're going to tell me I'm wrong instead of just "I misunderstood your last post so I'm just going to write you off as flawed and incorrect in this one by default." A little respect shouldn't be something I have to ask for.
This is the point that I'm making though, brother: YOU WILL NOT BE ABLE TO MAKE RATED POWER AT 13.8v ON THAT AMP, it's rated to make power at 14.4v! There is no way around that. Not only that, but an amp's efficiency is not rated at a particular power output, but rather at a particular ohm load. Your calculation is off on two counts: you're calculating current from the full rated power output at a voltage where the amp would, by default, put out less than rated power, and the efficiency math itself is flawed.
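To put a rough number on the derating, here's a minimal sketch assuming an unregulated power supply whose rail voltage tracks the input, so output scales roughly with the square of supply voltage. The 14.4v rating and 13.8v figure come from this thread; the V-squared rule is my assumption for illustration, not a published spec for this amp:

```python
def derated_output(rated_watts, rated_volts, actual_volts):
    """Rough output estimate assuming power scales with the square of supply
    voltage (typical of unregulated supplies; a regulated amp would hold
    rated power over its input range)."""
    return rated_watts * (actual_volts / rated_volts) ** 2

# A 1000 W amp rated at 14.4 V, running on 13.8 V:
print(round(derated_output(1000, 14.4, 13.8)))  # roughly 918 W
```

So even a modest 0.6v drop costs you something like 8% of rated output under that assumption, before efficiency losses even enter the picture.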
The correct way to find your current draw is to divide the wattage by the voltage, then divide that result by the amp's efficiency (at 50% efficiency that means multiplying by 2; if the amp were 80% efficient you'd multiply by 1.25). That result is your MAXIMUM AMPERAGE DRAW WHEN PLAYING A TEST TONE. Current draw while playing music is roughly a third of that, so divide by 3 and you'll have your amp's current draw at full tilt on music.

Finding the voltage is the easy part: measure it AT THE AMP with a DMM. To find the wattage the amp is putting out, play a test tone, measure the AC voltage at the speaker terminals on the amp, multiply that number by itself, and divide the result by the impedance of the speaker connected to the terminal where you measured. Since a DMM reads RMS voltage, that gives you the RMS power directly (if your meter reads peak voltage instead, the RMS power is half of that). Take that power figure, divide it by the voltage you measured at the amp's power/ground terminals and by the efficiency, and then divide by 3 to factor in the dynamic nature of music as opposed to a test tone.
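The procedure above can be sketched in a few lines. The function names and the 50% efficiency default are mine for illustration; the divide-by-3 music factor is the rule of thumb from the post:

```python
def output_power(v_speaker_rms, speaker_ohms):
    """RMS output power from the AC voltage measured at the speaker
    terminals on a sine test tone (a DMM reads RMS): P = V^2 / Z."""
    return v_speaker_rms ** 2 / speaker_ohms

def current_draw(watts_out, supply_volts, efficiency=0.5, music_factor=3.0):
    """Return (test-tone amps, music amps). Input power is output power
    divided by efficiency; music draws roughly 1/3 of a steady test tone."""
    tone_amps = watts_out / supply_volts / efficiency
    return tone_amps, tone_amps / music_factor

# Example: ~44.7 V RMS across a 2-ohm load is about 1000 W out.
watts = output_power(44.7, 2.0)
tone, music = current_draw(watts, 13.8)
print(f"{watts:.0f} W out -> {tone:.0f} A on a tone, {music:.0f} A on music")
```

Swap in your own measured voltages and the actual efficiency from your amp's spec sheet; the 50% default is the worst-case assumption, so real numbers will usually come out lower.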
So, using your example of the JL 1000/1, and assuming 13.8v system voltage and a worst-case 50% efficiency, that amp would draw roughly 145a at full tilt on a test tone but only about 48a at full tilt on music. But since we know this amp will not put out 1000 watts at 13.8v, and a Class D amp like that is well above 50% efficient, the real current draw would be reduced even more.
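A quick sanity check of those numbers, comparing the worst-case 50% assumption against a more realistic 80% for a Class D amp (the 80% figure is my assumption here, not a measured spec):

```python
rated_watts, supply_volts = 1000, 13.8  # figures from the thread

for efficiency in (0.5, 0.8):
    tone_amps = rated_watts / supply_volts / efficiency
    music_amps = tone_amps / 3  # music draws ~1/3 of a steady test tone
    print(f"{efficiency:.0%} efficient: {tone_amps:.0f} A tone, {music_amps:.0f} A music")
```

At 80% efficiency you're looking at roughly 30a of sustained draw on music, which is comfortably inside what a healthy stock charging system can supply alongside the rest of the car.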
I have nothing but respect for you, X; I just wanted to clarify some misinformation so people aren't spooked into thinking they need to pay $400+ for a high-output alternator to run a 1000 watt system. Please don't take offense.