Originally Posted by Weigel21
But I don't understand why it'd make the slightest bit of difference.
I mean, if the sub is rated at, say, 1000 W RMS (I don't know what its rating was years ago) and you can reach that power regardless of whether it's wired to present a 0.5 or 2 ohm final impedance, why would it "hit harder" wired at 0.5 ohm vs 2 ohm?
Again, I'm not talking about how a lower impedance load lets an amplifier produce more power and, in turn, more output from the setup. My question was: if all else were equal, what difference would it make? 1000 W RMS @ 0.5 ohm vs 1000 W RMS @ 2 ohm into that sub. Are you saying it'd still hit harder wired for 0.5 ohm vs 2 ohm?
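To make the "all else equal" premise concrete, here's a quick sketch of the arithmetic, treating the sub as an ideal resistive load (real voice coils are reactive, so this is only an approximation). At the same 1000 W, only the voltage/current split changes with impedance; the delivered power is identical:

```python
import math

def drive_conditions(power_w: float, impedance_ohm: float):
    """Return (voltage, current) needed to put power_w into impedance_ohm.

    From P = V^2 / R = I^2 * R for an idealized resistive load.
    """
    voltage = math.sqrt(power_w * impedance_ohm)
    current = math.sqrt(power_w / impedance_ohm)
    return voltage, current

for r in (0.5, 2.0):
    v, i = drive_conditions(1000.0, r)
    # In both cases V * I = 1000 W; only the V/I ratio differs.
    print(f"{r} ohm: {v:.1f} V, {i:.1f} A, power = {v * i:.0f} W")
```

At 0.5 ohm the amp delivers roughly 22 V at 45 A; at 2 ohm, roughly 45 V at 22 A. Either way the sub dissipates the same 1000 W, which is the point of the question: the wiring changes what the amplifier has to supply, not what the driver receives, assuming the amp can actually deliver that power cleanly into both loads.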