So here it goes:

When you see a sensitivity rating, look carefully at exactly what it says; don't just read "93 dB" and run with it. If it says 1 W/1 m, you're good to go. If it says 2.83 V/1 m, you aren't.

Here's the deal: 2.83 V/1 m **ONLY** means 1 watt if it's an 8 ohm speaker. At any other impedance, it isn't 1 watt.

2.83 V seems to be the norm these days, so when you encounter it, make sure you look closely at the impedance of the driver. If it's a dual voice coil driver, it's almost guaranteed the measurement was taken with the coils in parallel, and that will always artificially inflate the sensitivity spec.
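Here's a quick sketch of why the parallel wiring matters, using a hypothetical DVC sub with two 4 ohm coils (the numbers are illustrative, not from any real driver). At a fixed 2.83 V, halving the impedance doubles the power, and every doubling of power adds about 3 dB to the spec sheet:

```python
import math

# Hypothetical DVC subwoofer: two 4 ohm voice coils, same motor either way.
coil_ohms = 4.0
test_voltage = 2.83

wiring = {
    "series (8 ohm)": 2 * coil_ohms,      # impedances add in series
    "parallel (2 ohm)": coil_ohms / 2,    # impedances halve in parallel
}

for label, ohms in wiring.items():
    power = test_voltage ** 2 / ohms          # P = V^2 / R
    gain_db = 10 * math.log10(power)          # dB above the 1 W reference
    print(f"{label}: {power:.2f} W from 2.83 V -> +{gain_db:.1f} dB on the spec sheet")
```

Same driver, same voltage, but the parallel measurement gets roughly four times the power and therefore reads about 6 dB "more sensitive."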

Let's look at an imaginary subwoofer that's available in 8, 4, 2, and 1 ohm versions, otherwise identical, and rated at 2.83 V/1 m. Here's what it would look like on paper:

10" 8 ohm = 85 dB

10" 4 ohm = 88 dB

10" 2 ohm = 91 dB

10" 1 ohm = 94 dB

In reality they are all 85 dB, because 2.83 V equals 1 watt at 8 ohms, but 2 watts at 4 ohms, 4 watts at 2 ohms, and 8 watts at 1 ohm. Every doubling of power adds roughly 3 dB, which is exactly where the 85/88/91/94 steps come from. So the 1 ohm speaker was actually measured at 8 W/1 m.
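The correction above can be done in one line of math: subtract 10·log10(P) from the rated figure, where P = V²/R is the power the 2.83 V test signal actually delivers. A small sketch (the `sensitivity_at_1w` helper is my own name, not any standard tool) applied to the imaginary subwoofer table:

```python
import math

def sensitivity_at_1w(rated_db, impedance_ohms, test_voltage=2.83):
    """Convert a 2.83 V/1 m sensitivity rating to a true 1 W/1 m figure.

    P = V^2 / R is the power actually delivered by the test voltage;
    each doubling of power adds ~3 dB, so we subtract 10*log10(P).
    """
    power_watts = test_voltage ** 2 / impedance_ohms
    return rated_db - 10 * math.log10(power_watts)

# The imaginary subwoofer from the table above:
for ohms, rated in [(8, 85), (4, 88), (2, 91), (1, 94)]:
    print(f'10" {ohms} ohm: rated {rated} dB @ 2.83 V '
          f"-> {sensitivity_at_1w(rated, ohms):.1f} dB @ 1 W")
```

Run it and all four versions collapse back to about 85 dB at a true 1 W/1 m.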

So in summary: check whether the spec says 2.83 V/1 m or 1 W/1 m, and then check the impedance it was measured at. Nobody can magically make a subwoofer with a sensitivity of 93 dB without sacrificing something else. Efficiency is not free; hence Hoffman's Iron Law.

I hope this clears up some confusion on the subject.