It wasn't until Opus arrived with its -b setting (which, at that point, seemed like a weird return to the ham-fisted bitrate settings of the past) that someone corrected me, explaining that Vorbis' -q setting was also an abstraction, so to speak, over a set of constraints around bitrate. I was told that this absolute perceptual quality target didn't in fact exist, and that all -q# did was approximate one through tuned bitrate allocations, weights, and other techniques.
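To make the idea concrete, here is a minimal sketch of what that abstraction amounts to: the -q value is essentially an index into tuned bitrate targets, with fractional values landing between two presets. The `NOMINAL_KBPS` table and `approx_bitrate` helper are my own illustration; the numbers are the commonly cited approximate nominal bitrates for 44.1 kHz stereo, not anything pulled from the libvorbis source.

```python
# Approximate nominal bitrates (kbps, 44.1 kHz stereo) commonly cited
# for each integer Vorbis quality level. Illustrative values only.
NOMINAL_KBPS = {
    -1: 45, 0: 64, 1: 80, 2: 96, 3: 112, 4: 128,
    5: 160, 6: 192, 7: 224, 8: 256, 9: 320, 10: 500,
}

def approx_bitrate(q: float) -> float:
    """Interpolate between per-level targets, mirroring how a
    fractional -q value falls between two tuned presets."""
    if q <= -1:
        return float(NOMINAL_KBPS[-1])
    if q >= 10:
        return float(NOMINAL_KBPS[10])
    lo = int(q // 1)
    frac = q - lo
    return NOMINAL_KBPS[lo] + frac * (NOMINAL_KBPS[lo + 1] - NOMINAL_KBPS[lo])
```

The point of the sketch: "quality" in, bitrate target out. There is no independent perceptual quality metric being held constant, just a lookup into numbers that someone tuned by ear.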
That sure burst my bubble. Everyone and his dog back then was spreading the same perception I had about Vorbis' "quality"-based encoding: "Choose a quality level that sounds right to you and stick to it, so that any source material you throw at it sounds exactly as good. The encoder will take care of using just the necessary amount of bits to achieve that target quality. qX will always be qX, and the improvements are about improving bandwidth efficiency to reach the same target quality."
To be honest, it's kind of frustrating that this mythical "quality" setting doesn't actually exist. It seemed so neat and... oh well.
All of it quite removed from how lossy codecs actually work, where it's the perceptual efficiency at different bitrate ranges that gets improved upon.