If power consumption is related to bitrate (it may or may not be), then it makes sense to assume that a lower average bitrate would mean lower power consumption.
Well, I've also heard the opposite claim, regarding the 96 kbps lower limit for Vorbis on iRiver DAPs a few years back: lower-bitrate files supposedly needed more power, because the decoder had to generate more output data from less input.
Anyway, let's do a simple test. I took an ~84-minute-long mix, converted it to MP3 with various settings, and ran the results through fb2k's benchmark.
First, is there a decoding speed difference between low- and high-bitrate files? How big?

File: cbr-32kbps.mp3
Speed (x realtime): 790.820 min, 804.221 max, 798.964 average
File: cbr-320.mp3
Speed (x realtime): 224.727 min, 225.882 max, 225.190 average
Yes: the low-bitrate file really does decode roughly 3.5x faster than the maximum-bitrate one.
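For scale, here's my own back-of-the-envelope arithmetic on the averages above (not part of fb2k's output), converting the realtime multipliers into actual CPU seconds spent decoding the whole mix:

```python
# fb2k benchmark averages (x realtime) from the runs above
cbr32_avg = 798.964
cbr320_avg = 225.190

track_seconds = 84 * 60  # the ~84-minute test mix

ratio = cbr32_avg / cbr320_avg
cpu_32 = track_seconds / cbr32_avg    # CPU seconds to decode the whole mix
cpu_320 = track_seconds / cbr320_avg

print(f"speed ratio: {ratio:.2f}x")              # ~3.55x
print(f"CPU time at 32 kbps:  {cpu_32:.1f} s")   # ~6.3 s
print(f"CPU time at 320 kbps: {cpu_320:.1f} s")  # ~22.4 s
```

So even the "slow" case burns only about 22 CPU-seconds per 84 minutes of audio; the absolute cost is tiny either way.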
Next, is there a decoding speed difference between CBR and VBR at similar bitrate?
File: cbr-128.mp3
Speed (x realtime): 268.777 min, 269.622 max, 269.084 average
File: vbr-v5.mp3
Speed (x realtime): 266.027 min, 267.631 max, 266.823 average
I don't think so. The V5 file's real average bitrate was 138.7 kbps, slightly above 128, which alone would explain the small difference.
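To put numbers on that (again my own arithmetic, not fb2k output): the speed gap between the two files is under 1%, while the V5 file carries about 8% more bits, so per bit of input the VBR decode is certainly no slower:

```python
cbr_avg = 269.084   # cbr-128.mp3, x realtime average
vbr_avg = 266.823   # vbr-v5.mp3, x realtime average
cbr_kbps = 128.0
vbr_kbps = 138.7    # real average bitrate of the V5 file

speed_gap = (cbr_avg - vbr_avg) / cbr_avg * 100
bitrate_gap = (vbr_kbps - cbr_kbps) / cbr_kbps * 100

print(f"speed gap:   {speed_gap:.2f}%")    # ~0.84%
print(f"bitrate gap: {bitrate_gap:.2f}%")  # ~8.36%
```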