
Topic: How many bits is 128 "kbps"? 128*1000 or 128*1024?

How many bits is 128 "kbps"? 128*1000 or 128*1024?

Hi everyone,

Does "128 kbps" for AAC mean 128*1000 bits per second, or 128*1024 bits per second? What's the right way?

Thanks!


Reply #1
http://en.wikipedia.org/wiki/Bit_rate

Quote
Note that, unlike many other computer-related units, 1 kbit/s is traditionally defined as 1,000 bit/s, not 1,024 bit/s, etc., also before 1999 when SI prefixes were introduced for units of information in the standard IEC 60027-2.


Reply #2
Ok, so it's 1000!

Thanks!


Reply #3
In general, when talking about transmitting data, units are expressed in SI (1 k = 1000). That also applies to Internet connection speeds (ADSL, 3G, ...).

That said, Microsoft used 1024-based kbits in their WMA codec (which translates to 131,072 bit/s for its 128 kbps setting).
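A quick sketch in Python of how far apart the two interpretations are for a nominal "128 kbps" stream:

```python
# Bits per second for a nominal "128 kbps" stream under each convention.
si_rate = 128 * 1000      # 128,000 bit/s -- the standard (SI) meaning for bit rates
binary_rate = 128 * 1024  # 131,072 bit/s -- the 1024-based reading mentioned above

# The gap is 3,072 bit/s, i.e. the binary reading is 2.4% higher.
print(si_rate, binary_rate, binary_rate - si_rate)
```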


The reason computers have historically used 1024 is their use of binary data (2^10 = 1024). This, in turn, defines the addressable space for data: if I have x address bits, I can have 2^x different addresses in which to store data.

The most recent effect of this is why one needs a 64-bit operating system to use more than 4 GB of RAM (let's forget about the fact that it is more like 3.5 GB in practice): 2^32 bytes is exactly 4 GB counting in 1024s, or about 4.29 GB if expressed in 1000s.
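The same number written out both ways, as a small Python check:

```python
# Address space reachable with 32 address bits, expressed both ways.
addr_space = 2 ** 32                # 4,294,967,296 bytes

binary_gb = addr_space / 1024 ** 3  # counting in 1024s: exactly 4.0
decimal_gb = addr_space / 1000 ** 3 # counting in 1000s: about 4.29

print(binary_gb, round(decimal_gb, 2))  # 4.0 4.29
```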