Computer units

A common source of confusion for me is converting between units in computer science.

It also doesn't help that Google incorrectly interprets kilobits per second (kb/s) as kilobytes per second.

WAIT A MINUTE, 33729 kb/s to Mbps is ~33 Mbps... How do people distinguish between Megabit and Megabyte !?! pic.twitter.com/yX3gV6cLzT

— Kai Hendry 🇸🇬 (@kaihendry) January 9, 2018
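
To unpack the numbers in that tweet: read as kilobits, 33729 kb/s is 33729 / 1000 ≈ 33.7 Mbit/s, which is only 33.7 / 8 ≈ 4.2 MB/s. Read as kilobytes, the same figure would be about 33.7 MB/s, i.e. roughly 270 Mbit/s. One capital letter changes the answer by a factor of 8.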

How to cope

There are 8 bits in a byte. Always (ref IEC 80000-13). Use bytes. I prefer decimal units of bytes as a rule, but unfortunately some markets, such as video encoding and internet connection speeds, commonly quote bits.
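
Here is a minimal sketch of that rule of thumb in Go (the function names bitsToBytes and humanBytes are mine, just for illustration): divide by 8 to get bytes, then apply decimal (1000-based) prefixes.

```go
package main

import "fmt"

// bitsToBytes converts a quantity measured in bits to bytes.
// There are always 8 bits in a byte (IEC 80000-13).
func bitsToBytes(bits float64) float64 {
	return bits / 8
}

// humanBytes formats a byte count with decimal (SI) prefixes:
// 1 kB = 1000 B, 1 MB = 1000 kB, 1 GB = 1000 MB.
func humanBytes(b float64) string {
	units := []string{"B", "kB", "MB", "GB", "TB"}
	i := 0
	for b >= 1000 && i < len(units)-1 {
		b /= 1000
		i++
	}
	return fmt.Sprintf("%.1f %s", b, units[i])
}

func main() {
	// The tweet's 33729 kb/s, read as kilobits per second.
	fmt.Println(humanBytes(bitsToBytes(33729 * 1000))) // prints "4.2 MB" (per second)
}
```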

Why are they using bits?

Because way back in the day hardware ran so slowly that you measured throughput in bits. And to add to the confusion, in the 80s a byte wasn't necessarily 8 bits, so counting in bits was the more absolute measure.

Unfortunately, bigger numbers sound more impressive in bits. 100 Mbps (megabits per second) sounds way more impressive than 12.5 MB/s (megabytes per second), though in reality you care about how many megabytes a second you are shifting. Btw, there are 1000 megabytes in a gigabyte.
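
That comparison is just the same divide-by-8: 100 Mbps ÷ 8 = 12.5 MB/s. And because the prefixes are decimal all the way up, an hour at that rate moves 12.5 × 3600 = 45,000 MB, i.e. 45 GB.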

Be aware of the confusion threshold

Have a look at these bitrate examples.

Notice we go from Gbit/s (BITS!) to MB/s (BYTES, each of which is 8 bits!). Be aware!
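
For example, 1 Gbit/s is 1000 Mbit/s, which works out to 1000 / 8 = 125 MB/s; conversely, a 100 MB/s figure on that page is 800 Mbit/s, i.e. 0.8 Gbit/s.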
