I still don't understand the difference. GB sometimes means 1024 MB, and sometimes it means 1000 MB. But then sometimes it also means either 1000 MiB or 1024 MiB, depending on who's saying it.
The whole "iB" thing was supposed to clear up the confusion but it only makes it worse. I don't understand why we can't just make 1 GB = 1024 MB, ditch the silly "iB" nonsense, and call it a day.
I blame hard drive manufacturers. They're the ones who started this whole 1GB = 1000MB bullshit.
I agree with you. A long time ago, those of us "in the know" techies could parse the difference like it was a native language. When talking about anything but computers, it was always the SI meaning of 1000. When talking about computers, it was always 1024.
I think the masses were confused and the SI purists felt their SI prefixes were being corrupted. So they made a distinction/standard between binary numbering system prefixes and decimal numbering system prefixes.
I hate it. It feels wrong because I'm old and set in my ways. People like me are confused because we still use the old nomenclature, and when someone else uses the old nomenclature (when talking about computers), it's ambiguous to us because we don't know which numbering system they're using (binary as opposed to decimal). I still have to ask, and half say binary and half say decimal.
I suppose if they're teaching it in high school and college it'll become native soon enough, if it hasn't already with the next generations.
GB is metric and it's easy for us to remember: 1000 bytes = 1 kilobyte, 1000 kilobytes = 1 megabyte, and so on.
GiB is the binary value. In binary, you have to work in powers of 2. That is, the values double every time (2, 4, 8, 16, 32, 64 and so on…). 1024 bytes = 1 KiB, 1024 KiB = 1 MiB.
Since computers work in binary, and 1000 isn't a number that's easy to deal with in binary, we use the closest value available to us, 1024. In fact, back in the days when people were only concerned about KBs, they would just say 1 KB = 1024 bytes (what we'd now call a KiB).
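If it helps, here's the whole decimal-vs-binary prefix business written out as a quick sketch (the constant names are just mine for illustration):

```python
# Decimal (SI) prefixes: each step is ×1000
KB, MB, GB = 1000, 1000**2, 1000**3

# Binary (IEC) prefixes: each step is ×1024, i.e. ×2**10
KiB, MiB, GiB = 1024, 1024**2, 1024**3

print(GB)   # 1000000000
print(GiB)  # 1073741824 (== 2**30, the closest power of 2)
```

So a "gigabyte" and a "gibibyte" differ by about 74 million bytes, and that's the whole argument in this thread.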
Of course, we’re now working with TBs rather than KBs. Everything ramps up including the amount of “missing” space an OS reports on a hard drive.
I know Windows tries to be helpful, but it actually shows you the drive's GiB value while labelling it "GB". Ever wonder why a 1TB hard drive appears as ~931GB? This is why.
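You can check that 931 number yourself; it's just the drive's advertised decimal size divided by the binary unit:

```python
TB = 1000**4   # what the manufacturer sold you: 10**12 bytes
GiB = 1024**3  # the binary unit the OS divides by

print(TB / GiB)  # ~931.32 -> shown as "931 GB"
```

No space is actually missing; the same number of bytes is just being divided by a bigger unit.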
Other OSes tend to either show the decimal value (matching the number on the box) or label the binary value correctly as GiB, which avoids the mismatch.
An important thing to note about this is that as we move up the prefixes, the gap between the binary and decimal units widens. A KiB is only 2.4% more than a kB, but a TiB is about 10% more than a TB. So using them interchangeably is increasingly misleading.
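A quick loop makes the widening gap obvious; each step up multiplies the discrepancy by another factor of 1.024:

```python
for n, prefix in enumerate(["Ki", "Mi", "Gi", "Ti"], start=1):
    # relative error of the binary unit vs its decimal counterpart, in percent
    err = (1024**n / 1000**n - 1) * 100
    print(f"{prefix}B vs {prefix[0]}B: +{err:.1f}%")
# KiB vs KB: +2.4%
# MiB vs MB: +4.9%
# GiB vs GB: +7.4%
# TiB vs TB: +10.0%
```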
Also, there are many cases in computers where it doesn't really make sense to fuss about binary. Like, an HDD is a spinning piece of metal; the number of bits it can store has no inherent power-of-two constraint.
Fun fact: the old 3.5" floppies that were marketed as 1.44 MB were neither 1.44 MiB nor 1.44 MB, but some weird hybrid mash-up unit where the "M" meant 1000 × 1024 bytes.
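The arithmetic on the floppy is fun to work through. The disk holds 2880 sectors of 512 bytes, and "1.44" only comes out if you divide by the mixed unit:

```python
capacity = 2880 * 512  # 1,474,560 bytes on a high-density 3.5" floppy

print(capacity / (1000 * 1024))  # 1.44     -> the marketing "MB": 1000 × 1024 bytes
print(capacity / 1000**2)        # 1.47456  -> actual decimal MB
print(capacity / 1024**2)        # 1.40625  -> actual MiB
```

So the disk is really ~1.47 MB or ~1.41 MiB, and "1.44" is the only figure that matches neither.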
To expand on this: people sometimes use SI prefixes to mean 1024 units, but that's just wrong. A kilometer is 1000 meters, a kilogram is 1000 grams, and so on. If we were to redefine these prefixes for specific disciplines, things would get much more complicated very quickly.
@skullgiver@LaughingFox It's not programmer laziness at all. A RAM module's size has to be a power of 2 on most platforms because of various assumptions the CPU makes about memory alignment and bulk memory reads for performance reasons. Processors don't interact directly with the flash dies, so those can be whatever size the manufacturer likes, provided the controller knows what it's doing.