Because it gets worse with size. 1 KB is just about 2.3% less than 1 KiB, but 1 TB is almost 10% less than 1 TiB. So your 12 TB archive drive isn’t roughly 12 TiB; it’s closer to 10.9 TiB. More than an entire “terabyte” less.
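If anyone wants to check the arithmetic, here’s a quick sketch (plain Python; the exact percentages are my numbers, not from the thread):

```python
# How the gap between decimal and binary prefixes grows with each step
for prefix, step in [("kilo", 1), ("mega", 2), ("giga", 3), ("tera", 4)]:
    si = 10 ** (3 * step)        # decimal unit, e.g. 1 TB = 10^12 bytes
    binary = 2 ** (10 * step)    # binary unit, e.g. 1 TiB = 2^40 bytes
    gap = (1 - si / binary) * 100
    print(f"{prefix}: decimal unit is {gap:.2f}% smaller")  # 2.34% ... 9.05%

# The drive from above: a "12 TB" drive expressed in TiB
print(12 * 10**12 / 2**40)  # ≈ 10.91
```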
Btw ISPs do the same.
Data transfers have always been in base 10. And disk manufacturers are actually right. If anything, it’s probably Microsoft that has popularised the wrong way of counting data.
It has nothing to do with wanting to make disks look bigger or whatever.
Why is it the wrong way when it’s the way the underlying storage works?
Underlying storage doesn’t actually care about powers of two; it’s only the controllers that do, not the storage medium. You’re mixing up the capacity of the storage with the number of possible states you can fill it with (which is 2^(number of bits)).
Looking at triple-level cell (TLC) SSDs: how would you ever reach a capacity of 2^10, 2^20 or 2^30 bits when each physical cell stores three bits? The total bit count can only be a multiple of three. So you can do three gibibytes, but that’s just as arbitrary as any other configuration.
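You can sanity-check that divisibility argument directly (a small sketch of my own, not from the thread):

```python
# No power of two is divisible by 3, so a TLC drive (3 bits per cell)
# can never land on exactly 2^n bits of capacity.
assert all(2**n % 3 != 0 for n in range(1, 64))

# 3 GiB, on the other hand, is 3 * 2**33 bits: an exact number of cells.
print(3 * 2**33 % 3 == 0)  # True
```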
It’s just the agreed metric for all capacities except RAM. Your Gigabit network card also doesn’t transfer a full gibibit (or 128 mebibytes) in a single second. Yet nobody complains. Because it’s only the operating system manufacturer that thinks its product needs AI that keeps using the prefixes wrongly (or at least did, I’m not up to speed). Everyone else either uses SI units (Apple) or correctly uses the “bi”-prefixes.
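The network numbers work out like this (a sketch of the conversion; the ~119 MiB/s figure is my arithmetic):

```python
bits_per_second = 10**9              # "Gigabit" Ethernet: 10^9 bits/s, base 10
bytes_per_second = bits_per_second / 8

print(bytes_per_second / 10**6)      # 125.0 decimal megabytes/s
print(bytes_per_second / 2**20)      # ≈ 119.21 MiB/s
print(2**30 / 8 / 2**20)             # 128.0 MiB: what a full gibibit would be
```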
A twelve-core 3 GHz processor is also cheating you out of roughly a 2.4 GiHz core by the same logic: 12 × 3 GHz is 36 GHz, but 36 GiHz would be nearly 38.7 GHz. No core actually runs at 3 × 2^30 Hz.
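The “missing core” arithmetic, spelled out (my numbers, same logic as above):

```python
cores, clock_hz = 12, 3 * 10**9
actual_total = cores * clock_hz       # 36 GHz, as advertised
binary_total = cores * 3 * 2**30      # what "3 GiHz" per core would mean
deficit = binary_total - actual_total

print(deficit / 2**30)                # ≈ 2.47 "GiHz": nearly a whole core "missing"
```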