Windows (and many other places) uses GB when it means GiB for historical reasons. The IEC's binary prefixes (KiB, MiB, GiB, etc.) weren't introduced until 1998 and weren't folded into the ISO/IEC 80000 standard until 2008, so for decades computing professionals used whatever was most convenient for them. That meant most software folks used K = 1024, while many hardware folks stuck with K = 1000 for consistency with everywhere else they encountered SI prefixes. The old convention persists in part because bits and bytes already signal binary to a lot of computer folk, and there isn't really any confusion about 1024 vs 1000 until people start trying to apply the exact SI standard to non-SI units.

Personally, I use KiB and friends whenever I write the unit symbol, but I refuse to pronounce them the way the IEC wants me to. Instead, I say "kilos", "megs", and "gigs" and let context fill in the rest: bits vs bytes, 1000 vs 1024.
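For concreteness, here's a quick Python sketch (using a hypothetical "500 GB" drive) of how big the gap between the two conventions gets:

```python
# GB vs GiB: the same byte count, reported under the decimal (10^9)
# and binary (2^30) conventions. Example drive size is hypothetical.

GB = 10**9    # decimal gigabyte (SI prefix)
GiB = 2**30   # binary gibibyte (IEC prefix)

marketed_bytes = 500 * GB  # what the box says

print(f"Marketed:        {marketed_bytes / GB:.1f} GB")     # 500.0 GB
print(f"Windows reports: {marketed_bytes / GiB:.1f} 'GB'")  # ~465.7 GiB, labeled GB
```

Same number of bytes either way; the roughly 7% difference at the giga scale is just the two prefixes drifting apart (and it grows to about 10% at tera).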