This definition is used in all contexts of science, engineering, business, and many areas of computing. However, historically, the term has also been used in some fields of computer science and information technology to denote the gibibyte, or 1073741824 (1024³ or 2³⁰) bytes. For instance, the memory standards of JEDEC, a semiconductor trade and engineering society, define memory sizes in this way.

The term gigabyte is commonly used to mean either 1000³ bytes or 1024³ bytes. This usage originated as compromise technical jargon for byte multiples that needed to be expressed by powers of 2 but lacked a convenient name. As 1024 (2¹⁰) is approximately 1000 (10³), roughly corresponding to SI multiples, it was used for binary multiples as well.

Since the early 2000s, disk drive manufacturers have based most consumer hard drive capacities on size classes measured in decimal gigabytes; the exact capacity of a given drive model is close to the class designation. Most manufacturers of hard disk drives and flash-memory disk devices[3][4] define one gigabyte as 1000000000 bytes, which is displayed on the packaging. Some operating systems express hard drive capacity or file size using decimal multipliers (integer powers of 1000), while others, such as Microsoft Windows, report size in gigabytes by dividing the total capacity in bytes by 1073741824 (2³⁰ bytes = 1 gibibyte), while still labeling the result with the symbol GB. This discrepancy causes confusion, as a disk with an advertised capacity of, for example, 400 GB (meaning 400000000000 bytes) might be reported by the operating system as "372 GB", meaning 372 GiB. Other software, such as Mac OS X 10.6[5] and some components of the Linux kernel,[6] measures in decimal units. The JEDEC memory standards use the IEEE 100 nomenclature, which defines a gigabyte as 1073741824 bytes (2³⁰ bytes).[7]
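The arithmetic behind that discrepancy is a single division. The sketch below uses the 400 GB drive from the example above; the truncating division is an assumption about how a typical capacity display rounds, not the exact code of any particular operating system:

```python
# Sketch: why a drive advertised as "400 GB" can show up as "372 GB".
ADVERTISED_BYTES = 400 * 1000**3   # manufacturer's definition: 400,000,000,000 bytes
GIB = 1024**3                      # 1 GiB = 1,073,741,824 bytes

# Dividing by 2**30 while keeping the "GB" label is what produces the
# confusing smaller number (truncation assumed for display).
reported = ADVERTISED_BYTES // GIB
print(f"advertised: 400 GB, reported: {reported} GB")  # reported: 372 GB
```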

The difference between units based on decimal and binary prefixes increases as a semi-logarithmic (linear-log) function: for example, the decimal kilobyte is nearly 98% of the kibibyte, a megabyte is under 96% of a mebibyte, and a gigabyte is just over 93% of a gibibyte. This means that a 300 GB (279 GiB) hard disk might be indicated variously as "300 GB", "279 GB", or "279 GiB", depending on the operating system. As storage sizes increase and larger units are used, these differences become even more pronounced. Some legal challenges have been waged over this confusion, such as a suit against Western Digital.[8][9] Western Digital settled the challenge and added explicit disclaimers to products stating that the usable capacity may differ from the advertised capacity.[8] Seagate was sued on similar grounds and also settled.[8][10]

Because of its physical design, the capacity of modern computer random access memory devices, such as DIMM modules, is always a multiple of a power of 1024. It is thus convenient to use prefixes denoting powers of 1024, known as binary prefixes, in describing them. For example, a memory capacity of 1073741824 bytes is conveniently expressed as 1 GiB rather than as 1.074 GB. The former specification is, however, almost always quoted as 1 GB when applied to random access memory.
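The two figures in that example can be verified directly; a minimal sketch of the same capacity expressed in each unit:

```python
# Sketch: one power-of-1024 memory capacity in binary vs decimal units.
capacity = 1024**3                  # a DIMM-style capacity, in bytes
print(capacity, "bytes")            # 1073741824 bytes
print(capacity / 1024**3, "GiB")    # exactly 1.0 GiB
print(capacity / 1000**3, "GB")     # 1.073741824 GB, i.e. about 1.074 GB
```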

Software allocates memory in varying degrees of granularity as needed to fulfill data structure requirements, and binary multiples are usually not required. Other computer measurements, like storage hardware size, data transfer rates, clock speeds, operations per second, etc., do not depend on an inherent base, and are usually presented in decimal units. For example, the manufacturer of a "300 GB" hard drive is claiming a capacity of 300000000000 bytes, not 300×1024³ bytes (which would be 322122547200).