Gigabyte - GCSE Computer Science Definition
Reviewed by: James Woodhouse
A gigabyte, usually abbreviated as GB, is a unit of digital information used to measure data storage and memory. In the decimal system used by most GCSE specifications, 1 gigabyte = 1,000,000,000 bytes (10⁹). Just as kilometres measure distance, gigabytes describe how much digital space is available or in use on devices such as smartphones, computers, and tablets. Downloading apps, saving photos, and watching videos all use up gigabytes, so knowing roughly how much each activity consumes helps you manage your storage and data effectively. Understanding gigabytes matters both in everyday life and in computer science, because it gives you a sense of how much information a device can store or handle.
Most GCSE exam boards follow SI (decimal) prefixes, not binary. Binary equivalents (e.g. the gibibyte, GiB = 2³⁰ bytes) are not required at GCSE, but you may come across them when researching independently.
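The arithmetic above can be checked with a short Python sketch. The values (a 32 GB phone and 4 MB photos) are illustrative, not from any exam board.

```python
# Decimal (SI) vs binary prefixes for a gigabyte.
GB = 10 ** 9    # gigabyte (SI/decimal) - the GCSE standard
GiB = 2 ** 30   # gibibyte (binary) - not required at GCSE

print(GB)        # 1000000000
print(GiB)       # 1073741824
print(GiB - GB)  # the two units differ by 73,741,824 bytes

# Illustrative example: how many 4 MB photos fit on a 32 GB phone?
# 1 GB = 1000 MB in the decimal system.
photo_size_mb = 4
phone_storage_gb = 32
photos = (phone_storage_gb * 1000) // photo_size_mb
print(photos)    # 8000
```

This kind of unit conversion (GB to MB, then dividing by file size) is a common style of GCSE exam question.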