Gigabyte - GCSE Computer Science Definition

Reviewed by: James Woodhouse

A gigabyte, often abbreviated as GB, is a unit of digital information used to measure data storage and memory. In the decimal system, which is the standard used in most GCSE specifications, 1 gigabyte = 1,000,000,000 bytes (10⁹). Just as kilometres measure distance, gigabytes measure how much digital space is available or in use on devices such as smartphones, computers, and tablets. When you download apps, save photos, or watch videos, you use a certain number of gigabytes, and knowing roughly how many helps you manage your storage effectively. Understanding gigabytes is important in everyday life and in computer science because it gives you a sense of how much information a device can handle or store.
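
As a rough worked example, the short Python sketch below converts gigabytes to bytes using the decimal definition and estimates how many photos would fit on a phone. The 64 GB capacity and 5 MB photo size are illustrative assumptions, not figures from any specification.

BYTES_PER_GB = 1_000_000_000  # decimal (SI) definition: 1 GB = 10^9 bytes

def gigabytes_to_bytes(gigabytes):
    """Convert a number of gigabytes to bytes using the decimal definition."""
    return gigabytes * BYTES_PER_GB

# Illustrative figures only: a 64 GB phone and 5 MB (5,000,000 byte) photos.
PHOTO_SIZE_BYTES = 5_000_000
storage_bytes = gigabytes_to_bytes(64)

print(storage_bytes)                       # 64000000000 bytes
print(storage_bytes // PHOTO_SIZE_BYTES)   # 12800 photos fit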

Most GCSE exam boards follow SI (decimal) prefixes, not binary ones. Binary equivalents (e.g. 1 gibibyte (GiB) = 2³⁰ bytes) are not required at GCSE, but you may come across them when researching independently.
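
If you do meet binary prefixes, the short Python comparison below shows how close, but not identical, the two definitions are; the numbers follow directly from 10⁹ and 2³⁰.

# Decimal gigabyte (used in most GCSE specifications) vs binary gibibyte.
GIGABYTE = 10 ** 9   # 1 GB  = 1,000,000,000 bytes (SI / decimal prefix)
GIBIBYTE = 2 ** 30   # 1 GiB = 1,073,741,824 bytes (binary prefix, not required at GCSE)

print(GIGABYTE)              # 1000000000
print(GIBIBYTE)              # 1073741824
print(GIBIBYTE - GIGABYTE)   # 73741824 bytes, a difference of roughly 7%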

Reviewer: James Woodhouse

Expertise: Computer Science Lead

James graduated from the University of Sunderland with a degree in ICT and Computing education. He has over 14 years of experience teaching and leading in Computer Science, specialising in GCSE and A-level. James has held various leadership roles, including Head of Computer Science and coordinator positions for Key Stage 3 and Key Stage 4. James has a keen interest in networking security and technologies aimed at preventing security breaches.
