A megabyte is a unit of information or computer storage equal to either 10^6 (1,000,000) bytes or 2^20 (1,048,576) bytes, depending on context.

In the past (the 1990s and earlier), KB, MB, and GB were generally taken to mean multiples of 1024 (2^10, 2^20, and so on) when referring to data storage on computers. Confusion arose when disk manufacturers began reporting drives of 1,000,000 bytes as 1 MB. Their reasoning (or excuse) was that 'mega' is a metric prefix, so it was more accurate to use 1 million, 1 billion, and so on when referring to megabytes and gigabytes. This, of course, also allowed them to report a drive as being larger than it actually was. Later, advocates of the metric system pushed for the decimal definitions of kilo (k), mega (M), giga (G), and tera (T) to preserve the consistency of the SI prefixes, further confounding the binary meaning these prefixes had acquired in computing. In computer books written in the mid-1990s and earlier, a kilobyte was always assumed to mean 2^10 = 1024 bytes, and a megabyte 2^20 bytes (1024×1024, or 1,048,576 bytes). Toward the end of that period, MB could sometimes mean 1 million bytes when referring to hard disk storage, but it always meant 2^20 bytes when referring to system memory.

The International Electrotechnical Commission (IEC) later standardized MB to mean 1 million bytes, and introduced the mebibyte (MiB) for the original binary definition of the megabyte.

It is commonly abbreviated as Mbyte or MB (compare Mb, for the megabit). The term megabyte was coined in 1970.


The term "megabyte" is ambiguous because it is commonly used to mean either 1000^2 bytes or 1024^2 bytes. The confusion originated as compromise technical jargon: byte multiples needed to be expressed in powers of 2 but lacked convenient names. As 1024 (2^10) is roughly equal to 1000 (10^3), the roughly corresponding SI multiples began to be used as approximate binary multiples. By the end of 2007, standards and government authorities including the IEC, IEEE, EU, and NIST had addressed this ambiguity by promulgating standards requiring the use of "megabyte" to describe strictly 1000^2 bytes and "mebibyte" to describe 1024^2 bytes. This is reflected in an increasing number of software projects, but most file managers still show file sizes in "megabytes" ("MB") in the binary sense (1024^2 bytes). The term remains ambiguous and can follow any one of the following common definitions:

  1. 1,000,000 bytes (1000^2, 10^6): This is the definition recommended by the International System of Units (SI) and the International Electrotechnical Commission (IEC). This definition is used in networking contexts and most storage media, particularly hard drives, Flash-based storage, and DVDs, and is also consistent with the other uses of the SI prefix in computing, such as CPU clock speeds or measures of performance.
  2. 1,048,576 bytes (1024^2, 2^20): This definition is most commonly used in reference to computer memory, and most software that displays file size or drive capacity, including file managers, also uses this definition. See Consumer confusion (in the "gigabyte" article).
  3. 1,024,000 bytes (1000×1024): This is used to describe the formatted capacity of USB flash drives and the "1.44 MB" 3.5-inch HD floppy disk, which actually has a capacity of 1,440 KiB, that is, 1,440×1,024 bytes, or 1,474,560 bytes.
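The three definitions above differ only in which base is multiplied; a minimal sketch in Python (the variable names are illustrative, not part of any standard) makes the arithmetic explicit:

```python
# The three common interpretations of "1 MB" (names are illustrative).
MB_DECIMAL = 1000 ** 2   # 1,000,000 bytes — SI/IEC definition
MB_BINARY = 1024 ** 2    # 1,048,576 bytes — memory, many file managers
MB_MIXED = 1000 * 1024   # 1,024,000 bytes — e.g. the "1.44 MB" floppy

# The "1.44 MB" floppy actually holds 1,440 KiB:
floppy = 1440 * 1024     # 1,474,560 bytes

print(MB_DECIMAL, MB_BINARY, MB_MIXED)  # 1000000 1048576 1024000
print(floppy)                           # 1474560
print(floppy / MB_MIXED)                # 1.44 — "1.44 MB" uses the mixed unit
```

Note that 1,474,560 bytes is only "1.44 MB" in the mixed unit; it is about 1.47 MB decimal and about 1.41 MiB binary.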

Megabyte examples

Depending on compression methods and file format, a megabyte of data can roughly be:

  • a 1024×1024 pixel bitmap image with 256 colors (8 bpp color depth).
  • 1 minute of 128 kbit/s MP3 compressed music.
  • 6 seconds of uncompressed CD audio.
  • a typical book volume in text format (500 pages × 2000 characters per page).
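The estimates above can be checked with back-of-the-envelope arithmetic; this sketch assumes standard CD audio parameters (44.1 kHz, stereo, 16-bit samples) and ignores file headers and container overhead:

```python
# Rough arithmetic behind the examples above (real files add some overhead).
bitmap = 1024 * 1024 * 1          # 8 bpp -> 1 byte per pixel
mp3_minute = 128_000 // 8 * 60    # 128 kbit/s = 16,000 bytes/s, for 60 s
cd_audio_6s = 44_100 * 2 * 2 * 6  # 44.1 kHz x 2 channels x 2 bytes x 6 s
book = 500 * 2000                 # 500 pages x 2000 characters, 1 byte each

print(bitmap)       # 1048576 bytes — exactly 1 MiB
print(mp3_minute)   # 960000 bytes — just under 1 decimal MB
print(cd_audio_6s)  # 1058400 bytes — roughly 1 MB in either sense
print(book)         # 1000000 bytes — exactly 1 decimal MB
```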
