Data is represented in a computer by means of simple on/off switches; digitally, these two states become 1 and 0. Millions of switches in combination create all the data in a computer system.
Computer data handling is an electrical system, so the on/off switches become electrical switches: the current is either on or off, and patterns of on/off states, written as 1s and 0s, represent all data. Because every value is built from just these two states, the system is called a binary system.
In its smallest form, data is packaged in bits. A bit can hold only one of two values, 0 or 1. Two bits together can represent four different values:
- 00 represents 0
- 01 represents 1
- 10 represents 2
- 11 represents 3
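The mapping above can be sketched in Python, assuming we simply read each bit pattern as a base-2 number:

```python
# Interpret each two-bit pattern as a base-2 integer.
for pattern in ["00", "01", "10", "11"]:
    value = int(pattern, 2)  # parse the string as binary
    print(f"{pattern} represents {value}")
```

Running this prints the same four lines as the list above, confirming that a bit pattern is just a number written in base 2.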
This process of using binary digits to represent a symbol or character is called coding or encoding. As the number of binary digits increases, the number of possible combinations grows exponentially: n bits can represent 2^n different values. For example, 8 bits can represent 256 values, and 10 bits can represent 1024 values. In practice, bits are grouped into sets of eight called bytes.
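A short sketch of both ideas, the exponential growth in combinations and the encoding of a character into one byte (using ASCII, the most common single-byte character code, as an illustrative example):

```python
# n bits give 2**n distinct combinations.
for n in [2, 8, 10]:
    print(f"{n} bits can represent {2 ** n} values")

# Encoding: a character maps to a numeric code, stored as bits.
# In ASCII the letter 'A' has code 65, which is 01000001 as an 8-bit byte.
code = ord("A")
print(format(code, "08b"))  # -> 01000001
```

The `format(code, "08b")` call renders the number in binary, padded to eight digits, which is exactly the byte the computer stores.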
The evolution of computers has meant that the switches have progressed from vacuum tubes to transistors to integrated circuits. As a result, computers have become smaller and smaller, allowing more and more data to be represented in smaller and smaller devices.