How Does Moore's Law Work?


Quick Answer

Moore's law is an empirical law, meaning that it was derived from observation rather than from first principles, and it therefore doesn't seem to be driven by a single underlying force. According to Mooreslaw.org, the law was first articulated by Intel co-founder Gordon Moore in 1965 and, in its original form, stated that the number of components that could be fitted onto a chip at a given cost roughly doubled every year; Moore revised the doubling period to every two years in 1975.
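The doubling rule amounts to simple exponential growth. The sketch below is only an illustration of that arithmetic, not a formula from Moore's article; the starting figure (the Intel 4004's roughly 2,300 transistors) and the two-year doubling period are assumptions drawn from the commonly cited 1975 revision of the law.

```python
def projected_count(initial_count, years, doubling_period_years=2):
    """Project a component count forward under repeated doubling:
    count * 2 ** (years / doubling period)."""
    return initial_count * 2 ** (years / doubling_period_years)

# One doubling period doubles the count.
print(projected_count(1000, 2))  # 2000.0

# From the 4004's ~2,300 transistors in 1971, forty years of
# two-year doublings (20 doublings) lands in the billions,
# broadly in line with real chips of the early 2010s.
print(round(projected_count(2300, 40)))
```

Changing `doubling_period_years` to 1 reproduces Moore's original 1965 formulation, which makes it easy to see how sensitive long-range projections are to the assumed period.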


Full Answer

What makes Moore's law unusual is its ability to predict the rate of change across the computing industry regardless of the underlying technology. In 1965, when Moore wrote the article in which he described the rate of change, most computers were still built from discrete transistors and the integrated circuit was a young technology. The doubling of component counts continued through the advent of the microprocessor in the early 1970s and through the software boom of the 1990s. According to Wikipedia, what began as a rule of thumb intended to predict the hardware advances of the 10 years following Moore's article has also proven valid in other areas where exponential growth occurs. In the beginning, Moore's law worked by anticipating the rate at which individual components could be fitted onto a chip, and it later continued as a projection of overall processor performance.
