Moore's Law

Written By:
Paul Tracy
Updated August 12, 2020

What is Moore's Law?

Moore's law describes the computing hardware trend in which the number of transistors on an integrated circuit doubles approximately every two years.

How Does Moore's Law Work?

In 1965, Gordon E. Moore, the co-founder of Intel, published a paper predicting that the number of components on an integrated circuit could be expanded exponentially at a reasonable cost, roughly doubling every two years.  At the time, the integrated circuit, a key component in the central processing unit of computers, had been around for only seven years.  Indeed, the trend has continued for over fifty years with little sign of abating.
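The doubling trend can be written as N(t) = N0 x 2^(t/2), where t is the number of years elapsed. The sketch below illustrates the arithmetic; the starting figure (2,300 transistors, the count of Intel's 4004 chip from 1971) is used only as a convenient example and does not come from this article:

```python
def transistors(initial_count: float, years: float, doubling_period: float = 2.0) -> float:
    """Project a transistor count forward, assuming exponential
    doubling every `doubling_period` years (Moore's law)."""
    return initial_count * 2 ** (years / doubling_period)

# Illustrative example: starting from 2,300 transistors (Intel 4004, 1971),
# 40 years of doubling every two years gives 20 doublings.
projected = transistors(2_300, 40)
print(f"{projected:,.0f}")  # 2,300 * 2**20 = 2,411,724,800
```

Twenty doublings multiply the starting count by about a million, which is why even a modest-sounding two-year doubling period produces billions of transistors within a few decades.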

For the consumer, Moore's law means that a $1,500 computer bought today may be worth half that amount next year and nearly obsolete within two years.

While Moore's law is really just an observation of a trend, it has also become a goal for the electronics industry.  Innovations in electronic design and manufacturing, including the placement of whole "systems" on a chip, the miniaturization of electronic devices, and the seamless integration of electronics into the social fabric of daily life, are outcomes of these industry goals.

Why Does Moore's Law Matter?

Moore's law has been applied (though not by Moore himself) across the entire electronics sector, describing price-performance trends for processing speed, memory, storage, digital networks, and image resolution with the same exponential growth measure while controlling for cost.

Most agree that this trend of continued price-performance improvement cannot continue at an exponential rate forever.  While prices may remain constant, performance cannot exceed the physical limitations of processors.