Definitions – Moore’s law  


Moore’s law is the observation that the number of transistors on a microchip doubles roughly every two years, while the cost of computing falls; equivalently, it implies that microprocessor capacity grows exponentially. The law is named after Gordon Moore, the co-founder and former CEO of Intel, who made the original prediction in 1965 (while at Fairchild Semiconductor) based on trends he observed in the semiconductor industry; he initially projected a doubling every year and revised the period to roughly two years in 1975. Moore’s law has been a driving force of technological and social change, productivity, and economic growth, as it has enabled the development of faster, smaller, and cheaper computers and devices that have transformed many fields and sectors. However, Moore’s law also faces physical, economic, and environmental limits that may slow or eventually halt its progress.
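As a rough illustration of the exponential growth the law implies, the sketch below projects transistor counts under a fixed doubling period. The starting figure (about 2,300 transistors for the Intel 4004 in 1971) is a commonly cited approximation, and the projection is a simple model rather than a claim about any actual chip.

```python
def projected_transistors(base_count, years, doubling_period=2):
    """Project a transistor count forward, assuming one doubling
    every `doubling_period` years (the classic Moore's-law model)."""
    return base_count * 2 ** (years / doubling_period)

# Intel 4004 (1971): roughly 2,300 transistors.
# Projecting 40 years ahead at a two-year doubling period
# (20 doublings) gives a count on the order of billions,
# broadly in line with early-2010s processors.
count_2011 = projected_transistors(2300, 40)
print(f"{count_2011:,.0f}")  # prints 2,411,724,800
```

Because each fixed interval multiplies the count by the same factor, the curve is exponential: small changes in the assumed doubling period compound into very large differences over decades.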