There’s a new law that affects everybody. It replaces the Moore’s Law we grew up with. Moore’s Law predicted: “The number of transistors on a chip doubles every 2 years.” This meant computers would get twice as fast… or twice as cheap at a given performance level.
Physics eventually got in the way. (Weird things happen when distances shrink to nanometers.) Progress slowed on traditional CPUs… but Nvidia & TSMC still manage to squeeze another >30% FLOPS per dollar out of new GPUs each year.
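That >30% per year compounds fast. A quick sketch (the 30% figure is the one from the paragraph above, taken as a constant annual rate for illustration):

```python
# Compound an assumed 30% annual improvement in FLOPS per dollar.
flops_per_dollar = 1.0  # normalized to today's baseline

for year in range(10):
    flops_per_dollar *= 1.30  # ~30% better each year

# After a decade, a dollar buys roughly 13-14x the compute it does today.
print(round(flops_per_dollar, 2))
```

Even “slow” exponential growth gives you an order of magnitude per decade.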

As the cost of compute continues to fall… We buy MORE. Not just more units, but more in aggregate dollars spent. This is Jevons Paradox: “Improvements in efficiency of resource use tend to increase (not decrease) overall consumption of that resource.”
This is why you pay $1,500 for the latest iPhone when the iPhone 3G sold for just $199. We don’t settle for something merely 10x better. We demand >500 times the performance.
This massive consumption boom brings us under the regime of Wright’s Law: “With every doubling of cumulative production (experience), unit costs fall by a constant percentage.”
The 10,000th unit is cheaper to make than the first 5,000.
By the time you sell a million, avoidable costs are wrung out of the system.
After 10 million or 100 million units, the whole supply chain is reengineered, all the way back to the point of resource extraction.
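Wright’s Law has a simple closed form: the cost of the n-th unit is cost(n) = cost(1) × n^(−b), where b = −log2(1 − learning_rate). The 20% learning rate and $100 first-unit cost below are illustrative assumptions, not figures from any particular industry:

```python
import math

def wright_cost(n, first_unit_cost=100.0, learning_rate=0.20):
    """Unit cost of the n-th unit under Wright's Law.

    Each doubling of cumulative production cuts unit cost
    by `learning_rate` (here an assumed 20%).
    """
    b = -math.log2(1.0 - learning_rate)  # progress exponent
    return first_unit_cost * n ** (-b)

print(wright_cost(1))   # ~100: the first unit
print(wright_cost(2))   # ~80: one doubling, 20% cheaper
print(wright_cost(4))   # ~64: two doublings, 20% cheaper again
```

Note the driver is cumulative production, not time. That’s why the consumption boom matters: more units sold means more doublings, which means steeper cost declines.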
Eventually physics will kick in again: the lower bound on the price of an object is the cost of its constituent elements or molecules.
… At least until AI writes a new law for that!
