Such a mammoth chip, according to Cerebras, is needed to meet the growing demands of artificial intelligence (AI). AI algorithms learn to perform a task by first training on a huge amount of data. In particular, deep learning algorithms, which use neural networks that roughly mimic how the brain works, require enormous computing power, with training runs that can take hours or even days. According to a recent analysis from OpenAI, a San Francisco-based, AI-focused company backed by Microsoft, the computing power consumed by AI training increased by a factor of 300 000 between 2012 and 2018, with a doubling time of 3.5 months. That is 25 000 times faster than Moore’s law at its peak [3].
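As a rough sanity check on those figures (a sketch only, taking the 300 000× factor and 3.5-month doubling time quoted above as inputs), one can compute how many doublings the total growth implies and how long they would take at the reported rate:

```python
import math

# Figures quoted from the OpenAI analysis cited in the text
factor = 300_000          # total growth in training compute, 2012-2018
doubling_months = 3.5     # reported doubling time

# Number of doublings implied by the total growth factor
doublings = math.log2(factor)             # about 18.2 doublings

# Time those doublings take at the reported rate
years = doublings * doubling_months / 12  # about 5.3 years

print(f"{doublings:.1f} doublings over roughly {years:.1f} years")
```

The result, roughly 18 doublings spanning a little over five years, is consistent with the 2012–2018 window the analysis covers.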