Summary

  • For decades, AI researchers assumed progress would come through scientific breakthroughs and algorithmic improvements, but many of the recent advances in AI have come through scaling existing systems.
  • Scaling AI has three critical components: increasing the computational power used to train models; increasing the size of models so they can handle more complex tasks and larger datasets; and growing the amount of training data to avoid overfitting and improve performance.
  • All of these require ever-increasing investment, in both hardware and funding.
  • This article analyses trends in the scaling of AI, using Epoch AI’s extensive dataset.
  • It concludes that organisations must invest large sums in AI R&D and the appropriate hardware to keep pace with the giants in the field and stay ahead of the curve.
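
The interaction between the components above can be sketched with a widely used rule of thumb: training compute is roughly 6 FLOP per model parameter per training token, so compute demands grow multiplicatively as models and datasets scale together. The sketch below uses this approximation with hypothetical model sizes; the figures are illustrative assumptions, not data from the article or from Epoch AI.

```python
def training_flop(params: float, tokens: float) -> float:
    """Approximate training compute: ~6 FLOP per parameter per token.

    This is a common rough estimate, not an exact accounting.
    """
    return 6 * params * tokens

# Hypothetical models at increasing scale (illustrative numbers only)
for name, n, d in [
    ("small",  1e9,  20e9),    # 1B parameters, 20B tokens
    ("medium", 70e9, 1.4e12),  # 70B parameters, 1.4T tokens
    ("large",  1e12, 15e12),   # 1T parameters, 15T tokens
]:
    print(f"{name}: ~{training_flop(n, d):.1e} FLOP")
```

Because compute is the product of parameters and tokens, scaling both by 10x raises the compute bill by roughly 100x, which is why each generation of frontier models demands disproportionately larger hardware investment.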

By Veronika Samborska
