[Figure: Performance of AI models on various benchmarks from 1998 to 2024.]
In machine learning, a neural scaling law is an empirical scaling law describing how neural network performance changes as key factors are scaled up or down. These factors typically include the number of parameters, the size of the training dataset,[1][2] and the cost of training.
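Such laws are typically expressed as power laws, e.g. loss as a function of model size taking a form like L(N) = a·N^(−α). As a minimal illustrative sketch (not taken from any particular paper, with made-up constants a and α and synthetic data), the exponent of a pure power law can be recovered by ordinary least squares in log-log space:

```python
import numpy as np

# Synthetic (model size, loss) pairs following L(N) = a * N**(-alpha),
# with illustrative values a = 1e3 and alpha = 0.5 (not real measurements).
N = np.array([1e6, 1e7, 1e8, 1e9, 1e10])
loss = 1e3 * N ** -0.5

# A pure power law is linear in log-log space:
# log L = log a - alpha * log N, so a degree-1 fit on the logs
# recovers the exponent and prefactor.
slope, intercept = np.polyfit(np.log(N), np.log(loss), 1)
print(f"fitted exponent alpha ~ {-slope:.3f}")      # ~ 0.500
print(f"fitted prefactor a ~ {np.exp(intercept):.0f}")  # ~ 1000
```

In practice, published scaling laws often add an irreducible-loss offset and fit jointly over parameters, data, and compute, but the log-log linear fit above captures the basic empirical procedure.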