Neural Scaling Law
A Neural Scaling Law is an Empirical Statistical Law that ...
- See: Empirical Statistical Laws, Mixture of Experts, Large Language Model, Graphics Processing Unit, Tensor Processing Unit, Glossary of Artificial Intelligence#Epoch (Machine Learning), Accuracy And Precision, Precision And Recall, F1 Score, Mean Squared Error, Mean Absolute Error, Likelihood Function.
References
2023
- (Wikipedia, 2023) ⇒ https://en.wikipedia.org/wiki/Neural_scaling_law#Introduction Retrieved:2023-7-14.
- In general, a neural model can be characterized by 4 parameters: size of the model, size of the training dataset, cost of training, and performance after training. Each of these four variables can be precisely defined as a real number, and they are empirically found to be related by simple statistical laws, called "scaling laws". These are usually written as [math]\displaystyle{ N, D, C, L }[/math] (number of parameters, dataset size, computing cost, loss).
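The relationship above is often modeled as a power law in one variable while the others are held effectively unbounded. A minimal sketch, assuming an illustrative power-law form [math]\displaystyle{ L(N) = (N_c / N)^{\alpha} }[/math] in parameter count; the constants `N_C` and `ALPHA` below are hypothetical placeholder values, not fitted results:

```python
# Sketch of a power-law neural scaling law L(N) = (N_c / N)**alpha,
# where N is the number of model parameters and L is the loss.
# N_C and ALPHA are illustrative assumptions, not empirically fitted values.
N_C = 1.0e13   # hypothetical critical parameter scale
ALPHA = 0.08   # hypothetical scaling exponent

def predicted_loss(n_params: float) -> float:
    """Predicted loss as a power law in parameter count N."""
    return (N_C / n_params) ** ALPHA

# A hallmark of power-law scaling: doubling N multiplies the predicted
# loss by the constant factor 2**-alpha, regardless of the starting N.
ratio_small = predicted_loss(2e8) / predicted_loss(1e8)
ratio_large = predicted_loss(2e10) / predicted_loss(1e10)
print(ratio_small, ratio_large)  # both equal 2 ** -ALPHA
```

On a log-log plot such a law appears as a straight line with slope -alpha, which is how scaling-law fits are typically visualized.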