2024 CanAIScalingContinueThrough2030


Subject Headings: AI Scaling Laws, Power Availability for AI, Chip Manufacturing Capacity, Data Scarcity for AI Training, Latency in AI Systems, Economic Investment in AI Scaling, AI Infrastructure Development.

Notes


Cited By

Quotes

Figure 1: Estimates of the scale constraints imposed by the most important bottlenecks to scale. Each estimate is based on historical projections. The dark shaded box corresponds to an interquartile range and light shaded region to an 80% confidence interval.

NOTE:

  • Key Insights from Figure 1:
    • Power constraints (left): Power availability is the most immediate constraint; by 2030 it could cap AI training runs at around 2e29 FLOP, roughly a 10,000-fold increase in training compute relative to GPT-4.
    • Chip production capacity (second from the left): Constraints on chip manufacturing could allow training runs of up to 9e29 FLOP, about 50,000 times larger than GPT-4's.
    • Data scarcity (second from the right): The scarcity of high-quality training data might cap training runs at around 2e30 FLOP, an 80,000-fold increase over GPT-4.
    • Latency wall (right): Latency constraints, chiefly the minimum time per training step in large distributed systems, could still allow training runs of up to 3e31 FLOP, a 1,000,000-fold increase over GPT-4's compute.
  • Each bar represents the median estimate of the largest training run feasible by 2030 under each constraint, with shaded regions showing uncertainty ranges (interquartile range and 80% confidence interval); see the arithmetic sketch below.
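
NOTE: The "times GPT-4" multiples above can be reproduced with simple arithmetic. The minimal Python sketch below assumes GPT-4's training compute was on the order of 2e25 FLOP (an outside estimate, not stated in the figure); the figure's published multiples are rounded, so the ratios only approximately match.

  # Rough arithmetic behind the "x times GPT-4" multiples quoted above.
  # Assumption: GPT-4's training compute was on the order of 2e25 FLOP;
  # the figure's multiples are rounded, so these ratios are approximate.
  GPT4_FLOP = 2e25  # assumed GPT-4 training compute, in FLOP

  constraints = {
      "Power constraints": 2e29,
      "Chip production capacity": 9e29,
      "Data scarcity": 2e30,
      "Latency wall": 3e31,
  }

  for name, flop in constraints.items():
      multiple = flop / GPT4_FLOP
      print(f"{name}: {flop:.0e} FLOP is about {multiple:,.0f}x the assumed GPT-4 compute")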

NOTE: Figure 2: Reported and planned total installed IT capacity of North American data centers, visualizing trends and projections of power growth for AI and non-AI data centers.

NOTE: Figure 3: Assumptions and estimates related to power supply scaling and training run sizes, illustrating the maximum feasible scale of AI training runs.
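
NOTE: The chain of assumptions behind Figure 3 can be illustrated with a back-of-the-envelope conversion from an available power supply to a maximum feasible training run. All parameter values in the Python sketch below are hypothetical placeholders (assumed power budget, per-accelerator draw, throughput, utilization, and training duration), not figures taken from the report.

  # Back-of-the-envelope: convert an assumed power supply into a maximum
  # feasible training run size. All numbers are hypothetical placeholders.
  power_budget_w = 6e9          # assumed power available to the cluster: 6 GW
  watts_per_accelerator = 1500  # assumed draw per accelerator incl. cooling/overhead
  peak_flop_per_s = 1e16        # assumed peak throughput of a ~2030-era accelerator
  utilization = 0.3             # assumed fraction of peak actually achieved
  training_days = 180           # assumed duration of the training run

  num_accelerators = power_budget_w / watts_per_accelerator
  training_seconds = training_days * 24 * 3600
  total_flop = num_accelerators * peak_flop_per_s * utilization * training_seconds

  print(f"Accelerators powered: {num_accelerators:,.0f}")
  print(f"Feasible training run: {total_flop:.1e} FLOP")

Under these particular placeholder values the sketch lands at roughly 2e29 FLOP, the same order of magnitude as the power-constrained estimate in Figure 1.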

References

• Jaime Sevilla, Tamay Besiroglu, Ben Cottier, Josh You, Pablo Villalobos, Ege Erdil, and Edu Roldán. (2024). "Can AI Scaling Continue Through 2030?".