
Tesla's Hidden Compute Power | NextBigFuture.com

I did a video with Herbert Ong of Brighter with Herbert.

I explained why the chart from last year showing Tesla going to 100 Exaflops of compute in October 2024 is out of date.

Elon Musk said that both XAI and Tesla had over 30,000 H100 chip equivalents. Each Nvidia H100 chip delivers about 4 petaflops of compute.

If Tesla has over 30,000 Nvidia H100s, then Tesla already has about 120 Exaflops of compute.
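
As a sanity check on that arithmetic, here is a minimal sketch in Python converting a chip count into exaflops. The roughly 4 petaflops per H100 is the figure implied by the article's own numbers (30,000 chips for 120 Exaflops), not an official spec.

```python
# Rough sketch: convert a GPU count into total exaflops.
# The ~4 petaflops-per-H100 value is an assumption implied by the
# article's arithmetic (30,000 chips -> 120 exaflops).

PETAFLOPS_PER_H100 = 4  # assumed per-chip throughput

def fleet_exaflops(num_chips: int, petaflops_per_chip: float) -> float:
    """Total compute in exaflops (1 exaflop = 1,000 petaflops)."""
    return num_chips * petaflops_per_chip / 1_000

print(fleet_exaflops(30_000, PETAFLOPS_PER_H100))  # -> 120.0 exaflops
```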

There was also word that the wait time for Nvidia H100s is down to six weeks, so supply is no longer the bottleneck. Tesla has spent, or will soon spend, about $3 billion to get roughly 100,000 H100s, which works out to about 400 Exaflops of compute. XAI is also going to buy a total of 100,000 Nvidia H100s to get the 400 Exaflops needed to train a Grok 3 model in the class of OpenAI's GPT-5.
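
The $3 billion figure implies a unit price of roughly $30,000 per H100. A quick hedged calculation, using only the article's two numbers rather than any quoted Nvidia price:

```python
# Implied per-chip price if $3 billion buys about 100,000 H100s.
# Both inputs come from the article; the result is just the quotient,
# not an official list price.

total_spend_usd = 3_000_000_000
num_h100s = 100_000

price_per_chip = total_spend_usd / num_h100s
print(f"Implied price per H100: ${price_per_chip:,.0f}")  # ~$30,000
```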

Before the end of the year, Tesla will likely buy and install at least 20,000 B100s. At 20 petaflops per chip, that is another 400 Exaflops, which combined with the 400 Exaflops from 100,000 H100s gives 800 Exaflops of compute.
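
Putting the two purchases together, here is a short sketch of the combined total, again assuming about 4 petaflops per H100 and the 20 petaflops per B100 cited above:

```python
# Combined compute from an assumed fleet of 100,000 H100s and 20,000 B100s,
# using the per-chip figures used in this article (~4 PF per H100, 20 PF per B100).

fleet = {
    "H100": (100_000, 4),   # (chip count, assumed petaflops per chip)
    "B100": (20_000, 20),
}

total_exaflops = sum(count * pf for count, pf in fleet.values()) / 1_000
print(f"Combined fleet: {total_exaflops:.0f} exaflops")  # -> 800 exaflops
```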

Tesla and the other AI companies also need a lot of fast cache memory. The training system needs to treat the thousands of compute chips and the exabytes of cache memory as one continuous memory space.

Here I have tables for how we can expect the compute, memory, data needs and energy to scale for each of the major AI training centers.

The data needs are huge, and the primary source of additional data is real-world video like that gathered by the roughly 6 million Tesla cars on the road (about $250 billion worth of vehicles), each with 8 cameras and a teraflops-class driving inference chip.

Elon said that with GPT-5 and Grok 3, the models have run out of regular text, image and video data, which amounts to about 40 trillion tokens. Getting more data would involve real-world video, real-world audio and synthetic data. Synthetic data is AI extrapolating the data it already has into statistically similar data.
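
For a sense of scale, here is a rough estimate of what 40 trillion tokens of text represents on disk. The roughly 4 bytes per token is my own ballpark assumption for English text, not a figure from the article:

```python
# Back-of-the-envelope size of a 40-trillion-token text corpus.
# The ~4 bytes-per-token ratio is an assumed ballpark, not a figure
# from the article.

tokens = 40_000_000_000_000
bytes_per_token = 4  # assumption

corpus_bytes = tokens * bytes_per_token
print(f"~{corpus_bytes / 1e12:.0f} TB of raw text")  # ~160 TB
```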

The scaling will be so large that, if the value for the world lies in making better and better SuperAI, civilization could organize itself around making more and better chips, with faster and larger high-speed memory, to build better AI products like self-driving cars and humanoid bots. Those self-driving cars and humanoid bots would have cameras to record and learn from what happens in the world, and to send that information back to further improve the AI.
