
16. The Sweetest Lesson: Brains are 50 Times Bigger

  • Book Excerpt from "The Sweetest Lesson: Your Brain vs AI"
  • by David Spuler, Ph.D.

Brains are 50 Times Bigger

Your brain is 50 times bigger than the best AI engines today. Hence, to get to an intelligent AI model, we might need 50 times more compute power. Don’t believe me? Do you think ChatGPT is smarter than you? Let’s look at the facts about brains and models.

The biggest AI model has about two trillion weights. That’s GPT-4 from OpenAI, and we don’t even officially know its size; we only know from “leaks” that it was reportedly 1.76 trillion total weights (split into eight parts).

Is that the biggest model?

Indeed, GPT-4 was released in March 2023, so it seems likely that the state of the art is now bigger. The weight count of OpenAI’s “Strawberry” reasoning model released in September 2024 has not even leaked, but we can reasonably assume it was more. And there are a handful of other companies talking about making models with “trillions” of weights. Most of the details are not disclosed in research papers or press releases, but you can see the word “trillion” in patents and job ads.

It seems very likely that the latest models are above two trillion weights, but probably not by a lot, because the computation needed to run massive models is immense. You don’t just train a two trillion weight model once and you’re done. Rather, those two trillion weights need to run as “inference” for every user’s query after you release your model to the public.

How does your brain compare?

Firstly, your brain has around 86 billion neurons. There are various research papers supporting this number, with some estimates ranging up to around 100 billion neurons. Even so, 86 billion neurons is substantially less than 2 trillion, by a factor of about 23. So, an AI model is 23 times bigger than your brain?

Not so fast!

We’re making the wrong comparison here. It’s measuring silicon apples against carbon oranges.

Weights are not neurons. No, the neuron-equivalent structure is the “model dimension” of all those neural network computations in the AI engine. A neural network is based on a matrix, like in High School math class. The equivalent of the neuron count in AI is the length of the sides of those matrices, rather than the count of all the numbers inside the matrix. The weights in an AI model are those numbers inside the matrix, each telling the engine how much “weight” to give to the connection between two artificial neurons, where the neurons themselves are represented as a vector of values.

Anyway, the point is that the weights are more like “connections” between a pair of neurons. It’s no surprise that the human brain has these types of connections, because artificial neural networks copied that design from the brain, with weights made up as the artificial version of those connections. In the brain, we call them “synapses,” and each one connects two neurons together, typically where an axon terminal meets a “dendrite” physical structure.
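
To make the weights-versus-neurons distinction concrete, here’s a minimal sketch in Python (with NumPy): a single fully-connected layer whose model dimension is just 4 “neurons” holds 4 × 4 = 16 “weights,” one per neuron-to-neuron connection.

    import numpy as np

    # A toy fully-connected layer: the "model dimension" (neuron count)
    # is the side length of the weight matrix, not the weight count.
    d_model = 4                             # 4 artificial "neurons"
    W = np.random.randn(d_model, d_model)   # 4 x 4 = 16 "weights" (connections)

    x = np.random.randn(d_model)   # activations: one number per neuron
    y = W @ x                      # each weight scales one neuron-to-neuron link

    print("Neurons (model dimension):", d_model)   # 4
    print("Weights (connections):", W.size)        # 16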

How many synapses?

Well, the official answer is 100 trillion synapses in the human brain. This sounds like a made-up statistic, even when a chatbot tells you, so we’ll stress-test its assumptions below. But first, let’s summarize the situation:

  • AI models — 2 trillion weights
  • Human brain — 100 trillion synapses

Like I said earlier, your brain is 50 times bigger.

Really That Big?

It’s fair to question that 100 trillion synapse statistic. Such a big, round number just sounds like made-up science. In fact, the overall method to get to that number is simplistic:

    100 billion neurons × 1,000 synapses per neuron = 100 trillion synapses.

There are really two numbers to investigate here. The number of neurons is more accurately believed to be around 86 billion, so maybe that drops our estimate down by about 14% overall. On the other hand, Pakkenberg et al. (2003) estimates 0.15 quadrillion total synapses, which is 50% higher than the overall 100 trillion total. Another analysis, by AI Impacts (2015), based on Chudler’s collected facts from neurology textbooks (undated), gives an even higher estimate of 1.8-3.2 × 10^14, which is 180-320 trillion synapses.
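
The arithmetic behind these estimates is simple enough to re-run. Here’s a short Python sketch that recomputes each figure from the cited inputs; the numbers are the published estimates quoted above, not measured counts.

    # Back-of-envelope synapse totals: neurons x synapses-per-neuron,
    # plus the directly cited totals from the literature.
    estimates = {
        "Classic round numbers (100B x 1,000)": 100e9 * 1_000,
        "86B neurons x 1,000 synapses": 86e9 * 1_000,
        "Pakkenberg et al. (2003)": 0.15e15,
        "AI Impacts (2015), low": 1.8e14,
        "AI Impacts (2015), high": 3.2e14,
    }

    for name, total in estimates.items():
        print(f"{name}: {total / 1e12:.0f} trillion synapses")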

Unfortunately, we can’t be sure, because no one has taken the time to sit down and count all those synapses. Instead, there are various estimates based on the weight and volume of different parts of the brain, and the apparent density of synapse structures within the brains of humans and some animals. As is obvious from the range of estimates, it’s not an exact science, but it still seems to be based on reasonable assumptions.

Finally, an obvious but important point underpins this analysis: neurons matter. The intelligence of humans is closely correlated with the number of neurons and the physical size of the brain. For example, Azevedo et al. (2009) and Herculano-Houzel (2009) both show that the human brain is a linearly scaled-up version of the primate brain.

Maybe Less

Yes, the analysis may be incorrect, and the analogy between weights and synapses is not perfect. Perhaps a lot of the 100 trillion connections are doing unimportant things that an AI engine doesn’t need. The vast majority of your brain is not doing high-level reasoning.

Rather, your brain has to do a lot of other things related to its attached carbon body. There are plenty of neurons in the brainstem and other similar regions. The brainstem is around 2.6% of the human brain by weight, which gives a rough estimate of 2.2 billion neurons out of the 86 billion total. The whole autonomic nervous system has to control our breathing, heartbeat, and dozens of other systems.
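
For what it’s worth, here’s that back-of-envelope calculation in Python. The key assumption is mine, not established neurology: it spreads neurons uniformly by brain weight, which is known to be inaccurate (the small cerebellum actually holds most of the brain’s neurons), so treat the result as an order-of-magnitude figure only.

    # Rough brainstem neuron count, estimated by weight fraction.
    # Caveat: assumes neurons are distributed uniformly by weight,
    # which is not really true, so this is an order-of-magnitude sketch.
    total_neurons = 86e9                # whole-brain neuron estimate
    brainstem_weight_fraction = 0.026   # brainstem is ~2.6% of brain weight

    brainstem_neurons = total_neurons * brainstem_weight_fraction
    print(f"Brainstem: ~{brainstem_neurons / 1e9:.1f} billion neurons")   # ~2.2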

An AI won’t need that.

There’s also a lot of input signal processing that can perhaps be avoided in a computerized LLM brain. For example, a lot of the brain’s power is consumed by processing video inputs, and AI engines may not need to do that.

There are non-LLM ways to process video inputs, so maybe we can hook those up to an LLM. Hence, the LLM itself wouldn’t need to do all of that work to process the basic video features like colors and intensities. However, the LLM still has to do the higher-level processing, such as detecting shapes and making sense of them.

Maybe More

On the other lobe, it could be that 100 trillion synapses is an underestimate. Indeed, a number of research papers give much larger numbers. Pakkenberg et al. (2003) gives an estimate of “0.15 quadrillion,” which is the same as 150 trillion synapses. And AI Impacts’ 2015 article cites an alternative statistic of 7,000 synapses per neuron, which, when combined with 86 billion neurons, puts us at about 600 trillion total synapses.

Those are some mighty big numbers about what’s between your ears. To put them into context, let’s divide each estimate by the two trillion weights of the biggest AI models:

  • 100 trillion — 50 times
  • 150 trillion — 75 times
  • 180 trillion — 90 times
  • 320 trillion — 160 times
  • 600 trillion — 300 times

In comparison to the 2 trillion weights in state-of-the-art AI models, all these numbers give ratios of at least 50 times. Indeed, 150 trillion synapses gives a ratio of 75 times, and the upper bound of 600 trillion synapses makes the human brain over 300 times larger.
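
Here’s the same division as a quick Python check, assuming the two-trillion-weight model size discussed earlier.

    # Ratio of estimated human synapse counts to a 2-trillion-weight model.
    AI_MODEL_WEIGHTS = 2e12   # assumed state-of-the-art model size (see text)

    for synapses in [100e12, 150e12, 180e12, 320e12, 600e12]:
        ratio = synapses / AI_MODEL_WEIGHTS
        print(f"{synapses / 1e12:.0f} trillion synapses -> {ratio:.0f}x bigger")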

Which one will it be? Your guess is as good as mine. Might need some more GPU chips.

Smart Coding

Another variable here is that the brainy boffins in the AI labs have found numerous advanced algorithms to make things go faster. For example, the GPT-4 model reportedly has about 2 trillion total weights, but they’re actually split up into eight separate areas, called “experts,” in what’s known as a Mixture-of-Experts (MoE) architecture. Each expert in GPT-4 was about 220 billion parameters. Similarly, the DeepSeek model that matched and exceeded this performance was around 670 billion total parameters, and had even smaller experts.
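
Here’s that parameter accounting as a small Python sketch, based on the leaked GPT-4 figures mentioned above (eight experts, 1.76 trillion total weights); none of these numbers is officially confirmed.

    # Mixture-of-Experts accounting: total weights vs weights per expert.
    # Based on the leaked GPT-4 figures in the text; not officially confirmed.
    total_weights = 1.76e12   # rumored GPT-4 total weight count
    num_experts = 8           # rumored number of experts

    per_expert = total_weights / num_experts
    print(f"Weights per expert: {per_expert / 1e9:.0f} billion")   # ~220 billion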

I’m not sure which way this cuts the analysis. If the state-of-the-art models effectively run about 220 billion weights at a time rather than two trillion, does this mean that AI models don’t need to be as big as the human brain? Maybe an AI model only needs 10 trillion total weights. Alternatively, does this nuance mean that AI models have ten times further to travel before they match the full intelligence of a brain?

The Bitter Lesson Again

Whatever the answer to this analysis, it seems clear that more compute is needed. It sounds like another case of the bitter lesson, doesn’t it? We just need to brute-force another 50-fold increase in compute power, and then we match the 100 trillion synapses in the human brain.

This level of extra computation power seems beyond our reach at the moment. However, it’s not unimaginable when you consider the rate of advances in GPU chips and the newer, more efficient algorithms being discovered to run AI computations on those chips.

So much for all those reasoning algorithms!

Thus, here we are on the verge of learning the bitter lesson all over again. Instead of making advances by using intelligent and insightful new methods of thinking, we’re just going to throw more electricity and C++ programmers at the problem.

The Sweetest Lesson

Is it the bitter lesson? Sure, the idea is to brute-force a lot more GPUs into 100-trillion-weight models. There’s going to be plenty of computation power needed over the next few years as we advance the state-of-the-art models in both training and inference.

But the bitter lesson actually has two factors, of which brute-force hardware is only one part. Let’s see if they both fit:

    1. Brute-force computation, and

    2. Algorithms that differ from human thinking.

Can you see the mismatch? The bitter lesson requires not only brute-force computation, but also the abandonment of human-based reasoning in favor of simpler number crunching. At first, this also seems like a match, because those tricky reasoning algorithms will probably be gone. It’ll just be GPUs and matrices and lots of number crunching.

But, then, inspiration.

Maybe we’re not going to use a fancy reasoning algorithm on top of all those LLMs. The path to intelligence may not be a better “controller” that manages all of those low-level computations. So, the most advanced AI algorithms won’t be using human-style reasoning. But, you know what, it’s kind of weird, because here’s the thing:

    We already are!

There’s this whole theory of neural networks that forms the basis of modern AI. For decades, it was a kind of backwater in computing research, where the results were not very impressive. Gradually, it grew in importance as results from Machine Learning models, the old-school AI methods, started to be used in real-world activities, like recommending which movies you should watch. Eventually, the advent of hyperscale GPUs led to the GPT series of LLMs, and the rest you already know.

Hence, this is not the bitter lesson. The whole of AI theory in the computing industry is about neural networks, which means this:

    Copying the human brain.

It doesn’t stop at copying the basic architecture of the human brain. The AI industry is effectively copying all the workarounds and extensions that we use to make humans smarter:

  • Tools — e.g., computer usage by LLMs.
  • Data sources — e.g., LLMs now search the internet.
  • Training — it’s like sending LLMs to High School, where they read a textbook.

In summary, AI models are based on not just copying human brains, but also imitating all of the ways that humans learn. And it’s not just the main neural network algorithm that’s copied from a carbon brain. It’s all of the workarounds, too. If we ever get to a truly “intelligent” model, and I do think it’s an “if” not a “when” for the achievement of “Artificial General Intelligence” (AGI), then it’s going to be achieved by copying everything we know about human intelligence.

And that’s the sweetest lesson of all.

References

Research papers on neurology and estimates of the size of the human brain:

  1. Carl Zimmer, January 2011, 100 Trillion Connections: New Efforts Probe and Map the Brain’s Detailed Architecture, Scientific American Magazine, Vol. 304, No. 1, https://www.scientificamerican.com/article/100-trillion-connections/
  2. Kandel, E.R., Schwartz, J.H., and Jessell, T.M., January 2000, Principles of Neural Science (4th Ed.), New York, McGraw-Hill, https://www.amazon.com/Principles-Neural-Science-Eric-Kandel/dp/0838577016 (Early source for the 100 trillion synapse estimate.)
  3. Eric H. Chudler, July 2025 (accessed), Brain Facts and Figures, University of Washington, https://faculty.washington.edu/chudler/facts.html
  4. Wieslaw L. Nowinski, 2024, On human nanoscale synaptome: Morphology modeling and storage estimation, PLOS ONE, https://journals.plos.org/plosone/article?id=10.1371/journal.pone.0310156, PDF: https://journals.plos.org/plosone/article/file?id=10.1371/journal.pone.0310156&type=printable (Attributes the 100 trillion synapse estimate to Kandel et al., 2000.)
  5. AI Impacts, 2015, Scale of the Human Brain: The brain has about 10¹¹ neurons and 1.8-3.2 x 10¹⁴ synapses, https://aiimpacts.org/scale-of-the-human-brain/ (Calculations give 180 to 320 trillion inter-neuron connections.)
  6. Alain Goriely, March 2025, Eighty-six billion and counting: do we know the number of neurons in the human brain?, Brain, Volume 148, Issue 3, Pages 689–691, https://doi.org/10.1093/brain/awae390, https://academic.oup.com/brain/article/148/3/689/7909879
  7. Roberto Lent, May 2025, Yes, the human brain has around 86 billion neurons, Brain, Volume 148, Issue 5, Pages e37–e38, https://doi.org/10.1093/brain/awaf048, https://academic.oup.com/brain/article-abstract/148/5/e37/8003626
  8. Wikipedia, July 2025 (accessed), Cerebellar granule cell, https://en.wikipedia.org/wiki/Cerebellar_granule_cell
  9. Frederico A C Azevedo, Ludmila R B Carvalho, Lea T Grinberg, José Marcelo Farfel, Renata E L Ferretti, Renata E P Leite, Wilson Jacob Filho, Roberto Lent, Suzana Herculano-Houzel, 2009, Equal numbers of neuronal and nonneuronal cells make the human brain an isometrically scaled-up primate brain, J Comp Neurol, 2009 Apr 10;513(5):532-41, PMID: 19226510, DOI: 10.1002/cne.21974, https://pubmed.ncbi.nlm.nih.gov/19226510/ (Quote: debunking the frequently-cited statistic: “100 billion neurons and ten times more glial cells”... their findings: "...the adult male human brain contains on average 86.1 +/- 8.1 billion NeuN-positive cells (“neurons”) and 84.6 +/- 9.8 billion NeuN-negative (“nonneuronal”) cells..." )
  10. Bente Pakkenberg, Dorte Pelvig, Lisbeth Marner, Mads J. Bundgaard, Hans Jørgen G. Gundersen, Jens R. Nyengaard, Lisbeth Regeur, 2003, Aging and the human neocortex, Experimental Gerontology, Volume 38, Issues 1–2, Pages 95-99, ISSN 0531-5565, https://doi.org/10.1016/S0531-5565(02)00151-1, https://www.sciencedirect.com/science/article/abs/pii/S0531556502001511, https://citeseerx.ist.psu.edu/document?repid=rep1&type=pdf&doi=7c7c04323f6ba96c32c47fc3159cc30614c2e822 (Around 36-39 billion glial cells in the human brain, and around 0.15×10^15 (0.15 quadrillion) synapses, which is around 150 trillion.)
  11. Horn-Bochtler, A.K.E., Büttner-Ennever, J.A., 2011, Neuroanatomy of the Brainstem, In: Urban, P., Caplan, L. (eds) Brainstem Disorders. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-04203-4_1
  12. Pakkenberg, B. and Gundersen, H.J.G., 1997, Neocortical Neuron Number in Humans: Effect of Sex and Age, Journal of Comparative Neurology, Vol. 384, Pages 312-320, http://dx.doi.org/10.1002/(SICI)1096-9861(19970728)384:2<312::AID-CNE10>3.0.CO;2-K, https://onlinelibrary.wiley.com/doi/abs/10.1002/(SICI)1096-9861(19970728)384:2%3C312::AID-CNE10%3E3.0.CO;2-K (Human neuron counts around 19-23 billion in the neocortex, a part of the brain.)
  13. Haines, D., Mihailoff, G., 2018, Fundamental Neuroscience for Basic and Clinical Applications (5th ed.), Elsevier, p. 195, ISBN 9780323396325, https://www.amazon.com/dp/0323396321/
  14. Jianfeng Feng, Viktor Jirsa, Wenlian Lu, May 2024, Human brain computing and brain-inspired intelligence, National Science Review, Volume 11, Issue 5, nwae144, https://doi.org/10.1093/nsr/nwae144, https://academic.oup.com/nsr/article/11/5/nwae144/7656427
  15. Catherine Caruso, January 19, 2023, A New Field of Neuroscience Aims to Map Connections in the Brain: Scientists working in connectomics are creating comprehensive maps of how neurons connect to one another, https://hms.harvard.edu/news/new-field-neuroscience-aims-map-connections-brain (Quote: “86 billion neurons form 100 trillion connections to each other...”)
  16. Bradley Voytek, May 20, 2013, Are There Really as Many Neurons in the Human Brain as Stars in the Milky Way?, Nature, https://www.nature.com/scitable/blog/brain-metrics/are_there_really_as_many/
  17. Suzana Herculano-Houzel, 2009, The Human Brain in Numbers: A Linearly Scaled-up Primate Brain, Front Hum Neurosci. 2009 Nov 9;3:31. doi: 10.3389/neuro.09.031.2009, https://pmc.ncbi.nlm.nih.gov/articles/PMC2776484/

 
