A million years in a decade

October 18, 2024

In just twelve years (2012–2024), Artificial Intelligence (AI) has advanced at a pace akin to a million years of natural evolution. Small neural networks have evolved into massive systems that rival—and may soon surpass—the cognitive capacities of Earth's most intelligent species. AI disruption over the coming decade will touch every aspect of modern life, creating enormous risk and opportunity, and navigating AI adoption will be key to success.

Charting AI’s Evolutionary Path

The first chart illustrates the parameter counts of groundbreaking AI models from 2012 to 2024. Parameters1 represent the scale and complexity of a neural network, roughly analogous to synaptic connections in mammalian brains. In 2012, AlexNet had 60 million parameters. As of 2024, models like GPT-4 and Google Gemini exceed a trillion parameters, approaching the synapse counts of cat and dog brains (2–3.5 trillion), yet they can converse in multiple human languages at a college level.

The flat purple line represents an approximate scale of human intelligence, as measured by the roughly 100 trillion synapses within our neocortex.

While mammalian brains evolved gradually over millions of years, AI has achieved similar complexity2 in just a decade, and it continues to improve at an accelerating rate. If the trend of the past twelve years holds, the two curves will cross within the next ten years.
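
As a rough back-of-envelope check, here is a minimal sketch in Python using only the approximate figures cited above. The round 2024 parameter count and the assumption of constant compound growth are simplifications for illustration, not a forecast.

    import math

    # Approximate figures from the chart above.
    p_2012 = 60e6             # AlexNet, 2012: ~60 million parameters
    p_2024 = 1e12             # frontier models, 2024: ~1 trillion parameters
    human_synapses = 100e12   # ~100 trillion neocortical synapses

    growth = (p_2024 / p_2012) ** (1 / 12)                 # compound annual growth factor
    doubling_months = 12 * math.log(2) / math.log(growth)
    years_to_cross = math.log(human_synapses / p_2024) / math.log(growth)

    print(f"doubling time: ~{doubling_months:.1f} months")       # ~10.3 months
    print(f"naive crossing year: ~{2024 + years_to_cross:.0f}")  # ~2030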


A Biological Comparison

Nature likewise reveals a strong correlation between neural scale and intelligence. The second chart presents neocortical neuron counts for various species: a mouse has about 14 million neurons; a dolphin close to 6 billion; humans top the list with 16 billion.

All of these species exhibit unique intelligence honed for survival in their respective environments. However, as brain scale increases across this group, we see increasingly sophisticated behaviors: tool use, coordinated hunting tactics, longer periods of child rearing, complex language, and even signs of self-identity in orcas and sperm whales, the species closest to human brain scale.

Nature shows that biological intelligence scales with neural complexity, lending further support to the notion that greater neural scale will continue to increase the capabilities of artificial intelligence.


Breaking the Thought Barrier

People can't fly, but we built machines that can. Likewise, we can't think faster or deeper than our biology allows, but we're building machines that can.

"First flight by the Wright Brothers", December 17, 1903, by John T. Daniels. Source: Wikimedia Commons.

The introduction of ChatGPT can be thought of as a "Kitty Hawk moment," when the long-held dream of AI finally took flight. Much as the Wright brothers broke the flight barrier in 1903, OpenAI engineers broke the "thought barrier" in November 2022 with the world's first general-purpose AI assistant. Basic human cognition had finally been automated, and rapid improvements have been flowing ever since.

We cannot accurately predict how much or how fast AI will improve, but all indications are that we can expect robust growth over the next several years. AI hardware3 throughput is roughly doubling every two years. Substantial software optimizations and algorithmic innovations have appeared every year since Google researchers published the seminal Transformer paper in 2017. And heavy investment in new AI infrastructure and research is pouring in from tech companies, venture capital, and governments.

Today AI is accelerating the cognitive work people do in all walks of life. This "Augmented Intelligence" arising between people and AI complements our natural reasoning and creative abilities, a trend that will only deepen as new AI tools emerge.

As we stand on the cusp of this unprecedented transformation, breaking the "thought barrier" is not just a technological milestone—it's a redefinition of human potential. The swift ascent of AI from basic cognition to advanced reasoning4 mirrors our own evolutionary journey, but at a lightning pace. This convergence of artificial and human intelligence heralds a future where our potential is no longer bounded by biology but expanded by innovation. Navigating this new landscape requires vision and adaptability. That's why Cogl was founded: to help organizations adapt to and prosper in the AI revolution. Let's push the boundaries of possibility and break through new barriers together.


Footnotes

1 AI model parameter count can be viewed as a very rough analogue of synaptic counts in mammals. Actual biological neurons are vastly more complex (and far less understood) than the basic "nodes" of artificial neural networks, which consist of fully known weights, biases, and activation functions, all operating at the level of high-school math.
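
For illustration, here is a minimal sketch of one such node in Python. The inputs, weights, and bias are made-up numbers, and the sigmoid is just one common choice of activation function.

    import math

    def neuron(inputs, weights, bias):
        # Weighted sum of the inputs plus a bias: multiplication and addition.
        z = sum(x * w for x, w in zip(inputs, weights)) + bias
        # Sigmoid activation squashes the result into the range (0, 1).
        return 1 / (1 + math.exp(-z))

    # Made-up example values; prints ~0.332.
    print(neuron(inputs=[0.5, -1.0, 2.0], weights=[0.8, 0.2, -0.5], bias=0.1))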

2 Combinations of breakthrough machine learning algorithms (primarily Reinforcement Learning, 1979; Backpropagation, 1986; Deep Learning with AlexNet, 2012; Sequence-to-Sequence Learning, 2014; Transformers, 2017; and Chain of Thought, 2022) approximate intelligence by making use of ever larger networks of artificial neurons. Not unlike real neurons, exactly "how" these massive neural nets learn is still not fully understood by the researchers who build them. They just know that they work, and that they seem to get better as the networks and training datasets grow.

3 The computational power needed to drive modern AI was only made possible by the advent of Graphics Processing Units (GPUs), introduced by NVIDIA in the 2000s. Today's GPUs are vastly more powerful than the modest "graphics cards" used to train the breakthrough AlexNet model in 2012, but the basic principle of massively parallel computation remains the same.
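
As a loose illustration, the sketch below (NumPy running on a CPU, not actual GPU code) shows the kind of workload involved: one matrix-vector product evaluates an entire layer of 4,096 artificial neurons in a single data-parallel operation, which is exactly the pattern GPUs accelerate.

    import numpy as np

    rng = np.random.default_rng(0)
    inputs = rng.standard_normal(4096)            # one input vector
    weights = rng.standard_normal((4096, 4096))   # one layer's weight matrix

    # One matrix-vector product computes all 4,096 neuron outputs at once;
    # ReLU (clip negatives to zero) is then applied element-wise.
    layer_output = np.maximum(weights @ inputs, 0.0)
    print(layer_output.shape)                     # (4096,)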

4 The above is a gross simplification of how today's state-of-the-art AI models (like ChatGPT) work; production AI systems involve many additional (often proprietary) steps that make them commercially viable, but massive neural network size, training data size, and computational power remain the main ingredients.