We’re Only Scratching the Surface of the AI Revolution
- By Paul Mah
- August 16, 2023
While neuroscience as a field has exploded over the last 20 to 30 years, almost none of its recent breakthroughs are evident in today’s AI systems.
This observation was made by Subutai Ahmad, the CEO of AI firm Numenta, in a commentary published in Fortune.
And while recent advances in generative AI are impressive, he argues, they pale when one considers the sheer amount of power these systems consume. In contrast, the human brain runs on a meager 20 watts, about half the consumption of an average light bulb.
Based on old technology
This is not sustainable when one considers the energy cost of both training an AI model and running inference. There is hope, however: today’s cutting-edge AI is still based on neuroscience from the 1950s and 1960s.
Despite the extraordinary progress the field has made since then, says Ahmad, almost none of these newer findings are reflected in today’s AI systems.
Imagine what AI could do if it incorporated the latest breakthroughs.
Getting there will be no walk in the park, according to Ahmad. It can happen, however, if neuroscientists make the effort to “step back” and explain their concepts in a way that makes sense to AI professionals.
In addition, more researchers in hybrid AI and neuroscience roles will be needed to bridge the gap between the two fields. With such interdisciplinary collaboration, AI researchers can gain a better understanding of neuroscientific findings and translate them into new AI breakthroughs.
He wrote: “Recent breakthroughs prove that applying brain-based principles to large language models can increase efficiency and sustainability by orders of magnitude… In practice, this means mapping neuroscience-based logic… so that it can learn quickly on very little training data, just like our brains.”
Not a pipe dream
In a LinkedIn post referencing Ahmad’s opinion piece, Laurence Liew, the director of AI Singapore, confirmed that his team has been experimenting with some of these neuroscience-based techniques and algorithms.
“Some are very promising, and we can see 5-10x performance improvement in the basic models and expect up to 100x performance improvement in the newer models,” he wrote.
This translates to LLM inference at a tenth of the cost and 100 times the performance on modern-day CPUs, he concluded.
“From the smallpox vaccine to the light bulb, almost all of humanity’s greatest breakthroughs have come from multiple contributions and interdisciplinary collaboration. That must happen with A.I. and neuroscience as well,” wrote Ahmad.
“Whether we like it or not, A.I. is here. We must make it sustainable and efficient by bridging the neuroscience-A.I. gap. Only then can we apply the right interdisciplinary research and commercialization, education, policies, and practices to A.I. so it can be used to improve the human condition.”
Paul Mah is the editor of DSAITrends. A former system administrator, programmer, and IT lecturer, he enjoys writing both code and prose. You can reach him at [email protected].
Image credit: iStockphoto/arthon meekodong