The field of artificial intelligence (AI) is experiencing an astronomical rise in costs, one so steep it might pass for a ChatGPT hallucination. OpenAI, a leading AI research organization, has revealed that training the model behind ChatGPT cost over $100 million. A major contributor to this cost explosion is the demand for graphics processing units (GPUs), the chips essential for large-scale AI training. As the race to compete in AI intensifies, data centers are consuming alarmingly large amounts of energy, further adding to the financial burden of AI development. In response to these challenges, a number of startups are emerging with audacious plans to revolutionize computer chip design and make AI advancement affordable.
Nvidia’s GPUs have long been the go-to hardware for AI development. However, a group of startups argues that the time has come for a radical reboot of computer chip design. One such startup is Normal Computing, founded by veterans of Google Brain and Alphabet’s moonshot lab X. The company has built an early prototype called the stochastic processing unit (SPU), which harnesses the thermodynamic properties of electrical oscillators. Unlike conventional silicon chips, which represent information with binary bits (0s and 1s), the SPU performs calculations using the random fluctuations occurring within its circuits. This approach generates random samples that are useful for computation, including solving the linear algebra problems common in science, engineering, and machine learning. Faris Sbahi, CEO of Normal Computing, touts the efficiency and efficacy of the hardware for statistical calculations, which he says could pave the way for AI algorithms that handle uncertainty better and mitigate the “hallucinations” of large language models.
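To see why cheap random samples are useful for linear algebra, consider a standard randomized technique on conventional hardware: Hutchinson's stochastic trace estimator, which estimates the trace of a matrix using only products with random probe vectors. This is a generic illustration of sample-driven computation, not Normal Computing's actual algorithm; all names below are the sketch's own.

```python
import numpy as np

def hutchinson_trace(A: np.ndarray, num_samples: int = 1000, seed: int = 0) -> float:
    """Estimate trace(A) as the average of z^T A z over random +/-1 probe vectors z.

    E[z^T A z] = trace(A) when the entries of z are independent Rademacher
    variables, so averaging many samples converges on the true trace
    without ever reading A's diagonal directly.
    """
    rng = np.random.default_rng(seed)
    n = A.shape[0]
    total = 0.0
    for _ in range(num_samples):
        z = rng.choice([-1.0, 1.0], size=n)  # random probe vector
        total += z @ A @ z                   # one cheap quadratic form per sample
    return total / num_samples

# Example: a random positive semi-definite matrix with a sizable trace.
rng = np.random.default_rng(42)
M = rng.standard_normal((50, 50))
A = M @ M.T

estimate = hutchinson_trace(A)
exact = np.trace(A)
print(f"estimate: {estimate:.1f}, exact: {exact:.1f}")
```

The appeal of this family of methods is that the expensive part is just feeding random vectors through the matrix, which is exactly the kind of workload a chip that produces physical randomness natively could, in principle, accelerate.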
Extropic: The Pioneers of Thermodynamic Computing for AI
Guillaume Verdon, the founder and CEO of Extropic, another startup, aims to take thermodynamic computing even further. Verdon, along with a team of ex-quantum researchers from Alphabet, is working on tightly integrating neural computing into an analog thermodynamic chip. Drawing from their experience in quantum computing software and hardware, they aspire to bring the full-stack thermodynamic paradigm to AI. While Extropic remains in stealth mode, their ambition to leverage thermodynamics for AI computing hints at a larger movement toward reimagining the foundations of computing.
The notion of reevaluating computing practices is gaining traction as the industry grapples with the limits of Moore’s law. This long-standing observation holds that the number of transistors that can be packed onto a chip doubles roughly every two years, steadily increasing processing power. However, even if Moore’s law were not slowing down, the ever-larger models released by OpenAI and other organizations would still outpace gains in chip capability. Peter McMahon, a professor at Cornell University who specializes in novel computing approaches, emphasizes the magnitude of the challenge. To sustain the progress of AI, exploring alternative computing methods becomes imperative.
As the costs of AI development skyrocket, and energy consumption raises environmental concerns, the industry is ripe for a computing revolution. Startups like Normal Computing and Extropic are at the forefront of this movement, challenging traditional chip designs and exploring the potential of thermodynamics for AI computations. By thinking outside the box, these innovators envision more efficient and cost-effective solutions that can meet the growing demands of AI algorithms while addressing inherent limitations. Whether through the refinement of existing technologies or the exploration of entirely new paradigms, a computing revolution promises to keep the AI hype train on track and drive the field towards its full potential.