‘Brain-like’ Chip May Be the Future of Greener AI


With the rise of artificial intelligence technology, many experts have raised concerns about the emissions of warehouses full of the computers needed to power these AI systems. IBM's new "brain-like" chip prototype could make artificial intelligence more energy efficient: according to the company, its efficiency comes from components that work in a way similar to connections in the human brain.

Thanos Vasilopoulos, a scientist at IBM's research lab, told BBC News that compared to traditional computers, "the human brain is able to achieve remarkable performance while consuming little power." This superior energy efficiency would mean that larger and more complex workloads could be executed in low-power or battery-constrained environments such as cars, mobile phones, and cameras. "Additionally, cloud providers will be able to use these chips to reduce energy costs and their carbon footprint," he added.

Unlike most digital chips, which store information as 0s and 1s, the new chip uses analog components called memristors (memory resistors), which can store a continuous range of values.

Prof Ferrante Neri of the University of Surrey describes memristors as a form of "nature-inspired computing" that mimics brain function: a memristor can "remember" its electrical history, much like a synapse in a biological system. "Interconnected memristors can form a network resembling a biological brain," he concluded.
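The idea that a device "remembers" its electrical history can be illustrated with a small sketch. The class below is a hypothetical toy model, not IBM's design: its resistance drifts with the voltage applied over time, so its current state encodes an analog value shaped by everything that came before, rather than a single 0 or 1.

```python
class ToyMemristor:
    """Toy state-dependent resistor: resistance drifts with applied voltage over time."""

    def __init__(self, r_min=100.0, r_max=10_000.0):
        self.r_min = r_min   # fully "on" resistance (ohms)
        self.r_max = r_max   # fully "off" resistance (ohms)
        self.state = 0.0     # internal state in [0, 1], starts with no history

    def apply_voltage(self, volts, dt=1.0):
        """Positive pulses nudge the device toward low resistance; negative toward high."""
        self.state = min(1.0, max(0.0, self.state + 0.1 * volts * dt))

    @property
    def resistance(self):
        # An analog value anywhere between r_min and r_max, not just two levels.
        return self.r_max - self.state * (self.r_max - self.r_min)


m = ToyMemristor()
before = m.resistance        # maximum resistance: no history yet
for _ in range(5):
    m.apply_voltage(1.0)     # repeated positive pulses
after = m.resistance         # lower resistance: the pulse history is "remembered"
```

Reading the device's resistance later recovers a trace of the signals it has seen, which is loosely analogous to how a synapse's strength reflects past activity.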

According to BBC News, the new chip is not only energy efficient but also includes digital elements that make it easier to integrate into existing AI systems. Many smartphones, for example, already carry dedicated AI chips for tasks like photo processing.

If they replace the chips in the banks of computers powering large AI systems, IBM's prototype chips could eventually save significant amounts of both the energy and the water used to cool huge server farms.

Nevertheless, James Davenport, Professor of IT at the University of Bath, cautioned that while IBM's findings are "potentially interesting", the new chip is a first step rather than an easy, one-and-done solution.