Two Revolutionary AI Chips Can Control Robots Through Thought

Image credit: Pixabay

These two chips might be the key to developing sophisticated brain-computer interfaces.

Scientists from the University of Electronic Science and Technology of China claim to have developed the world’s most energy-efficient artificial intelligence (AI) microchips, which are small enough to fit inside smart devices and could enable innovative offline functions such as voice control and even mind control.

AI chips designed for heavy computational tasks typically require significant power, which limits their use in real-world scenarios. Professor Zhou Jun and his team managed to significantly reduce power consumption through algorithmic and architectural optimization.

According to Interesting Engineering, the first chip is designed to be embedded in smart devices and enable offline voice control. It is said to excel at keyword spotting and speaker verification by recognizing the voice signals of a target speaker. Its primary advantage is its ability to “bypass” the constraints of standard voice-recognition systems; for example, it can accurately recognize the target speaker’s speech even in noisy environments. Possible real-life applications include low-power voice-control scenarios such as smart homes, wearable devices, and smart toys.
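The article does not describe the chip's actual algorithm, but the core idea of speaker verification can be illustrated with a toy sketch: compare a spectral "embedding" of a test recording against one enrolled from the target speaker and accept only close matches. The function names, the feature, and the threshold below are all hypothetical simplifications, not the researchers' design.

```python
import numpy as np

def spectral_embedding(signal, n_bins=16):
    """Toy voice 'embedding': average FFT magnitude over coarse bins,
    normalized to unit length so embeddings can be compared by dot product."""
    spectrum = np.abs(np.fft.rfft(signal))
    bins = np.array_split(spectrum, n_bins)
    emb = np.array([b.mean() for b in bins])
    return emb / (np.linalg.norm(emb) + 1e-9)

def verify_speaker(enrolled_emb, test_signal, threshold=0.95):
    """Accept the test signal only if its embedding is close (cosine
    similarity above threshold) to the enrolled target-speaker embedding."""
    similarity = float(np.dot(enrolled_emb, spectral_embedding(test_signal)))
    return similarity >= threshold

# Synthetic stand-ins for two different voices (pure tones for simplicity).
fs = 8000
t = np.arange(fs) / fs
target = np.sin(2 * np.pi * 220 * t)    # "target speaker"
imposter = np.sin(2 * np.pi * 970 * t)  # "different speaker"

enrolled = spectral_embedding(target)
print(verify_speaker(enrolled, target))    # True: matches enrollment
print(verify_speaker(enrolled, imposter))  # False: rejected
```

A real low-power chip would replace the crude FFT-bin feature with a compact learned model, but the enroll-then-compare structure is the same.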

The second chip is designed to detect seizure signals in people with epilepsy. This technology, designed for wearable devices, uses electroencephalogram (EEG) recognition to detect epileptic seizures and alert the patient to seek medical assistance or treatment. The researchers explain: “Existing designs rely on extensive patient seizure data for training to achieve high accuracy, a process that is time-consuming and costly due to the low occurrence of seizures and the need for hospitalization.”
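As a rough illustration of what wearable seizure detection involves (not the team's actual method), one classic lightweight EEG feature is "line length": the summed sample-to-sample change in the signal, which jumps during high-amplitude, fast seizure activity. The feature choice, function names, and threshold ratio below are assumptions for demonstration only.

```python
import numpy as np

def line_length(eeg):
    """Line length: sum of absolute sample-to-sample differences,
    a classic low-cost EEG feature that rises sharply during seizures."""
    return np.abs(np.diff(eeg)).sum()

def seizure_alert(eeg_window, baseline_ll, ratio=3.0):
    """Flag a window whose line length far exceeds the patient's baseline."""
    return line_length(eeg_window) >= ratio * baseline_ll

# Synthetic 1-second EEG windows at 250 Hz.
fs = 250
t = np.arange(fs) / fs
normal = 10 * np.sin(2 * np.pi * 10 * t)  # calm background rhythm
ictal = 80 * np.sin(2 * np.pi * 20 * t)   # high-amplitude fast activity

baseline = line_length(normal)
print(seizure_alert(normal, baseline))  # False: near baseline
print(seizure_alert(ictal, baseline))   # True: alert triggered
```

Features this cheap are attractive for wearables precisely because they need almost no computation, which connects to the article's point about cutting power through algorithm choices.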

The South China Morning Post reported that during a demonstration of the technology, electroencephalogram signals collected from a wearable brain-computer interface device were transmitted in real time to a test board over Bluetooth. The chip was programmed to recognize imagined motor commands, enabling the user to direct the robot to move forward, stop, or move in reverse.
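The report doesn't detail how imagined motor commands are decoded, but a minimal sketch of the general idea is to compare power in a few EEG frequency bands and map the dominant band to a command. The band boundaries, command mapping, and function names below are purely hypothetical, chosen only to show the pipeline's shape.

```python
import numpy as np

COMMANDS = ["forward", "stop", "reverse"]
# Hypothetical frequency bands (Hz) assumed to separate the three commands.
BANDS = [(8, 12), (12, 16), (16, 20)]

def band_power(eeg, fs, lo, hi):
    """Total spectral power of the EEG window within [lo, hi) Hz."""
    spectrum = np.abs(np.fft.rfft(eeg)) ** 2
    freqs = np.fft.rfftfreq(len(eeg), d=1 / fs)
    return spectrum[(freqs >= lo) & (freqs < hi)].sum()

def decode_command(eeg, fs):
    """Map the EEG window to the command whose band holds the most power."""
    powers = [band_power(eeg, fs, lo, hi) for lo, hi in BANDS]
    return COMMANDS[int(np.argmax(powers))]

# Synthetic 1-second EEG windows at 250 Hz standing in for imagined movements.
fs = 250
t = np.arange(fs) / fs
forward_like = np.sin(2 * np.pi * 10 * t)  # energy concentrated in 8-12 Hz
reverse_like = np.sin(2 * np.pi * 18 * t)  # energy concentrated in 16-20 Hz

print(decode_command(forward_like, fs))  # forward
print(decode_command(reverse_like, fs))  # reverse
```

In the demonstration described above, the decoded label would then be sent as a drive command to the robot; real motor-imagery decoders use trained classifiers rather than a fixed band rule, but the signal-to-command flow is the same.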