Unlike today’s AI systems, which rely on carefully labeled datasets and explicit training, the new approach would use logic and inference. While today’s tools only “know” what they’re explicitly told by developers and datasets, the new technology would be able to correctly interpret situations it has never seen before. The research could reshape the way AI language assistants like Alexa and Siri learn to speak.
The initiative is funded by DARPA, the US Defense Advanced Research Projects Agency. The purpose is to build AI language systems that learn more like people and less like machines. Instead of crunching gargantuan datasets to learn the ins and outs of language, the agency wants the tech to teach itself by observing the world like human babies do.
The program – Grounded Artificial Intelligence Language Acquisition, or GAILA – aims to build AI tools that understand the meaning of what they’re saying instead of stringing together words based on statistics, according to defenseone.com.
The program is part of DARPA’s AI Exploration initiative, which provides rapid bursts of funding for a slew of high-risk, high-reward AI research projects.
First, a theoretical model will be built for training AI systems to speak by associating audible sounds with visual cues such as videos, images, and live demonstrations. Ultimately, teams will build a working prototype that learns and understands English text and speech from scratch.
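To make the idea of grounded, observation-driven learning concrete, here is a minimal toy sketch of cross-situational word learning, one classic model of how infants are thought to link words to things they see. It is purely illustrative and is not GAILA's actual method: the scenes, objects, and the co-occurrence-counting strategy are all assumptions for the example. A word's likely meaning emerges from statistics over many scenes, with no labeled word-to-object mapping ever provided.

```python
from collections import defaultdict

# Hypothetical training data: each "scene" pairs an utterance the
# learner hears with the set of objects visible at that moment.
scenes = [
    ("the red ball", {"ball", "floor"}),
    ("a blue ball", {"ball", "table"}),
    ("the red cup", {"cup", "table"}),
    ("a blue cup", {"cup", "floor"}),
]

# Count how often each word is heard while each object is in view.
counts = defaultdict(lambda: defaultdict(int))
for utterance, objects in scenes:
    for word in utterance.split():
        for obj in objects:
            counts[word][obj] += 1

def best_referent(word):
    """Guess a word's referent: the object seen most often when it is heard."""
    return max(counts[word], key=counts[word].get)

# "ball" co-occurs with the ball object in both of its scenes,
# so it wins over incidental objects like the floor or table.
print(best_referent("ball"))  # ball
print(best_referent("cup"))   # cup
```

Even this toy version shows the contrast the program is after: meaning is inferred from situational overlap rather than from a dataset of explicit word labels.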