Intel unveils an AI chip modeled after the human brain

September 27, 2017

Intel has announced a neuromorphic artificial intelligence (AI) test chip named ‘Loihi’ that is modeled after the human brain and can learn from its surroundings, locally on the device itself rather than in the cloud.

“As part of an effort within Intel Labs, Intel has developed a first-of-its-kind self-learning neuromorphic chip – codenamed Loihi – that mimics how the brain functions by learning to operate based on various modes of feedback from the environment. This extremely energy-efficient chip, which uses the data to learn and make inferences, gets smarter over time and does not need to be trained in the traditional way. It takes a novel approach to computing via asynchronous spiking,” Michael Mayberry, managing director of Intel Labs, said in a blog post.

Intel has been exploring neuromorphic tech for a while and even designed a chip in 2012. Instead of logic gates, Loihi uses “spiking neurons” as its fundamental computing unit. The chip packs 128 cores of 1,024 artificial neurons each, for roughly 130,000 neurons in total and 130 million possible synaptic connections. That’s a bit more complex than, say, a lobster’s brain, but quite distant from our 80 billion neurons.
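To make the "spiking neuron" idea concrete, here is a minimal sketch of a leaky integrate-and-fire neuron, the textbook model behind spiking hardware like Loihi. The threshold, leak factor, and input values are illustrative assumptions, not Loihi's actual parameters.

```python
def lif_run(inputs, threshold=1.0, leak=0.9):
    """Leaky integrate-and-fire neuron: accumulate input current over
    discrete time steps, leak a fraction of the membrane potential each
    step, and emit a spike (1) when the potential crosses the threshold,
    resetting afterwards. Constants are illustrative, not Loihi's."""
    potential = 0.0
    spikes = []
    for current in inputs:
        potential = potential * leak + current  # leaky integration
        if potential >= threshold:
            spikes.append(1)   # fire
            potential = 0.0    # reset after spiking
        else:
            spikes.append(0)   # stay silent
    return spikes

# A steady sub-threshold input accumulates until the neuron fires
# periodically: information is carried in spike timing, not in numbers
# flowing through logic gates.
print(lif_run([0.4] * 8))  # → [0, 0, 1, 0, 0, 1, 0, 0]
```

Unlike a logic gate, such a neuron is stateful and event-driven, which is what lets neuromorphic chips stay idle (and energy-efficient) when no spikes arrive.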


Each neuromorphic core of the Loihi chip includes a learning engine that can be programmed to adapt network parameters during operation. It supports reinforcement, supervised, unsupervised, and other learning paradigms.
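One classic rule such a programmable learning engine can implement is spike-timing-dependent plasticity (STDP), which adjusts a synapse's weight from the relative timing of spikes. The sketch below is a generic STDP update, offered only as an illustration of on-chip learning; the learning rates and time constant are assumed values, not Intel's.

```python
import math

def stdp_update(weight, dt, a_plus=0.1, a_minus=0.12, tau=20.0):
    """Spike-timing-dependent plasticity: dt is the time (ms) of the
    post-synaptic spike minus the pre-synaptic spike. If the input spike
    helped cause the output spike (dt > 0), strengthen the synapse;
    otherwise weaken it. All constants are illustrative assumptions."""
    if dt > 0:
        weight += a_plus * math.exp(-dt / tau)   # potentiation
    else:
        weight -= a_minus * math.exp(dt / tau)   # depression
    return max(0.0, min(1.0, weight))            # clamp to [0, 1]

w = 0.5
print(stdp_update(w, dt=5.0))    # pre fired 5 ms before post: weight grows
print(stdp_update(w, dt=-5.0))   # post fired first: weight shrinks
```

Because the update depends only on locally observed spike timings, it can run continuously on each core while the network operates, which is what "adapting network parameters during operation" amounts to.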

“The self-learning capabilities prototyped by this test chip have enormous potential to improve automotive and industrial applications as well as personal robotics — any application that would benefit from autonomous operation and continuous learning in an unstructured environment. For example, recognising the movement of a car or bike,” Mayberry added.

Though all this does sound great, neuromorphic chips have yet to outperform conventional CPUs and GPUs in real-world applications.
