IBM creates new foundation to program SyNAPSE chips

By Abhigna

SAN JOSE, USA: Scientists from IBM unveiled a software ecosystem designed for programming silicon chips that have an architecture inspired by the function, low power, and compact volume of the brain.

The technology could enable a new generation of intelligent sensor networks that mimic the brain's abilities for perception, action, and cognition.

Dramatically different from traditional software, IBM's new programming model breaks the mold of sequential operation underlying today's von Neumann architectures and computers.

It is instead tailored for a new class of distributed, highly interconnected, asynchronous, parallel, large-scale cognitive computing architectures, said a press release.

"Architectures and programs are closely intertwined and a new architecture necessitates a new programming paradigm," said Dr. Dharmendra S. Modha, principal investigator and senior manager, IBM Research.

"We are working to create a FORTRAN for synaptic computing chips. While complementing today's computers, this will bring forth a fundamentally new technological capability in terms of programming and applying emerging learning systems," added Modha.

To advance and enable this new ecosystem, IBM says its researchers developed the following breakthroughs, which support all aspects of the programming cycle from design through development, debugging, and deployment:

Simulator: A multi-threaded, massively parallel and highly scalable functional software simulator of a cognitive computing architecture comprising a network of neurosynaptic cores.

Neuron Model: A simple, digital, highly parameterized spiking neuron model that forms a fundamental information processing unit of brain-like computation and supports a wide range of deterministic and stochastic neural computations, codes, and behaviors. (A generic sketch of such a neuron appears after this list.)

Programming Model: A high-level description of a "program" that is based on composable, reusable building blocks called "corelets." Each corelet represents a complete blueprint of a network of neurosynaptic cores that specifies a base-level function. The inner workings of a corelet are hidden, so that only its external inputs and outputs are exposed to other programmers, who can concentrate on what the corelet does rather than how it does it. Corelets can be combined to produce new corelets that are larger, more complex, or have added functionality. (A toy sketch of corelet composition also follows this list.)

Library: A cognitive system store containing designs and implementations of consistent, parameterized, large-scale algorithms and applications that link massively parallel, multi-modal, spatio-temporal sensors and actuators together in real-time. In less than a year, the IBM researchers have designed and stored over 150 corelets in the program library.

Laboratory: A novel teaching curriculum that spans the architecture, neuron specification, chip simulator, programming language, application library and prototype design models. It also includes an end-to-end software environment that can be used to create corelets, access the library, experiment with a variety of programs on the simulator, connect the simulator inputs/outputs to sensors/actuators, build systems, and visualize/debug the results.
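The announcement does not publish the neuron model's actual equations or parameters, so the following is only a minimal, generic sketch of a leaky integrate-and-fire spiking neuron in Python. It illustrates the kind of simple, parameterized, digital unit the description refers to; the class name, parameters, and values are all illustrative assumptions, not IBM's specification.

```python
# A minimal, generic leaky integrate-and-fire (LIF) spiking neuron.
# Illustrative sketch only, NOT IBM's actual neuron model; all names
# and parameter values here are assumptions for demonstration.

class LIFNeuron:
    def __init__(self, leak=0.9, threshold=1.0, reset=0.0):
        self.leak = leak            # multiplicative membrane leak per tick
        self.threshold = threshold  # firing threshold
        self.reset = reset          # membrane potential after a spike
        self.v = 0.0                # current membrane potential

    def step(self, weighted_input):
        """Advance one discrete time step; return True if the neuron spikes."""
        self.v = self.v * self.leak + weighted_input
        if self.v >= self.threshold:
            self.v = self.reset
            return True
        return False

# Drive the neuron with a constant input and record its spike train.
neuron = LIFNeuron()
spikes = [neuron.step(0.3) for _ in range(20)]
print(spikes)  # [False, False, False, True, False, False, False, True, ...]
```

Because the leak, threshold, and reset are exposed as parameters, one model class can be configured to produce many distinct deterministic firing behaviors, which is the spirit of "highly parameterized" in the description above.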
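The corelet idea of hidden internals with only inputs and outputs exposed can likewise be illustrated with a small object-oriented sketch. The release includes no code, so the class below is a hypothetical Python analogue of the concept, not IBM's actual Corelet language; every name and number is an assumption.

```python
# Hypothetical sketch of the corelet concept: a black box that exposes
# only input and output connectors, and composes into larger corelets.
# An illustrative analogue, not IBM's actual Corelet language.

class Corelet:
    def __init__(self, name, n_inputs, n_outputs):
        self.name = name
        self.n_inputs = n_inputs    # exposed input connectors
        self.n_outputs = n_outputs  # exposed output connectors
        # The network of neurosynaptic cores inside stays hidden.

    def compose(self, other, name):
        """Wire this corelet's outputs into another corelet's inputs,
        yielding a new, larger corelet."""
        if self.n_outputs != other.n_inputs:
            raise ValueError("output/input widths must match")
        return Corelet(name, self.n_inputs, other.n_outputs)

# Build a larger corelet from two smaller ones, seeing only their I/O.
edge_detector = Corelet("edges", n_inputs=1024, n_outputs=256)
classifier = Corelet("classify", n_inputs=256, n_outputs=10)
pipeline = edge_detector.compose(classifier, "edge_classifier")
print(pipeline.name, pipeline.n_inputs, pipeline.n_outputs)
# -> edge_classifier 1024 10
```

The point of the sketch is the programming discipline: a developer composing corelets reasons only about connector widths and behavior, never about the cores inside, which is what lets corelets be stored, reused, and stacked into a library.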

These innovations are being presented at the International Joint Conference on Neural Networks in Dallas, TX.

Paving the path to SyNAPSE

Modern computing systems were designed decades ago for sequential processing according to a pre-defined program. Although they are fast and precise "number crunchers," computers of traditional design become constrained by power and size, and operate at reduced effectiveness, when applied to real-time processing of the noisy, analog, voluminous Big Data produced by the world around us. In contrast, the brain, which operates comparatively slowly and at low precision, excels at tasks such as recognizing, interpreting, and acting upon patterns, while consuming the same amount of power as a 20-watt light bulb and occupying the volume of a two-liter bottle.

In August 2011, IBM successfully demonstrated a building block of a novel brain-inspired chip architecture based on a scalable, interconnected, configurable network of "neurosynaptic cores." Each core brings memory ("synapses"), processors ("neurons"), and communication ("axons") into close proximity, executing activity in an event-driven fashion. These chips serve as a platform for emulating and extending the brain's ability to respond to biological sensors and to analyze vast amounts of data from many sources at once.
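The release does not specify the core's internals beyond the synapse/neuron/axon analogy, so the sketch below is a hypothetical illustration of what "event-driven" means in such a core: incoming spikes on axon lines are routed through a binary synapse crossbar to neurons, and work happens only when an event arrives. The sizes, threshold, and wiring are invented for demonstration and do not describe the actual SyNAPSE hardware.

```python
# Hypothetical event-driven sketch of a neurosynaptic core: axons carry
# incoming spike events, a binary synapse crossbar routes them to
# neurons, and computation occurs only when an event arrives.
# Illustrative only; not the actual SyNAPSE core specification.

import random

N_AXONS, N_NEURONS = 4, 3
# crossbar[a][n] == 1 wires axon a to neuron n ("memory" of the core).
crossbar = [[random.randint(0, 1) for _ in range(N_NEURONS)]
            for _ in range(N_AXONS)]
potential = [0.0] * N_NEURONS
THRESHOLD = 2.0

def deliver_spike(axon):
    """Event-driven update: only neurons wired to this axon do any work."""
    fired = []
    for n in range(N_NEURONS):
        if crossbar[axon][n]:
            potential[n] += 1.0
            if potential[n] >= THRESHOLD:
                potential[n] = 0.0
                fired.append(n)
    return fired  # spikes to forward to other cores' axons

for event in [0, 2, 0, 1, 3]:   # a stream of incoming spike events
    print(f"axon {event} -> neurons fired: {deliver_spike(event)}")
```

In contrast to a clocked von Neumann loop that polls every input on every cycle, nothing in this sketch runs between events, which is one reason such architectures can operate at very low power.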

Having completed Phase 0, Phase 1, and Phase 2, IBM and its collaborators (Cornell University and iniLabs, Ltd) have recently been awarded approximately $12 million in new funding from the Defense Advanced Research Projects Agency (DARPA) for Phase 3 of the Systems of Neuromorphic Adaptive Plastic Scalable Electronics (SyNAPSE) project, thus bringing the cumulative funding to approximately $53 million.

Smarter Sensors

IBM's long-term goal is to build a chip system with ten billion neurons and a hundred trillion synapses, while consuming merely one kilowatt of power and occupying less than two liters of volume.

Systems built from these chips could bring the real-time capture and analysis of various types of data closer to the point of collection. They would gather not only symbolic data, which is fixed text or digital information, but also sub-symbolic data, which is sensory-based and whose values change continuously. This raw data reflects every kind of activity in the world, ranging from commerce, social interaction, logistics, location, and movement to environmental conditions.

Take the human eyes, for example: they sift through over a terabyte of data per day. Emulating the visual cortex, low-power, lightweight eyeglasses designed to help the visually impaired could be outfitted with multiple video and auditory sensors that capture and analyze this flow of optical data.

These sensors would gather and interpret large volumes of data to signal how many individuals are ahead of the user, the distance to an upcoming curb, the number of vehicles in a given intersection, and the height of a ceiling or the length of a crosswalk.

Like a guide dog, the glasses would use the sub-symbolic data they perceive to plot the safest pathway through a room or outdoor setting and help the user navigate the environment via embedded speakers or earbuds. The same technology, at increasing levels of scale, could provide sensory-based data input and on-board analytics for automobiles, medical imagers, healthcare devices, smartphones, cameras, and robots.
