
The Second Coming of Neuromorphic Computing

Monday, February 15, 2016 by Nicole Hemsoth

Just a few years ago, the promise of ultra-low power, high performance computing was tied to the rather futuristic-sounding vision of a “brain chip” or neuromorphic processor, which could mimic the brain’s structure and processing ability in silicon—quickly learning and chewing on data as fast as it could be generated.

(Article by Nicole Hemsoth, republished from http://www.nextplatform.com/2016/02/09/the-second-coming-of-neuromorphic-computing/)

In that short amount of time, broader attention has shifted to other devices and software frameworks for achieving the same end: custom ASICs designed to train and execute neural networks, reprogrammable hardware with a low-power hook such as FPGAs, and ARM, GPU, and other non-standard CPU cores. Against that backdrop, neuromorphic approaches are less likely to garner press, even though the work happening in this area is legitimate, fascinating, and directly relevant to the wave of new deep learning, machine learning, in-situ analysis, and on-sensor processing capabilities that are rising to the fore.

The handful of neuromorphic devices that do exist are based on a widely variable set of architectures and visions, but the goal is the same: to create a chip that operates on the same principles as the brain. The goal that has not been met, however, is the delivery of a revolution in computing. But the full story has not played out quite yet; neuromorphic devices may see a second (and perhaps tidal) wave of interest in the coming years. To call it a second coming might not be entirely fair, since neuromorphic computing never really died off to begin with. What did dissipate, however, was the focus and wider attention.

There was an initial window of opportunity for neuromorphic computing, which opened as a few major funding initiatives were afoot. While these propelled critical research and the production of actual hardware devices and programming tools, attention cooled as other trends rose to the fore. Still, the research groups dedicated to exploring the range of potential architectures, programming approaches, and potential use cases have moved ahead—and now might be their year to shine once again.

There have been a couple of noteworthy investments that have fed existing research on neuromorphic architectures. The DARPA SyNAPSE program was one such effort; beginning in 2008, it eventually yielded IBM's "True North" chip, a 4,096-core device with 256 programmable "neurons" per core that act much like synapses in the brain. The result is a highly energy-efficient architecture that, while fascinating, requires an entire rethink of programming approaches. Since that time, other funding from scientific sources, including the Human Brain Project, has pushed the area further, leading to the creation of the SpiNNaker neuromorphic device, although there is still no single architecture that appears best for neuromorphic computing in general.
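For a rough sense of scale, the commonly cited True North figures (4,096 cores, 256 neurons per core, and a 256-by-256 synaptic crossbar per core) work out as follows. The short Python sketch below is illustrative arithmetic only, not code for the chip itself.

# Back-of-the-envelope scale of the True North chip, using the commonly
# reported figures; illustrative arithmetic only.
cores = 4096
neurons_per_core = 256
synapses_per_core = 256 * 256           # each core's crossbar joins 256 axons to 256 neurons

total_neurons = cores * neurons_per_core
total_synapses = cores * synapses_per_core

print(f"neurons:  {total_neurons:,}")   # 1,048,576 (roughly one million)
print(f"synapses: {total_synapses:,}")  # 268,435,456 (commonly quoted as 256 million, i.e. 256 x 2^20)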

The problem is really that there is no "general" purpose for such devices as of yet, and no widely accepted device or programmatic approach. Much of this stems from the fact that many of the existing projects are built around specific goals that vary widely. For starters, some neuromorphic engineering projects are centered on robotics rather than large-scale computing applications (and vice versa). Among the computing-oriented efforts, Stanford University's Neurogrid project, which was presented in hardware in 2009 and remains an ongoing research endeavor, set out to simulate the human brain, so both its programming approach and hardware design are modeled as closely on the brain as possible. Others are oriented more toward solving computer science challenges around power consumption and computational capability using the same concepts, including a 2011 effort at MIT, work at HP on memristors as a key to neuromorphic device creation, and various smaller projects, including one spin-off of the True North architecture we described here.

A New Wave is Forming

What’s interesting about the above-referenced projects is that their heyday appears to have been the 2009 to 2013 period, with a large gap until the present, even if research is still ongoing. Still, one can make the argument that the attention around deep neural networks and other brain-inspired (although not brain-like) programmatic and algorithmic trends might bring neuromorphic computing back to the fore.

“Neuromorphic computing is still in its beginning stages,” Dr. Catherine Schuman, a researcher working on such architectures at Oak Ridge National Laboratory, tells The Next Platform. “We haven’t nailed down a particular architecture that we are going to run with. True North is an important one, but there are other projects looking at different ways to model a neuron or synapse. And there are also a lot of questions about how to actually use these devices as well, so the programming side of things is just as important.”

The programming approach varies from device to device, as Schuman explains. “With True North, for example, the best results come from training a deep learning network offline and moving that program onto the chip. Other, more biologically inspired implementations like Neurogrid, for instance, are based on spike-timing-dependent plasticity.”
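For readers who have not encountered the term, spike-timing-dependent plasticity can be summarized in a few lines of code: a synapse is strengthened when the presynaptic neuron fires just before the postsynaptic one, and weakened when the order is reversed. The Python sketch below is a generic, pair-based version with made-up constants; it is not drawn from Neurogrid or any other specific chip.

import math

def stdp_delta_w(delta_t, a_plus=0.01, a_minus=0.012, tau_plus=20.0, tau_minus=20.0):
    """Pair-based STDP weight update.

    delta_t = t_post - t_pre (milliseconds). If the presynaptic spike precedes
    the postsynaptic spike (delta_t > 0), the synapse is potentiated; if it
    follows (delta_t < 0), the synapse is depressed. Constants are illustrative.
    """
    if delta_t > 0:
        return a_plus * math.exp(-delta_t / tau_plus)    # potentiation
    elif delta_t < 0:
        return -a_minus * math.exp(delta_t / tau_minus)  # depression
    return 0.0

# Example: pre fires 5 ms before post -> small positive weight change
print(stdp_delta_w(5.0))   # ~ +0.0078
print(stdp_delta_w(-5.0))  # ~ -0.0093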

The approach Schuman’s team is working on at Oak Ridge and the University of Tennessee is based on a neuromorphic architecture called NIDA, short for the Neuroscience Inspired Dynamic Architecture, which was implemented on an FPGA in 2014 and now has a full SDK and tooling around it. The hardware implementation, called the Dynamic Adaptive Neural Network Array (DANNA), differs from other approaches to neuromorphic computing in that it allows for programmability of structure and is trained using an evolutionary optimization approach, again modeled as closely as possible on what we know (and still don’t know) about the way our brains work.
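To illustrate what evolutionary optimization of network structure means in practice, here is a minimal, self-contained Python sketch: a population of candidate networks is scored, the fittest survive, and mutations add, remove, or re-weight connections. The encoding, fitness function, and parameters are hypothetical placeholders for illustration; this is not the NIDA/DANNA toolchain.

import random

def random_network(n_neurons=16, p_connect=0.2):
    """A toy 'network': a dict of weighted directed edges between neurons."""
    return {(i, j): random.uniform(-1, 1)
            for i in range(n_neurons) for j in range(n_neurons)
            if i != j and random.random() < p_connect}

def mutate(net, n_neurons=16):
    """Structural mutation: add, remove, or re-weight one connection."""
    child = dict(net)
    i, j = random.sample(range(n_neurons), 2)    # two distinct neurons
    if (i, j) in child and random.random() < 0.5:
        del child[(i, j)]                        # drop an existing edge
    else:
        child[(i, j)] = random.uniform(-1, 1)    # add or re-weight an edge
    return child

def fitness(net):
    """Placeholder objective: in reality this would score performance on a task."""
    return -abs(len(net) - 40)                   # toy goal: prefer roughly 40 connections

population = [random_network() for _ in range(50)]
for generation in range(100):
    population.sort(key=fitness, reverse=True)
    survivors = population[:10]                  # keep the fittest candidates
    population = survivors + [mutate(random.choice(survivors)) for _ in range(40)]

print("best fitness:", fitness(max(population, key=fitness)))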

Schuman stresses the exploratory nature of existing neuromorphic computing efforts, including those at the lab, but does see a host of new opportunities for them on the horizon, presuming the programming models can be developed to suit both domain scientists and computer scientists. There are, she notes, two routes for neuromorphic devices in the next several years. First, as embedded processors on sensors and other devices, given their low power consumption and high-performance processing capability. Second, and perhaps more important for a research center like Oak Ridge National Lab, neuromorphic devices could act “as co-processors on large-scale supercomputers like Titan today where the neuromorphic processor would sit alongside the traditional CPUs and GPU accelerators.” Where they tend to shine most, and where her team is focusing effort, is on the role they might play in real-time data analysis.

“For large simulations where there might be petabytes of data being created, normally that would all be spun off to tape. But neuromorphic devices can be intelligent processors handling data as it’s being created to guide scientists more quickly.”
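The workflow Schuman describes can be pictured as a simple streaming filter: a model sitting alongside the simulation scores each chunk of data as it is produced and flags only the interesting pieces, rather than archiving everything to tape. The sketch below is purely conceptual; the function names and thresholding scheme are hypothetical stand-ins, not a real neuromorphic API.

import random

def simulation_steps(n_steps=1000, chunk_size=64):
    """Stand-in for a large simulation emitting one chunk of data per timestep."""
    for step in range(n_steps):
        yield step, [random.gauss(0, 1) for _ in range(chunk_size)]

def anomaly_score(chunk):
    """Stand-in for the trained model running on the neuromorphic co-processor."""
    return max(abs(x) for x in chunk)

THRESHOLD = 3.5
for step, chunk in simulation_steps():
    if anomaly_score(chunk) > THRESHOLD:         # keep only what looks interesting
        print(f"step {step}: flagged for the scientist to examine")
    # everything else is dropped instead of being spun off to tape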

What is really needed for these potential use cases, beyond research like Schuman’s and the related efforts at IBM, HP, and elsewhere, is the development of a richer programming and vendor landscape. One promising effort from the Brain Corporation, a Qualcomm-backed venture, appears to be gaining some traction, even if it arrived slightly later to the neuromorphic device game than its competitors. Although it is more robotics- and sensor-oriented (versus larger-scale computing and co-processing, the territory of Qualcomm’s coming Zeroth machine learning platform, which is based on neuromorphic approaches), the team there is reported to have developed a neuromorphic device in silicon along with the companion software environment that serves as an interface for programmers.

Although the concept has been floating around since the 1980s and has been implemented in hardware across a number of projects, including some not mentioned here, the future of neuromorphic computing is still somewhat uncertain, even if the exploding range of applications puts it in the spotlight once again. The small range of existing physical devices and an evolving set of programming approaches match a growing set of problems in research and enterprise, and this could very well be the year neuromorphic computing breaks into the mainstream.

Read more at: http://www.nextplatform.com/2016/02/09/the-second-coming-of-neuromorphic-computing/


