Neuromorphic Chips
Microprocessors configured more like brains than traditional chips could soon make computers far more astute about what’s going on around them.
by Robert D. Hof, MIT Technology Review, May/June 2014
A pug-size robot named Pioneer slowly rolls up to the
Captain America action figure on the carpet. They’re facing off inside a rough
model of a child’s bedroom that the wireless-chip maker Qualcomm has set up in
a trailer. The robot pauses, almost as if it is evaluating the situation, and
then corrals the figure with a snowplow-like implement mounted in front, turns
around, and pushes it toward three squat pillars representing toy bins.
Qualcomm senior engineer Ilwoo Chang sweeps both arms toward the pillar where
the toy should be deposited. Pioneer spots that gesture with its camera and
dutifully complies. Then it rolls back and spies another action figure,
Spider-Man. This time Pioneer beelines for the toy, ignoring a chessboard
nearby, and delivers it to the same pillar with no human guidance. This
demonstration at Qualcomm’s headquarters in San Diego looks modest, but it’s a glimpse of
the future of computing. The robot is performing tasks that have typically
needed powerful, specially programmed computers that use far more electricity.
Powered by only a smartphone chip with specialized software, Pioneer can
recognize objects it hasn’t seen before, sort them by their similarity to
related objects, and navigate the room to deliver them to the right
location—not because of laborious programming but merely by being shown once
where they should go. The robot can do all that because it is simulating,
albeit in a very limited fashion, the way a brain works.
Later this year, Qualcomm will begin to reveal how the
technology can be embedded into the silicon chips that power every manner of
electronic device. These “neuromorphic” chips—so named because they are modeled
on biological brains—will be designed to process sensory data such as images
and sound and to respond to changes in that data in ways not specifically
programmed. They promise to accelerate decades of fitful progress in artificial
intelligence and lead to machines that are able to understand and interact with
the world in humanlike ways. Medical sensors and devices could track
individuals’ vital signs and response to treatments over time, learning to
adjust dosages or even catch problems early. Your smartphone could learn to
anticipate what you want next, such as background on someone you’re about to
meet or an alert that it’s time to leave for your next meeting. Those
self-driving cars Google is experimenting with might not need your help at all,
and more adept Roombas wouldn’t get stuck under your couch. “We’re blurring the
boundary between silicon and biological systems,” says Qualcomm’s chief
technology officer, Matthew Grob.
Qualcomm’s chips won’t become available until next year
at the earliest; the company will spend 2014 signing up researchers to try out
the technology. But if it delivers, the project—known as the Zeroth
program—would be the first large-scale commercial platform for neuromorphic
computing. That’s on top of promising efforts at universities and at corporate
labs such as IBM Research and HRL Laboratories, which have each developed
neuromorphic chips under a $100 million project for the Defense Advanced
Research Projects Agency. Likewise, the Human Brain Project in Europe is
spending roughly 100 million euros on neuromorphic projects, including efforts
at Heidelberg University and the University of Manchester. Another group in Germany
recently reported using a neuromorphic chip and software modeled on insects’
odor-processing systems to recognize plant species by their flowers.
Today’s computers all use the so-called von Neumann
architecture, which shuttles data back and forth between a central processor
and memory chips in linear sequences of calculations. That method is great for
crunching numbers and executing precisely written programs, but not for processing
images or sound and making sense of it all. It’s telling that in 2012, when
Google demonstrated artificial-intelligence software that learned to recognize
cats in videos without being told what a cat was, it needed 16,000 processors
to pull it off.
Continuing to improve the performance of such processors
requires their manufacturers to pack in ever more, ever faster transistors,
silicon memory caches, and data pathways, but the sheer heat generated by all
those components is limiting how fast chips can be operated, especially in
power-stingy mobile devices. That could halt progress toward devices that
effectively process images, sound, and other sensory information and then apply
it to tasks such as face recognition and robot or vehicle navigation.
No one is more acutely interested in getting around those
physical challenges than Qualcomm, maker of wireless chips used in many phones
and tablets. Increasingly, users of mobile devices are demanding more from
these machines. But today’s personal-assistant services, such as Apple’s Siri
and Google Now, are limited because they must call out to the cloud for more
powerful computers to answer or anticipate queries. “We’re running up against
walls,” says Jeff Gehlhaar, the Qualcomm vice president of technology who heads
the Zeroth engineering team.
Neuromorphic chips attempt to model in silicon the
massively parallel way the brain processes information as billions of neurons
and trillions of synapses respond to sensory inputs such as visual and auditory
stimuli. Those neurons also change how they connect with each other in response
to changing images, sounds, and the like. That is the process we call learning.
The chips, which incorporate brain-inspired models called neural networks, do
the same. That’s why Qualcomm’s robot—even though for now it’s merely running
software that simulates a neuromorphic chip—can put Spider-Man in the same
location as Captain America without having seen Spider-Man before.
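The spiking behavior described above can be sketched in a few lines of code. The following toy leaky integrate-and-fire neuron is a generic textbook model, not Qualcomm's design, and every parameter value is illustrative: the neuron accumulates input, leaks charge toward a resting level, and "fires" when a threshold is crossed.

```python
def simulate_lif(input_current, dt=1.0, tau=20.0, v_rest=0.0,
                 v_thresh=1.0, v_reset=0.0):
    """Toy leaky integrate-and-fire neuron.

    Returns the membrane-potential trace and the spike times.
    All parameters are illustrative, not tied to any real chip.
    """
    v = v_rest
    trace, spikes = [], []
    for t, i_in in enumerate(input_current):
        # The membrane potential leaks toward rest while integrating input.
        v += dt * (-(v - v_rest) + i_in) / tau
        if v >= v_thresh:      # threshold crossed: the neuron "fires"
            spikes.append(t)
            v = v_reset        # and resets, ready to integrate again
        trace.append(v)
    return trace, spikes

# A constant drive above threshold produces a regular spike train;
# no drive produces no spikes at all.
_, spikes = simulate_lif([1.5] * 200)
_, silent = simulate_lif([0.0] * 200)
```

Real neuromorphic hardware wires vast numbers of such units together and lets the synaptic weights between them change with activity, which is where the learning comes from.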
Even if neuromorphic chips are nowhere near as capable as
the brain, they should be much faster than current computers at processing
sensory data and learning from it. Trying to emulate the brain just by using
special software on conventional processors—the way Google did in its cat
experiment—is way too inefficient to be the basis of machines with still
greater intelligence, says Jeff Hawkins, a leading thinker on AI who created
the Palm Pilot before cofounding Numenta, a maker of brain-inspired software.
“There’s no way you can build it [only] in software,” he says of effective AI.
“You have to build this in silicon.”
Neural Channel
As smartphones have taken off, so has Qualcomm, whose
market capitalization now tops Intel’s. That’s thanks in part to the hundreds
of wireless-communications patents that Qualcomm shows off on two levels of a
seven-story atrium lobby at its San Diego headquarters. Now it’s looking to break new
ground again. First in coöperation with Brain Corp., a neuroscience startup it
invested in and that is housed at its headquarters, and more recently with its
own growing staff, it has been quietly working for the past five years on
algorithms to mimic brain functions as well as hardware to execute them. The
Zeroth project has initially focused on robotics applications because the way
robots can interact with the real world provides broader lessons about how the
brain learns—lessons that can then be applied in smartphones and other
products. Its name comes from Isaac Asimov’s “Zeroth Law” of robotics: “A robot
may not harm humanity, or, by inaction, allow humanity to come to harm.”
The idea of neuromorphic chips dates back decades. Carver
Mead, the Caltech professor emeritus who is a legend in integrated-circuit
design, coined the term in a 1990 paper, describing how analog chips—those that
vary in their output, like real-world phenomena, in contrast to the binary,
on-or-off nature of digital chips—could mimic the electrical activity of
neurons and synapses in the brain. But he struggled to find ways to reliably
build his analog chip designs. Only one arguably neuromorphic processor, a
noise suppression chip made by Audience, has sold in the hundreds of millions.
The chip, which is based on the human cochlea, has been used in phones from
Apple, Samsung, and others.
As a commercial company, Qualcomm has opted for
pragmatism over sheer performance in its design. That means the neuromorphic
chips it’s developing are still digital chips, which are more predictable and
easier to manufacture than analog ones. And instead of modeling the chips as
closely as possible on actual brain biology, Qualcomm’s project emulates
aspects of the brain’s behavior. For instance, the chips encode and transmit
data in a way that mimics the electrical spikes generated in the brain as it
responds to sensory information. “Even with this digital representation, we can
reproduce a huge range of behaviors we see in biology,” says M. Anthony Lewis,
the project engineer for Zeroth.
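One standard way to encode data as spike trains — offered here as a generic illustration, since Qualcomm has not published its encoding scheme — is rate coding, in which a signal's intensity sets the firing probability of an input neuron:

```python
import random

def rate_encode(values, n_steps=100, seed=0):
    """Encode values in [0, 1] as Bernoulli spike trains.

    Each value becomes the per-step firing probability of one input
    neuron, so a stronger signal (a brighter pixel, a louder sound)
    produces a denser train of spikes.
    """
    rng = random.Random(seed)
    return [[1 if rng.random() < v else 0 for _ in range(n_steps)]
            for v in values]

# A weak and a strong input: the strong one fires far more often.
weak, strong = rate_encode([0.1, 0.9])
```

Downstream spiking neurons then operate only on these sparse on/off events rather than on the raw analog values, which is part of what makes the approach power-efficient.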
The chips would fit neatly into the existing business of
Qualcomm, which dominates the market for mobile-phone chips but has seen
revenue growth slow. Its Snapdragon mobile-phone chips include components such
as graphics processing units; Qualcomm could add a “neural processing unit” to
the chips to handle sensory data and tasks such as image recognition and robot
navigation. And given that Qualcomm has a highly profitable business of
licensing technologies to other companies, it would be in a position to sell
the rights to use algorithms that run on neuromorphic chips. That could lead to
sensor chips for vision, motion control, and other applications.
Cognitive Companion
Matthew Grob was startled, then annoyed, when he heard
the theme to Sanford and Son start playing in the middle of a recent meeting. It turns
out that on a recent trip to Spain, he had set his smartphone to issue a
reminder using the tune as an alarm, and
the phone thought it was time to play it again. That’s just one small example
of how far our personal devices are from being intelligent. Grob dreams of a
future when instead of monkeying with the settings of his misbehaving phone, as
he did that day, all he would have to do is bark, “Don’t do that!” Then the
phone might learn that it should switch off the alarm when he’s in a new time
zone.
Qualcomm is especially interested in the possibility that
neuromorphic chips could transform smartphones and other mobile devices into
cognitive companions that pay attention to your actions and surroundings and
learn your habits over time. “If you and your device can perceive the
environment in the same way, your device will be better able to understand your
intentions and anticipate your needs,” says Samir Kumar, a business development
director at Qualcomm’s research lab.
Pressed for examples, Kumar ticks off a litany: If you
tag your dog in a photo, your phone’s camera would recognize the pet in every
subsequent photo. At a soccer game, you could tell the phone to snap a photo
only when your child is near the goal. At bedtime, it would know without your
telling it to send calls to voice mail. In short, says Grob, your smartphone
would have a digital sixth sense.
Qualcomm executives are reluctant to embark on too many
flights of fancy before their chip is even available. But neuromorphic
researchers elsewhere don’t mind speculating. According to Dharmendra Modha, a
top IBM researcher in San Jose, such chips might lead to glasses for the blind
that use visual and auditory sensors to recognize objects and provide audio
cues; health-care systems that monitor vital signs, provide early warnings of
potential problems, and suggest ways to individualize treatments; and computers
that draw on wind patterns, tides, and other indicators to predict tsunamis
more accurately. At HRL this summer, principal research scientist Narayan
Srinivasa plans to test a neuromorphic chip in a bird-size device from
AeroVironment that will be flown around a couple of rooms. It will take in data
from cameras and other sensors so it can remember which room it’s in and learn
to navigate that space more adeptly, which could lead to more capable drones.
It will take programmers time to figure out the best way
to exploit the hardware. “It’s not too early for hardware companies to do
research,” says Dileep George, cofounder of the artificial-intelligence
startup Vicarious. “The commercial products could take a while.” Qualcomm
executives don’t disagree. But they’re betting that the technology they expect
to launch this year will bring those products a lot closer to reality.
See also the neuromorphic engineering article in Wikipedia: https://en.wikipedia.org/wiki/Neuromorphic_engineering
See also this review of neuromorphic chips by a PhD candidate: https://www.singularityweblog.com/neuromorphic-chips-and-human-level-ai/