Singapore Researchers Look to Intel Neuromorphic Computing to Help Enable Robots That ‘Feel’
15 July 2020 - 11:00PM
Business Wire
What’s New: Today, two researchers from the National
University of Singapore (NUS), who are members of the Intel
Neuromorphic Research Community (INRC), presented new findings
demonstrating the promise of event-based vision and touch sensing
in combination with Intel’s neuromorphic processing for robotics.
The work highlights how bringing a sense of touch to robotics can
significantly improve capabilities and functionality compared to
today’s visual-only systems and how neuromorphic processors can
outperform traditional architectures in processing such sensory
data.
This press release features multimedia. View
the full release here:
https://www.businesswire.com/news/home/20200715005068/en/
The National University of Singapore
research team behind the novel robotic system integrated with
event-driven artificial skin and vision sensors was led by
assistant professor Harold Soh (left) and assistant professor
Benjamin Tee (right). With them are team members (second from left
to right) Sng Weicong, Tasbolat Taunyazov and See Hian Hian.
(Credit: National University of Singapore)
“This research from the National University of
Singapore provides a compelling glimpse into the future of robotics
where information is both sensed and processed in an event-driven
manner combining multiple modalities. The work adds to a growing
body of results showing that neuromorphic computing can deliver
significant gains in latency and power consumption once the entire
system is re-engineered in an event-based paradigm spanning
sensors, data formats, algorithms, and hardware architecture.” —
Mike Davies, director of Intel’s Neuromorphic Computing Lab
Why It Matters: The human sense of touch is sensitive
enough to feel the difference between surfaces that differ by just
a single layer of molecules, yet most of today’s robots operate
solely on visual processing. Researchers at NUS hope to change this
using their recently developed artificial skin, which according to
their research can detect touch more than 1,000 times faster than
the human sensory nervous system and identify the shape, texture
and hardness of objects 10 times faster than the blink of an
eye.
Enabling a human-like sense of touch in robotics could
significantly improve current functionality and even lead to new
use cases. For example, robotic arms fitted with artificial skin
could easily adapt to changes in goods manufactured in a factory,
using tactile sensing to identify and grip unfamiliar objects with
the right amount of pressure to prevent slipping. The ability to
feel and better perceive surroundings could also allow for closer
and safer human-robot interaction, such as in caregiving
professions, or bring us closer to automating surgical tasks by
giving surgical robots the sense of touch that they lack today.
While the creation of artificial skin is one step toward bringing
this vision to life, realizing it also requires a chip that can draw
accurate conclusions from the skin’s sensory data in real time, while
operating at a power level low enough for deployment directly
inside the robot. “Making an ultra-fast artificial skin sensor
solves about half the puzzle of making robots smarter,” said
assistant professor Benjamin Tee from the NUS Department of
Materials Science and Engineering and NUS Institute for Health
Innovation & Technology. “They also need an artificial brain
that can ultimately achieve perception and learning as another
critical piece in the puzzle. Our unique demonstration of an AI
skin system with neuromorphic chips such as the Intel Loihi
provides a major step forward towards power-efficiency and
scalability.”
About the Research: To break new ground in robotic
perception, the NUS team began exploring the potential of
neuromorphic technology to process sensory data from the artificial
skin using Intel’s Loihi neuromorphic research chip. In their
initial experiment, the researchers used a robotic hand fitted with
the artificial skin to read Braille, passing the tactile data to
Loihi through the cloud to convert the micro bumps felt by the hand
into a semantic meaning. Loihi achieved over 92 percent accuracy in
classifying the Braille letters, while using 20 times less power
than a standard von Neumann processor.
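The release does not describe the model itself, but the kind of spike-based processing it refers to can be illustrated with a toy sketch. Everything below is hypothetical (invented weights, sizes, and names, not the NUS system): tactile events drive a layer of leaky integrate-and-fire (LIF) neurons, and the output neuron that fires most often gives the predicted class.

```python
import numpy as np

def lif_layer(spike_trains, weights, threshold=1.0, decay=0.9):
    """Run input spike trains through one layer of leaky
    integrate-and-fire neurons; return output spike counts."""
    n_steps, _ = spike_trains.shape
    v = np.zeros(weights.shape[1])            # membrane potentials
    counts = np.zeros(weights.shape[1], dtype=int)
    for t in range(n_steps):
        v = decay * v + spike_trains[t] @ weights  # leak, then integrate input spikes
        fired = v >= threshold
        counts += fired                       # record output spikes
        v[fired] = 0.0                        # reset neurons that fired
    return counts

# Toy example: two tactile taxels, two output classes. The (made-up)
# weights make class 0 respond to taxel 0 and class 1 to taxel 1.
rng = np.random.default_rng(0)
weights = np.array([[0.6, 0.0],
                    [0.0, 0.6]])
# Input: taxel 0 fires densely, taxel 1 rarely, so class 0 should win.
spikes = (rng.random((50, 2)) < np.array([0.8, 0.1])).astype(float)
counts = lif_layer(spikes, weights)
predicted = int(np.argmax(counts))
print(counts, predicted)
```

In a real system the weights would be learned, and on a chip like Loihi the event-driven updates run in dedicated hardware rather than a Python loop, which is where the latency and power advantages come from.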
Building on this work, the NUS team further improved robotic
perception capabilities by combining both vision and touch data in
a spiking neural network. To do so, they tasked a robot to classify
various opaque containers holding differing amounts of liquid using
sensory inputs from the artificial skin and an event-based camera.
Researchers used the same tactile and vision sensors to test the
ability of the perception system to identify rotational slip, which
is important for stable grasping.
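Event-based sensors like the camera and skin described above transmit data only when something changes. A minimal delta-modulation encoder (an illustration of the principle, not the actual sensor pipeline; thresholds and signal are invented) shows why steady inputs produce almost no data:

```python
import numpy as np

def delta_encode(signal, threshold=0.1):
    """Emit (time, polarity) events only when the signal moves more than
    `threshold` away from the last event level, the way an event-camera
    pixel or event-driven taxel does; steady signals emit nothing."""
    events = []
    level = signal[0]
    for t, x in enumerate(signal[1:], start=1):
        while x - level >= threshold:      # upward change -> ON events
            level += threshold
            events.append((t, +1))
        while level - x >= threshold:      # downward change -> OFF events
            level -= threshold
            events.append((t, -1))
    return events

# A signal that ramps up, holds steady, then drops partway back.
sig = np.concatenate([np.linspace(0, 1, 20),    # ramp: ON events
                      np.full(30, 1.0),         # steady: no events
                      np.linspace(1, 0.5, 10)]) # drop: OFF events
ev = delta_encode(sig)
print(len(ev), "events for", len(sig), "raw samples")
```

Because the 30 steady samples generate no events at all, the event stream is far sparser than the raw samples, which is what lets downstream event-driven processing react quickly while doing less work.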
Once this sensory data was captured, the team sent it to both a
GPU and Intel’s Loihi neuromorphic research chip to compare
processing capabilities. The results, which were presented at
Robotics: Science and Systems this week, show that combining
event-based vision and touch using a spiking neural network enabled
10 percent greater accuracy in object classification compared to a
vision-only system. Moreover, they demonstrated the promise of
neuromorphic technology to power such robotic devices, with Loihi
processing the sensory data 21 percent faster than a top-performing
GPU, while using 45 times less power.
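An accuracy gain from fusing touch with vision is consistent with the general statistics of combining independent noisy cues. A toy simulation with made-up noise levels (not the NUS data or task) shows the effect:

```python
import numpy as np

rng = np.random.default_rng(42)
n_trials, n_classes = 2000, 4
true = rng.integers(0, n_classes, n_trials)

def noisy_scores(labels, noise):
    """One-hot evidence for the true class plus Gaussian sensor noise."""
    return np.eye(n_classes)[labels] + rng.normal(0, noise, (len(labels), n_classes))

vision = noisy_scores(true, noise=0.6)   # one noisy modality
touch = noisy_scores(true, noise=0.6)    # an independent noisy modality

acc_vision = np.mean(vision.argmax(1) == true)         # vision-only
acc_fused = np.mean((vision + touch).argmax(1) == true)  # summed evidence
print(round(acc_vision, 3), round(acc_fused, 3))
```

Summing evidence from two independent sensors doubles the signal while the noise grows only by a factor of roughly √2, so the fused classifier is more accurate than either modality alone.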
“We’re excited by these results. They show that a neuromorphic
system is a promising piece of the puzzle for combining multiple
sensors to improve robot perception. It’s a step toward building
power-efficient and trustworthy robots that can respond quickly and
appropriately in unexpected situations,” said assistant professor
Harold Soh from the Department of Computer Science at the NUS
School of Computing.
About the Intel Neuromorphic Research Community: The
Intel Neuromorphic Research Community is an ecosystem of academic
groups, government labs, research institutions, and companies
around the world working with Intel to further neuromorphic
computing and develop innovative AI applications. Researchers
interested in participating in the INRC and developing for Loihi
can visit the Intel Neuromorphic Research Community website. A list
of current members can also be found at the site.
More Context: Neuromorphic Computing (Press Kit) | Intel
Labs (Press Kit) | How Neuromorphic Computing Uses the Human Brain
as a Model (Video) | Exceptional sense of touch for robots and
prosthetics (National University of Singapore) | New breakthrough
by NUS researchers gives robots intelligent sensing abilities to
carry out complex tasks (National University of Singapore)
About Intel
Intel (Nasdaq: INTC) is an industry leader, creating
world-changing technology that enables global progress and enriches
lives. Inspired by Moore’s Law, we continuously work to advance the
design and manufacturing of semiconductors to help address our
customers’ greatest challenges. By embedding intelligence in the
cloud, network, edge and every kind of computing device, we unleash
the potential of data to transform business and society for the
better. To learn more about Intel’s innovations, go to
newsroom.intel.com and intel.com.
© Intel Corporation. Intel, the Intel logo and other Intel marks
are trademarks of Intel Corporation or its subsidiaries. Other
names and brands may be claimed as the property of others.
View source
version on businesswire.com: https://www.businesswire.com/news/home/20200715005068/en/
Alexa Korkos 415-706-5783 alexa.korkos@intel.com