{"id":25581,"date":"2020-07-21T00:16:15","date_gmt":"2020-07-20T18:46:15","guid":{"rendered":"https:\/\/www.technologyforyou.org\/?p=25581"},"modified":"2020-07-21T00:16:15","modified_gmt":"2020-07-20T18:46:15","slug":"singapore-researchers-look-to-intel-neuromorphic-computing-to-help-enable-robots-that-feel","status":"publish","type":"post","link":"https:\/\/www.technologyforyou.org\/singapore-researchers-look-to-intel-neuromorphic-computing-to-help-enable-robots-that-feel\/","title":{"rendered":"Singapore Researchers Look to Intel Neuromorphic Computing to Help Enable Robots That \u2018Feel\u2019"},"content":{"rendered":"<p><span style=\"font-family: 'trebuchet ms', geneva, sans-serif; font-size: 12pt;\">Today, two researchers from the National University of Singapore (NUS), who are members of the Intel Neuromorphic Research Community (INRC), presented new findings demonstrating the promise of event-based vision and touch sensing in combination with Intel\u2019s neuromorphic processing for robotics. The work highlights how bringing a sense of touch to robotics can significantly improve capabilities and functionality compared to today\u2019s visual-only systems, and how neuromorphic processors can outperform traditional architectures in processing such sensory data.<\/span><\/p>\n<p><span style=\"font-family: 'trebuchet ms', geneva, sans-serif; font-size: 12pt;\"><strong>Mike Davies, director of Intel\u2019s Neuromorphic Computing Lab, said,<\/strong> \u201cThis research from the National University of Singapore provides a compelling glimpse into the future of robotics, where information is both sensed and processed in an event-driven manner, combining multiple modalities. 
The work adds to a growing body of results showing that neuromorphic computing can deliver significant gains in latency and power consumption once the entire system is re-engineered in an event-based paradigm spanning sensors, data formats, algorithms, and hardware architecture.\u201d<\/span><span style=\"font-family: 'trebuchet ms', geneva, sans-serif; font-size: 12pt;\">\u00a0<\/span><\/p>\n<p><span style=\"font-family: 'trebuchet ms', geneva, sans-serif; font-size: 12pt;\"><span style=\"font-size: 12pt;\">Enabling a human-like sense of touch in robotics could significantly improve current functionality and even lead to new use cases. For example, robotic arms fitted with artificial skin could easily adapt to changes in goods manufactured in a factory, using tactile sensing to identify and grip unfamiliar objects with the right amount of pressure to prevent slipping. The ability to feel and better perceive surroundings could also allow for closer and safer human-robot interaction, such as in caregiving professions, or bring us closer to automating surgical tasks by giving surgical robots the sense of touch they lack today.<\/span><\/span><\/p>\n<p><span style=\"font-family: 'trebuchet ms', geneva, sans-serif; font-size: 12pt;\"><strong>Why It Matters:<\/strong>\u00a0The human sense of touch is sensitive enough to feel the difference between surfaces that differ by just a single layer of molecules, yet most of today\u2019s robots operate solely on visual processing. 
Researchers at NUS hope to change this using their recently developed\u00a0artificial skin, which, according to their research, can detect touch more than 1,000 times faster than the human sensory nervous system and identify the shape, texture, and hardness of objects 10 times faster than the blink of an eye.<\/span><\/p>\n<p><span style=\"font-family: 'trebuchet ms', geneva, sans-serif; font-size: 12pt;\">While the creation of artificial skin is one step toward bringing this vision to life, it also requires a chip that can draw accurate conclusions from the skin\u2019s sensory data in real time, while operating at a power level efficient enough to be deployed directly inside the robot. \u201cMaking an ultra-fast artificial skin sensor solves about half the puzzle of making robots smarter,\u201d said assistant professor Benjamin Tee from the NUS Department of Materials Science and Engineering and the NUS Institute for Health Innovation &amp; Technology. \u201cThey also need an artificial brain that can ultimately achieve perception and learning as another critical piece in the puzzle. Our unique demonstration of an AI skin system with neuromorphic chips such as the Intel Loihi provides a major step forward towards power-efficiency and scalability.\u201d<\/span><\/p>\n<p><span style=\"font-family: 'trebuchet ms', geneva, sans-serif; font-size: 12pt;\"><span style=\"font-size: 12pt;\">Building on this work, the NUS team further improved robotic perception capabilities by combining both vision and touch data in a spiking neural network. To do so, they tasked a robot with classifying various opaque containers holding differing amounts of liquid, using sensory inputs from the artificial skin and an event-based camera. 
Researchers used the same tactile and vision sensors to test the ability of the perception system to identify rotational slip, which is important for stable grasping.<\/span><\/span><\/p>\n<p><span style=\"font-family: 'trebuchet ms', geneva, sans-serif; font-size: 12pt;\"><strong>About the Research:\u00a0<\/strong>To break new ground in robotic perception, the NUS team began exploring the potential of neuromorphic technology to process sensory data from the artificial skin using Intel\u2019s Loihi neuromorphic research chip. In their initial experiment, the researchers used a robotic hand fitted with the artificial skin to read Braille, passing the tactile data to Loihi through the cloud to convert the micro bumps felt by the hand into semantic meaning. Loihi achieved over 92 percent accuracy in classifying the Braille letters, while using 20 times less power than a standard von Neumann processor.<\/span><\/p>\n<p><span style=\"font-family: 'trebuchet ms', geneva, sans-serif; font-size: 12pt;\">Once this sensory data was captured, the team sent it to both a GPU and Intel\u2019s Loihi neuromorphic research chip to compare processing capabilities. The results, which were presented at\u00a0Robotics: Science and Systems this week, show that combining event-based vision and touch using a spiking neural network enabled 10 percent greater accuracy in object classification compared to a vision-only system. Moreover, they demonstrated the promise of neuromorphic technology to power such robotic devices, with Loihi processing the sensory data 21 percent faster than a top-performing GPU, while using 45 times less power.<\/span><\/p>\n<p><span style=\"font-family: 'trebuchet ms', geneva, sans-serif; font-size: 12pt;\">\u201cWe\u2019re excited by these results. They show that a neuromorphic system is a promising piece of the puzzle for combining multiple sensors to improve robot perception. 
It\u2019s a step toward building power-efficient and trustworthy robots that can respond quickly and appropriately in unexpected situations,\u201d <strong>said assistant professor Harold Soh from the Department of Computer Science at the NUS School of Computing.<\/strong><\/span><\/p>\n<p><iframe loading=\"lazy\" title=\"Combining Vision and Touch in Robotics Using Intel Neuromorphic Computing\" width=\"696\" height=\"392\" src=\"https:\/\/www.youtube.com\/embed\/tmDjoSIYtsY?feature=oembed\" frameborder=\"0\" allow=\"accelerometer; autoplay; encrypted-media; gyroscope; picture-in-picture\" allowfullscreen><\/iframe><\/p>\n","protected":false},"excerpt":{"rendered":"<p>Today, two researchers from the National University of Singapore (NUS), who are members of the Intel Neuromorphic Research Community (INRC), presented new findings demonstrating the promise of event-based vision and touch sensing in combination with Intel\u2019s neuromorphic processing for robotics. The work highlights how bringing a sense of touch to robotics can significantly improve capabilities 
[&hellip;]<\/p>\n","protected":false},"author":1,"featured_media":0,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[17,4],"tags":[],"class_list":{"0":"post-25581","1":"post","2":"type-post","3":"status-publish","4":"format-standard","6":"category-pics-and-videos","7":"category-technology"},"_links":{"self":[{"href":"https:\/\/www.technologyforyou.org\/wp-json\/wp\/v2\/posts\/25581","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.technologyforyou.org\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.technologyforyou.org\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.technologyforyou.org\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/www.technologyforyou.org\/wp-json\/wp\/v2\/comments?post=25581"}],"version-history":[{"count":0,"href":"https:\/\/www.technologyforyou.org\/wp-json\/wp\/v2\/posts\/25581\/revisions"}],"wp:attachment":[{"href":"https:\/\/www.technologyforyou.org\/wp-json\/wp\/v2\/media?parent=25581"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.technologyforyou.org\/wp-json\/wp\/v2\/categories?post=25581"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.technologyforyou.org\/wp-json\/wp\/v2\/tags?post=25581"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}