Neuromorphic spatiotemporal optical flow: Enabling ultrafast visual perception beyond human capabilities


Detailed Description

Saved in:
Bibliographic Details
Main authors: Wang, Shengbo; Zhao, Jingwen; Pu, Tongming; Zhao, Liangbing; Guo, Xiaoyu; Cheng, Yue; Li, Cong; Ma, Weihao; Tang, Chenyu; Xu, Zhenyu; Wang, Ningli; Occhipinti, Luigi; Nathan, Arokia; Dahiya, Ravinder; Wu, Huaqiang; Tao, Li; Gao, Shuo
Format: Article
Language: eng
Subjects:
Online access: Order full text
Description: Optical flow, inspired by the mechanisms of biological visual systems, calculates spatial motion vectors within visual scenes that are necessary for enabling robotics to excel in complex and dynamic working environments. However, current optical flow algorithms, despite human-competitive task performance on benchmark datasets, remain constrained by unacceptable time delays (~0.6 seconds per inference, roughly 4x human processing time) in practical deployment.

Here, we introduce a neuromorphic optical flow approach that addresses the delay bottleneck by encoding temporal information directly in a synaptic transistor array to assist spatial motion analysis. Compared to conventional spatial-only optical flow methods, our spatiotemporal neuromorphic optical flow offers spatial-temporal consistency of motion information, rapidly identifying regions of interest in as little as 1-2 ms using temporal motion cues derived from the temporal information embedded in the two-dimensional floating-gate synaptic transistors. The visual input can thus be selectively filtered to achieve faster velocity calculation and execution of various tasks. At the hardware level, owing to the atomically sharp interfaces between distinct functional layers in two-dimensional van der Waals heterostructures, the synaptic transistor offers a high-frequency response (~100 μs), robust non-volatility (>10,000 s), and excellent endurance (>8,000 cycles), enabling robust visual processing.

In software benchmarks, our system outperforms state-of-the-art algorithms with a 400% speedup, frequently surpassing human-level performance while maintaining or enhancing accuracy by utilizing the temporal priors provided by the embedded temporal information.
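The region-of-interest gating described in the abstract can be sketched in software: a temporal motion cue flags pixels that recently changed, and a dense flow estimator then runs only inside that region. This is a minimal illustrative sketch, not the paper's implementation — simple frame differencing stands in for the hardware-encoded temporal information, and the function names and thresholding scheme are assumptions.

```python
import numpy as np

def temporal_roi_mask(prev_frame, curr_frame, threshold=0.1):
    """Flag pixels with recent motion via frame differencing.

    Stand-in for the temporal cue the paper encodes in the synaptic
    transistor array; the relative threshold is an illustrative choice.
    """
    diff = np.abs(curr_frame.astype(float) - prev_frame.astype(float))
    if diff.max() == 0:
        return np.zeros(diff.shape, dtype=bool)
    return diff > threshold * diff.max()

def gated_flow(prev_frame, curr_frame, flow_fn):
    """Run a dense flow estimator only on the bounding box of the motion ROI.

    flow_fn(prev_patch, curr_patch) must return an (H, W, 2) flow field
    for the patch. Pixels outside the ROI get zero flow, which is where
    the speedup over full-frame flow computation comes from.
    """
    mask = temporal_roi_mask(prev_frame, curr_frame)
    flow = np.zeros(prev_frame.shape + (2,))
    if mask.any():
        ys, xs = np.nonzero(mask)
        y0, y1 = ys.min(), ys.max() + 1
        x0, x1 = xs.min(), xs.max() + 1
        flow[y0:y1, x0:x1] = flow_fn(prev_frame[y0:y1, x0:x1],
                                     curr_frame[y0:y1, x0:x1])
    return flow
```

Any dense estimator (e.g. a Farneback- or RAFT-style method) could be plugged in as `flow_fn`; the point is only that the temporal cue restricts the expensive spatial computation to a small region, mirroring the 1-2 ms ROI identification the paper reports.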
DOI: 10.48550/arxiv.2409.15345
Publication date: 2024-09-10
Rights: http://creativecommons.org/licenses/by/4.0 (open access)
Source: arXiv.org
Subjects: Computer Science - Computer Vision and Pattern Recognition; Computer Science - Robotics