Efficient spatio-temporal feature clustering for large event-based datasets

Event-based cameras encode changes in a visual scene with high temporal precision and low power consumption, generating millions of events per second in the process. Current event-based processing algorithms do not scale well in terms of runtime and computational resources when applied to a large amount of data. This problem is further exacerbated by the development of high spatial resolution vision sensors. We introduce a fast and computationally efficient clustering algorithm that is particularly designed for dealing with large event-based datasets. The approach is based on the expectation-maximization (EM) algorithm and relies on a stochastic approximation of the E-step over a truncated space to reduce the computational burden and speed up the learning process. We evaluate the quality, complexity, and stability of the clustering algorithm on a variety of large event-based datasets, and then validate our approach with a classification task. The proposed algorithm is significantly faster than standard k-means and reduces computational demands by two to three orders of magnitude while being more stable, interpretable, and close to the state of the art in terms of classification accuracy.
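The abstract's core idea — EM-based clustering whose E-step is stochastically approximated over a truncated space — can be sketched in a few lines. The following is a toy illustration of that general technique, not the authors' implementation: the function name, the isotropic-Gaussian responsibility model, the mini-batch sampling, and all parameter values are illustrative assumptions.

```python
import numpy as np

def truncated_em(X, n_clusters=8, n_trunc=3, batch=1024, iters=20, seed=0):
    """Toy EM for isotropic-Gaussian clustering with a stochastic,
    truncated E-step (illustrative sketch only): each iteration uses a
    random mini-batch of points, and responsibilities are computed only
    over each point's n_trunc nearest cluster centers; all other
    responsibilities are fixed to zero."""
    rng = np.random.default_rng(seed)
    # initialize centers from random data points
    mu = X[rng.choice(len(X), n_clusters, replace=False)].astype(float)
    for _ in range(iters):
        B = X[rng.choice(len(X), min(batch, len(X)), replace=False)]
        d2 = ((B[:, None, :] - mu[None, :, :]) ** 2).sum(-1)  # (batch, K)
        near = np.argsort(d2, axis=1)[:, :n_trunc]            # truncated support
        rows = np.arange(len(B))[:, None]
        # softmax responsibilities restricted to the truncated set
        w = np.exp(-0.5 * (d2[rows, near] - d2[rows, near].min(axis=1, keepdims=True)))
        r = np.zeros_like(d2)
        r[rows, near] = w / w.sum(axis=1, keepdims=True)
        # M-step: responsibility-weighted mean update (skip empty clusters)
        mass = r.sum(axis=0)
        upd = mass > 0
        mu[upd] = (r.T @ B)[upd] / mass[upd, None]
    return mu
```

The point of the truncation is that the per-point E-step cost scales with `n_trunc` rather than with the total number of clusters, and the mini-batch sampling decouples the per-iteration cost from the dataset size — the two properties the abstract credits for the reported speed-up over standard k-means.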

Bibliographic details
Published in: Neuromorphic computing and engineering, 2022-12, Vol. 2 (4), p. 44004
Main authors: Oubari, Omar; Exarchakis, Georgios; Lenz, Gregor; Benosman, Ryad; Ieng, Sio-Hoi
Format: Article
Language: English
Online access: Full text
DOI: 10.1088/2634-4386/ac970d
ISSN: 2634-4386
Record ID: cdi_hal_primary_oai_HAL_hal_04191950v1
Sources: DOAJ Directory of Open Access Journals; Institute of Physics Open Access Journal Titles; EZB-FREE-00999 freely available EZB journals
Subjects: asynchronous vision; clusterings; Computer Science; Computer Vision and Pattern Recognition; event-based processing; feature extraction; Gaussian mixture model