The TUM Kitchen Data Set of everyday manipulation activities for motion tracking and action recognition

We introduce the publicly available TUM Kitchen Data Set as a comprehensive collection of activity sequences recorded in a kitchen environment equipped with multiple complementary sensors. The recorded data consists of observations of naturally performed manipulation tasks as encountered in everyday activities of human life. Several instances of a table-setting task were performed by different subjects, involving the manipulation of objects and the environment. We provide the original video sequences, full-body motion capture data recorded by a markerless motion tracker, RFID tag readings and magnetic sensor readings from objects and the environment, as well as corresponding action labels. In this paper, we both describe how the data was computed, in particular the motion tracker and the labeling, and give examples of what it can be used for. We present first results of an automatic method for segmenting the observed motions into semantic classes, and describe how the data can be integrated in a knowledge-based framework for reasoning about the observations.
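
As an illustration of how the dataset's per-frame motion data and action labels might be combined, here is a minimal Python sketch that aligns pose and label files by frame index. The file names, CSV layouts, and label format are assumptions for illustration, not the dataset's documented schema.

    import csv

    # Hypothetical file names and layouts: the actual TUM Kitchen Data Set
    # release may differ. This only illustrates aligning per-frame pose data
    # with interval-based action labels.

    def load_poses(path):
        # Assumed layout: frame index in column 0, joint values after it.
        poses = {}
        with open(path, newline="") as f:
            for row in csv.reader(f):
                poses[int(row[0])] = [float(v) for v in row[1:]]
        return poses

    def load_labels(path):
        # Assumed layout: start_frame,end_frame,label per row.
        with open(path, newline="") as f:
            return [(int(s), int(e), lab) for s, e, lab in csv.reader(f)]

    def label_for_frame(labels, frame):
        # Return the action label whose interval covers the frame, else None.
        for start, end, label in labels:
            if start <= frame <= end:
                return label
        return None

    poses = load_poses("episode-1-0/poses.csv")     # assumed path
    labels = load_labels("episode-1-0/labels.csv")  # assumed path
    for frame in sorted(poses):
        print(frame, label_for_frame(labels, frame))

Keeping labels as (start_frame, end_frame, label) intervals mirrors the segment-level action annotation described in the abstract.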

Bibliographic Details
Main Authors: Tenorth, Moritz; Bandouch, Jan; Beetz, Michael
Format: Conference Proceeding
Language: English
container_start_page 1089
container_end_page 1096
container_title 2009 IEEE 12th International Conference on Computer Vision Workshops, ICCV Workshops
creator Tenorth, Moritz ; Bandouch, Jan ; Beetz, Michael
description We introduce the publicly available TUM Kitchen Data Set as a comprehensive collection of activity sequences recorded in a kitchen environment equipped with multiple complementary sensors. The recorded data consists of observations of naturally performed manipulation tasks as encountered in everyday activities of human life. Several instances of a table-setting task were performed by different subjects, involving the manipulation of objects and the environment. We provide the original video sequences, full-body motion capture data recorded by a markerless motion tracker, RFID tag readings and magnetic sensor readings from objects and the environment, as well as corresponding action labels. In this paper, we both describe how the data was computed, in particular the motion tracker and the labeling, and give examples of what it can be used for. We present first results of an automatic method for segmenting the observed motions into semantic classes, and describe how the data can be integrated in a knowledge-based framework for reasoning about the observations.
doi_str_mv 10.1109/ICCVW.2009.5457583
format Conference Proceeding
identifier ISBN: 142444442X; ISBN: 9781424444427; EISBN: 1424444411; EISBN: 9781424444410; LCCN: 2009903045; DOI: 10.1109/ICCVW.2009.5457583
ispartof 2009 IEEE 12th International Conference on Computer Vision Workshops, ICCV Workshops, 2009, p.1089-1096
language eng
recordid cdi_ieee_primary_5457583
source IEEE Electronic Library (IEL) Conference Proceedings
subjects Computer vision
Conferences
Humans
Intelligent sensors
Labeling
Magnetic sensors
Motion segmentation
RFID tags
Tracking
Video sequences
title The TUM Kitchen Data Set of everyday manipulation activities for motion tracking and action recognition
url https://sfx.bib-bvb.de/sfx_tum?ctx_ver=Z39.88-2004&ctx_enc=info:ofi/enc:UTF-8&ctx_tim=2025-01-20T05%3A10%3A36IST&url_ver=Z39.88-2004&url_ctx_fmt=infofi/fmt:kev:mtx:ctx&rfr_id=info:sid/primo.exlibrisgroup.com:primo3-Article-ieee_6IE&rft_val_fmt=info:ofi/fmt:kev:mtx:book&rft.genre=proceeding&rft.atitle=The%20TUM%20Kitchen%20Data%20Set%20of%20everyday%20manipulation%20activities%20for%20motion%20tracking%20and%20action%20recognition&rft.btitle=2009%20IEEE%2012th%20International%20Conference%20on%20Computer%20Vision%20Workshops,%20ICCV%20Workshops&rft.au=Tenorth,%20Moritz&rft.date=2009-09&rft.spage=1089&rft.epage=1096&rft.pages=1089-1096&rft.isbn=142444442X&rft.isbn_list=9781424444427&rft_id=info:doi/10.1109/ICCVW.2009.5457583&rft_dat=%3Cieee_6IE%3E5457583%3C/ieee_6IE%3E%3Curl%3E%3C/url%3E&rft.eisbn=1424444411&rft.eisbn_list=9781424444410&disable_directlink=true&sfx.directlink=off&sfx.report_link=0&rft_id=info:oai/&rft_id=info:pmid/&rft_ieee_id=5457583&rfr_iscdi=true
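
The url field above is an OpenURL (Z39.88-2004) resolver link. As a minimal sketch, assuming only the Python standard library, the snippet below rebuilds a simplified version of such a link from this record's fields; a real resolver query carries additional context keys beyond those shown.

    from urllib.parse import urlencode

    # Simplified OpenURL key/value ContextObject assembled from this record.
    fields = {
        "url_ver": "Z39.88-2004",
        "rft_val_fmt": "info:ofi/fmt:kev:mtx:book",
        "rft.genre": "proceeding",
        "rft.atitle": "The TUM Kitchen Data Set of everyday manipulation "
                      "activities for motion tracking and action recognition",
        "rft.btitle": "2009 IEEE 12th International Conference on Computer "
                      "Vision Workshops, ICCV Workshops",
        "rft.au": "Tenorth, Moritz",
        "rft.date": "2009-09",
        "rft.spage": "1089",
        "rft.epage": "1096",
        "rft.isbn": "142444442X",
        "rft_id": "info:doi/10.1109/ICCVW.2009.5457583",
    }
    print("https://sfx.bib-bvb.de/sfx_tum?" + urlencode(fields))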