Evaluating a collection of Sound-Tracing Data of Melodic Phrases
Melodic contour, the ‘shape’ of a melody, is a common way to visualize and remember a musical piece. The purpose of this paper is to explore the building blocks of a future ‘gesture-based’ melody retrieval system. We present a dataset containing 16 melodic phrases from four musical styles and with a large range of contour variability. This is accompanied by full-body motion capture data of 26 participants performing sound-tracing to the melodies. The dataset is analyzed using canonical correlation analysis (CCA), and its neural network variant (Deep CCA), to understand how melodic contours and sound tracings relate to each other. The analyses reveal non-linear relationships between sound and motion. The link between pitch and verticality does not appear strong enough for complex melodies. We also find that descending melodic contours have the least correlation with tracing.
Saved in:
Main Authors: | Kelkar, Tejaswinee; Roy, Udit; Jensenius, Alexander Refsum |
---|---|
Format: | Book |
Language: | nor |
Online Access: | Order full text |
creator | Kelkar, Tejaswinee; Roy, Udit; Jensenius, Alexander Refsum |
description | Melodic contour, the ‘shape’ of a melody, is a common way to visualize and remember a musical piece. The purpose of this paper is to explore the building blocks of a future ‘gesture-based’ melody retrieval system. We present a dataset containing 16 melodic phrases from four musical styles and with a large range of contour variability. This is accompanied by full-body motion capture data of 26 participants performing sound-tracing to the melodies. The dataset is analyzed using canonical correlation analysis (CCA), and its neural network variant (Deep CCA), to understand how melodic contours and sound tracings relate to each other. The analyses reveal non-linear relationships between sound and motion. The link between pitch and verticality does not appear strong enough for complex melodies. We also find that descending melodic contours have the least correlation with tracing. |
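The analysis method named in the abstract, canonical correlation analysis (CCA), finds paired linear projections of two feature sets (here, melodic features and motion-capture features) that are maximally correlated. Below is a minimal numpy-only sketch on synthetic data; the feature names and dimensions are hypothetical and not taken from the actual dataset:

```python
# Hedged sketch of canonical correlation analysis (CCA), the method named in
# the abstract. All data below is synthetic; feature names are hypothetical.
import numpy as np

def cca_correlations(X, Y, reg=1e-8):
    """Return the canonical correlations between feature matrices X and Y."""
    X = X - X.mean(axis=0)
    Y = Y - Y.mean(axis=0)
    n = X.shape[0]
    Sxx = X.T @ X / (n - 1) + reg * np.eye(X.shape[1])
    Syy = Y.T @ Y / (n - 1) + reg * np.eye(Y.shape[1])
    Sxy = X.T @ Y / (n - 1)

    def inv_sqrt(S):
        # Inverse matrix square root via eigendecomposition (S is symmetric PD).
        w, V = np.linalg.eigh(S)
        return V @ np.diag(1.0 / np.sqrt(w)) @ V.T

    # Singular values of the whitened cross-covariance are the canonical
    # correlations, sorted in descending order.
    K = inv_sqrt(Sxx) @ Sxy @ inv_sqrt(Syy)
    return np.linalg.svd(K, compute_uv=False)

rng = np.random.default_rng(0)
latent = rng.normal(size=(300, 1))  # shared structure linking sound and motion
X = np.hstack([latent, rng.normal(size=(300, 2))])  # e.g. pitch-contour features
Y = np.hstack([0.8 * latent + 0.6 * rng.normal(size=(300, 1)),
               rng.normal(size=(300, 3))])          # e.g. motion-trace features
rho = cca_correlations(X, Y)  # rho[0] is large; the rest stay near zero
```

The Deep CCA variant used in the paper replaces these linear projections with neural-network transforms of each feature set before maximizing the same correlation objective, which is what allows it to pick up the non-linear sound-motion relationships the abstract reports.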
format | Book |
publisher | Institut de Recherche et Coordination Acoustique/Musique |
creationdate | 2018 |
isbn | 9782954035123; 2954035129 |
link | http://hdl.handle.net/10852/65560 |
fulltext | fulltext_linktorsrc |
identifier | ISBN: 9782954035123 |
language | nor |
recordid | cdi_cristin_nora_10852_65560 |
source | NORA - Norwegian Open Research Archives |
title | Evaluating a collection of Sound-Tracing Data of Melodic Phrases |
url | https://sfx.bib-bvb.de/sfx_tum?ctx_ver=Z39.88-2004&ctx_enc=info:ofi/enc:UTF-8&ctx_tim=2025-02-02T12%3A50%3A18IST&url_ver=Z39.88-2004&url_ctx_fmt=infofi/fmt:kev:mtx:ctx&rfr_id=info:sid/primo.exlibrisgroup.com:primo3-Article-cristin_3HK&rft_val_fmt=info:ofi/fmt:kev:mtx:book&rft.genre=book&rft.btitle=Evaluating%20a%20collection%20of%20Sound-Tracing%20Data%20of%20Melodic%20Phrases&rft.au=Kelkar,%20Tejaswinee&rft.date=2018&rft.isbn=9782954035123&rft.isbn_list=2954035129&rft_id=info:doi/&rft_dat=%3Ccristin_3HK%3E10852_65560%3C/cristin_3HK%3E%3Curl%3E%3C/url%3E&disable_directlink=true&sfx.directlink=off&sfx.report_link=0&rft_id=info:oai/&rft_id=info:pmid/&rfr_iscdi=true |