Active query sensing: Suggesting the best query view for mobile visual search
Saved in:
Published in: | ACM Transactions on Multimedia Computing, Communications, and Applications, 2012-09, Vol. 8 (3s), p. 1-21 |
---|---|
Main authors: | Ji, Rongrong; Yu, Felix X.; Zhang, Tongtao; Chang, Shih-Fu |
Format: | Article |
Language: | English |
Subjects: | Detection |
Online access: | Full text |
container_end_page | 21 |
---|---|
container_issue | 3s |
container_start_page | 1 |
container_title | ACM transactions on multimedia computing communications and applications |
container_volume | 8 |
creator | Ji, Rongrong; Yu, Felix X.; Zhang, Tongtao; Chang, Shih-Fu |
description | While much exciting progress is being made in mobile visual search, one important question has been left unexplored in all current systems. When searching for objects or scenes in the 3D world, which viewing angle is more likely to be successful? More particularly, if the first query fails to find the right target, how should the user control the mobile camera to form the second query? In this article, we propose a novel Active Query Sensing system for mobile location search, which actively suggests the best subsequent query view for recognizing the physical location in the mobile environment. The proposed system includes two unique components: (1) an offline process for analyzing the saliencies of the different views associated with each geographical location, which predicts the location search precision of individual views by modeling their self-retrieval score distributions; and (2) an online process for estimating the view of an unseen query and suggesting the best subsequent view change. Specifically, the optimal viewing-angle change for the next query is formulated as an online information-theoretic selection problem. Using a scalable visual search system implemented over a NYC street view dataset (0.3 million images), we show a performance gain by reducing the failure rate of mobile location search to only 12% after the second query. We have also implemented an end-to-end functional system, including user interfaces on iPhones, client-server communication, and a remote search server. This work may open up an exciting new direction for developing interactive mobile media applications through the innovative exploitation of active sensing and query formulation. |
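The abstract describes the online step as an information-theoretic choice of the next viewing angle. The paper itself gives the formal treatment; the following is only a minimal, hypothetical sketch of what such a criterion could look like, assuming a discrete posterior over candidate locations after the first query and offline per-view self-retrieval precision scores (all names, the entropy criterion, and the array shapes are our own illustrative assumptions, not the authors' implementation).

```python
import numpy as np

def suggest_next_view(posterior, view_precision, current_view):
    """Pick the view index whose hypothetical second query is expected to
    leave the least uncertainty about the true location.

    posterior      : (L,) probabilities over candidate locations after query 1
    view_precision : (L, V) offline-estimated self-retrieval precision of each
                     of the V discretized views for each of the L locations
    current_view   : index of the (failed) first query's view
    """
    num_views = view_precision.shape[1]
    expected_entropy = np.full(num_views, np.inf)
    for v in range(num_views):
        if v == current_view:  # don't suggest repeating the failed view
            continue
        # Hypothetical posterior if the user re-queries from view v: locations
        # for which view v is distinctive (high offline precision) are reinforced.
        p = posterior * view_precision[:, v]
        p = p / (p.sum() + 1e-12)
        # Shannon entropy of the hypothetical updated posterior.
        expected_entropy[v] = -np.sum(p * np.log(p + 1e-12))
    best = int(np.argmin(expected_entropy))
    # Return the suggested view and the angle change expressed in view steps.
    return best, (best - current_view) % num_views

# Toy usage: 3 candidate locations, 8 discretized views, first query from view 2.
rng = np.random.default_rng(0)
posterior = np.array([0.5, 0.3, 0.2])
view_precision = rng.uniform(0.1, 0.9, size=(3, 8))
print(suggest_next_view(posterior, view_precision, current_view=2))
```

The design choice illustrated here, minimizing the expected entropy of the location posterior, is one standard way to realize an "active sensing" criterion; the actual scoring and view modeling used in the paper are detailed in the full text.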
doi_str_mv | 10.1145/2348816.2348819 |
format | Article |
fulltext | fulltext |
identifier | ISSN: 1551-6857 |
ispartof | ACM transactions on multimedia computing communications and applications, 2012-09, Vol.8 (3s), p.1-21 |
issn | 1551-6857 (ISSN); 1551-6865 (EISSN) |
language | eng |
recordid | cdi_proquest_miscellaneous_1506371596 |
source | ACM Digital Library Complete |
subjects | Detection |
title | Active query sensing: Suggesting the best query view for mobile visual search |
url | https://sfx.bib-bvb.de/sfx_tum?ctx_ver=Z39.88-2004&ctx_enc=info:ofi/enc:UTF-8&ctx_tim=2025-01-01T09%3A18%3A46IST&url_ver=Z39.88-2004&url_ctx_fmt=infofi/fmt:kev:mtx:ctx&rfr_id=info:sid/primo.exlibrisgroup.com:primo3-Article-proquest_cross&rft_val_fmt=info:ofi/fmt:kev:mtx:journal&rft.genre=article&rft.atitle=Active%20query%20sensing:%20Suggesting%20the%20best%20query%20view%20for%20mobile%20visual%20search&rft.jtitle=ACM%20transactions%20on%20multimedia%20computing%20communications%20and%20applications&rft.au=Ji,%20Rongrong&rft.date=2012-09&rft.volume=8&rft.issue=3s&rft.spage=1&rft.epage=21&rft.pages=1-21&rft.issn=1551-6857&rft.eissn=1551-6865&rft_id=info:doi/10.1145/2348816.2348819&rft_dat=%3Cproquest_cross%3E1506371596%3C/proquest_cross%3E%3Curl%3E%3C/url%3E&disable_directlink=true&sfx.directlink=off&sfx.report_link=0&rft_id=info:oai/&rft_pqid=1506371596&rft_id=info:pmid/&rfr_iscdi=true |