Conjugated-Point Positioning Method Based on Multiview Images
The performance of the direct georeferencing method depends on the accuracy of the position and orientation data. The accuracy of the attitude data has a particularly significant impact on the positioning results. This paper proposes an object point positioning method that does not require attitude information.
Saved in:
Published in: | IEEE Transactions on Geoscience and Remote Sensing, 2023-01, Vol. 61, p. 1-1 |
---|---|
Main authors: | Liu, Jianchen; Huang, Weixiang; Guo, Bingxuan |
Format: | Article |
Language: | eng |
Keywords: | |
Online access: | Order full text
container_end_page | 1 |
---|---|
container_issue | |
container_start_page | 1 |
container_title | IEEE transactions on geoscience and remote sensing |
container_volume | 61 |
creator | Liu, Jianchen; Huang, Weixiang; Guo, Bingxuan |
description | The performance of the direct georeferencing method depends on the accuracy of the position and orientation data, and the accuracy of the attitude data has a particularly significant impact on the positioning results. This paper proposes an object point positioning method that does not require attitude information; the proposed algorithm needs only the position of the perspective center. The basic principle is that an error equation can be established for each image from two visible object points and the image's perspective center: the angle between the rays connecting these points to the center is the observation, and it can be calculated from the image points and the camera parameters. In contrast to the spatial intersection method, this method does not need attitude information to solve for the coordinates of object points. The main factors affecting the accuracy are analyzed through simulation experiments. In an experiment with a DJI Phantom 4 RTK unmanned aerial vehicle, the accuracy of positioning four points reaches 0.2 m at a flight altitude of 60 m and a flight radius of 40 m; in the experiment positioning N object points, the accuracy reaches 0.105 m. The proposed method achieves the same positioning accuracy as spatial intersection. |
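The description above states the observation model but not the estimation details, so the following is a minimal Python sketch of the idea as described: the angle between the rays to two object points is invariant to camera rotation, so it can be computed from image coordinates and camera intrinsics alone and then used as the observation in a least-squares solve for the object coordinates. Everything specific here is an illustrative assumption rather than the paper's implementation: the pinhole parameters fx, fy, cx, cy, the simulated circular flight, and the use of scipy.optimize.least_squares.

```python
import numpy as np
from scipy.optimize import least_squares

def ray_direction(uv, fx, fy, cx, cy):
    # Unit ray from the perspective center through an image point,
    # in the camera frame (ideal pinhole model, no lens distortion).
    d = np.array([(uv[0] - cx) / fx, (uv[1] - cy) / fy, 1.0])
    return d / np.linalg.norm(d)

def observed_angle(uv1, uv2, fx, fy, cx, cy):
    # The angle between two rays is unchanged by any camera rotation,
    # so it can be measured without any attitude information.
    d1 = ray_direction(uv1, fx, fy, cx, cy)
    d2 = ray_direction(uv2, fx, fy, cx, cy)
    return float(np.arccos(np.clip(d1 @ d2, -1.0, 1.0)))

def residuals(pq, centers, angles):
    # One error equation per image: the angle subtended at the known
    # perspective center C by the unknown object points P and Q must
    # match the angle observed in that image.
    P, Q = pq[:3], pq[3:]
    res = []
    for C, ang in zip(centers, angles):
        u, v = P - C, Q - C
        cos = u @ v / (np.linalg.norm(u) * np.linalg.norm(v))
        res.append(np.arccos(np.clip(cos, -1.0, 1.0)) - ang)
    return np.asarray(res)

# Toy setup mimicking the reported flight: perspective centers (e.g. from
# RTK-GNSS) on a 40 m circle at 60 m altitude, two ground points P and Q.
centers = [np.array([40.0 * np.cos(t), 40.0 * np.sin(t), 60.0])
           for t in np.linspace(0.0, 2.0 * np.pi, 8, endpoint=False)]
P_true, Q_true = np.array([5.0, 3.0, 0.0]), np.array([-4.0, 6.0, 1.0])
# Noise-free angles synthesized from the true geometry; in a real run they
# would instead come from observed_angle() applied to matched image points.
angles = list(residuals(np.r_[P_true, Q_true], centers, np.zeros(8)))
x0 = np.array([1.0, 0.0, 0.0, 0.0, 1.0, 0.0])  # rough ground-side guess
sol = least_squares(residuals, x0, args=(centers, angles))
print(sol.x.reshape(2, 3))  # recovered coordinates of P and Q
```

One caveat worth noting in this toy configuration: because all centers lie in a single horizontal plane, the mirror image of P and Q above that plane satisfies the same angles, so a rough initial guess on the ground side is what selects the intended solution.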
doi_str_mv | 10.1109/TGRS.2023.3247919 |
format | Article |
fulltext | fulltext_linktorsrc |
identifier | ISSN: 0196-2892 |
ispartof | IEEE transactions on geoscience and remote sensing, 2023-01, Vol.61, p.1-1 |
issn | 0196-2892 1558-0644 |
language | eng |
recordid | cdi_proquest_journals_2785446340 |
source | IEEE Electronic Library (IEL) |
subjects | Accuracy; Algorithms; Attitudes; Autonomous aerial vehicles; Conjugated-Point Positioning; Connecting; Flight; Flight altitude; Geometry; Global Positioning System; Image contrast; Image Positioning; Kinematics; Mathematical analysis; Mathematical models; Measurement units; Methods; Oblique Images; Real-time systems; Sensors; Unmanned aerial vehicles |
title | Conjugated-Point Positioning Method Based on Multiview Images |
url | https://sfx.bib-bvb.de/sfx_tum?ctx_ver=Z39.88-2004&ctx_enc=info:ofi/enc:UTF-8&ctx_tim=2025-02-02T22%3A31%3A33IST&url_ver=Z39.88-2004&url_ctx_fmt=infofi/fmt:kev:mtx:ctx&rfr_id=info:sid/primo.exlibrisgroup.com:primo3-Article-proquest_RIE&rft_val_fmt=info:ofi/fmt:kev:mtx:journal&rft.genre=article&rft.atitle=Conjugated-Point%20Positioning%20Method%20Based%20on%20Multiview%20Images&rft.jtitle=IEEE%20transactions%20on%20geoscience%20and%20remote%20sensing&rft.au=Liu,%20Jianchen&rft.date=2023-01-01&rft.volume=61&rft.spage=1&rft.epage=1&rft.pages=1-1&rft.issn=0196-2892&rft.eissn=1558-0644&rft.coden=IGRSD2&rft_id=info:doi/10.1109/TGRS.2023.3247919&rft_dat=%3Cproquest_RIE%3E2785446340%3C/proquest_RIE%3E%3Curl%3E%3C/url%3E&disable_directlink=true&sfx.directlink=off&sfx.report_link=0&rft_id=info:oai/&rft_pqid=2785446340&rft_id=info:pmid/&rft_ieee_id=10054602&rfr_iscdi=true |