Do Lateral Views Help Automated Chest X-ray Predictions?

Most convolutional neural networks in chest radiology use only the frontal posteroanterior (PA) view to make a prediction. However, the lateral view is known to help the diagnosis of certain diseases and conditions. The recently released PadChest dataset contains paired PA and lateral views, allowing us to study for which diseases and conditions the performance of a neural network improves when provided a lateral X-ray view instead of a frontal PA view. Using a simple DenseNet model, we find that using the lateral view increases the AUC for 8 of the 56 labels in our data and achieves the same performance as the PA view for 21 of the labels. We find that using the PA and lateral views jointly does not trivially lead to an increase in performance, but we suggest further investigation.
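The per-label comparison the abstract describes (AUC of a model given the PA view versus the lateral view) can be sketched as follows. This is a minimal illustration, not the authors' code: `auc` is a plain rank-based (Mann-Whitney) implementation, and the labels and scores are made-up numbers standing in for one disease label.

```python
# Minimal sketch: compare per-label AUC for a classifier given the
# PA view vs. one given the lateral view. Illustrative data only.

def auc(y_true, y_score):
    """Rank-based AUC: fraction of positive/negative pairs ranked correctly,
    counting ties as half (equivalent to Mann-Whitney U / (n_pos * n_neg))."""
    pos = [s for s, t in zip(y_score, y_true) if t == 1]
    neg = [s for s, t in zip(y_score, y_true) if t == 0]
    if not pos or not neg:
        return float("nan")  # AUC undefined without both classes
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Hypothetical ground truth and model scores for a single label.
y_true = [1, 0, 1, 1, 0, 0, 1, 0]
scores_pa = [0.9, 0.2, 0.35, 0.8, 0.3, 0.4, 0.7, 0.1]   # PA-view model
scores_lat = [0.7, 0.1, 0.9, 0.9, 0.2, 0.3, 0.8, 0.4]   # lateral-view model

auc_pa = auc(y_true, scores_pa)
auc_lat = auc(y_true, scores_lat)
print(f"PA AUC: {auc_pa:.3f}  lateral AUC: {auc_lat:.3f}")
```

Repeating this per label over the 56 PadChest labels is how one would tabulate for which conditions the lateral view helps.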


Bibliographic Details
Main Authors: Bertrand, Hadrien; Hashir, Mohammad; Cohen, Joseph Paul
Format: Article
Language: English
Online access: Order full text
Description: Most convolutional neural networks in chest radiology use only the frontal posteroanterior (PA) view to make a prediction. However, the lateral view is known to help the diagnosis of certain diseases and conditions. The recently released PadChest dataset contains paired PA and lateral views, allowing us to study for which diseases and conditions the performance of a neural network improves when provided a lateral X-ray view instead of a frontal PA view. Using a simple DenseNet model, we find that using the lateral view increases the AUC for 8 of the 56 labels in our data and achieves the same performance as the PA view for 21 of the labels. We find that using the PA and lateral views jointly does not trivially lead to an increase in performance, but we suggest further investigation.
DOI: 10.48550/arxiv.1904.08534
Published: 2019-04-17
Rights: http://arxiv.org/licenses/nonexclusive-distrib/1.0 (open access)
Source: arXiv.org
Subjects: Computer Science - Computer Vision and Pattern Recognition; Computer Science - Learning