A Deep Learning Approach for Managing Medical Consumable Materials in Intensive Care Units via Convolutional Neural Networks: Technical Proof-of-Concept Study
Published in: JMIR medical informatics, 2019-10, Vol. 7 (4), p. e14806-e14806
Main authors: Peine, Arne; Hallawa, Ahmed; Schöffski, Oliver; Dartmann, Guido; Fazlic, Lejla Begic; Schmeink, Anke; Marx, Gernot; Martin, Lukas
Format: Article
Language: English
Subjects: Deep learning; Intensive care; Neural networks; Software; Wireless networks
Online access: Full text
Abstract:

Background: High numbers of consumable medical materials (eg, sterile needles and swabs) are used during the daily routine of intensive care units (ICUs) worldwide. Although medical consumables contribute substantially to total ICU hospital expenditure, many hospitals do not track the use of individual materials. Current tracking solutions that meet the specific requirements of the medical environment, such as barcodes or radio frequency identification, require specialized material preparation and high infrastructure investment. This impedes accurate prediction of consumption, leads to high storage maintenance costs caused by large inventories, and hinders scientific work due to inaccurate documentation. Thus, new cost-effective and contactless methods for object detection are urgently needed.
Objective: The goal of this work was to develop and evaluate a contactless visual recognition system for tracking medical consumable materials in ICUs, using a deep learning approach on a distributed client-server architecture.
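The record does not describe how the detection units communicate with the server; the sketch below is only an illustration of how a single-board detection unit might hand a captured frame to a recognition server over HTTP. The server address, route name, and response field are placeholders, not details from the paper.

```python
# Illustrative client-side sketch only: SERVER_URL, the /classify route, and the
# "label" response field are assumptions; the paper does not specify its protocol.
import requests

SERVER_URL = "http://recognition-server.local:5000/classify"  # placeholder address


def classify_frame(image_path: str) -> str:
    """Send one captured frame from the detection unit to the recognition server."""
    with open(image_path, "rb") as f:
        response = requests.post(SERVER_URL, files={"image": f}, timeout=5)
    response.raise_for_status()
    return response.json()["label"]  # hypothetical response schema


if __name__ == "__main__":
    print(classify_frame("frame.jpg"))
```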
Methods: We developed Consumabot, a novel client-server optical recognition system for medical consumables, based on the convolutional neural network model MobileNet implemented in TensorFlow. The software was designed to run on single-board computer platforms as a detection unit. The system was trained to recognize 20 different materials in the ICU, with 100 sample images provided for each consumable material. We assessed top-1 recognition rates in the context of different real-world ICU settings: materials presented to the system without visual obstruction, materials that were 50% covered, and scenarios with multiple items. We further performed a repeated-measures analysis of variance to quantify the effect of adverse real-world circumstances.
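As an illustration of the kind of retraining described above, a minimal transfer-learning sketch with a pretrained MobileNet in TensorFlow/Keras is shown below. The directory name, image size, batch size, and number of epochs are assumptions; only the number of classes (20) and the use of cross entropy follow from the abstract, and this is not the authors' exact pipeline.

```python
# Minimal transfer-learning sketch (illustrative; directory layout, image size,
# batch size, and epochs are assumptions, not taken from the paper).
import tensorflow as tf

NUM_CLASSES = 20        # the paper reports 20 consumable classes
IMG_SIZE = (224, 224)   # common MobileNet input size (assumption)

# One folder per consumable class; "consumables/" is a placeholder path.
train_ds = tf.keras.utils.image_dataset_from_directory(
    "consumables/", image_size=IMG_SIZE, batch_size=32)

# MobileNet pretrained on ImageNet, used as a frozen feature extractor.
base = tf.keras.applications.MobileNet(
    input_shape=IMG_SIZE + (3,), include_top=False,
    weights="imagenet", pooling="avg")
base.trainable = False

model = tf.keras.Sequential([
    tf.keras.layers.Rescaling(1.0 / 127.5, offset=-1),  # MobileNet expects [-1, 1]
    base,
    tf.keras.layers.Dense(NUM_CLASSES, activation="softmax"),
])

model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",  # cross entropy, as monitored in the paper
              metrics=["accuracy"])
model.fit(train_ds, epochs=10)
```

MobileNet is a lightweight architecture intended for mobile and embedded devices, which is consistent with running the detection unit on a single-board computer.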
Results: Consumabot reached >99% recognition reliability after about 60 steps of training and 150 steps of validation. A desirably low cross entropy of <0.03 was reached for the training set after about 100 iteration steps and for the validation set after 170 steps. In a real-world scenario, the system showed a high top-1 mean recognition accuracy of 0.85 (SD 0.11) for objects presented without visual obstruction. Recognition accuracy was lower, but still acceptable, in scenarios where the objects were 50% covered (P<.001; mean recognition accuracy 0.71; SD 0.13) or where multiple objects of the target group were present (P=.01; mean recognition accuracy 0.78; SD 0.11), compared to an unobstructed view. The approach met the criterion of requiring no explicit labeling (eg, barcodes, radio frequency labeling) while maintaining a high standard for quality and hygiene with minimal consumption of resources (eg, cost, time, training, and computational power).

Conclusions: Using a convolutional neural network architecture, Consumabot consistently achieved good results in the classification of consumables and is thus a feasible way to recognize and register medical consumables directly to a hospital's electronic health record. The system shows limitations when materials are partially covered, because identifying characteristics of the consumables are then hidden from the system. Further assessment in different medical settings is needed.
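The repeated-measures analysis of variance reported above could, for example, be reproduced from per-item accuracies with statsmodels; the file and column names below are hypothetical, since the raw data are not part of this record.

```python
# Hypothetical data layout: one row per consumable item and viewing condition
# with its top-1 recognition accuracy; file and column names are placeholders.
import pandas as pd
from statsmodels.stats.anova import AnovaRM

df = pd.read_csv("recognition_results.csv")  # columns: item, condition, accuracy

# Mean and SD of top-1 accuracy per condition
# (unobstructed, 50% covered, multiple items).
print(df.groupby("condition")["accuracy"].agg(["mean", "std"]))

# Repeated-measures ANOVA: every item is measured under every condition.
res = AnovaRM(df, depvar="accuracy", subject="item", within=["condition"]).fit()
print(res)
```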
DOI: 10.2196/14806
ISSN: 2291-9694
EISSN: 2291-9694
PMID: 31603430
Publisher: JMIR Publications (Canada)