An observation support system with an adaptive ontology-driven user interface for the modeling of complex behaviors during surgical interventions
The field of surgical interventions emphasizes knowledge and experience; explicit and detailed models of surgical processes are hard to obtain by observation or measurement. However, in medical engineering and related developments, such models are highly valuable. Surgical process modeling deals with the generation of complex process descriptions by observation.
Saved in:
Published in: | Behavior research methods 2010-11, Vol.42 (4), p.1049-1058 |
Main authors: | Neumuth, T.; Kaschek, B.; Neumuth, D.; Ceschia, M.; Meixensberger, J.; Strauss, G.; Burgert, O. |
Format: | Article |
Language: | eng |
Subjects: | |
Online access: | Full text |
container_end_page | 1058 |
container_issue | 4 |
container_start_page | 1049 |
container_title | Behavior research methods |
container_volume | 42 |
creator | Neumuth, T.; Kaschek, B.; Neumuth, D.; Ceschia, M.; Meixensberger, J.; Strauss, G.; Burgert, O. |
description | The field of surgical interventions emphasizes knowledge and experience; explicit and detailed models of surgical processes are hard to obtain by observation or measurement. However, in medical engineering and related developments, such models are highly valuable. Surgical process modeling deals with the generation of complex process descriptions by observation. This places high demands on the observers, who have to use a sizable terminology to denominate surgical actions, instruments, and patient anatomies, and to describe processes unambiguously. Here, we present a novel method employing an ontology-based user interface that adapts to the actual situation, and we describe the principles of the system. A validation study showed that this method enables observers with little recording experience to reach a recording accuracy of >90%. Furthermore, this method can be used for live and video observation. We conclude that the method of ontology-supported recording for complex behaviors can be advantageously employed when surgical processes are modeled. |
doi_str_mv | 10.3758/BRM.42.4.1049 |
format | Article |
fullrecord | Article record (ProQuest / CrossRef; record ID cdi_proquest_miscellaneous_910649975). ISSN: 1554-351X; EISSN: 1554-3528; DOI: 10.3758/BRM.42.4.1049; PMID: 21139172; Publisher: Springer-Verlag, New York; Rights: Psychonomic Society, Inc. 2010; peer reviewed; free to read. The title, author list, abstract, and subject headings in the raw record duplicate the fields shown elsewhere in this entry. |
fulltext | fulltext |
identifier | ISSN: 1554-351X |
ispartof | Behavior research methods, 2010-11, Vol.42 (4), p.1049-1058 |
issn | 1554-351X 1554-3528 |
language | eng |
recordid | cdi_proquest_miscellaneous_910649975 |
source | MEDLINE; SpringerLink Journals; Elektronische Zeitschriftenbibliothek - Frei zugängliche E-Journals |
subjects | Accuracy; Behavior; Behavioral Science and Psychology; Cognitive Psychology; Computer science; Humans; Models, Anatomic; Ontology; Otolaryngology; Psychology; Simulation; Software; Surgery; User interface; User-Computer Interface; Validation studies |
title | An observation support system with an adaptive ontology-driven user interface for the modeling of complex behaviors during surgical interventions |
url | https://sfx.bib-bvb.de/sfx_tum?ctx_ver=Z39.88-2004&ctx_enc=info:ofi/enc:UTF-8&ctx_tim=2025-02-10T01%3A41%3A57IST&url_ver=Z39.88-2004&url_ctx_fmt=infofi/fmt:kev:mtx:ctx&rfr_id=info:sid/primo.exlibrisgroup.com:primo3-Article-proquest_cross&rft_val_fmt=info:ofi/fmt:kev:mtx:journal&rft.genre=article&rft.atitle=An%20observation%20support%20system%20with%20an%20adaptive%20ontology-driven%20user%20interface%20for%20the%20modeling%20of%20complex%20behaviors%20during%20surgical%20interventions&rft.jtitle=Behavior%20research%20methods&rft.au=Neumuth,%20T.&rft.date=2010-11-01&rft.volume=42&rft.issue=4&rft.spage=1049&rft.epage=1058&rft.pages=1049-1058&rft.issn=1554-351X&rft.eissn=1554-3528&rft_id=info:doi/10.3758/BRM.42.4.1049&rft_dat=%3Cproquest_cross%3E910649975%3C/proquest_cross%3E%3Curl%3E%3C/url%3E&disable_directlink=true&sfx.directlink=off&sfx.report_link=0&rft_id=info:oai/&rft_pqid=852754035&rft_id=info:pmid/21139172&rfr_iscdi=true |