Developmental Stage Classification of Embryos Using Two-Stream Neural Network with Linear-Chain Conditional Random Field
The developmental process of embryos follows a monotonic order: an embryo progressively cleaves from one cell into multiple cells and finally transforms into a morula and then a blastocyst. For time-lapse videos of embryos, most existing developmental stage classification methods conduct per-frame predictions...
Saved in:
Published in: | arXiv.org 2021-07 |
---|---|
Main authors: | Lukyanenko, Stanislav; Won-Dong, Jang; Donglai Wei; Struyven, Robbert; Yoon, Kim; Leahy, Brian; Yang, Helen; Rush, Alexander; Ben-Yosef, Dalit; Needleman, Daniel; Pfister, Hanspeter |
Format: | Article |
Language: | eng |
Keywords: | Algorithms; Chains; Classification; Conditional random fields; Datasets; Embryos; Feature extraction; Frames (data processing); Image classification; Machine learning; Neural networks |
Online access: | Full text |
container_end_page | |
---|---|
container_issue | |
container_start_page | |
container_title | arXiv.org |
container_volume | |
creator | Lukyanenko, Stanislav; Won-Dong, Jang; Donglai Wei; Struyven, Robbert; Yoon, Kim; Leahy, Brian; Yang, Helen; Rush, Alexander; Ben-Yosef, Dalit; Needleman, Daniel; Pfister, Hanspeter |
description | The developmental process of embryos follows a monotonic order: an embryo progressively cleaves from one cell into multiple cells and finally transforms into a morula and then a blastocyst. For time-lapse videos of embryos, most existing developmental stage classification methods make per-frame predictions from the image at each time step. Classification from images alone, however, suffers from overlap between cells and imbalance between stages. Temporal information can help address this problem by capturing movements between neighboring frames. In this work, we propose a two-stream model for developmental stage classification. Unlike previous methods, our two-stream model accepts both temporal and image information. We develop a linear-chain conditional random field (CRF) on top of neural network features extracted from the temporal and image streams to make use of both modalities. The linear-chain CRF formulation enables tractable training of global sequential models over multiple frames while also making it possible to explicitly inject monotonic development-order constraints into the learning process (see the illustrative sketch after the record fields below). We demonstrate our algorithm on two time-lapse embryo video datasets: i) a mouse and ii) a human embryo dataset. Our method achieves 98.1% and 80.6% for mouse and human embryo stage classification, respectively. Our approach will enable more profound clinical and biological studies and suggests a new direction for developmental stage classification that exploits temporal information. |
format | Article |
fulltext | fulltext |
identifier | EISSN: 2331-8422 |
ispartof | arXiv.org, 2021-07 |
issn | 2331-8422 |
language | eng |
recordid | cdi_proquest_journals_2551800563 |
source | Free E-Journals |
subjects | Algorithms; Chains; Classification; Conditional random fields; Datasets; Embryos; Feature extraction; Frames (data processing); Image classification; Machine learning; Neural networks |
title | Developmental Stage Classification of Embryos Using Two-Stream Neural Network with Linear-Chain Conditional Random Field |
url | https://sfx.bib-bvb.de/sfx_tum?ctx_ver=Z39.88-2004&ctx_enc=info:ofi/enc:UTF-8&ctx_tim=2025-01-04T00%3A19%3A27IST&url_ver=Z39.88-2004&url_ctx_fmt=infofi/fmt:kev:mtx:ctx&rfr_id=info:sid/primo.exlibrisgroup.com:primo3-Article-proquest&rft_val_fmt=info:ofi/fmt:kev:mtx:book&rft.genre=document&rft.atitle=Developmental%20Stage%20Classification%20of%20Embryos%20Using%20Two-Stream%20Neural%20Network%20with%20Linear-Chain%20Conditional%20Random%20Field&rft.jtitle=arXiv.org&rft.au=Lukyanenko,%20Stanislav&rft.date=2021-07-13&rft.eissn=2331-8422&rft_id=info:doi/&rft_dat=%3Cproquest%3E2551800563%3C/proquest%3E%3Curl%3E%3C/url%3E&disable_directlink=true&sfx.directlink=off&sfx.report_link=0&rft_id=info:oai/&rft_pqid=2551800563&rft_id=info:pmid/&rfr_iscdi=true |
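
The record above describes the method only in prose. As a rough illustrative sketch, not the authors' implementation, the PyTorch snippet below shows the core idea: per-frame scores from an image stream and a temporal (frame-difference) stream, decoded through a linear-chain transition structure that forbids moving back to an earlier developmental stage. The `TwoStreamEncoder` network, the five example stages, the use of frame differences as the temporal input, and the plain Viterbi decoding (in place of the paper's full CRF training) are all assumptions made for this sketch.

```python
# Illustrative sketch only -- not the paper's code. Assumed pieces: the
# TwoStreamEncoder toy network, five example stages, frame differences as the
# temporal stream, and Viterbi decoding instead of full CRF training.
import torch
import torch.nn as nn

NUM_STAGES = 5   # e.g. 1-cell, 2-cell, 4-cell, morula, blastocyst (assumed labels)
NEG_INF = -1e4   # large negative score used to forbid backward transitions


class TwoStreamEncoder(nn.Module):
    """Toy two-stream encoder: one branch sees each frame, the other sees the
    difference to the previous frame as a crude motion cue."""

    def __init__(self, feat_dim: int = 64):
        super().__init__()

        def branch():
            return nn.Sequential(
                nn.Conv2d(1, 8, 3, padding=1), nn.ReLU(),
                nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(8, feat_dim))

        self.image_net = branch()
        self.temporal_net = branch()
        self.classifier = nn.Linear(2 * feat_dim, NUM_STAGES)

    def forward(self, frames: torch.Tensor) -> torch.Tensor:
        # frames: (T, 1, H, W) grayscale clip; returns per-frame unary scores (T, S)
        diffs = frames - torch.roll(frames, shifts=1, dims=0)
        feats = torch.cat([self.image_net(frames), self.temporal_net(diffs)], dim=1)
        return self.classifier(feats)


def monotonic_transitions(num_stages: int) -> torch.Tensor:
    """Linear-chain transition scores that forbid returning to an earlier stage."""
    trans = torch.zeros(num_stages, num_stages)
    for prev in range(num_stages):
        trans[prev, :prev] = NEG_INF   # disallow next_stage < prev_stage
    return trans


def viterbi_decode(unary: torch.Tensor, trans: torch.Tensor) -> list:
    """Most likely stage sequence under unary + transition scores (standard Viterbi)."""
    T, _ = unary.shape
    score, backptrs = unary[0].clone(), []
    for t in range(1, T):
        # total[prev, nxt] = best score ending in `prev` + transition + new unary
        total = score.unsqueeze(1) + trans + unary[t].unsqueeze(0)
        score, idx = total.max(dim=0)
        backptrs.append(idx)
    path = [int(score.argmax())]
    for idx in reversed(backptrs):
        path.append(int(idx[path[-1]]))
    return path[::-1]


if __name__ == "__main__":
    clip = torch.randn(12, 1, 32, 32)                 # dummy 12-frame video
    unary = TwoStreamEncoder()(clip)                  # (12, NUM_STAGES) scores
    stages = viterbi_decode(unary, monotonic_transitions(NUM_STAGES))
    print(stages)                                     # non-decreasing stage indices
```

Running it on a random 12-frame clip prints a non-decreasing sequence of stage indices; per the abstract, the paper injects the same monotonic-order constraints during CRF training rather than only at decoding time.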