Enabling Mixed Effects Neural Networks for Diverse, Clustered Data Using Monte Carlo Methods

Neural networks often assume independence among input data samples, disregarding correlations arising from inherent clustering patterns in real-world datasets (e.g., due to different sites or repeated measurements). Recently, mixed effects neural networks (MENNs), which separate cluster-specific 'random effects' from cluster-invariant 'fixed effects', have been proposed to improve generalization and interpretability for clustered data. However, existing methods only allow for approximate quantification of cluster effects and are limited to regression and binary targets with only one clustering feature. We present MC-GMENN, a novel approach employing Monte Carlo methods to train Generalized Mixed Effects Neural Networks. We empirically demonstrate that MC-GMENN outperforms existing mixed effects deep learning models in terms of generalization performance, time complexity, and quantification of inter-cluster variance. Additionally, MC-GMENN is applicable to a wide range of datasets, including multi-class classification tasks with multiple high-cardinality categorical features. For these datasets, we show that MC-GMENN outperforms conventional encoding and embedding methods, simultaneously offering a principled methodology for interpreting the effects of clustering patterns.
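
The abstract describes separating cluster-invariant fixed effects from cluster-specific random effects and training with Monte Carlo sampling. The minimal PyTorch sketch below illustrates that general idea only; it is not the authors' MC-GMENN implementation. The class name MixedEffectsNet, the Gaussian random-intercept parameterization, the shared variance parameter, and the fixed Monte Carlo sample count are all assumptions made for illustration.

```python
import torch
import torch.nn as nn

class MixedEffectsNet(nn.Module):
    """Hypothetical mixed effects network sketch (not the paper's MC-GMENN)."""

    def __init__(self, n_features, n_clusters, n_classes, hidden=64):
        super().__init__()
        # Fixed effects: cluster-invariant mapping from features to class logits.
        self.fixed = nn.Sequential(
            nn.Linear(n_features, hidden),
            nn.ReLU(),
            nn.Linear(hidden, n_classes),
        )
        # Random effects: per-cluster intercepts b_j ~ N(mu_j, sigma^2) with a
        # learned per-cluster mean and a shared log standard deviation.
        self.re_mean = nn.Parameter(torch.zeros(n_clusters, n_classes))
        self.re_log_sigma = nn.Parameter(torch.zeros(1))

    def forward(self, x, cluster_id, n_mc=8):
        logits_fixed = self.fixed(x)              # (batch, n_classes)
        mean = self.re_mean[cluster_id]           # (batch, n_classes)
        sigma = self.re_log_sigma.exp()
        # Monte Carlo estimate: average predicted probabilities over sampled
        # random effects (reparameterized sampling keeps this differentiable).
        probs = torch.zeros_like(logits_fixed)
        for _ in range(n_mc):
            b = mean + sigma * torch.randn_like(mean)
            probs = probs + torch.softmax(logits_fixed + b, dim=-1)
        return probs / n_mc

# Usage sketch: 32 samples, 10 features, one high-cardinality categorical
# clustering feature with 500 levels, and a 3-class target.
model = MixedEffectsNet(n_features=10, n_clusters=500, n_classes=3)
x = torch.randn(32, 10)
cluster_id = torch.randint(0, 500, (32,))
y = torch.randint(0, 3, (32,))
loss = nn.NLLLoss()(torch.log(model(x, cluster_id) + 1e-8), y)
loss.backward()
```

In this sketch, averaging class probabilities over sampled random effects approximates marginalizing over the cluster effect, and the learned standard deviation gives a rough handle on inter-cluster variance; the actual training objective and variance quantification in MC-GMENN are described in the paper itself.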


Bibliographic details
Published in: arXiv.org 2024-07
Main authors: Tschalzev, Andrej; Nitschke, Paul; Kirchdorfer, Lukas; Lüdtke, Stefan; Bartelt, Christian; Stuckenschmidt, Heiner
Format: Article
Language: English
Subjects: Clustering; Clusters; Datasets; Machine learning; Monte Carlo simulation; Neural networks
EISSN: 2331-8422
Online access: Full text