Continuous Optimization for Offline Change Point Detection and Estimation
This work explores the use of novel advances in best subset selection for regression modelling via continuous optimization for offline change point detection and estimation in univariate Gaussian data sequences. The approach exploits reformulating the normal mean multiple change point model into a regularized statistical inverse problem enforcing sparsity. After introducing the problem statement, criteria and previous investigations via Lasso-regularization, the recently developed framework of continuous optimization for best subset selection (COMBSS) is briefly introduced and related to the problem at hand. Supervised and unsupervised perspectives are explored, with the latter testing different approaches for the choice of regularization penalty parameters via the discrepancy principle and a confidence bound. The main result is an adaptation and evaluation of the COMBSS approach for offline normal mean multiple change-point detection via experimental results on simulated data for different choices of regularization parameters. Results and future directions are discussed.
Main authors: | Reimann, Hans; Moka, Sarat; Sofronov, Georgy |
---|---|
Format: | Article |
Language: | English |
Subjects: | Statistics - Computation; Statistics - Machine Learning; Statistics - Methodology |
Online access: | Order full text |
creator | Reimann, Hans; Moka, Sarat; Sofronov, Georgy |
description | This work explores the use of novel advances in best subset selection for regression modelling via continuous optimization for offline change point detection and estimation in univariate Gaussian data sequences. The approach exploits reformulating the normal mean multiple change point model into a regularized statistical inverse problem enforcing sparsity. After introducing the problem statement, criteria and previous investigations via Lasso-regularization, the recently developed framework of continuous optimization for best subset selection (COMBSS) is briefly introduced and related to the problem at hand. Supervised and unsupervised perspectives are explored, with the latter testing different approaches for the choice of regularization penalty parameters via the discrepancy principle and a confidence bound. The main result is an adaptation and evaluation of the COMBSS approach for offline normal mean multiple change-point detection via experimental results on simulated data for different choices of regularization parameters. Results and future directions are discussed. |
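The reformulation described in the abstract can be sketched in a few lines. This is not the authors' COMBSS code; it is a minimal illustration, under standard assumptions, of how the normal-mean multiple change-point model becomes a sparse regression problem: with a lower-triangular design matrix of ones, the mean vector is the cumulative sum of an increment vector, and change points correspond exactly to the nonzero increments.

```python
import numpy as np

# Sketch (not the paper's implementation): the model y_t = mu_t + noise
# can be rewritten as y = X b + noise, where X is the lower-triangular
# matrix of ones, b_1 = mu_1, and b_t = mu_t - mu_{t-1} for t > 1.
# Change points are the indices t > 1 with b_t != 0, so enforcing
# sparsity in b (via Lasso or best subset selection) limits the number
# of estimated change points.

rng = np.random.default_rng(0)
n = 200
mu = np.concatenate([np.zeros(100), 3.0 * np.ones(100)])  # one change at t = 100
y = mu + rng.normal(size=n)                               # noisy Gaussian sequence

X = np.tril(np.ones((n, n)))           # cumulative-sum design matrix
b_true = np.diff(mu, prepend=0.0)      # sparse increment vector

# The reformulation is exact: X @ b_true reproduces the mean sequence,
# and b_true has a single nonzero entry at the change point.
assert np.allclose(X @ b_true, mu)
```

Lasso-regularized estimation, as surveyed in the article, would then minimize ||y - X b||^2 + lambda * ||b||_1 over b, while COMBSS instead targets the best-subset (L0-type) version of the same problem.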
doi_str_mv | 10.48550/arxiv.2407.03383 |
format | Article |
identifier | DOI: 10.48550/arxiv.2407.03383 |
language | eng |
source | arXiv.org |
subjects | Statistics - Computation; Statistics - Machine Learning; Statistics - Methodology |
title | Continuous Optimization for Offline Change Point Detection and Estimation |