Multiple Descent in the Multiple Random Feature Model
Recent works have demonstrated a double descent phenomenon in over-parameterized learning. Although this phenomenon has attracted considerable attention, it is not yet fully understood theoretically. In this paper, we investigate the multiple descent phenomenon in a class of multi-component prediction models. We first consider a "double random feature model" (DRFM) concatenating two types of random features, and study the excess risk achieved by the DRFM in ridge regression. We calculate the precise limit of the excess risk under the high-dimensional framework where the training sample size, the dimension of the data, and the dimensions of the random features tend to infinity proportionally. Based on this calculation, we further demonstrate theoretically that the risk curves of DRFMs can exhibit triple descent. We then provide a thorough experimental study to verify our theory. Finally, we extend our study to the "multiple random feature model" (MRFM), and show that MRFMs ensembling $K$ types of random features may exhibit $(K+1)$-fold descent. Our analysis shows that risk curves with a given number of descents arise generally in learning multi-component prediction models.
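The record reproduces only the abstract, not the paper's definitions, so the following is a minimal sketch of what a double random feature model with ridge regression could look like. The feature maps $\sigma_1, \sigma_2$, the widths $N_1, N_2$, the ridge parameter $\lambda$, and the proportional-limit ratios are assumed notation, not the paper's.

```latex
% Sketch of a double random feature model (assumed notation, based only on the abstract):
% two families of random features are concatenated and fitted jointly by ridge regression.
\[
  \phi(\mathbf{x}) =
  \bigl(\sigma_1(\mathbf{W}_1 \mathbf{x}),\, \sigma_2(\mathbf{W}_2 \mathbf{x})\bigr)
  \in \mathbb{R}^{N_1 + N_2},
  \qquad
  \mathbf{W}_1 \in \mathbb{R}^{N_1 \times d},\;
  \mathbf{W}_2 \in \mathbb{R}^{N_2 \times d}
  \ \text{with i.i.d.\ random entries,}
\]
\[
  \hat{\boldsymbol{\theta}}
  = \operatorname*{arg\,min}_{\boldsymbol{\theta} \in \mathbb{R}^{N_1 + N_2}}
    \frac{1}{n} \sum_{i=1}^{n}
    \bigl( y_i - \boldsymbol{\theta}^{\top} \phi(\mathbf{x}_i) \bigr)^2
    + \lambda \lVert \boldsymbol{\theta} \rVert_2^2 .
\]
% The abstract's high-dimensional framework lets n, d, N_1, N_2 tend to infinity with
% fixed ratios, e.g. d/n -> gamma, N_1/n -> psi_1, N_2/n -> psi_2 (ratio names assumed),
% and studies the limit of the excess prediction risk of the resulting ridge estimator.
```

A small simulation in the same spirit is sketched after the record below.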
Main authors: | Meng, Xuran; Yao, Jianfeng; Cao, Yuan |
---|---|
Format: | Article |
Language: | eng |
Subjects: | Computer Science - Learning; Mathematics - Statistics Theory; Statistics - Machine Learning; Statistics - Theory |
Online access: | Order full text |
creator | Meng, Xuran; Yao, Jianfeng; Cao, Yuan |
description | Recent works have demonstrated a double descent phenomenon in over-parameterized learning. Although this phenomenon has attracted considerable attention, it is not yet fully understood theoretically. In this paper, we investigate the multiple descent phenomenon in a class of multi-component prediction models. We first consider a "double random feature model" (DRFM) concatenating two types of random features, and study the excess risk achieved by the DRFM in ridge regression. We calculate the precise limit of the excess risk under the high-dimensional framework where the training sample size, the dimension of the data, and the dimensions of the random features tend to infinity proportionally. Based on this calculation, we further demonstrate theoretically that the risk curves of DRFMs can exhibit triple descent. We then provide a thorough experimental study to verify our theory. Finally, we extend our study to the "multiple random feature model" (MRFM), and show that MRFMs ensembling $K$ types of random features may exhibit $(K+1)$-fold descent. Our analysis shows that risk curves with a given number of descents arise generally in learning multi-component prediction models. |
doi_str_mv | 10.48550/arxiv.2208.09897 |
format | Article |
creationdate | 2022-08-21 |
rights | http://creativecommons.org/licenses/by-nc-nd/4.0 |
oa | free_for_read |
sourcetype | Open Access Repository |
linktorsrc | https://arxiv.org/abs/2208.09897 |
backlink | https://doi.org/10.48550/arXiv.2208.09897 |
fulltext | fulltext_linktorsrc |
identifier | DOI: 10.48550/arxiv.2208.09897 |
language | eng |
recordid | cdi_arxiv_primary_2208_09897 |
source | arXiv.org |
subjects | Computer Science - Learning; Mathematics - Statistics Theory; Statistics - Machine Learning; Statistics - Theory |
title | Multiple Descent in the Multiple Random Feature Model |
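To make the setup sketched above concrete, here is a small, self-contained simulation sketch. It is not the paper's experimental protocol: the choice of ReLU and random Fourier features, the linear-teacher data model, the ridge parameter, and all sizes are illustrative assumptions, and the printed numbers are only meant to show how such a risk curve could be traced.

```python
import numpy as np

rng = np.random.default_rng(0)


def double_random_features(X, W1, W2):
    """Concatenate two (assumed) families of random features:
    ReLU features from W1 and random Fourier (cosine) features from W2."""
    d = X.shape[1]
    F1 = np.maximum(X @ W1.T / np.sqrt(d), 0.0)  # ReLU random features
    F2 = np.cos(X @ W2.T / np.sqrt(d))           # random Fourier features
    return np.hstack([F1, F2])


def ridge_fit_predict(F_train, y_train, F_test, lam=1e-3):
    """Fit ridge regression on the concatenated features and predict on the test set."""
    p = F_train.shape[1]
    theta = np.linalg.solve(F_train.T @ F_train + lam * np.eye(p), F_train.T @ y_train)
    return F_test @ theta


# Illustrative data model (an assumption): linear teacher plus Gaussian noise.
n, d, n_test = 200, 100, 2000
beta = rng.normal(size=d) / np.sqrt(d)
X_train = rng.normal(size=(n, d))
X_test = rng.normal(size=(n_test, d))
y_train = X_train @ beta + 0.1 * rng.normal(size=n)
y_test = X_test @ beta

N2 = 150  # width of the second feature family, held fixed
for N1 in (25, 50, 100, 200, 400, 800):  # sweep the width of the first family
    W1 = rng.normal(size=(N1, d))
    W2 = rng.normal(size=(N2, d))
    F_train = double_random_features(X_train, W1, W2)
    F_test = double_random_features(X_test, W1, W2)
    pred = ridge_fit_predict(F_train, y_train, F_test)
    print(f"N1={N1:4d}  test MSE={np.mean((pred - y_test) ** 2):.4f}")
```

Whether and where the resulting curve shows several descents depends on the ratios N1/n, N2/n, and d/n and on the ridge parameter; the precise asymptotic characterization of the excess risk is what the paper derives.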