The phase diagram of kernel interpolation in large dimensions

The generalization ability of kernel interpolation in large dimensions (i.e., $n \asymp d^{\gamma}$ for some $\gamma>0$) might be one of the most interesting problems in the recent renaissance of kernel regression, since it may help us understand the 'benign overfitting phenomenon' reported in the neural networks literature. Focusing on the inner product kernel on the sphere, we fully characterized the exact order of both the variance and the bias of large-dimensional kernel interpolation under various source conditions $s \geq 0$. Consequently, we obtained the $(s,\gamma)$-phase diagram of large-dimensional kernel interpolation, i.e., we determined the regions in the $(s,\gamma)$-plane where kernel interpolation is minimax optimal, sub-optimal, or inconsistent.
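
The estimator the abstract refers to is the minimum-norm kernel interpolant $\hat f(x) = K(x, X)\,K(X, X)^{-1} y$, i.e., kernel ridge regression with the regularization taken to zero. The following is a minimal sketch (not code from the paper) of that estimator for an inner-product kernel $K(x,z) = \kappa(\langle x, z \rangle)$ with covariates on the sphere and sample size scaling as $n \asymp d^{\gamma}$; the kernel choice $\kappa(t) = e^{t}$, the target function, the noise level, and the values of $d$ and $\gamma$ below are illustrative assumptions rather than settings taken from the paper.

import numpy as np

def inner_product_kernel(X, Z, kappa=np.exp):
    # K[i, j] = kappa(<x_i, z_j>) for rows x_i of X and z_j of Z on the sphere.
    return kappa(X @ Z.T)

def kernel_interpolator(X_train, y_train, kappa=np.exp):
    # Minimum-norm interpolant: solve K alpha = y, then
    # f_hat(x) = sum_i alpha_i * kappa(<x, x_i>), so f_hat(x_i) = y_i exactly.
    K = inner_product_kernel(X_train, X_train, kappa)
    alpha = np.linalg.solve(K, y_train)
    return lambda X_new: inner_product_kernel(X_new, X_train, kappa) @ alpha

rng = np.random.default_rng(0)
d = 20                      # input dimension (illustrative)
gamma = 1.5                 # large-dimensional scaling n ~ d^gamma (illustrative)
n = int(d ** gamma)

# Covariates sampled uniformly on the unit sphere S^{d-1}.
X = rng.standard_normal((n, d))
X /= np.linalg.norm(X, axis=1, keepdims=True)

# Noisy observations of a simple target function (illustrative choice).
def f_star(A):
    return A[:, 0]

y = f_star(X) + 0.1 * rng.standard_normal(n)

f_hat = kernel_interpolator(X, y)

# Monte Carlo estimate of the excess risk on fresh points from the sphere.
X_test = rng.standard_normal((2000, d))
X_test /= np.linalg.norm(X_test, axis=1, keepdims=True)
excess_risk = np.mean((f_hat(X_test) - f_star(X_test)) ** 2)
print(f"n = {n}, d = {d}, estimated excess risk of the interpolant: {excess_risk:.4f}")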

Bibliographic Details
Main Authors: Zhang, Haobo; Lu, Weihao; Lin, Qian
Format: Article
Language: English
Date: 2024-04-18
Online Access: https://arxiv.org/abs/2404.12597
DOI: 10.48550/arxiv.2404.12597
Source: arXiv.org
Subjects: Computer Science - Learning; Mathematics - Statistics Theory; Statistics - Machine Learning; Statistics - Theory