$(\epsilon, \delta)$-Differentially Private Partial Least Squares Regression
Saved in:

Main Authors: | , , , |
---|---|
Format: | Article |
Language: | eng |
Subjects: | |
Online Access: | Order full text |
Summary: | As data-privacy requirements are becoming increasingly stringent and
statistical models based on sensitive data are being deployed and used more
routinely, protecting data-privacy becomes pivotal. Partial Least Squares (PLS)
regression is the premier tool for building such models in analytical
chemistry, yet it does not inherently provide privacy guarantees, leaving
sensitive (training) data vulnerable to privacy attacks. To address this gap,
we propose an $(\epsilon, \delta)$-differentially private PLS (edPLS)
algorithm, which integrates well-studied and theoretically motivated Gaussian
noise-adding mechanisms into the PLS algorithm to ensure the privacy of the
data underlying the model. Our approach involves adding carefully calibrated
Gaussian noise to the outputs of four key functions in the PLS algorithm: the
weights, scores, $X$-loadings, and $Y$-loadings. The noise variance is
determined based on the global sensitivity of each function, ensuring that the
privacy loss is controlled according to the $(\epsilon, \delta)$-differential
privacy framework. Specifically, we derive the sensitivity bounds for each
function and use these bounds to calibrate the noise added to the model
components. Experimental results demonstrate that edPLS effectively renders
privacy attacks, aimed at recovering unique sources of variability in the
training data, ineffective. Application of edPLS to the NIR corn benchmark
dataset shows that the root mean squared error of prediction (RMSEP) remains
competitive even at strong privacy levels (i.e., $\epsilon=1$), given proper
pre-processing of the corresponding spectra. These findings highlight the
practical utility of edPLS in creating privacy-preserving multivariate
calibrations and for the analysis of their privacy-utility trade-offs. |
DOI: | 10.48550/arxiv.2412.09164 |
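The abstract describes adding calibrated Gaussian noise to the weights, scores, $X$-loadings, and $Y$-loadings produced inside the PLS algorithm. The following is a minimal, hypothetical sketch of that idea, assuming a NIPALS-style PLS1 loop and the standard $(\epsilon, \delta)$ Gaussian-mechanism calibration $\sigma = \Delta \sqrt{2\ln(1.25/\delta)}/\epsilon$. It does not reproduce the per-function sensitivity bounds derived in the paper; the `sensitivity` parameter is a placeholder, and `edpls_sketch` is an illustrative name, not the authors' implementation.

```python
import numpy as np

def gaussian_sigma(sensitivity, epsilon, delta):
    # Standard Gaussian-mechanism noise scale:
    # sigma = sensitivity * sqrt(2 * ln(1.25 / delta)) / epsilon
    return sensitivity * np.sqrt(2.0 * np.log(1.25 / delta)) / epsilon

def edpls_sketch(X, y, n_components, epsilon, delta, sensitivity=1.0, seed=None):
    """NIPALS-style PLS1 where Gaussian noise is added to the four model
    components named in the abstract. `sensitivity` stands in for the
    per-function global-sensitivity bounds derived in the paper."""
    rng = np.random.default_rng(seed)
    X, y = X.astype(float).copy(), y.astype(float).copy()
    sigma = gaussian_sigma(sensitivity, epsilon, delta)
    W, P, q = [], [], []
    for _ in range(n_components):
        w = X.T @ y
        w /= np.linalg.norm(w)
        w += rng.normal(0.0, sigma, w.shape)                   # noisy weights
        t = X @ w + rng.normal(0.0, sigma, X.shape[0])         # noisy scores
        tt = t @ t
        p = X.T @ t / tt + rng.normal(0.0, sigma, X.shape[1])  # noisy X-loadings
        c = y @ t / tt + rng.normal(0.0, sigma)                # noisy Y-loading
        X -= np.outer(t, p)                                    # deflate X
        y -= c * t                                             # deflate y
        W.append(w); P.append(p); q.append(c)
    W, P = np.column_stack(W), np.column_stack(P)
    # Regression coefficients: B = W (P^T W)^{-1} q
    return W @ np.linalg.solve(P.T @ W, np.array(q))
```

As the privacy budget grows (large $\epsilon$), the noise scale vanishes and the loop reduces to ordinary PLS1, which is the limit in which the RMSEP of the private model approaches that of the non-private one.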