Adaptive Hierarchical Certification for Segmentation using Randomized Smoothing
International Conference on Machine Learning (ICML), 2024
Format: Article
Language: English
Online access: Order full text
Abstract: Certification for machine learning proves that no adversarial sample within a given perturbation range can evade a model under certain conditions, a necessity for safety-critical domains. Common certification methods for segmentation use a flat set of fine-grained classes, leading to high abstain rates due to model uncertainty across many classes.
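To make the flat setting concrete, the following is a minimal sketch of a per-pixel certify-or-abstain decision in the style of standard randomized smoothing (Cohen et al.-style baselines, as used by SegCertify-like methods). The function `certify_pixel`, its arguments, and the Clopper-Pearson bound are illustrative assumptions, not the paper's API.

```python
# Hypothetical sketch of flat per-pixel certification under Gaussian noise;
# names and parameters are illustrative, not the paper's implementation.
import numpy as np
from scipy.stats import norm
from statsmodels.stats.proportion import proportion_confint

ABSTAIN = -1

def certify_pixel(class_counts, n, sigma, alpha=0.001):
    """Certify one pixel from `n` samples drawn under noise N(0, sigma^2 I).

    class_counts: votes per class over the n noisy forward passes.
    Returns (certified_class, certified_l2_radius) or (ABSTAIN, 0.0).
    """
    top = int(np.argmax(class_counts))
    # One-sided lower confidence bound on the top-class probability
    # (Clopper-Pearson), as in standard randomized-smoothing certification.
    p_lower, _ = proportion_confint(class_counts[top], n,
                                    alpha=2 * alpha, method="beta")
    if p_lower <= 0.5:
        return ABSTAIN, 0.0  # too uncertain across classes -> abstain
    return top, sigma * norm.ppf(p_lower)  # certified l2 radius
```

With many fine-grained classes, the top-class vote share often fails to clear 0.5, which is exactly the source of the high abstain rates noted above.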
We propose a novel, more practical setting, which certifies pixels within a multi-level hierarchy and adaptively relaxes the certification to a coarser level for unstable components that classic methods would abstain from, effectively lowering the abstain rate while providing more certified, semantically meaningful information. We mathematically formulate the problem setup, introduce an adaptive hierarchical certification algorithm, and prove the correctness of its guarantees.
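The following is a hedged sketch of the relaxation idea described above: when a pixel cannot be certified over the fine-grained classes, pool votes along a class hierarchy and retry at a coarser level. The two-level hierarchy, the vote pooling, and the reuse of `certify_pixel` and `ABSTAIN` from the previous sketch are assumptions for illustration; the paper's actual adaptive algorithm, and the statistical corrections behind its proven guarantees, differ.

```python
# Hedged sketch of hierarchical relaxation; not the paper's exact algorithm.
import numpy as np

# Hypothetical two-level hierarchy: fine class index -> coarse parent index,
# e.g. {car, truck} -> vehicle, {person, rider} -> human.
FINE_TO_COARSE = {0: 0, 1: 0, 2: 1, 3: 1}

def certify_adaptive(class_counts, n, sigma):
    # 1) Try to certify at the fine-grained level.
    label, radius = certify_pixel(class_counts, n, sigma)
    if label != ABSTAIN:
        return ("fine", label, radius)
    # 2) Unstable pixel: pool sibling votes into coarse parents and retry.
    # NOTE: naively reusing the same samples for a second test requires a
    # multiple-testing correction; this sketch elides that step, whereas
    # the paper proves the correctness of its guarantees.
    n_coarse = max(FINE_TO_COARSE.values()) + 1
    coarse_counts = np.zeros(n_coarse, dtype=int)
    for fine, coarse in FINE_TO_COARSE.items():
        coarse_counts[coarse] += class_counts[fine]
    label, radius = certify_pixel(coarse_counts, n, sigma)
    if label != ABSTAIN:
        return ("coarse", label, radius)
    return ("abstain", ABSTAIN, 0.0)
```

Pooling sibling votes makes the coarse top-class share larger, so pixels whose uncertainty is spread over semantically related fine classes can still be certified, just at lower granularity.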
Since certified accuracy does not take the loss of information at coarser classes into account, we introduce the Certified Information Gain ($\mathrm{CIG}$) metric, which is proportional to the class granularity level.
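Since the abstract only states that $\mathrm{CIG}$ is proportional to the class granularity level, the following is one plausible instantiation, under the assumption that a certified node's information content is the log of the total number of fine classes minus the log of the number of fine classes it covers; the paper's exact definition should be checked against the full text.

```python
# Hedged sketch of a CIG-style metric: certified coarse labels earn partial
# credit proportional to their granularity. The normalization below is an
# assumption for illustration, not necessarily the paper's definition.
import math

def certified_information_gain(pixels, num_fine_classes):
    """pixels: iterable of (certified, correct, n_leaves) per pixel, where
    `certified` is False for abstains and `n_leaves` is the number of fine
    classes under the certified node (1 for a fine-grained class).
    Returns the mean per-pixel gain in [0, 1]."""
    log_total = math.log(num_fine_classes)
    score, n = 0.0, 0
    for certified, correct, n_leaves in pixels:
        n += 1
        if certified and correct:
            # Full credit (1.0) for a fine class, partial credit for a
            # coarser node covering n_leaves fine classes; abstains (the
            # root, covering all classes) contribute zero.
            score += (log_total - math.log(n_leaves)) / log_total
    return score / n

# Example: one fine-certified pixel, one coarse-certified pixel covering
# 4 fine classes, one abstain, with 19 fine classes in total.
print(certified_information_gain(
    [(True, True, 1), (True, True, 4), (False, False, 19)], 19))
```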
Our extensive experiments on the Cityscapes, PASCAL-Context, ACDC, and COCO-Stuff datasets demonstrate that our adaptive algorithm achieves a higher $\mathrm{CIG}$ and a lower abstain rate than the current state-of-the-art certification method. Our code is available at https://github.com/AlaaAnani/adaptive-certify.
DOI: 10.48550/arxiv.2402.08400