Efficient MedSAMs: Segment Anything in Medical Images on Laptop
Format: Article
Language: English
Online access: Order full text
Abstract: Promptable segmentation foundation models have emerged as a transformative
approach to addressing the diverse needs of medical image segmentation, but most
existing models require expensive computing, posing a major barrier to their adoption in
clinical practice. In this work, we organized the first international
competition dedicated to promptable medical image segmentation, featuring a
large-scale dataset spanning nine common imaging modalities from over 20
different institutions. The top teams developed lightweight segmentation
foundation models and implemented an efficient inference pipeline that
substantially reduced computational requirements while maintaining
state-of-the-art segmentation accuracy. Moreover, the post-challenge phase
advanced the algorithms through the design of performance booster and
reproducibility tasks, resulting in improved algorithms and validated
reproducibility of the winning solution. Furthermore, the best-performing
algorithms have been incorporated into the open-source software with a
user-friendly interface to facilitate clinical adoption. The data and code are
publicly available to foster the further development of medical image
segmentation foundation models and pave the way for impactful real-world
applications.
DOI: 10.48550/arxiv.2412.16085
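For readers unfamiliar with the box-prompted interface the abstract refers to, below is a minimal, hypothetical sketch of a lightweight promptable segmenter in PyTorch. The architecture, layer sizes, and the class name `TinyPromptableSegmenter` are illustrative assumptions only; they are not taken from the paper or any challenge entry, which should be consulted via the DOI above.

```python
# Minimal, illustrative sketch of a box-promptable segmentation model in the
# spirit of lightweight SAM-style architectures. All module sizes, names, and
# the overall design are assumptions for illustration; they do not reproduce
# any specific challenge solution.
import torch
import torch.nn as nn


class TinyPromptableSegmenter(nn.Module):
    def __init__(self, channels: int = 32):
        super().__init__()
        # Lightweight CNN image encoder producing a stride-4 feature map.
        self.encoder = nn.Sequential(
            nn.Conv2d(1, channels, 3, stride=2, padding=1), nn.GELU(),
            nn.Conv2d(channels, channels, 3, stride=2, padding=1), nn.GELU(),
        )
        # Prompt encoder: embeds a normalized bounding box (x1, y1, x2, y2).
        self.box_embed = nn.Linear(4, channels)
        # Mask decoder: fuses image features with the prompt and upsamples.
        self.decoder = nn.Sequential(
            nn.Conv2d(channels, channels, 3, padding=1), nn.GELU(),
            nn.ConvTranspose2d(channels, channels, 2, stride=2), nn.GELU(),
            nn.ConvTranspose2d(channels, 1, 2, stride=2),
        )

    def forward(self, image: torch.Tensor, box: torch.Tensor) -> torch.Tensor:
        feats = self.encoder(image)                     # (B, C, H/4, W/4)
        prompt = self.box_embed(box)[:, :, None, None]  # (B, C, 1, 1)
        return self.decoder(feats + prompt)             # (B, 1, H, W) logits


if __name__ == "__main__":
    model = TinyPromptableSegmenter().eval()
    image = torch.randn(1, 1, 256, 256)                 # e.g. one CT slice
    box = torch.tensor([[0.25, 0.30, 0.60, 0.70]])      # normalized box prompt
    with torch.no_grad():
        mask = torch.sigmoid(model(image, box)) > 0.5   # binary mask
    print(mask.shape, mask.float().mean().item())
```

The point mirrored here is only the interface: a single prompt embedding is fused with inexpensive convolutional image features, which is one way such models keep per-prompt inference light enough for CPU-only laptops.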