Performance of Multiple Pretrained BERT Models to Automate and Accelerate Data Annotation for Large Datasets

Purpose: To develop and evaluate domain-specific and pretrained bidirectional encoder representations from transformers (BERT) models in a transfer learning task on varying training dataset sizes to annotate a larger overall dataset.

Materials and Methods: The authors retrospectively reviewed 69 095 ano...

Full Description

Bibliographic Details
Published in: Radiology: Artificial Intelligence 2022-07, Vol. 4 (4), p. e220007
Authors: Tejani, Ali S., Ng, Yee S., Xi, Yin, Fielding, Julia R., Browning, Travis G., Rayan, Jesse C.
Format: Article
Language: English
Subjects:
Online access: Full text