Incremental learning method based on knowledge distillation and parameter isolation
Saved in:
Main authors:
Format: Patent
Language: Chinese; English
Subjects:
Online access: Order full text
Summary: The invention discloses an incremental learning method based on knowledge distillation and parameter isolation, in the technical field of artificial intelligence. The method comprises the following steps. In the fine-tuning stage, the parameters of the teacher model's feature extractor are initialized from the parameters of the student model's feature extractor, and the teacher model is trained with the aid of a multi-pooling branch module. In the mask learning stage, the teacher model obtained in the fine-tuning stage guides the learning of binary masks via knowledge distillation. After training is finished, the teacher model is discarded; at inference, the mask and classifier corresponding to the provided task ID information are retrieved and applied to the student model for prediction. Binary masks isolate the parameters of different tasks, which greatly reduces the space consumption caused by parameter growth across incremental tasks, and mask updating is guided by the teacher model.
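Since the record preserves only the abstract, the following is a minimal PyTorch sketch of the general scheme it describes: per-task binary masks over a shared feature extractor, learned under a distillation signal from a fine-tuned teacher, with the teacher discarded at inference. All identifiers here (MaskedLinear, StudentNet, distill_step, tau, alpha) are illustrative assumptions, not names from the patent, and the multi-pooling branch module used to train the teacher is omitted.

```python
# Illustrative sketch only; not the patented implementation.
import torch
import torch.nn as nn
import torch.nn.functional as F


class MaskedLinear(nn.Module):
    """Linear layer whose shared weights are gated by a per-task binary mask."""

    def __init__(self, in_features, out_features, num_tasks):
        super().__init__()
        self.weight = nn.Parameter(torch.randn(out_features, in_features) * 0.02)
        # One real-valued score tensor per task; binarized in forward().
        self.mask_scores = nn.ParameterList(
            [nn.Parameter(0.01 * torch.randn(out_features, in_features))
             for _ in range(num_tasks)]
        )

    def forward(self, x, task_id):
        scores = self.mask_scores[task_id]
        # Hard binary mask with a straight-through estimator so the
        # real-valued scores still receive gradients during mask learning.
        hard = (scores > 0).float()
        mask = hard + scores.sigmoid() - scores.sigmoid().detach()
        return F.linear(x, self.weight * mask)


class StudentNet(nn.Module):
    """Shared feature extractor plus one classifier head per task."""

    def __init__(self, in_dim, hidden, num_classes, num_tasks):
        super().__init__()
        self.feat = MaskedLinear(in_dim, hidden, num_tasks)
        self.heads = nn.ModuleList(
            [nn.Linear(hidden, num_classes) for _ in range(num_tasks)]
        )

    def forward(self, x, task_id):
        h = F.relu(self.feat(x, task_id))
        return self.heads[task_id](h)


def distill_step(student, teacher, x, y, task_id, opt, tau=2.0, alpha=0.5):
    """One mask-learning step: cross-entropy plus KD from the fine-tuned teacher."""
    with torch.no_grad():
        t_logits = teacher(x, task_id)  # teacher is frozen during mask learning
    s_logits = student(x, task_id)
    kd = F.kl_div(
        F.log_softmax(s_logits / tau, dim=1),
        F.softmax(t_logits / tau, dim=1),
        reduction="batchmean",
    ) * tau * tau
    loss = alpha * kd + (1 - alpha) * F.cross_entropy(s_logits, y)
    opt.zero_grad()
    loss.backward()
    opt.step()
    return loss.item()
```

After training, the teacher is dropped entirely: inference needs only the student, the given task ID, and the binary mask and classifier head stored for that task, i.e. `student(x, task_id)`. Because each task adds only a binary mask and a small head rather than a full copy of the backbone, per-task storage growth stays small, which matches the space-saving claim in the abstract.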