OUT-OF-ORDER BACKPROPAGATION SCHEDULING METHOD FOR DEEP LEARNING



Bibliographic Details
Main Authors: OH HYUNG JUN, SEO JI WON, LEE JUN YEOL, KIM HYUNG JU
Format: Patent
Language: English; Korean
Description
Abstract: Disclosed is a method for scheduling out-of-order backpropagation for deep learning training. The scheduling method removes, within a single layer of a deep learning model, the dependency between the weight-gradient operation and the output-gradient operation of the backpropagation process, allows the weight-gradient operation to be deferred independently of the output-gradient operation, and executes the deferred weight-gradient operation at the point that maximizes the resource utilization of the graphics processing unit (GPU). The present invention is thereby capable of improving processing performance.
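The core idea of the abstract can be sketched as follows. In standard backpropagation, each layer computes both its weight gradient (dL/dW) and its output gradient (dL/dx) before the preceding layer can proceed; only the output gradient is actually needed to continue the backward sweep. The sketch below decouples the two, deferring weight-gradient work to a queue that a scheduler could drain when the GPU would otherwise be idle. This is a minimal illustrative NumPy model, not the patented implementation: all names, the toy linear network, and the trivial "run deferred work after the sweep" policy are assumptions for demonstration.

```python
# Illustrative sketch (NOT the patented implementation): decouple the
# weight-gradient operation from the output-gradient operation within
# each layer, defer the former, and run it at a scheduler-chosen point.
import numpy as np

rng = np.random.default_rng(0)

# Toy 3-layer linear network: y = x @ W per layer (no bias/activation).
weights = [rng.standard_normal((4, 4)) for _ in range(3)]

def forward(x):
    """Return the list of activations, including the input."""
    acts = [x]
    for W in weights:
        x = x @ W
        acts.append(x)
    return acts

def backward_out_of_order(acts, grad_out):
    """Propagate output gradients immediately; defer weight gradients."""
    deferred = []          # queue of deferred weight-gradient tasks
    g = grad_out
    for i in reversed(range(len(weights))):
        a_in, W = acts[i], weights[i]
        # Defer dL/dW_i = a_in^T @ g: nothing downstream depends on it.
        deferred.append((i, a_in.copy(), g.copy()))
        # The output gradient dL/da_in = g @ W^T must be computed now,
        # because the earlier layer's backward step depends on it.
        g = g @ W.T
    # Later, at a point the scheduler deems favorable (here, simply
    # after the sweep), execute the deferred weight-gradient operations.
    grads = [None] * len(weights)
    for i, a_in, g_saved in deferred:
        grads[i] = a_in.T @ g_saved
    return grads

x = rng.standard_normal((2, 4))
acts = forward(x)
grads = backward_out_of_order(acts, np.ones_like(acts[-1]))
```

Because deferral only reorders the weight-gradient matrix multiplications, the resulting gradients are identical to those of the conventional strictly sequential backward pass; what changes is when the hardware performs that work.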