OUT-OF-ORDER BACKPROPAGATION SCHEDULING METHOD FOR DEEP LEARNING

Disclosed is a method for scheduling a non-sequential backpropagation for deep learning training. The scheduling method removes, in a deep learning model, a dependence of a weight gradient operation and an output gradient operation of a backpropagation process within one layer, enables, in the backpropagation process, the weight gradient operation to be deferred regardless of the output gradient operation, and enables the deferred weight gradient operation to be executed at a point of maximizing a resource usage amount of a graphic processing unit. Therefore, the present invention is capable of improving processing performance.

Detailed description

Saved in:
Bibliographic details
Main authors: OH HYUNG JUN, SEO JI WON, LEE JUN YEOL, KIM HYUNG JU
Format: Patent
Language: eng ; kor
Subjects:
Online access: Order full text
creator OH HYUNG JUN
SEO JI WON
LEE JUN YEOL
KIM HYUNG JU
description Disclosed is a method for scheduling a non-sequential backpropagation for deep learning training. The scheduling method removes, in a deep learning model, a dependence of a weight gradient operation and an output gradient operation of a backpropagation process within one layer, enables, in the backpropagation process, the weight gradient operation to be deferred regardless of the output gradient operation, and enables the deferred weight gradient operation to be executed at a point of maximizing a resource usage amount of a graphic processing unit. Therefore, the present invention is capable of improving processing performance.
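The decoupling the abstract describes can be illustrated for a linear layer: the weight gradient (g @ a.T) and the input gradient (W.T @ g) both depend only on the incoming gradient g and cached values, so the weight-gradient step need not block the backward pass and can be queued for later execution. Below is a minimal NumPy sketch of that idea, not the patented scheduler; the layer sizes, the bias-free linear layers, and the point at which the queue is drained are all arbitrary illustrative assumptions.

```python
import numpy as np

# Sketch (not the patented implementation): within each layer, the
# output-gradient computation runs immediately so the backward pass can
# keep moving, while the weight-gradient computation is deferred into a
# queue, to be drained later, e.g. when the GPU would otherwise idle.

rng = np.random.default_rng(0)
sizes = [4, 8, 8, 2]                          # hypothetical 3-layer MLP
Ws = [rng.standard_normal((m, n)) for n, m in zip(sizes, sizes[1:])]

x = rng.standard_normal((4, 1))
acts = [x]
for W in Ws:                                  # forward pass (linear layers only)
    acts.append(W @ acts[-1])

grad_out = np.ones_like(acts[-1])             # stand-in for dL/dy
deferred = []                                 # queue of deferred weight-gradient ops
for W, a_in in zip(reversed(Ws), reversed(acts[:-1])):
    g = grad_out
    # Weight gradient dL/dW = g @ a_in.T depends only on g and the cached
    # activation, so it can run out of order: defer it.
    deferred.append(lambda g=g, a=a_in: g @ a.T)
    # Output gradient dL/da_in = W.T @ g must run now to unblock the
    # previous layer's backward step.
    grad_out = W.T @ g

# Later, at a chosen low-utilization point, drain the queue.
weight_grads = [op() for op in reversed(deferred)]   # reordered to match Ws
```

Draining `deferred` all at once at the end is just one choice; the abstract's point is that each queued operation may run at whatever moment maximizes GPU resource usage.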
format Patent
fulltext fulltext_linktorsrc
language eng ; kor
recordid cdi_epo_espacenet_KR20230040282A
source esp@cenet
subjects CALCULATING
COMPUTER SYSTEMS BASED ON SPECIFIC COMPUTATIONAL MODELS
COMPUTING
COUNTING
PHYSICS
title OUT-OF-ORDER BACKPROPAGATION SCHEDULING METHOD FOR DEEP LEARNING
url https://sfx.bib-bvb.de/sfx_tum?ctx_ver=Z39.88-2004&ctx_enc=info:ofi/enc:UTF-8&ctx_tim=2024-12-28T21%3A21%3A48IST&url_ver=Z39.88-2004&url_ctx_fmt=infofi/fmt:kev:mtx:ctx&rfr_id=info:sid/primo.exlibrisgroup.com:primo3-Article-epo_EVB&rft_val_fmt=info:ofi/fmt:kev:mtx:patent&rft.genre=patent&rft.au=OH%20HYUNG%20JUN&rft.date=2023-03-22&rft_id=info:doi/&rft_dat=%3Cepo_EVB%3EKR20230040282A%3C/epo_EVB%3E%3Curl%3E%3C/url%3E&disable_directlink=true&sfx.directlink=off&sfx.report_link=0&rft_id=info:oai/&rft_id=info:pmid/&rfr_iscdi=true