Abstraction library for enabling scalable distributed machine learning

The invention discloses an abstraction library for enabling scalable distributed machine learning. One embodiment provides a non-transitory machine-readable medium storing instructions that, when executed by one or more processors, cause the one or more processors to perform operations that include providing an interface for defining a neural network using machine-learning domain-specific terms, wherein the interface enables selection of a neural network topology and abstracts away low-level communication details for distributed training of the neural network.
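The abstract describes an interface that lets a developer define a neural network in machine-learning terms, choose a parallelization topology, and leave the low-level communication of distributed training to the library. Below is a minimal, hypothetical Python sketch of what such an abstraction layer could look like; none of the class or function names come from the patent, and the communication backend is simulated in-process so the example runs stand-alone.

```python
# Hypothetical sketch (not the patented library's actual API): a user describes a
# network in domain-specific terms, picks a topology by name, and never issues
# communication calls directly. All identifiers here are invented for illustration.

from dataclasses import dataclass, field
from enum import Enum, auto
from typing import List


class Topology(Enum):
    """Parallelization strategies the user can select by name."""
    DATA_PARALLEL = auto()    # replicate the model, split the minibatch
    MODEL_PARALLEL = auto()   # split layers/parameters across workers
    HYBRID = auto()           # mix of both


@dataclass
class Layer:
    """A layer described in ML terms (kind + size), not buffers or messages."""
    kind: str                 # e.g. "fully_connected", "convolution", "softmax"
    units: int


@dataclass
class NetworkSpec:
    """User-facing definition of a neural network."""
    name: str
    layers: List[Layer] = field(default_factory=list)

    def add(self, kind: str, units: int) -> "NetworkSpec":
        self.layers.append(Layer(kind, units))
        return self


class DistributedSession:
    """Hides low-level communication (e.g. gradient all-reduce) from the user.

    A real library would wrap MPI/NCCL-style collectives here; this version is
    simulated so the sketch runs without any distributed runtime.
    """

    def __init__(self, spec: NetworkSpec, topology: Topology, num_workers: int):
        self.spec = spec
        self.topology = topology
        self.num_workers = num_workers

    def _all_reduce(self, per_worker_grads: List[float]) -> float:
        # Stand-in for a collective all-reduce: average gradients across workers.
        return sum(per_worker_grads) / len(per_worker_grads)

    def train_step(self, per_worker_grads: List[float]) -> float:
        # The session decides what to exchange based on the selected topology;
        # the network definition above stays free of communication details.
        if self.topology is Topology.DATA_PARALLEL:
            return self._all_reduce(per_worker_grads)
        raise NotImplementedError("only data parallelism is sketched here")


if __name__ == "__main__":
    # Define a network purely in ML terms, pick a topology, and run one step.
    net = NetworkSpec("toy_mlp").add("fully_connected", 256).add("softmax", 10)
    session = DistributedSession(net, Topology.DATA_PARALLEL, num_workers=4)
    print("averaged gradient:", session.train_step([0.1, 0.3, 0.2, 0.4]))
```

In a full implementation, DistributedSession._all_reduce would dispatch to the collective operations appropriate for the chosen topology and interconnect, which is precisely the detail the interface keeps out of the network definition.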

Detailed description

Bibliographic details
Main authors: KALAMKAR DHIRAJ D, SRIDHARAN, SRIDHAR, DAS DIPANKAR, VAIDYANATHAN, KRISHNAMURTHY
Format: Patent
Language: chi ; eng
Subjects:
Online access: Order full text
creator KALAMKAR DHIRAJ D
SRIDHARAN, SRIDHAR
DAS DIPANKAR
VAIDYANATHAN, KRISHNAMURTHY
description The invention discloses an abstraction library for enabling scalable distributed machine learning. One embodiment provides a non-transitory machine-readable medium storing instructions that, when executed by one or more processors, cause the one or more processors to perform operations that include providing an interface for defining a neural network using machine-learning domain-specific terms, wherein the interface enables selection of a neural network topology and abstracts away low-level communication details for distributed training of the neural network.
format Patent
language chi ; eng
recordid cdi_epo_espacenet_CN118096495A
source esp@cenet
subjects CALCULATING
COMPUTER SYSTEMS BASED ON SPECIFIC COMPUTATIONAL MODELS
COMPUTING
COUNTING
IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
PHYSICS
title Abstraction library for enabling scalable distributed machine learning
url https://sfx.bib-bvb.de/sfx_tum?ctx_ver=Z39.88-2004&ctx_enc=info:ofi/enc:UTF-8&ctx_tim=2025-01-08T13%3A57%3A29IST&url_ver=Z39.88-2004&url_ctx_fmt=infofi/fmt:kev:mtx:ctx&rfr_id=info:sid/primo.exlibrisgroup.com:primo3-Article-epo_EVB&rft_val_fmt=info:ofi/fmt:kev:mtx:patent&rft.genre=patent&rft.au=KALAMKAR%20DHIRAJ%20D&rft.date=2024-05-28&rft_id=info:doi/&rft_dat=%3Cepo_EVB%3ECN118096495A%3C/epo_EVB%3E%3Curl%3E%3C/url%3E&disable_directlink=true&sfx.directlink=off&sfx.report_link=0&rft_id=info:oai/&rft_id=info:pmid/&rfr_iscdi=true