Method, System, and Computer Program Product for Dynamically Assigning an Inference Request to a CPU or GPU
A method for dynamically assigning an inference request is disclosed. A method for dynamically assigning an inference request may include determining at least one model to process an inference request on a plurality of computing platforms, the plurality of computing platforms including at least one Central Processing Unit (CPU) and at least one Graphics Processing Unit (GPU), obtaining, with at least one processor, profile information of the at least one model, the profile information including measured characteristics of the at least one model, dynamically determining a selected computing platform from between the at least one CPU and the at least one GPU for responding to the inference request based on an optimized objective associated with a status of the computing platform and the profile information, and routing, with at least one processor, the inference request to the selected computing platform. A system and computer program product are also disclosed.
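For illustration, the selection step described in the abstract can be sketched as a small dispatcher. The Python sketch below is a hypothetical reading of the claim language, not the patented implementation: the `ModelProfile` and `PlatformStatus` fields and the latency-times-load objective are assumptions introduced here, standing in for the "measured characteristics" and "status of the computing platform" mentioned above.

```python
# Minimal sketch of the CPU/GPU dispatch idea described in the abstract.
# All names and the latency-based objective are illustrative assumptions,
# not the patented implementation.
from dataclasses import dataclass

@dataclass
class ModelProfile:
    """Measured characteristics of a model (assumed fields)."""
    cpu_latency_ms: float    # measured per-request latency on CPU
    gpu_latency_ms: float    # measured per-request latency on GPU

@dataclass
class PlatformStatus:
    """Current status of a computing platform (assumed fields)."""
    queue_depth: int         # requests already waiting on this platform
    utilization: float       # 0.0 .. 1.0

def expected_cost(latency_ms: float, status: PlatformStatus) -> float:
    # Illustrative objective: measured latency inflated by the current
    # queueing and utilization of the candidate platform.
    return latency_ms * (1 + status.queue_depth) * (1 + status.utilization)

def select_platform(profile: ModelProfile,
                    cpu_status: PlatformStatus,
                    gpu_status: PlatformStatus) -> str:
    """Pick the platform that minimizes the illustrative objective."""
    cpu_cost = expected_cost(profile.cpu_latency_ms, cpu_status)
    gpu_cost = expected_cost(profile.gpu_latency_ms, gpu_status)
    return "cpu" if cpu_cost <= gpu_cost else "gpu"

if __name__ == "__main__":
    profile = ModelProfile(cpu_latency_ms=40.0, gpu_latency_ms=8.0)
    cpu = PlatformStatus(queue_depth=1, utilization=0.3)
    gpu = PlatformStatus(queue_depth=12, utilization=0.95)
    print(select_platform(profile, cpu, gpu))  # -> "cpu" here, since the GPU queue is long
```

In this reading, the "optimized objective" is simply whatever scalar cost the operator chooses to minimize; expected latency under current load is used here purely as an example.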
creator | Yang, Hao; Das, Biswajit; Christensen, Robert Brian; Karpenko, Igor; Walker, Peter; Gu, Yu |
description | A method for dynamically assigning an inference request is disclosed. A method for dynamically assigning an inference request may include determining at least one model to process an inference request on a plurality of computing platforms, the plurality of computing platforms including at least one Central Processing Unit (CPU) and at least one Graphics Processing Unit (GPU), obtaining, with at least one processor, profile information of the at least one model, the profile information including measured characteristics of the at least one model, dynamically determining a selected computing platform from between the at least one CPU and the at least one GPU for responding to the inference request based on an optimized objective associated with a status of the computing platform and the profile information, and routing, with at least one processor, the inference request to the selected computing platform. A system and computer program product are also disclosed. |
format | Patent |
creationdate | 2021-07-29 |
link | https://worldwide.espacenet.com/publicationDetails/biblio?FT=D&date=20210729&DB=EPODOC&CC=US&NR=2021232399A1 |
fulltext | fulltext_linktorsrc |
language | eng |
recordid | cdi_epo_espacenet_US2021232399A1 |
source | esp@cenet |
subjects | CALCULATING; COMPUTER SYSTEMS BASED ON SPECIFIC COMPUTATIONAL MODELS; COMPUTING; COUNTING; ELECTRIC DIGITAL DATA PROCESSING; IMAGE DATA PROCESSING OR GENERATION, IN GENERAL; PHYSICS |
title | Method, System, and Computer Program Product for Dynamically Assigning an Inference Request to a CPU or GPU |
url | https://sfx.bib-bvb.de/sfx_tum?ctx_ver=Z39.88-2004&ctx_enc=info:ofi/enc:UTF-8&ctx_tim=2025-01-27T09%3A07%3A38IST&url_ver=Z39.88-2004&url_ctx_fmt=infofi/fmt:kev:mtx:ctx&rfr_id=info:sid/primo.exlibrisgroup.com:primo3-Article-epo_EVB&rft_val_fmt=info:ofi/fmt:kev:mtx:patent&rft.genre=patent&rft.au=Yang,%20Hao&rft.date=2021-07-29&rft_id=info:doi/&rft_dat=%3Cepo_EVB%3EUS2021232399A1%3C/epo_EVB%3E%3Curl%3E%3C/url%3E&disable_directlink=true&sfx.directlink=off&sfx.report_link=0&rft_id=info:oai/&rft_id=info:pmid/&rfr_iscdi=true |