THE PROMISE AND PERIL OF GENERATIVE AI
ChatGPT's creator, OpenAI in San Francisco, California, has announced a subscription service for $20 per month, promising faster response times and priority access to new features (although its trial version remains free). In September last year, Google subsidiary DeepMind published a paper⁴ on...
Saved in:
Published in: | Nature (London) 2023-02, Vol.614 (7947), p.214-216 |
---|---|
Main authors: | Stokel-Walker, Chris; Van Noorden, Richard |
Format: | Article |
Language: | English |
Subjects: | Chatbots; Generative artificial intelligence; Researchers; Scientists; Search engines; Software; Training |
Online access: | Full text |
container_end_page | 216 |
---|---|
container_issue | 7947 |
container_start_page | 214 |
container_title | Nature (London) |
container_volume | 614 |
creator | Stokel-Walker, Chris; Van Noorden, Richard |
description | ChatGPT's creator, OpenAI in San Francisco, California, has announced a subscription service for $20 per month, promising faster response times and priority access to new features (although its trial version remains free). In September last year, Google subsidiary DeepMind published a paper⁴ on a 'dialogue agent' called Sparrow, which the firm's chief executive and co-founder Demis Hassabis later told TIME magazine would be released in private beta this year; the magazine reported that Google aimed to work on features including the ability to cite sources. (Meta did not respond to a request, made through their press office, to speak to LeCun.) Safety and responsibility: Galactica had hit a familiar safety concern that ethicists have been pointing out for years: without output controls, LLMs can easily be used to generate hate speech and spam, as well as racist, sexist and other harmful associations that might be implicit in their training data. Besides directly producing toxic content, there are concerns that AI chatbots will embed historical biases or ideas about the world from their training data, such as the superiority of particular cultures, says Shobita Parthasarathy, director of a science, technology and public-policy programme at the University of Michigan in Ann Arbor. Because the firms that are creating big LLMs are mostly in, and from, these cultures, they might make little attempt to overcome such biases, which are systemic and hard to rectify, she adds. |
doi_str_mv | 10.1038/d41586-023-00340-6 |
format | Article |
fulltext | fulltext |
identifier | ISSN: 0028-0836 |
ispartof | Nature (London), 2023-02, Vol.614 (7947), p.214-216 |
issn | 0028-0836 1476-4687 |
language | eng |
recordid | cdi_proquest_journals_2775798703 |
source | Nature_系列刊; Springer Nature - Complete Springer Journals |
subjects | Chatbots; Generative artificial intelligence; Researchers; Scientists; Search engines; Software; Training |
title | THE PROMISE AND PERIL OF GENERATIVE AI |
url | https://sfx.bib-bvb.de/sfx_tum?ctx_ver=Z39.88-2004&ctx_enc=info:ofi/enc:UTF-8&ctx_tim=2025-02-14T18%3A51%3A02IST&url_ver=Z39.88-2004&url_ctx_fmt=infofi/fmt:kev:mtx:ctx&rfr_id=info:sid/primo.exlibrisgroup.com:primo3-Article-proquest&rft_val_fmt=info:ofi/fmt:kev:mtx:journal&rft.genre=article&rft.atitle=THE%20PROMISE%20AND%20PERIL%20OF%20GENERATIVE%20AI&rft.jtitle=Nature%20(London)&rft.au=Stokel-Walker,%20Chris&rft.date=2023-02-09&rft.volume=614&rft.issue=7947&rft.spage=214&rft.epage=216&rft.pages=214-216&rft.issn=0028-0836&rft.eissn=1476-4687&rft_id=info:doi/10.1038/d41586-023-00340-6&rft_dat=%3Cproquest%3E2775798703%3C/proquest%3E%3Curl%3E%3C/url%3E&disable_directlink=true&sfx.directlink=off&sfx.report_link=0&rft_id=info:oai/&rft_pqid=2775798703&rft_id=info:pmid/&rfr_iscdi=true |