A relation aware embedding mechanism for relation extraction
Extracting possible relational triples from natural language text is a fundamental task of information extraction, which has attracted extensive attention. The embedding mechanism has a significant impact on the performance of relation extraction models, and the embedding vectors should contain rich...
Saved in:
Published in: | Applied intelligence (Dordrecht, Netherlands), 2022-07, Vol.52 (9), p.10022-10031 |
---|---|
Main authors: | Li, Xiang ; Li, Yuwei ; Yang, Junan ; Liu, Hui ; Hu, Pengjiang |
Format: | Article |
Language: | eng |
Subjects: | |
Online access: | Full text |
container_end_page | 10031 |
---|---|
container_issue | 9 |
container_start_page | 10022 |
container_title | Applied intelligence (Dordrecht, Netherlands) |
container_volume | 52 |
creator | Li, Xiang ; Li, Yuwei ; Yang, Junan ; Liu, Hui ; Hu, Pengjiang |
description | Extracting possible relational triples from natural language text is a fundamental task of information extraction, which has attracted extensive attention. The embedding mechanism has a significant impact on the performance of relation extraction models, and the embedding vectors should contain rich semantic information that is closely relevant to the relation extraction task. Driven by this motivation, we propose a Relation Aware Embedding Mechanism (RA) for relation extraction. Specifically, this mechanism incorporates relation label information into the sentence embedding by leveraging the attention mechanism to distinguish the importance of different relation labels to each word of a sentence. We apply the proposed method to three state-of-the-art relation extraction models: CasRel, SMHSA and ETL-Span, and implement the corresponding models named RA-CasRel, RA-SMHSA and RA-ETL-Span. To evaluate the effectiveness of our method, we conduct extensive experiments on two widely used open datasets, NYT and WebNLG, and compare RA-CasRel, RA-SMHSA and RA-ETL-Span with 12 state-of-the-art models. The experimental results show that our method can effectively improve the performance of relation extraction. For instance, RA-CasRel reaches an F1-score of 91.7% on NYT and 92.4% on WebNLG, the best performance among all the compared models. We have open-sourced the code of our proposed method in [1] to facilitate future research in relation extraction. |
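The abstract's core idea (attention-weighting relation label embeddings per word, then mixing that into the sentence embedding) can be illustrated with a minimal NumPy sketch. This is not the authors' implementation — the function name, the concatenation step, and the use of raw dot-product scores are assumptions for illustration only:

```python
import numpy as np

def relation_aware_embeddings(word_emb, rel_emb):
    """Augment each word embedding with a relation-aware context vector.

    word_emb: (seq_len, d) embeddings of the sentence's words
    rel_emb:  (num_rels, d) embeddings of the relation labels

    For every word, attention weights over the relation labels are
    computed via softmaxed dot products, and the weighted mix of label
    embeddings is concatenated to the word, giving (seq_len, 2*d).
    """
    scores = word_emb @ rel_emb.T                    # (seq_len, num_rels)
    scores -= scores.max(axis=1, keepdims=True)      # numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=1, keepdims=True)    # softmax over relations
    rel_context = weights @ rel_emb                  # (seq_len, d)
    return np.concatenate([word_emb, rel_context], axis=1)
```

In this sketch the attention weights express how important each relation label is to each word, which matches the mechanism the abstract describes; a real model would learn `rel_emb` jointly with the encoder.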
doi_str_mv | 10.1007/s10489-021-02699-3 |
format | Article |
fulltext | fulltext |
identifier | ISSN: 0924-669X |
ispartof | Applied intelligence (Dordrecht, Netherlands), 2022-07, Vol.52 (9), p.10022-10031 |
issn | 0924-669X ; 1573-7497 |
language | eng |
recordid | cdi_proquest_journals_2678581182 |
source | SpringerLink Journals - AutoHoldings |
subjects | Artificial Intelligence ; Computer Science ; Embedding ; Information retrieval ; Machines ; Manufacturing ; Mechanical Engineering ; Natural language processing ; Performance enhancement ; Performance evaluation ; Processes |
title | A relation aware embedding mechanism for relation extraction |
url | https://sfx.bib-bvb.de/sfx_tum?ctx_ver=Z39.88-2004&ctx_enc=info:ofi/enc:UTF-8&ctx_tim=2025-02-06T09%3A24%3A36IST&url_ver=Z39.88-2004&url_ctx_fmt=infofi/fmt:kev:mtx:ctx&rfr_id=info:sid/primo.exlibrisgroup.com:primo3-Article-proquest_cross&rft_val_fmt=info:ofi/fmt:kev:mtx:journal&rft.genre=article&rft.atitle=A%20relation%20aware%20embedding%20mechanism%20for%20relation%20extraction&rft.jtitle=Applied%20intelligence%20(Dordrecht,%20Netherlands)&rft.au=Li,%20Xiang&rft.date=2022-07-01&rft.volume=52&rft.issue=9&rft.spage=10022&rft.epage=10031&rft.pages=10022-10031&rft.issn=0924-669X&rft.eissn=1573-7497&rft_id=info:doi/10.1007/s10489-021-02699-3&rft_dat=%3Cproquest_cross%3E2678581182%3C/proquest_cross%3E%3Curl%3E%3C/url%3E&disable_directlink=true&sfx.directlink=off&sfx.report_link=0&rft_id=info:oai/&rft_pqid=2678581182&rft_id=info:pmid/&rfr_iscdi=true |