Thinking with Knowledge Graphs: Enhancing LLM Reasoning Through Structured Data

Large Language Models (LLMs) have demonstrated remarkable capabilities in natural language understanding and generation. However, they often struggle with complex reasoning tasks and are prone to hallucination. Recent research has shown promising results in leveraging knowledge graphs (KGs) to enhance LLM performance. KGs provide a structured representation of entities and their relationships, offering a rich source of information that can enhance the reasoning capabilities of LLMs. For this work, we have developed different techniques that tightly integrate KG structures and semantics into LLM representations. Our results show that we are able to significantly improve the performance of LLMs in complex reasoning scenarios, and ground the reasoning process with KGs. We are the first to represent KGs with programming language and fine-tune pretrained LLMs with KGs. This integration facilitates more accurate and interpretable reasoning processes, paving the way for more advanced reasoning capabilities of LLMs.
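The abstract's central technique — representing KGs with a programming language so that pretrained LLMs can be fine-tuned on them — can be illustrated with a small sketch. The paper's concrete serialization format is not given in this record, so the triple data and the rendering below (entities as variables, relations as attribute assignments) are assumptions for illustration only, not the authors' actual method.

```python
# Hedged sketch: one plausible way to serialize knowledge-graph triples
# as Python-style code for use as LLM fine-tuning text. The exact format
# used in the paper is not specified here; this rendering is an assumption.

def triples_to_code(triples):
    """Render (head, relation, tail) triples as Python-style statements."""
    lines = []
    for head, rel, tail in triples:
        # Turn the head entity into a variable name, e.g. "New York" -> new_york.
        var = head.lower().replace(" ", "_")
        lines.append(f'{var}.relations["{rel}"] = "{tail}"')
    return "\n".join(lines)

# Example KG fragment (illustrative data, not taken from the paper).
triples = [
    ("Paris", "capital_of", "France"),
    ("France", "part_of", "Europe"),
]
print(triples_to_code(triples))
# -> paris.relations["capital_of"] = "France"
#    france.relations["part_of"] = "Europe"
```

The appeal of a code serialization, as the abstract suggests, is that code is a representation LLMs already handle well during pretraining, so graph structure and semantics arrive in a familiar syntax.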

Detailed description

Saved in:
Bibliographic details
Published in: arXiv.org, 2024-12
Main authors: Wu, Xue; Tsioutsiouliklis, Kostas
Format: Article
Language: English
Subjects:
Online access: Full text
Identifier: EISSN 2331-8422
Source: Free E-Journals
Subjects:
Graphical representations
Graphs
Knowledge representation
Large language models
Natural language processing
Performance enhancement
Programming languages
Reasoning
Semantics
Speech recognition
Structured data
Task complexity
URL: https://sfx.bib-bvb.de/sfx_tum?ctx_ver=Z39.88-2004&ctx_enc=info:ofi/enc:UTF-8&ctx_tim=2025-02-09T11%3A57%3A58IST&url_ver=Z39.88-2004&url_ctx_fmt=infofi/fmt:kev:mtx:ctx&rfr_id=info:sid/primo.exlibrisgroup.com:primo3-Article-proquest&rft_val_fmt=info:ofi/fmt:kev:mtx:book&rft.genre=document&rft.atitle=Thinking%20with%20Knowledge%20Graphs:%20Enhancing%20LLM%20Reasoning%20Through%20Structured%20Data&rft.jtitle=arXiv.org&rft.au=Wu,%20Xue&rft.date=2024-12-14&rft.eissn=2331-8422&rft_id=info:doi/&rft_dat=%3Cproquest%3E3145903654%3C/proquest%3E%3Curl%3E%3C/url%3E&disable_directlink=true&sfx.directlink=off&sfx.report_link=0&rft_id=info:oai/&rft_pqid=3145903654&rft_id=info:pmid/&rfr_iscdi=true