Semantic Information Extraction and Multi-Agent Communication Optimization Based on Generative Pre-Trained Transformer
Published in: IEEE Transactions on Cognitive Communications and Networking, 2024-10, pp. 1-1
Format: Article
Language: English
Abstract: Collaboration among multiple agents demands efficient communication. However, the observational data in multi-agent systems are typically voluminous and redundant, posing substantial challenges to the communication system when transmitted directly. To address this issue, this paper introduces a multi-agent communication scheme based on a large language model (LLM), referred to as GPT-based semantic information extraction for multi-agent communication (GMAC). The scheme uses an LLM to extract semantic information and leverages its generative capabilities to predict subsequent actions, thereby enabling agents to make more informed decisions. By extracting key semantic data via the LLM, GMAC significantly reduces the signaling exchanged among agents. This not only simplifies the communication process but also reduces communication overhead by approximately 53% compared to the baseline methods. Experimental results indicate that GMAC improves the convergence speed and accuracy of decision-making while substantially decreasing the signaling expenditure among agents. Consequently, GMAC offers a straightforward and effective way to achieve efficient and economical communication in multi-agent systems.
ISSN: 2332-7731
DOI: 10.1109/TCCN.2024.3482354
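The abstract describes GMAC's core pattern: instead of exchanging raw, voluminous observations, each agent passes its observation through an LLM that extracts only the task-relevant semantics, and that compact message is what gets transmitted. The paper's actual architecture is not given in this record, so the following is only a minimal Python sketch of that general pattern under stated assumptions; all names (extract_semantics, Channel, the prompt wording) are hypothetical illustrations, not GMAC's interface.

```python
# Minimal sketch of the idea described in the abstract: compress an agent's
# raw observation into a short semantic message before transmission.
# Everything here (extract_semantics, Channel, the prompt text) is a
# hypothetical illustration, NOT the paper's actual GMAC implementation.
import json
from typing import Callable

SYSTEM_PROMPT = (
    "You are a semantic compressor for a multi-agent system. "
    "Given an agent's raw observation, reply with only the task-relevant "
    "facts in at most 30 words."
)


def extract_semantics(llm: Callable[[str], str], raw_observation: dict) -> str:
    """Ask an LLM to distill a voluminous observation into its key semantics."""
    prompt = f"{SYSTEM_PROMPT}\n\nObservation:\n{json.dumps(raw_observation)}"
    return llm(prompt)  # the short reply is what actually gets transmitted


class Channel:
    """Toy shared channel that tracks transmitted bytes (communication cost)."""

    def __init__(self):
        self.bytes_sent = 0
        self.inbox = []

    def send(self, sender: str, payload: str) -> None:
        self.bytes_sent += len(payload.encode("utf-8"))
        self.inbox.append({"sender": sender, "payload": payload})


# Usage: each agent broadcasts the compressed semantics instead of raw data.
# channel = Channel()
# msg = extract_semantics(my_llm, {"pos": [3, 7], "lidar": [0.2, 0.9, 0.4]})
# channel.send("agent_0", msg)
```

The abstract also mentions using the LLM's generative capabilities to predict subsequent actions; that step would sit on the receiving side and is omitted from this sketch.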