Retrieval of Injection Molding Industrial Knowledge Graph Based on Transformer and BERT
Published in: Applied Sciences, 2023-05, Vol. 13 (11), p. 6687
Main authors:
Format: Article
Language: English
Subjects:
Online access: Full text
Abstract: Knowledge graphs play an important role in knowledge management by providing a simple and clear way of expressing complex data relationships. Injection molding is a highly knowledge-intensive technology, and in our previous research we used knowledge graphs to manage and express the relevant knowledge, gradually establishing an injection molding industrial knowledge graph. However, the graph is currently retrieved mainly through programming, which makes searching it difficult for users without a programming background. This study uses the previously established injection molding industrial knowledge graph and employs a BERT (Bidirectional Encoder Representations from Transformers) fine-tuning model to analyze the semantics of user questions. The knowledge graph is then retrieved through a search engine built on the Transformer Encoder, which reasons over the structure of the graph to find knowledge that answers the user's question. The experimental results show that both the fine-tuned BERT model and the search engine achieve excellent performance. This approach helps engineers without a knowledge graph background retrieve information from the graph by entering natural language queries, thereby improving the graph's usability.
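The record gives no implementation details, so the following is only a minimal illustrative sketch of the kind of pipeline the abstract describes: a fine-tuned BERT model interprets the user's question, and the result is mapped to a query over the knowledge graph. It assumes a Hugging Face BERT checkpoint used as an intent classifier and a graph queried with Cypher; the model name, intent labels, and query templates are hypothetical and are not taken from the paper, whose retrieval engine is built on a Transformer Encoder rather than fixed templates.

```python
# Illustrative sketch only -- not the authors' implementation.
import torch
from transformers import BertForSequenceClassification, BertTokenizerFast

MODEL_NAME = "bert-base-uncased"  # assumption: any BERT checkpoint could be fine-tuned here
INTENT_LABELS = ["defect_cause", "parameter_setting", "material_property"]  # hypothetical label set

tokenizer = BertTokenizerFast.from_pretrained(MODEL_NAME)
# In practice the classification head would first be fine-tuned on
# labeled question/intent pairs; here it is randomly initialized.
model = BertForSequenceClassification.from_pretrained(
    MODEL_NAME, num_labels=len(INTENT_LABELS)
)
model.eval()

def classify_question(question: str) -> str:
    """Predict an intent label for a natural-language user question."""
    inputs = tokenizer(question, return_tensors="pt", truncation=True, max_length=64)
    with torch.no_grad():
        logits = model(**inputs).logits
    return INTENT_LABELS[int(logits.argmax(dim=-1))]

# Hypothetical mapping from intent to a Cypher query over the knowledge graph.
CYPHER_TEMPLATES = {
    "defect_cause": "MATCH (d:Defect {name: $entity})-[:CAUSED_BY]->(c:Cause) RETURN c.name",
    "parameter_setting": "MATCH (p:Parameter {name: $entity})-[:HAS_RANGE]->(r:Range) RETURN r.value",
    "material_property": "MATCH (m:Material {name: $entity})-[:HAS_PROPERTY]->(pr:Property) RETURN pr.name",
}

if __name__ == "__main__":
    question = "What causes short shots in injection molding?"
    intent = classify_question(question)
    print(intent, "->", CYPHER_TEMPLATES[intent])
```

In the paper's approach the template lookup above would be replaced by the Transformer-Encoder-based search engine, which reasons over the graph structure directly instead of relying on a fixed intent-to-query mapping.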
ISSN: 2076-3417
DOI: 10.3390/app13116687