Examining User-Friendly and Open-Sourced Large GPT Models: A Survey on Language, Multimodal, and Scientific GPT Models
Main authors: , , , , , ,
Format: Article
Language: English
Abstract: Generative pre-trained transformer (GPT) models have revolutionized the field of natural language processing (NLP) with remarkable performance on various tasks, and they have also extended their power to multimodal domains. Despite their success, large GPT models like GPT-4 face inherent limitations such as considerable size, high computational requirements, complex deployment processes, and closed development loops. These constraints restrict their widespread adoption and raise concerns regarding their responsible development and usage. The need for user-friendly, relatively small, and open-sourced alternative GPT models arises from the desire to overcome these limitations while retaining high performance. In this survey paper, we examine open-sourced alternatives to large GPT models, focusing on user-friendly and relatively small models that facilitate easier deployment and accessibility. Through this extensive survey, we aim to equip researchers, practitioners, and enthusiasts with a thorough understanding of user-friendly and relatively small open-sourced alternatives to large GPT models, their current state, challenges, and future research directions, inspiring the development of more efficient, accessible, and versatile GPT models that cater to the broader scientific community and advance the field of general artificial intelligence. The source contents are continuously updated at https://github.com/GPT-Alternatives/gpt_alternatives.
DOI: 10.48550/arxiv.2308.14149