Mastering transformers the journey from BERT to large language models and stable diffusion

Explore transformer-based language models from BERT to GPT, delving into NLP and computer vision tasks, while tackling challenges effectively Key Features Understand the complexity of deep learning architecture and transformers architecture Create solutions to industrial natural language processing...
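The scaled dot-product attention at the core of the transformer models this book surveys can be sketched in a few lines of NumPy (an illustrative sketch only, not code from the book):

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Core transformer operation: softmax(Q K^T / sqrt(d_k)) V."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)  # similarity of each query to each key
    # Row-wise softmax (shifted by the row max for numerical stability)
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights = weights / weights.sum(axis=-1, keepdims=True)
    return weights @ V, weights

# Toy example: 3 tokens with 4-dimensional embeddings
rng = np.random.default_rng(0)
Q = rng.normal(size=(3, 4))
K = rng.normal(size=(3, 4))
V = rng.normal(size=(3, 4))
out, w = scaled_dot_product_attention(Q, K, V)
```

Each output row is a weighted mixture of the value vectors, with the attention weights in each row summing to 1.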

Detailed description

Saved in:
Bibliographic details
Main authors: Yıldırım, Savaş (author), Asgari-Chenaghlu, Meysam (author)
Format: Electronic eBook
Language: English
Published: Birmingham, UK : Packt Publishing Ltd., 2024
Edition: Second edition.
Subjects:
Online access: license required

MARC

LEADER 00000nam a22000002 4500
001 ZDB-30-ORH-10437232X
003 DE-627-1
005 20240701091204.0
007 cr uuu---uuuuu
008 240701s2024 xx |||||o 00| ||eng c
020 |a 9781837633784  |9 978-1-83763-378-4 
035 |a (DE-627-1)10437232X 
035 |a (DE-599)KEP10437232X 
035 |a (ORHE)9781837633784 
040 |a DE-627  |b ger  |c DE-627  |e rda 
041 |a eng 
082 0 |a 006.3/5  |2 23/eng/20240611 
100 1 |a Yıldırım, Savaş  |e VerfasserIn  |4 aut 
245 1 0 |a Mastering transformers  |b the journey from BERT to large language models and stable diffusion  |c Savaş Yıldırım, Meysam Asgari-Chenaghlu 
250 |a Second edition. 
264 1 |a Birmingham, UK  |b Packt Publishing Ltd.  |c 2024 
300 |a 1 online resource (462 pages)  |b illustrations 
336 |a Text  |b txt  |2 rdacontent 
337 |a Computermedien  |b c  |2 rdamedia 
338 |a Online-Ressource  |b cr  |2 rdacarrier 
500 |a Includes bibliographical references and index 
520 |a Explore transformer-based language models from BERT to GPT, delving into NLP and computer vision tasks, while tackling challenges effectively.
Key Features:
- Understand the complexity of deep learning architecture and the transformer architecture
- Create solutions to industrial natural language processing (NLP) and computer vision (CV) problems
- Explore challenges in the preparation process, such as problem- and language-specific dataset transformation
- Purchase of the print or Kindle book includes a free PDF eBook
Book Description: Transformer-based language models such as BERT, T5, GPT, DALL-E, and ChatGPT have dominated NLP studies and become a new paradigm. Thanks to their accurate and fast fine-tuning capabilities, transformer-based language models have outperformed traditional machine learning approaches on many challenging natural language understanding (NLU) problems. Beyond NLP, a fast-growing area of multimodal learning and generative AI has recently emerged, showing promising results. Mastering Transformers will help you understand and implement multimodal solutions, including text-to-image generation. Computer vision solutions based on transformers are also explained in the book. You'll start by understanding various transformer models before learning how to train autoregressive language models such as GPT and XLNet. The book will also get you up to speed on boosting model performance and tracking model training with the TensorBoard toolkit. In later chapters, you'll focus on using vision transformers to solve computer vision problems. Finally, you'll discover how to harness the power of transformers to model time series data and make predictions. By the end of this book, you'll have an understanding of transformer models and how to use them to solve challenges in NLP and CV.
What you will learn:
- Focus on solving simple-to-complex NLP problems with Python
- Discover how to solve classification/regression problems with traditional NLP approaches
- Train a language model and explore how to fine-tune models for downstream tasks
- Understand how to use transformers for generative AI and computer vision tasks
- Build transformer-based NLP apps with the Python transformers library
- Focus on language generation, such as machine translation and conversational AI, in any language
- Speed up transformer model inference to reduce latency
Who this book is for: This book is for deep learning researchers, hands-on practitioners, and ML/NLP researchers. Educators and students with a good command of programming, knowledge of machine learning and artificial intelligence, and an interest in developing apps for NLP and multimodal tasks will also benefit from the book's hands-on approach. Knowledge of Python (or another programming language), familiarity with the machine learning literature, and a basic understanding of computer science are required.
650 0 |a Natural language processing (Computer science) 
650 4 |a Traitement automatique des langues naturelles 
700 1 |a Asgari-Chenaghlu, Meysam  |e VerfasserIn  |4 aut 
856 4 0 |l TUM01  |p ZDB-30-ORH  |q TUM_PDA_ORH  |u https://learning.oreilly.com/library/view/-/9781837633784/?ar  |m X:ORHE  |x Aggregator  |z lizenzpflichtig  |3 Volltext 
912 |a ZDB-30-ORH 
951 |a BO 
049 |a DE-91 
