000 03478cam a2200289zu 4500
001 88929010
003 FRCYB88929010
005 20250429181235.0
006 m o d
007 cr un
008 250429s2022 fr | o|||||0|0|||eng d
020 _a9781803247335
035 _aFRCYB88929010
040 _aFR-PaCSA
_beng
_erda
100 1 _aRothman, Denis
245 1 0 _aTransformers for Natural Language Processing
_cDenis Rothman, Antonio Gulli.
264 1 _bPackt Publishing
_c2022
300 _a1 online resource
336 _btxt
_2rdacontent
337 _bc
_2rdamedia
338 _bcr
_2rdacarrier
700 1 _aGulli, Antonio
856 4 0 _2Cyberlibris
_uhttps://international.scholarvox.com/netsen/book/88929010
_qtext/html
520 _aThe under-the-hood workings of transformers, fine-tuning GPT-3 models, DeBERTa, vision models, and the start of the Metaverse, using a variety of NLP platforms: Hugging Face, OpenAI API, Trax, and AllenNLP.
Key Features:
- Implement models, such as BERT, Reformer, and T5, that outperform classical language models
- Compare NLP applications using GPT-3, GPT-2, and other transformers
- Analyze advanced use cases, including polysemy, cross-lingual learning, and computer vision
Book Description:
Transformers are a game-changer for natural language understanding (NLU) and have become one of the pillars of artificial intelligence. Transformers for Natural Language Processing, 2nd Edition, investigates deep learning for machine translation, speech-to-text, text-to-speech, language modeling, question answering, and many more NLP domains with transformers. An Industry 4.0 AI specialist needs to be adaptable; knowing just one NLP platform is not enough anymore. Different platforms have different benefits depending on the application, whether it's cost, flexibility, ease of implementation, results, or performance. In this book, we analyze numerous use cases with Hugging Face, Google Trax, OpenAI, and AllenNLP. The book takes transformers' capabilities further by combining multiple NLP techniques, such as sentiment analysis, named entity recognition, and semantic role labeling, to analyze complex use cases, such as dissecting fake news on Twitter. It also shows how transformers can generate code from just a brief description. By the end of this NLP book, you will understand transformers from a cognitive science perspective and be proficient in applying pretrained transformer models to various datasets.
What you will learn:
- Discover new ways of performing NLP techniques with the latest pretrained transformers
- Grasp the workings of the original Transformer, GPT-3, BERT, T5, DeBERTa, and Reformer
- Find out how ViT and CLIP label images (including blurry ones!) and reconstruct images using DALL-E
- Carry out sentiment analysis, text summarization, causal language analysis, machine translation, and more using TensorFlow, PyTorch, and GPT-3
- Measure the productivity of key transformers to define their scope, potential, and limits in production
Who this book is for:
If you want to learn about and apply transformers to your natural language (and image) data, this book is for you. A good understanding of NLP, Python, and deep learning is required to benefit most from this book. Many platforms covered in this book provide interactive user interfaces, which allow readers with a general interest in NLP and AI to follow several chapters.
999 _c1324239
_d1324239