Building Transformer Models with PyTorch 2.0 (record no. 1556124)

MARC details
000 -LEADER
fixed length control field 03618cam a2200277zu 4500
003 - CONTROL NUMBER IDENTIFIER
control field FRCYB88955333
005 - DATE AND TIME OF LATEST TRANSACTION
control field 20251020124027.0
008 - FIXED-LENGTH DATA ELEMENTS--GENERAL INFORMATION
fixed length control field 251020s2024 fr | o|||||0|0|||eng d
020 ## - INTERNATIONAL STANDARD BOOK NUMBER
International Standard Book Number 9789355517494
035 ## - SYSTEM CONTROL NUMBER
System control number FRCYB88955333
040 ## - CATALOGING SOURCE
Original cataloging agency FR-PaCSA
Language of cataloging en
Transcribing agency
Description conventions rda
100 1# - MAIN ENTRY--PERSONAL NAME
Personal name Timsina, Prem
245 01 - TITLE STATEMENT
Title Building Transformer Models with PyTorch 2.0
Remainder of title NLP, computer vision, and speech processing with PyTorch and Hugging Face (English Edition)
Statement of responsibility, etc. Timsina, Prem
264 #1 - PRODUCTION, PUBLICATION, DISTRIBUTION, MANUFACTURE, AND COPYRIGHT NOTICE
Name of producer, publisher, distributor, manufacturer BPB Publications
Date of production, publication, distribution, manufacture, or copyright notice 2024
300 ## - PHYSICAL DESCRIPTION
Extent p.
336 ## - CONTENT TYPE
Content type code txt
Source rdacontent
337 ## - MEDIA TYPE
Media type code c
Source rdamedia
338 ## - CARRIER TYPE
Carrier type code c
Source rdacarrier
520 ## - SUMMARY, ETC.
Summary, etc. Your key to transformer-based NLP, vision, speech, and multimodal applications.

Key Features
- Transformer architecture for different modalities and multimodalities.
- Practical guidelines to build and fine-tune transformer models.
- Comprehensive code samples with detailed documentation.

Description
This book covers transformer architecture for various applications including NLP, computer vision, speech processing, and predictive modeling with tabular data. It is a valuable resource for anyone looking to harness the power of transformer architecture in their machine learning projects.
The book provides a step-by-step guide to building transformer models from scratch and fine-tuning pre-trained open-source models. It explores foundational model architectures, including GPT, ViT, Whisper, TabTransformer, and Stable Diffusion, along with the core principles for solving various problems with transformers. The book also covers transfer learning, model training, and fine-tuning, and discusses how to utilize recent models from Hugging Face. Additionally, the book explores advanced topics such as model benchmarking, multimodal learning, reinforcement learning, and deploying and serving transformer models.
In conclusion, this book offers a comprehensive and thorough guide to transformer models and their various applications.

What you will learn
- Understand the core architecture of various foundational models, including single and multimodal variants.
- Follow a step-by-step approach to developing transformer-based Machine Learning models.
- Utilize various open-source models to solve your business problems.
- Train and fine-tune various open-source models using PyTorch 2.0 and the Hugging Face ecosystem.
- Deploy and serve transformer models.
- Apply best practices and guidelines for building transformer-based models.

Who this book is for
This book caters to data scientists, Machine Learning engineers, developers, and software architects interested in the world of generative AI.

Table of Contents
1. Transformer Architecture
2. Hugging Face Ecosystem
3. Transformer Model in PyTorch
4. Transfer Learning with PyTorch and Hugging Face
5. Large Language Models: BERT, GPT-3, and BART
6. NLP Tasks with Transformers
7. CV Model Anatomy: ViT, DETR, and DeiT
8. Computer Vision Tasks with Transformers
9. Speech Processing Model Anatomy: Whisper, SpeechT5, and Wav2Vec
10. Speech Tasks with Transformers
11. Transformer Architecture for Tabular Data Processing
12. Transformers for Tabular Data Regression and Classification
13. Multimodal Transformers, Architectures and Applications
14. Explore Reinforcement Learning for Transformer
15. Model Export, Serving, and Deployment
16. Transformer Model Interpretability, and Experimental Visualization
17. PyTorch Models: Best Practices and Debugging
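As an illustration of the workflow the summary describes (using a pre-trained Hugging Face model under PyTorch 2.0), here is a minimal sketch; it is not code from the book, and the checkpoint name and example sentence are arbitrary assumptions chosen for demonstration.

# Minimal sketch (not from the book): load a pre-trained Hugging Face
# sequence-classification model, compile it with PyTorch 2.0's torch.compile,
# and run a single inference. The checkpoint below is an assumption; any
# sequence-classification checkpoint would work the same way.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

checkpoint = "distilbert-base-uncased-finetuned-sst-2-english"  # assumed example
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForSequenceClassification.from_pretrained(checkpoint)
model.eval()

# PyTorch 2.0 feature highlighted in the book's title: compile the model graph.
compiled_model = torch.compile(model)

inputs = tokenizer("Transformers make transfer learning practical.",
                   return_tensors="pt")
with torch.no_grad():
    logits = compiled_model(**inputs).logits

predicted_id = logits.argmax(dim=-1).item()
print(model.config.id2label[predicted_id])  # e.g. "POSITIVE"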
650 #0 - SUBJECT ADDED ENTRY--TOPICAL TERM
Topical term or geographic name entry element
700 0# - ADDED ENTRY--PERSONAL NAME
Personal name Timsina, Prem
856 40 - ELECTRONIC LOCATION AND ACCESS
Access method Cyberlibris
Uniform Resource Identifier https://international.scholarvox.com/netsen/book/88955333
Electronic format type text/html
Host name

No copies available.
