Generative AI Language Modeling with Transformers
Overview
What you’ll learn
- Explain the concept of attention mechanisms in transformers, including their role in capturing contextual information.
- Describe language modeling with the decoder-based GPT and encoder-based BERT.
- Implement positional encoding, masking, the attention mechanism, and document classification, and create LLMs like GPT and BERT.
- Use transformer-based models and PyTorch functions for text classification, language translation, and modeling.
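The attention and masking topics above can be sketched in a few lines. The following is a minimal illustration of scaled dot-product attention with an optional causal mask — the core operation covered in the course. It uses NumPy rather than PyTorch purely to stay self-contained; the function name and toy data are illustrative, not part of the course materials.

```python
import numpy as np

def scaled_dot_product_attention(q, k, v, causal=False):
    # Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V
    d_k = q.shape[-1]
    scores = q @ k.swapaxes(-1, -2) / np.sqrt(d_k)
    if causal:
        # Mask future positions so each token attends only to
        # itself and earlier tokens (decoder-style, as in GPT).
        t = scores.shape[-1]
        future = np.triu(np.ones((t, t), dtype=bool), k=1)
        scores = np.where(future, -1e9, scores)
    # Numerically stable softmax over the key dimension
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights = weights / weights.sum(axis=-1, keepdims=True)
    return weights @ v, weights

# Toy example: 3 tokens with 4-dimensional embeddings (self-attention)
rng = np.random.default_rng(0)
x = rng.normal(size=(3, 4))
out, w = scaled_dot_product_attention(x, x, x, causal=True)
```

With `causal=True`, the first token's attention weights over later tokens are zero, and each row of `w` sums to 1 — the property that makes decoder-based models like GPT autoregressive.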
Price: Free
Language: English
Duration: 8 Hours
Certificate: No
Pace: Self-Paced
Level: Advanced
Category: Generative AI
Instructor: IBM