Transformers for Natural Language Processing
This course helps you master transformers for natural language processing, including their architecture, training, and prompt design.
The demand for language understanding is rising in many fields, from media and social networks to scientific research. Vast amounts of data must be processed for research, documents must be translated and summarized across every sector of the economy, and social media posts must be scanned for ethical and legal reasons, among hundreds of other AI tasks whose use is ever-expanding.
This course covers everything from writing code to prompt design, a new programming skill for controlling the behavior of a transformer model. Each chapter works through the key aspects of language understanding in Python, PyTorch, and TensorFlow.
This course will discuss the architecture of the original transformer, Google BERT, OpenAI GPT-3, T5, and several other models. We will also fine-tune transformers, train models from scratch, and learn to use powerful APIs. We’ll work with large datasets from Facebook, Google, Microsoft, and other big tech corporations.
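As a taste of the architecture chapters, here is a minimal NumPy sketch of scaled dot-product attention, the core operation of the original transformer; the function names and tensor sizes are illustrative, not taken from any particular library covered in the course.

```python
import numpy as np

def softmax(x, axis=-1):
    # Subtract the row max for numerical stability before exponentiating.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def scaled_dot_product_attention(Q, K, V):
    # Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)      # similarity of each query to each key
    weights = softmax(scores, axis=-1)   # each row sums to 1
    return weights @ V, weights

rng = np.random.default_rng(0)
Q = rng.normal(size=(4, 8))  # 4 query positions, model dimension 8
K = rng.normal(size=(6, 8))  # 6 key/value positions
V = rng.normal(size=(6, 8))
out, w = scaled_dot_product_attention(Q, K, V)
print(out.shape, w.shape)  # (4, 8) (4, 6)
```

In practice the course works with full models (BERT, GPT-3, T5) through their libraries and APIs rather than re-implementing attention by hand, but this sketch shows the computation every one of them builds on.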