NLPs and Transformer Models
Transformers have revolutionized NLP since their inception in 2017: they understand language at a deeper level, and their training parallelizes efficiently on GPUs. In this course, you will learn how to build Transformers for NLP.
Natural language processing (NLP) is a set of tools and techniques that let us unlock the power of text, and Transformers have revolutionized the field since their inception in 2017: they understand language at a deeper level, and their training parallelizes efficiently on GPUs. In this course, NLPs and Transformer Models, you’ll gain the ability to use and create Transformers and LLMs for NLP. First, you’ll explore the attention mechanism and learn how it really works. Next, you’ll discover the Transformer architecture and write one of your own! Finally, you’ll learn how to leverage Hugging Face to fine-tune, or perform “transfer learning” on, LLMs for some pretty amazing tasks. When you’re finished with this course, you’ll have the skills and knowledge of Transformers for Natural Language Processing needed to develop outstanding solutions.
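To give a flavor of what the course covers, here is a minimal sketch of the attention mechanism at the heart of every Transformer: scaled dot-product attention, written in plain NumPy. All names and the toy data are illustrative, not taken from the course materials.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Scaled dot-product attention: each query row attends to all key
    rows, and the value rows are mixed by the resulting softmax weights."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)  # similarity of each query to each key
    # Numerically stable softmax over the key dimension
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights = weights / weights.sum(axis=-1, keepdims=True)
    return weights @ V, weights

# Toy example: 3 tokens with 4-dimensional embeddings
rng = np.random.default_rng(0)
Q = rng.normal(size=(3, 4))
K = rng.normal(size=(3, 4))
V = rng.normal(size=(3, 4))
out, w = scaled_dot_product_attention(Q, K, V)
print(out.shape)  # the output keeps the (tokens, dims) shape of V
```

Each row of `w` sums to 1, so the output for each token is a weighted average of the value vectors — this is the "deeper understanding" machinery the course unpacks.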
Author Name: Axel Sirota
Author Description:
Axel Sirota is a Microsoft Certified Trainer with a deep interest in Deep Learning and Machine Learning Operations. He holds a Master’s degree in Mathematics and, after conducting research in Probability, Statistics, and Machine Learning optimization, works as an AI and Cloud Consultant as well as an Author and Instructor at Pluralsight, Develop Intelligence, and O’Reilly Media.