Introduction to natural language processing with TensorFlow
In this module, we’ll explore different neural network architectures for processing natural language text. Natural Language Processing (NLP) has experienced rapid growth, largely because the performance of language models depends on their ability to “understand” text, and they can be trained on large text corpora using unsupervised techniques. In addition, pre-trained text models such as BERT have simplified many NLP tasks and dramatically improved performance. We’ll learn more about these techniques and the basics of NLP in this learning module.
Learning objectives
Understand how text is processed for natural language processing tasks
Get introduced to Recurrent Neural Networks (RNNs) and Generative Neural Networks (GNNs)
Learn about attention mechanisms
Learn how to build text classification models (a minimal sketch follows below)
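As a small preview of the last objective, here is a minimal sketch of a TensorFlow/Keras text classification pipeline. It is not taken from the module itself: the tiny corpus and all hyperparameters are made up for illustration, and a real exercise would use a proper dataset such as movie reviews. It only shows the general shape of the approach: vectorize raw text into token ids, embed the tokens, pool, and classify.

```python
import tensorflow as tf

# Hypothetical toy corpus with binary sentiment labels (illustration only).
texts = tf.constant(["great movie", "terrible plot", "loved it", "not worth watching"])
labels = tf.constant([1, 0, 1, 0])

# Turn raw strings into padded sequences of integer token ids.
vectorizer = tf.keras.layers.TextVectorization(max_tokens=1000, output_sequence_length=8)
vectorizer.adapt(texts)

model = tf.keras.Sequential([
    vectorizer,                                      # strings -> token ids
    tf.keras.layers.Embedding(1000, 16),             # token ids -> dense vectors
    tf.keras.layers.GlobalAveragePooling1D(),        # average over the sequence
    tf.keras.layers.Dense(1, activation="sigmoid"),  # binary sentiment score
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.fit(texts, labels, epochs=3, verbose=0)

# Predict on a new (hypothetical) sentence.
print(model.predict(tf.constant(["really enjoyable film"])))
```

The module goes further than this averaging-based model, for example by replacing the pooling step with recurrent layers or attention, but the overall pattern of text vectorization followed by a trainable classifier stays the same.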
Prerequisites
Basic Python knowledge
Basic understanding of machine learning