NLP 101

Natural Language Processing

Instructor

Dr. Garcia

Reviews: 5.00 (1 Review)

Course Overview

Transformers have revolutionized Natural Language Understanding (NLU), a crucial subset of Natural Language Processing (NLP). As our global economy transitions from the physical to the digital realm, AI-driven language understanding underpins essential applications such as chatbots, personal assistants, text summarization, and machine translation. Without this technology, navigating the internet would be incredibly challenging.

The Transformer architecture marks a significant departure from traditional RNN and CNN models, bringing us closer to seamless machine intelligence. This innovation is transforming NLP, enabling the efficient processing of vast amounts of data that would be impossible for humans to handle manually. AI’s capacity to monitor social media, translate web pages, and transcribe streaming content demonstrates its indispensable role in the digital age.

This book, written by Dr. Rigoberto Garcia, guides you through improving language understanding using Python, PyTorch, and TensorFlow. Each chapter offers hands-on experience with key models and transformers, equipping you with the knowledge and tools needed for effective deep learning development in language understanding.
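To give a feel for that hands-on style, here is a minimal sketch (not taken from the course materials) that loads a pretrained transformer for sentiment analysis with the Hugging Face transformers library; the example sentence is an illustrative assumption.

```python
# Minimal sketch, not from the course materials.
# Assumes the Hugging Face `transformers` library and PyTorch are installed
# (e.g. `pip install transformers torch`).
from transformers import pipeline

# Load a pretrained sentiment-analysis pipeline: one of the
# language-understanding tasks this course works through in depth.
classifier = pipeline("sentiment-analysis")

result = classifier("Transformers have revolutionized natural language understanding.")
print(result)  # e.g. [{'label': 'POSITIVE', 'score': 0.99...}]
```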

What You'll Learn

  • This course is not an introduction to Python programming or machine learning concepts. It requires the following prerequisites:
    • PY-101 "Introduction to Python Programming"
    • ML-101 "Introduction to Machine Learning with Azure Cloud"
  • NLP-101 focuses on deep learning for machine translation, speech-to-text, text-to-speech, language modeling, question answering, and many other NLP domains.
  • Readers who can benefit the most from this course are:
    • Deep learning and NLP practitioners with Python programming familiarity.
    • Data analysts and data scientists who want an introduction to AI language understanding to handle the growing volume of language-driven tasks.

Course Content

  • Chapter I: Introduction to Transformer Architectures
    • 1: Getting Started with the Model Architecture of the Transformer

    • 2: Fine-Tuning BERT Models, which builds on the architecture of the original Transformer (a minimal fine-tuning sketch appears after the course outline).

    • 3: Pretraining a RoBERTa Model from Scratch, which builds a RoBERTa transformer model from scratch using the Hugging Face PyTorch modules.

      • Module 1: Model Architecture of the Transformer
        • 1: Cognitive Dissonance

        • 2: Emotional Reactions to Fake News

        • 3: A Behavioral Representation of Fake News

        • 4: A Rational Approach to Fake News

        • 5: A Fake News Resolution Roadmap

        • 6: Applying Sentiment Analysis Transformer Tasks to Social Media

        • 7: Analyzing Gun Control Perceptions with NER and SRL

        • 8: Using Information Extracted by Transformers to Find Reliable Websites

        • 9: Using Transformers to Produce Results for Educational Purposes

        • 10: How to Read Former President Trump's Tweets with an Objective but Critical Eye

  • Chapter II: Applying Transformers for Natural Language Understanding and Generation
    • 4: Downstream NLP Tasks with Transformers

    • 5: Machine Translation with the Transformer

    • 6: Text Generation with OpenAI GPT-2 and GPT-3 Models (a text-generation sketch appears after the course outline)

    • 7: Applying Transformers to Legal and Financial Documents for AI Text Summarization

    • 8: Matching Tokenizers and Datasets

    • 9: Understanding Semantic Role Labeling with BERT-Based Transformers

  • Chapter III: Advanced Language Understanding Techniques
    • 10: Let Your Data Do the Talking: Story, Questions, and Answers

    • 11: Detecting Customer Emotions to Make Predictions

    • 12: Analyzing Deepfakes with Transformers
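As a taste of the BERT fine-tuning material in Chapter I, the following is a minimal sketch of a single training step for sentence classification with a BERT checkpoint. It assumes the Hugging Face transformers library and PyTorch are installed; the bert-base-uncased checkpoint, the two example sentences, and their labels are illustrative assumptions, not course data.

```python
# Minimal sketch, not from the course materials.
# Assumes `transformers` and `torch` are installed; the texts and labels are made up.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModelForSequenceClassification.from_pretrained("bert-base-uncased", num_labels=2)

texts = ["I loved this course.", "The lecture was hard to follow."]
labels = torch.tensor([1, 0])  # 1 = positive, 0 = negative (illustrative labels)

# Tokenize the batch and run one training step; the model returns a loss
# when labels are supplied.
batch = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")
optimizer = torch.optim.AdamW(model.parameters(), lr=2e-5)

model.train()
outputs = model(**batch, labels=labels)
outputs.loss.backward()
optimizer.step()
print(f"training loss: {outputs.loss.item():.4f}")
```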
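For the text-generation chapter in Chapter II, a similarly minimal sketch uses the publicly available gpt2 checkpoint; the prompt and generation settings are illustrative assumptions.

```python
# Minimal sketch, not from the course materials.
# Assumes `transformers` and PyTorch are installed; the prompt is illustrative.
from transformers import pipeline

# Load the publicly available GPT-2 checkpoint for causal text generation.
generator = pipeline("text-generation", model="gpt2")

outputs = generator(
    "Transformers are changing natural language processing because",
    max_new_tokens=30,       # generate up to 30 new tokens
    num_return_sequences=1,  # one continuation is enough for a demo
)
print(outputs[0]["generated_text"])
```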

Free
  • Lessons: 12
  • Skill Level: Expert
  • Last Update: May 30, 2024