
Natural Language Processing- NLP With Transformers in Python

  • Development
  • May 05, 2025

Natural Language Processing: NLP With Transformers in Python, available at $84.99, has an average rating of 4.47 based on 2110 reviews, comprises 105 lectures, and has 28414 subscribers.


Enroll now: Natural Language Processing: NLP With Transformers in Python

Summary

Title: Natural Language Processing: NLP With Transformers in Python

Price: $84.99

Average Rating: 4.47

Number of Lectures: 105

Number of Published Lectures: 104

Number of Curriculum Items: 109

Number of Published Curriculum Objects: 108

Original Price: $24.99

Quality Status: approved

Status: Live

What You Will Learn

  • Industry-standard NLP using transformer models
  • Build full-stack question-answering transformer models
  • Perform sentiment analysis with transformer models in PyTorch and TensorFlow
  • Advanced search technologies like Elasticsearch and Facebook AI Similarity Search (FAISS)
  • Create fine-tuned transformer models for specialized use-cases
  • Measure performance of language models using advanced metrics like ROUGE
  • Vector building techniques like BM25 or dense passage retrievers (DPR)
  • An overview of recent developments in NLP
  • Understand attention and other key components of transformers
  • Learn about key transformer models such as BERT
  • Preprocess text data for NLP
  • Named entity recognition (NER) using spaCy and transformers (see the sketch after this list)
  • Fine-tune language classification models
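For a taste of these topics, here is a minimal named entity recognition sketch with spaCy; the en_core_web_sm model and the sample sentence are illustrative assumptions, not code from the course:

    # Minimal NER sketch with spaCy (illustrative, not the course's own code).
    # Assumes the small English model is installed first:
    #   python -m spacy download en_core_web_sm
    import spacy

    nlp = spacy.load("en_core_web_sm")
    doc = nlp("Apple is looking at buying a U.K. startup for $1 billion.")

    # Each detected entity exposes its text span and a label such as ORG or MONEY.
    for ent in doc.ents:
        print(ent.text, ent.label_)
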
Who Should Attend

  • Aspiring data scientists and ML engineers interested in NLP
  • Practitioners looking to upgrade their skills
  • Developers looking to implement NLP solutions
  • Data scientist
  • Machine Learning Engineer
  • Python Developers

Description

Transformer models are the de facto standard in modern NLP. They have proven themselves to be the most expressive, powerful models for language by a large margin, beating all major language-based benchmarks time and time again.

In this course, we cover everything you need to get started with building cutting-edge, high-performance NLP applications using transformer models like Google AI's BERT or Facebook AI's DPR.

    We cover several key NLP frameworks including:

  • HuggingFace’s Transformers

  • TensorFlow 2

  • PyTorch

  • spaCy

  • NLTK

  • Flair

We also learn how to apply transformers to some of the most popular NLP use-cases (a short sketch follows this list):

  • Language classification/sentiment analysis

  • Named entity recognition (NER)

  • Question answering (Q&A)

  • Similarity/comparative learning
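As a flavour of the first use-case, transformer-based sentiment analysis can be run in a few lines with Hugging Face's pipeline API. This is a minimal sketch assuming the transformers library; the pipeline downloads a default pretrained checkpoint, and the example sentence is ours rather than the course's:

    # Minimal sentiment-analysis sketch using Hugging Face transformers.
    # Illustrative only; pipeline() pulls a default pretrained model.
    from transformers import pipeline

    classifier = pipeline("sentiment-analysis")
    result = classifier("Transformers make state-of-the-art NLP surprisingly accessible.")
    print(result)  # e.g. [{'label': 'POSITIVE', 'score': 0.99}]
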

Throughout each of these use-cases we work through a variety of examples to make clear what transformers are, how they work, and why they are so important. Alongside these sections we also work through two full-size NLP projects: one for sentiment analysis of financial Reddit data, and another covering a fully fledged open-domain question-answering application.

All of this is supported by several other sections that help us learn how to better design, implement, and measure the performance of our models, such as:

  • History of NLP and where transformers come from

  • Common preprocessing techniques for NLP

  • The theory behind transformers

  • How to fine-tune transformers

We cover all this and more (a short fine-tuning sketch follows below). I look forward to seeing you in the course!
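As promised above, here is a rough sketch of what fine-tuning a transformer classifier can look like with the Hugging Face Trainer API. The bert-base-uncased checkpoint, the imdb dataset, and every hyperparameter below are illustrative assumptions, not the course's own choices:

    # Minimal fine-tuning sketch with the Hugging Face Trainer API.
    # All model/dataset names and hyperparameters are illustrative assumptions.
    from datasets import load_dataset
    from transformers import (AutoModelForSequenceClassification,
                              AutoTokenizer, Trainer, TrainingArguments)

    dataset = load_dataset("imdb")  # example dataset, not the course's data
    tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")

    def tokenize(batch):
        # Truncate/pad each review to a fixed length for batching.
        return tokenizer(batch["text"], truncation=True,
                         padding="max_length", max_length=128)

    dataset = dataset.map(tokenize, batched=True)

    model = AutoModelForSequenceClassification.from_pretrained(
        "bert-base-uncased", num_labels=2)

    args = TrainingArguments(output_dir="out",
                             per_device_train_batch_size=8,
                             num_train_epochs=1)

    # Train on a small subset just to demonstrate the mechanics.
    trainer = Trainer(model=model, args=args,
                      train_dataset=dataset["train"].shuffle(seed=42)
                                                    .select(range(2000)))
    trainer.train()
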

Course Curriculum

    Chapter 1: Introduction

    Lecture 1: Introduction

    Lecture 2: Course Overview

    Lecture 3: Hello! and Further Resources

    Lecture 4: Environment Setup

    Lecture 5: Alternative Local Setup

    Lecture 6: Alternative Colab Setup

    Lecture 7: CUDA Setup

    Lecture 8: Apple Silicon Setup

    Chapter 2: NLP and Transformers

    Lecture 1: The Three Eras of AI

    Lecture 2: Pros and Cons of Neural AI

    Lecture 3: Word Vectors

    Lecture 4: Recurrent Neural Networks

    Lecture 5: Long Short-Term Memory

    Lecture 6: Encoder-Decoder Attention

    Lecture 7: Self-Attention

    Lecture 8: Multi-head Attention

    Lecture 9: Positional Encoding

    Lecture 10: Transformer Heads

    Chapter 3: Preprocessing for NLP

    Lecture 1: Stopwords

    Lecture 2: Tokens Introduction

    Lecture 3: Model-Specific Special Tokens

    Lecture 4: Stemming

    Lecture 5: Lemmatization

    Lecture 6: Unicode Normalization – Canonical and Compatibility Equivalence

    Lecture 7: Unicode Normalization – Composition and Decomposition

    Lecture 8: Unicode Normalization – NFD and NFC

    Lecture 9: Unicode Normalization – NFKD and NFKC

    Chapter 4: Attention

    Lecture 1: Attention Introduction

    Lecture 2: Alignment With Dot-Product

    Lecture 3: Dot-Product Attention

    Lecture 4: Self Attention

    Lecture 5: Bidirectional Attention

    Lecture 6: Multi-head and Scaled Dot-Product Attention

    Chapter 5: Language Classification

    Lecture 1: Introduction to Sentiment Analysis

    Lecture 2: Prebuilt Flair Models

    Lecture 3: Introduction to Sentiment Models With Transformers

    Lecture 4: Tokenization And Special Tokens For BERT

    Lecture 5: Making Predictions

    Chapter 6: [Project] Sentiment Model With TensorFlow and Transformers

    Lecture 1: Project Overview

    Lecture 2: Getting the Data (Kaggle API)

    Lecture 3: Preprocessing

    Lecture 4: Building a Dataset

    Lecture 5: Dataset Shuffle, Batch, Split, and Save

    Lecture 6: Build and Save

    Lecture 7: Loading and Prediction

    Chapter 7: Long Text Classification With BERT

    Lecture 1: Classification of Long Text Using Windows

    Lecture 2: Window Method in PyTorch

    Chapter 8: Named Entity Recognition (NER)

    Lecture 1: Introduction to spaCy

    Lecture 2: Extracting Entities

    Lecture 3: Authenticating With The Reddit API

    Lecture 4: Pulling Data With The Reddit API

    Lecture 5: Extracting ORGs From Reddit Data

    Lecture 6: Getting Entity Frequency

    Lecture 7: Entity Blacklist

    Lecture 8: NER With Sentiment

    Lecture 9: NER With RoBERTa

    Chapter 9: Question and Answering

    Lecture 1: Open Domain and Reading Comprehension

    Lecture 2: Retrievers, Readers, and Generators

    Lecture 3: Intro to SQuAD 2.0

    Lecture 4: Processing SQuAD Training Data

    Lecture 5: (Optional) Processing SQuAD Training Data with Match-Case

    Lecture 6: Our First Q&A Model

    Chapter 10: Metrics For Language

    Lecture 1: Q&A Performance With Exact Match (EM)

    Lecture 2: Introducing the ROUGE Metric

    Lecture 3: ROUGE in Python

    Lecture 4: Applying ROUGE to Q&A

    Lecture 5: Recall, Precision and F1

    Lecture 6: Longest Common Subsequence (LCS)

    Chapter 11: Reader-Retriever QA With Haystack

    Lecture 1: Intro to Retriever-Reader and Haystack

    Lecture 2: What is Elasticsearch?

    Lecture 3: Elasticsearch Setup (Windows)

    Lecture 4: Elasticsearch Setup (Linux)

    Lecture 5: Elasticsearch in Haystack

    Lecture 6: Sparse Retrievers

    Lecture 7: Cleaning the Index

    Lecture 8: Implementing a BM25 Retriever

    Lecture 9: What is FAISS?

    Lecture 10: Further Materials for FAISS

    Lecture 11: FAISS in Haystack

    Lecture 12: What is DPR?

    Lecture 13: The DPR Architecture

    Lecture 14: Retriever-Reader Stack

    Chapter 12: [Project] Open-Domain QA

    Lecture 1: ODQA Stack Structure

    Lecture 2: Creating the Database

    Lecture 3: Building the Haystack Pipeline

    Chapter 13: Similarity

Instructors

  • James Briggs
    ML Engineer
Rating Distribution

  • 1 star: 39 votes
  • 2 stars: 57 votes
  • 3 stars: 182 votes
  • 4 stars: 695 votes
  • 5 stars: 1137 votes
Frequently Asked Questions

    How long do I have access to the course materials?

    You can view and review the lecture materials indefinitely, like an on-demand channel.

    Can I take my courses with me wherever I go?

    Definitely! If you have an internet connection, courses on Udemy are available on any device at any time. If you don’t have an internet connection, some instructors also let their students download course lectures. That’s up to the instructor though, so make sure you get on their good side!