Natural Language Processing: NLP With Transformers in Python

Last updated 8/2022
MP4 | Video: h264, 1280x720 | Audio: AAC, 44.1 KHz
Language: English | Size: 3.67 GB | Duration: 11h 31m

Learn next-generation NLP with transformers for sentiment analysis, Q&A, similarity search, NER, and more

What you'll learn
Industry standard NLP using transformer models
Build full-stack question-answering transformer models
Perform sentiment analysis with transformer models in PyTorch and TensorFlow
Advanced search technologies like Elasticsearch and Facebook AI Similarity Search (FAISS)
Create fine-tuned transformer models for specialized use-cases
Measure performance of language models using advanced metrics like ROUGE
Vector building techniques like BM25 or dense passage retrievers (DPR)
An overview of recent developments in NLP
Understand attention and other key components of transformers
Learn about key transformers models such as BERT
Preprocess text data for NLP
Named entity recognition (NER) using spaCy and transformers
Fine-tune language classification models

Requirements
Knowledge of Python
Experience in data science a plus
Experience in NLP a plus

Description
Transformer models are the de facto standard in modern NLP. They have proven themselves to be the most expressive, powerful models for language by a large margin, beating all major language benchmarks time and time again.

In this course, we cover everything you need to get started building cutting-edge NLP applications with transformer models like Google AI's BERT or Facebook AI's DPR.

We cover several key NLP frameworks, including:

HuggingFace's Transformers
TensorFlow 2
PyTorch
spaCy
NLTK
Flair

And we learn how to apply transformers to some of the most popular NLP use-cases:

Language classification/sentiment analysis
Named entity recognition (NER)
Question answering (Q&A)
Similarity/comparative learning

Throughout each of these use-cases we work through a variety of examples to ensure that we understand what transformers are, how they work, and why they matter. Alongside these sections we also work through two full-size NLP projects: one on sentiment analysis of financial Reddit data, and another covering a fully fledged open-domain question-answering application.

All of this is supported by several other sections that show how to better design, implement, and measure the performance of our models, including:

The history of NLP and where transformers come from
Common preprocessing techniques for NLP
The theory behind transformers
How to fine-tune transformers

We cover all this and more. I look forward to seeing you in the course!
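
As a quick taste of the workflow the course builds toward, here is a minimal sentiment-analysis sketch using HuggingFace's Transformers pipeline API. The checkpoint name is an illustrative public model, not necessarily the one used in the course.

# Minimal sentiment analysis with the HuggingFace Transformers pipeline.
# Assumes `pip install transformers torch`; the model name is illustrative.
from transformers import pipeline

classifier = pipeline(
    "sentiment-analysis",
    model="distilbert-base-uncased-finetuned-sst-2-english",
)

print(classifier("Transformers have completely changed modern NLP."))
# e.g. [{'label': 'POSITIVE', 'score': 0.99...}]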

Overview
Section 1: Introduction

Lecture 1 Introduction

Lecture 2 Course Overview

Lecture 3 Hello! and Further Resources

Lecture 4 Environment Setup

Lecture 5 Alternative Local Setup

Lecture 6 Alternative Colab Setup

Lecture 7 CUDA Setup

Lecture 8 Apple Silicon Setup
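
The setup lectures cover installing the core libraries and enabling GPU acceleration on CUDA or Apple Silicon. A minimal sanity check along those lines, assuming PyTorch is already installed, might look like this:

# Quick environment sanity check, assuming PyTorch is installed
# (e.g. via `pip install torch transformers`).
import torch

print("PyTorch version:", torch.__version__)
print("CUDA available:", torch.cuda.is_available())         # NVIDIA GPUs
print("MPS available:", torch.backends.mps.is_available())  # Apple Silicon, PyTorch 1.12+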

Section 2: NLP and Transformers

Lecture 9 The Three Eras of AI

Lecture 10 Pros and Cons of Neural AI

Lecture 11 Word Vectors

Lecture 12 Recurrent Neural Networks

Lecture 13 Long Short-Term Memory

Lecture 14 Encoder-Decoder Attention

Lecture 15 Self-Attention

Lecture 16 Multi-head Attention

Lecture 17 Positional Encoding

Lecture 18 Transformer Heads
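
To make the positional-encoding lecture concrete, here is a short NumPy sketch of the sinusoidal encoding from the original transformer paper, PE(pos, 2i) = sin(pos / 10000^(2i/d_model)) and PE(pos, 2i+1) = cos(pos / 10000^(2i/d_model)):

# Sinusoidal positional encoding from "Attention Is All You Need".
import numpy as np

def positional_encoding(seq_len, d_model):
    pos = np.arange(seq_len)[:, None]                  # (seq_len, 1)
    i = np.arange(d_model)[None, :]                    # (1, d_model)
    angle = pos / np.power(10000, (2 * (i // 2)) / d_model)
    pe = np.zeros((seq_len, d_model))
    pe[:, 0::2] = np.sin(angle[:, 0::2])               # even dimensions
    pe[:, 1::2] = np.cos(angle[:, 1::2])               # odd dimensions
    return pe

print(positional_encoding(128, 512).shape)  # (128, 512)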

Section 3: Preprocessing for NLP

Lecture 19 Stopwords

Lecture 20 Tokens Introduction

Lecture 21 Model-Specific Special Tokens

Lecture 22 Stemming

Lecture 23 Lemmatization

Lecture 24 Unicode Normalization - Canonical and Compatibility Equivalence

Lecture 25 Unicode Normalization - Composition and Decomposition

Lecture 26 Unicode Normalization - NFD and NFC

Lecture 27 Unicode Normalization - NFKD and NFKC
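
The Unicode normalization lectures compare the four normal forms; Python's standard library exposes all of them through unicodedata.normalize, as in this small sketch:

# Comparing Unicode normal forms with Python's standard library.
import unicodedata

text = "\u0043\u0327 and \u00C7"   # "C" + combining cedilla vs. precomposed "Ç"

for form in ("NFC", "NFD", "NFKC", "NFKD"):
    normalized = unicodedata.normalize(form, text)
    print(form, [hex(ord(c)) for c in normalized])

# NFC/NFKC compose characters ("Ç" becomes a single code point), while
# NFD/NFKD decompose them ("C" followed by U+0327).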

Section 4: Attention

Lecture 28 Attention Introduction

Lecture 29 Alignment With Dot-Product

Lecture 30 Dot-Product Attention

Lecture 31 Self Attention

Lecture 32 Bidirectional Attention

Lecture 33 Multi-head and Scaled Dot-Product Attention
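
This section builds up to scaled dot-product attention, softmax(QK^T / sqrt(d_k))V. A compact PyTorch sketch of that formula:

# Scaled dot-product attention: softmax(Q @ K^T / sqrt(d_k)) @ V
import torch
import torch.nn.functional as F

def scaled_dot_product_attention(q, k, v, mask=None):
    d_k = q.size(-1)
    scores = q @ k.transpose(-2, -1) / d_k ** 0.5   # (..., seq_q, seq_k)
    if mask is not None:
        scores = scores.masked_fill(mask == 0, float("-inf"))
    weights = F.softmax(scores, dim=-1)             # attention weights
    return weights @ v                              # (..., seq_q, d_v)

q = k = v = torch.randn(2, 8, 64)                   # (batch, seq, dim)
print(scaled_dot_product_attention(q, k, v).shape)  # torch.Size([2, 8, 64])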

Section 5: Language Classification

Lecture 34 Introduction to Sentiment Analysis

Lecture 35 Prebuilt Flair Models

Lecture 36 Introduction to Sentiment Models With Transformers

Lecture 37 Tokenization And Special Tokens For BERT

Lecture 38 Making Predictions
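
Following the sentiment lectures above, a minimal prediction sketch with a transformer classifier in PyTorch might look like the following; the checkpoint is an illustrative public model, not necessarily the course's.

# Sentiment prediction with a transformer classifier in PyTorch.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

name = "distilbert-base-uncased-finetuned-sst-2-english"  # illustrative checkpoint
tokenizer = AutoTokenizer.from_pretrained(name)
model = AutoModelForSequenceClassification.from_pretrained(name)

tokens = tokenizer("Markets reacted positively to the news.",
                   return_tensors="pt", truncation=True, padding=True)
with torch.no_grad():
    logits = model(**tokens).logits
probs = torch.softmax(logits, dim=-1)
print(model.config.id2label[int(probs.argmax())], probs.max().item())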

Section 6: [Project] Sentiment Model With TensorFlow and Transformers

Lecture 39 Project Overview

Lecture 40 Getting the Data (Kaggle API)

Lecture 41 Preprocessing

Lecture 42 Building a Dataset

Lecture 43 Dataset Shuffle, Batch, Split, and Save

Lecture 44 Build and Save

Lecture 45 Loading and Prediction
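
For the TensorFlow project, the shuffle/batch/split/save step can be sketched roughly as follows with tf.data; the array shapes and the 90/10 split ratio are assumptions for illustration only.

# Rough sketch of dataset shuffle, batch, split, and save with tf.data.
import numpy as np
import tensorflow as tf

input_ids = np.random.randint(0, 30000, (1000, 512))   # stand-in for tokenized text
labels = np.random.randint(0, 2, (1000,))

dataset = tf.data.Dataset.from_tensor_slices((input_ids, labels))
dataset = dataset.shuffle(10_000).batch(16)

num_batches = int(dataset.cardinality().numpy())
train_ds = dataset.take(int(num_batches * 0.9))
val_ds = dataset.skip(int(num_batches * 0.9))

train_ds.save("train_ds")   # TF 2.10+; older versions use tf.data.experimental.save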

Section 7: Long Text Classification With BERT

Lecture 46 Classification of Long Text Using Windows

Lecture 47 Window Method in PyTorch
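
The windowing idea is to tokenize the full text, split the token IDs into overlapping chunks that fit the model's 512-token limit, classify each chunk, and average the probabilities. A rough PyTorch sketch under those assumptions (checkpoint, window size, and stride are illustrative):

# Window method for long-text classification: overlapping 512-token windows,
# classify each window, then average the class probabilities.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

name = "distilbert-base-uncased-finetuned-sst-2-english"  # illustrative checkpoint
tokenizer = AutoTokenizer.from_pretrained(name)
model = AutoModelForSequenceClassification.from_pretrained(name)

long_text = "very long document ... " * 500
ids = tokenizer(long_text, add_special_tokens=False)["input_ids"]

window, stride, probs = 510, 256, []
for start in range(0, len(ids), stride):
    chunk = [tokenizer.cls_token_id] + ids[start:start + window] + [tokenizer.sep_token_id]
    input_ids = torch.tensor([chunk])
    with torch.no_grad():
        logits = model(input_ids, attention_mask=torch.ones_like(input_ids)).logits
    probs.append(torch.softmax(logits, dim=-1))

print(torch.cat(probs).mean(dim=0))   # averaged class probabilities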

Section 8: Named Entity Recognition (NER)

Lecture 48 Introduction to spaCy

Lecture 49 Extracting Entities

Lecture 50 Authenticating With The Reddit API

Lecture 51 Pulling Data With The Reddit API

Lecture 52 Extracting ORGs From Reddit Data

Lecture 53 Getting Entity Frequency

Lecture 54 Entity Blacklist

Lecture 55 NER With Sentiment

Lecture 56 NER With RoBERTa
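
The spaCy side of this section boils down to loading a pipeline and reading entities off a processed document; extracting ORG entities, as in the Reddit lectures, might look roughly like this (en_core_web_sm must be downloaded first):

# Extracting ORG entities with spaCy.
# Requires: pip install spacy && python -m spacy download en_core_web_sm
import spacy

nlp = spacy.load("en_core_web_sm")
doc = nlp("ARK Invest bought more Tesla stock, while analysts at Goldman Sachs disagreed.")

orgs = [ent.text for ent in doc.ents if ent.label_ == "ORG"]
print(orgs)   # e.g. ['ARK Invest', 'Tesla', 'Goldman Sachs']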

Section 9: Question and Answering

Lecture 57 Open Domain and Reading Comprehension

Lecture 58 Retrievers, Readers, and Generators

Lecture 59 Intro to SQuAD 2.0

Lecture 60 Processing SQuAD Training Data

Lecture 61 (Optional) Processing SQuAD Training Data with Match-Case

Lecture 62 Our First Q&A Model
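
A first extractive Q&A model along the lines of this section can be sketched with a SQuAD-trained checkpoint and the Transformers question-answering pipeline; the model name is an illustrative public checkpoint.

# Minimal extractive Q&A with a SQuAD 2.0-trained checkpoint.
from transformers import pipeline

qa = pipeline("question-answering", model="deepset/roberta-base-squad2")

context = ("The Transformer architecture was introduced in 2017 in the paper "
           "'Attention Is All You Need'.")
answer = qa(question="When was the Transformer architecture introduced?",
            context=context)
print(answer)   # {'answer': '2017', 'score': ..., 'start': ..., 'end': ...}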

Section 10: Metrics For Language

Lecture 63 Q&A Performance With Exact Match (EM)

Lecture 64 Introducing the ROUGE Metric

Lecture 65 ROUGE in Python

Lecture 66 Applying ROUGE to Q&A

Lecture 67 Recall, Precision and F1

Lecture 68 Longest Common Subsequence (LCS)
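
Exact match and token-level F1 (built from precision and recall over shared tokens) can be computed without any extra libraries; a simplified sketch of the SQuAD-style calculation:

# Simplified SQuAD-style metrics: exact match and token-level F1.
def exact_match(prediction, reference):
    return int(prediction.strip().lower() == reference.strip().lower())

def token_f1(prediction, reference):
    pred_tokens = prediction.lower().split()
    ref_tokens = reference.lower().split()
    shared = sum(min(pred_tokens.count(t), ref_tokens.count(t))
                 for t in set(pred_tokens))
    if shared == 0:
        return 0.0
    precision = shared / len(pred_tokens)   # fraction of prediction that is correct
    recall = shared / len(ref_tokens)       # fraction of reference that is covered
    return 2 * precision * recall / (precision + recall)

print(exact_match("in 2017", "2017"))           # 0
print(round(token_f1("in 2017", "2017"), 2))    # 0.67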

Section 11: Reader-Retriever QA With Haystack

Lecture 69 Intro to Retriever-Reader and Haystack

Lecture 70 What is Elasticsearch?

Lecture 71 Elasticsearch Setup (Windows)

Lecture 72 Elasticsearch Setup (Linux)

Lecture 73 Elasticsearch in Haystack

Lecture 74 Sparse Retrievers

Lecture 75 Cleaning the Index

Lecture 76 Implementing a BM25 Retriever

Lecture 77 What is FAISS?

Lecture 78 Further Materials for Faiss

Lecture 79 FAISS in Haystack

Lecture 80 What is DPR?

Lecture 81 The DPR Architecture

Lecture 82 Retriever-Reader Stack
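
Independently of Haystack, the core FAISS workflow covered here is to build an index over dense vectors and then search it with query vectors. A small self-contained sketch using random vectors as stand-ins for DPR embeddings:

# Core FAISS workflow: index dense vectors, then search with query vectors.
import numpy as np
import faiss

dim = 768
doc_vectors = np.random.random((1000, dim)).astype("float32")
query_vectors = np.random.random((5, dim)).astype("float32")

index = faiss.IndexFlatL2(dim)        # exact L2 search, no training needed
index.add(doc_vectors)

distances, ids = index.search(query_vectors, 5)
print(ids.shape)                      # (5, 5): top-5 document ids per query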

Section 12: [Project] Open-Domain QA

Lecture 83 ODQA Stack Structure

Lecture 84 Creating the Database

Lecture 85 Building the Haystack Pipeline
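
As a sketch of how the ODQA stack fits together: a document store backs a retriever, the retriever feeds a reader, and a pipeline ties them together. Import paths below follow the Haystack 1.x layout and may differ in the version used in the course; the local Elasticsearch instance and model name are assumptions.

# Rough sketch of a retriever-reader ODQA stack in Haystack 1.x.
from haystack.document_stores import ElasticsearchDocumentStore
from haystack.nodes import BM25Retriever, FARMReader
from haystack.pipelines import ExtractiveQAPipeline

document_store = ElasticsearchDocumentStore(host="localhost", index="documents")
retriever = BM25Retriever(document_store=document_store)
reader = FARMReader(model_name_or_path="deepset/roberta-base-squad2")

pipeline = ExtractiveQAPipeline(reader, retriever)
result = pipeline.run(
    query="What is a dense passage retriever?",
    params={"Retriever": {"top_k": 10}, "Reader": {"top_k": 3}},
)
print(result["answers"][0].answer)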

Section 13: Similarity

Lecture 86 Introduction to Similarity

Lecture 87 Extracting The Last Hidden State Tensor

Lecture 88 Sentence Vectors With Mean Pooling

Lecture 89 Using Cosine Similarity

Lecture 90 Similarity With Sentence-Transformers

Lecture 91 Further Learning
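
The core recipe in this section is: take the last hidden state, mean-pool it using the attention mask to get one vector per sentence, then compare vectors with cosine similarity. A condensed sketch (bert-base-uncased is just an illustrative checkpoint):

# Sentence vectors via mean pooling over the last hidden state, then cosine similarity.
import torch
from transformers import AutoTokenizer, AutoModel

name = "bert-base-uncased"   # illustrative checkpoint
tokenizer = AutoTokenizer.from_pretrained(name)
model = AutoModel.from_pretrained(name)

sentences = ["The markets fell sharply today.",
             "Stocks dropped significantly this afternoon."]
tokens = tokenizer(sentences, padding=True, truncation=True, return_tensors="pt")

with torch.no_grad():
    hidden = model(**tokens).last_hidden_state          # (batch, seq, dim)

mask = tokens["attention_mask"].unsqueeze(-1)            # ignore padding tokens
vectors = (hidden * mask).sum(dim=1) / mask.sum(dim=1)   # mean pooling

print(torch.cosine_similarity(vectors[0], vectors[1], dim=0).item())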

Section 14: Pre-Training Transformer Models

Lecture 92 Visual Guide to BERT Pretraining

Lecture 93 Introduction to BERT For Pretraining Code

Lecture 94 BERT Pretraining - Masked-Language Modeling (MLM)

Lecture 95 BERT Pretraining - Next Sentence Prediction (NSP)

Lecture 96 The Logic of MLM

Lecture 97 Pre-training with MLM - Data Preparation

Lecture 98 Pre-training with MLM - Training

Lecture 99 Pre-training with MLM - Training with Trainer

Lecture 100 The Logic of NSP

Lecture 101 Pre-training with NSP - Data Preparation

Lecture 102 Pre-training with NSP - DataLoader

Lecture 103 The Logic of MLM and NSP

Lecture 104 Pre-training with MLM and NSP - Data Preparation
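
The "logic of MLM" lectures come down to: clone the input IDs as labels, randomly select roughly 15% of non-special tokens, and replace them with the [MASK] token before training. A small sketch of that masking step (the checkpoint is illustrative):

# Core MLM masking logic: copy input_ids as labels, then mask ~15% of
# ordinary tokens with [MASK] before training.
import torch
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
batch = tokenizer(["Transformers learn language by predicting masked words."],
                  return_tensors="pt", padding=True)

input_ids = batch["input_ids"]
labels = input_ids.clone()                       # targets for the MLM head

rand = torch.rand(input_ids.shape)
special = (input_ids == tokenizer.cls_token_id) | \
          (input_ids == tokenizer.sep_token_id) | \
          (input_ids == tokenizer.pad_token_id)
mask = (rand < 0.15) & ~special                  # ~15% of ordinary tokens

input_ids[mask] = tokenizer.mask_token_id        # replace with [MASK]
print(tokenizer.decode(input_ids[0]))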

Who this course is for
Aspiring data scientists and ML engineers interested in NLP
Practitioners looking to upgrade their skills
Developers looking to implement NLP solutions
Data scientists
Machine learning engineers
Python developers
