Machine Learning: Natural Language Processing in Python (V2)

NLP: Use Markov Models, NLTK, Artificial Intelligence, Deep Learning, Machine Learning, and Data Science in Python

Generative AI
4.7/5
$54.99
$219.99
75% OFF!
  • All levels
  • 180 Lectures
  • 25h 32m
  • English
  • Lifetime access, certificate of completion (shareable on LinkedIn, Facebook, and Twitter), Q&A forum, subtitles in English

Course Description

Hello friends!

Welcome to Machine Learning: Natural Language Processing in Python (Version 2).

This is a massive 4-in-1 course covering:

  • 1) Vector models and text preprocessing methods
  • 2) Probability models and Markov models
  • 3) Machine learning methods
  • 4) Deep learning and neural network methods


In part 1, which covers vector models and text preprocessing methods, you will learn why vectors are so essential in data science and artificial intelligence. You will learn various techniques for converting text into vectors, such as the CountVectorizer and TF-IDF, and you'll learn the basics of neural embedding methods like word2vec and GloVe.
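To preview the idea (and anticipating the "How to Build TF-IDF From Scratch" lecture), here is a minimal pure-Python sketch of count vectors and TF-IDF. The toy corpus is made up, and idf = log(N / df) is just one of several common TF-IDF variants (scikit-learn's TfidfVectorizer, for instance, uses a smoothed formula):

```python
import math
from collections import Counter

docs = [
    "the cat sat on the mat",
    "the dog sat on the log",
    "cats and dogs are pets",
]

# Build the vocabulary: one index per unique token.
tokens = [d.split() for d in docs]
vocab = sorted(set(w for doc in tokens for w in doc))
index = {w: i for i, w in enumerate(vocab)}

# Count vectors: raw term counts per document (like CountVectorizer).
counts = []
for doc in tokens:
    vec = [0] * len(vocab)
    for w, c in Counter(doc).items():
        vec[index[w]] = c
    counts.append(vec)

# TF-IDF: term frequency times inverse document frequency,
# with idf(t) = log(N / df(t)) -- one common variant among several.
N = len(docs)
df = [sum(1 for vec in counts if vec[j] > 0) for j in range(len(vocab))]
tfidf = [[vec[j] * math.log(N / df[j]) for j in range(len(vocab))]
         for vec in counts]
```

Words that appear in many documents get a small idf and thus a small weight, which is exactly why TF-IDF down-weights common words relative to raw counts.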

You'll then apply what you've learned to various tasks, such as:

  • Text classification
  • Document retrieval / search engine
  • Text summarization


Along the way, you'll also learn important text preprocessing steps, such as tokenization, stemming, and lemmatization.
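To illustrate what these preprocessing steps do, here is a deliberately naive pure-Python sketch. Real pipelines would use a proper tokenizer and stemmer (e.g. NLTK's word_tokenize and PorterStemmer), and lemmatization additionally requires dictionary knowledge (mapping "were" to "be", for example), which this toy code does not attempt:

```python
import re

def tokenize(text):
    # Lowercase and keep only runs of letters (a toy tokenizer;
    # NLTK's word_tokenize handles punctuation and contractions properly).
    return re.findall(r"[a-z]+", text.lower())

def naive_stem(word):
    # Crude suffix stripping, just to illustrate the idea; a real
    # stemmer like Porter's applies ordered rewrite rules and so
    # avoids mistakes like "running" -> "runn" below.
    for suffix in ("ing", "ed", "es", "s"):
        if word.endswith(suffix) and len(word) > len(suffix) + 2:
            return word[: -len(suffix)]
    return word

tokens = tokenize("The dogs were running and jumped over fences.")
stems = [naive_stem(t) for t in tokens]
```

Even this crude version shows the point of stemming: "dogs" and "dog" collapse to the same token, shrinking the vocabulary before vectorization.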

You'll also be briefly introduced to classic NLP tasks such as part-of-speech tagging.

In part 2, which covers probability models and Markov models, you'll learn about one of the most important models in all of data science and machine learning in the past 100 years. It has been applied in many areas in addition to NLP, such as finance, bioinformatics, and reinforcement learning.

In this course, you'll see how such probability models can be used in various ways, such as:

  • Building a text classifier
  • Article spinning
  • Text generation (generating poetry)


Importantly, these methods are an essential prerequisite for understanding how the latest Transformer (attention) models, such as BERT and GPT-3, work. Specifically, we'll learn about two important tasks that correspond to the pre-training objectives for BERT and GPT.
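As a taste of part 2, a first-order Markov model over words can be estimated and sampled in a few lines of plain Python. This is a toy version on a made-up corpus, with none of the smoothing or log-probabilities the course adds:

```python
import random
from collections import defaultdict

corpus = "the cat sat on the mat the cat ran to the dog".split()

# First-order Markov model: P(next word | current word), estimated
# by counting bigram transitions observed in the corpus.
transitions = defaultdict(list)
for prev, nxt in zip(corpus, corpus[1:]):
    transitions[prev].append(nxt)

def generate(start, length, seed=0):
    # Sample a chain of words by following the transition table;
    # storing repeated successors in a list makes random.choice
    # sample them in proportion to their observed frequency.
    rng = random.Random(seed)
    out = [start]
    for _ in range(length - 1):
        choices = transitions.get(out[-1])
        if not choices:
            break  # dead end: no observed successor
        out.append(rng.choice(choices))
    return " ".join(out)

print(generate("the", 8))
```

Every generated sentence is guaranteed to consist of bigrams seen in training, which is both the strength and the limitation of low-order Markov models for text generation.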

In part 3, which covers machine learning methods, you'll learn about more of the classic NLP tasks, such as:

  • Spam detection
  • Sentiment analysis
  • Latent semantic analysis / latent semantic indexing (LSA / LSI)
  • Topic modeling
  • Text summarization (revisited)


This section will be application-focused rather than theory-focused, meaning that instead of spending most of our effort on the details of various ML algorithms, you'll focus on how they can be applied to the above tasks.

Of course, you'll still need to learn something about those algorithms in order to understand what's going on. The following algorithms will be used:

  • Naive Bayes
  • Logistic Regression
  • Principal Components Analysis (PCA) / Singular Value Decomposition (SVD)
  • Latent Dirichlet Allocation (LDA)
  • Non-negative Matrix Factorization
  • TextRank (based on Google's PageRank)


These are not just "any" machine learning / artificial intelligence algorithms but rather ones that have been staples in NLP and are thus an essential part of any NLP course.
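To give a flavor of the first algorithm on that list, here is a minimal Naive Bayes spam classifier in plain Python, with add-one smoothing and log-probabilities. The training data is made up purely for illustration, and this sketch is not the course's code:

```python
import math
from collections import Counter

# Tiny hypothetical labeled corpus, just for illustration.
train = [
    ("win cash prize now", "spam"),
    ("cheap prize win win", "spam"),
    ("meeting agenda for monday", "ham"),
    ("lunch on monday with the team", "ham"),
]

# Count word frequencies per class, and how many documents
# belong to each class (for the prior).
word_counts = {"spam": Counter(), "ham": Counter()}
class_counts = Counter()
for text, label in train:
    word_counts[label].update(text.split())
    class_counts[label] += 1

vocab = set(w for c in word_counts.values() for w in c)

def predict(text):
    # Naive Bayes with add-one (Laplace) smoothing, computed in
    # log space to avoid floating-point underflow on long documents.
    scores = {}
    for label in word_counts:
        total = sum(word_counts[label].values())
        score = math.log(class_counts[label] / len(train))
        for w in text.split():
            count = word_counts[label][w] + 1
            score += math.log(count / (total + len(vocab)))
        scores[label] = score
    return max(scores, key=scores.get)
```

The "naive" part is the assumption that words are conditionally independent given the class, which is what lets the per-word log-probabilities simply be summed.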

In part 4, which covers deep learning methods, you'll learn about modern neural network architectures that can be applied to solve NLP tasks. Thanks to their great power and flexibility, neural networks can be used to solve any of the aforementioned tasks in the course.

You'll learn about:

  • Feedforward Artificial Neural Networks (ANNs)
  • Embeddings
  • Convolutional Neural Networks (CNNs)
  • Recurrent Neural Networks (RNNs)


The study of RNNs will involve modern architectures such as the LSTM and GRU which have been widely used by Google, Amazon, Apple, Facebook, etc. for difficult tasks such as language translation, speech recognition, and text-to-speech.

Obviously, as the latest Transformers (such as BERT and GPT-3) are examples of deep neural networks, this part of the course is an essential prerequisite for understanding Transformers.

VIP-only: In the VIP version of this course, you will get your first taste of the power of Transformers. In this section, we will use the Hugging Face library to apply pre-trained NLP Transformer models to tasks such as:

  • Sentiment analysis
  • Converting text into embedding vectors for document retrieval
  • Named entity recognition (NER)
  • Text generation and language modeling
  • Masked language modeling and article spinning
  • Text summarization
  • Neural language translation
  • Question answering
  • Zero-shot classification


You'll notice the first few tasks have been seen earlier in the course. This is intentional.

This section will "connect the dots" between what you learned previously, and the state-of-the-art today.

To end the section, we will go beyond just the familiar tasks to look at some very impressive feats of the modern NLP era, like zero-shot classification.

MORE BONUS CONTENT

This VIP section will contain even more content than what was included in the original VIP section (released elsewhere). In particular, you will get the following extra bonus notebooks:

  • Stock Movement Prediction Using News
  • LSA / LSI for Recommendations
  • LSA / LSI for Classification (Feature Engineering)
  • LSA / LSI for Text Summarization
  • LSA / LSI for Topic Modeling
  • Article spinner (masked language model) with LSTMs
  • Text generator (forward language model) with LSTMs
  • CNN for POS Tagging with custom loss for masking


The final notebooks, which show how to build an article spinner and seq2seq model with LSTMs, will help to "bridge the gap" between RNNs and Transformers. Specifically, masked language modeling is a training objective for some Transformers, while seq2seq introduces the "encoder-decoder" paradigm.

Thank you for reading and I hope to see you soon!

Lectures

  • 26 sections
  • 180 lectures
  • 25h 32m total length
Introduction and Outline
Preview
10:40
Are You Beginner, Intermediate, or Advanced? All are OK!
05:06
Where to get the code
02:06
How to Succeed in this Course
03:04
Temporary 403 Errors
02:58
Vector Models & Text Preprocessing Intro
03:40
Basic Definitions for NLP
05:01
What is a Vector?
10:41
Bag of Words
02:32
Count Vectorizer (Theory)
13:45
Tokenization
14:44
Stopwords
04:51
Stemming and Lemmatization
12:03
Stemming and Lemmatization Demo
13:26
Count Vectorizer (Code)
15:43
Vector Similarity
11:35
TF-IDF (Theory)
14:16
(Interactive) Recommender Exercise Prompt
02:36
TF-IDF (Code)
20:25
Word-to-Index Mapping
10:54
How to Build TF-IDF From Scratch
15:08
Neural Word Embeddings
10:15
Neural Word Embeddings Demo
11:25
Vector Models & Text Preprocessing Summary
03:50
Text Summarization Preview
01:21
How To Do NLP In Other Languages
10:41
Suggestion Box
03:10
Probabilistic Models (Introduction)
04:46
Markov Models Section Introduction
02:42
The Markov Property
07:34
The Markov Model
12:30
Probability Smoothing and Log-Probabilities
07:50
Building a Text Classifier (Theory)
07:29
Building a Text Classifier (Exercise Prompt)
06:33
Building a Text Classifier (Code pt 1)
10:32
Building a Text Classifier (Code pt 2)
12:06
Language Model (Theory)
10:15
Language Model (Exercise Prompt)
06:52
Language Model (Code pt 1)
10:45
Language Model (Code pt 2)
09:25
Markov Models Section Summary
03:00
Article Spinning - Problem Description
07:55
Article Spinning - N-Gram Approach
04:24
Article Spinner Exercise Prompt
05:45
Article Spinner in Python (pt 1)
17:31
Article Spinner in Python (pt 2)
10:00
Case Study: Article Spinning Gone Wrong
05:42
Section Introduction
04:50
Ciphers
03:59
Language Models (Review)
16:06
Genetic Algorithms
21:23
Code Preparation
04:46
Code pt 1
03:06
Code pt 2
07:20
Code pt 3
04:52
Code pt 4
04:03
Code pt 5
07:11
Code pt 6
05:25
Cipher Decryption - Additional Discussion
02:56
Real-World Application: Acoustic Keylogger
02:51
Section Conclusion
06:00
Machine Learning Models (Introduction)
05:50
Spam Detection - Problem Description
06:32
Naive Bayes Intuition
11:37
Spam Detection - Exercise Prompt
02:07
Aside: Class Imbalance, ROC, AUC, and F1 Score (pt 1)
12:25
Aside: Class Imbalance, ROC, AUC, and F1 Score (pt 2)
11:02
Spam Detection in Python
16:23
Sentiment Analysis - Problem Description
07:27
Logistic Regression Intuition (pt 1)
17:36
Multiclass Logistic Regression (pt 2)
06:52
Logistic Regression Training and Interpretation (pt 3)
08:15
Sentiment Analysis - Exercise Prompt
04:00
Sentiment Analysis in Python (pt 1)
10:38
Sentiment Analysis in Python (pt 2)
08:28
Text Summarization Section Introduction
05:34
Text Summarization Using Vectors
05:30
Text Summarization Exercise Prompt
01:50
Text Summarization in Python
12:40
TextRank Intuition
08:03
TextRank - How It Really Works (Advanced)
10:50
TextRank Exercise Prompt (Advanced)
01:23
TextRank in Python (Advanced)
14:33
Text Summarization in Python - The Easy Way (Beginner)
06:06
Text Summarization Section Summary
03:22
Topic Modeling Section Introduction
03:06
Latent Dirichlet Allocation (LDA) - Essentials
10:54
LDA - Code Preparation
03:41
LDA - Maybe Useful Picture (Optional)
01:52
Latent Dirichlet Allocation (LDA) - Intuition (Advanced)
14:54
Topic Modeling with Latent Dirichlet Allocation (LDA) in Python
11:38
Non-Negative Matrix Factorization (NMF) Intuition
10:21
Topic Modeling with Non-Negative Matrix Factorization (NMF) in Python
05:33
Topic Modeling Section Summary
01:37
LSA / LSI Section Introduction
04:06
SVD (Singular Value Decomposition) Intuition
12:11
LSA / LSI: Applying SVD to NLP
07:46
Latent Semantic Analysis / Latent Semantic Indexing in Python
09:15
LSA / LSI Exercises
06:00
Deep Learning Introduction (Intermediate-Advanced)
04:57
The Neuron - Section Introduction
02:20
Fitting a Line
14:23
Classification Code Preparation
07:20
Text Classification in Tensorflow
12:09
The Neuron
09:58
How does a model learn?
10:53
The Neuron - Section Summary
01:51
ANN - Section Introduction
06:59
Forward Propagation
09:40
The Geometrical Picture
09:43
Activation Functions
17:18
Multiclass Classification
08:41
ANN Code Preparation
04:35
Text Classification ANN in Tensorflow
05:43
Text Preprocessing Code Preparation
11:33
Text Preprocessing in Tensorflow
05:30
Embeddings
10:13
CBOW (Advanced)
04:07
CBOW Exercise Prompt
00:57
CBOW in Tensorflow (Advanced)
19:24
ANN - Section Summary
01:32
Aside: How to Choose Hyperparameters (Optional)
06:25
CNN - Section Introduction
04:34
What is Convolution?
16:38
What is Convolution? (Pattern Matching)
05:56
What is Convolution? (Weight Sharing)
06:41
Convolution on Color Images
15:58
CNN Architecture
20:58
CNNs for Text
08:07
Convolutional Neural Network for NLP in Tensorflow
05:31
CNN - Section Summary
01:27
RNN - Section Introduction
04:46
Simple RNN / Elman Unit (pt 1)
09:20
Simple RNN / Elman Unit (pt 2)
09:42
RNN Code Preparation
09:45
RNNs: Paying Attention to Shapes
08:26
GRU and LSTM (pt 1)
17:35
GRU and LSTM (pt 2)
11:36
RNN for Text Classification in Tensorflow
05:56
Parts-of-Speech (POS) Tagging in Tensorflow
19:50
Named Entity Recognition (NER) in Tensorflow
05:13
Exercise: Return to CNNs (Advanced)
03:19
RNN - Section Summary
01:58
Transformers Section Introduction
10:14
From RNNs to Attention and Transformers - Intuition
17:01
Sentiment Analysis
10:32
Sentiment Analysis in Python
17:00
Text Generation
10:47
Text Generation in Python
11:47
Masked Language Modeling (Article Spinner)
11:37
Masked Language Modeling (Article Spinner) in Python
08:26
Question Answering
07:20
Question Answering in Python
06:14
Zero-Shot Classification
05:30
Zero-Shot Classification in Python
13:47
Transformers Section Summary
04:53
LLM Section Intro
01:34
Using vs Building
05:57
Scaling Laws
04:11
Transformers
04:20
Foundation Models and Self-Supervised Pretraining
06:00
Alignment, Fine-Tuning, RLHF, DPO, GRPO
09:58
Impact and Usage
06:00
Multimodal and Vision-Language Models
03:09
From LLMs to AI Agents and Agentic AI
05:51
What to Learn Next
06:27
Where is BERT, ChatGPT, GPT-4, ...?
07:01
What is the Appendix?
03:47
Pre-Installation Check
04:13
Anaconda Environment Setup
20:21
How to install Numpy, Scipy, Matplotlib, Pandas, PyTorch, and TensorFlow
17:33
How to Code Yourself (part 1)
15:55
How to Code Yourself (part 2)
09:24
Proof that using Jupyter Notebook is the same as not using it
12:29
How to use Github & Extra Coding Tips (Optional)
11:12
How to Succeed in this Course (Long Version)
10:25
Is this for Beginners or Experts? Academic or Practical? Fast or slow-paced?
22:05
What order should I take your courses in? (part 1)
11:19
What order should I take your courses in? (part 2)
16:07
Where to get discount coupons and FREE AI tutorials
05:49
GloVe Word Embeddings Demo
Stock Movement Prediction Using News
LSA / LSI for Recommendations
LSA / LSI for Classification (Feature Engineering)
LSA / LSI for Topic Modeling
LSA / LSI for Text Summarization (Method 1)
LSA / LSI for Text Summarization (Method 2)
LSTM for Text Generation Notebook
Language Model Training Efficiency
Masked language model with LSTM Notebook
CNN POS Tagging Custom Loss

Reviews

4.7

5,940 reviews for this course

5 Stars
(70%)
4 Stars
(27%)
3 Stars
(2%)
2 Stars
(1%)
1 Star
(0%)

Testimonials and Success Stories


H. Z.

Machine Learning Research Scientist
United States

“I am one of your students. Yesterday, I presented my paper at ICCV 2019. You have a significant part in this, so I want to sincerely thank you for your in-depth guidance to the puzzle of deep learning. Please keep making awesome courses that teach us!”

5.0

Wade J.

Data Scientist
United States

“I just watched your short video on “Predicting Stock Prices with LSTMs: One Mistake Everyone Makes.” Giggled with delight.

You probably already know this, but some of us really and truly appreciate you. BTW, I spent a reasonable amount of time making a learning roadmap based on your courses and have started the journey.

Looking forward to your new stuff.”

5.0

Kris M.

Data Scientist
United States

“Thank you for doing this! I wish everyone who calls themselves a Data Scientist would take the time to do this, either as a refresher or to learn the material. I have had to work with so many people in prior roles that wanted to jump right into machine learning on my teams and didn’t even understand the first thing about the basics you have in here!!

I am signing up so that I have the easy refresher when needed and can see what you consider important, as well as to support your great work. Thank you.”

5.0

Steve M.

Machine Learning Research Scientist
United States

“I have been intending to send you an email expressing my gratitude for the work that you have done to create all of these data science courses in Machine Learning and Artificial Intelligence. I have been looking long and hard for courses that have mathematical rigor relative to the application of the ML & AI algorithms as opposed to just exhibiting some 'canned routine' and then voila, here is your neural network or logistic regression.

Your courses are just what I have been seeking. I am a retired mathematician, statistician and Supply Chain executive from a large Fortune 500 company in Ohio. I also taught mathematics, statistics and operations research courses at a couple of universities in Northern Ohio.

I have taken many courses and have enjoyed the journey, I am not going to be critical of any of the organizations from whom I have taken courses. However, when I read a review about one of your courses in which the student was complaining that one would need a PhD in Mathematics to understand it, I knew this was the course (or series of courses) that I wanted. (Having advanced degrees in mathematics, I knew that it was highly unlikely that a PhD would actually be required.)”

5.0

Saurabh W.

Data Scientist
India

“Hi Sir I am a student from India. I've been wanting to write a note to thank you for the courses that you've made because they have changed my career. I wanted to work in the field of data science but I was not having proper guidance but then I stumbled upon your "Logistic Regression" course in March and since then, there's been no looking back. I learned ANNs, CNNs, RNNs, Tensorflow, NLP and whatnot by going through your lectures. The knowledge that I gained enabled me to get a job as a Business Technology Analyst at one of my dream firms even in the midst of this pandemic. For that, I shall always be grateful to you. Please keep making more courses with the level of detail that you do in low-level libraries like Theano.”

5.0

David P.

Financial Analyst
United States

“I just wanted to reach out and thank you for your most excellent course that I am nearing finishing.

And, I couldn't agree more with some of your "rants", and found myself nodding vigorously!

You are an excellent teacher, and a rare breed.

And, your courses are frankly, more digestible and teach a student far more than some of the top-tier courses from ivy leagues I have taken in the past.

(I plan to go through many more courses, one by one!)

I know you must be deluged with complaints in spite of having the best content around. That's just human nature.

Also, satisfied people rarely take the time to write, so I thought I will write in for a change. :)”

5.0

P. C.

Deep Learning Research Scientist
China

“Hello, Lazy Programmer!

In the process of completing my Master’s at Hunan University, China, I am writing this feedback to you in order to express my deep gratitude for all the knowledge and skills I have obtained studying your courses and following your recommendations.

The first course of yours I took was on Convolutional Neural Networks (“Deep Learning p.5”, as far as I remember). Answering one of my questions on the Q&A board, you suggested I should start from the beginning – the Linear and Logistic Regression courses. Despite that I assumed I had already known many basic things at that time, I overcame my “pride” and decided to start my journey in Deep Learning from scratch.

Course by course, I was renewing the basics and the prerequisites. Thus, in several months, after every day studying under your guidance, I was able to gain enough intuitions and practical skills in order to begin progressing in my research. Having a solid background, it was just a pleasure to read all the relevant papers in the field as well as to make all the experiments needed for achieving my goal – creating a high-performance CNN for offline HCCR.

I believe, the professionalism of any teacher can be estimated by the feedback received from their students, and it’s of the utmost importance for me to thank you, Lazy Programmer!

I want you to know that, even though we have never actually met and you haven’t taught me privately, I consider you one of my greatest Teachers.

The most important things I have learned from you (some in the hard way, though) beside many exciting modern Deep Learning/AI techniques and algorithms are:

1) If one doesn’t know how to program something, one doesn’t understand it completely.

2) If one is not honest with oneself about one’s prior knowledge, one will never succeed in studying more advanced things.

3) Developing skills in BOTH Math and Programming is what makes one a good student of this major.

I am still studying your courses, and am certain I will ask you more than just a few technical questions regarding their content, but I already would like to say, that I will remember your contribution to my adventure in the Deep Learning field, and consider it as big as one of such great scientists’ as Andrew Ng, Geoffrey Hinton, and my supervisor.

Thank you, Lazy Programmer! 非常感谢您,Lazy 老师!

If you are interested, you can find my first paper’s preprint here:

https://arxiv.org/abs/xxx”

5.0

Dima K.

Data Scientist
Ukraine

“By the way, if you are interested to hear: I used the HMM classification, as it was in your course (95% of the script, I had little adjustments there), for the Customer-Care department in a big, well-known fintech company, to predict who will call them, so they can call him before the rush hours and improve the service. Instead of a poem, I had a sequence of the last 24 hours' events that the customer had, like: "Loaded money", "Usage in the food service", "Entering the app", "Trying to change the password", etc... the label was called or didn't call. The outcome was great. They use it for their VIP customers. Our data science department and I got a lot of praise.”

5.0

Andres Lopez C.

Data Engineer
United States

“This course is exactly what I was looking for. The instructor does an impressive job making students understand they need to work hard in order to learn. The examples are clear, and the explanations of the theory are very interesting.”

5.0

Mohammed K.

Machine Learning Engineer
Germany

“Thank you, I think you have opened my eyes. I was using APIs to implement Deep Learning algorithms and each time I felt I was missing out on some things. So thank you very much.”

5.0

Tom P.

Machine Learning Engineer
United States

“I have now taken a few classes from some well-known AI profs at Stanford (Andrew Ng, Christopher Manning, …) with an overall average mark in the mid-90s. Just so you know, you are as good as any of them. But I hope that you already know that.

I wish you a happy and safe holiday season. I am glad you chose to share your knowledge with the rest of us.”

5.0
Start learning today

Join the 30-day bootcamp for free

4.7/5 from 600k+ learners