This course teaches natural language processing and machine learning with Python. NLP stands for Natural Language Processing. In this video tutorial you will learn how to implement NLP, work with Python, classify text in Python, work with classes, perform principal component analysis, and work with random variables. You will also learn about linear and logistic regression, Bayes theorem, the bias-variance trade-off, data clustering, and more.

This training course is a product of Udemy.

Course topics:

  • Machine learning
  • Spam detection methods
  • Types of machine learning problems
  • Working with random variables
  • Introduction to Bayes theorem
  • Classifying data
  • Support vector machines
  • Clustering data
  • Clustering with K-means and DBSCAN
  • Principal component analysis
  • Artificial neural networks
  • Linear and logistic regression
  • The bias-variance trade-off
  • Natural language processing and Python
  • Installing Python
  • Natural language processing with NLTK
  • Working with NLTK features
  • Classification with KNN
  • Using TF-IDF
  • Working with regular expressions
  • Working with APIs
  • Working with decision trees
  • Ensemble learning
  • The Apriori algorithm
  • NumPy basics in Python
  • Working with Pandas
  • Matrix factorization
  • and more

Course title: Udemy From 0 to 1: Machine Learning, NLP & Python-Cut to the Chase
Duration: 20 hours and 30 minutes
Author: Loony Corn


Description:

Udemy From 0 to 1: Machine Learning, NLP & Python-Cut to the Chase

Loony Corn
All Levels
20.5 hours

A down-to-earth, shy but confident take on machine learning techniques that you can put to work today
Prerequisites: No prerequisites, knowledge of some undergraduate level mathematics would help but is not mandatory. Working knowledge of Python would be helpful if you want to run the source code that is provided.
Taught by a Stanford-educated, ex-Googler and an IIT, IIM - educated ex-Flipkart lead analyst. This team has decades of practical experience in quant trading, analytics and e-commerce.
This course is a down-to-earth, shy but confident take on machine learning techniques that you can put to work today
Let's parse that.
The course is down-to-earth: it makes everything as simple as possible, but not simpler.
The course is shy but confident: it is authoritative, drawn from decades of practical experience, but shies away from needlessly complicating stuff.
You can put ML to work today: if Machine Learning is a car, this course will have you driving today. It won't tell you what the carburetor is.
The course is very visual : most of the techniques are explained with the help of animations to help you understand better.
This course is practical as well: there are hundreds of lines of commented source code that can be used directly to implement natural language processing and machine learning for text summarization and text classification in Python.
The course is also quirky. The examples are irreverent. Lots of little touches: repetition, zooming out so we remember the big picture, active learning with plenty of quizzes. There's also a peppy soundtrack, and art - all shown by studies to improve cognition and recall.
What's Covered:
Machine Learning:
Supervised/Unsupervised learning, Classification, Clustering, Association Detection, Anomaly Detection, Dimensionality Reduction, Regression.
Naive Bayes, K-nearest neighbours, Support Vector Machines, Artificial Neural Networks, K-means, Hierarchical clustering, Principal Components Analysis, Linear regression, Logistic regression, Random variables, Bayes theorem, Bias-variance tradeoff
Natural Language Processing with Python:
Corpora, stopwords, sentence and word parsing, auto-summarization, sentiment analysis (as a special case of classification), TF-IDF, Document Distance, Text summarization, Text classification with Naive Bayes and K-Nearest Neighbours and Clustering with K-Means
Sentiment Analysis:
Why it's useful, approaches to solving it (rule-based and ML-based), training, feature extraction, sentiment lexicons, regular expressions, the Twitter API, and sentiment analysis of tweets with Python
A Note on Python: The code-alongs in this class all use Python 2.7. Source code (with copious amounts of comments) is attached as a resource with all the code-alongs. The source code has been provided for both Python 2 and Python 3 wherever possible.
Mail us about anything - anything! - and we will always reply :-)
What are the requirements?
No prerequisites, knowledge of some undergraduate level mathematics would help but is not mandatory. Working knowledge of Python would be helpful if you want to run the source code that is provided.
What am I going to get from this course?
Over 87 lectures and 20.5 hours of content!
Identify situations that call for the use of Machine Learning
Understand which type of Machine learning problem you are solving and choose the appropriate solution
Use Machine Learning and Natural Language processing to solve problems like text classification, text summarization in Python
What is the target audience?
Yep! Analytics professionals, modelers, big data professionals who haven't had exposure to machine learning
Yep! Engineers who want to understand or learn machine learning and apply it to problems they are solving
Yep! Product managers who want to have intelligent conversations with data scientists and engineers about machine learning
Yep! Tech executives and investors who are interested in big data, machine learning or natural language processing
Yep! MBA graduates or business professionals who are looking to move to a heavily quantitative role

Section 1: Introduction
Lecture 1
What this course is about
03:17
Section 2: Jump right in : Machine learning for Spam detection
Lecture 2
Machine Learning: Why should you jump on the bandwagon?
16:31
Lecture 3
Plunging In - Machine Learning Approaches to Spam Detection
17:01
Lecture 4
Spam Detection with Machine Learning Continued
17:04
Lecture 5
Get the Lay of the Land : Types of Machine Learning Problems
17:26
Section 3: Naive Bayes Classifier
Lecture 6
Random Variables
20:10
Many popular machine learning techniques are probabilistic in nature, and some working knowledge of probability helps. We'll cover random variables, probability distributions and the normal distribution.
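To make the lecture's topics concrete, here is a minimal sketch (not from the course materials, which target Python 2.7; this assumes Python 3 and NumPy) of sampling a random variable from a normal distribution and checking its familiar properties empirically:

```python
import numpy as np

# Draw samples from a normal distribution with mean 0 and standard deviation 1
rng = np.random.default_rng(seed=42)
samples = rng.normal(loc=0.0, scale=1.0, size=100_000)

# The sample mean and standard deviation should land close to the
# parameters of the underlying distribution.
print(samples.mean())  # close to 0.0
print(samples.std())   # close to 1.0

# Roughly 68% of draws from a standard normal fall within one
# standard deviation of the mean.
within_one_sd = np.mean(np.abs(samples) < 1.0)
print(within_one_sd)   # close to 0.68
```

The gap between the sample statistics and the true parameters shrinks as the sample size grows, which is the intuition behind estimating distributions from data.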
Lecture 7
Bayes Theorem
18:36
We have been learning some fundamentals that will help us with probabilistic concepts in Machine Learning. In this class, we will learn about conditional probability and Bayes theorem, which is the foundation of many ML techniques.
Lecture 8
Naive Bayes Classifier
08:49
Naive Bayes Classifier is a probabilistic classifier. We have built the foundation to understand what goes on under the hood - let's understand how the Naive Bayes classifier uses Bayes theorem.
Lecture 9
Naive Bayes Classifier : An example
14:03
Section 4: K-Nearest Neighbors
Lecture 10
K-Nearest Neighbors
13:09
Lecture 11
K-Nearest Neighbors : A few wrinkles
14:47
Section 5: Support Vector Machines
Lecture 12
Support Vector Machines Introduced
08:16
Lecture 13
Support Vector Machines : Maximum Margin Hyperplane and Kernel Trick
16:23
Section 6: Clustering as a form of Unsupervised learning
Lecture 14
Clustering : Introduction
19:07
Lecture 15
Clustering : K-Means and DBSCAN
13:42
Section 7: Association Detection
Lecture 16
Association Rules Learning
09:12
Section 8: Dimensionality Reduction
Lecture 17
Dimensionality Reduction
10:22
Lecture 18
Principal Component Analysis
18:53
PCA is one of the most famous Dimensionality Reduction techniques. When you have data with a lot of variables and confusing interactions, PCA clears the air and finds the underlying causes.
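A small sketch of that idea (not the course's code; it assumes scikit-learn and synthetic data): three observed variables that are all noisy copies of one underlying cause, which PCA recovers as a single dominant component.

```python
import numpy as np
from sklearn.decomposition import PCA

# Synthetic data with heavy redundancy: 3 observed variables that are
# all noisy linear functions of a single hidden cause.
rng = np.random.default_rng(seed=0)
cause = rng.normal(size=(200, 1))
data = np.hstack([cause, 2 * cause, -cause]) + 0.01 * rng.normal(size=(200, 3))

pca = PCA(n_components=3)
pca.fit(data)

# Nearly all the variance is captured by the first principal component,
# which corresponds to the single underlying cause.
print(pca.explained_variance_ratio_)
```

In practice you keep only the leading components and drop the rest, reducing many correlated variables to a few meaningful ones.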
Section 9: Artificial Neural Networks
Lecture 19
Artificial Neural Networks: Perceptrons Introduced
11:18
Section 10: Regression as a form of supervised learning
Lecture 20
Regression Introduced : Linear and Logistic Regression
13:54
Lecture 21
Bias Variance Trade-off
10:13
Section 11: Natural Language Processing and Python
Lecture 22
Installing Python - Anaconda and Pip
09:00
Lecture 23
Natural Language Processing with NLTK
07:26
Lecture 24
Natural Language Processing with NLTK - See it in action
14:14
Lecture 25
Web Scraping with BeautifulSoup
18:09
Lecture 26
A Serious NLP Application : Text Auto Summarization using Python
11:34
Lecture 27
Python Drill : Autosummarize News Articles I
18:33
Lecture 28
Python Drill : Autosummarize News Articles II
11:28
Lecture 29
Python Drill : Autosummarize News Articles III
10:23
Lecture 30
Put it to work : News Article Classification using K-Nearest Neighbors
19:29
Lecture 31
Put it to work : News Article Classification using Naive Bayes Classifier
19:24
Lecture 32
Python Drill : Scraping News Websites
15:45
Lecture 33
Python Drill : Feature Extraction with NLTK
18:51
Lecture 34
Python Drill : Classification with KNN
04:15
Lecture 35
Python Drill : Classification with Naive Bayes
08:08
Lecture 36
Document Distance using TF-IDF
11:03
Lecture 37
Put it to work : News Article Clustering with K-Means and TF-IDF
14:32
Lecture 38
Python Drill : Clustering with K Means
08:32
Section 12: Sentiment Analysis
Lecture 39
A Sneak Peek at what's coming up
02:36
Lecture 40
Sentiment Analysis - What's all the fuss about?
17:17
Lecture 41
ML Solutions for Sentiment Analysis - the devil is in the details
19:57
Lecture 42
Sentiment Lexicons ( with an introduction to WordNet and SentiWordNet)
18:49
Lecture 43
Regular Expressions
17:53
Regular expressions are a handy tool to have when you deal with text processing. They are a bit arcane, but pretty useful in the right situation. Understanding the operators from the basics helps you build up to constructing complex regexps.
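A small illustrative sketch with Python's `re` module (the pattern is a simplified one for this example, not a fully general tokenizer), in the spirit of the tweet-preprocessing drills later in this section:

```python
import re

# Simple (illustrative, not fully general) patterns for Twitter-style
# @mentions and #hashtags, built from basic operators:
# \w+ matches one or more word characters; (...) captures a group.
mention = re.compile(r"@\w+")
hashtag = re.compile(r"#(\w+)")

tweet = "Loving this course! Thanks @LoonyCorn #MachineLearning #NLP"

print(mention.findall(tweet))  # ['@LoonyCorn']
print(hashtag.findall(tweet))  # ['MachineLearning', 'NLP']

# Substitution: strip mentions before feeding text to a classifier
cleaned = re.sub(r"@\w+", "", tweet)
print(cleaned)
```

Because `hashtag` has a capturing group, `findall` returns only the captured word, not the leading `#` - a small operator-level detail that matters when building up bigger patterns.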
Lecture 44
Regular Expressions in Python
05:41
Lecture 45
Put it to work : Twitter Sentiment Analysis
17:48
Lecture 46
Twitter Sentiment Analysis - Work the API
20:00
Lecture 47
Twitter Sentiment Analysis - Regular Expressions for Preprocessing
12:24
Lecture 48
Twitter Sentiment Analysis - Naive Bayes, SVM and Sentiwordnet
19:40
Section 13: Decision Trees
Lecture 49
Planting the seed - What are Decision Trees?
17:00
Lecture 50
Growing the Tree - Decision Tree Learning
18:03
Lecture 51
Branching out - Information Gain
18:51
Lecture 52
Decision Tree Algorithms
07:50
Lecture 53
Titanic : Decision Trees predict Survival (Kaggle) - I
19:21
Lecture 54
Titanic : Decision Trees predict Survival (Kaggle) - II
14:16
Lecture 55
Titanic : Decision Trees predict Survival (Kaggle) - III
13:00
Section 14: A Few Useful Things to Know About Overfitting
Lecture 56
Overfitting - the bane of Machine Learning
19:03
Lecture 57
Overfitting Continued
11:19
Lecture 58
Cross Validation
18:55
Cross Validation is a popular way to choose between models. There are a few different variants; K-Fold cross validation is the most well-known.
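To show the mechanics, here is a minimal sketch of K-Fold cross validation using scikit-learn and its built-in iris dataset (an illustration, not the course's own code-along):

```python
from sklearn.datasets import load_iris
from sklearn.model_selection import KFold, cross_val_score
from sklearn.neighbors import KNeighborsClassifier

X, y = load_iris(return_X_y=True)

# K-Fold cross validation: split the data into 5 folds, train on 4 and
# test on the held-out fold, then rotate so every point is tested once.
cv = KFold(n_splits=5, shuffle=True, random_state=42)
scores = cross_val_score(KNeighborsClassifier(n_neighbors=5), X, y, cv=cv)

# The mean score across folds estimates out-of-sample accuracy and can
# be compared across candidate models (or hyperparameter settings).
print(scores)
print(scores.mean())
```

Comparing mean cross-validation scores, rather than training accuracy, is what guards the model-selection step against the overfitting discussed in the previous lectures.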
Lecture 59
Simplicity is a virtue - Regularization
07:18
Lecture 60
The Wisdom of Crowds - Ensemble Learning
16:39
Lecture 61
Ensemble Learning continued - Bagging, Boosting and Stacking
18:02
Section 15: Random Forests
Lecture 62
Random Forests - Much more than trees
12:28
Lecture 63
Back on the Titanic - Cross Validation and Random Forests
20:03
Section 16: Recommendation Systems
Lecture 64
What do Amazon and Netflix have in common?
16:43
Lecture 65
Recommendation Engines - A look inside
10:45
Lecture 66
What are you made of? - Content-Based Filtering
13:35
Lecture 67
With a little help from friends - Collaborative Filtering
10:26
Lecture 68
A Neighbourhood Model for Collaborative Filtering
17:51
Lecture 69
Top Picks for You! - Recommendations with Neighbourhood Models
09:41
Lecture 70
Discover the Underlying Truth - Latent Factor Collaborative Filtering
20:13
Lecture 71
Latent Factor Collaborative Filtering contd.
12:09
Lecture 72
Gray Sheep and Shillings - Challenges with Collaborative Filtering
08:12
Lecture 73
The Apriori Algorithm for Association Rules
18:31
Section 17: Recommendation Systems in Python
Lecture 74
Back to Basics : Numpy in Python
18:05
Lecture 75
Back to Basics : Numpy and Scipy in Python
14:19
Lecture 76
Movielens and Pandas
16:45
Lecture 77
Code Along - What's my favorite movie? - Data Analysis with Pandas
06:18
Lecture 78
Code Along - Movie Recommendation with Nearest Neighbour CF
18:10
Lecture 79
Code Along - Top Movie Picks (Nearest Neighbour CF)
06:16
Lecture 80
Code Along - Movie Recommendations with Matrix Factorization
17:55
Lecture 81
Code Along - Association Rules with the Apriori Algorithm
09:50
Section 18: A Taste of Deep Learning and Computer Vision
Lecture 82
Computer Vision - An Introduction
18:08