
Supervised Machine Learning in Python

  • Development
  • Apr 29, 2025

Supervised Machine Learning in Python, available at $64.99, has an average rating of 4.15, with 79 lectures, based on 25 reviews, and has 218 subscribers.

You will learn about regression and classification models, linear models, decision trees, Naive Bayes, k-nearest neighbors, Support Vector Machines, neural networks, Random Forest, Gradient Boosting, XGBoost, voting, stacking, performance metrics (RMSE, MAPE, accuracy, precision, ROC curve), feature importance, SHAP, Recursive Feature Elimination, hyperparameter tuning, and cross-validation. This course is ideal for Python developers, data scientists, computer engineers, researchers, and students.

Enroll now: Supervised Machine Learning in Python

Summary

Title: Supervised Machine Learning in Python

Price: $64.99

Average Rating: 4.15

Number of Lectures: 79

Number of Published Lectures: 79

Number of Curriculum Items: 79

Number of Published Curriculum Objects: 79

Original Price: $29.99

Quality Status: approved

Status: Live

What You Will Learn

  • Regression and classification models
  • Linear models
  • Decision trees
  • Naive Bayes
  • k-nearest neighbors
  • Support Vector Machines
  • Neural networks
  • Random Forest
  • Gradient Boosting
  • XGBoost
  • Voting
  • Stacking
  • Performance metrics (RMSE, MAPE, Accuracy, Precision, ROC Curve)
  • Feature importance
  • SHAP
  • Recursive Feature Elimination
  • Hyperparameter tuning
  • Cross-validation

Who Should Attend

  • Python developers
  • Data Scientists
  • Computer engineers
  • Researchers
  • Students

Target Audiences

  • Python developers
  • Data Scientists
  • Computer engineers
  • Researchers
  • Students
    In this practical course, we are going to focus on supervised machine learning and how to apply it in the Python programming language.

    Supervised machine learning is a branch of artificial intelligence whose goal is to create predictive models starting from a dataset. With the proper optimization of the models, it is possible to create mathematical representations of our data in order to extract the information that is hidden inside our database and use it for making inferences and predictions.
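As a minimal sketch of this idea, the snippet below (using scikit-learn, the library this course is built on, with a synthetic dataset) fits a model on labeled examples and then makes predictions on data it has never seen:

```python
# Minimal supervised learning sketch: learn from labeled data,
# then predict labels for unseen examples.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score

# Synthetic labeled dataset (200 examples, 5 features).
X, y = make_classification(n_samples=200, n_features=5, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

model = LogisticRegression()
model.fit(X_train, y_train)        # learn a mapping from features to labels
y_pred = model.predict(X_test)     # infer labels for data the model never saw
print(accuracy_score(y_test, y_pred))
```

The same fit/predict pattern applies to every model covered in the course, since scikit-learn exposes a uniform API.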

    A very powerful use of supervised machine learning is the calculation of feature importance, which helps us better understand the information behind the data and allows us to reduce the dimensionality of our problem by keeping only the relevant variables and discarding the useless ones. A common approach for calculating feature importance is the SHAP technique.
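SHAP itself lives in the separate `shap` package; as a simpler illustration of the same idea, tree-based models in scikit-learn expose built-in importance scores that rank the input variables. A sketch on synthetic data where only some features carry signal:

```python
# Feature importance sketch: a tree ensemble scores each input
# variable by how much it contributes to the predictions.
import numpy as np
from sklearn.datasets import make_regression
from sklearn.ensemble import RandomForestRegressor

# Synthetic data: 6 features, but only 3 are informative.
X, y = make_regression(n_samples=300, n_features=6, n_informative=3,
                       random_state=0)

forest = RandomForestRegressor(n_estimators=100, random_state=0).fit(X, y)
importances = forest.feature_importances_  # one score per feature, summing to 1
print(np.argsort(importances)[::-1])       # features ranked most to least important
```

Dropping the lowest-ranked features is exactly the dimensionality reduction described above, and is what Recursive Feature Elimination automates.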

    Finally, the proper optimization of a model is possible using hyperparameter tuning techniques that make use of cross-validation.
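A minimal sketch of this combination: grid search evaluates each candidate hyperparameter value with k-fold cross-validation and keeps the best one (here, the regularization strength of a Ridge regression on synthetic data):

```python
# Hyperparameter tuning sketch: grid search + 5-fold cross-validation
# to pick the regularization strength of a Ridge regression.
from sklearn.datasets import make_regression
from sklearn.linear_model import Ridge
from sklearn.model_selection import GridSearchCV

X, y = make_regression(n_samples=200, n_features=10, noise=5.0, random_state=1)

search = GridSearchCV(
    Ridge(),
    param_grid={"alpha": [0.01, 0.1, 1.0, 10.0]},  # candidate hyperparameters
    cv=5,                                          # 5-fold cross-validation
)
search.fit(X, y)
print(search.best_params_)  # the alpha that scored best across the folds
```

Random search follows the same pattern (`RandomizedSearchCV`), sampling the grid instead of exhausting it.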

    With this course, you are going to learn:

    1. What supervised machine learning is

    2. What overfitting and underfitting are and how to avoid them

    3. The difference between regression and classification models

    4. Linear models

      1. Linear regression

      2. Lasso regression

      3. Ridge regression

      4. Elastic Net regression

      5. Logistic regression

    5. Decision trees

    6. Naive Bayes

    7. K-nearest neighbors

    8. Support Vector Machines

      1. Linear SVM

      2. Non-linear SVM

    9. Feedforward neural networks

    10. Ensemble models

      1. Bias-variance tradeoff

      2. Bagging and Random Forest

      3. Boosting and Gradient Boosting

      4. Voting

      5. Stacking

    11. Performance metrics

      1. Regression

        1. Root Mean Squared Error

        2. Mean Absolute Error

        3. Mean Absolute Percentage Error

      2. Classification

        1. Confusion matrix

        2. Accuracy and balanced accuracy

        3. Precision

        4. Recall

        5. ROC Curve and the area under it

        6. Multi-class metrics

    12. Feature importance

      1. How to calculate feature importance according to a model

      2. SHAP technique for calculating feature importance according to every model

      3. Recursive Feature Elimination for dimensionality reduction

    13. Hyperparameter tuning

      1. k-fold cross-validation

      2. Grid search

      3. Random search

    All the lessons of this course start with a brief introduction and end with a practical example in the Python programming language and its powerful scikit-learn library. The environment that will be used is Jupyter, which is a standard in the data science industry. All the Jupyter notebooks are downloadable.
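As a taste of the regression metrics listed above, the sketch below computes RMSE, MAE, and MAPE with scikit-learn on a tiny set of hypothetical predictions:

```python
# Regression performance metrics sketch: RMSE, MAE, and MAPE
# computed on a small set of example predictions.
import numpy as np
from sklearn.metrics import (mean_absolute_error,
                             mean_absolute_percentage_error,
                             mean_squared_error)

y_true = np.array([100.0, 200.0, 300.0])  # observed values
y_pred = np.array([110.0, 190.0, 330.0])  # model predictions

rmse = np.sqrt(mean_squared_error(y_true, y_pred))  # penalizes large errors
mae = mean_absolute_error(y_true, y_pred)           # average absolute error
mape = mean_absolute_percentage_error(y_true, y_pred)  # error relative to scale
print(rmse, mae, mape)
```

Each metric answers a different question: RMSE weights large misses more heavily, MAE treats all errors equally, and MAPE expresses the error as a fraction of the true value.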

    Course Curriculum

    Chapter 1: Introduction to supervised machine learning

    Lecture 1: Introduction to the course

    Lecture 2: What is supervised machine learning?

    Lecture 3: Regression and classification models

    Lecture 4: Overfitting and underfitting

    Chapter 2: The tools used in this course

    Lecture 1: Required Python packages

    Lecture 2: Jupyter notebook

    Lecture 3: Sklearn API

    Chapter 3: Linear models

    Lecture 1: Introduction to Linear Regression

    Lecture 2: Linear regression in Python

    Lecture 3: Introduction to Ridge Regression

    Lecture 4: Ridge regression in Python

    Lecture 5: Introduction to Lasso Regression

    Lecture 6: Lasso regression in Python

    Lecture 7: Introduction to Elastic Net Regression

    Lecture 8: Elastic Net Regression in Python

    Lecture 9: Introduction to Logistic Regression for classification

    Lecture 10: Logistic regression in Python

    Chapter 4: Decision trees

    Lecture 1: Introduction to decision trees

    Lecture 2: Decision trees in Python

    Chapter 5: K-nearest neighbors

    Lecture 1: Introduction to KNN

    Lecture 2: KNN in Python

    Chapter 6: Naive Bayes

    Lecture 1: Introduction to Naive Bayes

    Lecture 2: Categorical Naive Bayes in Python

    Lecture 3: Bernoulli Naive Bayes in Python

    Lecture 4: Gaussian Naive Bayes in Python

    Chapter 7: Support Vector Machines

    Lecture 1: Introduction to SVM

    Lecture 2: Linear SVM in Python

    Lecture 3: Non-linear SVM in Python

    Chapter 8: Neural Networks

    Lecture 1: Introduction to Neural Networks

    Lecture 2: Neural Networks in Python

    Chapter 9: Introduction to ensemble models

    Lecture 1: Ensemble models and bias-variance tradeoff

    Chapter 10: Ensemble models: bagging

    Lecture 1: Introduction to bagging

    Lecture 2: Bagging in Python

    Lecture 3: Introduction to Random Forest

    Lecture 4: Random Forest in Python

    Lecture 5: Introduction to Extremely Randomized Trees

    Lecture 6: Extremely Randomized Trees in Python

    Chapter 11: Ensemble models: boosting

    Lecture 1: Introduction to boosting

    Lecture 2: Boosting in Python

    Lecture 3: Introduction to Gradient Boosting

    Lecture 4: Gradient Boosting in Python

    Lecture 5: XGBoost in Python

    Chapter 12: Ensemble models: voting

    Lecture 1: Introduction to voting

    Lecture 2: Voting in Python

    Chapter 13: Ensemble models: stacking

    Lecture 1: Introduction to stacking

    Lecture 2: Stacking in Python

    Chapter 14: Performance evaluation

    Lecture 1: Regression performance metrics

    Lecture 2: Regression performance metrics in Python

    Lecture 3: Pairplot in Python

    Lecture 4: Binary classification performance metrics

    Lecture 5: Binary classification performance metrics in Python

    Lecture 6: Introduction to ROC curve

    Lecture 7: ROC curve in Python

    Lecture 8: Multi-class classification performance metrics

    Lecture 9: Multi-class classification performance metrics in Python

    Lecture 10: When to use classification performance metrics

    Chapter 15: Cross-Validation and hyperparameter tuning

    Lecture 1: Introduction to k-fold cross-validation

    Lecture 2: k-fold cross-validation in Python

    Lecture 3: The need for hyperparameter tuning

    Lecture 4: Introduction to grid search

    Lecture 5: Grid search in Python

    Lecture 6: Introduction to Random Search

    Lecture 7: Random Search in Python

    Chapter 16: Feature importance and model interpretation

    Lecture 1: What is feature importance?

    Lecture 2: Models that calculate feature importance in Python

    Lecture 3: Introduction to SHAP

    Lecture 4: Using SHAP with tree-based models in Python

    Lecture 5: Using SHAP with every model in Python

    Chapter 17: Recursive Feature Elimination

    Lecture 1: Introduction to RFE

    Lecture 2: RFE in Python

    Chapter 18: Practical examples in Python

    Lecture 1: A complete pipeline: model selection and hyperparameter tuning

    Lecture 2: Feature selection with Lasso

    Lecture 3: Dimensionality reduction with RFE

    Lecture 4: How to choose the right scaler

    Chapter 19: Persisting our model

    Lecture 1: Pickle library

    Chapter 20: Practical approaches

    Lecture 1: The curse of dimensionality

    Lecture 2: The importance of pre-processing

    Lecture 3: The importance of the right features against the model

    Lecture 4: Interpretability of a model

    Instructors

  • Gianluca Malato
    Your Data Teacher

Rating Distribution

  • 1 stars: 1 votes
  • 2 stars: 0 votes
  • 3 stars: 3 votes
  • 4 stars: 8 votes
  • 5 stars: 13 votes

Frequently Asked Questions

    How long do I have access to the course materials?

    You can view and review the lecture materials indefinitely, like an on-demand channel.

    Can I take my courses with me wherever I go?

    Definitely! If you have an internet connection, courses on Udemy are available on any device at any time. If you don’t have an internet connection, some instructors also let their students download course lectures. That’s up to the instructor though, so make sure you get on their good side!