
College Level Neural Nets [I] – Basic Nets: Math & Practice!

  • Development
  • May 12, 2025

College Level Neural Nets [I] – Basic Nets: Math & Practice!, available at $64.99, has an average rating of 4.65 based on 50 reviews, with 83 lectures and 831 subscribers.

You will learn: a step-by-step conceptual introduction to neural networks and deep learning (even if you are a beginner); the basic perceptron (neuron) understood conceptually, graphically, and mathematically, including a proof of the perceptron convergence theorem; mathematical derivations for deep learning modules; a step-by-step derivation of the backpropagation algorithm and its vectorization; performance metrics such as precision, recall, F1 score, and ROC & AUC; a mathematical derivation of the cross-entropy cost function; a mathematical derivation of backpropagation through batch normalization; and solved examples on the various topics. This course is ideal for deep learning engineers and college students who want to gain a deep mathematical understanding of the topic.

Enroll now: College Level Neural Nets [I] – Basic Nets: Math & Practice!

Summary

Title: College Level Neural Nets [I] – Basic Nets: Math & Practice!

Price: $64.99

Average Rating: 4.65

Number of Lectures: 83

Number of Published Lectures: 83

Number of Curriculum Items: 83

Number of Published Curriculum Objects: 83

Original Price: $199.99

Quality Status: approved

Status: Live

What You Will Learn

  • Step By Step Conceptual Introduction For Neural Networks And Deep Learning [Even If You Are A Beginner]
  • Understanding The Basic Perceptron[Neuron] Conceptually, Graphically, And Mathematically – Perceptron Convergence Theorem Proof
  • Mathematical Derivations For Deep Learning Modules
  • Step-By-Step Derivation Of BackPropagation Algorithm
  • Vectorization Of BackPropagation
  • Different Performance Metrics Like Precision – Recall – F1 Score – ROC & AUC
  • Mathematical Derivation Of Cross-Entropy Cost Function
  • Mathematical Derivation Of Back-Propagation Through Batch-Normalization
  • Different Solved Examples On Various Topics
Who Should Attend

  • Deep Learning Engineers Or College Students Who Want To Gain Deep Mathematical Understanding Of The Topic
Target Audiences

  • Deep Learning Engineers Or College Students Who Want To Gain Deep Mathematical Understanding Of The Topic
    Deep learning is surely one of the hottest topics nowadays, with a tremendous number of practical applications in many fields. Those applications include, without being limited to, image classification, object detection, action recognition in videos, motion synthesis, machine translation, self-driving cars, speech recognition, speech and video generation, natural language processing and understanding, robotics, and many more.

    Now you might be wondering:

    There is a very large number of courses that explain deep learning well, so why should I prefer this specific course over them?

    The answer is: you shouldn’t! Most other courses focus heavily on “programming” deep learning applications as fast as possible, without giving detailed explanations of the underlying mathematical foundations that the field of deep learning was built upon. That is exactly the gap this course is designed to cover. It is designed to be used hand in hand with programming courses, not to replace them.

    Since this series is heavily mathematical, I will refer many times during my explanations to sections from my own college-level linear algebra course. In general, being quite familiar with linear algebra is a real prerequisite for this course.

    Please have a look at the course syllabus, and remember: this is only part (I) of the deep learning series!

    Course Curriculum

    Chapter 1: Introduction To Machine Learning

    Lecture 1: Promo Video

    Lecture 2: Introduction To Machine Learning

    Chapter 2: The Linear Perceptron

    Lecture 1: Introduction To The Classification Problem

    Lecture 2: A Simple Glimpse Of Overfitting

    Lecture 3: The Perceptron Equation

    Lecture 4: Visualization Of The Perceptron Equation

    Lecture 5: Proof : Weight Vector Is Perpendicular To The Decision Boundary

    Lecture 6: More Visualization For The Perceptron Weights – I

    Lecture 7: More Visualization Of The Perceptron Weights – II

    Lecture 8: Activation Functions

    Lecture 9: Graphical Representation Of A Neural Network

    Lecture 10: Types Of Machine Learning

    Lecture 11: Solved Example (I) : Single Layer Perceptron Designed Graphically

    Chapter 3: Non-Linearly Separable Data And The Multi Layer Perceptron (MLP)

    Lecture 1: Introduction To Multi-Layer Perceptrons

    Lecture 2: Solved Example (II) : MLP Design Graphically

    Lecture 3: Intuition Of Multi-Layer Perceptrons – Part 1

    Lecture 4: Intuition Of Multi-Layer Perceptrons – Part 2

    Lecture 5: The XOR Problem – Part 1

    Lecture 6: The XOR Problem – Part 2

    Lecture 7: MultiClass Classification And The Sigmoid Activation

    Lecture 8: Vectorized Notation And The Weight Matrix

    Chapter 4: Perceptron Learning !

    Lecture 1: The Perceptron Learning Rule – Part 1

    Lecture 2: The Perceptron Learning Rule – Part 2

    Lecture 3: Proof : Perceptron Convergence Theorem – Part 1

    Lecture 4: Proof : Perceptron Convergence Theorem – Part 2

    Lecture 5: Proof : Perceptron Convergence Theorem – Part 3

    Lecture 6: Three Main Problems Of The Threshold Perceptron
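
    To give a flavor of what Chapter 4 covers, here is a minimal sketch of the classic perceptron learning rule. The toy data and function names are illustrative, not taken from the course.

    ```python
    # Perceptron learning rule: on a misclassified (x, y) with y in {-1, +1},
    # update w <- w + y*x and b <- b + y. Converges if the data is separable.

    def train_perceptron(samples, labels, epochs=20):
        n = len(samples[0])
        w = [0.0] * n
        b = 0.0
        for _ in range(epochs):
            errors = 0
            for x, y in zip(samples, labels):
                activation = sum(wi * xi for wi, xi in zip(w, x)) + b
                pred = 1 if activation >= 0 else -1
                if pred != y:  # update only on mistakes
                    w = [wi + y * xi for wi, xi in zip(w, x)]
                    b += y
                    errors += 1
            if errors == 0:  # no mistakes in a full pass: converged
                break
        return w, b

    # Linearly separable toy data: label is the sign of x0 - x1.
    X = [(2.0, 1.0), (3.0, 0.5), (1.0, 2.0), (0.5, 3.0)]
    y = [1, 1, -1, -1]
    w, b = train_perceptron(X, y)
    ```

    On separable data like this, the convergence theorem proved in the chapter guarantees the loop terminates after finitely many updates.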

    Chapter 5: The Gradient Descent Algorithm

    Lecture 1: The Error Function

    Lecture 2: The Sigmoid Activation Function Again

    Lecture 3: Deriving The Gradient Descent Algorithm

    Lecture 4: Notes About Gradient Descent

    Lecture 5: More Notes And Filling Up

    Lecture 6: Solved Example (III) : Gradient Descent Convergence

    Lecture 7: Solved Example (IV) : MLP With Linear Activations
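
    The update rule at the heart of Chapter 5 can be sketched in a few lines. The toy loss f(w) = (w - 3)^2 and the learning rate are illustrative choices, not the course's examples.

    ```python
    # Plain gradient descent: repeat w <- w - lr * grad(w) until convergence.

    def gradient_descent(grad, w0, lr=0.1, steps=100):
        w = w0
        for _ in range(steps):
            w -= lr * grad(w)
        return w

    # Minimize f(w) = (w - 3)^2, whose gradient is 2 * (w - 3);
    # the minimizer is w = 3.
    w_star = gradient_descent(lambda w: 2.0 * (w - 3.0), w0=0.0)
    ```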

    Chapter 6: The Back-Propagation Algorithm !

    Lecture 1: Derivation Of Back Propagation – Part 1

    Lecture 2: Derivation Of Back Propagation – Part 2

    Lecture 3: Derivation Of Back Propagation – Part 3

    Lecture 4: Vectorization Of BackPropagation – Part 1

    Lecture 5: Vectorization Of BackPropagation – Part 2

    Lecture 6: Vectorization Of BackPropagation – Part 3

    Lecture 7: Vectorization Of BackPropagation – Part 4

    Lecture 8: Vectorization Of BackPropagation – Part 5 – Batch Vectorization
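
    As a rough companion to Chapter 6, here is a hedged sketch of vectorized backpropagation for a one-hidden-layer network with sigmoid activations and squared error. The architecture, shapes, and toy data are illustrative assumptions, not the course's notation.

    ```python
    import numpy as np

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    def forward_backward(X, y, W1, W2):
        """Loss and gradients for L = 0.5 * sum((a2 - y)^2).
        Shapes: X (n, d), W1 (d, h), W2 (h, 1)."""
        z1 = X @ W1                # (n, h)
        a1 = sigmoid(z1)
        z2 = a1 @ W2               # (n, 1)
        a2 = sigmoid(z2)
        loss = 0.5 * np.sum((a2 - y) ** 2)
        # Backward pass: delta terms propagate the error layer by layer.
        d2 = (a2 - y) * a2 * (1 - a2)      # dL/dz2, shape (n, 1)
        dW2 = a1.T @ d2                    # (h, 1)
        d1 = (d2 @ W2.T) * a1 * (1 - a1)   # dL/dz1, shape (n, h)
        dW1 = X.T @ d1                     # (d, h)
        return loss, dW1, dW2

    rng = np.random.default_rng(0)
    X = rng.normal(size=(4, 3))
    y = rng.uniform(size=(4, 1))
    W1 = rng.normal(size=(3, 5))
    W2 = rng.normal(size=(5, 1))
    loss, dW1, dW2 = forward_backward(X, y, W1, W2)
    ```

    A useful habit when deriving backprop by hand is to check each analytic gradient against a finite-difference estimate.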

    Chapter 7: Regularization !

    Lecture 1: Regression, Overfitting, And Underfitting

    Lecture 2: Introduction To Regularization

    Lecture 3: Different Ways For Regularization

    Lecture 4: L1 vs L2 Regularization – Part 1 – Gradient Descent

    Lecture 5: L1 vs L2 Regularization -Part 2 – Numerical, Intuitive, And Graphical Comparison

    Lecture 6: Dropout ! – Intuition

    Lecture 7: Dropout vs Inverted Dropout

    Lecture 8: Dropout in a nutshell

    Lecture 9: Cross-Validation : How Do I Know I Am Overfitting Or Underfitting ?

    Chapter 8: Model Performance Metrics !

    Lecture 1: Class Imbalance – Why Is Accuracy Not Always The Best Metric ?

    Lecture 2: Precision – Recall , And F1 Score

    Lecture 3: F1 Score vs Simple Average

    Lecture 4: Precision-Recall Curve

    Lecture 5: ROC and AUC
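
    The metrics in Chapter 8 are simple to compute once the confusion-matrix counts are in hand. A small sketch, assuming binary labels in {0, 1} (the labels below are toy data):

    ```python
    # Precision = TP / (TP + FP), recall = TP / (TP + FN),
    # and F1 is the harmonic mean of the two.

    def precision_recall_f1(y_true, y_pred):
        tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
        fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
        fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
        precision = tp / (tp + fp) if tp + fp else 0.0
        recall = tp / (tp + fn) if tp + fn else 0.0
        f1 = (2 * precision * recall / (precision + recall)
              if precision + recall else 0.0)
        return precision, recall, f1

    # Imbalanced toy labels: accuracy looks fine, F1 exposes the misses.
    p, r, f1 = precision_recall_f1([1, 1, 0, 0, 0, 0], [1, 0, 1, 0, 0, 0])
    ```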

    Chapter 9: Improving Neural Network Performance – Part (I)

    Lecture 1: Gradient Descent With Momentum – Part 1

    Lecture 2: Gradient Descent With Momentum – Part 2

    Lecture 3: Adagrad And RMSProp

    Lecture 4: Adam And Learning Rate Decay

    Lecture 5: The Vanishing Gradient Problem

    Lecture 6: Input Centering And Normalization – Part 1

    Lecture 7: Input Centering And Normalization – Part 2

    Lecture 8: Weight Initialization – Part 1 – The Symmetry Problem

    Lecture 9: Weight Initialization – Part 2

    Lecture 10: Changing Activation Functions – Tanh – Relu – LeakyRelu

    Chapter 10: Maximum Likelihood Estimation Review

    Lecture 1: Source Of Those Lectures

    Lecture 2: Maximum Likelihood Estimation – Quick Overview

    Lecture 3: Maximum Likelihood Estimation Of Gaussian Distribution Parameters

    Chapter 11: Improving Neural Network Performance – Part (II)

    Lecture 1: The Sigmoid And Bernoulli Distribution

    Lecture 2: The Cross Entropy Cost Function – Derivation

    Lecture 3: The Cross Entropy & The Vanishing Gradient Problem

    Lecture 4: Cross Entropy In Multi-Class Problems

    Lecture 5: The Softmax Activation Function

    Lecture 6: BackPropagation Derivation For The Softmax Activation Function

    Lecture 7: Notes About Softmax
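
    A hedged sketch of the softmax activation and multi-class cross-entropy that Chapter 11 derives. The key result of the derivation is that, for softmax outputs p and a one-hot target t, the gradient dL/dz simplifies to p - t; the inputs below are toy values.

    ```python
    import math

    def softmax(z):
        m = max(z)                       # shift for numerical stability
        exps = [math.exp(v - m) for v in z]
        s = sum(exps)
        return [e / s for e in exps]

    def cross_entropy(p, t):
        """L = -sum_k t_k * log(p_k) for a one-hot target t."""
        return -sum(tk * math.log(pk) for pk, tk in zip(p, t) if tk > 0)

    z = [2.0, 1.0, 0.1]
    p = softmax(z)
    t = [1.0, 0.0, 0.0]
    loss = cross_entropy(p, t)
    grad = [pk - tk for pk, tk in zip(p, t)]   # dL/dz = p - t
    ```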

    Chapter 12: Batch Normalization !

    Lecture 1: Introduction To Batch Normalization – Part 1

    Lecture 2: Introduction To Batch Normalization – Part 2

    Lecture 3: Forward Pass Equations For Batch Normalization

    Lecture 4: Batch Normalization : Inference

    Lecture 5: Derivation Of Back Propagation Through Batch Normalization – Part 1

    Lecture 6: Derivation Of Back Propagation Through Batch Normalization – Part 2
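
    The forward-pass equations of Chapter 12 can be sketched as follows: normalize each feature over the batch, then scale and shift with the learned parameters gamma and beta. The inputs are toy values; the epsilon follows the commonly used 1e-5.

    ```python
    import numpy as np

    def batchnorm_forward(x, gamma, beta, eps=1e-5):
        mu = x.mean(axis=0)                  # per-feature batch mean
        var = x.var(axis=0)                  # per-feature batch variance
        x_hat = (x - mu) / np.sqrt(var + eps)
        return gamma * x_hat + beta

    x = np.array([[1.0, 2.0], [3.0, 6.0], [5.0, 10.0]])
    out = batchnorm_forward(x, gamma=np.ones(2), beta=np.zeros(2))
    ```

    With gamma = 1 and beta = 0, each output column has (approximately) zero mean and unit variance; at inference time, as the chapter explains, running averages of mu and var replace the batch statistics.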

    Chapter 13: Get My Other Courses !

    Lecture 1: Get My Other Courses !

    Instructors

  • Ahmed Fathy, MSc
    Senior Deep Learning Engineer @ Affectiva & Instructor
Rating Distribution

  • 1 star: 1 vote
  • 2 stars: 1 vote
  • 3 stars: 3 votes
  • 4 stars: 11 votes
  • 5 stars: 34 votes
Frequently Asked Questions

    How long do I have access to the course materials?

    You can view and review the lecture materials indefinitely, like an on-demand channel.

    Can I take my courses with me wherever I go?

    Definitely! If you have an internet connection, courses on Udemy are available on any device at any time. If you don’t have an internet connection, some instructors also let their students download course lectures. That’s up to the instructor though, so make sure you get on their good side!