Machine learning is one of the hottest areas in Artificial Intelligence and Data Science, and Neural Networks are among the most well-regarded and widely used machine learning techniques.
A lot of Data Scientists use Neural Networks without understanding their internal structure. However, understanding the internal structure and mechanism of these techniques allows them to solve problems more efficiently, and to tune, tweak, and even design new Neural Networks for different projects.
This course is the easiest way to understand how Neural Networks work in detail. It also puts you ahead of many practitioners and gives you a better chance of joining the small pool of well-paid data scientists.
Why learn Neural Networks as a Data Scientist?
Machine learning is becoming more popular across industries every month, mainly to improve revenue and decrease costs. Neural Networks are extremely practical machine learning techniques for many kinds of projects. You can use them to automate and optimize the process of solving challenging tasks.
What does a data scientist need to learn about Neural Networks?
The first thing you need to learn is the mathematical models behind them. You will be surprised how easy and intuitive these models and equations are. This course starts with intuitive examples and takes you through the most fundamental mathematical models behind all Neural Networks. There is no equation in this course without an in-depth explanation and visual examples. If you hate math, then sit back, relax, and enjoy the videos to learn the math behind Neural Networks with minimum effort.
It is also important to know what types of problems can be solved with Neural Networks. This course shows different types of problems you can solve with Neural Networks, including classification, regression, and prediction. There are also several examples to practice solving such problems.
What does this course cover?
As discussed above, this course starts with an intuitive example showing what a single Neuron, the most fundamental component of Neural Networks, is. It also shows you the mathematical and conceptual model of a Neuron. After learning how easy and simple the mathematical model of a single Neuron is, you will see how it performs in a live demo.
The second part of this course covers terminologies in the field of machine learning, the mathematical model of a special type of neuron called the Perceptron, and its inspiration. We will go through the main components of a perceptron as well.
In the third part, we walk you through the process of training and learning in Neural Networks. This includes different error/cost functions, optimizing the cost function, the Gradient Descent algorithm, the impact of the learning rate, and challenges in this area.
In the first three parts of this course, you master how a single neuron (e.g., a Perceptron) works. This prepares you for the fourth part, where we learn how to make a network of these neurons. You will see how powerful even connecting two neurons is. We will learn the impact of multiple neurons and multiple layers on the outputs of a Neural Network. The main model here is the Multi-Layer Perceptron (MLP), one of the most well-regarded Neural Network models in both science and industry. This part of the course also includes Deep Neural Networks (DNN).
In the fifth section of this course, we will learn about the Backpropagation (BP) algorithm to train a multi-layer perceptron. The theory, mathematical model, and numerical example of this algorithm will be discussed in detail.
All the problems used in Sections 1-5 are classification problems, a very important task with a wide range of real-world applications. For instance, you can classify customers based on their interest in a certain product category. However, there are problems that require prediction. Such problems are solved by regression models, and Neural Networks can play the role of a regression method as well. This is exactly what we will be learning in Section 6 of this course. We start with an intuitive example of doing regression using a single neuron. There is a live demo as well to show how a neuron plays the role of a regression model. Other things you will learn in this section are: linear regression, logistic (non-linear) regression, regression examples and issues, multiple regression, and an MLP with three layers to solve any type of regression problem.
The last part of this course covers problem-solving using Neural Networks. We will be using Neuroph, a Java-based framework, to see examples of Neural Networks in the areas of handwritten character recognition and image processing. If you have never used Neuroph before, there is nothing to worry about. There are several videos showing you the steps to create and run projects in Neuroph.
By the end of this course, you will have a comprehensive understanding of Neural Networks and be able to use them easily in your projects. You will also be able to analyze, tune, and improve the performance of Neural Networks for your own projects.
Does this course suit you?
This course is an introduction to Neural Networks, so you need absolutely no prior knowledge in Artificial Intelligence or Machine Learning. However, you need a basic understanding of programming, especially in Java, to easily follow the coding videos. If you just want to learn the mathematical models and the problem-solving process using Neural Networks, you can skip the coding videos.
Who is the instructor?
I am a leading researcher in the field of Machine Learning with expertise in Neural Networks and Optimization. I have more than 150 publications, including 80 journal articles, 3 books, and 20 conference papers. These publications have been cited over 13,000 times around the world. As a leading researcher in this field with over 10 years of experience, I have prepared this course to make everything easy for those interested in Machine Learning and Neural Networks. I have also consulted for big companies such as Facebook and Google during my career. As a rising-star Udemy instructor with more than 5,000 students and 1,000 5-star reviews, I have designed and developed this course to facilitate the process of learning Neural Networks for those who are interested in this area. You will have my full support throughout your Neural Networks journey in this course.
There is no RISK.
I have some preview videos, so make sure to watch them to see if this course is for you. This course comes with a full 30-day money-back guarantee, which means that if you are not happy after your purchase, you can get a 100% refund, no questions asked.
What are you waiting for?
Enroll now using the “Add to Cart” button on the right and get started today.
Let's start with a quick and intuitive analogy to see what the purpose of a neuron is.
PLEASE NOTE: In this video, I use a simplified example to explain classification using neural networks. The example of 'boys love gaming and girls love shopping' was based on personal experiences with my partner and was intended purely for illustration. However, I understand that interests vary greatly across individuals, regardless of gender. I value inclusivity and diversity, and this example was not meant to reinforce any stereotypes.
This lesson shows the mathematical model of an artificial neuron.
This lesson shows how we turn the mathematical equations from the last video into a model of a neuron.
This lecture shows how an artificial neuron works in action. I have written a program that allows you to interactively change the weights and bias of a neuron to see how they change the shape of the output line.
This lecture introduces the terminologies in the areas of Machine Learning and Neural Networks.
A Perceptron is a neuron with a special transfer (activation) function. This lecture shows how to mathematically model a perceptron with more than two inputs.
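As a quick illustration of that model, here is a minimal sketch of a perceptron's forward pass with an arbitrary number of inputs. It assumes a step transfer function and made-up weights; the names are mine, not the course's.

```java
public class PerceptronForwardPass {
    // output = step(w1*x1 + w2*x2 + ... + wn*xn + bias)
    static int output(double[] inputs, double[] weights, double bias) {
        double sum = bias;
        for (int i = 0; i < inputs.length; i++) {
            sum += weights[i] * inputs[i];     // weighted sum over all inputs
        }
        return sum >= 0 ? 1 : 0;               // step transfer function
    }

    public static void main(String[] args) {
        double[] x = {1.0, 0.5, -0.3};         // three inputs instead of two
        double[] w = {0.4, -0.2, 0.7};         // made-up connection weights
        System.out.println(output(x, w, 0.1)); // prints 1 for these values
    }
}
```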
Now you know how a single neuron and a perceptron work with two or more inputs. It is time to learn about their inspiration.
This lecture covers the concepts of training and learning in Neural Networks. You will learn that the problem of training/learning in a Neural Network is to minimize the cost function.
In the last lecture, we realized that we have to minimize the cost function in Neural Networks to classify a data set. There are different cost functions in the field of Neural Networks, and we will learn about the most popular ones in this video.
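For example, one of the most popular cost functions is the mean squared error. Here is a small sketch of how it could be computed; the class and method names are mine, and the exact functions covered in the video may differ.

```java
public class CostFunctionSketch {
    // Mean squared error over a batch of desired targets and network outputs.
    static double meanSquaredError(double[] targets, double[] outputs) {
        double sum = 0.0;
        for (int i = 0; i < targets.length; i++) {
            double error = targets[i] - outputs[i];
            sum += error * error;              // squared error for each sample
        }
        return sum / targets.length;           // average over the batch
    }

    public static void main(String[] args) {
        double[] targets = {1.0, 0.0, 1.0};
        double[] outputs = {0.9, 0.2, 0.6};
        System.out.println(meanSquaredError(targets, outputs)); // ~0.07
    }
}
```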
This video shows you the process of minimizing a cost function using the Gradient Descent algorithm.
To better understand the Gradient Descent algorithm, this video takes you through a numerical example. In this lecture, we focus on finding optimal values for the connection weights.
This lecture shows how to find the optimal values for the biases in Neural Networks using the Gradient Descent algorithm.
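To make the update rule concrete, here is a minimal sketch of gradient descent on a single weight and bias. The toy cost function and all names are my own; the course's numerical example uses its own values.

```java
public class GradientDescentSketch {
    public static void main(String[] args) {
        double weight = 2.0, bias = -1.0;
        double learningRate = 0.1;                  // controls the step size

        for (int step = 0; step < 50; step++) {
            // Toy cost: C(w, b) = (w - 0.5)^2 + (b - 0.2)^2, minimized at w = 0.5, b = 0.2.
            double gradW = 2 * (weight - 0.5);      // dC/dw
            double gradB = 2 * (bias - 0.2);        // dC/db
            weight -= learningRate * gradW;         // move against the gradient
            bias   -= learningRate * gradB;
        }
        System.out.println("w = " + weight + ", b = " + bias); // approaches 0.5 and 0.2
    }
}
```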
The learning rate has a significant impact on the performance of the Gradient Descent algorithm. This video shows the impact of the learning rate and gives some recommendations for choosing a good value for it.
There are several challenges when training Neural Networks. This video discusses the most important ones to be considered when solving real-world problems.
This lecture takes you through the steps of implementing a Perceptron in Java.
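If you want a taste of what such an implementation can look like, here is a tiny, self-contained sketch that uses the classic perceptron learning rule to learn the logical AND function. It is my own illustration under simple assumptions (step transfer function, learning rate of 1), not the code written in the lecture.

```java
public class TinyPerceptron {
    double[] weights = {0.0, 0.0};
    double bias = 0.0;
    double learningRate = 1.0;

    int predict(double[] x) {
        double sum = bias + weights[0] * x[0] + weights[1] * x[1];
        return sum >= 0 ? 1 : 0;                                  // step transfer function
    }

    void train(double[][] inputs, int[] targets, int epochs) {
        for (int e = 0; e < epochs; e++) {
            for (int i = 0; i < inputs.length; i++) {
                int error = targets[i] - predict(inputs[i]);      // 0 when the sample is already correct
                weights[0] += learningRate * error * inputs[i][0];
                weights[1] += learningRate * error * inputs[i][1];
                bias += learningRate * error;
            }
        }
    }

    public static void main(String[] args) {
        double[][] data = {{0, 0}, {0, 1}, {1, 0}, {1, 1}};       // logical AND is linearly separable
        int[] labels = {0, 0, 0, 1};
        TinyPerceptron p = new TinyPerceptron();
        p.train(data, labels, 20);
        System.out.println(p.predict(new double[]{1, 1}));        // expected: 1
    }
}
```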
We have been using one transfer function so far for our models. The step transfer function is good for binary classification problems. For other types of problems, we might need different transfer functions. This lecture introduces a wide range of transfer functions.
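For illustration, here are a few transfer functions that are commonly used alongside the step function; the exact set covered in the lecture may differ.

```java
public class TransferFunctions {
    static double step(double x)    { return x >= 0 ? 1.0 : 0.0; }          // binary output
    static double sigmoid(double x) { return 1.0 / (1.0 + Math.exp(-x)); }  // smooth, between 0 and 1
    static double tanh(double x)    { return Math.tanh(x); }                // smooth, between -1 and 1
    static double relu(double x)    { return Math.max(0.0, x); }            // piecewise linear

    public static void main(String[] args) {
        double x = 0.3;
        System.out.println(step(x) + " " + sigmoid(x) + " " + tanh(x) + " " + relu(x));
    }
}
```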
After mastering the Perceptron and the process of training it, it is time to see how to make a network of neurons. Yes, this is what we call a Neural Network. We will see what happens when we add one more neuron.
In this lecture, we will learn the impact of adding a new layer in a Multi-Layer Perceptron (MLP).
This lecture shows you the impact of changing the weights and biases of an MLP on the shape of its output.
In the MLP model, we can add as many layers as we want, but the question is: what happens when we include more layers? Let's find out the answer in this video.
The Backpropagation (BP) algorithm is a gradient-based method for training MLPs. This video takes you through the theory and steps of this algorithm.
Momentum is a new parameter in the BP algorithm, in addition to the learning rate. It helps BP jump out of locally optimal solutions. In this video, we will see its impact on the performance of BP.
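As a rough sketch of the idea, a fraction of the previous weight update is added to the current one, so the search keeps some of its "speed" and can roll past shallow local optima. The toy gradient and all names below are my own, meant only to show the shape of the update.

```java
public class MomentumSketch {
    public static void main(String[] args) {
        double weight = 2.0, previousDelta = 0.0;
        double learningRate = 0.1, momentum = 0.9;

        for (int step = 0; step < 40; step++) {
            double gradient = 2 * (weight - 0.5);   // toy gradient of C(w) = (w - 0.5)^2
            double delta = -learningRate * gradient + momentum * previousDelta;
            weight += delta;                        // combined step: gradient term + momentum term
            previousDelta = delta;                  // remember this update for the next step
        }
        System.out.println("weight = " + weight);   // oscillates toward 0.5
    }
}
```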
This lecture takes you through the process of solving regression and prediction problems using MLPs. We will be learning about both linear and logistic regression using MLPs.
This video shows you a live example of an MLP designed for doing regression. You will learn about the impact of the connection weights and biases on the output shape of MLPs.
This lesson covers examples and issues in the process of doing regression using MLPs.
In the previous video, we learned about regression problems with one independent variable, which require an MLP with one input in the first layer. But what if we have more than one independent variable? This video answers that question.
Do you want to see a live demo of how an MLP solves problems that require multiple regression? Well, let's watch this video then.
There is a theorem in the field of Neural Networks which states that MLPs are universal approximators. This video is about this theorem.
This video shows the Neuroph website and its user interface.
This lesson includes the steps to create, train, and test an artificial neuron in Neuroph.
This lesson shows the steps of designing, training, and testing MLPs in Neuroph.
Neuroph has a large number of sample projects, which are very good for learning. This video shows where to find and how to use them.
Neuroph offers a wide range of visualization methods and is very user friendly. This video takes you through the steps of using one of the visualization tools to see the output of an MLP.
This lesson shows the process of hand-written character recognition in Neuroph.
In this lesson, you will learn how to recognize images using MLPs in Neuroph.
Download my book on NNs below.