Learn Python
Learn Data Structures & Algorithms
Learn NumPy
Learn Pandas
Learn Matplotlib
Learn Seaborn
Learn Statistics
Learn Math
Learn MATLAB
Learn Machine Learning
Learn GitHub
Learn OpenCV
Introduction
Setup
ANN
Working process of ANN
Propagation
Bias parameter
Activation function
Loss function
Overfitting and Underfitting
Optimization function
Chain rule
Minima
Gradient problem
Weight initialization
Dropout
ANN Regression Exercise
ANN Classification Exercise
Hyperparameter tuning
CNN
CNN basics
Convolution
Padding
Pooling
Data augmentation
Flattening
Create Custom Dataset
Binary Classification Exercise
Multiclass Classification Exercise
Transfer learning
Transfer model basic template
RNN
How RNN works
LSTM
Bidirectional RNN
Sequence-to-sequence
Attention model
Transformer model
Bag of words
Tokenization & Stop words
Stemming & Lemmatization
TF-IDF
N-Gram
Word embedding
Normalization
POS tagging
Parser
Semantic analysis
Regular expression
Learn MySQL
Learn MongoDB
Learn Web scraping
Learn Excel
Learn Power BI
Learn Tableau
Learn Docker
Learn Hadoop
Forward propagation refers to everything the neural network does to predict a value. In each neuron, we first multiply the inputs by their weights, sum the products, and then add a bias. After this, we apply an activation function. This happens in every neuron, from the hidden layers through to the output layer. The computation moves from left to right: first the input layer, then the hidden layers, then the output layer, so we are always moving in the forward direction. Once this pass is complete we get an output, and together all these steps are called forward propagation. In forward propagation we cannot go back or compute in the backward direction; we can only move forward.
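As a concrete illustration, here is a minimal NumPy sketch of a forward pass. The 2-3-1 layer sizes, the random parameter values, and the choice of sigmoid as the activation function are all illustrative assumptions, not something fixed by the roadmap above.

```python
import numpy as np

def sigmoid(z):
    # Activation function: squashes each pre-activation into (0, 1)
    return 1.0 / (1.0 + np.exp(-z))

def forward(x, weights, biases):
    # Move left to right: input layer -> hidden layers -> output layer
    a = x
    for W, b in zip(weights, biases):
        z = W @ a + b   # multiply weights by inputs, sum them, add the bias
        a = sigmoid(z)  # apply the activation function
    return a

# Illustrative 2-3-1 network with random (hypothetical) parameters
rng = np.random.default_rng(0)
weights = [rng.standard_normal((3, 2)), rng.standard_normal((1, 3))]
biases = [rng.standard_normal(3), rng.standard_normal(1)]
x = np.array([0.5, -1.2])
print(forward(x, weights, biases))  # the network's predicted output
```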
Back-propagation happens after forward propagation. Once the forward pass produces an output, we calculate the loss, and our job is to reduce that loss as much as possible by updating the weights. Everything involved in reducing the loss belongs to back-propagation. It is called back-propagation because the computation runs in the backward direction: it starts at the output layer, moves through the hidden layers, and ends at the input layer. The goal of this process is to update the weights using an optimizer so that the loss decreases.
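Below is a minimal sketch of one back-propagation step on the same kind of tiny network, assuming a mean-squared-error loss, sigmoid activations, and plain gradient descent as the optimizer; the 2-3-1 layer sizes and all numeric values are illustrative.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Tiny 2-3-1 network with illustrative random parameters
rng = np.random.default_rng(0)
W1, b1 = rng.standard_normal((3, 2)), np.zeros(3)
W2, b2 = rng.standard_normal((1, 3)), np.zeros(1)
x, y = np.array([0.5, -1.2]), np.array([1.0])
lr = 0.1  # learning rate of the plain gradient-descent optimizer

# Forward pass first: back-propagation starts from its output
z1 = W1 @ x + b1
a1 = sigmoid(z1)
z2 = W2 @ a1 + b2
y_hat = sigmoid(z2)
loss = 0.5 * np.sum((y_hat - y) ** 2)  # mean-squared-error loss

# Backward pass: chain rule, moving output layer -> hidden layer -> input
d_z2 = (y_hat - y) * y_hat * (1 - y_hat)  # dL/dz2 (sigmoid derivative)
d_W2 = np.outer(d_z2, a1)                 # gradient for layer-2 weights
d_b2 = d_z2                               # gradient for layer-2 bias
d_a1 = W2.T @ d_z2                        # propagate the error backward
d_z1 = d_a1 * a1 * (1 - a1)               # dL/dz1
d_W1 = np.outer(d_z1, x)                  # gradient for layer-1 weights
d_b1 = d_z1                               # gradient for layer-1 bias

# Update step: move each weight against its gradient to reduce the loss
W2 -= lr * d_W2; b2 -= lr * d_b2
W1 -= lr * d_W1; b1 -= lr * d_b1
```

Repeating this forward-then-backward cycle over many examples is what gradually drives the loss down during training.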