◾️ #BackPropagation is an algorithm used to train neural networks. It fine-tunes the weights of a neural network based on the error rate obtained in the previous epoch (i.e., iteration).
A Complete 🧵
✅ Backpropagation is an algorithm for supervised learning of artificial neural networks using #gradientdescent
Given an artificial neural network and an error function, the method calculates the gradient of the error function with respect to the network's weights using the chain rule
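The chain-rule computation above can be sketched for a tiny one-hidden-layer network. Everything here (layer sizes, tanh activation, MSE loss, variable names) is an illustrative assumption, not something specified in the thread:

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=(4, 3))   # 4 samples, 3 input features (assumed sizes)
y = rng.normal(size=(4, 1))   # targets
W1 = rng.normal(size=(3, 5))  # input -> hidden weights
W2 = rng.normal(size=(5, 1))  # hidden -> output weights

def forward(W1, W2):
    h = np.tanh(x @ W1)               # hidden activations
    out = h @ W2                      # linear output layer
    loss = np.mean((out - y) ** 2)    # MSE error function
    return h, out, loss

h, out, loss = forward(W1, W2)

# Backward pass: apply the chain rule layer by layer, from output to input.
d_out = 2 * (out - y) / y.size        # dLoss/dOut
dW2 = h.T @ d_out                     # dLoss/dW2
d_h = d_out @ W2.T                    # error propagated back to hidden layer
d_pre = d_h * (1 - h ** 2)            # through tanh' = 1 - tanh^2
dW1 = x.T @ d_pre                     # dLoss/dW1

# Sanity check one entry of dW1 against a numerical (finite-difference) gradient.
eps = 1e-6
W1p = W1.copy()
W1p[0, 0] += eps
num = (forward(W1p, W2)[2] - loss) / eps
print(abs(num - dW1[0, 0]))  # small value: analytic and numeric gradients agree
```

The finite-difference check is a standard way to verify a hand-derived backward pass before trusting it for training.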
◾️ #Padding is simply the process of adding layers of zeros around our input images
◾️ #Stride describes the step size of the kernel as you slide a filter over an input image
A Complete 🧵
◾️ Padding is simply the process of adding layers of zeros around our input images.
The purpose of padding is to preserve the original size of an image when applying a #convolutional filter, and to enable the filter to perform full convolutions on the edge pixels
◾️ So to prevent this shrinking -
We will be using padding of size 2 (i.e. original image (5) → feature map (3)).
It is also known as zero padding because we are padding the image with 0s
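The effect of padding and stride on output size can be sketched with a naive 2D convolution. The 5×5 image, 3×3 kernel, and the `conv2d` helper are illustrative assumptions; note that `padding=1` here means one layer of zeros per side, i.e. 2 extra rows/columns per dimension in total:

```python
import numpy as np

def conv2d(img, kernel, padding=0, stride=1):
    """Naive valid convolution with optional zero padding and stride."""
    img = np.pad(img, padding)  # add `padding` layers of zeros on every side
    k = kernel.shape[0]
    out_size = (img.shape[0] - k) // stride + 1
    out = np.zeros((out_size, out_size))
    for i in range(out_size):
        for j in range(out_size):
            patch = img[i*stride:i*stride+k, j*stride:j*stride+k]
            out[i, j] = np.sum(patch * kernel)
    return out

image = np.arange(25.0).reshape(5, 5)   # 5x5 input image
kernel = np.ones((3, 3))                # 3x3 filter

print(conv2d(image, kernel).shape)             # (3, 3): 5 shrinks to 3
print(conv2d(image, kernel, padding=1).shape)  # (5, 5): size preserved
print(conv2d(image, kernel, stride=2).shape)   # (2, 2): stride skips positions
```

This follows the usual output-size formula, (n + 2p - k) / s + 1, so with zero padding the 5→3 shrinkage from the thread is undone.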