The output from the convolutional layer is often passed through the ReLU activation function to introduce non-linearity into the model. ReLU takes the feature map and replaces all negative values with zero. In the convolution layer itself, we slide the filter/kernel to every possible position over the input matrix; at each position, element-wise multiplication between the kernel and the overlapping input values is performed, and the products are summed to produce one entry of the feature map.
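As a minimal sketch of these two steps, the NumPy snippet below slides a kernel over a single-channel input with no padding and a stride of one, then applies ReLU to the resulting feature map. The function names and the sample array values are illustrative, not from any particular library:

```python
import numpy as np

def conv2d_valid(image, kernel):
    """Slide the kernel over every valid position of the input,
    multiply element-wise, and sum to build the feature map."""
    ih, iw = image.shape
    kh, kw = kernel.shape
    oh, ow = ih - kh + 1, iw - kw + 1
    feature_map = np.zeros((oh, ow))
    for i in range(oh):
        for j in range(ow):
            patch = image[i:i + kh, j:j + kw]   # region under the kernel
            feature_map[i, j] = np.sum(patch * kernel)
    return feature_map

def relu(x):
    """Replace all negative values with zero."""
    return np.maximum(x, 0)

# Illustrative input and kernel values.
image = np.array([[1, -2,  3,  0],
                  [0,  1, -1,  2],
                  [2,  0,  1, -3],
                  [-1, 2,  0,  1]], dtype=float)
kernel = np.array([[1,  0],
                   [0, -1]], dtype=float)

activated = relu(conv2d_valid(image, kernel))
print(activated)
```

In practice, deep learning frameworks compute the same operation with optimized routines, but the nested loops above make the "slide, multiply, sum" pattern explicit.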