The output of a convolutional layer is often passed through the ReLU activation function to introduce non-linearity into the model. ReLU takes the feature map and replaces all of the negative values with zero. A VGG block is a stack of 3x3 convolutions, each padded by one so that the output feature map keeps the same spatial size as the input.
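
As a minimal sketch of the idea (not taken from the original article), a VGG-style block could be written in PyTorch as below; the channel sizes, number of convolutions, and the trailing max-pooling layer are assumptions chosen for illustration.

```python
import torch
import torch.nn as nn

class VGGBlock(nn.Module):
    """VGG-style block: repeated 3x3 convolutions (padding=1 keeps the
    spatial size unchanged), each followed by ReLU, then 2x2 max pooling."""

    def __init__(self, in_channels: int, out_channels: int, num_convs: int = 2):
        super().__init__()
        layers = []
        for i in range(num_convs):
            layers.append(
                nn.Conv2d(
                    in_channels if i == 0 else out_channels,
                    out_channels,
                    kernel_size=3,
                    padding=1,  # padding of 1 preserves height and width
                )
            )
            layers.append(nn.ReLU(inplace=True))  # replace negative values with zero
        layers.append(nn.MaxPool2d(kernel_size=2, stride=2))  # halve the spatial size
        self.block = nn.Sequential(*layers)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.block(x)


# Example: a 3-channel 224x224 input passed through one block -> 64 x 112 x 112
x = torch.randn(1, 3, 224, 224)
block = VGGBlock(in_channels=3, out_channels=64)
print(block(x).shape)  # torch.Size([1, 64, 112, 112])
```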