All convolutions in a dense block use batch normalization and ReLU activation. Channel-wise concatenation is only possible if the height and width of the feature maps stay unchanged, so convolutions within a dense block all use stride one. Pooling layers are inserted between dense blocks to reduce the spatial dimensions.
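The pattern above can be sketched in PyTorch (a minimal illustration, not a reference implementation; the class name, growth rate, and layer count are assumptions for the example). Each layer applies BN, ReLU, and a stride-one 3x3 convolution, then its output is concatenated channel-wise with everything that came before:

```python
import torch
import torch.nn as nn

class DenseBlock(nn.Module):
    """Minimal dense block sketch: every layer receives the channel-wise
    concatenation of all preceding feature maps. Stride-one convolutions
    with padding keep height and width fixed, which is what makes the
    concatenation valid."""

    def __init__(self, in_channels: int, growth_rate: int, num_layers: int):
        super().__init__()
        self.layers = nn.ModuleList()
        for i in range(num_layers):
            # Input channels grow by growth_rate with each preceding layer.
            channels = in_channels + i * growth_rate
            self.layers.append(nn.Sequential(
                nn.BatchNorm2d(channels),
                nn.ReLU(inplace=True),
                nn.Conv2d(channels, growth_rate, kernel_size=3,
                          stride=1, padding=1, bias=False),
            ))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        for layer in self.layers:
            # Concatenate along the channel dimension; H and W are unchanged.
            x = torch.cat([x, layer(x)], dim=1)
        return x
```

Because spatial size never changes inside the block, downsampling has to happen between blocks, which is exactly where the pooling layers go.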