All convolutions inside a dense block use batch normalization and are ReLU-activated. Channel-wise concatenation is only feasible if the height and width of the feature maps stay unchanged, so all convolutions in a dense block have stride one. Pooling layers are inserted between dense blocks for downsampling.
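The idea can be sketched with a toy NumPy example: a stand-in layer (hypothetical `toy_conv`, not a real convolution) preserves height and width so its output can be concatenated with all earlier feature maps along the channel axis, and a pooling step between blocks halves the spatial dimensions. The growth rate and shapes here are illustrative assumptions, not values from the text.

```python
import numpy as np

def toy_conv(x, growth_rate=4):
    # Stand-in for a BN -> ReLU -> stride-1 conv with 'same' padding:
    # it emits `growth_rate` new channels while keeping H and W unchanged,
    # which is what makes channel-wise concatenation possible.
    pooled = np.maximum(x.mean(axis=0, keepdims=True), 0)  # ReLU-like
    return pooled.repeat(growth_rate, axis=0)              # (growth_rate, H, W)

def dense_block(x, num_layers=3):
    # Each layer sees the channel-wise concatenation of all previous
    # feature maps; spatial dims must match for this to work.
    features = [x]
    for _ in range(num_layers):
        concat = np.concatenate(features, axis=0)  # axis 0 = channel axis
        features.append(toy_conv(concat))
    return np.concatenate(features, axis=0)

def avg_pool2x2(x):
    # Pooling between dense blocks: halves height and width.
    c, h, w = x.shape
    return x.reshape(c, h // 2, 2, w // 2, 2).mean(axis=(2, 4))

x = np.random.rand(8, 32, 32)          # (channels, height, width)
out = dense_block(x)                   # 8 + 3 * 4 = 20 channels, H and W kept
down = avg_pool2x2(out)                # spatial dims halved between blocks
print(out.shape, down.shape)           # (20, 32, 32) (20, 16, 16)
```

Note that downsampling happens only in the pooling step between blocks; inside the block every layer keeps the 32x32 spatial extent so the concatenations line up.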