All convolutions in a dense block are ReLU-activated and use batch normalization. Channel-wise concatenation is only possible if the height and width of the feature maps stay unchanged, so all convolutions within a dense block use stride 1. Pooling layers are inserted between dense blocks to reduce spatial dimensionality.
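The shape constraints above can be illustrated with a small NumPy sketch (an assumption for illustration only; the shapes, the 2x2 pooling, and the NCHW layout are chosen here, not taken from the original):

```python
import numpy as np

# Feature maps in NCHW layout: (batch, channels, height, width).
x = np.zeros((1, 64, 32, 32))             # input to a dense block
new_feats = np.zeros((1, 32, 32, 32))     # output of a stride-1 conv layer

# Channel-wise concatenation works because H and W are unchanged.
out = np.concatenate([x, new_feats], axis=1)
print(out.shape)  # (1, 96, 32, 32)

# A stride-2 conv would halve H and W, so concatenation with the
# block's input would fail with a shape mismatch:
downsampled = np.zeros((1, 32, 16, 16))
try:
    np.concatenate([x, downsampled], axis=1)
except ValueError:
    print("mismatched spatial dims: cannot concatenate")

# Between dense blocks, pooling is free to shrink H and W.
# A 2x2 average pool expressed with reshape + mean:
pooled = out.reshape(1, 96, 16, 2, 16, 2).mean(axis=(3, 5))
print(pooled.shape)  # (1, 96, 16, 16)
```

This is why downsampling lives between blocks rather than inside them: within a block every layer's output must remain concatenable with all earlier feature maps.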