I compared your residual architecture to the standard residual architecture as shown in this cheat sheet.
Your Implementation
- input
- Conv2D 75x4x4
- BatchNormalization
- Add input
- LeakyReLU
Actual Implementation
- input
- Conv2D 256x3x3
- BatchNormalization
- ReLU
- Conv2D 256x3x3
- BatchNormalization
- Add input
- ReLU
I'm just going to put this here so that people are aware of the difference in implementation: the standard block has two Conv2D + BatchNormalization stages before the skip connection is added back, and it uses ReLU rather than LeakyReLU.
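The defining feature of both variants is the skip connection: the block's input is added back to the transformed output before the final activation. Here is a minimal NumPy sketch of that pattern; the `transform` argument is a hypothetical stand-in for the Conv2D + BatchNormalization stage(s), since the exact layers are what differs between the two implementations above.

```python
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

def residual_block(x, transform):
    # transform stands in for the conv/batch-norm stage(s).
    # The key point: the block's INPUT is added back to the
    # transformed output before the final ReLU is applied.
    return relu(transform(x) + x)

# Toy transform (a simple scaling, not a real convolution),
# just to make the skip-connection arithmetic visible.
x = np.array([1.0, -2.0, 3.0])
y = residual_block(x, lambda v: 0.5 * v)
# y = relu(0.5*x + x) = relu([1.5, -3.0, 4.5]) -> [1.5, 0.0, 4.5]
```

In a real Keras model, `transform` would be the Conv2D 256x3x3 → BatchNormalization → ReLU → Conv2D 256x3x3 → BatchNormalization stack from the "Actual Implementation" list, followed by an `Add` layer and a final ReLU.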