1 file changed, 5 insertions(+), 1 deletion(-)

@@ -51,7 +51,11 @@ model = strip_softmax(model)
 # Applying the [`GammaRule`](@ref) to two linear layers in a row will yield different results
 # than first fusing the two layers into one linear layer and then applying the rule.
 # This fusing is called "canonization" and can be done using the [`canonize`](@ref) function:
-model = canonize(model)
+model_canonized = canonize(model)
+
+# After canonization, the first `BatchNorm` layer has been fused into the preceding `Conv` layer.
+# The second `BatchNorm` layer wasn't fused
+# since its preceding `Conv` layer has a ReLU activation function.

 # ### [Flattening the model](@id docs-lrp-flatten-model)
 # ExplainableAI.jl's LRP implementation supports nested Flux Chains and Parallel layers.
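
For reference, a minimal sketch of the fusing behavior described in the added comments. The model below is hypothetical (not taken from the docs page) and assumes Flux and ExplainableAI are loaded; the layer shapes are chosen only for illustration:

using Flux
using ExplainableAI

# Hypothetical toy model: the first Conv has the default identity activation,
# so the BatchNorm that follows it can be fused; the second Conv has a ReLU
# activation, which prevents fusing with its BatchNorm.
model = Chain(
    Conv((3, 3), 3 => 8),         # identity activation: fusable with the BatchNorm below
    BatchNorm(8, relu),
    Conv((3, 3), 8 => 16, relu),  # ReLU activation: blocks fusing
    BatchNorm(16, relu),
)

model_canonized = canonize(model)
# Expected: the first BatchNorm is folded into its preceding Conv,
# while the second BatchNorm remains a separate layer.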