@@ -7,7 +7,8 @@ using ExplainabilityMethods: LRPSupportedLayer, LRPSupportedActivation
 Check whether LRP can be used on a layer or a Chain.
 To extend LRP to your own layers, define:
 ```julia
-LRP_CONFIG.supports_layer(::MyLayer) = true
+LRP_CONFIG.supports_layer(::MyLayer) = true # for structs
+LRP_CONFIG.supports_layer(::typeof(mylayer)) = true # for functions
 ```
 """
 supports_layer(l) = false
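
To illustrate the struct branch of this docstring, here is a minimal sketch; `MyDoubleLayer` and its call overload are hypothetical and only stand in for a user-defined layer:

```julia
# Hypothetical user-defined layer, shown only to illustrate the docstring above.
struct MyDoubleLayer end
(::MyDoubleLayer)(x) = 2 .* x  # callable struct, in the style of Flux layers

# Register the struct type so the LRP model check accepts it:
LRP_CONFIG.supports_layer(::MyDoubleLayer) = true
```
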
@@ -18,7 +19,8 @@ supports_layer(::LRPSupportedLayer) = true
 Check whether LRP can be used on a given activation function.
 To extend LRP to your own activation functions, define:
 ```julia
-LRP_CONFIG.supports_activation(::MyActivation) = true
+LRP_CONFIG.supports_activation(::typeof(myactivation)) = true # for functions
+LRP_CONFIG.supports_activation(::MyActivation) = true # for structs
 ```
 """
 supports_activation(σ) = false
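
Likewise for activations, a minimal sketch of the function branch; `myclip` is a hypothetical activation used only for illustration:

```julia
# Hypothetical activation function.
myclip(x) = clamp(x, zero(x), one(x))

# Plain Julia functions are registered through their singleton type:
LRP_CONFIG.supports_activation(::typeof(myclip)) = true
```
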
@@ -73,7 +75,7 @@ function check_model(::Val{:LRP}, c::Chain; verbose=true)
 If you implemented custom layers, register them via
 ```julia
 LRP_CONFIG.supports_layer(::MyLayer) = true # for structs
-LRP_CONFIG.supports_activation(::typeof(mylayer)) = true # for functions
+LRP_CONFIG.supports_layer(::typeof(mylayer)) = true # for functions
 ```
 The default fallback for this layer will use Automatic Differentiation according to "Layer-Wise Relevance Propagation: An Overview".
 You can also define a fully LRP-custom rule for your layer by using the interface
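
For context, a rough sketch of the workflow this error message guides users through, assuming a recent Flux; `MyDoubleLayer` is the hypothetical layer from the sketch above, and the exact failure behaviour for an unregistered layer is not shown in this diff:

```julia
using Flux: Chain, Dense, relu

# Hypothetical custom layer, as sketched earlier.
struct MyDoubleLayer end
(::MyDoubleLayer)(x) = 2 .* x

model = Chain(Dense(10 => 5, relu), MyDoubleLayer())

# Without registration, check_model should flag MyDoubleLayer and surface the
# message edited above. Registering the layer lets the check pass:
LRP_CONFIG.supports_layer(::MyDoubleLayer) = true
check_model(Val(:LRP), model)
```
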