
Commit 39f2669

Fix doc string in model checks
1 parent 11ea2b7 commit 39f2669

1 file changed

src/lrp_checks.jl

Lines changed: 5 additions & 3 deletions
````diff
@@ -7,7 +7,8 @@ using ExplainabilityMethods: LRPSupportedLayer, LRPSupportedActivation
 Check whether LRP can be used on a layer or a Chain.
 To extend LRP to your own layers, define:
 ```julia
-LRP_CONFIG.supports_layer(::MyLayer) = true
+LRP_CONFIG.supports_layer(::MyLayer) = true # for structs
+LRP_CONFIG.supports_layer(::typeof(mylayer)) = true # for functions
 ```
 """
 supports_layer(l) = false
````
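The two registration forms in the updated docstring correspond to the two ways a layer can be written in Julia. A minimal sketch of how a user might register both, assuming `LRP_CONFIG` can be imported from the package; the struct `MyDoubler` and the function `mylayer` are hypothetical stand-ins, not part of the package:

```julia
using ExplainabilityMethods: LRP_CONFIG  # assumes LRP_CONFIG is accessible here

# Hypothetical layer written as a callable struct.
struct MyDoubler end
(::MyDoubler)(x) = 2 .* x

# Hypothetical layer written as a plain function.
mylayer(x) = x .+ 1

LRP_CONFIG.supports_layer(::MyDoubler) = true        # matches instances of the struct
LRP_CONFIG.supports_layer(::typeof(mylayer)) = true  # matches the function via its singleton type
```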
````diff
@@ -18,7 +19,8 @@ supports_layer(::LRPSupportedLayer) = true
 Check whether LRP can be used on a given activation function.
 To extend LRP to your own activation functions, define:
 ```julia
-LRP_CONFIG.supports_activation(::MyActivation) = true
+LRP_CONFIG.supports_activation(::typeof(myactivation)) = true # for functions
+LRP_CONFIG.supports_activation(::MyActivation) = true # for structs
 ```
 """
 supports_activation(σ) = false
````
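The `::typeof(...)` form works because every Julia function has its own singleton type, so a plain function can participate in dispatch just like a struct. A small sketch with a hypothetical activation `myrelu`:

```julia
using ExplainabilityMethods: LRP_CONFIG  # assumes LRP_CONFIG is accessible here

# Hypothetical activation defined as a plain function.
myrelu(x) = max(zero(x), x)

# typeof(myrelu) is a singleton type, so this method applies
# exactly when the function object `myrelu` is passed in.
LRP_CONFIG.supports_activation(::typeof(myrelu)) = true
```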
````diff
@@ -73,7 +75,7 @@ function check_model(::Val{:LRP}, c::Chain; verbose=true)
 If you implemented custom layers, register them via
 ```julia
 LRP_CONFIG.supports_layer(::MyLayer) = true # for structs
-LRP_CONFIG.supports_activation(::typeof(mylayer)) = true # for functions
+LRP_CONFIG.supports_layer(::typeof(mylayer)) = true # for functions
 ```
 The default fallback for this layer will use Automatic Differentiation according to "Layer-Wise Relevance Propagation: An Overview".
 You can also define a fully LRP-custom rule for your layer by using the interface
````
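The message above is what `check_model` emits when it encounters an unregistered layer. A sketch of the call site, matching the signature shown in the hunk header and assuming `check_model` can be imported from the package; the model is hypothetical and reuses `MyDoubler` from the sketch above:

```julia
using Flux: Chain, Dense
using ExplainabilityMethods: check_model  # assumes check_model is accessible here

# Hypothetical model containing the custom layer from the earlier sketch.
model = Chain(Dense(2, 2), MyDoubler())
check_model(Val(:LRP), model)  # passes only once every layer is registered
```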
