docs/src/examples/sensitivity-analysis-svm.jl (25 additions, 24 deletions)
@@ -4,7 +4,7 @@
 # This notebook illustrates sensitivity analysis of data points in a [Support Vector Machine](https://en.wikipedia.org/wiki/Support-vector_machine) (inspired from [@matbesancon](http://github.com/matbesancon)'s [SimpleSVMs](http://github.com/matbesancon/SimpleSVMs.jl).)
-# For reference, Section 10.1 of https://online.stat.psu.edu/stat508/book/export/html/792 gives an intuitive explanation of what it means to have a sensitive hyperplane or data point. The general form of the SVM training problem is given below (without regularization):
+# For reference, Section 10.1 of https://online.stat.psu.edu/stat508/book/export/html/792 gives an intuitive explanation of what it means to have a sensitive hyperplane or data point. The general form of the SVM training problem is given below (with $\ell_2$ regularization):
 # ```math
 # \begin{split}
@@ -19,25 +19,27 @@
 # - `X`, `y` are the `N` data points
 # - `w` is the support vector
 # - `b` determines the offset `b/||w||` of the hyperplane with normal `w`
-# - `ξ` is the soft-margin loss.
-#
+# - `ξ` is the soft-margin loss
+# - `λ` is the $\ell_2$ regularization.
+#
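The diff truncates the `math` block that precedes this list, so the objective itself is not visible here. Given the variables listed above, the regularized soft-margin primal presumably takes a form like the following (a sketch only; the exact weighting of the terms in the source file may differ):

```latex
\begin{split}
\min_{w, b, \xi} \quad & \lambda \lVert w \rVert_2^2 + \sum_{i=1}^{N} \xi_i \\
\text{s.t.} \quad & y_i (w^\top X_i + b) \ge 1 - \xi_i, \quad i = 1, \ldots, N, \\
& \xi_i \ge 0, \quad i = 1, \ldots, N.
\end{split}
```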
 # This tutorial uses the following packages
 using JuMP # The mathematical programming modelling language
 import DiffOpt # JuMP extension for differentiable optimization
 import Ipopt # Optimization solver that handles quadratic programs
 import Plots # Graphing tool
-import LinearAlgebra: dot, norm, normalize!
+import LinearAlgebra: dot, norm
 import Random
# ## Define and solve the SVM
-# Construct separable, non-trivial data points.
+# Construct two clusters of data points.
 N = 100
 D = 2
+
 Random.seed!(62)
-X = vcat(randn(N ÷ 2, D), randn(N ÷ 2, D) .+ [4.5, 2.0]')
+X = vcat(randn(N ÷ 2, D), randn(N ÷ 2, D) .+ [2.0, 2.0]')
 y = append!(ones(N ÷ 2), -ones(N ÷ 2))
 λ = 0.05;
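As a rough Python/NumPy mirror of the updated data construction above (illustrative only: NumPy's generator seeded with 62 does not reproduce Julia's draws, and the names `X`, `y`, `N`, `D` simply echo the Julia source):

```python
import numpy as np

rng = np.random.default_rng(62)  # seed echoes the Julia code, but yields different draws
N, D = 100, 2

# Two Gaussian clusters: the second is shifted by (2.0, 2.0), as in the updated line.
X = np.vstack([
    rng.standard_normal((N // 2, D)),
    rng.standard_normal((N // 2, D)) + np.array([2.0, 2.0]),
])
# First half labeled +1, second half -1.
y = np.concatenate([np.ones(N // 2), -np.ones(N // 2)])
```

With a shift of only (2.0, 2.0) the clusters overlap slightly, which is the point of the change: overlapping data exercises the soft-margin loss.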
@@ -86,11 +88,10 @@ wv = value.(w)
 bv = value(b)
-svm_x = [0.0, 5.0] # arbitrary points
+svm_x = [-2.0, 4.0] # arbitrary points
 svm_y = (-bv .- wv[1] * svm_x) / wv[2]
 p = Plots.scatter(X[:, 1], X[:, 2], color = [yi > 0 ? :red : :blue for yi in y], label = "")
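The `svm_y` expression in the hunk above solves `wv[1]*x + wv[2]*y + bv = 0` for the second coordinate at the two `svm_x` endpoints, giving two points on the separating hyperplane to draw. A quick Python check of that algebra, using hypothetical values for `wv` and `bv` (these are not the trained values from the tutorial):

```python
wv = [1.0, 2.0]  # hypothetical weight vector, for illustration only
bv = -3.0        # hypothetical offset
svm_x = [-2.0, 4.0]

# For each x endpoint, solve wv[0]*x + wv[1]*y + bv = 0 for y.
svm_y = [(-bv - wv[0] * x) / wv[1] for x in svm_x]
print(svm_y)  # [2.5, -0.5]
```

Both resulting points satisfy the hyperplane equation exactly, confirming the rearrangement.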