
Commit 40d92a2

remove nbviewer
1 parent b1a7710 commit 40d92a2

File tree

6 files changed (+0, -8 lines)


docs/src/examples/autotuning-ridge.jl

Lines changed: 0 additions & 1 deletion

@@ -1,7 +1,6 @@
 # # Auto-tuning Hyperparameters
 
 #md # [![](https://img.shields.io/badge/show-github-579ACA.svg)](@__REPO_ROOT_URL__/docs/src/examples/autotuning-ridge.jl)
-#md # [![](https://img.shields.io/badge/show-nbviewer-579ACA.svg)](@__NBVIEWER_ROOT_URL__/generated/autotuning-ridge.ipynb)
 
 # This example shows how to learn a hyperparameter in Ridge Regression using a gradient descent routine.
 # Let the regularized regression problem be formulated as:

docs/src/examples/chainrules_unit.jl

Lines changed: 0 additions & 1 deletion

@@ -1,7 +1,6 @@
 # # ChainRules integration demo: Relaxed Unit Commitment
 
 #md # [![](https://img.shields.io/badge/show-github-579ACA.svg)](@__REPO_ROOT_URL__/docs/src/examples/chainrules_unit.jl)
-#md # [![](https://img.shields.io/badge/show-nbviewer-579ACA.svg)](@__NBVIEWER_ROOT_URL__/generated/chainrules_unit.ipynb)
 
 
 # In this example, we will demonstrate the integration of DiffOpt with

docs/src/examples/custom-relu.jl

Lines changed: 0 additions & 1 deletion

@@ -1,7 +1,6 @@
 # # Custom ReLU layer
 
 #md # [![](https://img.shields.io/badge/show-github-579ACA.svg)](@__REPO_ROOT_URL__/docs/src/examples/custom-relu.jl)
-#md # [![](https://img.shields.io/badge/show-nbviewer-579ACA.svg)](@__NBVIEWER_ROOT_URL__/generated/custom-relu.ipynb)
 
 # We demonstrate how DiffOpt can be used to generate a simple neural network
 # unit - the ReLU layer. A neural network is created using Flux.jl which is

docs/src/examples/matrix-inversion-manual.jl

Lines changed: 0 additions & 2 deletions

@@ -1,8 +1,6 @@
 # # Differentiating a QP wrt a single variable
 
 #md # [![](https://img.shields.io/badge/show-github-579ACA.svg)](@__REPO_ROOT_URL__/docs/src/examples/matrix-inversion-manual.jl)
-#md # [![](https://img.shields.io/badge/show-nbviewer-579ACA.svg)](@__NBVIEWER_ROOT_URL__/generated/matrix-inversion-manual.ipynb)
-
 
 # Consider the quadratic program

docs/src/examples/sensitivity-analysis-ridge.jl

Lines changed: 0 additions & 1 deletion

@@ -1,7 +1,6 @@
 # # Sensitivity Analysis of Ridge Regression
 
 #md # [![](https://img.shields.io/badge/show-github-579ACA.svg)](@__REPO_ROOT_URL__/docs/src/examples/sensitivity-analysis-ridge.jl)
-#md # [![](https://img.shields.io/badge/show-nbviewer-579ACA.svg)](@__NBVIEWER_ROOT_URL__/generated/sensitivity-analysis-ridge.ipynb)
 
 # This example illustrates the sensitivity analysis of data points in a
 # [Ridge Regression](https://en.wikipedia.org/wiki/Ridge_regression) problem.

docs/src/examples/sensitivity-analysis-svm.jl

Lines changed: 0 additions & 2 deletions

@@ -1,8 +1,6 @@
 # # Sensitivity Analysis of SVM
 
 #md # [![](https://img.shields.io/badge/show-github-579ACA.svg)](@__REPO_ROOT_URL__/docs/src/examples/sensitivity-analysis-svm.jl)
-#md # [![](https://img.shields.io/badge/show-nbviewer-579ACA.svg)](@__NBVIEWER_ROOT_URL__/generated/sensitivity-analysis-svm.ipynb)
-
 
 # This notebook illustrates sensitivity analysis of data points in a [Support Vector Machine](https://en.wikipedia.org/wiki/Support-vector_machine) (inspired from [@matbesancon](http://github.com/matbesancon)'s [SimpleSVMs](http://github.com/matbesancon/SimpleSVMs.jl).)

0 commit comments