README.md (+1 −1)
@@ -253,7 +253,7 @@ optimizer.maximize(
  )
  ```

- By default the previous data in the json file is removed. If you want to keep working with the same logger, the `reset`paremeter in `JSONLogger` should be set to False.
+ By default, the previous data in the JSON file is removed. If you want to keep working with the same logger, the `reset` parameter in `JSONLogger` should be set to `False`.
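The `reset` parameter this hunk documents looks like the following in practice; a minimal sketch, with the log path purely illustrative:

```python
from bayes_opt.logger import JSONLogger

# Default behavior: the logger wipes any previous data in the file,
# which is what the README sentence above describes.
logger = JSONLogger(path="./logs.log")

# To keep working with the same logger across runs, pass reset=False so
# new observations are appended rather than replacing the old ones.
logger = JSONLogger(path="./logs.log", reset=False)
```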
examples/advanced-tour.ipynb (+3 −2)
@@ -33,7 +33,7 @@
  "metadata": {},
  "outputs": [],
  "source": [
- "# Let's start by defining our function, bounds, and instanciating an optimization object.\n",
+ "# Let's start by defining our function, bounds, and instantiating an optimization object.\n",
  "def black_box_function(x, y):\n",
  "    return -x ** 2 - (y - 1) ** 2 + 1"
  ]
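For readers skimming the diff, the cell being edited belongs to a setup block like this; a runnable sketch in which the `pbounds` values are illustrative rather than taken from the notebook:

```python
from bayes_opt import BayesianOptimization

# The black-box function from the notebook cell above.
def black_box_function(x, y):
    return -x ** 2 - (y - 1) ** 2 + 1

# Bounded region of parameter space to search; illustrative values.
pbounds = {"x": (2, 4), "y": (-3, 3)}

optimizer = BayesianOptimization(
    f=black_box_function,
    pbounds=pbounds,
    verbose=2,
    random_state=1,
)
```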
@@ -347,12 +347,13 @@
  ]
  },
  {
+ "attachments": {},
  "cell_type": "markdown",
  "metadata": {},
  "source": [
  "### 3.3 Changing kernels\n",
  "\n",
- "By default this package uses the Mattern 2.5 kernel. Depending on your use case you may find that tunning the GP kernel could be beneficial. You're on your own here since these are very specific solutions to very specific problems."
+ "By default this package uses the Matern 2.5 kernel. Depending on your use case you may find that tuning the GP kernel could be beneficial. You're on your own here since these are very specific solutions to very specific problems."
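Since the corrected sentence names the default kernel, here is a hedged sketch of what changing it can look like. It assumes the package's `set_gp_params` helper, which forwards keyword arguments to the internal scikit-learn `GaussianProcessRegressor`; the `RBF` choice and its length scale are illustrative only:

```python
from bayes_opt import BayesianOptimization
from sklearn.gaussian_process.kernels import RBF

optimizer = BayesianOptimization(
    f=lambda x, y: -x ** 2 - (y - 1) ** 2 + 1,
    pbounds={"x": (2, 4), "y": (-3, 3)},  # illustrative bounds
    random_state=1,
)

# The surrogate GP defaults to a Matern(nu=2.5) kernel. Swapping in an
# RBF kernel is one possibility; whether it helps is problem-specific.
optimizer.set_gp_params(kernel=RBF(length_scale=1.0))
optimizer.maximize(init_points=2, n_iter=3)
```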
examples/basic-tour.ipynb (+10 −6)
@@ -16,12 +16,13 @@
  ]
  },
  {
+ "attachments": {},
  "cell_type": "markdown",
  "metadata": {},
  "source": [
  "## 1. Specifying the function to be optimized\n",
  "\n",
- "This is a function optimization package, therefore the first and most important ingreedient is, of course, the function to be optimized.\n",
+ "This is a function optimization package; therefore, the first and most important ingredient is, of course, the function to be optimized.\n",
  "\n",
  "**DISCLAIMER:** We know exactly how the output of the function below depends on its parameter. Obviously this is just an example, and you shouldn't expect to know it in a real scenario. However, it should be clear that you don't need to. All you need in order to use this package (and more generally, this technique) is a function `f` that takes a known set of parameters and outputs a real number."
  ]
@@ -43,12 +44,13 @@
  ]
  },
  {
+ "attachments": {},
  "cell_type": "markdown",
  "metadata": {},
  "source": [
  "## 2. Getting Started\n",
  "\n",
- "All we need to get started is to instanciate a `BayesianOptimization` object specifying a function to be optimized `f`, and its parameters with their corresponding bounds, `pbounds`. This is a constrained optimization technique, so you must specify the minimum and maximum values that can be probed for each parameter in order for it to work"
+ "All we need to get started is to instantiate a `BayesianOptimization` object specifying a function to be optimized, `f`, and its parameters with their corresponding bounds, `pbounds`. This is a constrained optimization technique, so you must specify the minimum and maximum values that can be probed for each parameter in order for it to work."
  ]
  },
  {
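The corrected sentence above describes the whole getting-started flow; a minimal end-to-end sketch, with bounds and iteration counts illustrative:

```python
from bayes_opt import BayesianOptimization

optimizer = BayesianOptimization(
    f=lambda x, y: -x ** 2 - (y - 1) ** 2 + 1,
    pbounds={"x": (2, 4), "y": (-3, 3)},  # min/max probed per parameter
    random_state=1,
)

# init_points: random exploration steps; n_iter: Bayesian optimization steps.
optimizer.maximize(init_points=2, n_iter=3)

# Best target value seen so far and the parameters that produced it.
print(optimizer.max)
```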
@@ -306,12 +308,13 @@
  ]
  },
  {
+ "attachments": {},
  "cell_type": "markdown",
  "metadata": {},
  "source": [
  "## 4. Saving, loading and restarting\n",
  "\n",
- "By default you can follow the progress of your optimization by setting `verbose>0` when instanciating the `BayesianOptimization` object. If you need more control over logging/alerting you will need to use an observer. For more information about observers checkout the advanced tour notebook. Here we will only see how to use the native `JSONLogger` object to save to and load progress from files.\n",
+ "By default you can follow the progress of your optimization by setting `verbose>0` when instantiating the `BayesianOptimization` object. If you need more control over logging/alerting you will need to use an observer. For more information about observers, check out the advanced tour notebook. Here we will only see how to use the native `JSONLogger` object to save progress to and load it from files.\n",
  "\n",
  "### 4.1 Saving progress"
  ]
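The "loading and restarting" half of the section touched above relies on the package's `load_logs` helper; a hedged sketch, with the log path and bounds illustrative:

```python
from bayes_opt import BayesianOptimization
from bayes_opt.util import load_logs

# A fresh optimizer that has seen no data yet.
new_optimizer = BayesianOptimization(
    f=lambda x, y: -x ** 2 - (y - 1) ** 2 + 1,
    pbounds={"x": (2, 4), "y": (-3, 3)},  # illustrative bounds
    random_state=7,
)

# Replay parameter-target pairs previously saved by a JSONLogger so the
# restarted run starts from what the earlier run already learned.
load_logs(new_optimizer, logs=["./logs.log"])
print(f"Loaded {len(new_optimizer.space)} previously probed points.")
```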
@@ -327,14 +330,15 @@
  ]
  },
  {
+ "attachments": {},
  "cell_type": "markdown",
  "metadata": {},
  "source": [
  "The observer paradigm works by:\n",
  "1. Instantiating an observer object.\n",
  "2. Tying the observer object to a particular event fired by an optimizer.\n",
  "\n",
- "The `BayesianOptimization` object fires a number of internal events during optimization, in particular, everytime it probes the function and obtains a new parameter-target combination it will fire an `Events.OPTIMIZATION_STEP` event, which our logger will listen to.\n",
+ "The `BayesianOptimization` object fires a number of internal events during optimization. In particular, every time it probes the function and obtains a new parameter-target combination, it fires an `Events.OPTIMIZATION_STEP` event, which our logger will listen to.\n",
  "\n",
  "**Caveat:** The logger will not look back at previously probed points."