|
16 | 16 | ] |
17 | 17 | }, |
18 | 18 | { |
| 19 | + "attachments": {}, |
19 | 20 | "cell_type": "markdown", |
20 | 21 | "metadata": {}, |
21 | 22 | "source": [ |
22 | 23 | "## 1. Specifying the function to be optimized\n", |
23 | 24 | "\n", |
24 | | - "This is a function optimization package, therefore the first and most important ingreedient is, of course, the function to be optimized.\n", |
| 25 | + "This is a function optimization package, therefore the first and most important ingredient is, of course, the function to be optimized.\n", |
25 | 26 | "\n", |
26 | 27 | "**DISCLAIMER:** We know exactly how the output of the function below depends on its parameter. Obviously this is just an example, and you shouldn't expect to know it in a real scenario. However, it should be clear that you don't need to. All you need in order to use this package (and more generally, this technique) is a function `f` that takes a known set of parameters and outputs a real number." |
27 | 28 | ] |
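The point of the section above can be illustrated with a minimal sketch. The function below is a hypothetical stand-in for any black-box `f` (its internals are shown here only for the example; in a real scenario you would not know them) — all that matters is that it takes a known set of parameters and returns a real number:

```python
def black_box_function(x, y):
    """A hypothetical objective: takes parameters, returns a real number."""
    return -x ** 2 - (y - 1) ** 2 + 1

# Evaluate at a couple of points to confirm it maps parameters to a scalar.
print(black_box_function(2.0, 1.0))  # -3.0
print(black_box_function(0.0, 1.0))  # 1.0 (the maximum)
```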
|
43 | 44 | ] |
44 | 45 | }, |
45 | 46 | { |
| 47 | + "attachments": {}, |
46 | 48 | "cell_type": "markdown", |
47 | 49 | "metadata": {}, |
48 | 50 | "source": [ |
49 | 51 | "## 2. Getting Started\n", |
50 | 52 | "\n", |
51 | | - "All we need to get started is to instanciate a `BayesianOptimization` object specifying a function to be optimized `f`, and its parameters with their corresponding bounds, `pbounds`. This is a constrained optimization technique, so you must specify the minimum and maximum values that can be probed for each parameter in order for it to work" |
|    53 | +    "All we need to get started is to instantiate a `BayesianOptimization` object specifying a function to be optimized `f`, and its parameters with their corresponding bounds, `pbounds`. This is a constrained optimization technique, so you must specify the minimum and maximum values that can be probed for each parameter in order for it to work." |
52 | 54 | ] |
53 | 55 | }, |
54 | 56 | { |
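To make the role of `pbounds` concrete, here is a sketch of what "constrained" means — every probed point must lie inside the per-parameter bounds. Note this uses plain random search purely as an illustration, not the Bayesian optimization the package performs, and the objective and bound values are hypothetical:

```python
import random

def black_box_function(x, y):
    # Hypothetical objective, standing in for the user's f.
    return -x ** 2 - (y - 1) ** 2 + 1

# pbounds constrains which values may be probed for each parameter.
pbounds = {"x": (2, 4), "y": (-3, 3)}

random.seed(0)
best_target, best_params = float("-inf"), None
for _ in range(100):
    # Sample each parameter uniformly inside its (min, max) bounds.
    params = {k: random.uniform(lo, hi) for k, (lo, hi) in pbounds.items()}
    target = black_box_function(**params)
    if target > best_target:
        best_target, best_params = target, params

# Every probed point respects the bounds by construction.
assert all(pbounds[k][0] <= v <= pbounds[k][1] for k, v in best_params.items())
```

Because `x` is bounded away from the unconstrained optimum here, the best reachable target is capped by the bounds — which is exactly why choosing sensible bounds matters.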
|
306 | 308 | ] |
307 | 309 | }, |
308 | 310 | { |
| 311 | + "attachments": {}, |
309 | 312 | "cell_type": "markdown", |
310 | 313 | "metadata": {}, |
311 | 314 | "source": [ |
312 | 315 | "## 4. Saving, loading and restarting\n", |
313 | 316 | "\n", |
314 | | - "By default you can follow the progress of your optimization by setting `verbose>0` when instanciating the `BayesianOptimization` object. If you need more control over logging/alerting you will need to use an observer. For more information about observers checkout the advanced tour notebook. Here we will only see how to use the native `JSONLogger` object to save to and load progress from files.\n", |
|   317 | +    "By default you can follow the progress of your optimization by setting `verbose>0` when instantiating the `BayesianOptimization` object. If you need more control over logging/alerting you will need to use an observer. For more information about observers, check out the advanced tour notebook. Here we will only see how to use the native `JSONLogger` object to save to and load progress from files.\n", |
315 | 318 | "\n", |
316 | 319 | "### 4.1 Saving progress" |
317 | 320 | ] |
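The idea behind saving progress can be sketched in plain Python: append each probed parameter-target pair as one JSON line, and read the lines back to recover the history. This is only an illustration of the concept, not the package's `JSONLogger` implementation; the helper names and file path are hypothetical:

```python
import json
import os
import tempfile

def log_step(path, params, target):
    """Append one probed point as a JSON line (hypothetical helper)."""
    with open(path, "a") as f:
        f.write(json.dumps({"params": params, "target": target}) + "\n")

def load_steps(path):
    """Recover all previously logged points (hypothetical helper)."""
    with open(path) as f:
        return [json.loads(line) for line in f]

path = os.path.join(tempfile.mkdtemp(), "logs.log")
log_step(path, {"x": 2.5, "y": 1.0}, -5.25)
log_step(path, {"x": 2.0, "y": 0.0}, -4.0)
steps = load_steps(path)
print(len(steps))  # 2
```

Appending one self-contained JSON object per line means a crashed run loses at most the last partial line, and loading never needs the whole file in memory at once.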
|
327 | 330 | ] |
328 | 331 | }, |
329 | 332 | { |
| 333 | + "attachments": {}, |
330 | 334 | "cell_type": "markdown", |
331 | 335 | "metadata": {}, |
332 | 336 | "source": [ |
333 | 337 | "The observer paradigm works by:\n", |
334 | 338 | "1. Instantiating an observer object.\n", |
335 | 339 | "2. Tying the observer object to a particular event fired by an optimizer.\n", |
336 | 340 | "\n", |
337 | | - "The `BayesianOptimization` object fires a number of internal events during optimization, in particular, everytime it probes the function and obtains a new parameter-target combination it will fire an `Events.OPTIMIZATION_STEP` event, which our logger will listen to.\n", |
|   341 | +    "The `BayesianOptimization` object fires a number of internal events during optimization. In particular, every time it probes the function and obtains a new parameter-target combination, it will fire an `Events.OPTIMIZATION_STEP` event, which our logger will listen to.\n", |
338 | 342 | "\n", |
339 | 343 | "**Caveat:** The logger will not look back at previously probed points." |
340 | 344 | ] |
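The two-step observer paradigm described above can be sketched in plain Python. The event name `Events.OPTIMIZATION_STEP` is taken from the text; the `ScreenLogger` and `Optimizer` classes below are hypothetical minimal stand-ins, not the package's implementation:

```python
class Events:
    OPTIMIZATION_STEP = "optimization:step"

class ScreenLogger:
    """Hypothetical observer: records every event it is notified about."""
    def __init__(self):
        self.seen = []

    def update(self, event, instance):
        self.seen.append(event)

class Optimizer:
    """Minimal subject: fires events to subscribed observers."""
    def __init__(self):
        self._subs = {}

    def subscribe(self, event, observer):
        self._subs.setdefault(event, []).append(observer)

    def _fire(self, event):
        for obs in self._subs.get(event, []):
            obs.update(event, self)

    def probe(self):
        # ...evaluate the function here, then notify listeners.
        self._fire(Events.OPTIMIZATION_STEP)

logger = ScreenLogger()                           # 1. instantiate an observer
opt = Optimizer()
opt.subscribe(Events.OPTIMIZATION_STEP, logger)   # 2. tie it to an event
opt.probe()
print(logger.seen)  # ['optimization:step']
```

Note how this also exhibits the caveat above: any `probe()` fired before `subscribe()` would never reach the logger, so previously probed points are not seen.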
|