Conversation
Codecov Report
@@            Coverage Diff             @@
##           master     #506       +/-   ##
===========================================
- Coverage   75.35%   50.62%   -24.73%
===========================================
  Files         127      117       -10
  Lines       11117    10590      -527
===========================================
- Hits         8377     5361     -3016
- Misses       2740     5229     +2489
Continue to review full report at Codecov.
| Framework | Version |
| --- | --- |
| [TensorFlow](docs/tensorflow.md) | 1.15, 2.1.0, 2.2.0, 2.3.0, 2.3.1 |
2.4 and 2.5 are also supported
| [MXNet](docs/mxnet.md) | 1.6, 1.7 |
| [PyTorch](docs/pytorch.md) | 1.4, 1.5, 1.6 |
| [XGBoost](docs/xgboost.md) | 0.90-2, 1.0-1 ([As a built-in algorithm](docs/xgboost.md#use-xgboost-as-a-built-in-algorithm)) |
Smdebug is supported on the latest versions of all available DLCs; see that page.
README.md (Outdated)
| [TensorFlow](tensorflow.md) | 1.13, 1.14, 1.15, 2.1.0, 2.2.0, 2.3.0, 2.3.1 |
| Keras (with TensorFlow backend) | 2.3 |
| [MXNet](docs/mxnet.md) | 1.4, 1.5, 1.6, 1.7 |
| [PyTorch](docs/pytorch.md) | 1.2, 1.3, 1.4, 1.5, 1.6 |
| [XGBoost](docs/xgboost.md) | 0.90-2, 1.0-1 (As a framework) |
| [MXNet](mxnet.md) | 1.4, 1.5, 1.6, 1.7 |
| [PyTorch](pytorch.md) | 1.2, 1.3, 1.4, 1.5, 1.6 |
| [XGBoost](xgboost.md) | 0.90-2, 1.0-1 (As a framework) |
ndodda-amazon left a comment:
Left mostly nits, otherwise looks good.
# Optionally set the version of Python and requirements required to build your docs
python:
  version: 3.6
nit: why Python 3.6? can we use Python 3.9?
You can use Debugger with your training script on your own container
making only a minimal modification to your training script to add
Using SageMaker Debugger on custom containers
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
Debugger is available for any deep learning models that you bring to
nit: "any deep learning model" or "all deep learning models"
Below is a comprehensive list of the built-in collections that are
managed by SageMaker Debugger. The Hook identifies the tensors that
``XGBoost`` METRICS
============== ===========================
If for some reason, you want to disable the saving of these collections,
We should tell customers to set `debugger_hook_config=False` in the estimator; this is a simpler alternative.
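A minimal sketch of that suggestion, assuming the SageMaker Python SDK TensorFlow estimator (the script name, role, and instance settings below are placeholders):

```python
from sagemaker.tensorflow import TensorFlow

# Passing debugger_hook_config=False turns off the default Debugger hook,
# so none of the built-in collections are saved.
estimator = TensorFlow(
    entry_point="train.py",          # placeholder training script
    role="SageMakerRole",            # placeholder IAM role
    instance_count=1,
    instance_type="ml.p3.2xlarge",
    framework_version="2.3.1",
    py_version="py37",
    debugger_hook_config=False,
)
estimator.fit()
```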
name="weights",
parameters={ "parameter": "value" })
The parameters can be one of the following. The meaning of these
nit: The meaning of these parameters -> These parameters
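For reference, a hedged sketch of configuring a built-in collection with such parameters through the SageMaker Python SDK (the parameter key, value, and S3 path are placeholders):

```python
from sagemaker.debugger import CollectionConfig, DebuggerHookConfig

# Illustrative only: tweak how the built-in "weights" collection is saved.
weights_collection = CollectionConfig(
    name="weights",
    parameters={"save_interval": "100"},  # placeholder parameter key/value
)

hook_config = DebuggerHookConfig(
    s3_output_path="s3://my-bucket/debug-output",  # hypothetical bucket
    collection_configs=[weights_collection],
)
```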
- docutils==0.15.2
- bokeh
- ipython
- pandas
Should we pin the versions of bokeh, ipython, and pandas here?
The available ``hook_parameters`` keys are listed in the following. The meaning
of these parameters will be clear as you review the sections of
nit: The meaning of these parameters -> These parameters
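As an illustration, a minimal sketch of passing `hook_parameters` through the SageMaker Python SDK (the key, value, and S3 path are placeholders):

```python
from sagemaker.debugger import DebuggerHookConfig

# Illustrative only: hook_parameters takes string key/value pairs
# that configure the hook, such as how often tensors are saved.
hook_config = DebuggerHookConfig(
    s3_output_path="s3://my-bucket/debug-output",  # hypothetical bucket
    hook_parameters={"save_interval": "100"},      # placeholder key/value
)
```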
.. method:: create_from_json_file(json_file_path (str))
Takes the path of a file which holds the json configuration of the hook,
and creates the hook from that configuration. This is an optional parameter.
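For instance, a hedged sketch using the smdebug PyTorch hook class (the file path is illustrative; if `json_file_path` is omitted, smdebug looks for the JSON configuration that SageMaker provides in the training container):

```python
import smdebug.pytorch as smd

# Build the hook from a JSON configuration file instead of
# constructing it argument by argument in the training script.
hook = smd.Hook.create_from_json_file(
    json_file_path="/path/to/hook_config.json"  # illustrative path
)
```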
### AWS training containers with script mode
The following frameworks are available AWS Deep Learning Containers with
the deep learning frameworks for the zero script change experience.
Are we explaining what the 'zero script change experience' is in the doc? If so, can we link it here?
In other locations, I see lines such as 'no changes to your training script'.
However, for some advanced use cases where you need access to customized
tensors from targeted parts of a training script, you can manually
construct the hook object. The SMDebug library provides hook classes to
Are we using 'SMDebug' consistently? In other locations I see it written as 'smdebug'.
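For context, a minimal sketch of manual hook construction with the smdebug TensorFlow Keras hook (the output directory, save interval, and collection names are illustrative):

```python
import smdebug.tensorflow as smd

# Construct the hook explicitly so targeted tensors from specific
# parts of the training script can be captured.
hook = smd.KerasHook(
    out_dir="/tmp/smdebug_outputs",                 # illustrative output path
    save_config=smd.SaveConfig(save_interval=100),
    include_collections=["weights", "gradients"],
)

# Register the hook as a Keras callback, for example:
# model.fit(x_train, y_train, epochs=1, callbacks=[hook])
```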
Support
-------
- Zero Script Change experience where you need no modifications to your
Same as above. If we are introducing a new term, 'Zero Script Change experience', it needs to be explained somewhere.
Migration to Deep Learning Containers
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
TBD
* add unified search to RTD website
* configure rtd build environment to make it functional
* add licensing information
* add licensing information
* add licensing information
* add search filter
Description of changes:
readthedocs build log: https://readthedocs.org/projects/sagemaker-debugger/builds/14082688/
pre-launched doc: https://sagemaker-debugger.readthedocs.io/en/website/
Style and formatting:
I have run `pre-commit install` to ensure that auto-formatting happens with every commit.
Issue number, if available:
By submitting this pull request, I confirm that you can use, modify, copy, and redistribute this contribution, under the terms of your choice.