CONTRIBUTING.md (1 addition, 3 deletions)
@@ -77,9 +77,7 @@ For documentation edits, include:
 
 ## Question or Problem
 
-Go to: [GitHub Discussions](https://github.com/neuralmagic/sparseml/discussions/)
-
-Post all other questions including support or how to contribute. Don’t forget to search through existing discussions to avoid duplication! Thanks!
+Sign up or log in: **Deep Sparse Community** [Discourse Forum](https://discuss.neuralmagic.com/) and/or [Slack](https://discuss-neuralmagic.slack.com/). We are growing the community member by member and happy to see you there. Post all other questions including support or how to contribute. Don’t forget to search through existing discussions to avoid duplication! Thanks!
README.md (4 additions, 4 deletions)
@@ -64,8 +64,8 @@ Techniques for sparsification are all encompassing including everything from ind
 When implemented correctly, these techniques result in significantly more performant and smaller models with limited to no effect on the baseline metrics.
 For example, pruning plus quantization can give noticeable improvements in performance while recovering to nearly the same baseline accuracy.
 
-The Deep Sparse product suite builds on top of sparsification enabling you to easily apply the techniques to your datasets and models using recipe-driven approaches.
-Recipes encode the directions for how to sparsify a model into a simple, easily editable format.
+The Deep Sparse product suite builds on top of sparsification enabling you to easily apply the techniques to your datasets and models using recipe-driven approaches. Recipes encode the directions for how to sparsify a model into a simple, easily editable format.
+
 - Download a sparsification recipe and sparsified model from the [SparseZoo](https://github.com/neuralmagic/sparsezoo).
 - Alternatively, create a recipe for your model using [Sparsify](https://github.com/neuralmagic/sparsify).
 - Apply your recipe with only a few lines of code using [SparseML](https://github.com/neuralmagic/sparseml).
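
A minimal sketch of the recipe-driven workflow the bullets above describe, assuming SparseML's `ScheduledModifierManager` PyTorch API; the recipe values, toy model, and `steps_per_epoch` are illustrative placeholders rather than a vetted configuration:

```python
# Rough sketch: apply a sparsification recipe in a PyTorch training setup,
# assuming SparseML's ScheduledModifierManager API. All values below are
# placeholders, not a tuned configuration.
from pathlib import Path

import torch
from sparseml.pytorch.optim import ScheduledModifierManager

# A recipe encodes the sparsification directions in plain, editable YAML:
# here, gradual magnitude pruning from 5% to 85% sparsity over five epochs.
recipe = """
modifiers:
    - !GMPruningModifier
        start_epoch: 0.0
        end_epoch: 5.0
        update_frequency: 1.0
        init_sparsity: 0.05
        final_sparsity: 0.85
        params: __ALL_PRUNABLE__
"""
Path("recipe.yaml").write_text(recipe)

# Placeholder model and optimizer standing in for a real training setup.
model = torch.nn.Sequential(
    torch.nn.Linear(16, 16), torch.nn.ReLU(), torch.nn.Linear(16, 2)
)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)

# Wrap the optimizer so the recipe's modifiers fire on schedule during training.
manager = ScheduledModifierManager.from_yaml("recipe.yaml")
optimizer = manager.modify(model, optimizer, steps_per_epoch=100)

# ... the usual loop goes here: forward pass, loss.backward(), optimizer.step() ...

manager.finalize(model)  # remove the manager's hooks once training completes
```

Because only the recipe file changes, a recipe downloaded from the SparseZoo could be dropped in without touching the training code.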
@@ -371,7 +371,7 @@ We appreciate contributions to the code, examples, integrations, and documentati
 
 ## Join the Community
 
-For user help or questions about Sparsify, use our [GitHub Discussions](https://www.github.com/neuralmagic/sparseml/discussions/). Everyone is welcome!
+For user help or questions about Sparsify, sign up or log in: **Deep Sparse Community** [Discourse Forum](https://discuss.neuralmagic.com/) and/or [Slack](https://discuss-neuralmagic.slack.com/). We are growing the community member by member and happy to see you there.
 
 You can get the latest news, webinar and event invites, research papers, and other ML Performance tidbits by [subscribing](https://neuralmagic.com/subscribe/) to the Neural Magic community.
@@ -383,7 +383,7 @@ The project is licensed under the [Apache License Version 2.0](https://github.co
docs/index.rst (7 additions, 5 deletions)
@@ -61,8 +61,8 @@ Techniques for sparsification are all encompassing including everything from ind
 When implemented correctly, these techniques result in significantly more performant and smaller models with limited to no effect on the baseline metrics.
 For example, pruning plus quantization can give noticeable improvements in performance while recovering to nearly the same baseline accuracy.
 
-The Deep Sparse product suite builds on top of sparsification enabling you to easily apply the techniques to your datasets and models using recipe-driven approaches.
-Recipes encode the directions for how to sparsify a model into a simple, easily editable format.
+The Deep Sparse product suite builds on top of sparsification enabling you to easily apply the techniques to your datasets and models using recipe-driven approaches. Recipes encode the directions for how to sparsify a model into a simple, easily editable format.
+
 - Download a sparsification recipe and sparsified model from the `SparseZoo <https://github.com/neuralmagic/sparsezoo>`_.
 - Alternatively, create a recipe for your model using `Sparsify <https://github.com/neuralmagic/sparsify>`_.
 - Apply your recipe with only a few lines of code using `SparseML <https://github.com/neuralmagic/sparseml>`_.