This repository was archived by the owner on Jun 3, 2025. It is now read-only.

Commit f136fe0

jeanniefinks authored and markurtz committed
Update Slack and Discourse links (#227)
1 parent cae2a1e commit f136fe0

3 files changed: +12 additions, −12 deletions

CONTRIBUTING.md

Lines changed: 1 addition & 3 deletions
@@ -77,9 +77,7 @@ For documentation edits, include:
 
 ## Question or Problem
 
-Go to: [GitHub Discussions](https://github.com/neuralmagic/sparseml/discussions/)
-
-Post all other questions including support or how to contribute. Don’t forget to search through existing discussions to avoid duplication! Thanks!
+Sign up or log in: **Deep Sparse Community** [Discourse Forum](https://discuss.neuralmagic.com/) and/or [Slack](https://discuss-neuralmagic.slack.com/). We are growing the community member by member and happy to see you there. Post all other questions including support or how to contribute. Don’t forget to search through existing discussions to avoid duplication! Thanks!
 
 ## Developing SparseML

README.md

Lines changed: 4 additions & 4 deletions
@@ -64,8 +64,8 @@ Techniques for sparsification are all encompassing including everything from ind
 When implemented correctly, these techniques result in significantly more performant and smaller models with limited to no effect on the baseline metrics.
 For example, pruning plus quantization can give noticeable improvements in performance while recovering to nearly the same baseline accuracy.
 
-The Deep Sparse product suite builds on top of sparsification enabling you to easily apply the techniques to your datasets and models using recipe-driven approaches.
-Recipes encode the directions for how to sparsify a model into a simple, easily editable format.
+The Deep Sparse product suite builds on top of sparsification enabling you to easily apply the techniques to your datasets and models using recipe-driven approaches. Recipes encode the directions for how to sparsify a model into a simple, easily editable format.
+
 - Download a sparsification recipe and sparsified model from the [SparseZoo](https://github.com/neuralmagic/sparsezoo).
 - Alternatively, create a recipe for your model using [Sparsify](https://github.com/neuralmagic/sparsify).
 - Apply your recipe with only a few lines of code using [SparseML](https://github.com/neuralmagic/sparseml).
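For context on the "few lines of code" bullet above, here is a minimal sketch of applying a recipe with SparseML's PyTorch integration (installed via `pip install sparseml`). The model choice, recipe path, learning rate, and steps-per-epoch value are illustrative placeholders, not part of this commit, and the exact API may vary between SparseML releases:

```python
import torch
from torchvision.models import resnet50
from sparseml.pytorch.optim import ScheduledModifierManager

# Placeholder model and optimizer; substitute your own training setup.
model = resnet50()
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)

# "recipe.yaml" stands in for a recipe downloaded from the SparseZoo or
# created with Sparsify; the manager wraps the optimizer so pruning and
# quantization modifiers fire on schedule during training.
manager = ScheduledModifierManager.from_yaml("recipe.yaml")
optimizer = manager.modify(model, optimizer, steps_per_epoch=100)

# ... run your usual training loop with the modified optimizer ...

manager.finalize(model)  # remove modifier hooks once training ends
```

The recipe itself stays a plain YAML file, which is what makes the format "easily editable".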
@@ -371,7 +371,7 @@ We appreciate contributions to the code, examples, integrations, and documentati
 
 ## Join the Community
 
-For user help or questions about Sparsify, use our [GitHub Discussions](https://www.github.com/neuralmagic/sparseml/discussions/). Everyone is welcome!
+For user help or questions about Sparsify, sign up or log in: **Deep Sparse Community** [Discourse Forum](https://discuss.neuralmagic.com/) and/or [Slack](https://discuss-neuralmagic.slack.com/). We are growing the community member by member and happy to see you there.
 
 You can get the latest news, webinar and event invites, research papers, and other ML Performance tidbits by [subscribing](https://neuralmagic.com/subscribe/) to the Neural Magic community.

@@ -383,7 +383,7 @@ The project is licensed under the [Apache License Version 2.0](https://github.co
 
 ## Release History
 
-Official builds are hosted on PyPi
+Official builds are hosted on PyPI
 
 - stable: [sparseml](https://pypi.org/project/sparseml/)
 - nightly (dev): [sparseml-nightly](https://pypi.org/project/sparseml-nightly/)

docs/index.rst

Lines changed: 7 additions & 5 deletions
@@ -61,8 +61,8 @@ Techniques for sparsification are all encompassing including everything from ind
 When implemented correctly, these techniques result in significantly more performant and smaller models with limited to no effect on the baseline metrics.
 For example, pruning plus quantization can give noticeable improvements in performance while recovering to nearly the same baseline accuracy.
 
-The Deep Sparse product suite builds on top of sparsification enabling you to easily apply the techniques to your datasets and models using recipe-driven approaches.
-Recipes encode the directions for how to sparsify a model into a simple, easily editable format.
+The Deep Sparse product suite builds on top of sparsification enabling you to easily apply the techniques to your datasets and models using recipe-driven approaches. Recipes encode the directions for how to sparsify a model into a simple, easily editable format.
+
 - Download a sparsification recipe and sparsified model from the `SparseZoo <https://github.com/neuralmagic/sparsezoo>`_.
 - Alternatively, create a recipe for your model using `Sparsify <https://github.com/neuralmagic/sparsify>`_.
 - Apply your recipe with only a few lines of code using `SparseML <https://github.com/neuralmagic/sparseml>`_.
@@ -88,7 +88,8 @@ Resources and Learning More
 Release History
 ===============
 
-Official builds are hosted on PyPi
+Official builds are hosted on PyPI
+
 - stable: `sparseml <https://pypi.org/project/sparseml>`_
 - nightly (dev): `sparseml-nightly <https://pypi.org/project/sparseml-nightly>`_

@@ -111,8 +112,9 @@ Additionally, more information can be found via
 
 .. toctree::
    :maxdepth: 3
-   :caption: Help
+   :caption: Connect Online
 
    Bugs, Feature Requests <https://github.com/neuralmagic/sparseml/issues>
-   Support, General Q&A <https://github.com/neuralmagic/sparseml/discussions>
+   Support, General Q&A Forums <https://discuss.neuralmagic.com/>
+   Deep Sparse Slack Community <https://discuss-neuralmagic.slack.com/>
    Neural Magic Docs <https://docs.neuralmagic.com>
