
Support mx post-training quantization for emerging models #306

Open
jiaeenie wants to merge 1 commit into DeepWok:main from jiaeenie:jiaeenie/mx-quant-ptq

Conversation


@jiaeenie jiaeenie commented Feb 10, 2026

TODOs:

  • Quantizers: Add MXFP, MXINT, and Minifloat quantizers (see the sketch after this list)
  • GPTQ: Integrate GPTQ into quantization pass
  • QuaRot: Integrate QuaRot into quantization pass
  • Evaluation: Keep evaluation scripts in a separate repo with lm-eval-harness integration
  • Documentation: Add documentation
  • Quantized Models: Add MX modules (llama, llada, nemotron)
  • Add NVFP kernels
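
For context on the MXFP/MXINT quantizer items, below is a minimal sketch of block-wise shared-exponent ("microscaling") fake quantization in PyTorch. The function name, block size, scale convention, and rounding details are illustrative assumptions, not code from this PR or the exact OCP MX specification.

```python
import torch


def mxint_quantize(x: torch.Tensor, block_size: int = 32, width: int = 8) -> torch.Tensor:
    """Fake-quantize x: each block of `block_size` elements shares one
    power-of-two scale, and elements are rounded to `width`-bit signed
    integers. Simplified illustration of the MXINT idea only."""
    assert x.numel() % block_size == 0, "pad or reshape so blocks divide evenly"
    orig_shape = x.shape
    blocks = x.reshape(-1, block_size)

    # Shared exponent per block, taken from the block's largest magnitude.
    max_abs = blocks.abs().amax(dim=-1, keepdim=True).clamp(min=1e-12)
    shared_exp = torch.floor(torch.log2(max_abs))

    # Power-of-two scale that places the largest element near the top of the
    # signed integer range (e.g. width=8 -> [-128, 127]).
    scale = torch.exp2(shared_exp - (width - 2))
    qmax = 2 ** (width - 1) - 1
    q = torch.clamp(torch.round(blocks / scale), -qmax - 1, qmax)

    # Dequantize back to float so unmodified downstream modules can consume it.
    return (q * scale).reshape(orig_shape)


# Example post-training use on a Linear layer's weight (hypothetical usage):
# layer.weight.data = mxint_quantize(layer.weight.data, block_size=32, width=8)
```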
