Popular repositories
Megatron-DeepSpeed-MuP (Public; forked from argonne-lcf/Megatron-DeepSpeed)
Ongoing research training transformer language models at scale, including: BERT & GPT-2
Language: Python
nanoGPT-HPTransfer (Public; forked from EleutherAI/nanoGPT-mup)
The simplest, fastest repository for training/finetuning medium-sized GPTs.
Language: Python