This repository was archived by the owner on Jun 3, 2025. It is now read-only.

Commit 6ef1cb4

corey-nm authored and natuan committed
Fixing return argument of _create_model for distillation (#1345)
Co-authored-by: Tuan Nguyen <tuan@neuralmagic.com>
1 parent: d76d5c7 · commit: 6ef1cb4

File tree

1 file changed: +1 −1 lines

  • src/sparseml/pytorch/torchvision/train.py

src/sparseml/pytorch/torchvision/train.py

Lines changed: 1 addition & 1 deletion
@@ -393,7 +393,7 @@ def collate_fn(batch):
 
     if args.distill_teacher not in ["self", "disable", None]:
         _LOGGER.info("Instantiating teacher")
-        distill_teacher = _create_model(
+        distill_teacher, _ = _create_model(
             arch_key=args.teacher_arch_key,
             local_rank=local_rank,
             pretrained=True,  # teacher is always pretrained
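Why this one-line change matters: the fixed call site shows that _create_model returns a 2-tuple, and the original code bound the whole tuple to distill_teacher instead of unpacking it, so the "teacher" was never a usable model object. The sketch below reproduces the bug and the fix in isolation. _create_model_sketch is a hypothetical stand-in: the real _create_model in train.py takes more arguments (arch_key, local_rank, pretrained, ...), and the identity of its second return value is an assumption here.

import torch
from torch import nn

def _create_model_sketch(arch_key: str, pretrained: bool = True):
    # Hypothetical stand-in for sparseml's _create_model: it returns a
    # (model, arch_key) pair rather than the model alone. The second
    # return value is an assumption made for illustration.
    model = nn.Linear(10, 2)  # placeholder for a real torchvision model
    return model, arch_key

# The bug: binding the whole return tuple instead of unpacking it.
distill_teacher = _create_model_sketch(arch_key="resnet50")
print(type(distill_teacher))  # <class 'tuple'> -- not an nn.Module
# distill_teacher(torch.randn(1, 10))  # TypeError: 'tuple' object is not callable

# The fix in this commit: unpack, discarding the second return value.
distill_teacher, _ = _create_model_sketch(arch_key="resnet50")
print(type(distill_teacher))  # <class 'torch.nn.modules.linear.Linear'>
out = distill_teacher(torch.randn(1, 10))  # works: the teacher is a module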
