This repository was archived by the owner on Jun 3, 2025. It is now read-only.

Commit eb5a02c

corey-nmbfineran authored and committed

Removing distill_teacher model from args (#1338)

1 parent e959958 · commit eb5a02c

File tree

1 file changed: +2 −2 lines

  • src/sparseml/pytorch/torchvision/train.py

src/sparseml/pytorch/torchvision/train.py (2 additions, 2 deletions)

@@ -380,7 +380,7 @@ def collate_fn(batch):
     if args.distill_teacher not in ["self", "disable", None]:
         _LOGGER.info("Instantiating teacher")
-        args.distill_teacher = _create_model(
+        distill_teacher = _create_model(
             arch_key=args.teacher_arch_key,
             local_rank=local_rank,
             pretrained=True,  # teacher is always pretrained
@@ -572,7 +572,7 @@ def log_metrics(tag: str, metrics: utils.MetricLogger, epoch: int, epoch_step: i
         model,
         epoch=args.start_epoch,
         loggers=logger,
-        distillation_teacher=args.distill_teacher,
+        distillation_teacher=distill_teacher,
     )
     step_wrapper = manager.modify(
         model,
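The pattern this commit fixes can be sketched in isolation: the old code overwrote the `args.distill_teacher` CLI string with the constructed model object, so anything that later logged or serialized `args` dragged the whole model along with it; the new code keeps the model in a local variable. A minimal sketch, assuming a hypothetical `TeacherModel` class standing in for what sparseml's `_create_model` actually builds:

```python
import argparse

class TeacherModel:
    """Hypothetical stand-in for the model returned by _create_model."""
    def __repr__(self):
        return "<TeacherModel>"

# Simulate the parsed CLI arguments; "path/to/teacher" is an example value.
args = argparse.Namespace(distill_teacher="path/to/teacher")

# Before this commit (problematic): the model object replaced the CLI
# string on the Namespace:
#     args.distill_teacher = TeacherModel()

# After this commit: the model lives in a local variable, and
# args.distill_teacher keeps its original CLI value.
distill_teacher = TeacherModel()

print(args.distill_teacher)
```

With the local variable, `vars(args)` remains a flat dict of CLI values that is safe to log or dump, while the model is passed explicitly to whatever needs it (here, `manager.initialize(..., distillation_teacher=distill_teacher)` in the diff above).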

0 commit comments