When models are wrapped with `@torch.compile`, the following error is triggered when using Phantom Gradient, when computing sradius with `power_method`, etc.:

```
RuntimeError: This backward function was compiled with non-empty donated buffers which requires create_graph=False and retain_graph=False. Please keep backward(create_graph=False, retain_graph=False) across all backward() function calls, or set torch._functorch.config.donated_buffer=False to disable donated buffer.
```
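For context, here is a minimal sketch of the failure mode (a plain `nn.Linear` stands in for the actual DEQ model, and the exact trigger may vary by PyTorch version). Phantom Gradient and `power_method` both need the autograd graph to survive `backward()` (e.g. for repeated vector-Jacobian products), which is exactly what the donated-buffer check forbids:

```python
import torch
import torch.nn as nn

# Any module wrapped in torch.compile reproduces the error once
# backward() is asked to keep the graph alive.
model = torch.compile(nn.Linear(8, 8))
x = torch.randn(4, 8, requires_grad=True)
loss = model(x).sum()

# retain_graph=True (or create_graph=True) trips the donated-buffer
# check inside the AOT-compiled backward function:
loss.backward(retain_graph=True)  # RuntimeError: ... donated buffers ...
```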
Although the error can be avoided by setting `TORCH_COMPILE_DISABLE=1`, it's still quite annoying.
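For reference, both workarounds (the environment variable above, and the config flag suggested by the error message itself) can be applied like this; note the environment variable must be set before the model is compiled:

```python
import os

# Option 1: disable torch.compile for the whole run.
os.environ["TORCH_COMPILE_DISABLE"] = "1"

# Option 2 (from the error message): keep compilation but turn off
# donated buffers, which lifts the create_graph/retain_graph restriction.
import torch._functorch.config
torch._functorch.config.donated_buffer = False
```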
When calculating sradius, another warning also shows up:
```
/home/lynn/miniforge3/envs/flow/lib/python3.12/site-packages/torch/autograd/graph.py:841: UserWarning: Attempting to run cuBLAS, but there was no current CUDA context! Attempting to set the primary context... (Triggered internally at /pytorch/aten/src/ATen/cuda/CublasHandlePool.cpp:270.)
  return Variable._execution_engine.run_backward(  # Calls into the C++ engine to run the backward pass
```