[inductor] default cache_dir in torch._inductor.codecache should be lazily evaluated · pytorch/pytorch@b004c0b · GitHub

Commit b004c0b

hsfzxjy authored and pytorchmergebot committed
[inductor] default cache_dir in torch._inductor.codecache should be lazily evaluated (#100824)
`getpass.getuser` may raise exceptions in some circumstances, and because the default cache dir was built eagerly as an argument to `os.environ.get`, users hitting that failure could not work around it by overriding the default with the env var `TORCHINDUCTOR_CACHE_DIR`. Hence the assembly of the default cache dir should be lazily evaluated.

Pull Request resolved: #100824
Approved by: https://github.com/ezyang
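The underlying issue is that Python evaluates the second argument of `os.environ.get(key, default)` before the lookup ever happens. A minimal sketch of the difference, using a hypothetical `failing_default()` in place of `getpass.getuser()`:

import os

def failing_default():
    # Hypothetical stand-in for getpass.getuser(), which can raise when no
    # username can be determined (e.g. no passwd entry and $USER unset).
    raise OSError("no username available")

os.environ["TORCHINDUCTOR_CACHE_DIR"] = "/tmp/my_inductor_cache"

# Eager default: the fallback expression is evaluated before the lookup,
# so the exception fires even though the override is set.
try:
    os.environ.get("TORCHINDUCTOR_CACHE_DIR", failing_default())
except OSError:
    print("raised even though TORCHINDUCTOR_CACHE_DIR is set")

# Lazy default: the fallback only runs when the env var is actually missing.
cache_dir = os.environ.get("TORCHINDUCTOR_CACHE_DIR")
if cache_dir is None:
    cache_dir = failing_default()
print(cache_dir)  # -> /tmp/my_inductor_cache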
1 parent b06c180 commit b004c0b

File tree

1 file changed: 3 additions, 4 deletions


torch/_inductor/codecache.py

Lines changed: 3 additions & 4 deletions
@@ -64,10 +64,9 @@ def _compile_end():
 
 @functools.lru_cache(None)
 def cache_dir():
-    cache_dir = os.environ.get(
-        "TORCHINDUCTOR_CACHE_DIR",
-        f"{tempfile.gettempdir()}/torchinductor_{getpass.getuser()}",
-    )
+    cache_dir = os.environ.get("TORCHINDUCTOR_CACHE_DIR")
+    if cache_dir is None:
+        cache_dir = f"{tempfile.gettempdir()}/torchinductor_{getpass.getuser()}"
     os.makedirs(cache_dir, exist_ok=True)
     return cache_dir
 
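For reference, a self-contained, runnable sketch of the patched function (same logic as the hunk above; the imports shown are the ones the function relies on in codecache.py):

import functools
import getpass
import os
import tempfile


@functools.lru_cache(None)
def cache_dir():
    # Respect an explicit override first; only build the per-user default
    # (which calls getpass.getuser()) when the env var is not set.
    cache_dir = os.environ.get("TORCHINDUCTOR_CACHE_DIR")
    if cache_dir is None:
        cache_dir = f"{tempfile.gettempdir()}/torchinductor_{getpass.getuser()}"
    os.makedirs(cache_dir, exist_ok=True)
    return cache_dir

Setting the env var (e.g. TORCHINDUCTOR_CACHE_DIR=/some/writable/path) now short-circuits the fallback entirely, so `getpass.getuser()` is never called when an override is present.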

0 commit comments
