[DDP] Change the --no-optimize-ddp flag to reflect the latest usage (#119437) · pytorch/pytorch@c0e5cca · GitHub

Commit c0e5cca

fegin authored and pytorchmergebot committed
[DDP] Change the --no-optimize-ddp flag to reflect the latest usage (#119437)
Compiled DDP now has 4 different optimization modes. This PR changes the Dynamo benchmark flag to reflect that change.

Pull Request resolved: #119437
Approved by: https://github.com/wconstab, https://github.com/xmfan
1 parent c252255 commit c0e5cca
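
For context, a minimal sketch of what this flag change means at the config level, assuming torch._dynamo.config.optimize_ddp accepts a mode string here; the default "ddp_optimizer" is confirmed by the diff below, and the old boolean behavior is taken from the removed lines:

    import torch._dynamo

    # Before this PR the benchmark flag toggled a boolean:
    #     torch._dynamo.config.optimize_ddp = not args.no_optimize_ddp
    # After this PR the benchmark forwards the mode string unchanged:
    torch._dynamo.config.optimize_ddp = "ddp_optimizer"  # default of --optimize-ddp-mode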

File tree

1 file changed: +5 -8 lines changed

benchmarks/dynamo/common.py

Lines changed: 5 additions & 8 deletions

@@ -2986,9 +2986,10 @@ def get_example_inputs(self):
         """,
     )
     parser.add_argument(
-        "--no-optimize-ddp",
-        action="store_true",
-        help="Disables dynamo DDPOptimizer (graph breaks). (Applies only when using --ddp benchmark mode).",
+        "--optimize-ddp-mode",
+        type=str,
+        default="ddp_optimizer",
+        help="Specify the DDP optimization mode -- the value of torch._dynamo.config.optimize_ddp.",
     )
     parser.add_argument(
         "--distributed-master-port",
@@ -3438,11 +3439,7 @@ def run(runner, args, original_dir=None):
         )
     if args.ddp:
         assert args.training, "DDP benchmark requires --training mode"
-        if args.no_optimize_ddp:
-            torch._dynamo.config.optimize_ddp = False
-        else:
-            # TODO(whc) after enabling DDPOptimizer by default this could be removed or assert
-            torch._dynamo.config.optimize_ddp = True
+        torch._dynamo.config.optimize_ddp = args.optimize_ddp_mode
     if args.only == "dlrm":
         log.error(
             "DLRM+DDP is unsupported as it requires sharding the embedding layer separately from DDP"
