
Torch compile on webui setting broken #1933

@dill-shower

Description

After enabling Torch Dynamo in the webui, training can't start:

2025-11-07 11:11:49,946 - SimpleTuner - INFO - FSDP v2 configuration detected (fsdp_version=2, reshard_after_forward=True, cpu_ram_efficient_loading=False, limit_all_gathers=True, cpu_offload=False, state_dict_type=sharded_state_dict, auto_wrap_policy=transformer_based_wrap, transformer_cls_names_to_wrap=['JointTransformerBlock']); enabling FullyShardedDataParallelPlugin (reshard_after_forward=True).
2025-11-07 11:11:49,946 - SimpleTuner - INFO - [RANK 0] [WARNING] backward_prefetch is not supported in FSDP2. Setting backward prefetch to None.
2025-11-07 11:11:49,948 - SimpleTuner - INFO - Removing FSDP auto-wrap candidates excluded by model configuration: PatchEmbed
2025-11-07 11:11:49,948 - SimpleTuner - INFO - FSDP base component: SD3Transformer2DModel
2025-11-07 11:11:49,948 - SimpleTuner - INFO - FSDP auto-wrap candidates from model family 'sd3': JointTransformerBlock
2025-11-07 11:11:49,948 - SimpleTuner - INFO - Resolved FSDP transformer classes to wrap: JointTransformerBlock
2025-11-07 11:11:49,948 - SimpleTuner - INFO - Torch Dynamo enabled (backend=inductor) (mode=max-autotune, dynamic, regional).
2025-11-07 11:11:49,954 - SimpleTuner - INFO - You cannot pass in both dynamo_plugin and dynamo_backend, please only pass in one.
2025-11-07 11:11:49,955 - SimpleTuner - INFO - Traceback (most recent call last):
  File "/workspace/SimpleTuner/simpletuner/train.py", line 40, in <module>
    trainer = Trainer(
              ^^^^^^^^
  File "/workspace/SimpleTuner/simpletuner/helpers/training/trainer.py", line 275, in __init__
    raise e
  File "/workspace/SimpleTuner/simpletuner/helpers/training/trainer.py", line 268, in __init__
    self.parse_arguments(
  File "/workspace/SimpleTuner/simpletuner/helpers/training/trainer.py", line 607, in parse_arguments
    self.accelerator = Accelerator(**accelerator_kwargs)
                       ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/workspace/SimpleTuner/venv/lib/python3.11/site-packages/accelerate/accelerator.py", line 324, in __init__
    raise ValueError("You cannot pass in both dynamo_plugin and dynamo_backend, please only pass in one.")
ValueError: You cannot pass in both dynamo_plugin and dynamo_backend, please only pass in one.
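The failing check is easy to reproduce in isolation. Below is a minimal sketch (a stand-in, not the real accelerate source) of the mutually exclusive guard that `Accelerator.__init__` applies: it refuses to accept both a ready-made dynamo plugin and a bare backend string.

```python
class TorchDynamoPlugin:
    """Hypothetical stand-in for accelerate.utils.TorchDynamoPlugin."""

    def __init__(self, backend="inductor", mode=None, dynamic=None):
        self.backend = backend
        self.mode = mode
        self.dynamic = dynamic


def make_accelerator(dynamo_plugin=None, dynamo_backend=None):
    # Accelerator.__init__ performs an equivalent mutually-exclusive guard.
    if dynamo_plugin is not None and dynamo_backend is not None:
        raise ValueError(
            "You cannot pass in both dynamo_plugin and dynamo_backend, "
            "please only pass in one."
        )
    # Either use the supplied plugin, or build one from the backend string.
    return dynamo_plugin or TorchDynamoPlugin(backend=dynamo_backend or "no")


plugin = TorchDynamoPlugin(backend="inductor", mode="max-autotune", dynamic=True)
try:
    # Passing both reproduces the ValueError from the traceback above.
    make_accelerator(dynamo_plugin=plugin, dynamo_backend="inductor")
except ValueError as err:
    print(err)
```

So the bug is purely in how the caller assembles `accelerator_kwargs`: both keys end up populated when only one is allowed.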

{
"caption_strategy": "textfile",
"checkpoint_step_interval": 50,
"checkpoints_total_limit": 3,
"data_backend_config": "/workspace/simpletuner/configs/velvet-solstice/multidatabackend.json",
"dataloader_prefetch": true,
"debug_aspect_buckets": true,
"debug_dataset_loader": true,
"dynamo_backend": "inductor",
"dynamo_dynamic": true,
"dynamo_mode": "max-autotune",
"dynamo_use_regional_compilation": true,
"evaluation_type": "none",
"flow_schedule_shift": 1,
"flow_use_uniform_schedule": true,
"fsdp_enable": true,
"fsdp_transformer_layer_cls_to_wrap": "JointTransformerBlock",
"i_know_what_i_am_doing": true,
"learning_rate": 0.00006,
"lr_end": "0.000004",
"lr_num_cycles": 6,
"lr_scheduler": "cosine_with_restarts",
"lr_warmup_steps": 500,
"lycoris_config": "/workspace/simpletuner/configs/config/lycoris_config.json",
"max_train_steps": 0,
"model_family": "sd3",
"model_flavour": "large",
"model_type": "full",
"num_train_epochs": 133,
"optimizer": "torch-adamw",
"optimizer_beta1": 0.9,
"optimizer_beta2": 0.99,
"output_dir": "/workspace/simpletuner/output/velvet-solstice",
"prediction_type": "flow_matching",
"pretrained_model_name_or_path": "stabilityai/stable-diffusion-3.5-large",
"pretrained_vae_model_name_or_path": "/workspace/Vae-checkpoints/vae_clean.safetensors",
"print_filenames": true,
"print_sampler_statistics": true,
"report_to": "wandb",
"seed": 42,
"train_batch_size": 24,
"vae_batch_size": 16,
"vae_cache_ondemand": true,
"vae_dtype": "fp32",
"validation_disable": true,
"validation_guidance": 4,
"validation_prompt": "1girl, blonde hair, blue eyes, knife, holding knife, throwing knife, splayed fingers, looking at viewer, touhou",
"validation_randomize": true,
"validation_resolution": "1024",
"validation_step_interval": 20,
"validation_using_datasets": false
}
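The four `dynamo_*` keys in this config are presumably what the trainer folds into a dynamo plugin before constructing the `Accelerator`. A hypothetical sketch of that mapping (names here are illustrative, not SimpleTuner's actual helpers):

```python
def dynamo_plugin_kwargs(config):
    """Fold the webui's dynamo_* config keys into one plugin-style dict.

    Hypothetical helper: if this dict is used to build a plugin, the bare
    "dynamo_backend" string should no longer be passed to Accelerator.
    """
    backend = config.get("dynamo_backend")
    if not backend or backend == "no":
        return None  # compilation disabled; pass nothing to Accelerator
    return {
        "backend": backend,                        # "inductor"
        "mode": config.get("dynamo_mode"),         # "max-autotune"
        "dynamic": config.get("dynamo_dynamic"),   # True
        "use_regional_compilation": config.get("dynamo_use_regional_compilation"),
    }


cfg = {
    "dynamo_backend": "inductor",
    "dynamo_dynamic": True,
    "dynamo_mode": "max-autotune",
    "dynamo_use_regional_compilation": True,
}
```

The "Torch Dynamo enabled (backend=inductor) (mode=max-autotune, dynamic, regional)" log line suggests this folding already happens, which matches the error: a plugin gets built and the raw backend string is forwarded anyway.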

Is the training script already setting a dynamo plugin somewhere, so that passing `dynamo_backend` on top of it triggers the conflict?
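If so, the likely fix is to drop the bare backend string from `accelerator_kwargs` whenever a plugin has been built. A guess at the shape of that guard (not SimpleTuner's actual code):

```python
def build_accelerator_kwargs(base_kwargs, dynamo_plugin=None):
    """Ensure only one of dynamo_plugin / dynamo_backend reaches Accelerator.

    Hypothetical sketch: base_kwargs is whatever parse_arguments assembled;
    dynamo_plugin is the plugin built from the webui's dynamo_* settings.
    """
    kwargs = dict(base_kwargs)
    if dynamo_plugin is not None:
        kwargs.pop("dynamo_backend", None)  # drop the conflicting string
        kwargs["dynamo_plugin"] = dynamo_plugin
    return kwargs


kwargs = build_accelerator_kwargs(
    {"mixed_precision": "bf16", "dynamo_backend": "inductor"},
    dynamo_plugin=object(),  # placeholder for a real TorchDynamoPlugin
)
# "dynamo_backend" has been removed, so Accelerator(**kwargs) would no
# longer hit the mutually-exclusive check.
```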
