
Commit 71f34fc

Authored by linoytsaban, sayakpaul, and github-actions[bot]
[Flux LoRA] fix issues in flux lora scripts (#11111)
* remove custom scheduler * update requirements.txt * log_validation with mixed precision * add intermediate embeddings saving when checkpointing is enabled * remove comment * fix validation * add unwrap_model for accelerator, torch.no_grad context for validation, fix accelerator.accumulate call in advanced script * revert unwrap_model change temp * add .module to address distributed training bug + replace accelerator.unwrap_model with unwrap model * changes to align advanced script with canonical script * make changes for distributed training + unify unwrap_model calls in advanced script * add module.dtype fix to dreambooth script * unify unwrap_model calls in dreambooth script * fix condition in validation run * mixed precision * Update examples/advanced_diffusion_training/train_dreambooth_lora_flux_advanced.py Co-authored-by: Sayak Paul <[email protected]> * smol style change * change autocast * Apply style fixes --------- Co-authored-by: Sayak Paul <[email protected]> Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
1 parent: c51b6bd · commit: 71f34fc
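The commit message above mentions unifying unwrap_model calls, reading the model dtype through .module in distributed runs, and running validation under torch.no_grad with a mixed-precision autocast context. Below is a minimal, illustrative sketch of that pattern, not the code added in this commit: the helper names (unwrap_model, run_validation) and the generic pipeline_fn callable are assumptions for the example only.

import torch


def unwrap_model(model):
    """Strip common training wrappers to reach the bare module (illustrative helper)."""
    # torch.compile wraps the original model in ._orig_mod; DistributedDataParallel exposes .module.
    model = getattr(model, "_orig_mod", model)
    model = getattr(model, "module", model)
    return model


@torch.no_grad()  # validation should not build a gradient graph
def run_validation(transformer, pipeline_fn, prompts, weight_dtype=None):
    """Run validation in mixed precision on an accelerate/DDP-wrapped transformer (sketch)."""
    bare_transformer = unwrap_model(transformer)
    # When the model is still DDP-wrapped, its parameters (and dtype) live on .module,
    # so unwrap first and only then read the dtype used for autocast.
    autocast_dtype = weight_dtype or next(bare_transformer.parameters()).dtype
    images = []
    with torch.autocast(device_type="cuda", dtype=autocast_dtype):
        for prompt in prompts:
            images.append(pipeline_fn(prompt))
    return images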

File tree: 4 files changed (+154, -171 lines)

requirements.txt
@@ -1,7 +1,8 @@
-accelerate>=0.16.0
+accelerate>=0.31.0
 torchvision
-transformers>=4.25.1
+transformers>=4.41.2
 ftfy
 tensorboard
 Jinja2
-peft==0.7.0
+peft>=0.11.1
+sentencepiece
