removing redundant requires_grad = False (#10628)
We already set the UNet to `requires_grad=False` at line 506, so the second freezing loop was a no-op.

Co-authored-by: Aryan <[email protected]>
YanivDorGalron and a-r-r-o-w authored Jan 23, 2025
1 parent 37c9697 commit a451c0e
Showing 1 changed file with 0 additions and 4 deletions.
4 changes: 0 additions & 4 deletions examples/text_to_image/train_text_to_image_lora.py
@@ -515,10 +515,6 @@ def main():
     elif accelerator.mixed_precision == "bf16":
         weight_dtype = torch.bfloat16
 
-    # Freeze the unet parameters before adding adapters
-    for param in unet.parameters():
-        param.requires_grad_(False)
-
     unet_lora_config = LoraConfig(
         r=args.rank,
         lora_alpha=args.rank,
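The removed loop was redundant because the script already freezes the UNet earlier (around line 506), and freezing an already-frozen parameter changes nothing. A minimal sketch illustrating this, using a hypothetical `Param`/`DummyUNet` stand-in for torch parameters rather than the real `torch.nn` API, so it runs without torch installed:

```python
class Param:
    """Hypothetical stand-in for torch.nn.Parameter; only requires_grad matters here."""

    def __init__(self, requires_grad=True):
        self.requires_grad = requires_grad

    def requires_grad_(self, flag=True):
        # Mirrors torch's in-place setter: mutates the flag and returns self.
        self.requires_grad = flag
        return self


class DummyUNet:
    """Toy model exposing parameters() like a torch module (assumption)."""

    def __init__(self):
        self._params = [Param(), Param(), Param()]

    def parameters(self):
        return iter(self._params)


unet = DummyUNet()

# First freeze, as the training script already does before this point:
for p in unet.parameters():
    p.requires_grad_(False)
after_first = [p.requires_grad for p in unet._params]

# The loop this commit removes repeated the exact same operation:
for p in unet.parameters():
    p.requires_grad_(False)
after_second = [p.requires_grad for p in unet._params]

# Both passes leave every flag False, so the second loop is a no-op.
print(after_first == after_second == [False, False, False])
```

Since freezing is idempotent, deleting the duplicated loop has no effect on which parameters train; the LoRA adapters added afterwards by `LoraConfig` remain the only trainable weights.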
