Merged
46 commits
507d043
start deprecating loraattn.
sayakpaul Dec 28, 2023
a5e0951
fix
sayakpaul Dec 28, 2023
2f273ea
wrap into unet_lora_state_dict
sayakpaul Dec 28, 2023
015e4a1
utilize text_encoder_lora_params
sayakpaul Dec 28, 2023
08289e1
utilize text_encoder_attn_modules
sayakpaul Dec 28, 2023
604936f
debug
sayakpaul Dec 28, 2023
91c5888
debug
sayakpaul Dec 28, 2023
3f10766
remove print
sayakpaul Dec 28, 2023
4ce6112
don't use text encoder for test_stable_diffusion_lora
sayakpaul Dec 28, 2023
7020cec
load the procs.
sayakpaul Dec 28, 2023
a72aba2
set_default_attn_processor
sayakpaul Dec 28, 2023
bdb2f6b
fix: set_default_attn_processor call.
sayakpaul Dec 28, 2023
3a35ceb
fix: lora_components[unet_lora_params]
sayakpaul Dec 28, 2023
18af8e7
checking for 3d.
sayakpaul Dec 28, 2023
5b70ddb
3d.
sayakpaul Dec 28, 2023
5d006d4
more fixes.
sayakpaul Dec 28, 2023
df47cc4
debug
sayakpaul Dec 28, 2023
01bd812
debug
sayakpaul Dec 28, 2023
a4a46f0
debug
sayakpaul Dec 28, 2023
7d70f37
debug
sayakpaul Dec 28, 2023
fe668cd
more debug
sayakpaul Dec 28, 2023
2b097de
more debug
sayakpaul Dec 28, 2023
940a4a0
more debug
sayakpaul Dec 28, 2023
3b38d0c
more debug
sayakpaul Dec 28, 2023
3c05c10
more debug
sayakpaul Dec 28, 2023
620df7d
more debug
sayakpaul Dec 28, 2023
598667c
hack.
sayakpaul Dec 28, 2023
60c5242
remove comments and prep for a PR.
sayakpaul Dec 28, 2023
2229668
appropriate set_lora_weights()
sayakpaul Dec 28, 2023
03bb29c
fix
sayakpaul Dec 28, 2023
f49185e
fix: test_unload_lora_sd
sayakpaul Dec 28, 2023
b05585d
fix: test_unload_lora_sd
sayakpaul Dec 28, 2023
451816b
use default attention processors.
sayakpaul Dec 28, 2023
dc085bf
debug
sayakpaul Dec 28, 2023
7d9cbaa
debug nan
sayakpaul Dec 28, 2023
d15af62
debug nan
sayakpaul Dec 28, 2023
a7b1606
debug nan
sayakpaul Dec 28, 2023
b507b40
use NaN instead of inf
sayakpaul Dec 28, 2023
67df56f
remove comments.
sayakpaul Dec 28, 2023
ccda992
fix: test_text_encoder_lora_state_dict_unchanged
sayakpaul Dec 28, 2023
bd995df
attention processor default
sayakpaul Dec 28, 2023
e80acf3
default attention processors.
sayakpaul Dec 28, 2023
046a8b3
default
sayakpaul Dec 28, 2023
6f2dda3
style
sayakpaul Dec 28, 2023
8b91172
Merge branch 'main' into remove-depcrecated-lora
sayakpaul Dec 28, 2023
9678f8b
Merge branch 'main' into remove-depcrecated-lora
sayakpaul Jan 2, 2024
wrap into unet_lora_state_dict
sayakpaul committed Dec 28, 2023
commit 2f273ea6051bfecd58f6737e2896511b54c96acb
tests/lora/test_lora_layers_old_backend.py: 2 additions & 1 deletion
@@ -48,6 +48,7 @@
     XFormersAttnProcessor,
 )
 from diffusers.models.lora import LoRALinearLayer
+from diffusers.training_utils import unet_lora_state_dict
 from diffusers.utils.import_utils import is_xformers_available
 from diffusers.utils.testing_utils import (
     deprecate_after_peft_backend,
@@ -144,7 +145,7 @@ def create_unet_lora_layers(unet: nn.Module, rank=4, is_3d=False, mock_weights=T
         unet_lora_parameters.extend(attn_module.to_v.lora_layer.parameters())
         unet_lora_parameters.extend(attn_module.to_out[0].lora_layer.parameters())

-    return unet_lora_parameters
+    return unet_lora_state_dict(unet)


 # def create_text_encoder_lora_attn_procs(text_encoder: nn.Module):
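The change above makes the test helper return a serializable state dict instead of a flat list of LoRA parameters. For context, here is a minimal sketch of roughly what diffusers.training_utils.unet_lora_state_dict did on this old (pre-PEFT) LoRA backend; the lora_layer attribute name is taken from the diff context, while the exact key format is an assumption and may differ from the library code:

# Sketch (assumption): approximately how unet_lora_state_dict collects the LoRA
# weights of a UNet whose attention projections were patched with LoRALinearLayer.
from typing import Dict

import torch
import torch.nn as nn


def unet_lora_state_dict(unet: nn.Module) -> Dict[str, torch.Tensor]:
    lora_state_dict = {}
    for name, module in unet.named_modules():
        # Patched sub-modules (to_q, to_k, to_v, to_out[0], ...) expose their
        # adapter through a `lora_layer` attribute, as seen in the diff above.
        lora_layer = getattr(module, "lora_layer", None)
        if lora_layer is not None:
            for matrix_name, param in lora_layer.state_dict().items():
                # matrix_name is e.g. "down.weight" or "up.weight".
                lora_state_dict[f"{name}.lora.{matrix_name}"] = param
    return lora_state_dict

Compared with extending a flat parameter list, a state dict keyed by module path is directly serializable and loadable, which is presumably why the test helper switches to it in this commit.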