ENH Make PEFT configs forward compatible #2038
Merged
BenjaminBossan merged 4 commits into huggingface:main on Oct 9, 2024
Conversation
Right now, loading a PEFT config saved with a more recent PEFT version than the one currently installed will lead to errors when new arguments were added to the config in the newer PEFT version. The current workaround is for users to manually edit the adapter_config.json to remove those entries.

With this PR, PEFT will make an attempt at removing these unknown keys by inspecting the signature. The user will be warned about these removed keys. This should generally be a safe measure because we will generally not introduce new config settings that change the default behavior. However, if a non-default value is used for such a key, this could lead to wrong results. This is mentioned in the warning.

While working on the tests, I also converted the unittest.TestCase to a normal pytest test in order to be able to use pytest fixtures.

I also plan on adding the PEFT version to the adapter_config.json in the future. This will allow us to better handle compatibility issues down the line. As adding that new key to all PEFT configs could cause a lot of disruption, I want to get this PR in first to ensure forward compatibility. Note that this new mechanism will not help anyone using a PEFT version < 0.14.0, so this will be a slow transition.
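As a rough illustration of the mechanism (not PEFT's actual implementation; the class and helper names below are made up), unknown keys can be filtered against the config class's `__init__` signature before instantiating, with a warning for anything dropped:

```python
import inspect
import warnings
from dataclasses import dataclass


# Hypothetical stand-in for a PEFT config class such as LoraConfig.
@dataclass
class MinimalLoraConfig:
    r: int = 8
    lora_alpha: int = 16


def filter_unknown_kwargs(config_cls, loaded_kwargs):
    """Drop keys the installed config class does not accept, warning about them."""
    known = set(inspect.signature(config_cls.__init__).parameters) - {"self"}
    unknown = sorted(k for k in loaded_kwargs if k not in known)
    if unknown:
        warnings.warn(
            f"Ignoring unknown config keys {unknown}; they were likely added in a "
            "newer PEFT version. If you rely on non-default values for them, "
            "consider upgrading PEFT."
        )
    return {k: v for k, v in loaded_kwargs.items() if k in known}


# "foobar" plays the role of a key added in a newer version.
loaded = {"r": 4, "lora_alpha": 32, "foobar": True}
config = MinimalLoraConfig(**filter_unknown_kwargs(MinimalLoraConfig, loaded))
```

The point of the signature inspection is that it needs no hand-maintained allowlist: whatever the installed version's config accepts is, by definition, the known set.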
The docs for this PR live here. All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update.
Member
Author
Let's wait for #1996 to be merged first, then update this branch and make the two changes work together nicely.
This issue has been automatically marked as stale because it has not had recent activity. If you think this still needs to be addressed please comment on this thread.
Member
Author
Not stale, still waiting for that PR, should be done soon.
sayakpaul approved these changes on Oct 9, 2024
Member
sayakpaul left a comment
Neat! Fully agree that this is a better plan.
BenjaminBossan added a commit to BenjaminBossan/peft that referenced this pull request on Oct 22, 2024
BenjaminBossan added a commit to BenjaminBossan/peft that referenced this pull request on Jan 22, 2025
BenjaminBossan added a commit that referenced this pull request on Jan 23, 2025
There have been multiple issues and forum posts in the past asking about errors like: TypeError: LoraConfig.__init__() got an unexpected keyword argument ...

This error can occur when the adapter being loaded was trained with a more recent PEFT version than the one currently being used. I thus added a section to the Troubleshooting part of our docs to describe the solutions.

Note that we already added changes to PEFT in #2038 to make configs forward compatible. But since users who encounter this problem have, by definition, older PEFT versions, they don't benefit from this.

Co-authored-by: Steven Liu <59462357+stevhliu@users.noreply.github.com>
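For users stuck on an older PEFT version, the documented workaround amounts to deleting the offending key from adapter_config.json by hand. A minimal sketch of that edit (the key name "foobar" and the file contents are illustrative, not a real checkpoint):

```python
import json
import os
import tempfile

# Create a toy adapter_config.json containing a key ("foobar") that a newer
# PEFT version added and an older one would reject with a TypeError.
workdir = tempfile.mkdtemp()
path = os.path.join(workdir, "adapter_config.json")
with open(path, "w") as f:
    json.dump({"peft_type": "LORA", "r": 8, "foobar": None}, f)

# The manual workaround: load the JSON, drop the unknown key, write it back.
with open(path) as f:
    cfg = json.load(f)
cfg.pop("foobar", None)
with open(path, "w") as f:
    json.dump(cfg, f, indent=2)
```

This is safe as long as the removed key was at its default value; otherwise upgrading PEFT is the better fix, which is what the docs section recommends.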
BenjaminBossan added a commit to BenjaminBossan/peft that referenced this pull request on Feb 25, 2025
In huggingface#2038, we added a change to PEFT to make PEFT configs forward compatible. To recap, when we add a new config value, say foo, to LoraConfig, users of older PEFT versions would normally get an error when trying to load it because LoraConfig would not accept a foo argument. Now, we remove this unknown arg and just give a warning.

In general, this worked well, but there was a bug when using PeftConfig.from_pretrained instead of the more specific LoraConfig.from_pretrained etc. In that case, we would check the known arguments of the PeftConfig type, which are only a few. This means that we would ignore parameters like the rank for LoRA.

With this PR, that bug is fixed. As we know the specific PEFT config, we can use that instead of the PeftConfig super type to determine the unknown parameters. Therefore, PeftConfig.from_pretrained will work the same as LoraConfig.from_pretrained.

Note that when a user uses PeftModel.from_pretrained, under the hood it will use the more specific PEFT config, i.e. LoraConfig etc. Therefore, the described bug would not occur there. It is thus very unlikely that this bug affected many (or any) users in the wild.
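The fix can be pictured as a registry-style dispatch: resolve the concrete config class from peft_type first, and only then compare the loaded keys against that class's signature rather than the base class's. The sketch below uses made-up class names, not PEFT's actual code:

```python
import inspect
from dataclasses import dataclass


# Hypothetical base config with only a few fields, like PeftConfig.
@dataclass
class BasePeftishConfig:
    peft_type: str = "NONE"


# Hypothetical method-specific config, like LoraConfig, with many more fields.
@dataclass
class LoraishConfig(BasePeftishConfig):
    r: int = 8
    lora_alpha: int = 16


CONFIG_REGISTRY = {"LORA": LoraishConfig}


def load_config(loaded: dict):
    # Dispatch to the specific class FIRST, so its full signature (not the
    # base class's narrow one) decides which keys are "unknown".
    cls = CONFIG_REGISTRY.get(loaded.get("peft_type"), BasePeftishConfig)
    known = set(inspect.signature(cls.__init__).parameters) - {"self"}
    kept = {k: v for k, v in loaded.items() if k in known}
    return cls(**kept)


# "brand_new_option" stands in for a key from a future version; it is dropped,
# but method-specific fields like r survive because LoraishConfig knows them.
config = load_config({"peft_type": "LORA", "r": 16, "brand_new_option": True})
```

Had the base class's signature been used instead, r would have been silently discarded along with the genuinely unknown key, which is exactly the bug described above.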
BenjaminBossan added a commit that referenced this pull request on Mar 4, 2025
Guy-Bilitski pushed a commit to Guy-Bilitski/peft that referenced this pull request on May 13, 2025
Guy-Bilitski pushed a commit to Guy-Bilitski/peft that referenced this pull request on May 13, 2025
Guy-Bilitski pushed a commit to Guy-Bilitski/peft that referenced this pull request on May 13, 2025
BenjaminBossan added a commit to BenjaminBossan/peft that referenced this pull request on Sep 12, 2025
BenjaminBossan added a commit that referenced this pull request on Sep 30, 2025
This PR adds the PEFT version to the adapter_config.json. This can be useful in the future -- for instance, when we change the state dict format of a PEFT method, we can convert it in a backwards compatible way based on the PEFT version being used. It can also be useful for debugging by providing an easy way to see the PEFT version that was used to train a PEFT adapter.

Notes: In #2038, we made a change to PEFT configs so that even if new arguments are added to a config, it can still be loaded with older PEFT versions (forward compatibility). Before that change, adding the PEFT version would have been quite disruptive, as it would make all PEFT configs incompatible with older PEFT versions. Said PR was included in the 0.14.0 release from Dec 2024, so we can expect the vast majority of PEFT users to use this version or a more recent one.

If the PEFT version is a dev version, the version tag is ambiguous. Therefore, I added some code to try to determine the commit hash. This works if users installed PEFT with git+...@<HASH>. Unit testing that the function to determine the hash works with these types of installs is not trivial. Therefore, I just patched the function to return a fixed hash. I did, however, test it locally and it works:

python -m pip install git+https://github.com/huggingface/diffusers.git@5e181eddfe7e44c1444a2511b0d8e21d177850a0
python -c "from peft.config import _get_commit_hash; print(_get_commit_hash('diffusers'))"

Also note that I tried to make the retrieval of the hash super robust by adding a broad try ... except. If there is an error there, e.g. due to a busted install path, we never want this to fail, but rather just accept that the hash cannot be determined (we add @unknown in this case). If users installed a dev version of PEFT in a different way, e.g. using git clone && pip install ., the commit hash will not be detected. I think this is fine, I really don't want to start shelling out with git just for this purpose.
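For git+...@<HASH> installs, pip records the commit hash in the PEP 610 direct_url.json metadata file, which can be read through importlib.metadata. The sketch below mirrors that idea; it is not necessarily PEFT's actual _get_commit_hash implementation, and the broad except reflects the "never fail, just give up on the hash" stance described above:

```python
import json
from importlib import metadata


def get_commit_hash(package_name):
    """Return the VCS commit hash for a git-installed package, or None.

    Reads PEP 610's direct_url.json, which pip writes for installs like
    `pip install git+https://...@<hash>`. Returns None for regular wheel
    installs, unknown packages, or any unexpected error.
    """
    try:
        dist = metadata.distribution(package_name)
        raw = dist.read_text("direct_url.json")  # None if the file is absent
        if raw is None:
            return None
        direct_url = json.loads(raw)
        return direct_url.get("vcs_info", {}).get("commit_id")
    except Exception:
        # A busted install path must never break config saving; just accept
        # that the hash cannot be determined.
        return None
```

Regular `git clone && pip install .` installs produce no vcs_info, so the function returns None for them, consistent with the limitation noted above.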
Right now, loading a PEFT config saved with a more recent PEFT version than is currently installed will lead to errors when new arguments are added to the config in the newer PEFT version. The current workaround is for users to manually edit the adapter_config.json to remove those entries.

To give an example, if a new LoRA option called "foobar" is added to PEFT v0.14.0, the saved adapter_config.json will from that point on always contain an entry for "foobar", even if the default value is being used and this value does nothing. When a user of a lower version of PEFT wants to load that checkpoint, they'll get an error, even though the checkpoint would work. The only workarounds are upgrading PEFT (which is not always desired) or manually deleting "foobar", which is annoying.

With this PR, PEFT will make an attempt at removing these unknown keys by inspecting the signature. The user will be warned about these removed keys. This should generally be a safe measure because we will generally not introduce new config settings that change the default behavior. However, if a non-default is used, this could lead to wrong results. For this reason, users are urged in the warning to upgrade PEFT.

While working on the tests, I also converted the unittest.TestCase to a normal pytest test in order to be able to use pytest fixtures.

I also plan on adding the PEFT version to the adapter_config.json in the future. This will allow us to better handle compatibility issues down the line. As adding that new key to all PEFT configs could cause a lot of disruption, I want to get this PR in first to ensure forward compatibility. Note that this new mechanism will not help anyone using a PEFT version <= 0.13.0, so this will be a slow transition.