Refactor layer initialization: PR 2960 continued #3047
Open
BenjaminBossan wants to merge 28 commits into huggingface:main from
Conversation
The docs for this PR live here. All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update.
BenjaminBossan commented Feb 18, 2026
```python
if "wavelet_family" in optional_kwargs:
    wavelet_family = optional_kwargs["wavelet_family"]
    if wavelet_family is None:
        wavelet_family = waveft_config.wavelet_family
```
Member (Author)
Note: I ran the unit tests and this handling seemed unnecessary, so I just removed it.
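A minimal sketch of why the fallback was redundant (illustrative names only, not the actual PEFT code): once the WaveFT config itself reaches the layer, falling back from a `None` kwarg to the config attribute yields the same value as reading the config directly.

```python
from types import SimpleNamespace

# Hypothetical stand-in for the WaveFT config; the real class differs.
waveft_config = SimpleNamespace(wavelet_family="db1")

def resolve_wavelet_family(config, optional_kwargs):
    # Illustrative helper mirroring the removed handling: prefer an
    # explicit kwarg, otherwise fall back to the config attribute.
    value = optional_kwargs.get("wavelet_family")
    return value if value is not None else config.wavelet_family

# With no kwarg (or a None kwarg), the result equals reading the config
# directly, which is all the refactored code path needs to do.
print(resolve_wavelet_family(waveft_config, {}))                   # db1
print(resolve_wavelet_family(waveft_config, {"wavelet_family": None}))  # db1
```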
BenjaminBossan added a commit to BenjaminBossan/peft that referenced this pull request on Feb 26, 2026:
The layer initialization was refactored in huggingface#2960. This introduced a bug when initializing AdaLoRA with GPTQ layers because some parameters were missing. This bug is now fixed. The same bugfix is contained in huggingface#3047 but is provided here separately to allow merging it more easily.
Continuation of #2960 which only covered LoRA and AdaLoRA.
This PR targets the `__init__` and `update_layer` methods of the other relevant PEFT methods (i.e. everything except for prompt learning). Again, the goal is to pass the corresponding PEFT config directly and let the layers pick out the relevant arguments, instead of having the model classes pick out the arguments and pass them to the layers. The advantage is that this reduces code duplication (this PR reduces LOC by 250). If, say, a new init argument is added, there is no longer a need to update the code in multiple places just to ensure that the argument is passed to the layers correctly.

The handling for trainable tokens is a bit awkward, as the config can have different types. I think ideally it should always be a config instance, but I'm not sure how easy that would be to change.
Bone was not updated, as it will be removed for the 0.19 release.
There was also a bug in the GPTQ AdaLoRA implementation caused by #2960, which is now fixed.
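The config-passing pattern the PR describes can be sketched roughly as follows; the class and attribute names here are hypothetical, not the actual PEFT API.

```python
from dataclasses import dataclass

# Hypothetical PEFT-method config; the real config classes differ.
@dataclass
class MethodConfig:
    r: int = 8
    alpha: int = 16
    dropout: float = 0.0

class MethodLayer:
    # Before the refactor, the model class unpacked the config itself:
    #   layer.update_layer(r=cfg.r, alpha=cfg.alpha, dropout=cfg.dropout)
    # After the refactor, the whole config is passed and the layer picks
    # out what it needs, so a new config field only touches one place.
    def update_layer(self, config: MethodConfig) -> None:
        self.r = config.r
        self.scaling = config.alpha / config.r
        self.dropout = config.dropout

layer = MethodLayer()
layer.update_layer(MethodConfig(r=4, alpha=8))
print(layer.scaling)  # 2.0
```

The model class no longer needs to know which arguments each layer type consumes, which is where the roughly 250 removed lines come from.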