
Add Apertus model support with XIeLU activation#1197

Open
sinievanderben wants to merge 1 commit into TransformerLensOrg:main from sinievanderben:add-apertus-support

Conversation

@sinievanderben

Description


Fixes # (issue)

Type of change


  • Bug fix (non-breaking change which fixes an issue)
  • New feature (non-breaking change which adds functionality)
  • Breaking change (fix or feature that would cause existing functionality to not work as expected)
  • This change requires a documentation update

Checklist:

  • I have commented my code, particularly in hard-to-understand areas
  • I have made corresponding changes to the documentation
  • My changes generate no new warnings
  • I have added tests that prove my fix is effective or that my feature works
  • New and existing unit tests pass locally with my changes
  • I have not rewritten tests relating to key interfaces which would affect backward compatibility

Comments

This PR adds support for the Apertus model. It implements the novel Extended Inverse ELU (xIELU) activation and makes the Apertus‑to‑TransformerLens weight converter aware of its trainable parameters.

Key changes:

New trainable xIELU activation

  • XIELU class in utils.py with learnable α₊, α₋, β.
  • Original xielu function kept for static use.
  • Added "xielu" to ACTIVATION_FN_DICT.
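As a rough, standalone sketch of the static `xielu` function (the parameter defaults and the exact piecewise form here are illustrative assumptions, not the values or formula from the Apertus reference implementation):

```python
import math

def xielu(x, alpha_p=0.8, alpha_n=0.8, beta=0.5, eps=-1e-6):
    """Illustrative static xIELU: quadratic-plus-linear branch for x > 0,
    saturating exponential branch for x <= 0. Defaults are placeholders."""
    if x > 0:
        # positive branch: alpha_p * x^2 + beta * x
        return alpha_p * x * x + beta * x
    # negative branch: alpha_n * (expm1(min(x, eps)) - x) + beta * x
    return alpha_n * (math.expm1(min(x, eps)) - x) + beta * x
```

With these placeholder defaults the two branches agree at x = 0 up to the small `eps` offset, which is what makes the activation usable as a drop-in nonlinearity.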

Activation wiring

  • can_be_used_as_mlp.select_activation_function() now instantiates XIELU when cfg.act_fn == "xielu".
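A hypothetical sketch of how such a dispatch can work (class and function names here are simplified stand-ins for the real ones in `utils.py` and `can_be_used_as_mlp`): stateless activations are looked up in a dict, while a trainable activation gets a fresh instance so each MLP owns its own parameters.

```python
from types import SimpleNamespace

class XIELU:
    """Stand-in for the trainable activation with learnable alpha_p, alpha_n, beta."""
    def __init__(self, alpha_p=0.8, alpha_n=0.8, beta=0.5):
        self.alpha_p, self.alpha_n, self.beta = alpha_p, alpha_n, beta

def relu(x):
    return max(x, 0.0)

ACTIVATION_FN_DICT = {"relu": relu}  # illustrative entries only

def select_activation_function(cfg):
    # Trainable activations need a fresh module instance per layer;
    # stateless functions can be shared straight from the dict.
    if cfg.act_fn == "xielu":
        return XIELU()
    return ACTIVATION_FN_DICT[cfg.act_fn]

act = select_activation_function(SimpleNamespace(act_fn="xielu"))
```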

Apertus weight converter improvements

  • Extracts trainable activation parameters (alpha_p, alpha_n, beta).
  • Handles different attribute locations (mlp.act_fn, mlp.act, or attributes on the MLP module itself).
  • Falls back to default constants if missing (back‑compatible with old checkpoints).
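The lookup-with-fallback logic described above can be sketched as follows (the helper name and the default constants are hypothetical, chosen only to illustrate the search order: `mlp.act_fn`, then `mlp.act`, then the MLP module itself, then defaults):

```python
from types import SimpleNamespace

DEFAULTS = (0.8, 0.8, 0.5)  # placeholder constants, not the real defaults

def extract_xielu_params(mlp):
    """Return (alpha_p, alpha_n, beta), searching the attribute locations
    a checkpoint might use and falling back to defaults for old checkpoints."""
    candidates = [getattr(mlp, "act_fn", None), getattr(mlp, "act", None), mlp]
    for obj in candidates:
        if obj is not None and hasattr(obj, "alpha_p"):
            return obj.alpha_p, obj.alpha_n, obj.beta
    return DEFAULTS  # old checkpoint without trainable activation params

mlp_new = SimpleNamespace(act_fn=SimpleNamespace(alpha_p=0.7, alpha_n=0.9, beta=0.4))
mlp_old = SimpleNamespace()
```

Falling back to constants rather than raising keeps the converter backward-compatible with checkpoints saved before the activation became trainable.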

