
[pull] main from m-bain:main #63

Merged

pull[bot] merged 3 commits into Asofwar:main from m-bain:main
Feb 14, 2026

Conversation


@pull pull bot commented Feb 14, 2026

See Commits and Changes for more details.


Created by pull[bot] (v2.0.0-alpha.4)

Can you help keep this open source service alive? 💖 Please sponsor : )

MrPrayer and others added 3 commits February 14, 2026 14:11
…g paths (#1285)

- Add `model_cache_only` param to `load_align_model()`, pass as `local_files_only` to HuggingFace `from_pretrained` calls
- Forward `model_dir` and `model_cache_only` to both `load_align_model` call sites (initial load and language-change reload)
- Add `cache_dir` param to `DiarizationPipeline.__init__`, forward to pyannote `Pipeline.from_pretrained`
- Pass `--model_dir` as `cache_dir` when constructing `DiarizationPipeline` in CLI

Previously only the ASR model respected these flags. Alignment and diarization models would always download from HuggingFace to the default cache, breaking offline and custom-cache workflows.
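The forwarding pattern described above can be sketched as follows. This is a minimal illustration, not the actual whisperx implementation: the HuggingFace `from_pretrained` call is replaced with a stub that records its arguments, and the exact signatures of `load_align_model` and `DiarizationPipeline` are assumptions based on the bullet points.

```python
# Sketch of the caching flags described in this PR. The HuggingFace call is
# stubbed so the forwarding logic can be exercised without network access.

def hf_from_pretrained(model_name, cache_dir=None, local_files_only=False):
    """Stand-in for HuggingFace `from_pretrained`; records what it was given."""
    return {"model": model_name,
            "cache_dir": cache_dir,
            "local_files_only": local_files_only}

def load_align_model(language_code, model_dir=None, model_cache_only=False):
    """Load an alignment model, forwarding the cache flags.

    `model_cache_only` is passed through as `local_files_only`, so when it is
    True only a locally cached copy is used and no download is attempted.
    """
    return hf_from_pretrained(f"align-{language_code}",
                              cache_dir=model_dir,
                              local_files_only=model_cache_only)

class DiarizationPipeline:
    """Diarization wrapper that now accepts `cache_dir` (per this PR)."""
    def __init__(self, model_name="pyannote/speaker-diarization", cache_dir=None):
        # In the real code this is forwarded to pyannote's
        # Pipeline.from_pretrained; here the stub just records it.
        self.model = hf_from_pretrained(model_name, cache_dir=cache_dir)

# Offline workflow: point both loaders at a custom cache, forbid downloads.
align = load_align_model("en", model_dir="/models/cache", model_cache_only=True)
diarize = DiarizationPipeline(cache_dir="/models/cache")
print(align["local_files_only"], diarize.model["cache_dir"])
```

The point of the change is visible in the last three lines: before this PR, only the ASR loader accepted these flags, so the alignment and diarization paths ignored the custom cache and attempted downloads.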


---------

Co-authored-by: Barabazs <31799121+Barabazs@users.noreply.github.com>

Forward the existing --hf_token CLI argument to faster-whisper's
WhisperModel via a new use_auth_token parameter on load_model(),
enabling downloads of gated/private HuggingFace models.
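The token forwarding in this second commit can be sketched like so. `WhisperModel` is stubbed out here, and the `use_auth_token` parameter name comes from the commit message; the real faster-whisper constructor may accept additional arguments.

```python
# Sketch of --hf_token forwarding, with faster_whisper.WhisperModel stubbed.
import argparse

class WhisperModel:
    """Stand-in for faster_whisper.WhisperModel; records the token it gets."""
    def __init__(self, model_size, use_auth_token=None):
        self.model_size = model_size
        self.use_auth_token = use_auth_token

def load_model(model_size, use_auth_token=None):
    # New parameter, passed straight through so gated/private
    # HuggingFace models can be downloaded.
    return WhisperModel(model_size, use_auth_token=use_auth_token)

parser = argparse.ArgumentParser()
parser.add_argument("--hf_token", default=None)
args = parser.parse_args(["--hf_token", "hf_example_token"])

model = load_model("large-v3", use_auth_token=args.hf_token)
print(model.use_auth_token)  # the CLI token reaches the model constructor
```

The CLI flag already existed; the commit only threads it through `load_model()` to the model constructor instead of dropping it.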
@pull pull bot locked and limited conversation to collaborators Feb 14, 2026
@pull pull bot added the ⤵️ pull label Feb 14, 2026
@pull pull bot merged commit 42beab1 into Asofwar:main Feb 14, 2026
4 checks passed
