This repository was archived by the owner on Jun 5, 2025. It is now read-only.

Commit 6bbc147

jhrozek, ptelang, and aponcedeleonch committed
A temporary workaround to make Anthropic FIM work with Continue
We need to run the normalizer and denormalizer even if there is nothing in the output pipeline for Anthropic, because it uses a special format for messages. However, we can't run the pipeline for all providers because we are missing a normalizer/denormalizer for llama.cpp. We need to add that, but let's add a temporary hack to get FIM with Anthropic working.

Co-authored-by: Pankaj Telang <pankaj@stacklok.com>
Co-authored-by: Alejandro Ponce <aponcedeleonch@stacklok.com>
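The "special format for messages" point can be made concrete: Anthropic's Messages API takes the system prompt as a top-level `system` field rather than as a `{"role": "system"}` entry in the messages list, which is why a normalizer/denormalizer pass has to run even when the pipeline itself has no steps. Below is a minimal sketch of that kind of reshaping; the function name and shapes are hypothetical, since this commit does not show codegate's actual normalizer classes.

```python
# Hypothetical sketch of message normalization for Anthropic.
# Anthropic's Messages API expects the system prompt in a top-level
# "system" field, not as a {"role": "system"} entry in the message list.
from typing import Any


def normalize_for_anthropic(messages: list[dict[str, Any]]) -> dict[str, Any]:
    """Split OpenAI-style messages into Anthropic's (system, messages) shape."""
    system_parts = [m["content"] for m in messages if m["role"] == "system"]
    chat = [m for m in messages if m["role"] != "system"]
    return {
        "system": "\n".join(system_parts) if system_parts else None,
        "messages": chat,
    }


if __name__ == "__main__":
    openai_style = [
        {"role": "system", "content": "You are a code completion engine."},
        {"role": "user", "content": "def fib(n):"},
    ]
    print(normalize_for_anthropic(openai_style))
```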
1 parent 768ae40 commit 6bbc147


1 file changed: +3 −1 lines


src/codegate/providers/base.py

Lines changed: 3 additions & 1 deletion
@@ -81,7 +81,9 @@ async def _run_output_stream_pipeline(
         if out_pipeline_processor is None:
             logger.info("No output pipeline processor found, passing through")
             return model_stream
-        if len(out_pipeline_processor.pipeline_steps) == 0:
+
+        # HACK! for anthropic we always need to run the output FIM pipeline even if empty to run the normalizers
+        if len(out_pipeline_processor.pipeline_steps) == 0 and self.provider_route_name != "anthropic":
             logger.info("No output pipeline steps configured, passing through")
             return model_stream
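Read together with the commit message, the guard now short-circuits only for providers other than anthropic when the pipeline has no steps. The following is a simplified, self-contained sketch of how the guarded method might behave; only the identifiers visible in the diff (out_pipeline_processor, pipeline_steps, provider_route_name, model_stream) come from the real code, and everything else is a hypothetical stand-in rather than codegate's actual implementation.

```python
import asyncio
from typing import AsyncIterator, Optional


class OutputPipelineProcessor:
    """Hypothetical stand-in for the output pipeline processor in the diff."""

    def __init__(self, pipeline_steps: list) -> None:
        self.pipeline_steps = pipeline_steps

    async def process(self, stream: AsyncIterator[str]) -> AsyncIterator[str]:
        # Even with zero steps, routing the stream through here gives the
        # provider's normalizer/denormalizer a chance to reshape each chunk,
        # which is what Anthropic's special message format requires.
        async for chunk in stream:
            yield chunk


class Provider:
    def __init__(self, route: str, processor: Optional[OutputPipelineProcessor]) -> None:
        self.provider_route_name = route
        self.out_pipeline_processor = processor

    async def _run_output_stream_pipeline(
        self, model_stream: AsyncIterator[str]
    ) -> AsyncIterator[str]:
        if self.out_pipeline_processor is None:
            return model_stream
        # The hack from the diff: anthropic always goes through the pipeline,
        # even when it has no steps, so the normalizers still run.
        if (
            len(self.out_pipeline_processor.pipeline_steps) == 0
            and self.provider_route_name != "anthropic"
        ):
            return model_stream
        return self.out_pipeline_processor.process(model_stream)


async def _demo() -> None:
    async def fake_stream() -> AsyncIterator[str]:
        for chunk in ("def hel", "lo():", " pass"):
            yield chunk

    provider = Provider("anthropic", OutputPipelineProcessor(pipeline_steps=[]))
    stream = await provider._run_output_stream_pipeline(fake_stream())
    async for chunk in stream:
        print(chunk, end="")


if __name__ == "__main__":
    asyncio.run(_demo())
```

As the commit message notes, this branch-per-provider check is a stopgap: once llama.cpp gains its own normalizer/denormalizer, the pipeline could presumably run unconditionally and the provider-name comparison could be dropped.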
