
Commit bff3f6a (parent f94caf2)

fix: correct typo

1 file changed, 1 insertion(+), 1 deletion(-)


tests/models/test_llamacpp.py

Lines changed: 1 addition & 1 deletion
@@ -182,7 +182,7 @@ class Foo(BaseModel):
     generator = model.stream("foo?", Foo)
 
     # NOTE: The first few chunks may be empty (role info, control tokens, finish chunks)
-    # Relavant issue: https://github.com/abetlen/llama-cpp-python/issues/372
+    # Relevant issue: https://github.com/abetlen/llama-cpp-python/issues/372
     first_non_empty_token = next(x for x in generator if x)
     assert first_non_empty_token == "{"
 

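The NOTE in the hunk explains why the test filters chunks: streamed output can begin with empty chunks that carry only role or control information. Below is a minimal standalone sketch of the same skip-empty-chunks pattern; fake_stream is a hypothetical stand-in generator, not the model.stream API from the test.

# Minimal sketch of the filtering pattern used in the test.
# fake_stream is a hypothetical generator, not the llama-cpp-python API.
def fake_stream():
    yield ""                 # e.g. a role/control chunk with no text
    yield ""                 # another empty chunk
    yield "{"                # first real token of the JSON output
    yield '"name": "foo"}'   # remainder of the streamed output

# next(...) over a filtering generator expression skips falsy (empty) chunks
first_non_empty_token = next(x for x in fake_stream() if x)
assert first_non_empty_token == "{"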