Description
ERROR - api.simple_chat - simple_chat.py:501 - Error with Openai API: Error code: 413 - {'error': {'message': '输入Tokens数量(1176440)超过系统限制(1000000) (request id: 2026020123301026091426VGEK0SGk)', 'type': 'new_api_error', 'param': '', 'code': 'local:invalid_request'}}
The error message says the input token count (1176440) exceeds the system limit (1000000). The error is reported by the following code:
elif request.provider == "openai":
try:
# Get the response and handle it properly using the previously created api_kwargs
logger.info("Making Openai API call")
response = await model.acall(api_kwargs=api_kwargs, model_type=ModelType.LLM)
# Handle streaming response from Openai
async for chunk in response:
choices = getattr(chunk, "choices", [])
if len(choices) > 0:
delta = getattr(choices[0], "delta", None)
if delta is not None:
text = getattr(delta, "content", None)
if text is not None:
yield text
except Exception as e_openai:
logger.error(f"Error with Openai API: {str(e_openai)}")
yield f"\nError with Openai API: {str(e_openai)}\n\nPlease check that you have set the OPENAI_API_KEY environment variable with a valid API key."
Here is the embedder.json configuration I'm using (with the OpenAI API):
{
  "embedder": {
    "client_class": "OpenAIClient",
    "initialize_kwargs": {
      "api_key": "${OPENAI_API_KEY}",
      "base_url": "${OPENAI_BASE_URL}"
    },
    "batch_size": 8,
    "model_kwargs": {
      "model": "text-embedding-3-large",
      "dimensions": 512,
      "encoding_format": "float"
    }
  },
  "embedder_ollama": {
    "client_class": "OllamaClient",
    "model_kwargs": {
      "model": "nomic-embed-text"
    }
  },
  "retriever": {
    "top_k": 10
  },
  "text_splitter": {
    "split_by": "token",
    "chunk_size": 300,
    "chunk_overlap": 60
  }
}
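For reference, the retrieved context implied by these settings is small; this is just arithmetic on the values above, not the project's internals:

# Rough upper bound on the retrieved context, from the config values above.
top_k = 10        # retriever.top_k
chunk_size = 300  # text_splitter.chunk_size, in tokens

retrieved_budget = top_k * chunk_size
print(retrieved_budget)  # 3000 tokens, far below the 1,000,000 limit in the error

So the 1,176,440 input tokens in the error presumably come from something else being placed into the prompt, not from these RAG chunks.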