Do you need to file an issue?
Describe the bug
Azure OpenAI o1 models return the following error:
"Unsupported parameter: 'max_tokens' is not supported with this model. Use 'max_completion_tokens' instead."
Steps to reproduce
Use an Azure OpenAI-hosted o1 model.
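As a hedged illustration of the fix the error message suggests (not GraphRAG's actual code), a small shim could rename the parameter before the request is sent; the helper name and the o1-prefix check here are assumptions:

```python
def adapt_token_param(model: str, params: dict) -> dict:
    """Return a copy of params with 'max_tokens' renamed to
    'max_completion_tokens' for o1-family models, which reject
    the legacy parameter name."""
    if model.lower().startswith("o1") and "max_tokens" in params:
        params = dict(params)  # avoid mutating the caller's dict
        params["max_completion_tokens"] = params.pop("max_tokens")
    return params

# The same request settings then work for both model families:
adapt_token_param("gpt-4o", {"max_tokens": 512})   # left unchanged
adapt_token_param("o1-mini", {"max_tokens": 512})  # parameter renamed
```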
Expected Behavior
Requests complete without error; GraphRAG should send 'max_completion_tokens' rather than 'max_tokens' when calling o1 models.
GraphRAG Config Used
Logs and screenshots
No response
Additional Information
- GraphRAG Version: 1.2.0
- Operating System:
- Python Version:
- Related Issues: