
azure-openai
#10008
model_incompatibility_error
The embeddings operation does not work with the specified model. Please choose a different model and try again. You can learn more about compatible models for each operation here: https://go.microsoft.com/fwlink/?linkid=2197993.
This error has been identified and solved.
Reason
The 400 status code you're encountering from the Azure OpenAI API can stem from several possible causes:
Invalid Model
The specified model, gpt-35-turbo-16k, is a chat model and is not supported for the embeddings operation. Each Azure OpenAI operation only works with models that are compatible with it.
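For illustration, here is a minimal sketch of the failing pattern, assuming the openai Python SDK (v1+); the endpoint, API version, and deployment names are placeholders:

```python
# Sketch of the failing pattern (placeholder resource and deployment names).
from openai import AzureOpenAI

client = AzureOpenAI(
    azure_endpoint="https://my-resource.openai.azure.com",  # placeholder endpoint
    api_key="<your-api-key>",
    api_version="2024-02-01",
)

# gpt-35-turbo-16k is a chat deployment, so the embeddings operation rejects
# it with a 400 model_incompatibility_error.
client.embeddings.create(model="gpt-35-turbo-16k", input="some text to embed")
```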
Incorrect Request Configuration
The error could also be due to an incorrectly configured API request. This includes issues such as invalid API keys, incorrect headers, or an improperly formatted data payload.
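As a reference point, here is a minimal sketch of a correctly configured REST request, assuming the requests library; the resource name, deployment name, and API version are placeholders you would replace with your own values:

```python
# Sketch of a correctly configured Azure OpenAI embeddings request.
# The api-key header and the JSON body are the parts most often misconfigured.
import requests

endpoint = "https://my-resource.openai.azure.com"   # placeholder resource
deployment = "text-embedding-3-large"                # your embeddings deployment name
url = f"{endpoint}/openai/deployments/{deployment}/embeddings?api-version=2024-02-01"

response = requests.post(
    url,
    headers={
        "api-key": "<your-api-key>",        # Azure OpenAI expects api-key, not a Bearer token
        "Content-Type": "application/json",
    },
    json={"input": "some text to embed"},
)
response.raise_for_status()
print(response.json()["data"][0]["embedding"][:5])  # first few values of the vector
```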
Rate Limiting
Another potential cause is exceeding the rate limits imposed by the Azure OpenAI API. If your requests exceed the allowed rate for your pricing tier, the service rejects them with a rate-limiting error (typically HTTP 429).
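If rate limiting turns out to be the cause, a simple retry with exponential backoff is a common mitigation. The sketch below assumes the openai Python SDK (v1+), whose RateLimitError exception corresponds to HTTP 429; the endpoint and deployment names are placeholders:

```python
# Sketch: retry an embeddings call with exponential backoff on rate limiting.
import time
from openai import AzureOpenAI, RateLimitError

client = AzureOpenAI(
    azure_endpoint="https://my-resource.openai.azure.com",  # placeholder endpoint
    api_key="<your-api-key>",
    api_version="2024-02-01",
)

def embed_with_retry(text: str, deployment: str, max_retries: int = 5):
    for attempt in range(max_retries):
        try:
            return client.embeddings.create(model=deployment, input=text)
        except RateLimitError:
            time.sleep(2 ** attempt)  # back off: 1s, 2s, 4s, ...
    raise RuntimeError("Rate limit still exceeded after retries")
```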
Solution
To resolve the 400 error from the Azure OpenAI API, you need to address the following key areas:
Ensure you are using a compatible model for the operation you are performing. For example, if you are trying to perform an embeddings operation, use a model like text-embedding-3-large or one of the ada embedding models; chat models such as gpt-4 and gpt-35-turbo-16k are not compatible with this operation.
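A minimal sketch of the corrected call, assuming the openai Python SDK (v1+) and a hypothetical deployment of text-embedding-3-large:

```python
# Sketch of the corrected embeddings call (placeholder names).
from openai import AzureOpenAI

client = AzureOpenAI(
    azure_endpoint="https://my-resource.openai.azure.com",  # placeholder endpoint
    api_key="<your-api-key>",
    api_version="2024-02-01",
)

response = client.embeddings.create(
    model="text-embedding-3-large",   # an embeddings deployment, not a chat model
    input="some text to embed",
)
print(len(response.data[0].embedding))  # vector length, e.g. 3072 for this model
```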
Here are some steps to take:
Choose a compatible model: Select a model that is supported for the specific operation you are trying to perform.
Verify API keys and headers: Ensure your API keys are correct, not expired, and that your request headers are properly configured.
Check rate limits: Make sure you are not exceeding the rate limits set for your pricing tier; exceeding them triggers a rate-limiting error (typically HTTP 429) rather than a model error. The sketch after this list shows how to tell these error types apart.
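To tell these failure modes apart at runtime, you can catch the corresponding exceptions. This sketch assumes the openai Python SDK (v1+), which maps 400, 401, and 429 responses to BadRequestError, AuthenticationError, and RateLimitError respectively; the endpoint and deployment names are placeholders:

```python
# Sketch: map each failure mode in the checklist above to its SDK exception.
from openai import AzureOpenAI, AuthenticationError, BadRequestError, RateLimitError

client = AzureOpenAI(
    azure_endpoint="https://my-resource.openai.azure.com",  # placeholder endpoint
    api_key="<your-api-key>",
    api_version="2024-02-01",
)

try:
    client.embeddings.create(model="text-embedding-3-large", input="hello")
except BadRequestError as e:
    print("400: incompatible model or malformed request:", e)
except AuthenticationError as e:
    print("401: invalid or expired API key:", e)
except RateLimitError as e:
    print("429: rate limit for your pricing tier exceeded:", e)
```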