
openai

#10006

response_format_error

"messages" must include "json" to use "response_format" as "json_object".

This error has been identified and solved.

Reason

This error, which states that "messages" must contain the word "json" in some form when "response_format" is set to "json_object", stems from a specific requirement of the OpenAI API. Here is a concise explanation:

  • The OpenAI API has strict criteria for when the response_format can be set to json_object. Specifically, the API expects the input or the instructions provided to the model to explicitly indicate that the response should be in JSON format. If this indication is missing, the API will reject the request with a 400 error, as it does not meet the required conditions for generating a JSON response.
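To make the failing condition concrete, here is an illustrative pair of request bodies (the model name and prompts are examples, not taken from this page): the first triggers the 400 error because no message mentions JSON, while the second passes the check.

```python
# Request body that triggers the 400 error: response_format asks for a JSON
# object, but the word "json" appears nowhere in the messages.
bad_request = {
    "model": "gpt-3.5-turbo",
    "response_format": {"type": "json_object"},
    "messages": [
        {"role": "user", "content": "List three primary colors."}  # no "json"
    ],
}

# Mentioning JSON anywhere in the messages satisfies the requirement.
good_request = {
    "model": "gpt-3.5-turbo",
    "response_format": {"type": "json_object"},
    "messages": [
        {"role": "user", "content": "List three primary colors as JSON."}
    ],
}
```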

Solution

To fix the error where "messages" must include "json" to use "response_format" as "json_object", you need to ensure the following:

  • Include the word "json" in your prompt or instructions to the model.

  • Ensure that the model you are using supports the response_format parameter set to json_object. Currently, this is supported for models like gpt-3.5-turbo and gpt-4-turbo, but not for gpt-4-vision models.

Key points to check:

  • Modify your prompt to explicitly mention JSON.

  • Verify the model you are using is compatible with the json_object response format.
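The checklist above can be sketched as a minimal script, assuming the OpenAI Python SDK (v1.x); the model name, prompts, and the `messages_mention_json` helper are illustrative. The helper mirrors the API-side check so you can validate locally before sending the request.

```python
import os

def messages_mention_json(messages):
    """Mirror the API-side check: some message content must contain 'json'
    (case-insensitive) before response_format={"type": "json_object"} is
    accepted."""
    return any("json" in (m.get("content") or "").lower() for m in messages)

messages = [
    {"role": "system", "content": "You are a helpful assistant. Reply in JSON."},
    {"role": "user", "content": "List three primary colors."},
]

# Without "JSON" in any message, the request below would be rejected with
# the 400 error shown above.
assert messages_mention_json(messages)

if os.environ.get("OPENAI_API_KEY"):  # only call the API when a key is set
    from openai import OpenAI

    client = OpenAI()  # reads OPENAI_API_KEY from the environment
    response = client.chat.completions.create(
        model="gpt-3.5-turbo",  # must support json_object response_format
        response_format={"type": "json_object"},
        messages=messages,
    )
    print(response.choices[0].message.content)
```

Mentioning JSON in the system message, as here, is usually the cleanest place to satisfy the requirement without altering the user's prompt.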

Original Error Message

Raw

"messages" must include "json" to use "response_format" as "json_object".



© 2024 Portkey, Inc. All rights reserved
