Enforcing JSON Schema with Anyscale & Together
Get the LLM to adhere to your JSON schema using Anyscale & Together AI’s newly introduced JSON modes
LLMs excel at generating creative text, but production applications demand structured outputs for seamless integration. Instructing an LLM to generate output only in a specified syntax makes its behaviour more predictable. JSON is the format of choice here - it is versatile and widely used as a standard data-exchange format.
Several LLM providers offer features that help enforce JSON outputs:
- OpenAI offers a feature called JSON mode that ensures the output is a valid JSON object.
- While this is useful, it only guarantees that the output IS valid JSON - it does not guarantee adherence to your custom JSON schema.
- Anyscale and Together AI go further - they not only enforce that the output is JSON but also ensure that it follows any given JSON schema.
Using Portkey, you can easily experiment with models from Anyscale & Together AI and explore the power of their JSON modes:
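Here is a minimal sketch using the Portkey Python SDK with an Anyscale virtual key; the virtual key, model choice, prompt, and schema are illustrative placeholders, not fixed values.

```python
# Minimal sketch: JSON schema enforcement via Portkey (pip install portkey-ai).
# The API key, virtual key, model, and schema below are illustrative placeholders.
from portkey_ai import Portkey

portkey = Portkey(
    api_key="PORTKEY_API_KEY",
    virtual_key="ANYSCALE_VIRTUAL_KEY",  # or your Together AI virtual key
)

# Describe the exact shape you want the model to return
json_schema = {
    "type": "object",
    "properties": {
        "title": {"type": "string"},
        "genre": {"type": "string"},
        "year": {"type": "integer"},
    },
    "required": ["title", "genre", "year"],
}

completion = portkey.chat.completions.create(
    model="mistralai/Mistral-7B-Instruct-v0.1",
    messages=[
        {"role": "system", "content": "You are a helpful assistant that replies in JSON."},
        {"role": "user", "content": "Suggest a sci-fi movie from the 1980s."},
    ],
    # type `json_object` turns on JSON mode; `schema` pins the exact structure
    response_format={"type": "json_object", "schema": json_schema},
)

print(completion.choices[0].message.content)
```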
Output JSON:
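For the illustrative schema above, a conforming response might look like the following (actual values will vary with the prompt and model):

```json
{
  "title": "Blade Runner",
  "genre": "Science Fiction",
  "year": 1982
}
```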
As you can see - it’s pretty simple. Just define the JSON schema, and pass it at the time of making your request using the `response_format` param. The `response_format`’s `type` is `json_object` and the `schema` contains all keys and their expected types.
Supported Models
| Model/Provider | Ensure JSON | Ensure Schema |
|---|---|---|
| `mistralai/Mistral-7B-Instruct-v0.1` (Anyscale) | ✅ | ✅ |
| `mistralai/Mixtral-8x7B-Instruct-v0.1` (Anyscale) | ✅ | ✅ |
| `mistralai/Mixtral-8x7B-Instruct-v0.1` (Together AI) | ✅ | ✅ |
| `mistralai/Mistral-7B-Instruct-v0.1` (Together AI) | ✅ | ✅ |
| `togethercomputer/CodeLlama-34b-Instruct` (Together AI) | ✅ | ✅ |
| gpt-4 and previous releases (OpenAI / Azure OpenAI) | ✅ | ❌ |
| gpt-3.5-turbo and previous releases (OpenAI / Azure OpenAI) | ✅ | ❌ |
| Ollama models | | |
Creating Nested JSON Object Schema
Here’s an example showing how you can also define a nested JSON schema and get the LLM to enforce it:
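The sketch below reuses the `portkey` client from the earlier example; the nested schema and its field names are illustrative assumptions, not a fixed format.

```python
# Sketch: passing a nested schema through `response_format` (field names are illustrative)
nested_schema = {
    "type": "object",
    "properties": {
        "name": {"type": "string"},
        "address": {
            "type": "object",
            "properties": {
                "street": {"type": "string"},
                "city": {"type": "string"},
                "zip": {"type": "string"},
            },
            "required": ["street", "city"],
        },
        "hobbies": {
            "type": "array",
            "items": {"type": "string"},
        },
    },
    "required": ["name", "address"],
}

completion = portkey.chat.completions.create(
    model="mistralai/Mixtral-8x7B-Instruct-v0.1",
    messages=[
        {"role": "user", "content": "Generate a fictional person profile as JSON."},
    ],
    response_format={"type": "json_object", "schema": nested_schema},
)

print(completion.choices[0].message.content)
```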
Add your Anyscale or Together AI virtual keys to the Portkey vault, and get started!