Prompt Templates
With Prompt Templates, you can seamlessly create and manage your LLM prompts in one place, and deploy them with just an API call.
This feature is available for all plans:
- Developer: 3 Prompt Templates
- Production & Enterprise: Unlimited Prompt Templates
Prompt templates on Portkey are built to be production-ready - Portkey automatically tracks changes, maintains versions, and gives both developers and prompt engineers the flexibility to experiment quickly without breaking prod.
How to use Prompt Templates
- On the Portkey app, just click on the “Prompts” button on the left, click on “Create”, and a new, blank playground opens up.
- Here you can pick your provider & model of choice - Portkey supports `vision`, `chat`, and `completions` models from 20+ providers. Provider choice here is tied to Virtual Keys, so you may see multiple options for the same provider, based on the number of virtual keys you have.
- You can write the user/assistant messages as well as configure all the model parameters like `top_p`, `max_tokens`, `logit_bias`, etc. - right from the UI.
- Portkey prompts also support enabling `JSON Mode` and writing `Tools/Functions` call chains.
Templating Engine
Portkey uses Mustache under the hood to power the prompt templates.
Mustache is a commonly used logic-less templating engine that follows a simple schema for defining variables and more.
With Mustache, prompt templates become even more extensible by letting you incorporate various `{{tags}}` in your prompt template and easily pass your data.
The most common usage of Mustache templates is `{{variables}}`, used to pass a value at runtime.
Using Variables
Let’s look at the following template:
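A support-assistant template along these lines (the message content here is an illustrative sketch; only the two variables come from the original example):

```
SYSTEM: You are a helpful support assistant.
Use the customer data below to answer the user's question.

Customer data: {{customer_data}}

USER: {{chat_query}}
```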
As you can see, `{{customer_data}}` and `{{chat_query}}` are defined as variables in the template, and you can pass their values at runtime:
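A minimal sketch with the Portkey Python SDK (the prompt ID and variable values are placeholders):

```python
from portkey_ai import Portkey

portkey = Portkey(api_key="PORTKEY_API_KEY")

# Values for the {{variables}} in the template are passed
# through the `variables` property at runtime.
completion = portkey.prompts.completions.create(
    prompt_id="YOUR_PROMPT_ID",
    variables={
        "customer_data": "Plan: Pro / Region: EU / Member since: 2022",
        "chat_query": "Why was I charged twice this month?",
    },
)
print(completion)
```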
Using variables is just the start! Portkey supports multiple Mustache tags that let you extend the template functionality:
Supported Tags
Tag | Functionality | Example
---|---|---
`{{variable}}` | Variable | Template: `Hi! My name is {{name}}. I work at {{company}}.`<br>Data: `{ "name": "Chris", "company": "GitHub" }`<br>Output: `Hi! My name is Chris. I work at GitHub.`
`{{#variable}} <string> {{/variable}}` | Render `<string>` only if variable is true or non-empty | Template: `Hello I am Tesla bot. {{#chat_mode_pleasant}}Excited to chat with you!{{/chat_mode_pleasant}} What can I help you with?`<br>Data: `{ "chat_mode_pleasant": false }`<br>Output: `Hello I am Tesla bot. What can I help you with?`
`{{^variable}} <string> {{/variable}}` | Render `<string>` only if variable is false or empty | Template: `Hello I am Tesla bot. {{^chat_mode_pleasant}}Excited to chat with you!{{/chat_mode_pleasant}} What can I help you with?`<br>Data: `{ "chat_mode_pleasant": false }`<br>Output: `Hello I am Tesla bot. Excited to chat with you! What can I help you with?`
`{{#variable}} {{sub_variable}} {{/variable}}` | Iteratively render all the values of `sub_variable` if `variable` is true or non-empty | Template: `Give atomic symbols for the following: {{#variable}} - {{sub_variable}} {{/variable}}`<br>Data: `{ "variable": [ { "sub_variable": "Gold" }, { "sub_variable": "Carbon" }, { "sub_variable": "Zinc" } ] }`<br>Output: `Give atomic symbols for the following: - Gold - Carbon - Zinc`
`{{! Comment}}` | Comments that are ignored | Template: `Hello I am Tesla bot. {{! How do tags work?}} What can I help you with?`<br>Output: `Hello I am Tesla bot. What can I help you with?`
`{{>Partials}}` | “Mini-templates” that can be called at runtime. On Portkey, you can save partials separately and call them in your prompt templates by typing `{{>` | Template: `Hello I am Tesla bot. {{>pp-tesla-template}} What can I help you with?`<br>Data in `pp-tesla-template`: `Take the context from {{context}}. And answer user questions.`<br>Output: `Hello I am Tesla bot. Take the context from {{context}}. And answer user questions. What can I help you with?`
`{{>>Partial Variables}}` | Pass your privately saved partials to Portkey by creating tags with double `>>`, like `{{>> }}`. This is helpful if you do not want to save your partials with Portkey but are maintaining them elsewhere | Template: `Hello I am Tesla bot. {{>>My Private Partial}} What can I help you with?`
Using Tags
You can directly pass your data object containing all the variable/tags info (in JSON) to Portkey’s `prompts.completions` method with the `variables` property.
For example, here’s a prompt partial containing the key instructions for an AI support bot:
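A sketch of what such a partial could contain (the instructions and the `{{context}}` variable below are illustrative):

```
You are a support assistant for Acme Inc.
Be polite, concise, and honest.
Answer strictly from the context provided here: {{context}}
If the answer is not in the context, say you don't know.
```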
And the prompt template uses the partial like this:
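Assuming the partial above was saved on Portkey with the (hypothetical) slug `pp-support-instructions`:

```
{{>pp-support-instructions}}

User query: {{user_query}}
```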
We can pass the data object inside the variables:
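A minimal sketch with the Python SDK - note that variables used inside the template and inside the partial are passed together in one object:

```python
from portkey_ai import Portkey

portkey = Portkey(api_key="PORTKEY_API_KEY")

completion = portkey.prompts.completions.create(
    prompt_id="YOUR_PROMPT_ID",
    variables={
        # consumed by the partial
        "context": "Returns are accepted within 30 days, unopened items only.",
        # consumed by the template
        "user_query": "Can I return a laptop I opened last week?",
    },
)
print(completion)
```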
Versioning Prompts
Whenever any changes are made to your prompt template, Portkey saves your changes in the browser, but they are not pushed to Portkey. You can click on the `Update` button on the top right to save the latest version of the prompt on Portkey.
All of your prompt versions can be seen on the right column of the playground:
You can `Restore` or `Publish` any of the previous versions by clicking on the ellipsis.
Using Different Prompt Versions
By default, when you pass the `PROMPT_ID` in the `prompts.completions.create` method, Portkey sends the request to the `Published` version of your prompt.
But you can also call any of the other prompt versions (that you can see on the right sidebar) by appending their version numbers to the `PROMPT_ID` slug.
For example,
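A sketch assuming the version number is appended to the slug with an `@` suffix (the prompt ID is a placeholder):

```python
from portkey_ai import Portkey

portkey = Portkey(api_key="PORTKEY_API_KEY")

# Pin the request to Version #12 of the prompt template.
completion = portkey.prompts.completions.create(
    prompt_id="YOUR_PROMPT_ID@12",
    variables={"user_query": "Hello!"},
)
```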
Here, I am sending my request to Version #12 of my prompt template. Portkey also has the `latest` tag that will always send the request to the latest available version of your prompt, regardless of whether it’s published or not.
- `latest` refers to the last version of the prompt; it may not be the same as the `Published` version of your prompt.
- When no suffix is provided, Portkey defaults to sending the request to the `Published` version of the prompt.
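Assuming the same `@` suffix, the `latest` tag can be requested like this:

```python
from portkey_ai import Portkey

portkey = Portkey(api_key="PORTKEY_API_KEY")

# Always hit the newest version, published or not -
# useful for experimentation without touching prod.
completion = portkey.prompts.completions.create(
    prompt_id="YOUR_PROMPT_ID@latest",
    variables={"user_query": "Hello!"},
)
```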
This feature allows you to easily switch between different versions of your prompts for experimenting or specific use cases without affecting your production environment.
Publishing Prompts
Updating the prompt does not automatically update your prompt in production. While updating, you can tick `Publish prompt changes`, which will also update your prompt deployment to the latest version.