You will need Portkey and Together AI API keys to get started.
```sh
pip install -qU portkey-ai openai
```
With OpenAI Client
```python
from openai import OpenAI
from portkey_ai import PORTKEY_GATEWAY_URL, createHeaders

openai = OpenAI(
    api_key='TOGETHER_API_KEY',  # Grab from https:
    base_url=PORTKEY_GATEWAY_URL,
    default_headers=createHeaders(
        provider="together-ai",
        api_key='PORTKEY_API_KEY'  # Grab from https:
    )
)

response = openai.chat.completions.create(
    model="meta-llama/Llama-3-8b-chat-hf",
    messages=[{"role": "user", "content": "What's a fractal?"}],
    max_tokens=500
)

print(response.choices[0].message.content)
```
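Rather than hardcoding placeholder strings like `'TOGETHER_API_KEY'`, you may prefer to read your keys from environment variables. A minimal sketch, assuming the variable names below (they are a common convention, not something either SDK requires):

```python
import os

# Assumed environment variable names; export them in your shell first.
together_api_key = os.environ.get("TOGETHER_API_KEY", "")
portkey_api_key = os.environ.get("PORTKEY_API_KEY", "")

# Pass these into OpenAI(api_key=together_api_key) and
# createHeaders(api_key=portkey_api_key) in place of the hardcoded strings.
```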
With Portkey Client
You can safely store your Together AI API key in Portkey and access models using Portkey's virtual keys.
```python
from portkey_ai import Portkey

portkey = Portkey(
    api_key='PORTKEY_API_KEY',  # Grab from https:
    virtual_key="together-virtual-key"  # Grab from https:
)

response = portkey.chat.completions.create(
    model='meta-llama/Llama-3-8b-chat-hf',
    messages=[{"role": "user", "content": "Who are you?"}],
    max_tokens=500
)

print(response.choices[0].message.content)
```
Monitoring your Requests
Using Portkey, you can monitor your Llama 3 requests and track tokens, cost, latency, and more.
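Portkey's dashboard tracks these metrics for you, but as an illustration of what is being measured, here is a hedged sketch that takes token counts (as reported in a chat completion's `usage` field) and estimates cost. The flat $0.20-per-million-token rate is an assumption for illustration only, not Together AI's actual pricing:

```python
def estimate_cost(prompt_tokens: int, completion_tokens: int,
                  price_per_million: float = 0.20) -> float:
    """Rough cost estimate; price_per_million is an assumed flat rate in USD."""
    total_tokens = prompt_tokens + completion_tokens
    return total_tokens / 1_000_000 * price_per_million

# With a real response object, you would pull the counts from its usage field:
#   usage = response.usage
#   cost = estimate_cost(usage.prompt_tokens, usage.completion_tokens)
print(estimate_cost(120, 380))
```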