AutoGen
AutoGen is a framework that enables the development of LLM applications using multiple agents that can converse with each other to solve tasks.
Find more information about AutoGen here: https://microsoft.github.io/autogen/docs/Getting-Started
Quick Start Integration
AutoGen supports a `config_list`, which defines the LLM provider and model to be used. Portkey integrates seamlessly into AutoGen through a custom entry in this config.
Example using minimal configuration
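Here is a minimal sketch of such a `config_list` (assuming the `pyautogen` and `portkey-ai` packages are installed; the model name and the `OPENAI_API_KEY` / `PORTKEY_API_KEY` placeholders are illustrative and should be replaced with your own values):

```python
import autogen
from portkey_ai import PORTKEY_GATEWAY_URL, createHeaders

config_list = [
    {
        "model": "gpt-4o",                      # illustrative model name
        "api_key": "OPENAI_API_KEY",            # your OpenAI API key
        "base_url": PORTKEY_GATEWAY_URL,        # route requests through Portkey's AI Gateway
        "default_headers": createHeaders(       # Portkey-specific headers
            api_key="PORTKEY_API_KEY",
            provider="openai",
        ),
    }
]

assistant = autogen.AssistantAgent("assistant", llm_config={"config_list": config_list})
user_proxy = autogen.UserProxyAgent(
    "user_proxy",
    human_input_mode="NEVER",
    code_execution_config={"use_docker": False},
)
user_proxy.initiate_chat(assistant, message="Say hello and summarize what you can do.")
```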
Notice that we updated the `base_url` to Portkey's AI Gateway and added `default_headers` to enable Portkey-specific features.
When we execute this script, it yields the same results as it would without Portkey, but every request can now be inspected in the Portkey Analytics & Logs UI, including token usage, cost, and accuracy calculations.
All the config parameters supported in Portkey can be passed as part of these headers. Let's look at some examples:
Using 100+ models in AutoGen through Portkey
Since Portkey seamlessly connects to 150+ models across providers, you can easily run any of them with AutoGen.
Let's see an example of Mistral-7B on Anyscale running seamlessly with AutoGen:
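A sketch of what the `config_list` could look like for this setup; the Anyscale model slug and key placeholders are illustrative, and the rest of the agent code stays the same as in the minimal example above:

```python
from portkey_ai import PORTKEY_GATEWAY_URL, createHeaders

config_list = [
    {
        "model": "mistralai/Mistral-7B-Instruct-v0.1",  # illustrative Anyscale model slug
        "api_key": "ANYSCALE_API_KEY",                  # your Anyscale API key
        "base_url": PORTKEY_GATEWAY_URL,
        "default_headers": createHeaders(
            api_key="PORTKEY_API_KEY",
            provider="anyscale",                        # tell the gateway to route to Anyscale
        ),
    }
]
```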
Using a Virtual Key
Virtual keys in Portkey let you switch between providers without having to manually store and update their API keys. Let's use the same Mistral example as above, but this time with a Virtual Key.
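A sketch of the same config using a virtual key instead of the raw provider key; `ANYSCALE_VIRTUAL_KEY` is a placeholder for the virtual key you create in the Portkey dashboard:

```python
from portkey_ai import PORTKEY_GATEWAY_URL, createHeaders

config_list = [
    {
        "model": "mistralai/Mistral-7B-Instruct-v0.1",
        "api_key": "placeholder",                # required by the client; the gateway uses the virtual key instead
        "base_url": PORTKEY_GATEWAY_URL,
        "default_headers": createHeaders(
            api_key="PORTKEY_API_KEY",
            virtual_key="ANYSCALE_VIRTUAL_KEY",  # provider credentials stored in Portkey
        ),
    }
]
```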
Using Configs
Configs in Portkey unlock advanced management and routing functionality, including load balancing, fallbacks, canary testing, model switching, and more.
You can use Portkey configs in AutoGen like this:
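For example, here is a sketch that attaches a config saved in the Portkey dashboard by its ID (`pc-your-config-id` is a placeholder); the gateway then applies the routing rules defined in that config, such as fallbacks or load balancing:

```python
from portkey_ai import PORTKEY_GATEWAY_URL, createHeaders

config_list = [
    {
        "model": "gpt-4o",                       # illustrative; the attached config may control routing
        "api_key": "placeholder",
        "base_url": PORTKEY_GATEWAY_URL,
        "default_headers": createHeaders(
            api_key="PORTKEY_API_KEY",
            config="pc-your-config-id",          # ID of a config created in the Portkey app
        ),
    }
]
```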