C# (.NET)
Integrate Portkey into your .NET app using the OpenAI library and get advanced monitoring, routing, and enterprise features.
Building Enterprise LLM Apps with .NET
.NET is Microsoft's battle-tested framework, trusted by Fortune 500 companies. It's now easier than ever to build LLM apps with it. You get:
| Feature | Description |
| --- | --- |
| Battle-Tested Security | Built-in identity management, secret rotation, and compliance standards |
| Production Performance | High-throughput processing with advanced memory management |
| Azure Integration | Seamless Azure OpenAI and Active Directory support |
Combined with Portkey’s enterprise features, you get everything needed for mission-critical LLM deployments. Monitor costs, ensure reliability, maintain compliance, and scale with confidence.
Portkey Features
| Feature | Description |
| --- | --- |
| Complete Observability | Monitor costs, latency, and performance metrics |
| Provider Flexibility | Route to 250+ LLMs (Claude, Gemini, Llama, self-hosted models, and more) without code changes |
| Smart Caching | Reduce costs and response time by caching frequent requests |
| High Reliability | Automatic fallback and load balancing across providers |
| Prompt Management | Use Portkey as a centralized hub to version and experiment with prompts, and call them using a single ID |
| Continuous Improvement | Improve your app by capturing and analyzing user feedback |
| Enterprise Ready | Budget controls, rate limits, model provisioning, and role-based access |
Supported Clients
| Client | Status |
| --- | --- |
| ChatClient | ✅ Fully Supported |
| EmbeddingClient | ✅ Fully Supported |
| ImageClient | 🚧 Coming Soon |
| BatchClient | 🚧 Coming Soon |
| AudioClient | 🚧 Coming Soon |
Implementation Overview
- Install OpenAI SDK
- Create Portkey client by extending OpenAI client
- Use the client in your application to make requests
1. Install the NuGet package
Add the OpenAI NuGet package to your .NET project:
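For example, via the .NET CLI (Visual Studio's NuGet Package Manager works just as well):

```bash
dotnet add package OpenAI
```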
2. Create Portkey Client Extension
The OpenAI package does not support directly modifying the base URL or passing additional headers, so we write a simple extension around OpenAI's ChatClient or EmbeddingClient to create a new PortkeyClient.
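Below is a minimal sketch of such an extension, assuming the OpenAI 2.x SDK (which builds on System.ClientModel). The `PortkeyHeadersPolicy` and `PortkeyClient` names are illustrative helpers, not part of either SDK: the policy injects Portkey headers into every request, and the factory points the client at the Portkey gateway (https://api.portkey.ai/v1).

```csharp
using System;
using System.Collections.Generic;
using System.ClientModel;
using System.ClientModel.Primitives;
using System.Threading.Tasks;
using OpenAI;
using OpenAI.Chat;
using OpenAI.Embeddings;

// Pipeline policy that adds Portkey headers (x-portkey-*) to every outgoing request.
public class PortkeyHeadersPolicy : PipelinePolicy
{
    private readonly IDictionary<string, string> _headers;

    public PortkeyHeadersPolicy(IDictionary<string, string> headers) => _headers = headers;

    public override void Process(PipelineMessage message, IReadOnlyList<PipelinePolicy> pipeline, int currentIndex)
    {
        foreach (var header in _headers)
            message.Request.Headers.Set(header.Key, header.Value);
        ProcessNext(message, pipeline, currentIndex);
    }

    public override async ValueTask ProcessAsync(PipelineMessage message, IReadOnlyList<PipelinePolicy> pipeline, int currentIndex)
    {
        foreach (var header in _headers)
            message.Request.Headers.Set(header.Key, header.Value);
        await ProcessNextAsync(message, pipeline, currentIndex);
    }
}

// Factory that builds OpenAI clients routed through the Portkey gateway.
public static class PortkeyClient
{
    private static OpenAIClientOptions BuildOptions(IDictionary<string, string> portkeyHeaders)
    {
        var options = new OpenAIClientOptions { Endpoint = new Uri("https://api.portkey.ai/v1") };
        options.AddPolicy(new PortkeyHeadersPolicy(portkeyHeaders), PipelinePosition.PerCall);
        return options;
    }

    public static ChatClient CreateChatClient(string model, string apiKey, IDictionary<string, string> portkeyHeaders)
        => new ChatClient(model, new ApiKeyCredential(apiKey), BuildOptions(portkeyHeaders));

    public static EmbeddingClient CreateEmbeddingClient(string model, string apiKey, IDictionary<string, string> portkeyHeaders)
        => new EmbeddingClient(model, new ApiKeyCredential(apiKey), BuildOptions(portkeyHeaders));
}
```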
3. Use the Portkey Client
After creating the extension above, you can pass any Portkey-supported headers directly while creating the new client. The example below shows common headers, but you can pass any Portkey-supported header to enable features like custom metadata, fallbacks, caching, retries, and more.
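Here is an illustrative usage of the helper sketched in step 2. The header names (x-portkey-api-key, x-portkey-provider) are standard Portkey headers; the model and key values are placeholders.

```csharp
// Portkey headers: the API key identifies your Portkey workspace,
// and the provider tells the gateway where to route the request.
// If you use a Portkey virtual key instead, pass x-portkey-virtual-key
// and the provider API key below can be a placeholder.
var portkeyHeaders = new Dictionary<string, string>
{
    { "x-portkey-api-key", Environment.GetEnvironmentVariable("PORTKEY_API_KEY") },
    { "x-portkey-provider", "openai" }
};

ChatClient client = PortkeyClient.CreateChatClient(
    model: "gpt-4o",
    apiKey: Environment.GetEnvironmentVariable("OPENAI_API_KEY"),
    portkeyHeaders: portkeyHeaders);

ChatCompletion completion = client.CompleteChat("Say this is a test.");
Console.WriteLine(completion.Content[0].Text);
```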
4. View Your Request in Portkey Logs
The request made above will now be logged on Portkey.
Chat Completions Example
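A fuller chat sketch, again using the illustrative PortkeyClient helper from step 2; the message types and CompleteChat come from the OpenAI.Chat namespace.

```csharp
using OpenAI.Chat;

ChatClient chat = PortkeyClient.CreateChatClient(
    model: "gpt-4o",
    apiKey: Environment.GetEnvironmentVariable("OPENAI_API_KEY"),
    portkeyHeaders: portkeyHeaders);

// Combine a system prompt with a user message.
var messages = new List<ChatMessage>
{
    new SystemChatMessage("You are a helpful assistant."),
    new UserChatMessage("What is the capital of France?")
};

ChatCompletion completion = chat.CompleteChat(messages);
Console.WriteLine(completion.Content[0].Text);
```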
Embedding Example
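An embedding sketch with the same helper; GenerateEmbedding and ToFloats are from the OpenAI 2.x Embeddings API.

```csharp
using OpenAI.Embeddings;

EmbeddingClient embeddings = PortkeyClient.CreateEmbeddingClient(
    model: "text-embedding-3-small",
    apiKey: Environment.GetEnvironmentVariable("OPENAI_API_KEY"),
    portkeyHeaders: portkeyHeaders);

// Generate an embedding vector for a single input string.
OpenAIEmbedding embedding = embeddings.GenerateEmbedding("The quick brown fox jumps over the lazy dog.");
ReadOnlyMemory<float> vector = embedding.ToFloats();
Console.WriteLine($"Embedding dimensions: {vector.Length}");
```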
Microsoft Semantic Kernel Example
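Semantic Kernel can reuse the same gateway setup by handing it a pre-configured OpenAIClient. This sketch assumes the Microsoft.SemanticKernel package and its AddOpenAIChatCompletion overload that accepts an OpenAIClient, and reuses the PortkeyHeadersPolicy and headers dictionary from the earlier sketches.

```csharp
using System.ClientModel;
using System.ClientModel.Primitives;
using Microsoft.SemanticKernel;
using OpenAI;

// Build an OpenAIClient that points at the Portkey gateway with Portkey headers attached.
var options = new OpenAIClientOptions { Endpoint = new Uri("https://api.portkey.ai/v1") };
options.AddPolicy(new PortkeyHeadersPolicy(portkeyHeaders), PipelinePosition.PerCall);
var openAIClient = new OpenAIClient(
    new ApiKeyCredential(Environment.GetEnvironmentVariable("OPENAI_API_KEY")),
    options);

// Register it with Semantic Kernel as the chat completion service.
var kernel = Kernel.CreateBuilder()
    .AddOpenAIChatCompletion("gpt-4o", openAIClient)
    .Build();

var result = await kernel.InvokePromptAsync("Write a haiku about the .NET runtime.");
Console.WriteLine(result);
```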
More Features
You can also use the PortkeyClient to send async requests:
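For example, using CompleteChatAsync, the async counterpart in the OpenAI SDK, with the illustrative helper from above:

```csharp
ChatClient client = PortkeyClient.CreateChatClient(
    model: "gpt-4o",
    apiKey: Environment.GetEnvironmentVariable("OPENAI_API_KEY"),
    portkeyHeaders: portkeyHeaders);

// Await the completion instead of blocking the calling thread.
ChatCompletion completion = await client.CompleteChatAsync("Summarize async/await in one sentence.");
Console.WriteLine(completion.Content[0].Text);
```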
Next Steps
- Call local models
- Enable cache
- Set up fallbacks
- Load balance requests across multiple instances
- Append metadata with requests
Need Help?
Ping the Portkey team on our Developer Forum or email us at [email protected]