# Portkey Features

Source: https://docs.portkey.ai/docs/introduction/feature-overview

Explore the powerful features of Portkey.

## AI Gateway

Connect to 250+ AI models using a single consistent API. Set up load balancers, automated fallbacks, caching, conditional routing, and more, seamlessly.

* Integrate with multiple AI models through a single API
* Implement simple and semantic caching for improved performance
* Set up automated fallbacks for enhanced reliability
* Handle various data types with multimodal AI capabilities
* Implement automatic retries for improved resilience
* Configure per-strategy circuit protection and failure handling
* Distribute workload efficiently across multiple models
* Manage access to LLMs
* Set and manage request timeouts
* Implement canary testing for safe deployments
* Route requests based on specific conditions
* Set and manage budget limits

## Observability & Logs

Gain real-time insights, track key metrics, and streamline debugging with our OpenTelemetry-compliant system.

* Access and analyze detailed logs
* Implement distributed tracing for request flows
* Gain insights through comprehensive analytics
* Apply filters for targeted analysis
* Manage and utilize metadata effectively
* Collect and analyze user feedback

## Prompt Library

Collaborate with team members to create, templatize, and version prompt templates easily. Experiment across 250+ LLMs with a strong publish/release flow to deploy the prompts.

* Create and manage reusable prompt templates
* Utilize modular prompt components
* Advanced prompting with JSON mode

## Guardrails

Enforce real-time LLM behavior with 50+ state-of-the-art AI guardrails, so that you can synchronously run guardrails on your requests and route them with precision.
* Implement rule-based safety checks
* Leverage AI for advanced content filtering
* Integrate third-party safety solutions
* Customize guardrails to your needs

## Agents

Natively integrate Portkey's gateway, guardrails, and observability suite with leading agent frameworks and take them to production.

## More Resources

* Compare different Portkey subscription plans
* Join our community of developers
* Explore our comprehensive API documentation
* Learn about our enterprise solutions
* Contribute to our open-source projects

# Make Your First Request

Source: https://docs.portkey.ai/docs/introduction/make-your-first-request

Integrate Portkey and analyze your first LLM call in 2 minutes!

## 1. Get your Portkey API Key

[Create](https://app.portkey.ai/signup) or [log in](https://app.portkey.ai/login) to your Portkey account, then grab your account's API key from the "Settings" page. Based on your access level, the API key modal shows the relevant permissions: tick the ones you'd like, name your API key, and save it.

## 2. Integrate Portkey

Portkey offers a variety of integration options, including SDKs, REST APIs, and native connections with platforms such as OpenAI, Langchain, and LlamaIndex.

### Through the OpenAI SDK

If you're using the **OpenAI SDK**, import the Portkey SDK and configure it within your OpenAI client object.

### Portkey SDK

You can also use the **Portkey SDK / REST APIs** directly to make chat completion calls. This is a more versatile way to make LLM calls across any provider.

Once the integration is ready, you can see your requests reflected on the Portkey dashboard.
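To make the REST path above concrete, here is a minimal sketch of a chat completion request sent through Portkey's gateway, using only the Python standard library. The endpoint URL and the `x-portkey-*` header names follow Portkey's documented conventions, but treat the model name and the key values as placeholders: substitute your own Portkey API key, provider, and provider credentials.

```python
import json
import urllib.request

# Portkey's OpenAI-compatible chat completions endpoint.
PORTKEY_GATEWAY_URL = "https://api.portkey.ai/v1/chat/completions"


def build_request(portkey_api_key: str, provider_api_key: str) -> urllib.request.Request:
    """Assemble the HTTP request without sending it."""
    payload = {
        "model": "gpt-4o-mini",  # any model your chosen provider supports
        "messages": [{"role": "user", "content": "Hello from Portkey!"}],
    }
    headers = {
        "Content-Type": "application/json",
        "x-portkey-api-key": portkey_api_key,           # your Portkey account key
        "x-portkey-provider": "openai",                 # which upstream provider to route to
        "Authorization": f"Bearer {provider_api_key}",  # the provider's own key
    }
    return urllib.request.Request(
        PORTKEY_GATEWAY_URL, data=json.dumps(payload).encode(), headers=headers
    )


req = build_request("PORTKEY_API_KEY", "OPENAI_API_KEY")
# urllib.request.urlopen(req)  # uncomment to actually send the request
```

Because the request targets Portkey's gateway rather than the provider directly, the same shape works across providers: change the `x-portkey-provider` header and the `Authorization` credential, and the call, the logs, and the dashboard view stay the same.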