Portkey
Portkey is an AI gateway platform offering unified API access, observability, guardrails, and prompt management for teams running LLMs in production.
Reviewed by Mathijs Bronsdijk · Updated Apr 13, 2026

What is Portkey?
Portkey is an open-source AI gateway and production platform designed for developers and engineering teams building applications with large language models. It provides a unified API that connects to 1,600+ models from providers like OpenAI, Anthropic, Google, Groq, Azure, and AWS Bedrock, replacing scattered integrations with a single control plane. Beyond routing, Portkey adds observability, guardrails, prompt management, and governance tooling so teams can take AI applications from prototype to production without rebuilding infrastructure from scratch. It currently powers 3,000+ GenAI teams and handles millions of requests daily with a stated 99.99% uptime.
Key Features
- Unified API: Access 1,600+ LLMs and multimodal models (vision, audio, image generation, speech) from 250+ providers through one OpenAI-compatible endpoint, with no code changes required when switching models.
- AI Gateway Orchestration: Built-in load balancing, automatic retries, fallbacks, conditional routing, and circuit breakers keep applications running even when individual providers have outages or rate limits.
- Observability Dashboard: Full-stack logging with OpenTelemetry compliance, tracking 40+ metrics including cost, latency, and accuracy for every request, with real-time insights and debugging tools.
- Guardrails: 60+ filters for real-time safety and compliance checks, with options for custom webhook integrations to apply organization-specific rules before or after model responses.
- Semantic and Simple Caching: Reduces redundant API calls by caching responses, which lowers both latency and costs, particularly useful for repeated or similar queries.
- Prompt Management: A collaborative prompt library with versioning, templates, variables, playground testing, and publish/release workflows for managing prompts across models and teams.
- Virtual Key Management: Stores, rotates, and revokes provider API keys securely, with usage monitoring and granular budget and rate limit controls per team or use case.
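Because the unified API is OpenAI-compatible, an existing OpenAI client can be pointed at the gateway instead of a provider directly. A minimal sketch of the header shape, assuming Portkey's documented base URL (`https://api.portkey.ai/v1`) and `x-portkey-*` header names at the time of writing; verify both against the current docs before relying on them:

```python
def portkey_headers(portkey_api_key: str, provider: str) -> dict:
    """Headers that route an OpenAI-compatible request through the gateway.

    Header names follow Portkey's documentation at the time of writing
    and are an assumption here, not a guarantee.
    """
    return {
        "x-portkey-api-key": portkey_api_key,  # authenticates with the gateway
        "x-portkey-provider": provider,        # e.g. "openai", "anthropic"
    }

# Usage with the official OpenAI SDK (requires `pip install openai`):
#
# from openai import OpenAI
# client = OpenAI(
#     api_key="PROVIDER_API_KEY",            # your model provider's key
#     base_url="https://api.portkey.ai/v1",  # gateway endpoint, not the provider's
#     default_headers=portkey_headers("PORTKEY_API_KEY", "openai"),
# )
# client.chat.completions.create(model="gpt-4o-mini", messages=[...])
```

Switching providers then becomes a header change rather than a code rewrite, which is the "no code changes required" claim above in practice.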
Use Cases
- Multi-provider AI engineering teams: Teams integrating models from several providers use Portkey to manage routing, fallbacks, and cost tracking from one interface, avoiding vendor lock-in and reducing integration overhead.
- Compliance and security teams at enterprises: Organizations in regulated industries such as healthcare, finance, and insurance use Portkey's governance features, including SOC 2, ISO 27001, GDPR, and HIPAA compliance controls, to standardize and audit AI access across departments.
- Platform teams managing many GenAI use cases: One documented example involves a company handling 30 million insurance policies monthly across 25+ GenAI use cases, using Portkey for prompt management, per-use-case cost tracking, and key security.
- AI agent developers: Teams building autonomous agents use Portkey's MCP client/server hub and agent observability tooling to monitor, debug, and govern agent behavior in production.
- FinOps and cost-optimization teams: By routing to cheaper or faster models based on task requirements and using caching for repeated requests, teams reduce overall LLM spend without manual intervention.
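Routing and fallback behavior like the above is typically declared as a gateway config rather than coded by hand. A sketch in the strategy-plus-targets shape Portkey's docs describe, with hypothetical virtual key names; the field names and the `x-portkey-config` header are assumptions based on the docs at the time of writing:

```python
import json

# Illustrative fallback config: try targets in order until one succeeds.
# "virtual_key" values are placeholders for keys you would create in the UI.
fallback_config = {
    "strategy": {"mode": "fallback"},
    "targets": [
        {"virtual_key": "openai-prod"},       # primary provider
        {"virtual_key": "anthropic-backup"},  # used if the primary fails
    ],
}

# Configs are typically attached per-request as a JSON-encoded header
# (reportedly `x-portkey-config`) or saved and referenced by ID.
config_header = json.dumps(fallback_config)
```

Keeping this logic in config rather than application code is what lets platform teams change routing for many use cases without redeploying anything.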
Strengths and Weaknesses
Strengths:
- Easy integration with LLM providers (cited in eight G2 review mentions), with a stated two-minute setup time.
- Intuitive dashboard and analytics that make monitoring, cost tracking, and troubleshooting accessible without building custom tooling.
- Cost optimization through caching and routing, with user reports of meaningful savings on repeated API calls such as test reruns in CI/CD workflows.
- Strong security posture backed by SOC 2 Type 2, ISO 27001, GDPR, and HIPAA certifications, with responsive engineering support noted by reviewers.
- Unified gateway with guardrails, fallbacks, and load balancing reduces reliability work that teams would otherwise build themselves.
Weaknesses:
- Documentation quality is a recurring complaint, with 4 G2 mentions noting it hinders onboarding and forces self-troubleshooting.
- Software bugs and the sheer breadth of features can overwhelm newcomers, with multiple reviewers citing a steep learning curve.
- Independent benchmarks show significantly higher latency compared to more specialized gateways, which can be a problem for latency-sensitive applications.
- Advanced analytics and data export capabilities are limited, with users noting missing functionality for deeper reporting and analysis.
Pricing
- Developer: Free forever, 10,000 recorded logs per month (3-day log retention, 30-day metrics), AI Gateway, basic observability, 3 prompt templates, simple caching, deterministic guardrails, community support. Intended for prototyping and evaluation, not production workloads.
- Production: $49/month, 100,000 recorded logs per month (+$9 per additional 100,000), 30-day log retention, 90-day metrics, full observability with alerts, LLM and partner guardrails, unlimited prompt templates, role-based access control, semantic and simple caching, production support.
- Enterprise: Custom pricing, 10 million+ recorded logs per month, custom retention periods, custom guardrail hooks, advanced evaluation templates, SSO, granular budget and rate limits, private cloud deployment, VPC hosting, data export to data lakes, advanced compliance (SOC 2 Type 2, GDPR, HIPAA), custom BAAs, data isolation, dedicated onboarding and priority support.
FAQ
What is Portkey?
Portkey is an open-source AI gateway and production platform that provides a unified API for 1,600+ LLMs, along with observability, guardrails, prompt management, and governance tools for teams running AI applications at scale.
Who makes Portkey?
Portkey was founded by Ayush Garg and Rohit Agarwal and launched in March 2023.
How many models does Portkey support?
Portkey supports access to 1,600+ LLMs and multimodal models from 250+ providers, including OpenAI, Anthropic, Google, Groq, Azure, and AWS Bedrock.
Is Portkey open source?
Yes, Portkey has an open-source version of its AI gateway available, alongside its commercial hosted service and enterprise plans.
Is Portkey free to use?
Portkey offers a free Developer plan that includes 10,000 recorded logs per month and core gateway features, intended for prototyping and evaluation rather than production workloads.
How much does Portkey cost?
The Production plan starts at $49/month. Enterprise pricing is custom and requires contacting the Portkey sales team.
What is an AI gateway and why does it matter?
An AI gateway acts as a single access point between your application and multiple AI model providers. It handles routing, retries, fallbacks, caching, and monitoring so teams do not need to build and maintain those capabilities themselves for each provider.
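To make the retry-and-fallback part concrete, here is a minimal, generic sketch of what a gateway automates per request; this is illustrative logic, not Portkey's implementation, and a real gateway would inspect HTTP status codes rather than catch all exceptions:

```python
import time

def call_with_fallback(providers, prompt, retries=2, backoff=0.5):
    """Try each provider in preference order, retrying with exponential
    backoff, and fall back to the next provider on persistent failure.

    `providers` is a list of callables standing in for provider clients.
    """
    last_error = None
    for provider in providers:
        for attempt in range(retries + 1):
            try:
                return provider(prompt)
            except Exception as err:  # real gateways distinguish 429s, 5xx, etc.
                last_error = err
                time.sleep(backoff * (2 ** attempt))
    raise RuntimeError("all providers failed") from last_error
```

Multiply this by every provider, plus caching, logging, and key management, and the value of centralizing it in one gateway becomes clear.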
Does Portkey support AI agents?
Yes, Portkey includes native agent observability and an MCP client/server hub for building and monitoring AI agents, with features for prompt engineering, evaluations, and compliance in agentic workflows.
What compliance certifications does Portkey have?
Portkey holds SOC 2 Type 2, ISO 27001, GDPR, and HIPAA certifications. Enterprise plans include custom BAAs and data isolation options for organizations with strict regulatory requirements.
What are the main complaints users have about Portkey?
The most commonly cited issues on G2 include poor documentation that slows onboarding, software bugs, and the complexity of the feature set overwhelming new users. Independent benchmarks have also noted higher latency compared to more specialized gateway tools.
How does Portkey compare to alternatives like Langfuse or Kong AI Gateway?
Portkey is broader in scope than Langfuse, which focuses primarily on observability and evaluation. Kong AI Gateway is noted in benchmarks to have lower latency than Portkey. Portkey's advantage is its all-in-one approach combining routing, caching, guardrails, prompt management, and governance in a single platform.
What integrations does Portkey support?
Portkey integrates with GitHub, Docker, MongoDB, and Auth0, and offers an API. It supports TypeScript and JavaScript, and runs on Web, macOS, and Windows platforms.
Can Portkey help reduce LLM costs?
Yes, Portkey reduces costs through semantic and simple caching (avoiding redundant API calls), model routing to cheaper providers for appropriate tasks, and billing analytics that show cost breakdowns per use case or team.
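The "simple" (exact-match) half of that caching is easy to picture; a toy sketch, not Portkey's implementation (semantic caching would instead match prompts by embedding similarity rather than exact text):

```python
import hashlib

class SimpleResponseCache:
    """Exact-match response cache: identical (model, prompt) pairs
    hit the cache instead of triggering a paid API call."""

    def __init__(self):
        self._store = {}

    def _key(self, model: str, prompt: str) -> str:
        return hashlib.sha256(f"{model}:{prompt}".encode()).hexdigest()

    def get_or_call(self, model, prompt, call):
        key = self._key(model, prompt)
        if key not in self._store:          # miss: pay for the provider call
            self._store[key] = call(model, prompt)
        return self._store[key]             # hit: free and near-instant
```

This is why repeated requests such as CI/CD test reruns, mentioned under Strengths, are where users report the clearest savings.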
What is the uptime guarantee for Portkey?
Portkey states 99.99% uptime and handles millions of requests daily using an edge architecture designed for traffic spikes.
Who is Portkey best suited for?
Portkey is most suitable for engineering and AI teams at companies that use multiple LLM providers, need production reliability, and require centralized governance over model access, costs, and compliance. It is used by both startups and Fortune 500 companies across industries including healthcare, finance, retail, and insurance.