The best AI agents don’t just write code; they ship applications. But most infrastructure wasn’t built for that. The agent scaffolds your app, then stops. One shot becomes ten errands: create an OpenAI account; add these env vars; set up a database; add more variables; and so it goes.
We’ve been building Netlify to be the platform where agents can one-shot real applications. Today, AI Gateway makes that more complete. AI model access is now built into the platform, generally available for all Netlify users on credit-based plans.
The platform to one-shot
Think about what a real application actually needs: backend logic, scheduled tasks, background processing, file storage, a database, user input handling, and increasingly, AI.
With Netlify, an agent can build all of that without creating a single external account:
- Functions for backend logic—serverless, scheduled, long-running, or edge
- Blobs for files, images, and assets
- DB for persistent database storage
- Forms for collecting user input without writing backend code
- AI Gateway for calling OpenAI, Anthropic, and Gemini models
That’s a complete, ready-to-scale stack on one platform. All from one prompt. No stopping to set up external services, no credentials to juggle across providers, no stitching together infrastructure that wasn’t made to work together.
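To make that concrete, here’s a hypothetical sketch of two of those pieces working together: a Function that stores user-submitted text in Blobs. The file path, store name, and request shape are illustrative assumptions, not part of any official example.

```js
// netlify/functions/save-note.mjs (hypothetical path)
// A minimal sketch: persist user input to a named Blobs store from a Function.
import { getStore } from '@netlify/blobs';

export default async (req) => {
  const store = getStore('notes');        // named blob store (name is illustrative)
  const { id, text } = await req.json();  // assumes a JSON body like { id, text }
  await store.set(id, text);              // write the note to Blobs
  return Response.json({ saved: id });    // confirm what was stored
};
```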
What AI Gateway does
AI Gateway handles the infrastructure side of AI model access so you can focus on building. Here’s what it looks like in practice:
```js
import Anthropic from '@anthropic-ai/sdk';

// No API keys or base URL configuration needed.
// Netlify AI Gateway injects credentials automatically.
const anthropic = new Anthropic();

async function callAnthropic() {
  const message = await anthropic.messages.create({
    model: 'claude-opus-4.5',
    max_tokens: 1024,
    messages: [{ role: 'user', content: 'Happy holidays.' }],
  });
  return message;
}
```

That’s it. No API key configuration, no environment variables to set up, no account to create with Anthropic. Deploy this function to Netlify and it works.
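As a quick illustration of what “deploy this function” might look like, here’s a minimal, hypothetical sketch of exposing that call as an HTTP endpoint with a Netlify Function. The file path is an assumption, not part of the original example.

```js
// netlify/functions/holiday-greeting.mjs (hypothetical path)
import Anthropic from '@anthropic-ai/sdk';

const anthropic = new Anthropic(); // credentials injected by AI Gateway, as above

export default async () => {
  const message = await anthropic.messages.create({
    model: 'claude-opus-4.5',
    max_tokens: 1024,
    messages: [{ role: 'user', content: 'Happy holidays.' }],
  });
  return Response.json(message); // return the model's response as JSON
};
```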
Here’s what’s happening behind the scenes:
Automatic key management. Credentials for OpenAI, Anthropic, and Gemini are injected automatically. The official client libraries pick them up without any configuration.
Simplified billing. AI usage shows up on your Netlify bill alongside everything else. One vendor relationship, one invoice.
Fast experimentation. Access to new providers without creating new accounts. When you want to test a different model, the infrastructure is already there.
Built-in governance. Rate limiting and credit controls protect against runaway costs. For teams, this means developers can experiment with AI features without anyone worrying about surprise bills or uncontrolled access. The guardrails are all in the right place.
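As a sketch of what that fast experimentation looks like in practice, the same zero-config pattern carries over to other providers. This example assumes, as described above, that the gateway injects the variables the official openai package reads; the model name is illustrative.

```js
// A hypothetical sketch of the same zero-config pattern with OpenAI.
import OpenAI from 'openai';

const openai = new OpenAI(); // no apiKey argument; the gateway supplies credentials

export default async () => {
  const completion = await openai.chat.completions.create({
    model: 'gpt-4o-mini', // illustrative model choice
    messages: [{ role: 'user', content: 'Happy holidays.' }],
  });
  return Response.json(completion);
};
```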
If you’d rather use your own API keys, just set your own environment variables and Netlify won’t override them. You can opt out per provider or entirely, whichever works best for you.
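In code terms, opting out requires no changes at all if you set the provider’s standard environment variable yourself; you can also pass a key explicitly if you want to be unambiguous about which one is used. A minimal sketch:

```js
// A sketch of bringing your own key for one provider.
// If you set ANTHROPIC_API_KEY yourself, the SDK reads it and, per the above,
// Netlify won't override it; passing it explicitly here is optional.
import Anthropic from '@anthropic-ai/sdk';

const anthropic = new Anthropic({
  apiKey: process.env.ANTHROPIC_API_KEY, // your key, not the gateway's
});
```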
Built for one-shot apps
AI tools aren’t just autocompleting code anymore; they’re generating entire apps. The bottleneck isn’t writing the code; it’s everything that comes after it.
AI Gateway removes one of those new friction points: getting AI capabilities into production without creating external accounts or managing separate credentials and billing. Your agent generates a function that calls the model, you deploy it, and Netlify handles the rest.
Get started
AI Gateway is available now on all credit-based plans. Free plans include 300 credits to use across the platform, including AI inference. Check out the quickstart or a TanStack Start demo to get up and running quickly.
The best agent experience isn’t just about what the agent can write. Every primitive we add to Netlify, from storage and compute to inference, serves the same goal: providing a platform where “build me an app” results in a real app.


