Today, we’re announcing the ability to deploy the Grafbase Gateway to your own infrastructure. This means flexibility and control for our users, enabling you to leverage the full power of the Grafbase Gateway in environments tailored to your specific needs.
By running Grafbase Gateway in your own environment, you gain the benefits of the core of the Grafbase platform, but on your own infrastructure:
- Deployment Flexibility: Deploy your GraphQL API to the infrastructure platform of your choice, including Cloudflare Workers, Deno, Lambda, or Docker containers. Tailor the Grafbase Gateway to your unique performance requirements, ensuring optimal efficiency and speed for your workloads.
- Full Operational Control: Manage the Grafbase Gateway with your existing tools and processes, integrating it smoothly into your operational workflows.
- Scalability on Your Terms: Scale Grafbase according to your own growth trajectory and resource availability without depending on external cloud limitations.
The Grafbase Gateway can be started with the following command:
```bash
GRAFBASE_ACCESS_TOKEN=token ./grafbase-gateway \
  --config grafbase.toml \
  --graph-ref graph@branch
```
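The `--config` flag points the gateway at a `grafbase.toml` file. As a rough sketch only (the keys below are illustrative assumptions; consult the Grafbase Gateway configuration reference for the authoritative schema), a minimal configuration might look like:

```toml
# Minimal gateway configuration sketch.
# Keys are illustrative assumptions; verify against the Grafbase docs.
[network]
# Address and port the gateway listens on
listen_address = "0.0.0.0:5000"
```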
The Grafbase Gateway can also be run in a Docker container. Here's an example docker-compose.yml file:
```yaml
version: '3'
services:
  grafbase:
    image: ghcr.io/grafbase/gateway:latest
    restart: always
    volumes:
      - ./grafbase.toml:/etc/grafbase.toml
    environment:
      GRAFBASE_GRAPH_REF: 'graph-ref@branch'
      GRAFBASE_ACCESS_TOKEN: 'ACCESS_TOKEN_HERE'
    ports:
      - '5000:5000'
```
Soon, we’ll extend this ability so you can run the entire Grafbase platform on your own infrastructure, not just the Gateway. You can read this post for more details on the Grafbase Engine. Reach out to us on our Discord and tell us how you’re deploying Grafbase.