Deploying OpenAI-compatible applications to cloud-native managed language models with Defang
Defang makes it easy to deploy your application on your favourite cloud's managed LLM service using our OpenAI Access Gateway. The gateway sits between your application and the cloud provider and acts as a compatibility layer: it accepts incoming OpenAI-style requests, translates them to the appropriate cloud-native API, handles the native response, and reconstructs an OpenAI-compatible response.
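To make the translation step concrete, here is an illustrative sketch of the compatibility-layer idea (not the gateway's actual code). It maps an OpenAI-style chat-completion request onto a Bedrock Converse-style payload and re-wraps the native reply as an OpenAI-compatible response; the exact field names on the native side are assumptions for illustration.

```python
# Sketch of the request/response translation a compatibility layer performs.
# Native-side field names here are illustrative, not the gateway's real schema.

def openai_to_native(request: dict) -> dict:
    """Map an OpenAI-style chat request onto a cloud-native payload."""
    return {
        "modelId": request["model"],
        "messages": [
            {"role": m["role"], "content": [{"text": m["content"]}]}
            for m in request["messages"]
        ],
    }

def native_to_openai(reply: dict, model: str) -> dict:
    """Re-wrap a cloud-native reply as an OpenAI-compatible response."""
    text = reply["output"]["message"]["content"][0]["text"]
    return {
        "object": "chat.completion",
        "model": model,
        "choices": [
            {"index": 0, "message": {"role": "assistant", "content": text}}
        ],
    }
```

Your application only ever sees the OpenAI-shaped request and response; the gateway performs both conversions in between.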
See our tutorial, which describes how to configure the OpenAI Access Gateway for your application.
Current Support
| Provider | Managed Language Models |
| --- | --- |
| Playground | ❌ |
| AWS Bedrock | ✅ |
| DigitalOcean GenAI | ❌ |
| GCP Vertex | ❌ |
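As a rough sketch of how this fits into a deployment, a compose file might declare an LLM-backed service alongside the application; the service name, model identifier, and the `x-defang-llm` extension field shown here are assumptions for illustration, so check the tutorial for the exact configuration.

```yaml
# Hypothetical compose sketch: an app talking to a managed LLM via the gateway.
services:
  app:
    build: .
    environment:
      # The app uses the standard OpenAI client, pointed at the gateway.
      OPENAI_BASE_URL: http://llm/api/v1
  llm:
    x-defang-llm: true  # assumed extension field marking a managed LLM service
```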