
8 posts tagged with "DigitalOcean"


Deploying Agentic Apps to the Cloud Shouldn’t Be This Hard…

· 3 min read
Defang Team

Agentic Apps


Agentic apps are redefining how software is built: multi-agent workflows, persistent memory, tool-using LLMs, and orchestrated autonomy. But deploying them to the cloud is still painful. For example, your agentic app typically needs to provision:

  • Managed databases like Postgres or MongoDB
  • Fast, scalable caching (hello Redis)
  • Containerized compute that scales
  • Secure networking and service discovery
  • Managed LLMs like AWS Bedrock or GCP Vertex AI

And for many teams, these apps must run inside the customer’s cloud, where sensitive data lives and compliance rules apply. That means you cannot just spin up your own environment and call it a day. Instead, you are deploying across AWS, GCP, DigitalOcean, or whichever stack your customers demand, each with its own APIs, quirks, and limitations.

Now you are not just building agents; you are picking the right infrastructure, rewriting IaC templates for every provider, and untangling the edge cases of each cloud.

The result: weeks of DevOps headaches, lost momentum, and engineers stuck wiring infrastructure instead of shipping agents.
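To make that concrete, here is a minimal, illustrative Compose sketch of the kind of stack described above. The service names, images, and environment variables are assumptions for illustration, not part of the original post:

```yaml
# Illustrative compose.yaml for an agentic app stack (names and images are placeholders)
services:
  agent:
    build: ./agent            # the agent/orchestrator service
    ports:
      - "8080:8080"
    environment:
      - DATABASE_URL=postgres://postgres:password@db:5432/agents
      - REDIS_URL=redis://cache:6379
    depends_on:
      - db
      - cache
  db:
    image: postgres:16        # stands in for a managed Postgres in the cloud
    environment:
      - POSTGRES_PASSWORD=password
  cache:
    image: redis:7            # stands in for a managed cache in the cloud
```

Locally this is just a Compose file; the pain starts when each of these services has to become the equivalent managed resource on every cloud your customers use.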

Simplifying Deployment of AI Apps to the Cloud using Docker and Model Context Protocol

· 8 min read
Defang Team

mcp

Anthropic recently unveiled the Model Context Protocol (MCP), “a new standard for connecting AI assistants to the systems where data lives”. However, as Docker pointed out, “packaging and distributing MCP Servers is very challenging due to complex environment setups across multiple architectures and operating systems”. Docker helps to solve this problem by enabling developers to “encapsulate their development environment into containers, ensuring consistency across all team members’ machines and deployments.” The Docker work includes a list of reference MCP Servers packaged up as containers, which you can deploy locally to test your AI application.

However, to put such containerized AI applications into production, you need to be able not only to test locally, but also to easily deploy the application to the cloud. This is what Defang enables. In this blog and the accompanying sample, we show how to build a sample AI application using one of the reference MCP Servers, run and test it locally using Docker, and, when ready, easily deploy it to the cloud of your choice (AWS, GCP, or DigitalOcean) using Defang.
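As a rough sketch (not the actual sample), such an app can be expressed in a single Compose file; the image name for the MCP server below is a placeholder:

```yaml
# Illustrative compose.yaml: an AI app talking to a containerized MCP server
services:
  app:
    build: ./app                        # the AI application / LLM client
    ports:
      - "3000:3000"
    depends_on:
      - mcp-server
  mcp-server:
    image: example/mcp-server:latest    # placeholder for a reference MCP Server image
```

Locally, `docker compose up` brings the stack up for testing; with the Defang CLI, the same Compose file can then be deployed to your chosen cloud with `defang compose up`.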

January 2025 Defang Compose Update

· 3 min read
Defang Team

Defang Compose Update

Welcome to 2025! As we shared in our early December update, we reached our V1 milestone with support for GCP and DigitalOcean in Preview and production support for AWS. We were gratified to see the excitement around our launch, with Defang ending 2024 with twice as many users as our original goal!

We are excited to build on that momentum going into 2025, and we are off to a great start in January with some key advancements:

🚀 Defang V1: Launch Week is Here!

· 4 min read
Defang Team

Defang Compose Update

At Defang, we’re enabling developers to go from idea to code to deployment 10x faster. We’re thrilled to announce that Defang V1 is officially launching during our action-packed Launch Week, running from December 4–10, 2024! This marks a major milestone as we officially release the tools and features developers have been waiting for.