
In this guide, we'll walk through the easiest and fastest way to deploy a full-featured Django application—including real-time chat and background task processing—to the cloud using Defang. You'll see firsthand how simple Defang makes it to deploy apps that require multiple services like web servers, background workers, Redis, and Postgres.

Clone the repo

Before we get started, you'll want to clone the repo with the app code; the full source code is linked at the end of this post.

Overview of Our Django Application

We're deploying a real-time chat application that includes automatic moderation powered by a background worker using the Natural Language Toolkit (NLTK). The application structure includes:

  • Web Service: Django app with chat functionality using Django Channels for real-time interactions.
  • Worker Service: Background tasks processing messages for profanity and sentiment analysis.
  • Postgres Database: Managed database instance for persistent storage.
  • Redis Broker: Managed Redis instance serving as the broker for Celery tasks and Django Channels.
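
To make that structure concrete, here is a minimal sketch of how the four services might be wired together in compose.yaml. The service names, images, commands, and environment variables are illustrative assumptions, not a copy of the repo's actual file:

```yaml
# Illustrative sketch only -- names and commands are assumptions, not the repo's actual compose.yaml.
services:
  web:
    build: .
    # Channels apps are typically served by an ASGI server such as Daphne.
    command: daphne -b 0.0.0.0 -p 8000 config.asgi:application
    ports:
      - "8000:8000"
    environment:
      - DJANGO_SECRET_KEY
      - POSTGRES_PASSWORD
    depends_on:
      - db
      - redis

  worker:
    build: .
    # Celery worker that picks up moderation tasks from the Redis broker.
    command: celery -A config worker --loglevel=info
    environment:
      - DJANGO_SECRET_KEY
      - POSTGRES_PASSWORD
    depends_on:
      - db
      - redis

  db:
    # Locally this runs as a container; deployed with Defang it becomes a managed Postgres instance.
    image: postgres:16
    environment:
      - POSTGRES_PASSWORD

  redis:
    # Locally a container; deployed with Defang it becomes a managed Redis instance.
    image: redis:7
```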

Running Locally

To run the app locally, we use Docker Compose, splitting the configuration into two YAML files:

  • compose.yaml: Production configuration.
  • compose.dev.yaml: Development overrides extending production.

You can quickly spin up the application locally with:

docker compose --env-file .env.dev -f compose.dev.yaml up --build

This runs the app with autoreloading so you can iterate on the Django code quickly, while passing environment variables the same way we will later with Defang's secure configuration system, which keeps the setup ready to deploy to production.
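
For reference, the development overrides might look roughly like the sketch below. This assumes compose.dev.yaml extends the production services and swaps in Django's autoreloading dev server; the actual file and paths may differ:

```yaml
# Illustrative sketch of compose.dev.yaml -- the real file may differ.
services:
  web:
    extends:
      file: compose.yaml
      service: web
    # Swap in Django's autoreloading dev server for local iteration.
    command: python manage.py runserver 0.0.0.0:8000
    volumes:
      - .:/app   # mount the source so code changes reload without a rebuild

  worker:
    extends:
      file: compose.yaml
      service: worker
    volumes:
      - .:/app

  # db and redis would be extended the same way for local development.
```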

Application Features

Real-time Chat

Using Django Channels and Redis, users can engage in real-time conversations within chat rooms.
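
As a rough illustration of what the Channels side can look like, here is a minimal consumer sketch. The class, group, and event names are assumptions for illustration, not the repo's actual code:

```python
# Illustrative sketch of a Channels chat consumer; names are assumptions, not the repo's code.
from channels.generic.websocket import AsyncJsonWebsocketConsumer


class ChatConsumer(AsyncJsonWebsocketConsumer):
    async def connect(self):
        self.room = self.scope["url_route"]["kwargs"]["room_name"]
        # Join a per-room group backed by the Redis channel layer so every
        # connected client in the room receives broadcasts.
        await self.channel_layer.group_add(self.room, self.channel_name)
        await self.accept()

    async def disconnect(self, code):
        await self.channel_layer.group_discard(self.room, self.channel_name)

    async def receive_json(self, content):
        # Fan the incoming message out to everyone in the room.
        await self.channel_layer.group_send(
            self.room, {"type": "chat.message", "message": content["message"]}
        )

    async def chat_message(self, event):
        await self.send_json({"message": event["message"]})
```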

Background Moderation Tasks

The worker service runs independently, handling moderation tasks asynchronously. It uses NLTK to:

  • Check for profanity.
  • Perform sentiment analysis.
  • Automatically flag negative or inappropriate messages.

This decouples resource-intensive tasks from the main API server, keeping the application responsive. The demo doesn't do anything very complicated, but you could just as easily run machine learning models with GPU access using Defang if you needed to.
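
Here's a rough sketch of what such a moderation task can look like with Celery and NLTK's VADER sentiment analyzer. The model, field names, word list, and thresholds are assumptions for illustration, not the repo's actual implementation:

```python
# Illustrative sketch of a moderation task; model, fields, and thresholds are assumptions.
from celery import shared_task
from nltk.sentiment import SentimentIntensityAnalyzer

# Assumes the VADER lexicon is available, e.g. via nltk.download("vader_lexicon").
sia = SentimentIntensityAnalyzer()

PROFANE_WORDS = {"badword", "anotherbadword"}  # placeholder word list


@shared_task
def moderate_message(message_id):
    from chat.models import Message  # hypothetical app and model name

    message = Message.objects.get(pk=message_id)
    words = set(message.text.lower().split())

    has_profanity = bool(words & PROFANE_WORDS)
    # VADER's compound score ranges from -1 (very negative) to +1 (very positive).
    sentiment = sia.polarity_scores(message.text)["compound"]

    message.flagged = has_profanity or sentiment < -0.5
    message.save(update_fields=["flagged"])
```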

Django Admin

The Django admin is set up so you can quickly visualize messages and their moderation status. Access it at /admin with the superuser credentials created by default when you first run or deploy: username admin and password admin.
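
A minimal admin registration for this could look like the sketch below; the model and field names are assumptions:

```python
# Illustrative sketch of the admin registration; model and field names are assumptions.
from django.contrib import admin

from chat.models import Message  # hypothetical app and model name


@admin.register(Message)
class MessageAdmin(admin.ModelAdmin):
    list_display = ("room", "text", "flagged", "created_at")
    list_filter = ("flagged",)
```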

Deploying with Defang

Deploying multi-service applications to cloud providers traditionally involves complex infrastructure setup, including configuring ECS clusters, security groups, networking, and more. Defang simplifies this significantly.

Deploying to Defang Playground

The Defang Playground lets you quickly preview your deployed app in a managed environment.

Secure Configuration

Before deploying, securely set encrypted sensitive values:

defang config set DJANGO_SECRET_KEY
defang config set POSTGRES_PASSWORD
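
On the application side, these values are consumed as ordinary environment variables. Here's a sketch of the relevant settings, assuming env var names like POSTGRES_HOST and REDIS_URL (the actual names in the repo may differ):

```python
# Illustrative settings sketch; env var names beyond DJANGO_SECRET_KEY and
# POSTGRES_PASSWORD are assumptions about this repo.
import os

SECRET_KEY = os.environ["DJANGO_SECRET_KEY"]

DATABASES = {
    "default": {
        "ENGINE": "django.db.backends.postgresql",
        "NAME": os.environ.get("POSTGRES_DB", "django"),
        "USER": os.environ.get("POSTGRES_USER", "django"),
        "PASSWORD": os.environ["POSTGRES_PASSWORD"],
        "HOST": os.environ.get("POSTGRES_HOST", "db"),
        "PORT": os.environ.get("POSTGRES_PORT", "5432"),
    }
}

REDIS_URL = os.environ.get("REDIS_URL", "redis://redis:6379/0")

# Redis backs both the Channels layer and the Celery broker.
CHANNEL_LAYERS = {
    "default": {
        "BACKEND": "channels_redis.core.RedisChannelLayer",
        "CONFIG": {"hosts": [REDIS_URL]},
    }
}
CELERY_BROKER_URL = REDIS_URL
```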

Then run the deployment command:

defang compose up

Defang automatically:

  • Builds Docker containers.
  • Sets up required services.
  • Manages networking and provisioning.

Once deployed, your app is accessible via a public URL provided by Defang, which you can find in the CLI output or in our portal at https://portal.defang.io.

Deploying to Your Own Cloud

To deploy directly into your AWS account (or other supported providers):

  1. Set your cloud provider:

In my case, I use an AWS profile, but you should be able to use any of the authentication methods supported by the AWS CLI:

export DEFANG_PROVIDER=AWS
export AWS_PROFILE=your-profile-name

Secure Configuration

Before deploying, securely set encrypted sensitive values in your cloud account:

defang config set DJANGO_SECRET_KEY
defang config set POSTGRES_PASSWORD
  2. Deploy:
defang compose up

Defang handles provisioning managed services (RDS for Postgres, ElastiCache for Redis), container builds, and networking setup. Note: Initial provisioning for managed data stores might take a few minutes.

Cloud Deployment Results

Post-deployment, your Django app infrastructure includes (among other things):

  • Managed Postgres: AWS RDS instance.
  • Managed Redis: AWS ElastiCache instance.
  • Containers: ECS services with load balancers and DNS configured.

Why Use Defang?

Defang simplifies complex cloud deployments by:

  • Automatically provisioning managed cloud resources.
  • Securely handling sensitive configurations.
  • Providing seamless container orchestration without manual infrastructure setup.

Try It Yourself

Explore deploying your Django applications effortlessly with Defang. The full source code for this example is available on GitHub. Feel free to give it a try, and let us know how it goes!

Happy deploying!


Defang Compose Update

Well, that went by quick! Seems like it was just a couple of weeks ago that we published the Jan update, and it’s already time for the next one. Still, we do have some exciting progress to report in this short month!

  1. Pulumi Provider: We are excited to announce a Preview of the Defang Pulumi Provider. With the Defang Pulumi Provider, you can leverage all the power of Defang with all the extensibility of Pulumi. Defang will provision infrastructure to deploy your application straight from your Compose file, while allowing you to connect that deployment with other resources you deploy to your cloud account. The new provider makes it easy to leverage Defang if you’re already using Pulumi, and it also provides an upgrade path for users who need more configurability than the Compose specification can provide.
  2. Portal Update: We are now fully deploying our portal with Defang alone, using the defang compose up command. Our original portal architecture was designed before we supported managed storage, so we used to use Pulumi to provision and connect external storage. But since we added support in Compose to specify managed storage, we can fully describe our Portal using Compose alone. This has allowed us to rip out hundreds of lines of code and greatly simplify our deployments. To learn more about how we do this, check out our Defang-Deployed-with-Defang (Part 1) blog.
  3. OpenAuth Contribution: In the past couple of months we have been communicating with the OpenAuth maintainers and contributors via PRs (#120, #156) and Issues (#127) to enable features like local testing with DynamoDB, support for scopes, improved standards alignment, Redis support, and more. We are rebuilding our authentication systems around OpenAuth and are excited about the future of the project.

Events and Social Media

February was an exciting month for the Defang team as we continued to engage with the developer community and showcase what’s possible with Defang. We sponsored and demoed at the DevTools Vancouver meetup, and also sponsored the Vancouver.dev IRL: Building AI Startups event. At the AWS Startup Innovation Showcase in Vancouver, our CTO Lio demonstrated how Defang makes it effortless to deploy secure, scalable, and cost-efficient serverless apps on AWS! And finally, we had a great response to our LinkedIn post on the Model Context Protocol, catching the attention of many observers, including some of our key partners.

We are eager to see what you deploy with Defang. Join our Discord to ask any questions, see what others are building, and share your own experience with Defang. And stay tuned for more to come in March!