Samples
Agentic Autogen
This sample shows an agentic Autogen application using Mistral and FastAPI, deployed with Defang. For demonstration purposes, it requires a Mistral AI API key (see Configuration for more details), but you are free to modify it to use a different LLM, such as the Defang OpenAI Access Gateway service. Note that the Vite React frontend is served through the FastAPI backend so that the two can be treated as one service in production.
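The single-service setup works by having FastAPI mount the built Vite assets as static files. Below is a minimal sketch of that pattern; the directory layout and the /api prefix are assumptions for illustration, not necessarily what this sample's code uses.

```python
# Sketch: serve the built Vite React app from the FastAPI backend so both run
# as a single service. Paths and route prefixes here are assumptions.
from fastapi import FastAPI
from fastapi.staticfiles import StaticFiles

app = FastAPI()

@app.get("/api/health")
def health():
    # Keep backend routes under /api so they are not shadowed by the static mount.
    return {"status": "ok"}

# Mount the Vite build output (e.g. the dist/ folder produced by `npm run build`)
# at the root; html=True serves index.html for "/".
app.mount("/", StaticFiles(directory="frontend/dist", html=True), name="frontend")
```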
Prerequisites
- Download Defang CLI
- (Optional) If you are using Defang BYOC, authenticate with your cloud provider account
- (Optional for local development) Docker CLI
Development
To run the application locally, you can use the following command:
docker compose up --build
Configuration
For this sample, you will need to provide the following configuration:
Note that if you are using the 1-click deploy option, you can set these values as secrets in your GitHub repository and the action will automatically deploy them for you.
MISTRAL_API_KEY
An API key to access the Mistral AI API.
defang config set MISTRAL_API_KEY
Deployment
[!NOTE] Download Defang CLI
Defang Playground
Deploy your application to the Defang Playground by opening up your terminal and typing:
defang compose up
BYOC
If you want to deploy to your own cloud account, you can use Defang BYOC.
Title: Agentic Autogen
Short Description: An Autogen agent application using Mistral and FastAPI, deployed with Defang.
Tags: Agent, Autogen, Mistral, FastAPI, Vite, React, Python, JavaScript, AI
Languages: Python, JavaScript
Agentic LangGraph
This sample demonstrates a LangGraph Agent application deployed with Defang. You can customize the agent's tools as needed. For example, it includes a Tavily Search tool for performing search queries, which requires a TAVILY_API_KEY (see Configuration for setup details).
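The sample itself is implemented in TypeScript; purely as an illustration of the same pattern, here is a minimal Python sketch of a LangGraph ReAct agent wired to the Tavily search tool. The model choice is an assumption, and the Tavily tool reads TAVILY_API_KEY from the environment.

```python
# Illustrative Python sketch of a LangGraph agent with a Tavily search tool.
# The actual sample is TypeScript; the model used here is an assumption.
from langchain_openai import ChatOpenAI
from langchain_community.tools.tavily_search import TavilySearchResults
from langgraph.prebuilt import create_react_agent

search = TavilySearchResults(max_results=3)  # reads TAVILY_API_KEY from the environment
model = ChatOpenAI(model="gpt-4o-mini")      # assumed model; any chat model works

agent = create_react_agent(model, tools=[search])
result = agent.invoke({"messages": [("user", "What is Defang?")]})
print(result["messages"][-1].content)
```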
Prerequisites
- Download Defang CLI
- (Optional) If you are using Defang BYOC, authenticate with your cloud provider account
- (Optional for local development) Docker CLI
Development
To run the application locally, you can use the following command:
docker compose -f compose.dev.yaml up --build
Configuration
For this sample, you will need to provide the following configuration:
Note that if you are using the 1-click deploy option, you can set these values as secrets in your GitHub repository and the action will automatically deploy them for you.
TAVILY_API_KEY
A Tavily API key for accessing Tavily Search.
defang config set TAVILY_API_KEY
Deployment
[!NOTE] Download Defang CLI
Defang Playground
Deploy your application to the Defang Playground by opening up your terminal and typing:
defang compose up
BYOC
If you want to deploy to your own cloud account, you can use Defang BYOC.
Title: Agentic LangGraph
Short Description: A LangGraph Agent application that can use tools, deployed with Defang.
Tags: Agent, LangGraph, LangChain, AI, OpenAI, Tavily
Languages: TypeScript
Agentic Strands
This sample demonstrates a Strands Agent application, deployed with Defang. This Strands Agent can use tools, and is compatible with the Defang OpenAI Access Gateway.
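Compatibility with the gateway comes down to the agent talking to its model through an OpenAI-compatible endpoint. As a hedged illustration only (this is not the sample's Strands code, and the base URL, environment variable names, and model name are assumptions), any OpenAI-compatible client can be pointed at such a gateway like this:

```python
# Sketch: point an OpenAI-compatible client at an OpenAI-compatible gateway.
# The base URL, environment variable names, and model name are assumptions.
import os
from openai import OpenAI

client = OpenAI(
    base_url=os.environ.get("OPENAI_BASE_URL", "http://llm/api/v1/"),
    api_key=os.environ.get("OPENAI_API_KEY", "unused-behind-gateway"),
)

response = client.chat.completions.create(
    model=os.environ.get("MODEL", "gpt-4o-mini"),
    messages=[{"role": "user", "content": "Say hello from the agent."}],
)
print(response.choices[0].message.content)
```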
Prerequisites
- Download Defang CLI
- (Optional) If you are using Defang BYOC, authenticate with your cloud provider account
- (Optional for local development) Docker CLI
Development
To run the application locally, you can use the following command:
docker compose -f compose.dev.yaml up --build
Configuration
For this sample, you will not need to provide any configuration. However, if you ever need to, below is an example of how to do so in Defang:
defang config set API_KEY
Deployment
[!NOTE] Download Defang CLI
Defang Playground
Deploy your application to the Defang Playground by opening up your terminal and typing:
defang compose up
BYOC
If you want to deploy to your own cloud account, you can use Defang BYOC.
Title: Agentic Strands
Short Description: A Strands Agent application, deployed with Defang.
Tags: Python, Flask, Strands, AI, Agent
Languages: Python
Angular & Node.js
This sample demonstrates how to deploy a full-stack Angular and Node.js application with Defang. It uses Socket.IO for real-time communication. The Docker setup ensures the app can be easily built and deployed.
Prerequisites
- Download Defang CLI
- (Optional) If you are using Defang BYOC, authenticate with your cloud provider account
- (Optional for local development) Docker CLI
- Install Node.js
- Install Angular CLI
Development
For development, we use two local containers: one for the Angular frontend service and one for the Express backend service. Caddy is used as a web server to serve the static files.
To run the application locally, you can use the following command:
docker compose -f compose.dev.yaml up --build
Configuration
For this sample, you will not need to provide configuration.
If you wish to provide configuration, see below for an example of setting a configuration for a value named API_KEY.
defang config set API_KEY
Deployment
[!NOTE] Download Defang CLI
Defang Playground
Deploy your application to the Defang Playground by opening up your terminal and typing:
defang compose up
BYOC (AWS)
If you want to deploy to your own cloud account, you can use Defang BYOC:
- Authenticate your AWS account, and check that you have properly set your environment variables like AWS_PROFILE, AWS_REGION, AWS_ACCESS_KEY_ID, and AWS_SECRET_ACCESS_KEY.
- Run in a terminal that has access to your AWS environment variables:
defang --provider=aws compose up
Title: Angular & Node.js
Short Description: A full-stack application using Angular for the frontend and Node.js with Socket.IO for the backend, containerized with Docker.
Tags: Angular, Node.js, Socket.IO, TypeScript, JavaScript
Languages: nodejs
Arduino Flask Wifi Server
This sample contains an interactive wifi-connected UI program for a SenseCAP Indicator Device, built for an Embedded Systems project at Defang Software Labs.
The device has a square liquid-crystal touch screen display and an ESP32-S3 chip that can be programmed in an Arduino environment.
The program welcome.ino, acting as a client, sends a ping every 5 seconds once it is connected to a wifi network. It uses a library called ArduinoHTTPClient, and it is recommended to use the Arduino IDE when working with .ino files. During each ping, the program's UI displays a message at a randomized color and location on the screen.
The Flask server in web_server.py receives these pings once it is started and connected to the same wifi network as the client. To start it, run python web_server.py. To view it, open localhost at the port number used. To deploy it to the cloud, run defang up in the welcome directory.
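As a rough sketch of what such a ping endpoint could look like (the route, port, and payload handling are assumptions; web_server.py is the authoritative implementation):

```python
# Illustrative Flask sketch of a server that receives pings from the device.
# The route, port, and payload handling are assumptions, not the sample's code.
from flask import Flask, request

app = Flask(__name__)

@app.route("/ping", methods=["GET", "POST"])
def ping():
    # Log whatever the device sent so it shows up in the server output.
    print(f"Ping from {request.remote_addr}: {request.get_data(as_text=True)}")
    return "pong", 200

if __name__ == "__main__":
    # Listen on all interfaces so the device can reach the server over wifi.
    app.run(host="0.0.0.0", port=8000)
```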
A helpful file called serial_reader.py decodes serial monitor readings into a readable format, allowing you to see Serial.println() messages in real time. To start it, run python serial_reader.py and watch the readings appear in the terminal.
Here is a diagram showing the structure of the application.
Prerequisites
- Download Defang CLI
- (Optional) If you are using Defang BYOC, authenticate with your cloud provider account
- (Optional for local development) Docker CLI
Development
To run the application locally, you can use the following command:
docker compose up --build
Configuration
For this sample, you will not need to provide configuration.
If you wish to provide configuration, see below for an example of setting a configuration for a value named API_KEY.
defang config set API_KEY
Deployment
[!NOTE] Download Defang CLI
Defang Playground
Deploy your application to the Defang Playground by opening up your terminal and typing:
defang compose up
BYOC
If you want to deploy to your own cloud account, you can use Defang BYOC.
Title: Arduino Flask Wifi Server
Short Description: An Arduino wifi server built with Flask.
Tags: Arduino, Flask, Python, IoT, Wifi, Serial
Languages: python
BullMQ & BullBoard & Redis
This sample demonstrates how to deploy a BullMQ message queue on top of managed Redis with a queue processor and a dashboard to monitor the queue.
Once your app is up and running you can go to the /board route for the board service to see the Bull Board dashboard and use the username admin and the board password you set to log in (see Configuration).
To add a job to the queue, you can go to the /add route of the api service. This will use some default values so you can test things out. You can also see an example of a POST request in the sample HTTP request file.
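As a rough illustration, a job can also be added from any HTTP client; the port and payload fields below are assumptions, and the sample's HTTP request file is the authoritative example.

```python
# Illustrative only: add a job by POSTing to the api service's /add route.
# The port and payload fields are assumptions; see the sample's HTTP request file.
import requests

resp = requests.post(
    "http://localhost:3000/add",
    json={"name": "example-job", "data": {"message": "hello queue"}},
    timeout=10,
)
print(resp.status_code, resp.text)
```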
The worker service is the queue processor that processes jobs added to the queue. In the compose.yaml file, the worker service is set to scale to 2 instances, meaning two workers will process jobs from the queue. You can adjust this to your desired number of workers; we set it to 2 to show how you can add workers to handle more jobs.
Prerequisites
- Download Defang CLI
- (Optional) If you are using Defang BYOC, authenticate with your cloud provider account
- (Optional for local development) Docker CLI
Development
To run the application locally, you can use the following command:
docker compose -f compose.dev.yaml up --build
Configuration
For this sample, you will need to provide the following configuration:
Note that if you are using the 1-click deploy option, you can set these values as secrets in your GitHub repository and the action will automatically deploy them for you.
BOARD_PASSWORD
Set a board password to use together with the board username admin when signing in.
defang config set BOARD_PASSWORD
QUEUE
The name of the queue.
defang config set QUEUE
Deployment
[!NOTE] Download Defang CLI
Defang Playground
Deploy your application to the Defang Playground by opening up your terminal and typing:
defang compose up
BYOC (AWS)
If you want to deploy to your own cloud account, you can use Defang BYOC:
- Authenticate your AWS account, and check that you have properly set your environment variables like AWS_PROFILE, AWS_REGION, AWS_ACCESS_KEY_ID, and AWS_SECRET_ACCESS_KEY.
- Run in a terminal that has access to your AWS environment variables:
defang --provider=aws compose up
Title: BullMQ & BullBoard & Redis
Short Description: A sample project with BullMQ, BullBoard, and Redis.
Tags: BullMQ, BullBoard, Redis, Express, Node.js, Message Queue, JavaScript
Languages: nodejs
Crew.ai Django Sample
This sample shows how to use Crew.ai with a Django application. It provides a simple web interface that lets users input text and receive a summary of that text in real time via Django Channels with a Redis broker. Celery handles the Crew.ai tasks in the background, also using Redis as the broker, and Postgres serves as the database for Django.
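As a hedged sketch of how those pieces fit together (module, agent, and group names here are illustrative, not the sample's actual code), a Celery task can run the Crew.ai work in the background and push the result to the browser through a Django Channels group:

```python
# Illustrative sketch: a Celery task runs a Crew.ai crew and streams the result
# to the browser via a Django Channels group backed by Redis. Names are assumptions.
from asgiref.sync import async_to_sync
from celery import shared_task
from channels.layers import get_channel_layer
from crewai import Agent, Crew, Task

@shared_task
def summarize_text(text: str, group_name: str) -> None:
    summarizer = Agent(
        role="Summarizer",
        goal="Summarize the given text concisely",
        backstory="An assistant that writes short, faithful summaries.",
    )
    summary_task = Task(
        description=f"Summarize the following text:\n{text}",
        expected_output="A short summary of the text.",
        agent=summarizer,
    )
    result = Crew(agents=[summarizer], tasks=[summary_task]).kickoff()

    # Send the summary to the WebSocket consumers listening on this group.
    channel_layer = get_channel_layer()
    async_to_sync(channel_layer.group_send)(
        group_name,
        {"type": "summary.message", "summary": str(result)},
    )
```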
Prerequisites
- Download Defang CLI
- (Optional) If you are using Defang BYOC, authenticate with your cloud provider account
- (Optional for local development) Docker CLI
Development
To run the application locally, you can use the following command:
docker compose -f ./compose.local.yaml up --build
Configuration
For this sample, you will need to provide the following configuration:
Note that if you are using the 1-click deploy option, you can set these values as secrets in your GitHub repository and the action will automatically deploy them for you.
POSTGRES_PASSWORD
The password for the Postgres database.
defang config set POSTGRES_PASSWORD
SSL_MODE
The SSL mode for the Postgres database.
defang config set SSL_MODE
DJANGO_SECRET_KEY
The secret key for the Django application.
defang config set DJANGO_SECRET_KEY
Deployment
[!NOTE] Download Defang CLI
Defang Playground
Deploy your application to the Defang Playground by opening up your terminal and typing:
defang compose up
BYOC
If you want to deploy to your own cloud account, you can use Defang BYOC.
Title: Crew.ai Django Sample
Short Description: A sample application that uses Crew.ai to summarize text in a background task, streamed to the user in real-time.
Tags: Django, Celery, Redis, Postgres, AI, ML
Languages: Python
C# & ASP.NET Core
This sample project is a simple task manager application using ASP.NET Core for the backend and JavaScript for client-side component rendering.
It showcases how you could deploy a full-stack application with ASP.NET Core and JavaScript using Defang. The Docker setup ensures the app can be easily built and tested during development.