
· 4 min read

Defang Compose Update

When we refreshed the Defang brand, we knew our website needed more than just a fresh coat of paint. It needed to become a more dynamic part of our stack. We needed some parts to be more flexible, some parts to be more interactive, and the whole thing to be better aligned with how modern apps are organized and deployed. And what better way to take it there than to deploy it with Defang itself?

This is part of our ongoing "Defang on Defang" series, where we show how we're using our own tool to deploy all the services that power Defang. In this post, we're diving into how we turned our own website into a project that helps us understand how Defang can be used to deploy dynamic Next.js apps, and how we can improve that experience for developers.


From S3 + CloudFront to Dynamic, Containerized Deployments

Our original site was a Next.js app using static exports, deployed via S3 and fronted by CloudFront. That setup worked for a while—it was fast and simple. But with our brand refresh, we added pages and components where it made sense to use (and test for other developers) Next.js features that we couldn't use with the static export, such as middleware.

That meant static hosting wouldn't cut it. So we decided to run the site as an app in a container.

That being said, our learnings from the previous setup are being used to develop the capabilities of Defang. We're using the experience to make sure that Defang can handle the deployment of static sites as well as dynamic ones. We'll keep you updated when that's ready.


Deploying with Defang (and Why It Was Easy)

We already deploy our other services with Defang using Compose files. In fact, the static website already used a Dockerfile and Compose file to manage the build process. So we only had to update the Compose file to account for new environment variables for features we're adding, and make a few small changes to the Dockerfile for the new build process.

Some things we had to change:

Adding ports to the Compose file:

    ports:
      - mode: ingress
        target: 3000
        published: 3000

Adding domain info to the Compose file:

    domainname: defang.io
    networks:
      public:
        aliases:
          - www.defang.io

One other hiccup was that we used to do www to non-www redirects using S3. There are a few ways to switch that up, but for the time being we decided to use Next.js middleware.
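The redirect decision itself is small. As a sketch (not our exact middleware—the real version wraps this logic in Next.js's request and response types), the core check looks like this:

```typescript
// Return the non-www redirect target for a request URL,
// or null if the host is already the bare domain.
function wwwRedirectTarget(rawUrl: string): string | null {
  const url = new URL(rawUrl);
  if (!url.hostname.startsWith("www.")) return null;
  url.hostname = url.hostname.slice(4); // drop the "www." prefix
  return url.toString();
}

// wwwRedirectTarget("https://www.defang.io/about") → "https://defang.io/about"
// wwwRedirectTarget("https://defang.io/about")     → null
```

In actual Next.js middleware, a non-null result would be returned as a 301 redirect response.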

Pretty soon after that, the site was up and running in an AWS account—with TLS, DNS, and both the www and root domains automatically configured. Pretty straightforward!


Real-World Lessons That Are Shaping Defang

Deploying the website wasn't just a checkbox—it helped surface real-world pain points and ideas for improvement.

1. Static Assets Still Need CDNs

Even though the site is dynamic now, we still want assets like /_next/static to load quickly from a CDN. This made it clear that CDN support—like CloudFront integration—should be easier to configure in Defang. That’s now on our roadmap. That's also going to be useful for other frameworks that use similar asset paths, like Django.

2. Next.js Env Vars Can Be Tricky in Containers

Next.js splits env vars between build-time and runtime, and the rules aren’t always obvious. Some need to be passed as build args, and others as runtime envs. That made us think harder about how Defang could help clarify or streamline this for developers—even if we can’t change that aspect of Next.js itself.
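As a sketch of how that split can look in a Compose file (the variable names here are hypothetical): NEXT_PUBLIC_* values are inlined into the client bundle at build time, so they have to arrive as build args, while server-only values can stay as runtime environment variables.

```yaml
services:
  website:
    build:
      context: .
      args:
        # Build-time: baked into the bundle by `next build`
        - NEXT_PUBLIC_API_URL
    environment:
      # Runtime: read by the server on each request
      - DATABASE_URL
    ports:
      - mode: ingress
        target: 3000
        published: 3000
```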

3. Redirects and Rewrites

We had to add a middleware to handle www to non-www redirects. This is a common need, so we're keeping an eye on how we can make this easier to deal with in Defang projects.

These are the kinds of things we only notice by using Defang on real-world projects.


The Takeaway

Our site now runs like the rest of our infrastructure:

  • Fully containerized
  • Deployed to our own AWS account
  • Managed with a Compose file
  • Deployed with Defang

Stay tuned for the next post in the series—because this is just one piece of the puzzle.

· 3 min read

Defang Compose Update

Well, that went by quick! Seems like it was just a couple of weeks ago that we published the Jan update, and it’s already time for the next one. Still, we do have some exciting progress to report in this short month!

  1. Pulumi Provider: We are excited to announce a Preview of the Defang Pulumi Provider. With the Defang Pulumi Provider, you can leverage all the power of Defang with all of the extensibility of Pulumi. Defang will provision infrastructure to deploy your application straight from your Compose file, while allowing you to connect that deployment with other resources you deploy to your cloud account. The new provider makes it easy to leverage Defang if you’re already using Pulumi, and it also provides an upgrade-path for users who need more configurability than the Compose specification can provide.
  2. Portal Update: We are now fully deploying our portal with Defang alone using the defang compose up command. Our original portal architecture was designed before we supported managed storage so we used to use Pulumi to provision and connect external storage. But since we added support in Compose to specify managed storage, we can fully describe our Portal using Compose alone. This has allowed us to rip out hundreds of lines of code and heavily simplify our deployments. To learn more about how we do this, check out our Defang-Deployed-with-Defang (Part 1) blog.
  3. Open-Auth Contribution: In the past couple months we have been communicating with the OpenAuth maintainers and contributors via PRs (#120, #156) and Issues (#127) to enable features like local testing with DynamoDB, enabling support for scopes, improving standards alignment, supporting Redis, and more. We are rebuilding our authentication systems around OpenAuth and are excited about the future of the project.

Events and Social Media

February was an exciting month for the Defang team as we continued to engage with the developer community and showcase what’s possible with Defang. We sponsored and demo’ed at the DevTools Vancouver meetup, as well as sponsored the Vancouver.dev IRL: Building AI Startups event. Also, at the AWS Startup Innovation Showcase in Vancouver, our CTO Lio demonstrated how Defang makes it effortless to deploy secure, scalable, and cost-efficient serverless apps on AWS! And finally, we had a great response to our LinkedIn post on the Model Context Protocol, catching the attention of many observers, including some of our key partners.

We are eager to see what you deploy with Defang. Join our Discord to ask any questions, see what others are building, and share your own experience with Defang. And stay tuned for more to come in March!

· 5 min read

Defang Compose Update

Deploying applications is hard. Deploying complex, multi-service applications is even harder. When we first built the Defang Portal, we quickly recognized the complexity required to deploy it, even with the early Defang tooling helping us simplify it a lot. But we’ve worked a lot to expand Defang’s capabilities over the last year+ so it could take on more of the work and simplify that process.

This evolution wasn’t just based on our own instincts and what we saw in the Portal—it was informed by listening to developers who have been using Defang, as well as our experience building dozens of sample projects for different frameworks and languages. Each time we build a new sample, we learn more about the different requirements of various types of applications and developers and refine Defang’s feature set accordingly. The Portal became an extension of this learning process, serving as both a proving ground and an opportunity to close any remaining gaps, since it’s one of the most complex things we’ve built with Defang.

Finally, though, the Defang Portal—an application with six services, including two managed data stores, authentication, and a frontend—is fully deployed using just Docker Compose files and the Defang CLI, driven by GitHub Actions.


The Initial Setup: A More Complex Deployment

The Portal isn’t a simple static website; it’s a full-stack application with the following services:

  • Next.js frontend – Including server components and server actions.
  • Hasura (GraphQL API) – Serves as a GraphQL layer.
  • Hono (TypeScript API) – Lightweight API for custom business logic.
  • OpenAuth (Authentication Service) – Manages authentication flows.
  • Redis – Used for caching and session storage.
  • Postgres – The main database.

Initially, we provisioned databases and some DNS configurations using Infra-as-Code because Defang couldn’t yet manage them for us. We also deployed the services themselves manually through infrastructure-as-code, requiring us to define each service separately.

This worked, but it seemed unnecessarily complex. With the right tooling, much of it could go away…


The Transition: Expanding Defang to Reduce Complexity

We’ve made it a priority to expand Defang’s capabilities over the last year so it could take on more of the heavy lifting of a more complex application. In that time, we’ve added loads of features to handle things like:

  • Provisioning databases, including managing passwords and other secrets securely
  • Config interpolation using values stored in AWS SSM, ensuring the same Compose file works both locally and in the cloud
  • Provisioning certs and managing DNS records from configuration in the Compose file.

As a result, we reached a point where we no longer needed custom infrastructure definitions for most of our deployment.

What Changed?

  • Previously: GitHub Actions ran infra-as-code scripts to provision databases, manage DNS, and define services separately from the Docker Compose file we used for local dev
  • Now: Our Defang GitHub Action targets normal Compose files and deploys everything, using secrets and variables managed in GitHub Actions environments.
  • Result: We eliminated hundreds of lines of Infra-as-Code, making our deployment leaner and easier to manage and reducing the differences between running the Portal locally and running it in the cloud.

This wasn’t just about reducing complexity—it was also a validation exercise. We knew that Defang had evolved enough to take over much of our deployment, but by going through the transition process ourselves, we could identify and close the remaining gaps and make sure our users could really make use of Defang for complex production-ready apps.


How Deployment Works Today

Config & Secrets Management

  • Sensitive configuration values (database credentials, API keys) are stored securely in AWS SSM using Defang’s configuration management tooling.
  • Environment variable interpolation allows these SSM-stored config values to be referenced directly in the Compose file, ensuring the same configuration works in local and cloud environments.
  • Defang provisions managed Postgres and Redis instances automatically when using the x-defang-postgres and x-defang-redis extensions, securely injecting credentials where needed with variable interpolation.
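Put together, a minimal sketch of what this looks like in a Compose file (service names and the connection string are illustrative, based on the extensions and interpolation described above):

```yaml
services:
  db:
    image: postgres:16
    x-defang-postgres: true   # Defang provisions a managed Postgres instance
    environment:
      POSTGRES_PASSWORD:      # value supplied via `defang config set POSTGRES_PASSWORD`

  app:
    build: ./app
    environment:
      # Interpolated at deploy time from Defang config (backed by AWS SSM)
      DATABASE_URL: postgres://postgres:${POSTGRES_PASSWORD}@db:5432/postgres
```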

Deployment Modes

  • Deployment modes (development, staging, production) adjust infrastructure settings across our dev/staging/prod deployments, making sure dev is low-cost and production is secure and resilient.

DNS & Certs

  • Defang provisions TLS certificates and configures DNS records automatically, based on the domainname and network aliases specified in the Compose file.

CI/CD Integration

  • Previously: GitHub Actions ran custom infra-as-code scripts.
  • Now: The Defang GitHub Action installs Defang automatically and runs defang compose up, simplifying deployment.
  • Result: A streamlined, repeatable CI/CD pipeline.

The Takeaway: Why This Matters

By transitioning to fully Compose-based deployments, we:

  • Eliminated hundreds of lines of Infra-as-Code
  • Simplified configuration management with secure, environment-aware secrets handling
  • Streamlined CI/CD with a lightweight GitHub Actions workflow
  • Simplified DNS and cert management

Every sample project we built, every conversation we had with developers, and every challenge we encountered with the Portal helped us get to this point, where we could focus on closing the last few gaps to deploying everything from a Compose file.

· 2 min read

Defang New Look

Over the last couple of years of building Defang, we've learned a lot about what developers need when deploying their applications to the cloud: above all, a simple developer experience, backed by a flexible, production-ready solution that works seamlessly with all of the popular cloud platforms.

In response, we have been constantly evolving our product functionality to address those needs in the simplest yet most powerful way we can come up with. While certainly there is a long way to go, we have definitely come a long way since we started.

Why the Refresh?

As we reflected on our journey, we realized our branding and messaging needed to better reflect Defang's current value-proposition. That's why today, we're excited to unveil our brand refresh, our first since the early days of Defang.

Here's what's new:

1. Refining Our Messaging

As Defang evolves, so does our message:

  • Our Promise: Develop Anything, Deploy Anywhere.
  • What We Enable: Any App, Any Stack, Any Cloud.
  • How It Works: Take your app from Docker Compose to a secure, scalable deployment on your favorite cloud in minutes.

2. A Modernized Logo

We've modernized our logo while keeping the core hexagonal design. The new look symbolizes Defang's role in seamlessly deploying any application to any cloud.

3. A Redesigned Website

We've refreshed our website with a sleek, intuitive design and a modern user experience to better showcase Defang's capabilities.

Rolling Out the Refresh

Starting today, you'll see these updates across our Defang.io homepage and social media platforms (Twitter, LinkedIn, Discord, BlueSky). In the coming days, we'll extend this refresh across all our digital assets.

We'd Love Your Feedback!

Check out the new look and let us know what you think! And if you haven't, please join us on Discord and follow us on social media.

· 7 min read

mcp

Anthropic recently unveiled the Model Context Protocol (MCP), “a new standard for connecting AI assistants to the systems where data lives”. However, as Docker pointed out, “packaging and distributing MCP Servers is very challenging due to complex environment setups across multiple architectures and operating systems”. Docker helps to solve this problem by enabling developers to “encapsulate their development environment into containers, ensuring consistency across all team members’ machines and deployments.” The Docker work includes a list of reference MCP Servers packaged up as containers, which you can deploy locally and test your AI application.

However, to put such containerized AI applications into production, you need not only to test locally, but also to easily deploy the application to the cloud. This is what Defang enables. In this blog and the accompanying sample, we show how to build a sample AI application using one of the reference MCP Servers, run and test it locally using Docker, and, when ready, easily deploy it to the cloud of your choice (AWS, GCP, or DigitalOcean) using Defang.

Sample Model Context Protocol Time Chatbot Application

Using Docker’s mcp/time image and Anthropic Claude, we made a chatbot application that can access time-based resources directly on the user’s local machine and answer time-based questions.

The application is containerized using Docker, enabling a convenient and easy way to get it running locally. We will later demonstrate how we deployed it to the cloud using Defang.

Let’s go over the structure of the application in a local environment.

mcp_before

General Overview

  1. There are two containerized services, Service 1 and Service 2, that sit on the local machine.
    • Service 1 contains a custom-built web server that interacts with an MCP Client.
    • Service 2 contains an MCP Server from Docker as a base image for the container, and a custom-built MCP Client we created for interacting with the MCP Server.
  2. We have a browser on our local machine, which interacts with the web server in Service 1.
  3. The MCP Server in Service 2 is able to access tools from either a cloud or on our local machine. This configuration is included as a part of the Docker MCP image.
  4. The MCP Client in Service 2 interacts with the Anthropic API and the web server.

Architecture

Service 1: Web Server

Service 1 contains a web server and the UI for a chat application (not shown in the diagram), written in Next.js. The chat UI updates based on user-entered queries and chatbot responses. A POST request is sent to Service 1 every time a user enters a query from the browser. In the web server, a Next.js server action function is used to forward the user queries to the endpoint URL of Service 2 to be processed by the MCP Client.

Service 2: MCP Service Configuration

The original Docker mcp/time image is not designed with the intent of being deployed to the cloud - it is created for a seamless experience with Claude Desktop. To achieve cloud deployment, an HTTP layer is needed in front of the MCP Server. To address this, we've bundled an MCP Client together with the Server into one container. The MCP Client provides the HTTP interface and communicates with the MCP Server via standard input/output (stdio).

MCP Client

The MCP Client is written in Python, and runs in a virtual environment (/app/.venv/bin) to accommodate specific package dependencies. The MCP Client is instantiated in a Quart app, where it connects to the MCP Server and handles POST requests from the web server in Service 1. Additionally, the MCP Client connects to the Anthropic API to request LLM responses.

MCP Server and Tools (from the Docker Image)

The MCP Server enables access to tools from an external source, whether it be from a cloud or from the local machine. This configuration is included as a part of the Docker MCP image. The tools can be accessed indirectly by the MCP Client through the MCP Server. The Docker image is used as a base image for Service 2, and the MCP Client is built in the same container as the MCP Server. Note that the MCP Server also runs in a virtual environment (/app/.venv/bin).

Anthropic API

The MCP Client connects to the Anthropic API to request responses from a Claude model. Two requests are sent to Claude for each query. The first request will send the query contents and a list of tools available, and let Claude respond with a selection of the tools needed to craft a response. The MCP Client will then call the tools indirectly through the MCP Server. Once the tool results come back to the Client, a second request is sent to Claude with the query contents and tool results to craft the final response.
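The two-request flow can be sketched as a plain control-flow function (a schematic only—not the actual Anthropic SDK types or our client's code; `llm` and `callTool` stand in for the real API and MCP Server calls):

```typescript
// One model turn: either a final answer, or a list of tools the model wants to call.
type ToolUse = { name: string; input: unknown };
type Turn = { text: string; toolUses: ToolUse[] };

function answerQuery(
  llm: (query: string, tools: string[], toolResults?: unknown[]) => Turn,
  callTool: (name: string, input: unknown) => unknown,
  query: string,
  tools: string[],
): string {
  // Request 1: send the query and the tool list; the model selects tools.
  const first = llm(query, tools);
  // Call each selected tool indirectly through the MCP Server.
  const results = first.toolUses.map((u) => callTool(u.name, u.input));
  // Request 2: send the query plus the tool results for the final answer.
  return llm(query, tools, results).text;
}
```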

Setting Up Dockerfiles

Service 1: Web Server - Dockerfile

The base image for Service 1 is the node:bookworm-slim image. We construct the image by copying the server code and setting an entry point command to start the web server.
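Such a Dockerfile might look roughly like this (a sketch with assumed file names and npm scripts, not the exact file from the sample):

```dockerfile
FROM node:bookworm-slim

WORKDIR /app
# Install dependencies first so this layer is cached between builds
COPY package*.json ./
RUN npm ci

# Copy the web server code and build the Next.js app
COPY . .
RUN npm run build

# Start the web server on port 3000 (matching the Compose file's target port)
ENTRYPOINT ["npm", "start"]
```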

Service 2: MCP Service Configuration - Dockerfile

The base image for Service 2 is the Docker mcp/time image. Since both the MCP Client and Server run in a virtual environment, we activate a venv command in the Dockerfile for Service 2 and create a run.sh shell script that runs the file containing the MCP Client and Server connection code. We then add the shell script as an entry point command for the container.
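A rough sketch of that setup (file names like client.py and requirements.txt are assumed, not taken from the sample):

```dockerfile
FROM mcp/time

WORKDIR /app
# Install the MCP Client's dependencies into the existing virtual environment
COPY requirements.txt .
RUN . /app/.venv/bin/activate && pip install -r requirements.txt

# Copy the MCP Client code and the startup script
COPY client.py run.sh ./
RUN chmod +x run.sh

# run.sh activates /app/.venv and starts the Quart app that bridges
# HTTP requests to the MCP Server over stdio
ENTRYPOINT ["./run.sh"]
```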

Compose File

To define Services 1 and 2 as Docker containers, we’ve written a compose.yaml file in the root directory, as shown below.

services:
  service-1: # Web Server and UI
    build:
      context: ./service-1
      dockerfile: Dockerfile
    ports:
      - target: 3000
        published: 3000
        mode: ingress
    deploy:
      resources:
        reservations:
          memory: 256M
    environment:
      - MCP_SERVICE_URL=http://service-2:8000
    healthcheck:
      test: ["CMD", "curl", "-f", "http://localhost:3000/"]

  service-2: # MCP Service (MCP Client and Server)
    build:
      context: ./service-2
      dockerfile: Dockerfile
    ports:
      - target: 8000
        published: 8000
        mode: host
    environment:
      - ANTHROPIC_API_KEY

Testing and Running on Local Machine

Now that we’ve defined our application in Docker containers using a compose.yaml file, we can test and run it on our local machine by running the command:

docker compose up --build

Once the application is started up, it can be easily tested in a local environment. However, to make it easily accessible to others online, we should deploy it to the cloud. Fortunately, deploying the application is a straightforward process using Defang, particularly since the application is Compose-compatible.

Deploying to the Cloud

Let’s go over the structure of the application after cloud deployment.

mcp_after

Here we can see what changes if we deploy to the cloud:

  1. Service 1 and Service 2 are now deployed to the cloud, not on the local machine anymore.
  2. The only part on the local machine is the browser.

Using the same compose.yaml file as shown earlier, we can deploy the containers to the cloud with the Defang CLI. Once we’ve authenticated and logged in, we can choose a cloud provider (i.e. AWS, GCP, or DigitalOcean) and use our own cloud account for deployment. Then, we can set a configuration variable for the Anthropic API key:

defang config set ANTHROPIC_API_KEY=<your-api-key-value>

Then, we can run the command:

defang compose up

Now, the MCP time chatbot application will be up and running in the cloud. This means that anyone can access the application online and try it for themselves!

For our case, anyone can use the chatbot to ask for the exact time or convert time zones from their machine, regardless of where they are located.

mcp_time_chatbot

Most importantly, this chatbot application can be adapted to use any of the other Docker reference MCP Server images, not just the mcp/time server.

Have fun building and deploying MCP-based containerized applications to the cloud with Defang!

· 3 min read

Defang Compose Update

Welcome to 2025! As we had shared in our early Dec update, we reached our V1 milestone with support for GCP and DigitalOcean in Preview and production support for AWS. We were very gratified to see the excitement around our launch, with Defang ending 2024 with twice the number of users as our original goal!

We are excited to build on that momentum going into 2025. And we are off to a great start in Jan, with some key advancements:

  1. GCP parity with AWS: We are really excited to announce that our GCP provider is now feature-complete, with support for key features such as Managed Postgres, Managed Redis, BYOD (Bring-Your-Own-Domain), GPUs, AI-assisted Debugging, and more. Install the latest version of our CLI and give it a try! Please let us know your feedback.
  2. Defang Deployed with Defang: In 2025, we are doubling our focus on production use-cases where developers are using Defang every day to deploy their production apps. And where better to start than with Defang itself? We had already been using Defang to deploy portions of our infrastructure (such as our website), but we are super happy to report that now we are using Defang to deploy all our services - including our Portal, Playground, the Defang back-end (aka Fabric) and more. We’ll be sharing more about how we did this, and publishing some of the related artifacts, in a blog post soon - stay tuned.
  3. Campus Advocate Program: One of our key goals for 2025 is to bring Defang to more students and hobbyists. To do this, we are very excited to launch our Campus Advocate Program, a community of student leaders passionate about cloud technology. Our advocates will build communities, host events, and help peers adopt cloud development with Defang. If you’re a student eager to drive cloud innovation on your campus, we’d love to hear from you - you can apply here.
  4. 1-click Deploy instructions: One of our most popular features is the ability to deploy any of our 50+ samples with a single click. We have now published instructions showing how you can provide a similar experience for your project or sample. We are curious to see what you deploy with this!
  5. Model Context Protocol sample: AI agents are of course the rage nowadays. Recently, Docker published a blog showing how you can use Docker to containerize “servers” following Anthropic’s Model Context Protocol. We have now published a sample that shows you how to easily deploy such containerized servers to the cloud using Defang - check it out here.

So, you can see we have been busy! But that is not all - we have a lot more in the pipeline in the coming months. Stay tuned - it’s going to be an exciting 2025!

P.S.: Defang is now on Bluesky! Follow us to stay connected, get the latest updates, and join the conversation. See you there!

· 2 min read

Defang Compose Update - Product Hunt

The moment is finally here – Defang V1 is officially LIVE on Product Hunt! 🎉

Defang - Go from idea to your favorite cloud in minutes. | Product Hunt

Over the past few months, our team has been working tirelessly to create a tool that transforms how developers develop, deploy, and debug cloud apps. With Defang, you can go from idea to your favorite cloud in minutes. 🚀

Today, we have the opportunity to showcase Defang to a global audience, and your support could make all the difference!

If you already have a Product Hunt account, it's super easy.

  • ✅ You can support our product if you like what we have built so far
  • ✅ You can leave a comment and any feedback you may have (comments are great!)
  • ✅ You can leave a review

Product Hunt launches are time sensitive as they last 24 hours, so if you have 30 seconds available right now, it would really mean a lot.

If you don't already have a Product Hunt account, please don't create one now to upvote (we may get penalized for that).

Instead, you can like and share our posts on social media (e.g. LinkedIn, Twitter, Instagram, or Facebook). Thank you in advance. Your support means the world.

· 4 min read

Defang Compose Update

At Defang, we’re enabling developers to go from idea to code to deployment 10x faster. We’re thrilled to announce that Defang V1 is officially launching during our action-packed Launch Week, running from December 4–10, 2024! This marks a major milestone as we officially release the tools and features developers have been waiting for.

What’s New in Defang CLI V1?

Defang is a powerful tool that lets you easily develop, deploy, and debug production-ready cloud applications. With Defang V1, we continue to deliver on our vision to make cloud development effortlessly simple and portable, with the ability to develop once and deploy anywhere. Here’s what’s included in this milestone release:

  • Production-Ready Support for AWS

Seamlessly deploy and scale with confidence on AWS. Defang is now WAFR-compliant, ensuring that your deployments conform to AWS best practices. Defang is now officially part of the AWS Partner Network.

  • New - Google Cloud Platform (GCP) in Preview

This week, we are excited to unveil support for deployments to GCP, in Preview. Start building and exploring and give us feedback as we work to enhance the experience further and move towards production support. Defang is also now officially part of the Google Cloud Partner Advantage program.

  • Support for DigitalOcean in Preview

Developers using DigitalOcean can explore our Preview features, with further enhancements and production support coming soon.

Defang Product Tiers and Introductory Pricing 🛠️

As we move into V1, we are also rolling out our differentiated product tiers, along with our special introductory pricing. Fear not, we will always have a free tier for hobbyists - conveniently called the Hobby tier. We now also provide Personal, Pro, and Enterprise tiers for customers with more advanced requirements. Check out what is included in each here. And as always, the Defang CLI is and remains open-source.

Launch Week Activities

We’ve lined up an exciting week of activities to showcase the power of Defang and bring together our growing community:

  • December 4: Vancouver CDW x AWS re:Invent Watch Party

Join us at the Vancouver CDW x AWS re:Invent Watch Party, where we will have a booth showcasing Defang’s capabilities and AWS integration. Stop by to learn more about Defang and see a live demo from the Defang dev team.

  • December 5–6: GFSA DemoDay and Git Push to 2025: Devs Social Party

Hear directly from Defang’s co-founder and CTO, Lio Lunesu, as we unveil Defang’s support for GCP at the Google for Startups Accelerator (GFSA) DemoDay event in Toronto. This event will also be live-streamed here.

Additionally, join us on December 5th for the final meetup of the year for Vancouver’s developer groups, hosted by VanJS in collaboration with other local dev communities.

  • December 6 & 7: MLH Global Hack Week (GHW)

Join us during MLH Global Hack Week for hands-on workshops and learn how to build production-ready cloud applications in minutes with Defang.

  • December 7: Cloud Chat

An IRL event with our team to explore V1 features in depth, answer your questions, and share insights from our journey.

  • December 10: Product Hunt Launch

Be part of our Product Hunt debut and show your support as we reach the broader tech community.

Join the Celebration 🎉

This launch week is not just about us. It is about you, our community. Here is how you can get involved:

  1. Explore the Platform: Sign up at Defang.io and dive into V1.

  2. Attend Events: Mark your calendar for our scheduled activities.

  3. Spread the Word: Follow us on LinkedIn and X, share your experiences, and let others know why you love Defang.

We’re excited to celebrate this milestone with all of you. Stay tuned for more updates, and let’s make Launch Week unforgettable!

· 3 min read

Defang Compose Update

Hi everyone,

We are a little late getting our monthly update out this time, but we did ship a number of important updates in October that we would like to inform you about. And more is coming… stay tuned for that!

  1. New CLI version 0.6.5: this was a big release with a number of improvements and bug fixes. You can find details in the release notes here, but to highlight a few:

    • defang --provider aws shows improved error message if unauthenticated
    • Added --wait and --wait-timeout flags
    • Improved generate menu: all samples are now shown (previously they were filtered by language), and the "Generate with AI" option now appears in the search filter.
  2. AI Debug for BYOC: AI Debug feature is incredibly useful in helping users find and fix issues when something goes wrong. Our initial version only worked on Playground, but now we have extended this to BYOC environments. We hope this makes it even easier for you to deploy your apps to AWS, DigitalOcean, etc.

  3. A range of other improvements, including

    • Network aliases support
    • GDPR Delete Me: You can now delete your Defang account from the Defang Portal
    • 30min time-out for the deployments to avoid runaway tasks in your account
    • Allow Postgres major version upgrades, e.g. changing the image from postgres:14 to postgres:16 in your Compose file (currently we only support this in --mode development; we will explore ways to support it in other modes in the future)
    • More feedback logs when containers exit, e.g. failing health-checks
    • Fixes for multiple BYOD domains in a single account
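The Postgres major-version upgrade mentioned above amounts to a one-line change in the Compose file, e.g.:

```yaml
services:
  db:
    image: postgres:16   # was postgres:14 before the upgrade
```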

Events and Adoption

In October, the Defang team was actively involved in a range of exciting events. We participated in MLH Cloud Week, StormHacks, and hosted a DevTools Vancouver meetup, bringing together local DevTool founders, engineers, and enthusiasts. It was inspiring to see Defang in action, helping these hackers build their amazing projects.

DevToolsMeetup

When we shipped our Public Beta earlier in 2024, we had a goal to reach 1000 users by end of year. We are pleased to announce that we have already reached this milestone a couple of months in advance! We are excited to see the momentum behind the product and how our users are using Defang for developing and deploying a variety of different applications. Thank you for your support!

The Road Ahead

The team is now heads-down dotting the i’s and crossing the t’s so we can release Defang V1 before the end of the year. This will enable customers to use Defang for production workloads. We look forward to sharing more in our next monthly update.


CoffeeChat

As always, we appreciate your feedback and are committed to making Defang the easiest way to develop, deploy, and debug your cloud applications. Go build something awesome! 🚀

· 4 min read

About the author: Linda Lee is an intern at Defang Software Labs who enjoys learning about computer-related things. She wrote this blog post after having fun with hardware at work.

My Story of Embedded Systems With Defang

Have you ever looked at a touch screen fridge and wondered how it works? Back in my day (not very long ago), a fridge was just a fridge. No fancy built-in interface, no images displayed, and no wifi. But times have changed, and I’ve learned a lot about embedded systems, thanks to Defang!

smart_fridge

My background was more in web development and the software side of things. Buffer flushing? Serial monitors? ESP32-S3? These were unheard of. Then one day at Defang, it was suggested that I work on a project with a SenseCAP Indicator, a small programmable touch screen device. Everyone wished me good luck when I started. That’s how I knew it wasn’t going to be an easy ride. But here I am, and I’m glad I did it.

What is embedded programming? It combines hardware with software to perform a function, such as interacting with the physical world or accessing cloud services. A common starting point for beginners is an Arduino board, which is what the SenseCAP Indicator uses for its hardware. My goal was to make a UI display for this device, send its input to a computer, and get that data into the cloud.

hand_typing

The Beginning

My journey kicked off with installing the Arduino IDE on my computer. It took me two hours—far longer than I expected—because the software versions I kept trying were not the right ones. Little did I know that I would encounter this issue many times later, such as when downloading ESP-IDF, a tool for firmware flashing. Figuring out what not to install quickly became a highly coveted skill.

The next part was writing software to display images and text. This was slightly less of a problem thanks to forums of users who had done the exact same thing several years ago. One tool I used was Squareline Studio, a UX/UI design tool for embedded devices. With a bit of trial and error, I got a simple static program displayed onto the device. Not half bad looking either. Here’s what it looked like:

ui_static

The Middle

Now came the networking part. Over wifi, I set up a Flask (Python) server on my computer to receive network pings from the SenseCAP Indicator. I used a library called ArduinoHTTPClient. At first, I wanted to ping the server each time a user touched the screen. Then came driver problems, platform incompatibilities, deprecated libraries…

… After weeks of limited progress due to resurfacing issues, I decided to adjust my goal: send pings on a fixed schedule of every five seconds rather than relying on user input. I changed the UI to be more colorful, and for good reason. Now, each network ping appears with a message on the screen. Can you look closely to see what it says?
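The receiving side can be sketched in a few lines of Flask. This is a minimal illustration, not the project's exact code: the `/ping` endpoint name and the response shape are my own assumptions.

```python
# Minimal Flask server that counts pings from the device.
# The /ping route and JSON response shape are illustrative assumptions.
from flask import Flask, request

app = Flask(__name__)
ping_count = 0

@app.route("/ping", methods=["POST"])
def ping():
    """Record one ping from the SenseCAP Indicator and acknowledge it."""
    global ping_count
    ping_count += 1
    print(f"Ping #{ping_count} from {request.remote_addr}")
    return {"status": "ok", "count": ping_count}

# When deployed, a WSGI server (or `flask run`) serves the app on port 5000.
```

On the device side, ArduinoHTTPClient would POST to this endpoint on a five-second timer.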

ui_wifi

This is what the Flask server looked like on my computer as it got pinged:

local_server

Hooray! Once everything was working, it was time to deploy my Flask code as a cloud service so I could access it from any computer, not just my own. Deployment usually takes several hours of configuring a ton of cloud provider settings. But I ain’t got time for that. Instead, I used Defang, which took care of all that for me and deployed it within minutes. Saved me a lot of time and tears.
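For reference, the Compose file for a service like this can be as small as the sketch below (the service name and Dockerfile are assumptions for illustration); `defang compose up` then handles the cloud configuration:

```yaml
services:
  flask-server:
    build: .          # Dockerfile that installs Flask and starts the app
    ports:
      - target: 5000
        published: 5000
        mode: ingress # expose publicly behind a generated deployment URL
```

The `ingress` mode is what makes the server reachable from any computer via the deployment link.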

Here’s the Flask deployment on Defang’s Portal view:

portal_view

Here’s the Flask server on the cloud, accessed with a deployment link:

deployed_server

The End

After two whole months, I finally completed my journey from start to finish! This project was an insightful dive into the world of embedded systems, internet networking, and cloud deployment.

Before I let you go, here are the hard lessons from hardware, from yours truly:

  1. Learning what not to do can be just as important.
  2. Some problems are not as unique as you think.
  3. One way to achieve a goal is by modifying it.
  4. Choose the simpler way if it is offered.
  5. That’s where Defang comes in.

Want to try deploying to the cloud yourself? You can try it out here. Keep on composing up! 💪