Deploying Rust App – Part 1: Going Serverless with Google Cloud Run

by Daniel Zelei

Apr 21

Your first server written in Rust is ready, and you'd like to show it to the world, not just run it on localhost. So naturally, you want to deploy it somewhere. What are your options? Plenty: Google Cloud, AWS, DigitalOcean, Azure, and even Rust-native platforms like shuttle.rs.

In the first part of this deployment series, we’ll show how to deploy your Rust server to Google Cloud – Cloud Run.

What is Google Cloud Run?

Google Cloud Run is a fully managed compute platform that automatically scales your stateless containers. One of its biggest advantages? You only pay when your code is running. That means if there are no incoming requests, your container can scale down to zero—and you won’t be charged a cent.

It’s built on Knative, runs on top of containers, and supports a wide variety of languages and frameworks—including Rust, as long as you can package it in a container image.

Why use Cloud Run?

  • Serverless: No need to manage servers or infrastructure.
  • Autoscaling: Scales from 0 to as many requests as needed, instantly.
  • Pay-per-use: Only pay for what you use, billed on the CPU and memory your container consumes while handling requests.
  • Easy deployment: One command deploys your container, either from local source or from a container registry.

🛠️ Set Up Your Rust Project

Start by creating a new Rust app:

cargo new rust-cloud-run
cd rust-cloud-run

Update your Cargo.toml to include the actix-web dependency:

[dependencies]
actix-web = "4"

Now replace the contents of src/main.rs with a minimal web server:

use actix_web::{get, App, HttpServer, Responder};

#[get("/")]
async fn hello() -> impl Responder {
    "Hello from Rust on Cloud Run!"
}

#[actix_web::main]
async fn main() -> std::io::Result<()> {
    // Cloud Run tells the container which port to listen on via the PORT env var;
    // fall back to 8080 for local runs.
    let port = std::env::var("PORT").unwrap_or_else(|_| "8080".to_string());
    HttpServer::new(|| App::new().service(hello))
        .bind(("0.0.0.0", port.parse::<u16>().unwrap()))?
        .run()
        .await
}
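
Before touching containers, it's worth a quick local sanity check that the server compiles and responds (nothing here is Cloud Run specific; it just uses the default 8080 port):

cargo run

Then, in a second terminal:

curl http://localhost:8080/

You should get back "Hello from Rust on Cloud Run!".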

🐳 Add a Dockerfile

At the root of your project, create a Dockerfile like this:

# Stage 1: Build using Rust 1.86.0 (latest stable as of April 2025)
FROM rust:1.86.0-slim as builder

# Create app directory and copy project files
WORKDIR /app
COPY . .

# Install required packages for compiling
RUN apt-get update && apt-get install -y \
    pkg-config \
    libssl-dev \
    build-essential \
 && rm -rf /var/lib/apt/lists/*

# Compile the Rust project in release mode
RUN cargo build --release

# Stage 2: Create minimal image with the compiled binary
# (same Debian release as the Rust builder image, so glibc versions match)
FROM debian:bookworm-slim

# Install runtime dependencies (like SSL certs)
RUN apt-get update && apt-get install -y \
    ca-certificates \
 && rm -rf /var/lib/apt/lists/*

# Copy the compiled binary from builder stage
COPY --from=builder /app/target/release/rust-cloud-run /app

# Set environment variable for port (used by Cloud Run)
ENV PORT=8080
EXPOSE 8080

# Run the binary
CMD ["/app"]
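
If you have Docker installed locally, you can also build and test the image before involving Google Cloud at all (the image tag below is just an example name):

docker build -t rust-cloud-run .
docker run --rm -p 8080:8080 rust-cloud-run

A curl against http://localhost:8080/ should answer exactly like the cargo run version did.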

🔧 Set Up Google Cloud

First, make sure you have the Google Cloud SDK installed:

👉 Install it here: https://cloud.google.com/sdk/docs/install

Then log in:

gcloud auth login

Set the project you want to deploy to:

gcloud config set project YOUR_PROJECT_ID
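
Deploying from source relies on a couple of Google Cloud APIs (Cloud Run itself and Cloud Build). The deploy command will offer to enable them for you, but if you prefer to get that out of the way up front, something like this should work:

gcloud services enable run.googleapis.com cloudbuild.googleapis.com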

🚀 Deploy to Cloud Run

Now you're ready to deploy! From your project root:

gcloud run deploy

During the first deployment:

  • It might ask you to enable some APIs—go ahead and confirm.
  • Select a region when prompted.
  • Make sure to allow unauthenticated access so anyone can access your app.
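
If you'd rather skip the prompts entirely, the same deployment can be expressed with flags; the service name and region below are just example values:

gcloud run deploy rust-cloud-run --source . --region europe-west3 --allow-unauthenticated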

Once it’s done, you’ll get a live URL like:

https://rust-cloud-run-234166774295.europe-west3.run.app

Open that in a browser and boom 💥—your Rust app is live on the internet.


🔜 What’s Next?

In the next article, we’ll explore a different path: deploying to Google Compute Engine, where you’ll manage the VM yourself. This gives you more control and flexibility—but also a bit more responsibility.

After that, we’ll switch gears and take a look at what AWS offers for hosting Rust apps.