Deployment
Pulse deploys as a single Node.js process. No adapters, no serverless wrapping, no separate static file server required. Build once, run anywhere Node 22+ runs. All guarantees — security headers, brotli compression, immutable asset caching — are active in production automatically.
Build
Run the production build before deploying:
```bash
npm run build
```
This generates content-hashed bundles in public/dist/ and writes public/dist/manifest.json. The server reads the manifest at startup to resolve hydration script paths.
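The manifest simply maps entry names to their hashed bundle paths. A sketch of the shape it might take (the keys and filenames shown are illustrative, not Pulse's documented schema):

```json
{
  "hydrate": "/dist/hydrate.4f8a1c9e.js",
  "vendor": "/dist/vendor.b03d2e77.js"
}
```

Because the filenames change whenever their contents change, the server can cache them aggressively without ever serving stale code.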
Content hashing is what allows bundles to be served with immutable cache headers. Always run npm run build before deploying to production.

What to deploy
| Include | Reason |
|---|---|
| src/ | Page specs — imported by the server at runtime |
| public/ | Static assets and built bundles (public/dist/) |
| server.js | Entry point |
| pulse.config.js | Server config |
| package.json + node_modules/ | Runtime dependencies |
| Exclude | Reason |
|---|---|
| .claude/ | AI agent config — not needed at runtime |
| .pulse/ | Local report data — not needed at runtime |
Environment variables
| Variable | Default | Description |
|---|---|---|
| NODE_ENV | development | Set to production to enable HSTS headers and production cache behaviour. |
| PORT | Value in pulse.config.js (default 3000) | Override the listening port. Most PaaS platforms set this automatically. |
Start the server in production mode:

```bash
NODE_ENV=production pulse start
```
VPS with PM2
PM2 keeps the process alive, restarts it on crash, and manages logs.
```bash
# Install PM2 globally
npm install -g pm2

# Start the app
NODE_ENV=production pm2 start server.js --name myapp

# Persist across reboots
pm2 save
pm2 startup

# Zero-downtime reload after a deploy
pm2 reload myapp
```
For repeatable deployments, check an ecosystem.config.cjs into version control:
```js
// ecosystem.config.cjs
module.exports = {
  apps: [{
    name: 'myapp',
    script: 'server.js',
    env_production: {
      NODE_ENV: 'production',
      PORT: 3000,
    },
  }],
}
```
```bash
pm2 start ecosystem.config.cjs --env production
```
Docker
A two-stage build keeps the image small — build tools stay in the first stage.
```dockerfile
# ---- build stage ----
FROM node:22-alpine AS build
WORKDIR /app
COPY package*.json ./
RUN npm ci
COPY . .
RUN npx pulse build

# ---- runtime stage ----
FROM node:22-alpine
WORKDIR /app
ENV NODE_ENV=production
COPY package*.json ./
RUN npm ci --omit=dev
COPY --from=build /app/src ./src
COPY --from=build /app/public ./public
COPY --from=build /app/server.js ./server.js
COPY pulse.config.js ./
EXPOSE 3000
CMD ["node", "server.js"]
```
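A .dockerignore keeps the build context lean. The entries below mirror the exclude table above, plus artifacts that are rebuilt inside the image; treat the exact list as a suggestion rather than a requirement:

```
# .dockerignore
node_modules
public/dist
.git
.claude
.pulse
```

node_modules is excluded because npm ci reinstalls dependencies in the image, and public/dist because npx pulse build regenerates it during the build stage.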
```bash
docker build -t myapp .
docker run -p 3000:3000 --env NODE_ENV=production myapp
```
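For testing the production image locally, a compose file can pin the same settings in one place (the service name and port mapping here are illustrative):

```yaml
# docker-compose.yml
services:
  web:
    build: .
    ports:
      - "3000:3000"
    environment:
      NODE_ENV: production
    restart: unless-stopped
```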
Fly.io
```toml
# fly.toml
app = 'myapp'
primary_region = 'lhr'

[env]
  NODE_ENV = 'production'

[build]
  [build.args]
    NODE_VERSION = '22'

[deploy]
  release_command = 'npx pulse build'

[http_service]
  internal_port = 3000
  force_https = true
  auto_stop_machines = 'stop'
  auto_start_machines = true

[[vm]]
  memory = '256mb'
  cpu_kind = 'shared'
  cpus = 1
```
```bash
# First deploy
fly launch

# Subsequent deploys
fly deploy
```
Railway
Railway auto-detects Node apps. Add a railway.json to set the build and start commands:
```json
{
  "$schema": "https://railway.app/railway.schema.json",
  "build": {
    "builder": "NIXPACKS",
    "buildCommand": "npm run build"
  },
  "deploy": {
    "startCommand": "NODE_ENV=production node server.js",
    "healthcheckPath": "/",
    "restartPolicyType": "ON_FAILURE"
  }
}
```
Render
| Setting | Value |
|---|---|
| Environment | Node |
| Build command | npm install && npm run build |
| Start command | NODE_ENV=production node server.js |
| Node version | 22.x |
Set NODE_ENV=production in the Render environment variables dashboard.
Vercel, Cloudflare, and edge platforms
These platforms each have multiple products with very different runtimes — the compatibility story varies significantly between them.
Vercel
Vercel has two distinct runtimes:
| Product | Runtime | Pulse compatible? |
|---|---|---|
| Functions (Node.js) | Full Node.js — same built-ins as a VPS | Partially — see below |
| Edge Functions | V8 isolates (no Node built-ins) | No |
Vercel Functions (Node.js) can run Pulse with some differences in behaviour:
| Feature | Behaviour on Vercel Functions |
|---|---|
| serverTtl cache | Works within a warm instance, but cold starts reset it. Not reliable for expensive queries. |
| Streaming SSR | Vercel Functions support streaming responses, but require explicit configuration via supportsResponseStreaming. |
| Static files | Vercel serves public/ automatically via its CDN — Pulse's static file serving is bypassed. |
| Security headers | Work as normal — Pulse adds them to every response. |
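If you do deploy to Vercel Functions, streaming must be opted into per function via an exported config object. A minimal sketch of such a function file; the commented-out delegation to Pulse's handler is a hypothetical, not a documented Pulse export:

```javascript
// api/index.js — opts this function into streamed responses on
// Vercel's Node.js runtime.
export const config = { supportsResponseStreaming: true };

// Hypothetical: delegate requests to Pulse's handler. The import path
// is an assumption about Pulse's API surface, not a documented export:
// export { default } from '../server.js';
```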
Cloudflare
| Product | Runtime | Pulse compatible? |
|---|---|---|
| Workers | V8 isolates — no node:http, node:fs, node:zlib | No |
| Pages Functions | Same V8 isolate runtime as Workers | No |
| CDN / proxy | Sits in front of your origin server | Yes — works great with Fly.io or a VPS behind it |
HTTPS and reverse proxy
Pulse detects TLS automatically. When a request arrives with an x-forwarded-proto: https header or over a direct TLS socket, Strict-Transport-Security: max-age=31536000; includeSubDomains is added to the response. All four platforms above forward this header — no Pulse config is needed.
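Conceptually, the check looks something like the following sketch (illustrative only, not Pulse's actual source):

```javascript
// Illustrative sketch of how TLS detection behind a proxy typically works.
function isSecure(req) {
  // Direct TLS socket (server created with node:https or node:tls).
  if (req.socket && req.socket.encrypted) return true;
  // Behind a proxy: trust the first, client-facing value of the header,
  // since intermediate proxies may append their own entries.
  const proto = req.headers['x-forwarded-proto'];
  return typeof proto === 'string' && proto.split(',')[0].trim() === 'https';
}
```

Only the first comma-separated value is consulted because each hop in a proxy chain may append to x-forwarded-proto, and the leftmost entry describes the original client connection.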
If running behind nginx for TLS termination:
```nginx
# nginx — TLS termination, proxy to Pulse
server {
    listen 443 ssl;
    server_name myapp.com;

    ssl_certificate /etc/letsencrypt/live/myapp.com/fullchain.pem;
    ssl_certificate_key /etc/letsencrypt/live/myapp.com/privkey.pem;

    location / {
        proxy_pass http://localhost:3000;
        proxy_http_version 1.1;
        proxy_set_header Host $host;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        proxy_set_header X-Forwarded-Proto $scheme;
    }
}

# Redirect HTTP to HTTPS
server {
    listen 80;
    server_name myapp.com;
    return 301 https://$host$request_uri;
}
```
Certificates can be issued and installed with certbot --nginx -d myapp.com.