Compare commits

..

20 Commits

SHA1 Message Date
f86a2db161 feat(dev): add Overmind-based development workflow
Add Procfile.dev for parallel backend/frontend development with
Overmind, replacing the sequential build approach. The new `just dev`
recipe starts both servers simultaneously with log multiplexing.

Changes:
- Update `just dev` to use Overmind
- Add `build-demo` recipe with optional Windows cross-compilation
- Add `smoke` recipe to verify development setup
- Improve logging in backend with bind address and dev mode indicator
- Allow missing Windows executable in debug builds
2025-12-11 18:23:11 -06:00
4a191a59f4 feat(frontend): add mobile device detection and warning modal
Add mobile device detection to prevent confusion when users try to download desktop applications on mobile devices. The download button now displays "Download for Desktop" on mobile and shows a warning modal explaining that the executables are for desktop platforms only. The warning is shown once per session and can be acknowledged to proceed with viewing download options.
2025-12-11 18:18:55 -06:00
b4022ff9db chore: add bacon config and improve dev workflow
- Add bacon.toml for Rust development watching with keybindings
- Update Justfile to use bacon for dev watching
- Configure frontend to build to ./public for backend serving
- Improve Justfile organization with comments and better task separation
- Add dev-backend and dev-frontend tasks for separate workflows
- Minor formatting fix in backend/src/state.rs
2025-12-11 17:58:10 -06:00
2532a21772 feat(backend): add thiserror-based error handling
Introduce an AppError enum to replace panic-based error handling in executable loading and state management. Add proper error propagation with descriptive error messages for missing executables, key patterns, and environment variables.
2025-12-11 17:43:40 -06:00
fd474767ae feat(frontend): add platform icons and improve download button UX
Replace generic download icon with platform-specific icons (Windows,
macOS, Linux) using react-icons. Show detected platform name in the
main download button text and disable auto-download when platform
cannot be detected, requiring manual selection from dropdown instead.
2025-12-11 17:40:04 -06:00
65aa9d66d3 ci: add Docker build and publish workflow
Add a GitHub Actions workflow job to build and publish Docker images to GitHub Container Registry. Images are pushed on master branch commits and tags, with a tagging strategy covering semver, branch refs, and SHA.
2025-12-11 17:28:43 -06:00
e23c01e4fd refactor: reorganize backend modules and create lib.rs 2025-12-11 17:22:05 -06:00
d4454d7367 refactor: extract handlers to handlers/ directory 2025-12-11 17:22:05 -06:00
1a2b8c4407 refactor: convert to Cargo workspace structure 2025-12-11 17:21:10 -06:00
702205e181 refactor(docker): optimize multi-stage build with cargo-chef and layer caching
Improves Docker build performance and security through better layer caching,
dependency pre-building, and a minimal non-root runtime container.

- Add cargo-chef for Rust dependency caching across demo and server builds
- Separate planner and builder stages for optimal layer reuse
- Use pnpm with frozen lockfile for reproducible frontend builds
- Switch to debian:12-slim runtime with non-root user (uid 1000)
- Add health check endpoint monitoring
- Strip release binaries to reduce image size
- Pre-compress frontend assets during build
2025-12-11 12:18:57 -06:00
006055cb7f chore: add Justfile, apply clippy fixes, add frontend type checking
- Add Justfile with comprehensive development workflow commands (check,
lint, build, docker, etc.)
- Add @astrojs/check and typescript dependencies for frontend type
checking
2025-12-11 12:15:51 -06:00
8129975ecc chore(deps): resolve cargo-audit warnings, update dependencies 2025-12-11 12:07:22 -06:00
3ba9250cca refactor: apply clippy suggestions 2025-12-11 12:01:45 -06:00
82ac8caa88 refactor: migrate to envy for type-safe config parsing
Replace manual environment variable parsing with envy for structured configuration. Update dotenv to dotenvy (the maintained fork), add a dedicated Config struct with Railway-specific settings, and consolidate all environment variable access. Upgrade reqwest to 0.12.
2025-12-11 11:55:39 -06:00
a9e3ab8337 perf: optimize release profile for smaller binary size
Configure release profile with aggressive size optimizations:
- Set opt-level to 'z' for minimum binary size
- Enable LTO for better optimization across crates
- Strip debug info to reduce final binary size
- Use panic=abort to eliminate unwinding machinery
- Reduce codegen units to 1 for maximum optimization
- Keep overflow checks for safety in production
2025-12-11 11:51:30 -06:00
24c2c2b3c5 ci: add Renovate config and GitHub Actions quality workflow
Add automated dependency management with Renovate and comprehensive CI
checks including formatting, clippy, audit, and frontend build validation.
2025-12-11 11:50:59 -06:00
280f01bb28 feat: dynamic deployment id fetching in debug mode for development 2025-08-20 18:03:05 -05:00
1ffdd2b6eb fix(demo): avoid logging assumptions about state 2025-08-20 17:51:16 -05:00
e0bb0f52f0 feat: dynamically set PORT from Dockerfile, add .env.example 2025-08-20 17:40:09 -05:00
d20f298da5 feat: fetch builds logs for Railway on startup 2025-08-20 17:40:09 -05:00
44 changed files with 3592 additions and 3031 deletions

.dockerignore Normal file (+45)

@@ -0,0 +1,45 @@
# Git
.git
.gitignore
.github
# Rust
target/
**/*.rs.bk
# Node.js
node_modules/
npm-debug.log*
yarn-debug.log*
yarn-error.log*
pnpm-debug.log*
.pnpm-store/
# Frontend build output
frontend/dist/
frontend/.next/
frontend/out/
# Environment files
.env
.env.*
!.env.example
# IDEs
.vscode/
.idea/
*.swp
*.swo
*~
.DS_Store
# Documentation
*.md
!README.md
# CI/CD
.github/
# Other
Justfile
.dockerignore

.env.example Normal file (+14)

@@ -0,0 +1,14 @@
# optional, used for fetching build logs, not configured automatically
RAILWAY_TOKEN=your_railway_token_here
# optional but automatically configured by Railway
# RAILWAY_PROJECT_ID=your_project_id_here
# RAILWAY_SERVICE_ID=your_service_id_here
# RAILWAY_ENVIRONMENT_ID=your_environment_id_here
# RAILWAY_DEPLOYMENT_ID=your_deployment_id_here
# optional, automatically configured by Railway
# PORT=5800
# optional, has a default you may not want
# RAILWAY_PUBLIC_DOMAIN=your-domain.railway.app

.github/renovate.json vendored Normal file (+42)

@@ -0,0 +1,42 @@
{
"$schema": "https://docs.renovatebot.com/renovate-schema.json",
"extends": [
"config:recommended",
":dependencyDashboard",
":semanticCommits",
":automergeDigest",
":automergeMinor"
],
"schedule": ["before 5am on monday"],
"timezone": "America/Chicago",
"prConcurrentLimit": 3,
"prCreation": "not-pending",
"rebaseWhen": "behind-base-branch",
"semanticCommitScope": "deps",
"vulnerabilityAlerts": {
"labels": ["security"],
"automerge": true,
"schedule": ["at any time"]
},
"packageRules": [
{
"description": "Group all non-major dependency updates together",
"groupName": "all non-major dependencies",
"matchUpdateTypes": ["minor", "patch", "digest"],
"automerge": true,
"automergeType": "pr",
"minimumReleaseAge": "3 days"
},
{
"description": "Major updates get individual PRs for review",
"matchUpdateTypes": ["major"],
"automerge": false,
"minimumReleaseAge": "7 days"
}
],
"postUpdateOptions": ["pnpmDedupe"],
"lockFileMaintenance": {
"enabled": true,
"schedule": ["before 5am on monday"]
}
}

.github/workflows/quality.yaml vendored Normal file (+134)

@@ -0,0 +1,134 @@
name: Quality
on: [push, pull_request]
env:
CARGO_TERM_COLOR: always
jobs:
format:
name: Format
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v4
- name: Install Rust
uses: dtolnay/rust-toolchain@stable
with:
components: rustfmt
- name: Check formatting
run: cargo fmt --all -- --check
clippy:
name: Clippy
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v4
- name: Install Rust
uses: dtolnay/rust-toolchain@stable
with:
components: clippy
- uses: Swatinem/rust-cache@v2
- name: Run clippy
run: cargo clippy --workspace --all-targets --all-features -- -D warnings
audit:
name: Audit
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v4
- uses: taiki-e/install-action@cargo-audit
- name: Run audit
run: cargo audit
check:
name: Check
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v4
- name: Install Rust
uses: dtolnay/rust-toolchain@stable
- uses: Swatinem/rust-cache@v2
- name: Run check
run: cargo check --workspace --all-targets --all-features
frontend:
name: Frontend
runs-on: ubuntu-latest
defaults:
run:
working-directory: frontend
steps:
- uses: actions/checkout@v4
- name: Setup pnpm
uses: pnpm/action-setup@v4
with:
version: 9
- name: Setup Node.js
uses: actions/setup-node@v4
with:
node-version: 22
cache: pnpm
cache-dependency-path: frontend/pnpm-lock.yaml
- name: Install dependencies
run: pnpm install --frozen-lockfile
- name: Type check
run: pnpm astro check
- name: Build
run: pnpm build
docker:
name: Docker
runs-on: ubuntu-latest
permissions:
contents: read
packages: write
steps:
- uses: actions/checkout@v4
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@v3
- name: Log in to GitHub Container Registry
if: github.ref == 'refs/heads/master' || startsWith(github.ref, 'refs/tags/')
uses: docker/login-action@v3
with:
registry: ghcr.io
username: ${{ github.actor }}
password: ${{ secrets.GITHUB_TOKEN }}
- name: Extract metadata
id: meta
uses: docker/metadata-action@v5
with:
images: ghcr.io/${{ github.repository }}
tags: |
type=ref,event=branch
type=ref,event=pr
type=semver,pattern={{version}}
type=semver,pattern={{major}}.{{minor}}
type=sha
- name: Build and push
uses: docker/build-push-action@v6
with:
context: .
push: ${{ github.ref == 'refs/heads/master' || startsWith(github.ref, 'refs/tags/') }}
tags: ${{ steps.meta.outputs.tags }}
labels: ${{ steps.meta.outputs.labels }}
cache-from: type=gha
cache-to: type=gha,mode=max

.gitignore vendored (+1)

@@ -3,3 +3,4 @@
demo-*
/public
.env
.overmind.sock

Cargo.lock generated (+1451; file diff suppressed because it is too large)

Cargo.toml

@@ -1,17 +1,34 @@
[package]
name = "dynamic-preauth"
[workspace]
resolver = "2"
members = ["backend", "demo"]
[workspace.package]
version = "0.1.0"
edition = "2021"
[dependencies]
[workspace.dependencies]
anyhow = "1.0.95"
chrono = { version = "0.4.39", features = ["serde"] }
dotenvy = "0.15.7"
envy = "0.4.2"
futures-util = "0.3.31"
hex = "0.4.3"
rand = "0.8.5"
salvo = { version = "0.74.3", features = ["affix-state", "catch-panic", "cors", "logging", "serve-static", "websocket"] }
regex = "1.10"
reqwest = { version = "0.12", default-features = false }
salvo = { version = "0.74.3", features = ["affix-state", "catch-panic", "cors", "logging", "serve-static", "test", "websocket"] }
serde = { version = "1.0.216", features = ["derive"] }
serde_json = "1.0.134"
sha2 = "0.10.8"
tokio = { version = "1", features = ["macros"] }
tokio-stream = "0.1.17"
tracing = "0.1"
tracing-subscriber = { version = "0.3", features = ["env-filter"] }
[profile.release]
opt-level = "z"
lto = true
strip = "debuginfo"
panic = "abort"
codegen-units = 1
overflow-checks = true

Dockerfile

@@ -1,55 +1,137 @@
# Build the demo application
FROM rust:latest AS builder-demo
# syntax=docker/dockerfile:1
ARG RUST_VERSION=1.86.0
WORKDIR /build/demo
RUN apt update && apt install -y g++-mingw-w64-x86-64
ARG RAILWAY_PUBLIC_DOMAIN
# --- Chef Base Stage ---
FROM lukemathwalker/cargo-chef:latest-rust-${RUST_VERSION} AS chef
WORKDIR /app
RUN rustup target add x86_64-pc-windows-gnu
RUN rustup target add x86_64-unknown-linux-gnu
# TODO: Add support for macOS
# RUN rustup target add x86_64-apple-darwin
# --- Demo Planner Stage ---
FROM chef AS demo-planner
COPY Cargo.toml Cargo.lock ./
COPY backend ./backend
COPY demo ./demo
RUN cargo chef prepare --recipe-path recipe.json --bin demo
COPY ./demo ./
# --- Demo Builder Stage ---
FROM chef AS demo-builder
RUN cargo build --release --target x86_64-pc-windows-gnu
RUN cargo build --release --target x86_64-unknown-linux-gnu
# RUN cargo build --release --target x86_64-apple-darwin
# Install cross-compilation toolchain for Windows
RUN apt-get update && apt-get install -y \
g++-mingw-w64-x86-64 \
&& rm -rf /var/lib/apt/lists/*
# Build the server application
FROM rust:alpine AS builder-server
# Add cross-compilation targets
RUN rustup target add x86_64-pc-windows-gnu x86_64-unknown-linux-gnu
RUN apk update && apk add musl-dev
WORKDIR /build/server
# Copy recipe and cook dependencies
COPY --from=demo-planner /app/recipe.json recipe.json
RUN cargo chef cook --release --target x86_64-unknown-linux-gnu --recipe-path recipe.json --bin demo
RUN cargo chef cook --release --target x86_64-pc-windows-gnu --recipe-path recipe.json --bin demo
COPY ./src ./src
COPY ./Cargo.toml ./Cargo.lock ./
RUN cargo build --release
# Build the Astro frontend
FROM node:lts AS builder-astro
WORKDIR /build/astro
COPY ./frontend/ ./
# Copy source and build
COPY Cargo.toml Cargo.lock ./
COPY backend ./backend
COPY demo ./demo
ARG RAILWAY_PUBLIC_DOMAIN
ENV RAILWAY_PUBLIC_DOMAIN=${RAILWAY_PUBLIC_DOMAIN}
RUN npm install pnpm -g
RUN pnpm install
RUN pnpm build
RUN ./compress.sh
RUN cargo build --release --target x86_64-unknown-linux-gnu --bin demo
RUN cargo build --release --target x86_64-pc-windows-gnu --bin demo
# Run the server application
FROM alpine:latest
# Strip binaries
RUN strip target/x86_64-unknown-linux-gnu/release/demo
# --- Server Planner Stage ---
FROM chef AS server-planner
COPY Cargo.toml Cargo.lock ./
COPY backend ./backend
COPY demo ./demo
RUN cargo chef prepare --recipe-path recipe.json
# --- Server Builder Stage ---
FROM chef AS server-builder
# Copy recipe and cook dependencies
COPY --from=server-planner /app/recipe.json recipe.json
RUN cargo chef cook --release --recipe-path recipe.json
# Copy source and build
COPY Cargo.toml Cargo.lock ./
COPY backend ./backend
COPY demo ./demo
RUN cargo build --release --bin dynamic-preauth
# Strip binary
RUN strip target/release/dynamic-preauth
# --- Frontend Builder Stage ---
FROM node:22-slim AS frontend-builder
WORKDIR /app
COPY --from=builder-astro /build/astro/dist/ ./public/
COPY --from=builder-demo /build/demo/target/x86_64-pc-windows-gnu/release/demo.exe ./demo.exe
COPY --from=builder-demo /build/demo/target/x86_64-unknown-linux-gnu/release/demo ./demo-linux
COPY --from=builder-server /build/server/target/release/dynamic-preauth ./dynamic-preauth
# Install pnpm
RUN corepack enable && corepack prepare pnpm@9 --activate
EXPOSE 5800
CMD ["/app/dynamic-preauth"]
# Copy package files for layer caching
COPY frontend/package.json frontend/pnpm-lock.yaml ./
# Install dependencies
RUN pnpm install --frozen-lockfile
# Copy source and build
COPY frontend/ ./
ARG RAILWAY_PUBLIC_DOMAIN
ENV RAILWAY_PUBLIC_DOMAIN=${RAILWAY_PUBLIC_DOMAIN}
RUN pnpm build
# Pre-compress static assets
RUN ./compress.sh
# --- Runtime Stage ---
FROM debian:12-slim
ARG APP=/app
ARG APP_USER=appuser
ARG UID=1000
ARG GID=1000
# Install runtime dependencies
RUN apt-get update && apt-get install -y --no-install-recommends \
ca-certificates \
tzdata \
wget \
&& rm -rf /var/lib/apt/lists/*
ARG TZ=Etc/UTC
ENV TZ=${TZ}
# Create non-root user
RUN addgroup --gid $GID $APP_USER \
&& adduser --uid $UID --disabled-password --gecos "" --ingroup $APP_USER $APP_USER \
&& mkdir -p ${APP}
WORKDIR ${APP}
# Copy built artifacts
COPY --from=frontend-builder --chown=$APP_USER:$APP_USER /app/dist/ ./public/
COPY --from=demo-builder --chown=$APP_USER:$APP_USER /app/target/x86_64-pc-windows-gnu/release/demo.exe ./demo.exe
COPY --from=demo-builder --chown=$APP_USER:$APP_USER /app/target/x86_64-unknown-linux-gnu/release/demo ./demo-linux
COPY --from=server-builder --chown=$APP_USER:$APP_USER /app/target/release/dynamic-preauth ./dynamic-preauth
# Set proper permissions
RUN chmod +x ${APP}/dynamic-preauth
USER $APP_USER
# Build-time arg for PORT, default to 5800
ARG PORT=5800
ENV PORT=${PORT}
EXPOSE ${PORT}
# Health check
HEALTHCHECK --interval=30s --timeout=3s --start-period=5s --retries=3 \
CMD wget --no-verbose --tries=1 --spider http://localhost:${PORT}/session || exit 1
CMD ["./dynamic-preauth"]

Justfile Normal file (+191)

@@ -0,0 +1,191 @@
# Justfile for dynamic-preauth
# Uses bacon for Rust watching, pnpm for frontend
# Frontend builds to ./public, which backend serves as static files
# Variables
image_name := "dynamic-preauth"
container_name := "dynamic-preauth-dev"
port := "5800"
# Default recipe
default:
@just --list
# Run all checks (matches quality workflow)
check: format-check cargo-check lint audit frontend-check frontend-build
@echo "All checks passed!"
# Format all Rust code
format:
@echo "Formatting code..."
cargo fmt --all
# Check formatting without modifying
format-check:
@echo "Checking formatting..."
cargo fmt --all -- --check
# Check code without building
cargo-check:
@echo "Running cargo check..."
cargo check --workspace --all-targets --all-features
# Lint with clippy
lint:
@echo "Running clippy..."
cargo clippy --workspace --all-targets --all-features -- -D warnings
# Frontend type check
frontend-check:
@echo "Checking frontend..."
pnpm --dir frontend astro check
# Build frontend
frontend-build:
@echo "Building frontend..."
pnpm --dir frontend build
# Build demo executables (debug mode for faster dev builds)
build-demo:
#!/usr/bin/env bash
set -euo pipefail
echo "Building demo executables..."
# Always build Linux demo
echo "Building Linux demo..."
cargo build --bin demo
cp target/debug/demo ./demo-linux
echo " [OK] Linux demo built"
# Try to build Windows demo if cross-compilation is available
if rustup target list --installed | grep -q x86_64-pc-windows-gnu; then
echo "Building Windows demo..."
if cargo build --bin demo --target x86_64-pc-windows-gnu 2>/dev/null; then
cp target/x86_64-pc-windows-gnu/debug/demo.exe ./demo.exe
echo " [OK] Windows demo built"
else
echo " [!] Windows build failed (mingw-w64 toolchain may not be installed)"
echo " Continuing without Windows demo..."
fi
else
echo " [SKIP] Windows target not installed"
echo " Install with: rustup target add x86_64-pc-windows-gnu"
echo " Also requires: sudo apt install mingw-w64"
fi
echo "Demo executables ready!"
# Development server with hot reload (backend + frontend using Overmind)
dev: build-demo
@echo "Starting development servers with Overmind..."
@echo ""
@echo "Backend will run on: http://localhost:5800"
@echo "Frontend will run on: http://localhost:4321"
@echo ""
@echo "Overmind multiplexes logs with prefixes:"
@echo " [backend] - Bacon watching Rust backend"
@echo " [frontend] - Astro dev server"
@echo ""
@echo "Overmind shortcuts:"
@echo " Ctrl+C - Stop all processes"
@echo " 'overmind connect <process>' - Attach to a specific process"
@echo ""
overmind start -f Procfile.dev
# Watch backend only (for when frontend is already built)
dev-backend: build-demo
@echo "Starting backend watch with bacon..."
bacon run
# Watch and serve frontend only
dev-frontend:
@echo "Starting frontend dev server..."
@echo "Make sure the backend is running on port 5800!"
pnpm --dir frontend dev
# Simple development run (no hot reload)
run:
@echo "Starting server..."
cargo run --bin dynamic-preauth
# Build release
build:
@echo "Building release..."
cargo build --workspace --release
# Security audit
audit:
@echo "Running security audit..."
cargo audit
# Build Docker image (ensures frontend is built first)
docker-build: frontend-build
@echo "Building Docker image..."
docker build -t {{image_name}}:latest .
# Run Docker container
docker-run: docker-build
@echo "Running Docker container..."
docker run --rm -d --name {{container_name}} -p {{port}}:{{port}} -e PORT={{port}} {{image_name}}:latest
@echo "Container started at http://localhost:{{port}}"
# Stop Docker container
docker-stop:
@echo "Stopping Docker container..."
docker stop {{container_name}} || true
# Docker logs
docker-logs:
docker logs {{container_name}}
# Follow Docker logs
docker-logs-follow:
docker logs -f {{container_name}}
# Clean Docker artifacts
docker-clean: docker-stop
@echo "Cleaning Docker artifacts..."
docker rmi {{image_name}}:latest || true
# Clean cargo artifacts
clean:
@echo "Cleaning cargo artifacts..."
cargo clean
# Full CI pipeline
ci: format-check lint frontend-check build docker-build
@echo "CI pipeline completed!"
# Quick development check (format + clippy)
quick: format
@echo "Running quick clippy check..."
cargo clippy --workspace --all-targets --all-features -- -D warnings
@echo "Quick check completed!"
# Verify dev setup is ready (builds demo executables and checks dependencies)
smoke: build-demo
@echo "Verifying development setup..."
@echo ""
@echo "Checking for overmind (required for 'just dev')..."
@command -v overmind >/dev/null 2>&1 || { echo " [!] overmind not found. Install from: https://github.com/DarthSim/overmind#installation"; exit 1; }
@echo " [OK] overmind found"
@echo ""
@echo "Checking for bacon..."
@command -v bacon >/dev/null 2>&1 || { echo " [!] bacon not found. Install with: cargo install bacon"; exit 1; }
@echo " [OK] bacon found"
@echo ""
@echo "Checking for pnpm..."
@command -v pnpm >/dev/null 2>&1 || { echo " [!] pnpm not found. Install from: https://pnpm.io/installation"; exit 1; }
@echo " [OK] pnpm found"
@echo ""
@echo "Checking demo executables..."
@test -f ./demo-linux || { echo " [!] demo-linux not found"; exit 1; }
@echo " [OK] demo-linux exists"
@if [ -f ./demo.exe ]; then \
echo " [OK] demo.exe exists"; \
else \
echo " [SKIP] demo.exe not found (Windows builds not available)"; \
fi
@echo ""
@echo "[OK] Development setup is ready! Run 'just dev' to start."

Procfile.dev Normal file (+5)

@@ -0,0 +1,5 @@
# Procfile for Overmind development workflow
# Start with: overmind start -f Procfile.dev
backend: PORT=5800 bacon run --headless
frontend: pnpm --dir frontend dev

backend/Cargo.toml Normal file (+26)

@@ -0,0 +1,26 @@
[package]
name = "dynamic-preauth"
version.workspace = true
edition.workspace = true
[[bin]]
name = "dynamic-preauth"
path = "src/main.rs"
[dependencies]
anyhow.workspace = true
chrono.workspace = true
dotenvy.workspace = true
envy.workspace = true
futures-util.workspace = true
rand.workspace = true
regex.workspace = true
reqwest = { workspace = true, features = ["json", "rustls-tls"] }
salvo.workspace = true
serde.workspace = true
serde_json.workspace = true
thiserror = "2.0.17"
tokio.workspace = true
tokio-stream.workspace = true
tracing.workspace = true
tracing-subscriber.workspace = true

backend/src/config.rs Normal file (+70)

@@ -0,0 +1,70 @@
use serde::Deserialize;
fn default_port() -> u16 {
5800
}
/// Railway-specific configuration parsed from environment variables.
#[derive(Deserialize, Debug, Default)]
pub struct RailwayConfig {
pub railway_token: Option<String>,
pub railway_project_id: Option<String>,
pub railway_service_id: Option<String>,
pub railway_environment_id: Option<String>,
pub railway_deployment_id: Option<String>,
pub railway_public_domain: Option<String>,
}
impl RailwayConfig {
/// Returns true if running on Railway (project ID is set).
pub fn is_railway(&self) -> bool {
self.railway_project_id.is_some()
}
/// Returns true if Railway API token is configured.
pub fn has_token(&self) -> bool {
self.railway_token.is_some()
}
/// Build the Railway dashboard URL for viewing build logs.
pub fn build_logs_url(&self) -> Option<String> {
let project_id = self.railway_project_id.as_ref()?;
let service_id = self.railway_service_id.as_ref()?;
let environment_id = self.railway_environment_id.as_ref()?;
let deployment_id = self.railway_deployment_id.as_deref().unwrap_or("latest");
Some(format!(
"https://railway.com/project/{}/service/{}?environmentId={}&id={}#build",
project_id, service_id, environment_id, deployment_id
))
}
/// Returns the CORS origin based on public domain.
pub fn cors_origin(&self) -> String {
if cfg!(debug_assertions) {
return "*".to_string();
}
match &self.railway_public_domain {
Some(domain) => format!("https://{}", domain),
None => "*".to_string(),
}
}
}
/// Main configuration struct parsed from environment variables.
#[derive(Deserialize, Debug)]
pub struct Config {
#[serde(default = "default_port")]
pub port: u16,
#[serde(flatten)]
pub railway: RailwayConfig,
}
impl Config {
/// Returns the socket address to bind to.
pub fn bind_addr(&self) -> String {
format!("0.0.0.0:{}", self.port)
}
}

backend/src/errors.rs Normal file (+19)

@@ -0,0 +1,19 @@
use std::path::PathBuf;
use thiserror::Error;
pub type Result<T> = std::result::Result<T, AppError>;
#[derive(Debug, Error)]
pub enum AppError {
#[error("executable not found at '{path}'")]
ExecutableNotFound { path: PathBuf },
#[error("key pattern not found in executable '{name}'")]
KeyPatternNotFound { name: String },
#[error("missing required environment variable '{name}'")]
MissingEnvVar { name: String },
#[error("configuration error: {message}")]
Config { message: String },
}

backend/src/handlers/build_logs.rs

@@ -0,0 +1,52 @@
use salvo::http::StatusCode;
use salvo::prelude::{handler, Request, Response};
use salvo::Depot;
use crate::state::STORE;
#[handler]
pub async fn get_build_logs(req: &mut Request, res: &mut Response, _depot: &mut Depot) {
let store = STORE.lock().await;
if let Some(build_logs) = &store.build_logs {
// Use pre-computed hash for ETag
let etag = format!("\"{:x}\"", build_logs.content_hash);
// Check If-None-Match header
if let Some(if_none_match) = req.headers().get("If-None-Match") {
if if_none_match == &etag {
res.status_code(StatusCode::NOT_MODIFIED);
return;
}
}
// Check If-Modified-Since header
if let Some(if_modified_since) = req.headers().get("If-Modified-Since") {
if let Ok(if_modified_since_str) = if_modified_since.to_str() {
if let Ok(if_modified_since_time) =
chrono::DateTime::parse_from_rfc2822(if_modified_since_str)
{
if build_logs.fetched_at <= if_modified_since_time {
res.status_code(StatusCode::NOT_MODIFIED);
return;
}
}
}
}
res.headers_mut().insert("ETag", etag.parse().unwrap());
res.headers_mut()
.insert("Content-Type", "text/plain; charset=utf-8".parse().unwrap());
res.headers_mut()
.insert("Cache-Control", "public, max-age=300".parse().unwrap());
res.headers_mut().insert(
"Last-Modified",
build_logs.fetched_at.to_rfc2822().parse().unwrap(),
);
res.render(&build_logs.content);
} else {
res.status_code(StatusCode::NOT_FOUND);
res.render("Build logs not available");
}
}

backend/src/handlers/downloads.rs

@@ -0,0 +1,58 @@
use salvo::http::HeaderValue;
use salvo::prelude::{handler, Request, Response};
use salvo::Depot;
use crate::state::STORE;
use super::session::get_session_id;
#[handler]
pub async fn download(req: &mut Request, res: &mut Response, depot: &mut Depot) {
let download_id = req
.param::<String>("id")
.expect("Download ID required to download file");
let session_id =
get_session_id(req, depot).expect("Session ID could not be found via request or depot");
let store = &mut *STORE.lock().await;
let session = store
.sessions
.get_mut(&session_id)
.expect("Session not found");
let executable = store
.executables
.get(&download_id as &str)
.expect("Executable not found");
// Create a download for the session
let session_download = session.add_download(executable);
tracing::info!(session_id, type = download_id, dl_token = session_download.token, "Download created");
let data = executable.with_key(session_download.token.to_string().as_bytes());
if let Err(e) = res.write_body(data) {
tracing::error!("Error writing body: {}", e);
}
res.headers.insert(
"Content-Disposition",
HeaderValue::from_str(
format!("attachment; filename=\"{}\"", session_download.filename).as_str(),
)
.expect("Unable to create header"),
);
res.headers.insert(
"Content-Type",
HeaderValue::from_static("application/octet-stream"),
);
// Don't try to send state if somehow the session has not connected
if session.tx.is_some() {
session
.send_state()
.expect("Failed to buffer state message");
} else {
tracing::warn!("Download being made without any connection websocket");
}
}

backend/src/handlers/mod.rs

@@ -0,0 +1,11 @@
mod build_logs;
mod downloads;
mod notifications;
mod session;
mod websocket;
pub use build_logs::get_build_logs;
pub use downloads::download;
pub use notifications::notify;
pub use session::{get_session, session_middleware};
pub use websocket::connect;

backend/src/handlers/notifications.rs

@@ -0,0 +1,61 @@
use salvo::http::StatusCode;
use salvo::prelude::{handler, Request, Response};
use crate::models::OutgoingMessage;
use crate::state::STORE;
#[handler]
pub async fn notify(req: &mut Request, res: &mut Response) {
let key = req.query::<String>("key");
if key.is_none() {
res.status_code(StatusCode::BAD_REQUEST);
return;
}
let key = key.unwrap();
if !key.starts_with("0x") {
res.status_code(StatusCode::BAD_REQUEST);
return;
}
// Parse key into u32
let key = match u32::from_str_radix(key.trim_start_matches("0x"), 16) {
Ok(k) => k,
Err(e) => {
tracing::error!("Error parsing key: {}", e);
res.status_code(StatusCode::BAD_REQUEST);
return;
}
};
let store = &mut *STORE.lock().await;
let target_session = store
.sessions
.iter_mut()
.find(|(_, session)| session.downloads.iter().any(|d| d.token == key));
match target_session {
Some((_, session)) => {
let message = OutgoingMessage::TokenAlert { token: key };
if let Err(e) = session.send_message(message) {
tracing::warn!(
error = e.to_string(),
"Session did not have a receiving WebSocket available, notify ignored.",
);
res.status_code(StatusCode::NOT_MODIFIED);
return;
}
res.render("Notification sent");
}
None => {
tracing::warn!("Session not found for key while attempting notify: {}", key);
res.status_code(StatusCode::UNAUTHORIZED);
return;
}
}
}
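The key format the demo binary and this /notify endpoint agree on is a 0x-prefixed, zero-padded hex u32. A minimal round-trip sketch (helper names are hypothetical, not part of the codebase):

// Sketch: token round-trip between the demo and POST /notify.
// encode_key mirrors the demo's format!("0x{:08X}", token); decode_key
// mirrors the handler's trim_start_matches("0x") plus
// u32::from_str_radix(.., 16) above.
fn encode_key(token: u32) -> String {
    format!("0x{:08X}", token)
}

fn decode_key(key: &str) -> Option<u32> {
    u32::from_str_radix(key.strip_prefix("0x")?, 16).ok()
}

fn main() {
    let token: u32 = 0xDEADBEEF;
    assert_eq!(encode_key(token), "0xDEADBEEF");
    assert_eq!(decode_key(&encode_key(token)), Some(token));
}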

backend/src/handlers/session.rs

@@ -0,0 +1,85 @@
use salvo::http::StatusCode;
use salvo::prelude::{handler, Request, Response};
use salvo::writing::Json;
use salvo::Depot;
use crate::state::STORE;
#[handler]
pub async fn session_middleware(req: &mut Request, res: &mut Response, depot: &mut Depot) {
match req.cookie("Session") {
Some(cookie) => {
// Check if the session exists
match cookie.value().parse::<u32>() {
Ok(session_id) => {
let mut store = STORE.lock().await;
if !store.sessions.contains_key(&session_id) {
let new_session_id = store.new_session(res).await;
depot.insert("session_id", new_session_id);
tracing::debug!(
existing_session_id = session_id,
new_session_id = new_session_id,
"Session provided in cookie, but does not exist"
);
} else {
store.sessions.get_mut(&session_id).unwrap().seen(false);
}
}
Err(parse_error) => {
tracing::debug!(
invalid_session_id = cookie.value(),
error = ?parse_error,
"Session provided in cookie, but is not a valid number"
);
let mut store = STORE.lock().await;
let id = store.new_session(res).await;
depot.insert("session_id", id);
}
}
}
None => {
tracing::debug!("Session was not provided in cookie");
let mut store = STORE.lock().await;
let id = store.new_session(res).await;
depot.insert("session_id", id);
}
}
}
#[handler]
pub async fn get_session(req: &mut Request, res: &mut Response, depot: &mut Depot) {
let store = STORE.lock().await;
let session_id = get_session_id(req, depot);
if session_id.is_none() {
res.status_code(StatusCode::BAD_REQUEST);
return;
}
match store.sessions.get(&session_id.unwrap()) {
Some(session) => {
res.render(Json(&session));
}
None => {
res.status_code(StatusCode::BAD_REQUEST);
}
}
}
// Acquires the session id from the request, preferring the depot
pub fn get_session_id(req: &Request, depot: &Depot) -> Option<u32> {
if depot.contains_key("session_id") {
return Some(*depot.get::<u32>("session_id").unwrap());
}
// Otherwise, just use whatever the Cookie might have
match req.cookie("Session") {
Some(cookie) => cookie.value().parse::<u32>().ok(),
None => {
tracing::warn!("Session was not provided in cookie or depot");
None
}
}
}

backend/src/handlers/websocket.rs

@@ -0,0 +1,138 @@
use futures_util::{FutureExt, StreamExt};
use salvo::http::StatusError;
use salvo::prelude::{handler, Request, Response, WebSocketUpgrade};
use salvo::websocket::WebSocket;
use salvo::Depot;
use tokio::sync::mpsc;
use tokio_stream::wrappers::UnboundedReceiverStream;
use crate::models::{IncomingMessage, OutgoingMessage};
use crate::state::STORE;
use super::session::get_session_id;
#[handler]
pub async fn connect(
req: &mut Request,
res: &mut Response,
depot: &Depot,
) -> Result<(), StatusError> {
let session_id = get_session_id(req, depot).unwrap();
WebSocketUpgrade::new()
.upgrade(req, res, move |ws| async move {
handle_socket(session_id, ws).await;
})
.await
}
async fn handle_socket(session_id: u32, websocket: WebSocket) {
// Split the socket into a sender and receiver of messages.
let (socket_tx, mut socket_rx) = websocket.split();
// Use an unbounded channel to handle buffering and flushing of messages to the websocket...
let (tx_channel, tx_channel_rx) = mpsc::unbounded_channel();
let transmit = UnboundedReceiverStream::new(tx_channel_rx);
let fut_handle_tx_buffer = transmit
.then(|message| async {
match message {
Ok(ref message) => {
tracing::debug!(message = ?message, "Outgoing Message");
}
Err(ref e) => {
tracing::error!(error = ?e, "Outgoing Message Error");
}
}
message
})
.forward(socket_tx)
.map(|result| {
tracing::debug!("WebSocket send result: {:?}", result);
if let Err(e) = result {
tracing::error!(error = ?e, "websocket send error");
}
});
tokio::task::spawn(fut_handle_tx_buffer);
let store = &mut *STORE.lock().await;
// Create the executables message first to avoid borrow issues with the store
let executable_message = OutgoingMessage::Executables {
executables: store.executable_json(),
build_log: if store.build_logs.is_some() {
Some("/build-logs".to_string())
} else {
None
},
};
let session = store
.sessions
.get_mut(&session_id)
.expect("Unable to get session");
session.tx = Some(tx_channel);
session
.send_state()
.expect("Failed to buffer state message");
session
.send_message(executable_message)
.expect("Failed to buffer executables message");
// Handle incoming messages
let fut = async move {
tracing::info!(
"WebSocket connection established for session_id: {}",
session_id
);
while let Some(result) = socket_rx.next().await {
let msg = match result {
Ok(msg) => msg,
Err(error) => {
tracing::error!(
"WebSocket Error session_id={} error=({})",
session_id,
error
);
break;
}
};
if msg.is_close() {
tracing::info!("WebSocket closing for Session {}", session_id);
break;
}
if msg.is_text() {
let text = msg.to_str().unwrap();
// Deserialize
match serde_json::from_str::<IncomingMessage>(text) {
Ok(message) => {
tracing::debug!(message = ?message, "Received message");
match message {
IncomingMessage::DeleteDownloadToken { id } => {
let store = &mut *STORE.lock().await;
let session = store
.sessions
.get_mut(&session_id)
.expect("Session not found");
if session.delete_download(id) {
session
.send_state()
.expect("Failed to buffer state message");
}
}
}
}
Err(e) => {
tracing::error!("Error deserializing message: {} {}", text, e);
}
}
}
}
};
tokio::task::spawn(fut);
}

backend/src/lib.rs Normal file (+6)

@@ -0,0 +1,6 @@
pub mod config;
pub mod errors;
pub mod handlers;
pub mod models;
pub mod railway;
pub mod state;

backend/src/main.rs Normal file (+129)

@@ -0,0 +1,129 @@
use dynamic_preauth::config::Config;
use dynamic_preauth::handlers::{
connect, download, get_build_logs, get_session, notify, session_middleware,
};
use dynamic_preauth::railway;
use dynamic_preauth::state::STORE;
use salvo::cors::Cors;
use salvo::http::Method;
use salvo::logging::Logger;
use salvo::prelude::{CatchPanic, Listener, Router, Server, Service, StaticDir, TcpListener};
use tracing_subscriber::EnvFilter;
#[tokio::main]
async fn main() {
// Load environment variables from .env file (development only)
#[cfg(debug_assertions)]
dotenvy::dotenv().ok();
// Parse configuration from environment
let config: Config = envy::from_env().expect("Failed to parse environment configuration");
tracing_subscriber::fmt()
.with_env_filter(EnvFilter::new(format!(
"info,dynamic_preauth={}",
if cfg!(debug_assertions) {
"debug"
} else {
"info"
}
)))
.init();
// Add the build log & executables to the store
let mut store = STORE.lock().await;
// Check if we are deployed on Railway
if config.railway.is_railway() {
if let Some(build_logs_url) = config.railway.build_logs_url() {
tracing::info!("Build logs available here: {}", build_logs_url);
store.build_log_url = Some(build_logs_url);
}
// Try to fetch actual build logs using Railway API
if config.railway.has_token() {
match railway::fetch_build_logs().await {
Ok(build_logs) => {
tracing::info!(
"Successfully fetched build logs ({} bytes)",
build_logs.content.len()
);
store.build_logs = Some(build_logs);
}
Err(e) => {
tracing::warn!("Failed to fetch build logs from Railway API: {}", e);
}
}
} else {
tracing::warn!("RAILWAY_TOKEN not set, skipping build log fetch");
}
}
for (exe_type, exe_path) in [
("Windows", "./demo.exe"),
("Linux", "./demo-linux"),
// ("MacOS", "./demo-macos"),
] {
if let Err(e) = store.add_executable(exe_type, exe_path) {
// In debug mode, allow missing Windows executable for dev convenience
if cfg!(debug_assertions) && exe_type == "Windows" {
tracing::warn!(
"Windows executable not found at {} (skipping - cross-compilation not set up)",
exe_path
);
tracing::warn!("To enable Windows builds: rustup target add x86_64-pc-windows-gnu && sudo apt install mingw-w64");
continue;
}
tracing::error!("{}", e);
std::process::exit(1);
}
}
drop(store); // critical: Drop the lock to avoid deadlock, otherwise the server will hang
let origin = config.railway.cors_origin();
let cors = Cors::new()
.allow_origin(&origin)
.allow_methods(vec![Method::GET])
.into_handler();
tracing::debug!("CORS Allowed Origin: {}", &origin);
let static_dir = StaticDir::new(["./public"]).defaults("index.html");
// TODO: Improved Token Generation
// TODO: Advanced HMAC Verification
// TODO: Session Purging
let router = Router::new()
.hoop(CatchPanic::new())
// /notify does not need a session, nor should it have one
.push(Router::with_path("notify").post(notify))
// /build-logs does not need a session
.push(Router::with_path("build-logs").get(get_build_logs))
.push(
Router::new()
.hoop(session_middleware)
.push(Router::with_path("download/<id>").get(download))
.push(Router::with_path("session").get(get_session))
// websocket /ws
.push(Router::with_path("ws").goal(connect))
// static files
.push(Router::with_path("<**path>").get(static_dir)),
);
let service = Service::new(router).hoop(cors).hoop(Logger::new());
let bind_addr = config.bind_addr();
tracing::info!("Server starting on http://{}", bind_addr);
tracing::info!("WebSocket endpoint: ws://{}/ws", bind_addr);
if cfg!(debug_assertions) {
tracing::info!("Development mode - CORS allows all origins");
tracing::info!("Access the app at http://localhost:4321 (Astro dev server)");
}
let acceptor = TcpListener::new(&bind_addr).bind().await;
Server::new(acceptor).serve(service).await;
}

backend/src/models/build_logs.rs

@@ -0,0 +1,6 @@
#[derive(Debug, Clone)]
pub struct BuildLogs {
pub content: String,
pub fetched_at: chrono::DateTime<chrono::Utc>,
pub content_hash: u64,
}
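content_hash is computed once when the logs are fetched (see railway.rs below) and reused by the build-logs handler as a strong ETag. A minimal sketch of the derivation, assuming std's DefaultHasher as used there:

// Sketch: deriving content_hash the way railway.rs does.
use std::collections::hash_map::DefaultHasher;
use std::hash::{Hash, Hasher};

fn content_hash(content: &str) -> u64 {
    let mut hasher = DefaultHasher::new();
    content.hash(&mut hasher);
    hasher.finish()
}

fn main() {
    let hash = content_hash("2025-12-11 INFO Build time: 42 seconds");
    // DefaultHasher is not guaranteed stable across Rust releases, which is
    // acceptable here: the ETag only has to stay consistent for the lifetime
    // of one server process.
    println!("ETag: \"{:x}\"", hash); // matches handlers/build_logs.rs formatting
}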

backend/src/models/executable.rs

@@ -0,0 +1,83 @@
use serde::Serialize;
#[derive(Default, Clone, Debug)]
pub struct Executable {
pub data: Vec<u8>, // the raw data of the executable
pub filename: String,
pub name: String, // the name before the extension
pub extension: String, // may be empty string
pub key_start: usize, // the index of the byte where the key starts
pub key_end: usize, // the index of the byte where the key ends
}
impl Executable {
pub fn search_pattern(buf: &[u8], pattern: &[u8], start_index: usize) -> Option<usize> {
let mut i = start_index;
// If the pattern is longer than the buffer, it can never match
if pattern.len() > buf.len() {
return None;
}
// If the pattern is empty
if pattern.is_empty() {
return None;
}
// If the starting index is too high
if start_index >= buf.len() {
return None;
}
while i < buf.len() {
for j in 0..pattern.len() {
// If the pattern is too long to fit in the buffer anymore
if i + j >= buf.len() {
return None;
}
// If the pattern stops matching
if buf[i + j] != pattern[j] {
break;
}
// If the pattern is found
if j == pattern.len() - 1 {
return Some(i);
}
}
i += 1;
}
None
}
pub fn with_key(&self, new_key: &[u8]) -> Vec<u8> {
let mut data = self.data.clone();
// Copy the key into the data
for i in 0..new_key.len() {
data[self.key_start + i] = new_key[i];
}
// If the new key is shorter than the old key, blank out the rest of the key region with spaces
if new_key.len() < self.key_end - self.key_start {
for item in data
.iter_mut()
.take(self.key_end)
.skip(self.key_start + new_key.len())
{
*item = b' ';
}
}
data
}
}
#[derive(Debug, Serialize)]
pub struct ExecutableJson {
pub id: String,
pub size: usize,
pub filename: String,
}
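A sketch of how search_pattern and with_key fit together, written as a hypothetical test (assumes the Executable type above is in scope; the 1024-byte run of 'a' matches the placeholder that state.rs searches for):

// Sketch: locate the placeholder, patch in a decimal token, pad with spaces.
#[test]
fn patches_placeholder_with_token() {
    let placeholder = vec![b'a'; 1024];
    let mut data = b"HEADER".to_vec();
    let key_start = data.len();
    data.extend_from_slice(&placeholder);
    let key_end = key_start + placeholder.len();
    data.extend_from_slice(b"FOOTER");

    let exe = Executable {
        data,
        filename: "demo".into(),
        name: "demo".into(),
        extension: String::new(),
        key_start,
        key_end,
    };

    // The download handler writes token.to_string() (decimal), far shorter
    // than the 1024-byte placeholder, so the remainder is space-padded.
    let patched = exe.with_key(b"3735928559");
    assert_eq!(&patched[key_start..key_start + 10], b"3735928559");
    assert!(patched[key_start + 10..key_end].iter().all(|&b| b == b' '));

    // search_pattern finds the placeholder in the original bytes.
    assert_eq!(
        Executable::search_pattern(&exe.data, &placeholder, 0),
        Some(key_start)
    );
}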

backend/src/models/messages.rs

@@ -0,0 +1,29 @@
use serde::{Deserialize, Serialize};
use super::executable::ExecutableJson;
use super::session::Session;
#[derive(Debug, Deserialize)]
#[serde(tag = "type", rename_all = "kebab-case")]
pub enum IncomingMessage {
// A request from the client to delete a download token
DeleteDownloadToken { id: u32 },
}
#[derive(Debug, Serialize)]
#[serde(tag = "type", rename_all = "kebab-case")]
pub enum OutgoingMessage {
// An alert to the client that a session download has been used.
#[serde(rename = "notify")]
TokenAlert {
token: u32,
},
// A message describing the current session state
State {
session: Session,
},
Executables {
build_log: Option<String>,
executables: Vec<ExecutableJson>,
},
}
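These enums use serde's internally tagged representation, so the "type" field carries the kebab-cased variant name (or the explicit rename). A quick sketch of both directions, assuming the crate's public re-exports:

// Sketch: the JSON wire format the frontend sees.
use dynamic_preauth::models::{IncomingMessage, OutgoingMessage};

fn main() {
    // #[serde(rename = "notify")] overrides the kebab-case default.
    let msg = OutgoingMessage::TokenAlert { token: 0xDEADBEEF };
    assert_eq!(
        serde_json::to_string(&msg).unwrap(),
        r#"{"type":"notify","token":3735928559}"#
    );

    // Variant names are kebab-cased for the "type" tag.
    let incoming: IncomingMessage =
        serde_json::from_str(r#"{"type":"delete-download-token","id":7}"#).unwrap();
    assert!(matches!(incoming, IncomingMessage::DeleteDownloadToken { id: 7 }));
}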

backend/src/models/mod.rs

@@ -0,0 +1,9 @@
mod build_logs;
mod executable;
mod messages;
mod session;
pub use build_logs::BuildLogs;
pub use executable::{Executable, ExecutableJson};
pub use messages::{IncomingMessage, OutgoingMessage};
pub use session::Session;

backend/src/models/session.rs

@@ -0,0 +1,97 @@
use salvo::websocket::Message;
use serde::Serialize;
use tokio::sync::mpsc::UnboundedSender;
use super::executable::Executable;
use super::messages::OutgoingMessage;
#[derive(Debug, Serialize, Clone)]
pub struct Session {
pub id: u32,
pub downloads: Vec<SessionDownload>,
pub first_seen: chrono::DateTime<chrono::Utc>,
// The last time a request OR websocket message from/to this session was made
pub last_seen: chrono::DateTime<chrono::Utc>,
// The last time a request was made with this session
pub last_request: chrono::DateTime<chrono::Utc>,
// The sender for the websocket connection
#[serde(skip_serializing)]
pub tx: Option<UnboundedSender<Result<Message, salvo::Error>>>,
}
impl Session {
// Update the last seen time(s) for the session
pub fn seen(&mut self, socket: bool) {
self.last_seen = chrono::Utc::now();
if !socket {
self.last_request = chrono::Utc::now();
}
}
// Add a download to the session
pub fn add_download(&mut self, exe: &Executable) -> &SessionDownload {
let token: u32 = rand::random();
let download = SessionDownload {
token,
filename: format!(
"{}-{:08x}{}{}",
exe.name,
token,
if !exe.extension.is_empty() { "." } else { "" },
exe.extension
),
last_used: chrono::Utc::now(),
download_time: chrono::Utc::now(),
};
self.downloads.push(download);
self.downloads.last().unwrap()
}
// Delete a download from the session
// Returns true if the download was deleted, false if it was not found
pub fn delete_download(&mut self, token: u32) -> bool {
if let Some(index) = self.downloads.iter().position(|d| d.token == token) {
self.downloads.remove(index);
true
} else {
tracing::warn!("Attempted to delete non-existent download token: {}", token);
false
}
}
// This function's failure is not a failure to transmit the message, but a failure to buffer it into the channel (or any preceding steps).
pub fn send_message(&mut self, message: OutgoingMessage) -> Result<(), anyhow::Error> {
if self.tx.is_none() {
return Err(anyhow::anyhow!("Session {} has no sender", self.id));
}
// TODO: Error handling
let tx = self.tx.as_ref().unwrap();
let result = tx.send(Ok(Message::text(serde_json::to_string(&message).unwrap())));
match result {
Ok(_) => Ok(()),
Err(e) => Err(anyhow::anyhow!("Error sending message: {}", e)),
}
}
pub fn send_state(&mut self) -> Result<(), anyhow::Error> {
let message = OutgoingMessage::State {
session: self.clone(),
};
self.send_message(message)
}
}
#[derive(Serialize, Debug, Clone)]
pub struct SessionDownload {
pub token: u32,
pub filename: String,
pub last_used: chrono::DateTime<chrono::Utc>,
pub download_time: chrono::DateTime<chrono::Utc>,
}
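The download filename scheme from add_download, extracted as a standalone sketch (the helper is hypothetical; the real code inlines this format! call):

// Sketch: name-{token as zero-padded lowercase hex}[.extension]
fn download_filename(name: &str, extension: &str, token: u32) -> String {
    format!(
        "{}-{:08x}{}{}",
        name,
        token,
        if !extension.is_empty() { "." } else { "" },
        extension
    )
}

fn main() {
    assert_eq!(download_filename("demo", "exe", 0xDEADBEEF), "demo-deadbeef.exe");
    assert_eq!(download_filename("demo-linux", "", 0xDEADBEEF), "demo-linux-deadbeef");
}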

backend/src/railway.rs Normal file (+276)

@@ -0,0 +1,276 @@
use anyhow::Result;
use serde::{Deserialize, Serialize};
use std::env;
#[derive(Debug, Serialize)]
struct GraphQLRequest {
query: String,
variables: serde_json::Value,
}
#[derive(Debug, Deserialize)]
struct GraphQLResponse {
data: Option<serde_json::Value>,
errors: Option<Vec<GraphQLError>>,
}
#[derive(Debug, Deserialize)]
struct GraphQLError {
message: String,
}
#[derive(Debug, Deserialize)]
struct BuildLogEntry {
message: String,
severity: String,
timestamp: String,
}
#[derive(Debug, Deserialize)]
struct DeploymentNode {
id: String,
}
#[derive(Debug, Deserialize)]
struct DeploymentEdge {
node: DeploymentNode,
}
#[derive(Debug, Deserialize)]
struct DeploymentsConnection {
edges: Vec<DeploymentEdge>,
}
fn strip_ansi_codes(text: &str) -> String {
// Simple regex to remove ANSI escape sequences
let re = regex::Regex::new(r"\x1b\[[0-9;]*[a-zA-Z]").unwrap();
re.replace_all(text, "").to_string()
}
fn should_stop_at_message(message: &str) -> bool {
let clean_message = strip_ansi_codes(message);
// Check for "Build time: X seconds" pattern (case insensitive)
let build_time_pattern = regex::Regex::new(r"(?i)Build\s+time:\s+\d+").unwrap();
if build_time_pattern.is_match(&clean_message) {
return true;
}
// Check for "Starting Container" (case insensitive)
let starting_container_pattern = regex::Regex::new(r"(?i)Starting\s+Container").unwrap();
if starting_container_pattern.is_match(&clean_message) {
return true;
}
false
}
async fn fetch_latest_deployment_id() -> Result<String> {
let token = env::var("RAILWAY_TOKEN")?;
let service_id = env::var("RAILWAY_SERVICE_ID")?;
let project_id = env::var("RAILWAY_PROJECT_ID")?;
let environment_id = env::var("RAILWAY_ENVIRONMENT_ID")?;
let query = r#"
query deployments($input: DeploymentListInput!, $first: Int) {
deployments(input: $input, first: $first) {
edges {
node {
id
}
}
}
}
"#;
let variables = serde_json::json!({
"input": {
"projectId": project_id,
"serviceId": service_id,
"environmentId": environment_id,
"status": {"in": ["SUCCESS", "DEPLOYING", "SLEEPING", "BUILDING"]}
},
"first": 1
});
let request = GraphQLRequest {
query: query.to_string(),
variables,
};
let client = reqwest::Client::new();
let response = client
.post("https://backboard.railway.app/graphql/v2")
.header("Authorization", format!("Bearer {}", token))
.json(&request)
.send()
.await?;
let response_text = response.text().await?;
let graphql_response: GraphQLResponse = serde_json::from_str(&response_text)?;
if let Some(errors) = graphql_response.errors {
let error_messages: Vec<String> = errors.iter().map(|e| e.message.clone()).collect();
return Err(anyhow::anyhow!(
"GraphQL errors: {}",
error_messages.join(", ")
));
}
if let Some(data) = graphql_response.data {
if let Some(deployments_value) = data.get("deployments") {
if let Ok(deployments) =
serde_json::from_value::<DeploymentsConnection>(deployments_value.clone())
{
if let Some(first_edge) = deployments.edges.first() {
return Ok(first_edge.node.id.clone());
}
}
}
}
Err(anyhow::anyhow!(
"No deployments found or unexpected response structure"
))
}
pub async fn fetch_build_logs() -> Result<crate::models::BuildLogs> {
let token = env::var("RAILWAY_TOKEN")?;
// Get deployment ID - in debug mode, fetch latest if not specified
let deployment_id = if cfg!(debug_assertions) {
match env::var("RAILWAY_DEPLOYMENT_ID") {
Ok(id) => id,
Err(_) => {
tracing::debug!(
"No RAILWAY_DEPLOYMENT_ID specified in debug mode, fetching latest deployment"
);
fetch_latest_deployment_id().await?
}
}
} else {
env::var("RAILWAY_DEPLOYMENT_ID")?
};
let query = r#"
query buildLogs($deploymentId: String!, $endDate: DateTime, $filter: String, $limit: Int, $startDate: DateTime) {
buildLogs(
deploymentId: $deploymentId
endDate: $endDate
filter: $filter
limit: $limit
startDate: $startDate
) {
message
severity
timestamp
}
}
"#;
let variables = serde_json::json!({
"deploymentId": deployment_id,
"limit": 1000
});
let request = GraphQLRequest {
query: query.to_string(),
variables,
};
let client = reqwest::Client::new();
let response = client
.post("https://backboard.railway.app/graphql/v2")
.header("Authorization", format!("Bearer {}", token))
.json(&request)
.send()
.await?;
let response_text = response.text().await?;
let graphql_response: GraphQLResponse = serde_json::from_str(&response_text)?;
if let Some(errors) = graphql_response.errors {
let error_messages: Vec<String> = errors.iter().map(|e| e.message.clone()).collect();
return Err(anyhow::anyhow!(
"GraphQL errors: {}",
error_messages.join(", ")
));
}
if let Some(data) = graphql_response.data {
if let Some(build_logs_value) = data.get("buildLogs") {
if let Ok(build_logs) =
serde_json::from_value::<Vec<BuildLogEntry>>(build_logs_value.clone())
{
let mut filtered_logs = Vec::new();
let starting_container_pattern =
regex::Regex::new(r"(?i)Starting\s+Container").unwrap();
for entry in build_logs {
// Check if we should stop at this message
if should_stop_at_message(&entry.message) {
// For "Build time" messages, include them
// For "Starting Container" messages, stop before them
let clean_message = strip_ansi_codes(&entry.message);
if starting_container_pattern.is_match(&clean_message) {
// Stop before "Starting Container" message
break;
} else {
// Include "Build time" message and stop
let formatted_entry = format!(
"{} {} {}",
entry.timestamp,
entry.severity,
clean_message.trim()
);
filtered_logs.push(formatted_entry);
break;
}
}
// Include this log entry
let clean_message = strip_ansi_codes(&entry.message);
let formatted_entry = format!(
"{} {} {}",
entry.timestamp,
entry.severity,
clean_message.trim()
);
filtered_logs.push(formatted_entry);
}
// Add Railway URL header to the logs
let railway_url = format!(
"Railway Build Logs: https://railway.com/project/{}/service/{}?environmentId={}&id={}#build\n\n",
env::var("RAILWAY_PROJECT_ID").unwrap_or_default(),
env::var("RAILWAY_SERVICE_ID").unwrap_or_default(),
env::var("RAILWAY_ENVIRONMENT_ID").unwrap_or_default(),
deployment_id
);
let content = format!("{}{}", railway_url, filtered_logs.join("\n"));
let fetched_at = chrono::Utc::now();
// Generate hash for the content
use std::collections::hash_map::DefaultHasher;
use std::hash::{Hash, Hasher};
let mut hasher = DefaultHasher::new();
content.hash(&mut hasher);
let content_hash = hasher.finish();
return Ok(crate::models::BuildLogs {
content,
fetched_at,
content_hash,
});
}
}
Err(anyhow::anyhow!(
"Unexpected response structure from Railway API"
))
} else {
Err(anyhow::anyhow!("No data received from Railway API"))
}
}

backend/src/state.rs Normal file (+122)

@@ -0,0 +1,122 @@
use std::collections::HashMap;
use std::path::{Path, PathBuf};
use std::sync::LazyLock;
use salvo::{http::cookie::Cookie, Response};
use tokio::sync::Mutex;
use crate::errors::{AppError, Result};
use crate::models::{BuildLogs, Executable, ExecutableJson, Session};
pub static STORE: LazyLock<Mutex<State>> = LazyLock::new(|| Mutex::new(State::new()));
#[derive(Default)]
pub struct State {
pub sessions: HashMap<u32, Session>,
pub executables: HashMap<String, Executable>,
pub build_logs: Option<BuildLogs>,
pub build_log_url: Option<String>,
}
impl State {
pub fn new() -> Self {
Self {
sessions: HashMap::new(),
executables: HashMap::new(),
build_logs: None,
build_log_url: None,
}
}
pub fn add_executable(&mut self, exe_type: &str, exe_path: &str) -> Result<()> {
let path = Path::new(exe_path);
let data = std::fs::read(path).map_err(|_| AppError::ExecutableNotFound {
path: PathBuf::from(exe_path),
})?;
let pattern = "a".repeat(1024);
let name = path
.file_stem()
.and_then(|s| s.to_str())
.unwrap_or_default()
.to_string();
let key_start = Executable::search_pattern(&data, pattern.as_bytes(), 0)
.ok_or_else(|| AppError::KeyPatternNotFound { name: name.clone() })?;
let key_end = key_start + pattern.len();
let extension = path
.extension()
.and_then(|s| s.to_str())
.unwrap_or_default()
.to_string();
let exe = Executable {
data,
filename: path
.file_name()
.and_then(|s| s.to_str())
.unwrap_or_default()
.to_string(),
name,
extension,
key_start,
key_end,
};
self.executables.insert(exe_type.to_string(), exe);
Ok(())
}
pub async fn new_session(&mut self, res: &mut Response) -> u32 {
let id: u32 = rand::random();
let now = chrono::Utc::now();
self.sessions.insert(
id,
Session {
id,
downloads: Vec::new(),
last_seen: now,
last_request: now,
first_seen: now,
tx: None,
},
);
tracing::info!("New session created: {}", id);
res.add_cookie(
Cookie::build(("Session", id.to_string()))
.http_only(true)
.partitioned(true)
.secure(cfg!(debug_assertions) == false)
.path("/")
// Use SameSite=None only in development
.same_site(if cfg!(debug_assertions) {
salvo::http::cookie::SameSite::None
} else {
salvo::http::cookie::SameSite::Strict
})
.permanent()
.build(),
);
id
}
pub fn executable_json(&self) -> Vec<ExecutableJson> {
let mut executables = Vec::new();
for (key, exe) in &self.executables {
executables.push(ExecutableJson {
id: key.to_string(),
size: exe.data.len(),
filename: exe.filename.clone(),
});
}
executables
}
}

bacon.toml Normal file (+36)

@@ -0,0 +1,36 @@
# Bacon configuration for dynamic-preauth
default_job = "check"
[jobs.check]
command = ["cargo", "check", "--workspace", "--all-targets", "--all-features", "--color", "always"]
need_stdout = false
[jobs.clippy]
command = ["cargo", "clippy", "--workspace", "--all-targets", "--all-features", "--color", "always", "--", "-D", "warnings"]
need_stdout = false
[jobs.test]
command = ["cargo", "test", "--workspace", "--color", "always"]
need_stdout = true
[jobs.run]
command = ["cargo", "run", "--bin", "dynamic-preauth", "--color", "always"]
need_stdout = true
on_success = "back"
[jobs.doc]
command = ["cargo", "doc", "--workspace", "--all-features", "--no-deps", "--color", "always"]
need_stdout = false
[keybindings]
# Use 'c' to switch to check job
c = "job:check"
# Use 'l' to switch to clippy job
l = "job:clippy"
# Use 't' to switch to test job
t = "job:test"
# Use 'r' to switch to run job
r = "job:run"
# Use 'd' to switch to doc job
d = "job:doc"

demo/Cargo.lock generated (+1612; file diff suppressed because it is too large)

demo/Cargo.toml

@@ -1,27 +1,19 @@
[package]
name = "demo"
version = "0.1.0"
edition = "2021"
version.workspace = true
edition.workspace = true
build = "build.rs"
[dependencies]
chrono = "0.4.39"
hex = "0.4.3"
reqwest = { version = "0.12.9", features = ["blocking", "json"] }
serde = { version = "1.0.216", features = ["derive"] }
serde_json = "1.0.134"
sha2 = "0.10.8"
hex.workspace = true
reqwest = { workspace = true, features = ["blocking", "json"] }
serde.workspace = true
serde_json.workspace = true
sha2.workspace = true
[build-dependencies]
chrono = "0.4.39"
hex = "0.4.3"
serde = { version = "1.0.216", features = ["derive"] }
serde_json = "1.0.134"
sha2 = "0.10.8"
[profile.release]
strip = true
opt-level = "z"
lto = true
codegen-units = 1
panic = "abort"
chrono.workspace = true
hex.workspace = true
serde.workspace = true
serde_json.workspace = true
sha2.workspace = true

demo/build.rs

@@ -35,7 +35,7 @@ fn main() -> Result<(), Box<dyn Error>> {
};
let json_data = serde_json::to_string(&key_data)?;
write!(f, "{}", json_data.to_string())?;
write!(f, "{}", json_data)?;
Ok(())
}
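For context: this build script writes key.json into OUT_DIR, the demo's main.rs embeds it with include_str!, and the server's state.rs later finds the 1024 repeated 'a' bytes inside the compiled binary and patches them per download. A minimal sketch of such a build script, assuming the KeyData JSON shape with value and compile_time fields (the real script serializes a KeyData struct with serde and chrono):

// build.rs (sketch): embed a placeholder key file at compile time.
use std::env;
use std::fs::File;
use std::io::Write;

fn main() -> std::io::Result<()> {
    let out_dir = env::var("OUT_DIR").expect("cargo sets OUT_DIR for build scripts");
    let mut f = File::create(format!("{out_dir}/key.json"))?;
    write!(
        f,
        // "value" holds the 1024-'a' placeholder the server overwrites;
        // the timestamp is a stand-in for chrono::Utc::now().
        r#"{{"value":"{}","compile_time":"1970-01-01T00:00:00Z"}}"#,
        "a".repeat(1024)
    )?;
    Ok(())
}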

demo/src/main.rs

@@ -9,8 +9,8 @@ struct KeyData<'a> {
compile_time: String,
}
static KEY: &'static str = include_str!(concat!(env!("OUT_DIR"), "/key.json"));
const HOST_INFO: (&'static str, &'static str) = match option_env!("RAILWAY_PUBLIC_DOMAIN") {
static KEY: &str = include_str!(concat!(env!("OUT_DIR"), "/key.json"));
const HOST_INFO: (&str, &str) = match option_env!("RAILWAY_PUBLIC_DOMAIN") {
Some(domain) => ("https", domain),
None => ("http", "localhost:5800"),
};
@@ -55,7 +55,7 @@ fn main() {
request(token);
}
Err(e) => {
eprintln!("Token was changed, but is not a valid u32 integer: {}", e);
eprintln!("Token is not a valid u32 integer: {}", e);
eprintln!("Original Value: {}", key_data.value);
return;
}
@@ -67,7 +67,7 @@ fn main() {
fn request(token: u32) {
let client = reqwest::blocking::Client::new();
let response = client
.post(&format!(
.post(format!(
"{}://{}/notify?key=0x{:08X}",
HOST_INFO.0, HOST_INFO.1, token
))

frontend/astro.config.mjs

@@ -1,5 +1,5 @@
// @ts-check
import { defineConfig, envField } from "astro/config";
import { defineConfig } from "astro/config";
import tailwind from "@astrojs/tailwind";
import sitemap from "@astrojs/sitemap";
@@ -20,6 +20,7 @@ if (
// https://astro.build/config
export default defineConfig({
outDir: "../public",
build: {
assets: "assets",
},

frontend/package.json

@@ -9,6 +9,7 @@
"astro": "astro"
},
"dependencies": {
"@astrojs/check": "^0.9.6",
"@astrojs/react": "^4.1.2",
"@astrojs/sitemap": "^3.2.1",
"@astrojs/tailwind": "^5.1.4",
@@ -21,10 +22,12 @@
"clsx": "^2.1.1",
"react": "^19.0.0",
"react-dom": "^19.0.0",
"react-icons": "^5.5.0",
"react-tooltip": "^5.28.0",
"react-use-websocket": "^4.11.1",
"tailwind-merge": "^2.5.5",
"tailwindcss": "^3.4.17"
"tailwindcss": "^3.4.17",
"typescript": "^5.9.3"
},
"packageManager": "pnpm@9.15.1+sha512.1acb565e6193efbebda772702950469150cf12bcc764262e7587e71d19dc98a423dff9536e57ea44c49bdf790ff694e83c27be5faa23d67e0c033b583be4bfcf",
"devDependencies": {

frontend/pnpm-lock.yaml (generated, 627 lines changed)
View File

File diff suppressed because it is too large.

View File

@@ -69,7 +69,7 @@ const Demo = ({ class: className }: DemoProps) => {
buildLog={buildLog}
executables={executables}
/>
{downloads?.map((download, i) => (
{downloads?.map((download) => (
<Badge
key={download.token}
className={cn(

View File

@@ -1,5 +1,6 @@
import type { Executable } from "@/components/useSocket";
import { cn, withBackend } from "@/util";
import MobileWarningModal from "@/components/MobileWarningModal";
import { cn, isMobile, withBackend } from "@/util";
import {
Button,
Menu,
@@ -9,11 +10,13 @@ import {
MenuSeparator,
} from "@headlessui/react";
import {
ArrowDownTrayIcon,
BeakerIcon,
ChevronDownIcon,
} from "@heroicons/react/16/solid";
import { useRef } from "react";
import { FaWindows, FaApple, FaLinux } from "react-icons/fa";
import { useRef, useState } from "react";
const MOBILE_WARNING_KEY = "mobile-warning-acknowledged";
type DownloadButtonProps = {
disabled?: boolean;
@@ -36,17 +39,78 @@ function getSystemType(): SystemType | null {
}
}
function getPlatformIcon(id: string, className?: string) {
const platformId = id.toLowerCase();
switch (platformId) {
case "windows":
return <FaWindows className={className} />;
case "macos":
return <FaApple className={className} />;
case "linux":
return <FaLinux className={className} />;
default:
return null;
}
}
function getPlatformDisplayName(id: string): string {
const platformId = id.toLowerCase();
switch (platformId) {
case "windows":
return "Windows";
case "macos":
return "macOS";
case "linux":
return "Linux";
default:
return id;
}
}
export default function DownloadButton({
disabled,
executables,
buildLog,
}: DownloadButtonProps) {
const menuRef = useRef<HTMLButtonElement>(null);
const [showMobileWarning, setShowMobileWarning] = useState(false);
const [mobileAcknowledged, setMobileAcknowledged] = useState(() => {
if (typeof window === "undefined") return false;
return sessionStorage.getItem(MOBILE_WARNING_KEY) === "true";
});
function getExecutable(id: string) {
return executables?.find((e) => e.id.toLowerCase() === id.toLowerCase());
}
const mobile = isMobile();
const detectedPlatform = mobile ? null : getSystemType();
const platformExecutable = detectedPlatform ? getExecutable(detectedPlatform) : null;
const canAutoDownload = platformExecutable != null;
function acknowledgeMobileWarning() {
sessionStorage.setItem(MOBILE_WARNING_KEY, "true");
setMobileAcknowledged(true);
}
function handleMobileButtonClick() {
if (!mobileAcknowledged) {
setShowMobileWarning(true);
} else {
menuRef.current?.click();
}
}
function handleMobileWarningClose() {
setShowMobileWarning(false);
}
function handleMobileWarningContinue() {
acknowledgeMobileWarning();
setShowMobileWarning(false);
menuRef.current?.click();
}
async function handleDownload(id: string) {
const executable = getExecutable(id);
if (executable == null) {
@@ -59,20 +123,18 @@ export default function DownloadButton({
}
function handleDownloadAutomatic() {
const systemType = getSystemType();
// If the system type is unknown/unavailable, open the menu for manual selection
if (systemType == null || getExecutable(systemType) == null) {
menuRef.current?.click();
}
// Otherwise, download the executable automatically
else {
handleDownload(systemType);
if (canAutoDownload && detectedPlatform) {
handleDownload(detectedPlatform);
}
}
return (
<>
<MobileWarningModal
open={showMobileWarning}
onClose={handleMobileWarningClose}
onContinue={handleMobileWarningContinue}
/>
<div
className={cn(
"[&>*]:py-1 overflow-clip transition-[background-color] text-sm/6 flex items-center shadow-inner align-middle text-white focus:outline-none data-[focus]:outline-1 data-[focus]:outline-white",
@@ -83,16 +145,38 @@ export default function DownloadButton({
)}
>
<Button
onClick={handleDownloadAutomatic}
onClick={
mobile
? handleMobileButtonClick
: canAutoDownload
? handleDownloadAutomatic
: undefined
}
suppressHydrationWarning
disabled={disabled}
disabled={disabled || (!mobile && !canAutoDownload)}
className={cn("pl-3 font-semibold pr-2.5", {
"hover:bg-white/5 cursor-pointer": !disabled && (mobile || canAutoDownload),
"cursor-default": !mobile && !canAutoDownload,
})}
>
{mobile
? "Download for Desktop"
: canAutoDownload && detectedPlatform
? `Download for ${getPlatformDisplayName(detectedPlatform)}`
: "Download"}
</Button>
<Menu>
{mobile && !mobileAcknowledged ? (
<button
onClick={handleMobileButtonClick}
disabled={disabled}
className={cn("pl-1.5 min-h-8 pr-2 py-1", {
"hover:bg-white/5": !disabled,
})}
>
Download
</Button>
<Menu>
<ChevronDownIcon className="size-4 fill-white/60" />
</button>
) : (
<MenuButton
ref={menuRef}
suppressHydrationWarning
@@ -103,6 +187,7 @@ export default function DownloadButton({
>
<ChevronDownIcon className="size-4 fill-white/60" />
</MenuButton>
)}
<MenuItems
transition
anchor="bottom end"
@@ -115,8 +200,8 @@ export default function DownloadButton({
onClick={() => handleDownload(executable.id)}
>
<div className="flex items-center gap-1.5">
<ArrowDownTrayIcon className="size-4 fill-white/40" />
{executable.id}
{getPlatformIcon(executable.id, "size-4 fill-white/40")}
{getPlatformDisplayName(executable.id)}
</div>
<div className="text-xs text-zinc-500">
{(executable.size / 1024 / 1024).toFixed(1)} MiB
@@ -130,7 +215,7 @@ export default function DownloadButton({
<MenuItem>
<a
className="group flex w-full items-center gap-2 rounded-lg py-1.5 px-2 data-[focus]:bg-white/10"
href={buildLog}
href={buildLog.startsWith('/') ? withBackend(buildLog) : buildLog}
target="_blank"
>
<BeakerIcon className="size-4 fill-white/40" />
@@ -142,5 +227,6 @@ export default function DownloadButton({
</MenuItems>
</Menu>
</div>
</>
);
}

View File

@@ -0,0 +1,57 @@
import {
Dialog,
DialogBackdrop,
DialogPanel,
DialogTitle,
} from "@headlessui/react";
import { ExclamationTriangleIcon } from "@heroicons/react/24/outline";
type MobileWarningModalProps = {
open: boolean;
onClose: () => void;
onContinue: () => void;
};
export default function MobileWarningModal({
open,
onClose,
onContinue,
}: MobileWarningModalProps) {
return (
<Dialog open={open} onClose={onClose} className="relative z-50">
<DialogBackdrop
transition
className="fixed inset-0 bg-black/60 backdrop-blur-sm transition-opacity duration-200 data-[closed]:opacity-0"
/>
<div className="fixed inset-0 flex items-center justify-center p-4">
<DialogPanel
transition
className="w-full max-w-sm rounded-xl border border-zinc-700 bg-zinc-900 p-5 shadow-xl transition-all duration-200 data-[closed]:scale-95 data-[closed]:opacity-0"
>
<div className="flex items-center gap-3 mb-3">
<div className="flex h-10 w-10 items-center justify-center rounded-full bg-amber-500/10">
<ExclamationTriangleIcon className="h-5 w-5 text-amber-400" />
</div>
<DialogTitle className="text-lg font-semibold text-zinc-100">
Heads up!
</DialogTitle>
</div>
<p className="text-sm text-zinc-300 leading-relaxed mb-4">
These downloads are desktop applications for Windows, macOS, and
Linux. They won't run on mobile devices, but you're welcome to
download them to transfer to a computer later.
</p>
<button
onClick={onContinue}
className="w-full rounded-lg bg-emerald-700 px-4 py-2 text-sm font-medium text-white transition-colors hover:bg-emerald-600 focus:outline-none focus:ring-2 focus:ring-emerald-500 focus:ring-offset-2 focus:ring-offset-zinc-900"
>
Got it, continue
</button>
</DialogPanel>
</div>
</Dialog>
);
}

View File

@@ -1,5 +1,5 @@
import { withBackend } from "@/util";
import { useEffect, useRef, useState } from "react";
import { useEffect, useState } from "react";
import useWebSocket, { ReadyState } from "react-use-websocket";
export interface Download {

View File

@@ -1,4 +1,3 @@
import { WindowIcon } from "@heroicons/react/16/solid";
import { clsx, type ClassValue } from "clsx";
import { twMerge } from "tailwind-merge";
@@ -20,6 +19,11 @@ export function os(): Platform | "other" {
return "other";
}
export function isMobile(): boolean {
// Guard for SSR: `navigator` does not exist during server rendering
if (typeof navigator === "undefined") return false;
const ua = navigator.userAgent.toLowerCase();
return /android|iphone|ipad|ipod|webos|blackberry|windows phone/.test(ua);
}
export function toHex(value: number): string {
return "0x" + value.toString(16).toUpperCase();
}

View File

@@ -1,418 +0,0 @@
use std::sync::LazyLock;
use std::{env, vec};
use futures_util::{FutureExt, StreamExt};
use models::{IncomingMessage, OutgoingMessage};
use salvo::cors::Cors;
use salvo::http::{HeaderValue, Method, StatusCode, StatusError};
use salvo::logging::Logger;
use salvo::prelude::{
handler, CatchPanic, Listener, Request, Response, Router, Server, Service, StaticDir,
TcpListener, WebSocketUpgrade,
};
use salvo::websocket::WebSocket;
use salvo::writing::Json;
use salvo::Depot;
use tokio::sync::{mpsc, Mutex};
use tokio_stream::wrappers::UnboundedReceiverStream;
use tracing_subscriber::EnvFilter;
use crate::models::State;
static STORE: LazyLock<Mutex<State>> = LazyLock::new(State::new);
mod models;
mod utility;
#[handler]
async fn session_middleware(req: &mut Request, res: &mut Response, depot: &mut Depot) {
match req.cookie("Session") {
Some(cookie) => {
// Check if the session exists
match cookie.value().parse::<u32>() {
Ok(session_id) => {
let mut store = STORE.lock().await;
if !store.sessions.contains_key(&session_id) {
let new_session_id = store.new_session(res).await;
depot.insert("session_id", new_session_id);
tracing::debug!(
existing_session_id = session_id,
new_session_id = new_session_id,
"Session provided in cookie, but does not exist"
);
} else {
store.sessions.get_mut(&session_id).unwrap().seen(false);
}
}
Err(parse_error) => {
tracing::debug!(
invalid_session_id = cookie.value(),
error = ?parse_error,
"Session provided in cookie, but is not a valid number"
);
let mut store = STORE.lock().await;
let id = store.new_session(res).await;
depot.insert("session_id", id);
}
}
}
None => {
tracing::debug!("Session was not provided in cookie");
let mut store = STORE.lock().await;
let id = store.new_session(res).await;
depot.insert("session_id", id);
}
}
}
#[handler]
async fn connect(req: &mut Request, res: &mut Response, depot: &Depot) -> Result<(), StatusError> {
let session_id = get_session_id(req, depot).unwrap();
WebSocketUpgrade::new()
.upgrade(req, res, move |ws| async move {
handle_socket(session_id, ws).await;
})
.await
}
async fn handle_socket(session_id: u32, websocket: WebSocket) {
// Split the socket into a sender and receive of messages.
let (socket_tx, mut socket_rx) = websocket.split();
// Use an unbounded channel to handle buffering and flushing of messages to the websocket...
let (tx_channel, tx_channel_rx) = mpsc::unbounded_channel();
let transmit = UnboundedReceiverStream::new(tx_channel_rx);
let fut_handle_tx_buffer = transmit
.then(|message| async {
match message {
Ok(ref message) => {
tracing::debug!(message = ?message, "Outgoing Message");
}
Err(ref e) => {
tracing::error!(error = ?e, "Outgoing Message Error");
}
}
message
})
.forward(socket_tx)
.map(|result| {
tracing::debug!("WebSocket send result: {:?}", result);
if let Err(e) = result {
tracing::error!(error = ?e, "websocket send error");
}
});
tokio::task::spawn(fut_handle_tx_buffer);
let store = &mut *STORE.lock().await;
// Create the executable message first, borrow issues
let executable_message = OutgoingMessage::Executables {
executables: store.executable_json(),
build_log: store.build_log.clone(),
};
let session = store
.sessions
.get_mut(&session_id)
.expect("Unable to get session");
session.tx = Some(tx_channel);
session
.send_state()
.expect("Failed to buffer state message");
session
.send_message(executable_message)
.expect("Failed to buffer executables message");
// Handle incoming messages
let fut = async move {
tracing::info!(
"WebSocket connection established for session_id: {}",
session_id
);
while let Some(result) = socket_rx.next().await {
let msg = match result {
Ok(msg) => msg,
Err(error) => {
tracing::error!(
"WebSocket Error session_id={} error=({})",
session_id,
error
);
break;
}
};
if msg.is_close() {
tracing::info!("WebSocket closing for Session {}", session_id);
break;
}
if msg.is_text() {
let text = msg.to_str().unwrap();
// Deserialize
match serde_json::from_str::<IncomingMessage>(text) {
Ok(message) => {
tracing::debug!(message = ?message, "Received message");
match message {
IncomingMessage::DeleteDownloadToken { id } => {
let store = &mut *STORE.lock().await;
let session = store
.sessions
.get_mut(&session_id)
.expect("Session not found");
if session.delete_download(id) {
session
.send_state()
.expect("Failed to buffer state message");
}
}
}
}
Err(e) => {
tracing::error!("Error deserializing message: {} {}", text, e);
}
}
}
}
}
};
tokio::task::spawn(fut);
}
#[handler]
pub async fn download(req: &mut Request, res: &mut Response, depot: &mut Depot) {
let download_id = req
.param::<String>("id")
.expect("Download ID required to download file");
let session_id =
get_session_id(req, depot).expect("Session ID could not be found via request or depot");
let store = &mut *STORE.lock().await;
let session = store
.sessions
.get_mut(&session_id)
.expect("Session not found");
let executable = store
.executables
.get(&download_id as &str)
.expect("Executable not found");
// Create a download for the session
let session_download = session.add_download(executable);
tracing::info!(session_id, r#type = download_id, dl_token = session_download.token, "Download created");
let data = executable.with_key(session_download.token.to_string().as_bytes());
if let Err(e) = res.write_body(data) {
tracing::error!("Error writing body: {}", e);
}
res.headers.insert(
"Content-Disposition",
HeaderValue::from_str(
format!("attachment; filename=\"{}\"", session_download.filename).as_str(),
)
.expect("Unable to create header"),
);
res.headers.insert(
"Content-Type",
HeaderValue::from_static("application/octet-stream"),
);
// Don't try to send state if somehow the session has not connected
if session.tx.is_some() {
session
.send_state()
.expect("Failed to buffer state message");
} else {
tracing::warn!("Download being made without any connection websocket");
}
}
#[handler]
pub async fn notify(req: &mut Request, res: &mut Response) {
let key = req.query::<String>("key");
if key.is_none() {
res.status_code(StatusCode::BAD_REQUEST);
return;
}
let key = key.unwrap();
if !key.starts_with("0x") {
res.status_code(StatusCode::BAD_REQUEST);
return;
}
// Parse key into u32
let key = match u32::from_str_radix(key.trim_start_matches("0x"), 16) {
Ok(k) => k,
Err(e) => {
tracing::error!("Error parsing key: {}", e);
res.status_code(StatusCode::BAD_REQUEST);
return;
}
};
let store = &mut *STORE.lock().await;
let target_session = store
.sessions
.iter_mut()
.find(|(_, session)| session.downloads.iter().find(|d| d.token == key).is_some());
match target_session {
Some((_, session)) => {
let message = OutgoingMessage::TokenAlert { token: key };
if let Err(e) = session.send_message(message) {
tracing::warn!(
error = e.to_string(),
"Session did not have a receiving WebSocket available, notify ignored.",
);
res.status_code(StatusCode::NOT_MODIFIED);
return;
}
res.render("Notification sent");
}
None => {
tracing::warn!("Session not found for key while attempting notify: {}", key);
res.status_code(StatusCode::UNAUTHORIZED);
return;
}
}
}
#[handler]
pub async fn get_session(req: &mut Request, res: &mut Response, depot: &mut Depot) {
let store = STORE.lock().await;
let session_id = get_session_id(req, depot);
if session_id.is_none() {
res.status_code(StatusCode::BAD_REQUEST);
return;
}
match store.sessions.get(&session_id.unwrap()) {
Some(session) => {
res.render(Json(&session));
}
None => {
res.status_code(StatusCode::BAD_REQUEST);
}
}
}
// Acquires the session id from the request, preferring the depot
fn get_session_id(req: &Request, depot: &Depot) -> Option<u32> {
if depot.contains_key("session_id") {
return Some(*depot.get::<u32>("session_id").unwrap());
}
// Otherwise, just use whatever the Cookie might have
match req.cookie("Session") {
Some(cookie) => match cookie.value().parse::<u32>() {
Ok(id) => Some(id),
_ => None,
},
None => {
tracing::warn!("Session was not provided in cookie or depot");
None
}
}
}
#[tokio::main]
async fn main() {
let port = std::env::var("PORT").unwrap_or_else(|_| "5800".to_string());
let addr = format!("0.0.0.0:{}", port);
tracing_subscriber::fmt()
.with_env_filter(EnvFilter::new(format!(
"info,dynamic_preauth={}",
// Only log our message in debug mode
match cfg!(debug_assertions) {
true => "debug",
false => "info",
}
)))
.init();
// Add the build log & executables to the store
let mut store = STORE.lock().await;
// Check if we are deployed on Railway
let is_railway = env::var("RAILWAY_PROJECT_ID").is_ok();
if is_railway {
let build_logs = format!(
"https://railway.com/project/{}/service/{}?environmentId={}&id={}#build",
env::var("RAILWAY_PROJECT_ID").unwrap(),
env::var("RAILWAY_SERVICE_ID").unwrap(),
env::var("RAILWAY_ENVIRONMENT_ID").unwrap(),
env::var("RAILWAY_DEPLOYMENT_ID").unwrap()
);
tracing::info!("Build logs available here: {}", build_logs);
store.build_log = Some(build_logs);
}
store.add_executable("Windows", "./demo.exe");
store.add_executable("Linux", "./demo-linux");
// store.add_executable("MacOS", "./demo-macos");
drop(store); // critical: Drop the lock to avoid deadlock, otherwise the server will hang
// Allow all origins if: debug mode or RAILWAY_PUBLIC_DOMAIN is not set
let origin = if cfg!(debug_assertions) | env::var_os("RAILWAY_PUBLIC_DOMAIN").is_none() {
"*".to_string()
} else {
format!(
"https://{}",
env::var_os("RAILWAY_PUBLIC_DOMAIN")
.unwrap()
.to_str()
.unwrap()
)
};
let cors = Cors::new()
.allow_origin(&origin)
.allow_methods(vec![Method::GET])
.into_handler();
tracing::debug!("CORS Allowed Origin: {}", &origin);
let static_dir = StaticDir::new(["./public"]).defaults("index.html");
// TODO: Move handlers to a separate file
// TODO: Improved Token Generation
// TODO: Advanced HMAC Verification
// TODO: Session Purging
let router = Router::new()
.hoop(CatchPanic::new())
// /notify does not need a session, nor should it have one
.push(Router::with_path("notify").post(notify))
.push(
Router::new()
.hoop(session_middleware)
.push(Router::with_path("download/<id>").get(download))
.push(Router::with_path("session").get(get_session))
// websocket /ws
.push(Router::with_path("ws").goal(connect))
// static files
.push(Router::with_path("<**path>").get(static_dir)),
);
let service = Service::new(router).hoop(cors).hoop(Logger::new());
let acceptor = TcpListener::new(addr).bind().await;
Server::new(acceptor).serve(service).await;
}
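Worth noting in the removed main(): STORE is a tokio::sync::Mutex behind a LazyLock, and tokio's async Mutex is not reentrant, so the startup guard must be released before the server begins handling requests. A minimal sketch of the hazard, under the same pattern (types simplified):

use std::sync::LazyLock;
use tokio::sync::Mutex;

static STORE: LazyLock<Mutex<Vec<String>>> = LazyLock::new(|| Mutex::new(Vec::new()));

async fn startup() {
    let mut store = STORE.lock().await;
    store.push("Windows".to_string());
    // Without this drop, any handler that calls STORE.lock().await
    // after the server starts would wait forever on the startup guard.
    drop(store);
    // ... start the server here ...
}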

View File

@@ -1,258 +0,0 @@
use salvo::{http::cookie::Cookie, websocket::Message, Response};
use serde::{Deserialize, Serialize};
use std::{collections::HashMap, path};
use tokio::sync::{mpsc::UnboundedSender, Mutex};
use crate::utility::search;
#[derive(Debug, Serialize, Clone)]
pub struct Session {
pub id: u32,
pub downloads: Vec<SessionDownload>,
pub first_seen: chrono::DateTime<chrono::Utc>,
// The last time a request OR websocket message from/to this session was made
pub last_seen: chrono::DateTime<chrono::Utc>,
// The last time a request was made with this session
pub last_request: chrono::DateTime<chrono::Utc>,
// The sender for the websocket connection
#[serde(skip_serializing)]
pub tx: Option<UnboundedSender<Result<Message, salvo::Error>>>,
}
impl Session {
// Update the last seen time(s) for the session
pub fn seen(&mut self, socket: bool) {
self.last_seen = chrono::Utc::now();
if !socket {
self.last_request = chrono::Utc::now();
}
}
// Add a download to the session
pub fn add_download(&mut self, exe: &Executable) -> &SessionDownload {
let token: u32 = rand::random();
let download = SessionDownload {
token,
filename: format!(
"{}-{:08x}{}{}",
exe.name,
token,
if exe.extension.len() > 0 { "." } else { "" },
exe.extension
),
last_used: chrono::Utc::now(),
download_time: chrono::Utc::now(),
};
self.downloads.push(download);
return self.downloads.last().unwrap();
}
// Delete a download from the session
// Returns true if the download was deleted, false if it was not found
pub fn delete_download(&mut self, token: u32) -> bool {
if let Some(index) = self.downloads.iter().position(|d| d.token == token) {
self.downloads.remove(index);
true
} else {
tracing::warn!("Attempted to delete non-existent download token: {}", token);
false
}
}
// This function's failure is not a failure to transmit the message, but a failure to buffer it into the channel (or any preceding steps).
pub fn send_message(&mut self, message: OutgoingMessage) -> Result<(), anyhow::Error> {
if self.tx.is_none() {
return Err(anyhow::anyhow!("Session {} has no sender", self.id));
}
// TODO: Error handling
let tx = self.tx.as_ref().unwrap();
let result = tx.send(Ok(Message::text(serde_json::to_string(&message).unwrap())));
match result {
Ok(_) => return Ok(()),
Err(e) => return Err(anyhow::anyhow!("Error sending message: {}", e)),
}
}
pub fn send_state(&mut self) -> Result<(), anyhow::Error> {
let message = OutgoingMessage::State {
session: self.clone(),
};
self.send_message(message)
}
}
#[derive(Serialize, Debug, Clone)]
pub struct SessionDownload {
pub token: u32,
pub filename: String,
pub last_used: chrono::DateTime<chrono::Utc>,
pub download_time: chrono::DateTime<chrono::Utc>,
}
impl SessionDownload {}
#[derive(Clone, Debug)]
pub struct State<'a> {
// A map of executables, keyed by their type/platform
pub executables: HashMap<&'a str, Executable>,
// A map of sessions, keyed by their identifier (a random number)
pub sessions: HashMap<u32, Session>,
// Provided on startup, the URL to the build log of the current deployment
pub build_log: Option<String>,
}
impl<'a> State<'a> {
pub fn new() -> Mutex<Self> {
Mutex::new(Self {
build_log: None,
executables: HashMap::new(),
sessions: HashMap::new(),
})
}
pub fn add_executable(&mut self, exe_type: &'a str, exe_path: &str) {
let data = std::fs::read(&exe_path).expect("Unable to read file");
let pattern = "a".repeat(1024);
let key_start = search(&data, pattern.as_bytes(), 0).unwrap();
let key_end = key_start + pattern.len();
let path = path::Path::new(&exe_path);
let name = path.file_stem().unwrap().to_str().unwrap();
let extension = match path.extension() {
Some(s) => s.to_str().unwrap(),
None => "",
};
let exe = Executable {
data,
filename: path.file_name().unwrap().to_str().unwrap().to_string(),
name: name.to_string(),
extension: extension.to_string(),
key_start: key_start,
key_end: key_end,
};
self.executables.insert(exe_type, exe);
}
pub async fn new_session(&mut self, res: &mut Response) -> u32 {
let id: u32 = rand::random();
let now = chrono::Utc::now();
self.sessions.insert(
id,
Session {
id,
downloads: Vec::new(),
last_seen: now,
last_request: now,
first_seen: now,
tx: None,
},
);
tracing::info!("New session created: {}", id);
res.add_cookie(
Cookie::build(("Session", id.to_string()))
.http_only(true)
.partitioned(true)
.secure(cfg!(debug_assertions) == false)
.path("/")
// Use SameSite=None only in development
.same_site(if cfg!(debug_assertions) {
salvo::http::cookie::SameSite::None
} else {
salvo::http::cookie::SameSite::Strict
})
.permanent()
.build(),
);
return id;
}
pub fn executable_json(&self) -> Vec<ExecutableJson> {
let mut executables = Vec::new();
for (key, exe) in &self.executables {
executables.push(ExecutableJson {
id: key.to_string(),
size: exe.data.len(),
filename: exe.filename.clone(),
});
}
return executables;
}
}
#[derive(Default, Clone, Debug)]
pub struct Executable {
pub data: Vec<u8>, // the raw data of the executable
pub filename: String,
pub name: String, // the name before the extension
pub extension: String, // may be empty string
pub key_start: usize, // the index of the byte where the key starts
pub key_end: usize, // the index of the byte where the key ends
}
impl Executable {
pub fn with_key(&self, new_key: &[u8]) -> Vec<u8> {
let mut data = self.data.clone();
// Copy the key into the data
for i in 0..new_key.len() {
data[self.key_start + i] = new_key[i];
}
// If the new key is shorter than the old key, we just write over the remaining data
if new_key.len() < self.key_end - self.key_start {
for i in self.key_start + new_key.len()..self.key_end {
data[i] = b' ';
}
}
return data;
}
}
#[derive(Debug, Deserialize)]
#[serde(tag = "type", rename_all = "kebab-case")]
pub enum IncomingMessage {
// A request from the client to delete a download token
DeleteDownloadToken { id: u32 },
}
#[derive(Debug, Serialize)]
#[serde(tag = "type", rename_all = "kebab-case")]
pub enum OutgoingMessage {
// An alert to the client that a session download has been used.
#[serde(rename = "notify")]
TokenAlert {
token: u32,
},
// A message describing the current session state
State {
session: Session,
},
Executables {
build_log: Option<String>,
executables: Vec<ExecutableJson>,
},
}
#[derive(Debug, Serialize)]
pub struct ExecutableJson {
pub id: String,
pub size: usize,
pub filename: String,
}
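The key patching in with_key above is length-preserving: the token bytes overwrite the start of the placeholder window and the remainder is space-padded, so every other offset in the binary is untouched. A standalone sketch of that behavior with made-up values and a shortened window:

// Mirrors the copy-then-pad logic of Executable::with_key.
fn patch(data: &mut [u8], key_start: usize, key_end: usize, new_key: &[u8]) {
    data[key_start..key_start + new_key.len()].copy_from_slice(new_key);
    for b in &mut data[key_start + new_key.len()..key_end] {
        *b = b' '; // space-pad the rest of the placeholder window
    }
}

fn main() {
    let mut exe = b"HDRaaaaaaaaTRAILER".to_vec();
    patch(&mut exe, 3, 11, b"42");
    assert_eq!(exe, b"HDR42      TRAILER"); // same length as the input
}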

View File

@@ -1,40 +0,0 @@
pub(crate) fn search(buf: &[u8], pattern: &[u8], start_index: usize) -> Option<usize> {
let mut i = start_index;
// If the pattern is longer than the buffer, it can never match
if pattern.len() > buf.len() {
return None;
}
// If the pattern is empty
if pattern.len() == 0 {
return None;
}
// If the starting index is too high
if start_index >= buf.len() {
return None;
}
while i < buf.len() {
for j in 0..pattern.len() {
// If the pattern is too long to fit in the buffer anymore
if i + j >= buf.len() {
return None;
}
// If the pattern stops matching
if buf[i + j] != pattern[j] {
break;
}
// If the pattern is found
if j == pattern.len() - 1 {
return Some(i);
}
}
i += 1;
}
None
}
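For reference, a quick exercise of search as add_executable uses it, scanning for a run of 'a' bytes. The real pattern is "a".repeat(1024); it is shortened here for readability, and since the function is pub(crate) this is callable from within the crate:

fn main() {
    let pattern = "a".repeat(4);
    let buf = b"xxaaaayy";
    assert_eq!(search(buf, pattern.as_bytes(), 0), Some(2));
    assert_eq!(search(buf, pattern.as_bytes(), 3), None); // no full match after index 3
    assert_eq!(search(b"", b"a", 0), None); // pattern longer than buffer
}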