Phase 1A — KC=F Coffee Futures Prices:
- New `extract/coffee_prices/` package (yfinance): downloads KC=F daily OHLCV, stores as gzip CSV with SHA256-based idempotency
- SQLMesh models: `raw/coffee_prices` → `foundation/fct_coffee_prices` → `serving/coffee_prices` (with 20d/50d SMA, 52-week high/low, daily return %)
- Dashboard: 4 metric cards + dual-line chart (close, 20d MA, 50d MA)
- API: `GET /commodities/<ticker>/prices`

Phase 1B — Data Methodology Page:
- New `/methodology` route with full-page template (`base.html`)
- 6 anchored sections: USDA PSD, CFTC COT, KC=F price, ICE warehouse stocks, data quality model, update schedule table
- "Methodology" link added to marketing footer

Phase 1C — Automated Pipeline:
- `supervisor.sh` updated: runs `extract_cot`, `extract_prices`, `extract_ice` in sequence before transform
- Webhook failure alerting via `ALERT_WEBHOOK_URL` env var (ntfy/Slack/Telegram)

ICE Warehouse Stocks:
- New `extract/ice_stocks/` package (niquests): normalizes ICE Report Center CSV to canonical schema, hash-based idempotency, soft-fail on 404 with guidance
- SQLMesh models: `raw/ice_warehouse_stocks` → `foundation/fct_ice_warehouse_stocks` → `serving/ice_warehouse_stocks` (30d avg, WoW change, 52w drawdown)
- Dashboard: 4 metric cards + line chart (certified bags + 30d avg)
- API: `GET /commodities/<code>/stocks`

Foundation:
- `dim_commodity`: added `ticker` (KC=F) and `ice_stock_report_code` (COFFEE-C) columns
- `macros/__init__.py`: added `prices_glob()` and `ice_stocks_glob()`
- `pipelines.py`: added `extract_prices` and `extract_ice` entries

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
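The webhook failure alerting from Phase 1C can be sketched as a small shell helper. This is illustrative, not the real `supervisor.sh` code: the function name `alert` is an assumption, and the plain-text POST body matches ntfy's API, while Slack and Telegram expect their own JSON payload shapes.

```shell
# Sketch of the ALERT_WEBHOOK_URL failure hook (the `alert` name is
# illustrative, not taken from supervisor.sh).
alert() {
  # No-op when alerting is not configured.
  [ -n "${ALERT_WEBHOOK_URL:-}" ] || return 0
  # Plain-text POST works for ntfy; Slack/Telegram need JSON payloads.
  curl -fsS -X POST -d "$1" "$ALERT_WEBHOOK_URL" >/dev/null || true
}

# Usage inside the pipeline, e.g.:
# extract_cot || alert "materia: extract_cot failed on $(hostname)"
```

The `|| true` keeps a broken webhook from masking the original pipeline failure.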
# Materia Infrastructure

Single-server, local-first setup for BeanFlows.coffee on Hetzner NVMe.
## Architecture

```
Hetzner Server (NVMe)
├── /opt/materia/                   # Git repo, code, uv environment
├── /data/materia/landing/          # Extracted USDA data (year/month subdirs)
├── /data/materia/lakehouse.duckdb  # SQLMesh output database
└── systemd services:
    ├── materia-supervisor          # Pulls git, runs extract + transform daily
    └── materia-backup.timer        # Syncs landing/ to R2 every 6 hours
```
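A minimal sketch of what the `materia-supervisor` unit could look like — the contents below are an assumption for illustration (the real unit ships with the repo, and the daily cadence could equally come from a companion timer rather than a long-running service):

```ini
# /etc/systemd/system/materia-supervisor.service (illustrative sketch)
[Unit]
Description=Materia daily extract + transform
After=network-online.target

[Service]
Type=oneshot
WorkingDirectory=/opt/materia
ExecStart=/opt/materia/infra/supervisor.sh

[Install]
WantedBy=multi-user.target
```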
## Data Flow

- **Extract**: USDA API → `/data/materia/landing/psd/{year}/{month}/{etag}.csv.gzip`
- **Transform**: SQLMesh reads landing CSVs → writes `/data/materia/lakehouse.duckdb`
- **Backup**: rclone syncs `/data/materia/landing/` → R2 `materia-raw/landing/`
- **Web**: reads `lakehouse.duckdb` (read-only)
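The extract step's idempotency can be illustrated with a content-addressed write: if a file with the same hash already exists in the landing directory, the write is skipped. This is a sketch under assumptions — the PSD path above keys on the USDA ETag rather than a local hash, and the real extractors live in `extract/`; the `land` helper name is invented for illustration.

```shell
# Sketch: content-addressed landing write. Re-running the same extract
# is a no-op because the destination name is derived from the content.
land() {
  local src="$1" dest_dir="$2"
  local hash dest
  hash=$(sha256sum "$src" | cut -c1-12)   # short content hash
  dest="$dest_dir/$hash.csv.gz"
  mkdir -p "$dest_dir"
  if [ -e "$dest" ]; then
    echo "skip $dest"                     # already landed, idempotent no-op
  else
    gzip -c "$src" > "$dest"
    echo "wrote $dest"
  fi
}
```

Running the same extract twice leaves the landing directory unchanged, so the supervisor can safely re-run after a partial failure.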
## Setup

### Prerequisites

- Hetzner server with NVMe storage
- Pulumi ESC configured (`beanflows/prod` environment)
- `GITLAB_READ_TOKEN` and `PULUMI_ACCESS_TOKEN` set
### Bootstrap

```shell
# From local machine or CI:
ssh root@<server_ip> 'bash -s' < infra/bootstrap_supervisor.sh
```
This installs dependencies, clones the repo, creates data directories, and starts the supervisor service.
## R2 Backup

1. Install rclone: `apt install rclone`
2. Copy the example config and fill in R2 credentials from Pulumi ESC:

   ```shell
   cp infra/backup/rclone.conf.example /root/.config/rclone/rclone.conf
   ```

3. Install the systemd units:

   ```shell
   cp infra/backup/materia-backup.service /etc/systemd/system/
   cp infra/backup/materia-backup.timer /etc/systemd/system/
   systemctl daemon-reload
   systemctl enable --now materia-backup.timer
   ```
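For orientation, the service/timer pair amounts to roughly the following — a sketch, not the shipped files (see `infra/backup/` for the real ones; the `r2:` remote name is an assumption matching whatever `rclone.conf` defines):

```ini
# materia-backup.service (illustrative sketch)
[Unit]
Description=Sync Materia landing data to R2

[Service]
Type=oneshot
ExecStart=/usr/bin/rclone sync /data/materia/landing/ r2:materia-raw/landing/

# materia-backup.timer (illustrative sketch)
[Unit]
Description=Run materia-backup every 6 hours

[Timer]
OnBootSec=5m
OnUnitActiveSec=6h

[Install]
WantedBy=timers.target
```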
## Pulumi IaC

Pulumi still manages the Cloudflare R2 buckets and can provision Hetzner instances:

```shell
cd infra
pulumi login
pulumi stack select prod
pulumi up
```
## Monitoring

```shell
# Supervisor status and logs
systemctl status materia-supervisor
journalctl -u materia-supervisor -f

# Backup timer status
systemctl list-timers materia-backup.timer
journalctl -u materia-backup -f
```
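Beyond tailing logs interactively, a scriptable probe can grep the supervisor's recent journal for failures — a sketch under assumptions (the `ERROR` marker presumes `supervisor.sh` logs errors with that word; the `check_last_run` name is invented here):

```shell
# Illustrative health probe: nonzero exit if the supervisor logged an
# ERROR today; prints "ok" when the journal is clean or unavailable.
check_last_run() {
  if journalctl -u materia-supervisor --since today 2>/dev/null | grep -q "ERROR"; then
    echo "unhealthy"
    return 1
  fi
  echo "ok"
}
```

Wired into cron alongside the backup timer, this gives a cheap daily heartbeat without any extra monitoring stack.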
## Cost

| Resource | Type | Cost |
|---|---|---|
| Hetzner Server | CCX22 (4 vCPU, 16 GB RAM) | ~€24/mo |
| R2 Storage | Backup (~10 GB) | ~$0.15/mo |
| R2 Egress | Zero-egress tier | $0.00 |
| **Total** | | ~€24/mo + ~$0.15/mo |