fix: export_serving — Arrow-based copy, rename to analytics.duckdb

Three fixes:

1. Cross-connection COPY: DuckDB doesn't support referencing another
   connection's tables as src.serving.table. Replace with Arrow as
   intermediate: src reads to Arrow, dst.register() + CREATE TABLE.

2. Catalog/schema name collision: naming the export file serving.duckdb
   made DuckDB assign catalog name "serving" — same as the schema we
   create inside it. Every serving.table query became ambiguous. Rename
   to analytics.duckdb (catalog "analytics", schema "serving" = no clash).

   SERVING_DUCKDB_PATH values updated: serving.duckdb → analytics.duckdb
   in supervisor, service, bootstrap, dev_run.sh, .env.example, docker-compose.

3. Temp file: use _export.duckdb (not serving.duckdb.tmp) to avoid
   the same catalog collision during the write phase.

Verified: 6 tables exported, serving.* queries work read-only.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
Author: Deeman
Date: 2026-02-22 12:54:39 +01:00
parent ac8ab47448
commit 9ee7a3d9d3

7 changed files with 21 additions and 16 deletions


@@ -8,10 +8,10 @@ ADMIN_EMAILS=admin@beanflows.coffee
 # Database
 DATABASE_PATH=data/app.db
 # DUCKDB_PATH points to the full pipeline DB (lakehouse.duckdb) — used by SQLMesh and export_serving.
-# SERVING_DUCKDB_PATH points to the serving-only export (serving.duckdb) — used by the web app.
+# SERVING_DUCKDB_PATH points to the serving-only export (analytics.duckdb) — used by the web app.
 # Run `uv run materia pipeline run export_serving` after each SQLMesh transform to populate it.
 DUCKDB_PATH=../local.duckdb
-SERVING_DUCKDB_PATH=../serving.duckdb
+SERVING_DUCKDB_PATH=../analytics.duckdb
 # Auth
 MAGIC_LINK_EXPIRY_MINUTES=15


@@ -10,7 +10,7 @@ services:
 env_file: .env
 environment:
   - DATABASE_PATH=/app/data/app.db
-  - SERVING_DUCKDB_PATH=/app/duckdb/serving.duckdb
+  - SERVING_DUCKDB_PATH=/app/duckdb/analytics.duckdb
 healthcheck:
   test: ["CMD", "curl", "-f", "http://localhost:5000/health"]
   interval: 30s


@@ -55,14 +55,14 @@ make css-build
 ok "CSS built"
 # -- Pipeline (first-time only) ----------------------------------------------
-# Runs extract → transform → export_serving from the repo root if serving.duckdb
+# Runs extract → transform → export_serving from the repo root if analytics.duckdb
 # does not exist yet. Subsequent dev_run.sh invocations skip this — delete
-# serving.duckdb from the repo root to force a full re-run.
+# analytics.duckdb from the repo root to force a full re-run.
 REPO_ROOT="$(cd .. && pwd)"
 PIPELINE_LANDING="$REPO_ROOT/data/landing"
 PIPELINE_DUCKDB="$REPO_ROOT/local.duckdb"
-PIPELINE_SERVING="$REPO_ROOT/serving.duckdb"
+PIPELINE_SERVING="$REPO_ROOT/analytics.duckdb"
 if [ ! -f "$PIPELINE_SERVING" ]; then
   info "First run — fetching and transforming data (this may take a few minutes)"