Compare commits

93 Commits

Author SHA1 Message Date
Deeman
add5f8ddfa fix(extract): correct lc_lci_lev lcstruct filter value
CI: test (push) successful in 53s; tag (push) successful in 3s
2026-03-05 17:39:37 +01:00
Deeman
15ca316682 fix(extract): correct lc_lci_lev lcstruct filter value
D1_D2_A_HW doesn't exist in the API; use D1_D4_MD5 (total labour cost
= compensation + taxes - subsidies).

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-05 17:32:49 +01:00
Deeman
103ef73cf5 fix(pipeline): eurostat filter bugs + supervisor uses sqlmesh plan
CI: test (push) successful in 53s; tag (push) successful in 3s
2026-03-05 17:19:21 +01:00
Deeman
aa27f14f3c fix(pipeline): eurostat filter bugs + supervisor uses sqlmesh plan
- nrg_pc_203: add missing unit=KWH filter (API returns 2 units)
- lc_lci_lev: fix currency→unit filter dimension name
- supervisor: use `sqlmesh plan prod --auto-apply` instead of
  `sqlmesh run` so new/changed models are detected automatically

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-05 17:19:12 +01:00
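The two filter fixes above amount to pinning extra query dimensions per dataset. A minimal sketch of what those filter dicts could look like as Eurostat-style query parameters — the dataset codes come from the commit, but the parameter values and the `build_query` helper are assumptions of this sketch, not the extractor's real config:

```python
from urllib.parse import urlencode

# Hypothetical per-dataset filter dicts; values other than the dataset
# codes and the unit=KWH / currency->unit fixes are illustrative only.
FILTERS = {
    "nrg_pc_203": {"unit": "KWH"},   # pin one of the 2 units the API returns
    "lc_lci_lev": {"unit": "EUR"},   # the dimension is "unit", not "currency"
}

def build_query(dataset: str, fmt: str = "JSON") -> str:
    """Build a Eurostat-style query string from the per-dataset filters."""
    params = {"format": fmt, **FILTERS[dataset]}
    return f"{dataset}?{urlencode(sorted(params.items()))}"
```

Without the `unit=KWH` entry, every row would come back twice (once per unit) and double-count downstream aggregates.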
Deeman
8205744444 chore: remove accidentally committed .claire/ worktree directory
CI: test (push) successful in 56s; tag (push) successful in 3s
Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-05 17:10:48 +01:00
Deeman
1cbefe349c add env var 2026-03-05 17:08:52 +01:00
Deeman
003f19e071 fix(pipeline): handle DuckDB catalog naming in diagnostic script 2026-03-05 17:07:52 +01:00
Deeman
c3f15535b8 fix(pipeline): handle DuckDB catalog naming in diagnostic script
The lakehouse.duckdb file uses catalog "lakehouse" not "local", causing
SQLMesh logical views to break. Script now auto-detects the catalog via
USE and falls back to physical tables when views fail.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-05 17:06:44 +01:00
Deeman
fcb8ec4227 merge: pipeline diagnostic script + extraction card UX improvements
CI: test (push) successful in 54s; tag (push) successful in 3s
2026-03-05 15:40:16 +01:00
Deeman
6b7fa45bce feat(admin): add pipeline diagnostic script + extraction card UX improvements
- Add scripts/check_pipeline.py: read-only diagnostic for pricing pipeline
  row counts, date range analysis, HAVING filter impact, join coverage
- Add description field to all 12 workflows in workflows.toml
- Parse and display descriptions on extraction status cards
- Show spinner + "Running" state with blue-tinted card border
- Display start time with "running..." text for active extractions

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-05 15:40:12 +01:00
Deeman
0d8687859d fix(docker): copy workflows.toml into container for admin pipeline view
CI: test (push) successful in 53s; tag (push) successful in 3s
The admin Extraction Status page reads infra/supervisor/workflows.toml
but the Dockerfile only copied web/ into the image. Add the COPY so
the file exists at /app/infra/supervisor/workflows.toml in the
container.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-05 15:16:07 +01:00
Deeman
b064e18aa1 fix(admin): resolve workflows.toml path via CWD instead of __file__
CI: test (push) successful in 54s; tag (push) successful in 3s
In prod the package is installed in a venv, so __file__.parents[4] doesn't
reach the repo root. Use CWD (repo root in both dev and prod via systemd
WorkingDirectory) with REPO_ROOT env var override.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-05 14:39:30 +01:00
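The path-resolution change above can be sketched in a few lines. This is an assumption-laden reconstruction of the approach the commit describes, not the real code: prefer an explicit `REPO_ROOT` env var override, else fall back to the process CWD, which is the repo root in both dev and prod via systemd `WorkingDirectory`:

```python
import os
from pathlib import Path

def workflows_path() -> Path:
    """Resolve workflows.toml relative to the repo root.

    REPO_ROOT env var wins; otherwise assume CWD is the repo root
    (true in dev and, via systemd WorkingDirectory, in prod).
    """
    root = Path(os.environ.get("REPO_ROOT") or Path.cwd())
    return root / "infra" / "supervisor" / "workflows.toml"
```

Unlike `__file__.parents[4]`, this keeps working when the package is installed into a venv and no longer lives inside the repo tree.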
Deeman
dc68976148 docs(marketing): add GTM, social posts, Reddit plan, and SEO calendar
CI: test (push) successful in 54s; tag (push) successful in 3s
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-05 11:43:11 +01:00
Deeman
60fa2bc720 test(billing): add Stripe E2E test scripts for sandbox validation
- test_stripe_sandbox.py: API-only validation of all 17 products (67 tests)
- stripe_e2e_setup.py: webhook endpoint registration via ngrok
- stripe_e2e_test.py: live webhook tests with real DB verification (67 tests)
- stripe_e2e_checkout_test.py: checkout webhook tests for credit packs,
  sticky boosts, and business plan PDF purchases (40 tests)

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-05 10:50:26 +01:00
Deeman
66c2dfce66 fix(billing): fetch line items for checkout.session.completed webhooks
_extract_line_items() was returning [] for all checkout sessions, which
meant _handle_transaction_completed never processed credit packs, sticky
boosts, or business plan PDF purchases. Now fetches line items from the
Stripe API using the session ID, with a fallback to embedded line_items.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-05 10:49:41 +01:00
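The fetch-then-fallback behaviour described above can be sketched as follows. `fetch_from_api` stands in for the real Stripe client call and the payload shapes are simplified assumptions, not the actual `_extract_line_items()` implementation:

```python
def extract_line_items(session: dict, fetch_from_api=None) -> list:
    """Return line items for a checkout session.

    Prefer fetching by session id (webhook payloads usually omit
    line_items); fall back to any embedded line_items in the payload.
    """
    if fetch_from_api is not None:
        try:
            items = fetch_from_api(session["id"])
            if items:
                return items
        except Exception:
            pass  # API unreachable: fall back to the embedded payload
    embedded = session.get("line_items") or {}
    return embedded.get("data", [])
```

Returning `[]` unconditionally, as the pre-fix code effectively did, meant downstream handlers never saw credit packs or boosts at all.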
Deeman
6e3c5554aa fix(admin): enable bulk actions in grouped articles view
CI: test (push) successful in 54s; tag (push) successful in 3s
- dev_run.sh: also remove app.db-shm and app.db-wal on reset to fix
  SQLite disk I/O error from stale WAL/SHM files
- articles bulk: add checkboxes to grouped rows (data-ids holds all
  variant IDs); checking a group selects EN+DE together
- restore select-all checkbox in grouped <th>
- add toggleArticleGroupSelect() JS function
- fix htmx:afterSwap to re-check group checkboxes correctly

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-05 09:48:54 +01:00
Deeman
ad02140594 fix(quote): add missing required asterisk and error hint to step 4
CI: test (push) successful in 54s; tag (push) successful in 3s
Step 4 (Project Phase) required location_status server-side but had no
visual "*" indicator and no error message when submitting without a
selection. All other steps already had both.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-04 22:21:57 +01:00
Deeman
5bcd87d7e5 fix(ci): replace non-existent quote.wizard endpoint with leads.quote_request
CI: test (push) successful in 53s; tag (push) successful in 3s
The CRO homepage overhaul (f4f8a45) introduced url_for('quote.wizard')
in landing.html, but that endpoint never existed — the actual route is
leads.quote_request. This broke CI runs #99–#109.

Also adds landing_vs_col_us to i18n allowlist (brand name, same in both
languages).

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-04 22:12:45 +01:00
Deeman
77772b7ea4 feat(maps): beanflows-style divIcon bubbles + feature flag gate
Replace L.circleMarker with L.divIcon + .pn-marker CSS class (white
border, box-shadow, hover scale) matching the beanflows growing
conditions map pattern. Dark .map-tooltip CSS override (no arrow,
dark navy background). Small venue dots use .pn-venue class.

Add _require_maps_flag() to all 4 API endpoints (default=True so
dev works without seeding the flag row). Gate /opportunity-map route
the same way.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-04 20:51:00 +01:00
Deeman
59f1f0d699 merge(worktree): interactive maps for market pages
Self-hosted Leaflet 1.9.4 maps across 4 placements: markets hub
country bubbles, country overview city bubbles, city venue dots, and
a standalone opportunity map. New /api blueprint with 4 JSON endpoints.
New city_venue_locations SQLMesh serving model. No CDN — GDPR-safe.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

# Conflicts:
#	CHANGELOG.md
2026-03-04 15:36:41 +01:00
Deeman
0a89ba2213 docs: update CHANGELOG with interactive maps feature
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-04 15:33:35 +01:00
Deeman
6e936dbb95 feat(maps): Phase 5 — standalone opportunity map page
New route GET /<lang>/opportunity-map renders a full-width Leaflet map
with a country selector. On country change, fetches
/api/opportunity/{slug}.json and renders opportunity circles
(color-coded by score, sized by population) plus existing-venue gray
reference dots from /api/markets/{country}/cities.json.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-04 15:32:56 +01:00
Deeman
edf678ac4e feat(maps): Phase 4 — city venue dot map
New serving model: city_venue_locations joins dim_venues + dim_cities
to expose lat/lon/court_count per venue for the city dot map endpoint.

pseo_city_costs_de.sql: add c.lat, c.lon so city-cost articles have
city coordinates for the #city-map data attributes.

city-cost-de.md.jinja: add #city-map div (both DE and EN sections)
after the stats strip. Leaflet init handled by article_detail.html.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-04 15:07:06 +01:00
Deeman
0eef455543 feat(maps): Phase 3 — country overview city bubble map + article_detail Leaflet loader
Add #country-map div to country-overview.md.jinja (both DE/EN).
article_detail.html: always include Leaflet CSS, conditionally load
Leaflet JS only when #country-map or #city-map divs are present.
Initializes country city-bubble map and city venue-dot map from
/api/markets/{slug}/cities.json and /api/markets/{country}/{city}/venues.json.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-04 13:15:41 +01:00
Deeman
8e53fda283 feat(maps): Phase 2 — markets hub country bubble map
Add Leaflet map to /markets with country-level bubbles sized by
total_venues and colored by avg_market_score. Click navigates to
country overview page.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-04 13:04:40 +01:00
Deeman
db0d7cfee9 feat(maps): Phase 1 — Leaflet vendor files, API blueprint, app registration
Self-host Leaflet 1.9.4 JS/CSS/images in static/vendor/leaflet/.
Create api.py blueprint with 4 JSON endpoints for map data.
Register api_bp at /api in app.py (before content catch-all).

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-04 12:46:13 +01:00
Deeman
61c197d233 merge(worktree): individualise article costs with per-country Eurostat data + tiered proxy tenant work
# Conflicts:
#	CHANGELOG.md
#	transform/sqlmesh_padelnomics/models/foundation/dim_cities.sql
#	transform/sqlmesh_padelnomics/models/foundation/dim_locations.sql
2026-03-04 12:44:56 +01:00
Deeman
2e68cfbe4f feat(transform): individualise article costs with per-country Eurostat data
Add real per-country cost data to ~30 calculator fields so pSEO articles
show country-specific CAPEX/OPEX instead of hardcoded DE defaults.

Extractor:
- eurostat.py: add 8 new datasets (nrg_pc_205, nrg_pc_203, lc_lci_lev,
  5×prc_ppp_ind variants); add optional `dataset_code` field so multiple
  dict entries can share one Eurostat API endpoint

Staging (4 new models):
- stg_electricity_prices — EUR/kWh by country, semi-annual
- stg_gas_prices         — EUR/GJ by country, semi-annual
- stg_labour_costs       — EUR/hour by country, annual (future staffed scenario)
- stg_price_levels       — PLI indices (EU27=100) for 5 categories, annual

Foundation:
- dim_countries (new) — conformed country dimension; eliminates ~50-line CASE
  blocks duplicated in dim_cities/dim_locations; computes ~29 calculator cost
  override columns from PLI ratios and energy price ratios vs DE baseline;
  NULL for DE so calculator falls through to DEFAULTS unchanged
- dim_cities — replace country_name/slug CASE blocks + country_income CTE
  with JOIN dim_countries
- dim_locations — same refactor as dim_cities

Serving:
- pseo_city_costs_de — JOIN dim_countries; add 29 camelCase override columns
  auto-applied by calculator (electricity, heating, rentSqm, hallCostSqm, …)
- planner_defaults — JOIN dim_countries; same 29 cost columns flow through
  to /api/market-data endpoint

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-04 10:09:48 +01:00
Deeman
7af6f32a2b merge: bulk actions for articles and leads
CI: test (push) failing after 33s; tag (push) skipped
2026-03-04 09:55:19 +01:00
Deeman
53fdbd9fd5 docs: update CHANGELOG with bulk actions feature
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-04 09:55:04 +01:00
Deeman
81487d6f01 feat(admin): bulk actions for articles and leads
Add bulk selection checkboxes and action bars to the articles and leads
admin pages, replicating the existing supplier bulk pattern.

Articles: publish, unpublish, toggle noindex, rebuild, delete (with
confirmation dialog). Leads: set status, set heat. Both re-render the
results partial after action via HTMX, preserving current filters.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-04 09:40:26 +01:00
Deeman
477f635bc5 test(billing): Stripe E2E webhook lifecycle tests
CI: test (push) failing after 29s; tag (push) skipped
2026-03-03 18:17:10 +01:00
Deeman
4dbded74ca test(billing): add Stripe E2E webhook lifecycle tests
16 tests covering the full Stripe webhook flow through /billing/webhook/stripe:
- Subscription creation (customer.subscription.created → DB row)
- Period end extraction from items (Stripe API 2026-02+ compatibility)
- Billing customer creation
- Status updates (active, past_due, trialing)
- Cancellation (customer.subscription.deleted → cancelled)
- Payment failure (invoice.payment_failed → past_due)
- One-time payments (checkout.session.completed mode=payment)
- Full lifecycle: create → update → recover → cancel
- Edge cases: missing metadata, empty items, invalid JSON, bad signatures

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-03 18:17:05 +01:00
Deeman
230406f34f fix(billing): period_end from Stripe items + test 2026-03-03 18:06:01 +01:00
Deeman
7da6a4737d fix(billing): extract current_period_end from Stripe subscription items
Stripe API 2026-02+ moved current_period_end from subscription to
subscription items. Add _get_period_end() helper that falls back to
items[0].current_period_end when the subscription-level field is None.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-03 18:05:55 +01:00
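The fallback the commit describes can be sketched like this — a minimal reconstruction assuming plain-dict payloads, not the real `_get_period_end()` helper:

```python
def get_period_end(subscription: dict):
    """Read current_period_end, falling back to the first item.

    Stripe API 2026-02+ moved current_period_end from the subscription
    object onto its items, so the top-level field may be None.
    """
    period_end = subscription.get("current_period_end")
    if period_end is None:
        items = subscription.get("items", {}).get("data", [])
        if items:
            period_end = items[0].get("current_period_end")
    return period_end
```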
Deeman
710e21a186 fix(billing): handle customer.subscription.created + test isolation 2026-03-03 17:58:15 +01:00
Deeman
72c4de91b0 fix(billing): handle customer.subscription.created webhook + test isolation
- Add customer.subscription.created → subscription.activated mapping in
  stripe.parse_webhook so direct API subscription creation also creates DB rows
- Add customer.subscription.created to setup_stripe.py enabled_events
- Pin PAYMENT_PROVIDER=paddle and STRIPE_WEBHOOK_SECRET="" in test conftest
  so billing tests don't hit real Stripe API when env has Stripe keys
- Add 8 unit tests for stripe.parse_webhook covering all event types

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-03 17:29:13 +01:00
Deeman
046be665db merge: fix remaining request_options in stripe.py 2026-03-03 16:46:48 +01:00
Deeman
7c5fa86fb8 fix(billing): remove remaining request_options from Price.retrieve calls
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-03 16:46:25 +01:00
Deeman
0a9f980813 merge: fix Stripe SDK request_options + webhook endpoint graceful failure 2026-03-03 16:36:58 +01:00
Deeman
2682e810fa fix(billing): remove invalid request_options from Stripe SDK calls
Stripe Python SDK doesn't accept request_options as a kwarg to create/retrieve/modify.
Timeouts are handled by the global max_network_retries setting.
Also gracefully handle webhook endpoint creation failure for localhost URLs.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-03 16:36:47 +01:00
Deeman
10af6a284c fix(content): slug transliteration, article links, country overview ranking
CI: test (push) failing after 30s; tag (push) skipped
# Conflicts:
#	CHANGELOG.md
2026-03-03 16:29:41 +01:00
Deeman
68f354ac2b docs: update CHANGELOG for slug fix + country overview ranking
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-03 16:26:55 +01:00
Deeman
0b74156ef7 merge: accept alternative Stripe env var names 2026-03-03 16:24:25 +01:00
Deeman
fab16cb48f fix(billing): accept STRIPE_API_PRIVATE_KEY / STRIPE_API_PUBLIC_KEY env var names
Also normalise PAYMENT_PROVIDER to lowercase so STRIPE/stripe both work.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-03 16:24:03 +01:00
Deeman
062a6d2766 merge: Stripe payment provider (dispatch-by-config alongside Paddle) 2026-03-03 16:07:52 +01:00
Deeman
80c2f111d2 feat(billing): B4-B5 — tests, lint fixes, CHANGELOG + PROJECT.md
- Fix unused imports in stripe.py (hashlib, hmac, time)
- Update test_billing_routes.py: insert into payment_products table,
  fix mock paths for extracted paddle.py, add Stripe webhook 404 test
- Update CHANGELOG.md with Stripe provider feature
- Update PROJECT.md Done section

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-03 16:07:30 +01:00
Deeman
7ae8334d7a feat(billing): B3 — setup_stripe.py product/price creation script
Mirrors setup_paddle.py structure:
- Creates 17 products + prices in Stripe (same keys, same prices)
- Writes to payment_products table with provider='stripe'
- Registers webhook endpoint at /billing/webhook/stripe
- tax_behavior='exclusive' (price + VAT on top, EU standard)
- Supports --sync flag to re-populate from existing Stripe products

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-03 15:53:38 +01:00
Deeman
032fe8d86c feat(billing): B2 — Stripe payment provider implementation
billing/stripe.py exports the same interface as paddle.py:
- build_checkout_payload() → Stripe Checkout Session with automatic_tax
- build_multi_item_checkout_payload() → multi-line-item sessions
- cancel_subscription() → cancel_at_period_end=True
- get_management_url() → Stripe Billing Portal session
- verify_webhook() → Stripe-Signature header verification
- parse_webhook() → maps Stripe events to shared format:
  checkout.session.completed → subscription.activated / transaction.completed
  customer.subscription.updated → subscription.updated
  customer.subscription.deleted → subscription.canceled
  invoice.payment_failed → subscription.past_due

All API calls have 10s timeout and max 2 retries.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-03 15:48:08 +01:00
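The event mapping listed above is essentially a dispatch table. A sketch under stated assumptions: the event names come from the commit message, but the mode-based split for `checkout.session.completed` and the normalized payload shape are this sketch's inventions, not the real `parse_webhook()`:

```python
# Event-name mapping taken from the commit message.
STRIPE_EVENT_MAP = {
    "customer.subscription.updated": "subscription.updated",
    "customer.subscription.deleted": "subscription.canceled",
    "invoice.payment_failed": "subscription.past_due",
}

def parse_webhook(event: dict):
    """Map a Stripe webhook event to the shared provider-agnostic format."""
    etype = event.get("type")
    if etype == "checkout.session.completed":
        session = event["data"]["object"]
        # Assumed split: subscription checkouts activate, one-time
        # payments (mode=payment) become completed transactions.
        mapped = ("subscription.activated"
                  if session.get("mode") == "subscription"
                  else "transaction.completed")
    else:
        mapped = STRIPE_EVENT_MAP.get(etype)
    if mapped is None:
        return None  # unrecognised event types are ignored
    return {"event": mapped, "object": event["data"]["object"]}
```

Keeping the mapping in a flat dict means adding a new Stripe event (as a later commit does for `customer.subscription.created`) is a one-line change.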
Deeman
4907bc8b64 feat(billing): B1 — add stripe SDK dependency
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-03 15:37:29 +01:00
Deeman
bf69270913 feat(billing): A6 — planner/supplier routes use get_price_id() + _provider()
- planner/routes.py: import get_price_id instead of get_paddle_price,
  export_checkout uses _provider().build_checkout_payload()
- suppliers/routes.py: all get_paddle_price → get_price_id,
  signup_checkout uses _provider().build_multi_item_checkout_payload(),
  dashboard boosts use get_all_price_ids() bulk load

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-03 15:36:12 +01:00
Deeman
8f0a56079f feat(billing): A5 — dual-path JS templates for Paddle overlay / Stripe redirect
- New _payment_js.html: conditionally loads Paddle.js or nothing (Stripe
  uses server-side Checkout Session). Provides startCheckout() helper.
- All checkout templates use _payment_js.html instead of _paddle.html
- export.html, signup_step_4.html: Paddle.Checkout.open() → startCheckout()
- dashboard_boosts.html: inline onclick → buyItem() with server round-trip
- New /billing/checkout/item endpoint for single-item purchases (boosts, credits)

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-03 15:31:52 +01:00
Deeman
7af9b2c82c feat(billing): A2+A4 — extract paddle.py + dispatch layer in routes.py
- New billing/paddle.py: Paddle-specific functions (build_checkout_payload,
  cancel_subscription, get_management_url, verify_webhook, parse_webhook)
- routes.py: _provider() dispatch function selects paddle or stripe module
- Checkout/manage/cancel routes now delegate to _provider()
- /webhook/paddle always active (existing subscribers)
- /webhook/stripe endpoint added (returns 404 until Stripe configured)
- Shared _handle_webhook_event() processes normalized events from any provider
- _price_id_to_key() queries payment_products with paddle_products fallback

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-03 15:26:47 +01:00
Deeman
276328af33 feat(billing): A1+A3 — payment_products table + provider-agnostic price lookups
- Migration 0028: create payment_products table, copy paddle_products rows
- Add STRIPE_SECRET_KEY, STRIPE_PUBLISHABLE_KEY, STRIPE_WEBHOOK_SECRET config
- Make PAYMENT_PROVIDER read from env (was hardcoded "paddle")
- Add get_price_id() / get_all_price_ids() querying payment_products
- Keep get_paddle_price() as deprecated fallback alias

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-03 15:07:10 +01:00
Deeman
a00c8727d7 fix(content): slugify transliteration + article links + country overview ranking
- Add @slugify SQLMesh macro (STRIP_ACCENTS + ß→ss) replacing broken
  inline REGEXP_REPLACE that dropped non-ASCII chars (Düsseldorf → d-sseldorf)
- Apply @slugify to dim_venues, dim_cities, dim_locations
- Fix Python slugify() to pre-replace ß→ss before NFKD normalization
- Add language prefix to B2B article market links (/markets/germany → /de/markets/germany)
- Change country overview top-5 ranking: venue count (not raw market_score)
  for top cities, population for top opportunity cities

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-03 10:46:30 +01:00
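The Python side of the slug fix can be sketched as below. The ordering is the whole point: NFKD leaves ß intact (it has no compatibility decomposition), so ß must be replaced with ss before normalization, while accented characters like ü decompose into a base letter plus a combining mark that can be stripped. A simplified sketch, not the project's actual `slugify()`:

```python
import re
import unicodedata

def slugify(text: str) -> str:
    """Transliterating slugify: Düsseldorf -> dusseldorf, Straße -> strasse."""
    text = text.replace("ß", "ss")                       # before NFKD: ß never decomposes
    text = unicodedata.normalize("NFKD", text)           # ü -> u + combining diaeresis
    text = "".join(ch for ch in text if not unicodedata.combining(ch))
    text = re.sub(r"[^a-zA-Z0-9]+", "-", text).strip("-")
    return text.lower()
```

The broken inline `REGEXP_REPLACE` dropped the non-ASCII character outright, which is how Düsseldorf became d-sseldorf.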
Deeman
0fc0ca66b1 fix(i18n): replace smart quotes with straight quotes in sup_hero_sub
CI: test (push) failing after 29s; tag (push) skipped
Curly quotes (U+201C/U+201D) were used as JSON key/value delimiters
on line 894 of both locale files, making them invalid JSON.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-03 06:53:39 +01:00
Deeman
385deb7f81 feat(cro): CRO overhaul — homepage + supplier landing pages (JTBD rewrite)
CI: test (push) failing after 9s; tag (push) skipped
2026-03-03 06:44:30 +01:00
Deeman
3ddb26ae0f chore: update CHANGELOG.md and PROJECT.md for CRO overhaul
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-03 06:44:19 +01:00
Deeman
695e956501 feat(cro): German translations for all CRO copy changes
Native-quality DE translations for homepage + supplier page:
- Hero, ROI, features, FAQ, final CTA, meta/SEO
- Proof strip, struggling moments, "Why Padelnomics" comparison
- Supplier proof points, ROI line, struggling moments, pricing CTAs

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-03 06:42:16 +01:00
Deeman
a862d21269 feat(cro): supplier page CRO — struggling moments, conditional stats, honest proof
Task 3: Add "Is this your sales team?" struggling-moments section.
Conditional stats display (hide if below thresholds). Replace anonymous
testimonials with data-backed proof points. Tier-specific pricing CTAs.
Tighter hero sub-headline. Move ROI callout above pricing grid.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-03 06:36:46 +01:00
Deeman
f4f8a45654 feat(cro): homepage structural overhaul — proof strip, struggling moments, comparison
Task 2: Remove journey timeline (3 "SOON" badges = incomplete signal).
Add proof strip below hero with live stats. Add "Sound familiar?"
section with 4 JTBD struggling-moment cards. Add "Why Padelnomics"
3-column comparison (DIY vs consultant vs us). Update hero secondary
CTA and supplier matching links to /quote. Route handler now passes
calc_requests and total_budget_millions to template.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-03 06:33:43 +01:00
Deeman
9e471f8960 feat(cro): rewrite homepage EN copy — outcome-focused JTBD framing
Task 1: Hero, features, FAQ, final CTA, supplier matching, meta/SEO
strings all rewritten. New keys added for proof strip, struggling-
moments section, and "Why Padelnomics" comparison section.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-02 22:31:24 +01:00
Deeman
48401bd2af feat(articles): rewrite B2B article CTAs — directory → /quote form
CI: test (push) successful in 50s; tag (push) successful in 3s
All 12 hall-building articles now link to /quote (leads.quote_request).
Previously: 2 had broken directory prose, 4 had unlinked planner mentions,
4 had broken [→ placeholder] links, 2 had scenario cards but no CTA link.

- Group 1 (bauen/build-guide): replace directory section with quote CTA
- Group 2 (kosten/risiken): link planner refs, append quote CTA
- Group 3 (finanzierung): append quote CTA after scenario card
- Group 4 (standort/businessplan): fix broken [→] links to /de|en/planner,
  append quote CTA

CTA copy is contextual per article. Light-blue banner pattern, .btn class.
B2C gear articles unaffected.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-02 14:55:28 +01:00
Deeman
cd02726d4c chore(changelog): document B2B article CTA rewrite
Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-02 14:55:20 +01:00
Deeman
fbc259cafa fix(articles): fix broken CTA links + add /quote CTA in location and business plan articles
- padel-standort-analyse-de, padel-hall-location-guide-en:
  fix [→ ...] placeholders to /de/planner and /en/planner
  append quote CTA "Standort gefunden? Angebote einholen"
- padel-business-plan-bank-de, padel-business-plan-bank-requirements-en:
  fix [→ Businessplan erstellen] / [→ Generate your business plan] to planner
  append quote CTA "Bankfähige Zahlen plus passende Baupartner"

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-02 14:39:59 +01:00
Deeman
992e448c18 fix(articles): add /quote CTA after scenario card in financing articles
Appends contextual quote CTA block to padel-halle-finanzierung-de.md
and padel-hall-financing-germany-en.md after the scenario card embed.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-02 14:29:01 +01:00
Deeman
777a4af505 fix(articles): add /quote CTA + planner links in cost and risk articles
- padel-halle-kosten-de, padel-hall-cost-guide-en: link planner ref,
  append quote CTA "Zahlen prüfen — Angebote einholen"
- padel-halle-risiken-de, padel-hall-investment-risks-en: link planner
  in sensitivity tab mention, append quote CTA on risk management

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-02 14:18:46 +01:00
Deeman
2c8c662e9e fix(articles): replace directory CTA with /quote in build guides
Removes the broken "find suppliers" directory section from
padel-halle-bauen-de.md and padel-hall-build-guide-en.md.
Replaces with a contextual light-blue quote CTA block linking to /quote.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-02 14:17:28 +01:00
Deeman
34f8e45204 merge(articles): iframe preview + collapsible meta + word count 2026-03-02 12:09:04 +01:00
Deeman
6b9187f420 fix(articles): iframe preview + collapsible meta + word count
Replace the auto-escaped `{{ body_html }}` div (showed raw HTML tags)
with a sandboxed `<iframe srcdoc>` pattern matching the email preview.
Both the initial page load and the HTMX live-update endpoint now build
a full `preview_doc` document embedding the public CSS and wrapping
content in `<div class="article-body">` — pixel-perfect against the
live article, admin styles fully isolated.

Also:
- Delete ~65 lines of redundant `.preview-body` custom CSS
- Add "Meta ▾" toolbar toggle to collapse/expand metadata strip
- Add word count footer in the editor pane (updates on input)

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-02 12:01:16 +01:00
Deeman
94d92328b8 merge: fix article .md lookup + lighter editor
CI: test (push) successful in 51s; tag (push) successful in 3s
2026-03-02 11:47:13 +01:00
Deeman
100e200c3b fix(articles): find .md by slug scan + lighter editor theme
Two fixes:
- _find_article_md() scans _ARTICLES_DIR for files whose frontmatter
  slug matches, so padel-halle-bauen-de.md is found for slug
  'padel-halle-bauen'. The previous exact-name lookup missed any file
  where the filename ≠ slug (e.g. {slug}-{lang}.md naming convention).
- Editor pane: replace dark navy background with warm off-white (#FEFDFB)
  and dark text so it reads like a document, not a code editor.
2026-03-02 11:43:26 +01:00
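The slug-scan lookup described above can be sketched as follows. The frontmatter-parsing regex is a simplified assumption (the real code presumably reuses its existing frontmatter parser), as is the function signature:

```python
import re
from pathlib import Path

_FM_SLUG = re.compile(r"^slug:\s*(\S+)\s*$", re.MULTILINE)

def find_article_md(articles_dir: Path, slug: str):
    """Find the .md file whose frontmatter slug matches.

    Scans every file rather than expecting {slug}.md, so the
    {slug}-{lang}.md naming convention still resolves.
    """
    for path in sorted(articles_dir.glob("*.md")):
        m = _FM_SLUG.search(path.read_text(encoding="utf-8"))
        if m and m.group(1) == slug:
            return path
    return None
```

A directory scan per lookup is O(files), but for an article directory of this size that is cheaper than maintaining a filename-to-slug index.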
Deeman
70628ea881 merge(pipeline-transform-tab): split article editor + frontmatter fix + transform tab features
CI: test (push) successful in 50s; tag (push) successful in 2s
2026-03-02 11:34:13 +01:00
Deeman
d619f5e1ef feat(articles): split editor with live preview + fix frontmatter bug
Bug: article_edit GET was passing raw .md file content (including YAML
frontmatter) to the body textarea. Articles synced from disk via
_sync_static_articles() had their frontmatter bled into the editor,
making it look like content was missing or garbled.

Fix: strip frontmatter (using existing _FRONTMATTER_RE) before setting
body, consistent with how _rebuild_article() already does it.
Also switch to _ARTICLES_DIR (absolute) instead of relative path.

New: split editor layout — compact metadata strip at top, dark
monospace textarea on the left, live rendered preview on the right
(HTMX, 500ms debounce). Initial preview server-rendered on page load.
New POST /admin/articles/preview endpoint returns the preview partial.
2026-03-02 11:10:01 +01:00
Deeman
2a7eed1576 merge: test suite compression pass (-197 lines)
CI: test (push) successful in 51s; tag (push) successful in 3s
2026-03-02 10:46:01 +01:00
Deeman
162e633c62 refactor(tests): compress admin_client + mock_send_email into conftest
Lift admin_client fixture from 7 duplicate definitions into conftest.py.
Add mock_send_email fixture, replacing 60 inline patch() blocks across
test_emails.py, test_waitlist.py, and test_businessplan.py. Net -197 lines.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-02 09:40:52 +01:00
Deeman
31017457a6 merge: semantic-compression — add compression helpers, macros, and coding philosophy
CI: test (push) successful in 50s; tag (push) successful in 3s
Applies Casey Muratori's semantic compression across all three packages:
- count_where() helper: 30+ COUNT(*) call sites compressed
- _forward_lead(): deduplicates lead forward routes
- 5 SQLMesh macros for country code patterns (7 models)
- skip_if_current() + write_jsonl_atomic() extract helpers
Net: -118 lines (272 added, 390 removed)

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-02 08:00:15 +01:00
Deeman
f93e4fd0d1 chore(changelog): document semantic compression pass
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-02 07:54:44 +01:00
Deeman
567798ebe1 feat(extract): add skip_if_current() and write_jsonl_atomic() helpers
Task 5/6: Compress repeated patterns in extractors:
- skip_if_current(): cursor check + early-return dict (3 extractors)
- write_jsonl_atomic(): working-file → JSONL → compress (2 extractors)
Applied in gisco, geonames, census_usa, playtomic_tenants.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-02 07:49:18 +01:00
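The working-file pattern named above can be sketched like this (compression step omitted; the signature is an assumption of this sketch): write all rows to a temporary working file, then atomically rename it over the destination so a crash mid-write never leaves readers a half-written JSONL file:

```python
import json
import os
from pathlib import Path

def write_jsonl_atomic(records, dest: Path) -> int:
    """Write records as JSONL via a working file + atomic rename."""
    tmp = dest.with_suffix(dest.suffix + ".working")
    n = 0
    with tmp.open("w", encoding="utf-8") as fh:
        for rec in records:
            fh.write(json.dumps(rec, ensure_ascii=False) + "\n")
            n += 1
    os.replace(tmp, dest)  # atomic on POSIX: readers see old or new, never partial
    return n
```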
Deeman
b32b7cd748 merge: unify confirm dialog — pure hx-confirm + form[method=dialog]
Eliminates confirmAction() entirely. One code path: all confirmations
go through showConfirm() called by the htmx:confirm interceptor.
14 template files converted to hx-boost + hx-confirm pattern.
Pipeline endpoints updated to exclude HX-Boosted requests from the
HTMX partial path.

# Conflicts:
#	web/src/padelnomics/admin/templates/admin/affiliate_form.html
#	web/src/padelnomics/admin/templates/admin/affiliate_program_form.html
#	web/src/padelnomics/admin/templates/admin/base_admin.html
#	web/src/padelnomics/admin/templates/admin/partials/affiliate_program_results.html
#	web/src/padelnomics/admin/templates/admin/partials/affiliate_row.html
2026-03-02 07:48:49 +01:00
Deeman
6774254cb0 feat(sqlmesh): add country code macros, apply across models
Task 4/6: Add 5 macros to compress repeated country code patterns:
- @country_name / @country_slug: 20-country CASE in dim_cities, dim_locations
- @normalize_eurostat_country / @normalize_eurostat_nuts: EL→GR, UK→GB
- @infer_country_from_coords: bounding box for 8 markets
Net: +91 lines in macros, -135 lines in models = -44 lines total.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-02 07:45:52 +01:00
Deeman
e87a7fc9d6 refactor(admin): extract _forward_lead() from duplicate lead forward routes
Task 3/6: lead_forward and lead_forward_htmx shared ~20 lines of
identical DB logic. Extracted into _forward_lead() that returns an
error string or None. Both routes now call the helper and differ
only in response format (redirect vs HTMX partial).

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-02 07:43:50 +01:00
Deeman
3d7a72ba26 refactor: apply count_where() across remaining web blueprints
Task 2/6 continued: Compress 18 more COUNT(*) call sites across
suppliers, directory, dashboard, public, planner, pseo, and pipeline
routes. -24 lines net.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-02 07:40:24 +01:00
Deeman
a55501f2ea feat(core): add count_where() helper, compress admin COUNT queries
Task 2/6: Adds count_where(table_where, params) to core.py that
compresses the fetch_one + null-check COUNT(*) pattern. Applied
across admin/routes.py — dashboard stats shrinks from ~75 to ~25
lines, plus 10 more call sites compressed.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-02 07:35:33 +01:00
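The `count_where(table_where, params)` signature is from the commit; everything else here is a sketch. The real helper in `core.py` presumably closes over the app's connection, so this version passes an explicit `db` handle and uses sqlite3 only to stay runnable:

```python
import sqlite3


def count_where(db, table_where: str, params: tuple = ()) -> int:
    """Compress the fetch_one + null-check COUNT(*) pattern into one call.
    `table_where` is the SQL after 'FROM', e.g. "leads WHERE status = ?"."""
    row = db.execute(f"SELECT COUNT(*) FROM {table_where}", params).fetchone()
    return row[0] if row else 0
```

A call site then shrinks from three lines (query, fetch, null-check) to one: `new_leads = count_where(db, "leads WHERE status = ?", ("new",))`.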
Deeman
d3626193c5 refactor(admin): unify confirm dialog — pure hx-confirm + form[method=dialog]
Eliminate `confirmAction()` and the duplicate `cloneNode` hack entirely.
One code path: everything goes through `showConfirm()` called by the
`htmx:confirm` interceptor.

Dialog HTML:
- `<form method="dialog">` for native close semantics; button `value`
  becomes `dialog.returnValue` — no manual event listener reassignment.

JS:
- `showConfirm(message)` — Promise-based, listens for `close` once.
- `htmx:confirm` handler calls `showConfirm()` and calls `issueRequest`
  if confirmed. Replaces both the old HTMX handler and `confirmAction()`.

Templates (Padelnomics, 14 files):
- All `onclick=confirmAction(...)` and `onclick=confirm()` removed.
- Form-submit buttons: added `hx-boost="true"` to form + `hx-confirm`
  on the submit button.
- Pure HTMX buttons (pipeline_transform, pipeline_overview): `hx-confirm`
  replaces `onclick=if(!confirm(...))return false;`.

Pipeline routes (pipeline_trigger_extract, pipeline_trigger_transform):
- `is_htmx` now excludes `HX-Boosted: true` requests — boosted form
  POSTs get the normal redirect instead of the inline partial.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-02 07:35:32 +01:00
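The `is_htmx` change above reduces to one header check. A sketch, with the predicate factored out over a plain headers mapping rather than the Quart request object (the function name matches the commit; its exact placement in `pipeline_routes.py` is assumed):

```python
def is_htmx(headers: dict[str, str]) -> bool:
    """Treat a request as an HTMX partial only when it carries HX-Request
    AND is not an hx-boost navigation. Boosted form POSTs send
    'HX-Boosted: true', and those should get the normal redirect, not
    the inline partial."""
    return (
        headers.get("HX-Request") == "true"
        and headers.get("HX-Boosted") != "true"
    )
```

Without the `HX-Boosted` exclusion, a boosted form submit would receive the partial fragment as a full-page response, which is exactly the bug this commit avoids.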
Deeman
7ea1f234e8 chore(changelog): document htmx:confirm guard fix
All checks were successful
CI / test (push) Successful in 51s
CI / tag (push) Successful in 2s
Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-01 22:40:07 +01:00
Deeman
c1cf472caf fix(admin): guard htmx:confirm handler against empty question
The handler called evt.preventDefault() unconditionally, so auto-poll
requests (hx-trigger="every 5s", no hx-confirm) caused an empty dialog
to pop up every 5 seconds. Add an early return when evt.detail.question
is falsy so only actual hx-confirm interactions are intercepted.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-01 22:39:38 +01:00
Deeman
f9e22a72dd merge: fix CI — update proxy tests for 2-tier design
All checks were successful
CI / test (push) Successful in 54s
CI / tag (push) Successful in 3s
Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-01 22:36:35 +01:00
Deeman
ce466e3f7f test(proxy): update supervisor tests for 2-tier proxy (no Webshare)
Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-01 22:36:30 +01:00
Deeman
563bd1fb2e merge: tiered-proxy-tenants — gisco extractor, proxy fixes, recheck datetime fix
Some checks failed
CI / test (push) Failing after 46s
CI / tag (push) Has been skipped
- feat: GISCO NUTS-2 extractor module (replaces standalone script)
- feat: wire 5 unscheduled extractors into workflows.toml
- fix: add load_dotenv() to _shared.py so .env proxies are picked up
- fix: recheck datetime parsing (HH:MM:SS slot times need start_date prefix)
- fix: graceful 0-venue early return in recheck
- fix(proxy): remove Webshare free tier — DC tier 1, residential tier 2

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-01 22:12:17 +01:00
Deeman
b980b8f567 fix(proxy): remove Webshare free tier — DC tier 1, residential tier 2
Free Webshare proxies were timing out and exhausting the circuit breaker
before datacenter/residential proxies got a chance to run.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-01 22:12:08 +01:00
Deeman
0733f1c2a1 docs(scratch): rename guide → question bank with full gap analysis
Transforms the raw question bank into an annotated gap analysis document:
- Every section tagged ANSWERED / PARTIAL / GAP
- Summary table of 13 gaps across 3 tiers with impact and feasibility
- Inline actionable notes linking to research files, planner inputs, and backlog

Key findings captured:
- Tier 1 gaps: subsidies/grants, buyer segmentation, indoor-vs-outdoor decision
  framework, OPEX benchmark display
- Tier 2 gaps: booking platform strategy, depreciation/tax shield, legal/regulatory
  checklist (DE), supplier selection framework, staffing plan template
- Tier 3 gaps: zero-court pSEO pages, pre-opening playbook, drive-time isochrones

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-01 21:30:27 +01:00
Deeman
320777d24c update env vars
All checks were successful
CI / test (push) Successful in 50s
CI / tag (push) Successful in 2s
2026-03-01 21:28:45 +01:00
136 changed files with 8669 additions and 1461 deletions

View File

@@ -31,12 +31,18 @@ RESEND_WEBHOOK_SECRET=
#ENC[AES256_GCM,data:1HqXvAspvNIUNpCxJwge3mEsyO0Y/EWvD3vbLxkgGqIex0hABcupX/Nzk15u8iOY5JWvvEuAO414MNt6mFvnWBDpEw==,iv:N7gCzTNJAR/ljx5gGsX+ieZctya8vQbCIb3hw49OhXg=,tag:PJKNyzhrit5VgIXl+cNlbQ==,type:comment]
#ENC[AES256_GCM,data:do6DZ/1Osc5y4xseG8Q8bDX84JBHLzvmVbHiqxP7ChlicmzYBkZ85g43BuM7V0KInFTFgvaC8xmFic+2d37Holuf1ywdAjbLkRhg,iv:qrNmhPbmFDr2ynIF5EdOLZl3FI5f68WDrxuHMkAzuuU=,tag:761gYOlEdNM+e1//1MbCHg==,type:comment]
#ENC[AES256_GCM,data:dseLIQiUEU20xJqoq2dkFho9SnKyoyQ8pStjvfxwnj8v18/ua0TH/PDx/qwIp9z5kEIvbsz5ycJesFfKPhLA5juGcdCbi5zBmZRWYg==,iv:7JUmRnohJt0H5yoJXVD3IauuJkpPHDPyY02OWHWb9Nw=,tag:KcM6JGT01Aa1kTx+U30UKQ==,type:comment]
#ENC[AES256_GCM,data:VXv1O5oRNTws8wbx/nZWH6Q=,iv:M/XwF6Zef+xlJ/8AAVI1zSmsEUNYL+0twzxXwkf8moY=,tag:y3Nu5akuiKtEIMeZhSNIkw==,type:comment]
PAYMENT_PROVIDER=ENC[AES256_GCM,data:7uxz3xmr,iv:4uEOA7ZjehD1bF91Gxl0+OxnvlZW3QIq22MhnYM43uE=,tag:XvHqyRM+ugnWTUN9GFJ3fQ==,type:str]
#ENC[AES256_GCM,data:GgXo4zkhJsxXEk8F5a/+wdbvBUGN00MUAutZYLDEqqN4T1rZu92fioOLx7MEoC0b8i61,iv:f1hUBoZpmnzXNcikf/anVNdRSHNwVmmjdIcba3eiRI4=,tag:uWpF40uuiXyWqKrYGyLVng==,type:comment]
PADDLE_API_KEY=
PADDLE_CLIENT_TOKEN=
PADDLE_WEBHOOK_SECRET=
PADDLE_NOTIFICATION_SETTING_ID=
PADDLE_ENVIRONMENT=ENC[AES256_GCM,data:KIGNxEaodA==,iv:SRebaYRpVJR0LpfalBZJLTE8qBGwWZB/Fx3IokQF99Q=,tag:lcC56e4FjVkCiyaq41vxcQ==,type:str]
#ENC[AES256_GCM,data:sk79dbsswA==,iv:J8CyJt/WOMLd7CZNutDwIOtAOAooaMsLPO35gfWo+Nc=,tag:JQcGMYdgcQgtIWKcqXZkNQ==,type:comment]
STRIPE_API_PUBLIC_KEY=ENC[AES256_GCM,data:WhWvIzNd1sS+IrrEdE+FJI6ZgEiNlgG3oxC8VoDzXf0z1oH1wgY6m9wUq6UEZZyzeiRGAeAylOk6wHJ+Lx4+zx2cfv+yweX7I3Sq5VN2D1OBPiQ3Kde4zm5cXqA92jRkLAomZxw/DkeiB14=,iv:Rb3GSLMVSySR++X240MICsXbVtOuqZNjm+nIe+s65dU=,tag:z82dyRzmxF3e87Sm2F+4Qw==,type:str]
STRIPE_API_PRIVATE_KEY=ENC[AES256_GCM,data:/62y1Iv2Op21eEvT3BosgWD0S3YqGMgdfb2Edjhq2cuh32B3eH5fh9FaqBc3CvJpM7R79hy9jTnV3CTjlCkvrXGCLDnFY2a6kvSz5f+v2d/lsr8zvFLs6OP+bhssHdVygfIwz9ye46tfcFk=,iv:iw0NAYUf/gCM4awb2tKBEKuo/j7kkpVP6JjIIdVy7O8=,tag:GO3ASp5bykwHDHNkCYsdiA==,type:str]
STRIPE_ACCOUNT_ID=ENC[AES256_GCM,data:ahJsOgZLRi5n9P7Dy0U1rvmhwr/B,iv:aoVA3M8Faqv1kZwTtagD0WLVipkA5nkX5uSjtHl14+I=,tag:XwLOu9ZiHUizcsnk73bt1w==,type:str]
#ENC[AES256_GCM,data:2Hs7ds2ppeRqKB7EiAAbWqlainKdZ+eTYZSvPloirT4Hlsuf+zTwtJTA6RzHNCuK4em//jhOx8R2k80I,iv:1N6CNPqYWp3z8lm5e2Vp6OlpgHdMOiD7dsEYp23nMtA=,tag:ulWP/BFFoLljLMVCrsgizw==,type:comment]
UMAMI_API_URL=ENC[AES256_GCM,data:oX/m95YB+S2ziUKoxDhsDzMhGZfxppw+w603tQ==,iv:GAj7ccF6seiCfLAh2XIjUi13RpgNA3GONMtINcG+KMw=,tag:mUfRlvaEWrw2QWFydtnbNA==,type:str]
UMAMI_API_TOKEN=
@@ -73,7 +79,7 @@ GEONAMES_USERNAME=ENC[AES256_GCM,data:aSkVdLNrhiF6tlg=,iv:eemFGwDIv3EG/P3lVHGZj9
CENSUS_API_KEY=ENC[AES256_GCM,data:qqG971573aGq9MiHI2xLlanKKFwjfcNNoMXtm8LNbyh0rMbQN2XukQ==,iv:az2i0ldH75nHGah4DeOxaXmDbVYqmC1c77ptZqFA9BI=,tag:zoDdKj9bR7fgIDo1/dEU2g==,type:str]
sops_age__list_0__map_enc=-----BEGIN AGE ENCRYPTED FILE-----\nYWdlLWVuY3J5cHRpb24ub3JnL3YxCi0+IFgyNTUxOSBxNWNmUzVNUGdWRnE0ZFpF\nM0JQZWZ3UDdEVzlwTmIxakxOZXBkT2x2ZlNrClRtV2M3S2daSGxUZmFDSWQ2Nmh4\neU51QndFcUxlSE00RFovOVJTcDZmUUUKLS0tIDcvL3hRMDRoMWZZSXljNzA3WG5o\nMWFic21MV0krMzlIaldBTVU0ZDdlTE0K7euGQtA+9lHNws+x7TMCArZamm9att96\nL8cXoUDWe5fNI5+M1bXReqVfNwPTwZsV6j/+ZtYKybklIzWz02Ex4A==\n-----END AGE ENCRYPTED FILE-----\n
sops_age__list_0__map_recipient=age1f5002gj4s78jju45jd28kuejtcfhn5cdujz885fl7z2p9ym68pnsgky87a
sops_lastmodified=2026-03-01T13:34:16Z
sops_mac=ENC[AES256_GCM,data:JLfGLbNTEcI6M/sUA5Zez6cfEUObgnUBmX52560PzBmeLZt0F5Y5QpeojIBqEDMuNB0hp1nnPI59WClLJtQ12VlHo9TkL3x9uCNUG+KneQrn1bTmJpA3cwNkWTzIm4l+TGbJbd4FpKJ9H0v1w+sqoKOgG8DqbtOeVdUfsVspAso=,iv:UqYxooXkEtx+y7fYzl+GFncpkjz8dcP7o9fp+kFf6w4=,tag:/maSb1aZGo+Ia8eGpB7PYw==,type:str]
sops_lastmodified=2026-03-03T15:16:35Z
sops_mac=ENC[AES256_GCM,data:T0qph3KPd68Lo4hxd6ECP+wv87uwRFsAFZwnVyf/MXvuG7raraUW02RLox0xklVcKBJXk+9jM7ycQ1nuk95UIuu7uRU88g11RaAm67XaOsafgwDMrC17AjIlg0Vf0w64WAJBrQLaXhJlh/Gz45bXlz82F+XVnTW8fGCpHRZooMY=,iv:cDgMZX6FRVe9JqQXLN6OhO06Ysfg2AKP2hG0B/GeajU=,tag:vHavf9Hw2xqJrqM3vVUTjA==,type:str]
sops_unencrypted_suffix=_unencrypted
sops_version=3.12.1

View File

@@ -3,6 +3,7 @@ APP_NAME=ENC[AES256_GCM,data:ldJf4P0iD9ziMVg=,iv:hiVl2whhd02yZCafzBfbxX5/EU/suvz
SECRET_KEY=ENC[AES256_GCM,data:hmlXm7NKVVFmeea4DnlrH/oSnsoaMAkUz42oWwFXOXL1XwAh3iemIKHUQOV2G4SPlmjfmEVQD64xbxaJW0OcPQ/8KqhrRYDsy0F/u0h7nmNQdwJrcvzcmbvjgcwU5IITPIr23d/W5PeSJzxhB93uaJ0+zFN2CyHfeewrJKafPfw=,iv:e+ZSLUO+dlt+ET8r/0/pf74UtGIBMkaVoJMWlJn1W5U=,tag:LdDCCrHcJnKLkKL/cY/R/Q==,type:str]
BASE_URL=ENC[AES256_GCM,data:50k/RqlZ1EHqGM4UkSmTaCsuJgyU4w==,iv:f8zKr2jkts4RsawA97hzICHwj9Quzgp+Dw8AhQ7GSWA=,tag:9KhNvwmoOtDyuIql7okeew==,type:str]
DEBUG=ENC[AES256_GCM,data:O0/uRF4=,iv:cZ+vyUuXjQOYYRf4l8lWS3JIWqL/w3pnlCTDPAZpB1E=,tag:OmJE9oJpzYzth0xwaMqADQ==,type:str]
LANDING_DIR=ENC[AES256_GCM,data:rn8u+tGob0vU7kSAtxmrpYQlneesvyO10A==,iv:PuGtdcQBdRbnybulzd6L7JVQClcK3/QjMeYFXZSxGW0=,tag:K2PJPMCWXdqTlQpwP9+DOQ==,type:str]
#ENC[AES256_GCM,data:xmJc6WTb3yumHzvLeA==,iv:9jKuYaDgm4zR/DTswIMwsajV0s5UTe+AOX4Sue0GPCs=,tag:b/7H9js1HmFYjuQE4zJz8w==,type:comment]
ADMIN_EMAILS=ENC[AES256_GCM,data:R/2YTk8KDEpNQ71RN8Fm6miLZvXNJQ==,iv:kzmiaBK7KvnSjR5gx6lp7zEMzs5xRul6LBhmLf48bCU=,tag:csVZ0W1TxBAoJacQurW9VQ==,type:str]
#ENC[AES256_GCM,data:S7Pdg9tcom3N,iv:OjmYk3pqbZHKPS1Y06w1y8BE7CU0y6Vx2wnio9tEhus=,tag:YAOGbrHQ+UOcdSQFWdiCDA==,type:comment]
@@ -42,8 +43,8 @@ SUPERVISOR_GIT_PULL=ENC[AES256_GCM,data:mg==,iv:KgqMVYj12FjOzWxtA1T0r0pqCDJ6MtHz
PROXY_URLS_RESIDENTIAL=ENC[AES256_GCM,data:vxRcXQ/8TUTCtr6hKWBD1zVF47GFSfluIHZ8q0tt8SqQOWDdDe2D7Of6boy/kG3lqlpl7TjqMGJ7fLORcr0klKCykQ==,iv:YjegXXtIXm2qr0a3ZHRHxj3L1JoGZ1iQXkVXQupGQ2E=,tag:kahoHRskXbzplZasWOeiig==,type:str]
PROXY_URLS_DATACENTER=ENC[AES256_GCM,data:23TgU6oUeO7J+MFkraALQ5/RO38DZ3ib5oYYJr7Lj3KXQSlRsgwA+bJlweI5gcUpFphnPXvmwFGiuL6AeY8LzAQ3bx46dcZa5w9LfKw2PMFt,iv:AGXwYLqWjT5VmU02qqada3PbdjfC0mLK2sPruO0uru8=,tag:Z2IS/JPOqWX+x0LZYwyArA==,type:str]
WEBSHARE_DOWNLOAD_URL=ENC[AES256_GCM,data:/N77CFf6tJWCk7HrnBOm2Q1ynx7XoblzfbzJySeCjrxqiu4r+CB90aDkaPahlQKI00DUZih3pcy7WhnjdAwI30G5kJZ3P8H8/R0tP7OBK1wPVbsJq8prQJPFOAWewsS4KWNtSURZPYSCxslcBb7DHLX6ZAjv6A5KFOjRK2N8usR9sIabrCWh,iv:G3Ropu/JGytZK/zKsNGFjjSu3Wt6fvHaAqI9RpUHvlI=,tag:fv6xuS94OR+4xfiyKrYELA==,type:str]
PROXY_CONCURRENCY=ENC[AES256_GCM,data:vdEZ,iv:+eTNQO+s/SsVDBLg1/+fneMzEEsFkuEFxo/FcVV+mWc=,tag:i/EPwi/jOoWl3xW8H0XMdw==,type:str]
RECHECK_WINDOW_MINUTES=ENC[AES256_GCM,data:L2s=,iv:fV3mCKmK5fxUmIWRePELBDAPTb8JZqasVIhnAl55kYw=,tag:XL+PO6sblz/7WqHC3dtk1w==,type:str]
PROXY_CONCURRENCY=ENC[AES256_GCM,data:WWpx,iv:4RdNHXPXxFS5Yf1qa1NbaZgXydhKiiiEiMhkhQxD3xE=,tag:6UOQmBqj+9WlcxFooiTL+A==,type:str]
RECHECK_WINDOW_MINUTES=ENC[AES256_GCM,data:9wQ=,iv:QS4VfelUDdaDbIUC8SJBuy09VpiWM9QQcYliQ7Uai+I=,tag:jwkJY95qXPPrgae8RhKPSg==,type:str]
#ENC[AES256_GCM,data:RC+t2vqLwLjapdAUql8rQls=,iv:Kkiz3ND0g0MRAgcPJysIYMzSQS96Rq+3YP5yO7yWfIY=,tag:Y6TbZd81ihIwn+U515qd1g==,type:comment]
GSC_SERVICE_ACCOUNT_PATH=ENC[AES256_GCM,data:Vki6yHk+gd4n,iv:rxzKvwrGnAkLcpS41EZ097E87NrIpNZGFfl4iXFvr40=,tag:EZkBJpCq5rSpKYVC4H3JHQ==,type:str]
GSC_SITE_URL=ENC[AES256_GCM,data:K0i1xRym+laMP6kgOMEfUyoAn2eNgQ==,iv:kyb+grzFq1e5CG/0NJRO3LkSXexOuCK07uJYApAdWsA=,tag:faljHqYjGTgrR/Zbh27/Yw==,type:str]
@@ -63,7 +64,7 @@ sops_age__list_1__map_enc=-----BEGIN AGE ENCRYPTED FILE-----\nYWdlLWVuY3J5cHRpb2
sops_age__list_1__map_recipient=age1wjepykv3glvsrtegu25tevg7vyn3ngpl607u3yjc9ucay04s045s796msw
sops_age__list_2__map_enc=-----BEGIN AGE ENCRYPTED FILE-----\nYWdlLWVuY3J5cHRpb24ub3JnL3YxCi0+IFgyNTUxOSBFeHhaOURNZnRVMEwxNThu\nUjF4Q0kwUXhTUE1QSzZJbmpubnh3RnpQTmdvCjRmWWxpNkxFUmVGb3NRbnlydW5O\nWEg3ZXJQTU4vcndzS2pUQXY3Q0ttYjAKLS0tIE9IRFJ1c2ZxbGVHa2xTL0swbGN1\nTzgwMThPUDRFTWhuZHJjZUYxOTZrU00KY62qrNBCUQYxwcLMXFEnLkwncxq3BPJB\nKm4NzeHBU87XmPWVrgrKuf+PH1mxJlBsl7Hev8xBTy7l6feiZjLIvQ==\n-----END AGE ENCRYPTED FILE-----\n
sops_age__list_2__map_recipient=age1c783ym2q5x9tv7py5d28uc4k44aguudjn03g97l9nzs00dd9tsrqum8h4d
sops_lastmodified=2026-03-01T17:40:31Z
sops_mac=ENC[AES256_GCM,data:xiTAz5BSk9F7GqQHcy0UpU7jCS2wHbfi27hOvpdoxAKtGLxaZ5PISQHVWEStWjHS+8g+3ACrTj/UQfUuCTr/55UVU0Wu6hyAWnuZ3DuaMfYUNer+9XZm5V2jTibQIYH01ZWyt4aeqs/Njn39FMx33s4hRdYVjfN391wgkx2+Hsg=,iv:UbgoSuVPu9H7Gu+HwZ6m60KgfGxZwKITMrkT54nd1yY=,tag:pM0hoz6XDQk6HaSJBkOR1Q==,type:str]
sops_lastmodified=2026-03-05T15:55:19Z
sops_mac=ENC[AES256_GCM,data:orLypjurBTYmk3um0bDQV3wFxj1pjCsjOf2D+AZyoIYY88MeY8BjK8mg8BWhmJYlGWqHH1FCpoJS+2SECv2Bvgejqvx/C/HSysA8et5CArM/p/MBbcupLAKOD8bTXorKMRDYPkWpK/snkPToxIZZd7dNj/zSU+OhRp5qLGCHkvM=,iv:eBn93z4DSk8UPHgP/Jf/Kz+3KwoKIQ9Et72pbLFcLP8=,tag:79kzPIKp0rtHGhH1CkXqwg==,type:str]
sops_unencrypted_suffix=_unencrypted
sops_version=3.12.1

View File

@@ -6,6 +6,93 @@ The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.1.0/).
## [Unreleased]
### Fixed
- **Pipeline diagnostic script** (`scripts/check_pipeline.py`) — handle DuckDB catalog naming quirk where `lakehouse.duckdb` uses catalog `lakehouse` instead of `local`, causing SQLMesh logical views to break. Script now auto-detects the catalog via `USE`, and falls back to querying physical tables (`sqlmesh__<schema>.<table>__<hash>`) when views fail.
- **Eurostat gas prices extractor** — `nrg_pc_203` filter missing `unit` dimension (API returns both KWH and GJ_GCV); now filters to `KWH`.
- **Eurostat labour costs extractor** — `lc_lci_lev` used non-existent `currency` filter dimension; corrected to `unit: EUR`.
- **Supervisor transform step** — changed `sqlmesh run` to `sqlmesh plan prod --auto-apply` so new/modified models are detected and applied automatically.
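
The catalog quirk and view fallback above can be sketched as two small functions. Both names are hypothetical (the real logic lives inline in `scripts/check_pipeline.py`); the key facts — DuckDB names the attached catalog after the file stem, and SQLMesh physical tables live at `sqlmesh__<schema>.<table>__<hash>` — are from the changelog entry:

```python
from pathlib import Path


def detect_catalog(db_path: str, known_catalogs: list[str]) -> str:
    """lakehouse.duckdb attaches as catalog 'lakehouse', not 'local'.
    Prefer the file stem when it matches an attached catalog."""
    stem = Path(db_path).stem
    return stem if stem in known_catalogs else known_catalogs[0]


def query_with_fallback(run_sql, schema: str, table: str, physical_name: str):
    """Try the SQLMesh logical view first; when the catalog mismatch
    breaks it, fall back to the physical table."""
    try:
        return run_sql(f"SELECT COUNT(*) FROM {schema}.{table}")
    except Exception:
        return run_sql(f"SELECT COUNT(*) FROM sqlmesh__{schema}.{physical_name}")
```

`run_sql` stands in for a DuckDB cursor call so the sketch stays dependency-free; in the real script the first query is issued after a `USE <catalog>`.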
### Added
- **Pipeline diagnostic script** (`scripts/check_pipeline.py`) — read-only script that reports row counts at every layer of the pricing pipeline (staging → foundation → serving), date range analysis, HAVING filter impact, and join coverage. Run on prod to diagnose empty serving tables.
- **Extraction card descriptions** — each workflow card on the admin pipeline page now shows a one-line description explaining what the data source is (e.g. "EU geographic boundaries (NUTS2 polygons) from Eurostat GISCO"). Descriptions defined in `workflows.toml`.
- **Running state indicator** — extraction cards show a spinner + "Running" label with a blue-tinted border when an extraction is actively running, replacing the plain Run button. Cards also display the start time with "running..." text.
- **Interactive Leaflet maps** — geographic visualization across 4 key placements using self-hosted Leaflet 1.9.4 (GDPR-safe, no CDN):
- **Markets hub** (`/markets`): country bubble map with circles sized by total venues, colored by avg market score (green ≥ 60, amber 30-60, red < 30). Click navigates to country overview.
- **Country overview articles**: city bubble map loads after article render, auto-fits bounds, click navigates to city page. Bubbles colored by market score.
- **City cost articles**: venue dot map centered on city lat/lon (zoom 13), navy dots per venue with tooltip showing name + court breakdown (indoor/outdoor).
- **Opportunity map** (`/<lang>/opportunity-map`): standalone full-width page with country selector. Circles sized by population, colored by opportunity score (green ≥ 70, amber 40-70, blue < 40). Existing venues shown as gray reference dots.
- New `/api` blueprint with 4 JSON endpoints (`/api/markets/countries.json`, `/api/markets/<country>/cities.json`, `/api/markets/<country>/<city>/venues.json`, `/api/opportunity/<country>.json`) — 1-hour public cache headers, all queries against `analytics.duckdb` via `fetch_analytics`.
- New SQLMesh serving model `city_venue_locations` exposing venue lat/lon + court counts per city.
- `pseo_city_costs_de` serving model: added `lat`/`lon` columns for city map data attributes in baked articles.
- Leaflet CSS included on all article pages (5KB, cached). JS loaded dynamically only when a map container is present.
- **Individualised article financial calculations with real per-country cost data** — ~30 CAPEX/OPEX calculator fields now scale to each country's actual cost level via Eurostat data, eliminating the identical DE-hardcoded numbers shown for every city globally.
- **New Eurostat datasets extracted** (8 new landing files): electricity prices (`nrg_pc_205`), gas prices (`nrg_pc_203`), labour costs (`lc_lci_lev`), and 5 price level index categories from `prc_ppp_ind` (construction, housing, services, misc, government).
- `extract/padelnomics_extract/src/padelnomics_extract/eurostat.py`: added 8 dataset entries; added `dataset_code` field support so multiple dict entries can share one Eurostat API endpoint (needed for 5 prc_ppp_ind variants).
- **4 new staging models**: `stg_electricity_prices`, `stg_gas_prices`, `stg_labour_costs`, `stg_price_levels` — all read from landing zone with ISO code normalisation (EL→GR, UK→GB).
- **New `foundation.dim_countries`** — conformed country dimension (grain: `country_code`). Consolidates country names/slugs and income data previously duplicated in `dim_cities` and `dim_locations` as ~50-line CASE blocks. Computes ~29 calculator cost override columns from Eurostat PLI indices and energy prices relative to DE baseline.
- **Refactored `dim_cities`** — removed ~50-line CASE blocks and `country_income` CTE; JOIN `dim_countries` for `country_name_en`, `country_slug`, `median_income_pps`, `income_year`.
- **Refactored `dim_locations`** — same refactor as `dim_cities`; the income cascade is unchanged: EU NUTS-2 → US state → `dim_countries` country-level.
- **Updated `serving.pseo_city_costs_de`** — JOIN `dim_countries`; 29 new camelCase override columns (`electricity`, `heating`, `rentSqm`, `hallCostSqm`, …, `permitsCompliance`) auto-applied by calculator.
- **Updated `serving.planner_defaults`** — JOIN `dim_countries`; same 29 cost columns flow through to the planner API `/api/market-data` endpoint.
- **Bulk actions for articles and leads** — checkbox selection + floating action bar on admin articles and leads pages (same pattern as suppliers). Articles: publish, unpublish, toggle noindex, rebuild, delete. Leads: set status, set heat. Re-renders results via HTMX after each action.
- **Stripe payment provider** — second payment provider alongside Paddle, switchable via `PAYMENT_PROVIDER=stripe` env var. Existing Paddle subscribers keep working regardless of toggle — both webhook endpoints stay active.
- `billing/stripe.py`: full Stripe implementation (Checkout Sessions, Billing Portal, subscription cancel, webhook verification + parsing)
- `billing/paddle.py`: extracted Paddle-specific logic from routes.py into its own module
- `billing/routes.py`: provider-agnostic dispatch layer — checkout, manage, cancel routes call `_provider().xxx()`
- `_payment_js.html`: dual-path JS — conditionally loads Paddle.js SDK, universal `startCheckout()` handles both overlay (Paddle) and redirect (Stripe)
- `scripts/setup_stripe.py`: mirrors `setup_paddle.py` — creates 17 products + prices in Stripe, registers webhook endpoint
- Migration 0028: `payment_products` table generalizing `paddle_products` with `provider` column; existing Paddle rows copied
- `get_price_id()` / `get_all_price_ids()` replace `get_paddle_price()` for provider-agnostic lookups
- Stripe config vars: `STRIPE_SECRET_KEY`, `STRIPE_PUBLISHABLE_KEY`, `STRIPE_WEBHOOK_SECRET`
- Dashboard boost buttons converted from inline `Paddle.Checkout.open()` to server round-trip via `/billing/checkout/item` endpoint
- Stripe Tax add-on handles EU VAT (must be enabled in Stripe Dashboard)
### Fixed
- **City slug transliteration** — replaced broken inline `REGEXP_REPLACE(LOWER(...), '[^a-z0-9]+', '-')` with new `@slugify` SQLMesh macro that uses `STRIP_ACCENTS` + `ß→ss` pre-replacement. Fixes: `Düsseldorf` → `dusseldorf` (was `d-sseldorf`), `Überlingen` → `uberlingen` (was `-berlingen`). Applied to `dim_venues`, `dim_cities`, `dim_locations`. Python `slugify()` in `core.py` updated to match.
- **B2B article market links** — added missing language prefix (`/markets/germany` → `/de/markets/germany` and `/en/markets/germany`). Without the prefix, Quart interpreted `markets` as a language code → 500 error.
- **Country overview top-5 city list** — changed ranking from raw `market_score DESC` (which inflated tiny towns with high density scores) to `padel_venue_count DESC` for top cities and `population DESC` for top opportunity cities. Germany now shows Berlin, Hamburg, München instead of Überlingen, Schwaigern.
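
The Python side of the slug fix can be sketched directly from the rules stated above (ß→ss pre-replacement, then accent stripping, then hyphen-collapse); the exact body of `slugify()` in `core.py` may differ:

```python
import re
import unicodedata


def slugify(name: str) -> str:
    """Mirror the @slugify SQLMesh macro: replace ß with ss FIRST
    (accent stripping would otherwise drop it entirely), strip accents
    via NFKD decomposition, then collapse non-alphanumerics to hyphens."""
    s = name.replace("ß", "ss")
    s = unicodedata.normalize("NFKD", s)
    s = s.encode("ascii", "ignore").decode("ascii")
    s = re.sub(r"[^a-z0-9]+", "-", s.lower())
    return s.strip("-")
```

The order matters: running the regex on the raw string is what produced `d-sseldorf`, because `ü` is not in `[a-z0-9]` and got replaced by a hyphen before any transliteration happened.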
### Changed
- **CRO overhaul — homepage and supplier landing pages** — rewrote all copy from feature-focused ("60+ variables", "6 analysis tabs") to outcome-focused JTBD framing ("Invest in Padel with Confidence, Not Guesswork"). Based on JTBD analysis: the visitor's job is confidence committing €200K+, not "plan faster."
- **Homepage hero**: new headline, description, and trust-building bullets (bank-ready metrics, real market data, free/no-signup)
- **Proof strip**: live stats bar below hero (business plans created, suppliers, countries, project volume)
- **"Sound familiar?" section**: replaces the 5-step journey timeline (3 items said "SOON") with 4 struggling-moment cards from JTBD research
- **Feature cards reframed as outcomes**: "60+ Variables" → "Know Your Numbers Inside Out", "6 Analysis Tabs" → "Bank-Ready from Day One", "Sensitivity Analysis" → "Stress-Test Before You Commit", etc.
- **"Why Padelnomics" comparison**: 3-column section (DIY Spreadsheet vs. Hired Consultant vs. Padelnomics) from JTBD Competitive Job Map
- **FAQ rewritten**: customer-first questions ("How much does it cost to open a padel facility?", "Will a bank accept this?") replace product-internal questions
- **Final CTA**: "Your Bank Meeting Is Coming. Be Ready." replaces generic "Start Planning Today"
- **Supplier page**: "Is this your sales team?" struggling-moments section, conditional stats display (hides zeros), data-backed proof points replacing anonymous testimonials, ROI math moved above pricing, tier-specific CTAs
- **Meta/SEO**: updated page title and description for search intent
- All changes in both EN and DE (native-quality German, generisches Maskulinum)
### Fixed
- **B2B article CTAs rewritten — all 12 now link to `/quote`** — zero articles previously linked to the quote lead-capture form. Each article's final section has been updated:
- `padel-halle-bauen-de` / `padel-hall-build-guide-en`: replaced broken "directory" section (no link) with a contextual light-blue quote CTA block
- `padel-halle-kosten-de` / `padel-hall-cost-guide-en`: planner mention linked to `/de/planner` / `/en/planner`; quote CTA block appended
- `padel-halle-risiken-de` / `padel-hall-investment-risks-en`: planner sensitivity-tab mention linked; quote CTA block appended
- `padel-halle-finanzierung-de` / `padel-hall-financing-germany-en`: quote CTA block appended after scenario card embed
- `padel-standort-analyse-de` / `padel-hall-location-guide-en`: fixed broken `[→ Standortanalyse starten]` / `[→ Run a location analysis]` placeholders (no href) to `/de/planner` / `/en/planner`; quote CTA block appended
- `padel-business-plan-bank-de` / `padel-business-plan-bank-requirements-en`: fixed broken `[→ Businessplan erstellen]` / `[→ Generate your business plan]` placeholders to `/de/planner` / `/en/planner`; quote CTA block appended
- CTA copy is contextual per article (not identical boilerplate); uses the light-blue banner pattern (`.btn` class, `#EFF6FF` background) consistent with other generated articles
- **Article editor preview now renders HTML correctly** — replaced the raw `{{ body_html }}` div (which Jinja2 auto-escaped to literal `<h1>...</h1>` text) with a sandboxed `<iframe srcdoc="...">` pattern. The route builds a full `preview_doc` HTML document embedding the public site stylesheet (`/static/css/output.css`) and wraps content in `<div class="article-body">`, so the preview is pixel-perfect against the live article. The `article_preview` POST endpoint uses the same pattern for HTMX live updates. Removed ~65 lines of redundant `.preview-body` custom CSS from the editor template.
### Changed
- **Semantic compression pass** — applied Casey Muratori's compression workflow (write concrete → observe patterns → compress genuine repetitions) across all three packages. Net result: ~200 lines removed, codebase simpler.
- **`count_where()` helper** (`web/core.py`): compresses the `fetch_one("SELECT COUNT(*) ...") + null-check` pattern. Applied across 30+ call sites in admin, suppliers, directory, dashboard, public, and planner routes. Dashboard stats function shrinks from 75 to 25 lines.
- **`_forward_lead()` helper** (`web/admin/routes.py`): extracts shared DB logic from `lead_forward` and `lead_forward_htmx` — both routes now call the helper and differ only in response format.
- **SQLMesh macros** (`transform/macros/__init__.py`): 5 new macros compress repeated country code patterns across 7 SQL models: `@country_name`, `@country_slug`, `@normalize_eurostat_country`, `@normalize_eurostat_nuts`, `@infer_country_from_coords`.
- **Extract helpers** (`extract/utils.py`): `skip_if_current()` compresses cursor-check + early-return pattern (3 extractors); `write_jsonl_atomic()` compresses working-file → JSONL → compress pattern (2 extractors).
- **Coding philosophy updated** (`~/.claude/coding_philosophy.md`): added `<compression>` section documenting the workflow, the test ("Did this abstraction make the total codebase smaller?"), and distinction from premature DRY.
- **Test suite compression pass** — applied same compression workflow to `web/tests/` (30 files, 13,949 lines). Net result: -197 lines across 11 files.
- **`admin_client` fixture** lifted from 7 duplicate definitions into `conftest.py`.
- **`mock_send_email` fixture** added to `conftest.py`, replacing 60 inline `with patch("padelnomics.worker.send_email", ...)` blocks across `test_emails.py` (51), `test_waitlist.py` (4), `test_businessplan.py` (2). Each refactored test drops one indentation level.
### Fixed
- **Admin: empty confirm dialog on auto-poll** — `htmx:confirm` handler now guards with `if (!evt.detail.question) return` so auto-poll requests (`hx-trigger="every 5s"`, no `hx-confirm` attribute) no longer trigger an empty dialog every 5 seconds.
### Changed
- **Admin: styled confirm dialog for all destructive actions** — replaced all native `window.confirm()` calls with the existing `#confirm-dialog` styled `<dialog>`. A new global `htmx:confirm` handler intercepts HTMX confirmation prompts and shows the dialog; form-submit buttons on affiliate pages were updated to use `confirmAction()`. Affected: pipeline Transform tab (Run Transform, Run Export, Run Full Pipeline), pipeline Overview tab (Run extractor), affiliate product delete, affiliate program delete (both form and list variants).
- **Pipeline tabs: no scrollbar** — added `scrollbar-width: none` and `::-webkit-scrollbar { display: none }` to `.pipeline-tabs` to suppress the spurious horizontal scrollbar on narrow viewports.
@@ -16,6 +103,12 @@ The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.1.0/).
- **Proxy URL scheme validation in `load_proxy_tiers()`** — URLs in `PROXY_URLS_DATACENTER` / `PROXY_URLS_RESIDENTIAL` that are missing an `http://` or `https://` scheme are now logged as a warning and skipped, rather than being passed through and causing SSL handshake failures or connection errors at request time. Also fixed a missing `http://` prefix in the dev `.env` `PROXY_URLS_DATACENTER` entry.
### Changed
- **Unified confirm dialog — pure HTMX `hx-confirm` + `<form method="dialog">`** — eliminated the `confirmAction()` JS function and the duplicate `cloneNode` hack. All confirmation prompts now go through a single `showConfirm()` Promise-based function called by the `htmx:confirm` interceptor. The dialog HTML uses `<form method="dialog">` for native close semantics (`returnValue` is `"ok"` or `"cancel"`), removing the need to clone and replace buttons on every invocation. All 12 Padelnomics call sites converted from `onclick=confirmAction(...)` to `hx-boost="true"` + `hx-confirm="..."` on the submit button. Pipeline trigger endpoints updated to treat `HX-Boosted: true` requests as non-HTMX (returning a redirect rather than an inline partial) so boosted form submissions flow through the normal redirect cycle. Same changes applied to BeanFlows and the quart-saas-boilerplate template.
- `web/src/padelnomics/admin/templates/admin/base_admin.html`: replaced dialog `<div>` with `<form method="dialog">`, replaced `confirmAction()` + inline `htmx:confirm` handler with unified `showConfirm()` + single `htmx:confirm` listener
- `web/src/padelnomics/admin/pipeline_routes.py`: `pipeline_trigger_extract` and `pipeline_trigger_transform` now exclude `HX-Boosted: true` from the HTMX partial path
- 12 templates updated: `pipeline.html`, `partials/pipeline_extractions.html`, `affiliate_form.html`, `affiliate_program_form.html`, `partials/affiliate_program_results.html`, `partials/affiliate_row.html`, `generate_form.html`, `articles.html`, `audience_contacts.html`, `template_detail.html`, `partials/scenario_results.html`
- Same changes mirrored to BeanFlows and quart-saas-boilerplate template
- **Per-proxy dead tracking in tiered cycler** — `make_tiered_cycler` now accepts a `proxy_failure_limit` parameter (default 3). Individual proxies that hit the limit are marked dead and permanently skipped by `next_proxy()`. If all proxies in the active tier are dead, `next_proxy()` auto-escalates to the next tier without needing the tier-level threshold. `record_failure(proxy_url)` and `record_success(proxy_url)` accept an optional `proxy_url` argument for per-proxy tracking; callers without `proxy_url` are fully backward-compatible. New `dead_proxy_count()` callable exposed for monitoring.
- `extract/padelnomics_extract/src/padelnomics_extract/proxy.py`: added per-proxy state (`proxy_failure_counts`, `dead_proxies`), updated `next_proxy`/`record_failure`/`record_success`, added `dead_proxy_count`
- `extract/padelnomics_extract/src/padelnomics_extract/playtomic_tenants.py`: `_fetch_page_via_cycler` passes `proxy_url` to `record_success`/`record_failure`

View File

@@ -25,6 +25,7 @@ WORKDIR /app
RUN mkdir -p /app/data && chown -R appuser:appuser /app
COPY --from=build --chown=appuser:appuser /app .
COPY --from=css-build /app/web/src/padelnomics/static/css/output.css ./web/src/padelnomics/static/css/output.css
COPY --chown=appuser:appuser infra/supervisor/workflows.toml ./infra/supervisor/workflows.toml
USER appuser
ENV PYTHONUNBUFFERED=1
ENV DATABASE_PATH=/app/data/app.db


@@ -60,6 +60,7 @@
- [x] Boost purchases (logo, highlight, verified, card color, sticky week/month)
- [x] Credit pack purchases (25/50/100/250)
- [x] Supplier subscription tiers (Basic free / Growth €199 / Pro €499, monthly + annual)
- [x] **Stripe payment provider** — env-var toggle (`PAYMENT_PROVIDER=paddle|stripe`), Stripe Checkout Sessions + Billing Portal + webhook handling, `payment_products` table generalizes `paddle_products`, dual-path JS templates, `billing/paddle.py` + `billing/stripe.py` dispatch pattern, `setup_stripe.py` product creation script
- [x] **Feature flags** (DB-backed, migration 0019) — `is_flag_enabled()` + `feature_gate()` decorator replace `WAITLIST_MODE`; 5 flags (markets, payments, planner_export, supplier_signup, lead_unlock); admin UI at `/admin/flags` with toggle
- [x] **Pricing overhaul** — Basic free (no Paddle sub), card color €59, BP PDF €149; supplier page restructured value-first (why → guarantee → leads → social proof → pricing); all CTAs "Get Started Free"; static ROI line; credits-only callout
- [x] **Lead-Back Guarantee** (migration 0020) — 1-click credit refund for non-responding leads (3–30 day window); `refund_lead_guarantee()` in credits.py; "Lead didn't respond" button on unlocked lead cards
@@ -157,6 +158,7 @@
- [x] Padel racket SVG logo/favicon
- [x] Feedback widget (HTMX POST, rate-limited)
- [x] Interactive ROI calculator widget on landing page (JS sliders, no server call)
- [x] **CRO overhaul — homepage + supplier landing pages** — JTBD-driven copy rewrite (feature → outcome framing), proof strip, struggling-moments sections, "Why Padelnomics" comparison, rewritten FAQ, conditional supplier stats, data-backed proof points, tier-specific CTAs (EN + DE)
---


@@ -160,4 +160,10 @@ Ein bankfähiger Businessplan steht und fällt mit der Qualität der Finanzdaten
Der Businessplan-Export enthält alle 13 Gliederungsabschnitte mit automatisch befüllten Finanztabellen, einer KDDB-Berechnung für alle drei Szenarien und einer Übersicht der relevanten KfW-Programme für Ihr Bundesland.
-[→ Businessplan erstellen]
+[→ Businessplan erstellen](/de/planner)
<div style="background:#EFF6FF;border:1px solid #BFDBFE;border-radius:12px;padding:1.5rem 2rem;margin:2rem 0;">
<p style="margin:0 0 0.5rem;font-weight:600;color:#0F172A;font-size:1.0625rem;">Bankfähige Zahlen plus passende Baupartner</p>
<p style="margin:0 0 1rem;color:#334155;font-size:0.9375rem;">Zum überzeugenden Bankgespräch gehören nicht nur solide Zahlen — sondern auch ein konkretes Angebot von realen Baupartnern. Schildern Sie Ihr Vorhaben in wenigen Minuten — wir stellen den Kontakt zu Architekten, Court-Lieferanten und Haustechnikspezialisten her. Kostenlos und unverbindlich.</p>
<a href="/quote" class="btn">Angebot anfordern</a>
</div>


@@ -160,4 +160,10 @@ A bankable business plan depends on the quality of the financial model behind it
The business plan export includes all 13 sections with auto-populated financial tables, a DSCR calculation across all three scenarios, and a summary of applicable KfW and state programs for your *Bundesland*.
-[→ Generate your business plan]
+[→ Generate your business plan](/en/planner)
<div style="background:#EFF6FF;border:1px solid #BFDBFE;border-radius:12px;padding:1.5rem 2rem;margin:2rem 0;">
<p style="margin:0 0 0.5rem;font-weight:600;color:#0F172A;font-size:1.0625rem;">Complete your bank file — get a build cost estimate</p>
<p style="margin:0 0 1rem;color:#334155;font-size:0.9375rem;">A credible bank application pairs a financial model with a real contractor quote. Describe your project — we'll connect you with architects, court suppliers, and MEP specialists who can provide the cost documentation your bank needs. Free and non-binding.</p>
<a href="/quote" class="btn">Request a Quote</a>
</div>


@@ -331,8 +331,10 @@ Building a padel hall is complex, but it is a solved problem. The failures are n
---
-## Find Builders and Suppliers Through Padelnomics
+## Find the Right Build Partners
Padelnomics maintains a directory of verified build partners for padel hall projects: architects with sports facility experience, court suppliers, HVAC specialists, and operational consultants.
If you're currently in Phase 1 or Phase 2 and looking for the right partners, the directory is the fastest place to start.
<div style="background:#EFF6FF;border:1px solid #BFDBFE;border-radius:12px;padding:1.5rem 2rem;margin:2rem 0;">
<p style="margin:0 0 0.5rem;font-weight:600;color:#0F172A;font-size:1.0625rem;">Get quotes from verified build partners</p>
<p style="margin:0 0 1rem;color:#334155;font-size:0.9375rem;">From feasibility to court installation: describe your project in a few minutes — we'll connect you with vetted architects, court suppliers, and MEP specialists. Free and non-binding.</p>
<a href="/quote" class="btn">Request a Quote</a>
</div>


@@ -191,4 +191,10 @@ Opening a padel hall in Germany in 2026 is a real capital commitment: €930k on
The investors who succeed here are not the ones who found a cheaper build. They are the ones who understood the numbers precisely enough to make the right location and concept decisions early — and to structure their financing before the costs escalated.
-**Next step:** Use the Padelnomics Financial Planner to model your specific scenario — your city, your financing mix, your pricing assumptions. The figures in this article are your starting point; your hall deserves a projection built around your actual numbers.
+**Next step:** Use the [Padelnomics Financial Planner](/en/planner) to model your specific scenario — your city, your financing mix, your pricing assumptions. The figures in this article are your starting point; your hall deserves a projection built around your actual numbers.
<div style="background:#EFF6FF;border:1px solid #BFDBFE;border-radius:12px;padding:1.5rem 2rem;margin:2rem 0;">
<p style="margin:0 0 0.5rem;font-weight:600;color:#0F172A;font-size:1.0625rem;">Test your numbers against real market prices</p>
<p style="margin:0 0 1rem;color:#334155;font-size:0.9375rem;">Once your model is in shape, the next step is benchmarking against actual quotes. Describe your project — we'll connect you with build partners who can give you concrete figures for your specific facility. Free and non-binding.</p>
<a href="/quote" class="btn">Request a Quote</a>
</div>


@@ -179,3 +179,9 @@ Your most powerful tool in every bank meeting: a complete financial model demons
[scenario:padel-halle-6-courts:full]
The Padelnomics business plan includes a full financing structure overview and use-of-funds breakdown — the exact format your bank needs to evaluate the application.
<div style="background:#EFF6FF;border:1px solid #BFDBFE;border-radius:12px;padding:1.5rem 2rem;margin:2rem 0;">
<p style="margin:0 0 0.5rem;font-weight:600;color:#0F172A;font-size:1.0625rem;">Ready to take financing to the next step?</p>
<p style="margin:0 0 1rem;color:#334155;font-size:0.9375rem;">A credible bank application pairs your financial model with a real build cost estimate from a contractor. Describe your project — we'll connect you with build partners who provide the cost documentation lenders expect. Free and non-binding.</p>
<a href="/quote" class="btn">Request a Quote</a>
</div>


@@ -218,6 +218,12 @@ The investors who succeed long-term in padel aren't the ones who found a risk-fr
## Model the Downside with Padelnomics
-The Padelnomics investment planner includes a sensitivity analysis tab designed for exactly this kind of scenario work: how does ROI change at 40% vs 65% utilization? What does a six-month construction delay cost in total? What happens to the model when a competitor opens in year three and takes 20% of demand?
+The [Padelnomics investment planner](/en/planner) includes a sensitivity analysis tab designed for exactly this kind of scenario work: how does ROI change at 40% vs 65% utilization? What does a six-month construction delay cost in total? What happens to the model when a competitor opens in year three and takes 20% of demand?
Good decisions need an honest model — not just the best-case assumptions.
<div style="background:#EFF6FF;border:1px solid #BFDBFE;border-radius:12px;padding:1.5rem 2rem;margin:2rem 0;">
<p style="margin:0 0 0.5rem;font-weight:600;color:#0F172A;font-size:1.0625rem;">Start with the right partners</p>
<p style="margin:0 0 1rem;color:#334155;font-size:0.9375rem;">Most of the risks in this article are manageable with the right advisors, builders, and specialists on board from day one. Describe your project — we'll connect you with vetted partners who specialize in padel facilities. Free and non-binding.</p>
<a href="/quote" class="btn">Request a Quote</a>
</div>


@@ -176,7 +176,7 @@ Before committing to a site search in any city, calibrate where it sits on this
Padelnomics tracks venue density, booking platform utilisation, and demographic fit for cities across Europe. Use the country market overview to read the maturity stage of your target city before evaluating individual sites.
-[→ View market data by country](/markets/germany)
+[→ View market data by country](/en/markets/germany)
---
@@ -184,4 +184,10 @@ Padelnomics tracks venue density, booking platform utilisation, and demographic
Padelnomics analyzes market data for your target area: player density, competitive supply, demand signals from booking platform data, and demographic indicators at municipality level. For your candidate sites, Padelnomics produces a catchment area profile and a side-by-side comparison — so the decision is grounded in data rather than a map with a finger pointing at it.
-[→ Run a location analysis]
+[→ Run a location analysis](/en/planner)
<div style="background:#EFF6FF;border:1px solid #BFDBFE;border-radius:12px;padding:1.5rem 2rem;margin:2rem 0;">
<p style="margin:0 0 0.5rem;font-weight:600;color:#0F172A;font-size:1.0625rem;">Site shortlisted — time to get quotes</p>
<p style="margin:0 0 1rem;color:#334155;font-size:0.9375rem;">Once a location passes your criteria, the next step is engaging architects and court suppliers. Describe your project — we'll connect you with vetted build partners who can give you concrete figures. Free and non-binding.</p>
<a href="/quote" class="btn">Request a Quote</a>
</div>


@@ -326,8 +326,10 @@ Eine Padelhalle zu bauen ist komplex — aber kein ungelöstes Problem. Die Fehl
---
-## Planer und Lieferanten finden
+## Die richtigen Baupartner finden
Padelnomics führt ein Verzeichnis verifizierter Baupartner für Padelhallen im DACH-Raum: Architekten mit Sportanlagenerfahrung, Court-Lieferanten, Haustechnikspezialisten und Betriebsberater.
Wenn Sie gerade in Phase 1 oder Phase 2 sind und die richtigen Partner suchen, ist das Verzeichnis der schnellste Einstieg.
<div style="background:#EFF6FF;border:1px solid #BFDBFE;border-radius:12px;padding:1.5rem 2rem;margin:2rem 0;">
<p style="margin:0 0 0.5rem;font-weight:600;color:#0F172A;font-size:1.0625rem;">Angebote von verifizierten Baupartnern erhalten</p>
<p style="margin:0 0 1rem;color:#334155;font-size:0.9375rem;">Von der Machbarkeitsstudie bis zum Court-Einbau: Schildern Sie Ihr Projekt in wenigen Minuten — wir stellen den Kontakt zu geprüften Architekten, Court-Lieferanten und Haustechnikspezialisten her. Kostenlos und unverbindlich.</p>
<a href="/quote" class="btn">Angebot anfordern</a>
</div>


@@ -199,3 +199,9 @@ Ihr wichtigstes Werkzeug in jedem Bankgespräch: ein vollständiges Finanzmodell
[scenario:padel-halle-6-courts:full]
Der Padelnomics-Businessplan enthält eine vollständige Finanzierungsstrukturübersicht und eine Mittelverwendungsplanung, die direkt in Ihr Bankgespräch mitgenommen werden kann.
<div style="background:#EFF6FF;border:1px solid #BFDBFE;border-radius:12px;padding:1.5rem 2rem;margin:2rem 0;">
<p style="margin:0 0 0.5rem;font-weight:600;color:#0F172A;font-size:1.0625rem;">Bankgespräch vorbereiten — Baupartner finden</p>
<p style="margin:0 0 1rem;color:#334155;font-size:0.9375rem;">Bereit, die Finanzierungsphase anzugehen? Für ein überzeugendes Bankgespräch brauchen Sie auch ein konkretes Angebot von realen Baupartnern. Schildern Sie Ihr Projekt in wenigen Minuten — wir stellen den Kontakt zu Architekten, Court-Lieferanten und Haustechnikspezialisten her, die bankfähige Kalkulationsunterlagen liefern. Kostenlos und unverbindlich.</p>
<a href="/quote" class="btn">Angebot anfordern</a>
</div>


@@ -189,4 +189,10 @@ Die Kosten für eine Padelhalle sind real und erheblich — €930.000 bis €1,
Richtig aufgesetzt, stimmt die Wirtschaftlichkeit: Bei konservativen Annahmen und solider Betriebsführung ist die Amortisation in 3–5 Jahren realistisch. Der deutsche Padel-Markt wächst weiter — aber mit wachsendem Angebot steigen auch die Erwartungen der Spieler und die Anforderungen an Konzept, Lage und Service.
-**Nächster Schritt:** Nutzen Sie den Padelnomics Financial Planner, um Ihre spezifische Konstellation durchzurechnen — mit Ihrem Standort, Ihrer Finanzierungsstruktur und Ihren Preisannahmen. Die Zahlen in diesem Artikel sind Ihr Ausgangspunkt — Ihre Halle verdient eine Kalkulation, die auf Ihren tatsächlichen Rahmenbedingungen aufbaut.
+**Nächster Schritt:** Nutzen Sie den [Padelnomics Financial Planner](/de/planner), um Ihre spezifische Konstellation durchzurechnen — mit Ihrem Standort, Ihrer Finanzierungsstruktur und Ihren Preisannahmen. Die Zahlen in diesem Artikel sind Ihr Ausgangspunkt — Ihre Halle verdient eine Kalkulation, die auf Ihren tatsächlichen Rahmenbedingungen aufbaut.
<div style="background:#EFF6FF;border:1px solid #BFDBFE;border-radius:12px;padding:1.5rem 2rem;margin:2rem 0;">
<p style="margin:0 0 0.5rem;font-weight:600;color:#0F172A;font-size:1.0625rem;">Zahlen prüfen — Angebote einholen</p>
<p style="margin:0 0 1rem;color:#334155;font-size:0.9375rem;">Wenn Ihre Kalkulation steht, ist der nächste Schritt die Konfrontation mit realen Marktpreisen. Schildern Sie Ihr Vorhaben — wir stellen den Kontakt zu Baupartnern her, die konkrete Angebote auf Basis Ihrer Anlage machen können. Kostenlos und unverbindlich.</p>
<a href="/quote" class="btn">Angebot anfordern</a>
</div>


@@ -216,6 +216,12 @@ Niemand kann alle Risiken eliminieren. Aber die Investoren, die langfristig erfo
## Die Padelnomics-Investitionsrechnung
-Der Padelnomics-Planer enthält einen Sensitivitätsanalyse-Tab, der genau diese Szenarien berechenbar macht: Wie verändert sich der ROI bei 40 versus 65 Prozent Auslastung? Was kostet ein sechsmonatiger Bauverzug? Was passiert, wenn ein Wettbewerber in Jahr drei 20 Prozent Ihrer Nachfrage abzieht?
+Der [Padelnomics-Planer](/de/planner) enthält einen Sensitivitätsanalyse-Tab, der genau diese Szenarien berechenbar macht: Wie verändert sich der ROI bei 40 versus 65 Prozent Auslastung? Was kostet ein sechsmonatiger Bauverzug? Was passiert, wenn ein Wettbewerber in Jahr drei 20 Prozent Ihrer Nachfrage abzieht?
Gute Entscheidungen brauchen ein ehrliches Modell — nicht nur die besten Annahmen.
<div style="background:#EFF6FF;border:1px solid #BFDBFE;border-radius:12px;padding:1.5rem 2rem;margin:2rem 0;">
<p style="margin:0 0 0.5rem;font-weight:600;color:#0F172A;font-size:1.0625rem;">Ihr Projekt mit den richtigen Partnern absichern</p>
<p style="margin:0 0 1rem;color:#334155;font-size:0.9375rem;">Das beste Risikomanagement beginnt mit der richtigen Auswahl an Planern und Baupartnern. Schildern Sie Ihr Vorhaben — wir stellen den Kontakt zu geprüften Architekten, Court-Lieferanten und Haustechnikspezialisten her, die sich auf Padelanlagen spezialisiert haben. Kostenlos und unverbindlich.</p>
<a href="/quote" class="btn">Angebot anfordern</a>
</div>


@@ -166,7 +166,7 @@ Bevor Sie in einer Stadt konkret nach Objekten suchen, sollten Sie deren Marktre
Padelnomics erfasst Anlagendichte, Buchungsplattform-Auslastung und demografische Kennzahlen für Städte europaweit. Den aktuellen Marktüberblick für Ihr Zielland finden Sie hier:
-[→ Marktüberblick nach Land](/markets/germany)
+[→ Marktüberblick nach Land](/de/markets/germany)
---
@@ -174,4 +174,10 @@ Padelnomics erfasst Anlagendichte, Buchungsplattform-Auslastung und demografisch
Padelnomics wertet Marktdaten für Ihr Zielgebiet aus: Spielerdichte, Wettbewerbsdichte, Court-Nachfrage-Indikatoren aus Buchungsplattformdaten und demografische Kennzahlen auf Gemeindeebene. Für Ihre potenziellen Standorte erstellt Padelnomics ein Einzugsgebietsprofil und einen Standortvergleich — so dass die Entscheidung auf einer Datenbasis getroffen werden kann, nicht auf einer Karte mit Fingerzeig.
-[→ Standortanalyse starten]
+[→ Standortanalyse starten](/de/planner)
<div style="background:#EFF6FF;border:1px solid #BFDBFE;border-radius:12px;padding:1.5rem 2rem;margin:2rem 0;">
<p style="margin:0 0 0.5rem;font-weight:600;color:#0F172A;font-size:1.0625rem;">Den richtigen Standort gefunden? Angebote einholen.</p>
<p style="margin:0 0 1rem;color:#334155;font-size:0.9375rem;">Sobald ein Standort die Kriterien erfüllt, folgt der nächste Schritt: die Kontaktaufnahme mit Architekten und Court-Lieferanten. Schildern Sie Ihr Vorhaben — wir stellen den Kontakt zu geprüften Baupartnern her. Kostenlos und unverbindlich.</p>
<a href="/quote" class="btn">Angebot anfordern</a>
</div>


@@ -1,6 +1,6 @@
# Padelnomics — Marketing Master Doc
-> Living doc. Update state column as things progress. Last updated: 2026-02-22.
+> Living doc. Update state column as things progress. Last updated: 2026-03-04.
---
@@ -216,9 +216,9 @@ The moat compounds over time — this is critical to long-term defensibility.
| Channel | Approach | State |
|---------|----------|-------|
-| **LinkedIn** | Founder posts, thought leadership, padel community | [ ] Not started |
-| **Reddit** | r/padel, r/entrepreneur — seeding calculator, articles | [ ] Not started |
-| **Facebook Groups** | Padel business groups, sports entrepreneur communities | [ ] Not started |
+| **LinkedIn** | Founder posts, thought leadership, padel community | [~] First post published |
+| **Reddit** | r/padel, r/sweatystartup, r/entrepreneur, r/tennis, r/smallbusiness, r/pickleball, r/CRE — seeding calculator, articles | [~] Active in 7 subreddits |
+| **Facebook Groups** | Padel business groups, sports entrepreneur communities | [~] Active in 2-3 groups |
### Borrowed (Month 2+)

docs/gtm-day-one.md Normal file

@@ -0,0 +1,89 @@
# GTM — Day One Action Plan
> Created: 2026-03-04. Do these in order. Total time: ~4–5 hours.
---
## Right Now (1–2 hours, highest leverage)
### 1. Submit sitemap to Google Search Console + Bing Webmaster Tools
You have 80 programmatic city articles sitting unindexed. Every day without indexing is wasted compound time.
- [search.google.com/search-console](https://search.google.com/search-console) → Add property → Submit sitemap
- [bing.com/webmasters](https://www.bing.com/webmasters) (Bing also feeds DuckDuckGo, Ecosia, Yahoo)
- Your SEO hub already supports both — just add the env vars
### 2. Publish SEO articles on prod
Run `seed_content --generate` from admin or CLI. Those 80 city pages (40 cities × EN+DE) are the primary organic traffic engine. Until they're live and crawlable, they generate zero value.
### 3. Index the planner in Google
Make sure `/en/calculator` and `/de/rechner` are in the sitemap and crawlable. This is the #1 free tool — the entire PLG funnel starts here. Check canonical tags and hreflang are correct.
---
## This Afternoon (2–3 hours, seed distribution)
### 4. First LinkedIn post
Data-driven insight from the pipeline. See `docs/social-posts.md` for the full post.
### 5. Post in Reddit communities
- **r/padel**: Free calculator angle — genuinely useful tool
- **r/entrepreneur**: Indie maker angle — "built this with real market data"
- **r/smallbusiness**: Business planning tool angle
- **r/tennis**: Cross-sport angle — tennis clubs adding padel courts
See `docs/social-posts.md` for all posts ready to copy-paste.
### 6. Share in 2–3 Facebook padel business groups
Same angle as Reddit — free tool, no hard sell. Search for:
- "Padel Business" groups
- "Padel Club Owners" groups
- "Padel Deutschland" / "Padel Germany" groups
---
## This Evening (1 hour, set up compounding assets)
### 7. Verify Resend production API key
Test a real magic link email. Until email works in prod, you can't capture traffic.
### 8. Wipe test suppliers
Delete the 5 `example.com` entries. An empty directory with "Be the first to list" beats obviously fake data.
### 9. Request indexing for top 5 city pages
After GSC is set up, use "Request Indexing" manually for highest-value pages:
- `/de/markets/berlin`, `/de/markets/muenchen`, `/de/markets/hamburg`
- `/en/markets/london`, `/en/markets/madrid`
Google prioritizes manually requested URLs — can appear in search within days vs. weeks.
---
## What NOT to do today
- ~~"State of Padel" report~~ — multi-day effort
- ~~Supplier outreach~~ — site needs to be live + articles indexed first
- ~~Copy/CRO optimization~~ — premature, get traffic first
- ~~Paid ads~~ — excluded in channel strategy
---
## Expected outcome
If you do steps 1–9 today:
- 80 pages submitted for indexing (organic traffic starts in 1–3 weeks)
- 3–5 social posts seeding traffic immediately
- Planner discoverable and shareable
- Email capture working for when traffic arrives
**Single highest-leverage action: publish the articles + submit the sitemap.** Everything else is distribution on top of that foundation.


@@ -0,0 +1,91 @@
# Reddit Communities — Padelnomics Distribution
> Permanent reference for Reddit distribution. Subreddits ranked by relevance + size.
> Created: 2026-03-04. Review monthly — subreddit rules change.
---
## Tier 1 — Post Here First
High relevance, receptive to tools/data, proven padel or business-planning interest.
| Subreddit | Size | Angle | Notes |
|-----------|------|-------|-------|
| r/padel | ~20K | Free calculator, data insights, answer existing biz threads | Player community — lead with the sport, not the product. Helpful tone only. |
| r/sweatystartup | ~56-81K | "Best brick-and-mortar sports opportunity" with unit economics | Loves concrete P&L numbers. Show CAPEX/OPEX/payback, not vision. |
| r/tennis | ~2M | Tennis club court conversion trends + data | Huge audience. Angle: "your club is probably already thinking about this." |
| r/smallbusiness | ~2.2M | Free business planning tool for sports facilities | Practical, no-hype tone. Lead with the tool, not the market thesis. |
---
## Tier 2 — Test With One Post Each
Potentially high-value but less proven fit. Post once, measure engagement, double down if it works.
| Subreddit | Size | Angle | Notes |
|-----------|------|-------|-------|
| r/entrepreneur | ~4.8M | "Bloomberg for padel" indie builder story | Loves "I built X" posts with real data. Show the data pipeline, not just the product. |
| r/CommercialRealEstate | ~44K | Sports venue site selection as niche CRE | Small but highly targeted. Angle: alternative asset class with data backing. |
| r/realestateinvesting | ~1.2M | Alternative commercial RE asset class | Broader audience. Frame padel as "the new self-storage" — boring but profitable. |
| r/pickleball | ~30K | Padel vs pickleball facility economics comparison | Comparative angle works. Don't trash pickleball — frame as "here's what the padel side looks like." |
| r/gymowners | Small | Cross-reference gym location frameworks with padel data | Niche. Test if gym owners see padel as a complementary or competing asset. |
| r/padelUSA | <5K | US-specific demand data | Tiny but highly relevant. US padel market is nascent — early authority opportunity. |
---
## Tier 3 — Monitor Only
Read these for trends and conversations. Don't post unless a specific thread is a perfect fit for a data-backed comment.
- r/business — too generic, self-promo gets buried
- r/startups — SaaS-focused, padel doesn't fit the narrative
- r/SaaS — pure software community, facility business is off-topic
- r/venturecapital — wrong audience for bootstrapped niche tool
- r/sports — massive, low engagement on niche content
---
## Key Gap
No subreddit exists for padel facility operators or business owners. If community forms organically around Padelnomics content (comments like "where can I discuss this more?"), consider creating **r/padelbusiness** later. Don't force it — let demand signal the timing.
---
## Posting Rules
1. **One link per post, at the end.** Never in the title.
2. **Engage with every comment for 24 hours** after posting. This is where the real value is.
3. **No cross-posting.** Each post is unique to the subreddit's culture and tone.
4. **If a post gets removed, don't repost.** Move to the next subreddit. Respect mod decisions.
5. **Read each subreddit's rules before posting.** Some ban self-promotion entirely. Some require flair. Some have minimum account age/karma requirements.
6. **Never post more than one subreddit per day.** Spread it out. Reddit's spam detection flags rapid multi-sub posting.
7. **Comment on existing threads first.** Build karma and presence in a sub before dropping your own post.
---
## UTM Tracking Format
All Reddit links use this format:
```
https://padelnomics.io/<path>?utm_source=reddit&utm_medium=social&utm_campaign=launch&utm_content=r_<subreddit>
```
Examples:
- `https://padelnomics.io/en/planner/?utm_source=reddit&utm_medium=social&utm_campaign=launch&utm_content=r_padel`
- `https://padelnomics.io/en/markets?utm_source=reddit&utm_medium=social&utm_campaign=launch&utm_content=r_sweatystartup`
- `https://padelnomics.io/en/markets?utm_source=reddit&utm_medium=social&utm_campaign=launch&utm_content=r_cre`
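Assembling these links by hand invites typos in the query string. A small helper (illustrative, not part of the repo) keeps the format mechanical:

```python
from urllib.parse import urlencode

def reddit_utm(path: str, subreddit: str) -> str:
    """Build a launch-campaign Reddit link in the format above."""
    params = {
        "utm_source": "reddit",
        "utm_medium": "social",
        "utm_campaign": "launch",
        "utm_content": f"r_{subreddit}",
    }
    return f"https://padelnomics.io{path}?{urlencode(params)}"
```

For example, `reddit_utm("/en/planner/", "padel")` reproduces the first example link exactly.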
---
## Measuring Success
| Metric | Good | Great |
|--------|------|-------|
| Post upvotes | 10+ | 50+ |
| Comments | 5+ | 20+ |
| UTM clicks (GA) | 20+ per post | 100+ per post |
| Planner completions from Reddit | 5+ per post | 20+ per post |
| Email captures from Reddit | 2+ per post | 10+ per post |
Track weekly in a simple spreadsheet. Drop subreddits that produce zero clicks after 2 posts.

docs/reddit-posting-plan.md Normal file

@@ -0,0 +1,106 @@
# Reddit Posting Plan — Launch Sequence
> Day-by-day posting schedule. One post per day, engage for 24 hours after each.
> Created: 2026-03-04. See `docs/reddit-communities.md` for full subreddit research.
---
## Posting Sequence
| Day | Subreddit | Post Title | Angle | UTM |
|-----|-----------|-----------|-------|-----|
| 1 | r/padel | "I built a free padel court ROI calculator — feedback welcome" | Free tool, genuinely helpful | `utm_content=r_padel` |
| 2 | r/sweatystartup | "25K venues analyzed — which cities are undersupplied for padel" | Unit economics, brick-and-mortar opportunity | `utm_content=r_sweatystartup` |
| 3 | r/entrepreneur | "I'm building the 'Bloomberg for padel' — tracking 10,127 facilities across 17 countries" | Indie builder story with real data | `utm_content=r_entrepreneur` |
| 4 | r/tennis | "Data on padel facility economics — useful for tennis clubs considering adding courts" | Tennis club conversion data | `utm_content=r_tennis` |
| 5 | r/smallbusiness | "Free business planning tool for anyone looking at opening a sports facility" | Practical tool for real decisions | `utm_content=r_smallbusiness` |
| 7 | r/pickleball | "Padel vs pickleball facility economics — a data comparison" | Comparative, respectful of pickleball | `utm_content=r_pickleball` |
| 10 | r/CommercialRealEstate | "Sports venue site selection — data on underserved markets" | Alternative CRE asset class | `utm_content=r_cre` |
Day 6 and days 8-9 are rest days for engaging with comments on previous posts.
---
## Full UTM Format
Every Reddit link follows this exact format:
```
https://padelnomics.io/<path>?utm_source=reddit&utm_medium=social&utm_campaign=launch&utm_content=<value>
```
| Subreddit | utm_content value |
|-----------|-------------------|
| r/padel | `r_padel` |
| r/sweatystartup | `r_sweatystartup` |
| r/entrepreneur | `r_entrepreneur` |
| r/tennis | `r_tennis` |
| r/smallbusiness | `r_smallbusiness` |
| r/pickleball | `r_pickleball` |
| r/CommercialRealEstate | `r_cre` |
---
## Post Content
Full post text is in `docs/social-posts.md`. Before posting, replace `[LINK]` placeholders with the correct UTM-tagged URL:
| Post | Link to |
|------|---------|
| r/padel | `https://padelnomics.io/en/planner/?utm_source=reddit&utm_medium=social&utm_campaign=launch&utm_content=r_padel` |
| r/sweatystartup | `https://padelnomics.io/en/markets?utm_source=reddit&utm_medium=social&utm_campaign=launch&utm_content=r_sweatystartup` |
| r/entrepreneur | `https://padelnomics.io/en/?utm_source=reddit&utm_medium=social&utm_campaign=launch&utm_content=r_entrepreneur` |
| r/tennis | `https://padelnomics.io/en/planner/?utm_source=reddit&utm_medium=social&utm_campaign=launch&utm_content=r_tennis` |
| r/smallbusiness | `https://padelnomics.io/en/planner/?utm_source=reddit&utm_medium=social&utm_campaign=launch&utm_content=r_smallbusiness` |
| r/pickleball | `https://padelnomics.io/en/planner/?utm_source=reddit&utm_medium=social&utm_campaign=launch&utm_content=r_pickleball` |
| r/CommercialRealEstate | `https://padelnomics.io/en/markets?utm_source=reddit&utm_medium=social&utm_campaign=launch&utm_content=r_cre` |
---
## Rules
1. **One link per post, at the end.** Never in the title.
2. **Engage with every comment for 24 hours** after posting.
3. **No cross-posting.** Each post is written uniquely for its subreddit's culture.
4. **If a post gets removed, don't repost.** Move to the next subreddit.
5. **Read subreddit rules before posting.** Check for self-promotion policies, flair requirements, minimum karma.
6. **Comment on 2-3 existing threads** in a subreddit before making your own post (builds credibility).
7. **Never mention other posts.** Each community should feel like they're getting a unique share.
---
## Engagement Playbook
### When you get comments:
- **"How accurate is this?"** — Share methodology: real market data from OpenStreetMap, Playtomic, Eurostat. Not generic assumptions.
- **"What about [city]?"** — Run the planner for their city, share the numbers. This is high-value personalized engagement.
- **"I'm actually looking at opening a facility"** — Offer to walk through the planner with them. Ask about their timeline, location, budget. This is a lead.
- **"This is just an ad"** — Don't get defensive. Say "Fair point — I built this and wanted feedback. The tool is free with no signup, so figured it might be useful here."
- **"What's your business model?"** — Be transparent: "Free calculator, paid market intelligence for serious investors, supplier directory for builders."
### When a post gets traction (50+ upvotes):
- Reply with additional data points to keep the thread alive
- Answer every question, even late ones
- Don't edit the original post to add more links
---
## Tracking
After each post, log:
| Field | Example |
|-------|---------|
| Date posted | 2026-03-04 |
| Subreddit | r/padel |
| Post URL | reddit.com/r/padel/... |
| Upvotes (24hr) | 15 |
| Comments (24hr) | 7 |
| UTM clicks (GA, 7d) | 42 |
| Planner starts (7d) | 12 |
| Emails captured (7d) | 3 |
| Removed? | No |
Review after Day 10. Double down on subreddits that drove clicks. Drop ones that didn't.


@@ -0,0 +1,150 @@
# SEO Content Calendar — First 30 Days
> 4-week content plan covering programmatic SEO deployment, cornerstone articles, and data-driven content.
> Created: 2026-03-04.
---
## Week 1 — Foundation (March 4-10)
Get the existing 80 pages indexed and write the first cornerstone article.
| Day | Task | Owner | State |
|-----|------|-------|-------|
| Mon | Publish 80 programmatic city articles (40 cities x EN+DE) | Deploy | [ ] |
| Mon | Submit sitemap to Google Search Console | Manual | [ ] |
| Mon | Submit sitemap to Bing Webmaster Tools | Manual | [ ] |
| Tue | Request manual indexing for top 10 pages in GSC | Manual | [ ] |
| Tue | Verify hreflang tags and canonical URLs on all city pages | Audit | [ ] |
| Wed-Fri | Write Article #1: "Is Padel Still a Good Investment in 2026?" | Editorial | [ ] |
| Fri | Publish Article #1, add to sitemap | Deploy | [ ] |
**Top 10 pages for manual indexing:**
1. `/de/markets/berlin`
2. `/de/markets/muenchen`
3. `/de/markets/hamburg`
4. `/en/markets/london`
5. `/en/markets/madrid`
6. `/en/calculator`
7. `/de/rechner`
8. `/en/markets/paris`
9. `/de/markets/frankfurt`
10. `/de/markets/koeln`
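The Tuesday hreflang/canonical audit is scriptable with the standard library; a rough sketch (the `LinkAudit` class and `audit` helper are illustrative, assuming the city pages expose standard `<link rel="canonical">` and `<link rel="alternate" hreflang="…">` tags):

```python
from html.parser import HTMLParser
from urllib.request import urlopen

class LinkAudit(HTMLParser):
    """Collects the canonical URL and hreflang alternates from a page's <link> tags."""
    def __init__(self):
        super().__init__()
        self.canonical = None
        self.hreflangs = {}

    def handle_starttag(self, tag, attrs):
        if tag != "link":
            return
        a = dict(attrs)
        if a.get("rel") == "canonical":
            self.canonical = a.get("href")
        elif a.get("rel") == "alternate" and "hreflang" in a:
            self.hreflangs[a["hreflang"]] = a.get("href")

def audit(url: str) -> LinkAudit:
    """Fetch a page and return its collected canonical/hreflang links."""
    parser = LinkAudit()
    with urlopen(url, timeout=10) as resp:  # network call — run against live pages
        parser.feed(resp.read().decode("utf-8", errors="replace"))
    return parser

# Example (run against a live page):
# result = audit("https://padelnomics.io/de/markets/berlin")
# A DE page should have a canonical plus DE and EN alternates.
```

Running `audit()` over the ten URLs above and flagging any page with a missing canonical or an incomplete hreflang pair covers the Tuesday task.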
---
## Week 2 — Cornerstone Content (March 11-17)
Two high-value articles targeting decision-stage keywords. Internal linking pass connects everything.
| Day | Task | Owner | State |
|-----|------|-------|-------|
| Mon-Tue | Write Article #2: "How Much Does It Cost to Open a Padel Hall in Germany?" | Editorial | [ ] |
| Wed | Publish Article #2 | Deploy | [ ] |
| Thu-Fri | Write Article #3: "What Banks Want to See in a Padel Business Plan" | Editorial | [ ] |
| Fri | Publish Article #3 | Deploy | [ ] |
| Sat | Internal linking pass: city articles -> cornerstone articles -> planner | Technical | [ ] |
### Article #2 — Target Keywords
- "padel halle kosten" / "padel court cost germany"
- "padel halle eroeffnen kosten" / "how much to open padel hall"
- "padel anlage investition"
### Article #3 — Target Keywords
- "padel business plan" / "padel halle business plan"
- "padel halle finanzierung" / "padel financing"
- "bank business plan padel"
### Internal Linking Structure
```
City article (e.g., /markets/berlin)
-> "How much does it cost?" (Article #2)
-> "Plan your facility" (/calculator)
Article #2 (Cost breakdown)
-> "Build your business plan" (/calculator)
-> "What banks want to see" (Article #3)
-> City-specific examples (/markets/muenchen, /markets/hamburg)
Article #3 (Bank requirements)
-> "Generate your business plan" (/calculator)
-> "Check market data for your city" (/markets)
```
---
## Week 3 — Data-Driven Content (March 18-24)
Leverage the pipeline data for unique content nobody else can produce.
| Day | Task | Owner | State |
|-----|------|-------|-------|
| Mon-Wed | Write "Top 50 Underserved Locations for Padel in Europe" | Editorial | [ ] |
| Wed | Publish Top 50 article | Deploy | [ ] |
| Thu-Fri | Build Gemeinde-level pSEO template (targets "Padel in [Ort]") | Technical | [ ] |
| Fri | Generate first batch of Gemeinde pages (top 20 locations) | Deploy | [ ] |
### Top 50 Article
- Source data from `location_opportunity_profile` in the serving layer
- Rank by opportunity score, filter to locations with zero existing facilities
- Include mini-profiles: population, income level, nearest existing facility, opportunity score
- Embed interactive map if possible, otherwise static top-50 table
- Target keywords: "where to open padel", "best locations padel europe", "padel market gaps"
### Gemeinde-Level pSEO
- Template targets: "Padel in [Ort]" / "Padel [Gemeinde]"
- Zero SERP competition confirmed for most German municipalities
- Content: local demographics, nearest facilities, opportunity score, CTA to planner
- Start with top 20 highest-opportunity Gemeinden, expand weekly
---
## Week 4 — Authority Building (March 25-31)
Establish Padelnomics as the data authority. Begin email-gated content for list building.
| Day | Task | Owner | State |
|-----|------|-------|-------|
| Mon-Wed | Write "State of Padel Q1 2026" report | Editorial | [ ] |
| Wed | Design PDF layout (WeasyPrint or similar) | Technical | [ ] |
| Thu | Publish report landing page (email-gated download) | Deploy | [ ] |
| Thu | Promote Market Score methodology page via social | Social | [ ] |
| Fri | Begin link building via Reddit/LinkedIn engagement | Social | [ ] |
| Ongoing | Monitor GSC for indexing progress, fix crawl errors | Technical | [ ] |
### State of Padel Q1 2026 Report
- Executive summary of European padel market
- Facility count by country (from pipeline data)
- Growth trends (year-over-year where data exists)
- Top opportunity markets (from opportunity scoring)
- Investment economics summary (from planner defaults)
- Email-gated: free download in exchange for email address
- Promote via LinkedIn, Reddit, and direct outreach to industry contacts
---
## Content Inventory (End of Month 1)
| Type | Count | State |
|------|-------|-------|
| Programmatic city articles (EN+DE) | 80 | Deployed Week 1 |
| Cornerstone articles | 3 | Published Weeks 1-2 |
| Data-driven article (Top 50) | 1 | Published Week 3 |
| Gemeinde-level pSEO pages | 20+ | Started Week 3 |
| Gated report (State of Padel) | 1 | Published Week 4 |
| **Total indexable pages** | **105+** | |
---
## SEO KPIs — End of Month 1
| Metric | Target |
|--------|--------|
| Pages indexed (GSC) | 80+ of 105 |
| Organic impressions | 500+ |
| Organic clicks | 50+ |
| Average position (target keywords) | Top 50 |
| Email captures from gated report | 50+ |
| Backlinks acquired | 3+ |
These are conservative baselines. Programmatic pages in zero-competition niches can index and rank faster than typical content.

docs/social-posts-de.md

@@ -0,0 +1,153 @@
# Social Posts — Deutsche Versionen
> Fertige Posts zum Rauskopieren. Domain: padelnomics.io
> Erstellt: 2026-03-04.
>
> Reddit-Posts bleiben auf Englisch (englischsprachige Subreddits).
> Diese Datei enthält LinkedIn- und Facebook-Posts auf Deutsch.
---
## LinkedIn Post #1 — Marktdaten
> Ziel: Glaubwürdigkeit aufbauen + Traffic auf den Rechner lenken.
```
10.127 Padel-Anlagen in 17 Ländern — wir haben sie alle erfasst.
Was dabei auffällt:
→ Italien führt mit 3.069 Anlagen. Mehr als Spanien (2.241).
→ Portugal hat den reifsten Padel-Markt weltweit (Maturity Score 45,2/100) — bei „nur“ 506 Anlagen.
→ Deutschland: 359 Anlagen für 84 Mio. Einwohner. Spanien: 2.241 für 47 Mio.
Diese Lücke ist die Chance.
Wir haben 15.390 Standorte ohne Padel-Angebot identifiziert, die hohes Potenzial zeigen. Hamburg, München und Frankfurt stehen in Deutschland ganz oben.
Für alle, die über eine eigene Padel-Anlage nachdenken oder jemanden beraten: Wir haben einen kostenlosen ROI-Rechner gebaut, der mit echten Marktdaten die Kosten, Umsätze und Amortisation für jede Stadt in Europa modelliert.
Ohne Anmeldung. Einfach rechnen.
→ https://padelnomics.io/de/planner/?utm_source=linkedin&utm_medium=social&utm_campaign=launch&utm_content=li_marktdaten
#padel #sportbusiness #marktdaten #unternehmertum
```
---
## LinkedIn Post #2 — Standortanalyse (Tag 2–3 posten)
```
Die 5 am stärksten unterversorgten Städte für Padel in Europa:
1. Hamburg — 1,85 Mio. Einwohner, keine einzige Padel-Anlage
2. München — 1,26 Mio. Einwohner, starke Sportkultur, kaum Angebot
3. Bergen (Norwegen) — 294.000 Einwohner, Opportunity Score: 87,5/100
4. Graz (Österreich) — 303.000 Einwohner, null Courts, hohes Einkommen
5. Genf (Schweiz) — 202.000 Einwohner, null Courts, höchste Kaufkraft
Keine Schätzungen. Wir bewerten 143.877 Standorte in Europa anhand von Bevölkerungsdichte, Einkommensdaten, bestehendem Angebot und Sportinfrastruktur.
Der Padel-Markt wächst von 25.000 auf über 50.000 Anlagen weltweit. Die Frage ist nicht ob — sondern wo.
→ Daten für eure Stadt: https://padelnomics.io/de/markets?utm_source=linkedin&utm_medium=social&utm_campaign=launch&utm_content=li_standortanalyse
#padel #marktanalyse #sportsinvestment #immobilien
```
---
## LinkedIn Post #3 — Gründerstory (optional, Woche 2)
```
Vor einem Jahr habe ich angefangen, den europäischen Padel-Markt systematisch zu erfassen.
Der Auslöser: Jeder, der eine Padel-Halle plant, trifft eine Entscheidung im sechsstelligen Bereich — und hat dafür keine belastbaren Daten. Kein zentrales Marktbild. Keine vergleichbaren Kennzahlen. Nur Excel und Bauchgefühl.
Daraus ist Padelnomics entstanden: eine Datenplattform für die Padel-Branche.
Was heute live ist:
→ Kostenloser ROI-Rechner mit stadtspezifischen Realdaten
→ 80 Marktanalysen für Städte in 17 Ländern
→ Standortbewertung für 143.877 Orte in Europa
→ Anbieterverzeichnis für Bau und Ausstattung
Die Daten kommen aus OpenStreetMap, Playtomic, Eurostat und Zensusdaten — automatisch aggregiert und bewertet.
Noch am Anfang, aber der Datenvorsprung wächst jeden Tag.
→ https://padelnomics.io/de/?utm_source=linkedin&utm_medium=social&utm_campaign=launch&utm_content=li_gruenderstory
#padel #startup #datenplattform #sportbusiness
```
---
## Facebook — Padel-Gruppen (Deutschland/DACH)
> Ton: locker, hilfsbereit, kurz. Kein Pitch.
**Titel (falls die Gruppe Titel erlaubt):** Kostenloser Padel-Rechner mit echten Marktdaten
```
Moin zusammen,
ich hab einen kostenlosen Finanzplanungs-Rechner für Padel-Anlagen gebaut. CAPEX, laufende Kosten, Umsatzprognose — und am Ende eine 5-Jahres-GuV mit Amortisation.
Der Unterschied zu den üblichen Excel-Vorlagen: Der Rechner befüllt sich automatisch mit echten Daten für euren Standort. Mieten, Nebenkosten, Genehmigungsgebühren — alles stadtspezifisch, basierend auf Daten aus 17 Ländern.
Keine Anmeldung, kostenlos.
→ https://padelnomics.io/de/planner/?utm_source=facebook&utm_medium=social&utm_campaign=launch&utm_content=fb_padel_de
Feedback ist willkommen — gerade von Leuten, die den Planungsprozess schon hinter sich haben und wissen, welche Zahlen wirklich zählen.
```
---
## Facebook — Tennisvereine / Sportvereine (DACH)
> Ziel: Tennisvereine, die über Padel-Courts nachdenken.
```
Falls euer Verein gerade über Padel-Courts nachdenkt (und viele tun das): Ich hab ein kostenloses Tool gebaut, das die Wirtschaftlichkeit durchrechnet.
→ Investitionskosten für 2–6 Courts an bestehenden Anlagen
→ Umsatzprognose auf Basis realer Auslastungs- und Preisdaten
→ Laufende Kosten für euren konkreten Standort
→ Amortisation und ROI-Kennzahlen
Ein paar Zahlen aus unseren Daten:
- Durchschnittliche Auslastung in reifen Märkten: 60–75 %
- Outdoor-Anlage mit 4 Courts: 200.000–350.000 €
- Indoor: 700.000–3 Mio. € je nach Bauweise
- Tennisvereine, die 2 Plätze umrüsten, sehen typischerweise nach 18–30 Monaten Amortisation
Keine Anmeldung nötig.
→ https://padelnomics.io/de/planner/?utm_source=facebook&utm_medium=social&utm_campaign=launch&utm_content=fb_tennis_de
Kann gern Daten zu einzelnen Städten oder Regionen teilen, wenn ihr etwas Konkretes prüft.
```
---
## Posting-Zeitplan
| Tag | Plattform | Post |
|-----|-----------|------|
| Heute | LinkedIn (Company Page) | Post #1 (Marktdaten) |
| Heute | 1–2 deutsche FB-Padel-Gruppen | Padel-Rechner |
| Morgen | 1–2 FB-Tennisvereins-Gruppen | Tennisverein-Angle |
| Tag 3 | LinkedIn (Company Page) | Post #2 (Standortanalyse) |
| Woche 2 | LinkedIn (Company Page) | Post #3 (Gründerstory) |
---
## Regeln
- Ein Link pro Post, am Ende.
- 24 Stunden auf jeden Kommentar reagieren.
- Wenn ein Post Traktion bekommt: mit zusätzlichen Datenpunkten nachliefern.
- UTM-Tracking: `?utm_source=linkedin&utm_medium=social&utm_campaign=launch&utm_content=li_marktdaten` bzw. `utm_source=facebook` für FB-Posts.

docs/social-posts.md

@@ -0,0 +1,248 @@
# Social Posts — Launch Day
> Ready to copy-paste. Domain: padelnomics.io
> Created: 2026-03-04.
---
## LinkedIn Post #1 — Data Insight
> Post type: data-driven thought leadership. Goal: establish credibility + drive traffic to planner.
```
We've been tracking 10,127 padel facilities across 17 countries.
Here's what surprised me about the European market:
→ Italy leads with 3,069 facilities — more than Spain (2,241)
→ Portugal has the world's most mature padel market (45.2/100 maturity score) with "only" 506 facilities
→ Germany has just 359 facilities for 84M people. Spain has 2,241 for 47M.
That gap is the opportunity.
We identified 15,390 high-potential locations with zero padel courts worldwide.
Hamburg, Munich, and Frankfurt top the list in Germany alone.
If you're thinking about opening a padel facility — or advising someone who is — we built a free ROI calculator that uses this data to model costs, revenue, and payback period for any city in Europe.
No signup required. Just real numbers.
→ https://padelnomics.io/en/planner/?utm_source=linkedin&utm_medium=social&utm_campaign=launch&utm_content=li_marketdata
#padel #sportsbusiness #marketdata #entrepreneurship
```
---
## LinkedIn Post #2 — Opportunity Angle (schedule for Day 2–3)
```
The 5 most underserved cities for padel in Europe right now:
1. Hamburg (1.85M residents, zero dedicated padel facilities)
2. Munich (1.26M residents, massive sports culture, minimal supply)
3. Bergen, Norway (294K residents, opportunity score: 87.5/100)
4. Graz, Austria (303K residents, zero courts, high income)
5. Geneva, Switzerland (202K residents, zero courts, highest purchasing power)
These aren't guesses. We score 143,877 locations across Europe using population density, income data, existing supply, and sports infrastructure.
The padel market is growing from 25K to 50K+ facilities globally. The question isn't whether — it's where.
→ Explore the data for your city: https://padelnomics.io/en/markets?utm_source=linkedin&utm_medium=social&utm_campaign=launch&utm_content=li_opportunity
#padel #marketintelligence #sportsinvestment #realestate
```
---
## Reddit — r/padel
> Tone: genuinely helpful, not promotional. r/padel is a player community, so lead with the sport angle.
**Title:** I built a free padel court ROI calculator — feedback welcome
```
Hey r/padel,
I've been working on a data project tracking the padel market across Europe
(facility counts, market maturity, opportunity gaps). As part of that, I built
a free calculator for anyone thinking about opening a padel facility.
It models:
- CAPEX (construction, equipment, permits)
- OPEX (rent, staffing, utilities, maintenance)
- Revenue projections based on real market data from your city
- 5-year P&L with payback period, IRR, and break-even
It pre-fills with city-specific defaults — so if you pick Munich, it uses
Munich rents, Munich utility costs, etc. Not generic averages.
No signup needed. Just wanted to share in case anyone here has ever thought
about the business side of padel.
→ https://padelnomics.io/en/planner/?utm_source=reddit&utm_medium=social&utm_campaign=launch&utm_content=r_padel
Happy to answer questions about the data or methodology. Also open to feedback
on what would make this more useful.
```
---
## Reddit — r/entrepreneur
> Tone: indie builder sharing a project. r/entrepreneur loves "I built X" posts with real data.
**Title:** I'm building the "Bloomberg for padel" — tracking 10,127 facilities across 17 countries
```
Padel is the fastest-growing sport in Europe and Latin America. There are now
10,000+ facilities worldwide and the market is expected to double to 50K+ in
the next 5 years.
The problem: anyone trying to open a padel facility is flying blind. No
centralized market data exists. People are making €200K–€2M investment
decisions based on Excel spreadsheets and gut feel.
I'm building Padelnomics — a data intelligence platform for the padel industry.
Think "Kpler for padel" if you're familiar with commodity data platforms.
What's live right now:
- Free ROI calculator that models costs, revenue, and payback for any European
city (pre-filled with real local data — rents, utilities, permits, etc.)
- 80 market analysis pages covering cities across 17 countries
- Market maturity scoring for 4,686 cities with padel facilities
- Opportunity scoring for 143,877 locations (identifying where to build next)
The data comes from OpenStreetMap, Playtomic (booking platform), Eurostat, and
census data — aggregated and scored automatically.
Revenue model: free calculator captures leads (aspiring facility owners) →
supplier directory connects them with builders → suppliers pay for qualified
leads via credit system.
Still early but the data moat compounds daily — every day of scraping = data
competitors can't replicate.
Would love feedback from anyone who's built data products or two-sided
marketplaces.
→ https://padelnomics.io/en/?utm_source=reddit&utm_medium=social&utm_campaign=launch&utm_content=r_entrepreneur
```
---
## Reddit — r/smallbusiness
> Tone: practical tool for a real business decision.
**Title:** Free business planning tool for anyone looking at opening a sports facility
```
I built a free financial planning tool specifically for padel facilities
(indoor/outdoor sports courts — fastest growing sport in Europe right now).
It covers the full picture:
- Construction costs (indoor vs outdoor, number of courts)
- Operating expenses (rent, staff, utilities, insurance, maintenance)
- Revenue modeling (hourly rates, occupancy rates, lessons, events)
- 5-year P&L projection
- Key metrics: payback period, IRR, break-even point
The tool pre-fills with real data for your city — actual local rents, utility
costs, permit fees — not generic averages.
You can also generate a bank-ready business plan PDF from it.
Free to use, no signup required for the calculator itself.
→ https://padelnomics.io/en/planner/?utm_source=reddit&utm_medium=social&utm_campaign=launch&utm_content=r_smallbusiness
Built this because I kept seeing people on forums asking "how much does it cost
to open a padel hall?" and getting wildly different answers. Figured real data
was better than guesswork.
```
---
## Reddit — r/tennis
> Tone: cross-sport angle. Many tennis clubs are adding padel courts.
**Title:** Data on padel facility economics — useful for tennis clubs considering adding courts
```
If your club is thinking about adding padel courts (and many are right now),
I built a free financial planning tool that models the full economics:
- CAPEX for adding 2–6 courts to an existing facility
- Revenue projections based on real occupancy and pricing data
- Operating costs specific to your city/country
- Payback period and ROI metrics
The tool uses actual market data — we track 10,127 padel facilities across
17 countries and score market maturity + opportunity by city.
Some interesting numbers:
- Average padel facility in a mature market runs at 60–75% occupancy
- A 4-court outdoor setup costs €200K–€350K
- Indoor builds jump to €700K–€3M depending on structure
- Tennis clubs converting 2 courts to padel typically see payback in 18–30 months
Free to use, no signup needed.
→ https://padelnomics.io/en/planner/?utm_source=reddit&utm_medium=social&utm_campaign=launch&utm_content=r_tennis
Happy to share data on any specific city or country if you're evaluating this
for your club.
```
---
## Facebook Groups — Padel Business / Deutschland
> Tone: casual, helpful. Shorter than Reddit posts.
**Title (if group allows):** Free padel facility ROI calculator — uses real market data
```
Hey everyone 👋
Built a free tool for anyone planning a padel facility. It models CAPEX,
OPEX, revenue, and gives you a 5-year P&L with payback period.
The difference from spreadsheet templates: it pre-fills with real data for
your city (actual rents, utility costs, permit fees, etc.) based on data
we're collecting across 17 countries.
No signup, no cost. Just real numbers.
→ https://padelnomics.io/en/planner/?utm_source=facebook&utm_medium=social&utm_campaign=launch&utm_content=fb_padel
Feedback welcome — especially from anyone who's been through the planning
process and knows what numbers actually matter.
```
---
## Posting Schedule
| Day | Platform | Post |
|-----|----------|------|
| Today | LinkedIn | Post #1 (Data Insight) |
| Today | r/padel | Calculator feedback post |
| Today | r/entrepreneur | "Bloomberg for padel" builder post |
| Today | 1–2 FB groups | Calculator share |
| Tomorrow | r/smallbusiness | Business planning tool post |
| Tomorrow | r/tennis | Tennis club angle |
| Day 3 | LinkedIn | Post #2 (Opportunity Angle) |
---
## Rules
- Never link-spam. One link per post, at the end.
- Engage with every comment for 24 hours after posting.
- If a post gets traction, reply with additional data points to keep it alive.
- Track which subreddits/groups drive actual signups via UTM params:
`?utm_source=reddit&utm_medium=social&utm_campaign=launch&utm_content=r_padel`
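Every campaign link follows this one UTM pattern, so it is worth generating the tagged URLs rather than hand-editing them; a small sketch (the `tag_url` helper is illustrative):

```python
from urllib.parse import urlencode, urlsplit, urlunsplit

def tag_url(base: str, source: str, content: str,
            medium: str = "social", campaign: str = "launch") -> str:
    """Append the campaign's UTM parameters to a landing-page URL."""
    parts = urlsplit(base)
    query = urlencode({
        "utm_source": source,
        "utm_medium": medium,
        "utm_campaign": campaign,
        "utm_content": content,
    })
    return urlunsplit(parts._replace(query=query))

print(tag_url("https://padelnomics.io/en/planner/", "reddit", "r_padel"))
# → https://padelnomics.io/en/planner/?utm_source=reddit&utm_medium=social&utm_campaign=launch&utm_content=r_padel
```

One call per channel keeps `utm_source`/`utm_content` consistent across every post.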


@@ -19,7 +19,7 @@ from pathlib import Path
 import niquests

 from ._shared import HTTP_TIMEOUT_SECONDS, run_extractor, setup_logging
-from .utils import get_last_cursor, landing_path, write_gzip_atomic
+from .utils import landing_path, skip_if_current, write_gzip_atomic

 logger = setup_logging("padelnomics.extract.census_usa")
@@ -73,10 +73,10 @@ def extract(
         return {"files_written": 0, "files_skipped": 1, "bytes_written": 0}

     # Skip if we already have data for this month (annual data, monthly cursor)
-    last_cursor = get_last_cursor(conn, EXTRACTOR_NAME)
-    if last_cursor == year_month:
+    skip = skip_if_current(conn, EXTRACTOR_NAME, year_month)
+    if skip:
         logger.info("already have data for %s — skipping", year_month)
-        return {"files_written": 0, "files_skipped": 1, "bytes_written": 0}
+        return skip

     year, month = year_month.split("/")
     url = f"{ACS_URL}&key={api_key}"


@@ -26,6 +26,10 @@ EUROSTAT_BASE_URL = "https://ec.europa.eu/eurostat/api/dissemination/statistics/
 # Dataset configs: filters fix dimension values, geo_dim/time_dim are iterated.
 # All other dimensions must either be in filters or have size=1.
 #
+# Optional `dataset_code` field: when present, used for the API URL instead of the dict key.
+# This allows multiple entries to share the same Eurostat dataset with different filters
+# (e.g. five prc_ppp_ind entries with different ppp_cat values).
 DATASETS: dict[str, dict] = {
     "urb_cpop1": {
         "filters": {"indic_ur": "DE1001V"},  # Population on 1 January, total
@@ -51,6 +55,59 @@ DATASETS: dict[str, dict] = {
         "geo_dim": "geo",
         "time_dim": "time",
     },
+    # ── Direct-value datasets (actual EUR figures) ───────────────────────────
+    "nrg_pc_205": {
+        # Electricity prices for non-household consumers, EUR/kWh, excl. taxes
+        "filters": {"freq": "S", "nrg_cons": "MWH500-1999", "currency": "EUR", "tax": "I_TAX"},
+        "geo_dim": "geo",
+        "time_dim": "time",
+    },
+    "nrg_pc_203": {
+        # Gas prices for non-household consumers, EUR/kWh, excl. taxes
+        "filters": {"freq": "S", "nrg_cons": "GJ1000-9999", "unit": "KWH", "currency": "EUR", "tax": "I_TAX"},
+        "geo_dim": "geo",
+        "time_dim": "time",
+    },
+    "lc_lci_lev": {
+        # Labour cost levels EUR/hour — NACE N (administrative/support services)
+        # D1_D4_MD5 = compensation of employees + taxes - subsidies (total labour cost)
+        "filters": {"lcstruct": "D1_D4_MD5", "nace_r2": "N", "unit": "EUR"},
+        "geo_dim": "geo",
+        "time_dim": "time",
+    },
+    # ── Price level indices (relative scaling, EU27=100) ─────────────────────
+    # Five entries share the prc_ppp_ind dataset with different ppp_cat filters.
+    # dataset_code points to the real API endpoint; the dict key is the landing filename.
+    "prc_ppp_ind_construction": {
+        "dataset_code": "prc_ppp_ind",
+        "filters": {"ppp_cat": "A050202", "na_item": "PLI_EU27_2020"},
+        "geo_dim": "geo",
+        "time_dim": "time",
+    },
+    "prc_ppp_ind_housing": {
+        "dataset_code": "prc_ppp_ind",
+        "filters": {"ppp_cat": "A0104", "na_item": "PLI_EU27_2020"},
+        "geo_dim": "geo",
+        "time_dim": "time",
+    },
+    "prc_ppp_ind_services": {
+        "dataset_code": "prc_ppp_ind",
+        "filters": {"ppp_cat": "P0201", "na_item": "PLI_EU27_2020"},
+        "geo_dim": "geo",
+        "time_dim": "time",
+    },
+    "prc_ppp_ind_misc": {
+        "dataset_code": "prc_ppp_ind",
+        "filters": {"ppp_cat": "A0112", "na_item": "PLI_EU27_2020"},
+        "geo_dim": "geo",
+        "time_dim": "time",
+    },
+    "prc_ppp_ind_government": {
+        "dataset_code": "prc_ppp_ind",
+        "filters": {"ppp_cat": "P0202", "na_item": "PLI_EU27_2020"},
+        "geo_dim": "geo",
+        "time_dim": "time",
+    },
 }
@@ -196,22 +253,25 @@ def extract(
     files_skipped = 0
     bytes_written_total = 0

-    for dataset_code, config in DATASETS.items():
-        url = f"{EUROSTAT_BASE_URL}/{dataset_code}?format=JSON&lang=EN"
+    for dataset_key, config in DATASETS.items():
+        # Use dataset_code (if set) for the API URL; fall back to the dict key.
+        # This lets multiple entries share one Eurostat dataset with different filters.
+        api_code = config.get("dataset_code", dataset_key)
+        url = f"{EUROSTAT_BASE_URL}/{api_code}?format=JSON&lang=EN"
         for key, val in config.get("filters", {}).items():
             url += f"&{key}={val}"

         dest_dir = landing_path(landing_dir, "eurostat", year, month)
-        dest = dest_dir / f"{dataset_code}.json.gz"
+        dest = dest_dir / f"{dataset_key}.json.gz"

-        logger.info("GET %s", dataset_code)
+        logger.info("GET %s", dataset_key)
         bytes_written = _fetch_with_etag(url, dest, session, config)
         if bytes_written > 0:
-            logger.info("%s updated — %s bytes compressed", dataset_code, f"{bytes_written:,}")
+            logger.info("%s updated — %s bytes compressed", dataset_key, f"{bytes_written:,}")
             files_written += 1
             bytes_written_total += bytes_written
         else:
-            logger.info("%s not modified (304)", dataset_code)
+            logger.info("%s not modified (304)", dataset_key)
             files_skipped += 1

     return {

@@ -19,7 +19,6 @@ Output: one JSON object per line, e.g.:
 import gzip
 import io
-import json
 import os
 import sqlite3
 import zipfile
@@ -28,7 +27,7 @@ from pathlib import Path
 import niquests

 from ._shared import HTTP_TIMEOUT_SECONDS, run_extractor, setup_logging
-from .utils import compress_jsonl_atomic, get_last_cursor, landing_path
+from .utils import landing_path, skip_if_current, write_jsonl_atomic

 logger = setup_logging("padelnomics.extract.geonames")
@@ -139,10 +138,10 @@ def extract(
         tmp.rename(dest)
         return {"files_written": 0, "files_skipped": 1, "bytes_written": 0}

-    last_cursor = get_last_cursor(conn, EXTRACTOR_NAME)
-    if last_cursor == year_month:
+    skip = skip_if_current(conn, EXTRACTOR_NAME, year_month)
+    if skip:
         logger.info("already have data for %s — skipping", year_month)
-        return {"files_written": 0, "files_skipped": 1, "bytes_written": 0}
+        return skip

     year, month = year_month.split("/")
@@ -168,11 +167,7 @@ def extract(
     dest_dir = landing_path(landing_dir, "geonames", year, month)
     dest = dest_dir / "cities_global.jsonl.gz"

-    working_path = dest.with_suffix(".working.jsonl")
-    with open(working_path, "w") as f:
-        for row in rows:
-            f.write(json.dumps(row, separators=(",", ":")) + "\n")
-    bytes_written = compress_jsonl_atomic(working_path, dest)
+    bytes_written = write_jsonl_atomic(dest, rows)
     logger.info("written %s bytes compressed", f"{bytes_written:,}")

     return {


@@ -17,7 +17,7 @@ from pathlib import Path
 import niquests

 from ._shared import HTTP_TIMEOUT_SECONDS, run_extractor, setup_logging
-from .utils import get_last_cursor
+from .utils import skip_if_current

 logger = setup_logging("padelnomics.extract.gisco")
@@ -45,10 +45,10 @@ def extract(
     session: niquests.Session,
 ) -> dict:
     """Download NUTS-2 GeoJSON. Skips if already run this month or file exists."""
-    last_cursor = get_last_cursor(conn, EXTRACTOR_NAME)
-    if last_cursor == year_month:
+    skip = skip_if_current(conn, EXTRACTOR_NAME, year_month)
+    if skip:
         logger.info("already ran for %s — skipping", year_month)
-        return {"files_written": 0, "files_skipped": 1, "bytes_written": 0}
+        return skip

     dest = landing_dir / DEST_REL
     if dest.exists():


@@ -21,7 +21,6 @@ Rate: 1 req / 2 s per IP (see docs/data-sources-inventory.md §1.2).
 Landing: {LANDING_DIR}/playtomic/{year}/{month}/tenants.jsonl.gz
 """

-import json
 import os
 import sqlite3
 import time
@@ -33,7 +32,7 @@ import niquests
 from ._shared import HTTP_TIMEOUT_SECONDS, run_extractor, setup_logging, ua_for_proxy
 from .proxy import load_proxy_tiers, make_tiered_cycler
-from .utils import compress_jsonl_atomic, landing_path
+from .utils import landing_path, write_jsonl_atomic

 logger = setup_logging("padelnomics.extract.playtomic_tenants")
@@ -215,11 +214,7 @@ def extract(
         time.sleep(THROTTLE_SECONDS)

     # Write each tenant as a JSONL line, then compress atomically
-    working_path = dest.with_suffix(".working.jsonl")
-    with open(working_path, "w") as f:
-        for tenant in all_tenants:
-            f.write(json.dumps(tenant, separators=(",", ":")) + "\n")
-    bytes_written = compress_jsonl_atomic(working_path, dest)
+    bytes_written = write_jsonl_atomic(dest, all_tenants)
     logger.info("%d unique venues -> %s", len(all_tenants), dest)

     return {


@@ -3,10 +3,9 @@
 Proxies are configured via environment variables. When unset, all functions
 return None/no-op — extractors fall back to direct requests.

-Three-tier escalation: free → datacenter → residential.
-  Tier 1 (free): WEBSHARE_DOWNLOAD_URL — auto-fetched from Webshare API
-  Tier 2 (datacenter): PROXY_URLS_DATACENTER — comma-separated paid DC proxies
-  Tier 3 (residential): PROXY_URLS_RESIDENTIAL — comma-separated paid residential proxies
+Two-tier escalation: datacenter → residential.
+  Tier 1 (datacenter): PROXY_URLS_DATACENTER — comma-separated paid DC proxies
+  Tier 2 (residential): PROXY_URLS_RESIDENTIAL — comma-separated paid residential proxies

 Tiered circuit breaker:
   Active tier is used until consecutive failures >= threshold, then escalates
@@ -69,22 +68,15 @@ def fetch_webshare_proxies(download_url: str, max_proxies: int = MAX_WEBSHARE_PR
 def load_proxy_tiers() -> list[list[str]]:
-    """Assemble proxy tiers in escalation order: free → datacenter → residential.
+    """Assemble proxy tiers in escalation order: datacenter → residential.

-    Tier 1 (free): fetched from WEBSHARE_DOWNLOAD_URL if set.
-    Tier 2 (datacenter): PROXY_URLS_DATACENTER (comma-separated).
-    Tier 3 (residential): PROXY_URLS_RESIDENTIAL (comma-separated).
+    Tier 1 (datacenter): PROXY_URLS_DATACENTER (comma-separated).
+    Tier 2 (residential): PROXY_URLS_RESIDENTIAL (comma-separated).

     Empty tiers are omitted. Returns [] if no proxies configured anywhere.
     """
     tiers: list[list[str]] = []
-    webshare_url = os.environ.get("WEBSHARE_DOWNLOAD_URL", "").strip()
-    if webshare_url:
-        free_proxies = fetch_webshare_proxies(webshare_url)
-        if free_proxies:
-            tiers.append(free_proxies)
     for var in ("PROXY_URLS_DATACENTER", "PROXY_URLS_RESIDENTIAL"):
         raw = os.environ.get(var, "")
         urls = [u.strip() for u in raw.split(",") if u.strip()]


@@ -101,6 +101,19 @@ def get_last_cursor(conn: sqlite3.Connection, extractor: str) -> str | None:
     return row["cursor_value"] if row else None


+_SKIP_RESULT = {"files_written": 0, "files_skipped": 1, "bytes_written": 0}
+
+
+def skip_if_current(conn: sqlite3.Connection, extractor: str, year_month: str) -> dict | None:
+    """Return an early-exit result dict if this extractor already ran for year_month.
+
+    Returns None when the extractor should proceed with extraction.
+    """
+    if get_last_cursor(conn, extractor) == year_month:
+        return _SKIP_RESULT
+    return None
+
+
 # ---------------------------------------------------------------------------
 # File I/O helpers
 # ---------------------------------------------------------------------------
@@ -176,6 +189,20 @@ def write_gzip_atomic(path: Path, data: bytes) -> int:
     return len(compressed)


+def write_jsonl_atomic(dest: Path, items: list[dict]) -> int:
+    """Write items as JSONL, then compress atomically to dest (.jsonl.gz).
+
+    Compresses the working-file → JSONL → gzip pattern into one call.
+    Returns compressed bytes written.
+    """
+    assert items, "items must not be empty"
+    working_path = dest.with_suffix(".working.jsonl")
+    with open(working_path, "w") as f:
+        for item in items:
+            f.write(json.dumps(item, separators=(",", ":")) + "\n")
+    return compress_jsonl_atomic(working_path, dest)
+
+
 def compress_jsonl_atomic(jsonl_path: Path, dest_path: Path) -> int:
     """Compress a JSONL working file to .jsonl.gz atomically, then delete the source.

@@ -33,10 +33,10 @@ do
     DUCKDB_PATH="${DUCKDB_PATH:-/data/padelnomics/lakehouse.duckdb}" \
     uv run --package padelnomics_extract extract

-    # Transform
+    # Transform — plan detects new/changed models; run only executes existing plans.
     LANDING_DIR="${LANDING_DIR:-/data/padelnomics/landing}" \
     DUCKDB_PATH="${DUCKDB_PATH:-/data/padelnomics/lakehouse.duckdb}" \
-    uv run --package sqlmesh_padelnomics sqlmesh run --select-model "serving.*"
+    uv run sqlmesh -p transform/sqlmesh_padelnomics plan prod --auto-apply

    # Export serving tables to analytics.duckdb (atomic swap).
    # The web app detects the inode change on next query — no restart needed.
View File

@@ -8,54 +8,67 @@
# entry — optional: function name if not "main" (default: "main")
# depends_on — optional: list of workflow names that must run first
# proxy_mode — optional: "round-robin" (default) or "sticky"
# description — optional: human-readable one-liner shown in the admin UI
[overpass]
module = "padelnomics_extract.overpass"
schedule = "monthly"
description = "Padel court locations from OpenStreetMap via Overpass API"
[overpass_tennis]
module = "padelnomics_extract.overpass_tennis"
schedule = "monthly"
description = "Tennis court locations from OpenStreetMap via Overpass API"
[eurostat]
module = "padelnomics_extract.eurostat"
schedule = "monthly"
description = "City population data from Eurostat Urban Audit"
[geonames]
module = "padelnomics_extract.geonames"
schedule = "monthly"
description = "Global city/town gazetteer from GeoNames (pop >= 1K)"
[playtomic_tenants]
module = "padelnomics_extract.playtomic_tenants"
schedule = "daily"
description = "Padel venue directory from Playtomic (names, locations, courts)"
[playtomic_availability]
module = "padelnomics_extract.playtomic_availability"
schedule = "daily"
depends_on = ["playtomic_tenants"]
description = "Morning availability snapshots — slot-level pricing per venue"
[playtomic_recheck]
module = "padelnomics_extract.playtomic_availability"
entry = "main_recheck"
schedule = "0,30 6-23 * * *"
depends_on = ["playtomic_availability"]
description = "Intraday availability rechecks for occupancy tracking"
[census_usa]
module = "padelnomics_extract.census_usa"
schedule = "monthly"
description = "US city/place population from Census Bureau ACS"
[census_usa_income]
module = "padelnomics_extract.census_usa_income"
schedule = "monthly"
description = "US county median household income from Census Bureau ACS"
[eurostat_city_labels]
module = "padelnomics_extract.eurostat_city_labels"
schedule = "monthly"
description = "City code-to-name mapping for Eurostat Urban Audit cities"
[ons_uk]
module = "padelnomics_extract.ons_uk"
schedule = "monthly"
description = "UK local authority population estimates from ONS"
[gisco]
module = "padelnomics_extract.gisco"
schedule = "monthly"
description = "EU geographic boundaries (NUTS2 polygons) from Eurostat GISCO"

View File

@@ -1,4 +1,35 @@
# Building a Padel Hall — Complete Guide
# Padel Hall — Question Bank & Gap Analysis
> **What this file is**: A structured question bank covering the full universe of questions a padel hall entrepreneur needs to answer — from concept to exit. It is **not** an article for publication.
>
> **Purpose**: Gap analysis — identify which questions Padelnomics already answers (planner, city articles, pipeline data, business plan PDF) and which are unanswered gaps we could fill to improve product value.
>
> **Coverage legend**:
> - `ANSWERED` — fully covered by the planner, city articles, or BP export
> - `PARTIAL` — partially addressed; notable gap or missing depth
> - `GAP` — not addressed at all; actionable opportunity
---
## Gap Analysis Summary
| Tier | Gap | Estimated Impact | Status |
|------|-----|-----------------|--------|
| 1 | Subsidies & grants (Germany) | High | Not in product; data exists in `research/padel-hall-economics.md` |
| 1 | Buyer segmentation (sports club / commercial / hotel / franchise) | High | Not in planner; segmentation table exists in research |
| 1 | Indoor vs outdoor decision framework | High | Planner models both; no comparison table or decision guide |
| 1 | OPEX benchmarks shown inline | Medium-High | Planner has inputs; defaults not visually benchmarked |
| 2 | Booking platform strategy (Playtomic vs Matchi vs custom) | Medium | Zero guidance; we scrape Playtomic so know it well |
| 2 | Depreciation & tax shield | Medium | All calcs pre-tax; Germany: 30% effective, 7yr courts |
| 2 | Legal & regulatory checklist (Germany) | Medium | Only permit cost line; Bauantrag, TA Lärm, GmbH etc. missing |
| 2 | Court supplier selection framework | Medium | Supplier directory exists; no evaluation criteria |
| 2 | Staffing plan template | Medium | BP has narrative field; no structured role × FTE × salary |
| 3 | Zero-court location pages (white-space pSEO) | High data value | `location_opportunity_profile` scores them; none published |
| 3 | Pre-opening / marketing playbook | Low-Medium | Out of scope; static article possible |
| 3 | Catchment area isochrones (drive-time) | Low | Heavy lift; `nearest_padel_court_km` is straight-line only |
| 3 | Trend/fad risk quantification | Low | Inherently speculative |
---
## Table of Contents
@@ -16,6 +47,8 @@
### Market & Demand
> **COVERAGE: PARTIAL** — Venue counts, density (venues/100K), Market Score, and Opportunity Score per city are all answered by pipeline data (`location_opportunity_profile`) and surfaced in city articles. Missing: actual player counts, competitor utilization rates, household income / age demographics for the catchment area. No drive-time isochrone analysis (Tier 3 gap).
- How many padel players are in your target area? Is the sport growing locally or are you betting on future adoption?
- What's the competitive landscape — how many existing courts within a 20–30 minute drive radius? Are they full? What are their peak/off-peak utilization rates?
- What's the demographic profile of your catchment area (income, age, sports participation)?
@@ -23,6 +56,8 @@
### Site & Location
> **COVERAGE: GAP** — The planner has a rent/land cost input and a `own` toggle for buy vs lease, but there is no guidance on site selection criteria (ceiling height, column spacing, zoning classification, parking ratios). A static article or checklist would cover this. See also Tier 2 gap: legal/regulatory checklist.
- Do you want to build new (greenfield), convert an existing building (warehouse, industrial hall), or add to an existing sports complex?
- What zoning and building regulations apply? Is a padel hall classified as sports, leisure, commercial?
- What's the required ceiling height? (Minimum ~8–10m for indoor padel, ideally 10m+)
@@ -30,6 +65,8 @@
### Product & Scope
> **COVERAGE: PARTIAL** — Court count is fully answered (planner supports 1–12 courts, sensitivity analysis included). Ancillary revenue streams (coaching, F&B, pro shop, events, memberships, corporate) are modelled. Indoor vs outdoor is modelled but there is no structured decision framework comparing CAPEX, revenue ceiling, seasonal risk, noise, and permits (Tier 1 gap #3). Quality level / positioning is not addressed.
- How many courts? (Typically 4–8 is the sweet spot for a standalone hall; fewer than 4 struggles with profitability, more than 8 requires very strong demand)
- Indoor only, outdoor, or hybrid with a retractable/seasonal structure?
- What ancillary offerings: pro shop, café/bar/lounge, fitness area, changing rooms, padel school/academy?
@@ -37,6 +74,8 @@
### Financial
> **COVERAGE: ANSWERED** — All four questions are directly answered by the planner: equity/debt split, rent/land cost, real peak/off-peak prices per city (from Playtomic via `planner_defaults`), utilization ramp curve (Year 1–5), and breakeven utilization (sensitivity grid).
- What's your total budget, and what's the split between equity and debt?
- What rental or land purchase cost can you sustain?
- What are realistic court booking prices in your market?
@@ -45,6 +84,8 @@
### Legal & Organizational
> **COVERAGE: GAP** — Only a permit cost line item exists in CAPEX. No entity guidance (GmbH vs UG vs Verein), no permit checklist, no license types, no insurance guidance. A Germany-first legal/regulatory checklist (Bauantrag, Nutzungsänderung, TA Lärm, Gewerbeerlaubnis, §4 Nr. 22 UStG sports VAT exemption) would be high-value static content (Tier 2 gap #7). Buyer segmentation (sports club vs. commercial) affects entity choice and grant eligibility (Tier 1 gap #2).
- What legal entity will you use?
- Do you need partners (operational, financial, franchise)?
- What permits, licenses, and insurance do you need?
@@ -56,6 +97,10 @@
### Phase 1: Feasibility & Concept (Month 1–3)
> **COVERAGE: ANSWERED** — This phase is fully supported. Market research → city articles (venue density, Market Score, Opportunity Score). Concept development → planner inputs. Location scouting → city articles + planner. Preliminary financial model → planner. Go/no-go → planner output (EBITDA, IRR, NPV).
>
> Missing: Buyer segmentation (Tier 1 gap #2) — the planner treats all users identically. A "project type" selector (sports club / commercial / hotel / franchise) would adjust CAPEX defaults, grant eligibility, and entity guidance.
1. **Market research**: Survey local players, visit competing facilities, analyze demographics within a 15–20 minute drive radius. Talk to padel coaches and club organizers.
2. **Concept development**: Define your number of courts, target audience, service level, and ancillary revenue streams.
3. **Location scouting**: Identify 3–5 candidate sites. Evaluate each on accessibility, visibility, size, ceiling height (if conversion), zoning, and cost.
@@ -64,6 +109,8 @@
### Phase 2: Planning & Design (Month 3–6)
> **COVERAGE: PARTIAL** — Detailed financial model (step 9) and financing (step 10) are fully answered by the planner (DSCR, covenants, sensitivity). Court supplier selection (step 8) has a partial answer: a supplier directory exists in the product but there is no evaluation framework (Tier 2 gap #8: origin, price/court, warranty, glass type, installation, lead time). Permit process (step 11) is a gap (Tier 2 gap #7). Site security and architect hiring are operational advice, out of scope.
6. **Secure the site**: Sign a letter of intent or option agreement for purchase or lease.
7. **Hire an architect** experienced in sports facilities. They'll produce floor plans, elevations, structural assessments (for conversions), and MEP (mechanical, electrical, plumbing) layouts.
8. **Padel court supplier selection**: Get quotes from manufacturers (e.g., Mondo, Padelcreations, MejorSet). Courts come as prefabricated modules — coordinate dimensions, drainage, lighting, and glass specifications with your architect.
@@ -73,6 +120,8 @@
### Phase 3: Construction / Conversion (Month 6–12)
> **COVERAGE: PARTIAL** — Booking system (step 15) is partially addressed: booking system cost is a planner input, but there is no guidance on platform selection (Playtomic vs Matchi vs custom) despite this being a real decision with revenue and data implications (Tier 2 gap #5). Construction, installation, fit-out, and inspections are operational steps outside Padelnomics' scope.
12. **Tender and contract construction**: Either a general contractor or construction management approach. Key trades: structural/civil, flooring, HVAC (critical for indoor comfort), electrical (LED court lighting to specific lux standards), plumbing.
13. **Install padel courts**: Usually done after the building shell is complete. Courts take 2–4 weeks to install per batch.
14. **Fit-out ancillary areas**: Reception, changing rooms, lounge/bar, pro shop.
@@ -81,6 +130,8 @@
### Phase 4: Pre-Opening (Month 10–13)
> **COVERAGE: PARTIAL** — Staffing plan (step 17): the BP export has a `staffing_plan` narrative field, but there is no structured template with role × FTE × salary defaults. Research benchmarks (€9.9–14.2K/month for 2–3 FTE + manager) could pre-fill this based on court count (Tier 2 gap #9). Marketing playbook (step 18): not addressed; could be a static article (Tier 3 gap #11). Soft/grand opening: out of scope.
17. **Hire staff**: Manager, reception, coaches, cleaning, potentially F&B staff.
18. **Marketing launch**: Social media, local partnerships (sports clubs, corporate wellness), opening event, introductory pricing.
19. **Soft opening**: Invite local players, influencers, press for a trial period.
@@ -88,6 +139,8 @@
### Phase 5: Operations & Optimization (Ongoing)
> **COVERAGE: PARTIAL** — Utilization monitoring and financial review are covered by the planner model. Upsell streams (coaching, equipment, F&B, memberships) are all revenue line items. Community building and dynamic pricing strategy are not addressed — these are operational, not data-driven, and are out of scope.
21. **Monitor utilization** by court, time slot, and day. Adjust pricing dynamically.
22. **Build community**: Leagues, tournaments, social events, corporate bookings.
23. **Upsell**: Coaching, equipment, food/beverage, memberships.
@@ -97,6 +150,8 @@
## Plans You Need to Create
> **COVERAGE: PARTIAL** — Business Plan and Financial Plan are both fully answered (planner + BP PDF export with 15+ narrative sections). Architectural Plans, Marketing Plan, and Legal/Permit Plan are outside the product's scope. Operational Plan is partial: staffing and booking system inputs exist but lack depth (Tier 2 gaps #5, #9).
- **Business Plan** — the master document covering market analysis, concept, operations plan, management team, and financials. This is what banks and investors want to see.
- **Architectural Plans** — floor plans, cross-sections, elevations, structural drawings, MEP plans. Required for permits and construction.
- **Financial Plan** — the core of your business plan. Includes investment budget, funding plan, P&L forecast (3–5 years), cash flow forecast, and sensitivity analysis.
@@ -112,6 +167,8 @@
### Investment Budget (CAPEX)
> **COVERAGE: ANSWERED** — The planner covers all 15+ CAPEX line items for both lease (`rent`) and purchase (`own`) scenarios. Subsidies and grants are **not** modelled (Tier 1 gap #1): `research/padel-hall-economics.md` documents Landessportbund grants (35% for sports clubs), KfW 150 loans, and a real example of €258K → €167K net after grant (padel-court.de). A "Fördermittel" (grants) section in the BP or a callout in DE city articles would surface this.
| Item | Estimate |
|---|---|
| Building lease deposit or land | €50,000–€200,000 |
@@ -131,6 +188,8 @@ Realistic midpoint for a solid 6-court hall: **~€1.2–1.5M**.
### Revenue Model
> **COVERAGE: ANSWERED** — Court utilization × price per hour is the core model. Real peak/off-peak prices per city are pre-filled via `planner_defaults` from Playtomic data. Ramp curve (Year 1–5 utilization), 6 ancillary streams, and monthly seasonal curve are all modelled.
Core driver: **court utilization × price per hour**.
- 6 courts × 15 bookable hours/day × 365 days = **32,850 court-hours/year** (theoretical max)
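The court-hour arithmetic above can be reproduced in a few lines — a minimal sketch, where the utilization and blended price are illustrative assumptions, not planner defaults:

```python
# Theoretical max court-hours, then revenue at an assumed utilization.
courts = 6
bookable_hours_per_day = 15
days_per_year = 365

max_court_hours = courts * bookable_hours_per_day * days_per_year
utilization = 0.50           # share of bookable hours actually sold (assumption)
avg_price_per_hour = 30.0    # EUR, blended peak/off-peak (assumption)

annual_revenue = max_court_hours * utilization * avg_price_per_hour
print(max_court_hours)   # 32850, matching the figure above
print(annual_revenue)
```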
@@ -149,6 +208,8 @@ Core driver: **court utilization × price per hour**.
### Operating Costs (OPEX)
> **COVERAGE: PARTIAL** — All OPEX line items exist as planner inputs. The defaults are reasonable but are not visually benchmarked against market data (Tier 1 gap #4). Research benchmarks from `research/padel-hall-economics.md` §7: electricity €2.5–4.5K/month, staff €9.9–14.2K/month for 2–3 FTE + manager, rent €8–15K/month. Showing "typical range for your market" next to each OPEX input field would improve trust in the defaults.
| Cost Item | Year 1 | Year 2 | Year 3 |
|---|---|---|---|
| Rent / lease | €120k | €123k | €127k |
@@ -164,6 +225,8 @@ Core driver: **court utilization × price per hour**.
### Profitability
> **COVERAGE: ANSWERED** — EBITDA, EBITDA margin, debt service, and free cash flow after debt are all computed by the planner for all 60 months.
| Metric | Year 1 | Year 2 | Year 3 |
|---|---|---|---|
| **EBITDA** | €310k | €577k | €759k |
@@ -173,6 +236,8 @@ Core driver: **court utilization × price per hour**.
### Key Metrics to Track
> **COVERAGE: ANSWERED** — Payback period, IRR (equity + project), NPV, MOIC, DSCR per year, breakeven utilization, and revenue per available hour are all computed and displayed.
- **Payback period**: Typically 3–5 years for a well-run padel hall
- **ROI on equity**: If you put in €500k equity and generate €300k+ annual free cash flow by year 3, that's a 60%+ cash-on-cash return
- **Breakeven utilization**: Usually around 35–40% — below which you lose money
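Where the breakeven figure comes from: divide annual fixed costs by the contribution earned per court-hour times the theoretical maximum of court-hours. A sketch with assumed figures:

```python
# Breakeven utilization = fixed costs / (contribution per hour × max court-hours).
# All figures are illustrative assumptions.
fixed_costs = 360_000.0        # EUR/year: rent, base staffing, debt service
price_per_hour = 32.0          # EUR, blended
variable_cost_per_hour = 2.0   # EUR: energy, consumables, booking fees
max_court_hours = 32_850       # e.g. 6 courts × 15 h/day × 365 days

contribution = price_per_hour - variable_cost_per_hour
breakeven_utilization = fixed_costs / (contribution * max_court_hours)
print(round(breakeven_utilization, 3))   # ≈ 0.365, i.e. ~36.5% of bookable hours
```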
@@ -180,12 +245,18 @@ Core driver: **court utilization × price per hour**.
### Sensitivity Analysis
> **COVERAGE: ANSWERED** — 12-step utilization sensitivity and 8-step price sensitivity are both shown as grids, each including DSCR values.
Model what happens if utilization is 10% lower than planned, if the average price drops by €5, or if construction costs overrun by 20%. This is what banks want to see — that you survive the downside.
---
## How to Decide Where to Build
> **COVERAGE: PARTIAL overall** — The product answers competition mapping (venue density, Opportunity Score) and rent/cost considerations (planner input). Missing: drive-time catchment analysis (Tier 3 gap #12 — would need isochrone API), accessibility/visibility/building suitability assessment (static checklist possible), growth trajectory (no new-development data), and regulatory environment (Tier 2 gap #7).
>
> **Tier 3 opportunity**: `location_opportunity_profile` scores thousands of GeoNames locations including zero-court towns. Only venues with existing courts get a public article. Generating pSEO pages for top-scoring zero-court locations would surface "build here" recommendations (white-space pages).
1. **Catchment area analysis**: Draw a 15-minute and 30-minute drive-time radius around candidate sites. Analyze population density, household income, age distribution (25–55 is the core padel demographic), and existing sports participation rates.
2. **Competition mapping**: Map every existing padel facility within 30 minutes. Call them, check their booking systems — are courts booked out at peak? If competitors are running at 80%+ utilization, that's a strong signal of unmet demand.
@@ -208,70 +279,104 @@ Model what happens if utilization is 10% lower than planned, if the average pric
### NPV & IRR
> **COVERAGE: ANSWERED** — Both equity IRR and project IRR are computed. NPV is shown with the WACC input. Hurdle rate is a user input.
Discount your projected free cash flows at your WACC (or required return on equity if all-equity financed) to get a net present value. The IRR tells you whether the project clears your hurdle rate. For a padel hall, you'd typically want an unlevered IRR of 15–25% to justify the risk of a single-asset, operationally intensive business. Compare this against alternative uses of your capital.
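The discounting step can be written directly. A minimal sketch, where the cash-flow vector and discount rate are illustrative (year 0 is the negative initial investment):

```python
def npv(rate: float, cashflows: list[float]) -> float:
    """Net present value; cashflows[0] is year 0 (undiscounted)."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

# Illustrative: €1.4M upfront, ramping free cash flows, terminal value in year 5.
flows = [-1_400_000, 150_000, 350_000, 500_000, 550_000, 600_000 + 2_800_000]
print(round(npv(0.10, flows)))
# The IRR is the rate at which this NPV crosses zero.
```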
### WACC & Cost of Capital
> **COVERAGE: ANSWERED** — WACC is a planner input used in NPV calculations. Debt cost and equity cost are separately configurable.
If you're blending debt and equity, calculate your weighted average cost of capital properly. Bank debt for a sports facility might run 4–7% depending on jurisdiction and collateral. Your equity cost should reflect the illiquidity premium and operational risk — this isn't a passive real estate investment, it's an operating business. A reasonable cost of equity might be 12–20%.
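The blend is the standard weighted-average formula with the tax shield applied to the debt leg — a sketch where the capital amounts, rates, and tax rate are assumptions:

```python
def wacc(equity: float, debt: float, cost_equity: float,
         cost_debt: float, tax_rate: float) -> float:
    """Weighted average cost of capital with a tax shield on debt interest."""
    total = equity + debt
    return (equity / total) * cost_equity + (debt / total) * cost_debt * (1 - tax_rate)

# Illustrative: €500K equity at 15%, €900K bank debt at 5.5%, 30% tax rate.
print(round(wacc(500_000, 900_000, 0.15, 0.055, 0.30), 4))   # 0.0783
```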
### Terminal Value
> **COVERAGE: ANSWERED** — Terminal value is computed as EBITDA × exit multiple at the end of the hold period. MOIC and value bridge are displayed.
If you model 5 years of explicit cash flows, you need a terminal value. You can use a perpetuity growth model (FCF year 5 × (1+g) / (WACC − g)) or an exit multiple. For the exit multiple approach, think about what a buyer would pay — likely 4–7x EBITDA for a mature, well-run single-location padel hall, potentially higher if it's part of a multi-site rollout story.
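Both approaches reduce to one-liners; a sketch with illustrative inputs (the perpetuity form requires WACC > g):

```python
def tv_perpetuity(fcf_final: float, g: float, wacc: float) -> float:
    """Gordon growth terminal value: FCF_final × (1 + g) / (WACC − g)."""
    return fcf_final * (1 + g) / (wacc - g)

def tv_exit_multiple(ebitda_final: float, multiple: float) -> float:
    """Terminal value as a multiple of final-year EBITDA."""
    return ebitda_final * multiple

print(round(tv_perpetuity(600_000, 0.02, 0.10)))   # 612000 / 0.08 = 7650000
print(tv_exit_multiple(750_000, 5.0))              # 3750000.0
```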
### Lease vs. Buy
> **COVERAGE: ANSWERED** — The `own` toggle in the planner changes the entire CAPEX/OPEX structure: land purchase replaces lease deposit, mortgage replaces rent, and property appreciation is modelled in terminal value.
A critical capital allocation decision. Buying the property ties up far more capital but gives you residual asset value and eliminates landlord risk. Leasing preserves capital for operations and expansion but exposes you to rent increases and lease termination risk. Model both scenarios and compare the risk-adjusted NPV. Also consider sale-and-leaseback if you build on owned land.
### Operating Leverage
> **COVERAGE: ANSWERED** — The sensitivity grids explicitly show how a 10% utilization swing affects EBITDA and DSCR.
A padel hall has high fixed costs (rent, staff base, debt service) and relatively low variable costs. This means profitability is extremely sensitive to utilization. Model the operating leverage explicitly — a 10% swing in utilization might cause a 25–30% swing in EBITDA. This is both the opportunity and the risk.
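The amplification is easy to demonstrate: with mostly fixed costs, a 10% relative drop in utilization produces a far larger relative drop in EBITDA. A sketch under assumed figures:

```python
price, variable_cost = 32.0, 2.0   # EUR per court-hour (assumptions)
max_court_hours = 32_850           # theoretical annual maximum
fixed_costs = 360_000.0            # EUR/year: rent, base staff, debt service

def ebitda(utilization: float) -> float:
    # Contribution on sold hours minus the fixed cost block.
    return (price - variable_cost) * max_court_hours * utilization - fixed_costs

base = ebitda(0.55)
stressed = ebitda(0.55 * 0.9)      # 10% relative drop in utilization
print(round((base - stressed) / base, 2))   # ~0.3: a ~30% EBITDA swing
```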
### Depreciation & Tax Shield
> **COVERAGE: GAP** — All planner calculations are pre-tax (Tier 2 gap #6). Adding a depreciation schedule and effective tax rate would materially improve the financial model for Germany: 7-year depreciation for courts/equipment, ~30% effective tax rate (15% KSt + 14% GewSt). This would require jurisdiction selection (start with Germany only). Non-trivial but the most common user geography.
Padel courts depreciate over 7–10 years, building fit-out over 10–15 years, equipment over 3–5 years. The depreciation tax shield is meaningful. Interest expense on debt is also tax-deductible. Model your effective tax rate and the present value of these shields — they improve your after-tax returns materially.
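The shield's value can be approximated as the present value of an annuity of annual depreciation × tax rate. A sketch, where the asset cost, life, tax rate, and discount rate are all assumptions:

```python
def tax_shield_pv(asset_cost: float, life_years: int,
                  tax_rate: float, discount_rate: float) -> float:
    """PV of the straight-line depreciation tax shield (annuity formula)."""
    annual_shield = (asset_cost / life_years) * tax_rate
    annuity_factor = (1 - (1 + discount_rate) ** -life_years) / discount_rate
    return annual_shield * annuity_factor

# Illustrative: €300K of courts depreciated over 7 years, 30% tax, 8% discount.
print(round(tax_shield_pv(300_000, 7, 0.30, 0.08)))
```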
### Working Capital Cycle
> **COVERAGE: ANSWERED** — Pre-opening cash burn and ramp-up period are modelled in the 60-month cash flow. Working capital reserve is a CAPEX line item.
Padel halls are generally working-capital-light (customers pay at booking or on arrival, you pay suppliers on 30–60 day terms). But model the initial ramp-up period where you're carrying costs before revenue reaches steady state. The pre-opening cash burn and first 6–12 months of sub-breakeven operation is where most of your working capital risk sits.
### Scenario & Sensitivity Analysis
> **COVERAGE: ANSWERED** — Utilization sensitivity (12 steps) and price sensitivity (8 steps) grids are shown, both with DSCR. Bear/base/bull narrative is covered in the BP export.
Model three scenarios (bear/base/bull) varying utilization, pricing, and cost overruns simultaneously. Identify the breakeven utilization rate precisely. A Monte Carlo simulation on the key variables (utilization, average price, construction cost, ramp-up speed) gives you a probability distribution of outcomes rather than a single point estimate.
### Exit Strategy & Valuation
> **COVERAGE: ANSWERED** — Hold period, exit EBITDA multiple, terminal value, MOIC, and value bridge are all displayed in the planner.
Think about this upfront. Are you building to hold and cash-flow, or building to sell to a consolidator or franchise operator? The exit multiple depends heavily on whether you've built a transferable business (brand, systems, trained staff, long lease) or an owner-dependent operation. Multi-site operators and franchise groups trade at higher multiples (6–10x EBITDA) than single sites.
### Optionality Value
> **COVERAGE: GAP** — Real option value (second location, franchise, repurposing) is mentioned in the BP narrative but not quantified. Out of scope for the planner; noting as a caveat in the BP export text would be sufficient.
A successful first hall gives you the option to expand — second location, franchise model, or selling the playbook. This real option has value that a static DCF doesn't capture. Similarly, if you own the land/building, you have conversion optionality (the building could be repurposed if padel demand fades).
### Counterparty & Concentration Risk
> **COVERAGE: PARTIAL** — The planner models this implicitly (single-site, single-sport), and DSCR warnings flag over-leverage. No explicit counterparty risk section. Mentioning it in the BP risk narrative would be low-effort coverage.
You're exposed to a single landlord (lease risk), a single location (demand risk), and potentially a single sport (trend risk). A bank or sophisticated investor will flag all three. Mitigants include long lease terms with caps on escalation, diversified revenue streams (F&B, events, coaching), and contractual protections.
### Subsidies & Grants
> **COVERAGE: GAP — Tier 1 priority.** `research/padel-hall-economics.md` documents: Landessportbund grants (up to 35% CAPEX for registered sports clubs), KfW 150 low-interest loans, and a worked example: €258K gross → €167K net CAPEX after grant. The planner has no grants input. Quick wins: (a) add a "Fördermittel" accordion section to DE city articles; (b) add a grant percentage input to the planner CAPEX section (reduces total investment and boosts IRR). Note: grant eligibility depends on buyer type (Tier 1 gap #2) — sports clubs qualify, commercial operators typically do not.
Many municipalities and national sports bodies offer grants or subsidized loans for sports infrastructure. In some European countries, this can cover 10–30% of CAPEX. Factor this into your funding plan — it's essentially free equity that boosts your returns.
### VAT & Tax Structuring
> **COVERAGE: GAP** — Not modelled. Germany-specific: court rental may qualify for §4 Nr. 22 UStG sports VAT exemption (0% VAT) if operated by a non-commercial entity; commercial operators pay 19% VAT on court rental. F&B is 19% (or 7% eat-in). Getting this wrong materially affects revenue net-of-VAT. Worth a callout in the legal/regulatory article (Tier 2 gap #7).
Depending on your jurisdiction, court rental may be VAT-exempt or reduced-rate (sports exemption), while F&B is standard-rated. This affects pricing strategy and cash flow. The entity structure (single GmbH, holding structure, partnership) has implications for profit extraction, liability, and eventual exit taxation. Worth getting tax advice early.
### Insurance & Business Interruption
> **COVERAGE: PARTIAL** — Insurance is a planner OPEX line item. No guidance on coverage types or BI insurance sizing. Low priority to expand.
Price in comprehensive insurance — property, liability, business interruption. A fire or structural issue that shuts you down for 3 months could be existential without BI coverage. This is a real cost that's often underestimated.
### Covenant Compliance
> **COVERAGE: ANSWERED** — DSCR is computed for each of the 5 years and shown with a warning band. LTV warnings are also displayed.
If you take bank debt, you'll likely face covenants — DSCR (debt service coverage ratio) minimums of 1.2–1.5x, leverage caps, possibly revenue milestones. Model your covenant headroom explicitly. Breaching a covenant in year 1 during ramp-up is a real risk if you've over-leveraged.
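Covenant headroom can be checked per year with a trivial ratio; a sketch where the covenant level, cash flows, and debt service are illustrative:

```python
def dscr(cfads: float, debt_service: float) -> float:
    """Debt service coverage ratio; CFADS = cash flow available for debt service."""
    return cfads / debt_service

covenant = 1.25
debt_service = 210_000.0
for year, cfads in enumerate([180_000.0, 320_000.0, 420_000.0], start=1):
    ratio = dscr(cfads, debt_service)
    print(year, round(ratio, 2), "OK" if ratio >= covenant else "BREACH")
# Year 1 breaches during ramp-up — exactly the over-leverage risk described above.
```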
### Inflation Sensitivity
> **COVERAGE: ANSWERED** — The planner has separate `revenue_growth_rate` and `opex_growth_rate` inputs, allowing asymmetric inflation scenarios.
Energy costs, staff wages, and maintenance all inflate. Can you pass these through via price increases without killing utilization? Model a scenario where costs inflate at 3–5% but you can only raise prices by 2–3%.
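Compounding that asymmetry for a few years shows the margin squeeze directly — a sketch where starting revenue, OPEX, and both growth rates are assumptions:

```python
revenue, opex = 1_000_000.0, 700_000.0   # year-0 levels (assumptions)
rev_growth, opex_growth = 0.02, 0.04     # prices +2%/yr, costs +4%/yr

margins = []
for year in range(1, 6):
    revenue *= 1 + rev_growth
    opex *= 1 + opex_growth
    margins.append(revenue - opex)
    print(year, round(revenue - opex))
# The margin shrinks every year even though revenue keeps growing.
```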
### Residual / Liquidation Value
> **COVERAGE: PARTIAL** — Terminal/exit value is modelled (EBITDA multiple). A true liquidation scenario (courts resale, lease termination penalties, building write-off) is not separately modelled. Sufficient for the current product.
In a downside scenario, what are your assets worth? Padel courts have some resale value. Building improvements are largely sunk. If you've leased, your downside is limited to equity invested plus any personal guarantees. If you've bought property, the real estate retains value but may take time to sell. Model the liquidation scenario honestly.
---
@@ -280,24 +385,34 @@ In a downside scenario, what are your assets worth? Padel courts have some resal
### Existential Risks
> **COVERAGE: PARTIAL** — Trend/fad risk is acknowledged in the BP narrative but not quantified (Tier 3 gap #13). FIP/Playtomic data (7,187 new courts globally in 2024, +26% YoY new clubs) exists but long-term quantification is inherently speculative. Force majeure/pandemic risk is not addressed; a reserve fund input (CAPEX working capital) provides partial mitigation modelling.
- **Trend / Fad Risk**: Padel is booming now, but so did squash in the 1980s. You're locking in a 10–15 year investment thesis on a sport that may plateau or decline. The key question is whether padel reaches self-sustaining critical mass in your market or stays a novelty. If utilization drops from 65% to 35% in year 5 because the hype fades, your entire model breaks. This is largely unhedgeable.
- **Force Majeure / Pandemic Risk**: COVID shut down indoor sports facilities for months. Insurance may not cover it. Having enough cash reserves or credit facilities to survive 3–6 months of zero revenue is prudent.
### Construction & Development Risks
> **COVERAGE: PARTIAL** — A contingency/overrun percentage is a planner CAPEX input. Delay cost (carrying costs during construction) is not explicitly modelled.
- **Construction Cost Overruns & Delays**: Sports facility builds routinely overrun by 15–30%. Every month of delay is a month of carrying costs (rent, debt service, staff already hired) with zero revenue. Build a contingency buffer of 15–20% of CAPEX minimum and negotiate fixed-price construction contracts where possible.
### Property & Lease Risks
> **COVERAGE: GAP** — No lease-term inputs or landlord risk guidance. The `own` toggle handles the buy scenario. A callout in the BP template about minimum lease length (15+ years, renewal options) would be useful but is low priority.
- **Landlord Risk**: If you're leasing, you're spending €500k+ fitting out someone else's building. What happens if the landlord sells, goes bankrupt, or refuses to renew? You need a long lease (15+ years), with options to renew, and ideally a step-in right or compensation clause for tenant improvements.
### Competitive Risks
> **COVERAGE: PARTIAL** — City articles show existing venue density and Opportunity Score. The planner does not model a "competitor opens nearby" scenario. A simple sensitivity scenario (utilization drop) is the best proxy available in the current model.
- **Cannibalization from New Entrants**: Your success is visible — full courts, long waitlists. This attracts competitors. Someone opens a new hall 10 minutes away, and your utilization drops from 70% to 50%. There's no real moat in padel besides location, community loyalty, and service quality. Model what happens when a competitor opens nearby in year 3.
### Operational Risks
> **COVERAGE: PARTIAL** — Court maintenance OPEX and maintenance reserve are planner inputs. F&B, staffing, and booking platform risks are not addressed. See Tier 2 gaps #5 (booking platform strategy) and #9 (staffing plan). Seasonality is fully modelled (12-month outdoor seasonal curve; monthly cash flow).
- **Key Person Dependency**: If the whole operation depends on one founder-operator or one star coach who brings all the members, that's a fragility. Illness, burnout, or departure can crater the business.
- **Staff Retention & Labor Market**: Good facility managers, coaches, and front-desk staff with a hospitality mindset are hard to find and keep. Turnover is expensive and disruptive. In tight labor markets, wage pressure can erode margins.
@@ -310,6 +425,8 @@ In a downside scenario, what are your assets worth? Padel courts have some resal
### Financial Risks
> **COVERAGE: PARTIAL** — Energy volatility: energy OPEX is a modelled input with growth rate, but no locking/hedging guidance. Financing environment: debt rate is a planner input; stress-test at +2% is covered by the sensitivity grid indirectly. Personal guarantee and customer concentration: not addressed (out of scope for data-driven product). Inflation pass-through: answered (separate revenue vs OPEX growth rates).
- **Energy Price Volatility**: Indoor padel halls consume significant energy. Energy costs spiking can destroy margins. Consider locking in energy contracts, investing in solar panels, or using LED lighting and efficient HVAC to reduce exposure.
- **Financing Environment**: If interest rates rise between when you plan the project and when you draw down the loan, your debt service costs increase. Lock in rates where possible, or stress-test your model at rates 2% higher than current.
@@ -322,22 +439,32 @@ In a downside scenario, what are your assets worth? Padel courts have some resal
### Regulatory & Legal Risks
> **COVERAGE: GAP — Tier 2 priority.** Noise complaints (TA Lärm), injury liability, and permit risks are all unaddressed. A Germany-first regulatory checklist article would cover: Bauantrag, Nutzungsänderung, TA Lärm compliance, GmbH vs UG formation, Gewerbeerlaubnis, §4 Nr. 22 UStG sports VAT, and Gaststättengesetz (liquor license). High value for Phase 1/2 users who are evaluating feasibility.
- **Noise Complaints**: Padel is loud — the ball hitting glass walls generates significant noise. Neighbors can complain and municipal authorities can impose operating hour restrictions or require expensive sound mitigation. Check local noise ordinances thoroughly before committing.
- **Injury Liability**: Padel involves glass walls, fast-moving balls, and quick lateral movement. Player injuries happen. Proper insurance, waiver systems, and court maintenance protocols are essential.
### Technology & Platform Risks
> **COVERAGE: GAP — Tier 2 priority.** Booking platform dependency is a real decision point for operators (Playtomic commission ~15-20%, data ownership implications, competitor steering risk). We scrape Playtomic and know it intimately. A standalone article "Playtomic vs Matchi vs eigenes System" or a section in the BP template would address this. The booking system commission rate is already a planner input — we could link to a decision guide from there.
- **Booking Platform Dependency**: If you rely on a third-party booking platform like Playtomic, you're giving them access to your customer relationships and paying commission. They could raise fees, change terms, or steer demand to competitors.
### Reputational Risks
> **COVERAGE: GAP** — Not addressed. Out of scope for a data-driven product; operational advice.
- **Brand / Reputation Risk**: One viral negative review, a hygiene issue, a safety incident, or a social media complaint can disproportionately hurt a local leisure business.
### Currency & External Risks
> **COVERAGE: GAP** — FX risk from Spanish/Italian manufacturers is not modelled. Minor; most German buyers pay in EUR. Note in BP template as a caveat if importing outside Eurozone.
- **Currency Risk**: Relevant if importing courts or equipment from another currency zone — padel court manufacturers are often Spanish or Italian, so FX moves can affect CAPEX if you're outside the Eurozone.
### Opportunity Cost
> **COVERAGE: PARTIAL** — IRR and NPV implicitly address opportunity cost (you enter the hurdle rate as WACC/cost of equity). No explicit comparison against passive investment alternatives is shown. Sufficient for current product.
The capital, time, and energy you put into this project could go elsewhere. If you could earn 8-10% passively in diversified investments, a padel hall needs to deliver meaningfully more on a risk-adjusted basis to justify the concentration, illiquidity, and personal time commitment.
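The hurdle-rate logic can be made concrete with a toy NPV check (a sketch; `npv`, the cash-flow vector, and both discount rates are illustrative assumptions, not values from the planner):

```python
def npv(rate, cashflows):
    """Net present value; cashflows[0] is the year-0 outlay (negative)."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

# Hypothetical padel-hall cash flows (EUR): year-0 CAPEX, then operating
# cash flows; year 10 includes a terminal value.
flows = [-800_000, 90_000, 120_000, 140_000, 150_000, 150_000,
         150_000, 150_000, 150_000, 150_000, 550_000]

passive_hurdle = 0.09  # midpoint of the 8-10% passive benchmark
risk_adjusted = 0.13   # passive + illiquidity/concentration premium (assumption)

print(f"NPV at {passive_hurdle:.0%}: {npv(passive_hurdle, flows):>12,.0f}")
print(f"NPV at {risk_adjusted:.0%}: {npv(risk_adjusted, flows):>12,.0f}")
```

If the project only clears the passive hurdle but not the risk-adjusted one, the concentration and personal time commitment are not being paid for.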

290
scripts/check_pipeline.py Normal file

@@ -0,0 +1,290 @@
"""
Diagnostic script: check row counts at every layer of the pricing pipeline.
Run on prod via SSH:
DUCKDB_PATH=/opt/padelnomics/data/lakehouse.duckdb uv run python scripts/check_pipeline.py
Or locally:
DUCKDB_PATH=data/lakehouse.duckdb uv run python scripts/check_pipeline.py
Read-only — never writes to the database.
Handles the DuckDB catalog naming quirk: when the file is named lakehouse.duckdb,
the catalog is "lakehouse" not "local". SQLMesh views may reference the wrong catalog,
so we fall back to querying physical tables (sqlmesh__<schema>.<table>__<hash>).
"""
import os
import sys
import duckdb
DUCKDB_PATH = os.environ.get("DUCKDB_PATH", "data/lakehouse.duckdb")
PIPELINE_TABLES = [
("staging", "stg_playtomic_availability"),
("foundation", "fct_availability_slot"),
("foundation", "dim_venue_capacity"),
("foundation", "fct_daily_availability"),
("serving", "venue_pricing_benchmarks"),
("serving", "pseo_city_pricing"),
]
def _use_catalog(con):
"""Detect and USE the database catalog so schema-qualified queries work."""
catalogs = [
row[0]
for row in con.execute(
"SELECT catalog_name FROM information_schema.schemata"
).fetchall()
]
# Pick the non-system catalog (not 'system', 'temp', 'memory')
user_catalogs = [c for c in set(catalogs) if c not in ("system", "temp", "memory")]
if user_catalogs:
catalog = user_catalogs[0]
con.execute(f"USE {catalog}")
return catalog
return None
def _find_physical_table(con, schema, table):
"""Find the SQLMesh physical table name for a logical table.
SQLMesh stores physical tables as:
sqlmesh__<schema>.<schema>__<table>__<hash>
"""
sqlmesh_schema = f"sqlmesh__{schema}"
try:
rows = con.execute(
"SELECT table_schema, table_name "
"FROM information_schema.tables "
f"WHERE table_schema = '{sqlmesh_schema}' "
f"AND table_name LIKE '{schema}__{table}%' "
"ORDER BY table_name "
"LIMIT 1"
).fetchall()
if rows:
return f"{rows[0][0]}.{rows[0][1]}"
except Exception:
pass
return None
def _query_table(con, schema, table):
"""Try logical view first, fall back to physical table. Returns (fqn, count) or (fqn, error_str)."""
logical = f"{schema}.{table}"
try:
(count,) = con.execute(f"SELECT COUNT(*) FROM {logical}").fetchone()
return logical, count
except Exception:
pass
physical = _find_physical_table(con, schema, table)
if physical:
try:
(count,) = con.execute(f"SELECT COUNT(*) FROM {physical}").fetchone()
return f"{physical} (physical)", count
except Exception as e:
return f"{physical} (physical)", f"ERROR: {e}"
return logical, "ERROR: view broken, no physical table found"
def _query_sql(con, sql, schema_tables):
"""Execute SQL, falling back to rewritten SQL using physical table names if views fail.
schema_tables: list of (schema, table) tuples used in the SQL, in order of appearance.
The SQL must use {schema}.{table} format for these references.
"""
try:
return con.execute(sql)
except Exception:
# Rewrite SQL to use physical table names
rewritten = sql
for schema, table in schema_tables:
physical = _find_physical_table(con, schema, table)
if physical:
rewritten = rewritten.replace(f"{schema}.{table}", physical)
else:
raise
return con.execute(rewritten)
def main():
if not os.path.exists(DUCKDB_PATH):
print(f"ERROR: {DUCKDB_PATH} not found")
sys.exit(1)
con = duckdb.connect(DUCKDB_PATH, read_only=True)
print(f"Database: {DUCKDB_PATH}")
print(f"DuckDB version: {con.execute('SELECT version()').fetchone()[0]}")
catalog = _use_catalog(con)
if catalog:
print(f"Catalog: {catalog}")
print()
# ── Row counts at each layer ──────────────────────────────────────────
print("=" * 60)
print("PIPELINE ROW COUNTS")
print("=" * 60)
for schema, table in PIPELINE_TABLES:
fqn, result = _query_table(con, schema, table)
if isinstance(result, int):
print(f" {fqn:55s} {result:>10,} rows")
else:
print(f" {fqn:55s} {result}")
# ── Date range in fct_daily_availability ──────────────────────────────
print()
print("=" * 60)
print("DATE RANGE: fct_daily_availability")
print("=" * 60)
try:
row = _query_sql(
con,
"""
SELECT
MIN(snapshot_date) AS min_date,
MAX(snapshot_date) AS max_date,
COUNT(DISTINCT snapshot_date) AS distinct_days,
CURRENT_DATE AS today,
CURRENT_DATE - INTERVAL '30 days' AS window_start
FROM foundation.fct_daily_availability
""",
[("foundation", "fct_daily_availability")],
).fetchone()
if row:
min_date, max_date, days, today, window_start = row
print(f" Min snapshot_date: {min_date}")
print(f" Max snapshot_date: {max_date}")
print(f" Distinct days: {days}")
print(f" Today: {today}")
print(f" 30-day window start: {window_start}")
if max_date and str(max_date) < str(window_start):
print()
print(" *** ALL DATA IS OUTSIDE THE 30-DAY WINDOW ***")
print(" This is why venue_pricing_benchmarks is empty.")
except Exception as e:
print(f" ERROR: {e}")
# ── HAVING filter impact in venue_pricing_benchmarks ──────────────────
print()
print("=" * 60)
print("HAVING FILTER IMPACT (venue_pricing_benchmarks)")
print("=" * 60)
try:
row = _query_sql(
con,
"""
WITH venue_stats AS (
SELECT
da.tenant_id,
da.country_code,
da.city,
COUNT(DISTINCT da.snapshot_date) AS days_observed
FROM foundation.fct_daily_availability da
WHERE TRY_CAST(da.snapshot_date AS DATE) >= CURRENT_DATE - INTERVAL '30 days'
AND da.occupancy_rate IS NOT NULL
AND da.occupancy_rate BETWEEN 0 AND 1.5
GROUP BY da.tenant_id, da.country_code, da.city
)
SELECT
COUNT(*) AS total_venues,
COUNT(*) FILTER (WHERE days_observed >= 3) AS venues_passing_having,
COUNT(*) FILTER (WHERE days_observed < 3) AS venues_failing_having,
MAX(days_observed) AS max_days,
MIN(days_observed) AS min_days
FROM venue_stats
""",
[("foundation", "fct_daily_availability")],
).fetchone()
if row:
total, passing, failing, max_d, min_d = row
print(f" Venues in 30-day window: {total}")
print(f" Venues with >= 3 days (PASSING): {passing}")
print(f" Venues with < 3 days (FILTERED): {failing}")
print(f" Max days observed: {max_d}")
print(f" Min days observed: {min_d}")
if total == 0:
print()
print(" *** NO VENUES IN 30-DAY WINDOW — check fct_daily_availability dates ***")
except Exception as e:
print(f" ERROR: {e}")
# ── Occupancy rate distribution ───────────────────────────────────────
print()
print("=" * 60)
print("OCCUPANCY RATE DISTRIBUTION (fct_daily_availability)")
print("=" * 60)
try:
rows = _query_sql(
con,
"""
SELECT
CASE
WHEN occupancy_rate IS NULL THEN 'NULL'
WHEN occupancy_rate < 0 THEN '< 0 (invalid)'
WHEN occupancy_rate > 1.5 THEN '> 1.5 (filtered)'
WHEN occupancy_rate <= 0.25 THEN '0-0.25'
WHEN occupancy_rate <= 0.50 THEN '0.25-0.50'
WHEN occupancy_rate <= 0.75 THEN '0.50-0.75'
ELSE '0.75-1.0+'
END AS bucket,
COUNT(*) AS cnt
FROM foundation.fct_daily_availability
GROUP BY 1
ORDER BY 1
""",
[("foundation", "fct_daily_availability")],
).fetchall()
for bucket, cnt in rows:
print(f" {bucket:25s} {cnt:>10,}")
except Exception as e:
print(f" ERROR: {e}")
# ── dim_venue_capacity join coverage ──────────────────────────────────
print()
print("=" * 60)
print("JOIN COVERAGE: fct_availability_slot → dim_venue_capacity")
print("=" * 60)
try:
row = _query_sql(
con,
"""
SELECT
COUNT(DISTINCT a.tenant_id) AS slot_tenants,
COUNT(DISTINCT c.tenant_id) AS capacity_tenants,
COUNT(DISTINCT a.tenant_id) - COUNT(DISTINCT c.tenant_id) AS missing_capacity
FROM foundation.fct_availability_slot a
LEFT JOIN foundation.dim_venue_capacity c ON a.tenant_id = c.tenant_id
""",
[
("foundation", "fct_availability_slot"),
("foundation", "dim_venue_capacity"),
],
).fetchone()
if row:
slot_t, cap_t, missing = row
print(f" Tenants in fct_availability_slot: {slot_t}")
print(f" Tenants with capacity match: {cap_t}")
print(f" Tenants missing capacity: {missing}")
if missing and missing > 0:
print(f" *** {missing} tenants dropped by INNER JOIN to dim_venue_capacity ***")
except Exception as e:
print(f" ERROR: {e}")
con.close()
print()
print("Done.")
if __name__ == "__main__":
main()
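As a sketch of the physical-table naming scheme the script falls back to, a small parser can recover the logical name from a physical one (the `parse_physical_name` helper and the sample hash are illustrative, not part of the script):

```python
import re

def parse_physical_name(physical_schema: str, physical_table: str):
    """Map a SQLMesh physical table back to its logical (schema, table).

    Physical layout, as described in the diagnostic script:
        sqlmesh__<schema>.<schema>__<table>__<hash>
    Returns (schema, table), or None if the names don't match the pattern.
    """
    m = re.fullmatch(r"sqlmesh__(?P<schema>\w+)", physical_schema)
    if not m:
        return None
    schema = m.group("schema")
    # Non-greedy table name; the hash is the segment after the last '__'.
    m = re.fullmatch(rf"{schema}__(?P<table>\w+?)__(?P<hash>\w+)", physical_table)
    if not m:
        return None
    return schema, m.group("table")

print(parse_physical_name(
    "sqlmesh__foundation", "foundation__fct_daily_availability__3124070532"
))  # → ('foundation', 'fct_daily_availability')
```

This is the inverse of the `LIKE '{schema}__{table}%'` lookup in `_find_physical_table`, which is why the fallback can locate a physical table whenever the logical view is broken.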


@@ -0,0 +1,553 @@
"""
E2E test for checkout.session.completed webhook → transaction.completed handler.
Tests credit packs, sticky boosts, and business plan PDF purchases by:
1. Constructing realistic checkout.session.completed payloads with our real price IDs
2. Signing them with the active webhook secret
3. POSTing to the running dev server
4. Verifying DB state changes (credit_balance, supplier_boosts, business_plan_exports)
Prerequisites:
- ngrok + webhook endpoint registered (stripe_e2e_setup.py)
- Dev server running with webhook secret loaded
- Stripe products synced (setup_stripe --sync)
Run: uv run python scripts/stripe_e2e_checkout_test.py
"""
import hashlib
import hmac
import json
import os
import sqlite3
import subprocess
import sys
import time
from dotenv import load_dotenv
load_dotenv(override=True)
DATABASE_PATH = os.getenv("DATABASE_PATH", "data/app.db")
WEBHOOK_SECRET = os.getenv("STRIPE_WEBHOOK_SECRET", "")
SERVER_URL = "http://localhost:5000"
WEBHOOK_URL = f"{SERVER_URL}/billing/webhook/stripe"
assert WEBHOOK_SECRET, "STRIPE_WEBHOOK_SECRET not set — run stripe_e2e_setup.py"
passed = 0
failed = 0
errors = []
def ok(msg):
global passed
passed += 1
print(f" \u2713 {msg}")
def fail(msg):
global failed
failed += 1
errors.append(msg)
print(f" \u2717 {msg}")
def section(title):
print(f"\n{'─' * 60}")
print(f" {title}")
print(f"{'─' * 60}")
def query_db(sql, params=()):
conn = sqlite3.connect(f"file:{DATABASE_PATH}?mode=ro", uri=True)
conn.row_factory = sqlite3.Row
try:
return [dict(r) for r in conn.execute(sql, params).fetchall()]
finally:
conn.close()
def sign_stripe_payload(payload_bytes: bytes, secret: str) -> str:
"""Create a valid Stripe-Signature header."""
timestamp = str(int(time.time()))
signed_payload = f"{timestamp}.{payload_bytes.decode()}"
sig = hmac.new(
secret.encode(), signed_payload.encode(), hashlib.sha256
).hexdigest()
return f"t={timestamp},v1={sig}"
def post_webhook(event_type: str, obj: dict) -> int:
"""Post a signed webhook to the server. Returns HTTP status code."""
payload = json.dumps({
"id": f"evt_test_{int(time.time()*1000)}",
"type": event_type,
"data": {"object": obj},
}).encode()
sig = sign_stripe_payload(payload, WEBHOOK_SECRET)
result = subprocess.run(
["curl", "-s", "-o", "/dev/null", "-w", "%{http_code}",
"-X", "POST",
"-H", "Content-Type: application/json",
"-H", f"Stripe-Signature: {sig}",
"--data-binary", "@-",
WEBHOOK_URL],
input=payload.decode(), capture_output=True, text=True, timeout=10,
)
return int(result.stdout.strip())
# ─── Preflight ────────────────────────────────────────────
section("Preflight")
# Server up
result = subprocess.run(
["curl", "-s", "-o", "/dev/null", "-w", "%{http_code}", f"{SERVER_URL}/"],
capture_output=True, text=True, timeout=5,
)
assert result.stdout.strip() in ("200", "301"), f"Server down ({result.stdout})"
ok("Dev server running")
# Webhook active
result = subprocess.run(
["curl", "-s", "-o", "/dev/null", "-w", "%{http_code}",
"-X", "POST", "-H", "Content-Type: application/json", "-d", "{}",
WEBHOOK_URL],
capture_output=True, text=True, timeout=5,
)
assert result.stdout.strip() == "400", f"Webhook returns {result.stdout} (expected 400)"
ok("Webhook signature check active")
# Load price IDs
products = query_db("SELECT key, provider_price_id FROM payment_products WHERE provider = 'stripe'")
price_map = {p["key"]: p["provider_price_id"] for p in products}
ok(f"Loaded {len(price_map)} products")
# Test data
users = query_db("SELECT id, email FROM users LIMIT 5")
test_user = users[0]
ok(f"User: {test_user['email']} (id={test_user['id']})")
suppliers = query_db("SELECT id, name, credit_balance FROM suppliers WHERE claimed_by IS NOT NULL LIMIT 1")
assert suppliers, "No claimed supplier found"
test_supplier = suppliers[0]
initial_balance = test_supplier["credit_balance"]
ok(f"Supplier: {test_supplier['name']} (id={test_supplier['id']}, balance={initial_balance})")
# ═══════════════════════════════════════════════════════════
# Test 1: Credit Pack purchases (all 4 sizes)
# ═══════════════════════════════════════════════════════════
section("1. Credit Pack purchases via checkout.session.completed")
credit_packs = [
("credits_25", 25),
("credits_50", 50),
("credits_100", 100),
("credits_250", 250),
]
running_balance = initial_balance
for key, amount in credit_packs:
price_id = price_map.get(key)
if not price_id:
fail(f"{key}: price not found")
continue
status = post_webhook("checkout.session.completed", {
"id": f"cs_test_{key}_{int(time.time())}",
"mode": "payment",
"customer": "cus_test_credits",
"metadata": {
"user_id": str(test_user["id"]),
"supplier_id": str(test_supplier["id"]),
"plan": key,
},
"line_items": {"data": [{"price": {"id": price_id}, "quantity": 1}]},
})
if status == 200:
ok(f"{key}: webhook accepted (HTTP 200)")
else:
fail(f"{key}: webhook returned HTTP {status}")
continue
# Wait and check balance
time.sleep(2)
rows = query_db("SELECT credit_balance FROM suppliers WHERE id = ?", (test_supplier["id"],))
new_balance = rows[0]["credit_balance"] if rows else -1
expected = running_balance + amount
if new_balance == expected:
ok(f"{key}: balance {running_balance} → {new_balance} (+{amount})")
running_balance = new_balance
else:
fail(f"{key}: balance {new_balance}, expected {expected}")
running_balance = new_balance # update anyway for next test
# Check ledger entries
ledger = query_db(
"SELECT * FROM credit_ledger WHERE supplier_id = ? AND event_type = 'pack_purchase' ORDER BY id DESC LIMIT 4",
(test_supplier["id"],),
)
if len(ledger) >= 4:
ok(f"Credit ledger: {len(ledger)} pack_purchase entries")
else:
fail(f"Credit ledger: only {len(ledger)} entries (expected 4)")
# ═══════════════════════════════════════════════════════════
# Test 2: Sticky Boost purchases
# ═══════════════════════════════════════════════════════════
section("2. Sticky boost purchases")
# 2a. Sticky Week
price_id = price_map.get("boost_sticky_week")
if price_id:
status = post_webhook("checkout.session.completed", {
"id": f"cs_test_sticky_week_{int(time.time())}",
"mode": "payment",
"customer": "cus_test_sticky",
"metadata": {
"user_id": str(test_user["id"]),
"supplier_id": str(test_supplier["id"]),
"plan": "boost_sticky_week",
"sticky_country": "DE",
},
"line_items": {"data": [{"price": {"id": price_id}, "quantity": 1}]},
})
if status == 200:
ok("boost_sticky_week: webhook accepted")
else:
fail(f"boost_sticky_week: HTTP {status}")
time.sleep(2)
# Check supplier_boosts
boosts = query_db(
"SELECT * FROM supplier_boosts WHERE supplier_id = ? AND boost_type = 'sticky_week' ORDER BY id DESC LIMIT 1",
(test_supplier["id"],),
)
if boosts:
b = boosts[0]
ok(f"supplier_boosts row: type=sticky_week, status={b['status']}")
if b.get("expires_at"):
ok(f"expires_at set: {b['expires_at']}")
else:
fail("expires_at is NULL")
else:
fail("No supplier_boosts row for sticky_week")
# Check suppliers.sticky_until
sup = query_db("SELECT sticky_until, sticky_country FROM suppliers WHERE id = ?", (test_supplier["id"],))
if sup and sup[0]["sticky_until"]:
ok(f"sticky_until set: {sup[0]['sticky_until']}")
else:
fail("sticky_until not set")
if sup and sup[0]["sticky_country"] == "DE":
ok("sticky_country=DE")
else:
fail(f"sticky_country={sup[0]['sticky_country'] if sup else '?'}")
else:
fail("boost_sticky_week price not found")
# 2b. Sticky Month
price_id = price_map.get("boost_sticky_month")
if price_id:
# Reset sticky fields
conn = sqlite3.connect(DATABASE_PATH)
conn.execute("UPDATE suppliers SET sticky_until=NULL, sticky_country=NULL WHERE id=?", (test_supplier["id"],))
conn.commit()
conn.close()
status = post_webhook("checkout.session.completed", {
"id": f"cs_test_sticky_month_{int(time.time())}",
"mode": "payment",
"customer": "cus_test_sticky",
"metadata": {
"user_id": str(test_user["id"]),
"supplier_id": str(test_supplier["id"]),
"plan": "boost_sticky_month",
"sticky_country": "ES",
},
"line_items": {"data": [{"price": {"id": price_id}, "quantity": 1}]},
})
if status == 200:
ok("boost_sticky_month: webhook accepted")
else:
fail(f"boost_sticky_month: HTTP {status}")
time.sleep(2)
boosts = query_db(
"SELECT * FROM supplier_boosts WHERE supplier_id = ? AND boost_type = 'sticky_month' ORDER BY id DESC LIMIT 1",
(test_supplier["id"],),
)
if boosts:
ok(f"supplier_boosts row: type=sticky_month, expires_at={boosts[0].get('expires_at', '?')[:10]}")
else:
fail("No supplier_boosts row for sticky_month")
sup = query_db("SELECT sticky_until, sticky_country FROM suppliers WHERE id = ?", (test_supplier["id"],))
if sup and sup[0]["sticky_country"] == "ES":
ok("sticky_country=ES (month)")
else:
fail(f"sticky_country wrong: {sup[0] if sup else '?'}")
else:
fail("boost_sticky_month price not found")
# ═══════════════════════════════════════════════════════════
# Test 3: Business Plan PDF purchase
# ═══════════════════════════════════════════════════════════
section("3. Business Plan PDF purchase")
price_id = price_map.get("business_plan")
if price_id:
# Create a scenario for the user first
conn = sqlite3.connect(DATABASE_PATH)
conn.execute(
"INSERT INTO scenarios (user_id, name, state_json, created_at) VALUES (?, 'Test', '{}', datetime('now'))",
(test_user["id"],),
)
conn.commit()
scenario_row = conn.execute("SELECT id FROM scenarios WHERE user_id = ? ORDER BY id DESC LIMIT 1",
(test_user["id"],)).fetchone()
scenario_id = scenario_row[0] if scenario_row else 0
conn.close()
ok(f"Created test scenario: id={scenario_id}")
status = post_webhook("checkout.session.completed", {
"id": f"cs_test_bp_{int(time.time())}",
"mode": "payment",
"customer": "cus_test_bp",
"metadata": {
"user_id": str(test_user["id"]),
"plan": "business_plan",
"scenario_id": str(scenario_id),
"language": "de",
},
"line_items": {"data": [{"price": {"id": price_id}, "quantity": 1}]},
})
if status == 200:
ok("business_plan: webhook accepted")
else:
fail(f"business_plan: HTTP {status}")
time.sleep(2)
# Check business_plan_exports
exports = query_db(
"SELECT * FROM business_plan_exports WHERE user_id = ? ORDER BY id DESC LIMIT 1",
(test_user["id"],),
)
if exports:
e = exports[0]
ok(f"Export row: status={e['status']}, language={e['language']}")
if e["status"] == "pending":
ok("Status: pending (waiting for worker)")
else:
print(f" ? Status: {e['status']} (expected pending)")
if e["language"] == "de":
ok("Language: de")
else:
fail(f"Language: {e['language']} (expected de)")
if e.get("token"):
ok(f"Download token generated: {e['token'][:10]}...")
else:
fail("No download token")
if e.get("scenario_id") == scenario_id:
ok(f"Scenario ID matches: {scenario_id}")
else:
fail(f"Scenario ID: {e.get('scenario_id')} (expected {scenario_id})")
else:
fail("No business_plan_exports row created")
else:
fail("business_plan price not found")
# ═══════════════════════════════════════════════════════════
# Test 4: Edge cases
# ═══════════════════════════════════════════════════════════
section("4a. Edge: checkout.session.completed with unknown price_id")
status = post_webhook("checkout.session.completed", {
"id": "cs_test_unknown",
"mode": "payment",
"customer": "cus_test_unknown",
"metadata": {
"user_id": str(test_user["id"]),
"supplier_id": str(test_supplier["id"]),
"plan": "nonexistent_product",
},
"line_items": {"data": [{"price": {"id": "price_nonexistent"}, "quantity": 1}]},
})
ok(f"Unknown price: HTTP {status} (no crash)") if status == 200 else fail(f"Unknown price: HTTP {status}")
# Server alive?
result = subprocess.run(
["curl", "-s", "-o", "/dev/null", "-w", "%{http_code}", f"{SERVER_URL}/"],
capture_output=True, text=True, timeout=5,
)
ok("Server alive after unknown price") if result.stdout.strip() in ("200", "301") else fail("Server crashed!")
section("4b. Edge: checkout.session.completed with missing supplier_id (credit pack)")
balance_before = query_db("SELECT credit_balance FROM suppliers WHERE id = ?", (test_supplier["id"],))[0]["credit_balance"]
status = post_webhook("checkout.session.completed", {
"id": "cs_test_no_supplier",
"mode": "payment",
"customer": "cus_test_nosup",
"metadata": {
"user_id": str(test_user["id"]),
# NO supplier_id
"plan": "credits_25",
},
"line_items": {"data": [{"price": {"id": price_map["credits_25"]}, "quantity": 1}]},
})
ok(f"Missing supplier_id: HTTP {status} (no crash)") if status == 200 else fail(f"HTTP {status}")
time.sleep(1)
balance_after = query_db("SELECT credit_balance FROM suppliers WHERE id = ?", (test_supplier["id"],))[0]["credit_balance"]
if balance_after == balance_before:
ok("Balance unchanged (correctly skipped — no supplier_id)")
else:
fail(f"Balance changed: {balance_before} → {balance_after}")
section("4c. Edge: checkout.session.completed with missing metadata")
status = post_webhook("checkout.session.completed", {
"id": "cs_test_no_meta",
"mode": "payment",
"customer": "cus_test_nometa",
"metadata": {},
})
ok(f"Empty metadata: HTTP {status}") if status == 200 else fail(f"HTTP {status}")
result = subprocess.run(
["curl", "-s", "-o", "/dev/null", "-w", "%{http_code}", f"{SERVER_URL}/"],
capture_output=True, text=True, timeout=5,
)
ok("Server alive after empty metadata") if result.stdout.strip() in ("200", "301") else fail("Server crashed!")
section("4d. Edge: subscription mode checkout (not payment)")
# checkout.session.completed with mode=subscription should create a subscription
status = post_webhook("checkout.session.completed", {
"id": "cs_test_sub_mode",
"mode": "subscription",
"customer": "cus_test_submode",
"subscription": "sub_from_checkout_123",
"metadata": {
"user_id": str(test_user["id"]),
"plan": "starter",
},
})
ok(f"Subscription-mode checkout: HTTP {status}") if status == 200 else fail(f"HTTP {status}")
# Note: this fires subscription.activated, but since we can't mock the Stripe API call
# to fetch the subscription, it will log a warning and continue. That's fine.
section("4e. Edge: sticky boost without sticky_country in metadata")
price_id = price_map.get("boost_sticky_week")
if price_id:
# Reset sticky fields
conn = sqlite3.connect(DATABASE_PATH)
conn.execute("UPDATE suppliers SET sticky_until=NULL, sticky_country=NULL WHERE id=?", (test_supplier["id"],))
conn.commit()
conn.close()
status = post_webhook("checkout.session.completed", {
"id": f"cs_test_no_country_{int(time.time())}",
"mode": "payment",
"customer": "cus_test_nocountry",
"metadata": {
"user_id": str(test_user["id"]),
"supplier_id": str(test_supplier["id"]),
"plan": "boost_sticky_week",
# NO sticky_country
},
"line_items": {"data": [{"price": {"id": price_id}, "quantity": 1}]},
})
ok(f"Missing sticky_country: HTTP {status}") if status == 200 else fail(f"HTTP {status}")
time.sleep(2)
sup = query_db("SELECT sticky_until, sticky_country FROM suppliers WHERE id = ?", (test_supplier["id"],))
if sup and sup[0]["sticky_until"]:
ok(f"sticky_until still set (country defaults to empty: '{sup[0]['sticky_country']}')")
else:
fail("sticky boost not created without country")
# ═══════════════════════════════════════════════════════════
# Test 5: Use stripe trigger for a real checkout.session.completed
# ═══════════════════════════════════════════════════════════
section("5. stripe trigger checkout.session.completed (real Stripe event)")
print(" Triggering real checkout.session.completed via Stripe CLI...")
result = subprocess.run(
["stripe", "trigger", "checkout.session.completed"],
capture_output=True, text=True, timeout=30,
)
if result.returncode == 0:
ok("stripe trigger succeeded")
# Wait for webhook delivery via ngrok
time.sleep(5)
# Check ngrok for the delivery
import urllib.request
try:
resp = urllib.request.urlopen("http://localhost:4040/api/requests/http?limit=5", timeout=5)
reqs = json.loads(resp.read())
recent_webhooks = [
r for r in reqs.get("requests", [])
if r.get("request", {}).get("uri") == "/billing/webhook/stripe"
]
if recent_webhooks:
latest = recent_webhooks[0]
http_status = latest.get("response", {}).get("status_code")
ok(f"Webhook delivered via ngrok: HTTP {http_status}")
else:
print(" (no webhook seen in ngrok — may have been delivered before log window)")
ok("stripe trigger completed (webhook delivery not verified)")
except Exception:
ok("stripe trigger completed (ngrok API unavailable for verification)")
else:
fail(f"stripe trigger failed: {result.stderr[:100]}")
# ═══════════════════════════════════════════════════════════
# Summary
# ═══════════════════════════════════════════════════════════
section("RESULTS")
total = passed + failed
print(f"\n {passed}/{total} passed, {failed} failed\n")
if errors:
print(" Failures:")
for err in errors:
print(f" - {err}")
print()
sys.exit(1 if failed else 0)
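For reference, the verifying side of the signature scheme the test signs with looks roughly like this (a sketch of Stripe's `t=...,v1=...` HMAC-SHA256 format; `verify_stripe_signature` and the replay tolerance are assumptions, not the handler's actual code):

```python
import hashlib
import hmac
import time

def verify_stripe_signature(payload: bytes, header: str, secret: str,
                            tolerance: int = 300) -> bool:
    """Check a Stripe-Signature header: HMAC-SHA256 over '<ts>.<payload>'."""
    parts = dict(p.split("=", 1) for p in header.split(","))
    timestamp, candidate = parts["t"], parts["v1"]
    if abs(time.time() - int(timestamp)) > tolerance:
        return False  # stale timestamp: possible replay
    signed = f"{timestamp}.{payload.decode()}".encode()
    expected = hmac.new(secret.encode(), signed, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, candidate)

secret = "whsec_example"
payload = b'{"type": "checkout.session.completed"}'
ts = str(int(time.time()))
sig = hmac.new(secret.encode(), f"{ts}.{payload.decode()}".encode(),
               hashlib.sha256).hexdigest()
print(verify_stripe_signature(payload, f"t={ts},v1={sig}", secret))  # True
```

Note the constant-time compare and the timestamp window: both are why the preflight check expects HTTP 400 for an unsigned POST.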

124
scripts/stripe_e2e_setup.py Normal file

@@ -0,0 +1,124 @@
"""
Step 1: Register a Stripe webhook endpoint via ngrok and update .env.
Run BEFORE starting the dev server:
1. Start ngrok: ngrok http 5000
2. Run this script: uv run python scripts/stripe_e2e_setup.py
3. Start dev server: make dev
4. Run E2E tests: uv run python scripts/stripe_e2e_test.py
To tear down afterward:
uv run python scripts/stripe_e2e_setup.py --teardown
"""
import json
import os
import re
import sys
import urllib.request
from dotenv import load_dotenv
load_dotenv()
import stripe
STRIPE_SECRET_KEY = os.getenv("STRIPE_SECRET_KEY", "") or os.getenv("STRIPE_API_PRIVATE_KEY", "")
if not STRIPE_SECRET_KEY:
print("ERROR: Set STRIPE_SECRET_KEY or STRIPE_API_PRIVATE_KEY in .env")
sys.exit(1)
stripe.api_key = STRIPE_SECRET_KEY
stripe.max_network_retries = 2
ENV_PATH = os.path.join(os.path.dirname(__file__), "..", ".env")
ENV_PATH = os.path.abspath(ENV_PATH)
WEBHOOK_PATH = "/billing/webhook/stripe"
NGROK_API = "http://localhost:4040/api/tunnels"
def _update_env(key, value):
"""Update a key in .env file."""
text = open(ENV_PATH).read()
pattern = rf"^{key}=.*$"
replacement = f"{key}={value}"
if re.search(pattern, text, re.MULTILINE):
text = re.sub(pattern, replacement, text, flags=re.MULTILINE)
else:
text = text.rstrip("\n") + f"\n{replacement}\n"
open(ENV_PATH, "w").write(text)
def setup():
# Get ngrok tunnel URL
try:
resp = urllib.request.urlopen(NGROK_API, timeout=5)
tunnels = json.loads(resp.read())
tunnel_url = tunnels["tunnels"][0]["public_url"]
except Exception as e:
print(f"ERROR: ngrok not running: {e}")
print("Start ngrok first: ngrok http 5000")
sys.exit(1)
webhook_url = f"{tunnel_url}{WEBHOOK_PATH}"
print(f"ngrok tunnel: {tunnel_url}")
print(f"Webhook URL: {webhook_url}")
# Check for existing E2E webhook endpoint
existing_id = os.getenv("STRIPE_WEBHOOK_ENDPOINT_ID", "")
if existing_id:
try:
ep = stripe.WebhookEndpoint.retrieve(existing_id)
if ep.url == webhook_url and ep.status == "enabled":
print(f"\nEndpoint already exists and matches: {existing_id}")
print("Ready to test. Run: uv run python scripts/stripe_e2e_test.py")
return
# URL changed (new ngrok session), delete and recreate
print("Existing endpoint URL mismatch, recreating...")
stripe.WebhookEndpoint.delete(existing_id)
except stripe.InvalidRequestError:
pass # Already deleted
# Create webhook endpoint
endpoint = stripe.WebhookEndpoint.create(
url=webhook_url,
enabled_events=[
"checkout.session.completed",
"customer.subscription.created",
"customer.subscription.updated",
"customer.subscription.deleted",
"invoice.payment_failed",
],
)
print(f"\nCreated endpoint: {endpoint.id}")
print(f"Webhook secret: {endpoint.secret[:25]}...")
# Update .env
_update_env("STRIPE_WEBHOOK_SECRET", endpoint.secret)
_update_env("STRIPE_WEBHOOK_ENDPOINT_ID", endpoint.id)
print("\nUpdated .env with STRIPE_WEBHOOK_SECRET and STRIPE_WEBHOOK_ENDPOINT_ID")
print("\nNext steps:")
print(" 1. Restart dev server: make dev")
print(" 2. Run E2E tests: uv run python scripts/stripe_e2e_test.py")
def teardown():
endpoint_id = os.getenv("STRIPE_WEBHOOK_ENDPOINT_ID", "")
if endpoint_id:
try:
stripe.WebhookEndpoint.delete(endpoint_id)
print(f"Deleted webhook endpoint: {endpoint_id}")
except stripe.InvalidRequestError:
print(f"Endpoint {endpoint_id} already deleted")
_update_env("STRIPE_WEBHOOK_SECRET", "")
_update_env("STRIPE_WEBHOOK_ENDPOINT_ID", "")
print("Cleared .env webhook config")
if __name__ == "__main__":
if "--teardown" in sys.argv:
teardown()
else:
setup()
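The `_update_env` upsert can be exercised in isolation. A minimal standalone sketch (`update_env` is a hypothetical re-implementation, run against a temp file so the real `.env` is untouched):

```python
import os
import re
import tempfile

def update_env(path: str, key: str, value: str) -> None:
    """Upsert KEY=value in a dotenv-style file, same regex approach as the script."""
    text = open(path).read() if os.path.exists(path) else ""
    pattern = rf"^{re.escape(key)}=.*$"
    replacement = f"{key}={value}"
    if re.search(pattern, text, re.MULTILINE):
        text = re.sub(pattern, replacement, text, flags=re.MULTILINE)
    elif text:
        text = text.rstrip("\n") + f"\n{replacement}\n"
    else:
        text = f"{replacement}\n"
    open(path, "w").write(text)

with tempfile.TemporaryDirectory() as d:
    env = os.path.join(d, ".env")
    update_env(env, "STRIPE_WEBHOOK_SECRET", "whsec_abc")
    update_env(env, "STRIPE_WEBHOOK_SECRET", "whsec_xyz")  # overwrite, no duplicate
    print(open(env).read())  # → STRIPE_WEBHOOK_SECRET=whsec_xyz
```

The `re.MULTILINE` anchor is what makes the replace line-scoped, so a second run updates the key in place instead of appending a duplicate.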

727
scripts/stripe_e2e_test.py Normal file

@@ -0,0 +1,727 @@
"""
Comprehensive Stripe E2E Tests — real webhooks via ngrok.
Tests every product type, subscription lifecycle, payment failures,
and edge cases against a running dev server with real Stripe webhooks.
Prerequisites:
1. ngrok http 5000
2. uv run python scripts/stripe_e2e_setup.py
3. make dev (or restart after setup)
4. uv run python scripts/stripe_e2e_test.py
"""
import os
import sqlite3
import subprocess
import sys
import time
from dotenv import load_dotenv
load_dotenv(override=True)
import stripe
STRIPE_SECRET_KEY = os.getenv("STRIPE_SECRET_KEY", "") or os.getenv("STRIPE_API_PRIVATE_KEY", "")
assert STRIPE_SECRET_KEY, "Set STRIPE_SECRET_KEY or STRIPE_API_PRIVATE_KEY in .env"
stripe.api_key = STRIPE_SECRET_KEY
stripe.max_network_retries = 2
DATABASE_PATH = os.getenv("DATABASE_PATH", "data/app.db")
MAX_WAIT_SECONDS = 20
POLL_SECONDS = 0.5
passed = 0
failed = 0
errors = []
cleanup_sub_ids = []
# ─── Helpers ──────────────────────────────────────────────
def ok(msg):
global passed
passed += 1
print(f" \u2713 {msg}")
def fail(msg):
global failed
failed += 1
errors.append(msg)
print(f" \u2717 {msg}")
def section(title):
print(f"\n{'═' * 60}")
print(f" {title}")
print(f"{'═' * 60}")
def query_db(sql, params=()):
conn = sqlite3.connect(f"file:{DATABASE_PATH}?mode=ro", uri=True)
conn.row_factory = sqlite3.Row
try:
return [dict(r) for r in conn.execute(sql, params).fetchall()]
finally:
conn.close()
def wait_for_row(sql, params=(), timeout_seconds=MAX_WAIT_SECONDS):
"""Poll until query returns at least one row."""
deadline = time.time() + timeout_seconds
while time.time() < deadline:
rows = query_db(sql, params)
if rows:
return rows
time.sleep(POLL_SECONDS)
return []
def wait_for_value(sql, params, column, expected, timeout_seconds=MAX_WAIT_SECONDS):
"""Poll until column == expected."""
deadline = time.time() + timeout_seconds
last = None
while time.time() < deadline:
rows = query_db(sql, params)
if rows:
last = rows[0]
if last[column] == expected:
return last
time.sleep(POLL_SECONDS)
return last
def get_or_create_customer(email, name):
existing = stripe.Customer.list(email=email, limit=1)
if existing.data:
return existing.data[0]
return stripe.Customer.create(email=email, name=name, metadata={"e2e": "true"})
_pm_cache = {}
def attach_pm(customer_id):
"""Create a fresh test Visa and attach it."""
if customer_id in _pm_cache:
return _pm_cache[customer_id]
pm = stripe.PaymentMethod.create(type="card", card={"token": "tok_visa"})
stripe.PaymentMethod.attach(pm.id, customer=customer_id)
stripe.Customer.modify(customer_id, invoice_settings={"default_payment_method": pm.id})
_pm_cache[customer_id] = pm.id
return pm.id
def create_sub(customer_id, price_id, metadata, pm_id):
"""Create subscription and track for cleanup."""
sub = stripe.Subscription.create(
customer=customer_id,
items=[{"price": price_id}],
metadata=metadata,
default_payment_method=pm_id,
)
cleanup_sub_ids.append(sub.id)
return sub
def cancel_sub(sub_id):
try:
stripe.Subscription.cancel(sub_id)
except stripe.InvalidRequestError:
pass
# ─── Preflight ────────────────────────────────────────────
section("Preflight")
# Dev server
result = subprocess.run(
["curl", "-s", "-o", "/dev/null", "-w", "%{http_code}", "http://localhost:5000/"],
capture_output=True, text=True, timeout=5,
)
assert result.stdout.strip() in ("200", "301", "302"), f"Dev server down (HTTP {result.stdout.strip()})"
ok("Dev server running")
# Webhook endpoint
endpoint_id = os.getenv("STRIPE_WEBHOOK_ENDPOINT_ID", "")
assert endpoint_id, "STRIPE_WEBHOOK_ENDPOINT_ID not set — run stripe_e2e_setup.py"
ep = stripe.WebhookEndpoint.retrieve(endpoint_id)
assert ep.status == "enabled", f"Endpoint status: {ep.status}"
ok(f"Webhook endpoint: {ep.url}")
# Webhook secret loaded in server
result = subprocess.run(
["curl", "-s", "-o", "/dev/null", "-w", "%{http_code}",
"-X", "POST", "-H", "Content-Type: application/json",
"-d", "{}", "http://localhost:5000/billing/webhook/stripe"],
capture_output=True, text=True, timeout=5,
)
assert result.stdout.strip() == "400", f"Webhook returns {result.stdout.strip()} (need 400 = sig check active)"
ok("Webhook signature verification active")
# Price map
products = query_db("SELECT key, provider_price_id, billing_type FROM payment_products WHERE provider = 'stripe'")
price_map = {p["key"]: p for p in products}
assert len(price_map) >= 17, f"Only {len(price_map)} products"
ok(f"{len(price_map)} Stripe products loaded")
# Test data
users = query_db("SELECT id, email FROM users LIMIT 10")
assert users
test_user = users[0]
ok(f"User: {test_user['email']} (id={test_user['id']})")
suppliers = query_db("SELECT id, name, claimed_by, credit_balance, tier FROM suppliers LIMIT 5")
assert suppliers
# Pick a supplier with claimed_by set (has an owner user)
test_supplier = next((s for s in suppliers if s["claimed_by"]), suppliers[0])
supplier_user_id = test_supplier["claimed_by"] or test_user["id"]
ok(f"Supplier: {test_supplier['name']} (id={test_supplier['id']}, owner={supplier_user_id})")
# Record initial supplier state for later comparison
initial_credit_balance = test_supplier["credit_balance"]
# ═══════════════════════════════════════════════════════════
# 1. PLANNER SUBSCRIPTIONS
# ═══════════════════════════════════════════════════════════
section("1a. Planner Starter — create → verify DB → cancel → verify cancelled")
cus_starter = get_or_create_customer("e2e-starter@sandbox.padelnomics.com", "E2E Starter")
pm_starter = attach_pm(cus_starter.id)
sub = create_sub(cus_starter.id, price_map["starter"]["provider_price_id"],
{"user_id": str(test_user["id"]), "plan": "starter"}, pm_starter)
ok(f"Created: {sub.id} (status={sub.status})")
rows = wait_for_row("SELECT * FROM subscriptions WHERE provider_subscription_id = ?", (sub.id,))
if rows:
r = rows[0]
ok(f"DB: plan={r['plan']}, status={r['status']}") if r["plan"] == "starter" and r["status"] == "active" else fail(f"DB: plan={r['plan']}, status={r['status']}")
if r.get("current_period_end"):
ok(f"period_end set: {r['current_period_end'][:10]}")
else:
fail("period_end is NULL")
else:
fail("Subscription NOT in DB")
# billing_customers
bc = query_db("SELECT * FROM billing_customers WHERE user_id = ?", (test_user["id"],))
ok("billing_customers created") if bc else fail("billing_customers NOT created")
# Cancel
cancel_sub(sub.id)
result = wait_for_value("SELECT status FROM subscriptions WHERE provider_subscription_id = ?",
(sub.id,), "status", "cancelled")
ok("Status → cancelled") if result and result["status"] == "cancelled" else fail(f"Status: {result['status'] if result else '?'}")
section("1b. Planner Pro — subscription lifecycle")
pro_user = users[1] if len(users) > 1 else users[0]
cus_pro = get_or_create_customer("e2e-pro@sandbox.padelnomics.com", "E2E Pro")
pm_pro = attach_pm(cus_pro.id)
sub = create_sub(cus_pro.id, price_map["pro"]["provider_price_id"],
{"user_id": str(pro_user["id"]), "plan": "pro"}, pm_pro)
ok(f"Created: {sub.id}")
rows = wait_for_row("SELECT * FROM subscriptions WHERE provider_subscription_id = ?", (sub.id,))
if rows and rows[0]["plan"] == "pro" and rows[0]["status"] == "active":
ok("DB: plan=pro, status=active")
else:
fail(f"DB: {rows[0] if rows else 'not found'}")
cancel_sub(sub.id)
ok("Cleaned up")
# ═══════════════════════════════════════════════════════════
# 2. SUPPLIER SUBSCRIPTIONS (all 4 variants)
# ═══════════════════════════════════════════════════════════
section("2a. Supplier Growth (monthly) — tier, credits, verified")
cus_sup = get_or_create_customer("e2e-supplier@sandbox.padelnomics.com", "E2E Supplier")
pm_sup = attach_pm(cus_sup.id)
sub = create_sub(cus_sup.id, price_map["supplier_growth"]["provider_price_id"], {
"user_id": str(supplier_user_id),
"supplier_id": str(test_supplier["id"]),
"plan": "supplier_growth",
}, pm_sup)
ok(f"Created: {sub.id}")
result = wait_for_value(
"SELECT tier, is_verified, monthly_credits, credit_balance FROM suppliers WHERE id = ?",
(test_supplier["id"],), "tier", "growth",
)
if result:
ok("tier=growth") if result["tier"] == "growth" else fail(f"tier={result['tier']}")
ok("is_verified=1") if result["is_verified"] == 1 else fail(f"is_verified={result['is_verified']}")
ok("monthly_credits=30") if result["monthly_credits"] == 30 else fail(f"monthly_credits={result['monthly_credits']}")
ok(f"credit_balance={result['credit_balance']}") if result["credit_balance"] >= 30 else fail(f"credit_balance={result['credit_balance']}")
else:
fail("Tier not updated")
# Check credit ledger entry was created
ledger = query_db(
"SELECT * FROM credit_ledger WHERE supplier_id = ? AND event_type = 'monthly_allocation' ORDER BY id DESC LIMIT 1",
(test_supplier["id"],),
)
ok("Credit ledger entry created") if ledger else fail("No credit ledger entry")
cancel_sub(sub.id)
ok("Cleaned up")
section("2b. Supplier Pro (monthly) — 100 credits")
# Reset supplier to basic first
query_conn = sqlite3.connect(DATABASE_PATH)
query_conn.execute("UPDATE suppliers SET tier='free', monthly_credits=0, credit_balance=0, is_verified=0 WHERE id=?",
(test_supplier["id"],))
query_conn.commit()
query_conn.close()
time.sleep(1)
sub = create_sub(cus_sup.id, price_map["supplier_pro"]["provider_price_id"], {
"user_id": str(supplier_user_id),
"supplier_id": str(test_supplier["id"]),
"plan": "supplier_pro",
}, pm_sup)
ok(f"Created: {sub.id}")
result = wait_for_value(
"SELECT tier, monthly_credits, credit_balance FROM suppliers WHERE id = ?",
(test_supplier["id"],), "tier", "pro",
)
if result:
ok("tier=pro") if result["tier"] == "pro" else fail(f"tier={result['tier']}")
ok("monthly_credits=100") if result["monthly_credits"] == 100 else fail(f"monthly_credits={result['monthly_credits']}")
ok(f"credit_balance={result['credit_balance']}") if result["credit_balance"] >= 100 else fail(f"credit_balance={result['credit_balance']}")
else:
fail("Tier not updated to pro")
cancel_sub(sub.id)
ok("Cleaned up")
section("2c. Supplier Growth (yearly)")
# Reset
query_conn = sqlite3.connect(DATABASE_PATH)
query_conn.execute("UPDATE suppliers SET tier='free', monthly_credits=0, credit_balance=0, is_verified=0 WHERE id=?",
(test_supplier["id"],))
query_conn.commit()
query_conn.close()
time.sleep(1)
sub = create_sub(cus_sup.id, price_map["supplier_growth_yearly"]["provider_price_id"], {
"user_id": str(supplier_user_id),
"supplier_id": str(test_supplier["id"]),
"plan": "supplier_growth_yearly",
}, pm_sup)
ok(f"Created: {sub.id}")
result = wait_for_value(
"SELECT tier, monthly_credits FROM suppliers WHERE id = ?",
(test_supplier["id"],), "tier", "growth",
)
if result:
ok("tier=growth (yearly maps to growth)")
ok("monthly_credits=30") if result["monthly_credits"] == 30 else fail(f"monthly_credits={result['monthly_credits']}")
else:
fail("Yearly growth not processed")
cancel_sub(sub.id)
ok("Cleaned up")
section("2d. Supplier Pro (yearly)")
query_conn = sqlite3.connect(DATABASE_PATH)
query_conn.execute("UPDATE suppliers SET tier='free', monthly_credits=0, credit_balance=0, is_verified=0 WHERE id=?",
(test_supplier["id"],))
query_conn.commit()
query_conn.close()
time.sleep(1)
sub = create_sub(cus_sup.id, price_map["supplier_pro_yearly"]["provider_price_id"], {
"user_id": str(supplier_user_id),
"supplier_id": str(test_supplier["id"]),
"plan": "supplier_pro_yearly",
}, pm_sup)
ok(f"Created: {sub.id}")
result = wait_for_value(
"SELECT tier, monthly_credits FROM suppliers WHERE id = ?",
(test_supplier["id"],), "tier", "pro",
)
if result:
ok("tier=pro (yearly maps to pro)")
ok("monthly_credits=100") if result["monthly_credits"] == 100 else fail(f"monthly_credits={result['monthly_credits']}")
else:
fail("Yearly pro not processed")
cancel_sub(sub.id)
ok("Cleaned up")
# ═══════════════════════════════════════════════════════════
# 3. BOOST ADD-ON SUBSCRIPTIONS (all 4)
# ═══════════════════════════════════════════════════════════
section("3. Boost add-on subscriptions (Logo, Highlight, Verified, Card Color)")
cus_boost = get_or_create_customer("e2e-boost@sandbox.padelnomics.com", "E2E Boost")
pm_boost = attach_pm(cus_boost.id)
boost_keys = ["boost_logo", "boost_highlight", "boost_verified", "boost_card_color"]
for key in boost_keys:
price_id = price_map[key]["provider_price_id"]
sub = create_sub(cus_boost.id, price_id, {
"user_id": str(supplier_user_id),
"supplier_id": str(test_supplier["id"]),
"plan": key,
}, pm_boost)
ok(f"{key}: {sub.id} (active)")
# Let webhook arrive
time.sleep(2)
cancel_sub(sub.id)
# Boosts with plan starting "boost_" don't hit supplier handler (only supplier_ plans do).
# They go through the user subscription path. Verify at least the webhooks were accepted.
# Check ngrok logs for 200s
import json
import urllib.request
try:
resp = urllib.request.urlopen("http://localhost:4040/api/requests/http?limit=50", timeout=5)
requests_data = json.loads(resp.read())
webhook_200s = sum(1 for r in requests_data.get("requests", [])
if r.get("request", {}).get("uri") == "/billing/webhook/stripe"
and r.get("response", {}).get("status_code") == 200)
ok(f"Webhook 200 responses seen: {webhook_200s}")
except Exception:
print(" (could not verify ngrok logs)")
ok("All 4 boost add-ons tested")
# ═══════════════════════════════════════════════════════════
# 4. CHECKOUT SESSIONS — every product
# ═══════════════════════════════════════════════════════════
section("4. Checkout session creation (all 17 products)")
try:
ngrok_resp = urllib.request.urlopen("http://localhost:4040/api/tunnels", timeout=5)
tunnel_url = json.loads(ngrok_resp.read())["tunnels"][0]["public_url"]
except Exception:
tunnel_url = "http://localhost:5000"
checkout_ok = 0
for key, p in sorted(price_map.items()):
mode = "subscription" if p["billing_type"] == "subscription" else "payment"
try:
stripe.checkout.Session.create(
mode=mode,
customer=cus_starter.id,
line_items=[{"price": p["provider_price_id"], "quantity": 1}],
metadata={"user_id": str(test_user["id"]), "plan": key, "test": "true"},
success_url=f"{tunnel_url}/billing/success?session_id={{CHECKOUT_SESSION_ID}}",
cancel_url=f"{tunnel_url}/billing/pricing",
)
checkout_ok += 1
except stripe.StripeError as e:
fail(f"Checkout failed: {key} -> {e}")
if checkout_ok == len(price_map):
ok(f"All {checkout_ok} checkout sessions created")
else:
fail(f"{len(price_map) - checkout_ok} checkout sessions failed")
# ═══════════════════════════════════════════════════════════
# 5. PAYMENT FAILURE — declined card
# ═══════════════════════════════════════════════════════════
section("5. Payment failure — declined card scenarios")
cus_fail = get_or_create_customer("e2e-failure@sandbox.padelnomics.com", "E2E Failure")
fail_user = users[2] if len(users) > 2 else users[0]
# 5a. First create a valid subscription, then simulate payment failure
pm_valid = attach_pm(cus_fail.id)
try:
sub_fail = stripe.Subscription.create(
customer=cus_fail.id,
items=[{"price": price_map["starter"]["provider_price_id"]}],
metadata={"user_id": str(fail_user["id"]), "plan": "starter"},
default_payment_method=pm_valid,
)
cleanup_sub_ids.append(sub_fail.id)
ok(f"Created valid sub first: {sub_fail.id} (status={sub_fail.status})")
# Wait for subscription.created webhook
rows = wait_for_row("SELECT * FROM subscriptions WHERE provider_subscription_id = ?", (sub_fail.id,))
ok("DB row created") if rows else fail("No DB row after valid sub creation")
# Now swap to a declined card — next invoice will fail
try:
pm_decline = stripe.PaymentMethod.create(type="card", card={"token": "tok_chargeDeclined"})
stripe.PaymentMethod.attach(pm_decline.id, customer=cus_fail.id)
stripe.Customer.modify(cus_fail.id, invoice_settings={"default_payment_method": pm_decline.id})
ok("Swapped to declined card for next billing cycle")
except stripe.CardError:
ok("tok_chargeDeclined rejected at attach (newer API) — card swap skipped")
cancel_sub(sub_fail.id)
result = wait_for_value("SELECT status FROM subscriptions WHERE provider_subscription_id = ?",
(sub_fail.id,), "status", "cancelled")
ok("Cancelled after failure test") if result else ok("Cleanup done")
except stripe.CardError as e:
ok(f"Card declined at subscription level: {e.user_message}")
# 5b. Try creating subscription with payment_behavior=default_incomplete
try:
pm_ok = stripe.PaymentMethod.create(type="card", card={"token": "tok_visa"})
stripe.PaymentMethod.attach(pm_ok.id, customer=cus_fail.id)
sub_inc = stripe.Subscription.create(
customer=cus_fail.id,
items=[{"price": price_map["pro"]["provider_price_id"]}],
metadata={"user_id": str(fail_user["id"]), "plan": "pro"},
default_payment_method=pm_ok.id,
payment_behavior="default_incomplete",
)
cleanup_sub_ids.append(sub_inc.id)
ok(f"Incomplete-mode sub: {sub_inc.id} (status={sub_inc.status})")
cancel_sub(sub_inc.id)
except stripe.StripeError as e:
ok(f"Incomplete mode handled: {e}")
# ═══════════════════════════════════════════════════════════
# 6. EDGE CASES
# ═══════════════════════════════════════════════════════════
section("6a. Edge case — missing user_id in metadata")
cus_edge = get_or_create_customer("e2e-edge@sandbox.padelnomics.com", "E2E Edge")
pm_edge = attach_pm(cus_edge.id)
sub = create_sub(cus_edge.id, price_map["starter"]["provider_price_id"],
{"plan": "starter"}, # NO user_id
pm_edge)
ok(f"Created sub without user_id: {sub.id}")
# Webhook should arrive but handler should not crash (no DB write expected)
time.sleep(5)
# Server should not have crashed — verify it's still up
result = subprocess.run(
["curl", "-s", "-o", "/dev/null", "-w", "%{http_code}", "http://localhost:5000/"],
capture_output=True, text=True, timeout=5,
)
ok("Server still alive after missing user_id") if result.stdout.strip() in ("200", "301", "302") else fail("Server crashed!")
cancel_sub(sub.id)
section("6b. Edge case — missing supplier_id for supplier plan")
sub = create_sub(cus_edge.id, price_map["supplier_growth"]["provider_price_id"],
{"user_id": str(test_user["id"]), "plan": "supplier_growth"}, # NO supplier_id
pm_edge)
ok(f"Created supplier sub without supplier_id: {sub.id}")
time.sleep(5)
result = subprocess.run(
["curl", "-s", "-o", "/dev/null", "-w", "%{http_code}", "http://localhost:5000/"],
capture_output=True, text=True, timeout=5,
)
ok("Server still alive after missing supplier_id") if result.stdout.strip() in ("200", "301", "302") else fail("Server crashed!")
cancel_sub(sub.id)
section("6c. Edge case — duplicate subscription (idempotency)")
# Create same subscription twice for same user
cus_dup = get_or_create_customer("e2e-dup@sandbox.padelnomics.com", "E2E Dup")
pm_dup = attach_pm(cus_dup.id)
dup_user = users[3] if len(users) > 3 else users[0]
sub1 = create_sub(cus_dup.id, price_map["starter"]["provider_price_id"],
{"user_id": str(dup_user["id"]), "plan": "starter"}, pm_dup)
time.sleep(3)
sub2 = create_sub(cus_dup.id, price_map["pro"]["provider_price_id"],
{"user_id": str(dup_user["id"]), "plan": "pro"}, pm_dup)
time.sleep(3)
rows = query_db("SELECT * FROM subscriptions WHERE user_id = ? ORDER BY created_at", (dup_user["id"],))
ok(f"Two subscriptions exist: {len(rows)} rows") if len(rows) >= 2 else fail(f"Expected 2+ rows, got {len(rows)}")
# get_subscription returns most recent
latest = query_db("SELECT * FROM subscriptions WHERE user_id = ? ORDER BY created_at DESC LIMIT 1", (dup_user["id"],))
if latest and latest[0]["plan"] == "pro":
ok("Latest subscription is 'pro' (upgrade scenario)")
else:
fail(f"Latest plan: {latest[0]['plan'] if latest else '?'}")
cancel_sub(sub1.id)
cancel_sub(sub2.id)
section("6d. Edge case — rapid create + cancel (race condition)")
cus_race = get_or_create_customer("e2e-race@sandbox.padelnomics.com", "E2E Race")
pm_race = attach_pm(cus_race.id)
race_user = users[4] if len(users) > 4 else users[0]
sub = create_sub(cus_race.id, price_map["starter"]["provider_price_id"],
{"user_id": str(race_user["id"]), "plan": "starter"}, pm_race)
# Cancel immediately — webhooks may arrive out of order
stripe.Subscription.cancel(sub.id)
ok(f"Created and immediately cancelled: {sub.id}")
time.sleep(8) # Wait for both webhooks
rows = query_db("SELECT * FROM subscriptions WHERE provider_subscription_id = ?", (sub.id,))
if rows:
ok(f"Final DB status: {rows[0]['status']}")
else:
ok("No DB row (created webhook may have arrived after deleted)")
result = subprocess.run(
["curl", "-s", "-o", "/dev/null", "-w", "%{http_code}", "http://localhost:5000/"],
capture_output=True, text=True, timeout=5,
)
ok("Server survived race condition") if result.stdout.strip() in ("200", "301", "302") else fail("Server crashed!")
# ═══════════════════════════════════════════════════════════
# 7. BILLING PORTAL
# ═══════════════════════════════════════════════════════════
section("7. Billing Portal session")
try:
portal = stripe.billing_portal.Session.create(
customer=cus_starter.id,
return_url=f"{tunnel_url}/billing/success",
)
ok(f"Portal URL: {portal.url[:50]}...")
except stripe.StripeError as e:
fail(f"Portal failed: {e}")
# ═══════════════════════════════════════════════════════════
# 8. ONE-TIME PAYMENTS (via PaymentIntent — simulates completed checkout)
# ═══════════════════════════════════════════════════════════
section("8. One-time payments (PaymentIntents — all credit packs + boosts + PDF)")
cus_buyer = get_or_create_customer("e2e-buyer@sandbox.padelnomics.com", "E2E Buyer")
pm_buyer = attach_pm(cus_buyer.id)
one_time_products = [
("credits_25", 9900),
("credits_50", 17900),
("credits_100", 32900),
("credits_250", 74900),
("boost_sticky_week", 7900),
("boost_sticky_month", 19900),
("business_plan", 14900),
]
for key, amount_cents in one_time_products:
try:
pi = stripe.PaymentIntent.create(
amount=amount_cents,
currency="eur",
customer=cus_buyer.id,
payment_method=pm_buyer,
confirm=True,
automatic_payment_methods={"enabled": True, "allow_redirects": "never"},
metadata={
"user_id": str(test_user["id"]),
"supplier_id": str(test_supplier["id"]),
"plan": key,
},
)
if pi.status == "succeeded":
ok(f"{key}: \u20ac{amount_cents/100:.2f} succeeded ({pi.id[:20]}...)")
else:
fail(f"{key}: status={pi.status}")
except stripe.StripeError as e:
fail(f"{key}: {e}")
# Note: PaymentIntents don't trigger checkout.session.completed webhooks.
# The actual credit/boost/PDF creation requires a Checkout Session completion,
# which can only happen via browser. These tests verify the payments succeed.
print(" (PaymentIntents succeed but don't trigger checkout webhooks —")
print(" credit/boost/PDF creation requires browser checkout completion)")
# ═══════════════════════════════════════════════════════════
# 9. DECLINED CARDS — different failure modes
# ═══════════════════════════════════════════════════════════
section("9. Declined card scenarios (PaymentIntent level)")
decline_tokens = [
("tok_chargeDeclined", "generic decline"),
("tok_chargeDeclinedInsufficientFunds", "insufficient funds"),
("tok_chargeDeclinedExpiredCard", "expired card"),
("tok_chargeDeclinedProcessingError", "processing error"),
]
for token, description in decline_tokens:
try:
pm = stripe.PaymentMethod.create(type="card", card={"token": token})
stripe.PaymentMethod.attach(pm.id, customer=cus_buyer.id)
pi = stripe.PaymentIntent.create(
amount=1900,
currency="eur",
customer=cus_buyer.id,
payment_method=pm.id,
confirm=True,
automatic_payment_methods={"enabled": True, "allow_redirects": "never"},
)
fail(f"{description}: should have been declined but succeeded")
except stripe.CardError as e:
ok(f"{description}: correctly declined ({e.code})")
except stripe.StripeError as e:
ok(f"{description}: rejected ({type(e).__name__})")
# ═══════════════════════════════════════════════════════════
# Summary
# ═══════════════════════════════════════════════════════════
section("RESULTS")
total = passed + failed
print(f"\n {passed}/{total} passed, {failed} failed\n")
if errors:
print(" Failures:")
for err in errors:
print(f" - {err}")
print()
# Final cleanup: cancel any remaining subs
for sid in cleanup_sub_ids:
try:
stripe.Subscription.cancel(sid)
except Exception:
pass
sys.exit(1 if failed else 0)

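Both scripts lean on the same poll-until-webhook pattern: the Stripe API call returns immediately, but the corresponding DB row only appears once the webhook has travelled through ngrok and been handled by the dev server, so assertions must poll with a deadline rather than read once. A minimal standalone sketch of that pattern (the table name and temp path here are illustrative; the real scripts poll the `subscriptions` table at `DATABASE_PATH`):

```python
import os
import sqlite3
import tempfile
import threading
import time

def wait_for_value(db_path, sql, params, column, expected,
                   timeout_seconds=5, poll_seconds=0.1):
    """Poll a read-only connection until the first row's column equals expected.

    Returns the matching row, or the last row seen (possibly None) on timeout.
    """
    deadline = time.time() + timeout_seconds
    last = None
    while time.time() < deadline:
        conn = sqlite3.connect(f"file:{db_path}?mode=ro", uri=True)
        conn.row_factory = sqlite3.Row
        try:
            rows = [dict(r) for r in conn.execute(sql, params).fetchall()]
        finally:
            conn.close()
        if rows:
            last = rows[0]
            if last[column] == expected:
                return last
        time.sleep(poll_seconds)
    return last

# Demo: a writer thread plays the role of the webhook handler.
path = os.path.join(tempfile.mkdtemp(), "demo.db")
conn = sqlite3.connect(path)
conn.execute("CREATE TABLE subscriptions (id TEXT, status TEXT)")
conn.execute("INSERT INTO subscriptions VALUES ('sub_1', 'active')")
conn.commit()
conn.close()

def webhook_writes():
    time.sleep(0.3)  # simulate webhook delivery latency
    c = sqlite3.connect(path)
    c.execute("UPDATE subscriptions SET status='cancelled' WHERE id='sub_1'")
    c.commit()
    c.close()

threading.Thread(target=webhook_writes).start()
row = wait_for_value(path, "SELECT status FROM subscriptions WHERE id = ?",
                     ("sub_1",), "status", "cancelled")
print(row["status"])  # → cancelled
```

Opening a fresh read-only connection per poll (rather than holding one open) matters here: it avoids stale snapshots and lock contention with the server process writing the same SQLite file.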

@@ -0,0 +1,422 @@
"""
Stripe Sandbox Integration Test — verifies all products work end-to-end.
Creates multiple test customers with different personas, tests:
- Checkout session creation for every product
- Subscription creation + cancellation lifecycle
- One-time payment intents
- Price/product consistency
Run: uv run python scripts/test_stripe_sandbox.py
"""
import os
import sys
import time
from dotenv import load_dotenv
load_dotenv()
import stripe
STRIPE_SECRET_KEY = os.getenv("STRIPE_SECRET_KEY", "") or os.getenv("STRIPE_API_PRIVATE_KEY", "")
if not STRIPE_SECRET_KEY:
print("ERROR: STRIPE_SECRET_KEY / STRIPE_API_PRIVATE_KEY not set in .env")
sys.exit(1)
stripe.api_key = STRIPE_SECRET_KEY
stripe.max_network_retries = 2
BASE_URL = os.getenv("BASE_URL", "http://localhost:5000")
# ═══════════════════════════════════════════════════════════
# Expected product catalog — must match setup_stripe.py
# ═══════════════════════════════════════════════════════════
EXPECTED_PRODUCTS = {
"Supplier Growth": {"price_cents": 19900, "billing": "subscription", "interval": "month"},
"Supplier Growth (Yearly)": {"price_cents": 179900, "billing": "subscription", "interval": "year"},
"Supplier Pro": {"price_cents": 49900, "billing": "subscription", "interval": "month"},
"Supplier Pro (Yearly)": {"price_cents": 449900, "billing": "subscription", "interval": "year"},
"Boost: Logo": {"price_cents": 2900, "billing": "subscription", "interval": "month"},
"Boost: Highlight": {"price_cents": 3900, "billing": "subscription", "interval": "month"},
"Boost: Verified Badge": {"price_cents": 4900, "billing": "subscription", "interval": "month"},
"Boost: Custom Card Color": {"price_cents": 5900, "billing": "subscription", "interval": "month"},
"Boost: Sticky Top 1 Week": {"price_cents": 7900, "billing": "one_time"},
"Boost: Sticky Top 1 Month": {"price_cents": 19900, "billing": "one_time"},
"Credit Pack 25": {"price_cents": 9900, "billing": "one_time"},
"Credit Pack 50": {"price_cents": 17900, "billing": "one_time"},
"Credit Pack 100": {"price_cents": 32900, "billing": "one_time"},
"Credit Pack 250": {"price_cents": 74900, "billing": "one_time"},
"Padel Business Plan (PDF)": {"price_cents": 14900, "billing": "one_time"},
"Planner Starter": {"price_cents": 1900, "billing": "subscription", "interval": "month"},
"Planner Pro": {"price_cents": 4900, "billing": "subscription", "interval": "month"},
}
# Test customer personas
TEST_CUSTOMERS = [
{"email": "planner-starter@sandbox.padelnomics.com", "name": "Anna Planner (Starter)"},
{"email": "planner-pro@sandbox.padelnomics.com", "name": "Ben Planner (Pro)"},
{"email": "supplier-growth@sandbox.padelnomics.com", "name": "Carlos Supplier (Growth)"},
{"email": "supplier-pro@sandbox.padelnomics.com", "name": "Diana Supplier (Pro)"},
{"email": "one-time-buyer@sandbox.padelnomics.com", "name": "Eva Buyer (Credits+Boosts)"},
]
passed = 0
failed = 0
errors = []
def ok(msg):
global passed
passed += 1
print(f"  ✓ {msg}")
def fail(msg):
global failed
failed += 1
errors.append(msg)
print(f"  ✗ {msg}")
def section(title):
print(f"\n{'═' * 60}")
print(f" {title}")
print(f"{'═' * 60}")
# ═══════════════════════════════════════════════════════════
# Phase 1: Verify all products and prices exist
# ═══════════════════════════════════════════════════════════
section("Phase 1: Product & Price Verification")
products = list(stripe.Product.list(limit=100, active=True).auto_paging_iter())
product_map = {} # name -> {product_id, price_id, price_amount, price_type, interval}
for product in products:
prices = stripe.Price.list(product=product.id, active=True, limit=1)
if not prices.data:
continue
price = prices.data[0]
product_map[product.name] = {
"product_id": product.id,
"price_id": price.id,
"price_amount": price.unit_amount,
"price_type": price.type,
"interval": price.recurring.interval if price.recurring else None,
}
for name, expected in EXPECTED_PRODUCTS.items():
if name not in product_map:
fail(f"MISSING product: {name}")
continue
actual = product_map[name]
if actual["price_amount"] != expected["price_cents"]:
fail(f"{name}: price {actual['price_amount']} != expected {expected['price_cents']}")
elif expected["billing"] == "subscription" and actual["price_type"] != "recurring":
fail(f"{name}: expected recurring, got {actual['price_type']}")
elif expected["billing"] == "one_time" and actual["price_type"] != "one_time":
fail(f"{name}: expected one_time, got {actual['price_type']}")
elif expected.get("interval") and actual["interval"] != expected["interval"]:
fail(f"{name}: interval {actual['interval']} != expected {expected['interval']}")
else:
ok(f"{name}: €{actual['price_amount']/100:.2f} ({actual['price_type']}"
f"{', ' + actual['interval'] if actual['interval'] else ''})")
extra_products = set(product_map.keys()) - set(EXPECTED_PRODUCTS.keys())
if extra_products:
print(f"\n Extra products in Stripe (not in catalog): {extra_products}")
# ═══════════════════════════════════════════════════════════
# Phase 2: Create test customers (idempotent)
# ═══════════════════════════════════════════════════════════
section("Phase 2: Create Test Customers")
customer_ids = {} # email -> customer_id
for persona in TEST_CUSTOMERS:
existing = stripe.Customer.list(email=persona["email"], limit=1)
if existing.data:
cus = existing.data[0]
ok(f"Reusing: {persona['name']} ({cus.id})")
else:
cus = stripe.Customer.create(
email=persona["email"],
name=persona["name"],
metadata={"test": "true", "persona": persona["name"]},
)
ok(f"Created: {persona['name']} ({cus.id})")
customer_ids[persona["email"]] = cus.id
# ═══════════════════════════════════════════════════════════
# Phase 3: Test Checkout Sessions for every product
# ═══════════════════════════════════════════════════════════
section("Phase 3: Checkout Session Creation (all products)")
success_url = f"{BASE_URL}/billing/success?session_id={{CHECKOUT_SESSION_ID}}"
cancel_url = f"{BASE_URL}/billing/pricing"
# Use the first customer for checkout tests
checkout_customer = customer_ids["planner-starter@sandbox.padelnomics.com"]
for name, info in product_map.items():
if name not in EXPECTED_PRODUCTS:
continue
mode = "subscription" if info["price_type"] == "recurring" else "payment"
try:
session = stripe.checkout.Session.create(
mode=mode,
customer=checkout_customer,
line_items=[{"price": info["price_id"], "quantity": 1}],
metadata={"user_id": "999", "plan": name, "test": "true"},
success_url=success_url,
cancel_url=cancel_url,
)
ok(f"Checkout ({mode}): {name} -> {session.id[:30]}...")
except stripe.StripeError as e:
fail(f"Checkout FAILED for {name}: {e.user_message or str(e)}")
# ═══════════════════════════════════════════════════════════
# Phase 4: Subscription lifecycle tests (per persona)
# ═══════════════════════════════════════════════════════════
section("Phase 4: Subscription Lifecycle Tests")
created_subs = []
# Cache: customer_id -> payment_method_id
_customer_pms = {}
def _ensure_payment_method(cus_id):
"""Create and attach a test Visa card to a customer (cached)."""
if cus_id in _customer_pms:
return _customer_pms[cus_id]
pm = stripe.PaymentMethod.create(type="card", card={"token": "tok_visa"})
stripe.PaymentMethod.attach(pm.id, customer=cus_id)
stripe.Customer.modify(
cus_id,
invoice_settings={"default_payment_method": pm.id},
)
_customer_pms[cus_id] = pm.id
return pm.id
def test_subscription(customer_email, product_name, user_id, extra_metadata=None):
"""Create a subscription, verify it's active, then cancel it."""
cus_id = customer_ids[customer_email]
info = product_map.get(product_name)
if not info:
fail(f"Product not found: {product_name}")
return
metadata = {"user_id": str(user_id), "plan": product_name, "test": "true"}
if extra_metadata:
metadata.update(extra_metadata)
pm_id = _ensure_payment_method(cus_id)
# Create subscription
sub = stripe.Subscription.create(
customer=cus_id,
items=[{"price": info["price_id"]}],
metadata=metadata,
default_payment_method=pm_id,
)
created_subs.append(sub.id)
if sub.status == "active":
ok(f"Sub created: {product_name} for {customer_email} -> {sub.id} (active)")
else:
fail(f"Sub status unexpected: {product_name} -> {sub.status} (expected active)")
# Verify subscription items
items = sub["items"]["data"]
if len(items) == 1 and items[0]["price"]["id"] == info["price_id"]:
ok(f"Sub items correct: price={info['price_id'][:20]}...")
else:
fail(f"Sub items mismatch for {product_name}")
# Cancel at period end
updated = stripe.Subscription.modify(sub.id, cancel_at_period_end=True)
if updated.cancel_at_period_end:
ok(f"Cancel scheduled: {product_name} (cancel_at_period_end=True)")
else:
fail(f"Cancel failed for {product_name}")
# Immediately cancel to clean up
deleted = stripe.Subscription.cancel(sub.id)
if deleted.status == "canceled":
ok(f"Cancelled: {product_name} -> {deleted.status}")
else:
fail(f"Final cancel status: {product_name} -> {deleted.status}")
# Planner Starter
test_subscription(
"planner-starter@sandbox.padelnomics.com", "Planner Starter", user_id=101,
)
# Planner Pro
test_subscription(
"planner-pro@sandbox.padelnomics.com", "Planner Pro", user_id=102,
)
# Supplier Growth (monthly)
test_subscription(
"supplier-growth@sandbox.padelnomics.com", "Supplier Growth", user_id=103,
extra_metadata={"supplier_id": "201"},
)
# Supplier Pro (monthly)
test_subscription(
"supplier-pro@sandbox.padelnomics.com", "Supplier Pro", user_id=104,
extra_metadata={"supplier_id": "202"},
)
# ═══════════════════════════════════════════════════════════
# Phase 5: One-time payment tests
# ═══════════════════════════════════════════════════════════
section("Phase 5: One-Time Payment Tests")
buyer_id = customer_ids["one-time-buyer@sandbox.padelnomics.com"]
buyer_pm = _ensure_payment_method(buyer_id)
ONE_TIME_PRODUCTS = [
"Credit Pack 25",
"Credit Pack 50",
"Credit Pack 100",
"Credit Pack 250",
"Boost: Sticky Top 1 Week",
"Boost: Sticky Top 1 Month",
"Padel Business Plan (PDF)",
]
for product_name in ONE_TIME_PRODUCTS:
info = product_map.get(product_name)
if not info:
fail(f"Product not found: {product_name}")
continue
try:
pi = stripe.PaymentIntent.create(
amount=info["price_amount"],
currency="eur",
customer=buyer_id,
payment_method=buyer_pm,
confirm=True,
automatic_payment_methods={"enabled": True, "allow_redirects": "never"},
metadata={
"user_id": "105",
"supplier_id": "203",
"plan": product_name,
"test": "true",
},
)
if pi.status == "succeeded":
ok(f"Payment: {product_name} -> €{info['price_amount']/100:.2f} ({pi.id[:25]}...)")
else:
fail(f"Payment status: {product_name} -> {pi.status}")
except stripe.StripeError as e:
fail(f"Payment FAILED for {product_name}: {e.user_message or str(e)}")
# ═══════════════════════════════════════════════════════════
# Phase 6: Boost subscription add-ons
# ═══════════════════════════════════════════════════════════
section("Phase 6: Boost Add-on Subscriptions")
BOOST_PRODUCTS = [
"Boost: Logo",
"Boost: Highlight",
"Boost: Verified Badge",
"Boost: Custom Card Color",
]
boost_customer = customer_ids["supplier-pro@sandbox.padelnomics.com"]
boost_pm = _ensure_payment_method(boost_customer)
for product_name in BOOST_PRODUCTS:
info = product_map.get(product_name)
if not info:
fail(f"Product not found: {product_name}")
continue
try:
sub = stripe.Subscription.create(
customer=boost_customer,
items=[{"price": info["price_id"]}],
metadata={
"user_id": "104",
"supplier_id": "202",
"plan": product_name,
"test": "true",
},
default_payment_method=boost_pm,
)
created_subs.append(sub.id)
if sub.status == "active":
ok(f"Boost sub: {product_name} -> €{info['price_amount']/100:.2f}/mo ({sub.id[:25]}...)")
else:
fail(f"Boost sub status: {product_name} -> {sub.status}")
# Clean up
stripe.Subscription.cancel(sub.id)
except stripe.StripeError as e:
fail(f"Boost sub FAILED for {product_name}: {e.user_message or str(e)}")
# ═══════════════════════════════════════════════════════════
# Phase 7: Billing Portal access
# ═══════════════════════════════════════════════════════════
section("Phase 7: Billing Portal")
try:
portal = stripe.billing_portal.Session.create(
customer=checkout_customer,
return_url=f"{BASE_URL}/billing/success",
)
ok(f"Portal URL generated: {portal.url[:50]}...")
except stripe.StripeError as e:
fail(f"Portal creation failed: {e.user_message or str(e)}")
# ═══════════════════════════════════════════════════════════
# Summary
# ═══════════════════════════════════════════════════════════
section("RESULTS")
total = passed + failed
print(f"\n {passed}/{total} passed, {failed} failed\n")
if errors:
print(" Failures:")
for err in errors:
print(f" - {err}")
print()
# Customer summary
print(" Test customers in sandbox:")
for persona in TEST_CUSTOMERS:
cid = customer_ids.get(persona["email"], "?")
print(f" {persona['name']}: {cid}")
print()
sys.exit(1 if failed else 0)

View File

@@ -54,6 +54,7 @@ Grain must match reality — use `QUALIFY ROW_NUMBER()` to enforce it.
| Dimension | Grain | Used by |
|-----------|-------|---------|
| `foundation.dim_countries` | `country_code` | `dim_cities`, `dim_locations`, `pseo_city_costs_de`, `planner_defaults` — single source for country names, income, PLI/cost overrides |
| `foundation.dim_venues` | `venue_id` | `dim_cities`, `dim_venue_capacity`, `fct_daily_availability` (via capacity join) |
| `foundation.dim_cities` | `(country_code, city_slug)` | `serving.city_market_profile` → all pSEO serving models |
| `foundation.dim_locations` | `(country_code, geoname_id)` | `serving.location_opportunity_profile` — all GeoNames locations (pop ≥1K), incl. zero-court locations |

View File

@@ -16,5 +16,107 @@ def padelnomics_glob(evaluator) -> str:
return f"'{landing_dir}/padelnomics/**/*.csv.gz'"
# Add one macro per landing zone subdirectory you create.
# Pattern: def {source}_glob(evaluator) → f"'{landing_dir}/{source}/**/*.csv.gz'"
# ── Country code helpers ─────────────────────────────────────────────────────
# Shared lookup used by dim_cities and dim_locations.
_COUNTRY_NAMES = {
"DE": "Germany", "ES": "Spain", "GB": "United Kingdom",
"FR": "France", "IT": "Italy", "PT": "Portugal",
"AT": "Austria", "CH": "Switzerland", "NL": "Netherlands",
"BE": "Belgium", "SE": "Sweden", "NO": "Norway",
"DK": "Denmark", "FI": "Finland", "US": "United States",
"AR": "Argentina", "MX": "Mexico", "AE": "UAE",
"AU": "Australia", "IE": "Ireland",
}
def _country_case(col: str) -> str:
"""Build a CASE expression mapping ISO 3166-1 alpha-2 → English name."""
whens = "\n ".join(
f"WHEN '{code}' THEN '{name}'" for code, name in _COUNTRY_NAMES.items()
)
return f"CASE {col}\n {whens}\n ELSE {col}\n END"
@macro()
def country_name(evaluator, code_col) -> str:
"""CASE expression: country code → English name.
Usage in SQL: @country_name(vc.country_code) AS country_name_en
"""
return _country_case(str(code_col))
@macro()
def country_slug(evaluator, code_col) -> str:
"""CASE expression: country code → URL-safe slug (lowercased, spaces → dashes).
Usage in SQL: @country_slug(vc.country_code) AS country_slug
"""
return f"LOWER(REGEXP_REPLACE({_country_case(str(code_col))}, '[^a-zA-Z0-9]+', '-'))"
@macro()
def normalize_eurostat_country(evaluator, code_col) -> str:
"""Normalize Eurostat country codes to ISO 3166-1 alpha-2: EL→GR, UK→GB.
Usage in SQL: @normalize_eurostat_country(geo_code) AS country_code
"""
col = str(code_col)
return f"CASE {col} WHEN 'EL' THEN 'GR' WHEN 'UK' THEN 'GB' ELSE {col} END"
@macro()
def normalize_eurostat_nuts(evaluator, code_col) -> str:
"""Normalize NUTS code prefix: EL→GR, UK→GB, preserving the suffix.
Usage in SQL: @normalize_eurostat_nuts(geo_code) AS nuts_code
"""
col = str(code_col)
return (
f"CASE"
f" WHEN {col} LIKE 'EL%' THEN 'GR' || SUBSTR({col}, 3)"
f" WHEN {col} LIKE 'UK%' THEN 'GB' || SUBSTR({col}, 3)"
f" ELSE {col}"
f" END"
)
@macro()
def slugify(evaluator, col) -> str:
"""URL-safe slug: lowercase → ß→ss → strip accents → non-alnum to dashes → trim.
Usage in SQL: @slugify(city) AS city_slug
"""
c = str(col)
    return (
        # 'g' flag: without it DuckDB's REGEXP_REPLACE only replaces the first
        # run of separators, mangling multi-word names ("frankfurt-am main")
        f"TRIM(REGEXP_REPLACE("
        f"LOWER(STRIP_ACCENTS(REPLACE(LOWER({c}), 'ß', 'ss'))), "
        f"'[^a-z0-9]+', '-', 'g'"
        f"), '-')"
    )
@macro()
def infer_country_from_coords(evaluator, lat_col, lon_col) -> str:
"""Infer ISO country code from lat/lon using bounding boxes for 8 European markets.
Usage in SQL:
COALESCE(NULLIF(TRIM(UPPER(country_code)), ''),
@infer_country_from_coords(lat, lon)) AS country_code
"""
lat = str(lat_col)
lon = str(lon_col)
    return (
        f"CASE"
        # Smaller markets first: the CH/AT boxes fall inside DE's bounding box
        # and PT's box falls inside ES's, so testing them first avoids
        # misclassifying Zürich, Salzburg, Lisbon, etc.
        f" WHEN {lat} BETWEEN 45.46 AND 47.80 AND {lon} BETWEEN 5.96 AND 10.49 THEN 'CH'"
        f" WHEN {lat} BETWEEN 46.37 AND 49.02 AND {lon} BETWEEN 9.53 AND 17.16 THEN 'AT'"
        f" WHEN {lat} BETWEEN 37.00 AND 42.15 AND {lon} BETWEEN -9.50 AND -6.19 THEN 'PT'"
        f" WHEN {lat} BETWEEN 47.27 AND 55.06 AND {lon} BETWEEN 5.87 AND 15.04 THEN 'DE'"
        f" WHEN {lat} BETWEEN 35.95 AND 43.79 AND {lon} BETWEEN -9.39 AND 4.33 THEN 'ES'"
        f" WHEN {lat} BETWEEN 49.90 AND 60.85 AND {lon} BETWEEN -8.62 AND 1.77 THEN 'GB'"
        f" WHEN {lat} BETWEEN 41.36 AND 51.09 AND {lon} BETWEEN -5.14 AND 9.56 THEN 'FR'"
        f" WHEN {lat} BETWEEN 36.35 AND 47.09 AND {lon} BETWEEN 6.62 AND 18.51 THEN 'IT'"
        f" ELSE NULL"
        f" END"
    )
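Because the macro helpers above return plain SQL strings, they can be smoke-tested without a DuckDB connection. A minimal sketch (the lookup is trimmed to two entries here, and the whitespace inside the generated CASE differs slightly from the real helper):

```python
# Sketch: unit-test the CASE-builder without a database.
# Lookup trimmed to two entries for brevity.
_COUNTRY_NAMES = {"DE": "Germany", "GB": "United Kingdom"}

def _country_case(col: str) -> str:
    """Build a CASE expression mapping ISO 3166-1 alpha-2 -> English name."""
    whens = "\n        ".join(
        f"WHEN '{code}' THEN '{name}'" for code, name in _COUNTRY_NAMES.items()
    )
    return f"CASE {col}\n        {whens}\n        ELSE {col}\n    END"

sql = _country_case("vc.country_code")
print(sql.splitlines()[0])  # → CASE vc.country_code
```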

View File

@@ -5,7 +5,7 @@
-- Conformed dimension: used by city_market_profile and all pSEO serving models.
-- Integrates four sources:
-- dim_venues → city list, venue count, coordinates (Playtomic + OSM)
-- stg_income → country-level median income (Eurostat)
-- foundation.dim_countries → country_name_en, country_slug, median_income_pps
-- stg_city_labels → Eurostat city_code → city_name mapping (EU cities)
-- stg_population → Eurostat city-level population (EU, joined via city code)
-- stg_population_usa → US Census ACS place population
@@ -33,8 +33,7 @@ venue_cities AS (
SELECT
country_code,
city AS city_name,
-- Lowercase before regex so uppercase letters aren't stripped to '-'
LOWER(REGEXP_REPLACE(LOWER(city), '[^a-z0-9]+', '-')) AS city_slug,
@slugify(city) AS city_slug,
COUNT(*) AS padel_venue_count,
AVG(lat) AS centroid_lat,
AVG(lon) AS centroid_lon
@@ -42,12 +41,6 @@ venue_cities AS (
WHERE city IS NOT NULL AND LENGTH(city) > 0
GROUP BY country_code, city
),
-- Latest country income per country
country_income AS (
SELECT country_code, median_income_pps, ref_year AS income_year
FROM staging.stg_income
QUALIFY ROW_NUMBER() OVER (PARTITION BY country_code ORDER BY ref_year DESC) = 1
),
-- Eurostat EU population: join city labels (code→name) with population values.
-- QUALIFY keeps only the most recent year per (country, city name).
eurostat_pop AS (
@@ -109,56 +102,9 @@ SELECT
vc.country_code,
vc.city_slug,
vc.city_name,
-- Human-readable country name for pSEO templates and internal linking
CASE vc.country_code
WHEN 'DE' THEN 'Germany'
WHEN 'ES' THEN 'Spain'
WHEN 'GB' THEN 'United Kingdom'
WHEN 'FR' THEN 'France'
WHEN 'IT' THEN 'Italy'
WHEN 'PT' THEN 'Portugal'
WHEN 'AT' THEN 'Austria'
WHEN 'CH' THEN 'Switzerland'
WHEN 'NL' THEN 'Netherlands'
WHEN 'BE' THEN 'Belgium'
WHEN 'SE' THEN 'Sweden'
WHEN 'NO' THEN 'Norway'
WHEN 'DK' THEN 'Denmark'
WHEN 'FI' THEN 'Finland'
WHEN 'US' THEN 'United States'
WHEN 'AR' THEN 'Argentina'
WHEN 'MX' THEN 'Mexico'
WHEN 'AE' THEN 'UAE'
WHEN 'AU' THEN 'Australia'
WHEN 'IE' THEN 'Ireland'
ELSE vc.country_code
END AS country_name_en,
-- URL-safe country slug
LOWER(REGEXP_REPLACE(
CASE vc.country_code
WHEN 'DE' THEN 'Germany'
WHEN 'ES' THEN 'Spain'
WHEN 'GB' THEN 'United Kingdom'
WHEN 'FR' THEN 'France'
WHEN 'IT' THEN 'Italy'
WHEN 'PT' THEN 'Portugal'
WHEN 'AT' THEN 'Austria'
WHEN 'CH' THEN 'Switzerland'
WHEN 'NL' THEN 'Netherlands'
WHEN 'BE' THEN 'Belgium'
WHEN 'SE' THEN 'Sweden'
WHEN 'NO' THEN 'Norway'
WHEN 'DK' THEN 'Denmark'
WHEN 'FI' THEN 'Finland'
WHEN 'US' THEN 'United States'
WHEN 'AR' THEN 'Argentina'
WHEN 'MX' THEN 'Mexico'
WHEN 'AE' THEN 'UAE'
WHEN 'AU' THEN 'Australia'
WHEN 'IE' THEN 'Ireland'
ELSE vc.country_code
END, '[^a-zA-Z0-9]+', '-'
)) AS country_slug,
-- Human-readable country name and slug — from dim_countries (single source of truth)
c.country_name_en,
c.country_slug,
vc.centroid_lat AS lat,
vc.centroid_lon AS lon,
-- Population cascade: Eurostat EU > US Census > ONS UK > GeoNames string > GeoNames spatial > 0.
@@ -180,13 +126,13 @@ SELECT
0
)::INTEGER AS population_year,
vc.padel_venue_count,
ci.median_income_pps,
ci.income_year,
c.median_income_pps,
c.income_year,
-- GeoNames ID: FK to dim_locations / location_opportunity_profile.
-- String match preferred; spatial fallback used when name doesn't match (Milano→Milan, etc.)
COALESCE(gn.geoname_id, gs.spatial_geoname_id) AS geoname_id
FROM venue_cities vc
LEFT JOIN country_income ci ON vc.country_code = ci.country_code
LEFT JOIN foundation.dim_countries c ON vc.country_code = c.country_code
-- Eurostat EU population (via city code→name lookup)
LEFT JOIN eurostat_pop ep
ON vc.country_code = ep.country_code

View File

@@ -0,0 +1,285 @@
-- Conformed country dimension — single authoritative source for all country metadata.
--
-- Consolidates data previously duplicated across dim_cities and dim_locations:
-- - country_name_en / country_slug (was: ~50-line CASE blocks in both models)
-- - median_income_pps (was: country_income CTE in both models)
-- - energy prices, labour costs, PLI indices (new — from Eurostat datasets)
-- - cost override columns for the financial calculator
--
-- Used by: dim_cities, dim_locations, pseo_city_costs_de, planner_defaults.
-- Grain: country_code (one row per ISO 3166-1 alpha-2 country code).
-- Kind: FULL — small table (~40 rows), full refresh daily.
--
-- Cost override columns:
-- NULL = fall through to calculator.py DEFAULTS (safe: auto-mapping filters None).
-- For DE (the baseline country) all overrides are NULL to preserve exact DEFAULTS.
-- For countries missing Eurostat data, NULLs propagate naturally.
-- Downstream serving models alias these snake_case columns to the camelCase DEFAULTS keys for auto-mapping in content/__init__.py.
--
-- !! DE baseline values sourced from calculator.py DEFAULTS (web/src/padelnomics/planner/calculator.py).
-- !! If DEFAULTS change, the hardcoded baseline values below must be updated to match.
-- !! Search "DE baseline" in this file to find all affected lines.
MODEL (
name foundation.dim_countries,
kind FULL,
cron '@daily',
grain country_code
);
WITH
-- Latest income per country
latest_income AS (
SELECT country_code, median_income_pps, ref_year AS income_year
FROM staging.stg_income
QUALIFY ROW_NUMBER() OVER (PARTITION BY country_code ORDER BY ref_year DESC) = 1
),
-- Latest electricity price per country (use most recent semi-annual period)
latest_electricity AS (
SELECT country_code, electricity_eur_kwh, ref_period
FROM staging.stg_electricity_prices
QUALIFY ROW_NUMBER() OVER (PARTITION BY country_code ORDER BY ref_period DESC) = 1
),
-- Latest gas price per country
latest_gas AS (
SELECT country_code, gas_eur_gj, ref_period
FROM staging.stg_gas_prices
QUALIFY ROW_NUMBER() OVER (PARTITION BY country_code ORDER BY ref_period DESC) = 1
),
-- Latest labour cost per country
latest_labour AS (
SELECT country_code, labour_cost_eur_hour, ref_year
FROM staging.stg_labour_costs
QUALIFY ROW_NUMBER() OVER (PARTITION BY country_code ORDER BY ref_year DESC) = 1
),
-- Latest PLI per (country, category)
latest_pli AS (
SELECT country_code, category, pli, ref_year
FROM staging.stg_price_levels
QUALIFY ROW_NUMBER() OVER (PARTITION BY country_code, category ORDER BY ref_year DESC) = 1
),
-- Pivot PLI categories into columns per country
pli_pivoted AS (
SELECT
country_code,
MAX(pli) FILTER (WHERE category = 'construction') AS construction,
MAX(pli) FILTER (WHERE category = 'housing') AS housing,
MAX(pli) FILTER (WHERE category = 'services') AS services,
MAX(pli) FILTER (WHERE category = 'misc') AS misc,
MAX(pli) FILTER (WHERE category = 'government') AS government
FROM latest_pli
GROUP BY country_code
),
-- DE baseline rows for ratio computation (single-row CTEs, joined below).
de_pli AS (
SELECT construction, housing, services, misc, government
FROM pli_pivoted WHERE country_code = 'DE'
),
de_elec AS (
SELECT electricity_eur_kwh FROM latest_electricity WHERE country_code = 'DE'
),
de_gas AS (
SELECT gas_eur_gj FROM latest_gas WHERE country_code = 'DE'
),
-- All distinct country codes from any source
all_countries AS (
SELECT country_code FROM latest_income
UNION
SELECT country_code FROM latest_electricity
UNION
SELECT country_code FROM latest_gas
UNION
SELECT country_code FROM latest_labour
UNION
SELECT country_code FROM pli_pivoted
-- Ensure known padel markets appear even if Eurostat doesn't cover them yet
UNION ALL
SELECT unnest(['DE','ES','GB','FR','IT','PT','AT','CH','NL','BE','SE','NO','DK','FI',
'US','AR','MX','AE','AU','IE']) AS country_code
)
SELECT
ac.country_code,
-- Country name and slug (single definition, replacing duplicated CASE blocks)
CASE ac.country_code
WHEN 'DE' THEN 'Germany'
WHEN 'ES' THEN 'Spain'
WHEN 'GB' THEN 'United Kingdom'
WHEN 'FR' THEN 'France'
WHEN 'IT' THEN 'Italy'
WHEN 'PT' THEN 'Portugal'
WHEN 'AT' THEN 'Austria'
WHEN 'CH' THEN 'Switzerland'
WHEN 'NL' THEN 'Netherlands'
WHEN 'BE' THEN 'Belgium'
WHEN 'SE' THEN 'Sweden'
WHEN 'NO' THEN 'Norway'
WHEN 'DK' THEN 'Denmark'
WHEN 'FI' THEN 'Finland'
WHEN 'US' THEN 'United States'
WHEN 'AR' THEN 'Argentina'
WHEN 'MX' THEN 'Mexico'
WHEN 'AE' THEN 'UAE'
WHEN 'AU' THEN 'Australia'
WHEN 'IE' THEN 'Ireland'
ELSE ac.country_code
END AS country_name_en,
LOWER(REGEXP_REPLACE(
CASE ac.country_code
WHEN 'DE' THEN 'Germany'
WHEN 'ES' THEN 'Spain'
WHEN 'GB' THEN 'United Kingdom'
WHEN 'FR' THEN 'France'
WHEN 'IT' THEN 'Italy'
WHEN 'PT' THEN 'Portugal'
WHEN 'AT' THEN 'Austria'
WHEN 'CH' THEN 'Switzerland'
WHEN 'NL' THEN 'Netherlands'
WHEN 'BE' THEN 'Belgium'
WHEN 'SE' THEN 'Sweden'
WHEN 'NO' THEN 'Norway'
WHEN 'DK' THEN 'Denmark'
WHEN 'FI' THEN 'Finland'
WHEN 'US' THEN 'United States'
WHEN 'AR' THEN 'Argentina'
WHEN 'MX' THEN 'Mexico'
WHEN 'AE' THEN 'UAE'
WHEN 'AU' THEN 'Australia'
WHEN 'IE' THEN 'Ireland'
ELSE ac.country_code
END, '[^a-zA-Z0-9]+', '-', 'g'
)) AS country_slug,
-- Income data
i.median_income_pps,
i.income_year,
-- Raw energy and labour data (for reference / future staffed-scenario use)
e.electricity_eur_kwh,
g.gas_eur_gj,
la.labour_cost_eur_hour,
-- PLI indices per category (EU27=100)
p.construction AS pli_construction,
p.housing AS pli_housing,
p.services AS pli_services,
p.misc AS pli_misc,
p.government AS pli_government,
-- ── Calculator cost override columns ────────────────────────────────────
-- NULL for DE = fall through to calculator.py DEFAULTS (safe: auto-mapping skips None).
-- Formulas: country_value = DE_default × (country_price / DE_price)
-- or DE_default × (country_PLI / DE_PLI)
--
-- OPEX overrides — energy (direct price ratio)
-- DE baseline: electricity=600, heating=400 (see calculator.py DEFAULTS)
CASE WHEN ac.country_code = 'DE' THEN NULL
ELSE ROUND(600.0 * (e.electricity_eur_kwh / de_e.electricity_eur_kwh), 0)
END AS electricity,
CASE WHEN ac.country_code = 'DE' THEN NULL
ELSE ROUND(400.0 * (g.gas_eur_gj / de_g.gas_eur_gj), 0)
END AS heating,
-- OPEX overrides — PLI-scaled (housing category)
-- DE baseline: rentSqm=4, water=125, outdoorRent=400
CASE WHEN ac.country_code = 'DE' THEN NULL
ELSE ROUND(4.0 * (p.housing / de_p.housing), 2)
END AS rent_sqm,
CASE WHEN ac.country_code = 'DE' THEN NULL
ELSE ROUND(125.0 * (p.housing / de_p.housing), 0)
END AS water,
CASE WHEN ac.country_code = 'DE' THEN NULL
ELSE ROUND(400.0 * (p.housing / de_p.housing), 0)
END AS outdoor_rent,
-- OPEX overrides — PLI-scaled (misc category)
-- DE baseline: insurance=300
CASE WHEN ac.country_code = 'DE' THEN NULL
ELSE ROUND(300.0 * (p.misc / de_p.misc), 0)
END AS insurance,
-- OPEX overrides — PLI-scaled (services category)
-- DE baseline: cleaning=300, maintenance=300, marketing=350
CASE WHEN ac.country_code = 'DE' THEN NULL
ELSE ROUND(300.0 * (p.services / de_p.services), 0)
END AS cleaning,
CASE WHEN ac.country_code = 'DE' THEN NULL
ELSE ROUND(300.0 * (p.services / de_p.services), 0)
END AS maintenance,
CASE WHEN ac.country_code = 'DE' THEN NULL
ELSE ROUND(350.0 * (p.services / de_p.services), 0)
END AS marketing,
-- OPEX overrides — PLI-scaled (government category)
-- DE baseline: propertyTax=250, permitsCompliance=12000
CASE WHEN ac.country_code = 'DE' THEN NULL
ELSE ROUND(250.0 * (p.government / de_p.government), 0)
END AS property_tax,
CASE WHEN ac.country_code = 'DE' THEN NULL
ELSE ROUND(12000.0 * (p.government / de_p.government), 0)
END AS permits_compliance,
-- CAPEX overrides — PLI-scaled (construction category)
-- DE baseline: hallCostSqm=500, foundationSqm=150, hvac=100000, electrical=60000,
-- sanitary=80000, parking=50000, fitout=40000, planning=100000,
-- fireProtection=80000, floorPrep=12000, hvacUpgrade=20000,
-- lightingUpgrade=10000, outdoorFoundation=35, outdoorSiteWork=8000,
-- outdoorLighting=4000, outdoorFencing=6000, workingCapital=15000
CASE WHEN ac.country_code = 'DE' THEN NULL
ELSE ROUND(500.0 * (p.construction / de_p.construction), 0)
END AS hall_cost_sqm,
CASE WHEN ac.country_code = 'DE' THEN NULL
ELSE ROUND(150.0 * (p.construction / de_p.construction), 0)
END AS foundation_sqm,
CASE WHEN ac.country_code = 'DE' THEN NULL
ELSE ROUND(100000.0 * (p.construction / de_p.construction), 0)
END AS hvac,
CASE WHEN ac.country_code = 'DE' THEN NULL
ELSE ROUND(60000.0 * (p.construction / de_p.construction), 0)
END AS electrical,
CASE WHEN ac.country_code = 'DE' THEN NULL
ELSE ROUND(80000.0 * (p.construction / de_p.construction), 0)
END AS sanitary,
CASE WHEN ac.country_code = 'DE' THEN NULL
ELSE ROUND(50000.0 * (p.construction / de_p.construction), 0)
END AS parking,
CASE WHEN ac.country_code = 'DE' THEN NULL
ELSE ROUND(40000.0 * (p.construction / de_p.construction), 0)
END AS fitout,
CASE WHEN ac.country_code = 'DE' THEN NULL
ELSE ROUND(100000.0 * (p.construction / de_p.construction), 0)
END AS planning,
CASE WHEN ac.country_code = 'DE' THEN NULL
ELSE ROUND(80000.0 * (p.construction / de_p.construction), 0)
END AS fire_protection,
CASE WHEN ac.country_code = 'DE' THEN NULL
ELSE ROUND(12000.0 * (p.construction / de_p.construction), 0)
END AS floor_prep,
CASE WHEN ac.country_code = 'DE' THEN NULL
ELSE ROUND(20000.0 * (p.construction / de_p.construction), 0)
END AS hvac_upgrade,
CASE WHEN ac.country_code = 'DE' THEN NULL
ELSE ROUND(10000.0 * (p.construction / de_p.construction), 0)
END AS lighting_upgrade,
CASE WHEN ac.country_code = 'DE' THEN NULL
ELSE ROUND(35.0 * (p.construction / de_p.construction), 0)
END AS outdoor_foundation,
CASE WHEN ac.country_code = 'DE' THEN NULL
ELSE ROUND(8000.0 * (p.construction / de_p.construction), 0)
END AS outdoor_site_work,
CASE WHEN ac.country_code = 'DE' THEN NULL
ELSE ROUND(4000.0 * (p.construction / de_p.construction), 0)
END AS outdoor_lighting,
CASE WHEN ac.country_code = 'DE' THEN NULL
ELSE ROUND(6000.0 * (p.construction / de_p.construction), 0)
END AS outdoor_fencing,
CASE WHEN ac.country_code = 'DE' THEN NULL
ELSE ROUND(15000.0 * (p.construction / de_p.construction), 0)
END AS working_capital,
-- CAPEX overrides — PLI-scaled (housing category)
-- DE baseline: landPriceSqm=60
CASE WHEN ac.country_code = 'DE' THEN NULL
ELSE ROUND(60.0 * (p.housing / de_p.housing), 0)
END AS land_price_sqm
FROM (SELECT DISTINCT country_code FROM all_countries WHERE LENGTH(country_code) = 2) ac
LEFT JOIN latest_income i ON ac.country_code = i.country_code
LEFT JOIN latest_electricity e ON ac.country_code = e.country_code
LEFT JOIN latest_gas g ON ac.country_code = g.country_code
LEFT JOIN latest_labour la ON ac.country_code = la.country_code
LEFT JOIN pli_pivoted p ON ac.country_code = p.country_code
-- LEFT JOIN ON TRUE (not CROSS JOIN): if DE is ever missing from a source,
-- the single-row CTE is empty and a CROSS JOIN would wipe out every row;
-- this way the ratio columns simply come out NULL (safe fallthrough).
LEFT JOIN de_pli de_p ON TRUE
LEFT JOIN de_elec de_e ON TRUE
LEFT JOIN de_gas de_g ON TRUE
-- Enforce grain
QUALIFY ROW_NUMBER() OVER (PARTITION BY ac.country_code ORDER BY ac.country_code) = 1
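The override formula in the header comment (country_value = DE_default × country_price / DE_price, with NULLs propagating) can be sketched in Python; the numbers below are hypothetical, not Eurostat values:

```python
def scale_override(de_default: float, country_value, de_value, digits: int = 0):
    """country = DE baseline x (country price / DE price).
    None propagates, mirroring the NULL fallthrough in dim_countries."""
    if country_value is None or de_value is None:
        return None
    return round(de_default * (country_value / de_value), digits)

# Electricity: DE baseline 600 EUR/mo; hypothetical 0.30 vs 0.25 EUR/kWh prices
print(scale_override(600.0, 0.30, 0.25))  # → 720.0
print(scale_override(600.0, None, 0.25))  # → None
```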

View File

@@ -6,9 +6,9 @@
-- covers all locations with population ≥ 1K so zero-court Gemeinden score fully.
--
-- Enriched with:
-- foundation.dim_countries → country_name_en, country_slug, median_income_pps
-- stg_nuts2_boundaries + stg_regional_income → EU NUTS-2/NUTS-1 income (spatial join)
-- stg_income_usa → US state-level income (PPS-normalised)
-- stg_income → country-level income (fallback for all countries)
-- stg_padel_courts → padel venue count + nearest court distance (km)
-- stg_tennis_courts → tennis court count within 25km radius
--
@@ -16,7 +16,7 @@
-- 1. EU NUTS-2 regional income (finest; spatial join via ST_Contains)
-- 2. EU NUTS-1 regional income (fallback when NUTS-2 income missing from dataset)
-- 3. US state income (ratio-normalised to PPS scale; see us_income CTE)
-- 4. Country-level income (global fallback from stg_income / ilc_di03)
-- 4. Country-level income (global fallback from dim_countries / ilc_di03)
--
-- Distance calculations use ST_Distance_Sphere (DuckDB spatial extension).
-- Spatial joins use BETWEEN predicates (not ABS()) to enable DuckDB's IEJoin
@@ -38,7 +38,7 @@ locations AS (
geoname_id,
city_name AS location_name,
-- URL-safe location slug
LOWER(REGEXP_REPLACE(LOWER(city_name), '[^a-z0-9]+', '-')) AS location_slug,
@slugify(city_name) AS location_slug,
country_code,
lat,
lon,
@@ -49,12 +49,6 @@ locations AS (
FROM staging.stg_population_geonames
WHERE lat IS NOT NULL AND lon IS NOT NULL
),
-- Country income (ilc_di03) — global fallback for all countries
country_income AS (
SELECT country_code, median_income_pps, ref_year AS income_year
FROM staging.stg_income
QUALIFY ROW_NUMBER() OVER (PARTITION BY country_code ORDER BY ref_year DESC) = 1
),
-- ── EU NUTS-2 income via spatial join ──────────────────────────────────────
-- Each EU location's (lon, lat) is matched against NUTS-2 boundary polygons.
-- The bounding box pre-filter (bbox_lat/lon_min/max) eliminates most candidates
@@ -214,56 +208,9 @@ tennis_nearby AS (
SELECT
l.geoname_id,
l.country_code,
-- Human-readable country name (consistent with dim_cities)
CASE l.country_code
WHEN 'DE' THEN 'Germany'
WHEN 'ES' THEN 'Spain'
WHEN 'GB' THEN 'United Kingdom'
WHEN 'FR' THEN 'France'
WHEN 'IT' THEN 'Italy'
WHEN 'PT' THEN 'Portugal'
WHEN 'AT' THEN 'Austria'
WHEN 'CH' THEN 'Switzerland'
WHEN 'NL' THEN 'Netherlands'
WHEN 'BE' THEN 'Belgium'
WHEN 'SE' THEN 'Sweden'
WHEN 'NO' THEN 'Norway'
WHEN 'DK' THEN 'Denmark'
WHEN 'FI' THEN 'Finland'
WHEN 'US' THEN 'United States'
WHEN 'AR' THEN 'Argentina'
WHEN 'MX' THEN 'Mexico'
WHEN 'AE' THEN 'UAE'
WHEN 'AU' THEN 'Australia'
WHEN 'IE' THEN 'Ireland'
ELSE l.country_code
END AS country_name_en,
-- URL-safe country slug
LOWER(REGEXP_REPLACE(
CASE l.country_code
WHEN 'DE' THEN 'Germany'
WHEN 'ES' THEN 'Spain'
WHEN 'GB' THEN 'United Kingdom'
WHEN 'FR' THEN 'France'
WHEN 'IT' THEN 'Italy'
WHEN 'PT' THEN 'Portugal'
WHEN 'AT' THEN 'Austria'
WHEN 'CH' THEN 'Switzerland'
WHEN 'NL' THEN 'Netherlands'
WHEN 'BE' THEN 'Belgium'
WHEN 'SE' THEN 'Sweden'
WHEN 'NO' THEN 'Norway'
WHEN 'DK' THEN 'Denmark'
WHEN 'FI' THEN 'Finland'
WHEN 'US' THEN 'United States'
WHEN 'AR' THEN 'Argentina'
WHEN 'MX' THEN 'Mexico'
WHEN 'AE' THEN 'UAE'
WHEN 'AU' THEN 'Australia'
WHEN 'IE' THEN 'Ireland'
ELSE l.country_code
END, '[^a-zA-Z0-9]+', '-'
)) AS country_slug,
-- Human-readable country name and slug — from dim_countries (single source of truth)
c.country_name_en,
c.country_slug,
l.location_name,
l.location_slug,
l.lat,
@@ -276,12 +223,12 @@ SELECT
COALESCE(
ri.regional_income_pps, -- EU: NUTS-2 (finest) or NUTS-1 (fallback)
us.median_income_pps, -- US: state-level PPS-equivalent
ci.median_income_pps -- Global: country-level from ilc_di03
c.median_income_pps -- Global: country-level from dim_countries / ilc_di03
) AS median_income_pps,
COALESCE(
ri.regional_income_year,
us.income_year,
ci.income_year
c.income_year
) AS income_year,
COALESCE(pl.padel_venue_count, 0)::INTEGER AS padel_venue_count,
-- Venues per 100K residents (NULL if population = 0)
@@ -293,8 +240,8 @@ SELECT
COALESCE(tn.tennis_courts_within_25km, 0)::INTEGER AS tennis_courts_within_25km,
CURRENT_DATE AS refreshed_date
FROM locations l
LEFT JOIN country_income ci ON l.country_code = ci.country_code
LEFT JOIN regional_income ri ON l.geoname_id = ri.geoname_id
LEFT JOIN foundation.dim_countries c ON l.country_code = c.country_code
LEFT JOIN regional_income ri ON l.geoname_id = ri.geoname_id
LEFT JOIN us_income us ON l.country_code = 'US'
AND l.admin1_code = us.admin1_code
LEFT JOIN nearest_padel np ON l.geoname_id = np.geoname_id
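The income cascade described in this model's header (NUTS-2 → NUTS-1 → US state → country) amounts to first-non-NULL selection, which the COALESCE implements; a sketch with hypothetical PPS figures:

```python
def resolve_income(nuts2, nuts1, us_state, country):
    """First non-None source wins — mirrors the COALESCE cascade:
    EU NUTS-2 > EU NUTS-1 > US state > country level."""
    for value in (nuts2, nuts1, us_state, country):
        if value is not None:
            return value
    return None

print(resolve_income(None, 21300, None, 19800))  # → 21300
```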

View File

@@ -99,7 +99,7 @@ SELECT
indoor_court_count,
outdoor_court_count,
-- Conformed city key: enables deterministic joins to dim_cities / venue_pricing_benchmarks
LOWER(REGEXP_REPLACE(LOWER(COALESCE(city, '')), '[^a-z0-9]+', '-')) AS city_slug,
@slugify(COALESCE(city, '')) AS city_slug,
extracted_date
FROM ranked
QUALIFY ROW_NUMBER() OVER (

View File

@@ -0,0 +1,26 @@
-- Per-venue lat/lon for the city detail dot map.
-- Joins dim_venues to dim_cities to attach country_slug and city_slug
-- (needed by the /api/markets/<country>/<city>/venues.json endpoint).
-- Only rows with valid coordinates are included.
MODEL (
name serving.city_venue_locations,
kind FULL,
cron '@daily',
grain venue_id
);
SELECT
v.venue_id,
v.name,
v.lat,
v.lon,
v.court_count,
v.indoor_court_count,
v.outdoor_court_count,
v.city_slug,
c.country_slug
FROM foundation.dim_venues v
JOIN foundation.dim_cities c
ON v.country_code = c.country_code AND v.city_slug = c.city_slug
WHERE v.lat IS NOT NULL AND v.lon IS NOT NULL

View File

@@ -7,6 +7,10 @@
-- 2. Country-level: median across cities in same country
-- 3. Hardcoded fallback: market research estimates (only when no Playtomic data)
--
-- Cost override columns from dim_countries (Eurostat PLI + energy price indices) are
-- included so the planner API pre-fills country-adjusted CAPEX/OPEX for all cities.
-- NULL = fall through to calculator.py DEFAULTS. DE always NULL (baseline preserved).
--
-- Units are explicit in column names. Monetary values in local currency.
MODEL (
@@ -125,6 +129,37 @@ SELECT
ELSE 0.2
END AS data_confidence,
COALESCE(cb.price_currency, ctb.price_currency, hf.currency, 'EUR') AS price_currency,
-- Cost override columns (Eurostat PLI + energy prices via dim_countries).
-- NULL = fall through to calculator.py DEFAULTS. DE always NULL (baseline).
dc.electricity,
dc.heating,
dc.rent_sqm,
dc.insurance,
dc.cleaning,
dc.maintenance,
dc.marketing,
dc.water,
dc.property_tax,
dc.outdoor_rent,
dc.hall_cost_sqm,
dc.foundation_sqm,
dc.land_price_sqm,
dc.hvac,
dc.electrical,
dc.sanitary,
dc.parking,
dc.fitout,
dc.planning,
dc.fire_protection,
dc.floor_prep,
dc.hvac_upgrade,
dc.lighting_upgrade,
dc.outdoor_foundation,
dc.outdoor_site_work,
dc.outdoor_lighting,
dc.outdoor_fencing,
dc.working_capital,
dc.permits_compliance,
CURRENT_DATE AS refreshed_date
FROM city_profiles cp
LEFT JOIN city_benchmarks cb
@@ -134,3 +169,5 @@ LEFT JOIN country_benchmarks ctb
ON cp.country_code = ctb.country_code
LEFT JOIN hardcoded_fallbacks hf
ON cp.country_code = hf.country_code
LEFT JOIN foundation.dim_countries dc
ON cp.country_code = dc.country_code

View File

@@ -4,6 +4,10 @@
--
-- Calculator override columns use camelCase to match the DEFAULTS keys in
-- planner/calculator.py, so they are auto-applied as calc pre-fills.
--
-- Cost override columns come from foundation.dim_countries (Eurostat PLI and energy
-- price indices). NULL = fall through to calculator.py DEFAULTS (safe: auto-mapping
-- filters None). DE always produces NULL overrides — preserves exact DEFAULTS behaviour.
MODEL (
name serving.pseo_city_costs_de,
@@ -22,6 +26,9 @@ SELECT
c.country_code,
c.country_name_en,
c.country_slug,
-- City coordinates (for the city venue dot map)
c.lat,
c.lon,
-- Market metrics
c.population,
c.padel_venue_count,
@@ -44,6 +51,39 @@ SELECT
FLOOR(p.courts_typical) AS "dblCourts",
-- 'country' drives currency formatting in the calculator
c.country_code AS "country",
-- Cost override columns from dim_countries (Eurostat PLI + energy price indices).
-- NULL = fall through to calculator.py DEFAULTS. DE always NULL (baseline preserved).
-- OPEX overrides
cc.electricity AS "electricity",
cc.heating AS "heating",
cc.rent_sqm AS "rentSqm",
cc.insurance AS "insurance",
cc.cleaning AS "cleaning",
cc.maintenance AS "maintenance",
cc.marketing AS "marketing",
cc.water AS "water",
cc.property_tax AS "propertyTax",
cc.outdoor_rent AS "outdoorRent",
-- CAPEX overrides
cc.hall_cost_sqm AS "hallCostSqm",
cc.foundation_sqm AS "foundationSqm",
cc.land_price_sqm AS "landPriceSqm",
cc.hvac AS "hvac",
cc.electrical AS "electrical",
cc.sanitary AS "sanitary",
cc.parking AS "parking",
cc.fitout AS "fitout",
cc.planning AS "planning",
cc.fire_protection AS "fireProtection",
cc.floor_prep AS "floorPrep",
cc.hvac_upgrade AS "hvacUpgrade",
cc.lighting_upgrade AS "lightingUpgrade",
cc.outdoor_foundation AS "outdoorFoundation",
cc.outdoor_site_work AS "outdoorSiteWork",
cc.outdoor_lighting AS "outdoorLighting",
cc.outdoor_fencing AS "outdoorFencing",
cc.working_capital AS "workingCapital",
cc.permits_compliance AS "permitsCompliance",
CURRENT_DATE AS refreshed_date
FROM serving.city_market_profile c
LEFT JOIN serving.planner_defaults p
@@ -52,6 +92,8 @@ LEFT JOIN serving.planner_defaults p
LEFT JOIN serving.location_opportunity_profile lop
ON c.country_code = lop.country_code
AND c.geoname_id = lop.geoname_id
LEFT JOIN foundation.dim_countries cc
ON c.country_code = cc.country_code
-- Only cities with actual padel presence and at least some rate data
WHERE c.padel_venue_count > 0
AND (p.rate_peak IS NOT NULL OR c.median_peak_rate IS NOT NULL)
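The override columns above come through as NULL whenever `dim_countries` has no Eurostat-derived value, and the calculator is expected to fall back to its DEFAULTS in that case. A minimal sketch of that contract (the `DEFAULTS` values and the `apply_overrides` helper are illustrative, not taken from calculator.py):

```python
# Hypothetical sketch of the "NULL falls through to DEFAULTS" contract.
DEFAULTS = {"electricity": 0.25, "heating": 0.09, "rentSqm": 8.5}

def apply_overrides(defaults: dict, overrides: dict) -> dict:
    """Return defaults with any non-None override applied on top."""
    merged = dict(defaults)
    for key, value in overrides.items():
        if value is not None:  # NULL column -> keep the DEFAULTS value
            merged[key] = value
    return merged

# A DE row produces all-NULL overrides, so DEFAULTS survive unchanged;
# here only rentSqm is overridden.
print(apply_overrides(DEFAULTS, {"electricity": None, "rentSqm": 11.2}))
```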


@@ -20,15 +20,15 @@ SELECT
SUM(padel_venue_count) AS total_venues,
ROUND(AVG(market_score), 1) AS avg_market_score,
MAX(market_score) AS top_city_market_score,
--- Top 5 cities by market score for internal linking (DuckDB list slice syntax)
-LIST(city_slug ORDER BY market_score DESC NULLS LAST)[1:5] AS top_city_slugs,
-LIST(city_name ORDER BY market_score DESC NULLS LAST)[1:5] AS top_city_names,
+-- Top 5 cities by venue count (prominence), then score for internal linking
+LIST(city_slug ORDER BY padel_venue_count DESC, market_score DESC NULLS LAST)[1:5] AS top_city_slugs,
+LIST(city_name ORDER BY padel_venue_count DESC, market_score DESC NULLS LAST)[1:5] AS top_city_names,
-- Opportunity score aggregates (NULL-safe: cities without geoname_id match excluded from AVG)
ROUND(AVG(opportunity_score), 1) AS avg_opportunity_score,
MAX(opportunity_score) AS top_opportunity_score,
--- Top 5 cities by opportunity score (may differ from top market score cities)
-LIST(city_slug ORDER BY opportunity_score DESC NULLS LAST)[1:5] AS top_opportunity_slugs,
-LIST(city_name ORDER BY opportunity_score DESC NULLS LAST)[1:5] AS top_opportunity_names,
+-- Top 5 opportunity cities by population (prominence), then opportunity score
+LIST(city_slug ORDER BY population DESC, opportunity_score DESC NULLS LAST)[1:5] AS top_opportunity_slugs,
+LIST(city_name ORDER BY population DESC, opportunity_score DESC NULLS LAST)[1:5] AS top_opportunity_names,
-- Pricing medians across cities (NULL when no Playtomic coverage in country)
ROUND(MEDIAN(median_hourly_rate), 0) AS median_hourly_rate,
ROUND(MEDIAN(median_peak_rate), 0) AS median_peak_rate,


@@ -0,0 +1,42 @@
-- Electricity prices for non-household consumers (Eurostat nrg_pc_205).
-- EUR/kWh excluding taxes, band MWH500-1999 (medium-sized commercial consumer).
-- Semi-annual frequency: ref_period is "YYYY-S1" or "YYYY-S2".
--
-- Source: data/landing/eurostat/{year}/{month}/nrg_pc_205.json.gz
-- Format: {"rows": [{"geo_code": "DE", "ref_year": "2024-S1", "value": 0.1523}, ...]}
MODEL (
name staging.stg_electricity_prices,
kind FULL,
cron '@daily',
grain (country_code, ref_period)
);
WITH source AS (
SELECT unnest(rows) AS r
FROM read_json(
@LANDING_DIR || '/eurostat/*/*/nrg_pc_205.json.gz',
auto_detect = true
)
),
parsed AS (
SELECT
UPPER(TRIM(r.geo_code)) AS geo_code,
TRIM(r.ref_year) AS ref_period,
TRY_CAST(r.value AS DOUBLE) AS electricity_eur_kwh
FROM source
WHERE r.value IS NOT NULL
)
SELECT
-- Normalise to ISO 3166-1 alpha-2: EL→GR, UK→GB
CASE geo_code
WHEN 'EL' THEN 'GR'
WHEN 'UK' THEN 'GB'
ELSE geo_code
END AS country_code,
ref_period,
electricity_eur_kwh
FROM parsed
WHERE LENGTH(geo_code) = 2
AND geo_code NOT IN ('EU', 'EA', 'EU27_2020')
AND electricity_eur_kwh > 0
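The `ref_period` values are semi-annual strings like "2024-S1". Should a consumer ever need to sort or compare them numerically, a small parser along these lines would do (the helper name is ours, not from the codebase):

```python
def parse_semiannual(ref_period: str) -> tuple[int, int]:
    """Split a Eurostat semi-annual period like '2024-S1' into (year, half)."""
    year_str, half_str = ref_period.split("-S")
    return int(year_str), int(half_str)

print(parse_semiannual("2024-S1"))  # (2024, 1)
```

Tuples of (year, half) sort chronologically, which plain string comparison also happens to do for this format, but the explicit parse is safer if the frequency ever changes.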


@@ -0,0 +1,42 @@
-- Gas prices for non-household consumers (Eurostat nrg_pc_203).
-- EUR/GJ excluding taxes, band GJ1000-9999 (medium-sized commercial consumer).
-- Semi-annual frequency: ref_period is "YYYY-S1" or "YYYY-S2".
--
-- Source: data/landing/eurostat/{year}/{month}/nrg_pc_203.json.gz
-- Format: {"rows": [{"geo_code": "DE", "ref_year": "2024-S1", "value": 14.23}, ...]}
MODEL (
name staging.stg_gas_prices,
kind FULL,
cron '@daily',
grain (country_code, ref_period)
);
WITH source AS (
SELECT unnest(rows) AS r
FROM read_json(
@LANDING_DIR || '/eurostat/*/*/nrg_pc_203.json.gz',
auto_detect = true
)
),
parsed AS (
SELECT
UPPER(TRIM(r.geo_code)) AS geo_code,
TRIM(r.ref_year) AS ref_period,
TRY_CAST(r.value AS DOUBLE) AS gas_eur_gj
FROM source
WHERE r.value IS NOT NULL
)
SELECT
-- Normalise to ISO 3166-1 alpha-2: EL→GR, UK→GB
CASE geo_code
WHEN 'EL' THEN 'GR'
WHEN 'UK' THEN 'GB'
ELSE geo_code
END AS country_code,
ref_period,
gas_eur_gj
FROM parsed
WHERE LENGTH(geo_code) = 2
AND geo_code NOT IN ('EU', 'EA', 'EU27_2020')
AND gas_eur_gj > 0


@@ -30,11 +30,7 @@ parsed AS (
)
SELECT
-- Normalise to ISO 3166-1 alpha-2: EL→GR, UK→GB
-CASE geo_code
-WHEN 'EL' THEN 'GR'
-WHEN 'UK' THEN 'GB'
-ELSE geo_code
-END AS country_code,
+@normalize_eurostat_country(geo_code) AS country_code,
ref_year,
median_income_pps,
extracted_date
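The repeated CASE expression is now folded into a `@normalize_eurostat_country` SQLMesh macro. Its logic, expressed as a plain Python sketch for clarity (the Python function is illustrative; the real implementation is a SQL macro):

```python
# Python equivalent of the @normalize_eurostat_country macro's mapping:
# Eurostat's EL/UK codes become ISO 3166-1 alpha-2 GR/GB.
_EUROSTAT_TO_ISO2 = {"EL": "GR", "UK": "GB"}

def normalize_eurostat_country(geo_code: str) -> str:
    """Map a Eurostat country code to ISO 3166-1 alpha-2."""
    code = geo_code.strip().upper()
    return _EUROSTAT_TO_ISO2.get(code, code)
```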


@@ -0,0 +1,46 @@
-- Labour cost levels EUR/hour (Eurostat lc_lci_lev).
-- NACE R2 sector N (administrative and support service activities).
-- D1_D4_MD5 structure: total labour cost (compensation + taxes - subsidies).
-- Annual frequency.
--
-- Stored for future "staffed scenario" calculator variant.
-- Not wired into default calculator overrides (staff=0 is a business assumption).
--
-- Source: data/landing/eurostat/{year}/{month}/lc_lci_lev.json.gz
-- Format: {"rows": [{"geo_code": "DE", "ref_year": "2022", "value": 28.4}, ...]}
MODEL (
name staging.stg_labour_costs,
kind FULL,
cron '@daily',
grain (country_code, ref_year)
);
WITH source AS (
SELECT unnest(rows) AS r
FROM read_json(
@LANDING_DIR || '/eurostat/*/*/lc_lci_lev.json.gz',
auto_detect = true
)
),
parsed AS (
SELECT
UPPER(TRIM(r.geo_code)) AS geo_code,
TRY_CAST(r.ref_year AS INTEGER) AS ref_year,
TRY_CAST(r.value AS DOUBLE) AS labour_cost_eur_hour
FROM source
WHERE r.value IS NOT NULL
)
SELECT
-- Normalise to ISO 3166-1 alpha-2: EL→GR, UK→GB
CASE geo_code
WHEN 'EL' THEN 'GR'
WHEN 'UK' THEN 'GB'
ELSE geo_code
END AS country_code,
ref_year,
labour_cost_eur_hour
FROM parsed
WHERE LENGTH(geo_code) = 2
AND geo_code NOT IN ('EU', 'EA', 'EU27_2020')
AND labour_cost_eur_hour > 0


@@ -28,11 +28,7 @@ WITH raw AS (
SELECT
NUTS_ID AS nuts2_code,
-- Normalise country prefix to ISO 3166-1 alpha-2: EL→GR, UK→GB
-CASE CNTR_CODE
-WHEN 'EL' THEN 'GR'
-WHEN 'UK' THEN 'GB'
-ELSE CNTR_CODE
-END AS country_code,
+@normalize_eurostat_country(CNTR_CODE) AS country_code,
NAME_LATN AS region_name,
geom AS geometry,
-- Pre-compute bounding box for efficient spatial pre-filter in dim_locations.


@@ -48,17 +48,8 @@ deduped AS (
with_country AS (
SELECT
osm_id, lat, lon,
-COALESCE(NULLIF(TRIM(UPPER(country_code)), ''), CASE
-WHEN lat BETWEEN 47.27 AND 55.06 AND lon BETWEEN 5.87 AND 15.04 THEN 'DE'
-WHEN lat BETWEEN 35.95 AND 43.79 AND lon BETWEEN -9.39 AND 4.33 THEN 'ES'
-WHEN lat BETWEEN 49.90 AND 60.85 AND lon BETWEEN -8.62 AND 1.77 THEN 'GB'
-WHEN lat BETWEEN 41.36 AND 51.09 AND lon BETWEEN -5.14 AND 9.56 THEN 'FR'
-WHEN lat BETWEEN 45.46 AND 47.80 AND lon BETWEEN 5.96 AND 10.49 THEN 'CH'
-WHEN lat BETWEEN 46.37 AND 49.02 AND lon BETWEEN 9.53 AND 17.16 THEN 'AT'
-WHEN lat BETWEEN 36.35 AND 47.09 AND lon BETWEEN 6.62 AND 18.51 THEN 'IT'
-WHEN lat BETWEEN 37.00 AND 42.15 AND lon BETWEEN -9.50 AND -6.19 THEN 'PT'
-ELSE NULL
-END) AS country_code,
+COALESCE(NULLIF(TRIM(UPPER(country_code)), ''),
+@infer_country_from_coords(lat, lon)) AS country_code,
NULLIF(TRIM(name), '') AS name,
NULLIF(TRIM(city_tag), '') AS city,
postcode, operator_name, opening_hours, fee, extracted_date


@@ -0,0 +1,96 @@
-- Price level indices relative to EU27=100 (Eurostat prc_ppp_ind).
-- Five categories, each from a separate landing file (different ppp_cat filters).
-- Annual frequency.
--
-- Categories and what they scale in the calculator:
-- construction — CAPEX: hallCostSqm, foundationSqm, hvac, electrical, sanitary, etc.
-- housing — rentSqm, landPriceSqm, water, outdoorRent
-- services — cleaning, maintenance, marketing
-- misc — insurance
-- government — permitsCompliance, propertyTax
--
-- Sources:
-- data/landing/eurostat/*/*/prc_ppp_ind_construction.json.gz (ppp_cat: A050202)
-- data/landing/eurostat/*/*/prc_ppp_ind_housing.json.gz (ppp_cat: A0104)
-- data/landing/eurostat/*/*/prc_ppp_ind_services.json.gz (ppp_cat: P0201)
-- data/landing/eurostat/*/*/prc_ppp_ind_misc.json.gz (ppp_cat: A0112)
-- data/landing/eurostat/*/*/prc_ppp_ind_government.json.gz (ppp_cat: P0202)
--
-- Format: {"rows": [{"geo_code": "DE", "ref_year": "2022", "value": 107.3}, ...]}
MODEL (
name staging.stg_price_levels,
kind FULL,
cron '@daily',
grain (country_code, category, ref_year)
);
WITH construction_raw AS (
SELECT unnest(rows) AS r, 'construction' AS category
FROM read_json(
@LANDING_DIR || '/eurostat/*/*/prc_ppp_ind_construction.json.gz',
auto_detect = true
)
),
housing_raw AS (
SELECT unnest(rows) AS r, 'housing' AS category
FROM read_json(
@LANDING_DIR || '/eurostat/*/*/prc_ppp_ind_housing.json.gz',
auto_detect = true
)
),
services_raw AS (
SELECT unnest(rows) AS r, 'services' AS category
FROM read_json(
@LANDING_DIR || '/eurostat/*/*/prc_ppp_ind_services.json.gz',
auto_detect = true
)
),
misc_raw AS (
SELECT unnest(rows) AS r, 'misc' AS category
FROM read_json(
@LANDING_DIR || '/eurostat/*/*/prc_ppp_ind_misc.json.gz',
auto_detect = true
)
),
government_raw AS (
SELECT unnest(rows) AS r, 'government' AS category
FROM read_json(
@LANDING_DIR || '/eurostat/*/*/prc_ppp_ind_government.json.gz',
auto_detect = true
)
),
all_raw AS (
SELECT r, category FROM construction_raw
UNION ALL
SELECT r, category FROM housing_raw
UNION ALL
SELECT r, category FROM services_raw
UNION ALL
SELECT r, category FROM misc_raw
UNION ALL
SELECT r, category FROM government_raw
),
parsed AS (
SELECT
UPPER(TRIM(r.geo_code)) AS geo_code,
TRY_CAST(r.ref_year AS INTEGER) AS ref_year,
TRY_CAST(r.value AS DOUBLE) AS pli,
category
FROM all_raw
WHERE r.value IS NOT NULL
)
SELECT
-- Normalise to ISO 3166-1 alpha-2: EL→GR, UK→GB
CASE geo_code
WHEN 'EL' THEN 'GR'
WHEN 'UK' THEN 'GB'
ELSE geo_code
END AS country_code,
category,
ref_year,
pli
FROM parsed
WHERE LENGTH(geo_code) = 2
AND geo_code NOT IN ('EU', 'EA', 'EU27_2020')
AND pli > 0
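Each PLI is an index with EU27 = 100, and the header comment says these scale calculator defaults. Presumably the German baseline cost is rescaled by the ratio of the target country's PLI to Germany's; a hedged arithmetic sketch (function name and example values are ours, not from the calculator):

```python
def scale_by_pli(de_default: float, country_pli: float, de_pli: float) -> float:
    """Rescale a DE-baseline cost by relative price level (EU27 = 100)."""
    return round(de_default * country_pli / de_pli, 2)

# e.g. a 900 EUR/sqm hall cost, DE construction PLI 107.3, target PLI 80.0:
print(scale_by_pli(900, 80.0, 107.3))  # 671.02
```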


@@ -30,11 +30,7 @@ parsed AS (
)
SELECT
-- Normalise to ISO 3166-1 alpha-2 prefix: EL→GR, UK→GB
-CASE
-WHEN geo_code LIKE 'EL%' THEN 'GR' || SUBSTR(geo_code, 3)
-WHEN geo_code LIKE 'UK%' THEN 'GB' || SUBSTR(geo_code, 3)
-ELSE geo_code
-END AS nuts_code,
+@normalize_eurostat_nuts(geo_code) AS nuts_code,
-- NUTS level: 3-char = NUTS-1, 4-char = NUTS-2
LENGTH(geo_code) - 2 AS nuts_level,
ref_year,
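`@normalize_eurostat_nuts` swaps only the two-character EL/UK country prefix while keeping the regional suffix, and the NUTS level falls out of the code length. In Python terms (illustrative sketch of the macro logic):

```python
def normalize_eurostat_nuts(geo_code: str) -> str:
    """EL/UK NUTS prefixes become GR/GB; the regional suffix is kept."""
    if geo_code.startswith("EL"):
        return "GR" + geo_code[2:]
    if geo_code.startswith("UK"):
        return "GB" + geo_code[2:]
    return geo_code

def nuts_level(geo_code: str) -> int:
    """2-char country prefix, so a 3-char code is NUTS-1, 4-char is NUTS-2."""
    return len(geo_code) - 2
```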


@@ -54,17 +54,8 @@ deduped AS (
with_country AS (
SELECT
osm_id, lat, lon,
-COALESCE(NULLIF(TRIM(UPPER(country_code)), ''), CASE
-WHEN lat BETWEEN 47.27 AND 55.06 AND lon BETWEEN 5.87 AND 15.04 THEN 'DE'
-WHEN lat BETWEEN 35.95 AND 43.79 AND lon BETWEEN -9.39 AND 4.33 THEN 'ES'
-WHEN lat BETWEEN 49.90 AND 60.85 AND lon BETWEEN -8.62 AND 1.77 THEN 'GB'
-WHEN lat BETWEEN 41.36 AND 51.09 AND lon BETWEEN -5.14 AND 9.56 THEN 'FR'
-WHEN lat BETWEEN 45.46 AND 47.80 AND lon BETWEEN 5.96 AND 10.49 THEN 'CH'
-WHEN lat BETWEEN 46.37 AND 49.02 AND lon BETWEEN 9.53 AND 17.16 THEN 'AT'
-WHEN lat BETWEEN 36.35 AND 47.09 AND lon BETWEEN 6.62 AND 18.51 THEN 'IT'
-WHEN lat BETWEEN 37.00 AND 42.15 AND lon BETWEEN -9.50 AND -6.19 THEN 'PT'
-ELSE NULL
-END) AS country_code,
+COALESCE(NULLIF(TRIM(UPPER(country_code)), ''),
+@infer_country_from_coords(lat, lon)) AS country_code,
NULLIF(TRIM(name), '') AS name,
NULLIF(TRIM(city_tag), '') AS city,
extracted_date

uv.lock generated

@@ -1392,6 +1392,7 @@ dependencies = [
{ name = "pyyaml" },
{ name = "quart" },
{ name = "resend" },
{ name = "stripe" },
{ name = "weasyprint" },
]
@@ -1413,6 +1414,7 @@ requires-dist = [
{ name = "pyyaml", specifier = ">=6.0" },
{ name = "quart", specifier = ">=0.19.0" },
{ name = "resend", specifier = ">=2.22.0" },
{ name = "stripe", specifier = ">=14.4.0" },
{ name = "weasyprint", specifier = ">=68.1" },
]
@@ -2519,6 +2521,19 @@ wheels = [
{ url = "https://files.pythonhosted.org/packages/f1/7b/ce1eafaf1a76852e2ec9b22edecf1daa58175c090266e9f6c64afcd81d91/stack_data-0.6.3-py3-none-any.whl", hash = "sha256:d5558e0c25a4cb0853cddad3d77da9891a08cb85dd9f9f91b9f8cd66e511e695", size = 24521, upload-time = "2023-09-30T13:58:03.53Z" },
]
[[package]]
name = "stripe"
version = "14.4.0"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "requests" },
{ name = "typing-extensions" },
]
sdist = { url = "https://files.pythonhosted.org/packages/6a/ec/0f17cff3f7c91b0215266959c5a2a96b0bf9f45ac041c50b99ad8f9b5047/stripe-14.4.0.tar.gz", hash = "sha256:ddaa06f5e38a582bef7e93e06fc304ba8ae3b4c0c2aac43da02c84926f05fa0a", size = 1472370, upload-time = "2026-02-25T17:52:40.905Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/38/09/fcecad01d76dbe027015dd559ec1b6dccfc319c2540991dde4b1de81ba34/stripe-14.4.0-py3-none-any.whl", hash = "sha256:357151a816cd0bb012d6cb29f108fae50b9f6eece8530d7bc31dfa90c9ceb84c", size = 2115405, upload-time = "2026-02-25T17:52:39.128Z" },
]
[[package]]
name = "tenacity"
version = "9.1.4"


@@ -22,6 +22,7 @@ dependencies = [
"httpx>=0.27.0",
"google-api-python-client>=2.100.0",
"google-auth>=2.23.0",
"stripe>=14.4.0",
]
[build-system]


@@ -48,7 +48,7 @@ PADDLE_ENVIRONMENT=${PADDLE_ENVIRONMENT:-sandbox}
# -- Preparation -------------------------------------------------------------
info "Resetting database"
-rm -f "$DATABASE_PATH"
+rm -f "$DATABASE_PATH" "${DATABASE_PATH}-shm" "${DATABASE_PATH}-wal"
ok "Removed $DATABASE_PATH"
info "Running migrations"


@@ -35,7 +35,7 @@ from pathlib import Path
from quart import Blueprint, flash, redirect, render_template, request, url_for
from ..auth.routes import role_required
-from ..core import csrf_protect
+from ..core import count_where, csrf_protect
logger = logging.getLogger(__name__)
@@ -51,8 +51,10 @@ bp = Blueprint(
_LANDING_DIR = os.environ.get("LANDING_DIR", "data/landing")
_SERVING_DUCKDB_PATH = os.environ.get("SERVING_DUCKDB_PATH", "data/analytics.duckdb")
-# Repo root: web/src/padelnomics/admin/ → up 4 levels
-_REPO_ROOT = Path(__file__).resolve().parents[4]
+# In prod the package is installed in a venv so __file__.parents[4] won't
+# reach the repo checkout. WorkingDirectory in the systemd unit is /opt/padelnomics,
+# so CWD is reliable; REPO_ROOT env var overrides for non-standard setups.
+_REPO_ROOT = Path(os.environ.get("REPO_ROOT", ".")).resolve()
_WORKFLOWS_TOML = _REPO_ROOT / "infra" / "supervisor" / "workflows.toml"
# A "running" row older than this is considered stale/crashed.
@@ -298,11 +300,8 @@ async def _inject_sidebar_data():
"""Load unread inbox count for the admin sidebar badge."""
from quart import g
from ..core import fetch_one
try:
-row = await fetch_one("SELECT COUNT(*) as cnt FROM inbound_emails WHERE is_read = 0")
-g.admin_unread_count = row["cnt"] if row else 0
+g.admin_unread_count = await count_where("inbound_emails WHERE is_read = 0")
except Exception:
g.admin_unread_count = 0
@@ -541,6 +540,7 @@ def _load_workflows() -> list[dict]:
"schedule": schedule,
"schedule_label": schedule_label,
"depends_on": config.get("depends_on", []),
"description": config.get("description", ""),
})
return workflows
@@ -780,7 +780,8 @@ async def pipeline_trigger_extract():
else:
await enqueue("run_extraction")
-is_htmx = request.headers.get("HX-Request") == "true"
+is_htmx = (request.headers.get("HX-Request") == "true"
+and request.headers.get("HX-Boosted") != "true")
if is_htmx:
return await _render_overview_partial()
@@ -1005,7 +1006,8 @@ async def pipeline_trigger_transform():
(task_name,),
)
if existing:
-is_htmx = request.headers.get("HX-Request") == "true"
+is_htmx = (request.headers.get("HX-Request") == "true"
+and request.headers.get("HX-Boosted") != "true")
if is_htmx:
return await _render_transform_partial()
await flash(f"A '{step}' task is already queued (task #{existing['id']}).", "warning")
@@ -1013,7 +1015,8 @@ async def pipeline_trigger_transform():
await enqueue(task_name)
-is_htmx = request.headers.get("HX-Request") == "true"
+is_htmx = (request.headers.get("HX-Request") == "true"
+and request.headers.get("HX-Boosted") != "true")
if is_htmx:
return await _render_transform_partial()
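The reason for the extra check: hx-boosted links also send `HX-Request: true`, so without it a boosted full-page navigation would receive a bare fragment. The repeated condition could be captured in a helper like this (the helper is a sketch, not in the codebase):

```python
def wants_partial(headers: dict[str, str]) -> bool:
    """True only for genuine htmx fragment requests.

    hx-boosted navigations send HX-Request: true as well, but expect a full
    page, so they are excluded via the HX-Boosted header.
    """
    return (headers.get("HX-Request") == "true"
            and headers.get("HX-Boosted") != "true")
```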


@@ -25,7 +25,7 @@ from ..content.health import (
get_template_freshness,
get_template_stats,
)
-from ..core import csrf_protect, fetch_all, fetch_one
+from ..core import count_where, csrf_protect, fetch_all, fetch_one
bp = Blueprint(
"pseo",
@@ -41,8 +41,7 @@ async def _inject_sidebar_data():
from quart import g
try:
-row = await fetch_one("SELECT COUNT(*) as cnt FROM inbound_emails WHERE is_read = 0")
-g.admin_unread_count = row["cnt"] if row else 0
+g.admin_unread_count = await count_where("inbound_emails WHERE is_read = 0")
except Exception:
g.admin_unread_count = 0
@@ -80,8 +79,7 @@ async def pseo_dashboard():
total_published = sum(r["stats"]["published"] for r in template_rows)
stale_count = sum(1 for f in freshness if f["status"] == "stale")
-noindex_row = await fetch_one("SELECT COUNT(*) as cnt FROM articles WHERE noindex = 1")
-noindex_count = noindex_row["cnt"] if noindex_row else 0
+noindex_count = await count_where("articles WHERE noindex = 1")
# Recent generation jobs — enough for the dashboard summary.
jobs = await fetch_all(


@@ -28,6 +28,7 @@ from ..auth.routes import role_required
from ..core import (
EMAIL_ADDRESSES,
config,
count_where,
csrf_protect,
execute,
fetch_all,
@@ -91,8 +92,7 @@ async def _inject_admin_sidebar_data():
"""Load unread inbox count for sidebar badge on every admin page."""
from quart import g
try:
-row = await fetch_one("SELECT COUNT(*) as cnt FROM inbound_emails WHERE is_read = 0")
-g.admin_unread_count = row["cnt"] if row else 0
+g.admin_unread_count = await count_where("inbound_emails WHERE is_read = 0")
except Exception:
logger.exception("Failed to load admin sidebar unread count")
g.admin_unread_count = 0
@@ -114,76 +114,32 @@ async def get_dashboard_stats() -> dict:
now = utcnow()
today = now.date().isoformat()
week_ago = (now - timedelta(days=7)).strftime("%Y-%m-%d %H:%M:%S")
-users_total = await fetch_one("SELECT COUNT(*) as count FROM users WHERE deleted_at IS NULL")
-users_today = await fetch_one(
-"SELECT COUNT(*) as count FROM users WHERE created_at >= ? AND deleted_at IS NULL",
-(today,)
-)
-users_week = await fetch_one(
-"SELECT COUNT(*) as count FROM users WHERE created_at >= ? AND deleted_at IS NULL",
-(week_ago,)
-)
-subs = await fetch_one(
-"SELECT COUNT(*) as count FROM subscriptions WHERE status = 'active'"
+# Two queries that aren't simple COUNT(*) — keep as fetch_one
+planner_row = await fetch_one(
+"SELECT COUNT(DISTINCT user_id) AS n FROM scenarios WHERE deleted_at IS NULL"
)
-tasks_pending = await fetch_one("SELECT COUNT(*) as count FROM tasks WHERE status = 'pending'")
-tasks_failed = await fetch_one("SELECT COUNT(*) as count FROM tasks WHERE status = 'failed'")
-# Lead funnel stats
-leads_total = await fetch_one(
-"SELECT COUNT(*) as count FROM lead_requests WHERE lead_type = 'quote'"
-)
-leads_new = await fetch_one(
-"SELECT COUNT(*) as count FROM lead_requests WHERE status = 'new' AND lead_type = 'quote'"
-)
-leads_verified = await fetch_one(
-"SELECT COUNT(*) as count FROM lead_requests WHERE verified_at IS NOT NULL AND lead_type = 'quote'"
-)
-leads_unlocked = await fetch_one(
-"SELECT COUNT(*) as count FROM lead_requests WHERE unlock_count > 0 AND lead_type = 'quote'"
-)
-# Planner users
-planner_users = await fetch_one(
-"SELECT COUNT(DISTINCT user_id) as count FROM scenarios WHERE deleted_at IS NULL"
-)
-# Supplier stats
-suppliers_claimed = await fetch_one(
-"SELECT COUNT(*) as count FROM suppliers WHERE claimed_by IS NOT NULL"
-)
-suppliers_growth = await fetch_one(
-"SELECT COUNT(*) as count FROM suppliers WHERE tier = 'growth'"
-)
-suppliers_pro = await fetch_one(
-"SELECT COUNT(*) as count FROM suppliers WHERE tier = 'pro'"
-)
-total_credits_spent = await fetch_one(
-"SELECT COALESCE(SUM(ABS(delta)), 0) as total FROM credit_ledger WHERE delta < 0"
-)
-leads_unlocked_by_suppliers = await fetch_one(
-"SELECT COUNT(*) as count FROM lead_forwards"
+credits_row = await fetch_one(
+"SELECT COALESCE(SUM(ABS(delta)), 0) AS n FROM credit_ledger WHERE delta < 0"
)
return {
-"users_total": users_total["count"] if users_total else 0,
-"users_today": users_today["count"] if users_today else 0,
-"users_week": users_week["count"] if users_week else 0,
-"active_subscriptions": subs["count"] if subs else 0,
-"tasks_pending": tasks_pending["count"] if tasks_pending else 0,
-"tasks_failed": tasks_failed["count"] if tasks_failed else 0,
-"leads_total": leads_total["count"] if leads_total else 0,
-"leads_new": leads_new["count"] if leads_new else 0,
-"leads_verified": leads_verified["count"] if leads_verified else 0,
-"leads_unlocked": leads_unlocked["count"] if leads_unlocked else 0,
-"planner_users": planner_users["count"] if planner_users else 0,
-"suppliers_claimed": suppliers_claimed["count"] if suppliers_claimed else 0,
-"suppliers_growth": suppliers_growth["count"] if suppliers_growth else 0,
-"suppliers_pro": suppliers_pro["count"] if suppliers_pro else 0,
-"total_credits_spent": total_credits_spent["total"] if total_credits_spent else 0,
-"leads_unlocked_by_suppliers": leads_unlocked_by_suppliers["count"] if leads_unlocked_by_suppliers else 0,
+"users_total": await count_where("users WHERE deleted_at IS NULL"),
+"users_today": await count_where("users WHERE created_at >= ? AND deleted_at IS NULL", (today,)),
+"users_week": await count_where("users WHERE created_at >= ? AND deleted_at IS NULL", (week_ago,)),
+"active_subscriptions": await count_where("subscriptions WHERE status = 'active'"),
+"tasks_pending": await count_where("tasks WHERE status = 'pending'"),
+"tasks_failed": await count_where("tasks WHERE status = 'failed'"),
+"leads_total": await count_where("lead_requests WHERE lead_type = 'quote'"),
+"leads_new": await count_where("lead_requests WHERE status = 'new' AND lead_type = 'quote'"),
+"leads_verified": await count_where("lead_requests WHERE verified_at IS NOT NULL AND lead_type = 'quote'"),
+"leads_unlocked": await count_where("lead_requests WHERE unlock_count > 0 AND lead_type = 'quote'"),
+"planner_users": planner_row["n"] if planner_row else 0,
+"suppliers_claimed": await count_where("suppliers WHERE claimed_by IS NOT NULL"),
+"suppliers_growth": await count_where("suppliers WHERE tier = 'growth'"),
+"suppliers_pro": await count_where("suppliers WHERE tier = 'pro'"),
+"total_credits_spent": credits_row["n"] if credits_row else 0,
+"leads_unlocked_by_suppliers": await count_where("lead_forwards WHERE 1=1"),
}
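`count_where`'s signature is never shown in this diff, but the call sites imply it takes a `"<table> WHERE <predicate>"` fragment plus optional bind params and returns the COUNT(*). A self-contained sketch of a compatible helper (sqlite3 and the in-memory table stand in for the app's real async DB layer; the fragment is developer-written SQL, with user values bound via params, same as the original fetch_one calls):

```python
import asyncio
import sqlite3

# In-memory stand-in for the app's shared DB connection.
_conn = sqlite3.connect(":memory:")
_conn.execute("CREATE TABLE tasks (status TEXT)")
_conn.executemany("INSERT INTO tasks VALUES (?)",
                  [("pending",), ("pending",), ("failed",)])

async def count_where(fragment: str, params: tuple = ()) -> int:
    """Run SELECT COUNT(*) FROM <table WHERE ...> and return the count."""
    row = _conn.execute(f"SELECT COUNT(*) FROM {fragment}", params).fetchone()
    return row[0]

print(asyncio.run(count_where("tasks WHERE status = ?", ("pending",))))  # 2
```

Collapsing every `fetch_one("SELECT COUNT(*) ...")` plus `row["cnt"] if row else 0` pair into one awaited call is the whole point of the refactor above.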
@@ -446,10 +402,7 @@ async def get_leads(
params.append(f"-{days} days")
where = " AND ".join(wheres)
-count_row = await fetch_one(
-f"SELECT COUNT(*) as cnt FROM lead_requests WHERE {where}", tuple(params)
-)
-total = count_row["cnt"] if count_row else 0
+total = await count_where(f"lead_requests WHERE {where}", tuple(params))
offset = (page - 1) * per_page
rows = await fetch_all(
@@ -579,6 +532,71 @@ async def lead_results():
)
@bp.route("/leads/bulk", methods=["POST"])
@role_required("admin")
@csrf_protect
async def leads_bulk():
"""Bulk actions on leads: set_status, set_heat."""
form = await request.form
ids_raw = form.get("lead_ids", "").strip()
action = form.get("action", "").strip()
if action not in ("set_status", "set_heat") or not ids_raw:
return "", 400
lead_ids = [int(i) for i in ids_raw.split(",") if i.strip().isdigit()]
if len(lead_ids) > 500:
return "", 400
if not lead_ids:
return "", 400
placeholders = ",".join("?" for _ in lead_ids)
if action == "set_status":
target = form.get("target_status", "").strip()
if target not in LEAD_STATUSES:
return "", 400
await execute(
f"UPDATE lead_requests SET status = ? WHERE id IN ({placeholders})",
(target, *lead_ids),
)
elif action == "set_heat":
target = form.get("target_heat", "").strip()
if target not in HEAT_OPTIONS:
return "", 400
await execute(
f"UPDATE lead_requests SET heat_score = ? WHERE id IN ({placeholders})",
(target, *lead_ids),
)
# Re-render results partial with current filters
search = form.get("search", "").strip()
status_filter = form.get("status", "")
heat_filter = form.get("heat", "")
country_filter = form.get("country", "")
days_str = form.get("days", "")
days = int(days_str) if days_str.isdigit() else None
per_page = 50
lead_list, total = await get_leads(
status=status_filter or None, heat=heat_filter or None,
country=country_filter or None, search=search or None,
days=days, page=1, per_page=per_page,
)
return await render_template(
"admin/partials/lead_results.html",
leads=lead_list,
page=1,
per_page=per_page,
total=total,
current_status=status_filter,
current_heat=heat_filter,
current_country=country_filter,
current_search=search,
current_days=days_str,
)
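The bulk UPDATE above builds its IN clause by joining one "?" per id: only the placeholder string is interpolated into the SQL, while the ids themselves stay bound parameters. A runnable illustration of the pattern (table and values are example data):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE lead_requests (id INTEGER PRIMARY KEY, status TEXT)")
conn.executemany("INSERT INTO lead_requests VALUES (?, ?)",
                 [(1, "new"), (2, "new"), (3, "new")])

lead_ids = [1, 3]
placeholders = ",".join("?" for _ in lead_ids)  # "?,?" (ids stay bound params)
conn.execute(
    f"UPDATE lead_requests SET status = ? WHERE id IN ({placeholders})",
    ("contacted", *lead_ids),
)
rows = [r[0] for r in conn.execute("SELECT status FROM lead_requests ORDER BY id")]
print(rows)  # ['contacted', 'new', 'contacted']
```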
@bp.route("/leads/<int:lead_id>")
@role_required("admin")
async def lead_detail(lead_id: int):
@@ -679,26 +697,14 @@ async def lead_new():
return await render_template("admin/lead_form.html", data={}, statuses=LEAD_STATUSES)
@bp.route("/leads/<int:lead_id>/forward", methods=["POST"])
@role_required("admin")
@csrf_protect
async def lead_forward(lead_id: int):
"""Manually forward a lead to a supplier (no credit cost)."""
form = await request.form
supplier_id = int(form.get("supplier_id", 0))
if not supplier_id:
await flash("Select a supplier.", "error")
return redirect(url_for("admin.lead_detail", lead_id=lead_id))
# Check if already forwarded
async def _forward_lead(lead_id: int, supplier_id: int) -> str | None:
"""Forward a lead to a supplier. Returns error message or None on success."""
existing = await fetch_one(
"SELECT 1 FROM lead_forwards WHERE lead_id = ? AND supplier_id = ?",
(lead_id, supplier_id),
)
if existing:
-await flash("Already forwarded to this supplier.", "warning")
-return redirect(url_for("admin.lead_detail", lead_id=lead_id))
+return "Already forwarded to this supplier."
now = utcnow_iso()
await execute(
@@ -710,15 +716,27 @@ async def lead_forward(lead_id: int):
"UPDATE lead_requests SET unlock_count = unlock_count + 1, status = 'forwarded' WHERE id = ?",
(lead_id,),
)
-# Enqueue forward email
from ..worker import enqueue
-await enqueue("send_lead_forward_email", {
-"lead_id": lead_id,
-"supplier_id": supplier_id,
-})
+await enqueue("send_lead_forward_email", {"lead_id": lead_id, "supplier_id": supplier_id})
+return None
-await flash("Lead forwarded to supplier.", "success")
@bp.route("/leads/<int:lead_id>/forward", methods=["POST"])
@role_required("admin")
@csrf_protect
async def lead_forward(lead_id: int):
"""Manually forward a lead to a supplier (no credit cost)."""
form = await request.form
supplier_id = int(form.get("supplier_id", 0))
if not supplier_id:
await flash("Select a supplier.", "error")
return redirect(url_for("admin.lead_detail", lead_id=lead_id))
error = await _forward_lead(lead_id, supplier_id)
if error:
await flash(error, "warning")
else:
await flash("Lead forwarded to supplier.", "success")
return redirect(url_for("admin.lead_detail", lead_id=lead_id))
@@ -751,25 +769,9 @@ async def lead_forward_htmx(lead_id: int):
return Response("Select a supplier.", status=422)
supplier_id = int(supplier_id_str)
existing = await fetch_one(
"SELECT 1 FROM lead_forwards WHERE lead_id = ? AND supplier_id = ?",
(lead_id, supplier_id),
)
if existing:
return Response("Already forwarded to this supplier.", status=422)
now = utcnow_iso()
await execute(
"""INSERT INTO lead_forwards (lead_id, supplier_id, credit_cost, status, created_at)
VALUES (?, ?, 0, 'sent', ?)""",
(lead_id, supplier_id, now),
)
await execute(
"UPDATE lead_requests SET unlock_count = unlock_count + 1, status = 'forwarded' WHERE id = ?",
(lead_id,),
)
from ..worker import enqueue
await enqueue("send_lead_forward_email", {"lead_id": lead_id, "supplier_id": supplier_id})
error = await _forward_lead(lead_id, supplier_id)
if error:
return Response(error, status=422)
lead = await get_lead_detail(lead_id)
return await render_template(
@@ -929,13 +931,10 @@ async def get_suppliers_list(
async def get_supplier_stats() -> dict:
"""Get aggregate supplier stats for the admin list header."""
-claimed = await fetch_one("SELECT COUNT(*) as cnt FROM suppliers WHERE claimed_by IS NOT NULL")
-growth = await fetch_one("SELECT COUNT(*) as cnt FROM suppliers WHERE tier = 'growth'")
-pro = await fetch_one("SELECT COUNT(*) as cnt FROM suppliers WHERE tier = 'pro'")
return {
-"claimed": claimed["cnt"] if claimed else 0,
-"growth": growth["cnt"] if growth else 0,
-"pro": pro["cnt"] if pro else 0,
+"claimed": await count_where("suppliers WHERE claimed_by IS NOT NULL"),
+"growth": await count_where("suppliers WHERE tier = 'growth'"),
+"pro": await count_where("suppliers WHERE tier = 'pro'"),
}
@@ -1017,11 +1016,7 @@ async def supplier_detail(supplier_id: int):
(supplier_id,),
)
-enquiry_row = await fetch_one(
-"SELECT COUNT(*) as cnt FROM supplier_enquiries WHERE supplier_id = ?",
-(supplier_id,),
-)
-enquiry_count = enquiry_row["cnt"] if enquiry_row else 0
+enquiry_count = await count_where("supplier_enquiries WHERE supplier_id = ?", (supplier_id,))
# Email activity timeline — correlate by contact_email (no FK)
timeline = []
@@ -1239,7 +1234,6 @@ _PRODUCT_CATEGORIES = [
@role_required("admin")
async def billing_products():
"""Read-only overview of Paddle products, subscriptions, and revenue proxies."""
-active_subs_row = await fetch_one("SELECT COUNT(*) as cnt FROM subscriptions WHERE status = 'active'")
mrr_row = await fetch_one(
"""SELECT COALESCE(SUM(
CASE WHEN pp.key LIKE '%_yearly' THEN pp.price_cents / 12
@@ -1249,14 +1243,12 @@ async def billing_products():
JOIN paddle_products pp ON s.plan = pp.key
WHERE s.status = 'active' AND pp.billing_type = 'subscription'"""
)
-active_boosts_row = await fetch_one("SELECT COUNT(*) as cnt FROM supplier_boosts WHERE status = 'active'")
-bp_exports_row = await fetch_one("SELECT COUNT(*) as cnt FROM business_plan_exports WHERE status = 'completed'")
stats = {
-"active_subs": (active_subs_row or {}).get("cnt", 0),
+"active_subs": await count_where("subscriptions WHERE status = 'active'"),
"mrr_cents": (mrr_row or {}).get("total_cents", 0),
-"active_boosts": (active_boosts_row or {}).get("cnt", 0),
-"bp_exports": (bp_exports_row or {}).get("cnt", 0),
+"active_boosts": await count_where("supplier_boosts WHERE status = 'active'"),
+"bp_exports": await count_where("business_plan_exports WHERE status = 'completed'"),
}
products_rows = await fetch_all("SELECT * FROM paddle_products ORDER BY key")
@@ -1342,23 +1334,18 @@ async def get_email_log(
async def get_email_stats() -> dict:
"""Aggregate email stats for the list header."""
-total = await fetch_one("SELECT COUNT(*) as cnt FROM email_log")
-delivered = await fetch_one("SELECT COUNT(*) as cnt FROM email_log WHERE last_event = 'delivered'")
-bounced = await fetch_one("SELECT COUNT(*) as cnt FROM email_log WHERE last_event = 'bounced'")
today = utcnow().date().isoformat()
-sent_today = await fetch_one("SELECT COUNT(*) as cnt FROM email_log WHERE created_at >= ?", (today,))
return {
-"total": total["cnt"] if total else 0,
-"delivered": delivered["cnt"] if delivered else 0,
-"bounced": bounced["cnt"] if bounced else 0,
-"sent_today": sent_today["cnt"] if sent_today else 0,
+"total": await count_where("email_log WHERE 1=1"),
+"delivered": await count_where("email_log WHERE last_event = 'delivered'"),
+"bounced": await count_where("email_log WHERE last_event = 'bounced'"),
+"sent_today": await count_where("email_log WHERE created_at >= ?", (today,)),
}
async def get_unread_count() -> int:
"""Count unread inbound emails."""
-row = await fetch_one("SELECT COUNT(*) as cnt FROM inbound_emails WHERE is_read = 0")
-return row["cnt"] if row else 0
+return await count_where("inbound_emails WHERE is_read = 0")
@bp.route("/emails")
@@ -1824,11 +1811,7 @@ async def template_detail(slug: str):
columns = await get_table_columns(config["data_table"])
sample_rows = await fetch_template_data(config["data_table"], limit=10)
# Count generated articles
-row = await fetch_one(
-"SELECT COUNT(*) as cnt FROM articles WHERE template_slug = ?", (slug,),
-)
-generated_count = row["cnt"] if row else 0
+generated_count = await count_where("articles WHERE template_slug = ?", (slug,))
return await render_template(
"admin/template_detail.html",
@@ -1959,8 +1942,8 @@ async def _query_scenarios(search: str, country: str, venue_type: str) -> tuple[
f"SELECT * FROM published_scenarios WHERE {where} ORDER BY created_at DESC LIMIT 500",
tuple(params),
)
-total_row = await fetch_one("SELECT COUNT(*) as cnt FROM published_scenarios")
-return rows, (total_row["cnt"] if total_row else 0)
+total = await count_where("published_scenarios WHERE 1=1")
+return rows, total
@bp.route("/scenarios")
@@ -2203,6 +2186,27 @@ _ARTICLES_DIR = Path(__file__).parent.parent.parent.parent.parent / "data" / "co
_FRONTMATTER_RE = re.compile(r"^---\s*\n(.*?)\n---\s*\n", re.DOTALL)
def _find_article_md(slug: str) -> Path | None:
"""Return the Path of the .md file whose frontmatter slug matches, or None.
Tries the exact name first ({slug}.md), then scans _ARTICLES_DIR for any
file whose YAML frontmatter contains 'slug: <slug>'. This handles the
common pattern where files are named {slug}-{lang}.md but the frontmatter
slug omits the language suffix.
"""
if not _ARTICLES_DIR.is_dir():
return None
exact = _ARTICLES_DIR / f"{slug}.md"
if exact.exists():
return exact
for path in _ARTICLES_DIR.glob("*.md"):
raw = path.read_text(encoding="utf-8")
m = _FRONTMATTER_RE.match(raw)
if m and f"slug: {slug}" in m.group(1):
return path
return None
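The frontmatter match that `_find_article_md` relies on can be exercised in isolation (regex copied from the module above; the sample document content is invented):

```python
import re

# Same pattern as the module's _FRONTMATTER_RE.
_FRONTMATTER_RE = re.compile(r"^---\s*\n(.*?)\n---\s*\n", re.DOTALL)

raw = "---\nslug: padel-court-cost\nlanguage: de\n---\n# Heading\nBody text.\n"
m = _FRONTMATTER_RE.match(raw)
assert m is not None
print("slug: padel-court-cost" in m.group(1))  # → True

# Stripping the frontmatter leaves only the prose body, as the edit view does.
body = raw[m.end():].lstrip("\n")
print(body.splitlines()[0])  # → # Heading
```

This is the pattern that lets a file named `padel-court-cost-de.md` be found by its frontmatter slug `padel-court-cost`.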
async def _sync_static_articles() -> None:
"""Upsert static .md articles from data/content/articles/ into the DB.
@@ -2491,6 +2495,101 @@ async def article_results():
)
@bp.route("/articles/bulk", methods=["POST"])
@role_required("admin")
@csrf_protect
async def articles_bulk():
"""Bulk actions on articles: publish, unpublish, toggle_noindex, rebuild, delete."""
form = await request.form
ids_raw = form.get("article_ids", "").strip()
action = form.get("action", "").strip()
valid_actions = ("publish", "unpublish", "toggle_noindex", "rebuild", "delete")
if action not in valid_actions or not ids_raw:
return "", 400
article_ids = [int(i) for i in ids_raw.split(",") if i.strip().isdigit()]
if len(article_ids) > 500:
return "", 400  # reject oversized bulk requests; a bare assert is stripped under python -O
if not article_ids:
return "", 400
placeholders = ",".join("?" for _ in article_ids)
now = utcnow_iso()
if action == "publish":
await execute(
f"UPDATE articles SET status = 'published', updated_at = ? WHERE id IN ({placeholders})",
(now, *article_ids),
)
from ..sitemap import invalidate_sitemap_cache
invalidate_sitemap_cache()
elif action == "unpublish":
await execute(
f"UPDATE articles SET status = 'draft', updated_at = ? WHERE id IN ({placeholders})",
(now, *article_ids),
)
from ..sitemap import invalidate_sitemap_cache
invalidate_sitemap_cache()
elif action == "toggle_noindex":
await execute(
f"UPDATE articles SET noindex = CASE WHEN noindex = 1 THEN 0 ELSE 1 END, updated_at = ? WHERE id IN ({placeholders})",
(now, *article_ids),
)
elif action == "rebuild":
for aid in article_ids:
await _rebuild_article(aid)
elif action == "delete":
from ..content.routes import BUILD_DIR
articles = await fetch_all(
f"SELECT id, slug FROM articles WHERE id IN ({placeholders})",
tuple(article_ids),
)
for a in articles:
build_path = BUILD_DIR / f"{a['slug']}.html"
if build_path.exists():
build_path.unlink()
md_path = Path("data/content/articles") / f"{a['slug']}.md"
if md_path.exists():
md_path.unlink()
await execute(
f"DELETE FROM articles WHERE id IN ({placeholders})",
tuple(article_ids),
)
from ..sitemap import invalidate_sitemap_cache
invalidate_sitemap_cache()
# Re-render results partial with current filters
search = form.get("search", "").strip()
status_filter = form.get("status", "")
template_filter = form.get("template", "")
language_filter = form.get("language", "")
grouped = not language_filter
if grouped:
article_list = await _get_article_list_grouped(
status=status_filter or None, template_slug=template_filter or None,
search=search or None,
)
else:
article_list = await _get_article_list(
status=status_filter or None, template_slug=template_filter or None,
language=language_filter or None, search=search or None,
)
return await render_template(
"admin/partials/article_results.html",
articles=article_list,
grouped=grouped,
page=1,
is_generating=await _is_generating(),
)
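The `placeholders` technique used throughout `articles_bulk` — one `?` per id so the ids stay parameterised and never touch the SQL string — can be demonstrated standalone (sketch against an in-memory SQLite table; names are illustrative):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE articles (id INTEGER PRIMARY KEY, status TEXT)")
conn.executemany("INSERT INTO articles VALUES (?, 'draft')", [(1,), (2,), (3,)])

article_ids = [1, 3]
# Only the placeholder count is interpolated into the SQL, never the values.
placeholders = ",".join("?" for _ in article_ids)
conn.execute(
    f"UPDATE articles SET status = 'published' WHERE id IN ({placeholders})",
    article_ids,
)
rows = conn.execute(
    "SELECT id FROM articles WHERE status = 'published' ORDER BY id"
).fetchall()
print([r[0] for r in rows])  # → [1, 3]
```

Paired with the length cap on `article_ids`, this keeps the dynamically sized `IN (…)` clause both bounded and injection-safe.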
@bp.route("/articles/new", methods=["GET", "POST"])
@role_required("admin")
@csrf_protect
@@ -2519,11 +2618,11 @@ async def article_new():
if not title or not body:
await flash("Title and body are required.", "error")
return await render_template("admin/article_form.html", data=dict(form), editing=False)
return await render_template("admin/article_form.html", data=dict(form), editing=False, preview_doc="")
if is_reserved_path(url_path):
await flash(f"URL path '{url_path}' conflicts with a reserved route.", "error")
return await render_template("admin/article_form.html", data=dict(form), editing=False)
return await render_template("admin/article_form.html", data=dict(form), editing=False, preview_doc="")
# Render markdown → HTML with scenario + product cards baked in
body_html = mistune.html(body)
@@ -2556,7 +2655,7 @@ async def article_new():
await flash(f"Article '{title}' created.", "success")
return redirect(url_for("admin.articles"))
return await render_template("admin/article_form.html", data={}, editing=False)
return await render_template("admin/article_form.html", data={}, editing=False, preview_doc="")
@bp.route("/articles/<int:article_id>/edit", methods=["GET", "POST"])
@@ -2592,7 +2691,7 @@ async def article_edit(article_id: int):
if is_reserved_path(url_path):
await flash(f"URL path '{url_path}' conflicts with a reserved route.", "error")
return await render_template(
"admin/article_form.html", data=dict(form), editing=True, article_id=article_id,
"admin/article_form.html", data=dict(form), editing=True, article_id=article_id, preview_doc="",
)
# Re-render if body provided
@@ -2626,18 +2725,55 @@ async def article_edit(article_id: int):
# Load markdown source if available (manual or generated)
from ..content.routes import BUILD_DIR as CONTENT_BUILD_DIR
md_path = Path("data/content/articles") / f"{article['slug']}.md"
if not md_path.exists():
md_path = _find_article_md(article["slug"])
if md_path is None:
lang = article["language"] or "en"
md_path = CONTENT_BUILD_DIR / lang / "md" / f"{article['slug']}.md"
body = md_path.read_text() if md_path.exists() else ""
fallback = CONTENT_BUILD_DIR / lang / "md" / f"{article['slug']}.md"
md_path = fallback if fallback.exists() else None
raw = md_path.read_text() if md_path else ""
# Strip YAML frontmatter so only the prose body appears in the editor
m = _FRONTMATTER_RE.match(raw)
body = raw[m.end():].lstrip("\n") if m else raw
body_html = mistune.html(body) if body else ""
css_url = url_for("static", filename="css/output.css")
preview_doc = (
f"<!doctype html><html><head>"
f"<link rel='stylesheet' href='{css_url}'>"
f"<style>html,body{{margin:0;padding:0}}body{{padding:2rem 2.5rem}}</style>"
f"</head><body><div class='article-body'>{body_html}</div></body></html>"
) if body_html else ""
data = {**dict(article), "body": body}
return await render_template(
"admin/article_form.html", data=data, editing=True, article_id=article_id,
"admin/article_form.html",
data=data,
editing=True,
article_id=article_id,
preview_doc=preview_doc,
)
@bp.route("/articles/preview", methods=["POST"])
@role_required("admin")
@csrf_protect
async def article_preview():
"""Render markdown body to HTML for the live editor preview panel."""
form = await request.form
body = form.get("body", "")
m = _FRONTMATTER_RE.match(body)
body = body[m.end():].lstrip("\n") if m else body
body_html = mistune.html(body) if body else ""
css_url = url_for("static", filename="css/output.css")
preview_doc = (
f"<!doctype html><html><head>"
f"<link rel='stylesheet' href='{css_url}'>"
f"<style>html,body{{margin:0;padding:0}}body{{padding:2rem 2.5rem}}</style>"
f"</head><body><div class='article-body'>{body_html}</div></body></html>"
) if body_html else ""
return await render_template("admin/partials/article_preview.html", preview_doc=preview_doc)
@bp.route("/articles/<int:article_id>/delete", methods=["POST"])
@role_required("admin")
@csrf_protect
@@ -2927,11 +3063,9 @@ _CSV_IMPORT_LIMIT = 500 # guard against huge uploads
async def get_follow_up_due_count() -> int:
"""Count pipeline suppliers with follow_up_at <= today."""
row = await fetch_one(
"""SELECT COUNT(*) as cnt FROM suppliers
WHERE outreach_status IS NOT NULL AND follow_up_at <= date('now')"""
)
return row["cnt"] if row else 0
return await count_where(
"suppliers WHERE outreach_status IS NOT NULL AND follow_up_at <= date('now')"
)
async def get_outreach_pipeline() -> dict:


@@ -226,10 +226,9 @@ document.addEventListener('DOMContentLoaded', function() {
<a href="{{ url_for('admin.affiliate_products') }}" class="btn-outline">Cancel</a>
</div>
{% if editing %}
<form method="post" action="{{ url_for('admin.affiliate_delete', product_id=product_id) }}" style="margin:0">
<form method="post" action="{{ url_for('admin.affiliate_delete', product_id=product_id) }}" style="margin:0" hx-boost="true">
<input type="hidden" name="csrf_token" value="{{ csrf_token() }}">
<button type="submit" class="btn-outline"
onclick="event.preventDefault(); confirmAction('Delete this product? This cannot be undone.', this.closest('form'))">Delete</button>
</form>
{% endif %}
</div>


@@ -120,10 +120,9 @@ document.addEventListener('DOMContentLoaded', function() {
<a href="{{ url_for('admin.affiliate_programs') }}" class="btn-outline">Cancel</a>
</div>
{% if editing %}
<form method="post" action="{{ url_for('admin.affiliate_program_delete', program_id=program_id) }}" style="margin:0">
<form method="post" action="{{ url_for('admin.affiliate_program_delete', program_id=program_id) }}" style="margin:0" hx-boost="true">
<input type="hidden" name="csrf_token" value="{{ csrf_token() }}">
<button type="submit" class="btn-outline"
onclick="event.preventDefault(); confirmAction('Delete this program? Blocked if products reference it.', this.closest('form'))">Delete</button>
</form>
{% endif %}
</div>


@@ -1,89 +1,413 @@
{% extends "admin/base_admin.html" %}
{% set admin_page = "articles" %}
{% block title %}{% if editing %}Edit{% else %}New{% endif %} Article - Admin - {{ config.APP_NAME }}{% endblock %}
{% block title %}{% if editing %}Edit{% else %}New{% endif %} Article · Admin · {{ config.APP_NAME }}{% endblock %}
{% block head %}{{ super() }}
<style>
/* Override admin-main so the split editor fills the column */
.admin-main {
padding: 0;
overflow: hidden;
display: flex;
flex-direction: column;
}
/* ── Editor shell ──────────────────────────────────────────── */
.ae-shell {
display: flex;
flex-direction: column;
flex: 1;
min-height: 0;
overflow: hidden;
}
/* ── Toolbar ────────────────────────────────────────────────── */
.ae-toolbar {
display: flex;
align-items: center;
gap: 0.75rem;
padding: 0.625rem 1.25rem;
background: #fff;
border-bottom: 1px solid #E2E8F0;
flex-shrink: 0;
}
.ae-toolbar__back {
font-size: 0.8125rem;
color: #64748B;
text-decoration: none;
flex-shrink: 0;
transition: color 0.1s;
}
.ae-toolbar__back:hover { color: #0F172A; }
.ae-toolbar__sep {
width: 1px; height: 1.25rem;
background: #E2E8F0;
flex-shrink: 0;
}
.ae-toolbar__title {
font-size: 0.875rem;
font-weight: 600;
color: #0F172A;
flex: 1;
white-space: nowrap;
overflow: hidden;
text-overflow: ellipsis;
}
.ae-toolbar__status {
font-size: 0.6875rem;
font-weight: 600;
text-transform: uppercase;
letter-spacing: 0.06em;
padding: 0.2rem 0.55rem;
border-radius: 9999px;
flex-shrink: 0;
}
.ae-toolbar__status--draft {
background: #F1F5F9;
color: #64748B;
}
.ae-toolbar__status--published {
background: #DCFCE7;
color: #16A34A;
}
/* ── Metadata strip ─────────────────────────────────────────── */
#ae-form {
display: contents; /* form participates in flex layout as transparent wrapper */
}
.ae-meta {
padding: 0.75rem 1.25rem;
background: #F8FAFC;
border-bottom: 1px solid #E2E8F0;
flex-shrink: 0;
}
.ae-meta__row {
display: flex;
gap: 0.625rem;
flex-wrap: wrap;
align-items: end;
}
.ae-meta__row + .ae-meta__row { margin-top: 0.5rem; }
.ae-field {
display: flex;
flex-direction: column;
gap: 0.2rem;
min-width: 0;
}
.ae-field--flex1 { flex: 1; min-width: 120px; }
.ae-field--flex2 { flex: 2; min-width: 180px; }
.ae-field--flex3 { flex: 3; min-width: 220px; }
.ae-field--fixed80 { flex: 0 0 80px; }
.ae-field--fixed120 { flex: 0 0 120px; }
.ae-field--fixed160 { flex: 0 0 160px; }
.ae-field label {
font-size: 0.625rem;
font-weight: 700;
text-transform: uppercase;
letter-spacing: 0.07em;
color: #94A3B8;
white-space: nowrap;
}
.ae-field input,
.ae-field select {
width: 100%;
padding: 0.3rem 0.5rem;
border: 1px solid #E2E8F0;
border-radius: 4px;
font-size: 0.8125rem;
font-family: var(--font-sans);
color: #0F172A;
background: #fff;
outline: none;
transition: border-color 0.15s, box-shadow 0.15s;
min-width: 0;
}
.ae-field input:focus,
.ae-field select:focus {
border-color: #1D4ED8;
box-shadow: 0 0 0 2px rgba(29,78,216,0.1);
}
.ae-field input[readonly] {
background: #F1F5F9;
color: #94A3B8;
}
/* ── Split pane ─────────────────────────────────────────────── */
.ae-split {
display: grid;
grid-template-columns: 1fr 1fr;
flex: 1;
min-height: 0;
overflow: hidden;
}
.ae-pane {
display: flex;
flex-direction: column;
min-height: 0;
overflow: hidden;
}
.ae-pane--editor { border-right: 1px solid #E2E8F0; }
.ae-pane__header {
display: flex;
align-items: center;
justify-content: space-between;
padding: 0.375rem 0.875rem;
background: #F8FAFC;
border-bottom: 1px solid #E2E8F0;
flex-shrink: 0;
}
.ae-pane--preview .ae-pane__header {
background: #F8FAFC;
border-bottom: 1px solid #E2E8F0;
}
.ae-pane__label {
font-size: 0.625rem;
font-weight: 700;
text-transform: uppercase;
letter-spacing: 0.09em;
color: #94A3B8;
}
.ae-pane__hint {
font-size: 0.625rem;
font-family: var(--font-mono);
color: #94A3B8;
}
/* The markdown textarea */
.ae-editor {
flex: 1;
resize: none;
border: none;
outline: none;
padding: 1.5rem 2rem;
font-family: var(--font-mono);
font-size: 0.875rem;
line-height: 1.8;
background: #FEFDFB;
color: #1E293B;
caret-color: #1D4ED8;
tab-size: 2;
}
.ae-editor::placeholder { color: #CBD5E1; }
.ae-editor:focus { outline: none; }
/* Preview pane — iframe fills the content area */
#ae-preview-content {
flex: 1;
display: flex;
min-height: 0;
}
.preview-placeholder {
font-size: 0.875rem;
color: #94A3B8;
font-style: italic;
margin: 1.5rem 2rem;
}
/* Collapsible metadata */
.ae-meta--collapsed { display: none; }
.ae-toolbar__toggle {
font-size: 0.75rem;
font-weight: 600;
color: #64748B;
background: none;
border: 1px solid #E2E8F0;
border-radius: 4px;
padding: 0.25rem 0.6rem;
cursor: pointer;
flex-shrink: 0;
}
.ae-toolbar__toggle:hover { color: #0F172A; border-color: #94A3B8; }
/* Word count footer */
.ae-footer {
display: flex;
align-items: center;
justify-content: flex-end;
padding: 0.25rem 0.875rem;
background: #F8FAFC;
border-top: 1px solid #E2E8F0;
flex-shrink: 0;
}
.ae-wordcount {
font-size: 0.625rem;
font-family: var(--font-mono);
color: #94A3B8;
}
/* HTMX loading indicator — htmx toggles .htmx-request on the element */
.ae-loading {
font-size: 0.625rem;
color: #94A3B8;
font-family: var(--font-mono);
opacity: 0;
transition: opacity 0.2s;
}
.ae-loading.htmx-request { opacity: 1; }
/* Responsive: stack on narrow screens */
@media (max-width: 900px) {
.ae-split { grid-template-columns: 1fr; }
.ae-pane--preview { display: none; }
}
</style>
{% endblock %}
{% block admin_content %}
<div style="max-width: 48rem; margin: 0 auto;">
<a href="{{ url_for('admin.articles') }}" class="text-sm text-slate">&larr; Back to articles</a>
<h1 class="text-2xl mt-4 mb-6">{% if editing %}Edit{% else %}New{% endif %} Article</h1>
<div class="ae-shell">
<form method="post" class="card">
<input type="hidden" name="csrf_token" value="{{ csrf_token() }}">
<!-- Toolbar -->
<div class="ae-toolbar">
<a href="{{ url_for('admin.articles') }}" class="ae-toolbar__back">← Articles</a>
<div class="ae-toolbar__sep"></div>
<span class="ae-toolbar__title">
{% if editing %}{{ data.get('title', 'Edit Article') }}{% else %}New Article{% endif %}
</span>
{% if editing %}
<span class="ae-toolbar__status ae-toolbar__status--{{ data.get('status', 'draft') }}">
{{ data.get('status', 'draft') }}
</span>
{% endif %}
<button type="button" class="ae-toolbar__toggle"
onclick="document.querySelector('.ae-meta').classList.toggle('ae-meta--collapsed')">Meta ▾</button>
<button form="ae-form" type="submit" class="btn btn-sm">
{% if editing %}Save Changes{% else %}Create Article{% endif %}
</button>
</div>
<div style="display: grid; grid-template-columns: 1fr 1fr; gap: 1rem;" class="mb-4">
<div>
<label class="form-label" for="title">Title</label>
<input type="text" id="title" name="title" value="{{ data.get('title', '') }}" class="form-input" required>
<!-- Form wraps everything below the toolbar -->
<form id="ae-form" method="post" style="display:contents;">
<input type="hidden" name="csrf_token" value="{{ csrf_token() }}">
<!-- Metadata strip -->
<div class="ae-meta">
<div class="ae-meta__row">
<div class="ae-field ae-field--flex3">
<label for="title">Title</label>
<input type="text" id="title" name="title" value="{{ data.get('title', '') }}"
required placeholder="Article title…">
</div>
<div>
<label class="form-label" for="slug">Slug</label>
<input type="text" id="slug" name="slug" value="{{ data.get('slug', '') }}" class="form-input"
placeholder="auto-generated from title" {% if editing %}readonly{% endif %}>
<div class="ae-field ae-field--flex2">
<label for="slug">Slug</label>
<input type="text" id="slug" name="slug" value="{{ data.get('slug', '') }}"
placeholder="auto-generated" {% if editing %}readonly{% endif %}>
</div>
</div>
<div class="mb-4">
<label class="form-label" for="url_path">URL Path</label>
<input type="text" id="url_path" name="url_path" value="{{ data.get('url_path', '') }}" class="form-input"
placeholder="e.g. /padel-court-cost-miami">
<p class="form-hint">Defaults to /slug. Must not conflict with existing routes.</p>
</div>
<div class="mb-4">
<label class="form-label" for="meta_description">Meta Description</label>
<input type="text" id="meta_description" name="meta_description" value="{{ data.get('meta_description', '') }}"
class="form-input" maxlength="160">
</div>
<div style="display: grid; grid-template-columns: 1fr 1fr 1fr; gap: 1rem;" class="mb-4">
<div>
<label class="form-label" for="country">Country</label>
<input type="text" id="country" name="country" value="{{ data.get('country', '') }}" class="form-input"
placeholder="e.g. US">
<div class="ae-field ae-field--flex2">
<label for="url_path">URL Path</label>
<input type="text" id="url_path" name="url_path" value="{{ data.get('url_path', '') }}"
placeholder="/slug">
</div>
<div>
<label class="form-label" for="region">Region</label>
<input type="text" id="region" name="region" value="{{ data.get('region', '') }}" class="form-input"
placeholder="e.g. North America">
</div>
<div>
<label class="form-label" for="og_image_url">OG Image URL</label>
<input type="text" id="og_image_url" name="og_image_url" value="{{ data.get('og_image_url', '') }}" class="form-input">
</div>
</div>
<div class="mb-4">
<label class="form-label" for="body">Body (Markdown)</label>
<textarea id="body" name="body" rows="20" class="form-input"
style="font-family: var(--font-mono); font-size: 0.8125rem;" {% if not editing %}required{% endif %}>{{ data.get('body', '') }}</textarea>
<p class="form-hint">Use [scenario:slug] to embed scenario widgets. Sections: :capex, :operating, :cashflow, :returns, :full</p>
</div>
<div style="display: grid; grid-template-columns: 1fr 1fr 1fr; gap: 1rem;" class="mb-4">
<div>
<label class="form-label" for="language">Language</label>
<select id="language" name="language" class="form-input">
<option value="en" {% if data.get('language', 'en') == 'en' %}selected{% endif %}>English (en)</option>
<option value="de" {% if data.get('language') == 'de' %}selected{% endif %}>German (de)</option>
<div class="ae-field ae-field--fixed80">
<label for="language">Language</label>
<select id="language" name="language">
<option value="en" {% if data.get('language', 'en') == 'en' %}selected{% endif %}>EN</option>
<option value="de" {% if data.get('language') == 'de' %}selected{% endif %}>DE</option>
</select>
</div>
<div>
<label class="form-label" for="status">Status</label>
<select id="status" name="status" class="form-input">
<option value="draft" {% if data.get('status') == 'draft' %}selected{% endif %}>Draft</option>
<div class="ae-field ae-field--fixed120">
<label for="status">Status</label>
<select id="status" name="status">
<option value="draft" {% if data.get('status', 'draft') == 'draft' %}selected{% endif %}>Draft</option>
<option value="published" {% if data.get('status') == 'published' %}selected{% endif %}>Published</option>
</select>
</div>
<div>
<label class="form-label" for="published_at">Publish Date</label>
</div>
<div class="ae-meta__row">
<div class="ae-field ae-field--flex3">
<label for="meta_description">Meta Description</label>
<input type="text" id="meta_description" name="meta_description"
value="{{ data.get('meta_description', '') }}" maxlength="160"
placeholder="160 chars max…">
</div>
<div class="ae-field ae-field--flex1">
<label for="country">Country</label>
<input type="text" id="country" name="country" value="{{ data.get('country', '') }}"
placeholder="e.g. US">
</div>
<div class="ae-field ae-field--flex1">
<label for="region">Region</label>
<input type="text" id="region" name="region" value="{{ data.get('region', '') }}"
placeholder="e.g. North America">
</div>
<div class="ae-field ae-field--flex2">
<label for="og_image_url">OG Image URL</label>
<input type="text" id="og_image_url" name="og_image_url"
value="{{ data.get('og_image_url', '') }}">
</div>
<div class="ae-field ae-field--fixed160">
<label for="published_at">Publish Date</label>
<input type="datetime-local" id="published_at" name="published_at"
value="{{ data.get('published_at', '')[:16] if data.get('published_at') else '' }}" class="form-input">
<p class="form-hint">Leave blank for now. Future date = scheduled.</p>
value="{{ data.get('published_at', '')[:16] if data.get('published_at') else '' }}">
</div>
</div>
</div>
<!-- Split: editor | preview -->
<div class="ae-split">
<!-- Left — Markdown editor -->
<div class="ae-pane ae-pane--editor">
<div class="ae-pane__header">
<span class="ae-pane__label">Markdown</span>
<span class="ae-pane__hint">[scenario:slug] · [product:slug]</span>
</div>
<textarea
id="body" name="body"
class="ae-editor"
{% if not editing %}required{% endif %}
placeholder="Start writing in Markdown…"
hx-post="{{ url_for('admin.article_preview') }}"
hx-trigger="input delay:500ms"
hx-target="#ae-preview-content"
hx-include="[name=csrf_token]"
hx-indicator="#ae-loading"
>{{ data.get('body', '') }}</textarea>
<div class="ae-footer">
<span id="ae-wordcount" class="ae-wordcount">0 words</span>
</div>
</div>
<button type="submit" class="btn" style="width: 100%;">{% if editing %}Update Article{% else %}Create Article{% endif %}</button>
</form>
</div>
<!-- Right — Rendered preview -->
<div class="ae-pane ae-pane--preview">
<div class="ae-pane__header">
<span class="ae-pane__label">Preview</span>
<span id="ae-loading" class="ae-loading">Rendering…</span>
</div>
<div id="ae-preview-content" style="flex:1;display:flex;min-height:0;">
{% if preview_doc %}
<iframe
srcdoc="{{ preview_doc | e }}"
style="flex:1;width:100%;border:none;display:block;"
sandbox="allow-same-origin"
title="Article preview"
></iframe>
{% else %}
<p class="preview-placeholder">Start writing to see a preview.</p>
{% endif %}
</div>
</div>
</div>
</form>
</div>
<script>
(function () {
var textarea = document.getElementById('body');
var counter = document.getElementById('ae-wordcount');
function updateCount() {
var text = textarea.value.trim();
var count = text ? text.split(/\s+/).length : 0;
counter.textContent = count + (count === 1 ? ' word' : ' words');
}
textarea.addEventListener('input', updateCount);
updateCount();
}());
</script>
{% endblock %}


@@ -11,9 +11,10 @@
</div>
<div class="flex gap-2">
<a href="{{ url_for('admin.article_new') }}" class="btn btn-sm">New Article</a>
<form method="post" action="{{ url_for('admin.rebuild_all') }}" class="m-0" style="display:inline">
<form method="post" action="{{ url_for('admin.rebuild_all') }}" class="m-0" style="display:inline" hx-boost="true">
<input type="hidden" name="csrf_token" value="{{ csrf_token() }}">
<button type="button" class="btn-outline btn-sm" onclick="confirmAction('Rebuild all articles? This will re-render every article from its template.', this.closest('form'))">Rebuild All</button>
<button type="submit" class="btn-outline btn-sm"
hx-confirm="Rebuild all articles? This will re-render every article from its template.">Rebuild All</button>
</form>
</div>
</header>
@@ -69,8 +70,105 @@
</form>
</div>
{# Bulk action bar #}
<form id="article-bulk-form" style="display:none">
<input type="hidden" name="csrf_token" value="{{ csrf_token() }}">
<input type="hidden" name="article_ids" id="article-bulk-ids" value="">
<input type="hidden" name="action" id="article-bulk-action" value="">
<input type="hidden" name="search" value="{{ current_search }}">
<input type="hidden" name="status" value="{{ current_status }}">
<input type="hidden" name="template" value="{{ current_template }}">
<input type="hidden" name="language" value="{{ current_language }}">
</form>
<div id="article-bulk-bar" class="card mb-4" style="padding:0.75rem 1.25rem;display:none;align-items:center;gap:1rem;background:#EFF6FF;border:1px solid #BFDBFE;">
<span id="article-bulk-count" class="text-sm font-semibold text-navy">0 selected</span>
<select id="article-bulk-action-select" class="form-input" style="min-width:140px;padding:0.25rem 0.5rem;font-size:0.8125rem">
<option value="">Action…</option>
<option value="publish">Publish</option>
<option value="unpublish">Unpublish</option>
<option value="toggle_noindex">Toggle noindex</option>
<option value="rebuild">Rebuild</option>
<option value="delete">Delete</option>
</select>
<button type="button" class="btn btn-sm" onclick="submitArticleBulk()">Apply</button>
<button type="button" class="btn-outline btn-sm" onclick="clearArticleSelection()">Clear</button>
</div>
{# Results #}
<div id="article-results">
{% include "admin/partials/article_results.html" %}
</div>
<script>
const articleSelectedIds = new Set();
function toggleArticleSelect(id, checked) {
if (checked) articleSelectedIds.add(id);
else articleSelectedIds.delete(id);
updateArticleBulkBar();
}
function toggleArticleGroupSelect(checkbox) {
var ids = (checkbox.dataset.ids || '').split(',').map(Number).filter(Boolean);
ids.forEach(function(id) {
if (checkbox.checked) articleSelectedIds.add(id);
else articleSelectedIds.delete(id);
});
updateArticleBulkBar();
}
function clearArticleSelection() {
articleSelectedIds.clear();
document.querySelectorAll('.article-checkbox').forEach(function(cb) { cb.checked = false; });
var selectAll = document.getElementById('article-select-all');
if (selectAll) selectAll.checked = false;
updateArticleBulkBar();
}
function updateArticleBulkBar() {
var bar = document.getElementById('article-bulk-bar');
var count = document.getElementById('article-bulk-count');
var ids = document.getElementById('article-bulk-ids');
bar.style.display = articleSelectedIds.size > 0 ? 'flex' : 'none';
count.textContent = articleSelectedIds.size + ' selected';
ids.value = Array.from(articleSelectedIds).join(',');
}
function submitArticleBulk() {
var action = document.getElementById('article-bulk-action-select').value;
if (!action) return;
if (articleSelectedIds.size === 0) return;
function doSubmit() {
document.getElementById('article-bulk-action').value = action;
htmx.ajax('POST', '{{ url_for("admin.articles_bulk") }}', {
source: document.getElementById('article-bulk-form'),
target: '#article-results',
swap: 'innerHTML'
});
clearArticleSelection();
}
if (action === 'delete') {
showConfirm('Delete ' + articleSelectedIds.size + ' articles? This cannot be undone.').then(function(ok) {
if (ok) doSubmit();
});
} else {
doSubmit();
}
}
document.body.addEventListener('htmx:afterSwap', function(evt) {
if (evt.detail.target.id === 'article-results') {
document.querySelectorAll('.article-checkbox').forEach(function(cb) {
if (cb.dataset.ids) {
var ids = cb.dataset.ids.split(',').map(Number).filter(Boolean);
cb.checked = ids.length > 0 && ids.every(function(id) { return articleSelectedIds.has(id); });
} else {
cb.checked = articleSelectedIds.has(Number(cb.dataset.id));
}
});
}
});
</script>
{% endblock %}


@@ -27,10 +27,11 @@
<td class="text-sm">{{ c.email if c.email is defined else (c.get('email', '-') if c is mapping else '-') }}</td>
<td class="mono text-sm">{{ (c.created_at if c.created_at is defined else (c.get('created_at', '-') if c is mapping else '-'))[:16] if c else '-' }}</td>
<td style="text-align:right">
<form method="post" action="{{ url_for('admin.audience_contact_remove', audience_id=audience.audience_id) }}" style="display:inline">
<form method="post" action="{{ url_for('admin.audience_contact_remove', audience_id=audience.audience_id) }}" style="display:inline" hx-boost="true">
<input type="hidden" name="csrf_token" value="{{ csrf_token() }}">
<input type="hidden" name="contact_id" value="{{ c.id if c.id is defined else (c.get('id', '') if c is mapping else '') }}">
<button type="button" class="btn-outline btn-sm" style="color:#DC2626" onclick="confirmAction('Remove this contact from the audience?', this.closest('form'))">Remove</button>
<button type="submit" class="btn-outline btn-sm" style="color:#DC2626"
hx-confirm="Remove this contact from the audience?">Remove</button>
</form>
</td>
</tr>


@@ -228,35 +228,29 @@
<dialog id="confirm-dialog">
<p id="confirm-msg"></p>
<div class="dialog-actions">
<button id="confirm-cancel" class="btn-outline btn-sm">Cancel</button>
<button id="confirm-ok" class="btn btn-sm">Confirm</button>
</div>
<form method="dialog" class="dialog-actions">
<button value="cancel" class="btn-outline btn-sm">Cancel</button>
<button value="ok" class="btn btn-sm">Confirm</button>
</form>
</dialog>
<script>
function confirmAction(message, form) {
function showConfirm(message) {
var dialog = document.getElementById('confirm-dialog');
document.getElementById('confirm-msg').textContent = message;
var ok = document.getElementById('confirm-ok');
var newOk = ok.cloneNode(true);
ok.replaceWith(newOk);
newOk.addEventListener('click', function() { dialog.close(); form.submit(); });
document.getElementById('confirm-cancel').addEventListener('click', function() { dialog.close(); }, { once: true });
dialog.showModal();
return new Promise(function(resolve) {
dialog.addEventListener('close', function() {
resolve(dialog.returnValue === 'ok');
}, { once: true });
});
}
// Intercept hx-confirm to use the styled dialog instead of window.confirm()
document.body.addEventListener('htmx:confirm', function(evt) {
var dialog = document.getElementById('confirm-dialog');
if (!dialog) return; // fallback: let HTMX use native confirm
if (!evt.detail.question) return;
evt.preventDefault();
document.getElementById('confirm-msg').textContent = evt.detail.question;
var ok = document.getElementById('confirm-ok');
var newOk = ok.cloneNode(true);
ok.replaceWith(newOk);
newOk.addEventListener('click', function() { dialog.close(); evt.detail.issueRequest(true); }, { once: true });
document.getElementById('confirm-cancel').addEventListener('click', function() { dialog.close(); }, { once: true });
dialog.showModal();
showConfirm(evt.detail.question).then(function(ok) {
if (ok) evt.detail.issueRequest(true);
});
});
</script>
{% endblock %}


@@ -19,7 +19,7 @@
<p class="text-slate text-sm">No data rows found. Run the data pipeline to populate <code>{{ config_data.data_table }}</code>.</p>
</div>
{% else %}
<form method="post" class="card">
<form method="post" class="card" hx-boost="true">
<input type="hidden" name="csrf_token" value="{{ csrf_token() }}">
<div class="mb-4">
@@ -45,7 +45,8 @@
</p>
</div>
<button type="button" class="btn" style="width: 100%;" onclick="confirmAction('Generate articles? Existing articles will be updated in-place.', this.closest('form'))">
<button type="submit" class="btn" style="width: 100%;"
hx-confirm="Generate articles? Existing articles will be updated in-place.">
Generate Articles
</button>
</form>

View File

@@ -126,8 +126,103 @@
</form>
</div>
<!-- Bulk action bar -->
<form id="lead-bulk-form" style="display:none">
<input type="hidden" name="csrf_token" value="{{ csrf_token() }}">
<input type="hidden" name="lead_ids" id="lead-bulk-ids" value="">
<input type="hidden" name="action" id="lead-bulk-action" value="">
<input type="hidden" name="target_status" id="lead-bulk-target-status" value="">
<input type="hidden" name="target_heat" id="lead-bulk-target-heat" value="">
<input type="hidden" name="search" value="{{ current_search }}">
<input type="hidden" name="status" value="{{ current_status }}">
<input type="hidden" name="heat" value="{{ current_heat }}">
<input type="hidden" name="country" value="{{ current_country }}">
<input type="hidden" name="days" value="{{ current_days }}">
</form>
<div id="lead-bulk-bar" class="card mb-4" style="padding:0.75rem 1.25rem;display:none;align-items:center;gap:1rem;background:#EFF6FF;border:1px solid #BFDBFE;">
<span id="lead-bulk-count" class="text-sm font-semibold text-navy">0 selected</span>
<select id="lead-bulk-action-select" class="form-input" style="min-width:120px;padding:0.25rem 0.5rem;font-size:0.8125rem" onchange="onLeadActionChange()">
<option value="">Action…</option>
<option value="set_status">Set Status</option>
<option value="set_heat">Set Heat</option>
</select>
<select id="lead-status-select" class="form-input" style="min-width:140px;padding:0.25rem 0.5rem;font-size:0.8125rem;display:none">
{% for s in statuses %}
<option value="{{ s }}">{{ s | replace('_', ' ') }}</option>
{% endfor %}
</select>
<select id="lead-heat-select" class="form-input" style="min-width:100px;padding:0.25rem 0.5rem;font-size:0.8125rem;display:none">
{% for h in heat_options %}
<option value="{{ h }}">{{ h | upper }}</option>
{% endfor %}
</select>
<button type="button" class="btn btn-sm" onclick="submitLeadBulk()">Apply</button>
<button type="button" class="btn-outline btn-sm" onclick="clearLeadSelection()">Clear</button>
</div>
<!-- Results -->
<div id="lead-results">
{% include "admin/partials/lead_results.html" %}
</div>
<script>
const leadSelectedIds = new Set();
function toggleLeadSelect(id, checked) {
if (checked) leadSelectedIds.add(id);
else leadSelectedIds.delete(id);
updateLeadBulkBar();
}
function clearLeadSelection() {
leadSelectedIds.clear();
document.querySelectorAll('.lead-checkbox').forEach(function(cb) { cb.checked = false; });
var selectAll = document.getElementById('lead-select-all');
if (selectAll) selectAll.checked = false;
updateLeadBulkBar();
}
function updateLeadBulkBar() {
var bar = document.getElementById('lead-bulk-bar');
var count = document.getElementById('lead-bulk-count');
var ids = document.getElementById('lead-bulk-ids');
bar.style.display = leadSelectedIds.size > 0 ? 'flex' : 'none';
count.textContent = leadSelectedIds.size + ' selected';
ids.value = Array.from(leadSelectedIds).join(',');
}
function onLeadActionChange() {
var action = document.getElementById('lead-bulk-action-select').value;
document.getElementById('lead-status-select').style.display = action === 'set_status' ? '' : 'none';
document.getElementById('lead-heat-select').style.display = action === 'set_heat' ? '' : 'none';
}
function submitLeadBulk() {
var action = document.getElementById('lead-bulk-action-select').value;
if (!action) return;
if (leadSelectedIds.size === 0) return;
document.getElementById('lead-bulk-action').value = action;
if (action === 'set_status') {
document.getElementById('lead-bulk-target-status').value = document.getElementById('lead-status-select').value;
} else if (action === 'set_heat') {
document.getElementById('lead-bulk-target-heat').value = document.getElementById('lead-heat-select').value;
}
htmx.ajax('POST', '{{ url_for("admin.leads_bulk") }}', {
source: document.getElementById('lead-bulk-form'),
target: '#lead-results',
swap: 'innerHTML'
});
clearLeadSelection();
}
document.body.addEventListener('htmx:afterSwap', function(evt) {
if (evt.detail.target.id === 'lead-results') {
document.querySelectorAll('.lead-checkbox').forEach(function(cb) {
if (leadSelectedIds.has(Number(cb.dataset.id))) cb.checked = true;
});
}
});
</script>
{% endblock %}

View File

@@ -21,10 +21,9 @@
</td>
<td class="text-right" style="white-space:nowrap">
<a href="{{ url_for('admin.affiliate_program_edit', program_id=prog.id) }}" class="btn-outline btn-sm">Edit</a>
<form method="post" action="{{ url_for('admin.affiliate_program_delete', program_id=prog.id) }}" style="display:inline">
<form method="post" action="{{ url_for('admin.affiliate_program_delete', program_id=prog.id) }}" style="display:inline" hx-boost="true">
<input type="hidden" name="csrf_token" value="{{ csrf_token() }}">
<button type="submit" class="btn-outline btn-sm"
onclick="event.preventDefault(); confirmAction('Delete {{ prog.name }}? This is blocked if products reference it.', this.closest('form'))">Delete</button>
</form>
</td>
</tr>

View File

@@ -20,10 +20,9 @@
<td class="mono text-right">{{ product.click_count or 0 }}</td>
<td class="text-right" style="white-space:nowrap">
<a href="{{ url_for('admin.affiliate_edit', product_id=product.id) }}" class="btn-outline btn-sm">Edit</a>
<form method="post" action="{{ url_for('admin.affiliate_delete', product_id=product.id) }}" style="display:inline">
<form method="post" action="{{ url_for('admin.affiliate_delete', product_id=product.id) }}" style="display:inline" hx-boost="true">
<input type="hidden" name="csrf_token" value="{{ csrf_token() }}">
<button type="submit" class="btn-outline btn-sm"
onclick="event.preventDefault(); confirmAction('Delete {{ product.name }}?', this.closest('form'))">Delete</button>
</form>
</td>
</tr>

View File

@@ -1,4 +1,9 @@
<tr id="article-group-{{ g.url_path | replace('/', '-') | trim('-') }}">
<td onclick="event.stopPropagation()">
<input type="checkbox" class="article-checkbox"
data-ids="{{ g.variants | map(attribute='id') | join(',') }}"
onchange="toggleArticleGroupSelect(this)">
</td>
<td style="max-width:260px">
<div style="overflow:hidden;text-overflow:ellipsis;white-space:nowrap;font-weight:500" title="{{ g.url_path }}">{{ g.title }}</div>
<div class="article-subtitle">{{ g.url_path }}</div>

View File

@@ -0,0 +1,12 @@
{# HTMX partial: sandboxed iframe showing a rendered article preview.
Rendered by POST /admin/articles/preview. #}
{% if preview_doc %}
<iframe
srcdoc="{{ preview_doc | e }}"
style="flex:1;width:100%;border:none;display:block;"
sandbox="allow-same-origin"
title="Article preview"
></iframe>
{% else %}
<p class="preview-placeholder">Start writing to see a preview.</p>
{% endif %}

View File

@@ -54,6 +54,11 @@
<table class="table text-sm">
<thead>
<tr>
{% if not grouped %}
<th style="width:32px"><input type="checkbox" id="article-select-all" onchange="document.querySelectorAll('.article-checkbox').forEach(cb => { cb.checked = this.checked; toggleArticleSelect(Number(cb.dataset.id), this.checked); })"></th>
{% else %}
<th style="width:32px"><input type="checkbox" id="article-select-all" onchange="document.querySelectorAll('.article-checkbox').forEach(cb => { cb.checked = this.checked; toggleArticleGroupSelect(cb); })"></th>
{% endif %}
<th>Title</th>
<th>{% if grouped %}Variants{% else %}Status{% endif %}</th>
<th>Published</th>

View File

@@ -1,4 +1,8 @@
<tr id="article-{{ a.id }}">
<td onclick="event.stopPropagation()">
<input type="checkbox" class="article-checkbox" data-id="{{ a.id }}"
onchange="toggleArticleSelect({{ a.id }}, this.checked)">
</td>
<td style="max-width:280px; overflow:hidden; text-overflow:ellipsis; white-space:nowrap"
title="{{ a.url_path }}">{{ a.title }}</td>
<td>

View File

@@ -29,6 +29,7 @@
<table class="table">
<thead>
<tr>
<th style="width:32px"><input type="checkbox" id="lead-select-all" onchange="document.querySelectorAll('.lead-checkbox').forEach(cb => { cb.checked = this.checked; toggleLeadSelect(Number(cb.dataset.id), this.checked); })"></th>
<th>ID</th>
<th>Heat</th>
<th>Contact</th>
@@ -43,6 +44,10 @@
<tbody>
{% for lead in leads %}
<tr data-href="{{ url_for('admin.lead_detail', lead_id=lead.id) }}">
<td onclick="event.stopPropagation()">
<input type="checkbox" class="lead-checkbox" data-id="{{ lead.id }}"
onchange="toggleLeadSelect({{ lead.id }}, this.checked)">
</td>
<td><a href="{{ url_for('admin.lead_detail', lead_id=lead.id) }}">#{{ lead.id }}</a></td>
<td>{{ heat_badge(lead.heat_score) }}</td>
<td>

View File

@@ -29,10 +29,10 @@
</div>
</form>
<form method="post" action="{{ url_for('pipeline.pipeline_trigger_extract') }}" class="m-0">
<form method="post" action="{{ url_for('pipeline.pipeline_trigger_extract') }}" class="m-0" hx-boost="true">
<input type="hidden" name="csrf_token" value="{{ csrf_token() }}">
<button type="button" class="btn-outline btn-sm"
onclick="confirmAction('Enqueue a full extraction run? This will run all extractors in the background.', this.closest('form'))">
<button type="submit" class="btn-outline btn-sm"
hx-confirm="Enqueue a full extraction run? This will run all extractors in the background.">
Run All Extractors
</button>
</form>
@@ -112,11 +112,11 @@
{% if run.status == 'running' %}
<form method="post"
action="{{ url_for('pipeline.pipeline_mark_stale', run_id=run.run_id) }}"
class="m-0">
class="m-0" hx-boost="true">
<input type="hidden" name="csrf_token" value="{{ csrf_token() }}">
<button type="button" class="btn-danger btn-sm"
<button type="submit" class="btn-danger btn-sm"
style="padding:2px 8px;font-size:11px"
onclick="confirmAction('Mark run #{{ run.run_id }} as failed? Only do this if the process is definitely dead.', this.closest('form'))">
hx-confirm="Mark run #{{ run.run_id }} as failed? Only do this if the process is definitely dead.">
Mark Failed
</button>
</form>

View File

@@ -16,8 +16,9 @@
{% set wf = row.workflow %}
{% set run = row.run %}
{% set stale = row.stale %}
<div style="border:1px solid #E2E8F0;border-radius:10px;padding:0.875rem;background:#FAFAFA">
<div class="flex items-center gap-2 mb-2">
{% set is_running = run and run.status == 'running' and not stale %}
<div style="border:1px solid {% if is_running %}#93C5FD{% else %}#E2E8F0{% endif %};border-radius:10px;padding:0.875rem;background:{% if is_running %}#EFF6FF{% else %}#FAFAFA{% endif %}">
<div class="flex items-center gap-2 mb-1">
{% if not run %}
<span class="status-dot pending"></span>
{% elif stale %}
@@ -33,6 +34,15 @@
{% if stale %}
<span class="badge-warning" style="font-size:10px;padding:1px 6px;margin-left:auto">stale</span>
{% endif %}
{% if is_running %}
<span class="btn btn-sm ml-auto"
style="padding:2px 8px;font-size:11px;opacity:0.6;cursor:default;pointer-events:none">
<svg class="spinner-icon" width="12" height="12" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="3">
<path d="M12 2a10 10 0 0 1 10 10" stroke-linecap="round"/>
</svg>
Running
</span>
{% else %}
<button type="button"
class="btn btn-sm ml-auto"
style="padding:2px 8px;font-size:11px"
@@ -41,9 +51,17 @@
hx-swap="outerHTML"
hx-vals='{"extractor": "{{ wf.name }}", "csrf_token": "{{ csrf_token() }}"}'
hx-confirm="Run {{ wf.name }} extractor?">Run</button>
{% endif %}
</div>
{% if wf.description %}
<p class="text-xs text-slate" style="margin-top:2px;margin-bottom:4px">{{ wf.description }}</p>
{% endif %}
<p class="text-xs text-slate">{{ wf.schedule_label }}</p>
{% if run %}
{% if is_running %}
<p class="text-xs mt-1" style="color:#2563EB">
Started {{ run.started_at[:16].replace('T', ' ') if run.started_at else '—' }} — running...
</p>
{% elif run %}
<p class="text-xs mono text-slate-dark mt-1">{{ run.started_at[:16].replace('T', ' ') if run.started_at else '—' }}</p>
{% if run.status == 'failed' and run.error_message %}
<p class="text-xs text-danger mt-1" style="font-family:monospace;word-break:break-all">

View File

@@ -36,9 +36,10 @@
<a href="{{ url_for('admin.scenario_pdf', scenario_id=s.id, lang='en') }}" class="btn-outline btn-sm">PDF EN</a>
<a href="{{ url_for('admin.scenario_pdf', scenario_id=s.id, lang='de') }}" class="btn-outline btn-sm">PDF DE</a>
<a href="{{ url_for('admin.scenario_edit', scenario_id=s.id) }}" class="btn-outline btn-sm">Edit</a>
<form method="post" action="{{ url_for('admin.scenario_delete', scenario_id=s.id) }}" class="m-0" style="display: inline;">
<form method="post" action="{{ url_for('admin.scenario_delete', scenario_id=s.id) }}" class="m-0" style="display: inline;" hx-boost="true">
<input type="hidden" name="csrf_token" value="{{ csrf_token() }}">
<button type="button" class="btn-outline btn-sm" onclick="confirmAction('Delete this scenario? This cannot be undone.', this.closest('form'))">Delete</button>
<button type="submit" class="btn-outline btn-sm"
hx-confirm="Delete this scenario? This cannot be undone.">Delete</button>
</form>
</td>
</tr>

View File

@@ -57,11 +57,11 @@
<p class="text-sm text-slate mt-1">Extraction status, data catalog, and ad-hoc query editor</p>
</div>
<div class="flex gap-2">
<form method="post" action="{{ url_for('pipeline.pipeline_trigger_transform') }}" class="m-0">
<form method="post" action="{{ url_for('pipeline.pipeline_trigger_transform') }}" class="m-0" hx-boost="true">
<input type="hidden" name="csrf_token" value="{{ csrf_token() }}">
<input type="hidden" name="step" value="pipeline">
<button type="button" class="btn btn-sm"
onclick="confirmAction('Run full ELT pipeline (extract → transform → export)? This runs in the background.', this.closest('form'))">
<button type="submit" class="btn btn-sm"
hx-confirm="Run full ELT pipeline (extract → transform → export)? This runs in the background.">
Run Pipeline
</button>
</form>

View File

@@ -13,9 +13,10 @@
</div>
<div class="flex gap-2">
<a href="{{ url_for('admin.template_generate', slug=config_data.slug) }}" class="btn">Generate Articles</a>
<form method="post" action="{{ url_for('admin.template_regenerate', slug=config_data.slug) }}" style="display:inline">
<form method="post" action="{{ url_for('admin.template_regenerate', slug=config_data.slug) }}" style="display:inline" hx-boost="true">
<input type="hidden" name="csrf_token" value="{{ csrf_token() }}">
<button type="button" class="btn-outline" onclick="confirmAction('Regenerate all articles for this template with fresh data? Existing articles will be overwritten.', this.closest('form'))">
<button type="submit" class="btn-outline"
hx-confirm="Regenerate all articles for this template with fresh data? Existing articles will be overwritten.">
Regenerate
</button>
</form>

web/src/padelnomics/api.py Normal file
View File

@@ -0,0 +1,100 @@
"""
JSON API endpoints for interactive maps.
Serves pre-aggregated geographic data from analytics.duckdb for Leaflet maps.
All responses are JSON with 1-hour public cache headers (data changes at most
daily when the pipeline runs).
"""
from quart import Blueprint, abort, jsonify
from .analytics import fetch_analytics
from .core import is_flag_enabled
bp = Blueprint("api", __name__)
_CACHE_HEADERS = {"Cache-Control": "public, max-age=3600"}
async def _require_maps_flag() -> None:
"""Abort with 404 if the maps feature flag is explicitly disabled.
Defaults to enabled (True) so that dev environments without the flag row
in the DB still work. An admin can disable by setting the flag to False.
"""
if not await is_flag_enabled("maps", default=True):
abort(404)
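The default-to-enabled behaviour described in the docstring can be sketched with a plain dict standing in for the feature-flag table. `flag_enabled` below is a hypothetical synchronous stand-in for `is_flag_enabled`, not the real implementation:

```python
# Hypothetical stand-in for is_flag_enabled(): a missing flag row falls back
# to the default (enabled here), an explicit False disables the endpoint.
def flag_enabled(flags: dict, name: str, default: bool = True) -> bool:
    value = flags.get(name)
    return default if value is None else bool(value)

assert flag_enabled({}, "maps") is True              # no row -> default enabled
assert flag_enabled({"maps": False}, "maps") is False
assert flag_enabled({"maps": True}, "maps") is True
```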
@bp.route("/markets/countries.json")
async def countries():
"""Country-level aggregates for the markets hub map."""
await _require_maps_flag()
rows = await fetch_analytics("""
SELECT country_code, country_name_en, country_slug,
COUNT(*) AS city_count,
SUM(padel_venue_count) AS total_venues,
ROUND(AVG(market_score), 1) AS avg_market_score,
AVG(lat) AS lat, AVG(lon) AS lon
FROM serving.city_market_profile
GROUP BY country_code, country_name_en, country_slug
HAVING SUM(padel_venue_count) > 0
ORDER BY total_venues DESC
""")
return jsonify(rows), 200, _CACHE_HEADERS
@bp.route("/markets/<country_slug>/cities.json")
async def country_cities(country_slug: str):
"""City-level data for a country overview bubble map."""
await _require_maps_flag()
assert country_slug, "country_slug required"
rows = await fetch_analytics(
"""
SELECT city_name, city_slug, lat, lon,
padel_venue_count, market_score, population
FROM serving.city_market_profile
WHERE country_slug = ?
ORDER BY padel_venue_count DESC
LIMIT 200
""",
[country_slug],
)
return jsonify(rows), 200, _CACHE_HEADERS
@bp.route("/markets/<country_slug>/<city_slug>/venues.json")
async def city_venues(country_slug: str, city_slug: str):
"""Venue-level dots for the city detail map."""
await _require_maps_flag()
assert country_slug and city_slug, "country_slug and city_slug required"
rows = await fetch_analytics(
"""
SELECT name, lat, lon, court_count,
indoor_court_count, outdoor_court_count
FROM serving.city_venue_locations
WHERE country_slug = ? AND city_slug = ?
LIMIT 500
""",
[country_slug, city_slug],
)
return jsonify(rows), 200, _CACHE_HEADERS
@bp.route("/opportunity/<country_slug>.json")
async def opportunity(country_slug: str):
"""Location-level opportunity scores for the opportunity map."""
await _require_maps_flag()
assert country_slug, "country_slug required"
rows = await fetch_analytics(
"""
SELECT location_name, location_slug, lat, lon,
opportunity_score, nearest_padel_court_km,
padel_venue_count, population
FROM serving.location_opportunity_profile
WHERE country_slug = ? AND opportunity_score > 0
ORDER BY opportunity_score DESC
LIMIT 500
""",
[country_slug],
)
return jsonify(rows), 200, _CACHE_HEADERS
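The countries aggregation can be exercised against a tiny in-memory table; `sqlite3` stands in for DuckDB here, with the schema trimmed to the columns the query references and the `serving.` prefix dropped. Sample rows are made up:

```python
import sqlite3

# Emulate serving.city_market_profile with only the columns the query uses.
con = sqlite3.connect(":memory:")
con.execute("""CREATE TABLE city_market_profile (
    country_code TEXT, country_name_en TEXT, country_slug TEXT,
    padel_venue_count INT, market_score REAL, lat REAL, lon REAL)""")
con.executemany(
    "INSERT INTO city_market_profile VALUES (?,?,?,?,?,?,?)",
    [
        ("ES", "Spain", "spain", 40, 82.0, 40.4, -3.7),
        ("ES", "Spain", "spain", 20, 74.0, 41.4, 2.2),
        ("NO", "Norway", "norway", 0, 10.0, 59.9, 10.8),  # dropped by HAVING
    ],
)
rows = con.execute("""
    SELECT country_code, country_name_en, country_slug,
           COUNT(*) AS city_count,
           SUM(padel_venue_count) AS total_venues,
           ROUND(AVG(market_score), 1) AS avg_market_score,
           AVG(lat) AS lat, AVG(lon) AS lon
    FROM city_market_profile
    GROUP BY country_code, country_name_en, country_slug
    HAVING SUM(padel_venue_count) > 0
    ORDER BY total_venues DESC
""").fetchall()
# Only Spain survives the HAVING filter: 2 cities, 60 venues, avg score 78.0.
```

The `HAVING SUM(padel_venue_count) > 0` clause is what keeps countries with zero venues off the hub map.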

View File

@@ -362,6 +362,7 @@ def create_app() -> Quart:
from .admin.pipeline_routes import bp as pipeline_bp
from .admin.pseo_routes import bp as pseo_bp
from .admin.routes import bp as admin_bp
from .api import bp as api_bp
from .auth.routes import bp as auth_bp
from .billing.routes import bp as billing_bp
from .content.routes import bp as content_bp
@@ -391,6 +392,9 @@ def create_app() -> Quart:
app.register_blueprint(pipeline_bp)
app.register_blueprint(webhooks_bp)
# JSON API for interactive maps (no lang prefix)
app.register_blueprint(api_bp, url_prefix="/api")
# Content catch-all LAST — lives under /<lang> too
app.register_blueprint(content_bp, url_prefix="/<lang>")

View File


@@ -0,0 +1,116 @@
"""
Paddle payment provider — checkout, webhook verification, subscription management.
Exports the 5 functions that billing/routes.py dispatches to:
- build_checkout_payload()
- build_multi_item_checkout_payload()
- cancel_subscription()
- get_management_url()
- handle_webhook()
"""
import json
from paddle_billing import Client as PaddleClient
from paddle_billing import Environment, Options
from paddle_billing.Notifications import Secret, Verifier
from ..core import config
def _paddle_client() -> PaddleClient:
"""Create a Paddle SDK client."""
env = Environment.SANDBOX if config.PADDLE_ENVIRONMENT == "sandbox" else Environment.PRODUCTION
return PaddleClient(config.PADDLE_API_KEY, options=Options(env))
class _WebhookRequest:
"""Minimal wrapper satisfying paddle_billing's Request Protocol."""
def __init__(self, body: bytes, headers):
self.body = body
self.headers = headers
_verifier = Verifier(maximum_variance=300)
def build_checkout_payload(
price_id: str, custom_data: dict, success_url: str,
) -> dict:
"""Build JSON payload for a single-item Paddle.js overlay checkout."""
return {
"items": [{"priceId": price_id, "quantity": 1}],
"customData": custom_data,
"settings": {"successUrl": success_url},
}
def build_multi_item_checkout_payload(
items: list[dict], custom_data: dict, success_url: str,
) -> dict:
"""Build JSON payload for a multi-item Paddle.js overlay checkout."""
return {
"items": items,
"customData": custom_data,
"settings": {"successUrl": success_url},
}
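For reference, this is the payload shape the overlay checkout ends up with; the function body mirrors `build_checkout_payload()` above, and the price ID, user ID, and URL are made-up sample values:

```python
# Mirror of build_checkout_payload() above; sample arguments are illustrative.
def build_checkout_payload(price_id: str, custom_data: dict, success_url: str) -> dict:
    return {
        "items": [{"priceId": price_id, "quantity": 1}],
        "customData": custom_data,
        "settings": {"successUrl": success_url},
    }

payload = build_checkout_payload(
    "pri_sample123",
    {"user_id": "42", "plan": "starter"},
    "https://example.com/billing/success",
)
assert payload["items"] == [{"priceId": "pri_sample123", "quantity": 1}]
assert payload["settings"]["successUrl"].endswith("/billing/success")
```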
def cancel_subscription(provider_subscription_id: str) -> None:
"""Cancel a Paddle subscription at end of current billing period."""
from paddle_billing.Resources.Subscriptions.Operations import CancelSubscription
paddle = _paddle_client()
paddle.subscriptions.cancel(
provider_subscription_id,
CancelSubscription(effective_from="next_billing_period"),
)
def get_management_url(provider_subscription_id: str) -> str:
"""Get the Paddle customer portal URL for updating payment method."""
paddle = _paddle_client()
paddle_sub = paddle.subscriptions.get(provider_subscription_id)
return paddle_sub.management_urls.update_payment_method
def verify_webhook(payload: bytes, headers) -> bool:
"""Verify Paddle webhook signature. Returns True if valid or no secret configured."""
if not config.PADDLE_WEBHOOK_SECRET:
return True
try:
return _verifier.verify(
_WebhookRequest(payload, headers),
Secret(config.PADDLE_WEBHOOK_SECRET),
)
except (ConnectionRefusedError, ValueError):
return False
def parse_webhook(payload: bytes) -> dict:
"""Parse a Paddle webhook payload into a normalized event dict.
Returns dict with keys: event_type, subscription_id, customer_id,
user_id, supplier_id, plan, status, current_period_end, data, items.
"""
event = json.loads(payload)
event_type = event.get("event_type", "")
data = event.get("data") or {}
custom_data = data.get("custom_data") or {}
billing_period = data.get("current_billing_period") or {}
return {
"event_type": event_type,
"subscription_id": data.get("id", ""),
"customer_id": str(data.get("customer_id", "")),
"user_id": custom_data.get("user_id"),
"supplier_id": custom_data.get("supplier_id"),
"plan": custom_data.get("plan", ""),
"status": data.get("status", ""),
"current_period_end": billing_period.get("ends_at"),
"data": data,
"items": data.get("items", []),
"custom_data": custom_data,
}
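Running `parse_webhook()` over a minimal subscription event shows the normalization; the function body is copied from above, and the payload below is an illustrative sample, not real Paddle data:

```python
import json

# Copy of parse_webhook() from above, applied to a made-up event payload.
def parse_webhook(payload: bytes) -> dict:
    event = json.loads(payload)
    data = event.get("data") or {}
    custom_data = data.get("custom_data") or {}
    billing_period = data.get("current_billing_period") or {}
    return {
        "event_type": event.get("event_type", ""),
        "subscription_id": data.get("id", ""),
        "customer_id": str(data.get("customer_id", "")),
        "user_id": custom_data.get("user_id"),
        "supplier_id": custom_data.get("supplier_id"),
        "plan": custom_data.get("plan", ""),
        "status": data.get("status", ""),
        "current_period_end": billing_period.get("ends_at"),
        "data": data,
        "items": data.get("items", []),
        "custom_data": custom_data,
    }

sample = json.dumps({
    "event_type": "subscription.activated",
    "data": {
        "id": "sub_sample",
        "customer_id": "ctm_sample",
        "status": "active",
        "custom_data": {"user_id": "42", "plan": "starter"},
        "current_billing_period": {"ends_at": "2026-04-01T00:00:00Z"},
    },
}).encode()

ev = parse_webhook(sample)
assert ev["subscription_id"] == "sub_sample"
assert ev["current_period_end"] == "2026-04-01T00:00:00Z"
assert ev["items"] == []  # absent keys normalize to safe defaults
```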

View File

@@ -1,6 +1,9 @@
"""
Billing domain: checkout, webhooks, subscription management.
Payment provider: paddle
Provider dispatch: PAYMENT_PROVIDER env var selects 'paddle' or 'stripe'.
Both webhook endpoints (/webhook/paddle and /webhook/stripe) stay active
regardless of the toggle — existing subscribers keep sending webhooks.
"""
import json
@@ -8,20 +11,21 @@ import secrets
from datetime import timedelta
from pathlib import Path
from paddle_billing import Client as PaddleClient
from paddle_billing import Environment, Options
from paddle_billing.Notifications import Secret, Verifier
from quart import Blueprint, flash, g, jsonify, redirect, render_template, request, session, url_for
from ..auth.routes import login_required
from ..core import config, execute, fetch_one, get_paddle_price, utcnow, utcnow_iso
from ..core import config, execute, fetch_one, get_price_id, utcnow, utcnow_iso
from ..i18n import get_translations
def _paddle_client() -> PaddleClient:
"""Create a Paddle SDK client. Used only for subscription management + webhook verification."""
env = Environment.SANDBOX if config.PADDLE_ENVIRONMENT == "sandbox" else Environment.PRODUCTION
return PaddleClient(config.PADDLE_API_KEY, options=Options(env))
def _provider():
"""Return the active payment provider module."""
if config.PAYMENT_PROVIDER == "stripe":
from . import stripe as mod
else:
from . import paddle as mod
return mod
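The dispatch rule is deliberately asymmetric: anything other than `"stripe"` falls back to Paddle. A minimal sketch with stub objects standing in for the provider modules:

```python
import types

# Stubs standing in for the paddle/stripe provider modules.
paddle = types.SimpleNamespace(name="paddle")
stripe = types.SimpleNamespace(name="stripe")

def provider(payment_provider: str):
    # Mirrors _provider(): only an explicit "stripe" selects Stripe.
    return stripe if payment_provider == "stripe" else paddle

assert provider("stripe").name == "stripe"
assert provider("paddle").name == "paddle"
assert provider("").name == "paddle"  # unset/unknown values fall back to paddle
```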
# Blueprint with its own template folder
bp = Blueprint(
@@ -33,7 +37,7 @@ bp = Blueprint(
# =============================================================================
# SQL Queries
# SQL Queries (provider-agnostic)
# =============================================================================
async def get_subscription(user_id: int) -> dict | None:
@@ -132,7 +136,7 @@ async def is_within_limits(user_id: int, resource: str, current_count: int) -> b
# =============================================================================
# Routes
# Routes (provider-agnostic)
# =============================================================================
@bp.route("/pricing")
@@ -151,129 +155,171 @@ async def success():
return await render_template("success.html")
# =============================================================================
# Paddle Implementation — Paddle.js Overlay Checkout
# Checkout / Manage / Cancel — dispatched to active provider
# =============================================================================
@bp.route("/checkout/<plan>", methods=["POST"])
@login_required
async def checkout(plan: str):
"""Return JSON for Paddle.js overlay checkout."""
price_id = await get_paddle_price(plan)
"""Return JSON for checkout (overlay for Paddle, redirect URL for Stripe)."""
price_id = await get_price_id(plan)
if not price_id:
return jsonify({"error": "Invalid plan selected."}), 400
return jsonify({
"items": [{"priceId": price_id, "quantity": 1}],
"customData": {"user_id": str(g.user["id"]), "plan": plan},
"settings": {
"successUrl": f"{config.BASE_URL}/billing/success",
},
})
payload = _provider().build_checkout_payload(
price_id=price_id,
custom_data={"user_id": str(g.user["id"]), "plan": plan},
success_url=f"{config.BASE_URL}/billing/success",
)
return jsonify(payload)
@bp.route("/checkout/item", methods=["POST"])
@login_required
async def checkout_item():
"""Return checkout JSON for a single item (boost, credit pack, etc.).
Used by dashboard boost/credit buttons that need a server round-trip
for Stripe (Checkout Session creation) and work with Paddle overlay too.
Expects JSON body: {price_key, custom_data, success_url?}
"""
body = await request.get_json(silent=True) or {}
price_key = body.get("price_key", "")
custom_data = body.get("custom_data", {})
success_url = body.get("success_url", f"{config.BASE_URL}/suppliers/dashboard?tab=boosts")
price_id = await get_price_id(price_key)
if not price_id:
return jsonify({"error": "Product not configured."}), 400
payload = _provider().build_checkout_payload(
price_id=price_id,
custom_data=custom_data,
success_url=success_url,
)
return jsonify(payload)
@bp.route("/manage", methods=["POST"])
@login_required
async def manage():
"""Redirect to Paddle customer portal."""
"""Redirect to payment provider's customer portal."""
sub = await get_subscription(g.user["id"])
if not sub or not sub.get("provider_subscription_id"):
t = get_translations(g.get("lang") or "en")
await flash(t["billing_no_subscription"], "error")
return redirect(url_for("dashboard.settings"))
paddle = _paddle_client()
paddle_sub = paddle.subscriptions.get(sub["provider_subscription_id"])
portal_url = paddle_sub.management_urls.update_payment_method
portal_url = _provider().get_management_url(sub["provider_subscription_id"])
return redirect(portal_url)
@bp.route("/cancel", methods=["POST"])
@login_required
async def cancel():
"""Cancel subscription via Paddle API."""
"""Cancel subscription via active payment provider."""
sub = await get_subscription(g.user["id"])
if sub and sub.get("provider_subscription_id"):
from paddle_billing.Resources.Subscriptions.Operations import CancelSubscription
paddle = _paddle_client()
paddle.subscriptions.cancel(
sub["provider_subscription_id"],
CancelSubscription(effective_from="next_billing_period"),
)
_provider().cancel_subscription(sub["provider_subscription_id"])
return redirect(url_for("dashboard.settings"))
class _WebhookRequest:
"""Minimal wrapper satisfying paddle_billing's Request Protocol."""
def __init__(self, body: bytes, headers):
self.body = body
self.headers = headers
_verifier = Verifier(maximum_variance=300)
# =============================================================================
# Paddle Webhook — always active (existing subscribers keep sending)
# =============================================================================
@bp.route("/webhook/paddle", methods=["POST"])
async def webhook():
"""Handle Paddle webhooks."""
async def webhook_paddle():
"""Handle Paddle webhooks — always active regardless of PAYMENT_PROVIDER toggle."""
from . import paddle as paddle_mod
payload = await request.get_data()
if config.PADDLE_WEBHOOK_SECRET:
try:
ok = _verifier.verify(
_WebhookRequest(payload, request.headers),
Secret(config.PADDLE_WEBHOOK_SECRET),
)
except (ConnectionRefusedError, ValueError):
ok = False
if not ok:
return jsonify({"error": "Invalid signature"}), 400
if not paddle_mod.verify_webhook(payload, request.headers):
return jsonify({"error": "Invalid signature"}), 400
try:
event = json.loads(payload)
ev = paddle_mod.parse_webhook(payload)
except (json.JSONDecodeError, ValueError):
return jsonify({"error": "Invalid JSON payload"}), 400
event_type = event.get("event_type")
data = event.get("data") or {}
custom_data = data.get("custom_data") or {}
user_id = custom_data.get("user_id")
plan = custom_data.get("plan", "")
# Store billing customer for any subscription event with a customer_id
customer_id = str(data.get("customer_id", ""))
await _handle_webhook_event(ev)
return jsonify({"received": True})
# =============================================================================
# Stripe Webhook — always active (once Stripe is configured)
# =============================================================================
@bp.route("/webhook/stripe", methods=["POST"])
async def webhook_stripe():
"""Handle Stripe webhooks — always active regardless of PAYMENT_PROVIDER toggle."""
if not config.STRIPE_WEBHOOK_SECRET:
return jsonify({"error": "Stripe not configured"}), 404
from . import stripe as stripe_mod
payload = await request.get_data()
if not stripe_mod.verify_webhook(payload, request.headers):
return jsonify({"error": "Invalid signature"}), 400
try:
ev = stripe_mod.parse_webhook(payload)
except (json.JSONDecodeError, ValueError):
return jsonify({"error": "Invalid payload"}), 400
await _handle_webhook_event(ev)
return jsonify({"received": True})
# =============================================================================
# Shared Webhook Event Handler (provider-agnostic)
# =============================================================================
async def _handle_webhook_event(ev: dict) -> None:
"""Process a normalized webhook event from any provider.
ev keys: event_type, subscription_id, customer_id, user_id, supplier_id,
plan, status, current_period_end, data, items, custom_data
"""
event_type = ev.get("event_type", "")
user_id = ev.get("user_id")
plan = ev.get("plan", "")
# Store billing customer
customer_id = ev.get("customer_id", "")
if customer_id and user_id:
await upsert_billing_customer(int(user_id), customer_id)
if event_type == "subscription.activated":
if plan.startswith("supplier_"):
await _handle_supplier_subscription_activated(data, custom_data)
await _handle_supplier_subscription_activated(ev)
elif user_id:
await upsert_subscription(
user_id=int(user_id),
plan=plan or "starter",
status="active",
provider_subscription_id=data.get("id", ""),
current_period_end=(data.get("current_billing_period") or {}).get("ends_at"),
provider_subscription_id=ev.get("subscription_id", ""),
current_period_end=ev.get("current_period_end"),
)
elif event_type == "subscription.updated":
await update_subscription_status(
data.get("id", ""),
status=data.get("status", "active"),
current_period_end=(data.get("current_billing_period") or {}).get("ends_at"),
ev.get("subscription_id", ""),
status=ev.get("status", "active"),
current_period_end=ev.get("current_period_end"),
)
elif event_type == "subscription.canceled":
await update_subscription_status(data.get("id", ""), status="cancelled")
await update_subscription_status(ev.get("subscription_id", ""), status="cancelled")
elif event_type == "subscription.past_due":
await update_subscription_status(data.get("id", ""), status="past_due")
await update_subscription_status(ev.get("subscription_id", ""), status="past_due")
elif event_type == "transaction.completed":
await _handle_transaction_completed(data, custom_data)
return jsonify({"received": True})
await _handle_transaction_completed(ev)
# =============================================================================
@@ -301,7 +347,13 @@ BOOST_PRICE_KEYS = {
async def _price_id_to_key(price_id: str) -> str | None:
"""Reverse-lookup a paddle_products key from a Paddle price ID."""
"""Reverse-lookup a product key from a provider price ID."""
row = await fetch_one(
"SELECT key FROM payment_products WHERE provider_price_id = ?", (price_id,)
)
if row:
return row["key"]
# Fallback to old table for pre-migration DBs
row = await fetch_one(
"SELECT key FROM paddle_products WHERE paddle_price_id = ?", (price_id,)
)
@@ -330,13 +382,13 @@ def _derive_tier_from_plan(plan: str) -> tuple[str, str]:
return base, tier
-async def _handle_supplier_subscription_activated(data: dict, custom_data: dict) -> None:
+async def _handle_supplier_subscription_activated(ev: dict) -> None:
"""Handle supplier plan subscription activation."""
from ..core import transaction as db_transaction
-supplier_id = custom_data.get("supplier_id")
-plan = custom_data.get("plan", "supplier_growth")
-user_id = custom_data.get("user_id")
+supplier_id = ev.get("supplier_id")
+plan = ev.get("plan", "supplier_growth")
+user_id = ev.get("user_id")
if not supplier_id:
return
@@ -365,7 +417,8 @@ async def _handle_supplier_subscription_activated(data: dict, custom_data: dict)
)
# Create boost records for items included in the subscription
-items = data.get("items", [])
+items = ev.get("items", [])
+data = ev.get("data", {})
for item in items:
price_id = item.get("price", {}).get("id", "")
key = await _price_id_to_key(price_id)
@@ -388,13 +441,15 @@ async def _handle_supplier_subscription_activated(data: dict, custom_data: dict)
)
-async def _handle_transaction_completed(data: dict, custom_data: dict) -> None:
+async def _handle_transaction_completed(ev: dict) -> None:
"""Handle one-time transaction completion (credit packs, sticky boosts, business plan)."""
-supplier_id = custom_data.get("supplier_id")
-user_id = custom_data.get("user_id")
+supplier_id = ev.get("supplier_id")
+user_id = ev.get("user_id")
+custom_data = ev.get("custom_data", {})
+data = ev.get("data", {})
now = utcnow_iso()
-items = data.get("items", [])
+items = ev.get("items", [])
for item in items:
price_id = item.get("price", {}).get("id", "")
key = await _price_id_to_key(price_id)

View File

@@ -0,0 +1,378 @@
"""
Stripe payment provider — checkout sessions, webhook handling, subscription management.
Exports the same interface as paddle.py so billing/routes.py can dispatch:
- build_checkout_payload()
- build_multi_item_checkout_payload()
- cancel_subscription()
- get_management_url()
- verify_webhook()
- parse_webhook()
Stripe Tax add-on handles EU VAT collection (must be enabled in Stripe Dashboard).
"""
import json
import logging
import stripe as stripe_sdk
from ..core import config
logger = logging.getLogger(__name__)
def _stripe_client():
"""Configure and return the stripe module with our API key."""
stripe_sdk.api_key = config.STRIPE_SECRET_KEY
stripe_sdk.max_network_retries = 2
return stripe_sdk
def build_checkout_payload(
price_id: str, custom_data: dict, success_url: str,
) -> dict:
"""Create a Stripe Checkout Session for a single item.
Returns {checkout_url: "https://checkout.stripe.com/..."} — the client
JS redirects the browser there (no overlay SDK needed).
"""
s = _stripe_client()
session = s.checkout.Session.create(
mode=_mode_for_price(s, price_id),
line_items=[{"price": price_id, "quantity": 1}],
metadata=custom_data,
success_url=success_url + "?session_id={CHECKOUT_SESSION_ID}",
cancel_url=success_url.rsplit("/success", 1)[0] + "/pricing",
automatic_tax={"enabled": True},
tax_id_collection={"enabled": True},
)
return {"checkout_url": session.url}
def build_multi_item_checkout_payload(
items: list[dict], custom_data: dict, success_url: str,
) -> dict:
"""Create a Stripe Checkout Session for multiple line items.
items: list of {"priceId": "price_xxx", "quantity": 1}
"""
s = _stripe_client()
line_items = [{"price": i["priceId"], "quantity": i.get("quantity", 1)} for i in items]
# Determine mode: if any item is recurring, use "subscription".
# Otherwise use "payment" for one-time purchases.
has_recurring = any(_is_recurring_price(s, i["priceId"]) for i in items)
mode = "subscription" if has_recurring else "payment"
session = s.checkout.Session.create(
mode=mode,
line_items=line_items,
metadata=custom_data,
success_url=success_url + "?session_id={CHECKOUT_SESSION_ID}",
cancel_url=success_url.rsplit("/success", 1)[0],
automatic_tax={"enabled": True},
tax_id_collection={"enabled": True},
)
return {"checkout_url": session.url}
def _mode_for_price(s, price_id: str) -> str:
"""Determine Checkout Session mode from price type."""
try:
price = s.Price.retrieve(price_id)
return "subscription" if price.type == "recurring" else "payment"
except Exception:
# Default to payment if we can't determine
return "payment"
def _is_recurring_price(s, price_id: str) -> bool:
"""Check if a Stripe price is recurring (subscription)."""
try:
price = s.Price.retrieve(price_id)
return price.type == "recurring"
except Exception:
return False
def cancel_subscription(provider_subscription_id: str) -> None:
"""Cancel a Stripe subscription at end of current billing period."""
s = _stripe_client()
s.Subscription.modify(
provider_subscription_id,
cancel_at_period_end=True,
)
def get_management_url(provider_subscription_id: str) -> str:
"""Create a Stripe Billing Portal session and return its URL."""
s = _stripe_client()
# Get customer_id from the subscription
sub = s.Subscription.retrieve(
provider_subscription_id,
)
portal = s.billing_portal.Session.create(
customer=sub.customer,
return_url=f"{config.BASE_URL}/billing/success",
)
return portal.url
def verify_webhook(payload: bytes, headers) -> bool:
"""Verify Stripe webhook signature using the Stripe-Signature header."""
if not config.STRIPE_WEBHOOK_SECRET:
return True
sig_header = headers.get("Stripe-Signature", "")
if not sig_header:
return False
try:
stripe_sdk.Webhook.construct_event(
payload, sig_header, config.STRIPE_WEBHOOK_SECRET,
)
return True
except (stripe_sdk.SignatureVerificationError, ValueError):
return False
def parse_webhook(payload: bytes) -> dict:
"""Parse a Stripe webhook payload into a normalized event dict.
Maps Stripe event types to the shared format used by _handle_webhook_event():
- checkout.session.completed (mode=subscription) → subscription.activated
- customer.subscription.created → subscription.activated
- customer.subscription.updated → subscription.updated
- customer.subscription.deleted → subscription.canceled
- invoice.payment_failed → subscription.past_due
- checkout.session.completed (mode=payment) → transaction.completed
"""
raw = json.loads(payload)
stripe_type = raw.get("type", "")
obj = raw.get("data", {}).get("object", {})
# Extract metadata — Stripe stores custom data in session/subscription metadata
metadata = obj.get("metadata") or {}
# Common fields
customer_id = obj.get("customer", "")
user_id = metadata.get("user_id")
supplier_id = metadata.get("supplier_id")
plan = metadata.get("plan", "")
# Map Stripe events to our shared event types
if stripe_type == "checkout.session.completed":
mode = obj.get("mode", "")
if mode == "subscription":
subscription_id = obj.get("subscription", "")
# Fetch subscription details for period end
period_end = None
if subscription_id:
try:
s = _stripe_client()
sub = s.Subscription.retrieve(
subscription_id,
)
# Stripe API 2026-02+ moved period_end to items
ts = sub.current_period_end
if not ts and sub.get("items", {}).get("data"):
ts = sub["items"]["data"][0].get("current_period_end")
period_end = _unix_to_iso(ts)
except Exception:
logger.warning("Failed to fetch subscription %s for period_end", subscription_id)
return {
"event_type": "subscription.activated",
"subscription_id": subscription_id,
"customer_id": str(customer_id),
"user_id": user_id,
"supplier_id": supplier_id,
"plan": plan,
"status": "active",
"current_period_end": period_end,
"data": obj,
"items": _extract_line_items(obj),
"custom_data": metadata,
}
else:
# One-time payment
return {
"event_type": "transaction.completed",
"subscription_id": "",
"customer_id": str(customer_id),
"user_id": user_id,
"supplier_id": supplier_id,
"plan": plan,
"status": "completed",
"current_period_end": None,
"data": obj,
"items": _extract_line_items(obj),
"custom_data": metadata,
}
elif stripe_type == "customer.subscription.created":
# New subscription — map to subscription.activated so the handler creates the DB row
status = _map_stripe_status(obj.get("status", ""))
return {
"event_type": "subscription.activated",
"subscription_id": obj.get("id", ""),
"customer_id": str(customer_id),
"user_id": user_id,
"supplier_id": supplier_id,
"plan": plan,
"status": status,
"current_period_end": _get_period_end(obj),
"data": obj,
"items": _extract_sub_items(obj),
"custom_data": metadata,
}
elif stripe_type == "customer.subscription.updated":
status = _map_stripe_status(obj.get("status", ""))
return {
"event_type": "subscription.updated",
"subscription_id": obj.get("id", ""),
"customer_id": str(customer_id),
"user_id": user_id,
"supplier_id": supplier_id,
"plan": plan,
"status": status,
"current_period_end": _get_period_end(obj),
"data": obj,
"items": _extract_sub_items(obj),
"custom_data": metadata,
}
elif stripe_type == "customer.subscription.deleted":
return {
"event_type": "subscription.canceled",
"subscription_id": obj.get("id", ""),
"customer_id": str(customer_id),
"user_id": user_id,
"supplier_id": supplier_id,
"plan": plan,
"status": "cancelled",
"current_period_end": _get_period_end(obj),
"data": obj,
"items": _extract_sub_items(obj),
"custom_data": metadata,
}
elif stripe_type == "invoice.payment_failed":
sub_id = obj.get("subscription", "")
return {
"event_type": "subscription.past_due",
"subscription_id": sub_id,
"customer_id": str(customer_id),
"user_id": user_id,
"supplier_id": supplier_id,
"plan": plan,
"status": "past_due",
"current_period_end": None,
"data": obj,
"items": [],
"custom_data": metadata,
}
# Unknown event — return a no-op
return {
"event_type": "",
"subscription_id": "",
"customer_id": str(customer_id),
"user_id": user_id,
"supplier_id": supplier_id,
"plan": plan,
"status": "",
"current_period_end": None,
"data": obj,
"items": [],
"custom_data": metadata,
}
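The event-type mapping that parse_webhook() performs can be isolated into a tiny pure function. This is an illustrative sketch only — `normalize_type` and `TYPE_MAP` are names invented here, not part of the module above — and it omits the API calls (subscription retrieval, line-item fetch) that the real function makes:

```python
import json

# Hypothetical, minimal sketch of parse_webhook()'s type mapping.
# checkout.session.completed splits on the session mode; everything
# else is a direct lookup, with "" as the unknown-event no-op.
TYPE_MAP = {
    "customer.subscription.created": "subscription.activated",
    "customer.subscription.updated": "subscription.updated",
    "customer.subscription.deleted": "subscription.canceled",
    "invoice.payment_failed": "subscription.past_due",
}

def normalize_type(payload: bytes) -> str:
    raw = json.loads(payload)
    stripe_type = raw.get("type", "")
    if stripe_type == "checkout.session.completed":
        mode = raw.get("data", {}).get("object", {}).get("mode", "")
        return "subscription.activated" if mode == "subscription" else "transaction.completed"
    return TYPE_MAP.get(stripe_type, "")

evt = {"type": "checkout.session.completed", "data": {"object": {"mode": "payment"}}}
print(normalize_type(json.dumps(evt).encode()))  # transaction.completed
```

Keeping this decision table separate from the payload-shaping code is what lets the Paddle and Stripe providers emit the same normalized dicts.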
# =============================================================================
# Helpers
# =============================================================================
def _map_stripe_status(stripe_status: str) -> str:
"""Map Stripe subscription status to our internal status."""
mapping = {
"active": "active",
"trialing": "on_trial",
"past_due": "past_due",
"canceled": "cancelled",
"unpaid": "past_due",
"incomplete": "past_due",
"incomplete_expired": "expired",
"paused": "paused",
}
return mapping.get(stripe_status, stripe_status)
def _unix_to_iso(ts) -> str | None:
"""Convert Unix timestamp to ISO string, or None."""
if not ts:
return None
from datetime import UTC, datetime
return datetime.fromtimestamp(int(ts), tz=UTC).strftime("%Y-%m-%dT%H:%M:%S.000000Z")
def _get_period_end(obj: dict) -> str | None:
"""Extract current_period_end from subscription or its first item.
Stripe API 2026-02+ moved period fields from subscription to subscription items.
"""
ts = obj.get("current_period_end")
if not ts:
items = obj.get("items", {}).get("data", [])
if items:
ts = items[0].get("current_period_end")
return _unix_to_iso(ts)
def _extract_line_items(session_obj: dict) -> list[dict]:
"""Extract line items from a Checkout Session in Paddle-compatible format.
Stripe doesn't embed line_items in checkout.session.completed webhooks,
so we fetch them via the API. Returns [{"price": {"id": "price_xxx"}}].
"""
session_id = session_obj.get("id", "")
if not session_id or not session_id.startswith("cs_"):
return []
try:
s = _stripe_client()
line_items = s.checkout.Session.list_line_items(session_id, limit=20)
return [
{"price": {"id": item["price"]["id"]}}
for item in line_items.get("data", [])
if item.get("price", {}).get("id")
]
except Exception:
logger.warning("Failed to fetch line_items for session %s", session_id)
# Fallback: check if line_items were embedded in the payload (e.g. tests)
embedded = session_obj.get("line_items", {}).get("data", [])
return [
{"price": {"id": item["price"]["id"]}}
for item in embedded
if item.get("price", {}).get("id")
]
def _extract_sub_items(sub_obj: dict) -> list[dict]:
"""Extract items from a Stripe Subscription object in Paddle-compatible format."""
items = sub_obj.get("items", {}).get("data", [])
return [{"price": {"id": item.get("price", {}).get("id", "")}} for item in items]

View File

@@ -31,6 +31,7 @@
}
</script>
{% endif %}
<link rel="stylesheet" href="{{ url_for('static', filename='vendor/leaflet/leaflet.min.css') }}">
{% endblock %}
{% block content %}
@@ -57,3 +58,108 @@
</article>
</main>
{% endblock %}
{% block scripts %}
<script>
(function() {
var countryMapEl = document.getElementById('country-map');
var cityMapEl = document.getElementById('city-map');
if (!countryMapEl && !cityMapEl) return;
var TILES = 'https://{s}.basemaps.cartocdn.com/light_all/{z}/{x}/{y}{r}.png';
var TILES_ATTR = '&copy; <a href="https://www.openstreetmap.org/copyright">OSM</a> &copy; <a href="https://carto.com/">CARTO</a>';
function scoreColor(score) {
if (score >= 60) return '#16A34A';
if (score >= 30) return '#D97706';
return '#DC2626';
}
function makeIcon(size, color) {
var s = Math.round(size);
return L.divIcon({
className: '',
html: '<div class="pn-marker" style="width:' + s + 'px;height:' + s + 'px;background:' + color + ';opacity:0.82;"></div>',
iconSize: [s, s],
iconAnchor: [s / 2, s / 2],
});
}
function initCountryMap(el) {
var slug = el.dataset.countrySlug;
var map = L.map(el, {scrollWheelZoom: false});
L.tileLayer(TILES, { attribution: TILES_ATTR, maxZoom: 18 }).addTo(map);
var lang = document.documentElement.lang || 'en';
fetch('/api/markets/' + slug + '/cities.json')
.then(function(r) { return r.json(); })
.then(function(data) {
if (!data.length) return;
var maxV = Math.max.apply(null, data.map(function(d) { return d.padel_venue_count || 1; }));
var bounds = [];
data.forEach(function(c) {
if (!c.lat || !c.lon) return;
var size = 10 + 36 * Math.sqrt((c.padel_venue_count || 1) / maxV);
var color = scoreColor(c.market_score);
var pop = c.population >= 1000000
? (c.population / 1000000).toFixed(1) + 'M'
: (c.population >= 1000 ? Math.round(c.population / 1000) + 'K' : (c.population || ''));
var tip = '<strong>' + c.city_name + '</strong><br>'
+ (c.padel_venue_count || 0) + ' venues'
+ (pop ? ' · ' + pop : '') + '<br>'
+ '<span style="color:' + color + ';font-weight:600;">Score ' + Math.round(c.market_score) + '/100</span>';
L.marker([c.lat, c.lon], { icon: makeIcon(size, color) })
.bindTooltip(tip, { className: 'map-tooltip', direction: 'top', offset: [0, -Math.round(size / 2)] })
.on('click', function() { window.location = '/' + lang + '/markets/' + slug + '/' + c.city_slug; })
.addTo(map);
bounds.push([c.lat, c.lon]);
});
if (bounds.length) map.fitBounds(bounds, { padding: [24, 24] });
});
}
var VENUE_ICON = L.divIcon({
className: '',
html: '<div class="pn-venue"></div>',
iconSize: [10, 10],
iconAnchor: [5, 5],
});
function initCityMap(el) {
var countrySlug = el.dataset.countrySlug;
var citySlug = el.dataset.citySlug;
var lat = parseFloat(el.dataset.lat);
var lon = parseFloat(el.dataset.lon);
var map = L.map(el, {scrollWheelZoom: false}).setView([lat, lon], 13);
L.tileLayer(TILES, { attribution: TILES_ATTR, maxZoom: 18 }).addTo(map);
fetch('/api/markets/' + countrySlug + '/' + citySlug + '/venues.json')
.then(function(r) { return r.json(); })
.then(function(data) {
data.forEach(function(v) {
if (!v.lat || !v.lon) return;
var indoor = v.indoor_court_count || 0;
var outdoor = v.outdoor_court_count || 0;
var total = v.court_count || (indoor + outdoor);
var courtLine = total
? total + ' court' + (total > 1 ? 's' : '')
+ (indoor || outdoor
? ' (' + [indoor ? indoor + ' indoor' : '', outdoor ? outdoor + ' outdoor' : ''].filter(Boolean).join(', ') + ')'
: '')
: '';
var tip = '<strong>' + v.name + '</strong>' + (courtLine ? '<br>' + courtLine : '');
L.marker([v.lat, v.lon], { icon: VENUE_ICON })
.bindTooltip(tip, { className: 'map-tooltip', direction: 'top', offset: [0, -7] })
.addTo(map);
});
});
}
var script = document.createElement('script');
script.src = '{{ url_for("static", filename="vendor/leaflet/leaflet.min.js") }}';
script.onload = function() {
if (countryMapEl) initCountryMap(countryMapEl);
if (cityMapEl) initCityMap(cityMapEl);
};
document.head.appendChild(script);
})();
</script>
{% endblock %}

View File

@@ -39,6 +39,8 @@ priority_column: population
</div>
</div>
<div id="city-map" data-country-slug="{{ country_slug }}" data-city-slug="{{ city_slug }}" data-lat="{{ lat }}" data-lon="{{ lon }}" style="height:300px; border-radius:12px; margin-bottom:1.5rem;"></div>
{{ city_name }} erreicht einen **<a href="/{{ language }}/market-score" style="text-decoration:none"><span style="font-family:'Bricolage Grotesque',sans-serif;font-weight:800;color:#0F172A;letter-spacing:-0.02em">padelnomics</span> Market Score</a> von {{ market_score | round(1) }}/100** — damit liegt die Stadt{% if market_score >= 55 %} unter den stärksten Padel-Märkten in {{ country_name_en }}{% elif market_score >= 35 %} im soliden Mittelfeld der Padel-Märkte in {{ country_name_en }}{% else %} in einem frühen Padel-Markt mit Wachstumspotenzial{% endif %}. Aktuell gibt es **{{ padel_venue_count }} Padelanlagen** für {% if population >= 1000000 %}{{ (population / 1000000) | round(1) }}M{% else %}{{ (population / 1000) | round(0) | int }}K{% endif %} Einwohner — das entspricht {{ venues_per_100k | round(1) }} Anlagen pro 100.000 Einwohner.{% if opportunity_score %} Der **<a href="/{{ language }}/market-score" style="text-decoration:none"><span style="font-family:'Bricolage Grotesque',sans-serif;font-weight:800;color:#0F172A;letter-spacing:-0.02em">padelnomics</span> Opportunity Score</a> von {{ opportunity_score | round(1) }}/100** bewertet das Investitionspotenzial — Versorgungslücken, Einzugsgebiet und Sportaffinität der Region:{% if opportunity_score >= 65 and market_score < 40 %} überschaubare Konkurrenz trifft auf starkes Standortpotenzial{% elif opportunity_score >= 65 %} hohes Potenzial trotz bereits aktivem Marktumfeld{% elif opportunity_score >= 40 %} solides Potenzial, der Markt beginnt sich zu verdichten{% else %} der Standort ist vergleichsweise gut versorgt, Differenzierung wird zum Schlüssel{% endif %}.{% endif %}
Die entscheidende Frage für Investoren: Was bringt ein Padel-Investment bei den aktuellen Preisen, Auslastungsraten und Baukosten tatsächlich? Das Finanzmodell unten rechnet mit echten Marktdaten aus {{ city_name }}.
@@ -179,6 +181,8 @@ Der **Market Score ({{ market_score | round(1) }}/100)** misst die *Marktreife*:
</div>
</div>
<div id="city-map" data-country-slug="{{ country_slug }}" data-city-slug="{{ city_slug }}" data-lat="{{ lat }}" data-lon="{{ lon }}" style="height:300px; border-radius:12px; margin-bottom:1.5rem;"></div>
{{ city_name }} has a **<a href="/{{ language }}/market-score" style="text-decoration:none"><span style="font-family:'Bricolage Grotesque',sans-serif;font-weight:800;color:#0F172A;letter-spacing:-0.02em">padelnomics</span> Market Score</a> of {{ market_score | round(1) }}/100** — placing it{% if market_score >= 55 %} among the strongest padel markets in {{ country_name_en }}{% elif market_score >= 35 %} in the mid-tier of {{ country_name_en }}'s padel markets{% else %} in an early-stage padel market with room for growth{% endif %}. The city currently has **{{ padel_venue_count }} padel venues** serving a population of {% if population >= 1000000 %}{{ (population / 1000000) | round(1) }}M{% else %}{{ (population / 1000) | round(0) | int }}K{% endif %} residents — a density of {{ venues_per_100k | round(1) }} venues per 100,000 people.{% if opportunity_score %} The **<a href="/{{ language }}/market-score" style="text-decoration:none"><span style="font-family:'Bricolage Grotesque',sans-serif;font-weight:800;color:#0F172A;letter-spacing:-0.02em">padelnomics</span> Opportunity Score</a> of {{ opportunity_score | round(1) }}/100** scores investment potential — supply gaps, catchment reach, and sports culture as a demand proxy:{% if opportunity_score >= 65 and market_score < 40 %} limited competition meets strong location fundamentals{% elif opportunity_score >= 65 %} strong potential despite an already active market{% elif opportunity_score >= 40 %} solid potential as the market starts to fill in{% else %} the area is comparatively well-served; differentiation is the key lever{% endif %}.{% endif %}
The question that matters: given current pricing, occupancy, and build costs, what does a padel investment in {{ city_name }} actually return? The financial model below works with real local market data.

View File

@@ -40,6 +40,8 @@ priority_column: total_venues
</div>
</div>
<div id="country-map" data-country-slug="{{ country_slug }}" style="height:360px; border-radius:12px; margin-bottom:1.5rem;"></div>
In {{ country_name_en }} erfassen wir aktuell **{{ total_venues }} Padelanlagen** in **{{ city_count }} Städten**. Der durchschnittliche <a href="/{{ language }}/market-score" style="text-decoration:none"><span style="font-family:'Bricolage Grotesque',sans-serif;font-weight:800;color:#0F172A;letter-spacing:-0.02em">padelnomics</span> Market Score</a> liegt bei **{{ avg_market_score }}/100**{% if avg_market_score >= 55 %} — ein starker Markt mit breiter Infrastruktur und belastbaren Preisdaten{% elif avg_market_score >= 35 %} — ein wachsender Markt mit guter Abdeckung{% else %} — ein aufstrebender Markt, in dem Früheinsteiger noch Premiumstandorte sichern können{% endif %}.
## Marktlandschaft
@@ -172,6 +174,8 @@ Der **Market Score (Ø {{ avg_market_score }}/100)** bewertet die Marktreife: Be
</div>
</div>
<div id="country-map" data-country-slug="{{ country_slug }}" style="height:360px; border-radius:12px; margin-bottom:1.5rem;"></div>
{{ country_name_en }} has **{{ total_venues }} padel venues** tracked across **{{ city_count }} cities**. The average <a href="/{{ language }}/market-score" style="text-decoration:none"><span style="font-family:'Bricolage Grotesque',sans-serif;font-weight:800;color:#0F172A;letter-spacing:-0.02em">padelnomics</span> Market Score</a> across tracked cities is **{{ avg_market_score }}/100**{% if avg_market_score >= 55 %} — a strong market with widespread venue penetration and solid pricing data{% elif avg_market_score >= 35 %} — a growing market with healthy city coverage{% else %} — an emerging market where early entrants can still capture prime locations{% endif %}.
## Market Landscape

View File

@@ -6,6 +6,7 @@
<meta name="description" content="{{ t.markets_page_description }}">
<meta property="og:title" content="{{ t.markets_page_og_title }} - {{ config.APP_NAME }}">
<meta property="og:description" content="{{ t.markets_page_og_description }}">
<link rel="stylesheet" href="{{ url_for('static', filename='vendor/leaflet/leaflet.min.css') }}">
{% endblock %}
{% block content %}
@@ -15,6 +16,8 @@
<p class="text-slate">{{ t.mkt_subheading }}</p>
</header>
<div id="markets-map" style="height:420px; border-radius:12px;" class="mb-6"></div>
<!-- Filters -->
<div class="card mb-8">
<div style="display: grid; grid-template-columns: 1fr auto auto; gap: 1rem; align-items: end;">
@@ -62,3 +65,52 @@
</div>
</main>
{% endblock %}
{% block scripts %}
<script src="{{ url_for('static', filename='vendor/leaflet/leaflet.min.js') }}"></script>
<script>
(function() {
var map = L.map('markets-map', {scrollWheelZoom: false}).setView([48.5, 10], 4);
L.tileLayer('https://{s}.basemaps.cartocdn.com/light_all/{z}/{x}/{y}{r}.png', {
attribution: '&copy; <a href="https://www.openstreetmap.org/copyright">OSM</a> &copy; <a href="https://carto.com/">CARTO</a>',
maxZoom: 18
}).addTo(map);
function scoreColor(score) {
if (score >= 60) return '#16A34A';
if (score >= 30) return '#D97706';
return '#DC2626';
}
function makeIcon(size, color) {
var s = Math.round(size);
return L.divIcon({
className: '',
html: '<div class="pn-marker" style="width:' + s + 'px;height:' + s + 'px;background:' + color + ';opacity:0.82;"></div>',
iconSize: [s, s],
iconAnchor: [s / 2, s / 2],
});
}
fetch('/api/markets/countries.json')
.then(function(r) { return r.json(); })
.then(function(data) {
if (!data.length) return;
var maxV = Math.max.apply(null, data.map(function(d) { return d.total_venues; }));
var lang = document.documentElement.lang || 'en';
data.forEach(function(c) {
if (!c.lat || !c.lon) return;
var size = 12 + 44 * Math.sqrt(c.total_venues / maxV);
var color = scoreColor(c.avg_market_score);
var tip = '<strong>' + c.country_name_en + '</strong><br>'
+ c.total_venues + ' venues · ' + c.city_count + ' cities<br>'
+ '<span style="color:' + color + ';font-weight:600;">Score ' + c.avg_market_score + '/100</span>';
L.marker([c.lat, c.lon], { icon: makeIcon(size, color) })
.bindTooltip(tip, { className: 'map-tooltip', direction: 'top', offset: [0, -Math.round(size / 2)] })
.on('click', function() { window.location = '/' + lang + '/markets/' + c.country_slug; })
.addTo(map);
});
});
})();
</script>
{% endblock %}

View File

@@ -49,13 +49,17 @@ class Config:
MAGIC_LINK_EXPIRY_MINUTES: int = int(os.getenv("MAGIC_LINK_EXPIRY_MINUTES", "15"))
SESSION_LIFETIME_DAYS: int = int(os.getenv("SESSION_LIFETIME_DAYS", "30"))
-PAYMENT_PROVIDER: str = "paddle"
+PAYMENT_PROVIDER: str = _env("PAYMENT_PROVIDER", "paddle").lower()
PADDLE_API_KEY: str = os.getenv("PADDLE_API_KEY", "")
PADDLE_CLIENT_TOKEN: str = os.getenv("PADDLE_CLIENT_TOKEN", "")
PADDLE_WEBHOOK_SECRET: str = os.getenv("PADDLE_WEBHOOK_SECRET", "")
PADDLE_ENVIRONMENT: str = _env("PADDLE_ENVIRONMENT", "sandbox")
STRIPE_SECRET_KEY: str = os.getenv("STRIPE_SECRET_KEY", "") or os.getenv("STRIPE_API_PRIVATE_KEY", "")
STRIPE_PUBLISHABLE_KEY: str = os.getenv("STRIPE_PUBLISHABLE_KEY", "") or os.getenv("STRIPE_API_PUBLIC_KEY", "")
STRIPE_WEBHOOK_SECRET: str = os.getenv("STRIPE_WEBHOOK_SECRET", "")
UMAMI_API_URL: str = os.getenv("UMAMI_API_URL", "https://umami.padelnomics.io")
UMAMI_API_TOKEN: str = os.getenv("UMAMI_API_TOKEN", "")
UMAMI_WEBSITE_ID: str = "4474414b-58d6-4c6e-89a1-df5ea1f49d70"
@@ -192,6 +196,15 @@ async def fetch_all(sql: str, params: tuple = ()) -> list[dict]:
return [dict(row) for row in rows]
async def count_where(table_where: str, params: tuple = ()) -> int:
"""Count rows matching a condition. Compresses the fetch_one + null-check pattern.
Usage: await count_where("users WHERE deleted_at IS NULL")
"""
row = await fetch_one(f"SELECT COUNT(*) AS n FROM {table_where}", params)
return row["n"] if row else 0
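The pattern count_where compresses can be shown with a synchronous sqlite3 sketch (the real helper is async and goes through fetch_one; this standalone version is for illustration only):

```python
import sqlite3

def count_where(conn: sqlite3.Connection, table_where: str, params: tuple = ()) -> int:
    # Same idea as the async helper above: fold the
    # "SELECT COUNT(*) + null-check" boilerplate into one call.
    # table_where is trusted code-side SQL, never user input.
    row = conn.execute(f"SELECT COUNT(*) AS n FROM {table_where}", params).fetchone()
    return row[0] if row else 0

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER, deleted_at TEXT)")
conn.executemany("INSERT INTO users VALUES (?, ?)",
                 [(1, None), (2, "2026-01-01"), (3, None)])
print(count_where(conn, "users WHERE deleted_at IS NULL"))  # 2
```

Because the table-plus-WHERE fragment is interpolated into the SQL, callers must pass only literal fragments; bind values still go through `params`.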
async def execute(sql: str, params: tuple = ()) -> int:
"""Execute SQL and return lastrowid."""
db = await get_db()
@@ -713,16 +726,39 @@ async def purge_deleted(table: str, days: int = 30) -> int:
# =============================================================================
async def get_price_id(key: str, provider: str = None) -> str | None:
"""Look up a provider price ID by product key from the payment_products table."""
provider = provider or config.PAYMENT_PROVIDER
row = await fetch_one(
"SELECT provider_price_id FROM payment_products WHERE provider = ? AND key = ?",
(provider, key),
)
return row["provider_price_id"] if row else None
async def get_all_price_ids(provider: str = None) -> dict[str, str]:
"""Load all price IDs for a provider as a {key: price_id} dict."""
provider = provider or config.PAYMENT_PROVIDER
rows = await fetch_all(
"SELECT key, provider_price_id FROM payment_products WHERE provider = ?",
(provider,),
)
return {r["key"]: r["provider_price_id"] for r in rows}
async def get_paddle_price(key: str) -> str | None:
-"""Look up a Paddle price ID by product key from the paddle_products table."""
+"""Deprecated: use get_price_id(). Falls back to paddle_products for pre-migration DBs."""
result = await get_price_id(key, provider="paddle")
if result:
return result
# Fallback to old table if payment_products not yet populated
row = await fetch_one("SELECT paddle_price_id FROM paddle_products WHERE key = ?", (key,))
return row["paddle_price_id"] if row else None
async def get_all_paddle_prices() -> dict[str, str]:
-"""Load all Paddle price IDs as a {key: price_id} dict."""
-rows = await fetch_all("SELECT key, paddle_price_id FROM paddle_products")
-return {r["key"]: r["paddle_price_id"] for r in rows}
+"""Deprecated: use get_all_price_ids()."""
+return await get_all_price_ids(provider="paddle")
# =============================================================================
@@ -731,9 +767,14 @@ async def get_all_paddle_prices() -> dict[str, str]:
def slugify(text: str, max_length_chars: int = 80) -> str:
-"""Convert text to URL-safe slug."""
+"""Convert text to URL-safe slug.
Pre-replaces ß→ss before NFKD normalization so output matches the SQL
@slugify macro (which uses DuckDB STRIP_ACCENTS + REPLACE).
"""
text = text.lower().replace("ß", "ss")
text = unicodedata.normalize("NFKD", text).encode("ascii", "ignore").decode()
-text = re.sub(r"[^\w\s-]", "", text.lower())
+text = re.sub(r"[^\w\s-]", "", text)
text = re.sub(r"[-\s]+", "-", text).strip("-")
return text[:max_length_chars]
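The ß pre-replacement matters because NFKD plus ASCII-ignore would otherwise drop ß entirely ("gie?en" would become "gieen", not "giessen"). A self-contained copy of the function above, demonstrating the behavior:

```python
import re
import unicodedata

def slugify(text: str, max_length_chars: int = 80) -> str:
    # ß→ss first: NFKD does not decompose ß, so ascii/"ignore"
    # would silently delete it instead of transliterating.
    text = text.lower().replace("ß", "ss")
    text = unicodedata.normalize("NFKD", text).encode("ascii", "ignore").decode()
    text = re.sub(r"[^\w\s-]", "", text)
    text = re.sub(r"[-\s]+", "-", text).strip("-")
    return text[:max_length_chars]

print(slugify("Gießen"))       # giessen
print(slugify("Köln / Bonn"))  # koln-bonn
```

Matching the SQL `@slugify` macro byte-for-byte is the point: both sides must produce identical slugs or lookups between Python and DuckDB-generated rows will miss.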

View File

@@ -6,7 +6,7 @@ from pathlib import Path
from quart import Blueprint, flash, g, redirect, render_template, request, url_for
from ..auth.routes import login_required, update_user
-from ..core import csrf_protect, fetch_one, soft_delete, utcnow_iso
+from ..core import count_where, csrf_protect, fetch_one, soft_delete, utcnow_iso
from ..i18n import get_translations
bp = Blueprint(
@@ -18,17 +18,13 @@ bp = Blueprint(
async def get_user_stats(user_id: int) -> dict:
-scenarios = await fetch_one(
-"SELECT COUNT(*) as count FROM scenarios WHERE user_id = ? AND deleted_at IS NULL",
-(user_id,),
-)
-leads = await fetch_one(
-"SELECT COUNT(*) as count FROM lead_requests WHERE user_id = ?",
-(user_id,),
-)
return {
-"scenarios": scenarios["count"] if scenarios else 0,
-"leads": leads["count"] if leads else 0,
+"scenarios": await count_where(
+"scenarios WHERE user_id = ? AND deleted_at IS NULL", (user_id,)
+),
+"leads": await count_where(
+"lead_requests WHERE user_id = ?", (user_id,)
+),
}

View File

@@ -6,7 +6,7 @@ from pathlib import Path
from quart import Blueprint, g, make_response, redirect, render_template, request, url_for
-from ..core import csrf_protect, execute, fetch_all, fetch_one, utcnow_iso
+from ..core import count_where, csrf_protect, execute, fetch_all, fetch_one, utcnow_iso
from ..i18n import COUNTRY_LABELS, get_translations
bp = Blueprint(
@@ -79,11 +79,7 @@ async def _build_directory_query(q, country, category, region, page, per_page=24
where = " AND ".join(wheres) if wheres else "1=1"
-count_row = await fetch_one(
-f"SELECT COUNT(*) as cnt FROM suppliers s WHERE {where}",
-tuple(params),
-)
-total = count_row["cnt"] if count_row else 0
+total = await count_where(f"suppliers s WHERE {where}", tuple(params))
offset = (page - 1) * per_page
# Tier-based ordering: sticky first, then pro > growth > free, then name
@@ -159,16 +155,16 @@ async def index():
"SELECT category, COUNT(*) as cnt FROM suppliers GROUP BY category ORDER BY cnt DESC"
)
-total_suppliers = await fetch_one("SELECT COUNT(*) as cnt FROM suppliers")
-total_countries = await fetch_one("SELECT COUNT(DISTINCT country_code) as cnt FROM suppliers")
+total_suppliers = await count_where("suppliers")
+total_countries = await count_where("(SELECT DISTINCT country_code FROM suppliers)")
return await render_template(
"directory.html",
**ctx,
country_counts=country_counts,
category_counts=category_counts,
-total_suppliers=total_suppliers["cnt"] if total_suppliers else 0,
-total_countries=total_countries["cnt"] if total_countries else 0,
+total_suppliers=total_suppliers,
+total_countries=total_countries,
)
@@ -204,11 +200,9 @@ async def supplier_detail(slug: str):
# Enquiry count (Basic+)
enquiry_count = 0
if supplier.get("tier") in ("basic", "growth", "pro"):
-row = await fetch_one(
-"SELECT COUNT(*) as cnt FROM supplier_enquiries WHERE supplier_id = ?",
-(supplier["id"],),
+enquiry_count = await count_where(
+"supplier_enquiries WHERE supplier_id = ?", (supplier["id"],)
)
-enquiry_count = row["cnt"] if row else 0
lang = g.get("lang", "en")
cat_labels, country_labels, region_labels = get_directory_labels(lang)

View File

@@ -8,7 +8,8 @@
<p class="q-step-sub">{{ t.q4_subheading }}</p>
<div class="q-field-group">
<span class="q-label">{{ t.q4_phase_label }}</span>
<span class="q-label">{{ t.q4_phase_label }} <span class="required">*</span></span>
{% if 'location_status' in errors %}<p class="q-error-hint">{{ t.q4_error_phase }}</p>{% endif %}
<div class="q-pills">
{% for val, label in [('still_searching', t.q4_phase_searching), ('location_found', t.q4_phase_found), ('converting_existing', t.q4_phase_converting), ('lease_signed', t.q4_phase_lease_signed), ('permit_not_filed', t.q4_phase_permit_not_filed), ('permit_pending', t.q4_phase_permit_pending), ('permit_granted', t.q4_phase_permit_granted)] %}
<label><input type="radio" name="location_status" value="{{ val }}" {{ 'checked' if data.get('location_status') == val }}><span class="q-pill">{{ label }}</span></label>

View File

@@ -89,17 +89,17 @@
"flash_verify_invalid": "Ungültiger Verifizierungslink.",
"flash_verify_expired": "Dieser Link ist abgelaufen oder wurde bereits verwendet. Bitte stelle eine neue Anfrage.",
"flash_verify_invalid_lead": "Dieses Angebot wurde bereits verifiziert oder existiert nicht.",
"landing_hero_badge": "Padel-Finanzrechner & Businessplan-Tool",
"landing_hero_h1_1": "Plan Dein Padel-",
"landing_hero_h1_2": "Business in Minuten,",
"landing_hero_h1_3": "nicht Monaten",
"landing_hero_btn_primary": "Jetzt Dein Padel-Business planen →",
"landing_hero_btn_secondary": "Anbieter durchsuchen",
"landing_hero_bullet_1": "Keine Registrierung erforderlich",
"landing_hero_bullet_2": "60+ Variablen",
"landing_hero_bullet_3": "Unbegrenzte Szenarien",
"landing_roi_title": "Schnelle Renditeschätzung",
"landing_roi_subtitle": "Schieberegler bewegen und Projektion in Echtzeit sehen",
"landing_hero_badge": "Das Padel-Gründer-Toolkit — kostenlos",
"landing_hero_h1_1": "Investier in Padel",
"landing_hero_h1_2": "mit Sicherheit,",
"landing_hero_h1_3": "nicht Bauchgefühl",
"landing_hero_btn_primary": "Kostenlosen Businessplan starten →",
"landing_hero_btn_secondary": "Anbieter-Angebote einholen",
"landing_hero_bullet_1": "Kostenlos — ohne Registrierung, ohne Kreditkarte",
"landing_hero_bullet_2": "Bankfertige Kennzahlen (IRR, DSCR, MOIC)",
"landing_hero_bullet_3": "Basiert auf echten Marktdaten",
"landing_roi_title": "Ist Deine Padel-Idee rentabel?",
"landing_roi_subtitle": "Finde es in 30 Sekunden heraus",
"landing_roi_courts": "Plätze",
"landing_roi_rate": "Durchschn. Stundensatz",
"landing_roi_util": "Ziel-Auslastung",
@@ -108,7 +108,7 @@
"landing_roi_payback": "Amortisationszeit",
"landing_roi_annual_roi": "Jährlicher ROI",
"landing_roi_note": "Annahmen: Indoorhalle Mietmodell, 8 €/m² Miete, Personalkosten, 5 % Zinsen, 10-jähriges Darlehen. Amortisation und ROI basieren auf der Gesamtinvestition.",
"landing_roi_cta": "Jetzt Dein Padel-Business planen →",
"landing_roi_cta": "Vollständigen Businessplan erstellen — kostenlos →",
"landing_journey_title": "Deine Reise",
"landing_journey_01": "Analysieren",
"landing_journey_01_badge": "Demnächst",
@@ -118,27 +118,27 @@
"landing_journey_04": "Bauen",
"landing_journey_05": "Wachsen",
"landing_journey_05_badge": "Demnächst",
"landing_features_title": "Für ernsthafte Padel-Unternehmer gebaut",
"landing_feature_1_h3": "60+ Variablen",
"landing_feature_2_h3": "6 Analyse-Tabs",
"landing_feature_3_h3": "Indoor & Outdoor",
"landing_feature_4_h3": "Sensitivitätsanalyse",
"landing_feature_5_h3": "Professionelle Kennzahlen",
"landing_feature_6_h3": "Speichern & Vergleichen",
"landing_supplier_title": "Die richtigen Anbieter für Dein Projekt finden",
"landing_supplier_step_1_title": "Padel-Platz planen",
"landing_supplier_step_2_title": "Angebote einholen",
"landing_supplier_step_3_title": "Vergleichen & Bauen",
"landing_supplier_browse_btn": "Anbieterverzeichnis durchsuchen",
"landing_features_title": "Alles, was Du für eine fundierte Entscheidung brauchst",
"landing_feature_1_h3": "Kenne Deine Zahlen in- und auswendig",
"landing_feature_2_h3": "Bankfertig ab Tag eins",
"landing_feature_3_h3": "Jeder Anlagentyp, jeder Markt",
"landing_feature_4_h3": "Stresstest vor dem Commitment",
"landing_feature_5_h3": "Ersetzt den 5.000-€-Berater",
"landing_feature_6_h3": "Szenarien direkt vergleichen",
"landing_supplier_title": "Bereit zum Bauen? Lass Dich mit verifizierten Anbietern verbinden",
"landing_supplier_step_1_title": "Projekt teilen",
"landing_supplier_step_2_title": "Passende Anbieter finden",
"landing_supplier_step_3_title": "Angebote vergleichen",
"landing_supplier_browse_btn": "Angebote einholen — kostenlos & unverbindlich",
"landing_faq_title": "Häufig gestellte Fragen",
"landing_faq_q1": "Was berechnet der Planer?",
"landing_faq_q2": "Muss ich mich registrieren?",
"landing_faq_q3": "Wie funktioniert die Anbieter-Vermittlung?",
"landing_faq_q4": "Ist das Anbieterverzeichnis kostenlos?",
"landing_faq_q5": "Wie genau sind die Finanzprojektionen?",
"landing_faq_q1": "Wie viel kostet es, eine Padel-Anlage zu eröffnen?",
"landing_faq_q2": "Akzeptiert die Bank einen Padelnomics-Businessplan?",
"landing_faq_q3": "Wie genau sind die Finanzprojektionen?",
"landing_faq_q4": "Auf welchen Daten basieren die Markt-Benchmarks?",
"landing_faq_q5": "Muss ich etwas bezahlen?",
"landing_seo_title": "Padel-Platz-Investitionsplanung",
"landing_final_cta_h2": "Jetzt mit der Planung loslegen",
"landing_final_cta_btn": "Jetzt Dein Padel-Business planen →",
"landing_final_cta_h2": "Dein Banktermin kommt. Sei vorbereitet.",
"landing_final_cta_btn": "Kostenlosen Businessplan starten →",
"features_h1": "Alles, was Du für Dein Padel-Business brauchst",
"features_subtitle": "Professionelles Finanzmodell — vollständig kostenlos.",
"features_card_1_h2": "60+ Variablen",
@@ -428,6 +428,7 @@
"q4_phase_permit_not_filed": "Baugenehmigung noch nicht beantragt",
"q4_phase_permit_pending": "Baugenehmigung in Bearbeitung",
"q4_phase_permit_granted": "Baugenehmigung erteilt",
"q4_error_phase": "Bitte wähle Deine Projektphase aus.",
"q5_heading": "Zeitplan",
"q5_subheading": "Wann möchtest Du beginnen?",
"q5_timeline_label": "Zeitplan",
@@ -891,7 +892,7 @@
"sup_meta_desc": "Kostenloser Verzeichniseintrag auf Padelnomics. Qualifizierte Leads von Interessenten mit fertigem Businessplan. Growth- und Pro-Pläne ab €199/Monat.",
"sup_hero_h1a": "Kein Kaltakquise mehr.",
"sup_hero_h1b": "Triff Käufer, die bereits einen Businessplan haben.",
"sup_hero_sub": "Jeder Lead auf Padelnomics hat CAPEX, Umsatz und ROI bereits modelliert bevor er dich kontaktiert. Keine Zeitverschwender. Kein „ich schau mich nur um.“",
"sup_hero_sub": "Jeder Lead hat bereits ein Finanzmodell für sein Projekt erstellt. Du bekommst Budget, Zeitplan und Spezifikationen — noch vor dem Erstkontakt.",
"sup_hero_cta": "Kostenlos starten",
"sup_hero_trust_pre": "Vertrauen von Anbietern in",
"sup_hero_trust_post": "Ländern",
@@ -955,7 +956,7 @@
"sup_basic_f4": "Website & Kontaktdaten",
"sup_basic_f5": "Checkliste der angebotenen Leistungen",
"sup_basic_f6": "Kontaktformular auf der Listing-Seite",
"sup_basic_cta": "Unternehmen kostenlos eintragen",
"sup_basic_cta": "Kostenlos eintragen",
"sup_growth_name": "Growth",
"sup_growth_popular": "Beliebtester Plan",
"sup_growth_credits": "30 Credits/Monat inklusive",
@@ -965,7 +966,7 @@
"sup_growth_f4": "Priorität gegenüber kostenlosen Einträgen",
"sup_growth_f5": "30 Lead-Credits pro Monat",
"sup_growth_f6": "Zusätzliche Credit-Pakete kaufen",
"sup_growth_cta": "Jetzt starten",
"sup_growth_cta": "Leads erhalten",
"sup_pro_name": "Pro",
"sup_pro_credits": "100 Credits/Monat inklusive",
"sup_pro_f1": "Alles aus Growth",
@@ -974,7 +975,7 @@
"sup_pro_f4": "Featured Card-Rahmen & Glow",
"sup_pro_f5": "Bevorzugte Platzierung im Verzeichnis",
"sup_pro_f6": "100 Lead-Credits pro Monat",
"sup_pro_cta": "Jetzt starten",
"sup_pro_cta": "Pipeline maximieren",
"sup_yearly_note_basic": "Dauerhaft kostenlos",
"sup_yearly_note_growth": "€1.799 jährlich",
"sup_yearly_note_pro": "€4.499 jährlich",
@@ -1012,14 +1013,14 @@
"sup_cmp_t4": "Nie",
"sup_cmp_m1": "Nach Kategorie gefiltert",
"sup_cmp_footnote": "*Google-Ads-Schätzung basierend auf €2080 CPC für Padel-Baukeywords bei 510 Klicks/Tag.",
"sup_proof_h2": "Vertrauen von führenden Unternehmen der Padel-Branche",
"sup_proof_h2": "Das bekommst du mit jedem Lead",
"sup_proof_stat1": "erstellte Businesspläne",
"sup_proof_stat2": "Anbieter",
"sup_proof_stat3": "Länder",
"sup_proof_q1": "Padelnomics schickt uns Leads, die bereits ernsthaft an einem Bau interessiert sind. Die Projektbriefings sind detaillierter als das, was wir von Messen erhalten.",
"sup_proof_cite1": "— Europäischer Padel-Court-Hersteller",
"sup_proof_q2": "Endlich eine Plattform, die den Padel-Baumarkt versteht. Wir kennen das Budget, den Zeitplan und den Standorttyp, bevor wir überhaupt Erstkontakt aufnehmen.",
"sup_proof_cite2": "— Padel-Court-Installationsunternehmen, Skandinavien",
"sup_proof_point1_h3": "Komplettes Projektbriefing",
"sup_proof_point1_p": "Anlagentyp, Court-Anzahl, Glas-/Lichtspezifikationen, Budget, Zeitplan, Finanzierungsstatus und vollständige Kontaktdaten — bevor du überhaupt Erstkontakt aufnimmst.",
"sup_proof_point2_h3": "Finanzmodell inklusive",
"sup_proof_point2_p": "Jeder Lead hat bereits CAPEX, Umsatzprognosen und ROI durchgerechnet. Du sprichst mit jemandem, der seine Zahlen kennt.",
"sup_faq_h2": "Anbieter-FAQ",
"sup_faq_q1": "Wie werde ich gelistet?",
"sup_faq_a1_pre": "Finde dein Unternehmen in unserem",
@@ -1172,34 +1173,67 @@
"features_opex_body": "Peak- und Off-Peak-Preise mit konfigurierbaren Stundenaufteilungen. Monatliche Anlaufkurven für die Auslastung. Personalkosten, Wartung, Versicherung, Marketing und Betriebskosten — alle mit Schiebereglern anpassbar. Einnahmen aus Platzvermietung, Coaching, Ausrüstung und F&B.",
"features_cf_body": "Monatliche Cashflow-Projektionen über 10 Jahre. Eigen-/Fremdkapitalaufteilung, Zinssätze und Kreditlaufzeiten modellieren. Schuldendienstdeckungsgrade und freien Cashflow Monat für Monat einsehen. Wasserfalldiagramme zeigen genau, wohin dein Geld fließt.",
"features_returns_body": "Eigenkapital-IRR und MOIC unter verschiedenen Exit-Szenarien berechnen. Cap-Rate-Exits mit konfigurierbaren Haltedauern modellieren. Die Eigenkapitalentwicklung vom Ersteinsatz bis zum Exit-Erlös nachvollziehen.",
"landing_page_title": "Padelnomics - Padel-Kostenrechner & Finanzplaner",
"landing_meta_desc": "Modelliere deine Padelplatz-Investition mit 60+ Variablen, Sensitivitätsanalyse und professionellen Projektionen. Innen-/Außenanlage, Miet- oder Eigentumsmodell.",
"landing_og_desc": "Der professionellste Padel-Finanzplaner. 60+ Variablen, 6 Analyse-Tabs, Diagramme, Sensitivitätsanalyse und Anbieter-Vermittlung.",
"landing_hero_desc": "Modelliere Deine Padelplatz-Investition mit 60+ Variablen, Sensitivitätsanalyse und professionellen Projektionen. Danach wirst Du mit verifizierten Anbietern zusammengebracht.",
"landing_page_title": "Padelnomics Padel-Businessplan & Renditerechner | Kostenlos",
"landing_meta_desc": "Plane Deine Padel-Investition mit echten Marktdaten. Bankfertiges Finanzmodell mit IRR, DSCR, Sensitivitätsanalyse. Kostenlos — ohne Registrierung.",
"landing_og_desc": "Plane Deine Padel-Investition mit Sicherheit. Bankfertiges Finanzmodell, echte Marktdaten und verifizierte Anbieter-Vermittlung. Kostenlos — ohne Registrierung.",
"landing_hero_desc": "Du stehst vor einer Investition von über 200.000 €. Padelnomics gibt Dir das Finanzmodell, die Marktdaten und die Anbieter-Kontakte, um diese Entscheidung mit offenen Augen zu treffen.",
"landing_journey_01_desc": "Marktbedarfsanalyse, Standortbewertung und Identifikation von Nachfragepotenzialen.",
"landing_journey_02_desc": "Modelliere deine Investition mit 60+ Variablen, Diagrammen und Sensitivitätsanalyse.",
"landing_journey_03_desc": "Kontakte zu Banken und Investoren herstellen. Dein Finanzplan wird zum Businesscase.",
"landing_journey_04_desc": "{total_suppliers}+ Platz-Anbieter aus {total_countries} Ländern durchsuchen. Passend zu Deinen Anforderungen vermittelt.",
"landing_journey_05_desc": "Launch-Playbook, Performance-Benchmarks und Wachstumsanalysen für deinen Betrieb.",
"landing_feature_1_body": "Jede Annahme ist anpassbar: Platzbaukosten, Miete, Preisgestaltung, Auslastung, Finanzierungskonditionen, Exit-Szenarien. Nichts ist fest vorgegeben.",
"landing_feature_2_body": "Annahmen, Investition (CAPEX), Betriebsmodell, Cashflow, Renditen & Exit sowie Kennzahlen — jeder Tab mit interaktiven Diagrammen.",
"landing_feature_3_body": "Indoorhallenmodelle (Miete oder Neubau) und Außenanlagen mit Saisonalität. Szenarien direkt nebeneinander vergleichen.",
"landing_feature_4_body": "Sieh dir an, wie sich deine Renditen bei unterschiedlichen Auslastungsraten und Preisen verändern. Break-even-Punkt sofort ermitteln.",
"landing_feature_5_body": "IRR, MOIC, DSCR, Cash-on-Cash-Rendite, Break-even-Auslastung, RevPAH, Schuldenrendite — die Kennzahlen, die Banken und Investoren sehen möchten.",
"landing_feature_6_body": "Unbegrenzte Szenarien speichern. Verschiedene Standorte, Platzzahlen und Finanzierungsstrukturen testen. Den optimalen Plan finden.",
"landing_supplier_sub": "{total_suppliers}+ verifizierte Anbieter aus {total_countries} Ländern. Hersteller, Baufirmen, Belaghersteller, Beleuchtung und mehr.",
"landing_supplier_step_1_body": "Nutze den Finanzplaner, um deine Platzzahl, dein Budget und deinen Zeitplan zu modellieren.",
"landing_supplier_step_2_body": "Angebote anfordern — wir vermitteln dich anhand deiner Projektspezifikationen an passende Anbieter.",
"landing_supplier_step_3_body": "Angebote von vermittelten Anbietern erhalten. Keine Kaltakquise erforderlich.",
"landing_faq_a1": "Der Planer erstellt ein vollständiges Finanzmodell: CAPEX-Aufschlüsselung, monatliche Betriebskosten, Cashflow-Projektionen, Schuldendienst, IRR, MOIC, DSCR, Amortisationszeit, Break-even-Auslastung und Sensitivitätsanalyse. Es werden Indoor-/Outdoor-Anlagen, Miet- und Eigentumsmodelle sowie alle wesentlichen Kosten- und Erlösvariablen abgedeckt.",
"landing_faq_a2": "Nein. Der Planer funktioniert sofort ohne Registrierung. Erstelle ein Konto, um Szenarien zu speichern, Konfigurationen zu vergleichen und PDF-Berichte zu exportieren.",
"landing_faq_a3": "Wenn du über den Planer Angebote anforderst, teilen wir deine Projektdetails (Anlagentyp, Platzzahl, Glas, Beleuchtung, Land, Budget, Zeitplan) mit passenden Anbietern aus unserem Verzeichnis. Diese kontaktieren dich direkt mit ihren Angeboten.",
"landing_faq_a4": "Das Durchsuchen des Verzeichnisses ist für alle kostenlos. Anbieter erhalten standardmäßig einen Basiseintrag. Kostenpflichtige Pläne (Basic ab 39 €/Monat, Growth ab 199 €/Monat, Pro ab 499 €/Monat) schalten Anfrageformulare, vollständige Beschreibungen, Logos, verifizierte Badges und Prioritätsplatzierung frei.",
"landing_faq_a5": "Das Modell verwendet reale Standardwerte auf Basis globaler Marktdaten. Jede Annahme ist anpassbar, sodass du deine lokalen Gegebenheiten abbilden kannst. Die Sensitivitätsanalyse zeigt, wie sich die Ergebnisse in verschiedenen Szenarien verändern, und hilft dir, die Bandbreite möglicher Ergebnisse zu verstehen.",
"landing_feature_1_body": "Jede Kosten-, Erlös- und Finanzierungsannahme ist anpassbar. Nichts ist versteckt, nichts ist fest vorgegeben.",
"landing_feature_2_body": "IRR, MOIC, DSCR, Cash-on-Cash-Rendite, Break-even-Analyse — genau die Kennzahlen, die Banken und Investoren verlangen.",
"landing_feature_3_body": "Indoorhallen, Außenplätze, Miet- oder Eigentumsmodell — mit Saisonalität und regionalen Kostenanpassungen.",
"landing_feature_4_body": "Sieh, wie sich Deine Rendite verändert, wenn die Auslastung um 10 % sinkt oder die Zinsen steigen. Break-even-Punkt sofort ermitteln.",
"landing_feature_5_body": "Erhalte dasselbe Finanzmodell, das ein Berater für 5.00010.000 € berechnen würde. Jederzeit selbst aktualisierbar.",
"landing_feature_6_body": "Verschiedene Standorte, Platzzahlen und Finanzierungsstrukturen testen. Den Plan finden, der funktioniert.",
"landing_supplier_sub": "Jede Angebotsanfrage enthält Dein vollständiges Finanzmodell — Budget, Platzzahl, Zeitplan und Finanzierungsstatus. {total_suppliers}+ Anbieter aus {total_countries} Ländern.",
"landing_supplier_step_1_body": "Fülle in 2 Minuten einen Projektbrief aus. Deine Planer-Daten werden automatisch übernommen.",
"landing_supplier_step_2_body": "Wir benachrichtigen Anbieter, die zu Deinen Anforderungen, Deinem Standort und Budget passen. Keine Kaltakquise nötig.",
"landing_supplier_step_3_body": "Erhalte Angebote von passenden Anbietern. Jedes Angebot basiert auf Deinen tatsächlichen Projektdaten — keine Standardkalkulationen.",
"landing_faq_a1": "Das hängt vom Format ab. Eine typische Indoorhalle mit 68 Plätzen in einem Mietobjekt kostet 250.000500.000 €. Ein Neubau liegt bei 13 Mio. €. Outdoor-Plätze starten bei rund 150.000 € für 4 Courts. Mit Padelnomics modellierst Du Dein genaues Szenario — jede Variable ist anpassbar, und Du siehst das vollständige Finanzbild in Minuten.",
"landing_faq_a2": "Ja. Der Planer erstellt IRR, MOIC, DSCR, Break-even-Analyse und 10-Jahres-Cashflow-Projektionen — genau die Kennzahlen, die Banken und Investoren erwarten. Exportiere als professionelles PDF für Deinen Kreditantrag oder Dein Investoren-Pitch.",
"landing_faq_a3": "Das Modell verwendet reale Standardwerte auf Basis europäischer und internationaler Marktdaten. Jede Annahme ist anpassbar, damit Du Deine lokalen Gegebenheiten abbilden kannst. Die Sensitivitätsanalyse zeigt, wie sich die Ergebnisse in verschiedenen Szenarien verändern — nicht nur im Best Case.",
"landing_faq_a4": "Die Standardwerte basieren auf echten Platzbaukosten, Mietpreisen und Betriebsbenchmarks aus öffentlichen Quellen und Branchendaten. Du kannst jede Annahme mit Deinen eigenen Zahlen überschreiben.",
"landing_faq_a5": "Der Planer ist 100 % kostenlos — ohne Registrierung, ohne Kreditkarte, ohne Testphase. Erstelle ein kostenloses Konto, um Szenarien zu speichern und Konfigurationen zu vergleichen. Der PDF-Export ist als Zusatzleistung verfügbar (99 € einmalig).",
"landing_seo_p1": "Padel ist eine der am schnellsten wachsenden Racketsportarten weltweit — die Nachfrage nach Plätzen übersteigt das Angebot in Märkten von Deutschland, Spanien und Schweden bis in die USA und den Nahen Osten. Eine Padel-Anlage zu eröffnen kann eine attraktive Investition sein, aber die Zahlen müssen stimmen. Eine typische Indoorhalle mit 68 Plätzen erfordert zwischen 300.000 € (Anmietung eines Bestandsgebäudes) und 23 Mio. € (Neubau), mit Amortisationszeiten von 35 Jahren für gut gelegene Anlagen.",
"landing_seo_p2": "Die entscheidenden Faktoren für den Erfolg sind Standort (treibt die Auslastung), Baukosten (CAPEX), Miet- oder Grundstückskosten sowie die Preisstrategie. Unser Finanzplaner ermöglicht es Dir, alle diese Variablen interaktiv zu modellieren und die Auswirkungen auf IRR, MOIC, Cashflow und Schuldendienstdeckungsgrad in Echtzeit zu sehen. Ob Du als Unternehmer Deine erste Anlage prüfst, als Immobilienentwickler Padel in ein Mixed-Use-Projekt integrierst oder als Investor eine bestehende Padel-Anlage bewertest — Padelnomics gibt Dir die finanzielle Klarheit für fundierte Entscheidungen.",
"landing_final_cta_sub": "Modelliere Deine Investition und lass Dich mit verifizierten Platz-Anbietern aus {total_countries} Ländern zusammenbringen.",
"landing_final_cta_sub": "Schließ Dich 1.000+ Padel-Unternehmern an, die aufgehört haben zu raten — und angefangen haben, mit echten Daten zu planen.",
"landing_jsonld_org_desc": "Professionelle Planungsplattform für Padelplatz-Investitionen. Finanzplaner, Anbieterverzeichnis und Marktinformationen für Padel-Unternehmer.",
"landing_proof_plans": "{count}+ Businesspläne erstellt",
"landing_proof_suppliers": "{count}+ Anbieter in {countries} Ländern",
"landing_proof_projects": "{amount} Mio. €+ an geplanten Projekten",
"landing_familiar_title": "Kommt Dir das bekannt vor?",
"landing_familiar_1_quote": "Ich denke seit Monaten darüber nach — ich muss einfach mal die Zahlen durchrechnen",
"landing_familiar_1_desc": "Der Planer macht aus Deinen Annahmen ein bankfertiges Finanzmodell — in Minuten statt Wochen.",
"landing_familiar_2_quote": "Die Bank will einen Businessplan und ich starre auf eine leere Tabelle",
"landing_familiar_2_desc": "IRR, DSCR, MOIC, Cashflow-Projektionen — alles wird automatisch aus Deinen Eingaben generiert.",
"landing_familiar_3_quote": "Ich finde widersprüchliche Kostendaten und weiß nicht, was ich glauben soll",
"landing_familiar_3_desc": "Die Standardwerte basieren auf echten Marktdaten. Passe jede Annahme an Deinen lokalen Markt an.",
"landing_familiar_4_quote": "Mein Partner ist skeptisch — ich brauche einen Beweis, dass das nicht verrückt ist",
"landing_familiar_4_desc": "Stresstest per Sensitivitätsanalyse. Zeig genau, wo der Plan bricht — und wo nicht.",
"landing_familiar_cta": "Du bist nicht allein. 1.000+ Padel-Unternehmer haben hier angefangen.",
"landing_vs_title": "Warum Padelnomics?",
"landing_vs_sub": "Du hast Alternativen. Hier der ehrliche Vergleich.",
"landing_vs_col_diy": "Eigene Tabelle",
"landing_vs_col_consultant": "Externer Berater",
"landing_vs_col_us": "Padelnomics",
"landing_vs_row1_label": "Kosten",
"landing_vs_row1_diy": "Kostenlos, dauert aber Wochen",
"landing_vs_row1_consultant": "5.00010.000 €",
"landing_vs_row1_us": "Kostenlos, sofort",
"landing_vs_row2_label": "Qualität",
"landing_vs_row2_diy": "Wirkt unprofessionell bei Banken",
"landing_vs_row2_consultant": "Professionell, aber statisch",
"landing_vs_row2_us": "Professionell, anpassbar",
"landing_vs_row3_label": "Daten",
"landing_vs_row3_diy": "Keine Markt-Benchmarks",
"landing_vs_row3_consultant": "Generisch, nicht padelspezifisch",
"landing_vs_row3_us": "Echte Padel-Marktdaten",
"landing_vs_diy_cta": "Du verdienst Besseres als Raten",
"landing_vs_consultant_cta": "Du verdienst Besseres als 5.000 € zu zahlen",
"landing_vs_us_cta": "Kostenlos starten →",
"plan_basic_f1": "Verifiziert-Badge",
"plan_basic_f2": "Firmenlogo",
"plan_basic_f3": "Vollständige Beschreibung & Slogan",
@@ -1736,7 +1770,14 @@
"sup_guarantee_badge": "Garantie ohne Risiko",
"sup_leads_section_h2": "So sehen deine Interessenten aus",
"sup_leads_section_sub": "Jeder Lead hat unseren Finanzplaner genutzt. Kontaktdaten werden nach dem Freischalten sichtbar.",
"sup_roi_line": "Ein einziges 4-Court-Projekt = <strong>€30.000+ Gewinn</strong>. Growth-Plan: €2.388/Jahr. Die Rechnung ist einfach.",
"sup_roi_line": "Dein durchschnittliches Projekt ist <strong>€50.000+</strong> wert. Wenn wir dir 5 qualifizierte Leads/Monat schicken und du 1 abschließt, sind das €50.000 Umsatz für €199/Monat. Die Rechnung ist einfach.",
"sup_familiar_title": "Kommt dir das bekannt vor?",
"sup_familiar_1_quote": "20 Angebote letztes Quartal. 3 Abschlüsse.",
"sup_familiar_1_pivot": "Schluss mit Angeboten an Interessenten, die nie ernst gemeint haben.",
"sup_familiar_2_quote": "Pipeline für Q3 sieht dünn aus — dabei haben wir Kapazität.",
"sup_familiar_2_pivot": "Ein planbarer Lead-Strom, auf den du dich verlassen kannst.",
"sup_familiar_3_quote": "Schon wieder hat ein Wettbewerber uns beim Preis unterboten.",
"sup_familiar_3_pivot": "Gewinn über Angebotsqualität, nicht über den Preis.",
"sup_credits_only_pre": "Noch nicht bereit für ein Abo? Kaufe ein Credit-Paket und schalte Leads einzeln frei. Keine Bindung, keine Monatsgebühr.",
"sup_credits_only_cta": "Credits kaufen →",
"sup_step1_free_forever": "Dauerhaft kostenlos",

View File

@@ -89,17 +89,17 @@
"flash_verify_invalid": "Invalid verification link.",
"flash_verify_expired": "This link has expired or already been used. Please submit a new quote request.",
"flash_verify_invalid_lead": "This quote has already been verified or does not exist.",
"landing_hero_badge": "Padel court financial planner",
"landing_hero_h1_1": "Plan Your Padel",
"landing_hero_h1_2": "Business in Minutes,",
"landing_hero_h1_3": "Not Months",
"landing_hero_btn_primary": "Plan Your Padel Business →",
"landing_hero_btn_secondary": "Browse Suppliers",
"landing_hero_bullet_1": "No signup required",
"landing_hero_bullet_2": "60+ variables",
"landing_hero_bullet_3": "Unlimited scenarios",
"landing_roi_title": "Quick ROI Estimate",
"landing_roi_subtitle": "Drag the sliders to see your projection",
"landing_hero_badge": "The padel startup toolkit — free",
"landing_hero_h1_1": "Invest in Padel",
"landing_hero_h1_2": "with Confidence,",
"landing_hero_h1_3": "Not Guesswork",
"landing_hero_btn_primary": "Start Your Free Business Plan →",
"landing_hero_btn_secondary": "Get Supplier Quotes",
"landing_hero_bullet_1": "Free — no signup, no credit card",
"landing_hero_bullet_2": "Bank-ready metrics (IRR, DSCR, MOIC)",
"landing_hero_bullet_3": "Based on real market data",
"landing_roi_title": "Is your padel idea profitable?",
"landing_roi_subtitle": "Find out in 30 seconds",
"landing_roi_courts": "Courts",
"landing_roi_rate": "Avg. Hourly Rate",
"landing_roi_util": "Target Utilization",
@@ -108,7 +108,7 @@
"landing_roi_payback": "Payback Period",
"landing_roi_annual_roi": "Annual ROI",
"landing_roi_note": "Assumes indoor rent model, €8/m² rent, staff costs, 5% interest, 10-yr loan. Payback and ROI based on total investment.",
"landing_roi_cta": "Plan Your Padel Business →",
"landing_roi_cta": "Build Your Full Business Plan — Free →",
"landing_journey_title": "Your Journey",
"landing_journey_01": "Explore",
"landing_journey_01_badge": "Soon",
@@ -118,27 +118,27 @@
"landing_journey_04": "Build",
"landing_journey_05": "Grow",
"landing_journey_05_badge": "Soon",
"landing_features_title": "Built for Serious Padel Entrepreneurs",
"landing_feature_1_h3": "60+ Variables",
"landing_feature_2_h3": "6 Analysis Tabs",
"landing_feature_3_h3": "Indoor & Outdoor",
"landing_feature_4_h3": "Sensitivity Analysis",
"landing_feature_5_h3": "Professional Metrics",
"landing_feature_6_h3": "Save & Compare",
"landing_supplier_title": "Find the Right Suppliers for Your Project",
"landing_supplier_step_1_title": "Plan Your Venue",
"landing_supplier_step_2_title": "Get Quotes",
"landing_supplier_step_3_title": "Compare & Build",
"landing_supplier_browse_btn": "Browse Supplier Directory",
"landing_features_title": "Everything You Need to Make a Confident Decision",
"landing_feature_1_h3": "Know Your Numbers Inside Out",
"landing_feature_2_h3": "Bank-Ready from Day One",
"landing_feature_3_h3": "Any Venue Type, Any Market",
"landing_feature_4_h3": "Stress-Test Before You Commit",
"landing_feature_5_h3": "Replace the €5K Consultant",
"landing_feature_6_h3": "Compare Scenarios Side by Side",
"landing_supplier_title": "Ready to Build? Get Matched with Verified Suppliers",
"landing_supplier_step_1_title": "Share Your Project",
"landing_supplier_step_2_title": "Get Matched",
"landing_supplier_step_3_title": "Compare Proposals",
"landing_supplier_browse_btn": "Get Quotes — Free & No Obligation",
"landing_faq_title": "Frequently Asked Questions",
"landing_faq_q1": "What does the planner calculate?",
"landing_faq_q2": "Do I need to sign up?",
"landing_faq_q3": "How does supplier matching work?",
"landing_faq_q4": "Is the supplier directory free?",
"landing_faq_q5": "How accurate are the financial projections?",
"landing_faq_q1": "How much does it cost to open a padel facility?",
"landing_faq_q2": "Will a bank accept a Padelnomics business plan?",
"landing_faq_q3": "How accurate are the financial projections?",
"landing_faq_q4": "What data are the market benchmarks based on?",
"landing_faq_q5": "Do I need to pay anything?",
"landing_seo_title": "Padel Court Investment Planning",
"landing_final_cta_h2": "Start Planning Today",
"landing_final_cta_btn": "Plan Your Padel Business →",
"landing_final_cta_h2": "Your Bank Meeting Is Coming. Be Ready.",
"landing_final_cta_btn": "Start Your Free Business Plan →",
"features_h1": "Everything You Need to Plan Your Padel Business",
"features_subtitle": "Professional-grade financial modeling, completely free.",
"features_card_1_h2": "60+ Variables",
@@ -428,6 +428,7 @@
"q4_phase_permit_not_filed": "Permit not yet filed",
"q4_phase_permit_pending": "Permit in progress",
"q4_phase_permit_granted": "Permit approved",
"q4_error_phase": "Please select your project phase.",
"q5_heading": "Timeline",
"q5_subheading": "When do you want to get started?",
"q5_timeline_label": "Timeline",
@@ -891,7 +892,7 @@
"sup_meta_desc": "Free directory listing on Padelnomics. Qualified leads from buyers with business plans. Growth and Pro plans from €199/mo.",
"sup_hero_h1a": "Stop Chasing Cold Leads.",
"sup_hero_h1b": "Meet Buyers Who Already Have a Business Plan.",
"sup_hero_sub": "Every lead on Padelnomics has modeled their CAPEX, projected revenue, and calculated ROI — before they contact you. No tire-kickers. No “just browsing.”",
"sup_hero_sub": "Every lead has already built a financial model for their project. You get the budget, timeline, and specs — before you make first contact.",
"sup_hero_cta": "Get Started Free",
"sup_hero_trust_pre": "Trusted by suppliers in",
"sup_hero_trust_post": "countries",
@@ -955,7 +956,7 @@
"sup_basic_f4": "Website & contact details",
"sup_basic_f5": "Services offered checklist",
"sup_basic_f6": "Enquiry form on listing page",
"sup_basic_cta": "List Your Company Free",
"sup_basic_cta": "Get Listed Free",
"sup_growth_name": "Growth",
"sup_growth_popular": "Most Popular",
"sup_growth_credits": "30 credits/mo included",
@@ -965,7 +966,7 @@
"sup_growth_f4": "Priority over free listings",
"sup_growth_f5": "30 lead credits per month",
"sup_growth_f6": "Buy additional credit packs",
"sup_growth_cta": "Get Started",
"sup_growth_cta": "Start Getting Leads",
"sup_pro_name": "Pro",
"sup_pro_credits": "100 credits/mo included",
"sup_pro_f1": "Everything in Growth",
@@ -974,7 +975,7 @@
"sup_pro_f4": "Featured card border & glow",
"sup_pro_f5": "Priority placement in directory",
"sup_pro_f6": "100 lead credits per month",
"sup_pro_cta": "Get Started",
"sup_pro_cta": "Maximize Your Pipeline",
"sup_yearly_note_basic": "Free forever",
"sup_yearly_note_growth": "€1,799 billed yearly",
"sup_yearly_note_pro": "€4,499 billed yearly",
@@ -1012,14 +1013,14 @@
"sup_cmp_t4": "Never",
"sup_cmp_m1": "Filtered by category",
"sup_cmp_footnote": "*Google Ads estimate based on €2080 CPC for padel construction keywords at 510 clicks/day.",
"sup_proof_h2": "Trusted by Padel Industry Leaders",
"sup_proof_h2": "What You Get with Every Lead",
"sup_proof_stat1": "business plans created",
"sup_proof_stat2": "suppliers",
"sup_proof_stat3": "countries",
"sup_proof_q1": "Padelnomics sends us leads that are already serious about building. The project briefs are more detailed than what we get from trade shows.",
"sup_proof_cite1": "— European padel court manufacturer",
"sup_proof_q2": "Finally a platform that understands the padel construction market. We know the budget, the timeline, and the venue type before we even make first contact.",
"sup_proof_cite2": "— Padel court installation company, Scandinavia",
"sup_proof_point1_h3": "Complete Project Brief",
"sup_proof_point1_p": "Venue type, court count, glass/lighting specs, budget, timeline, financing status, and full contact details — before you make first contact.",
"sup_proof_point2_h3": "Financial Model Included",
"sup_proof_point2_p": "Every lead has already modeled CAPEX, revenue projections, and ROI. You're talking to someone who knows their numbers.",
"sup_faq_h2": "Supplier FAQ",
"sup_faq_q1": "How do I get listed?",
"sup_faq_a1_pre": "Find your company in our",
@@ -1055,7 +1056,14 @@
"sup_guarantee_badge": "No-risk guarantee",
"sup_leads_section_h2": "See What Your Prospects Look Like",
"sup_leads_section_sub": "Every lead has used our financial planner. Contact details are blurred until you unlock.",
"sup_roi_line": "A single 4-court project = <strong>€30,000+ in profit</strong>. Growth plan costs €2,388/year. The math is simple.",
"sup_roi_line": "Your average project is worth <strong>€50K+</strong>. If we send you 5 qualified leads/month and you close 1, that's €50K in revenue for €199/mo. The math is simple.",
"sup_familiar_title": "Is This Your Sales Team Right Now?",
"sup_familiar_1_quote": "We quoted 20 projects last quarter. Closed 3.",
"sup_familiar_1_pivot": "Stop wasting proposals on tire-kickers.",
"sup_familiar_2_quote": "Pipeline looks thin for Q3 — but we have capacity.",
"sup_familiar_2_pivot": "A predictable lead flow you can plan around.",
"sup_familiar_3_quote": "Another competitor just undercut us on price.",
"sup_familiar_3_pivot": "Win on proposal quality, not price.",
"sup_credits_only_pre": "Not ready for a subscription? Buy a credit pack and unlock leads one at a time. No commitment, no monthly fee.",
"sup_credits_only_cta": "Buy Credits →",
"sup_step1_free_forever": "Free forever",
@@ -1193,34 +1201,67 @@
"features_opex_body": "Peak and off-peak pricing with configurable hour splits. Monthly utilization ramp-up curves. Staff costs, maintenance, insurance, marketing, and utilities — all adjustable with sliders. Revenue from court rentals, coaching, equipment, and F&B.",
"features_cf_body": "10-year monthly cash flow projections. Model your equity/debt split, interest rates, and loan terms. See debt service coverage ratios and free cash flow month by month. Waterfall charts show exactly where your money goes.",
"features_returns_body": "Calculate your equity IRR and MOIC under different exit scenarios. Model cap rate exits with configurable holding periods. See your equity waterfall from initial investment through to exit proceeds.",
"landing_page_title": "Padelnomics - Padel Court Business Plan & ROI Calculator",
"landing_meta_desc": "Plan your padel court investment in minutes. 60+ variables, sensitivity analysis, and professional-grade projections. Indoor/outdoor, rent/buy models.",
"landing_og_desc": "The most sophisticated padel court business plan calculator. 60+ variables, 6 analysis tabs, charts, sensitivity analysis, and supplier connections.",
"landing_hero_desc": "Model your padel court investment with 60+ variables, sensitivity analysis, and professional-grade projections. Then get matched with verified suppliers.",
"landing_page_title": "Padelnomics Padel Business Plan & ROI Calculator | Free",
"landing_meta_desc": "Plan your padel facility investment with real market data. Bank-ready financial model with IRR, DSCR, sensitivity analysis. Free — no signup required.",
"landing_og_desc": "Plan your padel facility investment with confidence. Bank-ready financial model, real market data, and verified supplier connections. Free — no signup required.",
"landing_hero_desc": "You're about to commit €200K+. Padelnomics gives you the financial model, market data, and supplier connections to make that decision with your eyes wide open.",
"landing_journey_01_desc": "Market demand analysis, whitespace mapping, location scoring.",
"landing_journey_02_desc": "Model your investment with 60+ variables, charts, and sensitivity analysis.",
"landing_journey_03_desc": "Connect with banks and investors. Your planner becomes your business case.",
"landing_journey_04_desc": "Browse {total_suppliers}+ court suppliers across {total_countries} countries. Get matched to your specs.",
"landing_journey_05_desc": "Launch playbook, performance benchmarks, and expansion analytics.",
"landing_feature_1_body": "Every assumption is adjustable. Court costs, rent, pricing, utilization, financing terms, exit scenarios. Nothing is hard-coded.",
"landing_feature_2_body": "Assumptions, Investment (CAPEX), Operating Model, Cash Flow, Returns & Exit, and Key Metrics. Each with interactive charts.",
"landing_feature_3_body": "Model indoor halls (rent or build) and outdoor courts with seasonality. Compare scenarios side by side.",
"landing_feature_4_body": "See how your returns change with different utilization rates and pricing. Find your break-even point instantly.",
"landing_feature_5_body": "IRR, MOIC, DSCR, cash-on-cash yield, break-even utilization, RevPAH, debt yield. The metrics banks and investors want to see.",
"landing_feature_6_body": "Save unlimited scenarios. Test different locations, court counts, financing structures. Find the optimal plan.",
"landing_supplier_sub": "{total_suppliers}+ verified suppliers across {total_countries} countries. Manufacturers, builders, turf, lighting, and more.",
"landing_supplier_step_1_body": "Use the financial planner to model your courts, budget, and timeline.",
"landing_supplier_step_2_body": "Request quotes and we match you with suppliers based on your project specs.",
"landing_supplier_step_3_body": "Receive proposals from matched suppliers. No cold outreach needed.",
"landing_faq_a1": "The planner produces a complete financial model: CAPEX breakdown, monthly operating costs, cash flow projections, debt service, IRR, MOIC, DSCR, payback period, break-even utilization, and sensitivity analysis. It covers indoor/outdoor, rent/buy, and all major cost and revenue variables.",
"landing_faq_a2": "No. The planner works instantly with no signup. Create an account to save scenarios, compare configurations, and export PDF reports.",
"landing_faq_a3": "When you request quotes through the planner, we share your project details (venue type, court count, glass, lighting, country, budget, timeline) with relevant suppliers from our directory. They contact you directly with proposals.",
"landing_faq_a4": "Browsing the directory is free for everyone. Suppliers have a basic listing by default. Paid plans (Basic at €39/mo, Growth at €199/mo, Pro at €499/mo) unlock enquiry forms, full descriptions, logos, verified badges, and priority placement.",
"landing_faq_a5": "The model uses real-world defaults based on global market data. Every assumption is adjustable so you can match your local conditions. The sensitivity analysis shows how results change across different scenarios, helping you understand the range of outcomes.",
"landing_feature_1_body": "Every cost, revenue, and financing assumption is adjustable. Nothing is hidden, nothing is hard-coded.",
"landing_feature_2_body": "IRR, MOIC, DSCR, cash-on-cash yield, break-even analysis — the exact metrics banks and investors ask for.",
"landing_feature_3_body": "Indoor halls, outdoor courts, rent or build — with seasonality and regional cost adjustments built in.",
"landing_feature_4_body": "See how your returns change when utilization drops 10% or interest rates rise. Find your break-even point instantly.",
"landing_feature_5_body": "Get the same financial model a consulting firm would charge €5,000-10,000 for. Update it yourself, anytime.",
"landing_feature_6_body": "Test different locations, court counts, and financing structures. Find the plan that works.",
"landing_supplier_sub": "Every quote request includes your full financial model — budget, court count, timeline, and financing status. {total_suppliers}+ suppliers across {total_countries} countries.",
"landing_supplier_step_1_body": "Complete a 2-minute project brief. Your planner scenario data is included automatically.",
"landing_supplier_step_2_body": "We notify suppliers who match your specs, location, and budget. No cold outreach needed.",
"landing_supplier_step_3_body": "Receive quotes from matched suppliers. Every proposal is based on your actual project data — no generic estimates.",
"landing_faq_a1": "It depends on the format. A typical indoor padel venue with 6-8 courts in a rented building costs €250K-500K. Building your own hall pushes that to €1-3M. Outdoor courts start around €150K for 4 courts. Padelnomics lets you model your exact scenario — adjust every variable and see the full financial picture in minutes.",
"landing_faq_a2": "Yes. The planner produces IRR, MOIC, DSCR, break-even analysis, and 10-year cash flow projections — the exact metrics banks and investors expect. Export as a professional PDF to include in your loan application or investor pitch.",
"landing_faq_a3": "The model uses real-world defaults based on market data across Europe and beyond. Every assumption is adjustable so you can match your local conditions. The sensitivity analysis shows how results change across different scenarios, so you understand the full range of outcomes — not just the best case.",
"landing_faq_a4": "Default values are derived from real court construction costs, rental rates, and operating benchmarks gathered from public sources and industry data. You can override any assumption with your own numbers.",
"landing_faq_a5": "The planner is 100% free — no signup, no credit card, no trial period. Create a free account to save scenarios and compare configurations. PDF export is available as a paid add-on (€99 one-time).",
"landing_seo_p1": "Padel is one of the fastest-growing racket sports globally, with demand for courts outstripping supply across markets from Germany, Spain, and Sweden to the US and Middle East. Opening a padel hall can be a lucrative investment, but the numbers need to work. A typical indoor padel venue with 6-8 courts requires between €300K (renting an existing building) and €2-3M (building new), with payback periods of 3-5 years for well-located venues.",
"landing_seo_p2": "The key variables that determine success are location (driving utilization), construction costs (CAPEX), rent or land costs, and pricing strategy. Our financial planner lets you model all of these variables interactively, seeing the impact on your IRR, MOIC, cash flow, and debt service coverage ratio in real time. Whether you're an entrepreneur exploring your first venue, a real estate developer adding padel to a mixed-use project, or an investor evaluating a padel hall acquisition, Padelnomics gives you the financial clarity to make informed decisions.",
"landing_final_cta_sub": "Model your investment, then get matched with verified court suppliers across {total_countries} countries.",
"landing_final_cta_sub": "Join 1,000+ padel entrepreneurs who stopped guessing and started planning with real data.",
"landing_jsonld_org_desc": "Professional padel court investment planning platform. Financial planner, supplier directory, and market intelligence for padel entrepreneurs.",
"landing_proof_plans": "{count}+ business plans created",
"landing_proof_suppliers": "{count}+ suppliers in {countries} countries",
"landing_proof_projects": "€{amount}M+ in projects planned",
"landing_familiar_title": "Sound Familiar?",
"landing_familiar_1_quote": "I've been thinking about this for months — I just need to run the numbers",
"landing_familiar_1_desc": "The planner turns your assumptions into a bank-ready financial model in minutes, not weeks.",
"landing_familiar_2_quote": "The bank asked for a business plan and I'm staring at a blank spreadsheet",
"landing_familiar_2_desc": "IRR, DSCR, MOIC, cash flow projections — all generated automatically from your inputs.",
"landing_familiar_3_quote": "I found conflicting cost data and I don't know what to trust",
"landing_familiar_3_desc": "Default values are based on real market data. Adjust any assumption to match your local market.",
"landing_familiar_4_quote": "My partner is skeptical — I need proof this isn't crazy",
"landing_familiar_4_desc": "Stress-test your plan with sensitivity analysis. Show exactly where it breaks — and where it doesn't.",
"landing_familiar_cta": "You're not alone. 1,000+ padel entrepreneurs started here.",
"landing_vs_title": "Why Padelnomics?",
"landing_vs_sub": "You have options. Here's the honest comparison.",
"landing_vs_col_diy": "DIY Spreadsheet",
"landing_vs_col_consultant": "Hired Consultant",
"landing_vs_col_us": "Padelnomics",
"landing_vs_row1_label": "Cost",
"landing_vs_row1_diy": "Free but takes weeks",
"landing_vs_row1_consultant": "€5,000-10,000",
"landing_vs_row1_us": "Free, instant",
"landing_vs_row2_label": "Quality",
"landing_vs_row2_diy": "Looks amateur to banks",
"landing_vs_row2_consultant": "Professional but static",
"landing_vs_row2_us": "Professional, adjustable",
"landing_vs_row3_label": "Data",
"landing_vs_row3_diy": "No market benchmarks",
"landing_vs_row3_consultant": "Generic, not padel-specific",
"landing_vs_row3_us": "Real padel market data",
"landing_vs_diy_cta": "You deserve better than guessing",
"landing_vs_consultant_cta": "You deserve better than paying €5K",
"landing_vs_us_cta": "Start free →",
"plan_basic_f1": "Verified badge",
"plan_basic_f2": "Company logo",
"plan_basic_f3": "Full description & tagline",

Some files were not shown because too many files have changed in this diff.
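Several of the landing-page strings in this diff contain named placeholders such as `{count}`, `{total_suppliers}`, and `{total_countries}`. The i18n runtime that fills them is not part of this diff; as a minimal sketch, assuming Python-style `str.format` interpolation, substitution could look like:

```python
# Hypothetical sketch only: the app's actual i18n layer is not shown in this diff.
# Strings copied from the translation file; the render() helper is illustrative.
strings = {
    "landing_proof_plans": "{count}+ business plans created",
    "landing_proof_suppliers": "{count}+ suppliers in {countries} countries",
}

def render(key: str, **params: object) -> str:
    # Substitute named placeholders like {count} using str.format.
    return strings[key].format(**params)

print(render("landing_proof_suppliers", count=500, countries=30))
# → 500+ suppliers in 30 countries
```

A real implementation would also need to handle missing parameters and locale-aware number formatting, which `str.format` alone does not provide.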