Compare commits

17 Commits

Author SHA1 Message Date
Deeman
add5f8ddfa fix(extract): correct lc_lci_lev lcstruct filter value
All checks were successful
CI / test (push) Successful in 53s
CI / tag (push) Successful in 3s
2026-03-05 17:39:37 +01:00
Deeman
15ca316682 fix(extract): correct lc_lci_lev lcstruct filter value
D1_D2_A_HW doesn't exist in the API; use D1_D4_MD5 (total labour cost
= compensation + taxes - subsidies).

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-05 17:32:49 +01:00
Deeman
103ef73cf5 fix(pipeline): eurostat filter bugs + supervisor uses sqlmesh plan
2026-03-05 17:19:21 +01:00
Deeman
aa27f14f3c fix(pipeline): eurostat filter bugs + supervisor uses sqlmesh plan
- nrg_pc_203: add missing unit=KWH filter (API returns 2 units)
- lc_lci_lev: fix currency→unit filter dimension name
- supervisor: use `sqlmesh plan prod --auto-apply` instead of
  `sqlmesh run` so new/changed models are detected automatically

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-05 17:19:12 +01:00
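The two filter fixes in this commit can be sketched as follows. This is an illustrative reconstruction from the commit message only — the function name and dict shapes are assumptions, not the project's actual extractor config; the facts it encodes (pin `unit=KWH` for `nrg_pc_203`, rename the `currency` dimension to `unit` and use `lcstruct=D1_D4_MD5` for `lc_lci_lev`) come from the commits above.

```python
def fixed_filters(dataset: str) -> dict:
    """Corrected Eurostat API filter sets (illustrative sketch, not the
    project's real config)."""
    if dataset == "nrg_pc_203":
        # The API returns both KWH and GJ_GCV rows unless unit is pinned.
        return {"unit": "KWH"}
    if dataset == "lc_lci_lev":
        # The dimension is named `unit`, not `currency`; D1_D2_A_HW does not
        # exist in the API, so use D1_D4_MD5 (total labour cost).
        return {"lcstruct": "D1_D4_MD5", "unit": "EUR"}
    raise KeyError(f"no filter fix recorded for {dataset}")
```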
Deeman
8205744444 chore: remove accidentally committed .claire/ worktree directory
Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-05 17:10:48 +01:00
Deeman
1cbefe349c add env var 2026-03-05 17:08:52 +01:00
Deeman
003f19e071 fix(pipeline): handle DuckDB catalog naming in diagnostic script 2026-03-05 17:07:52 +01:00
Deeman
c3f15535b8 fix(pipeline): handle DuckDB catalog naming in diagnostic script
The lakehouse.duckdb file uses catalog "lakehouse" not "local", causing
SQLMesh logical views to break. Script now auto-detects the catalog via
USE and falls back to physical tables when views fail.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-05 17:06:44 +01:00
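The catalog-detection logic described in this commit can be sketched as pure name-resolution helpers. These function names and the exact selection rule are assumptions based on the message; only the two facts from the commit are encoded: prefer the catalog matching the `.duckdb` file stem (`lakehouse`) over `local`, and fall back to SQLMesh physical tables named `sqlmesh__<schema>.<table>__<hash>` when logical views break.

```python
def pick_catalog(available: list[str], db_stem: str = "lakehouse") -> str:
    """Prefer the catalog named after the .duckdb file stem; otherwise
    fall back to DuckDB's default 'local' catalog."""
    return db_stem if db_stem in available else "local"


def physical_table(schema: str, table: str, fingerprint: str) -> str:
    """Fallback physical-table name when a SQLMesh logical view fails:
    sqlmesh__<schema>.<table>__<hash>."""
    return f"sqlmesh__{schema}.{table}__{fingerprint}"
```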
Deeman
fcb8ec4227 merge: pipeline diagnostic script + extraction card UX improvements
2026-03-05 15:40:16 +01:00
Deeman
6b7fa45bce feat(admin): add pipeline diagnostic script + extraction card UX improvements
- Add scripts/check_pipeline.py: read-only diagnostic for pricing pipeline
  row counts, date range analysis, HAVING filter impact, join coverage
- Add description field to all 12 workflows in workflows.toml
- Parse and display descriptions on extraction status cards
- Show spinner + "Running" state with blue-tinted card border
- Display start time with "running..." text for active extractions

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-05 15:40:12 +01:00
Deeman
0d8687859d fix(docker): copy workflows.toml into container for admin pipeline view
The admin Extraction Status page reads infra/supervisor/workflows.toml,
but the Dockerfile only copied web/ into the image. Add the COPY so the
file exists at /app/infra/supervisor/workflows.toml in the container.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-05 15:16:07 +01:00
Deeman
b064e18aa1 fix(admin): resolve workflows.toml path via CWD instead of __file__
In prod the package is installed in a venv, so __file__.parents[4] doesn't
reach the repo root. Use CWD (repo root in both dev and prod via systemd
WorkingDirectory) with REPO_ROOT env var override.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-05 14:39:30 +01:00
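The path-resolution approach from this commit can be sketched in a few lines. The function name is illustrative, not the project's actual helper; the mechanism (CWD as repo root in both dev and prod via systemd `WorkingDirectory`, with a `REPO_ROOT` env var override) is taken from the commit message.

```python
import os
from pathlib import Path


def workflows_toml_path() -> Path:
    """Resolve workflows.toml relative to the repo root. CWD is the repo
    root in both dev and prod (systemd WorkingDirectory); REPO_ROOT
    overrides it explicitly."""
    root = Path(os.environ.get("REPO_ROOT", os.getcwd()))
    return root / "infra" / "supervisor" / "workflows.toml"
```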
Deeman
dc68976148 docs(marketing): add GTM, social posts, Reddit plan, and SEO calendar
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-05 11:43:11 +01:00
Deeman
60fa2bc720 test(billing): add Stripe E2E test scripts for sandbox validation
- test_stripe_sandbox.py: API-only validation of all 17 products (67 tests)
- stripe_e2e_setup.py: webhook endpoint registration via ngrok
- stripe_e2e_test.py: live webhook tests with real DB verification (67 tests)
- stripe_e2e_checkout_test.py: checkout webhook tests for credit packs,
  sticky boosts, and business plan PDF purchases (40 tests)

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-05 10:50:26 +01:00
Deeman
66c2dfce66 fix(billing): fetch line items for checkout.session.completed webhooks
_extract_line_items() was returning [] for all checkout sessions, which
meant _handle_transaction_completed never processed credit packs, sticky
boosts, or business plan PDF purchases. Now fetches line items from the
Stripe API using the session ID, with a fallback to embedded line_items.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-05 10:49:41 +01:00
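The fallback order described here — fetch line items by session ID, fall back to any embedded `line_items` — can be sketched with the API call injected as a callable, so the shape is testable without Stripe credentials. The session dict shape and the `fetch` parameter are assumptions; the real handler presumably calls the Stripe API (e.g. the list-line-items endpoint) with the session ID.

```python
from typing import Callable, Optional


def extract_line_items(session: dict,
                       fetch: Optional[Callable[[str], list]] = None) -> list:
    """Return line items for a checkout session: prefer an API fetch by
    session ID, fall back to line_items embedded in the webhook payload."""
    if fetch is not None:
        try:
            items = fetch(session["id"])
            if items:
                return items
        except Exception:
            pass  # API call failed; fall back to embedded data
    embedded = session.get("line_items") or {}
    return embedded.get("data", [])
```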
Deeman
6e3c5554aa fix(admin): enable bulk actions in grouped articles view
- dev_run.sh: also remove app.db-shm and app.db-wal on reset to fix
  SQLite disk I/O error from stale WAL/SHM files
- articles bulk: add checkboxes to grouped rows (data-ids holds all
  variant IDs); checking a group selects EN+DE together
- restore select-all checkbox in grouped <th>
- add toggleArticleGroupSelect() JS function
- fix htmx:afterSwap to re-check group checkboxes correctly

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-05 09:48:54 +01:00
Deeman
ad02140594 fix(quote): add missing required asterisk and error hint to step 4
Step 4 (Project Phase) required location_status server-side but had no
visual "*" indicator and no error message when submitting without a
selection. All other steps already had both.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-04 22:21:57 +01:00
29 changed files with 3073 additions and 33 deletions

@@ -31,12 +31,18 @@ RESEND_WEBHOOK_SECRET=
#ENC[AES256_GCM,data:1HqXvAspvNIUNpCxJwge3mEsyO0Y/EWvD3vbLxkgGqIex0hABcupX/Nzk15u8iOY5JWvvEuAO414MNt6mFvnWBDpEw==,iv:N7gCzTNJAR/ljx5gGsX+ieZctya8vQbCIb3hw49OhXg=,tag:PJKNyzhrit5VgIXl+cNlbQ==,type:comment]
#ENC[AES256_GCM,data:do6DZ/1Osc5y4xseG8Q8bDX84JBHLzvmVbHiqxP7ChlicmzYBkZ85g43BuM7V0KInFTFgvaC8xmFic+2d37Holuf1ywdAjbLkRhg,iv:qrNmhPbmFDr2ynIF5EdOLZl3FI5f68WDrxuHMkAzuuU=,tag:761gYOlEdNM+e1//1MbCHg==,type:comment]
#ENC[AES256_GCM,data:dseLIQiUEU20xJqoq2dkFho9SnKyoyQ8pStjvfxwnj8v18/ua0TH/PDx/qwIp9z5kEIvbsz5ycJesFfKPhLA5juGcdCbi5zBmZRWYg==,iv:7JUmRnohJt0H5yoJXVD3IauuJkpPHDPyY02OWHWb9Nw=,tag:KcM6JGT01Aa1kTx+U30UKQ==,type:comment]
#ENC[AES256_GCM,data:VXv1O5oRNTws8wbx/nZWH6Q=,iv:M/XwF6Zef+xlJ/8AAVI1zSmsEUNYL+0twzxXwkf8moY=,tag:y3Nu5akuiKtEIMeZhSNIkw==,type:comment]
PAYMENT_PROVIDER=ENC[AES256_GCM,data:7uxz3xmr,iv:4uEOA7ZjehD1bF91Gxl0+OxnvlZW3QIq22MhnYM43uE=,tag:XvHqyRM+ugnWTUN9GFJ3fQ==,type:str]
#ENC[AES256_GCM,data:GgXo4zkhJsxXEk8F5a/+wdbvBUGN00MUAutZYLDEqqN4T1rZu92fioOLx7MEoC0b8i61,iv:f1hUBoZpmnzXNcikf/anVNdRSHNwVmmjdIcba3eiRI4=,tag:uWpF40uuiXyWqKrYGyLVng==,type:comment]
PADDLE_API_KEY=
PADDLE_CLIENT_TOKEN=
PADDLE_WEBHOOK_SECRET=
PADDLE_NOTIFICATION_SETTING_ID=
PADDLE_ENVIRONMENT=ENC[AES256_GCM,data:KIGNxEaodA==,iv:SRebaYRpVJR0LpfalBZJLTE8qBGwWZB/Fx3IokQF99Q=,tag:lcC56e4FjVkCiyaq41vxcQ==,type:str]
#ENC[AES256_GCM,data:sk79dbsswA==,iv:J8CyJt/WOMLd7CZNutDwIOtAOAooaMsLPO35gfWo+Nc=,tag:JQcGMYdgcQgtIWKcqXZkNQ==,type:comment]
STRIPE_API_PUBLIC_KEY=ENC[AES256_GCM,data:WhWvIzNd1sS+IrrEdE+FJI6ZgEiNlgG3oxC8VoDzXf0z1oH1wgY6m9wUq6UEZZyzeiRGAeAylOk6wHJ+Lx4+zx2cfv+yweX7I3Sq5VN2D1OBPiQ3Kde4zm5cXqA92jRkLAomZxw/DkeiB14=,iv:Rb3GSLMVSySR++X240MICsXbVtOuqZNjm+nIe+s65dU=,tag:z82dyRzmxF3e87Sm2F+4Qw==,type:str]
STRIPE_API_PRIVATE_KEY=ENC[AES256_GCM,data:/62y1Iv2Op21eEvT3BosgWD0S3YqGMgdfb2Edjhq2cuh32B3eH5fh9FaqBc3CvJpM7R79hy9jTnV3CTjlCkvrXGCLDnFY2a6kvSz5f+v2d/lsr8zvFLs6OP+bhssHdVygfIwz9ye46tfcFk=,iv:iw0NAYUf/gCM4awb2tKBEKuo/j7kkpVP6JjIIdVy7O8=,tag:GO3ASp5bykwHDHNkCYsdiA==,type:str]
STRIPE_ACCOUNT_ID=ENC[AES256_GCM,data:ahJsOgZLRi5n9P7Dy0U1rvmhwr/B,iv:aoVA3M8Faqv1kZwTtagD0WLVipkA5nkX5uSjtHl14+I=,tag:XwLOu9ZiHUizcsnk73bt1w==,type:str]
#ENC[AES256_GCM,data:2Hs7ds2ppeRqKB7EiAAbWqlainKdZ+eTYZSvPloirT4Hlsuf+zTwtJTA6RzHNCuK4em//jhOx8R2k80I,iv:1N6CNPqYWp3z8lm5e2Vp6OlpgHdMOiD7dsEYp23nMtA=,tag:ulWP/BFFoLljLMVCrsgizw==,type:comment]
UMAMI_API_URL=ENC[AES256_GCM,data:oX/m95YB+S2ziUKoxDhsDzMhGZfxppw+w603tQ==,iv:GAj7ccF6seiCfLAh2XIjUi13RpgNA3GONMtINcG+KMw=,tag:mUfRlvaEWrw2QWFydtnbNA==,type:str]
UMAMI_API_TOKEN=
@@ -73,7 +79,7 @@ GEONAMES_USERNAME=ENC[AES256_GCM,data:aSkVdLNrhiF6tlg=,iv:eemFGwDIv3EG/P3lVHGZj9
CENSUS_API_KEY=ENC[AES256_GCM,data:qqG971573aGq9MiHI2xLlanKKFwjfcNNoMXtm8LNbyh0rMbQN2XukQ==,iv:az2i0ldH75nHGah4DeOxaXmDbVYqmC1c77ptZqFA9BI=,tag:zoDdKj9bR7fgIDo1/dEU2g==,type:str]
sops_age__list_0__map_enc=-----BEGIN AGE ENCRYPTED FILE-----\nYWdlLWVuY3J5cHRpb24ub3JnL3YxCi0+IFgyNTUxOSBxNWNmUzVNUGdWRnE0ZFpF\nM0JQZWZ3UDdEVzlwTmIxakxOZXBkT2x2ZlNrClRtV2M3S2daSGxUZmFDSWQ2Nmh4\neU51QndFcUxlSE00RFovOVJTcDZmUUUKLS0tIDcvL3hRMDRoMWZZSXljNzA3WG5o\nMWFic21MV0krMzlIaldBTVU0ZDdlTE0K7euGQtA+9lHNws+x7TMCArZamm9att96\nL8cXoUDWe5fNI5+M1bXReqVfNwPTwZsV6j/+ZtYKybklIzWz02Ex4A==\n-----END AGE ENCRYPTED FILE-----\n
sops_age__list_0__map_recipient=age1f5002gj4s78jju45jd28kuejtcfhn5cdujz885fl7z2p9ym68pnsgky87a
sops_lastmodified=2026-03-01T13:34:16Z
sops_mac=ENC[AES256_GCM,data:JLfGLbNTEcI6M/sUA5Zez6cfEUObgnUBmX52560PzBmeLZt0F5Y5QpeojIBqEDMuNB0hp1nnPI59WClLJtQ12VlHo9TkL3x9uCNUG+KneQrn1bTmJpA3cwNkWTzIm4l+TGbJbd4FpKJ9H0v1w+sqoKOgG8DqbtOeVdUfsVspAso=,iv:UqYxooXkEtx+y7fYzl+GFncpkjz8dcP7o9fp+kFf6w4=,tag:/maSb1aZGo+Ia8eGpB7PYw==,type:str]
sops_lastmodified=2026-03-03T15:16:35Z
sops_mac=ENC[AES256_GCM,data:T0qph3KPd68Lo4hxd6ECP+wv87uwRFsAFZwnVyf/MXvuG7raraUW02RLox0xklVcKBJXk+9jM7ycQ1nuk95UIuu7uRU88g11RaAm67XaOsafgwDMrC17AjIlg0Vf0w64WAJBrQLaXhJlh/Gz45bXlz82F+XVnTW8fGCpHRZooMY=,iv:cDgMZX6FRVe9JqQXLN6OhO06Ysfg2AKP2hG0B/GeajU=,tag:vHavf9Hw2xqJrqM3vVUTjA==,type:str]
sops_unencrypted_suffix=_unencrypted
sops_version=3.12.1

@@ -3,6 +3,7 @@ APP_NAME=ENC[AES256_GCM,data:ldJf4P0iD9ziMVg=,iv:hiVl2whhd02yZCafzBfbxX5/EU/suvz
SECRET_KEY=ENC[AES256_GCM,data:hmlXm7NKVVFmeea4DnlrH/oSnsoaMAkUz42oWwFXOXL1XwAh3iemIKHUQOV2G4SPlmjfmEVQD64xbxaJW0OcPQ/8KqhrRYDsy0F/u0h7nmNQdwJrcvzcmbvjgcwU5IITPIr23d/W5PeSJzxhB93uaJ0+zFN2CyHfeewrJKafPfw=,iv:e+ZSLUO+dlt+ET8r/0/pf74UtGIBMkaVoJMWlJn1W5U=,tag:LdDCCrHcJnKLkKL/cY/R/Q==,type:str]
BASE_URL=ENC[AES256_GCM,data:50k/RqlZ1EHqGM4UkSmTaCsuJgyU4w==,iv:f8zKr2jkts4RsawA97hzICHwj9Quzgp+Dw8AhQ7GSWA=,tag:9KhNvwmoOtDyuIql7okeew==,type:str]
DEBUG=ENC[AES256_GCM,data:O0/uRF4=,iv:cZ+vyUuXjQOYYRf4l8lWS3JIWqL/w3pnlCTDPAZpB1E=,tag:OmJE9oJpzYzth0xwaMqADQ==,type:str]
LANDING_DIR=ENC[AES256_GCM,data:rn8u+tGob0vU7kSAtxmrpYQlneesvyO10A==,iv:PuGtdcQBdRbnybulzd6L7JVQClcK3/QjMeYFXZSxGW0=,tag:K2PJPMCWXdqTlQpwP9+DOQ==,type:str]
#ENC[AES256_GCM,data:xmJc6WTb3yumHzvLeA==,iv:9jKuYaDgm4zR/DTswIMwsajV0s5UTe+AOX4Sue0GPCs=,tag:b/7H9js1HmFYjuQE4zJz8w==,type:comment]
ADMIN_EMAILS=ENC[AES256_GCM,data:R/2YTk8KDEpNQ71RN8Fm6miLZvXNJQ==,iv:kzmiaBK7KvnSjR5gx6lp7zEMzs5xRul6LBhmLf48bCU=,tag:csVZ0W1TxBAoJacQurW9VQ==,type:str]
#ENC[AES256_GCM,data:S7Pdg9tcom3N,iv:OjmYk3pqbZHKPS1Y06w1y8BE7CU0y6Vx2wnio9tEhus=,tag:YAOGbrHQ+UOcdSQFWdiCDA==,type:comment]
@@ -63,7 +64,7 @@ sops_age__list_1__map_enc=-----BEGIN AGE ENCRYPTED FILE-----\nYWdlLWVuY3J5cHRpb2
sops_age__list_1__map_recipient=age1wjepykv3glvsrtegu25tevg7vyn3ngpl607u3yjc9ucay04s045s796msw
sops_age__list_2__map_enc=-----BEGIN AGE ENCRYPTED FILE-----\nYWdlLWVuY3J5cHRpb24ub3JnL3YxCi0+IFgyNTUxOSBFeHhaOURNZnRVMEwxNThu\nUjF4Q0kwUXhTUE1QSzZJbmpubnh3RnpQTmdvCjRmWWxpNkxFUmVGb3NRbnlydW5O\nWEg3ZXJQTU4vcndzS2pUQXY3Q0ttYjAKLS0tIE9IRFJ1c2ZxbGVHa2xTL0swbGN1\nTzgwMThPUDRFTWhuZHJjZUYxOTZrU00KY62qrNBCUQYxwcLMXFEnLkwncxq3BPJB\nKm4NzeHBU87XmPWVrgrKuf+PH1mxJlBsl7Hev8xBTy7l6feiZjLIvQ==\n-----END AGE ENCRYPTED FILE-----\n
sops_age__list_2__map_recipient=age1c783ym2q5x9tv7py5d28uc4k44aguudjn03g97l9nzs00dd9tsrqum8h4d
sops_lastmodified=2026-03-01T20:26:09Z
sops_mac=ENC[AES256_GCM,data:IxzU6VehA0iHgpIEqDSoMywKyKONI6jSr/6Amo+g3JI72awJtk6ft0ppfDWZjeHhL0ixfnvgqMNwai+1e0V/U8hSP8/FqYKEVpAO0UGJfBPKP3pbw+tx3WJQMF5dIh2/UVNrKvoACZq0IDJfXlVqalCnRMQEHGtKVTIT3fn8m6c=,iv:0w0ohOBsqTzuoQdtt6AI5ZdHEKw9+hI73tycBjDSS0o=,tag:Guw7LweA4m4Nw+3kSuZKWA==,type:str]
sops_lastmodified=2026-03-05T15:55:19Z
sops_mac=ENC[AES256_GCM,data:orLypjurBTYmk3um0bDQV3wFxj1pjCsjOf2D+AZyoIYY88MeY8BjK8mg8BWhmJYlGWqHH1FCpoJS+2SECv2Bvgejqvx/C/HSysA8et5CArM/p/MBbcupLAKOD8bTXorKMRDYPkWpK/snkPToxIZZd7dNj/zSU+OhRp5qLGCHkvM=,iv:eBn93z4DSk8UPHgP/Jf/Kz+3KwoKIQ9Et72pbLFcLP8=,tag:79kzPIKp0rtHGhH1CkXqwg==,type:str]
sops_unencrypted_suffix=_unencrypted
sops_version=3.12.1

@@ -6,7 +6,17 @@ The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.1.0/).
## [Unreleased]
### Fixed
- **Pipeline diagnostic script** (`scripts/check_pipeline.py`) — handle DuckDB catalog naming quirk where `lakehouse.duckdb` uses catalog `lakehouse` instead of `local`, causing SQLMesh logical views to break. Script now auto-detects the catalog via `USE`, and falls back to querying physical tables (`sqlmesh__<schema>.<table>__<hash>`) when views fail.
- **Eurostat gas prices extractor** — `nrg_pc_203` filter missing `unit` dimension (API returns both KWH and GJ_GCV); now filters to `KWH`.
- **Eurostat labour costs extractor** — `lc_lci_lev` used non-existent `currency` filter dimension; corrected to `unit: EUR`.
- **Supervisor transform step** — changed `sqlmesh run` to `sqlmesh plan prod --auto-apply` so new/modified models are detected and applied automatically.
### Added
- **Pipeline diagnostic script** (`scripts/check_pipeline.py`) — read-only script that reports row counts at every layer of the pricing pipeline (staging → foundation → serving), date range analysis, HAVING filter impact, and join coverage. Run on prod to diagnose empty serving tables.
- **Extraction card descriptions** — each workflow card on the admin pipeline page now shows a one-line description explaining what the data source is (e.g. "EU geographic boundaries (NUTS2 polygons) from Eurostat GISCO"). Descriptions defined in `workflows.toml`.
- **Running state indicator** — extraction cards show a spinner + "Running" label with a blue-tinted border when an extraction is actively running, replacing the plain Run button. Cards also display the start time with "running..." text.
- **Interactive Leaflet maps** — geographic visualization across 4 key placements using self-hosted Leaflet 1.9.4 (GDPR-safe, no CDN):
- **Markets hub** (`/markets`): country bubble map with circles sized by total venues, colored by avg market score (green ≥ 60, amber 30-60, red < 30). Click navigates to country overview.
- **Country overview articles**: city bubble map loads after article render, auto-fits bounds, click navigates to city page. Bubbles colored by market score.

@@ -25,6 +25,7 @@ WORKDIR /app
RUN mkdir -p /app/data && chown -R appuser:appuser /app
COPY --from=build --chown=appuser:appuser /app .
COPY --from=css-build /app/web/src/padelnomics/static/css/output.css ./web/src/padelnomics/static/css/output.css
COPY --chown=appuser:appuser infra/supervisor/workflows.toml ./infra/supervisor/workflows.toml
USER appuser
ENV PYTHONUNBUFFERED=1
ENV DATABASE_PATH=/app/data/app.db

@@ -1,6 +1,6 @@
# Padelnomics — Marketing Master Doc
> Living doc. Update state column as things progress. Last updated: 2026-02-22.
> Living doc. Update state column as things progress. Last updated: 2026-03-04.
---
@@ -216,9 +216,9 @@ The moat compounds over time — this is critical to long-term defensibility.
| Channel | Approach | State |
|---------|----------|-------|
| **LinkedIn** | Founder posts, thought leadership, padel community | [ ] Not started |
| **Reddit** | r/padel, r/entrepreneur — seeding calculator, articles | [ ] Not started |
| **Facebook Groups** | Padel business groups, sports entrepreneur communities | [ ] Not started |
| **LinkedIn** | Founder posts, thought leadership, padel community | [~] First post published |
| **Reddit** | r/padel, r/sweatystartup, r/entrepreneur, r/tennis, r/smallbusiness, r/pickleball, r/CRE — seeding calculator, articles | [~] Active in 7 subreddits |
| **Facebook Groups** | Padel business groups, sports entrepreneur communities | [~] Active in 2-3 groups |
### Borrowed (Month 2+)

docs/gtm-day-one.md Normal file

@@ -0,0 +1,89 @@
# GTM — Day One Action Plan
> Created: 2026-03-04. Do these in order. Total time: ~4–5 hours.
---
## Right Now (1–2 hours, highest leverage)
### 1. Submit sitemap to Google Search Console + Bing Webmaster Tools
You have 80 programmatic city articles sitting unindexed. Every day without indexing is wasted compound time.
- [search.google.com/search-console](https://search.google.com/search-console) → Add property → Submit sitemap
- [bing.com/webmasters](https://www.bing.com/webmasters) (Bing also feeds DuckDuckGo, Ecosia, Yahoo)
- Your SEO hub already supports both — just add the env vars
### 2. Publish SEO articles on prod
Run `seed_content --generate` from admin or CLI. Those 80 city pages (40 cities × EN+DE) are the primary organic traffic engine. Until they're live and crawlable, they generate zero value.
### 3. Index the planner in Google
Make sure `/en/calculator` and `/de/rechner` are in the sitemap and crawlable. This is the #1 free tool — the entire PLG funnel starts here. Check canonical tags and hreflang are correct.
---
## This Afternoon (2–3 hours, seed distribution)
### 4. First LinkedIn post
Data-driven insight from the pipeline. See `docs/social-posts.md` for the full post.
### 5. Post in Reddit communities
- **r/padel**: Free calculator angle — genuinely useful tool
- **r/entrepreneur**: Indie maker angle — "built this with real market data"
- **r/smallbusiness**: Business planning tool angle
- **r/tennis**: Cross-sport angle — tennis clubs adding padel courts
See `docs/social-posts.md` for all posts ready to copy-paste.
### 6. Share in 2–3 Facebook padel business groups
Same angle as Reddit — free tool, no hard sell. Search for:
- "Padel Business" groups
- "Padel Club Owners" groups
- "Padel Deutschland" / "Padel Germany" groups
---
## This Evening (1 hour, set up compounding assets)
### 7. Verify Resend production API key
Test a real magic link email. Until email works in prod, you can't capture traffic.
### 8. Wipe test suppliers
Delete the 5 `example.com` entries. Empty directory with "Be the first to list" > obviously fake data.
### 9. Request indexing for top 5 city pages
After GSC is set up, use "Request Indexing" manually for highest-value pages:
- `/de/markets/berlin`, `/de/markets/muenchen`, `/de/markets/hamburg`
- `/en/markets/london`, `/en/markets/madrid`
Google prioritizes manually requested URLs — can appear in search within days vs. weeks.
---
## What NOT to do today
- ~~"State of Padel" report~~ — multi-day effort
- ~~Supplier outreach~~ — site needs to be live + articles indexed first
- ~~Copy/CRO optimization~~ — premature, get traffic first
- ~~Paid ads~~ — excluded in channel strategy
---
## Expected outcome
If you do steps 1–9 today:
- 80 pages submitted for indexing (organic traffic starts in 1–3 weeks)
- 3–5 social posts seeding traffic immediately
- Planner discoverable and shareable
- Email capture working for when traffic arrives
**Single highest-leverage action: publish the articles + submit the sitemap.** Everything else is distribution on top of that foundation.

@@ -0,0 +1,91 @@
# Reddit Communities — Padelnomics Distribution
> Permanent reference for Reddit distribution. Subreddits ranked by relevance + size.
> Created: 2026-03-04. Review monthly — subreddit rules change.
---
## Tier 1 — Post Here First
High relevance, receptive to tools/data, proven padel or business-planning interest.
| Subreddit | Size | Angle | Notes |
|-----------|------|-------|-------|
| r/padel | ~20K | Free calculator, data insights, answer existing biz threads | Player community — lead with the sport, not the product. Helpful tone only. |
| r/sweatystartup | ~56-81K | "Best brick-and-mortar sports opportunity" with unit economics | Loves concrete P&L numbers. Show CAPEX/OPEX/payback, not vision. |
| r/tennis | ~2M | Tennis club court conversion trends + data | Huge audience. Angle: "your club is probably already thinking about this." |
| r/smallbusiness | ~2.2M | Free business planning tool for sports facilities | Practical, no-hype tone. Lead with the tool, not the market thesis. |
---
## Tier 2 — Test With One Post Each
Potentially high-value but less proven fit. Post once, measure engagement, double down if it works.
| Subreddit | Size | Angle | Notes |
|-----------|------|-------|-------|
| r/entrepreneur | ~4.8M | "Bloomberg for padel" indie builder story | Loves "I built X" posts with real data. Show the data pipeline, not just the product. |
| r/CommercialRealEstate | ~44K | Sports venue site selection as niche CRE | Small but highly targeted. Angle: alternative asset class with data backing. |
| r/realestateinvesting | ~1.2M | Alternative commercial RE asset class | Broader audience. Frame padel as "the new self-storage" — boring but profitable. |
| r/pickleball | ~30K | Padel vs pickleball facility economics comparison | Comparative angle works. Don't trash pickleball — frame as "here's what the padel side looks like." |
| r/gymowners | Small | Cross-reference gym location frameworks with padel data | Niche. Test if gym owners see padel as a complementary or competing asset. |
| r/padelUSA | <5K | US-specific demand data | Tiny but highly relevant. US padel market is nascent — early authority opportunity. |
---
## Tier 3 — Monitor Only
Read these for trends and conversations. Don't post unless a specific thread is a perfect fit for a data-backed comment.
- r/business — too generic, self-promo gets buried
- r/startups — SaaS-focused, padel doesn't fit the narrative
- r/SaaS — pure software community, facility business is off-topic
- r/venturecapital — wrong audience for bootstrapped niche tool
- r/sports — massive, low engagement on niche content
---
## Key Gap
No subreddit exists for padel facility operators or business owners. If community forms organically around Padelnomics content (comments like "where can I discuss this more?"), consider creating **r/padelbusiness** later. Don't force it — let demand signal the timing.
---
## Posting Rules
1. **One link per post, at the end.** Never in the title.
2. **Engage with every comment for 24 hours** after posting. This is where the real value is.
3. **No cross-posting.** Each post is unique to the subreddit's culture and tone.
4. **If a post gets removed, don't repost.** Move to the next subreddit. Respect mod decisions.
5. **Read each subreddit's rules before posting.** Some ban self-promotion entirely. Some require flair. Some have minimum account age/karma requirements.
6. **Never post more than one subreddit per day.** Spread it out. Reddit's spam detection flags rapid multi-sub posting.
7. **Comment on existing threads first.** Build karma and presence in a sub before dropping your own post.
---
## UTM Tracking Format
All Reddit links use this format:
```
https://padelnomics.io/<path>?utm_source=reddit&utm_medium=social&utm_campaign=launch&utm_content=r_<subreddit>
```
Examples:
- `https://padelnomics.io/en/planner/?utm_source=reddit&utm_medium=social&utm_campaign=launch&utm_content=r_padel`
- `https://padelnomics.io/en/markets?utm_source=reddit&utm_medium=social&utm_campaign=launch&utm_content=r_sweatystartup`
- `https://padelnomics.io/en/markets?utm_source=reddit&utm_medium=social&utm_campaign=launch&utm_content=r_cre`
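The UTM format above is mechanical enough to generate rather than hand-type. A small helper (the function name is ours, not part of the repo) that produces links in exactly this shape:

```python
from urllib.parse import urlencode


def utm_link(path: str, subreddit: str) -> str:
    """Build a Reddit-tagged link in the documented UTM format:
    ?utm_source=reddit&utm_medium=social&utm_campaign=launch&utm_content=r_<subreddit>"""
    params = {
        "utm_source": "reddit",
        "utm_medium": "social",
        "utm_campaign": "launch",
        "utm_content": f"r_{subreddit}",
    }
    return f"https://padelnomics.io{path}?{urlencode(params)}"
```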
---
## Measuring Success
| Metric | Good | Great |
|--------|------|-------|
| Post upvotes | 10+ | 50+ |
| Comments | 5+ | 20+ |
| UTM clicks (GA) | 20+ per post | 100+ per post |
| Planner completions from Reddit | 5+ per post | 20+ per post |
| Email captures from Reddit | 2+ per post | 10+ per post |
Track weekly in a simple spreadsheet. Drop subreddits that produce zero clicks after 2 posts.

docs/reddit-posting-plan.md Normal file

@@ -0,0 +1,106 @@
# Reddit Posting Plan — Launch Sequence
> Day-by-day posting schedule. One post per day, engage for 24 hours after each.
> Created: 2026-03-04. See `docs/reddit-communities.md` for full subreddit research.
---
## Posting Sequence
| Day | Subreddit | Post Title | Angle | UTM |
|-----|-----------|-----------|-------|-----|
| 1 | r/padel | "I built a free padel court ROI calculator — feedback welcome" | Free tool, genuinely helpful | `utm_content=r_padel` |
| 2 | r/sweatystartup | "25K venues analyzed — which cities are undersupplied for padel" | Unit economics, brick-and-mortar opportunity | `utm_content=r_sweatystartup` |
| 3 | r/entrepreneur | "I'm building the 'Bloomberg for padel' — tracking 10,127 facilities across 17 countries" | Indie builder story with real data | `utm_content=r_entrepreneur` |
| 4 | r/tennis | "Data on padel facility economics — useful for tennis clubs considering adding courts" | Tennis club conversion data | `utm_content=r_tennis` |
| 5 | r/smallbusiness | "Free business planning tool for anyone looking at opening a sports facility" | Practical tool for real decisions | `utm_content=r_smallbusiness` |
| 7 | r/pickleball | "Padel vs pickleball facility economics — a data comparison" | Comparative, respectful of pickleball | `utm_content=r_pickleball` |
| 10 | r/CommercialRealEstate | "Sports venue site selection — data on underserved markets" | Alternative CRE asset class | `utm_content=r_cre` |
Day 6 and days 8-9 are rest days for engaging with comments on previous posts.
---
## Full UTM Format
Every Reddit link follows this exact format:
```
https://padelnomics.io/<path>?utm_source=reddit&utm_medium=social&utm_campaign=launch&utm_content=<value>
```
| Subreddit | utm_content value |
|-----------|-------------------|
| r/padel | `r_padel` |
| r/sweatystartup | `r_sweatystartup` |
| r/entrepreneur | `r_entrepreneur` |
| r/tennis | `r_tennis` |
| r/smallbusiness | `r_smallbusiness` |
| r/pickleball | `r_pickleball` |
| r/CommercialRealEstate | `r_cre` |
---
## Post Content
Full post text is in `docs/social-posts.md`. Before posting, replace `[LINK]` placeholders with the correct UTM-tagged URL:
| Post | Link to |
|------|---------|
| r/padel | `https://padelnomics.io/en/planner/?utm_source=reddit&utm_medium=social&utm_campaign=launch&utm_content=r_padel` |
| r/sweatystartup | `https://padelnomics.io/en/markets?utm_source=reddit&utm_medium=social&utm_campaign=launch&utm_content=r_sweatystartup` |
| r/entrepreneur | `https://padelnomics.io/en/?utm_source=reddit&utm_medium=social&utm_campaign=launch&utm_content=r_entrepreneur` |
| r/tennis | `https://padelnomics.io/en/planner/?utm_source=reddit&utm_medium=social&utm_campaign=launch&utm_content=r_tennis` |
| r/smallbusiness | `https://padelnomics.io/en/planner/?utm_source=reddit&utm_medium=social&utm_campaign=launch&utm_content=r_smallbusiness` |
| r/pickleball | `https://padelnomics.io/en/planner/?utm_source=reddit&utm_medium=social&utm_campaign=launch&utm_content=r_pickleball` |
| r/CommercialRealEstate | `https://padelnomics.io/en/markets?utm_source=reddit&utm_medium=social&utm_campaign=launch&utm_content=r_cre` |
---
## Rules
1. **One link per post, at the end.** Never in the title.
2. **Engage with every comment for 24 hours** after posting.
3. **No cross-posting.** Each post is written uniquely for its subreddit's culture.
4. **If a post gets removed, don't repost.** Move to the next subreddit.
5. **Read subreddit rules before posting.** Check for self-promotion policies, flair requirements, minimum karma.
6. **Comment on 2-3 existing threads** in a subreddit before making your own post (builds credibility).
7. **Never mention other posts.** Each community should feel like they're getting a unique share.
---
## Engagement Playbook
### When you get comments:
- **"How accurate is this?"** — Share methodology: real market data from OpenStreetMap, Playtomic, Eurostat. Not generic assumptions.
- **"What about [city]?"** — Run the planner for their city, share the numbers. This is high-value personalized engagement.
- **"I'm actually looking at opening a facility"** — Offer to walk through the planner with them. Ask about their timeline, location, budget. This is a lead.
- **"This is just an ad"** — Don't get defensive. Say "Fair point — I built this and wanted feedback. The tool is free with no signup, so figured it might be useful here."
- **"What's your business model?"** — Be transparent: "Free calculator, paid market intelligence for serious investors, supplier directory for builders."
### When a post gets traction (50+ upvotes):
- Reply with additional data points to keep the thread alive
- Answer every question, even late ones
- Don't edit the original post to add more links
---
## Tracking
After each post, log:
| Field | Example |
|-------|---------|
| Date posted | 2026-03-04 |
| Subreddit | r/padel |
| Post URL | reddit.com/r/padel/... |
| Upvotes (24hr) | 15 |
| Comments (24hr) | 7 |
| UTM clicks (GA, 7d) | 42 |
| Planner starts (7d) | 12 |
| Emails captured (7d) | 3 |
| Removed? | No |
Review after Day 10. Double down on subreddits that drove clicks. Drop ones that didn't.

@@ -0,0 +1,150 @@
# SEO Content Calendar — First 30 Days
> 4-week content plan covering programmatic SEO deployment, cornerstone articles, and data-driven content.
> Created: 2026-03-04.
---
## Week 1 — Foundation (March 4-10)
Get the existing 80 pages indexed and write the first cornerstone article.
| Day | Task | Owner | State |
|-----|------|-------|-------|
| Mon | Publish 80 programmatic city articles (40 cities x EN+DE) | Deploy | [ ] |
| Mon | Submit sitemap to Google Search Console | Manual | [ ] |
| Mon | Submit sitemap to Bing Webmaster Tools | Manual | [ ] |
| Tue | Request manual indexing for top 10 pages in GSC | Manual | [ ] |
| Tue | Verify hreflang tags and canonical URLs on all city pages | Audit | [ ] |
| Wed-Fri | Write Article #1: "Is Padel Still a Good Investment in 2026?" | Editorial | [ ] |
| Fri | Publish Article #1, add to sitemap | Deploy | [ ] |
**Top 10 pages for manual indexing:**
1. `/de/markets/berlin`
2. `/de/markets/muenchen`
3. `/de/markets/hamburg`
4. `/en/markets/london`
5. `/en/markets/madrid`
6. `/en/calculator`
7. `/de/rechner`
8. `/en/markets/paris`
9. `/de/markets/frankfurt`
10. `/de/markets/koeln`
---
## Week 2 — Cornerstone Content (March 11-17)
Two high-value articles targeting decision-stage keywords. Internal linking pass connects everything.
| Day | Task | Owner | State |
|-----|------|-------|-------|
| Mon-Tue | Write Article #2: "How Much Does It Cost to Open a Padel Hall in Germany?" | Editorial | [ ] |
| Wed | Publish Article #2 | Deploy | [ ] |
| Thu-Fri | Write Article #3: "What Banks Want to See in a Padel Business Plan" | Editorial | [ ] |
| Fri | Publish Article #3 | Deploy | [ ] |
| Sat | Internal linking pass: city articles -> cornerstone articles -> planner | Technical | [ ] |
### Article #2 — Target Keywords
- "padel halle kosten" / "padel court cost germany"
- "padel halle eroeffnen kosten" / "how much to open padel hall"
- "padel anlage investition"
### Article #3 — Target Keywords
- "padel business plan" / "padel halle business plan"
- "padel halle finanzierung" / "padel financing"
- "bank business plan padel"
### Internal Linking Structure
```
City article (e.g., /markets/berlin)
-> "How much does it cost?" (Article #2)
-> "Plan your facility" (/calculator)
Article #2 (Cost breakdown)
-> "Build your business plan" (/calculator)
-> "What banks want to see" (Article #3)
-> City-specific examples (/markets/muenchen, /markets/hamburg)
Article #3 (Bank requirements)
-> "Generate your business plan" (/calculator)
-> "Check market data for your city" (/markets)
```
---
## Week 3 — Data-Driven Content (March 18-24)
Leverage the pipeline data for unique content nobody else can produce.
| Day | Task | Owner | State |
|-----|------|-------|-------|
| Mon-Wed | Write "Top 50 Underserved Locations for Padel in Europe" | Editorial | [ ] |
| Wed | Publish Top 50 article | Deploy | [ ] |
| Thu-Fri | Build Gemeinde-level pSEO template (targets "Padel in [Ort]") | Technical | [ ] |
| Fri | Generate first batch of Gemeinde pages (top 20 locations) | Deploy | [ ] |
### Top 50 Article
- Source data from `location_opportunity_profile` in the serving layer
- Rank by opportunity score, filter to locations with zero existing facilities
- Include mini-profiles: population, income level, nearest existing facility, opportunity score
- Embed interactive map if possible, otherwise static top-50 table
- Target keywords: "where to open padel", "best locations padel europe", "padel market gaps"
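The selection logic in the bullets above is simple enough to sketch. In production this would be one query against `location_opportunity_profile` in the serving layer; here is the equivalent logic in plain Python, with field names assumed from the description rather than taken from the confirmed schema:

```python
def top_underserved(rows: list[dict], limit: int = 50) -> list[dict]:
    """Rank candidate locations by opportunity score.

    Keeps only locations with zero existing facilities, per the article
    criteria above. Field names are assumptions, not the real schema.
    """
    candidates = [r for r in rows if r["existing_facility_count"] == 0]
    candidates.sort(key=lambda r: r["opportunity_score"], reverse=True)
    return candidates[:limit]
```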
### Gemeinde-Level pSEO
- Template targets: "Padel in [Ort]" / "Padel [Gemeinde]"
- Zero SERP competition confirmed for most German municipalities
- Content: local demographics, nearest facilities, opportunity score, CTA to planner
- Start with top 20 highest-opportunity Gemeinden, expand weekly
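One way the template step could look, using stdlib `string.Template` as a stand-in for the site's real templating engine (field names and page copy are illustrative):

```python
from string import Template

# Stand-in for the real pSEO template; the production site presumably uses
# its own templating engine and richer content blocks.
PAGE = Template(
    "# Padel in $ort\n\n"
    "$ort hat $population Einwohner. Die nächste Padel-Anlage liegt "
    "$nearest_km km entfernt. Opportunity Score: $score/100.\n"
)

def render_gemeinde_page(row: dict) -> str:
    """Render one 'Padel in [Ort]' page from a serving-layer row."""
    return PAGE.substitute(
        ort=row["name"],
        population=f"{row['population']:,}".replace(",", "."),  # German thousands separator
        nearest_km=row["nearest_facility_km"],
        score=row["opportunity_score"],
    )
```

Generating the batch is then a loop over the top 20 rows, which keeps the weekly expansion a data refresh rather than new editorial work.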
---
## Week 4 — Authority Building (March 25-31)
Establish Padelnomics as the data authority. Begin email-gated content for list building.
| Day | Task | Owner | State |
|-----|------|-------|-------|
| Mon-Wed | Write "State of Padel Q1 2026" report | Editorial | [ ] |
| Wed | Design PDF layout (WeasyPrint or similar) | Technical | [ ] |
| Thu | Publish report landing page (email-gated download) | Deploy | [ ] |
| Thu | Promote Market Score methodology page via social | Social | [ ] |
| Fri | Begin link building via Reddit/LinkedIn engagement | Social | [ ] |
| Ongoing | Monitor GSC for indexing progress, fix crawl errors | Technical | [ ] |
### State of Padel Q1 2026 Report
- Executive summary of European padel market
- Facility count by country (from pipeline data)
- Growth trends (year-over-year where data exists)
- Top opportunity markets (from opportunity scoring)
- Investment economics summary (from planner defaults)
- Email-gated: free download in exchange for email address
- Promote via LinkedIn, Reddit, and direct outreach to industry contacts
---
## Content Inventory (End of Month 1)
| Type | Count | State |
|------|-------|-------|
| Programmatic city articles (EN+DE) | 80 | Deployed Week 1 |
| Cornerstone articles | 3 | Published Weeks 1-2 |
| Data-driven article (Top 50) | 1 | Published Week 3 |
| Gemeinde-level pSEO pages | 20+ | Started Week 3 |
| Gated report (State of Padel) | 1 | Published Week 4 |
| **Total indexable pages** | **105+** | |
---
## SEO KPIs — End of Month 1
| Metric | Target |
|--------|--------|
| Pages indexed (GSC) | 80+ of 105 |
| Organic impressions | 500+ |
| Organic clicks | 50+ |
| Average position (target keywords) | Top 50 |
| Email captures from gated report | 50+ |
| Backlinks acquired | 3+ |
These are conservative baselines. Programmatic pages in zero-competition niches can index and rank faster than typical content.

docs/social-posts-de.md Normal file

@@ -0,0 +1,153 @@
# Social Posts — Deutsche Versionen
> Fertige Posts zum Rauskopieren. Domain: padelnomics.io
> Erstellt: 2026-03-04.
>
> Reddit-Posts bleiben auf Englisch (englischsprachige Subreddits).
> Diese Datei enthält LinkedIn- und Facebook-Posts auf Deutsch.
---
## LinkedIn Post #1 — Marktdaten
> Ziel: Glaubwürdigkeit aufbauen + Traffic auf den Rechner lenken.
```
10.127 Padel-Anlagen in 17 Ländern — wir haben sie alle erfasst.
Was dabei auffällt:
→ Italien führt mit 3.069 Anlagen. Mehr als Spanien (2.241).
→ Portugal hat den reifsten Padel-Markt weltweit (Maturity Score 45,2/100) — bei „nur“ 506 Anlagen.
→ Deutschland: 359 Anlagen für 84 Mio. Einwohner. Spanien: 2.241 für 47 Mio.
Diese Lücke ist die Chance.
Wir haben 15.390 Standorte ohne Padel-Angebot identifiziert, die hohes Potenzial zeigen. Hamburg, München und Frankfurt stehen in Deutschland ganz oben.
Für alle, die über eine eigene Padel-Anlage nachdenken oder jemanden beraten: Wir haben einen kostenlosen ROI-Rechner gebaut, der mit echten Marktdaten die Kosten, Umsätze und Amortisation für jede Stadt in Europa modelliert.
Ohne Anmeldung. Einfach rechnen.
→ https://padelnomics.io/de/planner/?utm_source=linkedin&utm_medium=social&utm_campaign=launch&utm_content=li_marktdaten
#padel #sportbusiness #marktdaten #unternehmertum
```
---
## LinkedIn Post #2 — Standortanalyse (Tag 2–3 posten)
```
Die 5 am stärksten unterversorgten Städte für Padel in Europa:
1. Hamburg — 1,85 Mio. Einwohner, keine einzige Padel-Anlage
2. München — 1,26 Mio. Einwohner, starke Sportkultur, kaum Angebot
3. Bergen (Norwegen) — 294.000 Einwohner, Opportunity Score: 87,5/100
4. Graz (Österreich) — 303.000 Einwohner, null Courts, hohes Einkommen
5. Genf (Schweiz) — 202.000 Einwohner, null Courts, höchste Kaufkraft
Keine Schätzungen. Wir bewerten 143.877 Standorte in Europa anhand von Bevölkerungsdichte, Einkommensdaten, bestehendem Angebot und Sportinfrastruktur.
Der Padel-Markt wächst von 25.000 auf über 50.000 Anlagen weltweit. Die Frage ist nicht ob — sondern wo.
→ Daten für eure Stadt: https://padelnomics.io/de/markets?utm_source=linkedin&utm_medium=social&utm_campaign=launch&utm_content=li_standortanalyse
#padel #marktanalyse #sportsinvestment #immobilien
```
---
## LinkedIn Post #3 — Gründerstory (optional, Woche 2)
```
Vor einem Jahr habe ich angefangen, den europäischen Padel-Markt systematisch zu erfassen.
Der Auslöser: Jeder, der eine Padel-Halle plant, trifft eine Entscheidung im sechsstelligen Bereich — und hat dafür keine belastbaren Daten. Kein zentrales Marktbild. Keine vergleichbaren Kennzahlen. Nur Excel und Bauchgefühl.
Daraus ist Padelnomics entstanden: eine Datenplattform für die Padel-Branche.
Was heute live ist:
→ Kostenloser ROI-Rechner mit stadtspezifischen Realdaten
→ 80 Marktanalysen für Städte in 17 Ländern
→ Standortbewertung für 143.877 Orte in Europa
→ Anbieterverzeichnis für Bau und Ausstattung
Die Daten kommen aus OpenStreetMap, Playtomic, Eurostat und Zensusdaten — automatisch aggregiert und bewertet.
Noch am Anfang, aber der Datenvorsprung wächst jeden Tag.
→ https://padelnomics.io/de/?utm_source=linkedin&utm_medium=social&utm_campaign=launch&utm_content=li_gruenderstory
#padel #startup #datenplattform #sportbusiness
```
---
## Facebook — Padel-Gruppen (Deutschland/DACH)
> Ton: locker, hilfsbereit, kurz. Kein Pitch.
**Titel (falls die Gruppe Titel erlaubt):** Kostenloser Padel-Rechner mit echten Marktdaten
```
Moin zusammen,
ich hab einen kostenlosen Finanzplanungs-Rechner für Padel-Anlagen gebaut. CAPEX, laufende Kosten, Umsatzprognose — und am Ende eine 5-Jahres-GuV mit Amortisation.
Der Unterschied zu den üblichen Excel-Vorlagen: Der Rechner befüllt sich automatisch mit echten Daten für euren Standort. Mieten, Nebenkosten, Genehmigungsgebühren — alles stadtspezifisch, basierend auf Daten aus 17 Ländern.
Keine Anmeldung, kostenlos.
→ https://padelnomics.io/de/planner/?utm_source=facebook&utm_medium=social&utm_campaign=launch&utm_content=fb_padel_de
Feedback ist willkommen — gerade von Leuten, die den Planungsprozess schon hinter sich haben und wissen, welche Zahlen wirklich zählen.
```
---
## Facebook — Tennisvereine / Sportvereine (DACH)
> Ziel: Tennisvereine, die über Padel-Courts nachdenken.
```
Falls euer Verein gerade über Padel-Courts nachdenkt (und viele tun das): Ich hab ein kostenloses Tool gebaut, das die Wirtschaftlichkeit durchrechnet.
→ Investitionskosten für 2–6 Courts an bestehenden Anlagen
→ Umsatzprognose auf Basis realer Auslastungs- und Preisdaten
→ Laufende Kosten für euren konkreten Standort
→ Amortisation und ROI-Kennzahlen
Ein paar Zahlen aus unseren Daten:
- Durchschnittliche Auslastung in reifen Märkten: 60–75 %
- Outdoor-Anlage mit 4 Courts: 200.000–350.000 €
- Indoor: 700.000–3 Mio. € je nach Bauweise
- Tennisvereine, die 2 Plätze umrüsten, erreichen die Amortisation typischerweise nach 18–30 Monaten
Keine Anmeldung nötig.
→ https://padelnomics.io/de/planner/?utm_source=facebook&utm_medium=social&utm_campaign=launch&utm_content=fb_tennis_de
Kann gern Daten zu einzelnen Städten oder Regionen teilen, wenn ihr etwas Konkretes prüft.
```
---
## Posting-Zeitplan
| Tag | Plattform | Post |
|-----|-----------|------|
| Heute | LinkedIn (Company Page) | Post #1 (Marktdaten) |
| Heute | 1–2 deutsche FB-Padel-Gruppen | Padel-Rechner |
| Morgen | 1–2 FB-Tennisvereins-Gruppen | Tennisverein-Angle |
| Tag 3 | LinkedIn (Company Page) | Post #2 (Standortanalyse) |
| Woche 2 | LinkedIn (Company Page) | Post #3 (Gründerstory) |
---
## Regeln
- Ein Link pro Post, am Ende.
- In den ersten 24 Stunden nach dem Posten auf jeden Kommentar reagieren.
- Wenn ein Post Traktion bekommt: mit zusätzlichen Datenpunkten nachliefern.
- UTM-Tracking: `?utm_source=linkedin&utm_medium=social&utm_campaign=launch&utm_content=li_marktdaten` bzw. `utm_source=facebook` für FB-Posts.

docs/social-posts.md Normal file

@@ -0,0 +1,248 @@
# Social Posts — Launch Day
> Ready to copy-paste. Domain: padelnomics.io
> Created: 2026-03-04.
---
## LinkedIn Post #1 — Data Insight
> Post type: data-driven thought leadership. Goal: establish credibility + drive traffic to planner.
```
We've been tracking 10,127 padel facilities across 17 countries.
Here's what surprised me about the European market:
→ Italy leads with 3,069 facilities — more than Spain (2,241)
→ Portugal has the world's most mature padel market (45.2/100 maturity score) with "only" 506 facilities
→ Germany has just 359 facilities for 84M people. Spain has 2,241 for 47M.
That gap is the opportunity.
We identified 15,390 high-potential locations with zero padel courts worldwide.
Hamburg, Munich, and Frankfurt top the list in Germany alone.
If you're thinking about opening a padel facility — or advising someone who is — we built a free ROI calculator that uses this data to model costs, revenue, and payback period for any city in Europe.
No signup required. Just real numbers.
→ https://padelnomics.io/en/planner/?utm_source=linkedin&utm_medium=social&utm_campaign=launch&utm_content=li_marketdata
#padel #sportsbusiness #marketdata #entrepreneurship
```
---
## LinkedIn Post #2 — Opportunity Angle (schedule for Day 2–3)
```
The 5 most underserved cities for padel in Europe right now:
1. Hamburg (1.85M residents, zero dedicated padel facilities)
2. Munich (1.26M residents, massive sports culture, minimal supply)
3. Bergen, Norway (294K residents, opportunity score: 87.5/100)
4. Graz, Austria (303K residents, zero courts, high income)
5. Geneva, Switzerland (202K residents, zero courts, highest purchasing power)
These aren't guesses. We score 143,877 locations across Europe using population density, income data, existing supply, and sports infrastructure.
The padel market is growing from 25K to 50K+ facilities globally. The question isn't whether — it's where.
→ Explore the data for your city: https://padelnomics.io/en/markets?utm_source=linkedin&utm_medium=social&utm_campaign=launch&utm_content=li_opportunity
#padel #marketintelligence #sportsinvestment #realestate
```
---
## Reddit — r/padel
> Tone: genuinely helpful, not promotional. r/padel is a player community, so lead with the sport angle.
**Title:** I built a free padel court ROI calculator — feedback welcome
```
Hey r/padel,
I've been working on a data project tracking the padel market across Europe
(facility counts, market maturity, opportunity gaps). As part of that, I built
a free calculator for anyone thinking about opening a padel facility.
It models:
- CAPEX (construction, equipment, permits)
- OPEX (rent, staffing, utilities, maintenance)
- Revenue projections based on real market data from your city
- 5-year P&L with payback period, IRR, and break-even
It pre-fills with city-specific defaults — so if you pick Munich, it uses
Munich rents, Munich utility costs, etc. Not generic averages.
No signup needed. Just wanted to share in case anyone here has ever thought
about the business side of padel.
→ https://padelnomics.io/en/planner/?utm_source=reddit&utm_medium=social&utm_campaign=launch&utm_content=r_padel
Happy to answer questions about the data or methodology. Also open to feedback
on what would make this more useful.
```
---
## Reddit — r/entrepreneur
> Tone: indie builder sharing a project. r/entrepreneur loves "I built X" posts with real data.
**Title:** I'm building the "Bloomberg for padel" — tracking 10,127 facilities across 17 countries
```
Padel is the fastest-growing sport in Europe and Latin America. There are now
10,000+ facilities worldwide and the market is expected to double to 50K+ in
the next 5 years.
The problem: anyone trying to open a padel facility is flying blind. No
centralized market data exists. People are making €200K–€2M investment
decisions based on Excel spreadsheets and gut feel.
I'm building Padelnomics — a data intelligence platform for the padel industry.
Think "Kpler for padel" if you're familiar with commodity data platforms.
What's live right now:
- Free ROI calculator that models costs, revenue, and payback for any European
city (pre-filled with real local data — rents, utilities, permits, etc.)
- 80 market analysis pages covering cities across 17 countries
- Market maturity scoring for 4,686 cities with padel facilities
- Opportunity scoring for 143,877 locations (identifying where to build next)
The data comes from OpenStreetMap, Playtomic (booking platform), Eurostat, and
census data — aggregated and scored automatically.
Revenue model: free calculator captures leads (aspiring facility owners) →
supplier directory connects them with builders → suppliers pay for qualified
leads via credit system.
Still early but the data moat compounds daily — every day of scraping = data
competitors can't replicate.
Would love feedback from anyone who's built data products or two-sided
marketplaces.
→ https://padelnomics.io/en/?utm_source=reddit&utm_medium=social&utm_campaign=launch&utm_content=r_entrepreneur
```
---
## Reddit — r/smallbusiness
> Tone: practical tool for a real business decision.
**Title:** Free business planning tool for anyone looking at opening a sports facility
```
I built a free financial planning tool specifically for padel facilities
(indoor/outdoor sports courts — fastest growing sport in Europe right now).
It covers the full picture:
- Construction costs (indoor vs outdoor, number of courts)
- Operating expenses (rent, staff, utilities, insurance, maintenance)
- Revenue modeling (hourly rates, occupancy rates, lessons, events)
- 5-year P&L projection
- Key metrics: payback period, IRR, break-even point
The tool pre-fills with real data for your city — actual local rents, utility
costs, permit fees — not generic averages.
You can also generate a bank-ready business plan PDF from it.
Free to use, no signup required for the calculator itself.
→ https://padelnomics.io/en/planner/?utm_source=reddit&utm_medium=social&utm_campaign=launch&utm_content=r_smallbusiness
Built this because I kept seeing people on forums asking "how much does it cost
to open a padel hall?" and getting wildly different answers. Figured real data
was better than guesswork.
```
---
## Reddit — r/tennis
> Tone: cross-sport angle. Many tennis clubs are adding padel courts.
**Title:** Data on padel facility economics — useful for tennis clubs considering adding courts
```
If your club is thinking about adding padel courts (and many are right now),
I built a free financial planning tool that models the full economics:
- CAPEX for adding 2–6 courts to an existing facility
- Revenue projections based on real occupancy and pricing data
- Operating costs specific to your city/country
- Payback period and ROI metrics
The tool uses actual market data — we track 10,127 padel facilities across
17 countries and score market maturity + opportunity by city.
Some interesting numbers:
- Average padel facility in a mature market runs at 60–75% occupancy
- A 4-court outdoor setup costs €200K–€350K
- Indoor builds jump to €700K–€3M depending on structure
- Tennis clubs converting 2 courts to padel typically see payback in 18–30 months
Free to use, no signup needed.
→ https://padelnomics.io/en/planner/?utm_source=reddit&utm_medium=social&utm_campaign=launch&utm_content=r_tennis
Happy to share data on any specific city or country if you're evaluating this
for your club.
```
---
## Facebook Groups — Padel Business / Deutschland
> Tone: casual, helpful. Shorter than Reddit posts.
**Title (if group allows):** Free padel facility ROI calculator — uses real market data
```
Hey everyone 👋
Built a free tool for anyone planning a padel facility. It models CAPEX,
OPEX, revenue, and gives you a 5-year P&L with payback period.
The difference from spreadsheet templates: it pre-fills with real data for
your city (actual rents, utility costs, permit fees, etc.) based on data
we're collecting across 17 countries.
No signup, no cost. Just real numbers.
→ https://padelnomics.io/en/planner/?utm_source=facebook&utm_medium=social&utm_campaign=launch&utm_content=fb_padel
Feedback welcome — especially from anyone who's been through the planning
process and knows what numbers actually matter.
```
---
## Posting Schedule
| Day | Platform | Post |
|-----|----------|------|
| Today | LinkedIn | Post #1 (Data Insight) |
| Today | r/padel | Calculator feedback post |
| Today | r/entrepreneur | "Bloomberg for padel" builder post |
| Today | 1–2 FB groups | Calculator share |
| Tomorrow | r/smallbusiness | Business planning tool post |
| Tomorrow | r/tennis | Tennis club angle |
| Day 3 | LinkedIn | Post #2 (Opportunity Angle) |
---
## Rules
- Never link-spam. One link per post, at the end.
- Engage with every comment for 24 hours after posting.
- If a post gets traction, reply with additional data points to keep it alive.
- Track which subreddits/groups drive actual signups via UTM params:
`?utm_source=reddit&utm_medium=social&utm_campaign=launch&utm_content=r_padel`


@@ -63,15 +63,15 @@ DATASETS: dict[str, dict] = {
         "time_dim": "time",
     },
     "nrg_pc_203": {
-        # Gas prices for non-household consumers, EUR/GJ, excl. taxes
-        "filters": {"freq": "S", "nrg_cons": "GJ1000-9999", "currency": "EUR", "tax": "I_TAX"},
+        # Gas prices for non-household consumers, EUR/kWh, excl. taxes
+        "filters": {"freq": "S", "nrg_cons": "GJ1000-9999", "unit": "KWH", "currency": "EUR", "tax": "I_TAX"},
         "geo_dim": "geo",
         "time_dim": "time",
     },
     "lc_lci_lev": {
         # Labour cost levels EUR/hour — NACE N (administrative/support services)
-        # Stored in dim_countries for future staffed-scenario calculations.
-        "filters": {"lcstruct": "D1_D2_A_HW", "nace_r2": "N", "currency": "EUR"},
+        # D1_D4_MD5 = compensation of employees + taxes - subsidies (total labour cost)
+        "filters": {"lcstruct": "D1_D4_MD5", "nace_r2": "N", "unit": "EUR"},
         "geo_dim": "geo",
         "time_dim": "time",
     },
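For reference, a filters dict like the ones above maps onto the public Eurostat dissemination API as plain query parameters, one per dimension. A sketch of the URL construction (whether the extractor builds requests exactly this way is an assumption; only the endpoint itself is Eurostat's documented one):

```python
from urllib.parse import urlencode

# Public Eurostat dissemination endpoint. Each filter dimension becomes a
# query parameter; the actual extractor code is not shown in this diff.
BASE = "https://ec.europa.eu/eurostat/api/dissemination/statistics/1.0/data"

def build_url(dataset: str, filters: dict[str, str]) -> str:
    """Build a JSON-format request URL for one dataset with explicit filters."""
    params = {"format": "JSON", "lang": "EN", **filters}
    return f"{BASE}/{dataset}?{urlencode(params)}"
```

For example, `build_url("lc_lci_lev", {"lcstruct": "D1_D4_MD5", "nace_r2": "N", "unit": "EUR"})` produces a URL with `lcstruct=D1_D4_MD5` as an explicit parameter, which is exactly the dimension value this commit corrects.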


@@ -33,10 +33,10 @@ do
     DUCKDB_PATH="${DUCKDB_PATH:-/data/padelnomics/lakehouse.duckdb}" \
       uv run --package padelnomics_extract extract
-    # Transform
+    # Transform — plan detects new/changed models; run only executes existing plans.
     LANDING_DIR="${LANDING_DIR:-/data/padelnomics/landing}" \
     DUCKDB_PATH="${DUCKDB_PATH:-/data/padelnomics/lakehouse.duckdb}" \
-      uv run --package sqlmesh_padelnomics sqlmesh run --select-model "serving.*"
+      uv run sqlmesh -p transform/sqlmesh_padelnomics plan prod --auto-apply
     # Export serving tables to analytics.duckdb (atomic swap).
     # The web app detects the inode change on next query — no restart needed.


@@ -8,54 +8,67 @@
 # entry       — optional: function name if not "main" (default: "main")
 # depends_on  — optional: list of workflow names that must run first
 # proxy_mode  — optional: "round-robin" (default) or "sticky"
+# description — optional: human-readable one-liner shown in the admin UI

 [overpass]
 module = "padelnomics_extract.overpass"
 schedule = "monthly"
+description = "Padel court locations from OpenStreetMap via Overpass API"

 [overpass_tennis]
 module = "padelnomics_extract.overpass_tennis"
 schedule = "monthly"
+description = "Tennis court locations from OpenStreetMap via Overpass API"

 [eurostat]
 module = "padelnomics_extract.eurostat"
 schedule = "monthly"
+description = "City population data from Eurostat Urban Audit"

 [geonames]
 module = "padelnomics_extract.geonames"
 schedule = "monthly"
+description = "Global city/town gazetteer from GeoNames (pop >= 1K)"

 [playtomic_tenants]
 module = "padelnomics_extract.playtomic_tenants"
 schedule = "daily"
+description = "Padel venue directory from Playtomic (names, locations, courts)"

 [playtomic_availability]
 module = "padelnomics_extract.playtomic_availability"
 schedule = "daily"
 depends_on = ["playtomic_tenants"]
+description = "Morning availability snapshots — slot-level pricing per venue"

 [playtomic_recheck]
 module = "padelnomics_extract.playtomic_availability"
 entry = "main_recheck"
 schedule = "0,30 6-23 * * *"
 depends_on = ["playtomic_availability"]
+description = "Intraday availability rechecks for occupancy tracking"

 [census_usa]
 module = "padelnomics_extract.census_usa"
 schedule = "monthly"
+description = "US city/place population from Census Bureau ACS"

 [census_usa_income]
 module = "padelnomics_extract.census_usa_income"
 schedule = "monthly"
+description = "US county median household income from Census Bureau ACS"

 [eurostat_city_labels]
 module = "padelnomics_extract.eurostat_city_labels"
 schedule = "monthly"
+description = "City code-to-name mapping for Eurostat Urban Audit cities"

 [ons_uk]
 module = "padelnomics_extract.ons_uk"
 schedule = "monthly"
+description = "UK local authority population estimates from ONS"

 [gisco]
 module = "padelnomics_extract.gisco"
 schedule = "monthly"
+description = "EU geographic boundaries (NUTS2 polygons) from Eurostat GISCO"

scripts/check_pipeline.py Normal file

@@ -0,0 +1,290 @@
"""
Diagnostic script: check row counts at every layer of the pricing pipeline.

Run on prod via SSH:
    DUCKDB_PATH=/opt/padelnomics/data/lakehouse.duckdb uv run python scripts/check_pipeline.py

Or locally:
    DUCKDB_PATH=data/lakehouse.duckdb uv run python scripts/check_pipeline.py

Read-only — never writes to the database.

Handles the DuckDB catalog naming quirk: when the file is named lakehouse.duckdb,
the catalog is "lakehouse" not "local". SQLMesh views may reference the wrong catalog,
so we fall back to querying physical tables (sqlmesh__<schema>.<table>__<hash>).
"""

import os
import sys

import duckdb

DUCKDB_PATH = os.environ.get("DUCKDB_PATH", "data/lakehouse.duckdb")

PIPELINE_TABLES = [
    ("staging", "stg_playtomic_availability"),
    ("foundation", "fct_availability_slot"),
    ("foundation", "dim_venue_capacity"),
    ("foundation", "fct_daily_availability"),
    ("serving", "venue_pricing_benchmarks"),
    ("serving", "pseo_city_pricing"),
]


def _use_catalog(con):
    """Detect and USE the database catalog so schema-qualified queries work."""
    catalogs = [
        row[0]
        for row in con.execute(
            "SELECT catalog_name FROM information_schema.schemata"
        ).fetchall()
    ]
    # Pick the non-system catalog (not 'system', 'temp', 'memory')
    user_catalogs = [c for c in set(catalogs) if c not in ("system", "temp", "memory")]
    if user_catalogs:
        catalog = user_catalogs[0]
        con.execute(f"USE {catalog}")
        return catalog
    return None


def _find_physical_table(con, schema, table):
    """Find the SQLMesh physical table name for a logical table.

    SQLMesh stores physical tables as:
        sqlmesh__<schema>.<schema>__<table>__<hash>
    """
    sqlmesh_schema = f"sqlmesh__{schema}"
    try:
        rows = con.execute(
            "SELECT table_schema, table_name "
            "FROM information_schema.tables "
            f"WHERE table_schema = '{sqlmesh_schema}' "
            f"AND table_name LIKE '{schema}__{table}%' "
            "ORDER BY table_name "
            "LIMIT 1"
        ).fetchall()
        if rows:
            return f"{rows[0][0]}.{rows[0][1]}"
    except Exception:
        pass
    return None


def _query_table(con, schema, table):
    """Try logical view first, fall back to physical table. Returns (fqn, count) or (fqn, error_str)."""
    logical = f"{schema}.{table}"
    try:
        (count,) = con.execute(f"SELECT COUNT(*) FROM {logical}").fetchone()
        return logical, count
    except Exception:
        pass
    physical = _find_physical_table(con, schema, table)
    if physical:
        try:
            (count,) = con.execute(f"SELECT COUNT(*) FROM {physical}").fetchone()
            return f"{physical} (physical)", count
        except Exception as e:
            return f"{physical} (physical)", f"ERROR: {e}"
    return logical, "ERROR: view broken, no physical table found"


def _query_sql(con, sql, schema_tables):
    """Execute SQL, falling back to rewritten SQL using physical table names if views fail.

    schema_tables: list of (schema, table) tuples used in the SQL, in order of appearance.
    The SQL must use {schema}.{table} format for these references.
    """
    try:
        return con.execute(sql)
    except Exception:
        # Rewrite SQL to use physical table names
        rewritten = sql
        for schema, table in schema_tables:
            physical = _find_physical_table(con, schema, table)
            if physical:
                rewritten = rewritten.replace(f"{schema}.{table}", physical)
            else:
                raise
        return con.execute(rewritten)


def main():
    if not os.path.exists(DUCKDB_PATH):
        print(f"ERROR: {DUCKDB_PATH} not found")
        sys.exit(1)

    con = duckdb.connect(DUCKDB_PATH, read_only=True)
    print(f"Database: {DUCKDB_PATH}")
    print(f"DuckDB version: {con.execute('SELECT version()').fetchone()[0]}")
    catalog = _use_catalog(con)
    if catalog:
        print(f"Catalog: {catalog}")
    print()

    # ── Row counts at each layer ──────────────────────────────────────────
    print("=" * 60)
    print("PIPELINE ROW COUNTS")
    print("=" * 60)
    for schema, table in PIPELINE_TABLES:
        fqn, result = _query_table(con, schema, table)
        if isinstance(result, int):
            print(f" {fqn:55s} {result:>10,} rows")
        else:
            print(f" {fqn:55s} {result}")

    # ── Date range in fct_daily_availability ──────────────────────────────
    print()
    print("=" * 60)
    print("DATE RANGE: fct_daily_availability")
    print("=" * 60)
    try:
        row = _query_sql(
            con,
            """
            SELECT
                MIN(snapshot_date) AS min_date,
                MAX(snapshot_date) AS max_date,
                COUNT(DISTINCT snapshot_date) AS distinct_days,
                CURRENT_DATE AS today,
                CURRENT_DATE - INTERVAL '30 days' AS window_start
            FROM foundation.fct_daily_availability
            """,
            [("foundation", "fct_daily_availability")],
        ).fetchone()
        if row:
            min_date, max_date, days, today, window_start = row
            print(f" Min snapshot_date: {min_date}")
            print(f" Max snapshot_date: {max_date}")
            print(f" Distinct days: {days}")
            print(f" Today: {today}")
            print(f" 30-day window start: {window_start}")
            if max_date and str(max_date) < str(window_start):
                print()
                print(" *** ALL DATA IS OUTSIDE THE 30-DAY WINDOW ***")
                print(" This is why venue_pricing_benchmarks is empty.")
    except Exception as e:
        print(f" ERROR: {e}")

    # ── HAVING filter impact in venue_pricing_benchmarks ──────────────────
    print()
    print("=" * 60)
    print("HAVING FILTER IMPACT (venue_pricing_benchmarks)")
    print("=" * 60)
    try:
        row = _query_sql(
            con,
            """
            WITH venue_stats AS (
                SELECT
                    da.tenant_id,
                    da.country_code,
                    da.city,
                    COUNT(DISTINCT da.snapshot_date) AS days_observed
                FROM foundation.fct_daily_availability da
                WHERE TRY_CAST(da.snapshot_date AS DATE) >= CURRENT_DATE - INTERVAL '30 days'
                  AND da.occupancy_rate IS NOT NULL
                  AND da.occupancy_rate BETWEEN 0 AND 1.5
                GROUP BY da.tenant_id, da.country_code, da.city
            )
            SELECT
                COUNT(*) AS total_venues,
                COUNT(*) FILTER (WHERE days_observed >= 3) AS venues_passing_having,
                COUNT(*) FILTER (WHERE days_observed < 3) AS venues_failing_having,
                MAX(days_observed) AS max_days,
                MIN(days_observed) AS min_days
            FROM venue_stats
            """,
            [("foundation", "fct_daily_availability")],
        ).fetchone()
        if row:
            total, passing, failing, max_d, min_d = row
            print(f" Venues in 30-day window: {total}")
            print(f" Venues with >= 3 days (PASSING): {passing}")
            print(f" Venues with < 3 days (FILTERED): {failing}")
            print(f" Max days observed: {max_d}")
            print(f" Min days observed: {min_d}")
            if total == 0:
                print()
                print(" *** NO VENUES IN 30-DAY WINDOW — check fct_daily_availability dates ***")
    except Exception as e:
        print(f" ERROR: {e}")

    # ── Occupancy rate distribution ───────────────────────────────────────
    print()
    print("=" * 60)
    print("OCCUPANCY RATE DISTRIBUTION (fct_daily_availability)")
    print("=" * 60)
    try:
        rows = _query_sql(
            con,
            """
            SELECT
                CASE
                    WHEN occupancy_rate IS NULL THEN 'NULL'
                    WHEN occupancy_rate < 0 THEN '< 0 (invalid)'
                    WHEN occupancy_rate > 1.5 THEN '> 1.5 (filtered)'
                    WHEN occupancy_rate <= 0.25 THEN '0 – 0.25'
                    WHEN occupancy_rate <= 0.50 THEN '0.25 – 0.50'
                    WHEN occupancy_rate <= 0.75 THEN '0.50 – 0.75'
                    ELSE '0.75 – 1.0+'
                END AS bucket,
                COUNT(*) AS cnt
            FROM foundation.fct_daily_availability
            GROUP BY 1
            ORDER BY 1
            """,
            [("foundation", "fct_daily_availability")],
        ).fetchall()
        for bucket, cnt in rows:
            print(f" {bucket:25s} {cnt:>10,}")
    except Exception as e:
        print(f" ERROR: {e}")

    # ── dim_venue_capacity join coverage ──────────────────────────────────
    print()
    print("=" * 60)
    print("JOIN COVERAGE: fct_availability_slot → dim_venue_capacity")
    print("=" * 60)
    try:
        row = _query_sql(
            con,
            """
            SELECT
                COUNT(DISTINCT a.tenant_id) AS slot_tenants,
                COUNT(DISTINCT c.tenant_id) AS capacity_tenants,
                COUNT(DISTINCT a.tenant_id) - COUNT(DISTINCT c.tenant_id) AS missing_capacity
            FROM foundation.fct_availability_slot a
            LEFT JOIN foundation.dim_venue_capacity c ON a.tenant_id = c.tenant_id
            """,
            [
                ("foundation", "fct_availability_slot"),
                ("foundation", "dim_venue_capacity"),
            ],
        ).fetchone()
        if row:
            slot_t, cap_t, missing = row
            print(f" Tenants in fct_availability_slot: {slot_t}")
            print(f" Tenants with capacity match: {cap_t}")
            print(f" Tenants missing capacity: {missing}")
            if missing and missing > 0:
                print(f" *** {missing} tenants dropped by INNER JOIN to dim_venue_capacity ***")
    except Exception as e:
        print(f" ERROR: {e}")

    con.close()
    print()
    print("Done.")


if __name__ == "__main__":
    main()


@@ -0,0 +1,553 @@
"""
E2E test for checkout.session.completed webhook → transaction.completed handler.
Tests credit packs, sticky boosts, and business plan PDF purchases by:
1. Constructing realistic checkout.session.completed payloads with our real price IDs
2. Signing them with the active webhook secret
3. POSTing to the running dev server
4. Verifying DB state changes (credit_balance, supplier_boosts, business_plan_exports)
Prerequisites:
- ngrok + webhook endpoint registered (stripe_e2e_setup.py)
- Dev server running with webhook secret loaded
- Stripe products synced (setup_stripe --sync)
Run: uv run python scripts/stripe_e2e_checkout_test.py
"""
import hashlib
import hmac
import json
import os
import sqlite3
import subprocess
import sys
import time
from dotenv import load_dotenv
load_dotenv(override=True)
DATABASE_PATH = os.getenv("DATABASE_PATH", "data/app.db")
WEBHOOK_SECRET = os.getenv("STRIPE_WEBHOOK_SECRET", "")
SERVER_URL = "http://localhost:5000"
WEBHOOK_URL = f"{SERVER_URL}/billing/webhook/stripe"
assert WEBHOOK_SECRET, "STRIPE_WEBHOOK_SECRET not set — run stripe_e2e_setup.py"
passed = 0
failed = 0
errors = []
def ok(msg):
global passed
passed += 1
print(f" \u2713 {msg}")
def fail(msg):
global failed
failed += 1
errors.append(msg)
print(f" \u2717 {msg}")
def section(title):
print(f"\n{'' * 60}")
print(f" {title}")
print(f"{'' * 60}")
def query_db(sql, params=()):
conn = sqlite3.connect(f"file:{DATABASE_PATH}?mode=ro", uri=True)
conn.row_factory = sqlite3.Row
try:
return [dict(r) for r in conn.execute(sql, params).fetchall()]
finally:
conn.close()
def sign_stripe_payload(payload_bytes: bytes, secret: str) -> str:
"""Create a valid Stripe-Signature header."""
timestamp = str(int(time.time()))
signed_payload = f"{timestamp}.{payload_bytes.decode()}"
sig = hmac.new(
secret.encode(), signed_payload.encode(), hashlib.sha256
).hexdigest()
return f"t={timestamp},v1={sig}"
def post_webhook(event_type: str, obj: dict) -> int:
"""Post a signed webhook to the server. Returns HTTP status code."""
payload = json.dumps({
"id": f"evt_test_{int(time.time()*1000)}",
"type": event_type,
"data": {"object": obj},
}).encode()
sig = sign_stripe_payload(payload, WEBHOOK_SECRET)
result = subprocess.run(
["curl", "-s", "-o", "/dev/null", "-w", "%{http_code}",
"-X", "POST",
"-H", "Content-Type: application/json",
"-H", f"Stripe-Signature: {sig}",
"--data-binary", "@-",
WEBHOOK_URL],
input=payload.decode(), capture_output=True, text=True, timeout=10,
)
return int(result.stdout.strip())
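# post_webhook shells out to curl; the same request could be assembled with
# the stdlib instead (sketch only, not wired into the tests below; the body
# and signature values here are illustrative):
import urllib.request
_demo_req = urllib.request.Request(
"http://localhost:5000/billing/webhook/stripe",
data=b'{"type": "demo"}',
headers={"Content-Type": "application/json", "Stripe-Signature": "t=0,v1=0"},
method="POST",
)
assert _demo_req.get_method() == "POST"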
# ─── Preflight ────────────────────────────────────────────
section("Preflight")
# Server up
result = subprocess.run(
["curl", "-s", "-o", "/dev/null", "-w", "%{http_code}", f"{SERVER_URL}/"],
capture_output=True, text=True, timeout=5,
)
assert result.stdout.strip() in ("200", "301"), f"Server down ({result.stdout})"
ok("Dev server running")
# Webhook active
result = subprocess.run(
["curl", "-s", "-o", "/dev/null", "-w", "%{http_code}",
"-X", "POST", "-H", "Content-Type: application/json", "-d", "{}",
WEBHOOK_URL],
capture_output=True, text=True, timeout=5,
)
assert result.stdout.strip() == "400", f"Webhook returns {result.stdout} (expected 400)"
ok("Webhook signature check active")
# Load price IDs
products = query_db("SELECT key, provider_price_id FROM payment_products WHERE provider = 'stripe'")
price_map = {p["key"]: p["provider_price_id"] for p in products}
ok(f"Loaded {len(price_map)} products")
# Test data
users = query_db("SELECT id, email FROM users LIMIT 5")
assert users, "No users found in database"
test_user = users[0]
ok(f"User: {test_user['email']} (id={test_user['id']})")
suppliers = query_db("SELECT id, name, credit_balance FROM suppliers WHERE claimed_by IS NOT NULL LIMIT 1")
assert suppliers, "No claimed supplier found"
test_supplier = suppliers[0]
initial_balance = test_supplier["credit_balance"]
ok(f"Supplier: {test_supplier['name']} (id={test_supplier['id']}, balance={initial_balance})")
# ═══════════════════════════════════════════════════════════
# Test 1: Credit Pack purchases (all 4 sizes)
# ═══════════════════════════════════════════════════════════
section("1. Credit Pack purchases via checkout.session.completed")
credit_packs = [
("credits_25", 25),
("credits_50", 50),
("credits_100", 100),
("credits_250", 250),
]
running_balance = initial_balance
for key, amount in credit_packs:
price_id = price_map.get(key)
if not price_id:
fail(f"{key}: price not found")
continue
status = post_webhook("checkout.session.completed", {
"id": f"cs_test_{key}_{int(time.time())}",
"mode": "payment",
"customer": "cus_test_credits",
"metadata": {
"user_id": str(test_user["id"]),
"supplier_id": str(test_supplier["id"]),
"plan": key,
},
"line_items": {"data": [{"price": {"id": price_id}, "quantity": 1}]},
})
if status == 200:
ok(f"{key}: webhook accepted (HTTP 200)")
else:
fail(f"{key}: webhook returned HTTP {status}")
continue
# Wait and check balance
time.sleep(2)
rows = query_db("SELECT credit_balance FROM suppliers WHERE id = ?", (test_supplier["id"],))
new_balance = rows[0]["credit_balance"] if rows else -1
expected = running_balance + amount
if new_balance == expected:
ok(f"{key}: balance {running_balance} → {new_balance} (+{amount})")
running_balance = new_balance
else:
fail(f"{key}: balance {new_balance}, expected {expected}")
running_balance = new_balance # update anyway for next test
# Check ledger entries
ledger = query_db(
"SELECT * FROM credit_ledger WHERE supplier_id = ? AND event_type = 'pack_purchase' ORDER BY id DESC LIMIT 4",
(test_supplier["id"],),
)
if len(ledger) >= 4:
ok(f"Credit ledger: {len(ledger)} pack_purchase entries")
else:
fail(f"Credit ledger: only {len(ledger)} entries (expected 4)")
# ═══════════════════════════════════════════════════════════
# Test 2: Sticky Boost purchases
# ═══════════════════════════════════════════════════════════
section("2. Sticky boost purchases")
# 2a. Sticky Week
price_id = price_map.get("boost_sticky_week")
if price_id:
status = post_webhook("checkout.session.completed", {
"id": f"cs_test_sticky_week_{int(time.time())}",
"mode": "payment",
"customer": "cus_test_sticky",
"metadata": {
"user_id": str(test_user["id"]),
"supplier_id": str(test_supplier["id"]),
"plan": "boost_sticky_week",
"sticky_country": "DE",
},
"line_items": {"data": [{"price": {"id": price_id}, "quantity": 1}]},
})
if status == 200:
ok("boost_sticky_week: webhook accepted")
else:
fail(f"boost_sticky_week: HTTP {status}")
time.sleep(2)
# Check supplier_boosts
boosts = query_db(
"SELECT * FROM supplier_boosts WHERE supplier_id = ? AND boost_type = 'sticky_week' ORDER BY id DESC LIMIT 1",
(test_supplier["id"],),
)
if boosts:
b = boosts[0]
ok(f"supplier_boosts row: type=sticky_week, status={b['status']}")
if b.get("expires_at"):
ok(f"expires_at set: {b['expires_at']}")
else:
fail("expires_at is NULL")
else:
fail("No supplier_boosts row for sticky_week")
# Check suppliers.sticky_until
sup = query_db("SELECT sticky_until, sticky_country FROM suppliers WHERE id = ?", (test_supplier["id"],))
if sup and sup[0]["sticky_until"]:
ok(f"sticky_until set: {sup[0]['sticky_until']}")
else:
fail("sticky_until not set")
if sup and sup[0]["sticky_country"] == "DE":
ok("sticky_country=DE")
else:
fail(f"sticky_country={sup[0]['sticky_country'] if sup else '?'}")
else:
fail("boost_sticky_week price not found")
# 2b. Sticky Month
price_id = price_map.get("boost_sticky_month")
if price_id:
# Reset sticky fields
conn = sqlite3.connect(DATABASE_PATH)
conn.execute("UPDATE suppliers SET sticky_until=NULL, sticky_country=NULL WHERE id=?", (test_supplier["id"],))
conn.commit()
conn.close()
status = post_webhook("checkout.session.completed", {
"id": f"cs_test_sticky_month_{int(time.time())}",
"mode": "payment",
"customer": "cus_test_sticky",
"metadata": {
"user_id": str(test_user["id"]),
"supplier_id": str(test_supplier["id"]),
"plan": "boost_sticky_month",
"sticky_country": "ES",
},
"line_items": {"data": [{"price": {"id": price_id}, "quantity": 1}]},
})
if status == 200:
ok("boost_sticky_month: webhook accepted")
else:
fail(f"boost_sticky_month: HTTP {status}")
time.sleep(2)
boosts = query_db(
"SELECT * FROM supplier_boosts WHERE supplier_id = ? AND boost_type = 'sticky_month' ORDER BY id DESC LIMIT 1",
(test_supplier["id"],),
)
if boosts:
ok(f"supplier_boosts row: type=sticky_month, expires_at={(boosts[0].get('expires_at') or '?')[:10]}")
else:
fail("No supplier_boosts row for sticky_month")
sup = query_db("SELECT sticky_until, sticky_country FROM suppliers WHERE id = ?", (test_supplier["id"],))
if sup and sup[0]["sticky_country"] == "ES":
ok("sticky_country=ES (month)")
else:
fail(f"sticky_country wrong: {sup[0] if sup else '?'}")
else:
fail("boost_sticky_month price not found")
# ═══════════════════════════════════════════════════════════
# Test 3: Business Plan PDF purchase
# ═══════════════════════════════════════════════════════════
section("3. Business Plan PDF purchase")
price_id = price_map.get("business_plan")
if price_id:
# Create a scenario for the user first
conn = sqlite3.connect(DATABASE_PATH)
conn.execute(
"INSERT INTO scenarios (user_id, name, state_json, created_at) VALUES (?, 'Test', '{}', datetime('now'))",
(test_user["id"],),
)
conn.commit()
scenario_row = conn.execute("SELECT id FROM scenarios WHERE user_id = ? ORDER BY id DESC LIMIT 1",
(test_user["id"],)).fetchone()
scenario_id = scenario_row[0] if scenario_row else 0
conn.close()
ok(f"Created test scenario: id={scenario_id}")
status = post_webhook("checkout.session.completed", {
"id": f"cs_test_bp_{int(time.time())}",
"mode": "payment",
"customer": "cus_test_bp",
"metadata": {
"user_id": str(test_user["id"]),
"plan": "business_plan",
"scenario_id": str(scenario_id),
"language": "de",
},
"line_items": {"data": [{"price": {"id": price_id}, "quantity": 1}]},
})
if status == 200:
ok("business_plan: webhook accepted")
else:
fail(f"business_plan: HTTP {status}")
time.sleep(2)
# Check business_plan_exports
exports = query_db(
"SELECT * FROM business_plan_exports WHERE user_id = ? ORDER BY id DESC LIMIT 1",
(test_user["id"],),
)
if exports:
e = exports[0]
ok(f"Export row: status={e['status']}, language={e['language']}")
if e["status"] == "pending":
ok("Status: pending (waiting for worker)")
else:
print(f" ? Status: {e['status']} (expected pending)")
if e["language"] == "de":
ok("Language: de")
else:
fail(f"Language: {e['language']} (expected de)")
if e.get("token"):
ok(f"Download token generated: {e['token'][:10]}...")
else:
fail("No download token")
if e.get("scenario_id") == scenario_id:
ok(f"Scenario ID matches: {scenario_id}")
else:
fail(f"Scenario ID: {e.get('scenario_id')} (expected {scenario_id})")
else:
fail("No business_plan_exports row created")
else:
fail("business_plan price not found")
# ═══════════════════════════════════════════════════════════
# Test 4: Edge cases
# ═══════════════════════════════════════════════════════════
section("4a. Edge: checkout.session.completed with unknown price_id")
status = post_webhook("checkout.session.completed", {
"id": "cs_test_unknown",
"mode": "payment",
"customer": "cus_test_unknown",
"metadata": {
"user_id": str(test_user["id"]),
"supplier_id": str(test_supplier["id"]),
"plan": "nonexistent_product",
},
"line_items": {"data": [{"price": {"id": "price_nonexistent"}, "quantity": 1}]},
})
ok(f"Unknown price: HTTP {status} (no crash)") if status == 200 else fail(f"Unknown price: HTTP {status}")
# Server alive?
result = subprocess.run(
["curl", "-s", "-o", "/dev/null", "-w", "%{http_code}", f"{SERVER_URL}/"],
capture_output=True, text=True, timeout=5,
)
ok("Server alive after unknown price") if result.stdout.strip() in ("200", "301") else fail("Server crashed!")
section("4b. Edge: checkout.session.completed with missing supplier_id (credit pack)")
balance_before = query_db("SELECT credit_balance FROM suppliers WHERE id = ?", (test_supplier["id"],))[0]["credit_balance"]
status = post_webhook("checkout.session.completed", {
"id": "cs_test_no_supplier",
"mode": "payment",
"customer": "cus_test_nosup",
"metadata": {
"user_id": str(test_user["id"]),
# NO supplier_id
"plan": "credits_25",
},
"line_items": {"data": [{"price": {"id": price_map["credits_25"]}, "quantity": 1}]},
})
ok(f"Missing supplier_id: HTTP {status} (no crash)") if status == 200 else fail(f"HTTP {status}")
time.sleep(1)
balance_after = query_db("SELECT credit_balance FROM suppliers WHERE id = ?", (test_supplier["id"],))[0]["credit_balance"]
if balance_after == balance_before:
ok("Balance unchanged (correctly skipped — no supplier_id)")
else:
fail(f"Balance changed: {balance_before} → {balance_after}")
section("4c. Edge: checkout.session.completed with missing metadata")
status = post_webhook("checkout.session.completed", {
"id": "cs_test_no_meta",
"mode": "payment",
"customer": "cus_test_nometa",
"metadata": {},
})
ok(f"Empty metadata: HTTP {status}") if status == 200 else fail(f"HTTP {status}")
result = subprocess.run(
["curl", "-s", "-o", "/dev/null", "-w", "%{http_code}", f"{SERVER_URL}/"],
capture_output=True, text=True, timeout=5,
)
ok("Server alive after empty metadata") if result.stdout.strip() in ("200", "301") else fail("Server crashed!")
section("4d. Edge: subscription mode checkout (not payment)")
# checkout.session.completed with mode=subscription should create a subscription
status = post_webhook("checkout.session.completed", {
"id": "cs_test_sub_mode",
"mode": "subscription",
"customer": "cus_test_submode",
"subscription": "sub_from_checkout_123",
"metadata": {
"user_id": str(test_user["id"]),
"plan": "starter",
},
})
ok(f"Subscription-mode checkout: HTTP {status}") if status == 200 else fail(f"HTTP {status}")
# Note: this fires subscription.activated, but since we can't mock the Stripe API call
# to fetch the subscription, it will log a warning and continue. That's fine.
section("4e. Edge: sticky boost without sticky_country in metadata")
price_id = price_map.get("boost_sticky_week")
if price_id:
# Reset sticky fields
conn = sqlite3.connect(DATABASE_PATH)
conn.execute("UPDATE suppliers SET sticky_until=NULL, sticky_country=NULL WHERE id=?", (test_supplier["id"],))
conn.commit()
conn.close()
status = post_webhook("checkout.session.completed", {
"id": f"cs_test_no_country_{int(time.time())}",
"mode": "payment",
"customer": "cus_test_nocountry",
"metadata": {
"user_id": str(test_user["id"]),
"supplier_id": str(test_supplier["id"]),
"plan": "boost_sticky_week",
# NO sticky_country
},
"line_items": {"data": [{"price": {"id": price_id}, "quantity": 1}]},
})
ok(f"Missing sticky_country: HTTP {status}") if status == 200 else fail(f"HTTP {status}")
time.sleep(2)
sup = query_db("SELECT sticky_until, sticky_country FROM suppliers WHERE id = ?", (test_supplier["id"],))
if sup and sup[0]["sticky_until"]:
ok(f"sticky_until still set (country defaults to empty: '{sup[0]['sticky_country']}')")
else:
fail("sticky boost not created without country")
# ═══════════════════════════════════════════════════════════
# Test 5: Use stripe trigger for a real checkout.session.completed
# ═══════════════════════════════════════════════════════════
section("5. stripe trigger checkout.session.completed (real Stripe event)")
print(" Triggering real checkout.session.completed via Stripe CLI...")
result = subprocess.run(
["stripe", "trigger", "checkout.session.completed"],
capture_output=True, text=True, timeout=30,
)
if result.returncode == 0:
ok("stripe trigger succeeded")
# Wait for webhook delivery via ngrok
time.sleep(5)
# Check ngrok for the delivery
import urllib.request
try:
resp = urllib.request.urlopen("http://localhost:4040/api/requests/http?limit=5", timeout=5)
reqs = json.loads(resp.read())
recent_webhooks = [
r for r in reqs.get("requests", [])
if r.get("request", {}).get("uri") == "/billing/webhook/stripe"
]
if recent_webhooks:
latest = recent_webhooks[0]
http_status = latest.get("response", {}).get("status_code")
ok(f"Webhook delivered via ngrok: HTTP {http_status}")
else:
print(" (no webhook seen in ngrok — may have been delivered before log window)")
ok("stripe trigger completed (webhook delivery not verified)")
except Exception:
ok("stripe trigger completed (ngrok API unavailable for verification)")
else:
fail(f"stripe trigger failed: {result.stderr[:100]}")
# ═══════════════════════════════════════════════════════════
# Summary
# ═══════════════════════════════════════════════════════════
section("RESULTS")
total = passed + failed
print(f"\n {passed}/{total} passed, {failed} failed\n")
if errors:
print(" Failures:")
for err in errors:
print(f" - {err}")
print()
sys.exit(1 if failed else 0)

scripts/stripe_e2e_setup.py Normal file

@@ -0,0 +1,124 @@
"""
Step 1: Register a Stripe webhook endpoint via ngrok and update .env.
Run BEFORE starting the dev server:
1. Start ngrok: ngrok http 5000
2. Run this script: uv run python scripts/stripe_e2e_setup.py
3. Start dev server: make dev
4. Run E2E tests: uv run python scripts/stripe_e2e_test.py
To tear down afterward:
uv run python scripts/stripe_e2e_setup.py --teardown
"""
import json
import os
import re
import sys
import urllib.request
from dotenv import load_dotenv
load_dotenv()
import stripe
STRIPE_SECRET_KEY = os.getenv("STRIPE_SECRET_KEY", "") or os.getenv("STRIPE_API_PRIVATE_KEY", "")
if not STRIPE_SECRET_KEY:
print("ERROR: Set STRIPE_SECRET_KEY or STRIPE_API_PRIVATE_KEY in .env")
sys.exit(1)
stripe.api_key = STRIPE_SECRET_KEY
stripe.max_network_retries = 2
ENV_PATH = os.path.join(os.path.dirname(__file__), "..", ".env")
ENV_PATH = os.path.abspath(ENV_PATH)
WEBHOOK_PATH = "/billing/webhook/stripe"
NGROK_API = "http://localhost:4040/api/tunnels"
def _update_env(key, value):
"""Update a key in .env file."""
text = open(ENV_PATH).read()
pattern = rf"^{key}=.*$"
replacement = f"{key}={value}"
if re.search(pattern, text, re.MULTILINE):
text = re.sub(pattern, replacement, text, flags=re.MULTILINE)
else:
text = text.rstrip("\n") + f"\n{replacement}\n"
open(ENV_PATH, "w").write(text)
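# The regex update above, demonstrated in memory (key and values here are
# illustrative; no file I/O involved):
_demo_env = "STRIPE_WEBHOOK_SECRET=old\nOTHER=1\n"
_demo_updated = re.sub(
r"^STRIPE_WEBHOOK_SECRET=.*$", "STRIPE_WEBHOOK_SECRET=new", _demo_env, flags=re.MULTILINE
)
assert _demo_updated == "STRIPE_WEBHOOK_SECRET=new\nOTHER=1\n"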
def setup():
# Get ngrok tunnel URL
try:
resp = urllib.request.urlopen(NGROK_API, timeout=5)
tunnels = json.loads(resp.read())
tunnel_url = tunnels["tunnels"][0]["public_url"]
except Exception as e:
print(f"ERROR: ngrok not running: {e}")
print("Start ngrok first: ngrok http 5000")
sys.exit(1)
webhook_url = f"{tunnel_url}{WEBHOOK_PATH}"
print(f"ngrok tunnel: {tunnel_url}")
print(f"Webhook URL: {webhook_url}")
# Check for existing E2E webhook endpoint
existing_id = os.getenv("STRIPE_WEBHOOK_ENDPOINT_ID", "")
if existing_id:
try:
ep = stripe.WebhookEndpoint.retrieve(existing_id)
if ep.url == webhook_url and ep.status == "enabled":
print(f"\nEndpoint already exists and matches: {existing_id}")
print("Ready to test. Run: uv run python scripts/stripe_e2e_test.py")
return
# URL changed (new ngrok session), delete and recreate
print("Existing endpoint URL mismatch, recreating...")
stripe.WebhookEndpoint.delete(existing_id)
except stripe.InvalidRequestError:
pass # Already deleted
# Create webhook endpoint
endpoint = stripe.WebhookEndpoint.create(
url=webhook_url,
enabled_events=[
"checkout.session.completed",
"customer.subscription.created",
"customer.subscription.updated",
"customer.subscription.deleted",
"invoice.payment_failed",
],
)
print(f"\nCreated endpoint: {endpoint.id}")
print(f"Webhook secret: {endpoint.secret[:25]}...")
# Update .env
_update_env("STRIPE_WEBHOOK_SECRET", endpoint.secret)
_update_env("STRIPE_WEBHOOK_ENDPOINT_ID", endpoint.id)
print("\nUpdated .env with STRIPE_WEBHOOK_SECRET and STRIPE_WEBHOOK_ENDPOINT_ID")
print("\nNext steps:")
print(" 1. Restart dev server: make dev")
print(" 2. Run E2E tests: uv run python scripts/stripe_e2e_test.py")
def teardown():
endpoint_id = os.getenv("STRIPE_WEBHOOK_ENDPOINT_ID", "")
if endpoint_id:
try:
stripe.WebhookEndpoint.delete(endpoint_id)
print(f"Deleted webhook endpoint: {endpoint_id}")
except stripe.InvalidRequestError:
print(f"Endpoint {endpoint_id} already deleted")
_update_env("STRIPE_WEBHOOK_SECRET", "")
_update_env("STRIPE_WEBHOOK_ENDPOINT_ID", "")
print("Cleared .env webhook config")
if __name__ == "__main__":
if "--teardown" in sys.argv:
teardown()
else:
setup()

scripts/stripe_e2e_test.py Normal file

@@ -0,0 +1,727 @@
"""
Comprehensive Stripe E2E Tests — real webhooks via ngrok.
Tests every product type, subscription lifecycle, payment failures,
and edge cases against a running dev server with real Stripe webhooks.
Prerequisites:
1. ngrok http 5000
2. uv run python scripts/stripe_e2e_setup.py
3. make dev (or restart after setup)
4. uv run python scripts/stripe_e2e_test.py
"""
import os
import sqlite3
import subprocess
import sys
import time
from dotenv import load_dotenv
load_dotenv(override=True)
import stripe
STRIPE_SECRET_KEY = os.getenv("STRIPE_SECRET_KEY", "") or os.getenv("STRIPE_API_PRIVATE_KEY", "")
assert STRIPE_SECRET_KEY, "Set STRIPE_SECRET_KEY or STRIPE_API_PRIVATE_KEY in .env"
stripe.api_key = STRIPE_SECRET_KEY
stripe.max_network_retries = 2
DATABASE_PATH = os.getenv("DATABASE_PATH", "data/app.db")
MAX_WAIT_SECONDS = 20
POLL_SECONDS = 0.5
passed = 0
failed = 0
errors = []
cleanup_sub_ids = []
# ─── Helpers ──────────────────────────────────────────────
def ok(msg):
global passed
passed += 1
print(f" \u2713 {msg}")
def fail(msg):
global failed
failed += 1
errors.append(msg)
print(f" \u2717 {msg}")
def section(title):
print(f"\n{'─' * 60}")
print(f" {title}")
print(f"{'─' * 60}")
def query_db(sql, params=()):
conn = sqlite3.connect(f"file:{DATABASE_PATH}?mode=ro", uri=True)
conn.row_factory = sqlite3.Row
try:
return [dict(r) for r in conn.execute(sql, params).fetchall()]
finally:
conn.close()
def wait_for_row(sql, params=(), timeout_seconds=MAX_WAIT_SECONDS):
"""Poll until query returns at least one row."""
deadline = time.time() + timeout_seconds
while time.time() < deadline:
rows = query_db(sql, params)
if rows:
return rows
time.sleep(POLL_SECONDS)
return []
def wait_for_value(sql, params, column, expected, timeout_seconds=MAX_WAIT_SECONDS):
"""Poll until column == expected."""
deadline = time.time() + timeout_seconds
last = None
while time.time() < deadline:
rows = query_db(sql, params)
if rows:
last = rows[0]
if last[column] == expected:
return last
time.sleep(POLL_SECONDS)
return last
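# Both helpers above are instances of the same deadline-poll loop; the bare
# pattern in isolation (a sketch — names and timings are illustrative):
def _poll(fn, timeout_seconds=1.0, interval=0.05):
    deadline = time.time() + timeout_seconds
    while time.time() < deadline:
        value = fn()
        if value:
            return value
        time.sleep(interval)
    return None
assert _poll(lambda: 42) == 42
assert _poll(lambda: None, timeout_seconds=0.1) is None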
def get_or_create_customer(email, name):
existing = stripe.Customer.list(email=email, limit=1)
if existing.data:
return existing.data[0]
return stripe.Customer.create(email=email, name=name, metadata={"e2e": "true"})
_pm_cache = {}
def attach_pm(customer_id):
"""Create a fresh test Visa and attach it."""
if customer_id in _pm_cache:
return _pm_cache[customer_id]
pm = stripe.PaymentMethod.create(type="card", card={"token": "tok_visa"})
stripe.PaymentMethod.attach(pm.id, customer=customer_id)
stripe.Customer.modify(customer_id, invoice_settings={"default_payment_method": pm.id})
_pm_cache[customer_id] = pm.id
return pm.id
def create_sub(customer_id, price_id, metadata, pm_id):
"""Create subscription and track for cleanup."""
sub = stripe.Subscription.create(
customer=customer_id,
items=[{"price": price_id}],
metadata=metadata,
default_payment_method=pm_id,
)
cleanup_sub_ids.append(sub.id)
return sub
def cancel_sub(sub_id):
try:
stripe.Subscription.cancel(sub_id)
except stripe.InvalidRequestError:
pass
# ─── Preflight ────────────────────────────────────────────
section("Preflight")
# Dev server
result = subprocess.run(
["curl", "-s", "-o", "/dev/null", "-w", "%{http_code}", "http://localhost:5000/"],
capture_output=True, text=True, timeout=5,
)
assert result.stdout.strip() in ("200", "301", "302"), f"Dev server down (HTTP {result.stdout.strip()})"
ok("Dev server running")
# Webhook endpoint
endpoint_id = os.getenv("STRIPE_WEBHOOK_ENDPOINT_ID", "")
assert endpoint_id, "STRIPE_WEBHOOK_ENDPOINT_ID not set — run stripe_e2e_setup.py"
ep = stripe.WebhookEndpoint.retrieve(endpoint_id)
assert ep.status == "enabled", f"Endpoint status: {ep.status}"
ok(f"Webhook endpoint: {ep.url}")
# Webhook secret loaded in server
result = subprocess.run(
["curl", "-s", "-o", "/dev/null", "-w", "%{http_code}",
"-X", "POST", "-H", "Content-Type: application/json",
"-d", "{}", "http://localhost:5000/billing/webhook/stripe"],
capture_output=True, text=True, timeout=5,
)
assert result.stdout.strip() == "400", f"Webhook returns {result.stdout.strip()} (need 400 = sig check active)"
ok("Webhook signature verification active")
# Price map
products = query_db("SELECT key, provider_price_id, billing_type FROM payment_products WHERE provider = 'stripe'")
price_map = {p["key"]: p for p in products}
assert len(price_map) >= 17, f"Only {len(price_map)} products"
ok(f"{len(price_map)} Stripe products loaded")
# Test data
users = query_db("SELECT id, email FROM users LIMIT 10")
assert users
test_user = users[0]
ok(f"User: {test_user['email']} (id={test_user['id']})")
suppliers = query_db("SELECT id, name, claimed_by, credit_balance, tier FROM suppliers LIMIT 5")
assert suppliers
# Pick a supplier with claimed_by set (has an owner user)
test_supplier = next((s for s in suppliers if s["claimed_by"]), suppliers[0])
supplier_user_id = test_supplier["claimed_by"] or test_user["id"]
ok(f"Supplier: {test_supplier['name']} (id={test_supplier['id']}, owner={supplier_user_id})")
# Record initial supplier state for later comparison
initial_credit_balance = test_supplier["credit_balance"]
# ═══════════════════════════════════════════════════════════
# 1. PLANNER SUBSCRIPTIONS
# ═══════════════════════════════════════════════════════════
section("1a. Planner Starter — create → verify DB → cancel → verify cancelled")
cus_starter = get_or_create_customer("e2e-starter@sandbox.padelnomics.com", "E2E Starter")
pm_starter = attach_pm(cus_starter.id)
sub = create_sub(cus_starter.id, price_map["starter"]["provider_price_id"],
{"user_id": str(test_user["id"]), "plan": "starter"}, pm_starter)
ok(f"Created: {sub.id} (status={sub.status})")
rows = wait_for_row("SELECT * FROM subscriptions WHERE provider_subscription_id = ?", (sub.id,))
if rows:
r = rows[0]
ok(f"DB: plan={r['plan']}, status={r['status']}") if r["plan"] == "starter" and r["status"] == "active" else fail(f"DB: plan={r['plan']}, status={r['status']}")
if r.get("current_period_end"):
ok(f"period_end set: {r['current_period_end'][:10]}")
else:
fail("period_end is NULL")
else:
fail("Subscription NOT in DB")
# billing_customers
bc = query_db("SELECT * FROM billing_customers WHERE user_id = ?", (test_user["id"],))
ok("billing_customers created") if bc else fail("billing_customers NOT created")
# Cancel
cancel_sub(sub.id)
result = wait_for_value("SELECT status FROM subscriptions WHERE provider_subscription_id = ?",
(sub.id,), "status", "cancelled")
ok("Status → cancelled") if result and result["status"] == "cancelled" else fail(f"Status: {result['status'] if result else '?'}")
section("1b. Planner Pro — subscription lifecycle")
pro_user = users[1] if len(users) > 1 else users[0]
cus_pro = get_or_create_customer("e2e-pro@sandbox.padelnomics.com", "E2E Pro")
pm_pro = attach_pm(cus_pro.id)
sub = create_sub(cus_pro.id, price_map["pro"]["provider_price_id"],
{"user_id": str(pro_user["id"]), "plan": "pro"}, pm_pro)
ok(f"Created: {sub.id}")
rows = wait_for_row("SELECT * FROM subscriptions WHERE provider_subscription_id = ?", (sub.id,))
if rows and rows[0]["plan"] == "pro" and rows[0]["status"] == "active":
ok("DB: plan=pro, status=active")
else:
fail(f"DB: {rows[0] if rows else 'not found'}")
cancel_sub(sub.id)
ok("Cleaned up")
# ═══════════════════════════════════════════════════════════
# 2. SUPPLIER SUBSCRIPTIONS (all 4 variants)
# ═══════════════════════════════════════════════════════════
section("2a. Supplier Growth (monthly) — tier, credits, verified")
cus_sup = get_or_create_customer("e2e-supplier@sandbox.padelnomics.com", "E2E Supplier")
pm_sup = attach_pm(cus_sup.id)
sub = create_sub(cus_sup.id, price_map["supplier_growth"]["provider_price_id"], {
"user_id": str(supplier_user_id),
"supplier_id": str(test_supplier["id"]),
"plan": "supplier_growth",
}, pm_sup)
ok(f"Created: {sub.id}")
result = wait_for_value(
"SELECT tier, is_verified, monthly_credits, credit_balance FROM suppliers WHERE id = ?",
(test_supplier["id"],), "tier", "growth",
)
if result:
ok("tier=growth") if result["tier"] == "growth" else fail(f"tier={result['tier']}")
ok("is_verified=1") if result["is_verified"] == 1 else fail(f"is_verified={result['is_verified']}")
ok("monthly_credits=30") if result["monthly_credits"] == 30 else fail(f"monthly_credits={result['monthly_credits']}")
ok(f"credit_balance={result['credit_balance']}") if result["credit_balance"] >= 30 else fail(f"credit_balance={result['credit_balance']}")
else:
fail("Tier not updated")
# Check credit ledger entry was created
ledger = query_db(
"SELECT * FROM credit_ledger WHERE supplier_id = ? AND event_type = 'monthly_allocation' ORDER BY id DESC LIMIT 1",
(test_supplier["id"],),
)
ok("Credit ledger entry created") if ledger else fail("No credit ledger entry")
cancel_sub(sub.id)
ok("Cleaned up")
section("2b. Supplier Pro (monthly) — 100 credits")
# Reset supplier to the free tier first
query_conn = sqlite3.connect(DATABASE_PATH)
query_conn.execute("UPDATE suppliers SET tier='free', monthly_credits=0, credit_balance=0, is_verified=0 WHERE id=?",
(test_supplier["id"],))
query_conn.commit()
query_conn.close()
time.sleep(1)
sub = create_sub(cus_sup.id, price_map["supplier_pro"]["provider_price_id"], {
"user_id": str(supplier_user_id),
"supplier_id": str(test_supplier["id"]),
"plan": "supplier_pro",
}, pm_sup)
ok(f"Created: {sub.id}")
result = wait_for_value(
"SELECT tier, monthly_credits, credit_balance FROM suppliers WHERE id = ?",
(test_supplier["id"],), "tier", "pro",
)
if result:
ok("tier=pro") if result["tier"] == "pro" else fail(f"tier={result['tier']}")
ok("monthly_credits=100") if result["monthly_credits"] == 100 else fail(f"monthly_credits={result['monthly_credits']}")
ok(f"credit_balance={result['credit_balance']}") if result["credit_balance"] >= 100 else fail(f"credit_balance={result['credit_balance']}")
else:
fail("Tier not updated to pro")
cancel_sub(sub.id)
ok("Cleaned up")
section("2c. Supplier Growth (yearly)")
# Reset
query_conn = sqlite3.connect(DATABASE_PATH)
query_conn.execute("UPDATE suppliers SET tier='free', monthly_credits=0, credit_balance=0, is_verified=0 WHERE id=?",
(test_supplier["id"],))
query_conn.commit()
query_conn.close()
time.sleep(1)
sub = create_sub(cus_sup.id, price_map["supplier_growth_yearly"]["provider_price_id"], {
"user_id": str(supplier_user_id),
"supplier_id": str(test_supplier["id"]),
"plan": "supplier_growth_yearly",
}, pm_sup)
ok(f"Created: {sub.id}")
result = wait_for_value(
"SELECT tier, monthly_credits FROM suppliers WHERE id = ?",
(test_supplier["id"],), "tier", "growth",
)
if result:
ok("tier=growth (yearly maps to growth)")
ok("monthly_credits=30") if result["monthly_credits"] == 30 else fail(f"monthly_credits={result['monthly_credits']}")
else:
fail("Yearly growth not processed")
cancel_sub(sub.id)
ok("Cleaned up")
section("2d. Supplier Pro (yearly)")
query_conn = sqlite3.connect(DATABASE_PATH)
query_conn.execute("UPDATE suppliers SET tier='free', monthly_credits=0, credit_balance=0, is_verified=0 WHERE id=?",
(test_supplier["id"],))
query_conn.commit()
query_conn.close()
time.sleep(1)
sub = create_sub(cus_sup.id, price_map["supplier_pro_yearly"]["provider_price_id"], {
"user_id": str(supplier_user_id),
"supplier_id": str(test_supplier["id"]),
"plan": "supplier_pro_yearly",
}, pm_sup)
ok(f"Created: {sub.id}")
result = wait_for_value(
"SELECT tier, monthly_credits FROM suppliers WHERE id = ?",
(test_supplier["id"],), "tier", "pro",
)
if result:
ok("tier=pro (yearly maps to pro)")
ok("monthly_credits=100") if result["monthly_credits"] == 100 else fail(f"monthly_credits={result['monthly_credits']}")
else:
fail("Yearly pro not processed")
cancel_sub(sub.id)
ok("Cleaned up")
# ═══════════════════════════════════════════════════════════
# 3. BOOST ADD-ON SUBSCRIPTIONS (all 4)
# ═══════════════════════════════════════════════════════════
section("3. Boost add-on subscriptions (Logo, Highlight, Verified, Card Color)")
cus_boost = get_or_create_customer("e2e-boost@sandbox.padelnomics.com", "E2E Boost")
pm_boost = attach_pm(cus_boost.id)
boost_keys = ["boost_logo", "boost_highlight", "boost_verified", "boost_card_color"]
for key in boost_keys:
price_id = price_map[key]["provider_price_id"]
sub = create_sub(cus_boost.id, price_id, {
"user_id": str(supplier_user_id),
"supplier_id": str(test_supplier["id"]),
"plan": key,
}, pm_boost)
ok(f"{key}: {sub.id} (active)")
# Let webhook arrive
time.sleep(2)
cancel_sub(sub.id)
# Boosts with plan starting "boost_" don't hit supplier handler (only supplier_ plans do).
# They go through the user subscription path. Verify at least the webhooks were accepted.
# Check ngrok logs for 200s
import json
import urllib.request
try:
resp = urllib.request.urlopen("http://localhost:4040/api/requests/http?limit=50", timeout=5)
requests_data = json.loads(resp.read())
webhook_200s = sum(1 for r in requests_data.get("requests", [])
if r.get("request", {}).get("uri") == "/billing/webhook/stripe"
and r.get("response", {}).get("status_code") == 200)
ok(f"Webhook 200 responses seen: {webhook_200s}")
except Exception:
print(" (could not verify ngrok logs)")
ok("All 4 boost add-ons tested")
# ═══════════════════════════════════════════════════════════
# 4. CHECKOUT SESSIONS — every product
# ═══════════════════════════════════════════════════════════
section("4. Checkout session creation (all 17 products)")
try:
ngrok_resp = urllib.request.urlopen("http://localhost:4040/api/tunnels", timeout=5)
tunnel_url = json.loads(ngrok_resp.read())["tunnels"][0]["public_url"]
except Exception:
tunnel_url = "http://localhost:5000"
checkout_ok = 0
for key, p in sorted(price_map.items()):
mode = "subscription" if p["billing_type"] == "subscription" else "payment"
try:
stripe.checkout.Session.create(
mode=mode,
customer=cus_starter.id,
line_items=[{"price": p["provider_price_id"], "quantity": 1}],
metadata={"user_id": str(test_user["id"]), "plan": key, "test": "true"},
success_url=f"{tunnel_url}/billing/success?session_id={{CHECKOUT_SESSION_ID}}",
cancel_url=f"{tunnel_url}/billing/pricing",
)
checkout_ok += 1
except stripe.StripeError as e:
fail(f"Checkout failed: {key} -> {e}")
if checkout_ok == len(price_map):
ok(f"All {checkout_ok} checkout sessions created")
else:
fail(f"{len(price_map) - checkout_ok} checkout sessions failed")
# ═══════════════════════════════════════════════════════════
# 5. PAYMENT FAILURE — declined card
# ═══════════════════════════════════════════════════════════
section("5. Payment failure — declined card scenarios")
cus_fail = get_or_create_customer("e2e-failure@sandbox.padelnomics.com", "E2E Failure")
fail_user = users[2] if len(users) > 2 else users[0]
# 5a. First create a valid subscription, then simulate payment failure
pm_valid = attach_pm(cus_fail.id)
try:
sub_fail = stripe.Subscription.create(
customer=cus_fail.id,
items=[{"price": price_map["starter"]["provider_price_id"]}],
metadata={"user_id": str(fail_user["id"]), "plan": "starter"},
default_payment_method=pm_valid,
)
cleanup_sub_ids.append(sub_fail.id)
ok(f"Created valid sub first: {sub_fail.id} (status={sub_fail.status})")
# Wait for subscription.created webhook
rows = wait_for_row("SELECT * FROM subscriptions WHERE provider_subscription_id = ?", (sub_fail.id,))
ok("DB row created") if rows else fail("No DB row after valid sub creation")
# Now swap to a declined card — next invoice will fail
try:
pm_decline = stripe.PaymentMethod.create(type="card", card={"token": "tok_chargeDeclined"})
stripe.PaymentMethod.attach(pm_decline.id, customer=cus_fail.id)
stripe.Customer.modify(cus_fail.id, invoice_settings={"default_payment_method": pm_decline.id})
ok("Swapped to declined card for next billing cycle")
except stripe.CardError:
ok("tok_chargeDeclined rejected at attach (newer API) — card swap skipped")
cancel_sub(sub_fail.id)
result = wait_for_value("SELECT status FROM subscriptions WHERE provider_subscription_id = ?",
(sub_fail.id,), "status", "cancelled")
ok("Cancelled after failure test") if result else ok("Cleanup done")
except stripe.CardError as e:
ok(f"Card declined at subscription level: {e.user_message}")
# 5b. Try creating subscription with payment_behavior=default_incomplete
try:
pm_ok = stripe.PaymentMethod.create(type="card", card={"token": "tok_visa"})
stripe.PaymentMethod.attach(pm_ok.id, customer=cus_fail.id)
sub_inc = stripe.Subscription.create(
customer=cus_fail.id,
items=[{"price": price_map["pro"]["provider_price_id"]}],
metadata={"user_id": str(fail_user["id"]), "plan": "pro"},
default_payment_method=pm_ok.id,
payment_behavior="default_incomplete",
)
cleanup_sub_ids.append(sub_inc.id)
ok(f"Incomplete-mode sub: {sub_inc.id} (status={sub_inc.status})")
cancel_sub(sub_inc.id)
except stripe.StripeError as e:
ok(f"Incomplete mode handled: {e}")
# ═══════════════════════════════════════════════════════════
# 6. EDGE CASES
# ═══════════════════════════════════════════════════════════
section("6a. Edge case — missing user_id in metadata")
cus_edge = get_or_create_customer("e2e-edge@sandbox.padelnomics.com", "E2E Edge")
pm_edge = attach_pm(cus_edge.id)
sub = create_sub(cus_edge.id, price_map["starter"]["provider_price_id"],
{"plan": "starter"}, # NO user_id
pm_edge)
ok(f"Created sub without user_id: {sub.id}")
# Webhook should arrive but handler should not crash (no DB write expected)
time.sleep(5)
# Server should not have crashed — verify it's still up
result = subprocess.run(
["curl", "-s", "-o", "/dev/null", "-w", "%{http_code}", "http://localhost:5000/"],
capture_output=True, text=True, timeout=5,
)
ok("Server still alive after missing user_id") if result.stdout.strip() in ("200", "301", "302") else fail("Server crashed!")
cancel_sub(sub.id)
section("6b. Edge case — missing supplier_id for supplier plan")
sub = create_sub(cus_edge.id, price_map["supplier_growth"]["provider_price_id"],
{"user_id": str(test_user["id"]), "plan": "supplier_growth"}, # NO supplier_id
pm_edge)
ok(f"Created supplier sub without supplier_id: {sub.id}")
time.sleep(5)
result = subprocess.run(
["curl", "-s", "-o", "/dev/null", "-w", "%{http_code}", "http://localhost:5000/"],
capture_output=True, text=True, timeout=5,
)
ok("Server still alive after missing supplier_id") if result.stdout.strip() in ("200", "301", "302") else fail("Server crashed!")
cancel_sub(sub.id)
section("6c. Edge case — duplicate subscription (idempotency)")
# Create same subscription twice for same user
cus_dup = get_or_create_customer("e2e-dup@sandbox.padelnomics.com", "E2E Dup")
pm_dup = attach_pm(cus_dup.id)
dup_user = users[3] if len(users) > 3 else users[0]
sub1 = create_sub(cus_dup.id, price_map["starter"]["provider_price_id"],
{"user_id": str(dup_user["id"]), "plan": "starter"}, pm_dup)
time.sleep(3)
sub2 = create_sub(cus_dup.id, price_map["pro"]["provider_price_id"],
{"user_id": str(dup_user["id"]), "plan": "pro"}, pm_dup)
time.sleep(3)
rows = query_db("SELECT * FROM subscriptions WHERE user_id = ? ORDER BY created_at", (dup_user["id"],))
ok(f"Two subscriptions exist: {len(rows)} rows") if len(rows) >= 2 else fail(f"Expected 2+ rows, got {len(rows)}")
# get_subscription returns most recent
latest = query_db("SELECT * FROM subscriptions WHERE user_id = ? ORDER BY created_at DESC LIMIT 1", (dup_user["id"],))
if latest and latest[0]["plan"] == "pro":
ok("Latest subscription is 'pro' (upgrade scenario)")
else:
fail(f"Latest plan: {latest[0]['plan'] if latest else '?'}")
cancel_sub(sub1.id)
cancel_sub(sub2.id)
section("6d. Edge case — rapid create + cancel (race condition)")
cus_race = get_or_create_customer("e2e-race@sandbox.padelnomics.com", "E2E Race")
pm_race = attach_pm(cus_race.id)
race_user = users[4] if len(users) > 4 else users[0]
sub = create_sub(cus_race.id, price_map["starter"]["provider_price_id"],
{"user_id": str(race_user["id"]), "plan": "starter"}, pm_race)
# Cancel immediately — webhooks may arrive out of order
stripe.Subscription.cancel(sub.id)
ok(f"Created and immediately cancelled: {sub.id}")
time.sleep(8) # Wait for both webhooks
rows = query_db("SELECT * FROM subscriptions WHERE provider_subscription_id = ?", (sub.id,))
if rows:
ok(f"Final DB status: {rows[0]['status']}")
else:
ok("No DB row (created webhook may have arrived after deleted)")
result = subprocess.run(
["curl", "-s", "-o", "/dev/null", "-w", "%{http_code}", "http://localhost:5000/"],
capture_output=True, text=True, timeout=5,
)
ok("Server survived race condition") if result.stdout.strip() in ("200", "301", "302") else fail("Server crashed!")
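The race-condition comments above (webhooks may arrive out of order, the `created` event can land after `deleted`) hinge on the handler being order-tolerant. A minimal sketch of one way to get that property — an SQLite upsert guarded by an event timestamp, so a stale event can never overwrite a newer one. The table and column names here are illustrative, not this app's real schema:

```python
# Sketch (assumed simplified schema): tolerate out-of-order webhooks by
# applying an event only if it is at least as new as the stored one.
import sqlite3

def apply_subscription_event(conn, sub_id, status, event_ts):
    """Upsert a subscription row, ignoring events older than the stored one."""
    conn.execute(
        """
        INSERT INTO subscriptions (provider_subscription_id, status, last_event_ts)
        VALUES (?, ?, ?)
        ON CONFLICT(provider_subscription_id) DO UPDATE SET
            status = excluded.status,
            last_event_ts = excluded.last_event_ts
        WHERE excluded.last_event_ts >= subscriptions.last_event_ts
        """,
        (sub_id, status, event_ts),
    )

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE subscriptions ("
    "provider_subscription_id TEXT PRIMARY KEY, status TEXT, last_event_ts INTEGER)"
)
# "deleted" event (ts=2) arrives before "created" (ts=1):
apply_subscription_event(conn, "sub_123", "cancelled", 2)
apply_subscription_event(conn, "sub_123", "active", 1)  # stale — no effect
status = conn.execute(
    "SELECT status FROM subscriptions WHERE provider_subscription_id = ?",
    ("sub_123",),
).fetchone()[0]
print(status)  # cancelled
```

The `WHERE` clause on `DO UPDATE` (SQLite 3.24+) is what makes the late-arriving `created` event a no-op instead of a resurrection.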
# ═══════════════════════════════════════════════════════════
# 7. BILLING PORTAL
# ═══════════════════════════════════════════════════════════
section("7. Billing Portal session")
try:
portal = stripe.billing_portal.Session.create(
customer=cus_starter.id,
return_url=f"{tunnel_url}/billing/success",
)
ok(f"Portal URL: {portal.url[:50]}...")
except stripe.StripeError as e:
fail(f"Portal failed: {e}")
# ═══════════════════════════════════════════════════════════
# 8. ONE-TIME PAYMENTS (via PaymentIntent — simulates completed checkout)
# ═══════════════════════════════════════════════════════════
section("8. One-time payments (PaymentIntents — all credit packs + boosts + PDF)")
cus_buyer = get_or_create_customer("e2e-buyer@sandbox.padelnomics.com", "E2E Buyer")
pm_buyer = attach_pm(cus_buyer.id)
one_time_products = [
("credits_25", 9900),
("credits_50", 17900),
("credits_100", 32900),
("credits_250", 74900),
("boost_sticky_week", 7900),
("boost_sticky_month", 19900),
("business_plan", 14900),
]
for key, amount_cents in one_time_products:
try:
pi = stripe.PaymentIntent.create(
amount=amount_cents,
currency="eur",
customer=cus_buyer.id,
payment_method=pm_buyer,
confirm=True,
automatic_payment_methods={"enabled": True, "allow_redirects": "never"},
metadata={
"user_id": str(test_user["id"]),
"supplier_id": str(test_supplier["id"]),
"plan": key,
},
)
if pi.status == "succeeded":
ok(f"{key}: €{amount_cents/100:.2f} succeeded ({pi.id[:20]}...)")

else:
fail(f"{key}: status={pi.status}")
except stripe.StripeError as e:
fail(f"{key}: {e}")
# Note: PaymentIntents don't trigger checkout.session.completed webhooks.
# The actual credit/boost/PDF creation requires a Checkout Session completion,
# which can only happen via browser. These tests verify the payments succeed.
print(" (PaymentIntents succeed but don't trigger checkout webhooks —")
print(" credit/boost/PDF creation requires browser checkout completion)")
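One way to exercise the `checkout.session.completed` path without a browser (a sketch, not part of this script) is to POST a hand-signed payload to the local webhook endpoint. Stripe signs webhook bodies with HMAC-SHA256 over `{timestamp}.{payload}` and sends the result as `t=...,v1=...` in the `Stripe-Signature` header; the secret and event body below are illustrative assumptions:

```python
# Sketch: sign a webhook payload the way Stripe does, so an endpoint that
# calls stripe.Webhook.construct_event() would accept it.
# WEBHOOK_SECRET and the event body are assumptions, not real values.
import hashlib
import hmac
import json
import time

WEBHOOK_SECRET = "whsec_test_secret"  # assumed: your endpoint's signing secret

def sign_payload(payload: str, secret: str, ts: int) -> str:
    """Build a Stripe-Signature header value: t=<ts>,v1=<hmac_sha256 hex>."""
    signed = f"{ts}.{payload}".encode()
    digest = hmac.new(secret.encode(), signed, hashlib.sha256).hexdigest()
    return f"t={ts},v1={digest}"

event = {"type": "checkout.session.completed",
         "data": {"object": {"id": "cs_test_123"}}}
payload = json.dumps(event)
ts = int(time.time())
header = sign_payload(payload, WEBHOOK_SECRET, ts)
# POST `payload` with `header` as the Stripe-Signature header to
# /billing/webhook/stripe (e.g. via urllib.request) to drive the handler.
```

The Stripe CLI's `stripe trigger checkout.session.completed` achieves the same end through a forwarded tunnel, but hand-signing keeps the test hermetic.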
# ═══════════════════════════════════════════════════════════
# 9. DECLINED CARDS — different failure modes
# ═══════════════════════════════════════════════════════════
section("9. Declined card scenarios (PaymentIntent level)")
decline_tokens = [
("tok_chargeDeclined", "generic decline"),
("tok_chargeDeclinedInsufficientFunds", "insufficient funds"),
("tok_chargeDeclinedExpiredCard", "expired card"),
("tok_chargeDeclinedProcessingError", "processing error"),
]
for token, description in decline_tokens:
try:
pm = stripe.PaymentMethod.create(type="card", card={"token": token})
stripe.PaymentMethod.attach(pm.id, customer=cus_buyer.id)
pi = stripe.PaymentIntent.create(
amount=1900,
currency="eur",
customer=cus_buyer.id,
payment_method=pm.id,
confirm=True,
automatic_payment_methods={"enabled": True, "allow_redirects": "never"},
)
fail(f"{description}: should have been declined but succeeded")
except stripe.CardError as e:
ok(f"{description}: correctly declined ({e.code})")
except stripe.StripeError as e:
ok(f"{description}: rejected ({type(e).__name__})")
# ═══════════════════════════════════════════════════════════
# Summary
# ═══════════════════════════════════════════════════════════
section("RESULTS")
total = passed + failed
print(f"\n {passed}/{total} passed, {failed} failed\n")
if errors:
print(" Failures:")
for err in errors:
print(f" - {err}")
print()
# Final cleanup: cancel any remaining subs
for sid in cleanup_sub_ids:
try:
stripe.Subscription.cancel(sid)
except Exception:
pass
sys.exit(1 if failed else 0)

View File

@@ -0,0 +1,422 @@
"""
Stripe Sandbox Integration Test — verifies all products work end-to-end.
Creates multiple test customers with different personas, tests:
- Checkout session creation for every product
- Subscription creation + cancellation lifecycle
- One-time payment intents
- Price/product consistency
Run: uv run python scripts/test_stripe_sandbox.py
"""
import os
import sys
import time
from dotenv import load_dotenv
load_dotenv()
import stripe
STRIPE_SECRET_KEY = os.getenv("STRIPE_SECRET_KEY", "") or os.getenv("STRIPE_API_PRIVATE_KEY", "")
if not STRIPE_SECRET_KEY:
print("ERROR: STRIPE_SECRET_KEY / STRIPE_API_PRIVATE_KEY not set in .env")
sys.exit(1)
stripe.api_key = STRIPE_SECRET_KEY
stripe.max_network_retries = 2
BASE_URL = os.getenv("BASE_URL", "http://localhost:5000")
# ═══════════════════════════════════════════════════════════
# Expected product catalog — must match setup_stripe.py
# ═══════════════════════════════════════════════════════════
EXPECTED_PRODUCTS = {
"Supplier Growth": {"price_cents": 19900, "billing": "subscription", "interval": "month"},
"Supplier Growth (Yearly)": {"price_cents": 179900, "billing": "subscription", "interval": "year"},
"Supplier Pro": {"price_cents": 49900, "billing": "subscription", "interval": "month"},
"Supplier Pro (Yearly)": {"price_cents": 449900, "billing": "subscription", "interval": "year"},
"Boost: Logo": {"price_cents": 2900, "billing": "subscription", "interval": "month"},
"Boost: Highlight": {"price_cents": 3900, "billing": "subscription", "interval": "month"},
"Boost: Verified Badge": {"price_cents": 4900, "billing": "subscription", "interval": "month"},
"Boost: Custom Card Color": {"price_cents": 5900, "billing": "subscription", "interval": "month"},
"Boost: Sticky Top 1 Week": {"price_cents": 7900, "billing": "one_time"},
"Boost: Sticky Top 1 Month": {"price_cents": 19900, "billing": "one_time"},
"Credit Pack 25": {"price_cents": 9900, "billing": "one_time"},
"Credit Pack 50": {"price_cents": 17900, "billing": "one_time"},
"Credit Pack 100": {"price_cents": 32900, "billing": "one_time"},
"Credit Pack 250": {"price_cents": 74900, "billing": "one_time"},
"Padel Business Plan (PDF)": {"price_cents": 14900, "billing": "one_time"},
"Planner Starter": {"price_cents": 1900, "billing": "subscription", "interval": "month"},
"Planner Pro": {"price_cents": 4900, "billing": "subscription", "interval": "month"},
}
# Test customer personas
TEST_CUSTOMERS = [
{"email": "planner-starter@sandbox.padelnomics.com", "name": "Anna Planner (Starter)"},
{"email": "planner-pro@sandbox.padelnomics.com", "name": "Ben Planner (Pro)"},
{"email": "supplier-growth@sandbox.padelnomics.com", "name": "Carlos Supplier (Growth)"},
{"email": "supplier-pro@sandbox.padelnomics.com", "name": "Diana Supplier (Pro)"},
{"email": "one-time-buyer@sandbox.padelnomics.com", "name": "Eva Buyer (Credits+Boosts)"},
]
passed = 0
failed = 0
errors = []
def ok(msg):
global passed
passed += 1
print(f"✓ {msg}")
def fail(msg):
global failed
failed += 1
errors.append(msg)
print(f"✗ {msg}")
def section(title):
print(f"\n{'═' * 60}")
print(f" {title}")
print(f"{'═' * 60}")
# ═══════════════════════════════════════════════════════════
# Phase 1: Verify all products and prices exist
# ═══════════════════════════════════════════════════════════
section("Phase 1: Product & Price Verification")
products = list(stripe.Product.list(limit=100, active=True).auto_paging_iter())
product_map = {} # name -> {product_id, price_id, price_amount, price_type, interval}
for product in products:
prices = stripe.Price.list(product=product.id, active=True, limit=1)
if not prices.data:
continue
price = prices.data[0]
product_map[product.name] = {
"product_id": product.id,
"price_id": price.id,
"price_amount": price.unit_amount,
"price_type": price.type,
"interval": price.recurring.interval if price.recurring else None,
}
for name, expected in EXPECTED_PRODUCTS.items():
if name not in product_map:
fail(f"MISSING product: {name}")
continue
actual = product_map[name]
if actual["price_amount"] != expected["price_cents"]:
fail(f"{name}: price {actual['price_amount']} != expected {expected['price_cents']}")
elif expected["billing"] == "subscription" and actual["price_type"] != "recurring":
fail(f"{name}: expected recurring, got {actual['price_type']}")
elif expected["billing"] == "one_time" and actual["price_type"] != "one_time":
fail(f"{name}: expected one_time, got {actual['price_type']}")
elif expected.get("interval") and actual["interval"] != expected["interval"]:
fail(f"{name}: interval {actual['interval']} != expected {expected['interval']}")
else:
ok(f"{name}: €{actual['price_amount']/100:.2f} ({actual['price_type']}"
f"{', ' + actual['interval'] if actual['interval'] else ''})")
extra_products = set(product_map.keys()) - set(EXPECTED_PRODUCTS.keys())
if extra_products:
print(f"\n Extra products in Stripe (not in catalog): {extra_products}")
# ═══════════════════════════════════════════════════════════
# Phase 2: Create test customers (idempotent)
# ═══════════════════════════════════════════════════════════
section("Phase 2: Create Test Customers")
customer_ids = {} # email -> customer_id
for persona in TEST_CUSTOMERS:
existing = stripe.Customer.list(email=persona["email"], limit=1)
if existing.data:
cus = existing.data[0]
ok(f"Reusing: {persona['name']} ({cus.id})")
else:
cus = stripe.Customer.create(
email=persona["email"],
name=persona["name"],
metadata={"test": "true", "persona": persona["name"]},
)
ok(f"Created: {persona['name']} ({cus.id})")
customer_ids[persona["email"]] = cus.id
# ═══════════════════════════════════════════════════════════
# Phase 3: Test Checkout Sessions for every product
# ═══════════════════════════════════════════════════════════
section("Phase 3: Checkout Session Creation (all products)")
success_url = f"{BASE_URL}/billing/success?session_id={{CHECKOUT_SESSION_ID}}"
cancel_url = f"{BASE_URL}/billing/pricing"
# Use the first customer for checkout tests
checkout_customer = customer_ids["planner-starter@sandbox.padelnomics.com"]
for name, info in product_map.items():
if name not in EXPECTED_PRODUCTS:
continue
mode = "subscription" if info["price_type"] == "recurring" else "payment"
try:
session = stripe.checkout.Session.create(
mode=mode,
customer=checkout_customer,
line_items=[{"price": info["price_id"], "quantity": 1}],
metadata={"user_id": "999", "plan": name, "test": "true"},
success_url=success_url,
cancel_url=cancel_url,
)
ok(f"Checkout ({mode}): {name} -> {session.id[:30]}...")
except stripe.StripeError as e:
fail(f"Checkout FAILED for {name}: {e.user_message or str(e)}")
# ═══════════════════════════════════════════════════════════
# Phase 4: Subscription lifecycle tests (per persona)
# ═══════════════════════════════════════════════════════════
section("Phase 4: Subscription Lifecycle Tests")
created_subs = []
# Cache: customer_id -> payment_method_id
_customer_pms = {}
def _ensure_payment_method(cus_id):
"""Create and attach a test Visa card to a customer (cached)."""
if cus_id in _customer_pms:
return _customer_pms[cus_id]
pm = stripe.PaymentMethod.create(type="card", card={"token": "tok_visa"})
stripe.PaymentMethod.attach(pm.id, customer=cus_id)
stripe.Customer.modify(
cus_id,
invoice_settings={"default_payment_method": pm.id},
)
_customer_pms[cus_id] = pm.id
return pm.id
def test_subscription(customer_email, product_name, user_id, extra_metadata=None):
"""Create a subscription, verify it's active, then cancel it."""
cus_id = customer_ids[customer_email]
info = product_map.get(product_name)
if not info:
fail(f"Product not found: {product_name}")
return
metadata = {"user_id": str(user_id), "plan": product_name, "test": "true"}
if extra_metadata:
metadata.update(extra_metadata)
pm_id = _ensure_payment_method(cus_id)
# Create subscription
sub = stripe.Subscription.create(
customer=cus_id,
items=[{"price": info["price_id"]}],
metadata=metadata,
default_payment_method=pm_id,
)
created_subs.append(sub.id)
if sub.status == "active":
ok(f"Sub created: {product_name} for {customer_email} -> {sub.id} (active)")
else:
fail(f"Sub status unexpected: {product_name} -> {sub.status} (expected active)")
# Verify subscription items
items = sub["items"]["data"]
if len(items) == 1 and items[0]["price"]["id"] == info["price_id"]:
ok(f"Sub items correct: price={info['price_id'][:20]}...")
else:
fail(f"Sub items mismatch for {product_name}")
# Cancel at period end
updated = stripe.Subscription.modify(sub.id, cancel_at_period_end=True)
if updated.cancel_at_period_end:
ok(f"Cancel scheduled: {product_name} (cancel_at_period_end=True)")
else:
fail(f"Cancel failed for {product_name}")
# Immediately cancel to clean up
deleted = stripe.Subscription.cancel(sub.id)
if deleted.status == "canceled":
ok(f"Cancelled: {product_name} -> {deleted.status}")
else:
fail(f"Final cancel status: {product_name} -> {deleted.status}")
# Planner Starter
test_subscription(
"planner-starter@sandbox.padelnomics.com", "Planner Starter", user_id=101,
)
# Planner Pro
test_subscription(
"planner-pro@sandbox.padelnomics.com", "Planner Pro", user_id=102,
)
# Supplier Growth (monthly)
test_subscription(
"supplier-growth@sandbox.padelnomics.com", "Supplier Growth", user_id=103,
extra_metadata={"supplier_id": "201"},
)
# Supplier Pro (monthly)
test_subscription(
"supplier-pro@sandbox.padelnomics.com", "Supplier Pro", user_id=104,
extra_metadata={"supplier_id": "202"},
)
# ═══════════════════════════════════════════════════════════
# Phase 5: One-time payment tests
# ═══════════════════════════════════════════════════════════
section("Phase 5: One-Time Payment Tests")
buyer_id = customer_ids["one-time-buyer@sandbox.padelnomics.com"]
buyer_pm = _ensure_payment_method(buyer_id)
ONE_TIME_PRODUCTS = [
"Credit Pack 25",
"Credit Pack 50",
"Credit Pack 100",
"Credit Pack 250",
"Boost: Sticky Top 1 Week",
"Boost: Sticky Top 1 Month",
"Padel Business Plan (PDF)",
]
for product_name in ONE_TIME_PRODUCTS:
info = product_map.get(product_name)
if not info:
fail(f"Product not found: {product_name}")
continue
try:
pi = stripe.PaymentIntent.create(
amount=info["price_amount"],
currency="eur",
customer=buyer_id,
payment_method=buyer_pm,
confirm=True,
automatic_payment_methods={"enabled": True, "allow_redirects": "never"},
metadata={
"user_id": "105",
"supplier_id": "203",
"plan": product_name,
"test": "true",
},
)
if pi.status == "succeeded":
ok(f"Payment: {product_name} -> €{info['price_amount']/100:.2f} ({pi.id[:25]}...)")
else:
fail(f"Payment status: {product_name} -> {pi.status}")
except stripe.StripeError as e:
fail(f"Payment FAILED for {product_name}: {e.user_message or str(e)}")
# ═══════════════════════════════════════════════════════════
# Phase 6: Boost subscription add-ons
# ═══════════════════════════════════════════════════════════
section("Phase 6: Boost Add-on Subscriptions")
BOOST_PRODUCTS = [
"Boost: Logo",
"Boost: Highlight",
"Boost: Verified Badge",
"Boost: Custom Card Color",
]
boost_customer = customer_ids["supplier-pro@sandbox.padelnomics.com"]
boost_pm = _ensure_payment_method(boost_customer)
for product_name in BOOST_PRODUCTS:
info = product_map.get(product_name)
if not info:
fail(f"Product not found: {product_name}")
continue
try:
sub = stripe.Subscription.create(
customer=boost_customer,
items=[{"price": info["price_id"]}],
metadata={
"user_id": "104",
"supplier_id": "202",
"plan": product_name,
"test": "true",
},
default_payment_method=boost_pm,
)
created_subs.append(sub.id)
if sub.status == "active":
ok(f"Boost sub: {product_name} -> €{info['price_amount']/100:.2f}/mo ({sub.id[:25]}...)")
else:
fail(f"Boost sub status: {product_name} -> {sub.status}")
# Clean up
stripe.Subscription.cancel(sub.id)
except stripe.StripeError as e:
fail(f"Boost sub FAILED for {product_name}: {e.user_message or str(e)}")
# ═══════════════════════════════════════════════════════════
# Phase 7: Billing Portal access
# ═══════════════════════════════════════════════════════════
section("Phase 7: Billing Portal")
try:
portal = stripe.billing_portal.Session.create(
customer=checkout_customer,
return_url=f"{BASE_URL}/billing/success",
)
ok(f"Portal URL generated: {portal.url[:50]}...")
except stripe.StripeError as e:
fail(f"Portal creation failed: {e.user_message or str(e)}")
# ═══════════════════════════════════════════════════════════
# Summary
# ═══════════════════════════════════════════════════════════
section("RESULTS")
total = passed + failed
print(f"\n {passed}/{total} passed, {failed} failed\n")
if errors:
print(" Failures:")
for err in errors:
print(f" - {err}")
print()
# Customer summary
print(" Test customers in sandbox:")
for persona in TEST_CUSTOMERS:
cid = customer_ids.get(persona["email"], "?")
print(f" {persona['name']}: {cid}")
print()
sys.exit(1 if failed else 0)

View File

@@ -48,7 +48,7 @@ PADDLE_ENVIRONMENT=${PADDLE_ENVIRONMENT:-sandbox}
# -- Preparation -------------------------------------------------------------
info "Resetting database"
rm -f "$DATABASE_PATH"
rm -f "$DATABASE_PATH" "${DATABASE_PATH}-shm" "${DATABASE_PATH}-wal"
ok "Removed $DATABASE_PATH"
info "Running migrations"

View File

@@ -51,8 +51,10 @@ bp = Blueprint(
_LANDING_DIR = os.environ.get("LANDING_DIR", "data/landing")
_SERVING_DUCKDB_PATH = os.environ.get("SERVING_DUCKDB_PATH", "data/analytics.duckdb")
# Repo root: web/src/padelnomics/admin/ → up 4 levels
_REPO_ROOT = Path(__file__).resolve().parents[4]
# In prod the package is installed in a venv so __file__.parents[4] won't
# reach the repo checkout. WorkingDirectory in the systemd unit is /opt/padelnomics,
# so CWD is reliable; REPO_ROOT env var overrides for non-standard setups.
_REPO_ROOT = Path(os.environ.get("REPO_ROOT", ".")).resolve()
_WORKFLOWS_TOML = _REPO_ROOT / "infra" / "supervisor" / "workflows.toml"
# A "running" row older than this is considered stale/crashed.
@@ -538,6 +540,7 @@ def _load_workflows() -> list[dict]:
"schedule": schedule,
"schedule_label": schedule_label,
"depends_on": config.get("depends_on", []),
"description": config.get("description", ""),
})
return workflows

View File

@@ -108,6 +108,15 @@ function toggleArticleSelect(id, checked) {
updateArticleBulkBar();
}
function toggleArticleGroupSelect(checkbox) {
var ids = (checkbox.dataset.ids || '').split(',').map(Number).filter(Boolean);
ids.forEach(function(id) {
if (checkbox.checked) articleSelectedIds.add(id);
else articleSelectedIds.delete(id);
});
updateArticleBulkBar();
}
function clearArticleSelection() {
articleSelectedIds.clear();
document.querySelectorAll('.article-checkbox').forEach(function(cb) { cb.checked = false; });
@@ -152,7 +161,12 @@ function submitArticleBulk() {
document.body.addEventListener('htmx:afterSwap', function(evt) {
if (evt.detail.target.id === 'article-results') {
document.querySelectorAll('.article-checkbox').forEach(function(cb) {
if (articleSelectedIds.has(Number(cb.dataset.id))) cb.checked = true;
if (cb.dataset.ids) {
var ids = cb.dataset.ids.split(',').map(Number).filter(Boolean);
cb.checked = ids.length > 0 && ids.every(function(id) { return articleSelectedIds.has(id); });
} else {
cb.checked = articleSelectedIds.has(Number(cb.dataset.id));
}
});
}
});

View File

@@ -1,5 +1,9 @@
<tr id="article-group-{{ g.url_path | replace('/', '-') | trim('-') }}">
<td></td>
<td onclick="event.stopPropagation()">
<input type="checkbox" class="article-checkbox"
data-ids="{{ g.variants | map(attribute='id') | join(',') }}"
onchange="toggleArticleGroupSelect(this)">
</td>
<td style="max-width:260px">
<div style="overflow:hidden;text-overflow:ellipsis;white-space:nowrap;font-weight:500" title="{{ g.url_path }}">{{ g.title }}</div>
<div class="article-subtitle">{{ g.url_path }}</div>

View File

@@ -57,7 +57,7 @@
{% if not grouped %}
<th style="width:32px"><input type="checkbox" id="article-select-all" onchange="document.querySelectorAll('.article-checkbox').forEach(cb => { cb.checked = this.checked; toggleArticleSelect(Number(cb.dataset.id), this.checked); })"></th>
{% else %}
<th style="width:32px"></th>
<th style="width:32px"><input type="checkbox" id="article-select-all" onchange="document.querySelectorAll('.article-checkbox').forEach(cb => { cb.checked = this.checked; toggleArticleGroupSelect(cb); })"></th>
{% endif %}
<th>Title</th>
<th>{% if grouped %}Variants{% else %}Status{% endif %}</th>

View File

@@ -16,8 +16,9 @@
{% set wf = row.workflow %}
{% set run = row.run %}
{% set stale = row.stale %}
<div style="border:1px solid #E2E8F0;border-radius:10px;padding:0.875rem;background:#FAFAFA">
<div class="flex items-center gap-2 mb-2">
{% set is_running = run and run.status == 'running' and not stale %}
<div style="border:1px solid {% if is_running %}#93C5FD{% else %}#E2E8F0{% endif %};border-radius:10px;padding:0.875rem;background:{% if is_running %}#EFF6FF{% else %}#FAFAFA{% endif %}">
<div class="flex items-center gap-2 mb-1">
{% if not run %}
<span class="status-dot pending"></span>
{% elif stale %}
@@ -33,6 +34,15 @@
{% if stale %}
<span class="badge-warning" style="font-size:10px;padding:1px 6px;margin-left:auto">stale</span>
{% endif %}
{% if is_running %}
<span class="btn btn-sm ml-auto"
style="padding:2px 8px;font-size:11px;opacity:0.6;cursor:default;pointer-events:none">
<svg class="spinner-icon" width="12" height="12" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="3">
<path d="M12 2a10 10 0 0 1 10 10" stroke-linecap="round"/>
</svg>
Running
</span>
{% else %}
<button type="button"
class="btn btn-sm ml-auto"
style="padding:2px 8px;font-size:11px"
@@ -41,9 +51,17 @@
hx-swap="outerHTML"
hx-vals='{"extractor": "{{ wf.name }}", "csrf_token": "{{ csrf_token() }}"}'
hx-confirm="Run {{ wf.name }} extractor?">Run</button>
{% endif %}
</div>
{% if wf.description %}
<p class="text-xs text-slate" style="margin-top:2px;margin-bottom:4px">{{ wf.description }}</p>
{% endif %}
<p class="text-xs text-slate">{{ wf.schedule_label }}</p>
{% if run %}
{% if is_running %}
<p class="text-xs mt-1" style="color:#2563EB">
Started {{ run.started_at[:16].replace('T', ' ') if run.started_at else '—' }} — running...
</p>
{% elif run %}
<p class="text-xs mono text-slate-dark mt-1">{{ run.started_at[:16].replace('T', ' ') if run.started_at else '—' }}</p>
{% if run.status == 'failed' and run.error_message %}
<p class="text-xs text-danger mt-1" style="font-family:monospace;word-break:break-all">

View File

@@ -346,16 +346,30 @@ def _get_period_end(obj: dict) -> str | None:
def _extract_line_items(session_obj: dict) -> list[dict]:
"""Extract line items from a Checkout Session in Paddle-compatible format.
Stripe sessions don't embed line items directly — we'd need an extra API call.
For webhook handling, the key info (price_id) comes from subscription items.
Returns items in the format: [{"price": {"id": "price_xxx"}}]
Stripe doesn't embed line_items in checkout.session.completed webhooks,
so we fetch them via the API. Returns [{"price": {"id": "price_xxx"}}].
"""
# For checkout.session.completed, line_items aren't in the webhook payload.
# The webhook handler for subscription.activated fetches them separately.
# For one-time payments, we can reconstruct from the session's line_items
# via the Stripe API, but to keep webhook handling fast we skip this and
# handle it via the subscription events instead.
return []
session_id = session_obj.get("id", "")
if not session_id or not session_id.startswith("cs_"):
return []
try:
s = _stripe_client()
line_items = s.checkout.Session.list_line_items(session_id, limit=20)
return [
{"price": {"id": item["price"]["id"]}}
for item in line_items.get("data", [])
if item.get("price", {}).get("id")
]
except Exception:
logger.warning("Failed to fetch line_items for session %s", session_id)
# Fallback: check if line_items were embedded in the payload (e.g. tests)
embedded = session_obj.get("line_items", {}).get("data", [])
return [
{"price": {"id": item["price"]["id"]}}
for item in embedded
if item.get("price", {}).get("id")
]
def _extract_sub_items(sub_obj: dict) -> list[dict]:

View File

@@ -8,7 +8,8 @@
<p class="q-step-sub">{{ t.q4_subheading }}</p>
<div class="q-field-group">
<span class="q-label">{{ t.q4_phase_label }}</span>
<span class="q-label">{{ t.q4_phase_label }} <span class="required">*</span></span>
{% if 'location_status' in errors %}<p class="q-error-hint">{{ t.q4_error_phase }}</p>{% endif %}
<div class="q-pills">
{% for val, label in [('still_searching', t.q4_phase_searching), ('location_found', t.q4_phase_found), ('converting_existing', t.q4_phase_converting), ('lease_signed', t.q4_phase_lease_signed), ('permit_not_filed', t.q4_phase_permit_not_filed), ('permit_pending', t.q4_phase_permit_pending), ('permit_granted', t.q4_phase_permit_granted)] %}
<label><input type="radio" name="location_status" value="{{ val }}" {{ 'checked' if data.get('location_status') == val }}><span class="q-pill">{{ label }}</span></label>

View File

@@ -428,6 +428,7 @@
"q4_phase_permit_not_filed": "Baugenehmigung noch nicht beantragt",
"q4_phase_permit_pending": "Baugenehmigung in Bearbeitung",
"q4_phase_permit_granted": "Baugenehmigung erteilt",
"q4_error_phase": "Bitte wähle Deine Projektphase aus.",
"q5_heading": "Zeitplan",
"q5_subheading": "Wann möchtest Du beginnen?",
"q5_timeline_label": "Zeitplan",

View File

@@ -428,6 +428,7 @@
"q4_phase_permit_not_filed": "Permit not yet filed",
"q4_phase_permit_pending": "Permit in progress",
"q4_phase_permit_granted": "Permit approved",
"q4_error_phase": "Please select your project phase.",
"q5_heading": "Timeline",
"q5_subheading": "When do you want to get started?",
"q5_timeline_label": "Timeline",