Compare commits

...

15 Commits

Author SHA1 Message Date
Deeman
a898a06575 feat(proxy): per-proxy dead tracking in tiered cycler
Add proxy_failure_limit param to make_tiered_cycler (default 3).
Individual proxies hitting the limit are marked dead and permanently
skipped. next_proxy() auto-escalates when all proxies in the active
tier are dead. Both mechanisms coexist: per-proxy dead tracking removes
broken individuals; tier-level threshold catches systemic failure.

- proxy.py: dead_proxies set + proxy_failure_counts dict in state;
  next_proxy skips dead proxies with bounded loop; record_failure/
  record_success accept optional proxy_url; dead_proxy_count() added
- playtomic_tenants.py: pass proxy_url to record_success/record_failure
- playtomic_availability.py: _worker returns (proxy_url, result);
  serial loops in extract + extract_recheck capture proxy_url
- test_supervisor.py: 11 new tests in TestTieredCyclerDeadProxyTracking
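
The per-proxy mechanism described above can be sketched roughly as follows. This is a minimal illustration: the real make_tiered_cycler also carries the tier-level circuit-breaker threshold (omitted here), and the exact state layout and locking are assumptions, not the repo's actual code.

```python
import threading

def make_tiered_cycler(tiers, proxy_failure_limit=3):
    """Sketch: tiers is a list of proxy-URL lists, cheapest tier first."""
    state = {
        "tier": 0,
        "idx": 0,
        "dead": set(),       # proxies permanently skipped
        "failures": {},      # proxy_url -> failure count (reset policy assumed)
    }
    lock = threading.Lock()

    def next_proxy():
        with lock:
            while state["tier"] < len(tiers):
                alive = [p for p in tiers[state["tier"]] if p not in state["dead"]]
                if not alive:
                    # All proxies in the active tier are dead: auto-escalate.
                    state["tier"] += 1
                    state["idx"] = 0
                    continue
                proxy = alive[state["idx"] % len(alive)]
                state["idx"] += 1
                return proxy
            return None  # every tier exhausted

    def record_failure(proxy_url=None):
        if proxy_url is None:
            return  # backward-compatible call; tier-level counting elided here
        with lock:
            n = state["failures"].get(proxy_url, 0) + 1
            state["failures"][proxy_url] = n
            if n >= proxy_failure_limit:
                state["dead"].add(proxy_url)

    def record_success(proxy_url=None):
        if proxy_url is not None:
            with lock:
                state["failures"][proxy_url] = 0

    return {
        "next_proxy": next_proxy,
        "record_failure": record_failure,
        "record_success": record_success,
        "dead_proxy_count": lambda: len(state["dead"]),
    }
```

With proxy_failure_limit=2 and tiers [["a", "b"], ["c"]], two failures each on "a" and "b" kill tier 0, and the next next_proxy() call escalates to "c".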

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-01 12:28:54 +01:00
Deeman
1aedf78ec6 fix(extract): use tiered cycler in playtomic_tenants
Previously the tenants extractor flattened all proxy tiers into a single
round-robin list, bypassing the circuit breaker entirely. When the free
Webshare tier runs out of bandwidth (402), all 20 free proxies fail and
the batch crashes — the paid datacenter/residential proxies are never tried.

Changes:
- Replace make_round_robin_cycler with make_tiered_cycler (same as availability)
- Add _fetch_page_via_cycler: retries per page across tiers, records
  success/failure in cycler so circuit breaker can escalate
- Fix batch_size: use the BATCH_SIZE=20 constant (was len(all_proxies) ≈ 22)
- Check cycler.is_exhausted() before each batch; catch RuntimeError mid-batch
  and write partial results rather than crashing with nothing
- CIRCUIT_BREAKER_THRESHOLD from env (default 10), matching availability

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-01 12:13:50 +01:00
Deeman
8f2ffd432b fix(admin): correct docker volume mount + pipeline_routes repo root
All checks were successful
CI / test (push) Successful in 50s
CI / tag (push) Successful in 2s
- docker-compose.prod.yml: fix volume mount for all 6 web containers
  from /opt/padelnomics/data (stale) → /data/padelnomics (live supervisor output);
  add LANDING_DIR=/app/data/pipeline/landing so extraction/landing stats work
- pipeline_routes.py: fix _REPO_ROOT parents[5] → parents[4] so workflows.toml
  is found in dev and pipeline overview shows workflow schedules

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-01 11:41:29 +01:00
Deeman
c9dec066f7 fix(admin): mobile UX fixes — contrast, scroll, responsive grids
- CSS: `.nav-mobile a` → `.nav-mobile a:not(.nav-auth-btn)` to fix Sign
  Out button showing slate text instead of white on mobile
- base_admin.html: add `overflow-y: hidden` + `scrollbar-width: none` to
  `.admin-subnav` to eliminate ghost 1px scrollbar on Content tab row
- routes.py: pass `outreach_email=EMAIL_ADDRESSES["outreach"]` to outreach
  template so sending domain is no longer hardcoded
- outreach.html: display dynamic `outreach_email`; replace inline
  `repeat(6,1fr)` grid with responsive `.pipeline-status-grid` (2→3→6 cols)
- index.html: replace inline `repeat(5,1fr)` Lead/Supplier Funnel grids
  with responsive `.funnel-grid` class (2 cols mobile, 5 cols md+)
- pipeline.html: replace inline `repeat(4,1fr)` stat grid with responsive
  `.pipeline-stat-grid` (2 cols mobile, 4 cols md+)
- 4 partials (lead/email/supplier/outreach results): wrap `<table>` in
  `<div style="overflow-x:auto">` so tables scroll on narrow screens

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-01 11:20:46 +01:00
Deeman
fea4f85da3 perf(transform): optimize dim_locations spatial joins via IEJoin + country filters
All checks were successful
CI / test (push) Successful in 51s
CI / tag (push) Successful in 2s
Replace ABS() bbox predicates with BETWEEN in all three spatial CTEs
(nearest_padel, padel_local, tennis_nearby). BETWEEN enables DuckDB's
IEJoin (interval join) which is O((N+M) log M) vs the previous O(N×M)
nested-loop cross-join.
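
The rewrite hinges on a simple algebraic equivalence: `ABS(a - b) <= r` is the same predicate as `a BETWEEN b - r AND b + r`, but only the latter exposes explicit interval bounds that the planner can turn into an IEJoin. A small Python check of the equivalence (illustrative only; the actual change is in the SQL model):

```python
def abs_predicate(lat, other_lat, radius):
    # Original form: the ABS() call hides the interval bounds from the planner.
    return abs(lat - other_lat) <= radius

def between_predicate(lat, other_lat, radius):
    # Rewritten form: lat BETWEEN other_lat - radius AND other_lat + radius.
    return other_lat - radius <= lat <= other_lat + radius

# Logically equivalent for all inputs:
for lat, other, r in [(52.5, 52.6, 0.2), (10.0, 50.0, 1.0), (-3.1, -3.0, 0.05)]:
    assert abs_predicate(lat, other, r) == between_predicate(lat, other, r)
```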

Add country pre-filters to restrict the left side from ~140K global
locations to ~20K rows for padel/tennis CTEs (~8 countries each).

Expected: ~50-200x speedup on the spatial CTE portion of the model.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-01 02:57:05 +01:00
Deeman
2590020014 update sops
All checks were successful
CI / test (push) Successful in 51s
CI / tag (push) Successful in 3s
2026-03-01 01:27:01 +01:00
Deeman
a72f7721bb fix(supervisor): add geonames to workflows.toml
All checks were successful
CI / test (push) Successful in 53s
CI / tag (push) Successful in 3s
stg_population_geonames → dim_locations → location_opportunity_profile
were all 0 rows in prod because the GeoNames extractor was never
scheduled. First run will backfill cities1000 to landing zone.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-01 01:25:38 +01:00
Deeman
849dc8359c feat(affiliate): affiliate programs management + frontmatter bugfix
Centralises retailer config in affiliate_programs table (URL template,
tracking tag, commission %). Products now use program dropdown + product
identifier instead of manual URL baking. URL assembled at redirect time
via build_affiliate_url() — changing a tag propagates to all products
instantly. Backward compatible: legacy baked-URL products fall through
unchanged. Amazon OneLink (configured in Associates dashboard) handles
geo-redirect to local marketplaces with no additional programs needed.

Also fixes _rebuild_article() frontmatter rendering bug.

Commits: fix frontmatter, migration 0027, program CRUD functions,
redirect update, admin CRUD + templates, product form update, tests.
41 tests, all passing. Ruff clean.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-02-28 22:54:28 +01:00
Deeman
ec839478c3 feat(affiliate): add program + URL assembly tests; update CHANGELOG + PROJECT.md
41 tests total (+15). New coverage: get_all_programs(), get_program(),
get_program_by_slug(), build_affiliate_url() (program path, legacy fallback,
no program_id, no program dict), program-based redirect, legacy redirect,
migration seed assertion, ASIN backfill assertion. All ruff checks pass.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-02-28 22:51:26 +01:00
Deeman
47acf4d3df feat(affiliate): product form uses program dropdown + product identifier
Replaces the manual affiliate URL field with a program selector and
product identifier input. JS toggles visibility between program mode and
manual (custom URL) mode. retailer field is auto-populated from the
program name on save. INSERT/UPDATE statements include new program_id
and product_identifier columns. Validation accepts program+ID or manual
URL as the URL source.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-02-28 22:43:10 +01:00
Deeman
53117094ee feat(affiliate): admin CRUD for affiliate programs
Adds program list, create, edit, delete routes with appropriate guards
(delete blocked if products reference the program). Adds "Programs" tab
to the affiliate subnav. New templates: affiliate_programs.html,
affiliate_program_form.html, partials/affiliate_program_results.html.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-02-28 22:32:45 +01:00
Deeman
6076a0b30f feat(affiliate): use build_affiliate_url() in /go/<slug> redirect
Program-based products now get URLs assembled from the template at
redirect time. Changing a program's tracking_tag propagates instantly
to all its products without rebuilding. Legacy products (no program_id)
still use their baked affiliate_url via fallback.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-02-28 22:27:57 +01:00
Deeman
8dbbd0df05 feat(affiliate): add program CRUD functions + build_affiliate_url()
Adds get_all_programs(), get_program(), get_program_by_slug() for admin
CRUD. Adds build_affiliate_url() that assembles URLs from program template
+ product identifier, with fallback to baked affiliate_url for legacy
products. Updates get_product() to JOIN affiliate_programs so _program
dict is available at redirect time. _parse_product() extracts program
fields into nested _program key.
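
A rough sketch of the assemble-or-fall-back logic (field names, the `{identifier}` placeholder, and the tag-appending step are assumptions for illustration, not the repo's actual code):

```python
def build_affiliate_url(product):
    """Assemble a URL from the product's program template, or fall back to
    the legacy baked affiliate_url when no program is attached."""
    program = product.get("_program")
    identifier = product.get("product_identifier")
    if program and identifier:
        url = program["url_template"].format(identifier=identifier)
        tag = program.get("tracking_tag")
        if tag:
            sep = "&" if "?" in url else "?"
            url = f"{url}{sep}tag={tag}"  # tag applied at redirect time
        return url
    return product.get("affiliate_url")  # legacy baked-URL fallback
```

Because the tag is applied at call time rather than baked into stored URLs, editing a program's tracking_tag changes every assembled URL on the next redirect.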

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-02-28 22:23:53 +01:00
Deeman
b1eeb0a0ac feat(affiliate): add affiliate_programs table + migration 0027
Creates affiliate_programs for centralised retailer config (URL template,
tracking tag, commission %). Adds nullable program_id + product_identifier
to affiliate_products for backward compat. Seeds "Amazon" program with
oneLink template. Backfills existing products by extracting ASINs from
baked affiliate_url values.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-02-28 22:23:00 +01:00
Deeman
6aae92fc58 fix(admin): strip YAML frontmatter before mistune in _rebuild_article()
Fixes a bug where manual article previews rendered raw frontmatter
(title:, slug:, etc.) as visible text. Now strips the --- block using
the existing _FRONTMATTER_RE before passing the body to mistune.html().
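
The fix amounts to applying the frontmatter regex before rendering. A minimal sketch (the pattern shown is an assumed shape of _FRONTMATTER_RE, not the repo's actual definition):

```python
import re

# Assumed shape: a leading "---" ... "---" YAML block at the very top.
_FRONTMATTER_RE = re.compile(r"\A---\n.*?\n---\n", re.DOTALL)

def strip_frontmatter(markdown_text):
    """Remove a leading YAML frontmatter block before the markdown renderer
    (e.g. mistune.html()) sees the body; no-op when there is no frontmatter."""
    return _FRONTMATTER_RE.sub("", markdown_text, count=1)
```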

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-02-28 22:17:44 +01:00
30 changed files with 1472 additions and 186 deletions

View File

@@ -32,10 +32,6 @@ LITESTREAM_R2_BUCKET=ENC[AES256_GCM,data:pAqSkoJzsw==,iv:5J1Js7JPH/j1oTmEBdNXjwd
LITESTREAM_R2_ACCESS_KEY_ID=ENC[AES256_GCM,data:e89yGzousImmdO7WVqmRWLJNejDFH5eTaw7G74CyZSw=,iv:bR1jgqSzJlxPA8LMMg2Mc1Lnp01iZgaqa9dgAoV0RpY=,tag:m92xzCP0qaP2onK7ChwA1Q==,type:str]
LITESTREAM_R2_SECRET_ACCESS_KEY=ENC[AES256_GCM,data:yzXeb8c/Y0d+EluY7g6buo4BnFvBDEVblOi7doNgOp3siLvfMmPkjdRLqZzA14ET6CW5vef9i51yijPYwuhnbw==,iv:IYQRZ8SsquUQpsHH3X/iovz2wFskR4iHyvr0arY7Ag4=,tag:9G5lpHloacjQbEhSk9T2pw==,type:str]
LITESTREAM_R2_ENDPOINT=ENC[AES256_GCM,data:qqDLfsPeiWOfwtgpZeItypnYNmIOD07fV0IPlZfphhUFeY0Z/BRpkVXA7nfqQ2M6PmcYKVIlBiBY,iv:hsEBxxv1+fvUY4v3nhBP8puKlu216eAGZDUNBAjibas=,tag:MvnsJ8W3oSrv4ZrWW/p+dg==,type:str]
#ENC[AES256_GCM,data:YGV2exKdGOUkblNZZos=,iv:NuabFM/gNHIzYmDMRZ2tglFYdMPVFuHFGd+AAWvvu6Q=,tag:gZRoNNEmjL9v3nC8j9YkHw==,type:comment]
DUCKDB_PATH=ENC[AES256_GCM,data:GgOEQ5B1KeQrVavhoMU/JGXcVu3H,iv:XY8JiaosxaUDv5PwizrZFWuNKMSOeuE3cfVyp51r++8=,tag:RnoDE5+7WQolFLejfRZ//w==,type:str]
SERVING_DUCKDB_PATH=ENC[AES256_GCM,data:U2X9KmlgnWXM9uCfhHCJ03HMGCLm,iv:KHHdBTq+ct4AG7Jt4zLog/5jbDC7LvHA6KzWNTDS/Yw=,tag:m5uIG/bS4vaBooSYoYa6SA==,type:str]
LANDING_DIR=ENC[AES256_GCM,data:NkEmV8LOwEiN9Sal,iv:mQHBVT6lNoEEEVbl7a5bNN5qoF/LvTyWXQvvkv/z/B0=,tag:IgA5A1nfF91fOBdYxEN71g==,type:str]
#ENC[AES256_GCM,data:jvZYm7ceM4jtNRg=,iv:nuv65SDTZiaVukVZ40seBZevpqP8uiKCgJyQcIrY524=,tag:cq6gB3vmJzJWIXCLHaIc9g==,type:comment]
REPO_DIR=ENC[AES256_GCM,data:ae8i6PpGFaiYFA/gGIhczg==,iv:nmsIRMPJYocIO6Z2Gz4OIzAOvSpdgDYmUaIr2hInFo0=,tag:EmAYG5NujnHg8lPaO/uAnQ==,type:str]
WORKFLOWS_PATH=ENC[AES256_GCM,data:sGU4l68Pbb1thsPyG104mWXWD+zJGTIcR/TqVbPmew==,iv:+xhGkX+ep4kFEAU65ELdDrfjrl/WyuaOi35JI3OB/zM=,tag:brauZhFq8twPXmvhZKjhDQ==,type:str]
@@ -62,7 +58,7 @@ sops_age__list_1__map_enc=-----BEGIN AGE ENCRYPTED FILE-----\nYWdlLWVuY3J5cHRpb2
sops_age__list_1__map_recipient=age1wjepykv3glvsrtegu25tevg7vyn3ngpl607u3yjc9ucay04s045s796msw
sops_age__list_2__map_enc=-----BEGIN AGE ENCRYPTED FILE-----\nYWdlLWVuY3J5cHRpb24ub3JnL3YxCi0+IFgyNTUxOSBFeHhaOURNZnRVMEwxNThu\nUjF4Q0kwUXhTUE1QSzZJbmpubnh3RnpQTmdvCjRmWWxpNkxFUmVGb3NRbnlydW5O\nWEg3ZXJQTU4vcndzS2pUQXY3Q0ttYjAKLS0tIE9IRFJ1c2ZxbGVHa2xTL0swbGN1\nTzgwMThPUDRFTWhuZHJjZUYxOTZrU00KY62qrNBCUQYxwcLMXFEnLkwncxq3BPJB\nKm4NzeHBU87XmPWVrgrKuf+PH1mxJlBsl7Hev8xBTy7l6feiZjLIvQ==\n-----END AGE ENCRYPTED FILE-----\n
sops_age__list_2__map_recipient=age1c783ym2q5x9tv7py5d28uc4k44aguudjn03g97l9nzs00dd9tsrqum8h4d
sops_lastmodified=2026-02-28T17:03:44Z
sops_mac=ENC[AES256_GCM,data:IQ9jpRxVUssaMK+qFcM3nPdzXHkiqp6E+DhEey1TfqUu5GCBNsWeVy9m9A6p9RWhu2NtJV7aKdUeqneuMtD1q5Tnm6L96zuyot2ESnx2N2ssD9ilrDauQxoBJcrJVnGV61CgaCz9458w8BuVUZydn3MoHeRaU7bOBBzQlTI6vZk=,iv:qHqdt3av/KZRQHr/OS/9KdAJUgKlKEDgan7qI3Zzkck=,tag:fOvdO9iRTTF1Siobu2mLqg==,type:str]
sops_lastmodified=2026-03-01T00:26:54Z
sops_mac=ENC[AES256_GCM,data:DdcABGVm9KbAcFrF0iuZlAaugsouNs7Hon2mZISaHs15/2H/Pd9FniXW3KeQ0+/NdZFQkz/h3i3bVFampcpFS1AxuOE5+1/IgWn8sKtaqPc7E9y8g6lxMnwTkUX2z+n/Q2nR8KAcO9IyE0GNjIluMWkxPWQuLzlRYDOjRN4/1e0=,iv:rm+6lXhYu6VUmrdCIrU0BRN2/ooa21Fw1ESWxr7vATg=,tag:GZmLLZf/LQaNeNNAAEg5bA==,type:str]
sops_unencrypted_suffix=_unencrypted
sops_version=3.12.1

View File

@@ -6,6 +6,28 @@ The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.1.0/).
## [Unreleased]
### Changed
- **Per-proxy dead tracking in tiered cycler** — `make_tiered_cycler` now accepts a `proxy_failure_limit` parameter (default 3). Individual proxies that hit the limit are marked dead and permanently skipped by `next_proxy()`. If all proxies in the active tier are dead, `next_proxy()` auto-escalates to the next tier without needing the tier-level threshold. `record_failure(proxy_url)` and `record_success(proxy_url)` accept an optional `proxy_url` argument for per-proxy tracking; callers without `proxy_url` are fully backward-compatible. New `dead_proxy_count()` callable exposed for monitoring.
- `extract/padelnomics_extract/src/padelnomics_extract/proxy.py`: added per-proxy state (`proxy_failure_counts`, `dead_proxies`), updated `next_proxy`/`record_failure`/`record_success`, added `dead_proxy_count`
- `extract/padelnomics_extract/src/padelnomics_extract/playtomic_tenants.py`: `_fetch_page_via_cycler` passes `proxy_url` to `record_success`/`record_failure`
- `extract/padelnomics_extract/src/padelnomics_extract/playtomic_availability.py`: `_worker` returns `(proxy_url, result)` tuple; serial loops in `extract` and `extract_recheck` capture `proxy_url` before passing to `record_success`/`record_failure`
- `web/tests/test_supervisor.py`: 11 new tests in `TestTieredCyclerDeadProxyTracking` covering dead proxy skipping, auto-escalation, `dead_proxy_count`, backward compat, and thread safety
### Added
- **Affiliate programs management** — centralised retailer config (`affiliate_programs` table) with URL template + tracking tag + commission %. Products now use a program dropdown + product identifier (e.g. ASIN) instead of manually baking full URLs. URL is assembled at redirect time via `build_affiliate_url()`, so changing a tag propagates instantly to all products. Legacy products (baked `affiliate_url`) continue to work via fallback. Amazon OneLink configured in the Associates dashboard handles geo-redirect to local marketplaces — no per-country programs needed.
- `web/src/padelnomics/migrations/versions/0027_affiliate_programs.py`: `affiliate_programs` table, nullable `program_id` + `product_identifier` columns on `affiliate_products`, seeds "Amazon" program, backfills ASINs from existing URLs
- `web/src/padelnomics/affiliate.py`: `get_all_programs()`, `get_program()`, `get_program_by_slug()`, `build_affiliate_url()`; `get_product()` JOINs program for redirect assembly; `_parse_product()` extracts `_program` sub-dict
- `web/src/padelnomics/app.py`: `/go/<slug>` uses `build_affiliate_url()` — program-based products get URLs assembled at redirect time
- `web/src/padelnomics/admin/routes.py`: program CRUD routes (list, new, edit, delete — delete blocked if products reference the program); product form updated to program dropdown + identifier; `retailer` auto-populated from program name
- New templates: `admin/affiliate_programs.html`, `admin/affiliate_program_form.html`, `admin/partials/affiliate_program_results.html`
- Updated templates: `admin/affiliate_form.html` (program dropdown + JS toggle), `admin/base_admin.html` (Programs subnav tab)
- 15 new tests in `web/tests/test_affiliate.py` (41 total)
### Fixed
- **Data Platform admin view showing stale/zero row counts** — Docker web containers were mounting `/opt/padelnomics/data` (stale copy) instead of `/data/padelnomics` (live supervisor output). Fixed volume mount in all 6 containers (blue/green × app/worker/scheduler) and added `LANDING_DIR=/app/data/pipeline/landing` so extraction stats and landing zone file stats are visible to the web app.
- **`workflows.toml` never found in dev** — `_REPO_ROOT` in `pipeline_routes.py` used `parents[5]` (one level too far up) instead of `parents[4]`. Workflow schedules now display correctly on the pipeline overview tab in dev.
- **Article preview frontmatter bug** — `_rebuild_article()` in `admin/routes.py` now strips YAML frontmatter before passing markdown to `mistune.html()`, preventing raw `title:`, `slug:` etc. from appearing as visible text in article previews.
### Added
- **Affiliate product system** — "Wirecutter for padel" editorial affiliate cards embedded in articles via `[product:slug]` and `[product-group:category]` markers, baked at build time into static HTML. `/go/<slug>` click-tracking redirect (302, GDPR-compliant daily-rotated IP hash). Admin CRUD (`/admin/affiliate`) with live preview, inline status toggle, HTMX search/filter. Click stats dashboard (pure CSS bar chart, top products/articles/retailers). 10 German equipment review article scaffolds seeded.
- `web/src/padelnomics/migrations/versions/0026_affiliate_products.py`: `affiliate_products` + `affiliate_clicks` tables; `UNIQUE(slug, language)` constraint mirrors articles schema
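
The daily-rotated IP hash mentioned above can be sketched like this (the salt scheme is an assumption; the point is that raw IPs are never stored and hashes cannot be linked across days):

```python
import hashlib
from datetime import datetime, timezone

def daily_ip_hash(ip, salt="app-secret", today=None):
    """GDPR-friendly click dedup: hash the IP with a day-scoped salt so the
    same visitor collapses to one hash per day, and the hash rotates daily."""
    day = (today or datetime.now(timezone.utc).date()).isoformat()
    return hashlib.sha256(f"{salt}:{day}:{ip}".encode()).hexdigest()
```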

View File

@@ -1,7 +1,7 @@
# Padelnomics — Project Tracker
> Move tasks across columns as you work. Add new tasks at the top of the relevant column.
> Last updated: 2026-02-28 (Affiliate product system — editorial gear cards + click tracking).
> Last updated: 2026-02-28 (Affiliate programs management — centralised retailer config + URL template assembly).
---
@@ -133,6 +133,7 @@
- [x] **group_key static article grouping** — migration 0020 adds `group_key TEXT` column; `_sync_static_articles()` auto-upserts `data/content/articles/*.md` on admin page load; `_get_article_list_grouped()` groups by `COALESCE(group_key, url_path)` so EN/DE static cornerstones pair into one row
- [x] **Email-gated report PDF**`reports/` blueprint with email capture gate + PDF download; premium WeasyPrint PDF (full-bleed navy cover, Padelnomics wordmark watermark, gold/teal accents); `make report-pdf` target; EN + DE i18n (26 keys, native German); state-of-padel report moved to `data/content/reports/`
- [x] **Affiliate product system** — "Wirecutter for padel" editorial gear cards embedded in articles via `[product:slug]` / `[product-group:category]` markers, baked at build time; `/go/<slug>` click-tracking redirect (302, GDPR daily-rotated IP hash, rate-limited); admin CRUD with live preview, HTMX filter/search, status toggle; click stats dashboard (pure CSS charts); 10 German equipment review article scaffolds; 26 tests
- [x] **Affiliate programs management**`affiliate_programs` table centralises retailer configs (URL template, tracking tag, commission %); product form uses program dropdown + product identifier (ASIN etc.); `build_affiliate_url()` assembles at redirect time; legacy baked-URL products still work; admin CRUD (delete blocked if products reference program); Amazon OneLink for multi-marketplace; article frontmatter preview bug fixed; 41 tests
### SEO & Legal
- [x] Sitemap (both language variants, `<lastmod>` on all entries)

View File

@@ -1,67 +1,67 @@
---
title: "Padelschläger für Fortgeschrittene: Die besten Modelle 2026"
slug: padelschlaeger-fortgeschrittene-de
language: de
url_path: /padelschlaeger-fortgeschrittene
meta_description: "Die besten Padelschläger für fortgeschrittene und ambitionierte Spieler. High-End-Modelle mit Carbon, Kevlar und ausgereifter Schlagbalance für Spieler ab 3.0."
---
# Padelschläger für Fortgeschrittene: Die besten Modelle 2026
<!-- TODO: Einleitung — wann ist man bereit für einen Fortgeschrittenenschläger? -->
Ab einem gewissen Spielniveau lohnt sich der Griff zu einem anspruchsvolleren Schläger. Wer sauber trifft, kann von einer härteren Bespannung und einer präziseren Balance profitieren. Die Schläger in dieser Liste sind kein Selbstläufer — aber in den richtigen Händen ein echter Vorteil.
---
## Top-Schläger für Fortgeschrittene im Überblick
[product-group:racket]
---
## Carbon, Kevlar, Glasfaser: Was steckt drin?
<!-- TODO: Materialüberblick mit Vor- und Nachteilen -->
### Carbon-Rahmen
<!-- TODO -->
### 3K vs. 12K Carbon
<!-- TODO -->
### Kevlar-Einlagen
<!-- TODO -->
---
## Testbericht: Unser Empfehlungsschläger
[product:platzhalter-fortgeschrittene-schlaeger-amazon]
<!-- TODO: Praxistest -->
---
## Häufige Fragen
<details>
<summary>Ab welcher Spielstufe lohnt sich ein Fortgeschrittenenschläger?</summary>
<!-- TODO -->
Wer regelmäßig spielt (2–3 Mal pro Woche), seit mindestens einem Jahr dabei ist und an Taktik und Technik arbeitet, kann von einem hochwertigeren Schläger profitieren. Für gelegentliche Spieler ist der Unterschied zu einem Mittelklassemodell kaum spürbar.
</details>
<details>
<summary>Müssen Fortgeschrittenenschläger teurer sein?</summary>
<!-- TODO -->
Nicht zwingend. Es gibt ausgezeichnete Modelle im 150–200-Euro-Segment, die professionell verarbeitete Carbon-Elemente enthalten. Alles über 300 Euro richtet sich meist an Spieler mit Wettkampfambitionen.
</details>

View File

@@ -60,9 +60,10 @@ services:
environment:
- DATABASE_PATH=/app/data/app.db
- SERVING_DUCKDB_PATH=/app/data/pipeline/analytics.duckdb
- LANDING_DIR=/app/data/pipeline/landing
volumes:
- app-data:/app/data
- /opt/padelnomics/data:/app/data/pipeline:ro
- /data/padelnomics:/app/data/pipeline:ro
networks:
- net
healthcheck:
@@ -82,9 +83,10 @@ services:
environment:
- DATABASE_PATH=/app/data/app.db
- SERVING_DUCKDB_PATH=/app/data/pipeline/analytics.duckdb
- LANDING_DIR=/app/data/pipeline/landing
volumes:
- app-data:/app/data
- /opt/padelnomics/data:/app/data/pipeline:ro
- /data/padelnomics:/app/data/pipeline:ro
networks:
- net
@@ -98,9 +100,10 @@ services:
environment:
- DATABASE_PATH=/app/data/app.db
- SERVING_DUCKDB_PATH=/app/data/pipeline/analytics.duckdb
- LANDING_DIR=/app/data/pipeline/landing
volumes:
- app-data:/app/data
- /opt/padelnomics/data:/app/data/pipeline:ro
- /data/padelnomics:/app/data/pipeline:ro
networks:
- net
@@ -115,9 +118,10 @@ services:
environment:
- DATABASE_PATH=/app/data/app.db
- SERVING_DUCKDB_PATH=/app/data/pipeline/analytics.duckdb
- LANDING_DIR=/app/data/pipeline/landing
volumes:
- app-data:/app/data
- /opt/padelnomics/data:/app/data/pipeline:ro
- /data/padelnomics:/app/data/pipeline:ro
networks:
- net
healthcheck:
@@ -137,9 +141,10 @@ services:
environment:
- DATABASE_PATH=/app/data/app.db
- SERVING_DUCKDB_PATH=/app/data/pipeline/analytics.duckdb
- LANDING_DIR=/app/data/pipeline/landing
volumes:
- app-data:/app/data
- /opt/padelnomics/data:/app/data/pipeline:ro
- /data/padelnomics:/app/data/pipeline:ro
networks:
- net
@@ -153,9 +158,10 @@ services:
environment:
- DATABASE_PATH=/app/data/app.db
- SERVING_DUCKDB_PATH=/app/data/pipeline/analytics.duckdb
- LANDING_DIR=/app/data/pipeline/landing
volumes:
- app-data:/app/data
- /opt/padelnomics/data:/app/data/pipeline:ro
- /data/padelnomics:/app/data/pipeline:ro
networks:
- net

View File

@@ -213,9 +213,10 @@ def _fetch_venues_parallel(
completed_count = 0
lock = threading.Lock()
def _worker(tenant_id: str) -> dict | None:
def _worker(tenant_id: str) -> tuple[str | None, dict | None]:
proxy_url = cycler["next_proxy"]()
return _fetch_venue_availability(tenant_id, start_min_str, start_max_str, proxy_url)
result = _fetch_venue_availability(tenant_id, start_min_str, start_max_str, proxy_url)
return proxy_url, result
with ThreadPoolExecutor(max_workers=worker_count) as pool:
for batch_start in range(0, len(tenant_ids), PARALLEL_BATCH_SIZE):
@@ -231,17 +232,17 @@ def _fetch_venues_parallel(
batch_futures = {pool.submit(_worker, tid): tid for tid in batch}
for future in as_completed(batch_futures):
result = future.result()
proxy_url, result = future.result()
with lock:
completed_count += 1
if result is not None:
venues_data.append(result)
cycler["record_success"]()
cycler["record_success"](proxy_url)
if on_result is not None:
on_result(result)
else:
venues_errored += 1
cycler["record_failure"]()
cycler["record_failure"](proxy_url)
if completed_count % 500 == 0:
logger.info(
@@ -336,16 +337,17 @@ def extract(
else:
logger.info("Serial mode: 1 worker, %d venues", len(venues_to_process))
for i, tenant_id in enumerate(venues_to_process):
proxy_url = cycler["next_proxy"]()
result = _fetch_venue_availability(
tenant_id, start_min_str, start_max_str, cycler["next_proxy"](),
tenant_id, start_min_str, start_max_str, proxy_url,
)
if result is not None:
new_venues_data.append(result)
cycler["record_success"]()
cycler["record_success"](proxy_url)
_on_result(result)
else:
venues_errored += 1
cycler["record_failure"]()
cycler["record_failure"](proxy_url)
if cycler["is_exhausted"]():
logger.error("All proxy tiers exhausted — writing partial results")
break
@@ -500,13 +502,14 @@ def extract_recheck(
venues_data = []
venues_errored = 0
for tid in venues_to_recheck:
result = _fetch_venue_availability(tid, start_min_str, start_max_str, cycler["next_proxy"]())
proxy_url = cycler["next_proxy"]()
result = _fetch_venue_availability(tid, start_min_str, start_max_str, proxy_url)
if result is not None:
venues_data.append(result)
cycler["record_success"]()
cycler["record_success"](proxy_url)
else:
venues_errored += 1
cycler["record_failure"]()
cycler["record_failure"](proxy_url)
if cycler["is_exhausted"]():
logger.error("All proxy tiers exhausted — writing partial recheck results")
break

View File

@@ -10,11 +10,11 @@ API notes (discovered 2026-02):
- `size=100` is the maximum effective page size
- ~14K venues globally as of Feb 2026
Parallel mode: when PROXY_URLS is set, fires batch_size = len(proxy_urls)
pages concurrently. Each page gets its own fresh session + proxy. Pages beyond
the last one return empty lists (safe — just triggers the done condition).
Without proxies, falls back to single-threaded with THROTTLE_SECONDS between
pages.
Parallel mode: when proxy tiers are configured, fires BATCH_SIZE pages
concurrently. Each page gets its own fresh session + proxy from the tiered
cycler. On failure the cycler escalates through free → datacenter →
residential tiers. Without proxies, falls back to single-threaded with
THROTTLE_SECONDS between pages.
Rate: 1 req / 2 s per IP (see docs/data-sources-inventory.md §1.2).
@@ -22,6 +22,7 @@ Landing: {LANDING_DIR}/playtomic/{year}/{month}/tenants.jsonl.gz
"""
import json
import os
import sqlite3
import time
from concurrent.futures import ThreadPoolExecutor, as_completed
@@ -31,7 +32,7 @@ from pathlib import Path
import niquests
from ._shared import HTTP_TIMEOUT_SECONDS, run_extractor, setup_logging, ua_for_proxy
from .proxy import load_proxy_tiers, make_round_robin_cycler
from .proxy import load_proxy_tiers, make_tiered_cycler
from .utils import compress_jsonl_atomic, landing_path
logger = setup_logging("padelnomics.extract.playtomic_tenants")
@@ -42,6 +43,9 @@ PLAYTOMIC_TENANTS_URL = "https://api.playtomic.io/v1/tenants"
THROTTLE_SECONDS = 2
PAGE_SIZE = 100
MAX_PAGES = 500 # safety bound — ~50K venues max, well above current ~14K
BATCH_SIZE = 20 # concurrent pages per batch (fixed, independent of proxy count)
CIRCUIT_BREAKER_THRESHOLD = int(os.environ.get("CIRCUIT_BREAKER_THRESHOLD") or "10")
MAX_PAGE_ATTEMPTS = 5 # max retries per individual page before giving up
def _fetch_one_page(proxy_url: str | None, page: int) -> tuple[int, list[dict]]:
@@ -61,22 +65,57 @@ def _fetch_one_page(proxy_url: str | None, page: int) -> tuple[int, list[dict]]:
return (page, tenants)
def _fetch_pages_parallel(pages: list[int], next_proxy) -> list[tuple[int, list[dict]]]:
"""Fetch multiple pages concurrently. Returns [(page_num, tenants_list), ...]."""
def _fetch_page_via_cycler(cycler: dict, page: int) -> tuple[int, list[dict]]:
"""Fetch a single page, retrying across proxy tiers via the circuit breaker.
On each attempt, pulls the next proxy from the active tier. Records
success/failure so the circuit breaker can escalate tiers. Raises
RuntimeError if all tiers are exhausted or MAX_PAGE_ATTEMPTS is exceeded.
"""
last_exc: Exception | None = None
for attempt in range(MAX_PAGE_ATTEMPTS):
proxy_url = cycler["next_proxy"]()
if proxy_url is None: # all tiers exhausted
raise RuntimeError(f"All proxy tiers exhausted fetching page {page}")
try:
result = _fetch_one_page(proxy_url, page)
cycler["record_success"](proxy_url)
return result
except Exception as exc:
last_exc = exc
logger.warning(
"Page %d attempt %d/%d failed (proxy=%s): %s",
page,
attempt + 1,
MAX_PAGE_ATTEMPTS,
proxy_url,
exc,
)
cycler["record_failure"](proxy_url)
if cycler["is_exhausted"]():
raise RuntimeError(f"All proxy tiers exhausted fetching page {page}") from exc
raise RuntimeError(f"Page {page} failed after {MAX_PAGE_ATTEMPTS} attempts") from last_exc
def _fetch_pages_parallel(pages: list[int], cycler: dict) -> list[tuple[int, list[dict]]]:
"""Fetch multiple pages concurrently using the tiered cycler.
Returns [(page_num, tenants_list), ...]. Raises if any page exhausts all tiers.
"""
with ThreadPoolExecutor(max_workers=len(pages)) as pool:
futures = [pool.submit(_fetch_one_page, next_proxy(), p) for p in pages]
futures = [pool.submit(_fetch_page_via_cycler, cycler, p) for p in pages]
return [f.result() for f in as_completed(futures)]
def extract(
landing_dir: Path,
year_month: str, # noqa: ARG001 — unused; tenants uses ISO week partition instead
year_month: str, # noqa: ARG001 — unused; tenants uses daily partition instead
conn: sqlite3.Connection,
session: niquests.Session,
) -> dict:
"""Fetch all Playtomic venues via global pagination. Returns run metrics.
Partitioned by ISO week (e.g. 2026/W09) so each weekly run produces a
Partitioned by day (e.g. 2026/03/01) so each daily run produces a
fresh file. _load_tenant_ids() in playtomic_availability globs across all
partitions and picks the most recent one.
"""
@@ -89,12 +128,16 @@ def extract(
return {"files_written": 0, "files_skipped": 1, "bytes_written": 0}
tiers = load_proxy_tiers()
all_proxies = [url for tier in tiers for url in tier]
cycler = make_tiered_cycler(tiers, CIRCUIT_BREAKER_THRESHOLD) if tiers else None
batch_size = BATCH_SIZE if cycler else 1
if cycler:
logger.info(
"Parallel mode: %d pages/batch, %d tier(s), threshold=%d",
batch_size,
cycler["tier_count"](),
CIRCUIT_BREAKER_THRESHOLD,
)
else:
logger.info("Serial mode: 1 page at a time (no proxies)")
@@ -104,15 +147,33 @@ def extract(
done = False
while not done and page < MAX_PAGES:
if cycler and cycler["is_exhausted"]():
logger.error(
"All proxy tiers exhausted — stopping at page %d (%d venues collected)",
page,
len(all_tenants),
)
break
batch_end = min(page + batch_size, MAX_PAGES)
pages_to_fetch = list(range(page, batch_end))
if cycler and len(pages_to_fetch) > 1:
logger.info(
"Fetching pages %d-%d in parallel (%d workers, total so far: %d)",
page,
batch_end - 1,
len(pages_to_fetch),
len(all_tenants),
)
try:
results = _fetch_pages_parallel(pages_to_fetch, cycler)
except RuntimeError:
logger.error(
"Proxy tiers exhausted mid-batch — writing partial results (%d venues)",
len(all_tenants),
)
break
else:
# Serial: reuse the shared session, throttle between pages
page_num = pages_to_fetch[0]
@@ -126,7 +187,7 @@ def extract(
)
results = [(page_num, tenants)]
# Process pages in order so done-detection on < PAGE_SIZE is deterministic
for p, tenants in sorted(results):
new_count = 0
for tenant in tenants:
@@ -137,7 +198,11 @@ def extract(
new_count += 1
logger.info(
"page=%d got=%d new=%d total=%d",
p,
len(tenants),
new_count,
len(all_tenants),
)
# Last page — fewer than PAGE_SIZE results means we've exhausted the list
@@ -146,7 +211,7 @@ def extract(
break
page = batch_end
if not cycler:
time.sleep(THROTTLE_SECONDS)
# Write each tenant as a JSONL line, then compress atomically


@@ -134,8 +134,8 @@ def make_sticky_selector(proxy_urls: list[str]):
return select_proxy
def make_tiered_cycler(tiers: list[list[str]], threshold: int, proxy_failure_limit: int = 3) -> dict:
"""Thread-safe N-tier proxy cycler with circuit breaker and per-proxy dead tracking.
Uses tiers[0] until consecutive failures >= threshold, then escalates
to tiers[1], then tiers[2], etc. Once all tiers are exhausted,
@@ -144,13 +144,21 @@ def make_tiered_cycler(tiers: list[list[str]], threshold: int) -> dict:
Failure counter resets on each escalation — the new tier gets a fresh start.
Once exhausted, further record_failure() calls are no-ops.
Per-proxy dead tracking (when proxy_failure_limit > 0):
Individual proxies are marked dead after proxy_failure_limit failures and
skipped by next_proxy(). If all proxies in the active tier are dead,
next_proxy() auto-escalates to the next tier. Both mechanisms coexist:
per-proxy dead tracking removes broken individuals; tier-level threshold
catches systemic failure even before any single proxy hits the limit.
Returns a dict of callables:
next_proxy() -> str | None — URL from active tier (skips dead), or None
record_success(proxy_url=None) -> None — resets consecutive failure counter
record_failure(proxy_url=None) -> bool — True if just escalated to next tier
is_exhausted() -> bool — True if all tiers exhausted
active_tier_index() -> int — 0-based index of current tier
tier_count() -> int — total number of tiers
dead_proxy_count() -> int — number of individual proxies marked dead
Edge cases:
Empty tiers list: next_proxy() always returns None, is_exhausted() True.
@@ -158,28 +166,75 @@ def make_tiered_cycler(tiers: list[list[str]], threshold: int) -> dict:
"""
assert threshold > 0, f"threshold must be positive, got {threshold}"
assert isinstance(tiers, list), f"tiers must be a list, got {type(tiers)}"
assert proxy_failure_limit >= 0, f"proxy_failure_limit must be >= 0, got {proxy_failure_limit}"
lock = threading.Lock()
cycles = [itertools.cycle(t) for t in tiers]
state = {
"active_tier": 0,
"consecutive_failures": 0,
"proxy_failure_counts": {}, # proxy_url -> int
"dead_proxies": set(), # proxy URLs marked dead
}
def next_proxy() -> str | None:
with lock:
# Try each remaining tier (bounded: at most len(tiers) escalations)
for _ in range(len(tiers) + 1):
idx = state["active_tier"]
if idx >= len(cycles):
return None
tier_proxies = tiers[idx]
tier_len = len(tier_proxies)
# Find a live proxy in this tier (bounded: try each proxy at most once)
for _ in range(tier_len):
candidate = next(cycles[idx])
if candidate not in state["dead_proxies"]:
return candidate
# All proxies in this tier are dead — auto-escalate
state["consecutive_failures"] = 0
state["active_tier"] += 1
new_idx = state["active_tier"]
if new_idx < len(tiers):
logger.warning(
"All proxies in tier %d are dead — auto-escalating to tier %d/%d",
idx + 1,
new_idx + 1,
len(tiers),
)
else:
logger.error(
"All proxies in all %d tier(s) are dead — no more fallbacks",
len(tiers),
)
return None # safety fallback
def record_success(proxy_url: str | None = None) -> None:
with lock:
state["consecutive_failures"] = 0
if proxy_url is not None:
state["proxy_failure_counts"][proxy_url] = 0
def record_failure(proxy_url: str | None = None) -> bool:
"""Increment failure counter. Returns True if just escalated to next tier."""
with lock:
# Per-proxy dead tracking (additional to tier-level circuit breaker)
if proxy_url is not None and proxy_failure_limit > 0:
count = state["proxy_failure_counts"].get(proxy_url, 0) + 1
state["proxy_failure_counts"][proxy_url] = count
if count >= proxy_failure_limit and proxy_url not in state["dead_proxies"]:
state["dead_proxies"].add(proxy_url)
logger.warning(
"Proxy %s marked dead after %d consecutive failures",
proxy_url,
count,
)
# Tier-level circuit breaker (existing behavior)
idx = state["active_tier"]
if idx >= len(tiers):
# Already exhausted — no-op
@@ -219,6 +274,10 @@ def make_tiered_cycler(tiers: list[list[str]], threshold: int) -> dict:
def tier_count() -> int:
return len(tiers)
def dead_proxy_count() -> int:
with lock:
return len(state["dead_proxies"])
return {
"next_proxy": next_proxy,
"record_success": record_success,
@@ -226,4 +285,5 @@ def make_tiered_cycler(tiers: list[list[str]], threshold: int) -> dict:
"is_exhausted": is_exhausted,
"active_tier_index": active_tier_index,
"tier_count": tier_count,
"dead_proxy_count": dead_proxy_count,
}
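The cycler's closure-based contract can be exercised in isolation. The sketch below is a minimal standalone reimplementation of the documented behavior (tier-level threshold plus per-proxy dead tracking), under the assumption that the production version in proxy.py adds locking details and logging not shown here:

```python
import itertools


def make_tiered_cycler_sketch(tiers, threshold, proxy_failure_limit=3):
    """Minimal sketch of the documented cycler contract (not the production code)."""
    cycles = [itertools.cycle(t) for t in tiers]
    state = {"tier": 0, "fails": 0, "counts": {}, "dead": set()}

    def next_proxy():
        for _ in range(len(tiers) + 1):
            idx = state["tier"]
            if idx >= len(tiers):
                return None
            # Try each proxy in the active tier at most once, skipping dead ones.
            for _ in range(len(tiers[idx])):
                candidate = next(cycles[idx])
                if candidate not in state["dead"]:
                    return candidate
            # Whole tier dead -> auto-escalate with a fresh failure counter.
            state["tier"] += 1
            state["fails"] = 0
        return None

    def record_failure(proxy_url=None):
        # Per-proxy dead tracking, in addition to the tier-level breaker.
        if proxy_url is not None and proxy_failure_limit > 0:
            state["counts"][proxy_url] = state["counts"].get(proxy_url, 0) + 1
            if state["counts"][proxy_url] >= proxy_failure_limit:
                state["dead"].add(proxy_url)
        if state["tier"] >= len(tiers):
            return False  # already exhausted -> no-op
        state["fails"] += 1
        if state["fails"] >= threshold:
            state["tier"] += 1
            state["fails"] = 0
            return True
        return False

    return {
        "next_proxy": next_proxy,
        "record_failure": record_failure,
        "is_exhausted": lambda: state["tier"] >= len(tiers),
    }


cycler = make_tiered_cycler_sketch([["p1", "p2"], ["q1"]], threshold=10,
                                   proxy_failure_limit=2)
# Kill p1 and p2 individually; next_proxy() should auto-escalate to tier 2
# even though the tier-level threshold (10) was never reached.
for url in ("p1", "p1", "p2", "p2"):
    cycler["record_failure"](url)
assert cycler["next_proxy"]() == "q1"
```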


@@ -21,6 +21,10 @@ schedule = "monthly"
module = "padelnomics_extract.eurostat"
schedule = "monthly"
[geonames]
module = "padelnomics_extract.geonames"
schedule = "monthly"
[playtomic_tenants]
module = "padelnomics_extract.playtomic_tenants"
schedule = "daily"


@@ -19,8 +19,10 @@
-- 4. Country-level income (global fallback from stg_income / ilc_di03)
--
-- Distance calculations use ST_Distance_Sphere (DuckDB spatial extension).
-- Spatial joins use BETWEEN predicates (not ABS()) to enable DuckDB's IEJoin
-- (interval join) optimization: O((N+M) log M) vs O(N×M) nested-loop.
-- Country pre-filters restrict the left side to ~20K rows for padel/tennis CTEs
-- (~8 countries each), down from ~140K global locations.
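The bounding-box radii quoted in these comments map degrees to kilometres via roughly 111.32 km per degree of latitude; a quick sanity check of the figures above:

```python
KM_PER_DEG_LAT = 111.32  # mean km per degree of latitude

for deg, label in [(0.5, "nearest-court bbox"),
                   (0.05, "local-supply bbox"),
                   (0.23, "tennis bbox")]:
    print(f"{label}: ±{deg}° ≈ {deg * KM_PER_DEG_LAT:.1f} km")
# 0.5° ≈ 55.7 km, 0.05° ≈ 5.6 km, 0.23° ≈ 25.6 km — matching the ~55/5/25 km comments.
# A degree of longitude spans cos(latitude) fewer km, so the east-west filter is
# tighter in km at high latitudes; the bbox is only a pre-filter, and the exact
# distance is still computed with ST_Distance_Sphere afterwards.
```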
MODEL (
name foundation.dim_locations,
@@ -147,6 +149,8 @@ padel_courts AS (
WHERE lat IS NOT NULL AND lon IS NOT NULL
),
-- Nearest padel court distance per location (bbox pre-filter → exact sphere distance)
-- BETWEEN enables DuckDB IEJoin (O((N+M) log M)) vs ABS() nested-loop (O(N×M)).
-- Country pre-filter reduces left side from ~140K to ~20K rows (padel is ~8 countries).
nearest_padel AS (
SELECT
l.geoname_id,
@@ -158,9 +162,12 @@ nearest_padel AS (
) AS nearest_padel_court_km
FROM locations l
JOIN padel_courts p
-- ~55km bounding box pre-filter; BETWEEN triggers IEJoin optimization
ON l.lat BETWEEN p.lat - 0.5 AND p.lat + 0.5
AND l.lon BETWEEN p.lon - 0.5 AND p.lon + 0.5
WHERE l.country_code IN (
SELECT DISTINCT country_code FROM padel_courts WHERE country_code IS NOT NULL
)
GROUP BY l.geoname_id
),
-- Padel venues within 5km of each location (counts as "local padel supply")
@@ -170,24 +177,35 @@ padel_local AS (
COUNT(*) AS padel_venue_count
FROM locations l
JOIN padel_courts p
-- ~5km bbox pre-filter; BETWEEN triggers IEJoin optimization
ON l.lat BETWEEN p.lat - 0.05 AND p.lat + 0.05
AND l.lon BETWEEN p.lon - 0.05 AND p.lon + 0.05
WHERE l.country_code IN (
SELECT DISTINCT country_code FROM padel_courts WHERE country_code IS NOT NULL
)
AND ST_Distance_Sphere(
ST_Point(l.lon, l.lat),
ST_Point(p.lon, p.lat)
) / 1000.0 <= 5.0
GROUP BY l.geoname_id
),
-- Tennis courts within 25km of each location (sports culture proxy)
-- Country pre-filter reduces left side from ~140K to ~20K rows (tennis courts are European only).
tennis_nearby AS (
SELECT
l.geoname_id,
COUNT(*) AS tennis_courts_within_25km
FROM locations l
JOIN staging.stg_tennis_courts t
-- ~25km bbox pre-filter; BETWEEN triggers IEJoin optimization
ON l.lat BETWEEN t.lat - 0.23 AND t.lat + 0.23
AND l.lon BETWEEN t.lon - 0.23 AND t.lon + 0.23
WHERE l.country_code IN (
SELECT DISTINCT country_code
FROM staging.stg_tennis_courts
WHERE country_code IS NOT NULL
)
AND ST_Distance_Sphere(
ST_Point(l.lon, l.lat),
ST_Point(t.lon, t.lat)
) / 1000.0 <= 25.0


@@ -49,7 +49,7 @@ _LANDING_DIR = os.environ.get("LANDING_DIR", "data/landing")
_SERVING_DUCKDB_PATH = os.environ.get("SERVING_DUCKDB_PATH", "data/analytics.duckdb")
# Repo root: web/src/padelnomics/admin/ → up 4 levels
_REPO_ROOT = Path(__file__).resolve().parents[4]
_WORKFLOWS_TOML = _REPO_ROOT / "infra" / "supervisor" / "workflows.toml"
# A "running" row older than this is considered stale/crashed.


@@ -2769,7 +2769,10 @@ async def _rebuild_article(article_id: int):
md_path = Path("data/content/articles") / f"{article['slug']}.md"
if not md_path.exists():
return
raw = md_path.read_text()
m = _FRONTMATTER_RE.match(raw)
body = raw[m.end():] if m else raw
body_html = mistune.html(body)
lang = article.get("language", "en") if hasattr(article, "get") else "en"
body_html = await bake_scenario_cards(body_html, lang=lang)
body_html = await bake_product_cards(body_html, lang=lang)
@@ -3034,6 +3037,7 @@ async def outreach():
current_search=search,
current_follow_up=follow_up,
page=page,
outreach_email=EMAIL_ADDRESSES["outreach"],
)
@@ -3254,6 +3258,210 @@ async def outreach_import():
AFFILIATE_CATEGORIES = ("racket", "ball", "shoe", "bag", "grip", "eyewear", "accessory")
AFFILIATE_STATUSES = ("draft", "active", "archived")
AFFILIATE_PROGRAM_STATUSES = ("active", "inactive")
# ── Affiliate Programs ────────────────────────────────────────────────────────
def _form_to_program(form) -> dict:
"""Parse affiliate program form values into a data dict."""
commission_str = form.get("commission_pct", "").strip()
commission_pct = 0.0
if commission_str:
try:
commission_pct = float(commission_str.replace(",", "."))
except ValueError:
commission_pct = 0.0
return {
"name": form.get("name", "").strip(),
"slug": form.get("slug", "").strip(),
"url_template": form.get("url_template", "").strip(),
"tracking_tag": form.get("tracking_tag", "").strip(),
"commission_pct": commission_pct,
"homepage_url": form.get("homepage_url", "").strip(),
"status": form.get("status", "active").strip(),
"notes": form.get("notes", "").strip(),
}
@bp.route("/affiliate/programs")
@role_required("admin")
async def affiliate_programs():
"""Affiliate programs list — full page."""
from ..affiliate import get_all_programs
programs = await get_all_programs()
return await render_template(
"admin/affiliate_programs.html",
admin_page="affiliate_programs",
programs=programs,
)
@bp.route("/affiliate/programs/results")
@role_required("admin")
async def affiliate_program_results():
"""HTMX partial: program rows."""
from ..affiliate import get_all_programs
programs = await get_all_programs()
return await render_template(
"admin/partials/affiliate_program_results.html",
programs=programs,
)
@bp.route("/affiliate/programs/new", methods=["GET", "POST"])
@role_required("admin")
@csrf_protect
async def affiliate_program_new():
"""Create an affiliate program."""
if request.method == "POST":
form = await request.form
data = _form_to_program(form)
if not data["name"] or not data["slug"] or not data["url_template"]:
await flash("Name, slug, and URL template are required.", "error")
return await render_template(
"admin/affiliate_program_form.html",
admin_page="affiliate_programs",
data=data,
editing=False,
program_statuses=AFFILIATE_PROGRAM_STATUSES,
)
existing = await fetch_one(
"SELECT id FROM affiliate_programs WHERE slug = ?", (data["slug"],)
)
if existing:
await flash(f"Slug '{data['slug']}' already exists.", "error")
return await render_template(
"admin/affiliate_program_form.html",
admin_page="affiliate_programs",
data=data,
editing=False,
program_statuses=AFFILIATE_PROGRAM_STATUSES,
)
await execute(
"""INSERT INTO affiliate_programs
(name, slug, url_template, tracking_tag, commission_pct,
homepage_url, status, notes)
VALUES (?, ?, ?, ?, ?, ?, ?, ?)""",
(
data["name"], data["slug"], data["url_template"],
data["tracking_tag"], data["commission_pct"],
data["homepage_url"], data["status"], data["notes"],
),
)
await flash(f"Program '{data['name']}' created.", "success")
return redirect(url_for("admin.affiliate_programs"))
return await render_template(
"admin/affiliate_program_form.html",
admin_page="affiliate_programs",
data={},
editing=False,
program_statuses=AFFILIATE_PROGRAM_STATUSES,
)
@bp.route("/affiliate/programs/<int:program_id>/edit", methods=["GET", "POST"])
@role_required("admin")
@csrf_protect
async def affiliate_program_edit(program_id: int):
"""Edit an affiliate program."""
program = await fetch_one(
"SELECT * FROM affiliate_programs WHERE id = ?", (program_id,)
)
if not program:
await flash("Program not found.", "error")
return redirect(url_for("admin.affiliate_programs"))
if request.method == "POST":
form = await request.form
data = _form_to_program(form)
if not data["name"] or not data["slug"] or not data["url_template"]:
await flash("Name, slug, and URL template are required.", "error")
return await render_template(
"admin/affiliate_program_form.html",
admin_page="affiliate_programs",
data={**dict(program), **data},
editing=True,
program_id=program_id,
program_statuses=AFFILIATE_PROGRAM_STATUSES,
)
if data["slug"] != program["slug"]:
collision = await fetch_one(
"SELECT id FROM affiliate_programs WHERE slug = ? AND id != ?",
(data["slug"], program_id),
)
if collision:
await flash(f"Slug '{data['slug']}' already exists.", "error")
return await render_template(
"admin/affiliate_program_form.html",
admin_page="affiliate_programs",
data={**dict(program), **data},
editing=True,
program_id=program_id,
program_statuses=AFFILIATE_PROGRAM_STATUSES,
)
await execute(
"""UPDATE affiliate_programs
SET name=?, slug=?, url_template=?, tracking_tag=?, commission_pct=?,
homepage_url=?, status=?, notes=?, updated_at=datetime('now')
WHERE id=?""",
(
data["name"], data["slug"], data["url_template"],
data["tracking_tag"], data["commission_pct"],
data["homepage_url"], data["status"], data["notes"],
program_id,
),
)
await flash(f"Program '{data['name']}' updated.", "success")
return redirect(url_for("admin.affiliate_programs"))
return await render_template(
"admin/affiliate_program_form.html",
admin_page="affiliate_programs",
data=dict(program),
editing=True,
program_id=program_id,
program_statuses=AFFILIATE_PROGRAM_STATUSES,
)
@bp.route("/affiliate/programs/<int:program_id>/delete", methods=["POST"])
@role_required("admin")
@csrf_protect
async def affiliate_program_delete(program_id: int):
"""Delete an affiliate program — blocked if products reference it."""
program = await fetch_one(
"SELECT name FROM affiliate_programs WHERE id = ?", (program_id,)
)
if not program:
return redirect(url_for("admin.affiliate_programs"))
product_count = await fetch_one(
"SELECT COUNT(*) AS cnt FROM affiliate_products WHERE program_id = ?",
(program_id,),
)
count = product_count["cnt"] if product_count else 0
if count > 0:
await flash(
f"Cannot delete '{program['name']}': {count} product(s) reference it. "
"Reassign or remove those products first.",
"error",
)
return redirect(url_for("admin.affiliate_programs"))
await execute("DELETE FROM affiliate_programs WHERE id = ?", (program_id,))
await flash(f"Program '{program['name']}' deleted.", "success")
return redirect(url_for("admin.affiliate_programs"))
def _form_to_product(form) -> dict:
@@ -3279,13 +3487,26 @@ def _form_to_product(form) -> dict:
pros = json.dumps([line.strip() for line in pros_raw.splitlines() if line.strip()])
cons = json.dumps([line.strip() for line in cons_raw.splitlines() if line.strip()])
# Program-based URL vs manual URL.
# When a program is selected, product_identifier holds the ASIN/path;
# affiliate_url is cleared. Manual mode is the reverse.
program_id_str = form.get("program_id", "").strip()
program_id = int(program_id_str) if program_id_str and program_id_str != "0" else None
product_identifier = form.get("product_identifier", "").strip()
affiliate_url = form.get("affiliate_url", "").strip()
# retailer is auto-populated from program name on save (kept for display/filter)
retailer = form.get("retailer", "").strip()
return {
"slug": form.get("slug", "").strip(),
"name": form.get("name", "").strip(),
"brand": form.get("brand", "").strip(),
"category": form.get("category", "accessory").strip(),
"retailer": retailer,
"program_id": program_id,
"product_identifier": product_identifier,
"affiliate_url": affiliate_url,
"image_url": form.get("image_url", "").strip(),
"price_cents": price_cents,
"currency": "EUR",
@@ -3403,14 +3624,15 @@ async def affiliate_preview():
@csrf_protect
async def affiliate_new():
"""Create an affiliate product."""
from ..affiliate import get_all_programs, get_distinct_retailers
if request.method == "POST":
form = await request.form
data = _form_to_product(form)
has_url = bool(data["program_id"] and data["product_identifier"]) or bool(data["affiliate_url"])
if not data["slug"] or not data["name"] or not has_url:
await flash("Slug, name, and either a program+product ID or manual URL are required.", "error")
return await render_template(
"admin/affiliate_form.html",
admin_page="affiliate",
@@ -3419,6 +3641,7 @@ async def affiliate_new():
categories=AFFILIATE_CATEGORIES,
statuses=AFFILIATE_STATUSES,
retailers=await get_distinct_retailers(),
programs=await get_all_programs(status="active"),
)
existing = await fetch_one(
@@ -3435,17 +3658,27 @@ async def affiliate_new():
categories=AFFILIATE_CATEGORIES,
statuses=AFFILIATE_STATUSES,
retailers=await get_distinct_retailers(),
programs=await get_all_programs(status="active"),
)
# Auto-populate retailer from program name if not manually set
if data["program_id"] and not data["retailer"]:
prog = await fetch_one(
"SELECT name FROM affiliate_programs WHERE id = ?", (data["program_id"],)
)
if prog:
data["retailer"] = prog["name"]
await execute(
"""INSERT INTO affiliate_products
(slug, name, brand, category, retailer, program_id, product_identifier,
affiliate_url, image_url, price_cents, currency, rating, pros, cons,
description, cta_label, status, language, sort_order)
VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)""",
(
data["slug"], data["name"], data["brand"], data["category"],
data["retailer"], data["program_id"], data["product_identifier"],
data["affiliate_url"], data["image_url"],
data["price_cents"], data["currency"], data["rating"],
data["pros"], data["cons"], data["description"], data["cta_label"],
data["status"], data["language"], data["sort_order"],
@@ -3462,6 +3695,7 @@ async def affiliate_new():
categories=AFFILIATE_CATEGORIES,
statuses=AFFILIATE_STATUSES,
retailers=await get_distinct_retailers(),
programs=await get_all_programs(status="active"),
)
@@ -3470,7 +3704,7 @@ async def affiliate_new():
@csrf_protect
async def affiliate_edit(product_id: int):
"""Edit an affiliate product."""
from ..affiliate import get_all_programs, get_distinct_retailers
product = await fetch_one("SELECT * FROM affiliate_products WHERE id = ?", (product_id,))
if not product:
@@ -3481,8 +3715,9 @@ async def affiliate_edit(product_id: int):
form = await request.form
data = _form_to_product(form)
has_url = bool(data["program_id"] and data["product_identifier"]) or bool(data["affiliate_url"])
if not data["slug"] or not data["name"] or not has_url:
await flash("Slug, name, and either a program+product ID or manual URL are required.", "error")
return await render_template(
"admin/affiliate_form.html",
admin_page="affiliate",
@@ -3492,6 +3727,7 @@ async def affiliate_edit(product_id: int):
categories=AFFILIATE_CATEGORIES,
statuses=AFFILIATE_STATUSES,
retailers=await get_distinct_retailers(),
programs=await get_all_programs(status="active"),
)
# Check slug collision only if slug or language changed
@@ -3511,18 +3747,29 @@ async def affiliate_edit(product_id: int):
categories=AFFILIATE_CATEGORIES,
statuses=AFFILIATE_STATUSES,
retailers=await get_distinct_retailers(),
programs=await get_all_programs(status="active"),
)
# Auto-populate retailer from program name if not manually set
if data["program_id"] and not data["retailer"]:
prog = await fetch_one(
"SELECT name FROM affiliate_programs WHERE id = ?", (data["program_id"],)
)
if prog:
data["retailer"] = prog["name"]
await execute(
"""UPDATE affiliate_products
SET slug=?, name=?, brand=?, category=?, retailer=?, program_id=?,
product_identifier=?, affiliate_url=?, image_url=?,
price_cents=?, currency=?, rating=?, pros=?, cons=?,
description=?, cta_label=?, status=?, language=?, sort_order=?,
updated_at=datetime('now')
WHERE id=?""",
(
data["slug"], data["name"], data["brand"], data["category"],
data["retailer"], data["program_id"], data["product_identifier"],
data["affiliate_url"], data["image_url"],
data["price_cents"], data["currency"], data["rating"],
data["pros"], data["cons"], data["description"], data["cta_label"],
data["status"], data["language"], data["sort_order"],
@@ -3554,6 +3801,7 @@ async def affiliate_edit(product_id: int):
categories=AFFILIATE_CATEGORIES,
statuses=AFFILIATE_STATUSES,
retailers=await get_distinct_retailers(),
programs=await get_all_programs(status="active"),
)


@@ -24,6 +24,20 @@ document.addEventListener('DOMContentLoaded', function() {
slugInput.dataset.manual = '1';
});
}
// Toggle program-based vs manual URL fields
function toggleProgramFields() {
var sel = document.getElementById('f-program');
if (!sel) return;
var isManual = sel.value === '0' || sel.value === '';
document.getElementById('f-product-id-row').style.display = isManual ? 'none' : '';
document.getElementById('f-manual-url-row').style.display = isManual ? '' : 'none';
}
var programSel = document.getElementById('f-program');
if (programSel) {
programSel.addEventListener('change', toggleProgramFields);
toggleProgramFields();
}
});
</script>
{% endblock %}
@@ -87,9 +101,38 @@ document.addEventListener('DOMContentLoaded', function() {
</div>
</div>
{# Program dropdown #}
<div>
<label class="form-label" for="f-program">Affiliate Program</label>
<select id="f-program" name="program_id" class="form-input">
<option value="0" {% if not data.get('program_id') %}selected{% endif %}>Manual (custom URL)</option>
{% for prog in programs %}
<option value="{{ prog.id }}" {% if data.get('program_id') == prog.id %}selected{% endif %}>{{ prog.name }}</option>
{% endfor %}
</select>
<p class="form-hint">Select a program to auto-build the URL, or choose Manual for a custom link.</p>
</div>
{# Product Identifier (shown when program selected) #}
<div id="f-product-id-row">
<label class="form-label" for="f-product-id">Product ID *</label>
<input id="f-product-id" type="text" name="product_identifier"
value="{{ data.get('product_identifier','') }}"
class="form-input" placeholder="e.g. B0XXXXXXXXX (ASIN for Amazon)">
<p class="form-hint">ASIN, product path, or other program-specific identifier. URL is assembled at redirect time.</p>
</div>
{# Manual URL (shown when Manual selected) #}
<div id="f-manual-url-row">
<label class="form-label" for="f-url">Affiliate URL</label>
<input id="f-url" type="url" name="affiliate_url" value="{{ data.get('affiliate_url','') }}"
class="form-input" placeholder="https://www.amazon.de/dp/B0XXXXX?tag=padelnomics-21">
<p class="form-hint">Full URL with tracking params already baked in. Used as fallback if no program is set.</p>
</div>
{# Retailer (auto-populated from program; editable for manual products) #}
<div>
<label class="form-label" for="f-retailer">Retailer <span class="form-hint" style="font-weight:normal">(auto-filled from program)</span></label>
<input id="f-retailer" type="text" name="retailer" value="{{ data.get('retailer','') }}"
class="form-input" placeholder="e.g. Amazon, Padel Nuestro"
list="retailers-list">
@@ -100,14 +143,6 @@ document.addEventListener('DOMContentLoaded', function() {
</datalist>
</div>
{# Image URL #}
<div>
<label class="form-label" for="f-image">Image URL</label>


@@ -0,0 +1,134 @@
{% extends "admin/base_admin.html" %}
{% set admin_page = "affiliate_programs" %}
{% block title %}{% if editing %}Edit Program{% else %}New Program{% endif %} - Admin - {{ config.APP_NAME }}{% endblock %}
{% block admin_head %}
<script>
function slugify(text) {
return text.toLowerCase()
.replace(/[äöü]/g, c => ({'ä':'ae','ö':'oe','ü':'ue'}[c]))
.replace(/[^a-z0-9]+/g, '-')
.replace(/^-+|-+$/g, '');
}
document.addEventListener('DOMContentLoaded', function() {
var nameInput = document.getElementById('f-name');
var slugInput = document.getElementById('f-slug');
if (nameInput && slugInput && !slugInput.value) {
nameInput.addEventListener('input', function() {
if (!slugInput.dataset.manual) {
slugInput.value = slugify(nameInput.value);
}
});
slugInput.addEventListener('input', function() {
slugInput.dataset.manual = '1';
});
}
});
</script>
{% endblock %}
{% block admin_content %}
<header class="flex justify-between items-center mb-6">
<div>
<a href="{{ url_for('admin.affiliate_programs') }}" class="text-slate text-sm" style="text-decoration:none">← Programs</a>
<h1 class="text-2xl mt-1">{% if editing %}Edit Program{% else %}New Program{% endif %}</h1>
</div>
</header>
<div style="max-width:600px">
<form method="post" id="program-form"
action="{% if editing %}{{ url_for('admin.affiliate_program_edit', program_id=program_id) }}{% else %}{{ url_for('admin.affiliate_program_new') }}{% endif %}">
<input type="hidden" name="csrf_token" value="{{ csrf_token() }}">
<div class="card" style="padding:1.5rem;display:flex;flex-direction:column;gap:1.25rem;">
{# Name #}
<div>
<label class="form-label" for="f-name">Name *</label>
<input id="f-name" type="text" name="name" value="{{ data.get('name','') }}"
class="form-input" placeholder="e.g. Amazon, Padel Nuestro" required>
</div>
{# Slug #}
<div>
<label class="form-label" for="f-slug">Slug *</label>
<input id="f-slug" type="text" name="slug" value="{{ data.get('slug','') }}"
class="form-input" placeholder="e.g. amazon, padel-nuestro" required
pattern="[a-z0-9][a-z0-9\-]*">
<p class="form-hint">Lowercase letters, numbers, hyphens only.</p>
</div>
{# URL Template #}
<div>
<label class="form-label" for="f-template">URL Template *</label>
<input id="f-template" type="text" name="url_template" value="{{ data.get('url_template','') }}"
class="form-input" placeholder="https://www.amazon.de/dp/{product_id}?tag={tag}" required>
<p class="form-hint">
Use <code>{product_id}</code> for the ASIN/product path and <code>{tag}</code> for the tracking tag.<br>
Example: <code>https://www.amazon.de/dp/{product_id}?tag={tag}</code>
</p>
</div>
{# Tracking Tag + Commission row #}
<div style="display:grid;grid-template-columns:1fr 1fr;gap:1rem;">
<div>
<label class="form-label" for="f-tag">Tracking Tag</label>
<input id="f-tag" type="text" name="tracking_tag" value="{{ data.get('tracking_tag','') }}"
class="form-input" placeholder="e.g. padelnomics-21">
</div>
<div>
<label class="form-label" for="f-commission">Commission %</label>
<input id="f-commission" type="number" name="commission_pct" value="{{ data.get('commission_pct', 0) }}"
class="form-input" placeholder="3" step="0.1" min="0" max="100">
<p class="form-hint">Used for revenue estimates (e.g. 3 = 3%).</p>
</div>
</div>
{# Homepage URL #}
<div>
<label class="form-label" for="f-homepage">Homepage URL</label>
<input id="f-homepage" type="url" name="homepage_url" value="{{ data.get('homepage_url','') }}"
class="form-input" placeholder="https://www.amazon.de">
<p class="form-hint">Shown as a link in the programs list.</p>
</div>
{# Status #}
<div>
<label class="form-label" for="f-status">Status</label>
<select id="f-status" name="status" class="form-input">
{% for s in program_statuses %}
<option value="{{ s }}" {% if data.get('status','active') == s %}selected{% endif %}>{{ s | capitalize }}</option>
{% endfor %}
</select>
<p class="form-hint">Inactive programs are hidden from the product form dropdown.</p>
</div>
{# Notes #}
<div>
<label class="form-label" for="f-notes">Notes <span class="form-hint" style="font-weight:normal">(internal)</span></label>
<textarea id="f-notes" name="notes" rows="3"
class="form-input" placeholder="Login URL, account ID, affiliate dashboard link...">{{ data.get('notes','') }}</textarea>
</div>
{# Actions #}
<div class="flex gap-3 justify-between" style="margin-top:.5rem">
<div class="flex gap-2">
<button type="submit" class="btn">
{% if editing %}Save Changes{% else %}Create Program{% endif %}
</button>
<a href="{{ url_for('admin.affiliate_programs') }}" class="btn-outline">Cancel</a>
</div>
{% if editing %}
<form method="post" action="{{ url_for('admin.affiliate_program_delete', program_id=program_id) }}" style="margin:0">
<input type="hidden" name="csrf_token" value="{{ csrf_token() }}">
<button type="submit" class="btn-outline"
onclick="return confirm('Delete this program? Blocked if products reference it.')">Delete</button>
</form>
{% endif %}
</div>
</div>
</form>
</div>
{% endblock %}

View File

@@ -0,0 +1,30 @@
{% extends "admin/base_admin.html" %}
{% set admin_page = "affiliate_programs" %}
{% block title %}Affiliate Programs - Admin - {{ config.APP_NAME }}{% endblock %}
{% block admin_content %}
<header class="flex justify-between items-center mb-6">
<h1 class="text-2xl">Affiliate Programs</h1>
<a href="{{ url_for('admin.affiliate_program_new') }}" class="btn btn-sm">+ New Program</a>
</header>
<div id="prog-results">
<table class="table">
<thead>
<tr>
<th>Name</th>
<th>Slug</th>
<th>Tracking Tag</th>
<th class="text-right">Commission</th>
<th class="text-right">Products</th>
<th>Status</th>
<th class="text-right">Actions</th>
</tr>
</thead>
<tbody>
{% include "admin/partials/affiliate_program_results.html" %}
</tbody>
</table>
</div>
{% endblock %}

View File

@@ -40,8 +40,10 @@
.admin-subnav {
display: flex; align-items: stretch; padding: 0 2rem;
background: #fff; border-bottom: 1px solid #E2E8F0;
flex-shrink: 0; overflow-x: auto; gap: 0;
flex-shrink: 0; overflow-x: auto; overflow-y: hidden; gap: 0;
scrollbar-width: none;
}
.admin-subnav::-webkit-scrollbar { display: none; }
.admin-subnav a {
display: flex; align-items: center; gap: 5px;
padding: 0 1px; margin: 0 13px 0 0; height: 42px;
@@ -99,7 +101,7 @@
'suppliers': 'suppliers',
'articles': 'content', 'scenarios': 'content', 'templates': 'content', 'pseo': 'content',
'emails': 'email', 'inbox': 'email', 'compose': 'email', 'gallery': 'email', 'audiences': 'email', 'outreach': 'email',
'affiliate': 'affiliate', 'affiliate_dashboard': 'affiliate',
'affiliate': 'affiliate', 'affiliate_dashboard': 'affiliate', 'affiliate_programs': 'affiliate',
'billing': 'billing',
'seo': 'analytics',
'pipeline': 'pipeline',
@@ -206,6 +208,7 @@
<nav class="admin-subnav">
<a href="{{ url_for('admin.affiliate_dashboard') }}" class="{% if admin_page == 'affiliate_dashboard' %}active{% endif %}">Dashboard</a>
<a href="{{ url_for('admin.affiliate_products') }}" class="{% if admin_page == 'affiliate' %}active{% endif %}">Products</a>
<a href="{{ url_for('admin.affiliate_programs') }}" class="{% if admin_page == 'affiliate_programs' %}active{% endif %}">Programs</a>
</nav>
{% elif active_section == 'system' %}
<nav class="admin-subnav">

View File

@@ -3,6 +3,19 @@
{% block title %}Admin Dashboard - {{ config.APP_NAME }}{% endblock %}
{% block admin_head %}
<style>
.funnel-grid {
display: grid;
grid-template-columns: repeat(2, 1fr);
gap: 0.75rem;
}
@media (min-width: 768px) {
.funnel-grid { grid-template-columns: repeat(5, 1fr); }
}
</style>
{% endblock %}
{% block admin_content %}
<header class="flex justify-between items-center mb-8">
<div>
@@ -47,7 +60,7 @@
<!-- Lead Funnel -->
<p class="text-xs font-semibold text-slate uppercase tracking-wider mb-2">Lead Funnel</p>
<div style="display:grid;grid-template-columns:repeat(5,1fr);gap:0.75rem" class="mb-8">
<div class="funnel-grid mb-8">
<div class="card text-center border-l-4 border-l-electric" style="padding:0.75rem">
<p class="text-xs text-slate">Planner Users</p>
<p class="text-xl font-bold text-navy">{{ stats.planner_users }}</p>
@@ -72,7 +85,7 @@
<!-- Supplier Stats -->
<p class="text-xs font-semibold text-slate uppercase tracking-wider mb-2">Supplier Funnel</p>
<div style="display:grid;grid-template-columns:repeat(5,1fr);gap:0.75rem" class="mb-8">
<div class="funnel-grid mb-8">
<div class="card text-center border-l-4 border-l-accent" style="padding:0.75rem">
<p class="text-xs text-slate">Claimed Suppliers</p>
<p class="text-xl font-bold text-navy">{{ stats.suppliers_claimed }}</p>

View File

@@ -2,13 +2,30 @@
{% set admin_page = "outreach" %}
{% block title %}Outreach Pipeline - Admin - {{ config.APP_NAME }}{% endblock %}
{% block admin_head %}
<style>
.pipeline-status-grid {
display: grid;
grid-template-columns: repeat(2, 1fr);
gap: 0.75rem;
margin-bottom: 1.5rem;
}
@media (min-width: 640px) {
.pipeline-status-grid { grid-template-columns: repeat(3, 1fr); }
}
@media (min-width: 1024px) {
.pipeline-status-grid { grid-template-columns: repeat(6, 1fr); }
}
</style>
{% endblock %}
{% block admin_content %}
<header class="flex justify-between items-center mb-6">
<div>
<h1 class="text-2xl">Outreach</h1>
<p class="text-sm text-slate mt-1">
{{ pipeline.total }} supplier{{ 's' if pipeline.total != 1 else '' }} in pipeline
&middot; Sending domain: <span class="mono text-xs">hello.padelnomics.io</span>
&middot; Sending from: <span class="mono text-xs">{{ outreach_email }}</span>
</p>
</div>
<div class="flex gap-2">
@@ -18,7 +35,7 @@
</header>
<!-- Pipeline cards -->
<div style="display:grid;grid-template-columns:repeat(6,1fr);gap:0.75rem;margin-bottom:1.5rem">
<div class="pipeline-status-grid">
{% set status_colors = {
'prospect': '#E2E8F0',
'contacted': '#DBEAFE',

View File

@@ -0,0 +1,36 @@
{% if programs %}
{% for prog in programs %}
<tr id="prog-{{ prog.id }}">
<td style="font-weight:500">
{% if prog.homepage_url %}
<a href="{{ prog.homepage_url }}" target="_blank" rel="noopener" style="color:#0F172A;text-decoration:none">{{ prog.name }}</a>
{% else %}
{{ prog.name }}
{% endif %}
</td>
<td class="mono text-slate">{{ prog.slug }}</td>
<td class="mono text-slate">{{ prog.tracking_tag or '—' }}</td>
<td class="mono text-right">
{% if prog.commission_pct %}{{ "%.0f" | format(prog.commission_pct) }}%{% else %}—{% endif %}
</td>
<td class="mono text-right">{{ prog.product_count }}</td>
<td>
<span class="badge{% if prog.status == 'active' %} badge-success{% endif %}">
{{ prog.status }}
</span>
</td>
<td class="text-right" style="white-space:nowrap">
<a href="{{ url_for('admin.affiliate_program_edit', program_id=prog.id) }}" class="btn-outline btn-sm">Edit</a>
<form method="post" action="{{ url_for('admin.affiliate_program_delete', program_id=prog.id) }}" style="display:inline">
<input type="hidden" name="csrf_token" value="{{ csrf_token() }}">
<button type="submit" class="btn-outline btn-sm"
onclick="return confirm('Delete {{ prog.name }}? This is blocked if products reference it.')">Delete</button>
</form>
</td>
</tr>
{% endfor %}
{% else %}
<tr>
<td colspan="7" class="text-slate" style="text-align:center;padding:2rem;">No programs found.</td>
</tr>
{% endif %}

View File

@@ -1,5 +1,6 @@
{% if emails %}
<div class="card">
<div style="overflow-x:auto">
<table class="table">
<thead>
<tr>
@@ -38,6 +39,7 @@
{% endfor %}
</tbody>
</table>
</div>
</div>
{% else %}
<div class="card text-center" style="padding:2rem">

View File

@@ -25,6 +25,7 @@
{% if leads %}
<div class="card">
<div style="overflow-x:auto">
<table class="table">
<thead>
<tr>
@@ -58,6 +59,7 @@
{% endfor %}
</tbody>
</table>
</div>
</div>
<!-- Pagination -->

View File

@@ -1,5 +1,6 @@
{% if suppliers %}
<div class="card">
<div style="overflow-x:auto">
<table class="table">
<thead>
<tr>
@@ -19,6 +20,7 @@
{% endfor %}
</tbody>
</table>
</div>
</div>
{% else %}
<div class="card text-center" style="padding:2rem">

View File

@@ -1,5 +1,6 @@
{% if suppliers %}
<div class="card">
<div style="overflow-x:auto">
<table class="table">
<thead>
<tr>
@@ -47,6 +48,7 @@
{% endfor %}
</tbody>
</table>
</div>
</div>
{% else %}
<div class="card text-center" style="padding:2rem">

View File

@@ -4,6 +4,15 @@
{% block admin_head %}
<style>
.pipeline-stat-grid {
display: grid;
grid-template-columns: repeat(2, 1fr);
gap: 0.75rem;
}
@media (min-width: 768px) {
.pipeline-stat-grid { grid-template-columns: repeat(4, 1fr); }
}
.pipeline-tabs {
display: flex; gap: 0; border-bottom: 2px solid #E2E8F0; margin-bottom: 1.5rem;
}
@@ -46,7 +55,7 @@
</header>
<!-- Health stat cards -->
<div style="display:grid;grid-template-columns:repeat(4,1fr);gap:0.75rem" class="mb-6">
<div class="pipeline-stat-grid mb-6">
<div class="card text-center" style="padding:0.875rem">
<p class="text-xs text-slate">Total Runs</p>
<p class="text-2xl font-bold text-navy metric">{{ summary.total | default(0) }}</p>

View File

@@ -21,6 +21,7 @@ logger = logging.getLogger(__name__)
VALID_CATEGORIES = ("racket", "ball", "shoe", "bag", "grip", "eyewear", "accessory")
VALID_STATUSES = ("draft", "active", "archived")
VALID_PROGRAM_STATUSES = ("active", "inactive")
def hash_ip(ip_address: str) -> str:
@@ -32,20 +33,86 @@ def hash_ip(ip_address: str) -> str:
return hashlib.sha256(raw.encode()).hexdigest()
async def get_product(slug: str, language: str = "de") -> dict | None:
"""Return active product by slug+language, falling back to any language."""
async def get_all_programs(status: str | None = None) -> list[dict]:
"""Return all affiliate programs, optionally filtered by status."""
if status:
assert status in VALID_PROGRAM_STATUSES, f"unknown program status: {status}"
rows = await fetch_all(
"SELECT ap.*, ("
" SELECT COUNT(*) FROM affiliate_products WHERE program_id = ap.id"
") AS product_count"
" FROM affiliate_programs ap WHERE ap.status = ?"
" ORDER BY ap.name ASC",
(status,),
)
else:
rows = await fetch_all(
"SELECT ap.*, ("
" SELECT COUNT(*) FROM affiliate_products WHERE program_id = ap.id"
") AS product_count"
" FROM affiliate_programs ap ORDER BY ap.name ASC"
)
return [dict(r) for r in rows]
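For reference, the correlated `COUNT(*)` subquery used by `get_all_programs()` can be exercised standalone against an in-memory SQLite database. The two-table schema below is a stripped-down stand-in for illustration, not the real migration DDL:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE affiliate_programs (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE affiliate_products (id INTEGER PRIMARY KEY, program_id INTEGER);
    INSERT INTO affiliate_programs VALUES (1, 'Amazon'), (2, 'Empty');
    INSERT INTO affiliate_products (program_id) VALUES (1), (1);
""")

# Same shape as the query above: per-program product count via a
# correlated subquery, ordered by name.
rows = conn.execute(
    "SELECT ap.name, ("
    " SELECT COUNT(*) FROM affiliate_products WHERE program_id = ap.id"
    ") AS product_count"
    " FROM affiliate_programs ap ORDER BY ap.name ASC"
).fetchall()
print(rows)  # → [('Amazon', 2), ('Empty', 0)]
```

Programs with zero products still appear (the subquery yields 0), which is why the list view can show every program regardless of usage.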
async def get_program(program_id: int) -> dict | None:
"""Return a single affiliate program by id."""
assert program_id > 0, "program_id must be positive"
row = await fetch_one(
"SELECT * FROM affiliate_programs WHERE id = ?", (program_id,)
)
return dict(row) if row else None
async def get_program_by_slug(slug: str) -> dict | None:
"""Return a single affiliate program by slug."""
assert slug, "slug must not be empty"
row = await fetch_one(
"SELECT * FROM affiliate_products"
" WHERE slug = ? AND language = ? AND status = 'active'",
"SELECT * FROM affiliate_programs WHERE slug = ?", (slug,)
)
return dict(row) if row else None
def build_affiliate_url(product: dict, program: dict | None = None) -> str:
"""Assemble the final affiliate URL from program template + product identifier.
Falls back to the baked product["affiliate_url"] when no program is set,
preserving backward compatibility with products created before programs existed.
"""
if not product.get("program_id") or not program:
return product["affiliate_url"]
return program["url_template"].format(
product_id=product["product_identifier"],
tag=program["tracking_tag"],
)
async def get_product(slug: str, language: str = "de") -> dict | None:
"""Return active product by slug+language, falling back to any language.
JOINs affiliate_programs so the returned dict includes program fields
(prefixed with _program_*) for use in build_affiliate_url().
"""
assert slug, "slug must not be empty"
row = await fetch_one(
"SELECT p.*, pg.url_template AS _program_url_template,"
" pg.tracking_tag AS _program_tracking_tag,"
" pg.name AS _program_name"
" FROM affiliate_products p"
" LEFT JOIN affiliate_programs pg ON pg.id = p.program_id"
" WHERE p.slug = ? AND p.language = ? AND p.status = 'active'",
(slug, language),
)
if row:
return _parse_product(row)
# Graceful fallback: show any language rather than nothing
row = await fetch_one(
"SELECT * FROM affiliate_products"
" WHERE slug = ? AND status = 'active' LIMIT 1",
"SELECT p.*, pg.url_template AS _program_url_template,"
" pg.tracking_tag AS _program_tracking_tag,"
" pg.name AS _program_name"
" FROM affiliate_products p"
" LEFT JOIN affiliate_programs pg ON pg.id = p.program_id"
" WHERE p.slug = ? AND p.status = 'active' LIMIT 1",
(slug,),
)
return _parse_product(row) if row else None
@@ -217,8 +284,24 @@ async def get_distinct_retailers() -> list[str]:
def _parse_product(row) -> dict:
"""Convert aiosqlite Row to plain dict, parsing JSON pros/cons arrays."""
"""Convert aiosqlite Row to plain dict, parsing JSON pros/cons arrays.
If the row includes _program_* columns (from a JOIN), extracts them into
a nested "_program" dict so build_affiliate_url() can use them directly.
"""
d = dict(row)
d["pros"] = json.loads(d.get("pros") or "[]")
d["cons"] = json.loads(d.get("cons") or "[]")
# Extract program fields added by get_product()'s JOIN
if "_program_url_template" in d:
if d.get("program_id") and d["_program_url_template"]:
d["_program"] = {
"url_template": d.pop("_program_url_template"),
"tracking_tag": d.pop("_program_tracking_tag", ""),
"name": d.pop("_program_name", ""),
}
else:
d.pop("_program_url_template", None)
d.pop("_program_tracking_tag", None)
d.pop("_program_name", None)
return d

View File

@@ -291,7 +291,7 @@ def create_app() -> Quart:
Uses 302 (not 301) so every hit is tracked — browsers don't cache 302s.
Extracts article_slug and lang from Referer header best-effort.
"""
from .affiliate import get_product, log_click
from .affiliate import build_affiliate_url, get_product, log_click
from .core import check_rate_limit
# Extract lang from Referer path (e.g. /de/blog/... → "de"), default de
@@ -314,14 +314,17 @@ def create_app() -> Quart:
if not product:
abort(404)
# Assemble URL from program template; falls back to baked affiliate_url
url = build_affiliate_url(product, product.get("_program"))
ip = request.remote_addr or "unknown"
allowed, _info = await check_rate_limit(f"aff:{ip}", limit=60, window=60)
if not allowed:
# Still redirect even if rate-limited; just don't log the click
return redirect(product["affiliate_url"], 302)
return redirect(url, 302)
await log_click(product["id"], ip, article_slug, referer or None)
return redirect(product["affiliate_url"], 302)
return redirect(url, 302)
# Legacy 301 redirects — bookmarked/cached URLs before lang prefixes existed
@app.route("/terms")

View File

@@ -0,0 +1,79 @@
"""Migration 0027: Affiliate programs table + program FK on products.
affiliate_programs: centralises retailer configs (URL template + tag + commission).
- url_template uses {product_id} and {tag} placeholders, assembled at redirect time.
- tracking_tag: e.g. "padelnomics-21" — changing it propagates to all products instantly.
- commission_pct: stored as a percentage (3.0 = 3%) for revenue estimates.
- status: active/inactive — only active programs appear in the product form dropdown.
- notes: internal field for login URLs, account IDs, etc.
affiliate_products changes:
- program_id (nullable FK): new products use a program; existing products keep their
baked affiliate_url (backward compat via build_affiliate_url() fallback).
- product_identifier: ASIN, product path, or other program-specific ID (e.g. B0XXXXX).
Amazon OneLink note: we use a single "Amazon" program pointing to amazon.de.
Amazon OneLink (configured in the Associates dashboard, no code changes needed)
auto-redirects visitors to their local marketplace (UK→amazon.co.uk, ES→amazon.es)
with the correct regional tag. One program covers all Amazon marketplaces.
"""
import re
def up(conn) -> None:
conn.execute("""
CREATE TABLE affiliate_programs (
id INTEGER PRIMARY KEY AUTOINCREMENT,
name TEXT NOT NULL,
slug TEXT NOT NULL UNIQUE,
url_template TEXT NOT NULL,
tracking_tag TEXT NOT NULL DEFAULT '',
commission_pct REAL NOT NULL DEFAULT 0,
homepage_url TEXT NOT NULL DEFAULT '',
status TEXT NOT NULL DEFAULT 'active',
notes TEXT NOT NULL DEFAULT '',
created_at TEXT NOT NULL DEFAULT (datetime('now')),
updated_at TEXT
)
""")
# Seed the default Amazon program.
# OneLink handles geo-redirect to local marketplaces — no per-country programs needed.
conn.execute("""
INSERT INTO affiliate_programs (name, slug, url_template, tracking_tag, commission_pct, homepage_url)
VALUES ('Amazon', 'amazon', 'https://www.amazon.de/dp/{product_id}?tag={tag}', 'padelnomics-21', 3.0, 'https://www.amazon.de')
""")
# Add program FK + product identifier to products table.
# program_id is nullable — existing rows keep their baked affiliate_url.
conn.execute("""
ALTER TABLE affiliate_products
ADD COLUMN program_id INTEGER REFERENCES affiliate_programs(id)
""")
conn.execute("""
ALTER TABLE affiliate_products
ADD COLUMN product_identifier TEXT NOT NULL DEFAULT ''
""")
# Backfill: extract ASIN from existing Amazon affiliate URLs.
# Pattern: /dp/<ASIN> where ASIN is 10 uppercase alphanumeric chars.
amazon_program = conn.execute(
"SELECT id FROM affiliate_programs WHERE slug = 'amazon'"
).fetchone()
assert amazon_program is not None, "Amazon program must exist after seed"
amazon_id = amazon_program[0]
rows = conn.execute(
"SELECT id, affiliate_url FROM affiliate_products"
).fetchall()
asin_re = re.compile(r"/dp/([A-Z0-9]{10})")
for product_id, url in rows:
if not url:
continue
m = asin_re.search(url)
if m:
asin = m.group(1)
conn.execute(
"UPDATE affiliate_products SET program_id=?, product_identifier=? WHERE id=?",
(amazon_id, asin, product_id),
)
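A quick sanity check of the backfill pattern (sample URLs are illustrative): the regex only matches a 10-character uppercase-alphanumeric ASIN after `/dp/`, so non-Amazon or non-matching URLs are simply skipped by the loop above.

```python
import re

asin_re = re.compile(r"/dp/([A-Z0-9]{10})")

m = asin_re.search("https://www.amazon.de/dp/B0ASIN1234?tag=padelnomics-21")
print(m.group(1))  # → B0ASIN1234

# URLs without a /dp/<ASIN> segment do not match; the backfill leaves
# such rows untouched (program_id stays NULL, identifier stays '').
print(asin_re.search("https://example.com/p/123"))  # → None
```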

View File

@@ -218,9 +218,7 @@
.nav-bar[data-navopen="true"] .nav-mobile {
display: flex;
}
.nav-mobile a,
.nav-mobile button.nav-auth-btn,
.nav-mobile a.nav-auth-btn {
.nav-mobile a:not(.nav-auth-btn) {
display: block;
padding: 0.625rem 0;
border-bottom: 1px solid #F1F5F9;
@@ -230,15 +228,18 @@
text-decoration: none;
transition: color 0.15s;
}
.nav-mobile a:last-child { border-bottom: none; }
.nav-mobile a:hover { color: #1D4ED8; }
.nav-mobile a:not(.nav-auth-btn):last-child { border-bottom: none; }
.nav-mobile a:not(.nav-auth-btn):hover { color: #1D4ED8; }
/* nav-auth-btn in mobile menu: override block style, keep button colours */
.nav-mobile a.nav-auth-btn,
.nav-mobile button.nav-auth-btn {
display: inline-flex;
margin-top: 0.5rem;
padding: 6px 16px;
border-bottom: none;
width: auto;
align-self: flex-start;
color: #fff;
}
.nav-mobile .nav-mobile-section {
font-size: 0.6875rem;

View File

@@ -2,7 +2,8 @@
Tests for the affiliate product system.
Covers: hash_ip determinism, product CRUD, bake_product_cards marker replacement,
click redirect (302 + logged), rate limiting, inactive product 404, multi-retailer.
click redirect (302 + logged), rate limiting, inactive product 404, multi-retailer,
program CRUD, build_affiliate_url(), program-based redirect.
"""
import json
from datetime import date
@@ -10,11 +11,15 @@ from unittest.mock import patch
import pytest
from padelnomics.affiliate import (
build_affiliate_url,
get_all_products,
get_all_programs,
get_click_counts,
get_click_stats,
get_product,
get_products_by_category,
get_program,
get_program_by_slug,
hash_ip,
log_click,
)
@@ -330,3 +335,282 @@ async def test_affiliate_redirect_unknown_404(app, db):
async with app.test_client() as client:
response = await client.get("/go/totally-unknown-xyz")
assert response.status_code == 404
# ── affiliate_programs ────────────────────────────────────────────────────────
async def _insert_program(
name="Test Shop",
slug="test-shop",
url_template="https://testshop.example.com/p/{product_id}?ref={tag}",
tracking_tag="testref",
commission_pct=5.0,
homepage_url="https://testshop.example.com",
status="active",
) -> int:
"""Insert an affiliate program, return its id."""
return await execute(
"""INSERT INTO affiliate_programs
(name, slug, url_template, tracking_tag, commission_pct, homepage_url, status)
VALUES (?, ?, ?, ?, ?, ?, ?)""",
(name, slug, url_template, tracking_tag, commission_pct, homepage_url, status),
)
@pytest.mark.usefixtures("db")
async def test_get_all_programs_returns_all(db):
"""get_all_programs returns inserted programs sorted by name."""
await _insert_program(slug="zebra-shop", name="Zebra Shop")
await _insert_program(slug="alpha-shop", name="Alpha Shop")
programs = await get_all_programs()
names = [p["name"] for p in programs]
assert "Alpha Shop" in names
assert "Zebra Shop" in names
# Sorted by name ascending
assert names.index("Alpha Shop") < names.index("Zebra Shop")
@pytest.mark.usefixtures("db")
async def test_get_all_programs_status_filter(db):
"""get_all_programs(status='active') excludes inactive programs."""
await _insert_program(slug="inactive-prog", status="inactive")
await _insert_program(slug="active-prog", name="Active Shop")
active = await get_all_programs(status="active")
statuses = [p["status"] for p in active]
assert all(s == "active" for s in statuses)
slugs = [p["slug"] for p in active]
assert "inactive-prog" not in slugs
assert "active-prog" in slugs
@pytest.mark.usefixtures("db")
async def test_get_program_by_id(db):
"""get_program returns a program by id."""
prog_id = await _insert_program()
prog = await get_program(prog_id)
assert prog is not None
assert prog["slug"] == "test-shop"
@pytest.mark.usefixtures("db")
async def test_get_program_not_found(db):
"""get_program returns None for unknown id."""
prog = await get_program(99999)
assert prog is None
@pytest.mark.usefixtures("db")
async def test_get_program_by_slug(db):
"""get_program_by_slug returns the program for a known slug."""
await _insert_program(slug="find-by-slug")
prog = await get_program_by_slug("find-by-slug")
assert prog is not None
assert prog["name"] == "Test Shop"
@pytest.mark.usefixtures("db")
async def test_get_program_by_slug_not_found(db):
"""get_program_by_slug returns None for unknown slug."""
prog = await get_program_by_slug("nonexistent-slug-xyz")
assert prog is None
@pytest.mark.usefixtures("db")
async def test_get_all_programs_product_count(db):
"""get_all_programs includes product_count for each program."""
prog_id = await _insert_program(slug="counted-prog")
await _insert_product(slug="p-for-count", program_id=prog_id)
programs = await get_all_programs()
prog = next(p for p in programs if p["slug"] == "counted-prog")
assert prog["product_count"] == 1
# ── build_affiliate_url ───────────────────────────────────────────────────────
def test_build_affiliate_url_with_program():
"""build_affiliate_url assembles URL from program template."""
product = {"program_id": 1, "product_identifier": "B0TESTTEST", "affiliate_url": ""}
program = {"url_template": "https://amazon.de/dp/{product_id}?tag={tag}", "tracking_tag": "mysite-21"}
url = build_affiliate_url(product, program)
assert url == "https://amazon.de/dp/B0TESTTEST?tag=mysite-21"
def test_build_affiliate_url_legacy_fallback():
"""build_affiliate_url falls back to baked affiliate_url when no program."""
product = {"program_id": None, "product_identifier": "", "affiliate_url": "https://baked.example.com/p?tag=x"}
url = build_affiliate_url(product, None)
assert url == "https://baked.example.com/p?tag=x"
def test_build_affiliate_url_no_program_id():
"""build_affiliate_url uses fallback when program_id is 0/falsy."""
product = {"program_id": 0, "product_identifier": "B0IGNORED", "affiliate_url": "https://fallback.example.com"}
program = {"url_template": "https://shop.example.com/{product_id}?ref={tag}", "tracking_tag": "tag123"}
url = build_affiliate_url(product, program)
# program_id is falsy → fallback
assert url == "https://fallback.example.com"
def test_build_affiliate_url_no_program_dict():
"""build_affiliate_url uses fallback when program dict is None."""
product = {"program_id": 5, "product_identifier": "ASIN123", "affiliate_url": "https://fallback.example.com"}
url = build_affiliate_url(product, None)
assert url == "https://fallback.example.com"
# ── program-based redirect ────────────────────────────────────────────────────
async def _insert_product( # noqa: F811 — redefined to add program_id support
slug="test-racket-amazon",
name="Test Racket",
brand="TestBrand",
category="racket",
retailer="Amazon",
affiliate_url="https://amazon.de/dp/TEST?tag=test-21",
status="active",
language="de",
price_cents=14999,
pros=None,
cons=None,
sort_order=0,
program_id=None,
product_identifier="",
) -> int:
"""Insert an affiliate product with optional program_id, return its id."""
return await execute(
"""INSERT INTO affiliate_products
(slug, name, brand, category, retailer, affiliate_url,
price_cents, currency, status, language, pros, cons, sort_order,
program_id, product_identifier)
VALUES (?, ?, ?, ?, ?, ?, ?, 'EUR', ?, ?, ?, ?, ?, ?, ?)""",
(
slug, name, brand, category, retailer, affiliate_url,
price_cents, status, language,
json.dumps(pros or ["Gut"]),
json.dumps(cons or ["Teuer"]),
sort_order,
program_id,
product_identifier,
),
)
@pytest.mark.usefixtures("db")
async def test_affiliate_redirect_uses_program_url(app, db):
"""Redirect assembles URL from program template when product has program_id."""
prog_id = await _insert_program(
slug="amzn-test",
url_template="https://www.amazon.de/dp/{product_id}?tag={tag}",
tracking_tag="testsite-21",
)
await _insert_product(
slug="program-redirect-test",
affiliate_url="",
program_id=prog_id,
product_identifier="B0PROGRAM01",
)
async with app.test_client() as client:
response = await client.get("/go/program-redirect-test")
assert response.status_code == 302
location = response.headers.get("Location", "")
assert "B0PROGRAM01" in location
assert "testsite-21" in location
@pytest.mark.usefixtures("db")
async def test_affiliate_redirect_legacy_url_still_works(app, db):
"""Legacy products with baked affiliate_url still redirect correctly."""
await _insert_product(
slug="legacy-redirect-test",
affiliate_url="https://amazon.de/dp/LEGACY?tag=old-21",
program_id=None,
product_identifier="",
)
async with app.test_client() as client:
response = await client.get("/go/legacy-redirect-test")
assert response.status_code == 302
assert "LEGACY" in response.headers.get("Location", "")
# ── migration backfill ────────────────────────────────────────────────────────
def _load_migration_0027():
"""Import migration 0027 via importlib (filename starts with a digit)."""
import importlib.util  # explicit: importlib.util is not guaranteed by `import importlib`
from pathlib import Path
versions_dir = Path(__file__).parent.parent / "src/padelnomics/migrations/versions"
spec = importlib.util.spec_from_file_location(
"migration_0027", versions_dir / "0027_affiliate_programs.py"
)
mod = importlib.util.module_from_spec(spec)
spec.loader.exec_module(mod)
return mod
def _make_pre_migration_db():
"""Create a minimal sqlite3 DB simulating state just before migration 0027.
Provides the affiliate_products table (migration ALTERs it), but not
affiliate_programs (migration CREATEs it).
"""
import sqlite3
conn = sqlite3.connect(":memory:")
conn.execute("""
CREATE TABLE affiliate_products (
id INTEGER PRIMARY KEY AUTOINCREMENT,
slug TEXT NOT NULL,
name TEXT NOT NULL DEFAULT '',
affiliate_url TEXT NOT NULL DEFAULT '',
UNIQUE(slug)
)
""")
conn.commit()
return conn
def test_migration_seeds_amazon_program():
"""Migration 0027 up() seeds the Amazon program with expected fields.
Tests the migration function directly against a real sqlite3 DB
(the conftest only replays CREATE TABLE DDL, not INSERT seeds).
"""
migration = _load_migration_0027()
conn = _make_pre_migration_db()
migration.up(conn)
conn.commit()
row = conn.execute(
"SELECT * FROM affiliate_programs WHERE slug = 'amazon'"
).fetchone()
assert row is not None
cols = [d[0] for d in conn.execute("SELECT * FROM affiliate_programs WHERE slug = 'amazon'").description]
prog = dict(zip(cols, row))
assert prog["name"] == "Amazon"
assert "padelnomics-21" in prog["tracking_tag"]
assert "{product_id}" in prog["url_template"]
assert "{tag}" in prog["url_template"]
assert prog["commission_pct"] == 3.0
conn.close()
def test_migration_backfills_asin_from_url():
"""Migration 0027 up() extracts ASINs from existing affiliate_url values."""
migration = _load_migration_0027()
conn = _make_pre_migration_db()
conn.execute(
"INSERT INTO affiliate_products (slug, affiliate_url) VALUES (?, ?)",
("test-racket", "https://www.amazon.de/dp/B0ASIN1234?tag=test-21"),
)
conn.commit()
migration.up(conn)
conn.commit()
row = conn.execute(
"SELECT program_id, product_identifier FROM affiliate_products WHERE slug = 'test-racket'"
).fetchone()
assert row is not None
assert row[0] is not None # program_id set
assert row[1] == "B0ASIN1234" # ASIN extracted correctly
conn.close()

View File

@@ -500,3 +500,131 @@ class TestTieredCyclerNTier:
t.join()
assert errors == [], f"Thread safety errors: {errors}"
class TestTieredCyclerDeadProxyTracking:
"""Per-proxy dead tracking: individual proxies marked dead are skipped."""
def test_dead_proxy_skipped_in_next_proxy(self):
"""After a proxy hits the failure limit it is never returned again."""
tiers = [["http://dead", "http://live"]]
cycler = make_tiered_cycler(tiers, threshold=10, proxy_failure_limit=1)
# Mark http://dead as dead
cycler["record_failure"]("http://dead")
# next_proxy must always return the live one
for _ in range(6):
assert cycler["next_proxy"]() == "http://live"
def test_dead_proxy_count_increments(self):
tiers = [["http://a", "http://b", "http://c"]]
cycler = make_tiered_cycler(tiers, threshold=10, proxy_failure_limit=2)
assert cycler["dead_proxy_count"]() == 0
cycler["record_failure"]("http://a")
assert cycler["dead_proxy_count"]() == 0 # only 1 failure, limit is 2
cycler["record_failure"]("http://a")
assert cycler["dead_proxy_count"]() == 1
cycler["record_failure"]("http://b")
cycler["record_failure"]("http://b")
assert cycler["dead_proxy_count"]() == 2
def test_auto_escalates_when_all_proxies_in_tier_dead(self):
"""If all proxies in the active tier are dead, next_proxy auto-escalates."""
tiers = [["http://t0a", "http://t0b"], ["http://t1"]]
cycler = make_tiered_cycler(tiers, threshold=10, proxy_failure_limit=1)
# Kill all proxies in tier 0
cycler["record_failure"]("http://t0a")
cycler["record_failure"]("http://t0b")
# next_proxy should transparently escalate and return tier 1 proxy
assert cycler["next_proxy"]() == "http://t1"
def test_auto_escalates_updates_active_tier_index(self):
"""Auto-escalation via dead proxies bumps active_tier_index."""
tiers = [["http://t0a", "http://t0b"], ["http://t1"]]
cycler = make_tiered_cycler(tiers, threshold=10, proxy_failure_limit=1)
cycler["record_failure"]("http://t0a")
cycler["record_failure"]("http://t0b")
cycler["next_proxy"]() # triggers auto-escalation
assert cycler["active_tier_index"]() == 1
def test_returns_none_when_all_tiers_exhausted_by_dead_proxies(self):
tiers = [["http://t0"], ["http://t1"]]
cycler = make_tiered_cycler(tiers, threshold=10, proxy_failure_limit=1)
cycler["record_failure"]("http://t0")
cycler["record_failure"]("http://t1")
assert cycler["next_proxy"]() is None
def test_record_success_resets_per_proxy_counter(self):
"""Success resets the failure count so proxy is not marked dead."""
tiers = [["http://a", "http://b"]]
cycler = make_tiered_cycler(tiers, threshold=10, proxy_failure_limit=3)
# Two failures — not dead yet
cycler["record_failure"]("http://a")
cycler["record_failure"]("http://a")
assert cycler["dead_proxy_count"]() == 0
# Success resets the counter
cycler["record_success"]("http://a")
# Two more failures — still not dead (counter was reset)
cycler["record_failure"]("http://a")
cycler["record_failure"]("http://a")
assert cycler["dead_proxy_count"]() == 0
# Third failure after reset — now dead
cycler["record_failure"]("http://a")
assert cycler["dead_proxy_count"]() == 1
def test_dead_proxy_stays_dead_after_success(self):
"""Once marked dead, a proxy is not revived by record_success."""
tiers = [["http://a", "http://b"]]
cycler = make_tiered_cycler(tiers, threshold=10, proxy_failure_limit=1)
cycler["record_failure"]("http://a")
assert cycler["dead_proxy_count"]() == 1
cycler["record_success"]("http://a")
assert cycler["dead_proxy_count"]() == 1
# http://a is still skipped
for _ in range(6):
assert cycler["next_proxy"]() == "http://b"
def test_backward_compat_no_proxy_url(self):
"""Calling record_failure/record_success without proxy_url still works."""
tiers = [["http://t0"], ["http://t1"]]
cycler = make_tiered_cycler(tiers, threshold=2)
cycler["record_failure"]()
cycler["record_failure"]() # escalates
assert cycler["active_tier_index"]() == 1
cycler["record_success"]()
assert cycler["dead_proxy_count"]() == 0 # no per-proxy tracking happened
def test_proxy_failure_limit_zero_disables_per_proxy_tracking(self):
"""proxy_failure_limit=0 disables per-proxy dead tracking entirely."""
tiers = [["http://a", "http://b"]]
cycler = make_tiered_cycler(tiers, threshold=10, proxy_failure_limit=0)
for _ in range(100):
cycler["record_failure"]("http://a")
assert cycler["dead_proxy_count"]() == 0
def test_thread_safety_with_per_proxy_tracking(self):
"""Concurrent record_failure(proxy_url) calls don't corrupt state."""
import threading as _threading
tiers = [["http://t0a", "http://t0b", "http://t0c"], ["http://t1a"]]
cycler = make_tiered_cycler(tiers, threshold=50, proxy_failure_limit=5)
errors = []
lock = _threading.Lock()
def worker():
try:
for _ in range(30):
p = cycler["next_proxy"]()
if p is not None:
cycler["record_failure"](p)
cycler["record_success"](p)
except Exception as e:
with lock:
errors.append(e)
threads = [_threading.Thread(target=worker) for _ in range(10)]
for t in threads:
t.start()
for t in threads:
t.join()
assert errors == [], f"Thread safety errors: {errors}"
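For context, here is a minimal single-lock sketch of `make_tiered_cycler` consistent with the tests above. This is a reconstruction, not the code in proxy.py: helper names like `_escalate_locked` are illustrative, and `is_exhausted` here only approximates the real semantics (it reports whether every proxy from the active tier onward is dead).

```python
import threading

def make_tiered_cycler(tiers, threshold=10, proxy_failure_limit=3):
    """Tiered proxy cycler sketch: tier-level circuit breaker plus
    per-proxy dead tracking, as specified by the test suite above."""
    state = {
        "tier": 0,            # active tier index
        "pos": 0,             # round-robin position within the active tier
        "failures": 0,        # tier-level failure count (circuit breaker)
        "dead": set(),        # proxies permanently marked dead
        "proxy_failures": {}, # per-proxy consecutive failure counts
    }
    lock = threading.Lock()

    def _escalate_locked():
        state["tier"] += 1
        state["pos"] = 0
        state["failures"] = 0

    def next_proxy():
        with lock:
            while state["tier"] < len(tiers):
                live = [p for p in tiers[state["tier"]] if p not in state["dead"]]
                if live:
                    proxy = live[state["pos"] % len(live)]
                    state["pos"] += 1
                    return proxy
                _escalate_locked()  # whole tier dead: auto-escalate
            return None  # all tiers exhausted

    def record_failure(proxy_url=None):
        with lock:
            state["failures"] += 1
            if state["failures"] >= threshold and state["tier"] < len(tiers) - 1:
                _escalate_locked()  # systemic failure: tier-level escalation
            if proxy_url and proxy_failure_limit > 0:
                n = state["proxy_failures"].get(proxy_url, 0) + 1
                state["proxy_failures"][proxy_url] = n
                if n >= proxy_failure_limit:
                    state["dead"].add(proxy_url)

    def record_success(proxy_url=None):
        with lock:
            state["failures"] = 0
            if proxy_url:
                # Reset the counter, but dead proxies stay dead.
                state["proxy_failures"][proxy_url] = 0

    def dead_proxy_count():
        with lock:
            return len(state["dead"])

    def active_tier_index():
        with lock:
            return state["tier"]

    def is_exhausted():
        with lock:
            return all(
                p in state["dead"] for tier in tiers[state["tier"]:] for p in tier
            )

    return {
        "next_proxy": next_proxy,
        "record_failure": record_failure,
        "record_success": record_success,
        "dead_proxy_count": dead_proxy_count,
        "active_tier_index": active_tier_index,
        "is_exhausted": is_exhausted,
    }
```

The two mechanisms coexist exactly as the commit message describes: per-proxy tracking removes broken individuals, while the tier-level threshold catches systemic failure and escalates the whole tier.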