A production-ready MVP that matches social media trend signals to AliExpress product listings. Fully Dockerized, tested, and documented. Ready for your next e-commerce SaaS.
TrendMatch ingests trend signals, scrapes AliExpress listings, scores them with a configurable weighted formula, and generates tiered HTML reports.
Configurable weighted scoring that combines demand signals and marketplace performance. Weights, thresholds, and limits are externalized to YAML—adjust without touching code.
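The weighted formula can be sketched as follows. This is an illustrative sketch only: the key names (demand, sold, rating) and weights are assumptions, not the repo's actual YAML schema.

```python
# Hypothetical sketch of the configurable weighted scoring.
# In the real pipeline these weights come from a YAML file;
# the key names here are invented for illustration.

WEIGHTS = {
    "demand": 0.6,   # social trend signal, normalized to 0-1
    "sold": 0.3,     # marketplace sold count, normalized to 0-1
    "rating": 0.1,   # optional signal; treated as 0 when missing
}

def score(product: dict) -> float:
    """Weighted sum of normalized signals; missing fields contribute 0."""
    return round(sum(w * product.get(k, 0.0) for k, w in WEIGHTS.items()), 4)

print(score({"demand": 0.9, "sold": 0.5}))  # rating missing, contributes 0
```

Because signals are normalized and weights sum to 1, the score stays in a 0–1 range, which makes threshold-based badge rules straightforward to tune.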
Playwright-based Node.js scraper enriches listings with public marketplace signals (e.g., sold counts and images). Fields such as price and rating may also be extracted, depending on page structure, but are not required for scoring. Products are grouped into category buckets.
Daily (50 products), weekly (50), and monthly (20 curated) reports with HOT/PROVEN/RISK badges. Sample report available without auth.
12 endpoints with frozen OpenAPI 3.1 spec. Swagger UI included. Token-authenticated tiered access for production use.
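A token-authenticated call can be sketched with the standard library. The endpoint path (/reports/daily) and the Bearer scheme are assumptions; the bundled OpenAPI 3.1 spec is the authoritative contract.

```python
# Sketch of building a token-authenticated request.
# Path and auth scheme are assumed, not confirmed by the spec.
import urllib.request

def build_request(base: str, path: str, token: str) -> urllib.request.Request:
    req = urllib.request.Request(base + path)
    req.add_header("Authorization", f"Bearer {token}")
    return req

req = build_request("http://localhost:8000", "/reports/daily", "YOUR_TOKEN")
print(req.get_header("Authorization"))  # prints "Bearer YOUR_TOKEN"
```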
SQLite-backed pipeline history tracks every execution. Query runs, per-product score history, and metadata via API.
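The run-history pattern looks roughly like this. Table and column names are illustrative, not the repo's actual schema, and an in-memory database stands in for the real file-backed one.

```python
# Minimal sketch of SQLite-backed pipeline run history.
# Schema is invented for illustration.
import sqlite3
import datetime

conn = sqlite3.connect(":memory:")  # real pipeline uses a file-backed DB
conn.execute("""CREATE TABLE runs (
    id INTEGER PRIMARY KEY,
    started_at TEXT,
    products_scored INTEGER,
    status TEXT)""")
conn.execute(
    "INSERT INTO runs (started_at, products_scored, status) VALUES (?, ?, ?)",
    (datetime.datetime.now(datetime.timezone.utc).isoformat(), 50, "ok"),
)
conn.commit()

# Query the most recent run, as the history API would.
row = conn.execute(
    "SELECT products_scored, status FROM runs ORDER BY id DESC LIMIT 1"
).fetchone()
print(row)  # prints (50, 'ok')
```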
Full transparency on what you're acquiring.
Every claim is independently verifiable. Buyers get a 48–72 hour escrow verification window before funds release.
Verified on macOS (Darwin 24.6.0) with Docker 29.2.0 and Compose v5.0.2; all 7 smoke tests pass.
25 Python (pytest) tests plus 24 Node.js parser tests; the full suite runs in about one second.
OpenAPI 3.1 spec included and verified to match the live API. 12 documented endpoints.
Pipeline execution history tracked and queryable. Auditable record of every run.
88 files, ~526 KB. No secrets, no .env, no node_modules, no real scraped data.
Escrow.com, PayPal Business, or Wise. 48–72 hour buyer verification window.
Sample report, product buckets, and API documentation. Trends are computed at the query level (so counts may overlap); canonical deduplicated totals are also provided for operational clarity.
A smooth transition so you can get up and running fast.
You're acquiring full, exclusive ownership of the TrendMatch codebase and all associated assets: source code, documentation, sample data, CI/CD pipeline, and data room. This is an asset sale with complete IP transfer—the seller retains zero rights post-close. License is MIT, original work with no third-party IP claims.
No. TikTok trend signals are buyer-provided JSON input. The repo does NOT include a TikTok scraper. You'll need to source trend data via a third-party provider such as Apify, Bright Data, or Phantombuster. The API accepts standard JSON input, so integration is straightforward.
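A buyer-provided trend signal might look like the following. Every field name here is an assumption for illustration; the accepted schema is defined by the API's OpenAPI spec.

```python
# Illustrative trend-signal payload; field names are invented,
# not the API's confirmed schema.
import json

trend_signal = {
    "keyword": "led dog collar",
    "source": "tiktok",
    "views": 1_200_000,
    "growth_rate": 0.35,
    "collected_at": "2024-01-15T00:00:00Z",
}

# The API accepts standard JSON, so a third-party provider's export
# only needs to be mapped into the expected shape and serialized.
payload = json.dumps([trend_signal])
```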
Three commands: docker compose up --build -d, then curl localhost:8000/health, then open localhost:8000/reports/sample. The data room includes step-by-step verification instructions. Escrow is available with a 48–72 hour verification window before funds release.
Configurable scoring that blends buyer-provided TikTok demand signals with AliExpress marketplace performance signals. Each product gets an explainable breakdown and automatic badges (HOT, PROVEN, RISK). All weights, thresholds, and badge rules are tunable via YAML—full details in the data room.
Web scraping requires ongoing maintenance. AliExpress may change their HTML structure, introduce CAPTCHAs, or adjust rate limits. Budget approximately 2–4 hours per month for selector updates, or roughly 24–48 hours per year. This is typical for any scraping-based product.
FastAPI + Uvicorn (Python backend), Playwright on Node.js (AliExpress scraping), SQLite (run history), Docker Compose (orchestration), GitHub Actions (CI/CD). JSON file storage handles ~10K products; migration path to PostgreSQL is documented in the roadmap.
The concept is validated and the MVP works, but I'm moving to other projects. This is a solid foundation for further development—whether you want to turn it into a subscription SaaS, add more data providers, or integrate it into an existing e-commerce tool.
The AliExpress scraper may violate their Terms of Service. The buyer accepts full responsibility for compliance with applicable laws and platform terms. We include detailed disclaimers in the data room. Consult qualified legal counsel before using the scraping functionality in production.