Owner & sole engineer (Software Is Nothing, LLC) · 2025–present

Turbine Intelligence Platform

A Python AI-agent platform for industrial-asset market intelligence — discovery, dedup, scoring, and human-gated outreach.

Python · Postgres · NocoDB · Claude · Microsoft Graph · Earth Engine

What it is

An internal-use AI-agent platform for industrial-asset market intelligence. It ingests broker listings and reference data from public sources, dedups and legitimacy-scores assets against a canonical registry, surfaces match candidates between sellers and prospective buyers, drafts personalised outreach in the operator’s voice profile under human-approval gates, and monitors infrastructure changes via satellite imagery.

Built for use by the operator’s own consultancy — not commercially distributed.

Stack

  • Storage: PostgreSQL 16 with a 14-table relational schema (entities, listings, signals, deals, calls, outreach campaigns, outreach messages). NocoDB v2 layered over the same DB for spreadsheet-style review UI + REST API.
  • Tooling: ~18 Python tools (intake, extraction, dedup, plant matching, legitimacy scoring, retirement-signal scoring, buyer discovery, match engine, outreach engine, sequence manager, reply detection, call-intel extraction, morning brief).
  • AI: Anthropic Claude for structured-field extraction from broker emails/PDFs, multi-rubric scoring (legitimacy, retirement-likelihood, buyer-priority, BANT qualification), and outreach draft generation. All outreach passes through compliance checks (CAN-SPAM, GDPR, frequency caps, voice-profile validation) before send.
  • Microsoft Graph: Outlook for inbound listing intake + outreach send, Teams for transcript pulling + alerts, SharePoint for attachment storage.
  • Public data: EIA, EPA, FERC, ISO interconnection queues (PJM, ERCOT, MISO, CAISO, SPP, NYISO), ENTSO-E, WRI, Global Energy Monitor, NewsAPI, Apollo.io.
  • Satellite: Google Earth Engine — Sentinel-2 + Landsat NDVI delta scans against ~35,000 plants on a tiered cadence, with Claude-vision before/after analysis on flagged sites.
  • Schedules: GitHub Actions cron — every-15-min intake polling, daily score recompute, hourly outreach queue, daily 7am Teams brief, weekly public-data refresh, monthly full satellite scan.
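The compliance gate on outreach (CAN-SPAM/GDPR footer, per-contact frequency caps) can be sketched in a few lines. This is a minimal, hypothetical version: the footer marker, cap values, and field names are illustrative assumptions, not the platform's actual rules.

```python
from dataclasses import dataclass, field
from datetime import datetime, timedelta

# Illustrative policy values -- stand-ins, not the platform's real settings.
REQUIRED_FOOTER = "Unsubscribe"      # proxy for the mandatory CAN-SPAM/GDPR footer
MAX_SENDS_PER_WINDOW = 2             # hypothetical per-contact frequency cap
WINDOW = timedelta(days=30)

@dataclass
class Draft:
    contact: str
    body: str
    sent_log: list = field(default_factory=list)  # datetimes of prior sends

def compliance_check(draft: Draft, now: datetime) -> list[str]:
    """Return a list of violations; an empty list means the draft may be queued."""
    violations = []
    if REQUIRED_FOOTER.lower() not in draft.body.lower():
        violations.append("missing unsubscribe footer")
    recent = [t for t in draft.sent_log if now - t < WINDOW]
    if len(recent) >= MAX_SENDS_PER_WINDOW:
        violations.append("per-contact frequency cap exceeded")
    return violations
```

A draft only reaches the human-approval queue if `compliance_check` returns no violations; anything flagged is bounced back to the drafting step.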
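The NDVI delta scan reduces to simple band arithmetic per pixel (for Sentinel-2, NIR is band B8 and red is B4). A pure-Python sketch of the per-site flag, where the 0.15 change threshold is an illustrative assumption rather than the platform's tuned value:

```python
def ndvi(nir: float, red: float) -> float:
    """Normalized Difference Vegetation Index for one pixel."""
    denom = nir + red
    return (nir - red) / denom if denom else 0.0

def flag_change(before: list[tuple[float, float]],
                after: list[tuple[float, float]],
                threshold: float = 0.15) -> bool:
    """Flag a site if mean NDVI shifted by more than `threshold` between
    two scenes, each given as (NIR, red) reflectance pairs per pixel.
    The 0.15 default is illustrative, not the platform's actual cutoff."""
    mean_ndvi = lambda pairs: sum(ndvi(n, r) for n, r in pairs) / len(pairs)
    return abs(mean_ndvi(after) - mean_ndvi(before)) > threshold
```

In production this arithmetic runs server-side in Earth Engine across image collections; flagged sites then get the Claude-vision before/after pass.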

What it demonstrates

  • AI agent design under hard compliance constraints — voice profiles, MNDA gating (asset-specific data only flows after a signed MNDA), per-contact frequency caps, mandatory CAN-SPAM/GDPR footers.
  • Long-running automated workflows with cost discipline (~$75–265/mo total infra).
  • Multi-source data normalisation — every asset gets three deduplication hashes (primary SHA-256, secondary fuzzy, perceptual photo hash) before being promoted into the canonical registry.
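The first two of the three dedup hashes can be sketched with the standard library alone. The key-field normalisation and the 0.85 fuzzy cutoff are assumptions for illustration; the perceptual photo hash is omitted here because a real one (e.g. dHash) needs an imaging library.

```python
import hashlib
from difflib import SequenceMatcher

def primary_hash(make: str, model: str, serial: str) -> str:
    """Exact-identity hash: SHA-256 over normalised key fields.
    Field choice and normalisation here are illustrative."""
    key = "|".join(s.strip().lower() for s in (make, model, serial))
    return hashlib.sha256(key.encode()).hexdigest()

def fuzzy_match(name_a: str, name_b: str, cutoff: float = 0.85) -> bool:
    """Secondary check: catch near-duplicate listings whose free-text
    names differ slightly. The 0.85 cutoff is an assumed value."""
    ratio = SequenceMatcher(None, name_a.lower(), name_b.lower()).ratio()
    return ratio >= cutoff
```

An asset is promoted into the canonical registry only if the primary hash is unseen and no existing entry clears the fuzzy and photo checks.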

I don’t share specific buyer/seller data, pricing, or client identities publicly.
