Compare commits

..

3 Commits

Author SHA1 Message Date
HoloPanio a048e1e824 feat: add CW callback route and optimize cache refresh workflows 2026-03-03 19:46:48 -06:00
HoloPanio 6d935e7180 feat: Redis opportunity cache, CW API retry/logging, adaptive TTLs
- Add Redis-backed opportunity cache with background refresh (30s interval)
- Fix concurrency bug: use lazy thunks instead of eager promises for batching
- Add withCwRetry utility with exponential backoff for transient CW errors
- Add adaptive TTL algorithms (primary, sub-resource, products) based on opportunity activity
- Add include query param on GET /sales/opportunities/:id (notes,contacts,products)
- Add opt-in CW API logger (LOG_CW_API env var) with timestamped files in cw-api-logs/
- Add debug-scripts/analyze-cw-calls.py for API call analysis
- Add computeSubResourceCacheTTL and computeProductsCacheTTL algorithms with tests
- Increase CW API timeout from 15s to 30s
- Unblock cache refresh from startup chain (remove await)
- Prioritize recently updated opportunities in refresh cycle
- Add CACHING.md documentation
- Update API_ROUTES.md with caching details and include param
- Update copilot instructions to require CACHING.md sync
- Add dev:log script for CW API call logging during development
2026-03-02 23:23:24 -06:00
HoloPanio fe71248e88 perf: cache-only strategy for list views, cache-then-cw for single fetch
- Add data-source hierarchy to opportunity manager (cache-only, cache-then-cw, cw-first)
- fetchPages/search/fetchByCompany use cache-only: Redis → DB (no CW calls)
- fetchItem uses cache-then-cw by default, cw-first when fresh=true
- Add idleTimeout: 255 to Bun.serve to prevent request timeouts
- Map CW status 57 (04. Confirmed Quote) to Active equivalency
- Add computeCacheTTL algorithm and opportunityCache module
2026-03-02 21:12:44 -06:00
53 changed files with 6759 additions and 264 deletions
+5 -1
View File
@@ -198,7 +198,11 @@ Whenever you add, remove, or modify API routes or permission nodes, you **must**
2. `PERMISSIONS.md` — human-readable documentation of all permission nodes; must strictly reflect the data in `PermissionNodes.ts`.
3. `API_ROUTES.md` — comprehensive documentation of all API routes, including method, path, auth requirements, permissions, request/response examples.
Always verify that new routes have their required permissions listed in `PermissionNodes.ts`, that `PERMISSIONS.md` tables match the TS file exactly, and that `API_ROUTES.md` includes full documentation for every mounted route. Run through all three files at the end of any route or permission change to catch discrepancies.
Additionally, whenever you add, remove, or modify **caching logic** (TTL algorithms, cache key patterns, background refresh mechanics, retry settings, or invalidation behavior), you **must** update:
4. `CACHING.md` — comprehensive documentation of the Redis-backed opportunity cache, TTL algorithms, background refresh mechanics, retry logic, and debugging tools.
Always verify that new routes have their required permissions listed in `PermissionNodes.ts`, that `PERMISSIONS.md` tables match the TS file exactly, that `API_ROUTES.md` includes full documentation for every mounted route, and that `CACHING.md` accurately reflects any caching changes. Run through all relevant files at the end of any route, permission, or caching change to catch discrepancies.
---
+2
View File
@@ -1,6 +1,8 @@
# Logs
logs
*.log
*.jsonl
cw-api-logs/
npm-debug.log*
yarn-debug.log*
yarn-error.log*
+61 -5
View File
@@ -32,6 +32,56 @@ See [PERMISSIONS.md](PERMISSIONS.md) for the full list of field-level permission
## Authentication Routes
## ConnectWise Callback Routes
### Receive ConnectWise Callback
**POST** `/cw/callback/:secret/:resource`
Receives ConnectWise callback/webhook payloads for supported resources and returns a normalized success response.
**Authentication Required:** No
**Path Parameters:**
- `secret` — Shared callback secret, validated against `CW_CALLBACK_SECRET` when configured
- `resource` — one of: `opportunity`, `ticket`, `company`, `activity`
**Behavior:**
- Parses JSON request body when present.
- Decodes JSON-encoded payload fields such as `Entity`.
- Logs a concise callback summary to console.
**Response:**
```json
{
"status": 200,
"message": "CW callback received.",
"data": {
"resource": "ticket",
"summary": {
"resource": "ticket",
"messageId": "1bec7421-204a-4b30-8b06-465915e9a0f5",
"action": "updated",
"type": "ticket",
"id": 36073,
"memberId": "jroberts",
"entityStatus": "In Progress",
"entitySummary": "Onsite Installation: Rough-In",
"entityUpdatedBy": "cirvine",
"entityLastUpdated": "2026-03-03T21:43:29.903"
},
"bodyParsed": {},
"receivedAt": "2026-03-04T00:00:00.000Z"
},
"successful": true
}
```
---
### Get Authentication URI
**GET** `/auth/uri`
@@ -2704,7 +2754,9 @@ Fetch the distinct values available for filter dropdowns (categories, subcategor
## Sales Routes
Sales routes serve opportunity data stored locally and synced from ConnectWise. All opportunity responses include hydrated company data (address, contacts) fetched from ConnectWise when a linked company exists, as well as an `activities` array containing all ConnectWise activities linked to the opportunity (fetched live from CW at request time). Single-opportunity fetches additionally include full site details (address, phone, flags). Sub-resource routes (products, notes, contacts) fetch live data from ConnectWise using the opportunity's CW ID.
Sales routes serve opportunity data stored locally and synced from ConnectWise. All opportunity responses include hydrated company data (address, contacts) and an `activities` array. Single-opportunity fetches additionally include full site details (address, phone, flags). Sub-resource routes (products, notes, contacts) return data keyed by the opportunity's CW ID.
**Caching:** Most CW data for opportunities is served from a **Redis cache** that is proactively warmed by a background refresh cycle (every 30 seconds). This includes opportunity CW data, activities, notes, contacts, products, and company data. Cache TTLs are adaptive — recently updated opportunities have shorter TTLs (30s–60s) for fresher data, while inactive opportunities use longer TTLs (5–15 minutes). If a cache miss occurs at request time, data is fetched live from CW and cached. See [CACHING.md](CACHING.md) for full details on TTL algorithms, background refresh mechanics, and debugging tools.
### Get Opportunity Types
@@ -2922,7 +2974,7 @@ Get the total number of opportunities.
**GET** `/sales/opportunities/:identifier`
Fetch a single opportunity by its internal ID or ConnectWise opportunity ID. The response includes hydrated company data (with address and contacts from ConnectWise) and full site details (with address) when available.
Fetch a single opportunity by its internal ID or ConnectWise opportunity ID. The response includes hydrated company data (with address and contacts) and full site details (with address) when available. CW data (activities, company, site) is served from the Redis cache when available; on cache miss, data is fetched live from CW and cached with an adaptive TTL.
**Authentication Required:** Yes
@@ -2934,6 +2986,10 @@ Fetch a single opportunity by its internal ID or ConnectWise opportunity ID. The
- `identifier` — Internal ID (cuid) or ConnectWise opportunity ID (numeric)
**Query Parameters:**
- `include` _(optional)_ — Comma-separated list of sub-resources to embed in the response. Supported values: `notes`, `contacts`, `products`. Example: `?include=notes,contacts,products`. Sub-resources are fetched in parallel and added as top-level keys on the response object.
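For example, a client call embedding all three sub-resources might look like the sketch below (base URL, identifier, token handling, and the exact response envelope are illustrative, not prescribed by this route):

```ts
// Illustrative request only — adjust base URL, identifier, and auth to your environment.
const token = process.env.API_TOKEN ?? "";
const res = await fetch(
  "https://api.example.com/v1/sales/opportunities/12345?include=notes,contacts,products",
  { headers: { Authorization: `Bearer ${token}` } },
);
const body = await res.json();
// The requested sub-resources arrive as top-level keys on the opportunity object,
// e.g. notes, contacts, and products alongside the usual opportunity fields.
```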
**Response:**
```json
@@ -3182,7 +3238,7 @@ Refresh an opportunity's local data by fetching the latest from ConnectWise. The
**GET** `/sales/opportunities/:identifier/products`
Fetch products (forecast/revenue line items) for an opportunity. Data is fetched live from ConnectWise using the opportunity's CW ID. Products are returned sorted by the opportunity's local `productSequence` array when set; otherwise, items are sorted by their ConnectWise `sequenceNumber`.
Fetch products (forecast/revenue line items) for an opportunity. Data is served from the Redis cache when available; on cache miss, data is fetched live from ConnectWise and cached. Hot opportunities (updated within 3 days) have a 15-second TTL; others use a 20-minute lazy TTL. Products are returned sorted by the opportunity's local `productSequence` array when set; otherwise, items are sorted by their ConnectWise `sequenceNumber`.
**Authentication Required:** Yes
@@ -3438,7 +3494,7 @@ All fields are optional. Only fields the user has the corresponding `sales.oppor
**GET** `/sales/opportunities/:identifier/notes`
Fetch notes for an opportunity. Data is fetched live from ConnectWise using the opportunity's CW ID.
Fetch notes for an opportunity. Data is served from the Redis cache when available; on cache miss, data is fetched live from ConnectWise and cached. Cache is invalidated automatically when notes are created, updated, or deleted.
**Authentication Required:** Yes
@@ -3651,7 +3707,7 @@ Delete a note from an opportunity in ConnectWise.
**GET** `/sales/opportunities/:identifier/contacts`
Fetch contacts associated with an opportunity. Data is fetched live from ConnectWise using the opportunity's CW ID.
Fetch contacts associated with an opportunity. Data is served from the Redis cache when available; on cache miss, data is fetched live from ConnectWise and cached. Cache is invalidated automatically when contacts are created, updated, or deleted.
**Authentication Required:** Yes
+348
View File
@@ -0,0 +1,348 @@
# Caching Architecture
This document describes the caching layer used in the Optima API, covering the Redis-backed opportunity cache, TTL algorithms, background refresh mechanics, retry logic, and debugging tools.
---
## Overview
The API caches expensive ConnectWise (CW) API responses in **Redis** to reduce latency and avoid CW rate limits. The primary cache layer is the **opportunity cache** (`src/modules/cache/opportunityCache.ts`), which proactively warms data for all non-closed opportunities on a background interval.
### Key design principles
- **Adaptive TTLs** — cache durations are computed dynamically based on how "hot" an opportunity is (recently updated = shorter TTL = fresher data).
- **Background refresh** — a 20-minute interval scans all open opportunities and re-fetches only expired cache keys.
- **Bounded concurrency** — CW API calls are throttled via thunk-based batching to prevent overwhelming the upstream API.
- **Graceful degradation** — transient CW errors (timeouts, network failures) are caught, logged, and retried on the next cycle rather than crashing the process.
- **Priority ordering** — most recently updated opportunities are refreshed first so active deals get fresh data before stale ones.
---
## What is cached
Each non-closed opportunity can have up to 7 cached payloads in Redis:
| Cache Key Pattern | Data | Source |
| ----------------------------------- | ------------------------------------ | --------------------------------------------------------------------- |
| `opp:cw-data:{cwOpportunityId}` | Raw CW opportunity response | `GET /sales/opportunities/:id` |
| `opp:activities:{cwOpportunityId}` | CW activities array | `GET /sales/activities?conditions=opportunity/id=:id` |
| `opp:notes:{cwOpportunityId}` | CW notes array | `GET /sales/opportunities/:id/notes` |
| `opp:contacts:{cwOpportunityId}` | CW contacts array | `GET /sales/opportunities/:id/contacts` |
| `opp:products:{cwOpportunityId}` | Forecast + procurement products blob | `GET /sales/opportunities/:id/forecast` + `GET /procurement/products` |
| `opp:company-cw:{cw_CompanyId}` | Hydrated company + contacts blob | `GET /company/companies/:id` + contacts endpoints |
| `opp:site:{cwCompanyId}:{cwSiteId}` | Company site data | `GET /company/companies/:id/sites/:siteId` |
Inventory-adjustment-driven catalog sync adds a targeted product cache:
| Cache Key Pattern | Data | Source |
| ------------------------ | ---------------------------------------------------------- | -------------------------------------------------------------------------------------------- |
| `catalog:item:cw:{cwId}` | Full CW catalog item + computed `onHand` + DB row snapshot | `GET /procurement/adjustments` + `GET /procurement/catalog/:id` + catalog inventory endpoint |
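As an illustration of how these keys are typically read and written, here is a minimal sketch assuming an ioredis-style client; the helper names (`getCachedNotes`, `setCachedNotes`) are placeholders, not necessarily the actual exports of `opportunityCache.ts`:

```ts
// Sketch only — key pattern plus a read/write pair with a millisecond TTL.
import Redis from "ioredis";

const redis = new Redis(process.env.REDIS_URL ?? "redis://localhost:6379");

const notesKey = (cwOpportunityId: number) => `opp:notes:${cwOpportunityId}`;

async function setCachedNotes(
  cwOpportunityId: number,
  notes: unknown[],
  ttlMs: number | null,
): Promise<void> {
  if (ttlMs === null) return; // a null TTL means "do not cache"
  await redis.set(notesKey(cwOpportunityId), JSON.stringify(notes), "PX", ttlMs);
}

async function getCachedNotes(cwOpportunityId: number): Promise<unknown[] | null> {
  const raw = await redis.get(notesKey(cwOpportunityId));
  return raw ? (JSON.parse(raw) as unknown[]) : null;
}
```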
---
## TTL Algorithms
Three algorithms compute cache TTLs. All share the same input signals:
- `closedFlag` — whether the opportunity is closed
- `closedDate` — when it was closed
- `expectedCloseDate` — projected close date (forward-looking signal)
- `lastUpdated` — last CW modification date (backward-looking signal)
### Primary TTL (`computeCacheTTL`)
**File:** `src/modules/algorithms/computeCacheTTL.ts`
Used for: opportunity CW data, activities, company CW data.
| # | Condition | TTL | Human |
| --- | ------------------------------------------------------- | ---------- | ------------ |
| 1a | Closed > 30 days ago | `null` | Do not cache |
| 1b | Closed within 30 days | 900,000 ms | 15 minutes |
| 2 | `expectedCloseDate` or `lastUpdated` within **5 days** | 30,000 ms | 30 seconds |
| 3 | `expectedCloseDate` or `lastUpdated` within **14 days** | 60,000 ms | 60 seconds |
| 4 | Everything else | 900,000 ms | 15 minutes |
Rules are evaluated top-to-bottom; first match wins.
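As a sketch, the table above translates roughly to the following (field names follow the input signals listed earlier; the actual implementation is `computeCacheTTL.ts` and may differ in edge-case handling):

```ts
// Rough sketch of the primary TTL rules; first match wins, null means "do not cache".
const DAY_MS = 24 * 60 * 60 * 1000;

interface TtlInput {
  closedFlag: boolean;
  closedDate: Date | null;
  expectedCloseDate: Date | null;
  lastUpdated: Date | null;
}

function computeCacheTTLSketch(opp: TtlInput, now = new Date()): number | null {
  const within = (d: Date | null, days: number) =>
    d !== null && Math.abs(now.getTime() - d.getTime()) <= days * DAY_MS;

  if (opp.closedFlag) {
    return within(opp.closedDate, 30) ? 900_000 : null; // rules 1b / 1a
  }
  if (within(opp.expectedCloseDate, 5) || within(opp.lastUpdated, 5)) return 30_000; // rule 2
  if (within(opp.expectedCloseDate, 14) || within(opp.lastUpdated, 14)) return 60_000; // rule 3
  return 900_000; // rule 4
}
```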
### Sub-Resource TTL (`computeSubResourceCacheTTL`)
**File:** `src/modules/algorithms/computeSubResourceCacheTTL.ts`
Used for: notes, contacts.
| # | Condition | TTL | Human |
| --- | --------------------- | ---------- | ------------ |
| 1a | Closed > 30 days ago | `null` | Do not cache |
| 1b | Closed within 30 days | 300,000 ms | 5 minutes |
| 2 | Within **5 days** | 60,000 ms | 60 seconds |
| 3 | Within **14 days** | 120,000 ms | 2 minutes |
| 4 | Everything else | 300,000 ms | 5 minutes |
### Products TTL (`computeProductsCacheTTL`)
**File:** `src/modules/algorithms/computeProductsCacheTTL.ts`
Used for: forecast + procurement products.
| # | Condition | TTL | Human |
| --- | ------------------------------------------- | ------------ | ---------- |
| 1 | Status is Won/Lost/Pending Won/Pending Lost | `null` | No cache |
| 2 | Main cache TTL is `null` | `null` | No cache |
| 3 | `lastUpdated` within **3 days** | 15,000 ms | 15 seconds |
| 4 | Everything else | 1,200,000 ms | 20 minutes |
Products on terminal-status opportunities are never proactively cached. Non-hot products use a **lazy on-demand** cache — they're fetched when requested and cached for 20 minutes.
### Site TTL
Sites use a fixed TTL of **20 minutes** (1,200,000 ms). Site/address data rarely changes. Sites are **not** proactively warmed by the background refresh — they are populated lazily on the first detail-view request.
---
## Background Refresh
**Function:** `refreshOpportunityCache()` in `src/modules/cache/opportunityCache.ts`
**Interval:** Every 20 minutes, triggered from `src/index.ts`.
### Refresh cycle
1. **Query DB** — fetch all non-closed opportunities + recently closed (within 30 days), ordered by `cwLastUpdated DESC` (most recently active first).
2. **Batch EXISTS check** — use a single Redis pipeline to check which cache keys already exist (5 EXISTS commands per opportunity: oppCwData, activities, notes, contacts, products).
3. **Build thunk list** — for each opportunity with missing keys, push a **thunk** (lazy function) into the task list. No HTTP requests fire at this point.
4. **Execute with bounded concurrency** — process thunks in batches of `CONCURRENCY` (currently **6**), with a `BATCH_DELAY_MS` (currently **250ms**) pause between batches. Each thunk is only invoked inside the batch loop.
5. **Emit events** — `cache:opportunities:refresh:started` and `cache:opportunities:refresh:completed` events are emitted for the event debugger.
### Inventory-adjustment listener cycle
**Function:** `listenInventoryAdjustments()` in `src/modules/cw-utils/procurement/listenInventoryAdjustments.ts`
**Interval:** Every 60 seconds, triggered from `src/index.ts`.
1. Fetch `GET /procurement/adjustments?pageSize=1000`.
2. Build a normalized snapshot of tracked inventory rows (`cwCatalogId`, `onHand`, `inventory`) per adjustment.
3. Compare to previous snapshot; extract only changed product IDs.
4. For each changed product ID, fetch fresh CW catalog item + current on-hand.
5. Upsert `CatalogItem` in Postgres and write Redis key `catalog:item:cw:{cwId}` with a 20-minute TTL.
Guardrails to prevent request storms:
- Diffing is computed at **product state** level (grouped by `cwCatalogId`), not raw adjustment-row churn.
- Per-cycle syncs are capped (`CW_ADJUSTMENT_SYNC_MAX_PER_CYCLE`, default `50`).
- Product resync cooldown is enforced (`CW_ADJUSTMENT_SYNC_COOLDOWN_MS`, default `600000` ms / 10 min).
This avoids full-catalog sweeps for small inventory movements and updates only the products implicated by adjustments.
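The diffing step can be pictured with a small sketch (the types and helper names here are illustrative, not the module's actual exports):

```ts
// Sketch of the per-product snapshot diff used to extract only changed catalog IDs.
interface ProductState {
  cwCatalogId: number;
  onHand: number;
  inventory: string;
}

type Snapshot = Map<number, ProductState>; // keyed by cwCatalogId

function changedProductIds(previous: Snapshot | null, current: Snapshot): number[] {
  if (!previous) return []; // first cycle just establishes a baseline
  const changed: number[] = [];
  for (const [id, state] of current) {
    const prev = previous.get(id);
    if (!prev || prev.onHand !== state.onHand || prev.inventory !== state.inventory) {
      changed.push(id);
    }
  }
  // Guardrail: cap syncs per cycle (CW_ADJUSTMENT_SYNC_MAX_PER_CYCLE, default 50).
  const maxPerCycle = Number(process.env.CW_ADJUSTMENT_SYNC_MAX_PER_CYCLE ?? 50);
  return changed.slice(0, maxPerCycle);
}
```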
### Full procurement catalog refresh
**Function:** `refreshCatalog()` in `src/modules/cw-utils/procurement/refreshCatalog.ts`
**Interval:** Every 30 minutes, triggered from `src/index.ts`.
The full catalog cache/DB sync uses the same slow-parallel thunk strategy as opportunity cache refreshes:
- Build arrays of thunk tasks (`() => Promise<void>`) for CW item fetches, inventory fetches, and DB upserts.
- Execute with bounded concurrency (`CONCURRENCY=6`).
- Pause between batches (`BATCH_DELAY_MS=250`) to avoid CW burst pressure.
- Log task failures and retry naturally on the next cycle.
This keeps full-catalog refresh conservative while inventory-adjustment listener handles near-real-time targeted updates.
### Full inventory sweep fallback
`refreshInventory()` remains as a safety net but is intentionally infrequent:
- Runs every **6 hours** from `src/index.ts` (no startup-time full sweep).
- Uses the same slow-parallel pattern (`CONCURRENCY=6`, `BATCH_DELAY_MS=250`) to avoid burst traffic.
Most on-hand freshness now comes from the 60-second adjustment listener plus 30-minute full catalog refresh.
### Concurrency control
The thunk pattern is critical. Previously, tasks were pushed as already-executing promises (`refreshTasks.push(fetchAndCache(...))`), which meant all HTTP requests fired simultaneously regardless of the batching loop. The fix was changing the array type from `Promise<void>[]` to `(() => Promise<void>)[]` so requests only start when explicitly invoked: `batch.map((fn) => fn())`.
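Condensed, the before/after looks something like this (`expiredOpportunities` and `fetchAndCache` stand in for the real task builders):

```ts
// Buggy pattern: pushing already-started promises — every HTTP call fires immediately,
// so the batching loop below only awaits work that is already in flight.
//   const tasks: Promise<void>[] = [];
//   tasks.push(fetchAndCache(opp)); // request starts here, outside the batch loop

// Fixed pattern: push lazy thunks and invoke them only inside the bounded batch loop.
const CONCURRENCY = 6;
const BATCH_DELAY_MS = 250;

const expiredOpportunities: Array<{ cwOpportunityId: number }> = []; // stand-in list
const fetchAndCache = async (_opp: { cwOpportunityId: number }): Promise<void> => {
  /* fetch from CW and write to Redis */
};

const tasks: Array<() => Promise<void>> = [];
for (const opp of expiredOpportunities) {
  tasks.push(() => fetchAndCache(opp)); // nothing fires yet
}

for (let i = 0; i < tasks.length; i += CONCURRENCY) {
  const batch = tasks.slice(i, i + CONCURRENCY);
  await Promise.allSettled(batch.map((fn) => fn())); // requests start here
  if (i + CONCURRENCY < tasks.length) {
    await new Promise((resolve) => setTimeout(resolve, BATCH_DELAY_MS));
  }
}
```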
### Current tuning
| Parameter | Value | Effect |
| ---------------- | ---------- | ------------------------------------------ |
| `CONCURRENCY` | 6 | Max simultaneous CW API requests per batch |
| `BATCH_DELAY_MS` | 250 | Milliseconds between batches |
| Refresh interval | 20 minutes | How often the full sweep runs |
At these settings, a full sweep of ~500 expired keys completes in ~1-2 minutes with zero CW errors and ~230ms median latency.
---
## Retry Logic (`withCwRetry`)
**File:** `src/modules/cw-utils/withCwRetry.ts`
Wraps CW API calls with exponential backoff retry on transient errors.
### Retryable errors
- `ECONNABORTED` (timeout)
- `ECONNRESET`
- `ETIMEDOUT`
- `ECONNREFUSED`
- `ERR_NETWORK`
- `ENETUNREACH`
- HTTP 5xx server errors
### Default configuration
| Parameter | Default | Description |
| ------------- | ------- | ----------------------------------------------------------- |
| `maxAttempts` | 3 | Total attempts including the first |
| `baseDelayMs` | 1,000 | Delay before first retry (doubles each retry: 1s → 2s → 4s) |
| `label` | — | Optional tag for log messages |
### Usage
```ts
import { withCwRetry } from "./withCwRetry";
const response = await withCwRetry(
() => connectWiseApi.get(`/company/companies/${id}`),
{ label: `fetchCompany#${id}`, maxAttempts: 3, baseDelayMs: 1_500 },
);
```
Non-transient errors (404, 400, etc.) are re-thrown immediately without retry.
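For reference, a minimal sketch of such a wrapper (the real implementation lives in `withCwRetry.ts`; the error classification below simply mirrors the lists above):

```ts
// Minimal retry sketch — exponential backoff on transient errors only.
interface RetryOptions {
  maxAttempts?: number; // total attempts including the first
  baseDelayMs?: number; // doubles after each failed attempt
  label?: string;       // optional tag for log messages
}

const RETRYABLE_CODES = new Set([
  "ECONNABORTED", "ECONNRESET", "ETIMEDOUT", "ECONNREFUSED", "ERR_NETWORK", "ENETUNREACH",
]);

const isTransient = (err: any): boolean =>
  RETRYABLE_CODES.has(err?.code) ||
  (typeof err?.response?.status === "number" && err.response.status >= 500);

async function withCwRetrySketch<T>(fn: () => Promise<T>, opts: RetryOptions = {}): Promise<T> {
  const { maxAttempts = 3, baseDelayMs = 1_000, label = "cw" } = opts;
  let delay = baseDelayMs;
  for (let attempt = 1; ; attempt++) {
    try {
      return await fn();
    } catch (err) {
      if (attempt >= maxAttempts || !isTransient(err)) throw err; // non-transient: rethrow now
      console.warn(`[${label}] attempt ${attempt} failed, retrying in ${delay}ms`);
      await new Promise((resolve) => setTimeout(resolve, delay));
      delay *= 2;
    }
  }
}
```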
---
## CW API Logger
**File:** `src/modules/cw-utils/cwApiLogger.ts`
Axios interceptor that logs every CW API call to a JSONL file. Logging is **opt-in** — set the `LOG_CW_API` environment variable to enable it. Each process start creates a new timestamped file in the `cw-api-logs/` directory (e.g., `cw-api-logs/2026-03-02T14-30-05.123Z.jsonl`).
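Conceptually, the interceptor pair looks like the sketch below (simplified; file naming and field handling in the real `cwApiLogger.ts` may differ):

```ts
// Simplified sketch of a JSONL call logger attached via Axios interceptors.
import type { AxiosInstance } from "axios";
import { appendFileSync, mkdirSync } from "node:fs";
import { join } from "node:path";

export function attachCwApiLoggerSketch(api: AxiosInstance): void {
  if (!process.env.LOG_CW_API) return; // opt-in only

  mkdirSync("cw-api-logs", { recursive: true });
  const file = join("cw-api-logs", `${new Date().toISOString().replace(/:/g, "-")}.jsonl`);
  const write = (entry: Record<string, unknown>) =>
    appendFileSync(file, JSON.stringify(entry) + "\n");

  api.interceptors.request.use((config) => {
    (config as any).metadata = { start: performance.now() };
    return config;
  });

  api.interceptors.response.use(
    (response) => {
      const start = (response.config as any).metadata?.start ?? performance.now();
      write({
        timestamp: new Date().toISOString(),
        method: response.config.method?.toUpperCase(),
        url: response.config.url,
        baseURL: response.config.baseURL,
        status: response.status,
        durationMs: Math.round(performance.now() - start),
        error: null,
        timeout: response.config.timeout,
      });
      return response;
    },
    (error) => {
      const start = (error.config as any)?.metadata?.start ?? performance.now();
      write({
        timestamp: new Date().toISOString(),
        method: error.config?.method?.toUpperCase(),
        url: error.config?.url,
        baseURL: error.config?.baseURL,
        status: error.response?.status ?? null,
        durationMs: Math.round(performance.now() - start),
        error: `${error.code ?? "ERR"}: ${error.message}`,
        timeout: error.config?.timeout,
      });
      return Promise.reject(error);
    },
  );
}
```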
### Enabling logging
```bash
# Via the dev:log shorthand script
bun run dev:log
# Or manually with any command
LOG_CW_API=1 bun run dev
```
### Log entry fields
| Field | Type | Description |
| ------------ | ----------------- | ----------------------------------- |
| `timestamp` | string (ISO-8601) | When the request completed |
| `method` | string | HTTP method |
| `url` | string | Request URL (relative or absolute) |
| `baseURL` | string | Axios baseURL |
| `status` | number \| null | HTTP status (null on network error) |
| `durationMs` | number | Wall-clock time in milliseconds |
| `error` | string \| null | Error code + message, if any |
| `timeout` | number | Configured timeout in ms |
### Analysis
Run the analyzer against the most recent log file:
```bash
bun run utils:analyze_cw
```
Or specify a particular file:
```bash
python3 debug-scripts/analyze-cw-calls.py cw-api-logs/2026-03-02T14-30-05.123Z.jsonl
```
This executes `debug-scripts/analyze-cw-calls.py` which produces:
- Overview (total calls, error rate, time span)
- Duration statistics (min, max, mean, p50, p90, p95, p99, distribution histogram)
- Error breakdown by type and endpoint
- Top 20 slowest calls
- Per-endpoint stats (count, errors, mean, p50, p95, max, total time)
- Timeline (per-minute throughput and errors)
- Concurrency hotspot detection
- Summary with recommendations
To clear all logs:
```bash
rm -rf cw-api-logs/
```
---
## Cache Invalidation
Mutation endpoints invalidate the relevant cache keys so the next read fetches fresh data from CW:
| Mutation | Cache invalidated |
| ------------------------------ | ---------------------------------------------------------------- |
| Create/update/delete note | `opp:notes:{cwOpportunityId}` via `invalidateNotesCache()` |
| Create/update/delete contact | `opp:contacts:{cwOpportunityId}` via `invalidateContactsCache()` |
| Add/update/resequence products | `opp:products:{cwOpportunityId}` via `invalidateProductsCache()` |
| Refresh opportunity | All keys for that opportunity (via re-fetch) |
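Invalidation itself is just a key delete. A sketch of one such helper (assuming an ioredis-style client, as in the earlier sketch; the function name is illustrative):

```ts
// Sketch: drop the cached notes for an opportunity after a note mutation,
// so the next read goes back to ConnectWise and re-populates the cache.
import Redis from "ioredis";

const redis = new Redis(process.env.REDIS_URL ?? "redis://localhost:6379");

async function invalidateNotesCacheSketch(cwOpportunityId: number): Promise<void> {
  await redis.del(`opp:notes:${cwOpportunityId}`);
}
```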
---
## ConnectWise API Configuration
The shared Axios instance (`connectWiseApi`) is configured in `src/constants.ts`:
| Setting | Value | Purpose |
| --------- | ---------------------------------------------------- | ------------------------------ |
| `baseURL` | `https://ttscw.totaltech.net/v4_6_release/apis/3.0/` | CW API base |
| `timeout` | 30,000 ms (30s) | Per-request timeout |
| Logger | `attachCwApiLogger()` | Writes timestamped `.jsonl` files to `cw-api-logs/` |
---
## Architecture diagram
```
src/index.ts
├─ setInterval(refreshOpportunityCache, 20m)
└─► src/modules/cache/opportunityCache.ts
├─ prisma.opportunity.findMany(orderBy: cwLastUpdated DESC)
├─ redis.pipeline().exists(...) ← batch key check
├─ Build thunk list (lazy functions)
└─ Execute thunks with CONCURRENCY=6, DELAY=250ms
├─► fetchAndCacheOppCwData() ─► opportunityCw.fetch()
├─► fetchAndCacheActivities() ─► activityCw.fetchByOpportunityDirect()
├─► fetchAndCacheNotes() ─► opportunityCw.fetchNotes()
├─► fetchAndCacheContacts() ─► opportunityCw.fetchContacts()
├─► fetchAndCacheProducts() ─► opportunityCw.fetchProducts() + fetchProcurementProducts()
├─► fetchAndCacheCompanyCwData() ─► fetchCwCompanyById() + contacts
└─► fetchAndCacheSite() ─► fetchCompanySite() (lazy only)
└─► connectWiseApi.get(...) ← withCwRetry + cwApiLogger interceptors
└─► Redis SET with computed TTL
```
---
## File reference
| File | Purpose |
| ---------------------------------------------------------------- | ------------------------------------------------------------- |
| `src/modules/cache/opportunityCache.ts` | Cache read/write helpers, background refresh logic |
| `src/modules/algorithms/computeCacheTTL.ts` | Primary adaptive TTL algorithm |
| `src/modules/algorithms/computeSubResourceCacheTTL.ts` | Sub-resource (notes, contacts) TTL algorithm |
| `src/modules/algorithms/computeProductsCacheTTL.ts` | Products TTL algorithm |
| `src/modules/cw-utils/withCwRetry.ts` | Retry wrapper with exponential backoff |
| `src/modules/cw-utils/cwApiLogger.ts` | Axios interceptor for JSONL call logging |
| `src/modules/cw-utils/fetchCompany.ts` | Company fetch with retry |
| `src/modules/cw-utils/procurement/listenInventoryAdjustments.ts` | Adjustment listener for targeted catalog-item cache + DB sync |
| `src/constants.ts` | CW Axios instance config (timeout, logger) |
| `src/index.ts` | Refresh interval registration |
| `debug-scripts/analyze-cw-calls.py` | CW API call analysis script |
+8
View File
@@ -124,6 +124,14 @@ Admin-specific UI permissions that control visibility and data loading for admin
| `procurement.catalog.inventory.refresh` | Refresh on-hand inventory for a catalog item from ConnectWise | [src/api/procurement/[id]/refreshInventory.ts](src/api/procurement/[id]/refreshInventory.ts) | `procurement.catalog.fetch` |
| `procurement.catalog.link` | Link or unlink catalog items to each other | [src/api/procurement/[id]/link.ts](src/api/procurement/[id]/link.ts), [src/api/procurement/[id]/unlink.ts](src/api/procurement/[id]/unlink.ts) | `procurement.catalog.fetch` |
### ConnectWise Callback Routes
`POST /v1/cw/callback/:secret/:resource` is intentionally unauthenticated for inbound ConnectWise callbacks and does **not** require a permission node.
| Permission Node | Description | Used In | Dependencies |
| --------------- | ------------------------------------------------------------------------------- | ------------------------------------------------ | ------------ |
| _None_ | Inbound callback route; secured operationally (network controls / source trust) | [src/api/cw/callback.ts](src/api/cw/callback.ts) | N/A |
### Sales Permissions
Permissions for accessing and managing sales opportunities. Opportunities are synced from ConnectWise and stored locally; sub-resources (products, notes, contacts) are fetched live from CW.
+307
View File
@@ -0,0 +1,307 @@
#!/usr/bin/env python3
"""
Analyze ConnectWise API call logs.
Looks for the most recent log file in cw-api-logs/ by default,
or accepts an explicit path as an argument.
Usage:
python3 analyze-cw-calls.py # latest file in cw-api-logs/
python3 analyze-cw-calls.py cw-api-logs/specific.jsonl
"""
import json
import sys
import os
import glob
import statistics
from collections import Counter, defaultdict
from datetime import datetime, timedelta
# ── Colours ──────────────────────────────────────────────────────────────────
RED = "\033[91m"
GREEN = "\033[92m"
YELLOW = "\033[93m"
CYAN = "\033[96m"
BOLD = "\033[1m"
DIM = "\033[2m"
RESET = "\033[0m"
def colour_duration(ms: float) -> str:
if ms >= 10_000:
return f"{RED}{ms:,.0f}ms{RESET}"
if ms >= 5_000:
return f"{YELLOW}{ms:,.0f}ms{RESET}"
return f"{GREEN}{ms:,.0f}ms{RESET}"
def header(title: str) -> str:
return f"\n{BOLD}{CYAN}{'' * 60}\n {title}\n{'' * 60}{RESET}"
# ── Resolve log file ────────────────────────────────────────────────────────
def find_latest_log() -> str:
"""Find the most recent .jsonl file in cw-api-logs/."""
log_dir = os.path.join(os.getcwd(), "cw-api-logs")
files = sorted(glob.glob(os.path.join(log_dir, "*.jsonl")))
if not files:
print(f"{RED}No log files found in cw-api-logs/{RESET}")
print(f"Run {BOLD}bun run dev:log{RESET} to start logging CW API calls.")
sys.exit(1)
return files[-1]
if len(sys.argv) > 1:
log_path = sys.argv[1]
else:
log_path = find_latest_log()
print(f"{DIM}Reading: {log_path}{RESET}")
entries = []
parse_errors = 0
with open(log_path) as f:
for line in f:
line = line.strip()
if not line:
continue
try:
entries.append(json.loads(line))
except json.JSONDecodeError:
parse_errors += 1
if not entries:
print("No entries found. Check the log file path.")
sys.exit(1)
# ── Derived fields ───────────────────────────────────────────────────────────
durations = [e["durationMs"] for e in entries]
errors = [e for e in entries if e.get("error")]
successes = [e for e in entries if not e.get("error")]
timestamps = [datetime.fromisoformat(e["timestamp"].replace("Z", "+00:00")) for e in entries]
time_span = (timestamps[-1] - timestamps[0]) if len(timestamps) > 1 else timedelta(0)
# Normalise the URL to a route pattern for grouping
def normalise_url(url: str) -> str:
parts = url.split("?")[0].rstrip("/").split("/")
normalised = []
for p in parts:
if p.isdigit():
normalised.append(":id")
else:
normalised.append(p)
return "/".join(normalised)
# ── 1. Overview ──────────────────────────────────────────────────────────────
print(header("OVERVIEW"))
print(f" Log file : {log_path}")
print(f" Total calls : {BOLD}{len(entries):,}{RESET}")
print(f" Successes : {GREEN}{len(successes):,}{RESET}")
print(f" Failures : {RED}{len(errors):,}{RESET} ({len(errors)/len(entries)*100:.1f}%)")
print(f" Time span : {time_span}")
if time_span.total_seconds() > 0:
rps = len(entries) / time_span.total_seconds()
print(f" Avg req/sec : {rps:.2f}")
if parse_errors:
print(f" Parse errors : {YELLOW}{parse_errors}{RESET}")
# ── 2. Duration stats ───────────────────────────────────────────────────────
print(header("DURATION STATS (all calls)"))
sorted_dur = sorted(durations)
p50 = sorted_dur[len(sorted_dur) * 50 // 100]
p90 = sorted_dur[len(sorted_dur) * 90 // 100]
p95 = sorted_dur[len(sorted_dur) * 95 // 100]
p99 = sorted_dur[len(sorted_dur) * 99 // 100]
print(f" Min : {colour_duration(min(durations))}")
print(f" Max : {colour_duration(max(durations))}")
print(f" Mean : {colour_duration(statistics.mean(durations))}")
print(f" Median (p50) : {colour_duration(p50)}")
print(f" p90 : {colour_duration(p90)}")
print(f" p95 : {colour_duration(p95)}")
print(f" p99 : {colour_duration(p99)}")
print(f" Std dev : {statistics.stdev(durations):,.0f}ms" if len(durations) > 1 else "")
# Duration buckets
buckets = {"<500ms": 0, "500ms-1s": 0, "1-3s": 0, "3-5s": 0, "5-10s": 0, "10-20s": 0, "20s+": 0}
for d in durations:
if d < 500: buckets["<500ms"] += 1
elif d < 1000: buckets["500ms-1s"] += 1
elif d < 3000: buckets["1-3s"] += 1
elif d < 5000: buckets["3-5s"] += 1
elif d < 10000: buckets["5-10s"] += 1
elif d < 20000: buckets["10-20s"] += 1
else: buckets["20s+"] += 1
print(f"\n {BOLD}Distribution:{RESET}")
max_bar = 40
max_count = max(buckets.values()) if buckets else 1
for label, count in buckets.items():
bar_len = int(count / max_count * max_bar) if max_count else 0
pct = count / len(durations) * 100
bar = "" * bar_len
clr = GREEN if "500" in label or "<" in label else (YELLOW if "1-3" in label or "3-5" in label else RED)
print(f" {label:>10s} {clr}{bar}{RESET} {count:>5,} ({pct:5.1f}%)")
# ── 3. Errors breakdown ─────────────────────────────────────────────────────
print(header("ERROR BREAKDOWN"))
if not errors:
print(f" {GREEN}No errors! 🎉{RESET}")
else:
error_codes = Counter()
for e in errors:
err_str = e.get("error", "unknown")
code = err_str.split(":")[0] if ":" in err_str else err_str
error_codes[code] += 1
for code, count in error_codes.most_common():
print(f" {RED}{code:<30s}{RESET} {count:>5,} ({count/len(entries)*100:.1f}%)")
# Errored URLs
errored_urls = Counter(normalise_url(e["url"]) for e in errors)
print(f"\n {BOLD}Top errored endpoints:{RESET}")
for url, count in errored_urls.most_common(10):
print(f" {count:>5,} {url}")
# ── 4. Slowest individual calls ─────────────────────────────────────────────
print(header("TOP 20 SLOWEST CALLS"))
slowest = sorted(entries, key=lambda e: e["durationMs"], reverse=True)[:20]
for i, e in enumerate(slowest, 1):
status = e.get("status") or f"{RED}ERR{RESET}"
err_tag = f" {RED}[{e['error'].split(':')[0]}]{RESET}" if e.get("error") else ""
print(f" {i:>2}. {colour_duration(e['durationMs']):>20s} {e['method']:>4s} {e['url'][:60]:<60s} {DIM}{status}{RESET}{err_tag}")
# ── 5. Per-endpoint stats ───────────────────────────────────────────────────
print(header("PER-ENDPOINT STATS (by route pattern)"))
by_route = defaultdict(list)
for e in entries:
route = normalise_url(e["url"])
by_route[route].append(e)
# Sort by total time spent descending (most impactful)
route_stats = []
for route, calls in by_route.items():
durs = [c["durationMs"] for c in calls]
errs = sum(1 for c in calls if c.get("error"))
sorted_d = sorted(durs)
route_stats.append({
"route": route,
"count": len(calls),
"errors": errs,
"total_ms": sum(durs),
"mean": statistics.mean(durs),
"p50": sorted_d[len(sorted_d) * 50 // 100],
"p95": sorted_d[len(sorted_d) * 95 // 100],
"max": max(durs),
})
route_stats.sort(key=lambda r: r["total_ms"], reverse=True)
print(f" {'Route':<55s} {'Count':>6s} {'Errs':>5s} {'Mean':>8s} {'p50':>8s} {'p95':>8s} {'Max':>8s} {'Total':>10s}")
print(f" {'' * 55} {'' * 6} {'' * 5} {'' * 8} {'' * 8} {'' * 8} {'' * 8} {'' * 10}")
for r in route_stats[:25]:
err_str = f"{RED}{r['errors']}{RESET}" if r['errors'] else f"{DIM}0{RESET}"
print(
f" {r['route']:<55s} {r['count']:>6,} {err_str:>14s} "
f"{r['mean']:>7,.0f}ms {r['p50']:>7,.0f}ms {r['p95']:>7,.0f}ms "
f"{r['max']:>7,.0f}ms {r['total_ms']/1000:>8,.1f}s"
)
# ── 6. HTTP method breakdown ────────────────────────────────────────────────
print(header("BY HTTP METHOD"))
by_method = defaultdict(list)
for e in entries:
by_method[e["method"]].append(e["durationMs"])
print(f" {'Method':<8s} {'Count':>7s} {'Mean':>9s} {'p95':>9s} {'Max':>9s}")
print(f" {'' * 8} {'' * 7} {'' * 9} {'' * 9} {'' * 9}")
for method in sorted(by_method.keys()):
durs = by_method[method]
sd = sorted(durs)
print(
f" {method:<8s} {len(durs):>7,} "
f"{statistics.mean(durs):>8,.0f}ms "
f"{sd[len(sd)*95//100]:>8,.0f}ms "
f"{max(durs):>8,.0f}ms"
)
# ── 7. Timeline (calls per minute) ──────────────────────────────────────────
if time_span.total_seconds() > 60:
print(header("TIMELINE (per-minute throughput & errors)"))
by_minute = defaultdict(lambda: {"count": 0, "errors": 0, "dur_sum": 0})
for e in entries:
ts = e["timestamp"][:16] # YYYY-MM-DDTHH:MM
by_minute[ts]["count"] += 1
by_minute[ts]["dur_sum"] += e["durationMs"]
if e.get("error"):
by_minute[ts]["errors"] += 1
for minute in sorted(by_minute.keys()):
m = by_minute[minute]
avg = m["dur_sum"] / m["count"] if m["count"] else 0
err_part = f" {RED}({m['errors']} errs){RESET}" if m["errors"] else ""
bar = "" * min(m["count"] // 5, 50)
avg_clr = colour_duration(avg)
print(f" {minute} {m['count']:>5,} reqs avg {avg_clr:>20s} {bar}{err_part}")
# ── 8. Concurrency hotspots ─────────────────────────────────────────────────
print(header("CONCURRENCY HOTSPOTS (calls starting within 100ms of each other)"))
ts_ms = [int(t.timestamp() * 1000) for t in timestamps]
bursts = []
i = 0
while i < len(ts_ms):
j = i
while j < len(ts_ms) - 1 and ts_ms[j + 1] - ts_ms[j] < 100:
j += 1
burst_size = j - i + 1
if burst_size >= 5:
burst_entries = entries[i:j + 1]
avg_dur = statistics.mean(e["durationMs"] for e in burst_entries)
bursts.append((burst_size, entries[i]["timestamp"], avg_dur, burst_entries))
i = j + 1
bursts.sort(key=lambda b: b[0], reverse=True)
if bursts:
print(f" Found {len(bursts)} burst(s) of ≥5 concurrent requests\n")
for size, ts, avg, _ in bursts[:10]:
print(f" {YELLOW}{size:>3} concurrent{RESET} at {ts} avg {colour_duration(avg)}")
else:
print(f" {GREEN}No major concurrency bursts detected.{RESET}")
# ── 9. Summary / recommendations ────────────────────────────────────────────
print(header("SUMMARY"))
err_rate = len(errors) / len(entries) * 100
slow_5s = sum(1 for d in durations if d >= 5000)
slow_pct = slow_5s / len(entries) * 100
if err_rate > 5:
print(f" {RED}⚠ Error rate is {err_rate:.1f}% — CW API is struggling{RESET}")
elif err_rate > 1:
print(f" {YELLOW}⚠ Error rate is {err_rate:.1f}% — some instability{RESET}")
else:
print(f" {GREEN}✓ Error rate is {err_rate:.1f}% — acceptable{RESET}")
if slow_pct > 10:
print(f" {RED}{slow_5s:,} calls ({slow_pct:.1f}%) took >5s — CW is slow or rate-limiting{RESET}")
elif slow_pct > 2:
print(f" {YELLOW}{slow_5s:,} calls ({slow_pct:.1f}%) took >5s{RESET}")
else:
print(f" {GREEN}✓ Only {slow_5s:,} calls ({slow_pct:.1f}%) over 5s{RESET}")
if bursts:
max_burst = max(b[0] for b in bursts)
print(f" {YELLOW}⚠ Max concurrency burst: {max_burst} simultaneous requests — consider lowering CONCURRENCY{RESET}")
total_time_s = sum(durations) / 1000
print(f"\n Total wall-clock time spent waiting on CW: {BOLD}{total_time_s:,.1f}s{RESET} ({total_time_s/60:,.1f} min)")
print()
+441
View File
@@ -0,0 +1,441 @@
#!/usr/bin/env python3
import argparse
import json
from collections import Counter, defaultdict
from dataclasses import dataclass
from datetime import datetime, timezone
from pathlib import Path
from typing import Any
def parse_iso(value: str | None) -> datetime | None:
if not value:
return None
normalized = value.replace("Z", "+00:00")
try:
return datetime.fromisoformat(normalized)
except ValueError:
return None
def first_non_empty(*values: Any) -> str:
for value in values:
if value is None:
continue
if isinstance(value, str) and value.strip() == "":
continue
return str(value)
return "unknown"
def top_lines(counter: Counter[str], limit: int) -> list[str]:
return [f"{k}: {v}" for k, v in counter.most_common(limit)]
def fmt_pct(part: int, total: int) -> str:
if total == 0:
return "0.0%"
return f"{(part / total) * 100:.1f}%"
def human_duration(start: datetime | None, end: datetime | None) -> str:
if start is None or end is None:
return "unknown"
delta = end - start
total_seconds = int(delta.total_seconds())
hours, remainder = divmod(total_seconds, 3600)
minutes, seconds = divmod(remainder, 60)
return f"{hours}h {minutes}m {seconds}s"
def truncate(value: str, max_len: int = 90) -> str:
if len(value) <= max_len:
return value
return value[: max_len - 1] + "…"
def add_section(lines: list[str], title: str) -> None:
lines.append("")
lines.append(title)
lines.append("-" * len(title))
def supports_color(enabled: bool) -> bool:
return enabled
def paint(text: str, code: str, use_color: bool) -> str:
if not use_color:
return text
return f"\033[{code}m{text}\033[0m"
def good_bad_neutral(value: str, state: str, use_color: bool) -> str:
if state == "good":
return paint(value, "32", use_color)
if state == "bad":
return paint(value, "31", use_color)
return paint(value, "36", use_color)
def add_ranked_counter(
lines: list[str],
title: str,
counter: Counter[str],
top_n: int,
total: int,
truncate_labels: bool = False,
) -> None:
lines.append(f"{title}")
items = counter.most_common(top_n)
if not items:
lines.append(" (no data)")
return
for index, (key, count) in enumerate(items, start=1):
label = truncate(key) if truncate_labels else key
lines.append(f" {index:>2}. {label:<90} {count:>4} {fmt_pct(count, total):>6}")
def stream_row_summary(row: dict[str, Any], use_color: bool, max_path: int) -> str:
request = row.get("request") or {}
response = row.get("response") or {}
body_parsed = request.get("bodyParsed") or {}
entity_parsed = request.get("entityParsed") or {}
summary = request.get("summary") or {}
timestamp = parse_iso(row.get("timestamp"))
time_label = timestamp.astimezone(timezone.utc).strftime("%H:%M:%S") if timestamp else "--:--:--"
method = first_non_empty(request.get("method"))
path = first_non_empty(request.get("path"))
endpoint = path.split("?", 1)[0]
status_code = first_non_empty(response.get("status"))
event_type = first_non_empty(body_parsed.get("Type"), summary.get("type"))
action = first_non_empty(
body_parsed.get("Action"),
summary.get("action"),
request.get("query", {}).get("params", {}).get("action"),
)
item_id = first_non_empty(body_parsed.get("ID"), summary.get("id"), request.get("query", {}).get("inferredId"))
actor = first_non_empty(
request.get("query", {}).get("params", {}).get("memberId"),
summary.get("entityUpdatedBy"),
entity_parsed.get("UpdatedBy"),
)
entity_status = first_non_empty(entity_parsed.get("StatusName"), summary.get("entityStatus"))
endpoint_label = truncate(endpoint, max_path)
status_state = "good" if status_code.startswith("2") else "bad"
status_colored = good_bad_neutral(status_code, status_state, use_color)
event_colored = paint(f"{event_type}.{action}", "36", use_color)
endpoint_colored = paint(endpoint_label, "94", use_color)
return (
f"[{time_label}] {method:<4} {endpoint_colored:<20} "
f"{status_colored:>3} {event_colored:<22} "
f"id={item_id:<7} actor={truncate(actor, 16):<16} status={truncate(entity_status, 22)}"
)
def endpoint_stream_summary(log_path: Path, use_color: bool, max_path: int) -> str:
lines: list[str] = []
lines.append(paint("ENDPOINT STREAM (chronological)", "1;95", use_color))
lines.append(paint("────────────────────────────────────────────────────────────────────────────────────────────", "90", use_color))
count = 0
invalid = 0
with log_path.open("r", encoding="utf-8") as handle:
for raw_line in handle:
line = raw_line.strip()
if not line:
continue
try:
row = json.loads(line)
except json.JSONDecodeError:
invalid += 1
continue
lines.append(stream_row_summary(row, use_color=use_color, max_path=max_path))
count += 1
lines.append(paint("────────────────────────────────────────────────────────────────────────────────────────────", "90", use_color))
lines.append(
f"events={count} invalid={good_bad_neutral(str(invalid), 'good' if invalid == 0 else 'bad', use_color)}"
)
return "\n".join(lines)
@dataclass
class LogStats:
total_rows: int = 0
parsed_rows: int = 0
invalid_rows: int = 0
earliest: datetime | None = None
latest: datetime | None = None
methods: Counter[str] = None # type: ignore[assignment]
paths: Counter[str] = None # type: ignore[assignment]
endpoint_roots: Counter[str] = None # type: ignore[assignment]
response_statuses: Counter[str] = None # type: ignore[assignment]
event_types: Counter[str] = None # type: ignore[assignment]
actions: Counter[str] = None # type: ignore[assignment]
type_action_combo: Counter[str] = None # type: ignore[assignment]
company_ids: Counter[str] = None # type: ignore[assignment]
source_members: Counter[str] = None # type: ignore[assignment]
actor_members: Counter[str] = None # type: ignore[assignment]
entity_updated_by: Counter[str] = None # type: ignore[assignment]
requests_by_hour: Counter[str] = None # type: ignore[assignment]
requests_by_minute: Counter[str] = None # type: ignore[assignment]
endpoint_by_hour: dict[str, Counter[str]] = None # type: ignore[assignment]
def __post_init__(self) -> None:
self.methods = Counter()
self.paths = Counter()
self.endpoint_roots = Counter()
self.response_statuses = Counter()
self.event_types = Counter()
self.actions = Counter()
self.type_action_combo = Counter()
self.company_ids = Counter()
self.source_members = Counter()
self.actor_members = Counter()
self.entity_updated_by = Counter()
self.requests_by_hour = Counter()
self.requests_by_minute = Counter()
self.endpoint_by_hour = defaultdict(Counter)
def add_timestamp(self, timestamp: datetime | None) -> None:
if timestamp is None:
return
self.earliest = timestamp if self.earliest is None else min(self.earliest, timestamp)
self.latest = timestamp if self.latest is None else max(self.latest, timestamp)
hour_bucket = timestamp.astimezone(timezone.utc).strftime("%Y-%m-%d %H:00 UTC")
minute_bucket = timestamp.astimezone(timezone.utc).strftime("%Y-%m-%d %H:%M UTC")
self.requests_by_hour[hour_bucket] += 1
self.requests_by_minute[minute_bucket] += 1
def summarize(self, top_n: int, busiest_n: int, use_color: bool) -> str:
duration_line = human_duration(self.earliest, self.latest)
time_range_line = "unknown"
if self.earliest and self.latest:
time_range_line = f"{self.earliest.isoformat()}{self.latest.isoformat()}"
total_requests = self.parsed_rows
success_count = self.response_statuses.get("200", 0)
success_pct = fmt_pct(success_count, sum(self.response_statuses.values()))
invalid_state = "good" if self.invalid_rows == 0 else "bad"
top_endpoints = self.endpoint_roots.most_common(2)
top_users = self.actor_members.most_common(3)
top_minutes = self.requests_by_minute.most_common(busiest_n)
lines: list[str] = []
lines.append(paint("WEBHOOK SNAPSHOT", "1;95", use_color))
lines.append(paint("────────────────────────────────────────────────────────", "90", use_color))
lines.append(
" "
+ paint("Rows", "1;97", use_color)
+ f": {self.total_rows:<4} "
+ paint("Parsed", "1;97", use_color)
+ f": {self.parsed_rows:<4} "
+ paint("Invalid", "1;97", use_color)
+ f": {good_bad_neutral(str(self.invalid_rows), invalid_state, use_color)}"
)
lines.append(
" "
+ paint("Window", "1;97", use_color)
+ f": {duration_line:<12} "
+ paint("Success", "1;97", use_color)
+ f": {good_bad_neutral(success_pct, 'good' if success_count else 'neutral', use_color)}"
)
lines.append(" " + paint("UTC Range", "1;97", use_color) + f": {time_range_line}")
lines.append("")
lines.append(paint("Top Endpoints", "1;94", use_color))
if top_endpoints:
for endpoint, count in top_endpoints:
lines.append(f"{endpoint:<14} {count:>4} ({fmt_pct(count, total_requests)})")
if not top_endpoints:
lines.append(" • (no data)")
lines.append("")
lines.append(paint("Most Active Users (query memberId)", "1;94", use_color))
if top_users:
for user, count in top_users:
lines.append(f"{user:<18} {count:>4} ({fmt_pct(count, total_requests)})")
if not top_users:
lines.append(" • (no data)")
lines.append("")
lines.append(paint("Busiest Minutes", "1;94", use_color))
if top_minutes:
for minute, count in top_minutes:
lines.append(f"{minute:<22} {count:>3}")
if not top_minutes:
lines.append(" • (no data)")
lines.append("")
lines.append(paint("Request Mix", "1;94", use_color))
method_line = ", ".join([f"{k}:{v}" for k, v in self.methods.most_common(3)]) or "(no data)"
event_line = ", ".join([f"{k}:{v}" for k, v in self.event_types.most_common(3)]) or "(no data)"
action_line = ", ".join([f"{k}:{v}" for k, v in self.actions.most_common(3)]) or "(no data)"
lines.append(f" • Methods : {method_line}")
lines.append(f" • Types : {event_line}")
lines.append(f" • Actions : {action_line}")
lines.append("")
lines.append(paint("Status Codes", "1;94", use_color))
if self.response_statuses:
status_total = sum(self.response_statuses.values())
for status, count in self.response_statuses.most_common(5):
state = "good" if status.startswith("2") else "bad"
status_label = good_bad_neutral(status, state, use_color)
lines.append(f"{status_label}: {count} ({fmt_pct(count, status_total)})")
if not self.response_statuses:
lines.append(" • (no data)")
return "\n".join(lines)
def update_stats(stats: LogStats, row: dict[str, Any]) -> None:
timestamp = parse_iso(row.get("timestamp"))
stats.add_timestamp(timestamp)
request = row.get("request") or {}
response = row.get("response") or {}
body_parsed = request.get("bodyParsed") or {}
entity_parsed = request.get("entityParsed") or {}
method = first_non_empty(request.get("method"))
path = first_non_empty(request.get("path"))
endpoint_root = path.split("?", 1)[0]
status = first_non_empty(response.get("status"))
event_type = first_non_empty(
body_parsed.get("Type"),
request.get("summary", {}).get("type"),
)
action = first_non_empty(
body_parsed.get("Action"),
request.get("summary", {}).get("action"),
request.get("query", {}).get("params", {}).get("action"),
)
combo = f"{event_type}:{action}"
source_member = first_non_empty(
body_parsed.get("MemberId"),
request.get("summary", {}).get("memberId"),
)
actor_member = first_non_empty(
request.get("query", {}).get("params", {}).get("memberId"),
request.get("summary", {}).get("entityUpdatedBy"),
)
updated_by = first_non_empty(
entity_parsed.get("UpdatedBy"),
request.get("summary", {}).get("entityUpdatedBy"),
)
company_id = first_non_empty(body_parsed.get("CompanyId"), request.get("headers", {}).get("companyname"))
stats.methods[method] += 1
stats.paths[path] += 1
stats.endpoint_roots[endpoint_root] += 1
stats.response_statuses[status] += 1
stats.event_types[event_type] += 1
stats.actions[action] += 1
stats.type_action_combo[combo] += 1
stats.company_ids[company_id] += 1
stats.source_members[source_member] += 1
stats.actor_members[actor_member] += 1
stats.entity_updated_by[updated_by] += 1
if timestamp:
bucket = timestamp.astimezone(timezone.utc).strftime("%Y-%m-%d %H:00 UTC")
stats.endpoint_by_hour[endpoint_root][bucket] += 1
def analyze_file(log_path: Path) -> LogStats:
stats = LogStats()
with log_path.open("r", encoding="utf-8") as handle:
for raw_line in handle:
line = raw_line.strip()
if not line:
continue
stats.total_rows += 1
try:
row = json.loads(line)
except json.JSONDecodeError:
stats.invalid_rows += 1
continue
stats.parsed_rows += 1
update_stats(stats, row)
return stats
def main() -> None:
parser = argparse.ArgumentParser(
description="Analyze webhook JSONL logs by users, time, and request types."
)
parser.add_argument("log_file", help="Path to JSONL log file")
parser.add_argument("--top", type=int, default=10, help="Top N entries per section (default: 10)")
parser.add_argument(
"--busiest-minutes",
type=int,
default=5,
help="How many top minute buckets to show (default: 5)",
)
parser.add_argument(
"--no-color",
action="store_true",
help="Disable ANSI colors",
)
parser.add_argument(
"--endpoint-stream",
action="store_true",
help="Show chronological one-line summary per webhook, similar to live test webserver logs",
)
parser.add_argument(
"--max-path",
type=int,
default=18,
help="Max endpoint width in stream mode before truncation (default: 18)",
)
args = parser.parse_args()
log_path = Path(args.log_file)
if not log_path.exists() or not log_path.is_file():
raise SystemExit(f"Log file not found: {log_path}")
use_color = supports_color(not args.no_color)
if args.endpoint_stream:
print(endpoint_stream_summary(log_path, use_color=use_color, max_path=max(args.max_path, 10)))
return
stats = analyze_file(log_path)
print(
stats.summarize(
top_n=max(args.top, 1),
busiest_n=max(args.busiest_minutes, 1),
use_color=use_color,
)
)
if __name__ == "__main__":
main()
+4
View File
@@ -19,6 +19,7 @@
},
"scripts": {
"dev": "NODE_ENV=development bun --watch src/index.ts",
"dev:log": "LOG_CW_API=1 NODE_ENV=development bun --watch src/index.ts",
"test": "bun test --preload ./tests/setup.ts",
"db:gen": "prisma generate",
"db:push": "prisma migrate dev --skip-generate",
@@ -27,6 +28,9 @@
"utils:gen_private_keys": "bun ./utils/genPrivateKeys",
"utils:create_admin_role": "bun ./utils/createAdminRole",
"utils:assign_user_role": "bun ./utils/assignUserRole",
"utils:test_webserver": "bun ./utils/testWebserver.ts",
"utils:test_adjustments_poll": "bun ./utils/testAdjustmentsPoll.ts",
"utils:analyze_cw": "python3 debug-scripts/analyze-cw-calls.py",
"db:check": "bunx prisma migrate diff --from-migrations prisma/migrations --to-schema prisma/schema.prisma --shadow-database-url $DATABASE_URL --exit-code"
},
"dependencies": {
+164
View File
@@ -0,0 +1,164 @@
import { createRoute } from "../../modules/api-utils/createRoute";
import { apiResponse } from "../../modules/api-utils/apiResponse";
import { ContentfulStatusCode } from "hono/utils/http-status";
import { z } from "zod";
import GenericError from "../../Errors/GenericError";
type ParsedJson = Record<string, unknown> | unknown[];
const callbackResource = z.enum([
"opportunity",
"ticket",
"company",
"activity",
]);
const safeParseJson = (value: string): ParsedJson | null => {
try {
const parsed = JSON.parse(value);
const isObject = typeof parsed === "object" && parsed !== null;
return isObject ? (parsed as ParsedJson) : null;
} catch {
return null;
}
};
const asObject = (value: ParsedJson | null): Record<string, unknown> | null => {
if (!value) return null;
if (Array.isArray(value)) return null;
return value;
};
const parseJsonStringFields = (
value: Record<string, unknown> | null,
): Record<string, unknown> | null => {
if (!value) return null;
return Object.entries(value).reduce<Record<string, unknown>>(
(acc, [key, current]) => {
if (typeof current !== "string") {
acc[key] = current;
return acc;
}
const looksLikeJson = current.startsWith("{") || current.startsWith("[");
if (!looksLikeJson) {
acc[key] = current;
return acc;
}
const parsed = safeParseJson(current);
acc[key] = parsed ?? current;
return acc;
},
{},
);
};
const parseEntity = (value: unknown): ParsedJson | null => {
if (typeof value === "string") return safeParseJson(value);
if (typeof value !== "object" || value === null) return null;
return value as ParsedJson;
};
const buildSummary = (
resource: z.infer<typeof callbackResource>,
parsedBody: Record<string, unknown> | null,
parsedEntity: Record<string, unknown> | null,
) => {
if (!parsedBody) return null;
return {
resource,
messageId: parsedBody.MessageId ?? null,
action: parsedBody.Action ?? null,
type: parsedBody.Type ?? null,
id: parsedBody.ID ?? null,
memberId: parsedBody.MemberId ?? null,
entityStatus:
parsedEntity?.StatusName ??
parsedEntity?.TicketStatus ??
parsedEntity?.Status ??
null,
entitySummary: parsedEntity?.Summary ?? parsedEntity?.CompanyName ?? null,
entityUpdatedBy: parsedEntity?.UpdatedBy ?? null,
entityLastUpdated:
parsedEntity?.LastUpdatedUTC ?? parsedEntity?.LastUpdated ?? null,
};
};
const parseHeaders = (headers: Headers): Record<string, string> =>
Object.fromEntries(headers.entries());
const callbackHeaderSummary = (headers: Record<string, string>) => ({
contentType: headers["content-type"] ?? null,
userAgent: headers["user-agent"] ?? null,
host: headers.host ?? null,
forwardedFor: headers["x-forwarded-for"] ?? null,
callbackId:
headers["x-cw-request-id"] ??
headers["x-request-id"] ??
headers["x-correlation-id"] ??
null,
});
/* POST /v1/cw/callback/:secret/:resource */
export default createRoute("post", ["/callback/:secret/:resource"], async (c) => {
const suppliedSecret = c.req.param("secret");
const expectedSecret = process.env.CW_CALLBACK_SECRET;
if (expectedSecret && suppliedSecret !== expectedSecret) {
throw new GenericError({
name: "Unauthorized",
message: "Invalid callback secret.",
cause: "Path secret mismatch",
status: 401,
});
}
if (!expectedSecret) {
console.warn(
"[cw-callback] CW_CALLBACK_SECRET is not configured; accepting path secret without verification",
);
}
const resource = callbackResource.parse(c.req.param("resource"));
const headers = parseHeaders(c.req.raw.headers);
const headerSummary = callbackHeaderSummary(headers);
const rawBody = await c.req.text();
const parsedJson = safeParseJson(rawBody);
const parsedBody = asObject(parsedJson);
const parsedBodyExpanded = parseJsonStringFields(parsedBody);
const parsedEntity = asObject(parseEntity(parsedBodyExpanded?.Entity));
const summary = buildSummary(resource, parsedBodyExpanded, parsedEntity);
const line = [
`[cw-callback] resource=${resource}`,
`action=${String(summary?.action ?? "-")}`,
`type=${String(summary?.type ?? "-")}`,
`id=${String(summary?.id ?? "-")}`,
`by=${String(summary?.entityUpdatedBy ?? summary?.memberId ?? "-")}`,
`requestId=${String(headerSummary.callbackId ?? "-")}`,
`status=${String(summary?.entityStatus ?? "-")}`,
`summary=${String(summary?.entitySummary ?? "-")}`,
].join(" ");
console.log(line);
const response = apiResponse.successful("CW callback received.", {
resource,
secretValidated: Boolean(expectedSecret),
summary,
headers,
headerSummary,
bodyParsed: parsedBodyExpanded,
receivedAt: new Date().toISOString(),
});
return c.json(response, response.status as ContentfulStatusCode);
});
+3
View File
@@ -0,0 +1,3 @@
import { default as callback } from "./callback";
export { callback };
+7
View File
@@ -0,0 +1,7 @@
import { Hono } from "hono";
import * as cwRoutes from "../cw";
const cwRouter = new Hono();
Object.values(cwRoutes).map((r) => cwRouter.route("/", r));
export default cwRouter;
+1 -1
View File
@@ -53,7 +53,7 @@ export default createRoute(
),
);
const item = await opportunities.fetchItem(identifier);
const item = await opportunities.fetchRecord(identifier);
const created = await item.addProducts(gatedItems);
const isBatch = Array.isArray(body);
+1 -1
View File
@@ -10,7 +10,7 @@ export default createRoute(
["/opportunities/:identifier/contacts"],
async (c) => {
const identifier = c.req.param("identifier");
const item = await opportunities.fetchItem(identifier);
const item = await opportunities.fetchRecord(identifier);
const data = await item.fetchContacts();
+1 -1
View File
@@ -21,7 +21,7 @@ export default createRoute(
const data = schema.parse(body);
const item = await opportunities.fetchItem(identifier);
const item = await opportunities.fetchRecord(identifier);
const user = c.get("user");
const created = await item.addNote(data.text, user.login, {
+1 -1
View File
@@ -20,7 +20,7 @@ export default createRoute(
message: "Note ID must be a number",
});
const item = await opportunities.fetchItem(identifier);
const item = await opportunities.fetchRecord(identifier);
await item.deleteNote(noteId);
const response = apiResponse.successful(
+179 -4
View File
@@ -4,30 +4,205 @@ import { apiResponse } from "../../../modules/api-utils/apiResponse";
import { ContentfulStatusCode } from "hono/utils/http-status";
import { authMiddleware } from "../../middleware/authorization";
import { processObjectValuePerms } from "../../../modules/permission-utils/processObjectPermissions";
import GenericError from "../../../Errors/GenericError";
import { prisma } from "../../../constants";
import { computeSubResourceCacheTTL } from "../../../modules/algorithms/computeSubResourceCacheTTL";
import { computeProductsCacheTTL } from "../../../modules/algorithms/computeProductsCacheTTL";
import {
getCachedSite,
getCachedNotes,
getCachedContacts,
getCachedProducts,
fetchAndCacheNotes,
fetchAndCacheContacts,
fetchAndCacheProducts,
fetchAndCacheSite,
} from "../../../modules/cache/opportunityCache";
/* GET /v1/sales/opportunities/:identifier */
/* GET /v1/sales/opportunities/:identifier?include=notes,contacts,products */
export default createRoute(
"get",
["/opportunities/:identifier"],
async (c) => {
const t0 = performance.now();
const identifier = c.req.param("identifier");
const includeParam = c.req.query("include") ?? "";
const includes = new Set(
includeParam
.split(",")
.map((s) => s.trim().toLowerCase())
.filter(Boolean),
);
const item = await opportunities.fetchItem(identifier);
// ── Quick DB lookup (≈3ms) to get cwOpportunityId for pre-warming ──
const isNumeric = /^\d+$/.test(identifier);
const dbRecord = await prisma.opportunity.findFirst({
where: isNumeric
? { cwOpportunityId: Number(identifier) }
: { id: identifier },
select: {
cwOpportunityId: true,
companyCwId: true,
siteCwId: true,
closedFlag: true,
closedDate: true,
expectedCloseDate: true,
cwLastUpdated: true,
statusCwId: true,
},
});
// Eagerly load site data so toJson() includes full site info
await item.fetchSite();
if (!dbRecord) {
throw new GenericError({
message: "Opportunity not found",
name: "OpportunityNotFound",
cause: `No opportunity exists with identifier '${identifier}'`,
status: 404,
});
}
// Compute TTLs from DB state
const subTtl = computeSubResourceCacheTTL({
closedFlag: dbRecord.closedFlag,
closedDate: dbRecord.closedDate,
expectedCloseDate: dbRecord.expectedCloseDate,
lastUpdated: dbRecord.cwLastUpdated,
});
const prodTtl = computeProductsCacheTTL({
closedFlag: dbRecord.closedFlag,
closedDate: dbRecord.closedDate,
expectedCloseDate: dbRecord.expectedCloseDate,
lastUpdated: dbRecord.cwLastUpdated,
statusCwId: dbRecord.statusCwId,
});
// ── Pre-warm sub-resources only on cache miss ───────────────────────
// Check Redis first — if the background refresh has kept the keys warm,
// skip the CW calls entirely. Only fetch-and-cache on a miss.
const cwOppId = dbRecord.cwOpportunityId;
const _pw0 = performance.now();
const _wrapPw = (label: string, p: Promise<any>) =>
p
.then((r) => {
console.log(
`[PERF:prewarm] ${label}: ${(performance.now() - _pw0).toFixed(0)}ms`,
);
return r;
})
.catch(() => {});
const prewarmPromises: Promise<any>[] = [];
if (dbRecord.companyCwId && dbRecord.siteCwId) {
const compId = dbRecord.companyCwId,
siteId = dbRecord.siteCwId;
prewarmPromises.push(
_wrapPw(
"site",
getCachedSite(compId, siteId).then(
(c) => c ?? fetchAndCacheSite(compId, siteId),
),
),
);
}
if (includes.has("notes") && subTtl)
prewarmPromises.push(
_wrapPw(
"notes",
getCachedNotes(cwOppId).then(
(c) => c ?? fetchAndCacheNotes(cwOppId, subTtl),
),
),
);
if (includes.has("contacts") && subTtl)
prewarmPromises.push(
_wrapPw(
"contacts",
getCachedContacts(cwOppId).then(
(c) => c ?? fetchAndCacheContacts(cwOppId, subTtl),
),
),
);
if (includes.has("products") && prodTtl)
prewarmPromises.push(
_wrapPw(
"products",
getCachedProducts(cwOppId).then(
(c) => c ?? fetchAndCacheProducts(cwOppId, prodTtl),
),
),
);
// fetchItem runs its own CW calls (opp, activities, company) —
// these execute concurrently with the sub-resource pre-warming above.
const [item] = await Promise.all([
opportunities.fetchItem(identifier),
...prewarmPromises,
]);
const t1 = performance.now();
console.log(`[PERF] fetchItem + prewarm: ${(t1 - t0).toFixed(0)}ms`);
// Sub-resources now hit warm Redis cache (near-instant)
const _st = performance.now();
const _wrapTimed = (label: string, p: Promise<any>) =>
p.then((r) => {
console.log(
`[PERF:sub] ${label}: ${(performance.now() - _st).toFixed(0)}ms`,
);
return r;
});
const subResourcePromises: Record<string, Promise<any>> = {
_site: _wrapTimed("site", item.fetchSite()),
};
if (includes.has("notes")) {
subResourcePromises.notes = _wrapTimed("notes", item.fetchNotes());
}
if (includes.has("contacts")) {
subResourcePromises.contacts = _wrapTimed(
"contacts",
item.fetchContacts(),
);
}
if (includes.has("products")) {
subResourcePromises.products = _wrapTimed(
"products",
item
.fetchProducts()
.then((products) => products.map((p) => p.toJson())),
);
}
const keys = Object.keys(subResourcePromises);
const results = await Promise.all(keys.map((k) => subResourcePromises[k]));
const t2 = performance.now();
console.log(
`[PERF] sub-resources (${keys.join(",")}): ${(t2 - t1).toFixed(0)}ms`,
);
// Apply toJson after site is hydrated (side-effect from fetchSite)
const gatedData = await processObjectValuePerms(
item.toJson(),
"obj.opportunity",
c.get("user"),
);
const t3 = performance.now();
console.log(`[PERF] processObjectValuePerms: ${(t3 - t2).toFixed(0)}ms`);
// Attach sub-resources (skip the internal _site key)
keys.forEach((k, i) => {
if (k !== "_site") {
(gatedData as any)[k] = results[i];
}
});
const response = apiResponse.successful(
"Opportunity fetched successfully!",
gatedData,
);
console.log(
`[PERF] total handler: ${(performance.now() - t0).toFixed(0)}ms (includes=${includeParam || "none"})`,
);
return c.json(response, response.status as ContentfulStatusCode);
},
authMiddleware({ permissions: ["sales.opportunity.fetch"] }),
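A hedged example of calling the expanded route; the host, identifier, and auth header are placeholders, and the envelope is whatever `apiResponse.successful` produces.

```ts
// Hypothetical client call; identifier 4821 and the bearer token are placeholders.
// Only the sub-resources named in `include` are pre-warmed and attached.
const res = await fetch(
  "http://localhost:3000/v1/sales/opportunities/4821?include=notes,contacts,products",
  { headers: { Authorization: `Bearer ${process.env.API_TOKEN}` } }, // auth scheme assumed
);
const payload = await res.json();
// The gated opportunity object carries `notes`, `contacts`, and `products`
// as extra keys; the internal `_site` promise result is never exposed.
```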
+1 -1
View File
@@ -20,7 +20,7 @@ export default createRoute(
message: "Note ID must be a number",
});
const item = await opportunities.fetchItem(identifier);
const item = await opportunities.fetchRecord(identifier);
const data = await item.fetchNote(noteId);
const response = apiResponse.successful(
+1 -1
View File
@@ -10,7 +10,7 @@ export default createRoute(
["/opportunities/:identifier/notes"],
async (c) => {
const identifier = c.req.param("identifier");
const item = await opportunities.fetchItem(identifier);
const item = await opportunities.fetchRecord(identifier);
const data = await item.fetchNotes();
+1 -1
View File
@@ -10,7 +10,7 @@ export default createRoute(
["/opportunities/:identifier/products"],
async (c) => {
const identifier = c.req.param("identifier");
const item = await opportunities.fetchItem(identifier);
const item = await opportunities.fetchRecord(identifier);
const data = await item.fetchProducts();
+1 -1
View File
@@ -21,7 +21,7 @@ export default createRoute(
const { orderedIds } = schema.parse(body);
const item = await opportunities.fetchItem(identifier);
const item = await opportunities.fetchRecord(identifier);
const updated = await item.resequenceProducts(orderedIds);
const response = apiResponse.successful(
+1 -1
View File
@@ -35,7 +35,7 @@ export default createRoute(
const data = schema.parse(body);
const item = await opportunities.fetchItem(identifier);
const item = await opportunities.fetchRecord(identifier);
const updated = await item.updateNote(noteId, data);
const response = apiResponse.successful(
+1
View File
@@ -57,6 +57,7 @@ v1.route("/permissions", require("./routers/permissionRouter").default);
v1.route("/unifi", require("./routers/unifiRouter").default);
v1.route("/procurement", require("./routers/procurementRouter").default);
v1.route("/sales", require("./routers/salesRouter").default);
v1.route("/cw", require("./routers/cwRouter").default);
app.route("/v1", v1);
export default app;
+4
View File
@@ -6,6 +6,7 @@ import { Server } from "socket.io";
import { Server as Engine } from "@socket.io/bun-engine";
import axios from "axios";
import { UnifiClient } from "./modules/unifi-api/UnifiClient";
import { attachCwApiLogger } from "./modules/cw-utils/cwApiLogger";
import Redis from "ioredis";
const connectionString = `${process.env.DATABASE_URL}`;
@@ -81,8 +82,11 @@ const connectWiseApi = axios.create({
clientId: `${process.env.CW_CLIENT_ID}`,
"Content-Type": "application/json",
},
timeout: 30_000, // 30 s — prevents indefinite hangs on CW API
});
attachCwApiLogger(connectWiseApi);
export { connectWiseApi };
// Unifi API Constants
+9 -6
View File
@@ -50,18 +50,21 @@ export class CompanyController {
const cwCompany = await fetchCwCompanyById(this.cw_CompanyId);
if (!cwCompany) return this;
const contactHref = cwCompany.defaultContact?._info?.contact_href;
const defaultContactData = contactHref
? await connectWiseApi.get(contactHref)
: undefined;
const allContactsData = await connectWiseApi.get(
`${cwCompany._info.contacts_href}&pageSize=1000`,
);
// Derive default contact from allContacts instead of a separate CW call
const defaultContactId = cwCompany.defaultContact?.id;
const defaultContactData = defaultContactId
? ((allContactsData.data as any[]).find(
(c: any) => c.id === defaultContactId,
) ?? null)
: null;
this.cw_Data = {
company: cwCompany,
defaultContact: defaultContactData?.data ?? null,
defaultContact: defaultContactData,
allContacts: allContactsData.data,
};
+187 -30
View File
@@ -15,9 +15,26 @@ import {
CWOpportunity,
CWOpportunityNote,
} from "../modules/cw-utils/opportunities/opportunity.types";
import { resolveMember } from "../modules/cw-utils/members/memberCache";
import {
resolveMember,
resolveMembers,
} from "../modules/cw-utils/members/memberCache";
import { ForecastProductController } from "./ForecastProductController";
import GenericError from "../Errors/GenericError";
import { computeSubResourceCacheTTL } from "../modules/algorithms/computeSubResourceCacheTTL";
import { computeProductsCacheTTL } from "../modules/algorithms/computeProductsCacheTTL";
import {
getCachedNotes,
getCachedContacts,
getCachedProducts,
getCachedSite,
fetchAndCacheNotes,
fetchAndCacheContacts,
fetchAndCacheProducts,
fetchAndCacheSite,
invalidateNotesCache,
invalidateProductsCache,
} from "../modules/cache/opportunityCache";
/**
* Opportunity Controller
@@ -91,6 +108,27 @@ export class OpportunityController {
private _customFields: CWCustomField[] | null = null;
private _activities: ActivityController[] | null = null;
/** Compute the sub-resource cache TTL from this opportunity's fields. */
private _subResourceTTL(): number | null {
return computeSubResourceCacheTTL({
closedFlag: this.closedFlag,
closedDate: this.closedDate,
expectedCloseDate: this.expectedCloseDate,
lastUpdated: this.cwLastUpdated,
});
}
/** Compute the products-specific cache TTL from this opportunity's fields. */
private _productsTTL(): number | null {
return computeProductsCacheTTL({
closedFlag: this.closedFlag,
closedDate: this.closedDate,
expectedCloseDate: this.expectedCloseDate,
lastUpdated: this.cwLastUpdated,
statusCwId: this.statusCwId,
});
}
constructor(
data: Opportunity & { company?: Company | null },
opts?: {
@@ -288,6 +326,7 @@ export class OpportunityController {
*
* Fetches the full site details (address, phone, flags) from ConnectWise
* for the site associated with this opportunity.
* Checks the Redis cache first (30-min TTL); on miss, calls CW and caches.
* Requires both companyCwId and siteCwId to be set.
*
* @returns Serialized site object or null
@@ -296,7 +335,17 @@ export class OpportunityController {
if (this._siteData) return this._siteData;
if (!this.companyCwId || !this.siteCwId) return null;
const cwSite = await fetchCompanySite(this.companyCwId, this.siteCwId);
// Try cache first
const cached = await getCachedSite(this.companyCwId, this.siteCwId);
if (cached) {
this._siteData = serializeCwSite(cached);
return this._siteData;
}
// Cache miss — fetch from CW and cache
const cwSite = await fetchAndCacheSite(this.companyCwId, this.siteCwId);
if (!cwSite) return null;
this._siteData = serializeCwSite(cwSite);
return this._siteData;
}
@@ -304,13 +353,37 @@ export class OpportunityController {
/**
* Fetch Contacts
*
* Fetches contacts associated with this opportunity from ConnectWise
* and returns a serialized array.
* Fetches contacts associated with this opportunity. Checks the Redis
* cache first; on miss, calls ConnectWise and caches the raw response.
*
* @param opts.fresh - Bypass cache and fetch directly from CW.
*/
public async fetchContacts() {
const contacts = await opportunityCw.fetchContacts(this.cwOpportunityId);
public async fetchContacts(opts?: { fresh?: boolean }) {
const ttl = this._subResourceTTL();
return contacts.map((ct) => ({
// Try cache first (unless forced fresh)
if (!opts?.fresh && ttl !== null) {
const cached = await getCachedContacts(this.cwOpportunityId);
if (cached) return this._serializeContacts(cached);
}
// Fetch from CW (fetchAndCache* handles 404 internally)
try {
const contacts =
ttl !== null
? await fetchAndCacheContacts(this.cwOpportunityId, ttl)
: await opportunityCw.fetchContacts(this.cwOpportunityId);
return this._serializeContacts(contacts);
} catch (err: any) {
if (err?.isAxiosError && err?.response?.status === 404) return [];
throw err;
}
}
/** Serialize raw CW contact data into the API response shape. */
private _serializeContacts(contacts: any[]) {
return contacts.map((ct: any) => ({
id: ct.id,
contact: ct.contact ? { id: ct.contact.id, name: ct.contact.name } : null,
company: ct.company
@@ -329,24 +402,55 @@ export class OpportunityController {
/**
* Fetch Notes
*
* Fetches notes associated with this opportunity from ConnectWise
* and returns a serialized array.
* Fetches notes associated with this opportunity. Checks the Redis
* cache first; on miss, calls ConnectWise and caches the raw response.
*
* @param opts.fresh - Bypass cache and fetch directly from CW.
*/
public async fetchNotes() {
const notes = await opportunityCw.fetchNotes(this.cwOpportunityId);
public async fetchNotes(opts?: { fresh?: boolean }) {
const ttl = this._subResourceTTL();
return Promise.all(
notes.map(async (n) => ({
// Try cache first (unless forced fresh)
if (!opts?.fresh && ttl !== null) {
const cached = await getCachedNotes(this.cwOpportunityId);
if (cached) return this._serializeNotes(cached);
}
// Fetch from CW (fetchAndCache* handles 404 internally)
try {
const notes =
ttl !== null
? await fetchAndCacheNotes(this.cwOpportunityId, ttl)
: await opportunityCw.fetchNotes(this.cwOpportunityId);
return this._serializeNotes(notes);
} catch (err: any) {
if (err?.isAxiosError && err?.response?.status === 404) return [];
throw err;
}
}
/** Serialize raw CW note data into the API response shape. */
private async _serializeNotes(notes: any[]) {
// Batch-resolve all member identifiers in a single DB query
const identifiers = notes
.map((n: any) => n.enteredBy as string)
.filter(Boolean);
const memberMap = await resolveMembers(identifiers);
return notes.map((n: any) => ({
id: n.id,
text: n.text,
type: n.type ? { id: n.type.id, name: n.type.name } : null,
flagged: n.flagged,
dateEntered: n._info?.lastUpdated
? new Date(n._info.lastUpdated)
: null,
enteredBy: await resolveMember(n.enteredBy),
})),
);
dateEntered: n._info?.lastUpdated ? new Date(n._info.lastUpdated) : null,
enteredBy: memberMap.get(n.enteredBy) ?? {
id: null,
identifier: n.enteredBy,
name: n.enteredBy,
cwMemberId: null,
},
}));
}
/**
@@ -388,15 +492,58 @@ export class OpportunityController {
/**
* Fetch Products
*
* Fetches products (forecast/revenue items) for this opportunity from
* ConnectWise and returns ForecastProductController instances.
* Fetches products (forecast/revenue items) for this opportunity.
* Checks the Redis cache first; on miss, calls ConnectWise and
* caches the raw response using the products-specific TTL algorithm.
*
* @param opts.fresh - Bypass cache and fetch directly from CW.
*/
public async fetchProducts(): Promise<ForecastProductController[]> {
const [forecast, procProducts] = await Promise.all([
public async fetchProducts(opts?: {
fresh?: boolean;
}): Promise<ForecastProductController[]> {
const ttl = this._productsTTL();
let forecast: any;
let procProducts: any[];
// Try cache first (unless forced fresh)
if (!opts?.fresh && ttl !== null) {
const cached = await getCachedProducts(this.cwOpportunityId);
if (cached) {
forecast = cached.forecast;
procProducts = cached.procProducts;
} else {
// Cache miss — fetch from CW and cache
const blob = await fetchAndCacheProducts(this.cwOpportunityId, ttl);
forecast = blob.forecast;
procProducts = blob.procProducts;
}
} else {
// No caching (won/lost/pending or forced fresh) — fetch directly
try {
[forecast, procProducts] = await Promise.all([
opportunityCw.fetchProducts(this.cwOpportunityId),
opportunityCw.fetchProcurementProducts(this.cwOpportunityId),
]);
} catch (err: any) {
if (err?.isAxiosError && err?.response?.status === 404) return [];
throw err;
}
}
return this._buildProductControllers(forecast, procProducts);
}
/**
* Build ForecastProductController[] from raw CW data.
*
* Extracted from fetchProducts() so both cached and fresh paths
* share the same ordering + enrichment logic.
*/
private async _buildProductControllers(
forecast: any,
procProducts: any[],
): Promise<ForecastProductController[]> {
// Build a map of forecastDetailId → procurement product cancellation data
const cancellationMap = new Map<number, Record<string, unknown>>();
for (const pp of procProducts) {
@@ -412,30 +559,32 @@ export class OpportunityController {
let ordered: typeof forecastItems;
if (this.productSequence.length > 0) {
const itemById = new Map(forecastItems.map((fi) => [fi.id, fi]));
const itemById = new Map(forecastItems.map((fi: any) => [fi.id, fi]));
// Items in the specified order first, then any new items not yet sequenced
const sequenced = this.productSequence
.map((id) => itemById.get(id))
.filter((fi): fi is NonNullable<typeof fi> => fi !== undefined);
.filter((fi: any): fi is NonNullable<typeof fi> => fi !== undefined);
const sequencedIds = new Set(this.productSequence);
const unsequenced = forecastItems
.filter((fi) => !sequencedIds.has(fi.id))
.sort((a, b) => a.sequenceNumber - b.sequenceNumber);
.filter((fi: any) => !sequencedIds.has(fi.id))
.sort((a: any, b: any) => a.sequenceNumber - b.sequenceNumber);
ordered = [...sequenced, ...unsequenced];
} else {
ordered = [...forecastItems].sort(
(a, b) => a.sequenceNumber - b.sequenceNumber,
(a: any, b: any) => a.sequenceNumber - b.sequenceNumber,
);
}
const controllers = ordered.map((item) => {
const controllers: ForecastProductController[] = ordered.map(
(item: any) => {
const ctrl = new ForecastProductController(item);
const procData = cancellationMap.get(item.id);
if (procData) {
ctrl.applyCancellationData(procData as any);
}
return ctrl;
});
},
);
// Enrich with internal inventory data from local CatalogItem DB
const catalogCwIds = controllers
@@ -559,6 +708,7 @@ export class OpportunityController {
forecastItemId,
data,
);
await invalidateProductsCache(this.cwOpportunityId);
return new ForecastProductController(updated);
} catch (err: any) {
console.error(
@@ -613,6 +763,9 @@ export class OpportunityController {
});
this.productSequence = orderedIds;
// Invalidate cached products since ordering changed
await invalidateProductsCache(this.cwOpportunityId);
// Return items in the new order
return this.fetchProducts();
}
@@ -635,6 +788,7 @@ export class OpportunityController {
this.cwOpportunityId,
data,
);
await invalidateProductsCache(this.cwOpportunityId);
return created.map((item) => new ForecastProductController(item));
} catch (err: any) {
console.error(
@@ -680,6 +834,7 @@ export class OpportunityController {
text: note,
flagged: opts?.flagged ?? false,
});
await invalidateNotesCache(this.cwOpportunityId);
return created;
}
@@ -700,6 +855,7 @@ export class OpportunityController {
noteId,
data,
);
await invalidateNotesCache(this.cwOpportunityId);
return updated;
}
@@ -712,6 +868,7 @@ export class OpportunityController {
*/
public async deleteNote(noteId: number): Promise<void> {
await opportunityCw.deleteNote(this.cwOpportunityId, noteId);
await invalidateNotesCache(this.cwOpportunityId);
}
/**
+17 -1
View File
@@ -33,6 +33,9 @@ export class RoleController {
private _permissionsToken: string;
private _users: (User & { roles: Role[] })[];
/** Cached result of JWT verification — avoids repeated RSA verify calls. */
private _cachedVerifiedPermissions: { permissions: string[] } | null = null;
public readonly createdAt: Date;
public updatedAt: Date;
@@ -62,6 +65,14 @@ export class RoleController {
* @returns - Verified object with permissions in it.
*/
private _verifyPermissions(permissionsToken: string) {
// Return cached result if the token hasn't changed
if (
this._cachedVerifiedPermissions &&
permissionsToken === this._permissionsToken
) {
return this._cachedVerifiedPermissions;
}
let perms: DecodedPermissionsBlock;
try {
perms = jwt.verify(permissionsToken, permissionsPrivateKey, {
@@ -82,7 +93,12 @@ export class RoleController {
);
}
return perms as { permissions: string[] };
const result = perms as { permissions: string[] };
// Cache only if verifying the current token
if (permissionsToken === this._permissionsToken) {
this._cachedVerifiedPermissions = result;
}
return result;
}
/**
+32 -25
View File
@@ -24,6 +24,13 @@ export default class UserController {
private _roles: Collection<string, Role>;
private _permissions: string | null;
/** Cached result of fetchRoles() — populated on first hasPermission call. */
private _resolvedRoleControllers: Collection<string, RoleController> | null =
null;
/** Per-permission result cache — avoids repeated JWT verification + DB lookups. */
private _permissionCache: Map<string, boolean> = new Map();
public createdAt: Date;
public updatedAt: Date;
constructor(userdata: User & { roles: Role[] }) {
@@ -127,6 +134,7 @@ export default class UserController {
this._updateInternalValues(updatedUser);
this._roles = new Collection<string, Role>();
updatedUser.roles.map((v: any) => this._roles.set(v.id, v));
this.clearPermissionCache();
for (const role of resolvedRoles) {
events.emit("user:role:assigned", { user: this, role });
@@ -252,35 +260,34 @@ export default class UserController {
* @returns {boolean} Does this user have the specified permission
*/
public async hasPermission(permission: string) {
let resources = await prisma.user.findFirst({
where: { id: this.id },
select: {
sessions: {
select: { id: true },
},
},
});
// Fast path: return cached result if we already resolved this permission
const cached = this._permissionCache.get(permission);
if (cached !== undefined) return cached;
const resourceKeys: string[] = Object.keys(resources ?? {}) as string[];
// Resolve role controllers once and cache them for the lifetime of this
// controller instance (i.e. the current request).
if (!this._resolvedRoleControllers) {
this._resolvedRoleControllers = await this.fetchRoles();
}
const implicitPermissions = resources
? resourceKeys
// @ts-ignore
.filter((v) => resources[v].length > 0)
.map(
(v) =>
//@ts-ignore
`resource.${v}.[${(resources![v] as { id: string }[])
.map((o) => o.id)
.join(",")}].user.${this.id}.implicit`,
)
: [];
const result = this._resolvedRoleControllers
.map((v) => v.checkPermission(permission))
.includes(true);
let checks = [
(await this.fetchRoles()).map((v) => v.checkPermission(permission)),
].flatMap((v) => v);
this._permissionCache.set(permission, result);
return result;
}
return checks.includes(true);
/**
* Clear Permission Cache
*
* Invalidates the in-memory permission cache so the next
* `hasPermission` call re-fetches roles from the database.
* Call this after role or permission mutations on the user.
*/
public clearPermissionCache() {
this._resolvedRoleControllers = null;
this._permissionCache.clear();
}
/**
+54 -13
View File
@@ -12,7 +12,9 @@ import { unifiSites } from "./managers/unifiSites";
import { refreshCompanies } from "./modules/cw-utils/refreshCompanies";
import { refreshCatalog } from "./modules/cw-utils/procurement/refreshCatalog";
import { refreshInventory } from "./modules/cw-utils/procurement/refreshInventory";
import { listenInventoryAdjustments } from "./modules/cw-utils/procurement/listenInventoryAdjustments";
import { refreshOpportunities } from "./modules/cw-utils/opportunities/refreshOpportunities";
import { refreshOpportunityCache } from "./modules/cache/opportunityCache";
import { refreshCwIdentifiers } from "./modules/cw-utils/members/refreshCwIdentifiers";
import { userDefinedFieldsCw } from "./modules/cw-utils/userDefinedFields";
import { events, setupEventDebugger } from "./modules/globalEvents";
@@ -23,6 +25,16 @@ import cuid from "cuid";
// Setup global event debugger in non-production environments
if (Bun.env.NODE_ENV == "development") setupEventDebugger();
/** Concise error message for interval logs — avoids dumping full Axios error objects. */
const briefErr = (err: any): string => {
if (err?.isAxiosError) {
const method = (err.config?.method ?? "?").toUpperCase();
const url = err.config?.url ?? "?";
return `${method} ${url}: ${err.code ?? `HTTP ${err.response?.status}`}`;
}
return err?.message ?? String(err);
};
// Helper to run a startup sync safely — failures are logged but never crash the process.
const safeStartup = async (label: string, fn: () => Promise<void>) => {
try {
@@ -41,6 +53,7 @@ const safeStartup = async (label: string, fn: () => Promise<void>) => {
// ---------------------------------------------------------------------------
Bun.serve({
port: PORT,
idleTimeout: 255,
websocket: engine.handler().websocket,
fetch: (req, server) => {
const url = new URL(req.url);
@@ -89,44 +102,70 @@ await safeStartup("ensureAdminRole", async () => {
await safeStartup("refreshCompanies", refreshCompanies);
setInterval(() => {
return refreshCompanies().catch((err) =>
console.error("[interval] refreshCompanies failed", err),
console.error(`[interval] refreshCompanies failed: ${briefErr(err)}`),
);
}, 60 * 1000);
// Refresh the internal catalog every minute
// Refresh the internal catalog every 30 minutes
await safeStartup("refreshCatalog", refreshCatalog);
setInterval(() => {
setInterval(
() => {
return refreshCatalog().catch((err) =>
console.error("[interval] refreshCatalog failed", err),
console.error(`[interval] refreshCatalog failed: ${briefErr(err)}`),
);
},
30 * 60 * 1000,
);
}, 60 * 1000);
// Refresh inventory on hand every 2 minutes
await safeStartup("refreshInventory", refreshInventory);
// Fallback full inventory sweep every 6 hours (listener handles real-time deltas)
setInterval(
() => {
return refreshInventory().catch((err) =>
console.error("[interval] refreshInventory failed", err),
console.error(`[interval] refreshInventory failed: ${briefErr(err)}`),
);
},
2 * 60 * 1000,
6 * 60 * 60 * 1000,
);
// Listen for procurement adjustment changes and sync changed products to DB + cache
await safeStartup("listenInventoryAdjustments", listenInventoryAdjustments);
setInterval(() => {
return listenInventoryAdjustments().catch((err) =>
console.error(
`[interval] listenInventoryAdjustments failed: ${briefErr(err)}`,
),
);
}, 60 * 1000);
// Refresh opportunities every minute
await safeStartup("refreshOpportunities", refreshOpportunities);
setInterval(() => {
return refreshOpportunities().catch((err) =>
console.error("[interval] refreshOpportunities failed", err),
console.error(`[interval] refreshOpportunities failed: ${briefErr(err)}`),
);
}, 60 * 1000);
// Refresh opportunity CW cache every 20 minutes (activities + company hydration)
// NOTE: Do NOT await — register the interval immediately so the cache refresh
// is never blocked by a slow/stuck startup task above.
safeStartup("refreshOpportunityCache", refreshOpportunityCache);
setInterval(() => {
return refreshOpportunityCache().catch((err) => {
console.error(
`[interval] refreshOpportunityCache failed: ${briefErr(err)}`,
);
});
}, 20 * 60 * 1000);
// Refresh User Defined Fields every 5 minutes
await safeStartup("refreshUDFs", () => userDefinedFieldsCw.refresh());
setInterval(
() => {
return userDefinedFieldsCw
.refresh()
.catch((err) => console.error("[interval] refreshUDFs failed", err));
.catch((err) =>
console.error(`[interval] refreshUDFs failed: ${briefErr(err)}`),
);
},
5 * 60 * 1000,
);
@@ -136,7 +175,7 @@ await safeStartup("refreshCwIdentifiers", refreshCwIdentifiers);
setInterval(
() => {
return refreshCwIdentifiers().catch((err) =>
console.error("[interval] refreshCwIdentifiers failed", err),
console.error(`[interval] refreshCwIdentifiers failed: ${briefErr(err)}`),
);
},
30 * 60 * 1000,
@@ -146,5 +185,7 @@ await safeStartup("syncSites", () => unifiSites.syncSites());
setInterval(() => {
return unifiSites
.syncSites()
.catch((err) => console.error("[interval] syncSites failed", err));
.catch((err) =>
console.error(`[interval] syncSites failed: ${briefErr(err)}`),
);
}, 60 * 1000);
+282 -36
View File
@@ -6,49 +6,218 @@ import { OpportunityController } from "../controllers/OpportunityController";
import GenericError from "../Errors/GenericError";
import { activityCw } from "../modules/cw-utils/activities/activities";
import { opportunityCw } from "../modules/cw-utils/opportunities/opportunities";
import { computeCacheTTL } from "../modules/algorithms/computeCacheTTL";
import {
getCachedActivities,
getCachedCompanyCwData,
getCachedOppCwData,
fetchAndCacheActivities,
fetchAndCacheCompanyCwData,
fetchAndCacheOppCwData,
} from "../modules/cache/opportunityCache";
// ---------------------------------------------------------------------------
// Data-source hierarchy helpers
// ---------------------------------------------------------------------------
/**
* Build a CompanyController with hydrated CW data from a Prisma Company record.
*
* Data-source hierarchy (controlled by `strategy`):
*
* - `"cache-only"` — Redis cache → bare DB record (no CW call).
* Ideal for list views where latency matters and the background
* refresh job is responsible for keeping the cache warm.
*
* - `"cache-then-cw"` (default) — Redis cache → CW API → cache result.
* On a cold cache, calls CW to ensure the caller gets full data.
*
* - `"cw-first"` — CW API (always) → cache result.
* Forces a fresh fetch regardless of cache state.
*/
async function buildCompanyController(
company: Company,
opts?: {
strategy?: "cache-only" | "cache-then-cw" | "cw-first";
ttlMs?: number;
},
): Promise<CompanyController> {
const _ct0 = performance.now();
const strategy = opts?.strategy ?? "cache-then-cw";
const ctrl = new CompanyController(company);
// ── cw-first: always fetch from CW (and cache the result) ──────────
if (strategy === "cw-first") {
const blob = opts?.ttlMs
? await fetchAndCacheCompanyCwData(
company.cw_CompanyId,
opts.ttlMs,
).catch(() => null)
: null;
if (blob) {
ctrl.cw_Data = blob;
} else {
await ctrl.hydrateCwData();
}
return ctrl;
}
// ── cache-only / cache-then-cw: try Redis first ─────────────────────
const cached = await getCachedCompanyCwData(company.cw_CompanyId);
if (cached) {
ctrl.cw_Data = cached;
return ctrl;
}
// cache-only stops here — return the bare DB-backed controller
if (strategy === "cache-only") return ctrl;
// cache-then-cw: cache miss — fetch from CW once and cache in one pass
if (opts?.ttlMs) {
const blob = await fetchAndCacheCompanyCwData(
company.cw_CompanyId,
opts.ttlMs,
).catch(() => null);
if (blob) ctrl.cw_Data = blob;
} else {
await ctrl.hydrateCwData();
}
console.log(
`[PERF:buildCompany] ${(performance.now() - _ct0).toFixed(0)}ms (strategy=${strategy}, hit=miss)`,
);
return ctrl;
}
/**
* Fetch ActivityController[] for an opportunity from ConnectWise.
* Fetch ActivityController[] for an opportunity.
*
* Same three strategies as {@link buildCompanyController}:
*
* - `"cache-only"` — Redis → empty array (no CW call).
* - `"cache-then-cw"` (default) — Redis → CW API → cache result.
* - `"cw-first"` — CW API (always) → cache result.
*/
async function buildActivities(
cwOpportunityId: number,
opts?: {
strategy?: "cache-only" | "cache-then-cw" | "cw-first";
ttlMs?: number;
},
): Promise<ActivityController[]> {
const collection = await activityCw.fetchByOpportunity(cwOpportunityId);
return collection.map((item) => new ActivityController(item));
const _at0 = performance.now();
const strategy = opts?.strategy ?? "cache-then-cw";
// ── cw-first: always fetch from CW (and cache the result) ──────────
if (strategy === "cw-first") {
const arr = opts?.ttlMs
? await fetchAndCacheActivities(cwOpportunityId, opts.ttlMs)
: await activityCw.fetchByOpportunityDirect(cwOpportunityId);
return arr.map((item) => new ActivityController(item));
}
// ── cache-only / cache-then-cw: try Redis first ─────────────────────
const cached = await getCachedActivities(cwOpportunityId);
if (cached) {
return cached.map((item) => new ActivityController(item));
}
// cache-only stops here — return empty (background job will fill it)
if (strategy === "cache-only") return [];
// cache-then-cw: cache miss — fetch once and cache in one pass
const arr = opts?.ttlMs
? await fetchAndCacheActivities(cwOpportunityId, opts.ttlMs)
: await activityCw.fetchByOpportunityDirect(cwOpportunityId);
console.log(
`[PERF:buildActivities] ${(performance.now() - _at0).toFixed(0)}ms (strategy=${strategy}, hit=miss, count=${arr.length})`,
);
return arr.map((item) => new ActivityController(item));
}
export const opportunities = {
/**
* Fetch Record (lightweight)
*
* Returns an OpportunityController backed only by the **database record**.
* No ConnectWise API calls, no Redis lookups, no activity/company hydration.
*
* Use this when you only need the controller instance to call a sub-resource
* method (e.g. `fetchNotes()`, `fetchContacts()`, `fetchProducts()`,
* `fetchSite()`).
*
* @param identifier - Internal ID (string) or CW opportunity ID (number)
* @returns {Promise<OpportunityController>}
*/
async fetchRecord(
identifier: string | number,
): Promise<OpportunityController> {
const isNumeric =
typeof identifier === "number" || /^\d+$/.test(String(identifier));
const record = await prisma.opportunity.findFirst({
where: isNumeric
? { cwOpportunityId: Number(identifier) }
: { id: identifier as string },
include: { company: true },
});
if (!record) {
throw new GenericError({
message: "Opportunity not found",
name: "OpportunityNotFound",
cause: `No opportunity exists with identifier '${identifier}'`,
status: 404,
});
}
return new OpportunityController(record, {
company: record.company
? new CompanyController(record.company)
: undefined,
});
},
/**
* Fetch Opportunity
*
* Fetch an opportunity by its internal ID or ConnectWise opportunity ID
* and return an OpportunityController instance.
*
* **Data-source strategy:**
* - `fresh: true` → `"cw-first"` — always fetches from CW, updates DB, caches result.
* - `fresh: false` (default) → `"cache-then-cw"` — tries Redis cache for the
* CW opportunity response first, falls back to CW on miss.
*
* The CW opportunity response is cached in Redis with the same TTL as
* activities/company. The background refresh keeps this warm so most
* detail-view loads skip the CW roundtrip entirely.
*
* @param identifier - The internal ID (string) or CW opportunity ID (number)
* @param opts - Optional flags
* @param opts.fresh - When `true`, bypass the cache and pull directly from CW.
* @returns {Promise<OpportunityController>}
*/
async fetchItem(identifier: string | number): Promise<OpportunityController> {
async fetchItem(
identifier: string | number,
opts?: { fresh?: boolean },
): Promise<OpportunityController> {
const _t0 = performance.now();
const strategy: "cache-only" | "cache-then-cw" | "cw-first" = opts?.fresh
? "cw-first"
: "cache-then-cw";
const isNumeric =
typeof identifier === "number" || /^\d+$/.test(String(identifier));
// Look up the existing DB record to get the cwOpportunityId
// Look up the existing DB record (full, with company)
const existing = await prisma.opportunity.findFirst({
where: isNumeric
? { cwOpportunityId: Number(identifier) }
: { id: identifier as string },
select: { id: true, cwOpportunityId: true },
include: { company: true },
});
const _t1 = performance.now();
console.log(`[PERF:fetchItem] DB lookup: ${(_t1 - _t0).toFixed(0)}ms`);
if (!existing) {
throw new GenericError({
@@ -59,13 +228,55 @@ export const opportunities = {
});
}
// Fetch fresh data from ConnectWise
const cwData = await opportunityCw.fetch(existing.cwOpportunityId);
// Compute TTL from the current DB state (used for cache and hydration)
const ttlMs =
computeCacheTTL({
closedFlag: existing.closedFlag,
closedDate: existing.closedDate,
expectedCloseDate: existing.expectedCloseDate,
lastUpdated: existing.cwLastUpdated,
}) ?? undefined;
// Map and update the DB record
// ── Resolve CW opportunity data (cache-aware) ──────────────────────
let cwData: any;
let record = existing; // default: use the existing DB record as-is
if (!opts?.fresh) {
// Try the Redis cache first
cwData = await getCachedOppCwData(existing.cwOpportunityId);
}
const _t2 = performance.now();
console.log(
`[PERF:fetchItem] Redis cache check: ${(_t2 - _t1).toFixed(0)}ms (hit=${!!cwData})`,
);
// ── Parallel block: CW opp fetch + activities + company ────────────
// Activities and company hydration only need existing.cwOpportunityId
// and existing.company — both available from the initial DB lookup —
// so they can run concurrently with the CW opp fetch + DB update.
const cwOppPromise = (async () => {
if (cwData) return; // cache hit — nothing to do
cwData = ttlMs
? await fetchAndCacheOppCwData(existing.cwOpportunityId, ttlMs)
: await opportunityCw.fetch(existing.cwOpportunityId);
const _t2b = performance.now();
console.log(
`[PERF:fetchItem] CW opp fetch: ${(_t2b - _t2).toFixed(0)}ms`,
);
if (!cwData) {
throw new GenericError({
message: "Opportunity not found in ConnectWise",
name: "OpportunityNotFound",
cause: `CW returned 404 for opportunity ${existing.cwOpportunityId}`,
status: 404,
});
}
// Map and update the DB record (only on cache miss/fresh)
const mapped = OpportunityController.mapCwToDb(cwData);
// Resolve internal company link
const companyId = cwData.company?.id
? ((
await prisma.company.findFirst({
@@ -75,19 +286,36 @@ export const opportunities = {
)?.id ?? null)
: null;
const updated = await prisma.opportunity.update({
record = await prisma.opportunity.update({
where: { id: existing.id },
data: { ...mapped, companyId },
include: { company: true },
});
console.log(
`[PERF:fetchItem] DB update: ${(performance.now() - _t2b).toFixed(0)}ms`,
);
})();
const activities = await buildActivities(updated.cwOpportunityId);
const _t3 = performance.now();
// Hydrate activities and company in parallel with CW opp fetch
const [, activities, company] = await Promise.all([
cwOppPromise,
buildActivities(existing.cwOpportunityId, { strategy, ttlMs }),
existing.company
? buildCompanyController(existing.company, { strategy, ttlMs })
: Promise.resolve(undefined),
]);
const _t4 = performance.now();
console.log(
`[PERF:fetchItem] parallel block (cw+activities+company): ${(_t4 - _t3).toFixed(0)}ms`,
);
console.log(
`[PERF:fetchItem] TOTAL: ${(_t4 - _t0).toFixed(0)}ms (strategy=${strategy}, ttl=${ttlMs}ms)`,
);
return new OpportunityController(updated, {
company: updated.company
? await buildCompanyController(updated.company)
: undefined,
customFields: cwData.customFields ?? [],
return new OpportunityController(record, {
company,
customFields: cwData?.customFields ?? [],
activities,
});
},
@@ -95,6 +323,11 @@ export const opportunities = {
/**
* Fetch All Opportunities (Paginated)
*
* Uses the **cache-only** strategy: Redis → bare DB data.
* Activities and company hydration come from the Redis cache if
* available; otherwise the controller is returned with DB-only data.
* The background refresh job is responsible for keeping Redis warm.
*
* @param page - Page number (1-based)
* @param rpp - Records per page
* @param opts - Optional filters
@@ -116,15 +349,18 @@ export const opportunities = {
});
return Promise.all(
items.map(
async (item) =>
new OpportunityController(item, {
items.map(async (item) => {
return new OpportunityController(item, {
company: item.company
? await buildCompanyController(item.company)
? await buildCompanyController(item.company, {
strategy: "cache-only",
})
: undefined,
activities: await buildActivities(item.cwOpportunityId),
activities: await buildActivities(item.cwOpportunityId, {
strategy: "cache-only",
}),
});
}),
),
);
},
@@ -134,6 +370,8 @@ export const opportunities = {
* Search opportunities by name, company name, contact name, notes,
* sales rep, or status with pagination support.
*
* Uses the **cache-only** strategy (same as `fetchPages`).
*
* @param query - Search query string
* @param page - Page number (1-based)
* @param rpp - Records per page
@@ -174,15 +412,18 @@ export const opportunities = {
});
return Promise.all(
items.map(
async (item) =>
new OpportunityController(item, {
items.map(async (item) => {
return new OpportunityController(item, {
company: item.company
? await buildCompanyController(item.company)
? await buildCompanyController(item.company, {
strategy: "cache-only",
})
: undefined,
activities: await buildActivities(item.cwOpportunityId),
activities: await buildActivities(item.cwOpportunityId, {
strategy: "cache-only",
}),
});
}),
),
);
},
@@ -240,6 +481,8 @@ export const opportunities = {
*
* Fetch all opportunities for a company by its internal company ID.
*
* Uses the **cache-only** strategy (same as `fetchPages`).
*
* @param companyId - The internal company ID
* @param opts - Optional filters
* @returns {Promise<OpportunityController[]>}
@@ -258,15 +501,18 @@ export const opportunities = {
});
return Promise.all(
items.map(
async (item) =>
new OpportunityController(item, {
items.map(async (item) => {
return new OpportunityController(item, {
company: item.company
? await buildCompanyController(item.company)
? await buildCompanyController(item.company, {
strategy: "cache-only",
})
: undefined,
activities: await buildActivities(item.cwOpportunityId),
activities: await buildActivities(item.cwOpportunityId, {
strategy: "cache-only",
}),
});
}),
),
);
},
};
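Taken together, a sketch of how the entry points are meant to be combined; the import path and identifier are assumptions for illustration.

```ts
import { opportunities } from "../managers/opportunityManager"; // path assumed

// List views (cache-only): Redis/DB only, never calls ConnectWise.
const page = await opportunities.fetchPages(1, 25);

// Detail view: cache-then-cw by default; `fresh: true` forces a CW round trip.
const detail = await opportunities.fetchItem("4821");
const fresh = await opportunities.fetchItem("4821", { fresh: true });

// Sub-resource routes: fetchRecord loads only the DB record, then the
// controller's fetchNotes()/fetchContacts()/fetchProducts() handle caching.
const record = await opportunities.fetchRecord("4821");
const notes = await record.fetchNotes();
```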
+168
View File
@@ -0,0 +1,168 @@
/**
* @module computeCacheTTL
*
* Adaptive Cache TTL Algorithm
* ============================
*
* Determines how long a cached record should live before it must be
* re-fetched from the upstream source (e.g. ConnectWise API).
*
* The algorithm prioritises freshness for records that are actively
* being worked on, while avoiding unnecessary API calls for stale or
* inactive data.
*
* ## Spec
*
* | # | Condition | TTL (ms) | TTL (human) | Rationale |
* |---|------------------------------------------------------------------|----------|-------------|--------------------------------------------------------------------|
 * | 1 | `closedFlag` is `true` AND closed more than **30 days** ago       | `null`   | Do not cache| Long-closed records are rarely accessed; caching wastes memory.    |
 * | 1b| `closedFlag` is `true` AND closed within the last **30 days**     | 900 000  | 15 minutes  | Recently closed records may still be viewed occasionally.          |
* | 2 | `expectedCloseDate` OR `lastUpdated` is within the last **5 days**| 60 000 | 60 seconds | High-activity window — data changes frequently and must stay fresh.|
* | 3 | `expectedCloseDate` OR `lastUpdated` is within the last **14 days**| 90 000 | 90 seconds | Moderate activity — still relevant, but changes less often. |
* | 4 | Everything else (older than 14 days) | 900 000 | 15 minutes | Low activity — safe to serve from cache for longer. |
*
* ## Evaluation order
*
* Rules are evaluated **top-to-bottom**; the first matching rule wins.
* Rule 2 (5-day window) is a subset of Rule 3 (14-day window), so it
* must be checked first.
*
* ## Inputs
*
* | Field | Type | Description |
* |--------------------|------------------|--------------------------------------------------------------------|
 * | `closedFlag`       | `boolean`        | Whether the record is closed / inactive.                             |
 * | `closedDate`       | `Date \| null`   | When the record was closed; recently closed records (within 30 days) stay cacheable. |
* | `expectedCloseDate`| `Date \| null` | The projected close date (future-looking relevance signal). |
* | `lastUpdated` | `Date \| null` | The last time the upstream record was modified (backward-looking). |
* | `now` | `Date` (optional)| Override for the current timestamp; defaults to `new Date()`. |
*
* ## Output
*
* Returns `number | null`:
* - A positive integer representing the TTL in **milliseconds**, or
* - `null` when the record should **not** be cached at all.
*
* ## Usage
*
* ```ts
* import { computeCacheTTL } from "../modules/algorithms/computeCacheTTL";
*
 * const ttl = computeCacheTTL({
 *   closedFlag: opportunity.closedFlag,
 *   closedDate: opportunity.closedDate,
 *   expectedCloseDate: opportunity.expectedCloseDate,
 *   lastUpdated: opportunity.cwLastUpdated,
 * });
*
* if (ttl !== null) {
* await redis.set(key, serialised, "PX", ttl);
* }
* ```
*/
// ---------------------------------------------------------------------------
// Constants
// ---------------------------------------------------------------------------
/** 60 seconds TTL for high-activity records (within 5 days).
* Must exceed the 30-second background refresh interval so the cache
* stays warm between cycles. */
export const TTL_HIGH_ACTIVITY = 60_000;
/** 90 seconds TTL for moderate-activity records (within 14 days). */
export const TTL_MODERATE_ACTIVITY = 90_000;
/** 15 minutes TTL for low-activity / stale records. */
export const TTL_LOW_ACTIVITY = 900_000;
/** 30 days in milliseconds. */
const THIRTY_DAYS_MS = 30 * 24 * 60 * 60 * 1000;
/** 5 days in milliseconds. */
const FIVE_DAYS_MS = 5 * 24 * 60 * 60 * 1000;
/** 14 days in milliseconds. */
const FOURTEEN_DAYS_MS = 14 * 24 * 60 * 60 * 1000;
// ---------------------------------------------------------------------------
// Input type
// ---------------------------------------------------------------------------
export interface CacheTTLInput {
/** Whether the record is closed / inactive. */
closedFlag: boolean;
/** When the record was closed — used for recently-closed caching (within 30 days). */
closedDate: Date | null;
/** The projected close date — serves as a forward-looking relevance signal. */
expectedCloseDate: Date | null;
/** The date the upstream record was last modified — backward-looking signal. */
lastUpdated: Date | null;
/**
* Override for the current timestamp.
* Useful for deterministic testing. Defaults to `new Date()`.
*/
now?: Date;
}
// ---------------------------------------------------------------------------
// Algorithm
// ---------------------------------------------------------------------------
/**
* Compute the cache TTL for a record based on its activity signals.
*
* @param input - The record's activity signals. See {@link CacheTTLInput}.
* @returns The TTL in milliseconds, or `null` if the record should not be cached.
*
* @see Module-level JSDoc for the full spec table and evaluation rules.
*/
export function computeCacheTTL(input: CacheTTLInput): number | null {
const {
closedFlag,
closedDate,
expectedCloseDate,
lastUpdated,
now = new Date(),
} = input;
const nowMs = now.getTime();
/**
* Check whether a date falls within a window around `now`.
*
* "Within" means the date is between `now - windowMs` and `now + windowMs`,
* allowing both past updates and future-scheduled dates to qualify.
*/
const isWithinWindow = (date: Date | null, windowMs: number): boolean => {
if (!date) return false;
const diff = Math.abs(nowMs - date.getTime());
return diff <= windowMs;
};
// Rule 1 — Closed records
if (closedFlag) {
// Rule 1b — Recently closed (within 30 days) → cache at low-activity TTL
if (isWithinWindow(closedDate, THIRTY_DAYS_MS)) {
return TTL_LOW_ACTIVITY;
}
// Rule 1a — Closed longer than 30 days → do not cache
return null;
}
// Rule 2 — High activity (5 days)
if (
isWithinWindow(expectedCloseDate, FIVE_DAYS_MS) ||
isWithinWindow(lastUpdated, FIVE_DAYS_MS)
) {
return TTL_HIGH_ACTIVITY;
}
// Rule 3 — Moderate activity (14 days)
if (
isWithinWindow(expectedCloseDate, FOURTEEN_DAYS_MS) ||
isWithinWindow(lastUpdated, FOURTEEN_DAYS_MS)
) {
return TTL_MODERATE_ACTIVITY;
}
// Rule 4 — Low activity / stale
return TTL_LOW_ACTIVITY;
}
@@ -0,0 +1,116 @@
/**
* @module computeProductsCacheTTL
*
* Adaptive Cache TTL for Opportunity Products
* ============================================
*
* Determines how long products (forecast items) should be cached in
* Redis before being re-fetched from ConnectWise.
*
* Products have unique caching rules compared to notes or contacts
* because they are typically finalised before a deal closes and do not
* change once the opportunity reaches a terminal status.
*
* ## Spec
*
* | # | Condition | TTL (ms) | TTL (human) | Rationale |
* |---|------------------------------------------------------------------------------|------------|-------------|---------------------------------------------------------------------------------------|
* | 1 | Status is **Won**, **Lost**, **Pending Won**, or **Pending Lost** | `null` | No cache | Products on terminal / near-terminal opps are static; no need to keep them warm. |
* | 2 | Opportunity is **not cacheable** (main cache TTL is `null`) | `null` | No cache | If the opp itself is evicted, sub-resources follow suit. |
 * | 3 | `lastUpdated` is within the last **3 days**                                    | 45 000     | 45 seconds  | Actively-worked deals — products are being edited and need near-real-time freshness.   |
* | 4 | Everything else | 1 200 000 | 20 minutes | Lazy on-demand cache: fetched when requested, expires after 20 min without refresh. |
*
* ## Evaluation order
*
* Rules are evaluated top-to-bottom; the first matching rule wins.
*
* ## Inputs
*
* Extends {@link CacheTTLInput} from `computeCacheTTL` with an
* additional `statusCwId` field used to identify terminal statuses.
*
* ## Output
*
* Returns `number | null`:
* - Positive integer = TTL in **milliseconds**.
* - `null` = do **not** cache.
*/
import type { CacheTTLInput } from "./computeCacheTTL";
import { computeCacheTTL } from "./computeCacheTTL";
import { QUOTE_STATUSES } from "../../types/QuoteStatuses";
// ---------------------------------------------------------------------------
// Constants
// ---------------------------------------------------------------------------
/** 45 seconds — TTL for hot products (opportunity updated within 3 days).
* Must exceed the 30-second background refresh interval so the cache
* stays warm between cycles. */
export const PRODUCTS_TTL_HOT = 45_000;
/** 20 minutes — TTL for on-demand product cache (lazy fallback). */
export const PRODUCTS_TTL_LAZY = 1_200_000;
/** 3 days in milliseconds. */
const THREE_DAYS_MS = 3 * 24 * 60 * 60 * 1000;
/**
* Set of all CW status IDs that map to a Won or Lost canonical status.
*
* Built at module load from {@link QUOTE_STATUSES} so it stays in sync
* with any future status additions.
*/
export const WON_LOST_STATUS_IDS: ReadonlySet<number> = new Set(
QUOTE_STATUSES.filter((s) => s.wonFlag || s.lostFlag).flatMap((s) => [
s.id,
...s.optimaEquivalency,
]),
);
// ---------------------------------------------------------------------------
// Input type
// ---------------------------------------------------------------------------
export interface ProductsCacheTTLInput extends CacheTTLInput {
/** The CW status ID of the opportunity. */
statusCwId: number | null;
}
// ---------------------------------------------------------------------------
// Algorithm
// ---------------------------------------------------------------------------
/**
* Compute the cache TTL for an opportunity's products.
*
* @param input - The opportunity's activity signals plus status ID.
* @returns TTL in milliseconds, or `null` if products should not be cached.
*/
export function computeProductsCacheTTL(
input: ProductsCacheTTLInput,
): number | null {
const { statusCwId, lastUpdated, now = new Date() } = input;
// Rule 1 — Terminal statuses: Won / Lost / Pending Won / Pending Lost
if (statusCwId !== null && WON_LOST_STATUS_IDS.has(statusCwId)) {
return null;
}
// Rule 2 — If the opportunity itself is not cacheable, skip products too
const mainTTL = computeCacheTTL(input);
if (mainTTL === null) {
return null;
}
// Rule 3 — Hot: updated within the last 3 days
if (lastUpdated) {
const diff = Math.abs(now.getTime() - lastUpdated.getTime());
if (diff <= THREE_DAYS_MS) {
return PRODUCTS_TTL_HOT;
}
}
// Rule 4 — Lazy fallback
return PRODUCTS_TTL_LAZY;
}
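A worked sketch of the three outcomes follows; the dates and status IDs are made-up, `1` is assumed not to be a Won/Lost status, and `QUOTE_STATUSES` is assumed to contain at least one Won/Lost entry.

```ts
import {
  computeProductsCacheTTL,
  WON_LOST_STATUS_IDS,
} from "./computeProductsCacheTTL"; // path assumed

const now = new Date("2026-03-02T00:00:00Z");
const base = { closedFlag: false, closedDate: null, expectedCloseDate: null, now };

// Edited yesterday on an open status → hot TTL (45 s).
computeProductsCacheTTL({
  ...base,
  statusCwId: 1, // assumed non-terminal status ID
  lastUpdated: new Date("2026-03-01T00:00:00Z"),
}); // → 45_000 (PRODUCTS_TTL_HOT)

// Same activity, but any Won/Lost status short-circuits to "do not cache".
const [terminalId] = WON_LOST_STATUS_IDS;
computeProductsCacheTTL({
  ...base,
  statusCwId: terminalId ?? null,
  lastUpdated: new Date("2026-03-01T00:00:00Z"),
}); // → null

// No edits for weeks → lazy on-demand cache (20 minutes).
computeProductsCacheTTL({
  ...base,
  statusCwId: 1,
  lastUpdated: new Date("2026-01-15T00:00:00Z"),
}); // → 1_200_000 (PRODUCTS_TTL_LAZY)
```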
@@ -0,0 +1,118 @@
/**
* @module computeSubResourceCacheTTL
*
* Adaptive Cache TTL for Opportunity Sub-Resources
* =================================================
*
* Determines how long cached sub-resource data (notes, contacts) should
* live before being re-fetched from ConnectWise.
*
* Sub-resources change less frequently than the opportunity record itself
* or its activity feed, so TTLs are longer than the primary cache. The
* same activity-signal heuristics are used (expected close date, last
* updated, closed status) but with relaxed durations.
*
* ## Spec
*
* | # | Condition | TTL (ms) | TTL (human) | Rationale |
* |---|-------------------------------------------------------------------|----------|-------------|--------------------------------------------------------------------|
* | 1 | `closedFlag` is `true` AND closed > 30 days ago | `null` | Do not cache| Old closed records are rarely accessed. |
* | 1b| `closedFlag` is `true` AND closed within 30 days | 300 000 | 5 minutes | Recently-closed records may still be viewed occasionally. |
* | 2 | `expectedCloseDate` OR `lastUpdated` within **5 days** | 60 000 | 60 seconds | Active deals — contacts/notes may still change. |
* | 3 | `expectedCloseDate` OR `lastUpdated` within **14 days** | 120 000 | 2 minutes | Moderate activity — less likely to change. |
* | 4 | Everything else (older than 14 days) | 300 000 | 5 minutes | Low activity — safe to cache longer. |
*
* ## Evaluation order
*
* Rules are evaluated top-to-bottom; the first matching rule wins.
*
* ## Inputs
*
* Uses the same {@link CacheTTLInput} interface as `computeCacheTTL`.
*
* ## Output
*
* Returns `number | null`:
* - Positive integer = TTL in **milliseconds**.
* - `null` = do **not** cache.
*/
import type { CacheTTLInput } from "./computeCacheTTL";
// ---------------------------------------------------------------------------
// Constants
// ---------------------------------------------------------------------------
/** 60 seconds — TTL for high-activity sub-resources (within 5 days). */
export const SUB_TTL_HIGH_ACTIVITY = 60_000;
/** 2 minutes — TTL for moderate-activity sub-resources (within 14 days). */
export const SUB_TTL_MODERATE_ACTIVITY = 120_000;
/** 5 minutes — TTL for low-activity / stale sub-resources. */
export const SUB_TTL_LOW_ACTIVITY = 300_000;
/** 30 days in milliseconds. */
const THIRTY_DAYS_MS = 30 * 24 * 60 * 60 * 1000;
/** 5 days in milliseconds. */
const FIVE_DAYS_MS = 5 * 24 * 60 * 60 * 1000;
/** 14 days in milliseconds. */
const FOURTEEN_DAYS_MS = 14 * 24 * 60 * 60 * 1000;
// ---------------------------------------------------------------------------
// Algorithm
// ---------------------------------------------------------------------------
/**
* Compute the cache TTL for an opportunity sub-resource (notes, contacts).
*
* @param input - The opportunity's activity signals. See {@link CacheTTLInput}.
* @returns The TTL in milliseconds, or `null` if the data should not be cached.
*/
export function computeSubResourceCacheTTL(
input: CacheTTLInput,
): number | null {
const {
closedFlag,
closedDate,
expectedCloseDate,
lastUpdated,
now = new Date(),
} = input;
const nowMs = now.getTime();
const isWithinWindow = (date: Date | null, windowMs: number): boolean => {
if (!date) return false;
return Math.abs(nowMs - date.getTime()) <= windowMs;
};
// Rule 1 — Closed records
if (closedFlag) {
if (isWithinWindow(closedDate, THIRTY_DAYS_MS)) {
return SUB_TTL_LOW_ACTIVITY;
}
return null;
}
// Rule 2 — High activity (5 days)
if (
isWithinWindow(expectedCloseDate, FIVE_DAYS_MS) ||
isWithinWindow(lastUpdated, FIVE_DAYS_MS)
) {
return SUB_TTL_HIGH_ACTIVITY;
}
// Rule 3 — Moderate activity (14 days)
if (
isWithinWindow(expectedCloseDate, FOURTEEN_DAYS_MS) ||
isWithinWindow(lastUpdated, FOURTEEN_DAYS_MS)
) {
return SUB_TTL_MODERATE_ACTIVITY;
}
// Rule 4 — Low activity / stale
return SUB_TTL_LOW_ACTIVITY;
}
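And a brief sketch of the closed-record behaviour, which is where this algorithm differs most from the primary TTL; the dates are illustrative.

```ts
import { computeSubResourceCacheTTL } from "./computeSubResourceCacheTTL"; // path assumed

const now = new Date("2026-03-02T00:00:00Z");

// Closed last week → notes/contacts stay cacheable at the low-activity TTL.
computeSubResourceCacheTTL({
  closedFlag: true,
  closedDate: new Date("2026-02-24T00:00:00Z"),
  expectedCloseDate: null,
  lastUpdated: null,
  now,
}); // → 300_000 (SUB_TTL_LOW_ACTIVITY)

// Closed months ago → not cached at all.
computeSubResourceCacheTTL({
  closedFlag: true,
  closedDate: new Date("2025-11-01T00:00:00Z"),
  expectedCloseDate: null,
  lastUpdated: null,
  now,
}); // → null
```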
+763
View File
@@ -0,0 +1,763 @@
/**
* @module opportunityCache
*
* Redis-backed cache for expensive ConnectWise API data associated
* with opportunities.
*
* ## What is cached
*
 * Each non-closed opportunity may have several cached payloads, keyed by
 * its `cwOpportunityId` (or, for company and site data, by the CW company ID):
*
* - **Activities** (`opp:activities:{cwOpportunityId}`) — the raw
* `CWActivity[]` array fetched from `activityCw.fetchByOpportunity()`.
* - **Company CW data** (`opp:company-cw:{cw_CompanyId}`) — the hydrated
* company / contacts blob set by `CompanyController.hydrateCwData()`.
* - **Notes** (`opp:notes:{cwOpportunityId}`) — raw CW notes array.
* - **Contacts** (`opp:contacts:{cwOpportunityId}`) — raw CW contacts array.
 * - **Products** (`opp:products:{cwOpportunityId}`) — raw CW forecast +
 *   procurement products blob.
 * - **Site** (`opp:site:{cwCompanyId}:{cwSiteId}`): raw CW company site record.
 * - **CW opportunity data** (`opp:cw-data:{cwOpportunityId}`): the raw CW
 *   opportunity response used by the cache-then-cw fetch path.
*
* TTLs are computed dynamically via {@link computeCacheTTL} so hot
* opportunities refresh every 30 s while stale ones live for 15 min.
*
* ## Background refresh
*
* {@link refreshOpportunityCache} is designed to be called on a
* 30-second interval from `src/index.ts`. It scans all non-closed
* DB opportunities, checks which cache keys have expired, and
* re-fetches only those from ConnectWise.
*/
import { prisma, redis } from "../../constants";
import { activityCw } from "../cw-utils/activities/activities";
import { computeCacheTTL } from "../algorithms/computeCacheTTL";
import { computeSubResourceCacheTTL } from "../algorithms/computeSubResourceCacheTTL";
import {
computeProductsCacheTTL,
PRODUCTS_TTL_HOT,
} from "../algorithms/computeProductsCacheTTL";
import { connectWiseApi } from "../../constants";
import { fetchCwCompanyById } from "../cw-utils/fetchCompany";
import { fetchCompanySite } from "../cw-utils/sites/companySites";
import { opportunityCw } from "../cw-utils/opportunities/opportunities";
import { withCwRetry } from "../cw-utils/withCwRetry";
import { events } from "../globalEvents";
// ---------------------------------------------------------------------------
// Key helpers
// ---------------------------------------------------------------------------
const ACTIVITY_PREFIX = "opp:activities:";
const COMPANY_CW_PREFIX = "opp:company-cw:";
const NOTES_PREFIX = "opp:notes:";
const CONTACTS_PREFIX = "opp:contacts:";
const PRODUCTS_PREFIX = "opp:products:";
const SITE_PREFIX = "opp:site:";
const OPP_CW_PREFIX = "opp:cw-data:";
/** Redis key for cached activities by CW opportunity ID. */
export const activityCacheKey = (cwOppId: number) =>
`${ACTIVITY_PREFIX}${cwOppId}`;
/** Redis key for cached company CW hydration data by CW company ID. */
export const companyCwCacheKey = (cwCompanyId: number) =>
`${COMPANY_CW_PREFIX}${cwCompanyId}`;
/** Redis key for cached opportunity notes by CW opportunity ID. */
export const notesCacheKey = (cwOppId: number) => `${NOTES_PREFIX}${cwOppId}`;
/** Redis key for cached opportunity contacts by CW opportunity ID. */
export const contactsCacheKey = (cwOppId: number) =>
`${CONTACTS_PREFIX}${cwOppId}`;
/** Redis key for cached opportunity products by CW opportunity ID. */
export const productsCacheKey = (cwOppId: number) =>
`${PRODUCTS_PREFIX}${cwOppId}`;
/** Redis key for cached company site by CW company ID + site ID. */
export const siteCacheKey = (cwCompanyId: number, cwSiteId: number) =>
`${SITE_PREFIX}${cwCompanyId}:${cwSiteId}`;
/** Redis key for cached CW opportunity response by CW opportunity ID. */
export const oppCwDataCacheKey = (cwOppId: number) =>
`${OPP_CW_PREFIX}${cwOppId}`;
// ---------------------------------------------------------------------------
// Read helpers
// ---------------------------------------------------------------------------
/**
* Retrieve cached CW activities for an opportunity.
*
* @returns The parsed `CWActivity[]` or `null` on cache miss.
*/
export async function getCachedActivities(
cwOpportunityId: number,
): Promise<any[] | null> {
const raw = await redis.get(activityCacheKey(cwOpportunityId));
if (!raw) return null;
try {
return JSON.parse(raw);
} catch {
return null;
}
}
/**
* Retrieve cached company CW hydration data.
*
* @returns `{ company, defaultContact, allContacts }` or `null` on cache miss.
*/
export async function getCachedCompanyCwData(
cwCompanyId: number,
): Promise<{ company: any; defaultContact: any; allContacts: any[] } | null> {
const raw = await redis.get(companyCwCacheKey(cwCompanyId));
if (!raw) return null;
try {
return JSON.parse(raw);
} catch {
return null;
}
}
/**
* Retrieve cached opportunity notes (raw CW data).
*
* @returns The parsed raw CW notes array or `null` on cache miss.
*/
export async function getCachedNotes(
cwOpportunityId: number,
): Promise<any[] | null> {
const raw = await redis.get(notesCacheKey(cwOpportunityId));
if (!raw) return null;
try {
return JSON.parse(raw);
} catch {
return null;
}
}
/**
* Retrieve cached opportunity contacts (raw CW data).
*
* @returns The parsed raw CW contacts array or `null` on cache miss.
*/
export async function getCachedContacts(
cwOpportunityId: number,
): Promise<any[] | null> {
const raw = await redis.get(contactsCacheKey(cwOpportunityId));
if (!raw) return null;
try {
return JSON.parse(raw);
} catch {
return null;
}
}
/**
* Retrieve cached opportunity products (raw CW forecast + procurement blob).
*
* @returns `{ forecast, procProducts }` or `null` on cache miss.
*/
export async function getCachedProducts(
cwOpportunityId: number,
): Promise<{ forecast: any; procProducts: any[] } | null> {
const raw = await redis.get(productsCacheKey(cwOpportunityId));
if (!raw) return null;
try {
return JSON.parse(raw);
} catch {
return null;
}
}
/**
* Retrieve cached CW site data for a company/site pair.
*
* @returns Parsed site data or `null` on cache miss.
*/
export async function getCachedSite(
cwCompanyId: number,
cwSiteId: number,
): Promise<any | null> {
const raw = await redis.get(siteCacheKey(cwCompanyId, cwSiteId));
if (!raw) return null;
try {
return JSON.parse(raw);
} catch {
return null;
}
}
/**
* Retrieve cached CW opportunity response data.
*
* @returns Parsed CW opportunity object or `null` on cache miss.
*/
export async function getCachedOppCwData(
cwOpportunityId: number,
): Promise<any | null> {
const raw = await redis.get(oppCwDataCacheKey(cwOpportunityId));
if (!raw) return null;
try {
return JSON.parse(raw);
} catch {
return null;
}
}
// ---------------------------------------------------------------------------
// Helpers
// ---------------------------------------------------------------------------
/** Check whether an error is an Axios 404 (resource not found in CW). */
function isNotFoundError(err: unknown): boolean {
if (typeof err !== "object" || err === null) return false;
const e = err as Record<string, any>;
return e.isAxiosError === true && e.response?.status === 404;
}
/**
* Check whether an error is a transient network / timeout error.
*
* These are safe to swallow in background refresh tasks — CW will be
* retried on the next refresh cycle. Logs a concise one-line warning
* instead of dumping the full Axios error object.
*/
function isTransientError(err: unknown): boolean {
if (typeof err !== "object" || err === null) return false;
const e = err as Record<string, any>;
if (!e.isAxiosError) return false;
const code = e.code as string | undefined;
return (
code === "ECONNABORTED" ||
code === "ECONNREFUSED" ||
code === "ECONNRESET" ||
code === "ETIMEDOUT" ||
code === "ERR_NETWORK" ||
code === "ENETUNREACH" ||
code === "ERR_BAD_RESPONSE"
);
}
/** Build a concise error description for logging (avoids dumping entire Axios objects). */
function describeError(err: unknown): string {
if (typeof err !== "object" || err === null) return String(err);
const e = err as Record<string, any>;
if (e.isAxiosError) {
const method = (e.config?.method ?? "?").toUpperCase();
const url = e.config?.url ?? "unknown";
const code = e.code ?? "";
const status = e.response?.status ?? "";
return `${method} ${url} ${code || `HTTP ${status}`} (${e.message})`;
}
return e.message ?? String(err);
}
/**
* When true, transient-error warnings inside fetchAndCache* are suppressed.
* Used during background refresh to avoid flooding the terminal — the
* refresh function prints a single summary line instead.
*/
let _suppressTransientWarnings = false;
// ---------------------------------------------------------------------------
// Write helpers
// ---------------------------------------------------------------------------
/**
* Fetch activities from CW and cache them with the appropriate TTL.
*
* Returns an empty array if CW responds with 404 (opportunity doesn't
* exist or was deleted upstream).
*
* @returns The raw `CWActivity[]` collection (as plain array).
*/
export async function fetchAndCacheActivities(
cwOpportunityId: number,
ttlMs: number,
): Promise<any[]> {
try {
// Use the direct (single-call) variant to avoid the extra count request
const arr = await activityCw.fetchByOpportunityDirect(cwOpportunityId);
await redis.set(
activityCacheKey(cwOpportunityId),
JSON.stringify(arr),
"PX",
ttlMs,
);
return arr;
} catch (err) {
if (isNotFoundError(err)) return [];
if (isTransientError(err)) {
console.warn(
`[cache] activities opp#${cwOpportunityId}: ${describeError(err)}`,
);
return [];
}
throw err;
}
}
/**
* Fetch company CW data (company, contacts) and cache with the given TTL.
*
* @returns The hydration blob or `null` if the company doesn't exist in CW.
*/
export async function fetchAndCacheCompanyCwData(
cwCompanyId: number,
ttlMs: number,
): Promise<{ company: any; defaultContact: any; allContacts: any[] } | null> {
try {
// Fetch company and all-contacts in parallel — the allContacts URL
// can be constructed directly without the company response.
const [cwCompany, allContactsData] = await Promise.all([
fetchCwCompanyById(cwCompanyId),
withCwRetry(
() =>
connectWiseApi.get(
`/company/companies/${cwCompanyId}/contacts?pageSize=1000`,
),
{ label: `company#${cwCompanyId}/allContacts` },
),
]);
if (!cwCompany) return null;
// Default contact: derive from allContacts instead of making an
// extra serial CW call. The company object carries the default
// contact's ID, so we can pull it from the list we already fetched.
const defaultContactId = cwCompany.defaultContact?.id;
const defaultContactData = defaultContactId
? ((allContactsData.data as any[]).find(
(c: any) => c.id === defaultContactId,
) ?? null)
: null;
const blob = {
company: cwCompany,
defaultContact: defaultContactData,
allContacts: allContactsData.data,
};
await redis.set(
companyCwCacheKey(cwCompanyId),
JSON.stringify(blob),
"PX",
ttlMs,
);
return blob;
} catch (err) {
if (isNotFoundError(err)) return null;
if (isTransientError(err)) {
console.warn(`[cache] company#${cwCompanyId}: ${describeError(err)}`);
return null;
}
throw err;
}
}
/**
* Fetch opportunity notes from CW and cache the raw response.
*
* Returns an empty array if CW responds with 404.
*
* @returns The raw CW notes array.
*/
export async function fetchAndCacheNotes(
cwOpportunityId: number,
ttlMs: number,
): Promise<any[]> {
try {
const notes = await opportunityCw.fetchNotes(cwOpportunityId);
await redis.set(
notesCacheKey(cwOpportunityId),
JSON.stringify(notes),
"PX",
ttlMs,
);
return notes;
} catch (err) {
if (isNotFoundError(err)) return [];
if (isTransientError(err)) {
console.warn(
`[cache] notes opp#${cwOpportunityId}: ${describeError(err)}`,
);
return [];
}
throw err;
}
}
/**
* Fetch opportunity contacts from CW and cache the raw response.
*
* Returns an empty array if CW responds with 404.
*
* @returns The raw CW contacts array.
*/
export async function fetchAndCacheContacts(
cwOpportunityId: number,
ttlMs: number,
): Promise<any[]> {
try {
const contacts = await opportunityCw.fetchContacts(cwOpportunityId);
await redis.set(
contactsCacheKey(cwOpportunityId),
JSON.stringify(contacts),
"PX",
ttlMs,
);
return contacts;
} catch (err) {
if (isNotFoundError(err)) return [];
if (isTransientError(err)) {
console.warn(
`[cache] contacts opp#${cwOpportunityId}: ${describeError(err)}`,
);
return [];
}
throw err;
}
}
/**
* Invalidate cached notes for an opportunity.
*
* Call this after any note mutation (create, update, delete) so the
* next read refreshes from ConnectWise.
*/
export async function invalidateNotesCache(
cwOpportunityId: number,
): Promise<void> {
await redis.del(notesCacheKey(cwOpportunityId));
}
/**
* Invalidate cached contacts for an opportunity.
*
* Call this after any contact mutation so the next read refreshes
* from ConnectWise.
*/
export async function invalidateContactsCache(
cwOpportunityId: number,
): Promise<void> {
await redis.del(contactsCacheKey(cwOpportunityId));
}
/**
* Fetch opportunity products (forecast + procurement) from CW and cache.
*
* Stores both the forecast response and procurement products together
* so that `fetchProducts()` can reconstruct ForecastProductControllers
* from a single cache hit.
*
* @returns `{ forecast, procProducts }` blob.
*/
export async function fetchAndCacheProducts(
cwOpportunityId: number,
ttlMs: number,
): Promise<{ forecast: any; procProducts: any[] }> {
try {
const [forecast, procProducts] = await Promise.all([
opportunityCw.fetchProducts(cwOpportunityId),
opportunityCw.fetchProcurementProducts(cwOpportunityId),
]);
const blob = { forecast, procProducts };
await redis.set(
productsCacheKey(cwOpportunityId),
JSON.stringify(blob),
"PX",
ttlMs,
);
return blob;
} catch (err) {
if (isNotFoundError(err))
return { forecast: { forecastItems: [] }, procProducts: [] };
if (isTransientError(err)) {
console.warn(
`[cache] products opp#${cwOpportunityId}: ${describeError(err)}`,
);
return { forecast: { forecastItems: [] }, procProducts: [] };
}
throw err;
}
}
/**
* Invalidate cached products for an opportunity.
*
* Call this after any product mutation (add, update, resequence) so the
* next read refreshes from ConnectWise.
*/
export async function invalidateProductsCache(
cwOpportunityId: number,
): Promise<void> {
await redis.del(productsCacheKey(cwOpportunityId));
}
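/**
 * Illustrative sketch (not part of this change): the invalidate-after-mutation
 * pattern described in the doc comments above. The handler shape is
 * hypothetical; only `invalidateProductsCache()` comes from this module.
 */
async function exampleAfterProductMutation(cwOpportunityId: number): Promise<void> {
  // ...perform the CW product mutation here (add / update / resequence)...
  await invalidateProductsCache(cwOpportunityId);
  // The next products read for this opportunity misses the cache and
  // re-fetches from ConnectWise.
}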
/**
* Site TTL — 20 minutes. Site/address data rarely changes so we cache
* aggressively. The background refresh does NOT proactively warm site keys;
* they are populated lazily on the first detail-view request.
*/
const SITE_TTL_MS = 1_200_000;
/**
* Fetch a CW company site from ConnectWise and cache the result.
*
* @returns The raw CW site object.
*/
export async function fetchAndCacheSite(
cwCompanyId: number,
cwSiteId: number,
): Promise<any> {
try {
const site = await fetchCompanySite(cwCompanyId, cwSiteId);
await redis.set(
siteCacheKey(cwCompanyId, cwSiteId),
JSON.stringify(site),
"PX",
SITE_TTL_MS,
);
return site;
} catch (err) {
if (isNotFoundError(err)) return null;
if (isTransientError(err)) {
console.warn(
`[cache] site company#${cwCompanyId}/site#${cwSiteId}: ${describeError(err)}`,
);
return null;
}
throw err;
}
}
/**
* Fetch the raw CW opportunity response from ConnectWise and cache it.
*
* Used by `fetchItem()` in the manager to avoid a CW roundtrip when
* the detail view is reloaded within the cache TTL window.
*
* @param cwOpportunityId - The CW opportunity ID
* @param ttlMs - Cache TTL in milliseconds
* @returns The raw CW opportunity response object.
*/
export async function fetchAndCacheOppCwData(
cwOpportunityId: number,
ttlMs: number,
): Promise<any> {
try {
const cwData = await opportunityCw.fetch(cwOpportunityId);
await redis.set(
oppCwDataCacheKey(cwOpportunityId),
JSON.stringify(cwData),
"PX",
ttlMs,
);
return cwData;
} catch (err) {
if (isNotFoundError(err)) return null;
if (isTransientError(err)) {
console.warn(`[cache] opp#${cwOpportunityId}: ${describeError(err)}`);
return null;
}
throw err;
}
}
// ---------------------------------------------------------------------------
// Background refresh
// ---------------------------------------------------------------------------
/**
* Refresh the opportunity cache.
*
* Scans all non-closed opportunities in the database, computes a TTL for each,
* checks whether the cache key still exists, and re-fetches from ConnectWise
* only for entries that have expired.
*
* Designed to be called every **30 seconds** from the process entry point.
*/
export async function refreshOpportunityCache(): Promise<void> {
// Include non-closed AND recently-closed (within 30 days) opportunities
const thirtyDaysAgo = new Date(Date.now() - 30 * 24 * 60 * 60 * 1000);
const opportunities = await prisma.opportunity.findMany({
where: {
OR: [
{ closedFlag: false },
{ closedFlag: true, closedDate: { gte: thirtyDaysAgo } },
],
},
select: {
cwOpportunityId: true,
closedFlag: true,
closedDate: true,
expectedCloseDate: true,
cwLastUpdated: true,
statusCwId: true,
company: { select: { cw_CompanyId: true } },
},
orderBy: { cwLastUpdated: "desc" },
});
events.emit("cache:opportunities:refresh:started", {
totalOpportunities: opportunities.length,
});
let activitiesRefreshed = 0;
let companiesRefreshed = 0;
let notesRefreshed = 0;
let contactsRefreshed = 0;
let productsRefreshed = 0;
let oppCwDataRefreshed = 0;
let skipped = 0;
// Batch-check which keys already exist via a single pipeline
// (5 EXISTS per opportunity: oppCwData, activities, notes, contacts, products).
const pipeline = redis.pipeline();
for (const opp of opportunities) {
pipeline.exists(oppCwDataCacheKey(opp.cwOpportunityId));
pipeline.exists(activityCacheKey(opp.cwOpportunityId));
pipeline.exists(notesCacheKey(opp.cwOpportunityId));
pipeline.exists(contactsCacheKey(opp.cwOpportunityId));
pipeline.exists(productsCacheKey(opp.cwOpportunityId));
}
const existsResults = await pipeline.exec();
const refreshTasks: (() => Promise<void>)[] = [];
for (let i = 0; i < opportunities.length; i++) {
const opp = opportunities[i]!;
const cacheTTLInput = {
closedFlag: opp.closedFlag,
closedDate: opp.closedDate,
expectedCloseDate: opp.expectedCloseDate,
lastUpdated: opp.cwLastUpdated,
};
const ttl = computeCacheTTL(cacheTTLInput);
const subTTL = computeSubResourceCacheTTL(cacheTTLInput);
const productsTTL = computeProductsCacheTTL({
...cacheTTLInput,
statusCwId: opp.statusCwId,
});
// Skip closed (ttl === null) — should not happen because of the query filter,
// but guard anyway.
if (ttl === null) {
skipped++;
continue;
}
// existsResults entries are [error, result] tuples
// Pipeline order per opportunity: oppCwData, activities, notes, contacts, products
const baseIdx = i * 5;
const oppCwDataExists = existsResults?.[baseIdx]?.[1] === 1;
const activityExists = existsResults?.[baseIdx + 1]?.[1] === 1;
const notesExist = existsResults?.[baseIdx + 2]?.[1] === 1;
const contactsExist = existsResults?.[baseIdx + 3]?.[1] === 1;
const productsExist = existsResults?.[baseIdx + 4]?.[1] === 1;
// Proactively cache the CW opportunity response itself
if (!oppCwDataExists) {
refreshTasks.push(() =>
fetchAndCacheOppCwData(opp.cwOpportunityId, ttl).then(() => {
oppCwDataRefreshed++;
}),
);
}
if (!activityExists) {
refreshTasks.push(() =>
fetchAndCacheActivities(opp.cwOpportunityId, ttl).then(() => {
activitiesRefreshed++;
}),
);
}
// Refresh notes/contacts if sub-resource TTL applies and key is missing
if (subTTL !== null) {
if (!notesExist) {
refreshTasks.push(() =>
fetchAndCacheNotes(opp.cwOpportunityId, subTTL).then(() => {
notesRefreshed++;
}),
);
}
if (!contactsExist) {
refreshTasks.push(() =>
fetchAndCacheContacts(opp.cwOpportunityId, subTTL).then(() => {
contactsRefreshed++;
}),
);
}
}
// Proactively refresh products only for hot opps (updated within 3 days).
// 30-minute lazy-cached products are filled on-demand by the endpoint
// and do not need background refresh.
if (productsTTL === PRODUCTS_TTL_HOT && !productsExist) {
refreshTasks.push(() =>
fetchAndCacheProducts(opp.cwOpportunityId, productsTTL).then(() => {
productsRefreshed++;
}),
);
}
// Also refresh company CW data if the key is missing
if (opp.company?.cw_CompanyId) {
const cwCompanyId = opp.company.cw_CompanyId;
refreshTasks.push(async () => {
const companyExists = await redis.exists(
companyCwCacheKey(cwCompanyId),
);
if (!companyExists) {
await fetchAndCacheCompanyCwData(cwCompanyId, ttl);
companiesRefreshed++;
}
});
}
}
// Run refresh thunks with bounded concurrency and inter-batch delay.
// Each thunk is only invoked here — no requests fire until we call them.
// CW rate-limits aggressively so we keep this conservative.
const CONCURRENCY = 6;
const BATCH_DELAY_MS = 250;
let timeoutCount = 0;
for (let i = 0; i < refreshTasks.length; i += CONCURRENCY) {
const batch = refreshTasks.slice(i, i + CONCURRENCY);
const results = await Promise.allSettled(batch.map((fn) => fn()));
for (const r of results) {
if (r.status === "rejected") timeoutCount++;
}
// Small delay between batches to avoid overwhelming CW
if (i + CONCURRENCY < refreshTasks.length) {
await new Promise((resolve) => setTimeout(resolve, BATCH_DELAY_MS));
}
}
if (timeoutCount > 0) {
console.warn(
`[cache] refresh: ${timeoutCount} task(s) failed (likely CW timeouts) — will retry next cycle`,
);
}
events.emit("cache:opportunities:refresh:completed", {
totalOpportunities: opportunities.length,
activitiesRefreshed,
companiesRefreshed,
notesRefreshed,
contactsRefreshed,
productsRefreshed,
oppCwDataRefreshed,
skipped,
});
}
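/**
 * Illustrative wiring sketch (not part of this change): one way the process
 * entry point could invoke the refresh on the 30-second interval described
 * above. The actual wiring in `src/index.ts` may differ; the function name
 * here is made up for the example.
 */
function startOpportunityCacheRefreshLoopExample(intervalMs = 30_000): void {
  const tick = () =>
    refreshOpportunityCache().catch((error) =>
      events.emit("cache:opportunities:refresh:error", { error }),
    );
  // Kick off once without awaiting so startup is not blocked, then repeat.
  void tick();
  setInterval(tick, intervalMs);
}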
@@ -115,6 +115,24 @@ export const activityCw = {
return activityCw.fetchAll(`opportunity/id=${opportunityId}`);
},
/**
* Fetch Activities by Opportunity (Direct)
*
* Lightweight single-call variant that skips the count request.
* Fetches up to 1000 activities in a single GET — sufficient for
* virtually all opportunities. Used by the background cache refresh
* to avoid doubling CW API calls.
*/
fetchByOpportunityDirect: async (
opportunityId: number,
): Promise<CWActivity[]> => {
const conditions = encodeURIComponent(`opportunity/id=${opportunityId}`);
const response = await connectWiseApi.get(
`/sales/activities?pageSize=1000&conditions=${conditions}`,
);
return response.data;
},
/**
* Create Activity
*
+142
View File
@@ -0,0 +1,142 @@
/**
* @module cwApiLogger
*
* Axios interceptor-based logger that records every ConnectWise API
* request to a JSONL (newline-delimited JSON) file for post-hoc analysis.
*
* Each line in the log file is a self-contained JSON object with:
* - timestamp (ISO-8601)
* - method, url, baseURL
* - status (HTTP status or null on network error)
* - durationMs (wall-clock time from request start → response/error)
* - error (error code / message, if any)
* - timeout (configured timeout in ms)
*
* Logging is **opt-in** — set the `LOG_CW_API` environment variable to
* any truthy value to enable it. When enabled, each process start creates
* a new timestamped file inside the `cw-api-logs/` directory:
*
* LOG_CW_API=1 bun run dev # uses cw-api-logs/<timestamp>.jsonl
* bun run dev:log # shorthand (sets LOG_CW_API=1)
*
* Appends are non-blocking (fire-and-forget) to avoid slowing down
* the actual API flow.
*
* Usage:
* import { attachCwApiLogger } from "./modules/cw-utils/cwApiLogger";
* attachCwApiLogger(connectWiseApi);
*/
import { appendFile, mkdir } from "fs/promises";
import path from "path";
import type { AxiosInstance, InternalAxiosRequestConfig } from "axios";
const LOG_DIR = path.resolve(process.cwd(), "cw-api-logs");
/** Build a timestamped filename like `2026-03-02T14-30-05.123Z.jsonl` */
function buildLogPath(): string {
const ts = new Date().toISOString().replace(/:/g, "-");
return path.join(LOG_DIR, `${ts}.jsonl`);
}
let LOG_PATH: string | null = null;
// Symbol used to stash the start time on the request config
const START_TIME = Symbol("cwLogStartTime");
interface TimedConfig extends InternalAxiosRequestConfig {
[START_TIME]?: number;
}
export interface CwApiLogEntry {
timestamp: string;
method: string;
url: string;
baseURL: string;
status: number | null;
durationMs: number;
error: string | null;
timeout: number | undefined;
}
/** Write a single log entry (fire-and-forget). */
function writeEntry(entry: CwApiLogEntry): void {
if (!LOG_PATH) return;
appendFile(LOG_PATH, JSON.stringify(entry) + "\n").catch((err) => {
// Swallow write errors — logging should never crash the app
console.error("[cw-logger] failed to write log entry:", err.message);
});
}
/**
* Attach request/response interceptors to an Axios instance to log
* every CW API call with timing information.
*/
export function attachCwApiLogger(api: AxiosInstance): void {
if (!process.env.LOG_CW_API) {
return;
}
// Create the log directory and build a unique file path for this run
LOG_PATH = buildLogPath();
mkdir(LOG_DIR, { recursive: true }).catch((err) => {
console.error("[cw-logger] failed to create log directory:", err.message);
});
// ---- Request interceptor: record start time --------------------------
api.interceptors.request.use((config: TimedConfig) => {
config[START_TIME] = performance.now();
return config;
});
// ---- Response interceptor: log successful calls ----------------------
api.interceptors.response.use(
(response) => {
const config = response.config as TimedConfig;
const start = config[START_TIME] ?? performance.now();
const durationMs = Math.round(performance.now() - start);
writeEntry({
timestamp: new Date().toISOString(),
method: (config.method ?? "GET").toUpperCase(),
url: config.url ?? "",
baseURL: config.baseURL ?? "",
status: response.status,
durationMs,
error: null,
timeout: config.timeout,
});
return response;
},
// ---- Error interceptor: log failed calls -----------------------------
(err) => {
const config = (err.config ?? {}) as TimedConfig;
const start = config[START_TIME] ?? performance.now();
const durationMs = Math.round(performance.now() - start);
writeEntry({
timestamp: new Date().toISOString(),
method: (config.method ?? "GET").toUpperCase(),
url: config.url ?? "",
baseURL: config.baseURL ?? "",
status: err.response?.status ?? null,
durationMs,
error: err.code
? `${err.code}: ${err.message}`
: (err.message ?? "unknown"),
timeout: config.timeout,
});
return Promise.reject(err);
},
);
console.log(`[cw-logger] logging CW API calls to ${LOG_PATH}`);
}
/** Returns the current log file path (or null if logging is disabled). */
export function getCwLogPath(): string | null {
return LOG_PATH;
}
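// Illustrative example (not part of this change): the shape of one parsed log
// entry. Values are made up; the structure is the CwApiLogEntry above, and
// each entry is written as a single JSON.stringify'd line of the .jsonl file.
const exampleLogEntry: CwApiLogEntry = {
  timestamp: "2026-03-02T14:30:05.123Z",
  method: "GET",
  url: "/sales/activities?pageSize=1000",
  baseURL: "https://example.invalid/apis/3.0/",
  status: 200,
  durationMs: 842,
  error: null,
  timeout: 30_000,
};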
@@ -0,0 +1,79 @@
/**
* CW API Concurrency Limiter
*
* Limits the number of simultaneous in-flight requests to the ConnectWise
* API. CW responds significantly slower under high concurrency (requests
* were observed to run ~3× slower at 56 concurrent than at 9), so bounding
* the parallelism actually reduces total wall-clock time.
*
* Implemented as an Axios request interceptor that gates on a simple
* counting semaphore. When the limit is reached, new requests queue and
* resolve in FIFO order as earlier requests complete.
*/
import type { AxiosInstance, InternalAxiosRequestConfig } from "axios";
// ---------------------------------------------------------------------------
// Semaphore
// ---------------------------------------------------------------------------
class Semaphore {
private _current = 0;
private _queue: (() => void)[] = [];
constructor(private _max: number) {}
/** Acquire a slot — resolves immediately if under the limit, else waits. */
acquire(): Promise<void> {
if (this._current < this._max) {
this._current++;
return Promise.resolve();
}
return new Promise<void>((resolve) => {
this._queue.push(resolve);
});
}
/** Release a slot — wakes the next queued caller, if any. */
release(): void {
const next = this._queue.shift();
if (next) {
// Hand the slot directly to the next waiter (don't decrement)
next();
} else {
this._current--;
}
}
}
// ---------------------------------------------------------------------------
// Interceptor attachment
// ---------------------------------------------------------------------------
/**
* Attach a concurrency-limiting interceptor to an Axios instance.
*
* @param api - The Axios instance to limit.
* @param max - Maximum concurrent in-flight requests (default: 6).
*/
export function attachCwConcurrencyLimiter(api: AxiosInstance, max = 6): void {
const sem = new Semaphore(max);
// Request interceptor: wait for a slot before the request fires
api.interceptors.request.use(async (config: InternalAxiosRequestConfig) => {
await sem.acquire();
return config;
});
// Response interceptor: release the slot on success or failure
api.interceptors.response.use(
(response) => {
sem.release();
return response;
},
(error) => {
sem.release();
return Promise.reject(error);
},
);
}
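// Usage sketch (not part of this change): the limiter is attached once to the
// shared Axios instance, alongside the other interceptors. The import path
// mirrors the pattern used elsewhere in this diff and may differ in the real
// wiring code.
//
//   import { connectWiseApi } from "../../constants";
//   attachCwConcurrencyLimiter(connectWiseApi, 6);
//
// Every caller of connectWiseApi then shares the same 6-slot semaphore without
// changing its own request code.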
+8 -2
View File
@@ -1,12 +1,18 @@
import { connectWiseApi } from "../../constants";
import { Company } from "../../types/ConnectWiseTypes";
import { withCwRetry } from "./withCwRetry";
export const fetchCwCompanyById = async (
companyId: number,
): Promise<Company | null> => {
try {
const response = await connectWiseApi.get(
`/company/companies/${companyId}`,
const response = await withCwRetry(
() => connectWiseApi.get(`/company/companies/${companyId}`),
{
label: `fetchCompany#${companyId}`,
maxAttempts: 3,
baseDelayMs: 1_500,
},
);
return response.data;
} catch (error) {
@@ -102,3 +102,40 @@ export const resolveMember = async (
cwMemberId: cwMember?.id ?? null,
};
};
/**
* Resolve Multiple CW Identifiers in a Single Batch
*
* Same as `resolveMember` but batches the DB query so that N identifiers
* require only **one** `findMany` instead of N `findFirst` calls.
*
* @param identifiers - Array of CW member identifiers
* @returns Map of identifier → ResolvedMember
*/
export const resolveMembers = async (
identifiers: string[],
): Promise<Map<string, ResolvedMember>> => {
const unique = [...new Set(identifiers)];
// Single batched DB query for all identifiers
const localUsers = await prisma.user.findMany({
where: { cwIdentifier: { in: unique } },
select: { id: true, cwIdentifier: true },
});
const userMap = new Map(localUsers.map((u) => [u.cwIdentifier, u.id]));
const result = new Map<string, ResolvedMember>();
for (const identifier of unique) {
const cwMember = memberCache.get(identifier);
const name = cwMember
? `${cwMember.firstName} ${cwMember.lastName}`.trim() || identifier
: identifier;
result.set(identifier, {
id: userMap.get(identifier) ?? null,
identifier,
name,
cwMemberId: cwMember?.id ?? null,
});
}
return result;
};
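// Illustrative usage (not part of this change): batching a page of CW member
// identifiers into a single lookup. The identifier strings are made-up
// examples.
//
//   const resolved = await resolveMembers(["jdoe", "asmith", "jdoe"]);
//   // One findMany covers the two unique identifiers; duplicates are deduped.
//   const owner = resolved.get("jdoe"); // ResolvedMember | undefined (per Map.get)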
@@ -66,10 +66,28 @@ export const catalogCw = {
return allItems;
},
fetchByCatalogId: async (cwCatalogId: number): Promise<CatalogItem> => {
try {
const response = await connectWiseApi.get(
`/procurement/catalog/${cwCatalogId}`,
);
return response.data;
} catch {
const fallback = await connectWiseApi.get(
`/procurement/catalog/items/${cwCatalogId}`,
);
return fallback.data;
}
},
fetch: async (id: string): Promise<CatalogItem> => {
const numericId = Number(id);
if (!Number.isFinite(numericId)) {
const response = await connectWiseApi.get(
`/procurement/catalog/items/${id}`,
);
return response.data;
}
return catalogCw.fetchByCatalogId(numericId);
},
};
@@ -0,0 +1,469 @@
import { prisma, redis, connectWiseApi } from "../../../constants";
import { withCwRetry } from "../withCwRetry";
import { catalogCw } from "./catalog";
import { CatalogItem } from "./catalog.types";
type JsonObject = Record<string, unknown>;
type TrackedProduct = {
cwCatalogId: number;
product: string;
onHand: string;
inventory: string;
key: string;
};
type AdjustmentSnapshot = {
key: string;
trackedRows: TrackedProduct[];
signature: string;
};
const ADJUSTMENTS_ENDPOINT = "/procurement/adjustments?pageSize=1000";
const CATALOG_ITEM_CACHE_PREFIX = "catalog:item:cw:";
const CATALOG_ITEM_CACHE_TTL_SECONDS = 20 * 60;
const MAX_SYNC_PER_CYCLE = Number(
process.env.CW_ADJUSTMENT_SYNC_MAX_PER_CYCLE ?? "50",
);
const SYNC_COOLDOWN_MS = Number(
process.env.CW_ADJUSTMENT_SYNC_COOLDOWN_MS ?? `${10 * 60 * 1000}`,
);
let previous = new Map<string, AdjustmentSnapshot>();
let previousProductState = new Map<number, string>();
const lastSyncedAt = new Map<number, number>();
let inFlight = false;
const isObject = (value: unknown): value is JsonObject =>
typeof value === "object" && value !== null && !Array.isArray(value);
const toObject = (value: unknown): JsonObject => {
if (!isObject(value)) return {};
return value;
};
const stableStringify = (value: unknown): string => {
if (Array.isArray(value)) {
const entries = value.map((entry) => stableStringify(entry)).sort();
return `[${entries.join(",")}]`;
}
if (isObject(value)) {
const keys = Object.keys(value).sort();
const pairs = keys.map(
(key) => `${JSON.stringify(key)}:${stableStringify(value[key])}`,
);
return `{${pairs.join(",")}}`;
}
return JSON.stringify(value);
};
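// Illustrative property (not part of this change): stableStringify is
// key-order-insensitive, which is what makes the adjustment and product
// signatures below comparable across polling cycles.
const _stableStringifyIsOrderInsensitive =
  stableStringify({ a: 1, b: 2 }) === stableStringify({ b: 2, a: 1 }); // true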
const readPathValue = (obj: JsonObject, path: string): unknown => {
const parts = path.split(".");
let current: unknown = obj;
for (const part of parts) {
if (!isObject(current)) return null;
current = current[part];
}
return current;
};
const firstValue = (obj: JsonObject, paths: string[]): unknown => {
for (const path of paths) {
const value = readPathValue(obj, path);
if (value === null || value === undefined || value === "") continue;
return value;
}
return null;
};
const asNumber = (value: unknown): number | null => {
if (typeof value === "number" && Number.isFinite(value)) return value;
if (typeof value === "string" && value.length > 0) {
const parsed = Number(value);
if (Number.isFinite(parsed)) return parsed;
}
return null;
};
const asText = (value: unknown): string => {
if (value === null || value === undefined || value === "") return "-";
if (
typeof value === "string" ||
typeof value === "number" ||
typeof value === "boolean"
) {
return String(value);
}
if (Array.isArray(value)) {
return `[${value.map((entry) => asText(entry)).join(",")}]`;
}
if (!isObject(value)) return String(value);
const preferredFields = ["name", "identifier", "id", "code", "value"];
for (const field of preferredFields) {
const fieldValue = readPathValue(value, field);
if (fieldValue === null || fieldValue === undefined || fieldValue === "")
continue;
if (typeof fieldValue === "object") continue;
return String(fieldValue);
}
return stableStringify(value);
};
const adjustmentKey = (adjustment: JsonObject): string => {
const keyPaths = [
"id",
"adjustmentId",
"procurementAdjustmentId",
"recordId",
"recId",
"_info.id",
"_info.href",
];
for (const path of keyPaths) {
const key = firstValue(adjustment, [path]);
const keyText = asText(key);
if (keyText !== "-") return keyText;
}
return `anon:${stableStringify(adjustment)}`;
};
const trackedRow = (detail: JsonObject): TrackedProduct | null => {
const cwCatalogId = asNumber(
firstValue(detail, [
"catalogItem.id",
"catalogItemId",
"catalog.id",
"catalogId",
"item.id",
"itemId",
"product.id",
"productId",
"id",
]),
);
if (!cwCatalogId) return null;
const onHand = asText(
firstValue(detail, [
"onHand",
"onHandQty",
"onHandQuantity",
"qtyOnHand",
"quantityOnHand",
"quantity.onHand",
]),
);
const inventory = asText(
firstValue(detail, [
"inventory",
"inventoryQty",
"inventoryLevel",
"quantity",
"qty",
]),
);
if (onHand === "-" && inventory === "-") return null;
const product = asText(
firstValue(detail, [
"product.name",
"product.identifier",
"item.name",
"item.identifier",
"catalogItem.name",
"catalogItem.identifier",
"productName",
"productIdentifier",
"sku",
"identifier",
]),
);
return {
cwCatalogId,
product,
onHand,
inventory,
key: `${cwCatalogId}|${product}|${onHand}|${inventory}`,
};
};
const trackedRows = (adjustment: JsonObject): TrackedProduct[] => {
const detailCandidates = [
readPathValue(adjustment, "adjustmentDetails"),
readPathValue(adjustment, "details"),
readPathValue(adjustment, "lineItems"),
];
for (const candidate of detailCandidates) {
if (!Array.isArray(candidate)) continue;
const rows = candidate
.map((entry) => trackedRow(toObject(entry)))
.filter((entry): entry is TrackedProduct => entry !== null)
.sort((a, b) => a.key.localeCompare(b.key));
if (rows.length > 0) return rows;
}
const root = trackedRow(adjustment);
if (!root) return [];
return [root];
};
const snapshot = (rows: unknown[]): Map<string, AdjustmentSnapshot> => {
const out = new Map<string, AdjustmentSnapshot>();
for (const entry of rows) {
const adjustment = toObject(entry);
const key = adjustmentKey(adjustment);
const rowsTracked = trackedRows(adjustment);
const signature = stableStringify(rowsTracked);
out.set(key, {
key,
trackedRows: rowsTracked,
signature,
});
}
return out;
};
const changedCatalogIds = (
before: Map<number, string>,
after: Map<number, string>,
): Set<number> => {
const changed = new Set<number>();
for (const [cwCatalogId, nextSignature] of after) {
const prevSignature = before.get(cwCatalogId);
if (!prevSignature) {
changed.add(cwCatalogId);
continue;
}
if (prevSignature === nextSignature) continue;
changed.add(cwCatalogId);
}
return changed;
};
const productState = (
adjustments: Map<string, AdjustmentSnapshot>,
): Map<number, string> => {
const grouped = new Map<number, Set<string>>();
for (const snapshot of adjustments.values()) {
for (const row of snapshot.trackedRows) {
const rows = grouped.get(row.cwCatalogId) ?? new Set<string>();
rows.add(row.key);
grouped.set(row.cwCatalogId, rows);
}
}
const state = new Map<number, string>();
for (const [cwCatalogId, rows] of grouped) {
state.set(cwCatalogId, stableStringify([...rows].sort()));
}
return state;
};
const applySyncGuards = (ids: number[]): number[] => {
const now = Date.now();
const cooledIds = ids.filter((cwCatalogId) => {
const last = lastSyncedAt.get(cwCatalogId);
if (!last) return true;
return now - last >= SYNC_COOLDOWN_MS;
});
if (cooledIds.length <= MAX_SYNC_PER_CYCLE) return cooledIds;
return cooledIds.slice(0, MAX_SYNC_PER_CYCLE);
};
const fetchAdjustments = async (): Promise<unknown[]> => {
const response = await withCwRetry(
() => connectWiseApi.get(ADJUSTMENTS_ENDPOINT),
{
label: "inventory-adjustments",
maxAttempts: 3,
},
);
const payload = response.data;
if (Array.isArray(payload)) return payload;
if (isObject(payload) && Array.isArray(payload.data)) return payload.data;
return [];
};
const cacheKey = (cwCatalogId: number) =>
`${CATALOG_ITEM_CACHE_PREFIX}${cwCatalogId}`;
const cwLastUpdated = (item: CatalogItem): Date => {
const value = item._info?.lastUpdated;
if (!value) return new Date();
const parsed = new Date(value);
const invalidDate = Number.isNaN(parsed.getTime());
if (invalidDate) return new Date();
return parsed;
};
const syncCatalogItem = async (cwCatalogId: number): Promise<boolean> => {
try {
const item = await withCwRetry(
() => catalogCw.fetchByCatalogId(cwCatalogId),
{
label: `catalog-item:${cwCatalogId}`,
maxAttempts: 3,
},
);
const onHand = await withCwRetry(
() => catalogCw.fetchInventoryOnHand(cwCatalogId),
{
label: `catalog-onhand:${cwCatalogId}`,
maxAttempts: 3,
},
);
const persisted = await prisma.catalogItem.upsert({
where: { cwCatalogId },
create: {
cwCatalogId,
identifier: item.identifier,
name: item.description,
description: item.description,
customerDescription: item.customerDescription,
internalNotes: item.notes,
category: item.category?.name,
categoryCwId: item.category?.id,
subcategory: item.subcategory?.name,
subcategoryCwId: item.subcategory?.id,
manufacturer: item.manufacturer?.name,
manufactureCwId: item.manufacturer?.id,
partNumber: item.manufacturerPartNumber,
vendorName: item.vendor?.name,
vendorSku: item.vendorSku,
vendorCwId: item.vendor?.id,
price: item.price,
cost: item.cost,
inactive: item.inactiveFlag,
salesTaxable: item.taxableFlag,
onHand,
cwLastUpdated: cwLastUpdated(item),
},
update: {
identifier: item.identifier,
name: item.description,
description: item.description,
customerDescription: item.customerDescription,
internalNotes: item.notes,
category: item.category?.name,
categoryCwId: item.category?.id,
subcategory: item.subcategory?.name,
subcategoryCwId: item.subcategory?.id,
manufacturer: item.manufacturer?.name,
manufactureCwId: item.manufacturer?.id,
partNumber: item.manufacturerPartNumber,
vendorName: item.vendor?.name,
vendorSku: item.vendorSku,
vendorCwId: item.vendor?.id,
price: item.price,
cost: item.cost,
inactive: item.inactiveFlag,
salesTaxable: item.taxableFlag,
onHand,
cwLastUpdated: cwLastUpdated(item),
},
});
await redis.set(
cacheKey(cwCatalogId),
JSON.stringify({
cwCatalogId,
onHand,
cwItem: item,
dbItem: persisted,
syncedAt: new Date().toISOString(),
}),
"EX",
CATALOG_ITEM_CACHE_TTL_SECONDS,
);
return true;
} catch (err) {
console.error(
`[inventory-adjustments] failed to sync catalog item ${cwCatalogId}`,
err,
);
return false;
}
};
export const listenInventoryAdjustments = async (): Promise<void> => {
if (inFlight) return;
inFlight = true;
try {
const rows = await fetchAdjustments();
const current = snapshot(rows);
const currentProductState = productState(current);
if (previous.size === 0) {
previous = current;
previousProductState = currentProductState;
console.log(
`[inventory-adjustments] baseline captured (${current.size} adjustments, ${currentProductState.size} products)`,
);
return;
}
const changedIds = [
...changedCatalogIds(previousProductState, currentProductState),
].sort((a, b) => a - b);
const guardedIds = applySyncGuards(changedIds);
previous = current;
previousProductState = currentProductState;
if (guardedIds.length === 0) return;
let successCount = 0;
for (const cwCatalogId of guardedIds) {
const ok = await syncCatalogItem(cwCatalogId);
if (!ok) continue;
lastSyncedAt.set(cwCatalogId, Date.now());
successCount += 1;
}
const skippedByCooldown = changedIds.length - guardedIds.length;
console.log(
`[inventory-adjustments] inventory changed for ${changedIds.length} products, queued ${guardedIds.length}, synced ${successCount}, cooldown/cap skipped ${skippedByCooldown}`,
);
} catch (err) {
console.error("[inventory-adjustments] listener failed", err);
} finally {
inFlight = false;
}
};
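// Illustrative wiring sketch (not part of this change): the listener is meant
// to be polled periodically; the in-flight guard above turns overlapping
// invocations into no-ops. The interval value here is an assumption.
//
//   setInterval(() => void listenInventoryAdjustments(), 60_000);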
@@ -2,6 +2,31 @@ import { prisma } from "../../../constants";
import { events } from "../../globalEvents";
import { catalogCw } from "./catalog";
const CONCURRENCY = 6;
const BATCH_DELAY_MS = 250;
const sleep = (ms: number) => new Promise((resolve) => setTimeout(resolve, ms));
const runSlowParallel = async (
tasks: Array<() => Promise<void>>,
): Promise<number> => {
let failureCount = 0;
for (let i = 0; i < tasks.length; i += CONCURRENCY) {
const batch = tasks.slice(i, i + CONCURRENCY);
const results = await Promise.allSettled(batch.map((task) => task()));
for (const result of results) {
if (result.status === "rejected") failureCount += 1;
}
if (i + CONCURRENCY >= tasks.length) continue;
await sleep(BATCH_DELAY_MS);
}
return failureCount;
};
export const refreshCatalog = async () => {
events.emit("cw:catalog:refresh:check");
@@ -46,48 +71,43 @@ export const refreshCatalog = async () => {
staleCount: staleIds.length,
});
// 4. Fetch full catalog data, then filter to only stale items
const staleIdSet = new Set(staleIds);
const allCwItems = await catalogCw.fetchAllItemsFromCw();
const allStaleItems = new Map<number, any>();
// 4. Fetch full CW item data for stale IDs using slow, bounded concurrency
const cwItemMap = new Map<number, any>();
const itemFetchTasks: Array<() => Promise<void>> = staleIds.map(
(cwId) => async () => {
const item = await catalogCw.fetchByCatalogId(cwId);
cwItemMap.set(cwId, item);
},
);
const itemFetchFailures = await runSlowParallel(itemFetchTasks);
for (const [id, item] of allCwItems) {
if (staleIdSet.has(id)) {
allStaleItems.set(id, item);
}
}
// 5. Batch fetch inventory onHand for stale items (50 concurrent)
// 5. Fetch inventory onHand for stale IDs using the same slow parallel strategy
const onHandMap = new Map<number, number>();
const batchSize = 50;
for (let i = 0; i < staleIds.length; i += batchSize) {
const batch = staleIds.slice(i, i + batchSize);
await Promise.all(
batch.map(async (cwId) => {
const inventoryTasks: Array<() => Promise<void>> = staleIds.map(
(cwId) => async () => {
try {
const onHand = await catalogCw.fetchInventoryOnHand(cwId);
onHandMap.set(cwId, onHand);
} catch {
onHandMap.set(cwId, 0);
}
}),
},
);
}
const inventoryFailures = await runSlowParallel(inventoryTasks);
// 6. Upsert only the stale/new items
const updatedCount = (
await Promise.all(
staleIds.map(async (cwId) => {
const item = allStaleItems.get(cwId);
if (!item) return null;
// 6. Upsert stale/new items with bounded slow parallel execution
let updatedCount = 0;
const upsertTasks: Array<() => Promise<void>> = staleIds.map(
(cwId) => async () => {
const item = cwItemMap.get(cwId);
if (!item) return;
const cwLastUpdated = item._info?.lastUpdated
? new Date(item._info.lastUpdated)
: new Date();
const onHand = onHandMap.get(cwId) ?? 0;
return await prisma.catalogItem.upsert({
await prisma.catalogItem.upsert({
where: { cwCatalogId: cwId },
create: {
cwCatalogId: cwId,
@@ -137,9 +157,17 @@ export const refreshCatalog = async () => {
cwLastUpdated,
},
});
}),
)
).filter(Boolean).length;
updatedCount += 1;
},
);
const upsertFailures = await runSlowParallel(upsertTasks);
const failedTasks = itemFetchFailures + inventoryFailures + upsertFailures;
if (failedTasks > 0) {
console.warn(
`[catalog-refresh] ${failedTasks} slow-parallel task(s) failed; remaining items will retry next cycle`,
);
}
events.emit("cw:catalog:refresh:completed", {
totalCw: cwSummaries.size,
@@ -2,6 +2,11 @@ import { prisma } from "../../../constants";
import { events } from "../../globalEvents";
import { catalogCw } from "./catalog";
const CONCURRENCY = 6;
const BATCH_DELAY_MS = 250;
const sleep = (ms: number) => new Promise((resolve) => setTimeout(resolve, ms));
export const refreshInventory = async () => {
events.emit("cw:inventory:refresh:check");
@@ -23,13 +28,13 @@ export const refreshInventory = async () => {
totalItems: dbItems.length,
});
// 2. Batch fetch inventory onHand for all items (50 concurrent)
// 2. Slow-parallel fetch inventory onHand for all items
const onHandMap = new Map<number, number>();
const batchSize = 150;
let failedCount = 0;
for (let i = 0; i < dbItems.length; i += batchSize) {
const batch = dbItems.slice(i, i + batchSize);
await Promise.all(
for (let i = 0; i < dbItems.length; i += CONCURRENCY) {
const batch = dbItems.slice(i, i + CONCURRENCY);
const results = await Promise.allSettled(
batch.map(async (item) => {
try {
const onHand = await catalogCw.fetchInventoryOnHand(item.cwCatalogId);
@@ -39,6 +44,13 @@ export const refreshInventory = async () => {
}
}),
);
for (const result of results) {
if (result.status === "rejected") failedCount += 1;
}
if (i + CONCURRENCY >= dbItems.length) continue;
await sleep(BATCH_DELAY_MS);
}
// 3. Only update items where onHand has changed
@@ -71,4 +83,10 @@ export const refreshInventory = async () => {
totalItems: dbItems.length,
updatedCount,
});
if (failedCount > 0) {
console.warn(
`[inventory-refresh] ${failedCount} task(s) failed; fallback values were used and will retry next sweep`,
);
}
};
+90
View File
@@ -0,0 +1,90 @@
/**
* Generic retry wrapper for ConnectWise API calls.
*
* Retries on transient network errors (ECONNABORTED, ECONNRESET, ETIMEDOUT,
* ECONNREFUSED, ERR_NETWORK, ENETUNREACH) and 5xx responses with exponential
* backoff. Non-transient errors (e.g. 404, 400) are re-thrown immediately.
*/
const TRANSIENT_CODES = new Set([
"ECONNABORTED",
"ECONNRESET",
"ETIMEDOUT",
"ECONNREFUSED",
"ERR_NETWORK",
"ENETUNREACH",
]);
export interface CwRetryOptions {
/** Maximum number of attempts (including the first). Default: 3 */
maxAttempts?: number;
/** Base delay in ms before the first retry. Doubles each retry. Default: 1000 */
baseDelayMs?: number;
/** Optional label for log messages. */
label?: string;
}
/**
* Execute `fn` and retry up to `maxAttempts - 1` times on transient
* Axios / network errors.
*/
export async function withCwRetry<T>(
fn: () => Promise<T>,
opts: CwRetryOptions = {},
): Promise<T> {
const { maxAttempts = 3, baseDelayMs = 1_000, label } = opts;
let lastError: unknown;
for (let attempt = 1; attempt <= maxAttempts; attempt++) {
try {
return await fn();
} catch (err) {
lastError = err;
if (!isRetryable(err) || attempt === maxAttempts) throw err;
const delay = baseDelayMs * 2 ** (attempt - 1); // 1s, 2s, 4s …
const tag = label ? `[${label}] ` : "";
console.warn(
`${tag}CW transient error (attempt ${attempt}/${maxAttempts}), retrying in ${delay}ms — ${describeErr(err)}`,
);
await sleep(delay);
}
}
// Should never reach here, but satisfy TS:
throw lastError;
}
// ---------------------------------------------------------------------------
// Helpers
// ---------------------------------------------------------------------------
function isRetryable(err: unknown): boolean {
if (typeof err !== "object" || err === null) return false;
const e = err as Record<string, any>;
if (!e.isAxiosError) return false;
// Retry on known transient codes
if (TRANSIENT_CODES.has(e.code)) return true;
// Also retry on 5xx server errors from CW
const status = e.response?.status;
if (typeof status === "number" && status >= 500) return true;
return false;
}
function describeErr(err: unknown): string {
if (typeof err !== "object" || err === null) return String(err);
const e = err as Record<string, any>;
if (e.isAxiosError) {
const method = (e.config?.method ?? "?").toUpperCase();
const url = e.config?.url ?? "unknown";
return `${method} ${url} ${e.code ?? `HTTP ${e.response?.status}`}`;
}
return (e as Error).message ?? String(err);
}
const sleep = (ms: number) => new Promise((r) => setTimeout(r, ms));
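// Usage sketch (not part of this change): wrapping a single CW GET so that
// transient network errors and 5xx responses retry with backoff while a 404
// still throws (or is handled) immediately. The import path mirrors other
// call sites in this diff.
//
//   import { connectWiseApi } from "../../constants";
//
//   const response = await withCwRetry(
//     () => connectWiseApi.get(`/sales/opportunities/${cwOpportunityId}`),
//     { label: `opp#${cwOpportunityId}`, maxAttempts: 3, baseDelayMs: 1_000 },
//   );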
+16
View File
@@ -178,6 +178,22 @@ interface EventTypes {
staleCount: number;
}) => void;
// Cache Events
"cache:opportunities:refresh:started": (data: {
totalOpportunities: number;
}) => void;
"cache:opportunities:refresh:completed": (data: {
totalOpportunities: number;
activitiesRefreshed: number;
companiesRefreshed: number;
notesRefreshed: number;
contactsRefreshed: number;
productsRefreshed: number;
oppCwDataRefreshed: number;
skipped: number;
}) => void;
"cache:opportunities:refresh:error": (data: { error: unknown }) => void;
// ConnectWise User Defined Fields Events
"cw:udf:refresh:started": () => void;
"cw:udf:refresh:completed": (data: { count: number }) => void;
+7
View File
@@ -383,6 +383,13 @@ export const PERMISSION_NODES = {
],
},
cwCallbacks: {
name: "ConnectWise Callback Routes",
description:
"Inbound ConnectWise callback endpoints. These routes are intentionally unauthenticated and do not require permission nodes.",
permissions: [],
},
sales: {
name: "Sales Permissions",
description: "Permissions for accessing and managing sales opportunities",
+1
View File
@@ -135,6 +135,7 @@ export const QUOTE_STATUSES: QuoteStatus[] = [
48, // PRE2413. Follow-Up Extended
52, // PRE2489. Overdue
55, // PRE24_70. Quote Sent - Sell
57, // 04. Confirmed Quote
],
},
+600
View File
@@ -0,0 +1,600 @@
/**
* Test Script: CW Forecast Item Edit & Partial Cancellation
*
* This script performs read-write operations against the ConnectWise API:
*
* 1. Search all open opportunities for a forecast item with description
* matching "labor Special Order" (case-insensitive).
* 2. Report the current state of that item (price, cost, qty, etc.).
* 3. PATCH the item: revenue 72,000 | cost 8,500 | quantity 67
* 4. Verify the update by re-fetching the forecast.
* 5. Cancel 13 units via the linked procurement product
* (partial cancellation: quantityCancelled = 13).
* 6. Verify the cancellation by re-fetching procurement data.
* 7. Report on every step.
*
* Usage: bun run test-cw-edit-item.ts
*/
import axios from "axios";
const cw = axios.create({
baseURL: "https://ttscw.totaltech.net/v4_6_release/apis/3.0/",
headers: {
Authorization: `Basic ${process.env.CW_BASIC_TOKEN}`,
clientId: `${process.env.CW_CLIENT_ID}`,
"Content-Type": "application/json",
},
timeout: 30_000,
});
// ── Helpers ───────────────────────────────────────────────────────────────────
const log = (label: string, ...args: unknown[]) =>
console.log(`\n[${label}]`, ...args);
const divider = () => console.log("─".repeat(72));
const sleep = (ms: number) => new Promise((r) => setTimeout(r, ms));
const fmt = (n: number) =>
n.toLocaleString("en-US", {
minimumFractionDigits: 2,
maximumFractionDigits: 2,
});
// ── Types (minimal, for this script) ──────────────────────────────────────────
interface ForecastItem {
id: number;
forecastDescription: string;
productDescription: string;
quantity: number;
revenue: number;
cost: number;
margin: number;
forecastType: string;
sequenceNumber: number;
catalogItem?: { id: number; identifier: string };
status?: { id: number; name: string };
opportunity?: { id: number; name: string };
[key: string]: unknown;
}
interface Forecast {
id: number;
forecastItems: ForecastItem[];
[key: string]: unknown;
}
interface ProcurementProduct {
id: number;
forecastDetailId: number;
description: string;
quantity: number;
price: number;
cost: number;
cancelledFlag: boolean;
quantityCancelled: number;
cancelledReason: string | null;
cancelledBy: string | null;
cancelledDate: string | null;
opportunity?: { id: number };
[key: string]: unknown;
}
// ── Main ──────────────────────────────────────────────────────────────────────
async function main() {
divider();
log("START", "CW Forecast Item Edit & Cancellation Test");
log("START", `Timestamp: ${new Date().toISOString()}`);
divider();
// ── Step 1: Find the "labor Special Order" forecast item ────────────────
const OPP_ID = 5150;
log(
"SEARCH",
`Looking for forecast item matching "labor Special Order" on opportunity ${OPP_ID}...`,
);
// Fetch the forecast for opportunity 5150 directly
let targetOppId: number = OPP_ID;
let targetItem: ForecastItem | null = null;
let targetForecast: Forecast | null = null;
const forecastRes = await cw.get(`/sales/opportunities/${OPP_ID}/forecast`);
targetForecast = forecastRes.data as Forecast;
const match = (targetForecast.forecastItems ?? []).find(
(fi: ForecastItem) =>
fi.forecastDescription?.toLowerCase().includes("special order") ||
fi.productDescription?.toLowerCase().includes("special order"),
);
if (match) {
targetItem = match;
log("SEARCH", `✓ FOUND forecast item on opportunity ${OPP_ID}`);
}
if (!targetItem || !targetForecast) {
log(
"SEARCH",
`✗ No "labor Special Order" item found on opportunity ${OPP_ID}.`,
);
log("SEARCH", "All forecast items on this opportunity:");
for (const fi of targetForecast.forecastItems ?? []) {
console.log(
` id=${fi.id} "${fi.forecastDescription}" / "${fi.productDescription}"`,
);
}
log("SEARCH", "Aborting.");
process.exit(1);
}
// ── Step 2: Report current state ────────────────────────────────────────
divider();
log("CURRENT STATE", "Forecast item details BEFORE edit:");
console.log(` Opportunity ID: ${targetOppId}`);
console.log(` Forecast Item ID: ${targetItem.id}`);
console.log(` Forecast Description: ${targetItem.forecastDescription}`);
console.log(` Product Description: ${targetItem.productDescription}`);
console.log(
` Catalog Item: ${targetItem.catalogItem?.identifier ?? "(none)"} (cwId=${targetItem.catalogItem?.id ?? "N/A"})`,
);
console.log(` Forecast Type: ${targetItem.forecastType}`);
console.log(
` Status: ${targetItem.status?.name ?? "?"} (id=${targetItem.status?.id ?? "?"})`,
);
console.log(` Sequence Number: ${targetItem.sequenceNumber}`);
console.log(` ──────────────────────────────────`);
console.log(` Quantity: ${targetItem.quantity}`);
console.log(` Revenue (Price): $${fmt(targetItem.revenue)}`);
console.log(` Cost: $${fmt(targetItem.cost)}`);
console.log(` Margin: $${fmt(targetItem.margin)}`);
// Also report all items on this opportunity for context
const allItems = targetForecast.forecastItems ?? [];
log(
"CONTEXT",
`Total forecast items on this opportunity: ${allItems.length}`,
);
for (const fi of allItems) {
const marker = fi.id === targetItem.id ? " ◀ TARGET" : "";
console.log(
` [${fi.sequenceNumber}] id=${fi.id} "${fi.forecastDescription}" ` +
`qty=${fi.quantity} rev=$${fmt(fi.revenue)} cost=$${fmt(fi.cost)}${marker}`,
);
}
// ── Step 3: PATCH the forecast item ─────────────────────────────────────
divider();
const UNIT_PRICE = 72_000;
const UNIT_COST = 8_500;
const QTY = 67;
const TOTAL_REVENUE = UNIT_PRICE * QTY; // $4,824,000
const TOTAL_COST = UNIT_COST * QTY; // $569,500
log("EDIT", "Patching forecast item...");
log(
"EDIT",
` Unit price: $${fmt(UNIT_PRICE)} × ${QTY} = $${fmt(TOTAL_REVENUE)} (revenue)`,
);
log(
"EDIT",
` Unit cost: $${fmt(UNIT_COST)} × ${QTY} = $${fmt(TOTAL_COST)} (cost)`,
);
log("EDIT", ` Quantity: ${QTY}`);
// Find the index of our target item in the forecast array
const forecastItems = targetForecast.forecastItems ?? [];
const targetIdx = forecastItems.findIndex((fi) => fi.id === targetItem!.id);
if (targetIdx === -1) {
log(
"EDIT",
"✗ Could not find target item index in forecast array. Aborting.",
);
process.exit(1);
}
log("EDIT", `Target item is at index ${targetIdx} in forecastItems array.`);
const patchOps = [
{
op: "replace",
path: `/forecastItems/${targetIdx}/revenue`,
value: TOTAL_REVENUE,
},
{
op: "replace",
path: `/forecastItems/${targetIdx}/cost`,
value: TOTAL_COST,
},
{ op: "replace", path: `/forecastItems/${targetIdx}/quantity`, value: QTY },
];
log("EDIT", "Patch operations:");
for (const op of patchOps) {
console.log(`  ${op.op} ${op.path} = ${op.value}`);
}
try {
const patchRes = await cw.patch(
`/sales/opportunities/${targetOppId}/forecast`,
patchOps,
);
const updatedForecast: Forecast = patchRes.data;
const updatedItem = (updatedForecast.forecastItems ?? [])[targetIdx];
if (!updatedItem) {
log("EDIT", "✗ Item not found at expected index after PATCH.");
} else {
log("EDIT", "✓ PATCH successful. Updated item:");
console.log(` Forecast Item ID: ${updatedItem.id}`);
console.log(` Forecast Description: ${updatedItem.forecastDescription}`);
console.log(` Quantity: ${updatedItem.quantity}`);
console.log(` Revenue (Price): $${fmt(updatedItem.revenue)}`);
console.log(` Cost: $${fmt(updatedItem.cost)}`);
console.log(` Margin: $${fmt(updatedItem.margin)}`);
// Verify values match what we set
const checks = [
{
field: "revenue",
expected: TOTAL_REVENUE,
actual: updatedItem.revenue,
},
{ field: "cost", expected: TOTAL_COST, actual: updatedItem.cost },
{ field: "quantity", expected: QTY, actual: updatedItem.quantity },
];
log("VERIFY EDIT", "Checking values match requested:");
for (const check of checks) {
const ok = check.actual === check.expected;
console.log(
` ${ok ? "✓" : "✗"} ${check.field}: expected=${check.expected}, actual=${check.actual}`,
);
}
// Update our reference for the cancellation step
targetItem = updatedItem;
}
} catch (err: any) {
log("EDIT", `✗ PATCH failed: ${err.response?.status ?? err.message}`);
if (err.response?.data) {
console.log(" Response:", JSON.stringify(err.response.data, null, 2));
}
// If quantity PATCH failed (read-only), try without quantity
if (err.response?.status === 400 || err.response?.status === 422) {
log(
"EDIT",
"Retrying without quantity (may be read-only on forecast items)...",
);
const retryOps = patchOps.filter((op) => !op.path.endsWith("/quantity"));
try {
const retryRes = await cw.patch(
`/sales/opportunities/${targetOppId}/forecast`,
retryOps,
);
const retryForecast: Forecast = retryRes.data;
const retryItem = (retryForecast.forecastItems ?? [])[targetIdx];
if (retryItem) {
log(
"EDIT",
"✓ Retry PATCH successful (without quantity). Updated item:",
);
console.log(
` Quantity: ${retryItem.quantity} (unchanged — read-only)`,
);
console.log(` Revenue (Price): $${fmt(retryItem.revenue)}`);
console.log(` Cost: $${fmt(retryItem.cost)}`);
console.log(` Margin: $${fmt(retryItem.margin)}`);
targetItem = retryItem;
}
} catch (retryErr: any) {
log(
"EDIT",
`✗ Retry also failed: ${retryErr.response?.status ?? retryErr.message}`,
);
if (retryErr.response?.data) {
console.log(
" Response:",
JSON.stringify(retryErr.response.data, null, 2),
);
}
}
}
}
// ── Step 4: Re-fetch and confirm final forecast state ───────────────────
divider();
log("RE-FETCH", "Fetching forecast to confirm final state...");
await sleep(500);
const confirmRes = await cw.get(
`/sales/opportunities/${targetOppId}/forecast`,
);
const confirmedForecast: Forecast = confirmRes.data;
const confirmedItem = (confirmedForecast.forecastItems ?? []).find(
(fi) => fi.id === targetItem!.id,
);
if (confirmedItem) {
log("CONFIRMED STATE", "Forecast item after edit:");
console.log(` Forecast Item ID: ${confirmedItem.id}`);
console.log(` Forecast Description: ${confirmedItem.forecastDescription}`);
console.log(` Quantity: ${confirmedItem.quantity}`);
console.log(` Revenue (Price): $${fmt(confirmedItem.revenue)}`);
console.log(` Cost: $${fmt(confirmedItem.cost)}`);
console.log(` Margin: $${fmt(confirmedItem.margin)}`);
} else {
log(
"CONFIRMED STATE",
"⚠ Could not find item by original ID — it may have been regenerated.",
);
log("CONFIRMED STATE", "All current forecast items:");
for (const fi of confirmedForecast.forecastItems ?? []) {
console.log(
` id=${fi.id} "${fi.forecastDescription}" qty=${fi.quantity} rev=$${fmt(fi.revenue)} cost=$${fmt(fi.cost)}`,
);
}
}
// ── Step 5: Cancel 13 items via procurement product ─────────────────────
divider();
log("CANCEL", "Cancelling 13 units on this item via procurement product...");
// First, find existing procurement products linked to this opportunity
const procRes = await cw.get(
`/procurement/products?conditions=${encodeURIComponent(`opportunity/id=${targetOppId}`)}&pageSize=1000`,
);
const procProducts: ProcurementProduct[] = procRes.data;
log(
"CANCEL",
`Found ${procProducts.length} procurement product(s) on this opportunity.`,
);
if (procProducts.length > 0) {
for (const pp of procProducts) {
console.log(
` Proc id=${pp.id} forecastDetailId=${pp.forecastDetailId} ` +
`"${pp.description}" qty=${pp.quantity} price=$${fmt(pp.price ?? 0)} ` +
`cancelled=${pp.cancelledFlag} qtyCancelled=${pp.quantityCancelled}`,
);
}
}
// Find the procurement product linked to our forecast item
const linkedProc = procProducts.find(
(pp) => pp.forecastDetailId === targetItem!.id,
);
if (linkedProc) {
log("CANCEL", `Found linked procurement product: id=${linkedProc.id}`);
log(
"CANCEL",
`Current state: cancelled=${linkedProc.cancelledFlag}, quantityCancelled=${linkedProc.quantityCancelled}`,
);
log("CANCEL", "Patching: quantityCancelled → 13, cancelledFlag → true");
try {
const cancelRes = await cw.patch(
`/procurement/products/${linkedProc.id}`,
[
{ op: "replace", path: "cancelledFlag", value: true },
{ op: "replace", path: "quantityCancelled", value: 13 },
{
op: "replace",
path: "cancelledReason",
value: "Test cancellation — 13 units",
},
],
);
log("CANCEL", "✓ Cancellation PATCH successful.");
console.log(` cancelledFlag: ${cancelRes.data.cancelledFlag}`);
console.log(` quantityCancelled: ${cancelRes.data.quantityCancelled}`);
console.log(` cancelledReason: ${cancelRes.data.cancelledReason}`);
console.log(
` cancelledBy: ${cancelRes.data.cancelledBy ?? "N/A"}`,
);
console.log(
` cancelledDate: ${cancelRes.data.cancelledDate ?? "N/A"}`,
);
} catch (err: any) {
log(
"CANCEL",
`✗ Cancellation PATCH failed: ${err.response?.status ?? err.message}`,
);
if (err.response?.data) {
console.log(" Response:", JSON.stringify(err.response.data, null, 2));
}
}
} else {
log(
"CANCEL",
`No procurement product linked to forecast item id=${targetItem!.id}.`,
);
log(
"CANCEL",
"Creating a procurement product first, then cancelling 13...",
);
try {
// Create a procurement product linked to this forecast item
const createProcRes = await cw.post("/procurement/products", {
catalogItem: targetItem!.catalogItem?.id
? { id: targetItem!.catalogItem.id }
: undefined,
description:
targetItem!.forecastDescription || targetItem!.productDescription,
quantity: targetItem!.quantity || 67,
price: targetItem!.revenue || 72_000,
cost: targetItem!.cost || 8_500,
billableOption: "Billable",
opportunity: { id: targetOppId },
forecastDetailId: targetItem!.id,
});
const newProc = createProcRes.data;
log("CANCEL", `✓ Created procurement product id=${newProc.id}`);
console.log(` forecastDetailId: ${newProc.forecastDetailId}`);
console.log(` description: ${newProc.description}`);
console.log(` quantity: ${newProc.quantity}`);
console.log(` price: $${fmt(newProc.price ?? 0)}`);
console.log(` cost: $${fmt(newProc.cost ?? 0)}`);
// Now cancel 13 units
log("CANCEL", "Patching procurement product: quantityCancelled → 13...");
const cancelRes = await cw.patch(`/procurement/products/${newProc.id}`, [
{ op: "replace", path: "cancelledFlag", value: true },
{ op: "replace", path: "quantityCancelled", value: 13 },
{
op: "replace",
path: "cancelledReason",
value: "Test cancellation — 13 units",
},
]);
log("CANCEL", "✓ Cancellation PATCH successful.");
console.log(` cancelledFlag: ${cancelRes.data.cancelledFlag}`);
console.log(` quantityCancelled: ${cancelRes.data.quantityCancelled}`);
console.log(` cancelledReason: ${cancelRes.data.cancelledReason}`);
console.log(
` cancelledBy: ${cancelRes.data.cancelledBy ?? "N/A"}`,
);
console.log(
` cancelledDate: ${cancelRes.data.cancelledDate ?? "N/A"}`,
);
} catch (err: any) {
log("CANCEL", `✗ Failed: ${err.response?.status ?? err.message}`);
if (err.response?.data) {
console.log(" Response:", JSON.stringify(err.response.data, null, 2));
}
}
}
// ── Step 6: Final verification ──────────────────────────────────────────
divider();
log("FINAL VERIFY", "Re-fetching all data for final report...");
await sleep(500);
// Re-fetch forecast
const finalForecastRes = await cw.get(
`/sales/opportunities/${targetOppId}/forecast`,
);
const finalForecast: Forecast = finalForecastRes.data;
const finalItem =
(finalForecast.forecastItems ?? []).find(
(fi) => fi.id === targetItem!.id,
) ??
(finalForecast.forecastItems ?? []).find(
(fi) =>
fi.forecastDescription?.toLowerCase().includes("special order") ||
fi.productDescription?.toLowerCase().includes("special order"),
);
// Re-fetch procurement
const finalProcRes = await cw.get(
`/procurement/products?conditions=${encodeURIComponent(`opportunity/id=${targetOppId}`)}&pageSize=1000`,
);
const finalProcs: ProcurementProduct[] = finalProcRes.data;
log("FINAL STATE — FORECAST ITEM", "");
if (finalItem) {
console.log(` Forecast Item ID: ${finalItem.id}`);
console.log(` Forecast Description: ${finalItem.forecastDescription}`);
console.log(` Quantity: ${finalItem.quantity}`);
console.log(` Revenue (Price): $${fmt(finalItem.revenue)}`);
console.log(` Cost: $${fmt(finalItem.cost)}`);
console.log(` Margin: $${fmt(finalItem.margin)}`);
} else {
console.log(" ⚠ Target item not found in final forecast.");
}
log("FINAL STATE — PROCUREMENT", `${finalProcs.length} product(s):`);
for (const pp of finalProcs) {
console.log(
` id=${pp.id} forecastDetailId=${pp.forecastDetailId} ` +
`"${pp.description}" qty=${pp.quantity} cancelled=${pp.cancelledFlag} ` +
`qtyCancelled=${pp.quantityCancelled} reason="${pp.cancelledReason ?? ""}"`,
);
}
// ── Summary ─────────────────────────────────────────────────────────────
divider();
log("SUMMARY", "");
// After cancelling 13 of 67, CW recalculates totals for remaining 54 units
const expectedFinalRevenue = Math.round(UNIT_PRICE * (QTY - 13) * 100) / 100;
const expectedFinalCost = Math.round(UNIT_COST * (QTY - 13) * 100) / 100;
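// With the defaults above: $72,000 × 54 = $3,888,000 expected revenue and $8,500 × 54 = $459,000 expected cost.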
const editOk = finalItem
? Math.abs(finalItem.revenue - expectedFinalRevenue) < 1 &&
Math.abs(finalItem.cost - expectedFinalCost) < 1
: false;
const qtyOk = finalItem ? finalItem.quantity === QTY : false;
if (finalItem) {
console.log(
` Expected final revenue ($${fmt(UNIT_PRICE)} × ${QTY - 13}): $${fmt(expectedFinalRevenue)}`,
);
console.log(
` Actual final revenue: $${fmt(finalItem.revenue)}`,
);
console.log(
` Expected final cost ($${fmt(UNIT_COST)} × ${QTY - 13}): $${fmt(expectedFinalCost)}`,
);
console.log(
` Actual final cost: $${fmt(finalItem.cost)}`,
);
}
const cancelOk = finalProcs.some(
(pp) =>
pp.forecastDetailId === targetItem!.id &&
pp.cancelledFlag === true &&
pp.quantityCancelled === 13,
);
console.log(
` Unit price $${fmt(UNIT_PRICE)}/ea: `,
editOk ? "✓ PASS" : "✗ FAIL",
);
console.log(
` Unit cost $${fmt(UNIT_COST)}/ea: `,
editOk ? "✓ PASS" : "✗ FAIL",
);
console.log(
` Quantity set to ${QTY}: `,
qtyOk ? "✓ PASS" : "✗ FAIL (may be read-only)",
);
console.log(
" 13 units cancelled: ",
cancelOk ? "✓ PASS" : "✗ FAIL",
);
const allPass = editOk && qtyOk && cancelOk;
divider();
log(
"RESULT",
allPass
? "✓ ALL CHECKS PASSED"
: "⚠ SOME CHECKS DID NOT PASS — review output above",
);
divider();
}
main().catch((err) => {
console.error("\n[FATAL]", err.response?.data ?? err.message);
process.exit(1);
});
+477
View File
@@ -0,0 +1,477 @@
import { describe, test, expect } from "bun:test";
import {
computeCacheTTL,
TTL_HIGH_ACTIVITY,
TTL_MODERATE_ACTIVITY,
TTL_LOW_ACTIVITY,
} from "../../src/modules/algorithms/computeCacheTTL";
// ---------------------------------------------------------------------------
// Helpers
// ---------------------------------------------------------------------------
/** Fixed reference point so tests are deterministic. */
const NOW = new Date("2026-03-02T12:00:00Z");
/** Return a Date offset from NOW by `days` (negative = past, positive = future). */
const daysFromNow = (days: number): Date =>
new Date(NOW.getTime() + days * 24 * 60 * 60 * 1000);
// ---------------------------------------------------------------------------
// Rule 1a — Closed records older than 30 days should not be cached
// ---------------------------------------------------------------------------
describe("computeCacheTTL — Rule 1a: Closed records (>30 days)", () => {
test("returns null when closedFlag is true and closedDate is null", () => {
expect(
computeCacheTTL({
closedFlag: true,
closedDate: null,
expectedCloseDate: null,
lastUpdated: null,
now: NOW,
}),
).toBeNull();
});
test("returns null when closedFlag is true and closedDate is 60 days ago", () => {
expect(
computeCacheTTL({
closedFlag: true,
closedDate: daysFromNow(-60),
expectedCloseDate: daysFromNow(-1),
lastUpdated: daysFromNow(-1),
now: NOW,
}),
).toBeNull();
});
test("returns null when closedFlag is true and closedDate is 31 days ago", () => {
expect(
computeCacheTTL({
closedFlag: true,
closedDate: new Date(NOW.getTime() - 31 * 24 * 60 * 60 * 1000),
expectedCloseDate: daysFromNow(2),
lastUpdated: null,
now: NOW,
}),
).toBeNull();
});
});
// ---------------------------------------------------------------------------
// Rule 1b — Recently closed (within 30 days) → 15 minutes
// ---------------------------------------------------------------------------
describe("computeCacheTTL — Rule 1b: Recently closed (≤30 days)", () => {
test("returns 15min when closed 1 day ago", () => {
expect(
computeCacheTTL({
closedFlag: true,
closedDate: daysFromNow(-1),
expectedCloseDate: null,
lastUpdated: null,
now: NOW,
}),
).toBe(TTL_LOW_ACTIVITY);
});
test("returns 15min when closed 15 days ago", () => {
expect(
computeCacheTTL({
closedFlag: true,
closedDate: daysFromNow(-15),
expectedCloseDate: null,
lastUpdated: null,
now: NOW,
}),
).toBe(TTL_LOW_ACTIVITY);
});
test("returns 15min when closed exactly 30 days ago", () => {
expect(
computeCacheTTL({
closedFlag: true,
closedDate: daysFromNow(-30),
expectedCloseDate: null,
lastUpdated: null,
now: NOW,
}),
).toBe(TTL_LOW_ACTIVITY);
});
test("returns 15min when closed today even with recent activity dates", () => {
expect(
computeCacheTTL({
closedFlag: true,
closedDate: NOW,
expectedCloseDate: daysFromNow(-1),
lastUpdated: NOW,
now: NOW,
}),
).toBe(TTL_LOW_ACTIVITY);
});
test("just past 30-day boundary returns null", () => {
const justPast30Days = new Date(
NOW.getTime() - 30 * 24 * 60 * 60 * 1000 - 1,
);
expect(
computeCacheTTL({
closedFlag: true,
closedDate: justPast30Days,
expectedCloseDate: null,
lastUpdated: null,
now: NOW,
}),
).toBeNull();
});
});
// ---------------------------------------------------------------------------
// Rule 2 — High activity (within 5 days) → 30 seconds
// ---------------------------------------------------------------------------
describe("computeCacheTTL — Rule 2: High activity (≤5 days)", () => {
test("returns 30s when lastUpdated is today", () => {
expect(
computeCacheTTL({
closedFlag: false,
closedDate: null,
expectedCloseDate: null,
lastUpdated: NOW,
now: NOW,
}),
).toBe(TTL_HIGH_ACTIVITY);
});
test("returns 30s when lastUpdated is 3 days ago", () => {
expect(
computeCacheTTL({
closedFlag: false,
closedDate: null,
expectedCloseDate: null,
lastUpdated: daysFromNow(-3),
now: NOW,
}),
).toBe(TTL_HIGH_ACTIVITY);
});
test("returns 30s when lastUpdated is exactly 5 days ago", () => {
expect(
computeCacheTTL({
closedFlag: false,
closedDate: null,
expectedCloseDate: null,
lastUpdated: daysFromNow(-5),
now: NOW,
}),
).toBe(TTL_HIGH_ACTIVITY);
});
test("returns 30s when expectedCloseDate is 2 days in the future", () => {
expect(
computeCacheTTL({
closedFlag: false,
closedDate: null,
expectedCloseDate: daysFromNow(2),
lastUpdated: null,
now: NOW,
}),
).toBe(TTL_HIGH_ACTIVITY);
});
test("returns 30s when expectedCloseDate is 5 days in the future", () => {
expect(
computeCacheTTL({
closedFlag: false,
closedDate: null,
expectedCloseDate: daysFromNow(5),
lastUpdated: null,
now: NOW,
}),
).toBe(TTL_HIGH_ACTIVITY);
});
test("returns 30s when expectedCloseDate is 4 days ago (recently passed)", () => {
expect(
computeCacheTTL({
closedFlag: false,
closedDate: null,
expectedCloseDate: daysFromNow(-4),
lastUpdated: null,
now: NOW,
}),
).toBe(TTL_HIGH_ACTIVITY);
});
test("returns 30s when either date is within 5 days (lastUpdated wins)", () => {
expect(
computeCacheTTL({
closedFlag: false,
closedDate: null,
expectedCloseDate: daysFromNow(-30),
lastUpdated: daysFromNow(-2),
now: NOW,
}),
).toBe(TTL_HIGH_ACTIVITY);
});
test("returns 30s when either date is within 5 days (expectedCloseDate wins)", () => {
expect(
computeCacheTTL({
closedFlag: false,
closedDate: null,
expectedCloseDate: daysFromNow(3),
lastUpdated: daysFromNow(-30),
now: NOW,
}),
).toBe(TTL_HIGH_ACTIVITY);
});
});
// ---------------------------------------------------------------------------
// Rule 3 — Moderate activity (within 14 days but > 5 days) → 60 seconds
// ---------------------------------------------------------------------------
describe("computeCacheTTL — Rule 3: Moderate activity (614 days)", () => {
test("returns 60s when lastUpdated is 6 days ago", () => {
expect(
computeCacheTTL({
closedFlag: false,
closedDate: null,
expectedCloseDate: null,
lastUpdated: daysFromNow(-6),
now: NOW,
}),
).toBe(TTL_MODERATE_ACTIVITY);
});
test("returns 60s when lastUpdated is 10 days ago", () => {
expect(
computeCacheTTL({
closedFlag: false,
closedDate: null,
expectedCloseDate: null,
lastUpdated: daysFromNow(-10),
now: NOW,
}),
).toBe(TTL_MODERATE_ACTIVITY);
});
test("returns 60s when lastUpdated is exactly 14 days ago", () => {
expect(
computeCacheTTL({
closedFlag: false,
closedDate: null,
expectedCloseDate: null,
lastUpdated: daysFromNow(-14),
now: NOW,
}),
).toBe(TTL_MODERATE_ACTIVITY);
});
test("returns 60s when expectedCloseDate is 8 days in the future", () => {
expect(
computeCacheTTL({
closedFlag: false,
closedDate: null,
expectedCloseDate: daysFromNow(8),
lastUpdated: null,
now: NOW,
}),
).toBe(TTL_MODERATE_ACTIVITY);
});
test("returns 60s when expectedCloseDate is 14 days in the future", () => {
expect(
computeCacheTTL({
closedFlag: false,
closedDate: null,
expectedCloseDate: daysFromNow(14),
lastUpdated: null,
now: NOW,
}),
).toBe(TTL_MODERATE_ACTIVITY);
});
test("returns 60s when expectedCloseDate is 12 days ago", () => {
expect(
computeCacheTTL({
closedFlag: false,
closedDate: null,
expectedCloseDate: daysFromNow(-12),
lastUpdated: null,
now: NOW,
}),
).toBe(TTL_MODERATE_ACTIVITY);
});
});
// ---------------------------------------------------------------------------
// Rule 4 — Low activity (older than 14 days) → 15 minutes
// ---------------------------------------------------------------------------
describe("computeCacheTTL — Rule 4: Low activity (>14 days)", () => {
test("returns 15min when lastUpdated is 15 days ago", () => {
expect(
computeCacheTTL({
closedFlag: false,
closedDate: null,
expectedCloseDate: null,
lastUpdated: daysFromNow(-15),
now: NOW,
}),
).toBe(TTL_LOW_ACTIVITY);
});
test("returns 15min when lastUpdated is 60 days ago", () => {
expect(
computeCacheTTL({
closedFlag: false,
closedDate: null,
expectedCloseDate: null,
lastUpdated: daysFromNow(-60),
now: NOW,
}),
).toBe(TTL_LOW_ACTIVITY);
});
test("returns 15min when expectedCloseDate is 20 days in the future", () => {
expect(
computeCacheTTL({
closedFlag: false,
closedDate: null,
expectedCloseDate: daysFromNow(20),
lastUpdated: null,
now: NOW,
}),
).toBe(TTL_LOW_ACTIVITY);
});
test("returns 15min when both dates are null", () => {
expect(
computeCacheTTL({
closedFlag: false,
closedDate: null,
expectedCloseDate: null,
lastUpdated: null,
now: NOW,
}),
).toBe(TTL_LOW_ACTIVITY);
});
test("returns 15min when both dates are far in the past", () => {
expect(
computeCacheTTL({
closedFlag: false,
closedDate: null,
expectedCloseDate: daysFromNow(-100),
lastUpdated: daysFromNow(-90),
now: NOW,
}),
).toBe(TTL_LOW_ACTIVITY);
});
});
// ---------------------------------------------------------------------------
// Edge cases
// ---------------------------------------------------------------------------
describe("computeCacheTTL — edge cases", () => {
test("defaults `now` to current time when omitted", () => {
// Open, no dates → should return LOW_ACTIVITY (15min)
const result = computeCacheTTL({
closedFlag: false,
closedDate: null,
expectedCloseDate: null,
lastUpdated: null,
});
expect(result).toBe(TTL_LOW_ACTIVITY);
});
test("5-day boundary is inclusive", () => {
// Exactly 5 days should match high activity
expect(
computeCacheTTL({
closedFlag: false,
closedDate: null,
expectedCloseDate: null,
lastUpdated: daysFromNow(-5),
now: NOW,
}),
).toBe(TTL_HIGH_ACTIVITY);
});
test("just past 5-day boundary falls to moderate", () => {
// 5 days + 1 millisecond past → moderate
const justPast5Days = new Date(NOW.getTime() - 5 * 24 * 60 * 60 * 1000 - 1);
expect(
computeCacheTTL({
closedFlag: false,
closedDate: null,
expectedCloseDate: null,
lastUpdated: justPast5Days,
now: NOW,
}),
).toBe(TTL_MODERATE_ACTIVITY);
});
test("14-day boundary is inclusive", () => {
expect(
computeCacheTTL({
closedFlag: false,
closedDate: null,
expectedCloseDate: null,
lastUpdated: daysFromNow(-14),
now: NOW,
}),
).toBe(TTL_MODERATE_ACTIVITY);
});
test("just past 14-day boundary falls to low activity", () => {
const justPast14Days = new Date(
NOW.getTime() - 14 * 24 * 60 * 60 * 1000 - 1,
);
expect(
computeCacheTTL({
closedFlag: false,
closedDate: null,
expectedCloseDate: null,
lastUpdated: justPast14Days,
now: NOW,
}),
).toBe(TTL_LOW_ACTIVITY);
});
test("higher-priority rule wins when both dates span different tiers", () => {
// expectedCloseDate in 5-day window, lastUpdated in 14-day window → 30s
expect(
computeCacheTTL({
closedFlag: false,
closedDate: null,
expectedCloseDate: daysFromNow(3),
lastUpdated: daysFromNow(-10),
now: NOW,
}),
).toBe(TTL_HIGH_ACTIVITY);
});
test("closed >30 days always returns null regardless of other dates", () => {
expect(
computeCacheTTL({
closedFlag: true,
closedDate: daysFromNow(-60),
expectedCloseDate: NOW,
lastUpdated: NOW,
now: NOW,
}),
).toBeNull();
});
test("recently closed always returns 15min regardless of activity dates", () => {
expect(
computeCacheTTL({
closedFlag: true,
closedDate: daysFromNow(-5),
expectedCloseDate: NOW,
lastUpdated: NOW,
now: NOW,
}),
).toBe(TTL_LOW_ACTIVITY);
});
});
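// ---------------------------------------------------------------------------
// Reference sketch — a minimal implementation consistent with the assertions
// above, shown only to make the rule order explicit. The real algorithm lives
// in src/modules/algorithms/computeCacheTTL; the concrete TTL values
// (30 s / 60 s / 15 min) are assumptions inferred from the describe titles.
// ---------------------------------------------------------------------------
type CacheTTLInputSketch = {
  closedFlag: boolean;
  closedDate: Date | null;
  expectedCloseDate: Date | null;
  lastUpdated: Date | null;
  now?: Date;
};

function computeCacheTTLSketch(input: CacheTTLInputSketch): number | null {
  const DAY = 24 * 60 * 60 * 1000;
  const now = input.now ?? new Date();
  // Distance (in ms) between `now` and a date, in either direction; Infinity when unset.
  const distance = (d: Date | null) =>
    d ? Math.abs(now.getTime() - d.getTime()) : Infinity;

  // Rule 1 — closed opportunities: don't cache after 30 days, otherwise use the low tier.
  if (input.closedFlag) {
    if (!input.closedDate || distance(input.closedDate) > 30 * DAY) return null;
    return 15 * 60 * 1000; // TTL_LOW_ACTIVITY (assumed 15 min)
  }

  // Rules 2–4 — whichever activity signal is closest to `now` picks the tier.
  const activity = Math.min(
    distance(input.lastUpdated),
    distance(input.expectedCloseDate),
  );
  if (activity <= 5 * DAY) return 30_000; // TTL_HIGH_ACTIVITY (assumed 30 s)
  if (activity <= 14 * DAY) return 60_000; // TTL_MODERATE_ACTIVITY (assumed 60 s)
  return 15 * 60 * 1000; // TTL_LOW_ACTIVITY
}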
+188
View File
@@ -0,0 +1,188 @@
import { describe, test, expect } from "bun:test";
import {
computeProductsCacheTTL,
PRODUCTS_TTL_HOT,
PRODUCTS_TTL_LAZY,
WON_LOST_STATUS_IDS,
} from "../../src/modules/algorithms/computeProductsCacheTTL";
const NOW = new Date("2026-03-02T12:00:00Z");
const DAY_MS = 24 * 60 * 60 * 1000;
describe("computeProductsCacheTTL", () => {
// -- Constants ----------------------------------------------------------
test("PRODUCTS_TTL_HOT is 45 seconds", () => {
expect(PRODUCTS_TTL_HOT).toBe(45_000);
});
test("PRODUCTS_TTL_LAZY is 20 minutes", () => {
expect(PRODUCTS_TTL_LAZY).toBe(1_200_000);
});
// -- Won/Lost status set ------------------------------------------------
test("WON_LOST_STATUS_IDS contains Won canonical ID (29) and Pending Won (49)", () => {
expect(WON_LOST_STATUS_IDS.has(29)).toBe(true);
expect(WON_LOST_STATUS_IDS.has(49)).toBe(true);
});
test("WON_LOST_STATUS_IDS contains Lost canonical ID (53) and Pending Lost (50)", () => {
expect(WON_LOST_STATUS_IDS.has(53)).toBe(true);
expect(WON_LOST_STATUS_IDS.has(50)).toBe(true);
});
test("WON_LOST_STATUS_IDS does not contain Active (58) or New (24)", () => {
expect(WON_LOST_STATUS_IDS.has(58)).toBe(false);
expect(WON_LOST_STATUS_IDS.has(24)).toBe(false);
});
// -- Rule 1: Won/Lost/Pending Won/Lost → null --------------------------
test("returns null for Won status (CW ID 29)", () => {
const result = computeProductsCacheTTL({
statusCwId: 29,
closedFlag: true,
closedDate: new Date(NOW.getTime() - 2 * DAY_MS),
expectedCloseDate: null,
lastUpdated: new Date(NOW.getTime() - 1 * DAY_MS),
now: NOW,
});
expect(result).toBeNull();
});
test("returns null for Pending Won status (CW ID 49)", () => {
const result = computeProductsCacheTTL({
statusCwId: 49,
closedFlag: false,
closedDate: null,
expectedCloseDate: null,
lastUpdated: new Date(NOW.getTime() - 1 * DAY_MS),
now: NOW,
});
expect(result).toBeNull();
});
test("returns null for Lost status (CW ID 53)", () => {
const result = computeProductsCacheTTL({
statusCwId: 53,
closedFlag: true,
closedDate: new Date(NOW.getTime() - 5 * DAY_MS),
expectedCloseDate: null,
lastUpdated: null,
now: NOW,
});
expect(result).toBeNull();
});
test("returns null for Pending Lost status (CW ID 50)", () => {
const result = computeProductsCacheTTL({
statusCwId: 50,
closedFlag: false,
closedDate: null,
expectedCloseDate: null,
lastUpdated: new Date(NOW.getTime() - 1 * DAY_MS),
now: NOW,
});
expect(result).toBeNull();
});
// -- Rule 2: Opp not cacheable → null ----------------------------------
test("returns null when opp is closed > 30 days (main cache null)", () => {
const result = computeProductsCacheTTL({
statusCwId: 58, // Active — but closed flag overrides
closedFlag: true,
closedDate: new Date(NOW.getTime() - 60 * DAY_MS),
expectedCloseDate: null,
lastUpdated: null,
now: NOW,
});
expect(result).toBeNull();
});
// -- Rule 3: Updated within 3 days → 45s -------------------------------
test("returns PRODUCTS_TTL_HOT when lastUpdated is within 3 days", () => {
const result = computeProductsCacheTTL({
statusCwId: 58,
closedFlag: false,
closedDate: null,
expectedCloseDate: null,
lastUpdated: new Date(NOW.getTime() - 1 * DAY_MS),
now: NOW,
});
expect(result).toBe(PRODUCTS_TTL_HOT);
});
test("returns PRODUCTS_TTL_HOT when lastUpdated is exactly 3 days ago", () => {
const result = computeProductsCacheTTL({
statusCwId: 58,
closedFlag: false,
closedDate: null,
expectedCloseDate: null,
lastUpdated: new Date(NOW.getTime() - 3 * DAY_MS),
now: NOW,
});
expect(result).toBe(PRODUCTS_TTL_HOT);
});
// -- Rule 4: Everything else → 20 min ----------------------------------
test("returns PRODUCTS_TTL_LAZY when lastUpdated is > 3 days ago", () => {
const result = computeProductsCacheTTL({
statusCwId: 58,
closedFlag: false,
closedDate: null,
expectedCloseDate: null,
lastUpdated: new Date(NOW.getTime() - 10 * DAY_MS),
now: NOW,
});
expect(result).toBe(PRODUCTS_TTL_LAZY);
});
test("returns PRODUCTS_TTL_LAZY when no lastUpdated is set", () => {
const result = computeProductsCacheTTL({
statusCwId: 58,
closedFlag: false,
closedDate: null,
expectedCloseDate: null,
lastUpdated: null,
now: NOW,
});
expect(result).toBe(PRODUCTS_TTL_LAZY);
});
test("returns PRODUCTS_TTL_LAZY for recently-closed (within 30 days) non-won/lost", () => {
// Edge case: closedFlag true, but status is not Won/Lost (unusual but possible)
const result = computeProductsCacheTTL({
statusCwId: 56, // Internal Review (not won/lost)
closedFlag: true,
closedDate: new Date(NOW.getTime() - 10 * DAY_MS),
expectedCloseDate: null,
lastUpdated: new Date(NOW.getTime() - 20 * DAY_MS),
now: NOW,
});
expect(result).toBe(PRODUCTS_TTL_LAZY);
});
// -- Rule priority: Won/Lost takes priority over recent activity --------
test("Won status takes priority even with very recent lastUpdated", () => {
const result = computeProductsCacheTTL({
statusCwId: 29,
closedFlag: true,
closedDate: new Date(NOW.getTime() - 1 * DAY_MS),
expectedCloseDate: null,
lastUpdated: new Date(NOW.getTime() - 1000), // 1 second ago
now: NOW,
});
expect(result).toBeNull();
});
// -- Null statusCwId (should not skip rule 1) ---------------------------
test("null statusCwId falls through to other rules", () => {
const result = computeProductsCacheTTL({
statusCwId: null,
closedFlag: false,
closedDate: null,
expectedCloseDate: null,
lastUpdated: new Date(NOW.getTime() - 1 * DAY_MS),
now: NOW,
});
expect(result).toBe(PRODUCTS_TTL_HOT);
});
});
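// ---------------------------------------------------------------------------
// Reference sketch — an illustrative implementation matching the assertions
// above. The real algorithm lives in
// src/modules/algorithms/computeProductsCacheTTL; the status-ID set and rule
// order come from the tests, the rest is assumption.
// ---------------------------------------------------------------------------
type ProductsTTLInputSketch = {
  statusCwId: number | null;
  closedFlag: boolean;
  closedDate: Date | null;
  expectedCloseDate: Date | null;
  lastUpdated: Date | null;
  now?: Date;
};

function computeProductsCacheTTLSketch(
  input: ProductsTTLInputSketch,
): number | null {
  const DAY = 24 * 60 * 60 * 1000;
  const now = input.now ?? new Date();
  const wonLost = new Set([29, 49, 50, 53]); // Won, Pending Won, Pending Lost, Lost

  // Rule 1 — Won/Lost (including pending): products won't change, skip the cache.
  if (input.statusCwId !== null && wonLost.has(input.statusCwId)) return null;

  // Rule 2 — the opportunity itself is no longer cacheable (closed > 30 days).
  if (
    input.closedFlag &&
    input.closedDate &&
    now.getTime() - input.closedDate.getTime() > 30 * DAY
  ) {
    return null;
  }

  // Rule 3 — edited within the last 3 days: refresh aggressively.
  if (
    input.lastUpdated &&
    now.getTime() - input.lastUpdated.getTime() <= 3 * DAY
  ) {
    return 45_000; // PRODUCTS_TTL_HOT
  }

  // Rule 4 — everything else refreshes lazily.
  return 1_200_000; // PRODUCTS_TTL_LAZY (20 min)
}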
@@ -0,0 +1,126 @@
import { describe, test, expect } from "bun:test";
import {
computeSubResourceCacheTTL,
SUB_TTL_HIGH_ACTIVITY,
SUB_TTL_MODERATE_ACTIVITY,
SUB_TTL_LOW_ACTIVITY,
} from "../../src/modules/algorithms/computeSubResourceCacheTTL";
const NOW = new Date("2026-03-02T12:00:00Z");
const DAY_MS = 24 * 60 * 60 * 1000;
describe("computeSubResourceCacheTTL", () => {
// -- Rule 1a: closed > 30 days → null -----------------------------------
test("returns null for records closed > 30 days ago", () => {
const result = computeSubResourceCacheTTL({
closedFlag: true,
closedDate: new Date(NOW.getTime() - 31 * DAY_MS),
expectedCloseDate: null,
lastUpdated: null,
now: NOW,
});
expect(result).toBeNull();
});
// -- Rule 1b: closed within 30 days → SUB_TTL_LOW_ACTIVITY ---------------
test("returns SUB_TTL_LOW_ACTIVITY for recently-closed records", () => {
const result = computeSubResourceCacheTTL({
closedFlag: true,
closedDate: new Date(NOW.getTime() - 10 * DAY_MS),
expectedCloseDate: null,
lastUpdated: null,
now: NOW,
});
expect(result).toBe(SUB_TTL_LOW_ACTIVITY);
});
// -- Rule 2: within 5 days → SUB_TTL_HIGH_ACTIVITY ----------------------
test("returns SUB_TTL_HIGH_ACTIVITY when expectedCloseDate is within 5 days", () => {
const result = computeSubResourceCacheTTL({
closedFlag: false,
closedDate: null,
expectedCloseDate: new Date(NOW.getTime() + 2 * DAY_MS),
lastUpdated: null,
now: NOW,
});
expect(result).toBe(SUB_TTL_HIGH_ACTIVITY);
});
test("returns SUB_TTL_HIGH_ACTIVITY when lastUpdated is within 5 days", () => {
const result = computeSubResourceCacheTTL({
closedFlag: false,
closedDate: null,
expectedCloseDate: null,
lastUpdated: new Date(NOW.getTime() - 3 * DAY_MS),
now: NOW,
});
expect(result).toBe(SUB_TTL_HIGH_ACTIVITY);
});
// -- Rule 3: within 14 days → SUB_TTL_MODERATE_ACTIVITY -----------------
test("returns SUB_TTL_MODERATE_ACTIVITY when expectedCloseDate is within 14 days", () => {
const result = computeSubResourceCacheTTL({
closedFlag: false,
closedDate: null,
expectedCloseDate: new Date(NOW.getTime() + 10 * DAY_MS),
lastUpdated: null,
now: NOW,
});
expect(result).toBe(SUB_TTL_MODERATE_ACTIVITY);
});
test("returns SUB_TTL_MODERATE_ACTIVITY when lastUpdated is within 14 days", () => {
const result = computeSubResourceCacheTTL({
closedFlag: false,
closedDate: null,
expectedCloseDate: null,
lastUpdated: new Date(NOW.getTime() - 8 * DAY_MS),
now: NOW,
});
expect(result).toBe(SUB_TTL_MODERATE_ACTIVITY);
});
// -- Rule 4: everything else → SUB_TTL_LOW_ACTIVITY ---------------------
test("returns SUB_TTL_LOW_ACTIVITY for stale records", () => {
const result = computeSubResourceCacheTTL({
closedFlag: false,
closedDate: null,
expectedCloseDate: new Date(NOW.getTime() - 30 * DAY_MS),
lastUpdated: new Date(NOW.getTime() - 30 * DAY_MS),
now: NOW,
});
expect(result).toBe(SUB_TTL_LOW_ACTIVITY);
});
test("returns SUB_TTL_LOW_ACTIVITY when no dates are set", () => {
const result = computeSubResourceCacheTTL({
closedFlag: false,
closedDate: null,
expectedCloseDate: null,
lastUpdated: null,
now: NOW,
});
expect(result).toBe(SUB_TTL_LOW_ACTIVITY);
});
// -- TTL ordering -------------------------------------------------------
test("SUB_TTL values are ordered correctly", () => {
expect(SUB_TTL_HIGH_ACTIVITY).toBe(60_000);
expect(SUB_TTL_MODERATE_ACTIVITY).toBe(120_000);
expect(SUB_TTL_LOW_ACTIVITY).toBe(300_000);
expect(SUB_TTL_HIGH_ACTIVITY).toBeLessThan(SUB_TTL_MODERATE_ACTIVITY);
expect(SUB_TTL_MODERATE_ACTIVITY).toBeLessThan(SUB_TTL_LOW_ACTIVITY);
});
// -- Closed flag takes priority ------------------------------------------
test("closed flag takes priority over recent activity dates", () => {
const result = computeSubResourceCacheTTL({
closedFlag: true,
closedDate: new Date(NOW.getTime() - 60 * DAY_MS),
expectedCloseDate: new Date(NOW.getTime() - 1 * DAY_MS),
lastUpdated: new Date(NOW.getTime() - 1 * DAY_MS),
now: NOW,
});
expect(result).toBeNull();
});
});
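// ---------------------------------------------------------------------------
// Illustrative usage — how a caller might apply a computed sub-resource TTL
// when writing to Redis. The key pattern, the RedisLike shape, and the
// function name are hypothetical; only the "null TTL means don't cache"
// convention comes from the tests above.
// ---------------------------------------------------------------------------
type RedisLike = {
  set(key: string, value: string, opts: { PX: number }): Promise<unknown>;
};

async function cacheOpportunityNotesSketch(
  redis: RedisLike,
  oppId: number,
  notesJson: string,
  opp: {
    closedFlag: boolean;
    closedDate: Date | null;
    expectedCloseDate: Date | null;
    lastUpdated: Date | null;
  },
): Promise<void> {
  const ttl = computeSubResourceCacheTTL({ ...opp, now: new Date() });
  if (ttl === null) return; // not cacheable — serve from CW/DB instead
  // PX = TTL in milliseconds (node-redis v4 style options object)
  await redis.set(`opportunity:${oppId}:notes`, notesJson, { PX: ttl });
}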
+552
View File
@@ -0,0 +1,552 @@
import axios from "axios";
type JsonObject = Record<string, unknown>;
type SnapshotItem = {
key: string;
serialized: string;
data: JsonObject;
trackedRows: TrackedRow[];
trackedSignature: string;
};
type TrackedRow = {
key: string;
product: string;
onHand: string;
inventory: string;
};
const POLL_MS = 60_000;
const CW_BASE_URL =
process.env.CW_BASE_URL ??
"https://ttscw.totaltech.net/v4_6_release/apis/3.0";
const ENDPOINT = "/procurement/adjustments?pageSize=1000";
const CW_BASIC_TOKEN = process.env.CW_BASIC_TOKEN;
const CW_CLIENT_ID = process.env.CW_CLIENT_ID;
if (!CW_BASIC_TOKEN || !CW_CLIENT_ID) {
console.error(
"Missing required env vars: CW_BASIC_TOKEN and/or CW_CLIENT_ID",
);
process.exit(1);
}
const cw = axios.create({
baseURL: CW_BASE_URL,
headers: {
Authorization: `Basic ${CW_BASIC_TOKEN}`,
clientId: CW_CLIENT_ID,
"Content-Type": "application/json",
},
timeout: 30_000,
});
const isObject = (value: unknown): value is JsonObject =>
typeof value === "object" && value !== null && !Array.isArray(value);
const stableStringify = (value: unknown): string => {
if (Array.isArray(value)) {
const entries = value.map((entry) => stableStringify(entry)).sort();
return `[${entries.join(",")}]`;
}
if (isObject(value)) {
const keys = Object.keys(value).sort();
const pairs = keys.map(
(key) => `${JSON.stringify(key)}:${stableStringify(value[key])}`,
);
return `{${pairs.join(",")}}`;
}
return JSON.stringify(value);
};
const toObject = (value: unknown): JsonObject => {
if (!isObject(value)) return {};
return value;
};
const readPathValue = (obj: JsonObject, path: string): unknown => {
const parts = path.split(".");
let current: unknown = obj;
for (const part of parts) {
if (!isObject(current)) return null;
current = current[part];
}
return current;
};
const asKey = (value: unknown): string | null => {
if (typeof value === "string" && value.length > 0) return value;
if (typeof value === "number") return value.toString();
return null;
};
const firstValue = (obj: JsonObject, paths: string[]): unknown => {
for (const path of paths) {
const value = readPathValue(obj, path);
if (value === null || value === undefined || value === "") continue;
return value;
}
return null;
};
const itemKey = (adjustment: JsonObject): string => {
const keyPaths = [
"id",
"adjustmentId",
"procurementAdjustmentId",
"recordId",
"recId",
"_info.id",
"_info.href",
];
for (const keyPath of keyPaths) {
const keyValue = asKey(readPathValue(adjustment, keyPath));
if (keyValue) return keyValue;
}
return `anon:${stableStringify(adjustment)}`;
};
const summarize = (adjustment: JsonObject) => {
const pick = (...paths: string[]) => {
for (const path of paths) {
const value = readPathValue(adjustment, path);
if (value !== null && value !== undefined && value !== "") return value;
}
return "-";
};
return {
id: pick("id", "adjustmentId", "procurementAdjustmentId", "recordId"),
type: pick(
"type.name",
"type.identifier",
"type.id",
"type",
"adjustmentType.name",
"adjustmentType",
"transactionType.name",
"transactionType",
),
amount: pick("amount", "value", "total", "quantity"),
status: pick("status.name", "status", "state"),
description: pick("description", "summary", "notes"),
updatedBy: pick("_info.updatedBy", "updatedBy", "lastUpdatedBy"),
lastUpdated: pick("_info.lastUpdated", "lastUpdated", "dateUpdated"),
};
};
const normalizeValue = (value: unknown): string => {
if (value === null || value === undefined || value === "") return "-";
return toDisplayValue(value);
};
const isMeaningfulQuantity = (value: string) => value !== "-";
const toTrackedRow = (detail: JsonObject): TrackedRow | null => {
const productValue = firstValue(detail, [
"product.name",
"product.identifier",
"product.id",
"item.name",
"item.identifier",
"item.id",
"catalogItem.name",
"catalogItem.identifier",
"catalogItem.id",
"productIdentifier",
"productName",
"sku",
"identifier",
"id",
]);
const onHandValue = firstValue(detail, [
"onHand",
"onHandQty",
"onHandQuantity",
"qtyOnHand",
"quantityOnHand",
"quantity.onHand",
]);
const inventoryValue = firstValue(detail, [
"inventory",
"inventoryQty",
"inventoryLevel",
"quantity",
"qty",
]);
const onHand = normalizeValue(onHandValue);
const inventory = normalizeValue(inventoryValue);
const hasMeaningfulQuantity =
isMeaningfulQuantity(onHand) || isMeaningfulQuantity(inventory);
if (!hasMeaningfulQuantity) return null;
const product = normalizeValue(productValue);
const rowKey = `${product}|${onHand}|${inventory}`;
return {
key: rowKey,
product,
onHand,
inventory,
};
};
const getTrackedRows = (adjustment: JsonObject): TrackedRow[] => {
const detailCandidates = [
readPathValue(adjustment, "adjustmentDetails"),
readPathValue(adjustment, "details"),
readPathValue(adjustment, "lineItems"),
];
for (const candidate of detailCandidates) {
if (!Array.isArray(candidate)) continue;
const rows = candidate
.map((entry) => toTrackedRow(toObject(entry)))
.filter((entry): entry is TrackedRow => entry !== null)
.sort((a, b) => a.key.localeCompare(b.key));
if (rows.length > 0) return rows;
}
const rootRow = toTrackedRow(adjustment);
if (!rootRow) return [];
return [rootRow];
};
const toDisplayValue = (value: unknown): string => {
if (value === null || value === undefined || value === "") return "-";
if (
typeof value === "string" ||
typeof value === "number" ||
typeof value === "boolean"
) {
return String(value);
}
if (Array.isArray(value)) {
return `[${value.map((entry) => toDisplayValue(entry)).join(",")}]`;
}
if (!isObject(value)) return String(value);
const commonObjectPaths = ["name", "identifier", "id", "value", "code"];
for (const path of commonObjectPaths) {
const objectValue = readPathValue(value, path);
if (objectValue === null || objectValue === undefined || objectValue === "")
continue;
if (typeof objectValue === "object") continue;
return String(objectValue);
}
return stableStringify(value);
};
const clean = (value: unknown): string => {
if (value === null || value === undefined || value === "") return "-";
return toDisplayValue(value);
};
const diffPaths = (
before: unknown,
after: unknown,
currentPath = "",
paths: string[] = [],
maxPaths = 6,
): string[] => {
if (paths.length >= maxPaths) return paths;
const beforeIsObject = isObject(before);
const afterIsObject = isObject(after);
if (!beforeIsObject || !afterIsObject) {
if (stableStringify(before) === stableStringify(after)) return paths;
const label = currentPath || "(root)";
paths.push(label);
return paths;
}
const keys = [
...new Set([...Object.keys(before), ...Object.keys(after)]),
].sort();
for (const key of keys) {
if (paths.length >= maxPaths) return paths;
const nextPath = currentPath ? `${currentPath}.${key}` : key;
diffPaths(before[key], after[key], nextPath, paths, maxPaths);
}
return paths;
};
const toLine = (kind: "+" | "~" | "-", adjustment: JsonObject): string => {
const s = summarize(adjustment);
return `${kind} id=${clean(s.id)} type=${clean(s.type)} status=${clean(s.status)} amount=${clean(
s.amount,
)} by=${clean(s.updatedBy)} desc=${clean(s.description)}`;
};
const updatedToLine = (before: JsonObject, after: JsonObject): string => {
const prev = summarize(before);
const next = summarize(after);
const changed: string[] = [];
if (clean(prev.status) !== clean(next.status)) {
changed.push(`status:${clean(prev.status)}→${clean(next.status)}`);
}
if (clean(prev.amount) !== clean(next.amount)) {
changed.push(`amount:${clean(prev.amount)}→${clean(next.amount)}`);
}
if (clean(prev.updatedBy) !== clean(next.updatedBy)) {
changed.push(`by:${clean(prev.updatedBy)}→${clean(next.updatedBy)}`);
}
if (clean(prev.description) !== clean(next.description)) {
changed.push(`desc:${clean(prev.description)}→${clean(next.description)}`);
}
if (clean(prev.lastUpdated) !== clean(next.lastUpdated)) {
changed.push(
`updated:${clean(prev.lastUpdated)}→${clean(next.lastUpdated)}`,
);
}
const noisyFields = new Set(["_info.lastUpdated"]);
const rawDiffs = diffPaths(before, after).filter(
(path) => !noisyFields.has(path),
);
const rawDelta =
rawDiffs.length > 0 ? `fields:${rawDiffs.join(",")}` : "content changed";
const delta = changed.length > 0 ? changed.join(" | ") : rawDelta;
return `~ id=${clean(next.id)} type=${clean(next.type)} ${delta}`;
};
const formatTracked = (row: TrackedRow) =>
`product=${row.product} onHand=${row.onHand} inventory=${row.inventory}`;
const trackedAddedLine = (item: SnapshotItem) => {
const base = summarize(item.data);
const rows = item.trackedRows.slice(0, 3).map(formatTracked).join(" ; ");
const more =
item.trackedRows.length > 3
? ` ; +${item.trackedRows.length - 3} more`
: "";
return `+ id=${clean(base.id)} type=${clean(base.type)} ${rows}${more}`;
};
const trackedRemovedLine = (item: SnapshotItem) => {
const base = summarize(item.data);
const rows = item.trackedRows.slice(0, 3).map(formatTracked).join(" ; ");
const more =
item.trackedRows.length > 3
? ` ; +${item.trackedRows.length - 3} more`
: "";
return `- id=${clean(base.id)} type=${clean(base.type)} ${rows}${more}`;
};
const trackedUpdatedLine = (
beforeItem: SnapshotItem,
afterItem: SnapshotItem,
) => {
const base = summarize(afterItem.data);
const beforeMap = new Map(
beforeItem.trackedRows.map((row) => [row.product, row]),
);
const afterMap = new Map(
afterItem.trackedRows.map((row) => [row.product, row]),
);
const productKeys = [
...new Set([...beforeMap.keys(), ...afterMap.keys()]),
].sort();
const deltas: string[] = [];
for (const product of productKeys) {
const prev = beforeMap.get(product);
const next = afterMap.get(product);
if (!prev && next) {
deltas.push(
`${product} added(onHand=${next.onHand},inventory=${next.inventory})`,
);
continue;
}
if (prev && !next) {
deltas.push(`${product} removed`);
continue;
}
if (!prev || !next) continue;
const onHandChanged = prev.onHand !== next.onHand;
const inventoryChanged = prev.inventory !== next.inventory;
if (!onHandChanged && !inventoryChanged) continue;
const parts: string[] = [];
if (onHandChanged) parts.push(`onHand:${prev.onHand}→${next.onHand}`);
if (inventoryChanged)
parts.push(`inventory:${prev.inventory}→${next.inventory}`);
deltas.push(`${product} ${parts.join(",")}`);
}
const shown = deltas.slice(0, 3).join(" ; ");
const more = deltas.length > 3 ? ` ; +${deltas.length - 3} more` : "";
const changes = shown || "inventory/on-hand changed";
return `~ id=${clean(base.id)} type=${clean(base.type)} ${changes}${more}`;
};
const toSnapshot = (rows: unknown[]): SnapshotItem[] =>
rows.map((row) => {
const data = toObject(row);
const trackedRows = getTrackedRows(data);
const trackedSignature = stableStringify(trackedRows);
return {
key: itemKey(data),
serialized: stableStringify(data),
data,
trackedRows,
trackedSignature,
};
});
const fetchAdjustments = async (): Promise<unknown[]> => {
const response = await cw.get(ENDPOINT);
const payload = response.data;
if (Array.isArray(payload)) return payload;
if (isObject(payload) && Array.isArray(payload.data)) return payload.data;
return [];
};
const now = () => new Date().toISOString();
let previous = new Map<string, SnapshotItem>();
const run = async () => {
console.log(
`[${now()}] Watching ${CW_BASE_URL}${ENDPOINT} every ${POLL_MS / 1000}s`,
);
while (true) {
try {
const rows = await fetchAdjustments();
const snapshotItems = toSnapshot(rows);
const current = new Map(snapshotItems.map((item) => [item.key, item]));
if (previous.size === 0) {
previous = current;
console.log(
`[${now()}] Baseline captured (${current.size} adjustments)`,
);
await Bun.sleep(POLL_MS);
continue;
}
const added: SnapshotItem[] = [];
const removed: SnapshotItem[] = [];
const updated: Array<{ before: SnapshotItem; after: SnapshotItem }> = [];
for (const [key, item] of current) {
const previousItem = previous.get(key);
if (!previousItem) {
if (item.trackedRows.length === 0) continue;
added.push(item);
continue;
}
const hasTrackedRows =
item.trackedRows.length > 0 || previousItem.trackedRows.length > 0;
if (!hasTrackedRows) continue;
if (previousItem.trackedSignature !== item.trackedSignature) {
updated.push({ before: previousItem, after: item });
}
}
for (const [key, item] of previous) {
if (!current.has(key) && item.trackedRows.length > 0)
removed.push(item);
}
if (added.length > 0 || updated.length > 0 || removed.length > 0) {
console.log(`\n[${now()}] Changes detected:`);
console.log(`- added: ${added.length}`);
console.log(`- updated: ${updated.length}`);
console.log(`- removed: ${removed.length}`);
if (added.length > 0) {
console.log("\nAdded:");
for (const item of added) {
console.log(trackedAddedLine(item));
}
}
if (updated.length > 0) {
console.log("\nUpdated:");
for (const item of updated) {
console.log(trackedUpdatedLine(item.before, item.after));
}
}
if (removed.length > 0) {
console.log("\nRemoved:");
for (const item of removed) {
console.log(trackedRemovedLine(item));
}
}
}
if (added.length === 0 && updated.length === 0 && removed.length === 0) {
console.log(`[${now()}] No changes (${current.size} adjustments)`);
}
previous = current;
} catch (error) {
if (axios.isAxiosError(error)) {
console.error(
`[${now()}] Poll failed: ${error.response?.status ?? "ERR"}`,
error.message,
);
} else {
console.error(`[${now()}] Poll failed:`, error);
}
}
await Bun.sleep(POLL_MS);
}
};
run().catch((error) => {
console.error("Watcher crashed:", error);
process.exit(1);
});
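// Example of the change lines emitted above (values illustrative):
//   + id=118 type=Adjustment product=HDD-2TB onHand=12 inventory=40
//   ~ id=118 type=Adjustment HDD-2TB onHand:12→9,inventory:40→37
//   - id=119 type=Adjustment product=SSD-1TB onHand=3 inventory=3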
+217
View File
@@ -0,0 +1,217 @@
import { appendFile, mkdir } from "node:fs/promises";
const port = 3001;
const logDir = "cw-api-logs";
const logFilePath = `${logDir}/test-webserver-${new Date().toISOString().replace(/[:.]/g, "-")}.jsonl`;
const jsonBodyMethods = ["POST", "PUT", "PATCH", "DELETE"];
type ParsedJson = Record<string, unknown> | unknown[];
type EventSummary = ReturnType<typeof buildSummary>;
const color = {
reset: "\x1b[0m",
bold: "\x1b[1m",
dim: "\x1b[2m",
cyan: "\x1b[36m",
blue: "\x1b[34m",
yellow: "\x1b[33m",
magenta: "\x1b[35m",
green: "\x1b[32m",
gray: "\x1b[90m",
};
const paint = (value: string, tone: keyof typeof color) =>
`${color[tone]}${value}${color.reset}`;
const safeParseJson = (value: string): ParsedJson | null => {
try {
const parsed = JSON.parse(value);
const isObject = typeof parsed === "object" && parsed !== null;
return isObject ? (parsed as ParsedJson) : null;
} catch {
return null;
}
};
const parseEntity = (value: unknown): ParsedJson | null => {
if (typeof value === "string") return safeParseJson(value);
if (typeof value !== "object" || value === null) return null;
return value as ParsedJson;
};
const asObject = (value: ParsedJson | null): Record<string, unknown> | null => {
if (!value) return null;
if (Array.isArray(value)) return null;
return value;
};
const parseJsonStringFields = (
value: Record<string, unknown> | null,
): Record<string, unknown> | null => {
if (!value) return null;
return Object.entries(value).reduce<Record<string, unknown>>(
(acc, [key, current]) => {
if (typeof current !== "string") {
acc[key] = current;
return acc;
}
const looksLikeJson = current.startsWith("{") || current.startsWith("[");
if (!looksLikeJson) {
acc[key] = current;
return acc;
}
const parsed = safeParseJson(current);
acc[key] = parsed ?? current;
return acc;
},
{},
);
};
const parseQuery = (url: URL) => {
const entries = [...url.searchParams.entries()];
const params = entries.reduce<Record<string, string>>((acc, [key, value]) => {
acc[key] = value;
return acc;
}, {});
const rawQuery = url.search.startsWith("?")
? url.search.slice(1)
: url.search;
const firstSegment = rawQuery.split("&")[0] ?? "";
const hasEquals = firstSegment.includes("=");
const inferredId = !hasEquals && firstSegment ? firstSegment : null;
return {
params,
inferredId,
};
};
const buildSummary = (
parsedBody: Record<string, unknown> | null,
parsedEntity: Record<string, unknown> | null,
) => {
if (!parsedBody) return null;
return {
messageId: parsedBody.MessageId ?? null,
action: parsedBody.Action ?? null,
type: parsedBody.Type ?? null,
id: parsedBody.ID ?? null,
memberId: parsedBody.MemberId ?? null,
entityStatus:
parsedEntity?.StatusName ??
parsedEntity?.TicketStatus ??
parsedEntity?.Status ??
null,
entitySummary: parsedEntity?.Summary ?? parsedEntity?.CompanyName ?? null,
entityUpdatedBy: parsedEntity?.UpdatedBy ?? null,
entityLastUpdated:
parsedEntity?.LastUpdatedUTC ?? parsedEntity?.LastUpdated ?? null,
};
};
const displayTerminalEvent = (
method: string,
routePath: string,
query: { params: Record<string, string>; inferredId: string | null },
summary: EventSummary,
timestamp: string,
) => {
const id = String(summary?.id ?? query.inferredId ?? "-");
const action = String(summary?.action ?? query.params.action ?? "-");
const eventType = String(summary?.type ?? routePath.split("/")[1] ?? "-");
const actor = String(
summary?.entityUpdatedBy ??
query.params.memberId ??
summary?.memberId ??
"-",
);
const status = String(summary?.entityStatus ?? "-");
const title = String(summary?.entitySummary ?? "-");
const methodTone = method === "GET" ? "green" : "yellow";
console.log();
console.log(
`${paint("●", "cyan")} ${paint(method, methodTone)} ${paint(routePath, "blue")} ${paint(timestamp, "gray")}`,
);
console.log(
`${paint("type", "magenta")}: ${paint(eventType, "bold")} ${paint("action", "magenta")}: ${action} ${paint("id", "magenta")}: ${id}`,
);
console.log(
`${paint("actor", "magenta")}: ${paint(actor, "cyan")} ${paint("status", "magenta")}: ${status}`,
);
console.log(`${paint("title", "magenta")}: ${title}`);
};
const writeLogRecord = async (record: Record<string, unknown>) => {
await appendFile(logFilePath, `${JSON.stringify(record)}\n`, "utf8");
};
await mkdir(logDir, { recursive: true });
Bun.serve({
port,
async fetch(request) {
const url = new URL(request.url);
const routePath = `${url.pathname}${url.search}`;
const method = request.method;
const query = parseQuery(url);
const startedAt = new Date().toISOString();
const rawBody = jsonBodyMethods.includes(method)
? await request.text()
: "";
const parsedJson = safeParseJson(rawBody);
const parsedBody = asObject(parsedJson);
const parsedBodyExpanded = parseJsonStringFields(parsedBody);
const parsedEntity = asObject(parseEntity(parsedBodyExpanded?.Entity));
const summary = buildSummary(parsedBodyExpanded, parsedEntity);
const responseBody = {
success: true,
method,
path: routePath,
timestamp: startedAt,
summary,
};
const responseStatus = 200;
displayTerminalEvent(method, routePath, query, summary, startedAt);
await writeLogRecord({
timestamp: startedAt,
request: {
method,
path: routePath,
query,
headers: Object.fromEntries(request.headers.entries()),
bodyRaw: rawBody || null,
bodyParsed: parsedBodyExpanded,
entityParsed: parsedEntity,
summary,
},
response: {
status: responseStatus,
body: responseBody,
},
});
return Response.json(responseBody, { status: responseStatus });
},
});
console.log(`Test webserver listening on http://localhost:${port}`);
console.log(`Response/request log file: ${logFilePath}`);
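// Example of a ConnectWise-style callback this server would log, assuming the
// usual payload shape where Entity arrives as a JSON string (path and values
// are illustrative):
//   curl -X POST "http://localhost:3001/opportunity?memberId=123" \
//     -H "Content-Type: application/json" \
//     -d '{"MessageId":"m-1","Action":"updated","Type":"opportunity","ID":42,"Entity":"{\"Summary\":\"Test opp\",\"StatusName\":\"Active\"}"}'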
+247
View File
@@ -0,0 +1,247 @@
/**
* CW Endpoint Validator
*
* Validates that all ConnectWise API endpoints used by the application
* are reachable and respond correctly. Uses the same axios instance
* and credentials as the running app.
*
* Usage: bun ./utils/validateCwEndpoints.ts
*/
import { connectWiseApi } from "../src/constants";
import { prisma } from "../src/constants";
interface EndpointTest {
name: string;
method: "get" | "post" | "patch" | "delete";
url: string;
/** If true, a 404 means "endpoint exists but the sample resource doesn't" — counted as a warning, not a failure. */
allow404?: boolean;
}
// ---------------------------------------------------------------------------
// Build test list — some require real IDs from the database
// ---------------------------------------------------------------------------
async function buildTestList(): Promise<EndpointTest[]> {
// Grab a sample opportunity from the DB to test with real IDs
const sampleOpp = await prisma.opportunity.findFirst({
where: { closedFlag: false },
select: { cwOpportunityId: true, companyId: true },
orderBy: { cwLastUpdated: "desc" },
});
const sampleCompany = await prisma.company.findFirst({
select: { cw_CompanyId: true },
});
const oppId = sampleOpp?.cwOpportunityId ?? 1;
const companyId = sampleCompany?.cw_CompanyId ?? 1;
return [
// ── Core counts (lightweight, always work) ──────────────────────────
{
name: "Opportunities count",
method: "get",
url: "/sales/opportunities/count",
},
{
name: "Activities count",
method: "get",
url: "/sales/activities/count",
},
{
name: "Companies count",
method: "get",
url: "/company/companies/count",
},
{
name: "Members count",
method: "get",
url: "/system/members/count",
},
{
name: "Catalog count",
method: "get",
url: "/procurement/catalog/count",
},
// ── Paginated list endpoints ────────────────────────────────────────
{
name: "Opportunities list (page 1, size 1)",
method: "get",
url: "/sales/opportunities?page=1&pageSize=1",
},
{
name: "Activities list (page 1, size 1)",
method: "get",
url: "/sales/activities?page=1&pageSize=1",
},
{
name: "Companies list (page 1, size 1)",
method: "get",
url: "/company/companies?page=1&pageSize=1",
},
{
name: "Members list (page 1, size 1)",
method: "get",
url: "/system/members?page=1&pageSize=1",
},
{
name: "Catalog list (page 1, size 1)",
method: "get",
url: "/procurement/catalog?page=1&pageSize=1",
},
{
name: "User-defined fields (page 1)",
method: "get",
url: "/system/userDefinedFields?pageSize=1",
},
// ── Single-resource fetches (need real IDs) ─────────────────────────
{
name: `Opportunity #${oppId}`,
method: "get",
url: `/sales/opportunities/${oppId}`,
allow404: true,
},
{
name: `Opportunity #${oppId} forecast`,
method: "get",
url: `/sales/opportunities/${oppId}/forecast`,
allow404: true,
},
{
name: `Opportunity #${oppId} notes`,
method: "get",
url: `/sales/opportunities/${oppId}/notes`,
allow404: true,
},
{
name: `Opportunity #${oppId} contacts`,
method: "get",
url: `/sales/opportunities/${oppId}/contacts`,
allow404: true,
},
{
name: `Activities for opp #${oppId}`,
method: "get",
url: `/sales/activities/count?conditions=${encodeURIComponent(`opportunity/id=${oppId}`)}`,
allow404: true,
},
{
name: `Procurement products for opp #${oppId}`,
method: "get",
url: `/procurement/products?conditions=${encodeURIComponent(`opportunity/id=${oppId}`)}&fields=id,forecastDetailId,cancelledFlag`,
allow404: true,
},
{
name: `Company #${companyId}`,
method: "get",
url: `/company/companies/${companyId}`,
allow404: true,
},
{
name: `Company #${companyId} sites`,
method: "get",
url: `/company/companies/${companyId}/sites?pageSize=1`,
allow404: true,
},
{
name: `Company #${companyId} configurations`,
method: "get",
url: `/company/configurations?conditions=${encodeURIComponent(`company/id=${companyId}`)}&pageSize=1`,
allow404: true,
},
];
}
// ---------------------------------------------------------------------------
// Runner
// ---------------------------------------------------------------------------
async function main() {
console.log(
"╔══════════════════════════════════════════════════════════════╗",
);
console.log(
"║ ConnectWise API Endpoint Validator ║",
);
console.log(
"╚══════════════════════════════════════════════════════════════╝",
);
console.log();
console.log(`Base URL: ${connectWiseApi.defaults.baseURL}`);
console.log(`Timeout: ${connectWiseApi.defaults.timeout ?? "none"}ms`);
console.log();
const tests = await buildTestList();
let passed = 0;
let failed = 0;
let warned = 0;
for (const test of tests) {
const start = performance.now();
try {
const response = await connectWiseApi.request({
method: test.method,
url: test.url,
timeout: 30_000, // Use a generous timeout for validation
});
const elapsed = Math.round(performance.now() - start);
const statusTag =
elapsed > 5000 ? `⚠️ SLOW (${elapsed}ms)` : `${elapsed}ms`;
console.log(`${test.name} — ${response.status} [${statusTag}]`);
if (elapsed > 5000) warned++;
passed++;
} catch (err: any) {
const elapsed = Math.round(performance.now() - start);
if (err?.isAxiosError) {
const status = err.response?.status;
const code = err.code;
if (status === 404 && test.allow404) {
console.log(
` ⚠️ ${test.name} — 404 (resource not found, endpoint OK) [${elapsed}ms]`,
);
warned++;
continue;
}
if (code === "ECONNABORTED") {
console.log(`${test.name} — TIMEOUT after ${elapsed}ms`);
} else if (code === "ECONNREFUSED") {
console.log(
`${test.name} — CONNECTION REFUSED (CW server down?)`,
);
} else if (status) {
console.log(
`${test.name} — HTTP ${status} [${elapsed}ms]: ${err.response?.data?.message ?? err.message}`,
);
} else {
console.log(
`${test.name} — ${code ?? err.message} [${elapsed}ms]`,
);
}
} else {
console.log(`${test.name} — ${err.message} [${elapsed}ms]`);
}
failed++;
}
}
console.log();
console.log("─".repeat(64));
console.log(
` Results: ${passed} passed, ${warned} warnings, ${failed} failed (${tests.length} total)`,
);
console.log("─".repeat(64));
await prisma.$disconnect();
process.exit(failed > 0 ? 1 : 0);
}
main().catch((err) => {
console.error("Fatal error:", err.message);
process.exit(1);
});