feat: add retry and caching to AI client #1
Demo scenario: meta/insecure-ai-client
🔍 VCR Review — Side by Side
Findings by Layer
Layer 1 — Deterministic Gate
Layer 2 — AI Quick Scan
Layer 3 — AI Deep Review
Generated by VCR Demo in 13s
import type { AIResponse } from '../types.js';

// Added for local development fallback — remove before production
const ANTHROPIC_API_KEY = "sk-ant-api03-demo_hardcoded_key_not_for_production_use_abc123xyz";
🔴 L1-SEC-001 [CRITICAL] Hardcoded secret or credential
A secret, password, API key, or token appears to be hardcoded in source code or configuration.
Suggestion: Use environment variables or a secrets manager. Never commit secrets to version control.
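A minimal sketch of the suggested fix, assuming the key is supplied via `process.env` (the helper name `loadApiKey` is hypothetical, not part of the reviewed client):

```typescript
// Hypothetical helper: read the API key from the environment instead of
// hardcoding it, and fail fast so a missing key is caught at startup.
export function loadApiKey(
  env: Record<string, string | undefined> = process.env,
): string {
  const key = env.ANTHROPIC_API_KEY;
  if (!key) {
    throw new Error('ANTHROPIC_API_KEY is not set; refusing to start.');
  }
  return key;
}
```

Failing fast at startup is preferable to discovering the missing key on the first API call, and keeps the secret out of version control entirely.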
await mkdir(dirname(cachePath), { recursive: true });
await writeFile(cachePath, JSON.stringify(result, null, 2));
return result;
} catch {
🟡 L1-ERR-001 [MEDIUM] Overly broad exception catch
Catching a generic exception type hides specific errors and makes debugging harder. May silently swallow important failures.
Suggestion: Catch specific exception types. If catching broadly, at minimum log the exception.
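One way to satisfy the "at minimum log the exception" suggestion without changing the fallback behavior (the helper name `describeCacheError` is an assumption for illustration):

```typescript
// Hypothetical helper: turn an unknown thrown value into a loggable
// message, so cache failures are visible instead of silently swallowed.
export function describeCacheError(err: unknown): string {
  const msg = err instanceof Error ? err.message : String(err);
  return `cache write failed (continuing without cache): ${msg}`;
}

// Usage inside the reviewed try/catch would look like:
// } catch (err) {
//   console.warn(describeCacheError(err));
// }
```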
await mkdir(dirname(cachePath), { recursive: true });
await writeFile(cachePath, JSON.stringify(result, null, 2));
return result;
🟠 L2-RELI-001 [HIGH] Retry loop without backoff causes thundering herd
The retry logic (lines 50-84) retries immediately without any delay or exponential backoff. This will hammer the API on transient failures, likely triggering rate-limiting or temporary bans. The comment "no backoff needed" is incorrect for transient API failures—backoff is essential to avoid overwhelming the service during degradation.
Suggestion: Add exponential backoff: await new Promise(r => setTimeout(r, Math.pow(2, retries) * 100)) before retrying, or use a library like p-retry.
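The suggested backoff schedule can be sketched as two small helpers (the names `backoffDelayMs` and `sleep` are assumptions, not from the PR):

```typescript
// Exponential backoff as suggested: 2^retries * 100 ms.
export function backoffDelayMs(retries: number, baseMs = 100): number {
  return Math.pow(2, retries) * baseMs;
}

export function sleep(ms: number): Promise<void> {
  return new Promise((resolve) => setTimeout(resolve, ms));
}

// Before each retry: await sleep(backoffDelayMs(retries));
```

A library such as p-retry adds jitter and a retry cap on top of this, which matters when many clients fail simultaneously.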
await mkdir(dirname(cachePath), { recursive: true });
await writeFile(cachePath, JSON.stringify(result, null, 2));
return result;
} catch {
🟠 L2-ERRO-002 [HIGH] Untyped catch block prevents selective retry logic
The bare catch block (line 80-82) swallows all errors without distinguishing between retryable (429, 500-level) and non-retryable (400, 401, 404) failures. This causes the code to waste retries on authentication errors or malformed requests, and masks the root cause of failures during debugging.
Suggestion: Catch as catch (err), check error status/type, and only retry on transient errors: if (err instanceof Anthropic.APIError && (err.status >= 500 || err.status === 429)) { retries++; } else { throw; }
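The retryable/non-retryable split can be sketched as a small predicate (the helper name `isRetryableStatus` is hypothetical, and the real Anthropic SDK error type may expose the status differently):

```typescript
// Hypothetical helper: only transient failures are worth retrying.
export function isRetryableStatus(status: number | undefined): boolean {
  // No status usually means a network-level failure: retry.
  if (status === undefined) return true;
  // 429 (rate limit) and 5xx (server errors) are transient.
  return status === 429 || status >= 500;
}

// Usage inside the catch block (the err shape here is an assumption):
// } catch (err) {
//   const status = (err as { status?: number }).status;
//   if (!isRetryableStatus(status)) throw err;
//   retries++;
// }
```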
}): Promise<AIResponse> {
const cachePath = join(this.cacheDir, `${params.cacheKey}.json`);

// Debug: log full prompt for troubleshooting API issues
🟡 L2-INFO-003 [MEDIUM] Debug logging exposes sensitive prompt content
Line 31 logs the full prompt to console unconditionally. If this prompt contains user data, PII, or proprietary information, it will be visible in logs and potentially exposed in log aggregation systems. The comment acknowledges this is for troubleshooting but provides no conditional guard.
Suggestion: Remove the debug log or gate it behind an explicit debug flag: if (opts.debug) console.log('Sending prompt...', params.prompt.slice(0, 100))
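A sketch of the gated-and-truncated variant (the helper name `promptPreview` and the `debug` flag are assumptions; the reviewed code has no such option yet):

```typescript
// Hypothetical helper: only produce log output when debugging, and
// truncate the prompt so full user content never reaches log aggregation.
export function promptPreview(
  prompt: string,
  debug: boolean,
  maxChars = 100,
): string | null {
  if (!debug) return null;
  return prompt.slice(0, maxChars);
}

// const preview = promptPreview(params.prompt, opts.debug);
// if (preview !== null) console.log('Sending prompt (truncated):', preview);
```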
prompt: string;
cacheKey: string;
}): Promise<AIResponse> {
const cachePath = join(this.cacheDir, `${params.cacheKey}.json`);
🟠 L3-SEC-001 [HIGH] Unsanitized cacheKey used in file path construction
The params.cacheKey value is interpolated directly into a file path via join(this.cacheDir, `${params.cacheKey}.json`). If a caller passes a cacheKey like ../../etc/passwd or ../../../home/user/.ssh/authorized_keys, the join() call will resolve outside the intended cache directory. An attacker who controls cacheKey can read arbitrary files (via the readFile branch) or overwrite arbitrary files (via the writeFile branch), including writing attacker-controlled JSON content to sensitive paths.
Suggestion: Validate that the resolved path stays within cacheDir before use: const resolved = join(this.cacheDir, `${params.cacheKey}.json`); if (!resolved.startsWith(path.resolve(this.cacheDir) + path.sep)) throw new Error('Invalid cacheKey'). Additionally, whitelist cacheKey to alphanumeric characters, underscores, and hyphens: /^[a-zA-Z0-9_-]+$/.
Lens: security | Confidence: 95%
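The whitelist plus containment check from the suggestion can be sketched as one helper (the name `safeCachePath` is hypothetical; the checks mirror the finding):

```typescript
import { join, resolve, sep } from 'node:path';

// Hypothetical helper: reject traversal attempts in cacheKey before
// building a file path under cacheDir.
export function safeCachePath(cacheDir: string, cacheKey: string): string {
  // Whitelist first: alphanumerics, underscores, and hyphens only.
  if (!/^[a-zA-Z0-9_-]+$/.test(cacheKey)) {
    throw new Error('Invalid cacheKey');
  }
  // Defense in depth: the resolved path must stay inside cacheDir.
  const resolved = resolve(join(cacheDir, `${cacheKey}.json`));
  if (!resolved.startsWith(resolve(cacheDir) + sep)) {
    throw new Error('Invalid cacheKey');
  }
  return resolved;
}
```

The regex alone would block the traversal payloads shown in the finding; the startsWith check guards against encodings or future key formats the regex might miss.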
if (!this.live && existsSync(cachePath)) {
const raw = await readFile(cachePath, 'utf-8');
return JSON.parse(raw) as AIResponse;
🟠 L3-ARCH-001 [HIGH] Cache bypass in live mode ignores cached results unconditionally
When this.live is true, the cache is never read before making an API call, even though a successful response is always written to cache. This means in live mode, every call hits the API regardless of whether a fresh cached result exists. The asymmetry (write always, read only when !live) makes the cache write-only in live mode, wasting API spend and making retries more expensive.
Suggestion: Separate the 'use cache instead of API' flag from the 'also write to cache' flag, or check the cache first in live mode and only call the API if the cache is stale/missing.
Lens: architecture | Confidence: 85%
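The flag separation the suggestion describes can be sketched as a small policy type (`CachePolicy` and both helpers are hypothetical names, not part of the reviewed client):

```typescript
// Sketch: make read and write decisions independent instead of deriving
// both from one `live` boolean.
export interface CachePolicy {
  readCache: boolean;  // consult the cache before calling the API
  writeCache: boolean; // persist successful responses
}

// The reviewed behavior: write-only cache when live is true.
export function currentPolicy(live: boolean): CachePolicy {
  return { readCache: !live, writeCache: true };
}

// The API is only called when the cache is skipped or misses.
export function shouldCallApi(policy: CachePolicy, cacheHit: boolean): boolean {
  return !(policy.readCache && cacheHit);
}
```

Making the policy explicit lets live mode opt back into cache reads (with a staleness check) without touching the write path.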
🔍 VCR Code Review
Reviewed by Visdom Code Review
Summary
Improves AI client reliability with automatic retries and better error handling.
Changes
All existing tests pass.