
dependency_scanner_tool

Scans Python and Node.js dependencies for known vulnerabilities, outdated packages, and license policy violations.
Integrates as Gate 3 in release_check.


Purpose

| Concern | Source A | Source B |
|---|---|---|
| Vulnerabilities | OSV.dev database (online or cached) | Pinned deps from lock files |
| Outdated packages | Fixed versions in OSV findings | Current versions in lock files |
| License policy | Configured deny/warn list | Package metadata (limited in MVP) |

RBAC

| Entitlement | Grants |
|---|---|
| `tools.deps.read` | Run scan (agent_cto, agent_oncall) |
| `tools.deps.gate` | Gate execution in release_check (agent_cto only) |

Limits (config/tool_limits.yml)

| Param | Value |
|---|---|
| `timeout_ms` | 45 000 (45 s) |
| `max_chars_in` | 3 000 |
| `max_bytes_out` | 1 048 576 (1 MB) |
| `rate_limit_rpm` | 5 |
| `concurrency` | 1 |

Invocation

```json
{
  "tool": "dependency_scanner_tool",
  "action": "scan",
  "targets": ["python", "node"],
  "vuln_mode": "offline_cache",
  "fail_on": ["CRITICAL", "HIGH"],
  "timeout_sec": 40
}
```
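As a sketch, a caller-side check of this payload could fill in the defaults from the Parameters table and reject malformed requests. The helper name and the validation layer are illustrative, not part of the tool:

```python
# Hypothetical request validator for dependency_scanner_tool payloads.
# Allowed values mirror the Parameters table below.
ALLOWED_TARGETS = {"python", "node"}
ALLOWED_MODES = {"online", "offline_cache"}
SEVERITIES = {"CRITICAL", "HIGH", "MEDIUM", "LOW", "UNKNOWN"}

def validate_scan_request(req: dict) -> dict:
    """Return the request with defaults filled in, or raise ValueError."""
    if req.get("action") != "scan":
        raise ValueError('action must be "scan"')
    out = {
        "tool": "dependency_scanner_tool",
        "action": "scan",
        "targets": req.get("targets", ["python", "node"]),
        "vuln_mode": req.get("vuln_mode", "offline_cache"),
        "fail_on": req.get("fail_on", ["CRITICAL", "HIGH"]),
        "timeout_sec": req.get("timeout_sec", 40),
    }
    if not set(out["targets"]) <= ALLOWED_TARGETS:
        raise ValueError(f"unknown targets: {out['targets']}")
    if out["vuln_mode"] not in ALLOWED_MODES:
        raise ValueError(f"unknown vuln_mode: {out['vuln_mode']}")
    if not set(out["fail_on"]) <= SEVERITIES:
        raise ValueError(f"unknown severities in fail_on: {out['fail_on']}")
    return out
```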

Parameters

| Param | Type | Default | Description |
|---|---|---|---|
| `action` | string | (required) | Must be `"scan"` |
| `targets` | array | `["python","node"]` | Ecosystems to scan |
| `vuln_mode` | string | `"offline_cache"` | `"online"` queries api.osv.dev; `"offline_cache"` uses the local cache only |
| `fail_on` | array | `["CRITICAL","HIGH"]` | Severity levels that block release |
| `timeout_sec` | number | `40` | Hard wall-clock timeout (seconds) |

Response

```json
{
  "pass": true,
  "summary": "✅ Dependency scan PASSED. 120 deps scanned, 0 vulns found.",
  "stats": {
    "ecosystems": ["PyPI", "npm"],
    "files_scanned": 4,
    "deps_total": 120,
    "deps_pinned": 115,
    "deps_unresolved": 3,
    "vulns_total": 0,
    "by_severity": {"CRITICAL": 0, "HIGH": 0, "MEDIUM": 0, "LOW": 0, "UNKNOWN": 0},
    "outdated_total": 0
  },
  "vulnerabilities": [],
  "outdated": [],
  "licenses": [],
  "recommendations": []
}
```

Vulnerability object

```json
{
  "id": "GHSA-35jh-r3h4-6jhm",
  "ecosystem": "npm",
  "package": "lodash",
  "version": "4.17.20",
  "severity": "HIGH",
  "fixed_versions": ["4.17.21"],
  "aliases": ["CVE-2021-23337"],
  "evidence": {"file": "services/render-pptx-worker/package-lock.json", "details": "lodash==4.17.20"},
  "recommendation": "Upgrade lodash from 4.17.20 to 4.17.21"
}
```

Pass / Fail Rule

| Condition | Result |
|---|---|
| Any CRITICAL or HIGH vuln found | pass=false (gate blocks) |
| Any denied license found | pass=false |
| MEDIUM vulns only | pass=true; added to recommendations |
| UNKNOWN severity (cache miss) | pass=true; recommendation to populate cache |
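The rule can be sketched as a small evaluation function. Helper names are illustrative; `findings` uses the vulnerability-object shape from the Response section:

```python
# Sketch of the pass/fail rule (not the tool's actual internals).
# `findings` is a list of vulnerability objects; `denied_licenses`
# is a list of license-policy hits.
def evaluate_gate(findings, denied_licenses, fail_on=("CRITICAL", "HIGH")):
    blocking = [v for v in findings if v["severity"] in fail_on]
    recommendations = []
    for v in findings:
        if v["severity"] == "MEDIUM":
            recommendations.append(v.get("recommendation", ""))
        elif v["severity"] == "UNKNOWN":
            # Cache miss: non-blocking, but surface a hint
            recommendations.append("Populate OSV cache to resolve UNKNOWN severities")
    passed = not blocking and not denied_licenses
    return {"pass": passed, "recommendations": recommendations}
```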

Supported Manifest Files

Python (priority order)

  1. poetry.lock — fully resolved versions
  2. Pipfile.lock — resolved versions
  3. requirements*.txt — only == pinned lines are scanned; unpinned noted
  4. pyproject.toml — declared deps listed (no version resolution)
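Rule 3 can be illustrated with a minimal requirements parser; the regex and helper here are assumptions for this sketch, not the tool's actual parser:

```python
import re

# Only lines pinned with `==` are scanned; everything else is
# collected as unresolved (per rule 3 above).
PIN_RE = re.compile(r"^\s*([A-Za-z0-9_.\-]+)\s*==\s*([^\s;#]+)")

def parse_requirements(text: str):
    pinned, unresolved = {}, []
    for line in text.splitlines():
        line = line.split("#", 1)[0].strip()
        if not line or line.startswith("-"):
            continue  # skip blanks, comments, and options like -r / -e
        m = PIN_RE.match(line)
        if m:
            pinned[m.group(1).lower()] = m.group(2)
        else:
            unresolved.append(line)
    return pinned, unresolved
```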

Node.js (priority order)

  1. package-lock.json (npm v2/v3)
  2. pnpm-lock.yaml
  3. yarn.lock
  4. package.json — only if no lock file present
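The priority selection amounts to a first-match lookup; the function name is hypothetical:

```python
from pathlib import Path

# Node.js manifest selection per the priority order above;
# package.json is consulted only when no lock file exists.
NODE_LOCKFILES = ["package-lock.json", "pnpm-lock.yaml", "yarn.lock"]

def pick_node_manifest(project_dir: str):
    root = Path(project_dir)
    for name in NODE_LOCKFILES:
        if (root / name).is_file():
            return root / name
    pkg = root / "package.json"
    return pkg if pkg.is_file() else None
```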

Vulnerability Sources

OSV.dev

Online mode (vuln_mode=online):

  • Queries https://api.osv.dev/v1/querybatch in batches of 100
  • Requires entry in config/network_allowlist.yml (dependency_scanner_tool.hosts: api.osv.dev)
  • New results are cached to ops/cache/osv_cache.json
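Batching for `querybatch` might be sketched like this; the payload shape follows the public OSV batch-query API, while the helper itself is illustrative:

```python
# Group dependency queries into /v1/querybatch payloads of at most
# 100 queries each. `deps` is an iterable of (ecosystem, name, version).
def build_querybatch_payloads(deps, batch_size=100):
    queries = [
        {"package": {"name": name, "ecosystem": eco}, "version": ver}
        for eco, name, ver in deps
    ]
    return [
        {"queries": queries[i:i + batch_size]}
        for i in range(0, len(queries), batch_size)
    ]
```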

Offline cache mode (vuln_mode=offline_cache, default):

  • Reads from ops/cache/osv_cache.json only
  • Cache misses → severity UNKNOWN (not blocking by default)
  • No outbound network calls

Cache format (ops/cache/osv_cache.json):

```json
{
  "version": 1,
  "updated_at": "...",
  "entries": {
    "PyPI:requests:2.31.0": {"vulns": [], "cached_at": "..."},
    "npm:lodash:4.17.20": {"vulns": [...], "cached_at": "..."}
  }
}
```

Cache key: {ecosystem}:{normalized_name}:{version}
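A minimal lookup over this cache format might look like the following; the PyPI name normalization shown is an assumption of this sketch:

```python
def cache_key(ecosystem: str, name: str, version: str) -> str:
    # Assumed normalization: PyPI names lowercased with '_' -> '-'
    # (PEP 503 style); npm names kept as-is.
    normalized = name.lower().replace("_", "-") if ecosystem == "PyPI" else name
    return f"{ecosystem}:{normalized}:{version}"

def lookup(cache: dict, ecosystem: str, name: str, version: str):
    """Return the cached vulns list, or None on a cache miss
    (the caller then records severity UNKNOWN)."""
    entry = cache.get("entries", {}).get(cache_key(ecosystem, name, version))
    return None if entry is None else entry["vulns"]
```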


Security

  • Read-only: scans lock files; no writes (except optional cache update in online mode)
  • Evidence redaction: secrets/tokens masked before inclusion in report
  • No payload logging: only hash of dep list + counts logged to audit trail
  • Path traversal protection: excluded dirs (node_modules, .git, .venv, etc.)
  • Size limits: max 80 files, 2000 deps, 500 vulns enforced in code
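The directory-exclusion point can be illustrated with a pruned `os.walk`; the excluded set and helper here are samples, not the enforced list:

```python
import os

# Sample exclusions; the enforced list lives in the tool's code.
EXCLUDED_DIRS = {"node_modules", ".git", ".venv", "venv", "__pycache__"}
MANIFEST_NAMES = {"poetry.lock", "Pipfile.lock", "pyproject.toml",
                  "package-lock.json", "pnpm-lock.yaml", "yarn.lock",
                  "package.json"}

def find_manifests(root, max_files=80):
    found = []
    for dirpath, dirnames, filenames in os.walk(root):
        # Prune in place so os.walk never descends into excluded dirs
        dirnames[:] = [d for d in dirnames if d not in EXCLUDED_DIRS]
        for fn in filenames:
            if fn in MANIFEST_NAMES or (
                fn.startswith("requirements") and fn.endswith(".txt")
            ):
                found.append(os.path.join(dirpath, fn))
                if len(found) >= max_files:
                    return found
    return found
```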

Integration in release_check

Gate order: pr_review → config_lint → dependency_scan → contract_diff → threat_model → smoke → drift

release_check inputs related to this gate:

| Input | Type | Default | Description |
|---|---|---|---|
| `run_deps` | boolean | `true` | Enable dependency scan gate |
| `deps_targets` | array | `["python","node"]` | Ecosystems to scan |
| `deps_vuln_mode` | string | `"offline_cache"` | OSV mode |
| `deps_fail_on` | array | `["CRITICAL","HIGH"]` | Blocking severities |
| `deps_timeout_sec` | number | `40` | Timeout (seconds) |

Outdated Analysis (lockfile_only mode)

In the MVP, the "latest version" is inferred from OSV fixed_versions only (no registry lookup): an upgrade is recommended when an OSV finding lists a fixed version greater than the current one.

Full latest-version lookup (PyPI/npm registry) is planned as an optional enhancement.
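A minimal sketch of this rule, assuming plain dotted-numeric versions (real PEP 440 or semver ordering needs a proper parser):

```python
# Naive dotted-numeric comparison; adequate only for this sketch.
def _ver(v: str):
    return tuple(int(p) for p in v.split(".") if p.isdigit())

def upgrade_target(current: str, fixed_versions):
    """Return the highest fixed version above `current`, or None."""
    candidates = [f for f in fixed_versions if _ver(f) > _ver(current)]
    return max(candidates, key=_ver) if candidates else None
```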


Extending the Cache

To refresh the offline cache:

  1. Set vuln_mode: online in a controlled environment with outbound access to api.osv.dev
  2. Run dependency_scanner_tool — new entries are merged into ops/cache/osv_cache.json
  3. Commit the updated cache file

Or use ops/scripts/refresh_osv_cache.py (planned).