NCS:
- _collect_worker_caps() fetches capability flags from node-worker /caps
- _derive_capabilities() merges served model types + worker provider flags
- installed_artifacts replaces inventory_only (disk scan with DISK_SCAN_PATHS env)
- New endpoints: /capabilities/caps, /capabilities/installed
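The merge step in `_derive_capabilities()` can be sketched roughly like this. Only the idea (served model types + worker provider flags) comes from the list above; the function signature, flag names, and merge rule are illustrative assumptions:

```python
from typing import Dict, List, Set

def derive_capabilities(served_model_types: Set[str],
                        worker_caps: Dict[str, bool]) -> List[str]:
    """Illustrative merge of the model types a node serves with the
    boolean provider flags its worker reports on /caps."""
    caps = set(served_model_types)
    # Worker-side providers (stt/tts/ocr/image) contribute a capability
    # only when the corresponding flag is enabled.
    caps |= {name for name, enabled in worker_caps.items() if enabled}
    return sorted(caps)

# Example: an LLM/vision node whose worker also enables STT and OCR.
caps = derive_capabilities(
    {"llm", "vision"},
    {"stt": True, "tts": False, "ocr": True, "image": False},
)
```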
Node Worker:
- STT_PROVIDER, TTS_PROVIDER, OCR_PROVIDER, IMAGE_PROVIDER env flags
- /caps endpoint returns capabilities + providers for NCS aggregation
- STT adapter (providers/stt_mlx_whisper.py) — remote + local mode
- TTS adapter (providers/tts_mlx_kokoro.py) — remote + local mode
- OCR handler via vision_prompted (ollama_vision with OCR prompt)
- NATS subjects: node.{id}.stt/tts/ocr/image.request
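A minimal helper for the subject pattern above (the `node.{id}.{capability}.request` shape is from the list; the node id in the example is hypothetical):

```python
CAPABILITY_SUBJECTS = ("stt", "tts", "ocr", "image")

def request_subject(node_id: str, capability: str) -> str:
    """Build the per-node NATS request subject,
    following the node.{id}.{capability}.request pattern."""
    if capability not in CAPABILITY_SUBJECTS:
        raise ValueError(f"unsupported capability: {capability}")
    return f"node.{node_id}.{capability}.request"

# Hypothetical node id, for illustration only.
subject = request_subject("mac-studio-01", "stt")
```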
Router:
- POST /v1/capability/{stt,tts,ocr,image} — capability-based offload routing
- GET /v1/capabilities — global view with capabilities_by_node
- require_fresh_caps(ttl) preflight guard
- find_nodes_with_capability(cap) + load-based node selection
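The capability filter plus load-based selection and freshness guard could look like this sketch. The function names echo the list above, but the node-record shape (`capabilities`, `load`, `caps_ts`) is an assumption:

```python
import time
from typing import Dict, List, Optional

def find_nodes_with_capability(nodes: Dict[str, dict], cap: str) -> List[str]:
    """Return ids of nodes whose advertised capability list includes cap."""
    return [nid for nid, n in nodes.items() if cap in n.get("capabilities", [])]

def pick_node(nodes: Dict[str, dict], cap: str, ttl_s: float = 30.0) -> Optional[str]:
    """Pick the least-loaded capable node, skipping nodes whose capability
    report is older than ttl_s (the preflight freshness guard)."""
    now = time.time()
    fresh = [nid for nid in find_nodes_with_capability(nodes, cap)
             if now - nodes[nid].get("caps_ts", 0) <= ttl_s]
    if not fresh:
        return None
    return min(fresh, key=lambda nid: nodes[nid].get("load", 0.0))
```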
Ops:
- ops/fabric_snapshot.py — full runtime snapshot collector
- ops/fabric_preflight.sh — quick check + snapshot save + diff
- docs/fabric_contract.md — Dev Contract v0.1 (preflight-first)
- tests/test_fabric_contract.py — CI enforcement (6 tests)
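To give a rough sense of the snapshot-then-diff flow in the ops tooling above: the diff step amounts to comparing two saved snapshots key by key. The snapshot shape here is an assumption; the real `fabric_snapshot.py` output format isn't shown in this summary:

```python
def diff_snapshots(prev: dict, curr: dict) -> dict:
    """Compare two runtime snapshots: report which keys appeared,
    disappeared, or changed value since the previous save."""
    added = sorted(set(curr) - set(prev))
    removed = sorted(set(prev) - set(curr))
    changed = sorted(k for k in set(prev) & set(curr) if prev[k] != curr[k])
    return {"added": added, "removed": removed, "changed": changed}

# Example: node1 gained "ocr", node2 disappeared, node3 appeared.
delta = diff_snapshots(
    {"node1": ["llm"], "node2": ["image"]},
    {"node1": ["llm", "ocr"], "node3": ["stt"]},
)
```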
Made-with: Cursor
48 lines
1.3 KiB
Python
"""Canonical job envelope for cross-node inference offload."""
|
|
from typing import Any, Dict, List, Literal, Optional
|
|
from pydantic import BaseModel, Field
|
|
import time
|
|
import uuid
|
|
|
|
|
|
def _ulid() -> str:
|
|
return str(uuid.uuid4())
|
|
|
|
|
|
class JobRequest(BaseModel):
|
|
job_id: str = Field(default_factory=_ulid)
|
|
trace_id: str = ""
|
|
actor_agent_id: str = ""
|
|
target_agent_id: str = ""
|
|
required_type: Literal["llm", "vision", "stt", "tts", "image", "ocr"] = "llm"
|
|
deadline_ts: int = 0
|
|
idempotency_key: str = ""
|
|
payload: Dict[str, Any] = Field(default_factory=dict)
|
|
hints: Dict[str, Any] = Field(default_factory=dict)
|
|
|
|
def remaining_ms(self) -> int:
|
|
if self.deadline_ts <= 0:
|
|
return 30_000
|
|
return max(0, self.deadline_ts - int(time.time() * 1000))
|
|
|
|
def effective_idem_key(self) -> str:
|
|
return self.idempotency_key or self.job_id
|
|
|
|
|
|
class JobError(BaseModel):
|
|
code: str = "UNKNOWN"
|
|
message: str = ""
|
|
|
|
|
|
class JobResponse(BaseModel):
|
|
job_id: str = ""
|
|
trace_id: str = ""
|
|
node_id: str = ""
|
|
status: Literal["ok", "busy", "timeout", "error"] = "ok"
|
|
provider: str = ""
|
|
model: str = ""
|
|
latency_ms: int = 0
|
|
result: Optional[Dict[str, Any]] = None
|
|
error: Optional[JobError] = None
|
|
cached: bool = False
|