Files
Apple 129e4ea1fc feat(platform): add new services, tools, tests and crews modules
New router intelligence modules (26 files): alert_ingest/store, audit_store,
architecture_pressure, backlog_generator/store, cost_analyzer, data_governance,
dependency_scanner, drift_analyzer, incident_* (5 files), llm_enrichment,
platform_priority_digest, provider_budget, release_check_runner, risk_* (6 files),
signature_state_store, sofiia_auto_router, tool_governance

New services:
- sofiia-console: Dockerfile, adapters/, monitor/nodes/ops/voice modules, launchd, react static
- memory-service: integration_endpoints, integrations, voice_endpoints, static UI
- aurora-service: full app suite (analysis, job_store, orchestrator, reporting, schemas, subagents)
- sofiia-supervisor: new supervisor service
- aistalk-bridge-lite: Telegram bridge lite
- calendar-service: CalDAV calendar service with reminders
- mlx-stt-service / mlx-tts-service: Apple Silicon speech services
- binance-bot-monitor: market monitor service
- node-worker: STT/TTS memory providers

New tools (9): agent_email, browser_tool, contract_tool, observability_tool,
oncall_tool, pr_reviewer_tool, repo_tool, safe_code_executor, secure_vault

New crews: agromatrix_crew (12 modules: depth_classifier, doc_facts, doc_focus,
farm_state, light_reply, llm_factory, memory_manager, proactivity, reflection_engine,
session_context, style_adapter, telemetry)

Tests: 85+ test files for all new modules
Made-with: Cursor
2026-03-03 07:14:14 -08:00

118 lines
3.6 KiB
Python

"""
Sofiia Supervisor — Pydantic models for HTTP API
"""
from __future__ import annotations
import datetime
from enum import Enum
from typing import Any, Dict, List, Optional
from pydantic import BaseModel, Field
class RunStatus(str, Enum):
    QUEUED = "queued"
    RUNNING = "running"
    SUCCEEDED = "succeeded"
    FAILED = "failed"
    CANCELLED = "cancelled"


class EventType(str, Enum):
    NODE_START = "node_start"
    NODE_END = "node_end"
    TOOL_CALL = "tool_call"
    TOOL_RESULT = "tool_result"
    ERROR = "error"
# ─── Run event (stored without payload for privacy) ──────────────────────────
class RunEvent(BaseModel):
    ts: str = Field(description="ISO timestamp")
    type: EventType
    node: Optional[str] = None
    tool: Optional[str] = None
    # payload is intentionally NOT stored — only correlation/size info
    details: Dict[str, Any] = Field(default_factory=dict)
# ─── Run metadata stored in Redis ────────────────────────────────────────────
class RunRecord(BaseModel):
    run_id: str
    graph: str
    status: RunStatus = RunStatus.QUEUED
    agent_id: str
    workspace_id: str
    user_id: str
    started_at: Optional[str] = None
    finished_at: Optional[str] = None
    result: Optional[Dict[str, Any]] = None
    error: Optional[str] = None
    events: List[RunEvent] = Field(default_factory=list)
# ─── HTTP Request/Response models ────────────────────────────────────────────
class StartRunRequest(BaseModel):
    workspace_id: str = Field(default="daarion")
    user_id: str = Field(default="system")
    agent_id: str = Field(default="sofiia")
    input: Dict[str, Any] = Field(default_factory=dict)


class StartRunResponse(BaseModel):
    run_id: str
    status: RunStatus
    result: Optional[Dict[str, Any]] = None
class GetRunResponse(BaseModel):
    run_id: str
    graph: str
    status: RunStatus
    started_at: Optional[str] = None
    finished_at: Optional[str] = None
    result: Optional[Dict[str, Any]] = None
    events: List[RunEvent] = Field(default_factory=list)


class CancelRunResponse(BaseModel):
    run_id: str
    status: RunStatus
    message: str
# ─── Graph input schemas ──────────────────────────────────────────────────────
class ReleaseCheckInput(BaseModel):
    diff_text: Optional[str] = None
    service_name: Optional[str] = None
    openapi_base: Optional[Dict[str, str]] = None
    openapi_head: Optional[Dict[str, str]] = None
    risk_profile: str = "default"
    fail_fast: bool = True
    run_smoke: bool = False
    run_drift: bool = True
    run_deps: bool = True
    deps_targets: List[str] = Field(default_factory=lambda: ["python", "node"])
    deps_vuln_mode: str = "offline_cache"
    deps_fail_on: List[str] = Field(default_factory=lambda: ["CRITICAL", "HIGH"])
    drift_categories: List[str] = Field(
        default_factory=lambda: ["services", "openapi", "nats", "tools"]
    )
    timeouts: Dict[str, float] = Field(
        default_factory=lambda: {"overall_sec": 180, "per_gate_sec": 60}
    )
class IncidentTriageInput(BaseModel):
    service: str
    symptom: str
    time_range: Optional[Dict[str, str]] = None  # {"from": ISO, "to": ISO}
    env: str = "prod"
    include_traces: bool = False
    max_log_lines: int = 120
    log_query_hint: Optional[str] = None
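
The run models above can be exercised with a short JSON round-trip, as a Redis-backed store presumably would do. This is an illustrative sketch, not part of the commit: it re-declares trimmed copies of the models for self-containment and assumes pydantic v2 (`model_dump_json` / `model_validate_json`); the run/node names are made up.

```python
from datetime import datetime, timezone
from enum import Enum
from typing import Any, Dict, List, Optional
from pydantic import BaseModel, Field

# Trimmed copies of the commit's models, enough for a round-trip demo.
class RunStatus(str, Enum):
    QUEUED = "queued"
    RUNNING = "running"
    SUCCEEDED = "succeeded"

class EventType(str, Enum):
    NODE_START = "node_start"
    NODE_END = "node_end"

class RunEvent(BaseModel):
    ts: str = Field(description="ISO timestamp")
    type: EventType
    node: Optional[str] = None
    details: Dict[str, Any] = Field(default_factory=dict)

class RunRecord(BaseModel):
    run_id: str
    graph: str
    status: RunStatus = RunStatus.QUEUED
    agent_id: str
    workspace_id: str
    user_id: str
    events: List[RunEvent] = Field(default_factory=list)

# Build a record and append an event (ids/names are hypothetical).
record = RunRecord(run_id="r-1", graph="release_check",
                   agent_id="sofiia", workspace_id="daarion", user_id="system")
record.events.append(RunEvent(ts=datetime.now(timezone.utc).isoformat(),
                              type=EventType.NODE_START, node="diff_gate"))
record.status = RunStatus.RUNNING

# Serialize to JSON and validate back — the shape a Redis store would persist.
blob = record.model_dump_json()
restored = RunRecord.model_validate_json(blob)
print(restored.events[0].type.value)  # prints "node_start"
```

Note that because the enums mix in `str`, their values serialize as plain JSON strings, and validation coerces them back into enum members.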