microdao-daarion/ops/runbook-sofiia-docs.md
Apple 67225a39fa docs(platform): add policy configs, runbooks, ops scripts and platform documentation

2026-03-03 07:14:53 -08:00


Runbook: Sofiia Console — Projects, Documents, Sessions

Scope: sofiia-console BFF (NODA2) | Storage: SQLite (Phase 1) | Vol: sofiia-data


1. Volume Paths

| Item | Host path | Container path |
| --- | --- | --- |
| SQLite DB | sofiia-data Docker volume | /app/data/sofiia.db |
| Uploaded files | sofiia-data Docker volume | /app/data/uploads/{sha[:2]}/{sha}_(unknown) |

Inspect volume:

docker volume inspect microdao-daarion_sofiia-data
# -> Mountpoint: /var/lib/docker/volumes/.../data/_data
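The content-addressed upload layout above can be sketched as a small helper. This is hypothetical (the helper name and the assumption that the part after the underscore is a caller-supplied suffix, shown as "(unknown)" in the table, are not confirmed by the source):

```python
import hashlib
from pathlib import Path

def upload_path(data: bytes, suffix: str, root: str = "/app/data/uploads") -> Path:
    """Sketch of the layout: the first two hex chars of the sha256 digest
    form a shard directory; the full digest prefixes the stored name."""
    sha = hashlib.sha256(data).hexdigest()
    return Path(root) / sha[:2] / f"{sha}_{suffix}"
```

Sharding by digest prefix keeps any single directory from accumulating every upload.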

2. Backup Strategy

Option A: Filesystem copy (rsync)

Note: copying sofiia.db while the service is writing may capture an inconsistent snapshot; use Option B for a consistent online backup.

# Get volume mountpoint
VOL=$(docker volume inspect microdao-daarion_sofiia-data --format '{{.Mountpoint}}')

# Create timestamped backup
BACKUP_DIR=/opt/backups/sofiia-data/$(date +%Y%m%d_%H%M%S)
mkdir -p "$BACKUP_DIR"
rsync -a "$VOL/" "$BACKUP_DIR/"
echo "Backup: $BACKUP_DIR"

Option B: SQLite online backup

# Create consistent SQLite backup while service is running
docker exec sofiia-console-node2 sqlite3 /app/data/sofiia.db ".backup /app/data/sofiia_backup.db"
docker cp sofiia-console-node2:/app/data/sofiia_backup.db ./backup_$(date +%Y%m%d).db
Scheduled backup via cron (daily at 03:00):

0 3 * * * rsync -a $(docker volume inspect microdao-daarion_sofiia-data --format '{{.Mountpoint}}/') /opt/backups/sofiia/$(date +\%Y\%m\%d_\%H\%M\%S)/ >> /var/log/sofiia-backup.log 2>&1
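Whichever option produced the copy, verify it before trusting it. A minimal check with Python's stdlib sqlite3 module:

```python
import sqlite3

def backup_is_consistent(db_path: str) -> bool:
    """Run SQLite's built-in integrity check on a backup copy.
    Returns True when the first result row is the literal string 'ok'."""
    con = sqlite3.connect(db_path)
    try:
        (result,) = con.execute("PRAGMA integrity_check").fetchone()
        return result == "ok"
    finally:
        con.close()
```

Run this against the backup file, not the live DB, so a long check never blocks the service.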

3. Migration Commands

Phase 1 → Phase 2 (SQLite → PostgreSQL)

When ready to migrate to Postgres:

  1. Set DATABASE_URL=postgresql://user:pass@host:5432/dbname in docker-compose.
  2. Restart service — schemas auto-create via init_db().
  3. Migrate data:
# Export SQLite to SQL
sqlite3 /app/data/sofiia.db .dump > /tmp/sofiia_dump.sql

# Import to Postgres (manual cleanup may be required for SQLite-specific syntax)
psql "$DATABASE_URL" < /tmp/sofiia_dump.sql

Schema version check

docker exec sofiia-console-node2 sqlite3 /app/data/sofiia.db ".tables"
# Expected: documents  messages  projects  sessions
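The same check can be scripted, e.g. for a monitoring hook. A minimal sqlite3 sketch using the expected table set above:

```python
import sqlite3

EXPECTED_TABLES = {"documents", "messages", "projects", "sessions"}

def missing_tables(db_path: str) -> set:
    """Return the expected tables absent from the DB (empty set = healthy)."""
    con = sqlite3.connect(db_path)
    try:
        rows = con.execute(
            "SELECT name FROM sqlite_master WHERE type='table'"
        ).fetchall()
    finally:
        con.close()
    return EXPECTED_TABLES - {name for (name,) in rows}
```

A non-empty result means init_db() has not run against this file.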

4. API Endpoints Reference

| Endpoint | Method | Purpose |
| --- | --- | --- |
| /api/projects | GET | List all projects |
| /api/projects | POST | Create project {name, description} |
| /api/projects/{pid} | GET | Get project details |
| /api/projects/{pid} | PATCH | Update name/description |
| /api/files/upload?project_id=... | POST | Upload file (multipart) |
| /api/files/{file_id}/download | GET | Download file |
| /api/projects/{pid}/documents | GET | List documents |
| /api/projects/{pid}/documents/{did} | GET | Document metadata + text |
| /api/projects/{pid}/search | POST | Keyword search {query} |
| /api/sessions?project_id=... | GET | List sessions |
| /api/sessions/{sid} | GET | Session details |
| /api/sessions/{sid}/title | PATCH | Update session title |
| /api/chat/history?session_id=... | GET | Load message history |
| /api/sessions/{sid}/map | GET | Dialog map nodes + edges |
| /api/sessions/{sid}/fork | POST | Fork session from message |
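For ad-hoc scripting against these endpoints, a tiny URL helper (assumes the localhost:8002 base used elsewhere in this runbook):

```python
from urllib.parse import urlencode

BASE = "http://localhost:8002"  # default sofiia-console port in this runbook

def api_url(path: str, **params) -> str:
    """Build a full URL for one of the endpoints tabulated above."""
    query = f"?{urlencode(params)}" if params else ""
    return f"{BASE}{path}{query}"
```

Example: api_url("/api/chat/history", session_id="abc", limit=20) yields the history endpoint with both query params encoded.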

5. Upload Limits (env-configurable)

| Type | Env var | Default |
| --- | --- | --- |
| Images | UPLOAD_MAX_IMAGE_MB | 10 MB |
| Videos | UPLOAD_MAX_VIDEO_MB | 200 MB |
| Docs | UPLOAD_MAX_DOC_MB | 50 MB |

Change without rebuild:

# in docker-compose.node2-sofiia.yml
environment:
  - UPLOAD_MAX_IMAGE_MB=20
  - UPLOAD_MAX_DOC_MB=100

Then: docker compose restart sofiia-console
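The env-over-default resolution presumably works like the sketch below (hypothetical; the authoritative logic is in the BFF code):

```python
import os

# Defaults in MB, mirroring the table in this section.
UPLOAD_DEFAULTS_MB = {"IMAGE": 10, "VIDEO": 200, "DOC": 50}

def max_upload_mb(kind: str) -> int:
    """Env var (e.g. UPLOAD_MAX_IMAGE_MB) wins over the built-in default."""
    return int(os.environ.get(f"UPLOAD_MAX_{kind}_MB", UPLOAD_DEFAULTS_MB[kind]))
```

Because the value is read from the environment, a compose restart picks up changes with no rebuild.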


6. Phase 2 Feature Flags

# Enable Fabric OCR for images (routes through Router /v1/capability/ocr)
USE_FABRIC_OCR=true

# Enable Qdrant embedding indexing for documents
USE_EMBEDDINGS=true

Both default to false (no impact on baseline performance).
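A typical parser for boolean flags like these (a sketch; check the service code for the exact set of accepted truthy strings):

```python
import os

def env_flag(name: str, default: bool = False) -> bool:
    """Interpret common truthy strings for a boolean env flag."""
    raw = os.environ.get(name)
    if raw is None:
        return default
    return raw.strip().lower() in {"1", "true", "yes", "on"}
```

With default=False, an unset flag leaves baseline behaviour untouched, matching the note above.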


7. Troubleshooting

DB not initialized

docker logs sofiia-console-node2 | grep -i "DB init"
# Expected: "sofiia-console DB initialised"

If missing, restart the container; DB initialisation runs in the lifespan() startup hook.

Upload failing (413)

Check the file size against the limits in section 5, then confirm the API is responding:

curl -s http://localhost:8002/api/projects | jq

If 500 → check logs: docker logs sofiia-console-node2 --tail 50

Session not restoring after page reload

  • Browser localStorage must have sofiia_session_id
  • Check: GET /api/chat/history?session_id={id}&limit=20
  • If empty: session exists but has 0 messages (new session)

Dialog map empty

curl -s "http://localhost:8002/api/sessions?project_id=default&limit=5" | jq
curl -s "http://localhost:8002/api/sessions/{session_id}/map" | jq '.nodes | length'

If 0 nodes: no messages saved yet. Ensure _do_save_memory is not blocked (check Memory Service health).

Volume full

docker system df
du -sh $(docker volume inspect microdao-daarion_sofiia-data --format '{{.Mountpoint}}')

Clean up old uploads manually (storage is content-addressed: a blob is safe to delete by sha only when no DB row references it):

sqlite3 /app/data/sofiia.db "SELECT file_id FROM documents" > /tmp/active_files.txt
# Then diff with actual /app/data/uploads/* to find orphans
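The diff step can be scripted. A sketch that assumes documents.file_id is the content sha prefixing each stored filename (an assumption; verify against the schema before deleting anything):

```python
import sqlite3
from pathlib import Path

def find_orphan_uploads(db_path: str, uploads_root: str):
    """List files under uploads/ whose sha prefix has no documents row.
    ASSUMPTION: the stored name is '{sha}_...' and file_id holds that sha."""
    con = sqlite3.connect(db_path)
    try:
        active = {fid for (fid,) in con.execute("SELECT file_id FROM documents")}
    finally:
        con.close()
    return [
        f for f in Path(uploads_root).glob("*/*")
        if f.is_file() and f.name.split("_", 1)[0] not in active
    ]
```

Review the returned list by hand before any rm; a wrong mapping here deletes live documents.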

8. Testing

Run unit tests

cd /opt/microdao-daarion
python3 -m pytest tests/test_sofiia_docs.py -v

Smoke test: create project + upload

BASE=http://localhost:8002

# Create project
curl -s -X POST "$BASE/api/projects" -H "Content-Type: application/json" \
  -d '{"name":"Test Project","description":"Smoke test"}' | jq .

# Upload file
curl -s -X POST "$BASE/api/files/upload?project_id=default&title=Test+Doc" \
  -F "file=@/etc/hostname" | jq '.doc_id, .sha256, .size_bytes'

# List docs
curl -s "$BASE/api/projects/default/documents" | jq '.[].filename'