8 Commits

Author SHA1 Message Date
Apple
5f2fd7905f docs: sync consolidation and session starter 2026-02-16 02:32:27 -08:00
Apple
3146e74ce8 docs: sync consolidation and session starter 2026-02-16 02:32:27 -08:00
Apple
fc2d86bd1b docs: sync consolidation and session starter 2026-02-16 02:32:08 -08:00
Apple
8ba71f240f docs: sync consolidation and session starter 2026-02-16 02:32:08 -08:00
Lord of Chaos
f82404dc36 Merge pull request #2 from IvanTytar/docs/node1-sync
docs(node1): safe deploy + snapshot
2026-02-10 05:35:07 -08:00
Apple
0d8582d552 docs(node1): add safe deploy workflow and snapshot
Document canonical sync between GitHub and NODA1 and add a snapshot script to capture runtime state without editing production by hand.

Co-authored-by: Cursor <cursoragent@cursor.com>
2026-02-10 05:33:32 -08:00
Apple
c1cc5591f6 chore(helion): respond to direct mentions in groups
Clarify Helion group behavior: stay silent unless energy topic or direct mention, but answer operational questions when directly addressed.

Co-authored-by: Cursor <cursoragent@cursor.com>
2026-02-10 04:30:01 -08:00
Apple
3eb628e4ff fix(router): guard DSML tool-call flows
Prevent DeepSeek DSML from leaking to users and avoid returning raw memory_search/web results when DSML is detected.

Co-authored-by: Cursor <cursoragent@cursor.com>
2026-02-10 04:30:01 -08:00
30 changed files with 1553 additions and 22 deletions

.github/workflows/docs-lint.yml vendored Normal file

@@ -0,0 +1,36 @@
name: docs-lint
on:
push:
branches: [main]
paths:
- "**/*.md"
- ".markdownlint.yml"
- ".markdownlintignore"
- ".github/workflows/docs-lint.yml"
- "scripts/docs/docs_lint.sh"
- "docs/standards/lint_scope.txt"
pull_request:
branches: [main]
paths:
- "**/*.md"
- ".markdownlint.yml"
- ".markdownlintignore"
- ".github/workflows/docs-lint.yml"
- "scripts/docs/docs_lint.sh"
- "docs/standards/lint_scope.txt"
jobs:
markdownlint:
runs-on: ubuntu-latest
steps:
- name: Checkout
uses: actions/checkout@v4
- name: Setup Node
uses: actions/setup-node@v4
with:
node-version: "20"
- name: Lint markdown
run: bash scripts/docs/docs_lint.sh

.markdownlint.yml Normal file

@@ -0,0 +1,17 @@
default: true
MD013:
line_length: 180
heading_line_length: 180
code_block_line_length: 220
MD024:
allow_different_nesting: true
MD033: false
MD041: false
MD022: false
MD029: false
MD031: false
MD032: false
MD060: false

.markdownlintignore Normal file

@@ -0,0 +1,6 @@
.worktrees/
node_modules/
site/
rollback_backups/
docs/consolidation/_node1_runtime_docs/
**/*.bak_*

NODA1-SAFE-DEPLOY.md Normal file

@@ -0,0 +1,71 @@
# NODA1 Safe Deploy (Canonical Workflow)
**Goal:** keep laptop ↔ GitHub ↔ NODA1 ↔ the live Docker stack in sync without breaking running production or spawning "invisible" branches.
**Canonical source of truth:** GitHub `origin/main`.
---
## Directory roles on NODA1
- `/opt/microdao-daarion` — legacy checkout (its history may have diverged). Do not use it for `git pull/rebase` if there are conflicts.
- `/opt/microdao-daarion.repo` — canonical deployment checkout (a git worktree on `origin/main`). Deploy only from here.
---
## Golden rules
1. No manual code or docs edits on NODA1 (except an emergency hotfix, followed by a PR).
2. No branches or development work on NODA1.
3. Always run Docker Compose with the same project name: `-p microdao-daarion`.
4. Secrets are never committed; keep `*.example` files plus a short note on where the real values live.
---
## Safe sync (NODA1)
```bash
ssh root@144.76.224.179
cd /opt/microdao-daarion
git fetch origin
cd /opt/microdao-daarion.repo
git pull --ff-only
git rev-parse --short HEAD
```
---
## Safe deploy of a single service (minimal risk)
### Router
```bash
cd /opt/microdao-daarion.repo
docker compose -p microdao-daarion -f docker-compose.node1.yml build router
docker compose -p microdao-daarion -f docker-compose.node1.yml up -d --no-deps --force-recreate router
curl -fsS http://127.0.0.1:9102/health
```
### Gateway
```bash
cd /opt/microdao-daarion.repo
docker compose -p microdao-daarion -f docker-compose.node1.yml build gateway
docker compose -p microdao-daarion -f docker-compose.node1.yml up -d --no-deps --force-recreate gateway
curl -fsS http://127.0.0.1:9300/health
```
---
## Runtime snapshot
```bash
cd /opt/microdao-daarion.repo
./scripts/node1/snapshot_node1.sh > "/opt/backups/node1_snapshot_$(date +%Y%m%d-%H%M%S).txt"
```


@@ -5,6 +5,14 @@
---
## Session Start (Canonical)
- Start every new session from: `docs/SESSION_STARTER.md`
- Consolidation and tagging of old/new/current docs: `docs/consolidation/README.md`
- Curated list of key documents and their statuses: `docs/consolidation/docs_registry_curated.csv`
---
## 🗂️ Where things live
### Main repositories
@@ -26,6 +34,19 @@
| **Project Root** | `/opt/microdao-daarion/` |
| **Docker Network** | `dagi-network` |
### NODA1 Sync Policy (Repo ↔ Runtime)
**Canonical truth:** GitHub `origin/main`.
On NODA1:
- `/opt/microdao-daarion.repo` — canonical deployment checkout (a git worktree on `origin/main`); deploy only from here.
- `/opt/microdao-daarion` — legacy checkout; do not use it for `git pull/rebase` if its history has diverged.
**Safe deploy runbook:** `NODA1-SAFE-DEPLOY.md`
**Runtime snapshot:** `scripts/node1/snapshot_node1.sh`
---
## 🎯 AGENT REGISTRY (Single Source of Truth)

docs/SESSION_STARTER.md Normal file

@@ -0,0 +1,101 @@
# SESSION STARTER (Documentation + Runtime Truth)
Date: 2026-02-16
Purpose: single context file to start any new session without losing architectural state.
## Canonical Repo
- Main working repo: `/Users/apple/github-projects/microdao-daarion`
- Runtime on NODE1: `/opt/microdao-daarion`
Important: keep docs and runtime aligned via explicit drift checks, not assumptions.
## Core Entry Documents
1. `PROJECT-MASTER-INDEX.md` (entry point to docs map)
2. `config/README.md` (how to add/modify agents)
3. `docs/NODA1-AGENT-ARCHITECTURE.md` (node1 architecture and agent wiring)
4. `NODA1-SAFE-DEPLOY.md` (safe deployment flow)
5. `docs/consolidation/README.md` (docs hub)
6. `docs/consolidation/docs_registry_curated.csv` (curated doc truth table)
## Session Bootstrap (Services + Docs)
1. Run integrations bootstrap report:
```bash
bash scripts/docs/session_bootstrap.sh
```
Output:
- `docs/consolidation/INTEGRATIONS_STATUS_LATEST.md`
- `docs/consolidation/INTEGRATIONS_STATUS_<timestamp>.md`
2. Dry-run docs sync to remotes:
```bash
bash scripts/docs/docs_sync.sh --dry-run
```
3. Apply docs sync (only after review):
```bash
bash scripts/docs/docs_sync.sh --apply --targets github,gitea
```
4. Service adapters (Jupyter + Pieces):
```bash
bash scripts/docs/services_sync.sh --dry-run
# apply mode:
bash scripts/docs/services_sync.sh --apply
```
5. Docs lint and standards:
```bash
bash scripts/docs/docs_lint.sh
```
6. Docs backup (explicit run):
```bash
bash scripts/docs/docs_backup.sh --dry-run
bash scripts/docs/docs_backup.sh --apply
```
## Runtime-First Facts (must re-check each session)
1. NODE1 branch/SHA:
```bash
ssh root@<NODE1> "cd /opt/microdao-daarion && git rev-parse --abbrev-ref HEAD && git rev-parse HEAD"
```
2. Core health:
```bash
curl -sS http://127.0.0.1:9102/health
curl -sS http://127.0.0.1:9300/health
```
3. Canary suite:
```bash
cd /opt/microdao-daarion
./ops/canary_all.sh
./ops/canary_senpai_osr_guard.sh
```
4. Router endpoint contract:
- Active: `POST /v1/agents/{agent_id}/infer`
- Not active: `POST /route` (returns 404 on current runtime)
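The active contract above can be exercised through a tiny URL helper (a sketch: the agent id `helion` and the JSON payload in the comment are hypothetical; the port matches the router health check earlier in this file):

```shell
#!/usr/bin/env bash
# Build the active infer endpoint URL; POST /route is deliberately not used.
ROUTER_BASE="${ROUTER_BASE:-http://127.0.0.1:9102}"

infer_url() {
  # $1 = agent id, e.g. "helion" (hypothetical)
  printf '%s/v1/agents/%s/infer' "$ROUTER_BASE" "$1"
}

# Hypothetical usage against the live router:
#   curl -fsS -X POST "$(infer_url helion)" \
#     -H 'Content-Type: application/json' -d '{"input":"ping"}'
```

Keeping the path construction in one place makes it harder to drift back to the retired `/route` endpoint.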
## NODE3/NODE4 Policy
- NODE3 and NODE4 remain part of target architecture.
- If currently unreachable, mark as `DEGRADED` (not removed).
- Re-enable dependent flows only after connectivity + health checks pass.
## Documentation Status Model
- `new-canonical`: active docs in canonical repo.
- `runtime-fact`: docs/snapshots that reflect current live runtime behavior.
- `legacy-worktree`: old but useful strategic docs in `.worktrees/*`.
- `legacy-desktop`: docs from old Desktop repo (`MicroDAO 3`).
- `needs-triage`: unresolved status.
## Governance Rule
No deployment/reconfiguration based only on docs text.
Always confirm against live runtime facts (health, canary, config hashes, active endpoints).
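One way to make the "config hashes" check concrete is a portable file-hash comparison (a sketch only; which paths to compare is up to the operator, and the `sha256sum`/`shasum` fallback is an assumption about the hosts involved):

```shell
#!/usr/bin/env bash
# Portable SHA-256 of a file: Linux usually ships sha256sum, macOS ships shasum.
sha256_of() {
  if command -v sha256sum >/dev/null 2>&1; then
    sha256sum "$1" | awk '{print $1}'
  else
    shasum -a 256 "$1" | awk '{print $1}'
  fi
}

# Succeed only when both copies hash identically (i.e. no drift).
same_sha256() {
  [ "$(sha256_of "$1")" = "$(sha256_of "$2")" ]
}
```

A drift check would then compare the repo copy of a config file against the copy fetched from `/opt/microdao-daarion` on NODE1 before trusting what the docs say about it.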

docs/backups/LATEST.txt Normal file

@@ -0,0 +1 @@
/Users/apple/github-projects/microdao-daarion/docs/backups/docs_backup_20260216-022549.tar.gz


@@ -0,0 +1,31 @@
# Integrations Bootstrap Status
Generated: 2026-02-16 10:25:49 UTC
Repo: /Users/apple/github-projects/microdao-daarion
## Gitea
- **gitea_http**: OK - http_code=200 (http://127.0.0.1:3000)
- **gitea_git_remote**: OK - gitea http://localhost:3000/daarion-admin/microdao-daarion.git;
## GitHub
- **gh_auth**: OK - IvanTytar (keyring)
- **github_git_remote**: OK - origin git@github.com:IvanTytar/microdao-daarion.git;
## Jupyter
- **jupyter_cli**: DEGRADED - jupyter not found in PATH
- **notebooks_dir**: OK - /Users/apple/notebooks (ipynb_count=7)
## Pieces
- **pieces_extension**: OK - cursor extensions matched=1
- **pieces_data_dir**: INFO - /Users/apple/Library/Application Support/Pieces not found
## Next
- Run docs sync dry-run: bash scripts/docs/docs_sync.sh --dry-run
- Apply sync to remotes: bash scripts/docs/docs_sync.sh --apply --targets github,gitea


@@ -0,0 +1,39 @@
# Docs Consolidation Hub
Single entry point for documentation reconciliation across:
- canonical repo docs
- legacy repo copies
- worktree docs
- NODE1 runtime docs snapshot
- local notebooks
## Files
- `docs_inventory.csv` — machine-readable inventory with classification.
- `docs_inventory_summary.txt` — class-level counts.
- `docs_registry_curated.csv` — manually curated truth table for key docs (startup + runtime + legacy refs).
- `_node1_runtime_docs/` — runtime docs snapshot pulled from NODE1.
## Classification
- `runtime-fact` — observed in NODE1 runtime snapshot.
- `new-canonical` — canonical docs in active repo (`docs/runbooks`, master index, safe deploy).
- `legacy-worktree` — docs from `.worktrees/*`.
- `legacy-desktop` — docs from `/Users/apple/Desktop/MicroDAO/MicroDAO 3`.
- `needs-triage` — requires manual decision.
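The classification above can be mirrored by a simple path matcher (a sketch only; the real assignment lives in `build_docs_hub_inventory.py`, and these glob patterns are assumptions about how the source trees are laid out):

```shell
#!/usr/bin/env bash
# Map a doc path to one of the consolidation statuses listed above.
# The _node1_runtime_docs and .worktrees patterns must come before the
# general canonical-repo pattern, since those trees live inside it.
classify_doc() {
  case "$1" in
    */docs/consolidation/_node1_runtime_docs/*) echo "runtime-fact" ;;
    */.worktrees/*)                             echo "legacy-worktree" ;;
    */Desktop/MicroDAO/*)                       echo "legacy-desktop" ;;
    */github-projects/microdao-daarion/*)       echo "new-canonical" ;;
    *)                                          echo "needs-triage" ;;
  esac
}
```

Anything that falls through to `needs-triage` is exactly the set that requires a manual decision.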
## Rebuild inventory
```bash
python3 /Users/apple/github-projects/microdao-daarion/scripts/docs/build_docs_hub_inventory.py
```
## Service Automation
```bash
bash /Users/apple/github-projects/microdao-daarion/scripts/docs/session_bootstrap.sh
bash /Users/apple/github-projects/microdao-daarion/scripts/docs/services_sync.sh --dry-run
bash /Users/apple/github-projects/microdao-daarion/scripts/docs/services_sync.sh --apply
bash /Users/apple/github-projects/microdao-daarion/scripts/docs/docs_lint.sh
bash /Users/apple/github-projects/microdao-daarion/scripts/docs/docs_sync.sh --dry-run
```


@@ -0,0 +1,31 @@
# Documentation Sources Map
This file tracks where documentation is collected from for consolidation.
## Primary
- Canonical repo: `/Users/apple/github-projects/microdao-daarion`
- Legacy repo: `/Users/apple/Desktop/MicroDAO/MicroDAO 3`
- Worktrees:
- `/Users/apple/github-projects/microdao-daarion/.worktrees/origin-main`
- `/Users/apple/github-projects/microdao-daarion/.worktrees/docs-node1-sync`
- NODE1 runtime snapshot mirror:
- `/Users/apple/github-projects/microdao-daarion/docs/consolidation/_node1_runtime_docs`
- Local notebooks:
- `/Users/apple/notebooks`
## Connectivity/Integrations (verified 2026-02-16)
- GitHub CLI: authenticated (`gh auth status` -> account `IvanTytar`, scopes include `repo`)
- Gitea web UI: reachable at `http://127.0.0.1:3000` (HTTP 200)
- Jupyter CLI: not found in current PATH (`jupyter: command not found`), notebooks directory exists at `/Users/apple/notebooks`
- Pieces: Cursor extension found (`meshintelligenttechnologiesinc.pieces-vscode-3.0.1-universal`)
- NODE1 SSH: intermittent (periodic `connection refused`); use retry/backoff for snapshot refresh.
- NODE3/NODE4: currently unreachable from this workstation and from NODE1.
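The retry/backoff suggestion for the flaky NODE1 SSH link can be sketched as a small wrapper (attempt counts and delays here are arbitrary assumptions, not values from the repo's scripts):

```shell
#!/usr/bin/env bash
# Retry a command with exponential backoff (1s, 2s, 4s, ...).
retry() {
  local max="$1"; shift
  local attempt=1 delay=1
  while ! "$@"; do
    if [ "$attempt" -ge "$max" ]; then
      echo "giving up after $attempt attempts: $*" >&2
      return 1
    fi
    sleep "$delay"
    delay=$((delay * 2))
    attempt=$((attempt + 1))
  done
}

# Hypothetical usage for a snapshot refresh:
#   retry 5 ssh root@144.76.224.179 "cd /opt/microdao-daarion && git rev-parse HEAD"
```

Wrapping only the SSH hop keeps the rest of the snapshot pipeline unchanged while absorbing the periodic `connection refused`.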
## Automation Scripts
- `scripts/docs/session_bootstrap.sh` — refreshes integration status and writes `INTEGRATIONS_STATUS_LATEST.md`.
- `scripts/docs/docs_sync.sh` — safe docs sync automation with `--dry-run` by default and explicit `--apply`.
- `scripts/docs/docs_lint.sh` — markdown lint for canonical documentation.
- `scripts/docs/docs_backup.sh` — timestamped docs backup with retention rotation.
- `scripts/docs/jupyter_sync.sh` — adapter for Jupyter server API status + notebooks index export.
- `scripts/docs/pieces_sync.sh` — adapter for Pieces local runtime/data index + optional API probe.
- `scripts/docs/services_sync.sh` — orchestrator for bootstrap + Jupyter + Pieces sync adapters.


@@ -0,0 +1,16 @@
path,title,status,source,notes,last_verified
/Users/apple/github-projects/microdao-daarion/docs/SESSION_STARTER.md,Session Starter,new-canonical,canonical-repo,Primary startup context for new sessions,2026-02-16
/Users/apple/github-projects/microdao-daarion/PROJECT-MASTER-INDEX.md,Master Index,new-canonical,canonical-repo,Single entry point for docs navigation,2026-02-16
/Users/apple/github-projects/microdao-daarion/config/README.md,Agent Registry README,new-canonical,canonical-repo,Canonical process for adding/modifying agents,2026-02-16
/Users/apple/github-projects/microdao-daarion/docs/NODA1-AGENT-ARCHITECTURE.md,NODA1 Agent Architecture,new-canonical,canonical-repo,Architecture and wiring for NODE1 agents,2026-02-16
/Users/apple/github-projects/microdao-daarion/NODA1-SAFE-DEPLOY.md,NODA1 Safe Deploy,new-canonical,canonical-repo,Safe deploy workflow and rollback gates,2026-02-16
/Users/apple/github-projects/microdao-daarion/docs/runbooks/AGENT_REGISTRY_NODE1_DECISION_2026-02-16.md,Agent Registry Decision,runtime-fact,canonical-repo,Decision log aligned to live NODE1 runtime,2026-02-16
/Users/apple/github-projects/microdao-daarion/docs/runbooks/NODE_ARCH_RECONCILIATION_PLAN_2026-02-16.md,Node Arch Reconciliation Plan,runtime-fact,canonical-repo,Runtime-first reconciliation plan across nodes,2026-02-16
/Users/apple/github-projects/microdao-daarion/docs/consolidation/_node1_runtime_docs/PROJECT-MASTER-INDEX.md,Runtime Snapshot Master Index,runtime-fact,node1-snapshot,Snapshot from /opt/microdao-daarion,2026-02-16
/Users/apple/github-projects/microdao-daarion/docs/consolidation/_node1_runtime_docs/config/README.md,Runtime Snapshot Config README,runtime-fact,node1-snapshot,Snapshot from /opt/microdao-daarion,2026-02-16
/Users/apple/github-projects/microdao-daarion/docs/consolidation/_node1_runtime_docs/docs/NODA1-AGENT-ARCHITECTURE.md,Runtime Snapshot NODA1 Architecture,runtime-fact,node1-snapshot,Snapshot from /opt/microdao-daarion,2026-02-16
/Users/apple/github-projects/microdao-daarion/.worktrees/origin-main/IMPLEMENTATION-STATUS.md,Implementation Status,legacy-worktree,worktree-origin-main,Legacy strategic doc kept for reference only,2026-02-16
/Users/apple/github-projects/microdao-daarion/.worktrees/origin-main/ARCHITECTURE-150-NODES.md,Architecture 150 Nodes,legacy-worktree,worktree-origin-main,Legacy scale architecture reference,2026-02-16
/Users/apple/github-projects/microdao-daarion/.worktrees/origin-main/infrastructure/auth/AUTH-IMPLEMENTATION-PLAN.md,Auth Implementation Plan,legacy-worktree,worktree-origin-main,Legacy auth rollout plan,2026-02-16
/Users/apple/github-projects/microdao-daarion/.worktrees/origin-main/infrastructure/matrix-gateway/README.md,Matrix Gateway README,legacy-worktree,worktree-origin-main,Legacy matrix gateway reference,2026-02-16
/Users/apple/Desktop/MicroDAO/MicroDAO 3/NODA1-CURRENT-STATUS-2026-01-26.md,NODA1 Current Status 2026-01-26,legacy-desktop,desktop-legacy,Old status snapshot from legacy repo,2026-02-16


@@ -0,0 +1,13 @@
# Jupyter Sync Report
Generated: 2026-02-16 10:25:50 UTC
- jupyter_cmd: not-found
- server_count: 0
- api_ok: 0/0
- notebooks_dir: /Users/apple/notebooks
- notebooks_count: 7
## API Probes
- no active jupyter servers discovered


@@ -0,0 +1,8 @@
path,size_bytes,mtime_epoch
"/Users/apple/notebooks/04_latency_profile.ipynb",31210,1760335066
"/Users/apple/notebooks/03_window_packing.ipynb",8762,1760334699
"/Users/apple/notebooks/02_reranker_ablation.ipynb",26615,1760334615
"/Users/apple/notebooks/05_groundedness_eval.ipynb",39210,1760335328
"/Users/apple/notebooks/hybrid-search-demo.ipynb",27580,1760307946
"/Users/apple/notebooks/01_retrieval_sanity.ipynb",16137,1760308259
"/Users/apple/notebooks/ai-stack-demo.ipynb",14607,1760307623


@@ -0,0 +1,25 @@
# Pieces Sync Report
Generated: 2026-02-16 10:25:50 UTC
- pieces_extensions_count: 1
- pieces_data_dirs_count: 0
- pieces_process_count: 2
- api_probe_ok: 0/0
## Extensions
- /Users/apple/.cursor/extensions/meshintelligenttechnologiesinc.pieces-vscode-3.0.1-universal
## Data Dirs
- no pieces data directories found
## Processes
- 54969 /Applications/Pieces OS.app/Contents/MacOS/Pieces OS
- 55184 /Applications/Pieces.app/Contents/MacOS/Pieces
## API Probes
- no api probe ports supplied


@@ -0,0 +1,4 @@
kind,path_or_process
extension,"/Users/apple/.cursor/extensions/meshintelligenttechnologiesinc.pieces-vscode-3.0.1-universal"
process,"54969 /Applications/Pieces OS.app/Contents/MacOS/Pieces OS"
process,"55184 /Applications/Pieces.app/Contents/MacOS/Pieces"


@@ -0,0 +1,90 @@
# Docs Services Automation Runbook
Date: 2026-02-16
Scope: GitHub, Gitea, Jupyter, Pieces documentation integration workflow.
## Goal
Keep docs state synchronized with service reality and publish curated docs updates safely.
## 1) Refresh service status
```bash
cd /Users/apple/github-projects/microdao-daarion
bash scripts/docs/session_bootstrap.sh
```
Primary output:
- `docs/consolidation/INTEGRATIONS_STATUS_LATEST.md`
## 2) Run service adapters (Jupyter + Pieces)
```bash
cd /Users/apple/github-projects/microdao-daarion
bash scripts/docs/services_sync.sh --dry-run
```
Apply mode:
```bash
cd /Users/apple/github-projects/microdao-daarion
bash scripts/docs/services_sync.sh --apply
```
Optional local Pieces API probe:
```bash
cd /Users/apple/github-projects/microdao-daarion
bash scripts/docs/services_sync.sh --apply --probe-ports 39300,39301
```
Outputs:
- `docs/consolidation/jupyter/JUPYTER_SYNC_LATEST.md`
- `docs/consolidation/jupyter/notebooks_index_latest.csv`
- `docs/consolidation/pieces/PIECES_SYNC_LATEST.md`
- `docs/consolidation/pieces/pieces_index_latest.csv`
- `docs/backups/LATEST.txt` (latest backup archive reference)
## 3) Review pending docs updates
```bash
cd /Users/apple/github-projects/microdao-daarion
git status --short
bash scripts/docs/docs_sync.sh --dry-run
```
## 4) Apply sync to remotes
```bash
cd /Users/apple/github-projects/microdao-daarion
bash scripts/docs/docs_sync.sh --apply --targets github,gitea
```
## Safety gates
- `docs_sync.sh` is dry-run by default.
- `--apply` refuses to proceed if staged non-doc files are present.
- Only curated doc paths are auto-staged by the script.
- `services_sync.sh --apply` creates docs backup before writing integration artifacts.
## Standardization and lint
Local lint:
```bash
cd /Users/apple/github-projects/microdao-daarion
bash scripts/docs/docs_lint.sh
```
CI lint:
- `.github/workflows/docs-lint.yml`
- config: `.markdownlint.yml`, `.markdownlintignore`
Style and template:
- `docs/standards/DOCS_STYLE_GUIDE.md`
- `docs/standards/DOC_TEMPLATE.md`
## Jupyter and Pieces notes
- `jupyter_sync.sh` probes active Jupyter servers via `/api/status` when servers are discovered.
- `pieces_sync.sh` captures local Pieces runtime markers and supports optional local API probes (`--probe-ports`).


@@ -0,0 +1,61 @@
# Docs Style Guide
Date: 2026-02-16
Purpose: standardized writing and structure for canonical documentation in this repository.
## Scope
This guide applies to canonical docs in:
- `docs/`
- `docs/runbooks/`
- `config/*.md`
- root operational docs (`PROJECT-MASTER-INDEX.md`, `NODA1-SAFE-DEPLOY.md`, `NODA1-README.md`)
Legacy and snapshot docs are excluded from strict linting:
- `docs/consolidation/_node1_runtime_docs/`
- `.worktrees/`
- backup artifacts (`*.bak_*`)
## Required Structure
Every new canonical document should start with:
1. H1 title
2. `Date: YYYY-MM-DD`
3. `Purpose: ...`
Recommended sections:
1. Scope
2. Preconditions
3. Procedure / Steps
4. Verification
5. Rollback / Recovery
## Writing Rules
- Use sentence-case headings.
- Keep sections short and operational.
- Prefer explicit paths and commands.
- Mark runtime facts with concrete dates.
- Avoid mixing strategic legacy notes into canonical runbooks.
## Naming Rules
- Runbooks: `docs/runbooks/<TOPIC>_RUNBOOK.md`
- Decisions: `docs/runbooks/<TOPIC>_DECISION_YYYY-MM-DD.md`
- One-off status snapshots: include date in filename.
## Linting
Local check:
```bash
bash scripts/docs/docs_lint.sh
```
CI check:
- GitHub workflow: `.github/workflows/docs-lint.yml`
Lint scope source:
- `docs/standards/lint_scope.txt`
Note:
- Full historical docs migration is phased; lint is enforced on the managed canonical scope first.


@@ -0,0 +1,37 @@
# <Document Title>
Date: YYYY-MM-DD
Purpose: one-sentence objective of this document.
## Scope
- Systems, repos, services, and files covered.
## Preconditions
- Required access
- Required tools
- Required environment
## Procedure
1. Step one
2. Step two
3. Step three
## Verification
- Command(s):
```bash
# put verification commands here
```
- Expected result(s)
## Rollback
1. Recovery step one
2. Recovery step two
## References
- `/absolute/or/repo/path/to/relevant/file`


@@ -0,0 +1,7 @@
docs/SESSION_STARTER.md
docs/consolidation/README.md
docs/consolidation/SOURCES.md
docs/runbooks/DOCS_SERVICES_AUTOMATION_RUNBOOK.md
docs/standards/DOCS_STYLE_GUIDE.md
docs/standards/DOC_TEMPLATE.md
config/README.md

scripts/docs/docs_backup.sh Executable file

@@ -0,0 +1,95 @@
#!/usr/bin/env bash
set -euo pipefail
ROOT_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")/../.." && pwd)"
OUT_DIR="$ROOT_DIR/docs/backups"
RETENTION="${DOCS_BACKUP_RETENTION:-20}"
DRY_RUN=1
APPLY=0
usage() {
cat <<'USAGE'
Usage:
bash scripts/docs/docs_backup.sh [--dry-run] [--apply] [--retention N]
Behavior:
- default mode is --dry-run
- --apply creates docs backup archive and rotates old backups
USAGE
}
while [[ $# -gt 0 ]]; do
case "$1" in
--dry-run)
DRY_RUN=1
APPLY=0
shift
;;
--apply)
APPLY=1
DRY_RUN=0
shift
;;
--retention)
RETENTION="$2"
shift 2
;;
-h|--help)
usage
exit 0
;;
*)
echo "Unknown arg: $1" >&2
usage
exit 2
;;
esac
done
STAMP="$(date +%Y%m%d-%H%M%S)"
ARCHIVE="$OUT_DIR/docs_backup_${STAMP}.tar.gz"
LATEST_REF="$OUT_DIR/LATEST.txt"
backup_targets=(
PROJECT-MASTER-INDEX.md
NODA1-SAFE-DEPLOY.md
docs/SESSION_STARTER.md
docs/runbooks
docs/standards
docs/consolidation/README.md
docs/consolidation/SOURCES.md
docs/consolidation/docs_registry_curated.csv
docs/consolidation/INTEGRATIONS_STATUS_LATEST.md
docs/consolidation/jupyter/JUPYTER_SYNC_LATEST.md
docs/consolidation/jupyter/notebooks_index_latest.csv
docs/consolidation/pieces/PIECES_SYNC_LATEST.md
docs/consolidation/pieces/pieces_index_latest.csv
)
existing=()
for t in "${backup_targets[@]}"; do
[[ -e "$ROOT_DIR/$t" ]] && existing+=("$t")
done
if [[ "$DRY_RUN" -eq 1 ]]; then
echo "[dry-run] backup archive: $ARCHIVE"
echo "[dry-run] retention: $RETENTION"
echo "[dry-run] targets:"
printf ' - %s\n' "${existing[@]}"
exit 0
fi
mkdir -p "$OUT_DIR"
tar -czf "$ARCHIVE" -C "$ROOT_DIR" "${existing[@]}"
printf '%s\n' "$ARCHIVE" > "$LATEST_REF"
mapfile -t backups < <(ls -1t "$OUT_DIR"/docs_backup_*.tar.gz 2>/dev/null || true)
if [[ "${#backups[@]}" -gt "$RETENTION" ]]; then
for old in "${backups[@]:$RETENTION}"; do
rm -f "$old"
done
fi
echo "Wrote: $ARCHIVE"
echo "Updated: $LATEST_REF"

scripts/docs/docs_lint.sh Executable file

@@ -0,0 +1,31 @@
#!/usr/bin/env bash
set -euo pipefail
ROOT_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")/../.." && pwd)"
cd "$ROOT_DIR"
if ! command -v npx >/dev/null 2>&1; then
echo "npx not found; install Node.js to run markdown lint" >&2
exit 2
fi
echo "Running markdown lint on tracked markdown files"
scope_file="docs/standards/lint_scope.txt"
if [[ ! -f "$scope_file" ]]; then
echo "Missing lint scope file: $scope_file" >&2
exit 2
fi
mapfile -t scope_paths < <(sed '/^\s*$/d; /^\s*#/d' "$scope_file")
files=()
for p in "${scope_paths[@]}"; do
[[ -f "$p" ]] && files+=("$p")
done
if [[ "${#files[@]}" -eq 0 ]]; then
echo "No markdown files to lint"
exit 0
fi
npx -y markdownlint-cli2 --config .markdownlint.yml "${files[@]}"


@@ -1,26 +1,170 @@
#!/usr/bin/env bash #!/usr/bin/env bash
set -euo pipefail set -euo pipefail
REPO_DIR="${REPO_DIR:-/opt/microdao-daarion}" ROOT_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")/../.." && pwd)"
BRANCH="${SYNC_BRANCH:-main}" cd "$ROOT_DIR"
cd "$REPO_DIR" DRY_RUN=1
APPLY=0
TARGETS="github,gitea"
COMMIT_MSG="docs: sync consolidation and session starter"
if [ ! -d .git ]; then usage() {
echo "[docs-sync] Missing git repo in $REPO_DIR" >&2 cat <<'USAGE'
exit 1 Usage:
bash scripts/docs/docs_sync.sh [--dry-run] [--apply] [--targets github,gitea] [--message "..."]
Behavior:
- default mode is --dry-run (prints actions only)
- --apply performs git add/commit/push for selected targets
- targets values: github,gitea
Safety:
- refuses to apply with unstaged non-doc changes in index
- only stages docs-related paths by default
USAGE
}
while [[ $# -gt 0 ]]; do
case "$1" in
--dry-run)
DRY_RUN=1
APPLY=0
shift
;;
--apply)
APPLY=1
DRY_RUN=0
shift
;;
--targets)
TARGETS="$2"
shift 2
;;
--message)
COMMIT_MSG="$2"
shift 2
;;
-h|--help)
usage
exit 0
;;
*)
echo "Unknown arg: $1" >&2
usage
exit 2
;;
esac
done
if [[ "$APPLY" -eq 1 ]]; then
# Prevent accidental mix with staged non-doc changes.
staged_non_doc="$(git diff --cached --name-only | rg -v '^(docs/|scripts/docs/|PROJECT-MASTER-INDEX.md$|NODA1-SAFE-DEPLOY.md$|config/README.md$)' || true)"
if [[ -n "$staged_non_doc" ]]; then
echo "Refusing --apply: staged non-doc files detected:" >&2
echo "$staged_non_doc" >&2
exit 3
fi
fi fi
echo "[docs-sync] Fetching origin/$BRANCH" mapfile -t GH_REMOTES < <(git remote -v | awk '/github.com/ {print $1}' | sort -u)
git fetch origin "$BRANCH" mapfile -t GITEA_REMOTES < <(git remote -v | awk '/gitea/ {print $1}' | sort -u)
LOCAL=$(git rev-parse @) wants_target() {
REMOTE=$(git rev-parse "origin/$BRANCH") local needle="$1"
[[ ",$TARGETS," == *",$needle,"* ]]
}
if [ "$LOCAL" = "$REMOTE" ]; then print_remotes() {
echo "[docs-sync] Already up to date" echo "Detected remotes:"
echo " github: ${GH_REMOTES[*]:-<none>}"
echo " gitea : ${GITEA_REMOTES[*]:-<none>}"
}
run_or_echo() {
if [[ "$DRY_RUN" -eq 1 ]]; then
echo "[dry-run] $*"
else
eval "$@"
fi
}
print_remotes
echo "Targets: $TARGETS"
echo "Commit message: $COMMIT_MSG"
doc_paths=(
docs/SESSION_STARTER.md
docs/runbooks/DOCS_SERVICES_AUTOMATION_RUNBOOK.md
docs/consolidation/README.md
docs/consolidation/SOURCES.md
docs/consolidation/docs_registry_curated.csv
docs/consolidation/INTEGRATIONS_STATUS_LATEST.md
docs/consolidation/jupyter/JUPYTER_SYNC_LATEST.md
docs/consolidation/jupyter/notebooks_index_latest.csv
docs/consolidation/pieces/PIECES_SYNC_LATEST.md
docs/consolidation/pieces/pieces_index_latest.csv
docs/backups/LATEST.txt
docs/standards/DOCS_STYLE_GUIDE.md
docs/standards/DOC_TEMPLATE.md
docs/standards/lint_scope.txt
PROJECT-MASTER-INDEX.md
.markdownlint.yml
.markdownlintignore
.github/workflows/docs-lint.yml
scripts/docs/session_bootstrap.sh
scripts/docs/docs_backup.sh
scripts/docs/docs_lint.sh
scripts/docs/jupyter_sync.sh
scripts/docs/pieces_sync.sh
scripts/docs/services_sync.sh
scripts/docs/docs_sync.sh
)
existing_paths=()
for p in "${doc_paths[@]}"; do
[[ -e "$p" ]] && existing_paths+=("$p")
done
if [[ "${#existing_paths[@]}" -eq 0 ]]; then
echo "No docs paths found to sync."
exit 0 exit 0
fi fi
echo "[docs-sync] Updating from origin/$BRANCH" run_or_echo "git add ${existing_paths[*]}"
git pull --ff-only origin "$BRANCH"
if [[ "$DRY_RUN" -eq 1 ]]; then
echo "[dry-run] git diff --cached --name-status"
git diff --cached --name-status || true
else
if git diff --cached --quiet; then
echo "No staged changes after add; nothing to commit."
exit 0
fi
git commit -m "$COMMIT_MSG"
fi
if wants_target "github"; then
if [[ "${#GH_REMOTES[@]}" -eq 0 ]]; then
echo "Skip github: no matching remotes."
else
for r in "${GH_REMOTES[@]}"; do
current_branch="$(git rev-parse --abbrev-ref HEAD)"
run_or_echo "git push $r $current_branch"
done
fi
fi
if wants_target "gitea"; then
if [[ "${#GITEA_REMOTES[@]}" -eq 0 ]]; then
echo "Skip gitea: no matching remotes."
else
for r in "${GITEA_REMOTES[@]}"; do
current_branch="$(git rev-parse --abbrev-ref HEAD)"
run_or_echo "git push $r $current_branch"
done
fi
fi
echo "Done."
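The two load-bearing idioms in this script are the comma-wrapped membership test in `wants_target` and the `run_or_echo` dry-run guard. A minimal Python sketch of the same two patterns (function names are mine, not part of the repo):

```python
import subprocess

def wants_target(targets: str, needle: str) -> bool:
    # Same trick as the bash test [[ ",$TARGETS," == *",$needle,"* ]]:
    # wrapping both sides in commas means "github" does NOT match "github-mirror".
    return f",{needle}," in f",{targets},"

def run_or_echo(cmd: list[str], dry_run: bool) -> str:
    # Dry-run mode only prints the command; apply mode executes it.
    if dry_run:
        return "[dry-run] " + " ".join(cmd)
    subprocess.run(cmd, check=True)
    return "ran: " + " ".join(cmd)
```

The comma wrapping is what makes `--targets github,gitea` safe against partial-name matches.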

scripts/docs/jupyter_sync.sh Executable file

@@ -0,0 +1,153 @@
#!/usr/bin/env bash
set -euo pipefail
ROOT_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")/../.." && pwd)"
OUT_DIR="$ROOT_DIR/docs/consolidation/jupyter"
NOTEBOOKS_DIR="/Users/apple/notebooks"
DRY_RUN=1
APPLY=0
usage() {
cat <<'USAGE'
Usage:
bash scripts/docs/jupyter_sync.sh [--dry-run] [--apply] [--notebooks-dir PATH]
Behavior:
- default mode is --dry-run
- --apply writes markdown and csv outputs under docs/consolidation/jupyter
USAGE
}
while [[ $# -gt 0 ]]; do
case "$1" in
--dry-run)
DRY_RUN=1
APPLY=0
shift
;;
--apply)
APPLY=1
DRY_RUN=0
shift
;;
--notebooks-dir)
NOTEBOOKS_DIR="$2"
shift 2
;;
-h|--help)
usage
exit 0
;;
*)
echo "Unknown arg: $1" >&2
usage
exit 2
;;
esac
done
JUPYTER_CMD=""
if command -v jupyter >/dev/null 2>&1; then
JUPYTER_CMD="jupyter"
elif python3 -m jupyter --version >/dev/null 2>&1; then
JUPYTER_CMD="python3 -m jupyter"
fi
server_list=""
if [[ -n "$JUPYTER_CMD" ]]; then
server_list="$(eval "$JUPYTER_CMD server list" 2>/dev/null || true)"
fi
server_rows="$(echo "$server_list" | awk '/^http/ {print}' || true)"
server_count="$(echo "$server_rows" | sed '/^\s*$/d' | wc -l | tr -d ' ')"
ok_api=0
total_api=0
api_lines=""
if [[ "$server_count" -gt 0 ]]; then
while IFS= read -r line; do
[[ -z "$line" ]] && continue
url="$(echo "$line" | awk '{print $1}')"
token="$(echo "$url" | sed -n 's/.*token=\([^&]*\).*/\1/p')"
base="${url%%\?*}"
status_url="${base%/}/api/status"
if [[ -n "$token" ]]; then
probe_url="${status_url}?token=${token}"
else
probe_url="$status_url"
fi
code="$(curl -sS -o /tmp/jupyter_status.json -w '%{http_code}' "$probe_url" || true)"
total_api=$((total_api + 1))
if [[ "$code" == "200" ]]; then
ok_api=$((ok_api + 1))
fi
api_lines+="- ${probe_url} -> HTTP ${code}"$'\n'
done <<< "$server_rows"
fi
nb_count=0
if [[ -d "$NOTEBOOKS_DIR" ]]; then
nb_count="$(find "$NOTEBOOKS_DIR" -maxdepth 1 -type f -name '*.ipynb' | wc -l | tr -d ' ')"
fi
if [[ "$DRY_RUN" -eq 1 ]]; then
echo "[dry-run] jupyter_cmd: ${JUPYTER_CMD:-<not-found>}"
echo "[dry-run] server_count: $server_count"
echo "[dry-run] api_ok: $ok_api/$total_api"
echo "[dry-run] notebooks_dir: $NOTEBOOKS_DIR"
echo "[dry-run] notebooks_count: $nb_count"
if [[ -n "$api_lines" ]]; then
echo "[dry-run] api_probes:"
printf "%s" "$api_lines"
fi
exit 0
fi
mkdir -p "$OUT_DIR"
STAMP="$(date +%Y%m%d-%H%M%S)"
REPORT="$OUT_DIR/JUPYTER_SYNC_${STAMP}.md"
LATEST="$OUT_DIR/JUPYTER_SYNC_LATEST.md"
INDEX="$OUT_DIR/notebooks_index_${STAMP}.csv"
INDEX_LATEST="$OUT_DIR/notebooks_index_latest.csv"
{
echo "# Jupyter Sync Report"
echo
echo "Generated: $(date -u '+%Y-%m-%d %H:%M:%S UTC')"
echo
echo "- jupyter_cmd: ${JUPYTER_CMD:-not-found}"
echo "- server_count: $server_count"
echo "- api_ok: $ok_api/$total_api"
echo "- notebooks_dir: $NOTEBOOKS_DIR"
echo "- notebooks_count: $nb_count"
echo
echo "## API Probes"
echo
if [[ -n "$api_lines" ]]; then
printf "%s" "$api_lines"
else
echo "- no active jupyter servers discovered"
fi
} > "$REPORT"
if [[ -d "$NOTEBOOKS_DIR" ]]; then
{
echo "path,size_bytes,mtime_epoch"
find "$NOTEBOOKS_DIR" -maxdepth 1 -type f -name '*.ipynb' -print0 | \
while IFS= read -r -d '' file; do
size="$(stat -f '%z' "$file")"
mtime="$(stat -f '%m' "$file")"
echo "\"$file\",$size,$mtime"
done
} > "$INDEX"
else
echo "path,size_bytes,mtime_epoch" > "$INDEX"
fi
cp "$REPORT" "$LATEST"
cp "$INDEX" "$INDEX_LATEST"
echo "Wrote: $REPORT"
echo "Updated: $LATEST"
echo "Wrote: $INDEX"
echo "Updated: $INDEX_LATEST"
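The trickiest part of this script is the sed/awk pipeline that turns a `jupyter server list` row into a probeable `/api/status` URL, re-attaching the token when one is present. A hedged Python equivalent of that parsing step (the function name is mine):

```python
import re

def probe_url_from_server_line(line: str) -> str:
    # A server-list row looks like: "http://localhost:8888/?token=abc123 :: /Users/apple"
    # Take the URL, extract the token, and build the /api/status probe URL.
    url = line.split()[0]
    m = re.search(r"token=([^&]*)", url)
    token = m.group(1) if m else ""
    base = url.split("?", 1)[0].rstrip("/")
    status = f"{base}/api/status"
    return f"{status}?token={token}" if token else status
```

Servers started without token auth simply get the bare `/api/status` URL, matching the script's empty-token branch.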

scripts/docs/pieces_sync.sh Executable file

@@ -0,0 +1,178 @@
#!/usr/bin/env bash
set -euo pipefail
ROOT_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")/../.." && pwd)"
OUT_DIR="$ROOT_DIR/docs/consolidation/pieces"
DRY_RUN=1
APPLY=0
PROBE_PORTS=""
usage() {
cat <<'USAGE'
Usage:
bash scripts/docs/pieces_sync.sh [--dry-run] [--apply] [--probe-ports 39300,39301]
Behavior:
- default mode is --dry-run
- --apply writes markdown/csv outputs under docs/consolidation/pieces
- optional --probe-ports performs local API probe on localhost for /health and /status
USAGE
}
while [[ $# -gt 0 ]]; do
case "$1" in
--dry-run)
DRY_RUN=1
APPLY=0
shift
;;
--apply)
APPLY=1
DRY_RUN=0
shift
;;
--probe-ports)
PROBE_PORTS="$2"
shift 2
;;
-h|--help)
usage
exit 0
;;
*)
echo "Unknown arg: $1" >&2
usage
exit 2
;;
esac
done
EXT_DIR="$HOME/.cursor/extensions"
APP_SUPPORT="$HOME/Library/Application Support"
ext_hits=""
if [[ -d "$EXT_DIR" ]]; then
ext_hits="$(find "$EXT_DIR" -maxdepth 1 -type d -iname '*pieces*' | sort || true)"
fi
ext_count="$(echo "$ext_hits" | sed '/^\s*$/d' | wc -l | tr -d ' ')"
support_hits=""
if [[ -d "$APP_SUPPORT" ]]; then
support_hits="$(find "$APP_SUPPORT" -maxdepth 1 -type d -iname '*pieces*' | sort || true)"
fi
support_count="$(echo "$support_hits" | sed '/^\s*$/d' | wc -l | tr -d ' ')"
proc_hits="$(pgrep -fal -i 'pieces|copilot' | rg -v 'scripts/docs/pieces_sync.sh' || true)"
proc_count="$(echo "$proc_hits" | sed '/^\s*$/d' | wc -l | tr -d ' ')"
probe_lines=""
probe_ok=0
probe_total=0
if [[ -n "$PROBE_PORTS" ]]; then
IFS=',' read -r -a ports <<< "$PROBE_PORTS"
for port in "${ports[@]}"; do
port_trimmed="$(echo "$port" | xargs)"
[[ -z "$port_trimmed" ]] && continue
for endpoint in health status; do
url="http://127.0.0.1:${port_trimmed}/${endpoint}"
code="$(curl -sS -o /dev/null -w '%{http_code}' "$url" || true)"
probe_total=$((probe_total + 1))
if [[ "$code" == "200" ]]; then
probe_ok=$((probe_ok + 1))
fi
probe_lines+="- ${url} -> HTTP ${code}"$'\n'
done
done
fi
if [[ "$DRY_RUN" -eq 1 ]]; then
echo "[dry-run] pieces_extensions_count: $ext_count"
echo "[dry-run] pieces_data_dirs_count: $support_count"
echo "[dry-run] pieces_process_count: $proc_count"
echo "[dry-run] api_probe_ok: $probe_ok/$probe_total"
if [[ -n "$probe_lines" ]]; then
echo "[dry-run] api_probes:"
printf "%s" "$probe_lines"
fi
exit 0
fi
mkdir -p "$OUT_DIR"
STAMP="$(date +%Y%m%d-%H%M%S)"
REPORT="$OUT_DIR/PIECES_SYNC_${STAMP}.md"
LATEST="$OUT_DIR/PIECES_SYNC_LATEST.md"
INDEX="$OUT_DIR/pieces_index_${STAMP}.csv"
INDEX_LATEST="$OUT_DIR/pieces_index_latest.csv"
{
echo "# Pieces Sync Report"
echo
echo "Generated: $(date -u '+%Y-%m-%d %H:%M:%S UTC')"
echo
echo "- pieces_extensions_count: $ext_count"
echo "- pieces_data_dirs_count: $support_count"
echo "- pieces_process_count: $proc_count"
echo "- api_probe_ok: $probe_ok/$probe_total"
echo
echo "## Extensions"
echo
if [[ -n "$ext_hits" ]]; then
echo "$ext_hits" | sed 's/^/- /'
else
echo "- no pieces extensions found"
fi
echo
echo "## Data Dirs"
echo
if [[ -n "$support_hits" ]]; then
echo "$support_hits" | sed 's/^/- /'
else
echo "- no pieces data directories found"
fi
echo
echo "## Processes"
echo
if [[ -n "$proc_hits" ]]; then
echo "$proc_hits" | sed 's/^/- /'
else
echo "- no pieces related processes found"
fi
echo
echo "## API Probes"
echo
if [[ -n "$probe_lines" ]]; then
printf "%s" "$probe_lines"
else
echo "- no api probe ports supplied"
fi
} > "$REPORT"
{
echo "kind,path_or_process"
if [[ -n "$ext_hits" ]]; then
while IFS= read -r row; do
[[ -z "$row" ]] && continue
echo "extension,\"$row\""
done <<< "$ext_hits"
fi
if [[ -n "$support_hits" ]]; then
while IFS= read -r row; do
[[ -z "$row" ]] && continue
echo "data_dir,\"$row\""
done <<< "$support_hits"
fi
if [[ -n "$proc_hits" ]]; then
while IFS= read -r row; do
[[ -z "$row" ]] && continue
echo "process,\"$row\""
done <<< "$proc_hits"
fi
} > "$INDEX"
cp "$REPORT" "$LATEST"
cp "$INDEX" "$INDEX_LATEST"
echo "Wrote: $REPORT"
echo "Updated: $LATEST"
echo "Wrote: $INDEX"
echo "Updated: $INDEX_LATEST"
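The `--probe-ports` handling splits a comma list, trims each entry with `xargs`, and probes `/health` and `/status` on localhost for every port. A small Python sketch of the URL construction (name assumed, not from the repo):

```python
def build_probe_urls(probe_ports: str) -> list[str]:
    # Mirrors the IFS=',' split plus xargs-style trimming in the script:
    # each non-empty port gets a /health and a /status probe on 127.0.0.1.
    urls = []
    for raw in probe_ports.split(","):
        port = raw.strip()
        if not port:
            continue
        for endpoint in ("health", "status"):
            urls.append(f"http://127.0.0.1:{port}/{endpoint}")
    return urls
```

Trimming before the emptiness check is what lets inputs like `"39300, 39301"` (with a space) work unchanged.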

scripts/docs/services_sync.sh Executable file

@@ -0,0 +1,77 @@
#!/usr/bin/env bash
set -euo pipefail
ROOT_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")/../.." && pwd)"
cd "$ROOT_DIR"
DRY_RUN=1
APPLY=0
PROBE_PORTS=""
usage() {
cat <<'USAGE'
Usage:
bash scripts/docs/services_sync.sh [--dry-run] [--apply] [--probe-ports 39300,39301]
Workflow:
1) docs backup
2) integration bootstrap status
3) jupyter sync adapter
4) pieces sync adapter
USAGE
}
while [[ $# -gt 0 ]]; do
case "$1" in
--dry-run)
DRY_RUN=1
APPLY=0
shift
;;
--apply)
APPLY=1
DRY_RUN=0
shift
;;
--probe-ports)
PROBE_PORTS="$2"
shift 2
;;
-h|--help)
usage
exit 0
;;
*)
echo "Unknown arg: $1" >&2
usage
exit 2
;;
esac
done
if [[ "$DRY_RUN" -eq 1 ]]; then
echo "[dry-run] running: scripts/docs/docs_backup.sh --dry-run"
bash scripts/docs/docs_backup.sh --dry-run
echo "[dry-run] running: scripts/docs/session_bootstrap.sh"
bash scripts/docs/session_bootstrap.sh
echo "[dry-run] running: scripts/docs/jupyter_sync.sh --dry-run"
bash scripts/docs/jupyter_sync.sh --dry-run
echo "[dry-run] running: scripts/docs/pieces_sync.sh --dry-run"
if [[ -n "$PROBE_PORTS" ]]; then
bash scripts/docs/pieces_sync.sh --dry-run --probe-ports "$PROBE_PORTS"
else
bash scripts/docs/pieces_sync.sh --dry-run
fi
exit 0
fi
bash scripts/docs/docs_backup.sh --apply
bash scripts/docs/session_bootstrap.sh
bash scripts/docs/jupyter_sync.sh --apply
if [[ -n "$PROBE_PORTS" ]]; then
bash scripts/docs/pieces_sync.sh --apply --probe-ports "$PROBE_PORTS"
else
bash scripts/docs/pieces_sync.sh --apply
fi
echo "services sync completed"

scripts/docs/session_bootstrap.sh Executable file

@@ -0,0 +1,117 @@
#!/usr/bin/env bash
set -euo pipefail
ROOT_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")/../.." && pwd)"
OUT_DIR="$ROOT_DIR/docs/consolidation"
STAMP="$(date +%Y%m%d-%H%M%S)"
LATEST="$OUT_DIR/INTEGRATIONS_STATUS_LATEST.md"
REPORT="$OUT_DIR/INTEGRATIONS_STATUS_${STAMP}.md"
mkdir -p "$OUT_DIR"
section() {
printf "\n## %s\n\n" "$1"
}
write_status() {
local name="$1"; shift
local status="$1"; shift
local details="$*"
printf -- "- **%s**: %s" "$name" "$status"
if [[ -n "$details" ]]; then
printf " - %s" "$details"
fi
printf "\n"
}
{
printf "# Integrations Bootstrap Status\n\n"
printf "Generated: %s\n\n" "$(date -u '+%Y-%m-%d %H:%M:%S UTC')"
printf "Repo: %s\n\n" "$ROOT_DIR"
section "Gitea"
gitea_code="$(curl -sS -o /dev/null -w '%{http_code}' http://127.0.0.1:3000/ || true)"
if [[ "$gitea_code" == "200" || "$gitea_code" == "302" ]]; then
write_status "gitea_http" "OK" "http_code=$gitea_code (http://127.0.0.1:3000)"
else
write_status "gitea_http" "DEGRADED" "http_code=$gitea_code (http://127.0.0.1:3000)"
fi
gitea_remotes="$(git -C "$ROOT_DIR" remote -v | awk '/gitea/ {print $1" "$2}' | sort -u || true)"
if [[ -n "$gitea_remotes" ]]; then
write_status "gitea_git_remote" "OK" "$(echo "$gitea_remotes" | tr '\n' '; ' | sed 's/; $//')"
else
write_status "gitea_git_remote" "INFO" "no remote URL containing 'gitea' found"
fi
section "GitHub"
if command -v gh >/dev/null 2>&1; then
gh_status="$(gh auth status 2>&1 || true)"
if echo "$gh_status" | rg -q "Logged in to github.com"; then
account_line="$(echo "$gh_status" | awk -F'account ' '/Logged in to github.com account/{print $2}' | head -n1)"
write_status "gh_auth" "OK" "$account_line"
else
write_status "gh_auth" "DEGRADED" "gh installed but not authenticated"
fi
else
write_status "gh_auth" "DEGRADED" "gh CLI not installed"
fi
gh_remotes="$(git -C "$ROOT_DIR" remote -v | awk '/github.com/ {print $1" "$2}' | sort -u || true)"
if [[ -n "$gh_remotes" ]]; then
write_status "github_git_remote" "OK" "$(echo "$gh_remotes" | tr '\n' '; ' | sed 's/; $//')"
else
write_status "github_git_remote" "INFO" "no remote URL containing 'github.com' found"
fi
section "Jupyter"
if command -v jupyter >/dev/null 2>&1; then
jlist="$(jupyter server list 2>/dev/null || true)"
running="$(echo "$jlist" | sed '/^\s*$/d' | wc -l | tr -d ' ')"
write_status "jupyter_cli" "OK" "jupyter found in PATH"
write_status "jupyter_servers" "INFO" "listed_rows=$running"
elif python3 -m jupyter --version >/dev/null 2>&1; then
write_status "jupyter_cli" "OK" "python3 -m jupyter available"
jlist="$(python3 -m jupyter server list 2>/dev/null || true)"
running="$(echo "$jlist" | sed '/^\s*$/d' | wc -l | tr -d ' ')"
write_status "jupyter_servers" "INFO" "listed_rows=$running"
else
write_status "jupyter_cli" "DEGRADED" "jupyter not found in PATH"
fi
nb_dir="/Users/apple/notebooks"
if [[ -d "$nb_dir" ]]; then
nb_count="$(find "$nb_dir" -maxdepth 1 -type f -name '*.ipynb' | wc -l | tr -d ' ')"
write_status "notebooks_dir" "OK" "$nb_dir (ipynb_count=$nb_count)"
else
write_status "notebooks_dir" "DEGRADED" "$nb_dir missing"
fi
section "Pieces"
pext_dir="$HOME/.cursor/extensions"
if [[ -d "$pext_dir" ]]; then
pieces_hits="$(find "$pext_dir" -maxdepth 1 -type d -iname '*pieces*' | wc -l | tr -d ' ')"
if [[ "$pieces_hits" -gt 0 ]]; then
write_status "pieces_extension" "OK" "cursor extensions matched=$pieces_hits"
else
write_status "pieces_extension" "DEGRADED" "no Pieces extension in $pext_dir"
fi
else
write_status "pieces_extension" "DEGRADED" "$pext_dir missing"
fi
pieces_data_dir="$HOME/Library/Application Support/Pieces"
if [[ -d "$pieces_data_dir" ]]; then
write_status "pieces_data_dir" "OK" "$pieces_data_dir"
else
write_status "pieces_data_dir" "INFO" "$pieces_data_dir not found"
fi
section "Next"
printf -- "- Run docs sync dry-run: bash scripts/docs/docs_sync.sh --dry-run\n"
printf -- "- Apply sync to remotes: bash scripts/docs/docs_sync.sh --apply --targets github,gitea\n"
} > "$REPORT"
cp "$REPORT" "$LATEST"
echo "Wrote: $REPORT"
echo "Updated: $LATEST"
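Every check in this report funnels through `write_status`, which emits one markdown bullet per integration. A hypothetical Python rendering of the same printf format, useful for checking the exact line shape:

```python
def write_status(name: str, status: str, details: str = "") -> str:
    # Same markdown bullet the bash printf produces:
    # "- **name**: STATUS - details", with the details part optional.
    line = f"- **{name}**: {status}"
    if details:
        line += f" - {details}"
    return line
```

Keeping the format in one place is what makes the OK/DEGRADED/INFO rows grep-friendly across reports.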

scripts/node1/snapshot_node1.sh Executable file

@@ -0,0 +1,43 @@
#!/bin/bash
set -euo pipefail
TS=$(date -u +%Y-%m-%dT%H:%M:%SZ)
echo "=== NODE1 SNAPSHOT ==="
echo "timestamp_utc: $TS"
echo ""
echo "--- git (deployment checkout) ---"
if [ -d "/opt/microdao-daarion.repo/.git" ]; then
cd /opt/microdao-daarion.repo
echo "path: /opt/microdao-daarion.repo"
echo "head: $(git rev-parse --short HEAD 2>/dev/null || echo 'unknown')"
echo "branch: $(git branch --show-current 2>/dev/null || echo 'detached')"
echo "status:"
git status -sb 2>/dev/null || true
else
echo "WARN: /opt/microdao-daarion.repo not found"
fi
echo ""
echo "--- docker ps (top 30) ---"
docker ps --format "table {{.Names}} {{.Status}} {{.Image}}" | head -30
echo ""
echo "--- health endpoints ---"
for u in "http://127.0.0.1:9102/health" "http://127.0.0.1:9300/health" "http://127.0.0.1:8000/health" "http://127.0.0.1:6333/healthz" ; do
code=$(curl -sS -m 3 -o /dev/null -w "%{http_code}" "$u" || true)
echo "$u => $code"
done
echo ""
echo "--- docker inspect (images + compose project) ---"
for c in dagi-router-node1 dagi-gateway-node1 dagi-memory-service-node1; do
if docker inspect "$c" >/dev/null 2>&1; then
img=$(docker inspect -f '{{.Config.Image}}' "$c" 2>/dev/null || true)
prj=$(docker inspect -f '{{ index .Config.Labels "com.docker.compose.project" }}' "$c" 2>/dev/null || true)
echo "$c image=$img compose_project=$prj"
else
echo "$c not_found"
fi
done


@@ -4,6 +4,7 @@ from typing import Literal, Optional, Dict, Any, List
import asyncio
import json
import os
import re
import yaml
import httpx
import logging
@@ -28,6 +29,35 @@ except ImportError:
logging.basicConfig(level=logging.INFO)
logger = logging.getLogger(__name__)
def _strip_dsml_keep_text_before(text: str) -> str:
"""If response contains DSML, return only the part before the first DSML-like tag. Otherwise return empty (caller will use fallback)."""
if not text or len(text.strip()) < 10:
return ""
# Find first occurrence of DSML-like patterns (tag or keyword that starts markup)
dsml_start_patterns = [
r"<function_calls",
r"<invoke\s",
r"<parameter\s",
r"<think>",
# DSML variants (ASCII and Unicode separators, e.g. <｜DSML｜invoke ...>)
r"<\s*(?:\||｜)?\s*DSML",
r"DSML\s*(?:\||｜)",
r"DSML\s*>\s*",
]
earliest = len(text)
for pat in dsml_start_patterns:
m = re.search(pat, text, re.IGNORECASE | re.DOTALL)
if m:
earliest = min(earliest, m.start())
if earliest == 0:
return ""
prefix = text[:earliest].strip()
# Remove trailing incomplete tags
prefix = re.sub(r"<[^>]*$", "", prefix).strip()
return prefix if len(prefix) > 30 else ""
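To see the guard's intended behavior in isolation, here is a standalone sketch of the same logic (patterns trimmed to the ASCII cases, name kept close to the original):

```python
import re

DSML_START = [r"<function_calls", r"<invoke\s", r"<parameter\s", r"<think>"]

def strip_dsml_keep_text_before(text: str) -> str:
    # Keep only the clean prefix before the first DSML-like tag;
    # return "" when there is no usable prefix so the caller falls back.
    if not text or len(text.strip()) < 10:
        return ""
    earliest = len(text)
    for pat in DSML_START:
        m = re.search(pat, text, re.IGNORECASE | re.DOTALL)
        if m:
            earliest = min(earliest, m.start())
    if earliest == 0:
        return ""
    # Drop any trailing half-open tag, then require a meaningful prefix.
    prefix = re.sub(r"<[^>]*$", "", text[:earliest]).strip()
    return prefix if len(prefix) > 30 else ""
```

A response that starts with markup yields an empty string, which is what routes the caller into the synthesis/fallback path instead of showing raw DSML.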
app = FastAPI(title="DAARION Router", version="2.0.0")
# Configuration
@@ -841,8 +871,46 @@ async def agent_infer(agent_id: str, request: InferRequest):
response_text = final_data.get("choices", [{}])[0].get("message", {}).get("content", "")
# CRITICAL: Check for DSML in second response too!
if response_text and ("DSML" in response_text or "invoke name=" in response_text or "function_calls>" in response_text):
prefix_before_dsml = _strip_dsml_keep_text_before(response_text)
if prefix_before_dsml:
logger.warning(f"🧹 DSML in 2nd response: keeping text before DSML ({len(prefix_before_dsml)} chars), discarding {len(response_text) - len(prefix_before_dsml)} chars")
response_text = prefix_before_dsml
else:
logger.warning(f"🧹 DSML detected in 2nd LLM response, trying 3rd call ({len(response_text)} chars)")
# Third LLM call: explicitly ask to synthesize tool results
tool_summary_parts = []
for tr in tool_results:
if tr.get("success") and tr.get("result"):
res_text = str(tr["result"])[:500]
tool_summary_parts.append(f"Tool '{tr['name']}' returned: {res_text}")
if tool_summary_parts:
synthesis_prompt = "Based on the following tool results, provide a helpful response to the user in their language. Do NOT use any markup or XML. Just respond naturally.\n\n" + "\n".join(tool_summary_parts)
try:
synth_resp = await http_client.post(
f"{cloud['base_url']}/v1/chat/completions",
headers={"Authorization": f"Bearer {api_key}", "Content-Type": "application/json"},
json={"model": cloud["model"], "messages": [
{"role": "system", "content": system_prompt or "You are a helpful assistant. Respond naturally."},
{"role": "user", "content": synthesis_prompt}
], "max_tokens": max_tokens, "temperature": 0.3, "stream": False},
timeout=cloud["timeout"]
)
if synth_resp.status_code == 200:
synth_data = synth_resp.json()
synth_text = synth_data.get("choices", [{}])[0].get("message", {}).get("content", "")
if synth_text and "DSML" not in synth_text and "invoke" not in synth_text:
response_text = synth_text
tokens_used += synth_data.get("usage", {}).get("total_tokens", 0)
logger.info("\u2705 3rd LLM call synthesized clean response from tool results")
else:
response_text = format_tool_calls_for_response(tool_results, fallback_mode="dsml_detected")
else:
response_text = format_tool_calls_for_response(tool_results, fallback_mode="dsml_detected")
except Exception as synth_err:
logger.warning(f"3rd LLM call failed: {synth_err}")
response_text = format_tool_calls_for_response(tool_results, fallback_mode="dsml_detected")
else:
response_text = format_tool_calls_for_response(tool_results, fallback_mode="dsml_detected")
if not response_text:
@@ -858,8 +926,12 @@ async def agent_infer(agent_id: str, request: InferRequest):
if response_text:
# FINAL DSML check before returning - never show DSML to user
if "DSML" in response_text or "invoke name=" in response_text or "function_calls>" in response_text:
prefix_before_dsml = _strip_dsml_keep_text_before(response_text)
if prefix_before_dsml:
logger.warning(f"🧹 DSML in final response: keeping text before DSML ({len(prefix_before_dsml)} chars)")
response_text = prefix_before_dsml
else:
logger.warning(f"🧹 DSML in final response! Replacing with fallback ({len(response_text)} chars)")
# Use dsml_detected mode - LLM confused, just acknowledge presence
response_text = format_tool_calls_for_response(tool_results, fallback_mode="dsml_detected")
# Check if any tool generated an image
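The third-call recovery path above hinges on the synthesis prompt built from successful tool outputs. A hedged standalone sketch of that prompt construction (function name is mine; the truncation and wording follow the diff):

```python
def build_synthesis_prompt(tool_results: list[dict]) -> str:
    # Summarize successful tool outputs (each capped at 500 chars) and
    # ask the model for a plain, markup-free answer in the user's language.
    parts = []
    for tr in tool_results:
        if tr.get("success") and tr.get("result"):
            parts.append(f"Tool '{tr['name']}' returned: {str(tr['result'])[:500]}")
    if not parts:
        return ""
    return (
        "Based on the following tool results, provide a helpful response to "
        "the user in their language. Do NOT use any markup or XML. "
        "Just respond naturally.\n\n" + "\n".join(parts)
    )
```

An empty return (no usable tool results) is the signal to skip the third call and fall straight back to `format_tool_calls_for_response`.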


@@ -324,7 +324,8 @@ agents:
Help users with EcoMiner/BioMiner technologies, tokenomics, and DAO governance.
- Advise on hardware, staking, and infrastructure.
- Analyze PDFs/images when asked.
- In groups, stay silent if there is no explicit mention of you (Helion/Хеліон/Хелион or the @HelionBot tag) AND the topic is not about energy.
- If addressed directly, answer even operational questions (how to upload PDF/Word/Excel files, how to send a link, response limits, and so on).
- Use the Knowledge Graph to store and search facts about users and topics.
Identify other agents by their nickname (Bot suffix) and talk to them as colleagues.
tools:


@@ -671,6 +671,11 @@ def format_tool_calls_for_response(tool_results: List[Dict], fallback_mode: str
if tool_results:
for tr in tool_results:
if tr.get("success") and tr.get("result"):
# Avoid dumping raw retrieval/search payloads to the user.
# These often look like "memory dumps" and are perceived as incorrect answers.
tool_name = (tr.get("name") or "").strip()
if tool_name in {"memory_search", "web_search", "web_extract", "web_read"}:
continue
result = str(tr.get("result", ""))
if result and len(result) > 10 and "error" not in result.lower():
# We have a useful tool result - use it!
@@ -678,7 +683,7 @@ def format_tool_calls_for_response(tool_results: List[Dict], fallback_mode: str
return result[:600] + "..."
return result
# No useful tool results - give presence acknowledgment
return "Вибач, відповідь згенерувалась некоректно. Спробуй ще раз (коротше/конкретніше) або повтори питання одним реченням."  # "Sorry, the response came out wrong. Try again (shorter/more specific) or repeat the question in one sentence."
if not tool_results:
if fallback_mode == "empty_response":
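The new skip-set keeps raw retrieval payloads out of the fallback answer. A standalone sketch of that filtering rule (function name is mine; the tool names and the 600-char cap come from the diff):

```python
RAW_RETRIEVAL_TOOLS = {"memory_search", "web_search", "web_extract", "web_read"}

def first_presentable_result(tool_results: list[dict]) -> str:
    # Skip raw retrieval payloads (they read as memory dumps to the user)
    # and return the first useful non-retrieval tool result, capped at 600 chars.
    for tr in tool_results:
        if not (tr.get("success") and tr.get("result")):
            continue
        if (tr.get("name") or "").strip() in RAW_RETRIEVAL_TOOLS:
            continue
        result = str(tr["result"])
        if len(result) > 10 and "error" not in result.lower():
            return result[:600] + "..." if len(result) > 600 else result
    return ""
```

When every result is a retrieval dump, the empty return pushes the caller to the apology fallback instead of echoing search chunks.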