docs: sync consolidation and session starter
@@ -39,6 +39,13 @@ bash scripts/docs/docs_sync.sh --dry-run
 bash scripts/docs/docs_sync.sh --apply --targets github,gitea
 ```
 
+4. Service adapters (Jupyter + Pieces):
+
+```bash
+bash scripts/docs/services_sync.sh --dry-run
+# apply mode:
+bash scripts/docs/services_sync.sh --apply
+```
 
 ## Runtime-First Facts (must re-check each session)
 
 1. NODE1 branch/SHA:
@@ -1,63 +1,31 @@
 # Integrations Bootstrap Status
 
-Generated: 2026-02-16 10:15:20 UTC
+Generated: 2026-02-16 10:20:19 UTC
 
-Repo:
+Repo: /Users/apple/github-projects/microdao-daarion
 
 ## Gitea
 
-- **gitea_http**: OK — http_code=200 (http://127.0.0.1:3000)
-- **gitea_git_remote**: OK — gitea http://localhost:3000/daarion-admin/microdao-daarion.git;
+- **gitea_http**: OK - http_code=200 (http://127.0.0.1:3000)
+- **gitea_git_remote**: OK - gitea http://localhost:3000/daarion-admin/microdao-daarion.git;
 
 ## GitHub
 
-- **gh_auth**: OK — IvanTytar (keyring)
-- **github_git_remote**: OK — origin git@github.com:IvanTytar/microdao-daarion.git;
+- **gh_auth**: OK - IvanTytar (keyring)
+- **github_git_remote**: OK - origin git@github.com:IvanTytar/microdao-daarion.git;
 
 ## Jupyter
 
-- **jupyter_cli**: DEGRADED — jupyter not found in PATH
-- **notebooks_dir**: OK — /Users/apple/notebooks (ipynb_count=7)
+- **jupyter_cli**: DEGRADED - jupyter not found in PATH
+- **notebooks_dir**: OK - /Users/apple/notebooks (ipynb_count=7)
 
 ## Pieces
 
-- **pieces_extension**: OK — cursor extensions matched=1
-- **pieces_data_dir**: INFO — /Users/apple/Library/Application Support/Pieces not found
+- **pieces_extension**: OK - cursor extensions matched=1
+- **pieces_data_dir**: INFO - /Users/apple/Library/Application Support/Pieces not found
 
 ## Next
 
-- Run docs sync dry-run: Detected remotes:
-github: origin
-gitea : gitea
-Targets: github,gitea
-Commit message: docs: sync consolidation and session starter
-[dry-run] git add docs/SESSION_STARTER.md docs/consolidation/README.md docs/consolidation/SOURCES.md docs/consolidation/docs_registry_curated.csv PROJECT-MASTER-INDEX.md
-[dry-run] git diff --cached --name-status
-[dry-run] git push origin feat/md-pipeline-senpai-consumer
-[dry-run] git push gitea feat/md-pipeline-senpai-consumer
-Done.
-- Apply sync to remotes: Detected remotes:
-github: origin
-gitea : gitea
-Targets: github,gitea
-Commit message: docs: sync consolidation and session starter
-[feat/md-pipeline-senpai-consumer de3bd8c] docs: sync consolidation and session starter
-Committer: Apple <apple@MacBook-Pro.local>
-Your name and email address were configured automatically based
-on your username and hostname. Please check that they are accurate.
-You can suppress this message by setting them explicitly:
-
-    git config --global user.name "Your Name"
-    git config --global user.email you@example.com
-
-After doing this, you may fix the identity used for this commit with:
-
-    git commit --amend --reset-author
-
-5 files changed, 225 insertions(+), 3 deletions(-)
-create mode 100644 docs/SESSION_STARTER.md
-create mode 100644 docs/consolidation/README.md
-create mode 100644 docs/consolidation/SOURCES.md
-create mode 100644 docs/consolidation/docs_registry_curated.csv
-Done.
+- Run docs sync dry-run: bash scripts/docs/docs_sync.sh --dry-run
+- Apply sync to remotes: bash scripts/docs/docs_sync.sh --apply --targets github,gitea
@@ -24,3 +24,6 @@ This file tracks where documentation is collected from for consolidation.
 ## Automation Scripts
 - `scripts/docs/session_bootstrap.sh` — refreshes integration status and writes `INTEGRATIONS_STATUS_LATEST.md`.
 - `scripts/docs/docs_sync.sh` — safe docs sync automation with `--dry-run` by default and explicit `--apply`.
+- `scripts/docs/jupyter_sync.sh` — adapter for Jupyter server API status + notebooks index export.
+- `scripts/docs/pieces_sync.sh` — adapter for Pieces local runtime/data index + optional API probe.
+- `scripts/docs/services_sync.sh` — orchestrator for bootstrap + Jupyter + Pieces sync adapters.
docs/consolidation/jupyter/JUPYTER_SYNC_LATEST.md (new file, 13 lines)
@@ -0,0 +1,13 @@
# Jupyter Sync Report

Generated: 2026-02-16 10:20:20 UTC

- jupyter_cmd: not-found
- server_count: 0
- api_ok: 0/0
- notebooks_dir: /Users/apple/notebooks
- notebooks_count: 7

## API Probes

- no active jupyter servers discovered
docs/consolidation/jupyter/notebooks_index_latest.csv (new file, 8 lines)
@@ -0,0 +1,8 @@
path,size_bytes,mtime_epoch
"/Users/apple/notebooks/04_latency_profile.ipynb",31210,1760335066
"/Users/apple/notebooks/03_window_packing.ipynb",8762,1760334699
"/Users/apple/notebooks/02_reranker_ablation.ipynb",26615,1760334615
"/Users/apple/notebooks/05_groundedness_eval.ipynb",39210,1760335328
"/Users/apple/notebooks/hybrid-search-demo.ipynb",27580,1760307946
"/Users/apple/notebooks/01_retrieval_sanity.ipynb",16137,1760308259
"/Users/apple/notebooks/ai-stack-demo.ipynb",14607,1760307623
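For quick spot checks of an exported index, standard text tools are enough. A minimal sketch, assuming the CSV keeps the header-plus-rows shape shown above (the temp file and rows here are stand-ins, not the real index):

```shell
# Build a tiny stand-in for notebooks_index_latest.csv, then count data rows.
idx="$(mktemp)"
printf '%s\n' \
  'path,size_bytes,mtime_epoch' \
  '"/tmp/example_a.ipynb",100,1760000000' \
  '"/tmp/example_b.ipynb",200,1760000001' > "$idx"

# Skip the header line and count the remaining rows.
row_count="$(tail -n +2 "$idx" | wc -l | tr -d ' ')"
echo "rows: $row_count"
```

Against the real export, substitute `docs/consolidation/jupyter/notebooks_index_latest.csv` for the temp path.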
docs/consolidation/pieces/PIECES_SYNC_LATEST.md (new file, 25 lines)
@@ -0,0 +1,25 @@
# Pieces Sync Report

Generated: 2026-02-16 10:20:43 UTC

- pieces_extensions_count: 1
- pieces_data_dirs_count: 0
- pieces_process_count: 2
- api_probe_ok: 0/0

## Extensions

- /Users/apple/.cursor/extensions/meshintelligenttechnologiesinc.pieces-vscode-3.0.1-universal

## Data Dirs

- no pieces data directories found

## Processes

- 54969 /Applications/Pieces OS.app/Contents/MacOS/Pieces OS
- 55184 /Applications/Pieces.app/Contents/MacOS/Pieces

## API Probes

- no api probe ports supplied
docs/consolidation/pieces/pieces_index_latest.csv (new file, 4 lines)
@@ -0,0 +1,4 @@
kind,path_or_process
extension,"/Users/apple/.cursor/extensions/meshintelligenttechnologiesinc.pieces-vscode-3.0.1-universal"
process,"54969 /Applications/Pieces OS.app/Contents/MacOS/Pieces OS"
process,"55184 /Applications/Pieces.app/Contents/MacOS/Pieces"
docs/runbooks/DOCS_SERVICES_AUTOMATION_RUNBOOK.md (new file, 71 lines)
@@ -0,0 +1,71 @@
# Docs Services Automation Runbook

Date: 2026-02-16
Scope: GitHub, Gitea, Jupyter, Pieces documentation integration workflow.

## Goal

Keep docs state synchronized with service reality and publish curated docs updates safely.

## 1) Refresh service status

```bash
cd /Users/apple/github-projects/microdao-daarion
bash scripts/docs/session_bootstrap.sh
```

Primary output:
- `docs/consolidation/INTEGRATIONS_STATUS_LATEST.md`

## 2) Run service adapters (Jupyter + Pieces)

```bash
cd /Users/apple/github-projects/microdao-daarion
bash scripts/docs/services_sync.sh --dry-run
```

Apply mode:

```bash
cd /Users/apple/github-projects/microdao-daarion
bash scripts/docs/services_sync.sh --apply
```

Optional local Pieces API probe:

```bash
cd /Users/apple/github-projects/microdao-daarion
bash scripts/docs/services_sync.sh --apply --probe-ports 39300,39301
```

Outputs:
- `docs/consolidation/jupyter/JUPYTER_SYNC_LATEST.md`
- `docs/consolidation/jupyter/notebooks_index_latest.csv`
- `docs/consolidation/pieces/PIECES_SYNC_LATEST.md`
- `docs/consolidation/pieces/pieces_index_latest.csv`

## 3) Review pending docs updates

```bash
cd /Users/apple/github-projects/microdao-daarion
git status --short
bash scripts/docs/docs_sync.sh --dry-run
```

## 4) Apply sync to remotes

```bash
cd /Users/apple/github-projects/microdao-daarion
bash scripts/docs/docs_sync.sh --apply --targets github,gitea
```

## Safety gates

- `docs_sync.sh` is dry-run by default.
- `--apply` refuses to proceed if staged non-doc files are present.
- Only curated doc paths are auto-staged by the script.

## Jupyter and Pieces notes

- `jupyter_sync.sh` probes active Jupyter servers via `/api/status` when servers are discovered.
- `pieces_sync.sh` captures local Pieces runtime markers and supports optional local API probes (`--probe-ports`).
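The staged-file gate described in the safety notes boils down to a negative filter over `git diff --cached --name-only`. A standalone sketch of that filter, with invented sample paths and `grep -Ev` standing in for the script's `rg -v`:

```shell
# Simulated staged paths: two curated doc paths plus one stray source file.
staged='docs/SESSION_STARTER.md
scripts/docs/docs_sync.sh
src/main.c'

# Anything outside the allowed prefixes blocks --apply.
staged_non_doc="$(printf '%s\n' "$staged" | grep -Ev '^(docs/|scripts/docs/)' || true)"
if [ -n "$staged_non_doc" ]; then
  echo "refuse --apply: $staged_non_doc"
fi
```

The `|| true` keeps `set -euo pipefail` scripts alive when the filter matches nothing, which is the happy path here.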
scripts/docs/docs_sync.sh (new executable file, 161 lines)
@@ -0,0 +1,161 @@
```bash
#!/usr/bin/env bash
set -euo pipefail

ROOT_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")/../.." && pwd)"
cd "$ROOT_DIR"

DRY_RUN=1
APPLY=0
TARGETS="github,gitea"
COMMIT_MSG="docs: sync consolidation and session starter"

usage() {
  cat <<'USAGE'
Usage:
  bash scripts/docs/docs_sync.sh [--dry-run] [--apply] [--targets github,gitea] [--message "..."]

Behavior:
  - default mode is --dry-run (prints actions only)
  - --apply performs git add/commit/push for selected targets
  - targets values: github,gitea

Safety:
  - refuses to apply with staged non-doc changes in the index
  - only stages docs-related paths by default
USAGE
}

while [[ $# -gt 0 ]]; do
  case "$1" in
    --dry-run)
      DRY_RUN=1
      APPLY=0
      shift
      ;;
    --apply)
      APPLY=1
      DRY_RUN=0
      shift
      ;;
    --targets)
      TARGETS="$2"
      shift 2
      ;;
    --message)
      COMMIT_MSG="$2"
      shift 2
      ;;
    -h|--help)
      usage
      exit 0
      ;;
    *)
      echo "Unknown arg: $1" >&2
      usage
      exit 2
      ;;
  esac
done

if [[ "$APPLY" -eq 1 ]]; then
  # Prevent accidental mix with staged non-doc changes.
  staged_non_doc="$(git diff --cached --name-only | rg -v '^(docs/|scripts/docs/|PROJECT-MASTER-INDEX.md$|NODA1-SAFE-DEPLOY.md$|config/README.md$)' || true)"
  if [[ -n "$staged_non_doc" ]]; then
    echo "Refusing --apply: staged non-doc files detected:" >&2
    echo "$staged_non_doc" >&2
    exit 3
  fi
fi

mapfile -t GH_REMOTES < <(git remote -v | awk '/github.com/ {print $1}' | sort -u)
mapfile -t GITEA_REMOTES < <(git remote -v | awk '/gitea/ {print $1}' | sort -u)

wants_target() {
  local needle="$1"
  [[ ",$TARGETS," == *",$needle,"* ]]
}

print_remotes() {
  echo "Detected remotes:"
  echo "  github: ${GH_REMOTES[*]:-<none>}"
  echo "  gitea : ${GITEA_REMOTES[*]:-<none>}"
}

run_or_echo() {
  if [[ "$DRY_RUN" -eq 1 ]]; then
    echo "[dry-run] $*"
  else
    eval "$@"
  fi
}

print_remotes

echo "Targets: $TARGETS"
echo "Commit message: $COMMIT_MSG"

doc_paths=(
  docs/SESSION_STARTER.md
  docs/runbooks/DOCS_SERVICES_AUTOMATION_RUNBOOK.md
  docs/consolidation/README.md
  docs/consolidation/SOURCES.md
  docs/consolidation/docs_registry_curated.csv
  docs/consolidation/INTEGRATIONS_STATUS_LATEST.md
  docs/consolidation/jupyter/JUPYTER_SYNC_LATEST.md
  docs/consolidation/jupyter/notebooks_index_latest.csv
  docs/consolidation/pieces/PIECES_SYNC_LATEST.md
  docs/consolidation/pieces/pieces_index_latest.csv
  PROJECT-MASTER-INDEX.md
  scripts/docs/session_bootstrap.sh
  scripts/docs/jupyter_sync.sh
  scripts/docs/pieces_sync.sh
  scripts/docs/services_sync.sh
  scripts/docs/docs_sync.sh
)

existing_paths=()
for p in "${doc_paths[@]}"; do
  [[ -e "$p" ]] && existing_paths+=("$p")
done

if [[ "${#existing_paths[@]}" -eq 0 ]]; then
  echo "No docs paths found to sync."
  exit 0
fi

run_or_echo "git add ${existing_paths[*]}"

if [[ "$DRY_RUN" -eq 1 ]]; then
  echo "[dry-run] git diff --cached --name-status"
  git diff --cached --name-status || true
else
  if git diff --cached --quiet; then
    echo "No staged changes after add; nothing to commit."
    exit 0
  fi
  git commit -m "$COMMIT_MSG"
fi

if wants_target "github"; then
  if [[ "${#GH_REMOTES[@]}" -eq 0 ]]; then
    echo "Skip github: no matching remotes."
  else
    for r in "${GH_REMOTES[@]}"; do
      current_branch="$(git rev-parse --abbrev-ref HEAD)"
      run_or_echo "git push $r $current_branch"
    done
  fi
fi

if wants_target "gitea"; then
  if [[ "${#GITEA_REMOTES[@]}" -eq 0 ]]; then
    echo "Skip gitea: no matching remotes."
  else
    for r in "${GITEA_REMOTES[@]}"; do
      current_branch="$(git rev-parse --abbrev-ref HEAD)"
      run_or_echo "git push $r $current_branch"
    done
  fi
fi

echo "Done."
```
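The `--targets` handling above keys off a comma-wrapped substring match rather than a loop. The check can be exercised on its own (target values here are illustrative):

```shell
# Copy of the wants_target matcher from docs_sync.sh.
TARGETS="github,gitea"
wants_target() {
  local needle="$1"
  [[ ",$TARGETS," == *",$needle,"* ]]
}

wants_target github && echo "github selected"
wants_target gitlab || echo "gitlab not selected"
```

Wrapping both sides in commas before matching keeps a short needle like `git` from matching inside `github` or `gitea`.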
scripts/docs/jupyter_sync.sh (new executable file, 153 lines)
@@ -0,0 +1,153 @@
```bash
#!/usr/bin/env bash
set -euo pipefail

ROOT_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")/../.." && pwd)"
OUT_DIR="$ROOT_DIR/docs/consolidation/jupyter"
NOTEBOOKS_DIR="/Users/apple/notebooks"
DRY_RUN=1
APPLY=0

usage() {
  cat <<'USAGE'
Usage:
  bash scripts/docs/jupyter_sync.sh [--dry-run] [--apply] [--notebooks-dir PATH]

Behavior:
  - default mode is --dry-run
  - --apply writes markdown and csv outputs under docs/consolidation/jupyter
USAGE
}

while [[ $# -gt 0 ]]; do
  case "$1" in
    --dry-run)
      DRY_RUN=1
      APPLY=0
      shift
      ;;
    --apply)
      APPLY=1
      DRY_RUN=0
      shift
      ;;
    --notebooks-dir)
      NOTEBOOKS_DIR="$2"
      shift 2
      ;;
    -h|--help)
      usage
      exit 0
      ;;
    *)
      echo "Unknown arg: $1" >&2
      usage
      exit 2
      ;;
  esac
done

JUPYTER_CMD=""
if command -v jupyter >/dev/null 2>&1; then
  JUPYTER_CMD="jupyter"
elif python3 -m jupyter --version >/dev/null 2>&1; then
  JUPYTER_CMD="python3 -m jupyter"
fi

server_list=""
if [[ -n "$JUPYTER_CMD" ]]; then
  server_list="$(eval "$JUPYTER_CMD server list" 2>/dev/null || true)"
fi

server_rows="$(echo "$server_list" | awk '/^http/ {print}' || true)"
server_count="$(echo "$server_rows" | sed '/^\s*$/d' | wc -l | tr -d ' ')"

ok_api=0
total_api=0
api_lines=""
if [[ "$server_count" -gt 0 ]]; then
  while IFS= read -r line; do
    [[ -z "$line" ]] && continue
    url="$(echo "$line" | awk '{print $1}')"
    token="$(echo "$url" | sed -n 's/.*token=\([^&]*\).*/\1/p')"
    base="${url%%\?*}"
    status_url="${base%/}/api/status"
    if [[ -n "$token" ]]; then
      probe_url="${status_url}?token=${token}"
    else
      probe_url="$status_url"
    fi
    code="$(curl -sS -o /tmp/jupyter_status.json -w '%{http_code}' "$probe_url" || true)"
    total_api=$((total_api + 1))
    if [[ "$code" == "200" ]]; then
      ok_api=$((ok_api + 1))
    fi
    api_lines+="- ${probe_url} -> HTTP ${code}"$'\n'
  done <<< "$server_rows"
fi

nb_count=0
if [[ -d "$NOTEBOOKS_DIR" ]]; then
  nb_count="$(find "$NOTEBOOKS_DIR" -maxdepth 1 -type f -name '*.ipynb' | wc -l | tr -d ' ')"
fi

if [[ "$DRY_RUN" -eq 1 ]]; then
  echo "[dry-run] jupyter_cmd: ${JUPYTER_CMD:-<not-found>}"
  echo "[dry-run] server_count: $server_count"
  echo "[dry-run] api_ok: $ok_api/$total_api"
  echo "[dry-run] notebooks_dir: $NOTEBOOKS_DIR"
  echo "[dry-run] notebooks_count: $nb_count"
  if [[ -n "$api_lines" ]]; then
    echo "[dry-run] api_probes:"
    printf "%s" "$api_lines"
  fi
  exit 0
fi

mkdir -p "$OUT_DIR"
STAMP="$(date +%Y%m%d-%H%M%S)"
REPORT="$OUT_DIR/JUPYTER_SYNC_${STAMP}.md"
LATEST="$OUT_DIR/JUPYTER_SYNC_LATEST.md"
INDEX="$OUT_DIR/notebooks_index_${STAMP}.csv"
INDEX_LATEST="$OUT_DIR/notebooks_index_latest.csv"

{
  echo "# Jupyter Sync Report"
  echo
  echo "Generated: $(date -u '+%Y-%m-%d %H:%M:%S UTC')"
  echo
  echo "- jupyter_cmd: ${JUPYTER_CMD:-not-found}"
  echo "- server_count: $server_count"
  echo "- api_ok: $ok_api/$total_api"
  echo "- notebooks_dir: $NOTEBOOKS_DIR"
  echo "- notebooks_count: $nb_count"
  echo
  echo "## API Probes"
  echo
  if [[ -n "$api_lines" ]]; then
    printf "%s" "$api_lines"
  else
    echo "- no active jupyter servers discovered"
  fi
} > "$REPORT"

if [[ -d "$NOTEBOOKS_DIR" ]]; then
  {
    echo "path,size_bytes,mtime_epoch"
    find "$NOTEBOOKS_DIR" -maxdepth 1 -type f -name '*.ipynb' -print0 | \
      while IFS= read -r -d '' file; do
        # BSD stat flags (macOS); GNU stat would use -c '%s' / -c '%Y'.
        size="$(stat -f '%z' "$file")"
        mtime="$(stat -f '%m' "$file")"
        echo "\"$file\",$size,$mtime"
      done
  } > "$INDEX"
else
  echo "path,size_bytes,mtime_epoch" > "$INDEX"
fi

cp "$REPORT" "$LATEST"
cp "$INDEX" "$INDEX_LATEST"

echo "Wrote: $REPORT"
echo "Updated: $LATEST"
echo "Wrote: $INDEX"
echo "Updated: $INDEX_LATEST"
```
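The probe-URL construction above is pure string surgery on a `jupyter server list` row. It can be checked against a sample row (the URL and token here are fabricated):

```shell
# Sample row in the `URL :: notebook_dir` shape printed by `jupyter server list`.
line='http://localhost:8888/?token=abc123 :: /Users/apple/notebooks'

url="$(echo "$line" | awk '{print $1}')"                     # first field is the URL
token="$(echo "$url" | sed -n 's/.*token=\([^&]*\).*/\1/p')" # extract the token value
base="${url%%\?*}"                                           # strip the query string
echo "${base%/}/api/status?token=${token}"                   # resulting probe URL
```

`${url%%\?*}` drops everything from the first `?` on, and `${base%/}` trims one trailing slash so `/api/status` is not doubled.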
scripts/docs/pieces_sync.sh (new executable file, 178 lines)
@@ -0,0 +1,178 @@
```bash
#!/usr/bin/env bash
set -euo pipefail

ROOT_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")/../.." && pwd)"
OUT_DIR="$ROOT_DIR/docs/consolidation/pieces"
DRY_RUN=1
APPLY=0
PROBE_PORTS=""

usage() {
  cat <<'USAGE'
Usage:
  bash scripts/docs/pieces_sync.sh [--dry-run] [--apply] [--probe-ports 39300,39301]

Behavior:
  - default mode is --dry-run
  - --apply writes markdown/csv outputs under docs/consolidation/pieces
  - optional --probe-ports performs local API probe on localhost for /health and /status
USAGE
}

while [[ $# -gt 0 ]]; do
  case "$1" in
    --dry-run)
      DRY_RUN=1
      APPLY=0
      shift
      ;;
    --apply)
      APPLY=1
      DRY_RUN=0
      shift
      ;;
    --probe-ports)
      PROBE_PORTS="$2"
      shift 2
      ;;
    -h|--help)
      usage
      exit 0
      ;;
    *)
      echo "Unknown arg: $1" >&2
      usage
      exit 2
      ;;
  esac
done

EXT_DIR="$HOME/.cursor/extensions"
APP_SUPPORT="$HOME/Library/Application Support"

ext_hits=""
if [[ -d "$EXT_DIR" ]]; then
  ext_hits="$(find "$EXT_DIR" -maxdepth 1 -type d -iname '*pieces*' | sort || true)"
fi
ext_count="$(echo "$ext_hits" | sed '/^\s*$/d' | wc -l | tr -d ' ')"

support_hits=""
if [[ -d "$APP_SUPPORT" ]]; then
  support_hits="$(find "$APP_SUPPORT" -maxdepth 1 -type d -iname '*pieces*' | sort || true)"
fi
support_count="$(echo "$support_hits" | sed '/^\s*$/d' | wc -l | tr -d ' ')"

proc_hits="$(pgrep -fal -i 'pieces|copilot' | rg -v 'scripts/docs/pieces_sync.sh' || true)"
proc_count="$(echo "$proc_hits" | sed '/^\s*$/d' | wc -l | tr -d ' ')"

probe_lines=""
probe_ok=0
probe_total=0
if [[ -n "$PROBE_PORTS" ]]; then
  IFS=',' read -r -a ports <<< "$PROBE_PORTS"
  for port in "${ports[@]}"; do
    port_trimmed="$(echo "$port" | xargs)"
    [[ -z "$port_trimmed" ]] && continue
    for endpoint in health status; do
      url="http://127.0.0.1:${port_trimmed}/${endpoint}"
      code="$(curl -sS -o /dev/null -w '%{http_code}' "$url" || true)"
      probe_total=$((probe_total + 1))
      if [[ "$code" == "200" ]]; then
        probe_ok=$((probe_ok + 1))
      fi
      probe_lines+="- ${url} -> HTTP ${code}"$'\n'
    done
  done
fi

if [[ "$DRY_RUN" -eq 1 ]]; then
  echo "[dry-run] pieces_extensions_count: $ext_count"
  echo "[dry-run] pieces_data_dirs_count: $support_count"
  echo "[dry-run] pieces_process_count: $proc_count"
  echo "[dry-run] api_probe_ok: $probe_ok/$probe_total"
  if [[ -n "$probe_lines" ]]; then
    echo "[dry-run] api_probes:"
    printf "%s" "$probe_lines"
  fi
  exit 0
fi

mkdir -p "$OUT_DIR"
STAMP="$(date +%Y%m%d-%H%M%S)"
REPORT="$OUT_DIR/PIECES_SYNC_${STAMP}.md"
LATEST="$OUT_DIR/PIECES_SYNC_LATEST.md"
INDEX="$OUT_DIR/pieces_index_${STAMP}.csv"
INDEX_LATEST="$OUT_DIR/pieces_index_latest.csv"

{
  echo "# Pieces Sync Report"
  echo
  echo "Generated: $(date -u '+%Y-%m-%d %H:%M:%S UTC')"
  echo
  echo "- pieces_extensions_count: $ext_count"
  echo "- pieces_data_dirs_count: $support_count"
  echo "- pieces_process_count: $proc_count"
  echo "- api_probe_ok: $probe_ok/$probe_total"
  echo
  echo "## Extensions"
  echo
  if [[ -n "$ext_hits" ]]; then
    echo "$ext_hits" | sed 's/^/- /'
  else
    echo "- no pieces extensions found"
  fi
  echo
  echo "## Data Dirs"
  echo
  if [[ -n "$support_hits" ]]; then
    echo "$support_hits" | sed 's/^/- /'
  else
    echo "- no pieces data directories found"
  fi
  echo
  echo "## Processes"
  echo
  if [[ -n "$proc_hits" ]]; then
    echo "$proc_hits" | sed 's/^/- /'
  else
    echo "- no pieces related processes found"
  fi
  echo
  echo "## API Probes"
  echo
  if [[ -n "$probe_lines" ]]; then
    printf "%s" "$probe_lines"
  else
    echo "- no api probe ports supplied"
  fi
} > "$REPORT"

{
  echo "kind,path_or_process"
  if [[ -n "$ext_hits" ]]; then
    while IFS= read -r row; do
      [[ -z "$row" ]] && continue
      echo "extension,\"$row\""
    done <<< "$ext_hits"
  fi
  if [[ -n "$support_hits" ]]; then
    while IFS= read -r row; do
      [[ -z "$row" ]] && continue
      echo "data_dir,\"$row\""
    done <<< "$support_hits"
  fi
  if [[ -n "$proc_hits" ]]; then
    while IFS= read -r row; do
      [[ -z "$row" ]] && continue
      echo "process,\"$row\""
    done <<< "$proc_hits"
  fi
} > "$INDEX"

cp "$REPORT" "$LATEST"
cp "$INDEX" "$INDEX_LATEST"

echo "Wrote: $REPORT"
echo "Updated: $LATEST"
echo "Wrote: $INDEX"
echo "Updated: $INDEX_LATEST"
```
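The `--probe-ports` value above is split on commas and whitespace-trimmed before any probing happens. The same parsing, exercised without network calls (port numbers illustrative):

```shell
# Comma-separated port list, with stray spaces as a user might type them.
PROBE_PORTS=" 39300, 39301 "

IFS=',' read -r -a ports <<< "$PROBE_PORTS"
for port in "${ports[@]}"; do
  port_trimmed="$(echo "$port" | xargs)"   # xargs trims surrounding whitespace
  [[ -z "$port_trimmed" ]] && continue     # skip empty entries like a trailing comma
  echo "would probe: http://127.0.0.1:${port_trimmed}/health"
done
```

Setting `IFS` only for the `read` keeps the comma split from leaking into the rest of the script.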
scripts/docs/services_sync.sh (new executable file, 73 lines)
@@ -0,0 +1,73 @@
```bash
#!/usr/bin/env bash
set -euo pipefail

ROOT_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")/../.." && pwd)"
cd "$ROOT_DIR"

DRY_RUN=1
APPLY=0
PROBE_PORTS=""

usage() {
  cat <<'USAGE'
Usage:
  bash scripts/docs/services_sync.sh [--dry-run] [--apply] [--probe-ports 39300,39301]

Workflow:
  1) integration bootstrap status
  2) jupyter sync adapter
  3) pieces sync adapter
USAGE
}

while [[ $# -gt 0 ]]; do
  case "$1" in
    --dry-run)
      DRY_RUN=1
      APPLY=0
      shift
      ;;
    --apply)
      APPLY=1
      DRY_RUN=0
      shift
      ;;
    --probe-ports)
      PROBE_PORTS="$2"
      shift 2
      ;;
    -h|--help)
      usage
      exit 0
      ;;
    *)
      echo "Unknown arg: $1" >&2
      usage
      exit 2
      ;;
  esac
done

if [[ "$DRY_RUN" -eq 1 ]]; then
  echo "[dry-run] running: scripts/docs/session_bootstrap.sh"
  # Note: bootstrap always refreshes its status reports, even in dry-run mode.
  bash scripts/docs/session_bootstrap.sh
  echo "[dry-run] running: scripts/docs/jupyter_sync.sh --dry-run"
  bash scripts/docs/jupyter_sync.sh --dry-run
  echo "[dry-run] running: scripts/docs/pieces_sync.sh --dry-run"
  if [[ -n "$PROBE_PORTS" ]]; then
    bash scripts/docs/pieces_sync.sh --dry-run --probe-ports "$PROBE_PORTS"
  else
    bash scripts/docs/pieces_sync.sh --dry-run
  fi
  exit 0
fi

bash scripts/docs/session_bootstrap.sh
bash scripts/docs/jupyter_sync.sh --apply
if [[ -n "$PROBE_PORTS" ]]; then
  bash scripts/docs/pieces_sync.sh --apply --probe-ports "$PROBE_PORTS"
else
  bash scripts/docs/pieces_sync.sh --apply
fi

echo "services sync completed"
```
scripts/docs/session_bootstrap.sh (new executable file, 117 lines)
@@ -0,0 +1,117 @@
```bash
#!/usr/bin/env bash
set -euo pipefail

ROOT_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")/../.." && pwd)"
OUT_DIR="$ROOT_DIR/docs/consolidation"
STAMP="$(date +%Y%m%d-%H%M%S)"
LATEST="$OUT_DIR/INTEGRATIONS_STATUS_LATEST.md"
REPORT="$OUT_DIR/INTEGRATIONS_STATUS_${STAMP}.md"

mkdir -p "$OUT_DIR"

section() {
  printf "\n## %s\n\n" "$1"
}

write_status() {
  local name="$1"; shift
  local status="$1"; shift
  local details="$*"
  printf -- "- **%s**: %s" "$name" "$status"
  if [[ -n "$details" ]]; then
    printf " - %s" "$details"
  fi
  printf "\n"
}

{
  printf "# Integrations Bootstrap Status\n\n"
  printf "Generated: %s\n\n" "$(date -u '+%Y-%m-%d %H:%M:%S UTC')"
  printf "Repo: %s\n\n" "$ROOT_DIR"

  section "Gitea"
  gitea_code="$(curl -sS -o /dev/null -w '%{http_code}' http://127.0.0.1:3000/ || true)"
  if [[ "$gitea_code" == "200" || "$gitea_code" == "302" ]]; then
    write_status "gitea_http" "OK" "http_code=$gitea_code (http://127.0.0.1:3000)"
  else
    write_status "gitea_http" "DEGRADED" "http_code=$gitea_code (http://127.0.0.1:3000)"
  fi

  gitea_remotes="$(git -C "$ROOT_DIR" remote -v | awk '/gitea/ {print $1" "$2}' | sort -u || true)"
  if [[ -n "$gitea_remotes" ]]; then
    write_status "gitea_git_remote" "OK" "$(echo "$gitea_remotes" | tr '\n' '; ' | sed 's/; $//')"
  else
    write_status "gitea_git_remote" "INFO" "no remote URL containing 'gitea' found"
  fi

  section "GitHub"
  if command -v gh >/dev/null 2>&1; then
    gh_status="$(gh auth status 2>&1 || true)"
    if echo "$gh_status" | rg -q "Logged in to github.com"; then
      account_line="$(echo "$gh_status" | awk -F'account ' '/Logged in to github.com account/{print $2}' | head -n1)"
      write_status "gh_auth" "OK" "$account_line"
    else
      write_status "gh_auth" "DEGRADED" "gh installed but not authenticated"
    fi
  else
    write_status "gh_auth" "DEGRADED" "gh CLI not installed"
  fi

  gh_remotes="$(git -C "$ROOT_DIR" remote -v | awk '/github.com/ {print $1" "$2}' | sort -u || true)"
  if [[ -n "$gh_remotes" ]]; then
    write_status "github_git_remote" "OK" "$(echo "$gh_remotes" | tr '\n' '; ' | sed 's/; $//')"
  else
    write_status "github_git_remote" "INFO" "no remote URL containing 'github.com' found"
  fi

  section "Jupyter"
  if command -v jupyter >/dev/null 2>&1; then
    jlist="$(jupyter server list 2>/dev/null || true)"
    running="$(echo "$jlist" | sed '/^\s*$/d' | wc -l | tr -d ' ')"
    write_status "jupyter_cli" "OK" "jupyter found in PATH"
    write_status "jupyter_servers" "INFO" "listed_rows=$running"
  elif python3 -m jupyter --version >/dev/null 2>&1; then
    write_status "jupyter_cli" "OK" "python3 -m jupyter available"
    jlist="$(python3 -m jupyter server list 2>/dev/null || true)"
    running="$(echo "$jlist" | sed '/^\s*$/d' | wc -l | tr -d ' ')"
    write_status "jupyter_servers" "INFO" "listed_rows=$running"
  else
    write_status "jupyter_cli" "DEGRADED" "jupyter not found in PATH"
  fi

  nb_dir="/Users/apple/notebooks"
  if [[ -d "$nb_dir" ]]; then
    nb_count="$(find "$nb_dir" -maxdepth 1 -type f -name '*.ipynb' | wc -l | tr -d ' ')"
    write_status "notebooks_dir" "OK" "$nb_dir (ipynb_count=$nb_count)"
  else
    write_status "notebooks_dir" "DEGRADED" "$nb_dir missing"
  fi

  section "Pieces"
  pext_dir="$HOME/.cursor/extensions"
  if [[ -d "$pext_dir" ]]; then
    pieces_hits="$(find "$pext_dir" -maxdepth 1 -type d -iname '*pieces*' | wc -l | tr -d ' ')"
    if [[ "$pieces_hits" -gt 0 ]]; then
      write_status "pieces_extension" "OK" "cursor extensions matched=$pieces_hits"
    else
      write_status "pieces_extension" "DEGRADED" "no Pieces extension in $pext_dir"
    fi
  else
    write_status "pieces_extension" "DEGRADED" "$pext_dir missing"
  fi

  pieces_data_dir="$HOME/Library/Application Support/Pieces"
  if [[ -d "$pieces_data_dir" ]]; then
    write_status "pieces_data_dir" "OK" "$pieces_data_dir"
  else
    write_status "pieces_data_dir" "INFO" "$pieces_data_dir not found"
  fi

  section "Next"
  printf -- "- Run docs sync dry-run: bash scripts/docs/docs_sync.sh --dry-run\n"
  printf -- "- Apply sync to remotes: bash scripts/docs/docs_sync.sh --apply --targets github,gitea\n"
} > "$REPORT"

cp "$REPORT" "$LATEST"
echo "Wrote: $REPORT"
echo "Updated: $LATEST"
```
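The `write_status` formatter above drives every bullet in the status report; its output shape can be verified in isolation (sample names and details invented):

```shell
# Copy of the write_status formatter from session_bootstrap.sh.
write_status() {
  local name="$1"; shift
  local status="$1"; shift
  local details="$*"
  printf -- "- **%s**: %s" "$name" "$status"
  if [[ -n "$details" ]]; then
    printf " - %s" "$details"
  fi
  printf "\n"
}

write_status "gitea_http" "OK" "http_code=200 (http://127.0.0.1:3000)"
write_status "pieces_data_dir" "INFO"
```

With details it yields `- **gitea_http**: OK - http_code=200 (http://127.0.0.1:3000)`; without, the detail suffix is simply omitted.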