microdao-daarion/services/market-data-service/app/consumers/storage.py
Apple 09dee24342 feat: MD pipeline — market-data-service hardening + SenpAI NATS consumer
Producer (market-data-service):
- Backpressure: smart drop policy (heartbeats→quotes→trades preserved)
- Heartbeat monitor: synthetic HeartbeatEvent on provider silence
- Graceful shutdown: WS→bus→storage→DB engine cleanup sequence
- Bybit V5 public WS provider (backup for Binance, no API key needed)
- FailoverManager: health-based provider switching with recovery
- NATS output adapter: md.events.{type}.{symbol} for SenpAI
- /bus-stats endpoint for backpressure monitoring
- Dockerfile + docker-compose.node1.yml integration
- 36 tests (parsing + bus + failover), requirements.lock
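
The "smart drop policy" above can be sketched as a bounded bus that, when full, evicts the lowest-priority queued event (heartbeats first, then quotes) so that trades are always preserved. This is a minimal illustration under assumed names (`DropPolicyBus`, `Priority`), not the service's actual API:

```python
from collections import deque
from enum import IntEnum


class Priority(IntEnum):
    # Lower value = dropped first under backpressure (hypothetical ordering
    # matching heartbeats -> quotes -> trades preserved).
    HEARTBEAT = 0
    QUOTE = 1
    TRADE = 2


class DropPolicyBus:
    def __init__(self, maxsize: int = 4) -> None:
        self._queue: deque[tuple[Priority, object]] = deque()
        self._maxsize = maxsize
        self.dropped = 0

    def publish(self, priority: Priority, event: object) -> bool:
        if len(self._queue) < self._maxsize:
            self._queue.append((priority, event))
            return True
        # Bus full: evict the first queued event strictly less important
        # than the incoming one, if any.
        for i, (p, _) in enumerate(self._queue):
            if p < priority:
                del self._queue[i]
                self.dropped += 1
                self._queue.append((priority, event))
                return True
        # Nothing evictable: the incoming event itself is dropped.
        self.dropped += 1
        return False


bus = DropPolicyBus(maxsize=2)
bus.publish(Priority.HEARTBEAT, "hb")
bus.publish(Priority.QUOTE, "q1")
assert bus.publish(Priority.TRADE, "t1")            # evicts the heartbeat
assert not bus.publish(Priority.HEARTBEAT, "hb2")   # dropped: only higher-priority events queued
```

The point of the ordering is that a dropped heartbeat or stale quote is recoverable, while a dropped trade is data loss.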

Consumer (senpai-md-consumer):
- NATSConsumer: subscribe md.events.>, queue group senpai-md, backpressure
- State store: LatestState + RollingWindow (deque, 60s)
- Feature engine: 11 features (mid, spread, VWAP, return, vol, latency)
- Rule-based signals: long/short on return+volume+spread conditions
- Publisher: rate-limited features + signals + alerts to NATS
- HTTP API: /health, /metrics, /state/latest, /features/latest, /stats
- 10 Prometheus metrics
- Dockerfile + docker-compose.senpai.yml
- 41 tests (parsing + state + features + rate-limit), requirements.lock

CI: ruff + pytest + smoke import for both services
Tests: 77 total passed, lint clean
Co-authored-by: Cursor <cursoragent@cursor.com>
2026-02-09 11:46:15 -08:00


"""
StorageConsumer: persists events to SQLite + JSONL log.
"""
from __future__ import annotations
import logging
from pathlib import Path
from app.config import settings
from app.domain.events import (
BookL2Event,
Event,
EventType,
QuoteEvent,
TradeEvent,
)
from app.db import repo
logger = logging.getLogger(__name__)
class StorageConsumer:
"""
Writes every event to:
1. SQLite (via async repo) — structured, queryable.
2. JSONL file — append-only event log for replay/audit.
"""
def __init__(self, jsonl_path: str | None = None) -> None:
self._jsonl_path = Path(jsonl_path or settings.jsonl_path)
self._jsonl_file = None
self._count = 0
async def start(self) -> None:
"""Open JSONL file for appending."""
self._jsonl_file = open(self._jsonl_path, "a", buffering=1) # line-buffered
logger.info("storage.started", extra={"jsonl": str(self._jsonl_path)})
async def handle(self, event: Event) -> None:
"""Persist one event."""
# 1. JSONL log (always)
if self._jsonl_file:
line = event.model_dump_json()
self._jsonl_file.write(line + "\n")
# 2. SQLite (by type)
if event.event_type == EventType.TRADE:
assert isinstance(event, TradeEvent)
await repo.save_trade(event)
elif event.event_type == EventType.QUOTE:
assert isinstance(event, QuoteEvent)
await repo.save_quote(event)
elif event.event_type == EventType.BOOK_L2:
assert isinstance(event, BookL2Event)
await repo.save_book_snapshot(event)
# Heartbeats → only JSONL, not SQLite
self._count += 1
async def stop(self) -> None:
if self._jsonl_file:
self._jsonl_file.close()
logger.info("storage.stopped", extra={"events_written": self._count})
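
The JSONL side of this consumer is described as an append-only log for replay/audit. A minimal sketch of replaying such a log (event shape is illustrative; the real events are Pydantic models serialized via `model_dump_json()`):

```python
import json
import tempfile
from pathlib import Path


def replay(path: Path):
    """Yield events from an append-only JSONL log, one JSON object per line."""
    with path.open() as f:
        for line in f:
            yield json.loads(line)


# Write a tiny sample log (field names are assumptions for illustration).
log = Path(tempfile.mkdtemp()) / "events.jsonl"
with log.open("a") as f:
    f.write(json.dumps({"event_type": "trade", "symbol": "BTCUSDT", "price": 50000.0}) + "\n")
    f.write(json.dumps({"event_type": "quote", "symbol": "BTCUSDT", "bid": 49999.5}) + "\n")

events = list(replay(log))
assert len(events) == 2
assert events[0]["event_type"] == "trade"
```

Because the consumer opens the file line-buffered (`buffering=1`), each event is visible to such a replay reader as soon as `handle()` returns, without waiting for `stop()` to flush.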