Calseta is under active development. APIs and features may change. We welcome feedback and contributions on GitHub.

Run Calseta locally

Calseta runs as a single stack: FastAPI server, MCP server, and PostgreSQL. Start everything with one command:
```bash
docker compose up
```
That’s it. The platform starts three services:
| Service | Port | Description |
| --- | --- | --- |
| FastAPI server | 8000 | REST API for alerts, incidents, enrichment, and more |
| MCP server | 8001 | Model Context Protocol server for AI agents |
| PostgreSQL | 5432 | Primary store and task queue |
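For orientation, a compose file wiring up these three services might look roughly like the sketch below. The service names, image, and credentials here are illustrative assumptions, not Calseta's actual configuration; the `docker-compose.yml` shipped in the repository is authoritative.

```yaml
# Hypothetical sketch of the three-service stack.
services:
  api:
    build: .
    ports:
      - "8000:8000"   # FastAPI REST API
    depends_on:
      - db
  mcp:
    build: .
    ports:
      - "8001:8001"   # MCP server for AI agents
    depends_on:
      - db
  db:
    image: postgres:16
    ports:
      - "5432:5432"   # primary store and task queue
    environment:
      POSTGRES_PASSWORD: calseta  # placeholder credential
```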

Next steps

How It Works

Understand the five-step pipeline: ingest, normalize, enrich, contextualize, and dispatch.

Authentication

Create API keys and authenticate your requests.
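Once you have a key, it is sent with every request. A minimal sketch in Python, assuming a bearer-token `Authorization` header and a hypothetical `/v1/alerts` endpoint; see the Authentication page for the actual header name and paths:

```python
import urllib.request

API_KEY = "cal_xxxxxxxx"  # placeholder; create a real key first

# Build an authenticated request against the local FastAPI server.
# The Authorization scheme and endpoint path are assumptions.
req = urllib.request.Request(
    "http://localhost:8000/v1/alerts",
    headers={"Authorization": f"Bearer {API_KEY}"},
)

# urllib.request.urlopen(req) would send it; here we just inspect it.
print(req.get_header("Authorization"))
```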

Alert Sources

Connect Microsoft Sentinel, Elastic, Splunk, or a generic webhook.
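The generic webhook accepts alert payloads over HTTP. As a sketch, assuming a JSON body and a hypothetical ingestion path (the Alert Sources page documents the real schema and URL):

```python
import json
import urllib.request

# Illustrative alert payload; field names are assumptions,
# not Calseta's documented webhook schema.
alert = {
    "source": "generic",
    "title": "Suspicious sign-in",
    "severity": "high",
    "raw": {"user": "alice@example.com", "ip": "203.0.113.7"},
}

# Hypothetical path; add your API key header before sending.
req = urllib.request.Request(
    "http://localhost:8000/v1/ingest/webhook",
    data=json.dumps(alert).encode("utf-8"),
    headers={"Content-Type": "application/json"},
    method="POST",
)
```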

MCP Setup

Connect your AI agent to Calseta via the MCP server.
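Most MCP clients are pointed at a server with a small config entry. A sketch assuming an HTTP transport on port 8001; the exact transport, URL path, and config keys depend on your agent and are covered in the MCP Setup guide:

```json
{
  "mcpServers": {
    "calseta": {
      "url": "http://localhost:8001/mcp"
    }
  }
}
```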