# OpenSRE vs Commercial Incident Tools
How OpenSRE compares to PagerDuty AI, Rootly AI, and Shoreline.
| Capability | OpenSRE | PagerDuty AI | Rootly AI | Shoreline |
|---|---|---|---|---|
| Open Source | Apache 2.0 | No | No | No |
| Self-Hosted | Yes | No | No | Partial |
| Episodic Memory | Yes | No | No | No |
| Knowledge Graph | Yes | No | No | No |
| Investigation Skills | 46 built-in + custom | Limited | Limited | Yes |
| LLM Provider Choice | Any (via LiteLLM) | Fixed | Fixed | Fixed |
| Slack Integration | Yes | Yes | Yes | Yes |
| Web Console | Yes | Yes | Yes | Yes |
| Custom Skills | Yes | No | No | Partial |
| Price | Free (Apache 2.0) | $$$/month | $$$/month | $$$/month |
## Episodic Memory
OpenSRE's episodic memory system learns from every investigation. After each incident, it extracts structured metadata — root cause, alert type, affected services — and stores it. Future investigations of similar incidents benefit from this accumulated knowledge. Neither PagerDuty AI, Rootly AI, nor Shoreline has an equivalent system.
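The record-and-recall loop can be sketched in a few lines. This is a minimal in-memory illustration, not OpenSRE's actual storage backend; the `Episode` schema and class names are hypothetical, chosen to mirror the metadata fields named above.

```python
from dataclasses import dataclass

@dataclass
class Episode:
    """Structured metadata extracted after an investigation (hypothetical schema)."""
    alert_type: str
    root_cause: str
    affected_services: list

class EpisodicMemory:
    """Minimal in-memory store; the real backend and matching logic may differ."""
    def __init__(self):
        self.episodes = []

    def record(self, episode: Episode):
        # Persist the distilled outcome of a finished investigation.
        self.episodes.append(episode)

    def recall(self, alert_type: str):
        # Surface past episodes matching the incoming alert's type,
        # so a new investigation starts with prior root causes in hand.
        return [e for e in self.episodes if e.alert_type == alert_type]

memory = EpisodicMemory()
memory.record(Episode("HighLatency", "connection pool exhaustion", ["checkout", "payments"]))
memory.record(Episode("OOMKilled", "memory leak in cache layer", ["search"]))

similar = memory.recall("HighLatency")
print(similar[0].root_cause)  # connection pool exhaustion
```

A production system would match on fuzzier signals than an exact alert type (service overlap, symptom similarity), but the shape of the loop — extract, store, recall — is the same.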
## Knowledge Graph
The Neo4j knowledge graph gives OpenSRE's agents a live map of your service topology. During investigation, they can perform blast radius analysis (what services are affected by this failure?), dependency traversal (what does this service depend on?), and recent change detection. This structural context is what transforms generic AI investigation into system-aware diagnosis.
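Blast radius analysis is essentially a reverse traversal of the dependency graph. In OpenSRE this runs as queries against Neo4j; the pure-Python sketch below shows the idea with a hypothetical service topology (the service names and `DEPENDS_ON` structure are illustrative, not from OpenSRE).

```python
from collections import deque

# Hypothetical topology: each service maps to the services it depends on.
DEPENDS_ON = {
    "frontend": ["checkout", "search"],
    "checkout": ["payments", "inventory"],
    "payments": ["postgres"],
    "inventory": ["postgres"],
    "search": ["elasticsearch"],
}

def blast_radius(failed: str) -> set:
    """Walk dependency edges in reverse: everything that transitively
    depends on `failed` is potentially affected by its failure."""
    affected, queue = set(), deque([failed])
    while queue:
        svc = queue.popleft()
        for parent, deps in DEPENDS_ON.items():
            if svc in deps and parent not in affected:
                affected.add(parent)
                queue.append(parent)
    return affected

print(sorted(blast_radius("postgres")))  # ['checkout', 'frontend', 'inventory', 'payments']
```

Dependency traversal ("what does this service depend on?") is the same walk in the forward direction. Backing this with a graph database rather than a static dict is what keeps the map live as topology changes.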
## Open Source + Self-Hosted
OpenSRE is Apache 2.0 licensed and fully self-hosted. Your incident data, episodic memory, and knowledge graph never leave your infrastructure. For teams with compliance requirements, air-gapped environments, or data sovereignty concerns, this is the decisive advantage. Commercial tools require sending incident data to external SaaS platforms.
## LLM Provider Choice
Commercial tools lock you into a specific AI provider. OpenSRE routes through LiteLLM, which supports Anthropic, OpenAI, OpenRouter, Ollama, and any compatible provider. Use Claude for quality, switch to efficient open-source models for cost, or run local models in air-gapped environments.
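Provider switching with LiteLLM typically comes down to configuration rather than code. The fragment below is an illustrative LiteLLM proxy-style `model_list`; the alias names are hypothetical, and whether OpenSRE consumes exactly this file is an assumption — consult the project's own configuration docs.

```yaml
model_list:
  - model_name: investigator            # alias the agents request
    litellm_params:
      model: anthropic/claude-3-5-sonnet-20241022
      api_key: os.environ/ANTHROPIC_API_KEY
  - model_name: investigator-local      # air-gapped fallback served by Ollama
    litellm_params:
      model: ollama/llama3
      api_base: http://localhost:11434
```

Because callers address the alias rather than the underlying model, swapping providers means editing this file, not the investigation code.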