Real-time pipelines
built into the event store
Filter, map, reduce, window, branch, and enrich events as they arrive. No separate stream processor. No Kafka. No Flink. Just your event store doing the work.
Six operators. Infinite pipelines.
Filter
Drop events that don't match your criteria before they enter the pipeline
Map
Transform event payloads — rename fields, compute derived values, enrich with context
Reduce
Aggregate events into running totals, counts, or custom accumulations over time windows
Window
Group events by time window (tumbling, sliding, session) for batch-style processing on a stream
Branch
Route events to different downstream pipelines based on type, content, or custom predicates
Enrich
Join event data with external sources — add user profiles, geo data, or lookup tables in-flight
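The semantics of the core operators can be sketched in plain TypeScript over an in-memory batch of events. The event shape below is hypothetical, but filter → map → reduce composes exactly the way the pipeline stages do:

```typescript
// Hypothetical event shape for illustration only.
interface OrderEvent {
  type: string;
  payload: { total: number; currency: string };
}

const events: OrderEvent[] = [
  { type: "order.placed", payload: { total: 40, currency: "USD" } },
  { type: "order.cancelled", payload: { total: 40, currency: "USD" } },
  { type: "order.placed", payload: { total: 60, currency: "USD" } },
];

// filter stage: keep only placed orders
// map stage:    extract the order total
// reduce stage: accumulate a running sum
const revenue = events
  .filter((e) => e.type === "order.placed")
  .map((e) => e.payload.total)
  .reduce((sum, total) => sum + total, 0);

console.log(revenue); // 100
```

The same composition applies on a stream: each stage consumes the previous stage's output as events arrive, rather than over a finished array.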
Define pipelines as JSON
// Define a pipeline: order events → compute daily revenue
{
  "name": "daily-revenue",
  "source": "order.*",
  "stages": [
    { "filter": { "event_type": "order.placed" } },
    { "map": { "extract": ["payload.total", "payload.currency"] } },
    { "window": { "type": "tumbling", "size": "1d" } },
    { "reduce": { "sum": "total", "count": "*", "group_by": "currency" } }
  ],
  "sink": "projection://daily-revenue-by-currency"
}
Projections
Materialized views that stay in sync with your event stream. Define a projection as a fold over events — AllSource keeps it current as new events arrive.
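A fold over events is just a reduce: apply each event to the current state to get the next state. A minimal sketch, assuming a hypothetical event and state shape (this is not AllSource's actual projection API):

```typescript
// Hypothetical event and state shapes for illustration.
interface OrderEvent { type: string; total: number; currency: string }
type Revenue = Record<string, number>; // currency → running total

// The fold step: current state + one event → next state.
function apply(state: Revenue, e: OrderEvent): Revenue {
  if (e.type !== "order.placed") return state;
  return { ...state, [e.currency]: (state[e.currency] ?? 0) + e.total };
}

const history: OrderEvent[] = [
  { type: "order.placed", total: 25, currency: "EUR" },
  { type: "order.placed", total: 75, currency: "USD" },
];

// Folding the full history yields the materialized view.
const projection = history.reduce(apply, {} as Revenue);
console.log(projection); // { EUR: 25, USD: 75 }
```

Keeping the projection current means applying the same `apply` step to each new event as it arrives, instead of re-folding from scratch.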
WebSocket Streaming
Subscribe to live event feeds via Phoenix Channels. The Query Service pushes new events to connected clients in real-time — no polling.
Event Replay
Replay any sequence of events through a pipeline. Rebuild projections from scratch, test new pipeline logic against historical data, or debug by replaying the last hour.
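Because a projection is a fold, replay is just re-running that fold over a slice of history. A sketch, with a hypothetical event shape and a `fromSeq` cursor standing in for whatever offset the replay starts from:

```typescript
// Hypothetical stored-event shape for illustration.
interface StoredEvent { seq: number; type: string; amount: number }

// Replay: re-run a fold over events at or after a given sequence number.
function replay(
  events: StoredEvent[],
  fold: (state: number, e: StoredEvent) => number,
  fromSeq = 0,
): number {
  return events.filter((e) => e.seq >= fromSeq).reduce(fold, 0);
}

const log: StoredEvent[] = [
  { seq: 1, type: "deposit", amount: 50 },
  { seq: 2, type: "deposit", amount: 30 },
  { seq: 3, type: "withdraw", amount: 20 },
];

const balance = (s: number, e: StoredEvent) =>
  e.type === "deposit" ? s + e.amount : s - e.amount;

console.log(replay(log, balance));    // 60  (full rebuild from scratch)
console.log(replay(log, balance, 3)); // -20 (only the last event)
```

Rebuilding from scratch, testing new logic against history, and debugging a recent window are all the same operation with different slices and different folds.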
469K Events/Sec
The Rust core processes events through pipelines at ingestion speed. No separate stream processor to deploy — it's built into the event store.
Build your first pipeline in 5 minutes
Free tier: 100K events/month. Pipelines included. No credit card required.
