
Getting Started

The fastest way to try GlassFlow AI Runtime is with Docker Compose. This starts all platform services locally — you only need Docker installed.

Prerequisites

Docker with Compose support. The stack is started with `docker compose`, so a recent Docker installation is all you need.

Start the stack

Clone the repository and start all services:

```shell
git clone https://github.com/glassflow/glassflow-ai-runtime.git
cd glassflow-ai-runtime
docker compose up -d
```

This starts:

| Service | URL | Description |
|---|---|---|
| UI | http://localhost:3000 | Web interface |
| Control Plane API | http://localhost:8080 | Management API |
| Receiver (Data Plane) | http://localhost:4318 | OTLP ingest + agent output |
| Pipeline | (internal) | Filter/transform + agent forwarding |
| Sink | (internal) | Webhook/Slack dispatch |
| NATS | localhost:4222 | Message streaming |
| PostgreSQL | localhost:5432 | Config and auth storage |

Create an account

Open http://localhost:3000 and click Sign Up to create your first user account.

Create a project

  1. After logging in, click Create Project and give it a name.
  2. Navigate to API Keys and create a key — you’ll need this to send data and connect agents.

Send test data

Send a test OTLP log to the receiver:

```shell
curl -X POST http://localhost:4318/v1/logs \
  -H "Content-Type: application/json" \
  -H "X-API-Key: YOUR_API_KEY" \
  -d '{
    "resourceLogs": [{
      "resource": {
        "attributes": [
          {"key": "service.name", "value": {"stringValue": "my-service"}}
        ]
      },
      "scopeLogs": [{
        "logRecords": [{
          "timeUnixNano": "1700000000000000000",
          "severityNumber": 17,
          "severityText": "ERROR",
          "body": {"stringValue": "Connection refused: database pool exhausted"}
        }]
      }]
    }]
  }'
```
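The same request can be sent from Python using only the standard library. This is a minimal sketch mirroring the curl call above; the endpoint, headers, and payload shape come from that example, and `YOUR_API_KEY` remains a placeholder:

```python
import json
import urllib.request

def build_otlp_log(service: str, message: str, severity: int = 17) -> dict:
    """Build a minimal OTLP/JSON log payload (same shape as the curl example)."""
    return {
        "resourceLogs": [{
            "resource": {"attributes": [
                {"key": "service.name", "value": {"stringValue": service}}
            ]},
            "scopeLogs": [{"logRecords": [{
                "timeUnixNano": "1700000000000000000",
                "severityNumber": severity,
                # OTLP maps severityNumber 17-20 to ERROR
                "severityText": "ERROR" if severity >= 17 else "INFO",
                "body": {"stringValue": message},
            }]}],
        }]
    }

def send_log(payload: dict, api_key: str) -> int:
    """POST the payload to the local receiver; returns the HTTP status code."""
    req = urllib.request.Request(
        "http://localhost:4318/v1/logs",
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json", "X-API-Key": api_key},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return resp.status

# Example (requires the stack to be running):
# payload = build_otlp_log("my-service", "Connection refused: database pool exhausted")
# send_log(payload, "YOUR_API_KEY")
```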

Configure the pipeline

In the UI, go to your project’s Pipeline page:

  1. Optionally enable a filter (e.g. severityNumber >= 17 to only process errors)
  2. Optionally add transforms to extract or reshape fields
  3. Set the Agent Endpoint URL — this is where the pipeline sends batched data (e.g. http://agent:8000/process)
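To see what a `severityNumber >= 17` filter does, here is an illustrative sketch. The pipeline's actual filter engine is internal; the record shape and threshold simply follow the OTLP example above:

```python
def keep_errors(log_records: list[dict], min_severity: int = 17) -> list[dict]:
    """Drop records below the severity threshold (in OTLP, 17 means ERROR)."""
    return [r for r in log_records if r.get("severityNumber", 0) >= min_severity]

records = [
    {"severityNumber": 9,  "body": {"stringValue": "request handled"}},
    {"severityNumber": 17, "body": {"stringValue": "database pool exhausted"}},
]
# Only the severityNumber 17 record passes the filter
print(keep_errors(records))
```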

Connect an agent

See Agents for how to build and connect an AI agent. The bundled example agent at agents/otlp-error-summary/ classifies and enriches error logs using OpenAI.
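As a rough sketch of what an agent endpoint can look like, the following standalone HTTP server accepts POSTed batches at `/process` and returns enriched records. The port, path, and payload shape here are assumptions for illustration only; see the Agents docs and the bundled example for the real contract:

```python
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

def enrich(records: list[dict]) -> list[dict]:
    """Toy enrichment: tag each record; a real agent might call an LLM here."""
    return [{**r, "agent.category": "error"} for r in records]

class AgentHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        if self.path != "/process":
            self.send_error(404)
            return
        length = int(self.headers.get("Content-Length", 0))
        # Assumed payload: a JSON array of log records from the pipeline
        batch = json.loads(self.rfile.read(length) or b"[]")
        body = json.dumps(enrich(batch)).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

# To run: HTTPServer(("0.0.0.0", 8000), AgentHandler).serve_forever()
# then point the pipeline's Agent Endpoint URL at http://agent:8000/process
```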

Next steps