# Agents
AI agents are standalone services that receive batched log records from the GlassFlow pipeline, process them, and send enriched results back. Agents are not part of the platform — you deploy and manage them yourself.
## How agents work
- The pipeline service consumes filtered records from NATS, collects them into a batch (up to 10), and POSTs them as a JSON array to the agent’s endpoint.
- The agent processes the batch (classification, enrichment, summarization, anomaly detection — whatever you build).
- The agent sends results back to the receiver via the GlassFlow Python SDK, which POSTs to `/internal/agent-output` with the project’s API key.
- The sink service picks up the output from NATS and dispatches it to configured webhooks or Slack.
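The batching step described above can be sketched in a few lines of Python (illustrative only — the real pipeline service consumes from NATS and POSTs the batches over HTTP):

```python
# The pipeline collects filtered records into batches of up to 10
# before POSTing them to the agent endpoint.
BATCH_SIZE = 10

def batch_records(records, batch_size=BATCH_SIZE):
    """Group filtered records into batches of at most batch_size items."""
    return [records[i:i + batch_size] for i in range(0, len(records), batch_size)]

# 23 filtered records become three batches: 10, 10, and 3
batches = batch_records([{"body": f"log {i}"} for i in range(23)])
print([len(b) for b in batches])  # [10, 10, 3]
```

Each batch then becomes one POST request to your agent, as shown in the request format below.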
## Request format
The pipeline sends a POST request to your agent with:
Headers:

```
Content-Type: application/json
X-Project-ID: <project-uuid>
X-Request-ID: <unique-request-id>
```
Body: A JSON array of flat log records:
```json
[
  {
    "resourceAttributes": {
      "service.name": "payment-service",
      "k8s.pod.name": "payment-abc123"
    },
    "severityNumber": 17,
    "severityText": "ERROR",
    "body": "Connection refused: database pool exhausted",
    "attributes": {},
    "traceId": "...",
    "spanId": "...",
    "timestamp": "2024-01-15T10:30:00Z"
  },
  {
    "resourceAttributes": {
      "service.name": "payment-service"
    },
    "severityNumber": 17,
    "severityText": "ERROR",
    "body": "Timeout waiting for DB connection after 30s",
    "attributes": {},
    "traceId": "...",
    "spanId": "...",
    "timestamp": "2024-01-15T10:30:01Z"
  }
]
```

## Sending results back
Install the GlassFlow Python SDK:
```bash
pip install /path/to/sdk/python
# or from your package registry
```

Use it to send output:
```python
from glassflow_ai import GlassFlow

gf = GlassFlow(
    api_key="your-project-api-key",
    endpoint="http://receiver:4318",  # data plane
)

gf.send_output(
    payload={
        "summary": "Database connection pool exhausted",
        "severity": "critical",
        "root_cause": "Too many concurrent connections",
        "remediation": "Increase max_connections or add connection pooling",
    },
    request_id=request_id,  # from X-Request-ID header
)
```

## Building an agent
A minimal agent is a FastAPI server with a `/process` endpoint:
```python
from fastapi import FastAPI, Header, Request
from glassflow_ai import GlassFlow

app = FastAPI()
gf = GlassFlow(api_key="...", endpoint="http://receiver:4318")

@app.post("/process")
async def process(
    request: Request,
    x_project_id: str = Header(default=""),
    x_request_id: str = Header(default=""),
):
    records = await request.json()
    # Your processing logic here
    output = {"processed": len(records), "results": [...]}
    gf.send_output(payload=output, request_id=x_request_id)
    return {"status": "ok"}

@app.get("/healthz")
async def healthz():
    return {"status": "ok"}
```

## Bundled example: OTLP Error Summary
The repository includes a complete example agent at `agents/otlp-error-summary/` that uses the OpenAI Agents SDK to classify and enrich error logs.
It runs two AI agents in sequence:
- **Classifier** — identifies the error type (e.g. `ConnectionPoolExhausted`) and severity (`critical`, `warning`, `informational`)
- **Enricher** — produces a summary, root cause hypothesis, remediation steps, and whether to notify
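The two-stage flow can be sketched roughly like this, with stub functions standing in for the example agent’s actual LLM calls (the function names and rules here are illustrative, not the example’s real implementation):

```python
def classify(records):
    """Stand-in for the Classifier agent (the real example uses an LLM)."""
    bodies = " ".join(r.get("body", "") for r in records)
    if "pool exhausted" in bodies:
        return {"error_type": "ConnectionPoolExhausted", "severity": "critical"}
    return {"error_type": "Unknown", "severity": "informational"}

def enrich(classification, records):
    """Stand-in for the Enricher agent: adds summary and notification decision."""
    return {
        **classification,
        "summary": f"{len(records)} record(s) classified as {classification['error_type']}",
        "notify": classification["severity"] == "critical",
    }

records = [{"body": "Connection refused: database pool exhausted"}]
result = enrich(classify(records), records)
print(result["error_type"], result["notify"])  # ConnectionPoolExhausted True
```

The key point is the sequencing: the Enricher receives the Classifier’s output along with the original records, so its summary and notification decision are conditioned on the classification.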
### Running the example agent
```bash
cd agents/otlp-error-summary

# Install dependencies
pip install -r requirements.txt
pip install ../../../sdk/python

# Set environment variables
export OPENAI_API_KEY="sk-..."
export GLASSFLOW_API_KEY="your-project-api-key"
export GLASSFLOW_ENDPOINT="http://localhost:4318"

# Start the agent
uvicorn main:app --host 0.0.0.0 --port 8000
```

Then set your project’s agent endpoint to `http://localhost:8000/process` in the Pipeline configuration.
### With Docker Compose
The `docker-compose.yml` includes the agent service. Set your API keys in the environment:
```bash
OPENAI_API_KEY=sk-... GLASSFLOW_API_KEY=your-key docker compose up agent
```

### On Kubernetes
The example agent ships with ready-to-use Kubernetes manifests in `agents/otlp-error-summary/k8s/`.
#### 1. Create the secrets
Edit `k8s/deployment.yaml` and replace the placeholder values in the `Secret`, or create the secret directly:
```bash
kubectl create secret generic otlp-error-summary-secrets \
  --from-literal=openai-api-key="sk-..." \
  --from-literal=glassflow-api-key="your-project-api-key"
```

If you create the secret manually, remove the `Secret` resource from `deployment.yaml` before applying.
#### 2. Update the receiver endpoint
If GlassFlow AI Runtime is installed via the Helm chart with the default release name `glassflow`, the receiver service is `glassflow-receiver`. The manifests default to:
```yaml
- name: GLASSFLOW_ENDPOINT
  value: "http://glassflow-receiver:4318"
```

Adjust the service name if your Helm release has a different name (e.g. `http://<release>-receiver:4318`).
#### 3. Deploy
```bash
# Using kubectl
kubectl apply -f agents/otlp-error-summary/k8s/deployment.yaml

# Or using kustomize
kubectl apply -k agents/otlp-error-summary/k8s/
```

#### 4. Verify
```bash
kubectl get pods -l app.kubernetes.io/name=otlp-error-summary
kubectl logs -l app.kubernetes.io/name=otlp-error-summary
```

#### 5. Configure the pipeline
Set the agent endpoint in your GlassFlow project’s pipeline configuration to:
```
http://otlp-error-summary:8000/process
```

The agent and GlassFlow must be in the same Kubernetes namespace, or use the fully qualified service name (`otlp-error-summary.<namespace>.svc.cluster.local`).
### Customizing the deployment
| Environment variable | Description | Default |
|---|---|---|
| `OPENAI_API_KEY` | OpenAI API key | (required) |
| `GLASSFLOW_API_KEY` | GlassFlow project API key | (required) |
| `GLASSFLOW_ENDPOINT` | Receiver URL for sending output | `http://glassflow-receiver:4318` |
| `OPENAI_MODEL` | OpenAI model to use | `gpt-4o-mini` |
| `AI_TIMEOUT_SECONDS` | Timeout for AI calls | `30` |
| `LOG_LEVEL` | Python log level | `INFO` |
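A configuration like the table above might be read at startup along these lines (a sketch with the table’s names and defaults; the example agent’s actual config code may be structured differently):

```python
import os

def load_config(env=os.environ):
    """Read agent settings from the environment, applying the documented defaults."""
    return {
        "openai_api_key": env["OPENAI_API_KEY"],        # required, raises KeyError if unset
        "glassflow_api_key": env["GLASSFLOW_API_KEY"],  # required
        "endpoint": env.get("GLASSFLOW_ENDPOINT", "http://glassflow-receiver:4318"),
        "model": env.get("OPENAI_MODEL", "gpt-4o-mini"),
        "timeout": int(env.get("AI_TIMEOUT_SECONDS", "30")),
        "log_level": env.get("LOG_LEVEL", "INFO"),
    }

cfg = load_config({"OPENAI_API_KEY": "sk-test", "GLASSFLOW_API_KEY": "key"})
print(cfg["model"], cfg["timeout"])  # gpt-4o-mini 30
```

Failing fast on the two required keys (rather than defaulting them) surfaces misconfigured deployments at pod startup instead of at the first AI call.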
## Connecting to GlassFlow
- Deploy your agent anywhere reachable from the pipeline pods
- Create an API key in the project’s UI
- Set the agent endpoint in the project’s Pipeline configuration
- Configure the SDK in your agent to point to the receiver (`http://receiver:4318` inside the cluster, or your external receiver URL)
The agent endpoint must be reachable from the pipeline service. If both run in the same Kubernetes cluster, use the internal service name. If the agent is external, ensure network connectivity.
## Agent requirements
Your agent must:
- Accept `POST` requests with a JSON array body
- Return a `2xx` status code on success (the pipeline retries on `5xx`)
- Send output back via the GlassFlow SDK (or POST to `/internal/agent-output` directly)
Your agent can be written in any language — the SDK is optional. You can POST directly to the receiver’s `/internal/agent-output` endpoint with the `X-API-Key` header.
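For an SDK-free agent, the direct call might be built roughly like this. This is a sketch: the body shape (`request_id` plus `payload`, mirroring the SDK’s `send_output` arguments) is an assumption — check the receiver’s actual contract before relying on it.

```python
import json
import urllib.request

def build_output_request(endpoint, api_key, request_id, payload):
    """Build (but don't send) a POST to the receiver's /internal/agent-output endpoint.

    NOTE: the body shape here is assumed from the SDK's send_output(payload=...,
    request_id=...) signature, not taken from a documented wire format.
    """
    body = json.dumps({"request_id": request_id, "payload": payload}).encode()
    return urllib.request.Request(
        f"{endpoint}/internal/agent-output",
        data=body,
        headers={"Content-Type": "application/json", "X-API-Key": api_key},
        method="POST",
    )

req = build_output_request(
    "http://receiver:4318",
    "your-project-api-key",
    "req-123",  # from the X-Request-ID header of the incoming batch
    {"summary": "Database connection pool exhausted"},
)
print(req.full_url)  # http://receiver:4318/internal/agent-output
# Send with: urllib.request.urlopen(req)
```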