Overview
Observability in Fraim helps you:
- Track workflow execution - Monitor the performance and progress of your security workflows
- Debug LLM interactions - Inspect prompts, responses, and token usage across different AI models
- Analyze performance - Identify bottlenecks and optimize your workflows
- Monitor costs - Track API usage and costs across different LLM providers
Installation
For Development (Git Clone Users)
If you’ve cloned the Fraim repository and are using `uv` for development:
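A minimal sketch, assuming the standard `uv` workflow from the repository root and that `fraim` is the project's CLI entry point:

```bash
# Install the project's dependencies into a uv-managed virtual environment
uv sync

# Run Fraim through that environment
uv run fraim --help
```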
For Production (PyPI Users) [Coming Soon]
If you’re installing Fraim as a package:
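Since PyPI installation is marked as coming soon, the following is a hypothetical sketch; the package name is an assumption:

```bash
# Hypothetical: assumes the package will be published to PyPI as 'fraim'
pip install fraim
```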
Configuration
Langfuse Setup
Choose between hosted (Langfuse Cloud) or self-hosted (Docker) deployment:
Option 1: Langfuse Cloud (Hosted)
The easiest way to get started with observability:
1. Create a Langfuse account at langfuse.com
2. Create a new project in the Langfuse dashboard
3. Get your API credentials from the project settings:
   - Public Key
   - Secret Key
4. Configure environment variables (see the sketch below):
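A minimal sketch, assuming Fraim picks up the standard Langfuse SDK environment variables (key values are placeholders):

```bash
# Credentials from your Langfuse project settings (placeholders)
export LANGFUSE_PUBLIC_KEY="pk-lf-..."
export LANGFUSE_SECRET_KEY="sk-lf-..."
# Langfuse Cloud host; use the region your project was created in
export LANGFUSE_HOST="https://cloud.langfuse.com"
```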
Option 2: Self-hosted Langfuse (Docker)
For enhanced privacy and control, run Langfuse in your own infrastructure using Docker (a combined sketch of these steps follows the list):
1. Clone the Langfuse repository
2. Start the services using their Docker Compose setup
3. Access Langfuse at http://localhost:3000
4. Create your first project
5. Configure environment variables
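A sketch of those steps as shell commands, following Langfuse's documented Docker Compose quickstart; the environment variable names are the standard Langfuse SDK ones, which Fraim is assumed to read:

```bash
# 1. Clone the Langfuse repository
git clone https://github.com/langfuse/langfuse.git
cd langfuse

# 2. Start the services with Docker Compose (detached)
docker compose up -d

# 3-4. Open http://localhost:3000 in a browser and create your first project

# 5. Point Fraim at the self-hosted instance (key values are placeholders)
export LANGFUSE_PUBLIC_KEY="pk-lf-..."
export LANGFUSE_SECRET_KEY="sk-lf-..."
export LANGFUSE_HOST="http://localhost:3000"
```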
Usage
Enable observability when running Fraim workflows by passing the `--observability langfuse` flag.
For example, to enable observability for a code security analysis:
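A hypothetical invocation; only the `--observability langfuse` flag is documented here, while the workflow name and location argument are illustrative assumptions:

```bash
# The 'code' workflow name and --location argument are illustrative
# assumptions; --observability langfuse is the documented flag
fraim --observability langfuse code --location /path/to/your/repo
```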
Troubleshooting
Common Issues
Observability Not Working
Missing Traces
- Ensure the `--observability langfuse` flag is provided
- Check network connectivity to the Langfuse host
- Verify API credentials are correct
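To rule out connectivity problems, you can probe the configured host directly; this assumes Langfuse's public health endpoint is exposed at its standard path:

```bash
# Expect an HTTP 200 with a small status payload if the host is reachable
curl -i "$LANGFUSE_HOST/api/public/health"
```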
Support
For observability-related issues:
- Check the Langfuse documentation
- Report issues with integration on the Fraim GitHub repository