Integration Overview
Integrate your application with Langfuse to explore production traces and metrics.
Objectives:
- Capture traces of your application
- Add scores to these traces to measure and evaluate the quality of outputs
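For example, both objectives can be met with the low-level Python SDK by creating a trace and attaching a score to it. This is a minimal sketch, not a full setup: it assumes the `langfuse` Python package is installed and that `LANGFUSE_PUBLIC_KEY`, `LANGFUSE_SECRET_KEY`, and `LANGFUSE_HOST` are set in the environment; names such as `chat-request` and `user-feedback` are illustrative.

```python
from langfuse import Langfuse

# The client reads LANGFUSE_PUBLIC_KEY, LANGFUSE_SECRET_KEY, LANGFUSE_HOST from the environment
langfuse = Langfuse()

# 1. Capture a trace of one application run, with a nested LLM generation
trace = langfuse.trace(name="chat-request", user_id="user-123")
generation = trace.generation(
    name="llm-call",
    model="gpt-3.5-turbo",
    input=[{"role": "user", "content": "Hello"}],
)
generation.end(output="Hi there!")

# 2. Score the trace to evaluate output quality (e.g. based on user feedback)
trace.score(name="user-feedback", value=1, comment="thumbs up")

# Events are sent asynchronously; flush before the process exits
langfuse.flush()
```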
There are currently five main ways to integrate with Langfuse:
Main Integrations
| Integration | Supports | Description |
| --- | --- | --- |
| SDK | Python, JS/TS | Manual instrumentation using the SDKs for full flexibility. |
| OpenAI | Python | Automated instrumentation using a drop-in replacement of the OpenAI SDK. |
| Langchain | Python, JS/TS | Automated instrumentation by passing a callback handler to your Langchain application. |
| LlamaIndex | Python | Automated instrumentation via the LlamaIndex callback system. |
| API | | Directly call the public API. OpenAPI spec available. |
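To illustrate the OpenAI integration listed above: the drop-in replacement only changes the import, and the rest of the OpenAI code stays the same. This is a minimal sketch, assuming the `langfuse` package is installed and Langfuse/OpenAI credentials are configured via environment variables.

```python
# Instead of `import openai`, import the instrumented client from langfuse
from langfuse.openai import openai

# This call is traced automatically (model, prompt, completion, latency, token usage)
completion = openai.chat.completions.create(
    model="gpt-3.5-turbo",
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "What is Langfuse?"},
    ],
)
print(completion.choices[0].message.content)
```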
Packages integrated with Langfuse
| Name | Description |
| --- | --- |
| Flowise | JS/TS no-code builder for customized LLM flows. |
| Instructor | Library to get structured LLM outputs (JSON, Pydantic). |
| Langflow | Python-based UI for LangChain, designed with react-flow to provide an effortless way to experiment and prototype flows. |
| LiteLLM | Use any LLM as a drop-in replacement for GPT. Use Azure, OpenAI, Cohere, Anthropic, Ollama, vLLM, SageMaker, HuggingFace, Replicate (100+ LLMs). |
| Superagent | Open-source AI assistant framework & API for prototyping and deploying agents. |
| AI SDK by Vercel | TypeScript SDK that makes streaming LLM outputs easy. |
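As one example from the table above, LiteLLM can forward traces to Langfuse through its callback mechanism. This is a minimal sketch, assuming LiteLLM's `success_callback` hook and that Langfuse and provider credentials are set as environment variables.

```python
import litellm
from litellm import completion

# Ask LiteLLM to log successful calls to Langfuse
litellm.success_callback = ["langfuse"]

# The same pattern works for any of the 100+ supported models/providers
response = completion(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": "Summarize Langfuse in one sentence."}],
)
print(response.choices[0].message.content)
```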
Unsure which integration to choose? Ask us on Discord or in the chat.