Langchain Integration
Integrate with Langfuse in seconds using the new Langchain Integration
For teams building their LLM apps with Langchain, adopting Langfuse just got easier. We added a CallbackHandler to the Langfuse Python SDK that natively integrates with Langchain Callbacks.
🪢 + 🦜🔗 → 🤝
pip install langfuse
# Initialize Langfuse handler
from langfuse.callback import CallbackHandler
langfuse_handler = CallbackHandler(
secret_key="sk-lf-...",
public_key="pk-lf-...",
host="https://cloud.langfuse.com", # πͺπΊ EU region
# host="https://us.cloud.langfuse.com", # πΊπΈ US region
)
# Your Langchain code
# Add Langfuse handler as callback (classic and LCEL)
chain.invoke({"input": "<user_input>"}, config={"callbacks": [langfuse_handler]})
Also works for the run and predict methods.
chain.run(input="<user_input>", callbacks=[langfuse_handler])
conversation.predict(input="<user_input>", callbacks=[langfuse_handler])
From the Langchain integration docs
Which actions are tracked?
The Langfuse CallbackHandler tracks the following actions when using Langchain:
- Chains: on_chain_start, on_chain_end, on_chain_error
- Agents: on_agent_start, on_agent_action, on_agent_finish, on_agent_end
- Tools: on_tool_start, on_tool_end, on_tool_error
- Retriever: on_retriever_start, on_retriever_end
- ChatModel: on_chat_model_start
- LLM: on_llm_start, on_llm_end, on_llm_error
All actions are automatically nested based on the call tree and include inputs, outputs, model configurations, token counts, latencies and errors.
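For example, a simple LCEL chain that pipes a prompt into a chat model results in a trace where the chain run is the root and the chat model call is nested inside it. A minimal sketch (assuming classic Langchain import paths and an OpenAI API key in the environment; the prompt and chain below are illustrative, not taken from the docs):

from langchain.chat_models import ChatOpenAI
from langchain.prompts import ChatPromptTemplate

prompt = ChatPromptTemplate.from_template("Tell me a joke about {topic}")
model = ChatOpenAI()
chain = prompt | model

# on_chain_start/on_chain_end fire for the chain and on_chat_model_start for the model call;
# both show up as nested observations within the same Langfuse trace.
chain.invoke({"topic": "bears"}, config={"callbacks": [langfuse_handler]})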
What does it look like in Langfuse?
Demo of the debug view in Langfuse:
You can find the code for these examples in the Python Cookbook.
About Langfuse
Langfuse is an open source product analytics platform for LLM applications. Teams use it to track and analyze their LLM apps in production with regard to quality, cost and latency across product releases and use cases. In addition, the Langfuse Debug UI helps visualize the control flow of LLM apps in production. Read our launch post if you want to learn more.
Next steps
- Read the Langchain integration docs for more details and examples to get started.
- Not (exclusively) using Langchain in production? Follow the quickstart to get started with the TypeScript and Python SDKs that allow you to integrate with your custom LLM app.
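As a rough illustration of the SDK-based approach, here is a sketch using the Python SDK's trace and generation objects (the exact argument style has varied across SDK versions; treat the names and values below as placeholders and follow the quickstart for the authoritative interface):

from langfuse import Langfuse

langfuse = Langfuse(
    public_key="pk-lf-...",
    secret_key="sk-lf-...",
    host="https://cloud.langfuse.com",
)

# One trace per user request; nest an LLM call as a generation.
trace = langfuse.trace(name="my-custom-app")
generation = trace.generation(
    name="llm-call",
    model="gpt-3.5-turbo",
    input=[{"role": "user", "content": "<user_input>"}],
)
# ... call your LLM provider here ...
generation.end(output="<llm_output>")

langfuse.flush()  # send buffered events before the process exits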
Questions / Feedback?
We are happy to hear from you! If you have questions or feature requests, open an issue on GitHub, join the Langfuse Discord, or contact us via Twitter or email: hi@langfuse.com