# Tracing LLM Calls in Perplexica
Perplexica supports tracing all LangChain and LangGraph LLM calls for debugging, analytics, and prompt transparency. You can use either Langfuse (self-hosted, private, or cloud) or LangSmith (cloud, by LangChain) for tracing.
## Langfuse Tracing (Recommended for Private/Self-Hosted)
Langfuse is an open-source, self-hostable observability platform for LLM applications. It allows you to trace prompts, completions, and tool calls privately—no data leaves your infrastructure if you self-host.
### Setup

1. **Deploy Langfuse**
   - See: Langfuse Self-Hosting Guide
   - You can also use Langfuse Cloud if you prefer.

2. **Configure Environment Variables**
   - Add the following to your environment variables in docker-compose or your deployment environment:

     ```
     LANGFUSE_PUBLIC_KEY=your-public-key
     LANGFUSE_SECRET_KEY=your-secret-key
     LANGFUSE_BASE_URL=https://your-langfuse-instance.com
     ```

   - These are required for the tracing integration to work. If they are not set, tracing is disabled gracefully.

3. **Run Perplexica**
   - All LLM and agent calls will be traced automatically. You can view traces in your Langfuse dashboard.
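The "disabled gracefully" behavior can be sketched as follows. This is a hypothetical helper for illustration only (`isTracingConfigured` is not part of Perplexica's codebase; the app's actual wiring may differ):

```typescript
// Sketch: enable Langfuse tracing only when the required keys are present.
// isTracingConfigured is a hypothetical helper, not Perplexica's actual API.
function isTracingConfigured(env: Record<string, string | undefined>): boolean {
  // Both keys are required; LANGFUSE_BASE_URL may fall back to a default.
  return Boolean(env.LANGFUSE_PUBLIC_KEY && env.LANGFUSE_SECRET_KEY);
}

// With both keys set, tracing is enabled:
console.log(isTracingConfigured({
  LANGFUSE_PUBLIC_KEY: "pk-...",
  LANGFUSE_SECRET_KEY: "sk-...",
})); // → true

// With a key missing, tracing is silently skipped rather than throwing:
console.log(isTracingConfigured({ LANGFUSE_PUBLIC_KEY: "pk-..." })); // → false
```

The point of the guard is that a deployment without tracing keys still runs normally; the tracing callback is simply never attached.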
## LangSmith Tracing (Cloud, by LangChain)
Perplexica also supports tracing via LangSmith, the official observability platform by LangChain.
- To enable LangSmith, follow the official guide: LangSmith Observability Docs
- Set the required environment variables as described in their documentation.
LangSmith is a managed cloud service.
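For reference, LangSmith tracing in LangChain applications is typically switched on with environment variables along these lines (a sketch; check the LangSmith documentation for the current, authoritative variable names):

```
LANGCHAIN_TRACING_V2=true
LANGCHAIN_API_KEY=your-langsmith-api-key
LANGCHAIN_PROJECT=perplexica   # optional: groups traces under a project name
```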
For more details on tracing, see the Langfuse and LangSmith documentation.