# Tracing LLM Calls in Perplexica

Perplexica supports tracing all LangChain and LangGraph LLM calls for debugging, analytics, and prompt transparency. You can use either Langfuse (self-hosted, private, or cloud) or LangSmith (cloud, by LangChain) for tracing.

## Langfuse Tracing (Self-Hosted or Cloud)

Langfuse is an open-source, self-hostable observability platform for LLM applications. It lets you trace prompts, completions, and tool calls privately: if you self-host, no data leaves your infrastructure.

### Setup

1. **Deploy Langfuse.**

2. **Configure environment variables.** Add the following to your environment, e.g. in your `docker-compose` file or deployment environment:

   ```env
   LANGFUSE_PUBLIC_KEY=your-public-key
   LANGFUSE_SECRET_KEY=your-secret-key
   LANGFUSE_BASE_URL=https://your-langfuse-instance.com
   ```

   These variables are required for the tracing integration. If they are not set, tracing is disabled gracefully.

3. **Run Perplexica.** All LLM and agent calls are traced automatically, and you can view the traces in your Langfuse dashboard.
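For Docker deployments, the variables from step 2 can be passed through Compose. A minimal sketch, assuming the application service is named `perplexica` (match the service name in your own compose file):

```yaml
services:
  perplexica:   # service name assumed; use whatever your compose file defines
    environment:
      - LANGFUSE_PUBLIC_KEY=your-public-key
      - LANGFUSE_SECRET_KEY=your-secret-key
      - LANGFUSE_BASE_URL=https://your-langfuse-instance.com
```

An `env_file:` entry pointing at a `.env` file works just as well and keeps secrets out of the compose file.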

## LangSmith Tracing (Cloud by LangChain)

Perplexica also supports tracing via LangSmith, the official observability platform by LangChain.

- To enable LangSmith, follow the official guide: LangSmith Observability Docs.
- Set the required environment variables as described in their documentation.
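As a starting point, LangSmith tracing in LangChain applications is typically enabled with variables like the following (names may change between releases, so confirm against the current LangSmith docs; the project name here is only an example):

```env
LANGCHAIN_TRACING_V2=true
LANGCHAIN_API_KEY=your-langsmith-api-key
LANGCHAIN_PROJECT=perplexica
```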

Note that LangSmith is a managed cloud service: trace data is sent to LangChain's servers.


For more details on tracing, see the Langfuse and LangSmith documentation.