feat(tracing): Integrate Langfuse for LLM call tracing and add documentation

parent 9670003970
commit 7b4a7a531e
17 changed files with 183 additions and 36 deletions

README.md (23 changes)

@@ -50,10 +50,6 @@ Want to know more about its architecture and how it works? You can read it [here
- **All Mode:** Searches the entire web to find the best results.
- **Local Research Mode:** Research and interact with local files with citations.
- **Chat Mode:** Have a truly creative conversation without web search.
- **Academic Search Mode:** Finds articles and papers, ideal for academic research.
- **YouTube Search Mode:** Finds YouTube videos based on the search query.
- **Wolfram Alpha Search Mode:** Answers queries that need calculations or data analysis using Wolfram Alpha.
- **Reddit Search Mode:** Searches Reddit for discussions and opinions related to the query.
- **Current Information:** Some search tools might give you outdated info because they use data from crawling bots, convert it into embeddings, and store it in an index. Unlike them, Perplexica uses SearxNG, a metasearch engine, to get the results, rerank them, and pick the most relevant sources, ensuring you always get the latest information without the overhead of daily data updates.
- **API**: Integrate Perplexica into your existing applications and make use of its capabilities.

@@ -178,22 +174,6 @@ When running Perplexica behind a reverse proxy (like Nginx, Apache, or Traefik),

This ensures that OpenSearch descriptions, browser integrations, and all URLs work properly when accessing Perplexica through your reverse proxy.

## One-Click Deployment

[](https://usw.sealos.io/?openapp=system-template%3FtemplateName%3Dperplexica)
[](https://repocloud.io/details/?app_id=267)
[](https://template.run.claw.cloud/?referralCode=U11MRQ8U9RM4&openapp=system-fastdeploy%3FtemplateName%3Dperplexica)

## Upcoming Features

- [x] Add settings page
- [x] Adding support for local LLMs
- [x] History Saving features
- [x] Introducing various Focus Modes
- [x] Adding API support
- [x] Adding Discover
- [ ] Finalizing Copilot Mode

## Fork Improvements

This fork adds several enhancements to the original Perplexica project:

@@ -250,6 +230,9 @@ This fork adds several enhancements to the original Perplexica project:
 - Real-time preview system to test widget output before saving
 - Automatic refresh of stale widgets when navigating to dashboard

+- ✅ **Observability**: Built-in support for tracing and monitoring LLM calls using Langfuse or LangSmith.
+  - See [Tracing LLM Calls in Perplexica](docs/installation/TRACING.md) for more details.

 ### Bug Fixes

 - ✅ Improved history rewriting

docs/installation/TRACING.md (new file, 42 lines)

@@ -0,0 +1,42 @@
# Tracing LLM Calls in Perplexica

Perplexica supports tracing all LangChain and LangGraph LLM calls for debugging, analytics, and prompt transparency. You can use either Langfuse (self-hosted, private, or cloud) or LangSmith (cloud, by LangChain) for tracing.

## Langfuse Tracing (Recommended for Private/Self-Hosted)

Langfuse is an open-source, self-hostable observability platform for LLM applications. It allows you to trace prompts, completions, and tool calls **privately**: no data leaves your infrastructure if you self-host.

### Setup

1. **Deploy Langfuse**
   - See: [Langfuse Self-Hosting Guide](https://langfuse.com/docs/self-hosting)
   - You can also use Langfuse Cloud if you prefer.

2. **Configure Environment Variables**
   - Add the following to your environment variables in docker-compose or your deployment environment:

     ```env
     LANGFUSE_PUBLIC_KEY=your-public-key
     LANGFUSE_SECRET_KEY=your-secret-key
     LANGFUSE_BASE_URL=https://your-langfuse-instance.com
     ```

   - These are required for the tracing integration to work. If they are not set, tracing is disabled gracefully.

3. **Run Perplexica**
   - All LLM and agent calls will be traced automatically. You can view traces in your Langfuse dashboard.
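
Under the hood, Perplexica attaches the Langfuse callback handler through a small helper, `getLangfuseCallbacks()` (see `src/lib/tracing/langfuse.ts`). A minimal sketch of the pattern, simplified from the actual call sites in this commit; the model choice here is illustrative only:

```ts
import { ChatOpenAI } from '@langchain/openai';
import { getLangfuseCallbacks } from '@/lib/tracing/langfuse';

// Illustrative model; any LangChain chat model works the same way.
const llm = new ChatOpenAI({ model: 'gpt-4o-mini' });

export async function tracedInvoke(prompt: string): Promise<string> {
  // getLangfuseCallbacks() returns { callbacks: [handler] } when the
  // LANGFUSE_* variables are set, and {} otherwise, so the same call
  // works whether tracing is enabled or not.
  const result = await llm.invoke(prompt, {
    ...getLangfuseCallbacks(),
  });
  return result.content as string;
}
```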

## LangSmith Tracing (Cloud by LangChain)

Perplexica also supports tracing via [LangSmith](https://smith.langchain.com/), the official observability platform by LangChain.

- To enable LangSmith, follow the official guide: [LangSmith Observability Docs](https://docs.smith.langchain.com/observability)
- Set the required environment variables as described in their documentation (a sketch follows below).

**Note:** LangSmith is a managed cloud service, so trace data is sent to LangChain's hosted platform.
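
As a rough sketch of what that configuration typically looks like (variable names per the LangSmith documentation at the time of writing; verify against the current docs before relying on them):

```env
LANGCHAIN_TRACING_V2=true
LANGCHAIN_API_KEY=your-langsmith-api-key
# Optional: group traces under a named project
LANGCHAIN_PROJECT=perplexica
```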

---

For more details on tracing, see the respective documentation:

- [Langfuse Documentation](https://langfuse.com/docs)
- [LangSmith Observability](https://docs.smith.langchain.com/observability)

package-lock.json (generated, 41 changes)

@@ -35,6 +35,7 @@
   "jsdom": "^26.1.0",
   "jspdf": "^3.0.1",
   "langchain": "^0.3.26",
+  "langfuse-langchain": "^3.38.4",
   "lucide-react": "^0.525.0",
   "luxon": "^3.7.1",
   "mammoth": "^1.9.1",
@@ -8855,6 +8856,46 @@
        }
      }
    },
+    "node_modules/langfuse": {
+      "version": "3.38.4",
+      "resolved": "https://registry.npmjs.org/langfuse/-/langfuse-3.38.4.tgz",
+      "integrity": "sha512-2UqMeHLl3DGNX1Nh/cO4jGhk7TzDJ6gjQLlyS9rwFCKVO81xot6b58yeTsTB5YrWupWsOxQtMNoQYIQGOUlH9Q==",
+      "license": "MIT",
+      "dependencies": {
+        "langfuse-core": "^3.38.4"
+      },
+      "engines": {
+        "node": ">=18"
+      }
+    },
+    "node_modules/langfuse-core": {
+      "version": "3.38.4",
+      "resolved": "https://registry.npmjs.org/langfuse-core/-/langfuse-core-3.38.4.tgz",
+      "integrity": "sha512-onTAqcEGhoXuBgqDFXe2t+bt9Vi+5YChRgdz3voM49JKoHwtVZQiUdqTfjSivGR75eSbYoiaIL8IRoio+jaqwg==",
+      "license": "MIT",
+      "dependencies": {
+        "mustache": "^4.2.0"
+      },
+      "engines": {
+        "node": ">=18"
+      }
+    },
+    "node_modules/langfuse-langchain": {
+      "version": "3.38.4",
+      "resolved": "https://registry.npmjs.org/langfuse-langchain/-/langfuse-langchain-3.38.4.tgz",
+      "integrity": "sha512-7HJqouMrVOP9MFdu33M4G4uBFyQAIh/DqGYALfs41xqm7t99eZxKcTvt4rYZy67iQAhd58TG3q8+9haGzuLbOA==",
+      "license": "MIT",
+      "dependencies": {
+        "langfuse": "^3.38.4",
+        "langfuse-core": "^3.38.4"
+      },
+      "engines": {
+        "node": ">=18"
+      },
+      "peerDependencies": {
+        "langchain": ">=0.0.157 <0.4.0"
+      }
+    },
    "node_modules/langsmith": {
      "version": "0.3.55",
      "resolved": "https://registry.npmjs.org/langsmith/-/langsmith-0.3.55.tgz",

package.json

@@ -39,6 +39,7 @@
   "jsdom": "^26.1.0",
   "jspdf": "^3.0.1",
   "langchain": "^0.3.26",
+  "langfuse-langchain": "^3.38.4",
   "lucide-react": "^0.525.0",
   "luxon": "^3.7.1",
   "mammoth": "^1.9.1",

@@ -16,6 +16,7 @@ import { allTools } from '@/lib/tools';
 import { Source } from '@/lib/types/widget';
 import { WidgetProcessRequest } from '@/lib/types/api';
 import axios from 'axios';
+import { getLangfuseCallbacks } from '@/lib/tracing/langfuse';

 // Helper function to fetch content from a single source
 async function fetchSourceContent(

@@ -149,6 +150,7 @@ async function processWithLLM(
   },
   {
     recursionLimit: 15, // Limit recursion depth to prevent infinite loops
+    ...getLangfuseCallbacks(),
   },
 );

@@ -17,6 +17,7 @@ import { ChatOpenAI } from '@langchain/openai';
 import { ChatOllama } from '@langchain/ollama';
 import { z } from 'zod';
 import { withStructuredOutput } from '@/lib/utils/structuredOutput';
+import { getLangfuseCallbacks } from '@/lib/tracing/langfuse';

 interface FileRes {
   fileName: string;

@@ -71,7 +72,9 @@ Generate topics that describe what this document is about, its domain, and key s
   name: 'generate_topics',
 });

-const result = await structuredLlm.invoke(prompt);
+const result = await structuredLlm.invoke(prompt, {
+  ...getLangfuseCallbacks(),
+});
 console.log('Generated topics:', result.topics);
 // Filename is included for context
 return filename + ', ' + result.topics.join(', ');

src/app/layout.tsx

@@ -25,7 +25,7 @@ export default function RootLayout({
   children: React.ReactNode;
 }>) {
   return (
     <html className="h-full" lang="en" suppressHydrationWarning>
       <head>
         <link
           rel="search"

@@ -10,6 +10,7 @@ import LineOutputParser from '../outputParsers/lineOutputParser';
 import { searchSearxng } from '../searxng';
 import { formatDateForLLM } from '../utils';
 import type { BaseChatModel } from '@langchain/core/language_models/chat_models';
+import { getLangfuseCallbacks } from '@/lib/tracing/langfuse';

 const imageSearchChainPrompt = `
 # Instructions

@@ -140,7 +141,7 @@ const handleImageSearch = (
   systemInstructions?: string,
 ) => {
   const imageSearchChain = createImageSearchChain(llm, systemInstructions);
-  return imageSearchChain.invoke(input);
+  return imageSearchChain.invoke(input, { ...getLangfuseCallbacks() });
 };

 export default handleImageSearch;

@@ -5,6 +5,7 @@ import formatChatHistoryAsString from '../utils/formatHistory';
 import { BaseMessage } from '@langchain/core/messages';
 import { BaseChatModel } from '@langchain/core/language_models/chat_models';
 import { ChatOpenAI } from '@langchain/openai';
+import { getLangfuseCallbacks } from '@/lib/tracing/langfuse';

 const suggestionGeneratorPrompt = `
 You are an AI suggestion generator for an AI powered search engine.

@@ -74,7 +75,9 @@ const generateSuggestions = (
   llm,
   systemInstructions,
 );
-return suggestionGeneratorChain.invoke(input);
+return suggestionGeneratorChain.invoke(input, {
+  ...getLangfuseCallbacks(),
+});
 };

 export default generateSuggestions;

@@ -10,6 +10,7 @@ import LineOutputParser from '../outputParsers/lineOutputParser';
 import { searchSearxng } from '../searxng';
 import { formatDateForLLM } from '../utils';
 import type { BaseChatModel } from '@langchain/core/language_models/chat_models';
+import { getLangfuseCallbacks } from '@/lib/tracing/langfuse';

 const VideoSearchChainPrompt = `
 # Instructions

@@ -147,7 +148,7 @@ const handleVideoSearch = (
   systemInstructions?: string,
 ) => {
   const VideoSearchChain = createVideoSearchChain(llm, systemInstructions);
-  return VideoSearchChain.invoke(input);
+  return VideoSearchChain.invoke(input, { ...getLangfuseCallbacks() });
 };

 export default handleVideoSearch;

@@ -19,6 +19,7 @@ import {
 import { formatDateForLLM } from '../utils';
 import { getModelName } from '../utils/modelUtils';
 import { removeThinkingBlocks } from '../utils/contentUtils';
+import { getLangfuseCallbacks } from '@/lib/tracing/langfuse';

 /**
  * Normalize usage metadata from different LLM providers

@@ -511,12 +512,14 @@ Use all available tools strategically to provide comprehensive, well-researched,
   },
   recursionLimit: 25, // Allow sufficient iterations for tool use
   signal: this.signal,
+  ...getLangfuseCallbacks(),
 };

 // Use streamEvents to capture both tool calls and token-level streaming
 const eventStream = agent.streamEvents(initialState, {
   ...config,
   version: 'v2',
+  ...getLangfuseCallbacks(),
 });

 let finalResult: any = null;

@@ -23,6 +23,7 @@ import { formatDateForLLM } from '../utils';
 import { getDocumentsFromLinks } from '../utils/documents';
 import formatChatHistoryAsString from '../utils/formatHistory';
 import { getModelName } from '../utils/modelUtils';
+import { getLangfuseCallbacks } from '@/lib/tracing/langfuse';

 export interface SpeedSearchAgentType {
   searchAndAnswer: (

@@ -103,7 +104,7 @@ class SpeedSearchAgent implements SpeedSearchAgentType {

 this.emitProgress(emitter, 10, `Building search query`);

 return RunnableSequence.from([
   PromptTemplate.fromTemplate(this.config.queryGeneratorPrompt),
   llm,
   this.strParser,

@@ -235,8 +236,8 @@ class SpeedSearchAgent implements SpeedSearchAgentType {
 </text>

 Make sure to answer the query in the summary.
 `,
-  { signal },
+  { signal, ...getLangfuseCallbacks() },
 );

 const document = new Document({

@@ -348,7 +349,7 @@ class SpeedSearchAgent implements SpeedSearchAgentType {
   date,
   systemInstructions,
 },
-  { signal: options?.signal },
+  { signal: options?.signal, ...getLangfuseCallbacks() },
 );

 query = searchRetrieverResult.query;

@@ -379,6 +380,7 @@ class SpeedSearchAgent implements SpeedSearchAgentType {
   )
   .withConfig({
     runName: 'FinalSourceRetriever',
+    ...getLangfuseCallbacks(),
   })
   .pipe(this.processDocs),
 }),

@@ -391,6 +393,7 @@ class SpeedSearchAgent implements SpeedSearchAgentType {
   this.strParser,
 ]).withConfig({
   runName: 'FinalResponseGenerator',
+  ...getLangfuseCallbacks(),
 });
 }

@@ -539,7 +542,7 @@ ${docs[index].metadata?.url.toLowerCase().includes('file') ? '' : '\n<url>' + do
   personaInstructions,
 );

 const stream = answeringChain.streamEvents(
   {
     chat_history: history,
     query: message,

@@ -547,7 +550,8 @@ ${docs[index].metadata?.url.toLowerCase().includes('file') ? '' : '\n<url>' + do
 {
   version: 'v1',
   // Pass the abort signal to the LLM streaming chain
   signal,
+  ...getLangfuseCallbacks(),
 },
 );

@@ -7,6 +7,7 @@ import { removeThinkingBlocks } from '@/lib/utils/contentUtils';
 import { Command, getCurrentTaskInput } from '@langchain/langgraph';
 import { SimplifiedAgentStateType } from '@/lib/state/chatAgentState';
 import { ToolMessage } from '@langchain/core/messages';
+import { getLangfuseCallbacks } from '@/lib/tracing/langfuse';

 // Schema for URL summarization tool input
 const URLSummarizationToolSchema = z.object({

@@ -144,6 +145,7 @@ Provide a comprehensive summary of the above web page content, focusing on infor

 const result = await llm.invoke(summarizationPrompt, {
   signal: config?.signal,
+  ...getLangfuseCallbacks(),
 });

 finalContent = removeThinkingBlocks(result.content as string);

src/lib/tracing/langfuse.ts (new file, 34 lines)

@@ -0,0 +1,34 @@
// Centralized Langfuse tracing utility
// Provides a singleton CallbackHandler and a helper to attach callbacks

import type { Callbacks } from '@langchain/core/callbacks/manager';
import { CallbackHandler } from 'langfuse-langchain';

let handler: CallbackHandler | null = null;

export function getLangfuseHandler(): CallbackHandler | null {
  // Only initialize on server
  if (typeof window !== 'undefined') return null;

  if (handler) return handler;

  try {
    // The handler reads LANGFUSE_* env vars by default. You can also pass keys here if desired.
    handler = new CallbackHandler({
      publicKey: process.env.LANGFUSE_PUBLIC_KEY,
      secretKey: process.env.LANGFUSE_SECRET_KEY,
      baseUrl: process.env.LANGFUSE_BASE_URL,
    });
  } catch (e) {
    // If initialization fails (e.g., missing envs), disable tracing gracefully
    handler = null;
  }

  return handler;
}

// Convenience helper to spread into LangChain invoke/config objects
export function getLangfuseCallbacks(): { callbacks?: Callbacks } {
  const h = getLangfuseHandler();
  return h ? { callbacks: [h] } : {};
}
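
One operational caveat, as an assumption based on the Langfuse SDK documentation rather than anything this commit configures: the handler batches trace events in the background, so short-lived or serverless processes can exit before traces are delivered. A hypothetical shutdown hook using the handler's `flushAsync()`:

```ts
import { getLangfuseHandler } from '@/lib/tracing/langfuse';

// Hypothetical helper: flush buffered trace events before the process exits.
export async function flushTraces(): Promise<void> {
  const handler = getLangfuseHandler();
  if (handler) {
    // flushAsync() is documented on the Langfuse CallbackHandler for
    // short-lived environments.
    await handler.flushAsync();
  }
}
```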

@@ -5,6 +5,7 @@ import { formatDateForLLM } from '../utils';
 import { ChatOpenAI, OpenAIClient } from '@langchain/openai';
 import { removeThinkingBlocks } from './contentUtils';
 import { withStructuredOutput } from './structuredOutput';
+import { getLangfuseCallbacks } from '@/lib/tracing/langfuse';

 export type PreviewAnalysisResult = {
   isSufficient: boolean;

@@ -81,7 +82,7 @@ Snippet: ${content.snippet}
   name: 'analyze_preview_content',
 });

 const analysisResult = await structuredLLM.invoke(
   `You are a preview content analyzer, tasked with determining if search result snippets contain sufficient information to answer the Task Query.

 # Instructions

@@ -118,7 +119,7 @@ ${taskQuery}
 # Search Result Previews to Analyze:
 ${formattedPreviewContent}
 `,
-  { signal },
+  { signal, ...getLangfuseCallbacks() },
 );

 if (!analysisResult) {

@@ -6,6 +6,7 @@ import { getWebContent } from './documents';
 import { removeThinkingBlocks } from './contentUtils';
 import { setTemperature } from './modelUtils';
 import { withStructuredOutput } from './structuredOutput';
+import { getLangfuseCallbacks } from '@/lib/tracing/langfuse';

 export type SummarizeResult = {
   document: Document | null;

@@ -95,7 +96,7 @@ Here is the query you need to answer: ${query}

 Here is the content to analyze:
 ${contentToAnalyze}`,
-  { signal },
+  { signal, ...getLangfuseCallbacks() },
 );

 if (!relevanceResult) {

@@ -168,7 +169,10 @@ Here is the query you need to answer: ${query}
 Here is the content to summarize:
 ${i === 0 ? content.metadata.html : content.pageContent}`;

-const result = await llm.invoke(prompt, { signal });
+const result = await llm.invoke(prompt, {
+  signal,
+  ...getLangfuseCallbacks(),
+});
 summary = removeThinkingBlocks(result.content as string);
 break;
 } catch (error) {

yarn.lock (24 changes)

@@ -3546,7 +3546,7 @@ kuler@^2.0.0:
   resolved "https://registry.npmjs.org/kuler/-/kuler-2.0.0.tgz"
   integrity sha512-Xq9nH7KlWZmXAtodXDDRE7vs6DU1gTU8zYDHDiWLSip45Egwq3plLHzPn27NgvzL2r1LMPC1vdqh98sQxtqj4A==

-langchain@^0.3.26, "langchain@>=0.2.3 <0.3.0 || >=0.3.4 <0.4.0":
+langchain@^0.3.26, "langchain@>=0.0.157 <0.4.0", "langchain@>=0.2.3 <0.3.0 || >=0.3.4 <0.4.0":
   version "0.3.30"
   resolved "https://registry.npmjs.org/langchain/-/langchain-0.3.30.tgz"
   integrity sha512-UyVsfwHDpHbrnWrjWuhJHqi8Non+Zcsf2kdpDTqyJF8NXrHBOpjdHT5LvPuW9fnE7miDTWf5mLcrWAGZgcrznQ==

@@ -3563,6 +3563,28 @@ langchain@^0.3.26, "langchain@>=0.0.157 <0.4.0", "langchain@>=0.2.3 <0.3.0 || >=0.3.4 <0.4.0":
   yaml "^2.2.1"
   zod "^3.25.32"

+langfuse-core@^3.38.4:
+  version "3.38.4"
+  resolved "https://registry.npmjs.org/langfuse-core/-/langfuse-core-3.38.4.tgz"
+  integrity sha512-onTAqcEGhoXuBgqDFXe2t+bt9Vi+5YChRgdz3voM49JKoHwtVZQiUdqTfjSivGR75eSbYoiaIL8IRoio+jaqwg==
+  dependencies:
+    mustache "^4.2.0"
+
+langfuse-langchain@^3.38.4:
+  version "3.38.4"
+  resolved "https://registry.npmjs.org/langfuse-langchain/-/langfuse-langchain-3.38.4.tgz"
+  integrity sha512-7HJqouMrVOP9MFdu33M4G4uBFyQAIh/DqGYALfs41xqm7t99eZxKcTvt4rYZy67iQAhd58TG3q8+9haGzuLbOA==
+  dependencies:
+    langfuse "^3.38.4"
+    langfuse-core "^3.38.4"
+
+langfuse@^3.38.4:
+  version "3.38.4"
+  resolved "https://registry.npmjs.org/langfuse/-/langfuse-3.38.4.tgz"
+  integrity sha512-2UqMeHLl3DGNX1Nh/cO4jGhk7TzDJ6gjQLlyS9rwFCKVO81xot6b58yeTsTB5YrWupWsOxQtMNoQYIQGOUlH9Q==
+  dependencies:
+    langfuse-core "^3.38.4"
+
 langsmith@^0.3.33, langsmith@^0.3.46:
   version "0.3.55"
   resolved "https://registry.npmjs.org/langsmith/-/langsmith-0.3.55.tgz"