If you are using Python, JavaScript, or another Langfuse-compatible language, you can submit LLM traces to Darkraven with the Langfuse SDK.
Full compatibility reference:
- Langfuse Python SDK documentation
- Langfuse JS/TS SDK documentation
Installation
The installation steps below are taken from the Langfuse documentation.
Python:
pip install langfuse

JavaScript/TypeScript:
npm i langfuse
# or
yarn add langfuse

# Node.js < 18
npm i langfuse-node

# Deno
import { Langfuse } from "https://esm.sh/langfuse"
Setup
Create an API key in the Darkraven settings.
Add the following environment variables:
LANGFUSE_SECRET_KEY: your Darkraven API key
LANGFUSE_PUBLIC_KEY: your Darkraven API key
LANGFUSE_HOST: https://api.darkraven.ai/api/dify
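The environment variables can also be set from code, and the Langfuse Python client accepts the same values as constructor arguments. A minimal sketch, assuming the placeholder strings are replaced with your actual Darkraven API key:

import os

# Option 1: set the environment variables from code, before importing the
# Langfuse integrations that read them (e.g. langfuse.openai).
os.environ["LANGFUSE_SECRET_KEY"] = "your-darkraven-api-key"   # placeholder
os.environ["LANGFUSE_PUBLIC_KEY"] = "your-darkraven-api-key"   # placeholder
os.environ["LANGFUSE_HOST"] = "https://api.darkraven.ai/api/dify"

# Option 2: pass the same values directly to the low-level client.
from langfuse import Langfuse

langfuse = Langfuse(
    secret_key="your-darkraven-api-key",
    public_key="your-darkraven-api-key",
    host="https://api.darkraven.ai/api/dify",
)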
Usage Example
from langfuse.decorators import observe
from langfuse.openai import openai  # OpenAI integration

@observe()
def story():
    return openai.chat.completions.create(
        model="gpt-3.5-turbo",
        max_tokens=100,
        messages=[
            {"role": "system", "content": "You are a great storyteller."},
            {"role": "user", "content": "Once upon a time in a galaxy far, far away..."},
        ],
    ).choices[0].message.content

@observe()
def main():
    return story()

main()
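Traced data is sent in the background, so in short-lived scripts it can help to flush the SDK's queue before the process exits. A minimal sketch using the decorator integration's context helper:

from langfuse.decorators import langfuse_context

# Block until all queued events have been delivered to Darkraven.
langfuse_context.flush()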
import { Langfuse } from "langfuse"; // or "langfuse-node"
// without additional options
const langfuse = new Langfuse();
// with additional options
const langfuse = new Langfuse({
release: "v1.0.0",
requestTimeout: 10000,
});
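Both options are optional: in the Langfuse JS SDK, release tags outgoing traces with an application version, and requestTimeout sets the HTTP request timeout in milliseconds.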