AI Open Source · SDKs & Dev Tools

Helicone/helicone

An open-source LLM observability platform: one line of code hooks in monitoring, evaluation, and experimentation for your API calls. It records prompts, responses, tokens, latency, and cost, and supports A/B testing and prompt version management. It works as a unified observability layer whether you are running large batches of model experiments in research or tracing agent behavior in production.

🧊 Open source LLM observability platform. One line of code to monitor, evaluate, and experiment. YC W23 🍓

Stars
5.7k
Language
TypeScript
License
Apache-2.0
Last push
3d ago
Created
2023-01-31
Topics
agent-monitoring, analytics, evaluation, gpt, langchain, large-language-models

README

<div align="center">
🔍 Observability · 🕸️ Agent Tracing · 🚂 LLM Routing
💰 Cost & Latency Tracking · 📚 Datasets & Fine-tuning · 🎛️ Automatic Fallbacks
</div>
<p align="center" style="margin: 0; padding: 0;">
  <img alt="helicone logo" src="https://marketing-assets-helicone.s3.us-west-2.amazonaws.com/Twitter_Cover_A1.png" style="display: block; margin: 0; padding: 0;">
</p>
<br/>
<p align="center">
  <a href='https://github.com/helicone/helicone/graphs/contributors'><img src='https://img.shields.io/github/contributors/helicone/helicone?style=flat-square' alt='Contributors' /></a>
  <a href='https://github.com/helicone/helicone/stargazers'><img alt="GitHub stars" src="https://img.shields.io/github/stars/helicone/helicone?style=flat-square"/></a>
  <a href='https://github.com/helicone/helicone/pulse'><img alt="GitHub commit activity" src="https://img.shields.io/github/commit-activity/m/helicone/helicone?style=flat-square"/></a>
  <a href='https://github.com/helicone/helicone/issues?q=is%3Aissue+is%3Aclosed'><img alt="GitHub closed issues" src="https://img.shields.io/github/issues-closed/helicone/helicone?style=flat-square"/></a>
  <a href='https://www.ycombinator.com/companies/helicone'><img alt="Y Combinator" src="https://img.shields.io/badge/Y%20Combinator-Helicone-orange?style=flat-square"/></a>
</p>
<p align="center">
  <a href="https://docs.helicone.ai/">Docs</a> •
  <a href="https://www.helicone.ai/changelog">Changelog</a> •
  <a href="https://github.com/helicone/helicone/issues">Bug reports</a> •
  <a href="https://helicone.ai/demo">See Helicone in Action! (Free)</a>
</p>

Helicone is an AI Gateway & LLM Observability Platform for AI Engineers

  • 🌐 AI Gateway: Access 100+ AI models with 1 API key through the OpenAI API with intelligent routing and automatic fallbacks. Get started in 2 minutes.
  • 🔌 Quick integration: One line of code to log all your requests from OpenAI, Anthropic, LangChain, Gemini, Vercel AI SDK, and more.
  • 📊 Observe: Inspect and debug traces & sessions for agents, chatbots, document processing pipelines, and more.
  • 📈 Analyze: Track metrics like cost, latency, and quality. Export to PostHog in one line for custom dashboards.
  • 🎮 Playground: Rapidly test and iterate on prompts, sessions, and traces in our UI.
  • 🧠 Prompt Management: Version prompts using production data. Deploy prompts through the AI Gateway without code changes. Your prompts remain under your control, always accessible.
  • 🎛️ Fine-tune: Fine-tune with one of our fine-tuning partners: OpenPipe or Autonomi (more coming soon).
  • 🛡️ Enterprise Ready: SOC 2 and GDPR compliant.
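To make the "one line of code" integration concrete, here is a minimal sketch of how request metadata reaches Helicone: auth and custom properties travel as HTTP headers on each proxied request. The `heliconeHeaders` helper is hypothetical (not part of any SDK), and header names follow Helicone's documented `Helicone-Auth` / `Helicone-Property-*` convention; verify the exact names for your provider at docs.helicone.ai.

```typescript
// Hypothetical helper: builds the extra headers Helicone reads off a
// proxied request. Custom properties let you slice cost/latency metrics
// per feature, user, or session in the dashboard.
function heliconeHeaders(
  apiKey: string,
  properties: Record<string, string> = {}
): Record<string, string> {
  const headers: Record<string, string> = {
    "Helicone-Auth": `Bearer ${apiKey}`,
  };
  for (const [key, value] of Object.entries(properties)) {
    headers[`Helicone-Property-${key}`] = value;
  }
  return headers;
}

const headers = heliconeHeaders("sk-helicone-example", { App: "chatbot" });
console.log(headers["Helicone-Auth"]); // "Bearer sk-helicone-example"
```

These headers are then passed through whatever HTTP client or SDK you already use, which is why the integration stays a one-line change at the client-construction site.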

🎁 Generous monthly free tier (10k requests/month) - No credit card required!

<img src="https://github.com/user-attachments/assets/e16332e9-d642-427e-b3ce-1a74a17f7b2c" alt="Open Sourced LLM Observability & AI Gateway Platform" width="600">

Quick Start ⚡️

  1. Get your API key by signing up at helicone.ai and add credits at helicone.ai/credits

  2. Update the baseURL in your code and add your API key.

    import OpenAI from "openai";
    
    const client = new OpenAI({
      baseURL: "https://ai-gateway.helicone.ai",
      apiKey: process.env.HELICONE_API_KEY,
    });
    
    const response = await client.chat.completions.create({
      model: "gpt-4o-mini",  // claude-sonnet-4, gemini-2.0-flash or any model from https://www.helicone.ai/models
      messages: [{ role: "user", content: "Hello!" }]
    });
    
  3. 🎉 You're all set! View your logs at Helicone and access 100+ models through one API.
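Because the gateway speaks the OpenAI chat-completions format, switching providers means changing only the `model` string on the same client; the request body shape stays identical. A minimal sketch of that idea (the `buildRequest` helper is illustrative, and the model IDs are examples; the full catalog is at https://www.helicone.ai/models):

```typescript
// A chat-completions request body is just a model name plus messages.
type ChatMessage = { role: "system" | "user" | "assistant"; content: string };

interface ChatRequest {
  model: string;
  messages: ChatMessage[];
}

// Hypothetical helper: the same prompt can target any model on the
// gateway by changing only the `model` string.
function buildRequest(model: string, prompt: string): ChatRequest {
  return { model, messages: [{ role: "user", content: prompt }] };
}

// Same request shape for an OpenAI, Anthropic, or Google model.
for (const model of ["gpt-4o-mini", "claude-sonnet-4", "gemini-2.0-flash"]) {
  console.log(JSON.stringify(buildRequest(model, "Hello!")));
}
```

Each of these payloads could be passed to `client.chat.completions.create(...)` from the Quick Start unchanged, which is what "100+ models through one API" amounts to in practice.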

Self-Hosting Open Source LLM Observability

Docker

Helicone is simple to self-host and update. To get started locally, just use our docker-compose file.

# Clone the repository
git clone https://github.com/Helicone/helicone.git
cd helicone/docker
cp .env.example .env

# Start the services
./helicone-compose.sh helicone up

Helm

For Enterprise workloads, we also have a production-ready Helm chart available. To access, contact us at enterprise@helicone.ai.
