Example implementations of various LLM providers using PostHog's AI SDKs. This repository demonstrates how to integrate multiple AI providers (Anthropic, OpenAI, Google Gemini) with PostHog for analytics tracking.
## Prerequisites

- Python 3.8 or higher
- pip package manager
- Node.js 16 or higher
- npm package manager
## Setup

1. Configure environment variables:

   ```sh
   cp .env.example .env
   ```

   Edit `.env` and add your API keys:

   - `ANTHROPIC_API_KEY`: Your Anthropic API key
   - `GEMINI_API_KEY`: Your Google Gemini API key
   - `OPENAI_API_KEY`: Your OpenAI API key
   - `POSTHOG_API_KEY`: Your PostHog API key
   - `POSTHOG_HOST`: PostHog host (defaults to https://app.posthog.com)
2. Run the application:

   For Python:

   ```sh
   cd python
   ./run.sh
   ```

   For Node.js:

   ```sh
   cd node
   ./run.sh
   ```
The `run.sh` script will automatically:
- Set up a virtual environment (Python) or install dependencies (Node)
- Install all required packages
- Start the interactive CLI
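As a sketch, a filled-in `.env` from step 1 might look like this (all values below are placeholders, not real keys):

```shell
# Placeholder values; substitute your real keys
ANTHROPIC_API_KEY=your-anthropic-key
GEMINI_API_KEY=your-gemini-key
OPENAI_API_KEY=your-openai-key
POSTHOG_API_KEY=your-posthog-project-api-key
POSTHOG_HOST=https://app.posthog.com
```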
## Features

- Chat Mode: Interactive conversation with the selected provider
- Tool Call Test: Automatically tests weather tool calling
- Message Test: Simple greeting test
- Image Test: Tests image description capabilities
- Embeddings Test: Tests embedding generation (OpenAI only)
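The README doesn't show the weather tool itself. As an illustrative sketch (the tool name, fields, and stub values here are assumptions, not the repository's actual definition), a weather tool in the common OpenAI function-calling format might look like:

```python
# Hypothetical weather tool definition in OpenAI's function-calling format.
# The actual tool used by the Tool Call Test may differ.
get_weather_tool = {
    "type": "function",
    "function": {
        "name": "get_weather",
        "description": "Get the current weather for a city",
        "parameters": {
            "type": "object",
            "properties": {
                "city": {"type": "string", "description": "City name, e.g. London"},
                "unit": {"type": "string", "enum": ["celsius", "fahrenheit"]},
            },
            "required": ["city"],
        },
    },
}

def handle_tool_call(name: str, arguments: dict) -> dict:
    """Dispatch a model-issued tool call to a local stub implementation."""
    if name == "get_weather":
        # Stubbed response; a real implementation would call a weather API.
        return {
            "city": arguments["city"],
            "temperature": 21,
            "unit": arguments.get("unit", "celsius"),
        }
    raise ValueError(f"Unknown tool: {name}")
```

A test run passes this definition to the provider, waits for the model to emit a tool call, answers it via `handle_tool_call`, and sends the result back for a final response.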
## Trace Generator

An interactive tool for creating complex nested LLM trace data for testing PostHog analytics. Features pre-built templates (simple chat, RAG pipeline, multi-agent) and a custom trace builder for creating arbitrarily complex structures.

```sh
cd python/trace-generator
./run.sh
```
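Nested traces are ordinary events linked by shared IDs. The sketch below builds a two-level RAG-style trace as plain dicts; the `$ai_trace_id`/`$ai_span_id`/`$ai_parent_id` property names follow PostHog's LLM observability schema as an assumption, so verify them against the PostHog docs before relying on this shape:

```python
import uuid
from typing import Optional

def make_event(event: str, trace_id: str, parent_id: Optional[str], **props) -> dict:
    """Build one LLM-analytics event dict. The $ai_* property names are an
    assumption based on PostHog's LLM observability schema, not taken from
    this repository's code."""
    properties = {"$ai_trace_id": trace_id, "$ai_span_id": str(uuid.uuid4()), **props}
    if parent_id is not None:
        properties["$ai_parent_id"] = parent_id
    return {"event": event, "properties": properties}

# A nested RAG-style trace: a retrieval span, plus a generation that points
# back to it via $ai_parent_id while sharing the same $ai_trace_id.
trace_id = str(uuid.uuid4())
retrieval = make_event("$ai_span", trace_id, None, name="retrieve_documents")
generation = make_event(
    "$ai_generation",
    trace_id,
    retrieval["properties"]["$ai_span_id"],
    **{"$ai_model": "gpt-4o-mini"},
)
```

Deeper structures (multi-agent, arbitrary nesting) follow the same pattern: every child event carries its parent's span ID.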
## Local SDK Development

If you're developing the PostHog SDKs locally, you can use local paths instead of published packages:
1. Set environment variables in your `.env`:

   ```sh
   # For local PostHog SDK development
   POSTHOG_PYTHON_PATH=/../posthog-python
   POSTHOG_JS_PATH=/../posthog-js
   ```

2. Run the application normally with:

   ```sh
   ./run.sh
   ```
The scripts will automatically detect and use your local SDK versions.
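The detection logic itself isn't shown in this README. One way such a script could choose between a local checkout and the published package (the helper name is hypothetical; the real `run.sh` may differ) is:

```python
import os

def posthog_install_target(env: dict) -> str:
    """Return a pip install target: an editable install of the local checkout
    when POSTHOG_PYTHON_PATH points at an existing directory, otherwise the
    published package. (Hypothetical helper, not the repository's actual code.)"""
    local_path = env.get("POSTHOG_PYTHON_PATH")
    if local_path and os.path.isdir(local_path):
        return f"-e {local_path}"  # editable install picks up local edits
    return "posthog"  # fall back to the published PyPI package
```

An editable (`pip install -e`) install means changes in your SDK checkout take effect without reinstalling.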
## License

MIT License - see the LICENSE file for details.
## Contributing

Contributions are welcome! Please feel free to submit a Pull Request.