feat(llma): add prompt management #417

Merged: Radu-Raicea merged 3 commits into PostHog/posthog-python:master from PostHog/posthog-python:feat/add-prompt-management on Jan 30, 2026.
Conversation

@Radu-Raicea (Member)

This adds the prompt management functions to the SDK.

The flow is to:

  1. Fetch a prompt template
  2. Compile the prompt with variables
  3. Use it in your LLM calls

There's SDK-side caching for the prompt, and there's also an optional fallback prompt. By using both, you can technically achieve 100% availability (see the short sketch after the example below).

Example:

from posthog import Posthog
from posthog.ai import Prompts
from posthog.ai.openai import OpenAI

# The personal API key is needed to fetch prompt templates.
posthog = Posthog(
    "phc_xxx",
    host="https://us.i.posthog.com",
    personal_api_key="phx_xxx",
)

prompts = Prompts(posthog)
openai = OpenAI(api_key="sk-xxx", posthog=posthog)

# 1. Fetch the prompt template; the fallback is used if it can't be fetched.
template = prompts.get(
    "support-system-prompt",
    fallback="You are a helpful assistant.",
)

# 2. Compile the template with variables.
system_prompt = prompts.compile(template, {
    "company": "Acme Corp",
    "tier": "premium",
})

# 3. Use the compiled prompt in the LLM call.
response = openai.chat.completions.create(
    model="gpt-4",
    messages=[
        {"role": "system", "content": system_prompt},
        {"role": "user", "content": "How do I reset my password?"},
    ],
    posthog_distinct_id="user-123",
)

print(response.choices[0].message.content)
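
A minimal sketch of the availability claim above. The exact cache semantics (TTL, refresh) aren't spelled out in this description, so the behavior in the comments is an assumption, not a statement about the implementation:

# Assumed behavior, based only on the description above:
# - the first get() hits the PostHog API and caches the template in-process
# - a later get() for the same key is served from that cache
# - if the API is unreachable and nothing is cached, the fallback string is
#   returned, so the call always yields a usable prompt
template = prompts.get("support-system-prompt", fallback="You are a helpful assistant.")
template_again = prompts.get("support-system-prompt", fallback="You are a helpful assistant.")  # cache hit (assumed)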

@Radu-Raicea requested a review from a team on January 28, 2026 at 20:02.
@greptile-apps (bot) left a review: 1 file reviewed, 1 comment.

posthog/ai/prompts.py (review thread, outdated, resolved)

Use _get_session() from posthog/request.py instead of raw requests.get()
to benefit from the SDK's existing retry configuration on transient
network failures.
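
A minimal sketch of the change described above. Only the use of _get_session() from posthog/request.py in place of a raw requests.get() comes from the comment; the helper name _fetch_prompt and the exact signature of _get_session() are assumptions:

from posthog.request import _get_session

def _fetch_prompt(url, headers, timeout=10):
    # _get_session() is assumed to return a requests.Session preconfigured
    # with the SDK's retry policy, so transient network failures are retried
    # instead of surfacing immediately.
    session = _get_session()
    response = session.get(url, headers=headers, timeout=timeout)
    response.raise_for_status()
    return response.json()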
@Radu-Raicea merged commit 4350389 into master on Jan 30, 2026. 20 checks passed.
@Radu-Raicea deleted the feat/add-prompt-management branch on January 30, 2026 at 13:43.