# OpenAI Euphony: A Browser-Based Viewer for Harmony Conversations and Codex CLI Sessions
OpenAI just shipped Euphony — an Apache 2.0 open source web app for inspecting two specific formats that have been a pain to read with a text editor:

- Harmony conversations — the structural wire format gpt-oss models are trained on
- Codex CLI sessions — the `rollout-*.jsonl` files auto-created by Codex CLI
There’s already a hosted build you can use without installing anything.
## Why Euphony Exists
If you’ve ever looked at a raw Harmony conversation, you know the pain. The format uses explicit role/channel/stop tokens like `<|start|>`, `<|end|>`, `<|message|>`, and `<|channel|>` — great for training, terrible for human reading.
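For a sense of what that means in practice, a single assistant turn renders roughly like this (schematic and simplified, not exact renderer output):

```text
<|start|>assistant<|channel|>final<|message|>Hello! How can I help?<|end|>
```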
Codex CLI sessions are similar. Every session dumps a JSONL file to `$CODEX_HOME/sessions/YYYY/MM/DD/rollout-*.jsonl`. If you want to understand what your agent did last Tuesday, you’re scrolling through hundreds of lines of structured tool calls and responses.
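For context on the format itself: each line of a rollout file is one standalone JSON object, so a minimal reader (illustrative only, not Euphony’s actual parser) is just:

```typescript
// Parse a JSONL string into an array of objects. Each non-empty line is
// expected to be one standalone JSON document; blank lines are skipped.
function parseJsonl(text: string): unknown[] {
  return text
    .split('\n')
    .filter((line) => line.trim().length > 0)
    .map((line) => JSON.parse(line));
}
```

The hard part isn’t parsing — it’s making hundreds of those objects navigable, which is what Euphony’s UI is for.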
Euphony fills this gap with a proper timeline UI, filters, and metadata inspection.
## What You Get

### 1. Flexible Input
Three ways to load data:
- Paste JSON/JSONL from clipboard
- Drag and drop a local `.json` or `.jsonl` file
- Enter a public HTTP(S) URL (great for Hugging Face datasets)
### 2. Two Run Modes

**Frontend-only** (recommended for deployment):

```shell
export VITE_EUPHONY_FRONTEND_ONLY=true
pnpm run dev
```
All processing happens in the browser. URL fetches are client-side. Translation uses the user’s own OpenAI API key. Safe to host on GitHub Pages, Cloudflare, etc.
**Backend-assisted** (local dev only):

```shell
uvicorn fastapi-main:app --app-dir server --host 127.0.0.1 --port 8020 --reload
```
Adds a FastAPI server for large remote files, server-side translation, and Harmony rendering. Do not expose this externally — SSRF risk.
### 3. JMESPath Filtering
Query large datasets right in the UI:
| Goal | Query |
|---|---|
| Assistant messages only | `messages[?role=='assistant']` |
| Specific tool calls | `messages[?recipient=='browser']` |
| Last 10 messages | `messages[-10:]` |
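If you want the same selections outside the UI, the queries above boil down to ordinary filters and slices. A plain-TypeScript sketch of what they compute (not using the JMESPath library; `Message` is a hypothetical shape):

```typescript
interface Message {
  role: string;
  recipient?: string;
}

// Equivalent of messages[?role=='assistant']: keep only assistant turns.
function assistantMessages(messages: Message[]): Message[] {
  return messages.filter((m) => m.role === 'assistant');
}

// Equivalent of messages[-10:]: the last ten messages (or fewer).
function lastTen(messages: Message[]): Message[] {
  return messages.slice(-10);
}
```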
### 4. Token Inspector
For Harmony debugging, the token view shows:
- Raw Harmony renderer output
- Token ID arrays
- Decoded token strings
- Display string conversions
Invaluable if you’re tracking down tokenizer mismatches.
### 5. Embeddable Web Components
This is the feature I find most compelling. You can drop Euphony into any web stack via a custom element:
```html
<euphony-conversation src="path/to/conversation.json"></euphony-conversation>
```
Theme with CSS variables:
```css
euphony-conversation {
  --euphony-user-color: #4F46E5;
  --euphony-assistant-color: #10B981;
  --euphony-background: #0D1117;
}
```
#### React Usage

```tsx
import './lib/euphony.js';

declare global {
  namespace JSX {
    interface IntrinsicElements {
      'euphony-conversation': {
        src?: string;
        data?: string;
      };
    }
  }
}

export function ConversationViewer({ url }: { url: string }) {
  return <euphony-conversation src={url} />;
}
```
#### Vue 3 Usage

```vue
<script setup lang="ts">
import './lib/euphony.js';

defineProps<{ src: string }>();
</script>

<template>
  <euphony-conversation :src="src" />
</template>
```
Remember to add `compilerOptions.isCustomElement` for Vue to recognize the tag.
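With Vite and `@vitejs/plugin-vue`, that looks roughly like this (adjust the tag-prefix check to your setup):

```typescript
// vite.config.ts — tell the Vue template compiler that euphony-* tags are
// native custom elements, not unresolved Vue components.
import { defineConfig } from 'vite';
import vue from '@vitejs/plugin-vue';

export default defineConfig({
  plugins: [
    vue({
      template: {
        compilerOptions: {
          isCustomElement: (tag) => tag.startsWith('euphony-'),
        },
      },
    }),
  ],
});
```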
## Quick Start

```shell
git clone https://github.com/openai/euphony.git
cd euphony
pnpm install

# Recommended: frontend-only
export VITE_EUPHONY_FRONTEND_ONLY=true
pnpm run dev
```
Open http://localhost:3000/ and drag in any JSONL file.
## Building the Library

To embed in your own app:

```shell
pnpm install
pnpm run build:library
# Output: ./lib/euphony.js
```
## Deployment Checklist

- Frontend-only mode: safe to deploy to static hosts
- Set `VITE_EUPHONY_FRONTEND_ONLY=true` at build time
- Backend mode: keep bound to `127.0.0.1`, never expose to the internet
- Translation: pass the user’s API key at runtime, never hardcode it
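For that last point, one way to honor it: hold the key only in memory and attach it per request, so it never lands in the bundle, in source control, or in persistent storage (a sketch with illustrative names, not Euphony’s actual internals):

```typescript
// Build request headers from a user-supplied key at call time. The key lives
// only in a runtime variable — never in an env var baked into the build.
function buildAuthHeaders(apiKey: string): Record<string, string> {
  return {
    Authorization: `Bearer ${apiKey}`,
    'Content-Type': 'application/json',
  };
}
```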
## Practical Use Cases

- Codex CLI debugging — Paste a `rollout-*.jsonl` path and step through the agent’s reasoning
- Dataset inspection — Point at a Hugging Face URL to audit conversation quality
- Agent platform UI — Drop the Web Component into your admin dashboard
- Tokenizer debugging — Use the token inspector to track down Harmony rendering issues
## Final Thoughts
The fact that OpenAI is open-sourcing their internal inspection tools signals something important: agent workflow debugging is becoming a first-class concern. If you’re building on gpt-oss or using Codex CLI, Euphony is worth 10 minutes of your time. And if you’re shipping your own agent platform, the Web Components embed means you can have a proper conversation viewer without writing one from scratch.
I’ll probably integrate it into my own tooling this week. Curious if anyone else is already using it in production.
