Origo follows a modern serverless architecture optimized for speed and cost efficiency. The Next.js frontend handles both the guided input flow and brief rendering, with server-side generation for shared brief pages.
The AI pipeline uses Anthropic's Claude API via the Vercel AI SDK for streaming responses — users see their brief being generated in real time rather than waiting for a full response. Internationalization is built in from day one (FR/EN), with locale-aware routing and translated UI throughout.
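On the client, a streamed response can be consumed incrementally with the standard `ReadableStream` reader API so the UI can render text as it arrives. A minimal sketch (the `onChunk` callback and helper name are illustrative, not the actual implementation):

```ts
// Read a streamed response body chunk by chunk, invoking onChunk as
// text arrives so the UI can render the brief progressively.
async function readStream(
  body: ReadableStream<Uint8Array>,
  onChunk: (text: string) => void,
): Promise<string> {
  const reader = body.getReader();
  const decoder = new TextDecoder();
  let full = "";
  for (;;) {
    const { done, value } = await reader.read();
    if (done) break;
    // stream: true keeps multi-byte characters intact across chunk boundaries
    const text = decoder.decode(value, { stream: true });
    full += text;
    onChunk(text);
  }
  return full;
}
```

In practice the Vercel AI SDK's React hooks handle this plumbing, but the underlying mechanism is the same.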
Key Decisions
- Streaming generation — Briefs are generated using Claude with streaming enabled. Users see content appear progressively, which dramatically improves perceived performance and engagement.
- Local-first storage — Brief data is persisted client-side, reducing server costs and giving users full control of their data. No account required for basic usage.
- Shareable links — Each brief generates a unique URL that clients can view without creating an account, reducing friction in the handoff process.
- Bilingual by default — Full FR/EN support with Next.js i18n routing, reflecting our France-based market while targeting international clients.
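The local-first decision can be sketched as a pair of storage helpers. This is illustrative, not the actual implementation: the `Brief` shape, key prefix, and `KVStore` abstraction (which lets the same helpers run against the browser's `localStorage` or an in-memory stub) are assumptions.

```ts
// Minimal Storage-like interface so the helpers work in the browser
// (window.localStorage) or in tests (an in-memory stub).
interface KVStore {
  getItem(key: string): string | null;
  setItem(key: string, value: string): void;
}

// Hypothetical brief shape; the real model may carry more fields.
interface Brief {
  id: string;
  title: string;
  content: string;
  updatedAt: string;
}

const KEY_PREFIX = "origo:brief:";

// Persist a brief client-side as JSON under a namespaced key.
function saveBrief(store: KVStore, brief: Brief): void {
  store.setItem(KEY_PREFIX + brief.id, JSON.stringify(brief));
}

// Returns null when no brief with that id has been saved.
function loadBrief(store: KVStore, id: string): Brief | null {
  const raw = store.getItem(KEY_PREFIX + id);
  return raw ? (JSON.parse(raw) as Brief) : null;
}
```

In the app, `window.localStorage` would be passed as the `KVStore`, keeping brief data on the user's device with no account or server round-trip required.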
```ts
// AI brief generation with streaming (Vercel AI SDK + Anthropic provider).
// BRIEF_SYSTEM_PROMPT, formatInputsAsPrompt, and BriefInputs are defined elsewhere.
import { streamText } from "ai";
import { anthropic } from "@ai-sdk/anthropic";

async function generateBrief(inputs: BriefInputs) {
  const result = streamText({
    model: anthropic("claude-sonnet-4-20250514"),
    system: BRIEF_SYSTEM_PROMPT,
    messages: [
      {
        role: "user",
        content: formatInputsAsPrompt(inputs),
      },
    ],
    maxTokens: 4096,
  });

  // Stream partial text back to the client as it is generated.
  return result.toDataStreamResponse();
}
```
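The `formatInputsAsPrompt` helper referenced above isn't shown; a plausible sketch, assuming `BriefInputs` carries the guided-flow answers (the field names here are illustrative):

```ts
// Hypothetical shape of the guided-flow answers; the real BriefInputs
// type may differ.
interface BriefInputs {
  projectName: string;
  goals: string;
  audience: string;
  deliverables: string[];
  locale: "fr" | "en";
}

// Flatten the structured answers into a single user prompt for the model,
// including a locale instruction so the brief matches the user's language.
function formatInputsAsPrompt(inputs: BriefInputs): string {
  return [
    `Project: ${inputs.projectName}`,
    `Goals: ${inputs.goals}`,
    `Audience: ${inputs.audience}`,
    `Deliverables: ${inputs.deliverables.join(", ")}`,
    `Write the brief in ${inputs.locale === "fr" ? "French" : "English"}.`,
  ].join("\n");
}
```

Keeping this formatting in one pure function makes the prompt easy to test and evolve independently of the streaming plumbing.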