OpenAI-Compatible API
Drop InfuseAI into any existing OpenAI integration. Use POST /api/v1/chat/completions with the same request/response format you already know, plus an infuse extension for sources, credits, and analytics.
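Because the endpoint mirrors OpenAI's chat completions format, a request body built for OpenAI works unchanged; only the `infuse` block on the response is InfuseAI-specific. A minimal sketch in TypeScript, assuming an illustrative shape for the `infuse` extension (the source names sources, credits, and analytics but not their exact fields; the model id is taken from the architecture notes below):

```typescript
interface ChatMessage {
  role: 'system' | 'user' | 'assistant'
  content: string
}

// Assumed shape of the `infuse` extension block, for illustration only.
interface InfuseExtension {
  sources?: { title: string; url: string }[] // RAG citations
  credits?: number                           // credits consumed
}

// Build a request body identical to OpenAI's chat completions format.
function buildRequest(messages: ChatMessage[], model = 'llama-4-maverick-17b') {
  return { model, messages, stream: false }
}

// Pull the non-standard `infuse` block off a completed response.
function extractInfuse(response: Record<string, unknown>): InfuseExtension | undefined {
  return response['infuse'] as InfuseExtension | undefined
}

// POST the body to `${baseUrl}/api/v1/chat/completions` with your API key
// in the Authorization header, exactly as you would with OpenAI.
const body = buildRequest([{ role: 'user', content: 'Hello!' }])
console.log(body.model) // llama-4-maverick-17b
```

The same body also works with the official `openai` client if you point its `baseURL` at your InfuseAI instance; the `infuse` block then appears as an extra property on the raw response.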
A hosted platform and React SDK for building intelligent, context-aware chat experiences with generative UI, tool calling, and RAG, all through an OpenAI-compatible API.
Add a full AI chat interface to your Next.js app in under 10 lines:

```bash
npx create-infuseai-app
```

Or set it up manually:
```tsx
// app/page.tsx
import { InfuseProvider, InfuseChatUI } from 'infuseai-sdk'

export default function Home() {
  return (
    <InfuseProvider
      config={{
        baseUrl: process.env.NEXT_PUBLIC_INFUSEAI_URL!,
        clientId: process.env.NEXT_PUBLIC_CLIENT_ID!,
        appId: process.env.NEXT_PUBLIC_APP_ID!,
        apiKey: process.env.NEXT_PUBLIC_API_KEY!,
      }}
    >
      <InfuseChatUI />
    </InfuseProvider>
  )
}
```

That's it. InfuseAI handles session creation, thread management, message streaming, and rendering, including generative UI components and tool calls.
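The config values are read from environment variables; in Next.js these would live in `.env.local`. The variable names come from the snippet above, the values here are placeholders:

```bash
# .env.local — placeholder values
NEXT_PUBLIC_INFUSEAI_URL=https://your-infuseai-instance.example.com
NEXT_PUBLIC_CLIENT_ID=your-client-id
NEXT_PUBLIC_APP_ID=your-app-id
NEXT_PUBLIC_API_KEY=your-api-key
```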
```
Your App (React)
   │
   │  InfuseProvider (SDK)
   │   ├── useInfuseSession     → POST /api/v1/auth/session → session token
   │   ├── useInfuseThread      → POST /api/v1/threads → thread created
   │   └── useInfuseThreadInput → POST /api/v1/threads/:id/advance
   │
   ▼
InfuseAI Platform
   ├── RAG retrieval (Pinecone + Gemini embeddings)
   ├── LLM inference (Groq → llama-4-maverick-17b)
   ├── Tool execution loop
   └── Response (OpenAI format + infuse extension)
```

| Section | Description |
|---|---|
| Getting Started | 5-minute setup with npx create-infuseai-app or manual install |
| Core Concepts | Apps, Clients, Threads, Sessions, Tools, Generative UI, RAG |
| React SDK | InfuseProvider, hooks, defineComponent, defineTool |
| API Reference | All REST endpoints with request/response schemas |
| Dashboard | Manage Apps, Knowledge Bases, and Analytics |
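Under the hood, the three SDK hooks in the architecture diagram map to three REST calls made in sequence. A minimal sketch outside React, assuming bearer auth and simple JSON bodies (the endpoint paths come from the diagram; the header names and body fields are illustrative, and a real integration would likely swap in the session token from step 1 for the later calls):

```typescript
const url = (base: string, path: string) => `${base}${path}`

async function chatOnce(base: string, apiKey: string, text: string) {
  const headers = {
    'Content-Type': 'application/json',
    Authorization: `Bearer ${apiKey}`, // assumed auth scheme
  }

  // 1. Create a session (what useInfuseSession does)
  const session = await fetch(url(base, '/api/v1/auth/session'), {
    method: 'POST', headers,
  }).then(r => r.json())

  // 2. Create a thread (what useInfuseThread does)
  const thread = await fetch(url(base, '/api/v1/threads'), {
    method: 'POST', headers,
  }).then(r => r.json())

  // 3. Advance the thread with user input (what useInfuseThreadInput does)
  return fetch(url(base, `/api/v1/threads/${thread.id}/advance`), {
    method: 'POST', headers,
    body: JSON.stringify({ input: text }), // assumed body field
  }).then(r => r.json())
}
```

In practice the `InfuseProvider` hooks manage this sequence, plus streaming, for you; the sketch is only meant to show where each hook's network call lands.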