
AI Chat Integration

Connect to LLMs such as OpenAI GPT or Google Gemini for AI chat

This live demo connects to the Google Gemini 2.5 Flash API in real time. Try chatting with the AI assistant below: it understands multiple languages and can help answer questions about web development, Surakiat.dev, or anything else. The implementation uses Next.js Server Actions to call the API securely without exposing keys to the client.

This is a real API integration using the @google/genai SDK. The AI responds based on conversation context and can answer in any language you use. You could swap Gemini for OpenAI, Anthropic, or any other LLM provider; the server-action pattern keeps your API keys secure.
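To make the server-action pattern concrete, here is a minimal sketch in the spirit of a chat action. It is illustrative, not the site's actual code: the site uses the @google/genai SDK, while this sketch calls Gemini's REST endpoint directly with fetch so it has no dependencies. The function and type names (ChatMessage, sendChat, toGeminiContents) are hypothetical.

```typescript
// Sketch of a chat Server Action (the real app/actions/chat.ts would carry
// the "use server" directive and read the key from process.env server-side).

type ChatMessage = { role: "user" | "assistant"; content: string };

// Gemini's REST API expects roles "user" and "model", with text parts.
function toGeminiContents(history: ChatMessage[]) {
  return history.map((m) => ({
    role: m.role === "assistant" ? "model" : "user",
    parts: [{ text: m.content }],
  }));
}

async function sendChat(history: ChatMessage[], apiKey: string): Promise<string> {
  const res = await fetch(
    "https://generativelanguage.googleapis.com/v1beta/models/gemini-2.5-flash:generateContent",
    {
      method: "POST",
      headers: {
        "Content-Type": "application/json",
        "x-goog-api-key": apiKey, // passed server-side only, never to the client
      },
      body: JSON.stringify({ contents: toGeminiContents(history) }),
    }
  );
  if (!res.ok) throw new Error(`Gemini API error: ${res.status}`);
  const data = await res.json();
  // First candidate's first text part, if present.
  return data?.candidates?.[0]?.content?.parts?.[0]?.text ?? "";
}
```

Because the fetch runs inside a Server Action, the browser only ever sees the action's return value, never the key or the upstream request.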

How it works

app/
└── actions/
    └── chat.ts              # Server Action for Gemini API

features/
└── sharing/
    └── components/
        └── Integration/
            └── AIChatShowcase/
                └── AIChatShowcase.tsx  # Chat UI component

.env.local
└── GEMINI_API_KEY=your_key  # Keep server-side only (no NEXT_PUBLIC_)

Streaming

Real-time streaming responses for better UX.
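One way to stream Gemini responses is the REST API's `:streamGenerateContent?alt=sse` variant, which emits server-sent events. This hypothetical parser (not the site's code) extracts the text from each `data:` line so the UI can append tokens as they arrive:

```typescript
// Pull the text out of each complete SSE `data:` line in a chunk.
// Simplification: assumes each data line is complete JSON; a production
// parser would buffer partial lines across network chunks.
function parseSseChunk(chunk: string): string[] {
  return chunk
    .split("\n")
    .filter((line) => line.startsWith("data: "))
    .map((line) => {
      try {
        const payload = JSON.parse(line.slice("data: ".length));
        return payload?.candidates?.[0]?.content?.parts?.[0]?.text ?? "";
      } catch {
        return ""; // skip keep-alives and unparsable lines
      }
    })
    .filter((text) => text.length > 0);
}
```

On the client, each parsed fragment gets appended to the visible message, which is what makes the reply feel instantaneous even while the model is still generating.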

Fast Response

Optimized API calls for quick AI responses.

Error Handling

Graceful error handling and retry logic.
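The retry logic mentioned above could look like the following sketch: a generic wrapper (withRetry is a hypothetical name) that retries a failing async call with exponential backoff before giving up.

```typescript
// Retry an async call with exponential backoff between attempts.
async function withRetry<T>(
  fn: () => Promise<T>,
  attempts = 3,
  baseDelayMs = 250
): Promise<T> {
  let lastError: unknown;
  for (let i = 0; i < attempts; i++) {
    try {
      return await fn();
    } catch (err) {
      lastError = err;
      if (i < attempts - 1) {
        // Wait 250ms, 500ms, 1000ms, ... before the next attempt.
        await new Promise((resolve) => setTimeout(resolve, baseDelayMs * 2 ** i));
      }
    }
  }
  throw lastError; // all attempts failed; surface the last error
}
```

Wrapping the Gemini call as `withRetry(() => sendChat(...))` absorbs transient network or rate-limit errors while still failing loudly when the API is genuinely down.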

Type-Safe

Full TypeScript support with typed responses.
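"Typed responses" can be enforced at the server/client boundary with a type guard that narrows an unknown payload before the UI touches it. The ChatReply shape here is illustrative, not the site's actual type:

```typescript
// Illustrative response shape for a chat reply.
interface ChatReply {
  text: string;
  model: string;
}

// Narrow an untrusted value (e.g. a parsed API payload) to ChatReply.
function isChatReply(value: unknown): value is ChatReply {
  return (
    typeof value === "object" &&
    value !== null &&
    typeof (value as ChatReply).text === "string" &&
    typeof (value as ChatReply).model === "string"
  );
}
```

After `if (isChatReply(data))`, the compiler knows `data.text` exists, so malformed API responses are rejected in one place instead of causing undefined errors deep in the UI.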

Key Features

  • Streaming responses for real-time chat
  • Support for multiple LLM providers (OpenAI, Gemini)
  • Chat history management
  • Rate limiting and error handling
  • Customizable AI prompts
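Supporting multiple providers, as the list suggests, usually means putting each LLM behind a small common interface so the rest of the app never mentions a vendor by name. This is a sketch under that assumption; LlmProvider and the echo provider are hypothetical:

```typescript
// A minimal provider interface: Gemini, OpenAI, etc. each implement this.
interface LlmProvider {
  name: string;
  complete(prompt: string): Promise<string>;
}

// An echo provider standing in for a real LLM during tests.
const mockProvider: LlmProvider = {
  name: "mock",
  async complete(prompt: string) {
    return `echo: ${prompt}`;
  },
};

// App code depends only on the interface, so swapping providers is a
// one-line change where the provider is constructed.
async function ask(provider: LlmProvider, prompt: string): Promise<string> {
  return provider.complete(prompt);
}
```

The same seam also makes rate limiting and prompt customization easy to layer on, since there is exactly one call site between the app and any given provider.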

Live Example

AI Chat Demo (Gemini 2.5 Flash)

This demo uses the Google Gemini 2.5 Flash model, a fast, low-latency Gemini model with reasoning capabilities.

Hello! I'm the Surakiat.dev AI assistant, happy to help you.