Ayush Sharma
2025 · 15 Days

CappyChat

Production AI chat workspace with multi-model routing, realtime sync, tool calling, and a local-first UX.

cappychat.com

Project Overview

CappyChat is my take on a serious AI chat product, not just a thin wrapper over one model. It supports 30+ models, realtime sync, a local-first architecture, voice input, file uploads, image generation, and collaborative workflows. The product is optimized around responsiveness so users can switch models, recover prior context, and keep working without the interface feeling slow or fragile.

The later versions pushed the system much further with plan mode, AI artifacts, model-driven tool calling, web search, logging, and production hardening. That meant solving not only for prompt quality but also for rate limits, synchronization, bundle size, state management, and the UX details required to make a real AI application feel fast in everyday use.
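Supporting 30+ models without configuration overload comes down to routing on capabilities rather than exposing raw provider settings. Here is a minimal sketch of that idea; the model IDs and registry shape are illustrative assumptions, not CappyChat's actual catalog.

```typescript
// Illustrative capability-based model routing (model IDs and the registry
// shape are assumptions, not CappyChat's real configuration). Each model
// advertises what it can do; the router picks the first model satisfying
// the request, so the UI can offer features instead of provider settings.

type Capability = "vision" | "tools" | "web-search";

interface ModelEntry {
  id: string;
  capabilities: Capability[];
}

const registry: ModelEntry[] = [
  { id: "openai/gpt-4o-mini", capabilities: ["vision", "tools"] },
  {
    id: "anthropic/claude-3.5-sonnet",
    capabilities: ["vision", "tools", "web-search"],
  },
];

// Return the first registered model that covers every required capability.
function pickModel(required: Capability[]): string | undefined {
  return registry.find((m) =>
    required.every((c) => m.capabilities.includes(c))
  )?.id;
}
```

With a registry like this, adding a new provider is a data change, not a UI change.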

Technical Challenges

1. Building a local-first chat experience that stays responsive while still syncing reliably across devices through Appwrite Realtime.

2. Supporting many model providers and capabilities without turning the interface into configuration overload.

3. Adding tool calling, artifacts, and web search in a way that keeps responses useful instead of noisy.

4. Hardening the product for production with logging, rate limiting, and performance work across a rapidly evolving feature set.
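The core of the local-first pattern in challenge 1 is that the UI never waits on the network: a message is applied to local state immediately and later confirmed by the server's realtime copy. A minimal in-memory sketch of that pattern (the class and field names are illustrative, not CappyChat's actual code, and the real sync layer would be an Appwrite Realtime subscription):

```typescript
// Sketch of an optimistic local-first message store (illustrative, not
// CappyChat's implementation). Sends are applied locally right away and
// marked pending; when the sync layer delivers the server-confirmed
// document, the local copy is replaced and the pending flag cleared.

type Message = { id: string; text: string; pending: boolean };

class LocalFirstStore {
  private messages = new Map<string, Message>();

  // Apply locally first so the interface stays responsive offline.
  sendLocal(id: string, text: string): Message {
    const msg: Message = { id, text, pending: true };
    this.messages.set(id, msg);
    return msg;
  }

  // Called when the realtime subscription delivers the server's copy.
  confirmFromServer(id: string, text: string): void {
    this.messages.set(id, { id, text, pending: false });
  }

  get(id: string): Message | undefined {
    return this.messages.get(id);
  }
}
```

The pending flag is what lets the UI render the message instantly while still signaling sync state.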

Tech Stack

Next.js
TypeScript
Appwrite
OpenRouter
Zustand
Upstash Redis
Tailwind CSS
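Upstash Redis in the stack points at the rate-limiting piece of the production hardening. A self-contained, in-memory sketch of the sliding-window idea follows; in production this state would live in Upstash Redis so limits hold across serverless instances, and the numbers here are purely illustrative.

```typescript
// In-memory sliding-window rate limiter sketch (illustrative; a production
// deployment would keep this state in Upstash Redis). Each key tracks its
// recent request timestamps; a request is allowed only while fewer than
// `limit` requests fall inside the trailing window.

class SlidingWindowLimiter {
  private hits = new Map<string, number[]>();

  constructor(private limit: number, private windowMs: number) {}

  allow(key: string, now: number = Date.now()): boolean {
    // Drop timestamps that have slid out of the window.
    const recent = (this.hits.get(key) ?? []).filter(
      (t) => now - t < this.windowMs
    );
    if (recent.length >= this.limit) {
      this.hits.set(key, recent);
      return false; // over the limit for this window
    }
    recent.push(now);
    this.hits.set(key, recent);
    return true;
  }
}
```

A sliding window avoids the burst-at-the-boundary problem of fixed windows, which matters when rate limits guard per-user LLM spend.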

Status

Current Status: Completed & Live

Last Update: 2025

Development Time: 15 Days