
Parsely

personal
Next.js · TypeScript · Convex · Gemini AI · SwiftUI · Kotlin · Jetpack Compose · Vercel AI SDK · Ink · Tailwind CSS

Background

I wanted to actually learn how to build with AI, not just read about it. In April 2025, I picked a problem I had: recipe websites are unusable. Ads, life stories, pop-ups. You just want the ingredients and the steps.

So I built a thing that takes a URL, runs it through Gemini, and gives you a clean recipe. That was Parsely v1. A Next.js app with one text input.
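The v1 flow is simple enough to sketch. Here's a minimal version of the response-handling step, assuming the model is prompted to return JSON with `title`, `ingredients`, and `steps` (the names and shape are illustrative, not Parsely's actual schema):

```typescript
// Hypothetical shape of the structured output the model is asked for.
interface Recipe {
  title: string;
  ingredients: string[];
  steps: string[];
}

// Parse and validate the model's JSON reply. Models occasionally wrap
// JSON in markdown fences, so strip those before parsing.
function parseRecipeResponse(raw: string): Recipe {
  const json = raw
    .replace(/^```(?:json)?\s*/i, "")
    .replace(/```\s*$/, "");
  const data = JSON.parse(json);
  if (
    typeof data.title !== "string" ||
    !Array.isArray(data.ingredients) ||
    !Array.isArray(data.steps)
  ) {
    throw new Error("Model response is missing required recipe fields");
  }
  return { title: data.title, ingredients: data.ingredients, steps: data.steps };
}
```

Validating before saving matters more than it looks: a model that drops a field once in a hundred calls will eventually corrupt someone's cookbook if the output goes straight to the database.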

Where it went

It kept growing. I added user accounts, a cookbook to save recipes, folders, favorites, search. Then a streaming AI chat where you can ask questions about a recipe and it proposes edits you can accept or reject. Then subscriptions. Then an admin dashboard.
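The accept/reject flow falls out naturally if a proposed edit is plain data and applying it is a pure function, so nothing touches the saved recipe until the user clicks accept. A sketch, with a hypothetical edit format (not Parsely's actual types):

```typescript
interface Recipe {
  title: string;
  ingredients: string[];
  steps: string[];
}

// The chat model proposes field-level edits as data, not as a rewritten recipe.
type RecipeEdit =
  | { kind: "setTitle"; value: string }
  | { kind: "replaceIngredient"; index: number; value: string }
  | { kind: "replaceStep"; index: number; value: string };

// Pure function: returns a new recipe and leaves the original untouched,
// so rejecting a proposal costs nothing.
function applyEdit(recipe: Recipe, edit: RecipeEdit): Recipe {
  switch (edit.kind) {
    case "setTitle":
      return { ...recipe, title: edit.value };
    case "replaceIngredient":
      return {
        ...recipe,
        ingredients: recipe.ingredients.map((x, i) => (i === edit.index ? edit.value : x)),
      };
    case "replaceStep":
      return {
        ...recipe,
        steps: recipe.steps.map((x, i) => (i === edit.index ? edit.value : x)),
      };
  }
}
```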

At some point I decided to go native. Built an iOS app in SwiftUI. Then Kotlin with Jetpack Compose for Android. Then a terminal CLI with Ink because I wanted to try building a TUI. Then a Chrome extension because sometimes you just want to parse from the toolbar.

Five platforms, one Convex backend syncing everything in real time.

Stack

Web: Next.js 16, React 19, Tailwind CSS 4, shadcn/ui
Backend: Convex (real-time DB, auth, server functions)
AI: Gemini Flash Lite (web), gpt-4o-mini (CLI), Apple FoundationModels (iOS), ML Kit Gemini Nano (Android)
iOS: SwiftUI, Convex Mobile SDK
Android: Kotlin, Jetpack Compose, Hilt, Room, WorkManager
CLI: TypeScript, React 18, Ink v5, Puppeteer
Payments: Polar
Auth: Convex Auth (Google OAuth, magic links, password)
Analytics: PostHog
Hosting: Netlify (web), Convex (backend)

What I learned

Every platform forced me to think about the same problems differently. State management in SwiftUI is nothing like React. Android's multi-module architecture with Hilt took a while to click. The CLI doesn't connect to the backend at all. It's a standalone scraper with Puppeteer and OpenAI, which was a deliberate choice to keep it simple and auth-free.
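One shortcut worth knowing for this kind of scraper (illustrative, not necessarily what the CLI does): many recipe sites already embed schema.org Recipe data as JSON-LD, so you can try lifting it from the page before spending a model call at all.

```typescript
// Scan a page's HTML for a schema.org Recipe embedded as JSON-LD.
// Returns the recipe node, or null if none is found.
function extractRecipeJsonLd(html: string): Record<string, unknown> | null {
  const re = /<script[^>]*type=["']application\/ld\+json["'][^>]*>([\s\S]*?)<\/script>/gi;
  for (const match of html.matchAll(re)) {
    try {
      const data = JSON.parse(match[1]);
      const nodes = Array.isArray(data) ? data : [data];
      for (const node of nodes) {
        const type = node["@type"];
        if (type === "Recipe" || (Array.isArray(type) && type.includes("Recipe"))) {
          return node;
        }
      }
    } catch {
      // Ignore malformed JSON-LD blocks and keep scanning.
    }
  }
  return null;
}
```

When the structured data is missing or junk, falling through to the model is the reliable path; the JSON-LD check just makes the common case free.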

The AI parts taught me the most. Rate limiting that doesn't have race conditions. SSRF protection for user-submitted URLs. Structured output with Zod schemas. Streaming responses. On-device models on iOS and Android as fallbacks. iOS uses Apple's on-device FoundationModels when available, Android runs Gemini Nano through ML Kit, and the CLI just hits OpenAI directly since there's no user account to rate-limit against.
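The SSRF check, roughly: before fetching a user-submitted URL, reject anything that isn't plain http(s) or that points at localhost, private, or link-local ranges. A simplified sketch (names are assumed, and a real guard would also re-check the address after DNS resolution, since a harmless-looking hostname can resolve to an internal IP):

```typescript
import { isIP } from "node:net";

// Is this host a loopback, private, or link-local address?
function isPrivateAddress(host: string): boolean {
  if (host === "localhost" || host.endsWith(".local")) return true;
  if (isIP(host) === 4) {
    const [a, b] = host.split(".").map(Number);
    return (
      a === 0 || a === 10 || a === 127 ||
      (a === 172 && b >= 16 && b <= 31) ||
      (a === 192 && b === 168) ||
      (a === 169 && b === 254) // link-local, incl. cloud metadata 169.254.169.254
    );
  }
  if (isIP(host) === 6) {
    const h = host.toLowerCase();
    return h === "::1" || h.startsWith("fc") || h.startsWith("fd") || h.startsWith("fe80");
  }
  return false;
}

// Validate a user-submitted URL before fetching it server-side.
function assertSafeUrl(raw: string): URL {
  const url = new URL(raw); // throws on malformed input
  if (url.protocol !== "http:" && url.protocol !== "https:") {
    throw new Error(`blocked scheme: ${url.protocol}`);
  }
  const host = url.hostname.replace(/^\[|\]$/g, ""); // strip IPv6 brackets
  if (isPrivateAddress(host)) {
    throw new Error(`blocked host: ${url.hostname}`);
  }
  return url;
}
```

The rate-limiting side is a different shape of the same lesson: a naive read-then-write counter has a race under concurrent requests, so the check-and-increment has to be atomic, which is exactly what Convex's transactional mutations give you for free.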