Hello Developers! 👋
I recently launched Nanobanan Editor, an AI-powered image editing tool focusing on natural language prompts.
While building the MVP was fun, I hit a massive roadblock: Performance. Specifically, my “Community Feed” page was taking 3-5 seconds to load. For a user-facing gallery, that’s unacceptable.
In this post, I want to share how I diagnosed the bottleneck and optimized it to under 300ms (a 10x improvement) using client-side rendering strategies and database indexing.
The Stack 🛠️
- Framework: Next.js 14 (App Router)
- Database: Supabase (PostgreSQL)
- Styling: TailwindCSS
- Deployment: Vercel
The Problem: Traditional SSR Bloat 🐢
Initially, I implemented the community page using standard Server-Side Rendering (SSR) in Next.js.
```jsx
// ❌ The slow way (simplified)
export const dynamic = "force-dynamic"

export default async function CommunityPage() {
  // Blocks rendering until the database query finishes
  const { data: images } = await supabase
    .from('generations')
    .select('*') // ...plus filtering, ordering, and a limit
  return <Gallery images={images} />
}
```
**Why it was slow:**
- Blocking: The HTML wouldn’t stream until the database query finished.
- Complex Query: I was querying a large dataset without proper indexes.
- Network: The server-to-database round trip added latency for every single request.
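If you want to reproduce this kind of diagnosis in your own app, a quick (if crude) way is to time the query on the server. The snippet below is just a sketch; the query shape mirrors the SQL further down in this post:

```jsx
// Rough timing check around the feed query (sketch, not production code)
const start = performance.now()
const { data, error } = await supabase
  .from('generations')
  .select('*')
  .eq('is_public', true)
  .order('created_at', { ascending: false })
  .limit(12)
console.log(`Feed query took ${Math.round(performance.now() - start)}ms`)
```

Seeing several hundred milliseconds here on every request made it clear the query itself, not React, was the main culprit.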
The Solution: CSR + SWR + Indexes 🚀
I decided to pivot from SSR to Client-Side Rendering (CSR) with a “Stale-While-Revalidate” strategy.
Step 1: Switch to CSR with Skeleton Loading
Key change: Show the UI immediately (skeletons), then fetch data.
"use client"
import useSWR from 'swr'
export default function CommunityPage() {
// Non-blocking fetch
const { data, isLoading } = useSWR('/api/community/feed', fetcher)
if (isLoading) return <SkeletonGrid /> // Instant feedback
return <Gallery images={data.images} />
}
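`<SkeletonGrid />` is my own component and isn't shown above; a minimal version, assuming Tailwind's `animate-pulse` utility and a 12-item grid, could look like this:

```jsx
// Minimal skeleton grid sketch — tweak the classes to match your card layout
export function SkeletonGrid({ count = 12 }) {
  return (
    <div className="grid grid-cols-2 gap-4 md:grid-cols-4">
      {Array.from({ length: count }).map((_, i) => (
        <div
          key={i}
          className="aspect-square animate-pulse rounded-lg bg-gray-200"
        />
      ))}
    </div>
  )
}
```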
Step 2: Intelligent Caching with SWR
I used swr to handle caching. If a user visits the community page, leaves, and comes back 10 seconds later, it loads instantly from the cache without hitting the API.
```jsx
const { data } = useSWR('/api/community/feed', fetcher, {
  dedupingInterval: 60000,  // Reuse data for 60s
  revalidateOnFocus: false  // Don't re-fetch just because I clicked a tab
})
```
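For completeness, `/api/community/feed` is just a thin wrapper around the query. A simplified sketch of what that App Router route handler can look like (the file path and the Supabase client import will differ in your setup):

```ts
// app/api/community/feed/route.ts — simplified sketch
import { NextResponse } from 'next/server'
import { supabase } from '@/lib/supabase' // wherever your client lives

export async function GET() {
  const { data, error } = await supabase
    .from('generations')
    .select('*')
    .eq('is_public', true)
    .order('created_at', { ascending: false })
    .limit(12)

  if (error) {
    return NextResponse.json({ error: error.message }, { status: 500 })
  }
  return NextResponse.json({ images: data })
}
```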
Step 3: Database Indexing (The Real MVP)
This was the biggest win. I analyzed my SQL query:
```sql
SELECT * FROM generations
WHERE is_public = true
ORDER BY created_at DESC
LIMIT 12;
```
I realized I was doing a sequential scan. I added a composite index in Supabase:
```sql
CREATE INDEX idx_generations_public_created
ON generations (is_public, created_at DESC)
WHERE is_public = true;
```
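If you want to verify the index is actually being used, run `EXPLAIN ANALYZE` in the Supabase SQL editor. With the partial index in place, the plan should show an index scan instead of a sequential scan:

```sql
EXPLAIN ANALYZE
SELECT * FROM generations
WHERE is_public = true
ORDER BY created_at DESC
LIMIT 12;
-- Look for "Index Scan using idx_generations_public_created"
-- instead of "Seq Scan on generations" in the output.
```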
**Result:** The database query time dropped from **~500ms** to **~15ms**.
The Results 📊
- First Load: ~300ms (Skeleton UI visible instantly)
- Repeat Visit: < 50ms (Cache hit)
- Lighthouse Score: Jumped from 65 to 95.
Try it out
You can experience the speed difference live here:
👉 **Nanobanan Editor Community**
This journey taught me that while SSR is powerful, sometimes good old Client-Side Rendering with a smart caching strategy provides a snappier UX for feed-based pages.
Let me know what you think of the app! Also, happy to answer any questions about the Next.js + Supabase stack in the comments. 👇
#webdev #javascript #programming #showdev

