How I Reduced Load Time on Low-End Devices
Sun Jul 06 2025

Let’s be real: optimizing for performance isn’t about slapping “lazy” on a component and calling it a day. It’s a battle between bundle size, the design team’s pixel-perfection, and the guy who insists on loading Lottie animations from a CDN.
But when your users are running on 2GB RAM phones with Chrome eating half of it, you can’t afford sloppy performance. Here’s a breakdown of how I brought down load time by ~300ms without sacrificing interactivity, aesthetics, or maintainability.

Code Splitting with Suspense + Dynamic Components
- For heavy or interaction-only components, we split them using dynamic imports inside Suspense
- These were placed inside Server Components to maintain SSR where needed, or in Client Components for interactivity (a Server Component example closes out this section)
'use client';
import React, { Suspense } from 'react';
import dynamic from 'next/dynamic';

const HeavyComponent = dynamic(() => import('./HeavyComponent'), { ssr: false });

export default function SomeComponent() {
  return (
    <Suspense fallback={<div className="h-40 bg-gray-200 animate-pulse rounded" />}>
      <HeavyComponent />
    </Suspense>
  );
}

Why This is Better:
- Keeps bundle size minimal for first paint
- Doesn’t block rendering of visible content
- Maintains React’s concurrent capabilities for hydration control
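For the Server Component case mentioned above, we didn’t need ssr: false at all; wrapping an async child in Suspense lets the shell render immediately while the slow part streams in. A minimal sketch (SlowServerSection and its data fetching are hypothetical, not our actual component):
import { Suspense } from 'react';
import SlowServerSection from './SlowServerSection'; // hypothetical async Server Component

export default function Page() {
  return (
    <main>
      {/* The static shell renders right away; the slow section streams in when its data is ready */}
      <Suspense fallback={<div className="h-40 bg-gray-200 animate-pulse rounded" />}>
        <SlowServerSection />
      </Suspense>
    </main>
  );
}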
Offloading Third-Party Scripts with Worker Threads
Using Partytown, we offloaded third-party scripts to web workers, keeping the main thread focused on rendering.
import { Partytown } from '@builder.io/partytown/react';

export default function RootLayout({ children }) {
  return (
    <html>
      <head>
        <Partytown debug={false} forward={['dataLayer.push']} />
        <script
          type="text/partytown"
          src="some-third-party-script"
        />
      </head>
      <body>{children}</body>
    </html>
  );
}

Any <script> with type="text/partytown" is now offloaded to a worker.
Why It’s Powerful:
- Frees up the main thread for React’s hydration and rendering
- Keeps third-party scripts isolated from layout/render blocking
- Time to Interactive (TTI) improved
Note: not every script can be offloaded. Scripts that heavily manipulate the DOM or rely on APIs the worker proxy doesn’t cover should stay on the main thread, so test each one before switching it to type="text/partytown".

Memoization Where It Counts
With server + client component splitting, it becomes even more important to stabilize renders on the client.
- memo for reusable widgets (button sets, filter panels)
- useMemo for computed transformations
- useCallback for debounced handlers passed as props (sketched after the note below)
'use client';
import { useMemo } from 'react';

// This massively helped us on filter-heavy dashboards.
const filteredData = useMemo(() => {
  return rawData.filter((item) => item.active);
}, [rawData]);

Important Note:
useMemo and useCallback aren’t free. They have a real cost, especially in components that re-render often or don’t actually benefit from memoization. Don’t wrap every function in useCallback “just in case.” That’s like bubble-wrapping your lunch because it once spilled.
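Where the useCallback bullet applies, here’s a minimal sketch of a debounced handler passed to a memoized widget (SearchBox, FilterPanel, onSearch, and the lodash-es debounce are illustrative assumptions, not our actual code):
'use client';
import { memo, useCallback, useMemo } from 'react';
import debounce from 'lodash-es/debounce';

// memo: the search box only re-renders when its props change
const SearchBox = memo(function SearchBox({ onQueryChange }) {
  return <input type="search" onChange={(e) => onQueryChange(e.target.value)} />;
});

export default function FilterPanel({ onSearch }) {
  // Stable handler identity across renders, so SearchBox's memo() actually pays off
  const handleQuery = useCallback((value) => onSearch(value.trim()), [onSearch]);

  // Debounce the stable handler once, instead of rebuilding it on every render
  const debouncedQuery = useMemo(() => debounce(handleQuery, 300), [handleQuery]);

  return <SearchBox onQueryChange={debouncedQuery} />;
}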
Smart Caching with Next.js App Router
A big part of performance is not hitting the network unnecessarily.
Here’s how we approached caching on both the server and client side in our Next.js 13 App Router setup:
Server-side Caching with Next.js
Next 13+ makes it easy to cache API responses using fetch() with the next option:
const res = await fetch('someapi', {
  next: { revalidate: 3600 },
});

This caches the response on the server and revalidates it in the background, keeping things snappy while still reasonably fresh.
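When a whole route, rather than a single fetch call, should be cached, the App Router also supports segment-level revalidation. A sketch under that assumption (ProductsPage and the endpoint are placeholders):
// app/products/page.js
// Revalidate everything rendered by this segment at most once per hour
export const revalidate = 3600;

export default async function ProductsPage() {
  const products = await fetch('someapi').then((res) => res.json());
  return <pre>{JSON.stringify(products, null, 2)}</pre>;
}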
Client-side Caching with useQuery
We used react-query on the client for dynamic data that needed frequent refreshes along with smart request deduplication:
'use client';
import { useQuery } from '@tanstack/react-query';

const { data, isLoading } = useQuery({
  queryKey: ['products'],
  queryFn: () => fetch('someapi').then((res) => res.json()),
  staleTime: 5 * 60 * 1000,
});

Why we like useQuery:
- Built-in caching and revalidation
- Handles background re-fetches
- Avoids multiple requests for the same data
- Devtools for debugging
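The useQuery snippet above assumes a QueryClientProvider higher in the tree. Here’s a minimal client-side provider sketch (the Providers component name is illustrative; the devtools come from the standard @tanstack/react-query-devtools package):
'use client';
import { useState } from 'react';
import { QueryClient, QueryClientProvider } from '@tanstack/react-query';
import { ReactQueryDevtools } from '@tanstack/react-query-devtools';

export default function Providers({ children }) {
  // Create the client once per app instance, not on every render
  const [queryClient] = useState(() => new QueryClient());

  return (
    <QueryClientProvider client={queryClient}>
      {children}
      {/* Devtools are only included in development builds by default */}
      <ReactQueryDevtools initialIsOpen={false} />
    </QueryClientProvider>
  );
}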

Slashing Bundle Size With Discipline
Tools we used:
- next/bundle-analyzer
- Chrome’s coverage tab (to find unused code)
- Dynamic imports for date-fns, lodash-es, etc.
What We Changed:
- Avoided default imports from utility libraries (see the example at the end of this section)
- Removed unused third-party animations
- Offloaded static JSON and SVG assets to a CDN
Bundle size shrank from 1.2MB → 715KB gzipped.
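As a concrete, illustrative example of the import discipline (not our exact code):
// Before: `import _ from 'lodash'` drags the whole library into the bundle
// After: per-module and named imports that tree-shake cleanly
import debounce from 'lodash-es/debounce';
import { format } from 'date-fns';

const handler = (value) => console.log(value);
const debounced = debounce(handler, 300);
const label = format(new Date(), 'yyyy-MM-dd');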

Native Image Optimization with next/image
This one’s simple but insanely effective.
We replaced every <img> tag with the App Router-friendly next/image component. Here’s the pattern we now follow:
import Image from 'next/image';

<Image
  src="/assets/somephoto"
  alt="Some Photo"
  width={600}
  height={400}
  // placeholder="blur" needs a blurDataURL unless the image is statically imported
  placeholder="blur"
  // priority preloads above-the-fold images (and opts them out of lazy loading)
  priority
  className="rounded-md object-cover"
/>

Features We Leveraged:
- Automatic lazy loading for non-priority images
- Responsive resizing (see the fill + sizes sketch below)
- Blur-up placeholder for perceived performance boost
- Prevented CLS using fixed height/width or aspect-ratio
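For the responsive resizing case, next/image also supports the fill + sizes combination instead of hard-coded dimensions. A sketch, not pulled from our codebase (the container class and breakpoints are illustrative):
import Image from 'next/image';

export default function Hero() {
  return (
    // The parent must be positioned (and sized) for fill to work
    <div className="relative aspect-video">
      <Image
        src="/assets/somephoto"
        alt="Some Photo"
        fill
        sizes="(max-width: 768px) 100vw, 50vw"
        className="rounded-md object-cover"
      />
    </div>
  );
}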
Final Thought
Performance isn’t a checklist; it’s an attitude. Aim to render smart, hydrate less, and split early.
If you liked this, follow me for more dev logs, code tricks, and meme-powered explanations.