10 React Best Practices for High-Performance Apps

I looked at the node_modules folder today. It’s 400MB. For a landing page. We’re done playing games.

Back in my day, we managed memory. We cared about the heap. We knew where every byte lived and, more importantly, when it died. Now? I walk into this “modern” repository and it’s like walking into a landfill. I’m running Node 22.x, thinking maybe the V8 improvements will save us, but no. You can’t optimize a dumpster fire.

Here is the state of the “project” as of this morning. I ran a fresh install and a build. My workstation, which has 64GB of RAM and handles LLVM compilations without breaking a sweat, actually groaned.

$ npm audit
                           blocklist.js:24
                           throw new Error('Dependency Hell Detected');
                           ^

Error: Dependency Hell Detected
    at Object.<anonymous> (root/node_modules/bloatware/index.js:1:1)

found 482 vulnerabilities (112 low, 154 moderate, 180 high, 36 critical)
run `npm audit fix` to fix them, or `npm audit` for details

$ find node_modules -type f | wc -l
   34212
$ du -sh node_modules
412M    node_modules
$ vite build
vite v5.4.10 building for production...
✓ 1422 modules transformed.
dist/index.html                  0.45 kB
dist/assets/index-D8x2n9zL.css   42.10 kB
dist/assets/index-Cj9W1_mI.js    1,240.22 kB │ gzip: 380.14 kB

(!) Some chunks are larger than 500 kB after minification. 
Consider using dynamic import() to break them down.
Build finished in 14.2s.

One point two megabytes of JavaScript. To show a list of users and a search bar. I’ve seen operating systems smaller than this bundle. This is what happens when you let people who think “pointers” are a type of dog write software.

LOGISTICAL COLLAPSE: THE DEPENDENCY INFESTATION

I opened the package.json. It’s a horror story. We’re on React 18.3.1, which is fine, I guess, if you enjoy a library that thinks it’s a framework. But then I see it: lodash, moment, axios, styled-components, framer-motion, react-query, redux-toolkit, and something called left-pad-ultimate.

Why do we have axios? Node 22 ships a native fetch. Why is moment here? It’s 2024. Intl.RelativeTimeFormat exists in every runtime we target. We are pulling in 200KB of date-parsing logic to display “2 days ago.” I could write a C++ template to calculate relative time in four lines of assembly-adjacent code that would execute in three clock cycles. Instead, we’re shipping a library that includes support for the Julian calendar.
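For the record, here is roughly what replacing moment looks like. This is a sketch; the `timeAgo` helper and its unit cutoffs are my own invention, not anything in this repo, but the Intl API it leans on is standard.

```typescript
// Relative time without moment: lean on the built-in Intl API.
const rtf = new Intl.RelativeTimeFormat("en", { numeric: "auto" });

function timeAgo(then: Date, now: Date = new Date()): string {
  const seconds = Math.round((then.getTime() - now.getTime()) / 1000);
  // Pick the largest unit that fits. These cutoffs are my choice, not a spec.
  const units: [Intl.RelativeTimeFormatUnit, number][] = [
    ["year", 31536000],
    ["month", 2592000],
    ["day", 86400],
    ["hour", 3600],
    ["minute", 60],
  ];
  for (const [unit, size] of units) {
    if (Math.abs(seconds) >= size) {
      return rtf.format(Math.round(seconds / size), unit);
    }
  }
  return rtf.format(seconds, "second");
}
```

Zero dependencies, and the locale data ships with the runtime instead of with your bundle.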

The node_modules folder is 34,000 files. Every time you run npm install, you are performing a distributed denial of service attack on your own hard drive. The sheer amount of cruft is staggering. I ran a grep to see how many times we’re importing React:

$ grep -r "import React" src | wc -l
214

Since React 17 and the new JSX transform, you don’t even need to import React for JSX. But here we are, wasting bytes on every single file because nobody bothers to read the changelog. They just copy-paste the same boilerplate until the project becomes a sentient mass of spaghetti.

ANATOMY OF A CRIME: THE 500-LINE GOD COMPONENT

I found it. src/components/UserDashboard.tsx. Five hundred and twelve lines of pure, unadulterated chaos. It’s a “God Component.” It fetches data, it filters data, it handles three different modals, it manages its own complex state, and it even has inline SVG icons that are 50 lines long each.

// A snippet of the nightmare
export const UserDashboard = () => {
  const [users, setUsers] = useState([]);
  const [filteredUsers, setFilteredUsers] = useState([]);
  const [searchTerm, setSearchTerm] = useState("");
  const [isLoading, setIsLoading] = useState(false);
  const [error, setError] = useState(null);
  const [isModalOpen, setIsModalOpen] = useState(false);
  const [selectedUser, setSelectedUser] = useState(null);
  // ... 15 more useStates

Look at this. users and filteredUsers. We are duplicating the entire data set in the heap. If the API returns 1,000 users, we have 2,000 objects floating around in memory. This is a basic violation of the “Single Source of Truth” principle, but more importantly, it’s a waste of the user’s L3 cache.

In a real language, you’d have a pointer to the original data and a view or an iterator for the filtered results. In JavaScript, we just copy everything and hope the garbage collector isn’t busy.
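The JavaScript equivalent of that “view” is just a pure derivation. A sketch, with a hypothetical `filterUsers` helper; the point is that the filtered list is computed from the one source array, never stored alongside it:

```typescript
// Derive the filtered view from the single source array on demand.
// No second copy lives in state.
function filterUsers<T extends { name: string }>(
  users: readonly T[],
  term: string,
): T[] {
  const needle = term.toLowerCase();
  return users.filter((u) => u.name.toLowerCase().includes(needle));
}
```

Call it in the render path (or inside a useMemo) and the `filteredUsers` state variable, its setter, and the effect that syncs them all disappear.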

THE RE-RENDER CASCADE: A DEATH BY A THOUSAND HOOKS

Then we get to the useEffect blocks. This is where the real “re-render hell” begins.

useEffect(() => {
  setIsLoading(true);
  fetchUsers().then(data => {
    setUsers(data);
    setIsLoading(false);
  });
}, []);

useEffect(() => {
  const results = users.filter(user => 
    user.name.toLowerCase().includes(searchTerm.toLowerCase())
  );
  setFilteredUsers(results);
}, [searchTerm, users]);

Do you see it? Every time searchTerm changes—which is every single keystroke—the second useEffect fires. It triggers setFilteredUsers. That triggers a re-render of the entire 500-line component. Because the component is so massive, React has to reconstruct the entire Virtual DOM tree for the dashboard, the sidebar, the header, and the footer, just to figure out that only one text node changed.

This is “shaving the yak” at a professional level. We are performing O(N) string operations on every keystroke and then forcing the browser to do a full layout pass. On a mobile device, this makes the UI feel like it’s stuck in molasses. The fans on my MacBook Pro started spinning just looking at this code.

If we are going to adhere to what the industry calls React best practices, we need to stop treating the browser like it has infinite registers and start respecting the call stack. The filtering should be a useMemo calculation, not a state-driven effect.

const filteredUsers = useMemo(() => {
  return users.filter(user => 
    user.name.toLowerCase().includes(searchTerm.toLowerCase())
  );
}, [searchTerm, users]);

But even useMemo has overhead. You’re allocating a dependency array, storing it, and performing a shallow comparison on every render. In C++, I’d just use a std::string_view and a predicate. Here, I have to beg the library not to destroy the frame rate.

STATE MANAGEMENT AS A PYRAMID SCHEME

I looked further down. The component is wrapped in three different Context Providers. One for themes, one for auth, and one for “Global UI State.”

Every time the “Global UI State” changes—say, when a notification pops up—the UserDashboard re-renders. Why? Because useContext is a blunt instrument. It doesn’t care if the component only needs the isSidebarOpen boolean; if anything in the context object changes, the whole world burns.

We’ve replaced the simplicity of passing a pointer with a complex subscription model that nobody understands. People use Redux or Zustand because they’re afraid of props, but then they end up with a dependency graph that looks like a bowl of wet noodles.

I saw a dispatch call inside a useEffect that was triggered by a prop change. That dispatch updated the store, which updated the prop, which triggered the useEffect again. It was a circular dependency that only stopped because of a Math.random() check. I’m not joking. Someone actually wrote if (Math.random() > 0.5) to “throttle” an infinite loop. I need a drink.

THE MEMOIZATION FALLACY: SHAVING THE YAK IN VIRTUAL SPACE

“Just use useCallback,” they said. “It’ll be faster,” they said.

I found this gem in the middle of the God Component:

const handleUserClick = useCallback((id) => {
  console.log("User clicked:", id);
  setSelectedUser(users.find(u => u.id === id));
  setIsModalOpen(true);
}, [users]);

This is a classic example of memoization cargo culting. The developer thinks they are saving performance by wrapping this function in useCallback. But look at the dependency array: [users]. Every time the user list is fetched or filtered (which we already established happens constantly), the users array reference changes. This means the useCallback is invalidated and the function is recreated anyway.

You are paying the price of the memoization—the memory for the closure and the overhead of the dependency check—and getting absolutely zero benefit. It’s like buying a gym membership and only using it to buy pizza at the snack bar.

And don’t get me started on React.memo. They wrapped the UserItem component in memo, but they pass it an inline object for styles:

<UserItem 
  user={user} 
  style={{ color: 'red' }} 
  onClick={() => handleUserClick(user.id)} 
/>

In JavaScript, {} !== {}. Every time the parent renders, a new style object is created. Every time the parent renders, a new anonymous function is created for onClick. The React.memo check fails every single time. We are performing a shallow comparison of props that we know will be different, just to satisfy some linter rule written by someone who doesn’t understand how memory allocation works in V8.
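You can watch the comparison fail in plain JavaScript, no React required. A sketch with hypothetical names; `renderProps` stands in for what the parent produces on each render:

```typescript
// Inline literals get a fresh reference on every call (i.e. every render);
// a hoisted constant keeps the same reference forever.
const RED_STYLE = { color: "red" };

function renderProps() {
  return { inline: { color: "red" }, hoisted: RED_STYLE };
}

const first = renderProps();
const second = renderProps();

// A shallow comparison (which is all React.memo does) sees `inline` as
// "changed" on every render, and `hoisted` as unchanged.
console.log(first.inline === second.inline);   // false
console.log(first.hoisted === second.hoisted); // true
```

Hoist the constant, memoize the handler, and the memo check can actually pass.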

SURGICAL INTERVENTION: REFACTORING FOR THE BARE METAL

I couldn’t take it anymore. I started the refactor. I’m not using Vite 5.4’s fancy features; I’m using common sense.

First, I broke the God Component into pieces. Not because I like “clean code”—I hate that term—but because I want to limit the scope of the re-renders. If the search bar is its own component with its own local state, typing in it won’t force the entire user list to re-reconcile.

Second, I killed the filteredUsers state. It’s a derived value. It belongs in the render logic, or at most, a useMemo if the filtering logic is computationally expensive (which it isn’t, it’s just string matching).

Third, I replaced the inline SVGs with a single sprite sheet. Why were we shipping 50KB of XML inside our JavaScript bundle? It’s a waste of the parser’s time.

// Refactored SearchInput.tsx
const SearchInput = ({ onSearch }: { onSearch: (val: string) => void }) => {
  const [value, setValue] = useState("");

  // Memoize the debounced function so every keystroke reuses the same timer.
  // Calling debounce() inside the handler would create a fresh timer each
  // time and debounce nothing.
  const debouncedSearch = useMemo(() => debounce(onSearch, 150), [onSearch]);

  const handleChange = (e: React.ChangeEvent<HTMLInputElement>) => {
    setValue(e.target.value);
    debouncedSearch(e.target.value);
  };

  return <input value={value} onChange={handleChange} />;
};

I implemented a simple debounce. We don’t need to filter the list 60 times a second while the user is typing “Smith.” We can wait 150ms. That’s 150ms where the CPU can actually sleep, or at least do something useful like background garbage collection.
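The debounce itself is ten lines of TypeScript, no lodash required. A trailing-edge sketch; the name and the shape of the API are my choices:

```typescript
// Trailing-edge debounce: of all calls made within `wait` ms of each other,
// only the last one actually fires.
function debounce<A extends unknown[]>(
  fn: (...args: A) => void,
  wait: number,
): (...args: A) => void {
  let timer: ReturnType<typeof setTimeout> | undefined;
  return (...args: A) => {
    clearTimeout(timer);
    timer = setTimeout(() => fn(...args), wait);
  };
}
```

One closure, one timer handle. That is the entire dependency we were importing a library for.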

I also looked at the UserList component. Instead of rendering 1,000 DOM nodes, I implemented a basic virtualization window. Why render what the user can’t see? In C++, if I had a buffer of 1,000,000 elements, I wouldn’t try to draw them all to the screen at once. I’d calculate the offset and only draw the visible range. Why is this concept so foreign to web developers?
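The window math is the same offset calculation you’d do with a raw buffer. A sketch; `visibleRange` and the 3-row overscan are my assumptions, not the actual UserList code:

```typescript
// Compute which rows of a fixed-row-height list intersect the viewport,
// plus a few overscan rows so fast scrolling doesn't flash blank space.
function visibleRange(
  scrollTop: number,
  viewportHeight: number,
  rowHeight: number,
  totalRows: number,
  overscan = 3,
): { start: number; end: number } {
  const start = Math.max(0, Math.floor(scrollTop / rowHeight) - overscan);
  const end = Math.min(
    totalRows,
    Math.ceil((scrollTop + viewportHeight) / rowHeight) + overscan,
  );
  // Render rows [start, end), absolutely positioned at start * rowHeight.
  return { start, end };
}
```

With a 600px viewport and 30px rows, you render ~26 DOM nodes instead of 1,000, no matter how long the list gets.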

THE COST OF ABSTRACTION: A FINAL POST-MORTEM

We’ve reached a point where the “developer experience” has completely cannibalized the “user experience.” We use these massive frameworks and libraries because they make it “easier” to build things, but we’ve forgotten what we’re actually building.

We are building instructions for a processor. That processor doesn’t care about your “hooks” or your “functional purity.” It cares about instructions per clock. It cares about cache hits. It cares about memory bandwidth.

Every time you add a dependency because you’re too lazy to write a 10-line utility function, you are adding latency. Every time you ignore a re-render because “computers are fast now,” you are draining someone’s battery.

I finished the refactor. The bundle size dropped from 1.2MB to 180KB. The build time went from 14 seconds to 2 seconds. The node_modules folder is still a disaster, but at least the code we’re shipping isn’t a personal insult to the silicon it runs on.

$ vite build
✓ 84 modules transformed.
dist/assets/index-A1b2C3d4.js    182.10 kB │ gzip: 45.20 kB
Build finished in 2.1s.

It’s better, but it’s still JavaScript. It’s still running in a sandbox, managed by a runtime that thinks it knows better than I do about when to free memory. But for now, the fans have stopped spinning. I’m going to go look at some assembly code just to remind myself that the world can be sane.

If you’re going to work in this codebase, remember: the CPU is not your friend. It is a resource you are exploiting. Try to be a little less greedy. Stop the re-render hell. Kill the cruft. And for the love of all that is holy, stop using useEffect for things that can be done in a simple function call.

We’re done here. Don’t call me for the next “sprint.” I’ll be in the server room, listening to the hum of hardware that actually does what it’s told.
