“Look at this, kid. Look at the terminal. I just ran du -sh node_modules on this ‘boilerplate’ you pulled from GitHub. One point four gigabytes. For a landing page with a contact form. Do you have any idea what I could do with 1.4 gigabytes of addressable memory in 1994? I could have mapped the entire genome of a small mammal. I could have simulated a fluid dynamics model for a jet engine. And here you are, using it to store three different versions of lodash and a CSS-in-JS library that requires its own runtime parser. Why?”
The Junior Developer shifts uncomfortably, his brand-new mechanical keyboard glowing with a “vortex” RGB pattern that makes my eyes ache. “But it’s about the ecosystem, Sarge. It’s about developer velocity. We’re using React 18.3.1. It’s got Concurrent Mode, Transitions, and the new use hook. It makes the UI feel… snappy.”
“Snappy?” I point to the monitor where top is running. “Look at the PID for your Node.js v20.11.0 process. Look at the RES column. 842 Megabytes. Your ‘snappy’ UI is currently eating more RAM than the entire operating system, the window manager, and the compiler combined. You’re not building software; you’re building a landfill.”
$ top -p 44210
Tasks: 1 total, 0 running, 1 sleeping, 0 stopped, 0 zombie
%Cpu(s): 12.4 us, 4.2 sy, 0.0 ni, 83.4 id, 0.0 wa, 0.0 hi, 0.0 si, 0.0 st
MiB Mem : 32142.4 total, 14210.1 free, 842.6 used, 17089.7 buff/cache
MiB Swap: 2048.0 total, 2048.0 free, 0.0 used. 30842.1 avail Mem
PID USER PR NI VIRT RES SHR S %CPU %MEM TIME+ COMMAND
44210 greybeard 20 0 4210.4m 842.6m 42.1m S 18.6 2.6 0:44.12 node
“But the Virtual DOM handles the optimization for us!” the Junior protests. “It only updates what needs to change. It’s efficient!”
I sigh, the sound of a man who has spent too many nights debugging race conditions in interrupt handlers. “Sit down, kid. We’re going to have a talk about what’s actually happening under that hood of yours. And no, it’s not ‘magic’.”
The Illusion of the Virtual DOM
“You think the Virtual DOM is a performance feature,” I begin, leaning back into my creaky chair. “That’s the first lie they told you in that bootcamp. The Virtual DOM is a developer convenience feature. It exists because you lot forgot how to manage state, so you decided it was easier to just re-render everything and let a massive, complex algorithm figure out the difference. Do you know the computational complexity of the reconciliation algorithm?”
“It’s… linear?” he guesses.
“It’s O(n) because they use heuristics,” I snap. “If they did a true tree-diffing algorithm, it would be O(n^3). But even at O(n), you’re still creating an entire tree of JavaScript objects on every single render. Every time a user types a single character into a text box, you’re triggering a function that generates thousands of ‘Fiber’ nodes. Each one of those is an object. Each object has a cost. Each object has a memory address. Each object needs to be tracked by the garbage collector.”
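The heuristic the greybeard is ranting about fits in a few lines. This is an illustrative simplification, not React’s source: the same two rules (a different element type replaces the whole subtree without recursing; the same type diffs props and recurses into children by position), with none of the Fiber machinery.

```javascript
// Minimal sketch of the O(n) reconciliation heuristic (hypothetical
// simplification, not React's actual reconciler).
// Rule 1: different types => replace the entire subtree, never recurse.
// Rule 2: same type => shallow-diff props, then recurse children by index.
function diff(prev, next, ops = [], path = "root") {
  if (prev == null) { ops.push({ op: "create", path }); return ops; }
  if (next == null) { ops.push({ op: "remove", path }); return ops; }
  if (prev.type !== next.type) {
    ops.push({ op: "replace", path }); // whole subtree thrown away
    return ops;
  }
  const keys = new Set([...Object.keys(prev.props), ...Object.keys(next.props)]);
  for (const key of keys) {
    if (prev.props[key] !== next.props[key]) ops.push({ op: "setProp", path, key });
  }
  const len = Math.max(prev.children.length, next.children.length);
  for (let i = 0; i < len; i++) {
    diff(prev.children[i], next.children[i], ops, `${path}.${i}`);
  }
  return ops;
}

const prev = { type: "button", props: { color: "grey" }, children: [] };
const next = { type: "button", props: { color: "blue" }, children: [] };
console.log(diff(prev, next)); // one setProp op, found by walking the tree
```

Note what makes this O(n) rather than O(n³): it only ever compares nodes at the same position, so every node is visited once. The cost the greybeard objects to is that the walk happens on every render, whether anything changed or not.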
I pull up a heap snapshot on the second monitor.
Snapshot 1: 142.1 MB
(Object) ReactFiberConfig: 12,402 instances
(Object) FiberNode: 45,892 instances
(String): 82,104 instances
(Array): 15,200 instances
“Look at those Fiber nodes, kid. 45,000 of them. In React 18.3.1, a Fiber node isn’t just a simple struct. It’s a massive object with properties like return, child, sibling, index, pendingProps, memoizedProps, updateQueue, and memoizedState. In C, I could represent a UI node in 64 bytes. Your Fiber nodes are hundreds of bytes each, plus the overhead of the V8 heap management. You’re chasing pointers across the entire memory space just to decide whether a ‘Submit’ button should be blue or grey. That’s not efficiency. That’s a tragedy.”
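You can eyeball the per-node cost yourself. The sketch below is a rough, back-of-the-envelope measurement: `makeFiberLike` is a hypothetical stand-in shaped like the Fiber fields listed above, and the bytes-per-node figure will vary with the V8 version, pointer compression, and how warm the heap is.

```javascript
// Rough illustration (not React's code): allocate 50,000 objects shaped
// like Fiber nodes and watch the V8 heap grow. Run with --expose-gc for
// a cleaner baseline; the per-node figure is approximate either way.
function makeFiberLike(i) {
  return {
    return: null, child: null, sibling: null, index: i,
    pendingProps: { value: i }, memoizedProps: null,
    updateQueue: null, memoizedState: null,
  };
}

if (global.gc) global.gc(); // only available under --expose-gc
const heapBefore = process.memoryUsage().heapUsed;
const nodes = [];
for (let i = 0; i < 50_000; i++) nodes.push(makeFiberLike(i));
const heapAfter = process.memoryUsage().heapUsed;
console.log(`~${((heapAfter - heapBefore) / nodes.length).toFixed(0)} bytes per node`);
```

Even as a ballpark, it makes the point: each node is an order of magnitude heavier than a packed 64-byte C struct, before counting the GC bookkeeping that tracks it.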
Dependency Hell and the Death of the CPU
“But we need those dependencies,” the Junior says, his voice gaining a desperate edge. “We’re using framer-motion for the animations, react-query for the data fetching, and zod for the schema validation. They’re industry standards!”
“Industry standards for what? Bloat?” I run npm list --depth=0.
$ npm list --depth=0
├── @tanstack/[email protected]
├── [email protected]
├── [email protected]
├── [email protected]
├── [email protected]
├── [email protected]
├── [email protected]
└── [email protected]
“Look at this list. You think you have seven dependencies. But let’s look at the actual tree. npm list | wc -l. One thousand, four hundred and twelve. You have 1,412 separate packages in your node_modules. Each one of those is a potential security vulnerability, a point of failure, and a drain on the CPU. When you run your dev server, Node.js v20.11.0 has to parse, compile, and execute all of that.
The CPU isn’t a magical thinking machine, kid. It’s a series of registers and caches. When you have a dependency tree this deep, you are constantly blowing out your L1 and L2 caches. You’re forcing the CPU to jump all over the RAM, waiting for data to arrive from the slow main memory because your ‘modern’ stack is too big to fit in the cache. You’re spending 90% of your clock cycles just moving data around and 10% actually doing work. It’s like trying to build a house by hiring 1,400 consultants to tell you how to hammer a nail.”
“But the DX is so good!” he cries. “I can change a line of code and see it update instantly with Hot Module Replacement!”
“Hot Module Replacement is just another layer of complexity to hide the fact that your build process is too slow,” I counter. “In the time it takes for your Webpack or Vite to ‘hot reload’ a simple change, I could have recompiled an entire C kernel from scratch. You’ve traded fundamental understanding for a feedback loop that feels fast but produces slow results.”
The Garbage Collector’s Lament
“Now, let’s talk about why the server is currently choking,” I say, pointing to the spiking CPU graph. “We’re seeing a memory leak in the production build. Do you know what happens to all those objects you create during reconciliation?”
“The Garbage Collector cleans them up?”
“Eventually. But the GC isn’t free. In Node.js v20.11.0, the V8 engine uses a generational garbage collector. It has a ‘Young Generation’ for new objects and an ‘Old Generation’ for long-lived ones. When you’re spamming the heap with thousands of Fiber nodes and temporary state objects every second, you’re forcing the ‘Scavenger’ to run constantly.
When the Scavenger can’t keep up, it triggers a ‘Mark-Sweep’ or ‘Mark-Compact’ cycle. That’s a ‘stop-the-world’ event. The entire execution of your program pauses while V8 crawls through the heap to find what’s still alive. Look at the GC logs I enabled.”
[44210:0x55f1a20] 44102 ms: Scavenge 142.4 (158.2) -> 138.1 (160.2) MB, 4.2 / 0.0 ms (average idle time 0.0 ms, TLB flush 0.0 ms)
[44210:0x55f1a20] 44510 ms: Mark-sweep 138.1 (160.2) -> 120.4 (155.1) MB, 18.4 / 0.0 ms (average idle time 5.0 ms, TLB flush 0.0 ms)
“See that? 18.4 milliseconds. That doesn’t sound like much to you, does it? But in 18.4 milliseconds, a 3GHz CPU could have executed 55 million instructions. Instead, it sat there, twiddling its thumbs, while V8 tried to figure out if your useEffect cleanup function was still reachable. And because you’re using closures for everything, you’re creating ‘retained paths’ that the GC can’t break. You’re holding onto memory you don’t need because you don’t understand how the scope chain works in JavaScript.”
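The “retained path” the greybeard means looks like this. A hypothetical sketch: a closure that references a large buffer keeps the whole buffer alive for as long as the closure lives, even if it only ever needed one number from it.

```javascript
// Sketch of a closure "retained path": `leaky` references the 10 MB buffer,
// so the GC cannot reclaim it while `leaky` is reachable. `lean` captures
// only the primitive it needs, so the buffer can be collected.
function makeHandler() {
  const bigBuffer = new Uint8Array(10 * 1024 * 1024); // 10 MB
  const length = bigBuffer.length;

  const leaky = () => bigBuffer.length; // retains all 10 MB
  const lean = () => length;            // retains one number

  return { leaky, lean };
}

const h = makeHandler();
console.log(h.leaky() === h.lean()); // same answer, very different heap cost
```

Keep only `lean` around (say, as a long-lived event handler) and the buffer is collectable; keep `leaky` around and it never is. That is exactly the kind of path a Mark-Sweep cycle has to trace.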
Synthetic Events and the Abstraction Tax
“Let’s talk about how you handle a simple click,” I continue. “In my day, a click was an interrupt. The hardware sent a signal, the OS caught it, and the application responded. In your world, a click is a ‘Synthetic Event’.”
“But Synthetic Events are great!” the Junior says. “They provide a consistent interface across different browsers. I don’t have to worry about cross-browser compatibility!”
“Cross-browser compatibility was a problem in 2008, kid. Today, it’s a solved problem that you’re still paying a tax for. When a user clicks a button in React 18.3.1, the event doesn’t go to the button. It bubbles up to the root of the document where React has a single event listener. Then, React creates a ‘SyntheticBaseEvent’ object—more memory, more GC pressure—and starts its own internal propagation system.
It has to look up the Fiber tree to see which components have onClick props. It has to simulate the bubbling and capturing phases. It’s an entire event system written in JavaScript, running on top of the event system already built into the browser. It’s an abstraction on top of an abstraction. Why? Because you’re afraid of the DOM. You’ve been taught that the DOM is ‘slow’. The DOM isn’t slow. Your framework is slow because it spends all its time trying to avoid the DOM.”
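Delegation itself is simple; that is the greybeard’s point. Here is a framework-free sketch of the bubbling walk, using plain objects in place of DOM nodes so it runs outside a browser. One listener at the root, and the “propagation system” is just walking parent pointers.

```javascript
// Framework-free event delegation sketch (plain objects stand in for DOM
// nodes). Dispatch walks from the target up to the root -- the bubbling
// phase -- firing any handler it finds along the way.
function dispatch(target, type) {
  const fired = [];
  for (let node = target; node !== null; node = node.parent) {
    const handler = node.handlers[type];
    if (handler) {
      handler();
      fired.push(node.name);
    }
  }
  return fired;
}

const root = { name: "root", parent: null, handlers: { click: () => {} } };
const button = { name: "button", parent: root, handlers: { click: () => {} } };

console.log(dispatch(button, "click")); // fires on the button, then the root
```

The browser already does this walk in native code. A synthetic event system repeats it in JavaScript, with a freshly allocated event object each time.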
I show him a flame graph of a single click handler. “Look at the call stack. It’s fifty levels deep. dispatchEvent, processEventQueue, executeDispatches, invokeGuardedCallback. All of that just to toggle a boolean. It’s like calling a congressional hearing to decide which socks to wear.”
The Hook Dependency Trap
“And then we get to the hooks,” I say, my voice dripping with disdain. “The ‘modern’ way to manage state. useState, useEffect, useMemo, useCallback. Do you know how these are implemented?”
“They’re… just functions?”
“They’re entries in a linked list attached to the Fiber node. And they rely entirely on the order of execution. If you put a hook inside an if statement, the whole thing explodes. That’s the first sign of a fragile architecture. But the real crime is the dependency array.
Look at your code here: useEffect(() => { ... }, [data, user, settings]). Every time this component renders, React has to iterate through that array and perform a shallow comparison on every element. Object.is(oldData, newData). If you’re passing in an object that you recreated in the parent component, the comparison fails, and the effect runs again.
So what do you do? You wrap the parent object in useMemo. But useMemo also has a dependency array. So you wrap those dependencies in useMemo. It’s useMemo all the way down. You’re manually doing the work that a compiler should be doing, and you’re doing it at runtime, on the user’s machine. You’re burning battery life to save yourself from having to think about object identity.”
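The shallow check the greybeard describes can be reproduced in plain JavaScript. This is not React’s source, but it follows the same rule the hooks use: `Object.is` on each slot of the dependency array, nothing deeper.

```javascript
// Sketch of the per-slot shallow dependency comparison hooks perform on
// every render (illustrative; React's internal version adds warnings etc.).
function depsChanged(prevDeps, nextDeps) {
  if (prevDeps === null) return true; // first render: always run
  return nextDeps.some((dep, i) => !Object.is(dep, prevDeps[i]));
}

// Two renders producing identical data, but the parent rebuilt `settings`:
const render1 = { user: "ada", settings: { theme: "dark" } };
const render2 = { user: "ada", settings: { theme: "dark" } }; // new object, same data

// Object dependency: identity differs, so the effect re-runs.
console.log(depsChanged([render1.user, render1.settings],
                        [render2.user, render2.settings])); // true

// Primitive dependency: compared by value, so it stays stable.
console.log(depsChanged([render1.user, render1.settings.theme],
                        [render2.user, render2.settings.theme])); // false
```

Passing primitives instead of freshly built objects sidesteps the whole `useMemo` cascade, because `Object.is("dark", "dark")` is true while two identical object literals never are.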
“But it prevents unnecessary re-renders!” the Junior argues.
“No,” I bark. “It attempts to prevent unnecessary re-renders while adding the overhead of memoization caches and comparison logic. You’re trading CPU cycles for memory, but you’re doing it so inefficiently that you’re losing on both fronts. In a real language, I’d just compare two pointers. In your world, you’re comparing entire object graphs and hoping the V8 engine can optimize the hidden classes fast enough.”
Fiber Nodes and the Memory Graveyard
“Let’s get technical for a second. Let’s talk about the ‘Concurrent Mode’ in React 18.3.1. You think it’s about making things faster. It’s actually about making things slower but more ‘interruptible’. React now has the ability to pause a render, do something else, and come back to it later.
To do this, it maintains two trees: the ‘current’ tree and the ‘workInProgress’ tree. When you start an update, React clones the Fiber nodes from the current tree into the workInProgress tree. That’s double the memory. If the update is high priority, it finishes and swaps the trees. If it’s low priority, it can be interrupted.
But while it’s interrupted, all those workInProgress nodes are sitting in the heap, taking up space, holding onto references. If you have a lot of ‘Transitions’ happening, you can end up with multiple versions of your UI state floating around in memory at the same time.
And because React 18 uses ‘Lanes’ for priority, the logic for deciding what to render and when is incredibly complex. It’s a bitmask-based priority system. In C, bitmasks are fast. In JavaScript, you’re still dealing with the overhead of the engine’s number representation. You’re trying to do low-level scheduling in a high-level, garbage-collected language. It’s like trying to perform heart surgery while wearing oven mitts.”
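To make the Lanes idea concrete, here is a toy version of a bitmask priority scheme. This is purely illustrative: React’s real model uses 31 lanes and far more entanglement rules, but the core trick, lowest set bit wins, is the same.

```javascript
// Toy lane-style bitmask scheduler (illustrative only; the lane names and
// values here are made up, not React's actual constants).
const SyncLane       = 0b001; // e.g. a click handler's update
const TransitionLane = 0b010; // startTransition work
const IdleLane       = 0b100; // offscreen / idle work

// The lowest set bit is the highest-priority pending lane.
function highestPriorityLane(pendingLanes) {
  return pendingLanes & -pendingLanes;
}

let pending = TransitionLane | IdleLane; // low-priority work queued
pending |= SyncLane;                     // urgent update arrives mid-render

console.log(highestPriorityLane(pending) === SyncLane); // sync work preempts
```

The bit twiddling is cheap; the cost the greybeard objects to is everything around it: cloned work-in-progress trees kept alive while low-priority lanes wait their turn.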
I point to a specific part of the heap snapshot. “See this? Detached Window. That’s a memory leak. Somewhere in your ‘snappy’ UI, a component is unmounting but leaving behind a listener or a reference in a closure. Because React’s Fiber tree is so interconnected, a single leaked node can pull an entire branch of the tree with it. You’re leaking megabytes of data every time the user navigates between pages. And you didn’t even notice because you have 32GB of RAM on your dev machine. But the user on a five-year-old Android phone? Their browser is going to crash in ten minutes.”
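The leak pattern behind that detached node is the classic mount-without-cleanup. A sketch, with a hand-rolled listener bus standing in for `window.addEventListener` so it runs anywhere:

```javascript
// Sketch of the leak: a listener registered on "mount" but never removed
// keeps the unmounted component's state reachable forever. The `bus`
// object is a stand-in for window/document event registration.
const listeners = new Set();
const bus = {
  on(fn) { listeners.add(fn); },
  off(fn) { listeners.delete(fn); },
  count() { return listeners.size; },
};

function mount() {
  const state = { big: new Array(1000).fill("data") }; // component state
  const onResize = () => state.big.length;             // closure retains `state`
  bus.on(onResize);
  // The fix: return a cleanup function (what a useEffect return value is for).
  return () => bus.off(onResize);
}

const unmount = mount();
console.log(bus.count()); // 1 -- `state` is retained through the closure
unmount();
console.log(bus.count()); // 0 -- nothing references `state`; GC can reclaim it
```

Skip the `unmount()` call and `state`, plus everything it references, stays pinned in the Old Generation for the life of the page. Multiply by every navigation and you get the leak on the CPU graph.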
The Final Reckoning: Why We Can’t Have Nice Things
“You see, kid,” I say, softening my tone just a fraction, “the problem isn’t that React is ‘bad’. The problem is that it’s an architectural waste that has become the default. We’ve stopped teaching people how computers work. We teach them how frameworks work.
You know how to use useContext, but you don’t know what a cache miss is. You know how to use Styled Components, but you don’t know how the browser’s CSS engine actually calculates layout. You’re building on top of a precarious stack of cards, and you’re just adding more cards and calling it ‘progress’.
Every layer of abstraction—Node.js, V8, React, the Virtual DOM, Synthetic Events, Hooks—is a tax. A tax on performance, a tax on memory, and a tax on the user’s patience. We used to write software that respected the hardware. Now we write software that treats the hardware as an infinite resource to be consumed.
But the hardware isn’t infinite. The CPU has limits. The memory bus has limits. And the garbage collector… the garbage collector always comes for its due.
Now, look at this code again. Do we really need a 1.4GB node_modules folder to show a list of users? Or could we just… I don’t know… write some HTML and a little bit of JavaScript?”
The Junior Developer looks at his screen, then back at me. “But… how would I manage the state without Redux or React Query?”
I close my eyes and rub my temples. “State? You mean data? You manage it by putting it in a variable. And when the data changes, you update the DOM. It’s been possible since 1995. It’s fast, it’s light, and it doesn’t require an 18.4 ms ‘stop-the-world’ garbage collection cycle.”
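For the record, the 1995 approach he means looks like this. A minimal sketch, written without a real DOM so it runs anywhere: `fakeNode` is a stand-in for an element you would normally grab with `document.querySelector`.

```javascript
// State is a variable; changing it updates the view directly.
// No diffing, no fiber tree, no scheduler.
function createCounter(renderTarget) {
  let count = 0; // the state. That's it.
  const render = () => { renderTarget.textContent = `Count: ${count}`; };
  render();
  return {
    increment() { count += 1; render(); }, // update state, touch the DOM once
    get count() { return count; },
  };
}

const fakeNode = { textContent: "" }; // stands in for a real DOM element
const counter = createCounter(fakeNode);
counter.increment();
counter.increment();
console.log(fakeNode.textContent); // "Count: 2"
```

One allocation for the closure, one DOM write per change. Nothing for the Scavenger to chew on.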
I stand up and head toward the door of the server room. “Fix the leak, kid. And for the love of the silicon, stop using useMemo for things that are just basic math. The CPU can multiply two numbers without you ‘optimizing’ it into a heap allocation.”
As I walk away, I hear him typing. Probably searching StackOverflow for “how to optimize React 18 memory leak.” He doesn’t get it. He’ll never get it. The abstraction has him now. He’s just another pointer in the heap, waiting to be collected.