Fatih's Personal Blog

Do we really need node_modules or npm?

April 09, 2020 · 3 minutes to read

I remember thinking that the way we’re doing JavaScript is complex, but that we don’t have any choice. What we’ve been doing for the last few years is downloading a lot of JavaScript modules from npm into our node_modules folder, then transforming and bundling them for browsers with webpack and Babel. This was necessary because browsers didn’t support the new features, most importantly modules, and because sending a lot of separate files to the browser was inefficient, so we transformed and bundled ahead of time.

Many popular browsers now support the crucial features, including modules, and HTTP/2 makes it more efficient to send a bunch of files. But we’re stuck with the old ways, and we’re paying the price for what made sense at the time. As it turns out, putting all your JavaScript in one bundle is not that efficient either: you’re sending non-essential code, which makes the page load and parse more slowly and hurts the user experience. Caching is also hard, since the whole bundle is invalidated by even the tiniest change. Now we’re seeing more granular approaches, like route-specific bundling, which should alleviate these problems.
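To make route-specific bundling concrete, here is a minimal sketch using a dynamic import(); the route module path and function names are made up for illustration.

```js
// A minimal sketch of route-level code splitting with dynamic import().
// "./routes/about.js" and renderAbout are hypothetical names.
async function showAbout() {
  // The browser only fetches this module when the route is visited,
  // so the initial page load carries none of its code.
  const { renderAbout } = await import("./routes/about.js");
  renderAbout(document.querySelector("#app"));
}

document.querySelector("a[href='/about']").addEventListener("click", (event) => {
  event.preventDefault();
  showAbout();
});
```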

When I think of the best chunking approach, the one that would be the most cacheable, it’s no surprise that sending modules separately seems to be it. If we send lodash separately, for instance, it will stay cached until we bump the version, and all subsequent visits will hit the cache. How about the initial load? It would be best if browsers were aware of bundles, unbundled them and cached the individual modules, but that doesn’t seem to be on any roadmap. So either you develop intricate strategies to build bundles of “just fine” size, or you trust HTTP/2 and optimize for returning users. And it’s not an either/or scenario: you can use server rendering and graceful degradation so users don’t need to wait for scripts to load and parse to experience the basic functionality of your website.
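As an illustration of that caching argument (the URLs and version numbers are only examples), compare a single hashed bundle with per-module, version-pinned URLs:

```js
// Illustrative only: how per-module, versioned URLs affect caching.

// One big bundle: its hash changes after any edit anywhere in the app,
// so a returning visitor re-downloads everything.
//   <script src="/bundle.3f9a1c.js"></script>

// Separate modules: each URL is pinned to a version, so it stays cached
// until that specific dependency is bumped.
import throttle from "https://unpkg.com/lodash-es@4.17.15/throttle.js";
// After an upgrade, only this one URL (and cache entry) changes:
//   import throttle from "https://unpkg.com/lodash-es@<next-version>/throttle.js";
```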

The way to solve this is to trust the browsers. They are now module-aware, so you can add dependencies without polluting the global scope, and they are much better at downloading multiple resources than before. So my suggestion would be to add script tags to your HTML for the modules you use, served from UNPKG, which is an excellent CDN. You may need a tool to notify you of out-of-date dependencies and bump them for you, but that’s it. This is the vanilla case, which would cover the needs of many projects.
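A minimal sketch of what that vanilla case could look like; the package, version and element names are just examples:

```html
<!doctype html>
<html>
  <body>
    <p id="status">Loading…</p>
    <!-- No npm, no node_modules, no bundler: the dependency is a module script from UNPKG. -->
    <script type="module">
      // Pin exact versions so the URLs (and the browser cache) stay stable.
      import debounce from "https://unpkg.com/lodash-es@4.17.15/debounce.js";

      const report = debounce(() => {
        document.querySelector("#status").textContent =
          `Window is ${window.innerWidth}px wide`;
      }, 250);

      window.addEventListener("resize", report);
      report();
    </script>
  </body>
</html>
```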

The advanced case is a bit more complicated, and for good reason, but the necessary tooling is currently nonexistent. If you plan to build a web app that has server rendering and fast page transitions, with client-side validation and maximum code sharing, then you must have some kind of hybrid solution. What I would prefer is to write the routes of your application in JavaScript, and have a tool that generates the HTML for each route, turns the imports into script tags, and injects the script for page transitions. Once that script is loaded, internal links can be intercepted, so the script and the data requirements for the target route can be fetched without a full-page refresh. The tool must also be able to generate an API route for the data.
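Such a tool doesn’t exist yet, but here is a rough sketch of what the injected page-transition script might do; the /routes and /api URL conventions and the render export are assumptions, not an existing API:

```js
// A rough sketch of the page-transition script a hypothetical route tool could inject.
document.addEventListener("click", async (event) => {
  const link = event.target.closest("a");
  // Only intercept internal, same-origin navigations.
  if (!link || link.origin !== location.origin) return;
  event.preventDefault();

  const path = link.pathname;
  // Fetch the target route's script and its data in parallel,
  // instead of doing a full-page refresh.
  const [route, response] = await Promise.all([
    import(`/routes${path === "/" ? "/index" : path}.js`),
    fetch(`/api${path}`),
  ]);
  const data = await response.json();

  route.render(document.querySelector("#app"), data);
  history.pushState({}, "", path);
});
```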

In this case we can also go with UNPKG and use the hypothetical dependency-tracking tool, but we also need IDE support, and we may want to serve the scripts ourselves. (No, a cache shared across sites does not work anymore.) In that case, using npm and node_modules can be unavoidable, but the bright side is that you get all the niceties of npm. Maybe in the future the tooling will improve so that we can mimic the UNPKG way with simpler tools than npm.

It’s also worth keeping in mind that static generation of routes is an important optimization with a bright future. The logical conclusion is a multi-pass rendering scheme where semi-static routes are partially pre-generated, so unnecessary database calls aren’t made on every request. In that case the database can be a trigger for the build process, but that’s a topic for another note.

Key takeaways:

  • If your website is mostly static, think about adding modules from UNPKG to keep things simple, with a simple tool to manage your dependencies
  • If your website is dynamic, define your routes with JavaScript and React, use server rendering, and let a tool add script tags for your dependencies, create API routes and inject the fast-transition script
  • Seriously consider what you can push to static, and explore multi-pass rendering
