Hi, I'm Node.js and I'll be your server today.

The magical moment has arrived: your startup just got its first round of funding, and you’re ready to crunch out an MVP that will dazzle your users and make investors salivate.

Only, you have no clue what tech stack to use.

You heard that Node.js is pretty popular for building a backend server because “hey, it’s JavaScript on the backend!” and there’s a huge pool of full-stack developers to hire from. But weeks of Reddit-ing and Hacker News-ing later, you’ve amassed an equal number of posts declaring Node “literally the best” and “definitely the worst.”

Optimize for the Right Things™

In truth, your choice of tech stack will rarely make or break your company. Since a new service often starts out as an MVP-style proof-of-concept, your backend server may only see a few hundred users before it’s scrapped and rewritten (or dies if you join the 92% of startups that fail).

So if you are worrying about “Will it scale to handle tons of users?” you may be asking the wrong question. If you have reached a scale where your decision actually matters… congratulations! But unless you are a large service with an established user base, you have time to worry about scaling later. Don’t kill your startup by prematurely optimizing your tech stack. Instead, focus on maximizing developer speed and happiness. Usually, this means leveraging what your team already knows best.

With that in mind, Node.js will often hit the sweet spot: it scales to huge amounts of traffic, and likely your team is already skilled in JavaScript.

What is JavaScript?

To answer this, it will help to understand how Node.js works. At its core, Node is a runtime for executing JavaScript code on a server. Traditionally, JavaScript was run only in the browser, but nowadays you’ll find JavaScript in a lot of places.

JavaScript is a dynamic, lexically scoped, duck-typed scripting language. Practically speaking, this means developers can quickly modify code without recompiling and enjoy exploratory programming, which makes debugging easier. Dynamic scripting languages have traditionally been much slower than their compiled counterparts, but thanks to the browser wars and the Google V8 engine, JavaScript often runs within an order of magnitude of the speed of its native equivalent, and in optimized subsets runs only 50% slower than native code.
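To make “duck-typed” concrete, here’s a tiny sketch: any object with the right method shape works, with no class or interface declarations required. (The names `speak`, `realDuck`, and `robotDuck` are invented for this illustration.)

```javascript
// Duck typing: speak() accepts anything with a quack() method.
// Neither object declares a shared class or interface.
function speak(duckish) {
  return duckish.quack();
}

const realDuck = { quack: () => "quack!" };
const robotDuck = { quack: () => "beep-quack" };

console.log(speak(realDuck));  // "quack!"
console.log(speak(robotDuck)); // "beep-quack"
```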

What is Node.js?

In technicalese, Node.js is a “non-blocking, event-driven I/O runtime.” Does that read like “word soup”? Let’s make a sandwich instead.

Traditional backend tech stacks work like a sandwich shop: each customer gets their own sandwich maker, who listens while the customer picks out toppings. So if the sandwich shop has one sandwich maker, the shop can handle one customer at a time. To serve more customers simultaneously, you just hire more sandwich makers.

This paradigm works great because making sandwiches is fast, and there’s not much waiting in between adding toppings.

But now imagine a fancy sit-down restaurant. Instead of getting in and out with a sandwich in 3 minutes, customers will likely spend an hour dining. If each customer monopolized a chef’s time for an entire hour, you’d need a lot of cooks!

So instead of customers talking directly to a chef, each customer is assigned a waiter. Still, it would be nonsensical for a waiter to be stuck with a customer until they left, because there’s lots of waiting! The waiter will wait for the customer to be ready to order, for their food to be prepared, etc. But a single waiter can attend to multiple customers over the period of an hour: after they take an order, they forward it to a chef and check on other customers.

But it’s easy to predict when your waiter will leave you to attend to other customers: they won’t ask you to “hold that thought” and leave you in the middle of ordering. Instead, they will only leave when you’ve finished placing your order—that way, waiters won’t have to remember what the customer was halfway through ordering.

While waiters are good at helping customers discover new items and validating their menu choices, they can’t handle lengthy tasks—otherwise, their other customers could be waiting for a while. Instead, a waiter delegates time-consuming tasks, like food preparation, to other people.

In short, a waiter doesn’t do any one thing that takes much time.

When the restaurant is clogged with customers, a new bottleneck appears: you might not have enough cooks! In that case, you wouldn’t hire more waiters to speed up service; you’d hire more chefs. Sometimes, though, exceptional circumstances arise and a waiter needs to leave unexpectedly. To add “fault tolerance,” you just add more waiters!

Splitting up restaurant duties into labor-intensive food preparation and multitask-style waiting makes sense. And in the world of backend tech stacks, Node.js is your waiter at a sit-down restaurant!

What is Node.js Good At?

Like a restaurant waiter, Node.js is exceptionally good at waiting. For a backend server, this may seem strange—why would the backend wait before responding to a browser’s HTTP request? Most backends wait for a lot of resources before responding: they fetch data from a database, read a file from disk, or just wait to finish streaming the response back to the browser!

This wouldn’t be a problem if there were only one request at a time, but if your backend needs to handle 20 requests simultaneously, blocking the other 19 until the first one finishes is not an option. To solve this, most backend stacks rely on multithreading and load balancers.

But why can’t a single backend process handle multiple requests concurrently, like a waiter, so long as no task takes long? This is the superpower of Node.js: a single Node process can seamlessly handle hundreds of thousands of simultaneous requests by juggling between requests whenever it must wait for a resource (database, reading a file off disk, or networking). This paradigm, called asynchronous or cooperative multitasking, allows the backend to predictably make context switches when it gets to a good stopping point, i.e. when it’s waiting for something. This is in contrast to preemptive multitasking, which gives each request handler a slice of time to compute before forcefully switching to another request handler.
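A minimal sketch of that juggling, with `setTimeout` standing in for real I/O such as a database query (the names here are invented for the example): two requests that each wait 100 ms finish in roughly 100 ms total, because the process switches to the second request while the first is waiting.

```javascript
// Simulate a slow I/O call (e.g. a database query) with a timer.
const fakeDbQuery = (id) =>
  new Promise((resolve) => setTimeout(() => resolve(`row ${id}`), 100));

async function handleTwoRequests() {
  const start = Date.now();
  // Both "requests" are in flight at once; Node waits on them cooperatively
  // instead of blocking on the first before starting the second.
  const [a, b] = await Promise.all([fakeDbQuery(1), fakeDbQuery(2)]);
  return { a, b, elapsed: Date.now() - start };
}
```

Run it and `elapsed` comes out near 100 ms, not 200 ms: the waits overlap on a single thread.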

It turns out a large category of web services do a lot of waiting by delegating to other services (database, file system, networking), then aggregate the data into a suitable response. Because “context switches” between these simultaneous tasks are predictable, memory usage stays very low and there are far fewer worries about thread safety.

Even though your code is single-threaded, you can scale it in the same way you would a restaurant: add more waiters! Or in this case, run more processes (usually, one per CPU core).

So Node supports cooperative multitasking, but not through multithreading. This isn’t a disadvantage; it actually makes programs easier to reason about! What if a waiter could leave a customer in the middle of ordering? They would need to remember where the customer left off, and in the meantime someone might persuade the customer to order something different. Because the code is single-threaded, we don’t need to worry about thread safety: we know the waiter only leaves once a customer is done ordering.

What isn’t Node.js Good At?

As the Node.js homepage asserts, Node is really good for programs that deal with event-driven I/O (input/output). That also means there are a lot of things Node.js is not good at.

In particular, Node does its best to make blocking operations impossible: all of its core APIs are asynchronous. But despite JavaScript’s execution speed, you can still “block the Event Loop” by performing CPU-intensive tasks. If your backend needs to crunch data or perform complex aggregation and filtering, you will nullify Node’s primary strength.
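Here is a quick illustration of “blocking the Event Loop”: a synchronous busy loop (standing in for any CPU-heavy computation) delays every pending timer and, in a real server, every pending request.

```javascript
// Schedule a 10 ms timer, then block the thread for ~150 ms with a
// synchronous busy loop. The timer cannot fire until the loop finishes.
const timerDelay = new Promise((resolve) => {
  const scheduled = Date.now();
  setTimeout(() => resolve(Date.now() - scheduled), 10);
});

// CPU-bound work: nothing else on this thread can run until it returns.
const end = Date.now() + 150;
while (Date.now() < end) { /* spin */ }
```

The 10 ms timer ends up firing roughly 150 ms late, which is exactly what would happen to the other 19 requests in the earlier example.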

Thankfully, many of Node’s core APIs are implemented in native code and run on a background thread pool, so you can do plenty of “CPU-intensive” work in Node without blocking. If you need custom heavy computation, you can use Node’s WebWorker-style worker threads API to create thread-safe background workers. Or you can build out specialized microservices, perhaps in Elixir or Rust, and use them as a “backend for your backend.”

Since Node.js is a general-purpose runtime, a naive HTTP server will not be fault-tolerant (resilient to crashes) out of the box. For a single-threaded server, a process supervisor like forever will do, but to leverage multi-core CPUs you will want the built-in cluster API.

Why is Node So Popular?

With these caveats in mind, Node.js is an exceptional fit for many backend servers. Its extreme popularity among developers is especially telling, and with good reason:

It’s easy to get caught up in a language’s elegance, yet overlook the most important aspect of a tech stack: its community support and libraries. JavaScript enjoys the largest community (people and open source libraries) of any language, and Node inherits these benefits with the npm package manager, “the largest ecosystem of open source libraries in the world.”

With ES2017 (the 2017 release of JavaScript), async programming is incredibly straightforward and readable with async functions.
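A sketch of that readability (the “user service” functions here are stand-ins, not a real API): async/await lets asynchronous steps read top to bottom like synchronous code, without ever blocking the process.

```javascript
// Stand-ins for real I/O: a database lookup and a call to another service.
const getUser = (id) => Promise.resolve({ id, name: 'Ada' });
const getOrders = (userId) => Promise.resolve([{ userId, total: 42 }]);

// Reads like synchronous code, but each `await` yields the event loop
// so other requests can be served in the meantime.
async function getUserReport(id) {
  const user = await getUser(id);
  const orders = await getOrders(user.id);
  return `${user.name} has ${orders.length} order(s)`;
}
```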

Node adopts new features and APIs rapidly, so developers can enjoy all of the tooling and APIs the TC39 committee has spent years standardizing for browsers. As a pleasant consequence of the hostile io.js fork, Node.js is 100% open source, has an open governance model, and is supported by a diverse community of independent contributors and company backers who keep Node remarkably production-ready.

All told, this means Node.js is unlikely to stagnate or die out since it builds on the web platform’s incredible momentum. And at eight years old, it’s a safe bet that Node.js will continue to innovate for years to come.