Introduction to Node.js Streams

Scalability. Big Data. Real Time. These are some of the challenges a web application has to face in the modern World Wide Web, and this is where Node.js and its non-blocking I/O model come into play. This article will introduce you to one of Node’s most powerful APIs for data-intensive computing: streams.

This code works perfectly. Nothing is wrong with it, except for the fact that Node.js
buffers the entire contents of data.txt in memory before sending any of it back to the client. As the number of client requests increases, your application could start to consume a lot of memory. In addition, clients have to wait for the server to read the entire file, resulting in increased latency.

To overcome these scalability issues we can use the streams API. Using a stream ensures that data.txt is sent to clients one chunk at a time, as the chunks are read from disk, with no buffering on the server and no waiting times on the client.

What are Streams?

Streams can be defined as a continuous flow of data that can be manipulated asynchronously as the data comes in (or out). In Node.js, streams can be readable or writable. A readable stream is an EventEmitter object that emits a data event every time a chunk of data arrives. In our previous example, a readable stream was used to pipe the contents of a file down to an HTTP client. When the stream reaches the end of the file it emits an end event, indicating that no more data events will occur. A readable stream can also be paused and resumed.

Writable streams, on the other hand, are streams you send data to. This type of stream also inherits from EventEmitter, and it implements two key methods: write() and end(). The first writes data to the stream and returns true if the data was flushed to the underlying resource, or false if the data was buffered in memory because the stream is busy (in that case it will be sent out later). The end() method simply signals that no more data will be written to the stream.

Your First Streams Application

Let’s take a closer look at streams. To do so we are going to build a simple file upload application. First of all, we need to build a client that reads a file using a readable stream and pipes the data to a specific destination. At the other end of the pipe we’ll implement a server that saves the uploaded data using a writable stream.

Let’s start with the client. We begin by importing the HTTP and file system modules.

Please note that the req and res objects passed to the createServer() callback are a readable stream and a writable stream, respectively. We can listen for data events on the request, and pipe the result back to the client once all the processing is over.

Conclusion

This article has introduced one of the most powerful tools of Node.js, the streams API. In the coming weeks, we will dive deeper into the world of streams, exploring all the different types built into Node.js, and third party streams too.

Giovanni Ferron is a web developer currently living in Melbourne, Australia. He has worked for major media companies including MTV and DMG Radio Australia, and co-founded stereomood.com, ticketonrails.com, and a whole bunch of other projects. A couple of years ago he fell in love with Node.js, and he has been spending his nights programming in JavaScript ever since.