Quick Summary

• Node streams are an efficient way to channel and process input and output data in a Node.js application.

• Using Node.js streaming, entrepreneurs can improve the performance, scalability, and maintainability of Node.js applications that work with huge amounts of data.

• Find out about the types of streams in Node.js, along with practical examples for better understanding.

• Explore the chaining and piping of Node streams.

What are Streams in Node Js?

Streams are abstract interfaces for working with data that can be read or written sequentially. In Node.js, streams are a fundamental concept used to handle data flow between input and output sources.

Streams are an important concept in Node.js because they allow for the efficient handling of large amounts of data. Instead of loading all the data into memory at once, streams process data in chunks as it becomes available. Data can be streamed from a source (like a file or a network socket) to a destination (like a response object or another file) in real time, without buffering all of the data in memory at once.

For instance, you can read from or write to a variety of sources and destinations, such as files, network sockets, and stdin/stdout.
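
As a minimal sketch of this idea (assuming a large file named 'big-file.txt' exists next to the script; the name is only illustrative), the following server streams the file to each client chunk by chunk instead of reading it into memory first:

const http = require('http');
const fs = require('fs');

http.createServer((req, res) => {
  // Stream the file to the client chunk by chunk instead of using
  // fs.readFile(), which would buffer the whole file in memory first.
  const fileStream = fs.createReadStream('big-file.txt');
  fileStream.pipe(res);

  fileStream.on('error', () => {
    // If the file cannot be read, end the response with an error message
    res.end('Error reading file');
  });
}).listen(3000);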

The Stream Module

The stream module in Node.js is a core module for handling streaming data. It provides a set of APIs for creating, reading from, and writing to streams.
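
As a small illustrative snippet (not tied to any particular application), the module exposes the core stream classes, plus helpers such as Readable.from() for turning an iterable into a stream:

// The core stream classes all come from the built-in 'stream' module
const { Readable, Writable, Duplex, Transform } = require('stream');

// Readable.from() builds a readable stream from any iterable (Node.js 12+)
const letters = Readable.from(['a', 'b', 'c']);

// Each array element arrives as a separate chunk
letters.on('data', (chunk) => console.log(chunk)); // a, b, c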

Node.js Streaming API

The Stream API in Node.js provides a way to handle streaming data. It consists of classes and functions for creating, reading from, and writing to Node streams.

Here are the main components of the Stream API in Node.js:

  • Stream Classes: The Stream API provides several classes for working with Node js streams, including the Readable, Writable, Duplex, and Transform classes. These classes provide different types of streams with varying functionality.
  • Stream Methods: The Stream API provides several methods for working with streams, including the pipe() method for connecting a readable stream to a writable one, and the on() method for subscribing to events such as ‘data’.
  • Events: The Stream API provides several events that can be emitted by streams, including ‘data’, ‘end’, ‘error’, and ‘finish’. These events can be used to handle different aspects of stream processing.
  • Stream Options: The Stream API provides options for configuring streams, such as setting the encoding for readable streams or setting the highWaterMark that controls how much data is buffered internally (see the sketch after this list).
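
To see these pieces together, here is a brief sketch that configures a readable stream with options and reacts to its events (the 'example.txt' file name and the 64 KB highWaterMark are just illustrative choices):

const fs = require('fs');

// Stream options: read as UTF-8 text in chunks of up to 64 KB
const readStream = fs.createReadStream('example.txt', {
  encoding: 'utf8',
  highWaterMark: 64 * 1024,
});

// Stream events: 'data', 'end' and 'error'
readStream.on('data', (chunk) => console.log(`Got a chunk of ${chunk.length} characters`));
readStream.on('end', () => console.log('No more data'));
readStream.on('error', (err) => console.error(`Stream error: ${err.message}`));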

Types of Node Streams

There are four different types of streams in Node.js, each serving a specific purpose: Readable streams, Writable streams, Duplex streams, and Transform streams.

Readable Stream

Readable streams are used for reading data from a source, such as a file or a network socket. They emit a ‘data’ event whenever new data is available and an ‘end’ event when the stream has ended. Examples of readable streams in Node.js include ‘fs.createReadStream()’ for reading files and ‘http.IncomingMessage’ for reading HTTP requests.

Let us understand the Readable Node Js Stream with an example.

const fs = require('fs');

// Create a readable stream from a file
const readStream = fs.createReadStream('example.txt', { encoding: 'utf8' });

// Handle 'data' events emitted by the stream
readStream.on('data', (chunk) => {
  console.log(`Received ${chunk.length} bytes of data.`);
});

// Handle the 'end' event emitted by the stream
readStream.on('end', () => {
  console.log('End of file reached.');
});

// Handle errors emitted by the stream
readStream.on('error', (err) => {
  console.error(`Error: ${err}`);
});

Output

Received 19 bytes of data.
End of file reached.

In this example, we use the fs module to create a readable stream from a file named ‘example.txt’. We set the encoding option to ‘utf8’ to read the file as a string.

We then handle the ‘data’ event emitted by the stream, which is triggered every time a chunk of data is read from the file. In this case, we simply log the number of bytes received.

We also handle the ‘end’ event emitted by the stream, which is triggered when the end of the file is reached. Finally, we log any errors emitted by the stream to the console.

Ready to harness the power of Node.js Streams?
Hire Node js developer or partner with a reputable Node.js development company today and take your projects to the next level!

Writable Stream

Writable streams are used for writing data to a destination, such as a file or a network socket. They have a ‘write()’ method to write data and an ‘end()’ method to signal the end of the stream. Examples of writable streams in Node.js include ‘fs.createWriteStream()’ for writing files and ‘http.ServerResponse’ for writing HTTP responses.

Example of NodeJs Writable Stream:

const fs = require('fs');

// Create a writable stream
const writeStream = fs.createWriteStream('output.txt');

// write data to file
writeStream.write('Hello from write stream')

// ending writable stream
writeStream.end();

// Handle stream events
writeStream.on('finish', () => {
    console.log(`Write Stream Finished!`);
})

writeStream.on('error', (error) => {
    console.error(`Write Stream error: ${error}`);
})

Output

Write Stream Finished!

In this example, we use the fs module to create a writable stream to a file named ‘output.txt’.

We then write data to the stream using the write() method and signal that we are done by calling the end() method.

We also handle the ‘finish’ event emitted by the stream, triggered when all data has been written to the file. Finally, we log any errors emitted by the stream to the console.

Duplex Stream

Duplex streams are bidirectional, meaning they can read and write data. They can be used for tasks such as proxying data from one network socket to another. Duplex streams inherit from both ‘Readable’ and ‘Writable’ streams, so they have all the methods of both.

Duplex Stream example:

const { Duplex } = require('stream');

const myDuplex = new Duplex({
  write(chunk, encoding, callback) {
    console.log(chunk.toString());
    callback();
  },
  read(size) {
    if (this.currentCharCode > 90) {
      this.push(null);
      return;
    }
    this.push(String.fromCharCode(this.currentCharCode++));
  }
});

myDuplex.currentCharCode = 65;

process.stdin.pipe(myDuplex).pipe(process.stdout);

In this example, we create a new Duplex stream using the Duplex class from the stream module. The write method is called whenever data is written to the stream, and simply logs the chunk of data to the console. The read method is called whenever the stream is read from; in this example, it pushes the uppercase letters ‘A’ through ‘Z’ (character codes 65 to 90) to the stream, and once the character code passes 90 it pushes null to signal the end of the stream.

We then pipe the standard input stream (process.stdin) to our Duplex stream, and then pipe the Duplex stream to the standard output stream (process.stdout). This allows us to type input into the console, which gets written to the Duplex stream, and then the output from the Duplex stream gets written to the console.

Transform Stream

Transform streams are a type of duplex stream that can modify data as it passes through them. They can be used for compression, encryption, or data validation tasks. Transform streams inherit from ‘Duplex’, so they have both a ‘read()’ and a ‘write()’ method. When you write data to a transform stream, it will be transformed by the transform function before being emitted as output.

Let us see an example of the transform Node.js stream.

const fs = require('fs');

// Importing stream APIs
const { Transform, pipeline } = require('stream');

// Create a readable stream
const readableStream = fs.createReadStream('input.txt');

// Create a writable stream
const writableStream = fs.createWriteStream('output.txt');

// Set the encoding to be utf8. 
readableStream.setEncoding('utf8');

// Transform chunk into uppercase
const uppercaseWordProcessing = new Transform({
    transform(chunk, encoding, callback) {
        console.log(`Data to be transformed: ${chunk}`);
        callback(null, chunk.toString().toUpperCase());
    }
});

// We could connect these streams manually with pipe():
// readableStream
//     .pipe(uppercaseWordProcessing)
//     .pipe(writableStream);

// Here we instead use the pipeline API, which pipes a series of streams
// together and notifies us when the pipeline has fully completed.
pipeline(readableStream, uppercaseWordProcessing, writableStream, (error) => {
    if (error) {
        console.error(`Error occurred while transforming stream: ${error}`);
    } else {
        console.log('Pipeline succeeded!');
    }
});

// Handle stream events
readableStream.on('end', () => {
    console.log(`Read Stream Ended!`);
})

readableStream.on('error', (error) => {
    console.error(`Read Stream Ended with an error: ${error}`);
})

writableStream.on('finish', () => {
    console.log(`Write Stream Finished!`);
})

writableStream.on('error', (error) => {
    console.error(`Write Stream error: ${error}`);
})

Output

Data to be transformed: My name is john doe
Read Stream Ended!
Write Stream Finished!
Pipeline succeeded!

In this example, we create a Transform stream called ‘uppercaseWordProcessing’ using the ‘Transform’ class from the ‘stream’ module. Its transform function converts each chunk of incoming data to uppercase with the string’s ‘toUpperCase’ method and hands the result to the ‘callback’ function to signal that the chunk has been processed.

Finally, we connect the readable file stream, the transform stream, and the writable file stream with the pipeline() API. This causes all data read from ‘input.txt’ to be converted to uppercase and written to ‘output.txt’, and the pipeline callback tells us whether everything completed successfully.

Now that we know about the types of Nodejs Streams, let us get to the business benefits of using them.

Advantages of Node Js Streaming

Popular companies that use Node.js and handle humongous amounts of data, such as Netflix, NASA, Uber, and Walmart, leverage Node.js streams and are therefore better able to manage, sustain, and scale their applications. Here are the advantages of using Node streams in your Node.js applications.

  • Memory efficiency: Streams can process large amounts of data without loading everything into memory simultaneously. This means streams can handle files and data sets too large to fit in memory.
  • Performance: Because streams process data in chunks, they can be faster and more efficient than methods that read or write the entire data set at once. This can be particularly useful for real-time applications that require low latency and high throughput.
  • Flexibility: Streams can be used to handle a wide range of data sources and destinations, including files, network sockets, and HTTP requests and responses. This makes streams a versatile tool for handling data in different contexts.
  • Modularity: Node Streams can be easily combined and piped, allowing for complex data processing tasks to be broken down into smaller, more manageable parts. This can make code easier to read and maintain.
  • Backpressure handling: Streams can handle backpressure by automatically slowing down the data source when the data destination cannot keep up. This can help prevent buffer overflows and other performance issues (see the sketch after this list).
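
To make the backpressure point concrete, here is a simplified sketch (the file name and the number of lines are only illustrative): write() returns false when the internal buffer is full, and the 'drain' event tells us when it is safe to write again.

const fs = require('fs');

const writeStream = fs.createWriteStream('big-output.txt');

let i = 0;
const total = 1e6;

function writeChunks() {
  let ok = true;
  while (i < total && ok) {
    // write() returns false once the internal buffer passes its highWaterMark
    ok = writeStream.write(`line ${i++}\n`);
  }
  if (i < total) {
    // Resume writing only after the buffer has drained
    writeStream.once('drain', writeChunks);
  } else {
    writeStream.end();
  }
}

writeChunks();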

Overall, the use of streams in Node.js can help improve the performance, scalability, and maintainability of applications that handle large amounts of data.

It is now time to look at how Node streaming is put to work in practice, starting with piping and chaining of Node.js streams.

Piping in Node Streams

In Node.js streaming, piping is a way to connect a readable stream to a writable one using the pipe() method. The pipe() method is called on the readable stream and takes the writable stream as its argument.

When pipe() is called, it sets up listeners on the readable stream’s ‘data’ and ‘end’ events, and automatically writes data from the readable stream to the writable stream until the end of the readable stream is reached. This makes it easy to chain together multiple streams and create a pipeline for processing data.
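
Conceptually, pipe() wires two streams together roughly like the simplified sketch below; the simplePipe() helper is purely hypothetical, and the real pipe() also pauses and resumes the source to handle backpressure:

// A simplified picture of what source.pipe(destination) sets up
function simplePipe(source, destination) {
  source.on('data', (chunk) => destination.write(chunk));
  source.on('end', () => destination.end());
  return destination; // returning the destination is what makes chaining possible
}

// Usage: behaves like readStream.pipe(writeStream) for simple cases
const fs = require('fs');
simplePipe(fs.createReadStream('input.txt'), fs.createWriteStream('output.txt'));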

Here’s an example of using the pipe() method:

const fs = require('fs');

// Create a readable stream from a file
const readStream = fs.createReadStream('input.txt');

// Create a writable stream to a file
const writeStream = fs.createWriteStream('output.txt');

// Pipe the readable stream to the writable stream
readStream.pipe(writeStream);

// Handle errors emitted by either stream
readStream.on('error', (err) => {
  console.error(`Error reading file: ${err}`);
});

writeStream.on('error', (err) => {
  console.error(`Error writing file: ${err}`);
});

In this example, we first create readable and writable streams using the fs module. We then use the pipe() method to connect the readable stream to the writable stream.

We also handle any errors emitted by either stream using the on(‘error’) method.

Note that pipe() is a convenient way to handle stream data flow in Node.js, but it may not always be suitable for complex stream processing scenarios. Also, discover various debugging techniques and tools that can help you identify and fix issues quickly in our guide on how to debug a Node.js application.

Pros and Cons of Piping Node Streams

Benefits              | Drawbacks
Efficient processing  | Steep learning curve
Easy to use           | Not compatible with other Node.js streams
Modular code          | Debugging issues
Backpressure handling | Complex control flow


Let us now answer the question: what is Node stream chaining?

Node Js Stream Chaining

In Node Streams, chaining is a way to connect multiple stream operations together using method chaining. Chaining allows you to easily create a pipeline of stream operations that can be applied to a readable stream, transforming or processing the data as it flows through the pipeline.

To chain stream operations together, you simply call pipe() on a readable stream, which returns the destination stream so it can be further manipulated or connected to other Node streams. The resulting stream operations are applied sequentially as the data flows through the pipeline.

Here’s an example of using chaining to create a pipeline of stream operations:

const fs = require('fs');
const { Transform } = require('stream');

// Create a readable stream from a file
const readStream = fs.createReadStream('input.txt');

// Create a writable stream to a file
const writeStream = fs.createWriteStream('output.txt');

// Define transform stream operations (each one is a Transform instance)
const transformStream1 = // ...
const transformStream2 = // ...
const transformStream3 = // ...

// Chain stream operations to transform the data
readStream
  .pipe(transformStream1)
  .pipe(transformStream2)
  .pipe(transformStream3)
  .pipe(writeStream);

In this example of Node streams, we create a readable stream from a file and a writable stream to a file using the fs module. We then combine several stream operations to transform the data, using the pipe() method to connect each operation to the next.

We define the individual transform stream operations separately and pass them as arguments to pipe(). These operations can be any stream type, including Transform, Duplex, or even other Readable streams.

Finally, we connect the writable stream to the end of the pipeline with pipe(), so the fully transformed data ends up in ‘output.txt’.
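
As a concrete (purely illustrative) version of the pipeline above, the sketch below chains two small Transform streams, one that uppercases the text and one that prefixes each chunk with a marker, before writing the result to a file:

const fs = require('fs');
const { Transform } = require('stream');

const readStream = fs.createReadStream('input.txt', { encoding: 'utf8' });
const writeStream = fs.createWriteStream('output.txt');

// First transform: uppercase the text
const uppercase = new Transform({
  transform(chunk, encoding, callback) {
    callback(null, chunk.toString().toUpperCase());
  },
});

// Second transform: prefix every chunk with a marker
const addPrefix = new Transform({
  transform(chunk, encoding, callback) {
    callback(null, `>> ${chunk}`);
  },
});

// Chain the operations into a single pipeline
readStream
  .pipe(uppercase)
  .pipe(addPrefix)
  .pipe(writeStream);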

Note that chaining is a powerful way to process stream data in Node.js, but it may not always be the most efficient or flexible approach. Also, learn how to leverage a new version of Node.js for your projects by following our simple steps to download and install the latest version, Node 19.

Pros and Cons of Chaining in Node

Benefits             | Drawbacks
Flexible processing  | Complex
Reusability          | Steep learning curve
Improved performance | Limited compatibility
Easy debugging       | Issues with control flow


Key Takeaways

Data handling with Node.js streams enables Node developers to work smoothly with incoming and outgoing data. Entrepreneurs can better manage their Node applications and get excellent Node.js performance out of them by using streams, especially thanks to improved memory management.

Frequently Asked Questions (FAQs)

What are the benefits of using Node.js Streams?

The benefits of using Node.js Streams include improved performance, lower memory usage, and better handling of large data sets. Streams allow you to process data in chunks, which can help avoid bottlenecks and reduce the memory needed to process data. Streams also let you process data as it is received or sent, which can help reduce latency and improve overall performance.

Are Node Streams compatible with other programming languages?

Node streams themselves are a Node.js API, but the data they carry can be streamed to and from systems or applications written in other languages over standard protocols such as HTTP or TCP.

When should I use Node Streams?

Node Streams can be a powerful tool for processing data efficiently in Node.js applications. Using streams is particularly fruitful in the following use cases: processing large files, real-time data processing, handling HTTP requests and responses, and transforming data.
