Node.js Streams – Best Practices & Practical Examples

Node.js is a popular JavaScript runtime environment used to build scalable, high-performance applications. One of its key features is the ability to handle streams, which are continuous flows of data that can be read or written in chunks. In this article, we will explore best practices for working with Node.js streams and provide practical application examples.

Best Practices

  • Use pipe() to chain multiple streams together: The pipe() method connects two or more streams into a pipeline, where data flows from one stream to the next with backpressure handled for you. This is useful when you need to perform multiple operations on a continuous flow of data; for example, reading data from a file, piping it through a transform, and writing the result to another file. Note that pipe() does not forward errors between streams, so attach an error handler to each stream in the chain (or use stream.pipeline(), which propagates errors and cleans up for you).
  • Use on('error', ...) to handle errors: When working with streams, errors can occur at any point in the pipeline. It’s important to handle these errors gracefully by using the on() method to listen for error events and perform appropriate actions. For example, you might log an error message or close a connection when an error occurs.
  • Use on('finish', ...) to handle completion: A writable stream emits a finish event once end() has been called and all data has been flushed. You can use this event to perform cleanup tasks such as closing connections or releasing resources; for example, you might close a database connection when the stream completes. (Readable streams emit end instead, once there is no more data to consume.)
  • Use on('data', ...) to handle incoming data: When working with readable streams, you can listen for incoming data using the on() method and process it in real-time. This is useful for applications that require low latency processing of data. For example, you might use this event to update a user interface as new data arrives.
  • Use fs.createReadStream() and fs.createWriteStream() to create file streams: The fs module provides createReadStream() for readable file streams and createWriteStream() for writable ones. Both accept an options object that lets you configure properties such as encoding and highWaterMark (the size of the internal buffer).

Practical Examples

FFmpeg-based transcoding using Express, Multer, and streams

const express = require('express');
const multer = require('multer');
const ffmpeg = require('fluent-ffmpeg');

// Configure storage for uploaded files
const storage = multer.diskStorage({
  destination: (req, file, cb) => {
    cb(null, 'uploads/');
  },
  filename: (req, file, cb) => {
    cb(null, Date.now() + '-' + file.originalname);
  }
});

// Filter for only allowing video files to be uploaded
const fileFilter = (req, file, cb) => {
  if (!file.mimetype.startsWith('video/')) {
    return cb(new Error('Only video files are allowed!'));
  }
  cb(null, true);
};

// Create an instance of Multer with the storage and file filter configurations
const upload = multer({ storage, fileFilter });

// Set up an Express server to handle incoming requests
const app = express();
app.use(express.static('public'));

// Handle file uploads using the Multer middleware
app.post('/upload', upload.single('video'), async (req, res) => {
  try {
    // Transcode the video to MP4 at 1920x1080 with a 1500 kbps video bitrate.
    // multer's diskStorage saves the upload to disk, so we read it from
    // req.file.path (req.file.buffer only exists with memoryStorage).
    // This assumes the ./transcoded directory already exists.
    const outputPath = `./transcoded/${req.file.filename}.mp4`;
    await new Promise((resolve, reject) => {
      ffmpeg(req.file.path)
        .size('1920x1080')
        .videoBitrate(1500)
        .on('end', resolve)
        .on('error', reject)
        .save(outputPath);
    });

    // Send a response back to the client indicating success
    res.sendStatus(200);
  } catch (error) {
    console.error('Error transcoding video:', error);
    // Send an error response back to the client if there was an issue
    res.status(500).send('An error occurred while transcoding your video.');
  }
});

// Start the server on port 3000
app.listen(3000, () => {
  console.log('Server listening on http://localhost:3000');
});

On-the-fly Filter and Replace JSON field values

const { Transform } = require('stream');

// Transform stream that collects the incoming JSON body, rewrites the
// "video" field, and pushes the updated JSON downstream
const chunks = [];
const replaceVideoField = new Transform({
  transform(chunk, encoding, callback) {
    // Buffer each incoming chunk; a JSON value may span several chunks
    chunks.push(chunk);
    callback();
  },
  flush(callback) {
    try {
      const body = JSON.parse(Buffer.concat(chunks).toString());
      if ('video' in body) {
        body.video = 'mp4';
      }
      callback(null, JSON.stringify(body));
    } catch (err) {
      callback(err);
    }
  },
});

// Usage inside an Express route handler (req is itself a readable stream):
// req.pipe(replaceVideoField).pipe(res);
  • This code uses the `Transform` class from the `stream` module to create a stream that buffers the incoming JSON request body chunk by chunk, parses it once the input ends, sets the "video" property to "mp4" if it is present, and pushes the rewritten JSON downstream from the flush() callback.
  • It’s important to note that this code assumes your Node.js server pipes the raw request body (for example, Express’s `req` object, which is itself a readable stream) through this transform before anything else consumes the JSON.
  • Additionally, it is not recommended to modify the original request body in place, as that may cause issues with other middleware or endpoints that rely on the same data. A Transform stream like this one naturally avoids the problem: it produces a new, rewritten body and passes it along to other parts of your application, leaving the original input untouched.

Conclusion

Node.js streams are a powerful tool in the hands of a thoughtful developer. Used well, they reduce unnecessary complexity and lines of code while keeping your solution performant and memory-efficient.
