Node.js - Streams
Streams are objects that let you read data from a source or write data to a destination in a continuous fashion. Node.js provides four types of streams –
Readable – a stream from which data can be read.
Writable – a stream to which data can be written.
Duplex – a stream that is both readable and writable.
Transform – a duplex stream whose output is computed from its input.
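As a quick illustration only – the file names, and the TCP socket used for the Duplex example, are placeholders and not part of the examples that follow – each type corresponds to objects from core modules such as fs, zlib and net:
var fs = require("fs");
var zlib = require('zlib');
var net = require('net');
// Readable – data can be read from it
var readable = fs.createReadStream('input.txt');
// Writable – data can be written to it
var writable = fs.createWriteStream('output.txt');
// Duplex – readable and writable at the same time (a TCP socket)
var duplex = new net.Socket();
// Transform – a duplex stream that modifies the data passing through it
var transform = zlib.createGzip();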
This tutorial provides a basic understanding of the operations commonly used on Streams.
The reading and piping examples below use a text file as the data source; create a file named input.txt in the same directory with the following content –
Keep Training and keep Learning
Until you get it Right !!!!!
Create a js file named main.js with the following code –
var fs = require("fs");
var data = '';
// Create a readable stream
var readerStream = fs.createReadStream('input.txt');
// Set the encoding to be utf8.
readerStream.setEncoding('UTF8');
// Handle stream events --> data, end, and error
readerStream.on('data', function(chunk) {
   data += chunk;
});
readerStream.on('end', function() {
   console.log(data);
});
readerStream.on('error', function(err) {
   console.log(err.stack);
});
console.log("Program Ended");
Now run main.js to see the result –
$ node main.js
Note that "Program Ended" is printed first, because the stream delivers its data asynchronously; the contents of input.txt are printed afterwards, once the end event has fired.
Create a js file named main.js with the following code –
var fs = require("fs");
var data = 'Keep Learning';
// Create a writable stream
var writerStream = fs.createWriteStream('output.txt');
// Write the data to the stream with utf8 encoding
writerStream.write(data,'UTF8');
// Mark the end of file
writerStream.end();
// Handle stream events --> finish, and error
writerStream.on('finish', function() {
   console.log("Write completed.");
});
writerStream.on('error', function(err) {
   console.log(err.stack);
});
console.log("Program Ended");
Run main.js now to see the result –
$ node main.js
Now open the generated output.txt in your current directory; it should contain the following –
Keep Learning
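As a side note, a writable stream's end() method can itself take a final chunk, an encoding and a callback that runs once the data has been flushed. A minimal sketch of the same write, assuming the same output.txt file –
var fs = require("fs");
var writerStream = fs.createWriteStream('output.txt');
// end() writes the final chunk and fires the callback
// once the stream has finished.
writerStream.end('Keep Learning', 'utf8', function() {
   console.log("Write completed.");
});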
Piping is a mechanism where the output of one stream is provided as the input to another stream. It is normally used to take data from one stream and pass it on to another, and there is no limit on how many streams can be connected this way. The following example reads from one file and writes to another using a pipe.
Create a js file named main.js with the following code –
var fs = require("fs");
// Create a readable stream
var readerStream = fs.createReadStream('input.txt');
// Create a writable stream
var writerStream = fs.createWriteStream('output.txt');
// Pipe the read and write operations
// read input.txt and write data to output.txt
readerStream.pipe(writerStream);
console.log("Program Ended");
Run main.js now to see the result –
$ node main.js
Open output.txt that was created in your current directory; it should contain –
Keep Training and keep Learning Until you get it Right !!!!!
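pipe() returns the destination stream and works with any writable destination, not only files; process.stdout, for example, is itself a writable stream. A minimal sketch, assuming the same input.txt file, that pipes the file straight to the terminal –
var fs = require("fs");
// process.stdout is a writable stream, so the file contents
// are printed to the terminal as they are read.
fs.createReadStream('input.txt').pipe(process.stdout);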
Chaining is a mechanism to connect the output of one stream to another stream and create a chain of multiple stream operations. It is normally used with piping operations. We will now use piping and chaining to first compress a file and then decompress it.
Create a js file named main.js with the following code –
var fs = require("fs");
var zlib = require('zlib');
// Compress the file input.txt to input.txt.gz
fs.createReadStream('input.txt')
   .pipe(zlib.createGzip())
   .pipe(fs.createWriteStream('input.txt.gz'));
console.log("File Compressed.");
Run main.js now to see the result –
$ node main.js
You will find that input.txt has been compressed and that a file input.txt.gz has been created in the current directory. Let us now decompress the same file using the code below –
var fs = require("fs");
var zlib = require('zlib');
// Decompress the file input.txt.gz to input.txt
fs.createReadStream('input.txt.gz')
   .pipe(zlib.createGunzip())
   .pipe(fs.createWriteStream('input.txt'));
console.log("File Decompressed.");
Run main.js now to see the result –
$ node main.js
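One caveat with plain pipe() chains is that errors are not forwarded from one stream to the next, so each stream needs its own error handler. On Node.js 10 and later, the stream module also provides pipeline(), which wires the streams together and reports the first error through a single callback. A minimal sketch of the same compression step, assuming the same file names –
var fs = require("fs");
var zlib = require('zlib');
var stream = require('stream');
// pipeline() pipes the streams in order and calls the final
// callback when the chain finishes or when any stream errors.
stream.pipeline(
   fs.createReadStream('input.txt'),
   zlib.createGzip(),
   fs.createWriteStream('input.txt.gz'),
   function(err) {
      if (err) {
         console.log("Compression failed.", err);
      } else {
         console.log("File Compressed.");
      }
   }
);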