07 - Streams & Buffers
This lesson uses ES Modules. Make sure your package.json has `"type": "module"`. See Modules if you need a refresher.
Why Streams
Reading a 2GB file with `fs.readFile` loads the entire thing into memory. Streams process data in chunks, so memory usage stays low.
```javascript
import fs from 'node:fs/promises';
import { createReadStream } from 'node:fs';

// Bad: loads the entire file into memory
const data = await fs.readFile('huge-file.csv', 'utf-8');

// Good: processes chunk by chunk
const stream = createReadStream('huge-file.csv');
stream.on('data', (chunk) => {
  console.log('Got', chunk.length, 'bytes');
});
```

By default, each chunk is 64KB. You can change this with `highWaterMark`:

```javascript
// Custom: 16KB chunks
createReadStream('huge-file.csv', { highWaterMark: 16 * 1024 });
```

Readable Streams
A readable stream emits data in pieces.
```javascript
import { createReadStream } from 'node:fs';

const stream = createReadStream('data.txt', { encoding: 'utf-8' });

stream.on('data', (chunk) => {
  console.log('Chunk:', chunk.length, 'chars');
});

stream.on('end', () => {
  console.log('Done reading');
});

stream.on('error', (err) => {
  console.error('Read error:', err.message);
});
```

Three events matter: `data` (got a chunk), `end` (no more data), and `error` (something broke).
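In modern Node versions, readable streams are also async iterables, so you can consume them with `for await...of` instead of wiring up events. A sketch — it writes a small sample file first so it runs standalone:

```javascript
import { createReadStream, writeFileSync } from 'node:fs';

// Create a small sample file so the example is self-contained
writeFileSync('data.txt', 'hello world\n'.repeat(1000));

const stream = createReadStream('data.txt', { encoding: 'utf-8' });

let total = 0;
try {
  // for await pulls chunks one at a time; a stream error rejects the loop
  for await (const chunk of stream) {
    total += chunk.length;
  }
  console.log('Done reading', total, 'chars');
} catch (err) {
  console.error('Read error:', err.message);
}
```

The `try/catch` replaces the `error` listener — errors surface as rejections inside the loop.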
Writable Streams
A writable stream accepts data in pieces.
```javascript
import { createWriteStream } from 'node:fs';

const stream = createWriteStream('output.txt');

stream.write('First line\n');
stream.write('Second line\n');
stream.end('Last line\n'); // end() writes the final chunk and closes
```

`write()` sends data. `end()` sends the last piece and closes the stream.
Piping
`pipe()` connects a readable stream to a writable stream. Data flows automatically.
```javascript
import { createReadStream, createWriteStream } from 'node:fs';

const readStream = createReadStream('input.txt');
const writeStream = createWriteStream('copy.txt');

readStream.pipe(writeStream);
```

That copies a file. It works for any size because it processes chunks, not the whole file.
Piping is everywhere in Node. HTTP responses are writable streams:
```javascript
import http from 'node:http';
import { createReadStream } from 'node:fs';

const server = http.createServer((req, res) => {
  const stream = createReadStream('big-video.mp4');
  stream.pipe(res); // Stream the file directly to the client
});

server.listen(3000);
```

No loading the entire video into memory. It streams chunk by chunk.
Buffers
A Buffer is raw binary data. When you read a file without specifying encoding, you get a Buffer.
```javascript
import fs from 'node:fs';

const buf = fs.readFileSync('photo.png');
console.log(buf); // <Buffer 89 50 4e 47 0d 0a ...>
console.log(buf.length); // 34521 (bytes)
console.log(buf[0]); // 137 (first byte as a number)
```

Creating Buffers
```javascript
// From a string
const buf1 = Buffer.from('Hello');
console.log(buf1); // <Buffer 48 65 6c 6c 6f>
console.log(buf1.toString()); // "Hello"

// Allocate an empty buffer
const buf2 = Buffer.alloc(10); // 10 zero-filled bytes

// Concatenate buffers
const buf3 = Buffer.concat([buf1, Buffer.from(' World')]);
console.log(buf3.toString()); // "Hello World"
```

When You'll See Buffers
- Reading files without encoding: `fs.readFile('img.png')` returns a Buffer
- Crypto operations: hashes and encryption work with Buffers
- Network data: raw TCP/UDP data arrives as Buffers
- Converting between encodings: `buf.toString('base64')`, `buf.toString('hex')`
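That last bullet in action: `toString()` takes an encoding name, and `Buffer.from()` accepts one for the reverse direction:

```javascript
const buf = Buffer.from('Hello');

console.log(buf.toString('hex'));    // "48656c6c6f"
console.log(buf.toString('base64')); // "SGVsbG8="

// And back again: decode base64 into the original text
const roundTrip = Buffer.from('SGVsbG8=', 'base64').toString('utf-8');
console.log(roundTrip); // "Hello"
```

This is how you move binary data through text-only channels like JSON or URLs.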
Hashing with `node:crypto` is one place Buffers do the work under the hood:

```javascript
import { createHash } from 'node:crypto';

const hash = createHash('sha256');
hash.update('password');
console.log(hash.digest('hex'));
// "5e884898da28047151d0e56f8dc6292773603d0d6aabbdd62a11ef721d1542d8"
```

Key Takeaways
- Streams process data in chunks, keeping memory usage low
- Readable streams emit `data`, `end`, and `error` events
- Writable streams accept data with `write()` and close with `end()`
- `pipe()` connects readable to writable; data flows automatically
- Buffers are raw binary data; you get them when reading files without an encoding
- `Buffer.from()` creates a Buffer from a string, `buf.toString()` converts back
- Use streams for large files, HTTP responses, and anything where loading everything into memory is wasteful