src/content/blog/reading-writing-files-nodejs/index.md

---
date: 2025-10-12T23:40:00
updatedAt: 2025-10-12T23:40:00
title: Reading and Writing Files in Node.js - The Complete Modern Guide
slug: reading-writing-files-nodejs
description: Learn the modern way to read and write files in Node.js using promises, streams, and file handles. Master memory-efficient file operations for production applications.
---

This example demonstrates several key concepts for binary file manipulation in Node.js. The `encodeWavPcm16()` function creates a complete WAV file structure by constructing both the header and audio data sections. The WAV header contains metadata like file size, audio format, sample rate, and number of channels, while the data section contains the actual audio samples.
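
To make the header layout concrete, here is a rough sketch of building a standard 44-byte PCM WAV header. Note that `wavHeader` is a hypothetical helper, not the article's `encodeWavPcm16()`; the field offsets follow the canonical RIFF/PCM layout:

```javascript
// A sketch of the standard 44-byte PCM WAV header (wavHeader is a hypothetical helper)
function wavHeader(dataSize, sampleRate = 44100, channels = 1, bitsPerSample = 16) {
  const blockAlign = channels * (bitsPerSample / 8)
  const byteRate = sampleRate * blockAlign
  const h = Buffer.alloc(44)
  h.write('RIFF', 0)                // chunk ID
  h.writeUInt32LE(36 + dataSize, 4) // total chunk size
  h.write('WAVE', 8)                // format
  h.write('fmt ', 12)               // subchunk 1 ID
  h.writeUInt32LE(16, 16)           // subchunk 1 size (16 for PCM)
  h.writeUInt16LE(1, 20)            // audio format: 1 = uncompressed PCM
  h.writeUInt16LE(channels, 22)
  h.writeUInt32LE(sampleRate, 24)
  h.writeUInt32LE(byteRate, 28)
  h.writeUInt16LE(blockAlign, 32)
  h.writeUInt16LE(bitsPerSample, 34)
  h.write('data', 36)               // subchunk 2 ID
  h.writeUInt32LE(dataSize, 40)     // size of the audio samples that follow
  return h
}
```

A player reads these fixed offsets back to know how to interpret the sample bytes that follow the header.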

The `makeSine()` function creates audio samples using a sine wave mathematical formula, generating a pure tone at the specified frequency. Each sample represents the amplitude of the sound wave at a specific point in time, and when played back at the correct sample rate, these digital values recreate the original analog sound.

Don't worry too much about the specifics of this example dealing with the WAV binary format! What matters here is learning that we can put arbitrary binary data into a buffer and write it into a file using `writeFile()`. The key takeaway is that Node.js treats all file operations the same way, whether you're writing text, JSON, images, audio, or any other type of data.
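
As a tiny illustration of that takeaway, here is a sketch that puts a few raw samples into a `Buffer` and writes them with the same `writeFile()` call you would use for text (the `samples.bin` file name is invented for the example):

```javascript
// Writing arbitrary binary data with writeFile() — file name is hypothetical
import { writeFile } from 'node:fs/promises'

// Put any bytes into a Buffer: here, four 16-bit little-endian samples
const buf = Buffer.alloc(8)
const samples = [0, 16383, 0, -16383]
samples.forEach((s, i) => buf.writeInt16LE(s, i * 2))

await writeFile('samples.bin', buf) // same API as writing text or JSON
```

From Node.js's perspective this is no different from writing a string: it is just bytes going to disk.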

The function uses Buffer methods like `readUInt32LE()` and `readUInt16LE()` to interpret the binary data in the WAV header.

The duration calculation combines several pieces of metadata: we divide the total audio data size by the byte rate (bytes per second) to get the duration in seconds, then multiply by 1000 to convert to milliseconds. This demonstrates how understanding both the file format and basic math allows us to derive meaningful information from binary data.
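
To make the arithmetic concrete, here is a quick sketch with made-up numbers for a 44.1kHz, mono, 16-bit file:

```javascript
// Hypothetical metadata read from a WAV header
const dataSize = 441000 // bytes of PCM audio data
const byteRate = 88200  // bytes per second (44100 Hz * 2 bytes * 1 channel)

const durationMs = (dataSize / byteRate) * 1000
console.log(durationMs) // 5000 — a 5-second clip
```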

:::note[Binary file complexity]
This is a very simple example showing basic binary manipulation using only Node.js built-ins. We can handle the WAV format manually here because we're creating a minimal, single-channel PCM file with known parameters.
:::


Learn more: Promise.all() on MDN

### Working with Directories

In the previous example we used the `mkdir()` function to create a directory before writing files into it. This is a common pattern because files don't exist in isolation: they live in directories. What if you need to process all files in a folder, or create directories dynamically? Let's look at a more complete example that combines several file system operations:

```javascript {8,11,14,15,17}
// process-directory.js
import { mkdir, readdir, stat } from 'node:fs/promises'
```

Imagine you're trying to read a 2GB log file using `readFile()`. Your Node.js process will run into serious problems:

1. **Out of memory errors** - Your application might crash
2. **Poor performance** - High memory usage affects other operations

### Understanding Memory Considerations for Large Files

While modern Node.js has significantly increased buffer limits (the theoretical maximum is now around 9 petabytes, compared to the original limits of ~1GB on 32-bit and ~2GB on 64-bit architectures), this doesn't mean we should load massive files into memory all at once. You can check the current buffer limits on your system:

```javascript
import { readFile } from 'node:fs/promises'
import { constants } from 'node:buffer'

// Check the maximum buffer size on this system
console.log(`Max buffer length: ${constants.MAX_LENGTH} bytes`)

try {
  // 'huge-file.bin' is a placeholder path for a very large file
  const data = await readFile('huge-file.bin')
  console.log(`Loaded ${data.length} bytes`)
} catch (error) {
  if (error.code === 'ERR_FS_FILE_TOO_LARGE') {
    console.log('File exceeds buffer limits!')
  } else if (error.code === 'ENOMEM') {
    console.log('Not enough memory available!')
  } else {
    console.log('Error loading file:', error.message)
  }
}
```

The key insight is that loading massive files into memory all at once is rarely a good idea, even when technically possible. It can lead to memory pressure, slower performance, and poor user experience. Instead, we should process large files incrementally using streams or file handles!

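
Here is a minimal sketch of that incremental approach, using an async-iterable read stream; the `big.log` file is generated on the fly so the example is self-contained:

```javascript
// A sketch of incremental processing with a read stream — file name is hypothetical
import { writeFile } from 'node:fs/promises'
import { createReadStream } from 'node:fs'

await writeFile('big.log', 'x'.repeat(1_000_000)) // stand-in for a huge file

let bytes = 0
const stream = createReadStream('big.log', { highWaterMark: 64 * 1024 })
for await (const chunk of stream) {
  bytes += chunk.length // at most 64KB held in memory per chunk
}
console.log(`Processed ${bytes} bytes incrementally`)
```

Memory usage stays flat no matter how large the file grows, because only one chunk is resident at a time.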
## Advanced Node.js File Operations with File Handles

We've covered a comprehensive range of file operation techniques in Node.js.

```javascript
// stream-buffer-size.js
import { createReadStream } from 'node:fs'

// For large files, use bigger chunks (`path` points at the file to read)
const stream = createReadStream(path, {
  highWaterMark: 128 * 1024, // 128KB chunks
})
```

Yes! Modern Node.js supports top-level await in ES modules, so you can use `await` directly at the top level of your scripts.

### How do I process multiple files concurrently in Node.js?

Use `Promise.all()` or `Promise.allSettled()` with an array of promises to process multiple files simultaneously. For example: `await Promise.all(filenames.map(name => readFile(name)))`. This is much faster than processing files sequentially, especially for I/O-bound operations. If you are processing large files, consider using streams: you can create multiple stream pipelines and run them concurrently.

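
For the "multiple stream pipelines" case, here is a sketch using `pipeline()` from `node:stream/promises` with gzip as a stand-in transform; the `.log` file names are invented for the demo:

```javascript
// A sketch of running several stream pipelines concurrently — file names are hypothetical
import { writeFile } from 'node:fs/promises'
import { createReadStream, createWriteStream } from 'node:fs'
import { createGzip } from 'node:zlib'
import { pipeline } from 'node:stream/promises'

const files = ['a.log', 'b.log']
await Promise.all(files.map((name) => writeFile(name, `contents of ${name}\n`)))

// One pipeline per file, all running at the same time
await Promise.all(
  files.map((name) =>
    pipeline(createReadStream(name), createGzip(), createWriteStream(`${name}.gz`))
  )
)
```

Each `pipeline()` call returns a promise, so `Promise.all()` lets the pipelines interleave their I/O instead of running one after another.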
---

If you found value in this comprehensive guide to file operations, you'll love **Node.js Design Patterns**.


This book dives deep into the patterns, techniques, and best practices that separate good Node.js code from great Node.js code. From fundamental concepts like the ones covered in this article to advanced architectural patterns for building scalable applications, it provides the knowledge you need to write professional, maintainable Node.js code.

**Ready to master Node.js?** Visit our [homepage](/) to discover how Node.js Design Patterns can accelerate your development journey and help you build better applications with confidence.