
Commit ebbc27d

feat: Completed article
1 parent bff23cd commit ebbc27d

File tree

1 file changed (+36 −42)

  • src/content/blog/reading-writing-files-nodejs/index.md
src/content/blog/reading-writing-files-nodejs/index.md

Lines changed: 36 additions & 42 deletions
@@ -1,6 +1,6 @@
 ---
-date: 2025-01-06T10:00:00
-updatedAt: 2025-01-06T10:00:00
+date: 2025-10-12T23:40:00
+updatedAt: 2025-10-12T23:40:00
 title: Reading and Writing Files in Node.js - The Complete Modern Guide
 slug: reading-writing-files-nodejs
 description: Learn the modern way to read and write files in Node.js using promises, streams, and file handles. Master memory-efficient file operations for production applications.
@@ -187,7 +187,7 @@ await createBeepWav()
 
 This example demonstrates several key concepts for binary file manipulation in Node.js. The `encodeWavPcm16()` function creates a complete WAV file structure by constructing both the header and audio data sections. The WAV header contains metadata like file size, audio format, sample rate, and number of channels, while the data section contains the actual audio samples.
 
-The `generateBeepTone()` function creates audio samples using a sine wave mathematical formula, generating a pure tone at the specified frequency. Each sample represents the amplitude of the sound wave at a specific point in time, and when played back at the correct sample rate, these digital values recreate the original analog sound.
+The `makeSine()` function creates audio samples using a sine wave mathematical formula, generating a pure tone at the specified frequency. Each sample represents the amplitude of the sound wave at a specific point in time, and when played back at the correct sample rate, these digital values recreate the original analog sound.
 
 Don't worry too much about the specifics of this example dealing with the WAV binary format! What matters here is learning that we can put arbitrary binary data into a buffer and write it into a file using `writeFile()`. The key takeaway is that Node.js treats all file operations the same way, whether you're writing text, JSON, images, audio, or any other type of data.
 
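The commit renames `generateBeepTone()` to `makeSine()`, but the function body isn't shown in this hunk. As a rough illustration of the sine-wave sampling the paragraph above describes (a minimal sketch; the article's actual implementation may differ), generating 16-bit PCM samples could look like this:

```javascript
// sine-sketch.js (hypothetical helper, not part of the commit)
// Fill a typed array with 16-bit PCM samples of a pure tone.
function makeSineSketch(frequency, durationSec, sampleRate = 44100) {
  const samples = new Int16Array(Math.floor(durationSec * sampleRate))
  for (let i = 0; i < samples.length; i++) {
    const t = i / sampleRate // time of this sample, in seconds
    // sin() yields values in [-1, 1]; scale to the signed 16-bit range
    samples[i] = Math.round(Math.sin(2 * Math.PI * frequency * t) * 32767)
  }
  return samples
}

console.log(makeSineSketch(440, 0.001)) // ~44 samples of an A4 (440 Hz) tone
```
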
@@ -280,10 +280,6 @@ The function uses Buffer methods like `readUInt32LE()` and `readUInt16LE()` to i
 
 The duration calculation combines several pieces of metadata: we divide the total audio data size by the byte rate (bytes per second) to get the duration in seconds, then multiply by 1000 to convert to milliseconds. This demonstrates how understanding both the file format and basic math allows us to derive meaningful information from binary data.
 
-:::note[Finding free WAV files]
-You can try out this example with the beep WAV files we created in the previous example or, if you need some longer free WAV files, you can check out a website with free audio samples like [Zapsplat](https://www.zapsplat.com/).
-:::
-
 :::note[Binary file complexity]
 This is a very simple example showing basic binary manipulation using only Node.js built-ins. We can handle the WAV format manually here because we're creating a minimal, single-channel PCM file with known parameters.
 
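As a quick sanity check of the duration formula described in this hunk, with hypothetical numbers for a mono, 16-bit, 44.1 kHz file:

```javascript
// duration-sketch.js (hypothetical values, for illustration only)
const dataSize = 882000 // bytes of PCM audio data
const byteRate = 44100 * 1 * 2 // sampleRate × channels × bytesPerSample = 88200

const durationMs = (dataSize / byteRate) * 1000
console.log(durationMs) // 10000 → a 10-second file
```
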
@@ -318,7 +314,7 @@ try {
   const [databaseConfig, apiConfig, loggingConfig, featureFlagsConfig] =
     await Promise.all(promises)
 
-  console.log(`Successfully loaded config files`, {
+  console.log('Successfully loaded config files', {
     databaseConfig,
     apiConfig,
     loggingConfig,
@@ -341,7 +337,7 @@ This concurrent approach provides a significant performance improvement. If we u
 
 ```javascript {26-29}
 // write-multiple-files.js
-import { writeFile } from 'node:fs/promises'
+import { mkdir, writeFile } from 'node:fs/promises'
 
 async function generateReports(data) {
   const reports = [
@@ -385,6 +381,7 @@ const reportData = {
   yearly: { sales: 365000, visitors: 91250 },
 }
 
+await mkdir('reports', { recursive: true }) // Ensure reports directory exists
 await generateReports(reportData)
 ```
 
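The `mkdir('reports', { recursive: true })` line added here is worth a note: with `recursive: true`, `mkdir()` creates any missing intermediate directories and does not throw if the target already exists, which makes it safe to run unconditionally before writing. A standalone sketch (the nested path is hypothetical):

```javascript
// mkdir-recursive-sketch.js
import { mkdir } from 'node:fs/promises'

// Creates reports/, reports/2025/ and reports/2025/q1/ as needed;
// succeeds silently if they already exist.
await mkdir('reports/2025/q1', { recursive: true })
console.log('Nested directories ready')
```
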
@@ -411,11 +408,11 @@ Learn more: [Promise.all() on MDN](https://developer.mozilla.org/en-US/docs/Web/
 
 ### Working with Directories
 
-Files don't exist in isolation, they live in directories. What if you need to process all files in a folder, or create directories dynamically? You'll often need to work with directories too:
+In the previous example we used the `mkdir()` function to create a directory before writing files into it. This is a common pattern because files don't exist in isolation; they live in directories. What if you need to process all files in a folder, or create directories dynamically? Let's see a more complete example that combines several file system operations:
 
 ```javascript {8,11,14,15,17}
 // process-directory.js
-import { readdir, mkdir, stat } from 'node:fs/promises'
+import { mkdir, readdir, stat } from 'node:fs/promises'
 import { join } from 'node:path'
 
 async function processDirectory(dirPath) {
@@ -513,7 +510,7 @@ try {
 // Synchronous file writing
 try {
   const userData = { name: 'John Doe', email: '[email protected]' }
-  const jsonData = JSON.stringify(userData, null, 2)
+  const jsonData = JSON.stringify(userData)
   writeFileSync('user-data.json', jsonData, 'utf8')
   console.log('User data saved successfully!')
 } catch (error) {
@@ -558,47 +555,48 @@ Imagine you're trying to read a 2GB log file using `readFile()`. Your Node.js pr
 1. **Out of memory errors** - Your application might crash
 2. **Poor performance** - High memory usage affects other operations
 
-### Understanding Node.js Buffer Limits
+### Understanding Memory Considerations for Large Files
 
-Node.js has built-in limits on buffer sizes to prevent applications from consuming too much memory. You can check these limits:
+While modern Node.js has significantly increased buffer limits (the theoretical maximum is now around 9 petabytes, compared to the original limits of ~1GB on 32-bit and ~2GB on 64-bit architectures), this doesn't mean we should load massive files into memory all at once. You can check the current buffer limits on your system:
 
 ```javascript
 // check-buffer-limits.js
-// Check the maximum buffer size
-console.log('Max buffer size:', Buffer.constants.MAX_LENGTH)
-console.log('Max string length:', Buffer.constants.MAX_STRING_LENGTH)
+import buffer from 'node:buffer'
 
-// On most systems:
-// MAX_LENGTH is around 2GB (2,147,483,647 bytes on 64-bit systems)
-// MAX_STRING_LENGTH is around 1GB
+// Check the maximum buffer size
+console.log('Max buffer size:', buffer.constants.MAX_LENGTH)
+console.log('Max string length:', buffer.constants.MAX_STRING_LENGTH)
+
+// Convert to a more readable format
+const maxSizeGB = (buffer.constants.MAX_LENGTH / (1024 * 1024 * 1024)).toFixed(
+  2,
+)
+console.log(`That's approximately ${maxSizeGB} GB`)
 ```
 
-If you try to read a file larger than these limits using `readFile()`, you'll get an error:
+Even though Node.js can theoretically handle extremely large buffers, attempting to read huge files into memory can cause practical problems:
 
 ```javascript
-// handle-large-file-error.js
+// handle-large-file-memory.js
 import { readFile } from 'node:fs/promises'
 
 try {
-  // This will fail if the file is larger than the buffer limit
+  // Even if this doesn't hit buffer limits, it might cause memory issues
+  console.log('Attempting to read large file...')
   const hugeFile = await readFile('massive-dataset.csv', 'utf8')
+  console.log('File loaded successfully!')
 } catch (error) {
   if (error.code === 'ERR_FS_FILE_TOO_LARGE') {
-    console.log('File is too large to read into memory at once!')
+    console.log('File exceeds buffer limits!')
+  } else if (error.code === 'ENOMEM') {
+    console.log('Not enough memory available!')
+  } else {
+    console.log('Error loading file:', error.message)
   }
 }
 ```
 
-So does this mean that Node.js can't handle big files?! Of course not, Node.js is actually quite good at handling them... we just need to use different tools to do that! The trick is to make sure we don't load ALL the data into memory in one go, but we process the data in smaller incremental chunks!
-
-### When to Use Promise-Based Methods
-
-Promise-based file operations are great when:
-
-- **Files are small to medium-sized** (typically under 100MB)
-- **You need the entire content at once** (parsing JSON, reading config files)
-- **Simplicity is important** (rapid prototyping, simple scripts)
-- **Memory usage isn't a concern** (plenty of RAM available)
+The key insight is that loading massive files into memory all at once is rarely a good idea, even when technically possible. It can lead to memory pressure, slower performance, and poor user experience. Instead, we should process large files incrementally using streams or file handles!
 
 ## Advanced Node.js File Operations with File Handles
@@ -635,13 +633,13 @@ async function readFileInChunks(filePath) {
 
       // Process the chunk
       const chunk = buffer.subarray(0, result.bytesRead)
-      console.log(`Read ${result.bytesRead} bytes:`, chunk.toString('utf8'))
+      console.log(`>>> Read ${result.bytesRead} bytes:`, chunk.toString('utf8'))
 
       position += result.bytesRead
       totalBytesRead += result.bytesRead
     }
 
-    console.log(`Total bytes read: ${totalBytesRead}`)
+    console.log(`>>> Total bytes read: ${totalBytesRead}`)
   } catch (error) {
     console.error('Error reading file:', error.message)
     throw error
@@ -1066,7 +1064,7 @@ We've covered a comprehensive range of file operation techniques in Node.js, fro
 // stream-buffer-size.js
 // For large files, use bigger chunks
 const stream = createReadStream(path, {
-  highWaterMark: 128 * 1024, // 64KB chunks
+  highWaterMark: 128 * 1024, // 128KB chunks
})
 ```
 
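The comment fix is correct: `128 * 1024` bytes is 128KB, not 64KB (64KB is the default `highWaterMark` for file read streams). One way to observe the effect, assuming some large placeholder file:

```javascript
// chunk-size-check-sketch.js ('big-file.bin' is a placeholder name)
import { createReadStream } from 'node:fs'

// Each emitted chunk is capped at highWaterMark bytes;
// only the final chunk may be smaller.
const stream = createReadStream('big-file.bin', { highWaterMark: 128 * 1024 })
for await (const chunk of stream) {
  console.log(`chunk: ${chunk.length} bytes`) // at most 131072
}
```
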
@@ -1133,7 +1131,7 @@ Yes! Modern Node.js supports top-level await in ES modules, so you can use `awai
 
 ### How do I process multiple files concurrently in Node.js?
 
-Use `Promise.all()` or `Promise.allSettled()` with an array of promises to process multiple files simultaneously. For example: `await Promise.all(filenames.map(name => readFile(name)))`. This is much faster than processing files sequentially, especially for I/O-bound operations. If you are processing large files, you might want to consider using streams. You can create multiple stream objects and pipeline and run them concurrently.
+Use `Promise.all()` or `Promise.allSettled()` with an array of promises to process multiple files simultaneously. For example: `await Promise.all(filenames.map(name => readFile(name)))`. This is much faster than processing files sequentially, especially for I/O-bound operations. If you are processing large files, you might want to consider using streams. You can create multiple stream pipelines and run them concurrently.
 
 ---
@@ -1146,7 +1144,3 @@ If you found value in this comprehensive guide to file operations, you'll love *
 This book dives deep into the patterns, techniques, and best practices that separate good Node.js code from great Node.js code. From fundamental concepts like the ones covered in this article to advanced architectural patterns for building scalable applications, it provides the knowledge you need to write professional, maintainable Node.js code.
 
 **Ready to master Node.js?** Visit our [homepage](/) to discover how Node.js Design Patterns can accelerate your development journey and help you build better applications with confidence.
-
-```
-
-```
