Working with the File System in Node.js

Introduction

In Node.js development, interacting with the local file system is a common requirement. Whether it's reading or writing configuration files, handling uploaded documents, or managing project directory structures, the fs module plays a central role. This article provides a comprehensive overview of core fs module functionalities—from file I/O operations to directory management—complete with practical examples and modern best practices.

Understanding the fs Module

The fs module (short for file system) is a built-in component of Node.js that does not require additional installation. It offers a full suite of APIs for performing all kinds of file and directory operations including creation, reading, updating, and deletion (CRUD).

Path Handling Considerations

Correctly handling paths is critical for program stability. Be mindful of:

  • Relative Paths: These are resolved against the process's current working directory (process.cwd()) at runtime, not the script's own location, which can lead to unexpected behavior when the script is launched from a different directory.
  • Absolute Paths: These start from the root of the disk (e.g., /user/project/file.txt or C:/project/file.txt). Using absolute paths is generally recommended.
  • Best Practice: Use path.resolve(__dirname, 'relative/path') to generate an absolute path based on the current file’s location:
const path = require('path');
const filePath = path.resolve(__dirname, 'data.txt');

Writing Files

Writing data to disk involves persisting memory contents into files. The fs module supports several methods tailored for different use cases and coding styles.

Asynchronous Write: writeFile

Non-blocking operation where the main thread continues without waiting for completion. Ideal for scenarios where blocking execution is undesirable.

Syntax: fs.writeFile(file, data[, options], callback)

  • file: File path (preferably absolute)
  • data: Data to write (string or buffer)
  • options: Optional settings (e.g., encoding, flag)
  • callback: Function called upon completion with error parameter

Example:

const fs = require('fs');
const path = require('path');
const filePath = path.resolve(__dirname, 'notes.txt');

fs.writeFile(
  filePath,
  'Hello Node.js',
  { encoding: 'utf8', flag: 'wx' },
  (err) => {
    if (err) {
      if (err.code === 'EEXIST') {
        console.error('Write failed: File already exists');
      } else {
        console.error('Write failed:', err);
      }
      return;
    }
    console.log('Write successful!');
  }
);

Synchronous Write: writeFileSync

Blocking operation that halts execution until the write completes. Suitable for simple scripts or tools where avoiding callbacks is preferred.

Syntax: fs.writeFileSync(file, data[, options])

Example:

const fs = require('fs');
const path = require('path');
const filePath = path.resolve(__dirname, 'notes.txt');

try {
  fs.writeFileSync(filePath, 'Synchronous write', 'utf8');
  console.log('Synchronous write successful!');
} catch (err) {
  if (err.code === 'EACCES') {
    console.error('Write failed: No permission');
  } else {
    console.error('Synchronous write failed:', err);
  }
}

Promise-based Write: fs.promises

Available since Node.js 10, the fs.promises API allows using async/await syntax, avoiding callback nesting and aligning with modern JavaScript practices.

Example:

const fs = require('fs').promises;
const path = require('path');

async function writeFileDemo() {
  const filePath = path.resolve(__dirname, 'notes.txt');
  try {
    await fs.writeFile(filePath, 'Promise-style write', 'utf8');
    console.log('Write successful');
  } catch (err) {
    if (err.code === 'ENOENT') {
      console.error('Write failed: Directory does not exist');
    } else {
      console.error('Write failed:', err);
    }
  }
}

writeFileDemo();

Append to File: appendFile

Adds content to the end of a file rather than overwriting it. Commonly used for logging or accumulating data.

Example (Promise style):

const fs = require('fs').promises;
const path = require('path');

async function appendLog() {
  const logPath = path.resolve(__dirname, 'app.log');
  try {
    await fs.appendFile(
      logPath,
      `[${new Date().toISOString()}] User logged in\n`
    );
    console.log('Log appended successfully');
  } catch (err) {
    console.error('Failed to append log:', err);
  }
}

appendLog();

Stream-based Write: createWriteStream

Efficiently handles large files by streaming chunks instead of loading everything into memory. Useful for gigabyte-sized files or frequent writes.

Syntax: fs.createWriteStream(path[, options])

Example:

const fs = require('fs');
const path = require('path');

const ws = fs.createWriteStream(
  path.resolve(__dirname, 'large-file.txt'),
  { highWaterMark: 1024 * 1024 }
);

let i = 0;
function write() {
  let ok = true;

  while (i < 10000 && ok) {
    ok = ws.write(`Line ${i}\n`);
    i++;
  }

  if (i === 10000) {
    ws.end('End of writing');
    return;
  }

  ws.once('drain', write);
}

write();
ws.on('finish', () => {
  console.log('Large file written successfully!');
});

Key benefit: data is buffered internally and flushed to disk in chunks, keeping memory usage bounded when writing large files. highWaterMark controls the internal buffer size (default 16KB for write streams; fs.createReadStream defaults to 64KB).

Reading Files

Loading data from disk into memory is handled through various fs functions depending on file size and usage context.

Asynchronous Read: readFile

Ideal for small to medium-sized files that can fit comfortably in memory.

Example:

const fs = require('fs');
const path = require('path');
const filePath = path.resolve(__dirname, 'notes.txt');

fs.readFile(filePath, 'utf8', (err, data) => {
  if (err) {
    if (err.code === 'ENOENT') {
      console.error('Read failed: File not found');
    } else {
      console.error('Read failed:', err);
    }
    return;
  }
  console.log('File content:', data);
});

Synchronous Read: readFileSync

Simple and direct method for reading small files.

Example:

const fs = require('fs');
const path = require('path');
const filePath = path.resolve(__dirname, 'notes.txt');

try {
  const data = fs.readFileSync(filePath, 'utf8');
  console.log('Sync read content:', data);
} catch (err) {
  console.error('Sync read failed:', err);
}

Promise-based Read

Modern asynchronous way using promises.

Example:

const fs = require('fs').promises;
const path = require('path');

async function readFileDemo() {
  const filePath = path.resolve(__dirname, 'notes.txt');
  try {
    const data = await fs.readFile(filePath, 'utf8');
    console.log('Promise read content:', data);
  } catch (err) {
    console.error('Read failed:', err);
  }
}

readFileDemo();

Stream-based Read: createReadStream

Designed for large files to prevent excessive memory consumption.

Example:

const fs = require('fs');
const path = require('path');

const rs = fs.createReadStream(
  path.resolve(__dirname, 'large-file.txt'),
  { highWaterMark: 128 * 1024 }
);

rs.on('data', (chunk) => {
  console.log(`Received ${chunk.length} bytes`);
});

rs.on('end', () => {
  console.log('File read completed!');
});

rs.on('error', (err) => {
  console.error('Read error:', err);
});

Common use cases: Video playback, log analysis, resumable downloads.

Renaming and Moving Files

The fs.rename function serves both renaming a file and moving it to another directory on the same file system.

Example (using Promises):

const fs = require('fs').promises;
const path = require('path');

async function moveFile() {
  const oldName = path.resolve(__dirname, 'notes.txt');
  const newName = path.resolve(__dirname, 'diary.txt');

  try {
    await fs.rename(oldName, newName);
    console.log('Renamed successfully!');
  } catch (err) {
    console.error('Rename failed:', err);
  }

  const sourcePath = path.resolve(__dirname, 'diary.txt');
  const targetPath = path.resolve(__dirname, 'docs', 'diary.txt');

  try {
    await fs.rename(sourcePath, targetPath);
    console.log('Moved successfully!');
  } catch (err) {
    console.error('Move failed:', err);
  }
}

moveFile();

Note: If a file with the same name exists at the destination, it will be overwritten silently. Also, rename cannot move a file across file systems or drives; it fails with an EXDEV error, in which case you must copy the file and delete the original.

Deleting Files

Use fs.unlink (traditional) or fs.rm (Node.js 14.14+) for removing files.

Modern Approach: rm

Supports more options including ignoring non-existent files (force: true).

Example:

const fs = require('fs').promises;
const path = require('path');

async function deleteFile() {
  try {
    await fs.rm(path.resolve(__dirname, 'temp.txt'), { force: true });
    console.log('File deleted');
  } catch (err) {
    console.error('Delete failed:', err);
  }
}

deleteFile();

Directory Operations

Managing folders requires specific APIs within the fs module.

Creating Directories: mkdir

Supports recursive creation of nested paths.

Example:

const fs = require('fs').promises;
const path = require('path');

async function createDir() {
  try {
    await fs.mkdir(path.resolve(__dirname, 'data/logs/2024'), {
      recursive: true,
    });
    console.log('Directory created successfully');
  } catch (err) {
    console.error('Create failed:', err);
  }
}

createDir();

Reading Directories: readdir

Retrieves names of entries inside a directory. Can also provide detailed metadata.

Example:

const fs = require('fs').promises;
const path = require('path');

async function readDir() {
  const dirPath = path.resolve(__dirname, 'docs');
  try {
    const items = await fs.readdir(dirPath, { withFileTypes: true });
    items.forEach((item) => {
      if (item.isFile()) {
        console.log('File:', item.name);
      } else if (item.isDirectory()) {
        console.log('Directory:', item.name);
      }
    });
  } catch (err) {
    console.error('Read failed:', err);
  }
}

readDir();

Removing Directories: rm

Use fs.rm with recursive: true to delete non-empty directories easily.

Example:

const fs = require('fs').promises;
const path = require('path');

async function deleteDir() {
  const dirPath = path.resolve(__dirname, 'data');
  try {
    await fs.rm(dirPath, { recursive: true, force: true });
    console.log('Directory deleted');
  } catch (err) {
    console.error('Delete failed:', err);
  }
}

deleteDir();

Checking File Status: stat

Provides detailed information about a file or directory such as size, type, and modification time.

Example:

const fs = require('fs').promises;
const path = require('path');

async function getStats() {
  const targetPath = path.resolve(__dirname, 'docs');
  try {
    const stats = await fs.stat(targetPath);
    console.log('Is file:', stats.isFile());
    console.log('Is directory:', stats.isDirectory());
    console.log('Size:', stats.size, 'bytes');
    console.log('Last modified:', stats.mtime.toLocaleString());
  } catch (err) {
    console.error('Failed to get stats:', err);
  }
}

getStats();

Practical Use Cases

Case 1: Log Rotation Based on Date

Automatically split logs into daily files to manage growth.

Implementation: Use createWriteStream and switch to a new log stream whenever the date rolls over.

const fs = require('fs');
const path = require('path');
const { format } = require('date-fns'); // third-party: npm install date-fns

let currentStream;
let currentDate = format(new Date(), 'yyyy-MM-dd');

function getLogStream() {
  const today = format(new Date(), 'yyyy-MM-dd');

  if (today !== currentDate || !currentStream) {
    currentDate = today;
    const logPath = path.resolve(__dirname, `logs/${currentDate}.log`);

    if (currentStream) currentStream.close();

    currentStream = fs.createWriteStream(logPath, { flags: 'a' });
  }

  return currentStream;
}

function writeLog(content) {
  const stream = getLogStream();
  stream.write(`[${new Date().toISOString()}] ${content}\n`);
}

setInterval(() => {
  writeLog('User visited homepage');
}, 3000);

Case 2: Large File Copy Without Memory Overflows

Copy a large video file efficiently without loading it entirely into memory.

Implementation: Use readable and writable streams connected via pipe().

const fs = require('fs');
const path = require('path');

function copyLargeFile(source, target) {
  const rs = fs.createReadStream(source);
  const ws = fs.createWriteStream(target);

  rs.pipe(ws);

  ws.on('finish', () => {
    console.log('File copied successfully');
  });

  rs.on('error', (err) => console.error('Read error:', err));
  ws.on('error', (err) => console.error('Write error:', err));
}

copyLargeFile(
  path.resolve(__dirname, 'source.mp4'),
  path.resolve(__dirname, 'target.mp4')
);

Case 3: Batch Processing Text Files

Recursively process .txt files in a folder to replace matching content.

Implementation: Combine readdir, stat, readFile, and writeFile.

const fs = require('fs').promises;
const path = require('path');

async function replaceInTxtFiles(dirPath, oldStr, newStr) {
  const items = await fs.readdir(dirPath, { withFileTypes: true });

  for (const item of items) {
    const fullPath = path.resolve(dirPath, item.name);

    if (item.isDirectory()) {
      await replaceInTxtFiles(fullPath, oldStr, newStr);
    } else if (item.isFile() && item.name.endsWith('.txt')) {
      const content = await fs.readFile(fullPath, 'utf8');
      const newContent = content.replaceAll(oldStr, newStr);
      await fs.writeFile(fullPath, newContent, 'utf8');
      console.log(`Processed: ${fullPath}`);
    }
  }
}

replaceInTxtFiles(path.resolve(__dirname, 'docs'), 'old content', 'new content')
  .then(() => console.log('All files processed'))
  .catch((err) => console.error('Processing failed:', err));

Performance Comparison and Recommendations

  • Small files (<100KB): prefer readFile/writeFile; streaming overhead outweighs its benefits.
  • Large files (>10MB): prefer streams; readFile/writeFile risks memory overflow.
  • Frequent appends (logging): prefer appendFile or a write stream; writeFile would overwrite existing content.
  • Scripts and tools: synchronous APIs (readFileSync) are fine; simpler without callbacks.
  • Modern projects: prefer fs.promises with async/await; cleaner and more maintainable than callbacks.

Summary

The fs module is essential for Node.js applications needing local file interaction. Key takeaways:

  1. Always resolve paths using path.resolve(__dirname, 'relative/path') to ensure safety.
  2. For beginners, prefer fs.promises with async/await; synchronous APIs work well for quick scripts.
  3. For large files, always use stream-based operations (createReadStream/createWriteStream) with pipe().
  4. Handle common errors like ENOENT, EEXIST, and EACCES appropriately.
  5. Combine multiple techniques like recursion, timing, and streaming for real-world tasks like log rotation or batch processing.
