Fading Coder

One Final Commit for the Last Sprint


Building a Custom JavaScript Module Bundler

Tech · May 16

1. Creating a Custom Command with npm Bin

To start building our custom bundler, we'll set up a command-line interface using the bin field in package.json.


# Initialize a new project
npm init -y

# Add bin configuration to package.json
{
  "bin": {
    "mybundler": "bin/index.js"
  }
}

# Add shebang to the entry file
#!/usr/bin/env node

# Link the package globally
npm link
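Before any bundling logic exists, the command itself can be smoke-tested. A minimal bin/index.js might look like the sketch below; the default paths and argument handling are illustrative placeholders, not part of the final bundler:

```javascript
#!/usr/bin/env node
// bin/index.js - entry point invoked by the `mybundler` command.

// Resolve entry/output paths from CLI arguments, falling back to defaults.
function parseArgs(argv) {
  return {
    entry: argv[0] || "./src/entry.js",
    output: argv[1] || "./dist/bundle.js",
  };
}

// process.argv[0] is the node binary and [1] is this script,
// so the user's arguments start at index 2.
const { entry, output } = parseArgs(process.argv.slice(2));
console.log(`Bundling ${entry} -> ${output}`);

module.exports = { parseArgs };
```

After `npm link`, running `mybundler` from any directory should print the resolved paths.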

2. Basic File Operations

Our bundler needs to read entry files and write output files. Here's how to perform basic file operations in Node.js:


const path = require("path")
const fs = require("fs")

const inputPath = path.resolve('./src/entry.js')
const outputPath = path.resolve('./dist/bundle.js')

// Make sure the output directory exists before writing,
// otherwise writeFileSync throws ENOENT
fs.mkdirSync(path.dirname(outputPath), { recursive: true })

const fileContent = fs.readFileSync(inputPath, "utf-8")
fs.writeFileSync(outputPath, fileContent)

3. Understanding Webpack's Core Mechanism

Let's examine how webpack processes modules and dependencies:


# Install webpack for reference
npm i webpack webpack-cli -D

# Configure package.json
{
  "scripts": {
    "build": "webpack --config webpack.config.js"
  }
}

# Create webpack.config.js
const path = require('path');

module.exports = {
  entry: './src/entry.js',
  output: {
    path: path.resolve(__dirname, './dist'),
    filename: 'bundle.js'
  },
  mode: "development"
};

Webpack wraps our code in an immediately invoked function expression (IIFE) whose own require implementation loads and caches module dependencies:


(() => {
  var modules = {
    "./src/entry.js":
      ((module) => {
        eval("const {helperFunction} = require(\"./src/utils.js\")\r\n\r\nconsole.log(helperFunction)\r\n\n\n");
      }),
    "./src/utils.js":
      ((module) => {
        eval("module.exports = {\r\n  helperFunction: \"Utility function\"\r\n}\r\n\n\n");
      })
  };
  var modulesCache = {};

  function require(modulePath) {
    if (modulesCache[modulePath]) {
      return modulesCache[modulePath].exports;
    }
    modulesCache[modulePath] = {
      exports: {}
    };
    modules[modulePath](modulesCache[modulePath]);
    return modulesCache[modulePath].exports;
  }

  require("./src/entry.js");
})();

4. Implementing Module Traversal

Our bundler needs to recursively discover and process all modules in the dependency graph:


const path = require("path")
const fs = require("fs")
const moduleMap = {}

function processModule(filePath) {
  moduleMap[filePath] = fs.readFileSync(filePath, 'utf-8')
    .replace(/import\(["'](.*)['"]\)/g, (match, importPath) => {
      const resolvedPath = path.join(filePath, '..', importPath)
      if (!moduleMap[resolvedPath]) {
        processModule(resolvedPath)
      }
      // Emit a require() call so the bundle's own require can resolve it
      return `require(${JSON.stringify(resolvedPath)})`
    })
}

processModule(path.join('./src/entry.js'))

5. Concatenating Module Scripts

After discovering all modules, we need to concatenate them into a single executable file:


const path = require("path")
const fs = require("fs")
const entryPoint = path.join('./src/entry.js')
const outputFile = path.join('./dist/bundle.js')
const moduleMap = {}

function processModule(filePath) {
  moduleMap[filePath] = fs.readFileSync(filePath, 'utf-8')
    .replace(/import\(["'](.*)['"]\)/g, (match, importPath) => {
      const resolvedPath = path.join(filePath, '..', importPath)
      if (!moduleMap[resolvedPath]) {
        processModule(resolvedPath)
      }
      // Emit a require() call; the bundle runtime below defines require,
      // so a rewritten import() would bypass the module map entirely
      return `require(${JSON.stringify(resolvedPath)})`
    })
}

processModule(entryPoint)
const moduleDefinitions = Object.keys(moduleMap).map(moduleId => {
  return `
${JSON.stringify(moduleId)}:
  ((module) => {
    eval(${JSON.stringify(moduleMap[moduleId])})
  })
`
}).join(',')

const bundledCode = `
(() => {
  var modules = {
    ${moduleDefinitions}
  };
  var moduleCache = {};

  function require(modulePath) {
    if (moduleCache[modulePath]) {
      return moduleCache[modulePath].exports;
    }
    moduleCache[modulePath] = {
      exports: {}
    };
    modules[modulePath](moduleCache[modulePath]);
    return moduleCache[modulePath].exports;
  }

  require(${JSON.stringify(entryPoint)});
})();
`

// Make sure the output directory exists before writing the bundle
fs.mkdirSync(path.dirname(outputFile), { recursive: true })
fs.writeFileSync(outputFile, bundledCode)

6. Implementing Loaders for Asset Transformation

To handle different file types, we'll implement a simple loader system:


const path = require("path")
const fs = require("fs")
const moduleMap = {}

function cssLoader(source) {
  return `
const styleElement = document.createElement('style')
styleElement.textContent = ${JSON.stringify(source)}
document.head.appendChild(styleElement)
`
}

function processModule(filePath) {
  let moduleContent = fs.readFileSync(filePath, 'utf-8')
    .replace(/import\(["'](.*)['"]\)/g, (match, importPath) => {
      const resolvedPath = path.join(filePath, '..', importPath)
      if (!moduleMap[resolvedPath]) {
        processModule(resolvedPath)
      }
      // Emit a require() call so the bundle's own require can resolve it
      return `require(${JSON.stringify(resolvedPath)})`
    })

  // Apply appropriate loader based on file extension
  if (/\.css$/.test(filePath)) {
    moduleContent = cssLoader(moduleContent)
  }

  moduleMap[filePath] = moduleContent
}

processModule(path.join('./src/entry.js'))
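Since cssLoader is a pure string-to-string transform, it can be checked without a browser; the DOM calls only execute later, when the generated snippet runs inside the bundle. A quick sketch with a hypothetical stylesheet:

```javascript
// cssLoader wraps raw CSS in JavaScript that injects a <style> tag
// when the module eventually runs in the browser.
function cssLoader(source) {
  return `
const styleElement = document.createElement('style')
styleElement.textContent = ${JSON.stringify(source)}
document.head.appendChild(styleElement)
`;
}

// Hypothetical stylesheet content for the demo.
const css = "body { margin: 0; }";
const generated = cssLoader(css);

// JSON.stringify safely escapes the CSS into a JS string literal,
// so the raw source survives verbatim inside the generated code.
console.log(generated.includes(JSON.stringify(css))); // true
```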
