Fading Coder

One Final Commit for the Last Sprint


Understanding Vue's Asynchronous Update Mechanism and the Implementation of nextTick


JavaScript Event Loop Mechanism

JavaScript operates on a single-threaded model. To handle asynchronous operations without blocking execution, environments like browsers and Node.js implement an Event Loop. Synchronous code runs on the call stack, while asynchronous callbacks are placed into task queues, specifically macro-task and micro-task queues, which the Event Loop drains between runs of synchronous code.

The Event Loop dictates the execution order: it runs all synchronous code first. Then, it processes tasks from the micro-task queue completely before moving to the next task in the macro-task queue. If new micro-tasks are generated while executing micro-tasks, they are added to the current micro-task queue and executed before the loop returns to macro-tasks.
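A minimal snippet (runnable in any modern JavaScript engine) illustrating this last point: a micro-task queued while the queue is draining still runs before the pending macro-task.

```javascript
// Collect the observed order so it can be printed at the end.
const order = [];

setTimeout(() => order.push('macro-task'), 0);

Promise.resolve().then(() => {
  order.push('micro-task 1');
  // Queued while the micro-task queue is already draining,
  // yet it still runs before the setTimeout callback above.
  Promise.resolve().then(() => order.push('micro-task 2'));
});

setTimeout(() => console.log(order.join(' -> ')), 10);
// Logs: micro-task 1 -> micro-task 2 -> macro-task
```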

Common macro-tasks include: script (the overall program), setTimeout, setInterval, setImmediate (Node.js), I/O operations, and UI rendering.

Common micro-tasks include: process.nextTick() (Node.js), Promise callbacks (.then, .catch, .finally), async/await (which uses Promises), and MutationObserver.

In terms of priority within their respective queues: process.nextTick has a higher priority than Promise callbacks in Node.js's micro-task queue.

Example 1: Basic Execution Order

setTimeout(() => {
    console.log('Macro-task 1 executed');
});

new Promise((resolveFunc) => {
    console.log('Synchronous promise executor');
    resolveFunc();
}).then(() => {
    console.log('Micro-task 1 executed');
});

console.log('Synchronous log');
// Output order:
// Synchronous promise executor
// Synchronous log
// Micro-task 1 executed
// Macro-task 1 executed

Analysis:

  1. Synchronous execution: The Promise constructor and the final console.log run immediately.
  2. The setTimeout callback is placed in the macro-task queue.
  3. The .then() callback is placed in the micro-task queue.
  4. After the synchronous stack is empty, the Event Loop processes the micro-task queue, executing the .then() callback.
  5. Finally, it processes the macro-task queue, executing the setTimeout callback.

Example 2: Complex Async/Await Order

async function async2() {
    console.log('async2 end');
}

async function async1() {
    console.log('async1 start');
    await async2();
    console.log('async1 end');
}

console.log('script start');

setTimeout(function() {
    console.log('setTimeout');
}, 0);

async1();

new Promise(function(resolve) {
    console.log('promise');
    resolve();
}).then(function() {
    console.log('promise1');
}).then(function() {
    console.log('promise2');
});

console.log('script end');
// Typical output order (modern engines):
// script start
// async1 start
// async2 end
// promise
// script end
// async1 end
// promise1
// promise2
// setTimeout

Analysis: await async2() logs 'async2 end' synchronously and then suspends async1; the resumption that logs 'async1 end' is queued as a micro-task ahead of the .then() callbacks, while the setTimeout callback waits in the macro-task queue until all micro-tasks have drained.

Vue's nextTick API

Purpose: nextTick allows you to perform operations on the DOM after Vue has updated it in response to a data change. Since Vue's updates are asynchronous, DOM manipulations directly after a data change might target the old state.

Core Concept: It leverages the micro-task queue (with fallbacks to macro-tasks) to defer the execution of a callback until after the current synchronous code and the current micro-task queue have been processed.

Implementation Process:

  1. Collect callbacks into a queue.
  2. Schedule a flush of this queue using an asynchronous method (preferring micro-tasks).
  3. When the scheduled task executes, it runs all accumulated callbacks in order.

Simplified Implementation Example

// next-tick.js
const deferredCallbacks = [];
let isScheduled = false;

function executeCallbacks() {
  isScheduled = false;
  const copies = deferredCallbacks.slice(0);
  deferredCallbacks.length = 0;
  for (let i = 0; i < copies.length; i++) {
    copies[i]();
  }
}

let scheduleFlush;

// Determine the best asynchronous scheduling strategy (progressive enhancement)
if (typeof Promise !== 'undefined') {
  const promiseResolved = Promise.resolve();
  scheduleFlush = () => {
    promiseResolved.then(executeCallbacks);
  };
} else if (typeof MutationObserver !== 'undefined') {
  let counter = 1;
  const textObserver = new MutationObserver(executeCallbacks);
  const textElement = document.createTextNode(String(counter));
  textObserver.observe(textElement, { characterData: true });
  scheduleFlush = () => {
    counter = (counter + 1) % 2;
    textElement.data = String(counter);
  };
} else if (typeof setImmediate !== 'undefined') {
  scheduleFlush = () => {
    setImmediate(executeCallbacks);
  };
} else {
  // Fallback to setTimeout
  scheduleFlush = () => {
    setTimeout(executeCallbacks, 0);
  };
}

export function nextTick(callback) {
  deferredCallbacks.push(callback);
  if (!isScheduled) {
    isScheduled = true;
    scheduleFlush();
  }
}

How it works: When nextTick(cb) is called, the callback cb is pushed into the deferredCallbacks array. If a flush operation hasn't already been scheduled (!isScheduled), it triggers scheduleFlush(). This function uses the best available asynchronous API to schedule executeCallbacks, which will run all collected callbacks on the next tick of the event loop. The isScheduled flag ensures that even if nextTick is called multiple times synchronously, only one asynchronous flush is scheduled.
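As a quick check of the batching behavior, here is a usage sketch. The Promise-path-only restatement below mirrors the simplified implementation above, inlined so the snippet runs standalone in Node.js or a browser console:

```javascript
// Promise-path-only restatement of the simplified nextTick above,
// inlined so this snippet is self-contained.
const deferredCallbacks = [];
let isScheduled = false;

function nextTick(callback) {
  deferredCallbacks.push(callback);
  if (!isScheduled) {
    isScheduled = true;
    Promise.resolve().then(() => {
      isScheduled = false;
      const copies = deferredCallbacks.slice(0);
      deferredCallbacks.length = 0;
      copies.forEach((cb) => cb());
    });
  }
}

// Two synchronous calls share a single scheduled flush.
nextTick(() => console.log('callback 1'));
nextTick(() => console.log('callback 2'));
console.log('synchronous code');
// Output:
// synchronous code
// callback 1
// callback 2
```

Both callbacks run in one micro-task flush after the synchronous code finishes, which is exactly the de-duplication the isScheduled flag provides.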
