Fading Coder

One Final Commit for the Last Sprint


Applying Entropy Weight Method for Objective Weighting in TOPSIS


In multi-criteria decision making, determining the appropriate weights for criteria is a critical step. While subjective methods like the Analytic Hierarchy Process (AHP) are common, they rely heavily on expert judgment, introducing significant subjectivity.

To address this, the entropy weight method offers an objective approach to weight determination. The core principle is that an indicator with less variation (smaller variance or standard deviation) conveys less information and should therefore be assigned a lower weight.

Information Content and Entropy

The amount of information an event conveys is inversely related to its probability: highly probable events contain little information, whereas improbable events contain significant information. Information entropy quantifies the expected value of this information content.

For the entropy weight method, a higher information entropy for a data column indicates less information content derived from that indicator.
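To make this concrete, here is a small Python sketch (function and variable names are illustrative, not from the article) that computes the normalized entropy of a single indicator column. A nearly constant column scores close to 1, signalling that it contributes little information:

```python
import math

def column_entropy(col):
    """Normalized information entropy of one indicator column
    (all values non-negative, at least one positive)."""
    total = sum(col)
    probs = [v / total for v in col]
    # By convention, p * ln(p) = 0 when p = 0.
    return -sum(p * math.log(p) for p in probs if p > 0) / math.log(len(col))

flat = [0.50, 0.51, 0.49, 0.50]    # nearly constant indicator
varied = [0.05, 0.90, 0.03, 0.02]  # highly varied indicator

print(column_entropy(flat))    # close to 1: little information
print(column_entropy(varied))  # well below 1: more information
```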

Implementation Steps

  1. Preprocess the Data: Ensure the input decision matrix contains no negative values. This typically involves converting all indicators to a positive (benefit) orientation and then standardizing them. All elements must be non-negative for the probability calculations.
  2. Calculate Probability Distribution: For each indicator (column) (j), compute the probability (p_{ij}) that the (i)-th sample contributes to that indicator: [ p_{ij} = \frac{z_{ij}}{\sum_{i=1}^{n} z_{ij}} ] where (z_{ij}) is the standardized value for sample (i) and indicator (j).
  3. Compute Entropy and Weights:
    • Calculate the information entropy (e_j) for the (j)-th indicator: [ e_j = -k \sum_{i=1}^{n} p_{ij} \ln(p_{ij}) ] where (k = 1 / \ln(n)) is a constant ensuring (0 \le e_j \le 1).
    • Derive the information utility value (d_j = 1 - e_j). A larger (d_j) means the indicator provides more useful information.
    • Normalize the utility values to obtain the final objective weights (w_j): [ w_j = \frac{d_j}{\sum_{j=1}^{m} d_j} ]

This method produces weights derived solely from the data's inherent structure.
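The three steps can be sketched end-to-end in Python (the sample matrix and names are illustrative assumptions, not data from the article):

```python
import math

def entropy_weights(Z):
    """Objective weights for a standardized, non-negative n x m matrix Z
    (given as a list of rows), following steps 1-3 above."""
    n, m = len(Z), len(Z[0])
    k = 1 / math.log(n)
    utilities = []
    for j in range(m):
        col_sum = sum(row[j] for row in Z)
        probs = [row[j] / col_sum for row in Z]          # step 2: p_ij
        e_j = -k * sum(p * math.log(p) for p in probs if p > 0)  # step 3: e_j
        utilities.append(1 - e_j)                        # utility d_j
    d_sum = sum(utilities)
    return [d / d_sum for d in utilities]                # normalize to w_j

# Hypothetical 4-sample, 3-indicator matrix, already standardized.
Z = [[0.2, 0.9, 0.5],
     [0.8, 0.8, 0.5],
     [0.4, 0.9, 0.5],
     [0.6, 0.8, 0.5]]
w = entropy_weights(Z)
print(w)
```

Note how the constant third column receives zero weight: its entropy is exactly 1, so its information utility vanishes.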

MATLAB Code Example

The following function calculates entropy-based weights for an (n \times m) standardized matrix Z.

function weights = calculateEntropyWeights(Z)
    % Calculates objective weights using the entropy method.
    % Input: Z (n x m matrix), standardized and non-negative.
    % Output: weights (1 x m row vector).

    [numSamples, numIndicators] = size(Z);
    utilityValues = zeros(1, numIndicators);

    for idx = 1:numIndicators
        colData = Z(:, idx);
        probDist = colData / sum(colData);
        entropyVal = -sum(probDist .* safeLog(probDist)) / log(numSamples);
        utilityValues(idx) = 1 - entropyVal;
    end
    weights = utilityValues / sum(utilityValues);
end

function result = safeLog(inputVec)
    % Elementwise natural log that returns 0 for zero entries,
    % implementing the convention 0 * ln(0) = 0 to avoid NaN results.
    result = zeros(size(inputVec));
    mask = inputVec > 0;
    result(mask) = log(inputVec(mask));
end

Integration with TOPSIS and Considerations

The entropy weight method is primarily a technique for determining weights. It is commonly integrated with other decision-making models, such as TOPSIS, to provide an objective weighting component, thereby reducing reliance on subjective judgments.
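A minimal sketch of the TOPSIS side of that integration, in Python: `topsis_scores` and the sample weights below are illustrative assumptions (in practice the weights would come from the entropy calculation), and the sketch assumes all indicators are already benefit-oriented and standardized.

```python
import math

def topsis_scores(Z, w):
    """Relative closeness of each sample to the ideal solution,
    given a standardized matrix Z (rows = samples) and weights w."""
    m = len(w)
    # Weighted standardized matrix.
    V = [[w[j] * row[j] for j in range(m)] for row in Z]
    best = [max(col) for col in zip(*V)]   # positive ideal solution
    worst = [min(col) for col in zip(*V)]  # negative ideal solution
    scores = []
    for row in V:
        d_plus = math.sqrt(sum((row[j] - best[j]) ** 2 for j in range(m)))
        d_minus = math.sqrt(sum((row[j] - worst[j]) ** 2 for j in range(m)))
        scores.append(d_minus / (d_plus + d_minus))
    return scores

Z = [[0.9, 0.8], [0.1, 0.2], [0.5, 0.5]]
w = [0.6, 0.4]  # e.g. entropy-derived weights
scores = topsis_scores(Z, w)
print(scores)
```

The sample dominating on both indicators scores 1, the dominated sample scores 0, and the middle sample falls in between.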

Important Limitations:

  • Because the method is purely data-driven, the resulting weights may not always align with real-world expert knowledge or expectations.
  • The calculated probabilities depend on the initial data standardization method. Different standardization techniques can lead to significantly different weight outcomes, as there is no universally agreed-upon standard in practice.
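The second limitation can be demonstrated directly. The sketch below (Python; the raw data and helper names are illustrative assumptions) applies the same entropy calculation to one dataset under two common standardizations; the resulting weights differ enough to reverse the indicators' ranking:

```python
import math

def entropy_weights(Z):
    """Entropy-based weights for a non-negative matrix Z (list of rows)."""
    n, m = len(Z), len(Z[0])
    k = 1 / math.log(n)
    d = []
    for j in range(m):
        s = sum(row[j] for row in Z)
        probs = [row[j] / s for row in Z]
        e = -k * sum(p * math.log(p) for p in probs if p > 0)
        d.append(1 - e)
    t = sum(d)
    return [x / t for x in d]

raw = [[3.0, 10.0], [5.0, 40.0], [9.0, 25.0]]

def min_max(Z, eps=1e-3):
    # Min-max scaling per column, shifted slightly to keep values positive.
    cols = list(zip(*Z))
    return [[(v - min(c)) / (max(c) - min(c)) + eps
             for v, c in zip(row, cols)] for row in Z]

def sum_scale(Z):
    # Scaling each column by its sum.
    cols = list(zip(*Z))
    return [[v / sum(c) for v, c in zip(row, cols)] for row in Z]

w_minmax = entropy_weights(min_max(raw))
w_sumscale = entropy_weights(sum_scale(raw))
print(w_minmax)    # first indicator weighted higher
print(w_sumscale)  # second indicator weighted higher
```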
