JavaScript Implicit Conversion

A Head-scratching Problem

[undefined] == false; // Outputs true, believe it or not?

[undefined] === false; // But here it outputs false, why???

JavaScript variables have the following characteristics:

  1. Weakly typed: a variable's type can change dynamically at runtime
  2. Primitive values are accessed through wrapper objects of their corresponding types, so the raw primitive value is rarely touched directly
  3. Except for a few special types, every value behaves like an object
  4. Values of different types can be mixed freely in calculations and comparisons (see the sketch after this list)
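A minimal sketch of these characteristics in action (the variable name x is purely illustrative):

let x = 42;          // x holds a number
x = 'forty-two';     // the same variable can later hold a string
x.length;            // 9 — the string is temporarily wrapped in a String object
1 + '2';             // '12' — a number and a string used together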

When performing calculations or comparisons, the JavaScript engine automatically converts variables into appropriate values. Because this conversion happens behind the scenes, without any explicit cast from the programmer, it is called Implicit Conversion.
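For example, each of the following expressions triggers an implicit conversion (a quick sketch; the results follow the standard abstract equality and arithmetic rules):

'5' - 1;      // 4    — the string '5' is converted to the number 5
'5' + 1;      // '51' — here + means concatenation, so 1 becomes '1'
1 == '1';     // true — the string is converted to a number before comparing
0 == false;   // true — the boolean is converted to the number 0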

Implicit Conversion

[1] document.all is also falsy. See documentation.
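As a quick illustration, these are the values that convert to false (a sketch; document.all is the one non-primitive exception noted above):

Boolean(false);      // false
Boolean(0);          // false (also -0 and NaN)
Boolean('');         // false
Boolean(null);       // false
Boolean(undefined);  // false
Boolean([]);         // true — every other object is truthy, except document.all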

[2] Non-primitive to primitive conversion is also an implicit conversion. The engine first calls the valueOf() method to get a primitive value; if valueOf() is missing or returns a non-primitive, it then calls toString(); if neither produces a primitive, a TypeError is thrown.
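A small sketch of that conversion order, using a hypothetical object obj defined only for illustration:

const obj = {
  valueOf() { return {}; },        // returns a non-primitive, so it is ignored
  toString() { return 'hello'; },  // the engine falls back to toString()
};
obj + '';                          // 'hello'

const bare = Object.create(null);  // has neither valueOf() nor toString()
// bare + '';                      // TypeError: Cannot convert object to primitive value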

Putting It to the Test

Analyzing the initial logical comparison:

[undefined] == false;

This compares a non-primitive (the array [undefined]) with a primitive (false), so the non-primitive side is first converted to a primitive:

[undefined] => [undefined].toString() => ''
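You can verify that step in a console (a sketch of the intermediate calls):

[undefined].valueOf();   // [undefined] — still non-primitive, so it is discarded
[undefined].toString();  // '' — Array.prototype.toString joins the elements, and undefined joins as ''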

Becomes:

[undefined] == false => '' == false

Now comparing string and boolean, convert both to number:

'' == false => 0 == 0
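The numeric conversions behind that step can be checked directly:

Number('');      // 0
Number(false);   // 0
'' == false;     // true — both sides become 0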

Ultimately becomes 0 == 0, which returns true.

Whereas [undefined] === false is simpler: === is the strict equality operator, which requires both operands to have the same type and the same value. Since an array and a boolean are different types, no conversion is attempted and the result is false.
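A short sketch of the same values under strict versus abstract equality:

typeof [undefined];     // 'object'
typeof false;           // 'boolean'
[undefined] === false;  // false — different types, no conversion is attempted
[undefined] == false;   // true  — abstract equality converts as shown above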
