To supplement Wayne's answer and to try to explain why ToPrimitive([]) returns "", it's worth considering two possible types of answers to the 'why' question. The first type of answer is: "because the specification says this is how JavaScript will behave." The ES5 spec, in section 9.1, describes the result of ToPrimitive for an Object as a default value:
The default value of an object is retrieved by calling the [[DefaultValue]] internal method of the object, passing the optional hint PreferredType.
Section 8.12.8 describes the [[DefaultValue]] method. This method takes a "hint" as an argument, and the hint can be either String or Number. To simplify the matter by dispensing with some details: if the hint is String, then [[DefaultValue]] returns the value of toString() if it exists and returns a primitive value, and otherwise returns the value of valueOf(). If the hint is Number, the priorities of toString() and valueOf() are reversed, so that valueOf() is called first and its value returned if it is a primitive. Thus, whether [[DefaultValue]] returns the result of toString() or valueOf() depends on the specified PreferredType for the object and on whether or not these functions return primitive values.
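To make the algorithm concrete, here is a minimal sketch of that [[DefaultValue]] logic in plain JavaScript. The names defaultValue and isPrimitive are illustrative helpers, not anything an engine actually exposes; the sketch also includes the TypeError the spec requires when neither method yields a primitive:

// Illustrative sketch of the ES5 [[DefaultValue]] algorithm (section 8.12.8).
// Not an engine implementation; function names here are hypothetical.
function isPrimitive(value) {
  return value === null || (typeof value !== 'object' && typeof value !== 'function');
}

function defaultValue(obj, hint) {
  // Hint String tries toString() first; hint Number tries valueOf() first.
  var methods = hint === 'String' ? ['toString', 'valueOf'] : ['valueOf', 'toString'];
  for (var i = 0; i < methods.length; i++) {
    var method = obj[methods[i]];
    if (typeof method === 'function') {
      var result = method.call(obj);
      if (isPrimitive(result)) {
        return result;
      }
    }
  }
  throw new TypeError('Cannot convert object to primitive value');
}

console.log(defaultValue([], 'Number')); // "" -- valueOf() returns the array itself, so toString() is used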
The default valueOf() Object method just returns the object itself, which means that unless a class overrides the default method, valueOf() just returns the Object itself. This is the case for Array: [].valueOf() returns the object [] itself. Since an Array object is not a primitive, the [[DefaultValue]] hint is irrelevant: the return value for an array will be the value of toString().
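A quick console check, assuming nothing has overridden Array.prototype.valueOf, confirms this:

var arr = [1, 2, 3];
console.log(arr.valueOf() === arr); // true -- the default valueOf() returns the object itself
console.log(arr.toString());        // "1,2,3" -- toString() supplies the primitive
console.log([].toString());         // "" -- an empty array stringifies to the empty string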
To quote David Flanagan's JavaScript: The Definitive Guide, which, by the way, is a superb book that should be everyone's first place to get answers to these types of questions:
The details of this object-to-number conversion explain why an empty array converts to the number 0 and why an array with a single element may also convert to a number. Arrays inherit the default valueOf() method that returns an object rather than a primitive value, so array-to-number conversion relies on the toString() method. Empty arrays convert to the empty string. And the empty string converts to the number 0. An array with a single element converts to the same string that that one element does. If an array contains a single number, that number is converted to a string, and then back to a number.
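The conversion chain the quote describes can be seen directly with Number():

console.log(Number([]));     // 0   -- [] -> "" -> 0
console.log(Number([7]));    // 7   -- [7] -> "7" -> 7
console.log(Number([1, 2])); // NaN -- [1,2] -> "1,2", which is not a valid number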
The second type of answer to the "why" question, other than "because the spec says", gives some explanation for why the behavior makes sense from the design perspective. On this issue I can only speculate. First, how would one convert an array to a number? The only sensible possibility I can think of would be to convert an empty array to 0 and any non-empty array to 1. But as Wayne's answer revealed, an empty array will get converted to 0 for many types of comparisons anyway. Beyond this, it's hard to think of a sensible primitive return value for Array.valueOf(). So one could argue that it just makes more sense to have Array.valueOf() be the default and return the Array itself, leading toString() to be the result used by ToPrimitive. It just makes more sense to convert an Array to a string rather than a number.
Moreover, as hinted by the Flanagan quote, this design decision does enable certain types of beneficial behaviors. For instance:
var a = [17], b = 17, c = 1;
console.log(a == b); // <= true
console.log(a == c); // <= false
This behavior allows you to compare a single-element array to numbers and get the expected result.
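Stepping through a == b: the == algorithm calls ToPrimitive on the array, which, as described above, falls through to toString(), and the resulting string is then converted to a number for the comparison:

// The coercion chain behind [17] == 17, spelled out step by step:
console.log([17].toString());     // "17" -- ToPrimitive([17]) yields the string "17"
console.log(Number("17") === 17); // true -- the string is then converted to a number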
Comments:

arr == true does not evaluate to true ;-) – Microcurie

if (arr == !arr) console.log('There is no spoon'); – Feed

Don't test with arr === [], as that will ALWAYS return false, since the right side is instantiating a new array, and the variable on the left cannot refer to something you just created. Testing emptiness should be done by looking up arr.length === 0. – Supersensual
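These comments are easy to verify in a console; each result follows from the same toString()-based coercion described above:

var arr = [];
console.log(arr == true);      // false -- [] -> "" -> 0, while true -> 1
console.log(arr == !arr);      // true  -- [] -> "" -> 0, and ![] -> false -> 0
console.log(arr === []);       // false -- two distinct array objects are never ===
console.log(arr.length === 0); // true  -- the reliable emptiness test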