Is this standard behavior? If so, it would be helpful to quote the relevant part of the ECMAScript standard.

Yes, this is standard behavior. typeof doesn't throw an error because it just returns a value as specified. However, as other answers have said, the code fails when evaluating the operand.

When evaluating the function expression, an attempt to resolve the value of foo (so that it can be returned) calls the internal GetValue method with argument foo. However, since foo hasn't been declared or otherwise created, a ReferenceError is thrown.
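A minimal sketch of the difference, assuming (as in the question) that foo has never been declared:

```javascript
// typeof itself never throws for a bare, undeclared identifier:
console.log(typeof foo); // "undefined"

// But returning foo requires resolving its value via GetValue,
// and an unresolvable reference throws:
try {
  (function () { return foo; })();
} catch (e) {
  console.log(e instanceof ReferenceError); // true
}
```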
Edit
In the case of:
typeof (foo)
"(" and ")" are punctuators denoting a grouping, such as a (possibly empty) parameter list when calling a function, e.g. foo(a, b), or an expression to be evaluated, e.g. if (x < 0), and so on.

In the case of typeof (foo), they simply group the operand: foo is not evaluated to a value before typeof is applied. So foo, being a valid identifier, is passed to typeof, which (per the link above) attempts to resolve it, can't, determines it's an unresolvable reference, and returns the string "undefined".
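To illustrate (again assuming foo is undeclared): the parentheses are only grouping, so the operand of typeof is still the bare identifier and no ReferenceError occurs:

```javascript
console.log(typeof (foo));   // "undefined"
console.log(typeof ((foo))); // "undefined"; extra grouping changes nothing
```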
In the case of:
typeof (foo + 0)
the brackets cause the expression foo + 0 to be evaluated first. When getting the value of foo, a ReferenceError is thrown, so typeof never gets to operate. Note that without the brackets:
typeof foo + 0 // undefined0
because of operator precedence: typeof binds more tightly than +, so the expression is evaluated as (typeof foo) + 0. typeof foo returns the string "undefined", and because one operand of + is a string, + performs concatenation (the string version of addition, not the mathematical one): 0 is converted to the string "0" and concatenated to "undefined", resulting in the string "undefined0".
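Both cases side by side (foo again assumed undeclared):

```javascript
// typeof binds tighter than +, so this is (typeof foo) + 0:
console.log(typeof foo + 0); // "undefined0"

// With brackets, foo + 0 must be evaluated first, and reading the
// value of the undeclared foo throws before typeof can run:
try {
  typeof (foo + 0);
} catch (e) {
  console.log(e instanceof ReferenceError); // true
}
```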
So any time the evaluation of an expression containing an unresolvable reference is attempted (e.g. an undeclared variable, or a let/const variable accessed before its initialisation), a ReferenceError will be thrown. For example, typeof !foo throws a ReferenceError too, because in order to work out what to pass to typeof, the expression !foo must be evaluated. To apply the ! operator, the value of foo must be obtained, and attempting that throws a ReferenceError.
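The same rule demonstrated, as a sketch assuming foo is undeclared:

```javascript
try {
  typeof !foo; // evaluating !foo requires the value of foo first
} catch (e) {
  console.log(e.name); // "ReferenceError"
}
```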
Comments

typeof foo isn't throwing the ReferenceError, but the function expression is probably throwing the error on the return statement, not on typeof. – Ric

(function(){ return typeof foo; })() works as you think it would. – Barefoot

Since typeof (1 + 1) first evaluates 1 + 1, I would think that the expression foo is evaluated, e.g. to 2 or 'hello world', and that evaluation would have also thrown a ReferenceError. Of course, that's not what happened. – Generous

typeof (1 + foobar) // => ReferenceError – Barefoot

I would expect foo to be evaluated before evaluating typeof. That is, for example, if you say alert("test"), the expression "test" is evaluated before alert is called, and should be evaluated independently of alert; i.e. alert("test") and someOtherFunction("test") should not affect the value of the constant "test". If that is true, why does typeof not evaluate foo independently of context? – Hertfordshire

So typeof (foo + 0) forces evaluation, whereas typeof foo and typeof (foo) do not. typeof by definition does not evaluate its operand, but it seems like the operand can evaluate itself. – Barefoot

So typeof is unlike every other JS operator and function in its evaluation of the operand? – Generous

typeof short-circuits when encountering an unresolvable reference, returning the string "undefined". – Barefoot

async () => await x vs. async () => () => await x destroyed it. (Or similarly (function*(){ yield x; }) vs. (function*(){ () => yield x; }).) – Unspoiled