Why does typeof only sometimes throw ReferenceError?

In Chrome and Firefox,

typeof foo

evaluates to 'undefined'.

But

typeof (function() { return foo; })()

throws an error:

ReferenceError: foo is not defined

This destroys the notion I had of the substitutability of expressions! Until now, I knew of no conditions under which foo and (function() { return foo; })() are not the same.

Is this standard behavior? If so, it would be helpful to quote the relevant part of the ECMAScript standard.


EDIT:

Another example:

typeof (foo)
typeof (foo + 0)

I would have expected both (foo) and (foo + 0) to throw an error.

But the first one has no error; the second one does.

Generous answered 10/6, 2014 at 20:58 Comment(9)
I'm not sure why typeof foo isn't throwing the ReferenceError, but the function expression is probably throwing the error on the return statement, not on typeof.Ric
Agreed. You're attempting to return something that's undefined, and your code is throwing an exception there. (function(){ return typeof foo; })() works as you think it would.Barefoot
@ajp15243, correct. But just as typeof (1 + 1) first evaluates 1 + 1, I would think that the expression foo is evaluated, e.g. to 2 or 'hello world', and that evaluation would have also thrown a ReferenceError. Of course, that's not what happened.Generous
Your anonymous function executes first, throwing an exception. The same thing happens when you attempt to evaluate any other undefined expression: typeof (1 + foobar) // => ReferenceErrorBarefoot
@Ric I understand what he's saying; if you think about how the language is evaluated, you would have to evaluate foo before evaluating typeof. That is, for example, if you say alert("test"), the expression "test" is evaluated before the alert is called, and should be evaluated independently of alert i.e. alert("test") and someOtherFunction("test") should not affect the value of the constant "test". If that is true, why does typeof not evaluate foo independently of context?Hertfordshire
I think the issue is your self-executing anonymous function and typeof (foo + 0) force evaluation whereas typeof foo and typeof (foo) do not. typeof by definition does not evaluate its operand, but it seems like the operand can evaluate itself.Barefoot
So, @ajm, if I understand you, typeof is unlike every other JS operator and function in its evaluation of its operand?Generous
Check my answer out below. Basically, typeof short-circuits when encountering an "unresolvable reference", returning the string "undefined".Barefoot
“This destroys the notions that I have of susbstitutability of expressions” — For me, async () => await x vs. async () => () => await x destroyed it. (Or similarly (function*(){ yield x; }) vs. (function*(){ () => yield x; })).Unspoiled

Basically, the typeof operator checks whether its operand is an unresolvable reference (an undeclared variable¹) and, if so, returns "undefined". That is, typeof returns a defined value for undeclared variables¹ before ever reaching the GetValue algorithm, which is what throws for them.
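
For instance (a quick illustration; bar is just a hypothetical identifier that has never been declared):

typeof bar   // "undefined" — typeof bails out at the unresolvable reference
bar          // ReferenceError: bar is not defined — evaluation reaches GetValue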

Quoting ECMAScript 5.1 § 11.4.3 The typeof Operator (emphasis added):

11.4.3 The typeof Operator

The production UnaryExpression : typeof UnaryExpression is evaluated as follows:

  1. Let val be the result of evaluating UnaryExpression.
  2. If Type(val) is Reference, then

    2.1. If IsUnresolvableReference(val) is true, return "undefined".

    2.2. Let val be GetValue(val).

  3. Return a String determined by Type(val) according to Table 20.

On the other hand, the return statement -- like most operators and statements that read the value of an identifier -- will always call GetValue, which throws on unresolvable identifiers (undeclared variables). Quoting ECMAScript 5.1 § 8.7.1 GetValue (V) (emphasis added):

8.7.1 GetValue (V)

  1. If Type(V) is not Reference, return V.
  2. Let base be the result of calling GetBase(V).
  3. If IsUnresolvableReference(V), throw a ReferenceError exception.

Now, analyzing the code:

typeof (function() { return foo; })()

This code instantiates a function object, executes it, and only then does typeof operate on the function's return value (the function call takes precedence over the typeof operator).

Hence, the code throws while evaluating the IIFE's return statement, before the typeof operation can be evaluated.
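
If the goal is to test foo itself, moving typeof inside the function avoids the problem, since the unresolvable reference then becomes typeof's direct operand (a small sketch, assuming foo is undeclared):

typeof (function() { return foo; })()    // ReferenceError: foo is not defined
(function() { return typeof foo; })()    // "undefined"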

A similar but simpler example:

typeof (foo+1)

The addition is evaluated before typeof. This will throw an error when the Addition Operator calls GetValue on foo, before typeof comes into play.

Now:

typeof (foo)

Does not throw an error, as the grouping operator (parentheses) does not "evaluate" anything per se; it just forces precedence. More specifically, the grouping operator does not call GetValue. In the example above, it returns an (unresolvable) Reference.

The annotated ES5.1 spec even adds a note about this:

NOTE This algorithm does not apply GetValue to the result of evaluating Expression. The principal motivation for this is so that operators such as delete and typeof may be applied to parenthesised expressions.
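
In other words, the parentheses by themselves never trigger GetValue, but any operator placed inside them will (a quick sketch, assuming foo is undeclared; the comma-operator case is an extra illustration, not from the question):

typeof (foo)      // "undefined" — grouping only, the Reference is passed to typeof
typeof (0, foo)   // ReferenceError — the comma operator calls GetValue(foo)
typeof (foo + 1)  // ReferenceError — the addition operator calls GetValue(foo)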


N.B. I wrote this answer with a focus on providing a simple and understandable explanation, keeping the technical jargon to a minimum while still being sufficiently clear and providing the requested ECMAScript standard references. I hope it is a helpful resource for developers who struggle to understand the typeof operator.

¹ The term "variable" is used for ease of understanding. A more correct term would be identifier, which can be introduced into a Lexical Environment not only through variable declarations, but also function declarations, formal parameters, calling a function (arguments), with/catch blocks, assigning a property to the global object, let and const statements (ES6), and possibly a few other ways.

Fulguration answered 10/6, 2014 at 21:17 Comment(4)
Oh great, spend several minutes of my life writing an answer so an anon downvoter can just dv it right off without reading.Deepfry
Fabrício, I have no idea why your answer was downvoted. :( Seems great.Generous
@FabrícioMatté Yeah... this is probably the best answer so far IMHOHertfordshire
@PaulDraper thank you, no problem. I've upvoted @ RobG's answer for explaining most of this before me as well.Deepfry

Is this standard behavior?

Yes. typeof doesn't throw an error because it just returns a value as specified. However, as other answers have said, the code fails when evaluating the operand.

If so, it would be helpful to quote the relevant part of the ECMAScript standard.

When evaluating the function expression, an attempt to resolve the value of foo (so that it can be returned) will call the internal GetValue method with argument foo. However, since foo hasn't been declared or otherwise created, a reference error is thrown.

Edit

In the case of:

typeof (foo)

"(" and ")" are punctuators, denoting a grouping, such as a (possibly empty) parameter list when calling a function like foo(a, b), or an expression to be evaluated, e.g. if (x < 0) and so on.

In the case of typeof (foo), they simply denote evaluating foo before applying the typeof operator. So foo, being a valid identifier, is passed to typeof, which, per the typeof algorithm, attempts to resolve it, can't, determines that it's an unresolvable reference, and returns the string "undefined".

In the case of:

typeof (foo + 0)

the brackets cause the expression foo + 0 to be evaluated first. When getting the value of foo, a reference error is thrown, so typeof never gets to operate. Note that without the brackets:

typeof foo + 0 // undefined0

because of operator precedence: typeof foo is evaluated first and returns the string "undefined". Since one operand of + is then a string, the addition operator performs concatenation (the string version of addition, not the mathematical one), so 0 is converted to the string "0" and concatenated onto "undefined", resulting in the string "undefined0".
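
That is, typeof binds more tightly than +, so the expression groups like this (a small sketch, assuming foo is undeclared):

typeof foo + 0      // parsed as (typeof foo) + 0
(typeof foo) + 0    // "undefined" + 0 → "undefined0"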

So any time the evaluation of an expression containing an unresolvable reference (e.g. an undeclared variable) is attempted, a reference error will be thrown, e.g.

typeof !foo 

throws a reference error too because in order to work out what to pass to typeof, the expression must be evaluated. To apply the ! operator, the value of foo must be obtained and in attempting that, a reference error is thrown.
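
Putting the cases side by side (assuming foo is undeclared):

typeof foo          // "undefined"
typeof (foo)        // "undefined" — parentheses only group
typeof !foo         // ReferenceError — ! needs foo's value
typeof (foo + 0)    // ReferenceError — + needs foo's value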

Giulia answered 10/6, 2014 at 21:12 Comment(2)
Does this also explain why typeof (foo) does not throw an error?Generous
@PaulDraper AFAIK the grouping operator (parentheses) only force precedence, they don't "evaluate" anything.Deepfry

The error "ReferenceError: foo is not defined" is not being thrown by typeof; it's being thrown by the function itself. If you had used:

typeof (function() { return 2; })()

it would have returned "number" as expected, but in this example JavaScript is not even getting to the point where typeof is being run on anything. You are receiving the same error as if you had run:

function test () {
    return foo;
}
test();
Capable answered 10/6, 2014 at 21:5 Comment(0)

Digging through the spec, I think this all comes down to when the operator in question attempts to run GetValue() on its operand.

typeof attempts to determine its operand's Type first. If that type is a Reference and IsUnresolvableReference() is true, it bails out and returns "undefined". In essence, it does not fully evaluate the operand; if it did, any undeclared identifier would throw an exception, so instead it short-circuits and returns a nice, useful string.

In the examples, self-executing functions and the addition operator call GetValue without first checking for IsUnresolvableReference() like typeof does: they call GetValue and throw an exception if the reference is unresolved (foo is undefined in our case). (I think! This is my best guess from reading through the spec.)
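
A side effect of this short circuit is that typeof alone can't tell a declared variable whose value is undefined apart from one that was never declared at all (a small sketch; bar is a hypothetical identifier that was never declared):

var foo             // declared, holds the value undefined
typeof foo          // "undefined"
typeof bar          // "undefined" — undeclared, but typeof short-circuits
bar                 // ReferenceError: bar is not defined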

Barefoot answered 10/6, 2014 at 21:29 Comment(2)
What's up with the downvote fairies today? This is pretty accurate. +1Deepfry
Note though that undefined is a bit ambiguous -- e.g. var foo; means foo is declared but has the primitive value undefined. I believe "undeclared" or "unresolvable" are less ambiguous for what you mean.Deepfry

This is standard behavior. The typeof operator essentially takes a reference to the variable you pass to it, rather than its value.

So let's try typeof foo.

The JavaScript interpreter looks at typeof and finds the type of foo.

Now we try typeof (function() { return foo })()

The JavaScript interpreter looks at typeof. Since the expression after it isn't a bare variable, it evaluates the expression. (function() { return foo })() throws a ReferenceError because foo isn't defined. If it were possible to pass the reference of a variable, i.e. something like (function() { return *foo })(), then this wouldn't happen.

Note: According to this, one may think that typeof (foo) would throw an error, since (foo) isn't a variable and must be evaluated, but that is incorrect; typeof (foo) will also return "undefined" if foo isn't defined.

Essentially, the interpreter evaluates a bare variable, but not a larger expression, in a "safe" context so that typeof doesn't throw an error.
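
A couple of probes show where that boundary lies (a small sketch, assuming foo is undeclared):

typeof foo              // "undefined" — bare identifier, "safe"
typeof (foo)            // "undefined" — still just the identifier
typeof (false || foo)   // ReferenceError — || has to read foo's value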

It is a bit confusing.

Hertfordshire answered 10/6, 2014 at 21:5 Comment(4)
I am not understanding why typeof (foo) does not throw an error.Generous
@PaulDraper Yes, that is confusing, as I said. I believe it might be because (foo) is evaluated as a variable as well, so it runs in the "safe" non-error context. However, normal non-variable expressions do not run in that context.Hertfordshire
So, which ones are "safe" and which are not?Generous
I believe foo and (foo) are the only "safe" ones. Even expressions like false || foo or (0, foo) throw an error. (The comma operator returns the last value)Hertfordshire
