Does Ruby's "defined?" operator work wrong?
So, we have the code:

class Foo
  def bar
    puts "Before existent: #{(defined? some_variable)}"
    puts "Before not_existent: #{(defined? nonexistent_variable)}"

    raise "error"

    some_variable = 42
  rescue
    puts "exception"
  ensure
    puts "Ensure existent: #{(defined? some_variable)}"
    puts "Ensure not_existent: #{(defined? nonexistent_variable)}"
  end
end

And call it from irb:

> Foo.new.bar

And that will return:

Before existent:
Before not_existent:
exception
Ensure existent: local-variable
Ensure not_existent:
=> nil

And now the question: why? We raised the exception before some_variable was ever assigned. Why does it work this way? Why is some_variable defined in the ensure block? (By the way, it is defined, but equal to nil.)

UPDATE: Thanks @Max for the answer, but if we change the code to use an instance variable:

class Foo
  def bar
    puts "Before existent: #{(defined? @some_variable)}"
    puts "Before not_existent: #{(defined? @nonexistent_variable)}"

    raise "error"

    @some_variable = 42
  ensure
    puts "Ensure existent: #{(defined? @some_variable)}"
    puts "Ensure not_existent: #{(defined? @nonexistent_variable)}"
  end
end

It works as expected:

Before existent:
Before not_existent:
Ensure existent:
Ensure not_existent:

Why?

Argeliaargent answered 31/3, 2015 at 15:7 Comment(1)
References to undefined instance (and global) variables are treated differently than references to undefined local (and class) variables. For example, puts @a prints nil, whereas puts a raises NameError: undefined local variable or method `a' for main:Object. – Fridlund
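The distinction the comment describes can be checked directly. A minimal sketch (@never_assigned and never_assigned are throwaway names chosen for illustration):

```ruby
# An undefined instance variable quietly reads as nil:
p @never_assigned        # prints nil, no error

# A bare undefined name is parsed as a local/method lookup and raises:
begin
  never_assigned
rescue NameError => e
  puts e.class           # NameError
end
```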

The first thing to notice is that defined? is a keyword, not a method. That means it gets its own special handling: the interpreter recognizes it during parsing, when the syntax tree is constructed (just like if, return, next, etc.), and compiles it to a dedicated VM instruction rather than dispatching it dynamically at runtime.

This is why defined? can handle expressions that would normally raise an error, e.g. defined?(what is this even) #=> nil: the parser excludes its argument from the normal evaluation process.

Even though it is a keyword, its behavior is still determined at runtime. It uses parser magic to determine whether its argument is an instance variable, constant, method, etc., but then calls ordinary runtime functions to check whether that particular thing is actually defined:

// ...
case DEFINED_GVAR:
    if (rb_gvar_defined(rb_global_entry(SYM2ID(obj)))) {
        expr_type = DEFINED_GVAR;
    }
    break;
case DEFINED_CVAR:
    // ...
    if (rb_cvar_defined(klass, SYM2ID(obj))) {
        expr_type = DEFINED_CVAR;
    }
    break;
case DEFINED_CONST:
    // ...
    if (vm_get_ev_const(th, klass, SYM2ID(obj), 1)) {
        expr_type = DEFINED_CONST;
    }
    break;
// ...

That rb_cvar_defined function is the same one called by Module#class_variable_defined?, for example.

So defined? is weird. Really weird. Its behavior can vary a lot depending on its argument, and I wouldn't even bet on it being the same across different Ruby implementations. Based on this, I would recommend avoiding it and using Ruby's *_defined? methods wherever possible.
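For reference, here is what those explicit *_defined? reflection methods look like in practice (Foo here is a throwaway example class, and Binding#local_variable_defined? requires Ruby 2.1+):

```ruby
class Foo
  @@cvar = 1               # class variable, set in the class body
  def initialize
    @ivar = 2              # instance variable, set per object
  end
end

p Foo.class_variable_defined?(:@@cvar)        # true
p Foo.new.instance_variable_defined?(:@ivar)  # true
p Object.const_defined?(:Foo)                 # true
p Foo.new.respond_to?(:inspect)               # true
p binding.local_variable_defined?(:nope)      # false
```

Unlike defined?, these take the name as an explicit symbol or string, so there is no parser magic involved and the semantics are the same on every implementation that supports the method.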

Slipway answered 31/3, 2015 at 15:16 Comment(1)
Wow. That's weird. Do you have any docs or details about this? – Argeliaargent
