I am trying to understand what seems like strange behavior when dealing with nulls and type annotations inside a for-comprehension.
As an example:
def f(): String = null
for {
  a <- Option("hello")
  b = f()
} yield (a, b)
results in the expected:
//> res0: Option[(String, String)] = Some((hello,null))
However, if I add a type annotation to b:
def f(): String = null
for {
  a <- Option("hello")
  b: String = f()
} yield (a, b)
then I get a runtime exception:
//> scala.MatchError: (hello,null) (of class scala.Tuple2)
Why does this happen? Isn't b implicitly of type String in the first example anyway? What does the explicit type annotation in the second example change?
(Note: the examples were run in Scala 2.11.4.)
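For reference, the difference can be reproduced outside the for-comprehension. Below is a hand-expanded sketch of what the two desugarings roughly look like (an approximation, not actual compiler output):

```scala
def f(): String = null

// Without the annotation, `b = f()` produces an irrefutable
// variable pattern: no runtime type test is generated.
val ok = Option("hello")
  .map(a => (a, f()))
  .map { case (a, b) => (a, b) }
println(ok) // Some((hello,null))

// With `b: String`, the generated pattern is a *typed* pattern.
// Typed patterns compile to an isInstanceOf test, and
// null.isInstanceOf[String] is false, so the match fails:
Option("hello")
  .map(a => (a, f()))
  .map { case (a, b: String) => (a, b) }
// throws scala.MatchError: (hello,null) (of class scala.Tuple2)
```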
You can see what the for-comprehension translates to by using reify in the repl: import scala.reflect.runtime.universe._; reify { for { ... } }. I can't tell you why, though. AIUI null doesn't match because matching works on the runtime type (or value), even though b has a compile-time type of String; this is in some sense a hole in the type system, and Scala code should generally avoid using nulls. – Pigmentation

I'm using Trys to integrate with some legacy Java code. However, null seems incidental to me here: the aspect I find bizarre (scary?) is that adding more type specificity results in an (unintuitive) runtime exception. – Dobbins

for/yield results in a pattern match; it's not a problem that would occur with "ordinary" types on expressions. All I can say is it's unfortunate, but probably can't be changed at this stage; I tend to avoid ever giving a type on the left-hand side of a for/yield for this reason :/ – Pigmentation
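One workaround in the spirit of that last comment, avoiding a pattern on the left while still pinning the type, is to ascribe the type on the right-hand side instead. A minimal sketch:

```scala
def f(): String = null

// Type ascription on the right: the left-hand side stays the
// irrefutable variable pattern `b`, so no runtime type test is
// generated, yet b is still statically typed as String.
val res = for {
  a <- Option("hello")
  b = (f(): String)
} yield (a, b)

println(res) // Some((hello,null))
```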