Why doesn't C# let you declare multiple variables using var?

Given the following:

// not a problem
int i = 2, j = 3;

so it surprises me that this:

// compiler error: Implicitly-typed local variables cannot have multiple declarators
var i = 2, j = 3;

doesn't compile. Maybe there is something I don't understand about this (which is why I'm asking).

But why wouldn't the compiler realize that I meant:

var i = 2;
var j = 3;

which WOULD compile.

Dalmatic asked 9/2, 2011 at 20:30 Comment(23)
I never had a need for that feature. In the cases where var is useful the initializers are usually rather long, and thus multiple statements are easier to read anyway.Weaponry
I'm sure the C# compiler development team lurks here. Why don't any of you speak up and end this madness?Rexferd
@Rexferd this thread is just 20 minutes old. Do you expect Eric to check every C# thread on SO several times per hour?Weaponry
By the way, I realize this is just an example, but it always bugs me when developers use var in place of int. It's the same number of letters! int is even easier to type, in my opinion ;)Percolator
@Dan All newer Microsoft keyboards have a function key that types "var". That's why.Rexferd
@bzlm: Wait... wait... what?Percolator
@Dan Tao, thats a good point but I was using that as an example. James Gaunt has a pretty good explanation as to why this doesn't work. A case I hadn't thought about which is why I posted this question because I figured someone could come up with a good answer as to why they don't allow this. I didn't assume it was a bug in the compiler but that I hadn't really thought out what was going on.Dalmatic
@bzlm: First, learn patience. Second, I don't "lurk". Third, if there is something you want brought to my attention, use the contact link on my blog and I'll get to it eventually. I do have actual work on the compiler to do from time to time.Venturous
@Dan Tao It always bugs me when someone wants to use an explicit type instead of var. Why on earth would you ever want that :). And don't say for readability. There are (strongly typed) languages that won't even let you write a type annotation and that are extremely readable (usually FP languages), so why would you ever want to enforce a binding to a specific type? One day you'll have to update every "int" to "long" (example types) simply because you used "int" instead of var and had to change the return type of one method in the central dll.Jenna
@Rune FS: I mostly agree with you, but there are some times when the mechanism of the types is very important to the correctness and understanding of the code, and in those cases it is a good idea to make the type explicit in the text. I think it's nice to have the option, so that you can choose to emphasize or de-emphasize the type as appropriate.Venturous
@Eric agree 100% I was definitely exaggerating to point out that I think the explicit typing is the odd case not the other way around.Jenna
I, for one, hope for the arrival of E#, where E is for explicit. Dadgummit. Now let me go back to my rocking chair.Eliott
@Rune FS: wow ... explicit typing the "odd case"? Obviously you are not old enough to remember FORTRAN, where you didn't even have to declare variables and the type of a new variable depended on its first letter ... caused untold horrors. Give me explicit typing every time. Many times I have looked at a piece of code with var and wondered what type it REALLY was. Now obviously if it's an anonymous type, you want to use var. But if I know the type, I use it.Unshroud
@Cynthia: I take your point. But consider this. Temporary variables are also variables. When the compiler compiles "int x = a * b + c", it compiles that as "int x; int temp = a * b; x = temp + c;". If you are so keen on explicit typing, then why is it good for the compiler to silently generate a variable of inferred type on your behalf? Should we require you to declare that variable explicitly, and declare its type explicitly? Where do you want to draw that line?Venturous
@Cynthia: if the first letter dictates the type, the type is explicit. It's right there in the first letter everywhere in the program, not just in some declaration. It is even more explicit.Seymour
@Unshroud I'm old enough for my first program to have been written in assembly, and FORTRAN (as mentioned elsewhere) is explicitly typed. I personally care about what the program does and defer my need to know how until the last possible moment. The types go into the how. Take this: sequence.Select(someSelector). Does it really matter for your understanding of what the code does whether it's Enumerable.Select or Queryable.Select? Of course not. It selects based on someSelector. The how part only becomes important if that particular statement behaves differently from expected.Jenna
@Martinho (et al): I didn't mean to suggest that FORTRAN variables were not explicitly typed. It's just that my experiences in FORTRAN informed my preference for declaring variables of a specific type. MANY times a variable name would be misspelled, and a new variable would be created, and sometimes it might be of the wrong type. But would the compiler catch it? No -- some other horrible error would occur that would take hours to debug. I came to really appreciate the strict type requirements of languages like Ada.Unshroud
Possibly I'm too influenced by those experiences, and I realize that "var" variables are not the same as variables accidentally created in FORTRAN. But I feel that often it's laziness that causes programmers to use var when they could just as easily use a known type, and it's nice to be able to mouse over a variable and see what type it actually is.Unshroud
@Cynthia: you can mouse over a "var" variable and see what type it is. Remember, "var" just means "compiler, work out what the compile-time type is for me". It doesn't mean "dynamic", ie, "compiler, defer this type analysis until runtime".Venturous
For what it's worth, C++ does something different (auto is the C++0x equivalent of C#'s var): it allows you to declare multiple variables in a single auto declaration, but the initializers must all be of the same type. So, auto i = 1, j = 2; is ok because both initializers are of type int so the auto means int here, but auto i = 1, j = 1.2; is ill-formed because there is an ambiguity as to what the auto should mean. The rules are essentially identical to the template argument deduction rules. [Maybe no one here cares about C++, but it's interesting to make the comparison.]Praetorian
var x = 1.2; by itself is just as ambiguous. It could be a float (single). It happens to default to a double because that's the way the compiler is designed. There would be no confusion if var x = 1, y = "hello"; is defined as each variable being inferred independently. i.e. var x = 1, y = "hello"; === same as ===>> var x = 1; var y = "hello";Pecten
@Eric: you are right, you can mouse over a var variable and see what type the compiler decided it should be, I stand corrected. Nevertheless, I stand by my commitment to using explicit typing where possible. Ultimately, I think it's just a question of taste. I had such bad experiences early on with ambiguously-typed variables that I am resistant to abandoning explicit typing.Unshroud
@Cynthia: I understand. The distinction that we're drawing here is that there's a difference between static typing - where the compiler knows the type of everything and detects type algebra violations at compile time and thereby prevents bugs - from explicit typing - where a textual representation of the type appears in the code for the convenience of the human reader. "var" gives you non-explicit typing without abandoning static typing, just like "a * b + c" gives you non-explicit typing on each sub-expression without abandoning static type checks.Venturous

Answer (score 44)

It's just another point of possible confusion for the programmer and the compiler.

For example this is fine:

double i = 2, j = 3.4;

but what does this mean?

var i = 2, j = 3.4;

With syntactic sugar, this kind of thing is a headache no one needs, so I doubt your case would ever be supported. It involves the compiler trying to be a little bit too clever.
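
To make the ambiguity concrete, here is a minimal sketch of the two readings a compiler could plausibly pick, written as code that is legal today (the variable names are purely illustrative):

// Reading 1: one common type for the whole list, as in the explicitly typed double case
double i1 = 2, j1 = 3.4;   // i1 and j1 are both double

// Reading 2: each variable inferred independently, as if each had its own var
var i2 = 2;    // int
var j2 = 3.4;  // double

Either reading is defensible, which is exactly the problem.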

Survey answered 9/2, 2011 at 20:36 Comment(25)
+1; You could think of several logical ways, given the way var works, that this should shake out; to keep from picking one, they just force you to avoid confusion by separating the initialization.Cytochrome
Note that the restriction is added in the specification explicitly (“The local-variable-declaration cannot include multiple local-variable-declarators.”, see §8.5.1), together with others, which are much more obvious (like “The local-variable-declarator must include a local-variable-initializer.”)Ulotrichous
Yeah, but the compiler could catch an assignment of more than one type and only produce an error in that case. It seems like this was simply a design choice. Of course, there may be technical reasons that I don't know about.Madder
@Ed S. Anything that is supported has a cost involved in design, testing and support. Since this feature has no real upside and is potentially confusing and/or frustrating to at least some people, I don't see that a positive decision to exclude it is required. It just doesn't merit inclusion in the C# feature set. Or to quote Star Trek... 'just because we can do a thing.....'Survey
Of course, but in this case having this feature would make things more consistent. It's not that it is absolutely necessary, and personally I rarely use assignments like that anyway, but there must have been some snag that made it non-trivial to implement, thus the omission.Madder
I don't agree. A a1 = anA1, a2 = anA2; === same as ===> A a1 = anA1; A a2 = anA2; ..... replace A with var, it could work exactly the same.Pecten
The compiler could simply pick one of the scenarios and solve it in the first pass. The programmers, on the other hand, would be VERY confused. C++ programmers would probably expect i and j to be double, some others would expect i to be int and j to be double, and the rest a compilation error! What they would all have in common is that they would hate the compiler designers for doing it wrong.Thanksgiving
It has always frustrated me that + means to add rather than to subtract. I HATE that.Pecten
@Pedro, sorry, I don't understand that. My point is just that the fact that something could be implemented in a certain way isn't, by itself, a reason to implement it.Survey
I've seen this conversation with the language-writers, and indeed: the reason is exactly this. Since there wasn't a clear and obvious way of interpreting it that everyone agreed with (least surprise etc), it was better not to. Some people would think obviously that was double and double, and some would think obviously it was int and double...Tithe
@James I don't think this feature is all that important. However, I don't think it would be very confusing to anyone that the "var" keyword means that all variables declared on that line are to have their types inferred.Pecten
@Jaroslav Everyone hates compiler designers no matter what. ;)Rexferd
@Pedro. Indeed that isn't confusing, provided that's what everyone expects. But others in this thread have implied they expect all the types to be the same. Who is right? No-one is right... so just to avoid causing confusion don't implement the feature. Simples.Survey
@Marc And what is the obvious type when you have an int and a string? Mustn't you conclude that each variable should be inferred independently?Pecten
What does ?? do? What about ?:? I know many programmers who don't know this stuff -- not a reason to not implement them in the language. (Rhetorical questions, in case anyone doesn't realize that.)Pecten
You could just allow the cases where both expressions have exactly the same type. No confusion or ambiguity in that case.Weaponry
And then we get the Stack Overflow question... why is var i = 1, j = 2; OK but var i = 1, j = 2.0; not OK? Surely C# can [infer the types|cast to int|cast to double|(insert favourite here)].... Language features aren't free.Survey
@CodeInChaos: Also a rather limp feature in that case (in my opinion).Percolator
@CodeInChaos: I think Pedro knew what they were and was trying to make a point. @Pedro: You're fighting for one implementation of a var declaration list. There are multiple possible implementations that would be logical, given existing, valid, semantically similar statements involving var. If it was done "your" way, someone else could see several variables created using var and think it works exactly the opposite from you for very valid reasons based on how var is used elsewhere. It's confusing, so it's not allowed; disagree all you like, that's not going to change.Cytochrome
@Pedro: I think I understand your point; they could've just picked one implementation, documented it, and then it would be the responsibility of the developer to know it just like it's our responsibility to understand any language feature we use. I just want to remind you and everyone else that we're talking about a feature which, if implemented, would save about four keystrokes per variable. That's pretty small justification for a feature, if you ask me.Percolator
@Dan Tao - Not even; since the equivalent to a var list as Pedro would want it implemented is a strongly-typed declaration list, you're saving the difference in characters between var and whatever the strong type should be, once. The strongly-typed declaration list would probably be far shorter than individual vars. I think what MIGHT be logical is a ReSharper or other refactoring tool operation that would convert individual declarations to a declaration list and v.v.; RS already goes between specific type and var.Cytochrome
@KeithS: My understanding of Pedro's comments is that he believes that the type of each variable in a hypothetical var declaration list should be inferred independently; so I was comparing the cost of typing var x = 1, y = "Blah"; against that of var x = 1; var y = "Blah"; (plus line breaks), etc. And yes I do realize we have beaten this horse to death.Percolator
Truthfully, I probably wouldn't even use this feature; I almost always declare variables on separate lines anyway. However, I think when you give this enough thought, there is really one single best way to implement this hypothetical feature --- as I said earlier: every variable's type inferred independently. I would like to hear someone argue that another interpretation would be better than that.Pecten
@Pedro: Because any other comma-delimited list of declared variables that is valid in C# MUST have the same type. So, you're introducing confusion right there; replacing double with var at the beginning of such a list all of a sudden changes a lot of variable types. Also, if you mouse over any existing var, you get the currently-inferred type. Imagine mousing over the var in var x=1, y="foo". It's confusing.Cytochrome
So just do not allow it for doubles and ints or strings and char.Fleam

Answer (score 63)

When we designed the feature I asked the community what

var x = 1, y = 1.2;

should mean. The question and answers are here:

http://blogs.msdn.com/b/ericlippert/archive/2006/06/26/what-are-the-semantics-of-multiple-implicitly-typed-declarations-part-one.aspx

http://blogs.msdn.com/b/ericlippert/archive/2006/06/27/what-are-the-semantics-of-multiple-implicitly-typed-declarations-part-two.aspx

Briefly, about half the respondents said that the obviously correct thing to do was to make x and y both double, and about half the respondents said that the obviously correct thing to do was to make x int and y double.

(The language committee specified that it should be "double", and I actually implemented the code that way long before we shipped. We used the same type inference algorithm as we do for implicitly typed arrays, where all the expressions must be convertible to a best element type.)
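
For comparison, here is a rough sketch of how that array inference behaves (this is standard C# behaviour; the variable names are just illustrative):

var a = new[] { 1, 2.3 };     // best common element type is double, so a is double[]
var b = new[] { 1, "two" };   // compile error: no best type found for implicitly-typed array

Under that same rule, var x = 1, y = 1.2; would have made both x and y double.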

When half your customer base thinks that one thing is "obviously correct" and the other half believes that the opposite is "obviously correct" then you have a big design problem on your hands. The solution was to make the whole thing illegal and avoid the problem.

Venturous answered 9/2, 2011 at 21:22 Comment(12)
@Eric: so you took out your implementation from the compiled code?Checkrein
@Joan:We removed it, and replaced it with an error-recovery system that instead binds "var x = a, y = b;" as though the "var" were replaced with the type of "a". That way you get IntelliSense on "x." and "y." as though they were both the type of "a". That seemed like the more sensible thing to do for the IDE experience than some complex heuristic that tries to figure out the best type, or the type of each.Venturous
@Eric: Thanks that's good to know. I like it that you kept it simple :OCheckrein
When I first thought about this, I decided that y should be an int. Then I considered the following C++ code: int* x, y. x is a pointer to an int, y is a regular int. So in my opinion, y should be a double.Overflight
@Marlon: First off, your intuition from C was wrong because of the way you wrote the declaration. "int *x, y;" expresses the true meaning, namely, that *x and y are both variables that contain an int. Second, the fact that you had to consider the (strange) semantics of another language, and the fact that you changed your mind halfway through the process, is just more evidence that the feature is completely unclear and misleading and therefore should be made illegal. Which is what we did.Venturous
Hooray, a win for sanity! Clarity of intent and ease of reading are always infinitely more important than saving a few keystrokes.Cassino
I would say var x = 1, y = 1.2; should throw an exception because the types of x and y are detected as being different. It would be better than preventing var x = 1, y = 2; altogether. Or just treat it as var x = 1; var y = 1.2 because var is only indicating our intention to declare variables (in a short hand manner, i.e. it is to the start of the line what ; is to the end of the line) and all that matters is that the compiler can figure out what x and y are.Elysha
in these situations, can you not just have an error specifically in the ambiguous case? there's a lot of times like this where there's an ambiguous case, so then the whole feature becomes an error rather than just the ambiguous case. it should just say something like "inferred types conflict; multiple implicitly typed variables must resolve to the same type. separate the declarations or make the types match."Adamandeve
@DaveCousineau: Yeah, you know what that sounds like? Work. The C# team was the "long pole" for that release of Visual Studio; we had more work on the schedule than any other team, so every day we were late, VS would slip. Guess which developer had the long pole for the C# team? That would be me. Writing ambiguity detectors, and then writing the test cases and the error messages and all that stuff adds up to many days even for simple features. Did I want to be responsible for slipping VS to add such an error? I did not.Venturous
@DaveCousineau: Now, of course since I had already implemented the code to choose double, making it a parse-time error is also work, but it is less work than making it sometimes work and sometimes not work. It's less testing burden, it's less user education burden, and so on. C# is already a complicated language, and "guess what the user meant" is not one of its design goals. Because suppose we did what you suggest; you'd then be asking me today why var x = "foo", y = null; doesn't work "because y is obviously intended to be string", even though null is not of type string.Venturous
@DaveCousineau: Also, there are longer-term factors to consider. Making a feature legal today means that you can't revisit that decision and make it illegal tomorrow, but the opposite is not (usually) true. If the C# team decided today that they wanted to implement your proposed feature, they could without breaking anyone. But if they implement a feature that seems dodgy and that they regret later, it's forever. That points towards just making it illegal at parse time.Venturous
@EricLippert yipes, thank you sir, do not mean to offend or anything. I just know from my own sort of quick and dirty interpreter that I have many many places that say something like: if there's 0, IDontKnowWhatThisIsError, if there's 2 or more, AmbiguityError, and if there's exactly 1, continue on happy. I never think "hmm this could be ambiguous, better make it an error in all cases ambiguous or not". but yes, C# is orders of magnitude more complex, so it's not really comparable. thanks.Adamandeve

Answer (score 7)

Because if this worked:

var i = 2, j = 3;

because this works:

var i = 2;
var j = 3;

then you might expect this to work:

var i = 2, j = "3";

because this works:

var i = 2;
var j = "3";

Even in the case posited by James Gaunt, where both initializers are numeric types and could be stored in variables of the same type, what type would i be here?

var i = 2, j = 3.4;

j is obviously a double, but i could logically be either an int or a double, depending on how you expected var to infer the types. Whichever way it was implemented, you'd cause confusion for people who expected it to work the other way.

To avoid all this confusion, it's simply disallowed. I don't see it as a big loss, personally; if you want to declare a list of variables (which is itself pretty rare in my working experience), just strongly type them.

Cytochrome answered 9/2, 2011 at 20:43 Comment(1)
" i could logically be either an int or a double, depending on how you expected var to infer the types" you have given the correct way to reason about this right above by saying that "var i = 2;" means that i is an int. If people can't figure this out by reasoning in this very simple manner then it's their fault for being cerebrally insufficient. The reasoning is extremely simple to follow and it's clearly needed in for-loops where one might need more than one var "for (var i = 0, j =3; ...)". I find the entire argument against this feature completely bogus.Madonia

Answer (score 4)

I think it's just too iffy. When the two variables are the same type it's an easy specific case, but in the more general case you'd have to consider what is "correct" in code like:

var x = new object(), y = "Hello!", z = 5;

Should those all be typed as object, since that's the only type they all have in common? Or should x be object, y be string, and z be int?

On the one hand you might think the former, since variables declared in this way (all on one line) are usually presumed to all be the same type. On the other hand perhaps you'd think it's the latter, since the var keyword is typically supposed to get the compiler to infer the most specific type for you.
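
Spelled out, the two candidate interpretations correspond to declarations that are already legal when written explicitly (a sketch only; the extra names are illustrative):

// Interpretation 1: one common type for the whole list
object a = new object(), b = "Hello!", c = 5;   // a, b and c are all object

// Interpretation 2: each variable inferred independently
var x = new object();   // object
var y = "Hello!";       // string
var z = 5;              // int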

Better to just prohibit this altogether than bother working out exactly how it should behave, given that it would not exactly be a "killer" feature anyway.

That's my opinion/guess, at least.

Percolator answered 9/2, 2011 at 20:39 Comment(5)
The rule could have been that all local variables in one declaration must result in the same inferred type.Weaponry
You'd still have cases where all values could be of one inferred type, but if each were declared separately they'd be different inferred types. Whichever way they allowed it to work (individual or group inference), you'd cause confusion for people who expected it to work the other way.Cytochrome
@CodeInChaos, @KeithS: I'm with Keith on this one. Nobody can really deny that it could work a certain way; I think the problem is that whatever that hypothetical way is, it also could work the opposite way.Percolator
I don't understand what you mean, KeithS. My suggestion was to only allow such a var statement if all types in it were the same (and not just convertible to the same type). So var i=1, j=2 would be valid, var i=1.0, j=2 would not. But I agree with JamesGaunt that it's possibly not worth the cost of specing, coding, testing...Weaponry
@CodeInChaos: So you mean exactly the same type. All types are object at least, after all. I think Keith (and I) figured you just meant that the compiler could choose the most specific common type for all variables.Percolator

Answer (score 1)

I think that's because, to the compiler, it could be the same as:

var i = 2, j = "three"

And those surely aren't of the same type.

Regional answered 9/2, 2011 at 20:37 Comment(4)
In the example in the question, this does not apply.Rexferd
@Rexferd Why would you think so? The question was about the surprising point and this answer shows why it shouldn't be that surprising.Regional
+1 to counter-act the -1. Although a little terse, this answer contains the same reasoning as some of the up-voted answers.Stier
@pst Thank you, that was the only reason I've undeleted it, 'cause surely there is a more distinct answer.Regional
