In C++ there are a lot of ways that you can write code that compiles, but yields undefined behavior (Wikipedia). Is there something similar in C#? Can we write code in C# that compiles, but has undefined behavior?
As others have mentioned, pretty much anything in the "unsafe" block can yield implementation-defined behaviour; abuse of unsafe blocks allows you to change the bytes of code that make up the runtime itself, and therefore all bets are off.
The division int.MinValue / -1 has implementation-defined behaviour.
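To see this concretely, here's a minimal sketch, assuming the Microsoft .NET runtime (where this particular case surfaces as an OverflowException; the spec permits other outcomes on other implementations):

```csharp
using System;

class MinValueDivision
{
    static void Main()
    {
        // Variables are used so the compiler can't fold the expression:
        // the constant form int.MinValue / -1 is rejected at compile time.
        int numerator = int.MinValue;
        int denominator = -1;

        try
        {
            // -2147483648 / -1 would be +2147483648, which doesn't fit in int.
            int quotient = numerator / denominator;
            Console.WriteLine(quotient);
        }
        catch (OverflowException)
        {
            // What the Microsoft CLR does; other implementations may differ.
            Console.WriteLine("OverflowException");
        }
    }
}
```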
Throwing an exception and never catching it causes implementation-defined behaviour -- terminate the process, start a debugger, and so on.
There are a number of other situations in C# where we are forced to emit code which has implementation-defined behaviour. For example, this situation:
https://learn.microsoft.com/en-us/archive/blogs/ericlippert/odious-ambiguous-overloads-part-two
However, the situations in which a safe, well-behaved C# program has implementation-defined behaviour should be quite rare.
Comment – Morelli: Is there a reason int.MinValue/-1 isn't, or couldn't be, defined in the language specification?

Yes! There is, even in a safe context! (Well, it's implementation-defined to be undefined, at least.)
Here's one from Marek Safar and VSadov in the Roslyn issues. There is a mismatch between C# and the CLI in regard to bool:

C# believes that there is only one kind of true, and one kind of false.

The CLI believes that false is a byte containing 0, and all other values are true.
This discrepancy means we can coerce C# into doing some (marginally) interesting things:
//non-standard bool
//We're setting a bool's value to a byte value of 5.
var a = new bool[1];
Buffer.SetByte(a, 0, 5);
//non-standard bool
//We're setting a bool's value to a byte value of 10.
var b = new bool[1];
Buffer.SetByte(b, 0, 10);
//Both are true.
Console.WriteLine(a[0]);
Console.WriteLine(b[0]);
//But they are not the same true.
Console.WriteLine(a[0] == b[0]);
The above outputs:

True
True
False
Interestingly, the debugger disagrees (it must evaluate truth differently).

Anyway, the conclusion the C# team appears to have come to is (emphasis added):
I.E. the language will stay entirely unconcerned about nonstandard bools. The particular implementation (as in MS C# on CIL) will acknowledge the existence of nonstandard bools and specify their behavior as undefined
Looking at the Wikipedia article on undefined behaviour, the situations in which undefined behavior happens are either not allowed or throw an exception in C#. However, in unsafe code I believe undefined behavior is possible, since that allows you to use pointers and the like.
Edit: It looks like I'm right: http://msdn.microsoft.com/en-us/library/aa664771%28VS.71%29.aspx has an example of undefined behavior in C#.
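For instance, a minimal sketch of the kind of pointer misuse that safe C# forbids (hypothetical; compiling it requires enabling unsafe blocks in the project, and the value read past the buffer is indeterminate by design, so no particular output is promised):

```csharp
using System;

class UnsafeSketch
{
    static void Main()
    {
        unsafe
        {
            // Allocate four ints on the stack; only indices 0..3 are valid.
            int* buffer = stackalloc int[4];
            for (int i = 0; i < 4; i++)
                buffer[i] = i;

            // Reading one element past the end compiles fine, but the
            // language makes no guarantee about what it yields: this is
            // exactly the kind of undefined behavior safe code rules out.
            int garbage = buffer[4];
            Console.WriteLine(garbage);
        }
    }
}
```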
According to the ECMA-334 document (p. 473):
A program that does not contain any occurrences of the unsafe modifier cannot exhibit any undefined behavior.
That leaves 'implementation-defined' as the worst case for safe code; see Eric Lippert's answer.
Comment – Hoffert: An implementation could document that (signed char)128 yields -126 during the first half of each phase of the moon, and -127 during the second half, but only if the implementation actually ascertains the phase of the Moon. Implementations cannot simply say "yields an arbitrary value".

Many programs and subprograms have requirements that can be summarized as:
When given valid data, produce valid output.
Refrain from launching nuclear missiles or negating the laws of time and causality, even when given invalid input.
One of the major design goals of Java and the .NET languages is that, unless code makes use of certain features which are marked as "unsafe", no particular effort is generally required to meet the second constraint above [though some behaviors related to garbage collection and Finalize
can be a little weird from the time/causality standpoint, those can be described as exceptions to the normal rules of causality rather than a total revocation of them]. That situation is very different from the situation in C, where many kinds of data-dependent errors (e.g. integer overflow) may result in compilers behaving in arbitrary fashion, including making whatever assumptions would be necessary to avoid overflow. The truly horrible kinds of undefined behavior which are encouraged in hypermodern C philosophy do not exist in C# or other .NET languages outside of unsafe blocks.
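The contrast can be made concrete. In C#, integer overflow is defined to either wrap (in an unchecked context, the default unless the project enables overflow checking) or throw (in a checked context); it never licenses arbitrary compiler behavior. A small sketch:

```csharp
using System;

class OverflowDemo
{
    static void Main()
    {
        int x = int.MaxValue;

        // Unchecked context: overflow wraps around to int.MinValue,
        // a well-defined result, unlike C's undefined signed overflow.
        int wrapped = unchecked(x + 1);
        Console.WriteLine(wrapped);          // -2147483648

        // Checked context: overflow throws a catchable exception.
        try
        {
            int y = checked(x + 1);
            Console.WriteLine(y);
        }
        catch (OverflowException)
        {
            Console.WriteLine("OverflowException");
        }
    }
}
```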
Not really in the exact Wiki sense, but I suppose the most obvious example that comes to mind is simply writing some threaded code with unsynchronized access to shared state; then again, it's like that in any language.
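To illustrate, a minimal sketch of the kind of threaded code meant here: two threads doing unsynchronized increments on a shared counter. The final count is unpredictable because the read-modify-write is not atomic, though it can never exceed the number of increments attempted:

```csharp
using System;
using System.Threading;

class RaceDemo
{
    static int counter;   // shared state, deliberately unsynchronized

    static void Main()
    {
        void Work()
        {
            for (int i = 0; i < 1_000_000; i++)
                counter++;   // read-modify-write, not atomic
        }

        var t1 = new Thread(Work);
        var t2 = new Thread(Work);
        t1.Start(); t2.Start();
        t1.Join(); t2.Join();

        // Typically prints something less than 2000000 because
        // increments from the two threads interleave and get lost.
        Console.WriteLine(counter);
    }
}
```

Using Interlocked.Increment(ref counter) or a lock would make the result deterministic again; the point is that plain ++ on shared state is a race.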
The C# spec lists a few more undefined, implementation-defined, and unspecified behaviors: https://learn.microsoft.com/en-us/dotnet/csharp/language-reference/language-specification/portability-issues ... these are defined in the spec as follows:
behavior, implementation-defined – unspecified behavior where each implementation documents how the choice is made
behavior, undefined – behavior, upon use of a non-portable or erroneous construct or of erroneous data, for which this specification imposes no requirements
behavior, unspecified – behavior where this specification provides two or more possibilities and imposes no further requirements on which is chosen in any instance
The spec evolves and the list is too long for a Stack Overflow answer, plus it's not all-encompassing. Since the spec has only one term for it ("Undefined behavior is indicated in this specification only by the words 'undefined behavior'."), googling
"undefined" site:https://learn.microsoft.com/en-us/dotnet/csharp/language-reference/
reveals some interesting tidbits:
Field Initialization across Partial Classes is Undefined
Field initialization order can be significant within C# code, and some guarantees are provided, as defined in §15.5.6.1. Otherwise, the ordering of members within a type is rarely significant, but may be significant when interfacing with other languages and environments. In these cases, the ordering of members within a type declared in multiple parts is undefined.
https://learn.microsoft.com/en-us/dotnet/csharp/language-reference/language-specification/classes
Is the "textual order" across partial classes formally defined?
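A sketch of the situation, using a hypothetical Config class split across two parts; which field initializer runs first depends on the order in which the compiler merges the parts, which the spec leaves undefined:

```csharp
using System;
using System.Collections.Generic;

// Helper that records the order in which initializers run.
static class Log
{
    public static readonly List<string> Order = new List<string>();

    public static int Record(string name)
    {
        Order.Add(name);
        return Order.Count;
    }
}

// One class split into two parts, each declaring a static field
// whose initializer has a visible side effect.
partial class Config
{
    public static readonly int A = Log.Record("A");
}

partial class Config
{
    public static readonly int B = Log.Record("B");
}

class Program
{
    static void Main()
    {
        // Touch the fields to force static initialization.
        // The sum is 3 either way (1 + 2 or 2 + 1).
        Console.WriteLine(Config.A + Config.B);   // 3

        // Both initializers ran, but whether Order is ["A","B"] or
        // ["B","A"] is undefined across partial declarations.
        Console.WriteLine(string.Join(",", Log.Order));
    }
}
```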
Pattern Evaluation Order is Undefined
The order in which patterns are checked is undefined. At run time, the right-hand nested patterns of or and and patterns can be checked first.
Furthermore, regarding list patterns:
The following assumptions are made on the members being used:
The property that makes the type countable is assumed to always return a non-negative value, if and only if the type is indexable. For instance, the pattern { Length: -1 } can never match an array.
The member that makes the type sliceable is assumed to be well-behaved, that is, the return value is never null and that it is a proper subslice of the containing list.
The behavior of a pattern-matching operation is undefined if any of the above assumptions doesn't hold.
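The countable assumption can be seen directly with an ordinary array, whose Length is always non-negative, so the hypothetical check below can never match:

```csharp
using System;

class PatternDemo
{
    static void Main()
    {
        int[] empty = Array.Empty<int>();

        // The compiler may assume Length is non-negative,
        // so this pattern can never match an array.
        Console.WriteLine(empty is { Length: -1 });   // False

        // An ordinary non-negative Length works as expected.
        Console.WriteLine(empty is { Length: 0 });    // True

        // By contrast, a hand-written type whose Length property
        // returned a negative value would break the assumption, and
        // pattern matching against it is what the spec calls undefined.
    }
}
```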
Likewise, Collection Literal behavior is undefined if collections are not well-behaved:
Collections are assumed to be well-behaved. For example:
It is assumed that the value of Count on a collection will produce that same value as the number of elements when enumerated.
The types used in this spec defined in the System.Collections.Generic namespace are presumed to be side-effect free. As such, the compiler can optimize scenarios where such types might be used as intermediary values, but otherwise not be exposed.
It is assumed that a call to some applicable .AddRange(x) member on a collection will result in the same final value as iterating over x and adding all of its enumerated values individually to the collection with .Add.
The behavior of collection literals with collections that are not well-behaved is undefined.
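For a well-behaved collection the guarantees are straightforward; a sketch using a collection expression (requires C# 12):

```csharp
using System;
using System.Collections.Generic;

class CollectionDemo
{
    static void Main()
    {
        // A collection expression building a List<int> from elements
        // plus a spread. The compiler relies on List<T> behaving as
        // documented: accurate Count, side-effect-free construction,
        // AddRange equivalent to repeated Add.
        int[] tail = { 3, 4 };
        List<int> xs = [1, 2, .. tail];

        Console.WriteLine(xs.Count);              // 4
        Console.WriteLine(string.Join(",", xs));  // 1,2,3,4

        // A custom collection whose Count disagreed with its own
        // enumeration would make this construction undefined.
    }
}
```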
default(Nullable&lt;T&gt;).Value is implementation-defined (not undefined)

The elements of an array can be initialized to known values when the array is created. Beginning with C# 12, all of the collection types can be initialized using a collection expression. Elements that aren't initialized are set to the default value. The default value is the 0-bit pattern. All reference types (including non-nullable ones) have the value null. All value types have the 0-bit pattern. That means the Nullable&lt;T&gt;.HasValue property is false and the Nullable&lt;T&gt;.Value property is undefined. In the .NET implementation, the Value property throws an exception.
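A sketch of that implementation-defined corner on the .NET implementation, where accessing Value without a value throws InvalidOperationException:

```csharp
using System;

class NullableDemo
{
    static void Main()
    {
        // Array elements get the 0-bit default, so HasValue is false.
        int?[] slots = new int?[1];
        Console.WriteLine(slots[0].HasValue);   // False

        try
        {
            // The spec calls the result of .Value here undefined;
            // the .NET implementation chooses to throw.
            int v = slots[0].Value;
            Console.WriteLine(v);
        }
        catch (InvalidOperationException)
        {
            Console.WriteLine("InvalidOperationException");
        }
    }
}
```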
In general I would say no.
Using an automatic variable before it's initialized: not possible; all local variables must be definitely assigned before use, otherwise the program fails to compile.

Division by zero: an exception is thrown.

Indexing an array out of bounds: an exception is thrown.
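Both runtime cases can be checked in a few lines; integer division by zero and out-of-range indexing throw well-defined exceptions:

```csharp
using System;

class DefinedFailures
{
    static void Main()
    {
        // A variable is used because the constant expression 1/0
        // is rejected at compile time.
        int zero = 0;
        try
        {
            int q = 1 / zero;
            Console.WriteLine(q);
        }
        catch (DivideByZeroException)
        {
            Console.WriteLine("DivideByZeroException");
        }

        int[] arr = new int[3];
        int i = 3;   // one past the last valid index
        try
        {
            int v = arr[i];
            Console.WriteLine(v);
        }
        catch (IndexOutOfRangeException)
        {
            Console.WriteLine("IndexOutOfRangeException");
        }
    }
}
```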
As Aequitarum Custos pointed out, you can use unsafe code. Then again, this isn't really C#; you are explicitly opting out of the C# environment.