A major difference between "modern" C and C++ and most other popular languages is that while other languages let implementations select among various corner-case behaviors in Unspecified fashion, the authors of the C and C++ Standards didn't want to constrain the languages to platforms where any particular behavioral guarantee could be met cheaply.
Given a construct like:
int blah(int x)
{
    return x+10 > 20 ? x : 0;
}
Java precisely specifies the behavior for all values of x, including those
which would cause integer wraparound; early C compilers for two's-complement
machines would yield the same behavior, except that machines with different
sizes of "int" (16-bit, 36-bit, etc.) would wrap at different places.
Machines that used other representations for integers, such as
ones'-complement, might behave differently, however.
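As a minimal sketch of that size dependence, the fixed-width types from
<stdint.h> can stand in for machines with 16-bit and 32-bit "int" (the
function names here are mine; the unsigned arithmetic emulates the wrap
without invoking Undefined Behavior on the host, and converting the
result back to a signed type is implementation-defined in Standard C but
yields the two's-complement value on common compilers):

#include <stdint.h>
#include <stdio.h>

/* blah() as it behaved on a machine with a wrapping 16-bit "int" */
static int16_t blah16(int16_t x)
{
    int16_t sum = (int16_t)((uint16_t)x + 10u);
    return sum > 20 ? x : 0;
}

/* The same function on a machine with a wrapping 32-bit "int" */
static int32_t blah32(int32_t x)
{
    int32_t sum = (int32_t)((uint32_t)x + 10u);
    return sum > 20 ? x : 0;
}

int main(void)
{
    printf("%d\n", (int)blah16(32760)); /* 0: 32770 wrapped to -32766  */
    printf("%d\n", (int)blah32(32760)); /* 32760: no wrap with 32 bits */
    return 0;
}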
Further, it was not uncommon for even "traditional" C compilers to behave
as though the computations were performed on a longer type. Some machines
had a few instructions that operated on longer types, and using those
instructions and keeping values in the longer types could sometimes be
cheaper than truncating or wrapping values into the range of "int". On such
machines, it would not be surprising for a function like the above to yield
x even for values that were within 10 of overflowing. Note that Java tries
to minimize behavioral differences among implementations, and thus does not
generally allow even that level of behavioral variation.
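A sketch of that outcome, with "long long" standing in for the machine's
wider register (the function names, and the assumption that "int" is
narrower than "long long" as on every mainstream platform, are mine for
illustration; the second function emulates truncation to a wrapping 32-bit
"int", handling only positive overflow, which suffices here):

#include <limits.h>
#include <stdio.h>

/* What a compiler keeping x+10 in a wider register effectively computed */
static int blah_wide(int x)
{
    long long sum = (long long)x + 10;  /* no wrap: held in the wide type */
    return sum > 20 ? x : 0;
}

/* What a compiler truncating to a wrapping "int" computed */
static int blah_wrapped(int x)
{
    long long sum = (long long)x + 10;
    if (sum > INT_MAX)
        sum -= (long long)INT_MAX - INT_MIN + 1;  /* wrap into int's range */
    return sum > 20 ? x : 0;
}

int main(void)
{
    printf("%d\n", blah_wide(INT_MAX - 5));    /* INT_MAX-5: kept precision */
    printf("%d\n", blah_wrapped(INT_MAX - 5)); /* 0: the wrapped result     */
    return 0;
}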
Modern C, however, goes a step beyond Java. Not only does it allow for the
possibility that compilers might arbitrarily keep excess precision in
integer computations; a modern compiler given a function like the above may
also infer that, since the Standard would allow it to do anything whatsoever
when a program receives inputs that cause the function to be passed a value
of x greater than INT_MAX-10, it may discard as irrelevant any code which
would have no effect when such inputs are not received. The net effect is
that integer overflow can disrupt the effect of preceding code in arbitrary
fashion.
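As a sketch of what that inference licenses (the function names are mine,
and no particular compiler's output is being quoted): since signed overflow
is Undefined Behavior, the optimizer may assume x+10 never overflows, so
"x+10 > 20" may be rewritten as "x > 10", and after-the-fact overflow tests
may be deleted outright:

int blah_as_optimized(int x)
{
    /* "x+10 > 20" rewritten as "x > 10" on the assumption that x+10
       cannot overflow; this now yields x even when x > INT_MAX-10 */
    return x > 10 ? x : 0;
}

int add_ten(int x)
{
    int sum = x + 10;
    if (sum < x)        /* "cannot happen" without UB, so a modern */
        return -1;      /* compiler may discard this entire branch */
    return sum;
}

This is not hypothetical: gcc and clang perform this kind of dead-branch
elimination at higher optimization levels, and gcc's -fwrapv flag exists
precisely to opt back into the old wrapping behavior.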
Java is thus two steps removed from Modern C's model of "Undefined Behavior":
it rigidly prescribes many more behaviors, and even where behaviors are not
rigidly defined, implementations are still limited to choosing from among a
few documented possibilities. Unless one makes use of the sun.misc.Unsafe
class or links Java code with outside languages, Java programs will have much
more constrained behavior, and even when using such constructs Java programs
will still obey the laws of time and causality in ways that C programs may
not.