The standard is not concerned with binary compatibility. It is, however, concerned with classes, and by "changing" the definition of a class from one translation unit to another you do indeed invoke undefined behavior (this is the One Definition Rule).
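For instance, this is all it takes (a deliberately minimal sketch, with made-up names):

// tu1.cpp
struct S { int x; };             // one definition of S...

// tu2.cpp -- linked into the same program
struct S { int x; double y; };   // ...and an incompatible one: undefined behavior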
In practice, most compilers do allow a number of changes without requiring recompilation, but the list is small... and for this one I would say it might not be possible, depending on what you know a priori about the derived classes.
The problem I foresee lies with the optimization that compilers usually carry out on the virtual tables.
When you create a class with virtual functions, you get a virtual table that looks like so:
// B virtual table
0 - Offset to complete object
1 - RTTI
2 - func0
3 - func1
...
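In source form, such a table corresponds to a class along these lines (func0 and func1 are just the placeholder names from the layout above):

struct B
{
    virtual void func0();   // slot 2
    virtual void func1();   // slot 3
    // ... one slot per virtual function
};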
In order to save some space, the derived class's own virtual functions are usually "appended":
// D virtual table
0..N+2 - Same as B
N+3 - func(N+1)
N+4 - func(N+2)
...
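A derived class with that appended table would look like this (again, assumed names):

struct D : B
{
    // overriding func0/func1 would merely reuse slots 2 and 3;
    // brand new virtual functions get the slots after B's:
    virtual void funcN1();  // slot N+3
    virtual void funcN2();  // slot N+4
};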
This way a D object only has one virtual pointer, which can be used as such even when the object is (statically) a B (through a pointer or reference).
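Here is a minimal, self-contained demonstration of that single-pointer dispatch (hypothetical names again):

#include <iostream>

struct B
{
    virtual void func0() { std::cout << "B::func0\n"; }
    virtual void func1() { std::cout << "B::func1\n"; }
};

struct D : B
{
    virtual void funcD() { std::cout << "D::funcD\n"; }  // appended slot
};

void call(B& b) { b.func1(); }  // looks up slot 3, whatever the dynamic type

int main()
{
    D d;
    call(d);   // the one vptr in d serves both the B view and the D view
    d.funcD();
}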
However, if you were to extend B without recompiling D, then it would just plain crash: when calling the N+1 function of B you would instead call the N+1 function of D, which would probably not even have the same arguments... oops!
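In the table terms used above (still with the assumed names), the collision looks like this:

// B virtual table, recompiled with one more virtual function
...
N+3 - B's new func(N+1)   // callers that see the new B expect it here

// D virtual table, NOT recompiled
...
N+3 - D's func(N+1)       // D's own first virtual function still sits here,
                          // so the call lands on the wrong function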
It can be done, though, if you know that no derived class adds any virtual function of its own.
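In that case the derived classes all have the shape below (a sketch with assumed names): their tables end exactly where B's does, so there are no slots of their own for a new B function to collide with.

struct D : B
{
    void func0();  // overrides only reuse B's slots (here, slot 2);
                   // no virtual function of D's own means no appended slots
};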