In C and C++ programs, arrays are typically contiguous in the virtual address space (assuming the code is compiled and run natively rather than interpreted, and assuming the platform has a virtual address space in the first place).
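A minimal sketch of what that contiguity looks like (the array name `a` and the element count are just illustrative): consecutive elements sit at addresses exactly `sizeof(int)` apart.

```cpp
#include <cstdio>

int main() {
    int a[4] = {0, 1, 2, 3};
    // The array occupies one contiguous block, so each address
    // printed here differs from the previous one by sizeof(int).
    for (int i = 0; i < 4; ++i)
        std::printf("&a[%d] = %p\n", i, static_cast<void*>(&a[i]));
    return 0;
}
```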
So if a big array can't be allocated as one contiguous block, even when there's enough free memory overall, the allocation fails: you get a std::bad_alloc exception (from throwing operator new in C++) or NULL (from malloc()-like functions in C/C++, or from the non-throwing operator new in C++).
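A hedged sketch of those failure modes (the request size here is deliberately absurd so it fails on typical systems; the exact point at which real allocations start failing depends on the platform and on how much address space is already fragmented):

```cpp
#include <cstdio>
#include <cstdlib>   // std::malloc, std::free
#include <new>       // std::bad_alloc, std::nothrow

int main() {
    // Roughly half of the addressable range, in elements -- far more
    // than any typical system will hand out in one contiguous block.
    const std::size_t n = static_cast<std::size_t>(-1) / sizeof(int) / 2;

    // 1. Throwing operator new: failure surfaces as std::bad_alloc.
    try {
        int* p = new int[n];
        delete[] p;
    } catch (const std::bad_alloc&) {
        std::puts("new int[n] threw std::bad_alloc");
    }

    // 2. Non-throwing operator new: failure surfaces as a null pointer.
    int* q = new (std::nothrow) int[n];
    if (!q)
        std::puts("new (std::nothrow) int[n] returned nullptr");
    delete[] q;   // deleting a null pointer is a no-op

    // 3. malloc(): failure surfaces as NULL.
    void* r = std::malloc(n * sizeof(int));
    if (!r)
        std::puts("malloc returned NULL");
    std::free(r); // free(NULL) is a no-op
    return 0;
}
```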
Virtual memory (and paging to/from disk) usually doesn't solve virtual address space fragmentation problems, or at least not directly; its purpose is different. It normally lets programs behave as if there's enough memory when in fact there isn't: RAM is effectively extended by free disk space, at the cost of lower performance, because the OS has to exchange data between RAM and disk when there's memory pressure.
Your array (in parts or in whole) can be paged out to disk by the OS. But this is transparent to your program: whenever the program accesses something in the array, the OS loads it back (again, in parts or in whole, as the OS deems necessary).
On systems without virtual memory, there's no virtual-to-physical address translation and your program works directly with physical memory. It therefore has to deal with physical memory fragmentation and also compete with other programs for both free memory and the address space, which makes allocation failures more likely in general. (Systems with virtual memory typically run each program in its own virtual address space, so fragmentation in app A's address space won't affect app B's.)