If you are 100% certain that the programs communicating over ZMQ will always be able to understand each other's binary format (e.g. because they are always distributed together and were all compiled with the same compiler and options), I see no benefit in the overhead added by serialization; see the sketch below.
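As a minimal, hypothetical sketch of that case (Python with pyzmq; the `Update` struct and the endpoint address are illustrative, not from the original answer): both peers agree on an exact memory layout and just ship the raw bytes, with no serialization layer at all.

```python
import ctypes
import zmq

class Update(ctypes.Structure):
    # Both ends must agree on field order, sizes, padding and endianness.
    _fields_ = [("id", ctypes.c_uint32), ("value", ctypes.c_double)]

ctx = zmq.Context()
sender = ctx.socket(zmq.PAIR)
sender.bind("tcp://127.0.0.1:5556")
receiver = ctx.socket(zmq.PAIR)
receiver.connect("tcp://127.0.0.1:5556")

# Send the struct's raw memory image as a single ZMQ message.
sender.send(bytes(Update(42, 3.14)))
# Reinterpret the received bytes with the same layout on the other side.
received = Update.from_buffer_copy(receiver.recv())
print(received.id, received.value)  # 42 3.14
```

This only works as long as every peer shares that exact layout; nothing in the bytes describes the data.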
As soon as that condition cannot be satisfied (partner programs running on different host architectures, written in different languages, or able to evolve independently over time, any of which can make their raw binary structures incompatible), serialization very probably becomes a must.
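A minimal sketch of the portable alternative, under the same illustrative assumptions as above: serialize into a self-describing format (JSON here via pyzmq's helpers; Protocol Buffers, MessagePack and friends follow the same pattern), so a peer on a different platform or in a different language can decode the message.

```python
import zmq

ctx = zmq.Context()
sender = ctx.socket(zmq.PAIR)
sender.bind("tcp://127.0.0.1:5557")
receiver = ctx.socket(zmq.PAIR)
receiver.connect("tcp://127.0.0.1:5557")

# pyzmq serializes the dict to JSON bytes before sending...
sender.send_json({"id": 42, "value": 3.14})
# ...and parses it back on the receiving end, regardless of the peer's platform.
print(receiver.recv_json())  # {'id': 42, 'value': 3.14}
```

The price is the extra encode/decode step and a larger message, which is exactly the overhead the first case avoids.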
It seems that nowadays everybody and their brother is creating serialization solutions, which may be an indication that there's no one-size-fits-all answer. This page contains a pretty thorough benchmark of serialization time, deserialization time, and size for 27 (!!) different serialization systems. Don't skip the first paragraph of that page; it warns that "benchmarks can be misleading". Your application and your data are what count for you, but the figures presented there may help you narrow down the choices you want to study in detail.