> Only let classes implement Serializable that have to be passed by value over the network.
Your professor is suggesting that you minimize your use of `Serializable` to where it's strictly needed, because serialization is a strong candidate for leaking implementation details. Implementing `Serializable` signals an intent to serialize (even if the object is never actually serialized), and that intent obliges developers to take caution when modifying those classes to avoid breaking software.
Joshua Bloch covers this in his book *Effective Java*.
The moment you serialize an object, the class it was instantiated from can no longer be modified without special treatment†. If you modify the class, the binary representation will no longer match the objects already serialized, so deserialization of any objects serialized before the modification will fail.
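A minimal sketch of the round trip (the class name `Point` and its fields are illustrative, not from the question): with no explicit `serialVersionUID`, the JVM derives one from the class structure, so changing the class after an instance has been serialized changes the derived ID and deserialization of the old bytes fails.

```java
import java.io.*;

public class RoundTrip {
    // Hypothetical type; no explicit serialVersionUID is declared, so the
    // JVM computes one from the class structure at (de)serialization time.
    static class Point implements Serializable {
        int x, y;
        Point(int x, int y) { this.x = x; this.y = y; }
    }

    public static void main(String[] args) throws Exception {
        Point p = new Point(3, 4);

        // Serialize to a byte array (stands in for a file or socket).
        ByteArrayOutputStream bytes = new ByteArrayOutputStream();
        try (ObjectOutputStream out = new ObjectOutputStream(bytes)) {
            out.writeObject(p);
        }

        // Deserializing against the unchanged class works fine...
        try (ObjectInputStream in = new ObjectInputStream(
                new ByteArrayInputStream(bytes.toByteArray()))) {
            Point copy = (Point) in.readObject();
            System.out.println(copy.x + "," + copy.y); // prints 3,4
        }
        // ...but if Point were recompiled with, say, an added field, the
        // derived serialVersionUID would change and readObject() would
        // throw java.io.InvalidClassException for the old bytes.
    }
}
```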
If a type implements `Serializable`, it has the potential to be serialized, and if an instance of that type has been serialized, you may break code by modifying its implementation. Since there's no easy way to know for sure whether an instance of a serializable type has ever been serialized (even if you never intended for objects to be serialized), developers take strong caution when modifying implementations of those types.
† This could be avoided by properly versioning your serializable types, but because it's possible to version a type that had no contract change (with no compile-time support to notify you of the mismatch), it's best to keep explicit versioning minimal to avoid adding excess complexity to your design.
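The "proper versioning" the footnote refers to means declaring an explicit `serialVersionUID`, which pins the stream version so that compatible changes (such as adding a field) no longer break old streams. A sketch, with the class and field names purely illustrative:

```java
import java.io.Serializable;

// Hypothetical DTO: the explicit serialVersionUID pins the stream version,
// so compatible changes (e.g. adding a field, which simply deserializes as
// its default value from old streams) no longer cause InvalidClassException.
// Incompatible changes (removing or retyping a field) can still break things,
// and the constant must be bumped by hand when the contract truly changes --
// the compiler will not warn you, which is the complexity the answer warns about.
public class Invoice implements Serializable {
    private static final long serialVersionUID = 1L;

    private String customer;
    private double total;
    // Fields added later would still deserialize from old streams (as null/0).
}
```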
`implements Serializable` makes your object become serialized for no reason? – Usury

`Serializable` imposes an unnecessary intent (*this object may be serialized*). Modifying serializable types may result in either version refining (which is a pain) or breaking software. Check out my answer. – Tallou