As is well known, it is easy to add serialization support to a lambda expression when the target interface does not already inherit Serializable: an intersection cast like (TargetInterface & Serializable)() -> {/*code*/} does the job.
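To make the opt-in direction concrete, here is a minimal, self-contained sketch (class and variable names are mine) showing that the intersection cast is all it takes:

```java
import java.io.ByteArrayOutputStream;
import java.io.ObjectOutputStream;
import java.io.Serializable;

public class OptInDemo {
    public static void main(String[] args) throws Exception {
        // Runnable itself does not extend Serializable...
        Runnable plain = () -> System.out.println("plain");
        System.out.println(plain instanceof Serializable);  // false

        // ...but the intersection cast makes the compiler emit a serializable lambda
        Runnable ser = (Runnable & Serializable) () -> System.out.println("serializable");
        System.out.println(ser instanceof Serializable);    // true

        // serializing it succeeds; the stream contains a SerializedLambda
        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        try (ObjectOutputStream oos = new ObjectOutputStream(bos)) {
            oos.writeObject(ser);
        }
        System.out.println(bos.size() > 0);                 // true
    }
}
```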
What I am asking for is a way to do the opposite: explicitly remove serialization support when the target interface does inherit Serializable.
Since you can’t remove an interface from a type, a language-based solution would presumably look like (@NotSerializable TargetInterface)() -> {/* code */}. But as far as I know, no such solution exists. (Correct me if I’m wrong; that would be a perfect answer.)
Denying serialization even when the class implements Serializable was a legitimate behavior in the past, and with classes under the programmer’s control, the pattern looks like:
import java.io.IOException;
import java.io.NotSerializableException;
import java.io.ObjectStreamException;

public class NotSupportingSerialization extends SerializableBaseClass {
    // reject any attempt to serialize an instance
    private void writeObject(java.io.ObjectOutputStream out) throws IOException {
        throw new NotSerializableException();
    }
    // reject any attempt to de-serialize an instance
    private void readObject(java.io.ObjectInputStream in)
            throws IOException, ClassNotFoundException {
        throw new NotSerializableException();
    }
    // reject de-serialization from a stream containing no data for this class
    private void readObjectNoData() throws ObjectStreamException {
        throw new NotSerializableException();
    }
}
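As a sanity check, the pattern can be exercised with a throwaway base class (SerializableBaseClass here is just a stand-in I made up); attempting to serialize the subclass fails as intended:

```java
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.NotSerializableException;
import java.io.ObjectOutputStream;
import java.io.Serializable;

public class BlockDemo {
    // stand-in for the serializable base class from the example
    static class SerializableBaseClass implements Serializable {}

    static class NotSupportingSerialization extends SerializableBaseClass {
        // called reflectively by ObjectOutputStream; aborts the serialization
        private void writeObject(ObjectOutputStream out) throws IOException {
            throw new NotSerializableException();
        }
    }

    public static void main(String[] args) throws IOException {
        try (ObjectOutputStream oos = new ObjectOutputStream(new ByteArrayOutputStream())) {
            oos.writeObject(new NotSupportingSerialization());
            System.out.println("unexpectedly serialized");
        } catch (NotSerializableException e) {
            System.out.println("serialization rejected");
        }
    }
}
```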
But for a lambda expression, the programmer doesn’t have that control over the generated class.
Why would someone ever bother about removing the support? Well, besides the larger code generated to include the serialization support, it creates a security risk. Consider the following code:
public class CreationSite {
public static void main(String... arg) {
TargetInterface f=CreationSite::privateMethod;
}
private static void privateMethod() {
System.out.println("should be private");
}
}
Here, the access to the private method is not exposed, even though TargetInterface is public (interface methods are always public), as long as the programmer takes care not to pass the instance f to untrusted code.
However, things change if TargetInterface inherits Serializable. Then, even if CreationSite never hands out an instance, an attacker can create an equivalent instance by de-serializing a manually constructed stream. If the interface for the above example looks like
public interface TargetInterface extends Runnable, Serializable {}
it’s as easy as:
// requires: import java.io.*;
//           import java.lang.invoke.MethodHandleInfo;
//           import java.lang.invoke.SerializedLambda;
SerializedLambda l=new SerializedLambda(CreationSite.class,
    TargetInterface.class.getName().replace('.', '/'), "run", "()V",
    MethodHandleInfo.REF_invokeStatic,
    CreationSite.class.getName().replace('.', '/'), "privateMethod",
    "()V", "()V", new Object[0]);
ByteArrayOutputStream os=new ByteArrayOutputStream();
try(ObjectOutputStream oos=new ObjectOutputStream(os)) { oos.writeObject(l); }
TargetInterface f;
try(ByteArrayInputStream is=new ByteArrayInputStream(os.toByteArray());
    ObjectInputStream ois=new ObjectInputStream(is)) {
    f=(TargetInterface) ois.readObject();
}
f.run(); // invokes privateMethod
Note that the attacking code does not perform any action that a SecurityManager would forbid.
The decision to support serialization is made at compile time. It requires a synthetic factory method added to CreationSite and a flag passed to the metafactory method. Without the flag, the generated lambda will not support serialization even if the interface happens to inherit Serializable; the lambda class will even have a writeObject method like the one in the NotSupportingSerialization example above. And without the synthetic factory method, de-serialization is impossible.
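The effect of that flag can be observed by driving the metafactory by hand. The sketch below (all names are mine) calls LambdaMetafactory.metafactory, i.e. the variant without FLAG_SERIALIZABLE, for an interface that does extend Serializable; the resulting lambda rejects serialization exactly as described:

```java
import java.io.ByteArrayOutputStream;
import java.io.NotSerializableException;
import java.io.ObjectOutputStream;
import java.io.Serializable;
import java.lang.invoke.CallSite;
import java.lang.invoke.LambdaMetafactory;
import java.lang.invoke.MethodHandle;
import java.lang.invoke.MethodHandles;
import java.lang.invoke.MethodType;

public class MetaDemo {
    interface SerRunnable extends Runnable, Serializable {}

    static void sayHi() { System.out.println("hi"); }

    public static void main(String[] args) throws Throwable {
        MethodHandles.Lookup lookup = MethodHandles.lookup();
        MethodHandle impl = lookup.findStatic(MetaDemo.class, "sayHi",
                MethodType.methodType(void.class));
        // plain metafactory: no FLAG_SERIALIZABLE is passed,
        // even though SerRunnable inherits Serializable
        CallSite site = LambdaMetafactory.metafactory(lookup, "run",
                MethodType.methodType(SerRunnable.class),
                MethodType.methodType(void.class), impl,
                MethodType.methodType(void.class));
        SerRunnable r = (SerRunnable) site.getTarget().invokeExact();
        r.run();  // prints "hi"

        // the generated class has a writeObject that throws NotSerializableException
        try (ObjectOutputStream oos = new ObjectOutputStream(new ByteArrayOutputStream())) {
            oos.writeObject(r);
            System.out.println("unexpectedly serialized");
        } catch (NotSerializableException e) {
            System.out.println("serialization rejected");
        }
    }
}
```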
This leads to the one solution I found: create a copy of the interface, modify it to not inherit Serializable, and compile against that modified version. Then, if the real version at runtime happens to inherit Serializable, serialization will still be rejected.
Well, another solution is to never use lambda expressions or method references in security-relevant code, at least when the target interface inherits Serializable; this must be re-checked every time you compile against a newer version of the interface.
But I think there must be a better, preferably in-language, solution.