Bounding generics with 'super' keyword

Why can I use super only with wildcards and not with type parameters?

For example, in the Collection interface, why is the toArray method not written like this?

interface Collection<T> {
    <S super T> S[] toArray(S[] a);
}
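
For reference, the actual declaration in java.util.Collection uses a separate, unbounded type parameter:

interface Collection<E> {
    <T> T[] toArray(T[] a); // the real JDK signature; T is unbounded
}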
Pyxis answered 10/5, 2010 at 4:48 Comment(3)
Can you explain what you semantically intend to be the difference between <S super T> and <? super T>? Seems to me it's just splitting hairs over syntax. – Carrew
You can't do ? super T[] toArray(? super T[] a), can you? – Foreground
And this is why there is an Object[] toArray() method in the Collection class. – Lykins

Using super to bound a named type parameter (e.g. <S super T>), as opposed to a wildcard (e.g. <? super T>), is ILLEGAL simply because even if it were allowed, it wouldn't do what you'd hoped it would do: since Object is the ultimate supertype of all reference types, and everything is an Object, in effect there would be no bound.

In your specific example, any array of reference type is an Object[] (by Java array covariance), so it could be used as an argument to <S super T> S[] toArray(S[] a) (if such a bound were legal) at compile time, and that wouldn't prevent an ArrayStoreException at run time.

What you're trying to propose is that given:

List<Integer> integerList;

and given this hypothetical super bound on toArray:

<S super T> S[] toArray(S[] a) // hypothetical! currently illegal in Java

the compiler should only allow the following to compile:

integerList.toArray(new Integer[0]) // works fine!
integerList.toArray(new Number[0])  // works fine!
integerList.toArray(new Object[0])  // works fine!

and no other array type arguments (since Integer has only those 3 types as supertypes, ignoring interfaces such as Comparable and Serializable). That is, you're trying to prevent this from compiling:

integerList.toArray(new String[0])  // trying to prevent this from compiling

because, by your argument, String is not a supertype of Integer. However, Object is a supertype of Integer, and a String[] is an Object[], so the compiler would still let the above compile, even if <S super T> were hypothetically allowed!

So the following would still compile (just as it does right now), and the ArrayStoreException at run time could not be prevented by any compile-time check using generic type bounds:

integerList.toArray(new String[0])  // compiles fine!
// throws ArrayStoreException at run-time
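
For the record, this is exactly how the real, unbounded <T> T[] toArray(T[] a) behaves today; a minimal runnable demonstration:

import java.util.List;

public class ToArrayDemo {
    public static void main(String[] args) {
        List<Integer> integerList = List.of(1, 2, 3);
        // T is inferred as String, so this compiles without complaint,
        // but throws ArrayStoreException at run time when the first
        // Integer is stored into the String[]:
        String[] strings = integerList.toArray(new String[0]);
    }
}

(List.of requires Java 9+; any non-empty List<Integer> shows the same failure.)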

Generics and arrays don't mix, and this is one of the many places where it shows.


A non-array example

Again, let's say that you have this generic method declaration:

<T super Integer> void add(T number) // hypothetical! currently illegal in Java

And you have these variable declarations:

Integer anInteger;
Number aNumber;
Object anObject;
String aString;

Your intention with <T super Integer> (if it were legal) is that it should allow add(anInteger), add(aNumber), and of course add(anObject), but NOT add(aString). Well, String is an Object, so T could still be inferred as Object, and add(aString) would compile anyway.
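
With the unbounded declaration that legal Java actually allows, every one of those calls compiles, since T can simply be inferred as each argument's own type; a small runnable sketch:

public class AddDemo {
    static <T> void add(T number) {
        // unbounded T places no restriction on the argument
    }

    public static void main(String[] args) {
        Integer anInteger = 1;
        Number aNumber = 2.5;
        Object anObject = new Object();
        String aString = "s";

        add(anInteger); // T inferred as Integer
        add(aNumber);   // T inferred as Number
        add(anObject);  // T inferred as Object
        add(aString);   // T inferred as String -- exactly the call the
                        // hypothetical <T super Integer> was meant to reject
    }
}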



Chesna answered 10/5, 2010 at 5:4 Comment(11)
This way Collection<? super Integer> should take everything, because everything is an Object and Object is a supertype of Integer. – Pyxis
@mohsenof: even though String is an Object, a Collection<String> is not a Collection<? super Integer>, because generics are invariant: a Collection<String> is NOT a Collection<Object>. A String[] however is an Object[], because arrays are covariant. Again, generics and arrays don't mix, and they run under very different type rules. Read Effective Java 2nd Edition, Item 25: Prefer lists to arrays. – Chesna
So why would <S super T> List<S> addToList(List<S> list, T element){ list.add(element); return list; } not make sense? – Mcphail
"It wouldn't do what you'd hoped" – this is just plain wrong. The OP provided a great use case (modulo the dreaded array covariance) with obvious semantics: the collection should be able to fill an array of any more general type and return said array. There is nothing wrong with it. – Sivia
"Since any array of reference type is an Object[], it can therefore be used as an argument to <S super T> S[] toArray(S[] a)" True. However, if S were inferred to be Object, that would also constrain the return type of the method to Object[], which may cause a compile error depending on the context in which the call is made (if it expects an S[]). So changing the type variable to Object is not a substitute for a super bound. – Twocycle
There are some, though few, valid use cases for lower bounds. – Pitarys
polygenelubricants: The main point in this answer, that lower type bounds are not useful, is incorrect. This is demonstrated in Rotsor's answer. I would like to encourage you to edit this answer to point this out, because in its current form it is misleading people! – Price
@polygenelubricants, <? super T> is ILLEGAL?! Are you sure?! – Pastorale
-1 This answer is incorrect. The real answer is "they designed Java that way". See Rotsor's answer for valid (and extremely useful) use cases that are invalid code only because of the Java specification, not because the logic is incorrect. – Pliable
Yes, the answer is wrong. It sounds convincing at first but it's nonsense. T super String means T is either Object, CharSequence or String - but not anything else. Clearly that is a real - and useful - bound. – Knighton
What if the hypothetical clause <S super T> were allowed and we used it along with a classy alternative such as Class<S>? Then we would be able to solve this problem. The new version of toArray would be: <S super T> S[] toArray(Class<S> a) – Igneous

As no one has provided a satisfactory answer, the correct answer seems to be "for no good reason".

polygenelubricants provided a good overview of the bad things that happen with Java array covariance, which is a terrible feature by itself. Consider the following code fragment:

String[] strings = new String[1];
Object[] objects = strings;
objects[0] = 0;

This obviously wrong code compiles without resorting to any "super" construct, so array covariance should not be used as an argument.

Now, here I have a perfectly valid example of code requiring super in the named type parameter:

class Nullable<A> {
    private A value;
    // Does not compile!!
    public <B super A> B withDefault(B defaultValue) {
        return value == null ? defaultValue : value;
    }
}

Potentially supporting some nice usage:

Nullable<Integer> intOrNull = ...;
Integer i = intOrNull.withDefault(8);
Number n = intOrNull.withDefault(3.5);
Object o = intOrNull.withDefault("What's so bad about a String here?");

The latter code fragment does not compile if I remove the B altogether, so B is indeed needed.
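
Concretely, if B is removed the declaration collapses to using A for both the parameter and the result, and the wider calls above are rejected (a sketch):

class Nullable<A> {
    private A value;

    // without a separate B, defaultValue must be exactly an A
    public A withDefault(A defaultValue) {
        return value == null ? defaultValue : value;
    }
}

// For a Nullable<Integer>:
//   Integer i = intOrNull.withDefault(8);      // still fine
//   Number  n = intOrNull.withDefault(3.5);    // error: a Double is not an Integer
//   Object  o = intOrNull.withDefault("text"); // error: a String is not an Integer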

Note that the feature I'm trying to implement is easily obtained if I invert the order of type parameter declarations, thus changing the super constraint to extends. However, this is only possible if I rewrite the method as a static one:

// This one actually works and I use it.
public static <B, A extends B> B withDefault(Nullable<A> nullable, B defaultValue) { ... }

The point is that this Java language restriction does rule out some otherwise useful designs and may require ugly workarounds. I wonder what would happen if we needed withDefault to be virtual.

Now, to correlate with what polygenelubricants said, we use B here not to restrict the type of the object passed as defaultValue (see the String used in the example), but rather to restrict the caller's expectations about the object we return. As a simple rule, you use extends with the types you demand and super with the types you provide.

Sivia answered 15/4, 2011 at 0:6 Comment(9)
+1 Your example matches up with a real-world use case in Guava's Optional.or(T). From the documentation: "The signature public T or(T defaultValue) is overly restrictive. However, the ideal signature, public <S super T> S or(S), is not legal Java. As a result, some sensible operations involving subtypes are compile errors". – Bergess
@PaulBellora …and now that Java has its own Optional and the Stream API, it's biting everyone. The best you can do then is an actually obsolete .<SuperType>map(t -> t) or .map(Function.<SuperType>identity()). – Martensite
@Martensite I actually remember this definition and explanation in the Guava docs. I still can't understand why one would want to return a supertype; for me it just defeats the purpose, since it makes perfect sense to return a subtype. I guess I am missing something here. – Mariomariology
@Mariomariology this answer's examples should be explanatory enough; just replace Nullable with Optional and withDefault with orElse. Another example would be CompletableFuture<CharBuffer> f = someIoOperation(); CharSequence result = f.exceptionally(t -> "constant fallback").join();, which would be a reasonable operation but doesn't work unless you insert a workaround like .thenApply(Function.<CharSequence>identity()). Another indirect example would be String.join(", ", () -> Stream.of("foo", "bar").iterator());, which doesn't work. – Martensite
@Rotsor, can you please explain why the valid variant of the method that you propose at the end of the post can only be written as static? I mean this declaration: public static <B, A extends B> B withDefault(Nullable<A> nullable, B defaultValue) { ... }. – Drusilladrusus
You can't make that a non-static method of a Nullable<A> class (and use this instead of nullable), because then you can't express the necessary constraint between B and A (as shown). You could still add it as a non-static method to another class, but that would be useless because we don't use any instances of that class. – Sivia
@Rotsor, ah, so the only reason that method is static is that class Nullable<A> is declared in a form in which we don't have B among the type parameters, got it, thank you. If we could rewrite this class as Nullable<B, A extends B>, then this method could be non-static. – Drusilladrusus
Ah, yes, something like that could work, but: 1) the choice of B would have to be made when constructing the instance of such a class; 2) I don't know if constraints are allowed at all in the type parameters of a class and what exactly their meaning would be. – Sivia
Another example of where this would be useful is Stream.concat. I always wondered why this was not designed as an instance method, so you could write stream.map(...).filter(...).concat(...). Now I understand... – Radiomicrometer

The "official" answer to your question can be found in a Sun/Oracle bug report.

BT2:EVALUATION

See

http://lampwww.epfl.ch/~odersky/ftp/local-ti.ps

particularly section 3 and the last paragraph on page 9. Admitting type variables on both sides of subtype constraints can result in a set of type equations with no single best solution; consequently, type inference cannot be done using any of the existing standard algorithms. That is why type variables have only "extends" bounds.

Wildcards, on the other hand, do not have to be inferred, so there is no need for this constraint.

@###.### 2004-05-25

Yes; the key point is that wildcards, even when captured, are only used as inputs of the inference process; nothing with (only) a lower bound needs to be inferred as a result.

@###.### 2004-05-26

I see the problem. But I do not see how it is different from the problems we have with lower bounds on wildcards during inference, e.g.:

List<? super Number> s;
boolean b;
...
s = b ? s : s;

Currently, we infer List<X> where X extends Object as the type of the conditional expression, meaning that the assignment is illegal.

@###.### 2004-05-26

Sadly, the conversation ends there. The paper to which the (now dead) link used to point is Inferred Type Instantiation for GJ. From glancing at the last page, it boils down to: If lower bounds are admitted, type inference may yield multiple solutions, none of which is principal.
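
To see the contrast: inference with only upper-bound constraints does have a principal solution, the least upper bound, which Java computes today for conditional expressions. An illustrative runnable aside (my example, not from the bug report):

import java.io.Serializable;

public class LubDemo {
    public static void main(String[] args) {
        boolean b = args.length > 0;
        // lub(Integer, String) is a principal solution: an intersection
        // of Serializable and Comparable, so both assignments compile.
        var x = b ? Integer.valueOf(1) : "one";
        Serializable s = x;
        Comparable<?> c = x;
        System.out.println(s + " / " + c);
    }
}

With type variables admitted on both sides of subtype constraints, no such single best solution exists in general, which is exactly the paper's point.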

Periodic answered 24/11, 2015 at 13:53 Comment(1)
Underappreciated answer, should be accepted. Lower bounds on method type parameters are useful (as Rotsor showed) and can be implemented (as the author of that paper showed when he did so in Scala). Their being hard to implement sensibly is clearly the real reason they aren't allowed in Java. – Orgiastic

The only reason is that a type parameter declared with the super keyword makes no sense at the class level: under type erasure, the only logical erasure strategy would be to fall back to the supertype of all objects, which is the Object class.
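
A sketch of how erasure treats bounds today, which is the point being made: an extends-bound erases to that bound, whereas a hypothetical super-bound has no upper bound and so could only erase to Object, preserving nothing:

class ErasureDemo {
    // erases to: static void takeNumber(Number t)
    static <T extends Number> void takeNumber(T t) { }

    // a hypothetical <T super Integer> void takeWider(T t) could only
    // erase to: static void takeWider(Object t) -- the "bound" would
    // leave no trace after erasure
}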

A great example and explanation can be found here: http://www.angelikalanger.com/GenericsFAQ/FAQSections/TypeParameters.html#Why%20is%20there%20no%20lower%20bound%20for%20type%20parameters?

A simple example for rules of type-erasure can be found here: https://www.tutorialspoint.com/java_generics/java_generics_type_erasure.htm#:~:text=Type%20erasure%20is%20a%20process,there%20is%20no%20runtime%20overhead.

Panel answered 17/10, 2022 at 19:4 Comment(0)

Suppose we have:

  • basic classes A > B > C and D

    class A {
        void methodA() {}
    }

    class B extends A {
        void methodB() {}
    }

    class C extends B {
        void methodC() {}
    }

    class D {
        void methodD() {}
    }
    
  • job wrapper classes

    interface Job<T> {
        void exec(T t);
    }
    
    class JobOnA implements Job<A>{
        @Override
        public void exec(A a) {
            a.methodA();
        }
    }
    class JobOnB implements Job<B>{
        @Override
        public void exec(B b) {
            b.methodB();
        }
    }
    
    class JobOnC implements Job<C>{
        @Override
        public void exec(C c) {
            c.methodC();
        }
    }
    
    class JobOnD implements Job<D>{
        @Override
        public void exec(D d) {
            d.methodD();
        }
    }
    
  • and one manager class with 4 different approaches to execute job on object

    class Manager<T>{
        final T t;
        Manager(T t){
            this.t=t;
        }
        public void execute1(Job<T> job){
            job.exec(t);
        }
    
        public <U> void execute2(Job<U> job){
            U u= (U) t;  //not safe
            job.exec(u);
        }
    
        public <U extends T> void execute3(Job<U> job){
            U u= (U) t; //not safe
            job.exec(u);
        }
    
        //desired feature, not compiled for now
        public <U super T> void execute4(Job<U> job){
            U u= (U) t; //safe
            job.exec(u);
        }
    }
    
  • with usage

    void usage(){
        B b = new B();
        Manager<B> managerB = new Manager<>(b);
    
        //TOO STRICT
        managerB.execute1(new JobOnA());
        managerB.execute1(new JobOnB()); //compiled
        managerB.execute1(new JobOnC());
        managerB.execute1(new JobOnD());
    
        //TOO MUCH FREEDOM
        managerB.execute2(new JobOnA()); //compiled
        managerB.execute2(new JobOnB()); //compiled
        managerB.execute2(new JobOnC()); //compiled !!
        managerB.execute2(new JobOnD()); //compiled !!
    
        //INADEQUATE RESTRICTIONS
        managerB.execute3(new JobOnA());
        managerB.execute3(new JobOnB()); //compiled
        managerB.execute3(new JobOnC()); //compiled !!
        managerB.execute3(new JobOnD());
    
        //SHOULD BE
        managerB.execute4(new JobOnA());  //compiled
        managerB.execute4(new JobOnB());  //compiled
        managerB.execute4(new JobOnC());
        managerB.execute4(new JobOnD());
    }
    

Any suggestions on how to implement execute4 now?

========== edited ==========

    public void execute4(Job<? super T> job){
        job.exec(t);
    }

Thanks to all :)

========== edited ==========

    private <U> void execute2(Job<U> job){
        U u = (U) t;  // still an unchecked cast, but safe for every call made via execute4
        job.exec(u);
    }
    public void execute4(Job<? super T> job){
        execute2(job);
    }

Much better: any code inside execute2 can now refer to U by name, so the supertype that the wildcard version leaves anonymous becomes named!

Interesting discussion :)
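
For completeness, a sketch of the standard wildcard-capture idiom, which achieves the same thing with no unchecked cast at all (executeCaptured is a hypothetical helper name):

    public void execute4(Job<? super T> job){
        executeCaptured(job, t);
    }
    // the wildcard is captured into the named U; since U is a supertype
    // of T, passing t needs no cast and no @SuppressWarnings
    private static <U> void executeCaptured(Job<U> job, U u){
        job.exec(u);
    }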

Sterol answered 3/10, 2014 at 15:54 Comment(0)

I really like the accepted answer, but I would like to put a slightly different perspective on it.

super is supported on wildcard type arguments only, to allow contravariance at the use site. When it comes to covariance and contravariance, it's important to understand that Java only supports use-site variance, unlike Kotlin or Scala, which also allow declaration-site variance. The Kotlin documentation explains it very well here. Or, if you're more into Scala, here's one for you.

It basically means that in Java you cannot, at declaration time, limit how your class will be used in terms of PECS. The class can both consume and produce, and some of its methods can even do both at the same time, like toArray(T[]), by the way.

Now, the reason extends is allowed in class and method declarations is that it's more about polymorphism than about variance. And polymorphism is an intrinsic part of Java and OOP in general: if a method can accept some supertype, a subtype can always safely be passed to it. And if a method, at its declaration site, promises as its "contract" to return some supertype, it's totally fine if its implementations return a subtype instead.
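
As a concrete illustration of use-site variance (my example, not from the original answer): the variance of List is chosen at each use site, following PECS, rather than once in the declaration of List itself:

import java.util.List;

public class UseSiteVariance {
    // producer extends, consumer super: both variance decisions are
    // made here at the use site, not on the List type itself
    static void copy(List<? extends Number> src, List<? super Number> dst) {
        for (Number n : src) {
            dst.add(n);
        }
    }
}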

Emelda answered 11/10, 2019 at 7:12 Comment(1)
The accepted answer is wrong. It's correct that this question is about type bounds and not about variance. But lower type bounds using T super R would sometimes be useful, as Rotsor's answer demonstrates, and as the reference to the Guava Optional class shows. – Price
