How to upload files with graphql-java?

I can't figure out how to upload files when using graphql-java. Can someone show me a demo? I would appreciate it!

Reference: https://github.com/graphql-java-kickstart/graphql-java-tools/issues/240

I tried it in Spring Boot using graphql-java-kickstart's graphql-java-tools, but it didn't work:

@Component
public class FilesUpload implements GraphQLMutationResolver {

    public Boolean testMultiFilesUpload(List<Part> parts, DataFetchingEnvironment env) {
        // get the file parts from the DataFetchingEnvironment; the parts parameter is not used
        List<Part> attachmentParts = env.getArgument("files");
        System.out.println(attachmentParts);
        return true;
    }
}

This is my schema:

type Mutation {
    testSingleFileUpload(file: Upload): UploadResult
}

I expected this resolver to print attachmentParts so that I could get the file parts.

Rutheruthenia answered 6/8, 2019 at 8:49 Comment(1)
Please check this: #58847214. What am I doing wrong?Insufficient
  1. Define a scalar type in our schema:

    scalar Upload

    and we should configure a GraphQLScalarType bean for Upload, as shown below:

    @Configuration
    public class GraphqlConfig {
    
       @Bean
       public GraphQLScalarType uploadScalarDefine() {
          return ApolloScalars.Upload;
       } 
    }
    
  2. Then define a mutation in the schema and a GraphQLMutationResolver for testMultiFilesUpload:

    type Mutation {
      testMultiFilesUpload(files: [Upload!]!): Boolean
    }
    

Here is the resolver:

@Component
public class FilesUpload implements GraphQLMutationResolver {

    public Boolean testMultiFilesUpload(List<Part> parts, DataFetchingEnvironment env) {
        // get the file parts from the DataFetchingEnvironment; the parts parameter is not used
        List<Part> attachmentParts = env.getArgument("files");
        int i = 1;
        for (Part part : attachmentParts) {
            String uploadName = "copy" + i;
            try {
                part.write("your path:" + uploadName);
            } catch (IOException e) {
                e.printStackTrace();
            }
            i++;
        }
        return true;
    }
}
  3. Configure a Jackson deserializer for javax.servlet.http.Part and register it with the ObjectMapper:

    public class PartDeserializer extends JsonDeserializer<Part> {
    
      @Override
      public Part deserialize(JsonParser p, DeserializationContext ctxt) throws IOException, JsonProcessingException {         
         return null;
      }
    }
    

    Why do we return null? Because the List<Part> parts argument is always null; in the resolver's method we get the parts from the DataFetchingEnvironment instead:

    environment.getArgument("files")

Register it with the ObjectMapper:

@Bean
public ObjectMapper objectMapper() {
  ObjectMapper objectMapper = new ObjectMapper();
  objectMapper.configure(SerializationFeature.FAIL_ON_EMPTY_BEANS, false);
  SimpleModule module = new SimpleModule();
  module.addDeserializer(Part.class, new PartDeserializer());
  objectMapper.registerModule(module);
  return objectMapper;
}
  4. To test this, post the following form data (we use Postman) to the GraphQL endpoint:
operations: { "query": "mutation($files: [Upload!]!) {testMultiFilesUpload(files:$files)}", "variables": {"files": [null,null] } }

map: { "file0": ["variables.files.0"], "file1": ["variables.files.1"] }

file0: your file

file1: your file

Remember to select the form-data option in Postman.

Through this we can upload multiple files; a programmatic version of the same request is sketched below.
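
If you want to test without Postman, the same multipart request can also be built by hand. The following is a minimal sketch using java.net.http.HttpClient (Java 11+); the endpoint URL, the boundary string, and the file names a.txt / b.txt are placeholders of mine, not part of the original setup:

import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Path;

public class MultipartUploadClient {

    public static void main(String[] args) throws IOException, InterruptedException {
        String boundary = "----graphql-upload-boundary";
        String operations = "{\"query\":\"mutation($files: [Upload!]!) {testMultiFilesUpload(files:$files)}\","
                + "\"variables\":{\"files\":[null,null]}}";
        String map = "{\"file0\":[\"variables.files.0\"],\"file1\":[\"variables.files.1\"]}";

        // build the multipart body: the operations part, the map part, then one part per file
        ByteArrayOutputStream body = new ByteArrayOutputStream();
        writeTextPart(body, boundary, "operations", operations);
        writeTextPart(body, boundary, "map", map);
        writeFilePart(body, boundary, "file0", Path.of("a.txt"));
        writeFilePart(body, boundary, "file1", Path.of("b.txt"));
        body.write(("--" + boundary + "--\r\n").getBytes(StandardCharsets.UTF_8));

        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create("http://localhost:8080/graphql"))
                .header("Content-Type", "multipart/form-data; boundary=" + boundary)
                .POST(HttpRequest.BodyPublishers.ofByteArray(body.toByteArray()))
                .build();

        HttpResponse<String> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString());
        System.out.println(response.statusCode() + " " + response.body());
    }

    private static void writeTextPart(ByteArrayOutputStream out, String boundary,
                                      String name, String value) throws IOException {
        out.write(("--" + boundary + "\r\n"
                + "Content-Disposition: form-data; name=\"" + name + "\"\r\n\r\n"
                + value + "\r\n").getBytes(StandardCharsets.UTF_8));
    }

    private static void writeFilePart(ByteArrayOutputStream out, String boundary,
                                      String name, Path file) throws IOException {
        out.write(("--" + boundary + "\r\n"
                + "Content-Disposition: form-data; name=\"" + name
                + "\"; filename=\"" + file.getFileName() + "\"\r\n"
                + "Content-Type: application/octet-stream\r\n\r\n").getBytes(StandardCharsets.UTF_8));
        out.write(Files.readAllBytes(file));
        out.write("\r\n".getBytes(StandardCharsets.UTF_8));
    }
}

The body follows the same convention as the Postman request: an operations field with null file variables, a map field linking the file form fields to those variables, and one form field per file.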

Rutheruthenia answered 8/8, 2019 at 2:20 Comment(5)
@Val Bonn you can check out my solutionRutheruthenia
To avoid overriding the global ObjectMapper (which is not ideal, as it could bring side effects where you don't expect them :D), you can instead register a bean with the same configuration: @Bean public PerFieldObjectMapperProvider perFieldObjectMapperProvider() {}Eraste
@SergeiDubinin is this outdated by now? I am trying to implement this answer with the PerFieldObjectMapperProvider but I can't get it to work.Beutner
@Beutner I was able to make it work, but it wasn't great to use since GraphQL always converts data to JSON and vice versa, so I changed that approach. I recommend uploading files before submitting the data, so that before submitting your mutation you already have a URL to an uploaded file. It's much better.Eraste
@Beutner This answer is 2 years old and I don't remember some details, but what I can tell you is that you can now use the Netflix DGS GraphQL framework: netflix.github.io/dgs, which already implements file upload with GraphQL. You can find the file upload doc here: netflix.github.io/dgs/advanced/file-uploadsRutheruthenia

The main problem is that graphql-java-tools can have trouble doing the field mapping for resolvers that take arguments whose types are not among the basic types (List, String, Integer, Boolean, etc.).

We solved this issue by creating our own custom scalar that is basically like ApolloScalars.Upload, but instead of returning an object of type Part, it returns our own type FileUpload, which holds the contentType as a String and the content as a byte[]. The field mapping then works, and we can read the byte[] within the resolver.

First, set up the new type to be used in the resolver:

public class FileUpload {
    private String contentType;
    private byte[] content;

    public FileUpload(String contentType, byte[] content) {
        this.contentType = contentType;
        this.content = content;
    }

    public String getContentType() {
        return contentType;
    }

    public byte[] getContent() {
        return content;
    }
}

Then we make a custom scalar that looks pretty much like ApolloScalars.Upload, but returns our own resolver type FileUpload:

public class MyScalars {
    public static final GraphQLScalarType FileUpload = new GraphQLScalarType(
        "FileUpload",
        "A file part in a multipart request",
        new Coercing<FileUpload, Void>() {

            @Override
            public Void serialize(Object dataFetcherResult) {
                throw new CoercingSerializeException("Upload is an input-only type");
            }

            @Override
            public FileUpload parseValue(Object input) {
                if (input instanceof Part) {
                    Part part = (Part) input;
                    try {
                        String contentType = part.getContentType();
                        // read the file content into memory, then clean up the temporary part
                        byte[] content = part.getInputStream().readAllBytes();
                        part.delete();
                        return new FileUpload(contentType, content);

                    } catch (IOException e) {
                        throw new CoercingParseValueException("Couldn't read content of the uploaded file");
                    }
                } else if (null == input) {
                    return null;
                } else {
                    throw new CoercingParseValueException(
                            "Expected type " + Part.class.getName() + " but was " + input.getClass().getName());
                }
            }

            @Override
            public FileUpload parseLiteral(Object input) {
                throw new CoercingParseLiteralException(
                        "Must use variables to specify Upload values");
            }
    });
}

In the resolver, you would now be able to get the file from the resolver arguments:

public class FileUploadResolver implements GraphQLMutationResolver {

    public Boolean uploadFile(FileUpload fileUpload) {

        String fileContentType = fileUpload.getContentType();
        byte[] fileContent = fileUpload.getContent();

        // Do something in order to persist the file :)


        return true;
    }
}

In the schema, you declare it like:

scalar FileUpload

type Mutation {
    uploadFile(fileUpload: FileUpload): Boolean
}
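
One thing this answer does not show is how the FileUpload scalar gets registered with the schema. Assuming the graphql-java-kickstart / graphql-java-tools setup from the first answer (my assumption; adjust to however you build your schema), a minimal sketch is to expose it as a bean, just like the Upload scalar there:

import graphql.schema.GraphQLScalarType;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
public class FileUploadScalarConfig {

    // expose the custom scalar so it can be bound to `scalar FileUpload` in the schema
    @Bean
    public GraphQLScalarType fileUploadScalar() {
        return MyScalars.FileUpload;
    }
}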

Let me know if it doesn't work for you :)

Alcantar answered 24/10, 2019 at 14:58 Comment(5)
Yes, your solution did work for me, but there is a problem: when uploading a file through GraphQL, we can't delete the temp file generated in the Tomcat tmp directory, and I haven't solved this problem yet. Did you run into this problem?Rutheruthenia
Good call, but I assume it's not directly related to GraphQL but to Part and the Java Servlet API. I think it has to do with the InputStream not getting closed. If that's the case, you should be able to close it within the scalar right after the declaration of the byte[] content variable. Check this thread #31741977Alcantar
I updated my code example and added part.delete(), haven’t run the code yet, but should be right :) Please let me know.Alcantar
I use part.delete() in my try/catch/finally block, but it didn't work. Actually, part.delete() didn't delete the temp file for us. And like you say, the reason could be that the InputStream is not getting closed.Rutheruthenia
Three notes here: 1. You are creating an empty array; it should be byte[] content = inputStream.readAllBytes(); 2. Temporary files are deleted even without the part.delete(); 3. You shouldn't read all bytes at once; you should only store the InputStream in FileUpload (imagine a 10 GB file).Occur

Just to add to the answers above, for anyone like me who could find zero examples of file upload with the GraphQLSchemaGenerator (code-first) approach as opposed to the schema-first approach: you just have to create a TypeMapper and add it to your GraphQLSchemaGenerator:

public class FileUploadMapper implements TypeMapper {

  @Override
  public GraphQLOutputType toGraphQLType(
      final AnnotatedType javaType, final OperationMapper operationMapper,
      final Set<Class<? extends TypeMapper>> mappersToSkip, final BuildContext buildContext) {
    return MyScalars.FileUpload;
  }

  @Override
  public GraphQLInputType toGraphQLInputType(
      final AnnotatedType javaType, final OperationMapper operationMapper,
      final Set<Class<? extends TypeMapper>> mappersToSkip, final BuildContext buildContext) {
    return MyScalars.FileUpload;
  }

  @Override
  public boolean supports(final AnnotatedType type) {
     return type.getType().equals(FileUpload.class); //class of your fileUpload POJO from the previous answer
  }
}

Then in your GraphQL @Configuration class, where you are building your GraphQLSchema:

public GraphQLSchema schema(GraphQLSchemaGenerator schemaGenerator) {
    return schemaGenerator
        .withTypeMappers(new FileUploadMapper()) // add this line
        .generate();
}

Then in your mutation resolver:

  @GraphQLMutation(name = "fileUpload")
  public void fileUpload(      
      @GraphQLArgument(name = "file") FileUpload fileUpload //type here must be the POJO.class referenced in your TypeMapper
  ) {
    //do something with the byte[] from fileUpload.getContent();
    return;
  }
Precontract answered 14/5, 2020 at 21:4 Comment(1)
Can you delete the tmp file after processing the uploaded file?Rutheruthenia

Since there is no scalar type for bytes, I decided to use the String type and send the data as Base64. Let me explain, schema first:

type Mutation {
  uploadCSV(filedatabase64: String!): Boolean
}

Spring Boot:

public DataFetcher<Boolean> uploadCSV() { 
    return dataFetchingEnvironment -> {
        String input = dataFetchingEnvironment.getArgument("filedatabase64");
        byte[] bytes = Base64.getDecoder().decode(input);
        // in my case it is a text file:
        String strCSV = new String(bytes);
        //....
        return true;
    };
}
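
It is not shown here where uploadCSV() gets wired into the schema. Assuming plain graphql-java with schema-first RuntimeWiring (my assumption, since only the DataFetcher is shown), the hookup could look roughly like this, with the field name matching the mutation above:

// sketch only: attach the uploadCSV() data fetcher (defined above in the same class)
// to the Mutation.uploadCSV field, using graphql.schema.idl.RuntimeWiring
public RuntimeWiring buildWiring() {
    return RuntimeWiring.newRuntimeWiring()
            .type("Mutation", builder -> builder.dataFetcher("uploadCSV", uploadCSV()))
            .build();
}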

HTTP client sender, for example in Python 3:

import requests
import base64

with open('myfile.csv', 'r', encoding='utf-8') as file:
    content = file.read().rstrip()

base64data = base64.b64encode(content.encode()).decode()
url = 'https://www.misite/graphql/'
query = "mutation{uploadCSV(filedatabase64:\"" + base64data + "\")}"
r = requests.post(url, json={'query': query})
print("response", r.status_code, r.text)


About Base64 encoding/decoding in Java, this article is helpful: https://www.baeldung.com/java-base64-encode-and-decode
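
For example, the encoding side in Java (my own sketch, mirroring what the Python script above does) is just a few lines; myfile.csv is a placeholder path:

import java.nio.file.Files;
import java.nio.file.Path;
import java.util.Base64;

public class Base64FileEncoder {

    public static void main(String[] args) throws Exception {
        // read the file and encode its bytes as a Base64 string for the mutation variable
        byte[] bytes = Files.readAllBytes(Path.of("myfile.csv"));
        String base64data = Base64.getEncoder().encodeToString(bytes);
        System.out.println(base64data);
    }
}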

Harar answered 11/2, 2022 at 7:57 Comment(0)
