dropzone.js direct upload to S3 with content-type

I'm currently using dropzone.js to upload images to S3 with a presigned URL. Everything works except that I am unable to set the content type of the file being uploaded. By default every file is stored as binary/octet-stream, so I am unable to view the images directly in the browser.

My S3 presigned policy looks like this:

const policy = s3PolicyV4.generate({
    key: key,
    bucket: process.env.S3_BUCKET,
    contentType: 'multipart/form-data',
    region: process.env.REGION,
    accessKey: process.env.ACCESS_KEY_ID,
    secretKey: process.env.SECRET_ACCESS_KEY,
});

I've tried changing the contentType key here with no luck, and after doing some research I also tried adding this condition:

conditions: [
   ["starts-with", "$Content-Type", ""]
]
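
For S3 to accept a content-type field in the POST form at all, the signed (base64-encoded) policy document has to declare it alongside the other fields the form posts. A rough sketch of such a decoded policy document, with placeholder values for everything not shown in the question:

const policyDocument = {
    expiration: "2017-11-17T00:00:00Z",                // placeholder expiry
    conditions: [
        { bucket: process.env.S3_BUCKET },
        ["starts-with", "$key", ""],
        { acl: "public-read" },                        // must match the acl the form posts
        { success_action_status: "201" },
        ["starts-with", "$Content-Type", "image/"],    // allows any image/* content type
        { "x-amz-algorithm": "AWS4-HMAC-SHA256" },
        { "x-amz-credential": "<accessKey>/<date>/<region>/s3/aws4_request" },  // placeholder
        { "x-amz-date": "<timestamp>" }                // placeholder
    ]
};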

This is the front end code where I add the values of the presigned URL to the dropzone.js options.

$.ajax({
    type: "POST",
    contentType: "application/json",
    dataType: "json",
    url: api_endpoint,
    cache: false,
    success: function(data) {
        s3_filename_key = data.key;
        $this.options.params = {
            key: data.key,
            acl: data.acl,
            success_action_status: data.success_action_status,
            "X-Amz-Credential": data['X-Amz-Credential'],
            "X-Amz-Algorithm": data['X-Amz-Algorithm'],
            "X-Amz-Date": data['X-Amz-Date'],
            "Policy": data.Policy,
            "X-Amz-Signature": data['X-Amz-Signature']
        }
        done();
    },
    error: function(data) {}
});

When I add Content-Type to the dropzone options I get this response back: "Invalid according to Policy: Extra input fields: content-type".
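
That error is S3 rejecting a form field that the signed policy does not declare, rather than a Dropzone problem as such. Once the policy contains a $Content-Type condition like the one sketched above, the field can be added per file on the client. One way is Dropzone's sending event, which exposes the outgoing FormData; this is a sketch under those assumptions (the element id and presignedPostUrl are placeholders), not code from the original post:

var myDropzone = new Dropzone("#uploads", {
    url: presignedPostUrl,                 // the https://<bucket>.s3.amazonaws.com/ POST endpoint
    acceptedFiles: "image/*",
    init: function() {
        this.on("sending", function(file, xhr, formData) {
            // Store the browser-detected MIME type on the S3 object; the policy
            // must allow it, e.g. ["starts-with", "$Content-Type", "image/"].
            formData.append("Content-Type", file.type);
        });
    }
});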

Here is my CORS config for the bucket.

<?xml version="1.0" encoding="UTF-8"?>
<CORSConfiguration xmlns="http://s3.amazonaws.com/doc/2006-03-01/">
<CORSRule>
    <AllowedOrigin>*</AllowedOrigin>
    <AllowedMethod>HEAD</AllowedMethod>
    <AllowedMethod>PUT</AllowedMethod>
    <AllowedMethod>POST</AllowedMethod>
    <AllowedMethod>GET</AllowedMethod>
    <ExposeHeader>ETag</ExposeHeader>
    <ExposeHeader>Content-length</ExposeHeader>
    <AllowedHeader>*</AllowedHeader>
    <AllowedHeader>Content-*</AllowedHeader>
</CORSRule>
</CORSConfiguration>
Ced answered 16/11, 2017 at 20:58 Comment(2)
Did you ever resolve this? I'm facing the same problem. (Highfalutin)
In the dropzone options.params I add an extra Content-Type key, as sketched below. I also have an acceptedFiles list in the dropzone config, and [["starts-with", "$Content-Type", "image/"]] in the S3 conditions. (Fonseca)
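
Filling in that comment as code against the success callback above (the image/jpeg value is only illustrative; for mixed image types the Content-Type field would be set per file, e.g. with the sending event sketched earlier):

$this.options.acceptedFiles = "image/*";      // reject non-image files client-side
$this.options.params = {
    key: data.key,
    acl: data.acl,
    success_action_status: data.success_action_status,
    "Content-Type": "image/jpeg",             // must satisfy ["starts-with", "$Content-Type", "image/"]
    "X-Amz-Credential": data['X-Amz-Credential'],
    "X-Amz-Algorithm": data['X-Amz-Algorithm'],
    "X-Amz-Date": data['X-Amz-Date'],
    "Policy": data.Policy,
    "X-Amz-Signature": data['X-Amz-Signature']
};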

Try adding the correct contentType; what you defined is an enctype, not a content type. E.g.:

const policy = s3PolicyV4.generate({
    key: key,
    bucket: process.env.S3_BUCKET,
    contentType: 'application/json',
    region: process.env.REGION,
    accessKey: process.env.ACCESS_KEY_ID,
    secretKey: process.env.SECRET_ACCESS_KEY,
});

multipart/form-data is the enctype attribute of the form, not the content type of the file being stored.
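
To illustrate the distinction (a minimal sketch with the signed policy fields omitted, so it will not pass the policy check as-is; bucketUrl, key and file are placeholders): the browser sets the request's own Content-Type to multipart/form-data automatically, while the Content-Type form field is what S3 records on the object.

const fd = new FormData();
fd.append("key", key);                   // plus the policy and signature fields, omitted here
fd.append("Content-Type", file.type);    // e.g. "image/png"; becomes the object's stored content type
fd.append("file", file);                 // the file field must come last in an S3 POST
fetch(bucketUrl, { method: "POST", body: fd });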

Hoodwink answered 22/11, 2017 at 18:42 Comment(1)
I tried this for the PDF I was uploading, using application/pdf. (Ced)

We are using the solution below in Node.js, uploading server-side with the AWS SDK instead of posting directly from the browser.

// Assumes a hapi-style route handler: `reply` comes from the handler signature, and
// `buffer`, `UUID`, `decodedFileName`, `responseArray`, `count` and `fileCount`
// come from the surrounding upload loop.
const AWS = require('aws-sdk');
const Boom = require('boom');
const FileType = require('file-type');    // older file-type versions expose a synchronous API

const S3 = new AWS.S3();

// Detect the real MIME type and extension from the file contents.
const mimeobj = FileType(buffer);

const s3Options = {
    ACL: 'public-read',
    Body: buffer,
    ContentType: mimeobj.mime,            // e.g. image/png, so the browser can render the object
    Key: UUID + '.' + mimeobj.ext,
    Bucket: 'your-bucket-name'
};

S3.upload(s3Options).send((err, data) => {

    if (err) {
        return reply(Boom.badImplementation('Internal Server Error'));
    }

    const responseObj = {
        orgFileName: decodedFileName,     // original file name from the request
        name: UUID + '.' + mimeobj.ext,   // the name the object was stored under
        url: data.Location                // public URL of the uploaded object
    };

    responseArray.push(responseObj);

    count++;
    if (count === fileCount) {
        return reply({ files: responseArray });
    }
});
Birthplace answered 28/11, 2017 at 6:38 Comment(0)
