How to get an iterated md5 sum of every chunk, using the blueimp jQuery upload plugin

I need to calculate an iterated md5 hash for each chunk and send it to my upload API, but I don't know how.

I'm using the tutorial found here:

http://tutorialzine.com/2013/05/mini-ajax-file-upload-form/

Along with the blueimp jQuery upload plugin.

Sending only ONE FILE (file size smaller than the chunk size) works fine. But if a file is chunked, I have no idea how to catch each chunk to get its md5.

In the end I have to build the md5 iteratively, as described here:

https://code.google.com/p/crypto-js/#Progressive_Hashing
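
For reference, progressive hashing with crypto-js follows this pattern (a minimal sketch of the API described on the page linked above):

    // Progressive MD5: feed the data in piece by piece,
    // then finalize once to get the digest of everything fed in.
    var hasher = CryptoJS.algo.MD5.create();
    hasher.update("chunk 1 data");
    hasher.update("chunk 2 data");
    hasher.update("chunk 3 data");
    var digest = hasher.finalize().toString();

This is my current code: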

$('#upload').fileupload({

    // This element will accept file drag/drop uploading
    dropZone: $('#drop'),

    type        : GLOBAL_FORM_METHOD,
    method      : "post",   // Type of data-send-method
    dataType    : "json",   // Type of data to recieve from api-call
    maxChunkSize: GLOBAL_CHUNK_SIZE,
    multipart   : true,

    // This function is called when a file is added to the queue;
    // either via the browse button, or via drag/drop:
    add: function (e, data)
    {
        var reader = new FileReader();
        var file = data.files[0];
        var jqXHR;

        var tpl = $('<li class="working"><input type="text" value="0" data-width="48" data-height="48"'+
            ' data-fgColor="#0788a5" data-readOnly="1" data-bgColor="#3e4043" /><p></p><div class="msg"></div><span></span></li>');

        // Append the file name and file size
        tpl.find('p')
            .text( file.name )
            .append('<i>' + formatFileSize( file.size ) + '</i>');

        // Add the HTML to the UL element
        data.context = tpl.appendTo(ul);

        // Initialize the knob plugin
        tpl.find('input').knob();

        // Listen for clicks on the cancel icon
        tpl.find('span').click(function()
        {
            if( tpl.hasClass('working') )
            {
                jqXHR.abort();
            }

            tpl.fadeOut( function()
            {
                tpl.remove();
            });
        });

        // Prevent XHR from sending data in "multipart/formData"
        data.postMessage = data.files[0].type;
        data.contentType = data.files[0].type;

        var chunksize = GLOBAL_CHUNK_SIZE > file.size ? file.size : GLOBAL_CHUNK_SIZE;

        // Describe the FileReader-DataLoad-Event
        reader.onload = function( event ) 
        {
            var binary = event.target.result;
            var md5 = CryptoJS.MD5(binary).toString();

            data.url += "&md5sum=" + md5;

            // D A T A   S E N D 
            jqXHR = data.submit();
        };

        // ADD url to XHR-object
        data.url = GLOBAL_FORM_ACTION;
        data.url += "?etf_id=" + GLOBAL_FOLDER_ID;
        data.url += "&file_title=" + file.name;

        // If the file will be sent in one piece...
        if( GLOBAL_CHUNK_SIZE > file.size )
        {
            // ADD url-parameter to XHR-object
            data.url += "&size_chunk_start=" + 0;
            data.url += "&size_chunk_length=" + chunksize;
        }
        // The chunk-related parameters must be added in the "beforeSend"
        // callback, because the chunk size data is undefined at this point
        // but available there.

        // ADD url-parameter to XHR-object
        data.url += "&size_final=" + file.size;

        // Read md5-sum and send the file / chunk...
        // On multipart "file" is a chunk !
        reader.readAsBinaryString( file );
    },

    beforeSend : function(e, data)
    {
        var file = data.files[0];

        this.find(".msg").hide();

        // If the file will be sent as chunks...
        if( GLOBAL_CHUNK_SIZE < file.size )
        {
            console.log( "Chunk data: ", data.uploadedBytes, data.chunkSize, file.size, data );

            // ADD url-parameter to XHR-object
            data.url += "&size_chunk_start=" + data.uploadedBytes;
            data.url += "&size_chunk_length=" + data.chunkSize;

            if( typeof this.attr('session_id') !== "undefined" )
                data.url += "&session_id=" + this.attr( 'session_id' );
        }

    }
});

I hope you can help, so that I can solve this myself.

Jinajingle answered 17/11, 2015 at 16:5 Comment(7)
Looking for similar functionality. Daniel, did you find any solution for this?Littoral
Yes, I've written a completely different uploader script.Jinajingle
@Jinajingle just a suggestion: you could answer your own question with your new code, so others will benefit from it ;)Unilobed
Sorry, I can't. In this case I'm restricted by my company. :-(Jinajingle
Please, close it then. Otherwise it will be forever in the "unanswered questions" list.Chiton
How? I cannot find any way to do that.Jinajingle
@Jinajingle There should be a "delete" option right under your post next to "edit".Zsa

The answer linked below, to a similar question, offers a method to accomplish this. There are also libraries that will do this for you, like CryptoJS.

The gist is that you pad your file with zero bytes to reach a length that is divisible by the chunk length, then read the data into a buffer chunk by chunk and hash that data, updating the hash with each chunk you read (see the sketch below).

MD5 Hash a large file incrementally
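
One way to do this in the browser, sticking with crypto-js, is to read the file slice by slice with a FileReader and feed each slice into a progressive MD5 hasher. The sketch below only illustrates the idea and is not the exact code from the linked answer; the function name is made up, and it assumes a crypto-js build where CryptoJS.lib.WordArray.create accepts an ArrayBuffer:

    // Incrementally md5-hash a File by reading it slice by slice
    // and updating a progressive CryptoJS hasher with each slice.
    function hashFileIncrementally(file, sliceSize, onDone) {
        var hasher = CryptoJS.algo.MD5.create();
        var offset = 0;
        var reader = new FileReader();

        reader.onload = function (event) {
            // event.target.result is the ArrayBuffer for the current slice.
            hasher.update(CryptoJS.lib.WordArray.create(event.target.result));
            offset += sliceSize;
            if (offset < file.size) {
                readNextSlice();
            } else {
                // All slices consumed: finalize to get the md5 of the whole file.
                onDone(hasher.finalize().toString());
            }
        };

        function readNextSlice() {
            reader.readAsArrayBuffer(file.slice(offset, offset + sliceSize));
        }

        readNextSlice();
    }

    // Illustrative usage:
    // hashFileIncrementally(someFile, 2 * 1024 * 1024, function (md5) {
    //     console.log(md5);
    // });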

Sabbat answered 21/11, 2019 at 14:31 Comment(0)
