AWS S3 batch upload from localhost PHP error
I am trying to batch/bulk upload from localhost (XAMPP) to my S3 bucket.
It works for about 6 items, then I get an error message:

According to http://curl.haxx.se/libcurl/c/libcurl-errors.html, the cURL error means "Failed sending network data."

Fatal error: Uncaught exception 'cURL_Multi_Exception' with message 'cURL resource: Resource id #34; cURL error: SSL_write() returned SYSCALL, errno = 0 (cURL error code 55). See http://curl.haxx.se/libcurl/c/libcurl-errors.html for an explanation of error codes.' in D:\xampp\htdocs\path\to\my\files\sdk-1.5.14\lib\requestcore\requestcore.class.php on line 902

and

cURL_Multi_Exception: cURL resource: Resource id #34; cURL error: SSL_write() returned SYSCALL, errno = 0 (cURL error code 55). See http://curl.haxx.se/libcurl/c/libcurl-errors.html for an explanation of error codes. in D:\xampp\htdocs\path\to\my\files\sdk-1.5.14\lib\requestcore\requestcore.class.php on line 902

Here's my PHP. It gets a list of images from a directory, and from that loop I want to batch upload those items to S3.

require_once('sdk-1.5.14/sdk.class.php');
$s3 = new AmazonS3();
//$s3->disable_ssl_verification(); // this didn't fix it

$folder = "./../"; 
$handle = opendir($folder); 

# Build an array of files in the directory to upload to S3
$files = array();
while (false !== ($file = readdir($handle))) 
{ 
        $files[] = $file; 
} 
closedir($handle);

foreach ($files as $file) { 
        $path_parts = pathinfo($file);
        if (isset($path_parts['extension']) && $path_parts['extension'] != '') {

                // local path
                $fileTempName = "D:/xampp/htdocs/path/to/my/files/";

                // queue the upload on the batch
                $response = $s3->batch()->create_object('bucketname', "tempdirectory/" . $file, array(
                        'fileUpload' => fopen($fileTempName . $file, 'r'),
                        'acl' => AmazonS3::ACL_PUBLIC
                ));

        }

}
$s3->batch()->send();

Update: after making changes to config.inc.php I am now getting these error messages:

Fatal error: Uncaught exception 'cURL_Multi_Exception' with message 'cURL resource: Resource id #149; cURL error: Failed connect to mybucket.s3.amazonaws.com:443; No error (cURL error code 7). See http://curl.haxx.se/libcurl/c/libcurl-errors.html for an explanation of error codes.' in D:\xampp\htdocs\sdk-1.5.14\lib\requestcore\requestcore.class.php on line 902

cURL_Multi_Exception: cURL resource: Resource id #149; cURL error: Failed connect to prayerbucket.s3.amazonaws.com:443; No error (cURL error code 7). See http://curl.haxx.se/libcurl/c/libcurl-errors.html for an explanation of error codes. in D:\xampp\htdocs\sdk-1.5.14\lib\requestcore\requestcore.class.php on line 902

Tella answered 3/9, 2012 at 20:43 Comment(11)
As far as I know, you can't test Amazon AWS from localhost. I had to upload my files to my host and try it live.Donalddonaldson
@Donalddonaldson I see, have you heard of any workarounds?Tella
This is all I know about it: forums.aws.amazon.com/message.jspa?messageID=215107Donalddonaldson
I get these errors too; I tried setting the batch queue limit to 2.Clareclarence
@Clareclarence OK, where do I set this?Tella
I created it this way: $batch = new CFBatchRequest(2); $s3->batch($batch); But then you need to call $s3->batch()->send() many times, and that batch send is not good :(Clareclarence
And the SDK is now 1.5.15 (Panther).Clareclarence
@Clareclarence I don't get the error, but it does not upload to S3: $response1 = $s3->batch($batch)->create_object('mybucket', "path/to/my/" . $file, array( 'fileUpload' => fopen($fileuserName1 . $file, 'r'), 'contentType' => 'image/' . $extension, 'acl' => AmazonS3::ACL_PUBLIC ));Tella
@Clareclarence it worked; the fix was the placement of $s3->batch()->send(); it needs to go inside the loop with $s3->batch($batch)->create_object(). Please rephrase this and I will accept it as the answer.Tella
Sorry, but if you send the batch inside the loop, you don't need a batch :) just send each object normally. Or do you mean a separate loop that only runs the batch?Clareclarence
@tq, to explain how batch works with the Amazon SDK: multi-cURL depends on the speed of your outgoing connection. If you have less than 1 Mb and you are not sending big files, 2 instances is good.Clareclarence
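For reference, here is a minimal sketch of the fix described in the comments above: flush the batch queue inside the loop instead of queueing everything at once. It assumes $files and $fileTempName are set up as in the question, and the group size of 3 is an arbitrary example value:

require_once('sdk-1.5.14/sdk.class.php');

$s3 = new AmazonS3();
$queued = 0;

foreach ($files as $file) {
        $path_parts = pathinfo($file);
        if (isset($path_parts['extension']) && $path_parts['extension'] != '') {

                // queue the upload on the batch
                $s3->batch()->create_object('bucketname', "tempdirectory/" . $file, array(
                        'fileUpload' => fopen($fileTempName . $file, 'r'),
                        'acl' => AmazonS3::ACL_PUBLIC
                ));
                $queued++;

                // send the queued requests in small groups so cURL never
                // has to keep many SSL connections open at the same time
                if ($queued % 3 == 0) {
                        $s3->batch()->send();
                }
        }
}

// send whatever is still left in the queue
$s3->batch()->send();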
Try setting a limit for the batch:

$batch = new CFBatchRequest(2); // only two instances at once

foreach ($files as $file) { 
        $path_parts = pathinfo($file);
        if (isset($path_parts['extension']) && $path_parts['extension'] != '') {

                // local path
                $fileTempName = "D:/xampp/htdocs/path/to/my/files/";

                // in batch mode, create_object() queues the request and
                // returns the underlying cURL resource instead of sending
                $curl_handler = $s3->batch($batch)->create_object('bucketname', "tempdirectory/" . $file, array(
                        'fileUpload' => fopen($fileTempName . $file, 'r'),
                        'acl' => AmazonS3::ACL_PUBLIC
                ));

        }

}
// send the batch object's queue
$batch->send();
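The limit matters because batch mode hands every queued request to curl_multi; capping it at 2 keeps at most two simultaneous SSL connections open, which is easier on a slow outgoing connection (see the comments above).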
Clareclarence answered 3/10, 2012 at 4:33 Comment(0)
Try setting 'certificate_authority' to true in config.inc.php.
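A sketch of the relevant part of config.inc.php, following the layout of the SDK's config-sample.inc.php; the key and secret values are placeholders:

CFCredentials::set(array(
        'development' => array(
                'key' => 'your-aws-access-key',    // placeholder
                'secret' => 'your-aws-secret-key', // placeholder
                'default_cache_config' => '',
                // true tells cURL to verify SSL peers against the CA
                // bundle that ships with the SDK instead of relying on
                // the system bundle, which XAMPP installs often lack
                'certificate_authority' => true
        ),
        '@default' => 'development'
));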

Apophasis answered 27/9, 2012 at 4:18 Comment(0)
