FileZilla "error while writing: failure"
I'm transferring a very large (35 GB) file over SFTP with FileZilla.

The transfer reached 59.7%, but I keep getting the error below, and that percentage hasn't moved for hours.

Error:     File transfer failed after transferring 1,048,576 bytes in 10 seconds
Status:    Starting upload of C:\Files\static.sql.gz
Status:    Retrieving directory listing...
Command:   ls
Status:    Listing directory /var/www/vhosts/site/httpdocs
Command:   reput "C:\Files\static.sql.gz" "static.sql.gz"
Status:    reput: restarting at file position 20450758656
Status:    local:C:\Files\static.sql.gz => remote:/var/www/vhosts/site/httpdocs/static.sql.gz
Error:     error while writing: failure

Why do I keep getting this error?

Centroclinal answered 17/11, 2010 at 16:32 Comment(2)
Is the remote volume out of free space? – Houseleek
Check disk space here – Cunctation

Credit to cdhowie: The remote volume was out of space.

Gardy answered 25/1, 2011 at 19:11 Comment(3)
This is huge... I was going out of my mind before I thought to contact the hardware department and see how much space they had reserved for the VM I was uploading to. – Coppola
You're a lifesaver – Necrolatry
Made my day! :D – Tideway

I ran into the same situation. Log in to your server and run the df command to check whether the disk is out of space.
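As a minimal sketch on a typical Linux host, df -h prints per-filesystem usage in human-readable units; a Use% at or near 100% on the volume holding the upload target matches the "error while writing: failure" symptom:

```shell
# Show human-readable usage for every mounted filesystem.
# A Use% at or near 100% on the volume that holds the upload
# target explains the write failure.
df -h

# To narrow the check to the directory FileZilla writes to, pass
# that path as an argument, e.g.:
# df -h /var/www/vhosts/site/httpdocs
```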

Serration answered 10/1, 2013 at 9:34 Comment(0)

I recently faced this issue, and it turned out to be disk space. I removed some old logs, especially a mysqld.log file that had grown to several GB, and the transfer worked after that.
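If a log such as mysqld.log can be discarded, a sketch of reclaiming the space (path assumed from the answer above): truncating in place frees the blocks immediately, whereas deleting the file while the daemon still holds it open would not release the space until the daemon restarts.

```shell
# Check the size first, then empty the log in place rather than
# deleting it out from under the running daemon.
ls -lh /var/log/mysqld.log
sudo truncate -s 0 /var/log/mysqld.log
```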

Aldred answered 9/11, 2020 at 16:47 Comment(0)

The FileZilla "error while writing: failure" message occurs when the server's storage is full. Log in to the Linux server and run the two commands below to find which files under /var/log are consuming the most space.

For MB-sized files:

sudo du -csh $(sudo find /var/log -type f) | grep M | sort -nr

For GB-sized files:

sudo du -csh $(sudo find /var/log -type f) | grep G | sort -nr
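Note that filtering with grep M or grep G can also match any path containing those letters. A simpler sketch that sorts human-readable sizes directly, assuming GNU coreutils (sort -h):

```shell
# List the 20 largest files and directories under /var/log,
# largest first; sort -h understands the K/M/G suffixes du emits.
sudo du -ah /var/log 2>/dev/null | sort -rh | head -n 20
```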
Spitler answered 20/7, 2021 at 12:39 Comment(0)

In our case it was because the file exceeded the user's quota. We use Virtualmin and the virtual server had a default quota of just 1GB. Increasing that value in Virtualmin solved the problem.
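Outside Virtualmin, a sketch of checking per-user limits on a quota-enabled filesystem (the mount point below is an assumption for illustration): a hard block limit that is already reached makes writes fail even when df shows free space on the volume.

```shell
# Show the current user's disk quota in human-readable units.
quota -s

# Administrators can review all users on a quota-enabled
# filesystem (mount point is a placeholder):
# sudo repquota -s /home
```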

Vivyanne answered 25/4, 2022 at 21:17 Comment(0)

This happened to me when I tried to replace a file that was already open or running in the background. Once it was closed, I was able to overwrite the file.

Lambent answered 18/12, 2020 at 6:31 Comment(0)

© 2022 - 2024 — McMap. All rights reserved.