An error occurred (404) when calling the HeadObject operation: Key "" does not exist

6

12

I am using the terminal to copy a file from an s3 bucket to my local machine but I keep getting the error:

fatal error: An error occurred (404) when calling the HeadObject operation: Key "file_000" does not exist 

I am using the command:

usr/local/bin/aws s3 cp s3://{bucket}/file_000 /Users/user/Documents/Docs/dir/new_file.csv

I know the file exists using:

aws s3 ls s3://{bucket} --recursive --human-readable

and shows up as:

2022-08-04 15:53:12 21.2 MiB file_000

I have tried adding --recursive to the end of the command. The command goes through, but it creates an empty directory named new_file.csv.

Is there anything I can do to solve this?

Rectory answered 5/8, 2022 at 14:15 Comment(4)
How did you list the files? – Vellavelleity
I added it into the post. – Rectory
Try using aws s3api list-objects --bucket example-bucket --query Contents[].Key to verify there are no special characters; for instance, keep an eye out for trailing spaces in the key name. – Juni
@AnonCoward this helped me see my file was actually listed as " file_000". Thanks! – Rectory
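The check suggested in the comment above can be sketched in Python. This is a minimal, hypothetical helper (the helper name and sample list are illustrative; in practice the keys would come from aws s3api list-objects --query Contents[].Key):

    # Keys whose stripped form differs from the original contain hidden
    # leading/trailing whitespace, e.g. " file_000" instead of "file_000".
    def suspicious_keys(keys):
        return [k for k in keys if k != k.strip()]

    sample = [" file_000", "file_001", "file_002 "]
    print(suspicious_keys(sample))  # → [' file_000', 'file_002 ']

Copying such an object requires quoting the exact key, including the stray whitespace.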
11

I ran into this same problem. What worked for me was going back to the S3 bucket and confirming that the object name in the bucket was the same as the one in my code. It wasn't, and once I corrected it and re-ran the program everything worked fine.

Ossified answered 26/10, 2022 at 19:39 Comment(1)
I just wanted to know: if I am getting a 404 like this, does it also cost anything? – Intermezzo
5

For anyone who runs into this problem later: it can also happen when you are trying to copy files from a folder.

Imagine I have a large folder with thousands of files, and I want to copy just a small number of them to another folder. If I have the S3 path s3://object1/object2/, where object2 is a folder, I need to add the --recursive flag to my request.

The full CLI command would be:

aws s3 cp "s3://object1/object2/" "s3://object1/object3/" --recursive --exclude "*" --include "*.jpg"

P.S. This is not the only solution to the "Key '' does not exist" problem, and you might want to debug it further: https://repost.aws/knowledge-center/404-error-nosuchkey-s3
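The --exclude "*" --include "*.jpg" pair in the command above first drops every key, then adds back the ones matching the pattern. A rough Python illustration of that selection logic using fnmatch (an approximation of the CLI's filtering, not its exact implementation; the key list is made up):

    import fnmatch

    keys = ["photos/a.jpg", "photos/b.png", "photos/c.jpg"]
    # --exclude "*" removes everything; --include "*.jpg" re-adds matches.
    selected = [k for k in keys if fnmatch.fnmatch(k, "*.jpg")]
    print(selected)  # → ['photos/a.jpg', 'photos/c.jpg']

Note that the order of --exclude and --include matters to the CLI: later filters override earlier ones.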

Gorse answered 21/12, 2022 at 10:44 Comment(0)
2

I came across this issue when trying to access a file from S3; it turned out I was requesting a file that did not exist! Hope it helps.

Silverweed answered 12/2, 2023 at 8:50 Comment(0)
0

I came across this issue. I tried switching the user from root to ec2-user, and it worked for me.

Acquah answered 24/8, 2023 at 6:30 Comment(1)
Your answer could be improved with additional supporting information. Please edit to add further details, such as citations or documentation, so that others can confirm that your answer is correct. You can find more information on how to write good answers in the help center. – Merow
0

For Python users using boto3, this error arises when calling s3.head_object. In my case I was also using s3.get_object, which raises a NoSuchKey error instead. If you want to handle both at once, you can use or adapt the code below.

from botocore.exceptions import ClientError
import boto3

s3 = boto3.client('s3')

try:
    s3.head_object(Bucket=bucket_name, Key=file_key)
    s3.get_object(Bucket=bucket_name, Key=file_key)
except ClientError as e:
    # head_object reports a plain '404'; get_object reports 'NoSuchKey'
    if e.response['Error']['Code'] in ('404', 'NoSuchKey'):
        pass  # missing-file logic goes here
    else:
        raise  # re-raise anything unexpected
Langelo answered 13/12, 2023 at 5:37 Comment(0)
C
0

In the context of the error, here are a few more points to check:

  1. If the file is inside a folder in the S3 bucket, the key must include the folder name: Key="<folder_name>/<file_name>"

  2. Check that your file name matches exactly. For example, if your file name contains spaces and you are retrieving it dynamically, the spaces might be replaced by '+': "file name.txt" -> "file+name.txt"
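The '+' substitution in point 2 is URL encoding, which is common when the key comes from an S3 event notification. A minimal sketch of decoding it with the standard library (the sample key is made up):

    from urllib.parse import unquote_plus

    raw_key = "my+folder/file+name.txt"  # hypothetical key from an S3 event
    key = unquote_plus(raw_key)
    print(key)  # → my folder/file name.txt

Passing the decoded key to head_object/get_object avoids the 404 for keys that really contain spaces.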

Corrasion answered 20/8 at 18:5 Comment(0)
