The other answers work fine if you are using the app's local filesystem to store your files. But if you are using boto3 to upload to something like AWS S3, and you want to point your model's FileField at a file that already exists in an S3 bucket, this is what you need.
We have a simple model class with a FileField:
from django.contrib.auth import get_user_model
from django.db import models

class Image(models.Model):
    img = models.FileField()
    owner = models.ForeignKey(get_user_model(), on_delete=models.CASCADE, related_name='images')
    date_added = models.DateTimeField(auto_now_add=True)  # set once, when the row is created
    date_modified = models.DateTimeField(auto_now=True)   # refreshed on every save
import os

import boto3
from botocore.exceptions import ClientError
from django.contrib.auth import get_user_model

s3 = boto3.client(
    's3',
    aws_access_key_id=os.getenv("AWS_ACCESS_KEY_ID"),
    aws_secret_access_key=os.getenv("AWS_SECRET_ACCESS_KEY"),
)

# S3_DIR, filename, local_file_path and owner_id are assumed to be
# defined by the surrounding code
s3_key = S3_DIR + '/' + filename
bucket_name = os.getenv("AWS_STORAGE_BUCKET_NAME")

try:
    s3.upload_file(local_file_path, bucket_name, s3_key)
    # once the s3 upload is complete, record the file in our Image model
    image_data = Image()
    image_data.img.name = s3_key  # this does it!
    image_data.owner = get_user_model().objects.get(id=owner_id)
    image_data.save()
except ClientError as e:
    print(f"failed uploading to s3: {e}")
Setting the S3 key as the name of the FileField does the trick. As far as I have tested, everything related works as expected, e.g. previewing the image file in the Django admin; fetching the images from the db also prepends the S3 bucket root (or the CloudFront CDN prefix) to the files' S3 keys. Of course, this assumes you already have a working boto/S3 setup in Django's settings.py.
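For reference, here is a minimal sketch of what such a settings.py setup might look like, assuming you use the django-storages package's S3Boto3Storage backend (the original answer does not name its storage backend, so treat this as one possible configuration). The environment variable names match the upload snippet above; AWS_S3_CUSTOM_DOMAIN and AWS_CLOUDFRONT_DOMAIN are assumptions for the CloudFront case.

# settings.py -- a minimal sketch, assuming django-storages is installed
import os

AWS_ACCESS_KEY_ID = os.getenv("AWS_ACCESS_KEY_ID")
AWS_SECRET_ACCESS_KEY = os.getenv("AWS_SECRET_ACCESS_KEY")
AWS_STORAGE_BUCKET_NAME = os.getenv("AWS_STORAGE_BUCKET_NAME")

# optional: serve files through a CloudFront distribution instead of
# the raw bucket URL (env var name is a hypothetical choice)
# AWS_S3_CUSTOM_DOMAIN = os.getenv("AWS_CLOUDFRONT_DOMAIN")

# route Django's default file storage to S3
DEFAULT_FILE_STORAGE = 'storages.backends.s3boto3.S3Boto3Storage'

With a storage backend like that in place, the FileField builds the full URL from the stored key:

image = Image.objects.first()
print(image.img.name)  # the S3 key we stored, e.g. '<S3_DIR>/<filename>'
print(image.img.url)   # full bucket (or CDN) URL built by the storage backend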
Whenever a FileField is saved, a new copy of the file is created. It would be fairly straightforward to add an option to avoid this. – Provitamin
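That comment is exactly why assigning to img.name works so well here. A hypothetical sketch contrasting the two approaches (some_owner, data and the 'uploads/photo.jpg' key are made-up names for illustration):

from django.core.files.base import ContentFile

# approach 1: FieldFile.save() pushes the content through the storage
# backend, i.e. it uploads a fresh copy of the file into the bucket
image = Image(owner=some_owner)
image.img.save('uploads/photo.jpg', ContentFile(data))  # re-uploads the bytes

# approach 2: assigning the existing S3 key to .name skips the storage
# write entirely; only the db row is touched on save()
image = Image(owner=some_owner)
image.img.name = 'uploads/photo.jpg'  # file is already in the bucket
image.save()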