Setting limits on file uploads via Firebase Auth and Storage without a server in the middle?

I'm learning about Firebase Auth and Storage in a web app. My idea is to ask users to log in via Firebase and then upload an image.

I can see that this is possible with Firebase Auth and Storage. However, I would like to put limits on the number and size of the files they can upload.

Is it possible to control uploads within the Firebase console (or somewhere else)? After reviewing the JavaScript examples, I see how I can upload files, and I can imagine writing code that queries Firebase for a user's upload count and enforces the limit on the client side, but of course that is completely insecure.

If I hosted this as a single-page app on, say, GitHub Pages, could I set these limits without involving a server? Or do I need to proxy my uploads through a server to make sure users never upload more than I intend them to?

Cowgirl answered 9/11, 2016 at 17:50 Comment(1)
There are lots of tutorials and answers I've seen describing totally insecure methods, and the authors even assert that it's secure :(Roil

You can limit what a user can upload through Firebase Storage's security rules.

For example, this (from the linked docs) is a way to limit the size of uploaded files:

service firebase.storage {
  match /b/<your-firebase-storage-bucket>/o {
    match /images/{imageId} {
      // Only allow uploads of any image file that's less than 5MB
      allow write: if request.resource.size < 5 * 1024 * 1024
                   && request.resource.contentType.matches('image/.*');
    }
  }
}

But there is currently no way in these rules to limit the number of files a user can upload.

One approach that comes to mind would be to use fixed file names for that. For example, if you limit the allowed file names to be numbered 1..5, the user can only ever have five files in storage:

match /public/{userId}/{imageId} {
  allow write: if imageId.matches("[1-5]\\.txt");
}
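
On the client, an upload then just has to target one of those fixed names. Here is a minimal sketch with the modular Firebase JS SDK; the public/{uid}/{slot}.txt path matches the rule above, but the slot-picking helper and its name are my own illustration, not part of the original answer:

import { getAuth } from "firebase/auth";
import { getStorage, ref, uploadBytes } from "firebase/storage";

// Upload `file` into one of the five allowed slots (1.txt .. 5.txt).
// How the app decides which slot to use is up to you; the caller passes it in.
async function uploadToSlot(file, slot) {
  if (slot < 1 || slot > 5) throw new Error("Only slots 1-5 are allowed");
  const uid = getAuth().currentUser.uid;
  const slotRef = ref(getStorage(), `public/${uid}/${slot}.txt`);
  // If the rule rejects the write (wrong name), uploadBytes rejects
  // with a storage/unauthorized error.
  return uploadBytes(slotRef, file);
}

Note that on its own this caps the file count, not the total storage, so you would typically combine it with a size check like the one above.
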
Letendre answered 10/11, 2016 at 8:23 Comment(2)
This filenames hack is the best I've seen. You could instead size up files using Cloud Functions to catch rulebreakers slightly after the fact, but that's more complicated and fragile.Roil
Using cloud functions, firestore and auth claims, it can be done. See my answer below.Mondragon

If you need per-user storage validation, the solution is a little bit trickier, but it can be done.

P.S.: You will need to generate a Firebase token with Cloud Functions, but the server won't be in the middle for the upload.

https://medium.com/@felipepastoree/per-user-storage-limit-validation-with-firebase-19ab3341492d
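
If it helps, the token-minting step might look roughly like this in a callable Cloud Function. The idea of carrying a storage-related claim in the token is my reading of the approach, and the claim name and quota value are purely illustrative; see the article for the actual implementation:

const functions = require("firebase-functions");
const admin = require("firebase-admin");
admin.initializeApp();

exports.mintUploadToken = functions.https.onCall(async (data, context) => {
  if (!context.auth) {
    throw new functions.https.HttpsError("unauthenticated", "Sign in first");
  }
  // Hypothetical claim: how much storage this user is still allowed to use.
  // The client signs in with this custom token, and Storage rules can then
  // read the claim from request.auth.token.
  const remainingBytes = 50 * 1024 * 1024;
  return admin.auth().createCustomToken(context.auth.uid, { remainingBytes });
});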

Criticism answered 16/8, 2017 at 11:0 Comment(0)

One solution may be to use the Admin SDK to change the Storage rules based on a Firestore document holding the upload count per day.

Say you have a Firestore document at userUploads/uid with the fields uploadedFiles: 0 and lastUploadedOn.

Now, once the user uploads a file to Firebase Storage (assuming it is within limits and there are no errors), you can trigger a Cloud Function that reads the userUploads/uid document and checks whether lastUploadedOn is from an earlier date than the current upload. If it is, set uploadedFiles to 1 and lastUploadedOn to the upload time; otherwise, increment uploadedFiles and set lastUploadedOn to the current time. Once uploadedFiles reaches 10 (your limit), you can change the Storage rules using the Admin SDK. See example here. Then reset the count to 0 in the userUploads/uid document.

However, there is a little caveat. The rules change can take some time to apply, and there should be no legitimate work in flight that depends on that rule during the deployment window. From the Admin SDK docs:

Firebase security rules take a period of several minutes to fully deploy. When using the Admin SDK to deploy rules, make sure to avoid race conditions in which your app immediately relies on rules whose deployment is not yet complete

I haven't tried this myself, but it looks like it should work. On second thought, though, changing the rules back to allow writes could be complicated. If the user uploads on the next day (after the rules have been changed), the upload error handler could trigger another Cloud Function that checks whether the request is legitimate, changes the rules back to normal, and retries the upload after some time, but that would be a very bad user experience. On the other hand, if you use a scheduled Cloud Function to check every userUploads/uid document daily and reset the values, it could be costly (~$18 per million users per month at $0.06/100K reads), it gets complicated when users are in different time zones, and it is wasted work for users who don't upload that frequently. Furthermore, rules have limits:

  • Rules must be smaller than 64 KiB of UTF-8 encoded text when serialized
  • A project can have at most 2500 total deployed rulesets. Once this limit is reached, you must delete some old rulesets before creating new ones.

So per-user rules for a large user base can easily hit these limits (on top of your other rules).

Perhaps the optimal solution is to use auth claims. Start with a rule that denies writes if the user has a particular auth claim (say canUpload: false). Then, in a Cloud Function triggered on upload, attach this claim once the user reaches the limit. This is effectively real-time, since it blocks the user immediately rather than after the Admin SDK rules deployment delay; a sketch follows the notes below.

To remove the auth claim:

  1. In the upload error handler, call another Cloud Function that checks whether lastUploadedOn is from an earlier day and, if so, removes the claim
  2. Call a separate Cloud Function before each upload that checks whether the user has the claim and lastUploadedOn is from an earlier day, and removes the claim if so
  3. Additionally, check at login and remove the claim if lastUploadedOn is earlier than today; this is less efficient than option 2, since it adds an unnecessary Firestore read even when the user isn't uploading anything

With option 2, if the client skips the call while holding the auth claim, they simply cannot upload, because the security rule blocks them. If they have no claim, they go through the normal process.

Note: Changing auth claims needs to be pushed to the client. See this doc.
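
Here is a minimal sketch of the claim-setting trigger, assuming a limit of 10 uploads per day, a userUploads/{uid} Firestore document as described above, and uploads stored under public/{uid}/...; the paths, field values, and the UTC-based day comparison are illustrative choices, not the only way to do it:

const functions = require("firebase-functions");
const admin = require("firebase-admin");
admin.initializeApp();

exports.trackUpload = functions.storage.object().onFinalize(async (object) => {
  // Assumes uploads land at public/{uid}/{fileName}
  const [root, uid] = (object.name || "").split("/");
  if (root !== "public" || !uid) return;

  const docRef = admin.firestore().doc(`userUploads/${uid}`);
  const count = await admin.firestore().runTransaction(async (tx) => {
    const snap = await tx.get(docRef);
    const data = snap.exists ? snap.data() : {};
    const today = new Date().toISOString().slice(0, 10); // UTC calendar day
    const uploadedFiles =
      data.lastUploadedOn === today ? (data.uploadedFiles || 0) + 1 : 1;
    tx.set(docRef, { uploadedFiles, lastUploadedOn: today });
    return uploadedFiles;
  });

  if (count >= 10) {
    // Block further uploads via a custom claim that the Storage rules check
    // (note: this overwrites any existing custom claims on the user).
    await admin.auth().setCustomUserClaims(uid, { canUpload: false });
  }
});

The Storage rule would then require something like request.auth.token.canUpload != false on writes, and the client has to refresh its ID token (for example with getIdToken(true)) before the new claim takes effect, which is what the note above is about.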

Mondragon answered 11/4, 2020 at 2:11 Comment(0)

Following the filename hack Frank gave us, I think we can improve on it to make it more flexible.

For example, in my case I don't want to put a hard limit on user uploads like "you can upload up to 50 files, ever", but rather "you're allowed to upload up to 20 files per day".

I just had this idea and will work on the implementation soon enough, but here goes:

Following the same logic, we can allow only filenames like 1-07252022, 2-07252022, etc.

And since Firebase rules give us some string and timestamp methods, I think we can achieve this per-day upload limit using Storage rules alone, without custom user claims or any Cloud Function.

In my case, though, I only want to allow uploads from paying customers, so I would also need a custom claim on the user's token.

I'll edit this answer when I work on the code snippet, but if anyone is struggling, here you have the idea.
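
In the meantime, the client-side naming could look something like this (a sketch only; the index still has to be tracked by the app, and a matching Storage rule has to check both the index range and the date part against request.time):

// Build a file name like "3-07252022" for today's 3rd upload.
function dailyFileName(index) {
  const now = new Date();
  const mm = String(now.getMonth() + 1).padStart(2, "0");
  const dd = String(now.getDate()).padStart(2, "0");
  return `${index}-${mm}${dd}${now.getFullYear()}`;
}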

Prude answered 25/7, 2022 at 14:40 Comment(0)

One way to limit the number of files (or the storage size) a user can upload is to use signed URLs. You would need a server (Cloud Functions) to generate the signed URLs, but then you can upload large files directly to Cloud Storage without streaming them through the server. The flow would be:

  1. Send the file names and sizes to your server in the request body.
  2. Generate a signed URL for each file and set Content-Length equal to the size of the file, so the user can only upload a file of that size using the URL.
  3. Update the user's storage usage in a database like Firestore.
  4. Upload the files to Cloud Storage using the signed URLs received from the server.

You just need to ensure the user has enough storage available by checking their Firestore document before generating the signed URLs. If not, you can return an error like:

// storageUsed and storageLimit come from the user's Firestore document;
// size is the total size of the files the client wants to upload.
if (storageUsed + size > storageLimit) {
  throw new functions.https.HttpsError(
    "failed-precondition",
    "Not enough storage available"
  );
}
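
Putting steps 1-3 together, a callable Cloud Function could look roughly like this. This is a sketch based on my reading of the linked answer: the users/{uid} document with storageUsed and storageLimit fields, the uploads/... path, and enforcing the size via a signed Content-Length extension header are assumptions you should verify for your setup:

const functions = require("firebase-functions");
const admin = require("firebase-admin");
admin.initializeApp();

exports.getUploadUrl = functions.https.onCall(async (data, context) => {
  if (!context.auth) {
    throw new functions.https.HttpsError("unauthenticated", "Sign in first");
  }
  const { fileName, size } = data;
  const uid = context.auth.uid;

  const userRef = admin.firestore().doc(`users/${uid}`);
  const snap = await userRef.get();
  const { storageUsed = 0, storageLimit = 0 } = snap.data() || {};

  if (storageUsed + size > storageLimit) {
    throw new functions.https.HttpsError(
      "failed-precondition",
      "Not enough storage available"
    );
  }

  // Reserve the space, then sign a V4 URL that is only valid for a PUT
  // request carrying exactly this Content-Length.
  await userRef.set({ storageUsed: storageUsed + size }, { merge: true });

  const [url] = await admin
    .storage()
    .bucket()
    .file(`uploads/${uid}/${fileName}`)
    .getSignedUrl({
      version: "v4",
      action: "write",
      expires: Date.now() + 15 * 60 * 1000, // 15 minutes
      extensionHeaders: { "content-length": size },
    });

  return { url };
});

The client then PUTs the file to the returned URL with that same Content-Length header; a request with a different size should be rejected because it no longer matches the signature.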

Check out How to set maximum storage size limit per user in Google Cloud Storage? for a detailed explanation and code snippets.

Vickeyvicki answered 29/10, 2022 at 15:57 Comment(0)

Here is my version of Rafael's answer above, intended to restrict a particular user to 20 image uploads per calendar month.

However, a flaw with this version is that you can upload both 00.png and 00.jpg; since they are different file names, both uploads are allowed. This means the user effectively gets 20 uploads per image file type.

I'm open to feedback on how to improve this, but I figured a sample solution posted here is better than just a description of one.

rules_version = '2';
service firebase.storage {
  match /b/{bucket}/o {
    match /images/{userId}/{currentYear}/{currentMonth}/{imageId} {
      function userIsAuthenticated() {
        return request.auth != null && request.auth.uid == userId;
      }

      function imageSizeIsValid() {
        return request.resource.size < 5 * 1024 * 1024;
      }

      function isImage() {
        return request.resource.contentType.matches('image/.*');
      }

      function dateIsValid() {
        let isCurrentYear = request.time.year() == int(currentYear);
        let isCurrentMonth = request.time.month() == int(currentMonth);
        return isCurrentYear && isCurrentMonth;
      }

      // Allows file names 00-19 (with a 3-4 character extension),
      // i.e. at most 20 distinct names per month.
      function fileNameIsValid() {
        return imageId.matches('[0-1][0-9]\\.\\w{3,4}');
      }

      allow read: if true;
      allow write: if userIsAuthenticated()
                   && imageSizeIsValid()
                   && isImage()
                   && dateIsValid()
                   && fileNameIsValid();
    }
  }
}
Soler answered 23/3, 2023 at 21:5 Comment(0)
