Change storage class of (existing) objects in Google Cloud Storage

I recently learnt of the new storage tiers and reduced prices announced on the Google Cloud Storage platform/service.

So I wanted to change the default storage class for one of my buckets from Durable Reduced Availability to Coldline, as that is what is appropriate for the files that I'm archiving in that bucket.

I got this note though:

Changing the default storage class only affects objects you add to this bucket going forward. It does not change the storage class of objects that are already in your bucket.

Any advice/tips on how I can change the class of all existing objects in the bucket (using the Google Cloud Console or gsutil)?

Paperback answered 27/10, 2016 at 5:49 Comment(0)

The easiest way to synchronously move the objects to a different storage class in the same bucket is to use rewrite. For example, to do this with gsutil, you can run:

gsutil -m rewrite -s coldline gs://your-bucket/**

Note: make sure gsutil is up to date (version 4.22 and above support the -s flag with rewrite).

Alternatively, you can use the new SetStorageClass action of the Lifecycle Management feature to asynchronously (usually takes about 1 day) modify storage classes of objects in place (e.g. by using a CreatedBefore condition set to some time after you change the bucket's default storage class).
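
For instance, here is a minimal sketch of such a lifecycle configuration (the bucket name and the CreatedBefore date are placeholders; pick a date after the point at which you switched the bucket's default class), saved as lifecycle.json:

{
  "lifecycle": {
    "rule": [
      {
        "action": {
          "type": "SetStorageClass",
          "storageClass": "COLDLINE"
        },
        "condition": {
          "createdBefore": "2016-11-01"
        }
      }
    ]
  }
}

gsutil lifecycle set lifecycle.json gs://your-bucket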

Gambell answered 27/10, 2016 at 16:19 Comment(7)
Documentation for the Lifecycle Management feature: cloud.google.com/storage/docs/lifecycle – Narayan
Thank you, Travis, Jeff; appreciate it! Works like a charm ☺, although the operation overwrites the Creation & Update times! I had two follow-up queries: 1. Is the gsutil rewrite operation billable? 2. Can the Creation time be preserved? Many thanks. Cheers! – Paperback
Yes, it is billable as a Class A operation (it uses storage.objects.rewrite, see cloud.google.com/storage/pricing). No, there's no way to preserve the creation/update time, because rewrite creates a new object generation. – Gambell
Done! Changed the storage class of 29,459 DRA objects to Coldline; cost < 30¢! – Paperback
Using Object Lifecycle Management does preserve the creation/update time (cloud.google.com/storage/docs/lifecycle). This blog post covers setting lifecycle rules on a Cloud Storage bucket to change the storage class of files over time so that their storage costs less: cloud.google.com/blog/topics/developers-practitioners/… – Anastassia
For me, with zsh and bucket-level access enabled on the GCP bucket, the command that worked was gsutil rewrite -O -s nearline 'gs://your-bucket/**' – Ebersole
I got CommandException: 21933 files/objects could not be rewritten and solved it by adding the -O flag. It is needed if the bucket has uniform access control (no object-level ACLs). – Waldemar

To change the storage class from NEARLINE to COLDLINE, create a JSON file with the following content:

{
  "lifecycle": {
    "rule": [
      {
        "action": {
          "type": "SetStorageClass",
          "storageClass": "COLDLINE"
        },
        "condition": {
          "matchesStorageClass": [
            "NEARLINE"
          ]
        }
      }
    ]
  }
}

Name it lifecycle.json or something, then run this in your shell:

$ gsutil lifecycle set lifecycle.json gs://my-cool-bucket

The changes may take up to 24 hours to go through. As far as I know, this change will not cost anything extra.
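
If you want to double-check that the policy is attached, gsutil can print the bucket's current lifecycle configuration (this shows the rules themselves, not whether individual objects have already been transitioned):

$ gsutil lifecycle get gs://my-cool-bucket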

Deafanddumb answered 16/5, 2021 at 8:20 Comment(4)
This is the best answer today. – Mould
You can also create this rule in the dashboard UI. cloud.google.com/storage/docs/… – Porphyritic
This answer is good, but it only works for moving from one class to a "colder" one. For example, COLDLINE to STANDARD is currently impossible. – Hobnailed
This operation DOES cost; it's a Class A operation on the target storage class (it uses storage.objects.rewrite, see cloud.google.com/storage/pricing). – Trumantrumann

I did this:

gsutil -m rewrite -r -s <storage-class> gs://my-bucket-name/

(-r for recursive, because I want all objects in my bucket to be affected).
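
To spot-check that the rewrite took effect, you can inspect any object's metadata; the output includes a "Storage class:" line (the object name here is a placeholder):

gsutil stat gs://my-bucket-name/some-object.txt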

Disfavor answered 13/7, 2021 at 12:55 Comment(1)
I needed to add a -O because I use bucket-level permissions. So in the end my command becomes: gsutil -m rewrite -O -r -s <storage-class> gs://my-bucket-name/ etc. – Spindling

You can now use "Data Transfer" to change the storage class by moving your bucket's objects to a new bucket.

Access this from the left panel of Storage.
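
A rough command-line alternative to the same idea (plain gsutil copying rather than the Data Transfer service), assuming a destination bucket already exists with the desired default storage class; the bucket names below are placeholders:

gsutil -m cp -r gs://my-old-bucket/* gs://my-new-bucket/
gsutil -m rm gs://my-old-bucket/**

Newly written objects take on the destination bucket's default storage class unless you specify one explicitly; only delete the originals once the copy is verified.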


Waybill answered 24/4, 2020 at 18:14 Comment(0)

If you can't use gsutil, for example in a Google Cloud Functions environment (Cloud Functions server instances don't have gsutil installed; it works on your local machine because you have it installed and configured there), I suggest evaluating the update_storage_class() blob method in Python. The method is called on an individual blob, i.e. on a specific object inside your bucket. Here is an example:

from google.cloud import storage

# Placeholders: set these to your bucket and target class.
# Valid classes include "STANDARD", "NEARLINE", "COLDLINE" and "ARCHIVE"
# (plus the legacy "MULTI_REGIONAL", "REGIONAL" and "DURABLE_REDUCED_AVAILABILITY").
bucket_name = "my-bucket"
new_class = "COLDLINE"

storage_client = storage.Client()
blobs = storage_client.list_blobs(bucket_name)

for blob in blobs:
    print(blob.name, blob.storage_class)
    # Rewrites the object in place with the new storage class.
    blob.update_storage_class(new_class)


Darreldarrell answered 23/7, 2020 at 10:21 Comment(0)
