Lifecycle policies on Dataproc staging bucket subfolders

We have a Dataproc cluster staging bucket in which all the Spark job logs are stored, under this path:

eu-digi-pipe-dataproc-stage/google-cloud-dataproc-metainfo/d0decf20-21fd-4536-bbc4-5a4f829e49bf/jobs

I have set up a lifecycle policy, shown below, so that all job folders whose names start with the prefix digi-- are deleted once they are older than one day.

gcloud storage buckets describe gs://eu-digi-pipe-dataproc-stage --format="default(lifecycle_config)"
lifecycle_config:
  rule:
  - action:
      type: Delete
    condition:
      age: 1
      matchesPrefix:
      - google-cloud-dataproc-metainfo/d0decf20-21fd-4536-bbc4-5a4f829e49bf/jobs/digi--

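For context, a rule like the one above can be applied with a JSON lifecycle file and gcloud storage buckets update. The sketch below simply reconstructs the same rule from the describe output; the file name lifecycle.json is a placeholder.

lifecycle.json (placeholder name):

{
  "rule": [
    {
      "action": {"type": "Delete"},
      "condition": {
        "age": 1,
        "matchesPrefix": [
          "google-cloud-dataproc-metainfo/d0decf20-21fd-4536-bbc4-5a4f829e49bf/jobs/digi--"
        ]
      }
    }
  ]
}

# Apply the lifecycle configuration to the staging bucket
gcloud storage buckets update gs://eu-digi-pipe-dataproc-stage --lifecycle-file=lifecycle.json

Note that the age condition counts whole days since each object's creation time, and lifecycle actions run asynchronously, so matching objects become eligible for deletion once they are more than a day old rather than being removed at an exact moment.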
Is it feasible to achieve this?

Garbage answered 31/5, 2024 at 4:39

Comment: Hi @vikrant Rana, are you facing any errors? If yes, could you provide the error message? – Consonance
