We have a Dataproc cluster staging bucket in which all the Spark job logs are stored:
eu-digi-pipe-dataproc-stage/google-cloud-dataproc-metainfo/d0decf20-21fd-4536-bbc4-5a4f829e49bf/jobs/
I have set up a lifecycle policy, shown below, so that every job folder whose name starts with the prefix digi-- is deleted once it is older than one day.
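As a sanity check, a wildcard listing can show which objects the prefix would actually match (a sketch; adjust the path if the job folders are nested differently):

gcloud storage ls "gs://eu-digi-pipe-dataproc-stage/google-cloud-dataproc-metainfo/d0decf20-21fd-4536-bbc4-5a4f829e49bf/jobs/digi--*"

The policy currently configured on the bucket: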
gcloud storage buckets describe gs://eu-digi-pipe-dataproc-stage --format="default(lifecycle_config)"
lifecycle_config:
  rule:
  - action:
      type: Delete
    condition:
      age: 1
      matchesPrefix:
      - google-cloud-dataproc-metainfo/d0decf20-21fd-4536-bbc4-5a4f829e49bf/jobs/digi--
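For reference, a rule like this can be applied from a JSON lifecycle file, e.g. (lifecycle.json is a hypothetical local file name):

{
  "rule": [
    {
      "action": {"type": "Delete"},
      "condition": {
        "age": 1,
        "matchesPrefix": [
          "google-cloud-dataproc-metainfo/d0decf20-21fd-4536-bbc4-5a4f829e49bf/jobs/digi--"
        ]
      }
    }
  ]
}

gcloud storage buckets update gs://eu-digi-pipe-dataproc-stage --lifecycle-file=lifecycle.json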
Is it feasible to achieve this?