Log retention in Stackdriver GCP

How can I enable log retention in GCP Stackdriver? I haven't found any documentation for configuring log retention. I can see an export option in the Logging section, and log ingestion, but nothing for retention.

Twedy answered 21/8, 2018 at 5:40 Comment(0)

Log Retention is now POSSIBLE.

Use this documentation to configure custom retention periods, which can be anywhere between 1 day and 3650 days.

gcloud beta logging buckets update _Default --location=global --retention-days=[RETENTION_DAYS]

Explanation:

For each Google Cloud project, Logging automatically creates two logs buckets: _Required and _Default. All logs generated in the project are stored in the _Required and _Default logs buckets, which live in the project that the logs are generated in:

_Required: This bucket holds Admin Activity audit logs, System Event audit logs, and Access Transparency logs, and retains them for 400 days. You aren't charged for the logs stored in _Required, and the retention period of the logs stored here cannot be modified. You cannot delete this bucket.

_Default: This bucket holds all other ingested logs in a Google Cloud project except for the logs held in the _Required bucket. Standard Cloud Logging pricing applies to these logs. Log entries held in the _Default bucket are retained for 30 days, unless you apply custom retention rules.

With custom buckets and the _Default bucket, you can configure custom retention periods for different logs.
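As a rough sketch of the same mechanism, creating a custom bucket with its own retention and routing logs into it could look like this (the bucket name, sink name, and filter are illustrative examples, and depending on your SDK version the commands may sit under the beta or alpha track):

# Create a custom log bucket with a 1-year retention period
gcloud logging buckets create custom-logs --location=global \
    --retention-days=365 --description="Logs kept for one year"

# Route matching log entries into the custom bucket with a sink
gcloud logging sinks create route-to-custom-logs \
    logging.googleapis.com/projects/[PROJECT_ID]/locations/global/buckets/custom-logs \
    --log-filter='resource.type="gce_instance"'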

Lempres answered 18/4, 2020 at 14:56 Comment(5)
Thanks! Just read the page, it is still in Beta, but it works! – Qulllon
This is free too!! Up to 10 years – Sutter
Running this command I get ERROR: (gcloud.beta.logging) Invalid choice: 'buckets'. - I've set up a sink to my cloud-storage, but other than that I can't change the retention period for stack-driver. – Unmeet
@Unmeet try alpha instead of beta - fixed it for me – Laflamme
@Sutter party is over: "Effective March 31, 2021, storage costs will apply to all chargeable logs retained longer than the default retention periods at the rate of $.01 per GiB per month (or fraction thereof). For details, see cloud.google.com/stackdriver/pricing#logs-storage" – Palmerpalmerston

NOW, it is possible; see the post below (edited)


Previous answer:

Logging retention is 30 days and is not configurable; you only pay for the storage.

Stackdriver Logging allows you to retain the logs for 30 days, and gives you a one-click configuration tool to archive data for a longer period in Google Cloud Storage.

https://cloud.google.com/logging/

But you can create a sink for your logs and store them in BigQuery or Google Cloud Storage (or both of them).
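For example, a sink to a Cloud Storage bucket and another to a BigQuery dataset could look roughly like this (the sink names, destinations, and filter are illustrative placeholders, not from the original answer):

# Export all logs of the project to a Cloud Storage bucket
gcloud logging sinks create my-gcs-sink \
    storage.googleapis.com/[BUCKET_NAME]

# Export only error-level logs to a BigQuery dataset
gcloud logging sinks create my-bq-sink \
    bigquery.googleapis.com/projects/[PROJECT_ID]/datasets/[DATASET_ID] \
    --log-filter='severity>=ERROR'

After creating a sink, grant its writer identity (printed in the command output) write access to the destination bucket or dataset.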

Befall answered 21/8, 2018 at 16:20 Comment(0)

In addition to the 30 days, audit logs are retained for 400 days.

The Design Patterns for Logging Exports documentation covers the specifics of exports to GCS, BigQuery, and Pub/Sub (for streaming logs).

Wodge answered 23/8, 2018 at 16:17 Comment(0)

Stackdriver retention:

  1. Admin Activity (400 days)
  2. Data Access (30 days)
  3. System Event (400 days)
  4. Other logs (30 days)

Bullets 1 to 3 are audit logs, which you can enable on the IAM > Audit Logs page. Remember that this can be a large stream of logs. This is especially the case for Data Access logs, since they log every access to objects in, for example, GCS (Google Cloud Storage) or Cloud Datastore. Some best practices are to turn off audit logging for development, or to only turn on audit logging for services that you use frequently (KMS, IAM, storage, etc.) and turn it off for Cloud Build, Cloud Functions, etc.
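A sketch of enabling Data Access audit logs from the command line (the service chosen here, storage.googleapis.com, and the file name are just examples):

# Download the current IAM policy of the project
gcloud projects get-iam-policy [PROJECT_ID] --format=yaml > policy.yaml

# Add (or extend) an auditConfigs section in policy.yaml, for example:
# auditConfigs:
# - service: storage.googleapis.com
#   auditLogConfigs:
#   - logType: DATA_READ
#   - logType: DATA_WRITE

# Upload the updated policy
gcloud projects set-iam-policy [PROJECT_ID] policy.yaml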

Bullet 4, other logs, could be application logging from Cloud Functions, App Engine, etc. This is the logging that comes from the applications that you run on GCP. For all Stackdriver logging, retention is now configurable (April 2020). You can read more about this here.

Want to store the logging for a longer period of time?

There are many use cases in which you want to keep logging for a longer period of time, whether for analytical purposes, monitoring, or compliance reasons. You can export logs with a log sink at project, folder, or even organization level (a sketch of an organization-level sink follows the list below). Log sinks themselves are free; you only pay for the storage of the destination, which could be one of the following:

  • Google Cloud Storage
  • Pub/Sub
  • BigQuery
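As a sketch of an organization-level aggregated sink (the sink name, IDs, and filter below are placeholder examples, not from the original answer):

gcloud logging sinks create org-wide-sink \
    pubsub.googleapis.com/projects/[PROJECT_ID]/topics/[TOPIC_ID] \
    --organization=[ORG_ID] --include-children \
    --log-filter='severity>=WARNING'

The --include-children flag makes the sink pick up logs from all projects and folders under the organization; the same pattern works with --folder=[FOLDER_ID] at folder level.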

Pub/Sub could be a neat solution if you want to move the logging export somewhere else, possibly outside of GCP. I recently learned that Google Cloud Storage and BigQuery do not differ a lot when it comes to storage cost. They both offer lower-cost storage tiers for longer-term storage.

Logsink best practices

  • For a lot of use cases, BigQuery may be the best and easiest solution. Its storage costs are comparable to Cloud Storage's Standard and Nearline classes, and you can easily query the data.
  • When using BigQuery, don't run queries that scan a lot of data too often; this can become expensive.
  • When using BigQuery, partition your data so that every day or every week the data is inserted into a new partition or table. BigQuery then automatically reduces the storage cost of partitions or tables that have not been updated for 90 days by 50%.
  • When you have to store data for several years (3, 5, or even 7) for compliance reasons, I would recommend exporting the data to Google Cloud Storage. With object lifecycle management you can move this data to Archive storage, which costs only a fraction (roughly 15%) of Standard storage; see the sketch after this list.
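A minimal sketch of such a lifecycle rule with gsutil; the 365-day threshold, file name, and bucket name are illustrative assumptions, not part of the original answer. The contents of lifecycle.json move objects older than 365 days to the Archive storage class:

{
  "rule": [
    {
      "action": {"type": "SetStorageClass", "storageClass": "ARCHIVE"},
      "condition": {"age": 365}
    }
  ]
}

Apply it to the bucket that receives the exported logs:

gsutil lifecycle set lifecycle.json gs://[LOG_BUCKET_NAME]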
Qulllon answered 15/4, 2020 at 14:23 Comment(0)

Now you can configure custom retention with the following command:

gcloud alpha logging buckets update _Default --location=global --retention-days=[RETENTION_DAYS]

See https://cloud.google.com/logging/docs/storage#logs-retention
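To verify the configured retention afterwards, something like the following should work (depending on your SDK version this may require the alpha or beta track):

gcloud logging buckets describe _Default --location=global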

Lunn answered 26/2, 2020 at 2:0 Comment(0)
