Google Cloud Storage – Lifecycle Management

March 4, 2015

In my last post, Google Cloud Storage – "Initial setup, cp, and rsync", I showed how to get started with Google Cloud Storage (GCS), including how to copy and rsync files to and from storage.

A common use case for GCS is backups.  For example, I back up my blog content nightly to GCS, so after a year I'd have 365 gregsramblings.tgz files.  I could write a script to list the files in my bucket and automatically delete backups based on various conditions, but GCS's Lifecycle Management feature makes this super simple.
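
My nightly job is nothing fancy: a tar of the blog content followed by a gsutil cp. Here's a minimal sketch, with the source path and bucket name as placeholders for whatever you're backing up:

#!/bin/bash
# Nightly backup sketch -- the path and bucket name are hypothetical placeholders.
tar czf /tmp/gregsramblings.tgz /var/www/gregsramblings
# Upload to the same object name each night; with versioning enabled (below),
# GCS keeps the older copies as archived versions.
gsutil cp /tmp/gregsramblings.tgz gs://gregsramblings-backups/

Because the object name is the same every night, each upload archives the previous version, and those archived versions are what the lifecycle rule below counts.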

You can read about the available delete conditions in the excellent GCS Lifecycle Management docs.  In my particular case, I wanted to always have the last 30 days' backups available, so I created a file named lifecycle_config.json as follows:

{
    "rule":
    [
      {
        "action": {"type": "Delete"},
        "condition": {"numNewerVersions": 30}
      }
    ]
}

To apply the rule to my bucket, I used:

gsutil versioning set on gs://gregsramblings-backups
gsutil lifecycle set lifecycle_config.json gs://gregsramblings-backups
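
If you want to double-check that both settings stuck, you can read them back:

gsutil versioning get gs://gregsramblings-backups
gsutil lifecycle get gs://gregsramblings-backups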

Now, once a backup version has 30 newer versions above it, GCS deletes it automatically, leaving me with the most recent 30 backups at all times.

Other condition types let you delete objects based on age (age), creation date (createdBefore), or whether the version is live (isLive).  The docs cover each of these.
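
For example, a rule that deletes any object more than a year old uses the same config format (the age condition is measured in days; 365 here is just an illustration):

{
    "rule":
    [
      {
        "action": {"type": "Delete"},
        "condition": {"age": 365}
      }
    ]
}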

In the next post, I'll cover how to create public links for your bucket objects with extremely efficient edge caching and index.html/404.html behavior (i.e., hosting a static website).


Filed in: Google Cloud Platform, Google Cloud Storage

About the Author

Greg is a developer advocate for the Google Cloud Platform, based in San Francisco, California.
  • Sourabh Verma

    Does it delete the archived objects as well? I mean, when we set object versioning, deleting an object creates an archived version. How are you managing those archived objects?