Streamlining Data Lifecycle Management in AWS S3

Introduction: Managing the data lifecycle in Amazon Simple Storage Service (S3) is crucial for efficient storage, cost optimization, and regulatory compliance. In this blog post, we'll walk step by step through implementing retention policies and purging outdated data from S3 buckets using AWS CLI and Python (boto3) scripts.

Implementing Retention Policies:

Overview: Retention policies in S3 automate the management of stored objects based on predefined rules. These policies allow organizations to seamlessly transition or delete objects based on criteria such as object age, key prefix, or size.
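For example, a rule can archive objects to a colder storage class instead of deleting them. The sketch below is a minimal illustration, assuming a hypothetical bucket name and a logs/ prefix; it transitions matching objects to S3 Glacier after 30 days:

```bash
# Sketch: transition objects under logs/ to Glacier after 30 days
aws s3api put-bucket-lifecycle-configuration \
  --bucket example-bucket-123 \
  --lifecycle-configuration '{
    "Rules": [
      {
        "ID": "ArchiveLogsRule",
        "Filter": { "Prefix": "logs/" },
        "Status": "Enabled",
        "Transitions": [
          { "Days": 30, "StorageClass": "GLACIER" }
        ]
      }
    ]
  }'
```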

Script to Implement Retention Policy:

```bash
#!/bin/bash
BUCKET_NAME="example-bucket-123"
RETENTION_DAYS=90

# Apply a lifecycle rule that expires (deletes) all objects after RETENTION_DAYS
aws s3api put-bucket-lifecycle-configuration \
  --bucket "$BUCKET_NAME" \
  --lifecycle-configuration '{
    "Rules": [
      {
        "ID": "DeleteObjectsRule",
        "Filter": { "Prefix": "" },
        "Status": "Enabled",
        "Expiration": { "Days": '"$RETENTION_DAYS"' }
      }
    ]
  }'
```

Example Usage: Execute the script above to apply a 90-day retention policy to the specified bucket.
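To confirm the policy took effect, you can read the lifecycle configuration back from the bucket:

```bash
# Inspect the lifecycle rules currently attached to the bucket
aws s3api get-bucket-lifecycle-configuration --bucket example-bucket-123
```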

Deleting Older Data:

Overview: Regularly purging outdated data from S3 buckets is essential for optimizing storage resources and reducing costs.

Script to Delete Older Data:

```bash
#!/bin/bash
BUCKET_NAME="example-bucket-123"
OLDER_THAN_DAYS=90

# Cut-off date in YYYY-MM-DD form (GNU date syntax; on macOS use `date -v-90d`)
CUTOFF_DATE=$(date -d "$OLDER_THAN_DAYS days ago" +%Y-%m-%d)

# Each output line of `aws s3 ls` is: date time size key
aws s3 ls "s3://$BUCKET_NAME" --recursive | \
while read -r createDate createTime size fileName; do
    # Lexicographic comparison works because dates are YYYY-MM-DD
    if [ -n "$createDate" ] && [[ "$createDate" < "$CUTOFF_DATE" ]]; then
        aws s3 rm "s3://$BUCKET_NAME/$fileName"
    fi
done
```

Example Usage: Execute the script above to delete objects older than 90 days from the specified bucket.
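Before deleting anything for real, it is worth previewing what the script would remove; the high-level `aws s3 rm` command accepts a `--dryrun` flag (the object key below is just a placeholder):

```bash
# Preview a deletion without actually removing the object
aws s3 rm --dryrun s3://example-bucket-123/old-report.csv
```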

Python Script for Deleting Older Logs:

```python
#!/usr/bin/env python3
import boto3
from datetime import datetime, timedelta, timezone

buckets = [
    "example-bucket-123",
    "example-bucket-456",
    "example-bucket-789",
]

print("Initializing the S3 client")
s3 = boto3.client("s3")


def delete_older_logs(bucket_name):
    print(f"Deleting logs older than 90 days from bucket: {bucket_name}")
    cutoff = datetime.now(timezone.utc) - timedelta(days=90)
    print(f"Cut-off date: {cutoff}")

    # Paginate so buckets holding more than 1,000 objects are fully covered
    paginator = s3.get_paginator("list_objects_v2")
    found_objects = False
    for page in paginator.paginate(Bucket=bucket_name):
        for obj in page.get("Contents", []):
            found_objects = True
            last_modified = obj["LastModified"]  # timezone-aware datetime
            print(f"Object: {obj['Key']}, Last modified: {last_modified}")
            if last_modified < cutoff:
                print(f"Deleting object: {obj['Key']}")
                s3.delete_object(Bucket=bucket_name, Key=obj["Key"])

    if not found_objects:
        print(f"No objects found in bucket: {bucket_name}")


for bucket in buckets:
    delete_older_logs(bucket)
```
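Example Usage: Save the script (any filename works, e.g. delete_older_logs.py) and run it with python3. The AWS credentials in use must allow s3:ListBucket and s3:DeleteObject on each listed bucket, or the API calls will fail with an access error.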

Conclusion: By implementing retention policies and regularly removing outdated data from S3 buckets, organizations can optimize storage resources, reduce costs, and stay compliant with data retention requirements. These scripts offer a simple yet effective way to automate data lifecycle management in AWS S3.
