This week I will explain the S3 bucket logging feature. With bucket logging we can collect a lot of information about access to our buckets: who accessed them, the bucket names, the request types, and so on. We can also find out whether there are any permission problems with our buckets. For example, we may think we have configured a bucket policy with the proper permissions; the logs will show whether any unauthorized access is attempted. When we enable access logging, logs are delivered periodically, typically within a few hours (AWS says it is not guaranteed that all logs will be delivered; you can read more here). Also, when we enable logging, permissions for the "Log Delivery" group are set on the target bucket.
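Logging can also be enabled from the AWS CLI instead of the console. The sketch below is essentially a configuration fragment; the bucket names and the "logs/" prefix are example values, not requirements:

```shell
# A minimal sketch of enabling server access logging via the AWS CLI.
# Bucket names and the "logs/" target prefix are example values.
aws s3api put-bucket-logging \
  --bucket my.customer.bucket \
  --bucket-logging-status '{
    "LoggingEnabled": {
      "TargetBucket": "my.customer.bucket.logs",
      "TargetPrefix": "logs/"
    }
  }'
```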
To show how it works, I will use two buckets and enable logging, then I will analyse my logs using goaccess. Remember that we can keep our logs in the same bucket, but in my example I will use a separate one.
As you can see, these are my buckets. On "my.customer.bucket", in the Logging section of the Properties tab, I enable logging and select "my.customer.bucket.logs" as the target bucket. Here you can see that we can also set a "target prefix". If you want to use the same bucket for different logs, you can use a target prefix to keep them separated. We can also use these prefixes in our lifecycle rules when we want to delete old logs.
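As a sketch of that lifecycle idea, a rule that expires old log objects under a prefix could look like the following. The "logs/" prefix, the rule ID, and the 30-day expiry are example values of my own, not from this setup:

```shell
# Example only: expire objects under the "logs/" prefix after 30 days.
aws s3api put-bucket-lifecycle-configuration \
  --bucket my.customer.bucket.logs \
  --lifecycle-configuration '{
    "Rules": [
      {
        "ID": "expire-old-access-logs",
        "Filter": { "Prefix": "logs/" },
        "Status": "Enabled",
        "Expiration": { "Days": 30 }
      }
    ]
  }'
```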
After a while, the logs are delivered, as seen below.
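If you just want a quick look at the records before reaching for a full analyser, each access log line follows a fixed space-delimited format: bucket owner, bucket, timestamp in brackets, remote IP, requester, request ID, operation, key, quoted request URI, HTTP status, error code, and further fields. Here is a small Python sketch that pulls out the leading fields; the sample line is a synthetic example of mine, with shortened IDs:

```python
import re

# Regex for the leading fields of an S3 server access log record:
# bucket owner, bucket, [timestamp], remote IP, requester, request ID,
# operation, key, "Request-URI", HTTP status, error code.
LOG_PATTERN = re.compile(
    r'(?P<owner>\S+) (?P<bucket>\S+) \[(?P<time>[^\]]+)\] '
    r'(?P<remote_ip>\S+) (?P<requester>\S+) (?P<request_id>\S+) '
    r'(?P<operation>\S+) (?P<key>\S+) "(?P<request_uri>[^"]*)" '
    r'(?P<http_status>\S+) (?P<error_code>\S+)'
)

def parse_access_log(line):
    """Return a dict of the leading fields of one log line, or None."""
    match = LOG_PATTERN.match(line)
    return match.groupdict() if match else None

# A synthetic example line in the documented format (IDs shortened):
sample = ('79a59df900b949e5 my.customer.bucket [06/Feb/2019:00:00:38 +0000] '
          '192.0.2.3 79a59df900b949e5 3E57427F3EXAMPLE REST.GET.OBJECT '
          'index.html "GET /index.html HTTP/1.1" 200 - 1024 1024 12 10 '
          '"-" "curl/7.58.0" -')

record = parse_access_log(sample)
print(record["operation"], record["http_status"])  # REST.GET.OBJECT 200
```

From a dict like this it is easy to count operations, status codes, or remote IPs per bucket before deciding whether you need a heavier tool.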
Now I will use the s3stat Python module to analyse my log files. You can find the installation steps here. S3stat downloads the log files and helps us analyse them using goaccess.
And here are my results:
root@salt:~# s3stat.py mykey mysecret my.customer.bucket.logs ""
You can use bucket logging whenever you want to analyse and track your bucket usage. If you have any questions or comments, please feel free to write, and don't forget to share!