Simple Storage Service (S3) — Part 3
This is a continuation of https://codebykev.medium.com/100-days-of-devops-day-2-bac3dd0eeadc. There I looked at managing buckets and objects in S3 in a high-level, holistic way.
Today I’d like to try out some S3 commands on the AWS CLI.
Setup
Installing the AWS CLI
It’s a bit pointless to explain this in detail, so instead go to https://docs.aws.amazon.com/cli/latest/userguide/cli-chap-install.html and install the v2 version.
When you’ve done that, you can verify it like so:
$ aws --version
aws-cli/2.0.5 Python/3.7.5 Windows/10 botocore/2.0.0dev9
Configure the CLI
If this is your first time setting it up, it’s pretty simple; you’ll just use:
aws configure
It will ask you for a few details: your aws_access_key_id, your aws_secret_access_key, plus a default region and output format.
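For reference, the prompts look roughly like this (the values below are obviously placeholders):
$ aws configure
AWS Access Key ID [None]: AKIAXXXXXXXXXXXXXXXX
AWS Secret Access Key [None]: xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
Default region name [None]: eu-west-2
Default output format [None]: json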
Where do I get those?
If you’re asking this, you probably need to create a user in IAM with CLI (programmatic) access; once that’s done you’ll be able to grab both the access key and the secret. Follow this guide. Make sure you attach the AmazonS3FullAccess policy to that user so you can use the CLI to create S3 buckets, add objects and so on.
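If you’d rather attach that policy from the command line (assuming you already have credentials with IAM permissions, and the user name here is just an example), it looks something like:
$ aws iam attach-user-policy --user-name codebykev-cli --policy-arn arn:aws:iam::aws:policy/AmazonS3FullAccess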
Multiple Accounts
Yes, you can have multiple accounts on the CLI. I followed this to add a second one; the first was my work account.
I set up my second profile as codebykev and then switched to it using setx (I’m on a Windows machine). Ultimately you just need to set up an environment variable called AWS_PROFILE:
setx AWS_PROFILE codebykev
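If you’re not on Windows, the equivalent is a plain environment variable, and rather than switching globally at all, every command also accepts a --profile flag. A quick sketch (the profile name is just mine):
# macOS / Linux equivalent of setx
$ export AWS_PROFILE=codebykev
# or set up and target the profile explicitly per command
$ aws configure --profile codebykev
$ aws s3 ls --profile codebykev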
Buckets & Objects
Create a bucket
$ aws s3 mb s3://codebykev.dev.assets
If it already exists, you will get an error starting with:
make_bucket failed: s3://chaos-blog-test-bucket An error occurred...
The reasons will vary.
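One common reason is that bucket names are globally unique across all AWS accounts, so someone else may already own the name. You can also be explicit about which region the bucket lands in (the region here is just an example):
$ aws s3 mb s3://codebykev.dev.assets --region eu-west-2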
Listing Buckets
$ aws s3 ls
The ls command can do more, including listing the objects in your buckets and folders. But this is enough, for now, to prove our bucket exists.
$ aws s3 ls
2020-10-21 14:19:17 codebykev.dev
2020-10-23 18:21:37 codebykev.dev.assets
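For later, a couple of variations worth knowing (the bucket is still mine, the flags are standard): --recursive walks down into “folders”, and --human-readable with --summarize makes the sizes easier to scan.
$ aws s3 ls s3://codebykev.dev.assets --recursive
$ aws s3 ls s3://codebykev.dev.assets --recursive --human-readable --summarize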
Copy an item
Copying a file to the bucket is pretty easy too
$ aws s3 cp demo.jpg s3://codebykev.dev.assets
upload: .\demo.jpg to s3://codebykev.dev.assets/demo.jpg
This copies a single file 🎉… but did it?
Listing the files in a bucket
Listing the files in a bucket uses ls again.
$ aws s3 ls s3://codebykev.dev.assets
2020-10-23 18:37:15 43479 demo.jpg
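Incidentally, cp works in the other direction too, so something like this (the local filename is just my example) would pull the object back down:
$ aws s3 cp s3://codebykev.dev.assets/demo.jpg ./demo-copy.jpg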
Deleting the file
$ aws s3 rm s3://codebykev.dev.assets/demo.jpg
delete: s3://codebykev.dev.assets/demo.jpg
Removing the bucket
$ aws s3 rb s3://codebykev.dev.assets
remove_bucket: codebykev.dev.assets
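Worth noting: rb will refuse to remove a bucket that still has objects in it. If you’re really sure, there’s a --force flag that empties the bucket first:
$ aws s3 rb s3://codebykev.dev.assets --force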
…and just like that, you are back to where you started.
Cheatsheet I liked
There are plenty of options out there, but this one, from https://acloudguru.com/blog/engineering/aws-s3-cheat-sheet, was very succinct.
Sync
$ aws s3 sync ./folder s3://<bucket>
This command is part of my current professional pipeline; we’ve found it very useful for quick static deploys too. Just don’t sync the wrong folder on your machine like I did (%USERPROFILE%, if you wondered) and end up with rather a lot of personal stuff synced to the assets folder.
$ aws s3 rm s3://<bucket>/ --recursive --exclude "keepme/*"
This was pretty useful 🙌 to purge that mistake.
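If you do use sync for deploys, two flags worth knowing (the paths here are just examples) are --delete, which removes remote files that no longer exist locally, and --exclude, which keeps things like your .git folder out of the bucket:
$ aws s3 sync ./public s3://<bucket> --delete --exclude ".git/*"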
Hopefully this was useful as a reference for someone else; I quite enjoyed poking around the CLI for buckets. I find it useful to know what my options are when I’m scripting pipelines myself.
The best command, in my opinion, is help. You can put it after any command to get a decent manpage-style outline of the possibilities. If you prefer websites, the AWS doc pages are pretty decent.
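For example, either of these will page through the options for the commands we used earlier:
$ aws s3 help
$ aws s3 cp help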