Harnessing the Power of AWS S3


The total volume of data and number of objects you can store in Amazon S3 are unlimited. Individual Amazon S3 objects can range in size from a minimum of 0 bytes to a maximum of 5 TB. The largest object that can be uploaded in a single PUT is 5 GB.
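To put these limits together: an object larger than 5 GB must go through multipart upload, split into parts of at most 5 GB each. A quick back-of-the-envelope sketch (using binary units) shows that even a maximum-size object fits comfortably within S3's multipart part count:

```python
import math

# S3 limits from above: max object size 5 TB, max single PUT (one part) 5 GB.
max_object = 5 * 1024**4   # 5 TB in bytes (binary units)
max_put    = 5 * 1024**3   # 5 GB in bytes

# Minimum number of multipart parts needed for a maximum-size object.
min_parts = math.ceil(max_object / max_put)
print(min_parts)  # 1024 -- well under S3's 10,000-part multipart limit
```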

Let's start with the practical part: how do we create a bucket?

Once we click on S3, we land on the page below:

There is an option called Create bucket, so click on Create bucket.

Under General configuration we can keep the defaults; the Bucket type should be General purpose.

Now we have to provide the bucket name, and it must be unique because the bucket namespace is global. Let's assume we are implementing this project for a real organization.

So we will name it environment-wise, let's say “app1-transaction-prod-example.com”.
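Since bucket names must be globally unique, AWS also enforces naming rules. As a rough sketch (the helper below is illustrative; AWS applies a few more rules, such as rejecting IP-address-style names and certain prefixes), the core constraints can be checked like this:

```python
import re

# Core S3 bucket-naming rules (simplified sketch, not the full AWS rule set):
#   - 3 to 63 characters
#   - lowercase letters, digits, dots, and hyphens only
#   - must begin and end with a letter or digit
#   - no consecutive dots
BUCKET_NAME_RE = re.compile(r"^[a-z0-9][a-z0-9.-]{1,61}[a-z0-9]$")

def is_valid_bucket_name(name: str) -> bool:
    return bool(BUCKET_NAME_RE.match(name)) and ".." not in name

print(is_valid_bucket_name("app1-transaction-prod-example.com"))  # True
print(is_valid_bucket_name("MyBucket"))                           # False: uppercase
```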

One more thing we notice while configuring: an AWS Region field is shown. Why? We have learned that S3 is a global service, so why does a Region appear here?

Buckets are tied to a region; like most AWS resources, a bucket is created in a specific region, but the content in that bucket is globally accessible. Let's take an example. I'm sitting in Pune and we created the bucket in the Mumbai region, so when we access the bucket's content from Pune, the low latency lets requests be served quickly. In contrast, if I'm sitting in Mumbai and create the bucket in the US or another far-away region, accessing the content takes more time. So S3 buckets are scoped to a region.

Coming to the Block Public Access section below: this option is enabled by default while creating the bucket.

If we uncheck this option, everybody gets access to any sensitive information in our bucket, which is not a standard practice. So always keep it enabled.

For Bucket Versioning we will go with the default option (disabled).

Now coming to Default encryption: keep the default, and leave Bucket Key at its default as well, i.e. enabled.

Then click the Create bucket button.

The bucket has been created successfully.

Now we will upload some content into this bucket.

Click on the bucket.

Then click on upload.

So we can upload anything from our laptop to the S3 bucket.

Click on Add files.

I uploaded one of the files from my laptop.

So the index.html shown above is an object.

Now we will do a demo of a very important concept: versioning.

We already have a file in the S3 bucket.

We will go back to the file and make some modifications, even though versioning has not been enabled yet.

Next we enable Bucket Versioning by clicking Enable under Bucket Versioning. Now I will go and make some modifications to the file.

We have now made some modifications to the file.

We upload the file again; after the modification, the file is uploaded successfully.

Now click on Versions.

Now we can see that versioning has happened.

In the image above we can see both copies of the file: the original file we created and the post-modification file.

AWS S3 Versioning: Protecting Your Data

AWS S3 Versioning is a powerful feature that enables you to preserve multiple versions of each object stored in your S3 bucket. This helps protect your data from accidental deletions or overwrites, ensuring data durability and recoverability.

By enabling S3 Versioning, you can significantly enhance the durability and recoverability of your data stored in AWS S3, providing peace of mind and protecting your valuable assets.
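To make the behavior concrete, here is a toy in-memory model of a versioning-enabled bucket (illustration only, not the real S3 API; the class and version-ID scheme are made up): each upload of the same key adds a new version instead of overwriting the old one.

```python
import itertools

# Toy model of a versioning-enabled bucket. Every PUT of the same key
# appends a new version; old versions are kept, never overwritten.
class VersionedBucket:
    def __init__(self):
        self._versions = {}           # key -> list of (version_id, body)
        self._ids = itertools.count(1)

    def put(self, key, body):
        vid = f"v{next(self._ids)}"
        self._versions.setdefault(key, []).append((vid, body))
        return vid

    def get(self, key):
        """A plain GET returns the latest version's body."""
        return self._versions[key][-1][1]

    def list_versions(self, key):
        return [vid for vid, _ in self._versions[key]]

bucket = VersionedBucket()
bucket.put("index.html", "<h1>original</h1>")
bucket.put("index.html", "<h1>modified</h1>")
print(bucket.list_versions("index.html"))  # ['v1', 'v2'] -- both copies kept
print(bucket.get("index.html"))            # <h1>modified</h1>
```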

So let's explore more options in the S3 bucket.

Tags are a way to identify resources. For now, let's check the Permissions tab of the bucket.

In the left-side pane, click Block Public Access settings for this account.

Click on Edit

We will keep this setting “On” to restrict access to this particular bucket. Otherwise, any user with access to S3 could access this bucket as well. Let's say this bucket holds sensitive information and you want to restrict access to it; then we can use “Block all public access”.
We will see this in a practical demo; first we enable the setting.

Click Save changes.
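Under the hood, the console's master “Block all public access” switch corresponds to four settings in the S3 API's PublicAccessBlockConfiguration; turning the master switch on sets all four, roughly like this sketch:

```python
import json

# The four settings behind the console's "Block all public access" toggle
# (the S3 API's PublicAccessBlockConfiguration). The master switch sets all
# four to True:
block_public_access = {
    "BlockPublicAcls": True,        # reject new public ACLs on bucket/objects
    "IgnorePublicAcls": True,       # ignore any existing public ACLs
    "BlockPublicPolicy": True,      # reject bucket policies that grant public access
    "RestrictPublicBuckets": True,  # restrict access even if a public policy exists
}

print(json.dumps(block_public_access, indent=2))
```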

Next we go to IAM and click on Users to create a user.

Click on Create user

Select I want to create an IAM user.

Choose Custom password, and untick Users must create a new password at next sign-in.

Then Click Next.

As of now we are not attaching any policy to this user.

Click Create User

Now open a new tab (an incognito window), go to the AWS console, and log in with the user we just created.

Now we are logged in as demo-s3-user. With this user we don't have any permission to list the buckets.

This is expected behavior, as we did not attach any policy to the user.

So the next step is to attach permissions to this user. We will go back to the console where we have root access and grant the permissions through IAM.

In the Permissions tab, choose Attach policies directly and grant S3 full access (AmazonS3FullAccess) to the user.

Click Add permissions & attach policy directly.

Click Next

Click Add Permissions.

The IAM user now has permission to list the S3 buckets and so on, since the user has S3 full access.
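For reference, the S3 full access we granted is roughly equivalent to an inline policy that allows every S3 action on every resource. This is a sketch: the actual AmazonS3FullAccess managed policy may include a few related permissions beyond it.

```python
import json

# Rough inline equivalent of S3 full access: allow every S3 action on
# every resource (sketch only; the managed policy may include more).
s3_full_access = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": "s3:*",
            "Resource": "*",
        }
    ],
}

print(json.dumps(s3_full_access, indent=2))
```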

Now the user has full access, so he is able to access any file or object within the bucket, and that is not good practice. Let's say that in the real world the file contains sensitive data, yet demo-s3-user is able to download it, so the file is not secure. As DevOps engineers we have to restrict this access. So, as the bucket owner, we will make sure nobody else can access the bucket.

To do this, open the bucket, go to the Permissions tab, and click Edit under Bucket policy.

For now, I am going to implement a policy where only I have access and everyone else is blocked.

Then, within Bucket policy, we have to write a JSON policy document like the one below.

After clicking Edit, we have to click Add new statement.

It looks like below:

The Statement ID (Sid) is just a generic name, so we will put “blockallpublicaccess”. Principal means who this action applies to; with “*”, it applies to everyone in AWS. Effect is “Deny”, so everyone is denied access.

Under Action, we can search in the right pane, select the S3 service, and tick All actions.

Below that, click Add a resource.

Copy the bucket's ARN.

Click on Add resource.

We have specified that everyone in the AWS account should be denied, but if we block everyone, I get blocked as well. So the statement should make an exception for ourselves only; to do this, we add a condition.

Click on Add a condition

Then click Add condition.

What we have done above is deny access to this specific S3 bucket, and all the objects in it, for everyone; that is, everyone is restricted from accessing the objects except for the exception we made through the condition: the root user, i.e. me. We specified the account ID of the root user there.

Click Save changes.

We got the below error 👍

So we have to edit the JSON policy again.

Previously we missed specifying both the bucket itself and the resources inside the bucket. We have corrected that above. With the help of the resource generator it's very easy to write.
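Putting the pieces together, the corrected policy looks roughly like the sketch below, built here in Python for readability. The account ID and bucket name are placeholders, not the real values from the demo; the condition excludes the account's root user from the Deny.

```python
import json

# Sketch of the deny-everyone-except-root bucket policy from the demo.
# ACCOUNT_ID and BUCKET are placeholders -- substitute your own values.
ACCOUNT_ID = "111122223333"
BUCKET = "app1-transaction-prod-example.com"

policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "blockallpublicaccess",
            "Effect": "Deny",
            "Principal": "*",
            "Action": "s3:*",
            # Both the bucket itself and the objects inside it must be
            # listed as resources.
            "Resource": [
                f"arn:aws:s3:::{BUCKET}",
                f"arn:aws:s3:::{BUCKET}/*",
            ],
            # Exclude the account's root user from the Deny.
            "Condition": {
                "StringNotEquals": {
                    "aws:PrincipalArn": f"arn:aws:iam::{ACCOUNT_ID}:root"
                }
            },
        }
    ],
}

print(json.dumps(policy, indent=2))
```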

Then click Save changes. Now let us try with the other user that we created earlier.

Previously the user demo-s3-user was able to access the resource. We can still see the bucket, so let's try to get into it.

We can see that the user still has access to other S3 buckets, but for this bucket the user fails to read or download the objects inside, because of the bucket policy we assigned. In this way we can restrict access to a specific bucket using a bucket policy.

We will do the demo of the Static Website Hosting with S3.

But before moving ahead we have to change some settings, otherwise the static website will not be accessible.

  1. We have to allow public access for that bucket.

  2. We should also remove the bucket policy we generated previously.

Remove the Bucket Policy as well. 👍

Click on Edit & then remove it.

Then, from here, uncheck Block all public access.

Then Save the Changes.

Now we will go and implement the Static website hosting with S3 service.

We can see the index.html file is there; you can pull one from anywhere and upload that file into your bucket.

Then click on the bucket and go to the Properties tab.

In the Static website hosting section, click Edit and enable it.

Type index.html in the Index document field.

Then click Save changes.
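These console settings map to the S3 API's WebsiteConfiguration structure; the equivalent of what we just configured looks roughly like:

```python
import json

# Sketch of the S3 WebsiteConfiguration for the settings above:
# index.html is served as the index document. An error document is
# optional and not configured in this demo.
website_configuration = {
    "IndexDocument": {"Suffix": "index.html"},
    # Optionally: "ErrorDocument": {"Key": "error.html"},
}

print(json.dumps(website_configuration, indent=2))
```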

Below is the endpoint link to access the website.

It’s showing an error.

We have to add a new bucket policy, as we did previously, but this time to allow public read access to the objects.

Go into the permission tab and Edit the Bucket Policy.

Then we update the bucket policy as above:
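The policy for this step is typically a public-read statement allowing s3:GetObject on the bucket's objects. A sketch (the bucket name is a placeholder; substitute your own):

```python
import json

# Sketch of a typical public-read policy for static website hosting:
# anyone may GET the objects in the bucket. BUCKET is a placeholder.
BUCKET = "app1-transaction-prod-example.com"

public_read_policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "PublicReadGetObject",
            "Effect": "Allow",
            "Principal": "*",
            "Action": "s3:GetObject",
            # GetObject acts on objects, so only the /* resource is needed.
            "Resource": f"arn:aws:s3:::{BUCKET}/*",
        }
    ],
}

print(json.dumps(public_read_policy, indent=2))
```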

Then click Save changes. Now try to access the static website endpoint URL.

The URL is now accessible, so we have successfully implemented static website hosting.

Happy Learning !!

Kindly Share and repost!!