How to Upload and Download Files From AWS S3 Using Python

Learn how to use cloud resources in your Python scripts

Photo by Raj Steven from Pexels

I am writing this post out of sheer frustration.

Every post I've read on this topic assumed that I already had an AWS account, an S3 bucket, and a mound of stored data. They only show the code but kindly gloss over the most important part: making the code work through your AWS account.

Well, I could've figured out the code easily, thank you very much. I had to sift through many SO threads and the AWS docs to get rid of every nasty authentication error along the way.

So that you won't feel the same and do the hard work yourself, I will share all the technicalities of managing an S3 bucket programmatically, right from account creation to adding permissions for your local machine to access your AWS resources.

Step 1: Set up an account

Right, let's start with creating your AWS account if you haven't already. Nothing unusual, just follow the steps from this link:

GIF by the author

Then, we will go to the AWS IAM (Identity and Access Management) console, where we will be doing most of the work.

GIF by the author

You can easily switch between different AWS servers, create users, add policies, and allow access to your user account from the console. We will do each one by one.

Step 2: Create a user

For one AWS account, you can create multiple users, and each user can have various levels of access to your account's resources. Let's create a sample user for this tutorial:

GIF by the author

In the IAM console:

  1. Go to the Users tab.
  2. Click on Add users.
  3. Enter a username in the field.
  4. Tick the "Access key - Programmatic access" field (essential).
  5. Click "Next" and "Attach existing policies directly."
  6. Tick the "AdministratorAccess" policy.
  7. Click "Next" until you see the "Create user" button.
  8. Finally, download the given CSV file of your user's credentials.

It should look like this:

By me🥱

Store it somewhere safe because we will be using the credentials later.

Step 3: Create a bucket

Now, let's create an S3 bucket where we can store data.

GIF by the author

In the IAM console:

  1. Click services in the top left corner.
  2. Scroll down to storage and select S3 from the right-hand list.
  3. Click "Create bucket" and give it a name.

You can choose any region you want. Leave the rest of the settings and click "Create bucket" again.

Step 4: Create a policy and add it to your user

In AWS, access is managed through policies. A policy can be a set of settings or a JSON file attached to an AWS object (user, resource, group, role), and it controls what aspects of the object you can use.

Below, we will create a policy that enables us to interact with our bucket programmatically — i.e., through the CLI or in a script.

GIF by the author

In the IAM console:

  1. Go to the Policies tab and click "Create a policy."
  2. Click the "JSON" tab and insert the code below:
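
A minimal policy along these lines, granting full s3:* access scoped to a single bucket (your-bucket-name here is a placeholder), could look like this:

    {
        "Version": "2012-10-17",
        "Statement": [
            {
                "Effect": "Allow",
                "Action": "s3:*",
                "Resource": [
                    "arn:aws:s3:::your-bucket-name",
                    "arn:aws:s3:::your-bucket-name/*"
                ]
            }
        ]
    }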

replacing your-bucket-name with your own. If you pay attention, in the Action field of the JSON, we are putting s3:* to allow any interaction with our bucket. This is very broad, so you may want to allow only specific actions. In that case, check out this page of the AWS docs to learn how to limit access.

This policy is only attached to the bucket, so we should connect it to the user as well so that your API credentials work correctly. Here are the instructions:

GIF by the author

In the IAM console:

  1. Go to the Users tab and click on the user we created in the last section.
  2. Click the "Add permissions" button.
  3. Click the "Attach existing policies" tab.
  4. Filter them by the policy we just created.
  5. Tick the policy, review it, and click "Add" one final time.

Step 5: Download the AWS CLI and configure your user

We download the AWS command-line tool because it makes authentication so much easier. Kindly go to this page and download the executable for your platform:

GIF by the author

Run the executable and reopen any active terminal sessions to let the changes take effect. Then, type aws configure:

GIF by the author

Insert your AWS Access Key ID and Secret Access Key, along with the region you created your bucket in (use the CSV file). You can find the region name of your bucket on the S3 page of the console:

By me.

Just press "Enter" when you reach the Default output format field in the configuration. There won't be any output.
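
For reference, the configuration prompts look roughly like this; the key values and region below are placeholders, so use the ones from your own CSV file and bucket:

    $ aws configure
    AWS Access Key ID [None]: AKIAXXXXXXXXXXXXXXXX
    AWS Secret Access Key [None]: xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
    Default region name [None]: us-east-1
    Default output format [None]: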

Step 6: Upload your files

We are nearly there.

Now, we upload a sample dataset to our bucket so that we can download it in a script later:

GIF by the author

It should be easy once you go to the S3 page and open your bucket.
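
If you prefer the terminal, the CLI configured in Step 5 can do the same upload; the file and bucket names here are just placeholders:

    aws s3 cp sample_data.csv s3://your-bucket-name/sample_data.csv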

Step 7: Check if authentication is working

Finally, pip install the Boto3 package and run this snippet:
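
A sketch of such a snippet, assuming the credentials were configured as in Step 5, simply lists your buckets with Boto3:

    import boto3

    # Create an S3 client; it picks up the credentials saved by `aws configure`
    s3 = boto3.client("s3")

    # Print the name of every bucket these credentials can see
    for bucket in s3.list_buckets()["Buckets"]:
        print(bucket["Name"])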

If the output contains your bucket name(s), congratulations: you now have full access to many AWS services through boto3, not just S3.

Using Python Boto3 to download files from the S3 bucket

With the Boto3 package, you have programmatic access to many AWS services such as SQS, EC2, SES, and many aspects of the IAM console.

However, as a regular data scientist, you will mostly need to upload and download data from an S3 bucket, so we will only cover those operations.

Let's begin with the download. After importing the package, create an S3 client using the client function:
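
A minimal sketch of that setup, assuming Boto3 is already installed:

    import boto3

    # The client picks up the credentials and region saved by `aws configure`
    s3 = boto3.client("s3")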

To download a file from an S3 bucket and immediately save it, we can use the download_file function:
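
A sketch of that call; the bucket name and file paths below are placeholders:

    # download_file(Bucket, Key, Filename)
    s3.download_file(
        "your-bucket-name",        # Bucket: the bucket to download from
        "data/sample_data.csv",    # Key: exact path of the file inside the bucket
        "sample_data.csv",         # Filename: local path to save the file to
    )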

There won't be any output if the download is successful. You should pass the exact file path of the file to be downloaded to the Key parameter. The Filename should contain the path you want to save the file to.

Uploading is also very straightforward:
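
Again with placeholder names:

    # upload_file(Filename, Bucket, Key) - same parameters, different order
    s3.upload_file(
        "sample_data.csv",         # Filename: local file to upload
        "your-bucket-name",        # Bucket: destination bucket
        "data/sample_data.csv",    # Key: path the file will have inside the bucket
    )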

The function is upload_file, and you only have to change the order of the parameters from the download function.

Conclusion

I suggest reading the Boto3 docs for more advanced examples of managing your AWS resources. They cover services other than S3 and contain code recipes for the most common tasks with each one.

Thanks for reading!

You can become a premium Medium member using the link below and get access to all of my stories and thousands of others:

Or simply subscribe to my email list:

You can reach out to me on LinkedIn or Twitter for a friendly chat about all things data. Or you can just read another story from me. How about these:


Source: https://towardsdatascience.com/how-to-upload-and-download-files-from-aws-s3-using-python-2022-4c9b787b15f2?source=post_internal_links---------6-------------------------------
