How to upload data from local machine to Amazon S3

Learn how to seamlessly transfer files from your local machine to Amazon S3, AWS's robust cloud storage solution.

Chris Phan · 8 minute read

Uploading files to Amazon S3, the robust and scalable cloud storage solution offered by Amazon Web Services (AWS), has never been easier. In this comprehensive guide, we'll walk you through the simple steps to effortlessly transfer your files from a local machine to an Amazon S3 bucket. Whether you're a seasoned developer or a beginner, our easy-to-follow instructions will have your files securely stored in the cloud in no time.

At a high level, you can upload files or folders from your local machine to Amazon S3 in three ways: directly through the AWS Management Console, with the AWS CLI, or programmatically with an AWS SDK such as boto3 for Python applications. This article walks through each of these options.

This guide assumes you already have an AWS account and permission to access Amazon S3. If not, please follow the AWS documentation on how to create an AWS account to set up a personal account.

Upload files directly via AWS console

Uploading data via the AWS Console is very easy. Once logged in, navigate to the Amazon S3 service by typing "S3" in the search bar or locating it under the "Storage" section of the AWS Management Console. The Amazon S3 dashboard lists your existing S3 buckets. If you already have a bucket where you want to upload the file, select it. Otherwise, create a new bucket by clicking the "Create bucket" button and following the prompts. Note that bucket names must be unique across all AWS accounts. In this example we will use the bucket name galireview-demobucket.

After choosing or creating a bucket, click its name to open the bucket details page. From here, you can start uploading files by clicking the "Upload" button.

Upload data to Amazon S3 step 1

In the upload screen, click the "Add files" or "Add folder" button to select the files or folders you want to upload from your local machine. You can also drag and drop the files directly into the upload window.

At this point, you can optionally adjust access permissions for the uploaded files and/or folders in the "Set permissions" section; by default, uploaded files are private. Once you have selected your files, click the "Upload" button to start the upload process.

Once the upload is complete, you'll see the uploaded files listed in the bucket. You can click on the files to access their details or download them if needed.

Upload files from AWS CLI

To upload files from your local machine to Amazon S3 using the AWS CLI, you first need to install the AWS CLI. I suggest using CLI version 2, which is the latest release at the time of writing. Follow the installation instructions for your specific operating system.

Once the AWS CLI is installed, open your command-line interface (e.g., Terminal on macOS or Command Prompt on Windows) and run the following command to configure the CLI with your AWS credentials:

aws configure

You'll be prompted for your AWS Access Key ID, AWS Secret Access Key, default region (e.g., us-east-1), and default output format (e.g., json); see the AWS documentation on configuring profiles for more detail. To verify that your credentials work, run: aws sts get-caller-identity. Example output:

{
  "UserId": "AIDA4DUJDFADSAD627TZFZ",
  "Account": "<account_id>",
  "Arn": "arn:aws:iam::<account_id>:user/<user_name>"
}

Now you are ready to upload a file to the S3 bucket with the following command:

aws s3 cp /path/to/local/file s3://bucket-name/path/to/s3/location/
aws s3 cp /galireview/galireview-demobucket-file s3://galireview-demobucket

To upload an entire folder (and all the files within it) to Amazon S3, run the following command:

aws s3 cp /path/to/local/folder s3://bucket-name/path/to/s3/location/ --recursive
aws s3 cp /galireview/galireview-demobucket s3://galireview-demobucket --recursive

To verify the upload, run the following command:

aws s3 ls s3://bucket-name/path/to/s3/location/
aws s3 ls s3://galireview-demobucket
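If you upload the same folder repeatedly, the AWS CLI also offers aws s3 sync, which copies only files that are new or have changed since the last upload. A quick sketch, using the same placeholder paths and bucket as above:

```shell
# Sync a local folder to the bucket, uploading only new or changed files
aws s3 sync /path/to/local/folder s3://bucket-name/path/to/s3/location/

# With --delete, objects removed locally are also deleted from the bucket
aws s3 sync /path/to/local/folder s3://bucket-name/path/to/s3/location/ --delete
```

For large or frequently updated folders, sync is usually faster and cheaper than re-running cp --recursive.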

Upload files using AWS SDK (boto3)

In this section we will use Python 3 and boto3 as the AWS SDK to upload files to an S3 bucket. Boto3 is the official AWS SDK for Python and provides an easy-to-use interface for interacting with various AWS services, including S3.

import boto3

# Replace these values with your AWS credentials and S3 bucket name
aws_access_key_id = 'YOUR_ACCESS_KEY_ID'
aws_secret_access_key = 'YOUR_SECRET_ACCESS_KEY'
bucket_name = 'YOUR_BUCKET_NAME'

# The local file path that you want to upload
local_file_path = '/path/to/local/file.txt'

# The S3 key (file name) under which the file will be stored
s3_key = 'folder/file.txt'

# Create an S3 client
s3 = boto3.client('s3', aws_access_key_id=aws_access_key_id, aws_secret_access_key=aws_secret_access_key)

# Upload the file to S3
try:
    s3.upload_file(local_file_path, bucket_name, s3_key)
    print(f"File {local_file_path} successfully uploaded to S3 bucket {bucket_name} as {s3_key}")
except Exception as e:
    print(f"Error uploading file to S3: {e}")

The code example above uploads a single file to Amazon S3. Make sure to replace the placeholders (YOUR_ACCESS_KEY_ID, YOUR_SECRET_ACCESS_KEY, YOUR_BUCKET_NAME) with your actual AWS credentials and S3 bucket name. However, hard-coding credentials is not best practice; prefer a session token or an IAM role to access the S3 bucket. In that case, drop the aws_access_key_id and aws_secret_access_key arguments and create the client with s3 = boto3.client('s3'), which uses the default credential chain. You can also upload a file or a whole folder by using the following code:

import boto3
import argparse
import os

def upload_file_to_s3(local_file_path, s3_key):
    s3.upload_file(local_file_path, bucket_name, s3_key)

def upload_folder_to_s3(local_folder_path, s3_key_prefix):
    for root, dirs, files in os.walk(local_folder_path):
        for file in files:
            local_file_path = os.path.join(root, file)
            # S3 keys use forward slashes, even when os.sep is '\' on Windows
            s3_key = os.path.join(s3_key_prefix, os.path.relpath(local_file_path, local_folder_path)).replace(os.sep, "/")
            upload_file_to_s3(local_file_path, s3_key)

if __name__ == "__main__":
    parser = argparse.ArgumentParser(description="Upload file or folder to Amazon S3")
    parser.add_argument("mode", choices=["file", "folder"], help="Upload mode (file or folder)")
    parser.add_argument("file_path", help="Path to the file or folder for upload")
    args = parser.parse_args()

    # Replace these values with your AWS credentials and S3 bucket name
    aws_access_key_id = 'YOUR_ACCESS_KEY_ID'
    aws_secret_access_key = 'YOUR_SECRET_ACCESS_KEY'
    bucket_name = 'YOUR_BUCKET_NAME'

    # Create an S3 client
    s3 = boto3.client('s3', aws_access_key_id=aws_access_key_id, aws_secret_access_key=aws_secret_access_key)

    if args.mode == "file":
        # Upload individual file
        s3_key = 'destination_folder/file.txt'
        upload_file_to_s3(args.file_path, s3_key)
        print(f"File {args.file_path} successfully uploaded to S3 bucket {bucket_name} as {s3_key}")
    elif args.mode == "folder":
        # Upload entire folder
        s3_key_prefix = 'destination_folder/'
        upload_folder_to_s3(args.file_path, s3_key_prefix)
        print(f"Folder {args.file_path} successfully uploaded to S3 bucket {bucket_name} with key prefix {s3_key_prefix}")
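One subtlety in the folder-upload code above: os.path builds paths with backslashes on Windows, while S3 object keys conventionally use forward slashes. The sketch below isolates just the key-mapping logic (no AWS calls, so it runs anywhere) and normalizes separators; the build_s3_keys helper and the temporary folder layout are made-up examples, not part of boto3:

```python
import os
import tempfile

def build_s3_keys(local_folder_path, s3_key_prefix):
    """Map every file under local_folder_path to an S3 key, using '/' separators."""
    keys = {}
    for root, dirs, files in os.walk(local_folder_path):
        for file in files:
            local_file_path = os.path.join(root, file)
            rel = os.path.relpath(local_file_path, local_folder_path)
            # S3 keys always use forward slashes, even when os.sep is '\'
            keys[local_file_path] = s3_key_prefix + rel.replace(os.sep, "/")
    return keys

# Build a small example tree in a temporary directory and inspect the mapping
with tempfile.TemporaryDirectory() as tmp:
    os.makedirs(os.path.join(tmp, "sub"))
    for name in ("a.txt", os.path.join("sub", "b.txt")):
        with open(os.path.join(tmp, name), "w") as f:
            f.write("demo")
    keys = build_s3_keys(tmp, "destination_folder/")
    print(sorted(keys.values()))
    # ['destination_folder/a.txt', 'destination_folder/sub/b.txt']
```

Testing the mapping in isolation like this makes it easy to confirm the bucket layout before any real uploads happen.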

Conclusion

With our step-by-step guide, uploading files to Amazon S3 becomes effortless. Empower yourself with the knowledge to utilize Amazon S3's vast storage capabilities securely and efficiently. Start uploading your files today and unlock the full potential of AWS for your storage needs. Simplify your file management in the cloud with ease and convenience.