AWS CLI for uploading large files to Amazon S3:
Amazon S3 is a cost-effective, high-speed, scalable cloud storage service. S3 lets users upload, store, and retrieve any type of file up to 5 TB in size. Amazon provides a web-based console for uploading and managing files in S3 buckets, but uploading very large files (say, more than 100 GB) through the browser is difficult and sometimes fails outright. To address this, Amazon S3 offers the Multipart Upload API, which splits a large file into parts and uploads each part independently.
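The multipart idea can be sketched locally with standard tools: split a file into fixed-size parts, then stitch them back together. This is only a local illustration of the concept, not the actual S3 API (the real thing uses UploadPart and CompleteMultipartUpload calls under the hood):

```shell
# Local illustration of the multipart idea (not the actual S3 API calls):
# create a 50 MB test file, split it into 8 MB parts, then reassemble them.
dd if=/dev/urandom of=sample.data bs=1M count=50 2>/dev/null
split -b 8M sample.data part_        # each part could be uploaded independently
cat part_* > reassembled.data        # the server stitches the parts back together
cmp -s sample.data reassembled.data && echo "reassembled file matches the original"
```

Because each part stands alone, a failed part can be retried on its own instead of restarting the whole transfer, which is what makes multipart uploads practical for very large files.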

The AWS CLI is an open-source tool well suited for pushing huge files to S3. It works across network environments and, on a 100 Mbps link, can sustain upload speeds of around 7 MB/s.
Let's look at how to upload large files using the CLI.

Install the AWS CLI tool:

$ pip install awscli


The above command installs the AWS CLI via pip on a Linux system.
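Before running any s3 commands, the CLI needs credentials. These are typically set up by running aws configure, which stores values like the following in ~/.aws/credentials and ~/.aws/config (the file contents below are an illustrative sketch; all values are placeholders):

```ini
# ~/.aws/credentials  (placeholder values)
[default]
aws_access_key_id = YOUR_ACCESS_KEY_ID
aws_secret_access_key = YOUR_SECRET_ACCESS_KEY

# ~/.aws/config
[default]
region = us-east-1
output = json
```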

Uploading large files:

To upload a 200 GB data file to s3://sys-data-test/store_dir/ (sys-data-test is the S3 bucket, store_dir is a directory, i.e. key prefix, within the bucket), run the following command:

$ aws s3 cp ./200GB.data s3://sys-data-test/store_dir/


Once the upload starts, the CLI prints progress messages like:

Completed 1 part(s) with ... file(s) remaining

As the upload nears completion, the message looks like:

Completed 9896 of 9896 part(s) with 1 file(s) remaining

After the file has been uploaded successfully, it prints:

upload: ./200GB.data to s3://sys-data-test/store_dir/200GB.data
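The part count in the progress output can be roughly reverse-engineered: S3 caps a multipart upload at 10,000 parts, so for a 200 GB file the CLI must use parts larger than its default 8 MB chunk size. The arithmetic below is back-of-the-envelope, assuming 200 GB means 200 × 1024³ bytes:

```shell
# Back-of-the-envelope part-size check (assumes 200 GB = 200 * 1024^3 bytes).
FILE_BYTES=$((200 * 1024 * 1024 * 1024))
PARTS=9896                                   # part count reported by the CLI above
PART_MB=$((FILE_BYTES / PARTS / 1024 / 1024))
echo "approximate part size: ${PART_MB} MB"  # roughly 20 MB, well above the 8 MB default
```

This is why the CLI reported 9896 parts rather than the 25,600 parts that 8 MB chunks would imply: it grows the chunk size automatically to stay under the 10,000-part limit.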