This article covers two ways to upload multiple files from a local system to an AWS S3 bucket: 1) the multipart upload API, which uploads a single object in parts for better speed and recovery from network issues without restarting the upload, and 2) the AWS CLI, which can recursively copy a local directory to an S3 bucket and keep going even if the network connection drops temporarily. The AWS CLI method is demonstrated as an easy way to bulk-upload files to S3 with resumable uploads.
Amazon Simple Storage Service (S3) is one of
the most widely used object storage services,
thanks to its scalability, security,
performance, and data availability. That
means customers of any size and industry,
whether websites, mobile apps, enterprise
applications, or IoT devices, can use it to
store any volume of data.
Amazon S3 provides easy-to-use
management features so you can
appropriately organize your data to fulfill
your business requirements.
Many of us use AWS S3 buckets on a daily
basis, and one of the most common challenges
when working with cloud storage is syncing
or uploading multiple objects at once. Yes,
we can drag and drop or upload files directly
on the bucket page, as in the image below.
But the problem with this approach is that if
you are uploading large objects over an
unstable network and a network error occurs,
you have to restart the upload from
the beginning.
Suppose you are uploading 2,000+ files, the
upload has been running for the last hour,
and then you find out it has failed:
re-uploading everything becomes a very
time-consuming process. To overcome this
problem we have two solutions, which we
discuss in the next sections.
Prerequisites
AWS Account
Installed AWS CLI
Multipart upload lets you upload a
single object as a set of parts. Each part
can be uploaded independently and in any
order.
If the transmission of any part fails,
you can retransmit that part
without affecting the other parts. So it is
good practice to use multipart uploads
instead of uploading the object in a single
operation.
Advantages of using multipart upload:
Improved throughput: parts can be uploaded
in parallel, which speeds up the upload
Fast recovery from network issues:
no need to re-upload from the beginning
Ability to pause and resume object uploads
Ability to start uploading an object while
you are still creating it
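The low-level flow can be sketched with the `aws s3api` commands (`create-multipart-upload`, `upload-part`, `complete-multipart-upload`). The bucket and key names below are placeholders; the split step runs locally, while the `s3api` calls are shown as comments because they assume configured credentials:

```shell
# Placeholder bucket and object key for illustration only.
BUCKET=my-example-bucket
KEY=big-file.bin

# Create a 15 MB sample file and split it into 5 MB parts
# (5 MiB is the S3 minimum part size for every part but the last).
dd if=/dev/zero of=big-file.bin bs=1048576 count=15 2>/dev/null
split -b 5242880 big-file.bin part-
ls part-*    # three parts: part-aa part-ab part-ac

# With credentials configured, the upload itself would look like:
#   UPLOAD_ID=$(aws s3api create-multipart-upload \
#       --bucket "$BUCKET" --key "$KEY" --query UploadId --output text)
#   aws s3api upload-part --bucket "$BUCKET" --key "$KEY" \
#       --part-number 1 --body part-aa --upload-id "$UPLOAD_ID"
#   ...repeat for each part, collecting the returned ETags...
#   aws s3api complete-multipart-upload --bucket "$BUCKET" --key "$KEY" \
#       --upload-id "$UPLOAD_ID" --multipart-upload file://parts.json
```

Because each part is an independent request, a failed part can simply be re-sent with the same part number; the other parts are unaffected.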
We can use the multipart upload API
with different technologies: the AWS SDKs
or the REST API. For more details, see the
AWS documentation.
Install AWS CLI
First, we need to install the AWS CLI; with
it, we can perform S3 copy operations. If
you don't know how to install the CLI, follow
this guide: Install AWS CLI.
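As a quick sanity check (assuming a POSIX shell), you can confirm the CLI is on your PATH before continuing:

```shell
# Confirm the AWS CLI is installed before using it.
if command -v aws >/dev/null 2>&1; then
    aws --version    # prints something like: aws-cli/2.x.x Python/3.x.x ...
else
    echo "aws CLI not found: install it first"
fi
```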
Configure AWS Profile
Now it's time to configure the AWS
profile. For that, use the `aws configure`
command. It prompts for your AWS
credentials, which you can find under
IAM -> Users -> Security credentials tab
in the AWS console.
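Under the hood, `aws configure` writes two INI files. The sketch below recreates their layout with placeholder values (real keys come from the IAM console); it writes to a temp directory so it does not touch a real `~/.aws` setup:

```shell
# `aws configure` prompts for four values and persists them in two
# INI files. Equivalent files, with placeholder values, look like this.
# Writing to /tmp/aws-demo here so we don't overwrite real settings;
# in actual use they live at ~/.aws/credentials and ~/.aws/config.
mkdir -p /tmp/aws-demo

cat > /tmp/aws-demo/credentials <<'EOF'
[default]
aws_access_key_id = AKIAEXAMPLEKEY
aws_secret_access_key = exampleSecretKey123
EOF

cat > /tmp/aws-demo/config <<'EOF'
[default]
region = us-east-1
output = json
EOF
```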
We are done configuring the AWS
profile. Now you can see your S3 bucket
named "bacancy-s3-blog" using the
list-buckets command below.
List All the Existing Buckets in S3
Use the below command to list all the
existing buckets.
aws s3 ls
Copy a Single File to an AWS S3 Bucket
Use the below command to copy a single file
to the S3 bucket.
aws s3 cp file.txt s3://<your-bucket-name>
AWS S3 Copy Multiple Files
Use the below command to copy multiple
files from a local directory to an S3
bucket.
aws s3 cp <your-directory-path> s3://<your-bucket-name> --recursive
Note: The `--recursive` flag tells
`aws s3 cp` to copy all files in the
directory recursively.
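To make the behavior concrete, here is a minimal sketch with a hypothetical `photos/` directory and a placeholder bucket name; the `aws` invocations are shown as comments since they require valid credentials:

```shell
# Hypothetical local layout for a recursive copy.
mkdir -p photos/2023 photos/2024
echo demo > photos/2023/a.jpg
echo demo > photos/2024/b.jpg

# Copy everything under photos/ into the bucket, keeping the
# directory structure as key prefixes:
#   aws s3 cp photos/ s3://my-example-bucket/photos/ --recursive
#
# Filters can restrict what gets copied, e.g. only .jpg files:
#   aws s3 cp photos/ s3://my-example-bucket/photos/ --recursive \
#       --exclude "*" --include "*.jpg"
#
# `aws s3 sync` is similar but skips files already present in the
# bucket, so re-running it after a dropped connection picks up
# where the previous run left off:
#   aws s3 sync photos/ s3://my-example-bucket/photos/
```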
As you can see in the above video, even if
the network connection is lost and then
reconnected, the process carries on
without losing any files.
So, this was how to copy multiple files
from a local system to an AWS S3 bucket using
the AWS CLI. For more such tutorials, feel
free to visit the Cloud tutorials page and
start learning! Feel free to contact us if
you have any suggestions or feedback.