
AWS CLI tool for command line usage

AWS CLI is a common command-line tool for controlling the S3 service. The AWS CLI tool is written in Python.

AWS CLI installation

To install AWS CLI, we recommend following the official AWS documentation, where you can find installation guides for both Linux and Windows.

If you need to install AWS CLI in a virtual environment, you can use this guide.
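For illustration only, a minimal installation into a Python virtual environment on Linux might look as follows (the environment name aws-env is arbitrary; pip installs AWS CLI v1 from PyPI):

# create and activate a virtual environment
python3 -m venv aws-env
source aws-env/bin/activate
# install AWS CLI from PyPI
pip install awscli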

Configuration of AWS CLI

To configure AWS CLI, we recommend using the --profile option, which allows you to define multiple user profiles with different credentials. Of course, you can also use AWS CLI without the --profile option; all commands stay the same, you just omit --profile and AWS CLI then uses the default profile.

In the following, we demonstrate the AWS CLI configuration. The example commands use the --profile option.

In the configuration wizard, you must enter “us-east-1” as the Default region name. If you leave Default region name empty, the config file will not contain the region parameter, and you will then get an error such as “Invalid region: region was not a valid DNS name” when using aws s3.
aws configure --profile test_user
AWS Access Key ID [None]: xxxxxxxxxxxxxxxxxxxxxx
AWS Secret Access Key [None]: xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
Default region name [None]: us-east-1
Default output format [None]: text

AWS Access Key ID - access key, obtained from the data storage administrator
AWS Secret Access Key - secret key, obtained from the data storage administrator
Default region name - insert “us-east-1” here!
Default output format - choose the output format (json, text, table)

For smooth operation, it is necessary to use the --endpoint-url option with the particular S3 endpoint address provided by CESNET.
Multipart S3 upload - a single-part upload is limited to 5 GB, so larger files have to be transferred as multipart uploads. It is best practice to use the aws s3 commands (such as aws s3 cp) for multipart uploads and downloads, because these commands automatically perform multipart transfers based on the file size. By comparison, the aws s3api commands, such as aws s3api create-multipart-upload, should be used only when the aws s3 commands don't cover a specific upload need, for example when the multipart upload involves multiple servers, when a multipart upload is manually stopped and resumed later, or when the aws s3 command doesn't support a required request parameter. More information can be found on the AWS website.
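If needed, the thresholds that aws s3 uses for automatic multipart transfers can be tuned per profile, for example as follows (the 64 MB values are purely illustrative):

# switch to multipart transfers for files larger than 64 MB, using 64 MB parts
aws configure set s3.multipart_threshold 64MB --profile test_user
aws configure set s3.multipart_chunksize 64MB --profile test_user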

Controls of AWS CLI - high-level (s3)

To show the help (available commands), use the help command. The aws s3 tool also offers several advanced functions, see below.

aws s3 help

Operations with Buckets

The bucket name has to be unique. It should contain only lowercase letters, numbers, dashes, and dots. The bucket name must begin with a letter or number, and it cannot contain a dot adjacent to a dash or multiple consecutive dots. We also recommend not using a slash in the bucket name, as it would prevent the bucket from being used via the API.

Bucket creation

aws s3 --profile test_user --endpoint-url https://s3.cl2.du.cesnet.cz mb s3://test1

Bucket listing

aws s3 --profile test_user --endpoint-url https://s3.cl2.du.cesnet.cz ls
2019-09-18 13:30:17 test1

Bucket deletion

aws s3 --profile test_user --endpoint-url https://s3.cl2.du.cesnet.cz rb s3://test1


Operations with files

File upload

aws s3 --profile test_user --endpoint-url https://s3.cl2.du.cesnet.cz cp C:/Users/User/Desktop/test_file.zip s3://test1
upload: Desktop\test_file.zip to s3://test1/test_file.zip

File download

aws s3 --profile test_user --endpoint-url https://s3.cl2.du.cesnet.cz cp s3://test1/test_file.zip C:\Users\User\Downloads\
download: s3://test1/test_file.zip to Downloads\test_file.zip

File deletion

aws s3 --profile test_user --endpoint-url https://s3.cl2.du.cesnet.cz rm s3://test1/test_file.zip
delete: s3://test1/test_file.zip


Directory/Folders operations

Upload the directory

With the following command, the content of the source folder is always copied, regardless of whether the source path ends with a slash. In this respect, the behavior of aws differs from that of rsync. If you want the source directory itself to appear in the destination, add its name to the destination path; the AWS tool then creates the directory in the destination while copying the data, see the example commands below. The same applies to directory downloads and to synchronization via aws s3 sync.
aws s3 --profile test_user --endpoint-url https://s3.cl2.du.cesnet.cz cp C:\Users\User\Desktop\test_dir  s3://test1/test_dir/ --recursive
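For comparison, omitting the directory name from the destination path copies only the content of test_dir directly into the root of the bucket:

aws s3 --profile test_user --endpoint-url https://s3.cl2.du.cesnet.cz cp C:\Users\User\Desktop\test_dir s3://test1/ --recursive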

Directory download

aws s3 --profile test_user --endpoint-url https://s3.cl2.du.cesnet.cz cp s3://test1/test_dir C:\Users\User\Downloads\test_dir\ --recursive

Directory deletion

aws s3 --profile test_user --endpoint-url https://s3.cl2.du.cesnet.cz rm s3://test1/test_dir --recursive

Directory sync
Directory sync to the S3 storage

aws s3 --profile test_user --endpoint-url https://s3.cl2.du.cesnet.cz sync C:\Users\User\Desktop\test_sync  s3://test1/test_sync/

Directory sync from the S3 storage to local PC

aws s3 --profile test_user --endpoint-url https://s3.cl2.du.cesnet.cz sync s3://test1/test_sync/ C:\Users\User\Downloads\test_sync


The aws s3 tool also allows the usage of advanced functionality, for instance presigned URLs for sharing.
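For example, a presigned URL for the previously uploaded file, valid for one hour (the expiry is given in seconds), could be generated like this:

aws s3 --profile test_user --endpoint-url https://s3.cl2.du.cesnet.cz presign s3://test1/test_file.zip --expires-in 3600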

Controls of AWS CLI - api-level (s3api)

The aws tool also provides the aws s3api module. This module offers advanced functions for controlling the S3 service, see below. The configuration of credentials and connection is the same as for aws at the beginning of this guide.

The set of available commands can be obtained with the help option, see the command below. Alternatively, the complete list is available on the AWS website.
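aws s3api help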

Advanced functionality available via aws s3api includes, for instance, bucket policies for sharing or bucket versioning.
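For illustration, versioning on the bucket test1 could be enabled and then checked as follows:

aws s3api --profile test_user --endpoint-url https://s3.cl2.du.cesnet.cz put-bucket-versioning --bucket test1 --versioning-configuration Status=Enabled
aws s3api --profile test_user --endpoint-url https://s3.cl2.du.cesnet.cz get-bucket-versioning --bucket test1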

Example configuration file for the aws tool

After successful configuration, the configuration file should have been created; you can find an example below. The credentials file is located in the same directory.

Windows: C:/Users/User/.aws/config
Linux: /home/user/.aws/config

[profile test_user]
region = us-east-1
output = text
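The accompanying credentials file (C:/Users/User/.aws/credentials on Windows, /home/user/.aws/credentials on Linux) typically looks like this, with the placeholders replaced by your actual keys:

[test_user]
aws_access_key_id = xxxxxxxxxxxxxxxxxxxxxx
aws_secret_access_key = xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx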