
Using AWS CLI

The AWS Command Line Interface (AWS CLI) is an open-source tool that lets you interact with Lyve Cloud S3 buckets. The S3 API provides direct access to Lyve Cloud buckets, and you can also develop shell scripts to manage your resources.

This section guides you through acquiring, testing, and configuring the AWS CLI tool on Linux, macOS, and Windows for use with Lyve Cloud.

Note

Consult your organization's policies and the EULA of the software before downloading third-party applications.

Installing AWS S3 API utility

This section explains how to install the S3 API utility. Ensure you have the privileges to download and install the software.

To install on Linux:

  1. Use the cURL command to download the installation file.

    For Linux x86 (64-bit)
    $ curl "https://awscli.amazonaws.com/awscli-exe-linux-x86_64.zip -o awscliv2.zip
    For Linux ARM
    $ curl https://awscli.amazonaws.com/awscli-exe-linux-aarch64.zip  awscliv2.zip
  2. Run the command to unzip the installer.

    $ unzip awscliv2.zip
  3. Run the command to install the AWS CLI.

    $ sudo ./aws/install
  4. Run the command to:

    Locate the AWS binary
    $ which aws 
    /usr/local/bin/aws
    Verify the AWS CLI version

    Note

    All S3 API commands require AWS CLI version 2.x.x or above.

    $ aws --version 
    
    aws-cli/2.7.3 Python/3.9.11 
    Linux/5.15.0-1004-aws exe/x86_64.ubuntu.22 prompt/off
  5. Save the AWS CLI path in the bash profile file.

    export PATH=/usr/local/bin:$PATH

    Reload the updated profile in your current session to apply the changes.

    $ source ~/.bash_profile
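
    If /usr/local/bin is not already on your PATH in new sessions, you can append the export line to the profile file and reload it. This is a minimal sketch; the exact profile file (~/.bash_profile, ~/.bashrc, or another) depends on your distribution and shell.

    $ echo 'export PATH=/usr/local/bin:$PATH' >> ~/.bash_profile
    $ source ~/.bash_profile
    $ aws --version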

To install on macOS:

  1. Use the cURL command to download the installation file.

    $ curl https://awscli.amazonaws.com/AWSCLIV2.pkg -o AWSCLIV2.pkg
  2. Run the installer and specify the package file as the source.

    $ sudo installer -pkg ./AWSCLIV2.pkg -target /home
  3. Run the command to:

    Locate the AWS binary
    $ which aws 
    
    /usr/local/bin/aws
    Verify the AWS CLI version

    Note

    All S3 API commands require AWS CLI version 2.x.x or above.

    $ aws --version 
    

    This is sample output of the version check.

    aws-cli/2.4.5 Python/3.8.8
    Darwin/18.7.0 botocore/2.4.5
  4. Save the AWS CLI path in the bash profile file.

    export PATH=/usr/local/bin:$PATH

    Reload the updated profile in your current session to apply the changes.

    $ source ~/.bash_profile

To install on Windows:

  1. Download the AWS CLI MSI installer:

    https://awscli.amazonaws.com/AWSCLIV2.msi
  2. Run the msiexec command to run the MSI installer.

    C:\> msiexec.exe /i https://awscli.amazonaws.com/AWSCLIV2.msi
    
  3. Run the command to:

    Locate aws.exe
    Get-Command aws
    CommandType Name Version Source
    ----------- ---- ------- ------
    Application aws.exe 0.0.0.0 C:\Program Files\Amazon\AWSCLIV2\aws.exe
    Verify the AWS CLI version

    Note

    All S3 API commands require AWS CLI version 2.x.x or above.

    C:\> aws --version
    
    aws-cli/2.4.5 Python/3.8.8 Windows/10 exe/AMD64 prompt/off
Configuring S3 API
Prerequisites
  • Create a service account to generate the Access key and Secret key. For more information, see Creating service accounts.

    • Access key

    • Secret key

  • Lyve Cloud S3 API endpoint. For more information, see S3 API endpoints.

To configure the S3 API:
  1. Use the aws configure command to set up the Lyve Cloud account.

  2. Enter the following:

    • Access key

    • Secret key

    • Region: the Lyve Cloud region in which to perform S3 operations. For more information, see S3 API endpoints.

    • Output format: the output format can be text, json, or table.

Example: Create a profile (profile name - adminuser)

The following example creates a profile named adminuser.

$ aws configure --profile adminuser
AWS Access Key ID [None]: xxxxxxxxxxxxxxx
AWS Secret Access Key [None]: xxxxxxxxxxxxxxxxx
Default region name [None]: us-east-1
Default output format [None]: text

Note

You must specify --profile to access Lyve Cloud using the S3 API. Ensure you create a default profile for Lyve Cloud in order to work with Lyve Cloud. Following successful configuration, you are ready to use the AWS CLI to perform file system operations. Since the AWS CLI works with an AWS URL by default, you must override the URL for Lyve Cloud.

All configuration information is stored locally in your home directory, in the files ~/.aws/config and ~/.aws/credentials.
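
For reference, after running the example configuration above, those two files typically contain entries similar to the following (the labels indicate which file; the keys shown are placeholders and the values depend on what you entered):

# ~/.aws/credentials
[adminuser]
aws_access_key_id = xxxxxxxxxxxxxxx
aws_secret_access_key = xxxxxxxxxxxxxxxxx

# ~/.aws/config
[profile adminuser]
region = us-east-1
output = text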

You can access the features of Amazon S3 using the AWS CLI. The AWS CLI provides two tiers of commands for accessing Amazon S3.

  • AWS S3: These are high-level commands that simplify performing common tasks.

  • AWS S3 API: These expose direct access to all Amazon S3 API operations.

For more information, see Leveraging the s3 and s3api Commands.
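
For example, the same listing can be performed at either tier. The following sketch assumes the adminuser profile and the us-east-1 endpoint used elsewhere in this section; the s3 form prints a human-readable summary, while the s3api form returns full JSON metadata for each object.

$ aws s3 ls s3://testbkt0 --profile adminuser --endpoint-url=https://s3.us-east-1.lyvecloud.seagate.com
$ aws s3api list-objects-v2 --bucket testbkt0 --profile adminuser --endpoint-url=https://s3.us-east-1.lyvecloud.seagate.com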

Using AWS S3
Creating a bucket

Use the syntax below to create a bucket:

$ aws s3 mb s3://<NEW BUCKET NAME> --profile <profile_name> --endpoint-url=https://s3.us-east-1.lyvecloud.seagate.com

Example

$ aws s3 mb s3://testbkt0 --profile <profile_name> --endpoint-url=https://s3.us-east-1.lyvecloud.seagate.com
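
To confirm that the bucket was created, you can list all buckets for the profile; the new bucket should appear in the output. This assumes the same profile and endpoint as above.

$ aws s3 ls --profile <profile_name> --endpoint-url=https://s3.us-east-1.lyvecloud.seagate.com
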
Listing Objects

Note

If you have the correct bucket name, access key and secret key, the ls command will work as expected and show your top-level bucket listing.

  1. Run a simple ls command against a bucket named testbkt0. The resulting command is as follows:

    $ aws s3 ls s3://testbkt0 --profile <profile_name> --endpoint-url=https://s3.us-east-1.lyvecloud.seagate.com
    

    The output of the above command is:

    PRE folder1/
    2021-07-01 21:04:43 411352 myfile.zip
    2021-07-01 21:04:26 19 testfile.txt
  2. To see the entire directory tree, use the --recursive option:

    $ aws s3 ls s3://testbkt0 --profile <profile_name> --recursive --endpoint-url=https://s3.us-east-1.lyvecloud.seagate.com
    2021-06-29 22:27:26 0 folder1/
    2021-07-01 20:35:26 411352 folder1/SystemsControllersBrief.zip
    2021-07-01 21:04:43 411352 myfile.zip
    2021-07-01 21:04:26 19 testfile.txt
Download a specific file

To download a specific file by name, use the “cp” command. Note that the destination folder in this example is the current directory, denoted by “.”.

$ aws s3 cp s3://testbkt0/testfile.txt . --profile <profile_name> --endpoint-url=https://s3.us-east-1.lyvecloud.seagate.com
download: s3://testbkt0/testfile.txt to ./testfile.txt
Download entire directories

To download an entire directory or prefix, including its sub-directories or sub-prefixes and the files or objects they contain, use the --recursive option, as with the ls command:

$ aws s3 cp s3://testbkt0/ . --recursive --profile <profile_name> --endpoint-url=https://s3.us-east-1.lyvecloud.seagate.com
download: s3://testbkt0/testfile.txt to ./testfile.txt
download: s3://testbkt0/folder1/SystemsControllersBrief.zip to folder1/SystemsControllersBrief.zip
download: s3://testbkt0/myfile.zip to ./myfile.zip
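
As an alternative (not part of the steps above), the aws s3 sync command downloads only the objects that are missing locally or that have changed, which can be faster when repeating a large download:

$ aws s3 sync s3://testbkt0/ . --profile <profile_name> --endpoint-url=https://s3.us-east-1.lyvecloud.seagate.com
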
Find a specific file by name
  • If the path of the file is known, you can issue an ls or cp command against that file. The following example downloads a file, using its full path, into the current working directory.

    $ aws s3 cp s3://testbkt0/subfolder1/subfolder2/find_this_file.txt . --profile <profile_name> --endpoint-url=https://s3.us-east-1.lyvecloud.seagate.com
  • If the path of a file is not known, you can locate it using part of the file's name. Issue the aws s3 ls command and pipe the output to grep <part of file name>.

    Make sure to include the --recursive option.

    The following example illustrates this, where test is part of the file's name (a sketch that downloads the match directly follows the example).

    $ aws s3 ls s3://testbkt0 --recursive --profile <profile_name> --endpoint-url=https://s3.us-east-1.lyvecloud.seagate.com | grep test
    2021-07-01 21:04:26 19 testfile.txt
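
    If you want to download the match directly, a small sketch along these lines can work. It assumes the matched key contains no spaces and takes only the first match.

    $ KEY=$(aws s3 ls s3://testbkt0 --recursive --profile <profile_name> --endpoint-url=https://s3.us-east-1.lyvecloud.seagate.com | grep test | head -n 1 | awk '{print $4}')
    $ aws s3 cp "s3://testbkt0/$KEY" . --profile <profile_name> --endpoint-url=https://s3.us-east-1.lyvecloud.seagate.com
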
Uploading a file to a bucket

Note

The object name can contain special characters such as @ # * $ % & ! ? , ; ' " | + = < > ^ ( ) { } [ ] as well as alphanumeric characters (0-9, a-z, A-Z). However, using any of these special characters can cause issues due to limiting factors of the S3 client SDK.

$ aws s3 cp myfile s3://testbkt0 --profile <profile_name> --endpoint-url=https://s3.us-east-1.lyvecloud.seagate.com
upload: ./myfile to s3://testbkt0/myfile
Uploading a folder

To upload a folder and all its contents, use the following command. Note that without the folder name in the destination path, only the contents of the source folder are copied directly into the bucket.

$ aws s3 cp localdir s3://testbkt0/localdir --recursive --profile <profile_name> --endpoint-url=https://s3.us-east-1.lyvecloud.seagate.com
upload: localdir/local to s3://testbkt0/localdir/local
$ aws s3 ls s3://testbkt0/ --recursive --profile <profile_name> --endpoint-url=https://s3.us-east-1.lyvecloud.seagate.com
2021-06-29 22:27:26 0 folder1/
2021-07-01 20:35:26 411352 folder1/SystemsControllersBrief.zip
2021-07-06 16:02:51 20 local
2021-07-06 16:05:58 20 localdir/local
Copying a folder/prefix

  1. Use this command to copy a folder using the AWS CLI.

    $ aws s3 cp folder1 s3://lyve-bucket/folder2 --recursive --profile <profile_name> --endpoint-url=https://s3.us-east-1.lyvecloud.seagate.com
  2. Alternatively, the profile and endpoint parameters can be placed before the s3 command; the --recursive option still belongs with cp.

    $ aws --profile <profile_name> --endpoint-url=https://s3.us-east-1.lyvecloud.seagate.com s3 cp folder1 s3://lyve-bucket/folder2 --recursive

Command using an alias

  1. The alias we are using is:

    $ alias aws_east='aws --profile <profile_name> --endpoint-url=https://s3.us-east-1.lyvecloud.seagate.com s3'
  2. Copy the folder to a bucket (placement of the --recursive option is important).

    $ aws_east cp folder1/ s3://lyve-bucket/folder --recursive
    upload: folder1/SystemsControllersBrief.zip to s3://lyve-bucket/folder/SystemsControllersBrief.zip

    Note the following:

    1. The --recursive option is necessary for copying the folder.

    2. Specifying the destination folder name is also necessary. In the example below, the contents of folder1 are copied into the bucket directly because the destination folder is not specified.

      $ aws_east cp folder1/ s3://lyve-bucket/ --recursive
      upload: folder1/SystemsControllersBrief.zip to s3://lyve-bucket/SystemsControllersBrief.zip

    3. Without the --recursive option, the folder copy does not work.

      $ aws_east cp folder1/ s3://lyve-bucket/

      upload failed: folder1/ to s3://lyve-bucket/

      Parameter validation failed: Invalid length for parameter Key, value: 0, valid min length: 1

Using S3 API with Lyve Cloud

This topic describes some of the commands you can use to manage Lyve Cloud S3 buckets and objects using the S3 API commands.

Creating a bucket

The create-bucket S3 API creates a bucket.

Use the following command to create a bucket:

$ aws s3api create-bucket --bucket <bucket_name> --profile <profile_name> --endpoint <endpoint>

Example: The following example creates a bucket named lyve-bucket.
$ aws s3api create-bucket --bucket lyve-bucket --profile adminuser --endpoint https://s3.us-east-1.lyvecloud.seagate.com

The output displays the endpoint to access the bucket lyve-bucket.

{
    "Location": "http://s3.us-east-1.lyvecloud.seagate.com/lyve-bucket"
}

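A quick way to check that the new bucket exists and is accessible is the head-bucket S3 API, which produces no output and returns exit status 0 on success. A sketch assuming the same profile as above:

$ aws s3api head-bucket --bucket lyve-bucket --profile adminuser --endpoint-url https://s3.us-east-1.lyvecloud.seagate.com
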
Listing buckets

The list-buckets S3 API lists the buckets.

Use the following command to list buckets:

$ aws s3api list-buckets --profile <profile_name> --endpoint <endpoint>
Example: The following example lists all the buckets in Lyve Cloud.
$ aws s3api list-buckets --profile adminuser --endpoint https://s3.us-east-1.lyvecloud.seagate.com
{
    "Buckets": [
        {
            "Name": "lyve-bucket",
            "CreationDate": "2022-03-11T13:57:33.598000+00:00"
        }
    ],
    "Owner": {
        "DisplayName": "",
        "ID": "02d6176db174dc93cb1b899f7c6078f08654445fe8cf1b6ce98d8855f66bdbf4"
    }
}

Listing objects

The list-objects S3 API lists the objects in a bucket.

Use the following command to list objects:

$ aws s3api list-objects --bucket <bucket_name> --profile <profile_name> --endpoint <endpoint>

Example: The following example displays the names of objects in the lyve-bucket bucket.
$ aws s3api list-objects --bucket lyve-bucket --profile adminuser --endpoint https://s3.us-east-1.lyvecloud.seagate.com
{
    "Contents": [
        {
            "Key": "testupload1.txt",
            "LastModified": "2022-03-11T15:31:52.020000+00:00",
            "ETag": "\"473c9ae7c14380c5d8f80c6d8042ae6c\"",
            "Size": 41475,
            "StorageClass": "STANDARD",
            "Owner": {
                "DisplayName": "",
                "ID": "02d6176db174dc93cb1b899f7c6078f08654445fe8cf1b6ce98d8855f66bdbf4"
            }
        }
    ]
}

Alternatively, you can use the following command to list objects using the ListObjectsV2 API.

$ aws s3api list-objects-v2 --bucket lyve-bucket --profile adminuser --endpoint https://s3.us-east-1.lyvecloud.seagate.com
{
    "Contents": [
        {
            "Key": "testupload1.txt",
            "LastModified": "2022-03-11T15:31:52.020000+00:00",
            "ETag": "\"473c9ae7c14380c5d8f80c6d8042ae6c\"",
            "Size": 41475,
            "StorageClass": "STANDARD",
            "Owner": {
                "DisplayName": "",
                "ID": ""
            }
        }
    ]
}

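If you only need the object keys rather than the full JSON, the CLI's global --query option (a JMESPath expression) can filter the response on the client side. A sketch assuming the same bucket and profile:

$ aws s3api list-objects-v2 --bucket lyve-bucket --query 'Contents[].Key' --output text --profile adminuser --endpoint-url https://s3.us-east-1.lyvecloud.seagate.com
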
Copying objects

The copy-object S3 API creates a copy of an object that is already stored.

Use the following command to copy objects:

$ aws s3api copy-object --copy-source <source-bucket/object-name> --key <object-name> --bucket <destination-bucket> --profile <profile_name> --endpoint <endpoint>
Example: The following command copies an object from lyve-bucket to lyve-bucket2.
$ aws s3api copy-object --copy-source lyve-bucket/testupload1.txt --key testupload1.txt --bucket lyve-bucket2 --profile adminuser --endpoint https://s3.us-east-1.lyvecloud.seagate.com
{
    "CopyObjectResult": {
        "ETag": "\"473c9ae7c14380c5d8f80c6d8042ae6c\"",
        "LastModified": "2022-03-14T08:45:03.129000+00:00"
    }
}

Uploading an object to a bucket

You must have WRITE permissions on the bucket to add an object to it.

Note

The object name can contain special characters such as @ # * $ % & ! ? , ; ' " | + = < > ^ ( ) { } [ ] as well as alphanumeric characters (0-9, a-z, A-Z). However, using any of these special characters can cause issues due to limiting factors of the S3 client SDK.

Use the following command to upload an object:

$ aws s3api put-object --bucket <bucket_name> --key <Object_key> --body <Object_data> --profile <profile_name> --endpoint <endpoint>
Example: The following example uploads an object to the bucket.
$ aws s3api put-object --bucket lyve-bucket --key testupload1.txt --body testupload1.txt --profile adminuser --endpoint https://s3.us-east-1.lyvecloud.seagate.com

The output displays the ETag and the encryption method with which the object is encrypted.

{
    "ETag": "\"473c9ae7c14380c5d8f80c6d8042ae6c\"",
    "ServerSideEncryption": "AES256"
}

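To confirm the upload and inspect the object's metadata without downloading it, you can use the head-object S3 API. A sketch with the same bucket and key as above:

$ aws s3api head-object --bucket lyve-bucket --key testupload1.txt --profile adminuser --endpoint-url https://s3.us-east-1.lyvecloud.seagate.com
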
Deleting objects

The delete-object S3 API deletes an object from the bucket. Once you delete an object, you can no longer retrieve it.

Use the following command to delete objects:

$ aws s3api delete-object --bucket <bucket_name> --key <object-name> --profile <profile_name> --endpoint <endpoint>
Example: The following example deletes an object named testupload2.txt from a bucket named lyve-bucket.
$ aws s3api delete-object --bucket lyve-bucket --key testupload2.txt --profile adminuser --endpoint https://s3.us-east-1.lyvecloud.seagate.com

Use the following command to validate that the object has been deleted:

$ aws s3api list-objects --bucket lyve-bucket --profile adminuser --endpoint https://s3.us-east-1.lyvecloud.seagate.com

The output no longer shows the file testupload2.txt.

{
    "Contents": [
        {
            "Key": "testupload1.txt",
            "LastModified": "2022-03-11T15:31:52.020000+00:00",
            "ETag": "\"473c9ae7c14380c5d8f80c6d8042ae6c\"",
            "Size": 41475,
            "StorageClass": "STANDARD",
            "Owner": {
                "DisplayName": "",
                "ID": "02d6176db174dc93cb1b899f7c6078f08654445fe8cf1b6ce98d8855f66bdbf4"
            }
        }
    ]
}

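To delete several objects in a single call, the delete-objects S3 API accepts a JSON list of keys. A minimal sketch; the key names here are only examples.

$ aws s3api delete-objects --bucket lyve-bucket --delete '{"Objects":[{"Key":"testupload2.txt"},{"Key":"testupload3.txt"}],"Quiet":true}' --profile adminuser --endpoint-url https://s3.us-east-1.lyvecloud.seagate.com
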
Deleting a bucket

The delete-bucket S3 API deletes the S3 bucket. You must delete all the objects in the bucket before deleting the bucket. Once you delete a bucket, you cannot retrieve the bucket or the objects within it.

Use the following command to delete a bucket:

$ aws s3api delete-bucket --bucket <bucket_name> --profile <profile_name> --endpoint <endpoint>
Example: The following example deletes a bucket named lyve-bucket2.
$ aws s3api delete-bucket --bucket lyve-bucket2 --profile adminuser --endpoint https://s3.us-east-1.lyvecloud.seagate.com

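Because the bucket must be empty before it can be deleted, one common pattern is to remove all objects with the high-level rm command and then delete the bucket. A sketch assuming the same bucket, profile, and endpoint (if bucket versioning is enabled, object versions and delete markers may also need to be removed first):

$ aws s3 rm s3://lyve-bucket2 --recursive --profile adminuser --endpoint-url https://s3.us-east-1.lyvecloud.seagate.com
$ aws s3api delete-bucket --bucket lyve-bucket2 --profile adminuser --endpoint-url https://s3.us-east-1.lyvecloud.seagate.com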