This works for PowerShell 5.1 and later.
You need to install the AWS Tools for PowerShell modules beforehand with
Install-Module -Name AWS.Tools.Installer
Install-AWSToolsModule AWS.Tools.S3
Then create a profile for the credentials
Set-AWSCredential -AccessKey "abc" -SecretKey "def" -StoreAs "s3test"
Now you can create a regular script to upload files
Write-S3Object -BucketName "apteco-cloud-customer" -File .\test.txt -ProfileName s3test
To see the list of files you have uploaded, just use this command
# List all files/objects in bucket
Get-S3Object -ProfileName s3test -BucketName "apteco-cloud-customer"
The same can also be done with Python and boto3. The boto3 package needs to be installed first, e.g. with pip from PowerShell or Bash
pip install boto3
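To confirm the installation worked, you can print the installed version from a Python prompt:
import boto3
print(boto3.__version__)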
Then you can jump straight into your code. This example enters the credentials directly; for more secure possibilities, please have a look at https://boto3.amazonaws.com/v1/documentation/api/latest/guide/credentials.html
import boto3
# Create an S3 client with explicit (placeholder) credentials
s3_client = boto3.client('s3', aws_access_key_id='abc', aws_secret_access_key='def')
# Upload the local file ./hw.txt into the bucket under the key 'hw.txt'
s3_client.upload_file(Filename='./hw.txt', Bucket='apteco-cloud-client', Key='hw.txt')
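If you prefer not to hard-code the keys, boto3 can also read them from a named profile in the shared AWS credentials file (see the credentials guide linked above). Here is a minimal sketch, assuming a profile called s3test already exists on your machine:
# Build a session from a named profile instead of hard-coded keys
# (the profile name 's3test' is only an example and must exist locally)
import boto3
session = boto3.Session(profile_name='s3test')
s3_client = session.client('s3')
s3_client.upload_file(Filename='./hw.txt', Bucket='apteco-cloud-client', Key='hw.txt')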
To see all uploaded files, simply output the 'Contents' list of the response
s3_client.list_objects(Bucket='apteco-cloud-client')['Contents']
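If you want a more readable listing, you can loop over the entries instead. The sketch below uses a paginator over list_objects_v2, which also handles buckets with more than 1000 objects; the bucket name is the same placeholder as above:
# Print key, size and last-modified date for every object in the bucket
paginator = s3_client.get_paginator('list_objects_v2')
for page in paginator.paginate(Bucket='apteco-cloud-client'):
    for obj in page.get('Contents', []):
        print(obj['Key'], obj['Size'], obj['LastModified'])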