This works with PowerShell 5.1 or later.
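To check which version you are running, inspect the built-in $PSVersionTable variable:
$PSVersionTable.PSVersion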
You need to install the AWS tools beforehand with
Install-Module -Name AWS.Tools.Installer
Install-AWSToolsModule S3
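If you want to confirm the module actually landed on your machine, a quick sanity check with plain PowerShell (nothing AWS-specific):
Get-Module -ListAvailable AWS.Tools.S3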
List all commands of that module and the syntax of a specific function/cmdlet
Get-Command -Module AWS.Tools.S3
Get-Command -Name Write-S3Object -Syntax
Get-Help Write-S3Object -Full
Get-Help Write-S3Object -Parameter *
Then create a profile for the credentials
Set-AWSCredential -AccessKey "abc" -SecretKey "def" -StoreAs "s3test"
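The AWS.Tools.Common module that the installer pulls in also lets you inspect which profiles are stored; a minimal sketch, assuming the -ListProfileDetail switch of Get-AWSCredential:
Get-AWSCredential -ListProfileDetail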
Now you can create a regular script to upload files
Write-S3Object -BucketName "apteco-cloud-customer" -File .\test.txt -ProfileName s3test
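Write-S3Object can also push a whole directory in one call; a sketch assuming its -Folder, -KeyPrefix and -Recurse parameters (the local folder .\data and the prefix backup/ are made up for illustration):
# Upload everything under .\data, prefixing the object keys with backup/
Write-S3Object -BucketName "apteco-cloud-customer" -Folder .\data -KeyPrefix backup/ -Recurse -ProfileName s3test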
To see the list of files you have uploaded, just use this command
# List all files/objects in the bucket
Get-S3Object -ProfileName s3test -BucketName "apteco-cloud-customer"
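Get-S3Object returns S3Object records, so the usual pipeline tricks apply; for example, to show only key, size and modification date:
Get-S3Object -ProfileName s3test -BucketName "apteco-cloud-customer" | Select-Object Key, Size, LastModified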
Without a profile you could use this
# $accessKey and $secretKey are assumed to hold your credentials
$credentials = [Amazon.Runtime.BasicAWSCredentials]::new($accessKey, $secretKey)
$region = "eu-central-1"
Get-S3Object -BucketName "your-bucket-name" -Credential $credentials -Region $region
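The same credential object should also work for uploads, since -Credential and -Region are common parameters shared across the AWS.Tools cmdlets; a minimal sketch:
Write-S3Object -BucketName "your-bucket-name" -File .\test.txt -Credential $credentials -Region $region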
The same can be done in Python with boto3. This needs some packages to be installed beforehand, e.g. with pip from PowerShell or Bash
pip install boto3
Then you can jump directly into your code. This example enters the credentials directly; to see more secure possibilities, please have a look at https://boto3.amazonaws.com/v1/documentation/api/latest/guide/credentials.html
import boto3
# The client must be assigned to a variable so it can be used afterwards
s3_client = boto3.client('s3', aws_access_key_id='abc', aws_secret_access_key='def')
# Upload a local file to the bucket under the given key
s3_client.upload_file(Filename='./hw.txt', Bucket='apteco-cloud-client', Key='hw.txt')
To see all uploaded files, simply print the 'Contents' list of the response
s3_client.list_objects(Bucket='apteco-cloud-client')['Contents']