@yuvalif
Last active July 29, 2024 18:57
Setup

  • get our boto extensions file (from this PR)
wget https://raw.githubusercontent.com/ceph/ceph/15ac96a42d9aa2e828d68b9b21556063ef2c8531/examples/rgw/boto3/service-2.sdk-extras.json
  • copy to the right place:
mkdir -p ~/.aws/models/s3/2006-03-01/
cp ./service-2.sdk-extras.json ~/.aws/models/s3/2006-03-01/ 

  • start cluster using vstart
  • create bucket and log bucket:
aws --endpoint-url http://localhost:8000 s3 mb s3://fish
aws --endpoint-url http://localhost:8000 s3 mb s3://all-log
  • create bucket logging conf:
aws --endpoint-url http://localhost:8000 s3api put-bucket-logging --bucket fish \
  --bucket-logging-status '{"LoggingEnabled": {"TargetBucket": "all-log", "TargetPrefix": "fish/", "ObjectRollTime": 5, "EventType": "ReadWrite", "RecordType": "Standard"}}'
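The same configuration can be built in Python; this is what the SDK extras file from the setup section enables in boto3. A minimal sketch: the ObjectRollTime, EventType and RecordType keys are Ceph RGW extensions to the stock S3 API, and the dict below is checked against the exact JSON payload used in the CLI command above.

```python
import json

# Same configuration as the CLI command above. ObjectRollTime, EventType
# and RecordType are Ceph RGW extensions, carried by the
# service-2.sdk-extras.json file installed in the setup section.
logging_status = {
    "LoggingEnabled": {
        "TargetBucket": "all-log",
        "TargetPrefix": "fish/",
        "ObjectRollTime": 5,       # seconds before a pending log object is committed
        "EventType": "ReadWrite",  # log both read and write operations
        "RecordType": "Standard",
    }
}

# Round-trip through JSON to confirm it matches the CLI payload exactly.
cli_payload = ('{"LoggingEnabled": {"TargetBucket": "all-log", '
               '"TargetPrefix": "fish/", "ObjectRollTime": 5, '
               '"EventType": "ReadWrite", "RecordType": "Standard"}}')
assert json.loads(cli_payload) == logging_status
print(json.dumps(logging_status))
```

With the extras installed, the assumed usage is to pass the dict straight to boto3: boto3.client("s3", endpoint_url="http://localhost:8000").put_bucket_logging(Bucket="fish", BucketLoggingStatus=logging_status).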
  • get bucket logging conf:
aws --endpoint-url http://localhost:8000 s3api get-bucket-logging --bucket fish
  • delete bucket logging conf:
aws --endpoint-url http://localhost:8000 s3api put-bucket-logging --bucket fish --bucket-logging-status '{}'

Test

  • create a file and upload to the bucket:
head -c 512 </dev/urandom > myfile1
head -c 512 </dev/urandom > myfile2
head -c 512 </dev/urandom > myfile3

aws --endpoint-url http://localhost:8000 s3 cp myfile1 s3://fish
aws --endpoint-url http://localhost:8000 s3 cp myfile2 s3://fish
  • wait 5 seconds (the ObjectRollTime set above), then upload another file (this should commit the pending log object to the log bucket):
aws --endpoint-url http://localhost:8000 s3 cp myfile3 s3://fish
  • check that a log object now appears in the log bucket:
aws --endpoint-url http://localhost:8000 s3api list-objects --bucket all-log
  • look at the log (assuming a log object called: fish/2024-07-08-11-24-49/4179-default-default):
aws --endpoint-url http://localhost:8000 s3api get-object --bucket all-log --key "fish/2024-07-08-11-24-49/4179-default-default" tmp && cat tmp
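The log object key encodes the commit time. A small sketch of pulling it apart; the layout (TargetPrefix, timestamp, then a unique suffix) is inferred from the example key above, not from documentation:

```python
from datetime import datetime

# Example key from the listing above:
# <TargetPrefix><commit timestamp>/<unique suffix>
key = "fish/2024-07-08-11-24-49/4179-default-default"

prefix, stamp, suffix = key.split("/")
committed = datetime.strptime(stamp, "%Y-%m-%d-%H-%M-%S")

print(prefix)     # the TargetPrefix ("fish")
print(committed)  # 2024-07-08 11:24:49
```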
  • you should see 2 log lines, one each for myfile1 and myfile2, e.g.:
/fish [08/Jul/2024:11:24:46 +0000] myfile1 put_obj d18005cbd5d50c9de34ef550b720124f
/fish [08/Jul/2024:11:24:52 +0000] myfile2 put_obj d18005cbd5d50c9de34ef550b720124f
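These records can be parsed mechanically. A minimal sketch, assuming the field layout seen in the two lines above: bucket, bracketed timestamp, object key, operation, and a trailing id (possibly the object's ETag; that interpretation is a guess):

```python
import re
from datetime import datetime

LOG_RE = re.compile(
    r"^(?P<bucket>\S+) "      # e.g. /fish
    r"\[(?P<time>[^\]]+)\] "  # e.g. 08/Jul/2024:11:24:46 +0000
    r"(?P<key>\S+) "          # object key
    r"(?P<op>\S+) "           # operation, e.g. put_obj
    r"(?P<id>\S+)$"           # trailing id (ETag?); interpretation assumed
)

line = "/fish [08/Jul/2024:11:24:46 +0000] myfile1 put_obj d18005cbd5d50c9de34ef550b720124f"
record = LOG_RE.match(line).groupdict()
record["time"] = datetime.strptime(record["time"], "%d/%b/%Y:%H:%M:%S %z")
print(record["key"], record["op"])  # myfile1 put_obj
```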
  • test COPY:
aws --endpoint-url http://localhost:8000 s3 cp s3://fish/myfile3 s3://fish/copy-of-myfile3
  • test GET:
aws --endpoint-url http://localhost:8000 s3 cp s3://fish/copy-of-myfile3 .
  • test DELETE:
aws --endpoint-url http://localhost:8000 s3 rm s3://fish/myfile3
  • test MPU:
head -c 50M </dev/urandom > largefile
aws --endpoint-url http://localhost:8000 s3 cp largefile s3://fish/
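For the 50M file the AWS CLI switches to a multipart upload automatically, so the log should record MPU operations rather than a single put. With the CLI's default 8 MiB chunk size (an assumption; multipart_chunksize may be tuned in ~/.aws/config) the object lands in 7 parts. Back-of-the-envelope:

```python
import math

MiB = 1024 * 1024
object_size = 50 * MiB  # size of 'largefile' above (head -c 50M)
chunk_size = 8 * MiB    # AWS CLI default multipart_chunksize (assumed, configurable)

parts = math.ceil(object_size / chunk_size)
print(parts)  # 7 (six 8 MiB parts plus a 2 MiB tail)
```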

Load Test

  • use hsbench to create a bucket:
hsbench -a 0555b35654ad1656d804 -s h7GhxuBLTrlhVUyxSPUKUV8r/2EI4ngqJxD7iBdBYLhwluN30JaT3Q==  -u http://localhost:8000 -bp bk -m i
  • create bucket logging conf (pointing to the same log bucket with different prefix):
aws --endpoint-url http://localhost:8000 s3api put-bucket-logging --bucket bk000000000000 \
  --bucket-logging-status '{"LoggingEnabled": {"TargetBucket": "all-log", "TargetPrefix": "bk000000000000/", "ObjectRollTime": 5}}'
  • make changes on the bucket:
hsbench -a 0555b35654ad1656d804 -s h7GhxuBLTrlhVUyxSPUKUV8r/2EI4ngqJxD7iBdBYLhwluN30JaT3Q==  -u http://localhost:8000 -bp bk -m pd -t 1 -l 1
  • look again in the log bucket:
aws --endpoint-url http://localhost:8000 s3api list-objects --bucket all-log