The script I posted works. I uploaded about 750 GB of .bak files (the SQL database backs up to 12 .bak files) to S3 first, using Start-Job with a script block to transfer the .bak files in parallel. Once the files were in S3, I used CloudBerry Drive on the EC2 instance to map a drive directly to the S3 bucket and ran the restore command from there. We are implementing a DR plan to AWS.
I use SQL backup encryption when running the backup job locally, so all the files are encrypted at the source, and I also pass the -ServerSideEncryption AES256 parameter to enable server-side encryption in S3. With server-side encryption, S3 decrypts transparently on download, so you do not need to decrypt anything yourself when you retrieve the files from S3.
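For anyone wanting the shape of the parallel upload, here is a minimal sketch of the Start-Job approach with the AES256 parameter. It assumes the AWS Tools for PowerShell module is installed and credentials are already configured; the bucket name, local path, and key prefix are placeholders, not my actual values.

```powershell
# Sketch only: assumes AWS Tools for PowerShell and configured credentials.
$bucket   = 'my-dr-bucket'                      # hypothetical bucket name
$bakFiles = Get-ChildItem 'D:\Backups\*.bak'    # the 12 .bak files

$jobs = foreach ($file in $bakFiles) {
    # One background job per .bak file so the files transfer in parallel
    Start-Job -ScriptBlock {
        param($path, $bucket)
        Write-S3Object -BucketName $bucket `
                       -File $path `
                       -Key ("backups/" + (Split-Path $path -Leaf)) `
                       -ServerSideEncryption AES256   # SSE-S3; S3 decrypts transparently on download
    } -ArgumentList $file.FullName, $bucket
}

# Block until every upload job finishes, then collect the output
$jobs | Wait-Job | Receive-Job
```

Each job is a separate PowerShell process, so for files this large you may want to throttle how many run at once rather than starting all 12 together.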