Category: Amazon S3

Salesforce provides a file-storage limit per user license purchased, and the limit varies from edition to edition. When many files need to be uploaded or attached to records, this limit can be exhausted quickly, so organisations sometimes decide to use an external storage service like Amazon S3.

Users can be given the option to upload files to Amazon S3 from Salesforce and access them later via the uploaded URLs. The S3 REST API is used in this scenario.

Files will be uploaded securely from Salesforce to the Amazon server. After you create your AWS (Amazon Web Services) account, Amazon shares an access key ID and a login secret with you. These credentials will be used to authenticate to S3 from Salesforce.

After logging in to AWS, go to the console and click S3 under the Storage & Content Delivery section.

You can create a bucket where the files will be uploaded.

You cannot create real folders inside a bucket, but a logical folder can be created by including a ‘/’ slash in the object key.
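To illustrate, here is a minimal sketch of how a slash in the key produces a folder-like path in the resulting URL. The bucket name, key, and region endpoint below are hypothetical placeholders, not values from the original article:

```python
# There is no real folder hierarchy in S3 -- the "/" is simply part of the
# object key, and the S3 console renders it as a folder.
# Hypothetical bucket and key names, for illustration only.
bucket = "my-salesforce-files"
key = "Account/0012800000XyZAB/contract.pdf"  # shows as nested folders in the console

# Path-style URL against a region-specific endpoint (see the host note below).
url = "https://s3-ap-southeast-1.amazonaws.com/%s/%s" % (bucket, key)
print(url)
```

The same URL that the file is uploaded to can later serve as its public download link, once the bucket policy described below is in place.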

The host can be a region-specific endpoint such as ‘s3-ap-southeast-1.amazonaws.com’ or the generic ‘s3.amazonaws.com’.

The request needs to be equipped with proper authentication so that it reaches the correct endpoint securely. To achieve this, the Amazon-provided access key ID and login secret are used to build an authorization string, which contains an HMAC signature of the request.
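The authorization string can be sketched as follows, assuming the legacy AWS Signature Version 2 scheme that S3 supported at the time (a string-to-sign built from the request details, signed with HMAC-SHA1 using the secret). The credentials, bucket, and key below are hypothetical placeholders:

```python
import base64
import hashlib
import hmac
from email.utils import formatdate

# Hypothetical credentials -- replace with the access key ID and secret
# shared by Amazon for your AWS account.
ACCESS_KEY_ID = "AKIAEXAMPLE"
SECRET_KEY = "exampleSecretKey"

def s3_put_authorization(bucket, key, content_type, date_str):
    """Build the Signature V2 Authorization header value for a PUT request."""
    # StringToSign layout for Signature V2:
    #   VERB \n Content-MD5 \n Content-Type \n Date \n CanonicalizedResource
    string_to_sign = "PUT\n\n%s\n%s\n/%s/%s" % (content_type, date_str, bucket, key)
    digest = hmac.new(SECRET_KEY.encode(), string_to_sign.encode(),
                      hashlib.sha1).digest()
    signature = base64.b64encode(digest).decode()
    return "AWS %s:%s" % (ACCESS_KEY_ID, signature)

# RFC 1123 date, e.g. "Tue, 01 Mar 2016 12:00:00 GMT"; it must also be sent
# in the request's Date header.
date_str = formatdate(usegmt=True)
auth = s3_put_authorization("my-salesforce-files",
                            "Account/0012800000XyZAB/contract.pdf",
                            "application/pdf", date_str)
print(auth)  # "AWS AKIAEXAMPLE:<base64-encoded signature>"
```

In Salesforce the same string-to-sign and HMAC steps would be performed in Apex (using its `Crypto` class) before setting the `Authorization` header on the outbound HTTP callout.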

Now, the bucket needs to be configured as a website. The objects (files uploaded) should be made publicly readable, so that the same URL used to upload a file can also be used to access it publicly. To do so, you need to write a bucket policy that grants everyone the “s3:GetObject” permission.
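A minimal policy of this kind looks like the following; the bucket name in the `Resource` ARN is a placeholder and should be replaced with your own:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "PublicReadGetObject",
      "Effect": "Allow",
      "Principal": "*",
      "Action": "s3:GetObject",
      "Resource": "arn:aws:s3:::my-salesforce-files/*"
    }
  ]
}
```

The `/*` at the end of the ARN applies the permission to every object in the bucket rather than to the bucket itself.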

Then open the bucket you created and go to Properties. Click Add Bucket Policy, paste the policy into the popup that opens, and save. This will make the files uploaded to the bucket publicly accessible.