Using Backblaze S3 Compatible API for presigned requests

Hi,

We are trying to use the Backblaze S3 Compatible API for our presigned uploads with Shrine. We have an endpoint for uploading images to S3 buckets, and it works flawlessly. This new upload feature is for video, and we want to reduce our bandwidth costs by going with Backblaze, especially since they have an S3-compatible API.

Apparently the CORS option isn't available in the API:
https://www.backblaze.com/b2/docs/s3_compatible_api.html

so the presigned upload is being blocked. Is there any way to get around this in Shrine? I have made the bucket on Backblaze public with the following CORS rules:

```json
"corsRules": [
  {
    "allowedHeaders": [
      "range"
    ],
    "allowedOperations": [
      "b2_download_file_by_id",
      "b2_upload_part",
      "b2_upload_file",
      "b2_download_file_by_name"
    ],
    "allowedOrigins": [
      "*"
    ],
    "corsRuleName": "downloadFromAnyOriginWithUpload",
    "exposeHeaders": [
      "authorization",
      "x-bz-file-name",
      "x-bz-content-sha1"
    ],
    "maxAgeSeconds": 3600
  }
]
```

But it is still blocked. Here is our configuration for the S3 presigned URL:

```ruby
Shrine.plugin :presign_endpoint, presign_options: -> (request) {
  filename = request.params["filename"]
  type     = request.params["type"]

  {
    content_disposition: ContentDisposition.inline(filename),
    content_type: type,
    content_length_range: 0..(10*1024*1024), # limit upload size to 10 MB
  }
}
```

Is there anything we can do in shrine that will overcome this limitation from backblaze?

How would you do it using the aws-sdk-s3 gem directly? Shrine’s S3 storage is a relatively thin wrapper around aws-sdk-s3 functionality, so once we figure that out, it should be easy to implement it in Shrine.

As far as I can tell, they have structured their API to be compatible with S3 endpoints. I have a small service that basically fetches some S3 objects; all you have to do is override your credentials and the endpoint:

```ruby
def initialize(bucket:)
  client = Aws::S3::Client.new(
    access_key_id:     ENV["BACKBLAZE_ID"],
    secret_access_key: ENV["BACKBLAZE_KEY"],
    region:            "us-west-2",
    endpoint:          "https://s3.us-west-000.backblazeb2.com"
  )

  resource = Aws::S3::Resource.new(client: client)
  @bucket = resource.bucket(bucket)
end

def get_obj_url(folder, limit: 15, expires_in: 86400)
  @bucket.objects(prefix: folder).limit(limit).map do |o|
    o.presigned_url(:get, expires_in: expires_in)
  end
end
```

@janko, thanks for writing such a flexible and powerful library.

I got it figured out.

With the original S3 presigned URL upload, Amazon allows you to upload via AJAX if you enable CORS on your bucket. However, the Backblaze S3 compatible API doesn't allow CORS downloads or uploads at all.

To get around this, I made a small service module in my Rails app that authenticates against the native Backblaze B2 API and returns an authorized upload URL to the front end.
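For anyone wanting to replicate this, here is a rough sketch of such a service using the documented `b2_authorize_account` and `b2_get_upload_url` calls (the class name and ENV var names are my own placeholders, not from the original post):

```ruby
require "net/http"
require "json"
require "uri"

# Sketch: fetches a native B2 upload URL + token for the front end.
# Class name and ENV var names are placeholders.
class BackblazeUploadUrl
  AUTH_URL = "https://api.backblazeb2.com/b2api/v2/b2_authorize_account"

  def initialize(key_id: ENV["B2_KEY_ID"], app_key: ENV["B2_APP_KEY"], bucket_id: ENV["B2_BUCKET_ID"])
    @key_id, @app_key, @bucket_id = key_id, app_key, bucket_id
  end

  # Returns the parsed JSON, which includes "uploadUrl" and "authorizationToken".
  def call
    auth = authorize_account
    get_upload_url(auth["apiUrl"], auth["authorizationToken"])
  end

  private

  def authorize_account
    uri = URI(AUTH_URL)
    req = Net::HTTP::Get.new(uri)
    req.basic_auth(@key_id, @app_key)  # b2_authorize_account uses HTTP basic auth
    res = Net::HTTP.start(uri.host, uri.port, use_ssl: true) { |http| http.request(req) }
    JSON.parse(res.body)
  end

  def get_upload_url(api_url, token)
    uri = URI("#{api_url}/b2api/v2/b2_get_upload_url")
    req = Net::HTTP::Post.new(uri, "Authorization" => token)
    req.body = JSON.generate(bucketId: @bucket_id)
    res = Net::HTTP.start(uri.host, uri.port, use_ssl: true) { |http| http.request(req) }
    JSON.parse(res.body)
  end
end
```

A Rails controller action can then render the `call` result as JSON for the front end.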

The frontend AJAX code then uploads the file using the authorized URL, so the upload won't hit a CORS error. The ID of the uploaded file, along with the other metadata, is then passed to the Rails controller to create a new media item.

Shrine then retrieves the file from the cache, passes it through the processing job, and uploads it to the permanent store.

In case @janko plans to modify the store plugin to include Backblaze support: one quirky thing is that the S3-compatible URL from Backblaze doesn't support download CORS at all, so if you try to use AJAX to fetch a downloaded file, it will fail.

The file location needs to look like this:

`https://f000.backblazeb2.com/file/BUCKET-ID/FILELOCATION`

and your bucket has to be public for this to work.
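In Ruby terms, building that public download URL is just string interpolation. A tiny sketch (the helper name is mine; as I understand it the path segment after `/file/` is the bucket name, and the `f000…` download host is reported by `b2_authorize_account` as `downloadUrl`):

```ruby
# Builds the public "friendly" B2 download URL for a file in a public bucket.
# download_host should match the downloadUrl returned by b2_authorize_account.
def b2_public_url(bucket_name, file_name, download_host: "https://f000.backblazeb2.com")
  "#{download_host}/file/#{bucket_name}/#{file_name}"
end
```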

Thanks again for this great library!!
