Obtaining an S3 bucket

If you do not already have an S3 bucket, you have several options to get started: a temporary test bucket, free-tier and commercial S3 providers, and academic providers.

The following is a non-exhaustive list of options - in general, any S3-compatible server can be used with FaaSr.

MinIO Play - temporary S3 test bucket

MinIO Play is a shared "playground" environment for Internet users to try out S3. The tutorial uses this bucket the first time you follow it, because it requires no account setup and lets you try things out quickly. However, it is only a temporary solution: your data is neither persistent nor private.

The credentials you use when setting up your workflow repo are shared by all users:

  • Access key: Q3AM3UQ867SPQQA43P2F
  • Secret key: zuf+tfteSlswRu7BJ86wekitnifILbZam1KYY3TG

Backblaze - free-tier without credit card

Backblaze offers a free-tier account that you can set up quickly without a credit card:

  • Follow their instructions to create a new account using your email address
  • When prompted to select your region, any choice is acceptable. In this example we'll use US-east
  • Once you sign in successfully with your account, click on "Create a Bucket"
  • Give your bucket a unique name - you will need it to configure FaaSr workflows with the workflow builder. We'll use myuniquebucket as an example here
  • Leave the encryption and lock options as disabled (default)
  • Once your bucket is created, you also need to add a new Application Key - this will be used as credentials in your workflow repo
  • Click on "Add Application Key", and give the key a unique name (e.g. myFaaSrKey). Leave the default "read and write" setting
  • Once the key is created, save the KeyID and applicationKey strings - they are shown to you only once, on this page
  • In your FaaSr workflow repo, save the KeyID as a GitHub secret with name S3_ACCESSKEY and save the applicationKey as a GitHub secret with name S3_SECRETKEY
  • To use this bucket in your workflows, configure S3 with the Edit Data Store button in the workflow builder as follows:
  • Endpoint: enter the endpoint shown for your bucket in Backblaze, including a leading https://. For example: https://s3.us-east-005.backblazeb2.com
  • Bucket: enter the bucket name you created, e.g. myuniquebucket
  • Region: enter the portion of the Endpoint between s3. and .backblazeb2.com, e.g. us-east-005
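The Region value can be derived mechanically from the Endpoint, since Backblaze S3 endpoints have the form s3.&lt;region&gt;.backblazeb2.com. A small helper illustrating the rule from the steps above (purely illustrative, not part of FaaSr):

```python
from urllib.parse import urlparse

def region_from_endpoint(endpoint: str) -> str:
    """Extract the Region from a Backblaze S3 endpoint URL.

    The hostname has the form s3.<region>.backblazeb2.com,
    so the Region is the middle portion of the hostname.
    """
    host = urlparse(endpoint).netloc       # e.g. s3.us-east-005.backblazeb2.com
    prefix, suffix = "s3.", ".backblazeb2.com"
    if not (host.startswith(prefix) and host.endswith(suffix)):
        raise ValueError(f"not a Backblaze S3 endpoint: {endpoint}")
    return host[len(prefix):-len(suffix)]  # e.g. us-east-005

print(region_from_endpoint("https://s3.us-east-005.backblazeb2.com"))
```

For the example endpoint above, this prints us-east-005 - the value to enter in the Region field.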

AWS S3 - commercial cloud provider

Follow the AWS documentation on Getting Started with S3

OSN - academic S3 provider in the US

  • Researchers in the US can request an allocation of 10+ TB of S3 storage from the Open Storage Network (OSN)
  • If your request is approved, you will be assigned one S3 bucket, and can then copy the access and secret keys provided to you for use with FaaSr