You can run scheduled and on-demand backups of your Redis Cloud Pro databases to a remote storage location. Scheduled backups run every 24 hours.

Amazon Simple Storage Service (AWS S3)

To use an S3 bucket for storing backups, first access your AWS Management Console and follow these steps:

  1. Select Services -> Storage -> S3 to open the S3 Management Console.
  2. Navigate to your bucket's permissions page:
    1. To create a new bucket:
      1. Click the + Create Bucket button.
      2. Enter a name and region for the bucket.
      3. Click the Next button.
      4. Set any bucket properties to your company's standards.
      5. On the Set permissions page, click the + Add account button.
      6. In the Account field, enter: fd1b05415aa5ea3a310265ddb13b156c7c76260dbc87e037a8fc290c3c86b614
      7. Check the read/write boxes for Objects and Object permissions, then click Save.
      8. Click the Create bucket button.
    2. To use an existing bucket, click the bucket and go to the Permissions tab:
      1. In the Access for other AWS accounts section, click + Add account and enter the following information.
      2. In the Account field, enter: fd1b05415aa5ea3a310265ddb13b156c7c76260dbc87e037a8fc290c3c86b614
      3. Check the read/write boxes for Objects and Object permissions, then click the Save button (a scripted alternative to these console steps is sketched after this list).
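
If you prefer to script the grant instead of using the console, the sketch below adds the same read/write grants to the bucket ACL with boto3. This is a minimal sketch, not the official procedure: it assumes boto3 is installed, that your AWS credentials may read and update the bucket ACL, and that the bucket has ACLs enabled; the bucket name is a placeholder.

# Minimal boto3 sketch: grant the Redis Cloud account (canonical ID from the
# steps above) read/write on the bucket and its ACL. Assumes ACLs are enabled
# on the bucket and your credentials allow s3:GetBucketAcl / s3:PutBucketAcl.
import boto3

REDIS_CLOUD_CANONICAL_ID = (
    "fd1b05415aa5ea3a310265ddb13b156c7c76260dbc87e037a8fc290c3c86b614"
)
BUCKET = "backups-bucket"  # placeholder: replace with your bucket name

s3 = boto3.client("s3")

# Read the current ACL so existing grants and the bucket owner are preserved.
acl = s3.get_bucket_acl(Bucket=BUCKET)
grants = acl["Grants"]

# READ/WRITE cover objects; READ_ACP/WRITE_ACP cover the object-permissions
# checkboxes shown in the console.
for permission in ("READ", "WRITE", "READ_ACP", "WRITE_ACP"):
    grants.append({
        "Grantee": {"Type": "CanonicalUser", "ID": REDIS_CLOUD_CANONICAL_ID},
        "Permission": permission,
    })

s3.put_bucket_acl(
    Bucket=BUCKET,
    AccessControlPolicy={"Grants": grants, "Owner": acl["Owner"]},
)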

Once your bucket's permissions are set, you can use it with your resource by setting its Backup Path to the path of your S3 bucket and clicking the Apply button. For example, if your backups bucket's name is backups-bucket, you should use the following path:

s3://backups-bucket

Google Cloud Storage (GCS)

For Google Cloud Platform (GCP) console subscriptions, to use a GCS bucket for storing your resources' backups:

  1. Log in to your account on Google Cloud Platform.
  2. Navigate to Storage -> Browser
  3. Click the three-dot menu next to the relevant bucket name and choose Edit bucket permissions.
  4. Under Add members, enter: service@redislabs-prod-clusters.iam.gserviceaccount.com
  5. For the role, select Storage Legacy -> Storage Legacy Bucket Writer.
  6. Click the Add button (a scripted alternative to these console steps is sketched after this list).
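
As a scripted alternative to the console steps above, the sketch below binds the same Storage Legacy Bucket Writer role to the Redis Cloud service account using the google-cloud-storage client library. This is a minimal sketch, assuming the library is installed and your credentials can administer the bucket's IAM policy; the bucket name is a placeholder.

# Minimal google-cloud-storage sketch: grant roles/storage.legacyBucketWriter
# to the Redis Cloud service account named in the steps above.
from google.cloud import storage

BUCKET = "backups-bucket"  # placeholder: replace with your bucket name
REDIS_CLOUD_MEMBER = (
    "serviceAccount:service@redislabs-prod-clusters.iam.gserviceaccount.com"
)

client = storage.Client()
bucket = client.bucket(BUCKET)

# Fetch the current IAM policy, append the binding, and write it back.
policy = bucket.get_iam_policy(requested_policy_version=3)
policy.bindings.append({
    "role": "roles/storage.legacyBucketWriter",
    "members": {REDIS_CLOUD_MEMBER},
})
bucket.set_iam_policy(policy)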

Once your bucket's permissions are set, you can use it with your resource by setting its Backup Path to the path of your GCS bucket and clicking the Activate button. For example, if your backups bucket's name is backups-bucket, use the path:

gs://backups-bucket

Azure Blob Storage (ABS)

To use an ABS container for storing your resources' backups, follow these steps in your Microsoft Azure Management Portal:

  1. Access your storage by clicking the left-hand STORAGE icon.
  2. Select the storage account:
    1. To create a new storage account:
      1. Click the + NEW button at the lower-left corner of the page.
      2. Verify that you've selected DATA SERVICES -> STORAGE -> QUICK CREATE from the menu.
      3. Enter the URL for your new storage account.
      4. Select a LOCATION/AFFINITY GROUP for the storage account.
      5. Choose a REPLICATION mode for the account.
      6. Click the CREATE STORAGE ACCOUNT button.
      7. Continue to step 2.2.
    2. To use an existing storage account, select it by clicking on it.
  3. Click the MANAGE ACCESS KEYS button at the bottom of the page.
  4. Copy your storage account's PRIMARY ACCESS KEY (a scripted way to retrieve the key is sketched after this list).
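
If you prefer to retrieve the key programmatically rather than through the portal, the sketch below uses the azure-identity and azure-mgmt-storage packages (an assumption, not part of the portal steps above) to read the primary access key. The subscription, resource group, and account names are hypothetical placeholders.

# Minimal sketch: read the storage account's primary access key with the
# Azure management SDK. Assumes azure-identity and azure-mgmt-storage are
# installed and your identity may list keys on the storage account.
from azure.identity import DefaultAzureCredential
from azure.mgmt.storage import StorageManagementClient

SUBSCRIPTION_ID = "your-subscription-id"  # placeholder
RESOURCE_GROUP = "your-resource-group"    # placeholder
STORAGE_ACCOUNT = "yourstorageaccount"    # placeholder

client = StorageManagementClient(DefaultAzureCredential(), SUBSCRIPTION_ID)

# list_keys returns the account's key pair; the first entry is the primary key.
keys = client.storage_accounts.list_keys(RESOURCE_GROUP, STORAGE_ACCOUNT)
primary_access_key = keys.keys[0].value
print(primary_access_key)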

Set your resource's Backup Path to the path of your ABS storage account and click the Apply button, using the following syntax:

abs://:storage_account_access_key@storage_account_name/container_name/[path/]

Where:

  • storage_account_access_key: the primary access key to the storage account.
  • storage_account_name: the name of the storage account.
  • container_name: the name of the container, if needed.
  • path: the backups path, if needed.
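
The sketch below composes a Backup Path in this syntax and, optionally, checks the key and container with the azure-storage-blob package (an assumption, not required by the steps above); the account, key, and container values are placeholders.

# Minimal sketch: build the abs:// Backup Path and sanity-check it.
from azure.storage.blob import BlobServiceClient

STORAGE_ACCOUNT = "yourstorageaccount"  # placeholder
ACCESS_KEY = "your-primary-access-key"  # the key copied in step 4 above
CONTAINER = "redis-backups"             # placeholder container name

backup_path = f"abs://:{ACCESS_KEY}@{STORAGE_ACCOUNT}/{CONTAINER}/"
print(backup_path)

# Optional check: confirm the key is valid and the container exists.
service = BlobServiceClient(
    account_url=f"https://{STORAGE_ACCOUNT}.blob.core.windows.net",
    credential=ACCESS_KEY,
)
print(service.get_container_client(CONTAINER).exists())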

FTP Server

To store your resource backups on an FTP server, set the resource's Backup Path using the following syntax:

<protocol>://[username]:[password]@[hostname]:[port]/[path]/

Where:

  • protocol: the server's protocol; either ftp or ftps.
  • username: your username, if needed.
  • password: your password, if needed.
  • hostname: the hostname or IP address of the server.
  • port: the port number of the server, if needed.
  • path: the backups path, if needed.
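
The sketch below composes a Backup Path in this syntax and checks that the server accepts the credentials, using Python's standard-library ftplib and urllib modules; the host, credentials, and path are hypothetical placeholders.

# Minimal sketch: build the FTP Backup Path and verify the login and path.
# Uses ftplib from the standard library (use FTP_TLS instead of FTP for ftps).
from ftplib import FTP
from urllib.parse import quote

PROTOCOL = "ftp"              # or "ftps"
USERNAME = "backup-user"      # placeholder
PASSWORD = "backup-password"  # placeholder
HOSTNAME = "ftp.example.com"  # placeholder
PORT = 21
PATH = "redis-backups"        # placeholder

# URL-encode the credentials in case they contain reserved characters.
backup_path = (
    f"{PROTOCOL}://{quote(USERNAME, safe='')}:{quote(PASSWORD, safe='')}"
    f"@{HOSTNAME}:{PORT}/{PATH}/"
)
print(backup_path)

# Optional check: log in and confirm the target directory exists.
ftp = FTP()
ftp.connect(HOSTNAME, PORT)
ftp.login(USERNAME, PASSWORD)
try:
    ftp.cwd(PATH)
finally:
    ftp.quit()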

Can I export my Redis data from Redis Cloud Pro?

Absolutely! There is no lock-in with Redis Cloud Pro. Using the instructions on this page, you can export your latest RDB backup file from your cloud storage, FTP or HTTP server to any Redis server of your choice (paid subscriptions only).