Commit 8c1b8a5

Merge pull request #1825 from redis/DOC-5433
RC: Remove retention policy requirement and implement tabs for Import/Backup data
2 parents: 0893c22 + f0201e5

File tree: 2 files changed (+53, −40 lines)


content/operate/rc/databases/back-up-data.md

Lines changed: 23 additions & 20 deletions
@@ -70,9 +70,15 @@ Database backups can be stored to a cloud provider service or saved to a URI usi
 
 Your subscription needs the ability to view permissions and update objects in the storage location. Specific details vary according to the provider. To learn more, consult the provider's documentation.
 
-The following sections describe specific backup options. Be aware that provider features change frequently. For best results, use your provider's documentation for the latest info.
+Be aware that provider features change frequently. For best results, use your provider's documentation for the latest info.
 
-### AWS S3
+Select the tab for your storage location type.
+
+{{< multitabs id="backup-storage-locations"
+tab1="AWS S3"
+tab2="Google Cloud Storage"
+tab3="Azure Blob Storage"
+tab4="FTP/FTPS Server" >}}
 
 To store backups in an Amazon Web Services (AWS) Simple Storage Service (S3) [bucket](https://docs.aws.amazon.com/AmazonS3/latest/userguide/creating-buckets-s3.html):

@@ -156,7 +162,7 @@ To learn more, see [Using bucket policies](https://docs.aws.amazon.com/AmazonS3/
 An AWS S3 bucket can be used by only one Redis Cloud account. If you have more than one Redis Cloud account, repeat the setup steps for multiple buckets.
 {{< /note >}}
 
-### Google Cloud Storage
+-tab-sep-
 
 To store backups in a Google Cloud Storage [bucket](https://cloud.google.com/storage/docs/creating-buckets):

@@ -170,29 +176,19 @@ To store backups in a Google Cloud Storage [bucket](https://cloud.google.com/sto
 
 1. Select the **Grant Access** button and then add:
 
-
+    ```sh
+
+    ```
 
 1. Set **Role** to **Storage Legacy Bucket Writer**.
 
 1. Save your changes.
 
-1. Verify that your bucket does _not_ have a set retention policy.
-
-    To do so:
-
-    1. View the details of your bucket.
-
-    1. Select the **Configuration** tab.
-
-    1. Verify **Protection** -> **Bucket retention policy** is set to **none**.
-
-    If a policy is defined and you cannot delete it, you need to use a different bucket.
-
 Use the bucket details **Configuration** tab to locate the **gsutil URI**. This is the value you'll assign to your resource's backup path.
 
 To learn more, see [Use IAM permissions](https://cloud.google.com/storage/docs/access-control/using-iam-permissions#bucket-iam).
 
-### Azure Blob Storage
+-tab-sep-
 
 To store your backup in Microsoft Azure Blob Storage, sign in to the Azure portal and then:

@@ -206,7 +202,9 @@ Set your resource's **Backup Path** to the path of your storage account.
 
 The syntax for creating the backup varies according to your authorization mechanism. For example:
 
-`abs://:storage_account_access_key@storage_account_name/container_name/[path/]`
+```
+abs://storage_account_access_key@storage_account_name/container_name/[path/]
+```
 
 Where:
@@ -218,11 +216,14 @@ Where:
 
 To learn more, see [Authorizing access to data in Azure Storage](https://docs.microsoft.com/en-us/azure/storage/common/storage-auth).
 
-### FTP Server
+-tab-sep-
 
 To store your backups on an FTP server, set its **Backup Path** using the following syntax:
 
-`<protocol>://[username]:[password]@[hostname]:[port]/[path]/`
+```sh
+<protocol>://[username]:[password]@[hostname]:[port]/[path]/
+```
+
 
 Where:

@@ -238,3 +239,5 @@ If your FTP username or password contains special characters such as `@`, `\`, o
 {{< /note >}}
 
 The user account needs permission to write files to the server.
+
+{{< /multitabs >}}
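The note in the FTP hunk above says that special characters such as `@`, `\`, or `:` in the username or password must be percent-encoded. A minimal sketch of that encoding, assuming Python's standard `urllib` and made-up credentials and hostname (nothing here comes from Redis Cloud itself):

```python
from urllib.parse import quote

# Made-up credentials containing the characters the note warns about.
username = "backup@corp"
password = "p:ss\\word"

# safe="" forces quote() to percent-encode every reserved character,
# including "@", ":", and "\".
encoded_user = quote(username, safe="")
encoded_pass = quote(password, safe="")

# Assemble a backup path in the documented
# <protocol>://[username]:[password]@[hostname]:[port]/[path]/ form.
backup_path = f"ftps://{encoded_user}:{encoded_pass}@ftp.example.com:21/backups/"
print(backup_path)
# ftps://backup%40corp:p%3Ass%5Cword@ftp.example.com:21/backups/
```

The encoded credentials are then safe to paste into the **Backup Path** field without the URL parser mistaking them for delimiters.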

content/operate/rc/databases/import-data.md

Lines changed: 30 additions & 20 deletions
@@ -44,16 +44,23 @@ To import a dataset from any publicly available Redis Open Source server:
 
 If you have an RDB or a compressed RDB file from a previous backup, you can restore data from that file into your Redis Cloud database.
 
-### Via FTP or HTTP
+Select the tab for your storage location type.
 
-To import an RDB file stored on an FTP or HTTP server:
+{{< multitabs id="rdb-import-locations"
+tab1="FTP or HTTP server"
+tab2="AWS S3"
+tab3="Google Cloud Storage"
+tab4="Azure Blob Storage" >}}
 
 1. Select **Databases** from the Redis Cloud console menu and then select your database from the list.
-1. Select **Import**.
-    {{<image filename="images/rc/database-configuration-import.png" alt="The Import dataset section and Import button." >}}
-1. Enter the details for the RDB file:
+2. Select **Import**.
+3. Enter the details for the RDB file:
     - Source type - Select **FTP** or **HTTP**.
-    - Source path - Enter the URL for the RDB file: `<protocol>://[username][:password]@hostname[:port]/[path/]filename.rdb[.gz]`
+    - Source path - Enter the URL for the RDB file:
+
+        ```
+        <protocol>://[username][:password]@hostname[:port]/[path/]filename.rdb[.gz]
+        ```
 
 Where:
@@ -69,14 +76,14 @@ To import an RDB file stored on an FTP or HTTP server:
 
     If your FTP username or password contains special characters such as `@`, `\`, or `:`, you must URL encode (also known as Percent encode) these special characters. If you don't, your database may become stuck.
     {{< /note >}}
 
-1. For sharded databases with multiple RDB files, select **Add source** to add another RDB file.
+4. For sharded databases with multiple RDB files, select **Add source** to add another RDB file.
     {{< warning >}}
    For sharded databases with multiple RDB files, make sure to add every file before proceeding.
     {{< /warning >}}
 
-1. Select **Import**.
+5. Select **Import**.
 
-### Via AWS S3
+-tab-sep-
 
 To use the Redis Cloud console to import your data, you must first share the file from the Amazon Web Services (AWS) management console.
@@ -158,22 +165,25 @@ To share and import an RDB file that is stored in an AWS Simple Storage Service
 
 1. In the [Redis Cloud console](https://cloud.redis.io/), select the target database from the database list.
 1. Select **Import**.
-    {{<image filename="images/rc/database-configuration-import.png" alt="The Import dataset section and Import button." >}}
 1. Enter the details for the RDB file:
     - Source type - Select **AWS S3**.
-    - Source path - Enter the URL for the RDB file: `s3://bucketname/[path/]filename.rdb[.gz]`
+    - Source path - Enter the URL for the RDB file:
+
+        ```text
+        s3://bucketname/[path/]filename.rdb[.gz]
+        ```
 
-    Where:
+        Where:
 
-    - `bucketname` - Name of the S3 bucket
-    - `path` - Path to the file, if necessary
-    - `filename` - Filename of the RDB file, including the .gz suffix if the file is compressed
+        - `bucketname` - Name of the S3 bucket
+        - `path` - Path to the file, if necessary
+        - `filename` - Filename of the RDB file, including the .gz suffix if the file is compressed
 
 1. For sharded databases with multiple RDB files, select **Add source** to add another RDB file.
 
 1. Select **Import**.
 
-### Via Google Cloud Storage
+-tab-sep-
 
 To use the Redis Cloud console to import your data, you must first share the file from the Google Cloud console.
@@ -192,7 +202,6 @@ To share and import an RDB file that is stored in a Google Cloud Storage bucket:
 
 1. In the [Redis Cloud console](https://cloud.redis.io/), select the target database from the database list.
 1. Select **Import**.
-    {{<image filename="images/rc/database-configuration-import.png" alt="The Import dataset section and Import button." >}}
 1. Enter the details for the RDB file:
     - Source type - Select **Google Cloud Storage**.
     - Source path - Enter the URL for the RDB file: `gs://bucketname/[path/]filename.rdb[.gz]`
@@ -206,17 +215,16 @@ To share and import an RDB file that is stored in a Google Cloud Storage bucket:
 
 1. Select **Import**.
 
-### Via Azure Blob Storage container
+-tab-sep-
 
 To import an RDB file stored in a Microsoft Azure Blob storage container:
 
 1. In the Redis Cloud console, select the target database from the database list.
 1. Select **Import**.
-    {{<image filename="images/rc/database-configuration-import.png" alt="The Import dataset section and Import button." >}}
 1. Enter the details for the RDB file:
     - Source type - Select **Azure Blob Storage**.
     - Source path - Enter the URL for the RDB file:
-        ```text
+        ```
         abs://:storage_account_access_key@storage_account_name/[container/]filename.rdb[.gz]
         ```
@@ -230,3 +238,5 @@ To import an RDB file stored in a Microsoft Azure Blob storage container:
 1. For sharded databases with multiple RDB files, select **Add source** to add another RDB file.
 
 1. Select **Import**.
+
+{{< /multitabs >}}
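Every source path in the tabs above follows the same shape: an accepted scheme followed by a path ending in `filename.rdb`, optionally with a `.gz` suffix. As an illustration only (the helper function and the scheme list are hypothetical, not part of Redis Cloud), a quick sanity check in Python:

```python
from urllib.parse import urlparse

# Illustrative list of the schemes the Import dialog's tabs cover.
ALLOWED_SCHEMES = {"ftp", "ftps", "http", "https", "s3", "gs", "abs"}

def check_source_path(path: str) -> bool:
    """Return True if the path uses an accepted scheme and names an RDB file."""
    parsed = urlparse(path)
    if parsed.scheme not in ALLOWED_SCHEMES:
        return False
    filename = parsed.path.rsplit("/", 1)[-1]
    return filename.endswith((".rdb", ".rdb.gz"))

print(check_source_path("s3://my-bucket/backups/dump.rdb.gz"))  # True
print(check_source_path("gs://my-bucket/dump.txt"))             # False
```

A check like this can catch a mistyped scheme or a missing `.rdb` suffix before you start an import that would otherwise fail partway through.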
