Commit db72fe5

Update azure-blob-storage-connector.md
Changes in authorization options added
1 parent a396c80 commit db72fe5

File tree

1 file changed: +8 −4 lines


content/en/docs/appstore/use-content/platform-supported-content/modules/azure/azure-blob-storage-connector.md

Lines changed: 8 additions & 4 deletions
@@ -40,9 +40,13 @@ After you install the connector, you can find it in the **App Explorer**, in the

 ### Configuring Authentication {#authentication}

-In order to use the Azure Blob Storage service, you must authenticate using a Shared Access Signature (SAS). To do so, you must create a SAS or ask your Azure admin to do that for you. The SAS is then added to a ConnectionDetails object on the SAS attribute. In addition you will need to set the storage account to the Connection details on the StorageAccount attribute.
+In order to use the Azure Blob Storage service, you must authenticate using a Shared Access Signature (SAS) or an Azure Entra ID access token.

-We plan to provide additional means of authentication soon.
+#### SAS Authorization
+
+You or your admin needs to create a SAS for the container or blob you want to perform operations on. Add this SAS to a `SASCredentials` object on the `SASToken` attribute, then pass the `SASCredentials` object to the `AbstractCredentials` input parameter of the operation microflow you want to use.
+
+#### Azure Entra ID Access Token
+
+Set up SSO using the OIDC SSO marketplace module. Once this is set up for your application, you can use the `GetCurrentToken` microflow to get the access token needed for authenticating the call. Create an `EntraCredentials` object and add the access token to its `BearerToken` attribute, then pass the `EntraCredentials` object to the `AbstractCredentials` input parameter of the operation microflow you want to use.

 ### Configuring a Microflow for an AWS Service
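To make the two authorization options in this change concrete: at the REST level, a SAS travels in the request's query string, while an Entra ID access token travels in the `Authorization: Bearer` header. The following Python sketch illustrates that distinction; the function and parameter names are hypothetical, not part of the Mendix module.

```python
# Illustrative sketch only: how the connector's two credential types map
# onto the underlying Azure Blob Storage REST request.

def blob_request_auth(account, container, blob, sas_token=None, bearer_token=None):
    """Return (url, headers) for a blob operation, authorized either by a
    SAS token (query string) or by an Entra ID access token (Bearer header)."""
    url = f"https://{account}.blob.core.windows.net/{container}/{blob}"
    headers = {}
    if sas_token is not None:
        # SAS authorization: the signed parameters travel in the query string.
        url = f"{url}?{sas_token.lstrip('?')}"
    elif bearer_token is not None:
        # Entra ID authorization: the access token travels in a header.
        headers["Authorization"] = f"Bearer {bearer_token}"
        # Token-authorized requests must also state a storage API version
        # (the version string shown here is just one example).
        headers["x-ms-version"] = "2021-08-06"
    else:
        raise ValueError("Provide either a SAS token or a bearer token")
    return url, headers
```

In the connector itself you never build this request by hand; presumably the operation microflow makes the same choice based on which `AbstractCredentials` specialization (`SASCredentials` or `EntraCredentials`) you pass in.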

@@ -51,11 +55,11 @@ You can implement the operations of the connector by using them in microflows. F

 1. In the **App Explorer**, right-click on the name of your module, and then click **Add microflow**.
 2. Enter a name for your microflow, for example, *ACT_PutBlob*, and then click **OK**.
 3. In the **App Explorer**, in the **AzureBlobStorageConnector** section, find the **PUT_v1_Azure_PutBlob** operation microflow.
-4. In the **App Explorer**, in the **AWSAuthentication** section, find the **GetStaticCredentials** and **GetTemporaryCredentials** microflows.
+4. Create a **SASCredentials** or **EntraCredentials** object and add the SAS or access token to the **SASToken** or **BearerToken** attribute, respectively.
 5. Drag the **PUT_v1_Azure_PutBlob** microflow into your microflow.
 6. Double-click the **PUT_v1_Azure_PutBlob** operation to configure the required parameters.

-For the `PUT_v1_Azure_PutBlob` operation, retrieve the `System.FileDocument` you want to store and provide a configured `ConnectionDetails` object. You must then create a `PutBlobRequest` object in your microflow as the last parameter. This entity requires the following parameters:
+For the `PUT_v1_Azure_PutBlob` operation, retrieve the `System.FileDocument` you want to store and provide a configured `SASCredentials` or `EntraCredentials` object. You must then create a `PutBlobRequest` object in your microflow as the last parameter. This entity requires the following parameters:

 * `BlobName` - The name the blob will get in the blob storage.
 * `ContainerName` - The name of the target container where the blob will be stored.
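As a rough illustration of what gets assembled from these `PutBlobRequest` attributes, the Python sketch below maps a `PutBlobRequest`-like structure onto Azure's `Put Blob` REST operation (shown with SAS authorization). All names are hypothetical and the block-blob type is an assumption; the connector builds the real request internally.

```python
# Illustrative sketch only: a 'Put Blob' call assembled from BlobName and
# ContainerName, authorized with a SAS token.

def build_put_blob(account, put_blob_request, sas_token):
    """Map a PutBlobRequest-like dict (BlobName, ContainerName) onto the
    Azure 'Put Blob' REST operation."""
    container = put_blob_request["ContainerName"]
    blob = put_blob_request["BlobName"]
    url = (f"https://{account}.blob.core.windows.net/"
           f"{container}/{blob}?{sas_token}")
    headers = {
        # 'Put Blob' requires a blob type; a block blob is assumed here.
        "x-ms-blob-type": "BlockBlob",
    }
    return "PUT", url, headers
```

The body of the PUT request would be the contents of the `System.FileDocument` retrieved in the microflow.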
