README.md (+36 −10)
@@ -142,27 +142,53 @@ To setup an environment in Azure, simply run the [Setup-Environment.ps1](./Setup

### Running the document processing pipeline

- Once an environment is setup, you can run the document processing pipeline by uploading a batch of documents to the Azure Storage blob container and sending a message to the Azure Storage queue containing the container reference.
+ Once an environment is set up, you can run the document processing pipeline by uploading a batch of documents to the Azure Storage blob container and sending a message containing the container reference via an HTTP request or the Azure Storage queue.

- > [!TIP]
- > Use the [Azure Storage Explorer](https://azure.microsoft.com/en-us/features/storage-explorer/) to upload the batch of documents to the Azure Storage blob container and send a message to the Azure Storage queue.
+ A batch of invoices is provided in the tests [Invoice Batch folder](./tests/InvoiceBatch/), which can be uploaded into an Azure Storage blob container either locally via Azurite or in the deployed Azure Storage account.

- A batch of invoices is provided in the tests [Invoice Batch folder](./tests/InvoiceBatch/) which can be uploaded into an Azure Storage blob container.
+ These files can be uploaded using the Azure VS Code extension or the Azure Storage Explorer.

> [!NOTE]
> Upload all of the individual folders into the container, not the individual files. This sample processes a container that contains multiple folders, each representing a customer's data to be processed, which may contain one or more invoices.
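
If you prefer the command line, the same upload can be done with the Azure CLI. The following is an illustrative sketch rather than part of the documented workflow; it assumes the target container is named `invoices` (matching the `container_name` used below) and, for the cloud case, that you are signed in via `az login`.

```powershell
# Upload the whole test batch; upload-batch preserves the folder structure,
# so each customer folder is kept intact in the container.
# <storage-account-name> is the deployed storage account.
az storage blob upload-batch `
  --destination "invoices" `
  --source "./tests/InvoiceBatch" `
  --account-name "<storage-account-name>" `
  --auth-mode login

# Or, when targeting a local Azurite instance, use the development connection string instead.
az storage blob upload-batch `
  --destination "invoices" `
  --source "./tests/InvoiceBatch" `
  --connection-string "UseDevelopmentStorage=true"
```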

- Once uploaded, add the following message to the **invoices** queue in the Azure Storage account:
+ #### Via the HTTP trigger

- > [!IMPORTANT]
- > When running locally, the batch must be uploaded to the deployed Azure Storage account. However, the queue message must be created in the local development storage account, Azurite, running as a Docker container. You may need to create the **invoices** queue in the local storage account first via the Azure Storage Explorer.
+ To send via HTTP, open the [`tests/HttpTrigger.rest`](./tests/HttpTrigger.rest) file and use the request to trigger the pipeline.
+
+ ```http
+ POST http://localhost:7071/api/invoices
+ Content-Type: application/json
+
+ {
+   "container_name": "invoices"
+ }
+ ```
+
+ To run in Azure, replace `http://localhost:7071` with the `containerAppInfo.value.url` value from the [`./infra/apps/AIDocumentPipeline/AppOutputs.json`](./infra/apps/AIDocumentPipeline/AppOutputs.json) file after deployment.
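
For reference, the same request can also be issued from PowerShell instead of the `.rest` file. This is only an illustrative sketch built on the paths, keys, and endpoint shown above, not a documented step of the sample.

```powershell
# Read the deployed Container App URL from the deployment outputs.
$appOutputs = Get-Content "./infra/apps/AIDocumentPipeline/AppOutputs.json" -Raw | ConvertFrom-Json
$baseUrl = $appOutputs.containerAppInfo.value.url

# Trigger the pipeline for the "invoices" container.
$body = @{ container_name = "invoices" } | ConvertTo-Json
Invoke-RestMethod -Method Post -Uri "$baseUrl/api/invoices" -ContentType "application/json" -Body $body
```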

+ #### Via the Azure Storage queue

+ To send via the Azure Storage queue, run the [`tests/QueueTrigger.ps1`](./tests/QueueTrigger.ps1) PowerShell script to trigger the pipeline.

+ This will add the following message, Base64 encoded, to the **invoices** queue in the Azure Storage account:

```json
{
-   "container_name": "<container-name>"
+   "container_name": "invoices"
}
```
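
As a rough, illustrative sketch of what that step looks like when targeting a local Azurite emulator (the connection string, queue name, and encoding shown here are assumptions; see [`tests/QueueTrigger.ps1`](./tests/QueueTrigger.ps1) for the actual implementation):

```powershell
# Base64-encode the queue message payload.
$Message = '{ "container_name": "invoices" }'
$Base64EncodedMessage = [Convert]::ToBase64String([Text.Encoding]::UTF8.GetBytes($Message))

# Put the message on the local Azurite "invoices" queue.
az storage message put `
  --content $Base64EncodedMessage `
  --queue-name "invoices" `
  --connection-string "UseDevelopmentStorage=true" `
  --time-to-live 86400
```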

+ To run in Azure, replace the `az storage message put` command with the following:
+
+ ```powershell
+ az storage message put `
+   --content $Base64EncodedMessage `
+   --queue-name "invoices" `
+   --account-name "<storage-account-name>" `
+   --auth-mode login `
+   --time-to-live 86400
+ ```

- The document processing pipeline will then be triggered, processing the batch of invoices and extracting the structured data from each invoice.
+ The `--account-name` parameter should be replaced with the name of the Azure Storage account deployed in the environment, found in the `storageAccountInfo.value.name` value from the [`./infra/InfrastructureOutputs.json`](./infra/InfrastructureOutputs.json) file after deployment.
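
To avoid copying the value by hand, the account name can be read from that outputs file in PowerShell; a small convenience sketch using only the file path and key named above:

```powershell
# Look up the deployed storage account name from the infrastructure outputs.
$StorageAccountName = (Get-Content "./infra/InfrastructureOutputs.json" -Raw | ConvertFrom-Json).storageAccountInfo.value.name

# Pass it to the queue command above, e.g. --account-name $StorageAccountName
```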
tests/HttpTrigger.rest (+2)
@@ -1,3 +1,5 @@
+ # Start processing a batch of folders containing invoices within a Storage Container
+ # Note: The following URL is for the local environment. To run in Azure, replace `http://localhost:7071` with the `containerAppInfo.value.url` value from the `./infra/apps/AIDocumentPipeline/AppOutputs.json` file.