
Commit 732f246

Update documentation and some inline comments
1 parent fec60be

2 files changed: +3 −4 lines changed


API_DOCS.md

Lines changed: 1 addition & 0 deletions
@@ -188,6 +188,7 @@ Describes the shape and behaviour of the resources object you will pass to `getLoaders`
 | `isBatchKeyASet` | (Optional) Set to true if the interface of the resource takes the batch key as a set (rather than an array). For example, when using a generated clientlib based on swagger where `uniqueItems: true` is set for the batchKey parameter. Default: false. |
 | `propertyBatchKey` | (Optional) The argument to the resource that represents the optional properties we want to fetch (e.g. usually 'properties' or 'features'). |
 | `responseKey` | (Non-optional when propertyBatchKey is used) The key in the response objects that corresponds to `batchKey`. This should be the only field marked as required in your swagger endpoint response, except nestedPath. |
+| `maxBatchSize` | (Optional) Limits the number of items that can be batched together in a single request. When more items are requested than this limit, multiple requests will be made. This can help prevent hitting URI length limits or timeouts for large batches. |
 
 ### `typings`
 
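To make the splitting behaviour described for `maxBatchSize` concrete, here is a minimal sketch of how a list of batch keys could be broken into requests of at most that many items. The `chunkKeys` helper is hypothetical and only illustrates the documented behaviour; it is not the library's implementation.

```js
// Illustration only: a hypothetical helper showing how a list of batch keys
// could be split into requests of at most `maxBatchSize` items.
function chunkKeys(keys, maxBatchSize) {
    if (!maxBatchSize || maxBatchSize <= 0) {
        return [keys]; // no limit configured: one request for everything
    }
    const chunks = [];
    for (let i = 0; i < keys.length; i += maxBatchSize) {
        chunks.push(keys.slice(i, i + maxBatchSize));
    }
    return chunks;
}

// e.g. chunkKeys([1, 2, 3, 4, 5], 2) => [[1, 2], [3, 4], [5]]
// so 5 requested items would result in 3 underlying requests.
```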
__tests__/implementation.test.js

Lines changed: 2 additions & 4 deletions
@@ -1279,7 +1279,7 @@ test('batch endpoint with maxBatchSize', async () => {
     await createDataLoaders(config, async (getLoaders) => {
         const loaders = getLoaders(resources);
 
-        // Request 5 items at once, which should be split into 3 batches (2 + 2 + 1)
+        // Request 5 items at once, which should be split by maxBatchSize later in the test.
         const results = await Promise.all([
             loaders.foo.load({ foo_id: 1, properties: ['name', 'rating'] }),
             loaders.foo.load({ foo_id: 2, properties: ['name', 'rating'] }),
@@ -1298,7 +1298,6 @@ test('batch endpoint with maxBatchSize', async () => {
         ]);
 
         // Verify that the requests were batched correctly
-        // We should have 3 batches with max 2 IDs.
         expect(receivedBatches.map(batch => batch.length)).toEqual([3, 2]);
 
         // Verify that all IDs were requested
@@ -1341,7 +1340,7 @@ test('batch endpoint with propertyBatchKey and maxBatchSize', async () => {
     await createDataLoaders(config, async (getLoaders) => {
         const loaders = getLoaders(resources);
 
-        // Request 5 items at once, which should be split into 3 batches (2 + 2 + 1)
+        // Request 5 items at once, which should be split by maxBatchSize later in the test.
         const results = await Promise.all([
             loaders.foo.load({ foo_id: 1, properties: ['name', 'rating'] }),
             loaders.foo.load({ foo_id: 2, properties: ['name', 'rating'] }),
@@ -1360,7 +1359,6 @@ test('batch endpoint with propertyBatchKey and maxBatchSize', async () => {
         ]);
 
         // Verify that the requests were batched correctly
-        // We should have 3 batches with max 2 IDs.
         expect(receivedBatches.map(batch => batch.length)).toEqual([2, 2, 1]);
 
         // Verify that all IDs were requested
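The test setup that populates `receivedBatches` is not part of this diff. A rough sketch of that pattern might look like the following; the `foo_ids` parameter name and the response shape are assumptions for illustration, not the actual fixture from the test file.

```js
// Sketch of how `receivedBatches` might be populated in these tests: the mock
// resource records each batch of IDs it is called with. Parameter names and
// the response shape are assumed for illustration.
const receivedBatches = [];

const resources = {
    foo: async ({ foo_ids, properties }) => {
        receivedBatches.push(foo_ids);
        return foo_ids.map((id) => ({ foo_id: id, name: `foo-${id}`, rating: 5 }));
    },
};

// With a maxBatchSize of 2, loading 5 IDs would then yield
// receivedBatches.map((batch) => batch.length) === [2, 2, 1].
```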
