New content: Add definition for shape broadcasting (#534)
* New content: Add definition for shape broadcasting
This change introduces a new section for Algorithms, following APIs,
to collect algorithms referenced throughout the specification.
A section for Broadcasting is introduced, which defines broadcasting
shapes and gives an explicit algorithm matching WebNN implementations
of NumPy's General Broadcasting Rules. Definitions for "bidirectionally
broadcastable" and "unidirectionally broadcastable" are introduced. The
previous definition of "broadcast-shapes" is removed in favor of these
new algorithms.
Use the broadcasting definition in expand(), rather than bespoke steps.
For #324, #378, #462, and potentially #523.
Co-authored-by: Dwayne Robinson <[email protected]>
* Fix prelu parameter order
---------
Co-authored-by: Dwayne Robinson <[email protected]>
  1. If |a|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dataType}} is not equal to |b|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dataType}}, then [=exception/throw=] a "{{DataError}}" {{DOMException}}.
  1. Let |descriptor| be a new {{MLOperandDescriptor}}.
  1. Set |descriptor|.{{MLOperandDescriptor/dataType}} to |a|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dataType}}.
- 1. Set |descriptor|.{{MLOperandDescriptor/dimensions}} to the result of running the [=MLGraphBuilder/broadcast-shapes=] steps given |a|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}} and |b|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}}.
- 1. If that [=exception/throws=] an error, re-[=exception/throw=] the error.
+ 1. Set |descriptor|.{{MLOperandDescriptor/dimensions}} to the result of [=bidirectionally broadcasting the shapes=] |a|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}} and |b|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}}.
+ 1. If that returns failure, then [=exception/throw=] a "{{DataError}}" {{DOMException}}.
  1. If any of the following sub-steps fail, [=exception/throw=] an "{{OperationError}}" {{DOMException}}.
  1. Let |output| be the result of [=creating an MLOperand=] given [=this=] and |descriptor|.
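As a non-normative illustration of the updated steps, the sketch below shows an element-wise op whose operand shapes are bidirectionally broadcast; it assumes the WebNN JS API surface (`navigator.ml`, `MLGraphBuilder`), and the shapes are illustrative only.

```js
// Sketch: element-wise add() with operand shapes that broadcast bidirectionally.
const context = await navigator.ml.createContext();
const builder = new MLGraphBuilder(context);

const a = builder.input('a', {dataType: 'float32', dimensions: [2, 3, 4]});
const b = builder.input('b', {dataType: 'float32', dimensions: [3, 1]});

// Per the steps above, the output descriptor's dimensions are the
// bidirectionally broadcast shape [2, 3, 4]; incompatible shapes (corresponding
// dimensions differing with neither equal to 1) throw a DataError.
const c = builder.add(a, b);
```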
- To <dfn for="MLGraphBuilder">broadcast-shapes</dfn> given [=/list=] |shape1| and [=/list=] |shape2|, run the following steps:
- </summary>
- <div class=algorithm-steps>
-   1. [=Assert=]: The type of |shape1| and |shape2| is `sequence of unsigned long`.
-   1. Let |output| be the result of invoking the [=implementation-defined=] shape broadcast on |shape1| and |shape2|.
-   1. If that fails, then [=exception/throw=] a "{{DataError}}" {{DOMException}}.
-   1. Return |output|.
-   <div class="note">
-   The most common implementation is that two shapes are compatible when each of their corresponding dimensions is equal, or one of them is 1. The output shape consists of the maximum of the corresponding dimensions.
-   </div>
- </div>
- </details>
-
  <details open>
  <summary>
  The element-wise binary operation algorithms invoke the [=MLGraphBuilder/element-wise-binary-op | create element-wise binary operation=] steps as follows.

@@ -2373,8 +2358,8 @@ Although operations *greaterOrEqual* and *lesserOrEqual* can each be implemented
  1. If |a|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dataType}} is not equal to |b|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dataType}}, then [=exception/throw=] a "{{DataError}}" {{DOMException}}.
  1. Let |descriptor| be a new {{MLOperandDescriptor}}.
  1. Set |descriptor|.{{MLOperandDescriptor/dataType}} to {{MLOperandDataType/"uint8"}}.
- 1. Set |descriptor|.{{MLOperandDescriptor/dimensions}} to the result of running the [=MLGraphBuilder/broadcast-shapes=] steps given |a|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}} and |b|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}}.
- 1. If that [=exception/throws=] an error, re-[=exception/throw=] the error.
+ 1. Set |descriptor|.{{MLOperandDescriptor/dimensions}} to the result of [=bidirectionally broadcasting the shapes=] |a|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}} and |b|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}}.
+ 1. If that returns failure, then [=exception/throw=] a "{{DataError}}" {{DOMException}}.
  1. If any of the following sub-steps fail, [=exception/throw=] an "{{OperationError}}" {{DOMException}}.
  1. Let |output| be the result of [=creating an MLOperand=] given [=this=] and |descriptor|.
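The comparison variants reuse the same broadcast, differing only in the fixed `'uint8'` output data type. A minimal, non-normative sketch, reusing the `builder` from the previous example; the shapes are illustrative.

```js
// Sketch: greater() broadcasts [5, 10] and [1] to [5, 10];
// the resulting mask has dataType 'uint8' (one element per comparison).
const scores = builder.input('scores', {dataType: 'float32', dimensions: [5, 10]});
const threshold = builder.constant(
    {dataType: 'float32', dimensions: [1]}, new Float32Array([0.5]));
const mask = builder.greater(scores, threshold);
```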
  The permissions and context validity have been checked by [[#api-mlgraphbuilder-constructor]] steps.
  </div>
- 1. If any of the following steps fail, then [=exception/throw=] a "{{DataError}}" {{DOMException}}.
- 1. Let |inputDesc| be |input|.{{MLOperand/[[descriptor]]}}.
- 1. If the sequence length of |newShape| is not equal to the [=rank=] of |inputDesc|, then [=exception/throw=] a "{{DataError}}" {{DOMException}}.
- 1. Let |outputDesc| be a copy of |inputDesc|.
- 1. [=list/For each=] |index| in [=the range=] 0 to the [=rank=] of |input|, exclusive:
-   1. Let |size| be |input|.{{MLOperand/shape()}}[|index|].
-   1. If |size| is not equal to 1 and not equal to |newShape|[|index|], then [=exception/throw=] a "{{DataError}}" {{DOMException}}.
-   1. If |size| is equal to 1, then let |outputDesc|.{{MLOperandDescriptor/dimensions}}[|index|] be |newShape|[|index|].
+ 1. Let |outputDescriptor| be a new {{MLOperandDescriptor}}.
+ 1. Set |outputDescriptor|.{{MLOperandDescriptor/dataType}} to |input|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dataType}}.
+ 1. Set |outputDescriptor|.{{MLOperandDescriptor/dimensions}} to the result of [=unidirectionally broadcasting the shapes=] |input|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}} and |newShape|.
+ 1. If that returns failure, then [=exception/throw=] a "{{DataError}}" {{DOMException}}.
  1. If any of the following sub-steps fail, [=exception/throw=] an "{{OperationError}}" {{DOMException}}.
- 1. Let |output| be the result of [=creating an MLOperand=] given [=this=] and |outputDesc|.
+ 1. Let |output| be the result of [=creating an MLOperand=] given [=this=] and |outputDescriptor|.
  1. Make a request to the underlying platform to:
    1. Create [=platform operator=] |expandImpl| for this method, given |input| and |newShape|.
    1. Set |output|.{{MLOperand/[[operator]]}} to |expandImpl|.
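A non-normative sketch of expand() under the broadcasting-based steps, again reusing the `builder` from the earlier sketch; the shapes are illustrative.

```js
// Sketch: expand() now validates that the input shape is unidirectionally
// broadcastable to newShape, rather than using bespoke per-dimension steps.
const bias = builder.input('bias', {dataType: 'float32', dimensions: [1, 4]});

// [1, 4] unidirectionally broadcasts to [3, 4], so the output shape is [3, 4];
// a request such as expand(bias, [3, 5]) would throw a DataError.
const tiled = builder.expand(bias, [3, 4]);
```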
- Calculate the [general matrix multiplication of the Basic Linear Algebra Subprograms](https://en.wikipedia.org/wiki/Basic_Linear_Algebra_Subprograms#Level_3). The calculation follows the expression `alpha * A * B + beta * C`, where `A` is a 2-D tensor with shape [M, K] or [K, M], `B` is a 2-D tensor with shape [K, N] or [N, K], and `C` is broadcastable to the shape [M, N]. `A` and `B` may optionally be transposed prior to the calculation.
+ Calculate the [general matrix multiplication of the Basic Linear Algebra Subprograms](https://en.wikipedia.org/wiki/Basic_Linear_Algebra_Subprograms#Level_3). The calculation follows the expression `alpha * A * B + beta * C`, where `A` is a 2-D tensor with shape [M, K] or [K, M], `B` is a 2-D tensor with shape [K, N] or [N, K], and `C` is [=unidirectionally broadcastable=] to the shape [M, N]. `A` and `B` may optionally be transposed prior to the calculation.
- The third input tensor. It is either a scalar, or of the shape that is unidirectionally broadcastable to the shape [M, N] according to [[!numpy-broadcasting-rule]]. When it is not specified, the computation is done as if *c* is a scalar 0.0.
+ The third input tensor. It is either a scalar, or of the shape that is [=unidirectionally broadcastable=] to the shape [M, N]. When it is not specified, the computation is done as if *c* is a scalar 0.0.
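To make the `C` requirement concrete, here is a minimal, non-normative sketch of gemm() with a bias that is unidirectionally broadcastable to [M, N]; it reuses the `builder` from the earlier sketch and the shapes are illustrative. The relevant validation step follows below.

```js
// Sketch: A is [M, K] = [3, 5], B is [K, N] = [5, 4], so the output is [3, 4].
// C of shape [4] (or [1, 4], or a scalar) broadcasts to [3, 4];
// a C of shape [2, 4] would fail the check below and throw a DataError.
const A = builder.input('A', {dataType: 'float32', dimensions: [3, 5]});
const B = builder.input('B', {dataType: 'float32', dimensions: [5, 4]});
const C = builder.input('C', {dataType: 'float32', dimensions: [4]});

const out = builder.gemm(A, B, {c: C, alpha: 1.0, beta: 1.0});
```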
  1. If |options|.{{MLGemmOptions/aTranspose}} is true, then let |shapeA| be the reverse array of |shapeA|.
  1. If |options|.{{MLGemmOptions/bTranspose}} is true, then let |shapeB| be the reverse array of |shapeB|.
  1. If |shapeA|[1] is not equal to |shapeB|[0], then [=exception/throw=] a "{{DataError}}" {{DOMException}}.
- 1. If |options|.{{MLGemmOptions/c}} [=map/exists=] and is not unidirectionally broadcastable to the shape [|shapeA|[0], |shapeB|[1]] according to the [[!numpy-broadcasting-rule]], then [=exception/throw=] a "{{DataError}}" {{DOMException}}.
+ 1. If |options|.{{MLGemmOptions/c}} [=map/exists=] and is not [=unidirectionally broadcastable=] to the shape [|shapeA|[0], |shapeB|[1]], then [=exception/throw=] a "{{DataError}}" {{DOMException}}.
  <div class="note">
  Type compatibility between |a|, |b| and |options|.{{MLGemmOptions/c}} can also be checked.
- - *slope*: an {{MLOperand}}. The slope tensor. Its shape is either the same as, or unidirectionally broadcastable to the shape of input tensor *input* according to [[!numpy-broadcasting-rule]].
+ - *slope*: an {{MLOperand}}. The slope tensor. Its shape is either the same as, or [=unidirectionally broadcastable=] to, the shape of the input tensor *input*.

  **Returns:**
  - an {{MLOperand}}. The output tensor of the same shape as *input*.

  1. Let |descriptor| be a new {{MLOperandDescriptor}}.
  1. Set |descriptor|.{{MLOperandDescriptor/dataType}} to |input|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dataType}}.
- 1. Set |descriptor|.{{MLOperandDescriptor/dimensions}} to the result of running the [=MLGraphBuilder/broadcast-shapes=] steps given |input|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}} and |slope|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}}.
- 1. If that [=exception/throws=] an error, re-[=exception/throw=] the error.
+ 1. Set |descriptor|.{{MLOperandDescriptor/dimensions}} to the result of [=unidirectionally broadcasting the shapes=] |slope|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}} and |input|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}}.
+ 1. If that returns failure, then [=exception/throw=] a "{{DataError}}" {{DOMException}}.
  1. If any of the following sub-steps fail, [=exception/throw=] an "{{OperationError}}" {{DOMException}}.
  1. Let |output| be the result of [=creating an MLOperand=] given [=this=] and |descriptor|.
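A non-normative sketch of prelu() with a per-channel slope, matching the corrected parameter order from this commit (the slope is broadcast to the input, not the other way around); it reuses the `builder` from the earlier sketch.

```js
// Sketch: slope shape [4] is unidirectionally broadcastable to the input
// shape [2, 3, 4], so the output shape is [2, 3, 4], the same as the input.
const x = builder.input('x', {dataType: 'float32', dimensions: [2, 3, 4]});
const slope = builder.constant(
    {dataType: 'float32', dimensions: [4]},
    new Float32Array([0.1, 0.1, 0.2, 0.2]));
const y = builder.prelu(x, slope);
```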
  1. If |input|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dataType}} is not equal to |other|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dataType}}, then [=exception/throw=] a "{{DataError}}" {{DOMException}}.
  1. Let |descriptor| be a new {{MLOperandDescriptor}}.
  1. Set |descriptor|.{{MLOperandDescriptor/dataType}} to |input|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dataType}}.
- 1. Set |descriptor|.{{MLOperandDescriptor/dimensions}} to the result of running the [=MLGraphBuilder/broadcast-shapes=] steps given |input|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}} and |other|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}}.
- 1. If that [=exception/throws=] an error, re-[=exception/throw=] the error.
- 1. If |condition| is not unidirectionally broadcastable to |descriptor|.{{MLOperandDescriptor/dimensions}} according to the [[!numpy-broadcasting-rule]], then [=exception/throw=] a "{{DataError}}" {{DOMException}}.
+ 1. Set |descriptor|.{{MLOperandDescriptor/dimensions}} to the result of [=bidirectionally broadcasting the shapes=] |input|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}} and |other|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}}.
+ 1. If that returns failure, then [=exception/throw=] a "{{DataError}}" {{DOMException}}.
+ 1. If |condition| is not [=bidirectionally broadcastable=] to |descriptor|.{{MLOperandDescriptor/dimensions}}, then [=exception/throw=] a "{{DataError}}" {{DOMException}}.
  1. If any of the following sub-steps fail, [=exception/throw=] an "{{OperationError}}" {{DOMException}}.
  1. Let |output| be the result of [=creating an MLOperand=] given [=this=] and |descriptor|.
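A non-normative sketch of where() under these steps, reusing the `builder` from the earlier sketch; shapes and names are illustrative.

```js
// Sketch: the true/false operands [2, 4] and [4] bidirectionally broadcast
// to [2, 4], and the uint8 condition of shape [1, 4] is bidirectionally
// broadcastable to that result, so the output shape is [2, 4].
const cond = builder.input('cond', {dataType: 'uint8', dimensions: [1, 4]});
const ifTrue = builder.input('t', {dataType: 'float32', dimensions: [2, 4]});
const ifFalse = builder.input('f', {dataType: 'float32', dimensions: [4]});
const selected = builder.where(cond, ifTrue, ifFalse);
```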
+ Broadcasting refers to how operations treat tensors with different shapes, and follows the precedent set by [[!numpy-broadcasting-rule]].
+
+ <div algorithm>
+ To <dfn data-lt="unidirectionally broadcasting the shapes">unidirectionally broadcast the shapes</dfn> |A| and |B|, perform the following steps. |A| and |B| are [=/lists=] of positive integers, representing the dimensions of tensors, and the steps return a new [=/list=] of positive integers, or failure.
+
+ 1. Let |sizeA| be the [=list/size=] of |A|.
+ 1. Let |sizeB| be the [=list/size=] of |B|.
+ 1. If |sizeA| > |sizeB|, then return failure.
+ 1. Let |paddedA| be a [=list/clone=] of |A|.
+ 1. While |paddedA|'s [=list/size=] is less than |sizeB|, [=list/prepend=] 1 to |paddedA|.
+ 1. Let |outputShape| be a new [=/list=].
+ 1. [=list/For each=] |index| in [=the range=] 0 to |sizeB|, exclusive:
+     1. Let |dimA| be |paddedA|[|index|].
+     1. Let |dimB| be |B|[|index|].
+     1. If |dimA| is not equal to |dimB| and |dimA| is not equal to 1, then return failure.
+     1. [=list/Append=] |dimB| to |outputShape|.
+ 1. Return |outputShape|.
+
+ </div>
+
+ <div algorithm>
+ |A| is <dfn>unidirectionally broadcastable</dfn> to |B| if [=unidirectionally broadcasting the shapes=] |A| and |B| does not result in failure.
+ </div>
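For reviewers, a non-normative JavaScript sketch of the unidirectional algorithm above; the function name is illustrative and failure is modeled as `null`.

```js
// Mirrors the steps above: can shapeA be broadcast to shapeB?
// Returns a copy of shapeB on success, or null on failure.
function unidirectionallyBroadcastShapes(shapeA, shapeB) {
  if (shapeA.length > shapeB.length) return null;
  const paddedA = Array(shapeB.length - shapeA.length).fill(1).concat(shapeA);
  const outputShape = [];
  for (let i = 0; i < shapeB.length; ++i) {
    if (paddedA[i] !== shapeB[i] && paddedA[i] !== 1) return null;
    outputShape.push(shapeB[i]);
  }
  return outputShape;
}

unidirectionallyBroadcastShapes([4], [2, 3, 4]);    // [2, 3, 4]
unidirectionallyBroadcastShapes([2, 4], [2, 3, 4]); // null: padded [1, 2, 4] has 2 !== 3
```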
+
+ <div algorithm>
+ To <dfn data-lt="bidirectionally broadcasting the shapes">bidirectionally broadcast the shapes</dfn> |A| and |B|, perform the following steps. |A| and |B| are [=/lists=] of positive integers, representing the dimensions of tensors, and the steps return a new [=/list=] of positive integers, or failure.
+
+ 1. Let |sizeA| be the [=list/size=] of |A|.
+ 1. Let |sizeB| be the [=list/size=] of |B|.
+ 1. Let |outputSize| be the maximum of |sizeA| and |sizeB|.
+ 1. Let |paddedA| be a [=list/clone=] of |A|.
+ 1. While |paddedA|'s [=list/size=] is less than |outputSize|, [=list/prepend=] 1 to |paddedA|.
+ 1. Let |paddedB| be a [=list/clone=] of |B|.
+ 1. While |paddedB|'s [=list/size=] is less than |outputSize|, [=list/prepend=] 1 to |paddedB|.
+ 1. Let |outputShape| be a new [=/list=].
+ 1. [=list/For each=] |index| in [=the range=] 0 to |outputSize|, exclusive:
+     1. Let |dimA| be |paddedA|[|index|].
+     1. Let |dimB| be |paddedB|[|index|].
+     1. If |dimA| is not equal to |dimB|, and |dimA| is not equal to 1, and |dimB| is not equal to 1, then return failure.
+     1. [=list/Append=] the maximum of |dimA| and |dimB| to |outputShape|.
+ 1. Return |outputShape|.
+
+ </div>
+
+ <div algorithm>
+ |A| is <dfn>bidirectionally broadcastable</dfn> to |B| if [=bidirectionally broadcasting the shapes=] |A| and |B| does not result in failure.
+ </div>
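And a matching non-normative sketch of the bidirectional algorithm; again the function name is illustrative and failure is modeled as `null`.

```js
// Mirrors the steps above: left-pad both shapes with 1s, require each pair of
// dimensions to match or include a 1, and take the maximum for the output.
function bidirectionallyBroadcastShapes(shapeA, shapeB) {
  const size = Math.max(shapeA.length, shapeB.length);
  const paddedA = Array(size - shapeA.length).fill(1).concat(shapeA);
  const paddedB = Array(size - shapeB.length).fill(1).concat(shapeB);
  const outputShape = [];
  for (let i = 0; i < size; ++i) {
    if (paddedA[i] !== paddedB[i] && paddedA[i] !== 1 && paddedB[i] !== 1) return null;
    outputShape.push(Math.max(paddedA[i], paddedB[i]));
  }
  return outputShape;
}

bidirectionallyBroadcastShapes([2, 3, 4], [3, 1]); // [2, 3, 4]
bidirectionallyBroadcastShapes([2, 3], [4, 3]);    // null: 2 vs 4, neither is 1
```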