
Commit 104d6cf

remove a few MLActivations definitely not used for recurrent ops
1 parent 4eda710 commit 104d6cf

File tree

1 file changed

+1
-58
lines changed


index.bs

Lines changed: 1 addition & 58 deletions
@@ -640,7 +640,7 @@ The {{MLGraphBuilder}} interface serves as a builder (factory) to construct a [=
 
 In WebNN, a [=computational graph=] is composed of <dfn>operators</dfn> which act on data, and are the nodes of the graph. {{MLOperand}}s are a representation of data that flows within the computational graph, and are the edges of the graph. {{MLOperand}}s include a [=computational graph=]'s <dfn for="computational graph">input</dfn> values for inference, <dfn for="computational graph">constants</dfn> (including trained weights) used for inference, intermediate values (often referred to as activations) computed during inference, as well as the output values of inference. An [=operator=]'s <dfn for=operator>input</dfn> is one or more {{MLOperand}}s. An [=operator=]'s <dfn for=operator>output</dfn> is one or more {{MLOperand}}s. [=Operators=] have operator-specific parameters that control their behavior, which can include zero or more <dfn for=operator lt="activation|activation function">activation functions</dfn>, which are {{MLActivation}}s.
 
-A key part of the {{MLGraphBuilder}} interface are methods such as {{MLGraphBuilder/gemm()}} and {{MLGraphBuilder/softmax(axis)|softmax()}} which create an [=operator=] which represents the actual operation to perform on the input data when the computation is run, and return a new {{MLOperand}} or {{MLActivation}} holding the operator. Methods that create an {{MLOperand}} connect any [=operator/inputs=] and [=operator/activations=] to the operator. Each method invocation returns a distinct new value, without changing the value of any other {{MLOperand}}.
+A key part of the {{MLGraphBuilder}} interface are methods such as {{MLGraphBuilder/gemm()}} and {{MLGraphBuilder/softmax()}} which create an [=operator=] which represents the actual operation to perform on the input data when the computation is run, and return a new {{MLOperand}} or {{MLActivation}} holding the operator. Methods that create an {{MLOperand}} connect any [=operator/inputs=] and [=operator/activations=] to the operator. Each method invocation returns a distinct new value, without changing the value of any other {{MLOperand}}.
 
 At inference time, every {{MLOperand}} will be bound to a tensor (the actual data), which are essentially multidimensional arrays. The representation of the tensors is implementation dependent, but it typically includes the array data stored in some buffer (memory) and some metadata describing the array data (such as its shape).
 
@@ -1603,7 +1603,6 @@ dictionary MLClampOptions {
 
 partial interface MLGraphBuilder {
   MLOperand clamp(MLOperand input, optional MLClampOptions options = {});
-  MLActivation clamp(optional MLClampOptions options = {});
 };
 </script>
 
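For context on what the remaining `clamp()` overload computes: it bounds every element of the input between `minValue` and `maxValue`. A minimal plain-JavaScript sketch of that element-wise math (this is an illustrative helper operating on arrays, not the WebNN {{MLGraphBuilder}} API):

```javascript
// Element-wise clamp over a flat array, mirroring the shape of
// MLClampOptions { minValue, maxValue }. Unset bounds default to +/-Infinity,
// which leaves values on that side unchanged.
function clamp(input, { minValue = -Infinity, maxValue = Infinity } = {}) {
  return input.map((x) => Math.min(Math.max(x, minValue), maxValue));
}
```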
@@ -1674,23 +1673,6 @@ partial interface MLGraphBuilder {
 1. Return |output|.
 </details>
 
-#### {{MLGraphBuilder/clamp(options)}} #### {#api-mlgraphbuilder-clamp-options}
-<div>
-**Arguments:**
-- *options*: an optional {{MLClampOptions}}. The optional parameters of the operation.
-**Returns:**
-- an {{MLActivation}}. The operator representing the clamp operation.
-</div>
-
-<details open algorithm>
-<summary>
-The <dfn method for=MLGraphBuilder>clamp(|options|)</dfn> method steps are:
-</summary>
-1. If [=checking clamp options=] given |options| returns false, then [=exception/throw=] a {{TypeError}}.
-1. Let |op| be the result of [=creating an MLActivation=] given [=this=], "clamp" and |options|.
-1. Return |op|.
-</details>
-
 ### concat ### {#api-mlgraphbuilder-concat}
 Concatenates the input tensors along a given axis.
 <script type=idl>
@@ -2781,7 +2763,6 @@ Compute the <a href="https://en.wikipedia.org/wiki/Rectifier_(neural_networks)#G
 <script type=idl>
 partial interface MLGraphBuilder {
   MLOperand gelu(MLOperand input);
-  MLActivation gelu();
 };
 </script>
 
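The surviving `gelu(input)` overload computes the GELU function element-wise. A plain-JavaScript sketch of the math, using the common tanh approximation since `Math` has no `erf` (a sketch only; real implementations may use the exact erf-based form):

```javascript
// GELU via the widely used tanh approximation:
//   0.5 * x * (1 + tanh(sqrt(2/pi) * (x + 0.044715 * x^3)))
// Approximates the exact 0.5 * x * (1 + erf(x / sqrt(2))).
function gelu(x) {
  return 0.5 * x * (1 + Math.tanh(Math.sqrt(2 / Math.PI) * (x + 0.044715 * x ** 3)));
}
```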
@@ -2825,23 +2806,6 @@ partial interface MLGraphBuilder {
 1. Return |output|.
 </details>
 
-#### {{MLGraphBuilder/gelu()}} #### {#api-mlgraphbuilder-gelu}
-<div>
-**Arguments:**
-- None.
-
-**Returns:**
-- an {{MLActivation}}. The activation function representing the gelu operation.
-</div>
-
-<details open algorithm>
-<summary>
-The <dfn method for=MLGraphBuilder id=gelu-noargs>gelu()</dfn> method steps are:
-</summary>
-1. Let |op| be the result of [=creating an MLActivation=] given [=this=] and "gelu".
-1. Return |op|.
-</details>
-
 ### gemm ### {#api-mlgraphbuilder-gemm}
 Calculate the [general matrix multiplication of the Basic Linear Algebra Subprograms](https://en.wikipedia.org/wiki/Basic_Linear_Algebra_Subprograms#Level_3). The calculation follows the expression `alpha * A * B + beta * C`, where `A` is a 2-D tensor with shape [M, K] or [K, M], `B` is a 2-D tensor with shape [K, N] or [N, K], and `C` is [=unidirectionally broadcastable=] to the shape [M, N]. `A` and `B` may optionally be transposed prior to the calculation.
 
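The `alpha * A * B + beta * C` expression above can be sketched in plain JavaScript over nested arrays (an illustration of the math only: no transpose options, and `C` is assumed already shaped [M, N] rather than broadcast):

```javascript
// gemm: alpha * A * B + beta * C for 2-D arrays.
// A is [M, K], B is [K, N], C is [M, N]; returns [M, N].
function gemm(A, B, C, alpha = 1, beta = 1) {
  const M = A.length, K = A[0].length, N = B[0].length;
  return Array.from({ length: M }, (_, i) =>
    Array.from({ length: N }, (_, j) => {
      let acc = 0;
      for (let k = 0; k < K; k++) acc += A[i][k] * B[k][j]; // inner product
      return alpha * acc + beta * C[i][j];
    })
  );
}
```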
@@ -5334,7 +5298,6 @@ the N-D input tensor along the given axis.
 <script type=idl>
 partial interface MLGraphBuilder {
   MLOperand softmax(MLOperand input, unsigned long axis);
-  MLActivation softmax(unsigned long axis);
 };
 </script>
 
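The remaining `softmax(input, axis)` overload normalizes values into a probability distribution along the given axis. A plain-JavaScript sketch of the 1-D case (the WebNN op applies this along `axis` of an N-D tensor; the max-subtraction is a standard numerical-stability trick, not spec text):

```javascript
// Softmax over a 1-D array: exp(x_i) / sum_j exp(x_j).
// Subtracting the max first avoids overflow without changing the result.
function softmax(x) {
  const m = Math.max(...x);
  const exps = x.map((v) => Math.exp(v - m));
  const sum = exps.reduce((a, b) => a + b, 0);
  return exps.map((e) => e / sum);
}
```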
@@ -5382,26 +5345,6 @@ partial interface MLGraphBuilder {
 1. Return |output|.
 </details>
 
-#### {{MLGraphBuilder/softmax(axis)}} #### {#api-mlgraphbuilder-softmax-axis}
-<div>
-**Arguments:**
-- None.
-
-**Returns:**
-- an {{MLActivation}}. The activation function representing the softmax operation.
-</div>
-
-<details open algorithm>
-<summary>
-The <dfn method for=MLGraphBuilder>softmax(|axis|)</dfn> method steps are:
-</summary>
-1. Let |validationSteps| given {{MLOperandDescriptor}} |descriptor| be these steps:
-    1. If |axis| is greater than or equal to |descriptor|.{{MLOperandDescriptor/dimensions}}'s [=list/size=], then return false;
-    1. Otherwise, return true.
-1. Let |op| be the result of [=creating an MLActivation=] given [=this=], "softmax", «[ "axis" → |axis| ]», and |validationSteps|.
-1. Return |op|.
-</details>
-
 ### softplus ### {#api-mlgraphbuilder-softplus-method}
 Compute the <a href="https://en.wikipedia.org/wiki/Rectifier_(neural_networks)#Softplus">softplus function</a> of the input tensor. The calculation follows the expression `ln(1 + exp(x))`.
 <script type=idl>
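The `ln(1 + exp(x))` expression in the softplus context line above is direct to sketch in plain JavaScript (the math only, applied per element by the actual op):

```javascript
// softplus: ln(1 + exp(x)), a smooth approximation of relu(x) = max(x, 0).
function softplus(x) {
  return Math.log(1 + Math.exp(x));
}
```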
