Compute the <a href="https://en.wikipedia.org/wiki/Rectifier_(neural_networks)#Gaussian-error_linear_unit_(GELU)">Gaussian error linear unit function</a> (GELU) of the input tensor. The calculation follows the expression `0.5 * x * (1 + erf(x / sqrt(2)))`.
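<div class="example">
For illustration only, the same expression evaluated on a scalar in plain JavaScript. Since `Math` has no built-in `erf`, this sketch approximates it with the Abramowitz-Stegun formula 7.1.26 (absolute error below 1.5e-7); it is not part of the specification:
<pre highlight="js">
// erf(x) via Abramowitz-Stegun 7.1.26, extended to all real x by odd symmetry.
function erf(x) {
  const sign = Math.sign(x);
  x = Math.abs(x);
  const t = 1 / (1 + 0.3275911 * x);
  const poly = ((((1.061405429 * t - 1.453152027) * t + 1.421413741) * t
      - 0.284496736) * t + 0.254829592) * t;
  return sign * (1 - poly * Math.exp(-x * x));
}

// GELU(x) = 0.5 * x * (1 + erf(x / sqrt(2))), i.e. x * Φ(x) for the
// standard normal CDF Φ.
function gelu(x) {
  return 0.5 * x * (1 + erf(x / Math.SQRT2));
}

console.log(gelu(1)); // ≈ 0.8413, since Φ(1) ≈ 0.8413
</pre>
</div>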
<script type=idl>
partial interface MLGraphBuilder {
  MLOperand gelu(MLOperand input);
  MLActivation gelu();
};
</script>
<div class="note">
<details open>
<summary>
The behavior of this operation can be generically emulated from the usage of
other operations as follows. However, user agents typically have a more
efficient implementation for it. Therefore its usage is encouraged from the
performance point of view.
</summary>
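A sketch of such an emulation, assuming scalar `constant()` and element-wise `mul()`, `add()`, `div()`, `sqrt()`, and `erf()` builder methods (the exact `constant()` signature varies across revisions of this specification):
<pre highlight="js">
// Emulate gelu(input) as 0.5 * x * (1 + erf(x / sqrt(2))) using other ops.
const output = builder.mul(
    builder.mul(input, builder.constant(0.5)),
    builder.add(
        builder.constant(1),
        builder.erf(
            builder.div(input, builder.sqrt(builder.constant(2))))));
</pre>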
</details>
</div>

<div>
**Returns:**
- an {{MLActivation}}. The activation function representing the gelu operation.
</div>
<details open algorithm>
<summary>
The <dfn method for=MLGraphBuilder id=gelu-noargs>gelu()</dfn> method steps are:
</summary>
1. Let |op| be the result of [=creating an MLActivation=] given [=this=] and "gelu".
1. Return |op|.
</details>
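<div class="example">
A hypothetical usage sketch: the returned {{MLActivation}} can be passed wherever an options dictionary accepts a fused activation, for example the `activation` member of conv2d options in revisions of this specification that supported fusion:
<pre highlight="js">
// Build a conv2d with gelu fused as its activation function.
const output = builder.conv2d(input, filter, {activation: builder.gelu()});
</pre>
</div>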
### gemm ### {#api-mlgraphbuilder-gemm}
Calculate the [general matrix multiplication of the Basic Linear Algebra Subprograms](https://en.wikipedia.org/wiki/Basic_Linear_Algebra_Subprograms#Level_3). The calculation follows the expression `alpha * A * B + beta * C`, where `A` is a 2-D tensor with shape [M, K] or [K, M], `B` is a 2-D tensor with shape [K, N] or [N, K], and `C` is [=unidirectionally broadcastable=] to the shape [M, N]. `A` and `B` may optionally be transposed prior to the calculation.
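<div class="example">
To make the shapes and scaling concrete, here is a minimal plain-JavaScript sketch of the same expression on row-major nested arrays. It is not part of the specification and assumes `C`, when present, has already been unidirectionally broadcast to the shape [M, N]:
<pre highlight="js">
function transpose(m) {
  return m[0].map((_, j) => m.map(row => row[j]));
}

// alpha * A * B + beta * C, with optional pre-transposition of A and B.
function gemm(A, B, C, {alpha = 1.0, beta = 1.0,
                        aTranspose = false, bTranspose = false} = {}) {
  const a = aTranspose ? transpose(A) : A;  // [M, K]
  const b = bTranspose ? transpose(B) : B;  // [K, N]
  const M = a.length, K = a[0].length, N = b[0].length;
  return Array.from({length: M}, (_, m) =>
      Array.from({length: N}, (_, n) => {
        let acc = 0;
        for (let k = 0; k < K; ++k) acc += a[m][k] * b[k][n];
        return alpha * acc + (C ? beta * C[m][n] : 0);
      }));
}

// [2, 3] x [3, 2] -> [2, 2]
console.log(gemm([[1, 2, 3], [4, 5, 6]],
                 [[1, 0], [0, 1], [1, 1]])); // [[4, 5], [10, 11]]
</pre>
</div>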