Commit e948f7e

add back gelu
1 parent 78d9d7b commit e948f7e

File tree

1 file changed: +18 -0 lines changed


index.bs

Lines changed: 18 additions & 0 deletions
@@ -2755,6 +2755,7 @@ Compute the <a href="https://en.wikipedia.org/wiki/Rectifier_(neural_networks)#G
 <script type=idl>
 partial interface MLGraphBuilder {
   MLOperand gelu(MLOperand input);
+  MLActivation gelu();
 };
 </script>
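
Not part of the commit itself: a minimal sketch of how the two gelu() overloads could be exercised from script. The surrounding API surface (navigator.ml.createContext(), the MLGraphBuilder constructor, the input() descriptor members, build()) is assumed from WebNN drafts of this era and has shifted between revisions, so treat the identifiers as assumptions rather than the normative API.

```ts
// Sketch only: descriptor and method names are assumptions, not part of this diff.
async function geluSketch(): Promise<void> {
  const context = await navigator.ml.createContext();
  const builder = new MLGraphBuilder(context);

  // Overload 1: MLOperand gelu(MLOperand input) — adds an element-wise gelu node.
  const x = builder.input("x", { dataType: "float32", dimensions: [2, 4] });
  const y = builder.gelu(x);

  // Overload 2 (added by this commit): MLActivation gelu() — an activation
  // object, usable wherever an options dictionary accepts an MLActivation.
  const geluActivation = builder.gelu();

  // Build a graph with the element-wise result as its named output.
  const graph = await builder.build({ y });
}
```

The activation form is intended for fusion: a backend can fold gelu into the producing operation (for example through an activation member on an options dictionary) instead of emitting a standalone node, while the operand form remains the general-purpose graph node.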

@@ -2798,6 +2799,23 @@ partial interface MLGraphBuilder {
 1. Return |output|.
 </details>
 
+#### {{MLGraphBuilder/gelu()}} #### {#api-mlgraphbuilder-gelu}
+<div>
+**Arguments:**
+- None.
+
+**Returns:**
+- an {{MLActivation}}. The activation function representing the gelu operation.
+</div>
+
+<details open algorithm>
+<summary>
+The <dfn method for=MLGraphBuilder id=gelu-noargs>gelu()</dfn> method steps are:
+</summary>
+1. Let |op| be the result of [=creating an MLActivation=] given [=this=] and "gelu".
+1. Return |op|.
+</details>
+
 ### gemm ### {#api-mlgraphbuilder-gemm}
 Calculate the [general matrix multiplication of the Basic Linear Algebra Subprograms](https://en.wikipedia.org/wiki/Basic_Linear_Algebra_Subprograms#Level_3). The calculation follows the expression `alpha * A * B + beta * C`, where `A` is a 2-D tensor with shape [M, K] or [K, M], `B` is a 2-D tensor with shape [K, N] or [N, K], and `C` is [=unidirectionally broadcastable=] to the shape [M, N]. `A` and `B` may optionally be transposed prior to the calculation.
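
As a companion to the gemm context lines above, here is a rough sketch of how the `alpha * A * B + beta * C` expression maps onto a builder call. The option names (c, alpha, beta, aTranspose, bTranspose) are assumed from the MLGemmOptions dictionary and are not part of this commit; verify them against the current spec text.

```ts
// Sketch of alpha * A * B + beta * C. Shapes: A is [M, K], B is [K, N],
// C unidirectionally broadcastable to [M, N]. Option names are assumptions.
function gemmSketch(
  builder: MLGraphBuilder,
  a: MLOperand,
  b: MLOperand,
  c: MLOperand,
): MLOperand {
  return builder.gemm(a, b, {
    c,                  // contributes beta * C to the result
    alpha: 1.0,         // scales the A * B product
    beta: 1.0,          // scales C
    aTranspose: false,  // set to true if A is supplied as [K, M]
    bTranspose: false,  // set to true if B is supplied as [N, K]
  });
}
```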
