
Commit f1acf1e

add SiLU activation
1 parent 2ed7626 commit f1acf1e

3 files changed: +100 -1 lines

doc/specs/stdlib_specialfunctions_activations.md

+55 -1
@@ -354,6 +354,60 @@ Elemental function

The function returns a value with the same type and kind as input argument.

+## `SiLU` - Sigmoid Linear Unit function
+
+### Status
+
+Experimental
+
+### Description
+
+Computes the Sigmoid Linear Unit function:
+$$f(x) = \frac{x}{1+\exp(-x)} $$
+
+### Syntax
+
+`result = ` [[stdlib_specialfunctions(module):silu(interface)]] ` (x)`
+
+### Class
+
+Elemental function
+
+### Arguments
+
+`x`: Shall be a scalar or array of any `real` kind.
+
+### Return value
+
+The function returns a value with the same type and kind as input argument.
+
+## `Silu_grad` - Gradient of the Sigmoid Linear Unit function
+
+### Status
+
+Experimental
+
+### Description
+
+Computes the gradient of the Sigmoid Linear Unit function:
+$$f(x) = \frac{\exp(x)\,(x + 1 + \exp(x))}{(1+\exp(x))^2} $$
+
+### Syntax
+
+`result = ` [[stdlib_specialfunctions(module):silu_grad(interface)]] ` (x)`
+
+### Class
+
+Elemental function
+
+### Arguments
+
+`x`: Shall be a scalar or array of any `real` kind.
+
+### Return value
+
+The function returns a value with the same type and kind as input argument.
+

## `Step` - Step function

### Status
@@ -442,7 +496,7 @@ Pure function for ranks 1 to 4.

The function returns an array with the same rank and kind as the input argument `x`.

-## `Softplus_grad` - Gradient of the Softplus function
+## `Softmax_grad` - Gradient of the Softmax function

### Status
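
For reference, the `Silu_grad` formula in the spec above follows from differentiating $f(x) = x\,\sigma(x)$ with $\sigma(x) = 1/(1+\exp(-x))$; a short derivation (not part of the commit) is

$$f'(x) = \sigma(x) + x\,\sigma(x)\bigl(1-\sigma(x)\bigr)
        = \frac{\exp(x)}{1+\exp(x)} + \frac{x\,\exp(x)}{(1+\exp(x))^2}
        = \frac{\exp(x)\,(x + 1 + \exp(x))}{(1+\exp(x))^2}.$$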

src/stdlib_specialfunctions.fypp

+26
@@ -187,6 +187,32 @@ module stdlib_specialfunctions
#:endfor
end interface
public :: sigmoid_grad
+
+interface silu
+    !! Version: experimental
+    !!
+    !! Sigmoid Linear Unit function
+    #:for rk, rt in REAL_KINDS_TYPES
+    elemental module function silu_${rk}$( x ) result( y )
+        ${rt}$, intent(in) :: x
+        ${rt}$ :: y
+    end function
+    #:endfor
+end interface
+public :: silu
+
+interface silu_grad
+    !! Version: experimental
+    !!
+    !! Gradient of the Sigmoid Linear Unit function
+    #:for rk, rt in REAL_KINDS_TYPES
+    elemental module function silu_grad_${rk}$( x ) result( y )
+        ${rt}$, intent(in) :: x
+        ${rt}$ :: y
+    end function
+    #:endfor
+end interface
+public :: silu_grad

interface step
    !! Version: experimental
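
Because the interfaces above are declared `elemental`, `silu` applies element-wise to arrays of any rank. A minimal usage sketch (not part of the commit; it assumes the module is built as `stdlib_specialfunctions` and that `stdlib_kinds` provides the `sp` kind):

```fortran
program demo_silu
    use stdlib_kinds, only: sp
    use stdlib_specialfunctions, only: silu
    implicit none
    real(sp) :: x(3)

    x = [-1.0_sp, 0.0_sp, 1.0_sp]
    ! Elemental interface: SiLU is applied to each element of the array.
    print *, silu(x)   ! approximately -0.2689, 0.0000, 0.7311
end program demo_silu
```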

src/stdlib_specialfunctions_activations.fypp

+19
@@ -129,6 +129,25 @@ end function

#:endfor

+!==================================================
+! SiLU: Sigmoid Linear Unit
+!==================================================
+#:for rk, rt in REAL_KINDS_TYPES
+elemental module function silu_${rk}$( x ) result( y )
+    ${rt}$, intent(in) :: x
+    ${rt}$ :: y
+    y = x / (1._${rk}$ + exp(-x))
+end function
+
+elemental module function silu_grad_${rk}$( x ) result( y )
+    ${rt}$, intent(in) :: x
+    ${rt}$ :: y
+    y = 1._${rk}$ + exp(x)
+    y = exp(x) * ( x + y ) / y**2
+end function
+
+#:endfor
+
!==================================================
! Step
!==================================================
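
As a quick sanity check on the gradient implementation (again a sketch, not part of the commit; same module and kind assumptions as above), `silu_grad` can be compared against a central finite difference of `silu`:

```fortran
program check_silu_grad
    use stdlib_kinds, only: dp
    use stdlib_specialfunctions, only: silu, silu_grad
    implicit none
    real(dp), parameter :: h = 1.0e-6_dp
    real(dp) :: x, analytic, numeric
    integer :: i

    do i = -2, 2
        x = real(i, dp)
        analytic = silu_grad(x)
        ! Central difference approximation of d/dx SiLU(x)
        numeric  = (silu(x + h) - silu(x - h)) / (2.0_dp * h)
        print '(3f12.6)', x, analytic, numeric   ! the two derivative columns should agree
    end do
end program check_silu_grad
```

At `x = 0` both derivative columns should print `0.5`, which is a convenient spot check.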
