nn.shifted_softplus

nn.shifted_softplus(x: torch.Tensor)

Compute the shifted soft-plus activation function.

\[y = \ln\left(1 + e^{x}\right) - \ln(2)\]
Parameters:

x (torch.Tensor) – input tensor.

Returns:

shifted soft-plus of the input tensor.

Return type:

torch.Tensor
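
A minimal sketch of how this activation can be evaluated, assuming it simply offsets torch.nn.functional.softplus by ln(2) so that the output is zero at the origin (the exact implementation may differ):

    import math

    import torch
    import torch.nn.functional as F

    def shifted_softplus(x: torch.Tensor) -> torch.Tensor:
        # Shifted soft-plus: ln(1 + e^x) - ln(2); equals 0 at x = 0
        # and approaches the identity for large positive x.
        return F.softplus(x) - math.log(2.0)

    # Usage sketch: apply the activation elementwise to a tensor.
    x = torch.linspace(-2.0, 2.0, steps=5)
    print(shifted_softplus(x))  # the middle element (x = 0) is 0

The ln(2) shift keeps the activation centered at zero, which is the usual motivation for preferring it over the plain soft-plus in networks that apply it after zero-initialized or zero-mean layers.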