### A continuum among logarithmic, linear, and exponential functions, and its potential to improve generalization in neural networks

**2016-02-03**

1602.01321 | cs.NE

We present the soft exponential activation function for artificial neural
networks that continuously interpolates between logarithmic, linear, and
exponential functions. This activation function is simple, differentiable, and
parameterized so that it can be trained as the rest of the network is trained.
We hypothesize that soft exponential has the potential to improve neural
network learning, as it can exactly calculate many natural operations that
typical neural networks can only approximate, including addition,
multiplication, inner product, distance, polynomials, and sinusoids.
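The abstract does not reproduce the formula, but the interpolation it describes is commonly stated as a piecewise function of a trainable parameter α, with α = -1 giving the natural logarithm, α = 0 the identity, and α = 1 the exponential. A minimal sketch, assuming that standard piecewise form (reconstructed here, not quoted from the paper):

```python
import math

def soft_exponential(alpha: float, x: float) -> float:
    """Sketch of the soft exponential activation: continuously
    interpolates between log (alpha < 0), identity (alpha = 0),
    and exp (alpha > 0). Alpha is intended to be learned."""
    if alpha < 0:
        # Requires 1 - alpha * (x + alpha) > 0 for the log to be defined.
        return -math.log(1.0 - alpha * (x + alpha)) / alpha
    if alpha == 0:
        return x
    return (math.exp(alpha * x) - 1.0) / alpha + alpha

# alpha = 1  reduces to exp(x)
# alpha = -1 reduces to ln(x)
# alpha = 0  is the identity
```

Both branches approach `x` as α → 0, so the family is continuous in α, which is what lets the parameter be trained by gradient descent along with the rest of the network.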


# Related Articles

- **2017-10-16** · 1710.05941 | cs.NE: "The choice of activation functions in deep networks has a significant effect on the training dynamic…"
- **2015-05-29** · 1506.00019 | cs.LG: "Countless learning tasks require dealing with sequential data. Image captioning, speech synthesis, a…"
- **2015-11-09** · 1511.02580 | cs.LG: "We propose ways to improve the performance of fully connected networks. We found that two approaches…"
- **2014-12-21** · 1412.6830 | cs.NE: "Artificial neural networks typically have a fixed, non-linear activation function at each neuron. We…"
- **2019-02-13** · 1902.04704 | q-bio.NC: "Originally inspired by neurobiology, deep neural network models have become a powerful tool of machi…"
- **2018-01-17** · 1801.05894 | math.HO: "Multilayered artificial neural networks are becoming a pervasive tool in a host of application field…"
- **2016-10-03** · 1610.01145 | cs.LG: "We study expressive power of shallow and deep neural networks with piece-wise linear activation func…"
- **2019-01-01** · 1901.05894 | cs.CV: "The activation function in neural network is one of the important aspects which facilitates the deep…"
- **2016-10-31** · 1610.10087 | cs.NE: "We present a novel neural network algorithm, the Tensor Switching (TS) network, which generalizes th…"
- **2018-02-01** · 1802.00212 | cs.LG: "In this paper, we introduce 'Power Linear Unit' (PoLU) which increases the nonlinearity capacity of…"
- **2018-03-19** · 1804.11237 | cs.NE: "'Biologically inspired' activation functions, such as the logistic sigmoid, have been instrumental i…"
- **2017-05-25** · 1705.09137 | cs.NE: "We present a neural network technique for the analysis and extrapolation of time-series data called…"
- **2019-01-17** · 1901.06261 | cs.LG: "Application of neural networks to a vast variety of practical applications is transforming the way A…"
- **2019-01-28** · 1901.09849 | cs.LG: "Many neural network architectures rely on the choice of the activation function for each hidden laye…"
- **2019-05-23** · 1905.09574 | cs.NE: "In the present study, an amplifying neuron and attenuating neuron, which can be easily implemented i…"
- **2018-12-31** · 1812.11800 | cs.LG: "The deployment of Deep neural networks (DNN) on edge devices has been difficult because they are res…"