Softmax

Tags: Applied maths
The term softmax can refer to two mathematical functions:

  • In machine learning, the softmax is the function that normalizes a vector of values into a probability vector: $\text{softmax}(\mathbf{x}) = \dfrac{e^{\mathbf{x}}}{\sum_i e^{x_i}}$, where $\mathbf{x} = (x_i) \in \mathbb{R}^n$. This function could also be called soft-argmax because it is a smooth approximation of the discrete argmax function (see the first sketch below).
  • It may also refer to a smoothed maximum function such as $\epsilon \log \sum_i \exp(x_i / \epsilon)$, which approximates the $\max$ function in the limit $\epsilon \rightarrow 0$ (see the second sketch below).
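
A minimal numerical sketch of the first definition, assuming NumPy is available (the function name and the stability shift are illustrative choices, not part of the note):

```python
import numpy as np

def softmax(x):
    """Normalize a real vector x into a probability vector."""
    # Shift by max(x) for numerical stability; softmax is invariant to
    # adding a constant to every component, so the result is unchanged.
    z = np.exp(x - np.max(x))
    return z / np.sum(z)

x = np.array([1.0, 2.0, 3.0])
p = softmax(x)
print(p)        # roughly [0.090, 0.245, 0.665]
print(p.sum())  # 1.0
```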
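
And a sketch of the second, smoothed-maximum sense, again assuming NumPy; shrinking $\epsilon$ drives the value toward $\max_i x_i$:

```python
import numpy as np

def smooth_max(x, eps):
    """Smoothed maximum: eps * log(sum_i exp(x_i / eps))."""
    # Factor out max(x) before exponentiating to avoid overflow;
    # algebraically this is the same expression.
    m = np.max(x)
    return m + eps * np.log(np.sum(np.exp((x - m) / eps)))

x = np.array([1.0, 2.0, 3.0])
for eps in (1.0, 0.1, 0.01):
    print(eps, smooth_max(x, eps))  # approaches max(x) = 3.0 as eps -> 0
```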