"comment":"\nLog Activation Operator.\n\n$out = \\ln(x)$\n\nNatural logarithm of x.\n\n",
"inputs":[
{
"name":"X",
"comment":"Input of Log operator",
"duplicable":0,
"intermediate":0
}],
"outputs":[
{
"name":"Out",
"comment":"Output of Log operator",
"duplicable":0,
"intermediate":0
}],
"attrs":[]
},{
"type":"softmax",
"comment":"\nSoftmax Operator.\n\nThe input of the softmax operator is a 2-D tensor with shape N x K (N is the\nbatch_size, K is the dimension of input feature). The output tensor has the\nsame shape as the input tensor.\n\nFor each row of the input tensor, the softmax operator squashes the\nK-dimensional vector of arbitrary real values to a K-dimensional vector of real\nvalues in the range [0, 1] that add up to 1.\nIt computes the exponential of the given dimension and the sum of exponential\nvalues of all the other dimensions in the K-dimensional vector input.\nThen the ratio of the exponential of the given dimension and the sum of\nexponential values of all the other dimensions is the output of the softmax\noperator.\n\nFor each row $i$ and each column $j$ in Input(X), we have:\n $$Out[i, j] = \\frac{\\exp(X[i, j])}{\\sum_j(exp(X[i, j])}$$\n\n",
...
...
"comment":"(bool, default false) If true, output a scalar reduced along all dimensions.",
"comment":"(float, default 1.0e-6) Constant for numerical stability",
"generated":0
}]
},{
"type":"nce",
"comment":"\nCompute and return the noise-contrastive estimation training loss.\nSee [Noise-contrastive estimation: A new estimation principle for unnormalized statistical models](http://www.jmlr.org/proceedings/papers/v9/gutmann10a/gutmann10a.pdf).\nBy default this operator uses a uniform distribution for sampling.\n",