
torch.nn.functional.nll_loss

torch.nn.functional.nll_loss(input, target, weight=None, size_average=None, ignore_index=-100, reduce=None, reduction='mean')

Compute the negative log likelihood loss.

See NLLLoss for details.

Parameters
  • input (Tensor) – (N, C) where C = number of classes, or (N, C, H, W) in the case of 2D loss, or (N, C, d_1, d_2, ..., d_K) where K ≥ 1 in the case of K-dimensional loss. input is expected to be log-probabilities.

  • target (Tensor) – (N) where each value is 0 ≤ targets[i] ≤ C−1, or (N, d_1, d_2, ..., d_K) where K ≥ 1 for K-dimensional loss.

  • weight (Tensor, optional) – A manual rescaling weight given to each class. If given, it has to be a Tensor of size C (see the sketch after this parameter list).

  • size_average (bool, optional) – Deprecated (see reduction).

  • ignore_index (int, optional) – Specifies a target value that is ignored and does not contribute to the input gradient. When size_average is True, the loss is averaged over non-ignored targets. Default: -100

  • reduce (bool, optional) – Deprecated (see reduction).

  • reduction (str, optional) – Specifies the reduction to apply to the output: 'none' | 'mean' | 'sum'. 'none': no reduction will be applied, 'mean': the sum of the output will be divided by the number of elements in the output, 'sum': the output will be summed. Note: size_average and reduce are in the process of being deprecated, and in the meantime, specifying either of those two args will override reduction. Default: 'mean'
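
A minimal sketch of how weight and ignore_index change the result, using a hypothetical 3 x 5 batch and an arbitrary per-class weight vector chosen only for illustration:

>>> import torch
>>> import torch.nn.functional as F
>>> log_probs = F.log_softmax(torch.randn(3, 5), dim=1)  # (N, C) = (3, 5)
>>> target = torch.tensor([1, 0, 4])
>>> # up-weight class 4; all other classes keep weight 1
>>> w = torch.tensor([1.0, 1.0, 1.0, 1.0, 2.0])
>>> F.nll_loss(log_probs, target, weight=w)  # weighted mean over the batch
>>> # samples whose target equals 0 contribute neither loss nor gradient
>>> F.nll_loss(log_probs, target, ignore_index=0)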

Return type

Tensor

Example:

>>> import torch
>>> import torch.nn.functional as F
>>> # input is of size N x C = 3 x 5
>>> input = torch.randn(3, 5, requires_grad=True)
>>> # each element in target has to have 0 <= value < C
>>> target = torch.tensor([1, 0, 4])
>>> output = F.nll_loss(F.log_softmax(input, dim=1), target)
>>> output.backward()
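
The reduction argument changes only how the per-element losses are aggregated. A short sketch reusing input and target from the example above:

>>> log_probs = F.log_softmax(input, dim=1)
>>> F.nll_loss(log_probs, target, reduction='none')  # per-sample losses, shape (3,)
>>> F.nll_loss(log_probs, target, reduction='sum')   # scalar sum of the three losses
>>> F.nll_loss(log_probs, target, reduction='mean')  # the default: a scalar average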
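
For the 2D case described under input, the loss is computed per spatial location. A minimal sketch with hypothetical sizes (batch of 2, 5 classes, 4 x 4 images); img and seg_target are illustrative names, not part of the API:

>>> # (N, C, H, W) log-probabilities over the class dimension C
>>> img = torch.randn(2, 5, 4, 4)
>>> # (N, H, W) integer class targets, each value in [0, C)
>>> seg_target = torch.randint(0, 5, (2, 4, 4))
>>> F.nll_loss(F.log_softmax(img, dim=1), seg_target)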
