Trying to understand cross_entropy loss in PyTorch
https://stackoverflow.com/questions/57161524/trying-to-understand-cross-entropy-loss-in-pytorch
Jul 23, 2019 · torch.nn.functional.cross_entropy combines log_softmax (softmax followed by a logarithm) and nll_loss (negative log likelihood loss) in a single function, i.e. it is equivalent to F.nll_loss(F.log_softmax(x, 1), y). Code:

    import torch
    import torch.nn.functional as F

    x = torch.FloatTensor([[1., 0., 0.],
                           [0., 1., 0.],
                           [0., 0., 1.]])
    y = torch.LongTensor([0, 1, 2])
    print(F.cross_entropy(x, y))
    print(F.nll_loss(F.log_softmax(x, 1), y))  # same value

Both calls print the same loss, confirming the equivalence.
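To see exactly what the fused function computes, the same loss can be reproduced by hand in plain Python, with no PyTorch at all. This is a sketch using the same logits and targets as the snippet above: log_softmax is the log of the normalized exponentials (shifted by the row max for numerical stability), and nll_loss picks out the log-probability of each target class, negates it, and averages over the batch.

```python
import math

# Logits and integer class targets matching the tensors above.
x = [[1., 0., 0.], [0., 1., 0.], [0., 0., 1.]]
y = [0, 1, 2]

def log_softmax(row):
    # Subtract the max for numerical stability, then normalize.
    m = max(row)
    exps = [math.exp(v - m) for v in row]
    total = sum(exps)
    return [math.log(e / total) for e in exps]

# nll_loss: take the log-probability at the target index,
# negate it, and average over the batch.
losses = [-log_softmax(row)[t] for row, t in zip(x, y)]
loss = sum(losses) / len(losses)
print(round(loss, 4))  # ≈ 0.5514, matching F.cross_entropy(x, y)
```

Because each row here is a one-hot pattern of logits, all three per-sample losses are identical: -log(e^1 / (e^1 + 2·e^0)) ≈ 0.5514.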