user2543622

Neural network binary classification: softmax, log-softmax, and loss function

I am building a binary classifier where the class I want to predict is present only <2% of the time. I am using PyTorch.

The last layer could be log-softmax or softmax:

`self.softmax = nn.Softmax(dim=1)` or `self.softmax = nn.LogSoftmax(dim=1)`

My questions:

  1. I should use softmax, as it will produce outputs that sum to 1, and I can then check performance at various probability thresholds. Is that understanding correct?

  2. If I use softmax, can I use cross-entropy loss? This seems to suggest that it is okay to use.

  3. If I use log-softmax, can I use cross-entropy loss? This seems to suggest that I shouldn't.

  4. If I use softmax, is there any better option than cross-entropy loss?

        `cross_entropy = nn.CrossEntropyLoss(weight=class_wts)`
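For context on questions 2 and 3, the identity behind these layer/loss pairings can be sketched numerically without torch. This is a minimal pure-Python sketch (the helper names `log_softmax`, `nll_loss`, and `cross_entropy` below are my own, not PyTorch's) of the identity that `nn.CrossEntropyLoss` is built on: it applies log-softmax internally, so it expects raw logits, and feeding it softmax outputs normalizes twice.

```python
import math

# CrossEntropyLoss(logits, y) == NLLLoss(LogSoftmax(logits), y)
# (per-sample, unweighted). So nn.CrossEntropyLoss wants RAW logits.

def log_softmax(xs):
    m = max(xs)  # shift by the max for numerical stability
    log_z = m + math.log(sum(math.exp(x - m) for x in xs))
    return [x - log_z for x in xs]

def nll_loss(log_probs, target):
    # negative log-likelihood of the true class
    return -log_probs[target]

def cross_entropy(logits, target):
    # what nn.CrossEntropyLoss computes for one sample
    return nll_loss(log_softmax(logits), target)

logits = [2.0, -1.0]   # raw scores for the two classes of one sample
target = 1             # true class

correct = cross_entropy(logits, target)

# What happens if softmax is applied by hand first (question 2):
probs = [math.exp(lp) for lp in log_softmax(logits)]  # softmax output
double_normalized = cross_entropy(probs, target)      # softmax applied twice

print(round(correct, 4))            # 3.0486
print(round(double_normalized, 4))  # 1.2448 -- a different (wrong) loss
```

For question 3, the loss meant to follow `nn.LogSoftmax` is `nn.NLLLoss`, not `nn.CrossEntropyLoss`; for question 4 with a <2% positive class, the usual options are the `weight=class_wts` you already pass, or a single-logit head with `nn.BCEWithLogitsLoss(pos_weight=...)`.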
    

pytorch

binary

classification

softmax

cross-entropy

0 Answers
