1 year ago
#248380
user2543622
neural network binary classification: softmax, logsoftmax and loss function
I am building a binary classifier where the class I want to predict is present only <2% of the time. I am using PyTorch.

The last layer could be `LogSoftmax` or `Softmax`:

```
self.softmax = nn.Softmax(dim=1)
# or
self.softmax = nn.LogSoftmax(dim=1)
```
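To make the difference between the two options concrete, here is a minimal sketch (the tensor values are made up for illustration): `Softmax` produces probabilities that sum to 1 per row, while `LogSoftmax` produces their logarithms, so exponentiating the latter recovers the former.

```python
import torch
import torch.nn as nn

torch.manual_seed(0)
logits = torch.randn(3, 2)  # hypothetical raw outputs for a batch of 3

probs = nn.Softmax(dim=1)(logits)       # probabilities, each row sums to 1
log_probs = nn.LogSoftmax(dim=1)(logits)  # log-probabilities

assert torch.allclose(probs.sum(dim=1), torch.ones(3))
assert torch.allclose(log_probs.exp(), probs)
```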
My questions:

1. Should I use `softmax`, since it provides outputs that sum up to 1 and I can check performance at various probability thresholds? Is that understanding correct?
2. If I use `softmax`, can I then use `cross_entropy` loss? This seems to suggest that it is okay to use.
3. If I use `logsoftmax`, can I then use `cross_entropy` loss? This seems to suggest that I shouldn't.
4. If I use `softmax`, is there any better option than `cross_entropy` loss?

```
cross_entropy = nn.CrossEntropyLoss(weight=class_wts)
```
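For context on questions 2 and 3, a minimal sketch of how these pieces fit together in PyTorch (the tensors and the 49:1 weight are hypothetical, chosen to mirror the ~2% positive rate): `nn.CrossEntropyLoss` expects raw logits, since it applies `LogSoftmax` followed by `NLLLoss` internally, so the model should not end in a `Softmax`/`LogSoftmax` layer when using it. `Softmax` is then applied separately at evaluation time to get probabilities for threshold sweeps.

```python
import torch
import torch.nn as nn

torch.manual_seed(0)
logits = torch.randn(4, 2)            # raw model outputs for a batch of 4
targets = torch.tensor([0, 0, 0, 1])  # rare positive class is label 1

class_wts = torch.tensor([1.0, 49.0])  # up-weight the ~2% class (illustrative)
cross_entropy = nn.CrossEntropyLoss(weight=class_wts)
loss = cross_entropy(logits, targets)  # applied to raw logits directly

# Equivalent formulation: LogSoftmax layer + NLLLoss on the same logits
log_probs = nn.LogSoftmax(dim=1)(logits)
nll = nn.NLLLoss(weight=class_wts)(log_probs, targets)
assert torch.allclose(loss, nll)

# At evaluation time, softmax the logits to get probabilities summing to 1,
# then sweep a threshold on the positive-class probability.
probs = torch.softmax(logits, dim=1)
preds = (probs[:, 1] > 0.05).long()  # example of a low threshold for a rare class
```

The `assert` shows why the linked advice differs: passing `LogSoftmax` output into `CrossEntropyLoss` would apply the log-softmax twice, whereas `LogSoftmax` + `NLLLoss` and raw logits + `CrossEntropyLoss` compute the same quantity.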
pytorch
binary
classification
softmax
cross-entropy
0 Answers