dron-dronych
🤡 wandering math, science, and tech lover --> chasing dreams
@dayyass
dayyass / pytorch_cross_entropy_loss_for_binary_classification.py
Last active June 17, 2021 15:37
PyTorch nn.BCELoss and nn.CrossEntropyLoss equivalence for binary classification.
"""
In a binary classification problem, a neural network usually returns a vector of logits of shape [batch_size],
while in a multiclass classification problem, logits are represented as a matrix of shape [batch_size, n_classes].
Because these tasks use different loss functions, their training pipelines also differ,
which is inconvenient when you need to test hypotheses for both problem statements (binary/multiclass).
Pipeline schemes:
- binary classification:
    logits (of shape [batch_size]) -> BCEWithLogitsLoss
- multiclass classification:
    logits (of shape [batch_size, n_classes]) -> CrossEntropyLoss
"""
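A minimal sketch of this equivalence, assuming a recent PyTorch install (the variable names below are illustrative, not taken from the gist): stacking a zero column next to the binary logits turns the binary pipeline into the two-class one, because softmax over (0, x) equals (1 - sigmoid(x), sigmoid(x)), so BCEWithLogitsLoss and CrossEntropyLoss give the same value.

import torch
import torch.nn as nn

torch.manual_seed(0)
logits = torch.randn(8)              # binary pipeline: logits of shape [batch_size]
targets = torch.randint(0, 2, (8,))  # labels in {0, 1}

# binary pipeline: BCEWithLogitsLoss expects float targets of the same shape
bce = nn.BCEWithLogitsLoss()(logits, targets.float())

# multiclass pipeline: two-class logits [0, logit]; softmax over them
# reproduces sigmoid(logit) as the probability of class 1
two_class_logits = torch.stack([torch.zeros_like(logits), logits], dim=1)
ce = nn.CrossEntropyLoss()(two_class_logits, targets)

print(bce.item(), ce.item())  # the two losses coincide up to float precision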
// XPath CheatSheet
// To test XPath in your Chrome Debugger: $x('/html/body')
// http://www.jittuu.com/2012/2/14/Testing-XPath-In-Chrome/
// 0. XPath Examples.
// More: http://xpath.alephzarro.com/content/cheatsheet.html
'//hr[@class="edge" and position()=1]' // every first hr of 'edge' class
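// Usage sketch (assumes a page containing at least one <hr class="edge">):
// paste into the Chrome DevTools console, where $x() is a Command Line API
// helper rather than standard JavaScript.
var firstEdgeHrs = $x('//hr[@class="edge" and position()=1]'); // array of matching <hr> nodes
console.log(firstEdgeHrs.length, firstEdgeHrs[0]);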