@karolzak
Last active January 16, 2019 22:27
jaccard_distance loss function
# https://github.com/keras-team/keras-contrib/blob/master/keras_contrib/losses/jaccard.py
from keras import backend as K
def jaccard_distance(y_true, y_pred, smooth=100):
    """Jaccard distance for semantic segmentation.

    Also known as the intersection-over-union loss.

    This loss is useful when you have unbalanced numbers of pixels within an
    image because it gives all classes equal weight. However, it is not the
    de facto standard for image segmentation.

    For example, assume you are trying to predict if each pixel is cat, dog,
    or background. You have 80% background pixels, 10% dog, and 10% cat.
    If the model predicts 100% background, should it be 80% right (as with
    categorical cross-entropy) or 30% (with this loss)?

    The loss has been modified to have a smooth gradient as it converges on
    zero: it is shifted so it converges on 0 and smoothed to avoid exploding
    or vanishing gradients.

    Jaccard = |X & Y| / (|X| + |Y| - |X & Y|)
            = sum(|A*B|) / (sum(|A|) + sum(|B|) - sum(|A*B|))

    # Arguments
        y_true: The ground truth tensor.
        y_pred: The predicted tensor.
        smooth: Smoothing factor. Default is 100.

    # Returns
        The Jaccard distance between the two tensors.

    # References
        - [What is a good evaluation measure for semantic segmentation?](
          http://www.bmva.org/bmvc/2013/Papers/paper0032/paper0032.pdf)
    """
    intersection = K.sum(K.abs(y_true * y_pred), axis=-1)
    sum_ = K.sum(K.abs(y_true) + K.abs(y_pred), axis=-1)
    jac = (intersection + smooth) / (sum_ - intersection + smooth)
    return (1 - jac) * smooth
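
A minimal usage sketch, not part of the original gist: it shows the loss passed to model.compile (the model itself is a hypothetical, already-built Keras segmentation model) and evaluated directly on a toy pair of masks with the Keras backend so you can see the value it produces.

# Usage sketch (assumptions: `model` is a Keras segmentation model defined
# elsewhere; the masks below are toy placeholders, not from the gist).
import numpy as np
from keras import backend as K

# 1) As a training loss and/or metric:
# model.compile(optimizer='adam', loss=jaccard_distance, metrics=[jaccard_distance])

# 2) Evaluated directly on small tensors:
y_true = K.constant(np.array([[0.0, 0.0, 1.0, 1.0]]))   # ground-truth mask row
y_pred = K.constant(np.array([[0.0, 0.4, 0.9, 1.0]]))   # predicted probabilities
print(K.eval(jaccard_distance(y_true, y_pred)))          # ~0.49: small value, good overlap

Because the function returns (1 - jac) * smooth, a perfect overlap gives a value near 0 and a poor overlap gives a value near the smoothing factor (100 by default), so lower is better when used as a loss.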