bob.ip.binseg.models.losses

Loss implementations
Classes

- WeightedBCELogitsLoss: Calculates the sum of the weighted binary cross-entropy loss.
- SoftJaccardBCELogitsLoss: Implements the generalized loss function of Equation (3) in [IGLOVIKOV-2018], with J being the Jaccard distance and H the binary cross-entropy loss.
- MultiWeightedBCELogitsLoss: Weighted binary cross-entropy loss for multi-layered inputs (e.g. for Holistically-Nested Edge Detection).
- MultiSoftJaccardBCELogitsLoss: Implements Equation (3) in [IGLOVIKOV-2018] for multi-output networks such as HED or Little W-Net.
- MixJacLoss
class bob.ip.binseg.models.losses.WeightedBCELogitsLoss[source]

Bases: torch.nn.modules.loss._Loss

Calculates the sum of the weighted binary cross-entropy loss.

Implements Equation 1 in [MANINIS-2016]. The weight depends on the current proportion between negatives and positives in the ground-truth sample being analyzed.
forward(input, target, mask)[source]

Parameters:
- input (torch.Tensor): value produced by the model to be evaluated, with shape [n, c, h, w]
- target (torch.Tensor): ground-truth information, with shape [n, c, h, w]
- mask (torch.Tensor): mask specifying the region of interest over which the loss is computed, with shape [n, c, h, w]

Returns:
- loss (torch.Tensor): the average loss for all input data
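The weighting scheme above can be sketched in plain Python on a flat list of probabilities (the function name and interface are illustrative, not the package's PyTorch implementation, which operates on logits):

```python
import math

def weighted_bce(probs, targets):
    """Weighted binary cross-entropy over a flat list of probabilities.

    The weight follows the proportion between negatives and positives in
    the ground truth (cf. Equation 1 in [MANINIS-2016]): positive samples
    are weighted by the fraction of negatives, and vice versa.
    """
    num_pos = sum(targets)
    beta = (len(targets) - num_pos) / len(targets)  # proportion of negatives
    loss = 0.0
    for p, y in zip(probs, targets):
        if y == 1:
            loss += -beta * math.log(p)
        else:
            loss += -(1.0 - beta) * math.log(1.0 - p)
    return loss
```

With a balanced sample such as `probs=[0.9, 0.1]`, `targets=[1, 0]`, beta is 0.5 and both terms contribute equally; in a heavily imbalanced sample (e.g. vessel segmentation, where negatives dominate), the rare positives are up-weighted.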
class bob.ip.binseg.models.losses.SoftJaccardBCELogitsLoss(alpha=0.7)[source]

Bases: torch.nn.modules.loss._Loss

Implements the generalized loss function of Equation (3) in [IGLOVIKOV-2018], with J being the Jaccard distance and H the binary cross-entropy loss:

\[L = \alpha H + (1-\alpha)(1-J)\]

Our implementation is based on torch.nn.BCEWithLogitsLoss.
forward(input, target, mask)[source]

Parameters:
- input (torch.Tensor): value produced by the model to be evaluated, with shape [n, c, h, w]
- target (torch.Tensor): ground-truth information, with shape [n, c, h, w]
- mask (torch.Tensor): mask specifying the region of interest over which the loss is computed, with shape [n, c, h, w]

Returns:
- loss (torch.Tensor): the loss, in a single entry
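The combination L = alpha * H + (1 - alpha) * (1 - J) can be sketched in plain Python on probabilities (illustrative only; the package's implementation works on logits via torch.nn.BCEWithLogitsLoss):

```python
import math

def soft_jaccard_bce(probs, targets, alpha=0.7):
    """Sketch of Equation (3) in [IGLOVIKOV-2018] on probabilities."""
    # H: mean binary cross-entropy
    h = -sum(y * math.log(p) + (1 - y) * math.log(1.0 - p)
             for p, y in zip(probs, targets)) / len(probs)
    # J: soft Jaccard index (soft intersection over soft union)
    intersection = sum(p * y for p, y in zip(probs, targets))
    union = sum(probs) + sum(targets) - intersection
    j = intersection / union
    # alpha trades off the pixel-wise BCE term against the overlap term
    return alpha * h + (1.0 - alpha) * (1.0 - j)
```

The Jaccard term rewards overlap between prediction and ground truth as a whole, which complements the purely pixel-wise BCE term; alpha=0.7 (the default above, matching the class signature) leans toward BCE.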
class bob.ip.binseg.models.losses.MultiWeightedBCELogitsLoss[source]

Bases: bob.ip.binseg.models.losses.WeightedBCELogitsLoss

Weighted binary cross-entropy loss for multi-layered inputs (e.g. for Holistically-Nested Edge Detection in [XIE-2015]).
forward(input, target, mask)[source]

Parameters:
- input (iterable over torch.Tensor): values produced by the model to be evaluated, with shape [L, n, c, h, w]
- target (torch.Tensor): ground-truth information, with shape [n, c, h, w]
- mask (torch.Tensor): mask specifying the region of interest over which the loss is computed, with shape [n, c, h, w]

Returns:
- loss (torch.Tensor): the average loss for all input data
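A minimal sketch of the multi-layered extension: each of the L side-outputs is scored against the same ground truth with the weighted BCE, and the per-layer losses are averaged. Averaging (rather than summing) is an assumption here, as is the plain-Python interface:

```python
import math

def multi_weighted_bce(inputs, targets):
    """Average the weighted BCE over all L side-output layers.

    ``inputs`` is an iterable of per-layer probability lists, each scored
    against the same ground truth (illustrative, not the package's code).
    """
    def weighted_bce(probs, targets):
        # per-sample weighting as in Equation 1 of [MANINIS-2016]
        beta = (len(targets) - sum(targets)) / len(targets)
        return sum(-beta * math.log(p) if y == 1
                   else -(1.0 - beta) * math.log(1.0 - p)
                   for p, y in zip(probs, targets))
    losses = [weighted_bce(layer, targets) for layer in inputs]
    return sum(losses) / len(losses)
```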
class bob.ip.binseg.models.losses.MultiSoftJaccardBCELogitsLoss(alpha=0.7)[source]

Bases: bob.ip.binseg.models.losses.SoftJaccardBCELogitsLoss

Implements Equation (3) in [IGLOVIKOV-2018] for multi-output networks such as HED or Little W-Net.
forward(inputlist, target, mask)[source]

Parameters:
- inputlist (iterable over torch.Tensor): values produced by the model to be evaluated, with shape [L, n, c, h, w]
- target (torch.Tensor): ground-truth information, with shape [n, c, h, w]
- mask (torch.Tensor): mask specifying the region of interest over which the loss is computed, with shape [n, c, h, w]

Returns:
- loss (torch.Tensor): the average loss for all input data
class bob.ip.binseg.models.losses.MixJacLoss(lambda_u=100, jacalpha=0.7, size_average=None, reduce=None, reduction='mean', pos_weight=None)[source]

Bases: torch.nn.modules.loss._Loss

Parameters:
- lambda_u (int): determines the weighting between the SoftJaccard and BCE terms
forward(input, target, unlabeled_input, unlabeled_target, ramp_up_factor)[source]

Parameters:
- input (torch.Tensor)
- target (torch.Tensor)
- unlabeled_input (torch.Tensor)
- unlabeled_target (torch.Tensor)
- ramp_up_factor (float)

Returns:
- loss (torch.Tensor)
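One plausible combination of the supervised (labeled) and semi-supervised (unlabeled) terms, given the lambda_u description above, is an additive mix with the unlabeled term scaled by a ramp-up factor. This is an assumption for illustration, not the package's verified implementation:

```python
def mix_jac_loss(labeled_loss, unlabeled_loss, lambda_u=100, ramp_up_factor=0.5):
    """Illustrative mixed loss: supervised term plus a ramped,
    lambda_u-weighted semi-supervised term (assumed combination)."""
    return labeled_loss + lambda_u * ramp_up_factor * unlabeled_loss
```

Ramping the unlabeled weight up over training lets the model first fit the labeled data before the (noisier) semi-supervised signal gains influence.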