pixyz.losses (Loss API)

Loss

Negative expected value of log-likelihood (entropy)

CrossEntropy

class pixyz.losses.CrossEntropy(p1, p2, input_var=None)

    Bases: pixyz.losses.losses.Loss

    Cross entropy, a.k.a. the negative expected value of log-likelihood
    (Monte Carlo approximation):

        -E_{p1(x)}[log p2(x)] ≈ -(1/L) Σ_{l=1}^{L} log p2(x^(l))

    where x^(l) ~ p1(x).

    loss_text
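As an illustrative sketch in plain Python (not the pixyz API; all names here are hypothetical), the Monte Carlo approximation above can be checked against the closed-form cross entropy of two one-dimensional Gaussians:

```python
# Monte Carlo estimate of the cross entropy -E_{p1(x)}[log p2(x)]
# for two 1-D Gaussians, compared with the closed form.
import math
import random

def log_normal_pdf(x, mu, sigma):
    """Log density of N(mu, sigma^2) at x."""
    return -0.5 * math.log(2 * math.pi * sigma**2) - (x - mu) ** 2 / (2 * sigma**2)

def cross_entropy_mc(mu1, sigma1, mu2, sigma2, n_samples=100_000, seed=0):
    """-(1/L) * sum_l log p2(x_l), with x_l ~ p1."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_samples):
        x = rng.gauss(mu1, sigma1)          # sample from p1
        total += log_normal_pdf(x, mu2, sigma2)  # score under p2
    return -total / n_samples

# Closed form: H(p1, p2) = 0.5*log(2*pi*s2^2) + (s1^2 + (m1 - m2)^2) / (2*s2^2)
mc = cross_entropy_mc(0.0, 1.0, 1.0, 2.0)
exact = 0.5 * math.log(2 * math.pi * 4.0) + (1.0 + 1.0) / (2 * 4.0)
```

With 100,000 samples the estimate typically lands within a few thousandths of the exact value.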
Entropy

class pixyz.losses.Entropy(p1, input_var=None)

    Bases: pixyz.losses.losses.Loss

    Entropy (Monte Carlo approximation):

        -E_{p1(x)}[log p1(x)] ≈ -(1/L) Σ_{l=1}^{L} log p1(x^(l))

    where x^(l) ~ p1(x).

    Note: This class is a special case of the CrossEntropy class; CrossEntropy(p1, p1) gives the same result.

    loss_text
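The note above can be checked numerically: sampling from p and scoring under the same p is exactly the cross-entropy estimator applied to the pair (p, p). A plain-Python sketch (hypothetical names, not the pixyz API), compared against the closed-form differential entropy of a Gaussian:

```python
# Monte Carlo entropy -E_{p(x)}[log p(x)] as "cross entropy of p with itself".
import math
import random

def log_normal_pdf(x, mu, sigma):
    """Log density of N(mu, sigma^2) at x."""
    return -0.5 * math.log(2 * math.pi * sigma**2) - (x - mu) ** 2 / (2 * sigma**2)

def entropy_mc(mu, sigma, n_samples=100_000, seed=0):
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_samples):
        x = rng.gauss(mu, sigma)               # x ~ p
        total += log_normal_pdf(x, mu, sigma)  # log p(x): same distribution
    return -total / n_samples

mc = entropy_mc(0.0, 2.0)
exact = 0.5 * math.log(2 * math.pi * math.e * 4.0)  # differential entropy of N(0, 4)
```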
StochasticReconstructionLoss

class pixyz.losses.StochasticReconstructionLoss(encoder, decoder, input_var=None)

    Bases: pixyz.losses.losses.Loss

    Reconstruction loss (Monte Carlo approximation):

        -E_{q(z|x)}[log p(x|z)] ≈ -(1/L) Σ_{l=1}^{L} log p(x|z^(l))

    where z^(l) ~ q(z|x), q is the encoder, and p is the decoder.

    Note: This class is a special case of the CrossEntropy class; CrossEntropy(encoder, decoder) gives the same result.

    loss_text
LossExpectation

class pixyz.losses.LossExpectation(p, loss, input_var=None)

    Bases: pixyz.losses.losses.Loss

    Expectation of a given loss function (Monte Carlo approximation):

        E_{p(x)}[loss(x)] ≈ (1/L) Σ_{l=1}^{L} loss(x^(l))

    where x^(l) ~ p(x).

    loss_text
Negative log-likelihood

NLL

class pixyz.losses.NLL(p, input_var=None)

    Bases: pixyz.losses.losses.Loss

    Negative log-likelihood:

        -log p(x)

    loss_text
Lower bound

ELBO

class pixyz.losses.ELBO(p, q, input_var=None)

    Bases: pixyz.losses.losses.Loss

    The evidence lower bound (Monte Carlo approximation):

        E_{q(z|x)}[log p(x,z) - log q(z|x)] ≈ (1/L) Σ_{l=1}^{L} [log p(x, z^(l)) - log q(z^(l)|x)]

    where z^(l) ~ q(z|x).

    loss_text
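A plain-Python sketch (not the pixyz API; names are hypothetical) of the Monte Carlo ELBO for the conjugate toy model z ~ N(0,1), x|z ~ N(z,1), where log p(x) is available in closed form, so the bound can be checked: with q equal to the true posterior the bound is tight, and a mismatched q gives a strictly lower value.

```python
# Monte Carlo ELBO: (1/L) * sum_l [log p(x, z_l) - log q(z_l | x)], z_l ~ q(z|x).
import math
import random

def log_normal_pdf(x, mu, sigma):
    """Log density of N(mu, sigma^2) at x."""
    return -0.5 * math.log(2 * math.pi * sigma**2) - (x - mu) ** 2 / (2 * sigma**2)

def elbo_mc(x, q_mu, q_sigma, n_samples=50_000, seed=0):
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_samples):
        z = rng.gauss(q_mu, q_sigma)  # z ~ q(z|x)
        log_joint = log_normal_pdf(z, 0.0, 1.0) + log_normal_pdf(x, z, 1.0)
        total += log_joint - log_normal_pdf(z, q_mu, q_sigma)
    return total / n_samples

x = 1.5
log_px = log_normal_pdf(x, 0.0, math.sqrt(2.0))   # marginal: x ~ N(0, 2)
elbo_exact_q = elbo_mc(x, x / 2, math.sqrt(0.5))  # q = true posterior N(x/2, 1/2)
elbo_bad_q = elbo_mc(x, 0.0, 1.0)                 # mismatched q
```

With the exact posterior as q, every sample of the integrand equals log p(x), so the estimator is noise-free; the mismatched q falls short of log p(x) by the KL gap.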
Divergence

KullbackLeibler

class pixyz.losses.KullbackLeibler(p1, p2, input_var=None, dim=None)

    Bases: pixyz.losses.losses.Loss

    Kullback-Leibler divergence (analytical):

        D_KL[p1(x) || p2(x)] = E_{p1(x)}[log (p1(x) / p2(x))]

    TODO: This class seems to be slightly slower than the previous implementation (perhaps because of set_distribution).

    loss_text
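For Gaussian pairs the divergence has a well-known closed form, which is the kind of analytical result this class returns. A plain-Python sketch (the helper name is hypothetical, not part of pixyz):

```python
# Closed-form KL divergence between two 1-D Gaussians, in nats:
#   KL[N(m1, s1^2) || N(m2, s2^2)] = log(s2/s1) + (s1^2 + (m1-m2)^2)/(2*s2^2) - 1/2
import math

def kl_normal(mu1, sigma1, mu2, sigma2):
    return (math.log(sigma2 / sigma1)
            + (sigma1**2 + (mu1 - mu2) ** 2) / (2 * sigma2**2)
            - 0.5)

kl_same = kl_normal(0.0, 1.0, 0.0, 1.0)  # identical distributions: divergence is 0
kl_diff = kl_normal(0.0, 1.0, 1.0, 2.0)  # log 2 + 2/8 - 1/2
```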
Similarity

SimilarityLoss

class pixyz.losses.SimilarityLoss(p1, p2, input_var=None, var=['z'], margin=0)

    Bases: pixyz.losses.losses.Loss

    Similarity loss from "Learning Modality-Invariant Representations for Speech and Images" (Leidai et al.).
MultiModalContrastivenessLoss

class pixyz.losses.MultiModalContrastivenessLoss(p1, p2, input_var=None, margin=0.5)

    Bases: pixyz.losses.losses.Loss

    Contrastiveness loss from "Disentangling by Partitioning: A Representation Learning Framework for Multimodal Sensory Data".
Adversarial loss (GAN loss)

AdversarialJensenShannon

AdversarialKullbackLeibler

Loss for sequential distributions

IterativeLoss

class pixyz.losses.IterativeLoss(step_loss, max_iter=1, input_var=None, series_var=None, update_value={}, slice_step=None, timestep_var=['t'])

    Bases: pixyz.losses.losses.Loss

    Iterative loss. This class allows implementing an arbitrary model which requires iteration (e.g., auto-regressive models).

    loss_text
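A plain-Python sketch of the idea (hypothetical names and signature, not the pixyz API): a per-step loss is summed over a sequence while carried values are updated between steps, as in auto-regressive models.

```python
# Sum a per-step loss over a sequence, carrying state between steps.
def iterative_loss(sequence, step_loss, update, state):
    total = 0.0
    for x_t in sequence:                # t = 1, ..., T
        total += step_loss(x_t, state)  # L_step(x_t, h_t)
        state = update(x_t, state)      # carry values to the next step
    return total

# Toy usage: squared error of a "predict the previous value" model.
seq = [1.0, 2.0, 4.0]
loss = iterative_loss(
    seq,
    step_loss=lambda x, h: (x - h) ** 2,
    update=lambda x, h: x,
    state=0.0,
)
# (1-0)^2 + (2-1)^2 + (4-2)^2 = 6
```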
Loss for special purpose

Parameter

class pixyz.losses.losses.Parameter(input_var)

    Bases: pixyz.losses.losses.Loss

    loss_text
Operators

LossOperator

LossSelfOperator

AddLoss

class pixyz.losses.losses.AddLoss(loss1, loss2)

    Bases: pixyz.losses.losses.LossOperator

    loss_text
SubLoss

class pixyz.losses.losses.SubLoss(loss1, loss2)

    Bases: pixyz.losses.losses.LossOperator

    loss_text

MulLoss

class pixyz.losses.losses.MulLoss(loss1, loss2)

    Bases: pixyz.losses.losses.LossOperator

    loss_text

DivLoss

class pixyz.losses.losses.DivLoss(loss1, loss2)

    Bases: pixyz.losses.losses.LossOperator

    loss_text

NegLoss

class pixyz.losses.losses.NegLoss(loss1)

    Bases: pixyz.losses.losses.LossSelfOperator

    loss_text

AbsLoss

class pixyz.losses.losses.AbsLoss(loss1)

    Bases: pixyz.losses.losses.LossSelfOperator

    loss_text
BatchMean

class pixyz.losses.losses.BatchMean(loss1)

    Bases: pixyz.losses.losses.LossSelfOperator

    Loss averaged over batch data:

        (1/N) Σ_{i=1}^{N} L(x_i)

    where x_i is the i-th datum in a batch of size N and L is a loss function.

    loss_text
BatchSum

class pixyz.losses.losses.BatchSum(loss1)

    Bases: pixyz.losses.losses.LossSelfOperator

    Loss summed over batch data:

        Σ_{i=1}^{N} L(x_i)

    where x_i is the i-th datum in a batch of size N and L is a loss function.

    loss_text
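As a sketch of how such operator classes typically fit together (minimal hypothetical classes, not the pixyz implementation), arithmetic on loss objects can build a tree of AddLoss/SubLoss/NegLoss-style nodes that is evaluated once at the end:

```python
# Compose loss objects with ordinary arithmetic; evaluation walks the tree.
class Loss:
    def eval(self, inputs):
        raise NotImplementedError
    def __add__(self, other):
        return AddLoss(self, other)
    def __sub__(self, other):
        return SubLoss(self, other)
    def __neg__(self):
        return NegLoss(self)

class ValueLoss(Loss):
    """Hypothetical leaf that reads a named value from the inputs."""
    def __init__(self, name):
        self.name = name
    def eval(self, inputs):
        return inputs[self.name]

class AddLoss(Loss):
    def __init__(self, loss1, loss2):
        self.loss1, self.loss2 = loss1, loss2
    def eval(self, inputs):
        return self.loss1.eval(inputs) + self.loss2.eval(inputs)

class SubLoss(Loss):
    def __init__(self, loss1, loss2):
        self.loss1, self.loss2 = loss1, loss2
    def eval(self, inputs):
        return self.loss1.eval(inputs) - self.loss2.eval(inputs)

class NegLoss(Loss):
    def __init__(self, loss1):
        self.loss1 = loss1
    def eval(self, inputs):
        return -self.loss1.eval(inputs)

# Usage: build the expression once, evaluate with concrete inputs at the end.
total = ValueLoss("recon") + ValueLoss("kl") - ValueLoss("bonus")
value = total.eval({"recon": 2.0, "kl": 0.5, "bonus": 1.0})  # 2.0 + 0.5 - 1.0
```

Deferring evaluation this way lets a single loss expression be printed, inspected, or re-evaluated on each minibatch, which is the design rationale behind operator classes like AddLoss and NegLoss.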