- Cross Entropy Error (Binary Cross Entropy)
- Earth Mover Distance
- Hinge Embedding Loss
- Huber Loss / Smooth L1 Loss
- KL Divergence Loss
- L1 Loss
- Log CosH Loss
- Mean Bias Error
- Mean Squared Error
- Mean Squared Logarithmic Error
- NLL Loss
- Margin Ranking Loss (can't be done without major modifications, see notebook)
- Cosine Embedding Loss (kartikdutt18 said he will be doing this)
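As a minimal sketch of how one of the listed losses looks as a plain NumPy reference (assuming mean reduction and probability inputs rather than logits, which is an assumption, not the mlpack API):

```python
import numpy as np

def binary_cross_entropy(pred, target, eps=1e-12):
    """Binary cross entropy with mean reduction.

    pred: predicted probabilities in (0, 1); target: 0/1 labels.
    """
    # Clip predictions to avoid log(0)
    pred = np.clip(pred, eps, 1 - eps)
    return -np.mean(target * np.log(pred) + (1 - target) * np.log(1 - pred))

pred = np.array([0.9, 0.1, 0.8])
target = np.array([1.0, 0.0, 1.0])
print(binary_cross_entropy(pred, target))
```

A reference like this is what each notebook checks the mlpack/Armadillo output against, element by element.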
Jupyter notebooks comparing PyTorch/NumPy implementations of each of the above loss functions with mlpack/Armadillo implementations, along with explanations of possible errors in the existing implementations.
To run the notebooks, you will need PyTorch, NumPy, and Armadillo. To avoid issues with linking the libraries correctly, it is easiest to run the notebooks in Google Colab.
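A sketch of the comparison pattern the notebooks follow, using Huber / Smooth L1 loss as the example (the NumPy function below is a hypothetical reference; in the notebooks the PyTorch side would be `torch.nn.functional.smooth_l1_loss` and the mlpack side the corresponding C++ class):

```python
import numpy as np

def smooth_l1_numpy(pred, target, beta=1.0):
    """Huber / Smooth L1 loss: quadratic for small residuals, linear in the tails."""
    diff = np.abs(pred - target)
    # 0.5 * d^2 / beta where |d| < beta, else |d| - 0.5 * beta
    loss = np.where(diff < beta, 0.5 * diff ** 2 / beta, diff - 0.5 * beta)
    return loss.mean()

pred = np.array([0.5, 2.0, -1.5, 3.0])
target = np.array([0.0, 2.5, -1.0, 0.0])
print(smooth_l1_numpy(pred, target))  # 0.71875
```

Running the same inputs through both implementations and asserting the outputs match (up to floating-point tolerance) is what surfaces the discrepancies documented in the notebooks.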