Loss Functions

Comprehensive loss function library for every training objective

37 loss functions covering classification (CrossEntropy, Focal, ArcFace), regression (MSE, Huber, Quantile), segmentation (Dice, Tversky, Lovász), and generative training (Perceptual, LPIPS, SSIM, InfoNCE). All with automatic differentiation and GPU support.

Classification · Object Detection · Semantic Segmentation · Image Generation · Contrastive Learning · Face Recognition · Regression · Ranking

Classification Losses

Losses for categorical prediction tasks with class imbalance handling.

CrossEntropy

Standard cross-entropy with optional label smoothing and class weights.

BinaryCrossEntropy

Binary classification with pos_weight for imbalanced datasets.

FocalLoss

Down-weights easy examples to focus training on hard cases (introduced with RetinaNet).

ArcFace / CosFace

Angular margin losses for discriminative face/embedding learning.

PolyLoss

Polynomial expansion of cross-entropy for improved tail-class accuracy.
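To make the focal-loss idea above concrete, here is a minimal NumPy sketch of the binary form (gamma down-weights easy examples, alpha balances classes, as in the RetinaNet formulation). The function name and signature are illustrative only, not AiDotNet's API.

```python
import numpy as np

def focal_loss(p, y, gamma=2.0, alpha=0.25):
    """Binary focal loss: alpha-weighted cross-entropy scaled by (1 - p_t)^gamma."""
    p = np.clip(p, 1e-7, 1 - 1e-7)               # avoid log(0)
    p_t = np.where(y == 1, p, 1 - p)             # probability of the true class
    alpha_t = np.where(y == 1, alpha, 1 - alpha) # class-balance weight
    return -np.mean(alpha_t * (1 - p_t) ** gamma * np.log(p_t))

# A confidently-correct example contributes far less than a hard one,
# so training gradient concentrates on the hard cases.
easy = focal_loss(np.array([0.95]), np.array([1]))
hard = focal_loss(np.array([0.30]), np.array([1]))
```

With gamma = 0 and alpha = 1 the modulating factor disappears and this reduces to plain binary cross-entropy.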

Regression Losses

Losses for continuous value prediction with outlier robustness.

MSE

Mean Squared Error for standard regression objectives.

MAE / L1

Mean Absolute Error robust to outliers.

Huber

Smooth transition between MSE (small errors) and MAE (large errors).

LogCosh

Log of hyperbolic cosine, twice differentiable and outlier-robust.

Quantile

Asymmetric loss for predicting specific quantiles.
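The MSE/MAE trade-off that Huber makes can be sketched directly. A minimal NumPy version of the standard Huber formula follows; `delta` is the conventional transition threshold, not necessarily AiDotNet's parameter name.

```python
import numpy as np

def huber(residual, delta=1.0):
    """Quadratic for |r| <= delta (MSE-like), linear beyond (MAE-like)."""
    r = np.abs(residual)
    return np.where(r <= delta,
                    0.5 * r ** 2,               # smooth near zero
                    delta * (r - 0.5 * delta))  # linear tails resist outliers

small = huber(0.5)   # inside delta: 0.5 * 0.5^2 = 0.125
large = huber(3.0)   # outside delta: 1.0 * (3.0 - 0.5) = 2.5, vs 4.5 for MSE
```

Because the tails grow linearly rather than quadratically, a single outlier cannot dominate the gradient the way it does under MSE.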

Segmentation Losses

Specialized losses for pixel-level prediction tasks.

Dice

Overlap-based loss for imbalanced segmentation masks.

Tversky

Generalized Dice with adjustable false positive/negative weighting.

Lovász

Lovász extension of the Jaccard index (IoU) for direct mIoU optimization.

Boundary

Focus on boundary pixels for precise edge segmentation.
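The overlap idea behind Dice (and, with asymmetric weights, Tversky) can be sketched in a few lines. This is a soft Dice loss over probability masks; the names are illustrative, not AiDotNet's API.

```python
import numpy as np

def dice_loss(pred, target, eps=1e-6):
    """Soft Dice loss: 1 - 2*|P & T| / (|P| + |T|), with eps for stability."""
    inter = np.sum(pred * target)
    return 1.0 - (2.0 * inter + eps) / (np.sum(pred) + np.sum(target) + eps)

mask = np.array([1.0, 1.0, 0.0, 0.0])
perfect  = dice_loss(mask, mask)                                   # ~0.0
disjoint = dice_loss(np.array([1.0, 0.0]), np.array([0.0, 1.0]))   # ~1.0
```

Because the loss is a ratio of overlap to total mass, a tiny foreground class still contributes a full-scale gradient, which is why Dice handles imbalanced masks better than per-pixel cross-entropy.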

Generative & Contrastive Losses

Losses for image generation, representation learning, and similarity.

Perceptual (VGG)

Feature-space loss using VGG activations for perceptual quality.

LPIPS

Learned Perceptual Image Patch Similarity for generation quality.

SSIM

Structural Similarity Index for image quality assessment.

InfoNCE

Noise Contrastive Estimation for self-supervised representation learning.

NT-Xent

Normalized Temperature-scaled Cross Entropy for contrastive learning (SimCLR).

Triplet

Learn embeddings where positives are closer than negatives by a margin.
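As a sketch of the contrastive objective shared by InfoNCE and NT-Xent: the positive pair is scored against negatives with temperature-scaled cosine similarities, then pushed through a softmax cross-entropy. A minimal NumPy version, with illustrative names only:

```python
import numpy as np

def info_nce(anchor, positive, negatives, tau=0.1):
    """InfoNCE: softmax cross-entropy of the positive vs. positive + negatives."""
    def cos(a, b):
        return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))
    # Logit 0 is the positive pair; the rest are negatives.
    logits = np.array([cos(anchor, positive)] +
                      [cos(anchor, n) for n in negatives]) / tau
    logits -= logits.max()                      # numerical stability
    return -np.log(np.exp(logits[0]) / np.exp(logits).sum())

a = np.array([1.0, 0.0])
aligned    = info_nce(a, a, [np.array([0.0, 1.0]), np.array([-1.0, 0.0])])
misaligned = info_nce(a, np.array([-1.0, 0.0]),
                      [a, np.array([0.0, 1.0])])
```

Lower temperature `tau` sharpens the distribution, penalizing hard negatives more aggressively; NT-Xent applies the same form symmetrically across an augmented batch.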

Loss functions with AiModelBuilder

C#
using AiDotNet;

// Train with a custom loss function using AiModelBuilder
var result = await new AiModelBuilder<float, float[], float>()
    .ConfigureModel(new NeuralNetwork<float>(
        inputSize: 784, hiddenSize: 128, outputSize: 10))
    .ConfigureLossFunction(new FocalLoss<float>(
        gamma: 2.0f, alpha: 0.25f))
    .ConfigureOptimizer(new AdamOptimizer<float>())
    .BuildAsync(features, labels);

var prediction = result.Predict(newSample);

Start building with Loss Functions

All 37 implementations are included free under Apache 2.0.