# Loss Functions

Complete reference for all 41 loss functions in AiDotNet.


## Classification Losses

### Binary Classification

| Loss Function | Description | Use Case |
|---|---|---|
| `BinaryCrossEntropyLoss<T>` | -[y·log(p) + (1-y)·log(1-p)] | Binary classification |
| `HingeLoss<T>` | max(0, 1 - y·p) | SVM-style binary classification |
| `SquaredHingeLoss<T>` | (max(0, 1 - y·p))^2 | Smooth, differentiable hinge |
| `ModifiedHuberLoss<T>` | Modified hinge + quadratic | Robust binary classification |

```csharp
var loss = new BinaryCrossEntropyLoss<float>();
var value = loss.CalculateLoss(predictions, targets);
```
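As a language-agnostic sketch of the formulas above (illustrative plain Python, not the AiDotNet implementation), binary cross-entropy and hinge loss are:

```python
import math

def binary_cross_entropy(ys, ps, eps=1e-12):
    """Mean of -[y*log(p) + (1-y)*log(1-p)]; probabilities clipped for stability."""
    total = 0.0
    for y, p in zip(ys, ps):
        p = min(max(p, eps), 1 - eps)
        total += -(y * math.log(p) + (1 - y) * math.log(1 - p))
    return total / len(ys)

def hinge(ys, ps):
    """Mean of max(0, 1 - y*p); labels y in {-1, +1}, p a raw score."""
    return sum(max(0.0, 1.0 - y * p) for y, p in zip(ys, ps)) / len(ys)

print(binary_cross_entropy([1, 0, 1], [0.9, 0.1, 0.8]))  # small: confident, correct
print(hinge([1, -1], [2.0, -2.0]))                        # 0.0: margin satisfied
```

Note that cross-entropy expects probabilities in (0, 1), while hinge expects raw scores and ±1 labels.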

### Multi-Class Classification

| Loss Function | Description | Use Case |
|---|---|---|
| `CrossEntropyLoss<T>` | Standard cross entropy | Multi-class |
| `CategoricalCrossEntropyLoss<T>` | One-hot encoded cross entropy | Multi-class with one-hot |
| `SparseCategoricalCrossEntropyLoss<T>` | Integer label cross entropy | Multi-class with indices |
| `WeightedCrossEntropyLoss<T>` | Class-weighted cross entropy | Imbalanced classes |

```csharp
var loss = new CrossEntropyLoss<float>();
```
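The difference between the one-hot and integer-label variants is only how the target is encoded; both compute the same quantity. A minimal Python sketch (illustrative, not the library code):

```python
import math

def categorical_ce(one_hot_rows, prob_rows):
    """Mean of -sum(y_k * log(p_k)) over samples; targets are one-hot rows."""
    total = 0.0
    for y_row, p_row in zip(one_hot_rows, prob_rows):
        total += -sum(y * math.log(p) for y, p in zip(y_row, p_row) if y > 0)
    return total / len(prob_rows)

def sparse_categorical_ce(labels, prob_rows):
    """Mean of -log(p[true_class]); targets are integer class indices."""
    return sum(-math.log(p_row[k]) for k, p_row in zip(labels, prob_rows)) / len(prob_rows)

probs = [[0.7, 0.2, 0.1],
         [0.1, 0.8, 0.1]]
print(categorical_ce([[1, 0, 0], [0, 1, 0]], probs))  # same value either way
print(sparse_categorical_ce([0, 1], probs))
```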

### Focal Loss (Imbalanced Data)

| Loss Function | Description | Use Case |
|---|---|---|
| `FocalLoss<T>` | Down-weights easy examples | Class imbalance |

```csharp
var loss = new FocalLoss<float>();
```
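Focal loss extends cross-entropy with a modulating factor (1 - p_t)^γ and an optional class weight α. A hedged Python sketch of the standard binary formula (the AiDotNet defaults and parameter names may differ):

```python
import math

def focal_loss(ys, ps, gamma=2.0, alpha=0.25, eps=1e-12):
    """Binary focal loss: mean of -alpha_t * (1 - p_t)**gamma * log(p_t).
    p_t is the predicted probability of the true class, so confident,
    correct ("easy") examples are down-weighted by (1 - p_t)**gamma."""
    total = 0.0
    for y, p in zip(ys, ps):
        p = min(max(p, eps), 1 - eps)
        p_t = p if y == 1 else 1 - p
        a_t = alpha if y == 1 else 1 - alpha
        total += -a_t * (1 - p_t) ** gamma * math.log(p_t)
    return total / len(ys)

easy = focal_loss([1, 1], [0.95, 0.95])  # confident and correct
hard = focal_loss([1, 1], [0.55, 0.55])  # barely correct
print(easy < hard)  # True: easy examples contribute far less
```

With gamma = 0 and alpha = 1 this reduces to ordinary binary cross-entropy, which is why it is a drop-in replacement for imbalanced data.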

## Regression Losses

| Loss Function | Description | Use Case |
|---|---|---|
| `MeanSquaredErrorLoss<T>` | Mean squared error: (y - p)^2 | General regression |
| `MeanAbsoluteErrorLoss<T>` | Mean absolute error | Robust to outliers |
| `RootMeanSquaredErrorLoss<T>` | Square root of MSE | Same scale as target |
| `MeanBiasErrorLoss<T>` | Mean bias error | Directional bias |
| `HuberLoss<T>` | Quadratic near 0, linear elsewhere | Balanced robustness |
| `LogCoshLoss<T>` | log(cosh(y - p)) | Smooth L1 approximation |
| `CharbonnierLoss<T>` | Differentiable L1 | Image restoration |
| `PoissonLoss<T>` | Poisson negative log likelihood | Count data |
| `QuantileLoss<T>` | Quantile regression | Prediction intervals |

```csharp
var loss = new HuberLoss<float>();
```
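Huber's piecewise definition is short enough to sketch directly (plain Python, for intuition only; `delta` here is an assumed parameter name):

```python
def huber(ys, ps, delta=1.0):
    """Mean Huber loss: 0.5*r**2 for residuals |r| <= delta,
    delta*(|r| - 0.5*delta) beyond, so large errors grow linearly."""
    total = 0.0
    for y, p in zip(ys, ps):
        r = abs(y - p)
        total += 0.5 * r * r if r <= delta else delta * (r - 0.5 * delta)
    return total / len(ys)

# The outlier at 100.0 contributes linearly, not quadratically,
# so it does not dominate the loss the way it would under MSE.
print(huber([1.0, 2.0, 100.0], [1.1, 2.1, 3.0]))
```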

## Segmentation Losses

| Loss Function | Description | Use Case |
|---|---|---|
| `DiceLoss<T>` | 1 - Dice coefficient | Segmentation |
| `JaccardLoss<T>` | 1 - Intersection over Union | Object detection |
| `ScaleInvariantDepthLoss<T>` | Scale-invariant depth | Depth estimation |

```csharp
var loss = new DiceLoss<float>();
```
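Dice and Jaccard losses are both overlap ratios turned into losses. An illustrative Python sketch on soft masks, flattened to 1-D for simplicity (the epsilon smoothing term is a common convention, not necessarily AiDotNet's):

```python
def dice_loss(target, pred, eps=1e-6):
    """1 - Dice coefficient on soft masks: 1 - 2|T∩P| / (|T| + |P|)."""
    inter = sum(t * p for t, p in zip(target, pred))
    return 1.0 - (2.0 * inter + eps) / (sum(target) + sum(pred) + eps)

def jaccard_loss(target, pred, eps=1e-6):
    """1 - intersection over union: 1 - |T∩P| / |T∪P|."""
    inter = sum(t * p for t, p in zip(target, pred))
    union = sum(target) + sum(pred) - inter
    return 1.0 - (inter + eps) / (union + eps)

mask = [1.0, 1.0, 0.0, 0.0]
pred = [0.9, 0.8, 0.1, 0.2]
print(dice_loss(mask, pred), jaccard_loss(mask, pred))  # both near 0 for good overlap
```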

## Contrastive/Metric Learning

| Loss Function | Description | Use Case |
|---|---|---|
| `ContrastiveLoss<T>` | Siamese networks | Similarity learning |
| `TripletLoss<T>` | Anchor, positive, negative | Face recognition |
| `NTXentLoss<T>` | Normalized temperature cross-entropy | Self-supervised |
| `InfoNCELoss<T>` | Information NCE | Contrastive learning |
| `NoiseContrastiveEstimationLoss<T>` | NCE for large vocabularies | Word embeddings |
| `CosineSimilarityLoss<T>` | 1 - cosine similarity | Embedding alignment |

```csharp
var loss = new TripletLoss<float>();
```
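Triplet loss only needs three embeddings and a margin. A sketch in plain Python with squared Euclidean distance (one common choice; the library's distance metric and margin default are assumptions here):

```python
def triplet_loss(anchor, positive, negative, margin=1.0):
    """max(0, d(a, p) - d(a, n) + margin) with squared Euclidean distance:
    the positive must sit at least `margin` closer to the anchor than the negative."""
    d_pos = sum((a - p) ** 2 for a, p in zip(anchor, positive))
    d_neg = sum((a - n) ** 2 for a, n in zip(anchor, negative))
    return max(0.0, d_pos - d_neg + margin)

a = [0.0, 0.0]
p = [0.1, 0.0]   # same identity: near the anchor
n = [2.0, 0.0]   # different identity: far away
print(triplet_loss(a, p, n))  # 0.0: the margin is already satisfied
```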

## Reconstruction Losses

| Loss Function | Description | Use Case |
|---|---|---|
| `MAEReconstructionLoss<T>` | MAE for reconstruction | Masked autoencoders |
| `PerceptualLoss<T>` | Feature space distance | Super-resolution |
| `RealESRGANLoss<T>` | Combined perceptual + adversarial | Image enhancement |
| `RotationPredictionLoss<T>` | Rotation prediction | Self-supervised pretext |

```csharp
var loss = new PerceptualLoss<float>();
```

## GAN Losses

| Loss Function | Description | Use Case |
|---|---|---|
| `WassersteinLoss<T>` | Wasserstein distance | WGAN |
| `MarginLoss<T>` | Margin-based loss | Pairwise ranking |

```csharp
var loss = new WassersteinLoss<float>();
```
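In a WGAN, the critic and generator minimize opposite expectations of the critic's scores. A schematic Python sketch of the two objectives under the usual sign convention (not the AiDotNet API, which exposes a single `WassersteinLoss<T>`):

```python
def wasserstein_critic_loss(real_scores, fake_scores):
    """WGAN critic objective (minimized): E[f(fake)] - E[f(real)]."""
    return sum(fake_scores) / len(fake_scores) - sum(real_scores) / len(real_scores)

def wasserstein_generator_loss(fake_scores):
    """WGAN generator objective: -E[f(fake)], pushing critic scores on fakes up."""
    return -sum(fake_scores) / len(fake_scores)

real = [2.0, 3.0]   # critic scores on real samples
fake = [-1.0, 0.0]  # critic scores on generated samples
print(wasserstein_critic_loss(real, fake))  # -3.0: critic separates real from fake
```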

## Other Losses

| Loss Function | Description | Use Case |
|---|---|---|
| `ElasticNetLoss<T>` | L1 + L2 combined | Regularized regression |
| `ExponentialLoss<T>` | Exponential loss | AdaBoost |
| `OrdinalRegressionLoss<T>` | Ordered categories | Ordinal prediction |
| `QuantumLoss<T>` | Quantum-inspired loss | Quantum ML |

## Sequence Losses

| Loss Function | Description | Use Case |
|---|---|---|
| `CTCLoss<T>` | Connectionist Temporal Classification | Speech recognition |

```csharp
var loss = new CTCLoss<float>();
```

## Self-Supervised Learning Losses

| Loss Function | Description | Use Case |
|---|---|---|
| `BYOLLoss<T>` | Bootstrap Your Own Latent | Self-supervised |
| `BarlowTwinsLoss<T>` | Redundancy reduction | Self-supervised |
| `DINOLoss<T>` | Self-distillation | Vision transformers |

## Usage Examples

### With AiModelBuilder

```csharp
var result = await new AiModelBuilder<float, float[][], float[]>()
    .ConfigureModel(model)
    .ConfigureLossFunction(new FocalLoss<float>())
    .BuildAsync(trainData, trainLabels);
```

## Loss Selection Guide

| Task | Recommended Loss |
|---|---|
| Binary classification | `BinaryCrossEntropyLoss` |
| Multi-class classification | `CrossEntropyLoss` |
| Imbalanced classification | `FocalLoss` |
| Regression | `MeanSquaredErrorLoss` or `HuberLoss` |
| Segmentation | `DiceLoss` + `CrossEntropyLoss` |
| Object detection | `JaccardLoss` |
| Face recognition | `TripletLoss` |
| Contrastive learning | `NTXentLoss` or `InfoNCELoss` |
| GANs | `WassersteinLoss` |
| Speech recognition | `CTCLoss` |
| Self-supervised | `BYOLLoss` or `DINOLoss` |