Classical Machine Learning

155+ algorithms: the most comprehensive classical ML toolkit available for .NET

Everything from random forests to XGBoost, from K-Means to HDBSCAN, from isolation forests to deep anomaly detectors. AiDotNet provides more classical ML algorithms than scikit-learn, all in pure C# with full GPU acceleration and SIMD optimization support.

Tabular Data Analysis · Customer Churn Prediction · Fraud Detection · Medical Diagnosis · Predictive Maintenance · Market Segmentation · Credit Scoring · Quality Control

Classification (52 algorithms)

Predict categorical outcomes with ensemble methods, SVMs, Bayesian classifiers, and more.

RandomForest

Bagged ensemble of decision trees with feature importance and OOB estimation.

GradientBoosting

Sequential boosting with gradient descent optimization.

XGBoost

Extreme gradient boosting with regularization and histogram-based splitting.

LightGBM

Gradient boosting with leaf-wise growth and categorical feature support.

CatBoost

Gradient boosting with native categorical feature handling.

SVM

Support vector machine with linear, RBF, polynomial, and sigmoid kernels.

KNN

K-nearest neighbors with distance weighting and ball/KD tree acceleration.

LogisticRegression

Linear classification with L1/L2 regularization and multinomial support.

NaiveBayes

Gaussian, Multinomial, and Bernoulli Naive Bayes classifiers.

ExtraTrees

Extremely randomized trees for faster training with comparable accuracy.

AdaBoost

Adaptive boosting with automatic re-weighting of misclassified samples.

StackingClassifier

Meta-learning ensemble combining multiple base classifiers.
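
The classifiers above plug into the same builder pipeline shown in the full example at the bottom of this page. As a sketch of what that looks like for a random forest (the `RandomForestClassifier<T>` type and its `numTrees`/`maxDepth` parameter names are assumed here by analogy with `GradientBoostingClassifier`; check the API reference for exact names):

```csharp
using AiDotNet;

// Sketch: random forest classification via the builder pattern.
// RandomForestClassifier<T> and its parameters are assumed names.
var result = await new AiModelBuilder<float, float[], float>()
    .ConfigureModel(new RandomForestClassifier<float>(
        numTrees: 100, maxDepth: 8))
    .ConfigurePreprocessing()
    .BuildAsync(X_train, y_train);

// Predict class labels for held-out samples
var predictions = result.Predict(X_test);
```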

Regression (59 algorithms)

Predict continuous values with linear models, tree ensembles, kernel methods, and robust estimators.

LinearRegression

Ordinary least squares with closed-form solution and statistical diagnostics.

Ridge

L2-regularized regression for handling multicollinearity.

Lasso

L1-regularized regression with automatic feature selection.

ElasticNet

Combined L1+L2 regularization with adjustable mixing ratio.

RandomForestRegressor

Ensemble of regression trees with variance reduction.

GradientBoostingRegressor

Sequential boosting for regression with multiple loss functions.

SVR

Support Vector Regression with epsilon-insensitive loss.

BayesianRidge

Bayesian linear regression with automatic relevance determination.

HuberRegressor

Robust regression resistant to outliers using Huber loss.

QuantileRegression

Predict conditional quantiles for uncertainty estimation.

IsotonicRegression

Non-parametric monotonic regression, commonly used for probability calibration.
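
Regressors follow the same builder pattern as classifiers. A sketch using elastic net, whose mixing ratio blends the L1 and L2 penalties (the `ElasticNetRegression<T>` type and its `alpha`/`l1Ratio` parameter names are illustrative assumptions, not verified API):

```csharp
using AiDotNet;

// Sketch: elastic net regression. Type and parameter names are
// assumed for illustration; consult the API reference.
var result = await new AiModelBuilder<float, float[], float>()
    .ConfigureModel(new ElasticNetRegression<float>(
        alpha: 0.5f,      // overall regularization strength
        l1Ratio: 0.3f))   // 0 = pure L2 (Ridge), 1 = pure L1 (Lasso)
    .ConfigurePreprocessing()
    .BuildAsync(X_train, y_train);

var yPred = result.Predict(X_test);
```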

Clustering (90+ algorithms)

Discover natural groupings with density-based, centroid-based, hierarchical, and spectral methods.

KMeans

Centroid-based clustering with k-means++ initialization.

DBSCAN

Density-based clustering that discovers arbitrary-shaped clusters.

HDBSCAN

Hierarchical DBSCAN with automatic cluster selection and stability scores.

Agglomerative

Bottom-up hierarchical clustering with single, complete, average, and Ward linkage.

Spectral

Graph-based clustering using eigenvalues of similarity matrix.

GaussianMixture

Probabilistic clustering with EM algorithm and model selection (BIC/AIC).

OPTICS

Ordering Points To Identify the Clustering Structure; handles clusters of varying density.

MeanShift

Non-parametric clustering that finds modes in density distribution.

Birch

Balanced Iterative Reducing and Clustering using Hierarchies for large datasets.

AffinityPropagation

Message-passing algorithm that simultaneously identifies exemplars and clusters.

MiniBatchKMeans

Scalable K-Means variant using mini-batches for large datasets.
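
Clustering is unsupervised, so it fits on features alone. A sketch of HDBSCAN usage (the `HDBSCAN<T>` type, its `minClusterSize` parameter, and the `Fit` method are assumed names for illustration):

```csharp
using AiDotNet;

// Sketch: HDBSCAN clustering. Type and member names are assumed.
var clusterer = new HDBSCAN<float>(minClusterSize: 10);

// Unsupervised: fit on features only, no labels required.
// A label of -1 conventionally marks noise points.
var labels = clusterer.Fit(X);
```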

Anomaly Detection (66 algorithms)

Identify outliers and anomalies with isolation-based, density-based, and deep learning approaches.

IsolationForest

Anomaly detection via random partitioning of feature space.

LocalOutlierFactor

Density-based anomaly detection comparing local reachability densities.

OneClassSVM

Support vector method for novelty detection in high-dimensional spaces.

EllipticEnvelope

Outlier detection assuming Gaussian-distributed data, using Mahalanobis distance.

AutoEncoder

Neural network-based anomaly detection via reconstruction error.

COPOD

Copula-based outlier detection: fast and parameter-free.

ECOD

Empirical cumulative distribution-based outlier detection.

DeepSVDD

Deep Support Vector Data Description for complex data.

DAGMM

Deep Autoencoding Gaussian Mixture Model for unsupervised anomaly detection.

TranAD

Transformer-based adversarial anomaly detection for time series.
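
Anomaly detectors typically fit on unlabeled data and return a per-sample anomaly score. A sketch with isolation forest (the `IsolationForest<T>` type and its `Fit`/`ScoreSamples` members are assumed names, not verified API):

```csharp
using AiDotNet;

// Sketch: isolation forest anomaly scoring. Type and member names
// (IsolationForest<T>, Fit, ScoreSamples) are assumed.
var detector = new IsolationForest<float>(numTrees: 100);

detector.Fit(X);                        // unsupervised: features only
var scores = detector.ScoreSamples(X);  // higher score = more anomalous
```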

Dimensionality Reduction

Reduce feature space while preserving structure for visualization and preprocessing.

PCA

Principal Component Analysis for linear dimensionality reduction.

KernelPCA

Non-linear PCA using kernel trick for complex manifolds.

t-SNE

t-distributed Stochastic Neighbor Embedding for 2D/3D visualization.

UMAP

Uniform Manifold Approximation and Projection for fast, high-quality embeddings.

LDA

Linear Discriminant Analysis for supervised dimensionality reduction.

NMF

Non-negative Matrix Factorization for parts-based decomposition.

ICA

Independent Component Analysis for blind source separation.

Isomap

Isometric mapping preserving geodesic distances on manifolds.
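
Dimensionality reducers are often used as a preprocessing or visualization step. A sketch of projecting features onto two principal components (the `PCA<T>` type, its `numComponents` parameter, and the `FitTransform` method are assumed names for illustration):

```csharp
using AiDotNet;

// Sketch: PCA for preprocessing/visualization. Type and member
// names are assumed; consult the API reference.
var pca = new PCA<float>(numComponents: 2);

// Fit the projection and transform the data in one call:
// each row of X is mapped to its 2-component embedding.
var embedding = pca.FitTransform(X);
```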

Classification with AiModelBuilder

C#
using AiDotNet;

// Train a gradient-boosted classifier with default preprocessing
var result = await new AiModelBuilder<float, float[], float>()
    .ConfigureModel(new GradientBoostingClassifier<float>(
        numTrees: 200, maxDepth: 6, learningRate: 0.1f))
    .ConfigurePreprocessing()
    .BuildAsync(X_train, y_train);

var predictions = result.Predict(X_test);

Start building with Classical Machine Learning

All 155+ implementations are included free under Apache 2.0.