AutoML & Neural Architecture Search
Automated model design and hyperparameter optimization
Let algorithms find the best architecture and hyperparameters. Supports differentiable NAS (DARTS), evolutionary search, Bayesian optimization, Hyperband, and complete AutoML pipelines with feature selection, model selection, and ensemble building.
Neural Architecture Search
Automatically discover optimal neural network architectures.
DARTS
Differentiable Architecture Search with continuous relaxation of search space.
ENAS
Efficient NAS with parameter sharing across candidate architectures.
ProxylessNAS
Direct NAS on target hardware without proxy tasks.
Once-for-All
Train one network that supports diverse architectural configurations.
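To make the NAS ideas above concrete, here is a minimal, language-agnostic sketch (illustrative only, not the AiDotNet API) of the continuous relaxation at the heart of DARTS: the discrete choice between candidate operations on an edge is replaced by a softmax-weighted mixture, so the architecture weights can be tuned by gradient descent and the strongest operation kept afterwards. The operation set here is a toy, hypothetical one.

```python
import math

def softmax(xs):
    """Numerically stable softmax over a list of architecture weights."""
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

# Hypothetical candidate operations on one edge of a search cell.
ops = {
    "identity": lambda x: x,
    "double":   lambda x: 2.0 * x,
    "zero":     lambda x: 0.0,
}

def mixed_op(x, alphas):
    """Continuous relaxation: output is the softmax-weighted sum of all ops."""
    weights = softmax(alphas)
    return sum(w * op(x) for w, op in zip(weights, ops.values()))

def discretize(alphas):
    """After search, keep only the operation with the largest weight."""
    names = list(ops)
    return names[max(range(len(alphas)), key=lambda i: alphas[i])]
```

Because `mixed_op` is differentiable in `alphas`, the architecture choice can be optimized jointly with the network weights; `discretize` then recovers a concrete architecture.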
Hyperparameter Optimization
Find optimal hyperparameters efficiently.
Bayesian Optimization
Gaussian process-based optimization with acquisition functions.
Hyperband
Bandit-based approach with early stopping of poor configurations.
BOHB
Combines Bayesian optimization's model-based sampling with Hyperband's early stopping.
TPE
Tree-structured Parzen Estimator for efficient search.
Random Search
A simple baseline that is often competitive with more sophisticated search methods.
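The bandit-based idea behind Hyperband can be sketched with its core subroutine, successive halving (an illustrative sketch, not the AiDotNet API): start many configurations on a small budget, repeatedly keep the best fraction, and give the survivors a larger budget. The `evaluate` callback and `eta` parameter here are assumptions for illustration.

```python
def successive_halving(configs, evaluate, min_budget=1, eta=3):
    """Return the best configuration; evaluate(config, budget) -> loss (lower is better).

    Each round, the worst (1 - 1/eta) of configurations are stopped early
    and the remaining ones are re-evaluated with eta times the budget.
    """
    budget = min_budget
    while len(configs) > 1:
        scored = sorted(configs, key=lambda c: evaluate(c, budget))
        configs = scored[: max(1, len(configs) // eta)]  # keep top 1/eta
        budget *= eta                                    # survivors get more budget
    return configs[0]
```

Full Hyperband runs several such brackets with different trade-offs between the number of configurations and the starting budget, hedging against the risk of stopping slow starters too early.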
AutoML Pipeline
End-to-end automated machine learning workflow.
Feature Selection
Automatic feature importance and selection methods.
Model Selection
Compare and select best model from algorithm library.
Ensemble Building
Automatically construct optimal model ensembles.
Cross-Validation
Robust model evaluation with multiple CV strategies.
AutoML with AiModelBuilder
using AiDotNet;
// AutoML with AiModelBuilder
var result = await new AiModelBuilder<float, float[], float>()
.ConfigureAutoML(new AutoMLOptions(
task: MLTask.Classification,
timeLimit: TimeSpan.FromMinutes(30),
metric: Metric.F1Score))
.ConfigurePreprocessing()
.BuildAsync(features, labels);
// AiModelBuilder selects the best model automatically
var prediction = result.Predict(newSample);
Start building with AutoML & Neural Architecture Search
All 4 implementations are included free under Apache 2.0.