Models, Classifiers, and Test Classifier


IDLmlAutoEncoder: Implements an autoencoder model that can be used for clustering purposes.

IDLmlFeedForwardNeuralNetwork: Implements a Neural Network model that can be used for classification purposes.

IDLmlKMeans: Implements a K-means model that can be used for clustering purposes.

IDLmlSoftmax: Implements a Softmax model that can be used for classification purposes.

IDLmlSupportVectorMachineClassification: Implements an SVM model that can be used for classification purposes.

IDLmlSupportVectorMachineRegression: Implements an SVM model that can be used for regression purposes.

IDLmlTestClassifier: Computes a confusion matrix and other metrics that indicate how well a model trained as a classifier performed against the test data.
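
The sketch below shows how these pieces typically fit together: build a classifier, train it, and evaluate it with IDLmlTestClassifier. The argument order, keyword names, and array orientation are assumptions based on common usage; consult each class's reference page for the exact syntax.

  ; Minimal classification sketch (signatures are assumptions; see the
  ; reference page of each class for the exact syntax).
  ; Synthetic data: 2 attributes, 200 examples, 2 classes.
  nAttributes = 2
  nExamples = 200
  features = RANDOMU(seed, nAttributes, nExamples)
  labels = FIX(TOTAL(features, 1) GT 1.0)   ; class 1 if the attributes sum to more than 1
  uniqueLabels = [0, 1]

  ; Create and train a Softmax classifier
  Classifier = IDLmlSoftmax(nAttributes, uniqueLabels)
  for i = 1, 100 do loss = Classifier.Train(features, LABELS=labels)

  ; Evaluate the trained classifier (here against the training data for brevity)
  result = IDLmlTestClassifier(Classifier, features, labels, $
    ACCURACY=accuracy, CONFUSION_MATRIX=confusionMatrix)
  PRINT, 'Accuracy: ', accuracy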

Partition and Shuffle


IDLmlPartition: Partitions data into two or more groups, such as training and test sets.

IDLmlShuffle: Shuffles features and values to create a random reordering of training data used for machine learning applications.
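
A short sketch of how shuffling and partitioning are typically combined before training follows. The {train: 80, test: 20} ratio structure reflects common usage, and the layout of the value IDLmlPartition returns is not shown here, so treat the details as assumptions to verify against the class documentation.

  ; Hypothetical shuffle-and-split sketch
  features = RANDOMU(seed, 3, 100)          ; 3 attributes, 100 examples
  labels = FIX(RANDOMU(seed, 100) * 2)      ; two classes: 0 or 1

  ; Shuffle features and labels together so each example keeps its label
  IDLmlShuffle, features, labels

  ; Split the shuffled data into named groups by percentage
  part = IDLmlPartition({train: 80, test: 20}, features, labels)

The returned variable groups the input arguments under the names given in the ratio structure; see the IDLmlPartition reference page for how to access each group.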

Normalizers


IDLmlLinearNormalizer: Implements a linear normalizer using the formula dataOut = dataIn * scale + offset.

IDLmlRangeNormalizer: Implements a normalizer that will scale data to have a range of 1.

IDLmlTanHNormalizer: Implements a hyperbolic tangent normalizer that maps each value to its Tanh, confining the normalized data to the range (-1, +1).

IDLmlUnitNormalizer: Implements a normalizer that will scale data to the range of [0, 1].

IDLmlVarianceNormalizer: Implements a normalizer that will scale data to have a mean of 0 and a standard deviation of 1, regardless of range.
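
As a brief sketch, a normalizer is constructed from the data (so it can derive its scale and offset, following the linear formula above) and then applied to it. The Normalize method call pattern shown is an assumption; see the class documentation for the exact usage.

  ; Hypothetical variance-normalization sketch
  data = RANDOMU(seed, 4, 500) * 100.

  ; Construct the normalizer from the data so it can compute mean and stddev
  Normalizer = IDLmlVarianceNormalizer(data)

  ; Apply the normalization in place
  Normalizer.Normalize, data

  PRINT, MEAN(data), STDDEV(data)   ; approximately 0 and 1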

Optimizers


Optimization algorithms are used by neural networks to help minimize an error function by modifying the model's internal learnable parameters (see the sketch after this list).

IDLmloptAdam

IDLmloptGradientDescent

IDLmloptMomentum

IDLmloptQuickProp

IDLmloptRMSProp
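
Each optimizer implements a different rule for updating the weights from the gradient of the loss; plain gradient descent, for example, steps each weight against its gradient scaled by a learning rate. The sketch below assumes the optimizer is constructed with a learning rate and passed to the network's Train method through an OPTIMIZER keyword; both details are assumptions to check against the class documentation.

  ; Hypothetical sketch: training a network with an explicit optimizer
  nAttributes = 4
  uniqueLabels = [0, 1]
  features = RANDOMU(seed, nAttributes, 200)
  labels = FIX(TOTAL(features, 1) GT 2.0)
  Network = IDLmlFeedForwardNeuralNetwork([nAttributes, 8, 2], uniqueLabels)

  ; Assumed: the optimizer takes a learning rate and is passed to Train
  for i = 1, 300 do loss = Network.Train(features, LABELS=labels, $
    OPTIMIZER=IDLmloptGradientDescent(0.1))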

Activation Functions


Activation functions are mathematical functions used in machine learning to introduce non-linearity into otherwise linear systems (see the sketch after this list).

IDLmlafArcTan

IDLmlafBentIdentity

IDLmlafBinaryStep

IDLmlafELU

IDLmlafGaussian

IDLmlafIdentity

IDLmlafISRLU

IDLmlafISRU

IDLmlafLogistic

IDLmlafPReLU

IDLmlafReLU

IDLmlafSinc

IDLmlafSinusoid

IDLmlafSoftExponential

IDLmlafSoftmax

IDLmlafSoftPlus

IDLmlafSoftSign

IDLmlafTanH
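
A feed-forward network typically applies an activation function at each layer. The sketch below assumes the IDLmlFeedForwardNeuralNetwork constructor accepts the per-layer functions through an ACTIVATION_FUNCTIONS keyword; treat that keyword name as an assumption and verify it against that class's documentation.

  ; Hypothetical sketch: choosing activation functions per layer
  nAttributes = 4
  uniqueLabels = [0, 1]
  Network = IDLmlFeedForwardNeuralNetwork([nAttributes, 8, 2], uniqueLabels, $
    ACTIVATION_FUNCTIONS=[IDLmlafIdentity(), IDLmlafReLU(), IDLmlafSoftmax()])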

Kernels


The kernel classes encapsulate the parameters that define a kernel for an SVM (Support Vector Machine) model (see the sketch after this list).

IDLmlSVMLinearKernel

IDLmlSVMPolynomialKernel

IDLmlSVMRadialKernel

IDLmlSVMSigmoidKernel
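
A kernel object is typically created and handed to an SVM when the model is constructed. The KERNEL keyword shown in the sketch below is an assumption; verify it against the SVM class documentation.

  ; Hypothetical sketch: an SVM classifier with a radial (RBF) kernel
  nAttributes = 4
  uniqueLabels = [0, 1]
  Kernel = IDLmlSVMRadialKernel()
  Classifier = IDLmlSupportVectorMachineClassification(nAttributes, uniqueLabels, $
    KERNEL=Kernel)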

Loss Functions


Loss functions are mathematical functions that measure the error between a model's predictions and the expected values; training minimizes a loss function to achieve convergence (see the sketch after this list).

IDLmllfCrossEntropy

IDLmllfHuber

IDLmllfLogCosh

IDLmllfMeanAbsoluteError

IDLmllfMeanSquaredError
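
As an illustration of what two of these measure, mean squared error averages the squared differences between predictions and targets, while mean absolute error averages their absolute differences. The sketch below computes both directly in IDL; it does not use the loss-function classes themselves, which are normally passed to a model.

  ; Direct computation of two common losses for illustration
  actual    = [1.0, 2.0, 3.0, 4.0]
  predicted = [1.1, 1.9, 3.5, 3.7]

  mse = MEAN((predicted - actual)^2)      ; mean squared error
  mae = MEAN(ABS(predicted - actual))     ; mean absolute error

  PRINT, 'MSE: ', mse, '  MAE: ', mae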