Softmax Operator Using the Orion Framework

Over the past three weeks, I have been studying Orion, a Zero-Knowledge Machine Learning (zkML) framework for Validity ML. Orion is built in Cairo, a Rust-like programming language for writing Starknet contracts as well as traditional programs.

Validity ML leverages validity proofs such as STARKs, which enable verification that a computation was carried out correctly. By deploying such proof systems in machine learning applications, we gain the ability to validate the inference of ML models, that is, to confirm that a specific input produced a certain output with a given model.

Orion's use of the Open Neural Network Exchange (ONNX) format for its machine learning runtime provides key advantages. Because ONNX is interoperable, Orion can take models trained in different frameworks, such as PyTorch and TensorFlow, and validate their inferences. Below is the implementation of one of the ONNX neural network operators using Orion.

Implementation of the Softmax Operator Using Orion

The softmax operator is commonly used in the final layer of neural network classifiers to convert raw scores (logits) into a normalized probability distribution over the output classes. It exponentiates each logit and then divides by the sum of all the exponentiated values, ensuring the outputs sum to 1.
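Concretely, for a vector of logits z over K classes, the probability assigned to class i is:

softmax(z)_i = exp(z_i) / (exp(z_1) + exp(z_2) + ... + exp(z_K))

For example, the logits [1.0, 2.0, 3.0] map to probabilities of roughly [0.09, 0.24, 0.67]: exp(3.0) ≈ 20.09 dominates the total exp(1.0) + exp(2.0) + exp(3.0) ≈ 30.19, so the third class receives most of the probability mass.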

This probabilistic output is useful for multi-class classification tasks where we want to predict the likelihood of each potential class. The larger the logit for a class, the higher the softmax probability will be for that class.

For example, in an image classifier, the softmax outputs might represent the predicted probabilities that the input image belongs to each possible class: cat, dog, car, etc. The softmax essentially converts the logits into a "soft" probability distribution in which the largest logit activates the corresponding output class most strongly.

This transforms the outputs into a more interpretable form and also enables the use of probabilistic loss functions like cross-entropy for training the model. The softmax nonlinearity enforces competition among the output units, allowing a single class to be predicted as the winner based on the relative evidence encoded in the logits.
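As a minimal sketch of what this looks like in practice, the snippet below follows the usage pattern from Orion's documentation for NNTrait::softmax. Orion operates on tensors of fixed-point numbers rather than floats; the fixed-point type (FP8x23 here), the new_unscaled constructor, and the exact module paths are assumptions based on the version of Orion I have been studying and may differ in other releases.

```rust
use core::array::ArrayTrait;
use orion::operators::tensor::{Tensor, TensorTrait, FP8x23Tensor};
use orion::operators::nn::{NNTrait, FP8x23NN};
use orion::numbers::{FP8x23, FixedTrait};

// Builds a 2x2 tensor of logits [[0, 1], [2, 3]] in FP8x23 fixed-point
// format and applies softmax along axis 1 (the class dimension).
fn softmax_example() -> Tensor<FP8x23> {
    let logits = TensorTrait::<FP8x23>::new(
        shape: array![2, 2].span(),
        data: array![
            FixedTrait::new_unscaled(0, false), // 0.0
            FixedTrait::new_unscaled(1, false), // 1.0
            FixedTrait::new_unscaled(2, false), // 2.0
            FixedTrait::new_unscaled(3, false), // 3.0
        ]
            .span(),
    );

    // Normalize each row into a probability distribution.
    NNTrait::softmax(@logits, 1)
}
```

Passing axis 1 tells the operator to normalize along the class dimension, so each row of the result is a probability distribution summing to (approximately) one in fixed-point arithmetic.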