Important Notes - ONNX v1.13
ONNX, introduced by Facebook and Microsoft in 2017, is a popular open format for representing machine learning models. The recently released ONNX v1.13.0 brings several new features and improvements, and in this post we take a closer look at some of the most significant updates in the release.
Support for Sparse Tensors
Sparse tensors represent arrays in which most elements are zero. Because only the non-zero values and their positions are stored, they use memory far more efficiently than dense tensors, which makes it possible to work with larger datasets and more complex models on the same computational resources. One of the most notable features of ONNX 1.13.0 is improved support for sparse tensors, which reduces the memory footprint of models that contain mostly-zero data.
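For illustration, here is a minimal sketch of how sparse data can be described with the onnx.helper API; the tensor names, values, and shapes below are placeholders, and which operators actually consume sparse inputs depends on the opset and runtime you use.

```python
from onnx import helper, TensorProto

# Describe a 3x3 matrix that has only two non-zero entries: the non-zero
# values and their flattened (row-major) positions are stored separately.
dense_shape = [3, 3]
values = helper.make_tensor(
    name="values", data_type=TensorProto.FLOAT, dims=[2], vals=[1.5, 2.5]
)
indices = helper.make_tensor(
    name="indices", data_type=TensorProto.INT64, dims=[2], vals=[0, 8]
)
sparse = helper.make_sparse_tensor(values, indices, dense_shape)
print(sparse)
```

Only two values and two indices are stored here instead of nine dense elements; for genuinely sparse data the savings grow with the size of the tensor.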
Improved Performance for LSTM and GRU
ONNX 1.13.0 includes better optimizations for LSTM and GRU models, which helps reduce inference time. LSTM cells use gates to selectively retain or discard information over time, and GRUs use a simpler gating scheme to control the flow of information, which lets both architectures process long data sequences efficiently.
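As a rough illustration, the sketch below exports a small PyTorch LSTM to ONNX and times a single forward pass with onnxruntime. The layer sizes, file name, and the assumption that the onnxruntime package is installed are all illustrative; any speedup you observe will depend on your runtime version and hardware.

```python
import time
import numpy as np
import torch
import torch.nn as nn
import onnxruntime as ort  # assumed to be installed; any ONNX-compatible runtime works

# A small single-layer LSTM with placeholder sizes, exported to ONNX.
lstm = nn.LSTM(input_size=16, hidden_size=32, batch_first=True).eval()
dummy = torch.randn(1, 10, 16)  # (batch, sequence, features)
torch.onnx.export(lstm, dummy, "lstm.onnx", input_names=["x"])

# Run the exported graph once and measure how long the call takes.
sess = ort.InferenceSession("lstm.onnx")
x = np.random.randn(1, 10, 16).astype(np.float32)
start = time.perf_counter()
sess.run(None, {"x": x})
print(f"inference took {time.perf_counter() - start:.4f}s")
```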
Support for More Operators
ONNX 1.13.0 supports several new operators, including MeanVarianceNormalization, MatrixDiag, and CumSum. These operators can be helpful in a wide range of machine learning tasks, such as computer vision and natural language processing.
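As an example, here is a minimal sketch that wires one of these operators, CumSum, into a one-node graph with onnx.helper; the graph name, shapes, and opset number are placeholders you would adapt to your own tooling.

```python
import onnx
from onnx import helper, TensorProto

# A one-node graph that computes the cumulative sum of a length-4 vector
# along axis 0. The axis is supplied as a scalar initializer.
x = helper.make_tensor_value_info("x", TensorProto.FLOAT, [4])
y = helper.make_tensor_value_info("y", TensorProto.FLOAT, [4])
axis = helper.make_tensor("axis", TensorProto.INT64, [], [0])

node = helper.make_node("CumSum", inputs=["x", "axis"], outputs=["y"])
graph = helper.make_graph([node], "cumsum_demo", [x], [y], initializer=[axis])
model = helper.make_model(graph, opset_imports=[helper.make_opsetid("", 18)])

# Validate that the model is well-formed before handing it to a runtime.
onnx.checker.check_model(model)
```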
Improved Integration with PyTorch
ONNX 1.13.0 also brings several improvements to the integration with PyTorch, including better support for dynamic shapes and more flexible handling of model inputs and outputs.
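For instance, dynamic shapes can be requested at export time through the dynamic_axes argument of torch.onnx.export. The sketch below uses a toy linear model, and the input/output names and axis labels are placeholders.

```python
import torch
import torch.nn as nn

# Export a simple model with a dynamic batch dimension, so the same ONNX
# file can accept batches of any size at inference time.
model = nn.Linear(16, 4).eval()
dummy = torch.randn(2, 16)

torch.onnx.export(
    model,
    dummy,
    "linear_dynamic.onnx",
    input_names=["input"],
    output_names=["output"],
    dynamic_axes={"input": {0: "batch"}, "output": {0: "batch"}},
)
```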
In conclusion, ONNX 1.13.0 is a significant update that brings several new features and improvements to the framework. If you are a machine learning developer or researcher, it is worth exploring these new capabilities and seeing how they can benefit your work.