ONNX AI Format Adds Partners

Today, following the introduction of the Open Neural Network Exchange (ONNX) format on September 7, AMD, ARM, Huawei, IBM, Intel, and Qualcomm have announced their support for ONNX. These companies, like Facebook and Microsoft, recognize the benefits that ONNX’s open ecosystem provides engineers and researchers: it allows them to move more easily between state-of-the-art machine learning tools and choose the best combination for their projects. ONNX also makes it easier for optimizations to reach more developers. Any tool that exports ONNX models can benefit from ONNX-compatible runtimes and libraries designed to maximize performance on some of the best AI hardware in the industry.

ONNX is an important part of the deep learning approach within Facebook’s AI teams, where we are continuously trying to push the frontier of AI and develop better algorithms for learning. When we have a breakthrough, we strive to deliver state-of-the-art technology to our teams as soon as possible. With ONNX, we are focused on bringing the worlds of AI research and products closer together so that we can innovate and deploy intelligence faster.

We’re excited that these companies have decided to join us in this mission. We invite others in the community to join the effort and support ONNX in their ecosystems. Enabling interoperability between different frameworks and streamlining the path from research to production will help increase the speed of innovation in the AI community.