# OnnxSlim


OnnxSlim helps you slim your ONNX model: fewer operators, the same accuracy, and better inference speed.

- 🚀 2025/05/17: OnnxSlim is merged into [optimum](https://github.com/huggingface/optimum) 🤗🤗🤗
- 🚀 2025/04/30: Ranked 1st in the [AICAS 2025 LLM inference optimization challenge](https://tianchi.aliyun.com/competition/entrance/532289/customize588)
- 🚀 2025/01/28: Reached 1M downloads
- 🚀 2024/06/23: OnnxSlim is merged into [transformers.js](https://github.com/huggingface/transformers.js) 🤗🤗🤗
- 🚀 2024/06/02: OnnxSlim is merged into [ultralytics](https://github.com/ultralytics/ultralytics) ❤️❤️❤️
- 🚀 2024/04/30: Ranked 1st in the [AICAS 2024 LLM inference optimization challenge](https://tianchi.aliyun.com/competition/entrance/532170/customize440) held by Arm and T-Head
- 🚀 2024/01/25: OnnxSlim is merged into [mnn-llm](https://github.com/wangzhaode/mnn-llm); performance increased by 5%

# Benchmark

![Image](https://github.com/user-attachments/assets/fefc79f1-5d8d-486b-935a-a088846b3900)

# Installation

## Using Prebuilt

```bash
pip install onnxslim
```

## Install From Source

```bash
pip install git+https://github.com/inisis/OnnxSlim@main
```

## Install From Local

```bash
git clone https://github.com/inisis/OnnxSlim && cd OnnxSlim/
pip install .
```

# How to use

## Bash

```bash
onnxslim your_onnx_model slimmed_onnx_model
```
## In Script

```python
import onnx
import onnxslim

model = onnx.load("model.onnx")
slimmed_model = onnxslim.slim(model)
onnx.save(slimmed_model, "slimmed_model.onnx")
```

For more usage, run `onnxslim -h` or refer to our [examples](./examples).

# Projects using OnnxSlim

- [Mozilla/smart_autofill](https://github.com/mozilla/smart_autofill)
- [alibaba/MNN](https://github.com/alibaba/MNN)
- [PaddlePaddle/PaddleOCR](https://github.com/PaddlePaddle/PaddleOCR)
- [huggingface/transformers.js](https://github.com/huggingface/transformers.js)
- [huggingface/optimum](https://github.com/huggingface/optimum)
- [THU-MIG/yolov10](https://github.com/THU-MIG/yolov10)
- [ultralytics/ultralytics](https://github.com/ultralytics/ultralytics)
- [ModelScope/FunASR](https://github.com/modelscope/FunASR)
- [alibaba/MNN-LLM](https://github.com/wangzhaode/mnn-llm)
- [deepghs/imgutils](https://github.com/deepghs/imgutils)
- [sunsmarterjie/yolov12](https://github.com/sunsmarterjie/yolov12)
- [nndeploy/nndeploy](https://github.com/nndeploy/nndeploy)
- [CVCUDA/CV-CUDA](https://github.com/CVCUDA/CV-CUDA)

# References

> - [onnx-graphsurgeon](https://github.com/NVIDIA/TensorRT/tree/main/tools/onnx-graphsurgeon)
> - [Polygraphy](https://github.com/NVIDIA/TensorRT/tree/main/tools/Polygraphy/polygraphy)
> - [onnx-simplifier](https://github.com/daquexian/onnx-simplifier)
> - [tabulate](https://github.com/astanin/python-tabulate)
> - [onnxruntime](https://github.com/microsoft/onnxruntime)

# Contact

Discord: https://discord.gg/nRw2Fd3VUS

QQ Group: `873569894`