# Build a deep learning inference framework from scratch
Environment requirements:
- conda, python=3.11
- download `mininn_test.gynn`
```shell
conda create -n mininn python=3.11
conda activate mininn
pip install mininn
python python/test/pip_test.py
```
| Host | Target | Tool | Compiler | Backend | Note |
|---|---|---|---|---|---|
| Windows | Windows | cmake 3.26.4 | msvc 14.44 | cuda✅ opencl✅ avx✅ sse✅ mkl✅ | use VS shell |
| | | | clang 20.1.5 | cuda❌ opencl✅ avx✅ sse✅ mkl✅ | |
| | | | g++ 13.1.0 | cuda❌ opencl✅ avx✅ sse✅ mkl✅ | |
| | | bazel 7.3.1 | msvc 14.44 | cuda✅ opencl✅ avx✅ sse✅ mkl✅ | default msvc |
| | | | clang 20.1.5 | | |
| | | | g++ 13.1.0 | | |
| Linux | Linux | cmake 3.16.3 | clang 10.0.0 | cuda✅ opencl✅ avx✅ sse✅ mkl✅ | |
| | | | g++ 9.4.0 | cuda✅ opencl✅ avx✅ sse✅ mkl✅ | |
| | | bazel 8.0.0 | clang 10.0.0 | | |
| | | | g++ 9.4.0 | cuda✅ opencl✅ avx✅ sse✅ mkl❌ | default g++ |
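To see which of the toolchains in the matrix above are available locally, a small probe script can print the first version line of each tool. This helper is hypothetical (not part of the MiniNN repo); it only calls each tool with its standard `--version` flag:

```python
import shutil
import subprocess

# Hypothetical helper (not part of MiniNN): probe the build tools and
# compilers listed in the support matrix above.
def probe(tools=("cmake", "bazel", "clang", "g++")):
    report = {}
    for tool in tools:
        # Skip tools that are not on PATH instead of raising.
        if shutil.which(tool) is None:
            report[tool] = "not found"
            continue
        out = subprocess.run([tool, "--version"],
                             capture_output=True, text=True)
        report[tool] = out.stdout.splitlines()[0] if out.stdout else "unknown"
    return report

for tool, version in probe().items():
    print(f"{tool}: {version}")
```

Matching the reported versions against the table gives a quick check of whether a given Host/Tool/Compiler row is reproducible on your machine.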
```shell
# HTTPS
git clone --recursive https://github.com/masteryi-0018/MiniNN.git
# SSH
git clone --recursive git@github.com:masteryi-0018/MiniNN.git
```
```shell
# cmake (default)
python build.py
# bazel
python build.py --tool bazel
# to specify the generator, add this flag
python build.py --tool bazel --generator ninja
# to specify the compiler, add this flag
python build.py --tool bazel --compiler clang
# to build a Python wheel, add this flag
python build.py --tool bazel --wheel
```
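The flag combinations above can be read as a small dispatch from arguments to a build command line. The sketch below is hypothetical and simplified (the repo's actual `build.py` may assemble its commands differently); the cmake `-S`/`-B`/`-G`/`-D` options and the bazel `--repo_env` flag are standard options of those tools:

```python
import argparse

# Hypothetical sketch of the flag handling shown above; MiniNN's real
# build.py may differ. Wheel building is omitted from this sketch.
def build_command(argv):
    parser = argparse.ArgumentParser()
    parser.add_argument("--tool", choices=["cmake", "bazel"], default="cmake")
    parser.add_argument("--generator")   # e.g. ninja
    parser.add_argument("--compiler")    # e.g. clang
    parser.add_argument("--wheel", action="store_true")
    args = parser.parse_args(argv)

    if args.tool == "cmake":
        cmd = ["cmake", "-S", ".", "-B", "build"]
        if args.generator:
            cmd += ["-G", args.generator]
        if args.compiler:
            cmd.append(f"-DCMAKE_CXX_COMPILER={args.compiler}")
    else:
        cmd = ["bazel", "build", "//..."]
        if args.compiler:
            cmd.append(f"--repo_env=CC={args.compiler}")
    return cmd

print(build_command(["--tool", "bazel", "--compiler", "clang"]))
```

Keeping the tool/generator/compiler choices orthogonal like this is what lets the table of supported Host/Tool/Compiler combinations stay flat.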
Run the tests on Windows:

```shell
# cmake
.\build\mininn\test\gtest-main.exe
# bazel
.\bazel-bin\mininn\gtest-main.exe
```
Run the tests on Linux:

```shell
# cmake
./build/mininn/test/gtest-main
# bazel
./bazel-bin/mininn/gtest-main
```
- mininn convertor
  - support converting ONNX models to gynn format
  - support converting PyTorch models to gynn format
  - support converting TensorFlow models to gynn format
- mininn IR
  - support building graphs with multiple operators
- mininn kernel
  - add an opencl backend
  - add a cuda backend
  - add an avx backend
  - add an sse backend
  - add an mkl backend
- mininn release
  - demo
  - cpp sdk
  - pip package
- mininn build
  - cmake
  - bazel
  - clang++
  - g++
  - msvc
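The "graphs with multiple operators" item in the IR roadmap can be pictured with a toy structure: a graph is a list of operator nodes, each naming its inputs and outputs. The classes below are purely illustrative (not MiniNN's actual IR types):

```python
from dataclasses import dataclass, field

# Hypothetical sketch of a multi-operator graph IR; illustrative only,
# not MiniNN's real classes.
@dataclass
class Op:
    kind: str        # operator type, e.g. "conv2d", "relu"
    inputs: list     # names of input tensors
    outputs: list    # names of output tensors

@dataclass
class Graph:
    ops: list = field(default_factory=list)

    def add(self, kind, inputs, outputs):
        # Append an operator and return self so calls can be chained.
        self.ops.append(Op(kind, inputs, outputs))
        return self

g = Graph().add("conv2d", ["x", "w"], ["y"]).add("relu", ["y"], ["z"])
print([op.kind for op in g.ops])  # → ['conv2d', 'relu']
```

A converter (the first roadmap item) would then amount to walking an ONNX/PyTorch/TensorFlow graph and emitting one such `Op` per source node before serializing to the gynn format.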