- Check out the main branch from https://github.com/microsoft/onnxruntime to C:\code\onnxruntime
- Install Ninja:
pip install ninja
or download it from GitHub (make sure it is in PATH).
- Build using js\build_jsep.bat; see the file's content for usage instructions:
C:\code\onnxruntime>js\build_jsep d
If this is the first build, perform a clean build:
C:\code\onnxruntime>js\build_jsep d clean
- Use the following command line to build wasm:
C:\code\onnxruntime>
.\build.bat --config Release --build_wasm --enable_wasm_simd --enable_wasm_threads --use_jsep --target onnxruntime_webassembly --skip_tests
- Prepare js/web:
C:\code\onnxruntime>
cd js
C:\code\onnxruntime\js>npm ci
C:\code\onnxruntime\js>cd common
C:\code\onnxruntime\js\common>npm ci
C:\code\onnxruntime\js\common>cd ..\web
C:\code\onnxruntime\js\web>npm ci
C:\code\onnxruntime\js\web>npm run pull:wasm
C:\code\onnxruntime\js\web>copy /Y ..\..\build\Windows\Release\ort-wasm-simd-threaded.jsep.wasm .\lib\wasm\binding\ort-wasm-simd-threaded.jsep.wasm
C:\code\onnxruntime\js\web>copy /Y ..\..\build\Windows\Release\ort-wasm-simd-threaded.jsep.mjs .\dist\ort-wasm-simd-threaded.jsep.mjs
C:\code\onnxruntime\js\web>
npm test -- model <path-to-ort-model-test-folder> -b=webgpu -x=1 --debug
C:\code\onnxruntime\js\web>
npm run build
The following files are built and needed for distribution:
- ort.{...}.min.[m]js
- ort.{...}.min.[m]js.map (optional, for debugging)
- ort-wasm-simd-threaded.jsep.wasm
- Pack the npm packages:
C:\code\onnxruntime\js\common>npm pack
C:\code\onnxruntime\js\web>npm pack
Two package files are needed for npm install; install them together:
- js/common/onnxruntime-common-*.tgz
- js/web/onnxruntime-web-*.tgz
Make sure the files mentioned above are placed in the dist folder.
- Consume from a <script> tag:
<script src="./dist/ort.webgpu.min.js"></script>
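When loaded from a script tag, the library is exposed as the global ort object. Below is a minimal sketch of running a model this way, assuming a browser with WebGPU support; the model path './model.onnx' and the feed name 'input' are hypothetical placeholders, not from the source:

```html
<script src="./dist/ort.webgpu.min.js"></script>
<script>
  // `ort` is the global exposed by ort.webgpu.min.js.
  (async () => {
    // './model.onnx' and the feed name 'input' are hypothetical placeholders.
    const session = await ort.InferenceSession.create('./model.onnx', {
      executionProviders: ['webgpu'],
    });
    const feeds = { input: new ort.Tensor('float32', new Float32Array(4), [1, 4]) };
    const results = await session.run(feeds);
    console.log(results);
  })();
</script>
```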
- Consume from the npm package:
import {Tensor, InferenceSession} from "onnxruntime-web/webgpu";
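A minimal usage sketch building on the import above, assuming a bundler or module script and a browser with WebGPU; './model.onnx' and the feed name 'input' are hypothetical placeholders, not from the source:

```javascript
import { Tensor, InferenceSession } from 'onnxruntime-web/webgpu';

// './model.onnx' and the feed name 'input' are hypothetical placeholders.
const session = await InferenceSession.create('./model.onnx', {
  executionProviders: ['webgpu'],
});
const feeds = { input: new Tensor('float32', new Float32Array(4), [1, 4]) };
const results = await session.run(feeds);
console.log(Object.keys(results)); // names of the model's output tensors
```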