High-performance neural network framework running on the Web.
Classification | Style Transfer | Object Detection (TBD)
First, determine which backend to use. sznn provides three backends (JS, WASM, and WebGPU), but currently you have to pick the best backend statically yourself.
Fortunately, we provide a tool to help: just open tools/schedule/detect.html
in your target browser and you will see our recommendation.
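If you want a rough idea of how such a recommendation can be made, here is a minimal sketch of browser feature detection. The function name and the priority order are assumptions for illustration, not sznn's actual logic:

```js
// Illustrative sketch only: detect which backends the current browser can run,
// preferring the fastest one. Not sznn's actual detection code.
function suggestBackend() {
  if ("gpu" in navigator) {
    return "WebGPU"; // WebGPU entry point is exposed by the browser
  }
  if (typeof WebAssembly === "object") {
    return "WASM"; // WebAssembly is available in all modern browsers
  }
  return "JS"; // plain JavaScript fallback always works
}

console.log(`Recommended backend: ${suggestBackend()}`);
```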
With just two simple API calls you can run inference on ONNX models:
const model = await loadModel("./model.onnx"); // load and parse the ONNX model
const output = await model.forward(input);     // run a forward pass and get the output tensor
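The snippet above assumes `loadModel` is imported from the package and that `input` already holds a tensor in the layout the model expects. A hedged sketch of that setup follows; the import path and the 1x3x224x224 SqueezeNet-style input shape are assumptions, not part of the documented API:

```js
// Assumed import path; adjust to match how sznn is actually packaged.
import { loadModel } from "sznn";

// Assumed input layout: a SqueezeNet-style classifier typically takes a
// preprocessed NCHW float tensor of shape 1x3x224x224.
const input = new Float32Array(1 * 3 * 224 * 224);
// ...fill `input` with normalized pixel values from an image here...
```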
Here is a SqueezeNet inference benchmark, tested on my M1 MacBook Pro with Chrome Canary v100.
Obviously, sznn still has a long way to go.
Warning
This project is still under heavy development. Please DO NOT use it in a production environment!
- Support more ONNX operators. (#1)
- Add YOLO as a detection example. (#2)
- Optimize the convolutional layer. (#3)
- Improve backend performance. (#4) (#5)
Let's make sznn better together.
Copyright © 2022 Sh-Zh-7