sznn

🚀 High performance neural network framework running on the Web.


Overview

Examples: Classification · Style Transfer · Object Detection (TBD)

Usage

First, decide which backend to use. sznn provides three backends (JS, WASM, and WebGPU), but currently you must pick the best backend statically yourself.

Fortunately, we provide a tool that suggests one: open tools/schedule/detect.html in your target browser and you will see our recommendation.
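Such a recommendation boils down to feature detection: prefer WebGPU when the browser exposes it, fall back to WASM, and use plain JS as a last resort. A minimal sketch of that logic (`selectBackend` and its capability flags are illustrative, not the actual sznn API):

```typescript
type Backend = "webgpu" | "wasm" | "js";

// Pick the fastest backend the environment supports.
// caps is a plain capability record so the logic is testable anywhere.
function selectBackend(caps: { webgpu: boolean; wasm: boolean }): Backend {
  if (caps.webgpu) return "webgpu";
  if (caps.wasm) return "wasm";
  return "js";
}

// In a browser, the capabilities would come from feature detection:
// const caps = {
//   webgpu: "gpu" in navigator,
//   wasm: typeof WebAssembly !== "undefined",
// };
```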

With two simple API calls you can run inference on ONNX models:

const model = await loadModel("./model.onnx");
const output = await model.forward(input);
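For a classification model, the raw scores returned by forward() are typically converted to probabilities with softmax before picking the top class. A minimal sketch of that post-processing (the flat Float32Array layout is an assumption; the actual sznn output type may differ):

```typescript
// Convert raw logits to probabilities (numerically stable softmax).
function softmax(logits: Float32Array): Float32Array {
  // Subtract the max logit so Math.exp never overflows.
  let max = -Infinity;
  for (const v of logits) max = Math.max(max, v);
  const probs = new Float32Array(logits.length);
  let sum = 0;
  for (let i = 0; i < logits.length; i++) {
    probs[i] = Math.exp(logits[i] - max);
    sum += probs[i];
  }
  for (let i = 0; i < probs.length; i++) probs[i] /= sum;
  return probs;
}

// Index of the highest-probability class.
function argmax(probs: Float32Array): number {
  let best = 0;
  for (let i = 1; i < probs.length; i++) {
    if (probs[i] > probs[best]) best = i;
  }
  return best;
}
```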

Benchmark

Here is a SqueezeNet inference benchmark, tested on my M1 MacBook Pro with Chrome Canary v100.

(benchmark chart)

Obviously, sznn has a long way to go. 😅

Roadmap

Warning

This project is still heavily in development. Please DO NOT use it in a production environment!

  • Support more ONNX operators. (#1)
  • Add YOLO as a detection example. (#2)
  • Optimize the convolutional layer. (#3)
  • Improve backend performance. (#4) (#5)

Let's make sznn better together.

License

Apache-2.0 License

Copyright ยฉ๏ธ 2022 Sh-Zh-7