-> return the root module, which is the first module in the list.
```
During the `CreateModuleFromLibrary` function, `net_onnx_rdl.tar` is untarred and `g++ -shared -fPIC -o <output_name>.so devc.o lib0.o` is run to create the shared library.
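For reference, a minimal Python sketch of how that archive round-trips, assuming `lib` is the `tvm.runtime.Module` produced by `relay.build` (the build step itself is not shown here):

```python
# Minimal sketch; assumes `lib` is the tvm.runtime.Module returned by relay.build.
import tvm

# export_library packs the generated objects (e.g. devc.o, lib0.o) into the tar:
# lib.export_library("net_onnx_rdl.tar")

# load_module reverses that: the tar is untarred and the object files are linked
# into a shared library (roughly the g++ command above) before the module is returned.
loaded = tvm.runtime.load_module("net_onnx_rdl.tar")
print(loaded.type_key)
```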
```log
python: tvm.graph_executor.create
-> call the `GraphExecutorCreate@src/runtime/graph_executor/graph_executor.cc:788`
-> call the `exec->Init`
-> call the `this->Load(&reader)` # This is the `Load` function.
-> inside this Load function, it parses the metadata JSON (shown in the graph_json chapter):
-> here it extracts the `arg_nodes`, `heads`, `nodes`, `node_row_ptr`, `attrs` and `shape` information.
-> and initializes the vector of `Node`s in the `GraphExecutor` object.
-> and all the nodes in the JSON are parsed into `TVMOpParam` entries.
-> call `GraphExecutor::SetupStorage()`
-> calculate the tensor sizes, create the storage list, and allocate the memory (see the Python sketch after this trace).
```
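Here is a hedged Python sketch of the flow traced above. It assumes the graph JSON was saved to a file (`graph.json` is a placeholder name), the library was exported as `net_onnx_rdl.tar`, and the model takes a single float32 input named `"input"` with shape `(1, 3, 224, 224)`; none of these details come from the trace itself.

```python
# Minimal sketch under the assumptions stated above, not a verbatim reproduction
# of this note's setup.
import numpy as np
import tvm
from tvm.contrib import graph_executor

dev = tvm.cpu(0)

# Loading the .tar goes through CreateModuleFromLibrary (untar + link, as above).
lib = tvm.runtime.load_module("net_onnx_rdl.tar")

# This JSON is what GraphExecutor::Load parses: it carries the "nodes",
# "arg_nodes", "heads", "node_row_ptr" and "attrs" (with "shape") fields.
with open("graph.json") as f:
    graph_json = f.read()

# create() reaches GraphExecutorCreate on the C++ side, which runs
# exec->Init -> this->Load(&reader) -> GraphExecutor::SetupStorage().
module = graph_executor.create(graph_json, lib, dev)

# Once SetupStorage has allocated the tensor memory, the module can be run.
module.set_input("input", np.zeros((1, 3, 224, 224), dtype="float32"))
module.run()
out = module.get_output(0).numpy()
print(out.shape)
```

Note that newer TVM tutorials usually reach the same C++ path through `graph_executor.GraphModule(lib["default"](dev))`, but calling `create()` with an explicit graph JSON string matches the trace above more directly.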