
onnx-mlir --EmitJNI failed on windows, run wrong command #2580

Open

armgong opened this issue Oct 24, 2023 · 3 comments · May be fixed by #2582

Comments


armgong commented Oct 24, 2023

Ran a build on Windows today. The build succeeded, and --EmitObj and --EmitLib also work, but --EmitJNI fails: onnx-mlir runs the wrong command on Windows:

 lib.exe x D:\onnx\onnx-mlir\build\Release\lib/libjniruntime.a jnidummy.c.o

On Linux the corresponding command is:

/usr/bin/ar: ar x /src/onnx-mlir/buildr/Release/lib/libjniruntime.a jnidummy.c.o

On Windows, when running --EmitJNI, onnx-mlir only substitutes lib for ar, but that is wrong: lib.exe does not understand ar's x syntax (see the LNK1181 error below). The file names are also wrong; they should be jniruntime.lib and jnidummy.c.obj.
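For reference, an equivalent extraction on Windows would presumably use lib.exe's /EXTRACT and /OUT options rather than ar's x syntax; the exact member name stored inside the archive is an assumption here:

 lib.exe /EXTRACT:jnidummy.c.obj /OUT:jnidummy.c.obj D:\onnx\onnx-mlir\build\Release\lib\jniruntime.lib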

This is the detailed output of --EmitJNI on Windows:

D:\onnx\onnx-mlir\build\Release\bin>onnx-mlir.exe --EmitJNI -v add.onnx
The ONNX model has 0 elements in its initializers. This value would be close to and greater than the number of parameters in the model. Because there is no way to exactly count the number of parameters, this value can be used to have a rough idea of the number of parameters in the model.
[D:\onnx\onnx-mlir\build\Release\bin\]D:/onnx/llvm-project/build/bin/opt.exe: opt.exe -O0 --mtriple=x86_64-pc-windows-msvc --code-model small -o add.bc add.unoptimized.bc
[D:\onnx\onnx-mlir\build\Release\bin\]D:/onnx/llvm-project/build/bin/llc.exe: llc.exe -O0 --mtriple=x86_64-pc-windows-msvc --code-model small -filetype=obj -relocation-model=pic -o add.obj add.bc
[D:\onnx\onnx-mlir\build\Release\bin\.]C:/Program Files/Microsoft Visual Studio/2022/Community/VC/Tools/MSVC/14.37.32822/bin/Hostx64/x64/lib.exe: lib.exe x D:\onnx\onnx-mlir\build\Release\lib/libjniruntime.a jnidummy.c.o
Microsoft (R) Library Manager Version 14.37.32824.0
Copyright (C) Microsoft Corporation.  All rights reserved.

LINK : fatal error LNK1181: 无法打开输入文件“x” (cannot open input file 'x')
lib.exe x D:\onnx\onnx-mlir\build\Release\lib/libjniruntime.a jnidummy.c.o
Error message:
Program path: C:/Program Files/Microsoft Visual Studio/2022/Community/VC/Tools/MSVC/14.37.32822/bin/Hostx64/x64/lib.exe
Command execution failed.

For comparison, on Linux --EmitJNI works fine:

[root@uos-compiler bin]# ./onnx-mlir --EmitJNI -v mnist.onnx
The ONNX model has 26506 elements in its initializers. This value would be close to and greater than the number of parameters in the model. Because there is no way to exactly count the number of parameters, this value can be used to have a rough idea of the number of parameters in the model.
[/src/onnx-mlir/buildr/Release/bin/]/src/llvm-project/build/bin/opt: opt -O0 --mtriple=x86_64-unknown-linux-gnu --code-model small -o mnist.bc mnist.unoptimized.bc
[/src/onnx-mlir/buildr/Release/bin/]/src/llvm-project/build/bin/llc: llc -O0 --mtriple=x86_64-unknown-linux-gnu --code-model small -filetype=obj -relocation-model=pic -o mnist.o mnist.bc
[/src/onnx-mlir/buildr/Release/bin/.]/usr/bin/ar: ar x /src/onnx-mlir/buildr/Release/lib/libjniruntime.a jnidummy.c.o
[/src/onnx-mlir/buildr/Release/bin/]/usr/local/bin/c++: c++ -z noexecstack mnist.o ./jnidummy.c.o -o ./libmodel.so -shared -fPIC -L/src/onnx-mlir/buildr/Release/lib -ljniruntime -lcruntime
[/src/onnx-mlir/buildr/Release/bin/]/usr/bin/jar: jar uf mnist.jar -C . libmodel.so
JNI archive mnist.jar has been compiled.

armgong commented Oct 24, 2023

Maybe this function is the root of the problem?

static int genJniObject(const mlir::OwningOpRef<ModuleOp> &module,
    std::string jniSharedLibPath, std::string jniObjPath) {
  Command ar(/*exePath=*/getToolPath("ar", true));
  int rc = ar.appendStr("x")
               // old version of ar does not support --output so comment out
               // for now and use the optional wdir for exec() to get around
               // the problem.
               //.appendStr("--output")
               //.appendStr(llvm::sys::path::parent_path(jniObjPath).str())
               .appendStr(jniSharedLibPath)
               .appendStr(llvm::sys::path::filename(jniObjPath).str())
               .exec(llvm::sys::path::parent_path(jniObjPath).str());
  return rc != 0 ? CompilerFailureInGenJniObj : CompilerSuccess;
}
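For what it's worth, a minimal sketch of a Windows-aware variant might look like the following. It assumes getToolPath can resolve the MSVC librarian under the name "lib", that lib.exe's /EXTRACT: and /OUT: options are acceptable here, and that the member name stored in jniruntime.lib matches the object file name; none of that is verified against the actual build.

static int genJniObject(const mlir::OwningOpRef<ModuleOp> &module,
    std::string jniSharedLibPath, std::string jniObjPath) {
#ifdef _WIN32
  // Hypothetical Windows branch: the MSVC librarian uses /EXTRACT:member
  // and /OUT:file rather than ar's "x" syntax. Assumes the member name
  // inside the archive matches the object file name.
  std::string objName = llvm::sys::path::filename(jniObjPath).str();
  Command lib(/*exePath=*/getToolPath("lib", true));
  int rc = lib.appendStr("/EXTRACT:" + objName)
               .appendStr("/OUT:" + objName)
               .appendStr(jniSharedLibPath)
               .exec(llvm::sys::path::parent_path(jniObjPath).str());
#else
  Command ar(/*exePath=*/getToolPath("ar", true));
  int rc = ar.appendStr("x")
               .appendStr(jniSharedLibPath)
               .appendStr(llvm::sys::path::filename(jniObjPath).str())
               .exec(llvm::sys::path::parent_path(jniObjPath).str());
#endif
  return rc != 0 ? CompilerFailureInGenJniObj : CompilerSuccess;
}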

gongsu832 (Collaborator) commented

--EmitJNI was never properly implemented for Windows, since we have neither the expertise nor a test environment for it.


armgong commented Oct 25, 2023

I wrote a patch for this issue and created a pull request, so closing this now.

tungld linked a pull request Oct 25, 2023 that will close this issue