This application is a reference implementation showing developers how to use the Java API; it can also be used to quickly check the recognition accuracy. The Java API is a wrapper around the C++ API defined at https://www.doubango.org/SDKs/micr/docs/cpp-api.html.
The application accepts a path to a JPEG/PNG/BMP file as input. This is not the recommended way to use the API: in production you should read frames directly from the camera and feed the SDK the uncompressed YUV data without saving it to a file or converting it to RGB.
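For reference, below is a minimal sketch of what feeding a raw camera frame to the engine could look like in Java. The class, method and constant names (UltMicrSdkEngine, UltMicrSdkResult, ULTMICR_SDK_IMAGE_TYPE, the package import and the json() accessor) are assumptions modeled on this sample's naming conventions; check Recognizer.java and the C++ API reference for the exact signatures.

// Hedged sketch: processing an uncompressed YUV frame directly, with no file I/O
// and no RGB conversion. All SDK identifiers below are assumptions; verify them
// against Recognizer.java before reusing this code.
import java.nio.ByteBuffer;
// import org.doubango.ultimateMicr.Sdk.*; // assumed package name for the Java binding

public class YuvFeedSketch {
    public static void main(String[] args) throws Exception {
        // Initialize once with a JSON config (assets folder, MICR format, license token...).
        UltMicrSdkEngine.init("{\"assets_folder\": \"../../../assets\", \"format\": \"e13b\"}");

        final int width = 1280, height = 720;
        // In a real application this buffer would come straight from the camera callback.
        final ByteBuffer yuvData = ByteBuffer.allocateDirect(width * height * 3 / 2);

        // Process the raw frame as-is.
        final UltMicrSdkResult result = UltMicrSdkEngine.process(
                ULTMICR_SDK_IMAGE_TYPE.ULTMICR_SDK_IMAGE_TYPE_YUV420P,
                yuvData, width, height);
        System.out.println("Result: " + result.json());

        // Release the engine when done.
        UltMicrSdkEngine.deInit();
    }
}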
If you don't want to build this sample and are looking for a quick way to check the accuracy, try our cloud-based solution at https://www.doubango.org/webapps/micr/.
This sample is open source and doesn't require registration or license key.
If you don't want to build this sample yourself, use the pre-built C++ versions:
- Windows: recognizer.exe under binaries/windows/x86_64
- Linux: recognizer under binaries/linux/x86_64. Built on Ubuntu 18. You'll need to download libtensorflow.so as explained here
- Raspberry Pi: recognizer under binaries/raspbian/armv7l
- Android: check android folder
- iOS: check ios folder
On Windows, the easiest way to try this sample is to navigate to binaries/windows/x86_64 and run binaries/windows/x86_64/recognizer.bat. You can edit this file to use your own images and configuration options.
This sample contains a single Java source file.
You have to navigate to the current folder (ultimateMICR-SDK/samples/java/recognizer) before trying the next commands:
cd ultimateMICR-SDK/samples/java/recognizer
Here is how to build the file using javac:
javac @sources.txt -d .
Recognizer is a command-line application with the following usage:
Recognizer \
--image <path-to-image-with-micr-to-process> \
[--assets <path-to-assets-folder>] \
[--format <format-for-detection:e13b/cmc7/e13b+cmc7>] \
[--tokenfile <path-to-license-token-file>] \
[--tokendata <base64-license-token-data>]
Options surrounded with [] are optional.
--image
Path to the image (JPEG/PNG/BMP) to process. You can use the default image at ../../../assets/images/e13b_1280x720.jpg.
--assets
Path to the assets folder containing the configuration files and models. Default value is the current folder.
--format
Defines the MICR format to enable for the detection. Use e13b to look for E-13B lines only and cmc7 for CMC-7 lines only. To look for both, use e13b+cmc7. For performance reasons you should not use e13b+cmc7 unless you really expect the document to contain both E-13B and CMC-7 lines. Default: e13b+cmc7.
--tokenfile
Path to the file containing the base64 license token, if you have one. If not provided, the application will act like a trial version. Default: null.
--tokendata
Base64 license token, if you have one. If not provided, the application will act like a trial version. Default: null.
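For context, these command-line options typically end up in the JSON configuration string passed to the engine at initialization. The sketch below illustrates one way that mapping could be written; the JSON field names ("assets_folder", "format", "license_token_file", "license_token_data") and the UltMicrSdkEngine.init call mentioned in the comment are assumptions and should be verified against Recognizer.java and the C++ API documentation.

// Hedged sketch: folding the CLI options into a JSON config string for the engine.
// Field names are assumptions; verify them against Recognizer.java.
public class ConfigSketch {
    static String buildJsonConfig(String assetsFolder, String format,
                                  String tokenFile, String tokenData) {
        final StringBuilder config = new StringBuilder("{");
        config.append("\"assets_folder\": \"").append(assetsFolder).append("\"");
        config.append(", \"format\": \"").append(format).append("\"");
        if (tokenFile != null) {
            config.append(", \"license_token_file\": \"").append(tokenFile).append("\"");
        }
        if (tokenData != null) {
            config.append(", \"license_token_data\": \"").append(tokenData).append("\"");
        }
        return config.append("}").toString();
    }

    public static void main(String[] args) {
        // Hypothetical usage mirroring the defaults described above; the resulting
        // string would then be passed to something like UltMicrSdkEngine.init(...).
        System.out.println(buildJsonConfig("../../../assets", "e13b+cmc7", null, null));
    }
}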
You'll need to build the sample as explained above.
You have to navigate to the current folder (ultimateMICR-SDK/samples/java/recognizer) before trying the next commands:
cd ultimateMICR-SDK/samples/java/recognizer
- For example, on Raspberry Pi you may call the recognizer application using the following command:
LD_LIBRARY_PATH=../../../binaries/raspbian/armv7l:$LD_LIBRARY_PATH \
java Recognizer --image ../../../assets/images/e13b_1280x720.jpg --format e13b --assets ../../../assets
- On Linux x86_64, you may use the next command:
LD_LIBRARY_PATH=../../../binaries/linux/x86_64:$LD_LIBRARY_PATH \
java Recognizer --image ../../../assets/images/e13b_1280x720.jpg --format e13b --assets ../../../assets
Before trying to run the program, you'll need to download libtensorflow.so as explained here
- On Windows x86_64, you may use the next command:
setlocal
set PATH=%PATH%;../../../binaries/windows/x86_64
java Recognizer --image ../../../assets/images/e13b_1280x720.jpg --format e13b --assets ../../../assets
endlocal
Please note that if you're cross-compiling the application, you have to make sure to copy the application and both the assets and binaries folders to the target device.