22 | 22 | - [Build Steps](#build-steps-2)
23 | 23 | - [Additional Build Options](#additional-build-options-3)
24 | 24 | - [Use Custom OpenCV Builds for Inference Engine](#use-custom-opencv-builds-for-inference-engine)
| 25 | +- [Adding Inference Engine to your project](#adding-inference-engine-to-your-project)
25 | 26 | - [(Optional) Additional Installation Steps for the Intel® Movidius™ Neural Compute Stick and Neural Compute Stick 2](#optional-additional-installation-steps-for-the-intel-movidius-neural-compute-stick-and-neural-compute-stick-2)
26 | 27 | - [For Linux, Raspbian Stretch* OS](#for-linux-raspbian-stretch-os)
27 | 28 | - [For Windows](#for-windows-1)
@@ -62,7 +63,13 @@ The software was validated on:
62 | 63 | git submodule init
63 | 64 | git submodule update --recursive
64 | 65 | ```
65 | | -2. Install build dependencies using the `install_dependencies.sh` script in the project root folder.
| 66 | +2. Install build dependencies using the `install_dependencies.sh` script in the project root folder:
| 67 | + ```sh
| 68 | + chmod +x install_dependencies.sh
| 69 | + ```
| 70 | + ```sh
| 71 | + ./install_dependencies.sh
| 72 | + ```
66 | 73 | 3. By default, the build enables the Inference Engine GPU plugin to infer models on your Intel® Processor Graphics. This requires you to [Install Intel® Graphics Compute Runtime for OpenCL™ Driver package 19.04.12237](https://github.com/intel/compute-runtime/releases/tag/19.04.12237) before running the build. If you don't want to use the GPU plugin, use the `-DENABLE_CLDNN=OFF` CMake build option and skip the installation of the Intel® Graphics Compute Runtime for OpenCL™ Driver.
67 | 74 | 4. Create a build folder:
68 | 75 | ```sh
@@ -90,33 +97,20 @@ You can use the following additional build options:
90 | 97 |
91 | 98 | - If the CMake-based build script cannot find and download the OpenCV package that is supported on your platform, or if you want to use a custom build of the OpenCV library, refer to the [Use Custom OpenCV Builds](#use-custom-opencv-builds-for-inference-engine) section for details.
92 | 99 |
93 | | -- To build the Python API wrapper, use the `-DENABLE_PYTHON=ON` option. To specify an exact Python version, use the following options:
94 | | - ```sh
95 | | - -DPYTHON_EXECUTABLE=`which python3.7` \
96 | | - -DPYTHON_LIBRARY=/usr/lib/x86_64-linux-gnu/libpython3.7m.so \
97 | | - -DPYTHON_INCLUDE_DIR=/usr/include/python3.7
98 | | - ```
| 100 | +- To build the Python API wrapper:
| 101 | + 1. Install all additional packages listed in the `/inference-engine/ie_bridges/python/requirements.txt` file:
| 102 | + ```sh
| 103 | + pip install -r requirements.txt
| 104 | + ```
| 105 | + 2. Use the `-DENABLE_PYTHON=ON` option. To specify an exact Python version, use the following options:
| 106 | + ```sh
| 107 | + -DPYTHON_EXECUTABLE=`which python3.7` \
| 108 | + -DPYTHON_LIBRARY=/usr/lib/x86_64-linux-gnu/libpython3.7m.so \
| 109 | + -DPYTHON_INCLUDE_DIR=/usr/include/python3.7
| 110 | + ```
99 | 111 |
100 | 112 | - To switch the CPU and GPU plugins on or off, use the `cmake` options `-DENABLE_MKL_DNN=ON/OFF` and `-DENABLE_CLDNN=ON/OFF`, respectively.
101 | 113 |
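As an illustration, several of the options above can be combined into a single configure command. This is only a sketch: the `Release` build type, the relative source path `..`, and the exact Python 3.7 paths are assumptions that depend on your environment.

```sh
# Run from inside the build folder created earlier; adjust paths for your system.
# ENABLE_CLDNN=OFF skips the GPU plugin, ENABLE_PYTHON=ON builds the Python API wrapper.
cmake -DCMAKE_BUILD_TYPE=Release \
      -DENABLE_MKL_DNN=ON \
      -DENABLE_CLDNN=OFF \
      -DENABLE_PYTHON=ON \
      -DPYTHON_EXECUTABLE=`which python3.7` \
      -DPYTHON_LIBRARY=/usr/lib/x86_64-linux-gnu/libpython3.7m.so \
      -DPYTHON_INCLUDE_DIR=/usr/include/python3.7 \
      ..
```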
102 | | -5. Adding to your project
103 | | -
104 | | - For CMake projects, set an environment variable `InferenceEngine_DIR`:
105 | | -
106 | | - ```sh
107 | | - export InferenceEngine_DIR=/path/to/dldt/inference-engine/build/
108 | | - ```
109 | | -
110 | | - Then you can find Inference Engine by `find_package`:
111 | | -
112 | | - ```cmake
113 | | - find_package(InferenceEngine)
114 | | -
115 | | - include_directories(${InferenceEngine_INCLUDE_DIRS})
116 | | -
117 | | - target_link_libraries(${PROJECT_NAME} ${InferenceEngine_LIBRARIES} dl)
118 | | - ```
119 | | -
120 | 114 | ## Build for Raspbian Stretch* OS
121 | 115 |
122 | 116 | > **NOTE**: Only the MYRIAD plugin is supported.
@@ -371,7 +365,13 @@ The software was validated on:
371 | 365 | git submodule init
372 | 366 | git submodule update --recursive
373 | 367 | ```
374 | | -2. Install build dependencies using the `install_dependencies.sh` script in the project root folder.
| 368 | +2. Install build dependencies using the `install_dependencies.sh` script in the project root folder:
| 369 | + ```sh
| 370 | + chmod +x install_dependencies.sh
| 371 | + ```
| 372 | + ```sh
| 373 | + ./install_dependencies.sh
| 374 | + ```
375 | 375 | 3. Create a build folder:
376 | 376 | ```sh
377 | 377 | mkdir build
@@ -419,6 +419,22 @@ After you got the built OpenCV library, perform the following preparation steps
419 | 419 | 1. Set the `OpenCV_DIR` environment variable to the directory where the `OpenCVConfig.cmake` file of your custom OpenCV build is located.
420 | 420 | 2. Disable automatic downloading of the package by using the `-DENABLE_OPENCV=OFF` option for the CMake-based build script for the Inference Engine.
421 | 421 |
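For illustration, with a hypothetical custom build installed under `/opt/custom-opencv`, these two steps might look as follows (pass `-DENABLE_OPENCV=OFF` alongside your other CMake options):

```sh
# Point OpenCV_DIR at the folder that contains OpenCVConfig.cmake of the custom build
export OpenCV_DIR=/opt/custom-opencv/cmake
# Disable the automatic OpenCV download when configuring the Inference Engine
cmake -DENABLE_OPENCV=OFF ..
```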
| 422 | +## Adding Inference Engine to your project
| 423 | +
| 424 | +For CMake projects, set the `InferenceEngine_DIR` environment variable:
| 425 | +
| 426 | +```sh
| 427 | +export InferenceEngine_DIR=/path/to/dldt/inference-engine/build/
| 428 | +```
| 429 | +
| 430 | +Then you can find the Inference Engine with `find_package`:
| 431 | +
| 432 | +```cmake
| 433 | +find_package(InferenceEngine)
| 434 | +include_directories(${InferenceEngine_INCLUDE_DIRS})
| 435 | +target_link_libraries(${PROJECT_NAME} ${InferenceEngine_LIBRARIES} dl)
| 436 | +```
| 437 | +
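To show how these calls fit together, here is a minimal `CMakeLists.txt` sketch for a hypothetical application target named `ie_sample` (the target and source file names are illustrative, not part of the original instructions):

```cmake
cmake_minimum_required(VERSION 3.5)
project(ie_sample)

# Relies on InferenceEngine_DIR being exported as shown above
find_package(InferenceEngine REQUIRED)
include_directories(${InferenceEngine_INCLUDE_DIRS})

add_executable(ie_sample main.cpp)
target_link_libraries(ie_sample ${InferenceEngine_LIBRARIES} dl)
```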
422 | 438 | ## (Optional) Additional Installation Steps for the Intel® Movidius™ Neural Compute Stick and Neural Compute Stick 2
423 | 439 |
424 | 440 | > **NOTE**: These steps are only required if you want to perform inference on Intel® Movidius™ Neural Compute Stick or the Intel® Neural Compute Stick 2 using the Inference Engine MYRIAD Plugin. See also [Intel® Neural Compute Stick 2 Get Started](https://software.intel.com/en-us/neural-compute-stick/get-started)
@@ -461,7 +477,7 @@ For Intel® Movidius™ Neural Compute Stick and Intel® Neural Compute Stick 2,
461 | 477 | 1. Go to the `<DLDT_ROOT_DIR>/inference-engine/thirdparty/movidius/MovidiusDriver` directory, where the `DLDT_ROOT_DIR` is the directory to which the DLDT repository was cloned.
462 | 478 | 2. Right-click the `Movidius_VSC_Device.inf` file and choose **Install** from the pop-up menu.
463 | 479 |
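Alternatively, on Windows 10 and later the same driver can usually be installed from an elevated command prompt with the built-in `pnputil` tool; this sketch assumes the `MovidiusDriver` directory from step 1 is the current directory:

```sh
pnputil /add-driver Movidius_VSC_Device.inf /install
```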
464 | | -You have installed the driver for your Intel® Movidius™ Neural Compute Stick or Intel® Neural Compute Stick 2.
| 480 | +You have installed the driver for your Intel® Movidius™ Neural Compute Stick or Intel® Neural Compute Stick 2.
465 | 481 |
466 | 482 | ## Next Steps
467 | 483 |