
Commit 6dfc778

Author: Alexey Suhov
Publishing 2019 R3.1 content
1 parent 1798ac0 commit 6dfc778

33 files changed: +635, -48 lines

inference-engine/README.md

Lines changed: 43 additions & 27 deletions
@@ -22,6 +22,7 @@
 - [Build Steps](#build-steps-2)
 - [Additional Build Options](#additional-build-options-3)
 - [Use Custom OpenCV Builds for Inference Engine](#use-custom-opencv-builds-for-inference-engine)
+- [Adding Inference Engine to your project](#adding-inference-engine-to-your-project)
 - [(Optional) Additional Installation Steps for the Intel® Movidius™ Neural Compute Stick and Neural Compute Stick 2](#optional-additional-installation-steps-for-the-intel-movidius-neural-compute-stick-and-neural-compute-stick-2)
 - [For Linux, Raspbian Stretch* OS](#for-linux-raspbian-stretch-os)
 - [For Windows](#for-windows-1)
@@ -62,7 +63,13 @@ The software was validated on:
    git submodule init
    git submodule update --recursive
    ```
-2. Install build dependencies using the `install_dependencies.sh` script in the project root folder.
+2. Install build dependencies using the `install_dependencies.sh` script in the project root folder:
+   ```sh
+   chmod +x install_dependencies.sh
+   ```
+   ```sh
+   ./install_dependencies.sh
+   ```
 3. By default, the build enables the Inference Engine GPU plugin to infer models on your Intel® Processor Graphics. This requires you to [Install Intel® Graphics Compute Runtime for OpenCL™ Driver package 19.04.12237](https://github.com/intel/compute-runtime/releases/tag/19.04.12237) before running the build. If you don't want to use the GPU plugin, use the `-DENABLE_CLDNN=OFF` CMake build option and skip the installation of the Intel® Graphics Compute Runtime for OpenCL™ Driver.
 4. Create a build folder:
    ```sh
@@ -90,33 +97,20 @@ You can use the following additional build options:
 
 - If the CMake-based build script cannot find and download the OpenCV package that is supported on your platform, or if you want to use a custom build of the OpenCV library, refer to the [Use Custom OpenCV Builds](#use-custom-opencv-builds-for-inference-engine) section for details.
 
-- To build the Python API wrapper, use the `-DENABLE_PYTHON=ON` option. To specify an exact Python version, use the following options:
-  ```sh
-  -DPYTHON_EXECUTABLE=`which python3.7` \
-  -DPYTHON_LIBRARY=/usr/lib/x86_64-linux-gnu/libpython3.7m.so \
-  -DPYTHON_INCLUDE_DIR=/usr/include/python3.7
-  ```
+- To build the Python API wrapper:
+  1. Install all additional packages listed in the `/inference-engine/ie_bridges/python/requirements.txt` file:
+     ```sh
+     pip install -r requirements.txt
+     ```
+  2. Use the `-DENABLE_PYTHON=ON` option. To specify an exact Python version, use the following options:
+     ```sh
+     -DPYTHON_EXECUTABLE=`which python3.7` \
+     -DPYTHON_LIBRARY=/usr/lib/x86_64-linux-gnu/libpython3.7m.so \
+     -DPYTHON_INCLUDE_DIR=/usr/include/python3.7
+     ```
 
 - To switch the CPU and GPU plugins off or on, use the `cmake` options `-DENABLE_MKL_DNN=ON/OFF` and `-DENABLE_CLDNN=ON/OFF` respectively.
 
-5. Adding to your project
-
-   For CMake projects, set an environment variable `InferenceEngine_DIR`:
-
-   ```sh
-   export InferenceEngine_DIR=/path/to/dldt/inference-engine/build/
-   ```
-
-   Then you can find Inference Engine by `find_package`:
-
-   ```cmake
-   find_package(InferenceEngine)
-
-   include_directories(${InferenceEngine_INCLUDE_DIRS})
-
-   target_link_libraries(${PROJECT_NAME} ${InferenceEngine_LIBRARIES} dl)
-   ```
-
 ## Build for Raspbian Stretch* OS
 
 > **NOTE**: Only the MYRIAD plugin is supported.
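Taken together, the dependency, Python, and plugin options in the hunk above amount to a single configure-and-build pass. The following is a minimal sketch, not part of the commit: the `Release` build type and the Debian-style Python 3.7 paths are assumptions to adapt to your system.

```sh
# Sketch: configure and build with the Python API wrapper enabled and the
# CPU (MKL-DNN) and GPU (clDNN) plugins switched on. The Python paths are
# examples for a Debian-style layout; adjust them for your distribution.
cd dldt/inference-engine
pip install -r ie_bridges/python/requirements.txt
mkdir -p build && cd build
cmake -DCMAKE_BUILD_TYPE=Release \
      -DENABLE_MKL_DNN=ON \
      -DENABLE_CLDNN=ON \
      -DENABLE_PYTHON=ON \
      -DPYTHON_EXECUTABLE=`which python3.7` \
      -DPYTHON_LIBRARY=/usr/lib/x86_64-linux-gnu/libpython3.7m.so \
      -DPYTHON_INCLUDE_DIR=/usr/include/python3.7 ..
make -j$(nproc)
```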
@@ -371,7 +365,13 @@ The software was validated on:
    git submodule init
    git submodule update --recursive
    ```
-2. Install build dependencies using the `install_dependencies.sh` script in the project root folder.
+2. Install build dependencies using the `install_dependencies.sh` script in the project root folder:
+   ```sh
+   chmod +x install_dependencies.sh
+   ```
+   ```sh
+   ./install_dependencies.sh
+   ```
 3. Create a build folder:
    ```sh
    mkdir build
@@ -419,6 +419,22 @@ After you got the built OpenCV library, perform the following preparation steps
 1. Set the `OpenCV_DIR` environment variable to the directory where the `OpenCVConfig.cmake` file of your custom OpenCV build is located.
 2. Disable automatic package downloading by using the `-DENABLE_OPENCV=OFF` option for the CMake-based build script for Inference Engine.
 
+## Adding Inference Engine to your project
+
+For CMake projects, set the `InferenceEngine_DIR` environment variable:
+
+```sh
+export InferenceEngine_DIR=/path/to/dldt/inference-engine/build/
+```
+
+Then you can find Inference Engine by `find_package`:
+
+```cmake
+find_package(InferenceEngine)
+include_directories(${InferenceEngine_INCLUDE_DIRS})
+target_link_libraries(${PROJECT_NAME} ${InferenceEngine_LIBRARIES} dl)
+```
+
 ## (Optional) Additional Installation Steps for the Intel® Movidius™ Neural Compute Stick and Neural Compute Stick 2
 
 > **NOTE**: These steps are only required if you want to perform inference on Intel® Movidius™ Neural Compute Stick or the Intel® Neural Compute Stick 2 using the Inference Engine MYRIAD Plugin. See also [Intel® Neural Compute Stick 2 Get Started](https://software.intel.com/en-us/neural-compute-stick/get-started)
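For the custom OpenCV preparation steps in the hunk above, the two actions reduce to a pair of commands. A sketch, with an assumed install location standing in for your actual OpenCV build directory:

```sh
# Sketch: point the Inference Engine build at a custom OpenCV and disable the
# automatic download. /opt/custom-opencv/cmake is a placeholder for the
# directory that contains the OpenCVConfig.cmake file of your build.
export OpenCV_DIR=/opt/custom-opencv/cmake
cmake -DENABLE_OPENCV=OFF ..
```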
@@ -461,7 +477,7 @@ For Intel® Movidius™ Neural Compute Stick and Intel® Neural Compute Stick 2,
 1. Go to the `<DLDT_ROOT_DIR>/inference-engine/thirdparty/movidius/MovidiusDriver` directory, where the `DLDT_ROOT_DIR` is the directory to which the DLDT repository was cloned.
 2. Right click on the `Movidius_VSC_Device.inf` file and choose **Install** from the pop-up menu.
 
-You have installed the driver for your Intel® Movidius™ Neural Compute Stick or Intel® Neural Compute Stick 2.
+You have installed the driver for your Intel® Movidius™ Neural Compute Stick or Intel® Neural Compute Stick 2.
 
 ## Next Steps
 
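Putting the new "Adding Inference Engine to your project" section above to work, a complete minimal `CMakeLists.txt` might look as follows. This is an illustration only, not part of the commit; the project name `ie_sample` and the `main.cpp` source are placeholders.

```cmake
# Sketch: a minimal consumer project for the Inference Engine CMake package.
# Assumes InferenceEngine_DIR was exported as shown in the section above.
cmake_minimum_required(VERSION 3.5)
project(ie_sample)

find_package(InferenceEngine REQUIRED)

add_executable(${PROJECT_NAME} main.cpp)
include_directories(${InferenceEngine_INCLUDE_DIRS})
target_link_libraries(${PROJECT_NAME} ${InferenceEngine_LIBRARIES} dl)
```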
inference-engine/include/builders/ie_layer_decorator.hpp

Lines changed: 4 additions & 0 deletions
@@ -9,6 +9,10 @@
 #include <vector>
 
 namespace InferenceEngine {
+
+/**
+ * @brief Neural network builder API
+ */
 namespace Builder {
 
 /**

inference-engine/include/cldnn/cldnn_config.hpp

Lines changed: 3 additions & 0 deletions
@@ -15,6 +15,9 @@
 
 namespace InferenceEngine {
 
+/**
+ * @brief GPU plugin configuration
+ */
 namespace CLDNNConfigParams {
 
 /**

inference-engine/include/dlia/dlia_config.hpp

Lines changed: 6 additions & 0 deletions
@@ -16,6 +16,9 @@
 
 namespace InferenceEngine {
 
+/**
+ * @brief DLIA plugin metrics
+ */
 namespace DliaMetrics {
 
 /**
@@ -37,6 +40,9 @@ DECLARE_DLIA_METRIC_VALUE(INPUT_STREAMING);
 
 } // namespace DliaMetrics
 
+/**
+ * @brief DLIA plugin configuration
+ */
 namespace DLIAConfigParams {
 
 /**

inference-engine/include/gna/gna_config.hpp

Lines changed: 13 additions & 2 deletions
@@ -3,10 +3,10 @@
 //
 
 /**
- * @brief A header that defines advanced related properties for VPU plugins.
+ * @brief A header that defines advanced related properties for the GNA plugin.
  * These properties should be used in SetConfig() and LoadNetwork() methods of plugins
  *
- * @file vpu_plugin_config.hpp
+ * @file gna_config.hpp
  */
 
 #pragma once
@@ -16,9 +16,20 @@
 
 namespace InferenceEngine {
 
+/**
+ * @brief GNA plugin configuration
+ */
 namespace GNAConfigParams {
 
+/**
+ * @def GNA_CONFIG_KEY(name)
+ * @brief Shortcut for defining configuration keys
+ */
 #define GNA_CONFIG_KEY(name) InferenceEngine::GNAConfigParams::_CONFIG_KEY(GNA_##name)
+/**
+ * @def GNA_CONFIG_VALUE(name)
+ * @brief Shortcut for defining configuration values
+ */
 #define GNA_CONFIG_VALUE(name) InferenceEngine::GNAConfigParams::GNA_##name
 
 #define DECLARE_GNA_CONFIG_KEY(name) DECLARE_CONFIG_KEY(GNA_##name)
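The two shortcut macros documented above only paste together key and value identifiers declared elsewhere in this header. As an illustration (not part of the commit), assuming the `DEVICE_MODE` key and `SW_EXACT` value that the GNA plugin declares through `DECLARE_GNA_CONFIG_KEY`/`DECLARE_GNA_CONFIG_VALUE`, a configuration map could be assembled like this:

```cpp
// Sketch: building a GNA plugin configuration with the shortcut macros.
// DEVICE_MODE and SW_EXACT are assumed declarations from this header; the
// resulting map is suitable for the SetConfig() / LoadNetwork() calls that
// the file comment mentions.
#include <map>
#include <string>
#include <gna/gna_config.hpp>

std::map<std::string, std::string> BuildGnaConfig() {
    return {
        // GNA_CONFIG_KEY(DEVICE_MODE) expands to the "GNA_DEVICE_MODE" key;
        // GNA_CONFIG_VALUE(SW_EXACT) expands to the "GNA_SW_EXACT" value.
        {GNA_CONFIG_KEY(DEVICE_MODE), GNA_CONFIG_VALUE(SW_EXACT)},
    };
}
```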

inference-engine/include/hetero/hetero_plugin_config.hpp

Lines changed: 3 additions & 0 deletions
@@ -18,6 +18,9 @@
 
 namespace InferenceEngine {
 
+/**
+ * @brief Heterogeneous plugin configuration
+ */
 namespace HeteroConfigParams {
 
 /**

inference-engine/include/ie_plugin_config.hpp

Lines changed: 6 additions & 0 deletions
@@ -17,6 +17,9 @@
 
 namespace InferenceEngine {
 
+/**
+ * @brief %Metrics
+ */
 namespace Metrics {
 
 #ifndef DECLARE_METRIC_KEY_IMPL
@@ -144,6 +147,9 @@ DECLARE_EXEC_NETWORK_METRIC_KEY(OPTIMAL_NUMBER_OF_INFER_REQUESTS, unsigned int);
 
 } // namespace Metrics
 
+/**
+ * @brief Generic plugin configuration
+ */
 namespace PluginConfigParams {
 
 /**
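As a sketch of how `PluginConfigParams` keys are consumed (not from this commit; it assumes the pre-existing `KEY_PERF_COUNT` key and `YES` value from this header, applied through `Core::SetConfig`):

```cpp
// Sketch: enabling per-layer performance counters on the CPU plugin with a
// generic configuration key. KEY_PERF_COUNT and YES are assumed members of
// PluginConfigParams, declared further down in this header.
#include <map>
#include <string>
#include <inference_engine.hpp>

int main() {
    InferenceEngine::Core core;
    std::map<std::string, std::string> config = {
        {InferenceEngine::PluginConfigParams::KEY_PERF_COUNT,
         InferenceEngine::PluginConfigParams::YES}};
    core.SetConfig(config, "CPU");  // applies to networks loaded on CPU
    return 0;
}
```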

inference-engine/include/inference_engine.hpp

Lines changed: 3 additions & 0 deletions
@@ -28,6 +28,9 @@
 #include <cpp/ie_executable_network.hpp>
 #include <ie_version.hpp>
 
+/**
+ * @brief Inference Engine API
+ */
 namespace InferenceEngine {
 /**
  * @brief Gets the top n results from a tblob

inference-engine/include/multi-device/multi_device_config.hpp

Lines changed: 3 additions & 0 deletions
@@ -16,6 +16,9 @@
 
 namespace InferenceEngine {
 
+/**
+ * @brief Multi Device plugin configuration
+ */
 namespace MultiDeviceConfigParams {
 
 /**

inference-engine/include/vpu/vpu_plugin_config.hpp

Lines changed: 3 additions & 0 deletions
@@ -37,6 +37,9 @@
 
 namespace InferenceEngine {
 
+/**
+ * @brief VPU plugin configuration
+ */
 namespace VPUConfigParams {
 
 //
