
Install iGPU / Intel integrated GPU drivers for OpenVINO Inference

To enable Intel integrated GPU / iGPU for running deep learning inference with OpenVINO, we need to install the Intel Graphics Compute Runtime for oneAPI Level Zero and OpenCL™ Driver on Linux.

Intel Graphics Compute Runtime:  https://github.com/intel/compute-runtime/releases

See : https://docs.openvino.ai/2024/get-started/configurations/configurations-intel-gpu.html

More info: https://dgpu-docs.intel.com/driver/client/overview.html
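
On Ubuntu, the compute runtime can also be installed from the distribution repositories instead of the GitHub release packages (a shortcut sketch; intel-opencl-icd is the package name on recent Ubuntu releases, so check availability on your distribution):

$ sudo apt install -y intel-opencl-icd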

# After Installing Intel Graphics Compute Runtime



sudo apt install hwinfo clinfo -y
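
Once the compute runtime is installed, clinfo should report the integrated GPU as an OpenCL device (the exact device name depends on your hardware):

$ clinfo | grep -i "device name"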



After installation, add your user to the video and render groups.

sudo usermod -aG video $USER
sudo usermod -aG render $USER

# Log out and log back in for the group changes to take effect
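
After logging back in, confirm the group membership took effect; if an OpenVINO Python package is already installed, you can also check that the GPU plugin sees the device (the one-liner below assumes OpenVINO 2022.1 or newer):

$ groups    # should include video and render
$ python3 -c "from openvino.runtime import Core; print(Core().available_devices)"    # e.g. ['CPU', 'GPU']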

Build OpenVINO from source - Linux

Official instructions: see the build documentation in the OpenVINO GitHub repository.

Build instructions for OpenVINO from Source with the Python API Wrapper

OpenVINO 2022.1.0 and later require GLIBC 2.27 or newer; check your version with `ldd --version`.

These instructions use Python 3.7; to use Python 3.6 instead, adjust the version in the commands below.

Software Requirements:

- CMake 3.13 or higher
- GCC 7.5 or higher to build OpenVINO Runtime
- Python 3.6 or higher for OpenVINO Runtime Python API
- (Optional) Install Intel® Graphics Compute Runtime for OpenCL™ Driver package 19.41.14441 to enable inference on Intel integrated GPUs.
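
Before configuring the build, you can quickly confirm that your toolchain meets these requirements:

$ cmake --version          # expect 3.13 or higher
$ gcc --version            # expect 7.5 or higher
$ python3.7 --version      # Python 3.6 or higher; 3.7 is used here
$ ldd --version | head -n1 # GLIBC version, expect 2.27 or higher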


$ sudo apt-get install python3.7-dev

$ pip install cython numpy

$ cd ~  # OpenVINO will be cloned and built in ~/openvino

$ git clone https://github.com/openvinotoolkit/openvino.git
$ cd openvino
$ git submodule update --init --recursive
$ chmod +x install_dependencies.sh
$ ./install_dependencies.sh
$ mkdir build && cd build

$ cmake -DCMAKE_BUILD_TYPE=Release \
-DENABLE_INTEL_GNA=OFF -DENABLE_INTEL_MYRIAD_COMMON=OFF \
-DENABLE_PYTHON=ON \
-DPYTHON_EXECUTABLE=`which python3.7` \
-DPYTHON_LIBRARY=/usr/lib/x86_64-linux-gnu/libpython3.7m.so \
-DPYTHON_INCLUDE_DIR=/usr/include/python3.7 ..

$ make --jobs=$(nproc --all)

$ export PYTHONPATH=$PYTHONPATH:~/openvino/bin/intel64/Release/lib/python_api/python3.7/
$ export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:~/openvino/bin/intel64/Release/lib/

# OR Install the wheel with PIP
$ pip install ~/openvino/build/wheel/*.whl

# TEST BUILD
$ python3.7
>>> from openvino.inference_engine import IENetwork, IECore
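
With the GPU driver from the first section installed, the freshly built runtime should list GPU alongside CPU; a quick check using the same legacy API:

$ python3.7 -c "from openvino.inference_engine import IECore; print(IECore().available_devices)"    # e.g. ['CPU', 'GPU']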

# Test Benchmark app

$ alias benchmark_app=~/openvino/bin/intel64/Release/benchmark_app 
$ benchmark_app -h
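
As a smoke test, point benchmark_app at any IR model and select the GPU device (the model path below is a placeholder for a model you have converted; use -d CPU to compare against the CPU plugin):

$ benchmark_app -m <path/to/model.xml> -d GPU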
