
Commit

Merge pull request #139 from tommywu052/master
Factory AI vision on WSL2 with GPU enabled.
initmahesh authored Jul 26, 2020
2 parents d5920ad + b7cacb7 commit b2ba47a
Showing 15 changed files with 198 additions and 0 deletions.
35 changes: 35 additions & 0 deletions factory-ai-vision/rtsp-generator/README.md
@@ -0,0 +1,35 @@
<h1 align="center">RTSP Endpoint with Gstreamer 👋</h1>
<p>
How to generate RTSP endpoint with GStreamer.
</p>



## GStreamer Installation

```sh
sudo apt-get install libgstreamer1.0-0 gstreamer1.0-plugins-base gstreamer1.0-plugins-good gstreamer1.0-plugins-bad gstreamer1.0-plugins-ugly gstreamer1.0-libav gstreamer1.0-doc gstreamer1.0-tools
sudo apt-get install gstreamer1.0-plugins-base-apps
sudo apt-get install gir1.2-gst-rtsp-server-1.0
sudo apt-get install python-gst-1.0 python3-gst-1.0
sudo apt-get install python3-opencv

```
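
As an optional sanity check (not part of the original instructions), you can confirm that the GStreamer tools and the Python bindings are visible:

```sh
gst-inspect-1.0 --version
python3 -c "import gi; gi.require_version('Gst', '1.0'); from gi.repository import Gst; Gst.init(None); print(Gst.version_string())"
```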

## Usage

Download `gst-rtsp.py` to the current folder, then edit the `cv2.VideoCapture(...)` call in `SensorFactory.__init__` to point at your own video file or URL:

> - `self.cap = cv2.VideoCapture("your own video file or url")`

Then start the server:

```sh
python3 gst-rtsp.py
```
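
Once running, the script prints the stream URL (the `/test` mount on port 8554, the GstRtspServer default). As a quick check you can open it in VLC, or with `ffplay` if ffmpeg is installed; replace `<host-ip>` with the address printed by `gst-rtsp.py`:

```sh
ffplay rtsp://<host-ip>:8554/test
```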

## Author

👤 **Tommy Wu**

* Github: [@tommywu052](https://github.com/tommywu052)

## Thank you
72 changes: 72 additions & 0 deletions factory-ai-vision/rtsp-generator/gst-rtsp.py
@@ -0,0 +1,72 @@
#!/usr/bin/env python3

import cv2
import gi

gi.require_version('Gst', '1.0')
gi.require_version('GstRtspServer', '1.0')
from gi.repository import Gst, GstRtspServer, GObject

# Resolve the local IP address of eth0 so the stream URL can be printed below.
import netifaces as ni
ip = ni.ifaddresses('eth0')[ni.AF_INET][0]['addr']


class SensorFactory(GstRtspServer.RTSPMediaFactory):
    """RTSP media factory that feeds frames from an OpenCV capture into an appsrc."""

    def __init__(self, **properties):
        super(SensorFactory, self).__init__(**properties)
        # Replace with your own video file or URL (see README).
        self.cap = cv2.VideoCapture("taiwan_culture.mp4")
        self.number_frames = 0
        self.fps = 30
        self.duration = 1 / self.fps * Gst.SECOND  # duration of a frame in nanoseconds
        self.launch_string = 'appsrc name=source is-live=true block=true format=GST_FORMAT_TIME ' \
                             'caps=video/x-raw,format=BGR,width=1280,height=720,framerate={}/1 ' \
                             '! videoconvert ! video/x-raw,format=I420 ' \
                             '! x264enc speed-preset=ultrafast tune=zerolatency ' \
                             '! rtph264pay config-interval=1 name=pay0 pt=96'.format(self.fps)

    def on_need_data(self, src, length):
        # Called whenever the appsrc needs data: read one frame and push it as a buffer.
        if self.cap.isOpened():
            ret, frame = self.cap.read()
            if ret:
                data = frame.tobytes()
                buf = Gst.Buffer.new_allocate(None, len(data), None)
                buf.fill(0, data)
                buf.duration = self.duration
                timestamp = self.number_frames * self.duration
                buf.pts = buf.dts = int(timestamp)
                buf.offset = timestamp
                self.number_frames += 1
                retval = src.emit('push-buffer', buf)
                print('pushed buffer, frame {}, duration {} ns, duration {} s'.format(self.number_frames,
                                                                                      self.duration,
                                                                                      self.duration / Gst.SECOND))
                if retval != Gst.FlowReturn.OK:
                    print(retval)

    def do_create_element(self, url):
        return Gst.parse_launch(self.launch_string)

    def do_configure(self, rtsp_media):
        self.number_frames = 0
        appsrc = rtsp_media.get_element().get_child_by_name('source')
        appsrc.connect('need-data', self.on_need_data)


class GstServer(GstRtspServer.RTSPServer):
    def __init__(self, **properties):
        super(GstServer, self).__init__(**properties)
        self.factory = SensorFactory()
        self.factory.set_shared(True)
        # Serve the stream at rtsp://<ip>:8554/test (8554 is the GstRtspServer default port).
        self.get_mount_points().add_factory("/test", self.factory)
        self.attach(None)


GObject.threads_init()
Gst.init(None)

server = GstServer()
print("Running as - rtsp://" + ip + ":8554/test")

loop = GObject.MainLoop()
loop.run()
Binary file not shown.
78 changes: 78 additions & 0 deletions factory-ai-vision/wsl2/README.md
@@ -0,0 +1,78 @@
# Running Factory AI on WSL2 with GPU Acceleration Enabled


1. WSL2 Environment Setup:

   A. Surface Book 2 with GTX 1060

   B. Windows Insider Preview Build 20150 or higher: [register for the Windows Insider Program](https://insider.windows.com/getting-started/#register).

   C. Download the NVIDIA driver: <https://developer.nvidia.com/45541-gameready-win10-dch-64bit-international>

   D. [Enable WSL 2](https://docs.microsoft.com/en-us/windows/wsl/install-win10) and [install a glibc-based distribution](https://docs.microsoft.com/en-us/windows/wsl/install-win10#install-your-linux-distribution-of-choice) (like Ubuntu or Debian). I chose Ubuntu 18.04.

   E. Follow this [guide](https://docs.nvidia.com/cuda/wsl-user-guide/index.html#setting-containers) to install the NVIDIA container runtime (install it before the IoT Edge installation). A rough sketch of the commands is shown below.
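
A minimal, hedged sketch of step 1E, assuming Ubuntu 18.04 inside WSL2; the linked NVIDIA WSL user guide is the authoritative reference, and the repository and package names may have changed since:

```sh
# Add the nvidia-docker apt repository (assumption: Ubuntu 18.04 distro inside WSL2)
distribution=$(. /etc/os-release; echo $ID$VERSION_ID)
curl -s -L https://nvidia.github.io/nvidia-docker/gpgkey | sudo apt-key add -
curl -s -L https://nvidia.github.io/nvidia-docker/$distribution/nvidia-docker.list | \
    sudo tee /etc/apt/sources.list.d/nvidia-docker.list
sudo apt-get update
# Installs the "nvidia-container-runtime" binary referenced by daemon.json below
sudo apt-get install -y nvidia-docker2
```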

2. Install IoT Edge and configure your device on IoT Hub following the [doc](https://docs.microsoft.com/en-us/azure/iot-edge/how-to-install-iot-edge-linux); a hedged sketch of the commands follows the screenshot below.

![](media/image3.png)
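
A rough sketch of step 2 on Ubuntu 18.04, based on the linked IoT Edge doc; the doc is authoritative, and package feeds and names have changed in later IoT Edge releases:

```sh
# Register the Microsoft package repository (assumption: Ubuntu 18.04 multiarch feed)
curl https://packages.microsoft.com/config/ubuntu/18.04/multiarch/prod.list > ./microsoft-prod.list
sudo cp ./microsoft-prod.list /etc/apt/sources.list.d/
curl https://packages.microsoft.com/keys/microsoft.asc | gpg --dearmor > microsoft.gpg
sudo cp ./microsoft.gpg /etc/apt/trusted.gpg.d/
# Install the Moby container engine and the IoT Edge security daemon
sudo apt-get update
sudo apt-get install -y moby-engine iotedge
# Paste the device connection string from IoT Hub into the config, then restart
sudo nano /etc/iotedge/config.yaml
sudo systemctl restart iotedge
```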

3. Steps to run the AI workload for the Web Module and Inference Module:

   A. Start the distro environment set up above.

   > ![](media/image4.png)

   B. Download [start.sh](https://github.com/tommywu052/azure-intelligent-edge-patterns/blob/master/factory-ai-vision/wsl2/start.sh) into your home folder.

   C. Run `cd ~ && ./start.sh`

   > ![](media/image5.png)

   D. Check your module status:

   > ![](media/image6.png)

   E. Edit /etc/docker/daemon.json (e.g. with nano) to match the daemon.json included in this commit, which makes "nvidia" the default runtime:

   > ![](media/image7.png)

   F. Restart Docker and IoT Edge: `systemctl restart docker && systemctl restart iotedge`. A combined sketch of steps E and F is shown below.
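
A minimal sketch of steps 3E-3F, assuming the daemon.json added in this commit is in the current folder:

```sh
# Make "nvidia" the default container runtime (daemon.json from factory-ai-vision/wsl2)
sudo cp daemon.json /etc/docker/daemon.json
# Restart Docker and IoT Edge so the modules are recreated with the new runtime
sudo systemctl restart docker
sudo systemctl restart iotedge
```
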
4. Testing Process:

   A. Check that the WebModule is running:

      iotedge logs -f WebModule --tail 500

   B. Go to <http://localhost:8080/> and add a new location.

   > ![](media/image9.png)

   C. Choose "Demo Pretrained Detection" and configure it.

   D. Check the configuration panel on the right-hand side for GPU-accelerated inference: around 10-15 ms per frame, compared with roughly 300-500 ms on CPU.

   ![](media/image11.png)

   E. Verify the logs of the inference module: some onnxruntime warnings about the initializer scalepreprocessor/scale may appear, but they are expected.

   > ![](media/image12.png)

   F. Done. Enjoy GPU-accelerated factory AI vision on WSL2!
10 changes: 10 additions & 0 deletions factory-ai-vision/wsl2/daemon.json
@@ -0,0 +1,10 @@
{
    "default-runtime": "nvidia",
    "runtimes": {
        "nvidia": {
            "path": "nvidia-container-runtime",
            "runtimeArgs": []
        }
    },
    "dns": ["8.8.8.8", "8.8.4.4"]
}
1 change: 1 addition & 0 deletions factory-ai-vision/wsl2/media/README.md
@@ -0,0 +1 @@

Binary file added factory-ai-vision/wsl2/media/image11.png
Binary file added factory-ai-vision/wsl2/media/image12.png
Binary file added factory-ai-vision/wsl2/media/image3.png
Binary file added factory-ai-vision/wsl2/media/image4.png
Binary file added factory-ai-vision/wsl2/media/image5.png
Binary file added factory-ai-vision/wsl2/media/image6.png
Binary file added factory-ai-vision/wsl2/media/image7.png
Binary file added factory-ai-vision/wsl2/media/image9.png
2 changes: 2 additions & 0 deletions factory-ai-vision/wsl2/start.sh
@@ -0,0 +1,2 @@
# Start systemd in its own PID/mount namespace (WSL2 does not boot systemd by default)
sudo daemonize /usr/bin/unshare --fork --pid --mount-proc /lib/systemd/systemd --system-unit=basic.target
# Re-enter the systemd namespace as the current user so Docker and IoT Edge services are reachable
exec sudo nsenter -t $(pidof systemd) -a su - $LOGNAME
