Tuesday, July 14, 2015

Getting video from MJPG-streamer and processing it with OpenCV

https://github.com/jacksonliam/mjpg-streamer
Log in to your Pi, go to the /usr/src directory, and create an mjpg-streamer directory:

cd /usr/src
sudo mkdir mjpg-streamer
sudo chown `whoami`:users mjpg-streamer
cd mjpg-streamer

Clone the source from GitHub:
git clone https://github.com/jacksonliam/mjpg-streamer.git

To compile the code, we need to install some library dependencies:
sudo apt-get install libv4l-dev libjpeg8-dev imagemagick build-essential cmake subversion

Next, we need to compile mjpg-streamer:
cd mjpg-streamer-experimental
make

Now we are ready to start streaming video. There are many options you can set; for details, visit the GitHub page linked above.
export LD_LIBRARY_PATH=.
./mjpg_streamer -o "output_http.so -w ./www" -i "input_raspicam.so -x 640 -y 480 -fps 20 -ex night"
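
You should now be able to open http://<your-pi-ip>:8080 in a browser and see the stream. As a quick sanity check from Python, the snippet below grabs a single frame through the streamer's ?action=snapshot endpoint and decodes it with OpenCV. This is only a sketch: it assumes the Pi is at 192.168.0.193 on the default port 8080 (the same address used in the script further down) and a Python 2 / OpenCV 2.x environment.

import cv2
import urllib
import numpy as np

# fetch one JPEG snapshot from mjpg-streamer's HTTP interface
# (adjust the IP address to match your Pi)
resp = urllib.urlopen('http://192.168.0.193:8080/?action=snapshot')
data = np.fromstring(resp.read(), dtype=np.uint8)
img = cv2.imdecode(data, cv2.CV_LOAD_IMAGE_COLOR)
print img.shape  # e.g. (480, 640, 3) with the -x 640 -y 480 settings above
cv2.imwrite('snapshot.jpg', img)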

If you want to process the video further, create a file named rpi-stream.py and paste the script below into it; it grabs the video stream and displays it with OpenCV:
import cv2
import urllib
import numpy as np

# open the MJPEG stream served by mjpg-streamer (adjust the IP to your Pi)
stream = urllib.urlopen('http://192.168.0.193:8080/?action=stream')
bytes = ''
while True:
    # read the stream in chunks and look for a complete JPEG frame
    bytes += stream.read(1024)
    a = bytes.find('\xff\xd8')  # JPEG start-of-image marker
    b = bytes.find('\xff\xd9')  # JPEG end-of-image marker
    if a != -1 and b != -1:
        jpg = bytes[a:b+2]
        bytes = bytes[b+2:]
        # decode the JPEG bytes into an OpenCV image and display it
        i = cv2.imdecode(np.fromstring(jpg, dtype=np.uint8), cv2.CV_LOAD_IMAGE_COLOR)
        cv2.imshow('i', i)
        # press Esc to quit
        if cv2.waitKey(1) == 27:
            exit(0)
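
To go beyond just displaying the frames, you can run any OpenCV operation on each decoded frame inside the loop. The variant below is a minimal sketch that applies Canny edge detection to every frame before showing it; the IP address is the same example address used above, and the edge-detection thresholds (100, 200) are arbitrary example values.

import cv2
import urllib
import numpy as np

# same streaming loop as above, but with Canny edge detection applied
# to each frame before display (press Esc to quit)
stream = urllib.urlopen('http://192.168.0.193:8080/?action=stream')
buf = ''
while True:
    buf += stream.read(1024)
    a = buf.find('\xff\xd8')  # JPEG start-of-image marker
    b = buf.find('\xff\xd9')  # JPEG end-of-image marker
    if a != -1 and b != -1:
        jpg = buf[a:b+2]
        buf = buf[b+2:]
        frame = cv2.imdecode(np.fromstring(jpg, dtype=np.uint8), cv2.CV_LOAD_IMAGE_COLOR)
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        edges = cv2.Canny(gray, 100, 200)
        cv2.imshow('edges', edges)
        if cv2.waitKey(1) == 27:
            break

Any other per-frame processing (face detection, colour tracking, and so on) can be dropped in at the same point in the loop.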

Reference:
http://petrkout.com/electronics/low-latency-0-4-s-video-streaming-from-raspberry-pi-mjpeg-streamer-opencv/
