I have a Python client that receives a video stream transmitted with VLC or OBS Studio.
Client code:
import cv2
import time

target_url = 'udp://@0.0.0.0:1235'
stream = cv2.VideoCapture(target_url)
while True:
    r, f = stream.read()
    if r:
        cv2.imshow('IP Camera stream', f)
        cv2.waitKey(1)  # needed so imshow actually refreshes the window
It can read and display the stream sent from another machine with VLC. Now I would like to write my own video server application instead of using VLC. I tried cv2.VideoWriter, but it only accepts local files, not a udpsink. After searching the web I found a few Stack Overflow answers suggesting pyzmq over TCP [1], creating and handling a socket manually, but that does not work for me because the client should be able to receive from both VLC and the custom application [2] (a rough sketch of that rejected approach is shown below).
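For context, the kind of PUB/SUB approach those answers describe looks roughly like this (a sketch only; the port and the JPEG framing are arbitrary choices here). Because the frames travel inside ZMQ's own TCP framing rather than as an RTP/UDP stream, neither VLC nor a plain cv2.VideoCapture('udp://...') can consume it:

import cv2
import zmq

ctx = zmq.Context()
sock = ctx.socket(zmq.PUB)
sock.bind("tcp://*:5555")            # arbitrary port, for illustration

cam = cv2.VideoCapture(0)
while True:
    ok, frame = cam.read()
    if not ok:
        break
    ok, jpg = cv2.imencode('.jpg', frame)
    if ok:
        sock.send(jpg.tobytes())     # JPEG bytes wrapped in ZMQ framing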
I then came across NetGear [3], which is a nice tool, but it does not support UDP because it uses pyzmq internally [4].
Basically, I am looking for something like cv2.VideoWriter('udp://192.168.1.2:5000', fourcc, ..).
Question: Is there a way to encode a live camera feed to H264 with a given bitrate and fps and transmit it over UDP, so that it can be received with cv2.VideoCapture('udp://@0.0.0.0:5000')?
Posted on 2022-02-20 01:58:35
I would suggest using GStreamer for this. You can try:
#!/usr/bin/env python
import cv2

print(cv2.__version__)
# Uncomment this to check whether your OpenCV build has GStreamer support
#print(cv2.getBuildInformation())

cap = cv2.VideoCapture("udpsrc port=5000 ! application/x-rtp,media=video,encoding-name=H264 ! queue ! rtpjitterbuffer latency=500 ! rtph264depay ! h264parse ! avdec_h264 ! videoconvert ! video/x-raw,format=BGR ! queue ! appsink drop=1", cv2.CAP_GSTREAMER)
# For NVIDIA using NVMM memory
#cap = cv2.VideoCapture("udpsrc port=5000 ! application/x-rtp,media=video,encoding-name=H264 ! queue ! rtpjitterbuffer latency=500 ! rtph264depay ! h264parse ! nvv4l2decoder ! nvvidconv ! video/x-raw,format=BGRx ! videoconvert ! video/x-raw,format=BGR ! queue ! appsink drop=1", cv2.CAP_GSTREAMER)

width = cap.get(cv2.CAP_PROP_FRAME_WIDTH)
height = cap.get(cv2.CAP_PROP_FRAME_HEIGHT)
#fps = cap.get(cv2.CAP_PROP_FPS)  # doesn't work with python in my case, so forcing below... you may have to adjust for your case
fps = 30

if not cap.isOpened():
    print('Failed to open camera')
    exit()
print('Source opened, framing %dx%d@%d' % (width, height, fps))

writer = cv2.VideoWriter("appsrc ! video/x-raw,format=BGR ! queue ! videoconvert ! x264enc insert-vui=1 ! h264parse ! rtph264pay ! udpsink port=5001", cv2.CAP_GSTREAMER, 0, float(fps), (int(width),int(height)))
# For NVIDIA using NVMM memory
#writer = cv2.VideoWriter("appsrc ! video/x-raw,format=BGR ! queue ! videoconvert ! video/x-raw,format=BGRx ! nvvidconv ! nvv4l2h264enc insert-sps-pps=1 insert-vui=1 ! h264parse ! rtph264pay ! udpsink port=5001", cv2.CAP_GSTREAMER, 0, float(fps), (int(width),int(height)))
if not writer.isOpened():
    print('Failed to open writer')
    cap.release()
    exit()

while True:
    ret_val, img = cap.read()
    if not ret_val:
        break
    writer.write(img)
    cv2.waitKey(1)

writer.release()
cap.release()
This should stream to localhost on port 5001, and you should be able to receive it on a Linux host running X with (expect around 10 seconds of setup time):
gst-launch-1.0 udpsrc port=5001 ! application/x-rtp,media=video,encoding-name=H264 ! queue ! rtpjitterbuffer latency=500 ! rtph264depay ! h264parse ! avdec_h264 ! videoconvert ! xvimagesink
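Alternatively, you can read the stream back into OpenCV by reusing the receive pipeline from the script above, pointed at port 5001 (a minimal sketch, assuming the same GStreamer-enabled OpenCV build):

rx = cv2.VideoCapture("udpsrc port=5001 ! application/x-rtp,media=video,encoding-name=H264 ! queue ! rtpjitterbuffer latency=500 ! rtph264depay ! h264parse ! avdec_h264 ! videoconvert ! video/x-raw,format=BGR ! queue ! appsink drop=1", cv2.CAP_GSTREAMER)
while True:
    ok, frame = rx.read()
    if not ok:
        break
    cv2.imshow('RTP/H264 over UDP', frame)
    if cv2.waitKey(1) == 27:  # press Esc to quit
        break
rx.release()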
If you want to stream to a given host instead, set the host property of udpsink and disable auto-multicast:
writer = cv2.VideoWriter("appsrc ! video/x-raw,format=BGR ! queue ! videoconvert ! x264enc insert-vui=1 ! h264parse ! rtph264pay ! udpsink port=5001 host=<target_IP> auto-multicast=0", cv2.CAP_GSTREAMER, 0, float(fps), (int(width),int(height)))
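Since the question also asks about controlling the bitrate: x264enc has a bitrate property (in kbit/s), and the fps is simply whatever you pass to cv2.VideoWriter. The values below are placeholders for illustration only:

writer = cv2.VideoWriter("appsrc ! video/x-raw,format=BGR ! queue ! videoconvert ! x264enc bitrate=2000 speed-preset=ultrafast tune=zerolatency insert-vui=1 ! h264parse ! rtph264pay ! udpsink port=5001 host=<target_IP> auto-multicast=0", cv2.CAP_GSTREAMER, 0, float(fps), (int(width),int(height)))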
If you want to use multicast instead (better avoided over WiFi):
writer = cv2.VideoWriter("appsrc ! video/x-raw,format=BGR ! queue ! videoconvert ! x264enc insert-vui=1 ! h264parse ! rtph264pay ! udpsink port=5001 host=224.1.1.1", cv2.CAP_GSTREAMER, 0, float(fps), (int(width),int(height)))
You may then receive it on any Linux host on the LAN with:
gst-launch-1.0 udpsrc multicast-group=224.1.1.1 port=5001 ! application/x-rtp,media=video,encoding-name=H264 ! rtph264depay ! h264parse ! avdec_h264 ! videoconvert ! xvimagesink
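Or read it back into OpenCV with a matching pipeline (again a sketch, adding the multicast group to the receive pipeline used above):

cap = cv2.VideoCapture("udpsrc multicast-group=224.1.1.1 port=5001 ! application/x-rtp,media=video,encoding-name=H264 ! queue ! rtpjitterbuffer latency=500 ! rtph264depay ! h264parse ! avdec_h264 ! videoconvert ! video/x-raw,format=BGR ! queue ! appsink drop=1", cv2.CAP_GSTREAMER)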
https://stackoverflow.com/questions/71174080