I mentioned in a prior post that I was having a bit more trouble using GStreamer than I had anticipated. I had some problems with the udpsink and udpsrc elements used in the RTPStreaming demo, and it turns out those were actually caused by my prior GStreamer experience. I've used GStreamer, VLC, and OMXPlayer a fair amount as clients receiving RTSP streams from IP cameras, so I'm used to the client sending a request to the server (camera) when it wants to start receiving video; the connection is made only if the server is running. In the RTPStreaming demo the client (VLC or the GST udpsrc) needed to be running first, and it took me a while to figure out that my startup sequence was what caused things to not work. As I played with GStreamer trying to figure out the issue, I found that GStreamer behavior is highly dependent on the underlying hardware and software. I also discovered quite a few instances of others having problems getting GStreamer pipelines to work with the Xilinx VCU that I'm using. Of course, these are just learning-curve issues, but now I've run into similar problems that I'm having difficulty figuring out.
I'm not a hard-core GStreamer programmer, but I have used GStreamer on Linux computers and with OMX on Raspberry Pis. Based on that experience I had not anticipated having difficulties getting an RTSP stream into the omxh264dec on the VCU. I have four IP cameras and an NVR that I intend to use for my roadtest project. The cameras are all different models (two Amcrest, two FDT), and the NVR is an Amcrest. They use proprietary APIs and some of the commands are undocumented, but the video configurations are specified and I've used RTSP successfully with all of them. The problem is that I've only been able to successfully interface one of the Amcrest cameras through the VCU decoder and out to the Display Port. In the other cases only the first frame gets through; the image stops updating, but the pipeline keeps running.
I made a few videos to illustrate the issue.
The first video uses RTSP with VLC in an Ubuntu 16.04 VM (virtual machine). I am not using an SDP file, just the RTSP URL in the Streaming Media interface. VLC is receiving the RTSP stream from the NVR. I cycle through the four cameras and then a quad view. In all cases you can see the video updating by watching the timestamp and other movement in the images. Everything works.
The second video uses RTSP with GStreamer in the same Ubuntu 16.04 VM (I found that I have performance issues with GStreamer in an Ubuntu 18.04 VM, but I'm not going to pursue that right now). I bring up two of the cameras using the RTSP streams directly from the cameras, and then the quad view from the NVR. Again, you can see the video updating in all cases.
The third video uses RTSP with GStreamer on the UltraZed-EV (UZ7EV) with output to a monitor from the Display Port. I start with the RTSP stream from the working camera, then the same camera through the NVR interface; both cases work. Then a non-working camera, directly and through the NVR; neither case works. Finally, the quad view from the NVR, which also does not work.
I use a simple GST pipeline in the VM:
gst-launch-1.0 rtspsrc location="rtsp://user:passwd@ipaddr:port/commandstring" ! rtph264depay ! h264parse ! avdec_h264 ! autovideosink
The location parameter changes based on which source I'm using, but what's significant is that I'm not explicitly setting the stream parameters; the rtspsrc should be getting those from the camera during the RTSP negotiation.
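A quick way to check what a camera is actually advertising, independent of my pipeline, is gst-discoverer-1.0 (it ships with the gst-plugins-base tools); it performs the RTSP negotiation and prints the discovered stream parameters. Using the same placeholder URL as above:
gst-discoverer-1.0 -v "rtsp://user:passwd@ipaddr:port/commandstring"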
For the UZ7EV a simple pipeline also works for the one Amcrest camera:
gst-launch-1.0 -v rtspsrc location="rtsp://user:passwd@ipaddr:port/commandstring" ! rtph264depay ! h264parse ! omxh264dec ! kmssink bus-id="fd4a0000.zynqmp-display" fullscreen-overlay=true
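If UDP packet loss were part of the problem, one variant worth trying (I haven't verified that it changes anything here) is forcing the RTP transport over TCP using the protocols property of rtspsrc:
gst-launch-1.0 -v rtspsrc protocols=tcp location="rtsp://user:passwd@ipaddr:port/commandstring" ! rtph264depay ! h264parse ! omxh264dec ! kmssink bus-id="fd4a0000.zynqmp-display" fullscreen-overlay=true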
I do see a distinct difference between the caps (capabilities) that are negotiated between the camera and the rtspsrc for the working and non-working cameras.
Working camera:
Progress: (request) SETUP stream 0
/GstPipeline:pipeline0/GstRTSPSrc:rtspsrc0/GstRtpBin:manager: latency = 2000
/GstPipeline:pipeline0/GstRTSPSrc:rtspsrc0/GstRtpBin:manager: ntp-sync = false
/GstPipeline:pipeline0/GstRTSPSrc:rtspsrc0/GstRtpBin:manager: rfc7273-sync = false
/GstPipeline:pipeline0/GstRTSPSrc:rtspsrc0/GstRtpBin:manager: ntp-time-source = NTP time based on realtime clock
/GstPipeline:pipeline0/GstRTSPSrc:rtspsrc0/GstRtpBin:manager: drop-on-latency = false
/GstPipeline:pipeline0/GstRTSPSrc:rtspsrc0/GstRtpBin:manager: max-rtcp-rtp-time-diff = 1000
/GstPipeline:pipeline0/GstRTSPSrc:rtspsrc0/GstRtpBin:manager: max-ts-offset-adjustment = 0
/GstPipeline:pipeline0/GstRTSPSrc:rtspsrc0/GstRtpBin:manager: buffer-mode = Slave receiver to sender clock
/GstPipeline:pipeline0/GstRTSPSrc:rtspsrc0/GstUDPSrc:udpsrc0: timeout = 5000000000
/GstPipeline:pipeline0/GstRTSPSrc:rtspsrc0/GstUDPSrc:udpsrc0: caps = application/x-rtp, media=(string)video, payload=(int)96, clock-rate=(int)90000, encoding-name=(string)H264, packetization-mode=(string)1, profile-level-id=(string)64001F, sprop-parameter-sets=(string)"Z2QAH6w0yAUAW///Ad0B3G4CAgKAAAH0AAB1MHQwAMN4AAw3hd5caGABhvAAGG8LvLhQAA\=\=\,aO48MAA\=", a-packetization-supported=(string)DH, a-rtppayload-supported=(string)DH, a-framerate=(string)30.000000, a-recvonly=(string)"", ssrc=(uint)24860225
/GstPipeline:pipeline0/GstRTSPSrc:rtspsrc0/GstUDPSrc:udpsrc1: caps = application/x-rtcp
Progress: (request) SETUP stream 1
/GstPipeline:pipeline0/GstRTSPSrc:rtspsrc0/GstUDPSrc:udpsrc3: timeout = 5000000000
/GstPipeline:pipeline0/GstRTSPSrc:rtspsrc0/GstUDPSrc:udpsrc3: caps = application/x-rtp, media=(string)audio, payload=(int)0, clock-rate=(int)8000, encoding-name=(string)PCMU, a-packetization-supported=(string)DH, a-rtppayload-supported=(string)DH, a-recvonly=(string)"", ssrc=(uint)855912185
/GstPipeline:pipeline0/GstRTSPSrc:rtspsrc0/GstUDPSrc:udpsrc4: caps = application/x-rtcp
Non-working camera:
Progress: (request) SETUP stream 0
/GstPipeline:pipeline0/GstRTSPSrc:rtspsrc0/GstRtpBin:manager: latency = 2000
/GstPipeline:pipeline0/GstRTSPSrc:rtspsrc0/GstRtpBin:manager: ntp-sync = false
/GstPipeline:pipeline0/GstRTSPSrc:rtspsrc0/GstRtpBin:manager: rfc7273-sync = false
/GstPipeline:pipeline0/GstRTSPSrc:rtspsrc0/GstRtpBin:manager: ntp-time-source = NTP time based on realtime clock
/GstPipeline:pipeline0/GstRTSPSrc:rtspsrc0/GstRtpBin:manager: drop-on-latency = false
/GstPipeline:pipeline0/GstRTSPSrc:rtspsrc0/GstRtpBin:manager: max-rtcp-rtp-time-diff = 1000
/GstPipeline:pipeline0/GstRTSPSrc:rtspsrc0/GstRtpBin:manager: max-ts-offset-adjustment = 0
/GstPipeline:pipeline0/GstRTSPSrc:rtspsrc0/GstRtpBin:manager: buffer-mode = Slave receiver to sender clock
/GstPipeline:pipeline0/GstRTSPSrc:rtspsrc0/GstUDPSrc:udpsrc1: timeout = 5000000000
/GstPipeline:pipeline0/GstRTSPSrc:rtspsrc0/GstUDPSrc:udpsrc1: caps = application/x-rtp, media=(string)video, payload=(int)96, clock-rate=(int)90000, encoding-name=(string)H264, packetization-mode=(string)1, profile-level-id=(string)42001F, sprop-parameter-sets=(string)"Z0IAH5WoFAFuQA\=\=\,aM48gA\=\=", a-framesize=(string)1280-720, ssrc=(uint)1257665891
/GstPipeline:pipeline0/GstRTSPSrc:rtspsrc0/GstUDPSrc:udpsrc2: caps = application/x-rtcp
Progress: (request) SETUP stream 1
/GstPipeline:pipeline0/GstRTSPSrc:rtspsrc0/GstUDPSrc:udpsrc4: timeout = 5000000000
/GstPipeline:pipeline0/GstRTSPSrc:rtspsrc0/GstUDPSrc:udpsrc4: caps = application/x-rtp, media=(string)audio, payload=(int)8, clock-rate=(int)8000, encoding-name=(string)PCMA, encoding-params=(string)1, packetization-mode=(string)1, a-ptime=(string)20, ssrc=(uint)2065986730
/GstPipeline:pipeline0/GstRTSPSrc:rtspsrc0/GstUDPSrc:udpsrc5: caps = application/x-rtcp
I've set GST_DEBUG=3 (which shows ERROR, WARN, and FIXME messages) but I don't see any obvious differences in the logs. (I'm ignoring the audio stream errors from udpsrc3/udpsrc4; those pads aren't linked to anything.)
Working camera log:
0:00:00.989571057 2660 0x7f8003b9e0 FIXME default gstutils.c:3981:gst_pad_create_stream_id_internal:<fakesrc0:src> Creating random stream-id, consider implementing a deterministic way of creating a stream-id
0:00:00.989683908 2660 0x7f8003b990 FIXME default gstutils.c:3981:gst_pad_create_stream_id_internal:<fakesrc1:src> Creating random stream-id, consider implementing a deterministic way of creating a stream-id
0:00:01.074379325 2660 0x7f8003c000 WARN basesrc gstbasesrc.c:3055:gst_base_src_loop:<udpsrc3> error: Internal data stream error.
0:00:01.074418525 2660 0x7f8003c000 WARN basesrc gstbasesrc.c:3055:gst_base_src_loop:<udpsrc3> error: streaming stopped, reason not-linked (-1)
0:00:03.153371848 2660 0x7f600035e0 FIXME basesink gstbasesink.c:3156:gst_base_sink_default_event:<kmssink0> stream-start event without group-id. Consider implementing group-id handling in the upstream elements
0:00:03.158844473 2660 0x7f600035e0 WARN GST_PADS gstpad.c:4226:gst_pad_peer_query:<h264parse0:src> could not send sticky events
0:00:03.200954824 2660 0x7f8003c140 WARN video-info video-info.c:723:gst_video_info_to_caps: invalid matrix 3 for RGB format, using RGB
0:00:03.202297348 2660 0x7f8003c140 WARN GST_PADS gstpad.c:4226:gst_pad_peer_query:<omxh264dec-omxh264dec0:src> could not send sticky events
0:00:03.202362548 2660 0x7f8003c140 WARN videodecoder gstvideodecoder.c:3900:gst_video_decoder_negotiate_pool:<omxh264dec-omxh264dec0> Subclass failed to decide allocation
Non-working camera log:
0:00:00.791565207 2698 0x7fac03f9e0 FIXME default gstutils.c:3981:gst_pad_create_stream_id_internal:<fakesrc0:src> Creating random stream-id, consider implementing a deterministic way of creating a stream-id
0:00:00.792353815 2698 0x7fac03f990 FIXME default gstutils.c:3981:gst_pad_create_stream_id_internal:<fakesrc1:src> Creating random stream-id, consider implementing a deterministic way of creating a stream-id
0:00:03.005533260 2698 0x7fac040050 WARN basesrc gstbasesrc.c:3055:gst_base_src_loop:<udpsrc4> error: Internal data stream error.
0:00:03.005570740 2698 0x7fac040050 WARN basesrc gstbasesrc.c:3055:gst_base_src_loop:<udpsrc4> error: streaming stopped, reason not-linked (-1)
0:00:03.026290038 2698 0x7f900034f0 FIXME basesink gstbasesink.c:3156:gst_base_sink_default_event:<kmssink0> stream-start event without group-id. Consider implementing group-id handling in the upstream elements
0:00:03.031667991 2698 0x7f900034f0 WARN GST_PADS gstpad.c:4226:gst_pad_peer_query:<h264parse0:src> could not send sticky events
0:00:03.072212957 2698 0x7fac040140 WARN video-info video-info.c:723:gst_video_info_to_caps: invalid matrix 3 for RGB format, using RGB
0:00:03.073495870 2698 0x7fac040140 WARN GST_PADS gstpad.c:4226:gst_pad_peer_query:<omxh264dec-omxh264dec0:src> could not send sticky events
0:00:03.073560030 2698 0x7fac040140 WARN videodecoder gstvideodecoder.c:3900:gst_video_decoder_negotiate_pool:<omxh264dec-omxh264dec0> Subclass failed to decide allocation
I'm not sure how much impact the FIXME/WARN issues have, but they look the same between the two cameras. And the pipeline does keep running in the non-working case.
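If I need more detail than GST_DEBUG=3 gives me, the debug level can be raised for individual categories instead of everything at once; for example, keeping the default at 3 while turning the rtspsrc and jitterbuffer categories up to LOG level (the category names come from the logs above, and the levels are just a starting guess):
GST_DEBUG=3,rtspsrc:6,rtpjitterbuffer:6 gst-launch-1.0 -v rtspsrc location="rtsp://user:passwd@ipaddr:port/commandstring" ! rtph264depay ! h264parse ! omxh264dec ! kmssink bus-id="fd4a0000.zynqmp-display" fullscreen-overlay=true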
So, the difference that is apparent to me is that the working camera's stream advertises a framerate (a-framerate=30.000000) while the non-working camera's advertises only a framesize (a-framesize=1280-720). I've set the parameters on both cameras to be the same (30 fps and 1280x720).
Downstream at the decoder both cameras show the correct framesize, but the working camera has a fixed 30 fps framerate (30/1) while the non-working camera has a variable framerate (0/1). I'm not sure if this is the actual issue or how to fix it. I tried adding caps at the rtspsrc, but the framerate stayed variable. I also tried adding an rtpjitterbuffer element with latency=1000 in case frames were being dropped, and changing the camera encoding type from Main to Baseline in case there was an issue with B-frames. (There's also a capssetter variant I want to try, sketched after the caps below.)
Working camera:
/GstPipeline:pipeline0/GstOMXH264Dec-omxh264dec:omxh264dec-omxh264dec0.GstPad:sink: caps = video/x-h264, stream-format=(string)byte-stream, alignment=(string)au, pixel-aspect-ratio=(fraction)477/476, width=(int)1280, height=(int)720, framerate=(fraction)30/1, interlace-mode=(string)progressive, chroma-format=(string)4:2:0, bit-depth-luma=(uint)8, bit-depth-chroma=(uint)8, colorimetry=(string)1:3:5:1, parsed=(boolean)true, profile=(string)high, level=(string)3.1
Non-working camera:
/GstPipeline:pipeline0/GstOMXH264Dec-omxh264dec:omxh264dec-omxh264dec0.GstPad:sink: caps = video/x-h264, stream-format=(string)byte-stream, alignment=(string)au, width=(int)1280, height=(int)720, framerate=(fraction)0/1, interlace-mode=(string)progressive, chroma-format=(string)4:2:0, bit-depth-luma=(uint)8, bit-depth-chroma=(uint)8, colorimetry=(string)1:3:5:1, parsed=(boolean)true, profile=(string)baseline, level=(string)3.1
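One thing I haven't tried yet is forcing a fixed framerate into the caps ahead of the decoder. A plain capsfilter can't do it (it only restricts caps, so framerate=30/1 would just fail to negotiate against the 0/1 coming from upstream), but the capssetter element from gst-plugins-bad rewrites caps fields in place. This is only a sketch, and it assumes capssetter is included in the image:
gst-launch-1.0 -v rtspsrc location="rtsp://user:passwd@ipaddr:port/commandstring" ! rtph264depay ! h264parse ! capssetter caps="video/x-h264,framerate=(fraction)30/1" ! omxh264dec ! kmssink bus-id="fd4a0000.zynqmp-display" fullscreen-overlay=true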
Hopefully this is just a matter of a setting that I'm missing but I'm actually not sure where the problem is. I'm pretty sure the decoder works with a variable framerate. Back to searching the forums and trying to figure out the right questions to ask.
Links to previous posts for this roadtest:
- Avnet UltraZed-EV Starter Kit Road Test- the adventure begins.....
- Avnet UltraZed-EV Starter Kit Road Test - VCU TRD
- Avnet UltraZed-EV Starter Kit Road Test - VCU TRD continued
- Avnet UltraZed-EV Starter Kit Road Test - Port PYNQv2.5
- Avnet UltraZed-EV Starter Kit Road Test - Port PYNQv2.5 continued
- Avnet UltraZed-EV Starter Kit Road Test - Vitis AI
- Avnet UltraZed-EV Starter Kit Road Test - Overview