
Receive livestream on another Linux machine

Posted: Sun Dec 22, 2019 10:01 pm
by ducnan
Hi,

I am new to StereoPi. As I go through the SLP image tutorial in the wiki, there are parts introducing how to receive the livestream on Windows and Mac, but I wonder how to receive the livestream on another Linux machine. Also, is it possible to make it act as a standard USB camera?

Thanks in advance and happy holidays.

Duncan

Re: Receive livestream on another Linux machine

Posted: Mon Dec 23, 2019 10:45 am
by Realizator
ducnan wrote:
Sun Dec 22, 2019 10:01 pm
Hi Duncan,
You can use gstreamer on Linux too, just like in our Wiki examples for Windows and Mac; you only need to adjust some parameters for your Linux distribution.
As for USB camera emulation, it is a very difficult task. The only successful attempt on our side was with the OBS VirtualCam plugin (you may find some details in our update here). It is better to go another way and receive the network stream from your StereoPi in your application under Linux.
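For reference, a minimal receive pipeline on a Debian/Ubuntu desktop could look like the sketch below. It assumes the SLP default UDP port 3000 and apt-based package names; adjust both for your distribution.

```shell
# Install the gstreamer tools and the software H.264 decoder
# (package names assume a Debian/Ubuntu-based distribution).
sudo apt install gstreamer1.0-tools gstreamer1.0-libav gstreamer1.0-plugins-good

# Receive a raw H.264 stream over UDP on port 3000 (the SLP default)
# and show it in a window; sync=false keeps latency low.
gst-launch-1.0 -v udpsrc port=3000 ! h264parse ! avdec_h264 ! autovideosink sync=false
```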

Re: Receive livestream on another Linux machine

Posted: Mon Dec 23, 2019 5:14 pm
by ducnan
Hi Eugene,

Thank you very much for your reply. So I might try to stream it through the network. Can this network be a local network, e.g. can I just connect the StereoPi and my Linux machine with a LAN cable?

My end goal is to stream the frames captured by the StereoPi to a Jetson Nano running a real-time SLAM task on ROS. Are there any similar or related projects I can reference? That would be very helpful.

Thanks,
Duncan

Re: Receive livestream on another Linux machine

Posted: Tue Dec 24, 2019 9:22 am
by Realizator
Hi Duncan,
Yes, you can use an Ethernet cable, but you'll also need to configure the Ethernet interfaces appropriately. If you use a direct connection (without a router), you need to set a static IP address on both devices. If you use a router, you can rely on DHCP, but then you need a way to discover the IP addresses assigned to the StereoPi and the Jetson.
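As a sketch, static addressing for a direct cable link could look like this; the 192.168.10.x subnet is an arbitrary example, not an SLP default.

```shell
# On the StereoPi (the SLP image is Raspbian-based, so dhcpcd manages eth0):
cat <<'EOF' | sudo tee -a /etc/dhcpcd.conf
interface eth0
static ip_address=192.168.10.1/24
EOF
sudo reboot   # or restart the dhcpcd service

# On the Jetson Nano (Ubuntu), a one-off assignment with iproute2:
sudo ip addr add 192.168.10.2/24 dev eth0
sudo ip link set eth0 up

# Verify the link from the Nano:
ping -c 3 192.168.10.1
```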

Re: Receive livestream on another Linux machine

Posted: Tue Dec 24, 2019 2:57 pm
by ducnan
Hi Eugene,

I probably won't use a router, as both the StereoPi and the Nano are mounted on a small robot. I know how to set a static IP address on the Nano, but how do I do it for the StereoPi with the SLP image? Via the admin panel or SSH?

I have one more question about the admin panel: I see an option "USB Enable"; what does this setting do?

Thank you so much.
Duncan

Re: Receive livestream on another Linux machine

Posted: Tue Dec 24, 2019 3:14 pm
by Realizator
ducnan wrote:
Tue Dec 24, 2019 2:57 pm
The SLP image has SSH enabled by default, so you can do it over SSH. You can find the SSH credentials in our Wiki.

The "USB Enable" option turns on "USB accessory" mode for Android phones. If you connect a USB cable to the StereoPi (not to the micro USB, but to the USB A port), and the other end of the cable to any Android phone (micro USB or USB C), your phone will recognize the StereoPi as a network accessory. In this case, if you use our Android application, you can access the StereoPi and see video without WiFi or Ethernet, just over the cable.

Re: Receive livestream on another Linux machine

Posted: Tue Dec 24, 2019 8:17 pm
by ducnan
Great. So now I am able to access the StereoPi over SSH from my Nano and can even see the livestream in a browser.
I tried to set up the Nano with GStreamer like the Mac example in the wiki. However, when I run this on the command line:

Code:

gst-launch-1.0 -v udpsrc port=3000 buffer-size=300000 ! h264parse ! avdec_h264 ! fpsdisplaysink sync=false
No live stream comes out on my Nano, and it prompts me to run gdb for debugging. Here is the complete message shown on the terminal:

Code:

Setting pipeline to PAUSED ...
Pipeline is live and does not need PREROLL ...
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0/GstAutoVideoSink:fps-display-video_sink/GstNvOverlaySink-nvoverlaysink:fps-display-video_sink-actual-sink-nvoverlay: sync = false
Setting pipeline to PLAYING ...
New clock: GstSystemClock
/GstPipeline:pipeline0/GstH264Parse:h264parse0.GstPad:src: caps = video/x-h264, width=(int)1280, height=(int)720, framerate=(fraction)0/1, chroma-format=(string)4:2:0, bit-depth-luma=(uint)8, bit-depth-chroma=(uint)8, parsed=(boolean)true, stream-format=(string)avc, alignment=(string)au, profile=(string)baseline, level=(string)4, codec_data=(buffer)01428028ffe1000d2742802895a014016e8078913501000528ce025c80
/GstPipeline:pipeline0/avdec_h264:avdec_h264-0.GstPad:sink: caps = video/x-h264, width=(int)1280, height=(int)720, framerate=(fraction)0/1, chroma-format=(string)4:2:0, bit-depth-luma=(uint)8, bit-depth-chroma=(uint)8, parsed=(boolean)true, stream-format=(string)avc, alignment=(string)au, profile=(string)baseline, level=(string)4, codec_data=(buffer)01428028ffe1000d2742802895a014016e8078913501000528ce025c80
/GstPipeline:pipeline0/avdec_h264:avdec_h264-0.GstPad:src: caps = video/x-raw, format=(string)I420, width=(int)1280, height=(int)720, interlace-mode=(string)progressive, multiview-mode=(string)mono, multiview-flags=(GstVideoMultiviewFlagsSet)0:ffffffff:/right-view-first/left-flipped/left-flopped/right-flipped/right-flopped/half-aspect/mixed-mono, pixel-aspect-ratio=(fraction)1/1, chroma-site=(string)mpeg2, colorimetry=(string)bt709, framerate=(fraction)0/1
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0.GstGhostPad:sink.GstProxyPad:proxypad0: caps = video/x-raw, format=(string)I420, width=(int)1280, height=(int)720, interlace-mode=(string)progressive, multiview-mode=(string)mono, multiview-flags=(GstVideoMultiviewFlagsSet)0:ffffffff:/right-view-first/left-flipped/left-flopped/right-flipped/right-flopped/half-aspect/mixed-mono, pixel-aspect-ratio=(fraction)1/1, chroma-site=(string)mpeg2, colorimetry=(string)bt709, framerate=(fraction)0/1
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0/GstTextOverlay:fps-display-text-overlay.GstPad:src: caps = video/x-raw, format=(string)I420, width=(int)1280, height=(int)720, interlace-mode=(string)progressive, multiview-mode=(string)mono, multiview-flags=(GstVideoMultiviewFlagsSet)0:ffffffff:/right-view-first/left-flipped/left-flopped/right-flipped/right-flopped/half-aspect/mixed-mono, pixel-aspect-ratio=(fraction)1/1, chroma-site=(string)mpeg2, colorimetry=(string)bt709, framerate=(fraction)0/1
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0/GstAutoVideoSink:fps-display-video_sink.GstGhostPad:sink.GstProxyPad:proxypad1: caps = video/x-raw, format=(string)I420, width=(int)1280, height=(int)720, interlace-mode=(string)progressive, multiview-mode=(string)mono, multiview-flags=(GstVideoMultiviewFlagsSet)0:ffffffff:/right-view-first/left-flipped/left-flopped/right-flipped/right-flopped/half-aspect/mixed-mono, pixel-aspect-ratio=(fraction)1/1, chroma-site=(string)mpeg2, colorimetry=(string)bt709, framerate=(fraction)0/1
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0/GstAutoVideoSink:fps-display-video_sink/GstNvOverlaySink-nvoverlaysink:fps-display-video_sink-actual-sink-nvoverlay.GstPad:sink: caps = video/x-raw, format=(string)I420, width=(int)1280, height=(int)720, interlace-mode=(string)progressive, multiview-mode=(string)mono, multiview-flags=(GstVideoMultiviewFlagsSet)0:ffffffff:/right-view-first/left-flipped/left-flopped/right-flipped/right-flopped/half-aspect/mixed-mono, pixel-aspect-ratio=(fraction)1/1, chroma-site=(string)mpeg2, colorimetry=(string)bt709, framerate=(fraction)0/1
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0/GstAutoVideoSink:fps-display-video_sink.GstGhostPad:sink: caps = video/x-raw, format=(string)I420, width=(int)1280, height=(int)720, interlace-mode=(string)progressive, multiview-mode=(string)mono, multiview-flags=(GstVideoMultiviewFlagsSet)0:ffffffff:/right-view-first/left-flipped/left-flopped/right-flipped/right-flopped/half-aspect/mixed-mono, pixel-aspect-ratio=(fraction)1/1, chroma-site=(string)mpeg2, colorimetry=(string)bt709, framerate=(fraction)0/1
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0/GstTextOverlay:fps-display-text-overlay.GstPad:video_sink: caps = video/x-raw, format=(string)I420, width=(int)1280, height=(int)720, interlace-mode=(string)progressive, multiview-mode=(string)mono, multiview-flags=(GstVideoMultiviewFlagsSet)0:ffffffff:/right-view-first/left-flipped/left-flopped/right-flipped/right-flopped/half-aspect/mixed-mono, pixel-aspect-ratio=(fraction)1/1, chroma-site=(string)mpeg2, colorimetry=(string)bt709, framerate=(fraction)0/1
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0.GstGhostPad:sink: caps = video/x-raw, format=(string)I420, width=(int)1280, height=(int)720, interlace-mode=(string)progressive, multiview-mode=(string)mono, multiview-flags=(GstVideoMultiviewFlagsSet)0:ffffffff:/right-view-first/left-flipped/left-flopped/right-flipped/right-flopped/half-aspect/mixed-mono, pixel-aspect-ratio=(fraction)1/1, chroma-site=(string)mpeg2, colorimetry=(string)bt709, framerate=(fraction)0/1
Caught SIGSEGV
#0  0x0000007f9b9fe048 in __GI___poll (fds=0x5592034eb0, nfds=548072895088, timeout=<optimized out>) at ../sysdeps/unix/sysv/linux/poll.c:41
#1  0x0000007f9bb0ae40 in  () at /usr/lib/aarch64-linux-gnu/libglib-2.0.so.0
#2  0x0000005591d97bd0 in  ()
Spinning.  Please run 'gdb gst-launch-1.0 9428' to continue debugging, Ctrl-C to quit, or Ctrl-\ to dump core.

Any idea about what is going on? I am kind of lost.

Thank you,
Duncan

Re: Receive livestream on another Linux machine

Posted: Wed Dec 25, 2019 9:49 am
by Realizator
Duncan, if you see the video livestream in a browser, it means all the hardware is OK. Great!
If you're using the SLP image, you should know that there are livestream scripts which use the cameras all the time. So if you try to start a new livestream, you'll get a conflict, as two instances try to access the cameras.
Before running your scripts, please stop our livestream: just run /opt/StereoPi/stop.sh
If you try to stop raspivid or gstreamer with a "killall ..." command, they will auto-restart while our scripts are running.
Please check this issue in your case.
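The order of operations is then roughly as follows (a sketch, assuming the stock scripts live under /opt/StereoPi and that the restart script is named run.sh, as mentioned later in this thread):

```shell
/opt/StereoPi/stop.sh   # stop the built-in livestream and free the cameras
# ... run your own raspivid / gstreamer pipeline here ...
/opt/StereoPi/run.sh    # restart the stock livestream when you are done
```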

Re: Receive livestream on another Linux machine

Posted: Sat Dec 28, 2019 4:39 pm
by ducnan
Hi Eugene,

I don't have a custom script on the StereoPi; for now I just run GStreamer on the Nano to see whether it can receive the livestream over UDP.

So I followed the steps you said: SSH to the StereoPi and stop all scripts. After that, I ran the GStreamer command on the Nano, but still no live video appeared. Then I restarted the scripts on the StereoPi by running "run.sh"; the console said "video connected, client count=0". I then ran GStreamer on the Nano again, but nothing improved, and the client count in the StereoPi console was still 0. Does that mean the UDP connection was never established?
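(One way to rule GStreamer out is to check with a small Python listener whether any UDP packets reach the Nano at all. This is a generic diagnostic sketch, not an SLP tool; stop any gst-launch process using the port before running it.)

```python
import socket

def wait_for_udp(port: int, timeout: float = 5.0) -> int:
    """Listen on the given UDP port and return the size of the first
    datagram received, or -1 if nothing arrives before the timeout."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.settimeout(timeout)
    sock.bind(("0.0.0.0", port))
    try:
        data, addr = sock.recvfrom(65536)
        print(f"got {len(data)} bytes from {addr[0]}")
        return len(data)
    except socket.timeout:
        print("no packets received -- the stream is not reaching this host")
        return -1
    finally:
        sock.close()

# Usage on the Nano: wait_for_udp(3000)
```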

Lastly, I tried to stream it through RTSP. I selected the RTSP option in the StereoPi admin panel and was able to get the video on my Nano through MPlayer and an RTSP ROS driver. However, RTSP seems to have a big latency, around 10 seconds, which is not tolerable for my project. Is there any way to reduce the latency?

Thank you very much.
Duncan

Re: Receive livestream on another Linux machine

Posted: Sat Dec 28, 2019 5:04 pm
by Realizator
Hi Duncan,
Thank you for your detailed description. Now I have an idea why you cannot receive the stream.
When you run gstreamer on the Nano, you actually start only the receiving part.
Now you need to run the streaming part on your StereoPi.
For this you have two ways:

1. Use the built-in SLP livestream feature (bottom of this Wiki page).
For this you need to put your Nano's IP in the panel. Quoting our Wiki:
- In the SLP admin panel, turn on the "Stream UDP" option
- Put "192.168.1.139:3000" in the UDP clients field (REPLACE 192.168.1.139 with your Nano's IP)
- Press "Save" at the bottom of the admin panel
- Run the gst-launch line in the command line on the Nano
Also, please use the same port 3000 in your gstreamer options on the Nano.

2. Stop our scripts (stop.sh) and run the gstreamer line you need from the StereoPi console.
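For way 2, a manual sender on the StereoPi might look like the sketch below. It assumes raspivid with the stereoscopic -3d option and sends the raw H.264 bytestream over plain UDP, which matches the udpsrc ! h264parse receiver on the Nano; the resolution and frame rate are example values, and 192.168.1.139 must be replaced with your Nano's IP.

```shell
# Stop the stock SLP livestream first so the cameras are free.
/opt/StereoPi/stop.sh

# Capture side-by-side stereo H.264 and push the raw bytestream
# to the Nano over UDP port 3000.
raspivid -3d sbs -w 1280 -h 720 -fps 30 -t 0 -o - | \
  gst-launch-1.0 -v fdsrc ! udpsink host=192.168.1.139 port=3000
```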

Re: Receive livestream on another Linux machine

Posted: Sat Dec 28, 2019 10:06 pm
by ducnan
Hi Eugene,

I already followed the steps described in the wiki (setting the UDP client address and running the GStreamer command). The result was included in my earlier reply (the one with the terminal output).
Basically, I could not receive the livestream using GStreamer on the Nano over UDP.
Here is part of the error message after running the gstreamer command:
--------------------------------------------------------
Caught SIGSEGV
#0 0x0000007f9b9fe048 in __GI___poll (fds=0x5592034eb0, nfds=548072895088, timeout=<optimized out>) at ../sysdeps/unix/sysv/linux/poll.c:41
#1 0x0000007f9bb0ae40 in () at /usr/lib/aarch64-linux-gnu/libglib-2.0.so.0
#2 0x0000005591d97bd0 in ()
Spinning. Please run 'gdb gst-launch-1.0 9428' to continue debugging, Ctrl-C to quit, or Ctrl-\ to dump core.
--------------------------------------------------------------

Thanks.
Duncan

Re: Receive livestream on another Linux machine

Posted: Mon Dec 30, 2019 9:35 am
by Realizator
Hi Duncan,
Under Ubuntu on your Nano, please try 'autovideosink' instead of 'fpsdisplaysink'.

Re: Receive livestream on another Linux machine

Posted: Tue Dec 31, 2019 9:10 pm
by ducnan
Hi Eugene,

I tried replacing "fpsdisplaysink" with "autovideosink" as you suggested, but the result and error messages are the same as before. I also tried different sink options after consulting the GStreamer website, like udpsink and fakesink; the udpsink option gave a "message too long" error.

Thank you and happy new year!

Duncan

Re: Receive livestream on another Linux machine

Posted: Thu Jan 02, 2020 3:50 pm
by Realizator
Hi Duncan,
OK, I need to repeat this experiment with Ubuntu. I remember that gstreamer binaries are OS-specific, so we just need to find the appropriate line for you.
Also, our example lines are very simplified, as there are actually a lot of other parameters for image decoding (here is an example from our forum with more advanced options).

Re: Receive livestream on another Linux machine

Posted: Thu Jan 09, 2020 10:29 am
by Realizator
Hello Duncan,
Sorry for my late reply!
I still have no Nano here to play with, but I found some information.
Here on the NVidia developers forum you can find an example of capturing a raw H264 stream:
https://devtalk.nvidia.com/default/topi ... ng-opencv/

The first line works for that question's author, but it takes 50% of the CPU, as it does not use hardware acceleration for decoding. You can still try it to check whether it works for you.

As you plan to use the received stream in OpenCV, the second part of that discussion will be interesting for you too, since the main task of the topic starter is OpenCV processing of the received stream.

As I remember from our previous gstreamer-related hardware-specific projects (such as those based on the TI DM365 SoC), gstreamer is very hardware-specific on embedded systems. On Windows or Mac you can use our simple sample gst-launch lines and not even notice the desktop CPU load, but on embedded systems like the Nano you have to use hardware-optimized options. So the best way is to start with existing examples from Nano users and optimize them for your case. And to get the best results, you should use docs from NVidia like the Accelerated_GStreamer_User_Guide.pdf
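Since the end goal is OpenCV on the Nano, the usual pattern is to hand cv2.VideoCapture a GStreamer pipeline string ending in appsink. The helper below only builds that string, so the decoder element is easy to swap (avdec_h264 for the software path discussed in that NVidia thread, or a hardware decoder such as nvv4l2decoder); it assumes an OpenCV build with GStreamer support, which the stock Jetson images have.

```python
def h264_udp_pipeline(port: int = 3000, decoder: str = "avdec_h264") -> str:
    """Build a GStreamer pipeline string for cv2.VideoCapture that
    receives a raw H.264 UDP stream and decodes it to BGR frames."""
    return (
        f"udpsrc port={port} ! h264parse ! {decoder} ! "
        "videoconvert ! video/x-raw,format=BGR ! "
        "appsink drop=true max-buffers=1"
    )

# Usage (on the Nano, with OpenCV built with GStreamer support):
# import cv2
# cap = cv2.VideoCapture(h264_udp_pipeline(3000), cv2.CAP_GSTREAMER)
# ok, frame = cap.read()
```

Note that with a hardware decoder such as nvv4l2decoder, an extra nvvidconv step before videoconvert is typically needed; check NVidia's Accelerated GStreamer guide for the exact elements on your JetPack version.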