The Raspberry Pi is an incredibly versatile computing platform, particularly when it comes to embedded applications. Raspberry Pis are used in all kinds of security and monitoring projects to take still shots over time, or record video footage for later review. It's remarkably easy to do, and there's a wide variety of tools available to get the job done. However, if you need live video with as little latency as possible, things get more difficult. I was building a remotely controlled vehicle that uses the cellular data network for communication, and minimizing latency was key to making the vehicle easy to drive. Thus I set sail for the nearest search engine and began researching my problem.

My first approach to the challenge was the venerable VLC Media Player. Initial experiments were sadly fraught with issues. Getting the software to recognize the webcam plugged into my Pi Zero took forever, and when I eventually did get the stream up and running, it was far too laggy to be useful. Streaming over WiFi and waving my hands in front of the camera showed a delay of at least two or three seconds. While I could possibly have optimized it further, I decided to move on and try to find something a little more lightweight.

Native MJPEG Streaming - If Your Network is Fast

My webcam, a Microsoft Lifecam VX-2000, natively provides an MJPEG output at up to 640×480 resolution. This led me to the next streaming tool I tried, known as MJPEGStreamer. By using a tool dedicated to streaming in that format, I would avoid any time-intensive transcoding of the webcam video that could introduce latency into the stream. At first, it looked like this might be a perfect solution to my issues. Once installed on the Raspberry Pi, MJPEGStreamer can be accessed remotely through a web interface, upon which it presents the stream.

To measure latency, I downloaded a stopwatch app on my phone and held the phone next to the screen displaying the video stream over WiFi. By then filming the phone and the stream with another camera, I could check the recording and see the difference between the time on the stopwatch and the time displayed on the stream. Latency was under 500 ms, which I considered acceptable.

Unfortunately, MJPEGStreamer didn't handle low-speed connections well. The software insisted on sending every frame no matter what, so if the connection hit a rough patch, MJPEGStreamer would wait until the frames were sent. This meant that while initial latency was 500 ms, it would blow out to several seconds within just a few minutes of running the stream. This simply wouldn't do, so I resumed my search.

GStreamer is the Swiss Army Knife of Streaming

I eventually came across a utility called GStreamer. It's a real Swiss Army knife of video streaming tools, rolled up into an arcane command-line utility. GStreamer is configured by the use of "pipelines", which are a series of commands specifying where to get video from, how to process and encode it, and then where to send it out. To understand how this works, let's take a look at the pipelines I used in my low-latency application. Let's look at the Raspberry Pi side of things first. This is the GStreamer command I used:

gst-launch -v v4l2src ! "image/jpeg,width=160,height=120,framerate=30/1" ! rtpjpegpay ! udpsink host=192.168.2.3 port=5001

"gst-launch" refers to the GStreamer executable. The "-v" flag enables verbose mode, which tells GStreamer to report absolutely everything that's going on; it's useful when you're trying to troubleshoot a non-functioning stream. "v4l2src" tells GStreamer we want it to grab video from a video capture source, in our case a webcam using the Video4Linux2 drivers. The following section indicates we want to grab the JPEG frames straight from the camera in 160×120 resolution, at 30 frames per second.
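The sending pipeline above pushes RTP-wrapped JPEG frames to a fixed host and port, so the viewing machine needs a matching receive pipeline. The receiver isn't shown in this excerpt, but a minimal sketch of what it might look like is below; the port matches the sender's `udpsink`, while the exact caps string and the choice of `autovideosink` are my assumptions, not taken from the article:

```shell
# Hypothetical receiver sketch for the sender pipeline shown above.
# udpsrc listens on the port the Pi sends to; the caps tell GStreamer
# the incoming RTP payload carries JPEG; rtpjpegdepay unwraps the RTP
# packets, jpegdec decodes the frames, autovideosink displays them.
gst-launch -v udpsrc port=5001 ! \
    "application/x-rtp, encoding-name=JPEG, payload=26" ! \
    rtpjpegdepay ! jpegdec ! autovideosink
```

Run on the machine at 192.168.2.3 (the host named in the sender's `udpsink`), this should open a window showing the 160×120 stream as packets arrive; because it's plain UDP, the receiver can be started before or after the sender.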