Thursday, February 12, 2015

The Saga of Streaming Video

So I set out to play with streaming some video over WiFi. The eventual goal is to have a control interface on my Android phone: tilting the phone will drive Robie forward, back, left, and right. And of course the mobile screen will show whatever Robie is looking at through the Raspi cam that's now secured to his skull.

There are a ton of different options for streaming video from the Pi. At first I thought I'd use the built-in h264 encoder and fire off a compressed stream to my destination. Ideally this stream would use UDP transport so that we could avoid re-transmission of packets. All modern phones have h264 decoders in them as far as I know. So this should be pretty easy, right?

Well, my experiments didn't turn out so well. I was able to get h264 streaming working over TCP transport, but the latency was unacceptable for a real-time application like this. You would wave your hand in front of the camera and then 5 seconds later you'd see it in the video. I tried several approaches: encoding with vlc, with ffmpeg, and with gstreamer. I tried various resolutions as well. All to no avail. Either the stream had too much latency or the stream simply wouldn't play.
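For what it's worth, the stalls are easy to quantify from the client side by logging frame arrival times and looking at the gaps between them. Here's a minimal sketch of that idea in Python (the timestamps in the usage comment are made up for illustration, not measurements from my setup):

```python
def interarrival_gaps(frame_times):
    """Return the gaps (in seconds) between consecutive frame arrival times."""
    return [b - a for a, b in zip(frame_times, frame_times[1:])]

def worst_stall(frame_times):
    """The longest pause between frames: a rough proxy for perceived choppiness."""
    gaps = interarrival_gaps(frame_times)
    return max(gaps) if gaps else 0.0

# e.g. three quick frames then a multi-second freeze:
# worst_stall([0.0, 0.07, 0.13, 5.2]) is about 5.07 seconds
```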

Eventually (through the magic of Google) I came to realize that mjpeg (motion jpeg) streaming was the best approach to use. Intuitively you'd think h264 with hardware encoding would be the way to go. But you'd be wrong.
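The reason mjpeg works here is that every frame is a complete, standalone JPEG; there are no inter-frame dependencies to buffer for. The streamers below serve it over HTTP as a multipart stream, so a client can recover frames by scanning for the JPEG start-of-image (FFD8) and end-of-image (FFD9) markers. A sketch of that (naive on purpose; a real JPEG with an embedded thumbnail can contain an extra EOI, so treat this as illustrative):

```python
SOI = b"\xff\xd8"  # JPEG start-of-image marker
EOI = b"\xff\xd9"  # JPEG end-of-image marker

def extract_jpegs(buf):
    """Return the complete JPEG frames found in buf (raw bytes from the stream)."""
    frames = []
    start = buf.find(SOI)
    while start != -1:
        end = buf.find(EOI, start + 2)
        if end == -1:
            break  # incomplete frame at the tail; wait for more data
        frames.append(buf[start:end + 2])
        start = buf.find(SOI, end + 2)
    return frames
```

Scanning for markers rather than the multipart boundary string means the same client code works regardless of which server is on the other end, since each server picks its own boundary name.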

So there were 2 different mjpeg streamers that I managed to get working nicely:

mjpg-streamer: http://sourceforge.net/projects/mjpg-streamer/
which has a nice tutorial here:
http://wolfpaulus.com/jounal/embedded/raspberrypi_webcam/

and

uv4l: http://www.linux-projects.org/modules/sections/index.php?op=viewarticle&artid=14

Of the two approaches I found uv4l better. It seemed to have lower CPU utilization, and the embedded web application has a nifty configuration screen where you can adjust camera parameters on-the-fly.

Now my latency is almost imperceptible. I can wave my hand in front of the cam and see it on the client end instantaneously. One item of note, though: I did have to lower my resolution and framerate considerably. 640x480@15fps seemed to be the limit for realtime streaming. I opted to lower it even further, to 320x240@15fps. This way I should have fewer issues with bandwidth and more CPU for other tasks.
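The back-of-the-envelope math behind that choice looks something like this. The JPEG compression ratio here (roughly 1:20) is an assumption for illustration, not something I measured, but it shows why dropping to a quarter of the pixels helps so much:

```python
def mjpeg_bandwidth_mbps(width, height, fps, bytes_per_pixel=3, ratio=20):
    """Rough MJPEG bandwidth estimate in megabits per second.

    ratio is an assumed JPEG compression ratio (~1:20), not a measurement."""
    raw_frame = width * height * bytes_per_pixel  # uncompressed frame, bytes
    jpeg_frame = raw_frame / ratio                # assumed compressed size
    return jpeg_frame * fps * 8 / 1_000_000       # bits per second -> Mbps

# Under these assumptions, 640x480@15fps needs about 5.5 Mbps,
# and 320x240@15fps needs a quarter of that, about 1.4 Mbps.
print(round(mjpeg_bandwidth_mbps(640, 480, 15), 2))
print(round(mjpeg_bandwidth_mbps(320, 240, 15), 2))
```

Halving each dimension quarters the bandwidth, which is a nice margin to have on a cheap WiFi dongle.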

One last item to share... My tests were running flawlessly on my Raspi B+. But when I cobbled everything together on my A+, I ran into an issue. When streaming video the Pi works nicely for all of 10-30 seconds. Then, inexplicably, the ping times slow to a crawl and the video gets super choppy, eventually stalling out. It's so bad that sometimes I can't even log in through ssh.

I thought I might have a driver issue with the WiFi dongle. After playing with several settings and getting nowhere, I thought I'd try putting it back in the B+. Now she streams flawlessly again. So I can think of 2 possibilities here:

a) The A+ has 1/2 the memory of the B+, so it may be running into a RAM issue.
b) The A+ has the motor controller, DC-DC step-up board, and the ws2812 modules all pulling power from the same source. The B+ experiment was *only* using the raspicam and the WiFi adapter. So it could be a power-starvation issue.
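Since the first symptom is the ping times slowing to a crawl, one way to catch the failure in the act would be to log round-trip times and flag when they spike. The only fiddly part is pulling the time= field out of standard ping output; this is a sketch of that, with a threshold I picked arbitrarily:

```python
import re

def parse_ping_ms(line):
    """Extract the round-trip time in ms from one line of `ping` output.

    Returns None for lines without a time= field (headers, timeouts)."""
    m = re.search(r"time[=<]([\d.]+)\s*ms", line)
    return float(m.group(1)) if m else None

def is_degraded(times_ms, threshold_ms=100.0):
    """True once the last few pings all exceed a (hypothetical) threshold."""
    recent = [t for t in times_ms if t is not None][-3:]
    return bool(recent) and min(recent) > threshold_ms
```

Feeding the output of a long-running ping through parse_ping_ms and checking is_degraded would timestamp exactly when the A+ falls over, which should make it easier to correlate with what the motors and LEDs were doing at that moment.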

Of the 2 options, b) stands out to me as the first thing to test. Perhaps tonight I'll see where that leads...
