83

I want to make a live stream of (a window on) my Linux desktop using a free streaming site, using the captured video as a fake webcam. There are many tools for this on Windows. ffmpeg lets me capture input on a specific window, but I can't find a way to output the video to a fake webcam-style device usable by Flash.

Can anyone recommend a method (or software) for doing this?

bkconrad
  • 961

6 Answers

93

You can install v4l2loopback. It is a kernel module that simulates a webcam. Load it with:

modprobe v4l2loopback

Then you need to send the video stream to the device /dev/videoN (where N is the number that corresponds to the freshly created device - probably the highest number) using a program like ffmpeg. In order to capture the desktop and forward it to /dev/videoN with ffmpeg, you can use the following command line:

ffmpeg -probesize 100M -framerate 15 -f x11grab -video_size 1280x720 -i :0.0+0,0 -vcodec rawvideo -pix_fmt yuv420p -f v4l2 /dev/videoN

Change the value of -framerate from 15 to something else if you want a different frame rate.

The resolution is chosen in the -video_size parameter. If you want to specify an offset from the upper-left corner of the screen, pass it in the -i parameter in the form -i :0.0+x,y, where x and y are the horizontal and vertical offset respectively.
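As a convenience sketch (the device node /dev/video2 here is an assumption — use whichever node v4l2loopback actually created on your system), the tunables can be kept in shell variables so the command is easy to adapt:

```shell
# All values here are examples -- adjust to your setup.
FPS=15                  # frame rate
SIZE=1280x720           # capture resolution
X=0; Y=0                # offset from the upper-left corner of the screen
DEV=/dev/video2         # the v4l2loopback device node (assumption)
CMD="ffmpeg -probesize 100M -framerate $FPS -f x11grab -video_size $SIZE -i :0.0+$X,$Y -vcodec rawvideo -pix_fmt yuv420p -f v4l2 $DEV"
echo "$CMD"             # inspect the command, then run it with: eval "$CMD"
```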

Suuuehgi
  • 267
13

Use v4l2loopback with mplayer.

  1. Download it,

  2. compile it (make and su -c 'make install'),

  3. load the module with su -c 'modprobe v4l2loopback',

  4. then change one line in the file examples/yuv4mpeg_to_v4l2.c of the v4l2loopback source folder from

    v.fmt.pix.pixelformat = V4L2_PIX_FMT_YUV420;
    

to

    v.fmt.pix.pixelformat = V4L2_PIX_FMT_YVU420;
  5. and run make in this folder.

  6. Then run it from the examples directory like this:

    mkfifo /tmp/pipe  # only needed once, as long as you do not delete the file /tmp/pipe
    ./yuv4mpeg_to_v4l2 < /tmp/pipe &
    mplayer movie.mp4 -vf scale=480:360 -vo yuv4mpeg:file=/tmp/pipe
    

where you replace movie.mp4 with the name of your video file, and replace /dev/video0 with your loopback device if it has a different number.
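The one-line edit in step 4 can also be done with sed; shown here on a scratch copy of the line (the real file is examples/yuv4mpeg_to_v4l2.c in the v4l2loopback source tree — run the same sed against it before running make):

```shell
# Demonstrate the substitution on a scratch file:
printf 'v.fmt.pix.pixelformat = V4L2_PIX_FMT_YUV420;\n' > /tmp/pixfmt_demo.c
sed -i 's/V4L2_PIX_FMT_YUV420/V4L2_PIX_FMT_YVU420/' /tmp/pixfmt_demo.c
cat /tmp/pixfmt_demo.c
```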

MPlayer can play web streams, all kinds of video files, and even input from stdin. I just tested it with a file from http://www.tagesschau.de, a German news site.

TS=$(wget "http://www.tagesschau.de/multimedia/video/" -q -O - | grep --regexp='http.*\.webm"' | sed -e 's%.*href="%%' -e 's%\.webm".*%\.webm%')
./yuv4mpeg_to_v4l2 < /tmp/pipe &
mplayer $TS -vf scale=480:360 -vo yuv4mpeg:file=/tmp/pipe
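The wget/grep/sed pipeline above is plain text munging; on a made-up sample of the expected markup it behaves like this (the sample HTML line is invented for illustration):

```shell
# Invented sample of the markup the pipeline above looks for:
LINE='<a href="http://download.example.org/video/clip.webm">clip</a>'
# Same grep/sed steps as above, applied to the sample line:
URL=$(printf '%s\n' "$LINE" | grep 'http.*\.webm"' | sed -e 's%.*href="%%' -e 's%\.webm".*%\.webm%')
echo "$URL"
```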

Instead of $TS you could put a - (which stands for stdin) and, in front of mplayer, your ffmpeg command with its output redirected to stdout. Something like:

./yuv4mpeg_to_v4l2 < /tmp/pipe &
ffmpeg -someOptions ... -o - | mplayer - -vf scale=480:360 -vo yuv4mpeg:file=/tmp/pipe

I did not test the last one, because you did not say what your ffmpeg command looks like.

erik
  • 2,028
4

Without using ffmpeg, this is what worked for me (Ubuntu 20.04):

  1. Install OBS : https://obsproject.com/download
  2. Install the v4l2loopback module: https://github.com/umlaeute/v4l2loopback#run
  3. Load the module: sudo modprobe v4l2loopback devices=1 video_nr=10 card_label="OBS Cam" exclusive_caps=1 (video_nr sets the device number; it becomes /dev/video10 in this example)
  4. Install obs-v4l2sink: (deb package) https://github.com/CatxFish/obs-v4l2sink/releases
  5. Install libobs-dev (not sure if needed)
  6. Link the library to the correct directory: ln /usr/lib/obs-plugins/v4l2sink.so /usr/lib/x86_64-linux-gnu/obs-plugins/
  7. Then follow: https://github.com/CatxFish/obs-v4l2sink/

NOTE: remember to use the device you specified, like: /dev/video10
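To check that the module loaded with the expected label, you can read the device's name file in sysfs (the path below is how v4l2loopback devices normally appear; treat it as an assumption and adjust the number to your video_nr):

```shell
DEV_NR=10   # matches video_nr=10 above
NAME_FILE="/sys/devices/virtual/video4linux/video${DEV_NR}/name"
if [ -e "$NAME_FILE" ]; then
  cat "$NAME_FILE"   # should print the card_label, e.g. "OBS Cam"
fi
```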

lepe
  • 798
  • 9
  • 17
4

What distro are you using? I've had success with WebCamStudio under Arch combined with the Livestream web-based "studio." It's been a little while since I've used it, though.

http://www.ws4gl.org/

What are you trying to do exactly? ffmpeg compiled with x11grab can record the desktop. I've had limited success pushing that to Ustream, but again it's been a while and I think what I was doing won't work anymore.

If you just want to stream a file rather than your desktop (when you say "a window", I'm guessing you mean VLC), I can point you in the right direction to get that working with Livestream (and maybe Ustream). I figured this out clumsily through experimentation; it's not fantastic, but it works with Livestream.

Justin.tv has scripts that can stream from VLC to their service, as well.

http://apiwiki.justin.tv/mediawiki/index.php/Linux_Broadcasting_API

3

First, appear.in probably does what you want without any hassle (I'm not affiliated): http://appear.in/

Second, you can stream to Twitch or other services using OBS, which recently added Linux support(!): https://obsproject.com/

OBS also solves the much harder problem of muxing system sound and audio input while screen capturing on Ubuntu (not solved by anything in the universe repo that I've found so far).

I don't have any awesome unix-y solutions. But those worked for me in the real world.

random
  • 15,201
bkconrad
  • 961
1

Another way to simulate a real-world live camera is to use udp://. For example:

# (make sure you use the correct screen number, in my case it was :1, not :0)
ffmpeg -re -f x11grab -r 15 -s 1280x720 -i :0.0+0,0 -map 0:v -c:v libx264 -f mpegts udp://localhost:50000

The video is received by:

ffmpeg -i udp://localhost:50000 -f mpegts video.ts

Furthermore, if your purpose is simply to simulate a live camera (as for testing a computer vision pipeline), you could use the native frame rate -re option, the looping -stream_loop -1 option and a static file:

VIDEO=./static-video.mp4
ffmpeg -re -stream_loop -1 -i $VIDEO -map 0:v -f mpegts udp://localhost:50000

Official ffmpeg info on the -re option:

-re (input) Read input at native frame rate. Mainly used to simulate a grab device, or live input stream (e.g. when reading from a file). Should not be used with actual grab devices or live input streams (where it can cause packet loss). By default ffmpeg attempts to read the input(s) as fast as possible. This option will slow down the reading of the input(s) to the native frame rate of the input(s). It is useful for real-time output (e.g. live streaming).
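As a rough illustration of the pacing -re provides, this loop emits one fake "frame" per interval instead of as fast as possible (the numbers are arbitrary and the timing only approximate):

```shell
FPS=2
FRAMES=4
# Interval between frames in seconds (0.500 for 2 fps):
INTERVAL=$(awk "BEGIN { printf \"%.3f\", 1/$FPS }")
START=$(date +%s)
i=1
while [ "$i" -le "$FRAMES" ]; do
  echo "frame $i"       # a real sender would emit a video frame here
  sleep "$INTERVAL"     # throttle to the "native" rate, like -re does
  i=$((i + 1))
done
ELAPSED=$(( $(date +%s) - START ))
echo "sent $FRAMES frames in about ${ELAPSED}s"
```

Without the sleep (i.e. without -re), all frames would be pushed out immediately, which is fine for transcoding but wrong when simulating a live source.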

Gooshan
  • 111