Using Farstream for videoconferencing

I see a lot of comments about GStreamer's videoconferencing abilities, so I want to continue this topic and say a few words about the Farstream project.

GStreamer is one of Collabora's flagship projects. The company develops a number of other projects on top of GStreamer, and Farstream is the one dedicated to audio and video conferencing. With Farstream, businesses can easily create a multi-platform, multi-user audio/video conference solution that supports a wide range of codecs. Simply put, Farstream (formerly Farsight) is an advanced VoIP/video streaming engine capable of dealing with all major audio/video conferencing protocols.

Quote from the project description:

Philippe Kalaf began the Farstream project in 2005. He was aiming to create a simple-to-use framework that implemented multiple audio/video streaming protocols.
Most of the initial work on Farstream focused on the protocol-agnostic API and the RTP plugin. Over the years, the RTP plugin has matured into a very complete RTP stack, including support for advanced features such as multi-user conferencing, RTCP-based lip-sync, on-the-fly codec switching and many others. The Farstream team then started working on integrating libnice's ICE stack, which provides close to 95% NAT traversal capability. Alternatively, Farstream's RTP plugin can use raw UDP or multicast UDP, as well as the Google and MSN flavors of ICE that are currently deployed.
See more on the project wiki.

Let's go straight to practice. Install the python-farsight package and clone the sources:

$ sudo apt-get install python-farsight
$ git clone git://git.collabora.co.uk/git/farsight2.git

Go to farsight2/examples/gui and start fs2-gui.py:

$ cd farsight2/examples/gui
$ ./fs2-gui.py
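If the GUI does not come up, it may be worth first checking that your GStreamer installation can render video at all. A minimal sanity check, assuming GStreamer 0.10 with gst-launch on the PATH:

```shell
# Render a synthetic test pattern in a window (Ctrl+C to stop).
# If this fails, the problem is in GStreamer itself, not in Farstream.
gst-launch videotestsrc ! ffmpegcolorspace ! autovideosink
```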

In this application you can start either a server or a client instance:



As a parameter you can pass the application a GStreamer source element such as v4l2src; by default it uses test sources. The application is not particularly stable or polished, but don't forget that it's just an example.
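For instance, assuming fs2-gui.py takes the source as its first argument (a sketch only; versions of the example may differ in how they interpret it, and one comment below passes a device path instead of an element name):

```shell
# Default: synthetic audio/video test sources
./fs2-gui.py

# Use a V4L2 webcam as the video source instead
./fs2-gui.py v4l2src
```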

That's basically all; see the sources as a reference for using Farstream in your own applications. In working mode the application looks like this:


The Farstream library is licensed under the LGPL 2.1 or later, so it can be used in commercial applications, but if you modify the library itself you have to contribute your changes back. Good luck!

Comments

  1. Hi Alex,
    I did some experiments with GStreamer,
    and now I'm trying to learn Farstream.
    Can you suggest some examples/tutorials that would help me?
    Thanks :)

  2. The Farstream sources contain some examples; you can also try to find and read blogs of the Farstream developers, like this one.
    Unfortunately, it's difficult to find much information about these topics, and as with any OSS project, you have to read the sources to learn how to deal with it.

  3. Hi Alex,

    I was trying the example app to understand Farstream. I used v4l2src as the source (with the intent to use the integrated camera in my Lenovo T410), but I could only see an empty box. Any idea?

    Also, I would like to create a video conferencing app with browsers as the interface. How do I go about doing that? Where do I start?

    Thanks,
    Jagadish D

    1. Unfortunately v4l2 doesn't support all webcams natively; you may have to find a workaround, use an external camera, or recompile v4l2 (or even the kernel) to get the camera working.
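A quick way to narrow this down is to check whether the kernel exposes the camera at all, and whether GStreamer can read from it directly; a rough sketch, assuming the camera is the first V4L2 device:

```shell
# Which V4L2 device nodes has the kernel created?
ls -l /dev/video*

# Is the camera visible on the USB bus?
lsusb

# Can GStreamer 0.10 render it directly? If this fails, Farstream will too.
gst-launch v4l2src device=/dev/video0 ! ffmpegcolorspace ! autovideosink
```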

      And I didn't quite catch the phrase "video conferencing app with browsers as the interface". Web browsers are a completely different story: you have to use plugins like Adobe Flash or MS Silverlight (Mono Moonlight?) to get that working.

    2. Thanks for a quick response.

      The reason I was expecting my laptop's integrated camera to work is that the following GStreamer command worked for me:

      $ gst-launch v4l2src ! video/x-raw-rgb,width=320,height=240 ! ffmpegcolorspace ! xvimagesink

      By "video conferencing app with browsers as the interface" I meant using browsers at the terminals/clients (PC1 and PC2) for capturing and viewing the video, with GStreamer/Farstream at the server to facilitate the audio/video transfer.

      I did try this with Flash for capture & display and an rtmplite server for media transfer, but rtmplite's performance is very poor when loaded with two or more users.

      Please correct me if my understanding of any of the above terms is wrong. It would be great if you could suggest better ideas.

      Thanks a ton!
      Jagadish D

    3. 1. About the webcam: in your gst-launch command you used conversion elements in the pipeline. It looks like that's the reason you have problems with Farstream: you have to transform your video stream into a suitable format before further processing. Look at the sources and the options; I believe there is a way to do it.

      2. In video conferences there are usually two bottlenecks: codec performance and network bandwidth. New versions of the Flash plugin use decent codecs, so there shouldn't be problems there. But as far as I understand, rtmplite does transcoding, and that can be a problem. Maybe some options can modify this behaviour.


      But in any case you can't use a web browser as a viewer with Farstream; it requires a standalone application. Plugins and browsers usually make things worse performance-wise, and if you want HD-quality video chat with more than two people, a dedicated application looks like the only option.

    4. Thanks a lot Alex, this helps.

  4. For webcam display:

    1. connect the webcam to your client;
    2. test the USB connection from the command line with lsusb;
    3. run the Python GUI with:

    $ cd farsight2/examples/gui

    $ ./fs2-gui.py /dev/video0


    For webcam no. 2, use:

    $ ./fs2-gui.py /dev/video1


    Good Luck
