Image Transmission Protocol

This page describes how the image streaming functionality works and covers both the implementation details (on the MAV and in QGroundControl) and the communication between the MAV and QGroundControl.

The image transmission protocol consists of two modules: an image streaming component and a video streaming component.

The main advantage of the image streaming component over the video streaming component is its tighter integration into QGroundControl. The main disadvantage is that it requires MAVLink support (and is therefore not as cross-platform as the video streaming component).

Communication

Image streaming

The image streaming component uses two MAVLink messages: a handshake message, DATA_TRANSMISSION_HANDSHAKE, to initiate, control and stop the image stream; and a data container message, ENCAPSULATED_DATA, to transport the image data (see picture on the right).

(1) The communication is initiated by QGroundControl with a request to start the stream. To do so, the following fields must be set in the MAVLink message:

It is possible to request a specific image quality by setting the quality field. All other fields should be zero in the initial request.
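
As a rough illustration, the initial request could be assembled as in the sketch below. The struct and field names (state, quality, frequency and so on) simply mirror the wording on this page rather than the exact names in the generated MAVLink headers, the assumption that the requested frame rate is carried in a frequency field follows from steps (2) and (4) below, and send_to_mav() is a placeholder for the ground station's actual pack-and-send call.

    #include <cstdint>
    #include <cstdio>

    // Hypothetical mirror of the DATA_TRANSMISSION_HANDSHAKE fields described on this page;
    // the field names of the real generated MAVLink headers may differ.
    struct ImageStreamHandshake {
        uint8_t  state     = 0;  // 0 = request, 1 = ACK
        uint32_t size      = 0;  // total size of the next image in bytes (filled in by the MAV)
        uint16_t packets   = 0;  // number of ENCAPSULATED_DATA packets   (filled in by the MAV)
        uint8_t  payload   = 0;  // payload bytes per packet              (filled in by the MAV)
        uint8_t  quality   = 0;  // requested image quality
        uint8_t  frequency = 0;  // assumed to carry the requested frame rate in Hz; 0 stops the stream
    };

    // Placeholder for the ground station's actual MAVLink pack/send call.
    void send_to_mav(const ImageStreamHandshake& h) {
        std::printf("handshake: state=%d quality=%d frequency=%d\n", h.state, h.quality, h.frequency);
    }

    int main() {
        // (1) Initial request: only the quality (and the desired frame rate) is set,
        //     all other fields stay zero.
        ImageStreamHandshake request;
        request.quality   = 80;  // example quality value
        request.frequency = 1;   // one image per second
        send_to_mav(request);
    }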

(2) When the targeted MAV receives the handshake request, it sends back an acknowledgment and starts the image stream at the requested frame rate. The handshake ACK packet normally contains the same values as requested by QGroundControl (with state set to 1, because it is an ACK) and adds information about the size of the next image to be sent:

(3) The image data is then split into chunks that fit into normal MAVLink messages. These chunks are packed into ENCAPSULATED_DATA packets and sent over MAVLink. Every packet contains a sequence number as well as the ID of the image stream it belongs to. The image streamer now sends new images periodically; no further interaction is needed. Every new image is announced by a new DATA_TRANSMISSION_HANDSHAKE ACK packet with updated image size, packets and payload fields. After this ACK packet, the new image arrives as a series of ENCAPSULATED_DATA packets. Note: the sequence number starts at 0 for every new image of the stream.
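
On the MAV side, steps (2) and (3) together amount to "announce the image, then stream numbered chunks". The sketch below illustrates that logic only: the structs mirror the two messages as described above (with a 253-byte chunk payload assumed), and send_handshake()/send_packet() stand in for the real MAVLink transmit calls.

    #include <algorithm>
    #include <cstdint>
    #include <cstring>
    #include <vector>

    // Hypothetical mirrors of the two messages described above; real field names may differ.
    struct ImageStreamHandshake {
        uint8_t  state;    // 1 = ACK / image announcement
        uint32_t size;     // total image size in bytes
        uint16_t packets;  // number of ENCAPSULATED_DATA packets that will follow
        uint8_t  payload;  // payload bytes per packet
    };

    struct EncapsulatedData {
        uint8_t  stream_id;  // ID of the image stream the chunk belongs to
        uint16_t seqnr;      // restarts at 0 for every new image
        uint8_t  data[253];  // chunk payload (253 bytes assumed here)
    };

    void send_handshake(const ImageStreamHandshake&) { /* placeholder for the MAVLink send */ }
    void send_packet(const EncapsulatedData&)        { /* placeholder for the MAVLink send */ }

    // (2) + (3): for every new (e.g. JPEG-encoded) image, announce its size,
    // then stream it as a series of numbered chunks.
    void send_image(const std::vector<uint8_t>& image, uint8_t stream_id) {
        const uint8_t payload = sizeof(EncapsulatedData::data);

        ImageStreamHandshake ack = {};
        ack.state   = 1;
        ack.size    = static_cast<uint32_t>(image.size());
        ack.payload = payload;
        ack.packets = static_cast<uint16_t>((image.size() + payload - 1) / payload);
        send_handshake(ack);

        for (uint16_t seq = 0; seq < ack.packets; ++seq) {
            EncapsulatedData pkt = {};
            pkt.stream_id = stream_id;
            pkt.seqnr     = seq;  // sequence number starts at 0 for every image
            const std::size_t offset = static_cast<std::size_t>(seq) * payload;
            const std::size_t n = std::min<std::size_t>(payload, image.size() - offset);
            std::memcpy(pkt.data, image.data() + offset, n);
            send_packet(pkt);
        }
    }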

(4) To stop an image stream, send a new DATA_TRANSMISSION_HANDSHAKE request packet with the frequency set to 0. The MAV acknowledges this by sending back an ACK packet containing the same data as the request.
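
The corresponding receiving side in the ground station is essentially the mirror image: allocate a buffer when a new image is announced, copy each chunk to its offset, and hand the buffer to the decoder once all packets have arrived. Again only a sketch, using the same hypothetical struct names as above:

    #include <algorithm>
    #include <cstddef>
    #include <cstdint>
    #include <cstring>
    #include <vector>

    // Same hypothetical mirrors as in the sketches above; real MAVLink field names may differ.
    struct ImageStreamHandshake { uint8_t state; uint32_t size; uint16_t packets; uint8_t payload; };
    struct EncapsulatedData     { uint8_t stream_id; uint16_t seqnr; uint8_t data[253]; };

    class ImageAssembler {
    public:
        // Called for every DATA_TRANSMISSION_HANDSHAKE ACK: a new image is announced.
        void on_handshake(const ImageStreamHandshake& ack) {
            image_.assign(ack.size, 0);
            payload_  = ack.payload;
            expected_ = ack.packets;
            received_ = 0;
        }

        // Called for every ENCAPSULATED_DATA packet; returns true once the image is
        // complete and can be decoded (e.g. as JPEG) and displayed.
        bool on_chunk(const EncapsulatedData& pkt) {
            const std::size_t offset = static_cast<std::size_t>(pkt.seqnr) * payload_;
            if (offset >= image_.size()) return false;  // stale or corrupt packet
            const std::size_t n = std::min(payload_, image_.size() - offset);
            std::memcpy(image_.data() + offset, pkt.data, n);
            return ++received_ == expected_;
        }

        const std::vector<uint8_t>& image() const { return image_; }

    private:
        std::vector<uint8_t> image_;
        std::size_t payload_ = 0, expected_ = 0, received_ = 0;
    };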

Video streaming

The video transmission protocol is much simpler than the image streaming one: it consists of just one MAVLink message, VIDEO_STREAM, which is used to start and stop the video stream (see picture on the right).

The message has two fields to set:

The video stream is generated by FFmpeg on the MAV side. A small MAVLink wrapper grabs the camera image, adds the (U and V) chroma channels required by the YUV420 raw image format and feeds the result into FFmpeg. The output is then sent to the ground station (note: at the moment this requires a fixed IP address for the ground station as well as one initial configuration step when setting up the MAV). Upon receiving the video stream, QGroundControl opens a VLC window to redistribute it: it takes the stream from the MAV and offers it to the network both as an RTP stream (on a multicast address) and as an HTTP stream (for direct unicast streaming). This is done without transcoding the original stream, to keep the performance impact as low as possible.
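
The chroma padding mentioned above can be sketched as follows: assuming the camera delivers a greyscale (luminance-only) image, a valid YUV420 planar buffer is obtained by appending two neutral chroma planes at quarter resolution. This illustrates the idea only and is not the actual mavconn wrapper code.

    #include <cstdint>
    #include <vector>

    // Turn a greyscale (Y-only) camera image into a YUV420 planar buffer by appending
    // neutral chroma planes (value 128 = no colour), ready to be fed to FFmpeg as raw video.
    std::vector<uint8_t> grey_to_yuv420(const std::vector<uint8_t>& grey,
                                        unsigned width, unsigned height) {
        std::vector<uint8_t> yuv;
        yuv.reserve(width * height * 3 / 2);
        yuv.insert(yuv.end(), grey.begin(), grey.end());         // Y plane: the image itself
        yuv.insert(yuv.end(), (width / 2) * (height / 2), 128);  // U plane, quarter resolution
        yuv.insert(yuv.end(), (width / 2) * (height / 2), 128);  // V plane, quarter resolution
        return yuv;
    }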

Other mobile devices can now connect to the stream on the multicast address 239.255.12.45, or to the HTTP stream on http://[QGC-HOST]/MAVLive.mpg. The multicast stream is announced via SAP under the name “MAVLive”.

Usage / Configuration

To use the two modules on your MAV, follow these steps.

Image streaming

  1. Compile the mavconn middleware for your MAV (see the build guide).
  2. Start at least these components on the MAV:
    px_mavlink_bridge_udp &
    px_system_control --heartbeat &
    px_camera -o lcm &
  3. Compile and start QGroundControl
  4. Start the image streaming component (you can add the -v flag to see some more output):
    px_imagestreamer
  5. Initiate the image stream: Open the HUD widget, right-click into the widget and choose “Enable live Image Streaming”.

You should now see the live image feed at one image per second (the current hard-coded default).

Video streaming

  1. Perform steps 1 to 3 as in the image streaming part above
  2. Create a symlink in your home directory:
    cd ~
    ln -s mavconn/src/comm/video/px_videostreamer.sh px_videostreamer.sh

    (note: you could also copy the file, though that is not recommended.)

  3. Start the video streaming component on the MAV:
    px_videostreamer
  4. Initiate the video stream: Open the HUD widget, right-click into the widget and choose “Enable Video Live feed”.

A VLC window should now open. Don't close that window as long as you want to stream the video to others! If you want to watch the current stream, just open the stream in another VLC window.

Developer

Out of the box, the image streaming component only implements JPEG streaming of the camera image. To implement your own image stream, you have to do the following: