OMX EZSDK Examples

From Texas Instruments Wiki

OMX Examples Provided in EZSDK

EZSDK includes sample applications that exercise the OpenMAX components provided in EZSDK.

  1. Decode_Display Example
  2. Capture_Encode Example
  3. Adec_Snt Example
  4. C6xtest Example
  5. Decode_MosaicDisplay Example

Decode_Display

Description

This example uses three OMX components, the Video Decoder Component (VDEC), the Video Frame Processing Component (VFPC) and the Video Frame Display Component (VFDC), to create a simple application that decodes an H264 elementary stream, scales it and displays it. The application is an IL client running on the A8 processor under Linux. The decoder component runs on HDVICP2, while the frame processing and display components run on the HDVPSS part of the DM81xx media controller.

The OpenMAX VDEC component takes a buffer containing a single frame of the H264 elementary stream and, after decoding, returns the buffer to the A8. The elementary-stream chunking logic is implemented as an H264 stream parser running on the A8. The file-read logic is implemented within the application, and the parser provides the buffers to the VDEC component via the standard OpenMAX non-tunneled APIs. Please note that the parser provided in the application is for demonstration only and has been tested with a limited number of streams.

The output of the decoder is fed to the VFPC component. The VFPC component implements various frame processing modules; each module has a different OpenMAX component name to distinguish it. In this example the Scalar part of the VFPC is used. The Scalar performs only chroma conversion here; video scaling is not used. Chroma conversion is required because the VDEC component produces output in 420 format, while the display component takes its input in 422 format.

By default the IL client configures the display to output video on the on-chip HDMI port. This can be routed to the HDDAC port by changing the display ID in the IL client. As the DM8168 supports two HD displays, applications can use the on-chip HDMI for one display and the HDCOMP port for another. The OMX VFDC component provides the interface to the HDVPSS display module. Along with the OpenMAX display module (VFDC), a control module is also provided as an OpenMAX component for controlling the various display modules. The control module has no buffers or ports; it provides OpenMAX APIs for configuration purposes only.

The following diagram represents this decode and display use case.
Figure 1: Decode - Display application

The application takes width, height, frame rate and file name as input arguments. Frame-rate control allows the decoder to run at the frame rate specified as an argument. The display runs at 60 frames per second, so the maximum frame rate is 60 fps.

How to Build

To build the application, the SDK needs to be installed on a Linux host machine.

Note that all dependencies must be built beforehand; the following commands take care of this.

 $ cd <ezsdk_root>
 $ make components 
 $ make omx

 

This creates the A8 Linux binary in the component-sources/omx_05_02_00_0x/bin/decode_display/bin/ti816x-evm/ folder.


How to Run

As this example by default displays on the HDMI port, the user should plug a display device (TV) into the base board HDMI port. The display device should support 1080p60. On running the application, the user should see the video playing on the connected display device.

The following steps are required to run the application (these steps assume that all the binaries are stored in the same folder). Please note that the SDK might autoload the binaries as part of the demonstration application; this should be disabled before following the procedure below. It is assumed that the EVM has been booted and the user has logged in as root. If the init sequence in the SDK is not disabled, the following steps are not required, and the user can proceed directly to the Matrix GUI disabling step.


• Run the PRCM configuration utility
 # /prcm_config_app s

• Insert syslink module
 # insmod syslink.ko


• Load the firmware using the firmware_loader utility provided in the SDK.
(By default the Linux init scripts might already load the firmware, so care needs to be taken.)

i. #  ./firmware_load 1 dm816x_hdvicp.xem3 start
ii.#  ./firmware_load 2 dm816x_hdvpss.xem3 start

• Insert HDMI controller / FB dev driver
i.  #  insmod vpss.ko vpss_slaveloader=0xbfb0000
ii. #  insmod ti81xxhdmi.ko
iii.#  insmod ti81xxfb.ko
 

If the Matrix GUI is running as part of the SDK initialization scripts, it needs to be turned off:

# /etc/init.d/matrix-gui-e stop
# /etc/init.d/pvr-init stop


• Run the application

# ./decode_display_a8host_debug.xv5T -i sample.h264 -w 1920 -h 1080 -f 60 -g 0


If the application has modified the display ID to use HDCOMP, HDCOMP needs to be configured for 1080p-60 mode before running the application:

# echo 1080p-60 >/sys/devices/platform/vpss/display3/mode

Capture_Encode

Description

This example uses four OMX components, the Video Encode Component (VENC), Video Frame Processing Component (VFPC), Video Frame Display Component (VFDC) and Video Capture Component (VFCC), to create a simple application that captures video, encodes it to an H264 elementary stream and also displays it. The capture, display and frame processing components run on the HDVPSS part of the media controller, while the encoder runs on HDVICP2.

This example requires a DM8168 base EVM with the EIO daughter card attached. A video source that can produce 1080p60 component output is also required (the example has an option to specify a 720p60 source as well).

In this application, the capture component captures the 1080p60 input using the TVP7002 decoder on the Catalog EIO board (the component input connectors are labeled J5/J6/J7 on the EIO card). This data is fed to the DEI VFPC component, which produces two outputs: one is given to the display component, while the other is fed to the encoder component running on HDVICP2. Since capture is done in 420 progressive format, the DEI algorithm is not turned on. For interlaced capture, the DEI can be used for de-interlacing.

The application takes mode (1080p/720p), frame rate, bit rate and file name as input arguments. By default the display is on the on-chip HDMI port. The encoder bit rate can be varied through the arguments passed to the application; more encoding parameters can be changed in the IL client through OMX APIs. Capture always runs at 60 fps, irrespective of the frame rate specified. If the frame rate is set to 30, the capture component skips frames as per the mask specified in the example and provides 30 frames per second to the encoder. The source should therefore provide 60 fps.

The following diagram shows the component data flow.

Omx encode.png


How to Build

To build the application, the SDK needs to be installed on a Linux host machine.

Note that all dependencies must be built beforehand.

 $ cd <ezsdk_root>
 $ make components 
 $ make omx


The above steps create the A8 Linux binary in the component-sources/omx_05_02_00_0x/bin/capture_encode/bin/ti816x-evm/ folder.

How to Run

As this example by default displays on the HDMI port, the user should plug a display device (TV) into the base board HDMI port. The display device should support 1080p60. A video source should be connected to the component input port of the EIO card and should provide 1080p60 on its component output; otherwise the EVM will not be able to detect the capture signal. On running the application, the user should see the video playing on the connected display device, and the encoded output will be written to the given file.

The following steps are required to run the application (these steps assume that all the binaries are stored in the same folder). Please note that the SDK might autoload the binaries as part of the demonstration application; this should be disabled before following the procedure below. It is assumed that the EVM has been booted and the user has logged in as root. If the init sequence in the SDK is not disabled, the following steps are not required, and the user can proceed directly to the Matrix GUI disabling step.

• Run the PRCM configuration utility
 # /prcm_config_app s

• Insert syslink module
 # insmod syslink.ko


• Load the firmware using the firmware_loader utility provided in the SDK.
(By default the Linux init scripts might already load the firmware, so care needs to be taken.)

i. #  ./firmware_load 1 dm816x_hdvicp.xem3 start
ii.#  ./firmware_load 2 dm816x_hdvpss.xem3 start

• Insert HDMI controller / FB dev driver
i.  #  insmod vpss.ko vpss_slaveloader=0xbfb0000
ii. #  insmod ti81xxhdmi.ko
 

If the Matrix GUI is running as part of the SDK initialization scripts, it needs to be turned off:

# /etc/init.d/matrix-gui-e stop
# /etc/init.d/pvr-init stop


• Run the application

# ./capture_encode_a8host_debug.xv5t -o test.data -m 1080p -f 60 -b 10000000 -n 500 


If the application has modified the display ID to use HDCOMP, HDCOMP needs to be configured for 1080p-60 mode before running the application:

# echo 1080p-60 >/sys/devices/platform/vpss/display3/mode


IL Client / Application

To create, configure and connect the OpenMAX components, the application is written as an Integration Layer (IL) client. The IL client invokes OpenMAX APIs on the different components. In this application the components allocate the video buffers in response to OMX API calls from the IL client. The IL client is responsible for taking buffers from one component and passing them to another.

Creating and Configuring the Components

For creation and configuration, the following OpenMAX APIs are used:

  • OMX_GetHandle
  • OMX_GetParameter
  • OMX_SetParameter
  • OMX_AllocateBuffer
  • OMX_UseBuffer

For enabling ports and changing component state, the following OpenMAX API is used:

  • OMX_SendCommand

The following flow chart provides a brief overview of the OMX API flow for creating and configuring components. OpenMAX state changes are performed to start buffer communication.

Openmax CreateStateChange.JPG 


Buffer Communication between components

The following OpenMAX APIs are used to pass buffers to the components:

  • OMX_EmptyThisBuffer
  • OMX_FillThisBuffer

The component returns buffers to the IL client via callbacks in response to the above data APIs. The IL client implements the callback functions, which are invoked when the component returns the buffers. The following flow chart depicts the buffer handling inside the IL client.

Openmax ilclient.jpg 


Tear-down Sequence

After the input file is played, the IL client tears down the components. For tear-down, the IL client changes the state of each component and frees the buffers. This is depicted in the following flow chart.

OpenMax dataflow.jpg 


Adec_Snt

Description

This example uses the Audio Decoder Component (ADEC) to decode compressed stereo audio streams using a choice of supported DSP audio decoders, and write the decoded output data (L/R interleaved PCM samples) into a file. This application is an IL Client running on the A8 processor with the Linux operating system. The ADEC component runs on the C674x DSP of the DM81xx. OpenMax API call sequences for init, state changes and teardown are the same as for the VDEC component configured for "Standard Non-Tunneled" mode.

The OpenMax ADEC component takes a buffer, which contains one or more frames of encoded audio data, and after a decoding cycle passes the output buffer containing decoded data back to the A8. Buffers are exchanged with the component via the standard asynchronous OpenMAX non-tunneled APIs OMX_EmptyThisBuffer and OMX_FillThisBuffer. These buffers are expected to be physically contiguous, and allocated from a region of memory that is configured as cached from the DSP side. User applications may use the OMX_AllocateBuffer API to allocate from the preconfigured DSP-cached shared memory region, as done in this example. The number of bytes consumed from the input buffer and the number of bytes written to the output buffer are returned in the buffer headers passed to the application via the corresponding callback APIs. Input file read, buffer offset tracking & fills, and output file write logic is implemented within the application.

For audio decoders, the number of bytes consumed from the encoded bitstream and/or the number of output samples for each decoding cycle may not be known prior to that cycle. The ADEC component implementation reuses the SysLink shared memory plus MessageQ-based RCM infrastructure developed for the VDEC-SNT component, for buffer passing between the A8 and DSP. Therefore, it can handle a single input and single output buffer at a time. The adec_snt example implementation illustrates this.

Adec_Snt application


The application takes the input filename, output filename and the decoder type as command-line arguments. The input file must contain only the encoded audio bitstream; any metadata or container data, e.g. ID3 tags, MP4 headers etc., must not be sent to the ADEC component. Decoded output is written to the corresponding <filename>.pcm file in the same folder.

How to Build

To build the application, the SDK needs to be installed on a Linux host machine.

  $ make omx

This creates the binary in the component-sources/omx_05_02_00_0x/bin/adec_snt/bin/ti814x-evm/ or component-sources/omx_05_02_00_0x/bin/adec_snt/bin/ti816x-evm/ folder.

Note that all dependencies must be built beforehand:

 $ cd <ezsdk_root>
 $ make components 


How to Run

The following steps are required to run the application (these steps assume that all the binaries are stored in the same folder). Please note that the SDK might autoload the binaries as part of the demonstration application; this should be disabled before following the procedure below. It is assumed that the EVM has been booted and the user has logged in as root.


• Insert syslink module
 # insmod syslink.ko


• Load the Firmware using firmware_loader utility provided in SDK. 
(This must be done only if the DSP firmware was not loaded in the Linux init scripts)

 #  ./firmware_loader 0 dm816x_c6xdsp.xe674 start

• Run the application

# ./adec_snt_a8host_debug.xv5T -i sample.aac -o output.pcm -c aaclc

If run without arguments, the application prints the command line description to the standard output.

The output file will be named <input filename>.pcm e.g. decoded output for file input.aac will be written to input.aac.pcm. These .pcm files can be viewed and played in PCM editors like "GoldWave" or "CoolEdit".


C6xtest

Description

This example uses one OMX component, the Video Loopback Component (VLPB), to create a simple application that copies a buffer from one memory location to another. The application is an IL client running on the A8 processor under Linux. The VLPB component runs on the DSP (C67x) part of the DM81xx.

VLPB stands for Video Loopback Component. The component name is "OMX.TI.C67X.VLPB". As the name makes clear, the component runs on the DSP (C67x). The component has 16 input ports and 16 output ports, and copies a buffer on an input port to a buffer on the corresponding output port; there is a one-to-one correspondence between the input and output ports. In this example only one input port and one output port are used, so this is essentially a copy component running on the DSP, with the buffers on the input and output ports supplied via the standard OpenMAX non-tunneled APIs.

The following diagram represents the c6xtest example.
C6xtest (VLPB) application
The following diagram represents the usage of the standard OMX non-tunneled APIs for data communication with the VLPB component on the C67x.
VLPB component - Data Communication

The application does not take any additional input arguments. In this example, a buffer of size IL_CLIENT_VLPB_BUFFER_SIZE is filled with IL_CLIENT_VLPB_PATTERN and passed as the input buffer. The IL client provides this buffer using the OMX_EmptyThisBuffer call; output buffers are provided to the component using the OMX_FillThisBuffer API. The component informs the IL client by calling the callbacks provided during OMX_GetHandle(), namely FillBufferDone and EmptyBufferDone. After processing IL_CLIENT_VLPB_MAX_FRAMES frames in this sample application, the component is moved back to the idle and loaded states. Finally, the component handle is deleted using the OMX_FreeHandle() API. The IL_CLIENT_VLPB_xxx constants are defined in ilclient_utils.h in the /c6xtest/src folder.

How to Build

To build the application, the SDK needs to be installed on a Linux host machine.

Note that all dependencies must be built beforehand; the following commands take care of this.

 $ cd <ezsdk_root>
 $ make components 
 $ make omx

 

This creates the A8 Linux binary in the component-sources/omx_05_02_00_0x/bin/c6xtest/bin/ti81xx-evm/ folder.


How to Run

The following steps are required to run the application (these steps assume that all the binaries are stored in the same folder). Please note that the SDK might autoload the binaries as part of the demonstration application; this should be disabled before following the procedure below. It is assumed that the EVM has been booted and the user has logged in as root. If the init sequence in the SDK is not disabled, the following steps are not required, and the user can proceed directly to the Matrix GUI disabling step.


• Run the PRCM configuration utility
 # /prcm_config_app s

• Insert syslink module
 # insmod syslink.ko

• Load the firmware using the firmware_loader utility provided in the SDK. (By default the Linux init scripts might already load the firmware, so care needs to be taken.)

i.  #  ./firmware_load 0 dm81xx_c6xdsp.xem3 start
ii. #  ./firmware_load 1 dm816x_hdvicp.xem3 start
iii.#  ./firmware_load 2 dm816x_hdvpss.xem3 start


• Run the application

# ./c6xtest_a8host_debug.xv5T


Decode_MosaicDisplay

Description

This example is similar to Decode_Display, except that it adds the OMX software mosaic component (VSWMOSAIC), which is used for video composition. The Video Decoder Component (VDEC) is used for decoding the H264 video stream, and the Video Frame Processing Component (VFPC) and Video Frame Display Component (VFDC) are used for scaling and display. The application is an IL client running on the A8 processor under Linux. The decoder component runs on HDVICP2, while the frame processing, swmosaic and display components run on the HDVPSS part of the DM81xx media controller.

The OpenMAX VDEC component takes a buffer containing a single frame of the H264 elementary stream and, after decoding, returns the buffer to the A8. The elementary-stream chunking logic is implemented as an H264 stream parser running on the A8. The file-read logic is implemented within the application, and the parser provides the buffers to the VDEC component via the standard OpenMAX non-tunneled APIs. Please note that the parser provided in the application is for demonstration only and has been tested with a limited number of streams.

The output of the decoder is fed to the VFPC component. The VFPC component implements various frame processing modules; each module has a different OpenMAX component name to distinguish it. In this example the Scalar part of the VFPC is used. The Scalar performs only chroma conversion here; video scaling is not used. Chroma conversion is required because the VDEC component produces output in 420 semi-planar format, while the display component takes its input in 422 interleaved format. The swmosaic component has two ports. One port (port index 0, also configured as window 0) is fed from the Scalar described above; the other port (port index 1, also configured as window 1) is fed from a locally generated video frame (white color) for demonstration purposes. The swmosaic component performs the composition (window 0 followed by window 1), and the composed output is given to the display component.

By default the IL client configures the display to output video on the on-chip HDMI port. This can be routed to the HDDAC port by changing the display ID in the IL client. As the DM8168 supports two HD displays, applications can use the on-chip HDMI for one display and the HDCOMP port for another. The OMX VFDC component provides the interface to the HDVPSS display module. Along with the OpenMAX display module (VFDC), a control module is also provided as an OpenMAX component for controlling the various display modules. The control module has no buffers or ports; it provides OpenMAX APIs for configuration purposes only.

Omx swmosaic example.jpg

How to Build

To build the application, the SDK needs to be installed on a Linux host machine.

Note that all dependencies must be built beforehand; the following commands take care of this.

 $ cd <ezsdk_root>
 $ make components 
 $ make omx

 

This creates the A8 Linux binary in the component-sources/omx_05_02_00_xx/bin/decode_mosaicdisplay/bin/<platform - ti816x-evm or ti814x-evm>/ folder.

How to Run

As this example by default displays on the HDMI port, the user should plug a display device (TV) into the base board HDMI port. The display device should support 1080p60. On running the application, the user should see the video playing on the connected display device.

The following steps are required to run the application (these steps assume that all the binaries are stored in the same folder). Please note that the SDK might autoload the binaries as part of the demonstration application; this should be disabled before following the procedure below. It is assumed that the EVM has been booted and the user has logged in as root. If the init sequence in the SDK is not disabled, the following steps are not required, and the user can proceed directly to the Matrix GUI disabling step.


• Run the PRCM configuration utility
 # /prcm_config_app s

• Insert syslink module
 # insmod syslink.ko

• Load the firmware using the firmware_loader utility provided in the SDK.
(By default the Linux init scripts might already load the firmware, so care needs to be taken.)

i. #  ./firmware_load 1 dm816x_hdvicp.xem3 start
ii.#  ./firmware_load 2 dm816x_hdvpss.xem3 start

• Insert HDMI controller / FB dev driver
i.  #  insmod vpss.ko vpss_slaveloader=0xbfb0000
ii. #  insmod ti81xxhdmi.ko
iii.#  insmod ti81xxfb.ko
 

If the Matrix GUI is running as part of the SDK initialization scripts, it needs to be turned off:

# /etc/init.d/matrix-gui-e stop
# /etc/init.d/pvr-init stop


• Run the application

# ./decode_mosaicdisplay_a8host_debug.xv5T -i sample.h264 -w 1920 -h 1080 -f 60  -p 1 -g 0


If the application has modified the display ID to use HDCOMP, HDCOMP needs to be configured for 1080p-60 mode before running the application:

# echo 1080p-60 >/sys/devices/platform/vpss/display3/mode


FAQ about OMX Examples Provided in EZSDK

1. Why does the decode_display or decode example fail for certain MPEG2 streams? Also, why is there blockiness in the displayed output for some streams?

Ans: The parser supplied with the OMX examples is a primitive parser intended for example purposes only and has not been tested against an exhaustive set of streams. As a result, for certain streams the parser chunks frames incorrectly, causing the VDEC component to report errors, since it expects the input bitstream buffer to contain exactly one frame's worth of data. The MPEG2 decoder then applies concealment for these frames, so the output may appear blocky. Decoding continues until the end of the stream despite this error. Note that users are expected to provide their own parsers for their systems; the OMX examples are meant to show how OMX APIs should be used to create an OMX IL client.