Please note as of Wednesday, August 15th, 2018 this wiki has been set to read only. If you are a TI Employee and require Edit ability please contact x0211426 from the company directory.




DDRALGHEAP is a DSP executable's memory segment used only for allocating dynamic memory needed by algorithm instances when they are instantiated (External Memory (DDR) Heap for Algorithm Instances). In the worst case, it must be at least as large as the sum of the external memory needed by all codec instances that are active at the same time.

Server Memory Overview

A DSP Server executable has two main external memory segments defined in its BIOS configuration file (.tcf for BIOS 5, .cfg for BIOS 6): DDR and DDRALGHEAP, plus a couple of system segments that are not of interest here. DDR contains all the static code and data (for BIOS, Codec Engine, and all other linked-in libraries, including algs). It also contains the heap for any system allocations, which are relatively small. DDRALGHEAP is the pool from which dynamic memory is granted to every alg instance the (traditionally ARM-side) application causes to be created; this memory is reclaimed when the application deletes the alg instance.

The size of DDRALGHEAP depends on which alg instances and how many of them will run at any given time. By default it's quite large to ease the early stages of the development process; at some point, you can shrink it down to the necessary minimum by calculating (or measuring) how large it needs to be. The final size varies greatly from use case to use case, and can range from half a MB to tens of MB. It traditionally depends on how many video codec instances, and to a lesser extent imaging and other alg instances, will run at the same time, because they typically require the most memory.

Example DDRALGHEAP Usage

For example, if your Server contains an audio decoder and a video decoder, typically your ARM-side application will create one audio decoder instance (via AUDDEC_create()) and one video decoder instance (via VIDDEC_create()). Each alg instance will require some amount of external memory for its internal processing buffers; this does not include the input and output buffers used for exchanging data with the ARM side. How large that amount is depends on the type of alg (and who made it), but also on the mode in which the alg is used. If the ARM side only needs video decoding at quarter resolution, and specifies that at VIDDEC_create() time, the instance will require less memory from DDRALGHEAP than if the video decoder runs at full resolution.

Similarly, if you want to create two video decoder instances at the same time (possibly running in different modes), along with the audio decoder, your DDRALGHEAP must be large enough to accommodate all three existing side by side.

The opposite example (before we move on to the "how to" part) is when your Server image contains a video encoder and a video decoder, but you never run both at the same time. This is the case, for instance, when recording or playing back video on a digital camera; you never need to do both at once. When the camera user presses the "record" button, the ARM-side application creates a video encoder instance, records a movie, then deletes the instance; when the user presses the "playback" button, it creates a video decoder instance, plays the movie, then deletes the instance. Since the two alg instances never exist at the same time, your DDRALGHEAP need only be large enough to accommodate the instance that needs more memory, probably the encoder. Creating and deleting instances this way is nearly instantaneous, and is better than creating both instances at camera boot time and deleting both at shutdown. (While the latter scenario may be slightly simpler to program, it ties up more external memory.)

Calculating DDRALGHEAP size

You can calculate the total external memory needs for all algorithms by looking at the spec sheet for each, finding out how much memory each alg instance requires in the mode in which you plan to use it, then looking at all the modes of your application to find the worst case, in terms of memory needs, when several algs are active at the same time. That number, plus whatever headroom engineering intuition says you should allow, is the minimal size of DDRALGHEAP for your application.

Or you can measure it by asking the Server to tell you how much external memory it is using, at various points in your Codec Engine-using application. Worst-case number, intuition, headroom, and you have the minimal "DDRALGHEAP" size.

As for asking the Server for its current memory usage, there are a few options:

  • In CE 1.10 and earlier, there is an API called Engine_getUsedMem() that returns the Server's total memory usage. The total includes "DDR" as well, which you don't want in the picture; but if you call Engine_getUsedMem() immediately after Engine_open(), before any alg instances are created, you can calculate the delta. (DDRALGHEAP usage is exactly 0 when there are zero active alg instances.) When you call Engine_getUsedMem() after creating one or several instances, and subtract the first result, you get roughly the amount of DDRALGHEAP memory used by the instances you created. (The delta is very slightly larger, due to some system data structures in "DDR" created to support the instances, but compared to video codec requirements those are negligible: on the order of a few KB.)
  • In CE 1.20 and later, the API above is refined so you can ask for precisely the segment you need. Code along these lines reports how much of DDRALGHEAP is in use at the moment it runs:
Server_Handle hServer;
Server_MemStat memStat;
Int numSegs, i;

hServer = Engine_getServer(hEngine);
Server_getNumMemSegs(hServer, &numSegs);
for (i = 0; i < numSegs; i++) {
    Server_getMemStat(hServer, i, &memStat);
    if (strcmp(memStat.name, "DDRALGHEAP") == 0) {
        printf("DDRALGHEAP usage is %d out of %d available\n",
                memStat.used, memStat.size);
    }
}

What are the limitations of the OMAP35x DVSDK-provided Codec Server?

The DVSDK-provided Codec Server cannot cover all possible use cases (e.g. multichannel). Its heap size was chosen to match the DVSDK example use cases. If your application's use case differs from the combinations listed in the TestMatrix tab of cs1omap3530_master_build_specification.xls, located under cs1omap3530_1_00_##/packages/ti/sdo/server/cs/docs, the DDRALGHEAP requirements may exceed the configured size. Codec initialization/instantiation errors are the major symptom of this condition. In those cases, follow these steps.

Note: Codec initialization errors should not automatically be interpreted as an insufficient DDRALGHEAP size, since other causes can also trigger them.

  • Step 1. Calculate the DDRALGHEAP required for the use case that triggered the codec initialization errors.
  • Step 2. If the DDRALGHEAP calculated in Step 1 exceeds 9 MB, increase the DDRALGHEAP for the codec server as follows.
  1. Find the following statement in server.tcf, located under the cs1omap3530_1_00_##/packages/ti/sdo/server/cs directory:
    var DDRALGHEAP_SIZE   = 0x00900000; /* 9M space for the DDRALGHEAP */
  2. Update the DDRALGHEAP_SIZE variable to hold the calculated DDRALGHEAP size. The starting addresses of all other memory sections (DDR2, RESET VECTOR, DSPLINK) are computed automatically.
  3. Rebuild the codec server with the updated DDRALGHEAP size, as described in Rebuild the DSP side server executable.
  4. After rebuilding, the updated DSP memory map can be found in the generated map file located under the cs1omap3530_1_00_##/packages/ti/sdo/server/cs/package/cfg/bin directory.

Note: Changing the DDRALGHEAP size changes the codec server's memory footprint (codec server size, start and end addresses), which in turn affects how much memory is left for the OS. (The OS cannot be given, say, 99 MB if that would overlap the DSP's memory region.)
