oneVPL

The oneAPI Video Processing Library (oneVPL) is a programming interface for video decoding, encoding, and processing to build portable media pipelines on CPUs, GPUs, and other accelerators. It provides device discovery and selection in media-centric and video analytics workloads, and API primitives for zero-copy buffer sharing. oneVPL’s backward and cross-architecture compatibility ensures optimal execution on current and next-generation hardware without source code changes.

See oneVPL API Reference for the detailed API description.

oneVPL for Intel® Media Software Development Kit Users

oneVPL is source compatible with the Intel® Media Software Development Kit (MSDK), allowing applications to use MSDK to target older hardware and oneVPL to target everything else. oneVPL offers improved usability over MSDK. Some obsolete MSDK features have been omitted from oneVPL.

oneVPL Usability Enhancements

  1. Smart dispatcher with implementation capabilities discovery. Explore SDK Session for more details.

  2. Simplified decoder initialization. Explore Decoding Procedures for more details.

  3. New memory management and component (session) interoperability. Explore Internal memory management and Decoding Procedures for more details.

  4. Improved internal threading and internal task scheduling.

Obsolete MSDK Features omitted from oneVPL

The following MSDK features are not included in oneVPL:

Audio Support

oneVPL is for video processing and removes the audio APIs that duplicate functionality from other audio libraries, such as Sound Open Firmware.

ENC and PAK interfaces

These are available as part of the Flexible Encode Infrastructure (FEI) and plugin interfaces. FEI is an Intel Graphics-specific feature designed for AVC and HEVC encoders and is not widely used by customers.

User plugins architecture

oneVPL enables robust video acceleration through API implementations of many different video processing frameworks, making support of its own user plugin framework obsolete.

External Buffer memory management

A set of callback functions to replace internal memory allocation is obsolete.

Video Processing extended runtime functionality

Video processing function MFXVideoVPP_RunFrameVPPAsyncEx is used for plugins only and is obsolete.

External threading

The new threading model makes the MFXDoWork function obsolete.

The following behaviors occur when attempting to use an MSDK API that is not supported by oneVPL:

Code compilation

Code compiled with the oneVPL API headers will generate a compile and/or link error when attempting to use a removed API.

Code previously compiled with MSDK and used with a oneVPL runtime

Code previously compiled with MSDK and executing using a oneVPL runtime will generate an MFX_ERR_UNSUPPORTED error when calling a removed function.

MSDK APIs not present in oneVPL

Audio related functions:

MFXAudioCORE_SyncOperation(mfxSession session, mfxSyncPoint syncp, mfxU32 wait)
MFXAudioDECODE_Close(mfxSession session)
MFXAudioDECODE_DecodeFrameAsync(mfxSession session, mfxBitstream *bs,
                                mfxAudioFrame *frame_out, mfxSyncPoint *syncp)
MFXAudioDECODE_DecodeHeader(mfxSession session, mfxBitstream *bs, mfxAudioParam *par)
MFXAudioDECODE_GetAudioParam(mfxSession session, mfxAudioParam *par)
MFXAudioDECODE_Init(mfxSession session, mfxAudioParam *par)
MFXAudioDECODE_Query(mfxSession session, mfxAudioParam *in, mfxAudioParam *out)
MFXAudioDECODE_QueryIOSize(mfxSession session, mfxAudioParam *par, mfxAudioAllocRequest *request)
MFXAudioDECODE_Reset(mfxSession session, mfxAudioParam *par)
MFXAudioENCODE_Close(mfxSession session)
MFXAudioENCODE_EncodeFrameAsync(mfxSession session, mfxAudioFrame *frame,
                                mfxBitstream *buffer_out, mfxSyncPoint *syncp)
MFXAudioENCODE_GetAudioParam(mfxSession session, mfxAudioParam *par)
MFXAudioENCODE_Init(mfxSession session, mfxAudioParam *par)
MFXAudioENCODE_Query(mfxSession session, mfxAudioParam *in, mfxAudioParam *out)
MFXAudioENCODE_QueryIOSize(mfxSession session, mfxAudioParam *par, mfxAudioAllocRequest *request)
MFXAudioENCODE_Reset(mfxSession session, mfxAudioParam *par)

Flexible encode infrastructure functions:

MFXVideoENC_Close(mfxSession session)
MFXVideoENC_GetVideoParam(mfxSession session, mfxVideoParam *par)
MFXVideoENC_Init(mfxSession session, mfxVideoParam *par)
MFXVideoENC_ProcessFrameAsync (mfxSession session, mfxENCInput *in,
                               mfxENCOutput *out, mfxSyncPoint *syncp)
MFXVideoENC_Query(mfxSession session, mfxVideoParam *in, mfxVideoParam *out)
MFXVideoENC_QueryIOSurf(mfxSession session, mfxVideoParam *par,
                        mfxFrameAllocRequest *request)
MFXVideoENC_Reset(mfxSession session, mfxVideoParam *par)
MFXVideoPAK_Close(mfxSession session)
MFXVideoPAK_GetVideoParam(mfxSession session, mfxVideoParam *par)
MFXVideoPAK_Init(mfxSession session, mfxVideoParam *par)
MFXVideoPAK_ProcessFrameAsync(mfxSession session, mfxPAKInput *in,
                              mfxPAKOutput *out, mfxSyncPoint *syncp)
MFXVideoPAK_Query(mfxSession session, mfxVideoParam *in, mfxVideoParam *out)
MFXVideoPAK_QueryIOSurf(mfxSession session, mfxVideoParam *par,
                        mfxFrameAllocRequest *request)
MFXVideoPAK_Reset(mfxSession session, mfxVideoParam *par)

User Plugin functions:

MFXAudioUSER_ProcessFrameAsync(mfxSession session, const mfxHDL *in,
                               mfxU32 in_num, const mfxHDL *out,
                               mfxU32 out_num, mfxSyncPoint *syncp)
MFXAudioUSER_Register(mfxSession session, mfxU32 type, const mfxPlugin *par)
MFXAudioUSER_Unregister(mfxSession session, mfxU32 type)
MFXVideoUSER_GetPlugin(mfxSession session, mfxU32 type, mfxPlugin *par)
MFXVideoUSER_ProcessFrameAsync(mfxSession session, const mfxHDL *in, mfxU32 in_num,
                               const mfxHDL *out, mfxU32 out_num, mfxSyncPoint *syncp)
MFXVideoUSER_Register(mfxSession session, mfxU32 type, const mfxPlugin *par)
MFXVideoUSER_Unregister(mfxSession session, mfxU32 type)
MFXVideoUSER_Load(mfxSession session, const mfxPluginUID *uid, mfxU32 version)
MFXVideoUSER_LoadByPath(mfxSession session, const mfxPluginUID *uid, mfxU32 version,
                        const mfxChar *path, mfxU32 len)
MFXVideoUSER_UnLoad(mfxSession session, const mfxPluginUID *uid)
MFXDoWork(mfxSession session)

Memory functions:

MFXVideoCORE_SetBufferAllocator(mfxSession session, mfxBufferAllocator *allocator)

Video processing functions:

MFXVideoVPP_RunFrameVPPAsyncEx(mfxSession session, mfxFrameSurface1 *in,
                               mfxFrameSurface1 *surface_work, mfxFrameSurface1 **surface_out,
                               mfxSyncPoint *syncp)

Important

Corresponding extension buffers are also removed.

oneVPL API versioning

As the successor to MSDK, the oneVPL API version starts from 2.0.

Experimental APIs in oneVPL are protected with the following macro:

#if (MFX_VERSION >= MFX_VERSION_NEXT)

To use the API, define the MFX_VERSION_USE_LATEST macro.
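For illustration, a minimal sketch of how an application might opt in to the experimental APIs, assuming the macro is defined before the SDK headers are included:

#define MFX_VERSION_USE_LATEST   /* opt in to experimental APIs */
#include "mfxvideo.h"

#if (MFX_VERSION >= MFX_VERSION_NEXT)
   /* experimental functionality can be used here */
#endif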

Acronyms and Abbreviations

API: Application Programming Interface
AVC: Advanced Video Codec (same as H.264 and MPEG-4, part 10)
Direct3D: Microsoft* Direct3D* version 9 or 11.1
Direct3D9: Microsoft* Direct3D* version 9
Direct3D11: Microsoft* Direct3D* version 11.1
DRM: Digital Rights Management
DXVA2: Microsoft DirectX* Video Acceleration standard 2.0
H.264: ISO*/IEC* 14496-10 and ITU-T* H.264, MPEG-4 Part 10, Advanced Video Coding, May 2005
HRD: Hypothetical Reference Decoder
IDR: Instantaneous decoding refresh picture, a term used in the H.264 specification
LA: Look Ahead. A special encoding mode where the encoder performs pre-analysis of several frames before actual encoding starts
MPEG: Moving Picture Experts Group
MPEG-2: ISO/IEC 13818-2 and ITU-T H.262, MPEG-2 Part 2, Information Technology - Generic Coding of Moving Pictures and Associated Audio Information: Video, 2000
NAL: Network Abstraction Layer
NV12: A color format for raw video frames
PPS: Picture Parameter Set
QP: Quantization Parameter
RGB3: Twenty-four-bit RGB color format. Also known as RGB24
RGB4: Thirty-two-bit RGB color format. Also known as RGB32
SDK: Intel® Media Software Development Kit
SEI: Supplemental Enhancement Information
SPS: Sequence Parameter Set
VA API: Video Acceleration API
VBR: Variable Bit Rate
VBV: Video Buffering Verifier
VC-1: SMPTE* 421M, SMPTE Standard for Television: VC-1 Compressed Video Bitstream Format and Decoding Process, August 2005
video memory: Memory used by the hardware acceleration device (also known as the GPU) to hold frames and other types of video data
VPP: Video Processing
VUI: Video Usability Information
YUY2: A color format for raw video frames
YV12: A color format for raw video frames, similar to IYUV with U and V reversed
IYUV: A color format for raw video frames, also known as I420
P010: A color format for raw video frames, extends NV12 for 10 bit
I010: A color format for raw video frames, extends IYUV/I420 for 10 bit
GPB: Generalized P/B picture. A B-picture containing only forward references in both L0 and L1
HDR: High Dynamic Range
BRC: Bit Rate Control
MCTF: Motion Compensated Temporal Filter. A special type of noise reduction filter that utilizes motion to improve the efficiency of video denoising
iGPU/iGfx: Integrated Intel® HD Graphics
dGPU/dGfx: Discrete Intel® Graphics

Architecture

SDK functions fall into the following categories:

DECODE: Decode compressed video streams into raw video frames
ENCODE: Encode raw video frames into compressed bitstreams
VPP: Perform video processing on raw video frames
CORE: Auxiliary functions for synchronization
Misc: Global auxiliary functions

With the exception of the global auxiliary functions, SDK functions are named after their functioning domain and category, as illustrated below. Here, SDK only exposes video domain functions.

SDK function name notation

Applications use SDK functions by linking with the SDK dispatcher library, as illustrated below. The dispatcher library identifies the hardware acceleration device on the running platform, determines the most suitable platform library, and then redirects function calls. If the dispatcher is unable to detect any suitable platform-specific hardware, the dispatcher redirects SDK function calls to the default software library.

digraph {
  rankdir=TB;
  Application [shape=record label="Application" ];
  Sdk [shape=record  label="SDK Dispatcher Library"];
  Lib1 [shape=record  label="SDK Library 1 (CPU)"];
  Lib2 [shape=record  label="SDK Library 2 (Platform 1)"];
  Lib3 [shape=record  label="SDK Library 3 (Platform 2)"];
  Application->Sdk;
  Sdk->Lib1;
  Sdk->Lib2;
  Sdk->Lib3;
}

Video Decoding

The DECODE class of functions takes a compressed bitstream as input and converts it to raw frames as output.

DECODE processes only pure or elementary video streams. The library cannot process bitstreams that reside in a container format, such as MP4 or MPEG. The application must first de-multiplex the bitstreams. De-multiplexing extracts pure video streams out of the container format. The application can provide the input bitstream as one complete frame of data, less than one frame (a partial frame), or multiple frames. If only a partial frame is provided, DECODE internally constructs one frame of data before decoding it.

The time stamp of a bitstream buffer must be accurate to the first byte of the frame data. That is, the first byte of a video coding layer NAL unit for H.264, or picture header for MPEG-2 and VC-1. DECODE passes the time stamp to the output surface for audio and video multiplexing or synchronization.

Decoding the first frame is a special case, since DECODE does not provide enough configuration parameters to correctly process the bitstream. DECODE searches for the sequence header (a sequence parameter set in H.264, or a sequence header in MPEG-2 and VC-1) that contains the video configuration parameters used to encode subsequent video frames. The decoder skips any bitstream prior to that sequence header. In the case of multiple sequence headers in the bitstream, DECODE adopts the new configuration parameters, ensuring proper decoding of subsequent frames.

DECODE supports repositioning of the bitstream at any time during decoding. Because there is no way to obtain the correct sequence header associated with the specified bitstream position after a position change, the application must supply DECODE with a sequence header before the decoder can process the next frame at the new position. If the sequence header required to correctly decode the bitstream at the new position is not provided by the application, DECODE treats the new location as a new “first frame” and follows the procedure for decoding first frames.

Video Encoding

The ENCODE class of functions takes raw frames as input and compresses them into a bitstream.

Input frames usually come encoded in a repeated pattern called the Group of Pictures (GOP) sequence. For example, a GOP sequence can start from an I-frame, followed by a few B-frames, a P-frame, and so on. ENCODE uses an MPEG-2 style GOP sequence structure that can specify the length of the sequence and the distance between two key frames: I- or P-frames. A GOP sequence ensures that the segments of a bitstream do not completely depend upon each other. It also enables decoding applications to reposition the bitstream.
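As an illustration, a minimal sketch of the GOP-related initialization parameters (par is assumed to be the mfxVideoParam structure passed to MFXVideoENCODE_Init; the values are arbitrary):

par.mfx.GopPicSize = 30;  /* number of frames in the GOP sequence */
par.mfx.GopRefDist = 4;   /* distance between two key frames (I- or P-frames) */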

ENCODE processes input frames in two ways:

  • Display order: ENCODE receives input frames in the display order. A few GOP structure parameters specify the GOP sequence during ENCODE initialization. Scene change results from the video processing stage of a pipeline can alter the GOP sequence.

  • Encoded order: ENCODE receives input frames in their encoding order. The application must specify the exact input frame type for encoding. ENCODE references GOP parameters to determine when to insert information such as an end-of-sequence into the bitstream.

An ENCODE output consists of one frame of a bitstream with the time stamp passed from the input frame. The time stamp is used for multiplexing subsequent video with other associated data such as audio. The SDK library provides only pure video stream encoding. The application must provide its own multiplexing.

ENCODE supports the following bitrate control algorithms: constant bitrate, variable bitrate (VBR), and constant Quantization Parameter (QP). In the constant bitrate mode, ENCODE performs stuffing when the size of the least-compressed frame is smaller than what is required to meet the Hypothetical Reference Decoder (HRD) buffer (or VBV) requirements. (Stuffing is a process that appends zeros to the end of encoded frames.)
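For example, a minimal sketch of selecting one of these algorithms through the encoder initialization parameters (par is assumed to be the mfxVideoParam used for MFXVideoENCODE_Init; values are illustrative):

/* variable bitrate: target and peak bitrates in kbps */
par.mfx.RateControlMethod = MFX_RATECONTROL_VBR;
par.mfx.TargetKbps        = 4000;
par.mfx.MaxKbps           = 6000;
/* alternatively, constant QP mode would set MFX_RATECONTROL_CQP and the QPI/QPP/QPB fields */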

Video Processing

Video processing (VPP) takes raw frames as input and provides raw frames as output.

digraph {
  rankdir=LR;
  F1 [shape=record label="Function 1" ];
  F2 [shape=record  label="Function 2"];
  F3 [shape=record  label="Additional Filters"];
  F4 [shape=record label="Function N-1" ];
  F5 [shape=record  label="Function N"];
  F1->F2->F3->F4->F5;
}

The actual conversion process is a chain operation with many single-function filters, as Figure 3 illustrates. The application specifies the input and output format, and the SDK configures the pipeline accordingly. The application can also attach one or more hint structures to configure individual filters or turn them on and off. Unless specifically instructed, the SDK builds the pipeline in a way that best utilizes hardware acceleration or generates the best video processing quality.

Table 1 shows the SDK video processing features. The application can configure supported video processing features through the video processing I/O parameters. The application can also configure optional features through hints. See “Video Processing procedure / Configuration” for more details on how to configure optional filters.

Todo

create link to “Video Processing procedure / Configuration”
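For example, a minimal sketch of attaching an optional filter hint (here the noise removal filter) to the VPP configuration; vpp_param is assumed to be the mfxVideoParam passed to MFXVideoVPP_Init:

mfxExtVPPDenoise denoise = {0};
denoise.Header.BufferId = MFX_EXTBUFF_VPP_DENOISE;
denoise.Header.BufferSz = sizeof(denoise);
denoise.DenoiseFactor   = 50;              /* 0..100, higher is stronger */

mfxExtBuffer *hint[1] = { &denoise.Header };
vpp_param.ExtParam    = hint;
vpp_param.NumExtParam = 1;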

Video Processing Features (and how each is configured):

Convert color format from input to output: I/O parameters
De-interlace to produce progressive frames at the output: I/O parameters
Crop and resize the input frames: I/O parameters
Convert input frame rate to match the output: I/O parameters
Perform inverse telecine operations: I/O parameters
Fields weaving: I/O parameters
Fields splitting: I/O parameters
Remove noise: hint (optional feature)
Enhance picture details/edges: hint (optional feature)
Adjust the brightness, contrast, saturation, and hue settings: hint (optional feature)
Perform image stabilization: hint (optional feature)
Convert input frame rate to match the output, based on frame interpolation: hint (optional feature)
Perform detection of picture structure: hint (optional feature)

Color Conversion Support:

Output Color>

Input Color

NV12

RGB32

P010

P210

NV16

A2RGB10

RGB4 (RGB32)

X (limited)

X (limited)

NV12

X

X

X

X

YV12

X

X

UYVY

X

YUY2

X

X

P010

X

X

X

X

P210

X

X

X

X

X

NV16

X

X

X

Note

‘X’ indicates a supported function.

Note

The SDK video processing pipeline supports limited functionality for RGB4 input. Only the filters that are required to convert the input format to the output one are included in the pipeline. All optional filters are skipped. See the description of the MFX_WRN_FILTER_SKIPPED warning in the mfxStatus enum for details on how to retrieve the list of active filters.

Deinterlacing/Inverse Telecine Support in VPP:

Input Field Rate (fps) Interlaced

Output Frame Rate (fps) Progressive

23.976

25

29.97

30

50

59.94

60

29.97

Inverse Telecine

X

50

X

X

59.94

X

X

60

X

X

Note

‘X’ indicates a supported function.

This table describes the pure deinterlacing algorithm. The application can combine it with frame rate conversion to achieve any desired input/output frame rate ratio. Note that in this table the input rate is the field rate, that is, the number of video fields in one second of video. The SDK uses frame rate in all configuration parameters, so this input field rate should be divided by two during SDK configuration. For example, the 60i to 60p conversion in this table is represented by the bottom right cell. It should be described in mfxVideoParam as an input frame rate equal to 30 and an output frame rate equal to 60.

The SDK supports two HW-accelerated deinterlacing algorithms: BOB DI (VAProcDeinterlacingBob in Linux libVA terms) and Advanced DI (VAProcDeinterlacingMotionAdaptive). The default is ADI (Advanced DI), which uses reference frames and has better quality. BOB DI is faster than ADI, so the user can make the usual trade-off between speed and quality.

The user can explicitly configure the DI mode via mfxExtVPPDeinterlacing.
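For example, a minimal sketch of requesting BOB deinterlacing explicitly (vpp_param is assumed to be the mfxVideoParam used to initialize VPP):

mfxExtVPPDeinterlacing di = {0};
di.Header.BufferId = MFX_EXTBUFF_VPP_DEINTERLACING;
di.Header.BufferSz = sizeof(di);
di.Mode            = MFX_DEINTERLACING_BOB;   /* or MFX_DEINTERLACING_ADVANCED */

mfxExtBuffer *eb[1] = { &di.Header };
vpp_param.ExtParam    = eb;
vpp_param.NumExtParam = 1;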

One special mode of deinterlacing is available in combination with frame rate conversion. If the VPP input frame is interlaced (TFF or BFF), the output is progressive, and the ratio between the source frame rate and the destination frame rate is 1/2 (for example, 30 to 60, 29.97 to 59.94, or 25 to 50), a special VPP mode is turned on: for 30 interlaced input frames the application will get 60 distinct progressive output frames.

Color formats supported by VPP filters:

Color>

Filter

RGB4 (RGB32)

NV12

YV12

YUY2

P010

P210

NV16

Denoise

X

MCTF

X

Deinterlace

X

Image stabilization

X

Frame rate conversion

X

Resize

X

X

X

X

Detail

X

Color conversion

X

X

X

X

X

X

X

Composition

X

X

Field copy

X

Fields weaving

X

Fields splitting

X

Note

‘X’ indicates a supported function.

Note

The SDK video processing pipeline supports limited HW acceleration for P010 format - zeroed mfxFrameInfo::Shift leads to partial acceleration.

Todo

create link to mfxFrameInfo::Shift

Note

The SDK video processing pipeline does not support HW acceleration for P210 format.

Todo

Keep or remove HW?

Programming Guide

This chapter describes the concepts used in programming the SDK.

The application must use the include file mfxvideo.h (for C/C++ programming) and link the SDK dispatcher library, libmfx.so.

Include these files:

#include "mfxvideo.h"    /* The SDK include file */

Link this library:

libmfx.so                /* The SDK dynamic dispatcher library (Linux)*/

Status Codes

The SDK functions are organized into classes for easy reference. The classes include ENCODE (encoding functions), DECODE (decoding functions), and VPP (video processing functions).

Init, Reset and Close are member functions within the ENCODE, DECODE and VPP classes that initialize, restart and de-initialize specific operations defined for the class. Call all other member functions within a given class (except Query and QueryIOSurf) within the Init … Reset (optional) … Close sequence.

The Init and Reset member functions both set up necessary internal structures for media processing. The difference between the two is that the Init functions allocate memory while the Reset functions only reuse allocated internal memory. Therefore, Reset can fail if the SDK needs to allocate additional memory. Reset functions can also fine-tune ENCODE and VPP parameters during those processes or reposition a bitstream during DECODE.

All SDK functions return status codes to indicate whether an operation succeeded or failed. See the mfxStatus enumerator for all defined status codes. The status code MFX_ERR_NONE indicates that the function successfully completed its operation. Status codes are less than MFX_ERR_NONE for all errors and greater than MFX_ERR_NONE for all warnings.

If an SDK function returns a warning, it has sufficiently completed its operation, although the output of the function might not be strictly reliable. The application must check the validity of the output generated by the function.

If an SDK function returns an error (except MFX_ERR_MORE_DATA or MFX_ERR_MORE_SURFACE or MFX_ERR_MORE_BITSTREAM), the function aborts the operation. The application must call either the Reset function to put the class back to a clean state, or the Close function to terminate the operation. The behavior is undefined if the application continues to call any class member functions without a Reset or Close. To avoid memory leaks, always call the Close function after Init.

SDK Session

Before calling any SDK functions, the application must initialize the SDK library and create an SDK session. An SDK session maintains context for the use of any of DECODE, ENCODE, or VPP functions.

Media SDK dispatcher (legacy)

The function MFXInit() starts (initializes) an SDK session. MFXClose() closes (de-initializes) the SDK session. To avoid memory leaks, always call MFXClose() after MFXInit().

The application can initialize a session as a software-based session (MFX_IMPL_SOFTWARE) or a hardware-based session (MFX_IMPL_HARDWARE). In the former case, the SDK functions execute on a CPU, and in the latter case, the SDK functions use platform acceleration capabilities. For platforms that expose multiple graphic devices, the application can initialize the SDK session on any alternative graphic device (MFX_IMPL_HARDWARE1,…, MFX_IMPL_HARDWARE4).

The application can also initialize a session to be automatic (MFX_IMPL_AUTO or MFX_IMPL_AUTO_ANY), instructing the dispatcher library to detect the platform capabilities and choose the best SDK library available. After initialization, the SDK returns the actual implementation through the MFXQueryIMPL() function.
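A minimal sketch of the legacy session lifecycle (error handling omitted):

mfxSession session = NULL;
mfxVersion ver = { {0, 1} };         /* request API 1.0 or later */
mfxIMPL impl   = MFX_IMPL_AUTO_ANY;  /* let the dispatcher choose */

MFXInit(impl, &ver, &session);
MFXQueryIMPL(session, &impl);        /* reports the implementation actually loaded */
/* ... DECODE/ENCODE/VPP usage ... */
MFXClose(session);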

Internally, the dispatcher works in the following way:

  1. It searches for a shared library with a specific name:

     Linux: libmfxsw64.so.1 (64-bit software-based implementation)
     Linux: libmfxsw32.so.1 (32-bit software-based implementation)
     Linux: libmfxhw64.so.1 (64-bit hardware-based implementation)
     Linux: libmfxhw32.so.1 (32-bit hardware-based implementation)
     Windows: libmfxsw64.dll (64-bit software-based implementation)
     Windows: libmfxsw32.dll (32-bit software-based implementation)
     Windows: libmfxhw64.dll (64-bit hardware-based implementation)
     Windows: libmfxhw32.dll (32-bit hardware-based implementation)

  2. Once the library is loaded, the dispatcher obtains the address of each SDK function. See the table with the list of functions to export.

oneVPL dispatcher

The oneVPL dispatcher extends the legacy dispatcher by providing the ability to select an appropriate implementation based on the implementation capabilities. Implementation capabilities include information about supported decoders, encoders, and VPP filters. For each supported encoder, decoder, and filter, the capabilities include information about supported memory types, color formats, image (frame) sizes in pixels, and so on.

This is the recommended way for the user to configure the dispatcher’s capability search filters and create a session based on a suitable implementation:
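A minimal sketch, assuming the oneVPL dispatcher entry points declared in mfxdispatcher.h:

#include "mfxdispatcher.h"

mfxLoader loader = MFXLoad();
mfxConfig cfg    = MFXCreateConfig(loader);

/* Filter: only consider hardware implementations */
mfxVariant impl_value;
impl_value.Type     = MFX_VARIANT_TYPE_U32;
impl_value.Data.U32 = MFX_IMPL_TYPE_HARDWARE;
MFXSetConfigFilterProperty(cfg, (const mfxU8 *)"mfxImplDescription.Impl", impl_value);

/* Create a session from the first implementation that matches the filters */
mfxSession session = NULL;
mfxStatus  sts     = MFXCreateSession(loader, 0, &session);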

This is the application termination procedure:
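A minimal sketch, assuming the session and loader created above:

MFXClose(session);   /* de-initialize the session */
MFXUnload(loader);   /* unload libraries loaded by the dispatcher */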

Note

Multiple loader instances can be created.

Note

Each loader may have multiple config objects associated with it.

Important

One config object can handle only one filter property.

Note

Multiple sessions can be created by using one loader object.

When the dispatcher searches for an implementation, it uses the following priority rules:

  1. A HW implementation has priority over a SW implementation.

  2. A Gen HW implementation has priority over a VSI HW implementation.

  3. A higher API version has priority over a lower API version.

Note

The implementation type has priority over the API version. In other words, the dispatcher must return the highest-priority implementation whose API version is greater than or equal to the requested one.

The dispatcher searches for implementations in the following folders at runtime (in priority order):

  1. User-defined search folders.

  2. oneVPL package.

  3. Standalone MSDK package (or driver).

The user can develop their own implementation and direct the oneVPL dispatcher to load it by providing a list of search folders. How this is done depends on the OS:

  • Linux: the user can provide a colon-separated list of folders in the ONEVPL_SEARCH_PATH environment variable.

  • Windows: the user can provide a semicolon-separated list of folders in the ONEVPL_SEARCH_PATH environment variable. Alternatively, the user can use the Windows registry.

Different SW implementations are supported by the dispatcher. The user can use the mfxImplDescription::VendorID, mfxImplDescription::VendorImplID, or mfxImplDescription::ImplName field to search for a particular implementation, as the sketch below illustrates.
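A hedged sketch of filtering by implementation name; string properties are passed through a pointer-type mfxVariant, and the name shown here is only a placeholder:

mfxConfig name_cfg = MFXCreateConfig(loader);
mfxVariant name_value;
name_value.Type     = MFX_VARIANT_TYPE_PTR;
name_value.Data.Ptr = (mfxHDL)"example-impl-name";   /* hypothetical implementation name */
MFXSetConfigFilterProperty(name_cfg, (const mfxU8 *)"mfxImplDescription.ImplName", name_value);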

Internally, the dispatcher works in the following way:

  1. The dispatcher loads any shared library found in the given search folders.

  2. For each loaded library, the dispatcher tries to resolve the address of the MFXQueryImplCapabilities() function to collect the implementation’s capabilities.

  3. Once the user requests session creation based on this implementation, the dispatcher obtains the address of each SDK function. See the table with the list of functions to export.

This table summarizes the environment variables that control the dispatcher behavior:

ONEVPL_SEARCH_PATH: List of user-defined search folders.

Note

Each implementation must support both dispatchers for backward compatibility with existing applications.

Multiple Sessions

Each SDK session can run exactly one instance of DECODE, ENCODE and VPP functions. This is good for a simple transcoding operation. If the application needs more than one instance of DECODE, ENCODE and VPP in a complex transcoding setting, or needs more simultaneous transcoding operations to balance CPU/GPU workloads, the application can initialize multiple SDK sessions. Each SDK session can independently be a software-based session or hardware-based session.

The application can use multiple SDK sessions independently or run a “joined” session. Independently operated SDK sessions cannot share data unless the application explicitly synchronizes session operations (to ensure that data is valid and complete before passing from the source to the destination session.)

To join two sessions together, the application can use the function MFXJoinSession(). Alternatively, the application can use the function MFXCloneSession() to duplicate an existing session. Joined SDK sessions work together as a single session, sharing all session resources, threading control and prioritization operations (except hardware acceleration devices and external allocators). When joined, one of the sessions (the first join) serves as the parent session, scheduling execution resources, with all other child sessions relying on the parent session for resource management.

With joined sessions, the application can set the priority of session operations through the MFXSetPriority() function. A lower priority session receives fewer CPU cycles. Session priority does not affect hardware accelerated processing.

After the completion of all session operations, the application can use the function MFXDisjoinSession() to remove the joined state of a session. Do not close the parent session until all child sessions are disjoined or closed.
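A minimal sketch of the joined-session calls described above (parent_session and child_session are assumed to be already initialized):

MFXJoinSession(parent_session, child_session);   /* child now shares the parent's resources */
MFXSetPriority(child_session, MFX_PRIORITY_LOW); /* fewer CPU cycles for the child */
/* ... run the pipelines ... */
MFXDisjoinSession(child_session);                /* remove the joined state */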

Frames and Fields

In SDK terminology, a frame (or frame surface, interchangeably) contains either a progressive frame or a complementary field pair. If the frame is a complementary field pair, the odd lines of the surface buffer store the top fields and the even lines of the surface buffer store the bottom fields.

Frame Surface Locking

During encoding, decoding or video processing, cases arise that require reserving input or output frames for future use. In the case of decoding, for example, a frame that is ready for output must remain as a reference frame until the current sequence pattern ends. The usual approach is to cache the frames internally. This method requires a copy operation, which can significantly reduce performance.

SDK functions define a frame-locking mechanism to avoid the need for copy operations. This mechanism is as follows:

  • The application allocates a pool of frame surfaces large enough to include SDK function I/O frame surfaces and internal cache needs. Each frame surface maintains a Locked counter, part of the mfxFrameData structure. Initially, the Locked counter is set to zero.

  • The application calls an SDK function with frame surfaces from the pool whose Locked counters are set appropriately: for decoding or video processing operations, where the SDK writes to the surface, the Locked counter should be equal to zero. If the SDK function needs to reserve a frame surface, the SDK function increases the Locked counter of that frame surface. A non-zero Locked counter indicates that the calling application must treat the frame surface as “in use.” That is, the application can read, but cannot alter, move, delete or free the frame surface.

  • In subsequent SDK executions, if the frame surface is no longer in use, the SDK decreases the Locked counter. When the Locked counter reaches zero, the application is free to do as it wishes with the frame surface.
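As an illustration of the locking rule above, a minimal sketch of how an application can pick a free surface from its pool (this mirrors the find_unlocked_surface_from_the_pool helper used in the pseudo code later in this document):

mfxFrameSurface1* find_unlocked_surface(mfxFrameSurface1 *pool, int pool_size)
{
   int i;
   for (i = 0; i < pool_size; i++)
      if (pool[i].Data.Locked == 0)
         return &pool[i];   /* not reserved by the SDK, safe to reuse */
   return NULL;             /* all surfaces are still in use */
}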

In general, the application must not increase or decrease the Locked counter, since the SDK manages this field. If, for some reason, the application needs to modify the Locked counter, the operation must be atomic to avoid race conditions.

Attention

Modifying the Locked counter is not recommended.

Starting from API version 2.0, the mfxFrameSurfaceInterface structure, a set of callback functions for mfxFrameSurface1, was introduced to work with frames. This interface defines mfxFrameSurface1 as a reference-counted object which can be allocated by the SDK or the application. The application has to follow the general rules for working with reference-counted objects. For example, when surfaces are allocated by the SDK during MFXVideoDECODE_DecodeFrameAsync or with the help of MFXMemory_GetSurfaceForVPP or MFXMemory_GetSurfaceForEncode, the application has to call the corresponding mfxFrameSurfaceInterface->(*Release) for surfaces that are no longer in use.
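For example, a minimal sketch of the reference counting rule for a surface obtained from the SDK’s internal allocator:

mfxFrameSurface1 *surf = NULL;
MFXMemory_GetSurfaceForEncode(session, &surf);   /* the caller holds a reference */
/* ... pass surf to MFXVideoENCODE_EncodeFrameAsync ... */
if (surf)
   surf->FrameInterface->Release(surf);          /* the application no longer needs the surface */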

Attention

Distinguish between the Locked counter, which defines read/write access policies, and the reference counter, which is responsible for managing a frame’s lifetime.

Note

All mfxFrameSurface1 structures starting from mfxFrameSurface1::mfxStructVersion = {1,1} support mfxFrameSurfaceInterface.

Decoding Procedures

Example 1 shows the pseudo code of the decoding procedure. The following describes a few key points:

  • The application can use the MFXVideoDECODE_DecodeHeader() function to retrieve decoding initialization parameters from the bitstream. This step is optional if such parameters are retrievable from other sources such as an audio/video splitter.

  • The application uses the MFXVideoDECODE_QueryIOSurf() function to obtain the number of working frame surfaces required to reorder output frames. This call is optional; it is required only when the application uses external allocation.

  • The application calls the MFXVideoDECODE_DecodeFrameAsync() function for a decoding operation, with the bitstream buffer (bits), and an unlocked working frame surface (work) as input parameters.

Attention

Starting from API version 2.0, the application can provide NULL as the working frame surface, which leads to internal memory allocation.

If decoding output is not available, the function returns a status code requesting additional bitstream input or working frame surfaces as follows:

  • MFX_ERR_MORE_DATA: The function needs additional bitstream input. The existing buffer contains less than a frame worth of bitstream data.

  • MFX_ERR_MORE_SURFACE: The function needs one more frame surface to produce any output.

  • MFX_ERR_REALLOC_SURFACE: Dynamic resolution change case; the function needs a bigger working frame surface (work).

Example 2 below demonstrates the simplified decoding procedure.

Starting from API version 2.0, a new decoding approach has been introduced. For simple use cases, when the user just wants to decode an elementary stream and does not want to set additional parameters, a simplified decoder initialization procedure is available. In such situations it is possible to skip the explicit stages of stream header decoding and decoder initialization and perform them implicitly while decoding the first frame. This mode also requires an additional field in the mfxBitstream object to indicate the codec type. In this mode the decoder allocates mfxFrameSurface1 internally, so the user should set the input surface to NULL.

Example 1: Decoding Pseudo Code

MFXVideoDECODE_DecodeHeader(session, bitstream, &init_param);
MFXVideoDECODE_QueryIOSurf(session, &init_param, &request);
allocate_pool_of_frame_surfaces(request.NumFrameSuggested);
MFXVideoDECODE_Init(session, &init_param);
sts=MFX_ERR_MORE_DATA;
for (;;) {
   if (sts==MFX_ERR_MORE_DATA && !end_of_stream())
      append_more_bitstream(bitstream);
   find_unlocked_surface_from_the_pool(&work);
   bits=(end_of_stream())?NULL:bitstream;
   sts=MFXVideoDECODE_DecodeFrameAsync(session,bits,work,&disp,&syncp);
   if (sts==MFX_ERR_MORE_SURFACE) continue;
   if (end_of_bitstream() && sts==MFX_ERR_MORE_DATA) break;
   if (sts==MFX_ERR_REALLOC_SURFACE) {
      MFXVideoDECODE_GetVideoParam(session, &param);
      realloc_surface(work, param.mfx.FrameInfo);
      continue;
   }
   // skipped other error handling
   if (sts==MFX_ERR_NONE) {
      MFXVideoCORE_SyncOperation(session, syncp, INFINITE);
      do_something_with_decoded_frame(disp);
   }
}
MFXVideoDECODE_Close(session);
free_pool_of_frame_surfaces();

Example 2: Simplified decoding procedure

sts=MFX_ERR_MORE_DATA;
for (;;) {
   if (sts==MFX_ERR_MORE_DATA && !end_of_stream())
      append_more_bitstream(bitstream);
   bits=(end_of_stream())?NULL:bitstream;
   sts=MFXVideoDECODE_DecodeFrameAsync(session,bits,NULL,&disp,&syncp);
   if (sts==MFX_ERR_MORE_SURFACE) continue;
   if (end_of_bitstream() && sts==MFX_ERR_MORE_DATA) break;
   // skipped other error handling
   if (sts==MFX_ERR_NONE) {
      MFXVideoCORE_SyncOperation(session, syncp, INFINITE);
      do_something_with_decoded_frame(disp);
      release_surface(disp);
   }
}

Bitstream Repositioning

The application can use the following procedure for bitstream repositioning during decoding (a code sketch follows the list):

  • Use the MFXVideoDECODE_Reset() function to reset the SDK decoder.

  • Optionally, if the application maintains a sequence header that correctly decodes the bitstream at the new position, the application may insert the sequence header into the bitstream buffer.

  • Append the bitstream from the new location to the bitstream buffer.

  • Resume the decoding procedure. If the sequence header is not inserted in the above steps, the SDK decoder searches for a new sequence header before starting decoding.
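A minimal sketch of these steps; the helper functions are placeholders in the style of the pseudo code above, and init_param is the parameter set used at initialization:

MFXVideoDECODE_Reset(session, &init_param);
bitstream->DataOffset = 0;
bitstream->DataLength = 0;                     /* drop data from the old position */
append_sequence_header(bitstream);             /* optional, if the application has one */
append_bitstream_from_new_position(bitstream);
/* resume calling MFXVideoDECODE_DecodeFrameAsync as before */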

Broken Streams Handling

Robustness and the capability to handle broken input streams are an important part of the decoder.

First of all, the start code prefix (ITU-T H.264 3.148 and ITU-T H.265 3.142) is used to separate NAL units. Then all syntax elements in the bitstream are parsed and verified. If any element violates the specification, the input bitstream is considered invalid and the decoder tries to re-sync (find the next start code). The decoder’s further behavior depends on which syntax element is broken:

  • SPS header – return MFX_ERR_INCOMPATIBLE_VIDEO_PARAM (HEVC decoder only, AVC decoder uses last valid)

  • PPS header – re-sync, use last valid PPS for decoding

  • Slice header – skip this slice, re-sync

  • Slice data - Corruption flags are set on output surface

Note

Some requirements are relaxed because there are a lot of streams which violate the letter of standard but can be decoded without errors.

  • Many streams have IDR frames with frame_num != 0, while the specification says that “If the current picture is an IDR picture, frame_num shall be equal to 0.” (ITU-T H.264 7.4.3)

  • VUI is also validated, but errors do not invalidate the whole SPS; the decoder either does not use the corrupted VUI (AVC) or resets incorrect values to defaults (HEVC).

Corruption of a reference frame spreads over all inter-coded pictures that use this reference for prediction. To cope with this problem you either have to periodically insert I-frames (intra-coded) or use the ‘intra refresh’ technique. The latter allows recovery from corruptions within a pre-defined time interval. The main point of ‘intra refresh’ is to insert a cyclic intra-coded pattern (usually a row) of macroblocks into the inter-coded pictures, restricting motion vectors accordingly. Intra refresh is often used in combination with the Recovery point SEI message, where recovery_frame_cnt is derived from the intra-refresh interval. The Recovery point SEI message is described in ITU-T H.264 D.2.7 and ITU-T H.265 D.2.8. This message can be used by the decoder to understand from which picture all subsequent pictures (in display order) contain no errors, if decoding starts from the AU associated with this SEI message. In contrast to IDR, the recovery point message does not mark reference pictures as “unused for reference”.

Besides validation of syntax elements and their constraints, the decoder also uses various hints to handle broken streams:

  • If there are no valid slices for the current frame, the whole frame is skipped.

  • The slices which violate slice segment header semantics (ITU-T H.265 7.4.7.1) are skipped. Only slice_temporal_mvp_enabled_flag is checked for now.

  • Since an LTR (Long Term Reference) frame stays in the DPB until it is explicitly cleared by an IDR or an MMCO command, an incorrect LTR could cause long-standing visual artifacts. The AVC decoder uses the following approaches to handle this:

    • When a DPB overflow occurs because of an incorrect MMCO command that marks a reference picture as long term, the operation is rolled back.

    • An IDR frame with frame_num != 0 cannot be an LTR.

  • If the decoder detects frame gapping, it inserts ‘fake’ (marked as non-existing) frames, updates FrameNumWrap (ITU-T H.264 8.2.4.1) for reference frames, and applies the Sliding Window marking process (ITU-T H.264 8.2.5.3). ‘Fake’ frames are marked as reference, but since they are marked as non-existing they are not actually used for inter-prediction.

VP8 Specific Details

Unlike the other decoders supported by the SDK, the VP8 decoder can accept only a complete frame as input, and the application should provide it accompanied by the MFX_BITSTREAM_COMPLETE_FRAME flag. This is the single VP8-specific difference.

JPEG

The application can use the same decoding procedures for JPEG/motion JPEG decoding, as illustrated in pseudo code below:

// optional; retrieve initialization parameters
MFXVideoDECODE_DecodeHeader(...);
// decoder initialization
MFXVideoDECODE_Init(...);
// single frame/picture decoding
MFXVideoDECODE_DecodeFrameAsync(...);
MFXVideoCORE_SyncOperation(...);
// optional; retrieve meta-data
MFXVideoDECODE_GetUserData(...);
// close
MFXVideoDECODE_Close(...);

DECODE supports JPEG baseline profile decoding as follows:

  • DCT-based process

  • Source image: 8-bit samples within each component

  • Sequential

  • Huffman coding: 2 AC and 2 DC tables

  • 3 loadable quantization matrixes

  • Interleaved and non-interleaved scans

  • Single and multiple scans

    • chroma subsampling ratios:

    • Chroma 4:0:0 (grey image)

    • Chroma 4:1:1

    • Chroma 4:2:0

    • Chroma horizontal 4:2:2

    • Chroma vertical 4:2:2

    • Chroma 4:4:4

  • 3 channels images

The MFXVideoDECODE_Query() function will return MFX_ERR_UNSUPPORTED if the input bitstream contains unsupported features.

For still picture JPEG decoding, the input can be any JPEG bitstreams that conform to the ITU-T* Recommendation T.81, with an EXIF* or JFIF* header. For motion JPEG decoding, the input can be any JPEG bitstreams that conform to the ITU-T Recommendation T.81.

Unlike other SDK decoders, the JPEG decoder supports three different output color formats: NV12, YUY2, and RGB32. This support sometimes requires internal color conversion and more complicated initialization. The color format of the input bitstream is described by the JPEGChromaFormat and JPEGColorFormat fields in the mfxInfoMFX structure. The MFXVideoDECODE_DecodeHeader() function usually fills them in, but if the JPEG bitstream does not contain color format information, the application should provide it. The output color format is described by the general SDK parameters: the FourCC and ChromaFormat fields in the mfxFrameInfo structure.
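For example, a minimal sketch of requesting RGB32 output from the JPEG decoder (par is assumed to be the mfxVideoParam filled by DecodeHeader):

MFXVideoDECODE_DecodeHeader(session, bitstream, &par);
par.mfx.FrameInfo.FourCC       = MFX_FOURCC_RGB4;          /* request RGB32 output */
par.mfx.FrameInfo.ChromaFormat = MFX_CHROMAFORMAT_YUV444;
MFXVideoDECODE_Init(session, &par);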

Motion JPEG supports interlaced content by compressing each field (a half-height frame) individually. This behavior is incompatible with the rest of the SDK transcoding pipeline, where the SDK requires that fields be in odd and even lines of the same frame surface. The decoding procedure is modified as follows:

SDK supports JPEG picture rotation, in multiple of 90 degrees, as part of the decoding operation. By default, the MFXVideoDECODE_DecodeHeader() function returns the Rotation parameter so that after rotation, the pixel at the first row and first column is at the top left. The application can overwrite the default rotation before calling MFXVideoDECODE_Init().

The application may specify Huffman and quantization tables during decoder initialization by attaching mfxExtJPEGQuantTables and mfxExtJPEGHuffmanTables buffers to the mfxVideoParam structure. In this case, the decoder ignores the tables from the bitstream and uses the tables specified by the application. The application can also retrieve these tables by attaching the same buffers to mfxVideoParam and calling the MFXVideoDECODE_GetVideoParam() or MFXVideoDECODE_DecodeHeader() functions.

Multi-view video decoding

The SDK MVC decoder operates on complete MVC streams that contain all view/temporal configurations. The application can configure the SDK decoder to generate a subset at the decoding output. To do this, the application needs to understand the stream structure and based on such information configure the SDK decoder for target views.

The decoder initialization procedure is as follows:

  • The application calls the MFXVideoDECODE_DecodeHeader function to obtain the stream structural information. This is actually done in two sub-steps:

    • The application calls the MFXVideoDECODE_DecodeHeader function with the mfxExtMVCSeqDesc structure attached to the mfxVideoParam structure.

      Do not allocate memory for the arrays in the mfxExtMVCSeqDesc structure just yet. Set the View, ViewId and OP pointers to NULL and set NumViewAlloc, NumViewIdAlloc and NumOPAlloc to zero. The function parses the bitstream and returns MFX_ERR_NOT_ENOUGH_BUFFER with the correct values NumView, NumViewId and NumOP. This step can be skipped if the application is able to obtain the NumView, NumViewId and NumOP values from other sources.

    • The application allocates memory for the View, ViewId and OP arrays and calls the MFXVideoDECODE_DecodeHeader function again.

      The function returns the MVC structural information in the allocated arrays.

  • The application fills the mfxExtMvcTargetViews structure to choose the target views, based on information described in the mfxExtMVCSeqDesc structure.

  • The application initializes the SDK decoder using the MFXVideoDECODE_Init function. The application must attach both the mfxExtMVCSeqDesc structure and

    the mfxExtMvcTargetViews structure to the mfxVideoParam structure.

In the above steps, do not modify the values of the mfxExtMVCSeqDesc structure after the MFXVideoDECODE_DecodeHeader function, as the SDK decoder uses the values in the structure for internal memory allocation. Once the application configures the SDK decoder, the rest of the decoding procedure remains unchanged. As illustrated in the pseudo code below, the application calls the MFXVideoDECODE_DecodeFrameAsync function multiple times to obtain all target views of the current frame picture, one target view at a time. The target view is identified by the FrameID field of the mfxFrameInfo structure.

mfxExtBuffer *eb[2];
mfxExtMVCSeqDesc  seq_desc;
mfxVideoParam init_param;

init_param.ExtParam=eb;
init_param.NumExtParam=1;
eb[0]=(mfxExtBuffer *)&seq_desc;
MFXVideoDECODE_DecodeHeader(session, bitstream, &init_param);

/* select views to decode */
mfxExtMvcTargetViews tv;
init_param.NumExtParam=2;
eb[1]=(mfxExtBuffer *)&tv;

/* initialize decoder */
MFXVideoDECODE_Init(session, &init_param);

/* perform decoding */
for (;;) {
    MFXVideoDECODE_DecodeFrameAsync(session, bits, work, &disp,
                                    &syncp);
    MFXVideoCORE_SyncOperation(session, syncp, INFINITE);
}

/* close decoder */
MFXVideoDECODE_Close(session);

Encoding Procedures

Encoding procedure

There are two ways to allocate and handle shared memory in the SDK: external and internal.

Example below shows the pseudo code of the encoding procedure with external memory (legacy mode).

MFXVideoENCODE_QueryIOSurf(session, &init_param, &request);
allocate_pool_of_frame_surfaces(request.NumFrameSuggested);
MFXVideoENCODE_Init(session, &init_param);
sts=MFX_ERR_MORE_DATA;
for (;;) {
   if (sts==MFX_ERR_MORE_DATA && !end_of_stream()) {
      find_unlocked_surface_from_the_pool(&surface);
      fill_content_for_encoding(surface);
   }
   surface2=end_of_stream()?NULL:surface;
   sts=MFXVideoENCODE_EncodeFrameAsync(session,NULL,surface2,bits,&syncp);
   if (end_of_stream() && sts==MFX_ERR_MORE_DATA) break;
   // Skipped other error handling
   if (sts==MFX_ERR_NONE) {
      MFXVideoCORE_SyncOperation(session, syncp, INFINITE);
      do_something_with_encoded_bits(bits);
   }
}
MFXVideoENCODE_Close(session);
free_pool_of_frame_surfaces();

The following describes a few key points:

  • The application uses the MFXVideoENCODE_QueryIOSurf function to obtain the number of working frame surfaces required for reordering input frames.

  • The application calls the MFXVideoENCODE_EncodeFrameAsync function for the encoding operation. The input frame must be in an unlocked frame surface from the frame surface pool. If the encoding output is not available, the function returns the status code MFX_ERR_MORE_DATA to request additional input frames.

  • Upon successful encoding, the MFXVideoENCODE_EncodeFrameAsync function returns MFX_ERR_NONE. However, the encoded bitstream is not yet available because the MFXVideoENCODE_EncodeFrameAsync function is asynchronous. The application must use the MFXVideoCORE_SyncOperation function to synchronize the encoding operation before retrieving the encoded bitstream.

  • At the end of the stream, the application continuously calls the MFXVideoENCODE_EncodeFrameAsync function with NULL surface pointer to drain any remaining bitstreams cached within the SDK encoder, until the function returns MFX_ERR_MORE_DATA.

Note

It is the application’s responsibility to fill the pixels outside of the crop window when it is smaller than the frame to be encoded, especially in cases when the crops are not aligned to the minimum coding block size (16 for AVC, 8 for HEVC and VP9).

In the other approach, the SDK allocates memory for shared objects internally.

MFXVideoENCODE_Init(session, &init_param);
sts=MFX_ERR_MORE_DATA;
for (;;) {
   if (sts==MFX_ERR_MORE_DATA && !end_of_stream()) {
      MFXMemory_GetSurfaceForEncode(&surface);
      fill_content_for_encoding(surface);
   }
   surface2=end_of_stream()?NULL:surface;
   sts=MFXVideoENCODE_EncodeFrameAsync(session,NULL,surface2,bits,&syncp);
   if (surface2) surface2->FrameInterface->Release(surface2);
   if (end_of_stream() && sts==MFX_ERR_MORE_DATA) break;
   // Skipped other error handling
   if (sts==MFX_ERR_NONE) {
      MFXVideoCORE_SyncOperation(session, syncp, INFINITE);
      do_something_with_encoded_bits(bits);
   }
}
MFXVideoENCODE_Close(session);

There are several key points which are different from legacy mode:

  • The application does not need to call the MFXVideoENCODE_QueryIOSurf function to obtain the number of working frame surfaces, since allocation is done by the SDK.

  • The application calls the MFXMemory_GetSurfaceForEncode function to get free surface for the following encode operation.

  • The application needs to call the FrameInterface->(*Release) function to decrement reference counter of the obtained surface after MFXVideoENCODE_EncodeFrameAsync call.

Configuration Change

The application changes the configuration during encoding by calling the MFXVideoENCODE_Reset function. Depending on the difference in configuration parameters before and after the change, the SDK encoder either continues the current sequence or starts a new one. If the SDK encoder starts a new sequence, it completely resets internal state and begins the new sequence with an IDR frame.

The application controls the encoder behavior during a parameter change by attaching mfxExtEncoderResetOption to the mfxVideoParam structure during reset. By using this structure, the application instructs the encoder to start or not to start a new sequence after reset. In some cases the request to continue the current sequence cannot be satisfied and the encoder fails during reset. To avoid such cases the application may query the reset outcome before the actual reset by calling the MFXVideoENCODE_Query function with mfxExtEncoderResetOption attached to the mfxVideoParam structure.
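A minimal sketch of querying and applying a reset that tries to continue the current sequence (new_param is assumed to hold the updated mfxVideoParam configuration):

mfxExtEncoderResetOption reset_opt = {0};
reset_opt.Header.BufferId  = MFX_EXTBUFF_ENCODER_RESET_OPTION;
reset_opt.Header.BufferSz  = sizeof(reset_opt);
reset_opt.StartNewSequence = MFX_CODINGOPTION_OFF;   /* ask the encoder to continue the sequence */

mfxExtBuffer *eb[1] = { &reset_opt.Header };
new_param.ExtParam    = eb;
new_param.NumExtParam = 1;

sts = MFXVideoENCODE_Query(session, &new_param, &new_param);
if (sts == MFX_ERR_NONE)
   sts = MFXVideoENCODE_Reset(session, &new_param);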

The application uses the following procedure to change encoding configurations:

  • The application retrieves any cached frames in the SDK encoder by calling the MFXVideoENCODE_EncodeFrameAsync function with a NULL input frame pointer until the function returns MFX_ERR_MORE_DATA.

Note

The application must set the initial encoding configuration flag EndOfStream of the mfxExtCodingOption structure to OFF to avoid inserting an End of Stream (EOS) marker into the bitstream. An EOS marker causes the bitstream to terminate before encoding is complete.

  • The application calls the MFXVideoENCODE_Reset function with the new configuration:

    • If the function successfully set the configuration, the application can continue encoding as usual.

    • If the new configuration requires a new memory allocation, the function returns MFX_ERR_INCOMPATIBLE_VIDEO_PARAM. The application must close the SDK encoder and reinitialize the encoding procedure with the new configuration.

External Bit Rate Control

The application can make the encoder use an external BRC instead of the native one. In order to do that, it should attach the mfxExtCodingOption2 structure with ExtBRC = MFX_CODINGOPTION_ON and the mfxExtBRC callback structure to the mfxVideoParam structure during encoder initialization. The callbacks Init, Reset and Close will be invoked inside MFXVideoENCODE_Init, MFXVideoENCODE_Reset and MFXVideoENCODE_Close correspondingly. The figure below illustrates the asynchronous encoding flow with external BRC (usage of GetFrameCtrl and Update):

asynchronous encoding flow with external BRC

Note

IntAsyncDepth is the SDK max internal asynchronous encoding queue size; it is always less than or equal to mfxVideoParam::AsyncDepth.

External BRC Pseudo Code:

#include "mfxvideo.h"
#include "mfxbrc.h"

typedef struct {
   mfxU32 EncodedOrder;
   mfxI32 QP;
   mfxU32 MaxSize;
   mfxU32 MinSize;
   mfxU16 Status;
   mfxU64 StartTime;
   // ... skipped
} MyBrcFrame;

typedef struct {
   MyBrcFrame* frame_queue;
   mfxU32 frame_queue_size;
   mfxU32 frame_queue_max_size;
   mfxI32 max_qp[3]; //I,P,B
   mfxI32 min_qp[3]; //I,P,B
   // ... skipped
} MyBrcContext;

mfxStatus MyBrcInit(mfxHDL pthis, mfxVideoParam* par) {
   MyBrcContext* ctx = (MyBrcContext*)pthis;
   mfxI32 QpBdOffset;
   mfxExtCodingOption2* co2;

   if (!pthis || !par)
      return MFX_ERR_NULL_PTR;

   if (!IsParametersSupported(par))
      return MFX_ERR_UNSUPPORTED;

   ctx->frame_queue_max_size = par->AsyncDepth;
   ctx->frame_queue = (MyBrcFrame*)malloc(sizeof(MyBrcFrame) * ctx->frame_queue_max_size);

   if (!ctx->frame_queue)
      return MFX_ERR_MEMORY_ALLOC;

   co2 = (mfxExtCodingOption2*)GetExtBuffer(par->ExtParam, par->NumExtParam, MFX_EXTBUFF_CODING_OPTION2);
   QpBdOffset = (par->mfx.FrameInfo.BitDepthLuma > 8) ? (6 * (par->mfx.FrameInfo.BitDepthLuma - 8)) : 0;

   for (<X = I,P,B>) {
      ctx->max_qp[X] = (co2 && co2->MaxQPX) ? (co2->MaxQPX - QpBdOffset) : <Default>;
      ctx->min_qp[X] = (co2 && co2->MinQPX) ? (co2->MinQPX - QpBdOffset) : <Default>;
   }

   // skipped initialization of other BRC parameters

   ctx->frame_queue_size = 0;

   return MFX_ERR_NONE;
}

mfxStatus MyBrcReset(mfxHDL pthis, mfxVideoParam* par) {
   MyBrcContext* ctx = (MyBrcContext*)pthis;

   if (!pthis || !par)
      return MFX_ERR_NULL_PTR;

   if (!IsParametersSupported(par))
      return MFX_ERR_UNSUPPORTED;

   if (!IsResetPossible(ctx, par))
      return MFX_ERR_INCOMPATIBLE_VIDEO_PARAM;

   // reset here BRC parameters if required

   return MFX_ERR_NONE;
}

mfxStatus MyBrcClose(mfxHDL pthis) {
   MyBrcContext* ctx = (MyBrcContext*)pthis;

   if (!pthis)
      return MFX_ERR_NULL_PTR;

   if (ctx->frame_queue) {
      free(ctx->frame_queue);
      ctx->frame_queue = NULL;
      ctx->frame_queue_max_size = 0;
      ctx->frame_queue_size = 0;
   }

   return MFX_ERR_NONE;
}

mfxStatus MyBrcGetFrameCtrl(mfxHDL pthis, mfxBRCFrameParam* par, mfxBRCFrameCtrl* ctrl) {
   MyBrcContext* ctx = (MyBrcContext*)pthis;
   MyBrcFrame* frame = NULL;
   mfxU32 cost;

   if (!pthis || !par || !ctrl)
      return MFX_ERR_NULL_PTR;

   if (par->NumRecode > 0)
      frame = GetFrame(ctx->frame_queue, ctx->frame_queue_size, par->EncodedOrder);
   else if (ctx->frame_queue_size < ctx->frame_queue_max_size)
      frame = &ctx->frame_queue[ctx->frame_queue_size++];

   if (!frame)
      return MFX_ERR_UNDEFINED_BEHAVIOR;

   if (par->NumRecode == 0) {
      frame->EncodedOrder = par->EncodedOrder;
      cost = GetFrameCost(par->FrameType, par->PyramidLayer);
      frame->MinSize = GetMinSize(ctx, cost);
      frame->MaxSize = GetMaxSize(ctx, cost);
      frame->QP = GetInitQP(ctx, frame->MinSize, frame->MaxSize, cost); // from QP/size stat
      frame->StartTime = GetTime();
   }

   ctrl->QpY = frame->QP;

   return MFX_ERR_NONE;
}

mfxStatus MyBrcUpdate(mfxHDL pthis, mfxBRCFrameParam* par, mfxBRCFrameCtrl* ctrl, mfxBRCFrameStatus* status) {
   MyBrcContext* ctx = (MyBrcContext*)pthis;
   MyBrcFrame* frame = NULL;
   bool panic = false;

   if (!pthis || !par || !ctrl || !status)
      return MFX_ERR_NULL_PTR;

   frame = GetFrame(ctx->frame_queue, ctx->frame_queue_size, par->EncodedOrder);
   if (!frame)
      return MFX_ERR_UNDEFINED_BEHAVIOR;

   // update QP/size stat here

   if (   frame->Status == MFX_BRC_PANIC_BIG_FRAME
     || frame->Status == MFX_BRC_PANIC_SMALL_FRAME)
      panic = true;

   if (panic || (par->CodedFrameSize >= frame->MinSize && par->CodedFrameSize <= frame->MaxSize)) {
      UpdateBRCState(par->CodedFrameSize, ctx);
      RemoveFromQueue(ctx->frame_queue, ctx->frame_queue_size, frame);
      ctx->frame_queue_size--;
      status->BRCStatus = MFX_BRC_OK;

      // Here update Min/MaxSize for all queued frames

      return MFX_ERR_NONE;
   }

   panic = ((GetTime() - frame->StartTime) >= GetMaxFrameEncodingTime(ctx));

   if (par->CodedFrameSize > frame->MaxSize) {
      if (panic || (frame->QP >= ctx->max_qp[X])) {
         frame->Status = MFX_BRC_PANIC_BIG_FRAME;
      } else {
         frame->Status = MFX_BRC_BIG_FRAME;
         frame->QP = <increase QP>;
      }
   }

   if (par->CodedFrameSize < frame->MinSize) {
      if (panic || (frame->QP <= ctx->min_qp[X])) {
         frame->Status = MFX_BRC_PANIC_SMALL_FRAME;
         status->MinFrameSize = frame->MinSize;
      } else {
         frame->Status = MFX_BRC_SMALL_FRAME;
         frame->QP = <decrease QP>;
      }
   }

   status->BRCStatus = frame->Status;

   return MFX_ERR_NONE;
}

//initialize encoder
MyBrcContext brc_ctx;
mfxExtBRC ext_brc;
mfxExtCodingOption2 co2;
mfxExtBuffer* ext_buf[2] = {&co2.Header, &ext_brc.Header};

memset(&brc_ctx, 0, sizeof(MyBrcContext));
memset(&ext_brc, 0, sizeof(mfxExtBRC));
memset(&co2, 0, sizeof(mfxExtCodingOption2));

vpar.ExtParam = ext_buf;
vpar.NumExtParam = sizeof(ext_buf) / sizeof(ext_buf[0]);

co2.Header.BufferId = MFX_EXTBUFF_CODING_OPTION2;
co2.Header.BufferSz = sizeof(mfxExtCodingOption2);
co2.ExtBRC = MFX_CODINGOPTION_ON;

ext_brc.Header.BufferId = MFX_EXTBUFF_BRC;
ext_brc.Header.BufferSz = sizeof(mfxExtBRC);
ext_brc.pthis           = &brc_ctx;
ext_brc.Init            = MyBrcInit;
ext_brc.Reset           = MyBrcReset;
ext_brc.Close           = MyBrcClose;
ext_brc.GetFrameCtrl    = MyBrcGetFrameCtrl;
ext_brc.Update          = MyBrcUpdate;

status = MFXVideoENCODE_Query(session, &vpar, &vpar);
if (status == MFX_ERR_UNSUPPORTED || co2.ExtBRC != MFX_CODINGOPTION_ON) {
   // unsupported case
} else {
   status = MFXVideoENCODE_Init(session, &vpar);
}

JPEG

The application can use the same encoding procedures for JPEG/motion JPEG encoding, as illustrated by the pseudo code:

// encoder initialization
MFXVideoENCODE_Init (...);
// single frame/picture encoding
MFXVideoENCODE_EncodeFrameAsync (...);
MFXVideoCORE_SyncOperation(...);
// close down
MFXVideoENCODE_Close(...);

ENCODE supports JPEG baseline profile encoding as follows:

  • DCT-based process

  • Source image: 8-bit samples within each component

  • Sequential

  • Huffman coding: 2 AC and 2 DC tables

  • 3 loadable quantization matrices

  • Interleaved and non-interleaved scans

  • Single and multiple scans

  • Chroma subsampling ratios:

    • Chroma 4:0:0 (grey image)

    • Chroma 4:1:1

    • Chroma 4:2:0

    • Chroma horizontal 4:2:2

    • Chroma vertical 4:2:2

    • Chroma 4:4:4

  • 3-channel images

The application may specify Huffman and quantization tables during encoder initialization by attaching mfxExtJPEGQuantTables and mfxExtJPEGHuffmanTables buffers to the mfxVideoParam structure. If the application does not define the tables, the SDK encoder uses the tables recommended in ITU-T* Recommendation T.81. If the application does not define a quantization table, it must specify the Quality parameter in the mfxInfoMFX structure; in this case, the SDK encoder scales the default quantization table according to the specified Quality parameter.

The application should properly configure the chroma sampling format and color format. The FourCC and ChromaFormat fields in the mfxFrameInfo structure are used for this. For example, to encode a 4:2:2 vertically sampled YCbCr picture, the application should set FourCC to MFX_FOURCC_YUY2 and ChromaFormat to MFX_CHROMAFORMAT_YUV422V. To encode a 4:4:4 sampled RGB picture, the application should set FourCC to MFX_FOURCC_RGB4 and ChromaFormat to MFX_CHROMAFORMAT_YUV444.

The SDK encoder supports different sets of chroma sampling and color formats on different platforms. The application has to call the MFXVideoENCODE_Query() function to check whether the required color format is supported on a given platform, and then initialize the encoder with proper values of FourCC and ChromaFormat in the mfxFrameInfo structure.
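
For instance, the following is a minimal sketch of configuring a 4:2:2 vertically sampled YCbCr input and verifying it; par is assumed to be an otherwise configured mfxVideoParam for JPEG encoding (CodecId, Quality, frame rate, and so on):

/* a minimal sketch; par is assumed to be an otherwise configured mfxVideoParam for JPEG encoding */
par.mfx.FrameInfo.FourCC       = MFX_FOURCC_YUY2;
par.mfx.FrameInfo.ChromaFormat = MFX_CHROMAFORMAT_YUV422V;

mfxStatus sts = MFXVideoENCODE_Query(session, &par, &par);
if (sts < MFX_ERR_NONE) {
   /* the requested color format is not supported on this platform */
}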

The application should not define the number of scans and the number of components. They are derived by the SDK encoder from the Interleaved flag in the mfxInfoMFX structure and from the chroma type. If interleaved coding is specified, then one scan is encoded that contains all image components. Otherwise, the number of scans is equal to the number of components. The SDK encoder uses the following component IDs: 1 for luma (Y), 2 for chroma Cb (U), and 3 for chroma Cr (V).

The application should allocate a buffer large enough to hold the encoded picture. Roughly, its upper limit may be calculated using the following equation:

BufferSizeInKB = 4 + (Width * Height * BytesPerPx + 1023) / 1024;

where Width and Height are the width and height of the picture in pixels, and BytesPerPx is the number of bytes per pixel. BytesPerPx equals 1 for a monochrome picture, 1.5 for the NV12 and YV12 color formats, 2 for the YUY2 color format, and 3 for the RGB32 color format (the alpha channel is not encoded).
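
As a worked example, a rough sketch of allocating the output bitstream buffer for a 1920x1080 NV12 picture (BytesPerPx = 1.5); bs is assumed to be the application's mfxBitstream:

/* a worked example for 1920x1080 NV12 (BytesPerPx = 1.5); bs is an application mfxBitstream */
mfxU32 width  = 1920;
mfxU32 height = 1080;
mfxU32 buffer_size_in_kb = 4 + (width * height * 3 / 2 + 1023) / 1024;  /* = 3042 KB */

bs.MaxLength  = buffer_size_in_kb * 1024;
bs.Data       = (mfxU8*)malloc(bs.MaxLength);
bs.DataOffset = 0;
bs.DataLength = 0;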

Multi-view video encoding

Similar to the decoding and video processing initialization procedures, the application attaches the mfxExtMVCSeqDesc structure to the mfxVideoParam structure for encoding initialization. The mfxExtMVCSeqDesc structure configures the SDK MVC encoder to work in three modes:

  • Default dependency mode: the application specifies NumView and leaves all other fields zero. The SDK encoder creates a single operation point with all views (view identifiers 0 to NumView-1) as target views. The first view (view identifier 0) is the base view. All other views depend on the base view.

  • Explicit dependency mode: the application specifies NumView and the View dependency array, and sets all other fields to zero. The SDK encoder creates a single operation point with all views (view identifier View[0 : NumView-1].ViewId) as target views. The first view (view identifier View[0].ViewId) is the base view. The view dependencies follow the View dependency structures.

  • Complete mode: the application fully specifies the views and their dependencies. The SDK encoder generates a bitstream with corresponding stream structures.

The SDK MVC encoder does not support importing sequence and picture headers via the mfxExtCodingOptionSPSPPS structure, or configuring reference frame list via the mfxExtRefListCtrl structure.

During encoding, the SDK encoding function MFXVideoENCODE_EncodeFrameAsync accumulates input frames until encoding of a picture is possible. The function returns MFX_ERR_MORE_DATA if it needs more data at input, or MFX_ERR_NONE if it has successfully accumulated enough data for encoding a picture. The generated bitstream contains the complete picture (multiple views). The application can change this behavior and instruct the encoder to output each view in a separate bitstream buffer. To do so, the application has to turn on the ViewOutput flag in the mfxExtCodingOption structure. In this case, the encoder returns MFX_ERR_MORE_BITSTREAM if it needs more bitstream buffers at output, and MFX_ERR_NONE when processing of the picture (multiple views) has finished. It is recommended that the application provide a new input frame each time the SDK encoder requests a new bitstream buffer. The application must submit view data for encoding in the order the views are described in the mfxExtMVCSeqDesc structure. Particular view data can be submitted for encoding only when all views that it depends on have already been submitted.
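
A minimal sketch of turning on this per-view output mode during encoder initialization; the buffer is assumed to be attached to the encoder's mfxVideoParam together with mfxExtMVCSeqDesc:

/* a minimal sketch; the buffer is assumed to be attached to the encoder's mfxVideoParam */
mfxExtCodingOption co;
memset(&co, 0, sizeof(co));
co.Header.BufferId = MFX_EXTBUFF_CODING_OPTION;
co.Header.BufferSz = sizeof(mfxExtCodingOption);
co.ViewOutput      = MFX_CODINGOPTION_ON;   /* one bitstream buffer per view */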

The following pseudo code shows the encoding procedure:

mfxExtBuffer *eb;
mfxExtMVCSeqDesc  seq_desc;
mfxVideoParam init_param;

init_param.ExtParam=&eb;
init_param.NumExtParam=1;
eb=(mfxExtBuffer *)&seq_desc;
seq_desc.Header.BufferId=MFX_EXTBUFF_MVC_SEQ_DESC;
seq_desc.Header.BufferSz=sizeof(mfxExtMVCSeqDesc);
/* fill NumView, View and other mfxExtMVCSeqDesc fields according to the chosen mode */

/* init encoder */
MFXVideoENCODE_Init(session, &init_param);

/* perform encoding */
for (;;) {
    MFXVideoENCODE_EncodeFrameAsync(session, NULL, surface2, bits,
                                    &syncp);
    MFXVideoCORE_SyncOperation(session,syncp,INFINITE);
}

/* close encoder */
MFXVideoENCODE_Close(session);

Video Processing Procedures

Example below shows the pseudo code of the video processing procedure.

MFXVideoVPP_QueryIOSurf(session, &init_param, response);
allocate_pool_of_surfaces(in_pool, response[0].NumFrameSuggested);
allocate_pool_of_surfaces(out_pool, response[1].NumFrameSuggested);
MFXVideoVPP_Init(session, &init_param);
in=find_unlocked_surface_and_fill_content(in_pool);
out=find_unlocked_surface_from_the_pool(out_pool);
for (;;) {
   sts=MFXVideoVPP_RunFrameVPPAsync(session,in,out,aux,&syncp);
   if (sts==MFX_ERR_MORE_SURFACE || sts==MFX_ERR_NONE) {
      MFXVideoCORE_SyncOperation(session,syncp,INFINITE);
      process_output_frame(out);
      out=find_unlocked_surface_from_the_pool(out_pool);
   }
   if (sts==MFX_ERR_MORE_DATA && in==NULL)
      break;
   if (sts==MFX_ERR_NONE || sts==MFX_ERR_MORE_DATA) {
      in=find_unlocked_surface(in_pool);
      fill_content_for_video_processing(in);
      if (end_of_input_sequence())
         in=NULL;
   }
}
MFXVideoVPP_Close(session);
free_pool_of_surfaces(in_pool);
free_pool_of_surfaces(out_pool);

The following describes a few key points:

  • The application uses the MFXVideoVPP_QueryIOSurf function to obtain the number of frame surfaces needed for input and output. The application must allocate two frame surface pools, one for the input and the other for the output.

  • The video processing function MFXVideoVPP_RunFrameVPPAsync is asynchronous. The application must synchronize to make the output result ready, through the MFXVideoCORE_SyncOperation function.

  • The body of the video processing procedures covers three scenarios as follows:

  • If the number of frames consumed at input is equal to the number of frames generated at output, VPP returns MFX_ERR_NONE when an output is ready. The application must process the output frame after synchronization, as the MFXVideoVPP_RunFrameVPPAsync function is asynchronous. At the end of a sequence, the application must provide a NULL input to drain any remaining frames.

  • If the number of frames consumed at input is more than the number of frames generated at output, VPP returns MFX_ERR_MORE_DATA for additional input until an output is ready. When the output is ready, VPP returns MFX_ERR_NONE. The application must process the output frame after synchronization and provide a NULL input at the end of sequence to drain any remaining frames.

  • If the number of frames consumed at input is less than the number of frames generated at output, VPP returns either MFX_ERR_MORE_SURFACE (when more than one output is ready), or MFX_ERR_NONE (when one output is ready and VPP expects new input). In both cases, the application must process the output frame after synchronization and provide a NULL input at the end of sequence to drain any remaining frames.

Configuration

The SDK configures the video processing pipeline operation based on the difference between the input and output formats, specified in the mfxVideoParam structure. A few examples follow:

  • When the input color format is YUY2 and the output color format is NV12, the SDK enables color conversion from YUY2 to NV12.

  • When the input is interlaced and the output is progressive, the SDK enables de-interlacing.

  • When the input is single field and the output is interlaced or progressive, the SDK enables field weaving, optionally with deinterlacing.

  • When the input is interlaced and the output is single field, the SDK enables field splitting.

In addition to specifying the input and output formats, the application can provide hints to fine-tune the video processing pipeline operation. The application can disable filters in the pipeline by using the mfxExtVPPDoNotUse structure, enable them by using the mfxExtVPPDoUse structure, and configure them by using dedicated configuration structures. See the table of configurable video processing filters below for the complete list of filters, their IDs and configuration structures. See the ExtendedBufferID enumerator for more details.

The SDK ensures that all filters necessary to convert the input format to the output format are included in the pipeline. However, the SDK can skip some optional filters even if they are explicitly requested by the application, for example due to limitations of the underlying hardware. To notify the application about such a skip, the SDK returns the warning MFX_WRN_FILTER_SKIPPED. The application can retrieve the list of active filters by attaching the mfxExtVPPDoUse structure to the mfxVideoParam structure and calling the MFXVideoVPP_GetVideoParam function. The application must allocate enough memory for the filter list.
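
A sketch of retrieving the active filter list after initialization, assuming the application does not expect more than 8 filters:

/* a sketch, assuming at most 8 active filters are expected */
mfxU32 active_filters[8] = {0};

mfxExtVPPDoUse du_query;
memset(&du_query, 0, sizeof(du_query));
du_query.Header.BufferId = MFX_EXTBUFF_VPP_DOUSE;
du_query.Header.BufferSz = sizeof(mfxExtVPPDoUse);
du_query.NumAlg  = 8;               /* capacity of the list below */
du_query.AlgList = active_filters;

mfxExtBuffer *eb_query = (mfxExtBuffer*)&du_query;
mfxVideoParam par_query;
memset(&par_query, 0, sizeof(par_query));
par_query.NumExtParam = 1;
par_query.ExtParam    = &eb_query;

MFXVideoVPP_GetVideoParam(session, &par_query);
/* du_query.AlgList now holds the IDs of the filters that are actually active */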

Configurable VPP filters:

Filter ID                             | Configuration structure
MFX_EXTBUFF_VPP_DENOISE               | mfxExtVPPDenoise
MFX_EXTBUFF_VPP_MCTF                  | mfxExtVppMctf
MFX_EXTBUFF_VPP_DETAIL                | mfxExtVPPDetail
MFX_EXTBUFF_VPP_FRAME_RATE_CONVERSION | mfxExtVPPFrameRateConversion
MFX_EXTBUFF_VPP_IMAGE_STABILIZATION   | mfxExtVPPImageStab
MFX_EXTBUFF_VPP_PICSTRUCT_DETECTION   | none
MFX_EXTBUFF_VPP_PROCAMP               | mfxExtVPPProcAmp
MFX_EXTBUFF_VPP_FIELD_PROCESSING      | mfxExtVPPFieldProcessing

Example of Video Processing configuration:

/* enable image stabilization filter with default settings */
mfxExtVPPDoUse du;
mfxU32 al=MFX_EXTBUFF_VPP_IMAGE_STABILIZATION;

du.Header.BufferId=MFX_EXTBUFF_VPP_DOUSE;
du.Header.BufferSz=sizeof(mfxExtVPPDoUse);
du.NumAlg=1;
du.AlgList=&al;

/* configure the mfxVideoParam structure */
mfxVideoParam conf;
mfxExtBuffer *eb=(mfxExtBuffer *)&du;

memset(&conf,0,sizeof(conf));
conf.IOPattern=MFX_IOPATTERN_IN_SYSTEM_MEMORY | MFX_IOPATTERN_OUT_SYSTEM_MEMORY;
conf.NumExtParam=1;
conf.ExtParam=&eb;

conf.vpp.In.FourCC=MFX_FOURCC_YV12;
conf.vpp.Out.FourCC=MFX_FOURCC_NV12;
conf.vpp.In.Width=conf.vpp.Out.Width=1920;
conf.vpp.In.Height=conf.vpp.Out.Height=1088;

/* video processing initialization */
MFXVideoVPP_Init(session, &conf);

Region of Interest

During video processing operations, the application can specify a region of interest for each frame, as illustrated below:

(Figure: VPP Region of Interest Operation)

Specifying a region of interest guides the resizing function to achieve special effects such as resizing from 16:9 to 4:3 while keeping the aspect ratio intact. Use the CropX, CropY, CropW and CropH parameters in the mfxVideoParam structure to specify a region of interest.

Examples of VPP Operations on Region of Interest:

Operation                                              | VPP Input Width/Height | VPP Input CropX, CropY, CropW, CropH | VPP Output Width/Height | VPP Output CropX, CropY, CropW, CropH
Cropping                                               | 720x480                | 16, 16, 688, 448                     | 720x480                 | 16, 16, 688, 448
Resizing                                               | 720x480                | 0, 0, 720, 480                       | 1440x960                | 0, 0, 1440, 960
Horizontal stretching                                  | 720x480                | 0, 0, 720, 480                       | 640x480                 | 0, 0, 640, 480
16:9 to 4:3 with letter boxing at the top and bottom   | 1920x1088              | 0, 0, 1920, 1088                     | 720x480                 | 0, 36, 720, 408
4:3 to 16:9 with pillar boxing at the left and right   | 720x480                | 0, 0, 720, 480                       | 1920x1088               | 144, 0, 1632, 1088
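
As an illustration, the following sketch configures the 16:9 to 4:3 letter boxing row from the table above; vpp_par is assumed to be an otherwise configured mfxVideoParam for VPP:

/* a sketch of the 16:9 to 4:3 letter boxing case; vpp_par is an otherwise configured mfxVideoParam */
vpp_par.vpp.In.Width  = 1920;  vpp_par.vpp.In.Height  = 1088;
vpp_par.vpp.In.CropX  = 0;     vpp_par.vpp.In.CropY   = 0;
vpp_par.vpp.In.CropW  = 1920;  vpp_par.vpp.In.CropH   = 1088;

vpp_par.vpp.Out.Width = 720;   vpp_par.vpp.Out.Height = 480;
vpp_par.vpp.Out.CropX = 0;     vpp_par.vpp.Out.CropY  = 36;   /* 36-pixel bars at top and bottom */
vpp_par.vpp.Out.CropW = 720;   vpp_par.vpp.Out.CropH  = 408;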

Multi-view video processing

The SDK video processing supports processing multiple views. For video processing initialization, the application needs to attach the mfxExtMVCSeqDesc structure to the mfxVideoParam structure and call the MFXVideoVPP_Init function. The function saves the view identifiers. During video processing, the SDK processes each view independently, one view at a time. The SDK refers to the FrameID field of the mfxFrameInfo structure to configure each view according to its processing pipeline. The application needs to fill the FrameID field before calling the MFXVideoVPP_RunFrameVPPAsync function if the video processing source frame is not the output from the SDK MVC decoder. The following pseudo code illustrates this:

mfxExtBuffer *eb;
mfxExtMVCSeqDesc  seq_desc;
mfxVideoParam init_param;

init_param.ExtParam = &eb;
init_param.NumExtParam=1;
eb=(mfxExtBuffer *)&seq_desc;
seq_desc.Header.BufferId=MFX_EXTBUFF_MVC_SEQ_DESC;
seq_desc.Header.BufferSz=sizeof(mfxExtMVCSeqDesc);

/* init VPP */
MFXVideoVPP_Init(session, &init_param);

/* perform processing */
for (;;) {
    MFXVideoVPP_RunFrameVPPAsync(session,in,out,aux,&syncp);
    MFXVideoCORE_SyncOperation(session,syncp,INFINITE);
}

/* close VPP */
MFXVideoVPP_Close(session);

Transcoding Procedures

The application can use the SDK encoding, decoding and video processing functions together for transcoding operations. This section describes the key aspects of connecting two or more SDK functions together.

Asynchronous Pipeline

The application passes the output of an upstream SDK function to the input of the downstream SDK function to construct an asynchronous pipeline. Such pipeline construction is done at runtime and can be dynamically changed, as illustrated below:

mfxSyncPoint sp_d, sp_e;
MFXVideoDECODE_DecodeFrameAsync(session,bs,work,&vin, &sp_d);
if (going_through_vpp) {
   MFXVideoVPP_RunFrameVPPAsync(session,vin,vout,NULL,&sp_d);
   MFXVideoENCODE_EncodeFrameAsync(session,NULL,vout,bits2,&sp_e);
} else {
   MFXVideoENCODE_EncodeFrameAsync(session,NULL,vin,bits2,&sp_e);
}
MFXVideoCORE_SyncOperation(session,sp_e,INFINITE);

The SDK simplifies the requirement for asynchronous pipeline synchronization. The application only needs to synchronize after the last SDK function. Explicit synchronization of intermediate results is not required and in fact can slow performance.

The SDK tracks the dynamic pipeline construction and verifies dependency on input and output parameters to ensure the execution order of the pipeline functions. In the example above, the SDK will ensure MFXVideoENCODE_EncodeFrameAsync does not begin its operation until MFXVideoDECODE_DecodeFrameAsync or MFXVideoVPP_RunFrameVPPAsync has finished.

During the execution of an asynchronous pipeline, the application must consider the input data in use and must not change it until the execution has completed. The application must also consider output data unavailable until the execution has finished. In addition, for encoders, the application must consider extended and payload buffers in use while the input surface is locked.

The SDK checks dependencies by comparing the input and output parameters of each SDK function in the pipeline. Do not modify the contents of input and output parameters before the previous asynchronous operation finishes. Doing so will break the dependency check and can result in undefined behavior. An exception occurs when the input and output parameters are structures, in which case overwriting fields in the structures is allowed.

Note

Note that the dependency check works on the pointers to the structures only.

There are two exceptions with respect to intermediate synchronization:

  • The application must synchronize any input before calling the SDK function MFXVideoDECODE_DecodeFrameAsync, if the input is from any asynchronous operation.

  • When the application calls an asynchronous function to generate an output surface in video memory and passes that surface to a non-SDK component, it must explicitly synchronize the operation before passing the surface to the non-SDK component.

Pseudo Code of Asynchronous ENC->ENCODE Pipeline Construction:

mfxENCInput enc_in = ...;
mfxENCOutput enc_out = ...;
mfxSyncPoint sp_e, sp_n;
mfxFrameSurface1* surface = get_frame_to_encode();
mfxExtBuffer dependency;
dependency.BufferId = MFX_EXTBUFF_TASK_DEPENDENCY;
dependency.BufferSz = sizeof(mfxExtBuffer);

enc_in.InSurface = surface;
enc_out.ExtParam[enc_out.NumExtParam++] = &dependency;
MFXVideoENC_ProcessFrameAsync(session, &enc_in, &enc_out, &sp_e);

surface->Data.ExtParam[surface->Data.NumExtParam++] = &dependency;
MFXVideoENCODE_EncodeFrameAsync(session, NULL, surface, &bs, &sp_n);

MFXVideoCORE_SyncOperation(session, sp_n, INFINITE);
surface->Data.NumExtParam--;

Surface Pool Allocation

When connecting SDK function A to SDK function B, the application must take into account the needs of both functions when calculating the number of frame surfaces in the surface pool. Typically, the application can use the formula Na+Nb, where Na is the number of frame surfaces needed at the output of SDK function A, and Nb is the number of frame surfaces needed at the input of SDK function B.

For performance considerations, the application must submit multiple operations and delay synchronization as much as possible, which gives the SDK flexibility to organize internal pipelining. For example, the operation sequence:

ENCODE(F1) -> ENCODE(F2) -> SYNC(F1) -> SYNC(F2)

is recommended, compared with:

ENCODE(F1) -> SYNC(F1) -> ENCODE(F2) -> SYNC(F2)

In this case, the surface pool needs additional surfaces to take into account multiple asynchronous operations before synchronization. The application can use the AsyncDepth parameter of the mfxVideoParam structure to inform an SDK function how many asynchronous operations the application plans to perform before synchronization. The corresponding SDK QueryIOSurf function will reflect this in the NumFrameSuggested value. The example below shows a way of calculating the surface needs based on the NumFrameSuggested values:

async_depth=4;
init_param_v.AsyncDepth=async_depth;
MFXVideoVPP_QueryIOSurf(session, &init_param_v, response_v);
init_param_e.AsyncDepth=async_depth;
MFXVideoENCODE_QueryIOSurf(session, &init_param_e, &response_e);
num_surfaces=    response_v[1].NumFrameSuggested
         +response_e.NumFrameSuggested
         -async_depth; /* double counted in ENCODE & VPP */

Pipeline Error Reporting

During asynchronous pipeline construction, each stage SDK function will return a synchronization point (sync point). These synchronization points are useful in tracking errors during the asynchronous pipeline operation.

Assume the pipeline is:

A -> B -> C

The application synchronizes on sync point C. If the error occurs in SDK function C, then the synchronization returns the exact error code. If the error occurs before SDK function C, then the synchronization returns MFX_ERR_ABORTED. The application can then try to synchronize on sync point B. Similarly, if the error occurs in SDK function B, the synchronization returns the exact error code, or else MFX_ERR_ABORTED. Same logic applies if the error occurs in SDK function A.
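
A sketch of this fallback, assuming sp_a, sp_b and sp_c are the sync points returned by stages A, B and C:

/* a sketch; sp_a, sp_b and sp_c are assumed sync points of stages A, B and C */
mfxStatus sts = MFXVideoCORE_SyncOperation(session, sp_c, INFINITE);
if (sts == MFX_ERR_ABORTED) {
   /* the failure happened upstream of C; check stage B */
   sts = MFXVideoCORE_SyncOperation(session, sp_b, INFINITE);
   if (sts == MFX_ERR_ABORTED) {
      /* still upstream; stage A reports the exact error */
      sts = MFXVideoCORE_SyncOperation(session, sp_a, INFINITE);
   }
}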

Working with hardware acceleration

Working with multiple Intel media devices

If your system has multiple Intel Gen Graphics adapters, you may need hints on which adapter is better suited to process a particular workload. The SDK provides a helper API to select the best suitable adapter for your workload based on the passed workload description. The example below showcases workload initialization on a discrete adapter:

mfxU32 num_adapters_available;

// Query number of Intel Gen Graphics adapters available on system
mfxStatus sts = MFXQueryAdaptersNumber(&num_adapters_available);
MSDK_CHECK_STATUS(sts, "MFXQueryAdaptersNumber failed");

// Allocate memory for response
std::vector<mfxAdapterInfo> displays_data(num_adapters_available);
mfxAdaptersInfo adapters = { displays_data.data(), mfxU32(displays_data.size()), 0u };

// Query information about all adapters (mind that first parameter is NULL)
sts = MFXQueryAdapters(nullptr, &adapters);
MSDK_CHECK_STATUS(sts, "MFXQueryAdapters failed");

// Find dGfx adapter in list of adapters
auto idx_d = std::find_if(adapters.Adapters, adapters.Adapters + adapters.NumActual,
    [](const mfxAdapterInfo info)
{
   return info.Platform.MediaAdapterType == mfxMediaAdapterType::MFX_MEDIA_DISCRETE;
});

// No dGfx in list
if (idx_d == adapters.Adapters + adapters.NumActual)
{
   printf("Warning: No dGfx detected on machine\n");
   return -1;
}

mfxU32 idx = static_cast<mfxU32>(std::distance(adapters.Adapters, idx_d));

// Choose correct implementation for discrete adapter
switch (adapters.Adapters[idx].Number)
{
case 0:
   impl = MFX_IMPL_HARDWARE;
   break;
case 1:
   impl = MFX_IMPL_HARDWARE2;
   break;
case 2:
   impl = MFX_IMPL_HARDWARE3;
   break;
case 3:
   impl = MFX_IMPL_HARDWARE4;
   break;

default:
   // Try searching on all display adapters
   impl = MFX_IMPL_HARDWARE_ANY;
   break;
}

// Initialize mfxSession in regular way with obtained implementation

As shown in this example, after obtaining the adapter list with MFXQueryAdapters, further initialization of mfxSession is performed in the regular way. The particular adapter is chosen with the MFX_IMPL_HARDWARE, ..., MFX_IMPL_HARDWARE4 values of mfxIMPL.

The example below showcases the usage of MFXQueryAdapters for querying the best suitable adapter for a particular encode workload (see the MFXQueryAdapters description for adapter priority rules):

mfxU32 num_adapters_available;

// Query number of Intel Gen Graphics adapters available on system
mfxStatus sts = MFXQueryAdaptersNumber(&num_adapters_available);
MSDK_CHECK_STATUS(sts, "MFXQueryAdaptersNumber failed");

// Allocate memory for response
std::vector<mfxAdapterInfo> displays_data(num_adapters_available);
mfxAdaptersInfo adapters = { displays_data.data(), mfxU32(displays_data.size()), 0u };

// Fill description of Encode workload
mfxComponentInfo interface_request = { MFX_COMPONENT_ENCODE, Encode_mfxVideoParam };

// Query information about suitable adapters for Encode workload described by Encode_mfxVideoParam
sts = MFXQueryAdapters(&interface_request, &adapters);

if (sts == MFX_ERR_NOT_FOUND)
{
   printf("Error: No adapters on machine capable to process desired workload\n");
   return -1;
}

MSDK_CHECK_STATUS(sts, "MFXQueryAdapters failed");

// Choose correct implementation for discrete adapter. Mind usage of index 0, this is best suitable adapter from MSDK perspective
switch (adapters.Adapters[0].Number)
{
case 0:
   impl = MFX_IMPL_HARDWARE;
   break;
case 1:
   impl = MFX_IMPL_HARDWARE2;
   break;
case 2:
   impl = MFX_IMPL_HARDWARE3;
   break;
case 3:
   impl = MFX_IMPL_HARDWARE4;
   break;

default:
   // Try searching on all display adapters
   impl = MFX_IMPL_HARDWARE_ANY;
   break;
}

// Initialize mfxSession in regular way with obtained implementation

Working with video memory

To fully utilize the SDK acceleration capability, the application should support OS-specific infrastructures: Microsoft* DirectX* for Microsoft* Windows* and VA API for Linux*.

The hardware acceleration support in application consists of video memory support and acceleration device support.

Depending on the usage model, the application can use video memory at different stages of the pipeline. Three major scenarios are illustrated below:

  • SDK Functions interconnection: SDK Function -> [Video Memory] -> SDK Function

  • Video memory as output: SDK Function -> [Video Memory] -> Application

  • Video memory as input: Application -> [Video Memory] -> SDK Function

The application must use the IOPattern field of the mfxVideoParam structure to indicate the I/O access pattern during initialization. Subsequent SDK function calls must follow this access pattern. For example, if an SDK function operates on video memory surfaces at both input and output, the application must set IOPattern at initialization to MFX_IOPATTERN_IN_VIDEO_MEMORY for input and MFX_IOPATTERN_OUT_VIDEO_MEMORY for output. This particular I/O access pattern must not change inside the Init ... Close sequence.

Initialization of any hardware accelerated SDK component requires the acceleration device handle. This handle is also used by SDK component to query HW capabilities. The application can share its device with the SDK by passing device handle through the MFXVideoCORE_SetHandle function. It is recommended to share the handle before any actual usage of the SDK.

Working with Microsoft* DirectX* Applications

The SDK supports two different infrastructures for hardware acceleration on Microsoft* Windows* OS, “Direct3D 9 DXVA2” and “Direct3D 11 Video API”. In the first one, the application should use the IDirect3DDeviceManager9 interface as the acceleration device handle; in the second one, the ID3D11Device interface. The application should share one of these interfaces with the SDK through the MFXVideoCORE_SetHandle function. If the application does not provide it, the SDK creates its own internal acceleration device. This internal device cannot be accessed by the application and, as a result, the SDK input and output will be limited to system memory only, which in turn will reduce SDK performance. If the SDK fails to create a valid acceleration device, the SDK cannot proceed with hardware acceleration and returns an error status to the application.

The application must create the Direct3D9* device with the flag D3DCREATE_MULTITHREADED. Additionally the flag D3DCREATE_FPU_PRESERVE is recommended. This influences floating-point calculations, including PTS values.
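
A minimal sketch of sharing the device with the SDK before any other SDK call; pD3DDeviceManager9 and pD11Device are assumed to have been created by the application:

/* a minimal sketch; pD3DDeviceManager9 / pD11Device are assumed to be created by the application */
/* Direct3D 9 DXVA2 infrastructure */
MFXVideoCORE_SetHandle(session, MFX_HANDLE_D3D9_DEVICE_MANAGER, (mfxHDL)pD3DDeviceManager9);

/* or, Direct3D 11 Video API infrastructure */
MFXVideoCORE_SetHandle(session, MFX_HANDLE_D3D11_DEVICE, (mfxHDL)pD11Device);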

The application must also set multithreading mode for Direct3D11* device. Example below illustrates how to do it:

ID3D11Device            *pD11Device;
ID3D11DeviceContext     *pD11Context;
ID3D10Multithread       *pD10Multithread;

pD11Device->GetImmediateContext(&pD11Context);
pD11Context->QueryInterface(IID_ID3D10Multithread, (void**)&pD10Multithread);
pD10Multithread->SetMultithreadProtected(true);

During hardware acceleration, if a Direct3D* “device lost” event occurs, the SDK operation terminates with the return status MFX_ERR_DEVICE_LOST. If the application provided the Direct3D* device handle, the application must reset the Direct3D* device.

When the SDK decoder creates auxiliary devices for hardware acceleration, it must allocate the list of Direct3D* surfaces for I/O access, also known as the surface chain, and pass the surface chain as part of the device creation command. In most cases, the surface chain is the frame surface pool mentioned in the Frame Surface Locking section.

The application passes the surface chain to the SDK component Init function through an SDK external allocator callback. See the Memory Allocation and External Allocators section for details.

Only decoder Init function requests external surface chain from the application and uses it for auxiliary device creation. Encoder and VPP Init functions may only request internal surfaces. See the ExtMemFrameType enumerator for more details about different memory types.

Depending on configuration parameters, SDK requires different surface types. It is strongly recommended to call one of the MFXVideoENCODE_QueryIOSurf, MFXVideoDECODE_QueryIOSurf or MFXVideoVPP_QueryIOSurf functions to determine the appropriate type.

Supported SDK Surface Types and Color Formats for Direct3D9:

Class          | Input Surface Type               | Input Color Format     | Output Surface Type      | Output Color Format
DECODE         | Not Applicable                   | Not Applicable         | Decoder Render Target    | NV12
DECODE (JPEG)  | Not Applicable                   | Not Applicable         | Decoder Render Target    | RGB32, YUY2
VPP            | Decoder/Processor Render Target  | Listed in ColorFourCC  | Decoder Render Target    | NV12
VPP            | Decoder/Processor Render Target  | Listed in ColorFourCC  | Processor Render Target  | RGB32
ENCODE         | Decoder Render Target            | NV12                   | Not Applicable           | Not Applicable
ENCODE (JPEG)  | Decoder Render Target            | RGB32, YUY2, YV12      | Not Applicable           | Not Applicable

Note

“Decoder Render Target” corresponds to DXVA2_VideoDecoderRenderTarget type.

Note

“Processor Render Target” corresponds to DXVA2_VideoProcessorRenderTarget.

Supported SDK Surface Types and Color Formats for Direct3D11:

Class          | Input Surface Type               | Input Color Format     | Output Surface Type              | Output Color Format
DECODE         | Not Applicable                   | Not Applicable         | Decoder Render Target            | NV12
DECODE (JPEG)  | Not Applicable                   | Not Applicable         | Decoder/Processor Render Target  | RGB32, YUY2
VPP            | Decoder/Processor Render Target  | Listed in ColorFourCC  | Processor Render Target          | NV12
VPP            | Decoder/Processor Render Target  | Listed in ColorFourCC  | Processor Render Target          | RGB32
ENCODE         | Decoder/Processor Render Target  | NV12                   | Not Applicable                   | Not Applicable
ENCODE (JPEG)  | Decoder/Processor Render Target  | RGB32, YUY2            | Not Applicable                   | Not Applicable

Note

“Decoder Render Target” corresponds to D3D11_BIND_DECODER flag.

Note

“Processor Render Target” corresponds to D3D11_BIND_RENDER_TARGET.

Note

NV12 is the major encoding and decoding color format.

Note

Additionally, JPEG/MJPEG decoder supports RGB32 and YUY2 output.

Note

JPEG/MJPEG encoder supports RGB32 and YUY2 input for Direct3D9/Direct3D11 and YV12 input for Direct3D9 only.

Note

VPP supports RGB32 output.

Working with VA API Applications

The SDK supports a single infrastructure for hardware acceleration on Linux*: VA API. The application should use the VADisplay interface as the acceleration device handle for this infrastructure and share it with the SDK through the MFXVideoCORE_SetHandle function. Because the SDK does not create an internal acceleration device on Linux, the application must always share it with the SDK. This sharing should be done before any actual usage of the SDK, including capability query and component initialization. If the application fails to share the device, the SDK operation will fail.

Obtaining VA display from X Window System:

Display   *x11_display;
VADisplay va_display;

x11_display = XOpenDisplay(current_display);
va_display  = vaGetDisplay(x11_display);

MFXVideoCORE_SetHandle(session, MFX_HANDLE_VA_DISPLAY, (mfxHDL) va_display);

Obtaining VA display from Direct Rendering Manager:

int card;
VADisplay va_display;

card = open("/dev/dri/card0", O_RDWR); /* primary card */
va_display = vaGetDisplayDRM(card);
vaInitialize(va_display, &major_version, &minor_version);

MFXVideoCORE_SetHandle(session, MFX_HANDLE_VA_DISPLAY, (mfxHDL) va_display);

When the SDK decoder creates hardware acceleration device, it must allocate the list of video memory surfaces for I/O access, also known as the surface chain, and pass the surface chain as part of the device creation command. The application passes the surface chain to the SDK component Init function through an SDK external allocator callback. See the Memory Allocation and External Allocators section for details.


Only decoder Init function requests external surface chain from the application and uses it for device creation. Encoder and VPP Init functions may only request internal surfaces. See the ExtMemFrameType enumerator for more details about different memory types.

Note

The VA API does not define any surface types and the application can use either MFX_MEMTYPE_VIDEO_MEMORY_DECODER_TARGET or MFX_MEMTYPE_VIDEO_MEMORY_PROCESSOR_TARGET to indicate data in video memory.

Supported SDK Surface Types and Color Formats for VA API:

SDK Class      | SDK Function Input     | SDK Function Output
DECODE         | Not Applicable         | NV12
DECODE (JPEG)  | Not Applicable         | RGB32, YUY2
VPP            | Listed in ColorFourCC  | NV12, RGB32
ENCODE         | NV12                   | Not Applicable
ENCODE (JPEG)  | RGB32, YUY2, YV12      | Not Applicable

Memory Allocation and External Allocators

There are two models of memory management in SDK implementations: internal and external.

External memory management

In the external memory model the application must allocate sufficient memory for input and output parameters and buffers, and de-allocate it when the SDK functions complete their operations. During execution, the SDK functions use callback functions to the application to manage memory for video frames through the external allocator interface mfxFrameAllocator.

If an application needs to control the allocation of video frames, it can use callback functions through the mfxFrameAllocator interface. If an application does not specify an allocator, an internal allocator is used. However, if an application uses video memory surfaces for input and output, it must specify the hardware acceleration device and an external frame allocator using mfxFrameAllocator.

The external frame allocator can allocate different frame types:

  • in system memory

  • in video memory, as “decoder render targets” or “processor render targets.” See the section Working with hardware acceleration for additional details.

The external frame allocator responds only to frame allocation requests for the requested memory type and returns MFX_ERR_UNSUPPORTED for all others. The allocation request uses flags, part of memory type field, to indicate which SDK class initiates the request, so the external frame allocator can respond accordingly.

Simple external frame allocator:

typedef struct {
   mfxU16 width, height;
   mfxU8 *base;
} mid_struct;

mfxStatus fa_alloc(mfxHDL pthis, mfxFrameAllocRequest *request, mfxFrameAllocResponse *response) {
   if (!(request->Type & MFX_MEMTYPE_SYSTEM_MEMORY))
      return MFX_ERR_UNSUPPORTED;
   if (request->Info.FourCC != MFX_FOURCC_NV12)
      return MFX_ERR_UNSUPPORTED;
   response->mids = (mfxMemId*)calloc(request->NumFrameMin, sizeof(mfxMemId));
   if (!response->mids)
      return MFX_ERR_MEMORY_ALLOC;
   response->NumFrameActual = request->NumFrameMin;
   for (int i = 0; i < request->NumFrameMin; i++) {
      mid_struct *mmid = (mid_struct *)malloc(sizeof(mid_struct));
      mmid->width  = ALIGN32(request->Info.Width);
      mmid->height = ALIGN32(request->Info.Height);
      mmid->base   = (mfxU8*)malloc(mmid->width * mmid->height * 3 / 2); /* NV12 needs 1.5 bytes per pixel */
      response->mids[i] = mmid;
   }
   return MFX_ERR_NONE;
}

mfxStatus fa_lock(mfxHDL pthis, mfxMemId mid, mfxFrameData *ptr) {
   mid_struct *mmid=(mid_struct *)mid;
   ptr->Pitch=mmid->width;
   ptr->Y=mmid->base;
   ptr->U=ptr->Y+mmid->width*mmid->height;
   ptr->V=ptr->U+1;
   return MFX_ERR_NONE;
}

mfxStatus fa_unlock(mfxHDL pthis, mfxMemId mid, mfxFrameData *ptr) {
   if (ptr) ptr->Y=ptr->U=ptr->V=ptr->A=0;
   return MFX_ERR_NONE;
}

mfxStatus fa_gethdl(mfxHDL pthis, mfxMemId mid, mfxHDL *handle) {
   return MFX_ERR_UNSUPPORTED;
}

mfxStatus fa_free(mfxHDL pthis, mfxFrameAllocResponse *response) {
   for (int i=0;i<response->NumFrameActual;i++) {
      mid_struct *mmid=(mid_struct *)response->mids[i];
      free(mmid->base);
      free(mmid);
   }
   free(response->mids);
   response->mids=NULL;
   return MFX_ERR_NONE;
}

For system memory, it is highly recommended to allocate memory for all planes of the same frame as a single buffer (using one single malloc call).
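
A sketch of registering the callbacks above with the session through the mfxFrameAllocator structure:

/* a sketch of hooking the callbacks above into the session */
mfxFrameAllocator frame_allocator;
memset(&frame_allocator, 0, sizeof(frame_allocator));
frame_allocator.pthis  = NULL;       /* no allocator state is needed in this simple example */
frame_allocator.Alloc  = fa_alloc;
frame_allocator.Lock   = fa_lock;
frame_allocator.Unlock = fa_unlock;
frame_allocator.GetHDL = fa_gethdl;
frame_allocator.Free   = fa_free;

MFXVideoCORE_SetFrameAllocator(session, &frame_allocator);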

Internal memory management

In the internal memory management model the SDK provides interface functions for frame allocation:

MFXMemory_GetSurfaceForVPP()

MFXMemory_GetSurfaceForEncode()

MFXMemory_GetSurfaceForDecode()

which are used together with mfxFrameSurfaceInterface for surface management. The surface returned by these functions is a reference-counted object and the application has to call mfxFrameSurfaceInterface::Release after finishing all operations with the surface. In this model the application does not need to create and set an external allocator in the SDK. Another possibility to obtain an internally allocated surface is to call MFXVideoDECODE_DecodeFrameAsync() with a working surface equal to NULL (see the Simplified decoding procedure). In this case the decoder allocates a new reference-counted mfxFrameSurface1 and returns it to the user. All assumed contracts with the user are the same as for the MFXMemory_GetSurfaceForXXX functions.
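
A minimal sketch of the internal memory model for encoding; fill_frame is a hypothetical helper that writes pixel data, and the encoder is assumed to be already initialized:

/* a minimal sketch; fill_frame is a hypothetical helper, the encoder is assumed initialized */
mfxFrameSurface1 *surf = NULL;
mfxStatus sts = MFXMemory_GetSurfaceForEncode(session, &surf);
if (sts == MFX_ERR_NONE) {
   surf->FrameInterface->Map(surf, MFX_MAP_WRITE);
   fill_frame(surf);                         /* write pixel data */
   surf->FrameInterface->Unmap(surf);

   MFXVideoENCODE_EncodeFrameAsync(session, NULL, surf, &bs, &syncp);
   MFXVideoCORE_SyncOperation(session, syncp, INFINITE);

   surf->FrameInterface->Release(surf);      /* the surface is reference counted */
}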

mfxFrameSurfaceInterface

Starting with API version 2.0, the SDK supports mfxFrameSurfaceInterface. This interface is a set of callback functions to manage the lifetime of allocated surfaces, get access to pixel data, and obtain native handles and device abstractions (if suitable). It is recommended to use mfxFrameSurface1::mfxFrameSurfaceInterface, if present, instead of directly accessing mfxFrameSurface1 structure members or calling external allocator callback functions if set.

The following example demonstrates the usage of mfxFrameSurfaceInterface for memory sharing:

// decode a frame and try to access the output in the optimal way
sts = MFXVideoDECODE_DecodeFrameAsync(session, NULL, NULL, &outsurface, &syncp);
if (MFX_ERR_NONE == sts)
{
    outsurface->FrameInterface->GetDeviceHandle(outsurface, &device_handle, &device_type);
    // share the memory directly if the application or component is familiar with
    // mfxHandleType and the memory created by device_handle can be shared
    if (isDeviceTypeCompatible(device_type) && isPossibleForMemorySharing(device_handle)
        && MFX_ERR_NONE == outsurface->FrameInterface->GetNativeHandle(outsurface, &resource, &resource_type)
        && isResourceTypeCompatible(resource_type))
    {
        // use the native memory directly
        ProcessNativeMemory(resource);
    }
    else
    {
        // otherwise map the surface to system memory
        outsurface->FrameInterface->Map(outsurface, MFX_MAP_READ);
        ProcessSystemMemory(outsurface);
        outsurface->FrameInterface->Unmap(outsurface);
    }
    // the surface is reference counted; release it when done
    outsurface->FrameInterface->Release(outsurface);
}

Hardware Device Error Handling

The SDK accelerates decoding, encoding and video processing through a hardware device. The SDK functions may return the following errors or warnings if the hardware device encounters errors:

Status                        | Description
MFX_ERR_DEVICE_FAILED         | Hardware device returned unexpected errors. SDK was unable to restore operation.
MFX_ERR_DEVICE_LOST           | Hardware device was lost due to system lock or shutdown.
MFX_WRN_PARTIAL_ACCELERATION  | The hardware does not fully support the specified configuration. The encoding, decoding, or video processing operation may be partially accelerated.
MFX_WRN_DEVICE_BUSY           | Hardware device is currently busy.

SDK functions Query, QueryIOSurf, and Init return MFX_WRN_PARTIAL_ACCELERATION to indicate that the encoding, decoding or video processing operation can be partially hardware accelerated or not hardware accelerated at all. The application can ignore this warning and proceed with the operation. (Note that SDK functions may return errors or other warnings overwriting MFX_WRN_PARTIAL_ACCELERATION, as it is a lower priority warning.)

SDK functions return MFX_WRN_DEVICE_BUSY to indicate that the hardware device is busy and unable to take commands at this time. Resume the operation by waiting for a few milliseconds and resubmitting the request. Example below shows the decoding pseudo-code. The same procedure applies to encoding and video processing.

SDK functions return MFX_ERR_DEVICE_LOST or MFX_ERR_DEVICE_FAILED to indicate that there is a complete failure in hardware acceleration. The application must close and reinitialize the SDK function class. If the application has provided a hardware acceleration device handle to the SDK, the application must reset the device.

Pseudo-Code to Handle MFX_WRN_DEVICE_BUSY:

mfxStatus sts=MFX_ERR_NONE;
for (;;) {
   // do something
   sts=MFXVideoDECODE_DecodeFrameAsync(session, bitstream,  surface_work, &surface_disp, &syncp);
   if (sts == MFX_WRN_DEVICE_BUSY) Sleep(5);
}

Summary Tables

Mandatory API reference

Functions per API Version

This is the list of functions exported by any implementation, with the corresponding API version.

Function                         | API Version
MFXInit                          | 1.0
MFXClose                         | 1.0
MFXQueryIMPL                     | 1.0
MFXQueryVersion                  | 1.0
MFXJoinSession                   | 1.0
MFXDisjoinSession                | 1.0
MFXCloneSession                  | 1.0
MFXSetPriority                   | 1.0
MFXGetPriority                   | 1.0
MFXVideoCORE_SetFrameAllocator   | 1.0
MFXVideoCORE_SetHandle           | 1.0
MFXVideoCORE_GetHandle           | 1.0
MFXVideoCORE_SyncOperation       | 1.0
MFXVideoENCODE_Query             | 1.0
MFXVideoENCODE_QueryIOSurf       | 1.0
MFXVideoENCODE_Init              | 1.0
MFXVideoENCODE_Reset             | 1.0
MFXVideoENCODE_Close             | 1.0
MFXVideoENCODE_GetVideoParam     | 1.0
MFXVideoENCODE_GetEncodeStat     | 1.0
MFXVideoENCODE_EncodeFrameAsync  | 1.0
MFXVideoDECODE_Query             | 1.0
MFXVideoDECODE_DecodeHeader      | 1.0
MFXVideoDECODE_QueryIOSurf       | 1.0
MFXVideoDECODE_Init              | 1.0
MFXVideoDECODE_Reset             | 1.0
MFXVideoDECODE_Close             | 1.0
MFXVideoDECODE_GetVideoParam     | 1.0
MFXVideoDECODE_GetDecodeStat     | 1.0
MFXVideoDECODE_SetSkipMode       | 1.0
MFXVideoDECODE_GetPayload        | 1.0
MFXVideoDECODE_DecodeFrameAsync  | 1.0
MFXVideoVPP_Query                | 1.0
MFXVideoVPP_QueryIOSurf          | 1.0
MFXVideoVPP_Init                 | 1.0
MFXVideoVPP_Reset                | 1.0
MFXVideoVPP_Close                | 1.0
MFXVideoVPP_GetVideoParam        | 1.0
MFXVideoVPP_GetVPPStat           | 1.0
MFXVideoVPP_RunFrameVPPAsync     | 1.0
MFXVideoVPP_RunFrameVPPAsyncEx   | 1.10
MFXInitEx                        | 1.14
MFXVideoCORE_QueryPlatform       | 1.19
MFXMemory_GetSurfaceForVPP       | 2.0
MFXMemory_GetSurfaceForEncode    | 2.0
MFXMemory_GetSurfaceForDecode    | 2.0
MFXQueryImplDescription          | 2.0
MFXReleaseImplDescription        | 2.0

Appendices

Configuration Parameter Constraints

The mfxFrameInfo structure is used by both the mfxVideoParam structure during SDK class initialization and the mfxFrameSurface1 structure during the actual SDK class function. The following constraints apply:

Constraints common for DECODE, ENCODE and VPP:

Parameters    | During SDK initialization | During SDK operation
FourCC        | Any valid value           | The value must be the same as the initialization value. The only exception is VPP in composition mode, where in some cases it is allowed to mix RGB and NV12 surfaces. See mfxExtVPPComposite for more details.
ChromaFormat  | Any valid value           | The value must be the same as the initialization value.

Constraints for DECODE:

Parameters                    | During SDK initialization | During SDK operation
Width, Height                 | Aligned frame size        | The values must be equal to or larger than the initialization values.
CropX, CropY, CropW, CropH    | Ignored                   | DECODE output. The cropping values are per-frame based.
AspectRatioW, AspectRatioH    | Any valid values or unspecified (zero); if unspecified, values from the input bitstream will be used; see the note below the table. | DECODE output.
FrameRateExtN, FrameRateExtD  | If unspecified, values from the input bitstream will be used; see the note below the table. | DECODE output.
PicStruct                     | Ignored                   | DECODE output.

Note

Priority of initialization parameters: if the application explicitly sets FrameRateExtN/FrameRateExtD or AspectRatioW/AspectRatioH during initialization, then the decoder uses these values during decoding regardless of the values from the bitstream and does not update them on a new SPS. If the application sets them to zero, the decoder uses the values from the stream and updates them on each new SPS.

Constraints for VPP:

Parameters                    | During SDK initialization | During SDK operation
Width, Height                 | Any valid values          | The values must be equal to or larger than the initialization values.
CropX, CropY, CropW, CropH    | Ignored                   | These parameters specify the region of interest from input to output.
AspectRatioW, AspectRatioH    | Ignored                   | Aspect ratio values will be passed through from input to output.
FrameRateExtN, FrameRateExtD  | Any valid values          | Frame rate values will be updated with the initialization value at output.
PicStruct                     | MFX_PICSTRUCT_UNKNOWN, MFX_PICSTRUCT_PROGRESSIVE, MFX_PICSTRUCT_FIELD_TFF, MFX_PICSTRUCT_FIELD_BFF, MFX_PICSTRUCT_FIELD_SINGLE, MFX_PICSTRUCT_FIELD_TOP, MFX_PICSTRUCT_FIELD_BOTTOM | The base value must be the same as the initialization value unless MFX_PICSTRUCT_UNKNOWN is specified during initialization. Other decorative picture structure flags are passed through or added as needed. See the PicStruct enumerator for details.

Constraints for ENCODE:

Parameters                    | During SDK initialization | During SDK operation
Width, Height                 | Encoded frame size        | The values must be equal to or larger than the initialization values.
CropX, CropY, CropW, CropH    | H.264: cropped frame size. MPEG-2: CropW and CropH specify the real width and height (possibly unaligned) of the coded frames; CropX and CropY must be zero. | Ignored
AspectRatioW, AspectRatioH    | Any valid values          | Ignored
FrameRateExtN, FrameRateExtD  | Any valid values          | Ignored
PicStruct                     | MFX_PICSTRUCT_UNKNOWN, MFX_PICSTRUCT_PROGRESSIVE, MFX_PICSTRUCT_FIELD_TFF, MFX_PICSTRUCT_FIELD_BFF | The base value must be the same as the initialization value unless MFX_PICSTRUCT_UNKNOWN is specified during initialization. Add other decorative picture structure flags to indicate additional display attributes. Use MFX_PICSTRUCT_UNKNOWN during initialization for field attributes and MFX_PICSTRUCT_PROGRESSIVE for frame attributes. See the PicStruct enumerator for details.

The following table summarizes how to specify the configuration parameters during initialization and during encoding, decoding and video processing:

Structure (param)    | ENCODE Init | ENCODE Encoding | DECODE Init | DECODE Decoding | VPP Init | VPP Processing
mfxVideoParam        |             |                 |             |                 |          |
  Protected          | R           |                 | R           |                 | R        |
  IOPattern          | M           |                 | M           |                 | M        |
  ExtParam           | O           |                 | O           |                 | O        |
  NumExtParam        | O           |                 | O           |                 | O        |
mfxInfoMFX           |             |                 |             |                 |          |
  CodecId            | M           |                 | M           |                 |          |
  CodecProfile       | O           |                 | O/M*        |                 |          |
  CodecLevel         | O           |                 | O           |                 |          |
  NumThread          | O           |                 | O           |                 |          |
  TargetUsage        | O           |                 |             |                 |          |
  GopPicSize         | O           |                 |             |                 |          |
  GopRefDist         | O           |                 |             |                 |          |
  GopOptFlag         | O           |                 |             |                 |          |
  IdrInterval        | O           |                 |             |                 |          |
  RateControlMethod  | O           |                 |             |                 |          |
  InitialDelayInKB   | O           |                 |             |                 |          |
  BufferSizeInKB     | O           |                 |             |                 |          |
  TargetKbps         | M           |                 |             |                 |          |
  MaxKbps            | O           |                 |             |                 |          |
  NumSlice           | O           |                 |             |                 |          |
  NumRefFrame        | O           |                 |             |                 |          |
  EncodedOrder       | M           |                 |             |                 |          |
mfxFrameInfo         |             |                 |             |                 |          |
  FourCC             | M           | M               | M           | M               | M        | M
  Width              | M           | M               | M           | M               | M        | M
  Height             | M           | M               | M           | M               | M        | M
  CropX              | M           | Ign             | Ign         | U               | Ign      | M
  CropY              | M           | Ign             | Ign         | U               | Ign      | M
  CropW              | M           | Ign             | Ign         | U               | Ign      | M
  CropH              | M           | Ign             | Ign         | U               | Ign      | M
  FrameRateExtN      | M           | Ign             | O           | U               | M        | U
  FrameRateExtD      | M           | Ign             | O           | U               | M        | U
  AspectRatioW       | O           | Ign             | O           | U               | Ign      | PT
  AspectRatioH       | O           | Ign             | O           | U               | Ign      | PT
  PicStruct          | O           | M               | Ign         | U               | M        | M/U
  ChromaFormat       | M           | M               | M           | M               | Ign      | Ign

Table Legend:

Symbol   | Remarks
Ign      | Ignored
PT       | Pass Through
(blank)  | Does Not Apply
M        | Mandated
R        | Reserved
O        | Optional
U        | Updated at output

Note

CodecProfile is mandated for the HEVC REXT and SCC profiles and optional for other cases. If the application does not explicitly set CodecProfile during initialization, the HEVC decoder will use a profile up to Main10.

Multiple-Segment Encoding

Multiple-segment encoding is useful in video editing applications where, during production, the encoder encodes multiple video clips according to their timeline. In general, one can define multiple-segment encoding as dividing an input sequence of frames into segments and encoding them in different encoding sessions with the same or different parameter sets:

(Timeline illustration: segments already encoded, in encoding, and to be encoded, with time marks at 0s, 200s, and 500s.)

Note

Note that different encoders can also be used.

The application must be able to:

  • Extract encoding parameters from the bitstream of previously encoded segment;

  • Import these encoding parameters to configure the encoder.

Encoding can then continue on the current segment using either the same or the similar encoding parameters.

Extracting the header containing the encoding parameter set from the encoded bitstream is usually the task of a format splitter (de-multiplexer). Nevertheless, the SDK MFXVideoDECODE_DecodeHeader() function can export the raw header if the application attaches the mfxExtCodingOptionSPSPPS structure as part of the parameters.
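
A sketch of exporting the raw SPS/PPS this way; sps_buffer and pps_buffer are assumed application-owned arrays and bitstream holds the previously encoded segment:

/* a sketch; sps_buffer/pps_buffer are application arrays, bitstream holds the encoded segment */
mfxExtCodingOptionSPSPPS spspps;
memset(&spspps, 0, sizeof(spspps));
spspps.Header.BufferId = MFX_EXTBUFF_CODING_OPTION_SPSPPS;
spspps.Header.BufferSz = sizeof(spspps);
spspps.SPSBuffer  = sps_buffer;
spspps.SPSBufSize = sizeof(sps_buffer);
spspps.PPSBuffer  = pps_buffer;
spspps.PPSBufSize = sizeof(pps_buffer);

mfxExtBuffer *eb_hdr = (mfxExtBuffer*)&spspps;
mfxVideoParam dec_par;
memset(&dec_par, 0, sizeof(dec_par));
dec_par.mfx.CodecId = MFX_CODEC_AVC;
dec_par.NumExtParam = 1;
dec_par.ExtParam    = &eb_hdr;

MFXVideoDECODE_DecodeHeader(session, &bitstream, &dec_par);
/* on return, SPSBufSize and PPSBufSize hold the actual header sizes */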

The encoder can use the mfxExtCodingOptionSPSPPS structure to import the encoding parameters during MFXVideoENCODE_Init(). The encoding parameters are in the encoded bitstream format. Upon a successful import of the header parameters, the encoder will generate bitstreams with a compatible (not necessarily bit-exact) header. Table below shows all functions that can import a header and their error codes if there are unsupported parameters in the header or the encoder is unable to achieve compatibility with the imported header.

Function Name                 | Error Code if Import Fails
MFXVideoENCODE_Init()         | MFX_ERR_INCOMPATIBLE_VIDEO_PARAM
MFXVideoENCODE_QueryIOSurf()  | MFX_ERR_INCOMPATIBLE_VIDEO_PARAM
MFXVideoENCODE_Reset()        | MFX_ERR_INCOMPATIBLE_VIDEO_PARAM
MFXVideoENCODE_Query()        | MFX_ERR_UNSUPPORTED

The encoder must encode frames to a GOP sequence starting with an IDR frame for H.264 (or I frame for MPEG-2) to ensure that the current segment encoding does not refer to any frames in the previous segment. This ensures that the encoded segment is self-contained, allowing the application to insert it anywhere in the final bitstream. After encoding, each encoded segment is HRD compliant. However, the concatenated segments may not be HRD compliant.

The example below shows the encoder initialization procedure that imports H.264 sequence and picture parameter sets:

mfxStatus init_encoder() {
   mfxExtCodingOptionSPSPPS option;
   mfxExtBuffer *option_array;

   /* configure mfxExtCodingOptionSPSPPS */
   memset(&option,0,sizeof(option));
   option.Header.BufferId=MFX_EXTBUFF_CODING_OPTION_SPSPPS;
   option.Header.BufferSz=sizeof(option);
   option.SPSBuffer=sps_buffer;
   option.SPSBufSize=sps_buffer_length;
   option.PPSBuffer=pps_buffer;
   option.PPSBufSize=pps_buffer_length;

   /* configure mfxVideoParam */
   mfxVideoParam param;
   //...
   param.NumExtParam=1;
   option_array=(mfxExtBuffer*)&option;
   param.ExtParam=&option_array;

   /* encoder initialization */
   mfxStatus status;
   status=MFXVideoENCODE_Init(session, &param);
   if (status==MFX_ERR_INCOMPATIBLE_VIDEO_PARAM) {
      printf("Initialization failed\n");
   } else {
      printf("Initialized\n");
   }
   return status;
}

Streaming and Video Conferencing Features

The following sections address a few aspects of additional requirements that streaming or video conferencing applications may impose on the encoding or transcoding process. See also the Configuration Change chapter.

Dynamic Bitrate Change

The SDK encoder supports dynamic bitrate change differently depending on the bitrate control mode and the HRD conformance requirement. If HRD conformance is required, i.e. if the application sets the NalHrdConformance option in the mfxExtCodingOption structure to ON, the only allowed bitrate control mode is VBR. In this mode, the application can change the TargetKbps and MaxKbps values by calling the MFXVideoENCODE_Reset() function. Such a change in bitrate usually results in the generation of a new key-frame and sequence header. There are some exceptions though. For example, if HRD information is absent from the stream, then a change of TargetKbps does not require a change of the sequence header and as a result the SDK encoder does not insert a key frame.

If HRD conformance is not required, i.e. if the application turns off the NalHrdConformance option in the mfxExtCodingOption structure, all bitrate control modes are available. In CBR and AVBR modes the application can change TargetKbps; in VBR mode the application can change the TargetKbps and MaxKbps values. Such a change in bitrate will not result in the generation of a new key-frame or sequence header.

The SDK encoder may change some of the initialization parameters provided by the application during initialization. That in turn may lead to incompatibility between the parameters provided by the application during reset and the working set of parameters used by the SDK encoder. That is why it is strongly recommended to retrieve the actual working parameters with the MFXVideoENCODE_GetVideoParam() function before making any changes to the bitrate settings.
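
A sketch of such a bitrate change in VBR mode; the new values are purely illustrative:

/* a sketch; the new bitrate values are illustrative */
mfxVideoParam cur_par;
memset(&cur_par, 0, sizeof(cur_par));
MFXVideoENCODE_GetVideoParam(session, &cur_par);   /* retrieve the actual working parameters */

cur_par.mfx.TargetKbps = 2000;
cur_par.mfx.MaxKbps    = 2500;   /* MaxKbps is applicable in VBR mode */

MFXVideoENCODE_Reset(session, &cur_par);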

In all modes, the SDK encoders will respond to the bitrate changes as quickly as the underlying algorithm allows, without breaking other encoding restrictions, such as HRD compliance if it is enabled. How soon the actual bitrate can catch up with the specified bitrate is implementation dependent.

Alternatively, the application may use the CQP (constant quantization parameter) encoding mode to perform customized bitrate adjustment on a per-frame basis. The application may use either the encoded or the display order mode with per-frame CQP.

Dynamic Resolution Change

The SDK encoder supports dynamic resolution change in all bitrate control modes. The application may change resolution by calling MFXVideoENCODE_Reset() function. The application may decrease or increase resolution up to the size specified during encoder initialization.

A resolution change always results in the insertion of a key IDR frame and a new sequence parameter set header. The only exception is the SDK VP9 encoder (see the Dynamic reference frame scaling section). The SDK encoder does not guarantee HRD conformance across the resolution change point.

The SDK encoder may change some of the initialization parameters provided by the application during initialization. That in turn may lead to incompatibility between the parameters provided by the application during reset and the working set of parameters used by the SDK encoder. That is why it is strongly recommended to retrieve the actual working parameter set with the MFXVideoENCODE_GetVideoParam() function before making any resolution change.

Dynamic reference frame scaling

The VP9 standard allows resolution change without insertion of a key-frame. This is possible because of the native built-in capability of the VP9 decoder to upscale and downscale reference frames to match the resolution of the frame being encoded. By default the SDK VP9 encoder does not insert a key-frame when the application performs a dynamic resolution change; instead, the first frame with the new resolution is encoded using inter prediction from a scaled reference frame of the previous resolution. Dynamic scaling has the following limitation coming from the VP9 specification: the resolution of any active reference frame cannot exceed 2x the resolution of the current frame and cannot be smaller than 1/16 of the current frame resolution. In the case of dynamic scaling, the SDK VP9 encoder always uses a single active reference frame for the first frame after the resolution change. So the SDK VP9 encoder has the following limitation for dynamic resolution change: the new resolution should not exceed 16x and should not be smaller than 1/2 of the current resolution.

The application may force insertion of a key-frame at the point of resolution change by invoking encoder reset with mfxExtEncoderResetOption::StartNewSequence set to MFX_CODINGOPTION_ON. If a key-frame is inserted, the above limitations on the new resolution do not apply.
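
A sketch of forcing a key-frame at the resolution change point; new_width and new_height are assumed to be the already aligned new dimensions:

/* a sketch; new_width/new_height are assumed to be the aligned new dimensions */
mfxExtEncoderResetOption reset_opt;
memset(&reset_opt, 0, sizeof(reset_opt));
reset_opt.Header.BufferId  = MFX_EXTBUFF_ENCODER_RESET_OPTION;
reset_opt.Header.BufferSz  = sizeof(mfxExtEncoderResetOption);
reset_opt.StartNewSequence = MFX_CODINGOPTION_ON;   /* force an IDR and a new sequence header */

mfxExtBuffer *eb_reset = (mfxExtBuffer*)&reset_opt;

mfxVideoParam reset_par;
memset(&reset_par, 0, sizeof(reset_par));
MFXVideoENCODE_GetVideoParam(session, &reset_par);
reset_par.mfx.FrameInfo.Width  = new_width;
reset_par.mfx.FrameInfo.Height = new_height;
reset_par.mfx.FrameInfo.CropW  = new_width;
reset_par.mfx.FrameInfo.CropH  = new_height;
reset_par.NumExtParam = 1;
reset_par.ExtParam    = &eb_reset;

MFXVideoENCODE_Reset(session, &reset_par);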

Note that resolution change with dynamic reference scaling is compatible with multiple reference frames (mfxVideoParam::NumRefFrame > 1). In a multiref configuration the SDK VP9 encoder uses multiple references within stream pieces of the same resolution, and uses a single reference at the point of resolution change.

Forced Key Frame Generation

The SDK supports forced key frame generation during encoding. The application can set the FrameType parameter of the mfxEncodeCtrl structure to control how the current frame is encoded, as follows:

  • If the SDK encoder works in the display order, the application can enforce any current frame to be a key frame. The application cannot change the frame type of already buffered frames inside the SDK encoder.

  • If the SDK encoder works in the encoded order, the application must specify the frame type for every frame exactly; thus the application can enforce the current frame to have any frame type that the particular coding standard allows.
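
For example, to force the current frame to be encoded as an IDR key frame in display order, the application might fill mfxEncodeCtrl as sketched below; session, surface, bitstream, and syncp are assumed to be declared elsewhere, and error handling is omitted:

mfxEncodeCtrl ctrl;
memset(&ctrl, 0, sizeof(ctrl));
ctrl.FrameType = MFX_FRAMETYPE_I | MFX_FRAMETYPE_IDR | MFX_FRAMETYPE_REF;

/* the current frame will be encoded as an IDR key frame */
MFXVideoENCODE_EncodeFrameAsync(session, &ctrl, surface, &bitstream, &syncp);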

Reference List Selection

During streaming or video conferencing, if the application can obtain feedback about how well the client receives certain frames, the application may need to adjust the encoding process to use or avoid using certain frames as references. The following paragraphs describe how to fine-tune the encoding process based on such feedback.

The application can specify the reference window size through the mfxInfoMFX::NumRefFrame parameter during encoding initialization. Certain platforms may limit how big the reference window can be. Use the MFXVideoENCODE_GetVideoParam() function to retrieve the current working set of parameters.

During encoding, the application can specify the actual reference list lengths by attaching the mfxExtAVCRefListCtrl structure to the MFXVideoENCODE_EncodeFrameAsync() function. The mfxExtAVCRefListCtrl::NumRefIdxL0Active member specifies the length of reference list L0 and the mfxExtAVCRefListCtrl::NumRefIdxL1Active member specifies the length of reference list L1. These two numbers must be less than or equal to the mfxInfoMFX::NumRefFrame parameter specified during encoding initialization.

The application can instruct the SDK encoder to use or not use certain reference frames. To do this, there is a prerequisite that the application must uniquely identify each input frame by setting the mfxFrameData::FrameOrder parameter. The application then specifies the preferred reference frame list mfxExtAVCRefListCtrl::PreferredRefList and/or the rejected frame list mfxExtAVCRefListCtrl::RejectedRefList, and attaches the mfxExtAVCRefListCtrl structure to the MFXVideoENCODE_EncodeFrameAsync() function. The two lists fine-tune how the SDK encoder chooses the reference frames of the current frame. The SDK encoder does not retain PreferredRefList, so the application has to send it with each frame if necessary. There are a few limitations:

  • The frames in the lists are ignored if they are out of the reference window.

  • If by going through the lists, the SDK encoder cannot find a reference frame for the current frame, the SDK encoder will encode the current frame without using any reference frames.

  • If the GOP pattern contains B-frames, the SDK encoder may not be able to follow the mfxExtAVCRefListCtrl instructions.
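
A minimal sketch of rejecting a reference frame based on client feedback is shown below. It assumes that every input frame was tagged with a unique mfxFrameData::FrameOrder value, that session, surface, bitstream, and syncp are declared elsewhere, and that lost_frame_order is a hypothetical frame order reported as lost by the receiver:

mfxExtAVCRefListCtrl ref_ctrl;
mfxExtBuffer *ext_buf = (mfxExtBuffer*)&ref_ctrl;
mfxEncodeCtrl ctrl;
int i;

memset(&ref_ctrl, 0, sizeof(ref_ctrl));
ref_ctrl.Header.BufferId = MFX_EXTBUFF_AVC_REFLIST_CTRL;
ref_ctrl.Header.BufferSz = sizeof(ref_ctrl);

/* mark unused entries of all three lists as unknown */
for (i = 0; i < 32; i++)
    ref_ctrl.PreferredRefList[i].FrameOrder = MFX_FRAMEORDER_UNKNOWN;
for (i = 0; i < 16; i++) {
    ref_ctrl.RejectedRefList[i].FrameOrder = MFX_FRAMEORDER_UNKNOWN;
    ref_ctrl.LongTermRefList[i].FrameOrder = MFX_FRAMEORDER_UNKNOWN;
}

/* do not use the frame reported as lost by the client */
ref_ctrl.RejectedRefList[0].FrameOrder = lost_frame_order;

memset(&ctrl, 0, sizeof(ctrl));
ctrl.NumExtParam = 1;
ctrl.ExtParam    = &ext_buf;

MFXVideoENCODE_EncodeFrameAsync(session, &ctrl, surface, &bitstream, &syncp);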

Low Latency Encoding and Decoding

The application can set mfxVideoParam::AsyncDepth = 1 to disable any decoder buffering of output frames; such buffering is otherwise aimed at improving transcoding throughput. With mfxVideoParam::AsyncDepth = 1, the application must synchronize after the decoding or transcoding operation of each frame.

The application can adjust mfxExtCodingOption::MaxDecFrameBuffering during encoding initialization to improve decoding latency. It is recommended to set this value equal to the number of reference frames.
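
The sketch below shows both latency-related settings in an initialization fragment; only the latency-related fields are shown, and the NumRefFrame value is illustrative:

/* decoder or transcoder initialization: no buffering of output frames */
mfxVideoParam dec_par;
memset(&dec_par, 0, sizeof(dec_par));
dec_par.AsyncDepth = 1;

/* encoder initialization: limit decoded picture buffering for the produced stream */
mfxVideoParam enc_par;
mfxExtCodingOption co;
mfxExtBuffer *ext_buf = (mfxExtBuffer*)&co;

memset(&enc_par, 0, sizeof(enc_par));
memset(&co, 0, sizeof(co));
enc_par.AsyncDepth      = 1;
enc_par.mfx.NumRefFrame = 1;                        /* illustrative */
co.Header.BufferId      = MFX_EXTBUFF_CODING_OPTION;
co.Header.BufferSz      = sizeof(co);
co.MaxDecFrameBuffering = enc_par.mfx.NumRefFrame;  /* recommended: equal to the number of reference frames */
enc_par.NumExtParam     = 1;
enc_par.ExtParam        = &ext_buf;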

Reference Picture Marking Repetition SEI message

The application can request writing the reference picture marking repetition SEI message during encoding initialization by setting the mfxExtCodingOption::RefPicMarkRep flag of the mfxExtCodingOption structure. The reference picture marking repetition SEI message repeats certain reference frame information in the output bitstream for robust streaming.
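
A minimal sketch of requesting the repetition SEI at encoder initialization; the mfxExtCodingOption buffer is assumed to be attached to mfxVideoParam::ExtParam before MFXVideoENCODE_Init():

mfxExtCodingOption co;
memset(&co, 0, sizeof(co));
co.Header.BufferId = MFX_EXTBUFF_CODING_OPTION;
co.Header.BufferSz = sizeof(co);
co.RefPicMarkRep   = MFX_CODINGOPTION_ON;   /* write the repetition SEI message into the bitstream */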

The SDK decoder will respond to the reference picture marking repetition SEI message if such message exists in the bitstream, and check with the reference list information specified in the sequence/picture headers. The decoder will report any mismatch of the SEI message with the reference list information in the mfxFrameData::Corrupted field.

Long-term Reference frame

The application may use long-term reference frames to improve coding efficiency or robustness for video conferencing applications. The application controls the long-term frame marking process by attaching the mfxExtAVCRefListCtrl extended buffer during encoding. The SDK encoder never marks a frame as long-term on its own.

There are two control lists in the mfxExtAVCRefListCtrl extended buffer. The mfxExtAVCRefListCtrl::LongTermRefList list contains the frame orders (the mfxFrameData::FrameOrder value in the mfxFrameData structure) of the frames that should be marked as long-term frames. The mfxExtAVCRefListCtrl::RejectedRefList list contains the frame orders of the frames that should be unmarked as long-term frames. The application can only mark/unmark frames that are buffered inside the encoder. Because of this, it is recommended that the application mark a frame when it is submitted for encoding. The application can either explicitly unmark a long-term reference frame or wait for an IDR frame, at which point all long-term reference frames are unmarked.

The SDK encoder puts all long-term reference frames at the end of a reference frame list. If the number of active reference frames (the mfxExtAVCRefListCtrl::NumRefIdxL0Active and mfxExtAVCRefListCtrl::NumRefIdxL1Active values in the mfxExtAVCRefListCtrl extended buffer) is smaller than the total reference frame number (the mfxInfoMFX::NumRefFrame value in the mfxInfoMFX structure during encoding initialization), the SDK encoder may ignore some or all long-term reference frames. The application may avoid this by providing a list of preferred reference frames in the mfxExtAVCRefListCtrl::PreferredRefList list of the mfxExtAVCRefListCtrl extended buffer. In this case, the SDK encoder reorders the reference list based on the specified list.
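
A minimal sketch of marking the frame currently being submitted as a long-term reference; session, surface, bitstream, and syncp are assumed to be declared elsewhere, and unused entries of the three control lists should be set to MFX_FRAMEORDER_UNKNOWN as in the earlier reference list example:

mfxExtAVCRefListCtrl ref_ctrl;
mfxExtBuffer *ext_buf = (mfxExtBuffer*)&ref_ctrl;
mfxEncodeCtrl ctrl;

memset(&ref_ctrl, 0, sizeof(ref_ctrl));
ref_ctrl.Header.BufferId = MFX_EXTBUFF_AVC_REFLIST_CTRL;
ref_ctrl.Header.BufferSz = sizeof(ref_ctrl);

/* mark the submitted frame (identified by its FrameOrder) as long-term */
ref_ctrl.LongTermRefList[0].FrameOrder = surface->Data.FrameOrder;

memset(&ctrl, 0, sizeof(ctrl));
ctrl.NumExtParam = 1;
ctrl.ExtParam    = &ext_buf;

MFXVideoENCODE_EncodeFrameAsync(session, &ctrl, surface, &bitstream, &syncp);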

Temporal scalability

The application may specify the temporal hierarchy of frames by using the mfxExtAvcTemporalLayers extended buffer during the encoder initialization, in the display-order encoding mode. The SDK inserts the prefix NAL unit before each slice with a unique temporal and priority ID. The temporal ID starts from zero and the priority ID starts from the mfxExtAvcTemporalLayers::BaseLayerPID value. The SDK increases the temporal ID and priority ID value by one for each consecutive layer.

If the application needs to specify a unique sequence or picture parameter set ID, the application must use the mfxExtCodingOptionSPSPPS extended buffer, with all pointers and sizes set to zero and valid mfxExtCodingOptionSPSPPS::SPSId/mfxExtCodingOptionSPSPPS::PPSId fields. The same SPS and PPS ID will be used for all temporal layers.

Each temporal layer is a set of frames with the same temporal ID. Each layer is defined by the mfxExtAvcTemporalLayers::Scale value. The scale of layer N is the ratio between the frame rate of the subsequence consisting of temporal layers with temporal ID lower than or equal to N and the frame rate of the base temporal layer. The application may skip some of the temporal layers by specifying the mfxExtAvcTemporalLayers::Scale value as zero. The application should use an integer ratio of the frame rates for two consecutive temporal layers.

For example, a 30 frames per second video sequence is typically separated into three temporal layers that can be decoded as 7.5 fps (base layer), 15 fps (base and first temporal layers), and 30 fps (all three layers). mfxExtAvcTemporalLayers::Scale for this case should have the values {1,2,4,0,0,0,0,0}.
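
The following initialization fragment sketches this three-layer configuration; the extended buffer is assumed to be attached to mfxVideoParam::ExtParam before MFXVideoENCODE_Init():

mfxExtAvcTemporalLayers tl;
memset(&tl, 0, sizeof(tl));
tl.Header.BufferId = MFX_EXTBUFF_AVC_TEMPORAL_LAYERS;
tl.Header.BufferSz = sizeof(tl);

tl.BaseLayerPID   = 0;   /* priority ID of the base layer */
tl.Layer[0].Scale = 1;   /* base layer: 7.5 fps */
tl.Layer[1].Scale = 2;   /* base + first layer: 15 fps */
tl.Layer[2].Scale = 4;   /* all three layers: 30 fps */
/* Layer[3..7].Scale remain 0: those layers are not used */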

Switchable Graphics and Multiple Monitors

The following sections address a few aspects of supporting switchable graphics and multiple monitors configurations.

Switchable Graphics

Switchable Graphics refers to a machine configuration in which multiple graphics devices are available (an integrated device for power saving and discrete devices for performance). Usually, at any given time, one of the graphics devices drives the display and becomes the active device, while the others are inactive. There are different variations of software or hardware mechanisms to switch between the graphics devices. In one of the switchable graphics variations, it is possible to register an application in an affinity list for a certain graphics device so that launching the application automatically triggers a switch. The actual techniques to enable such a switch are outside the scope of this document. This document discusses the implications of switchable graphics for the SDK and SDK applications.

As the SDK performs hardware acceleration through the Intel graphics device, it is critical that the SDK can access the Intel graphics device in the switchable graphics setting. If possible, it is recommended to add the application to the Intel graphics device affinity list. Otherwise, the application must handle the following cases:

  • By the SDK design, during the SDK library initialization, the function MFXInit() searches for Intel graphics devices. If an SDK implementation is successfully loaded, the function MFXInit() returns MFX_ERR_NONE and the MFXQueryIMPL() function returns the actual implementation type. If no SDK implementation is loaded, the function MFXInit() returns MFX_ERR_UNSUPPORTED. In the switchable graphics environment, if the application is not in the Intel graphics device affinity list, it is possible that the Intel graphics device is not accessible during the SDK library initialization. The fact that the MFXInit() function returns MFX_ERR_UNSUPPORTED does not mean that hardware acceleration is permanently unavailable. The user may switch the graphics later, at which point the Intel graphics device becomes accessible. It is recommended that the application initialize the SDK library right before the actual decoding, video processing, and encoding operations to determine the hardware acceleration capability.

  • During decoding, video processing, and encoding operations, if the application is not in the Intel graphics device affinity list, the previously accessible Intel graphics device may become inaccessible due to a switch event. The SDK functions will return MFX_ERR_DEVICE_LOST or MFX_ERR_DEVICE_FAILED, depending on when the switch occurs and at what stage the SDK functions operate. The application needs to handle these errors and exit gracefully.

Multiple Monitors

Multiple monitors refers to a machine configuration in which multiple graphics devices are available. Graphics devices connected to a display are active and accessible under the Microsoft* DirectX* infrastructure; graphics devices not connected to a display are inactive. Specifically, under the Microsoft Direct3D9* infrastructure, inactive devices are not accessible.

The SDK uses the adapter number to access a specific graphics device. Usually, the graphics device that drives the main desktop becomes the primary adapter. Other graphics devices take subsequent adapter numbers after the primary adapter. Under the Microsoft Direct3D9 infrastructure, only active adapters are accessible and thus have an adapter number.

The SDK extends the implementation type mfxIMPL as follows:

MFX_IMPL_HARDWARE: The SDK should initialize on the primary adapter.

MFX_IMPL_HARDWARE2: The SDK should initialize on the 2nd graphics adapter.

MFX_IMPL_HARDWARE3: The SDK should initialize on the 3rd graphics adapter.

MFX_IMPL_HARDWARE4: The SDK should initialize on the 4th graphics adapter.

The application can use the above definitions to instruct the SDK library to initialize on a specific graphics device. The application can also use the following definitions for automatic detection:

MFX_IMPL_HARDWARE_ANY: The SDK should initialize on any graphics adapter.

MFX_IMPL_AUTO_ANY: The SDK should initialize on any graphics adapter. If not successful, load the software implementation.
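
For illustration, a minimal sketch of initializing on a specific adapter with a fallback might look like the following; the requested API version and the fallback policy are assumptions of this example:

mfxSession session;
mfxVersion ver = {{0, 1}};   /* request API version 1.0 or later */
mfxStatus sts;

/* try the 2nd graphics adapter first */
sts = MFXInit(MFX_IMPL_HARDWARE2, &ver, &session);
if (sts != MFX_ERR_NONE) {
    /* fall back to any available hardware adapter */
    sts = MFXInit(MFX_IMPL_HARDWARE_ANY, &ver, &session);
}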

If the application uses Microsoft* DirectX* surfaces for I/O, it is critical that the application and the SDK work on the same graphics device. It is recommended that the application use the following procedure:

Finally, similar to the switchable graphics cases, it is possible that the user disconnects monitors from the graphics devices or remaps the primary adapter, thus causing an interruption. If the interruption occurs during the SDK library initialization, the MFXInit() function may return MFX_ERR_UNSUPPORTED. This means hardware acceleration is currently not available. It is recommended that the application initialize the SDK library right before the actual decoding, video processing, and encoding operations to determine the hardware acceleration capability.

If the interruption occurs during decoding, video processing, or encoding operations, the SDK functions will return MFX_ERR_DEVICE_LOST or MFX_ERR_DEVICE_FAILED. The application needs to handle these errors and exit gracefully.

Working directly with VA API for Linux*

The SDK takes care of all memory and synchronization related operations in VA API. However, in some cases the application may need to extend SDK functionality by working directly with VA API for Linux*, for example, to implement a customized external allocator. This chapter describes some basic memory management and synchronization techniques.

To create a VA surface pool, the application should call vaCreateSurfaces:

VASurfaceAttrib attrib;
attrib.type = VASurfaceAttribPixelFormat;
attrib.value.type = VAGenericValueTypeInteger;
attrib.value.value.i = VA_FOURCC_NV12;
attrib.flags = VA_SURFACE_ATTRIB_SETTABLE;

#define NUM_SURFACES 5
VASurfaceID surfaces[NUM_SURFACES];

vaCreateSurfaces(va_display, VA_RT_FORMAT_YUV420, width, height, surfaces, NUM_SURFACES, &attrib, 1);

To destroy the surface pool, the application should call vaDestroySurfaces:

vaDestroySurfaces(va_display, surfaces, NUM_SURFACES);

If the application works with hardware acceleration through the SDK, then it can access surface data immediately after successful completion of the MFXVideoCORE_SyncOperation() call. If the application works with hardware acceleration directly, then it has to check the surface status before accessing data in video memory. This check can be done asynchronously by calling the vaQuerySurfaceStatus function or synchronously by the vaSyncSurface function.
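
A minimal sketch of both checks, assuming va_display and surface_id are already available:

VASurfaceStatus status;

/* asynchronous check: query the surface status without blocking */
vaQuerySurfaceStatus(va_display, surface_id, &status);
if (status == VASurfaceReady) {
    /* surface data can be accessed */
}

/* synchronous check: block until the surface is ready */
vaSyncSurface(va_display, surface_id);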

After successful synchronization the application can access the surface data. This is performed in two steps: first, a VAImage is created from the surface; second, the image buffer is mapped to system memory. After mapping, the VAImage.offsets[3] array holds the offsets to each color plane in the mapped buffer and the VAImage.pitches[3] array holds the color plane pitches, in bytes. For packed data formats, only the first entries in these arrays are valid. The example below shows how to access data in an NV12 surface:

VAImage image;
unsigned char *buffer, *Y, *U, *V;

vaDeriveImage(va_display, surface_id, &image);
vaMapBuffer(va_display, image.buf, &buffer);

/* NV12 */
Y = buffer + image.offsets[0];
U = buffer + image.offsets[1];
V = U + 1;

After processing the data in the VA surface, the application should release the resources allocated for the mapped buffer and the VAImage object:

vaUnmapBuffer(va_display, image.buf);
vaDestroyImage(va_display, image.image_id);

In some cases, for example to retrieve an encoded bitstream from video memory, the application has to use a VABuffer to store data. The example below shows how to create, use, and then destroy a VA buffer. Note that the vaMapBuffer function returns pointers to different objects depending on the mapped buffer type: a plain data buffer for VAImage and a VACodedBufferSegment structure for an encoded bitstream. The application cannot use a VABuffer for synchronization; in the case of encoding, it is recommended to synchronize on the input VA surface as described above.

/* create buffer */
VABufferID buf_id;
vaCreateBuffer(va_display, va_context, VAEncCodedBufferType, buf_size, 1, NULL, &buf_id);

/* encode frame */
// ...

/* map buffer */
VACodedBufferSegment *coded_buffer_segment;
unsigned int size, offset;
void *buf;

vaMapBuffer(va_display, buf_id, (void **)(&coded_buffer_segment));

size   = coded_buffer_segment->size;
offset = coded_buffer_segment->bit_offset;
buf    = coded_buffer_segment->buf;

/* retrieve encoded data*/
// ...

/* unmap and destroy buffer */
vaUnmapBuffer(va_display, buf_id);
vaDestroyBuffer(va_display, buf_id);

CQP HRD mode encoding

The application can configure the AVC encoder to work in CQP rate control mode with HRD model parameters. The SDK will place HRD information into the SPS/VUI and choose an appropriate profile/level. It is the application's responsibility to provide per-frame QP, track HRD conformance, and insert the required SEI messages into the bitstream.

The example below shows how to enable CQP HRD mode. The application should set RateControlMethod to CQP, mfxExtCodingOption::VuiNalHrdParameters to ON, mfxExtCodingOption::NalHrdConformance to OFF, and set the rate control parameters similarly to the CBR or VBR modes (instead of QPI, QPP and QPB). The SDK will choose CBR or VBR HRD mode based on the MaxKbps parameter: if MaxKbps is set to zero, the SDK will use the CBR HRD model (write cbr_flag = 1 to VUI); otherwise the VBR model will be used (and cbr_flag = 0 is written to VUI).

mfxExtCodingOption option;
mfxExtBuffer *option_array;

/* configure mfxExtCodingOption */
memset(&option,0,sizeof(option));
option.Header.BufferId         = MFX_EXTBUFF_CODING_OPTION;
option.Header.BufferSz         = sizeof(option);
option.VuiNalHrdParameters     = MFX_CODINGOPTION_ON;
option.NalHrdConformance       = MFX_CODINGOPTION_OFF;

/* configure mfxVideoParam */
mfxVideoParam param;

// ...

param.mfx.RateControlMethod         = MFX_RATECONTROL_CQP;
param.mfx.FrameInfo.FrameRateExtN   = <valid_non_zero_value>;
param.mfx.FrameInfo.FrameRateExtD   = <valid_non_zero_value>;
param.mfx.BufferSizeInKB            = <valid_non_zero_value>;
param.mfx.InitialDelayInKB          = <valid_non_zero_value>;
param.mfx.TargetKbps                = <valid_non_zero_value>;

if (<write cbr_flag = 1>)
   param.mfx.MaxKbps = 0;
else /* <write cbr_flag = 0> */
   param.mfx.MaxKbps = <valid_non_zero_value>;

param.NumExtParam = 1;
option_array      = (mfxExtBuffer*)&option;
param.ExtParam    = &option_array;

/* encoder initialization */
mfxStatus sts;
sts = MFXVideoENCODE_Init(session, &param);

// ...

/* encoding */
mfxEncodeCtrl ctrl;
memset(&ctrl,0,sizeof(ctrl));
ctrl.QP = <frame_qp>;

sts=MFXVideoENCODE_EncodeFrameAsync(session,&ctrl,surface2,bits,&syncp);

oneVPL API Reference

Basic Types

typedef unsigned char mfxU8

Unsigned integer, 8 bit type

typedef char mfxI8

Signed integer, 8 bit type

typedef unsigned short mfxU16

Unsigned integer, 16 bit type

typedef short mfxI16

Signed integer, 16 bit type

typedef unsigned int mfxU32

Unsigned integer, 32 bit type

typedef int mfxI32

Signed integer, 32 bit type

typedef unsigned int mfxUL32

Unsigned integer, 32 bit type

typedef int mfxL32

Signed integer, 32 bit type

typedef __UINT64 mfxU64

Unsigned integer, 64 bit type

typedef __INT64 mfxI64

Signed integer, 64 bit type

typedef float mfxF32

Single-precision floating point, 32 bit type

typedef double mfxF64

Double-precision floating point, 64 bit type

typedef void *mfxHDL

Handle type

typedef mfxHDL mfxMemId

Memory ID type

typedef void *mfxThreadTask

Thread task type

typedef char mfxChar

ASCII character, 8 bit type

Typedefs

typedef struct _mfxSession *mfxSession

SDK session handle

typedef struct _mfxSyncPoint *mfxSyncPoint

Synchronization point object handle

typedef struct _mfxLoader *mfxLoader

SDK loader handle

typedef struct _mfxConfig *mfxConfig

SDK config handle

oneVPL Dispatcher API

Defines

MFX_IMPL_NAME

Maximum allowed length of the implementation name.

MFX_STRFIELD_LEN

Maximum allowed length of string fields in the implementation description (for example, License and Keywords).

Structures

mfxVariant

enum mfxVariantType

The mfxVariantType enumerator itemizes data types for the mfxVariant type.

Values:

enumerator MFX_VARIANT_TYPE_UNSET = 0

Undefined type.

enumerator MFX_VARIANT_TYPE_U8 = 1

8-bit unsigned integer.

enumerator MFX_VARIANT_TYPE_I8

8-bit signed integer.

enumerator MFX_VARIANT_TYPE_U16

16-bit unsigned integer.

enumerator MFX_VARIANT_TYPE_I16

16-bit signed integer.

enumerator MFX_VARIANT_TYPE_U32

32-bit unsigned integer.

enumerator MFX_VARIANT_TYPE_I32

32-bit signed integer.

enumerator MFX_VARIANT_TYPE_U64

64-bit unsigned integer.

enumerator MFX_VARIANT_TYPE_I64

64-bit signed integer.

enumerator MFX_VARIANT_TYPE_F32

32-bit single precision floating point.

enumerator MFX_VARIANT_TYPE_F64

64-bit double precision floating point.

enumerator MFX_VARIANT_TYPE_PTR

Generic type pointer.

struct mfxVariant

The mfxVariant structure holds a value whose data type is given by the mfxVariantType enumerator.

Public Members

mfxStructVersion Version

Version of the structure.

mfxVariantType Type

Value type.

union mfxVariant::data Data

Value data member.

union data

Value data holder.

Public Members

mfxU8 U8

mfxU8 data.

mfxI8 I8

mfxI8 data.

mfxU16 U16

mfxU16 data.

mfxI16 I16

mfxI16 data.

mfxU32 U32

mfxU32 data.

mfxI32 I32

mfxI32 data.

mfxU64 U64

mfxU64 data.

mfxI64 I64

mfxI64 data.

mfxF32 F32

mfxF32 data.

mfxF64 F64

mfxF64 data.

mfxHDL Ptr

Pointer.

mfxDecoderDescription

struct mfxDecoderDescription

This structure represents decoders description.

Public Members

mfxStructVersion Version

Version of the structure.

mfxU16 reserved[7]

reserved for future use.

mfxU16 NumCodecs

Number of supported decoders.

struct mfxDecoderDescription::decoder *Codecs

Pointer to the array of decoders.

struct decoder

This structure represents decoder description.

Public Members

mfxU32 CodecID

decoder ID in fourCC format.

mfxU16 reserved[8]

reserved for future use.

mfxU16 MaxcodecLevel

Maximum supported codec’s level.

mfxU16 NumProfiles

Number of supported profiles.

struct mfxDecoderDescription::decoder::decprofile *Profiles

Pointer to the array of profiles supported by the codec.

struct decprofile

This structure represents codec’s profile description.

Public Members

mfxU32 Profile

Profile ID in fourCC format.

mfxU16 reserved[7]

reserved for future use.

mfxU16 NumMemTypes

Number of supported memory types.

struct mfxDecoderDescription::decoder::decprofile::decmemdesc *MemDesc

Pointer to the array of memory types.

struct decmemdesc

This structure represents underlying details of the memory type.

Public Members

mfxResourceType MemHandleType

Memory handle type.

mfxRange32U Width

Range of supported image widths.

mfxRange32U Height

Range of supported image heights.

mfxU16 reserved[7]

reserved for future use.

mfxU16 NumColorFormats

Number of supported output color formats.

mfxU32 *ColorFormats

Pointer to the array of supported output color formats (in fourCC).

mfxEncoderDescription

struct mfxEncoderDescription

This structure represents encoder description.

Public Members

mfxStructVersion Version

Version of the structure.

mfxU16 reserved[7]

reserved for future use.

mfxU16 NumCodecs

Number of supported encoders.

struct mfxEncoderDescription::encoder *Codecs

Pointer to the array of encoders.

struct encoder

This structure represents encoder description.

Public Members

mfxU32 CodecID

Encoder ID in fourCC format.

mfxU16 MaxcodecLevel

Maximum supported codec’s level.

mfxU16 BiDirectionalPrediction

Indicates B-frames support.

mfxU16 reserved[7]

reserved for future use.

mfxU16 NumProfiles

Number of supported profiles.

struct mfxEncoderDescription::encoder::encprofile *Profiles

Pointer to the array of profiles supported by the codec.

struct encprofile

This structure represents codec’s profile description.

Public Members

mfxU32 Profile

Profile ID in fourCC format.

mfxU16 reserved[7]

reserved for future use.

mfxU16 NumMemTypes

Number of supported memory types.

struct mfxEncoderDescription::encoder::encprofile::encmemdesc *MemDesc

Pointer to the array of memory types.

struct encmemdesc

This structure represents underlying details of the memory type.

Public Members

mfxResourceType MemHandleType

Memory handle type.

mfxRange32U Width

Range of supported image widths.

mfxRange32U Height

Range of supported image heights.

mfxU16 reserved[7]

reserved for future use.

mfxU16 NumColorFormats

Number of supported input color formats.

mfxU32 *ColorFormats

Pointer to the array of supported input color formats (in fourCC).

mfxVPPDescription

struct mfxVPPDescription

This structure represents VPP description.

Public Members

mfxStructVersion Version

Version of the structure.

mfxU16 reserved[7]

reserved for future use.

mfxU16 NumFilters

Number of supported VPP filters.

struct mfxVPPDescription::filter *Filters

Pointer to the array of supported filters.

struct filter

This structure represents VPP filters description.

Public Members

mfxU32 FilterFourCC

Filter ID in fourCC format.

mfxU16 MaxDelayInFrames

Introduced output delay in frames.

mfxU16 reserved[7]

reserved for future use.

mfxU16 NumMemTypes

Number of supported memory types.

struct mfxVPPDescription::filter::memdesc *MemDesc

Pointer to the array of memory types.

struct memdesc

This structure represents underlying details of the memory type.

Public Members

mfxResourceType MemHandleType

Memory handle type.

mfxRange32U Width

Range of supported image widths.

mfxRange32U Height

Range of supported image heights.

mfxU16 reserved[7]

reserved for future use.

mfxU16 NumInFormats

Number of supported input color formats.

struct mfxVPPDescription::filter::memdesc::format *Formats

Pointer to the array of supported formats.

struct format

This structure represents input color format description.

Public Members

mfxU16 reserved[7]

reserved for future use.

mfxU16 NumOutFormat

Number of supported output color formats.

mfxU32 *OutFormats

Pointer to the array of supported output color formats (in fourCC).

mfxImplDescription

struct mfxImplDescription

This structure represents implementation description

Public Members

mfxStructVersion Version

Version of the structure.

mfxIMPL Impl

Impl type: SW/GEN/Bay/Custom.

mfxU16 accelerationMode

Hardware acceleration stack to use. OS-dependent parameter: VA on Linux, DX* on Windows.

mfxVersion ApiVersion

Supported API version.

mfxU8 ImplName[MFX_IMPL_NAME]

Null-terminated string with implementation name given by vendor.

mfxU8 License[MFX_STRFIELD_LEN]

Null-terminated string with the license name of the implementation.

mfxU8 Keywords[MFX_STRFIELD_LEN]

Null-terminated string with a comma-separated list of keywords specific to this implementation that the dispatcher can search for.

mfxU32 VendorID

Standard vendor ID; 0x8086 for Intel.

mfxU32 VendorImplID

Vendor-specific number identifying the given implementation.

mfxDecoderDescription Dec

Decoders config.

mfxEncoderDescription Enc

Encoders config.

mfxVPPDescription VPP

VPP config.

mfxU32 reserved[16]

reserved for future use.

mfxU32 NumExtParam

Number of extension buffers. Reserved for future. Must be 0.

mfxExtBuffer **ExtParam

Array of extension buffers.

mfxU64 Reserved2

reserved for future use.

union mfxImplDescription::[anonymous] ExtParams

Extension buffers. Reserved for future.

Functions

mfxLoader MFXLoad()

This function creates the SDK loader.

Return

SDK loader handle, or NULL if the function failed.

void MFXUnload(mfxLoader loader)

This function destroys the SDK dispatcher.

Parameters
  • [in] loader: SDK loader handle.

mfxConfig MFXCreateConfig(mfxLoader loader)

This function creates dispatcher configuration.

This function creates the dispatcher's internal configuration, which is used to filter out available implementations. This config is then used to walk through the selected implementations to gather more details and select the appropriate implementation to load. The loader object remembers all created mfxConfig objects and destroys them during the MFXUnload function call.

Multiple configurations per single mfxLoader object are possible.

Usage example:

mfxLoader loader = MFXLoad();
mfxConfig cfg = MFXCreateConfig(loader);
MFXCreateSession(loader,0,&session);
Return

SDK config handle, or NULL pointer if failed.

Parameters
  • [in] loader: SDK loader handle.

mfxStatus MFXSetConfigFilterProperty(mfxConfig config, const mfxU8 *name, mfxVariant value)

This function is used to add an additional filter property (any field of the mfxImplDescription structure) to the configuration of the SDK loader object. One mfxConfig object can hold only a single filter property.

Simple Usage example:

mfxLoader loader = MFXLoad();
mfxConfig cfg = MFXCreateConfig(loader);
mfxVariant ImplValue;
ImplValue.Type = MFX_VARIANT_TYPE_U32;
ImplValue.Data.U32 = MFX_IMPL_SOFTWARE;
MFXSetConfigFilterProperty(cfg,"mfxImplDescription.Impl",ImplValue);
MFXCreateSession(loader,0,&session);
Note

Each new call with the same parameter “name” will overwrite previously set “value”. This may invalidate other properties.

Note

Each new call with another parameter “name” will delete the previous property and create a new property based on the new “name” value.

Two sessions usage example (Multiple loaders example):

// Create session with software based implementation
mfxLoader loader1 = MFXLoad();
mfxConfig cfg1 = MFXCreateConfig(loader1);
mfxVariant ImplValueSW;
ImplValueSW.Type = MFX_VARIANT_TYPE_U32;
ImplValueSW.Data.U32 = MFX_IMPL_SOFTWARE;
MFXSetConfigFilterProperty(cfg1,"mfxImplDescription.Impl",ImplValueSW);
MFXCreateSession(loader1,0,&sessionSW);

// Create session with hardware based implementation
mfxLoader loader2 = MFXLoad();
mfxConfig cfg2 = MFXCreateConfig(loader2);
mfxVariant ImplValueHW;
ImplValueHW.Type = MFX_VARIANT_TYPE_U32;
ImplValueHW.Data.U32 = MFX_IMPL_HARDWARE;
MFXSetConfigFilterProperty(cfg2,"mfxImplDescription.Impl",ImplValueHW);
MFXCreateSession(loader2,0,&sessionHW);

// use both sessionSW and sessionHW
// ...
// Close everything
MFXClose(sessionSW);
MFXClose(sessionHW);
MFXUnload(loader1); // cfg1 will be destroyed here.
MFXUnload(loader2); // cfg2 will be destroyed here.

Two decoders example (Multiple Config objects example):

mfxLoader loader = MFXLoad();

mfxConfig cfg1 = MFXCreateConfig(loader);
mfxVariant ImplValue;
ImplValue.Type = MFX_VARIANT_TYPE_U32;
ImplValue.Data.U32 = MFX_CODEC_AVC;
MFXSetConfigFilterProperty(cfg1,"mfxImplDescription.mfxDecoderDescription.decoder.CodecID",ImplValue);

mfxConfig cfg2 = MFXCreateConfig(loader);
ImplValue.Data.U32 = MFX_CODEC_HEVC;
MFXSetConfigFilterProperty(cfg2,"mfxImplDescription.mfxDecoderDescription.decoder.CodecID",ImplValue);

MFXCreateSession(loader,0,&sessionAVC);
MFXCreateSession(loader,0,&sessionHEVC);

Return

MFX_ERR_NONE The function completed successfully.

MFX_ERR_NULL_PTR If config is NULL.

MFX_ERR_NULL_PTR If name is NULL.

MFX_ERR_NOT_FOUND If name contains an unknown parameter name.

MFX_ERR_UNSUPPORTED If the value data type does not match the type of the parameter with the provided name.

Parameters
  • [in] config: SDK config handle.

  • [in] name: Name of the parameter (see mfxImplDescription structure and example).

  • [in] value: Value of the parameter.

mfxStatus MFXEnumImplementations(mfxLoader loader, mfxU32 i, mfxImplCapsDeliveryFormat format, mfxHDL *idesc)

This function is used to iterate over the filtered implementations to gather their details. This function allocates memory to store an mfxImplDescription structure instance. Use the MFXDispReleaseImplDescription function to free the memory allocated for the mfxImplDescription structure.

Return

MFX_ERR_NONE The function completed successfully. The idesc contains valid information.

MFX_ERR_NULL_PTR If loader is NULL.

MFX_ERR_NULL_PTR If idesc is NULL.

MFX_ERR_NOT_FOUND Provided index is out of possible range.

MFX_ERR_UNSUPPORTED If requested format isn’t supported.

Parameters
  • [in] loader: SDK loader handle.

  • [in] i: Index of the implementation.

  • [in] format: Format in which capabilities need to be delivered. See mfxImplCapsDeliveryFormat enumerator for more details.

  • [out] idesc: Pointer to the mfxImplDescription structure.

mfxStatus MFXCreateSession(mfxLoader loader, mfxU32 i, mfxSession *session)

This function is used to load and initialize the implementation.

mfxLoader loader = MFXLoad();
int i=0;
while(1) {
   mfxImplDescription *idesc;
   MFXEnumImplementations(loader, i, MFX_IMPLCAPS_IMPLDESCSTRUCTURE, (mfxHDL*)&idesc);
   if(is_good(idesc)) {
       MFXCreateSession(loader, i,&session);
       // ...
       MFXDispReleaseImplDescription(loader, idesc);
   }
   else
   {
       MFXDispReleaseImplDescription(loader, idesc);
       break;
   }
}
Return

MFX_ERR_NONE The function completed successfully. The session contains pointer to the SDK session handle.

MFX_ERR_NULL_PTR If loader is NULL.

MFX_ERR_NULL_PTR If session is NULL.

MFX_ERR_NOT_FOUND Provided index is out of possible range.

Parameters
  • [in] loader: SDK loader handle.

  • [in] i: Index of the implementation.

  • [out] session: pointer to the SDK session handle.

mfxStatus MFXDispReleaseImplDescription(mfxLoader loader, mfxHDL hdl)

This function destroys the handle allocated by the MFXEnumImplementations function.

Return

MFX_ERR_NONE The function completed successfully.

MFX_ERR_NULL_PTR If loader is NULL.

MFX_ERR_INVALID_HANDLE The provided hdl handle is not associated with this loader.

Parameters
  • [in] loader: SDK loader handle.

  • [in] hdl: Handle to destroy. Can be equal to NULL.

Enums

mfxStatus

enum mfxStatus

Itemizes status codes returned by SDK functions

Values:

enumerator MFX_ERR_NONE = 0

no error

enumerator MFX_ERR_UNKNOWN = -1

unknown error.

enumerator MFX_ERR_NULL_PTR = -2

null pointer

enumerator MFX_ERR_UNSUPPORTED = -3

undeveloped feature

enumerator MFX_ERR_MEMORY_ALLOC = -4

failed to allocate memory

enumerator MFX_ERR_NOT_ENOUGH_BUFFER = -5

insufficient buffer at input/output

enumerator MFX_ERR_INVALID_HANDLE = -6

invalid handle

enumerator MFX_ERR_LOCK_MEMORY = -7

failed to lock the memory block

enumerator MFX_ERR_NOT_INITIALIZED = -8

member function called before initialization

enumerator MFX_ERR_NOT_FOUND = -9

the specified object is not found

enumerator MFX_ERR_MORE_DATA = -10

expect more data at input

enumerator MFX_ERR_MORE_SURFACE = -11

expect more surface at output

enumerator MFX_ERR_ABORTED = -12

operation aborted

enumerator MFX_ERR_DEVICE_LOST = -13

lose the HW acceleration device

enumerator MFX_ERR_INCOMPATIBLE_VIDEO_PARAM = -14

incompatible video parameters

enumerator MFX_ERR_INVALID_VIDEO_PARAM = -15

invalid video parameters

enumerator MFX_ERR_UNDEFINED_BEHAVIOR = -16

undefined behavior

enumerator MFX_ERR_DEVICE_FAILED = -17

device operation failure

enumerator MFX_ERR_MORE_BITSTREAM = -18

expect more bitstream buffers at output

enumerator MFX_ERR_GPU_HANG = -21

device operation failure caused by GPU hang

enumerator MFX_ERR_REALLOC_SURFACE = -22

bigger output surface required

enumerator MFX_WRN_IN_EXECUTION = 1

the previous asynchronous operation is in execution

enumerator MFX_WRN_DEVICE_BUSY = 2

the HW acceleration device is busy

enumerator MFX_WRN_VIDEO_PARAM_CHANGED = 3

the video parameters are changed during decoding

enumerator MFX_WRN_PARTIAL_ACCELERATION = 4

SW is used

enumerator MFX_WRN_INCOMPATIBLE_VIDEO_PARAM = 5

incompatible video parameters

enumerator MFX_WRN_VALUE_NOT_CHANGED = 6

the value is saturated based on its valid range

enumerator MFX_WRN_OUT_OF_RANGE = 7

the value is out of valid range

enumerator MFX_WRN_FILTER_SKIPPED = 10

one of requested filters has been skipped

enumerator MFX_ERR_NONE_PARTIAL_OUTPUT = 12

frame is not ready, but bitstream contains partial output

enumerator MFX_TASK_DONE = MFX_ERR_NONE

task has been completed

enumerator MFX_TASK_WORKING = 8

there is some more work to do

enumerator MFX_TASK_BUSY = 9

task is waiting for resources

enumerator MFX_ERR_MORE_DATA_SUBMIT_TASK = -10000

return MFX_ERR_MORE_DATA but submit internal asynchronous task

mfxIMPL

typedef mfxI32 mfxIMPL

This enumerator itemizes SDK implementation types. The implementation type is a bit OR’ed value of the base type and any decorative flags.

enumerator MFX_IMPL_SOFTWARE = 0x0001

Pure Software Implementation

enumerator MFX_IMPL_HARDWARE = 0x0002

Hardware Accelerated Implementation (default device)

enumerator MFX_IMPL_AUTO_ANY = 0x0003

Auto selection of any hardware/software implementation

enumerator MFX_IMPL_HARDWARE_ANY = 0x0004

Auto selection of any hardware implementation

enumerator MFX_IMPL_HARDWARE2 = 0x0005

Hardware accelerated implementation (2nd device)

enumerator MFX_IMPL_HARDWARE3 = 0x0006

Hardware accelerated implementation (3rd device)

enumerator MFX_IMPL_HARDWARE4 = 0x0007

Hardware accelerated implementation (4th device)

enumerator MFX_IMPL_RUNTIME = 0x0008

This value cannot be used for session initialization. It may be returned by MFXQueryIMPL function to show that session has been initialized in run time mode.

enumerator MFX_IMPL_SINGLE_THREAD = 0x0009
enumerator MFX_IMPL_VIA_ANY = 0x0100

Hardware acceleration can go through any supported OS infrastructure. This is the default value; the SDK uses it if the application does not specify any of the MFX_IMPL_VIA_xxx flags.

enumerator MFX_IMPL_VIA_D3D9 = 0x0200

Hardware acceleration goes through the Microsoft* Direct3D9* infrastructure.

enumerator MFX_IMPL_VIA_D3D11 = 0x0300

Hardware acceleration goes through the Microsoft* Direct3D11* infrastructure.

enumerator MFX_IMPL_VIA_VAAPI = 0x0400

Hardware acceleration goes through the Linux* VA API infrastructure.

enumerator MFX_IMPL_EXTERNAL_THREADING = 0x10000
enumerator MFX_IMPL_UNSUPPORTED = 0x0000

One of the possible MFXQueryIMPL return values.

MFX_IMPL_BASETYPE(x)

The application can use the macro MFX_IMPL_BASETYPE(x) to obtain the base implementation type.
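
For illustration, a small sketch of masking out the decorative flags after session initialization; session is assumed to be an initialized SDK session:

mfxIMPL impl;
MFXQueryIMPL(session, &impl);

if (MFX_IMPL_BASETYPE(impl) == MFX_IMPL_SOFTWARE) {
    /* the software implementation is in use */
} else {
    /* one of the hardware implementations is in use; decorative
       flags such as MFX_IMPL_VIA_D3D11 have been masked out */
}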

mfxImplCapsDeliveryFormat

enum mfxImplCapsDeliveryFormat

Values:

enumerator MFX_IMPLCAPS_IMPLDESCSTRUCTURE = 1

Deliver capabilities as mfxImplDescription structure.

mfxPriority

enum mfxPriority

The mfxPriority enumerator describes the session priority.

Values:

enumerator MFX_PRIORITY_LOW = 0

Low priority: the session operation halts when high priority tasks are executing and more than 75% of the CPU is being used for normal priority tasks.

enumerator MFX_PRIORITY_NORMAL = 1

Normal priority: the session operation is halted if there are high priority tasks.

enumerator MFX_PRIORITY_HIGH = 2

High priority: the session operation blocks other lower priority session operations.

GPUCopy

enumerator MFX_GPUCOPY_DEFAULT = 0

Use default mode for the current SDK implementation.

enumerator MFX_GPUCOPY_ON = 1

Enable GPU accelerated copying.

enumerator MFX_GPUCOPY_OFF = 2

Disable GPU accelerated copying.

PlatformCodeName

enumerator MFX_PLATFORM_UNKNOWN = 0

Unknown platform

enumerator MFX_PLATFORM_SANDYBRIDGE = 1

Sandy Bridge

enumerator MFX_PLATFORM_IVYBRIDGE = 2

Ivy Bridge

enumerator MFX_PLATFORM_HASWELL = 3

Haswell

enumerator MFX_PLATFORM_BAYTRAIL = 4

Bay Trail

enumerator MFX_PLATFORM_BROADWELL = 5

Broadwell

enumerator MFX_PLATFORM_CHERRYTRAIL = 6

Cherry Trail

enumerator MFX_PLATFORM_SKYLAKE = 7

Skylake

enumerator MFX_PLATFORM_APOLLOLAKE = 8

Apollo Lake

enumerator MFX_PLATFORM_KABYLAKE = 9

Kaby Lake

enumerator MFX_PLATFORM_GEMINILAKE = 10

Gemini Lake

enumerator MFX_PLATFORM_COFFEELAKE = 11

Coffee Lake

enumerator MFX_PLATFORM_CANNONLAKE = 20

Cannon Lake

enumerator MFX_PLATFORM_ICELAKE = 30

Ice Lake

enumerator MFX_PLATFORM_JASPERLAKE = 32

Jasper Lake

enumerator MFX_PLATFORM_ELKHARTLAKE = 33

Elkhart Lake

enumerator MFX_PLATFORM_TIGERLAKE = 40

Tiger Lake

mfxMediaAdapterType

enum mfxMediaAdapterType

The mfxMediaAdapterType enumerator itemizes types of Intel Gen Graphics adapters.

Values:

enumerator MFX_MEDIA_UNKNOWN = 0xffff

Unknown type.

enumerator MFX_MEDIA_INTEGRATED = 0

Integrated Intel Gen Graphics adapter.

enumerator MFX_MEDIA_DISCRETE = 1

Discrete Intel Gen Graphics adapter.

mfxMemoryFlags

enum mfxMemoryFlags

The mfxMemoryFlags enumerator specifies memory access mode.

Values:

enumerator MFX_MAP_READ = 0x1

The surface is mapped for reading.

enumerator MFX_MAP_WRITE = 0x2

The surface is mapped for writing.

enumerator MFX_MAP_READ_WRITE = MFX_MAP_READ | MFX_MAP_WRITE

The surface is mapped for reading and writing.

enumerator MFX_MAP_NOWAIT = 0x10

The mapping would be done immediately without any implicit synchronizations.

Attention

This flag is optional

mfxResourceType

enum mfxResourceType

Values:

enumerator MFX_RESOURCE_SYSTEM_SURFACE = 1

System memory.

enumerator MFX_RESOURCE_VA_SURFACE = 2

VA Surface.

enumerator MFX_RESOURCE_VA_BUFFER = 3

VA Buffer.

enumerator MFX_RESOURCE_DX9_SURFACE = 4

IDirect3DSurface9.

enumerator MFX_RESOURCE_DX11_TEXTURE = 5

ID3D11Texture2D.

enumerator MFX_RESOURCE_DX12_RESOURCE = 6

ID3D12Resource.

enumerator MFX_RESOURCE_DMA_RESOURCE = 7

DMA resource.

ColorFourCC

The ColorFourCC enumerator itemizes color formats.

enumerator MFX_FOURCC_NV12 = MFX_MAKEFOURCC('N', 'V', '1', '2')

NV12 color planes. Native Format

enumerator MFX_FOURCC_NV21 = MFX_MAKEFOURCC('N', 'V', '2', '1')

Same as NV12 but with weaved V and U values.

enumerator MFX_FOURCC_YV12 = MFX_MAKEFOURCC('Y', 'V', '1', '2')

YV12 color planes

enumerator MFX_FOURCC_IYUV = MFX_MAKEFOURCC('I', 'Y', 'U', 'V')

Same as YV12 except that the U and V plane order is reversed.

enumerator MFX_FOURCC_NV16 = MFX_MAKEFOURCC('N', 'V', '1', '6')

4:2:2 color format with similar to NV12 layout.

enumerator MFX_FOURCC_YUY2 = MFX_MAKEFOURCC('Y', 'U', 'Y', '2')

YUY2 color planes.

enumerator MFX_FOURCC_RGB565 = MFX_MAKEFOURCC('R', 'G', 'B', '2')

2 bytes per pixel, uint16 in little-endian format, where 0-4 bits are blue, bits 5-10 are green and bits 11-15 are red

enumerator MFX_FOURCC_RGBP = MFX_MAKEFOURCC('R', 'G', 'B', 'P')

RGB 24 bit planar layout (3 separate channels, 8-bits per sample each). This format should be mapped to D3DFMT_R8G8B8 or VA_FOURCC_RGBP.

enumerator MFX_FOURCC_RGB4 = MFX_MAKEFOURCC('R', 'G', 'B', '4')

RGB4 (RGB32) color planes. ARGB is the order, A channel is 8 MSBs

enumerator MFX_FOURCC_P8 = 41

Internal SDK color format. The application should use one of the functions below to create such surface, depending on Direct3D version.

Direct3D9: IDirectXVideoDecoderService::CreateSurface()

Direct3D11: ID3D11Device::CreateBuffer()

enumerator MFX_FOURCC_P8_TEXTURE = MFX_MAKEFOURCC('P', '8', 'M', 'B')

Internal SDK color format. The application should use one of the functions below to create such surface, depending on Direct3D version.

Direct3D9: IDirectXVideoDecoderService::CreateSurface()

Direct3D11: ID3D11Device::CreateTexture2D()

enumerator MFX_FOURCC_P010 = MFX_MAKEFOURCC('P', '0', '1', '0')

P010 color format. This is 10 bit per sample format with similar to NV12 layout. This format should be mapped to DXGI_FORMAT_P010.

enumerator MFX_FOURCC_I010 = MFX_MAKEFOURCC('I', '0', '1', '0')

10-bit YUV 4:2:0 color format in which each component has its own plane.

enumerator MFX_FOURCC_P016 = MFX_MAKEFOURCC('P', '0', '1', '6')

P016 color format. This is 16 bit per sample format with similar to NV12 layout. This format should be mapped to DXGI_FORMAT_P016.

enumerator MFX_FOURCC_P210 = MFX_MAKEFOURCC('P', '2', '1', '0')

10 bit per sample 4:2:2 color format with a layout similar to NV12

enumerator MFX_FOURCC_BGR4 = MFX_MAKEFOURCC('B', 'G', 'R', '4')

ABGR color format. It is similar to MFX_FOURCC_RGB4 but with interchanged R and B channels. ‘A’ is 8 MSBs, then 8 bits for ‘B’ channel, then ‘G’ and ‘R’ channels.

enumerator MFX_FOURCC_A2RGB10 = MFX_MAKEFOURCC('R', 'G', '1', '0')

10 bits ARGB color format packed in 32 bits. ‘A’ channel is two MSBs, then ‘R’, then ‘G’ and then ‘B’ channels. This format should be mapped to DXGI_FORMAT_R10G10B10A2_UNORM or D3DFMT_A2R10G10B10.

enumerator MFX_FOURCC_ARGB16 = MFX_MAKEFOURCC('R', 'G', '1', '6')

10 bits ARGB color format packed in 64 bits. ‘A’ channel is 16 MSBs, then ‘R’, then ‘G’ and then ‘B’ channels. This format should be mapped to DXGI_FORMAT_R16G16B16A16_UINT or D3DFMT_A16B16G16R16 formats.

enumerator MFX_FOURCC_ABGR16 = MFX_MAKEFOURCC('B', 'G', '1', '6')

10 bits ABGR color format packed in 64 bits. ‘A’ channel is 16 MSBs, then ‘B’, then ‘G’ and then ‘R’ channels. This format should be mapped to DXGI_FORMAT_R16G16B16A16_UINT or D3DFMT_A16B16G16R16 formats.

enumerator MFX_FOURCC_R16 = MFX_MAKEFOURCC('R', '1', '6', 'U')

16 bits single channel color format. This format should be mapped to DXGI_FORMAT_R16_TYPELESS or D3DFMT_R16F.

enumerator MFX_FOURCC_AYUV = MFX_MAKEFOURCC('A', 'Y', 'U', 'V')

YUV 4:4:4, AYUV color format. This format should be mapped to DXGI_FORMAT_AYUV.

enumerator MFX_FOURCC_AYUV_RGB4 = MFX_MAKEFOURCC('A', 'V', 'U', 'Y')

RGB4 stored in AYUV surface. This format should be mapped to DXGI_FORMAT_AYUV.

enumerator MFX_FOURCC_UYVY = MFX_MAKEFOURCC('U', 'Y', 'V', 'Y')

UYVY color planes. Same as YUY2 except the byte order is reversed.

enumerator MFX_FOURCC_Y210 = MFX_MAKEFOURCC('Y', '2', '1', '0')

10 bit per sample 4:2:2 packed color format with similar to YUY2 layout. This format should be mapped to DXGI_FORMAT_Y210.

enumerator MFX_FOURCC_Y410 = MFX_MAKEFOURCC('Y', '4', '1', '0')

10 bit per sample 4:4:4 packed color format. This format should be mapped to DXGI_FORMAT_Y410.

enumerator MFX_FOURCC_Y216 = MFX_MAKEFOURCC('Y', '2', '1', '6')

16 bit per sample 4:2:2 packed color format with similar to YUY2 layout. This format should be mapped to DXGI_FORMAT_Y216.

enumerator MFX_FOURCC_Y416 = MFX_MAKEFOURCC('Y', '4', '1', '6')

16 bit per sample 4:4:4 packed color format. This format should be mapped to DXGI_FORMAT_Y416.

ChromaFormatIdc

The ChromaFormatIdc enumerator itemizes color-sampling formats.

enumerator MFX_CHROMAFORMAT_MONOCHROME = 0

Monochrome

enumerator MFX_CHROMAFORMAT_YUV420 = 1

4:2:0 color

enumerator MFX_CHROMAFORMAT_YUV422 = 2

4:2:2 color

enumerator MFX_CHROMAFORMAT_YUV444 = 3

4:4:4 color

enumerator MFX_CHROMAFORMAT_YUV400 = MFX_CHROMAFORMAT_MONOCHROME

Equal to monochrome

enumerator MFX_CHROMAFORMAT_YUV411 = 4

4:1:1 color

enumerator MFX_CHROMAFORMAT_YUV422H = MFX_CHROMAFORMAT_YUV422

4:2:2 color, horizontal subsampling. It is equal to 4:2:2 color.

enumerator MFX_CHROMAFORMAT_YUV422V = 5

4:2:2 color, vertical subsampling

enumerator MFX_CHROMAFORMAT_RESERVED1 = 6

Reserved

enumerator MFX_CHROMAFORMAT_JPEG_SAMPLING = 6

Color sampling specified via mfxInfoMFX::SamplingFactorH and SamplingFactorV.

PicStruct

The PicStruct enumerator itemizes picture structure. Use bit-OR’ed values to specify the desired picture type.

enumerator MFX_PICSTRUCT_UNKNOWN = 0x00

Unspecified or mixed progressive/interlaced/field pictures.

enumerator MFX_PICSTRUCT_PROGRESSIVE = 0x01

Progressive picture.

enumerator MFX_PICSTRUCT_FIELD_TFF = 0x02

Top field in first interlaced picture.

enumerator MFX_PICSTRUCT_FIELD_BFF = 0x04

Bottom field in first interlaced picture.

enumerator MFX_PICSTRUCT_FIELD_REPEATED = 0x10

First field repeated: pic_struct=5 or 6 in H.264.

enumerator MFX_PICSTRUCT_FRAME_DOUBLING = 0x20

Double the frame for display: pic_struct=7 in H.264.

enumerator MFX_PICSTRUCT_FRAME_TRIPLING = 0x40

Triple the frame for display: pic_struct=8 in H.264.

enumerator MFX_PICSTRUCT_FIELD_SINGLE = 0x100

Single field in a picture.

enumerator MFX_PICSTRUCT_FIELD_TOP = MFX_PICSTRUCT_FIELD_SINGLE | MFX_PICSTRUCT_FIELD_TFF

Top field in a picture: pic_struct = 1 in H.265.

enumerator MFX_PICSTRUCT_FIELD_BOTTOM = MFX_PICSTRUCT_FIELD_SINGLE | MFX_PICSTRUCT_FIELD_BFF

Bottom field in a picture: pic_struct = 2 in H.265.

enumerator MFX_PICSTRUCT_FIELD_PAIRED_PREV = 0x200

Paired with previous field: pic_struct = 9 or 10 in H.265.

enumerator MFX_PICSTRUCT_FIELD_PAIRED_NEXT = 0x400

Paired with next field: pic_struct = 11 or 12 in H.265

Frame Data Flags

enumerator MFX_TIMESTAMP_UNKNOWN = -1

Indicates that the timestamp is unknown for this frame/bitstream portion.

enumerator MFX_FRAMEORDER_UNKNOWN = -1

Unused entry or SDK functions that generate the frame output do not use this frame.

enumerator MFX_FRAMEDATA_ORIGINAL_TIMESTAMP = 0x0001

Indicates the time stamp of this frame is not calculated and is a pass-through of the original time stamp.

Corruption

The Corruption enumerator itemizes the decoding corruption types. It is a bit-OR’ed value of the following.

enumerator MFX_CORRUPTION_MINOR = 0x0001

Minor corruption in decoding certain macro-blocks.

enumerator MFX_CORRUPTION_MAJOR = 0x0002

Major corruption in decoding the frame - incomplete data, for example.

enumerator MFX_CORRUPTION_ABSENT_TOP_FIELD = 0x0004

Top field of frame is absent in bitstream. Only bottom field has been decoded.

enumerator MFX_CORRUPTION_ABSENT_BOTTOM_FIELD = 0x0008

Bottom field of frame is absent in bitstream. Only top field has been decoded.

enumerator MFX_CORRUPTION_REFERENCE_FRAME = 0x0010

Decoding used a corrupted reference frame. For example, if a frame refers to a frame that was decoded with the minor/major corruption flag set, this frame is also marked with the reference corruption flag.

enumerator MFX_CORRUPTION_REFERENCE_LIST = 0x0020

The reference list information of this frame does not match what is specified in the Reference Picture Marking Repetition SEI message. (ITU-T H.264 D.1.8 dec_ref_pic_marking_repetition)

Note

Flag MFX_CORRUPTION_ABSENT_TOP_FIELD/MFX_CORRUPTION_ABSENT_BOTTOM_FIELD is set by the AVC decoder when it detects that one of fields is not present in bitstream. Which field is absent depends on value of bottom_field_flag (ITU-T H.264 7.4.3).

TimeStampCalc

The TimeStampCalc enumerator itemizes time-stamp calculation methods.

enumerator MFX_TIMESTAMPCALC_UNKNOWN = 0

The time stamp calculation is based on the input frame rate if the time stamp is not explicitly specified.

enumerator MFX_TIMESTAMPCALC_TELECINE = 1

Adjust time stamp to 29.97fps on 24fps progressively encoded sequences if telecining attributes are available in the bitstream and time stamp is not explicitly specified. The input frame rate must be specified.

IOPattern

The IOPattern enumerator itemizes memory access patterns for SDK functions. Use bit-ORed values to specify an input access pattern and an output access pattern.

enumerator MFX_IOPATTERN_IN_VIDEO_MEMORY = 0x01

Input to SDK functions is a video memory surface.

enumerator MFX_IOPATTERN_IN_SYSTEM_MEMORY = 0x02

Input to SDK functions is a linear buffer directly in system memory or in system memory through an external allocator.

enumerator MFX_IOPATTERN_OUT_VIDEO_MEMORY = 0x10

Output to SDK functions is a video memory surface.

enumerator MFX_IOPATTERN_OUT_SYSTEM_MEMORY = 0x20

Output to SDK functions is a linear buffer directly in system memory or in system memory through an external allocator.
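
For illustration, a decoder typically specifies only an output pattern, while an encoder specifies only an input pattern and VPP combines both; param below is an assumed mfxVideoParam instance:

/* decoder: write decoded frames to video memory surfaces */
param.IOPattern = MFX_IOPATTERN_OUT_VIDEO_MEMORY;

/* encoder: read input frames from video memory surfaces */
param.IOPattern = MFX_IOPATTERN_IN_VIDEO_MEMORY;

/* VPP: read input from system memory, write output to video memory */
param.IOPattern = MFX_IOPATTERN_IN_SYSTEM_MEMORY | MFX_IOPATTERN_OUT_VIDEO_MEMORY;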

CodecFormatFourCC

The CodecFormatFourCC enumerator itemizes codecs in the FourCC format.

enumerator MFX_CODEC_AVC = MFX_MAKEFOURCC('A', 'V', 'C', ' ')

AVC, H.264, or MPEG-4 Part 10 codec

enumerator MFX_CODEC_HEVC = MFX_MAKEFOURCC('H', 'E', 'V', 'C')

HEVC codec

enumerator MFX_CODEC_MPEG2 = MFX_MAKEFOURCC('M', 'P', 'G', '2')

MPEG-2 codec

enumerator MFX_CODEC_VC1 = MFX_MAKEFOURCC('V', 'C', '1', ' ')

VC-1 codec

enumerator MFX_CODEC_VP9 = MFX_MAKEFOURCC('V', 'P', '9', ' ')

VP9 codec

enumerator MFX_CODEC_AV1 = MFX_MAKEFOURCC('A', 'V', '1', ' ')

AV1 codec

enumerator MFX_CODEC_JPEG = MFX_MAKEFOURCC('J', 'P', 'E', 'G')

JPEG codec

CodecProfile

The CodecProfile enumerator itemizes codec profiles for all codecs.

enumerator MFX_PROFILE_UNKNOWN = 0

Unspecified profile

H.264 profiles

enumerator MFX_PROFILE_AVC_BASELINE = 66
enumerator MFX_PROFILE_AVC_MAIN = 77
enumerator MFX_PROFILE_AVC_EXTENDED = 88
enumerator MFX_PROFILE_AVC_HIGH = 100
enumerator MFX_PROFILE_AVC_HIGH10 = 110
enumerator MFX_PROFILE_AVC_HIGH_422 = 122
enumerator MFX_PROFILE_AVC_CONSTRAINED_BASELINE = MFX_PROFILE_AVC_BASELINE + MFX_PROFILE_AVC_CONSTRAINT_SET1
enumerator MFX_PROFILE_AVC_CONSTRAINED_HIGH = MFX_PROFILE_AVC_HIGH + MFX_PROFILE_AVC_CONSTRAINT_SET4 + MFX_PROFILE_AVC_CONSTRAINT_SET5

Combined with an H.264 profile, these flags impose additional constraints. See the H.264 specification for the list of constraints.

enumerator MFX_PROFILE_AVC_CONSTRAINT_SET0 = (0x100 << 0)
enumerator MFX_PROFILE_AVC_CONSTRAINT_SET1 = (0x100 << 1)
enumerator MFX_PROFILE_AVC_CONSTRAINT_SET2 = (0x100 << 2)
enumerator MFX_PROFILE_AVC_CONSTRAINT_SET3 = (0x100 << 3)
enumerator MFX_PROFILE_AVC_CONSTRAINT_SET4 = (0x100 << 4)
enumerator MFX_PROFILE_AVC_CONSTRAINT_SET5 = (0x100 << 5)

Multi-view video coding extension profiles

enumerator MFX_PROFILE_AVC_MULTIVIEW_HIGH = 118

Multi-view high profile.

enumerator MFX_PROFILE_AVC_STEREO_HIGH = 128

Stereo high profile.

MPEG-2 profiles

enumerator MFX_PROFILE_MPEG2_SIMPLE = 0x50
enumerator MFX_PROFILE_MPEG2_MAIN = 0x40
enumerator MFX_PROFILE_MPEG2_HIGH = 0x10

VC-1 Profiles

enumerator MFX_PROFILE_VC1_SIMPLE = (0 + 1)
enumerator MFX_PROFILE_VC1_MAIN = (4 + 1)
enumerator MFX_PROFILE_VC1_ADVANCED = (12 + 1)

HEVC profiles

enumerator MFX_PROFILE_HEVC_MAIN = 1
enumerator MFX_PROFILE_HEVC_MAIN10 = 2
enumerator MFX_PROFILE_HEVC_MAINSP = 3
enumerator MFX_PROFILE_HEVC_REXT = 4
enumerator MFX_PROFILE_HEVC_SCC = 9

VP8 Profiles

enumerator MFX_PROFILE_VP8_0 = 0 + 1
enumerator MFX_PROFILE_VP8_1 = 1 + 1
enumerator MFX_PROFILE_VP8_2 = 2 + 1
enumerator MFX_PROFILE_VP8_3 = 3 + 1

VP9 Profiles

enumerator MFX_PROFILE_VP9_0 = 1
enumerator MFX_PROFILE_VP9_1 = 2
enumerator MFX_PROFILE_VP9_2 = 3
enumerator MFX_PROFILE_VP9_3 = 4

JPEG Profiles

enumerator MFX_PROFILE_JPEG_BASELINE = 1

Baseline JPEG Profile.

CodecLevel

The CodecLevel enumerator itemizes codec levels for all codecs.

enumerator MFX_LEVEL_UNKNOWN = 0

Unspecified level

H.264 level 1-1.3

enumerator MFX_LEVEL_AVC_1 = 10
enumerator MFX_LEVEL_AVC_1b = 9
enumerator MFX_LEVEL_AVC_11 = 11
enumerator MFX_LEVEL_AVC_12 = 12
enumerator MFX_LEVEL_AVC_13 = 13

H.264 level 2-2.2

enumerator MFX_LEVEL_AVC_2 = 20
enumerator MFX_LEVEL_AVC_21 = 21
enumerator MFX_LEVEL_AVC_22 = 22

H.264 level 3-3.2

enumerator MFX_LEVEL_AVC_3 = 30
enumerator MFX_LEVEL_AVC_31 = 31
enumerator MFX_LEVEL_AVC_32 = 32

H.264 level 4-4.2

enumerator MFX_LEVEL_AVC_4 = 40
enumerator MFX_LEVEL_AVC_41 = 41
enumerator MFX_LEVEL_AVC_42 = 42

H.264 level 5-5.2

enumerator MFX_LEVEL_AVC_5 = 50
enumerator MFX_LEVEL_AVC_51 = 51
enumerator MFX_LEVEL_AVC_52 = 52

MPEG2 Levels

enumerator MFX_LEVEL_MPEG2_LOW = 0xA
enumerator MFX_LEVEL_MPEG2_MAIN = 0x8
enumerator MFX_LEVEL_MPEG2_HIGH = 0x4
enumerator MFX_LEVEL_MPEG2_HIGH1440 = 0x6

VC-1 Level Low (simple & main profiles)

enumerator MFX_LEVEL_VC1_LOW = (0 + 1)
enumerator MFX_LEVEL_VC1_MEDIAN = (2 + 1)
enumerator MFX_LEVEL_VC1_HIGH = (4 + 1)

VC-1 advanced profile levels

enumerator MFX_LEVEL_VC1_0 = (0x00 + 1)
enumerator MFX_LEVEL_VC1_1 = (0x01 + 1)
enumerator MFX_LEVEL_VC1_2 = (0x02 + 1)
enumerator MFX_LEVEL_VC1_3 = (0x03 + 1)
enumerator MFX_LEVEL_VC1_4 = (0x04 + 1)

HEVC levels

enumerator MFX_LEVEL_HEVC_1 = 10
enumerator MFX_LEVEL_HEVC_2 = 20
enumerator MFX_LEVEL_HEVC_21 = 21
enumerator MFX_LEVEL_HEVC_3 = 30
enumerator MFX_LEVEL_HEVC_31 = 31
enumerator MFX_LEVEL_HEVC_4 = 40
enumerator MFX_LEVEL_HEVC_41 = 41
enumerator MFX_LEVEL_HEVC_5 = 50
enumerator MFX_LEVEL_HEVC_51 = 51
enumerator MFX_LEVEL_HEVC_52 = 52
enumerator MFX_LEVEL_HEVC_6 = 60
enumerator MFX_LEVEL_HEVC_61 = 61
enumerator MFX_LEVEL_HEVC_62 = 62

HEVC Tiers

enumerator MFX_TIER_HEVC_MAIN = 0
enumerator MFX_TIER_HEVC_HIGH = 0x100

GopOptFlag

The GopOptFlag enumerator itemizes special properties in the GOP (Group of Pictures) sequence.

enumerator MFX_GOP_CLOSED = 1

The encoder generates closed GOP if this flag is set. Frames in this GOP do not use frames in previous GOP as reference.

The encoder generates open GOP if this flag is not set. In this GOP frames prior to the first frame of GOP in display order may use frames from previous GOP as reference. Frames subsequent to the first frame of GOP in display order do not use frames from previous GOP as reference.

The AVC encoder ignores this flag if IdrInterval in mfxInfoMFX structure is set to 0, i.e. if every GOP starts from IDR frame. In this case, GOP is encoded as closed.

This flag does not affect long-term reference frames.

enumerator MFX_GOP_STRICT = 2

The encoder must strictly follow the given GOP structure as defined by parameter GopPicSize, GopRefDist etc in the mfxVideoParam structure. Otherwise, the encoder can adapt the GOP structure for better efficiency, whose range is constrained by parameter GopPicSize and GopRefDist etc. See also description of AdaptiveI and AdaptiveB fields in the mfxExtCodingOption2 structure.

TargetUsage

The TargetUsage enumerator itemizes a range of numbers from MFX_TARGETUSAGE_1, best quality, to MFX_TARGETUSAGE_7, best speed. It indicates trade-offs between quality and speed. The application can use any number in the range. The actual number of supported target usages depends on implementation. If specified target usage is not supported, the SDK encoder will use the closest supported value.

enumerator MFX_TARGETUSAGE_1 = 1

Best quality

enumerator MFX_TARGETUSAGE_2 = 2
enumerator MFX_TARGETUSAGE_3 = 3
enumerator MFX_TARGETUSAGE_4 = 4

Balanced quality and speed.

enumerator MFX_TARGETUSAGE_5 = 5
enumerator MFX_TARGETUSAGE_6 = 6
enumerator MFX_TARGETUSAGE_7 = 7

Best speed

enumerator MFX_TARGETUSAGE_UNKNOWN = 0

Unspecified target usage.

enumerator MFX_TARGETUSAGE_BEST_QUALITY = MFX_TARGETUSAGE_1

Best quality.

enumerator MFX_TARGETUSAGE_BALANCED = MFX_TARGETUSAGE_4

Balanced quality and speed.

enumerator MFX_TARGETUSAGE_BEST_SPEED = MFX_TARGETUSAGE_7

Best speed.

RateControlMethod

The RateControlMethod enumerator itemizes bitrate control methods.

enumerator MFX_RATECONTROL_CBR = 1

Use the constant bitrate control algorithm.

enumerator MFX_RATECONTROL_VBR = 2

Use the variable bitrate control algorithm.

enumerator MFX_RATECONTROL_CQP = 3

Use the constant quantization parameter algorithm.

enumerator MFX_RATECONTROL_AVBR = 4

Use the average variable bitrate control algorithm.

enumerator MFX_RATECONTROL_LA = 8

Use the VBR algorithm with look ahead. It is a special bitrate control mode in the SDK AVC encoder that has been designed to improve encoding quality. It works by performing extensive analysis of several dozen frames before the actual encoding and as a side effect significantly increases encoding delay and memory consumption.

The only available rate control parameter in this mode is mfxInfoMFX::TargetKbps. Two other parameters, MaxKbps and InitialDelayInKB, are ignored. To control LA depth the application can use mfxExtCodingOption2::LookAheadDepth parameter.

This method is not HRD compliant.
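
A possible look ahead configuration is sketched below; the buffer attachment follows the usual extended-buffer pattern (see the ExtendedBufferID section) and the chosen values are illustrative only.

mfxExtCodingOption2 co2 = {};
co2.Header.BufferId = MFX_EXTBUFF_CODING_OPTION2;
co2.Header.BufferSz = sizeof(co2);
co2.LookAheadDepth  = 40;                        // frames analyzed ahead of encoding

mfxExtBuffer* ext[] = { &co2.Header };

mfxVideoParam par = {};
par.mfx.RateControlMethod = MFX_RATECONTROL_LA;
par.mfx.TargetKbps        = 4000;                // the only rate control parameter used by LA
par.ExtParam              = ext;
par.NumExtParam           = 1;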

enumerator MFX_RATECONTROL_ICQ = 9

Use the Intelligent Constant Quality algorithm. This algorithm improves subjective video quality of encoded stream. Depending on content, it may or may not decrease objective video quality. Only one control parameter is used - quality factor, specified by mfxInfoMFX::ICQQuality.

enumerator MFX_RATECONTROL_VCM = 10

Use the Video Conferencing Mode algorithm. This algorithm is similar to VBR and uses the same set of parameters: mfxInfoMFX::InitialDelayInKB, TargetKbps, and MaxKbps. It is tuned for the IPPP GOP pattern and streams with strong temporal correlation between frames. Under these conditions it produces better objective and subjective video quality than the other bitrate control algorithms. It does not support interlaced content or B-frames, and the produced stream is not HRD compliant.

enumerator MFX_RATECONTROL_LA_ICQ = 11

Use intelligent constant quality algorithm with look ahead. Quality factor is specified by mfxInfoMFX::ICQQuality. To control LA depth the application can use mfxExtCodingOption2::LookAheadDepth parameter.

This method is not HRD compliant.

enumerator MFX_RATECONTROL_LA_HRD = 13

Use the HRD compliant look ahead rate control algorithm.

MFX_RATECONTROL_LA_EXT has been removed.

enumerator MFX_RATECONTROL_QVBR = 14

Use the variable bitrate control algorithm with constant quality. This algorithm tries to achieve the target subjective quality with the minimum number of bits while satisfying the bitrate constraint and maintaining HRD compliance. It uses the same set of parameters as VBR, plus the quality factor specified by mfxExtCodingOption3::QVBRQuality.

TrellisControl

The TrellisControl enumerator is used to control trellis quantization in AVC encoder. The application can turn it on or off for any combination of I, P and B frames by combining different enumerator values. For example, MFX_TRELLIS_I | MFX_TRELLIS_B turns it on for I and B frames.

enumerator MFX_TRELLIS_UNKNOWN = 0

Default value, it is up to the SDK encoder to turn trellis quantization on or off.

enumerator MFX_TRELLIS_OFF = 0x01

Turn trellis quantization off for all frame types.

enumerator MFX_TRELLIS_I = 0x02

Turn trellis quantization on for I frames.

enumerator MFX_TRELLIS_P = 0x04

Turn trellis quantization on for P frames.

enumerator MFX_TRELLIS_B = 0x08

Turn trellis quantization on for B frames.
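
For instance, the MFX_TRELLIS_I | MFX_TRELLIS_B combination mentioned above could be requested through mfxExtCodingOption2 roughly as follows (a sketch; the buffer is attached to mfxVideoParam as described in the ExtendedBufferID section).

mfxExtCodingOption2 co2 = {};
co2.Header.BufferId = MFX_EXTBUFF_CODING_OPTION2;
co2.Header.BufferSz = sizeof(co2);
co2.Trellis = MFX_TRELLIS_I | MFX_TRELLIS_B;     // trellis quantization for I and B frames only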

BRefControl

The BRefControl enumerator is used to control usage of B frames as reference in AVC encoder.

enumerator MFX_B_REF_UNKNOWN = 0

Default value, it is up to the SDK encoder to use B frames as reference.

enumerator MFX_B_REF_OFF = 1

Do not use B frames as reference.

enumerator MFX_B_REF_PYRAMID = 2

Arrange B frames in so-called “B pyramid” reference structure.

LookAheadDownSampling

The LookAheadDownSampling enumerator is used to control down sampling in look ahead bitrate control mode in AVC encoder.

enumerator MFX_LOOKAHEAD_DS_UNKNOWN = 0

Default value, it is up to the SDK encoder what down sampling value to use.

enumerator MFX_LOOKAHEAD_DS_OFF = 1

Do not use down sampling, perform estimation on original size frames. This is the slowest setting that produces the best quality.

enumerator MFX_LOOKAHEAD_DS_2x = 2

Down sample frames two times before estimation.

enumerator MFX_LOOKAHEAD_DS_4x = 3

Down sample frames four times before estimation. This option may significantly degrade quality.

BPSEIControl

The BPSEIControl enumerator is used to control insertion of buffering period SEI in the encoded bitstream.

enumerator MFX_BPSEI_DEFAULT = 0x00

The encoder decides when to insert the BP SEI.

enumerator MFX_BPSEI_IFRAME = 0x01

The BP SEI should be inserted with every I-frame.

SkipFrame

The SkipFrame enumerator is used to define usage of mfxEncodeCtrl::SkipFrame parameter.

enumerator MFX_SKIPFRAME_NO_SKIP = 0

Frame skipping is disabled, mfxEncodeCtrl::SkipFrame is ignored.

enumerator MFX_SKIPFRAME_INSERT_DUMMY = 1

Frame skipping is allowed. When mfxEncodeCtrl::SkipFrame is set, the encoder inserts into the bitstream a frame in which all macroblocks are encoded as skipped. Only non-reference P- and B-frames can be skipped. If GopRefDist = 1 and mfxEncodeCtrl::SkipFrame is set for a reference P-frame, it will be encoded as non-reference.

enumerator MFX_SKIPFRAME_INSERT_NOTHING = 2

Similar to MFX_SKIPFRAME_INSERT_DUMMY, but when mfxEncodeCtrl::SkipFrame is set the encoder inserts nothing into the bitstream.

enumerator MFX_SKIPFRAME_BRC_ONLY = 3

mfxEncodeCtrl::SkipFrame indicates the number of missed frames before the current frame. This affects only the BRC; the current frame is encoded as usual.
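
A per-frame skipping sketch, assuming the MFX_SKIPFRAME_INSERT_DUMMY mode was selected at initialization through mfxExtCodingOption2::SkipFrame, and that session, surface, bs, and syncp already exist:

mfxEncodeCtrl ctrl = {};
ctrl.SkipFrame = 1;   // request that this particular frame be encoded as skipped

mfxStatus sts = MFXVideoENCODE_EncodeFrameAsync(session, &ctrl, surface, &bs, &syncp);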

IntraRefreshTypes

The IntraRefreshTypes enumerator itemizes types of intra refresh.

enumerator MFX_REFRESH_NO = 0

Encode without refresh.

enumerator MFX_REFRESH_VERTICAL = 1

Vertical refresh, by column of MBs.

enumerator MFX_REFRESH_HORIZONTAL = 2

Horizontal refresh, by rows of MBs.

enumerator MFX_REFRESH_SLICE = 3

Horizontal refresh by slices without overlapping.

WeightedPred

The WeightedPred enumerator itemizes weighted prediction modes.

enumerator MFX_WEIGHTED_PRED_UNKNOWN = 0

Allow encoder to decide.

enumerator MFX_WEIGHTED_PRED_DEFAULT = 1

Use default weighted prediction.

enumerator MFX_WEIGHTED_PRED_EXPLICIT = 2

Use explicit weighted prediction.

enumerator MFX_WEIGHTED_PRED_IMPLICIT = 3

Use implicit weighted prediction (for B-frames only).

PRefType

The PRefType enumerator itemizes models of reference list construction and DPB management when GopRefDist=1.

enumerator MFX_P_REF_DEFAULT = 0

Allow encoder to decide.

enumerator MFX_P_REF_SIMPLE = 1

Regular sliding window used for DPB removal process.

enumerator MFX_P_REF_PYRAMID = 2

Let N be the maximum reference list size. The encoder treats every Nth frame as a "strong" reference and the others as "weak" references. A "weak" reference is used only for prediction of the next frame and is removed from the DPB immediately afterwards. "Strong" references are removed from the DPB by the sliding window.

ScenarioInfo

The ScenarioInfo enumerator itemizes scenarios for the encoding session.

enumerator MFX_SCENARIO_UNKNOWN = 0
enumerator MFX_SCENARIO_DISPLAY_REMOTING = 1
enumerator MFX_SCENARIO_VIDEO_CONFERENCE = 2
enumerator MFX_SCENARIO_ARCHIVE = 3
enumerator MFX_SCENARIO_LIVE_STREAMING = 4
enumerator MFX_SCENARIO_CAMERA_CAPTURE = 5
enumerator MFX_SCENARIO_VIDEO_SURVEILLANCE = 6
enumerator MFX_SCENARIO_GAME_STREAMING = 7
enumerator MFX_SCENARIO_REMOTE_GAMING = 8

ContentInfo

The ContentInfo enumerator itemizes content types for the encoding session.

enumerator MFX_CONTENT_UNKNOWN = 0
enumerator MFX_CONTENT_FULL_SCREEN_VIDEO = 1
enumerator MFX_CONTENT_NON_VIDEO_SCREEN = 2

IntraPredBlockSize/InterPredBlockSize

IntraPredBlockSize and InterPredBlockSize specify the minimum block size for intra- and inter-prediction, respectively.

enumerator MFX_BLOCKSIZE_UNKNOWN = 0

Unspecified.

enumerator MFX_BLOCKSIZE_MIN_16X16 = 1

16x16

enumerator MFX_BLOCKSIZE_MIN_8X8 = 2

16x16, 8x8

enumerator MFX_BLOCKSIZE_MIN_4X4 = 3

16x16, 8x8, 4x4

MVPrecision

The MVPrecision enumerator specifies the motion estimation precision

enumerator MFX_MVPRECISION_UNKNOWN = 0
enumerator MFX_MVPRECISION_INTEGER = (1 << 0)
enumerator MFX_MVPRECISION_HALFPEL = (1 << 1)
enumerator MFX_MVPRECISION_QUARTERPEL = (1 << 2)

CodingOptionValue

The CodingOptionValue enumerator defines a three-state coding option setting.

enumerator MFX_CODINGOPTION_UNKNOWN = 0

Unspecified.

enumerator MFX_CODINGOPTION_ON = 0x10

Coding option set.

enumerator MFX_CODINGOPTION_OFF = 0x20

Coding option not set.

enumerator MFX_CODINGOPTION_ADAPTIVE = 0x30

Reserved

BitstreamDataFlag

The BitstreamDataFlag enumerator uses bit-ORed values to itemize additional information about the bitstream buffer.

enumerator MFX_BITSTREAM_COMPLETE_FRAME = 0x0001

The bitstream buffer contains a complete frame or complementary field pair of data for the bitstream. For decoding, this means that the decoder can proceed with this buffer without waiting for the start of the next frame, which effectively reduces decoding latency. If this flag is set but the bitstream buffer contains an incomplete frame or field pair, the decoder will produce corrupted output.

enumerator MFX_BITSTREAM_EOS = 0x0002

The bitstream buffer contains the end of the stream. For decoding, this means that the application does not have any additional bitstream data to send to decoder.

ExtendedBufferID

The ExtendedBufferID enumerator itemizes and defines identifiers (BufferId) for extended buffers or video processing algorithm identifiers.
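
All of the buffers listed below are attached in the same way: the application fills the buffer's Header with the matching BufferId and the structure size, then passes an array of mfxExtBuffer pointers through the ExtParam and NumExtParam fields of the owning structure. A minimal sketch, using the denoise buffer as an arbitrary example (the factor value is illustrative):

mfxExtVPPDenoise denoise = {};
denoise.Header.BufferId = MFX_EXTBUFF_VPP_DENOISE;
denoise.Header.BufferSz = sizeof(denoise);
denoise.DenoiseFactor   = 50;                 // denoise level hint in the 0..100 range

mfxExtBuffer* ext[] = { &denoise.Header };

mfxVideoParam par = {};
par.ExtParam    = ext;                        // array of pointers to extended buffers
par.NumExtParam = 1;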

enumerator MFX_EXTBUFF_THREADS_PARAM = MFX_MAKEFOURCC('T', 'H', 'D', 'P')

mfxExtThreadsParam buffer ID

enumerator MFX_EXTBUFF_CODING_OPTION = MFX_MAKEFOURCC('C', 'D', 'O', 'P')

This extended buffer defines additional encoding controls. See the mfxExtCodingOption structure for details. The application can attach this buffer to the structure for encoding initialization.

enumerator MFX_EXTBUFF_CODING_OPTION_SPSPPS = MFX_MAKEFOURCC('C', 'O', 'S', 'P')

This extended buffer defines sequence header and picture header for encoders and decoders. See the mfxExtCodingOptionSPSPPS structure for details. The application can attach this buffer to the mfxVideoParam structure for encoding initialization, and for obtaining raw headers from the decoders and encoders.

enumerator MFX_EXTBUFF_VPP_DONOTUSE = MFX_MAKEFOURCC('N', 'U', 'S', 'E')

This extended buffer defines a list of VPP algorithms that applications should not use. See the mfxExtVPPDoNotUse structure for details. The application can attach this buffer to the mfxVideoParam structure for video processing initialization.

enumerator MFX_EXTBUFF_VPP_AUXDATA = MFX_MAKEFOURCC('A', 'U', 'X', 'D')

This extended buffer defines auxiliary information at the VPP output. See the mfxExtVppAuxData structure for details. The application can attach this buffer to the mfxEncodeCtrl structure for per-frame encoding control.

enumerator MFX_EXTBUFF_VPP_DENOISE = MFX_MAKEFOURCC('D', 'N', 'I', 'S')

The extended buffer defines control parameters for the VPP denoise filter algorithm. See the mfxExtVPPDenoise structure for details. The application can attach this buffer to the mfxVideoParam structure for video processing initialization.

enumerator MFX_EXTBUFF_VPP_SCENE_ANALYSIS = MFX_MAKEFOURCC('S', 'C', 'L', 'Y')
enumerator MFX_EXTBUFF_VPP_PROCAMP = MFX_MAKEFOURCC('P', 'A', 'M', 'P')

The extended buffer defines control parameters for the VPP ProcAmp filter algorithm. See the mfxExtVPPProcAmp structure for details. The application can attach this buffer to the mfxVideoParam structure for video processing initialization or to the mfxFrameData structure in the mfxFrameSurface1 structure of output surface for per-frame processing configuration.

enumerator MFX_EXTBUFF_VPP_DETAIL = MFX_MAKEFOURCC('D', 'E', 'T', ' ')

The extended buffer defines control parameters for the VPP detail filter algorithm. See the mfxExtVPPDetail structure for details. The application can attach this buffer to the structure for video processing initialization.

enumerator MFX_EXTBUFF_VIDEO_SIGNAL_INFO = MFX_MAKEFOURCC('V', 'S', 'I', 'N')

This extended buffer defines video signal type. See the mfxExtVideoSignalInfo structure for details. The application can attach this buffer to the mfxVideoParam structure for encoding initialization, and for retrieving such information from the decoders.

enumerator MFX_EXTBUFF_VPP_DOUSE = MFX_MAKEFOURCC('D', 'U', 'S', 'E')

This extended buffer defines a list of VPP algorithms that applications should use. See the mfxExtVPPDoUse structure for details. The application can attach this buffer to the structure for video processing initialization.

enumerator MFX_EXTBUFF_AVC_REFLIST_CTRL = MFX_MAKEFOURCC('R', 'L', 'S', 'T')

This extended buffer defines additional encoding controls for reference list. See the mfxExtAVCRefListCtrl structure for details. The application can attach this buffer to the mfxVideoParam structure for encoding & decoding initialization, or the mfxEncodeCtrl structure for per-frame encoding configuration.

enumerator MFX_EXTBUFF_VPP_FRAME_RATE_CONVERSION = MFX_MAKEFOURCC('F', 'R', 'C', ' ')

This extended buffer defines control parameters for the VPP frame rate conversion algorithm. See the mfxExtVPPFrameRateConversion structure for details. The application can attach this buffer to the mfxVideoParam structure for video processing initialization.

enumerator MFX_EXTBUFF_PICTURE_TIMING_SEI = MFX_MAKEFOURCC('P', 'T', 'S', 'E')

This extended buffer configures the H.264 picture timing SEI message. See the mfxExtPictureTimingSEI structure for details. The application can attach this buffer to the mfxVideoParam structure for encoding initialization, or the mfxEncodeCtrl structure for per-frame encoding configuration.

enumerator MFX_EXTBUFF_AVC_TEMPORAL_LAYERS = MFX_MAKEFOURCC('A', 'T', 'M', 'L')

This extended buffer configures the structure of temporal layers inside the encoded H.264 bitstream. See the mfxExtAvcTemporalLayers structure for details. The application can attach this buffer to the mfxVideoParam structure for encoding initialization.

enumerator MFX_EXTBUFF_CODING_OPTION2 = MFX_MAKEFOURCC('C', 'D', 'O', '2')

This extended buffer defines additional encoding controls. See the mfxExtCodingOption2 structure for details. The application can attach this buffer to the structure for encoding initialization.

enumerator MFX_EXTBUFF_VPP_IMAGE_STABILIZATION = MFX_MAKEFOURCC('I', 'S', 'T', 'B')

This extended buffer defines control parameters for the VPP image stabilization filter algorithm. See the mfxExtVPPImageStab structure for details. The application can attach this buffer to the mfxVideoParam structure for video processing initialization.

enumerator MFX_EXTBUFF_ENCODER_CAPABILITY = MFX_MAKEFOURCC('E', 'N', 'C', 'P')

This extended buffer is used to retrieve SDK encoder capability. See the mfxExtEncoderCapability structure for details. The application can attach this buffer to the mfxVideoParam structure before calling MFXVideoENCODE_Query function.

enumerator MFX_EXTBUFF_ENCODER_RESET_OPTION = MFX_MAKEFOURCC('E', 'N', 'R', 'O')

This extended buffer is used to control encoder reset behavior and also to query possible encoder reset outcome. See the mfxExtEncoderResetOption structure for details. The application can attach this buffer to the mfxVideoParam structure before calling MFXVideoENCODE_Query or MFXVideoENCODE_Reset functions.

enumerator MFX_EXTBUFF_ENCODED_FRAME_INFO = MFX_MAKEFOURCC('E', 'N', 'F', 'I')

This extended buffer is used by the SDK encoder to report additional information about encoded picture. See the mfxExtAVCEncodedFrameInfo structure for details. The application can attach this buffer to the mfxBitstream structure before calling MFXVideoENCODE_EncodeFrameAsync function.

enumerator MFX_EXTBUFF_VPP_COMPOSITE = MFX_MAKEFOURCC('V', 'C', 'M', 'P')

This extended buffer is used to control composition of several input surfaces into one output. In this mode, the VPP skips any other filters. The VPP returns an error if any mandatory filter is specified, and a "filter skipped" warning for an optional filter. The only supported filters are deinterlacing and interlaced scaling.

enumerator MFX_EXTBUFF_VPP_VIDEO_SIGNAL_INFO = MFX_MAKEFOURCC('V', 'V', 'S', 'I')

This extended buffer is used to control transfer matrix and nominal range of YUV frames. The application should provide it during initialization.

enumerator MFX_EXTBUFF_ENCODER_ROI = MFX_MAKEFOURCC('E', 'R', 'O', 'I')

This extended buffer is used by the application to specify different Region Of Interests during encoding. The application should provide it at initialization or at runtime.

enumerator MFX_EXTBUFF_VPP_DEINTERLACING = MFX_MAKEFOURCC('V', 'P', 'D', 'I')

This extended buffer is used by the application to specify different deinterlacing algorithms.

enumerator MFX_EXTBUFF_AVC_REFLISTS = MFX_MAKEFOURCC('R', 'L', 'T', 'S')

This extended buffer specifies reference lists for the SDK encoder.

enumerator MFX_EXTBUFF_DEC_VIDEO_PROCESSING = MFX_MAKEFOURCC('D', 'E', 'C', 'V')

See the mfxExtDecVideoProcessing structure for details.

enumerator MFX_EXTBUFF_VPP_FIELD_PROCESSING = MFX_MAKEFOURCC('F', 'P', 'R', 'O')

The extended buffer defines control parameters for the VPP field-processing algorithm. See the mfxExtVPPFieldProcessing structure for details. The application can attach this buffer to the mfxVideoParam structure for video processing initialization or to the mfxFrameData structure during runtime.

enumerator MFX_EXTBUFF_CODING_OPTION3 = MFX_MAKEFOURCC('C', 'D', 'O', '3')

This extended buffer defines additional encoding controls. See the mfxExtCodingOption3 structure for details. The application can attach this buffer to the structure for encoding initialization.

enumerator MFX_EXTBUFF_CHROMA_LOC_INFO = MFX_MAKEFOURCC('C', 'L', 'I', 'N')

This extended buffer defines chroma samples location information. See the mfxExtChromaLocInfo structure for details. The application can attach this buffer to the mfxVideoParam structure for encoding initialization.

enumerator MFX_EXTBUFF_MBQP = MFX_MAKEFOURCC('M', 'B', 'Q', 'P')

This extended buffer defines per-macroblock QP. See the mfxExtMBQP structure for details. The application can attach this buffer to the mfxEncodeCtrl structure for per-frame encoding configuration.

enumerator MFX_EXTBUFF_MB_FORCE_INTRA = MFX_MAKEFOURCC('M', 'B', 'F', 'I')

This extended buffer defines per-macroblock force intra flag. See the mfxExtMBForceIntra structure for details. The application can attach this buffer to the mfxEncodeCtrl structure for per-frame encoding configuration.

enumerator MFX_EXTBUFF_HEVC_TILES = MFX_MAKEFOURCC('2', '6', '5', 'T')

This extended buffer defines additional encoding controls for HEVC tiles. See the mfxExtHEVCTiles structure for details. The application can attach this buffer to the mfxVideoParam structure for encoding initialization.

enumerator MFX_EXTBUFF_MB_DISABLE_SKIP_MAP = MFX_MAKEFOURCC('M', 'D', 'S', 'M')

This extended buffer defines macroblock map for current frame which forces specified macroblocks to be non skip. See the mfxExtMBDisableSkipMap structure for details. The application can attach this buffer to the mfxEncodeCtrl structure for per-frame encoding configuration.

enumerator MFX_EXTBUFF_HEVC_PARAM = MFX_MAKEFOURCC('2', '6', '5', 'P')

See the mfxExtHEVCParam structure for details.

enumerator MFX_EXTBUFF_DECODED_FRAME_INFO = MFX_MAKEFOURCC('D', 'E', 'F', 'I')

This extended buffer is used by SDK decoders to report additional information about decoded frame. See the mfxExtDecodedFrameInfo structure for more details.

enumerator MFX_EXTBUFF_TIME_CODE = MFX_MAKEFOURCC('T', 'M', 'C', 'D')

See the mfxExtTimeCode structure for more details.

enumerator MFX_EXTBUFF_HEVC_REGION = MFX_MAKEFOURCC('2', '6', '5', 'R')

This extended buffer specifies the region to encode. The application can attach this buffer to the mfxVideoParam structure during HEVC encoder initialization.

enumerator MFX_EXTBUFF_PRED_WEIGHT_TABLE = MFX_MAKEFOURCC('E', 'P', 'W', 'T')

See the mfxExtPredWeightTable structure for details.

enumerator MFX_EXTBUFF_DIRTY_RECTANGLES = MFX_MAKEFOURCC('D', 'R', 'O', 'I')

See the mfxExtDirtyRect structure for details.

enumerator MFX_EXTBUFF_MOVING_RECTANGLES = MFX_MAKEFOURCC('M', 'R', 'O', 'I')

See the mfxExtMoveRect structure for details.

enumerator MFX_EXTBUFF_CODING_OPTION_VPS = MFX_MAKEFOURCC('C', 'O', 'V', 'P')

See the mfxExtCodingOptionVPS structure for details.

enumerator MFX_EXTBUFF_VPP_ROTATION = MFX_MAKEFOURCC('R', 'O', 'T', ' ')

See the mfxExtVPPRotation structure for details.

enumerator MFX_EXTBUFF_ENCODED_SLICES_INFO = MFX_MAKEFOURCC('E', 'N', 'S', 'I')

See the mfxExtEncodedSlicesInfo structure for details.

enumerator MFX_EXTBUFF_VPP_SCALING = MFX_MAKEFOURCC('V', 'S', 'C', 'L')

See the mfxExtVPPScaling structure for details.

enumerator MFX_EXTBUFF_HEVC_REFLIST_CTRL = MFX_EXTBUFF_AVC_REFLIST_CTRL

This extended buffer defines additional encoding controls for reference list. See the mfxExtAVCRefListCtrl structure for details. The application can attach this buffer to the mfxVideoParam structure for encoding & decoding initialization, or the mfxEncodeCtrl structure for per-frame encoding configuration.

enumerator MFX_EXTBUFF_HEVC_REFLISTS = MFX_EXTBUFF_AVC_REFLISTS

This extended buffer specifies reference lists for the SDK encoder.

enumerator MFX_EXTBUFF_HEVC_TEMPORAL_LAYERS = MFX_EXTBUFF_AVC_TEMPORAL_LAYERS

This extended buffer configures the structure of temporal layers inside the encoded H.264 bitstream. See the mfxExtAvcTemporalLayers structure for details. The application can attach this buffer to the mfxVideoParam structure for encoding initialization.

enumerator MFX_EXTBUFF_VPP_MIRRORING = MFX_MAKEFOURCC('M', 'I', 'R', 'R')

See the mfxExtVPPMirroring structure for details.

enumerator MFX_EXTBUFF_MV_OVER_PIC_BOUNDARIES = MFX_MAKEFOURCC('M', 'V', 'P', 'B')

See the mfxExtMVOverPicBoundaries structure for details.

enumerator MFX_EXTBUFF_VPP_COLORFILL = MFX_MAKEFOURCC('V', 'C', 'L', 'F')

See the mfxExtVPPColorFill structure for details.

enumerator MFX_EXTBUFF_DECODE_ERROR_REPORT = MFX_MAKEFOURCC('D', 'E', 'R', 'R')

This extended buffer is used by SDK decoders to report error information before frames get decoded. See the mfxExtDecodeErrorReport structure for more details.

enumerator MFX_EXTBUFF_VPP_COLOR_CONVERSION = MFX_MAKEFOURCC('V', 'C', 'S', 'C')

See the mfxExtColorConversion structure for details.

enumerator MFX_EXTBUFF_CONTENT_LIGHT_LEVEL_INFO = MFX_MAKEFOURCC('L', 'L', 'I', 'S')

This extended buffer configures HDR SEI message. See the mfxExtContentLightLevelInfo structure for details.

enumerator MFX_EXTBUFF_MASTERING_DISPLAY_COLOUR_VOLUME = MFX_MAKEFOURCC('D', 'C', 'V', 'S')

This extended buffer configures HDR SEI message. See the mfxExtMasteringDisplayColourVolume structure for details.

enumerator MFX_EXTBUFF_MULTI_FRAME_PARAM = MFX_MAKEFOURCC('M', 'F', 'R', 'P')

This extended buffer allows specifying multi-frame submission parameters.

enumerator MFX_EXTBUFF_MULTI_FRAME_CONTROL = MFX_MAKEFOURCC('M', 'F', 'R', 'C')

This extended buffer allows managing multi-frame submission at runtime.

enumerator MFX_EXTBUFF_ENCODED_UNITS_INFO = MFX_MAKEFOURCC('E', 'N', 'U', 'I')

See the mfxExtEncodedUnitsInfo structure for details.

enumerator MFX_EXTBUFF_VPP_MCTF = MFX_MAKEFOURCC('M', 'C', 'T', 'F')

This video processing algorithm identifier is used to enable MCTF via mfxExtVPPDoUse and together with the mfxExtVppMctf structure.

enumerator MFX_EXTBUFF_VP9_SEGMENTATION = MFX_MAKEFOURCC('9', 'S', 'E', 'G')

Extends mfxVideoParam structure with VP9 segmentation parameters. See the mfxExtVP9Segmentation structure for details.

enumerator MFX_EXTBUFF_VP9_TEMPORAL_LAYERS = MFX_MAKEFOURCC('9', 'T', 'M', 'L')

Extends mfxVideoParam structure with parameters for VP9 temporal scalability. See the mfxExtVP9TemporalLayers structure for details.

enumerator MFX_EXTBUFF_VP9_PARAM = MFX_MAKEFOURCC('9', 'P', 'A', 'R')

Extends mfxVideoParam structure with VP9-specific parameters. See the mfxExtVP9Param structure for details.

enumerator MFX_EXTBUFF_AVC_ROUNDING_OFFSET = MFX_MAKEFOURCC('R', 'N', 'D', 'O')

See the mfxExtAVCRoundingOffset structure for details.

enumerator MFX_EXTBUFF_PARTIAL_BITSTREAM_PARAM = MFX_MAKEFOURCC('P', 'B', 'O', 'P')

See the mfxExtPartialBitstreamParam structure for details.

enumerator MFX_EXTBUFF_BRC = MFX_MAKEFOURCC('E', 'B', 'R', 'C')
enumerator MFX_EXTBUFF_VP8_CODING_OPTION = MFX_MAKEFOURCC('V', 'P', '8', 'E')

This extended buffer describes VP8 encoder configuration parameters. See the mfxExtVP8CodingOption structure for details. The application can attach this buffer to the mfxVideoParam structure for encoding initialization.

enumerator MFX_EXTBUFF_JPEG_QT = MFX_MAKEFOURCC('J', 'P', 'G', 'Q')

This extended buffer defines quantization tables for JPEG encoder.

enumerator MFX_EXTBUFF_JPEG_HUFFMAN = MFX_MAKEFOURCC('J', 'P', 'G', 'H')

This extended buffer defines Huffman tables for JPEG encoder.

enumerator MFX_EXTBUFF_ENCODER_IPCM_AREA = MFX_MAKEFOURCC('P', 'C', 'M', 'R')

See the mfxExtEncoderIPCMArea structure for details.

enumerator MFX_EXTBUFF_INSERT_HEADERS = MFX_MAKEFOURCC('S', 'P', 'R', 'E')

See the mfxExtInsertHeaders structure for details.

enumerator MFX_EXTBUFF_MVC_SEQ_DESC = MFX_MAKEFOURCC('M', 'V', 'C', 'D')

This extended buffer describes the MVC stream information of view dependencies, view identifiers, and operation points. See chapter H.7.3.2.1.4 of the ITU-T H.264 specification for details.

enumerator MFX_EXTBUFF_MVC_TARGET_VIEWS = MFX_MAKEFOURCC('M', 'V', 'C', 'T')

This extended buffer defines target views at the decoder output.

enumerator MFX_EXTBUFF_ENCTOOLS_CONFIG = MFX_MAKEFOURCC('E', 'E', 'T', 'C')

See the mfxExtEncToolsConfig structure for details.

enumerator MFX_EXTBUFF_CENC_PARAM = MFX_MAKEFOURCC('C', 'E', 'N', 'P')

This structure is used to pass decryption status report index for Common Encryption usage model. See the mfxExtCencParam structure for more details.

PayloadCtrlFlags

The PayloadCtrlFlags enumerator itemizes additional payload properties.

enumerator MFX_PAYLOAD_CTRL_SUFFIX = 0x00000001

Insert this payload into HEVC Suffix SEI NAL-unit.

ExtMemFrameType

The ExtMemFrameType enumerator specifies the memory type of frame. It is a bit-ORed value of the following. For information on working with video memory surfaces, see the section Working with hardware acceleration.

enumerator MFX_MEMTYPE_PERSISTENT_MEMORY = 0x0002

Memory page for persistent use.

enumerator MFX_MEMTYPE_DXVA2_DECODER_TARGET = 0x0010

Frames are in video memory and belong to video decoder render targets.

enumerator MFX_MEMTYPE_DXVA2_PROCESSOR_TARGET = 0x0020

Frames are in video memory and belong to video processor render targets.

enumerator MFX_MEMTYPE_VIDEO_MEMORY_DECODER_TARGET = MFX_MEMTYPE_DXVA2_DECODER_TARGET

Frames are in video memory and belong to video decoder render targets.

enumerator MFX_MEMTYPE_VIDEO_MEMORY_PROCESSOR_TARGET = MFX_MEMTYPE_DXVA2_PROCESSOR_TARGET

Frames are in video memory and belong to video processor render targets.

enumerator MFX_MEMTYPE_SYSTEM_MEMORY = 0x0040

The frames are in system memory.

enumerator MFX_MEMTYPE_RESERVED1 = 0x0080
enumerator MFX_MEMTYPE_FROM_ENCODE = 0x0100

Allocation request comes from an ENCODE function

enumerator MFX_MEMTYPE_FROM_DECODE = 0x0200

Allocation request comes from a DECODE function

enumerator MFX_MEMTYPE_FROM_VPPIN = 0x0400

Allocation request comes from a VPP function for input frame allocation

enumerator MFX_MEMTYPE_FROM_VPPOUT = 0x0800

Allocation request comes from a VPP function for output frame allocation

enumerator MFX_MEMTYPE_FROM_ENC = 0x2000

Allocation request comes from an ENC function

enumerator MFX_MEMTYPE_INTERNAL_FRAME = 0x0001

Allocation request for internal frames

enumerator MFX_MEMTYPE_EXTERNAL_FRAME = 0x0002

Allocation request for I/O frames

enumerator MFX_MEMTYPE_EXPORT_FRAME = 0x0008

Application requests frame handle export to some associated object. For Linux frame handle can be considered to be exported to DRM Prime FD, DRM FLink or DRM FrameBuffer Handle. Specifics of export types and export procedure depends on external frame allocator implementation

enumerator MFX_MEMTYPE_SHARED_RESOURCE = MFX_MEMTYPE_EXPORT_FRAME

For DX11 allocation use shared resource bind flag.

enumerator MFX_MEMTYPE_VIDEO_MEMORY_ENCODER_TARGET = 0x1000

Frames are in video memory and belong to video encoder render targets.

FrameType

The FrameType enumerator itemizes frame types. Use bit-ORed values to specify all that apply.

enumerator MFX_FRAMETYPE_UNKNOWN = 0x0000

Frame type is unspecified.

enumerator MFX_FRAMETYPE_I = 0x0001

This frame or the first field is encoded as an I frame/field.

enumerator MFX_FRAMETYPE_P = 0x0002

This frame or the first field is encoded as a P frame/field.

enumerator MFX_FRAMETYPE_B = 0x0004

This frame or the first field is encoded as a B frame/field.

enumerator MFX_FRAMETYPE_S = 0x0008

This frame or the first field is either an SI- or SP-frame/field.

enumerator MFX_FRAMETYPE_REF = 0x0040

This frame or the first field is encoded as a reference.

enumerator MFX_FRAMETYPE_IDR = 0x0080

This frame or the first field is encoded as an IDR.

enumerator MFX_FRAMETYPE_xI = 0x0100

The second field is encoded as an I-field.

enumerator MFX_FRAMETYPE_xP = 0x0200

The second field is encoded as an P-field.

enumerator MFX_FRAMETYPE_xB = 0x0400

The second field is encoded as a B-field.

enumerator MFX_FRAMETYPE_xS = 0x0800

The second field is an SI- or SP-field.

enumerator MFX_FRAMETYPE_xREF = 0x4000

The second field is encoded as a reference.

enumerator MFX_FRAMETYPE_xIDR = 0x8000

The second field is encoded as an IDR.

MfxNalUnitType

The MfxNalUnitType enumerator specifies NAL unit types supported by the SDK HEVC encoder.

enumerator MFX_HEVC_NALU_TYPE_UNKNOWN = 0

The SDK encoder will decide what NAL unit type to use.

enumerator MFX_HEVC_NALU_TYPE_TRAIL_N = (0 + 1)

See Table 7-1 of the ITU-T H.265 specification for the definition of this type.

enumerator MFX_HEVC_NALU_TYPE_TRAIL_R = (1 + 1)

See Table 7-1 of the ITU-T H.265 specification for the definition of this type.

enumerator MFX_HEVC_NALU_TYPE_RADL_N = (6 + 1)

See Table 7-1 of the ITU-T H.265 specification for the definition of this type.

enumerator MFX_HEVC_NALU_TYPE_RADL_R = (7 + 1)

See Table 7-1 of the ITU-T H.265 specification for the definition of this type.

enumerator MFX_HEVC_NALU_TYPE_RASL_N = (8 + 1)

See Table 7-1 of the ITU-T H.265 specification for the definition of this type.

enumerator MFX_HEVC_NALU_TYPE_RASL_R = (9 + 1)

See Table 7-1 of the ITU-T H.265 specification for the definition of this type.

enumerator MFX_HEVC_NALU_TYPE_IDR_W_RADL = (19 + 1)

See Table 7-1 of the ITU-T H.265 specification for the definition of this type.

enumerator MFX_HEVC_NALU_TYPE_IDR_N_LP = (20 + 1)

See Table 7-1 of the ITU-T H.265 specification for the definition of this type.

enumerator MFX_HEVC_NALU_TYPE_CRA_NUT = (21 + 1)

See Table 7-1 of the ITU-T H.265 specification for the definition of this type.

mfxHandleType

enum mfxHandleType

The mfxHandleType enumerator itemizes system handle types that SDK implementations might use.

Values:

enumerator MFX_HANDLE_DIRECT3D_DEVICE_MANAGER9 = 1

Pointer to the IDirect3DDeviceManager9 interface. See Working with Microsoft* DirectX* Applications for more details on how to use this handle.

enumerator MFX_HANDLE_D3D9_DEVICE_MANAGER = MFX_HANDLE_DIRECT3D_DEVICE_MANAGER9

Pointer to the IDirect3DDeviceManager9 interface. See Working with Microsoft* DirectX* Applications for more details on how to use this handle.

enumerator MFX_HANDLE_RESERVED1 = 2
enumerator MFX_HANDLE_D3D11_DEVICE = 3

Pointer to the ID3D11Device interface. See Working with Microsoft* DirectX* Applications for more details on how to use this handle.

enumerator MFX_HANDLE_VA_DISPLAY = 4

Pointer to VADisplay interface. See Working with VA API Applications for more details on how to use this handle.

enumerator MFX_HANDLE_RESERVED3 = 5
enumerator MFX_HANDLE_VA_CONFIG_ID = 6

Pointer to VAConfigID interface. It represents external VA config for Common Encryption usage model.

enumerator MFX_HANDLE_VA_CONTEXT_ID = 7

Pointer to VAContextID interface. It represents external VA context for Common Encryption usage model.

enumerator MFX_HANDLE_CM_DEVICE = 8
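
On Linux, for example, the application typically passes its VADisplay to the session before any allocation or initialization takes place. A sketch, assuming display is a VADisplay obtained from libva and session is an existing mfxSession:

mfxStatus sts = MFXVideoCORE_SetHandle(session, MFX_HANDLE_VA_DISPLAY, (mfxHDL)display);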

mfxSkipMode

enum mfxSkipMode

The mfxSkipMode enumerator describes the decoder skip-mode options.

Values:

enumerator MFX_SKIPMODE_NOSKIP = 0

Do not skip any frames.

enumerator MFX_SKIPMODE_MORE = 1

Skip more frames.

enumerator MFX_SKIPMODE_LESS = 2

Skip fewer frames.
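
The selected mode is applied at runtime through MFXVideoDECODE_SetSkipMode, for example:

mfxStatus sts = MFXVideoDECODE_SetSkipMode(session, MFX_SKIPMODE_MORE);  // drop more frames to reduce load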

FrcAlgm

The FrcAlgm enumerator itemizes frame rate conversion algorithms. See description of mfxExtVPPFrameRateConversion structure for more details.

enumerator MFX_FRCALGM_PRESERVE_TIMESTAMP = 0x0001

Frame dropping/repetition based frame rate conversion algorithm with preserved original time stamps. Any inserted frames will carry MFX_TIMESTAMP_UNKNOWN.

enumerator MFX_FRCALGM_DISTRIBUTED_TIMESTAMP = 0x0002

Frame dropping/repetition based frame rate conversion algorithm with distributed time stamps. The algorithm distributes output time stamps evenly according to the output frame rate.

enumerator MFX_FRCALGM_FRAME_INTERPOLATION = 0x0004

Frame rate conversion algorithm based on frame interpolation. This flag may be combined with MFX_FRCALGM_PRESERVE_TIMESTAMP or MFX_FRCALGM_DISTRIBUTED_TIMESTAMP flags.

ImageStabMode

The ImageStabMode enumerator itemizes image stabilization modes. See description of mfxExtVPPImageStab structure for more details.

enumerator MFX_IMAGESTAB_MODE_UPSCALE = 0x0001

Upscale mode.

enumerator MFX_IMAGESTAB_MODE_BOXING = 0x0002

Boxing mode.

InsertHDRPayload

The InsertHDRPayload enumerator itemizes HDR payloads insertion rules.

enumerator MFX_PAYLOAD_OFF = 0

Don’t insert payload.

enumerator MFX_PAYLOAD_IDR = 1

Insert payload on IDR frames.

LongTermIdx

The LongTermIdx enumerator specifies the long term index of picture control.

enumerator MFX_LONGTERM_IDX_NO_IDX = 0xFFFF

Long term index of picture is undefined.

TransferMatrix

The TransferMatrix enumerator itemizes color transfer matrices.

enumerator MFX_TRANSFERMATRIX_UNKNOWN = 0

Transfer matrix isn’t specifyed

enumerator MFX_TRANSFERMATRIX_BT709 = 1

Transfer matrix from ITU-R BT.709 standard.

enumerator MFX_TRANSFERMATRIX_BT601 = 2

Transfer matrix from ITU-R BT.601 standard.

NominalRange

The NominalRange enumerator itemizes the nominal range of pixel values.

enumerator MFX_NOMINALRANGE_UNKNOWN = 0

Range isn’t defined.

enumerator MFX_NOMINALRANGE_0_255 = 1

Range is [0,255].

enumerator MFX_NOMINALRANGE_16_235 = 2

Range is [16,235].

ROImode

The ROImode enumerator itemizes QP adjustment mode for ROIs.

enumerator MFX_ROI_MODE_PRIORITY = 0

Priority mode.

enumerator MFX_ROI_MODE_QP_DELTA = 1

QP delta mode.

enumerator MFX_ROI_MODE_QP_VALUE = 2

Absolute QP mode.

DeinterlacingMode

The DeinterlacingMode enumerator itemizes VPP deinterlacing modes.

enumerator MFX_DEINTERLACING_BOB = 1

BOB deinterlacing mode.

enumerator MFX_DEINTERLACING_ADVANCED = 2

Advanced deinterlacing mode.

enumerator MFX_DEINTERLACING_AUTO_DOUBLE = 3

Auto mode with deinterlacing double framerate output.

enumerator MFX_DEINTERLACING_AUTO_SINGLE = 4

Auto mode with deinterlacing single framerate output.

enumerator MFX_DEINTERLACING_FULL_FR_OUT = 5

Deinterlace only mode with full framerate output.

enumerator MFX_DEINTERLACING_HALF_FR_OUT = 6

Deinterlace only Mode with half framerate output.

enumerator MFX_DEINTERLACING_24FPS_OUT = 7

24 fps fixed output mode.

enumerator MFX_DEINTERLACING_FIXED_TELECINE_PATTERN = 8

Fixed telecine pattern removal mode.

enumerator MFX_DEINTERLACING_30FPS_OUT = 9

30 fps fixed output mode.

enumerator MFX_DEINTERLACING_DETECT_INTERLACE = 10

Only interlace detection.

enumerator MFX_DEINTERLACING_ADVANCED_NOREF = 11

Advanced deinterlacing mode without using of reference frames.

enumerator MFX_DEINTERLACING_ADVANCED_SCD = 12

Advanced deinterlacing mode with scene change detection.

enumerator MFX_DEINTERLACING_FIELD_WEAVING = 13

Field weaving.
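
A possible way to request a specific mode is sketched below; it assumes the mfxExtVPPDeinterlacing buffer with a Mode field, attached to the mfxVideoParam used for VPP initialization (here named vppPar).

mfxExtVPPDeinterlacing di = {};
di.Header.BufferId = MFX_EXTBUFF_VPP_DEINTERLACING;
di.Header.BufferSz = sizeof(di);
di.Mode = MFX_DEINTERLACING_ADVANCED;   // advanced (motion adaptive) deinterlacing

mfxExtBuffer* ext[] = { &di.Header };

mfxVideoParam vppPar = {};
vppPar.ExtParam    = ext;
vppPar.NumExtParam = 1;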

TelecinePattern

The TelecinePattern enumerator itemizes telecine patterns.

enumerator MFX_TELECINE_PATTERN_32 = 0

3:2 telecine.

enumerator MFX_TELECINE_PATTERN_2332 = 1

2:3:3:2 telecine.

enumerator MFX_TELECINE_PATTERN_FRAME_REPEAT = 2

One frame repeat telecine.

enumerator MFX_TELECINE_PATTERN_41 = 3

4:1 telecine.

enumerator MFX_TELECINE_POSITION_PROVIDED = 4

User must provide position inside a sequence of 5 frames where the artifacts start.

VPPFieldProcessingMode

The VPPFieldProcessingMode enumerator is used to control VPP field processing algorithm.

enumerator MFX_VPP_COPY_FRAME = 0x01

Copy the whole frame.

enumerator MFX_VPP_COPY_FIELD = 0x02

Copy only one field.

enumerator MFX_VPP_SWAP_FIELDS = 0x03

Swap top and bottom fields.

PicType

The PicType enumerator itemizes picture type.

enumerator MFX_PICTYPE_UNKNOWN = 0x00

Picture type is unknown.

enumerator MFX_PICTYPE_FRAME = 0x01

Picture is a frame.

enumerator MFX_PICTYPE_TOPFIELD = 0x02

Picture is a top field.

enumerator MFX_PICTYPE_BOTTOMFIELD = 0x04

Picture is a bottom field.

MBQPMode

The MBQPMode enumerator itemizes QP update modes.

enumerator MFX_MBQP_MODE_QP_VALUE = 0

QP array contains QP values.

enumerator MFX_MBQP_MODE_QP_DELTA = 1

QP array contains deltas for QP.

enumerator MFX_MBQP_MODE_QP_ADAPTIVE = 2

QP array contains deltas for QP or absolute QP values.

GeneralConstraintFlags

The GeneralConstraintFlags enumerator uses bit-ORed values to itemize HEVC bitstream indications for specific profiles. Each value indicates for format range extensions profiles.

enumerator MFX_HEVC_CONSTR_REXT_MAX_12BIT = (1 << 0)
enumerator MFX_HEVC_CONSTR_REXT_MAX_10BIT = (1 << 1)
enumerator MFX_HEVC_CONSTR_REXT_MAX_8BIT = (1 << 2)
enumerator MFX_HEVC_CONSTR_REXT_MAX_422CHROMA = (1 << 3)
enumerator MFX_HEVC_CONSTR_REXT_MAX_420CHROMA = (1 << 4)
enumerator MFX_HEVC_CONSTR_REXT_MAX_MONOCHROME = (1 << 5)
enumerator MFX_HEVC_CONSTR_REXT_INTRA = (1 << 6)
enumerator MFX_HEVC_CONSTR_REXT_ONE_PICTURE_ONLY = (1 << 7)
enumerator MFX_HEVC_CONSTR_REXT_LOWER_BIT_RATE = (1 << 8)

SampleAdaptiveOffset

The SampleAdaptiveOffset enumerator uses bit-ORed values to itemize corresponding HEVC encoding feature.

enumerator MFX_SAO_UNKNOWN = 0x00

Use default value for platform/TargetUsage.

enumerator MFX_SAO_DISABLE = 0x01

Disable SAO. If set during Init, leads to SPS sample_adaptive_offset_enabled_flag = 0. If set during Runtime, leads to slice_sao_luma_flag = 0 and slice_sao_chroma_flag = 0 for the current frame.

enumerator MFX_SAO_ENABLE_LUMA = 0x02

Enable SAO for luma (slice_sao_luma_flag = 1).

enumerator MFX_SAO_ENABLE_CHROMA = 0x04

Enable SAO for chroma (slice_sao_chroma_flag = 1).

ErrorTypes

The ErrorTypes enumerator uses bit-ORed values to itemize bitstream error types.

enumerator MFX_ERROR_PPS = (1 << 0)

Invalid/corrupted PPS.

enumerator MFX_ERROR_SPS = (1 << 1)

Invalid/corrupted SPS.

enumerator MFX_ERROR_SLICEHEADER = (1 << 2)

Invalid/corrupted slice header.

enumerator MFX_ERROR_SLICEDATA = (1 << 3)

Invalid/corrupted slice data.

enumerator MFX_ERROR_FRAME_GAP = (1 << 4)

Missed frames.
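
These flags are typically read back from mfxExtDecodeErrorReport::ErrorTypes after a decode call; a minimal check might look like the following sketch, where errorReport is the mfxExtDecodeErrorReport buffer attached to the input mfxBitstream.

if (errorReport.ErrorTypes & MFX_ERROR_SPS) {
    // the SPS was invalid or corrupted
}
if (errorReport.ErrorTypes & (MFX_ERROR_SLICEHEADER | MFX_ERROR_SLICEDATA)) {
    // at least one slice was damaged
}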

HEVCRegionType

The HEVCRegionType enumerator itemizes type of HEVC region.

enumerator MFX_HEVC_REGION_SLICE = 0

Slice type.

HEVCRegionEncoding

The HEVCRegionEncoding enumerator itemizes HEVC region’s encoding.

enumerator MFX_HEVC_REGION_ENCODING_ON = 0
enumerator MFX_HEVC_REGION_ENCODING_OFF = 1

Angle

The Angle enumerator itemizes valid rotation angles.

enumerator MFX_ANGLE_0 = 0

0 degrees.

enumerator MFX_ANGLE_90 = 90

90 degrees.

enumerator MFX_ANGLE_180 = 180

180 degrees.

enumerator MFX_ANGLE_270 = 270

270 degrees.

ScalingMode

The ScalingMode enumerator itemizes variants of scaling filter implementation.

enumerator MFX_SCALING_MODE_DEFAULT = 0

Default scaling mode. SDK selects the most appropriate scaling method.

enumerator MFX_SCALING_MODE_LOWPOWER = 1

Low power scaling mode which is applicable for platform SDK implementations. The exact scaling algorithm is defined by the SDK.

enumerator MFX_SCALING_MODE_QUALITY = 2

The best quality scaling mode

InterpolationMode

The InterpolationMode enumerator specifies type of interpolation method used by VPP scaling filter.

enumerator MFX_INTERPOLATION_DEFAULT = 0

Default interpolation mode for scaling. The SDK selects the most appropriate scaling method.

enumerator MFX_INTERPOLATION_NEAREST_NEIGHBOR = 1

Nearest neighbor interpolation method

enumerator MFX_INTERPOLATION_BILINEAR = 2

Bilinear interpolation method

enumerator MFX_INTERPOLATION_ADVANCED = 3

Advanced interpolation method is defined by each SDK and usually gives best quality

MirroringType

The MirroringType enumerator itemizes mirroring types.

enumerator MFX_MIRRORING_DISABLED = 0
enumerator MFX_MIRRORING_HORIZONTAL = 1
enumerator MFX_MIRRORING_VERTICAL = 2

ChromaSiting

The ChromaSiting enumerator defines chroma location. Use bit-OR’ed values to specify the desired location.

enumerator MFX_CHROMA_SITING_UNKNOWN = 0x0000

Unspecified.

enumerator MFX_CHROMA_SITING_VERTICAL_TOP = 0x0001

Chroma samples are co-sited vertically on the top with the luma samples.

enumerator MFX_CHROMA_SITING_VERTICAL_CENTER = 0x0002

Chroma samples are not co-sited vertically with the luma samples.

enumerator MFX_CHROMA_SITING_VERTICAL_BOTTOM = 0x0004

Chroma samples are co-sited vertically on the bottom with the luma samples.

enumerator MFX_CHROMA_SITING_HORIZONTAL_LEFT = 0x0010

Chroma samples are co-sited horizontally on the left with the luma samples.

enumerator MFX_CHROMA_SITING_HORIZONTAL_CENTER = 0x0020

Chroma samples are not co-sited horizontally with the luma samples.

VP9ReferenceFrame

The VP9ReferenceFrame enumerator itemizes reference frame types used by the mfxVP9SegmentParam::ReferenceFrame parameter.

enumerator MFX_VP9_REF_INTRA = 0

Intra.

enumerator MFX_VP9_REF_LAST = 1

Last.

enumerator MFX_VP9_REF_GOLDEN = 2

Golden.

enumerator MFX_VP9_REF_ALTREF = 3

Alternative reference.

SegmentIdBlockSize

The SegmentIdBlockSize enumerator indicates the block size represented by each segment_id in segmentation map. These values are used with the mfxExtVP9Segmentation::SegmentIdBlockSize parameter.

enumerator MFX_VP9_SEGMENT_ID_BLOCK_SIZE_UNKNOWN = 0

Unspecified block size

enumerator MFX_VP9_SEGMENT_ID_BLOCK_SIZE_8x8 = 8

8x8 block size.

enumerator MFX_VP9_SEGMENT_ID_BLOCK_SIZE_16x16 = 16

16x16 block size.

enumerator MFX_VP9_SEGMENT_ID_BLOCK_SIZE_32x32 = 32

32x32 block size.

enumerator MFX_VP9_SEGMENT_ID_BLOCK_SIZE_64x64 = 64

64x64 block size.

SegmentFeature

The SegmentFeature enumerator indicates features enabled for the segment. These values are used with the mfxVP9SegmentParam::FeatureEnabled parameter.

enumerator MFX_VP9_SEGMENT_FEATURE_QINDEX = 0x0001

Quantization index delta.

enumerator MFX_VP9_SEGMENT_FEATURE_LOOP_FILTER = 0x0002

Loop filter level delta.

enumerator MFX_VP9_SEGMENT_FEATURE_REFERENCE = 0x0004

Reference frame.

enumerator MFX_VP9_SEGMENT_FEATURE_SKIP = 0x0008

Skip.

MCTFTemporalMode

The MCTFTemporalMode enumerator itemizes temporal filtering modes.

enumerator MFX_MCTF_TEMPORAL_MODE_UNKNOWN = 0
enumerator MFX_MCTF_TEMPORAL_MODE_SPATIAL = 1
enumerator MFX_MCTF_TEMPORAL_MODE_1REF = 2
enumerator MFX_MCTF_TEMPORAL_MODE_2REF = 3
enumerator MFX_MCTF_TEMPORAL_MODE_4REF = 4

mfxComponentType

enum mfxComponentType

The mfxComponentType enumerator describes type of workload passed to MFXQueryAdapters.

Values:

enumerator MFX_COMPONENT_ENCODE = 1

Encode workload.

enumerator MFX_COMPONENT_DECODE = 2

Decode workload.

enumerator MFX_COMPONENT_VPP = 3

VPP workload.

PartialBitstreamOutput

The PartialBitstreamOutput enumerator indicates flags of partial bitstream output type.

enumerator MFX_PARTIAL_BITSTREAM_NONE = 0

Don’t use partial output

enumerator MFX_PARTIAL_BITSTREAM_SLICE = 1

Partial bitstream output will be aligned to slice granularity

enumerator MFX_PARTIAL_BITSTREAM_BLOCK = 2

Partial bitstream output will be aligned to user-defined block size granularity

enumerator MFX_PARTIAL_BITSTREAM_ANY = 3

Partial bitstream output returns any coded data available at the end of the SyncOperation timeout.

BRCStatus

The BRCStatus enumerator itemizes instructions returned to the SDK encoder by mfxExtBrc::Update.

enumerator MFX_BRC_OK = 0

CodedFrameSize is acceptable, no further recoding/padding/skip required, proceed to next frame.

enumerator MFX_BRC_BIG_FRAME = 1

Coded frame is too big, recoding required.

enumerator MFX_BRC_SMALL_FRAME = 2

Coded frame is too small, recoding required.

enumerator MFX_BRC_PANIC_BIG_FRAME = 3

Coded frame is too big, no further recoding possible - skip frame.

enumerator MFX_BRC_PANIC_SMALL_FRAME = 4

Coded frame is too small, no further recoding possible; padding to mfxBRCFrameStatus::MinFrameSize is required.

Rotation

The Rotation enumerator itemizes the JPEG rotation options.

enumerator MFX_ROTATION_0 = 0

No rotation.

enumerator MFX_ROTATION_90 = 1

90 degree rotation

enumerator MFX_ROTATION_180 = 2

180 degree rotation

enumerator MFX_ROTATION_270 = 3

270 degree rotation

JPEGColorFormat

The JPEGColorFormat enumerator itemizes the JPEG color format options.

enumerator MFX_JPEG_COLORFORMAT_UNKNOWN = 0

Unknown color format. The SDK decoder tries to determine the color format from information available in the bitstream. If such information is not present, the MFX_JPEG_COLORFORMAT_YCbCr color format is assumed.

enumerator MFX_JPEG_COLORFORMAT_YCbCr = 1

Bitstream contains Y, Cb and Cr components.

enumerator MFX_JPEG_COLORFORMAT_RGB = 2

Bitstream contains R, G and B components.

JPEGScanType

The JPEGScanType enumerator itemizes the JPEG scan types.

enumerator MFX_SCANTYPE_UNKNOWN = 0

Unknown scan type.

enumerator MFX_SCANTYPE_INTERLEAVED = 1

Interleaved scan.

enumerator MFX_SCANTYPE_NONINTERLEAVED = 2

Non-interleaved scan.

Protected

The Protected enumerator describes the protection schemes.

enumerator MFX_PROTECTION_CENC_WV_CLASSIC = 0x0004

The protection scheme is based on the Widevine* DRM from Google*.

enumerator MFX_PROTECTION_CENC_WV_GOOGLE_DASH = 0x0005

The protection scheme is based on the Widevine* Modular DRM* from Google*.

Structs

mfxRange32U

struct mfxRange32U

This structure represents a range of unsigned values.

Public Members

mfxU32 Min

Minimal value of the range

mfxU32 Max

Maximal value of the range

mfxU32 Step

Value incrementation step

mfxI16Pair

struct mfxI16Pair

This structure represents a pair of numbers of type mfxI16.

Public Members

mfxI16 x

First number

mfxI16 y

Second number

mfxHDLPair

struct mfxHDLPair

This structure represents a pair of handles of type mfxHDL.

Public Members

mfxHDL first

First handle

mfxHDL second

Second handle

mfxVersion

union mfxVersion
#include <mfxcommon.h>

The mfxVersion union describes the version of the SDK implementation.

Public Members

mfxU16 Minor

Minor number of the SDK implementation

mfxU16 Major

Major number of the SDK implementation

struct mfxVersion::[anonymous] [anonymous]
mfxU32 Version

SDK implementation version number

mfxStructVersion

union mfxStructVersion
#include <mfxdefs.h>

Introduces the Version field for any structure. The minor number is incremented when reserved fields are used; the major number is incremented when the size of the structure is increased. It is assumed that any structure change is backward binary compatible. mfxStructVersion starts from {1, 0} for any new API structure; if mfxStructVersion is added to an existing legacy structure (replacing reserved fields), it starts from {1, 1}.

Public Members

mfxU8 Minor

Minor number of the correspondent structure

mfxU8 Major

Major number of the correspondent structure

struct mfxStructVersion::[anonymous] [anonymous]
mfxU16 Version

Structure version number

mfxPlatform

struct mfxPlatform

The mfxPlatform structure contains information about hardware platform.

Public Members

mfxU16 CodeName

Intel® microarchitecture code name. See the PlatformCodeName enumerator for a list of possible values.

mfxU16 DeviceId

Unique identifier of graphics device.

mfxU16 MediaAdapterType

Description of Intel Gen Graphics adapter type. See the mfxMediaAdapterType enumerator for a list of possible values.

mfxU16 reserved[13]

Reserved for future use.
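
The structure is filled by MFXVideoCORE_QueryPlatform, for example:

mfxPlatform platform = {};
mfxStatus sts = MFXVideoCORE_QueryPlatform(session, &platform);
// on MFX_ERR_NONE, platform.CodeName, platform.DeviceId and platform.MediaAdapterType
// describe the adapter the session runs on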

mfxInitParam

struct mfxInitParam

This structure specifies advanced initialization parameters. A zero value in any of the fields indicates that the corresponding field is not explicitly specified.

Public Members

mfxIMPL Implementation

mfxIMPL enumerator that indicates the desired SDK implementation

mfxVersion Version

Structure which specifies minimum library version or zero, if not specified

mfxU16 ExternalThreads

Desired threading mode. Value 0 means internal threading, 1 – external.

mfxExtBuffer **ExtParam

Points to an array of pointers to the extra configuration structures; see the ExtendedBufferID enumerator for a list of extended configurations.

mfxU16 NumExtParam

The number of extra configuration structures attached to this structure.

mfxU16 GPUCopy

Enables or disables GPU accelerated copying between video and system memory in the SDK components. See the GPUCopy enumerator for a list of valid values.
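
A minimal initialization sketch using these fields (the implementation and version values are illustrative):

mfxInitParam initPar = {};
initPar.Implementation = MFX_IMPL_HARDWARE_ANY;   // prefer any hardware implementation
initPar.Version.Major  = 1;
initPar.Version.Minor  = 35;                      // minimum library version required
initPar.GPUCopy        = MFX_GPUCOPY_DEFAULT;

mfxSession session = NULL;
mfxStatus sts = MFXInitEx(initPar, &session);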

mfxInfoMFX

struct mfxInfoMFX

The mfxInfoMFX structure specifies configurations for decoding, encoding and transcoding processes. A zero value in any of these fields indicates that the field is not explicitly specified.

Public Members

mfxU32 reserved[7]

Reserved for future use.

mfxU16 LowPower

For encoders set this flag to ON to reduce power consumption and GPU usage. See the CodingOptionValue enumerator for values of this option. Use Query function to check if this feature is supported.

mfxU16 BRCParamMultiplier

Specifies a multiplier for bitrate control parameters. It affects the next four parameters: InitialDelayInKB, BufferSizeInKB, TargetKbps, and MaxKbps. If this value is not equal to zero, the encoder calculates the BRC parameters as value * BRCParamMultiplier.
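
Because these bitrate fields are 16-bit, targets above 65535 Kbps are expressed with the multiplier; for example, a 100000 Kbps target could be set as shown below, where par is the mfxVideoParam being prepared for encoder initialization (values are illustrative).

par.mfx.BRCParamMultiplier = 2;
par.mfx.TargetKbps         = 50000;   // effective target bitrate: 2 * 50000 = 100000 Kbps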

mfxFrameInfo FrameInfo

mfxFrameInfo structure that specifies frame parameters

mfxU32 CodecId

Specifies the codec format identifier in the FOURCC code; see the CodecFormatFourCC enumerator for details. This is a mandated input parameter for QueryIOSurf and Init functions.

mfxU16 CodecProfile

Specifies the codec profile; see the CodecProfile enumerator for details. Specify the codec profile explicitly or the SDK functions will determine the correct profile from other sources, such as resolution and bitrate.

mfxU16 CodecLevel

Codec level; see the CodecLevel enumerator for details. Specify the codec level explicitly or the SDK functions will determine the correct level from other sources, such as resolution and bitrate.

mfxU16 TargetUsage

Target usage model that guides the encoding process; see the TargetUsage enumerator for details.

mfxU16 GopPicSize

Number of pictures within the current GOP (Group of Pictures); if GopPicSize = 0, then the GOP size is unspecified. If GopPicSize = 1, only I-frames are used. The following pseudo-code demonstrates how the SDK uses this parameter:

mfxU16 get_gop_sequence (...) {
    pos = display_frame_order;
    if (pos == 0)
        return MFX_FRAMETYPE_I | MFX_FRAMETYPE_IDR | MFX_FRAMETYPE_REF;

    if (GopPicSize == 1)                      // Only I-frames
        return MFX_FRAMETYPE_I | MFX_FRAMETYPE_REF;

    if (GopPicSize == 0)
        frameInGOP = pos;                     // Unlimited GOP
    else
        frameInGOP = pos % GopPicSize;

    if (frameInGOP == 0)
        return MFX_FRAMETYPE_I | MFX_FRAMETYPE_REF;

    if (GopRefDist == 1 || GopRefDist == 0)   // Only I, P frames
        return MFX_FRAMETYPE_P | MFX_FRAMETYPE_REF;

    frameInPattern = (frameInGOP - 1) % GopRefDist;
    if (frameInPattern == GopRefDist - 1)
        return MFX_FRAMETYPE_P | MFX_FRAMETYPE_REF;

    return MFX_FRAMETYPE_B;
}

mfxU16 GopRefDist

Distance between I- or P (or GPB) - key frames; if it is zero, the GOP structure is unspecified. Note: If GopRefDist = 1, there are no regular B-frames used (only P or GPB); if mfxExtCodingOption3::GPB is ON, GPB frames (B without backward references) are used instead of P.

mfxU16 GopOptFlag

ORs of the GopOptFlag enumerator indicate the additional flags for the GOP specification.

mfxU16 IdrInterval

For H.264, IdrInterval specifies IDR-frame interval in terms of I-frames; if IdrInterval = 0, then every I-frame is an IDR-frame. If IdrInterval = 1, then every other I-frame is an IDR-frame, etc.

For HEVC, if IdrInterval = 0, then only first I-frame is an IDR-frame. If IdrInterval = 1, then every I-frame is an IDR-frame. If IdrInterval = 2, then every other I-frame is an IDR-frame, etc.

For MPEG2, IdrInterval defines sequence header interval in terms of I-frames. If IdrInterval = N, SDK inserts the sequence header before every Nth I-frame. If IdrInterval = 0 (default), SDK inserts the sequence header once at the beginning of the stream.

If GopPicSize or GopRefDist is zero, IdrInterval is undefined.

mfxU16 InitialDelayInKB

Initial size of the Video Buffering Verifier (VBV) buffer.

Note

In this context, KB is 1000 bytes and Kbps is 1000 bps.

mfxU16 QPI

Quantization Parameter (QP) for I frames for constant QP mode (CQP). Zero QP is not valid and means that default value is assigned by oneVPL. Non-zero QPI might be clipped to supported QPI range.

Note

Default QPI value is implementation dependent and subject to change without additional notice in this document.

mfxU16 Accuracy

Specifies accuracy range in the unit of tenth of percent.

mfxU16 BufferSizeInKB

BufferSizeInKB represents the maximum possible size of any compressed frames.

mfxU16 TargetKbps

Constant bitrate TargetKbps. Used to estimate the targeted frame size by dividing the bitrate by the frame rate.

mfxU16 QPP

Quantization Parameter (QP) for P frames for constant QP mode (CQP). Zero QP is not valid and means that the default value is assigned by oneVPL. Non-zero QPP might be clipped to the supported QPP range.

Note

Default QPP value is implementation dependent and subject to change without additional notice in this document.

mfxU16 ICQQuality

This parameter is used by the Intelligent Constant Quality (ICQ) bitrate control algorithm. It is a value in the 1...51 range, where 1 corresponds to the best quality.

mfxU16 MaxKbps

The maximum bitrate at which the encoded data enters the Video Buffering Verifier (VBV) buffer.

mfxU16 QPB

Quantization Parameter (QP) for B frames for constant QP mode (CQP). Zero QP is not valid and means that the default value is assigned by oneVPL. Non-zero QPB might be clipped to the supported QPB range.

Note

Default QPB value is implementation dependent and subject to change without additional notice in this document.

mfxU16 Convergence

Convergence period in the unit of 100 frames.

mfxU16 NumSlice

Number of slices in each video frame; each slice contains one or more macro-block rows. If NumSlice equals zero, the encoder may choose any slice partitioning allowed by the codec standard. See also mfxExtCodingOption2::NumMbPerSlice.

mfxU16 NumRefFrame

Max number of all available reference frames (for AVC/HEVC NumRefFrame defines DPB size); if NumRefFrame = 0, this parameter is not specified. See also mfxExtCodingOption3::NumRefActiveP, NumRefActiveBL0 and NumRefActiveBL1 which set a number of active references.

mfxU16 EncodedOrder

If not zero, EncodedOrder specifies that ENCODE takes the input surfaces in the encoded order and uses explicit frame type control. Application still must provide GopRefDist and mfxExtCodingOption2::BRefType so SDK can pack headers and build reference lists correctly.

mfxU16 DecodedOrder

For AVC and HEVC, used to instruct the decoder to return output frames in the decoded order. Must be zero for all other decoders. When enabled, correctness of mfxFrameData::TimeStamp and FrameOrder for output surface is not guaranteed, the application should ignore them.

mfxU16 ExtendedPicStruct

Instructs DECODE to output extended picture structure values for additional display attributes. See the PicStruct description for details.

mfxU16 TimeStampCalc

Time stamp calculation method; see the TimeStampCalc description for details.

mfxU16 SliceGroupsPresent

Nonzero value indicates that slice groups are present in the bitstream. Only AVC decoder uses this field.

mfxU16 MaxDecFrameBuffering

Nonzero value specifies the maximum required size of the decoded picture buffer in frames for AVC and HEVC decoders.

mfxU16 EnableReallocRequest

For decoders supporting dynamic resolution change (VP9), set this option to ON to allow MFXVideoDECODE_DecodeFrameAsync to return MFX_ERR_REALLOC_SURFACE. See the CodingOptionValue enumerator for values of this option. Use the Query function to check if this feature is supported.

mfxU16 JPEGChromaFormat

Specify the chroma sampling format that has been used to encode JPEG picture. See the ChromaFormat enumerator.

mfxU16 Rotation

Rotation option of the output JPEG picture; see the Rotation enumerator for details.

mfxU16 JPEGColorFormat

Specify the color format that has been used to encode JPEG picture. See the JPEGColorFormat enumerator for details.

mfxU16 InterleavedDec

Specify JPEG scan type for decoder. See the JPEGScanType enumerator for details.

mfxU8 SamplingFactorH[4]

Horizontal sampling factor.

mfxU8 SamplingFactorV[4]

Vertical sampling factor.

mfxU16 Interleaved

Non-interleaved or interleaved scans. If it is equal to MFX_SCANTYPE_INTERLEAVED, the image is encoded as interleaved and all components are encoded in one scan. See the JPEG Scan Type enumerator for details.

mfxU16 Quality

Specifies the image quality if the application does not specify a quantization table. The value is in the range of 1 to 100 inclusive; 100 is the best quality.

mfxU16 RestartInterval

Specifies the number of MCU in the restart interval. “0” means no restart interval.

Note

The mfxInfoMFX::InitialDelayInKB, mfxInfoMFX::TargetKbps, mfxInfoMFX::MaxKbps parameters are for the constant bitrate (CBR), variable bitrate control (VBR) and CQP HRD algorithms.

The SDK encoders follow the Hypothetical Reference Decoding (HRD) model. The HRD model assumes that data flows into a buffer of the fixed size BufferSizeInKB with a constant bitrate TargetKbps. (Estimate the targeted frame size by dividing the bitrate by the frame rate.)

The decoder starts decoding after the buffer reaches the initial size InitialDelayInKB, which is equivalent to reaching an initial delay of InitialDelayInKB*8000/TargetKbps milliseconds. Note: In this context, KB is 1000 bytes and Kbps is 1000 bps.

If InitialDelayInKB or BufferSizeInKB is equal to zero, the value is calculated using bitrate, frame rate, profile, level, and so on.

TargetKbps must be specified for encoding initialization.

For variable bitrate control, the MaxKbps parameter specifies the maximum bitrate at which the encoded data enters the Video Buffering Verifier (VBV) buffer. If MaxKbps is equal to zero, the value is calculated from bitrate, frame rate, profile, level, and so on.
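
A minimal sketch that makes these relationships concrete (illustrative numbers only; the fields are the mfxInfoMFX and mfxFrameInfo members described in this section):

/* CBR at 4000 Kbps and 30 fps:
 * average frame size = TargetKbps*1000 / (8 * fps) = about 16,666 bytes,
 * initial delay      = InitialDelayInKB*8000 / TargetKbps = 62*8000/4000 = 124 ms. */
mfxVideoParam par = {0};
par.mfx.RateControlMethod       = MFX_RATECONTROL_CBR;
par.mfx.TargetKbps              = 4000;  /* must be specified for encoder initialization */
par.mfx.InitialDelayInKB        = 62;    /* zero lets the implementation derive it */
par.mfx.BufferSizeInKB          = 500;   /* zero lets the implementation derive it */
par.mfx.FrameInfo.FrameRateExtN = 30;
par.mfx.FrameInfo.FrameRateExtD = 1;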

Note

The mfxInfoMFX::TargetKbps, mfxInfoMFX::Accuracy, mfxInfoMFX::Convergence parameters are for the average variable bitrate control (AVBR) algorithm. The algorithm focuses on overall encoding quality while meeting the specified bitrate, TargetKbps, within the accuracy range Accuracy, after a Convergence period. This method does not follow HRD and the instant bitrate is not capped or padded.

mfxFrameInfo

struct mfxFrameInfo

The mfxFrameInfo structure specifies properties of video frames. See also “Configuration Parameter Constraints” chapter.

FrameRate

Specify the frame rate by the formula: FrameRateExtN / FrameRateExtD.

For encoding, the frame rate must be specified. For decoding, the frame rate may be unspecified (FrameRateExtN and FrameRateExtD are both zero); in this case the frame rate defaults to 30 frames per second.

mfxU32 FrameRateExtN

Numerator

mfxU32 FrameRateExtD

Denominator
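
For example, an NTSC 29.97 fps stream is expressed exactly as a ratio rather than as a rounded floating-point value (a minimal sketch; info stands for the mfxFrameInfo being filled in):

mfxFrameInfo info = {0};
info.FrameRateExtN = 30000;  /* 30000 / 1001 = 29.97 fps */
info.FrameRateExtD = 1001;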

AspectRatio

These parameters specify the sample aspect ratio. If sample aspect ratio is explicitly defined by the standards (see Table 6-3 in the MPEG-2 specification or Table E-1 in the H.264 specification), AspectRatioW and AspectRatioH should be the defined values. Otherwise, the sample aspect ratio can be derived as follows:

AspectRatioW=display_aspect_ratio_width*display_height;

AspectRatioH=display_aspect_ratio_height*display_width;

For MPEG-2, the above display aspect ratio must be one of the defined values in Table 6-3. For H.264, there is no restriction on display aspect ratio values.

If both parameters are zero, the encoder uses default value of sample aspect ratio.

mfxU16 AspectRatioW

Ratio for width.

mfxU16 AspectRatioH

Ratio for height.
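
As a worked instance of the formula above, consider a 720x480 frame intended for 16:9 display (illustrative values; any equivalent ratio may be used):

mfxFrameInfo info = {0};
info.AspectRatioW = 16 * 480;  /* display_aspect_ratio_width  * display_height = 7680 */
info.AspectRatioH = 9 * 720;   /* display_aspect_ratio_height * display_width  = 6480, i.e. a 32:27 sample aspect ratio */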

ROI

Display the region of interest of the frame; specify the display width and height in mfxVideoParam.

mfxU16 CropX

X coordinate

mfxU16 CropY

Y coordinate

mfxU16 CropW

Width

mfxU16 CropH

Height

Public Members

mfxU32 reserved[4]

Reserved for future use.

mfxU16 reserved4

Reserved for future use.

mfxU16 BitDepthLuma

Number of bits used to represent luma samples.

Note

Not all codecs and SDK implementations support this value. Use Query function to check if this feature is supported.

mfxU16 BitDepthChroma

Number of bits used to represent chroma samples.

Note

Not all codecs and SDK implementations support this value. Use Query function to check if this feature is supported.

mfxU16 Shift

When not zero, indicates that the values of luma and chroma samples are shifted. Use BitDepthLuma and BitDepthChroma to calculate the shift size. Use a zero value to indicate absence of a shift.

Note

Not all codecs and SDK implementations support this value. Use Query function to check if this feature is supported.

mfxFrameId FrameId

Frame ID. Ignored as obsolete parameter.

mfxU32 FourCC

FourCC code of the color format; see the ColorFourCC enumerator for details.

mfxU16 Width

Width of the video frame in pixels. Must be a multiple of 16.

mfxU16 Height

Height of the video frame in pixels. Must be a multiple of 16 for progressive frame sequence and a multiple of 32 otherwise.

mfxU64 BufferSize

Size of frame buffer in bytes. Valid only for plain formats (when FourCC is P8); Width, Height and crops in this case are invalid.

mfxU16 PicStruct

Picture type as specified in the PicStruct enumerator.

mfxU16 ChromaFormat

Color sampling method; the value of ChromaFormat is the same as that of ChromaFormatIdc. ChromaFormat is not defined if FourCC is zero.

Note

Data alignment for Shift = 0

Bit:    15  14  13  12  11  10 |  9   8   7   6   5   4   3   2   1   0
Value:   0   0   0   0   0   0 |              Valid data

Data alignment for Shift != 0

Bit:    15  14  13  12  11  10   9   8   7   6 |  5   4   3   2   1   0
Value:              Valid data                 |  0   0   0   0   0   0
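
A minimal sketch of moving one 10-bit luma sample between the two layouts above (assuming a 16-bit container, so the shift size is 16 - BitDepthLuma = 6):

mfxU16 bit_depth  = 10;                        /* info.BitDepthLuma                    */
mfxU16 shift_size = 16 - bit_depth;            /* 6 for the layouts shown above        */
mfxU16 lsb_sample = 0x03FF;                    /* Shift == 0: valid data in bits 9..0  */
mfxU16 msb_sample = lsb_sample << shift_size;  /* Shift != 0: valid data in bits 15..6 */
mfxU16 restored   = msb_sample >> shift_size;  /* recover the unshifted value          */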

mfxVideoParam

struct mfxVideoParam

The mfxVideoParam structure contains configuration parameters for encoding, decoding, transcoding and video processing.

Public Members

mfxU32 AllocId

Unique component ID that will be passed by SDK to mfxFrameAllocRequest. Useful in pipelines where several components of the same type share the same allocator.

mfxU16 AsyncDepth

Specifies how many asynchronous operations an application performs before the application explicitly synchronizes the result. If zero, the value is not specified.

mfxInfoMFX mfx

Configurations related to encoding, decoding and transcoding; see the definition of the mfxInfoMFX structure for details.

mfxInfoVPP vpp

Configurations related to video processing; see the definition of the mfxInfoVPP structure for details.

mfxU16 Protected

Specifies the content protection mechanism; see the Protected enumerator for a list of supported protection schemes.

mfxU16 IOPattern

Input and output memory access types for SDK functions; see the enumerator IOPattern for details. The Query functions return the natively supported IOPattern if the Query input argument is NULL. This parameter is a mandated input for QueryIOSurf and Init functions. For DECODE, the output pattern must be specified; for ENCODE, the input pattern must be specified; and for VPP, both input and output pattern must be specified.

mfxExtBuffer **ExtParam

Points to an array of pointers to the extra configuration structures; see the ExtendedBufferID enumerator for a list of extended configurations. The list of extended buffers should not contain duplicated entries, i.e. entries of the same type. If the mfxVideoParam structure is used to query the SDK capability, then the lists of extended buffers attached to the input and output mfxVideoParam structures should be equal, i.e. should contain the same number of extended buffers of the same type.

mfxU16 NumExtParam

The number of extra configuration structures attached to this structure.
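
A minimal sketch of attaching one extended buffer through ExtParam and NumExtParam (mfxExtCodingOption2 and the chosen option are illustrative):

mfxExtCodingOption2 co2 = {0};
co2.Header.BufferId = MFX_EXTBUFF_CODING_OPTION2;
co2.Header.BufferSz = sizeof(co2);
co2.BRefType        = MFX_B_REF_PYRAMID;   /* example option */

mfxExtBuffer *ext_buffers[] = { &co2.Header };

mfxVideoParam par = {0};
par.ExtParam    = ext_buffers;
par.NumExtParam = sizeof(ext_buffers) / sizeof(ext_buffers[0]);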

mfxFrameData

struct mfxY410

The mfxY410 structure specifies “pixel” in Y410 color format

Public Members

mfxU32 U

U component.

mfxU32 Y

Y component.

mfxU32 V

V component.

mfxU32 A

A component.

struct mfxA2RGB10

The mfxA2RGB10 structure specifies “pixel” in A2RGB10 color format

Public Members

mfxU32 B

B component.

mfxU32 G

G component.

mfxU32 R

R component.

mfxU32 A

A component.

struct mfxFrameData

The mfxFrameData structure describes frame buffer pointers.

Extension Buffers

mfxU16 NumExtParam

The number of extra configuration structures attached to this structure.

General members

mfxU16 reserved[9]

Reserved for future use

mfxU16 MemType

Allocated memory type; see the ExtMemFrameType enumerator for details. Used for better integration of 3rd party plugins into the SDK pipeline.

mfxU16 PitchHigh

Distance in bytes between the start of two consecutive rows in a frame.

mfxU64 TimeStamp

Time stamp of the video frame in units of 90KHz (divide TimeStamp by 90,000 (90 KHz) to obtain the time in seconds). A value of MFX_TIMESTAMP_UNKNOWN indicates that there is no time stamp.

mfxU32 FrameOrder

Current frame counter for the top field of the current frame; an invalid value of MFX_FRAMEORDER_UNKNOWN indicates that SDK functions that generate the frame output do not use this frame.

mfxU16 Locked

Counter flag for the application; if Locked is greater than zero then the application locks the frame or field pair. Do not move, alter or delete the frame.

Color Planes

Data pointers to corresponding color channels (planes). The frame buffer pointers must be 16-byte aligned. The application has to specify pointers to all color channels even for packed formats. For example, for YUY2 format the application has to specify Y, U and V pointers. For RGB32 – R, G, B and A pointers.

mfxU8 *A

A channel

mfxMemId MemId

Memory ID of the data buffers; if any of the preceding data pointers is non-zero then the SDK ignores MemId.

Additional Flags

mfxU16 Corrupted

Some part of the frame or field pair is corrupted. See the Corruption enumerator for details.

mfxU16 DataFlag

Additional flags to indicate frame data properties. See the FrameDataFlag enumerator for details.

Public Members

mfxExtBuffer **ExtParam

Points to an array of pointers to the extra configuration structures; see the ExtendedBufferID enumerator for a list of extended configurations.

mfxU16 PitchLow

Distance in bytes between the start of two consecutive rows in a frame.

mfxU8 *Y

Y channel

mfxU16 *Y16

Y16 channel

mfxU8 *R

R channel

mfxU8 *UV

UV channel for UV merged formats

mfxU8 *VU

VU channel for VU merged formats

mfxU8 *CbCr

CbCr channel for CbCr merged formats

mfxU8 *CrCb

CrCb channel for CrCb merged formats

mfxU8 *Cb

Cb channel

mfxU8 *U

U channel

mfxU16 *U16

U16 channel

mfxU8 *G

G channel

mfxY410 *Y410

Y410 channel for Y410 format (merged AVYU)

mfxU8 *Cr

Cr channel

mfxU8 *V

V channel

mfxU16 *V16

V16 channel

mfxU8 *B

B channel

mfxA2RGB10 *A2RGB10

A2RGB10 channel for A2RGB10 format (merged ARGB)
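
A minimal sketch of wiring an application-allocated system-memory NV12 buffer into mfxFrameData (the resolution and pitch are hypothetical; the application owns the allocation and must satisfy the 16-byte alignment requirement above):

#include <stdlib.h>

mfxU16 width = 1920, height = 1080;
mfxU16 pitch = 1920;                                 /* bytes per row, application-defined */
mfxU8 *buffer = (mfxU8 *)malloc((size_t)pitch * height * 3 / 2);  /* NV12: Y plane + interleaved UV plane */

mfxFrameData data = {0};
data.PitchLow = pitch;                               /* use PitchHigh for pitches >= 65536 */
data.Y  = buffer;                                    /* luma plane */
data.UV = buffer + (size_t)pitch * height;           /* interleaved chroma plane */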

mfxFrameSurfaceInterface

struct mfxFrameSurfaceInterface

Public Members

mfxHDL Context

The context of the memory interface. The user should not touch (change, set, or null) this pointer.

mfxStructVersion Version

The version of the structure.

mfxStatus (*AddRef)(mfxFrameSurface1 *surface)

This function increments the internal reference counter of the surface, indicating that the user intends to keep the surface. The surface cannot be destroyed until the user calls (*Release). Users are expected to call (*AddRef)() each time they create a new link (copy the structure, etc.) to the surface for proper management.

Return

MFX_ERR_NONE if no error.

MFX_ERR_NULL_PTR if surface is NULL.

MFX_ERR_INVALID_HANDLE if mfxFrameSurfaceInterface->Context is invalid (for example NULL).

MFX_ERR_UNKNOWN in case of any internal error.

Parameters
  • [in] surface: valid surface.

mfxStatus (*Release)(mfxFrameSurface1 *surface)

This function decrements the internal reference counter of the surface. Users must call (*Release) after (*AddRef), or when it is required according to the allocation logic. For instance, users have to call (*Release) to release a surface obtained with a GetSurfaceForXXX function.

Return

MFX_ERR_NONE if no error.

MFX_ERR_NULL_PTR if surface is NULL.

MFX_ERR_INVALID_HANDLE if mfxFrameSurfaceInterface->Context is invalid (for example NULL).

MFX_ERR_UNDEFINED_BEHAVIOR if Reference Counter of surface is zero before call.

MFX_ERR_UNKNOWN in case of any internal error.

Parameters
  • [in] surface: valid surface.

mfxStatus (*GetRefCounter)(mfxFrameSurface1 *surface, mfxU32 *counter)

This function returns the current reference counter of the mfxFrameSurface1 structure.

Return

MFX_ERR_NONE if no error.

MFX_ERR_NULL_PTR if surface or counter is NULL.

MFX_ERR_INVALID_HANDLE if mfxFrameSurfaceInterface->Context is invalid (for example NULL).

MFX_ERR_UNKNOWN in case of any internal error.

Parameters
  • [in] surface: valid surface.

  • [out] counter: sets counter to the current reference counter value.

mfxStatus (*Map)(mfxFrameSurface1 *surface, mfxU32 flags)

This function sets the pointers of surface->Data to the actual pixel data, providing read or write access. In the case of video memory, the actual surface with data in video memory becomes mapped to system memory. An application can map a surface for reading with any value of mfxFrameSurface1::Data.Locked, but for writing only when mfxFrameSurface1::Data.Locked equals 0.

Note: a surface allows shared read access but exclusive write access. Consider the following cases:

Map with Write or Read|Write flags. A request during another active read or write access returns the MFX_ERR_LOCK_MEMORY error immediately, without waiting. MFX_MAP_NOWAIT does not impact this behavior. Such a request does not lead to any implicit synchronizations.

Map with Read flag. A request during an active write access waits for the resource to become free, or exits immediately with an error if the MFX_MAP_NOWAIT flag was set. This request may lead to implicit synchronization (with the same logic as a Synchronize call), waiting for the surface to become ready to use (all dependencies resolved and upstream components finished writing to this surface). It is guaranteed that read access is acquired right after synchronization, without allowing another thread to acquire this surface for writing. If MFX_MAP_NOWAIT was set and the surface is not ready yet (it has unresolved data dependencies or active processing), the read access request exits immediately with an error.

Read-write access with MFX_MAP_READ_WRITE provides exclusive simultaneous reading and writing access.

Return

MFX_ERR_NONE if no error.

MFX_ERR_NULL_PTR if surface is NULL.

MFX_ERR_INVALID_HANDLE if mfxFrameSurfaceInterface->Context is invalid (for example NULL).

MFX_ERR_UNSUPPORTED if flags are invalid.

MFX_ERR_LOCK_MEMORY if user wants to map the surface for write and surface->Data.Locked doesn’t equal to 0.

MFX_ERR_UNKNOWN in case of any internal error.

Parameters
  • [in] surface: valid surface.

  • [in] flags: to specify the mapping mode.

  • [out] surface->Data: pointers set to the actual pixel data.

mfxStatus (*Unmap)(mfxFrameSurface1 *surface)

This function invalidates the pointers of surface->Data and sets them to NULL. In the case of video memory, the actual surface with data in video memory becomes unmapped.

Return

MFX_ERR_NONE if no error.

MFX_ERR_NULL_PTR if surface is NULL.

MFX_ERR_INVALID_HANDLE if mfxFrameSurfaceInterface->Context is invalid (for example NULL).

MFX_ERR_UNSUPPORTED if surface is already unmapped.

MFX_ERR_UNKNOWN in case of any internal error.

Parameters
  • [in] surface: valid surface

  • [out] surface->Data: pointers set to NULL.

mfxStatus (*GetNativeHandle)(mfxFrameSurface1 *surface, mfxHDL *resource, mfxResourceType *resource_type)

This function returns a native resource's handle and type. The handle is returned as-is, which means that the reference counter of the base resource is not incremented. The native resource is not detached from the surface and oneVPL still owns the resource. The user must not destroy the native resource or rely on the resource staying alive after (*Release).

Return

MFX_ERR_NONE if no error.

MFX_ERR_NULL_PTR if any of surface, resource or resource_type is NULL.

MFX_ERR_INVALID_HANDLE if any of surface, resource or resource_type is not a valid object (no native resource was allocated).

MFX_ERR_UNSUPPORTED if surface is in system memory.

MFX_ERR_UNKNOWN in case of any internal error.

Parameters
  • [in] surface: valid surface.

  • [out] resource: - pointer is set to the native handle of the resource.

  • [out] resource_type: - type of native resource (see mfxResourceType enumeration).

mfxStatus (*GetDeviceHandle)(mfxFrameSurface1 *surface, mfxHDL *device_handle, mfxHandleType *device_type)

This function returns the device abstraction that was used to create the resource. The handle is returned as-is, which means that the reference counter for the device abstraction is not incremented. The native resource is not detached from the surface and oneVPL still has a reference to the resource. The user must not destroy the device or rely on the device staying alive after (*Release).

Return

MFX_ERR_NONE if no error.

MFX_ERR_NULL_PTR if any of surface, device_handle or device_type is NULL.

MFX_ERR_INVALID_HANDLE if any of surface, device_handle or device_type is not a valid object (no native resource was allocated).

MFX_ERR_UNSUPPORTED if surface is in system memory.

MFX_ERR_UNKNOWN in case of any internal error.

Parameters
  • [in] surface: valid surface.

  • [out] device_handle: - pointer is set to the device which created the resource

  • [out] device_type: - type of device (see mfxHandleType enumeration).

mfxStatus (*Synchronize)(mfxFrameSurface1 *surface, mfxU32 wait)

This function guarantees readiness of both the data (pixels) and any frame meta information (e.g. corruption flags) after the function completes. Instead of MFXVideoCORE_SyncOperation, users may directly call (*Synchronize) after the corresponding Decode or VPP function calls (MFXVideoDECODE_DecodeFrameAsync or MFXVideoVPP_RunFrameVPPAsync). The prerequisites to call this function are that the main processing function returned MFX_ERR_NONE and the mfxFrameSurface1 object is valid.

Return

MFX_ERR_NONE if no error.

MFX_ERR_NULL_PTR if surface is NULL.

MFX_ERR_INVALID_HANDLE if surface is not a valid object.

MFX_WRN_IN_EXECUTION if the given timeout is expired and the surface is not ready.

MFX_ERR_ABORTED if the specified asynchronous function aborted due to data dependency on a previous asynchronous function that did not complete.

MFX_ERR_UNKNOWN in case of any internal error.

Parameters
  • [in] surface: - valid surface.

  • [in] wait: wait time in milliseconds.
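
A minimal sketch of the typical call sequence on a surface returned by a decoder (dec_surface is assumed to come from a successful MFXVideoDECODE_DecodeFrameAsync call; error handling omitted):

mfxFrameSurface1 *s = dec_surface;

/* Wait for both pixel data and metadata; replaces MFXVideoCORE_SyncOperation. */
s->FrameInterface->Synchronize(s, 1000 /* wait, in ms */);

/* Gain CPU read access to the pixels. */
s->FrameInterface->Map(s, MFX_MAP_READ);
/* ... read s->Data.Y / s->Data.UV using s->Data.PitchLow ... */
s->FrameInterface->Unmap(s);

/* Drop the application's reference once the frame is no longer needed. */
s->FrameInterface->Release(s);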

mfxFrameSurface1

struct mfxFrameSurface1

The mfxFrameSurface1 structure defines the uncompressed frame surface information and data buffers. The frame surface is a frame or complementary field pair of pixels with up to four color channels, described in two parts: mfxFrameInfo and mfxFrameData.

Public Members

struct mfxFrameSurfaceInterface *FrameInterface

mfxFrameSurfaceInterface specifies interface to work with surface.

mfxFrameInfo Info

mfxFrameInfo structure specifies surface properties.

mfxFrameData Data

mfxFrameData structure describes the actual frame buffer.

mfxBitstream

struct mfxBitstream

The mfxBitstream structure defines the buffer that holds compressed video data.

Public Members

mfxEncryptedData *EncryptedData

Reserved and must be zero.

mfxExtBuffer **ExtParam

Array of extended buffers for additional bitstream configuration. See the ExtendedBufferID enumerator for a complete list of extended buffers.

mfxU16 NumExtParam

The number of extended buffers attached to this structure.

mfxI64 DecodeTimeStamp

Decode time stamp of the compressed bitstream in units of 90KHz. A value of MFX_TIMESTAMP_UNKNOWN indicates that there is no time stamp. This value is calculated by the SDK encoder from presentation time stamp provided by the application in mfxFrameSurface1 structure and from frame rate provided by the application during the SDK encoder initialization.

mfxU64 TimeStamp

Time stamp of the compressed bitstream in units of 90KHz. A value of MFX_TIMESTAMP_UNKNOWN indicates that there is no time stamp.

mfxU8 *Data

Bitstream buffer pointer, 32-byte aligned

mfxU32 DataOffset

Next reading or writing position in the bitstream buffer

mfxU32 DataLength

Size of the actual bitstream data in bytes

mfxU32 MaxLength

Allocated bitstream buffer size in bytes

mfxU16 PicStruct

Type of the picture in the bitstream; this is an output parameter.

mfxU16 FrameType

Frame type of the picture in the bitstream; this is an output parameter.

mfxU16 DataFlag

Indicates additional bitstream properties; see the BitstreamDataFlag enumerator for details.

mfxU16 reserved2

Reserved for future use.
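
A minimal sketch of preparing an mfxBitstream that will receive compressed data from an encoder (the 2 MB capacity is arbitrary; the application owns the buffer, and aligned_alloc is one C11 way to satisfy the 32-byte alignment noted for Data):

#include <stdlib.h>

mfxBitstream bs = {0};
bs.MaxLength  = 2 * 1024 * 1024;                          /* allocated capacity in bytes */
bs.Data       = (mfxU8 *)aligned_alloc(32, bs.MaxLength); /* 32-byte aligned buffer      */
bs.DataOffset = 0;                                        /* next read/write position    */
bs.DataLength = 0;                                        /* no valid data yet           */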

mfxEncodeStat

struct mfxEncodeStat

The mfxEncodeStat structure returns statistics collected during encoding.

Public Members

mfxU32 NumFrame

Number of encoded frames.

mfxU64 NumBit

Number of bits for all encoded frames.

mfxU32 NumCachedFrame

Number of internally cached frames.

mfxDecodeStat

struct mfxDecodeStat

The mfxDecodeStat structure returns statistics collected during decoding.

Public Members

mfxU32 NumFrame

Number of total decoded frames.

mfxU32 NumSkippedFrame

Number of skipped frames.

mfxU32 NumError

Number of errors recovered.

mfxU32 NumCachedFrame

Number of internally cached frames.

mfxPayload

struct mfxPayload

The mfxPayload structure describes user data payload in MPEG-2 or SEI message payload in H.264. For encoding, these payloads can be inserted into the bitstream. The payload buffer must contain a valid formatted payload. For H.264, this is the sei_message() as specified in the section 7.3.2.3.1 ‘Supplemental enhancement information message syntax’ of the ISO/IEC 14496-10 specification. For MPEG-2, this is the section 6.2.2.2.2 ‘User data’ of the ISO/IEC 13818-2 specification, excluding the user data start_code. For decoding, these payloads can be retrieved as the decoder parses the bitstream and caches them in an internal buffer.

Public Members

mfxU32 CtrlFlags

Additional payload properties. See the PayloadCtrlFlags enumerator for details.

mfxU8 *Data

Pointer to the actual payload data buffer.

mfxU32 NumBit

Number of bits in the payload data

mfxU16 Type

MPEG-2 user data start code or H.264 SEI message type.

mfxU16 BufSize

Payload buffer size in bytes.

Codec

Supported Types

MPEG2

0x01B2 //User Data

AVC

02 //pan_scan_rect

03 //filler_payload

04 //user_data_registered_itu_t_t35

05 //user_data_unregistered

06 //recovery_point

09 //scene_info

13 //full_frame_freeze

14 //full_frame_freeze_release

15 //full_frame_snapshot

16 //progressive_refinement_segment_start

17 //progressive_refinement_segment_end

19 //film_grain_characteristics

20 //deblocking_filter_display_preference

21 //stereo_video_info

45 //frame_packing_arrangement

HEVC

All

mfxEncodeCtrl

struct mfxEncodeCtrl

The mfxEncodeCtrl structure contains parameters for per-frame based encoding control.

Public Members

mfxExtBuffer Header

Extension buffer header.

mfxU16 MfxNalUnitType

Type of NAL unit that contains encoding frame. All supported values are defined by MfxNalUnitType enumerator. Other values defined in ITU-T H.265 specification are not supported.

The SDK encoder uses this field only if application sets mfxExtCodingOption3::EnableNalUnitType option to ON during encoder initialization.

Note

Only encoded order is supported. If the application specifies this value in display order, or uses a value that is inappropriate for the current frame or invalid, the SDK encoder silently ignores it.

mfxU16 SkipFrame

Indicates that current frame should be skipped or number of missed frames before the current frame. See the mfxExtCodingOption2::SkipFrame for details.

mfxU16 QP

If nonzero, this value overwrites the global QP value for the current frame in the constant QP mode.

mfxU16 FrameType

Encoding frame type; see the FrameType enumerator for details. If the encoder works in the encoded order, the application must specify the frame type. If the encoder works in the display order, only key frames are enforceable.

mfxU16 NumExtParam

Number of extra control buffers.

mfxU16 NumPayload

Number of payload records to insert into the bitstream.

mfxExtBuffer **ExtParam

Pointer to an array of pointers to external buffers that provide additional information or control to the encoder for this frame or field pair; a typical usage is to pass the VPP auxiliary data generated by the video processing pipeline to the encoder. See the ExtendedBufferID for the list of extended buffers.

mfxPayload **Payload

Pointer to an array of pointers to user data (MPEG-2) or SEI messages (H.264) for insertion into the bitstream; for field pictures, odd payloads are associated with the first field and even payloads are associated with the second field. See the mfxPayload structure for payload definitions.
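
A minimal sketch of per-frame control, forcing the next submitted frame to be an IDR key frame (session, surface and bs are assumed to be an already initialized session, input surface and output bitstream; display-order encoding):

mfxEncodeCtrl ctrl = {0};
ctrl.FrameType = MFX_FRAMETYPE_I | MFX_FRAMETYPE_IDR | MFX_FRAMETYPE_REF;

mfxSyncPoint syncp = NULL;
mfxStatus sts = MFXVideoENCODE_EncodeFrameAsync(session, &ctrl, surface, &bs, &syncp);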

mfxFrameAllocRequest

struct mfxFrameAllocRequest

The mfxFrameAllocRequest structure describes multiple frame allocations when initializing encoders, decoders and video preprocessors. A range specifies the number of video frames. Applications are free to allocate additional frames. In any case, the minimum number of frames must be at least NumFrameMin or the called function will return an error.

Public Members

mfxU32 AllocId

Unique (within the session) ID of the component that requested the allocation.

mfxFrameInfo Info

Describes the properties of allocated frames.

mfxU16 Type

Allocated memory type; see the ExtMemFrameType enumerator for details.

mfxU16 NumFrameMin

Minimum number of allocated frames.

mfxU16 NumFrameSuggested

Suggested number of allocated frames.

mfxFrameAllocResponse

struct mfxFrameAllocResponse

The mfxFrameAllocResponse structure describes the response to multiple frame allocations. The calling function returns the number of video frames actually allocated and pointers to their memory IDs.

Public Members

mfxU32 AllocId

Unique (within the session) ID of the component that requested the allocation.

mfxMemId *mids

Pointer to the array of the returned memory IDs; the application allocates or frees this array.

mfxU16 NumFrameActual

Number of frames actually allocated.

mfxFrameAllocator

struct mfxFrameAllocator

The mfxFrameAllocator structure describes the callback functions Alloc, Lock, Unlock, GetHDL and Free that the SDK implementation might use for allocating internal frames. Applications that operate on OS-specific video surfaces must implement these callback functions.

Using the default allocator implies that frame data passes in or out of SDK functions through pointers, as opposed to using memory IDs.

The SDK behavior is undefined when using an incompletely defined external allocator. See the section Memory Allocation and External Allocators for additional information.

Public Members

mfxHDL pthis

Pointer to the allocator object.

mfxStatus (*Alloc)(mfxHDL pthis, mfxFrameAllocRequest *request, mfxFrameAllocResponse *response)

This function allocates surface frames. For decoders, MFXVideoDECODE_Init calls Alloc only once. That call includes all frame allocation requests. For encoders, MFXVideoENCODE_Init calls Alloc twice: once for the input surfaces and again for the internal reconstructed surfaces.

If two SDK components must share DirectX* surfaces, this function should pass the pre-allocated surface chain to SDK instead of allocating new DirectX surfaces. See the Surface Pool Allocation section for additional information.

Return

MFX_ERR_NONE The function successfully allocated the memory block.

MFX_ERR_MEMORY_ALLOC The function failed to allocate the video frames.

MFX_ERR_UNSUPPORTED The function does not support allocating the specified type of memory.

Parameters
  • [in] pthis: Pointer to the allocator object.

  • [in] request: Pointer to the mfxFrameAllocRequest structure that specifies the type and number of required frames.

  • [out] response: Pointer to the mfxFrameAllocResponse structure that retrieves frames actually allocated.

mfxStatus (*Lock)(mfxHDL pthis, mfxMemId mid, mfxFrameData *ptr)

This function locks a frame and returns its pointer.

Return

MFX_ERR_NONE The function successfully locked the memory block.

MFX_ERR_LOCK_MEMORY This function failed to lock the frame.

Parameters
  • [in] pthis: Pointer to the allocator object.

  • [in] mid: Memory block ID.

  • [out] ptr: Pointer to the returned frame structure.

mfxStatus (*Unlock)(mfxHDL pthis, mfxMemId mid, mfxFrameData *ptr)

This function unlocks a frame and invalidates the specified frame structure.

Return

MFX_ERR_NONE The function successfully unlocked the memory block.

Parameters
  • [in] pthis: Pointer to the allocator object.

  • [in] mid: Memory block ID.

  • [out] ptr: Pointer to the frame structure; This pointer can be NULL.

mfxStatus (*GetHDL)(mfxHDL pthis, mfxMemId mid, mfxHDL *handle)

This function returns the OS-specific handle associated with a video frame. If the handle is a COM interface, the reference counter must increase. The SDK will release the interface afterward.

Return

MFX_ERR_NONE The function successfully returned the OS-specific handle.

MFX_ERR_UNSUPPORTED The function does not support obtaining the OS-specific handle.

Parameters
  • [in] pthis: Pointer to the allocator object.

  • [in] mid: Memory block ID.

  • [out] handle: Pointer to the returned OS-specific handle.

mfxStatus (*Free)(mfxHDL pthis, mfxFrameAllocResponse *response)

This function de-allocates all allocated frames.

Return

MFX_ERR_NONE The function successfully de-allocated the memory block.

Parameters
  • [in] pthis: Pointer to the allocator object.

  • [in] response: Pointer to the mfxFrameAllocResponse structure returned by the Alloc function.
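
A minimal sketch of wiring an external allocator into a session (callback bodies omitted; my_allocator_state and session are hypothetical application objects, and a real allocator must implement all five callbacks, since an incompletely defined allocator leads to undefined behavior as noted above):

static mfxStatus MyAlloc(mfxHDL pthis, mfxFrameAllocRequest *request,
                         mfxFrameAllocResponse *response)                { /* ... */ return MFX_ERR_NONE; }
static mfxStatus MyLock(mfxHDL pthis, mfxMemId mid, mfxFrameData *ptr)   { /* ... */ return MFX_ERR_NONE; }
static mfxStatus MyUnlock(mfxHDL pthis, mfxMemId mid, mfxFrameData *ptr) { /* ... */ return MFX_ERR_NONE; }
static mfxStatus MyGetHDL(mfxHDL pthis, mfxMemId mid, mfxHDL *handle)    { /* ... */ return MFX_ERR_NONE; }
static mfxStatus MyFree(mfxHDL pthis, mfxFrameAllocResponse *response)   { /* ... */ return MFX_ERR_NONE; }

mfxFrameAllocator allocator = {0};
allocator.pthis  = &my_allocator_state;   /* application-side context passed back as pthis */
allocator.Alloc  = MyAlloc;
allocator.Lock   = MyLock;
allocator.Unlock = MyUnlock;
allocator.GetHDL = MyGetHDL;
allocator.Free   = MyFree;

MFXVideoCORE_SetFrameAllocator(session, &allocator);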

mfxComponentInfo

struct mfxComponentInfo

The mfxComponentInfo structure contains the workload description, which is accepted by the MFXQueryAdapters function.

Public Members

mfxComponentType Type

Type of workload: Encode, Decode, VPP. See mfxComponentType enumerator for possible values.

mfxVideoParam Requirements

Detailed description of workload, see mfxVideoParam for details.

mfxAdapterInfo

struct mfxAdapterInfo

The mfxAdapterInfo structure contains description of Intel Gen Graphics adapter.

Public Members

mfxPlatform Platform

Platform type description, see mfxPlatform for details.

mfxU32 Number

Value which uniquely characterizes the media adapter. On Windows this number can be used for initialization through the DXVA interface (see example).

mfxAdaptersInfo

struct mfxAdaptersInfo

The mfxAdaptersInfo structure contains description of all Intel Gen Graphics adapters available on current system.

Public Members

mfxAdapterInfo *Adapters

Pointer to array of mfxAdapterInfo structs allocated by user.

mfxU32 NumAlloc

Length of Adapters array.

mfxU32 NumActual

Number of Adapters entries filled by MFXQueryAdapters.
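
A minimal sketch of discovering adapters suitable for an HEVC decode workload, assuming the dispatcher's MFXQueryAdaptersNumber and MFXQueryAdapters entry points (error handling omitted; the codec requirement is illustrative):

#include <stdlib.h>

mfxU32 num = 0;
MFXQueryAdaptersNumber(&num);

mfxAdapterInfo *infos = (mfxAdapterInfo *)calloc(num, sizeof(*infos));
mfxAdaptersInfo adapters = {0};
adapters.Adapters = infos;   /* user-allocated array */
adapters.NumAlloc = num;

mfxComponentInfo wanted = {0};
wanted.Type = MFX_COMPONENT_DECODE;
wanted.Requirements.mfx.CodecId = MFX_CODEC_HEVC;

MFXQueryAdapters(&wanted, &adapters);
/* adapters.NumActual entries are filled; infos[i].Number identifies each suitable adapter. */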

mfxQPandMode

struct mfxQPandMode

The mfxQPandMode structure specifies per-MB or per-CU mode and QP or deltaQP value depending on the mode type.

Public Members

mfxU8 QP

QP for MB or CU. Valid when Mode = MFX_MBQP_MODE_QP_VALUE. For AVC and HEVC the valid range is 1..51. For MPEG2, QP corresponds to quantizer_scale of the ISO/IEC 13818-2 specification and has a valid range of 1..112. Application-provided QP values must be valid; invalid QP values may cause undefined behavior. The MBQP map should be aligned for a 16x16 block size (the alignment rule is ((width + 15) / 16) by ((height + 15) / 16) blocks).

mfxI8 DeltaQP

Per-block QP delta, applied in raster scan order: for block i, QP[i] = BrcQP[i] + DeltaQP[i]. Valid when Mode = MFX_MBQP_MODE_QP_DELTA.

mfxU16 Mode

Defines QP update mode. Can be equal to MFX_MBQP_MODE_QP_VALUE or MFX_MBQP_MODE_QP_DELTA.
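
A minimal sketch of filling a per-block map of mfxQPandMode entries in absolute-QP mode (dimensions are illustrative; attaching the map to the encoder, e.g. through mfxExtMBQP, is outside this structure's scope):

#include <stdlib.h>

mfxU32 blocks_w = (1920 + 15) / 16;   /* one entry per 16x16 block */
mfxU32 blocks_h = (1080 + 15) / 16;
mfxQPandMode *map = (mfxQPandMode *)calloc((size_t)blocks_w * blocks_h, sizeof(*map));

for (mfxU32 i = 0; i < blocks_w * blocks_h; i++) {
    map[i].Mode = MFX_MBQP_MODE_QP_VALUE;   /* absolute QP per block */
    map[i].QP   = 30;                       /* valid AVC/HEVC range: 1..51 */
}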

VPP Structures

mfxInfoVPP
struct mfxInfoVPP

Public Members

mfxFrameInfo In

Input format for video processing.

mfxFrameInfo Out

Output format for video processing.

mfxVPPStat
struct mfxVPPStat

The mfxVPPStat structure returns statistics collected during video processing.

Public Members

mfxU32 NumFrame

Total number of frames processed.

mfxU32 NumCachedFrame

Number of internally cached frames.

Extension buffers structures

mfxExtBuffer
struct mfxExtBuffer

This structure is the common header definition for external buffers and video processing hints.

Public Members

mfxU32 BufferId

Identifier of the buffer content. See the ExtendedBufferID enumerator for a complete list of extended buffers.

mfxU32 BufferSz

Size of the buffer.

mfxExtCodingOption
struct mfxExtCodingOption

The mfxExtCodingOption structure specifies additional options for encoding.

The application can attach this extended buffer to the mfxVideoParam structure to configure initialization.

Public Members

mfxExtBuffer Header

Extension buffer header. Header.BufferId must be equal to MFX_EXTBUFF_CODING_OPTION.

mfxU16 RateDistortionOpt

Set this flag if rate distortion optimization is needed. See the CodingOptionValue enumerator for values of this option.

mfxU16 MECostType

Motion estimation cost type; this value is reserved and must be zero.

mfxU16 MESearchType

Motion estimation search algorithm; this value is reserved and must be zero.

mfxI16Pair MVSearchWindow

Rectangular size of the search window for motion estimation; this parameter is reserved and must be (0, 0).

mfxU16 FramePicture

Set this flag to encode interlaced fields as interlaced frames; this flag does not affect progressive input frames. See the CodingOptionValue enumerator for values of this option.

mfxU16 CAVLC

If set, CAVLC is used; if unset, CABAC is used for encoding. See the CodingOptionValue enumerator for values of this option.

mfxU16 RecoveryPointSEI

Set this flag to insert the recovery point SEI message at the beginning of every intra refresh cycle. See the description of IntRefType in mfxExtCodingOption2 structure for details on how to enable and configure intra refresh.

If intra refresh is not enabled then this flag is ignored.

See the CodingOptionValue enumerator for values of this option.

mfxU16 ViewOutput

Set this flag to instruct the MVC encoder to output each view in separate bitstream buffer. See the CodingOptionValue enumerator for values of this option and SDK Reference Manual for Multi-View Video Coding for more details about usage of this flag.

mfxU16 NalHrdConformance

If this option is turned ON, the AVC encoder produces an HRD conformant bitstream. If it is turned OFF, the AVC encoder may, but does not necessarily, violate HRD conformance. That is, this option can force the encoder to produce an HRD conformant stream, but cannot force it to produce a non-conformant stream.

See the CodingOptionValue enumerator for values of this option.

mfxU16 SingleSeiNalUnit

If set, the encoder puts all SEI messages in a single NAL unit. This includes both kinds of messages, those provided by the application and those created by the encoder. It is a three-state option; see the CodingOptionValue enumerator for values of this option:

UNKNOWN - put each SEI in its own NAL unit,

ON - put all SEI messages in the same NAL unit,

OFF - the same as UNKNOWN

mfxU16 VuiVclHrdParameters

If set and the VBR rate control method is used, then VCL HRD parameters are written in the bitstream with content identical to the NAL HRD parameters. See the CodingOptionValue enumerator for values of this option.

mfxU16 RefPicListReordering

Set this flag to activate reference picture list reordering; this value is reserved and must be zero.

mfxU16 ResetRefList

Set this flag to reset the reference list to non-IDR I-frames of a GOP sequence. See the CodingOptionValue enumerator for values of this option.

mfxU16 RefPicMarkRep

Set this flag to write the reference picture marking repetition SEI message into the output bitstream. See the CodingOptionValue enumerator for values of this option.

mfxU16 FieldOutput

Set this flag to instruct the AVC encoder to output bitstreams immediately after the encoder encodes a field, in the field-encoding mode. See the CodingOptionValue enumerator for values of this option.

mfxU16 IntraPredBlockSize

Minimum block size of intra-prediction; This value is reserved and must be zero.

mfxU16 InterPredBlockSize

Minimum block size of inter-prediction; This value is reserved and must be zero.

mfxU16 MVPrecision

Specify the motion estimation precision; this parameter is reserved and must be zero.

mfxU16 MaxDecFrameBuffering

Specifies the maximum number of frames buffered in a DPB. A value of zero means unspecified.

mfxU16 AUDelimiter

Set this flag to insert the Access Unit Delimiter NAL. See the CodingOptionValue enumerator for values of this option.

mfxU16 PicTimingSEI

Set this flag to insert the picture timing SEI with pic_struct syntax element. See sub-clauses D.1.2 and D.2.2 of the ISO/IEC 14496-10 specification for the definition of this syntax element. See the CodingOptionValue enumerator for values of this option. The default value is ON.

mfxU16 VuiNalHrdParameters

Set this flag to insert NAL HRD parameters in the VUI header. See the CodingOptionValue enumerator for values of this option.

mfxExtCodingOption2
struct mfxExtCodingOption2

The mfxExtCodingOption2 structure together with mfxExtCodingOption structure specifies additional options for encoding.

The application can attach this extended buffer to the mfxVideoParam structure to configure initialization and to the mfxEncodeCtrl during runtime.

Public Members

mfxExtBuffer Header

Extension buffer header. Header.BufferId must be equal to MFX_EXTBUFF_CODING_OPTION2.

mfxU16 IntRefType

Specifies intra refresh type. See the IntraRefreshTypes enumerator. The major goal of intra refresh is improvement of error resilience without the significant impact on encoded bitstream size caused by I frames. The SDK encoder achieves this by encoding part of each frame in the refresh cycle using intra MBs. MFX_REFRESH_NO means no refresh. MFX_REFRESH_VERTICAL means vertical refresh, by columns of MBs. MFX_REFRESH_HORIZONTAL means horizontal refresh, by rows of MBs. MFX_REFRESH_SLICE means horizontal refresh by slices without overlapping. In case of MFX_REFRESH_SLICE the SDK ignores IntRefCycleSize (the size of the refresh cycle equals the number of slices). This parameter is valid during initialization and runtime. When used with temporal scalability, intra refresh is applied only to the base layer.

mfxU16 IntRefCycleSize

Specifies number of pictures within refresh cycle starting from 2. 0 and 1 are invalid values. This parameter is valid only during initialization.

mfxI16 IntRefQPDelta

Specifies QP difference for inserted intra MBs. This is signed value in [-51, 51] range. This parameter is valid during initialization and runtime.

mfxU32 MaxFrameSize

Specify the maximum encoded frame size in bytes. This parameter is used in VBR-based bitrate control modes and ignored in others. The SDK encoder tries to keep the frame size below the specified limit, but minor overshoots are possible to preserve visual quality. This parameter is valid during initialization and runtime. It is recommended to set MaxFrameSize to 5x-10x the target frame size ((TargetKbps*1000)/(8* FrameRateExtN/FrameRateExtD)) for I frames and 2x-4x the target frame size for P/B frames.
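
A worked instance of the sizing guidance above (illustrative numbers; co2 stands for the mfxExtCodingOption2 buffer being configured):

mfxU32 target_kbps = 4000, fps = 30;
mfxU32 avg_frame_bytes = (target_kbps * 1000) / (8 * fps);  /* about 16,666 bytes per frame */
co2.MaxFrameSize = 8 * avg_frame_bytes;                     /* within the suggested 5x-10x band for I frames */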

mfxU32 MaxSliceSize

Specify the maximum slice size in bytes. If this parameter is specified, other controls over the number of slices are ignored.

Note

Not all codecs and SDK implementations support this value. Use Query function to check if this feature is supported.

mfxU16 BitrateLimit

Modifies the bitrate to be in the range imposed by the SDK encoder. Setting this flag OFF may lead to violation of HRD conformance. Note that specifying a bitrate below the SDK encoder range might significantly affect quality. If ON, this option takes effect in non-CQP modes: if TargetKbps is not in the range imposed by the SDK encoder, it will be changed to be in the range. See the CodingOptionValue enumerator for values of this option. The default value is ON, i.e. the bitrate is limited. This parameter is valid only during initialization. The flag works with MFX_CODEC_AVC only; it is ignored with other codecs.

mfxU16 MBBRC

Setting this flag enables macroblock level bitrate control that generally improves subjective visual quality. Enabling this flag may have negative impact on performance and objective visual quality metric. See the CodingOptionValue enumerator for values of this option. The default value depends on target usage settings.

mfxU16 ExtBRC

Turn ON this option to enable external BRC. See the CodingOptionValue enumerator for values of this option. Use Query function to check if this feature is supported.

mfxU16 LookAheadDepth

Specifies the depth of look ahead rate control algorithm. It is the number of frames that SDK encoder analyzes before encoding. Valid value range is from 10 to 100 inclusive. To instruct the SDK encoder to use the default value the application should zero this field.

mfxU16 Trellis

This option is used to control trellis quantization in AVC encoder. See TrellisControl enumerator for possible values of this option. This parameter is valid only during initialization.

mfxU16 RepeatPPS

This flag controls picture parameter set repetition in AVC encoder. Turn ON this flag to repeat PPS with each frame. See the CodingOptionValue enumerator for values of this option. The default value is ON. This parameter is valid only during initialization.

mfxU16 BRefType

This option controls usage of B frames as reference. See BRefControl enumerator for possible values of this option. This parameter is valid only during initialization.

mfxU16 AdaptiveI

This flag controls insertion of I frames by the SDK encoder. Turn ON this flag to allow changing of frame type from P and B to I. This option is ignored if GopOptFlag in mfxInfoMFX structure is equal to MFX_GOP_STRICT. See the CodingOptionValue enumerator for values of this option. This parameter is valid only during initialization.

mfxU16 AdaptiveB

This flag controls changing of frame type from B to P. Turn ON this flag to allow such changing. This option is ignored if GopOptFlag in mfxInfoMFX structure is equal to MFX_GOP_STRICT. See the CodingOptionValue enumerator for values of this option. This parameter is valid only during initialization.

mfxU16 LookAheadDS

This option controls down sampling in look ahead bitrate control mode. See LookAheadDownSampling enumerator for possible values of this option. This parameter is valid only during initialization.

mfxU16 NumMbPerSlice

This option specifies suggested slice size in number of macroblocks. The SDK can adjust this number based on platform capability. If this option is specified, i.e. if it is not equal to zero, the SDK ignores mfxInfoMFX::NumSlice parameter.

mfxU16 SkipFrame

This option enables usage of the mfxEncodeCtrl::SkipFrame parameter. See the SkipFrame enumerator for values of this option.

Note

Not all codecs and SDK implementations support this value. Use Query function to check if this feature is supported.

mfxU8 MinQPI

Minimum allowed QP value for I frame types. Valid range is 1..51 inclusive. Zero means the default value, i.e. no limitations on QP.

Note

Not all codecs and SDK implementations support this value. Use Query function to check if this feature is supported.

mfxU8 MaxQPI

Maximum allowed QP value for I frame types. Valid range is 1..51 inclusive. Zero means the default value, i.e. no limitations on QP.

Note

Not all codecs and SDK implementations support this value. Use Query function to check if this feature is supported.

mfxU8 MinQPP

Minimum allowed QP value for P frame types. Valid range is 1..51 inclusive. Zero means the default value, i.e. no limitations on QP.

Note

Not all codecs and SDK implementations support this value. Use Query function to check if this feature is supported.

mfxU8 MaxQPP

Maximum allowed QP value for P frame types. Valid range is 1..51 inclusive. Zero means the default value, i.e. no limitations on QP.

Note

Not all codecs and SDK implementations support this value. Use Query function to check if this feature is supported.

mfxU8 MinQPB

Minimum allowed QP value for B frame types. Valid range is 1..51 inclusive. Zero means the default value, i.e. no limitations on QP.

Note

Not all codecs and SDK implementations support this value. Use Query function to check if this feature is supported.

mfxU8 MaxQPB

Maximum allowed QP value for B frame types. Valid range is 1..51 inclusive. Zero means the default value, i.e. no limitations on QP.

Note

Not all codecs and SDK implementations support this value. Use Query function to check if this feature is supported.

mfxU16 FixedFrameRate

This option sets fixed_frame_rate_flag in VUI.

Note

Not all codecs and SDK implementations support this value. Use Query function to check if this feature is supported.

mfxU16 DisableDeblockingIdc

This option disables deblocking.

Note

Not all codecs and SDK implementations support this value. Use Query function to check if this feature is supported.

mfxU16 DisableVUI

This option completely disables VUI in output bitstream.

Note

Not all codecs and SDK implementations support this value. Use Query function to check if this feature is supported.

mfxU16 BufferingPeriodSEI

This option controls insertion of buffering period SEI in the encoded bitstream. It should be one of the following values:

MFX_BPSEI_DEFAULT – encoder decides when to insert BP SEI,

MFX_BPSEI_IFRAME – BP SEI should be inserted with every I frame.

mfxU16 EnableMAD

Turn ON this flag to enable per-frame reporting of Mean Absolute Difference. This parameter is valid only during initialization.

mfxU16 UseRawRef

Turn ON this flag to use raw frames for reference instead of reconstructed frames. This parameter is valid during initialization and runtime (only if was turned ON during initialization).

Note

Not all codecs and SDK implementations support this value. Use Query function to check if this feature is supported.

mfxExtCodingOption3
struct mfxExtCodingOption3

The mfxExtCodingOption3 structure together with mfxExtCodingOption and mfxExtCodingOption2 structures specifies additional options for encoding. The application can attach this extended buffer to the mfxVideoParam structure to configure initialization and to the mfxEncodeCtrl during runtime.

Public Members

mfxExtBuffer Header

Extension buffer header. Header.BufferId must be equal to MFX_EXTBUFF_CODING_OPTION3.

mfxU16 NumSliceI

The number of slices for I frames.

Note

Not all codecs and SDK implementations support these values. Use Query function to check if this feature is supported

mfxU16 NumSliceP

The number of slices for P frames.

Note

Not all codecs and SDK implementations support these values. Use Query function to check if this feature is supported

mfxU16 NumSliceB

The number of slices for B frames.

Note

Not all codecs and SDK implementations support these values. Use Query function to check if this feature is supported

mfxU16 WinBRCMaxAvgKbps

When rate control method is MFX_RATECONTROL_VBR, MFX_RATECONTROL_LA, MFX_RATECONTROL_LA_HRD or MFX_RATECONTROL_QVBR this parameter specifies the maximum bitrate averaged over a sliding window specified by WinBRCSize. For MFX_RATECONTROL_CBR this parameter is ignored and equals TargetKbps.

mfxU16 WinBRCSize

When rate control method is MFX_RATECONTROL_CBR, MFX_RATECONTROL_VBR, MFX_RATECONTROL_LA, MFX_RATECONTROL_LA_HRD or MFX_RATECONTROL_QVBR this parameter specifies sliding window size in frames. Set this parameter to zero to disable sliding window.

mfxU16 QVBRQuality

When the rate control method is MFX_RATECONTROL_QVBR, this parameter specifies the quality factor. It is a value in the 1..51 range, where 1 corresponds to the best quality.

mfxU16 EnableMBQP

Turn ON this option to enable per-macroblock QP control, rate control method must be MFX_RATECONTROL_CQP. See the CodingOptionValue enumerator for values of this option. This parameter is valid only during initialization.

mfxU16 IntRefCycleDist

Distance between the beginnings of the intra-refresh cycles in frames. Zero means no distance between cycles.

mfxU16 DirectBiasAdjustment

Turn ON this option to enable the ENC mode decision algorithm to bias to fewer B Direct/Skip types. Applies only to B frames, all other frames will ignore this setting. See the CodingOptionValue enumerator for values of this option.

mfxU16 GlobalMotionBiasAdjustment

Enables global motion bias. See the CodingOptionValue enumerator for values of this option.

mfxU16 MVCostScalingFactor

Values are:

0: set MV cost to be 0

1: scale MV cost to be 1/2 of the default value

2: scale MV cost to be 1/4 of the default value

3: scale MV cost to be 1/8 of the default value

mfxU16 MBDisableSkipMap

Turn ON this option to enable usage of mfxExtMBDisableSkipMap. See the CodingOptionValue enumerator for values of this option. This parameter is valid only during initialization.

mfxU16 WeightedPred

Weighted prediction mode. See the WeightedPred enumerator for values of these options.

mfxU16 WeightedBiPred

Weighted bi-prediction mode. See the WeightedPred enumerator for values of these options.

mfxU16 AspectRatioInfoPresent

Instructs the encoder whether aspect ratio info should be present in VUI parameters. See the CodingOptionValue enumerator for values of this option.

mfxU16 OverscanInfoPresent

Instructs the encoder whether overscan info should be present in VUI parameters. See the CodingOptionValue enumerator for values of this option.

mfxU16 OverscanAppropriate

ON indicates that the cropped decoded pictures output are suitable for display using overscan. OFF indicates that the cropped decoded pictures output contain visually important information in the entire region out to the edges of the cropping rectangle of the picture. See the CodingOptionValue enumerator for values of this option.

mfxU16 TimingInfoPresent

Instructs the encoder whether frame rate info should be present in VUI parameters. See the CodingOptionValue enumerator for values of this option.

mfxU16 BitstreamRestriction

Instructs the encoder whether bitstream restriction info should be present in VUI parameters. See the CodingOptionValue enumerator for values of this option.

mfxU16 LowDelayHrd

Corresponds to AVC syntax element low_delay_hrd_flag (VUI). See the CodingOptionValue enumerator for values of this option.

mfxU16 MotionVectorsOverPicBoundaries

When set to OFF, no sample outside the picture boundaries and no sample at a fractional sample position for which the sample value is derived using one or more samples outside the picture boundaries is used for inter prediction of any sample.

When set to ON, one or more samples outside picture boundaries may be used in inter prediction.

See the CodingOptionValue enumerator for values of this option.

mfxU16 Log2MaxMvLengthHorizontal
mfxU16 Log2MaxMvLengthVertical
mfxU16 ScenarioInfo

Provides a hint to encoder about the scenario for the encoding session. See the ScenarioInfo enumerator for values of this option.

mfxU16 ContentInfo

Provides a hint to encoder about the content for the encoding session. See the ContentInfo enumerator for values of this option.

mfxU16 PRefType

When GopRefDist=1, specifies the model of reference list construction and DPB management. See the PRefType enumerator for values of this option.

mfxU16 FadeDetection

Instructs the encoder whether the internal fade detection algorithm should be used for calculation of weight/offset values for pred_weight_table, unless the application provided mfxExtPredWeightTable for this frame. See the CodingOptionValue enumerator for values of this option.

mfxI16 DeblockingAlphaTcOffset
mfxI16 DeblockingBetaOffset
mfxU16 GPB

Turn this option OFF to make HEVC encoder use regular P-frames instead of GPB. See the CodingOptionValue enumerator for values of this option.

mfxU32 MaxFrameSizeI

Same as mfxExtCodingOption2::MaxFrameSize but affects only I-frames. MaxFrameSizeI must be set if MaxFrameSizeP is set. If MaxFrameSizeI is not specified or greater than spec limitation, spec limitation will be applied to the sizes of I-frames.

mfxU32 MaxFrameSizeP

Same as mfxExtCodingOption2::MaxFrameSize but affects only P/B-frames. If MaxFrameSizeP equals 0, the SDK sets MaxFrameSizeP equal to MaxFrameSizeI. If MaxFrameSizeP is not specified or greater than spec limitation, spec limitation will be applied to the sizes of P/B-frames.

mfxU32 reserved3[3]
mfxU16 EnableQPOffset

Enables QPOffset control. See the CodingOptionValue enumerator for values of this option.

mfxI16 QPOffset[8]

When EnableQPOffset is set to ON and RateControlMethod is CQP, specifies the QP offset per pyramid layer. For B-pyramid, B-frame QP = QPB + QPOffset[layer]. For P-pyramid, P-frame QP = QPP + QPOffset[layer].

mfxU16 NumRefActiveP[8]

Max number of active references for P frames. Array index is pyramid layer.

mfxU16 NumRefActiveBL0[8]

Max number of active references for B frames in reference picture list 0. Array index is pyramid layer.

mfxU16 NumRefActiveBL1[8]

Max number of active references for B frames in reference picture list 1. Array index is pyramid layer.

mfxU16 ConstrainedIntraPredFlag
mfxU16 TransformSkip

For HEVC, if this option is turned ON, transform_skip_enabled_flag will be set to 1 in the PPS; OFF specifies that transform_skip_enabled_flag will be set to 0.

mfxU16 TargetChromaFormatPlus1

Minus 1 specifies the target encoding chroma format (see the ChromaFormatIdc enumerator), which may differ from the source format. TargetChromaFormatPlus1 = 0 means the default target chroma format, which is equal to the source format (mfxVideoParam::mfx::FrameInfo::ChromaFormat + 1), except for the RGB4 source format. For the RGB4 source format the default target chroma format is 4:2:0 (instead of 4:4:4) for the purpose of backward compatibility.

mfxU16 TargetBitDepthLuma

Target encoding bit-depth for luma samples, which may differ from the source bit-depth. 0 means the default target bit-depth, which is equal to the source (mfxVideoParam::mfx::FrameInfo::BitDepthLuma).

mfxU16 TargetBitDepthChroma

Target encoding bit-depth for chroma samples, which may differ from the source bit-depth. 0 means the default target bit-depth, which is equal to the source (mfxVideoParam::mfx::FrameInfo::BitDepthChroma).

mfxU16 BRCPanicMode

Controls panic mode in AVC and MPEG2 encoders.

mfxU16 LowDelayBRC

When the rate control method is MFX_RATECONTROL_VBR, MFX_RATECONTROL_QVBR or MFX_RATECONTROL_VCM, this parameter specifies the frame size tolerance. Set this parameter to MFX_CODINGOPTION_ON to strictly obey the average frame size set by MaxKbps, e.g. for cases when MaxFrameSize == (MaxKbps*1000)/(8* FrameRateExtN/FrameRateExtD). MaxFrameSizeI and MaxFrameSizeP can also be set separately.

mfxU16 EnableMBForceIntra

Turn ON this option to enable usage of mfxExtMBForceIntra for AVC encoder. See the CodingOptionValue enumerator for values of this option. This parameter is valid only during initialization.

mfxU16 AdaptiveMaxFrameSize

If this option is ON, BRC may decide a larger P or B frame size than MaxFrameSizeP dictates when a scene change is detected. This may benefit video quality. The AdaptiveMaxFrameSize feature is not supported with LowPower ON or if the value of MaxFrameSizeP is 0.

mfxU16 RepartitionCheckEnable

Controls AVC encoder attempts to predict from small partitions. Default value allows encoder to choose preferred mode, MFX_CODINGOPTION_ON forces encoder to favor quality, MFX_CODINGOPTION_OFF forces encoder to favor performance.

mfxU16 QuantScaleType
mfxU16 IntraVLCFormat
mfxU16 ScanType
mfxU16 EncodedUnitsInfo

Turn this option ON to make encoded units info available in mfxExtEncodedUnitsInfo.

mfxU16 EnableNalUnitType

If this option is turned ON, then HEVC encoder uses NAL unit type provided by application in mfxEncodeCtrl::MfxNalUnitType field.

Note

This parameter is valid only during initialization.

Note

Not all codecs and SDK implementations support this value. Use Query function to check if this feature is supported.

mfxU16 ExtBrcAdaptiveLTR

Turn OFF to prevent Adaptive marking of Long Term Reference Frames when using ExtBRC. When ON and using ExtBRC, encoders will mark, modify, or remove LTR frames based on encoding parameters and content properties. The application must set each input frame’s mfxFrameData::FrameOrder for correct operation of LTR.

mfxU16 reserved[163]
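
A minimal sketch of how these frame-size and low-delay controls are typically combined is shown below. It is illustrative only: the numeric values are placeholders, and session and videoParam are assumed to be the application's already-prepared session handle and mfxVideoParam.

/* Sketch: low-delay BRC with separate per-frame-type size caps.
   Values are illustrative placeholders, not recommendations. */
mfxExtCodingOption3 co3;
memset(&co3, 0, sizeof(co3));
co3.Header.BufferId = MFX_EXTBUFF_CODING_OPTION3;
co3.Header.BufferSz = sizeof(co3);
co3.LowDelayBRC   = MFX_CODINGOPTION_ON;
co3.MaxFrameSizeI = 32000;   /* bytes, cap for I-frames */
co3.MaxFrameSizeP = 8000;    /* bytes, cap for P/B-frames */

mfxExtBuffer *extBuffers[] = { &co3.Header };
videoParam.ExtParam    = extBuffers;
videoParam.NumExtParam = 1;

mfxStatus sts = MFXVideoENCODE_Init(session, &videoParam);
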
mfxExtCodingOptionSPSPPS
struct mfxExtCodingOptionSPSPPS

Attach this structure as part of the extended buffers to configure the SDK encoder during MFXVideoENCODE_Init. The sequence or picture parameters specified by this structure overwrite any such parameters specified by the structure or any other extended buffers attached therein.

For H.264, SPSBuffer and PPSBuffer must point to valid bitstreams that contain the sequence parameter set and picture parameter set, respectively. For MPEG-2, SPSBuffer must point to valid bitstreams that contain the sequence header followed by any sequence header extension. The PPSBuffer pointer is ignored. The SDK encoder imports parameters from these buffers. If the encoder does not support the specified parameters, the encoder does not initialize and returns the status code MFX_ERR_INCOMPATIBLE_VIDEO_PARAM.

Check with the MFXVideoENCODE_Query function for the support of this multiple segment encoding feature. If this feature is not supported, the query returns MFX_ERR_UNSUPPORTED.

Public Members

mfxExtBuffer Header

Extension buffer header. Header.BufferId must be equal to MFX_EXTBUFF_CODING_OPTION_SPSPPS.

mfxU8 *SPSBuffer

Pointer to a valid bitstream that contains the SPS (sequence parameter set for H.264 or sequence header followed by any sequence header extension for MPEG-2) buffer; can be NULL to skip specifying the SPS.

mfxU8 *PPSBuffer

Pointer to a valid bitstream that contains the PPS (picture parameter set for H.264 or picture header followed by any picture header extension for MPEG-2) buffer; can be NULL to skip specifying the PPS.

mfxU16 SPSBufSize

Size of the SPS in bytes

mfxU16 PPSBufSize

Size of the PPS in bytes

mfxU16 SPSId

SPS identifier; the value is reserved and must be zero.

mfxU16 PPSId

PPS identifier; the value is reserved and must be zero.
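
A hedged sketch of importing externally prepared parameter sets at initialization follows; sps_data, pps_data and their sizes are assumed to be supplied by the application.

/* Sketch: import application-provided H.264 SPS/PPS at encoder init. */
mfxExtCodingOptionSPSPPS spspps;
memset(&spspps, 0, sizeof(spspps));
spspps.Header.BufferId = MFX_EXTBUFF_CODING_OPTION_SPSPPS;
spspps.Header.BufferSz = sizeof(spspps);
spspps.SPSBuffer  = sps_data;   /* valid SPS bitstream (application-provided) */
spspps.SPSBufSize = sps_size;
spspps.PPSBuffer  = pps_data;   /* valid PPS bitstream (application-provided) */
spspps.PPSBufSize = pps_size;

mfxExtBuffer *extBuffers[] = { &spspps.Header };
videoParam.ExtParam    = extBuffers;
videoParam.NumExtParam = 1;

mfxStatus sts = MFXVideoENCODE_Init(session, &videoParam);
if (sts == MFX_ERR_INCOMPATIBLE_VIDEO_PARAM) {
    /* the encoder could not accept the supplied parameter sets */
}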

mfxExtInsertHeaders
struct mfxExtInsertHeaders

Runtime ctrl buffer for SPS/PPS insertion with current encoding frame

Public Members

mfxExtBuffer Header

Extension buffer header. Header.BufferId must be equal to MFX_EXTBUFF_INSERT_HEADERS.

mfxU16 SPS

tri-state option to insert SPS

mfxU16 PPS

tri-state option to insert PPS

mfxU16 reserved[8]
mfxExtCodingOptionVPS
struct mfxExtCodingOptionVPS

Attach this structure as part of the extended buffers to configure the SDK encoder during MFXVideoENCODE_Init. The sequence or picture parameters specified by this structure overwrite any such parameters specified by the structure or any other extended buffers attached therein.

If the encoder does not support the specified parameters, the encoder does not initialize and returns the status code MFX_ERR_INCOMPATIBLE_VIDEO_PARAM.

Check with the MFXVideoENCODE_Query function for the support of this multiple segment encoding feature. If this feature is not supported, the query returns MFX_ERR_UNSUPPORTED.

Public Members

mfxExtBuffer Header

Extension buffer header. Header.BufferId must be equal to MFX_EXTBUFF_CODING_OPTION_VPS.

mfxU8 *VPSBuffer

Pointer to a valid bitstream that contains the VPS (video parameter set for HEVC) buffer.

mfxU16 VPSBufSize

Size of the VPS in bytes

mfxU16 VPSId

VPS identifier; the value is reserved and must be zero.

mfxExtThreadsParam
struct mfxExtThreadsParam

Attached to the mfxInitParam structure during the SDK session initialization, mfxExtThreadsParam structure specifies options for threads created by this session.

Public Members

mfxExtBuffer Header

Must be MFX_EXTBUFF_THREADS_PARAM

mfxU16 NumThread

The number of threads.

mfxI32 SchedulingType

Scheduling policy for all threads.

mfxI32 Priority

Priority for all threads.

mfxU16 reserved[55]

Reserved for future use
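
A minimal sketch of limiting the number of session threads follows; the thread count and the requested API version are placeholders chosen for illustration.

/* Sketch: request a session with at most two internal threads. */
mfxExtThreadsParam threadsPar;
memset(&threadsPar, 0, sizeof(threadsPar));
threadsPar.Header.BufferId = MFX_EXTBUFF_THREADS_PARAM;
threadsPar.Header.BufferSz = sizeof(threadsPar);
threadsPar.NumThread = 2;                       /* placeholder value */

mfxExtBuffer *initExtBuffers[] = { &threadsPar.Header };

mfxInitParam initPar;
memset(&initPar, 0, sizeof(initPar));
initPar.Implementation = MFX_IMPL_HARDWARE_ANY;
initPar.Version.Major  = 1;                     /* placeholder API version */
initPar.Version.Minor  = 35;
initPar.ExtParam       = initExtBuffers;
initPar.NumExtParam    = 1;

mfxSession session;
mfxStatus sts = MFXInitEx(initPar, &session);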

mfxExtVideoSignalInfo
struct mfxExtVideoSignalInfo

The mfxExtVideoSignalInfo structure defines the video signal information.

For H.264, see Annex E of the ISO/IEC 14496-10 specification for the definition of these parameters.

For MPEG-2, see section 6.3.6 of the ITU* H.262 specification for the definition of these parameters. The field VideoFullRange is ignored.

For VC-1, see section 6.1.14.5 of the SMPTE* 421M specification. The fields VideoFormat and VideoFullRange are ignored.

Note

If ColourDescriptionPresent is zero, the color description information (including ColourPrimaries, TransferCharacteristics, and MatrixCoefficients) is not present in the bitstream.

Public Members

mfxExtBuffer Header

Extension buffer header. Header.BufferId must be equal to MFX_EXTBUFF_VIDEO_SIGNAL_INFO.

mfxU16 VideoFormat
mfxU16 VideoFullRange
mfxU16 ColourDescriptionPresent
mfxU16 ColourPrimaries
mfxU16 TransferCharacteristics
mfxU16 MatrixCoefficients
mfxExtAVCRefListCtrl
struct mfxExtAVCRefListCtrl

The mfxExtAVCRefListCtrl structure configures reference frame options for the H.264 encoder. See Reference List Selection and Long-term Reference frame chapters for more details.

mfxExtAVCRefListCtrl::PreferredRefList Specify list of frames that should be used to predict the current frame.

Note

Not all implementations of the SDK encoder support LongTermIdx and ApplyLongTermIdx fields in this structure. The application has to use query mode 1 to determine if such functionality is supported. To do so, the application has to attach this extended buffer to mfxVideoParam structure and call MFXVideoENCODE_Query function. If function returns MFX_ERR_NONE and these fields were set to one, then such functionality is supported. If function fails or sets fields to zero then this functionality is not supported.

mfxExtAVCRefListCtrl::RejectedRefList Specify list of frames that should not be used for prediction.

mfxExtAVCRefListCtrl::LongTermRefList Specify list of frames that should be marked as long-term reference frame.

Public Members

mfxExtBuffer Header

Extension buffer header. Header.BufferId must be equal to MFX_EXTBUFF_AVC_REFLIST_CTRL.

mfxU16 NumRefIdxL0Active

Specify the number of reference frames in the active reference list L0. This number should be less or equal to the NumRefFrame parameter from encoding initialization.

mfxU16 NumRefIdxL1Active

Specify the number of reference frames in the active reference list L1. This number should be less or equal to the NumRefFrame parameter from encoding initialization.

mfxU32 FrameOrder

Together FrameOrder and PicStruct fields are used to identify reference picture. Use FrameOrder = MFX_FRAMEORDER_UNKNOWN to mark unused entry.

mfxU16 PicStruct

Together FrameOrder and PicStruct fields are used to identify reference picture. Use FrameOrder = MFX_FRAMEORDER_UNKNOWN to mark unused entry.

mfxU16 ViewId

Reserved and must be zero.

mfxU16 LongTermIdx

Index that should be used by the SDK encoder to mark long-term reference frame.

mfxU16 reserved[3]

Reserved

mfxU16 ApplyLongTermIdx

If it is equal to zero, the SDK encoder assigns long-term index according to internal algorithm. If it is equal to one, the SDK encoder uses LongTermIdx value as long-term index.
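
A hedged sketch of marking a previously encoded frame as a long-term reference follows. The list array sizes and ltr_frame_order are assumptions made for illustration; the list member names are those described above.

/* Sketch: request long-term marking of the frame with FrameOrder == ltr_frame_order.
   Array sizes below are assumptions for illustration. */
mfxExtAVCRefListCtrl refCtrl;
memset(&refCtrl, 0, sizeof(refCtrl));
refCtrl.Header.BufferId = MFX_EXTBUFF_AVC_REFLIST_CTRL;
refCtrl.Header.BufferSz = sizeof(refCtrl);

for (int i = 0; i < 32; i++) refCtrl.PreferredRefList[i].FrameOrder = MFX_FRAMEORDER_UNKNOWN;
for (int i = 0; i < 16; i++) refCtrl.RejectedRefList[i].FrameOrder  = MFX_FRAMEORDER_UNKNOWN;
for (int i = 0; i < 16; i++) refCtrl.LongTermRefList[i].FrameOrder  = MFX_FRAMEORDER_UNKNOWN;

refCtrl.LongTermRefList[0].FrameOrder = ltr_frame_order;        /* application-tracked value */
refCtrl.LongTermRefList[0].PicStruct  = MFX_PICSTRUCT_PROGRESSIVE;
refCtrl.ApplyLongTermIdx              = 0;  /* let the encoder assign the long-term index */

mfxExtBuffer *ctrlExtBuffers[] = { &refCtrl.Header };
encodeCtrl.ExtParam    = ctrlExtBuffers;    /* mfxEncodeCtrl for the current frame */
encodeCtrl.NumExtParam = 1;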

mfxExtMasteringDisplayColourVolume
struct mfxExtMasteringDisplayColourVolume

The mfxExtMasteringDisplayColourVolume structure configures the HDR SEI message. If the application attaches this structure to the mfxEncodeCtrl structure at runtime, the encoder inserts the HDR SEI message for the current frame and ignores InsertPayloadToggle. If the application attaches this structure to the mfxVideoParam structure during initialization or reset, the encoder inserts the HDR SEI message based on InsertPayloadToggle. Field semantics are defined in ITU-T* H.265 Annex D.

Public Members

mfxExtBuffer Header

Extension buffer header. Header.BufferId must be equal to MFX_EXTBUFF_MASTERING_DISPLAY_COLOUR_VOLUME.

mfxU16 InsertPayloadToggle

InsertHDRPayload enumerator value.

mfxU16 DisplayPrimariesX[3]

Color primaries for a video source in increments of 0.00002. Consist of RGB x coordinates and define how to convert colors from RGB color space to CIE XYZ color space. These fields belong to the [0..50000] range.

mfxU16 DisplayPrimariesY[3]

Color primaries for a video source in increments of 0.00002. Consist of RGB y coordinates and define how to convert colors from RGB color space to CIE XYZ color space. These fields belong to the [0..50000] range.

mfxU16 WhitePointX

White point X coordinate.

mfxU16 WhitePointY

White point Y coordinate.

mfxU32 MaxDisplayMasteringLuminance

Specify maximum luminance of the display on which the content was authored in units of 0.00001 candelas per square meter. These fields belong to the [1..65535] range.

mfxU32 MinDisplayMasteringLuminance

Specify minimum luminance of the display on which the content was authored in units of 0.00001 candelas per square meter. These fields belong to the [1..65535] range.
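
A hedged sketch of signaling mastering display metadata at initialization follows. MFX_PAYLOAD_IDR is assumed from the InsertHDRPayload enumerator, and all numeric values are placeholders, not measured display data.

/* Sketch: insert mastering display colour volume SEI starting at IDR frames. */
mfxExtMasteringDisplayColourVolume mdcv;
memset(&mdcv, 0, sizeof(mdcv));
mdcv.Header.BufferId = MFX_EXTBUFF_MASTERING_DISPLAY_COLOUR_VOLUME;
mdcv.Header.BufferSz = sizeof(mdcv);
mdcv.InsertPayloadToggle = MFX_PAYLOAD_IDR;         /* assumed enumerator value */

/* placeholder chromaticity values, in increments of 0.00002, range [0..50000] */
mdcv.DisplayPrimariesX[0] = 8500;  mdcv.DisplayPrimariesY[0] = 39850;
mdcv.DisplayPrimariesX[1] = 6550;  mdcv.DisplayPrimariesY[1] = 2300;
mdcv.DisplayPrimariesX[2] = 35400; mdcv.DisplayPrimariesY[2] = 14600;
mdcv.WhitePointX = 15635;
mdcv.WhitePointY = 16450;
mdcv.MaxDisplayMasteringLuminance = 65535;          /* placeholder, see range above */
mdcv.MinDisplayMasteringLuminance = 1;

mfxExtBuffer *extBuffers[] = { &mdcv.Header };
videoParam.ExtParam    = extBuffers;
videoParam.NumExtParam = 1;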

mfxExtContentLightLevelInfo
struct mfxExtContentLightLevelInfo

The mfxExtContentLightLevelInfo structure configures the HDR SEI message. If the application attaches this structure to the mfxEncodeCtrl structure at runtime, the encoder inserts the HDR SEI message for the current frame and ignores InsertPayloadToggle. If the application attaches this structure to the mfxVideoParam structure during initialization or reset, the encoder inserts the HDR SEI message based on InsertPayloadToggle. Field semantics are defined in ITU-T* H.265 Annex D.

Public Members

mfxExtBuffer Header

Extension buffer header. Header.BufferId must be equal to EXTBUFF_CONTENT_LIGHT_LEVEL_INFO.

mfxU16 InsertPayloadToggle

InsertHDRPayload enumerator value.

mfxU16 MaxContentLightLevel

Maximum luminance level of the content. The field belongs to the [1..65535] range.

mfxU16 MaxPicAverageLightLevel

Maximum average per-frame luminance level of the content. The field belongs to the [1..65535] range.

mfxExtPictureTimingSEI
struct mfxExtPictureTimingSEI

The mfxExtPictureTimingSEI structure configures the H.264 picture timing SEI message. The encoder ignores it if HRD information in stream is absent and PicTimingSEI option in mfxExtCodingOption structure is turned off. See mfxExtCodingOption for details.

If the application attaches this structure to the mfxVideoParam structure during initialization, the encoder inserts the picture timing SEI message based on provided template in every access unit of coded bitstream.

If application attaches this structure to the mfxEncodeCtrl structure at runtime, the encoder inserts the picture timing SEI message based on provided template in access unit that represents current frame.

These parameters define the picture timing information. An invalid value of 0xFFFF indicates that application does not set the value and encoder must calculate it.

See Annex D of the ISO/IEC 14496-10 specification for the definition of these parameters.

Public Members

mfxExtBuffer Header

Extension buffer header. Header.BufferId must be equal to MFX_EXTBUFF_PICTURE_TIMING_SEI.

mfxU32 reserved[14]
mfxU16 ClockTimestampFlag
mfxU16 CtType
mfxU16 NuitFieldBasedFlag
mfxU16 CountingType
mfxU16 FullTimestampFlag
mfxU16 DiscontinuityFlag
mfxU16 CntDroppedFlag
mfxU16 NFrames
mfxU16 SecondsFlag
mfxU16 MinutesFlag
mfxU16 HoursFlag
mfxU16 SecondsValue
mfxU16 MinutesValue
mfxU16 HoursValue
mfxU32 TimeOffset
struct mfxExtPictureTimingSEI::[anonymous] TimeStamp[3]
mfxExtAvcTemporalLayers
struct mfxExtAvcTemporalLayers

The mfxExtAvcTemporalLayers structure configures the H.264 temporal layers hierarchy. If application attaches it to the mfxVideoParam structure during initialization, the SDK encoder generates the temporal layers and inserts the prefix NAL unit before each slice to indicate the temporal and priority IDs of the layer.

This structure can be used with the display-order encoding mode only.

Public Members

mfxExtBuffer Header

Extension buffer header. Header.BufferId must be equal to MFX_EXTBUFF_AVC_TEMPORAL_LAYERS.

mfxU16 BaseLayerPID

The priority ID of the base layer; the SDK encoder increases the ID for each temporal layer and writes to the prefix NAL unit.

mfxU16 Scale

The ratio between the frame rates of the current temporal layer and the base layer.

mfxExtEncoderCapability
struct mfxExtEncoderCapability

The mfxExtEncoderCapability structure is used to retrieve SDK encoder capability. See description of mode 4 of the MFXVideoENCODE_Query function for details how to use this structure.

Note

Not all implementations of the SDK encoder support this extended buffer. The application has to use query mode 1 to determine if such functionality is supported. To do so, the application has to attach this extended buffer to mfxVideoParam structure and call MFXVideoENCODE_Query function. If function returns MFX_ERR_NONE then such functionality is supported.

Public Members

mfxExtBuffer Header

Extension buffer header. Header.BufferId must be equal to MFX_EXTBUFF_ENCODER_CAPABILITY.

mfxU32 MBPerSec

Specify the maximum processing rate in macro blocks per second.

mfxExtEncoderResetOption
struct mfxExtEncoderResetOption

The mfxExtEncoderResetOption structure is used to control the SDK encoder behavior during reset. By using this structure, the application instructs the SDK encoder to start new coded sequence after reset or continue encoding of current sequence.

This structure is also used in mode 3 of the MFXVideoENCODE_Query function to check the reset outcome before the actual reset. The application should set StartNewSequence to the required behavior and call the query function. If the query fails (see status codes below), then such a reset is not possible in the current encoder state. If the application sets StartNewSequence to MFX_CODINGOPTION_UNKNOWN, then the query function replaces it with the actual reset type: MFX_CODINGOPTION_ON if the SDK encoder will begin a new sequence after reset, or MFX_CODINGOPTION_OFF if the SDK encoder will continue the current sequence.

Using this structure may cause the following status codes from the MFXVideoENCODE_Reset and MFXVideoENCODE_Query functions:

  • MFX_ERR_INVALID_VIDEO_PARAM - if such reset is not possible. For example, the application sets StartNewSequence to off and requests resolution change.

  • MFX_ERR_INCOMPATIBLE_VIDEO_PARAM - if the application requests change that leads to memory allocation. For example, the application set StartNewSequence to on and requests resolution change to bigger than initialization value.

  • MFX_ERR_NONE - if such reset is possible.

There is limited list of parameters that can be changed without starting a new coded sequence:

  • Bitrate parameters, TargetKbps and MaxKbps in the mfxInfoMFX structure.

  • Number of slices, NumSlice in the mfxInfoMFX structure. Number of slices should be equal or less than number of slices during initialization.

  • Number of temporal layers in mfxExtAvcTemporalLayers structure. Reset should be called immediately before encoding of frame from base layer and number of reference frames should be big enough for new temporal layers structure.

  • Quantization parameters, QPI, QPP and QPB in the mfxInfoMFX structure.

As described in the Configuration Change chapter, the application should retrieve all cached frames before calling reset. When the query function checks the reset outcome, it expects this requirement to be satisfied. If it is not, and there are cached frames inside the SDK encoder, then the query result may differ from the actual reset behavior, because the SDK encoder may insert an IDR frame to produce a valid coded sequence.

See also Appendix ‘Streaming and Video Conferencing Features’.

Note

Not all implementations of the SDK encoder support this extended buffer. The application has to use query mode 1 to determine if such functionality is supported. To do so, the application has to attach this extended buffer to mfxVideoParam structure and call MFXVideoENCODE_Query function. If function returns MFX_ERR_NONE then such functionality is supported.

Public Members

mfxExtBuffer Header

Extension buffer header. Header.BufferId must be equal to MFX_EXTBUFF_ENCODER_RESET_OPTION.

mfxU16 StartNewSequence

Instructs encoder to start new sequence after reset. It is one of the CodingOptionValue options:

MFX_CODINGOPTION_ON – the SDK encoder completely resets internal state and begins a new coded sequence after reset, including insertion of an IDR frame, sequence and picture headers.

MFX_CODINGOPTION_OFF – the SDK encoder continues encoding of current coded sequence after reset, without insertion of IDR frame.

MFX_CODINGOPTION_UNKNOWN – depending on the current encoder state and changes in configuration parameters the SDK encoder may or may not start new coded sequence. This value is also used to query reset outcome.
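
A minimal sketch of checking the reset outcome before the actual reset follows; newPar is assumed to be a copy of the current mfxVideoParam with the intended changes already applied.

/* Sketch: ask the encoder what a reset with newPar would do. */
mfxExtEncoderResetOption resetOpt;
memset(&resetOpt, 0, sizeof(resetOpt));
resetOpt.Header.BufferId  = MFX_EXTBUFF_ENCODER_RESET_OPTION;
resetOpt.Header.BufferSz  = sizeof(resetOpt);
resetOpt.StartNewSequence = MFX_CODINGOPTION_UNKNOWN;   /* query the actual reset type */

mfxExtBuffer *extBuffers[] = { &resetOpt.Header };
newPar.ExtParam    = extBuffers;
newPar.NumExtParam = 1;

mfxStatus sts = MFXVideoENCODE_Query(session, &newPar, &newPar);
if (sts == MFX_ERR_NONE) {
    if (resetOpt.StartNewSequence == MFX_CODINGOPTION_ON) {
        /* reset would start a new coded sequence (IDR frame, new headers) */
    } else {
        /* reset would continue the current coded sequence */
    }
    sts = MFXVideoENCODE_Reset(session, &newPar);
}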

mfxExtAVCEncodedFrameInfo
struct mfxExtAVCEncodedFrameInfo

The mfxExtAVCEncodedFrameInfo is used by the SDK encoder to report additional information about encoded picture. The application can attach this buffer to the mfxBitstream structure before calling MFXVideoENCODE_EncodeFrameAsync function. For interlaced content the SDK encoder requires two such structures. They correspond to fields in encoded order.

Note

Not all implementations of the SDK encoder support this extended buffer. The application has to use query mode 1 to determine if such functionality is supported. To do so, the application has to attach this extended buffer to mfxVideoParam structure and call MFXVideoENCODE_Query function. If function returns MFX_ERR_NONE then such functionality is supported.

Public Members

mfxExtBuffer Header

Extension buffer header. Header.BufferId must be equal to MFX_EXTBUFF_ENCODED_FRAME_INFO.

mfxU32 FrameOrder

Frame order of the encoded picture; for entries in the used reference lists, frame order of the reference picture.

mfxU16 PicStruct

Picture structure of the encoded picture; for entries in the used reference lists, picture structure of the reference picture.

mfxU16 LongTermIdx

Long term index of the encoded picture, if applicable; for entries in the used reference lists, long term index of the reference picture.

mfxU32 MAD

Mean Absolute Difference between original pixels of the frame and motion compensated (for inter macroblocks) or spatially predicted (for intra macroblocks) pixels. Only luma component, Y plane, is used in calculation.

mfxU16 BRCPanicMode

Bitrate control was not able to allocate enough bits for this frame. Frame quality may be unacceptably low.

mfxU16 QP

Luma QP.

mfxU32 SecondFieldOffset

Offset to second field. Second field starts at mfxBitstream::Data + mfxBitstream::DataOffset + mfxExtAVCEncodedFrameInfo::SecondFieldOffset.

struct mfxExtAVCEncodedFrameInfo::[anonymous] UsedRefListL0[32]
struct mfxExtAVCEncodedFrameInfo::[anonymous] UsedRefListL1[32]

Reference lists that have been used to encode picture.

mfxExtEncoderROI
struct mfxExtEncoderROI

The mfxExtEncoderROI structure is used by the application to specify different Region Of Interests during encoding. It may be used at initialization or at runtime.

Public Members

mfxExtBuffer Header

Extension buffer header. Header.BufferId must be equal to MFX_EXTBUFF_ENCODER_ROI.

mfxU16 NumROI

Number of ROI descriptions in array. The Query function mode 2 returns maximum supported value (set it to 256 and Query will update it to maximum supported value).

mfxU16 ROIMode

QP adjustment mode for ROIs. Defines if Priority or DeltaQP is used during encoding.

mfxU32 Left

Left ROI’s coordinate.

mfxU32 Top

Top ROI’s coordinate.

mfxU32 Right

Right ROI’s coordinate.

mfxU32 Bottom

Bottom ROI’s coordinate.

mfxI16 DeltaQP

Delta QP of ROI. Used if ROIMode = MFX_ROI_MODE_QP_DELTA. This is a value in the -51…51 range, which will be added to the MB QP. A lower value produces better quality.

struct mfxExtEncoderROI::[anonymous] ROI[256]

ROI location rectangle. The ROI rectangle definition uses end-point exclusive notation. In other words, the pixel with (Right, Bottom) coordinates lies immediately outside of the ROI. Left, Top, Right, Bottom should be aligned to codec-specific block boundaries (divisible by 16 for AVC, or by 32 for HEVC). Every ROI with unaligned coordinates will be expanded by the SDK to the minimal-area block-aligned ROI enclosing the original one. For example, the ROI (5, 5, 15, 31) will be expanded to (0, 0, 16, 32) for the AVC encoder, or to (0, 0, 32, 32) for HEVC. Array of ROIs. Different ROIs may overlap each other. If a macroblock belongs to several ROIs, the Priority of the ROI with the lowest index is used.
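
A minimal sketch of a single quality-boosting ROI follows; the coordinates are placeholders aligned to 16-pixel boundaries (AVC), and encodeCtrl is the per-frame mfxEncodeCtrl.

/* Sketch: lower the QP inside one rectangle to improve its quality. */
mfxExtEncoderROI roi;
memset(&roi, 0, sizeof(roi));
roi.Header.BufferId = MFX_EXTBUFF_ENCODER_ROI;
roi.Header.BufferSz = sizeof(roi);
roi.NumROI  = 1;
roi.ROIMode = MFX_ROI_MODE_QP_DELTA;
roi.ROI[0].Left    = 0;
roi.ROI[0].Top     = 0;
roi.ROI[0].Right   = 320;     /* exclusive right edge */
roi.ROI[0].Bottom  = 240;     /* exclusive bottom edge */
roi.ROI[0].DeltaQP = -5;      /* negative delta -> better quality inside the ROI */

mfxExtBuffer *ctrlExtBuffers[] = { &roi.Header };
encodeCtrl.ExtParam    = ctrlExtBuffers;
encodeCtrl.NumExtParam = 1;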

mfxExtEncoderIPCMArea
struct mfxExtEncoderIPCMArea

Public Members

mfxExtBuffer Header

Extension buffer header. Header.BufferId must be equal to MFX_EXTBUFF_ENCODER_IPCM_AREA.

mfxU32 Left

Left Area’s coordinate.

mfxU32 Top

Top Area’s coordinate.

mfxU32 Right

Right Area’s coordinate.

mfxU32 Bottom

Bottom Area’s coordinate.

struct mfxExtEncoderIPCMArea::[anonymous] Area[64]

Array of areas.

mfxExtAVCRefLists
struct mfxExtAVCRefLists

The mfxExtAVCRefLists structure specifies reference lists for the SDK encoder. It may be used together with the mfxExtAVCRefListCtrl structure to create customized reference lists. If both structures are used together, then the SDK encoder takes reference lists from mfxExtAVCRefLists structure and modifies them according to the mfxExtAVCRefListCtrl instructions. In case of interlaced coding, the first mfxExtAVCRefLists structure affects TOP field and the second – BOTTOM field.

Note

Not all implementations of the SDK encoder support this structure. The application has to use the query function to determine if it is supported.

Public Members

mfxExtBuffer Header

Extension buffer header. Header.BufferId must be equal to MFX_EXTBUFF_AVC_REFLISTS.

mfxU16 NumRefIdxL0Active

Specify the number of reference frames in the active reference list L0. This number should be less or equal to the NumRefFrame parameter from encoding initialization.

mfxU16 NumRefIdxL1Active

Specify the number of reference frames in the active reference list L1. This number should be less or equal to the NumRefFrame parameter from encoding initialization.

struct mfxExtAVCRefLists::mfxRefPic RefPicList0[32]
struct mfxExtAVCRefLists::mfxRefPic RefPicList1[32]

Specify L0 and L1 reference lists.

struct mfxRefPic

Public Members

mfxU32 FrameOrder

Together these fields are used to identify reference picture. Use FrameOrder = MFX_FRAMEORDER_UNKNOWN to mark unused entry. Use PicStruct = MFX_PICSTRUCT_FIELD_TFF for TOP field, PicStruct = MFX_PICSTRUCT_FIELD_BFF for BOTTOM field.

mfxU16 PicStruct

Together these fields are used to identify reference picture. Use FrameOrder = MFX_FRAMEORDER_UNKNOWN to mark unused entry. Use PicStruct = MFX_PICSTRUCT_FIELD_TFF for TOP field, PicStruct = MFX_PICSTRUCT_FIELD_BFF for BOTTOM field.

mfxExtChromaLocInfo
struct mfxExtChromaLocInfo

The mfxExtChromaLocInfo structure defines the location of chroma samples information.

Members of this structure define the location of chroma samples information.

See Annex E of the ISO/IEC 14496-10 specification for the definition of these parameters.

Public Members

mfxExtBuffer Header

Extension buffer header. Header.BufferId must be equal to MFX_EXTBUFF_CHROMA_LOC_INFO.

mfxU16 ChromaLocInfoPresentFlag
mfxU16 ChromaSampleLocTypeTopField
mfxU16 ChromaSampleLocTypeBottomField
mfxU16 reserved[9]
mfxExtMBForceIntra
struct mfxExtMBForceIntra

The mfxExtMBForceIntra structure specifies macroblock map for current frame which forces specified macroblocks to be encoded as Intra if mfxExtCodingOption3::EnableMBForceIntra was turned ON during encoder initialization. The application can attach this extended buffer to the mfxEncodeCtrl during runtime.

Public Members

mfxExtBuffer Header

Extension buffer header. Header.BufferId must be equal to MFX_EXTBUFF_MB_FORCE_INTRA.

mfxU32 MapSize

Macroblock map size.

mfxU8 *Map

Pointer to a list of force intra macroblock flags in raster scan order. Each flag is one byte in map. Set flag to 1 to force corresponding macroblock to be encoded as intra. In case of interlaced encoding, the first half of map affects top field and the second – bottom field.

mfxExtMBQP
struct mfxExtMBQP

The mfxExtMBQP structure specifies per-macroblock QP for current frame if mfxExtCodingOption3::EnableMBQP was turned ON during encoder initialization. The application can attach this extended buffer to the mfxEncodeCtrl during runtime.

Public Members

mfxExtBuffer Header

Extension buffer header. Header.BufferId must be equal to MFX_EXTBUFF_MBQP.

mfxU16 Mode

Defines QP update mode. See MBQPMode enumerator for more details.

mfxU16 BlockSize

QP block size, valid for HEVC only during Init and Runtime.

mfxU32 NumQPAlloc

Size of the QP or DeltaQP array allocated by the application.

mfxU8 *QP

Pointer to a list of per-macroblock QP in raster scan order. In case of interlaced encoding the first half of QP array affects top field and the second – bottom field. Valid when Mode = MFX_MBQP_MODE_QP_VALUE

For AVC valid range is 1..51.

For HEVC valid range is 1..51. QP values provided by the application should be valid; otherwise invalid QP values may cause undefined behavior. The MBQP map should be aligned to a 16x16 block size, i.e. it should cover ((width + 15) / 16) x ((height + 15) / 16) blocks.

For MPEG2, QP corresponds to quantizer_scale of the ISO/IEC 13818-2 specification and has a valid range of 1..112.

mfxI8 *DeltaQP

Pointer to a list of per-macroblock QP deltas in raster scan order. For block i: QP[i] = BrcQP[i] + DeltaQP[i]. Valid when Mode = MFX_MBQP_MODE_QP_DELTA.

mfxQPandMode *QPmode

Block-granularity modes when MFX_MBQP_MODE_QP_ADAPTIVE is set.
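
A hedged sketch of supplying a per-macroblock QP map at runtime follows. It assumes mfxExtCodingOption3::EnableMBQP was set to ON at initialization, that Width and Height are the frame dimensions, and that the usual C headers (stdlib.h, string.h) are included.

/* Sketch: flat QP map for an AVC frame; selected blocks can be overridden later. */
mfxU32 mbWidth  = (Width  + 15) / 16;
mfxU32 mbHeight = (Height + 15) / 16;
mfxU32 numMB    = mbWidth * mbHeight;

mfxU8 *qpMap = (mfxU8 *)malloc(numMB);   /* application-owned, freed by the application */
memset(qpMap, 30, numMB);                /* placeholder base QP for the whole frame */
/* ...override individual qpMap[] entries for regions of interest... */

mfxExtMBQP mbqp;
memset(&mbqp, 0, sizeof(mbqp));
mbqp.Header.BufferId = MFX_EXTBUFF_MBQP;
mbqp.Header.BufferSz = sizeof(mbqp);
mbqp.Mode       = MFX_MBQP_MODE_QP_VALUE;
mbqp.NumQPAlloc = numMB;
mbqp.QP         = qpMap;

mfxExtBuffer *ctrlExtBuffers[] = { &mbqp.Header };
encodeCtrl.ExtParam    = ctrlExtBuffers;
encodeCtrl.NumExtParam = 1;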

mfxExtHEVCTiles
struct mfxExtHEVCTiles

The mfxExtHEVCTiles structure configures tiles options for the HEVC encoder. The application can attach this extended buffer to the mfxVideoParam structure to configure initialization.

Public Members

mfxExtBuffer Header

Extension buffer header. Header.BufferId must be equal to MFX_EXTBUFF_HEVC_TILES.

mfxU16 NumTileRows

Number of tile rows.

mfxU16 NumTileColumns

Number of tile columns.

mfxExtMBDisableSkipMap
struct mfxExtMBDisableSkipMap

The mfxExtMBDisableSkipMap structure specifies macroblock map for current frame which forces specified macroblocks to be non skip if mfxExtCodingOption3::MBDisableSkipMap was turned ON during encoder initialization. The application can attach this extended buffer to the mfxEncodeCtrl during runtime.

Public Members

mfxExtBuffer Header

Extension buffer header. Header.BufferId must be equal to MFX_EXTBUFF_MB_DISABLE_SKIP_MAP.

mfxU32 MapSize

Macroblock map size.

mfxU8 *Map

Pointer to a list of non-skip macroblock flags in raster scan order. Each flag is one byte in map. Set flag to 1 to force corresponding macroblock to be non-skip. In case of interlaced encoding the first half of map affects top field and the second – bottom field.

mfxExtHEVCParam
struct mfxExtHEVCParam

Public Members

mfxExtBuffer Header

Extension buffer header. Header.BufferId must be equal to MFX_EXTBUFF_HEVC_PARAM.

mfxU16 PicWidthInLumaSamples

Specifies the width of each coded picture in units of luma samples.

mfxU16 PicHeightInLumaSamples

Specifies the height of each coded picture in units of luma samples.

mfxU64 GeneralConstraintFlags

Additional flags to specify exact profile/constraints. See the GeneralConstraintFlags enumerator for values of this field.

mfxU16 SampleAdaptiveOffset

Controls SampleAdaptiveOffset encoding feature. See enum SampleAdaptiveOffset for supported values (bit-ORed). Valid during encoder Init and Runtime.

mfxU16 LCUSize

Specifies largest coding unit size (max luma coding block). Valid during encoder Init.

mfxExtDecodeErrorReport
struct mfxExtDecodeErrorReport

This structure is used by the SDK decoders to report bitstream error information right after DecodeHeader or DecodeFrameAsync. The application can attach this extended buffer to the mfxBitstream structure at runtime.

Public Members

mfxExtBuffer Header

Extension buffer header. Header.BufferId must be equal to MFX_EXTBUFF_DECODE_ERROR_REPORT.

mfxU32 ErrorTypes

Bitstream error types (bit-ORed values). See ErrorTypes enumerator for the list of possible types.
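
A minimal sketch of collecting error information while parsing a header follows; the MFX_ERROR_SPS and MFX_ERROR_PPS flags are assumed values of the ErrorTypes enumerator.

/* Sketch: attach an error report buffer to the bitstream before DecodeHeader. */
mfxExtDecodeErrorReport errReport;
memset(&errReport, 0, sizeof(errReport));
errReport.Header.BufferId = MFX_EXTBUFF_DECODE_ERROR_REPORT;
errReport.Header.BufferSz = sizeof(errReport);

mfxExtBuffer *bsExtBuffers[] = { &errReport.Header };
bitstream.ExtParam    = bsExtBuffers;    /* application's mfxBitstream */
bitstream.NumExtParam = 1;

mfxStatus sts = MFXVideoDECODE_DecodeHeader(session, &bitstream, &videoParam);
if (errReport.ErrorTypes & MFX_ERROR_SPS) {
    /* corrupted sequence parameter set detected */
}
if (errReport.ErrorTypes & MFX_ERROR_PPS) {
    /* corrupted picture parameter set detected */
}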

mfxExtDecodedFrameInfo
struct mfxExtDecodedFrameInfo

This structure is used by the SDK decoders to report additional information about decoded frame. The application can attach this extended buffer to the mfxFrameSurface1::mfxFrameData structure at runtime.

Public Members

mfxExtBuffer Header

Extension buffer header. Header.BufferId must be equal to MFX_EXTBUFF_DECODED_FRAME_INFO.

mfxU16 FrameType

Frame type. See FrameType enumerator for the list of possible types.

mfxExtTimeCode
struct mfxExtTimeCode

This structure is used by the SDK to pass MPEG-2 specific timing information.

See ISO/IEC 13818-2 and ITU-T H.262, MPEG-2 Part 2 for the definition of these parameters.

Public Members

mfxExtBuffer Header

Extension buffer header. Header.BufferId must be equal to MFX_EXTBUFF_TIME_CODE.

mfxU16 DropFrameFlag

Indicates a dropped frame.

mfxU16 TimeCodeHours

Hours.

mfxU16 TimeCodeMinutes

Minutes.

mfxU16 TimeCodeSeconds

Seconds.

mfxU16 TimeCodePictures

Pictures.

mfxExtHEVCRegion
struct mfxExtHEVCRegion

Attached to the mfxVideoParam structure during HEVC encoder initialization, specifies the region to encode.

Public Members

mfxExtBuffer Header

Extension buffer header. Header.BufferId must be equal to MFX_EXTBUFF_HEVC_REGION.

mfxU32 RegionId

ID of region.

mfxU16 RegionType

Type of region. See HEVCRegionType enumerator for the list of possible types.

mfxU16 RegionEncoding

Set to MFX_HEVC_REGION_ENCODING_ON to encode only specified region.

mfxExtPredWeightTable
struct mfxExtPredWeightTable

This structure, attached to mfxEncodeCtrl, specifies the weighted prediction table for the current frame. It applies when mfxExtCodingOption3::WeightedPred was set to explicit during encoder Init or Reset and the current frame is a P-frame, or when mfxExtCodingOption3::WeightedBiPred was set to explicit during encoder Init or Reset and the current frame is a B-frame.

Public Members

mfxExtBuffer Header

Extension buffer header. Header.BufferId must be equal to MFX_EXTBUFF_PRED_WEIGHT_TABLE.

mfxU16 LumaLog2WeightDenom

Base 2 logarithm of the denominator for all luma weighting factors. Value shall be in the range of 0 to 7, inclusive.

mfxU16 ChromaLog2WeightDenom

Base 2 logarithm of the denominator for all chroma weighting factors. Value shall be in the range of 0 to 7, inclusive.

mfxU16 LumaWeightFlag[2][32]

LumaWeightFlag[L][R] equal to 1 specifies that the weighting factors for the luma component are specified for R’s entry of RefPicList L.

mfxU16 ChromaWeightFlag[2][32]

ChromaWeightFlag[L][R] equal to 1 specifies that the weighting factors for the chroma component are specified for R’s entry of RefPicList L.

mfxI16 Weights[2][32][3][2]

The values of the weights and offsets used in the encoding processing. The value of Weights[i][j][k][m] is interpreted as: i refers to reference picture list 0 or 1; j refers to reference list entry 0-31; k refers to data for the luma component when it is 0, the Cb chroma component when it is 1 and the Cr chroma component when it is 2; m refers to weight when it is 0 and offset when it is 1

mfxExtAVCRoundingOffset
struct mfxExtAVCRoundingOffset

This structure is used by the SDK encoders to set rounding offset parameters for quantization. It is a per-frame encoding control and can be attached to some frames and skipped for others. The application can attach this extension buffer to mfxEncodeCtrl at runtime.

Public Members

mfxExtBuffer Header

Extension buffer header. Header.BufferId must be equal to MFX_EXTBUFF_AVC_ROUNDING_OFFSET.

mfxU16 EnableRoundingIntra

Enable rounding offset for intra blocks. See the CodingOptionValue enumerator for values of this option.

mfxU16 RoundingOffsetIntra

Intra rounding offset. Value shall be in the range of 0 to 7, inclusive.

mfxU16 EnableRoundingInter

Enable rounding offset for inter blocks. See the CodingOptionValue enumerator for values of this option.

mfxU16 RoundingOffsetInter

Inter rounding offset. Value shall be in the range of 0 to 7, inclusive.

mfxExtDirtyRect
struct mfxExtDirtyRect

Used by the application to specify dirty regions within a frame during encoding. It may be used at initialization or at runtime.

Dirty rectangle definition is using end-point exclusive notation. In other words, the pixel with (Right, Bottom) coordinates lies immediately outside of the Dirty rectangle. Left, Top, Right, Bottom should be aligned by codec-specific block boundaries (should be dividable by 16 for AVC, or by block size (8, 16, 32 or 64, depends on platform) for HEVC). Every Dirty rectangle with unaligned coordinates will be expanded by SDK to minimal-area block-aligned Dirty rectangle, enclosing the original one. For example (5, 5, 15, 31) Dirty rectangle will be expanded to (0, 0, 16, 32) for AVC encoder, or to (0, 0, 32, 32) for HEVC, if block size is 32. Dirty rectangle (0, 0, 0, 0) is a valid dirty rectangle and means that frame is not changed.

Public Members

mfxExtBuffer Header

Extension buffer header. Header.BufferId must be equal to MFX_EXTBUFF_DIRTY_RECTANGLES.

mfxU16 NumRect

Number of dirty rectangles.

mfxU32 Left

Dirty region coordinate.

mfxU32 Top

Dirty region coordinate.

mfxU32 Right

Dirty region coordinate.

mfxU32 Bottom

Dirty region coordinate.

struct mfxExtDirtyRect::[anonymous] Rect[256]

Array of dirty rectangles.

mfxExtMoveRect
struct mfxExtMoveRect

Used by the application to specify moving regions within a frame during encoding.

Destination rectangle location should be aligned to MB boundaries (should be dividable by 16). If not, the SDK encoder truncates it to MB boundaries, for example, both 17 and 31 will be truncated to 16.

Public Members

mfxExtBuffer Header

Extension buffer header. Header.BufferId must be equal to MFX_EXTBUFF_MOVING_RECTANGLE.

mfxU16 NumRect

Number of moving rectangles.

mfxU32 DestLeft

Destination rectangle location.

mfxU32 DestTop

Destination rectangle location.

mfxU32 DestRight

Destination rectangle location.

mfxU32 DestBottom

Destination rectangle location.

mfxU32 SourceLeft

Source rectangle location.

mfxU32 SourceTop

Source rectangle location.

struct mfxExtMoveRect::[anonymous] Rect[256]

Array of moving rectangles.

mfxExtMVOverPicBoundaries
struct mfxExtMVOverPicBoundaries

Attached to the mfxVideoParam structure instructs encoder to use or not use samples over specified picture border for inter prediction.

Public Members

mfxExtBuffer Header

Extension buffer header. Header.BufferId must be equal to MFX_EXTBUFF_MV_OVER_PIC_BOUNDARIES.

mfxU16 StickTop

When set to OFF, one or more samples outside corresponding picture boundary may be used in inter prediction. See the CodingOptionValue enumerator for values of this option.

mfxU16 StickBottom

When set to OFF, one or more samples outside corresponding picture boundary may be used in inter prediction. See the CodingOptionValue enumerator for values of this option.

mfxU16 StickLeft

When set to OFF, one or more samples outside corresponding picture boundary may be used in inter prediction. See the CodingOptionValue enumerator for values of this option.

mfxU16 StickRight

When set to OFF, one or more samples outside corresponding picture boundary may be used in inter prediction. See the CodingOptionValue enumerator for values of this option.

mfxVP9SegmentParam
struct mfxVP9SegmentParam

The mfxVP9SegmentParam structure contains features and parameters for the segment.

Public Members

mfxU16 FeatureEnabled

Indicates which features are enabled for the segment. See the SegmentFeature enumerator for values of this option. Values from the enumerator can be bit-OR’ed. Support of a particular feature depends on the underlying HW platform. The application can check which features are supported by calling Query.

mfxI16 QIndexDelta

Quantization index delta for the segment. Ignored if MFX_VP9_SEGMENT_FEATURE_QINDEX isn’t set in FeatureEnabled. Valid range for this parameter is [-255, 255]. If QIndexDelta is out of this range, it will be ignored. If QIndexDelta is within valid range, but sum of base quantization index and QIndexDelta is out of [0, 255], QIndexDelta will be clamped.

mfxI16 LoopFilterLevelDelta

Loop filter level delta for the segment. Ignored if MFX_VP9_SEGMENT_FEATURE_LOOP_FILTER isn’t set in FeatureEnabled. Valid range for this parameter is [-63, 63]. If LoopFilterLevelDelta is out of this range, it will be ignored. If LoopFilterLevelDelta is within valid range, but sum of base loop filter level and LoopFilterLevelDelta is out of [0, 63], LoopFilterLevelDelta will be clamped.

mfxU16 ReferenceFrame

Reference frame for the segment. See VP9ReferenceFrame enumerator for values for this option. Ignored if MFX_VP9_SEGMENT_FEATURE_REFERENCE isn’t set in FeatureEnabled.

mfxExtVP9Segmentation
struct mfxExtVP9Segmentation

In the VP9 encoder it is possible to divide a frame into up to 8 segments and apply particular features (like a delta for quantization index or for loop filter level) on a per-segment basis. The “uncompressed header” of every frame indicates whether segmentation is enabled for the current frame, and (if segmentation is enabled) contains full information about features applied to every segment. Every “mode info block” of a coded frame has a segment_id in the range [0, 7].

To enable segmentation, the mfxExtVP9Segmentation structure with correct settings should be passed to the encoder. It can be attached to the mfxVideoParam structure during initialization or a MFXVideoENCODE_Reset call (static configuration). If the mfxExtVP9Segmentation buffer isn't attached during initialization, segmentation is disabled for the static configuration. If the buffer isn't attached for a Reset call, the encoder continues to use the static configuration for segmentation which was in effect before this Reset call. If a mfxExtVP9Segmentation buffer with NumSegments=0 is provided during initialization or a Reset call, segmentation becomes disabled for the static configuration.

Also the buffer can be attached to the mfxEncodeCtrl structure during runtime (dynamic configuration). Dynamic configuration is applied to current frame only (after encoding of current frame SDK Encoder will switch to next dynamic configuration, or to static configuration if dynamic isn’t provided for next frame).

The SegmentIdBlockSize, NumSegmentIdAlloc, and SegmentId parameters represent the segmentation map. Here, the segmentation map is an array of segment_ids (one byte per segment_id) for blocks of size NxN in raster scan order. The size NxN is specified by the application and is constant for the whole frame. If mfxExtVP9Segmentation is attached during initialization and/or during runtime, all three parameters should be set to proper values that do not conflict with each other and with NumSegments. If any of them is not set, or if the SDK detects any conflict or error in these parameters, the segmentation map is discarded.

Public Members

mfxExtBuffer Header

Extension buffer header. Header.BufferId must be equal to MFX_EXTBUFF_VP9_SEGMENTATION.

mfxU16 NumSegments

Number of segments for the frame. Value 0 means that segmentation is disabled. Sending 0 for a particular frame will disable segmentation for this frame only. Sending 0 to the Reset function will disable segmentation permanently (it can be enabled again by a subsequent Reset call).

mfxVP9SegmentParam Segment[8]

Array of structures mfxVP9SegmentParam containing features and parameters for every segment. Entries with indexes bigger than NumSegments-1 are ignored. See the mfxVP9SegmentParam structure for definitions of segment features and their parameters.

mfxU16 SegmentIdBlockSize

Size of block (NxN) for the segmentation map. See the SegmentIdBlockSize enumerator for values of this option. An encoded block which is bigger than SegmentIdBlockSize uses the segment_id taken from its top-left sub-block of the segmentation map. The application can check if a particular block size is supported by calling Query.

mfxU32 NumSegmentIdAlloc

Size of buffer allocated for segmentation map (in bytes). Application must assure that NumSegmentIdAlloc is enough to cover frame resolution with blocks of size SegmentIdBlockSize. Otherwise segmentation map will be discarded.

mfxU8 *SegmentId

Pointer to segmentation map buffer which holds array of segment_ids in raster scan order. Application is responsible for allocation and release of this memory. Buffer pointed by SegmentId provided during initialization or Reset call should be considered in use until another SegmentId is provided via Reset call (if any), or until call of MFXVideoENCODE_Close. Buffer pointed by SegmentId provided with mfxEncodeCtrl should be considered in use while input surface is locked by SDK. Every segment_id in the map should be in the range of [0, NumSegments-1]. If some segment_id is out of valid range, segmentation map cannot be applied. If buffer mfxExtVP9Segmentation is attached to mfxEncodeCtrl in runtime, SegmentId can be zero. In this case segmentation map from static configuration will be used.
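
A hedged sketch of a two-segment static configuration follows. The MFX_VP9_SEGMENT_ID_BLOCK_SIZE_64x64 constant is assumed from the SegmentIdBlockSize enumerator, Width and Height are the frame dimensions, and the delta value is a placeholder.

/* Sketch: boost quality in segment 1 via a negative quantization index delta. */
mfxU32 blocksW = (Width  + 63) / 64;
mfxU32 blocksH = (Height + 63) / 64;
mfxU32 mapSize = blocksW * blocksH;

mfxU8 *segMap = (mfxU8 *)calloc(mapSize, 1);   /* all blocks start in segment 0 */
/* ...set segMap[] entries to 1 for blocks that should use segment 1... */

mfxExtVP9Segmentation seg;
memset(&seg, 0, sizeof(seg));
seg.Header.BufferId = MFX_EXTBUFF_VP9_SEGMENTATION;
seg.Header.BufferSz = sizeof(seg);
seg.NumSegments = 2;
seg.Segment[1].FeatureEnabled = MFX_VP9_SEGMENT_FEATURE_QINDEX;
seg.Segment[1].QIndexDelta    = -30;           /* placeholder delta */
seg.SegmentIdBlockSize = MFX_VP9_SEGMENT_ID_BLOCK_SIZE_64x64;   /* assumed enumerator value */
seg.NumSegmentIdAlloc  = mapSize;
seg.SegmentId          = segMap;

mfxExtBuffer *extBuffers[] = { &seg.Header };
videoParam.ExtParam    = extBuffers;
videoParam.NumExtParam = 1;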

mfxVP9TemporalLayer
struct mfxVP9TemporalLayer

The mfxVP9TemporalLayer structure specifies temporal layer.

Public Members

mfxU16 FrameRateScale

The ratio between the frame rates of the current temporal layer and the base layer. The SDK treats a particular temporal layer as “defined” if it has FrameRateScale > 0. If the base layer is defined, it must have FrameRateScale equal to 1. FrameRateScale of each subsequent layer (if defined) must be a multiple of and greater than the FrameRateScale of the previous layer.

mfxU16 TargetKbps

Target bitrate for the current temporal layer (ignored if RateControlMethod is CQP). If RateControlMethod is not CQP, the application must provide TargetKbps for every defined temporal layer. TargetKbps of each subsequent layer (if defined) must be greater than TargetKbps of the previous layer.

mfxExtVP9TemporalLayers
struct mfxExtVP9TemporalLayers

The SDK allows encoding of a VP9 bitstream that contains several subset bitstreams that differ in frame rate, also called “temporal layers”. On the decoder side, each temporal layer can be extracted from the coded stream and decoded separately. The mfxExtVP9TemporalLayers structure configures the temporal layers for the SDK VP9 encoder. It can be attached to the mfxVideoParam structure during initialization or a MFXVideoENCODE_Reset call. If the mfxExtVP9TemporalLayers buffer isn't attached during initialization, temporal scalability is disabled. If the buffer isn't attached for a Reset call, the encoder continues to use the temporal scalability configuration which was in effect before this Reset call. In the SDK API, temporal layers are ordered by their frame rates in ascending order. Temporal layer 0 (having the lowest frame rate) is called the base layer. Each next temporal layer includes all previous layers. The temporal scalability feature has requirements for the minimum number of allocated reference frames (controlled by the SDK API parameter NumRefFrame). If NumRefFrame set by the application isn't enough to build the reference structure for the requested number of temporal layers, the SDK corrects NumRefFrame. The temporal layer structure is reset (re-started) after key-frames.

Public Members

mfxExtBuffer Header

Extension buffer header. Header.BufferId must be equal to MFX_EXTBUFF_VP9_TEMPORAL_LAYERS.

mfxVP9TemporalLayer Layer[8]

The array of temporal layers. Layer[0] specifies the base layer. The SDK reads layers from the array while they are defined (have FrameRateScale > 0). All layers starting from the first layer with FrameRateScale = 0 are ignored. The last layer which is not ignored is the “highest layer”. The highest layer has the frame rate specified in mfxVideoParam. Frame rates of lower layers are calculated using their FrameRateScale. TargetKbps of the highest layer should be equal to the TargetKbps specified in mfxVideoParam. If it is not, TargetKbps of the highest temporal layer has priority. If there are no defined layers in the Layer array, the temporal scalability feature is disabled. For example, to disable temporal scalability at runtime, the application should pass to the Reset call a mfxExtVP9TemporalLayers buffer with all FrameRateScale set to 0.
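
A minimal sketch of a three-layer temporal hierarchy follows; the bitrate values are illustrative placeholders.

/* Sketch: base layer at 1/4 of the output frame rate plus two enhancement layers. */
mfxExtVP9TemporalLayers tl;
memset(&tl, 0, sizeof(tl));
tl.Header.BufferId = MFX_EXTBUFF_VP9_TEMPORAL_LAYERS;
tl.Header.BufferSz = sizeof(tl);

tl.Layer[0].FrameRateScale = 1;  tl.Layer[0].TargetKbps = 1000;   /* base layer */
tl.Layer[1].FrameRateScale = 2;  tl.Layer[1].TargetKbps = 1800;
tl.Layer[2].FrameRateScale = 4;  tl.Layer[2].TargetKbps = 3000;   /* highest layer */
/* Layer[3..7] keep FrameRateScale == 0 and are therefore ignored */

mfxExtBuffer *extBuffers[] = { &tl.Header };
videoParam.ExtParam    = extBuffers;
videoParam.NumExtParam = 1;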

mfxExtVP9Param
struct mfxExtVP9Param

Attached to the mfxVideoParam structure extends it with VP9-specific parameters. Used by both decoder and encoder.

Public Members

mfxExtBuffer Header

Extension buffer header. Header.BufferId must be equal to MFX_EXTBUFF_VP9_PARAM.

mfxU16 FrameWidth

Width of the coded frame in pixels.

mfxU16 FrameHeight

Height of the coded frame in pixels.

mfxU16 WriteIVFHeaders

Turn this option ON to make the encoder insert IVF container headers into the output stream. The NumFrame field of the IVF sequence header will be zero; it is the application's responsibility to update it with the correct value. See the CodingOptionValue enumerator for values of this option.

mfxI16 QIndexDeltaLumaDC

Specifies an offset for a particular quantization parameter.

mfxI16 QIndexDeltaChromaAC

Specifies an offset for a particular quantization parameter.

mfxI16 QIndexDeltaChromaDC

Specifies an offset for a particular quantization parameter.

mfxU16 NumTileRows

Number of tile rows. Should be a power of two. The maximum number of tile rows is 4 (per the VP9 specification). In addition, the maximum supported number of tile rows may depend on the underlying hardware platform. Use the Query function to check if a particular pair of values (NumTileRows, NumTileColumns) is supported. In VP9, tile rows have dependencies and cannot be encoded/decoded in parallel, so tile rows are always encoded by the SDK in serial mode (one-by-one).

mfxU16 NumTileColumns

Number of tile columns. Should be a power of two. Restricted by the maximum and minimum tile width in luma pixels defined in the VP9 specification (4096 and 256 respectively). In addition, the maximum supported number of tile columns may depend on the underlying hardware platform. Use the Query function to check if a particular pair of values (NumTileRows, NumTileColumns) is supported. In VP9, tile columns do not have dependencies and can be encoded/decoded in parallel, so tile columns can be encoded by the SDK in both parallel and serial modes. Parallel mode is automatically utilized by the SDK when NumTileColumns exceeds 1 and does not exceed the number of tile coding engines on the platform. In other cases serial mode is used. Parallel mode is capable of encoding more than 1 tile row (within the limitations provided by the VP9 specification and the particular platform). Serial mode supports only tile grids 1xN and Nx1.

mfxEncodedUnitInfo
struct mfxEncodedUnitInfo

The structure mfxEncodedUnitInfo is used to report encoded unit info.

Public Members

mfxU16 Type

Codec-dependent coding unit type (NALU type for AVC/HEVC, start_code for MPEG2 etc).

mfxU32 Offset

Offset relatively to associated mfxBitstream::DataOffset.

mfxU32 Size

Unit size including delimiter.

mfxExtEncodedUnitsInfo
struct mfxExtEncodedUnitsInfo

If mfxExtCodingOption3::EncodedUnitsInfo was set to MFX_CODINGOPTION_ON during encoder initialization, structure mfxExtEncodedUnitsInfo attached to the mfxBitstream structure during encoding is used to report information about coding units in the resulting bitstream.

The number of filled items in UnitInfo is min(NumUnitsEncoded, NumUnitsAlloc).

To estimate the minimum number of encoded units, the following algorithm can be used:

nSEI = amountOfApplicationDefinedSEI;
if (CodingOption3.NumSlice[IPB] != 0 || mfxVideoParam.mfx.NumSlice != 0)
  ExpectedAmount = 10 + nSEI + Max(CodingOption3.NumSlice[IPB], mfxVideoParam.mfx.NumSlice);
else if (CodingOption2.NumMBPerSlice != 0)
  ExpectedAmount = 10 + nSEI + (FrameWidth * FrameHeight) / (256 * CodingOption2.NumMBPerSlice);
else if (CodingOption2.MaxSliceSize != 0)
  ExpectedAmount = 10 + nSEI + Round(MaxBitrate / (FrameRate*CodingOption2.MaxSliceSize));
else
  ExpectedAmount = 10 + nSEI;

if (mfxFrameInfo.PictStruct != MFX_PICSTRUCT_PROGRESSIVE)
  ExpectedAmount = ExpectedAmount * 2;

if (temporalScalabilityEnabled)
  ExpectedAmount = ExpectedAmount * 2;
Note

Only AVC encoder supports it.

Public Members

mfxExtBuffer Header

Extension buffer header. Header.BufferId must be equal to MFX_EXTBUFF_ENCODED_UNITS_INFO.

mfxEncodedUnitInfo *UnitInfo

Pointer to an array of mfxEncodedUnitInfo structures of size equal to or greater than NumUnitsAlloc.

mfxU16 NumUnitsAlloc

UnitInfo array size.

mfxU16 NumUnitsEncoded

Output field. Number of coding units to report. If NumUnitsEncoded is greater than NumUnitsAlloc, the UnitInfo array contains information only for the first NumUnitsAlloc units; the application may consider reallocating the UnitInfo array to avoid this for subsequent frames.
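
A minimal sketch of receiving per-unit information with the encoded frame follows. It assumes mfxExtCodingOption3::EncodedUnitsInfo was set to ON at initialization; UNITS_ALLOC is an application-chosen capacity (see the estimate above).

/* Sketch: attach a units-info buffer to the output bitstream. */
#define UNITS_ALLOC 64

mfxEncodedUnitInfo unitInfo[UNITS_ALLOC];
memset(unitInfo, 0, sizeof(unitInfo));

mfxExtEncodedUnitsInfo unitsInfo;
memset(&unitsInfo, 0, sizeof(unitsInfo));
unitsInfo.Header.BufferId = MFX_EXTBUFF_ENCODED_UNITS_INFO;
unitsInfo.Header.BufferSz = sizeof(unitsInfo);
unitsInfo.UnitInfo      = unitInfo;
unitsInfo.NumUnitsAlloc = UNITS_ALLOC;

mfxExtBuffer *bsExtBuffers[] = { &unitsInfo.Header };
bitstream.ExtParam    = bsExtBuffers;    /* application's output mfxBitstream */
bitstream.NumExtParam = 1;

/* after MFXVideoENCODE_EncodeFrameAsync and MFXVideoCORE_SyncOperation: */
for (mfxU16 i = 0; i < unitsInfo.NumUnitsEncoded && i < UNITS_ALLOC; i++) {
    /* unitInfo[i].Type, unitInfo[i].Offset, unitInfo[i].Size describe one coding unit */
}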

mfxExtPartialBitstreamParam
struct mfxExtPartialBitstreamParam

This structure is used by an encoder to output parts of the bitstream as soon as they are ready. The application can attach this extended buffer to the mfxVideoParam structure at init time. If this option is turned ON (Granularity != MFX_PARTIAL_BITSTREAM_NONE), the encoder can output the bitstream in parts with the required granularity.

This parameter is valid only during initialization and reset. Absence of this buffer means default or previously configured bitstream output behavior.

Note

Not all codecs and SDK implementations support this feature. Use Query function to check if this feature is supported.

Public Members

mfxExtBuffer Header

Extension buffer header. Header.BufferId must be equal to MFX_EXTBUFF_PARTIAL_BITSTREAM_PARAM.

mfxU32 BlockSize

Output block granularity for PartialBitstreamGranularity; valid only for MFX_PARTIAL_BITSTREAM_BLOCK.

mfxU16 Granularity

Granularity of the partial bitstream: slice/block/any. All types of granularity are listed in the PartialBitstreamOutput enumerator.

VPP Extension buffers

mfxExtVPPDoNotUse
struct mfxExtVPPDoNotUse

The mfxExtVPPDoNotUse structure tells the VPP not to use certain filters in the pipeline. See the “Configurable VPP filters” table for the complete list of configurable filters. The user can attach this structure to the mfxVideoParam structure when initializing video processing.

Public Members

mfxExtBuffer Header

Extension buffer header. Header.BufferId must be equal to MFX_EXTBUFF_VPP_DONOTUSE.

mfxU32 NumAlg

Number of filters (algorithms) not to use

mfxU32 *AlgList

Pointer to a list of filters (algorithms) not to use

mfxExtVPPDoUse
struct mfxExtVPPDoUse

The mfxExtVPPDoUse structure tells the VPP to include certain filters in the pipeline.

Each filter may be included in the pipeline in one of two ways. First, by adding the filter ID to this structure; in this case, default filter parameters are used. Second, by attaching the filter configuration structure directly to the mfxVideoParam structure; in this case, adding the filter ID to the mfxExtVPPDoUse structure is optional. See Table “Configurable VPP filters” for the complete list of configurable filters, their IDs, and configuration structures.

The user can attach this structure to the mfxVideoParam structure when initializing video processing.

Note

MFX_EXTBUFF_VPP_COMPOSITE cannot be enabled using mfxExtVPPDoUse because default parameters are undefined for this filter. Application must attach appropriate filter configuration structure directly to the mfxVideoParam structure to enable it.

Public Members

mfxExtBuffer Header

Extension buffer header. Header.BufferId must be equal to MFX_EXTBUFF_VPP_DOUSE.

mfxU32 NumAlg

Number of filters (algorithms) to use

mfxU32 *AlgList

Pointer to a list of filters (algorithms) to use
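
A minimal sketch of enabling two filters with their default parameters follows; vppParam is assumed to be the application's mfxVideoParam prepared for VPP initialization.

/* Sketch: enable the denoise and detail filters using their default settings. */
mfxU32 algList[] = { MFX_EXTBUFF_VPP_DENOISE, MFX_EXTBUFF_VPP_DETAIL };

mfxExtVPPDoUse doUse;
memset(&doUse, 0, sizeof(doUse));
doUse.Header.BufferId = MFX_EXTBUFF_VPP_DOUSE;
doUse.Header.BufferSz = sizeof(doUse);
doUse.NumAlg  = 2;
doUse.AlgList = algList;

mfxExtBuffer *extBuffers[] = { &doUse.Header };
vppParam.ExtParam    = extBuffers;
vppParam.NumExtParam = 1;

mfxStatus sts = MFXVideoVPP_Init(session, &vppParam);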

mfxExtVPPDenoise
struct mfxExtVPPDenoise

The mfxExtVPPDenoise structure is a hint structure that configures the VPP denoise filter algorithm.

Public Members

mfxExtBuffer Header

Extension buffer header. Header.BufferId must be equal to MFX_EXTBUFF_VPP_DENOISE.

mfxU16 DenoiseFactor

Value of 0-100 (inclusive) indicates the level of noise to remove.

mfxExtVPPDetail
struct mfxExtVPPDetail

The mfxExtVPPDetail structure is a hint structure that configures the VPP detail/edge enhancement filter algorithm.

Public Members

mfxExtBuffer Header

Extension buffer header. Header.BufferId must be equal to MFX_EXTBUFF_VPP_DETAIL.

mfxU16 DetailFactor

0-100 value (inclusive) to indicate the level of details to be enhanced.

mfxExtVPPProcAmp
struct mfxExtVPPProcAmp

The mfxExtVPPProcAmp structure is a hint structure that configures the VPP ProcAmp filter algorithm. The structure parameters will be clipped to their corresponding range and rounded by their corresponding increment.

Note

There are no default values for fields in this structure, all settings must be explicitly specified every time this buffer is submitted for processing.

Public Members

mfxExtBuffer Header

Extension buffer header. Header.BufferId must be equal to MFX_EXTBUFF_VPP_PROCAMP.

mfxF64 Brightness

The brightness parameter is in the range of -100.0F to 100.0F, in increments of 0.1F. Setting this field to 0.0F will disable brightness adjustment.

mfxF64 Contrast

The contrast parameter in the range of 0.0F to 10.0F, in increments of 0.01F, is used for manual contrast adjustment. Setting this field to 1.0F will disable contrast adjustment. If the parameter is negative, contrast will be adjusted automatically.

mfxF64 Hue

The hue parameter is in the range of -180F to 180F, in increments of 0.1F. Setting this field to 0.0F will disable hue adjustment.

mfxF64 Saturation

The saturation parameter is in the range of 0.0F to 10.0F, in increments of 0.01F. Setting this field to 1.0F will disable saturation adjustment.

mfxExtVPPDeinterlacing
struct mfxExtVPPDeinterlacing

This structure is used by the application to specify different deinterlacing algorithms

Public Members

mfxExtBuffer Header

Extension buffer header. Header.BufferId must be equal to MFX_EXTBUFF_VPP_DEINTERLACING.

mfxU16 Mode

Deinterlacing algorithm. See the DeinterlacingMode enumerator for details.

mfxU16 TelecinePattern

Specifies telecine pattern when Mode = MFX_DEINTERLACING_FIXED_TELECINE_PATTERN. See the TelecinePattern enumerator for details.

mfxU16 TelecineLocation

Specifies position inside a sequence of 5 frames where the artifacts start when TelecinePattern = MFX_TELECINE_POSITION_PROVIDED

mfxU16 reserved[9]

Reserved for future use

mfxExtEncodedSlicesInfo
struct mfxExtEncodedSlicesInfo

The mfxExtEncodedSlicesInfo is used by the SDK encoder to report additional information about encoded slices. The application can attach this buffer to the mfxBitstream structure before calling MFXVideoENCODE_EncodeFrameAsync function.

Note

Not all implementations of the SDK encoder support this extended buffer. The application has to use query mode 1 to determine if such functionality is supported. To do so, the application has to attach this extended buffer to mfxVideoParam structure and call MFXVideoENCODE_Query function. If function returns MFX_ERR_NONE then such functionality is supported.

Public Members

mfxExtBuffer Header

Extension buffer header. Header.BufferId must be equal to MFX_EXTBUFF_ENCODED_SLICES_INFO.

mfxU16 SliceSizeOverflow

When mfxExtCodingOption2::MaxSliceSize is used, indicates the requested slice size was not met for one or more generated slices.

mfxU16 NumSliceNonCopliant

When mfxExtCodingOption2::MaxSliceSize is used, indicates the number of generated slices exceeds specification limits.

mfxU16 NumEncodedSlice

Number of encoded slices.

mfxU16 NumSliceSizeAlloc

SliceSize array allocation size. Must be specified by application.

mfxU16 *SliceSize

Slice size in bytes. Array must be allocated by application.

mfxExtVppAuxData
struct mfxExtVppAuxData

The mfxExtVppAuxData structure returns auxiliary data generated by the video processing pipeline. The encoding process may use the auxiliary data by attaching this structure to the mfxEncodeCtrl structure.

Public Members

mfxExtBuffer Header

Extension buffer header. Header.BufferId must be equal to MFX_EXTBUFF_VPP_AUXDATA.

mfxU16 PicStruct

Detected picture structure - top field first, bottom field first, progressive or unknown if video processor cannot detect picture structure. See the PicStruct enumerator for definition of these values.

By default, detection is turned off and the application should explicitly enable it by using mfxExtVPPDoUse buffer and MFX_EXTBUFF_VPP_PICSTRUCT_DETECTION algorithm.

mfxExtVPPFrameRateConversion
struct mfxExtVPPFrameRateConversion

The mfxExtVPPFrameRateConversion structure configures the VPP frame rate conversion filter. The user can attach this structure to the mfxVideoParam structure when initializing video processing, resetting it, or querying its capability.

On some platforms the advanced frame rate conversion algorithm, which is based on frame interpolation, is not supported. To query its support, the application should add the MFX_FRCALGM_FRAME_INTERPOLATION flag to the Algorithm value in the mfxExtVPPFrameRateConversion structure, attach it to the mfxVideoParam structure, and call the MFXVideoVPP_Query function. If the filter is supported, the function returns MFX_ERR_NONE and copies the content of the input structure to the output one. If the advanced filter is not supported, the simple filter will be used instead; the function returns MFX_WRN_INCOMPATIBLE_VIDEO_PARAM, copies the content of the input structure to the output one, and corrects the Algorithm value.

If the advanced FRC algorithm is not supported, both MFXVideoVPP_Init and MFXVideoVPP_Reset return MFX_WRN_INCOMPATIBLE_VIDEO_PARAM.
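A minimal sketch of such a capability check; QueryAdvancedFrc is an illustrative helper name, and vppIn/vppOut are assumed to be mfxVideoParam structures already filled with the VPP input and output frame parameters:

#include "mfxvideo.h"

// Returns true when interpolation-based frame rate conversion is reported as supported.
static bool QueryAdvancedFrc(mfxSession session, mfxVideoParam vppIn, mfxVideoParam vppOut)
{
    mfxExtVPPFrameRateConversion frcIn  = {};
    mfxExtVPPFrameRateConversion frcOut = {};
    frcIn.Header.BufferId = frcOut.Header.BufferId = MFX_EXTBUFF_VPP_FRAME_RATE_CONVERSION;
    frcIn.Header.BufferSz = frcOut.Header.BufferSz = sizeof(mfxExtVPPFrameRateConversion);
    frcIn.Algorithm = MFX_FRCALGM_PRESERVE_TIMESTAMP | MFX_FRCALGM_FRAME_INTERPOLATION;

    mfxExtBuffer *extIn[]  = { &frcIn.Header };
    mfxExtBuffer *extOut[] = { &frcOut.Header };
    vppIn.ExtParam  = extIn;  vppIn.NumExtParam  = 1;
    vppOut.ExtParam = extOut; vppOut.NumExtParam = 1;

    // MFX_ERR_NONE: advanced FRC is supported.
    // MFX_WRN_INCOMPATIBLE_VIDEO_PARAM: simple FRC will be used; frcOut.Algorithm holds the corrected value.
    return MFXVideoVPP_Query(session, &vppIn, &vppOut) == MFX_ERR_NONE;
}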

Public Members

mfxExtBuffer Header

Extension buffer header. Header.BufferId must be equal to MFX_EXTBUFF_VPP_FRAME_RATE_CONVERSION.

mfxU16 Algorithm

See the FrcAlgm enumerator for a list of frame rate conversion algorithms.

mfxExtVPPImageStab
struct mfxExtVPPImageStab

The mfxExtVPPImageStab structure is a hint structure that configures the VPP image stabilization filter.

On some platforms this filter is not supported. To query its support, the application should use the same approach that it uses to configure VPP filters: either add the filter ID to the mfxExtVPPDoUse structure or attach the mfxExtVPPImageStab structure directly to the mfxVideoParam structure, and call the MFXVideoVPP_Query function. If this filter is supported, the function returns MFX_ERR_NONE and copies the content of the input structure to the output one. If the filter is not supported, the function returns MFX_WRN_FILTER_SKIPPED, removes the filter from the mfxExtVPPDoUse structure, and zeroes the mfxExtVPPImageStab structure.

If the image stabilization filter is not supported, both MFXVideoVPP_Init and MFXVideoVPP_Reset return MFX_WRN_FILTER_SKIPPED.

The application can retrieve the list of active filters by attaching the mfxExtVPPDoUse structure to the mfxVideoParam structure and calling the MFXVideoVPP_GetVideoParam function. The application must allocate enough memory for the filter list.
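For example, a sketch of retrieving the active filter list after VPP initialization; the list capacity of 16 entries is an arbitrary illustration value:

#include "mfxvideo.h"

static void ListActiveVppFilters(mfxSession session)
{
    mfxU32 algList[16] = {};                          // storage for returned filter IDs

    mfxExtVPPDoUse doUse = {};
    doUse.Header.BufferId = MFX_EXTBUFF_VPP_DOUSE;
    doUse.Header.BufferSz = sizeof(doUse);
    doUse.NumAlg  = sizeof(algList) / sizeof(algList[0]);
    doUse.AlgList = algList;

    mfxExtBuffer *ext[] = { &doUse.Header };
    mfxVideoParam par = {};
    par.ExtParam    = ext;
    par.NumExtParam = 1;

    if (MFXVideoVPP_GetVideoParam(session, &par) == MFX_ERR_NONE) {
        // algList now holds the IDs of the active filters,
        // e.g. MFX_EXTBUFF_VPP_IMAGE_STABILIZATION.
    }
}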

Public Members

mfxExtBuffer Header

Extension buffer header. Header.BufferId must be equal to MFX_EXTBUFF_VPP_IMAGE_STABILIZATION.

mfxU16 Mode

Image stabilization mode. See ImageStabMode enumerator for possible values.

mfxVPPCompInputStream
struct mfxVPPCompInputStream

The mfxVPPCompInputStream structure is used to specify input stream details for composition of several input surfaces into one output surface.

Public Members

mfxU32 DstX

X coordinate of location of input stream in output surface.

mfxU32 DstY

Y coordinate of location of input stream in output surface.

mfxU32 DstW

Width of location of input stream in output surface.

mfxU32 DstH

Height of location of input stream in output surface.

mfxU16 LumaKeyEnable

Non-zero value enables luma keying for the input stream. Luma keying is used to mark some of the areas of the frame with specified luma values as transparent. It may be used for closed captioning, for example.

mfxU16 LumaKeyMin

Minimum value of luma key, inclusive. Pixels whose luma values fit in this range are rendered transparent.

mfxU16 LumaKeyMax

Maximum value of luma key, inclusive. Pixels whose luma values fit in this range are rendered transparent.

mfxU16 GlobalAlphaEnable

Non-zero value enables global alpha blending for this input stream.

mfxU16 GlobalAlpha

Alpha value for this stream in [0..255] range. 0 – transparent, 255 – opaque.

mfxU16 PixelAlphaEnable

Non-zero value enables per-pixel alpha blending for this input stream. The stream should have an RGB color format.

mfxU16 TileId

Specifies the tile this video stream is assigned to. Should be in the range [0..NumTiles). Valid only if NumTiles > 0.

mfxExtVPPComposite
struct mfxExtVPPComposite

The mfxExtVPPComposite structure is used to control composition of several input surfaces into one output surface. In this mode, the VPP skips any other filters. The VPP returns an error if any mandatory filter is specified, and a filter-skipped warning for an optional filter. The only supported filters are deinterlacing and interlaced scaling. The only supported combinations of input and output color formats are:

  • RGB to RGB,

  • NV12 to NV12,

  • RGB and NV12 to NV12, for per pixel alpha blending use case.

The VPP returns MFX_ERR_MORE_DATA for additional input until an output is ready. When the output is ready, VPP returns MFX_ERR_NONE. The application must process the output frame after synchronization.

Composition process is controlled by:

  • mfxFrameInfo::CropXYWH in the input surface - defines the location of the picture in the input frame,

  • InputStream[i].DstXYWH - defines the location of the cropped input picture in the output frame,

  • mfxFrameInfo::CropXYWH in the output surface - defines the actual part of the output frame. All pixels in the output frame outside this region will be filled by the specified color.

If the application uses composition process on video streams with different frame sizes, the application should provide maximum frame size in mfxVideoParam during initialization, reset or query operations.

If the application uses composition process, MFXVideoVPP_QueryIOSurf function returns cumulative number of input surfaces, i.e. number required to process all input video streams. The function sets frame size in the mfxFrameAllocRequest equal to the size provided by application in the mfxVideoParam.

The composition process supports all types of surfaces.

All input surfaces should have the same type and color format, except per pixel alpha blending case, where it is allowed to mix NV12 and RGB surfaces.

There are three different blending use cases:

  • Luma keying. In this case, all input surfaces should have the NV12 color format specified during VPP initialization. Part of each surface, including the first one, may be rendered transparent by using the LumaKeyEnable, LumaKeyMin and LumaKeyMax values.

  • Global alpha blending. In this case, all input surfaces should have the same color format specified during VPP initialization, either NV12 or RGB. Each input surface, including the first one, can be blended with underlying surfaces by using the GlobalAlphaEnable and GlobalAlpha values.

  • Per pixel alpha blending. In this case, it is allowed to mix NV12 and RGB input surfaces. Each RGB input surface, including the first one, can be blended with underlying surfaces by using the PixelAlphaEnable value.

It is not allowed to mix different blending use cases in the same function call.

In the special case where the destination region of the output surface defined by output crops is fully covered with destination sub-regions of the surfaces, the fast compositing mode can be enabled. The main use case for this mode is a video-wall scenario with a fixed partition of the destination surface into sub-regions of potentially different size.

In order to trigger this mode, the application must cluster input surfaces into tiles, defining at least one tile by setting the NumTiles field to a value greater than 0 and assigning surfaces to the corresponding tiles by setting the TileId field to a value within the [0..NumTiles) range per input surface. Tiles should also satisfy the following additional constraints:

  • each tile should not have more than 8 surfaces assigned to it;

  • tile bounding boxes, as defined by the enclosing rectangles of the union of the surfaces assigned to the tile, should not intersect;

Background color may be changed dynamically through Reset. No default value. YUV black is (0;128;128) or (16;128;128) depending on the sample range. The SDK uses YUV or RGB triple depending on output color format.
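A minimal sketch of a two-stream composition setup with global alpha blending; InitComposition is an illustrative helper, vppPar is assumed to carry the remaining VPP configuration, and error handling is omitted:

#include "mfxvideo.h"

static mfxStatus InitComposition(mfxSession session, mfxVideoParam vppPar)
{
    mfxVPPCompInputStream streams[2] = {};
    // Stream 0: full-frame background; stream 1: overlay blended at 50% opacity.
    streams[0].DstX = 0;    streams[0].DstY = 0;   streams[0].DstW = 1920; streams[0].DstH = 1080;
    streams[1].DstX = 1280; streams[1].DstY = 720; streams[1].DstW = 640;  streams[1].DstH = 360;
    streams[1].GlobalAlphaEnable = 1;
    streams[1].GlobalAlpha       = 128;              // 0 = transparent, 255 = opaque

    mfxExtVPPComposite composite = {};
    composite.Header.BufferId = MFX_EXTBUFF_VPP_COMPOSITE;
    composite.Header.BufferSz = sizeof(composite);
    composite.Y = 16; composite.U = 128; composite.V = 128;   // background color (YUV black)
    composite.NumInputStream = 2;
    composite.InputStream    = streams;

    mfxExtBuffer *ext[] = { &composite.Header };
    vppPar.ExtParam    = ext;
    vppPar.NumExtParam = 1;

    return MFXVideoVPP_Init(session, &vppPar);
}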

Public Members

mfxExtBuffer Header

Extension buffer header. Header.BufferId must be equal to MFX_EXTBUFF_VPP_COMPOSITE.

mfxU16 Y

Y value of the background color.

mfxU16 R

R value of the background color.

mfxU16 U

U value of the background color.

mfxU16 G

G value of the background color.

mfxU16 V

V value of the background color.

mfxU16 B

B value of the background color.

mfxU16 NumTiles

Number of input surface clusters grouped together to enable fast compositing. May be changed dynamically at runtime through Reset.

mfxU16 NumInputStream

Number of input surfaces to compose one output. May be changed dynamically at runtime through Reset. Number of surfaces can be decreased or increased, but should not exceed number specified during initialization. Query mode 2 should be used to find maximum supported number.

mfxVPPCompInputStream *InputStream

This array of mfxVPPCompInputStream structures describes composition of input video streams. It should consist of exactly NumInputStream elements.

mfxExtVPPVideoSignalInfo
struct mfxExtVPPVideoSignalInfo

The mfxExtVPPVideoSignalInfo structure is used to control transfer matrix and nominal range of YUV frames. The application should provide it during initialization. It is supported for all kinds of conversion YUV->YUV, YUV->RGB, RGB->YUV.

Note

This structure is used by VPP only and is not compatible with mfxExtVideoSignalInfo.

Public Members

mfxExtBuffer Header

Extension buffer header. Header.BufferId must be equal to MFX_EXTBUFF_VPP_VIDEO_SIGNAL_INFO.

mfxU16 TransferMatrix

Transfer matrix.

mfxU16 NominalRange

Nominal range.

mfxExtVPPFieldProcessing
struct mfxExtVPPFieldProcessing

The mfxExtVPPFieldProcessing structure configures the VPP field processing algorithm. The application can attach this extended buffer to the mfxVideoParam structure to configure initialization and/or to the mfxFrameData structure during runtime; runtime configuration has priority over initialization configuration. If the field processing algorithm was activated via the mfxExtVPPDoUse structure and the mfxExtVPPFieldProcessing extended buffer was not provided during initialization, this buffer must be attached to the mfxFrameData of each input surface.

Public Members

mfxExtBuffer Header

Extension buffer header. Header.BufferId must be equal to MFX_EXTBUFF_VPP_FIELD_PROCESSING.

mfxU16 Mode

Specifies the mode of field processing algorithm. See the VPPFieldProcessingMode enumerator for values of this option.

mfxU16 InField

When Mode is MFX_VPP_COPY_FIELD specifies input field. See the PicType enumerator for values of this parameter.

mfxU16 OutField

When Mode is MFX_VPP_COPY_FIELD specifies output field. See the PicType enumerator for values of this parameter.

mfxExtDecVideoProcessing
struct mfxExtDecVideoProcessing

If attached to the mfxVideoParam structure during the Init stage, this buffer instructs the decoder to resize output frames via the fixed-function resize engine (if supported by the hardware), utilizing a direct pipe connection that bypasses intermediate memory operations. The main benefits of this mode of operation are offloading the resize operation to a dedicated engine and reducing power consumption and memory traffic.
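For example, a sketch that asks the decoder to scale its output to 1280x720 NV12 through this buffer; decPar is assumed to have been filled by MFXVideoDECODE_DecodeHeader, support is assumed to have been confirmed via MFXVideoDECODE_Query, and the target size is an arbitrary illustration:

#include "mfxvideo.h"

static mfxStatus InitDecodeWithSfcResize(mfxSession session, mfxVideoParam decPar)
{
    mfxExtDecVideoProcessing decVpp = {};
    decVpp.Header.BufferId = MFX_EXTBUFF_DEC_VIDEO_PROCESSING;
    decVpp.Header.BufferSz = sizeof(decVpp);

    decVpp.In.CropX = 0;
    decVpp.In.CropY = 0;
    decVpp.In.CropW = decPar.mfx.FrameInfo.CropW;    // use the full decoded picture
    decVpp.In.CropH = decPar.mfx.FrameInfo.CropH;

    decVpp.Out.FourCC       = MFX_FOURCC_NV12;
    decVpp.Out.ChromaFormat = MFX_CHROMAFORMAT_YUV420;
    decVpp.Out.Width  = 1280;  decVpp.Out.Height = 720;   // illustrative target size
    decVpp.Out.CropX  = 0;     decVpp.Out.CropY  = 0;
    decVpp.Out.CropW  = 1280;  decVpp.Out.CropH  = 720;

    mfxExtBuffer *ext[] = { &decVpp.Header };
    decPar.ExtParam    = ext;
    decPar.NumExtParam = 1;

    return MFXVideoDECODE_Init(session, &decPar);
}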

Public Members

mfxExtBuffer Header

Extension buffer header. Header.BufferId must be equal to MFX_EXTBUFF_DEC_VIDEO_PROCESSING.

struct mfxExtDecVideoProcessing::mfxIn In

Input surface description.

struct mfxExtDecVideoProcessing::mfxOut Out

Output surface description.

struct mfxIn

Input surface description.

Public Members

mfxU16 CropX

X coordinate of region of interest of the input surface.

mfxU16 CropY

Y coordinate of region of interest of the input surface.

mfxU16 CropW

Width coordinate of region of interest of the input surface.

mfxU16 CropH

Height coordinate of region of interest of the input surface.

struct mfxOut

Output surface description.

Public Members

mfxU32 FourCC

FourCC of the output surface. Note: should be MFX_FOURCC_NV12.

mfxU16 ChromaFormat

Chroma Format of output surface.

Note

Should be MFX_CHROMAFORMAT_YUV420

mfxU16 Width

Width of the output surface.

mfxU16 Height

Height of the output surface.

mfxU16 CropX

X coordinate of region of interest of the output surface.

mfxU16 CropY

Y coordinate of region of interest of the output surface.

mfxU16 CropW

Width coordinate of region of interest of the output surface.

mfxU16 CropH

Height coordinate of region of interest of the output surface.

mfxExtVPPRotation
struct mfxExtVPPRotation

The mfxExtVPPRotation structure configures the VPP Rotation filter algorithm.

Public Members

mfxExtBuffer Header

Extension buffer header. Header.BufferId must be equal to MFX_EXTBUFF_VPP_ROTATION.

mfxU16 Angle

Rotation angle. See Angle enumerator for supported values.

mfxExtVPPScaling
struct mfxExtVPPScaling

The mfxExtVPPScaling structure configures the VPP Scaling filter algorithm. Not all combinations of ScalingMode and InterpolationMethod are supported in the SDK. The application has to use the Query function to determine if a combination is supported.

Public Members

mfxExtBuffer Header

Extension buffer header. Header.BufferId must be equal to MFX_EXTBUFF_VPP_SCALING.

mfxU16 ScalingMode

Scaling mode. See ScalingMode for possible values.

mfxU16 InterpolationMethod

Interpolation mode for scaling algorithm. See InterpolationMode for possible values.

mfxExtVPPMirroring
struct mfxExtVPPMirroring

The mfxExtVPPMirroring structure configures the VPP Mirroring filter algorithm.

Public Members

mfxExtBuffer Header

Extension buffer header. Header.BufferId must be equal to MFX_EXTBUFF_VPP_MIRRORING.

mfxU16 Type

Mirroring type. See MirroringType for possible values.

mfxExtVPPColorFill
struct mfxExtVPPColorFill

The mfxExtVPPColorFill structure configures the VPP ColorFill filter algorithm.

Public Members

mfxExtBuffer Header

Extension buffer header. Header.BufferId must be equal to MFX_EXTBUFF_VPP_COLORFILL.

mfxU16 Enable

Setting this option to ON makes the VPP fill the area between the Width/Height and Crop borders. See the CodingOptionValue enumerator for values of this option.

mfxExtColorConversion
struct mfxExtColorConversion

The mfxExtColorConversion structure is a hint structure that tunes the VPP Color Conversion algorithm, when attached to the mfxVideoParam structure during VPP Init.

Public Members

mfxExtBuffer Header

Extension buffer header. Header.BufferId must be equal to MFX_EXTBUFF_VPP_COLOR_CONVERSION.

mfxU16 ChromaSiting

See ChromaSiting enumerator for details.

ChromaSiting is applied on input or output surface depending on the scenario:

  • VPP input MFX_CHROMAFORMAT_YUV420 or MFX_CHROMAFORMAT_YUV422, VPP output MFX_CHROMAFORMAT_YUV444 - ChromaSiting indicates the input chroma location.

  • VPP input MFX_CHROMAFORMAT_YUV444, VPP output MFX_CHROMAFORMAT_YUV420 or MFX_CHROMAFORMAT_YUV422 - ChromaSiting indicates the output chroma location.

  • VPP input MFX_CHROMAFORMAT_YUV420, VPP output MFX_CHROMAFORMAT_YUV420 - ChromaSiting indicates the chroma location for both input and output.

  • VPP input MFX_CHROMAFORMAT_YUV420, VPP output MFX_CHROMAFORMAT_YUV422 - ChromaSiting indicates the horizontal location for both input and output, and the vertical location for input.

mfxExtVppMctf
struct mfxExtVppMctf

The mfxExtVppMctf structure allows the application to set up the Motion-Compensated Temporal Filter (MCTF) during VPP initialization and to control its parameters at runtime. By default, MCTF is off; an application may enable it by adding MFX_EXTBUFF_VPP_MCTF to the mfxExtVPPDoUse buffer or by attaching mfxExtVppMctf to mfxVideoParam during initialization or reset.
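A minimal sketch of enabling MCTF in AUTO mode at VPP initialization, assuming vppPar already carries the remaining VPP configuration:

#include "mfxvideo.h"

static mfxStatus InitVppWithMctf(mfxSession session, mfxVideoParam vppPar)
{
    mfxExtVppMctf mctf = {};
    mctf.Header.BufferId = MFX_EXTBUFF_VPP_MCTF;
    mctf.Header.BufferSz = sizeof(mctf);
    mctf.FilterStrength  = 0;          // 0 = AUTO mode; 1..20 selects a fixed strength

    mfxExtBuffer *ext[] = { &mctf.Header };
    vppPar.ExtParam    = ext;
    vppPar.NumExtParam = 1;

    return MFXVideoVPP_Init(session, &vppPar);
}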

Public Members

mfxExtBuffer Header

Extension buffer header. Header.BufferId must be equal to MFX_EXTBUFF_VPP_MCTF.

mfxU16 FilterStrength

Value in the range 0..20 (inclusive) indicating the filter strength of MCTF. The strength controls the degree of possible change of pixel values eligible for MCTF; the bigger the strength, the larger the change. It is a dimensionless quantity. Values 1..20 inclusive select a fixed strength; value 0 stands for AUTO mode and is valid during initialization or reset only. If an invalid value is given, it is reset to the default value, which is 0. If this field is 1..20 inclusive, MCTF operates in fixed-strength mode with the given strength. At runtime, value 0 and values greater than 20 are ignored.

Bit Rate Control Extension Buffers

mfxBRCFrameParam
struct mfxBRCFrameParam

The mfxBRCFrameParam structure describes frame parameters required for external BRC functions.

Public Members

mfxU16 SceneChange

Frame belongs to a new scene if non zero.

mfxU16 LongTerm

Frame is a Long Term Reference frame if non zero.

mfxU32 FrameCmplx

Frame complexity. Frame spatial complexity if non-zero; zero if complexity is not available.

mfxU32 EncodedOrder

The frame number in a sequence of reordered frames starting from encoder Init.

mfxU32 DisplayOrder

The frame number in a sequence of frames in display order starting from last IDR.

mfxU32 CodedFrameSize

Size of the frame in bytes after encoding.

mfxU16 FrameType

See FrameType enumerator

mfxU16 PyramidLayer

B-pyramid or P-pyramid layer the frame belongs to.

mfxU16 NumRecode

Number of recodings performed for this frame.

mfxU16 NumExtParam

Reserved for future use.

mfxExtBuffer **ExtParam

Reserved for future use.

mfxBRCFrameCtrl
struct mfxBRCFrameCtrl

The mfxBRCFrameCtrl structure specifies controls for next frame encoding provided by external BRC functions.

Public Members

mfxI32 QpY

Frame-level Luma QP.

mfxU32 InitialCpbRemovalDelay

See initial_cpb_removal_delay in codec standard. Ignored if no HRD control: mfxExtCodingOption::VuiNalHrdParameters = MFX_CODINGOPTION_OFF. Calculated by encoder if initial_cpb_removal_delay==0 && initial_cpb_removal_offset == 0 && HRD control is switched on.

mfxU32 InitialCpbRemovalOffset

See initial_cpb_removal_offset in codec standard. Ignored if no HRD control: mfxExtCodingOption::VuiNalHrdParameters = MFX_CODINGOPTION_OFF. Calculated by encoder if initial_cpb_removal_delay==0 && initial_cpb_removal_offset == 0 && HRD control is switched on.

mfxU32 MaxFrameSize

Max frame size in bytes. This is an option for the repack feature. The driver calls PAK until the current frame size is less than or equal to maxFrameSize or the number of repacks for this frame is equal to maxNumRePak. Repack is available if the driver supports it and MaxFrameSize != 0 and MaxNumRePak != 0. Ignored if maxNumRePak == 0.

mfxU8 DeltaQP[8]

This is an option for the repack feature. Ignored if maxFrameSize == 0 or maxNumRePak == 0. If the current frame size > maxFrameSize and/or the number of repacks (nRepack) for this frame <= maxNumRePak, PAK is called with QP = mfxBRCFrameCtrl::QpY + Sum(DeltaQP[i]), where i = [0, nRepack]. Non-zero DeltaQP[nRepack] values are ignored if nRepack > maxNumRePak. If the repacking feature is on (maxFrameSize and maxNumRePak are not zero), it is calculated by the encoder.

mfxU16 MaxNumRepak

Number of possible repacks in driver if current frame size > maxFrameSize. Ignored if maxFrameSize==0. See maxFrameSize description. Possible values are [0,8].

mfxU16 NumExtParam

Reserved for future use.

mfxExtBuffer **ExtParam

Reserved for future use.

mfxBRCFrameStatus
struct mfxBRCFrameStatus

The mfxBRCFrameStatus structure specifies instructions for the SDK encoder provided by external BRC after each frame encoding. See the BRCStatus enumerator for details.

Public Members

mfxU32 MinFrameSize

Size in bytes, coded frame must be padded to when Status = MFX_BRC_PANIC_SMALL_FRAME.

mfxU16 BRCStatus

See BRCStatus enumerator.

mfxExtBRC
struct mfxExtBRC

The mfxExtBRC structure contains a set of callbacks to perform external bit rate control. It can be attached to the mfxVideoParam structure during encoder initialization. Set the mfxExtCodingOption2::ExtBRC option to ON to make the encoder use the external BRC instead of the native one.
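A sketch of how such callbacks might be attached; the MyBrc* functions are hypothetical application stubs shown only to illustrate the wiring, not a real rate control algorithm:

#include "mfxvideo.h"

// Hypothetical application callbacks; a real BRC would track HRD state, compute QP, etc.
static mfxStatus MyBrcInit (mfxHDL, mfxVideoParam *)  { return MFX_ERR_NONE; }
static mfxStatus MyBrcReset(mfxHDL, mfxVideoParam *)  { return MFX_ERR_NONE; }
static mfxStatus MyBrcClose(mfxHDL)                   { return MFX_ERR_NONE; }
static mfxStatus MyBrcGetFrameCtrl(mfxHDL, mfxBRCFrameParam *, mfxBRCFrameCtrl *ctrl)
{
    ctrl->QpY = 26;                                   // constant QP, for illustration only
    return MFX_ERR_NONE;
}
static mfxStatus MyBrcUpdate(mfxHDL, mfxBRCFrameParam *, mfxBRCFrameCtrl *, mfxBRCFrameStatus *status)
{
    status->BRCStatus = MFX_BRC_OK;                   // accept every encoded frame
    return MFX_ERR_NONE;
}

static mfxStatus InitEncodeWithExternalBrc(mfxSession session, mfxVideoParam encPar)
{
    mfxExtCodingOption2 co2 = {};
    co2.Header.BufferId = MFX_EXTBUFF_CODING_OPTION2;
    co2.Header.BufferSz = sizeof(co2);
    co2.ExtBRC = MFX_CODINGOPTION_ON;                 // switch from native to external BRC

    mfxExtBRC brc = {};
    brc.Header.BufferId = MFX_EXTBUFF_BRC;
    brc.Header.BufferSz = sizeof(brc);
    brc.pthis        = nullptr;                       // application BRC context would go here
    brc.Init         = MyBrcInit;
    brc.Reset        = MyBrcReset;
    brc.Close        = MyBrcClose;
    brc.GetFrameCtrl = MyBrcGetFrameCtrl;
    brc.Update       = MyBrcUpdate;

    mfxExtBuffer *ext[] = { &co2.Header, &brc.Header };
    encPar.ExtParam    = ext;
    encPar.NumExtParam = 2;

    return MFXVideoENCODE_Init(session, &encPar);
}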

Public Members

mfxExtBuffer Header

Extension buffer header. Header.BufferId must be equal to MFX_EXTBUFF_BRC.

mfxHDL pthis

Pointer to the BRC object.

mfxStatus (*Init)(mfxHDL pthis, mfxVideoParam *par)

This function initializes BRC session according to parameters from input mfxVideoParam and attached structures. It does not modify in any way the input mfxVideoParam and attached structures. Invoked during MFXVideoENCODE_Init.

Return

MFX_ERR_NONE if no error.

MFX_ERR_UNSUPPORTED The function detected unsupported video parameters.

Parameters
  • [in] pthis: Pointer to the BRC object.

  • [in] par: Pointer to the mfxVideoParam structure that was used for the encoder initialization.

mfxStatus (*Reset)(mfxHDL pthis, mfxVideoParam *par)

This function resets BRC session according to new parameters. It does not modify in any way the input mfxVideoParam and attached structures. Invoked during MFXVideoENCODE_Reset.

Return

MFX_ERR_NONE if no error.

MFX_ERR_UNSUPPORTED The function detected unsupported video parameters.

MFX_ERR_INCOMPATIBLE_VIDEO_PARAM The function detected that provided by the application video parameters are incompatible with initialization parameters. Reset requires additional memory allocation and cannot be executed.

Parameters
  • [in] pthis: Pointer to the BRC object

  • [in] par: Pointer to the mfxVideoParam structure that was used for the encoder initialization

mfxStatus (*Close)(mfxHDL pthis)

This function de-allocates any internal resources acquired in Init for this BRC session. Invoked during MFXVideoENCODE_Close.

Return

MFX_ERR_NONE if no error.

Parameters
  • [in] pthis: Pointer to the BRC object.

mfxStatus (*GetFrameCtrl)(mfxHDL pthis, mfxBRCFrameParam *par, mfxBRCFrameCtrl *ctrl)

This function returns controls (ctrl) to encode next frame based on info from input mfxBRCFrameParam structure (par) and internal BRC state. Invoked asynchronously before each frame encoding or recoding.

Return

MFX_ERR_NONE if no error.

Parameters
  • [in] pthis: Pointer to the BRC object.

  • [in] par: Pointer to the input mfxBRCFrameParam structure with frame parameters.

  • [out] ctrl: Pointer to the output mfxBRCFrameCtrl structure.

mfxStatus (*Update)(mfxHDL pthis, mfxBRCFrameParam *par, mfxBRCFrameCtrl *ctrl, mfxBRCFrameStatus *status)

This function updates internal BRC state and returns status to instruct encoder whether it should recode previous frame, skip it, do padding or proceed to next frame based on info from input mfxBRCFrameParam and mfxBRCFrameCtrl structures. Invoked asynchronously after each frame encoding or recoding.

Return

MFX_ERR_NONE if no error.

Parameters
  • [in] pthis: Pointer to the BRC object.

  • [in] par: Pointer to the input mfxBRCFrameParam structure with frame parameters.

  • [in] ctrl: Pointer to the input mfxBRCFrameCtrl structure.

  • [out] status: Pointer to the output mfxBRCFrameStatus structure.

VP8 Extension Buffers

mfxExtVP8CodingOption
struct mfxExtVP8CodingOption

Public Members

mfxExtBuffer Header

Extension buffer header. Header.BufferId must be equal to MFX_EXTBUFF_VP8_CODING_OPTION.

mfxU16 Version

Determines the bitstream version. Corresponds to the same VP8 syntax element in frame_tag.

mfxU16 EnableMultipleSegments

Set this option to ON to enable segmentation. This is a tri-state option. See the CodingOptionValue enumerator for values of this option.

mfxU16 LoopFilterType

Selects the type of filter (normal or simple). Corresponds to the VP8 syntax element filter_type.

mfxU16 LoopFilterLevel[4]

Controls the filter strength. Corresponds to VP8 syntax element loop_filter_level.

mfxU16 SharpnessLevel

Controls the filter sensitivity. Corresponds to VP8 syntax element sharpness_level.

mfxU16 NumTokenPartitions

Specifies number of token partitions in the coded frame.

mfxI16 LoopFilterRefTypeDelta[4]

Loop filter level delta for reference type (intra, last, golden, altref).

mfxI16 LoopFilterMbModeDelta[4]

Loop filter level delta for MB modes.

mfxI16 SegmentQPDelta[4]

QP delta for segment.

mfxI16 CoeffTypeQPDelta[5]

QP delta for coefficient type (YDC, Y2AC, Y2DC, UVAC, UVDC).

mfxU16 WriteIVFHeaders

Set this option to ON to enable insertion of IVF container headers into the bitstream. This is a tri-state option. See the CodingOptionValue enumerator for values of this option.

mfxU32 NumFramesForIVFHeader

Specifies number of frames for IVF header when WriteIVFHeaders is ON.

JPEG Extension Buffers

mfxExtJPEGQuantTables
struct mfxExtJPEGQuantTables

The mfxExtJPEGQuantTables structure specifies quantization tables. The application may specify up to 4 quantization tables. The SDK encoder assigns an ID to each table; that ID is equal to the table index in the Qm array. Table “0” is used for encoding of the Y component, table “1” for the U component, and table “2” for the V component. The application may specify fewer tables than the number of components in the image. If two tables are specified, then table “1” is used for both U and V components. If only one table is specified, it is used for all components in the image. The table below illustrates this behavior.

Public Members

mfxExtBuffer Header

Extension buffer header. Header.BufferId must be equal to MFX_EXTBUFF_JPEG_QT.

mfxU16 NumTable

Number of quantization tables defined in the Qm array.

mfxU16 Qm[4][64]

Quantization table values.

Number of tables   Table ID 0   Table ID 1   Table ID 2
0                  Y, U, V
1                  Y            U, V
2                  Y            U            V

mfxExtJPEGHuffmanTables
struct mfxExtJPEGHuffmanTables

The mfxExtJPEGHuffmanTables structure specifies Huffman tables. The application may specify up to 2 Huffman table pairs for the baseline process. The SDK encoder assigns an ID to each table; that ID is equal to the table index in the DCTables and ACTables arrays. Table “0” is used for encoding of the Y component, table “1” for the U and V components. The application may specify only one table; in this case it will be used for all components in the image. The table below illustrates this behavior.

Public Members

mfxExtBuffer Header

Extension buffer header. Header.BufferId must be equal to MFX_EXTBUFF_JPEG_HUFFMAN.

mfxU16 NumDCTable

Number of DC Huffman tables defined in the DCTables array.

mfxU16 NumACTable

Number of AC Huffman tables defined in the ACTables array.

mfxU8 Bits[16]

Number of codes for each code length.

mfxU8 Values[12]

List of the 8-bit symbol values.

struct mfxExtJPEGHuffmanTables::[anonymous] DCTables[4]

Array of DC tables.

struct mfxExtJPEGHuffmanTables::[anonymous] ACTables[4]

Array of AC tables.

Number of tables   Table ID 0   Table ID 1
0                  Y, U, V
1                  Y            U, V

MVC Extension Buffers

mfxMVCViewDependency
struct mfxMVCViewDependency

This structure describes MVC view dependencies.

Public Members

mfxU16 ViewId

View identifier of this dependency structure.

mfxU16 NumAnchorRefsL0

Number of view components for inter-view prediction in the initial reference picture list RefPicList0 for anchor view components.

mfxU16 NumAnchorRefsL1

Number of view components for inter-view prediction in the initial reference picture list RefPicList1 for anchor view components.

mfxU16 AnchorRefL0[16]

View identifiers of the view components for inter-view prediction in the initial reference picture list RefPicList0 for anchor view components.

mfxU16 AnchorRefL1[16]

View identifiers of the view components for inter-view prediction in the initial reference picture list RefPicList1 for anchor view components.

mfxU16 NumNonAnchorRefsL0

Number of view components for inter-view prediction in the initial reference picture list RefPicList0 for non-anchor view components.

mfxU16 NumNonAnchorRefsL1

Number of view components for inter-view prediction in the initial reference picture list RefPicList1 for non-anchor view components.

mfxU16 NonAnchorRefL0[16]

View identifiers of the view components for inter-view prediction in the initial reference picture list RefPicList0 for non-anchor view components.

mfxU16 NonAnchorRefL1[16]

View identifiers of the view components for inter-view prediction in the initial reference picture list RefPicList1 for non-anchor view components.

mfxMVCOperationPoint
struct mfxMVCOperationPoint

The mfxMVCOperationPoint structure describes the MVC operation point.

Public Members

mfxU16 TemporalId

Temporal identifier of the operation point.

mfxU16 LevelIdc

Level value signaled for the operation point.

mfxU16 NumViews

Number of views required for decoding the target output views corresponding to the operation point.

mfxU16 NumTargetViews

Number of target output views for the operation point.

mfxU16 *TargetViewId

View identifiers of the target output views for operation point.

mfxExtMVCSeqDesc
struct mfxExtMVCSeqDesc

The mfxExtMVCSeqDesc structure describes the MVC stream information of view dependencies, view identifiers, and operation points. See the ITU*-T H.264 specification chapter H.7.3.2.1.4 for details.

Public Members

mfxExtBuffer Header

Extension buffer header. Header.BufferId must be equal to MFX_EXTBUFF_MVC_SEQUENCE_DESCRIPTION.

mfxU32 NumView

Number of views.

mfxU32 NumViewAlloc

The allocated view dependency array size.

mfxMVCViewDependency *View

Pointer to a list of the mfxMVCViewDependency.

mfxU32 NumViewId

Number of view identifiers.

mfxU32 NumViewIdAlloc

The allocated view identifier array size.

mfxU16 *ViewId

Pointer to the list of view identifier.

mfxU32 NumOP

Number of operation points.

mfxU32 NumOPAlloc

The allocated operation point array size.

mfxMVCOperationPoint *OP

Pointer to a list of the mfxMVCOperationPoint structure.

mfxU16 NumRefsTotal

Total number of reference frames in all views required to decode the stream. This value is returned from the MFXVideoDECODE_DecodeHeader function. Do not modify this value.

mfxExtMVCTargetViews
struct mfxExtMVCTargetViews

The mfxExtMVCTargetViews structure configures views for the decoding output.

Public Members

mfxExtBuffer Header

Extension buffer header. Header.BufferId must be equal to MFX_EXTBUFF_MVC_TARGET_VIEWS.

mfxU16 TemporalId

The temporal identifier to be decoded.

mfxU32 NumView

The number of views to be decoded.

mfxU16 ViewId[1024]

List of view identifiers to be decoded.

mfxExtEncToolsConfig
struct mfxExtEncToolsConfig

The mfxExtEncToolsConfig structure configures EncTools for SDK encoders. It can be attached to the mfxVideoParam structure during an MFXVideoENCODE_Init or MFXVideoENCODE_Reset call. If the mfxExtEncToolsConfig buffer isn’t attached during initialization, EncTools is disabled. If the buffer isn’t attached for the MFXVideoENCODE_Reset call, the encoder continues to use the mfxExtEncToolsConfig that was in effect before.

If EncTools is unsupported in the encoder, MFX_ERR_UNSUPPORTED is returned from MFXVideoENCODE_Query and MFX_ERR_INVALID_VIDEO_PARAM is returned from MFXVideoENCODE_Init. If any EncTools feature is on and not compatible with other video parameters, MFX_WRN_INCOMPATIBLE_VIDEO_PARAM is returned from the Init and Query functions.

Some features can require delay before encoding can start. Parameter mfxExtCodingOption2::LookaheadDepth can be used to limit the delay. EncTools features requiring longer delay will be disabled.

If a field in mfxExtEncToolsConfig is set to MFX_CODINGOPTION_UNKNOWN, the corresponding feature will be enabled if it is compatible with other video parameters.

The actual EncTools configuration can be obtained using the MFXVideoENCODE_GetVideoParam function with an attached mfxExtEncToolsConfig buffer.
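A minimal sketch of enabling a few EncTools features at encoder initialization, assuming encPar already carries the remaining encoder configuration:

#include "mfxvideo.h"

static mfxStatus InitEncodeWithEncTools(mfxSession session, mfxVideoParam encPar)
{
    mfxExtEncToolsConfig cfg = {};
    cfg.Header.BufferId = MFX_EXTBUFF_ENCTOOLS_CONFIG;
    cfg.Header.BufferSz = sizeof(cfg);
    cfg.SceneChange = MFX_CODINGOPTION_ON;      // enable scene change analysis
    cfg.AdaptiveI   = MFX_CODINGOPTION_ON;      // content-dependent Intra frame placement
    cfg.AdaptiveB   = MFX_CODINGOPTION_ON;      // content-dependent B frame placement
    // Remaining fields stay MFX_CODINGOPTION_UNKNOWN (zero) and are enabled automatically when compatible.

    mfxExtBuffer *ext[] = { &cfg.Header };
    encPar.ExtParam    = ext;
    encPar.NumExtParam = 1;

    return MFXVideoENCODE_Init(session, &encPar);
}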

Public Members

mfxExtBuffer Header

Extension buffer header. Header.BufferId must be equal to MFX_EXTBUFF_ENCTOOLS_CONFIG.

mfxStructVersion Version

The version of the structure.

mfxU16 SceneChange

Tri-state flag for enabling/disabling “Scene change analysis” feature.

mfxU16 AdaptiveI

Tri-state flag for configuring “Frame type calculation” feature. Distance between Intra frames depends on the content.

mfxU16 AdaptiveB

Tri-state flag for configuring “Frame type calculation” feature. Distance between nearest P (or I) frames depends on the content.

mfxU16 AdaptiveRefP

Tri-state flag for configuring the “Reference frame list calculation” feature. The most useful reference frames are calculated for P frames.

mfxU16 AdaptiveRefB

Tri-state flag for configuring the “Reference frame list calculation” feature. The most useful reference frames are calculated for B frames.

mfxU16 AdaptiveLTR

Tri-state flag for configuring the “Reference frame list calculation” feature. The most useful reference frames are calculated as LTR.

mfxU16 AdaptivePyramidQuantP

Tri-state flag for configuring “Delta QP hints” feature. Delta QP is calculated for P frames.

mfxU16 AdaptivePyramidQuantB

Tri-state flag for configuring “Delta QP hints” feature. Delta QP is calculated for B frames.

mfxU16 AdaptiveQuantMatrices

Tri-state flag for configuring “Adaptive quantization matrix” feature.

mfxU16 BRCBufferHints

Tri-state flag for enabling/disabling the “BRC buffer hints” feature: calculation of optimal frame size, HRD buffer fullness, etc.

mfxU16 BRC

Tri-state flag for enabling/disabling the “BRC” functionality: QP calculation for frame encoding and encoding status calculation after frame encoding.

PCP Extension Buffers

struct _mfxExtCencParam

This structure is used to pass decryption status report index for Common Encryption usage model. The application can attach this extended buffer to the mfxBitstream structure at runtime.

Public Members

mfxExtBuffer Header

Extension buffer header. Header.BufferId must be equal to MFX_EXTBUFF_CENC_PARAM.

mfxU32 StatusReportIndex

Decryption status report index.

Functions

Implementation Capabilities

mfxHDL MFXQueryImplDescription(mfxImplCapsDeliveryFormat format)

This function delivers implementation capabilities in the requested format according to the format value.

Return

Handle to the capability report or NULL in case of unsupported format.

Parameters
  • [in] format: Format in which capabilities must be delivered. See mfxImplCapsDeliveryFormat for more details.

mfxStatus MFXReleaseImplDescription(mfxHDL hdl)

This function destroys the handle allocated by the MFXQueryImplDescription function.

Return

MFX_ERR_NONE The function completed successfully.

Parameters
  • [in] hdl: Handle to destroy. Can be equal to NULL.

Session Management

mfxStatus MFXInit(mfxIMPL impl, mfxVersion *ver, mfxSession *session)

This function creates and initializes an SDK session. Call this function before calling any other SDK functions. If the desired implementation specified by impl is MFX_IMPL_AUTO, the function will search for the platform-specific SDK implementation. If the function cannot find it, it will use the software implementation.

The argument ver indicates the desired version of the library implementation. The loaded SDK will have an API version compatible to the specified version (equal in the major version number, and no less in the minor version number.) If the desired version is not specified, the default is to use the API version from the SDK release, with which an application is built.

We recommend that production applications always specify the minimum API version that meets their functional requirements. For example, if an application uses only H.264 decoding as described in API v1.0, have the application initialize the library with API v1.0. This ensures backward compatibility.

Return

MFX_ERR_NONE The function completed successfully. The output parameter contains the handle of the session.

MFX_ERR_UNSUPPORTED The function cannot find the desired SDK implementation or version.

Parameters
  • [in] impl: mfxIMPL enumerator that indicates the desired SDK implementation.

  • [in] ver: Pointer to the minimum library version or zero, if not specified.

  • [out] session: Pointer to the SDK session handle.
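For illustration, a minimal session creation and teardown sketch that requests API version 1.0, matching the example in the description above:

#include "mfxvideo.h"

int main()
{
    mfxVersion ver = {};
    ver.Major = 1;                      // minimum API version the application needs
    ver.Minor = 0;

    mfxSession session = nullptr;
    mfxStatus sts = MFXInit(MFX_IMPL_AUTO, &ver, &session);
    if (sts != MFX_ERR_NONE)
        return 1;                       // no suitable implementation or version found

    // ... decode, encode, or process video using the session ...

    MFXClose(session);
    return 0;
}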

mfxStatus MFXInitEx(mfxInitParam par, mfxSession *session)

This function creates and initializes an SDK session. Call this function before calling any other SDK functions. If the desired implementation specified by par.Implementation is MFX_IMPL_AUTO, the function will search for the platform-specific SDK implementation. If the function cannot find it, it will use the software implementation.

The argument par.Version indicates the desired version of the library implementation. The loaded SDK will have an API version compatible to the specified version (equal in the major version number, and no less in the minor version number.) If the desired version is not specified, the default is to use the API version from the SDK release, with which an application is built.

We recommend that production applications always specify the minimum API version that meets their functional requirements. For example, if an application uses only H.264 decoding as described in API v1.0, have the application initialize the library with API v1.0. This ensures backward compatibility.

The argument par.ExternalThreads specifies the threading mode. Value 0 means that the SDK should internally create and handle work threads (this is essentially equivalent to regular MFXInit).

Return

MFX_ERR_NONE The function completed successfully. The output parameter contains the handle of the session.

MFX_ERR_UNSUPPORTED The function cannot find the desired SDK implementation or version.

Parameters
  • [in] par: mfxInitParam structure that indicates the desired SDK implementation, minimum library version and desired threading mode.

  • [out] session: Pointer to the SDK session handle.

mfxStatus MFXClose(mfxSession session)

This function completes and de-initializes an SDK session. Any active tasks in execution or in queue are aborted. The application cannot call any SDK function after this function.

All child sessions must be disjoined before closing a parent session.

Return

MFX_ERR_NONE The function completed successfully.

Parameters
  • [in] session: SDK session handle.

mfxStatus MFXQueryIMPL(mfxSession session, mfxIMPL *impl)

This function returns the implementation type of a given session.

Return

MFX_ERR_NONE The function completed successfully.

Parameters
  • [in] session: SDK session handle.

  • [out] impl: Pointer to the implementation type

mfxStatus MFXQueryVersion(mfxSession session, mfxVersion *version)

This function returns the SDK implementation version.

Return

MFX_ERR_NONE The function completed successfully.

Parameters
  • [in] session: SDK session handle.

  • [out] version: Pointer to the returned implementation version.

mfxStatus MFXJoinSession(mfxSession session, mfxSession child)

This function joins the child session to the current session.

After joining, the two sessions share thread and resource scheduling for asynchronous operations. However, each session still maintains its own device manager and buffer/frame allocator. Therefore, the application must use a compatible device manager and buffer/frame allocator to share data between two joined sessions.

The application can join multiple sessions by calling this function multiple times. When joining the first two sessions, the current session becomes the parent responsible for thread and resource scheduling of any later joined sessions.

Joining of two parent sessions is not supported.

Return

MFX_ERR_NONE The function completed successfully.

MFX_WRN_IN_EXECUTION Active tasks are executing or in queue in one of the sessions. Call this function again after all tasks are completed.

MFX_ERR_UNSUPPORTED The child session cannot be joined with the current session.

Parameters
  • [inout] session: The current session handle.

  • [in] child: The child session handle to be joined

mfxStatus MFXDisjoinSession(mfxSession session)

This function removes the joined state of the current session. After disjoining, the current session becomes independent. The application must ensure there is no active task running in the session before calling this function.

Return

MFX_ERR_NONE The function completed successfully.

MFX_WRN_IN_EXECUTION Active tasks are executing or in queue in one of the sessions. Call this function again after all tasks are completed.

MFX_ERR_UNDEFINED_BEHAVIOR The session is independent, or this session is the parent of all joined sessions.

Parameters
  • [inout] session: The current session handle.

mfxStatus MFXCloneSession(mfxSession session, mfxSession *clone)

This function creates a clean copy of the current session. The cloned session is an independent session. It does not inherit any user-defined buffer, frame allocator, or device manager handles from the current session. This function is a light-weight equivalent of MFXJoinSession after MFXInit.

Return

MFX_ERR_NONE The function completed successfully.

Parameters
  • [in] session: The current session handle.

  • [out] clone: Pointer to the cloned session handle.

mfxStatus MFXSetPriority(mfxSession session, mfxPriority priority)

This function sets the current session priority.

Return

MFX_ERR_NONE The function completed successfully.

Parameters
  • [in] session: The current session handle.

  • [in] priority: Priority value.

mfxStatus MFXGetPriority(mfxSession session, mfxPriority *priority)

This function returns the current session priority.

Return

MFX_ERR_NONE The function completed successfully.

Parameters
  • [in] session: The current session handle.

  • [out] priority: Pointer to the priority value.

VideoCORE

mfxStatus MFXVideoCORE_SetFrameAllocator(mfxSession session, mfxFrameAllocator *allocator)

This function sets the external allocator callback structure for frame allocation. If the allocator argument is NULL, the SDK uses the default allocator, which allocates frames from system memory or hardware devices. The behavior of the SDK is undefined if it uses this function while the previous allocator is in use. A general guideline is to set the allocator immediately after initializing the session.

Return

MFX_ERR_NONE The function completed successfully.

Parameters
  • [in] session: SDK session handle.

  • [in] allocator: Pointer to the mfxFrameAllocator structure

mfxStatus MFXVideoCORE_SetHandle(mfxSession session, mfxHandleType type, mfxHDL hdl)

This function sets any essential system handle that SDK might use. If the specified system handle is a COM interface, the reference counter of the COM interface will increase. The counter will decrease when the SDK session closes.

Return

MFX_ERR_NONE The function completed successfully.

MFX_ERR_UNDEFINED_BEHAVIOR The same handle is redefined. For example, the function has been called twice with the same handle type or internal handle has been created by the SDK before this function call.

Parameters
  • [in] session: SDK session handle.

  • [in] type: Handle type

  • [in] hdl: Handle to be set

mfxStatus MFXVideoCORE_GetHandle(mfxSession session, mfxHandleType type, mfxHDL *hdl)

This function obtains system handles previously set by the MFXVideoCORE_SetHandle function. If the handler is a COM interface, the reference counter of the interface increases. The calling application must release the COM interface.

Return

MFX_ERR_NONE The function completed successfully.

MFX_ERR_UNDEFINED_BEHAVIOR Specified handle type not found.

Parameters
  • [in] session: SDK session handle.

  • [in] type: Handle type

  • [in] hdl: Pointer to the handle to be set

mfxStatus MFXVideoCORE_QueryPlatform(mfxSession session, mfxPlatform *platform)

This function returns information about current hardware platform.

Return

MFX_ERR_NONE The function completed successfully.

Parameters
  • [in] session: SDK session handle.

  • [out] platform: Pointer to the mfxPlatform structure

mfxStatus MFXVideoCORE_SyncOperation(mfxSession session, mfxSyncPoint syncp, mfxU32 wait)

This function initiates execution of an asynchronous function not already started and returns the status code after the specified asynchronous operation completes. If wait is zero, the function returns immediately.

Return

MFX_ERR_NONE The function completed successfully.

MFX_ERR_NONE_PARTIAL_OUTPUT The function completed successfully, bitstream contains a portion of the encoded frame according to required granularity.

MFX_WRN_IN_EXECUTION The specified asynchronous function is in execution.

MFX_ERR_ABORTED The specified asynchronous function aborted due to data dependency on a previous asynchronous function that did not complete.

Parameters
  • [in] session: SDK session handle.

  • [in] syncp: Sync point

  • [in] wait: wait time in milliseconds

Memory

mfxStatus MFXMemory_GetSurfaceForVPP(mfxSession session, mfxFrameSurface1 **surface)

This function returns surface which can be used as input for VPP. VPP should be initialized before this call. Surface should be released with mfxFrameSurface1::FrameInterface.Release(…) after usage. Value of mfxFrameSurface1::Data.Locked for returned surface is 0.

Return

MFX_ERR_NONE The function completed successfully.

MFX_ERR_NULL_PTR If surface is NULL.

MFX_ERR_INVALID_HANDLE If session was not initialized.

MFX_ERR_NOT_INITIALIZED If VPP wasn’t initialized (allocator needs to know surface size from somewhere).

MFX_ERR_MEMORY_ALLOC In case of any other internal allocation error.

Parameters
  • [in] session: SDK session handle.

  • [out] surface: Pointer is set to valid mfxFrameSurface1 object.

mfxStatus MFXMemory_GetSurfaceForEncode(mfxSession session, mfxFrameSurface1 **surface)

This function returns surface which can be used as input for Encoder. Encoder should be initialized before this call. Surface should be released with mfxFrameSurface1::FrameInterface.Release(…) after usage. Value of mfxFrameSurface1::Data.Locked for returned surface is 0.

Return

MFX_ERR_NONE The function completed successfully.

MFX_ERR_NULL_PTR If surface is NULL.

MFX_ERR_INVALID_HANDLE If session was not initialized.

MFX_ERR_NOT_INITIALIZED If Encoder wasn’t initialized (allocator needs to know surface size from somewhere).

MFX_ERR_MEMORY_ALLOC In case of any other internal allocation error.

Parameters
  • [in] session: SDK session handle.

  • [out] surface: Pointer is set to valid mfxFrameSurface1 object.

mfxStatus MFXMemory_GetSurfaceForDecode(mfxSession session, mfxFrameSurface1 **surface)

This function returns a surface which can be used as input for the Decoder. The Decoder should be initialized before this call. The surface should be released with mfxFrameSurface1::FrameInterface.Release(…) after usage. The value of mfxFrameSurface1::Data.Locked for the returned surface is 0. Note: this function was added to simplify the transition from legacy surface management to the internal allocation approach. Previously, the user allocated surfaces for the working pool and fed the decoder with them in DecodeFrameAsync calls. With MFXMemory_GetSurfaceForDecode it is possible to change the existing pipeline by just changing the source of work surfaces. Newly developed applications should prefer direct usage of DecodeFrameAsync with internal allocation.

Return

MFX_ERR_NONE The function completed successfully.

MFX_ERR_NULL_PTR If surface is NULL.

MFX_ERR_INVALID_HANDLE If session was not initialized.

MFX_ERR_NOT_INITIALIZED If Decoder wasn’t initialized (allocator needs to know surface size from somewhere).

MFX_ERR_MEMORY_ALLOC In case of any other internal allocation error.

Parameters
  • [in] session: SDK session handle.

  • [out] surface: Pointer is set to valid mfxFrameSurface1 object.
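A sketch of one decode iteration that takes its work surface from the SDK-allocated pool; the decoder is assumed to be initialized, bs is assumed to hold input data, and status handling is shortened:

#include "mfxvideo.h"

static mfxStatus DecodeOneFrame(mfxSession session, mfxBitstream *bs)
{
    mfxFrameSurface1 *work = nullptr;
    mfxStatus sts = MFXMemory_GetSurfaceForDecode(session, &work);
    if (sts != MFX_ERR_NONE)
        return sts;

    mfxFrameSurface1 *out  = nullptr;
    mfxSyncPoint     syncp = nullptr;
    sts = MFXVideoDECODE_DecodeFrameAsync(session, bs, work, &out, &syncp);
    if (sts == MFX_ERR_NONE && syncp) {
        MFXVideoCORE_SyncOperation(session, syncp, 60000);   // wait up to 60 seconds
        // ... consume the decoded picture in `out` ...
    }
    // MFX_ERR_MORE_DATA / MFX_ERR_MORE_SURFACE would be handled by the caller's loop.

    work->FrameInterface->Release(work);                     // drop the application reference
    return sts;
}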

VideoENCODE

mfxStatus MFXVideoENCODE_Query(mfxSession session, mfxVideoParam *in, mfxVideoParam *out)

This function works in one of four modes:

If the in pointer is zero, the function returns the class configurability in the output structure. A non-zero value in each field of the output structure indicates that the SDK implementation can configure the field with Init.

If the in parameter is non-zero, the function checks the validity of the fields in the input structure. Then the function returns the corrected values in the output structure. If there is insufficient information to determine the validity or correction is impossible, the function zeroes the fields. This feature can verify whether the SDK implementation supports certain profiles, levels or bitrates.

If the in parameter is non-zero and mfxExtEncoderResetOption structure is attached to it, then the function queries for the outcome of the MFXVideoENCODE_Reset function and returns it in the mfxExtEncoderResetOption structure attached to out. The query function succeeds if such reset is possible and returns error otherwise. Unlike other modes that are independent of the SDK encoder state, this one checks if reset is possible in the present SDK encoder state. This mode also requires completely defined mfxVideoParam structure, unlike other modes that support partially defined configurations. See mfxExtEncoderResetOption description for more details.

If the in parameter is non-zero and mfxExtEncoderCapability structure is attached to it, then the function returns encoder capability in mfxExtEncoderCapability structure attached to out. It is recommended to fill in mfxVideoParam structure and set hardware acceleration device handle before calling the function in this mode.

The application can call this function before or after it initializes the encoder. The CodecId field of the output structure is a mandated field (to be filled by the application) to identify the coding standard.

Return

MFX_ERR_NONE The function completed successfully.

MFX_ERR_UNSUPPORTED The function failed to identify a specific implementation for the required features.

MFX_WRN_PARTIAL_ACCELERATION The underlying hardware does not fully support the specified video parameters. The encoding may be partially accelerated. Only SDK HW implementations may return this status code.

MFX_WRN_INCOMPATIBLE_VIDEO_PARAM The function detected some video parameters were incompatible with others; incompatibility resolved.

Parameters
  • [in] session: SDK session handle.

  • [in] in: Pointer to the mfxVideoParam structure as input

  • [out] out: Pointer to the mfxVideoParam structure as output
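As an illustration of the first query mode, a sketch that asks which mfxVideoParam fields the implementation can configure for AVC encoding; only CodecId is filled by the application:

#include "mfxvideo.h"

static void QueryEncodeConfigurability(mfxSession session)
{
    mfxVideoParam caps = {};
    caps.mfx.CodecId = MFX_CODEC_AVC;            // mandated field, filled by the application

    mfxStatus sts = MFXVideoENCODE_Query(session, nullptr, &caps);
    if (sts == MFX_ERR_NONE) {
        // Non-zero fields in `caps` mark parameters that can be configured with Init,
        // for example fields such as caps.mfx.RateControlMethod or caps.mfx.TargetKbps.
    }
}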

mfxStatus MFXVideoENCODE_QueryIOSurf(mfxSession session, mfxVideoParam *par, mfxFrameAllocRequest *request)

This function returns minimum and suggested numbers of the input frame surfaces required for encoding initialization and their type. Init will call the external allocator for the required frames with the same set of numbers. The use of this function is recommended. For more information, see the section Working with hardware acceleration. This function does not validate I/O parameters except those used in calculating the number of input surfaces.

Return

MFX_ERR_NONE The function completed successfully.

MFX_ERR_INVALID_VIDEO_PARAM The function detected invalid video parameters. These parameters may be out of the valid range, or the combination of them resulted in incompatibility. Incompatibility not resolved.

MFX_WRN_PARTIAL_ACCELERATION The underlying hardware does not fully support the specified video parameters. The encoding may be partially accelerated. Only SDK HW implementations may return this status code.

MFX_WRN_INCOMPATIBLE_VIDEO_PARAM The function detected some video parameters were incompatible with others; incompatibility resolved.

Parameters
  • [in] session: SDK session handle.

  • [in] par: Pointer to the mfxVideoParam structure as input

  • [out] request: Pointer to the mfxFrameAllocRequest structure as output

mfxStatus MFXVideoENCODE_Init(mfxSession session, mfxVideoParam *par)

This function allocates memory and prepares tables and necessary structures for encoding. This function also does extensive validation to ensure if the configuration, as specified in the input parameters, is supported.

Return

MFX_ERR_NONE The function completed successfully.

MFX_ERR_INVALID_VIDEO_PARAM The function detected invalid video parameters. These parameters may be out of the valid range, or the combination of them resulted in incompatibility. Incompatibility not resolved.

MFX_WRN_PARTIAL_ACCELERATION The underlying hardware does not fully support the specified video parameters. The encoding may be partially accelerated. Only SDK HW implementations may return this status code.

MFX_WRN_INCOMPATIBLE_VIDEO_PARAM The function detected some video parameters were incompatible with others; incompatibility resolved.

MFX_ERR_UNDEFINED_BEHAVIOR The function is called twice without a close.

Parameters
  • [in] session: SDK session handle.

  • [in] par: Pointer to the mfxVideoParam structure

mfxStatus MFXVideoENCODE_Reset(mfxSession session, mfxVideoParam *par)

This function stops the current encoding operation and restores internal structures or parameters for a new encoding operation, possibly with new parameters.

Return

MFX_ERR_NONE The function completed successfully.

MFX_ERR_INVALID_VIDEO_PARAM The function detected invalid video parameters. These parameters may be out of the valid range, or the combination of them resulted in incompatibility. Incompatibility not resolved.

MFX_ERR_INCOMPATIBLE_VIDEO_PARAM The function detected that provided by the application video parameters are incompatible with initialization parameters. Reset requires additional memory allocation and cannot be executed. The application should close the SDK component and then reinitialize it.

MFX_WRN_INCOMPATIBLE_VIDEO_PARAM The function detected some video parameters were incompatible with others; incompatibility resolved.

Parameters
  • [in] session: SDK session handle.

  • [in] par: Pointer to the mfxVideoParam structure

mfxStatus MFXVideoENCODE_Close(mfxSession session)

This function terminates the current encoding operation and de-allocates any internal tables or structures.

Return

MFX_ERR_NONE The function completed successfully.

Parameters
  • [in] session: SDK session handle.

mfxStatus MFXVideoENCODE_GetVideoParam(mfxSession session, mfxVideoParam *par)

This function retrieves current working parameters to the specified output structure. If extended buffers are to be returned, the application must allocate those extended buffers and attach them as part of the output structure. The application can retrieve a copy of the bitstream header, by attaching the mfxExtCodingOptionSPSPPS structure to the mfxVideoParam structure.

Return

MFX_ERR_NONE The function completed successfully.

Parameters
  • [in] session: SDK session handle.

  • [in] par: Pointer to the corresponding parameter structure

mfxStatus MFXVideoENCODE_GetEncodeStat(mfxSession session, mfxEncodeStat *stat)

This function obtains statistics collected during encoding.

Return

MFX_ERR_NONE The function completed successfully.

Parameters
  • [in] session: SDK session handle.

  • [in] stat: Pointer to the mfxEncodeStat structure

mfxStatus MFXVideoENCODE_EncodeFrameAsync(mfxSession session, mfxEncodeCtrl *ctrl, mfxFrameSurface1 *surface, mfxBitstream *bs, mfxSyncPoint *syncp)

This function takes a single input frame in either encoded or display order and generates its output bitstream. In the case of encoded ordering the mfxEncodeCtrl structure must specify the explicit frame type. In the case of display ordering, this function handles frame order shuffling according to the GOP structure parameters specified during initialization.

Since encoding may process frames in a different order than the input order, not every call of the function generates output; in that case the function returns MFX_ERR_MORE_DATA. If the encoder needs to cache the frame, the function locks the frame. The application should not alter the frame until the encoder unlocks the frame. If there is output (with return status MFX_ERR_NONE), the return is one frame's worth of bitstream.

It is the calling application’s responsibility to ensure that there is sufficient space in the output buffer. The value BufferSizeInKB in the mfxVideoParam structure at encoding initialization specifies the maximum possible size for any compressed frames. This value can also be obtained from MFXVideoENCODE_GetVideoParam after encoding initialization.

To mark the end of the encoding sequence, call this function with a NULL surface pointer. Repeat the call to drain any remaining internally cached bitstreams (one frame at a time) until MFX_ERR_MORE_DATA is returned.

This function is asynchronous.

Return

MFX_ERR_NONE The function completed successfully.

MFX_ERR_NOT_ENOUGH_BUFFER The bitstream buffer size is insufficient.

MFX_ERR_MORE_DATA The function requires more data to generate any output.

MFX_ERR_DEVICE_LOST Hardware device was lost; See Working with Microsoft* DirectX* Applications section for further information.

MFX_WRN_DEVICE_BUSY Hardware device is currently busy. Call this function again in a few milliseconds.

MFX_ERR_INCOMPATIBLE_VIDEO_PARAM Inconsistent parameters detected not conforming to Appendix A.

Parameters
  • [in] session: SDK session handle.

  • [in] ctrl: Pointer to the mfxEncodeCtrl structure for per-frame encoding control; this parameter is optional (it can be NULL) if the encoder works in the display order mode.

  • [in] surface: Pointer to the frame surface structure

  • [out] bs: Pointer to the output bitstream

  • [out] syncp: Pointer to the returned sync point associated with this operation
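A sketch of the end-of-stream drain loop described above; bs is assumed to be an mfxBitstream allocated by the application with at least BufferSizeInKB kilobytes of space:

#include "mfxvideo.h"

static void DrainEncoder(mfxSession session, mfxBitstream *bs)
{
    for (;;) {
        mfxSyncPoint syncp = nullptr;
        mfxStatus sts = MFXVideoENCODE_EncodeFrameAsync(session, nullptr, nullptr, bs, &syncp);

        if (sts == MFX_ERR_MORE_DATA)
            break;                                   // all internally cached frames are drained
        if (sts == MFX_WRN_DEVICE_BUSY)
            continue;                                // a real application would sleep a few milliseconds
        if (sts != MFX_ERR_NONE)
            break;                                   // full error handling omitted in this sketch

        MFXVideoCORE_SyncOperation(session, syncp, 60000);   // wait up to 60 seconds
        // ... write bs->DataLength bytes starting at bs->Data + bs->DataOffset ...
        bs->DataLength = 0;
        bs->DataOffset = 0;
    }
}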

VideoDECODE

mfxStatus MFXVideoDECODE_Query(mfxSession session, mfxVideoParam *in, mfxVideoParam *out)

This function works in one of two modes:

1. If the in pointer is zero, the function returns the class configurability in the output structure. A non-zero value in each field of the output structure indicates that the field is configurable by the SDK implementation with the MFXVideoDECODE_Init function.

2. If the in parameter is non-zero, the function checks the validity of the fields in the input structure. Then the function returns the corrected values to the output structure. If there is insufficient information to determine the validity or correction is impossible, the function zeros the fields. This feature can verify whether the SDK implementation supports certain profiles, levels or bitrates.

The application can call this function before or after it initializes the decoder. The CodecId field of the output structure is a mandated field (to be filled by the application) to identify the coding standard.

Return

MFX_ERR_NONE The function completed successfully.

MFX_ERR_UNSUPPORTED The function failed to identify a specific implementation for the required features.

MFX_WRN_PARTIAL_ACCELERATION The underlying hardware does not fully support the specified video parameters. The encoding may be partially accelerated. Only SDK HW implementations may return this status code.

MFX_WRN_INCOMPATIBLE_VIDEO_PARAM The function detected some video parameters were incompatible with others; incompatibility resolved.

Parameters
  • [in] session: SDK session handle.

  • [in] in: Pointer to the mfxVideoParam structure as input

  • [out] out: Pointer to the mfxVideoParam structure as output
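
A sketch of the second mode (in is non-zero); the AVC profile and level values are illustrative assumptions, and only CodecId is mandated by the text above.

#include <string.h>
#include "mfxvideo.h"

/* Ask whether the implementation can decode AVC High profile, level 4.1. */
static int is_avc_high_41_supported(mfxSession session)
{
    mfxVideoParam in, out;
    memset(&in, 0, sizeof(in));
    memset(&out, 0, sizeof(out));

    in.mfx.CodecId      = MFX_CODEC_AVC;
    in.mfx.CodecProfile = MFX_PROFILE_AVC_HIGH;
    in.mfx.CodecLevel   = MFX_LEVEL_AVC_41;
    out.mfx.CodecId     = MFX_CODEC_AVC;    /* mandated in the output structure */

    mfxStatus sts = MFXVideoDECODE_Query(session, &in, &out);
    return sts == MFX_ERR_NONE || sts == MFX_WRN_PARTIAL_ACCELERATION;
}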

mfxStatus MFXVideoDECODE_DecodeHeader(mfxSession session, mfxBitstream *bs, mfxVideoParam *par)

This function parses the input bitstream and fills the mfxVideoParam structure with appropriate values, such as resolution and frame rate, for the Init function. The application can then pass the resulting structure to the MFXVideoDECODE_Init function for decoder initialization.

An application can call this function at any time before or after decoder initialization. If the SDK finds a sequence header in the bitstream, the function moves the bitstream pointer to the first bit of the sequence header. Otherwise, the function moves the bitstream pointer close to the end of the bitstream buffer but leaves enough data in the buffer to avoid possible loss of start code.

The CodecId field of the mfxVideoParam structure is a mandated field (to be filled by the application) to identify the coding standard.

The application can retrieve a copy of the bitstream header, by attaching the mfxExtCodingOptionSPSPPS structure to the mfxVideoParam structure.

Return

MFX_ERR_NONE The function successfully filled the structure. This does not mean that the stream can be decoded by the SDK; the application should call the MFXVideoDECODE_Query function to check whether decoding of the stream is supported.

MFX_ERR_MORE_DATA The function requires more bitstream data.

MFX_ERR_UNSUPPORTED The CodecId field of the mfxVideoParam structure indicates an unsupported codec.

MFX_ERR_INVALID_HANDLE The session is not initialized.

MFX_ERR_NULL_PTR The bs or par pointer is NULL.

Parameters
  • [in] session: SDK session handle.

  • [in] bs: Pointer to the bitstream

  • [in] par: Pointer to the mfxVideoParam structure
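
A minimal sketch of the header-parsing step that precedes decoder initialization; the AVC codec choice and the system-memory IOPattern are assumptions for illustration, and bs must already hold the start of the stream.

#include <string.h>
#include "mfxvideo.h"

/* Parse stream headers into 'par' for MFXVideoDECODE_Init.
   MFX_ERR_MORE_DATA means more bitstream should be appended and the
   call repeated. */
static mfxStatus read_stream_header(mfxSession session, mfxBitstream *bs,
                                    mfxVideoParam *par)
{
    memset(par, 0, sizeof(*par));
    par->mfx.CodecId = MFX_CODEC_AVC;                 /* mandated field */
    par->IOPattern   = MFX_IOPATTERN_OUT_SYSTEM_MEMORY;

    return MFXVideoDECODE_DecodeHeader(session, bs, par);
}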

mfxStatus MFXVideoDECODE_QueryIOSurf(mfxSession session, mfxVideoParam *par, mfxFrameAllocRequest *request)

This function returns minimum and suggested numbers of the output frame surfaces required for decoding initialization and their type. Init will call the external allocator for the required frames with the same set of numbers. The use of this function is recommended. For more information, see the section Working with hardware acceleration. The CodecId field of the mfxVideoParam structure is a mandated field (to be filled by the application) to identify the coding standard. This function does not validate I/O parameters except those used in calculating the number of output surfaces.

Return

MFX_ERR_NONE The function completed successfully.

MFX_ERR_INVALID_VIDEO_PARAM The function detected invalid video parameters. These parameters may be out of the valid range, or the combination of them resulted in incompatibility. Incompatibility not resolved.

MFX_WRN_PARTIAL_ACCELERATION The underlying hardware does not fully support the specified video parameters. The encoding may be partially accelerated. Only SDK HW implementations may return this status code.

MFX_WRN_INCOMPATIBLE_VIDEO_PARAM The function detected some video parameters were incompatible with others; incompatibility resolved.

Parameters
  • [in] session: SDK session handle.

  • [in] par: Pointer to the mfxVideoParam structure as input

  • [out] request: Pointer to the mfxFrameAllocRequest structure as output

mfxStatus MFXVideoDECODE_Init(mfxSession session, mfxVideoParam *par)

This function allocates memory and prepares tables and necessary structures for decoding. This function also does extensive validation to verify that the configuration, as specified in the input parameters, is supported.

Return

MFX_ERR_NONE The function completed successfully.

MFX_ERR_INVALID_VIDEO_PARAM The function detected invalid video parameters. These parameters may be out of the valid range, or the combination of them resulted in incompatibility. Incompatibility not resolved.

MFX_WRN_PARTIAL_ACCELERATION The underlying hardware does not fully support the specified video parameters. The encoding may be partially accelerated. Only SDK HW implementations may return this status code.

MFX_WRN_INCOMPATIBLE_VIDEO_PARAM The function detected some video parameters were incompatible with others; incompatibility resolved.

MFX_ERR_UNDEFINED_BEHAVIOR The function is called twice without a close.

Parameters
  • [in] session: SDK session handle.

  • [in] par: Pointer to the mfxVideoParam structure
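
The typical flow, sketched under the assumption that par was already filled by MFXVideoDECODE_DecodeHeader with a system-memory IOPattern; the surface pool is only outlined, and a real application would also allocate the pixel planes.

#include <stdlib.h>
#include <string.h>
#include "mfxvideo.h"

/* Query the surface requirement, build a simple surface pool,
   and initialize the decoder. */
static mfxStatus init_decoder(mfxSession session, mfxVideoParam *par,
                              mfxFrameSurface1 **pool_out, mfxU16 *pool_size)
{
    mfxFrameAllocRequest request;
    memset(&request, 0, sizeof(request));

    mfxStatus sts = MFXVideoDECODE_QueryIOSurf(session, par, &request);
    if (sts < MFX_ERR_NONE)
        return sts;

    mfxU16 num = request.NumFrameSuggested;
    mfxFrameSurface1 *pool = (mfxFrameSurface1 *)calloc(num, sizeof(*pool));
    if (!pool)
        return MFX_ERR_MEMORY_ALLOC;
    for (mfxU16 i = 0; i < num; i++)
        pool[i].Info = request.Info;       /* frame parameters from the request */

    sts = MFXVideoDECODE_Init(session, par);
    if (sts < MFX_ERR_NONE) {
        free(pool);
        return sts;
    }
    *pool_out  = pool;
    *pool_size = num;
    return sts;    /* may be a warning such as MFX_WRN_PARTIAL_ACCELERATION */
}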

mfxStatus MFXVideoDECODE_Reset(mfxSession session, mfxVideoParam *par)

This function stops the current decoding operation and restores internal structures or parameters for a new decoding operation. Reset serves two purposes: it recovers the decoder from errors, and it restarts decoding from a new position. The function resets the old sequence header (sequence parameter set in H.264, or sequence header in MPEG-2 and VC-1). The decoder will expect a new sequence header before it decodes the next frame and will skip any bitstream data before encountering the new sequence header.

Return

MFX_ERR_NONE The function completed successfully.

MFX_ERR_INVALID_VIDEO_PARAM The function detected that video parameters are wrong or they conflict with initialization parameters. Reset is impossible.

MFX_ERR_INCOMPATIBLE_VIDEO_PARAM The function detected that the video parameters provided by the application are incompatible with the initialization parameters. Reset requires additional memory allocation and cannot be executed. The application should close the SDK component and then reinitialize it.

MFX_WRN_INCOMPATIBLE_VIDEO_PARAM The function detected some video parameters were incompatible with others; incompatibility resolved.

Parameters
  • [in] session: SDK session handle.

  • [in] par: Pointer to the mfxVideoParam structure

mfxStatus MFXVideoDECODE_Close(mfxSession session)

This function terminates the current decoding operation and de-allocates any internal tables or structures.

Return

MFX_ERR_NONE The function completed successfully.

Parameters
  • [in] session: SDK session handle.

mfxStatus MFXVideoDECODE_GetVideoParam(mfxSession session, mfxVideoParam *par)

This function retrieves current working parameters to the specified output structure. If extended buffers are to be returned, the application must allocate those extended buffers and attach them as part of the output structure. The application can retrieve a copy of the bitstream header, by attaching the mfxExtCodingOptionSPSPPS structure to the mfxVideoParam structure.

Return

MFX_ERR_NONE The function completed successfully.

Parameters
  • [in] session: SDK session handle.

  • [in] par: Pointer to the corresponding parameter structure

mfxStatus MFXVideoDECODE_GetDecodeStat(mfxSession session, mfxDecodeStat *stat)

This function obtains statistics collected during decoding.

Return

MFX_ERR_NONE The function completed successfully.

Parameters
  • [in] session: SDK session handle.

  • [in] stat: Pointer to the mfxDecodeStat structure

mfxStatus MFXVideoDECODE_SetSkipMode(mfxSession session, mfxSkipMode mode)

This function sets the decoder skip mode. The application may use it to increase decoding performance by sacrificing output quality. Raising the skip level first causes the decoder to skip some decoding operations, such as deblocking, and then leads to frame skipping: first B-frames, then P-frames. The exact behavior is platform dependent.

Return

MFX_ERR_NONE The function completed successfully.

MFX_WRN_VALUE_NOT_CHANGED The skip mode is not affected as the maximum or minimum skip range is reached.

Parameters
  • [in] session: SDK session handle.

  • [in] mode: Decoder skip mode. See the mfxSkipMode enumerator for details.
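
For example, an application that detects it is falling behind real time might raise the skip level one step; this is only a sketch of one possible policy.

#include "mfxvideo.h"

/* Trade output quality for speed by raising the skip level one step.
   MFX_WRN_VALUE_NOT_CHANGED means the decoder is already at its limit. */
static void relax_decode_quality(mfxSession session)
{
    mfxStatus sts = MFXVideoDECODE_SetSkipMode(session, MFX_SKIPMODE_MORE);
    (void)sts;     /* a warning here is not an error */
}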

mfxStatus MFXVideoDECODE_GetPayload(mfxSession session, mfxU64 *ts, mfxPayload *payload)

This function extracts user data (MPEG-2) or SEI (H.264) messages from the bitstream. Internally, the decoder implementation stores encountered user data or SEI messages. The application may call this function multiple times to retrieve the user data or SEI messages, one at a time.

If there is no payload available, the function returns with payload->NumBit=0.

Return

MFX_ERR_NONE The function completed successfully.

MFX_ERR_NOT_ENOUGH_BUFFER The payload buffer size is insufficient.

Parameters
  • [in] session: SDK session handle.

  • [in] ts: Pointer to the user data time stamp in units of 90 KHz; divide ts by 90,000 (90 KHz) to obtain the time in seconds; the time stamp matches the payload with a specific decoded frame.

  • [in] payload: Pointer to the mfxPayload structure; the payload contains user data in MPEG-2 or SEI messages in H.264.
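
A sketch of draining the stored messages after a frame has been decoded; the 256-byte buffer size is an arbitrary assumption.

#include <string.h>
#include "mfxvideo.h"

/* Retrieve the user data / SEI messages stored by the decoder, one at a time. */
static void drain_payloads(mfxSession session)
{
    mfxU8 buf[256];
    mfxU64 ts = 0;
    mfxPayload payload;

    for (;;) {
        memset(&payload, 0, sizeof(payload));
        payload.Data    = buf;
        payload.BufSize = (mfxU16)sizeof(buf);

        mfxStatus sts = MFXVideoDECODE_GetPayload(session, &ts, &payload);
        if (sts != MFX_ERR_NONE || payload.NumBit == 0)
            break;   /* no payload left, or the buffer is too small */

        /* payload.Data now holds NumBit/8 bytes of user data or SEI,
           time-stamped at ts / 90000.0 seconds. */
    }
}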

mfxStatus MFXVideoDECODE_DecodeFrameAsync(mfxSession session, mfxBitstream *bs, mfxFrameSurface1 *surface_work, mfxFrameSurface1 **surface_out, mfxSyncPoint *syncp)

This function decodes the input bitstream to a single output frame.

The surface_work parameter provides a working frame buffer for the decoder. The application should allocate the working frame buffer, which stores decoded frames. If the function requires caching frames after decoding, the function locks the frames and the application must provide a new frame buffer in the next call.

If, and only if, the function returns MFX_ERR_NONE, the pointer surface_out points to the output frame in the display order. If there are no further frames, the function will reset the pointer to zero and return the appropriate status code.

Before decoding the first frame, a sequence header (sequence parameter set in H.264, or sequence header in MPEG-2 and VC-1) must be present. The function skips any bitstream data before it encounters the new sequence header.

The input bitstream bs can be of any size. If there are not enough bits to decode a frame, the function returns MFX_ERR_MORE_DATA, and consumes all input bits except if a partial start code or sequence header is at the end of the buffer. In this case, the function leaves the last few bytes in the bitstream buffer. If there is more incoming bitstream, the application should append the incoming bitstream to the bitstream buffer. Otherwise, the application should ignore the remaining bytes in the bitstream buffer and apply the end of stream procedure described below.

The application must set bs to NULL to signal end of stream. The application may need to call this function several times to drain any internally cached frames until the function returns MFX_ERR_MORE_DATA.

If more than one frame is in the bitstream buffer, the function decodes until the buffer is consumed. The decoding process can be interrupted by events such as the decoder needing additional working buffers, readying a frame for retrieval, or encountering a new header. In these cases, the function returns an appropriate status code and moves the bitstream pointer to the remaining data.

The decoder may return MFX_ERR_NONE without taking any data from the input bitstream buffer. If the application appends additional data to the bitstream buffer, the buffer may contain more than one frame. It is recommended that the application invoke the function repeatedly until it returns MFX_ERR_MORE_DATA before appending any more data to the bitstream buffer. This function is asynchronous.

Return

MFX_ERR_NONE The function completed successfully and the output surface is ready for decoding

MFX_ERR_MORE_DATA The function requires more bitstream at input before decoding can proceed.

MFX_ERR_MORE_SURFACE The function requires more frame surface at output before decoding can proceed.

MFX_ERR_DEVICE_LOST Hardware device was lost; See the Working with Microsoft* DirectX* Applications section for further information.

MFX_WRN_DEVICE_BUSY Hardware device is currently busy. Call this function again in a few milliseconds.

MFX_WRN_VIDEO_PARAM_CHANGED The decoder detected a new sequence header in the bitstream. Video parameters may have changed.

MFX_ERR_INCOMPATIBLE_VIDEO_PARAM The decoder detected incompatible video parameters in the bitstream and failed to follow them.

MFX_ERR_REALLOC_SURFACE Bigger surface_work required. May be returned only if mfxInfoMFX::EnableReallocRequest was set to ON during initialization.

Parameters
  • [in] session: SDK session handle.

  • [in] bs: Pointer to the input bitstream

  • [in] surface_work: Pointer to the working frame buffer for the decoder

  • [out] surface_out: Pointer to the output frame in the display order

  • [out] syncp: Pointer to the sync point associated with this operation
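
The calling pattern can be sketched as follows; find_free_surface, refill_bitstream, and deliver_frame are hypothetical application helpers, and the surface pool is assumed to be the one sized by MFXVideoDECODE_QueryIOSurf.

#include "mfxvideo.h"

/* Hypothetical application helpers (not part of the SDK). */
extern mfxFrameSurface1 *find_free_surface(mfxFrameSurface1 *pool, mfxU16 size);
extern int refill_bitstream(mfxBitstream *bs);      /* returns 0 at end of stream */
extern void deliver_frame(mfxFrameSurface1 *frame);

/* Minimal decode loop over an application-managed surface pool. */
static mfxStatus decode_stream(mfxSession session, mfxBitstream *bs,
                               mfxFrameSurface1 *pool, mfxU16 pool_size)
{
    mfxStatus sts = MFX_ERR_NONE;
    mfxBitstream *in = bs;              /* becomes NULL to signal end of stream */
    mfxSyncPoint syncp = NULL;
    mfxFrameSurface1 *out = NULL;

    for (;;) {
        mfxFrameSurface1 *work = find_free_surface(pool, pool_size);

        syncp = NULL;
        sts = MFXVideoDECODE_DecodeFrameAsync(session, in, work, &out, &syncp);

        if (sts == MFX_WRN_DEVICE_BUSY)
            continue;                           /* retry after a short wait */
        if (sts == MFX_ERR_MORE_DATA) {
            if (!in)
                break;                          /* all cached frames drained */
            if (!refill_bitstream(bs))
                in = NULL;                      /* start end-of-stream drain */
            continue;
        }
        if (sts == MFX_ERR_MORE_SURFACE)
            continue;                           /* supply another work surface */
        if (sts < MFX_ERR_NONE)
            return sts;                         /* real error */

        if (syncp) {
            MFXVideoCORE_SyncOperation(session, syncp, MFX_INFINITE);
            deliver_frame(out);                 /* display-order output frame */
        }
    }
    return MFX_ERR_NONE;
}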

VideoVPP

mfxStatus MFXVideoVPP_Query(mfxSession session, mfxVideoParam *in, mfxVideoParam *out)

This function works in one of two modes:

1. If the in pointer is zero, the function returns the class configurability in the output structure. A non-zero value in a field indicates that the SDK implementation can configure the field with Init.

2. If the in parameter is non-zero, the function checks the validity of the fields in the input structure and returns the corrected values in the output structure. If there is insufficient information to determine validity, or correction is impossible, the function zeroes the fields.

The application can call this function before or after it initializes the preprocessor.

Return

MFX_ERR_NONE The function completed successfully.

MFX_ERR_UNSUPPORTED The SDK implementation does not support the specified configuration.

MFX_WRN_PARTIAL_ACCELERATION The underlying hardware does not fully support the specified video parameters. The video processing may be partially accelerated. Only SDK HW implementations may return this status code.

MFX_WRN_INCOMPATIBLE_VIDEO_PARAM The function detected some video parameters were incompatible with others; incompatibility resolved.

Parameters
  • [in] session: SDK session handle.

  • [in] in: Pointer to the mfxVideoParam structure as input

  • [out] out: Pointer to the mfxVideoParam structure as output

mfxStatus MFXVideoVPP_QueryIOSurf(mfxSession session, mfxVideoParam *par, mfxFrameAllocRequest request[2])

This function returns minimum and suggested numbers of the input frame surfaces required for video processing initialization and their type. The parameter request[0] refers to the input requirements; request[1] refers to output requirements. Init will call the external allocator for the required frames with the same set of numbers. The use of this function is recommended. For more information, see the section Working with hardware acceleration. This function does not validate I/O parameters except those used in calculating the number of input surfaces.

Return

MFX_ERR_NONE The function completed successfully.

MFX_ERR_INVALID_VIDEO_PARAM The function detected invalid video parameters. These parameters may be out of the valid range, or the combination of them resulted in incompatibility. Incompatibility not resolved.

MFX_WRN_PARTIAL_ACCELERATION The underlying hardware does not fully support the specified video parameters. The video processing may be partially accelerated. Only SDK HW implementations may return this status code.

MFX_WRN_INCOMPATIBLE_VIDEO_PARAM The function detected some video parameters were incompatible with others; incompatibility resolved.

Parameters
  • [in] session: SDK session handle.

  • [in] par: Pointer to the mfxVideoParam structure as input

  • [in] request: Pointer to the mfxFrameAllocRequest structure; use request[0] for input requirements and request[1] for output requirements for video processing.
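
A minimal sketch, assuming par->vpp.In and par->vpp.Out already describe the desired input and output formats.

#include <string.h>
#include "mfxvideo.h"

/* Query how many input and output surfaces a VPP configuration needs. */
static mfxStatus query_vpp_surfaces(mfxSession session, mfxVideoParam *par,
                                    mfxU16 *num_in, mfxU16 *num_out)
{
    mfxFrameAllocRequest request[2];
    memset(request, 0, sizeof(request));

    mfxStatus sts = MFXVideoVPP_QueryIOSurf(session, par, request);
    if (sts < MFX_ERR_NONE)
        return sts;

    *num_in  = request[0].NumFrameSuggested;    /* input pool size */
    *num_out = request[1].NumFrameSuggested;    /* output pool size */
    return sts;
}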

mfxStatus MFXVideoVPP_Init(mfxSession session, mfxVideoParam *par)

This function allocates memory and prepares tables and necessary structures for video processing. This function also does extensive validation to verify that the configuration, as specified in the input parameters, is supported.

Return

MFX_ERR_NONE The function completed successfully.

MFX_ERR_INVALID_VIDEO_PARAM The function detected invalid video parameters. These parameters may be out of the valid range, or the combination of them resulted in incompatibility. Incompatibility not resolved.

MFX_WRN_PARTIAL_ACCELERATION The underlying hardware does not fully support the specified video parameters. The video processing may be partially accelerated. Only SDK HW implementations may return this status code.

MFX_WRN_INCOMPATIBLE_VIDEO_PARAM The function detected some video parameters were incompatible with others; incompatibility resolved.

MFX_ERR_UNDEFINED_BEHAVIOR The function is called twice without a close.

MFX_WRN_FILTER_SKIPPED The VPP skipped one or more filters requested by the application.

Parameters
  • [in] session: SDK session handle.

  • [in] par: Pointer to the mfxVideoParam structure

mfxStatus MFXVideoVPP_Reset(mfxSession session, mfxVideoParam *par)

This function stops the current video processing operation and restores internal structures or parameters for a new operation.

Return

MFX_ERR_NONE The function completed successfully.

MFX_ERR_INVALID_VIDEO_PARAM The function detected that video parameters are wrong or they conflict with initialization parameters. Reset is impossible.

MFX_ERR_INCOMPATIBLE_VIDEO_PARAM The function detected that the video parameters provided by the application are incompatible with the initialization parameters. Reset requires additional memory allocation and cannot be executed. The application should close the SDK component and then reinitialize it.

MFX_WRN_INCOMPATIBLE_VIDEO_PARAM The function detected some video parameters were incompatible with others; incompatibility resolved.

Parameters
  • [in] session: SDK session handle.

  • [in] par: Pointer to the mfxVideoParam structure

mfxStatus MFXVideoVPP_Close(mfxSession session)

This function terminates the current video processing operation and de-allocates any internal tables or structures.

Return

MFX_ERR_NONE The function completed successfully.

Parameters
  • [in] session: SDK session handle.

mfxStatus MFXVideoVPP_GetVideoParam(mfxSession session, mfxVideoParam *par)

This function retrieves current working parameters to the specified output structure. If extended buffers are to be returned, the application must allocate those extended buffers and attach them as part of the output structure.

Return

MFX_ERR_NONE The function completed successfully.

Parameters
  • [in] session: SDK session handle.

  • [in] par: Pointer to the corresponding parameter structure

mfxStatus MFXVideoVPP_GetVPPStat(mfxSession session, mfxVPPStat *stat)

This function obtains statistics collected during video processing.

Return

MFX_ERR_NONE The function completed successfully.

Parameters
  • [in] session: SDK session handle.

  • [in] stat: Pointer to the mfxVPPStat structure

mfxStatus MFXVideoVPP_RunFrameVPPAsync(mfxSession session, mfxFrameSurface1 *in, mfxFrameSurface1 *out, mfxExtVppAuxData *aux, mfxSyncPoint *syncp)

This function processes a single input frame to a single output frame. Retrieval of the auxiliary data is optional; the encoding process may use it. Video processing may not generate an output immediately for a given input. See section Video Processing Procedures for details on how to correctly send input and retrieve output. At the end of the stream, call this function with the input argument in=NULL to retrieve any remaining frames, until the function returns MFX_ERR_MORE_DATA. This function is asynchronous.

Return

MFX_ERR_NONE The output frame is ready after synchronization.

MFX_ERR_MORE_DATA Need more input frames before VPP can produce an output.

MFX_ERR_MORE_SURFACE The output frame is ready after synchronization. More output surfaces are needed to retrieve additional output frames.

MFX_ERR_DEVICE_LOST Hardware device was lost; See the Working with Microsoft* DirectX* Applications section for further information.

MFX_WRN_DEVICE_BUSY Hardware device is currently busy. Call this function again in a few milliseconds.

Parameters
  • [in] session: SDK session handle.

  • [in] in: Pointer to the input video surface structure

  • [out] out: Pointer to the output video surface structure

  • [in] aux: Optional pointer to the auxiliary data structure

  • [out] syncp: Pointer to the output sync point
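
A sketch of the submit-and-drain pattern described above; next_vpp_input, free_vpp_output, and deliver_vpp_frame are hypothetical application helpers, and the session is assumed to hold an initialized VPP component.

#include "mfxvideo.h"

/* Hypothetical application helpers (not part of the SDK). */
extern mfxFrameSurface1 *next_vpp_input(void);       /* NULL at end of stream */
extern mfxFrameSurface1 *free_vpp_output(void);
extern void deliver_vpp_frame(mfxFrameSurface1 *frame);

/* Minimal VPP loop: feed inputs, then drain with in = NULL. */
static mfxStatus run_vpp(mfxSession session)
{
    mfxStatus sts = MFX_ERR_NONE;
    mfxSyncPoint syncp = NULL;
    mfxFrameSurface1 *in = NULL;
    int draining = 0;

    for (;;) {
        if (!draining && !in) {
            in = next_vpp_input();
            if (!in)
                draining = 1;           /* NULL input starts the drain */
        }
        mfxFrameSurface1 *out = free_vpp_output();

        syncp = NULL;
        sts = MFXVideoVPP_RunFrameVPPAsync(session, in, out, NULL, &syncp);

        if (sts == MFX_WRN_DEVICE_BUSY)
            continue;                   /* resubmit the same frame shortly */

        if ((sts == MFX_ERR_NONE || sts == MFX_ERR_MORE_SURFACE) && syncp) {
            MFXVideoCORE_SyncOperation(session, syncp, MFX_INFINITE);
            deliver_vpp_frame(out);     /* one processed frame is ready */
        }
        if (sts == MFX_ERR_MORE_SURFACE)
            continue;                   /* same input, more output pending */

        in = NULL;                      /* input consumed (or NULL while draining) */

        if (sts == MFX_ERR_MORE_DATA) {
            if (draining)
                break;                  /* all cached frames drained */
            continue;
        }
        if (sts < MFX_ERR_NONE)
            return sts;                 /* real error */
    }
    return MFX_ERR_NONE;
}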