MPEG 133

MPEG 133 took place as an online conference from 2021-01-11 until 2021-01-15.

Press release of 133rd MPEG meeting: here
Request for offers to host MPEG 141: here
MPEG Roadmap after MPEG 133: here

Press release

The 133rd meeting of MPEG was held online from 2021-01-11 to 2021-01-15.

MPEG Systems File Format Subgroup wins Technology & Engineering Emmy® Award

MPEG is pleased to report that the File Format subgroup of MPEG Systems is being recognized this year by the National Academy of Television Arts and Sciences (NATAS) with a Technology & Engineering Emmy® Award for its 20 years of work on the ISO Base Media File Format (ISOBMFF). This format was first standardized in 1999 as part of the MPEG-4 Systems specification and is now in its 6th edition as ISO/IEC 14496-12. It is the structural specification underlying the widely used and supported MP4 and 3GP file formats.

“This recognition further reinforces the major contributions that JTC 1/SC 29 has made to the entertainment industry and society at large.” — Philip C. Wennblom, Chair of ISO/IEC JTC 1

“The ISO Base Media File Format has become the bedrock underlying modern media storage and delivery, spanning both local storage and streaming services for music, video, and other media.” — Gary Sullivan, Chair of ISO/IEC JTC 1/SC 29

Evolution of the format has enabled modern, expressive uses of the file format, notably including segmented streaming over HTTP as exemplified by Dynamic Adaptive Streaming over HTTP (DASH), and as the underlying structure of the High Efficiency Image Format (HEIF) specification, which is seeing rapid multi-platform adoption for the carriage of modern images (including support for bracketing, bursts, grid layout of very large visual surfaces, and client-performed adjustments before display). The File Format subgroup is also responsible for other key specifications, covering areas such as content protection (i.e., the Common Encryption and Sample Variants specifications) and the carriage of many different types of media (not only traditional audio and video, but also timed text, media-data metrics, and client-side applied visual effects and transitions). Work under way includes support for volumetric (3D) visual media, including point clouds, and for timed haptic media (such as vibration signals).

“File formats are typically taken for granted without considering the enormous engineering effort and vision required to design a file format serving an entire ecosystem with all its diverse requirements from multimedia content production to consumption.” — Jörn Ostermann, Convenor of MPEG Technical Coordination

“The ISO Base Media File Format has been adopted by a wide variety of industries and MPEG Systems has continuously enhanced the standard to reflect their needs for the last 20 years. The group will keep up the efforts to serve the industries better.” — Youngkwon Lim, Convenor of MPEG Systems

The File Format subgroup of MPEG Systems reached important milestones at this meeting, agreeing on the final text for the support of Versatile Video Coding (VVC) and Essential Video Coding (EVC) in ISO/IEC 14496-15, which specifies the carriage of NAL unit based video in the family of file formats based on the ISO Base Media File Format (ISO/IEC 14496-12). This work has allowed the specification of the adoption of those formats into the Common Media Application Format (CMAF), ISO/IEC 23000-19, to begin. The File Format subgroup also reached an agreement on the carriage of DASH events in timed metadata tracks, which will enable such media-related events to be carried through the ecosystem from authoring to the client in a robust way that supports flexible segmentation.

“Over 20 years a small, dedicated, set of experts has curated this format, respecting and developing the initial design principles, while being flexible and responsive to the needs of industry, and the result is a format that is still in active adoption and development.” — David Singer, File Format Subgroup Chair

This is MPEG’s fourth Technology & Engineering Emmy® Award (after MPEG-1 and MPEG-2 together with JPEG in 1996, Advanced Video Coding (AVC) in 2008, and MPEG-2 Transport Stream in 2013) and its sixth Emmy® Award overall, counting the Primetime Engineering Emmy® Awards for Advanced Video Coding (AVC) High Profile in 2008 and High Efficiency Video Coding (HEVC) in 2017.

Essential Video Coding (EVC) verification test finalized

At the 133rd MPEG meeting, a verification testing assessment of the Essential Video Coding (EVC) standard (ISO/IEC 23094-1; MPEG-5 Part 1) was completed. The first part of the EVC verification test, using high dynamic range (HDR) and wide color gamut (WCG) content, was completed at the 132nd MPEG meeting. A subjective quality evaluation was conducted comparing the EVC Main profile to the HEVC Main 10 profile and the EVC Baseline profile to the AVC High 10 profile. Analysis of the subjective test results showed that the average bit rate savings for the EVC Main profile are approximately 40% compared to the HEVC Main 10 profile, using UHD and HD SDR content encoded in both random access and low delay configurations. The average bit rate savings for the EVC Baseline profile compared to the AVC High 10 profile are approximately 40% using UHD SDR content encoded in the random access configuration and approximately 35% using HD SDR content encoded in the low delay configuration. Verification test results using HDR content showed average bit rate savings for the EVC Main profile of approximately 35% compared to the HEVC Main 10 profile. By providing significantly improved compression efficiency compared to HEVC and earlier video coding standards while encouraging the timely publication of licensing terms, the MPEG-5 EVC standard is expected to meet the market needs of emerging delivery protocols and networks, such as 5G, enabling the delivery of high-quality video services to an ever-growing audience.
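
To make the percentages concrete: a 40% saving means comparable subjective quality at roughly 60% of the reference bit rate. The short sketch below uses hypothetical bit rates chosen only for illustration; the numbers are not taken from the verification test report.

```python
# Hypothetical illustration of a ~40% average bit rate saving: if an HEVC Main 10
# encoding needs 10 Mbit/s for a given subjective quality, an EVC Main profile
# encoding of comparable quality would need roughly 6 Mbit/s.

def bit_rate_saving(reference_kbps: float, test_kbps: float) -> float:
    """Relative saving of the test codec versus the reference, in percent."""
    return 100.0 * (1.0 - test_kbps / reference_kbps)

hevc_kbps = 10_000  # hypothetical HEVC Main 10 bit rate at the target quality
evc_kbps = 6_000    # hypothetical EVC Main bit rate at comparable quality
print(f"EVC saving vs. HEVC: {bit_rate_saving(hevc_kbps, evc_kbps):.0f}%")  # -> 40%
```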

MPEG issues a Call for Evidence on Video Coding for Machines

MPEG Technical Requirements is studying the standardization of technologies for coding unprocessed or processed video for machine intelligence use cases. These technologies are expected to provide a compression capability that exceeds that of state-of-the-art video coding standards while fulfilling single or multiple machine intelligence tasks, and optionally to support hybrid machine/human vision at sufficient quality.

To coordinate this study, MPEG Technical Requirements re-established a Video Coding for Machines (VCM) Ad-hoc Group (AhG) to investigate use cases and requirements, test conditions, evaluation methodologies, and potential coding technologies. Evidence is sought for two specific technologies:

  • Efficient compression of processed or unprocessed video.
  • A shared feature-extraction backbone (pictured in the sketch below).
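
The second item can be pictured as extracting a compact intermediate representation once, so that compressed features rather than full decoded video feed the machine tasks. The sketch below is a deliberately toy illustration: the pooling "backbone" and the crude feature quantization are invented for this example and are not taken from the CfE or its evaluation framework.

```python
import numpy as np

# Toy sketch (hypothetical): a shared "backbone" turns a decoded frame into a
# much smaller feature map, which can be compressed and reused by several
# machine tasks instead of transmitting the full video to each of them.

rng = np.random.default_rng(0)
frame = rng.random((720, 1280, 3), dtype=np.float32)   # stand-in for a decoded frame

def backbone(x: np.ndarray) -> np.ndarray:
    """Toy feature extractor: 8x8 average pooling per channel."""
    h, w, c = x.shape
    return x[: h // 8 * 8, : w // 8 * 8].reshape(h // 8, 8, w // 8, 8, c).mean(axis=(1, 3))

features = backbone(frame)                              # shared representation
quantized = np.round(features * 255).astype(np.uint8)   # crude feature "compression"

print("frame bytes:  ", frame.nbytes)
print("feature bytes:", quantized.nbytes)               # far fewer bits to transmit
```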

MPEG Technical Requirements aims to issue multiple rounds of the Call for Evidence (CfE) to cover the full scope of VCM as defined in the draft requirements and the use cases document. The first round of the CfE focuses mainly on compression efficiency, i.e., the amount of data needed to represent the video content for the associated application.

This CfE requests information regarding video compression technology that has compression performance beyond that of Versatile Video Coding (VVC) for machine-only consumption as well as hybrid machine/human consumption. Evaluation of the submissions in response to the CfE will be performed at the 134th MPEG meeting, as further described in the CfE document “Call for Evidence for Video Coding for Machines”.

Companies and organizations that have developed compression technologies (for machine-vision or machine/human-vision consumption) with compression capability better than that of the anchors provided in the MPEG-VCM evaluation framework document are kindly invited to submit such information in response to this CfE by contacting Dr. Igor Curcio, MPEG Technical Requirements Convenor, at igor.curcio@nokia.com.

Neural Network Compression for Multimedia Applications – MPEG calls for technologies for incremental coding of neural networks

Artificial neural networks have been demonstrated for use in a broad range of tasks in multimedia analysis and processing, such as visual and acoustic classification, object and pattern recognition, extraction of multimedia descriptors, or image and video coding. The trained neural networks for these applications contain a large number of parameters (weights), resulting in a considerable amount of data needed to represent the neural network itself. For efficiently transmitting these trained networks to devices (e.g., mobile phones, smart cameras), compression of neural networks is needed. The MPEG standard for compressed representation of neural networks for multimedia content and analysis, currently under Draft International Standard (DIS) ballot, addresses these requirements and provides technologies for parameter reduction and quantization in order to compress entire neural networks.
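
As a rough picture of what the quantization family of tools targets, the sketch below applies plain uniform 8-bit quantization to an invented weight matrix; the actual NNR tools (parameter reduction, quantization, and entropy coding) are considerably more elaborate.

```python
import numpy as np

# Minimal sketch (not the NNR toolset itself): uniform 8-bit quantization of one
# layer's weight matrix, the simplest instance of the "quantization" family of
# tools the standard covers.

rng = np.random.default_rng(0)
weights = rng.normal(0.0, 0.05, size=(1024, 1024)).astype(np.float32)  # ~4 MB of parameters

scale = np.abs(weights).max() / 127.0
q = np.clip(np.round(weights / scale), -127, 127).astype(np.int8)      # 8-bit representation

print("float32 bytes:", weights.nbytes)   # 4_194_304
print("int8 bytes:   ", q.nbytes)         # 1_048_576, i.e. 4x smaller before entropy coding
print("max abs error:", float(np.abs(weights - q * scale).max()))
```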

In emerging application scenarios of neural networks, such as edge-based content processing or federated training, updates of neural networks (e.g., after training on additional data) need to be exchanged. Such updates include changes of the network parameters and may also involve structural changes of the network (e.g., when extending a classification method with a new class). In scenarios like federated training, these updates are more frequent than initial deployments of trained networks, and thus require much more bandwidth over time. However, there is evidence that these updates with respect to a base model can be compressed very efficiently.
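
The intuition behind that evidence can be shown with a toy experiment: if only a small fraction of the parameters changes, the difference to the base model is dominated by exact zeros and compresses far better than the full model. The sketch below uses an invented base model and a generic byte-level compressor purely for illustration; it is not the technology being called for.

```python
import zlib
import numpy as np

# Hypothetical sketch: an "update" that only slightly perturbs 1% of the weights
# of a base model. The delta is near-sparse and compresses much better than the
# full updated model.
rng = np.random.default_rng(1)
base = rng.normal(0.0, 0.05, size=(1024, 1024)).astype(np.float32)
update = base.copy()
idx = rng.choice(base.size, size=base.size // 100, replace=False)   # 1% of weights change
update.flat[idx] += rng.normal(0.0, 0.01, size=idx.size)

delta = update - base                                                # mostly exact zeros

print("compressed full model:", len(zlib.compress(update.tobytes(), 9)))
print("compressed delta:     ", len(zlib.compress(delta.tobytes(), 9)))  # much smaller
```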

MPEG Technical Requirements has thus issued a Call for Proposals (CfP) for compression technology for incremental coding of neural networks, addressing weight and structure updates. The compression technology will be evaluated in terms of compression efficiency and the impact on performance in two use cases. Responses to the CfP will be analyzed at the 134th MPEG meeting. For further information about the call or regarding responses to the call please contact Werner Bailer (werner.bailer@joanneum.at) and Dr. Igor Curcio (MPEG Technical Requirements Convenor, igor.curcio@nokia.com).

MPEG Systems reaches the first milestone for supporting Versatile Video Coding (VVC) and Essential Video Coding (EVC) in the Common Media Application Format (CMAF)

At the 133rd MPEG meeting, MPEG Systems promoted Amendment 2 of the Common Media Application Format (CMAF; ISO/IEC 23000-19:2019) to Committee Draft Amendment (CDAM) status. This amendment defines constraints on Versatile Video Coding (VVC) and Essential Video Coding (EVC) video elementary streams when carried in a CMAF video track. It also defines codec parameters to be used for CMAF switching sets with VVC and EVC tracks to ensure wide interoperability.

The amendment also adds support for the newly introduced MPEG-H 3D Audio profile, and it is expected to reach its final milestone (i.e., Final Draft International Standard (FDIS) status) in early 2022.

MPEG Systems continuously enhances Dynamic Adaptive Streaming over HTTP (DASH)

At the 133rd MPEG meeting, MPEG Systems promoted Part 8 of Dynamic Adaptive Streaming over HTTP (DASH), also referred to as Session-based DASH (ISO/IEC 23009-8), to its final stage of standardization (i.e., Final Draft International Standard (FDIS)).

Historically, every DASH client uses the same manifest, as this best serves the scalability of the service. However, there have been increasing requests from industry to enable customized manifests for personalized services. MPEG Systems has standardized a solution to this problem without sacrificing scalability: ISO/IEC 23009-8 Session-based DASH. It adds a mechanism to the Media Presentation Description (MPD) for referring to another document, called the Session-based Description (SBD), which carries per-session information. The DASH client can use this information (i.e., variables and their values) provided in the SBD to derive the URLs for its HTTP GET requests.
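
Conceptually, the client resolves session-specific parameters at request time, along the lines of the sketch below. The parameter names and the query-string style are illustrative assumptions, not the normative syntax of ISO/IEC 23009-8.

```python
from urllib.parse import urlencode

# Hypothetical sketch: a client combines a segment URL derived from the (shared)
# MPD with per-session key/value pairs obtained from a Session-based Description
# (SBD) before issuing the HTTP GET request.

def resolve_segment_url(segment_url: str, sbd_values: dict) -> str:
    """Append session-specific parameters to a segment URL."""
    separator = "&" if "?" in segment_url else "?"
    return segment_url + separator + urlencode(sbd_values)

mpd_segment = "https://cdn.example.com/video/seg_00042.m4s"   # same for every client
sbd_values = {"sessionid": "abc123", "abtest": "groupB"}       # differs per session
print(resolve_segment_url(mpd_segment, sbd_values))
# https://cdn.example.com/video/seg_00042.m4s?sessionid=abc123&abtest=groupB
```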

MPEG Systems reaches the first milestone for carrying event messages in tracks of the ISO Base Media File Format

At the 133rd MPEG meeting, MPEG Systems reached the first milestone in the development of a standard for carrying event messages in tracks of the ISO Base Media File Format by promoting ISO/IEC 23001-18 to Committee Draft (CD) stage.

This specification provides a method for the carriage of event messages in tracks of the ISO Base Media File Format (ISOBMFF, ISO/IEC 14496-12). The event message track format associates the timeline of DashEventMessageBox messages with the track timeline and enables all common ISOBMFF processing, such as multiplexing and de-fragmentation. In addition, multiplexing and de-multiplexing operations between the top-level DashEventMessageBox and this event message track format are defined. Carriage in the event message track format will also make this information more easily accessible to devices that can seek through ISOBMFF-formatted media files. The standard is expected to reach its final milestone (i.e., Final Draft International Standard (FDIS)) in early 2022.
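
For readers less familiar with ISOBMFF, the generic box structure that makes such event information easy to locate can be walked with a few lines of code. The sketch below only lists top-level boxes by type and size and does not decode event message payloads; it is illustrative, not tooling defined by the specification.

```python
import struct

# Minimal sketch: walk the top-level boxes of an ISOBMFF/MP4 file and report
# their four-character types and sizes. Event message track data lives inside
# ordinary boxes like these, which is what makes it seekable with generic
# ISOBMFF tooling.

def iter_top_level_boxes(path: str):
    with open(path, "rb") as f:
        while True:
            header = f.read(8)
            if len(header) < 8:
                break
            size, box_type = struct.unpack(">I4s", header)   # 32-bit size + 4CC type
            header_len = 8
            if size == 1:                                     # 64-bit "largesize" follows
                size = struct.unpack(">Q", f.read(8))[0]
                header_len = 16
            yield box_type.decode("ascii", "replace"), size
            if size == 0:                                     # box extends to end of file
                break
            f.seek(size - header_len, 1)                      # skip over the box payload

# Example usage, with any MP4/ISOBMFF file:
# for box_type, size in iter_top_level_boxes("example.mp4"):
#     print(box_type, size)
```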

Output documents published in MPEG 133

MPEG-I

# | Part | Title
3 | Versatile Video Coding | Test Model 12 for Versatile Video Coding (VTM 12)
3 | Versatile Video Coding | VVC verification test plan (draft 5)
3 | Versatile Video Coding | Working Draft 2 of ISO/IEC 23090-3 Amd.1 Operation range extensions
3 | Versatile Video Coding | Core experiment on high bit depth and high bit rate entropy coding in VVC
8 | Network based Media Processing | White paper on Network Based Media Processing
9 | Geometry-based Point Cloud Compression | G-PCC codec description
9 | Geometry-based Point Cloud Compression | EE4FE 13.47 On spherical coordinate geometry
9 | Geometry-based Point Cloud Compression | Status on addressing the G-PCC requirements
12 | Immersive Video | Test Model 8 for MPEG Immersive Video
12 | Immersive Video | Common Test Conditions for MPEG Immersive Video

MPEG-G

Title
MPEG-G Genomic Information Database

MPEG-4

# | Part | Title
22 | Open Font Format | WD of ISO/IEC 14496-22:2019 AMD 2 Extending color font functionality and other updates

MPEG-5

# | Part | Title
1 | Essential Video Coding | Report on Essential Video Coding Compression Performance Verification Testing for SDR Content

MPEG-C

# | Part | Title
7 | Versatile Supplemental Enhancement Information Messages for Coded Video Bitstreams | Working Draft 2 of ISO/IEC 23002-7 Amd.1 Additional SEI messages

MPEG-B

# | Part | Title
17 | Carriage of Uncompressed Video in ISOBMFF | Exploration of ISO/IEC 23001-17 Carriage of Uncompressed Video in ISOBMFF

MPEG-7

# | Part | Title
17 | Compression of Neural Networks for Multimedia Content Description and Analysis | Clarifications about NNR evaluation framework

Administrative

Title
Request for offers to host MPEG 141

Explorations

# | Part | Title
7 | Immersive Video | Call for MPEG-I Visual Test Materials
7 | Immersive Video | Manual of Immersive Video Depth Estimation 3
34 | Video Coding for Machines | Evaluation Framework for Video Coding for Machines
34 | Video Coding for Machines | Call for Evidence for Video Coding for Machines
34 | Video Coding for Machines | Use cases and requirements for Video Coding for Machines
34 | Video Coding for Machines | Collection of Evidences Table for Video Coding for Machines
34 | Video Coding for Machines | Common Test Conditions and Evaluation Methodology for Video Coding for Machines
36 | Neural Network-based Video Compression | Exploration experiment on neural network-based video coding technology
40 | Haptics Support | Revised Draft Call for Proposals on the Coded Representation of Haptics - Phase 1
40 | Haptics Support | Draft Submissions and Evaluation Procedures for the Haptics CfP - Phase 1
40 | Haptics Support | Draft MPEG CE Methodology for Haptics
40 | Haptics Support | Revised Draft Encoder Input Format for Haptics
41 | Enhanced compression beyond VVC capability | Exploration experiment on enhanced compression beyond VVC capability
42 | Future Capabilities for MPEG-I | Draft requirements for support of emerging immersive displays and media for immersive applications v2
42 | Future Capabilities for MPEG-I | Draft report of technologies that enable use cases for immersive applications

Management

Title
AHGs Established at 2nd WG06 Meeting
AHGs Established at 2nd WG08 Meeting
AHGs Established at 2nd AG05 Meeting

All

Title
Assets of Communication

Other documents published in MPEG 133

Type | Title
AhG | List of AHGs established at the 2nd WG 5 meeting
AhG | Ad hoc group rules for MPEG AGs and WGs
Work plan | MPEG Roadmap after MPEG 133