Deep motion compensation enhancement in video compression

A. Fiandrotti
2022-01-01

Abstract

This work introduces the multiframe motion-compensation enhancement network (MMCE-Net), a deep-learning tool aimed at improving the performance of video coding standards based on motion compensation, such as H.265/HEVC. The proposed method improves inter-prediction coding efficiency by enhancing the accuracy of the motion-compensated frame, thereby improving rate-distortion performance. MMCE-Net is a neural network that jointly exploits the predicted coding unit and two co-located coding units from previous reference frames to better estimate the temporal evolution of the scene. This letter describes the architecture of MMCE-Net, its integration into H.265/HEVC, and the resulting performance.
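The abstract describes the core idea: refine a motion-compensated coding unit (CU) by jointly filtering it with two co-located CUs from earlier reference frames. The letter itself specifies the actual network; purely as an illustration of that idea, here is a minimal numpy sketch in which the three blocks are stacked and passed through a single learned filter whose output is added as a residual correction. The function names, the single 3x3 filter, and the residual formulation are assumptions for illustration, not the published MMCE-Net design:

```python
import numpy as np

def conv3x3(x, w):
    # Zero-padded 3x3 convolution: x is (C, H, W), w is (C, 3, 3),
    # output is (H, W). A stand-in for the network's learned filtering.
    c, h, wd = x.shape
    xp = np.pad(x, ((0, 0), (1, 1), (1, 1)))
    out = np.zeros((h, wd))
    for i in range(h):
        for j in range(wd):
            out[i, j] = np.sum(xp[:, i:i + 3, j:j + 3] * w)
    return out

def enhance_cu(pred_cu, coloc_prev1, coloc_prev2, weights):
    # Sketch of the multiframe enhancement idea: stack the
    # motion-compensated CU with the two co-located CUs from
    # previous reference frames, filter the stack, and add the
    # result to the prediction as a residual correction.
    stack = np.stack([pred_cu, coloc_prev1, coloc_prev2])  # (3, H, W)
    correction = conv3x3(stack, weights)
    return pred_cu + correction
```

In a real codec integration the enhanced block would replace the standard motion-compensated prediction before the residual is computed, so any accuracy gain translates directly into fewer residual bits.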
N. Prette; D. Valsesia; T. Bianchi; E. Magli; M. Naccari; A. Fiandrotti
Electronics Letters, vol. 58, no. 11, pp. 426-428, 2022
Files in this record:
Electronics Letters - 2022 - Prette - Deep motion‐compensation enhancement in video compression.pdf (open access, publisher PDF, 702.12 kB)

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/2318/1887635
Citations
  • PMC: not available
  • Scopus: 0
  • Web of Science: 0