3 Definitions of terms, symbols and abbreviations

3.1 Definitions

For the purposes of the present document, the terms and definitions given in 3GPP TR 21.905 [1] and the following apply. A term defined in the present document takes precedence over the definition of the same term, if any, in 3GPP TR 21.905 [1].

AV Production: the process by which audio and video content are combined to produce media content. This could be for live events, media production, conferences or other professional applications.

Compressed Video: a means of reducing video file or stream sizes to suit various applications. Different applications apply different compression methodologies.

– Mezzanine compression: low-latency, low-complexity compression applied to a video signal in order to retain the maximum amount of information while reducing the stream size to fit the available bandwidth.

– Visually lossless compression: the maximum amount of compression that can be applied to a video signal before visible compression artefacts appear.

– Highly compressed: use of compression to distribute content over very low bandwidth connections where the content is more important than the quality of the image.

Communication service availability: as defined in TS 22.261 [4].

Communication service reliability: as defined in TS 22.104 [3].

End-to-end Latency: as defined in TS 22.261 [4].

Isochronous: The time characteristic of an event or signal that recurs at known, periodic time intervals.

NOTE 1: Isochronous data transmission is a form of synchronous data transmission in which data frames that are similar (logically or in size) are sent in step with a periodic clock pulse.

NOTE 2: Isochronous data transmission ensures that data between the source and the sink of the AV application flows continuously and at a steady rate.
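
As a minimal illustration of the transmission pattern described in the notes above, the following Python sketch sends frames of similar size in step with a periodic clock. The frame size, the 1 ms period and the send() placeholder are assumptions chosen for the sketch and are not requirements of this document.

    import time

    FRAME_SIZE_BYTES = 480   # assumed payload size; every frame is of similar size
    PERIOD_S = 0.001         # assumed 1 ms clock period

    def send(frame: bytes) -> None:
        # Placeholder: a real system would hand the frame to the transport here.
        pass

    def isochronous_sender(num_frames: int) -> None:
        # Frames are sent at a known, periodic interval, so data flows
        # continuously and at a steady rate between source and sink.
        next_deadline = time.monotonic()
        for _ in range(num_frames):
            send(bytes(FRAME_SIZE_BYTES))
            next_deadline += PERIOD_S
            time.sleep(max(0.0, next_deadline - time.monotonic()))

    isochronous_sender(10)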

Imaging System Latency: The time it takes to generate an image from a source, apply a certain amount of processing, transfer it to a destination and render the resulting image on a suitable display device, measured from the moment a specific event happens to the moment that same event is displayed on a screen.
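
Illustratively, this latency can be expressed as the sum of the stages named above; the component symbols are introduced here only for clarity and are not defined elsewhere in this document:

    $T_{\text{imaging}} = T_{\text{capture}} + T_{\text{processing}} + T_{\text{transfer}} + T_{\text{render}}$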

In-Ear-Monitoring (IEM): A specialist type of earphone, usually worn by a performer, in which an audio signal is fed to a wireless receiver with an attached earphone.

Media Clock: Media clocks are used to control the flow (timing and period) of audio/video data acquisition, processing and playback. Typically, media clocks are generated locally in every mobile or stationary device and are aligned to a master clock generated by an externally sourced grandmaster clock.

NOTE 3: The grandmaster clock is currently sourced from GPS but is expected to transition to 5G in the future.
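
As a minimal sketch of this relationship (the 48 kHz sampling rate, the offset value and the function name below are assumptions for illustration, not part of this specification), a device could express its locally generated media-clock time on the grandmaster timescale as follows:

    SAMPLE_RATE_HZ = 48_000  # assumed rate of the local media clock

    def media_clock_time(sample_count: int, offset_to_grandmaster_s: float) -> float:
        # Time derived from the local media clock (a running sample counter),
        # shifted by the estimated offset to the external grandmaster clock.
        local_time_s = sample_count / SAMPLE_RATE_HZ
        return local_time_s + offset_to_grandmaster_s

    # Example: 96 000 samples into the stream, 2.5 ms estimated offset.
    print(media_clock_time(96_000, 0.0025))   # 2.0025 s on the grandmaster timescale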

Mouth-to-ear Latency: The maximum end-to-end latency between the analogue input at the audio source (e.g. wireless microphone) and the analogue output at the audio sink (e.g. IEM). It includes the audio application, application interfacing and the time delay introduced by the wireless transmission path.
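
As an illustrative decomposition consistent with the components listed in this definition (the symbols are introduced here only for clarity), the mouth-to-ear latency can be written as:

    $T_{\text{mouth-to-ear}} = T_{\text{audio application}} + T_{\text{application interfacing}} + T_{\text{wireless transmission}}$

where the first two terms include contributions at both the audio source and the audio sink.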

Non-public network: as defined in TS 22.261 [4].

Survival time: as defined in TS 22.261 [4].

Uncompressed Video: Uncompressed video is digital video that either has never been compressed or was generated by decompressing previously compressed digital video.

NOTE 4: The RTP payload format for uncompressed video is described in [2].

Video, imaging and audio: The means of digital capture, transmission and storage of still and moving pictures and sound for professional use.

3.2 Symbols

For the purposes of the present document, the following symbols apply:

Tframe Time interval between consecutive audio frames at the application layer. Also used to denote the transfer interval in this document.
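
As an illustrative example (the frame size and sampling rate below are assumptions, not requirements of this document), an audio frame of 48 samples at a 48 kHz sampling rate gives:

    $T_{\text{frame}} = 48\ \text{samples} / 48\,000\ \text{samples/s} = 1\ \text{ms}$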

3.3 Abbreviations

For the purposes of the present document, the abbreviations given in 3GPP TR 21.905 [1] and the following apply. An abbreviation defined in the present document takes precedence over the definition of the same abbreviation, if any, in 3GPP TR 21.905 [1].

AV Audio-Visual

IEM In-Ear-Monitoring