AUDIO/VIDEO GLOSSARY

  • 2 piece projection – A 2 piece projection system consists of a light engine shining light onto a screen. The light engine can be placed in front of (front projection) or behind (rear projection) the screen.
  • 2:3 pulldown (also 3:2 pulldown) – Video processing that converts content shot on film (24fps) to video frame rates (30fps) by repeating fields in a 2:3 pattern. This helps reduce motion artifacts when watching film-based content on a television.
  • AES67 – Developed by the Audio Engineering Society, it is a technical standard for transmission of audio over Internet Protocol. This Layer 3 protocol is based on existing standards to interoperate between various IP-based audio networking systems such as Ravenna, Livewire, and Dante.
  • ALLM (Auto Low Latency Mode) – This feature lets a game console, PC, or other device send a signal to the display which will cause it to automatically switch to a low-latency, low-lag mode for gaming. This could benefit other uses, such as karaoke and video conferencing. When the source no longer requires this mode—for example, when switching to a movie stream—the source disables the signal and the display reverts to its previous mode.
  • AOC (Active Optical Cable) – A “hybrid” HDMI cable that transmits data over a fiber optic cable inside of the HDMI cable. Copper is still used for some things such as 5v hot-plug.
  • AVB (Audio Video Bridging) – Developed by the Institute of Electrical and Electronics Engineers (IEEE) Audio Video Bridging task group (IEEE 802.1 standards committee) to provide improved synchronization, low latency, and reliability for switched Ethernet networks in AVoIP applications.
  • AVoIP (Audio Video over Internet Protocol) – A methodology of distributing audio, video, and control signals over standard Internet Protocol networks. Employs various compression technologies to encode and decode audio/video packets between source and display.
  • Backlight – The light source of an LCD panel. Most commonly CCFL bulb(s) or LEDs.
  • Backlit – Refers to the light source of an LCD TV being behind the panel (rather than the light source being across the bottom or along the edges –see Edgelit).
  • Banding – A video artifact that appears as abrupt changes or steps between shades of the same color. A symptom of insufficient bit depth in the signal.
  • Bandwidth – The maximum rate of data transfer across a given path.
  • Brightness – (TV control) Sets the black level of the image by adding or subtracting offset from the red, green, and blue signals. Also known as Black level.
  • BT.2020 – (also rec. 2020) – standard established in 2012 that describes the parameters (resolution, frame rate, color space, etc) for UHDTV.
  • BT.601 (also rec. 601) – standard established in 1982 that describes the parameters (resolution, frame rate, color space, etc) for SDTV.
  • BT.709 (also rec. 709) – standard established in 1990 that describes the parameters (resolution, frame rate, color space, etc) for HDTV.
  • Chroma Subsampling – The practice of encoding images with less resolution for chroma information than for luma information, taking advantage of the human visual system’s lower acuity for color differences than for luminance. Typically described as a ratio of the luminance channel to the two color channels, e.g. 4:4:4, 4:2:2, 4:2:0.
  • CIE chart – Created by the International Commission on Illumination (CIE) in 1931, the chart shows the range of colors in the visible light spectrum.
  • Clipping – when a signal is overdriven which results in distortion and loss of detail. In video, clipping most commonly refers to contrast being too high and bright details being lost. In audio, a clipped signal usually sounds like distortion and can be dangerous to audio equipment.
  • CMS – (TV control) Color Management System – A set of controls in some displays that allow the calibration of individual primary and secondary colors.
  • Color points – Targets for the primary and secondary colors within the CIE chart.
  • Color saturation – (TV control) The amount of color in the image.
  • Color space/color gamut – (TV control) The range of colors that a display can produce.
  • Color Temperature (Kelvin) – The “color” of light as described by the Black Body Curve. As things heat up, they change color. A “cool” color temperature (9500K) may look blue-ish white while a “warm” color temperature (3500K) may look orange-ish white.
  • Colorimeter – A tristimulus (three-filter) device used in display calibration that uses red, green, and blue filters to emulate the response of the human eye to light and color.
  • Contrast – (TV control) How much gain is added to the red, green, and blue signal. Also known as Picture.
  • Crushing – Video calibration term that describes the loss of shadow detail when brightness/black level is set too low.
  • D65 – the standard “color” of white as defined by the CIE. The coordinates for D65 on the CIE chart are x=.313, y=.329. D65 resembles daylight.
  • Dante (Digital Audio Network through Ethernet) – Developed by Audinate, it is a system comprising software, hardware, and network protocols that delivers uncompressed, multi-channel, low-latency digital audio over a standard Ethernet network using Layer 3 IP packets. It was designed to improve upon previous audio-over-Ethernet technologies such as CobraNet and EtherSound.
  • DCI-P3 (DCI = Digital Cinema Initiatives) – A common RGB color space for digital movie projection from the American film industry. DCI-P3 was defined by the Digital Cinema Initiatives (DCI) organization and published by the Society of Motion Picture and Television Engineers (SMPTE).
  • Dolby Vision – Created by Dolby Laboratories, Dolby Vision was the second “version” of HDR. Dolby Vision uses the PQ EOTF, up to 4K resolution, BT.2020 color, up to 12 bits, and dynamic (frame-by-frame) metadata. Currently supported by Vizio, LG, Philips, and TCL. Unlike HDR10, Dolby Vision is not an open standard; it is licensed by Dolby.
  • Dynamic range (Audio) – Describes the difference between the quietest and loudest sound heard on an audio system.
  • Dynamic range (Video) – Describes the difference between the brightest brights and the darkest darks displayed on an image.
  • Edgelit – Refers to the light source of an LCD TV being around the edges of the panel (rather than the light source being directly behind the LCD panel –see Backlit).
  • EDID – Extended Display Identification Data – a type of metadata that display devices use to describe their capabilities to a video source.
  • EOTF – Electro Optical Transfer Function – the process of converting an incoming video signal to light (formally “gamma” in a display).
  • FALD – Full Array Local Dimming – Describes a style of backlight on an LCD panel. LEDs are located behind the panel and are divided into zones. Each zone’s luminance can be controlled and adjusted independently, depending on the source material.
  • Frame rate/Frames Per Second (fps) – How many still images are shown per second to simulate motion in film (24fps) and video (30fps).
  • FRL (Fixed Rate Link) – A signaling technology introduced in the HDMI 2.1 specification. FRL is necessary to achieve higher uncompressed resolutions, such as those above 4K60, as well as ultra-high-speed bandwidths up to 48Gbps. It is also required for compressed video transport, which in turn enables operation at lower data rates (for example, 4K60) and ultra-high pixel-rate video such as 10Kp120.
  • Gamma – (TV control) The relationship between the incoming video signal and the light output of the display. This relationship is non-linear. Higher Gamma values are better for darker rooms, lower Gamma values are better for brighter rooms. If Gamma is set correctly, shadows in dark movies, TV shows, games, etc should be visible regardless of room lighting.
  • Generator – A device that generates test patterns and sends them to the display for calibration.
  • Genlock (generator locking) – is a common technique where the video output of one source (or a specific reference signal from a signal generator) is used to synchronize other picture sources together. The aim in video applications is to ensure the coincidence of signals in time at a combining or switching point. When video instruments are synchronized in this way, they are said to be generator-locked, or genlocked.
  • Grayscale – The range of gray shades from black to white. When calibrating a display, the grayscale should be calibrated to D65 -see D65.
  • H.264 – one standard of video compression. Most commonly used on Blu-ray discs, iTunes, and more.
  • H.265 – one standard of video compression. Also known as High Efficiency Video Coding (HEVC). One successor to H.264. Compatible with resolutions up to 8192×4320.
  • HDCP (High-bandwidth Digital Content Protection) – Digital copy protection developed by Intel Corporation to prevent copying of digital audio and video content as it travels across connections. Types of connections that support HDCP include DisplayPort (DP), Digital Visual Interface (DVI), and High-Definition Multimedia Interface (HDMI).
  • HDMI (High-Definition Multimedia Interface) – A digital interface that passes audio, video, and communication signals down a single cable.
  • HDR – High Dynamic Range – Greater dynamic range than standard video imaging.
  • HDR10 – The first “version” of HDR. HDR10 uses the PQ EOTF, up to 4K resolution, BT.2020 color, up to 10 bits, and static metadata. Currently supported by Samsung, Sony, Hisense/Sharp, Philips, and LG. HDR10 is an open standard.
  • HDR10+ – Similar to HDR10, except HDR10+ adds dynamic (frame-by-frame) metadata, which is carried in the content and processed by the display. Supported by Samsung, Panasonic, Amazon, 20th Century Fox, and Warner Bros. HDR10+ is an open standard.
  • HEVC (High Efficiency Video Coding) – One standard of video compression. Also known as H.265. Compatible with resolutions up to 8192×4320.
  • HLG (Hybrid Log-Gamma) – An HDR format designed for broadcast TV. Backwards compatible with SDR displays.
  • Hue/tint – (TV control) The shade or type of color.
  • IEEE (SA) – The Institute of Electrical and Electronics Engineers Standards Association (IEEE-SA) is an organization within IEEE that develops global standards in a broad range of industries, including: power and energy, biomedical and health care, information technology and robotics, telecommunication and home automation, transportation, nanotechnology, information assurance, and many more.
  • Interlaced – The process of making an image by combining even and odd lines to make a solid picture.
  • Laser Phosphor – one type of light engine used in a video projector. Lasers excite a color wheel that is coated in phosphor to produce the colors that you see on the screen.
  • LCD – Liquid Crystal Display. Light shines through a liquid crystal to produce color. Does not actually emit light.
  • LED – Light Emitting Diode. A small semiconductor that emits light when current is applied. Most commonly in electronics, such as a backlight for an LCD TV.
  • Light meter – used in display calibration to measure light coming off of the screen.
  • Local dimming – The intensity of the light source of the LCD panel can adapt based on the content being shown. This allows for more dynamic range than a typical LCD display.
  • Luminance – The intensity of light emitted from a screen.
  • Macroblocking – a video artifact (distortion) in which objects or areas of a video image appear to be made up of small squares rather than proper detail and smooth edges.
  • MaxFALL (Maximum Frame-Average Light Level) – Used as a parameter in HDR10, MaxFALL is the maximum value of the frame-average for all the frames in the content (expressed as nits).
  • MaxCLL (Maximum Content Light Level) – Used as a parameter in HDR10, the MaxCLL is the luminance of the brightest pixel in the content (expressed as nits).
  • Metadata – additional data (information) that is part of the signal. In the context of HDR, metadata on the disc that tells the display how bright or dark the movie (or frame) should be.
  • Motion interpolation – (TV control) video processing that increases the video’s frame rate.
  • NIT – Metric measurement of luminance. Also known as candelas per square meter (cd/m²). 1 foot-lambert equals 3.42 nits.
  • Noise Reduction – electronically processing noise out of the image.
  • OETF – Opto Electronic Transfer Function – the process of converting light into a video signal (cameras use OETF).
  • OLED – Organic Light Emitting Diode – Thin layer of organic film that emits light when voltage is applied.
  • Primary colors – colors that can be combined to make any other color in the visible spectrum. Typically Red, Green, and Blue.
  • Progressive scan – The process of making an image by showing the entire image at once (vs. interlaced which shows even then odd lines).
  • QFT (Quick Frame Transport) – Transports each frame at a higher rate to decrease “display latency”, which is the amount of time between a frame being ready for transport in the GPU and that frame being completely displayed. This latency is the sum of the transport time through the source’s output circuits, the transport time across the interface, the processing of the video data in the display, and the painting of the screen with the new data. This overall latency affects the responsiveness of games: the time between a button being pressed and the resulting action being observed on the screen. While there are many variables in this equation, not many are adjustable from an HDMI specification perspective. QFT operates on the transport portion of this equation by reducing the time it takes to send only the active video across the cable. This results in reduced display latency and increased responsiveness.
  • Quantum Dots – very small semiconductor particles, only several nanometers in size, so small that their optical and electronic properties differ from those of larger particles. Many types of quantum dot will emit light of specific frequencies if electricity or light is applied to them, and these frequencies can be precisely tuned by changing the dots’ size, shape and material, giving rise to many applications. Commonly used as a backlight system for an LCD panel.
  • Ravenna – an open standard technology for transporting real-time audio over standard IP networks. Supporting Level 3 of the OSI reference model, it is designed to interoperate with other widely deployed and established standards such as AES67.
  • Scaling – video processing that can either decrease or increase resolution.
  • Secondary colors – The products of combining the primary colors. Yellow, cyan, magenta.
  • Sharpness/resolution – The amount of detail in the image.
  • SMPTE (Society of Motion Picture and Television Engineers) an international group comprised of engineers, technologists, and executives in the entertainment industry. This body develops and publishes over 800 Standards, Recommended Practices, and Engineering Guidelines for broadcast, filmmaking, digital cinema, audio recording, information technology (IT), and medical imaging.
  • sRGB – a color space that HP and Microsoft created cooperatively in 1996 to use on monitors, printers, and the Internet. Very similar to BT.709.
  • ST2110 – An SMPTE standard that specifies the transport, synchronization, and description of separate video, audio, and ancillary elemental streams over managed IP networks. The three streams are sent on different IP addresses and kept in sync using the Precision Time Protocol (PTP). This “essence-based” approach to transmission is well suited for studio/production workflows, where the extra step to “unpack” the three elements is eliminated.
  • Tone Mapping – is a technique used in image processing to map one set of colors to another to approximate the appearance of high-dynamic-range images on a display that has a more limited dynamic range. An example would be showing 4000 nit content on an 800 nit display.
  • Video Noise – any distortion or random pixelation in the picture.
  • VP9 – (Developed by Google) one standard of video compression. Competitor to H.265. Used by YouTube as of June 2018. VP9 is customized for video resolutions greater than 1080p (such as UHD) and also enables lossless compression. The VP9 format supports the following color spaces: Rec. 601, Rec. 709, Rec. 2020, and sRGB. VP9 supports HDR video using Hybrid Log-Gamma (HLG) and Perceptual Quantizer (PQ).
  • VRR (Variable Refresh Rate) – Lets a gaming source deliver video frames as fast as it can, which in many cases is slower than the normal static refresh rate. Graphics processors require different absolute periods to render each frame, and this time is dependent upon the complexity of the scene, the horsepower of the GPU, the resolution selected, and the frame rate. When the GPU is taxed by the other three factors and does not finish rendering the next frame by the time it needs to be displayed, the source must either repeat the current frame or display the partially rendered next frame, which causes judder and tearing. By waiting until the next frame is ready to transport it, a smoother gaming experience can be provided to the user.
  • White balance – (TV control) The mix of the red, green, and blue signals to get a shade of white. The standard is D65.
  • Wide Color Gamut (WCG) – An increase in available colors. Compared to Rec. 709, Rec. 2020 can be described as WCG.
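
The data savings behind the chroma subsampling ratios in the glossary (4:4:4, 4:2:2, 4:2:0) can be sketched with some simple arithmetic. This is an illustrative calculation assuming 8 bits (1 byte) per sample; the function name is made up for this example.

```python
# Sketch: bytes per Y'CbCr frame under common chroma subsampling schemes,
# assuming 8 bits (1 byte) per sample.

def bytes_per_frame(width, height, scheme):
    """Return bytes for one frame under a given subsampling scheme."""
    luma = width * height  # Y' (luma) is always full resolution
    if scheme == "4:4:4":
        chroma = 2 * width * height                 # Cb and Cr at full resolution
    elif scheme == "4:2:2":
        chroma = 2 * (width // 2) * height          # Cb/Cr halved horizontally
    elif scheme == "4:2:0":
        chroma = 2 * (width // 2) * (height // 2)   # Cb/Cr halved both ways
    else:
        raise ValueError(f"unknown scheme: {scheme}")
    return luma + chroma

# For a 1920x1080 frame, 4:2:0 carries exactly half the data of 4:4:4,
# which is why it dominates consumer delivery formats.
full = bytes_per_frame(1920, 1080, "4:4:4")  # 6,220,800 bytes
half = bytes_per_frame(1920, 1080, "4:2:0")  # 3,110,400 bytes
```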
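
The non-linear relationship described under Gamma and EOTF can be shown with a simple power-law curve. This is a minimal sketch assuming a normalized 0.0–1.0 signal and a plain power function; real displays use more elaborate transfer functions (e.g. PQ for HDR).

```python
# Sketch: a simple power-law EOTF mapping a normalized video signal
# (0.0-1.0) to relative light output.

def eotf(signal, gamma=2.2):
    """Convert a normalized signal level to relative light output."""
    return signal ** gamma

# A mid-gray code value (0.5) produces far less than half the light,
# illustrating why the relationship is called non-linear.
mid_gray_light = eotf(0.5)  # ~0.218
```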
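
The MaxCLL and MaxFALL definitions above reduce to two small computations over per-pixel luminance values. In this sketch, frames are plain lists of pixel luminances in nits; real implementations work on decoded image planes, and the function names are illustrative.

```python
# Sketch: computing the HDR10 MaxCLL and MaxFALL parameters from
# per-pixel luminance values (in nits).

def max_cll(frames):
    """Maximum Content Light Level: the brightest single pixel anywhere."""
    return max(max(frame) for frame in frames)

def max_fall(frames):
    """Maximum Frame-Average Light Level: the highest frame average."""
    return max(sum(frame) / len(frame) for frame in frames)

# Toy content with two 3-pixel "frames":
frames = [[100, 200, 300], [50, 4000, 150]]
# MaxCLL  = 4000 (single brightest pixel, second frame)
# MaxFALL = 1400 (average of the second frame)
```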
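
The tone mapping entry's example (4000-nit content on an 800-nit display) can be sketched with a simple Reinhard-style compression curve. Actual displays use proprietary tone-mapping curves, so this is illustrative only, and the function name is an assumption for this example.

```python
# Sketch: Reinhard-style tone mapping that compresses HDR luminance
# values into a display's more limited range.

def tone_map(nits, display_peak=800.0):
    """Map an HDR luminance value (nits) into [0, display_peak)."""
    x = nits / display_peak
    return display_peak * (x / (1.0 + x))  # asymptotically approaches peak

# A 4000-nit highlight lands below the 800-nit panel's peak instead of
# clipping, preserving some highlight detail at the cost of brightness.
mapped = tone_map(4000.0)  # ~666.7 nits
```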
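
The bandwidth figures behind entries like FRL can be approximated by multiplying out the video parameters. This sketch counts only active pixel data; it ignores blanking intervals and link-encoding overhead, so real interface rates (such as HDMI's FRL lanes) are higher than this estimate.

```python
# Sketch: rough uncompressed video bandwidth estimate in Gbps.
# Assumes 3 samples per pixel (e.g. RGB or 4:4:4 Y'CbCr).

def video_bandwidth_gbps(width, height, fps, bits_per_sample,
                         samples_per_pixel=3):
    """Estimate active-video data rate, ignoring blanking and encoding."""
    bits_per_second = width * height * fps * bits_per_sample * samples_per_pixel
    return bits_per_second / 1e9

# 4K60 at 10-bit 4:4:4 is roughly 14.9 Gbps of active video data,
# before blanking and link overhead push the required link rate higher.
rate = video_bandwidth_gbps(3840, 2160, 60, 10)
```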
