What is Video?

Video is the technology of capturing, recording, processing, storing, transmitting, and reconstructing a series of images representing moving scenes, using digital or analog electronics.

Video Description and Features

What is Video? What is its History?

Etymologically, the word video comes from the Latin verb video, vidēre, meaning "to see." The term is usually applied to the video signal, and "video" is often used as an abbreviation for its full name. Video technology was first developed for television systems. However, it has since evolved into many formats that allow consumers to record video and watch it over the Internet.

In some countries, the term also refers to recording images and sound on magnetic tape or laser disc (DVD). With the advent of the latter, however, the term is usually identified with the early recordings made on magnetic tape in the VHS and Betamax formats.

Initially, the video signal consisted of several lines grouped into frames, each of which is divided into two fields carrying the light and color information of the image. The number of lines and frames, and the way the color information is carried, depend on the television standard.

The amplitude of the signal is 1 Vpp (volt peak-to-peak), with the part of the signal carrying the image information above 0 V and the synchronizations below 0 V. The positive part reaches up to 0.7 V for the white level; black corresponds to 0 V; and synchronization pulses dip down to -0.3 V. Today, there are many different standards, especially in the computer field.
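As a rough illustration, the mapping from a digital luma value to this analog voltage range can be sketched as follows. The function name is hypothetical, and the simple full-range linear mapping ignores the reduced "studio swing" code range that real digital video standards use:

```python
def luma_to_voltage(luma_8bit: int) -> float:
    """Map an 8-bit luma value (0-255) to the analog voltage range
    described above: 0 V for black up to 0.7 V for peak white.
    (Sync pulses, not modeled here, dip to -0.3 V.)"""
    if not 0 <= luma_8bit <= 255:
        raise ValueError("luma must be in 0..255")
    return 0.7 * luma_8bit / 255

print(luma_to_voltage(0))    # 0.0  -> black level
print(luma_to_voltage(255))  # 0.7  -> white level
```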

Parts of the Analog Signal

The signal consists of luminance (brightness), chrominance, and synchronization components. Its amplitude ranges from -0.3 V at the sync tip to 0.7 V at peak white. The base signal carries the brightness together with the synchronizations, and the chrominance signal is added to it: with its own synchronization (the color burst), the color is encoded so that the chrominance is superimposed on the brightness.

The bandwidth of the brightness signal is usually on the order of 5 MHz, although it depends on the system used. Chrominance is a quadrature-modulated signal.

The carrier is called the chrominance subcarrier and sits at a frequency near the top of the band; in PAL, it is 4.43 MHz. This frequency is related to the other fundamental frequencies of the signal, in particular the field frequency, which for historical reasons is based on the frequency of the electrical supply network: 50 Hz in Europe and 60 Hz in much of the Americas.
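A minimal sketch of quadrature modulation, assuming two color-difference signals U and V and an illustrative sampling rate (the function name and parameters are hypothetical):

```python
import numpy as np

F_SC = 4.43e6        # PAL chrominance subcarrier, Hz (approximate)
SAMPLE_RATE = 20e6   # illustrative sampling rate, Hz

def quadrature_modulate(u, v, sample_rate=SAMPLE_RATE, f_sc=F_SC):
    """Quadrature-modulate two color-difference signals (U, V) onto a
    single subcarrier: amplitude carries saturation, phase carries hue."""
    t = np.arange(len(u)) / sample_rate
    return u * np.sin(2 * np.pi * f_sc * t) + v * np.cos(2 * np.pi * f_sc * t)

# Constant color: saturation = sqrt(u^2 + v^2), hue = atan2(v, u)
n = 1000
chroma = quadrature_modulate(np.full(n, 0.3), np.full(n, 0.1))
```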

Image Information

The image is made up of light and color. The light defines the black-and-white image, and this part of the signal is called luminance (brightness).

There are different standards for color coding: NTSC, SECAM, and PAL.

Synchronizations

There are three classes of synchronization: line or horizontal, field or vertical, and color.

Line synchronization indicates where each line of the image begins and ends. Each line is divided into the front porch, the synchronization pulse, and the back porch.

Vertical synchronizations indicate the beginning and end of each field. They consist of pre-equalizing pulses, synchronization pulses, post-equalizing pulses, and guard lines.

The frequency of the synchronization pulses depends on the television system. The United States uses 525 lines per frame, and Europe uses 625 lines per frame. These figures derive from the frequency of the electrical network, to which the oscillators of the receivers were originally locked.
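The line (horizontal) frequency follows directly from these figures, as this small worked example shows:

```python
# Horizontal (line) frequency = lines per frame x frames per second.
pal_line_freq  = 625 * 25            # 15625 Hz (25 fps)
ntsc_line_freq = 525 * 30000 / 1001  # ~15734.27 Hz (NTSC's 29.97 fps)

print(f"PAL line frequency:  {pal_line_freq} Hz")
print(f"NTSC line frequency: {ntsc_line_freq:.2f} Hz")
```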

Regarding color, in all standards a carrier is modulated with the color information. In NTSC and PAL, saturation is carried by amplitude modulation and hue by phase modulation; the combination is called quadrature modulation.

The PAL system rotates the carrier phase 180° on alternate lines to compensate for transmission distortions. The SECAM system transmits each of the two color components on alternate lines.

Description

The term video usually refers to a variety of formats. Reliable methods for recording and playing back information include digital formats such as DVD, QuickTime, DVC, and MPEG-4, as well as analog tapes like VHS and Betamax.

Cameras record analog signals such as PAL, SECAM, or NTSC, or digital formats such as MPEG-4 or DVD (MPEG-2); the resulting video can be recorded and transmitted on a variety of physical media, such as magnetic tape.

Video quality depends primarily on the capture and storage method used. Digital television (DTV) offers higher quality than previous television formats; it is a relatively new format and has become a standard for television.

3D video, three-dimensional digital video, appeared at the end of the 20th century. Six or eight cameras with real-time depth measurement are typically used to capture 3D sequences. The 3D format is specified in MPEG-4 Part 16 Animation Framework eXtension (AFX).

In the United Kingdom, Australia, the Netherlands, and New Zealand, the term video is often used informally to refer to both video recorders and their tapes; the intended meaning usually follows from context.

Video Features

Frame rate: the number of images per unit time, ranging from six to eight frames per second (fps) for older mechanical cameras to 120 or more frames per second for new professional cameras.

The PAL and SECAM standards specify 25 fps, and NTSC specifies 29.97 fps. Cinema is slower at 24 fps, which complicates the process of transferring film to video. The minimum frame rate needed to achieve the illusion of a moving image is about fifteen frames per second.
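One common technique for this transfer to NTSC rates is 3:2 pulldown, which spreads four film frames over ten video fields. A minimal sketch (the function name is hypothetical):

```python
def three_two_pulldown(film_frames):
    """Sketch of 3:2 pulldown: spread 24 fps film frames over ~30 fps
    interlaced video by holding film frames for 3 fields, then 2 fields,
    alternately (4 film frames -> 10 fields -> 5 video frames)."""
    fields = []
    for i, frame in enumerate(film_frames):
        repeats = 3 if i % 2 == 0 else 2
        fields.extend([frame] * repeats)
    return fields

print(three_two_pulldown(["A", "B", "C", "D"]))
# ['A', 'A', 'A', 'B', 'B', 'C', 'C', 'C', 'D', 'D']
```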

Scanning Systems

1) Interlaced

Interlacing was developed to prevent the flicker that occurs in a television image reproduced on a picture tube, due to the limited persistence of the light-emitting phosphors that make up the screen.

The 2:1 interlaced scanning characteristic of the PAL, NTSC, and SECAM television systems, and of others developed later, scans each frame of the image as two equal half-frames called fields, whose lines interleave alternately.

One of the fields contains the even lines and is called the even field; the other contains the odd lines and is called the odd field. Each begins with a vertical synchronization, and there is a half-line offset between one field and the other, so that the lines of one field fall between the lines of the other.

Interlaced scanning of a frame into two fields requires that the number of lines in the frame be odd, so that the line straddling the two fields can be divided into two halves.

Abbreviated resolution specifications usually include an "i" to indicate interlacing. For example, the PAL format is often specified as 576i50, where 576 is the number of vertical lines of resolution, "i" indicates interlacing, and 50 is the number of fields (half images) per second.
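A minimal sketch of how a frame separates into its two fields, assuming the frame is a NumPy array of scan lines (whether the "even" field holds the 0-based or 1-based even lines is a matter of convention):

```python
import numpy as np

def split_into_fields(frame):
    """Split a full frame into its two interlaced fields: one field
    from lines 0, 2, 4, ... and the other from lines 1, 3, 5, ..."""
    return frame[0::2], frame[1::2]

frame = np.arange(12).reshape(6, 2)   # toy 6-line "frame"
even_field, odd_field = split_into_fields(frame)
print(even_field.shape, odd_field.shape)  # (3, 2) (3, 2)
```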

2) Progressive

In progressive scan systems, each scan period updates all scan lines.

The development of image display systems other than the picture tube, such as TFT and plasma displays, has enabled the development of progressive scan television systems.

A procedure known as deinterlacing can be used to convert an interlaced stream (such as analog, DVD, or satellite) for processing by progressive scan devices such as TFT televisions, projectors, and plasma panels.
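The simplest form of deinterlacing just stretches each field back to full height. This hypothetical sketch shows naive "bob" deinterlacing by line repetition; real deinterlacers interpolate between lines and often compensate for motion:

```python
import numpy as np

def bob_deinterlace(field):
    """Naive 'bob' deinterlacing: double a field to full frame height
    by repeating each of its lines. Only the simplest illustration;
    practical deinterlacers interpolate and track motion."""
    return np.repeat(field, 2, axis=0)

field = np.arange(6).reshape(3, 2)   # one 3-line field
full_frame = bob_deinterlace(field)  # 6-line progressive frame
print(full_frame.shape)              # (6, 2)
```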

Video Resolution

The size of an image is measured in pixels for digital video, or in horizontal and vertical scan lines for analog video. In the digital domain (for example, DVD), standard-definition television (SDTV) is specified as 720/704/640 × 480i60 for NTSC and 768/720 × 576i50 for PAL or SECAM resolution.

In the analog domain, however, the number of active scan lines remains constant, while the horizontal resolution varies with the quality of the signal: approximately 320 pixels per line for VCR quality, 400 pixels for broadcast television, and 720 pixels for DVD.

These formats do not use square pixels; the aspect ratio is preserved by the shape of the pixels rather than by the pixel count.
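A short worked example of this relationship, assuming the common definition pixel aspect ratio = display aspect ratio / storage aspect ratio (the function name is hypothetical):

```python
from fractions import Fraction

def pixel_aspect_ratio(width, height, display_aspect):
    """Pixel aspect ratio = display aspect ratio / storage aspect ratio.
    Non-square pixels let, e.g., a 720x576 grid fill a 4:3 display."""
    return display_aspect / Fraction(width, height)

print(pixel_aspect_ratio(720, 576, Fraction(4, 3)))  # 16/15 (PAL SD)
print(pixel_aspect_ratio(720, 480, Fraction(4, 3)))  # 8/9  (NTSC SD)
```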

New high-definition televisions (HDTVs) are capable of resolutions up to 1920×1080p60, i.e., 1920 pixels per line over 1080 lines, at 60 frames per second.

3D video resolution is measured in voxels. For example, a resolution of 512×512×512 voxels is used for simple 3D video, which can now be displayed even on some PDAs.

Aspect Ratio

The aspect ratio is the ratio of the width of the screen to its height. Until the standardization of high-definition television, the standard format had an aspect ratio of 4/3; the accepted HD format is 16/9. Compatibility between the two aspect ratios can be achieved in a number of ways.

A 4/3 image to be displayed on a 16/9 screen can be presented in three different ways:

  • With vertical black bars on both sides. This maintains the 4/3 ratio without losing any part of the active image area.
  • Enlarging the image until it fills the entire screen horizontally; part of the image is lost above and below the picture.
  • Deforming the image to adapt it to the screen format. The entire screen is used and the whole image is seen, but the geometry is changed.

A 16/9 image to be displayed on a 4/3 screen can be presented in three similar ways (see the sketch after this list):

  • With horizontal bars above and below the image. The entire image is visible, but part of the screen area goes unused.
  • Enlarging the image to fill the entire screen vertically, losing the side portions of the image.
  • Deforming the image to adapt it to the aspect ratio of the screen. The whole image appears on the entire screen, but the geometry is changed.
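The bar thickness in the first presentation of each list can be computed by scaling the image to fit the screen while keeping its aspect ratio. A minimal sketch with a hypothetical function name:

```python
def fit_with_bars(src_w, src_h, dst_w, dst_h):
    """Scale a source image to fit inside a destination screen while
    keeping its aspect ratio; return the scaled size and the bar
    thickness on each side (vertical bars if x > 0, horizontal if y > 0)."""
    scale = min(dst_w / src_w, dst_h / src_h)
    new_w, new_h = round(src_w * scale), round(src_h * scale)
    return new_w, new_h, (dst_w - new_w) // 2, (dst_h - new_h) // 2

# 4/3 image on a 16/9 screen -> vertical black bars on the sides
print(fit_with_bars(1440, 1080, 1920, 1080))  # (1440, 1080, 240, 0)
# 16/9 image on a 4/3 screen -> horizontal bars above and below
print(fit_with_bars(1920, 1080, 1440, 1080))  # (1440, 810, 0, 135)
```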

Color Space and Number of Bits Per Pixel

The color model name describes the video color representation. The YIQ system was used in NTSC television. It is very similar to the YUV system used in PAL television and the YDbDr system used in SECAM television.

The number of different colors that a pixel can represent depends on the number of bits per pixel (bpp). One way to reduce the number of bits per pixel in digital video is chroma subsampling.
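A minimal sketch of both ideas, using Rec. 601-style luma weights and 4:2:0 subsampling as illustrative assumptions (the function names are hypothetical):

```python
import numpy as np

def rgb_to_yuv(rgb):
    """Convert RGB (floats in 0..1) to Y'UV using Rec. 601-style luma
    weights. Exact coefficients vary by standard."""
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    y = 0.299 * r + 0.587 * g + 0.114 * b
    u = 0.492 * (b - y)
    v = 0.877 * (r - y)
    return y, u, v

def subsample_420(channel):
    """4:2:0 chroma subsampling: keep one chroma sample per 2x2 block,
    quartering the chroma data while leaving luma at full resolution."""
    return channel[0::2, 0::2]

img = np.random.rand(8, 8, 3)
y, u, v = rgb_to_yuv(img)
u420, v420 = subsample_420(u), subsample_420(v)
print(y.size, u420.size)  # 64 16
```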

Quality

Video quality can be measured objectively, with formal metrics such as PSNR (peak signal-to-noise ratio), or subjectively, using expert observation.
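PSNR is straightforward to compute from the mean squared error between a reference frame and a distorted frame. A minimal sketch (the function name is hypothetical):

```python
import numpy as np

def psnr(reference, distorted, max_value=255.0):
    """Peak signal-to-noise ratio between a reference frame and a
    distorted frame, in dB. Higher generally means less distortion."""
    mse = np.mean((reference.astype(float) - distorted.astype(float)) ** 2)
    if mse == 0:
        return float("inf")  # identical frames
    return 10 * np.log10(max_value ** 2 / mse)

ref = np.random.randint(0, 256, (480, 720), dtype=np.uint8)
noisy = np.clip(ref + np.random.normal(0, 5, ref.shape), 0, 255)
print(f"{psnr(ref, noisy):.1f} dB")
```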

The subjective video quality of a video processing system can be evaluated as follows:

  • Select the video sequences (SRC) to be used in the test.
  • Select the system settings to be evaluated (HRC).
  • Choose a test method for presenting the video clips to the experts and collecting their ratings.
  • Invite a sufficient number of experts, preferably no fewer than 15.
  • Take the tests.
  • Calculate the average rating for each HRC from the experts' assessments, as sketched below.
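A minimal sketch of that final averaging step, with hypothetical HRC labels and an assumed 1-5 rating scale:

```python
def mean_opinion_score(ratings_per_hrc):
    """Average each HRC's expert ratings into a mean opinion score (MOS).
    The keys and the 1-5 scale here are illustrative assumptions."""
    return {hrc: sum(r) / len(r) for hrc, r in ratings_per_hrc.items()}

print(mean_opinion_score({"hrc_A": [4, 5, 4, 3], "hrc_B": [2, 3, 2, 2]}))
# {'hrc_A': 4.0, 'hrc_B': 2.25}
```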

Many subjective video quality methods are described in the ITU-R BT.500 recommendation. One of the standardized methods is DSIS (Double Stimulus Impairment Scale).

In this method, each expert first views an unimpaired reference video and then an impaired version of the same video. The expert then rates the impaired video on a scale ranging from "imperceptible" to "very annoying."

Compression Method

A variety of methods are used to compress video sequences. Video data contains temporal, spatial, and spectral redundancy.

In general, spatial redundancy is reduced by recording the differences between parts of the same image (frame); this task is known as intraframe compression and is closely related to image compression.

Similarly, temporal redundancy can be reduced by recording only the differences between successive images (frames); this task is known as interframe compression and involves motion compensation and other techniques. It is used in standards such as MPEG-2 for satellite systems and MPEG-4 for home systems.
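A toy illustration of the interframe idea, storing only the difference from the previous frame (function names are hypothetical; real codecs add motion compensation, transforms, and entropy coding on top of this):

```python
import numpy as np

def frame_difference(previous, current):
    """Toy interframe step: store only the per-pixel difference
    (residual) between the current frame and the previous one."""
    return current.astype(np.int16) - previous.astype(np.int16)

def reconstruct(previous, residual):
    """Rebuild the current frame from the previous frame plus residual."""
    return (previous.astype(np.int16) + residual).astype(np.uint8)

prev = np.random.randint(0, 256, (4, 4), dtype=np.uint8)
curr = prev.copy()
curr[0, 0] = 200                      # small change between frames
residual = frame_difference(prev, curr)
assert np.array_equal(reconstruct(prev, residual), curr)
```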

Bit Rate

Bit rate is a measure of the rate of information contained in a stream or flow. It is measured in bits per second (bit/s or bps) or megabits per second (Mbit/s or Mbps).

A higher bit rate generally provides better quality. For example, a VideoCD, with a bit rate of about 1 Mbps, is lower quality than a DVD, with a bit rate of about 5 Mbps. VBR (Variable Bit Rate) is a strategy used to maximize the visual quality of the video while minimizing the bit rate.

In fast-moving scenes, a variable bit rate uses more bits than in slow-moving scenes of similar duration, but it achieves consistent visual quality. For real-time, bufferless streaming over constant-bandwidth links, CBR (Constant Bit Rate) should be used instead.
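A quick worked example of how bit rate translates into stream size, assuming a constant bit rate (the function name is hypothetical and the rates are illustrative):

```python
def stream_size_mb(bitrate_mbps, duration_s):
    """Approximate stream size in megabytes for a constant bit rate:
    size = bit rate x duration / 8 bits per byte."""
    return bitrate_mbps * duration_s / 8

# Two hours of video at VideoCD-like vs. DVD-like rates
print(stream_size_mb(1, 2 * 3600))  # 900.0 MB
print(stream_size_mb(5, 2 * 3600))  # 4500.0 MB
```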

Stereoscopic

Stereoscopic video requires two channels (a right channel for the right eye and a left channel for the left eye) or two color-coded overlay layers.

This left-and-right layer technique is occasionally used in network broadcasts or in recent "anaglyph" releases of 3D movies on DVD. Red/cyan plastic glasses filter the overlaid images so that each eye sees only one of them, creating a stereoscopic view of the content.
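A minimal sketch of how a red/cyan anaglyph can be assembled from a stereo pair, taking the red channel from the left image and the green and blue channels from the right (the function name is hypothetical):

```python
import numpy as np

def red_cyan_anaglyph(left_rgb, right_rgb):
    """Combine a left/right stereo pair into one red/cyan anaglyph:
    the left image supplies the red channel, the right image the
    green and blue (cyan) channels."""
    out = np.empty_like(left_rgb)
    out[..., 0] = left_rgb[..., 0]     # red from the left eye
    out[..., 1:] = right_rgb[..., 1:]  # green + blue from the right eye
    return out

left = np.random.randint(0, 256, (480, 640, 3), dtype=np.uint8)
right = np.random.randint(0, 256, (480, 640, 3), dtype=np.uint8)
anaglyph = red_cyan_anaglyph(left, right)
```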

New HD DVD and Blu-ray discs will significantly enhance the 3D effect in color-coded stereo programs. The first commercially available HD players were expected to be introduced at the NAB Show in Las Vegas in April 2006.
