Digital TV Image Format
The image format of digital TV is the collective term for the number of effective pixels in the horizontal and vertical directions of the image together with the scanning method used.
- Name
- Digital TV image format (digital television image format)
- Influencing factors
- Image resolution, scanning method, frame rate, luma-chroma sampling ratio, etc.
Digital TV image format definition
- The image format of digital TV involves the image resolution, the scanning method, the frame rate, and the luma-chroma sampling ratio. When digital television replaced analog television, and especially while the high-definition television standard was being determined, the image format was the subject of much controversy. There were two main issues: progressive versus interlaced scanning, and the choice of frame rate.
Digital TV image format scanning method
Digital TV image format interlaced
- Advantage
- Interlaced scanning solves the problems of channel bandwidth and flicker. Analog TVs use interlaced scanning because a 6 MHz / 8 MHz channel cannot carry 50 complete images per second; at most 25 frames per second can be transmitted. However, 25 frames per second produces severe large-area flicker. With interlaced scanning there are 50 fields per second, that is, 50 pictures are displayed. Although each picture is only half an image, large-area flicker improves considerably. Interlaced scanning thus solves the two problems of channel bandwidth and flicker at one stroke.
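The field-splitting step described above can be sketched in a few lines of Python (a minimal illustration, not part of any TV standard): each frame is divided into an odd-line field and an even-line field, which are transmitted one after the other.

```python
# A minimal sketch of interlaced transmission: each frame is split into an
# odd-line field and an even-line field. Sending the fields alternately
# doubles the refresh rate (50 fields/s from 25 frames/s) without needing
# extra channel bandwidth.

def split_into_fields(frame):
    """frame: list of scan lines, top to bottom."""
    odd_field = frame[0::2]   # lines 1, 3, 5, ...
    even_field = frame[1::2]  # lines 2, 4, 6, ...
    return odd_field, even_field

frame = [f"line{i}" for i in range(1, 7)]
odd, even = split_into_fields(frame)
print(odd)   # ['line1', 'line3', 'line5']
print(even)  # ['line2', 'line4', 'line6']
```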
- Disadvantage
- Interlacing does not eliminate interline flicker. Because each line on the screen is still refreshed only 25 times per second during interlaced scanning, interline flicker remains. When two adjacent lines in a frame differ (these two lines are transmitted and displayed one field apart), the flicker is obvious. In other words, whenever the vertical resolution is good, there will be flicker between lines. That is why computer monitors do not use interlaced scanning.
- The vertical resolution of interlaced scanning is not high. Interlacing was once seen as a way to double the vertical resolution, because two consecutive fields are offset by half a line. But this holds only when the two fields are taken from the same still frame. If there is motion, or the two fields are fused by the eye, the system does not work as expected. Studies have found that at normal brightness the resolution improves by only about 10%, not 100%. If the camera moves vertically during shooting and each field shifts by one line, half of the scan lines are effectively lost.
- The edges of moving objects are blurred in interlaced images.
Digital TV image format progressive scan
- Progressive scanning is the direction of development. Interlaced scanning is a technology of the analog television era that met analog television's needs. In the age of digital TV, should this scanning technique still be used, or should progressive scanning replace it? This question caused much controversy in the United States. At the end of 1995, the FCC was preparing to adopt ATSC as the digital television standard, but many well-known American computer companies strongly opposed interlaced scanning and demanded progressive scanning, arguing that digital compression improves the utilization of channel bandwidth and makes progressive scanning feasible. After a full year of debate, the result was a compromise: the standard was written to allow both formats.
- New large-screen flat-panel display devices, such as liquid crystal and plasma displays, must use progressive scanning. These devices have an important characteristic, namely sample-and-hold: they usually work at 50/60 Hz, and a pixel's brightness stays constant within a frame, so there is no flicker problem. For the same reason, flat-panel displays cannot show interlaced material directly; otherwise the two fields are displayed together, forming an interlaced "feather" pattern. If the object moves fast enough between the two fields, two separate images appear.
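The "feather" pattern can be illustrated with a small sketch (hypothetical data): naively weaving two fields that were captured at different moments interleaves two positions of a moving object on alternating lines.

```python
# Sketch of the "feather"/combing artifact: a flat panel that naively weaves
# two fields into one frame mixes two different moments in time. If the
# object moved between the fields, the odd and even lines disagree.

def weave(odd_field, even_field):
    """Interleave two fields line by line into one full frame."""
    frame = []
    for o, e in zip(odd_field, even_field):
        frame += [o, e]
    return frame

# Object at the left in the first field, at the right in the second field.
odd = ["X..", "X..", "X.."]
even = ["..X", "..X", "..X"]
print(weave(odd, even))
# ['X..', '..X', 'X..', '..X', 'X..', '..X']  <- comb/feather pattern
```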
Selection of image format frame rate for digital TV
Digital TV image format frame rate and flicker
- Flicker is also closely related to brightness. Figure 1 shows the results of psychological experiments: as the brightness B of the reproduced image increases, the repetition frequency of the image must be increased to prevent flicker.
- Figure 1 Psychological experimental results of flicker and brightness
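The trend in Figure 1 is commonly modeled by the Ferry-Porter law, under which the critical flicker frequency rises roughly linearly with the logarithm of luminance. A sketch with illustrative constants (the values of `a` and `b` below are assumptions, not measured data):

```python
import math

# Ferry-Porter law: the critical flicker frequency (CFF), above which
# flicker is no longer perceived, grows roughly linearly with the logarithm
# of luminance L. Constants a and b are illustrative only.
def critical_flicker_frequency(luminance, a=12.0, b=35.0):
    return a * math.log10(luminance) + b

# Brighter pictures demand a higher repetition frequency:
for L in (1, 10, 100):
    print(L, critical_flicker_frequency(L))
```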
Digital TV image format: HDTV systems should choose a higher field frequency
- In selecting the system's field (frame) frequency, the flicker problem must be considered. The field frequency of the PAL and SECAM systems is 50 Hz; for NTSC it is 60 Hz in Japan and 59.94 Hz in the US (so that the video carrier and the sound carrier are separated by 4.5 MHz). Of the two, NTSC is better at reducing flicker. Therefore, a higher field frequency should be used when designing an HDTV system. In addition, because HDTV fills a wide field of view, flicker is easier to perceive.
- The receiver uses frequency doubling technology to effectively solve the flicker problem.
Choice of image format aspect ratio for digital TV
- There are only two television image aspect ratios: 4:3 and 16:9. HDTV chose the latter to enhance the sense of presence.
Choice of Luma-Chroma Sampling Ratio of Digital TV Picture Format
- The 4:2:0 format is used for delivery to audiences, while 4:2:2 is commonly used in studios and between TV stations.
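The difference between the two sampling ratios can be sketched as simple decimation of a chroma plane (a simplification: real systems filter before subsampling, and the sample values below are hypothetical):

```python
# Minimal sketch of chroma subsampling. Relative to the luma grid,
# 4:2:2 halves the chroma resolution horizontally; 4:2:0 halves it both
# horizontally and vertically, so viewers receive one quarter of the
# chroma samples.

def subsample_chroma(chroma, mode):
    """chroma: 2-D list (rows of chroma samples). Returns decimated plane."""
    if mode == "4:2:2":
        return [row[0::2] for row in chroma]          # drop every other column
    if mode == "4:2:0":
        return [row[0::2] for row in chroma[0::2]]    # drop columns and rows
    raise ValueError(f"unknown mode: {mode}")

chroma = [[1, 2, 3, 4], [5, 6, 7, 8], [9, 10, 11, 12], [13, 14, 15, 16]]
print(subsample_chroma(chroma, "4:2:2"))  # [[1, 3], [5, 7], [9, 11], [13, 15]]
print(subsample_chroma(chroma, "4:2:0"))  # [[1, 3], [9, 11]]
```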
Scan Formats and Resolutions of the ATSC Digital TV Image Format
- China's digital TV image format has not yet been announced, so we introduce the subject using the standard set by ATSC as an example.
- The HDTV image resolution is 1080 lines per frame with 1920 pixels per line, the screen aspect ratio is 16:9, and the pixels are square. Square pixels facilitate interoperability between new video standards and other image and information systems, including computers.
- Increasing the frame rate makes motion in the TV signal smoother: the more frames per second, the more realistic the depiction of motion. ATSC specifies three frame rates: 24, 30 and 60 frames per second.
- 24P is for film. When a movie is transmitted as 60 frames or 60 fields of video per second, 3:2 pulldown is used, which makes 24 frames per second equivalent to 60 fields (or frames) per second. With interlaced scanning, 2 video fields are taken from the first film frame, 3 from the second, 2 from the third, and so on. The same method is used with progressive scanning to send 60 frames: 2 video frames are taken from the first film frame, 3 from the second, 2 from the third, and so on. This approach makes motion look slightly uneven, but American viewers are used to it. During transmission, the ATSC encoder can recognize a duplicated frame and send a "repeat one frame" signal to the decoder, so the intra-frame information need not be sent again; on receiving this signal, the ATSC decoder outputs the same frame from its frame memory to the television's display unit.
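The 2-3 cadence described above can be sketched as follows (a minimal illustration of 3:2 pulldown, not the ATSC encoder's actual implementation):

```python
# Sketch of 3:2 pulldown: 24 film frames/s become 60 video fields/s by
# alternately taking 2 fields from one film frame and 3 from the next
# (the 2-3-2-3... cadence), i.e. 2.5 fields per frame on average.

def pulldown_32(film_frames):
    """Map a list of film frames to the sequence of video fields sent."""
    fields = []
    for i, frame in enumerate(film_frames):
        copies = 2 if i % 2 == 0 else 3   # 2 fields, then 3, alternating
        fields += [frame] * copies
    return fields

film = ["A", "B", "C", "D"]               # 4 consecutive film frames
fields = pulldown_32(film)
print(fields)       # ['A', 'A', 'B', 'B', 'B', 'C', 'C', 'D', 'D', 'D']
print(len(fields))  # 10 fields from 4 frames -> 60 fields from 24 frames
```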
- Increasing the number of lines per frame improves the clarity of the image. Current NTSC provides 525 lines, while the HDTV standard provides 720 lines (progressive) or 1080 lines (interlaced). Different aspect ratios give viewers different fields of view: NTSC is 4:3, while the HDTV aspect ratio is 16:9, the same as 35 mm film.
- Table 1 lists the SDTV and HDTV image formats used by ATSC, where I denotes interlaced scanning and P denotes progressive scanning.
| Vertical lines | Horizontal pixels | Aspect ratio | Frame rate and scan |
| --- | --- | --- | --- |
| 1080 | 1920 | 16:9 | 60I, 30P, 24P |
| 720 | 1280 | 16:9 | 60P, 30P, 24P |
| 480 | 704 | 16:9 or 4:3 | 60P, 60I, 30P, 24P |
| 480 | 640 | 4:3 | 60P, 60I, 30P, 24P |