HD Camera Pro: How to Minimize Picture Flicker Through Interlacing

Interlacing is a technique that reduces the flicker of a video signal without increasing its bandwidth. The first televisions read video signals with progressive scanning, drawing every line of the picture in order, and displayed the result on cathode ray tube screens. As those screens became brighter, viewers began to notice the picture flicker between refreshes.

What interlacing did was break each video frame into two fields, one holding the odd-numbered lines and one holding the even-numbered lines, which were drawn alternately instead of all at once. Because the screen refreshed twice as often with the same amount of data, the flicker problem was solved, and the approach worked well until the 1970s, when computer monitors grew more prominent. Monitors reintroduced progressive scanning, since improved display technology could by then keep up with it.
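The split described above is easy to picture in code. Here is a minimal sketch (not from the article; the function name and toy frame are illustrative) that divides a full frame into its two interlaced fields using numpy:

```python
import numpy as np

def split_into_fields(frame):
    """Split a full frame into its two interlaced fields.

    Even-numbered rows (0, 2, 4, ...) form one field and
    odd-numbered rows form the other; an interlaced display
    draws the two fields alternately, so each refresh carries
    only half the lines of the full picture.
    """
    top_field = frame[0::2]     # even lines
    bottom_field = frame[1::2]  # odd lines
    return top_field, bottom_field

# A toy 6-line "frame" where every pixel in row i holds the value i.
frame = np.tile(np.arange(6).reshape(6, 1), (1, 4))
top, bottom = split_into_fields(frame)
print(top[:, 0])     # lines 0, 2, 4
print(bottom[:, 0])  # lines 1, 3, 5
```

Drawing `top` and `bottom` on alternate refreshes is exactly what an interlaced CRT does: the viewer's eye fuses the two fields into one full-resolution image.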

Things have changed a great deal over the last forty years. HD is replacing standard definition, and the majority of today's flat-screen TVs use progressive scanning to display images. In the terms 1080p and 720p, the "p" stands for progressive scanning, while the "i" in 1080i stands for interlacing. Although interlaced video still exists, progressive scanning will eventually replace it entirely.

In fact, video that was shot using the interlacing method needs to be de-interlaced in order to play smoothly on progressive-scan monitors; otherwise the two fields, captured a fraction of a second apart, show up as visible "combing" lines on anything that moves.
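The simplest form of de-interlacing is line doubling ("bob"): rebuild a full-height frame from a single field by repeating each of its lines. This is a rough sketch under that assumption (real de-interlacers interpolate between lines or blend both fields, and the function name here is illustrative):

```python
import numpy as np

def bob_deinterlace(field):
    """Rebuild a full-height frame from one field by doubling each line.

    Because the output uses only one field, the two time-shifted
    fields are never mixed, so no combing appears on motion; the
    trade-off is halved vertical resolution.
    """
    return np.repeat(field, 2, axis=0)

# A 3-line field, e.g. the odd lines of a 6-line interlaced frame.
field = np.array([[1, 1], [3, 3], [5, 5]])
frame = bob_deinterlace(field)
print(frame.shape)  # (6, 2): full height restored
```

Line doubling trades sharpness for smooth motion; smarter algorithms recover more detail by interpolating the missing lines from neighboring ones.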
