===Interlace and computers===

In the 1970s, computers and home video game systems began using TV sets as display devices. At that point, a 480-line [[NTSC]] signal was well beyond the graphics abilities of low-cost computers, so these systems used a simplified video signal in which each video field scanned directly on top of the previous one, rather than offset to interleave with the lines of the previous field, along with relatively low horizontal pixel counts. This marked the return of [[progressive scan]]ning, not seen since the 1920s. Since each field became a complete frame on its own, modern terminology would call this [[240p]] on NTSC sets and [[288p]] on [[PAL]]. While consumer devices were permitted to create such signals, broadcast regulations prohibited TV stations from transmitting video like this. Computer monitor standards such as the TTL-RGB modes of the [[Color Graphics Adapter|CGA]] and the [[BBC Micro]] were further simplifications of NTSC, which improved picture quality by omitting color modulation and allowing a more direct connection between the computer's graphics system and the CRT.

By the mid-1980s, computers had outgrown these video systems and needed better displays. Most home and basic office computers suffered from the use of the old scanning method, with the highest display resolution being around 640x200 (or sometimes 640x256 in 625-line/50 Hz regions). This resulted in a severely distorted, tall and narrow [[pixel]] shape, making it difficult to display high-resolution text alongside realistically proportioned images (logical "square pixel" modes were possible, but only at low resolutions of 320x200 or less). Solutions from various companies varied widely. Because PC monitor signals did not need to be broadcast, they could consume far more than the 6, 7 and 8 [[Hertz|MHz]] of bandwidth that NTSC and PAL signals were confined to. IBM's [[Monochrome Display Adapter]] and [[Enhanced Graphics Adapter]], as well as the [[Hercules Graphics Card]] and the original [[Apple Macintosh|Macintosh]] computer, generated video signals of 342 to 350p at 50 to 60 Hz with approximately 16 MHz of bandwidth. Some enhanced [[PC clone]]s such as the [[AT&T 6300]] (aka [[Olivetti M24]]), as well as computers made for the Japanese home market, managed 400p instead at around 24 MHz, and the [[Atari ST]] pushed that to 71 Hz with 32 MHz of bandwidth. All of these required dedicated high-frequency (and usually single-mode, i.e. not "video"-compatible) monitors due to their increased line rates. The [[Amiga|Commodore Amiga]] instead created a true interlaced 480i60/576i50 [[RGB color model|RGB]] signal at broadcast video rates (and with a 7 or 14 MHz bandwidth), suitable for NTSC/PAL encoding (where it was smoothly decimated to 3.5~4.5 MHz). This ability (plus built-in [[genlocking]]) resulted in the Amiga dominating the video production field until the mid-1990s, but the interlaced display mode caused flicker problems for more traditional PC applications where single-pixel detail was required; "flicker-fixer" scan-doubler peripherals plus high-frequency RGB monitors (or Commodore's own specialist scan-conversion A2024 monitor) were popular, if expensive, purchases amongst power users.
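The difference between a normal interlaced raster and the simplified signal described above can be sketched in a few lines of code. The following Python fragment is an illustrative sketch only, not taken from any hardware documentation: the 480-line visible count and the function names are assumptions made for the example. It shows which CRT rows a single field lights up under standard interlacing versus under the "240p"-style signal, where every field lands on the same rows as the one before it.

<syntaxhighlight lang="python">
# Illustrative sketch: which visible CRT rows one field lights up under
# standard NTSC interlacing versus the simplified "240p"-style signal.
# The 480-line count and function names are assumptions for this example,
# not values taken from any particular hardware specification.

VISIBLE_LINES = 480  # visible rows of a full interlaced NTSC frame

def interlaced_field_rows(field_index):
    """Standard interlace: even fields draw the even rows, odd fields draw
    the odd rows, so two consecutive fields interleave into one frame."""
    first_row = 0 if field_index % 2 == 0 else 1
    return list(range(first_row, VISIBLE_LINES, 2))

def simplified_240p_field_rows(field_index):
    """Simplified signal used by early computers and consoles: every field
    scans directly on top of the previous one, so each field is already a
    complete 240-line picture ("240p"). field_index is deliberately unused."""
    return list(range(0, VISIBLE_LINES, 2))

print(interlaced_field_rows(0)[:5])       # [0, 2, 4, 6, 8]
print(interlaced_field_rows(1)[:5])       # [1, 3, 5, 7, 9]  (offset by one row)
print(simplified_240p_field_rows(0)[:5])  # [0, 2, 4, 6, 8]
print(simplified_240p_field_rows(1)[:5])  # [0, 2, 4, 6, 8]  (no offset)
</syntaxhighlight>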
1987 saw the introduction of [[Video Graphics Array|VGA]], on which PCs soon standardized, as well as Apple's [[Macintosh II]] range, which offered displays of similar and then superior resolution and color depth. Rivalry between the two standards (and later PC quasi-standards such as XGA and SVGA) rapidly pushed up the quality of display available to both professional and home users.

In the late 1980s and early 1990s, monitor and graphics card manufacturers introduced newer high-resolution standards that once again included interlace. These monitors ran at higher scanning frequencies, typically allowing a 75 to 90 Hz field rate (i.e. a 37.5 to 45 Hz frame rate), and tended to use longer-persistence phosphors in their CRTs, all of which was intended to alleviate flicker and shimmer problems. Such monitors proved generally unpopular outside of specialist ultra-high-resolution applications such as [[Computer Aided Design|CAD]] and [[Desktop Publishing|DTP]], which demanded as many pixels as possible and for which interlace was a necessary evil, preferable to the progressive-scan equivalents then attainable. Whilst flicker was often not immediately obvious on these displays, eyestrain and lack of focus nevertheless became a serious problem, and the trade-off for a longer afterglow was reduced brightness and poor response to moving images, which left visible and often off-colored trails behind. These trails were a minor annoyance on monochrome displays and on the generally slower-updating screens used for design or database-query purposes, but were much more troublesome on color displays, with the faster motion inherent in the increasingly popular window-based operating systems, the full-screen scrolling of WYSIWYG word-processors and spreadsheets, and of course high-action games. Additionally, the regular, thin horizontal lines common to early GUIs, combined with low color depth that meant window elements were generally high-contrast (indeed, frequently stark black-and-white), made shimmer even more obvious than in lower field-rate video applications. Barely a decade after the first ultra-high-resolution interlaced upgrades appeared for the IBM PC, rapid technological advancement made it practical and affordable to provide pixel clocks and horizontal scan rates high enough for high-resolution progressive-scan modes, first in professional and then in consumer-grade displays, and the practice was soon abandoned. For the rest of the 1990s, monitors and graphics cards instead made great play of their highest stated resolutions being "non-interlaced", even where the overall frame rate was barely any higher than it had been for the interlaced modes (e.g. SVGA at 56p versus 43i to 47i), and usually included a top mode that technically exceeded the CRT's actual resolution (its number of color-phosphor triads), meaning there was no additional image clarity to be gained through interlacing or increasing the signal bandwidth still further.
This experience is why the PC industry today remains opposed to interlace in HDTV: it lobbied for the 720p standard and continues to push for the adoption of 1080p (at 60 Hz for NTSC-legacy countries and 50 Hz for PAL). However, 1080i remains the most common HD broadcast resolution, if only for reasons of backward compatibility with older HDTV hardware that cannot support 1080p (and sometimes not even 720p) without the addition of an external scaler, similar to how and why most SD-focussed digital broadcasting still relies on the otherwise obsolete [[MPEG2]] standard embedded into e.g. [[DVB-T]].