Video Graphics Array (VGA) is a video display controller and accompanying de facto graphics standard, first introduced with the IBM PS/2 line of computers in 1987, which became ubiquitous in the PC industry within three years. The term can now refer to the computer display standard, the 15-pin D-subminiature VGA connector, or the 640×480 resolution characteristic of the VGA hardware.
VGA was the last IBM graphics standard to which the majority of PC clone manufacturers conformed, making it the lowest common denominator that virtually all post-1990 PC graphics hardware can be expected to implement.
IBM intended to supersede VGA with the Extended Graphics Array (XGA) standard, but failed. Instead, VGA was adapted into many extended forms by third parties, collectively known as Super VGA, then gave way to custom graphics processing units which, in addition to their proprietary interfaces and capabilities, continue to implement common VGA graphics modes and interfaces to the present day.
The VGA analog interface standard has been extended to support resolutions of up to 2048×1536 and even higher in special applications.
Unlike the graphics adapters that preceded it (MDA, CGA, EGA and many third-party options) there was initially no discrete VGA card released by IBM. The first commercial implementation of VGA was a built-in component of the IBM PS/2, in which it was accompanied by 256 KB of video RAM, and a new DE-15 connector replacing the DE-9 used by previous graphics adapters.
IBM later released the standalone IBM PS/2 Display Adapter, which utilized the VGA but could be added to machines that did not have it built in.
The VGA was a single chip implementing the entirety of a video display controller, rather than the many discrete components and ICs of the graphics adapters that had preceded it. The term "array" rather than "adapter" in the name denoted that it was not a complete independent expansion device, but a single component that could be integrated into a system.
The VGA required only video memory, timing crystals and an external RAMDAC, and its small part count allowed IBM to include it directly on the PS/2 motherboard, in contrast to prior IBM PC models – PC, PC/XT, and PC AT – which required a separate display adapter installed in a slot in order to connect a monitor.
The VGA supports all graphics modes supported by the MDA, CGA and EGA cards, as well as multiple new modes.
The 640×480 16-color and 320×200 256-color modes had fully redefinable palettes, with each entry selected from an 18-bit (262,144-color) gamut.
The other modes defaulted to standard EGA or CGA compatible palettes and instructions, but still permitted remapping of the palette with VGA-specific commands.
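Remapping a palette entry amounts to truncating each color channel to the DAC's 6-bit range and writing it to the appropriate palette register. A minimal sketch of the arithmetic (the helper function is illustrative, not part of any standard API; the port numbers in the comment are the standard VGA DAC ports):

```python
def rgb24_to_vga_dac(r, g, b):
    """Truncate 8-bit-per-channel RGB to the 6-bit-per-channel
    values (0-63) accepted by the VGA DAC."""
    return (r >> 2, g >> 2, b >> 2)

# On real hardware the entry would then be programmed by writing the
# palette index to I/O port 0x3C8, followed by the three 6-bit values,
# in red-green-blue order, to port 0x3C9.
```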
As the VGA began to be cloned in great quantities by manufacturers who added ever-increasing capabilities, its 640×480, 16-color mode became the de facto lowest common denominator of graphics cards. By the mid 1990s, a 640×480×16 graphics mode using the VGA memory and register specifications was expected by operating systems such as Windows 95 and OS/2 Warp 3.0, which provided no support for lower resolutions or bit depths, or support for other memory or register layouts without additional drivers. Well into the 2000s, even after the VESA standard for graphics cards became commonplace, the "VGA" graphics mode remained a compatibility option for PC operating systems.
Nonstandard display modes can be implemented by combining the hardware's various supported horizontal resolutions and line counts.
For example, high-resolution modes with square pixels are available at 768×576 or 704×528 in 16 colors, and a medium-low resolution of 320×240 with 256 colors. Alternatively, extended resolution is available with "fat" pixels and 256 colors (e.g., 400×600 at 50 Hz or 360×480 at 60 Hz), or with "thin" pixels, 16 colors, and a 70 Hz refresh rate (e.g., 736×410).
"Narrow" modes such as 256×224 tend to preserve the same pixel ratio as in e.g. 320×240 mode unless the monitor is adjusted to stretch the image out to fill the screen, as they are derived simply by masking down the wider mode instead of altering pixel or line timings, but can be useful for reducing memory requirements and pixel addressing calculations for arcade game conversions or console emulators.
See also: VGA text mode
VGA also implements several text modes.
As with the pixel-based graphics modes, additional text modes are possible by programming the VGA correctly, with an overall maximum of about 100×80 cells and an active area spanning about 88×64 cells.
One variant that is sometimes seen is 80×30 or 80×60, using an 8×16 or 8×8 font and an effective 640×480-pixel display. This trades the steadier 70 Hz refresh for the more flickery 60 Hz mode in exchange for an additional 5 or 10 lines of text and square character blocks (or, at 80×30, square half-blocks).
Unlike the cards that preceded it, which used binary TTL signals to interface with a monitor (or composite video, in the case of the CGA), the VGA introduced a video interface using pure analog RGB signals with a maximum amplitude of 0.7 volts peak-to-peak. In conjunction with an 18-bit RAMDAC (6 bits per primary), this allowed 262,144 distinct colors; later cards paired the same analog interface with 24-bit RAMDACs (8 bits per primary), extending the range to 16,777,216 colors.
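The 262,144 figure follows directly from the 6 bits per primary channel; a short sketch of the arithmetic (the widening helper is illustrative, not part of any standard API):

```python
# 6 bits per primary channel gives (2**6)**3 = 2**18 distinct colors.
VGA_COLORS = (2 ** 6) ** 3

def expand6to8(v):
    """Widen a 6-bit DAC value (0-63) to 8 bits (0-255) by bit
    replication, a common way of scaling up without losing range."""
    return (v << 2) | (v >> 4)
```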
The key timing parameters of the original VGA specification are discussed below.
The intended standard value for the horizontal frequency of VGA's 640×480 mode is exactly double the value used in the NTSC-M video system, as this made it much easier to offer optional TV-out solutions or external VGA-to-TV converter boxes at the time of VGA's development. It is also at least nominally twice that of CGA, which also supported composite monitors.
All derived VGA timings (i.e. those which use the master 25.175 and 28.322 MHz crystals and, to a lesser extent, the nominal 31.469 kHz line rate) can be varied by software that bypasses the VGA firmware interface and communicates directly with the VGA hardware, as many MS-DOS based games did. However, only the standard modes, or modes that at least use almost exactly the same H-sync and V-sync timings as one of the standard modes, can be expected to work with the original late-1980s and early-1990s VGA monitors. The use of other timings may in fact damage such monitors and thus was usually avoided by software publishers.
Third-party "multisync" CRT monitors were more flexible, and in combination with "super EGA", VGA, and later SVGA graphics cards using extended modes, could display a much wider range of resolutions and refresh rates at arbitrary sync frequencies and pixel clock rates.
For the most common VGA mode (640×480, 60 Hz, non-interlaced), the horizontal timings can be found in the HP Super VGA Display Installation Guide and in other places.
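Those timings can be checked with simple arithmetic; this sketch assumes the commonly documented figures of a 25.175 MHz pixel clock, 800 total pixel periods per line and 525 total lines per frame:

```python
PIXEL_CLOCK_HZ = 25_175_000   # master crystal used for 640x480 modes
H_TOTAL = 800                 # 640 active pixels + blanking and sync
V_TOTAL = 525                 # 480 active lines + blanking and sync

line_rate_hz = PIXEL_CLOCK_HZ / H_TOTAL   # 31,468.75 Hz, i.e. ~31.469 kHz
frame_rate_hz = line_rate_hz / V_TOTAL    # ~59.94 Hz, the nominal "60 Hz"
```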
640×400 @ 70 Hz is traditionally the video mode used for booting VGA-compatible x86 personal computers that show a graphical boot screen, while text-mode boot uses 720×400 @ 70 Hz.
This convention has been eroded in recent years, however, with POST and BIOS screens moving to higher resolutions, taking advantage of EDID data to match the resolution to a connected monitor.
640×480 @ 60 Hz is the default Windows graphics mode (usually with 16 colors), up to Windows 2000. It remains an option in XP and later versions via the boot menu "low resolution video" option and per-application compatibility mode settings, despite Windows now defaulting to 1024×768 and generally not allowing any resolution below 800×600 to be set.
The need for such a low-quality, universally compatible fallback has diminished since the turn of the millennium, as VGA-signalling-standard screens or adapters unable to show anything beyond the original resolutions have become increasingly rare.
320×200 at 70 Hz was the most common mode for VGA-era PC games, with pixel-doubling and line-doubling performed in hardware to present a 640×400, 70 Hz signal to the monitor.
The Windows 95/98/Me LOGO.SYS boot-up image was 320×400 resolution, displayed with pixel-doubling to present a 640×400, 70 Hz signal to the monitor. The 400-line signal was the same as the standard 80×25 text mode, which meant that pressing Esc to return to text mode did not change the frequency of the video signal, so the monitor did not have to resynchronize (which could otherwise have taken several seconds).
See also: VGA connector
The standard VGA monitor interface is a 15-pin D-subminiature connector in the "E" shell, variously referred to as "HD-15", "DE-15" and "DB-15".
Because VGA uses low-voltage analog signals, signal degradation becomes a factor with low-quality or overly long cables. Solutions include shielded cables, cables that include a separate internal coaxial cable for each color signal, and "broken out" cables utilizing a separate coaxial cable with a BNC connector for each color signal.
BNC breakout cables typically use five connectors, one each for Red, Green, Blue, Horizontal Sync, and Vertical Sync, and do not include the other signal lines of the VGA interface. With BNC, the coaxial wires are fully shielded end-to-end and through the interconnect so that virtually no crosstalk and very little external interference can occur.
The VGA color system uses register-based palettes to map colors in various bit depths to its 18-bit output gamut. It is backward compatible with the EGA and CGA adapters, but supports extra bit depth for the palette when in these modes.
For instance, when in EGA 16-color modes, VGA offers 16 palette registers, and in 256-color modes, it offers 256 registers. Each palette register contains a 3×6-bit RGB value, selecting a color from the 18-bit gamut of the DAC.
These color registers are initialized to default values IBM expected to be most useful for each mode. For instance, EGA 16-color modes initialize to the default CGA 16-color palette, and the 256-color mode initializes to a palette consisting of 16 CGA colors, 16 grey shades, and then 216 colors chosen by IBM to fit expected use cases. After initialization they can be redefined at any time without altering the contents of video RAM, permitting palette cycling.
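Because redefining the registers leaves video RAM untouched, palette cycling reduces to rotating the register values. A minimal sketch of the idea (pure Python with the hardware I/O omitted; `palette` is assumed to be a list of (r, g, b) tuples with 6-bit components):

```python
def cycle_palette(palette, step=1):
    """Rotate palette entries by `step` positions; on-screen pixels
    referencing these indices appear to animate, with no writes to
    video RAM."""
    step %= len(palette)
    return palette[step:] + palette[:step]
```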
In the 256-color modes, the DAC is set to combine four 2-bit color values, one from each plane, into an 8-bit value representing an index into the 256-color palette. The CPU interface combines the 4 planes in the same way, a feature called "chain-4", so that each pixel appears to the CPU as a packed 8-bit value representing the palette index.
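In chain-4 mode, addressing therefore reduces to a plain linear offset whose low two bits select the physical plane. A sketch for the standard 320×200 256-color mode (mode 13h); the function name is illustrative:

```python
WIDTH = 320   # pixels per line in mode 13h

def chain4_address(x, y):
    """Return (byte offset within the 64 KB window, plane number)
    for a pixel in 320x200 256-color chain-4 mode."""
    offset = y * WIDTH + x
    return offset, offset & 3   # low two address bits pick the plane
```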
The video memory of the VGA is mapped into the PC's address space via a 128 KB window between addresses 0xA0000 and 0xBFFFF in real mode (A000:0000 through B000:FFFF in segment:offset notation). Typically, graphics modes use the segment starting at A000, monochrome text modes the segment at B000, and color text modes the segment at B800.
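The segment:offset pairs map to linear addresses by the usual real-mode rule, with the segment shifted left four bits; a quick sketch:

```python
def linear_address(segment, offset):
    """Convert a real-mode segment:offset pair into the 20-bit
    linear address the CPU actually emits."""
    return (segment << 4) + offset
```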
Due to the use of different address mappings for different modes, it is possible to have a monochrome adapter (i.e. MDA or Hercules) and a color adapter such as the VGA, EGA, or CGA installed in the same machine.
At the beginning of the 1980s, this was typically used to display Lotus 1-2-3 spreadsheets in high-resolution text on a monochrome display and associated graphics on a low-resolution CGA display simultaneously. Many programmers also used such a setup with the monochrome card displaying debugging information while a program ran in graphics mode on the other card. Several debuggers, like Borland's Turbo Debugger, D86 and Microsoft's CodeView could work in a dual monitor setup. Either Turbo Debugger or CodeView could be used to debug Windows.
There were also DOS device drivers such as ox.sys, which implemented a serial-interface simulation on the monochrome display and, for example, allowed the user to receive crash messages from debugging versions of Windows without using an actual serial terminal.
It is also possible to use the "MODE MONO" command at the DOS prompt to redirect the output to the monochrome display. When a monochrome adapter was not present, it was possible to use the 0xB000–0xB7FF address space as additional memory for other programs.
"Unchaining" the VGA memory into its four separate "planes" makes the full 256 KB of RAM available in 256-color modes. This carries a trade-off of extra complexity and a performance loss in some types of graphics operations, though it is mitigated by other operations becoming faster in certain situations.
Software such as Fractint, Xlib and ColoRIX also supported tweaked 256-color modes on standard adapters using freely-combinable widths of 256, 320, and 360 pixels and heights of 200, 240 and 256 (or 400, 480 and 512) lines, extending still further to 384 or 400 pixel columns and 576 or 600 (or 288 or 300) lines. However, 320×240 was the best known and most frequently used, as it offered a standard 40-column resolution and 4:3 aspect ratio with square pixels. "320×240×8" resolution was commonly called Mode X, the name used by Michael Abrash when he presented the resolution in Dr. Dobb's Journal.
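The memory arithmetic shows why 320×240 in 256 colors requires the unchained layout; a sketch under the standard assumptions of one byte per pixel and a 64 KB host-visible window:

```python
PLANES = 4
WINDOW = 64 * 1024            # bytes addressable in chain-4 mode

def bytes_per_page(width, height):
    """One byte per pixel in 256-color modes."""
    return width * height

# 320x240 needs 76,800 bytes, which overflows the chain-4 window...
assert bytes_per_page(320, 240) > WINDOW
# ...but unchained, each plane holds every fourth pixel:
per_plane = bytes_per_page(320, 240) // PLANES   # 19,200 bytes per plane
pages = WINDOW // per_plane                      # room for 3 display pages
```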
The highest resolution modes were only used in special, opt-in cases rather than as standard, especially where high line counts were involved. Standard VGA monitors had a fixed line-scan (H-scan) rate – "multisync" monitors being, at the time, expensive rarities – so the vertical/frame (V-scan) refresh rate had to be reduced to accommodate them, which increased visible flicker and thus eye strain. For example, the highest 800×600 mode, being otherwise based on the matching SVGA resolution (with 628 total lines), reduced the refresh rate from 60 Hz to about 50 Hz (and 832×624, the theoretical maximum resolution achievable with 256 KB at 16 colors, would have reduced it to about 48 Hz, barely higher than the rate at which XGA monitors employed a double-frequency interlacing technique to mitigate full-frame flicker).
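The refresh-rate penalty follows directly from the fixed line rate: with the horizontal frequency pinned near 31.469 kHz, adding total lines can only slow the frame rate. A sketch:

```python
LINE_RATE_HZ = 31_469   # nominal fixed H-scan rate of a standard VGA monitor

def refresh_hz(total_lines):
    """Vertical refresh rate implied by a fixed horizontal line rate."""
    return LINE_RATE_HZ / total_lines

# 525 total lines (640x480) -> ~59.9 Hz; 628 total lines (800x600) -> ~50.1 Hz
```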
These modes were also outright incompatible with some monitors, producing display problems such as picture detail disappearing into overscan (especially in the horizontal dimension), vertical roll, poor horizontal sync or even a complete lack of picture depending on the exact mode attempted. Due to these potential issues, most VGA tweaks used in commercial products were limited to more standards-compliant, "monitor-safe" combinations, such as 320×240 (square pixels, three video pages, 60 Hz), 320×400 (double resolution, two video pages, 70 Hz), and 360×480 (highest resolution compatible with both standard VGA monitors and cards, one video page, 60 Hz) in 256 colors, or double the horizontal resolution in 16-color mode.
Several companies produced VGA-compatible graphics boards.
Main article: Super VGA
Super VGA (SVGA) is a display standard developed in 1988, when NEC Home Electronics announced its creation of the Video Electronics Standards Association (VESA). The development of SVGA was led by NEC, along with other VESA members including ATI Technologies and Western Digital. SVGA enabled graphics display resolutions up to 800×600 pixels, about 56% more pixels than VGA's 640×480 maximum.
Main article: XGA
Extended Graphics Array (XGA) is an IBM display standard introduced in 1990. Later it became the most common appellation of the 1024×768 display resolution.
It is said of airplanes that the DC-3 and the 737 are the most popular planes ever built, the 737 in particular being the best-selling airliner ever. The same could be said of the ubiquitous VGA and its big brother, the XGA. The VGA, which can still be found buried in today's modern GPUs and CPUs, set the foundation for both a video standard and an application-programming standard.