Matrox Parhelia

From Wikipedia, the free encyclopedia

[Images: Parhelia AGP 128 MB; Parhelia PCI-X 256 MB]

The Matrox Parhelia-512 is a graphics processing unit (GPU) released by Matrox in 2002. It has full support for DirectX 8.1 and incorporates several DirectX 9.0 features. At the time of its release, it was best known for its ability to drive three monitors ("Surround Gaming") and its Coral Reef tech demo.

As had happened with previous Matrox products, the Parhelia was released just before competing companies released cards that completely outperformed it. In this case it was the ATI Radeon 9700, released only a few months later. The Parhelia remained a niche product, and was Matrox's last major effort to sell into the consumer market.

Background

The Parhelia series was Matrox's attempt to return to the market after a long hiatus, the company's first significant effort since the G200 and G400 lines had become uncompetitive. Its other post-G400 products, the G450 and G550, were cost-reduced revisions of G400 technology and were not competitive with ATI's Radeon or NVIDIA's GeForce lines with regard to 3D computer graphics.

Description

Features

The Parhelia-512 was the first GPU by Matrox to be equipped with a 256-bit memory bus, giving it a memory-bandwidth advantage over other cards of its time. The "-512" suffix refers to its 512-bit internal ring bus. The Parhelia processor also featured glyph acceleration, performing text anti-aliasing in hardware.

Parhelia-512 includes four 32-bit × 4-wide vertex shader units with a dedicated displacement-mapping engine, and a pixel shader array of four pixel pipelines, each with four texture units and a 5-stage pixel shader. It supports 16× fragment anti-aliasing. These capabilities were featured prominently in Matrox's Coral Reef technical demo.

The display controller supports a 10-bit color frame buffer (called "Gigacolor"), with 10-bit 400 MHz RAMDACs on the two RGB ports and a 230 MHz RAMDAC on the TV encoder port, which was an improvement over its competitors. The frame buffer is in RGBA (10:10:10:2) format and supports full gamma correction. Dual-link TMDS is supported via an external controller connected to the digital interface.
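
As an illustration of the 10:10:10:2 format (a sketch, not Matrox driver code; the channel ordering within the 32-bit word is assumed here for illustration), packing and unpacking such a pixel can be written in Python:

# Illustrative sketch of a 10:10:10:2 RGBA pixel, the "Gigacolor"
# frame-buffer layout described above. Channel ordering is an assumption.

def pack_rgb10a2(r: int, g: int, b: int, a: int) -> int:
    """Pack 10-bit R/G/B (0-1023) and 2-bit A (0-3) into one 32-bit word."""
    assert 0 <= r < 1024 and 0 <= g < 1024 and 0 <= b < 1024 and 0 <= a < 4
    return (a << 30) | (r << 20) | (g << 10) | b

def unpack_rgb10a2(pixel: int) -> tuple[int, int, int, int]:
    """Inverse of pack_rgb10a2; returns (r, g, b, a)."""
    return ((pixel >> 20) & 0x3FF, (pixel >> 10) & 0x3FF,
            pixel & 0x3FF, (pixel >> 30) & 0x3)

# 10 bits per channel give 1024 levels instead of the usual 256:
print(unpack_rgb10a2(pack_rgb10a2(1023, 512, 0, 3)))  # (1023, 512, 0, 3)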

The memory controller supports a 256-bit DDR SDRAM interface.

The "Surround Gaming" support allowed the card to drive three monitors creating a unique level gaming immersion. For example, in a flight simulator or sim racing, the middle monitor could show the windshield while the left and right monitors could display the side views (offering peripheral vision). However, only 2 displays can be controlled independently.[1]

Video cards

The cards, released in 2002 and simply called Matrox Parhelia, initially came with 128 or 256 MiB of memory. Retail cards are clocked at 220 MHz core and 275 MHz memory; OEM cards at 200 MHz core and 250 MHz memory.[2]
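
From these quoted figures, the theoretical peak memory bandwidth follows from the standard DDR formula, bus width in bytes × memory clock × 2 transfers per cycle (a back-of-the-envelope calculation, not a measured figure):

# Peak DDR memory bandwidth for the clocks quoted above and the
# 256-bit bus described earlier. Theoretical peaks only.

def ddr_bandwidth_gbs(bus_bits: int, clock_mhz: float) -> float:
    """Peak DDR bandwidth in GB/s (1 GB = 1e9 bytes)."""
    return (bus_bits / 8) * (clock_mhz * 1e6) * 2 / 1e9

print(ddr_bandwidth_gbs(256, 275))  # retail Parhelia: 17.6 GB/s
print(ddr_bandwidth_gbs(256, 250))  # OEM Parhelia:    16.0 GB/s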

To further improve analog image quality, fifth-order low-pass filters are used on the analog outputs.

Performance

[Image: Parhelia chip]

For a top-of-the-line and rather expensive card (US$399), the Matrox Parhelia's 3D gaming performance was well behind NVIDIA's older and similarly priced GeForce 4 Ti 4600. The Parhelia was only competitive with the older Radeon 8500 and GeForce 3, which typically cost half as much. The Parhelia's potential performance was held back by its comparatively low GPU clock speed (220 MHz for the retail model, 200 MHz for the OEM and 256 MB models), initially believed by many commentators to be due to the large (for that time-frame) transistor count. However, ATI's Radeon 9700 was released later that year with a considerably larger transistor count (108 million vs. 80 million), on the same 150 nm chip fabrication process, yet managed a substantially higher clock (325 MHz vs. 220 MHz).

The card's fillrate performance was formidable[citation needed] in games that used many texture layers: though equipped with just four pixel pipelines, each had four texture units. This proved not to be an efficient arrangement in most situations. Parhelia was also hampered by poor bandwidth-conserving techniques: ATI introduced its third-generation HyperZ with the Radeon 9700, and NVIDIA touted Lightning Memory Architecture 2 for the GeForce 4 series, but Matrox had no similarly comprehensive optimization approach. While the Parhelia possessed impressive[fact or opinion?] raw memory bandwidth, much of it was wasted on invisible housekeeping tasks because the card lacked the ability to predict overdraw or compress z-buffer data, among other inefficiencies. Some writers believed Parhelia to have a "crippled" triangle-setup engine that starved the rest of the chip in typical 3D rendering tasks [1].
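
For concreteness, the theoretical peaks implied by the 4-pipeline, 4-texture-unit layout at the 220 MHz retail clock can be worked out as follows (an illustrative calculation; as noted above, real workloads rarely kept all 16 texture units busy):

# Rough fill-rate arithmetic for the pipeline layout described above.
# Theoretical peaks only, not benchmark results.

def fillrates(pipelines: int, tmus_per_pipe: int, clock_mhz: float):
    pixel_mps = pipelines * clock_mhz                  # Mpixels/s
    texel_mps = pipelines * tmus_per_pipe * clock_mhz  # Mtexels/s
    return pixel_mps, texel_mps

print(fillrates(4, 4, 220))  # (880, 3520) -> 0.88 Gpixels/s, 3.52 Gtexels/s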

Later in the Parhelia's life, when DirectX 9 applications were becoming quite prevalent, Matrox acknowledged that the vertex shaders were not Shader Model 2.0 capable, and as such not DirectX 9-compliant as initially advertised. Presumably there were several bugs within the Parhelia core that could not be worked around in the driver.[speculation?] The point was largely moot, however, because the Parhelia's performance was not adequate to drive most DirectX 9 titles well even without more complex shader code weighing the card down.

Sales

Despite the lackluster performance for its price, Matrox hoped to win over enthusiasts with the Parhelia's unique, high-quality features, such as "Surround Gaming", glyph acceleration, high resolutions, and 16× fragment anti-aliasing. In these respects, some reviewers[like whom?] suggested that the Parhelia could have been a compelling alternative to the comparably priced GeForce 4 Ti 4600 (US$399), which was the performance leader but only DirectX 8.1 compliant.

However, within a few months of release, the Parhelia was completely overshadowed by ATI's Radeon 9700, which was far faster, fully DirectX 9.0 compliant, and produced higher-quality 3D images, while debuting at the same price (US$399). Priced level with faster cards, the Parhelia never gained a significant foothold in the market. It remained a niche product, while Nvidia and ATI controlled the majority of the discrete graphics chip market.

Parhelia-LX

After the launch of the Parhelia-512, Matrox released the Parhelia-LX, which supports only a 128-bit memory bus and has only two pixel pipelines. The first video cards using it were the Matrox Millennium P650 and Millennium P750.

Future products

Originally, Matrox planned to produce a "Parhelia 2" successor, codenamed "Pitou".[3] However, when the Parhelia-512 failed to compete in the gaming market, the project was never mentioned again, and Matrox left the gaming market altogether by 2003.

Parhelia processors were later upgraded to support AGP 8× and PCI Express.

In 2006, Matrox re-introduced Surround Gaming with the TripleHead2Go, which uses the system's existing GPU to render 3D graphics and splits the resulting image across three screens.[4][5] Certified systems include those with ATI and NVIDIA (and later Intel) graphics processors.

With the introduction of the Millennium P690 in 2007, the chip was die-shrunk to 90 nm and gained support for DDR2 memory.[6] Windows Vista is supported under the XP Driver Model.

In June 2008, Matrox announced the release of the M-Series video cards,[7] advertising single-chip quad-head support. Unlike previous products, the M-Series supports Windows Vista Aero acceleration.

In 2014, Matrox announced that its next line of multi-display graphics cards would be based on 28 nm AMD GPUs with Graphics Core Next technology, offering DirectX 11.2, OpenGL 4.4, and OpenCL 1.2 compatibility, Shader Model 5.0, PCI Express 3.0, and a 128-bit memory interface.[8] The first AMD-based products, the Matrox C420 and C680, were set to be available in Q4 2014.[9]

References
