Hollywood (graphics chip)

ATI "Hollywood" GPU within the Wii console

Hollywood is the name of the graphics processing unit (GPU) used in Nintendo's Wii video game console. It was designed by ATI (now AMD), and is manufactured using the same 90 nm CMOS process[1] as the "Broadway" processor. Very few official details were released to the public by Nintendo, ATI, or IBM. The Hollywood GPU is reportedly based on the GameCube's "Flipper" GPU and is clocked 50% higher at 243 MHz,[2] though none of the clock rates were ever confirmed by Nintendo, IBM, or ATI.

Hollywood is a multi-chip module (MCM) containing two dies in a single package. One of the two chips, codenamed Napa, contains the I/O functions, the memory controller, the audio DSP, and the GPU proper with its embedded DRAM; it measures 8 × 9 mm. The other, codenamed Vegas, holds 24 MB of "internal" 1T-SRAM and measures 13.5 × 7 mm.[3]

Hollywood also contains an ARM926 core, which has been unofficially nicknamed Starlet.[4] This embedded microprocessor handles many of the system's I/O functions, including the wireless functionality, USB, the optical disc drive, and other miscellaneous tasks. It also acts as the security controller of the system, performing encryption and authentication; Hollywood includes hardware implementations of AES and SHA-1 to accelerate these operations. Communication with the main CPU is accomplished via an inter-processor communication (IPC) mechanism. The Starlet also performs the WiiConnect24 functions while the Wii console is in standby mode.[4]
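None of this is officially documented, but the homebrew WiiBrew project (cited below) describes the IPC mechanism as a pair of mailbox registers plus control flags in Hollywood's memory-mapped I/O space. The following C sketch illustrates such a mailbox handshake from the PowerPC side under those assumptions; the register addresses and bit assignments are taken from unofficial homebrew documentation and simplified here, so treat this as an illustrative model rather than a reference implementation.

    #include <stdint.h>

    /* Hollywood IPC mailbox registers as seen from the PowerPC's uncached
     * window (addresses per unofficial WiiBrew documentation). */
    #define HW_IPC_PPCMSG  ((volatile uint32_t *)0xCD000000) /* PPC -> ARM message */
    #define HW_IPC_PPCCTRL ((volatile uint32_t *)0xCD000004) /* PPC-side control   */
    #define HW_IPC_ARMMSG  ((volatile uint32_t *)0xCD000008) /* ARM -> PPC message */

    #define IPC_CTRL_X1 0x01u /* "message sent" flag (assumed bit layout) */
    #define IPC_CTRL_Y2 0x02u /* "reply ready" flag  (assumed bit layout) */

    /* Send one 32-bit word to the Starlet and busy-wait for its reply. */
    static uint32_t ipc_send(uint32_t msg)
    {
        *HW_IPC_PPCMSG  = msg;         /* place the message in the mailbox */
        *HW_IPC_PPCCTRL = IPC_CTRL_X1; /* raise the "message sent" flag    */

        while (!(*HW_IPC_PPCCTRL & IPC_CTRL_Y2))
            ;                          /* spin until the Starlet responds  */

        *HW_IPC_PPCCTRL = IPC_CTRL_Y2; /* acknowledge (write-to-clear)     */
        return *HW_IPC_ARMMSG;         /* read the Starlet's reply word    */
    }

According to the same homebrew documentation, the words exchanged this way are typically physical pointers to command blocks in shared memory, which is how the PowerPC asks the Starlet to perform disc, USB, wireless, and cryptographic operations.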

Hardware Capabilities

  • 243 MHz graphics chip
  • 3 MB embedded GPU memory (eDRAM)
    • 2 MB dedicated to the Z-buffer and framebuffer
    • 1 MB texture cache
  • 24 MB of 1T-SRAM @ 486 MHz (3.9 GB/s) directly accessible for textures and other video data
  • Fixed-function pipeline (no hardware support for programmable vertex or pixel shaders)
  • Texture Environment Unit (TEV), capable of combining up to 8 textures in up to 16 stages or "passes"
  • ~30 GB/s internal bandwidth^
  • ~18 million polygons/second^
  • 972 Mpixels/s peak pixel fillrate

(Note: ^ denotes speculation: these figures are the confirmed GameCube numbers multiplied by 1.5, a crude but likely accurate estimate given the identical architecture and 50% higher clock speed; the arithmetic is sketched below.)
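As a sanity check, the following C snippet reproduces that clock-scaling arithmetic. The GameCube baseline figures used here (162 MHz core clock, 648 Mpixels/s fillrate, ~12 million polygons/s, ~20 GB/s internal bandwidth) are the commonly cited Flipper specifications and are assumptions for illustration, not values confirmed by Nintendo or ATI; the 1T-SRAM figure additionally assumes a 64-bit interface.

    #include <stdio.h>

    int main(void)
    {
        /* Commonly cited GameCube ("Flipper") baselines (assumptions). */
        const double clock_mhz = 162.0; /* Flipper core clock            */
        const double fill_mpx  = 648.0; /* peak fillrate, Mpixels/s      */
        const double poly_m    = 12.0;  /* millions of polygons/s        */
        const double bw_gbs    = 20.0;  /* internal bandwidth, GB/s      */
        const double scale     = 1.5;   /* Hollywood/Flipper clock ratio */

        printf("clock:     %.0f MHz\n",        clock_mhz * scale); /* 243 */
        printf("fillrate:  %.0f Mpixels/s\n",  fill_mpx  * scale); /* 972 */
        printf("polygons:  ~%.0f million/s\n", poly_m    * scale); /* ~18 */
        printf("bandwidth: ~%.0f GB/s\n",      bw_gbs    * scale); /* ~30 */

        /* 1T-SRAM: 486 MHz x 8 bytes per transfer (assumed 64-bit bus). */
        printf("1T-SRAM:   %.1f GB/s\n", 486e6 * 8.0 / 1e9);       /* 3.9 */
        return 0;
    }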

Texture Environment Unit

The Texture Environment Unit (TEV) is a piece of hardware exclusive to the GameCube and Wii. The Wii inherited the TEV from Flipper, and the TEV is - to use an analogy from Factor 5 director Julian Eggebrecht - "like an elaborate switchboard that makes the wildest combinations of textures and materials possible."

The TEV pipeline combines up to 8 textures in up to 16 stages. Each stage can apply one of a multitude of functions to its inputs, which developers frequently used to simulate pixel-shader effects such as bump mapping or to perform effects such as cel shading. On the GameCube, Factor 5's Star Wars: Rogue Squadron II used the TEV for the targeting-computer effect and for simulated volumetric fog, while Wave Race: Blue Storm notably used it for water distortion (such as refraction) and other water effects. The Wii's TEV capabilities are no different from the GameCube's, aside from the indirect performance advantage of the faster clock speed.
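Unofficial homebrew GX documentation models each TEV stage as a fixed combiner that blends four selectable inputs (texture colors, rasterized vertex colors, constant registers, or the previous stage's output). The C sketch below illustrates the per-channel combiner equation as described there; the operand names a, b, c, d follow the homebrew convention, and the whole thing is an illustrative model, not an official specification.

    #include <stdio.h>

    /* One TEV stage per color channel, per unofficial GX documentation:
     *
     *   out = (d +/- ((1 - c) * a + c * b) + bias) * scale
     *
     * a, b, c, d are the stage's four selectable inputs. */
    static float tev_stage(float a, float b, float c, float d,
                           int subtract, float bias, float scale)
    {
        float blend = (1.0f - c) * a + c * b;   /* interpolate a toward b */
        float out   = subtract ? d - blend : d + blend;
        out = (out + bias) * scale;

        /* The hardware clamps results to the representable range. */
        if (out < 0.0f) out = 0.0f;
        if (out > 1.0f) out = 1.0f;
        return out;
    }

    int main(void)
    {
        /* Classic single-stage "texture * vertex lighting" setup:
         * a = 0, b = texture color, c = rasterized color, d = 0,
         * giving out = texture color * vertex color. */
        float tex = 0.8f, vtx = 0.5f;
        printf("modulated: %.2f\n", tev_stage(0.0f, tex, vtx, 0.0f, 0, 0.0f, 1.0f));
        return 0;
    }

Chaining up to 16 such stages, each with freely chosen inputs and operations, is what allows effects like bump mapping and cel shading to be approximated without programmable shaders.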


References

  1. "Wiiの概要 (Wii本体)" [Wii overview (Wii console)] (in Japanese). Nintendo. Archived from the original on 2006-06-15. Retrieved 2007-01-03.
  2. "IGN: Revolution's Horsepower". IGN. 2006-03-29. Archived from the original on 2011-05-22. Retrieved 2006-12-23.
  3. Eda, Hiroki (2006-11-27). "PS3 VS Wii, Comparisons of Core LSI Chip Areas". Tech-On!. Archived from the original on 2007-01-03.
  4. "Starlet". Wiibrew. Retrieved 2008-02-20.