I did not post this back when the Wii U specs were leaked because they were still tagged as rumor at the time. Since then, however, a known developer/programmer with SDK access has confirmed that the spec sheet was copied straight from the Wii U development kit documentation and leaked to the Internet one day before E3, and that it genuinely comes from Nintendo. Keep in mind that these specs are based on earlier dev kits from late last year and are still missing some information (such as the CPU clock frequencies and probably some unique features of the GPU), but they are no longer speculation or rumor. The most recent Wii U dev kit is almost certainly even more powerful than these specs show.
Main Application Processor
Three cores (fully coherent).
3MB aggregate L2 Cache size.
Core 0: 512 KB
Core 1: 2048 KB
Core 2: 512 KB
Write gatherer per core.
Locked (L1d) cache DMA per core.
Up to 3GB of main memory (CAT-DEVs only). Note: the retail machine will have half the devkit memory.
Please note that the quantity of memory available from the Cafe SDK and Operating System may vary.
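The "3MB aggregate" figure above is simply the sum of the asymmetric per-core caches (note that core 1 gets four times the L2 of the other two cores). A quick sanity check, using only the sizes listed above:

```python
# Per-core L2 cache sizes from the leaked spec sheet, in KB.
l2_per_core_kb = {"core 0": 512, "core 1": 2048, "core 2": 512}

aggregate_kb = sum(l2_per_core_kb.values())
print(aggregate_kb)         # 3072 KB
print(aggregate_kb / 1024)  # 3.0 MB, matching the "3MB aggregate" line
```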
Graphics and Video
Modern unified shader architecture.
32MB high-bandwidth eDRAM, supports 720p 4x MSAA or 1080p rendering in a single pass.
HDMI and component video outputs.
Unified shader architecture executes vertex, geometry, and pixel shaders
Multi-sample anti-aliasing (2, 4, or 8 samples per pixel)
Read from multi-sample surfaces in the shader
128-bit floating point HDR texture filtering
High resolution texture support (up to 8192 x 8192)
Indexed cube map arrays
8 render targets
Independent blend modes per render target
Pixel coverage sample masking
Hierarchical Z/stencil buffer
Early Z test and Fast Z Clear
Lossless Z & stencil compression
2x/4x/8x/16x high quality adaptive anisotropic filtering modes
sRGB filtering (gamma/degamma)
Stream out support
Compute shader support
GX2 is a 3D graphics API for the Nintendo Wii U system (also known as Cafe). The API is designed to be as efficient as GX(1) from the Nintendo GameCube and Wii systems. Current features are modeled after OpenGL and the AMD r7xx series of graphics processors. Wii U’s graphics processor is referred to as GPU7.
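A rough back-of-the-envelope check of the eDRAM claim: assuming 32-bit color and 32-bit depth/stencil per sample (common framebuffer formats, though the leak does not specify what GPU7 actually uses), a 720p buffer with 4x MSAA and a single-sample 1080p buffer both fit comfortably inside 32MB:

```python
# Hypothetical framebuffer footprint estimate. Assumes 4-byte color plus
# 4-byte depth/stencil per sample -- a guess, since the leak does not
# state the actual formats GPU7 uses.
BYTES_PER_SAMPLE = 4 + 4  # color + depth/stencil

def framebuffer_mb(width, height, samples):
    """Size in MB of a multisampled color + depth/stencil buffer."""
    return width * height * samples * BYTES_PER_SAMPLE / (1024 ** 2)

print(framebuffer_mb(1280, 720, 4))   # 720p, 4x MSAA -> 28.125 MB
print(framebuffer_mb(1920, 1080, 1))  # 1080p, no MSAA -> 15.8203125 MB
```

Under these assumptions both cases stay below 32MB, which is consistent with the "720p 4x MSAA or 1080p rendering in a single pass" line.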
Sound and Audio
Dedicated 120MHz audio DSP.
Support for 6 channel discrete uncompressed audio (via HDMI).
2 channel audio for the Cafe DRC controller.
Monaural audio for the Cafe Remote controller.
802.11 b/g/n Wi-Fi.
Two USB 2.0 host controllers with two ports each.
512MB SLC NAND for System.
8GB MLC NAND for Applications.
Host PC Bridge
Dedicated Cafe-to-host PC bridge hardware.
Allows File System emulation by host PC.
Provides interface for debugger and logging to host PC.
So what do all these numbers and words mean? The parts that stand out to me are "Compute Shader Support" and the "Tessellation Unit," both features being very prominent in DirectX 11 graphics cards. Tessellation was possible on the Xbox 360/PS3 (at a high performance cost) but is much improved and more efficient in DirectX 11-class hardware. These features alone would put the Wii U's custom "GPU7" processor far ahead of what was possible on the Xbox 360 and PS3. Compute shading is most likely what the GPUs in the PS4 and the next Xbox will use to take strain off the CPU without hampering the performance of the GPU (I realize this can be referred to as GPGPU, but since all modern PC GPUs have this functionality it's not really necessary to call it that).
Here is a good explanation of the benefits that Compute Shading brings:
Compute Shaders are programs that are executed on the graphics processor. With DirectX 11 and DirectCompute, developers are able to use the massive parallel processing power of modern GPUs to accelerate a much wider range of applications that were previously only executable on CPUs. Compute Shaders can be used to enable new graphical techniques to enhance image quality (such as order independent transparency, ray tracing, and advanced post-processing effects), or to accelerate a wide variety of non-graphics applications (such as video transcoding, video upscaling, game physics simulation, and artificial intelligence). In games, Compute Shader support effectively enables more scene details and realism:
- Optimized post-processing effects – apply advanced lighting techniques to enhance the mood in a scene
- High quality shadow filtering – no more hard edges on a shadow, see shadows the way you would in real life
- Depth of field – use the power of the GPU to have more realistic transitions of focal points – imagine looking through a gun sight or a camera lens
- High-definition ambient occlusion – incredibly realistic lighting and shadow combinations
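The programming model behind all of the above can be pictured as a small per-pixel "kernel" function invoked once for every cell of a 2D dispatch grid. This is a CPU-side Python sketch of that model only, not the GX2 or DirectCompute API; on real hardware the invocations run in parallel across the GPU's shader units, while this loop runs them sequentially:

```python
# Sketch of the compute-shader dispatch model: one kernel invocation per
# grid cell. On a GPU these run in parallel; this loop only illustrates
# the structure of the model.

def dispatch(kernel, width, height, *buffers):
    """Invoke kernel(x, y, *buffers) for every cell of a width x height grid."""
    for y in range(height):
        for x in range(width):
            kernel(x, y, *buffers)

def brighten_kernel(x, y, src, dst):
    """Per-pixel kernel: a trivial post-processing pass (brightness boost)."""
    dst[y][x] = min(255, src[y][x] + 40)

w, h = 4, 2
src = [[100, 200, 230, 10], [0, 50, 255, 120]]
dst = [[0] * w for _ in range(h)]
dispatch(brighten_kernel, w, h, src, dst)
print(dst)  # [[140, 240, 255, 50], [40, 90, 255, 160]]
```

Swapping `brighten_kernel` for a shadow-filtering or ambient-occlusion kernel is, conceptually, all that separates this toy from the effects listed above; the win on real hardware comes from running thousands of these invocations at once.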
When you start to put all this information together, it becomes easy to understand why ports of 360/PS3 games will not benefit much in the graphics department on the Wii U unless developers substantially rewrite their code. Compute shading and tessellation, for example, are features you will not be seeing in most (or any) 360/PS3 ports to the Wii U. Worse, developers making those ports will follow the same rules as on the 360/PS3, making the CPU do work that could have been done by the Wii U's GPU and its newer technology. This will be the Wii U's version of a "lazy port," with its extra features not even being used.
The good news is that the GPU in the Wii U should be able to accomplish all the effects of the PS4 and the next Xbox, even though those systems will most likely be able to do more of them, and at faster speeds. With all of the next-gen systems sharing similar technology and features, it's doubtful that the Wii U will get left behind in the graphics race: the difference should not be noticeable enough to make anyone buy another console solely because the Wii U can't run a certain game, as happened with the original Wii. It's encouraging that we (I say we, but I mean the industry, since you could argue that the Wii had the best games this gen) will finally get back to comparing who has the best games, not the best graphics, in the coming generation.