While Microsoft and Sony are busy pushing polygons and stressing high-definition content, Nintendo has let graphics take a "backseat," if you will, to the unique functionality of the Wii Remote. Many people who viewed and played Wii games at E3 this year would tell you that the visuals looked roughly on par with what users already see on their GameCubes.
But that doesn't mean that Wii games will simply look like GameCube titles using a unique control method. Nintendo already told us that the hardware being used at E3 was not finalized. Moreover, during the show GameDaily BIZ met with ATI, which is providing the "Hollywood" GPU for the Wii, and John Swinimer, Senior Public Relations Manager of Consumer Products, emphasized that the Wii architecture is capable of producing far better results than what we've witnessed thus far. "I think what you saw [on Wii] was just the tip of the iceberg of what the Hollywood chip can bring to the Nintendo Wii," he said.
We tried to pry some design specs out of ATI, but Swinimer would only offer the following: "I'm really not here to talk about the design specs... other than the fact that ATI worked closely with Nintendo. The team that worked on this chip also worked on the Flipper chip that was in GameCube, and they've been working with Nintendo for a very long time so there's a great chemistry with the two teams working together."
He continued, "I really don't think that it's about the [specs]; I think it's about the innovation that it brings to the table—the motion-sensing, the always-on capability, which is really cool too—the fact that the chip is powerful enough and responsive enough to be there at a moment's notice, and I think that's pretty cool for the average gamer."
Industry sources have said that the Wii GPU will be moderately more powerful than the GameCube's, though how much more remains unknown. Conservative estimates from developers have placed the Wii console as a whole at 2 to 2.5 times the power of the GameCube.
ATI is also responsible for providing the custom GPU for Microsoft's Xbox 360, so we tried to find out how the "Hollywood" chip compares to what's in the 360. Once again, however, Swinimer sidestepped the question. "They're different chips for different platforms and different uses. I don't think it's a fair comparison to put them on a chart [to analyze]. That's not what it's all about... I think if you focus on the capabilities that the chip will have for the average consumer, with the amazement and wow factor, I think that's the value that we bring."
So the "just a little better than Xbox" graphics we saw were merely the tip of the iceberg of the Wii's graphical capabilities? You heard it straight from ATI; now all you have to do is believe it.