Revolution Tech Specs Mooted: Uproar Ensues – Full Report

GameCube Ver 2 claimed. Best Nintendo console ever looms

Posted by Staff
IGN has gone on record claiming to have been privy to hitherto secret information regarding what will be under the hood of Nintendo's looming Revolution console, offering specifications allegedly obtained from senior development sources which point to a GameCube-based next-generation console.

According to the site:

Insiders stress that Revolution runs on an extension of the Gekko and Flipper architectures that powered GameCube, which is why studios who worked on GCN will have no problem making the transition to the new machine, they say. IBM's "Broadway" CPU is clocked at 729MHz, according to updated Nintendo documentation. By comparison, GameCube's Gekko CPU ran at 485MHz. The original Xbox's CPU was clocked at 733MHz. Meanwhile, Xbox 360 runs three symmetrical cores at 3.2GHz.

Revolution's ATI-provided "Hollywood" GPU clocks in at 243MHz. By comparison, GameCube's GPU ran at 162MHz, while the GPU on the original Xbox was clocked at 233MHz. Sources we spoke with suggest that it is unlikely the GPU will feature any added shader features, as has been speculated.

The overall system memory numbers we reported last December have not greatly fluctuated, but new clarifications have surfaced. Revolution will operate using 24MBs of "main" 1T-SRAM. It will additionally boast 64MBs of "external" 1T-SRAM. That brings the total number of system RAM up to 88MBs, not including the 3MB texture buffer on the GPU. By comparison, GameCube featured 40MBs of RAM not counting the GPU's on-board 3MBs. The original Xbox included 64MBs total RAM. Xbox 360 and PlayStation 3 operate on 512MBs of RAM.

It is not known if the 14MBs of extra D-RAM we reported on last December are in the current Revolution specifications.


Worthy of note is the fact that both the CPU and GPU clock speeds mooted for Revolution are almost exactly 1.5 times those of the GameCube, leading some to suggest that IGN's report comprises little more than 'outsider' developer information from a second-hand source at last week's Game Developers Conference.
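For the record, that observation is easy to check against the figures quoted above. A quick sketch of the arithmetic in Python, using only the clock speeds from IGN's report:

# Clock-speed ratios, Revolution vs GameCube, per the figures in IGN's report.
gekko_mhz, broadway_mhz = 485, 729      # GameCube CPU vs reported Revolution CPU
flipper_mhz, hollywood_mhz = 162, 243   # GameCube GPU vs reported Revolution GPU

print(round(broadway_mhz / gekko_mhz, 3))     # 1.503 - almost exactly 1.5x
print(round(hollywood_mhz / flipper_mhz, 3))  # 1.5   - exactly 1.5x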

The fact that no shader technology has been mentioned has surprised many, leaving Nintendo somewhat out in the cold in the eyes of technophiles as the next generation of home consoles comes online.

As you would imagine, the very suggestion that the Revolution will be underpowered compared to the Xbox 360 and PlayStation 3 has caused fanboy meltdown, in spite of the fact that Nintendo has repeatedly outlined that it is focusing on gameplay innovation over graphics. The row over which element of gaming matters most boils the videogame community down into two main factions: those who care about pixel-shaded sweat beads appearing on the face of Dante in DMC7, and those who would rather play Bishi Bashi-inspired competitive interactive cookery using a magic wand.

The fact that no mass storage has been announced for the Revolution has also caused some to turn their noses up at Nintendo's new machine, though SPOnG believes Nintendo will release a first-party hard disc soon after the Revolution ships. Company president Satoru Iwata briefed some developers during GDC to expect HDD support, explaining that the Revolution can make use of any USB storage medium.

Whether Nintendo's gamble with power pays off (and understand, it is a gamble) remains to be seen, though SPOnG, for the record, believes it will, and in a massive way.

Should these specs prove accurate, essentially every studio on the planet with GameCube experience can begin making games for Revolution. Bear in mind, too, that these games will cost a fraction of those being readied for PlayStation 3 and Xbox 360. The result will be more games available at a lower price point. Then throw into the mix Nintendo's ever-impressive online service and, of course, the Revolution's word-defying new controller, and a whole new world of quick, accessible, fun gaming opens up.

The same games but prettier or a whole new way of playing. A no-brainer for people on both sides of the divide.

We'll leave you with this. Ask anyone who owns both a DS and a PSP which they play and enjoy the most. The technophiles will back PSP and the gamers will take DS every time.

Comments

Showing the 20 most recent comments.
Andrew 30 Mar 2006 12:43
5/24
Hmm. If this is true, it looks like Nintendo are trying to re-use GC technology in an effort to save cash. Although I'm an XBOX and 360 user, I was always quite impressed with the image quality of the GC - it was crisp and colourful and seemed capable of very 'solid' looking 3d. Very much like the DC in fact. However, in this world of 'mine is bigger than yours', these published specs will enable the playground Nintendo bashers to have a field day.
soanso 30 Mar 2006 13:26
6/24
Nintendo have a precedent for reusing old hardware. The SNES was based around re-used hardware with a kick-ass graphics and sound chip stuck on, but it introduced a new controller and a new way of playing.
I still don't think memory and power are that important. Running games in HD takes a lot of power, and I think polygon for polygon Revo games will match 360 and PS3 games (only those won't have such pretty pixels attached to them).
For the normal user, the average gamer, will that actually make any difference? I don't have an HD TV, I have no intention of buying one, and I can't afford it anyway even if I wanted one. Will 360 games look better than Revo games on my 14-inch TV?

I think this new console is made to be the best value for money possible. The cheapest to develop for, the cheapest to make and likely the cheapest to buy. I don't expect cheaper games; I reckon they'll cost the same as GC games do today.
I'm so bored with games right now. I haven't bought a proper console game for longer than I can remember. I've spent money on 'other distractions' and I've bought DS games, but as far as proper play-them-on-the-telly console games go, I'm bored stiff by them!
TwoADay 30 Mar 2006 13:48
7/24
As long as the graphics on the first-party games are at the RE4 level (hopefully a little better) at E3, there shouldn't be too huge of a hit on N.

I, like others, am generally bored with most of the console games out there (although I finally picked up SSBM), so the fact that there aren't teraflops of things moving around isn't a deal-breaker to me.

The key is making sure that the graphics are on an RE4 level. Sure, graphics don't equal gameplay, but graphics generally get the interest going in a game/system.

This also raises the hopes of anyone hoping to see a system under $200 US, or at $200.

Not great news, but not bad news, either. I'm still interested...
way 30 Mar 2006 14:08
8/24
Hmm, this is close to the spec I estimated they could do for the Game Boy Advance 2 based on the GameCube.

What happened to the higher-end spec options that were mooted last year for the Revo? Those options were much closer to what you would fit in a box the size of a Revolution and keep reasonably cool. Sure, they could put in a multi-core 3GHz CPU next year, when IBM has their new high-speed, low-power process online, but presently the limit is closer to around 1.8GHz.

Something smells fishy here.

Still, this machine should be able to fast-track to a portable version very quickly (hold it, GBA2 was supposed to have a disk) as it will likely not be much more complex than a PSP (which is more complex than a PS2). Nintendo likes to introduce really cheap-to-manufacture hardware based on older tech for its portables, tech that has been shrunk down in cost and power consumption. Without actual specs for the Revo's transistor count, process, and power consumption, or the PSP's, an estimate would be that it might be mature enough for a low-cost portable version next year, or they might delay it a year after that. If this is the case, they might be free to upgrade the Revo itself next year or the year after.

About the graphics potential:
The spec of the Revo might look quite low, but remember, like PowerVR, this is an optimised architecture that only needs to render on-screen pixels. It is possible for them to provide what they claim at these sorts of MHz, especially with parallel processing arrays. My own scheme would use less than 200MHz for near photo-realistic rendering in a hardware version. All is not lost, but I don't see anybody that far advanced in 3D process modelling refinement.
OptimusP 30 Mar 2006 16:14
9/24
From a technical standpoint, having 24MB of RAM and then 64MB of RAM of the same type separately is very inefficient; it would be easier to just stick a 96MB slab into it.

And there is some other stuff in there that sounds very inefficient. Why is that a factor? Well, Nintendo likes efficiency; that's why the GC could produce Xbox-like graphics at double the framerate (where Halo 2 has trouble reaching 30 fps, RE4 was locked at 60 fps... go figure). It was hyper-efficient and fast as hell. Suddenly making a piece of hardware that's supposed to be easier to work with than the GC, with those kinds of inefficiencies... it smells fishy indeed.

Oh yes, for those who doubt that 88MB of 1T-SRAM is enough... the 16MB of 1T-SRAM in the GC (strange that IGN again mixes up the 24MB E-RAM of the GC with the 16MB 1T-SRAM) is favored by developers over the PS2's total 32MB of RAM, being capable of handling much more data (1T-SRAM is about 10 times faster than the PS2's RAM type).
majin dboy 30 Mar 2006 16:35
10/24
okay, wtf are u ppl talking about. i think i understood about two sentences of the last 3 or 4 forum posts.
crs117 30 Mar 2006 18:18
11/24
Way,

You obviously do not have a clue of what you are talking about.

Next up... MHz comparisons between systems do not mean much of anything if you do not take the actual architecture into account. For instance, a 3 gigahertz machine that is only capable of doing 1 ALU (arithmetic logic unit (1+1=2)) calculation per cycle (hertz) is going to be slower than a 1 gigahertz machine that can process 4 ALU calculations per cycle, because the first machine can do 3 billion ALU operations per second while the second can do 4 billion ALU operations per second, even though the first chip is clocked 3 times faster.
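A rough sketch of that arithmetic in Python, using the hypothetical chips from the example above (the figures are illustrative, not real hardware):

# Throughput = clock rate x operations completed per cycle.
def alu_ops_per_second(clock_hz, ops_per_cycle):
    return clock_hz * ops_per_cycle

narrow_fast = alu_ops_per_second(3_000_000_000, 1)  # 3GHz, 1 ALU op/cycle -> 3 billion ops/s
wide_slow   = alu_ops_per_second(1_000_000_000, 4)  # 1GHz, 4 ALU ops/cycle -> 4 billion ops/s
print(wide_slow > narrow_fast)  # True: the lower-clocked chip wins on this workload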

The very fact that IGN tried to compare the PIII-based Xbox chip with either the GCN or Xbox 360 chip shows total ignorance of all things computing.

I highly doubt the claims of this report or the accuracy of the system specs. I am certain that the rev is not going to have near the horsepower of either ps3 or x360 but N has said that we will say wow at the graphics.

To put things into perspective, the PS3 and X360 were designed to be media hubs, which required a much more general processor and hardware design, while the GCN and the Rev were designed from the ground up to play games. If I am capable of playing music off my PC while playing PGR3, while being connected and periodically jumping back into Xbox Live, think about how much computing power and RAM has not been applied to PGR3.

I do fully believe that the GPU will be a more cutting-edge chip than an upclocked GCN graphics chip... it would not cost them a cent more to simply take a new ATI chip based on more modern technology out of their middle-of-the-line hardware, which is way more capable than the five-year-old graphics chip technology in the GCN, and it has shader support. Besides, I think that the 3MB of onboard graphics memory is for pixel and shader operations.

RAM won't be nearly the issue, because resolutions will be limited to 480p, so framebuffers and textures will by nature be much smaller than at 720p or 1080i/p.
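A rough illustration of that point, assuming a 32-bit (4 bytes per pixel) framebuffer; the byte counts are illustrative only and ignore textures, anti-aliasing and double buffering:

# Pixel counts and raw framebuffer sizes at common output resolutions.
for name, w, h in [("480p", 640, 480), ("720p", 1280, 720), ("1080p", 1920, 1080)]:
    pixels = w * h
    mb = pixels * 4 / (1024 * 1024)  # 4 bytes per pixel
    print(f"{name}: {pixels:,} pixels, {mb:.1f} MB per buffer")
# 480p:  307,200 pixels, 1.2 MB per buffer
# 720p:  921,600 pixels, 3.5 MB per buffer  (3x the pixels of 480p)
# 1080p: 2,073,600 pixels, 7.9 MB per buffer (6.75x the pixels of 480p)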

By the way, the reason for separating the two main RAMs is cost. The RAM is on two separate levels. I think 16 is onboard with the processor (again indicating a full processor redesign even if it is similar), but each transistor on the chip increases the die size, which exponentially increases cost. The 16MB will be on a lower level and faster to access than the rest of the RAM, which will be external and will require some external bus (access limited by bus width and bus clock). 16MB of onboard processor cache is pretty good considering that most server chips feature 1-2MB, 4 max on the latest chips.
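For what it's worth, the "bus width and bus clock" limit mentioned above works out roughly like this; the widths and clocks below are purely hypothetical, since no bus details appear in the report:

# Peak bandwidth ~= (bus width in bytes) x clock x transfers per clock.
def peak_bandwidth_gb_s(bus_width_bits, clock_mhz, transfers_per_clock=1):
    return (bus_width_bits / 8) * (clock_mhz * 1_000_000) * transfers_per_clock / 1e9

print(peak_bandwidth_gb_s(64, 243))   # ~1.9 GB/s for a hypothetical 64-bit bus at the GPU clock
print(peak_bandwidth_gb_s(128, 243))  # ~3.9 GB/s if that bus were twice as wide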

I still say bogus specs. Maybe those were early dev kits that were to be used as a platform to start development.
Happydwarf 30 Mar 2006 19:04
12/24
Can I just say that it really doesn't matter about the specs of the machine. Nintendo have already proved that with the DS. Nintendo have already outsold the PSP, due in part to the new and original control system. The same will be true of the Revolution, which I expect to be not far off the specs publicised by IGN. The main thing is that Nintendo aren't trying to compete in a willy-waving "my machine looks better than yours" contest, but bring fresh gaming to the consumers. Nintendo has already proved that it can, and I'm sure that no matter how underpowered the machine may seem, they will once again make console gaming fun and enjoyable again.

If you want pretty graphics and HD resolutions, go spend £3k on a PC. That's what I've gone and done, and my machine makes the Xbox 360 look positively pitiful (F.E.A.R., anyone?). The console market has always been about cheap, fun and easily accessible gaming. Something that Sony and Microsoft should really be taking into account (and quite clearly don't really understand).

Let the war commence, and watch as Microsoft and Sony lose millions of dollars whilst Nintendo laugh as their bank balance stays in the black. It's already a well-known fact that the 360 and PS3 will lose both companies money on each unit sold. Fair play to Nintendo for not submitting to the hardcore fans crying out for HD and media center connectivity. And personally, if you're such a hardcore fanatic gamer (like myself) that you need the HD display and media center connectivity, shouldn't you really be spending cash on building a customised PC rather than buying an expensive toy from Sony or Microsoft?
realvictory 30 Mar 2006 19:33
13/24
Specifications do matter. They are what defines the game. The controller is one part of the specification. The problem is, people compare two chips, when they should be comparing the whole system.

Like you say, (hardcore) fans do care, therefore it does matter how powerful it is. Ok, maybe what actually matters is the appearance of the game, not the speed of the processor, but we can still roughly estimate how good it will be, because a lot of things are unfeasible.

And yes, that isn't all that matters. But this is the next generation. This is what Nintendo themselves called the "Revolution." That, to me, doesn't correspond to last generation's technology with a new controller.

They're making a whole new system, 5 years after the last one. Not only have my expectations risen accordingly; so have my standards. The same with every other person on this side of the planet. It's no good making something worse than people expect, in any aspect that is important (which includes processor power and graphics, and more - not just the controller). Gameplay is important, so is graphics - neither is trivial - so people aren't going to be satisfied with trivial gameplay; neither will they be satisfied with trivial graphics.

Put it this way - if the majority of people were already satisfied with what there already is, they wouldn't buy the new one - they'd keep playing the old one.
Ditto 30 Mar 2006 20:16
14/24
I agree with most of your points crs117.

By the way, the reason for separating the two main RAMs is cost. The RAM is on two separate levels. I think 16 is onboard with the processor (again indicating a full processor redesign even if it is similar), but each transistor on the chip increases the die size, which exponentially increases cost... 16MB of onboard processor cache is pretty good considering that most server chips feature 1-2MB, 4 max on the latest chips.


As well as the fact that you would NEVER put 16MB of RAM on a processor die. That would be insane.

For a start, your processor would be massive, really hard to manufacture and have a high failure rate. It would be very hard to mass-produce.

Secondly, 16MB of RAM would be really inefficient. You use a small amount of on-die memory for a reason; to give yourself a super-quick work area to load in frequently used instructions. The more memory you add, the slower it becomes.
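The trade-off being described can be put in rough numbers with the standard average-memory-access-time formula; all the latencies and rates below are made up purely for illustration:

# AMAT = hit time + miss rate x miss penalty.
def amat_ns(hit_time_ns, miss_rate, miss_penalty_ns):
    return hit_time_ns + miss_rate * miss_penalty_ns

small_fast = amat_ns(hit_time_ns=2,  miss_rate=0.10, miss_penalty_ns=100)  # 12.0 ns average
big_slow   = amat_ns(hit_time_ns=12, miss_rate=0.02, miss_penalty_ns=100)  # 14.0 ns average
print(small_fast, big_slow)  # the bigger cache misses less often, yet still ends up slower here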

However, I do agree that it sounds like they have some faster RAM and slower RAM.

I think I'm right but I will check my facts on Monday.

realvictory wrote:
The same with every other person on this side of the planet. It's no good making something worse than people expect, in any aspect that is important (which includes processor power and graphics, and more - not just the controller). Gameplay is important, so is graphics - neither is trivial - so people aren't going to be satisfied with trivial gameplay; neither will they be satisfied with trivial graphics.


I think that the point is that graphics are no longer limiting. The Revo's graphics will be sufficiently good on a normal TV to compete with Xbox and PS3, I guess (possibly with some spec changes).

IMO we just don't need such technically amazing graphics as the new systems. The last few GC games have had stunning graphics, and I don't feel that even those would let a next-gen package down.

Someone I know has an Xbox 360. Apparently the graphics are good, but they don't matter. About the only thing that's impressed him is the online service. And he's a hardcore gamer.
realvictory 30 Mar 2006 21:28
15/24
Yeah, we'll just have to wait until E3, really...

I can't say for sure how good graphics have to be to be what people are pleased with, but the thing is, like i said - standards increase.

I don't think, though, that the controller is enough on its own. Still, we all know it's going to be more powerful to an extent.

But using last generation's processors that are overclocked just seems like Nintendo didn't really bother at all. Still, we obviously have to wait and see what it does.

On the other hand, I'm about 50/50 undecided as to whether this is actually an April Fool's joke. I mean, presumably people with dev kits are making games, that they presumably want to sell! So it would be in their own interests to present the specs, or any Revolution details in as favourable a way as possible - not just simply by saying "50% as fast as last time, because we're not focusing on improving graphics anymore."
Ditto 30 Mar 2006 21:34
16/24
realvictory wrote:

I can't say for sure how good graphics have to be to be what people are pleased with, but the thing is, like i said - standards increase.


Yeah, I do agree, like if Nintendo suddenly re-released the Super NES as their home system they'd be doomed.

I don't think, though, that the controller is enough on its own. Still, we all know it's going to be more powerful to an extent.


I think that the main limitation will be if Nintendo can deliver software.

I mean, presumably people with dev kits are making games, that they presumably want to sell! So it would be in their own interests to present the specs, or any Revolution details in as favourable a way as possible - not just simply by saying "50% as fast as last time, because we're not focusing on improving graphics anymore."


True. Also though, you don't need hardware to program games - you could simulate your hardware or write for a similar platform (the Rev/GC will prob be sufficiently similar for you to do this).

I suspect that most people are still in the design stage at the moment...

Thanks for the constructive posts :)
crs117 30 Mar 2006 21:36
17/24
Adam,

I wouldn't say it would be insane to put 16 megs of cache on die, but it would not be very cost-effective when compared to what else you could do with that space.

Besides, I must have misread it, because they refer to it as main RAM and external RAM, but they say that "The external RAM can be accessed as quickly as the main RAM, which is a nice touch".

Still, these Hz numbers don't mean a thing without knowing the processor architecture. More has changed in 5 years than what Nintendo would settle for in a new console by just doubling the clock rates.

I am still buying one.
lozbag 30 Mar 2006 22:22
18/24
There seem to be two categories of subject here. The story was about specs; that is what we wanted. But this is why Ninty have withheld them. Some are talking specs, some are focused on the new ideas. It is not tech vs spec, it's just forget the spec and enjoy.

Some people have already been re-educated, some will be re-educated. Some will buy the PS3 and shun the Rev. A few unfortunates will buy the 360.

Ninty are playing for time; the more they say 'it ain't about the specs' the more people will listen. But time is running out and clearly there are still plenty of punters who only want to talk tech specs and compare them.

This is a totally new way of thinking. It is 'regressive' technology in some respects. Ninty are going sideways. They're taking a left turn. I'm going to try to keep up.
tyrion 31 Mar 2006 08:51
19/24
realvictory wrote:
So it would be in their own interests to present the specs, or any Revolution details in as favourable a way as possible - not just simply by saying "50% as fast as last time, because we're not focusing on improving graphics anymore."

It depends on the target audience. The new MX-5 was touted as having no more grip than the old one. No improvement, but then no change to the amount of fun you can have.
way 31 Mar 2006 09:14
20/24
crs117 wrote:
Way,
You obviously do not have a clue of what you are talking about.


I definitely do compared to you ;).

Obviously, vastly, jealous.


Interesting theories.

About separate memory areas: apart from only so much being able to fit economically on a chip with other functions (making a separate, stranded chip economical), they are usually for different purposes, allowing different things to be done in parallel. Parallel memories are also important because most memory timing schemes favour one continuous stream of memory accesses, and put in extra cycles when swapping access between functions, etc. It is actually an efficiency.

Say a 1GHz unit could do 3 ALU instructions per cycle; it will only make applications that use it enough as fast as (theoretically, because it is often not that simple) a 3GHz CPU that does one per cycle. Still, the 3GHz has a lead of the same peak ALU power plus the extra 2GHz of processing power. An application that doesn't require the ALU can theoretically run 3 times faster on the 3GHz CPU than on the 1GHz one.

The future of gaming is media, and game simulation requires simulating all sorts of things beyond physics and graphics, so general-purpose processing is an upcoming vital part of a gaming machine.

MHz is not a good indicator of final power, but more MHz on the same system is. So being able to tell just how much cooling a little unit like the Revo can take tells you how much MHz they can look at in that size, which tells you how much they can afford to clock their system at. Unless they have some massive additions in terms of transistors (new arrays of graphics elements and CPUs, or a mega graphics/processing die) rather than just a pure speed increase, they should easily do 2GHz; 3GHz might require double, maybe even triple, what 2GHz requires, and 1GHz is far lower again, which is why those low-powered embedded ARMs are so slow. The other factor is that they can build extra complexity into a circuit instead of MHz. This is why, at the same MHz, and shrunk to the same process size, a Power4 chip would outperform, but run hotter than, the GameCube's processor.

Putting 16MB of memory on a processor die is totally sane, because it can run very fast, though it has to be balanced against the best performance-to-cost ratio at the time. I think some of the latest upcoming processors are doing 16MB, or was that 32MB. I prefer 1GB on the processor with an advanced ClearSpeed-like co-processor array (AMD is talking to them about using it as a response to the Cell, which I predicted would have to be done years ago).

As a number of separate chips with embedded memory, and a memory chip, stacked on top of each other, this allows memory access to be much faster between the chips and on chip, as the amount of energy required to push signals off a chip package onto a board, and the length of the traces, are limiting factors on speed; they can also use different types of process for each chip, to better suit their functions, with the chips brought together for stacking - main memory being a typical one that requires a very different assembly line for production. The other factor is that, apart from mega dies, the price of chips is greatly affected by the number of pins they have on the package to be connected to the dies (a stack of dies just uses connect pads on the dies; the technology is quite mature). Apart from that, they can be stacked side by side (but normally PC processor companies use the big dies). This would drastically reduce the number of pins (for example, no external memory buses needed) so that the main pins could be put down one side for vertical mounting, with cooler units on the front and back of the chip. This also allows for the possibility of running slow buses off the top and side edges of the chip, like USB2, a PC HDMI derivative, SATA etc (what do you think they are going to do with 1 billion transistors + memory when they get there?). Actually, you can replace all the pins, including HyperTransport, with a handful of modern peripheral buses (going to external buffers) and do away with the chipset ;).

And before anybody says it, I know that faster memory is hotter, until they shrink it enough, take away the off-chip memory driving circuits, and swap to one of the upcoming low-powered memories.

If this is the speed, on a modestly improved GC, then they could have released it two or three years ago. I'm for either an "April Fools" joke, or that concentration on handheld hardware Nintendo talked about a few years ago, which means this might be the Game Boy Advance 2 spec, or they are pushing this GBA2 spec to developers to keep the system cool until they shrink it again (like the PSP did with underclocking) and to make a compatible game base for the GBA2. If it is for system cooling, then the real spec might be closer to 2GHz. Then again, maybe it is a new innovative architecture, or a very complex one that spits out heat. If we knew the process they used, and when the next process is due, we might be able to tell. In that case, they might be releasing this months ahead of the die shrink they want, and it will be overclocked to get there, with maybe double the power consumption they want. In that case, it will be one rather warm toy when new games that hit full speed start flooding in next year. Please note that this doesn't mean the first underclocked titles are going to look much different to the majority of initial games on the Xbox 360, because it is likely there will be vastly improved processing functionality over the GC, and many Xbox 360 games shouldn't hit much of the machine's true stride this year either.

Another alternative:
ATI is just readying their next-generation core for the PC market, which means they might be free to use the latest stuff in the Nintendo, which would then run at such a speed for the amount of heat that the box could take. Xbox 360 got that last year. It doesn't quite explain the processor speed, but if you remember the Atari Jaguar in the early 90s, it was next generation, and people criticised it for its processor speed, but as Atari rightly pointed out, the processor speed didn't really matter, as the co-processors were doing most of the work and the main processor was just managing things, sending the jobs to the co-processors. With a more general-purpose GPU, like what is proposed for the upcoming generation, this is quite possible for the speed of the processor it has.

It could be done. So which one is correct?
Radiant 31 Mar 2006 15:01
21/24
Just show me a bloody game.
God damn nerds run this world.
crs117 31 Mar 2006 16:00
22/24
Way,

You are a complete idiot. Besides mispelling almost every other word in your reply...you reply is a bunch of techno gibberish that either means nothing, completely wrong, or both at the same time.

I think your response must be some sort of april fools joke...cause i surely hope you dont think you have a clue of what you are talking about.
lozbag 31 Mar 2006 19:56
23/24
Radiant wrote:
Just show me a bloody game.
God damn nerds run this world.


typo I think.

that should be nerds 'ruin' this world.
way 1 Apr 2006 15:37
24/24
crs117 wrote:
Way,

You are a complete idiot. Besides mispelling almost every other word in your reply...you reply is a bunch of techno gibberish that either means nothing, completely wrong, or both at the same time.

I think your response must be some sort of april fools joke...cause i surely hope you dont think you have a clue of what you are talking about.


Haha, which? Almost all of my spelling is wrong? That isn't a grammar mistake, just how you reply to a gibbering idiot. What are you doing nowadays after your Star Wars Episode 1, 2 and 3 roles, Jar Jar?

Like your picture:
http://www.starwars.com/databank/character/jarjarbinks/

Did you design the official Star Wars website, or the front page? It looks like it. Re-edit: Ah, it's finished loading, there is actually a menu now, there is some functionality, so you must be clear of that one.

------------

Seriously, I don't have time to reply to arrogant upstart one-upmanship anymore, from someone who can do little but falsely accuse people over things they themselves know little about (yes, clueless). I know, because I used the simple, standard terms (would you like me to use the really technical ones?) common in web usage these days. If I am talking above your knowledge then don't complain as if I am the complete fool and you are not. Read and learn, or let others read in peace. Google some terms instead, or Get a Dictionary for CRS117's Sake. Unless you want to jump up and down and use Black Magic words to describe them instead of the sensible ones. Now I am not going to waste my time lengthening this thread by continuously replying to this stuff.

By the way, misspelling is spelled with two s's, not one, not to mention your "dont", and "april" (it's a noun). You're lucky I don't go around picking the eyes out of other people's spelling and grammar, like you are (without giving an example, mind you).

Have a good day, CRS117.