
Everything posted by 10k fists
-
Like I said earlier, I don't read up on Cell, so I don't know what's public or not. The Berkeley report might have something on context switching... I'm not saying IBM or Sony made a mistake in the sense that they built Cell incorrectly. But Cell's main design is not for the PS3; Sony wants to do far more with it. Even the benchmarks for Cell are run on bigger versions with more SPEs and a higher clock speed. So please don't take it the wrong way when I point out potential flaws in Cell's architecture as it relates to video games. So IBM didn't mess up, they did exactly what was needed to make Cell do what Sony wanted. We honestly wouldn't be having discussions like this had Xenon not been created with developers in mind. Not to sound like a raving fanboy, but the processor has so many benefits over Cell when it comes to video game programming that it's astounding. But once you remove Xenon from its gaming environment, it will be trounced by the performance of Cell. I think that just goes to show how powerful Cell really is: even when you put Cell in an environment it wasn't truly created for (video games), it still holds its own.
-
Oh yeah, I forgot, congrats to your Oilers... Very good game, should have been 3-1 though, it's a shame that ref was out of position and couldn't see the puck.
-
The original idea was for Cell to actually read and write to both pools and be able to help out RSX even more (well, the original idea was no GPU at all, with Cell doing all the work, but that didn't happen). In order for Cell to get instructions from RSX now, RSX has to pull from the GDDR3 pool and then write to the XDR pool for Cell to read. Again, I'm not saying this is time consuming, or anything of that nature; it's just more work, and wasted bandwidth, despite it being a minimal amount. AI is not my field, but I can say this... It was originally thought that AI could run across the SPEs since it was a parallel architecture. People assumed that you could traverse several branches at once with the SPEs. It turned out to be something that wouldn't work: good theory, botched in practice. Cell itself doesn't have poor branch prediction, since the PPE is good at it, but yes, the SPEs have no branch predictor. They have what is called branch hinting. Hinting gives them a 50/50 shot at guessing the right branch to take; if they guess wrong, it's about a 20-cycle flush to start over. I think almost every developer will view that as a waste of time when you have an almost 90% prediction rate from the PPE. That's a toss-up. Until proven in action, it can only be assumed that IBM and Sony know what they're doing. I don't think it's a well-rounded processor though; it has entirely too much focus on FPU performance, and not enough on integer. Just remember that Cell's concept was not for the PS3; its purpose for Sony is beyond that. If the PS3 doesn't grab majority ownership this generation, I worry for Sony's PlayStation division. I just can't imagine publishers sinking millions of dollars into a project for the PS3 when that money could go much, much further on the Xbox 360 or even the Wii. If Sony still has the market share this time around, then developers/publishers will do what they need to. I just hope it doesn't kill off the smaller development houses even further...
-
It was initially the idea that Cell would actually perform the graphical operations that it was taking away from RSX. So writing to the GDDR3 memory pool for RSX to process wouldn't really help that much, since RSX would be assumed to be tapped out for Cell to be needed in the first place. Yeah, more of the same. I'd like AI that learns a little better (the larger history tables in Cell and Xenon should make that possible). I'd also like AI to have more decisions to make, and to actually choose the smarter decisions in most cases. I want the AI to have more decisions, more variety to choose from, more "thinking" as it were. In order to do that, you need more branching. You said you've worked on AI in the past, so I'm sure you'd agree that you need heavy branching to make AI better, right? I don't really read up on Cell... But IBM and the Berkeley report (I think it's been published on the web somewhere) are good places to read up on it.
-
Your typical CAD station has roughly 2GB of RAM and a minimum of a 512MB workstation GPU. Even then, an object that's 100'+ long struggles to render/rotate smoothly; maintaining 5fps is a problem for the most part. Has nothing to do with games though. lol The problem with having Cell write to the GDDR3 pool is that RSX will most likely be tapped out in bandwidth, so writing there would not be beneficial. Reading from the pool, and removing some of the workload from RSX, would have been a better choice, I think. Developers will find a way, though, so it's really a moot point. I will be curious to see how games turn out in four years. Very interesting read. Would AI running in FP really simulate intelligence though, or would it be a more "basic" AI? I don't think the history table would be long enough for any kind of advanced AI to be run in that manner... I could easily be wrong though. Cell and Xenon both use branch predictors from the PowerPC 970 - best in class - just with a smaller history table. I'm not going to say that running AI on the SPEs can't be done, I just don't think it'll be worth doing. It was proven that the PS2 could, in fact, do bump mapping, but nobody did it for a game since it was too taxing. The SPEs were designed to repeatedly run specific compute-heavy tasks, which is not largely the case in gaming, and especially not the case in AI, I believe. Context switching on SPEs is expensive*, in addition to the weak decision-tree and branching performance on them. Not only is running it on the SPEs a nightmare when it comes to synchronization, but you're using up huge amounts of Cell's real abilities to try to shoehorn in something that will run very poorly on it. I think the best-case scenario for the PS3 is that AI will run part time on the PPE, and perhaps later down the road someone will slave an SPE to help with whatever non-decision-tree work there is.
* SPEs have to manually context-switch - run code on them to flush the local store/reload it for the new dataset - it's an expensive operation for them.
-
It's not really that far-fetched: lots of computationally intense algorithms on Cell, which specializes in enormous amounts of computation (which in turn produces data), take up lots of memory. That's why PC workstations need a ton of RAM, for example -- not just on the graphics chip, but system memory. I don't believe I stated that first out of the gate wins; I simply stated that if the Xbox had come out first and gotten the bulk of the users, then developers would have been far more willing and ready to code games for the Xbox instead. It was really a moot point since it didn't happen, but you appeared to claim that developers made tons of games for the PS2 despite the difficulties - which caused me to think that you were perhaps stating that, regardless of programming difficulty, developers will support a console full blown. I just don't find that to be a reasonable assumption, since they would rather develop for the console with the larger userbase. If photorealism is a goal, then much of the gameplay will be sacrificed, on either platform. There's a delicate balance between great graphics and simply overdoing it. We are not yet at the point where we can have visuals of that nature while still providing the gamer with a great gameplay experience. It offers a few other advantages, but yes, the bulk of the workload on the eDRAM is antialiasing and z-buffering. It really does provide an advantage though, as it frees up a tremendous amount of bandwidth for other graphical operations. I believe that antialiasing is one of the most bandwidth-consuming tasks that a GPU does. If RSX's bandwidth is consumed by applying FSAA and AF to the scene, as well as rendering the geometry and applying shading etc., it may have previously been in a developer's interest to have Cell take over some of the geometry workload. Fully destructible environments would have also been possible, in my opinion.
I recall Kojima stating in an interview just recently that they had to remove some of the destructible environments. Who knows, perhaps it's a byproduct of this, or something entirely different... Aside from physics, the SPEs won't have a lot to do if they can't help RSX render, in my opinion.
-
Games will need it, but developers will most likely work around it. It's a situation similar to the lack of a "next-gen" medium in the 360. Developers will work around it, but games at some point will need the additional space. My first comment was based on my belief that the 360 will be the lead platform for most non-exclusive 3rd party games. If that is so, then in the future developers are going to have a hard time trying to get the games to run on the PS3 if they start pushing Xenon's capabilities. Yeah, there's a lot of "ifs", but what good is a discussion without a bit of "if"? A lot of FPU work where Cell is crunching a lot of numbers is a likely scenario, be it some new physics system or taking some of the geometry load off RSX. Remember, the entire focus of Cell was it being a floating point beast, which it is. But in the PS3, it may not have the ability. RSX is fairly weak compared to Xenos, and will have a hard time "keeping up" once developers get more familiar with the hardware. It's already going to be strapped for bandwidth far sooner since it lacks the eDRAM. At first, I never thought it would be a problem, since Cell and RSX were designed to work well together, but with the GDDR3 being effectively removed from Cell's access, anything it needs from there will have to go through RSX first. Again, it's not an uncommon task, but it still uses bandwidth and "wastes" cycles that could have been spent on something else. The success of the console played a large role in developer support. Had the Xbox come out first and gotten a foothold in the userbase, do you honestly think as many developers would have used the PS2 as their platform of choice? The library wasn't huge because developers enjoyed coding for the platform, it was big because almost 100 million people bought the machine. Memory constraints will always be an issue; if you gave developers 9GB of memory, they'd find a way to want 10GB. It's more about choice.
The 360 is completely unified, giving developers a choice of just how much memory they need at a given time. Again, as developers get more familiar with the hardware, they'll start optimizing everything better, and the issues of these 1st gen games will truly be a thing of the past. I guess that's why I'm upset with Microsoft for pushing the 360 out so soon. It would have been better had they waited until Spring 06 to launch the console; they would have had less of a shortage, and developers would have had almost six more months to get their launch games ready. While it may be uncommon for Xenon to require more than 256MB, it's a lot easier to justify since Xenos has the eDRAM, which alleviates a great deal of the bandwidth-consuming processes. I don't think video memory will help with the streaming of game cells or the procedural synthesis that's the big "buzz" right now... That's cool you got to talk to some people in the industry though, it's always interesting to hear what they have to say, and to try and get a real opinion out of them. It seems like Carmack is the only one that will really run his mouth; everyone else seems to walk on eggshells when it comes to the PlayStation and Xbox brands. I guess their mention of unified memory supports half of what I said though. I always look at both extremes, as one of the situations is likely to happen. If developers are more concerned with the lack of graphics memory, then my first scenario of Cell/Xenon requiring more memory isn't as likely to happen... But at the same time, if graphics are their focus, then they'd likely want to use Cell to push more geometry, which would require more memory, which would lead right back to the problem of Cell not being able to access the GDDR3 pool. It's all a shame, as I think pushing these processors would offer more "next-gen" gameplay, and for me, that would be much more entertaining than extra neat "next-gen" graphics.
-
The GDDR3 (local memory) is the 256MB memory pool for RSX, since Sony opted to go with a non-unified memory architecture for the PS3. A bounding factor for Cell is that there is, effectively, only 192MB of XDR RAM for it to use, since 64MB is being reserved for the OS and other system-related functions. A lot of people don't realize just how much of a bandwidth monster Cell is when running at anywhere near high capacity (high not meaning peak, of course). That's why it's hooked up to a 25GB/s memory pool. There are possible situations where computationally intensive algorithms that read in and output a lot of data would need to go somewhere, and the 192MB pool might not be enough. But developers will most likely have to work around such situations and try to make sure that everything fits into the 192MB pool. They'll do it, because they need to, but it just comes full circle in that the PS3 is complicated to code for when compared to the 360. The unified architecture is actually starting to work in Microsoft's favor for ease of development. I really wish Sony had decided to go with a unified architecture; it would have made things a lot easier, and they would have really gotten a lot more potential out of Cell.
-
I pulled that image from the Xbox section for the game, so apparently, they share identical art assets, as do nearly all of EA's cross-platform games.
-
Of course they can't imagine a scenario where Cell would need to access more RAM, because they didn't design the system to do so. It's a rather foolish PR comment; that would be like designing a car that could only go 70mph, and then when someone wants to go 80, the manufacturer says "we didn't see any situation where you'd need to go 80mph"... I can promise you that games will require Cell to access data that's in both the main and local memory; that's why they allowed RSX to access both pools of memory at a fairly decent rate. It's just unfortunate that when Cell needs that information, developers will have to have RSX move it from GDDR3 to the XDR pool. It's not uncommon for developers to do these types of tasks, but it involves more work, and isn't as streamlined as it should be.
-
Are you serious? You really think that something like this looks practically identical to this? Normally I don't resort to posting links, as it's not worth my time, but come on...
-
Huh? Too many "" - it's impossible to tell what you're saying, and what you may be pulling out of someone else's brain. Both the 360 and PS3 are fully next gen; they carry all of the graphical features that distinguish them from the PS2/Xbox/Gamecube generation.
-
Why is that? You do realize that the Inquirer has bashed the 360 don't you?
-
Their source is the Berkeley report, I believe. Highly reliable. It's just the manner in which the Inquirer delivered the message; it was just wrong and full of nonsense. Will developers be able to work around these issues? Yes. Will it take more time? Yes. It all goes back to what I've said for over a year... Cell is more powerful than Xenon, but harder to program for. Xenos is more powerful than RSX.
-
The hardware is not broken, nor is it slow - more along the lines of inefficient for a gaming console. The article was poorly written and put together, but their sources are accurate. It just shows that Cell cannot effectively use the GDDR3 memory pool, but that RSX has a command/processing bandwidth for geometry and vertex data from Cell of 4GB/s. Pretty much, if it's not on that slide, then it's most likely junk in the article.
-
The games should have been running on final hardware.
-
I don't believe I fabricated anything, nor do I believe I said it was selling "great". I did, however, wrongly state it was selling better than the original Xbox did. But could you point out the rest of these "several" things that were untrue that you opted not to point out? Try to avoid the "chuck patch" as that one was covered, and just a quick mistake on my part. So that's two, which isn't several.
-
What spin did I put on it? The SPEs are slaves to the PPE, and they do nothing but number crunching (FPU work); they have poor branch prediction and a short history table... Where's the spin?
-
I have no interest in doing anyone's homework. In the newsgroups I post in, people don't ask for links regarding information like this, as what I'm discussing here, and what we discuss there, is common knowledge in the industry. At what point in my post did I say Microsoft was infallible? I was specifically discussing PR comments directly relating to Sony, and not making some "exclusive" comment that only applies to them. Is this what you normally do when you don't know how to respond? Take something out of context and then label the other person a "fanboi"? My credentials are more than enough for me to discuss the matter. Anandtech could never give me a "clue".... It's apparent you know nothing about the architecture of either chipset, as nothing I typed was "hype". But I guess this goes back to my previous statement: you don't have a response, so you use a fanboy-type remark, this time claiming I've "swallowed hype". You're quite a piece of work. I know quite a bit about the PowerPC family, which covers everything from OSX, to BlueGene, to Xenon, to Cell, to PowerPC clusters, to z/OS, and so on and so forth. There isn't a thing I need to read to bring me up to speed on Xenon. I was sent a PM a while back from someone with a link to these forums, telling me to post here when discussions like this came up. So, in the spirit of loving to talk about this tech, I did. I now realize that it was a mistake, as people like yourself take it upon yourselves to ruin the thread. When games require more than a DVD-9, then we can have this discussion... Well, we could have, but you seem to be a prick when you are put in a debate where your knowledge is lacking. Calling people "fanbois" and telling them to quit "wacking off" - I would like to personally thank you for acting like a 10 year old, and bringing this once-civil discussion down to a cesspool-like quality. You've really proven yourself as a capable poster full of valuable information for people to read. Take care.
-
Oh, sorry.
-
Nobody in this debate is claiming the difficulty of coding on the console has any relation to its success.
-
Carmack has commented on its complexity - his words were "pain in the ass," I believe. Square-Enix is already having small issues with Final Fantasy 13, and they've yet to really dive into Cell. It took a team of world-renowned programmers over 15 months to optimize half of the SPEs in Cell to run the benchmarks used in Berkeley's study. Tim's comments go against what nearly everybody else in the industry has said. Regardless, his comments hold little weight in the industry as a whole, since he's no longer coding the engines. I thought we were at the point where everyone could see through the smoke and mirrors that Sony uses? Paying for out-of-house PR statements, showing software not actually running on the hardware itself, and other methods of deceiving gamers. Sorry that you think the compilers will be magical, but claiming it's mainly just multi-threading to overcome is, to put it bluntly, wrong. Multi-threaded processors and code are nothing new. If you really wanted to make a case about the complexity of Cell/Xenon, then you could have at least tried the argument of in-order versus out-of-order processing. The fact that you and the other guy seem to insist that the eight "cores" on Cell and the three cores on Xenon are somehow not that dissimilar is quite funny. On one chipset, you have a CPU with seven co-processors that are nothing more than FPU-pushing slaves to the PPE. You have a completely different architectural approach and focus, with poor branch prediction and a short history table. A processor that is an FPU-pushing beast, but weak on all other fronts. Do you honestly think that the compilers are going to magically transform their base "multi-core code" (that you and the other guy seem to think they'll be writing) and turn it into something that Cell can utilize in an optimal way?
On the other chipset, you have a CPU with three symmetrical cores and incredible vector processing capabilities, with very strong branch prediction (pushing 90%) in all three cores. A chipset where each core can run independently of the others, or can be utilized like Cell, with one being the conductor and the other two being slaves (less than optimal for its design). The focus of this chipset is to run games and games only, with a multitude of changes made to the ALU that reduce clock cycles for common tasks. In order for developers to actually use Cell, and benefit from its power, they're going to have to spend an enormous amount of time learning the architecture, far more than they will have to spend on Broadway or Xenon. The cores on Cell are not similar to the cores on Xenon from a programming standpoint.
-
They butchered the SPECIAL system, and the combat was horrible. Far worse than Arcanum, in my opinion.
-
I don't believe this to be an accurate statement. While I haven't had any experience with backwards compatibility on the 360, I've always been under the impression that all the games will be emulated, which will require a download. Without access to Xbox Live - or, failing that, the internet plus a CD burner and a blank CD - you will not be playing any Xbox games on your 360.
-
It does when you appear to be claiming that you can simply write code for Xenon and easily transfer it to Cell (because the three cores on Xenon are somehow not that different from the eight on Cell)... If, on the other hand, you aren't claiming that, then please clarify, as I've completely misunderstood you. We aren't talking about MS fanboys right now; we are talking about a claim you appeared to make: that once you code for Xenon's three cores, Cell's eight cores aren't that far off. Which is not true. Getting Cell to function on eight cores is an extremely daunting task. You put too much weight on the power of compilers. They aren't magic bags of dust that transform all code into exactly what the processor needs to do what you want. You still have to know how to code for the processor, regardless of how powerful your tools are. Even when XNA rolls out for the 360 in full force, it will still require developers to know and understand how Xenon functions before they can even begin to take advantage of the hardware. If you'd like to see what happens (in a basic video game example) when code from a different platform is more or less ported directly to the 360, then boot up Quake 4 in HD and watch the framerate get hammered. You must still optimize your code; a compiler will not do that for you. Or just look at some of the early mess of launch/first generation titles for the 360. Developers simply did not have enough time with the final hardware, and were forced to cut too many corners to get the game on the shelves. Not sure what your point is ... As the developers directly linked to MS' wallet have not gone as far as first party Sony developers have. Neutral developers (ones for both platforms, obviously) have all echoed the same thoughts: Cell is much harder to program for, despite being more powerful, and Xenos is more powerful than the RSX.