
New PS3 details emerge...


Epiphany


If I have time I can happily dig up some of Epiphany's posts where he has outright lied, not to mention that, in Volourn-esque fashion, he insists on completely ignoring any facts and evidence that make him look bad.

 

You always threaten your "find quotes" nonsense...

 

Epiphany insists the XBox didn't have a PC CPU because it had a Celeron, which isn't really a P3.

 

He insists that the PS3 won't have an HDD despite Sony's specs listing it as having one.

 

Your typical "distort the truth in order to make Ender appear smart" antics.

 

Yeah, and Sony initially said that the PS3 would be a network hub as well, yet that was canned. Ken has also gone on record as saying the HDD, if offered, would not be used for game saves but strictly for music and other media applications. Hence, the game caching advantage would be OUT THE WINDOW.

 

He insists the PS3 specs are a lie and that it won't pull off two 1080p images, yet holds that the 360, which only produces one 720p image, is clearly more powerful.

 

More of your truth distorting. I said it could not output two 1080p images for game-related purposes, but I have always stuck by the claim that it could produce stored footage (movie, film clip, etc...) at 1080p, as it has enough bandwidth for that. It does NOT have the bandwidth to render game visuals at 1080p for "stunning" looking games.
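For context, merely scanning out a single finished 1080p image is trivial arithmetic. Here's a back-of-the-envelope sketch of my own (the numbers are just standard frame geometry, not any vendor figure):

#include <iostream>

int main() {
    // One 1080p frame buffer, 32-bit color, refreshed 60 times per second:
    const long long width = 1920, height = 1080;
    const long long bytes_per_pixel = 4;
    const long long frames_per_second = 60;
    const long long bytes_per_second =
        width * height * bytes_per_pixel * frames_per_second;
    std::cout << bytes_per_second / 1000000 << " MB/s per output\n"; // ~497
    return 0;
}

Pushing pre-rendered video through that pipe is easy; rendering a game at that resolution, with all the texture and geometry traffic behind each frame, is where the bandwidth actually runs out.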

 

He insists overall the 360 is more powerful despite all the 3rd party developers saying the PS3 will clearly have more power.

 

No 3rd party developer has outright said that the PS3 is more powerful, and there isn't a single 3rd party developer that would make such a bogus claim unless they developed exclusively for said console. Objective developers are maintaining a neutral stance, which is the smart thing to do, since trashing a console will only get them ignored by said manufacturer.

 

He claims to be an objective non-fanboy, but he uses Microsoft as a trusted source while assuming every Sony statement is a lie.

 

I used an MS quote ONCE, for YOU, when YOU said YOU had never seen MS say something. I quoted it for YOU, since again YOU REQUESTED IT.

 

He insists that HD-DVD is a superior technology over Blu-Ray which is plain fanboyism and foolish.  I include this only to demonstrate his dedicated M$ fanboy trends.

 

Yes, because MS has such a huge, mammoth size role in HD-DVD... HD-DVD will win, regardless.

 

He insists that the PPC processor designed by IBM from the G5 Mac line (which they also said would be in next-gen Macs) doesn't constitute PC technology, yet he insists the PS3 is just PC technology. The Cell has never been in a PC and was designed primarily for the PS3 and for entertainment technology in general.

 

Again your lack of knowledge on the subject shines through, as Xenon shares no direct relationship with the G5 line other than the PowerPC label, which is just that, a label that's not even used anymore. G5s were used in ALPHA test kits. The true backbone of the cores is the PowerPC architecture, which is not exclusive to the G5 line. Regardless, the processor has been customized to such a degree that it was given the "Xenon" name, and the PowerPC label was dumped.

 

Again, your distortion of the truth, as I clearly said that the PS3 shares more with a common PC than the X360 does since the -->GPU<-- is a run-of-the-mill Nvidia "let's bump up the MHz" card. Cell, Xenon and Xenos are new to the tech field. But your selective memory easily allows you to ignore the blatant facts I beat over your head day in and day out.

 

He says floating point performance means nothing, despite CPU benchmarks. In fact, any benchmark which proves him absolutely wrong, he discounts as meaningless. His only response to posts that prove him wrong is that I'm an idiot and that the conversation is above me.

 

I've never said FPU performance meant nothing; again, more of you distorting everything that's said to rally the people behind your baseless accusations. FPU performance generates the graphics that you see; beyond that, its role is significantly smaller. FPU performance on a CPU, however, isn't nearly as important as integer performance. A subject which, oddly enough, you have ignored completely, which is where my "over your head" argument comes into play. You've yet to offer any counterpoint to the fact that Xenon is superior in ALU performance (by a large margin) over Cell, and that RSX and Xenos will be what FPU performance is measured by. Again, ~90% (almost exactly 90 for the PS3, and more for the X360) of BOTH consoles' FPU performance comes from the GPU. I've said this multiple times; let me know when it sinks in.

 

In console threads, I wasn't the first, nor the only, person to declare him a liar.

 

Just you and that other guy that "lol'd" at everything and agreed with everything you said with "OWNED!" while adding absolutely nothing to the debate. Face it, Ender, you have nothing on this argument, as you simply distort everything that's said in order to try and get people to rally behind you by making stuff up.

 

He has also outright lied on specs and facts as well.

 

Yet you've been unable to PROVE any of it as being false. Should I spout off the breakdown of ALU/FPU performance, and how the PPE and SPEs communicate, versus the symmetrical design of Xenon again? Should I go into detail about the VMX advantage Xenon has over Cell, and why Cell will be a great workstation CPU, but not a great gaming one? Should I dive into (again) more information regarding how each SPE has a tiny VALU to help curb its lackluster ALU performance, despite the fact that the VALU is, in fact, tiny, and nothing to write home about? Should I also discuss how, if the VALUs are used, that negates any further advantage (moot as it may be) the PS3 would have in FPU performance, since the clock cycles between the PPE and the SPEs would be eaten up with the PPE doing what integer processing it can while feeding the SPEs simple integer code as well? What about the fact that a single Xenon core is more powerful than the PPE in Cell, and that the SPEs themselves are only marginally greater in total power (if properly fed info from the PPE) than a fully functional PPE? How about the fact that an IBM tech said that the Xenon could easily perform the same functions (for a gaming environment now - don't forget) that the Cell could, but it would be LESS efficient than actually setting up the three cores to run independently of one another? (Not that the Cell design is inefficient outside of gaming standards, that is.)

 

Should I move the discussion to the GPUs? I mean, I could easily talk all day about how Xenos has a superior architecture to the RSX, and how Nvidia (who previously claimed unified shader architecture was unimportant) has gone on record as stating their future GPU line will contain USA. What about the eDRAM? A 5% performance hit on 4X FSAA at 720p resolution? Unheard of in modern-day cards, and completely unachievable on the RSX. The bus speed advantage the eDRAM in Xenos provides boosts it far above what the RSX could ever hope to match. The benefit to RSX is that it apparently works well with Cell, which helps recover some of the ground it got blitzed on by Xenos.

 

As I've said all along, Cell is a better workstation CPU than any other on the planet; its FPU performance will lend itself wonderfully to programs like AutoCAD. For gaming, however, it simply will not perform as well as a processor that has similar baseline specs but was built for a single purpose - TO RUN GAMES.

 

You know, I could go on, but you'll be too "busy" making more excuses, or hunting for quotes of mine and then taking them out of context. Hell, you've already made me a hypocrite by baiting me with your nonsense into responding to you again.

 

I'm done with you till the PS3 launches, so that I can shove the proof directly in your face -- if anyone even bothers to pay the arm & leg the console is going to cost.


I hate to say this, but that borders on the obsessive. I'm not going to even bother trying to take all of that in.



IBM said the Xenon was based on PPC (PC technology) and that they planned on releasing the Xenon as the next-gen Mac processor.

 

Microsoft has outright said that it is a general processor, a point Sony has hit upon. Microsoft defends the decision to use a general processor on the grounds that the XBox will do more than game, whereas Sony's processor is dedicated to gaming.

 

Epiphany not only lies, but he has no clue what he is talking about.


I have one point:

...
He insists that HD-DVD is a superior technology over Blu-Ray which is plain fanboyism and foolish.  I include this only to demonstrate his dedicated M$ fanboy trends.

Yes, because MS has such a huge, mammoth size role in HD-DVD... HD-DVD will win, regardless.

...

The jury is still out on HD-DVD versus Blu-ray:

...

HD DVD is promoted by Toshiba, NEC, Sanyo, and (most recently[1]) Microsoft, and backed by four major film studios. ...

New Line Cinema

Paramount Pictures

Universal Studios

Warner Bros.

...

HD-DVD initially received more studio support than Blu-Ray. One reason given for this is that it is less expensive to convert a production line from producing DVDs to HD-DVD production than it is to convert to Blu-Ray. This early lead has vanished.

 

As expected, Sony's subsidiaries Sony Pictures Entertainment and MGM Studios have both announced their support for the Blu-ray Disc format.

 

On 3 October 2004 20th Century Fox announced that it was joining the BDA, but has not yet decided which format to support, although it seems likely that it will be Blu-ray.

 

On 8 December 2004 The Walt Disney Company (and its home video division, Buena Vista Home Entertainment) announced its non-exclusive support for Blu-ray.

 

On 7 January 2005 Vivendi Universal Games (VU Games) and Electronic Arts (EA Games) announced their support for the Blu-ray Disc format.

...

:rolleyes:



No 3rd party developer has outright said that the PS3 is more powerful, and there isn't a single 3rd party developer that would make such a bogus claim unless they developed exclusively for said console. Objective developers are maintaining a neutral stance, which is the smart thing to do, since trashing a console will only get them ignored by said manufacturer.

 

 

Lemme dig up that interview with EA......

 

 

Sony will have more processing power. There's no question about that.

 

http://www.gamespy.com/articles/634/634928p4.html

 

Unless you're implying that EA is not a third party developer ^_^

 

Furthermore, it's so obvious to him that he says "no question about it."

 

 

Considering your debacle over why 64-bit is useful to workstation computers, I wouldn't consider you someone who can claim to be an authority on anything. The stuff that you state is easy to find on any XBOX site, which certainly isn't the be-all and end-all of objectivity.


Lemme dig up that interview with EA......

 

 

Sony will have more processing power. There's no question about that.

 

http://www.gamespy.com/articles/634/634928p4.html

 

Unless you're implying that EA is not a third party developer ^_^

 

Furthermore, it's so obvious to him that he says "no question about it."

 

To bust your bubble without even needing to try:

 

"Well, I don't think any of us are really ready to say for sure."

 

Yeah, there's an abundance of knowledge and confidence flowing from him. A beacon of truth I tell ya...

 

What color is the sky?

I don't know for sure. It's purple, without a doubt.

...eh?

 

Considering your debacle over why 64-bit is useful to workstation computers, I wouldn't consider you someone who can claim to be an authority on anything. The stuff that you state is easy to find on any XBOX site, which certainly isn't the be-all and end-all of objectivity.

 

I straight up told you that the comment in relation to "professionals" was a direct quote from an IBM tech. But I'm glad you don't claim me as an authority on anything; I won't have to worry about trying to explain anything to you anymore, since you won't be asking me any more stupid questions. Shouldn't you be working on that super new and unique AI routine that uses massive FLOP performance??


Here's what Gabe Newell (of Valve fame) has to say:

http://www.next-gen.biz/index.php?option=c...id=510&Itemid=2

 

"Statements about 'Oh, the PS3 is going to be twice as fast as an Xbox 360' are totally meaningless . It means nothing. It's surprising that game customers don't realize how it treats them like idiots. The assumption is that you're going to swallow that kind of system, when in fact there's no code that has been run on both of those architectures that is anything close to a realistic proxy for game performance. So to make a statement like that, I'm worried for the customers. And that we view customers as complete morons that will never catch on and that we're lying to them all the time. That's a problem because in the long run, it will have an impact on our sales."



We're not just talking about pure floating point performance.

 

We're talking greater color depth, and twice as many pixel shader operations per second as well.

 

Add to that, we know that as an end result, Sony says the machine can put out two 1080p images, whereas the 360 can only put out one 720p image.

 

Whether or not the PS3 is "twice" as powerful as the 360 isn't really debatable. The only debate is whether or not that power will be utilized, and given the number of developers in Sony's corner, why should we really believe it won't be?


I'm still going to have to wait and see what the hell is going on before I make a decision. Whatever happened in previous threads, it's clear that folks aren't lying so much as taking a hardened, biased position. If I thought it would work, I'd ask you guys to switch sides so I could see the arguments from the other end.

 

I don't look forward to wading through all the information when these systems finally hit the shelves.



To bust your bubble without even needing to try:

 

"Well, I don't think any of us are really ready to say for sure."

 

Yeah, there's an abundance of knowledge and confidence flowing from him. A beacon of truth I tell ya...

 

At least I have a link, rather than your vague IBM guy that you write down quotes from but cannot find anywhere on the internet. Wouldn't you visit the same site?

 

In any case, he said "There's no final hardware specs on any of the systems [at the time of this interview]" right after your line.

 

Despite just saying that, he still had no problems stating that "Sony will have more processing power. There's no question about that" on the very next line.

 

The stuff about "I don't think any of us are really ready to say for sure" is about whether the PlayStation 3, as a total package, will be more powerful than the 360, a package that includes the plethora of other things you usually like to mention. On processing power alone, he unabashedly states: PlayStation 3.

 

I straight up told you that the comment in relation to "professionals" was a direct quote from an IBM tech.

 

Let me replay the scene for you here. The debacle wasn't that professionals like 64-bit.

 

You stated here:

The vast majority of numbers being used would be 32-bit precision in Cell. This is one reason Cell isn't that popular in the "professional" fields, they usually require 64-bit precision at the very least. The majority of Cell's speed comes from SIMD/vector performance, and all of Sony's big benchmark numbers assume 32-bit precision. The number tanks with 64-bit precision, significantly.

 

I responded here

 

:devil:

 

"At the very least?" You'd suggest they require more???

 

Why would the "vast majority of numbers" require 64 bits for "professional" work? And besides, it's still easy to get 64-bit precision numbers on a 32-bit (or even 8-bit) machine. I actually had an assignment this past year where we dealt with numbers that were indefinite in size (i.e. no bit limitation), and it still ran very fast and was easy to implement (it was an STL class we used though...I would imagine that the actual implementation would be hard...but hey, someone else did all that work for me...yay STL!).

 

To which you responded here (which wasn't an answer to my question):

I didn't claim the Cell couldn't do 64 bit, I said it would "tank" in performance, given the benchmarks Sony has shown, using 32 bit precision.

 

To which I responded here:

Define "tanking" in performance? And why do "professional" users "require" 32-bit "numbers." Your whole last point in your previous article was rather vague, and I'm asking for a clarification.

 

You:

All of Sony's benchmarks were in 32-bit. From talking with an IBM tech that worked on Cell, Xenon, Broadway, and is currently working on "Gene", he has said time and time again that the Cell "tanks" in 64-bit precision. Time will tell, but he's about as unbiased as they come, as he's worked on/with all of the next-gen console chips, as well as having extensive knowledge beyond the scope of simple "hands on" impressions.

 

BTW this is your reference to an IBM tech, but was not the debacle that I was referring to.

 

My response

Who would this tech be? And still, why would "professionals" "require" 64-bit "numbers"?

 

Your reply (and this was the pooch screw) was "Preservation of 14 decimal digits over 7," to which I point out how that is not the case here.

 

That's just a description of a 64-bit floating point number. And besides, 14 decimal digits is often still too imprecise if they are doing something where 7 decimal digits is insignificant. Besides, 64-bit floating point numbers are used all the time for programming on 32-bit machines (any time the keyword "double" is used). And loading these is still quick, given that the floating point registers of pretty much every modern processor come in pairs, where say $f0 is the 32 most significant bits and $f1 is the 32 least significant bits. Having said that, if precision is required, developers will use something like mpz_class, which provides 100% precision no matter what.

 

 

The biggest advantage of 64-bit machines is not the "precision" of the floating point numbers (which are easily attained on 32-bit, or even 16-bit, machines). It's for the memory addressing. You are able to access waaaaay more than the 4GB of memory, which a lot of the large scale "professionals" are approaching or even exceeding. The AMD Opteron uses 48-bits for virtual memory addressing, giving it roughly 256 TB of memory addressing space (I'm trying to find out why they didn't use full 64-bit, unless they figured the 16 EB was a little overkill).
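To make it concrete, here's a minimal sketch (assuming GMP's C++ bindings, gmpxx.h, are installed; that's my choice of library for illustration, picking up the mpz_class I mentioned above):

#include <gmpxx.h>   // GMP C++ interface; build with: g++ demo.cpp -lgmpxx -lgmp
#include <cstdint>
#include <iostream>

int main() {
    // Arbitrary-precision integers with no bit limitation, even on a 32-bit host:
    mpz_class f = 1;
    for (int i = 2; i <= 50; ++i)
        f *= i;                                      // 50! has 65 decimal digits
    std::cout << "50! = " << f << "\n";

    // "double" is a 64-bit IEEE float no matter the machine's word size:
    std::cout.precision(17);
    std::cout << "1/3 = " << 1.0 / 3.0 << "\n";      // ~15-16 significant digits

    // And the addressing arithmetic: 48 bits of virtual address space is 256 TB.
    const std::uint64_t bytes = std::uint64_t(1) << 48;
    std::cout << (bytes >> 40) << " TB\n";           // prints 256
    return 0;
}

Precision is cheap on any machine; it's the address space that genuinely needs the wider hardware.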

 

I then poked fun at you by using "numbers" in quotes, to which you tried to defend yourself by saying that "the numbers word was directly relating to benchmarks". Looking back, you said "one reason Cell isn't that popular in the 'professional' fields" because "they usually require 64-bit precision at the very least". Were you trying to say that they needed benchmarks that were 64 bits precise here?

 

Couple that with you thinking that Coppermine Celerons and Pentium 3s are different microprocessors (and continuing to state that the "hybrid" is more Celeron than P3, despite it having performance numbers closer to a P3).

 

 

 

Shouldn't you be working on that super new and unique AI routine that uses massive FLOP performance??

 

I'm curious what your background is in AI, particularly the kind that lets you state specific things such as "FPU performance in AI is a moot subject, as the bulk, and in most cases ALL of the AI is done using logic."

 

I'm also curious....how would you define logic in AI?


I'd like to note that I generally purchase all consoles each generation, and I believe that makes me a bit objective in the matter. I also have no qualms admitting the XBox has superior hardware in this generation. I don't understand why XBox fanbois can't admit the PS3 appears to be superior in the hardware categories.

