

Posted

"...The more I see of this guy Volourn, the more it looks like his rep is well deserved. Odd."


Go troll someone else. You give me way more importance than I have. I am nobody. Let's keep it that way. :)

DWARVES IN PROJECT ETERNITY = VOLOURN HAS PLEDGED $250.

Posted

"...The more I see of this guy Volourn, the more it looks like his rep is well deserved. Odd."


Go troll someone else. You give me way more importance than I have. I am nobody. Let's keep it that way. :)

 

Do you feel important... now?

 

http://youtu.be/_IVqMXPFYwI

Fere libenter homines id quod volunt credunt. - Julius Caesar

 

:facepalm: #define TRUE (!FALSE)

I ran across an article where the above statement was found in a release tarball. LOL! Who does something like this? Predictably, this oddity was found when the article's author tried to build said tarball and the compiler promptly went into cardiac arrest. If you're not a developer, imagine telling someone the literal meaning of up is "not down". Such nonsense makes computers, and developers... angry.

Posted (edited)

 

 

 

What? What sense does that make? DA:I is your typical "chosen one saves the world" story that gleefully embraces every RPG cliche known to man. PS:T is a much more personal, character-focused story that works to subvert many of those same cliches. PS:T is heavily centered around dialogue, while DA:I is much more action-oriented. I get the feeling he's just name-dropping a well-regarded RPG to give DA:I more credibility.

...

That is actually what I wanted to say. Didn't put it well enough, my bad.

 

Look, there's nothing wrong here. Games are designed around the concept of "fun" (whether they should be or not is another question), but fun is a deeply subjective thing. Not every (wo)man finds reading to be fun. Even fewer find thinking to be so. If you are a developer/publisher and seek to sell your game to as many people as possible (and you have to, if only to recoup the gargantuan development costs of "next-gen graphics"), you ought to take the lowest common denominator and divide your game by it. You or I (the gamer) can like it or dislike it, but it is logically sound. There was no other way games could go. AAA titles cannot be intelligent by their very nature, so why keep accusing them of it?

 

You could say that there's Obsidian, who manages to make AAA games tolerable in this regard at the least and good at the most. But Obsidian worked only on sequels at first, and therefore had a strong base in every aspect, including the technical side; a lot of the work was already done. And even with that, every AAA game they made was severely broken at release in many regards other than narrative. For the very first game they wanted to make without casual gamers in mind, they had to go to Kickstarter.

 

Moreover, inspiration plays a great role in creative work. I just cannot imagine a developer being seriously passionate about making something like Skyrim. More likely they're earning their money according to directives from above. Maybe I'm wrong on that one, though.

 

 

 

...The more I see of this guy Volourn, the more it looks like his rep is well deserved. Odd.

I've said this before, but I think Skyrim came at odds with its own design due to a lot of different factors: that horrid, game-breaking engine, the limitations of the PS3, etc. Mods make it so much better; I won't play it without them, but I still get the feeling that a lot of corners had to be cut from the original vision. I like Skyrim, but I'm not an FPS guy; I never liked FPS games until the first Half-Life came along, and then I saw how a developer could create a compelling story there, along with mechanics and enemies that tell stories of their own. It really is a shame that I rarely see any other FPS noted for its story and atmosphere. I tried that Last Light game and found myself bored pretty quickly.

 

Games shouldn't always be designed around fun; the Flash show Extra Credits covered that and did a pretty good job of it, I thought. For some reason that show seems to be haunted by some controversy in its past, but I've never bothered to dig into it.

 

And yes, Volourn is great... We definitely don't agree on everything, but interesting conversation comes from differing perspectives, not ones that are more or less the same.

 

Edit: Come to think of it, part of why I like TES is probably because I started playing at Daggerfall, whereas most people discovered the games when Morrowind hit the scene. I also played the Might & Magic series, which is sort of in between games like the Gold Box RPGs and the first-person perspective of TES. It was first person, but with party-based, turn-based combat, and it had really odd puzzles too. The end of book 2 was a surprise, but it also made you wonder WTF it had to do with the game.

I agree with the limitations part, but I wouldn't lump it solely on the PS3. Remember, the game was designed for the 360 and then ported over to the PC and PS3, so the limitations should rest on the 360's part as well. Bethesda has a terrible track record porting to the PS3, and while all their games are a buggy mess, the best port of a Bethesda game to the PS3 was Oblivion, and Bethesda had paid another company to port that game.

I blame the limitations of the 360 and the PS3 for Skyrim and even FO:NV. It really shows in FO:NV, where the base game without any patches has more content in it, and a lot of stuff had to be snipped because the 360 and PS3 couldn't handle it.

But now that the newer Bethesda titles will be on the newer consoles, which are basically high-end PCs from 2 or 3 years ago instead of outdated machines that were 10 years old, hopefully we will see a huge difference. But TBH I think they will focus more on the sandbox and graphics than on actual mechanics and story. They need to start hiring writers again instead of relying on programmers to write the stories. Flame me, but I really think they need to bring back the acid-tripping, chain-smoking, alcoholic writer MK, since he basically wrote most of the lore and the last game he was part of was the last TES game that had a decent Main Quest, instead of having the guy who wrote the DB quest line in Oblivion write most of the content like they did in Skyrim.

Edited by redneckdevil
Posted
I agree with the limitations part, but I wouldn't lump it solely on the PS3. Remember, the game was designed for the 360 and then ported over to the PC and PS3, so the limitations should rest on the 360's part as well. Bethesda has a terrible track record porting to the PS3, and while all their games are a buggy mess, the best port of a Bethesda game to the PS3 was Oblivion, and Bethesda had paid another company to port that game.

I blame the limitations of the 360 and the PS3 for Skyrim and even FO:NV. It really shows in FO:NV, where the base game without any patches has more content in it, and a lot of stuff had to be snipped because the 360 and PS3 couldn't handle it.

But now that the newer Bethesda titles will be on the newer consoles, which are basically high-end PCs from 2 or 3 years ago instead of outdated machines that were 10 years old, hopefully we will see a huge difference. But TBH I think they will focus more on the sandbox and graphics than on actual mechanics and story. They need to start hiring writers again instead of relying on programmers to write the stories. Flame me, but I really think they need to bring back the acid-tripping, chain-smoking, alcoholic writer MK, since he basically wrote most of the lore and the last game he was part of was the last TES game that had a decent Main Quest, instead of having the guy who wrote the DB quest line in Oblivion write most of the content like they did in Skyrim.

 

 

Everyone has problems with the PS3 because it's not a developer-friendly piece of hardware. Developers are also forced to tune parts of the game engine to the lowest common denominator in hardware, unless they want to maintain vastly different code branches in their CVS. That's how the PS3 can affect games on other platforms. Developers, publishers, even CEOs have commented through the years, and the comments usually amount to: what a piece of crap. I remember reading one that said, "More 4-letter words ring out of the PS3 room than the rest of the office combined."

 

The two biggest issues: 256MB of RAM was a terrible design decision, and the decision to use Cell was just plain hubris. I mean, if it was so great, if it was the "way forward", if it was everything Sony touted it to be... then why isn't there one in the PS4?

 

The core issue is that Sony's engineers forgot something important...

 

Theory and practice sometimes clash. And, when that happens, theory loses. Every single time. - Linus Torvalds.


Posted (edited)

Here ya go, redneckdevil: in-depth explanations, and he even covers Sony's own postmortem analysis of the PS3 and why Cell isn't in the PS4.

 

In fact, I just learned something myself from this: Cell didn't support OOE (out-of-order execution)... That's just... shockingly bad. It appears it was even worse than I thought.

 

Now, if you think I'm a 360 fan, you'd be wrong. I've never owned either console. If I had to choose today, I'd choose a PS4. But... I have 512GB of SSD, GTX cards in SLI, 32GB of RAM, and 4.1TB of HDD. So why would I ever do that?

 

Edited by Luridis


Posted

Oh, I'm not saying it's not Sony's fault with the PS3; I'm just saying that the limitations shouldn't be blamed ONLY on the PS3. I was stating that the 360 has a hand in it as well, simply because the game was designed on it first, and while it is friendlier hardware than the PS3, being designed on hardware that is 10 years old has its place in the blame as well.

 

TBH I just yearn for the old days when games were designed for PC first and then ported to consoles, with things cut to fit on that hardware. But I see why everyone having the same game is seen as a good thing, though.

Posted (edited)

Oh, I'm not saying it's not Sony's fault with the PS3; I'm just saying that the limitations shouldn't be blamed ONLY on the PS3. I was stating that the 360 has a hand in it as well, simply because the game was designed on it first, and while it is friendlier hardware than the PS3, being designed on hardware that is 10 years old has its place in the blame as well.

TBH I just yearn for the old days when games were designed for PC first and then ported to consoles, with things cut to fit on that hardware. But I see why everyone having the same game is seen as a good thing, though.

 

I didn't blame it all on the PS3 architecture. But when you're using middleware to facilitate cross-platform development, I'm going to assume they want to keep library functions as similar as possible across the platforms, because that's what I would do. Just look at the standard C library: the functions it provides are largely the same across all platforms. There's a reason it doesn't have lots of platform-specific functionality in there.

 

Yes, if you listen to what he says at the end of the video... Naughty Dog succeeded on the PS3 because of sheer manpower. And for the record, I'm not saying the Cell was poorly engineered in general; it pioneered some good ideas. I'm saying it was poorly engineered for practical use. As a research tool it was absolutely a good thing, but shoehorning the thing to market the way Sony did and expecting game studios to do compiler work was downright insulting. It's their hardware; they're responsible for producing a dev kit for application-level development. Let studios do what they do best, make games, and not waste their time trying to put together platform tool sets.

 

Edit: By compiler work I mean stuff like platform compiler optimization, things that the hardware developer is in the best position to do because they have access to things like the internal schematics of the CPU and development versions of it. I do realize that studios do some compiler work, but I think that's probably focused on stuff like shaders, or game-specific optimizations.

Edited by Luridis


Posted

Started Divinity: Original Sin and am enjoying it. I like the return of tactical combat, and it's nice to see magic that doesn't suck again. (Skyrim's not compelling as a mage, even with extensive mods.)

 

I made a Ranger + Witch as starting characters and went through the first dungeon and then to the beach. I'm glad to see the battles get more serious at the beach, because I could have slept through what was in that first dungeon.

 

The witch I like... But the ranger... doesn't feel like a ranger; it feels like an archer (if you've played Might & Magic you know what I mean). I'm not feeling the class at all. Is it just not very good, or what? I'm starting to think the Wayfarer was what I was looking for.


Posted

Has this been covered already? What's wrong with going downhill... you don't have to pedal, you can see where you're going, you can build all the speed you want, it's super exciting. There's a reason they call them uphill battles: no one wants to work twice as hard for half the distance. Use gravity, use it, don't fight it. I agree, role-playing games have been going downhill for years.


All Stop. On Screen.

Posted

People who talk about how games were better back in the day remind me of parents who complain about how their kids' music sucks compared to the masterpieces of their day.

 

Not even saying they're wrong.  It's just funny how similar they sound to my parents when I was younger and they'd always comment on how music is trash nowadays compared to theirs.

"Console exclusive is such a harsh word." - Darque

"Console exclusive is two words Darque." - Nartwak (in response to Darque's observation)

Posted (edited)

Has this been covered already? What's wrong with going downhill... you don't have to pedal, you can see where you're going, you can build all the speed you want, it's super exciting. There's a reason they call them uphill battles: no one wants to work twice as hard for half the distance. Use gravity, use it, don't fight it. I agree, role-playing games have been going downhill for years.

ok, that were kinda funny, and in a sense, we agree. 

 

please note we didn't bother to read any of this thread save the last few posts-- the title o' thread were enough to prevent our contribution.  no doubt we missed "important" stuff and perhaps we will be redundant.

 

we read the recent codexian review for wasteland 2, and we couldn't help but laugh. some o' the old skool crpg staples that is seeming at the top of the hill is stuff we thinks best left behind.  if vd honestly thought all skills in ws 2 were useful and the busted attribute he saw as most deserving recognition were coordination, then something is wrong with old skool.  the stuff that vd and others see as the qualities necessary to put a game at the top o' the hill is clearly not what we were seeing as important.

 

...

 

when kotor were released, we were mighty ambivalent. the combat were suck and the rules mechanics were shallow. nevertheless the game had interesting characters and the plot were clever. sure, the plot and main characters o' kotor were largely a re-imagining o' the original star wars movies, but that is what made it work. people were disappointed with the prequel movies, and kotor brought back the original star wars themes and characters and even the big reveal. kotor was a well-told story with interesting characters and we enjoyed it very much in spite of shallow mechanics and boring combat. 

 

it took us a short bit o' reflection to make us make the connection between kotor and the game it very much reminded us of: ps:t. were another game with an amnesiac main character. were another game with shallow mechanics and terrible combat. ps:t strengths were the handling o' the unique setting and the character development o' party and non-party characters. well guess what, those were the strengths o' kotor too. ps:t were (and is) our favorite crpg, and on the downward slide o' the crpg genre, kotor managed to replicate much of what we liked about ps:t.

 

is many games released this century that we has enjoyed very much. is some game we like far less... is no more a trend for good or bad than we has seen at anytime in the past. we generally liked the ie games, but there were also many Horrible crpgs released at the same time. even so, while we were seeing many flaws in bg2, we still think it were the best and most complete crpg we ever played, even if it weren't our favorite crpg. conversely, we were not a fan o' elder scrolls and we absolutely loathed oblivion... so it were kinda shocking to us that we liked fallout 3. how could we like fo3 given that it played so much like oblivion? as much as fo purists hated it, we thought the strength o' fo3 were the setting. bethesda did a fantastic job with the fo setting, and placing the game in washington DC were inspired. oddly enough, we saw the gameplay o' fo:nv as superior to fo3, but the vegas setting were lacking. other than hoover dam, vegas is what makes vegas compelling for Gromnir. for obvious reasons, obsidian could not recreate the vegas strip. still, we enjoyed fo:nv well enough.

 

the nwn2 games had us similarly conflicted. am knowing folks in these parts is big fans o' motb, but we found it kinda meh... and too many characters were poorly written and over dependent on their hook to be interesting.

 

we liked mass effect but hated the dialogue wheel, the mako and the shallow combat. mass effect 2 was both better and worse than mass effect... depends on which feature we mentions. still, overall we liked me 2 in spite o' the fact that the ultimate story o' me 2 were silly and the over-reliance on joinables with daddy issues.

 

old skool games we loathed includes arcanum. if that is a game at top o' the hill, we don't wanna return. arcanum were a poorly balanced and buggy mess that had a boring story, and worse, forgettable characters. there has been no pc game, crpg or otherwise, for which our anticipation were so utterly destroyed by the reality. arcanum suffered from the one insurmountable flaw no game can overcome regardless o' depth o' rules or multitude o' quests: it were boring.

 

Gromnir is no sucker for old skool crpg greatness. we liked many old games and we like many newer games. am not a fan o' the skyrim and oblivion kinda open-world trend that seems to sacrifice character and story in favor o' giving us panoramic vistas, but keep in mind that before oblivion were released, many developers, including obsidian developers, were telling us we would never see another enormous game in the near future. bg2 size could never be replicated, or so we were told. as much as we loathe the skyrim kinda stuff, bethesda helped show that big games ain't outta reach.  even games we don't like has helped make games we do like more likely to get made in the future.

 

regardless, we don't see some kinda crpg general downward descent into mediocrity or failure. over the years we has played a few gems o' games along with a good number o' moderate entertaining games. we don't recall a golden age or a dark ages for the genre. in our experience, the next game development has as much chance as being a gem as a turd.... same as was true for the previous anticipated role-play game developments. 

 

HA! Good Fun!

Edited by Gromnir

"If there be time to expose through discussion the falsehood and fallacies, to avert the evil by the processes of education, the remedy to be applied is more speech, not enforced silence." - Justice Louis Brandeis, Concurring, Whitney v. California, 274 U.S. 357 (1927)

"Im indifferent to almost any murder as long as it doesn't affect me or mine."--Gfted1 (September 30, 2019)

Posted

please note we didn't bother to read any of this thread save the last few posts-- the title o' thread were enough to prevent our contribution.  no doubt we missed "important" stuff and perhaps we will be redundant.

 

 

Yea, I get that.

 

People took it the wrong way.

People made assumptions about what I was referring to.

People assumed my statement was tied to black & white lines of thinking.

People thought I was only talking about Skyrim.

Downhill apparently means different things to me than to others; I'm not really surprised at that.

 

It was really all about a douche plowing through an RPG like it was an FPS. My thinking was: if mechanics are getting weaker, if story lines are becoming more shallow, if puzzles and traps are becoming remedial... it's because people have started playing RPGs like there's a race for the end of the game. I blame WoW, well, MMOJSs (MMO Job Simulators, 'cause that's what it means to "raid").

 

My post was never about the game's flavor of RPG: tactical, turn-based, real-time, party-based, strong story, first-person, class-based, etc.

 

"But, think of the children!" - I say to hell with them.


Posted

 

I didn't blame it all on the PS3 architecture. But when you're using middleware to facilitate cross-platform development, I'm going to assume they want to keep library functions as similar as possible across the platforms, because that's what I would do. Just look at the standard C library: the functions it provides are largely the same across all platforms. There's a reason it doesn't have lots of platform-specific functionality in there.

Yes, if you listen to what he says at the end of the video... Naughty Dog succeeded on the PS3 because of sheer manpower. And for the record, I'm not saying the Cell was poorly engineered in general; it pioneered some good ideas. I'm saying it was poorly engineered for practical use. As a research tool it was absolutely a good thing, but shoehorning the thing to market the way Sony did and expecting game studios to do compiler work was downright insulting. It's their hardware; they're responsible for producing a dev kit for application-level development. Let studios do what they do best, make games, and not waste their time trying to put together platform tool sets.

Edit: By compiler work I mean stuff like platform compiler optimization, things that the hardware developer is in the best position to do because they have access to things like the internal schematics of the CPU and development versions of it. I do realize that studios do some compiler work, but I think that's probably focused on stuff like shaders, or game-specific optimizations.

 

..actually, the ps3 through commonly available middleware is as perfectly usable as any pc with .. 416MB ram.. without sacrificing a single thought on cpu architecture, platform-specific implementations through OpenGL inserts, /or/ compiler work for that matter..

 

The problems that Naughty Dog described earlier were at least similar to, but different from, the problems the porting crews struggled with. ND made an engine from scratch to create the animation, the magical shirt, etc., and then added parts to it later, such as shader work with commonly known methods that have some requirements that draw resources from more than just the gpu-specific resources. So you need to plan well, and not start a fight between the parts of the studio that want visual flair and shaders done in one way, and the parts that think you should drop it all and make something new from scratch.

 

So on the games after Uncharted, they - allegedly - had the problem that if they wanted to use their artists as much as possible, they would need to coordinate with a platform specialist who could create new tools, with new limitations, etc. And that's difficult. And if that's the only thing you see as a developer - for example, if you're on the graphics and art side - then you're going to see the architecture as a problem. Because on PC and the xbox, the existing middleware and tools are adequate anyway: on PC and graphics cards because that's where the methods were developed, and on the xbox, in spite of it having in principle the same unified memory architecture as the ps3, thanks to essentially better limited.. I mean, structured and recognizable.. platform middleware.

 

I.e., there's nothing to compromise on with the xbox360. You still need to write an engine and use platform-specific implementations - with platform-specific limitations - but the graphics and art part is less difficult.

 

Porting people have a different problem, because the majority of the ports that were made weren't actually ports from PC to ps3, but ports of the xbox360 version to ps3. And that leaves you with a fairly specific problem, namely that the ps3 has 60MB or so less available memory. So something has to go, such as textures, or filters. But when you're sold on the magnificence of the cell, and so on, there's some pressure to create an alternative solution instead. And that's where the complaints turn up - from people who are not really qualified.. and very few people are, and if they are it's a lot of thankless work.. to start restructuring a project that's already completed and repackage it, including specific implementations and allocation work that may break routines in the project. Which you would need to do only to get the same results you already have.

 

Meanwhile, the "platform parity" requirement from both Sony and Microsoft (on pain of reduced visibility and promotion) ensures that even if you actually went through the entire thing and wanted to add something interesting, you wouldn't be able to anyway. Obviously that makes a lot of work on the port untenable.

 

Of course, if you just wanted to port a normally implemented PC game to the ps3, then you have an easier time, since the platform tools ensure such and such execution times (in spite of the core logic not "supporting" out-of-order execution at the assembly level.. *cough* ..do we even know the significance of this? We're really talking about how automatic compiler optimisations that work on x86 platforms won't work as successfully. Relying on that to get something to barely run is.. not a good idea on any platform).

 

And that's why you see fairly successful and easy ports being made now at the end of the ps3's lifetime, when developers are simply porting a PC version, rather than outsourcing the port work after developing an xbox360 version.

 

As well as why the ports on projects that had a competent platform specialist were quite good ports. DICE and BF3, for example. Or the Assassin's Creed games. Both of those had a bigger shader budget than the xbox360 version, for example. They also smuggled in gpu trickery that the 360 version, or a PC with the same number of shaders on the graphics card, couldn't run. Since, to paraphrase Christina Coffin, the gpu code can be run unmodified on the cell's SPUs. It's not efficient, but it's something that can be used to compensate for the then and now outdated gpu in the ps3. In other words, with a competent platform specialist to do the implementations at the high level, it would be possible to simply add extra graphics card routines, unmodified, without reducing thread response, etc.

 

The thing is that right now, the only advancement in graphics in games is adding more post-filtering and post-processing. Which means the only real requirement is to add more graphics cores from one "generation" to the next.

 

There's no "need" to create a platform with an explicitly parallel assembly language and processing elements that have immediate and concurrent access to main memory. Because who would want.. you know.. real-time ray-tracing, infinitely complex occlusion detection, near infinite light sources, perspective and location varying shadows, physics simulation based on advanced and complex maths that still complete every single processor cycle, etc. I mean, that'd be silly, since we have such awesome cutscenes. Not to mention grass made with green dots and all the square rocks. Awesome.

 

Anyway. And that's not really going to change for quite a while. And that's also why the two current "last gen" consoles really are simply low-budget PCs with a built-in platform DRM system. And developers conforming to that very eagerly is what is holding games "back". It's not one platform, or Microsoft or Sony, that's holding the industry back at all.

 

You know, you can say all kinds of things about platforms being difficult to develop for and things of that sort keeping games from realizing their potential as a phenomenon (and people actually do, which is hilarious). But what the developers that complain are actually referring to is that they can't add 3% more shaders. It's not that the game somehow doesn't have enough processor cycles to run reasonably well, or enough shaders to run most of the effects.

 

Like people say, really good games that were made a while back are still playable. And they're often made with more skill and thought than newer games, which, as explained, tend to be focused on adding those shaders much more than on creating gameplay and core logic, rulesets, etc. The Uncharted games are very good examples, where they essentially threw away a huge amount of animation interference logic in order to add a bloom filter and more one-shot explosion effects. Along with more cutscenes, of course.

 

And that part - cutscenes and graphics post-processing - is becoming a massive part of what games development means. Which, as you can probably tell is my opinion, is why games are going downhill.

The injustice must end! Sign the petition and Free the Krug!

Posted

..actually, the ps3 through commonly available middleware is as perfectly usable as any pc with.. 416MB ram.. without sacrificing a single thought on cpu architecture, platform-specific implementations through OpenGL inserts, /or/ compiler work for that matter.. (1)

 

 

The problems that Naughty Dog described earlier were at least similar to, but different from, the problems the porting crews struggled with. ND made an engine from scratch to create the animation, the magical shirt, etc. And then added parts to it later, such as shader work with commonly known methods that have requirements drawing resources from more than just the gpu-specific ones. So then you need to plan well, and not start a fight between the parts of the studio that want visual flair and shaders done one way, and the parts that think you should drop it all and make something new from scratch.

 

So on the games after Uncharted, they - allegedly - had the problem that if they wanted to use their artists as much as possible, they would need to coordinate with a platform specialist who could create new tools, with new limitations, etc. And that's difficult. And if that's the only thing you see as a developer - for example if you're on graphics and art - then you're going to see the architecture as a problem. Because on PC and the xbox, the existing middleware and tools are adequate anyway: on PC and graphics cards because that's where the methods were developed, and on the xbox - in spite of it having, in principle, the same unified memory architecture as the ps3 - thanks to essentially better limited.. I mean, structured and recognizable.. platform middleware. (1)

 

I.e., there's nothing to compromise on with the xbox360. You still need to write an engine and use platform-specific implementations - with platform-specific limitations - but the graphics and art part is less difficult.

 

Porting people have a different problem. The majority of the ports that were made weren't actually ports from PC to ps3, but ports of the xbox360 version to ps3. And that leaves you with a fairly specific problem, namely that the ps3 has 60MB or so less available memory. So something has to go, such as textures or filters. But when you're sold on the magnificence of the cell, and so on, there's some pressure to create an alternative solution instead. And that's where the complaints turn up - from people who are not really qualified.. and very few people are, and if they are it's a lot of thankless work.. to start restructuring a project that's already completed and repackaging it, including specific implementations and allocation work that may break routines in the project. Which you would need to do - only to get the same results you already have. (2)

 

Meanwhile, the "platform parity" requirement from both Sony and Microsoft (on pain of reduced visibility and promotion) ensures that if you actually went through the entire thing and wanted to add something interesting, you wouldn't be able to anyway. Obviously that makes a lot of work on the port untenable.

 

Of course, if you want to just port a normally implemented PC game to the ps3, you have an easier time, since the platform tools ensure such-and-such execution times (in spite of the core logic not "supporting" out-of-order execution at the assembly level.. *cough* ..do we even know the significance of this? (3) We're really talking about how automatic compiler optimisations that work on x86 platforms won't work as successfully. Relying on that to get something to barely run is.. not a good idea on any platform).
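One concrete illustration of what those x86-friendly habits do for you (a hypothetical C sketch of mine, not from any shipped engine): an out-of-order core hides a serial dependency chain on its own, while an in-order core needs the code restructured by hand.

```c
#include <assert.h>
#include <stddef.h>

/* One accumulator: every add waits on the previous one. An out-of-order
 * x86 core reorders around the stall for free; an in-order core waits. */
float sum_chained(const float *v, size_t n) {
    float s = 0.0f;
    for (size_t i = 0; i < n; i++)
        s += v[i];
    return s;
}

/* Four independent accumulators: the dependency chain is broken in the
 * source itself, so the code no longer leans on hardware reordering
 * (or on an unusually clever compiler) to keep the pipeline full. */
float sum_unrolled(const float *v, size_t n) {
    float s0 = 0, s1 = 0, s2 = 0, s3 = 0;
    size_t i = 0;
    for (; i + 4 <= n; i += 4) {
        s0 += v[i];     s1 += v[i + 1];
        s2 += v[i + 2]; s3 += v[i + 3];
    }
    for (; i < n; i++)      /* leftover elements */
        s0 += v[i];
    return (s0 + s1) + (s2 + s3);
}
```

Same arithmetic either way; the second version just stops depending on the hardware to find the parallelism for it.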

 

And that's why you see fairly successful and easy ports being made now at the end of the ps3's lifetime, when developers are simply porting a PC version, rather than outsourcing the port work after developing an xbox360 version.

 

As well as why the ports on projects that had a competent platform specialist were quite good ports. DICE and BF3, for example. Or the Assassin's Creed games. Both of those had a bigger shader budget than the xbox360 version, for example. They also smuggled in gpu trickery that the 360 version - or a PC with the same number of shaders on the graphics card - couldn't run. Since, to paraphrase Christina Coffin, the gpu code can run unmodified on the cell's spus. It's not efficient, but it's something that can be used to compensate for the then (and now) outdated gpu in the ps3. In other words, with a competent platform specialist doing the implementations at the high level, it would be possible to simply add extra graphics-card routines, unmodified, without reducing thread response, etc.

 

The thing is that right now, the only advancement happening in games graphics is adding more post-filtering and post-processing. Which means the only real requirement is to add more graphics cores from one "generation" to the next.

 


 

 

1 - It is now... It was not this way at launch. Which is why, as one of the videos I linked stated, developers forwent the SPE units altogether and ran everything on the general-purpose core. Why? Lack of docs, lack of tools.

 

http://www.redgamingtech.com/sony-playstation-3-post-mortem-part-1-the-cell-processor/

 

2 - Thanks for the additional insight.

 

3 - Compiler? It has nothing to do with the compiler; OOE & BPE are hardware-side optimizations. They are designed to make up for poor compiler optimizations and the lack of run-time information available at compile time, and crap like this highlights it. I'm tired of explaining it. Let the world and this country do whatever they want and teach everyone that programming is about... java. :banghead:
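To make "hardware-side" concrete, a hypothetical C sketch (function names are mine) of the BPE half of that: one loop leans on the branch predictor, the other gives it nothing to mispredict.

```c
#include <assert.h>

/* Branchy version: the count depends on a data-dependent branch that
 * the branch-prediction unit has to guess; on random data that's ~50%
 * mispredicts per iteration, on any hardware. */
int count_above_branchy(const int *v, int n, int t) {
    int c = 0;
    for (int i = 0; i < n; i++)
        if (v[i] > t)
            c++;
    return c;
}

/* Branchless version: the comparison result is used as arithmetic
 * (0 or 1), so there is no branch in the loop body to get wrong. */
int count_above_branchless(const int *v, int n, int t) {
    int c = 0;
    for (int i = 0; i < n; i++)
        c += (v[i] > t);
    return c;
}
```

Both return the same count; the second just never asks the hardware to speculate, which is exactly the kind of restructuring that matters more on cores with weaker prediction.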

 

Me at coffee shop: Well that depends, are you talking about a superscalar CPU? What is this being compiled for?

CS Student (Not 1st Year): What's superscalar?

Me: Did you sleep through your hardware class?

Him: What hardware class?

Me: FFS!

 

 

Fere libenter homines id quod volunt credunt. - Julius Caesar

 

:facepalm: #define TRUE (!FALSE)

I ran across an article where the above statement was found in a release tarball. LOL! Who does something like this? Predictably, this oddity was found when the article's author tried to build said tarball and the compiler promptly went into cardiac arrest. If you're not a developer, imagine telling someone the literal meaning of up is "not down". Such nonsense makes computers, and developers... angry.
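For the non-developers following that footer, a minimal C sketch of why the define is order-dependent (the tarball's actual surrounding code is unknown, so this is a guess at the failure mode):

```c
#include <assert.h>

/* The macro from the article mentions FALSE in its expansion, so TRUE
 * is only usable if FALSE has already been defined somewhere. */
#define FALSE 0
#define TRUE  (!FALSE)   /* expands to (!0), which evaluates to 1 */

/* Had FALSE not been defined above, any use of TRUE would expand to
 * (!FALSE) with FALSE as an undeclared identifier - a compile error,
 * i.e. the "cardiac arrest" the article describes. */
```
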

Posted

 

ok, that were kinda funny, and in a sense, we agree. 

 

Am taking this and running, as fast as we can go, downhill, or up, doesn't matter, will use a motor. 

All Stop. On Screen.

Posted

Gromnir: I agree with you on Arcanum - I also found it boring, although I loved F1 and F2. I also agree that WL2 could have been done better and is not a shining example of good old gaming done right. D:OS comes closer to that.

 

But I still consider that the older games (from the end of the 90s) are overall better than what we get today. Their stories and gameplay options still beat what is done now.

Posted

 


 

Heheh. Yeah, I can see where that's coming from. But even if the documentation was as patchy as it was for the open sdk (which I very much doubt), you would not have any problems whatsoever if you simply allocated the main processor, let a helper thread be automatically allocated elsewhere.. with the simplest wrapper you can possibly imagine.. and then dropped everything spu, explicitly parallel assembly and so on, and simply ran it. That would work, you see.

 

But yes, I see where you're coming from. Even though those of us who have programmed a lot of assembly, and know when never to use assembly or explicit allocations or imperative function calls that rely on other subroutines, etc., would only ever use that very rarely, and only very late in the process. Specifically because of the kind of issues people ran into when they had to port those functions to even marginally different hardware. Because that means, inevitably, that you have to reimplement the entire thing from the point where you chose to use platform-specific implementations.

 

Because make no mistake about it, the complaints about the ps3 hardware being insane and ridiculous do not come from the people who program spu-specific instruction sets to add effects, do animation spline correction, add filters with in-scene dependency, or anything of that sort.

 

The ones who did that - whom, incidentally, Sony apparently fired en masse when their games didn't create enough "buzz" - knew what they were getting into, and how interesting this really is. The complaints also do not come from competent porters who worked with a well-documented project. They come from people who either have to port a 360 version, or who have put it in their heads that c++ memory hacks to "speed up" games and "steal" clock cycles under the processor were still a thing in 2005. Or that it's still something you do and expect great results from in 2014. That's just not how programming on current hardware works.

 

You also literally do not write high-level code that in any way exploits the "advantages" of cisc optimisations such as you have on intel platforms. Those belong to compiler tricks that are fairly esoteric, and that don't actually run on the xbox360 hardware either.. you know.. that's something to consider. But yes, at least knowing what simd and superscalar designs are, so you know what the principle behind them actually looks like when you're creating shader code or something - so you know what sort of structure would likely be possible to optimize like a mf'er if you had such and such instruction sets (read: algorithms programmed on the spu elements, etc.) - that would be a good thing, absolutely.
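For anyone wondering what "knowing what simd designs are" buys you in practice, here's a hypothetical C sketch (the type names and the tiny N are mine) of the data-layout half of it: struct-of-arrays is the shape that unit-stride, lane-parallel hardware wants to stream.

```c
#include <assert.h>

#define N 4  /* toy size; real batches would be much larger */

/* Array-of-structs: x, y, z interleaved in memory. Loading "all the
 * x's" means strided access - awkward for simd lanes. */
typedef struct { float x, y, z; } vec3_aos;

/* Struct-of-arrays: each component contiguous - exactly the layout a
 * simd unit (or a specialized vector kernel) streams through. */
typedef struct { float x[N], y[N], z[N]; } vec3_soa;

void scale_soa(vec3_soa *v, float s) {
    /* Each loop walks one contiguous array: unit stride, no branches,
     * trivially auto-vectorizable by any modern compiler. */
    for (int i = 0; i < N; i++) v->x[i] *= s;
    for (int i = 0; i < N; i++) v->y[i] *= s;
    for (int i = 0; i < N; i++) v->z[i] *= s;
}
```

The point isn't this trivial scale loop; it's that choosing the second layout up front is what later makes a hand-tuned vector implementation even possible.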

 

But as you point out, that's not how people work now. It's middleware and high-level coding. And my point is that in that context, the xbox360 and the ps3 are identical.

 

Also, when I talked about relying on compiler optimisations to make the game run smoothly, or run at all, we're talking about something else. That's usually about inefficient code with no allocation discipline, no resource collection, lots of code that might run but won't and gets automatically snipped, etc. And you can't rely on that when you have a limited ram budget - which is the case for any console. Frankly, even mobile developers know how to do this, and when to work with and when to circumvent the compiler in that sense: which resources to mark, which resources to unbind. And none of that differs between xbox360 and ps3 development. That is, if you forgo the use of the spus and simply end up with the same or slightly worse results.
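A hypothetical C sketch of that "allocation discipline" (the arena size and names are made up): claim the ram budget once, hand out aligned pieces from it, reset wholesale at a frame or level boundary, and never touch malloc mid-game.

```c
#include <assert.h>
#include <stddef.h>

/* One fixed budget claimed up front, a bump pointer inside it, and a
 * wholesale reset. No malloc, no fragmentation, no surprises when the
 * memory budget is hard - as it is on any console. */
#define ARENA_SIZE (64 * 1024)

static unsigned char arena[ARENA_SIZE];
static size_t arena_used = 0;

void *arena_alloc(size_t bytes) {
    bytes = (bytes + 15u) & ~(size_t)15u;   /* keep 16-byte alignment */
    if (arena_used + bytes > ARENA_SIZE)
        return NULL;                        /* budget blown: fail loudly */
    void *p = arena + arena_used;
    arena_used += bytes;
    return p;
}

void arena_reset(void) { arena_used = 0; }  /* frame/level boundary */
```

The discipline is the same whether the console is a 360 or a ps3: the budget is fixed, so allocation failure is a design error you catch, not something a clever compiler or runtime papers over.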

 

Meanwhile, keep in mind - "unmodified shader code" can run on an spu.

 

When I first looked at the cell be sdk, and then heard people talk about it -- my jaw dropped. Like, wtf is this? There's a guy who created a huge series of articles where he wrote, essentially, a byte-code emulator for x86 to run on the cell. So he took each instruction, filled the spus each cycle with a new instruction, and ran it. Which produced abysmal performance (as in merely slightly below what you get running on an x86 platform). This is a fake example, obviously - it's not even a real-time compiler result - where he deliberately creates a worst-case scenario that literally never happens. And he did that to "prove" that the cell be was a sham. He's extremely active on the beyond3d forums, for example. And that was something I know people trusted as an authoritative source for deciding whether the cell be had any future.

 

Now, if you made complex instructions instead, you could create something like the in-scene effects and animation in Killzone 2. You know, real-time cutscenes, yeah? But that's again something so far out -- even when it's actually made and proven to work -- that it just doesn't survive.

 

So this gen we have mid-range gaming pcs with an outdated graphics card and built-in drm. And it sells more. Because the "industry" has /confidence/ in this cheap crap, and it generates "buzz"..


Posted

 

Heheh. Yeah, I can see where that's coming from. But even if the documentation was as patchy as it was for the open sdk (which I very much doubt), you would not have any problems whatsoever if you simply allocated the main processor and a helper thread to be automatically allocated elsewhere.. with the simplest wrapper you can possibly imagine - and then dropped everything spu, explicitly parallel assembly and so on, and simply ran it. Then that would work, you see. 

 

But yes, I see where you're coming from. Even though those of us who have programmed a lot of assembly, and know when never to use assembly or explicit allocations or imperative function calls that rely on other subroutines, etc., would only ever use that very rarely, and only very late in the process. Specifically because of the kind of issues people did run into when they had to port those functions to even marginally different hardware. Simply because that means, inevitably, that you have to reimplement the entire thing from that point where you chose to use platform-specific implementations. (1)

 

Because make no mistake about it, the complaints about the ps3 hardware being insane and ridiculous does not come from people who program spu-specific instruction sets to add effects or do animation spline correction, or add filters with in-scene dependency, or anything of that sort.

 

The ones who did that - who incidentally Sony fired in mass when their games didn't create "buzz" enough, apparently - they knew what they were going to, and how interesting this really is. The complaints also do not come from competent porters who worked with a well-documented project. It comes from people who either have to port a 360 version, or from people who have put it in their heads that c++ memory hacks to "speed up" games and "steal" clock-cycles under the processor was still a thing in 2005. Or that it's still something you do and expect great results from in 2014. That's just not how programming on current hardware works.

 

You also literally do not write high-level code that in any way exploits the "advantages" of cisc optimisations such as what you have on intel platforms. These belong to the compiler tricks that are fairly esoteric, and also don't actually run on the xbox360 hardware.. You know.. that's something to consider. But yes, at least knowing what simd and superscalar designs is, so you know what the principle behind it actually looks like when you're creating shader code, or something. So you know what sort of structure likely would be possible to optimize like a mf'er if you had such and such instruction sets(read: algorithms programmed on the spu elements, etc) - that would be a good thing, absolutely. (2)

 

But as you point out, that's not how people work now. It's middleware and high-level coding. And my point is that in that context, the xbox360 and the ps3 is identical. (1)

 

Also, when I talked about relying on compiler optimisations to make the game run smoothly, or run at all, then we're talking about something else. Then it's usually about inefficient code that will have no allocation discipline, no resource collection, lots of possible code that might run but won't that's automatically snipped, etc. And you can't rely on that when you have a limited budget on ram - which is the case for any console. Frankly, even mobile developers know how to do this, and when to work with and when to circumvent the compiler in that sense. What resources to mark, what resources to unbind. And none of that is different between xbox360 and ps3 development. That is, if you forgo the use of spus and simply end up with the same or slightly worse results.

 

Meanwhile, keep in mind - "unmodified shader code" can run on an spu.

 

When I first looked at the cell be sdk, and then heard people talk about it -- my jaw dropped. It's like wtf is this? There's a guy who created a huge series of articles where he wrote a byte-code emulator for x86, essentially, to run on the cell, for example. So he essentially took each instruction, filled the spus each cycle with a new instruction, and then ran it. Which then produced abysmal performance (as in merely slightly below the same as when running on an x86 platform). This is a fake example, obviously. It's not even a real-time compiler result. Where he deliberately creates a worst-case scenario that literally never happens. And he did that to "prove" that the cell be was a sham. He's extremely active on the beyond3d forums, for example. And that was something I know people trusted as an authoritative source on deciding whether the cell be had any future. (3)

 

Now, if you made complex instructions instead, you could create something like the in-scene effects and animation in Killzone 2. You know, real-time cutscenes, yeah? But that's again something so far out -- even when it's actually made and it's proven that it works -- that it just doesn't survive. 

 

So this gen we have mid-range gaming pcs with an outdated graphics card and inbuilt drm. And it sells more. Because the "industry" has /confidence/ in this cheap crap, and it generates "buzz"..

 

 

1 - This is what I meant when I said the functionality of the middleware is likely tuned to the lowest common denominator of the hardware. I obviously don't write games, so that was a guess, but I have used other kinds of middleware in other cross-platform scenarios. And in the cases I'm talking about, you can't really depend on anything being there but the foreign-interface wrapper to POSIX/libc/STL or whatever level you're working at, even if one of the platforms has mylovelystandardlibs.

 

2 - No, you can't optimize for OOE / BPE at the level of, say, Python. But you can write code that inadvertently defeats those mechanisms.

 

3 - I know where you're coming from. I don't think the cell was a sham; I think it was ill-executed, very ill-executed. Given what game developers, as well as their CEOs, were blogging at the time, the predictions I made to a friend when the architecture was first announced were spot on. They went something akin to: "game developers are going to hate it, and it's going to take a while for them to build momentum on it, because it changes a lot and, from what I understand, gives them little assistance in tools and docs. They're setting everyone back and making the hardware cost a ton more, and the benefits that would offset this are probably years away." Or well, you get the drift.


Posted

(2) Right. But both the xenon in the 360 and the ppc element in the cell do out-of-order execution. It's the array of spus that are the specialized processors, with their internal cache, concurrent access to working memory, etc. Those don't run general byte-code efficiently - but they do run it. So as explained, if you don't have incredible demands here, it's not exactly wizardry to get a reasonable result.

 

(1) Sure. That's.. a thing. But you have to deal with this anyway, whether you're developing against Microsoft's lovely Microsoft libraries or against Sony's sucky sdk, etc.

 

(3) I suppose it was marketed badly, and then sabotaged into obscurity internally, in the way that only Sony knows how to do with interesting solutions (it's not the first time - on their phones, for example..). The hardware, however.. the hardware is wonderful. :p No, seriously though, it was one hell of an attempt to create a portable cell be platform for the consumer market, without the drawbacks the specialization typically created. And it.. succeeded. It had the special-purpose spe array with asynchronous access to working memory, as well as a decent enough normal processor and a gpu, coupled to the same memory. That engineering feat is what really limited the memory size, by the way - constructing that memory bus with concurrent access is technically expensive, and... well, you would never do that if you wanted to minimize production cost or, for example, maximize PC-port performance.

 

If you wanted that, you would have chosen a different solution - namely what the ps4 and the xbone turned into. Those do have better performance in the areas people care about: shader budget and memory storage.

 

But it of course lacks any of the expensive features. It is a success for marketing, and a defeat for people who like interesting tech and any sort of advancement in, for example, computer graphics and interactivity in games. The parallel to how games are developed - marketing and cutscenes over substance - is fairly striking, yes?


Posted

(2) right. But both the xenon makeup in the 360 and the ppc-element in the cell do out of order execution. It's the array of spus that are specialized processors with their internal cache, concurrent access to working memory, etc. These don't run efficiently with general byte-code. But they do run it. So like explained, if you don't have incredible demands going on here, it's not exactly wizardry going on to get a reasonable result.

 


Yep, Xbox decided to follow in Sony's footsteps this time around. And, that said... I'll leave you with my favorite classic platformer of all time. Though the guy playing is sorta whiney.

 

Fere libenter homines id quod volunt credunt. - Julius Caesar

 

:facepalm: #define TRUE (!FALSE)

I ran across an article where the above statement was found in a release tarball. LOL! Who does something like this? Predictably, this oddity was found when the article's author tried to build said tarball and the compiler promptly went into cardiac arrest. If you're not a developer, imagine telling someone the literal meaning of up is "not down". Such nonsense makes computers, and developers... angry.

Posted

I was following the topic of narrative... ;)

 

More Lovecraft please. And no, slenderdork and pyramid heads don't count, even if they share a genre. And for it to be truly Lovecraft, the end can't be the player besting the villain. The options are to survive it, join with it, or realize it was you all along. I've seen people try to sell things like the unlikely underdog (think Frodo) as Lovecraft-like. "But it was so much more powerful than our little train-that-could, and he won by pure luck, a dagger in the Achilles heel." Nope... Sorry, he won; you can't win in a Lovecraft theme, no matter how unlikely, lucky or hapless the protagonist. Finally, the more baffling and haunting it is after the fact, the better.

You know, I'm biased as I'm making a game on Lovecraft, but these are ridiculous standards to hold to. I'm sick of people shredding their creativity in order to stay on the "tradition" route. 

Posted (edited)

 


It's not "tradition"... In Lovecraft, by the very nature of the theme, the villain can only be survived, joined with, or revealed to be secretly yourself. That is the root premise of Lovecraftian horror. That is what sets it apart from the rest of horror: ancient, unimaginably powerful and absolutely unbeatable, period.

 

Now, you can have something Lovecraft-inspired that works differently, but the evil being bested outright, with a happy ending, is not a theme I've ever encountered in his fiction. It's of course okay to call your fiction something different, but saying "this is the Lovecraft way" doesn't work if there are blue skies and happy children petting puppies at the end.

 

Oh, these guys seem to get it too... I don't know why it's so hard to understand.

 

Edited by Luridis


Posted

 

 


Stagnation in any genre limits originality and creativity. That's a fact. 

 

Once you start putting definite, concrete words to one author's prevalent themes or concepts, it becomes a circle-jerk of trying to reclaim their status for yourself.

Posted (edited)

I don't agree with you. The genre is fine; Lovecraft is an author, not a genre. His work is fairly unique in its presentation, in a similar way to The Empire Strikes Back. Imagine if Gimli had succeeded in destroying the One Ring in Rivendell with his axe - that would have neutered the whole damn story.

 

“Now all my tales are based on the fundamental premise that common human laws and interests and emotions have no validity or significance in the vast cosmos-at-large.... To achieve the essence of real externality, whether of time or space or dimension, one must forget that such things as organic life, good and evil, love and hate, and all such local attributes of a negligible and temporary race called mankind, have any existence at all.” ― H.P. Lovecraft

“And now at last the Earth was dead. The final pitiful survivor had perished. All the teeming billions; the slow aeons; the empires and civilizations of mankind were summed up in this poor twisted form—and how titanically meaningless it had all been! Now indeed had come an end and climax to all the efforts of humanity—how monstrous and incredible a climax in the eyes of those poor complacent fools in the prosperous days! Not ever again would the planet know the thunderous tramping of human millions—or even the crawling of lizards and the buzz of insects, for they, too, had gone. Now was come the reign of sapless branches and endless fields of tough grasses. Earth, like its cold, imperturbable moon, was given over to silence and blackness forever. The stars whirled on; the whole careless plan would continue for infinities unknown. This trivial end of a negligible episode mattered not to distant nebulae or to suns newborn, flourishing, and dying. The race of man, too puny and momentary to have a real function or purpose, was as if it had never existed. To such a conclusion the aeons of its farcically toilsome evolution had led.” ― H.P. Lovecraft

 

Nowhere in that do I see hope, a hero, luck, or anything resembling an underdog beating the odds. If you survive, it's because you happened to fall beneath the notice of the destroyer. You can't cross the streams and walk away like you kicked butt and took names; if you remain sane, it's probably because your struggle with sanity will only please the Old One.

 

If you want to go a different direction and use his work for inspiration, that's completely normal - he has inspired lots of offshoots in horror fiction. But to say a "hero" fits neatly into his themes is to miss the entire point of his work. So, good luck with your protagonist.

Edited by Luridis


Posted

You know what, scratch all that.

 

Bryy, you have the right to make any game you want, and call it whatever you want, assuming you're not squatting a trademark. But, I also have the right to disagree with you about what's thematically consistent with a particular subject.

 

For the record, I hope your game turns out great. I hope it's all you expect. Just because my opinion differs and we don't agree doesn't mean I wish to discourage your creativity. I just see things differently, and that's all the statement is intended to convey. People disagree with me all the time, but it doesn't stop me from working on whatever it is I have planned.

