Posts posted by Azarkon

  1. Considering the number of people interested in games nowadays, I have a feeling that it's also a matter of supply and demand in the job market. Your local evil corporate conglomerate treats programmers badly because programmers are a dime a dozen since the computer science bubble burst. If Joe Programmer doesn't like it, he can always leave - there are many programmers out of work and plenty willing to work under even adverse conditions if it means working at a job they think they'd like. But the rate of disillusionment among programmers in the gaming industry is pretty high too, or so I hear.

  2. Why should Lucas innovate if he can milk his success till the day he dies?

     

    Admit it, folks, you can criticize Lucas all you want, but in the end it comes down to this: successes on the scale of the Star Wars franchise are rare. If a person can come up with just one such franchise, he/she's set for life. The problem we all face is coming up with that one - Lucas did it, and for that you gotta respect him, regardless of how he acts afterwards.

  3. Come now.

     

    I doubt that, in the interest of avoiding a Troika-like demise, employees of Obsidian, including MCA, are allowed to say that KOTOR2's ending is a pile of big stinking poo because publisher LucasArts didn't give them enough time to finish the game.

     

    An Empire Strikes Back ending may simply be the best way of expressing one's discontent without coming out and saying it directly. Reading between the lines, in this respect, may be necessary...

  4. I don't know about the rest of you guys, but I'm quite excited to see the different approaches/opinions to stories in games. I agree with all five points of view, if that's possible, and I dearly hope that all five sides have their place in the gaming industry.

     

    Part of what interests me most about the game genre, as opposed to literature/comics and film/animation, is that games do not suffer from the dictatorship of the story. In literature/film, the first thing people will tell you is that the story is everything: without a story you have nothing, and so you always start with the storyboard; film and literature are simply two ways of representing stories: one linguistic, the other visual. This isn't to say that there's anything wrong with this, but it is a limitation, because not everything in life is experienced through stories.

     

    Games do not have the same limitations. They may *engage* in storytelling and indeed allow the story to dictate the game, but they can choose not to and still turn out to be very good games. In this respect, the game dictates the story, not the other way around, and that's a good thing, because in one sense it's a step beyond film/literature. You CAN make a game that's all story - it then becomes a film (or a novel, if it's text-based). But you can also make a game that has little or no story - that's unprecedented and has no equivalent in film/literature. Therefore, games allow an extra dimension of freedom, and this, I argue, gives them greater potential than either film or literature in capturing basic human experiences.

  5. It seems to me that in game design, as in art, your portfolio is more important than anything else. It's quite obvious why this would be: knowledge and problem-solving skills can't often be judged directly (which is why they look at your degrees), but artistic/creative skills often can be.

     

    I'm of the mind that the best way to get into the gaming industry is to make a good game/mod, either on your own or with others, to demonstrate that you've got what it takes. This is with regard to game design. I'm not so sure about programming - true, anyone can be a programmer, but to make great games you want more than the average programmer. You want someone who's smart enough to implement cutting-edge optimization, graphics libraries, and even R&D his own tech (e.g., Pixar's R&D team) if industrial equivalents can't be found. Most truly successful game companies do not simply license engines - they modify them heavily, or develop their own, in order to stay ahead of the technology curve. To do this, you need true scientific savvy, true ingenuity, and for that you have to go to the high-class university graduates - and pay them big bucks.

     

    But game design? Nah. Although, really, it'd be much better if you had BOTH - technological savvy and game design experience - at which point you're going to be ahead of the guy with just either. In any industry where competition is key, the person who offers more "bang" for the buck typically wins out. Who would you pick, if you were faced with two guys who've both made mods/games, but one has a BS in CS from MIT and the other has just a community college education? The former you could probably depend on for more than just game design - the latter, not necessarily. As with anything else, therefore, it's good to understand that getting a job is not necessarily about being the best man for the job - as that's entirely too difficult to tell simply from resumes and interview impressions - but about being the *better* man than anyone else applying.

  6. The people in those cities were all working for the military machine.

     

    Civilian is a tricky term. Every person living in a country during a time of war either contributes in some way to the war effort or detracts in some way from it (just by paying taxes in America, you support the war effort). Very few people have zero effect on how the war goes. If your definition of civilian, then, is simply that the Japanese in Nagasaki/Hiroshima supported the war "more" than Joe Taxpayer does, that may fly - except for the fact that, of course, they didn't have much more of a choice than you do in not paying taxes. That is, the Japanese military dictatorship of the time would not have tolerated dissent, and propaganda regarding the righteousness of Japan was such that it would've been impossible for most people to have had "other ideas" anyhow. This includes, by the way, women and children: as we all know, in times of total war, women and children contribute to the war effort just the same as adult males do.

     

    If you start making distinctions based on gender and age, you start running into problems such as 11-year-old and female suicide bombers (who, by military logic, should therefore be considered threats and exterminated).

     

    That said, I completely agree that one should not weigh civilian deaths over soldier deaths. Both are, in effect, innocent of what their superiors decided, and incapable, really, of rebelling against that choice. This is the reason you cannot call everyone in Germany a war criminal because they "contributed" to the systematic genocide of the Jews, any more than you can say that a drafted soldier was truly guilty of anything, other than, perhaps, the inability to stand up for basic humanity over patriotism (and most likely be court-martialled in the process).

     

    However, it should also be said that the term civilian is more commonly used simply to refer to a non-combatant. Yes, the people working in factories contributed to the war effort, but they are still non-combatants, and in human society, at least, we seem to have some ancient sense of honor by which killing an unarmed person with no direct way of fighting back - regardless of whether said person contributed to making the weapons that killed thousands - is somehow a dishonor. Perhaps we imagine ourselves in the same position, I don't know, but the fact remains that civilians are civilians regardless of how they're tied to the war effort. Once you pick up a weapon or become directly involved in the orchestration of war (whether as a leader or a common soldier), you cease to be a civilian. But till then, even enemy scientists could be considered civilians, though of course - as I said - no country was ever really shy about killing civilians when it became necessary: they just apologize afterwards.

  7. India's been around for what? 4000 years?

     

    "Been around" isn't the same as being a world power or, though we won't admit it, a nation-empire able to police the world. Nor is it the same as being the same political entity it was 4000 years ago.

     

    One of the easiest mistakes to make is to confuse modern locales with ancient civilizations - do you call a modern Italian a Roman? Do you still call the UK the British Empire? Do you call Communist China a dynasty? No - because they are none of these things. Those civilizations had come and they had gone. What we have now is their history and their legacy, nothing more. Modern civilizations should not be confused with the old.

     

    India in its history has undergone many revolutions, and knew its own golden age during the reign of the Gupta Empire. But like every civilization, its time of power came and went; old, decadent empires collapsed to give way to newer political systems and younger civilizations. Yes, they still call themselves Indian (though a huge chunk has become Pakistani, let's not forget), but that's because it's the culture that survives, not the empire or the political system.

     

    These things come in cycles. A few hundred years ago the British Empire had its golden age. Fifteen hundred years ago the Chinese Tang Dynasty was the most developed empire in the world. A bit over two thousand years ago the Romans ruled. Earlier even than that, the Egyptians and their godlike Pharaohs thought that their glorious civilization would last for all eternity. None of that came to pass. History is the best proof that no one civilization will ever be able to stand the test of time - inevitably, power brings complacency, age brings decadence, and newer civilizations and governments rise to replace the old.

  8. Once upon a time, America (and Europe) considered the willingness to sacrifice one's life for the greater cause of bringing mass destruction to one's enemies - civilian or otherwise - a virtue (Samson is a Biblical hero, after all). Once, America was a new country, bursting with potential and the need to break from the old, decadent societies of Europe. Naturally, it allied itself with the rebels, the freedom fighters - and we have stories, myths, and folk tales passed down from that time praising those who might very well in modern times be considered terrorists, or at least criminals.

     

    But the tables have, somewhere down the line, been turned. America is now the most powerful nation in the world, and just like every society that's ever been in the same position, we are likely bound for the same end.

     

    History is a cycle of war, peace, and revolution. There will come a day when the US will become a decadent old civilization (if it is not already), and when that day comes, assuming that we did not bring the world to total destruction with our fall, there will rise other, younger peoples and nations who will be remembered by their respective states as freedom fighters. But before that, they must first be called rebels, traitors, criminals, and terrorists. That seems to be the way of things.

  9. but would you say there is a difference between when those tactics are occasionally used and when they are the backbone of your entire effort?

     

    The question you gotta ask yourself is this: can terrorists fight their opponents in any way other than terrorism? Can they declare, in a military sense, "war"?

     

    I'm of the mind that people generally don't engage in SUICIDE bombings if they can achieve their objectives through some other method, such as winning battles and being respected as war heroes. But that's not what modern warfare is about - the age of huge armies clashing against each other has come and gone, though many nations still refuse to admit it.

  10. Terrorism is just one of the many military tactics used to achieve a particular objective. It's easy to abstain from terrorism when you're big and powerful, with tanks and bombs and missiles capable of leveling entire cities in the span of a day - it's easy to, at that point, claim that you're trying to avoid civilian targets because you CAN do so. It's the equivalent of possessing great strength and choosing just the right amount for the job.

     

    But it's different when you're weak. It's different when you can't fight your opponent directly without risking extermination. It's different when the only possible victory you can ever achieve is not necessarily to scare your opponent's military into submission, but to make the entire atmosphere of war too costly and dangerous for your opponent to continue fighting. Guerrilla warfare and terrorism share this objective and are often both undertaken by "freedom" fighters. The difference is that guerrilla warfare is the term given to a more organized military offensive undertaken by trained soldiers, whereas terrorism we associate more with extremist groups and organizations that do not typically have true military prowess and are not "nations" from any political viewpoint (you seldom hear of a nation committing terrorism, because the equivalent tactics used by nations are considered acts of war).

     

    To say that guerrilla freedom fighters never target innocents is a matter of terminology. You can certainly define it that way, but I doubt, somehow, that linguistic word choice is what you're after here. In the history of "freedom fighting," both guerrilla warfare and terrorism have been used - necessarily - to achieve political goals. During the Vietnam War the guerrilla fighters of the Vietcong initiated attacks against civilian targets for the sake of disrupting US forces - and vice versa, in certain cases. This is distinctly a terrorist tactic, but because the Vietcong are considered an organized military under a political entity, it can be, and often is, labeled guerrilla warfare. For a military to commit terrorism, it's merely the acts and facts of war. For an organization - such as a Muslim extremist group - to do the same, it's terrorism through and through. Ultimately, though, the two are seldom different. As I said, targeting civilians is not necessary if you have the technological and military power to avoid it. But when it does become necessary, I've seen very few countries actually shy away from the use of force for their own ends. The US, after all, is the country that dropped the A-bomb on Japanese civilians - and the Japanese, in like fashion, are the ones who committed the massacres of Chinese civilians. The cycle repeats, and I'd like to quote from a rather wise man:

     

    Governments seldom lose wars. People seldom win them.

  11. Arcanum, ToEE, and V:TM sank Troika. To claim that they were great games really doesn't hold water. The Fallouts and the IWDs, at the very least, kept IPLY and BIS going, and the BG's and NWN's obviously brought Bioware success. Troika's games ran them into the ground due to a variety of technical issues - and really, when it comes to games, the best ideas without solid execution are worth jack squat.

     

    Personally, I was a lot more immersed by IWD than by ToEE, despite IWD's age. And I certainly preferred the IE style of D&D combat to ToEE's, not because I hate TB (I love TB strategy games and roguelikes), but because ToEE's TB was placed within the context of a rush-job dungeon crawl. It could've been great, if ToEE itself had been great, but combat mechanics alone do not a game make, and I doubt publishers elsewhere learned much from ToEE other than not to release a buggy, unfinished game. It certainly did not bring about the second age of TB CRPGs, if that's what some are implying.

     

    Heck, even Troika realized that when they dropped TB for RT in VTM.

  12. Artistic license > political correctness. Period.

     

    Of course, with that said, anyone and everyone should feel free to dismiss a game/company based on his/her own sensibilities, and all companies that seek to make money would do well to understand the nature of those sensibilities. In the creation of art, no concession should ever be made for the sake of being representative of a modern sensibility (unless that was the goal). However, in the creation of a product, that sentiment is reversed...

  13. So long as I'm limited to staring at the computer screen, instead of being "fully immersed" through a VR system of some sort, isometric systems/alternative cameras have their place.

     

    The simple fact of the matter is, first-person isn't the BE-ALL END-ALL when you're staring at a 2D representation of a 3D environment. You can make it as real as you want to, but if I'm reduced to movement via keyboard and mouse, first-person viewpoints just ain't all that unless you switch to a shoot-'em-up style a la FPS's. Even with eye-candy, you can never quite capture the beauty of executing an "I know kung fu" attack with a first-person system, whereas it's easily possible with a follow camera or a morphable isometric POV.

     

    Immersion comes in many flavors, and while I totally agree that a first-person POV can offer more immersion in many situations (I'm definitely not in the "isometric 4 eva" crowd), to claim that it's the only choice is hogwash.

  14. You have to remember that DS was designed for novice players and those put off RPGs by all those stats - then everything makes sense.

     

    Perhaps that was the intent, but to be honest, DS is easily outperformed by other games, such as Diablo, designed for the same purpose. Gameplay does not have to be complex in order to be fun, but DS's was neither complex nor fun. The first time I realized that you could cast the same two spells over and over again ad infinitum in every battle, I knew that the game was doomed to failure. Even Diablo had you using more than that, and Diablo battles were at least fast-paced enough to keep your attention with the "omg mass slaughter #%#!" even if you were just pressing the same buttons. Plus Diablo had interesting items and spells, whereas Dungeon Siege's spells and items were lackluster and statistical. Save for a few moments of brilliance (the first time I saw the beholder comes to mind), DS was for me mostly forgettable, so you will have to excuse me if I make some memory errors.

  15. Games you watch aren't necessarily boring, as millions of FF fans will tell you.

     

    However, in DS most of the time what you watched wasn't the story, but the repetitive monotony of uninspired battles. And that was supposed to be the point of the game.

     

    I can't imagine what kind of drugs one would have to take in order to consider that "fun" during the design phase.

  16. Nah, WC and WC2 didn't follow their standard plot, but that's because back then they didn't even do integrated plotlines. In WC and WC2 the side you played was the side that "won", I believe, though the official storyline is that in WC, the orcs won. Then in WC2, the humans won by closing the Dark Portal. Then in the WC2 expansion, the humans won again by destroying the Dark Portal.

     

    After WC and WC2, it's become their practice to have the good guys win (via sacrifice) in the main game by destroying the super-villain, and then the villains in the expansion/sequel win by becoming some kind of new super-villain (WC3: Archimonde gets killed by wisps in RoC, but Arthas becomes Lich King and "wins" in FT; SC: Tassadar destroys Overmind in original, but Kerrigan becomes new Zerg Queen in BW; in Diablo, the hero kills Diablo, but in Diablo 2, he becomes Diablo). Makes it possible to do sequels.

     

    Speaking of storylines, Diablo I/II and SC had the best storylines of any of Blizzard's games, imho. Many people consider Diablo I/II standard hack 'n slashes - and they are - but they actually had a Nietzschean undercurrent of hero-becoming-monster that's well presented. Diablo II's cutscenes, especially, are amazing in how well they tell the story - way different from the actual game's lackluster presentation.

  17. Blizzard is currently too busy working out their server stability fiasco with WoW to make SC2, I'd think. And besides, a lot of the guys who made SC have left Blizzard, so I'm not sure SC2 is going to be like SC even if it gets made.

  18. Statement: A monopoly in the gaming industry means no niche genres such as CRPGs.

     

    Makes no sense. A monopoly of the gaming industry that produces only FPS games cannot exist. Even if the demand for CRPGs is small, there is still a demand. A company that produces only FPS's will not satisfy that segment of the demand, and as such it will be easy for plenty of other companies to rise and make a living out of supplying CRPGs. That would in turn defeat the definition of said corporation as a monopoly.

     

    In other words, for a company to be a monopoly in the gaming industry, it must satisfy all genre demands. It must produce CRPGs, or else make it so that CRPGs cannot be developed by anyone else - which, unless government regulations totally change, will simply not happen.

     

    In fact, a so-called "monopoly" here would actually be a relatively GOOD thing, as the assumption behind a monopoly is that it'd be impossible to compete with said monopolizing super-corporation. If such is the case, then smaller/independent companies would not need to concentrate on making mass-market FPS's and console games, as they'd not be able to out-compete the super-corp anyhow. In other words, since it's economic competition that drives the current shift away from CRPGs and into mass market games, if a single corporation ends up single-handedly dominating that market to such a degree that other companies cannot compete with it, those smaller companies will be forced to abandon that road and go back to what they're best at in the niche genres, which means producing more CRPGs.

     

    I don't completely agree with taks' definition of monopolies, but he makes a very good case for why leisure monopolies under the current system don't really hurt the economy. The fear of the megacorporation is more a fear of integrated corporations that control everyday life - for instance, a corporation that controls food products / water / medical care. Such a corporation, using suppression tactics, could theoretically never be displaced under taks' scenario, since the market for food products / water / medical care will not decrease. Fortunately, there are government regulations in place to prevent such monopolistic practices.

     

    As far as leisure monopolies go, I really don't see how a corporation can ever fully dominate a market without suffering from inefficiencies or having political/military power outside of mere market forces. For a company to completely dominate the gaming industry, it'd first need to out-compete all other companies. Let's say it does that, and then raises its prices to hurt the consumer. But in that case, consumers may simply decide not to buy such a product and move on to other leisure activities. This would weaken the monopolizing corporation and make it easier for smaller companies to come in and fill the gap, which eventually leads back to fair competition. The only alternative to taks' cycle would be if the corporation controlled all supplies - if it denied game development access to everyone else. But such a corporation would need government power in order to do this, and as such can't be considered merely a monopoly under current definitions.

  19. Correct me if I'm wrong, but isn't there a huge difference between different "degrees"? Are we just talking about a general bachelor's degree here (perhaps the common-use referent for "degree"?), or a Master's/Ph.D. program?

     

    That nitpicking detail aside, I'm currently working toward CS/English "degrees" at UCB, and I for one can attest to the fact that many CS students do not know how to make a button in Visual Basic.

     

    How? Because I'm one of them. Why? Because I never use Visual Basic. The classes are taught, for the most part, in a combination of C, C++, Java, Lisp, and various other languages on the Unix platform - and not via anything Microsoft has made.

     

    However, that doesn't mean that, given a week or two, most CS students here would find it at all difficult to pick up Visual Basic and start using it like a regular. That's kind of what's demanded by a rigorous CS degree: you have to be able to self-learn programming languages in a matter of a week or two, while class is in session, independently of the professors, who already assume that you know everything.
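
    To illustrate the point (this is my own quick sketch, not anything from coursework): here's roughly what "making a button" amounts to in Java Swing, the kind of toolkit a Unix/Java-trained student would actually reach for. Pick up the VB equivalent and you'll find it's the same idea with different syntax - a window, a button, a click handler:

    import javax.swing.JButton;
    import javax.swing.JFrame;
    import javax.swing.SwingUtilities;

    public class HelloButton {
        public static void main(String[] args) {
            // Build the UI on the Swing event thread, as Swing expects.
            SwingUtilities.invokeLater(() -> {
                JFrame frame = new JFrame("Hello");
                JButton button = new JButton("Click me");
                // Attach a click handler - this is the whole "make a button" exercise.
                button.addActionListener(e -> System.out.println("Button clicked"));
                frame.add(button);
                frame.setDefaultCloseOperation(JFrame.EXIT_ON_CLOSE);
                frame.pack();
                frame.setVisible(true);
            });
        }
    }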

  20. I often find that, on a purely personal scale, the game that first got you "hooked" on a genre tends to be the one that most stands out. It's been proven again and again for me. For instance, I fondly prefer the "nostalgia" of Everquest to the much higher-quality production of World of Warcraft, the dated atmosphere of "Doom" to the much more effectively rendered Half-Life 2, and the geeky charm of graphically challenged roguelikes to modern-day eye-candy hack 'n slashes.

     

    For me, as such, it was the BG's, and not the Fallouts, that most stand out. I never did play the original Fallout when it first came out, and playing it years later, having been jaded by so many other CRPGs and TB games, is just not the same. It lacked the "magic" of first discovering isometric CRPGs, and as such, even though it may be a superior game, I cannot honestly prefer it.

     

    Edit: 'Course, the fact that I personally prefer fantasy-based settings for CRPGs (having been one of those kids who at a young age read Tolkien rather than Asimov, and Jordan & Martin rather than **** & Simmons) did not exactly help the Fallouts, either.

  21. Guild Wars has got to be one of the most disappointing games I've ever played. Best MMORPG? Not even close, made more so by the fact that Guild Wars isn't even an MMORPG - it's a CORPG, straight from the devs' mouths. I'm actually surprised people on this forum would recommend it. Guild Wars is basically Diablo III. Not only that, but it's a dumbed-down version of Diablo III, with combat more akin to Dungeon Siege than anything else. Sure, I had fun playing it initially during the free weekend, but by the end of that weekend I had stopped playing. It was just pointless hack 'n slash coupled with Quake-style arena combat. That'd be fine - after all, I did enjoy Diablo II - if Guild Wars had any semblance of a good combat system. But it doesn't, or at least it didn't when I played it most recently. In essence, Guild Wars' combat system consists of pressing one of the 7 or so keys that represent your skills - of which you can only have 7 active during any dungeon crawl, arena match, etc. - over and over again, since your mana regenerates so fast you'll basically be using them every other attack. Given this, how much excitement do you presume is really involved in Guild Wars combat? Add to this the mercenaries they allow you to hire, who basically do all the work for you, and you have online Dungeon Siege.

     

    Guild Wars has great graphics. And it has the so-called "free of charge" play. If those are the things you look for in a game, be my guest. But if you're looking for any amount of depth in gameplay, it's my opinion that Guild Wars simply doesn't offer it.

     

    Best MMORPG at the moment? World of Warcraft. Pros: talent-based character development, fun quests, difficult dungeon crawls, and the potential for a very dynamic world (though Blizzard has yet to realize that potential). Cons: At times immature community, limited longevity.

  22. Just because Sion is the Dark Lord of the Sith doesn't mean he's going to be the main villain. Emperor Palpatine was Dark Lord of the Sith in the original movies, but who would rate him as the main villain of the SW story over Vader? Hell, when he died in the end it was "ooo-kay..." whereas Vader's presence commanded attention throughout the movies.
