Everything posted by angshuman

  1. What?!?? Valve has delayed a game?!?? That is unheard of.
  2. WoW, eh? :D Take a break from those MMO's, buddy, or soon you'll end up dead-as-a-Korean.
  3. Wow, the Xbox textures look really terrible. I'm pretty sure a GF3 can easily handle higher-res textures than that. Honestly, the X360 screenshot also doesn't look that impressive. You could probably get something like that off a 9800Pro.
  4. I beg to disagree. I have been playing Guild Wars PvE for a while and my impression so far has been that the story is weak. Not necessarily bad, but completely ordinary and uninspiring. Nowhere close to KOTOR-I/II. Hell, Diablo-I/II had a more intriguing plot than Guild Wars. I haven't completed the campaign, so there might be interesting plot twists coming up, but a good story always draws you in and constantly inspires you to keep playing -- something that I have not felt with GW. The gameplay, though, is pretty good. There's a great deal of variety, and it's pleasantly hard. But I don't feel any excitement at all about progressing the story further. It definitely does seem to be a very, very PvP-focused game, and PvP's just not for me. All IMHO, of course.
  5. It seems the rumor was completely bogus. http://joystiq.com/entry/1234000830065698/
  6. http://www.nvnews.net/vbulletin/showthread.php?t=58966
  7. Yep, Lost Coast is a single-level tech demo (it's free). Gameplay is probably about 10 minutes. Aftermath is the expansion pack.
  8. That's *exactly* what I felt from the screenshots (haven't played the game yet). I couldn't understand why people were going ga-ga over the outdoor scenes -- the low-resolution flat skyboxes appear to be from the HL1 era. Doom 3 had these, but then Doom 3 was a mostly-indoors game. I was hoping they would upgrade the engine to support high-resolution 3D skyboxes like HL2's, but that didn't happen. Overall, Quake 4 seems to look *fantastic*, but the seamless vast-open-spaces feeling (that was awesomely done in HL2) is completely absent in the outdoor scenes.
  9. Are you talking about Tenebrae 2.0? I believe according to the devs, Tenebrae2 is based on the Tenebrae1 code. However, the project seems to be dead -- the last update was sometime in March 2004.
  10. Be careful, buddy... last I heard, Malak liked to touch hot things with his tongue to see how hot they are.
  11. If you read that article carefully, you will notice that barring a few exceptions (such as Half-Life 2), the only resolutions where CPU scaling is evident are 1024x768 and lower. Once you go up to 1280x1024 and 1600x1200 there is virtually no CPU scaling at all. Even for Half-Life 2, it's only the 3000+ that holds it back slightly at 1280; anything 3500+ or above is great. At 1600, even the 3000+ does well. Look carefully at the graphs. A lot of them seem to be at 800x600 resolutions! Way to go, firingsquad! However, there are a lot of meaningful graphs as well. Look at Doom 3's results. That game stresses the GPU's shaders and memory bandwidth like no other, and I believe it is very representative of near-future games. FX-55 becoming a bottleneck? Sure, if you're playing Quake 3 at 300 frames per second. Give me any modern game running at an image quality (IQ) level that results in 60 frames per second on a 7800GTX, SLI or not, and show me that an FX-55 is a bottleneck. I'll eat my GTX. Let me give you my definition of a CPU bottleneck: when the highest IQ setting that can give you a smooth 60 fps (substitute with your favorite fps) is constrained by your CPU, you have a CPU bottleneck (a toy frame-time model of this is sketched after the last post below). If your favorite fps is close to 300, you'll likely run into CPU bottlenecks all the time.
  12. That site got the correct results, but drew the wrong conclusion!!! If your FPS remains the same regardless of what processor you use, this means your system is completely *GPU* limited (technically, I should say graphics-card limited, since the biggest bottleneck in the latest 7800 cards seems to be not the GPU core itself, but its anaemic 256-bit interface to the onboard GDDR3)! Any modern CPU (a 3.2GHz Pentium 4, an Athlon 64 3000+) should be powerful enough to fuel even a dual 7800GTX SLI setup in most games. Firingsquad recently conducted a CPU scaling test on BetaField 2 and drew the same conclusion -- the game was primarily GPU limited, not CPU limited. Unless, of course, for some weird reason you play at ridiculously low settings like 1024x768 without AA/AF, in which case obviously your GPU is going to sit and smoke a few cigars between every frame.
  13. Are you sure about this? I don't mean to sound like a smarta**, but I just want to make sure we're talking about the same thing. We're not talking about overclocking by changing the bus speed here; that's the "standard" way of overclocking since it's a technique that works on all processors. I'm talking about upping the clock multiplier that the bus clock gets multiplied by to produce the core clock, and my understanding was that on non-FX cores this multiplier is half-locked, in that you can only reduce the multiplier, not increase it. Edit: Did some googling and I found that there are some people who seem to have received Venice cores with fully unlocked multipliers. Sure doesn't seem to be the norm, though. Anyway, congratulations if you're one of them.
  14. Well, you can clock down on your Venice, but you won't be able to clock up.
  15. Yep. Like I said, higher clock speeds, unlocked multipliers, and unreasonable prices.
  16. The regular Athlon 64s also have the dual-channel on-chip memory controller. The difference between the regular and the FX is as follows:
      1. The FX is the highest-clocked A64. Originally, AMD intended to keep just one FX as part of their lineup, so when the FX-55 came about, the FX-53 was renamed as the Athlon 64 4000+. Unfortunately, AMD's dual-core nomenclature has completely messed things up, and so they could not rename the FX-55 even though the FX-57 is here.
      2. The FXs have their clock multipliers unlocked. What this means is that you are free to overclock the chip by twiddling with the multiplier, changing the factor by which the bus clock is multiplied to get the core clock (a quick worked example of this arithmetic appears after the last post below). Most processors (regular Athlon 64s, Pentium 4s) have their multipliers *locked*, which means that the only way to overclock your processor is to overclock the bus itself, which results in everything else connected to the bus (memory etc.) being overclocked as well.
      3. The FXs have ridiculous prices. Last I checked, the FX-57 was going for $1020. The Intel Extreme Editions have similar prices. That is simply unjustifiable. No matter how rich or foolish you are, that processor is not worth that much. It's not a matter of my opinion, it's just a simple fact: whoever buys an FX-57 today does not know what they are doing.
  17. Blizzard North's lead guys (I don't recall their names) left and formed Flagship Studios, and have been working on Hellgate: London. And yes, a bunch of others formed ArenaNet and gave us Guild Wars. A few months back, Blizzard had picked up the few remaining employees at North and relocated them somewhere else. These guys are believed to have been working on Diablo III. A couple of days back, Blizzard North was officially closed down.
  18. Absolutely. HL was revolutionary. Halo was a pretty nice game. That's all. Single player was decent. There are a TON of SP PC shooters that were much more fun. NOLF2. RTCW. Serious Sam. Each of these games had something unique to offer. What did Halo have to offer? Vehicles? Dual wielding? These are all very *shallow* gameplay elements. Minor novelties, nothing else. Granted, the game was pretty well-balanced. But I felt it was too "generic"; it did not give me anything new I hadn't seen before. As for multiplayer, I haven't played too much of it, so I can't comment. I have heard from several people, though, that UT2004 provides a far richer and more varied experience. Basically, Halo "revolutionized" the *console* shooter genre, which is why the series is so hyped up. Anyone who had played PC shooters before (especially ones of the calibre of HL) would come away mildly amused but primarily unimpressed. I speak only for myself, of course.
  19. One thing that I found great about Diablo II's SP mode was that the feeling of achievement as your character leveled up was excellent, and at the same time your character never got uber-powerful. I have no clue how Blizzard managed to achieve this, since it seems to be a very hard thing to do. It's more than balance -- if the monsters you're fighting are pretty much always at the same level as your PC, you'll never get that sense of achievement. A Level Up in that game often translated to a substantial increase in your power, and you could go on a killing spree for a while. Slowly, your effectiveness would decrease as you progressed through the more or less linear sequence of levels, until the next Level Up. Seems very simple, but few games do this as well as Diablo II. The skill trees were simply awesome, the items were great, and the randomized item generation was well implemented. It wasn't a pure RPG, but for what it set out to do, it did it marvelously well. -- Angshuman
  20. It's very likely that the Xbox is easier to program for. Unless, of course, Sony has created a magic tool that takes a single-threaded game and automagically performs optimal scheduling and allocation on the SPUs, which I highly doubt. Any programmer would agree that programming for a 3-way symmetric multi-core with a more or less traditional memory system is without a doubt easier to nail than programming for an 8-way heterogeneous multi-core with an unfamiliar cache hierarchy (a tiny illustration of the symmetric case follows this post). The Cell definitely looks like it packs more punch, but extracting all that juice out of it would take a lot of optimization-via-trial-and-error effort from traditionally trained programmers. Then again, Sweeney and his team supposedly came up with that magnificent demo in what -- 2 months? I honestly doubt that, but who knows... I could be completely wrong about everything. -- Angshuman
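
A rough sketch of the bottleneck argument from posts 11 and 12 (Python, with made-up frame times -- an illustration of the reasoning, not a benchmark):

    # Toy model (made-up numbers): each frame costs some CPU time and some
    # GPU time, and the slower of the two sets the frame rate.
    def fps(cpu_ms, gpu_ms):
        return 1000.0 / max(cpu_ms, gpu_ms)

    # High resolution with AA/AF: assume the GPU needs ~16 ms per frame.
    print(fps(5.0, 16.0))   # ~62.5 fps with a 5 ms/frame CPU
    print(fps(4.0, 16.0))   # ~62.5 fps with a faster CPU -- GPU limited
    # Drop to 800x600 so the GPU only needs ~3 ms, and CPU scaling appears.
    print(fps(5.0, 3.0))    # 200 fps, now the CPU is the bottleneck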
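Likewise, a quick worked example of the bus-clock x multiplier arithmetic behind posts 13 and 16 (hypothetical Athlon 64 numbers, assuming a 200 MHz reference clock and a 12x stock multiplier):

    # Core clock = bus (reference) clock x multiplier.
    bus_mhz, multiplier = 200, 12
    print(bus_mhz * multiplier)   # 2400 MHz at stock

    # Locked (non-FX) chip: the multiplier only goes down, so the only way
    # up is raising the bus clock, which drags memory etc. along with it.
    print(220 * multiplier)       # 2640 MHz, but the whole bus is out of spec

    # Unlocked FX chip: raise the multiplier and leave the bus alone.
    print(bus_mhz * 13)           # 2600 MHz with memory still at stock speed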
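And, for post 20, a minimal illustration of why the symmetric case is considered the easy one (Python threads standing in for the three identical cores; the per-frame work function is a made-up placeholder):

    from concurrent.futures import ThreadPoolExecutor

    def update_entities(chunk):
        # The same code runs on every core, and every thread sees the same
        # shared memory -- no per-core data shuffling required.
        return [e * 1.016 for e in chunk]   # stand-in for per-frame game work

    entities = list(range(9000))
    chunks = [entities[i::3] for i in range(3)]   # one slice per "core"

    with ThreadPoolExecutor(max_workers=3) as pool:
        results = list(pool.map(update_entities, chunks))

    # On a Cell-style part, the helper cores (SPEs) work out of small local
    # stores rather than sharing a cached view of main memory, so the
    # programmer must decide which core runs which kernel and explicitly
    # DMA each chunk in and out -- the extra scheduling and allocation
    # effort the post is talking about.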