Everything posted by angshuman

  1. Eldar: I tried out the demo for a few minutes... not bad at all, but the game didn't quite suck me in (like the HL games do). Bok: Yeah, I felt Gamespot's Perfect Dark: Zero review was a bit... weird. Greg Kasavin went on and on and on and on and on about how awesome everything about the game was, and all the while it seemed he wasn't quite convinced about it himself. I guess they were under tremendous pressure from M$ since it was a launch title. What do you think of the game?
  2. Gamespot gives it a 7.5. Sounds like a pretty mediocre FPS. Link.
  3. Ninja Gaiden: Black. After finishing vanilla NG at normal difficulty, I briefly tried out the Hard and Very Hard levels and was somewhat disappointed. The game was exactly the same and the challenge, though greater, wasn't the nerve-wracking panic-inducing experience I was expecting. I went out and bought NG:B and started a game on Hard. Two words: Ass, Platter.
  4. From what I've heard, the problem with OpenGL is that the Architecture Review Board (ARB) is relatively slow at updating the API to keep up with the rapidly changing hardware. On the other hand, MS has an aggressive team that actually dictates to the hardware vendors the kind of features they want implemented. I've tried my hand at both DX (7-era, pre-shaders) and OpenGL coding and personally I found DX to be a shoddy joke of an API. OpenGL was far more elegant, although I could honestly never get the hang of either of them - they were both too frighteningly stateful. I've heard that the new DX versions (esp. DX10) have cleaned up a lot of the garbage, most of which was supposed to have come from DX's COM origins.
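     To show what I mean by "frighteningly stateful", here's a rough sketch of pre-shader, fixed-function OpenGL from memory (C-style C++; all of these are real GL 1.x entry points, but take the exact sequence as illustrative):

         // Every call below mutates hidden global state; forget to reset one
         // flag and some completely unrelated draw call later on breaks.
         #include <GL/gl.h>

         void draw_textured_triangle(GLuint tex)
         {
             glEnable(GL_TEXTURE_2D);            // global toggle
             glBindTexture(GL_TEXTURE_2D, tex);  // global binding point

             glMatrixMode(GL_MODELVIEW);         // selects the global "current matrix"
             glPushMatrix();
             glTranslatef(0.0f, 0.0f, -5.0f);

             glBegin(GL_TRIANGLES);
             glTexCoord2f(0.0f, 0.0f); glVertex3f(-1.0f, -1.0f, 0.0f);
             glTexCoord2f(1.0f, 0.0f); glVertex3f( 1.0f, -1.0f, 0.0f);
             glTexCoord2f(0.5f, 1.0f); glVertex3f( 0.0f,  1.0f, 0.0f);
             glEnd();

             glPopMatrix();
             glDisable(GL_TEXTURE_2D);           // omit this and later draws inherit the texture state
         }

     DX7-era code was the same idea, just with extra COM boilerplate on top.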
  5. I don't understand what Sony was trying to achieve by going to a public magazine and whining about its competitor... doesn't make much business sense.
  6. /signed. In fact, sign me up for any game that supports a Linux client.
  7. For the bulletproof reliability you are talking about, you probably want to go RAID. As for manufacturers, I'm not sure who makes the most reliable consumer-class hard drives. In the enterprise world, Seagate's SCSI drives seem to be the best. I used to swear by Seagate for consumer-class drives too, until my roommate's Seagate suffered a head crash. One thing I will definitely vouch for: if you put your HDD in a USB enclosure and lug it around wherever you go, it WILL crash. These things were not built for mobility, regardless of what their shock-tolerance specifications (in Gs) might say.
  8. Haha! I do that too. :D
  9. BFG >>>>>>>>>>>>>>>>>>>>>>>> ASUS. And congrats on your new card!
  10. (w00t) MS-DOS FTW. Anyone play Digdug?
  11. Tetris at high levels gives me more ulcers than Real Life.
  12. Gamespot had done an editors' choice version of this a few months back. Here are the Readers' Choices.
  13. I too dislike this approach to "harden" a game. Creating large spawn groups, increasing creatures' HP beyond reasonable amounts, reducing medpack spawns, etc. make a game more tedious, not more challenging. Challenge comes from superior enemy AI, better enemy aim, friendly fire in team-based games, increased potency of simple weapons for both you and the enemy, tighter time limits for timed missions, and so on (rough sketch of what I mean below). I guess the single most important factor is enemy AI, something that has shown only marginal improvement over the years. Allan, we seriously need you to get into the industry!
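      As a hypothetical config sketch (made-up struct and numbers, purely to illustrate the idea): scale the AI's competence and the mission constraints, and leave enemy HP alone.

          // Hypothetical difficulty presets: harder = smarter/faster enemies and
          // tighter constraints, NOT spongier enemies.
          struct DifficultyPreset {
              float enemyAccuracy;     // chance an enemy shot lands, 0..1
              float reactionTimeSec;   // how quickly enemies respond to the player
              bool  friendlyFire;      // on in team-based games
              float missionTimeScale;  // 1.0 = normal, <1.0 = tighter time limits
              float enemyHpScale;      // kept at 1.0 on purpose
          };

          static const DifficultyPreset NORMAL = { 0.30f, 0.80f, false, 1.00f, 1.0f };
          static const DifficultyPreset HARD   = { 0.55f, 0.40f, true,  0.75f, 1.0f };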
  14. Meta, I believe both the 7900GTX and the 7950GX2 use 90nm technology. The 7800s used 110nm. GPU manufacturing technology usually runs one generation behind CPU technology. Even for CPUs, only the newest Intel processors are being fabbed on 65nm, with AMD's upcoming AM2 still using 90nm. It will be at least a year before GPUs switch to 65nm. Edit: The difference between the GTX and GX2 chips is the bin; the GTX chip comes from the absolute top bin, the crème de la crème. Very few chips can be clocked that high. The GX2's chips are clocked lower, so yields are much higher. This is why GX2's are so competitively priced. A single GX2 chip's effective manufacturing cost is quite possibly about half that of a single GTX chip (rough numbers below).
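      Back-of-the-envelope with completely made-up numbers (wafer cost and bin splits are illustrative, not NVIDIA's actual figures, and it ignores that lower bins get sold as other SKUs), just to show why the bin dominates effective cost:

          #include <cstdio>

          int main() {
              const double waferCost  = 5000.0; // assumed cost per processed wafer
              const int    goodDies   = 100;    // assumed working dies per wafer
              const double gtxBinFrac = 0.15;   // fraction of dies hitting GTX clocks
              const double gx2BinFrac = 0.30;   // fraction usable at the lower GX2 clocks

              // Crude per-die cost if that bin alone had to pay for the wafer.
              std::printf("GTX-bin die: $%.0f\n", waferCost / (goodDies * gtxBinFrac)); // ~$333
              std::printf("GX2-bin die: $%.0f\n", waferCost / (goodDies * gx2BinFrac)); // ~$167
              return 0;
          }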
  15. If you are planning to buy a leet card right away (either the GX2 or the GTX), you should perhaps try to sell it about 1 month before the launch of the 8 series. Also, you might want to wait for about a month after launch before purchasing the 8 series card itself so that prices can stabilize. This means you'll need a temporary card for those 2 months, but you'll likely end up saving a lot. Oh, and Meta, glad you share my sentiments about the GX2. :D
  16. I've never bought exotic RAM, but I've heard lots of good things about Corsair XMS. I'm guessing you are planning to overclock a lot? Otherwise there's not much point in buying such classy stuff... Corsair ValueSelect can be had for about half that price.
  17. 10K, question for you: you say Cell can push graphics better than anything else. Why did IBM design the Cell with graphics in mind? Isn't that RSX's job? Actually, I can think of two explanations. (1) Vertex processing is going to be done on the SPEs, with RSX used only for pixel and texture operations. If RSX were a unified architecture (a la Xenos) this would be a far more convincing argument, but it's a glorified G70, so it has hard-wired vertex units. Can the SPEs really process vertex transforms faster than those units? (2) Conspiracy theory: the PS3 was never meant to have a dedicated GPU. The Cell(s) was/were supposed to be everything, with the PPE handling control-intensive tasks and the SPEs handling data-intensive jobs (graphics). My guess is that the original idea was to have not one but an entire network of Cells as the processing core. During the later stages of design, IBM figured out it was consuming too much power and not performing well enough, so a network of Cells was out of the question. Trouble is, a single Cell couldn't do graphics as fast as a dedicated GPU. In a panic, Sony rushed to Nvidia. This also explains why Nvidia couldn't design a dedicated Xenos-like GPU for them: they just didn't have the time. Comments?
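      For reference, this is the kind of work we're talking about offloading to the SPEs: a plain C++ sketch of a batched vertex transform, i.e. a streaming, data-parallel loop. Real SPE code would use SIMD intrinsics and DMA vertex batches into local store; this is just the shape of the problem, not actual Cell code.

          // Multiply every vertex in a batch by a 4x4 model-view-projection matrix.
          // This is what either a GPU vertex unit or an SPE would chew through.
          struct Vec4   { float x, y, z, w; };
          struct Mat4x4 { float m[4][4]; };

          void transform_vertices(const Mat4x4& mvp, const Vec4* in, Vec4* out, int count)
          {
              for (int i = 0; i < count; ++i) {
                  const Vec4& v = in[i];
                  out[i].x = mvp.m[0][0]*v.x + mvp.m[0][1]*v.y + mvp.m[0][2]*v.z + mvp.m[0][3]*v.w;
                  out[i].y = mvp.m[1][0]*v.x + mvp.m[1][1]*v.y + mvp.m[1][2]*v.z + mvp.m[1][3]*v.w;
                  out[i].z = mvp.m[2][0]*v.x + mvp.m[2][1]*v.y + mvp.m[2][2]*v.z + mvp.m[2][3]*v.w;
                  out[i].w = mvp.m[3][0]*v.x + mvp.m[3][1]*v.y + mvp.m[3][2]*v.z + mvp.m[3][3]*v.w;
              }
          }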
  18. I'll have to agree wholeheartedly with that. As much as I've enjoyed the game, the storyline was pretty bad. The overall plot might have had some potential, but the dialogue felt like it was written by a kid. The voice acting was also absolutely abysmal. Fortunately, you can skip cinematics. The excellent gameplay makes up for these shortcomings, but it does leave you with a bitter taste in your mouth. Perhaps due to the absence of monthly fees, Anet didn't have enough cash left over to hire real voice actors and got their programmers to voice the dialogues they had themselves written.
  19. Gamespot's Analysis
  20. Forgive me, but I had to make that slight bugfix.
  21. The graphics engine is very inefficient. The system requirements are immense, yet it looks horrible. It shipped with a graphics setting that only "future" cards were supposed to be able to run, and that was quite true at the time, but contemporary cards' inability to run it came not from fantastic-looking visuals but from an underoptimized graphics engine. IMO the artwork was also terrible, but that's a subjective issue.