Everything posted by angshuman

  1. - Cirrus Logic PCI VGA 1MB
     - SiS 6326
     - Intel i740
     - GeForce2 MX
     - Matrox Millennium G250
     - GeForce FX5200
     - GeForce 6200A *
     - GeForce 6600GT
     - GeForce 7800GTX *

     (* = currently used)
  2. In response to the scratching argument: Hades is right, CDs and DVDs are made of the same physical materials, and it is just as easy to scratch one as the other. You could argue that since data on a DVD is much more densely packed than on a CD, the probability that a physical scratch of a given size defeats the error correction codes is higher on a DVD. In practice, I don't believe there is any data supporting this hypothesis. Even if it were true to some extent, the order-of-magnitude advantage a DVD provides in storage capacity makes CDs a laughable option. The only argument you can make against DVDs is the substantial investment you need to make to purchase a DVD drive: $18.99 + shipping. Consider alternative uses for that kind of money: half a dinner at a nice restaurant, 2 movies at the theater, half of a PC game, 2 music CDs, a crappy keyboard, a set of horrible desktop speakers... the possibilities are endless. Why, you could even invest the money in some Google stock!
  3. DVD-ROM drive: $18.99 DVD+/-RW drive: $31.99 I don't see any logic behind not getting one.
  4. Three Dimensions: Wolfenstein 3D
  5. Ugh... that was incorrect, I'll be lambasted by graphics engine devs. What I meant was Shader Model 3.0. DX9 introduced two new shader models, 2.0 and 3.0. While current engines are based entirely on DX9, most features are SM2.0-specific. SM3.0 offers functionality that is a significant upgrade over SM2.0, including extremely long shaders, dynamic branching, and a bunch of floating-point formats (required for HDR). nVidia has supported SM3.0 since the 6-series, but ATi only introduced support for it in the X1x00 series. Most game devs do not want to leave users of the previous generation of high-end ATi cards in the dark, and therefore center their engines around SM2.0, with a couple of SM3.0 features here and there optionally enabled if a capable card is detected. Bottom line? An SM2.0 game will probably run a wee bit faster on NV GPUs, as will an SM3.0 game that only uses a couple of its features. An engine specifically designed around SM3.0 with heavy use of dynamic branching and long shaders will whoop a 7900's butt. However, the chances that any such engine will ever see the light of day are quite slim, since DX10 and SM4.0 are right around the corner.
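To illustrate the dynamic-branching point in the post above: a rough Python sketch (not shader code — the function names and per-path costs are made up for illustration) of why predication-style execution pays for both sides of a branch, while true dynamic branching only pays for the taken side.

```python
# Illustrative sketch only: the cost asymmetry behind SM2.0-style
# predication vs. SM3.0 dynamic branching. All names/values are invented.

def expensive_lighting(pixel):
    # Stand-in for a long, costly shader path.
    return pixel * 0.5 + 0.1

def cheap_ambient(pixel):
    # Stand-in for a short, cheap path.
    return pixel * 0.1

def sm2_predication(pixel, lit):
    # SM2.0-class hardware often evaluates BOTH sides and then selects
    # a result, so the expensive path is paid for every pixel.
    a = expensive_lighting(pixel)
    b = cheap_ambient(pixel)
    return a if lit else b

def sm3_dynamic_branch(pixel, lit):
    # SM3.0 dynamic branching executes only the taken side, skipping
    # the expensive math entirely for unlit pixels.
    if lit:
        return expensive_lighting(pixel)
    return cheap_ambient(pixel)

# Same result either way; the difference is how much work was done.
print(sm2_predication(1.0, False) == sm3_dynamic_branch(1.0, False))
```

In a real engine the "expensive" path might be a long per-pixel lighting computation, which is why a heavily branching SM3.0 engine could pull ahead on hardware with good branch performance.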
  6. That's a toughie. The X1900 has more shader power and way superior dynamic branching. It also lets you use AA and HDR at the same time. It's targeted at more "modern", DX9-ish games. Thing is, very few games use dynamic branching (or any other DX9 features) heavily. The 7900 has more brute force and therefore scores slightly better in most games out today. It's still not a huge difference.
  7. The glooow! The wonderful glowww!
  8. I disagree. The Pentium M is way superior to any mobile processor AMD has to offer. Agreed. If you want to game on your laptop, "Centrino" just won't cut it. (Centrino = Pentium M processor + Intel's Graphics + Intel's WiFi chip).
  9. IMO, when shopping for a PSU, the things to look for, in order of importance (assuming you've already narrowed your selection down to units that meet your minimum wattage requirement), are:
     1. Brand name (brand = quality, reliability, stability)
     2. Amperage rating on the +12V rail(s)
     3. Wattage (beyond your minimum requirement)
     4. Features like Active PFC, etc.

     You essentially get what you pay for. PSUs are very unglamorous products, so they can't be surrounded by marketing hype. If a PSU is expensive, it's probably because it has a bunch of expensive components inside.
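As a back-of-envelope for the +12V amperage point above: rail amperage converts to available wattage as P = V × I, summed over the rails. A minimal Python sketch (the rail figures are made-up examples, not real product specs):

```python
# Rough sanity check when comparing PSUs: how much of the label wattage
# is actually deliverable on the +12V rail(s)? Figures below are invented.

def twelve_volt_watts(rail_amps):
    """Combined +12V capacity in watts from per-rail amperage ratings."""
    return 12 * sum(rail_amps)

# A hypothetical "500W" unit with two 17A +12V rails:
print(twelve_volt_watts([17, 17]))  # 408
```

This is why a cheap unit with a big label wattage but weak +12V rails can still starve a modern GPU/CPU, which draw almost everything from +12V.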
  10. Saving a file doesn't count since any non-brain-damaged OS will stream it via the file cache. The other instances you mentioned are rare enough that I will prefer to spend the extra $100 on my graphics card or CPU. Of course, if you already have a top-end system then you might want to spend the extra cash on the HDD, but if I wanted a really top-end storage system (non-RAID), I'd probably pair up a small high end SCSI with a huge low end IDE. Different strokes.. Yay! (w00t) Can't describe the feeling when you power up your machine, knowing that the best graphics card money can buy is powering it. (w00t) (w00t) (w00t) You won't regret it!
  11. IMO if you care about HDD speeds you're probably going to the HDD too often, so you have other things to take care of first. ... If you still want a really uber drive for bragging rights, get a Seagate Cheetah (15K SCSI) :D
  12. Just making sure... you did plug in the auxiliary power connector, right? You wouldn't have had this on your 6200.
  13. That's an excellent card, Lady. The X1300 is a low-end card targeted at basic PCs, and is comparable to a GeForce 7300. The person who told you it was comparable to a 6800 didn't know what they were talking about, really.
  14. In this business, investing in the bleeding edge inevitably leads to some heartache a couple of months later. You have to accept that. At the same time, your reservations about Vista are also applicable to upcoming motherboards, CPUs, video cards etc. What if the next nVidia GPU turns out to be a disaster like the horrible FX series? Once you've spent the dough, there's really no point in thinking about what that money would've gotten you had you waited longer. The bottom line is that the stuff you have with you is damn good stuff, and well worth the $2K you spent on it.
  15. This is interesting, if you find a link could you post it?
  16. http://www.theinq.net/?article=31719 Inquirer FTW. :D Don't hold me to it, I said "Rumor"!
  17. Seconded. Nothing overclocks like a DFI Lanparty.
  18. "No, it doesn't." That's a counterintuitive, unsubstantiated claim! I want proof! :D
  19. Is that the "minimum" configuration or the "recommended" configuration? Typically, the so-called "recommended" specs actually end up being the bare minimum you need to run the damn thing at a framerate faster than a PowerPoint presentation.
  20. Rumors abound that the insane GeForce 7950 GX2 is going to retail for around $600 at launch. Since you're already over budget, you probably wouldn't be interested. Still thought I should mention it since you'll get almost twice (subject to final clocks of course) your current gfx processing for $110 more.
  21. That's because AMD have been using DDR1 until now. They are going to introduce Socket AM2 in a couple of months, and with it, DDR2 support. For the record, DDR2 does give you more bandwidth, but suffers from latency issues. For your day-to-day applications and games, latency seems much more critical. Only some heavy scientific benchmarks seem to benefit from DDR2 (correct me if I'm wrong about this please, I haven't hunted around much). At the same time, Intel's aging platforms perhaps did benefit from DDR2. Poor AMD! HyperTransport was doing perfectly fine with DDR1... They had to do a *crazy* amount of tweaking to get their DDR2-based system up to the performance of their DDR1 system. Edit: Some links... http://www.anandtech.com/cpuchipsets/showdoc.aspx?i=2738 http://www.anandtech.com/cpuchipsets/showdoc.aspx?i=2741
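For context on the bandwidth side of the post above: peak theoretical DDR bandwidth is just transfers per second times bus width. A quick Python sketch, assuming a standard 64-bit (8-byte) DIMM channel:

```python
# Back-of-envelope peak memory bandwidth: transfers/sec * bus width.
# Assumes a 64-bit (8-byte) channel, as on DDR/DDR2 DIMMs.

def peak_bandwidth_gbps(mega_transfers, bus_bytes=8):
    """Peak theoretical bandwidth in GB/s for a given MT/s rating."""
    return mega_transfers * bus_bytes / 1000.0

print(peak_bandwidth_gbps(400))  # DDR-400:  3.2 GB/s per channel
print(peak_bandwidth_gbps(800))  # DDR2-800: 6.4 GB/s per channel
```

The doubled peak number is real, but it says nothing about access latency — which is exactly why the day-to-day gains were so underwhelming at the time.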
  22. On the non-proprietary front, FB-DIMMs seem to be the next trend in memory architectures.
  23. Apparently modern GPUs aren't able to keep up with all the new objects the PhysX processes and throws out on screen. Either that, or the PhysX itself isn't able to cope with the physics computations devs are currently throwing at it. Bottom line is, plug a PhysX into your machine and your framerate goes down. Deceleration FTL.
  24. Eldar, if you are buying a processor *now*, there's no reason for you to even consider Intel. AMD have an unquestionably superior product at the moment. It's very different from the ATi-nVidia duel, where the differences are marginal at best. The only balancing factor is price - AMD dual cores are noticeably more expensive than Intel dual cores.