Everything posted by angshuman

  1. Ravenholm is awesome ^^. Especially in the HL2 speed-run :D. To answer your previous question, no, there is no HDR in any HL2 artwork except Lost Coast. The engine has the capability, but you need to create HDR textures etc. and Valve figured it would be too much work to upgrade the textures in HL2. The new pay-to-play "episodes" should all feature HDR assets.
  2. Source rocks, and so do Valve texture artists. Just look at the photorealistic quality of textures in that second screenshot!
  3. I used to have this problem with my 6600GT while playing HL2. However, in my case it seemed to be completely random. I don't believe my card was overheating. My best guess was that either some pipes in the GPU or some of the memory modules were buggy and couldn't take the high clock speeds. Eventually my card blew up. I don't know if the blowing up had anything to do with this problem, though.
  4. Congrats! And no. :D
  5. Aye, it makes you look like a pro. :cool: I always wear mine when I'm assembling someone else's machine in front of them.
  6. Hooking up all those itsy-bitsy wires from the case (USB, audio, power, HDD, etc.) onto the motherboard is the most annoying part. In general, wiring up is annoying. Installing Windows/Linux is a close second.
  7. Very nice philosophy. I always keep something like this at the back of my mind while shopping for parts. Balance between a system's components not only has a certain aesthetic appeal, it also has technical significance, since it implies that none of the components is bottlenecking your system. I would agree with Bokishi and go ahead and recommend a vanilla 6600. You should be able to play most modern titles, but you will likely have to reduce quality settings to minimal levels.

     The performance of your video card is determined by three factors: (a) GPU performance, (b) memory performance, and (c) amount of memory. Of these, GPU and memory performance have a roughly linear impact on user-perceived performance. Amount of memory works very differently: if the data in the scene you are observing fits into your VRAM, gameplay will be smooth, but if it does not, you will perceive occasional jolts as new data is swapped in. Most modern games fit well into 128MB, and swaps are uncommon enough to go unnoticed.

     (a) GPU performance is proportional to (i) the width of the GPU, i.e., the number of pixel pipelines, and (ii) the clock speed (MHz) of the GPU. Simply multiply the two and you have a very crude idea of the level of performance you can expect from the GPU.

     (b) Memory performance is proportional to (i) the width of the bus between the GPU and the memory, typically 64, 128, or 256 bits, and (ii) the clock speed (MHz) of the memory. Multiplying the two gives a more or less accurate indication of your video RAM performance. GDDR3 memory is manufactured using expensive processes that allow it to be clocked at very high frequencies, but as an end user you need not care whether the memory is DDR or GDDR3; what matters is the memory's clock speed and the width of the memory interface.

     To see the kind of tradeoffs possible with these parameters, compare two Nvidia cards that offer about the same level of performance but have significantly different configurations (worked out in the sketch after this post):

     6600GT : GPU width = 8, GPU clock = 500MHz, Mem width = 128, Mem clock = 1000MHz (GDDR3)
     6800NU : GPU width = 12, GPU clock = 350MHz, Mem width = 256, Mem clock = 700MHz (DDR2)

     Although Nvidia initially priced the 6800NU at a much higher slab than the 6600GT, the prices eventually leveled off. What's interesting is that the 6600GT, due to its GDDR3 memory, could only be found in 128MB configurations, since adding more would raise the cost prohibitively; the 6800NU, however, could be found in both 128MB and 256MB avatars, since piling on el-cheapo DDR2 memory is easy. At any rate, very few modern games will actually bring out the difference between a 128MB and a 256MB card; they simply do not have large enough datasets under reasonable quality settings. While shopping for a video card, look for GPU and memory performance; any amount of memory at or above 128MB should be fine.
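     A back-of-the-envelope version of the width-times-clock heuristic above, as a Python sketch. The card figures are the ones quoted in the post; the scores are only the crude relative indicators described there, not benchmark numbers:

         # Crude heuristic from the post: multiply a GPU's pixel pipelines by its
         # core clock, and its memory bus width by its (effective) memory clock.
         cards = {
             # name: (pixel pipelines, core MHz, bus width bits, memory MHz)
             "6600GT": (8, 500, 128, 1000),   # GDDR3
             "6800NU": (12, 350, 256, 700),   # DDR2
         }

         for name, (pipes, core, bus, mem) in cards.items():
             gpu_score = pipes * core              # fill-rate proxy, Mpixels/s
             bandwidth = bus / 8 * mem / 1000      # memory bandwidth, GB/s
             print(f"{name}: GPU score {gpu_score}, ~{bandwidth:.1f} GB/s")

     Running it gives 4000 / 16.0 GB/s for the 6600GT and 4200 / 22.4 GB/s for the 6800NU, which is why the two land in roughly the same performance class despite their very different configurations.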
  8. Is that an ENIAC game?
  9. Agreed, but the physics effects currently used by HL2/Havok are pretty much gimmicks and (IMO) do not add substantially to the overall gameplay experience. I wonder whether Havok was designed with enough forethought to truly harness the capabilities of a dedicated PPU beyond providing a cool gravity-gun-like toy.
  10. Those buggers scared the sh*t outta me when I first encountered them.
  11. Sure, I wasn't referring to your post in particular. It just seemed to me that the widespread obsession with 256MB was creeping up again. ... This particular card really does sound like an awesome deal. It most certainly is a unique card. I had never seen a vanilla 6800 coupled with GDDR3 memory before. And the price is ridiculously good. (w00t)
  12. What he said. Best Buy are thugs. That 6600 priced at $184 is a joke; it's not even worth half that. And for heaven's sake, PLEASE DO NOT GO AFTER 256MB!!!!! You can read my countless previous posts on the matter. The bottom line is that a 6600GT with 128MB of GDDR3 will blow a 6600 with 256MB or 1GB or 200000 BAJILLION JIGGA BYTES out of the water. The $150 6600GT at Best Buy seems somewhat reasonably priced. I remember Newegg had a 6600GT at around $99 a couple of months back.
  13. OMG it's the clone army!
  14. That was with a 720p WMV stream downloaded from Microsoft's hi-def showcase. Correct me if I'm wrong, but I believe DVD is no better than standard def (480p)? (Quick pixel math below.)
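     A quick sanity check on that, as a Python one-off; the resolutions (NTSC DVD at 720x480, 720p at 1280x720) are standard figures I'm assuming, not from the post:

         # Per-frame pixel counts: DVD vs the 720p WMV stream.
         dvd = 720 * 480        # NTSC DVD: 345,600 pixels
         hd_720p = 1280 * 720   # 720p:     921,600 pixels
         print(f"720p carries {hd_720p / dvd:.2f}x the pixels of DVD")  # ~2.67x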
  15. So, to sum up:
      - 729MHz CPU
      - 243MHz GPU
      - 88MB total system RAM
      - System-on-chip design

      Some revolution. Each unit will probably cost them $20 to manufacture. They can sell them for $200 each. For the first time in history, a console maker is actually going to make profits from selling the console units themselves. *That's* Nintendo's next-gen strategy?
  16. The screenshots look absolutely gorgeous. I like the bloom effect, as long as it is kept subtle and not over the top *cough*Fable*cough*.
  17. Windows MCE 2005 requires at least a GeForce FX5200 for accelerated standard-definition video playback. Not that Windows MCE is the be-all, end-all of media center front-ends, but it certainly is one of the most popular. The FX5200 takes a significant fraction of the load off the CPU, and these cards typically ship with very good TV-output encoders (contrary to the general perception that ATi's TV encoders are superior). Most pre-GeForce4MX nVidia cards, however, had very crappy TV encoders, and the CPU had to shoulder the entire decoding burden. For accelerated hi-def playback, you are going to have to move to 6600-class hardware. I have an FX5200 + Sempron64 based media center PC (std. def, of course), and the picture quality is fabulous (compared to my laptop's Radeon 7500 output). Although I have never actually measured CPU utilization, I have never had any problems or hiccups decoding streams at up to 720p resolution.
  18. 1.4GHz GDDR3 memory is freaking expensive. More than 128MB would likely escalate 7600GT prices to unacceptable levels. Don't just look at the amount of memory. Even 1GB of el-cheapo DDR memory used on 6200s and 6600s is probably a lot cheaper than the high-end 128MB on the 7600GT.
  19. Yes, you are correct. The 6800s have a 256-bit memory bus, while the 7600GT has a 128-bit bus. Further, this bus is fed via 8 ROPs in the 7600GT, while the 6800 has 16 ROPs (same as the 7800s). Memory bandwidth and ROP count matter because they are the critical factors determining the performance hit from antialiasing. These two factors, coupled with the number of shaders on the two GPUs, effectively make the 7600GT a much narrower processor than, say, the 6800GT. However, this narrow width and the 90nm process technology let nVidia clock the 7600GT at very high speeds, making up for the lost performance while keeping manufacturing costs much lower than the 6800's. At high resolutions and AA settings, the 7600GT's memory bandwidth will become more and more of a bottleneck, but at that point it's possible that neither GPU will be giving you a playable frame rate. (Rough numbers in the sketch below.)
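     To put rough numbers on the AA-headroom argument, here's a sketch in the same width-times-clock spirit as earlier in the thread. The bus widths and ROP counts are from the post; the clock figures are commonly quoted specs for these cards and should be treated as my assumptions:

         # AA headroom proxies: memory bandwidth (bus bits x effective memory MHz)
         # and peak ROP throughput (ROPs x core MHz).
         cards = {
             # name: (ROPs, core MHz, bus width bits, effective memory MHz)
             "7600GT": (8, 560, 128, 1400),
             "6800GT": (16, 350, 256, 1000),
         }

         for name, (rops, core, bus, mem) in cards.items():
             bandwidth = bus / 8 * mem / 1000   # GB/s for framebuffer traffic
             rop_rate = rops * core             # peak Mpixels written per second
             print(f"{name}: ~{bandwidth:.1f} GB/s, {rop_rate} Mpix/s ROP rate")

     The 6800GT comes out ahead on both counts (~32.0 GB/s and 5600 Mpix/s vs ~22.4 GB/s and 4480 Mpix/s), which is exactly the resource gap that shows up once you crank up resolution and AA.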
  20. O RLY? YA RLY
  21. Does anybody know exactly how unemployment insurance works? :ph34r:
  22. A friend of mine from work used to say: "I feel sorry for the guys at Creative and other sound-card vendors. If they hope to make as much progress as those in the video card business have been making, they're going to have to wait for human evolution to catch up."
  23. Yes, that's a lot more reasonable, although I suspect you'd have to overclock the bejesus out of the 6200s to get them to that level. But the thing is, for the price you'll pay for a pair of 6200s, you could easily get a 6600GT, and in all likelihood a stock 6600GT will outperform a reasonably overclocked pair of 6200s (see the sketch below). Which is exactly the point I was trying to make: in this business, you get what you pay for. SLI does NOT upset that equation, even if you throw overclocking into the mix.
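     For the curious, here's the crude width-times-clock heuristic from earlier in the thread applied to this case. The 6200 specs (4 pipelines at a 300MHz stock clock) and the 80% SLI scaling factor are my assumptions, not figures from the post:

         # Width x clock proxy: one stock 6600GT vs two modestly overclocked 6200s.
         SLI_EFFICIENCY = 0.8  # assumed; SLI rarely scales perfectly

         def score(pipes, core_mhz):
             return pipes * core_mhz  # crude fill-rate proxy

         single_6600gt = score(8, 500)                      # 4000
         oc_6200_pair = 2 * score(4, 350) * SLI_EFFICIENCY  # 2240 (350MHz = modest OC)

         print(f"6600GT: {single_6600gt}, 2x 6200 (OC, SLI): {oc_6200_pair:.0f}")

     Even granting the pair a healthy overclock and decent SLI scaling, the single 6600GT comes out well ahead on paper.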