Everything posted by angshuman

  1. Yes, that's how the 7-series GPUs happen to be configured and sold by Nvidia. The 7800GTX has 24 pipes, the 7800GT has 20, the 7600GT has 12 (was it 16?), etc. The 6-series was configured as follows: 6800 Ultra = 16, 6800 GT = 16, 6800 = 12, 6800XT = 8, 6600GT = 8, etc. The NV40 and NV45 cores had a maximum of 16 pipes, whether or not all of them were enabled. Of these, only the AGP 6800 had masked pipes. ASUS also tried to dupe a few customers by selling a "6800 GTO" which was essentially an overclocked 6800 and nowhere close to a real GT. Gainward sometimes does the reverse, i.e., sells a higher-end GPU as a lower-end "golden sample". However, these are extremely rare occurrences. Gah! We're wasting precious forum space over a silly argument. Bottom line, as we all know, is that masked pipes do exist, and it doesn't hurt to download RivaTuner and give it a shot. Taks, I think you somehow managed to get hold of a 7800GS badged as a 6800GT!
  2. Both the Ultra and the GT have 16 *functional* pixel shaders and 16 ROPs each, in both AGP and PCI-e form. Some references: AGP 6800 Ultra vs. GT, PCI-E 6800 Ultra vs. GT, Tom's VGA Chart. I have no idea how you managed to turn on 2 extra pipes on your GT. Did you buy the card recently? Maybe you managed to get hold of an uncommon card from somewhere? A 7800 labeled as a 6800, perhaps? Could you give the detailed un-modded specs of your card - core clock, memory clock, vertex shaders, pixel shaders and ROPs? Did you go from 12 --> 14, 14 --> 16, or 16 --> 18?
  3. Check out these guides.
  4. IIRC the 6800GT and Ultra had the same number of pipelines. The only differences were the memory and core clock speeds (I think the DAC speeds are the same too. In fact, I believe the DAC speeds are the same across *all* video cards and have nothing to do with performance. Isn't the DAC simply a converter that generates the final RGB signal? If you're using a DVI connection the DAC is probably not used at all. I could be wrong, though - the rough pixel-clock sketch below is why I think DAC speed is a non-issue.) As far as unlocking pipelines goes, I think you're talking about the AGP version of the "vanilla" 6800 (also known as the 6800nu), which had 4 masked pipelines that you could unlock to bring the GPU core (but not the memory) up to 6800GT specs. @Arkan: Take a look at the screenshots in the nvnews.net thread I linked to. Regular antialiasing cannot handle aliasing resulting from transparency textures (typically used in fences, trees, grass, ladders, etc.). If you have HL2, go to the bridge level in Highway 17 and you'll see what a difference TAA can make.
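To put a rough number on the DAC point, here's a back-of-the-envelope sketch in Python of the pixel clock an analog VGA output actually needs for a few demanding CRT modes, compared against the 400MHz RAMDACs that are standard on current cards. The blanking overheads are ballpark assumptions, not exact VESA/GTF timings.

```python
# Rough sketch: pixel clock needed to drive an analog (VGA) output.
# The blanking factors are ballpark assumptions, not exact VESA/GTF timings.

def required_pixel_clock_mhz(h_active, v_active, refresh_hz,
                             h_blank_factor=1.35, v_blank_factor=1.04):
    """Approximate pixel clock in MHz for a given display mode."""
    h_total = h_active * h_blank_factor   # active pixels + horizontal blanking
    v_total = v_active * v_blank_factor   # active lines + vertical blanking
    return h_total * v_total * refresh_hz / 1e6

for mode in [(1280, 1024, 85), (1600, 1200, 85), (2048, 1536, 75)]:
    clk = required_pixel_clock_mhz(*mode)
    print(f"{mode[0]}x{mode[1]} @ {mode[2]}Hz -> ~{clk:.0f} MHz pixel clock")
```

Even 2048x1536 @ 75Hz works out to roughly 330MHz, comfortably inside a 400MHz RAMDAC, which is why every card ships with essentially the same DAC and it never shows up as a performance differentiator. Over DVI the RAMDAC is bypassed entirely.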
  5. As a lot of people have suspected for a long time, the 6-series GeForce cards were always capable of performing Transparency AA. Nvidia had kept the feature locked away in the drivers as a 7-series exclusive. It seems there were a couple of hacks via RivaTuner, etc., through which it was possible to turn the feature on, but it is now available through official Nvidia drivers. http://nvnews.net/vbulletin/showthread.php?t=74790
  6. A "core" is essentially a full processor. Before the advent of multi-cores, if you wanted to have multiple processors in your system, one of the most popular options was Symmetric Multi-Processing (SMP), where you buy a motherboard with multiple sockets into which you can plug multiple physical chips, with each chip containing one processor and some cache memory. The chips would be connected to main memory via a system bus. Unfortunately, such configurations were primarily reserved for servers, and both multiple-socket motherboards as well as SMP-capable processors were (and still are) quite expensive. Today's "multi-core" chips (formally called Chip Multi-Processors or CMP's) contain multiple processors (or "cores") and cache memories inside a single chip. The primary advantage to the end-user such as you and I is cost and simplicity. With a single-socket motherboard and at the price point of a single-chip system, you can get the performance of an SMP system. High-throughput servers can use CMP's with multi-socketed motherboards to create "SMP-of-CMP" systems. As far as the Operating System and application softwares are concerned, there is very little difference between a dual-processor CMP ("dual-core") and a dual-processor SMP. There are performance issues, however. Communication Latency: On an SMP, the processors are physically separated over a large distance. On a CMP, they are much closer together, so communication latencies are much lower. However, Intel's original CMP implentations (Pentium D) were extremely hacked-up. Inter-processor communication first had to go off the chip, get onto the system bus and then re-enter the chip into the other core. This offset pretty much all of the performance benefits of a CMP. In fact, the second iteration of the Pentium D (Presler) was not a CMP at all - it was actually a multi-chip module (MCM): 2 distinct chips crammed into a single package and made to fit into a single Socket. Bandwidth: On a CMP, both cores have to share the same set of pins to get data on and off the chip to/from main memory. This could be devastating for bandwidth-intensive applications. In practice, there's not much of a difference between bandwidth contention of a CMP and an SMP on an antiquated Intel-style "shared-bus" platform, where both processors need to get onto the same bus to get to main memory regardless of whether they are on the same chip (CMP) or on multiple chips (SMP). AMD's Hyper Transport, however, is an entirely different animal. Multiple chips on an SMP will have access to a lot more bandwidth than multiple cores on a CMP. With both Intel and AMD planning to introduce Quad-Cores, my spider-sense tells me that bandwidth contention is likely to be a huge performance issue that is going to limit the peak performance of these systems. Cache Sharing: Local on-chip cache memory can be shared between the multiple cores of a CMP. This is extremely beneficial if the applications running on the different cores have different cache space requirements. This is impossible on an SMP. Unfortunately, this is also not exploited on either the Pentium D or the Athlon64 X2. The on-chip L2 caches are hard-partitioned between the two cores on both these processors. Conroe and Merom are the first CMPs to feature shared L2 caches, and this should have a significant impact on performance for some applications. I believe that both AMD's HyperTransport as well as Intel's brain-damaged FSB have sufficient bandwidth to support dual cores. 
With 4 cores on a single chip, I'm not so sure. Intel's blazing Conroes will without doubt be gasping for data in quad-core form, unless you are dealing with an excellently-written application that manages to restrict a large part of its communication within the chip. AMD's platform probably has more room to play with, but the beauty of HyperTransport is lies in its scalability with multiple sockets, so if you have plenty of cash and want a lot of processors, it is better to go with more sockets than more cores-in-a-socket.
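To make the bandwidth point concrete, here's a minimal back-of-the-envelope sketch in Python. The 10 GB/s per-socket figure is an illustrative assumption, not a benchmark of any real platform; the point is the shape of the result, not the absolute numbers.

```python
# Sketch: per-core memory bandwidth when adding cores vs. adding sockets,
# assuming a NUMA/HyperTransport-style platform where each socket brings its
# own integrated memory controller. The bandwidth figure is an assumption.

PER_SOCKET_BW_GBS = 10.0  # GB/s per socket's memory controller (illustrative)

def per_core_bandwidth(sockets, cores_per_socket):
    total_cores = sockets * cores_per_socket
    total_bw = sockets * PER_SOCKET_BW_GBS  # each socket adds its own memory channels
    return total_bw / total_cores           # cores in one socket share that socket's pins

for sockets, cores in [(1, 2), (1, 4), (2, 2), (4, 1)]:
    bw = per_core_bandwidth(sockets, cores)
    print(f"{sockets} socket(s) x {cores} core(s) -> ~{bw:.1f} GB/s per core")
```

Cores added to a socket dilute that socket's bandwidth, while extra sockets bring their own memory channels along. On an Intel-style shared-FSB platform, adding sockets doesn't even help, since every processor still funnels through the same bus - which is why quad-core looks scary for bandwidth on both counts.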
  7. I hadn't heard about this player earlier, but I looked it up after reading your post. It definitely looks like an interesting project, and it's cross-platform to boot. I read some of its documentation, and it seems the only upscaling method it supports is bicubic resampling. The video processing community swears by Lanczos resizing, and I can personally attest to its quality (though it is a *massive* resource hog). Start with some noise removal, do a Lanczos resize, and add a slick sharpening filter like LimitedSharpen, and you've achieved some really high-quality upscaling (there's a minimal offline sketch of the idea below). The best part is that pretty much all of the software you'll need for this is open source (GPL/LGPL). In case you're interested in giving it a shot, let me know and I'll post links to some excellent online tutorials and fora. Quick question (and I know mkreku will say I'm nuts): did any of the LCDs you tested happen to have TN panels? TN panels will kill your color, contrast and viewing angles *and* give you the scaling-to-native problems, but a true 8-bit panel (PVA, S-PVA, S-IPS) will at least get the richness of your colors somewhat close to a CRT, and the only things left to deal with will be the aliasing and native-resolution issues. Bottom line: 8-bit LCD panel + FFDShow + AviSynth + a fast CPU = excellent video quality. I will admit, however, that I have absolutely no idea whether the end result actually gets anywhere close to CRT quality. Games appear sucky only if you do not play at your LCD's native resolution (usually the highest resolution your monitor supports). However, do keep in mind that I am pathologically allergic to "teh jaggies". Several of my friends do not find teh jaggies distracting, and a lot of them aren't bothered by non-native LCD resolutions either. Me, I feel as if every atom in my eye is being excruciatingly plucked out one by one when I am looking at a screen full of teh jaggies. An LCD panel running at non-native resolution, to me, feels like teh jaggies with quad-damage.
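Since I mentioned the Lanczos-plus-sharpen chain above, here's a minimal offline sketch of the idea in Python using the Pillow imaging library, operating on a single extracted frame. The filename frame.png is just a placeholder, and UnsharpMask stands in for a LimitedSharpen-style pass; the real-time version of this lives in FFDShow/AviSynth, not Python.

```python
# Minimal sketch: bicubic vs. Lanczos upscaling of one extracted video frame.
# "frame.png" is a placeholder; Pillow is assumed to be installed.
from PIL import Image, ImageFilter

src = Image.open("frame.png")                 # e.g. a 640x480 frame grab
target = (1280, 960)                          # 2x upscale toward native panel size

bicubic = src.resize(target, Image.BICUBIC)   # what most players do by default
lanczos = src.resize(target, Image.LANCZOS)   # windowed-sinc resampling

# A mild unsharp mask stands in for a LimitedSharpen-style finishing pass.
lanczos_sharp = lanczos.filter(
    ImageFilter.UnsharpMask(radius=1.5, percent=60, threshold=2))

bicubic.save("frame_bicubic.png")
lanczos_sharp.save("frame_lanczos_sharp.png")
```

Compare the two outputs at 100% zoom and the Lanczos-plus-sharpen version should hold edges noticeably better; doing that for every frame in real time is exactly why the AviSynth/FFDShow chain eats CPU for breakfast.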
  8. Quote: "I would go for the 20" without a second thought. Widescreen is for televisions, not monitors, because the only thing they're good for is watching widescreen films, and film on an LCD screen is badness because it completely screws up the quality unless the film happens to be in the exact same resolution as the screen's native resolution (which it won't)."
I'd go for the 20" too, although I'm a big fan of widescreen. A 19" widescreen, however, is way too small. If you are used to a regular 19" display, you might find the height of a 19" widescreen a little too small. As for the LCD native-resolution thing, I feel it doesn't affect live-action movies as much as it affects video games and anime movies. This is especially true if you use software postprocessing. A decent set of scaling filters (Lanczos resize via FFDShow + some fancy AviSynth sharpeners) can work wonders if you have the processing power to run them in real time. Once you software-resize the videos to native resolution, they look absolutely stunning. This is for up-scaling; I've never found down-scaling to be an issue at all. 1080p videos appear *perfect* on my 720p monitor without any processing.
  9. With my brand-spanking new Toledo X2 4800+ (which I have yet to install), I believe my system has finally reached a stable and balanced state and does not have any significant bottlenecks anywhere. I don't think I'll be upgrading this system again. About six to eight months down the line, I'll try to sell the whole thing off - monitor and all - and build a new system from the ground up.
  10. I hadn't seen that before. I almost fell off my chair!
  11. The Great HDCP Fiasco. The article primarily complains that none of our existing hardware (monitors and video cards) will be able to play HD-DVDs/Blu-Rays with copy-protection (which means effectively all commercial HD-DVDs). I frankly don't care; I didn't really expect my year-old video card to be able to play these high-definition videos at reasonable framerates anyway. What bothers me is that the "DRM ecosystem" I was talking about in my previous few posts seems to be coming alive. It seems if you want to play HD-DVD or Blu-Ray content, you need to have an HDCP-compliant video card, an HDCP-compliant monitor and an HDCP-compliant OS, otherwise you will only get the content at 1/4th resolution. EDIT: I hadn't noticed that this was a pretty old article. Do forgive me if you have already read it. I felt it was relevant to the current discussion.
  12. For those among us who are planning to build a Conroe system, here's some useful information. It seems there are a bunch of ASUS P5W "Conroe-ready" motherboards on the market that aren't really Conroe-ready, since they ship with an older BIOS. If you already have a Pentium 4 that you can plug in to flash the BIOS, it's a non-issue, but if you are building a new Conroe system from scratch this may cause some headaches. http://www.theinquirer.net/default.aspx?article=33523
  13. I'd say that Nvidia have the better mid-range products as of today, so I'd recommend the 7600, although I'm not a big fan of the vanilla (non-GT) cards. The 7600GT should be good value for money.
  14. Well, you can always continue to play your existing media on XP. However, all new media released by the big content providers will be unplayable on non-DRM-infected machines. I am sure Microsoft will release free patches for XP that will make it fully usable with DRM-compatible hardware and media. If you want to enjoy the new content, you'll either have to install the patches or switch to Vista.
  15. The problem is that they're all in it together - Microsoft, Intel, Apple, the display manufacturers, and the good people who make up the RIAA and MPAA. Together, they create an ecosystem where, if you want to enjoy any sort of media on your PC, you will have to use DRM-infected monitors, Intel's DRM platforms, and M$ and Apple's DRM software. I fear that in time it might become completely impossible to watch or listen to any sort of licensed media on open-source software and open-standards hardware.
  16. You could also try cleaning the lens using a lens-cleaner CD.
  17. Please check your figures.
  18. To the original poster: Your processor is really quite decent, but your RAM is *****way***** too little. 256MB just doesn't cut it for Windows XP - even the core OS functionality doesn't fit into 256MB, so your entire experience is likely to be a huge swap-fest. And if you have stuff like antivirus programs and messengers running in the background, it's even worse. You will be shocked at how much better your system performs simply by upgrading your memory to 512MB and disabling all the worthless background junk like messengers, QuickTime, Winamp Agent, etc. Of course, in order to run a 3D game well you will need a 3D card, but if your overall desktop experience feels torturous, increasing your RAM will go an incredibly long way toward soothing the pain. And your total investment is probably going to be around $20, even if you have to toss out your old RAM sticks.
  19. Itanium was targeted at a very different market segment (although it may not have started life that way). The mid-range and high-end server market had been using 64-bit machines for ages: Sun UltraSPARC, IBM pSeries, DEC Alpha, etc. Itanium was Intel/HP's attempt to play in this segment. Machines built around these processors usually had multi-chip modules (in which each chip is sometimes multi-core, e.g. IBM POWER5), fancy interconnection networks (e.g. Alpha), huge caches (20-30MB is not uncommon on an Itanium system), loads of memory bandwidth, NUMA, etc. However, these servers were usually too expensive for small businesses, so Intel had always had a market for the Xeon line: relatively low-end 32-bit server parts. AMD technically upped the ante in this particular low-end server segment by introducing the Opteron as the first 64-bit x86 server processor. Intel's "reply" to this was the 64-bit Xeon, not the Itanium.
  20. Fedora FTW
  21. This is a common misconception. Intel launched its 64-bit processors only a couple of months after AMD did. It takes several years to design and build a processor; Intel has already started working on processors targeted at the 2010 timeframe. There are always several alternative designs being worked on in parallel by multiple design teams, but only a few finally see the light of day. Exactly which product or feature is launched, and when, depends on several factors, including market conditions, the state of the competition, etc. If Intel didn't launch 64-bit processors when AMD did, it was likely because the marketing teams felt there wasn't enough demand. It seems AMD's marketing for 64-bit was good enough to create sufficient hype, so Intel had to respond. Of course, it takes a few months for a product to move to full-scale manufacturing, but the 64-bit designs were ready long before that.
  22. And don't forget those damn atmospheric neutrons! EDIT: I just noticed you live in Denver. :D
  23. If you have a decent heatsink on your memory modules, you could probably try and take it up to 900, but I don't think I'm the right person to answer that question... I have little to no experience overclocking video RAM. What I meant by the "just 100MHz" in my previous post was that it was a small overclock compared to what you would have to achieve to get the GX2 up to the GTX's clock speed (+400MHz). In fact, 100MHz actually sounds like a nice overclock for 700MHz modules.
  24. Well, that's just a 100MHz overclock. A GTX's memory runs at 1600MHz while a GX2's runs at 1200MHz. The cooling on a GX2 is a joke. I doubt either its core or its memory would survive any amount of overclocking. The card was designed from the start to be a slightly slow but gargantuan monster, while the GTX is more of a lean and mean machine. But then, as benchmarks show, the GX2 almost always trounces the GTX in most real-world situations. What would be interesting is a comparison between Quad-SLI GX2's (once the drivers are mature enough) and a pair of SLI'd GTX's. I have a strong suspicion the GTX's will win.
  25. If it sells for less than $200 and the games are reasonably priced, I'm all for it. I hope they make some fantastic games for it that don't try to forcibly put too much emphasis on the gimmicky controller.