
Everything posted by angshuman
-
Not even close.

6200: 4 pixel shaders, 2 ROPs, 64-bit bus, ~600MHz memory
6800 Ultra: 16 pixel shaders, 16 ROPs, 256-bit bus, 1.1GHz memory

In practice, benchmarks indicate that you typically need to SLI two 6600GTs to reach the level of a 6800GT. You'd need an army of 6200s to match up with a 6800 Ultra. As far as graphics is concerned, the market is very straightforward: the more you pay, the more performance you get, and that's that. It's not a linear curve, but it's MUCH more reasonable than the processor price curve. If you're playing a game at max settings, you'd be hard-pressed to notice any difference between a $65 Sempron and a $1000 FX-57. Metadigital's system makes some smart choices. In my opinion, you should:
- spend the most money on your power supply, video card, and all interface devices (monitor, keyboard, mouse, speakers),
- strike a good balance on "core" components like your case and motherboard, and
- get the cheapest processor and memory you can buy without losing your dignity and self-respect.
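To see why the bus width alone buries the 6200, here's a rough back-of-envelope sketch of theoretical peak memory bandwidth using the effective memory clocks quoted above (real-world numbers are lower, but the ratio is the point):

#include <stdio.h>

/* Theoretical peak: bus width (bits) / 8 bytes, times effective clock (MHz). */
static double mem_bandwidth_gbs(int bus_bits, double mem_mhz) {
    return (bus_bits / 8.0) * mem_mhz / 1000.0; /* GB/s */
}

int main(void) {
    printf("6200:  %.1f GB/s\n", mem_bandwidth_gbs(64, 600.0));   /* ~4.8 GB/s */
    printf("6800u: %.1f GB/s\n", mem_bandwidth_gbs(256, 1100.0)); /* ~35.2 GB/s */
    return 0;
}

Roughly a 7x gap in raw bandwidth before you even count the 4x difference in pixel shaders.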
-
I'd suspect the video card first, followed closely by the motherboard. I doubt it's a software/configuration/driver issue, which means isolation is the first step. Can you get hold of a temporary video card for testing purposes? The closer it is to your existing card, the better.
-
Most cards with dual DVI outputs ship with DVI-to-VGA converters. (It's not really a converter, just a passive socket adapter with no electronics; the card detects that it is connected to a VGA monitor through such an adapter and sends out a VGA signal instead of a DVI signal.) You can also buy one from your local electronics store for 5-6 bucks. And just to set the record straight, the 6800GS comes in both PCIe and AGP flavors. The AGP version often turns out to be a 6800GT/Ultra with pipes disabled, and people have been successful in modding it to Ultra-Extreme specs.
-
If you are willing to wait 6 months, do so and get the Intel Conroe; it's a no-brainer. If not, get an AMD. Again, a no-brainer. End of story. There are no stability issues with any current processors, and most motherboards from reputable manufacturers (Asus, MSI, Gigabyte, etc.) are fantastic.
-
That's a very myopic outlook. Which two have more in common: video gaming and home entertainment, or video gaming and word processing? I don't object to your skepticism about the console being a convenient home theater platform as much as I object to your belief that a "computer" will be any better suited to the task.

The concept of a "Personal Computer" as we know it is fast becoming obsolete. It is becoming less and less economically and technologically viable to have a behemoth, all-in-one, uber-powerful "Computer" that needs to be powered from a supply transformer and dipped in liquid nitrogen. We are slowly moving towards a decentralized but networked system comprising units dedicated to particular sets of tasks. With home media transitioning over the years from VHS into the digital "high-definition" era, game consoles and home media boxes are beginning to share a lot of functionality (heavy processing, digital I/O), and it therefore (arguably) makes sense to club these into a single unit.
-
It's not the number that worries me. The X's cores are at least somewhat traditional, and they are homogeneous. Write a thread-safe C program and it will run on them more or less optimally, provided you are a little careful about your branches. The SPUs don't look like something your average C program would be happy running on. Admittedly, I know very little about either the ISA or the u-arch of these SPUs, but from what I've read they appear to be very wide, non-speculative, statically scheduled vector processors. It would be hard to extract optimal performance from these (for generic code) unless you have an absolutely brilliant compiler or a programmer hand-coding in assembly, and I do not believe there is a lot of room for suboptimality. On the other hand, it seems likely that the SPUs are meant for some specific task such as vector processing (offloaded from the RSX) or physics, with only the PPU running the primary generic thread. But even in that case, good abstractions in the form of libraries and language constructs need to exist for programmers to utilize them effectively. The sum total of the work to be done seems a lot greater than for the X360, where traditional multithreaded programming approaches would work reasonably well.
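For contrast, this is the kind of conventional thread-level parallelism that maps naturally onto homogeneous cores (a minimal pthreads sketch; the one-worker-per-core split is just an illustration, not anything from an actual console SDK):

#include <pthread.h>
#include <stdio.h>

#define NUM_CORES 3  /* assumption: one worker thread per homogeneous core */

/* Each worker runs ordinary scalar C code; the hardware's caches and
   branch prediction handle it without any hand-scheduling. */
static void *worker(void *arg) {
    long id = (long)arg;
    long sum = 0;
    for (long i = id; i < 3000000; i += NUM_CORES)  /* interleaved partition of the work */
        sum += (i & 1) ? i : -i;                    /* branchy, "generic" integer code */
    printf("core %ld partial result: %ld\n", id, sum);
    return NULL;
}

int main(void) {
    pthread_t threads[NUM_CORES];
    for (long i = 0; i < NUM_CORES; i++)
        pthread_create(&threads[i], NULL, worker, (void *)i);
    for (int i = 0; i < NUM_CORES; i++)
        pthread_join(threads[i], NULL);
    return 0;
}

Nothing in that code cares which core it lands on, which is exactly the property a heterogeneous PPU+SPU design gives up.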
-
Unless Sony has developed a bunch of fantastic dev kits and a collection of magical libraries and compilers that somehow abstract away the array of SPUs, it seems very likely that the PS3 is a lot harder to program for. It's not because it uses a new processor architecture; it's because of what that architecture looks like.
-
And they are horrible developers. Just look at the system requirements of that engine, which IMHO looks like glittering crap except at the highest quality settings, which they claim to have designed with "future hardware" in mind. Ridiculous! They write bad code and accuse the customer of having crappy hardware. To add insult to injury, the artwork is a slap in the face of anyone with even a mild sense of art and aesthetics. (My comments are somewhat influenced by my intense hatred for Sony, so keep that table salt handy. If you even care.) On topic: Guild Wars FTW. It's not an MMO (as Nick pointed out) and it's not much of an RPG either, but it's a hell of a fun game. If you want a real MMORPG, I guess WoW is the way to go; 1 billion people can't be wrong :ph34r:. I did try out a 10-day WoW account, and the game really sucked me in... Not. (Kill 10 Kobold Vermin -> Kill 10 Kobold Workers -> Kill 10 Kobold Laborers -> Get 7 Wolf Fangs -> Get 10 candles -> Delete Account.)
-
B-but... isn't Spam the theme of the thread?
-
Irrlicht could be an option. The engine itself is free, and you are even allowed to build commercial applications on it. Also, the Quake 3 source code is now available under the GPL. Of course, this means that anything you build from it will also have to be distributed free under the GPL.
-
Iron Maiden - 2 Minutes To Midnight
-
The only 2 things I can brag about are my MSI Nforce4 Platinum SLI motherboard and my BFG 7800GTX.
-
Does it have lots of ph4t l00t?
-
Just ran it. For those of you who are wasting precious bandwidth downloading this, here's my advice... don't bother. The free version is almost exactly the same as 3DMark05, only everything has a little more detail. There is only 1 new test. Plus, they upped the default resolution (which you can't change in the free version) to 1280x1024 in order to bring about the customary reduction in benchmark score that happens with every new 3DMark generation. Pathetic. FWIW I got a score of 3702.
-
Wait till I post my scores in this very thread... :ph34r:
-
While 3DMark does have its flaws, I do not completely agree with the system-performance argument. Graphics is by far the limiting factor that determines performance in most games today. 3DMark05 in particular was considered slightly flawed because it was a little too vertex-intensive and therefore not completely representative of all modern games. While it is true that a lot of people use 3DMark as a willy-waving tool, serious willy-wavers will actually use a combination of real game benchmarks to show the sheer power of the graphics chip they designed... no, wait, the graphics chip that they bought with Daddy's cash. I like 3DMark because it's pretty. (w00t) But my favorite was 3DMark2000; it had a very nice demo with great music. They are getting progressively worse each generation (haven't tried this one out yet). Edit: My apologies, Kaftan, I had misread your post somewhat; I thought you were advocating stressing the full system instead of focusing on the graphics subsystem.
-
A "Synergistic Processing Unit". It's a processor with a large amount of brute force but a lot less intelligence (that is why it needs an army of programmers to extract its full performance). The PS3's Cell is made up of of 1 PPU (Principal Processing Unit?) and 7 of these SPUs.
-
What's nice is that a CPU is great for running control-flow-intensive, DFS-like algorithms. In fact, as far as I know, even "physics" as used in games primarily involves collision detection, and collision detection algorithms are typically control-heavy and thus well suited to execution on standard processors with good branch predictors. What's weird is that the Cell's SPUs are completely non-speculative, which seems to render them useless for this task. So what the hell are those SPUs good for, anyway?
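To illustrate what "control-heavy" means here, this is the rough shape of a typical collision query: a depth-first walk of a bounding-volume hierarchy, where nearly every step is a data-dependent branch (a hypothetical sketch with made-up types, not code from any actual physics engine):

#include <stdbool.h>
#include <stddef.h>

/* Hypothetical axis-aligned bounding box and BVH node. */
typedef struct { float min[3], max[3]; } AABB;
typedef struct BVHNode {
    AABB box;
    struct BVHNode *left, *right;  /* NULL at leaves */
    int object_id;                 /* valid only at leaves */
} BVHNode;

static bool overlaps(const AABB *a, const AABB *b) {
    for (int i = 0; i < 3; i++)                     /* data-dependent early exit */
        if (a->max[i] < b->min[i] || b->max[i] < a->min[i])
            return false;
    return true;
}

/* Depth-first traversal: which subtrees get visited depends entirely on
   the data, which is what branch predictors on conventional CPUs handle
   well and statically scheduled, non-speculative hardware does not. */
static int query(const BVHNode *n, const AABB *q, int *hits, int max_hits, int count) {
    if (n == NULL || count >= max_hits || !overlaps(&n->box, q))
        return count;
    if (n->left == NULL && n->right == NULL) {      /* leaf: record a potential hit */
        hits[count++] = n->object_id;
        return count;
    }
    count = query(n->left, q, hits, max_hits, count);
    return query(n->right, q, hits, max_hits, count);
}

There's almost no straight-line vector math in there to feed a wide SIMD unit; it's branches and pointer chasing all the way down.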
-
Emphasis on shooting, eh? Serious Sam. :D
-
Try Progress Quest. Essentially it's World of Warcraft minus the annoying n00bs and monthly fees.
-
I never really played the original NOLF. NOLF2 was fabulous, though.
-
As of today, nothing beats AMD in gaming*. However, as others have noted, your processor is really quite irrelevant compared to your graphics card. A $150 Athlon64 3200+ paired with a $450 7800GTX will blow the socks off a $1100 Athlon64 FX-60 equipped with a $350 7800GT. Intel processors at corresponding price points will give slightly lower performance, but nothing really noticeable, especially at higher resolutions. As long as your processor and RAM are not *bottlenecking* your system, your gaming performance will pretty much scale proportionately to how much money you put into your graphics card. Personally, I never cared for the concept of "future-proofing", unless you consider a 6-month timeframe the Future. It just doesn't exist in this business; every time I have tried to do that, I ended up regretting it. It's best to buy the best-performing contemporary equipment you can within your budget, unless of course you are an "enthusiast", really have a lot of cash, and are willing to buy nothing but bleeding-edge stuff. This is especially true for CPUs -- anything above a 3200+ is a waste of money IMHO. The AMD FX series and Intel Extreme Edition series are jokes. * Edit: All this Intel-bashing (that I myself indulge in often) will end around mid-2006 with the introduction of Conroe.
-
Thanks Surreptishus, that's interesting info. I hadn't heard about this EFI thingy before. At any rate, if Vole and Jobs come to an "agreement" of any sort, they are probably going to make it work. Maybe not with XP, but definitely Vista onwards.
-
In a previous thread, I had wagered that Apple would try its level best to keep Bill Gates' dirty hands off their brand-spanking-new Intel-based Macs. I was wrong. Link