
Posted (edited)

A test/benchmark/performance review of Intel's newest revolutionary integrated graphics, the X4500, against the lowest-end parts from the other manufacturers (S3 430GT, ATI 3450, nVidia 8400GS):

http://www.forum-3dcenter.org/vbulletin/sh...ad.php?t=442135

 

What can we see? Epic fail! Both image quality and performance are far, far behind anything else on this planet. Plus, there are some nice compatibility issues with Crysis on high (even if it would be suicide to even try playing that, considering the 7 fps average on low at 800x600, on an E2200 @ 2.4 GHz with 4 GB RAM).

 

Imo, more tests like this, or even real reviews, should be done by hardware magazines, both printed and online, because people always expect their games to run on Intel graphics. They tend to complain when their games run like crap on those chips; see the NWN2 forums, for example. More information about the ridiculous performance of the chips and the compatibility issues of the drivers (or even the hardware?) could open the average Joe's eyes once he googles for reviews.

Edited by samm

Citizen of a country with a racist, hypocritical majority

Posted (edited)

D_N: What are these pics supposed to represent? An X800XL faster than an X1800XT? What system was used, and which driver versions? But thanks for linking to these other sites, I hadn't seen many IGP tests before ;)

 

Gorgon: Yes indeed, and Intel had better do something about their compliance with standards and their drivers, or Larrabee will just plain fail.

Edited by samm


Posted
What system was used, and which driver versions?

 

The second set of graphs is from a Linux-based system running open-source drivers, and I'm sure there is more information in the original article if you want it. ;)

"Geez. It's like we lost some sort of bet and ended up saddled with a bunch of terrible new posters on this forum."

-Hurlshot

 

 

Posted

I know; it was a (bad) attempt at pointing out that the rather strange results in the second image needed some explanation for people looking at nothing but the length of the bars and the numbers.

So, for people reading this here without reading the linked tests: note that the first of D_N's reviews attributes abysmal performance to this onboard solution, while the second test is run under Linux with really exotic drivers (exotic for Windows users, at least) that perform vastly differently from e.g. Vista, and thus ends up recommending the IGP for HTPCs. Which, frankly, I don't understand without support for de-interlacing, noise reduction, or even a notable reduction of CPU usage when displaying movies.


Posted

I like the idea of simplifying graphics cards, since that allows game devs to familiarise themselves with the hardware a lot more and use it better (this is what happens with consoles, for example), but if you think Intel can do that, there is a serious problem.

"Alright, I've been thinking. When life gives you lemons, don't make lemonade - make life take the lemons back! Get mad! I don't want your damn lemons, what am I supposed to do with these? Demand to see life's manager. Make life rue the day it thought it could give Cave Johnson lemons. Do you know who I am? I'm the man who's gonna burn your house down! With the lemons. I'm going to to get my engineers to invent a combustible lemon that burns your house down!"

Posted (edited)

Why would that be a problem?

 

Now, a bad graphics card isn't the same as a simple graphics card. Also, what needs to be simple is not the card but the interface used to program it, which for most current games is DirectX. What would ideally be done to facilitate game programming is ensuring 100% compliance with all the interfaces, be it DirectX, OpenGL or the upcoming general-purpose computation language OpenCL (and DirectX 11). Ideally, a programmer doesn't have to know the hardware at all, but only that 'if I write X, then independent of the hardware, Y will result'.
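
To make that 'write X, get Y' point concrete, here is a minimal sketch (not from the linked tests; GLUT is assumed as the windowing layer) using nothing but core OpenGL 1.x calls. On any driver that actually complies with the spec, it must produce the same image, a plain red triangle on black, whether the chip is an Intel IGP, an ATI or an nVidia card:

/* Minimal spec-compliance sketch: fixed-function OpenGL only.
   Any conforming driver must render the same red triangle. */
#include <GL/glut.h>

static void display(void)
{
    glClearColor(0.0f, 0.0f, 0.0f, 1.0f);  /* black background */
    glClear(GL_COLOR_BUFFER_BIT);

    glColor3f(1.0f, 0.0f, 0.0f);           /* red */
    glBegin(GL_TRIANGLES);
    glVertex2f(-0.5f, -0.5f);
    glVertex2f( 0.5f, -0.5f);
    glVertex2f( 0.0f,  0.5f);
    glEnd();

    glutSwapBuffers();
}

int main(int argc, char **argv)
{
    glutInit(&argc, argv);
    glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGB);
    glutInitWindowSize(640, 480);
    glutCreateWindow("compliance test");
    glutDisplayFunc(display);
    glutMainLoop();
    return 0;
}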

Intel's graphics cards aren't by any means the most compliant. But ATI and nVidia have their own quirks and strengths too, and they recommend that programmers write code which exploits their specific strengths. Consoles merely have the advantage of being restricted to one hardware platform; that hardware doesn't have to be simpler, but it even allows coding in a form of assembly language, because the programmer knows exactly what the hardware understands, what it is best at, what needs to be avoided, etc.
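
In practice, this is why 'portable' PC code ends up sniffing the driver's vendor string and branching per manufacturer, which is exactly the per-hardware knowledge that consoles make explicit. A sketch of the pattern (only glGetString is real API here, the workaround branches are hypothetical; call it after a GL context exists):

/* Vendor-sniffing pattern: query the driver's identification strings
   and pick a per-vendor code path. The branches are illustrative. */
#include <GL/gl.h>
#include <stdio.h>
#include <string.h>

void pick_vendor_path(void)
{
    const char *vendor   = (const char *)glGetString(GL_VENDOR);
    const char *renderer = (const char *)glGetString(GL_RENDERER);
    if (!vendor || !renderer)
        return;  /* no current GL context */

    printf("GL_VENDOR:   %s\nGL_RENDERER: %s\n", vendor, renderer);

    if (strstr(vendor, "Intel"))
        puts("-> conservative path, avoid features the driver mishandles");
    else if (strstr(vendor, "ATI"))
        puts("-> enable ATI-friendly vertex layouts");
    else if (strstr(vendor, "NVIDIA"))
        puts("-> enable nVidia-specific extensions");
    else
        puts("-> generic fallback path");
}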

Edited by samm


Posted

I don't know much about programming, but I do know system design. Sounds to me like samm's right: it's the programmer's interface that needs to be reliable.

"It wasn't lies. It was just... bull****"."

             -Elwood Blues

 

tarna's dead; processing... complete. Disappointed by Universe. RIP Hades/Sand/etc. Here's hoping your next alt has a harp.
