
ATI X800XL Wallpaper


dufflover


OK, I know there are way better cards like the nVidia GeForce 7800 GTX, the ATI Radeon X850XT Platinum, and I'm already drooling over the Camcorder video of the ATI R520 "Fudo" ("X900"?) demo.

 

Anyway, I got an X800XL yesterday, and being a poor unemployed uni student, I probably won't get anything better for a while :wub: . As evidence of this, the X800XL replaces my GeForce4 440 MX/Go :wub: so yep, I can't believe I have a modern graphics card in my computer... (you lucky people with SLI setups, X850s and so on :thumbsup: )

 

I've always wanted to run that ATI Double-Cross Ruby demo, and even though I've done that now... I still seem to be a bit hooked on the wonder of it all, and right now my desktop wallpaper is this (I didn't quite like the existing ones):

 

post-10482-1121612725_thumb.jpg

 

What's this thread supposed to be about/discuss? I have no idea... :wub:

Uhh, if you'd like, comment on the wallpaper (or post any other good Ruby wallpapers).

I'm even thinking of making some icons of that optical diamond thing...but I really should be working on Pure Pazaak.

 

I guess I'm another "OMG I just found out I'm Revan...must tell someone" type person.

sigpic0yb.jpg

Pure Pazaak - The Stand-alone Multiplayer Pazaak Game (link to Obsidian board thread)

Pure Pazaak website (big thank you to fingolfin)


I'm not buying another ATI product until they learn to write good, working drivers.

 

 

The only complaint I have with this baby

 

prod_1693.jpg

 

is that despite its double-deck design, the fan does not have an air intake for drawing cool air from outside the computer like ATI's does. But then again, maybe that's because ATI has patented that method.

DISCLAIMER: Do not take what I write seriously unless it is clearly and in no uncertain terms declared by me to be meant in a serious and non-humoristic manner. If there is no clear indication, assume the post is written in jest. This notification is meant very seriously and its purpose is to avoid misunderstandings and the consequences thereof. Furthermore, I can not be held accountable for anything I write on these forums, since the idea of taking serious responsibility for my unserious actions is an oxymoron in itself.

 

Important: as the following sentence contains many naughty words I warn you not to read it under any circumstances; botty, knickers, wee, erogenous zone, psychiatrist, clitoris, stockings, bosom, poetry reading, dentist, fellatio and the department of agriculture.

 

"I suppose outright stupidity and complete lack of taste could also be considered points of view. "


The previous statement just HAD to bite me in the ass... apparently there is a bug in the latest nVidia drivers that oversets the gamma for video overlays, f*cking up the picture real good.

 

If your divx playback colours are *@$


Well, that's about you versus the other 98% of ATI users who can't play a single game without fiddling with drivers back and forth. Like I used to be, for instance (I exaggerate, of course; it was only about every third game, but that's too much).

 

 

The fiasco with K2 was single-handedly ATI's fault: the week of K2's release they put out a new set of drivers that were largely (this is good) completely incompatible with K2's renderer, while nVidia users experienced no problems :lol:


Well, that's about you versus the other 98% of ATI users who can't play a single game without fiddling with drivers back and forth. Like I used to be, for instance (I exaggerate, of course; it was only about every third game, but that's too much).

 

 

The fiasco with K2 was single-handedly ATI's fault: the week of K2's release they put out a new set of drivers that were largely (this is good) completely incompatible with K2's renderer, while nVidia users experienced no problems :lol:

 

My mistake. K2 was the one game that I had problems with: had to edit the ini file to get Dantooine to work. However, I can't think of any other games off the top of my head. If I do remember one, I'll be sure to write it down for you. o:)
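For anyone chasing the same fix, the tweak most often passed around for the Dantooine problems on ATI cards was a one-liner in swkotor2.ini (the key name here is quoted from memory, so double-check it against a tweak guide of the era before relying on it):

```ini
; swkotor2.ini -- found in the game's install folder
[Graphics Options]
; Commonly cited workaround for ATI rendering/slowdown
; problems in the Dantooine areas of KotOR 2
Disable Vertex Buffer Objects=1
```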


So what would y'all recommend when buying a graphics card? Geforce, Radeon, what?

DENMARK!

 

It appears that I have not yet found a sig to replace the one about me not being banned... interesting.


Half the people are going to say nVidia and the other half are going to say ATI. My personal experience is that nVidia offers a lot more stability than ATI, but that isn't necessarily always true (as with the gamma bug I mentioned).


I got the ATI X800XL 'cos it performs at basically an equal level with the 6800GT (win here, lose there, etc.) but for AU$150 less.

 

I admit that ATI has a bigger reputation for driver problems, although I haven't had a bad experience in my 2 days yet. lol.

 

Besides, ATI's Ruby canes that Mermaid... now if only Bastila were rendered like that, my wallpaper would change instantly.


Well, that's about you versus the other 98% of ATI users who can't play a single game without fiddling with drivers back and forth. Like I used to be, for instance (I exaggerate, of course; it was only about every third game, but that's too much).

 

 

The fiasco with K2 was single-handedly ATI's fault: the week of K2's release they put out a new set of drivers that were largely (this is good) completely incompatible with K2's renderer, while nVidia users experienced no problems :blink:

 

 

I don't recall either problem.

 

 

Although as discussed with Ender, I seem to be in the 'Lucky' camp.


NVidia has better price points, good performance and solid drivers.  As I'm broke, I just went with the 6600 GT.

 

nVidia wins the mid-range this round with the 6600 series; however, in the 9x00 series ATi definitely had the advantage. Dunno much about the current high-end cards as I won't be able to afford them till I win the lottery. :blink:


I've got an ATI Radeon Mobility T2 128MB. It has DirectX 9.0c compatibility. (I didn't have much choice; it was the most advanced ATi available -- which is all IBM supplied in its Thinkpads. It's equivalent to a 7500, or is it a 7800? Never can remember. Anyway, it's pretty good.)

 

I had a problem with K2, but that was settled by turning off the movies (which didn't work that well in K1 either: I can watch them in batch, but not in situ :) ). I have had no other problems with any other games, not Doom (a blatantly nVidia-optimised game) nor Half-Life 2 (a blatantly ATi-optimised game).

 

The latest (bleeding-edge) ATi cards (X800 / X850) are still better than the latest nVidia cards, even though there is always more room to overclock an nVidia.

 

STOP PRESS: I am just looking at the review in this month's PC Format, and it says that the new nVidia GeForce 7800 GTX is the fastest around: 89% score (and there is the SLI model, which allows pairing).

 

The rumours I have read in the last month are that the Radeon pairing technology (CrossFire) is superior to nVidia's, but not many of us are going to go that far, I'm sure, and CrossFire is still not available (and neither are too many X800s, I believe ...)

 

Also, the nVidia cards run hotter and draw more power (to help keep them warm).

 

Listed model is: Gigabyte 7800 GTX.

OBSCVRVM PER OBSCVRIVS ET IGNOTVM PER IGNOTIVS

ingsoc.gif

OPVS ARTIFICEM PROBAT


6800 Ultras are out now though aren't they?

 

I wonder what will be out when the consoles finally arrive.

 

 

Plus, it'd just be silly for them not to find ways to take this uber technology and make it work in a PC box. Especially when there are wackos who don't mind spending $1200 on an SLI PCI-e system so they can get 400 FPS instead of 380 FPS.


Really?

ATI said the 360 GPU will be better than any PC card they put out when the 360 ships, and NVidia said the PS3 GPU will be better than two 6800 Ultras.

Yeah, I just added some stats, but basically the new nVidia 7800 GTX has twice the pixel shading power of the 6800 Ultra, and there is a 7800 GTX SLI (so two of them paired). There will also probably be a 7800 Ultra due out soon, methinks ...

 

So it sounds like the PS3 is using the G70 chip (110nm die) and is basically the same as the new 7800 GTX.


One thing I was hoping about the consoles (particularly the XBox) is that people would push the envelope a bit more graphics-wise, so that those with phat hardware could play games on the PC, and those that didn't want to shell out the bling could still play them on the XBox.

 

It hasn't happened to the extent that I wanted, but I do think that games like Chaos Theory look really, really nice.


Apparently the next-gen consoles (XBox 360 and PS3) will have graphics power (talking about the final speed/design, not the graphics chip alone) equal to a future "R600" ATI chip. Of course, since the R520 isn't out yet, we can't really tell for sure. The sooner it comes out, the sooner I can download a pre-rendered version of that movie!


I've always wondered about why a graphics manufacturer would make a chip so powerful as the console manufacturers claim to be, and yet NOT release it for PC, where people seem to be more than willing to shell out over $1000 on an SLI configuration.

