
Is dual-screen more demanding on the GFX?



I have two monitors for "work", and when I play games I just turn one off so I won't be distracted or waste power. Now, running two monitors is just like splitting the signal, right? It doesn't increase the GPU load?

 

 

My computer starts to run the fans harder when I play NWN now; it didn't use to do that, I think.


Is it really rendering the two screens separately in clone mode? Isn't it just a question of sending exactly the same signal to both screens at once? Kind of like if you put a T-connector on the wire?


I'm really happy that Kaftan has raised this, as I have an interest in it too. With the CPU I want apparently still only vapourware, I'm tempted to wait and see what the Core 2 Duo notebooks are going to look like and what GPU may be available in a 15" model. The question for me is gaming on my larger LCD when at home, and whether it will have an effect as Bokishi has described or if Kaftan's assessment is correct. Anyone have anything authoritative in this regard to link to for further reflection?


So, Meta, if I brought the notebook home and hooked it up to my LCD and did not span, there would be no hit? What about if the second LCD had a higher resolution - how would that affect the performance?

 

Note: Kaftan, if I am derailing/detracting from your intent, let me know and I'll split this into two threads.


Wouldn't the resolution be determined by the application itself? If anything, I'd suspect it'd force both monitors into that resolution, and you'd get stretching and stuff on the one you weren't using.

 

 

If dual monitors required separate renderings for both (when it's just a split image), playing a game at high resolution with full-on AF and AA effects would be a very large performance hit.

 

 

An easy way to test is to compare your performance with a single monitor attached, and with two.
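
If anyone wants to try exactly that, here is a rough sketch of such a test (assuming Python with the pygame library installed; the resolution and frame count are just example values). Run it once with one monitor attached and once with two, and compare the numbers:

[code]
# Minimal frame-rate probe: run once per monitor configuration and compare.
# Assumes Python with the pygame library; 1024x768 and 1000 frames are
# arbitrary example values.
import pygame

pygame.init()
screen = pygame.display.set_mode((1024, 768), pygame.FULLSCREEN)
clock = pygame.time.Clock()

for frame in range(1000):
    pygame.event.pump()                   # keep the window responsive
    screen.fill((frame % 256, 0, 0))      # trivial draw so the GPU does *something*
    pygame.display.flip()                 # push the frame to the display(s)
    clock.tick()                          # record the frame time

print("average FPS: %.1f" % clock.get_fps())   # averages the most recent tick() calls
pygame.quit()
[/code]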


That's a fair observation, alanschu; the problem is that I would have to buy the notebook without knowing if it will work. A question that occurs to me with your observation is: what if the notebook monitor does not go as high as the external LCD - what does the GPU do in that case?


Are you piping the same image, or spanning multiple monitors? The former adds no extra GPU strain (vertex shading, bump mapping, etc.); the latter does (bigger image to process).

 

There's always some slowdown involved as the card runs both RAMDACs [Random Access Memory Digital-to-Analog Converters]; it's a memory-bandwidth issue... but it's minimal unless you use extend/span instead of clone mode.

 

There have been reports of OpenGL-based games slowing down or not working when using clone mode, but that is a driver issue, not a hardware one.
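
To put a rough number on that memory-bandwidth point (my own back-of-the-envelope figures, assuming 32-bit colour at a 60 Hz refresh; not anything authoritative):

[code]
# Back-of-the-envelope scanout cost: bytes each RAMDAC must read from video
# memory every second just to refresh the display, before any 3D work happens.
# My own illustration; 1600x1200, 32-bit colour and 60 Hz are assumptions.
WIDTH, HEIGHT = 1600, 1200
BYTES_PER_PIXEL = 4
REFRESH_HZ = 60

per_monitor = WIDTH * HEIGHT * BYTES_PER_PIXEL * REFRESH_HZ
print("one monitor: %.0f MB/s" % (per_monitor / 2**20))      # ~439 MB/s
print("clone mode:  %.0f MB/s" % (2 * per_monitor / 2**20))  # both RAMDACs scan out the same image
[/code]

That is small beside a modern card's total memory bandwidth, which is why clone mode barely registers; spanning additionally makes the rendered image itself bigger, which is the real cost.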

"If at first you don't succeed... So much for skydiving." - Henry Youngman.

Link to comment
Share on other sites

That's a fair observation, alanschu; the problem is that I would have to buy the notebook without knowing if it will work. A question that occurs to me with your observation is: what if the notebook monitor does not go as high as the external LCD - what does the GPU do in that case?

My laptop can run at a higher resolution than the built-in screen (IBM seems to think this is a good idea; it's called "windowing", I think, as you can move the underlying window around on the smaller LCD "portal"); so (apart from the admin overhead mentioned by Baneblade, above) the GPU sets the resolution to whatever it is told to (up to the maximum it can manage). I can run both the monitor and the screen, with stretched graphics on one.
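
If you're curious what modes a given driver/monitor combination will actually accept, here is a quick way to list them (a sketch assuming Python with the pygame library; the output depends entirely on your driver and attached displays):

[code]
# List the fullscreen modes the current display driver claims to support.
# Sketch assuming Python with the pygame library; results depend on the
# driver and the attached monitor(s).
import pygame

pygame.init()
modes = pygame.display.list_modes()
if modes == -1:
    print("driver reports any resolution should work")
else:
    for width, height in modes:
        print("%dx%d" % (width, height))
pygame.quit()
[/code]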


That's a fair observation, alanschu; the problem is that I would have to buy the notebook without knowing if it will work. A question that occurs to me with your observation is: what if the notebook monitor does not go as high as the external LCD - what does the GPU do in that case?

 

Unless I'm horribly mistaken, from a performance point of view the game itself will render at whatever resolution it is set to.

 

Meta's explanation seems reasonable for what would happen to your laptop screen. In the worst case, I would expect the laptop screen to go black because it is incapable of displaying the specified resolution - kind of like when I tried running things at 1600x1200 on my old CRT. I don't know if a flat panel would respond any differently than a CRT.

 

As for the desktop view, you'd probably get pieces of the desktop truncated or something like that.
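
A game that wanted to avoid that black screen could probe the mode before committing to it. A minimal sketch, again assuming Python with pygame (a real engine would do the equivalent check through its own renderer; the resolutions are placeholders):

[code]
# Probe a requested fullscreen mode and fall back rather than go black.
# Sketch in Python/pygame; 1600x1200 and the 1024x768 fallback are placeholders.
import pygame

pygame.init()
requested = (1600, 1200)
if pygame.display.mode_ok(requested, pygame.FULLSCREEN):   # returns 0 if unsupported
    screen = pygame.display.set_mode(requested, pygame.FULLSCREEN)
else:
    screen = pygame.display.set_mode((1024, 768), pygame.FULLSCREEN)  # safe fallback
print("running at %dx%d" % screen.get_size())
pygame.quit()
[/code]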


Okay, so a further point: is anyone actually doing the kind of gaming I am talking about? If so, what is your experience? The CPU/RAM/HDD are now up to par in a notebook; it is simply the GPU that might be the bottleneck. I have to wait and see what the Core 2 Duo will be paired with in a notebook environment.


I have two monitors for "work", and when I play games I just turn one off so I won't be distracted or waste power. Now, running two monitors is just like splitting the signal, right? It doesn't increase the GPU load?

My computer starts to run the fans harder when I play NWN now; it didn't use to do that, I think.

 

I'm pretty sure the GFX card has to work harder; it's not sending the same signal, after all.

 

My GeForce 7950 GX2 requires that I dedicate a card to each monitor in dual-screen mode, which is great!

 

I'm not 100% certain, but I will say that from my knowledge of the hardware it does work harder; how much harder, I am unsure.


I'm really happy that Kaftan has raised this, as I have an interest in it too. With the CPU I want apparently still only vapourware, I'm tempted to wait and see what the Core 2 Duo notebooks are going to look like and what GPU may be available in a 15" model. The question for me is gaming on my larger LCD when at home, and whether it will have an effect as Bokishi has described or if Kaftan's assessment is correct. Anyone have anything authoritative in this regard to link to for further reflection?

 

It would be a matter of disabling the notebook monitor and using the external one; the performance hit would be mostly dependent upon the resolution you run the game in.

 

If the GFX card has very little RAM, you're screwed in terms of resolution: while it MAY be possible that it'll render at a higher res, you'll get a performance hit due to the resolution.


Okay, this is likely my ignorance, as I am just beginning to migrate to using a notebook primarily - but how is it you disable the LCD of the notebook? I assume a 2nd one must first be enabled?


Okay, this is likely my ignorance, as I am just beginning to migrate to using a notebook primarily - but how is it you disable the LCD of the notebook? I assume a 2nd one must first be enabled?

 

You switch the 2nd to being your primary display... From what I recall.

 

It's been a while since I did it. Basically, in a dual-monitor setup you just switch the output to that monitor.

 

But I recall a few years ago using a projector when demoing a game, which allowed displaying the same image as was on the laptop. In that case I'm fairly certain there is no performance loss akin to a dual-monitor setup where each output is separate; it just sends out the same signal to both.

 

Oh, and totally unrelated: some games have issues with dual-core processors; you have to set the CPU affinity to a single core. This is the case with the Thief series and a few other games (the same goes for GPUs that are SLI'd or Quad SLI'd). Developers who complain about dual-core computing also deserve a slap! That may or may not have been known to folk; as far as I am aware there aren't too many people using dual cores at the moment.
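
For anyone who wants to do that programmatically instead of through Task Manager's "Set Affinity" option, here is one way (a sketch assuming Python with the third-party psutil library installed; "thief.exe" is just a placeholder for the game's process name):

[code]
# Pin a running game to core 0 so it stops hitting dual-core timing bugs.
# Sketch assuming Python with the third-party psutil library; "thief.exe"
# is a placeholder process name.
import psutil

for proc in psutil.process_iter(["name"]):
    if proc.info["name"] and proc.info["name"].lower() == "thief.exe":
        proc.cpu_affinity([0])   # restrict the process to CPU core 0
        print("pinned PID %d to core 0" % proc.pid)
[/code]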


Actually this makes perfect sense, as I've seen the laptop we use at work frequently get used on the big screen for meetings, and I have seen it with the notebook's screen both on and off.

 

I also have noticed no real difference in performance while splitting the signal, though we are just running around in NWN, so it might not be a demanding enough game.


Developers who complain about dual-core computing also deserve a slap! That may or may not have been known to folk; as far as I am aware there aren't too many people using dual cores at the moment.

writing code for a parallel environment is a nightmare. developers that complain about it are right to do so. just being able to fix the affinity to a specific core is not the problem. the developers are probably required to implement threaded versions that will work across multiple cores. this means maintaining two different programs/code bases depending upon configuration.

 

my guess is that most game developers are not experts in programming in parallel environments and, likewise, do not have the time to learn the subtleties involved given their other tasks. parallel computing has been around for quite some time in the scientific/defense realms, and that's where the knowledgebase lies. probably not something a game dev has time for unless he doesn't sleep much.
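
as an illustration of the kind of subtlety involved, here's the classic lost-update race (a minimal python sketch of my own, not from any game codebase): two threads increment a shared counter without a lock, and increments silently vanish.

[code]
# the classic data race: counter += 1 is a read-modify-write, so two threads
# can interleave and lose updates. minimal python sketch, purely illustrative.
import threading

counter = 0

def work():
    global counter
    for _ in range(100000):
        counter += 1          # NOT atomic across threads

threads = [threading.Thread(target=work) for _ in range(2)]
for t in threads:
    t.start()
for t in threads:
    t.join()

print(counter)   # expected 200000; can come up short without a lock
[/code]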

 

oh, i'm programming on a quad-core, the BCM1480 (broadcom).

 

taks


Developers who complain about dual-core computing also deserve a slap! That may or may not have been known to folk; as far as I am aware there aren't too many people using dual cores at the moment.

writing code for a parallel environment is a nightmare. developers that complain about it are right to do so. [...]

 


 

*Sigh*

 

I think you misunderstood me. Parallel programming is a very difficult thing to do, but in my opinion the potential rewards for games make it worthwhile. That is why they deserve a slap; perhaps it's because I see it more as the ability to deal with tasks separately in some sense. I can see some benefits for AI in future releases.

 

Graphics have been offloaded for quite some time, and physics is currently being offloaded; two CPU cores leave a lot of power to improve game AI. That's my opinion; complaining about that is worthy of a slap!
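
Roughly the shape I have in mind, as a hypothetical sketch in Python (my own illustration, not how any particular engine does it): the render loop keeps going while a worker thread chews on AI decisions.

[code]
# Offloading AI onto a second core: the main (render) loop hands world
# snapshots to a worker thread and picks up finished decisions later.
# Hypothetical sketch; the snapshot/decision strings stand in for real data.
import queue
import threading

jobs = queue.Queue()      # world snapshots for the AI thread
results = queue.Queue()   # decisions ready for the main loop

def ai_worker():
    while True:
        snapshot = jobs.get()
        if snapshot is None:      # sentinel: shut down
            break
        results.put("decision for " + snapshot)   # stand-in for pathfinding/planning

threading.Thread(target=ai_worker, daemon=True).start()

for frame in range(3):            # stand-in for the render loop
    jobs.put("frame %d" % frame)  # ship the latest game state to the AI core
    # ...render here while the AI thread works...
    print(results.get())          # collect whatever the AI produced

jobs.put(None)
[/code]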


the problem is that the potential pitfalls are almost enough to outweigh the benefits. not all problems that need solving can be efficiently solved in parallel, either. i'm dealing with one right now that, on the surface, seems like a nifty parallel task. unfortunately, deep down, there are reasons not to chase the rabbit down the hole.
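
that's basically amdahl's law: if only a fraction p of the work parallelizes, speedup on n cores is 1 / ((1 - p) + p/n), so the serial remainder caps you no matter how many cores you add. a quick worked example (my numbers, purely illustrative):

[code]
# amdahl's law: speedup = 1 / ((1 - p) + p / n) for parallel fraction p on
# n cores. my numbers, purely illustrative -- a 60%-parallel task gains
# little past two cores because the serial 40% dominates.
def speedup(p, n):
    return 1.0 / ((1.0 - p) + p / n)

for cores in (1, 2, 4, 8):
    print("p=0.6, %d cores: %.2fx" % (cores, speedup(0.6, cores)))
    # prints 1.00x, 1.43x, 1.82x, 2.11x
[/code]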

 

my point really, was that there is a steep learning curve that game developers may have a hard time overcoming. very steep. no matter the potential benefit, the initial stages of climbing that slope may do irreparable damage, and delays, to a project.

 

keep in mind, graphics is offloaded through a well developed API that lets the developer get his task done semi-transparently. once said dev has to start writing his own threads, however, things get a little sticky (uh, a lot).

 

taks

 

