JFSOCC Posted February 24, 2014

As a laptop owner, I always resent seeing the sentence: "laptop versions of these cards may work but are not supported." My laptop's graphics card is pretty powerful, yet I find myself unable to play many games I should easily be able to play at top graphics, not because I don't have the power, but because the drivers are simply unsupported. I despise large developers like Ubisoft that make it a matter of course not to support laptop video cards, especially when the evidence points to a growing market for laptops (and tablets) and a dying market for the desktop.

http://en.wikipedia.org/wiki/Laptop#Sales
http://www.usatoday.com/story/tech/2013/03/06/apple-google-microsoft-hewlett-packard-dell-ipad-iphone-android-ios-samsung-galaxy/1946325/

It's absurd: the effort to support laptops is minimal (it's no larger than supporting desktops), and you reach a larger market. I'm tired of being treated like a second-rate PC owner, especially when my laptops have always been desktop replacements when it comes to power.
Keyrock Posted February 24, 2014

Which games are you unable to max out settings on? As a laptop owner, I am generally able to max whatever settings I want; it might just make the game into a slideshow for me (in the case of something like The Witcher 2). Also, the graphics driver is generally the same whether you are running a desktop or a laptop card; at least that has been the case for me with Nvidia (what I currently have) and AMD/ATI cards. Are you running a lappy with an Intel GPU? I would expect newer games to support Intel cards just fine; older games may have problems, or no support at all, as Intel GPUs have only recently become up to snuff for gaming.
Hassat Hunter Posted February 24, 2014

I'm going to say... yes, your laptop graphics card sucks. And it's that, not the drivers...
Helz Posted February 24, 2014

Does your graphics card use the current Nvidia/AMD driver, or did you buy your laptop from a manufacturer that requires you to use its own crappy proprietary driver? Because if it's the first case, it should run any game just fine provided it's powerful enough. If it's the second, the fault lies with the maker of your laptop, not the game developers. No laptop manufacturer keeps their GPU drivers up to date.
JFSOCC Posted February 24, 2014

Nvidia GeForce 710M, 2GB DDR3... it should be able to run most everything. And it does, just a lot less efficiently. It had real trouble with framerates in XCOM: Enemy Within on higher graphics settings.
Keyrock Posted February 24, 2014

A GeForce 710M is an entry-level card. It will run any game, but you ain't maxing out any recent AAA games on that card. I run a 650M 2GB GDDR5, which is a mid-level card, and I'm not maxing out any recent AAA games either (though I can run them at very nice looking levels at 1600x900). It ain't the drivers; your card is simply nowhere near as powerful as you think. Forget any anti-aliasing options (super duper resource hogs) and keep anisotropic levels to something sane like 2X or 4X. Stick with low shadow options (they tend to be resource hogs) or disable shadows entirely. I haven't played XCOM: EU (/ducks) so I can't comment on that specific game.
Bester Posted February 24, 2014

There is an unlocked version of the drivers for all Nvidia and ATI laptop cards; google it, for god's sake. It's made by laptop communities, not official. As for not being as efficient: it's because the cooling system in laptops is just one small fan for both the CPU and GPU. Of course it can't be as efficient as a PC, where you have two fans on the video card alone and another medium one on the CPU. Think, man. And who in their right mind buys laptops for gaming?
JFSOCC Posted February 24, 2014

Quoting Keyrock: "A GeForce 710M is an entry-level card. It will run any game, but you ain't maxing out any recent AAA games on that card."

You're telling me that a 2GB DDR3 card is entry level? It's a compromise for sure, but I would rather call it mid-level than entry-level.
JFSOCC Posted February 24, 2014

I'm aware of this. The problem is not the cooling, or the power of the card. The problem is that the drivers are unsupported, which means that the supported drivers are very basic and inefficient and do not get the best out of your hardware. If my card had been supported, it would have run it smoothly, because it has enough power to. And Keyrock, I always lower the anti-aliasing, because that offers the largest improvement in smoothness of play.
ShadySands Posted February 24, 2014

Quoting JFSOCC: "You're telling me that a 2GB DDR3 card is entry level?"

Yes, it's entry level: http://www.notebookcheck.net/NVIDIA-GeForce-710M.84746.0.html
Sarex Posted February 24, 2014

Quoting JFSOCC: "You're telling me that a 2GB DDR3 card is entry level? It's a compromise for sure, but I would rather call it mid-level than entry-level."

2GB doesn't mean anything; there are a lot more specifications on a graphics card than the size of its memory.
alanschu Posted February 24, 2014

Quoting JFSOCC: "You're telling me that a 2GB DDR3 card is entry level?"

Memory is only one aspect. A quick Google search of the card came up with this: "The 710M is just a downclocked 620M with a new label. It's still a very ****ty card and will struggle to play any games at all." Link
Keyrock Posted February 24, 2014

http://www.notebookcheck.net/NVIDIA-GeForce-710M.84746.0.html

"The NVIDIA GeForce 710M is an entry-level, DirectX 11 compatible graphics card that was announced in spring 2013. Its core is based on the 28nm GF117 chip (Fermi architecture) with 64-bit DDR3 memory. Compared to the older GT 620M, the 710M is clocked considerably higher."

Generally, the Nvidia naming scheme goes like this:
GeForce x10 - x40 = entry-level
GeForce x50 - x60 = mid-level
GeForce x70 and up = high-end

Edit: Triple-ninja'd
Bester Posted February 24, 2014

All x10, x20, x30 Nvidia cards are for office and such purposes. Gaming cards start at x60, no matter what generation. And the best ones, quality/price-wise, are the x70 and x80.
JFSOCC Posted February 24, 2014

Right, but this changes nothing about the fact that if my card were supported it would be able to run these games more smoothly, especially the games I should be able to run but can't.
alanschu Posted February 24, 2014

Um, actually it does. As you say, laptops aren't that uncommon. But the reality is that your particular card/chipset isn't a particularly powerful one. And if it doesn't have the hardware power, there's little hope that the software is going to get the type of increased performance that you are hoping for. It's not as simple as "support my card" (and besides, that'd be more in line for the game developer doing the work, rather than the chipset manufacturer). I can throw 2 GB on a TNT2 but it's not going to make a lick of difference.
JFSOCC Posted February 24, 2014

I guess I was wrong then. I still think it's absurd not to support laptop video cards.
kirottu Posted February 24, 2014

The basic rule of video cards is that the second number means more than the first number.
mkreku Posted February 24, 2014

The first number usually denotes which generation of card it is (although that enumeration has been kind of lost by now). The second number shows the power of the card (higher is better). The third number shows if the card is an alternate version (zero is the base; 5 is usually the alternate version; differences are usually small). These rules apply to both Nvidia and AMD.
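To make the rule of thumb above concrete, here is a minimal sketch in Python that applies the tier boundaries Keyrock and mkreku describe to a GeForce model number. The `classify_geforce` helper is made up purely for illustration, not any official Nvidia tool, and as the next posts point out, rebrands and Ti variants mean real product lines have plenty of exceptions.

```python
def classify_geforce(model: str) -> str:
    """Apply the forum rule of thumb to a GeForce model string like '710M'.

    First digit = generation, second digit = performance tier
    (x10-x40 entry, x50-x60 mid, x70+ high-end), trailing 'M' = mobile.
    Hypothetical helper for illustration; real lineups have exceptions.
    """
    digits = "".join(ch for ch in model if ch.isdigit())
    generation, tier = digits[0], int(digits[1])
    if tier <= 4:
        level = "entry-level"
    elif tier <= 6:
        level = "mid-level"
    else:
        level = "high-end"
    mobile = " (mobile)" if model.upper().endswith("M") else ""
    return f"generation {generation}, {level}{mobile}"

print(classify_geforce("710M"))  # generation 7, entry-level (mobile)
print(classify_geforce("650M"))  # generation 6, mid-level (mobile)
print(classify_geforce("780"))   # generation 7, high-end
```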
Keyrock Posted February 24, 2014

Quoting mkreku: "The first number usually denotes which generation of card it is (although that enumeration has been kind of lost by now)."

Yeah, with the way both Nvidia and AMD rebrand old chipsets into new cards, that first number doesn't mean much of anything any more. Add to that the new GeForce 750 and 750 Ti, which are the new Maxwell architecture yet still use the same first digit (7) as what are mostly Kepler cards, and then you have some cards with the same first digit that are still Fermi architecture... It's a bloody mess.
AGX-17 Posted February 24, 2014

Quoting JFSOCC: "You're telling me that a 2GB DDR3 card is entry level? It's a compromise for sure, but I would rather call it mid-level than entry-level."

nVidia doesn't call it mid-level. That would be a 750. The amount of memory and the data rate of the RAM isn't what ultimately determines power and performance; it's the GPU. It also depends on what you're trying to do with it. While the specs on nVidia's site say it supports DX11 and can draw at resolutions up to 2560x1600, they don't say it can do that with games that utilize DX11 to its potential, at full resolution with post-processing effects, or that you'd get a workable framerate out of it. I'm not even touching the subject of AA and SSAO; that'd be like talking about how fast a Pinto reaches escape velocity. Maybe you can crank everything up to the max if you lower the resolution to 320x240.
Zoraptor Posted February 24, 2014

The reason they don't support laptop cards is that there is already a large number of different configurations even if you only count desktop cards based on reference designs. If you start doing things to reduce the heat or power draw, as is often done in a laptop, then you change how the card works and make it potentially (and in practice almost always) less powerful than its desktop equivalent, on top of the aforementioned custom driver issues. That becomes a problem because, under the usual sort of spec classification, the equivalent laptop card may not work at all, may work with reduced features, or may work considerably slower than the desktop version. It may even overheat the laptop/card and cause crashing, because most laptops are only really designed to run Win7, a browser and other productivity applications. You have to be prepared for a lot of extra support tickets due to those issues if you formally support laptop cards; it's easier to just not support them and put the risk onto the buyer.

And on occasion manufacturers may massage the truth about their cards' capabilities as well. Intel insisted that their old integrated laptop cards supported OpenGL. This was a... questionable interpretation, in that while some things relying on OpenGL would run, many wouldn't, or would run only at a pace slower than chilled treacle.
Althernai Posted February 24, 2014

The architectures of laptop and desktop GPUs are identical; it's just that the laptop versions have fewer hardware resources and lower clock speeds to keep the heat and power draw down. The reason developers say that laptops are not supported is that laptop manufacturers like to put custom drivers on their machines and there's no way to test every single variety of laptop. As long as you have the generic drivers, the games should work just fine. I've been playing games on laptops for more than a decade now (my last three GPUs were the Mobility Radeon 9800, the GeForce 8600M GT and currently the Radeon 6770M) and I've never had a compatibility problem that a driver update didn't solve. At most, some games (like BioShock 2) will warn you that they're not guaranteed to work on laptops, but they work just fine anyway.

By the way, don't be fooled by the RAM -- it's there exclusively for marketing purposes. There's simply no way a GeForce 710M (with its 14.4 GB/s of memory bandwidth) can ever make use of 2GB of memory, at least not while playing games at a reasonable frame rate.

More generally, you can find a good summary of all GPUs on Wikipedia; here are the pages for Nvidia, AMD and Intel (the Intel ones are sadly not as detailed as the other two). The most important number is what the wiki refers to as "processing power" (in GFLOPS). The second most important is probably the memory bandwidth, which is what currently limits the APU (integrated GPU) offerings. The other numbers are mostly correlated with those two. The GeForce 710M in particular can be found here. As you can see, it is the weakest of the 700M cards. To be honest, I don't understand its purpose, since the Intel GPU built into the CPU of your laptop is almost certainly more powerful.
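To put that bandwidth point in perspective, here is a back-of-the-envelope sketch. The 14.4 GB/s figure comes from the post above; the 30 fps target is an assumption picked purely for illustration.

```python
# Rough upper bound on how much VRAM a game could even touch per frame.
bandwidth_gb_per_s = 14.4  # 710M memory bandwidth, per the post above
target_fps = 30            # assumed "reasonable frame rate" for illustration

# Even if every byte of bandwidth went to reading VRAM exactly once,
# this is the most memory a single frame could consume:
gb_per_frame = bandwidth_gb_per_s / target_fps
print(f"Max VRAM touchable per frame: {gb_per_frame:.2f} GB")  # ~0.48 GB
```

In other words, at a playable frame rate, less than a quarter of that 2GB could even be read once per frame, which is why the extra memory is marketing rather than performance.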
Bartimaeus Posted February 25, 2014

Quoting JFSOCC: "I guess I was wrong then. I still think it's absurd not to support laptop video cards."

In what way do you think they are not supported compared to desktop cards? Your video card is essentially for running a modern desktop interface, fairly unintensive indie games, or games that are fairly old (at high settings, anyway; more than 4-5 years, say?). There's not much more to squeeze out of a 710M... I think the fact that you were even *TRYING* to play XCOM at higher settings and weren't running into severe difficulties suggests that the drivers are performing very well indeed...
Humanoid Posted February 25, 2014

The short of it is that hardware manufacturers are filthy lying bastards. Rebranding is rampant: previous-generation cards are relabelled to appear current-gen, additional dead-weight memory is added for no purpose other than to look good on a spec sheet, and mobile cards use the same model numbers as desktop models while performing at a fraction of the speed. They even have fake generations just to look better (e.g. the nV GT300 and AMD HD8000 series, which exist solely to exploit the "number is higher, so must be better" class of customers). There's some advice about naming schemes above, but just bear in mind that fundamentally the names are based around desktop models. There are a lot more gotchas and deliberate obfuscation in the mobile area. In the event a mobile GPU shares a model number with a desktop one, halve the expected performance as a rough guide, at *best*.