
Nvidia unveils the GTX 1060


25 replies to this topic

#21
Humanoid

    Arch-Mage

  • Members
  • 3633 posts

Worth noting that the 4GB cards, out of the box, have a lower memory speed than the 8GB variants. Given that the actual memory chips are identical, you'll be able to easily overclock them to at least the same stock speeds as the 8GB cards. Most reviews won't be doing so, however, so you can add a few percent onto the benchmark scores for a more accurate representation of likely performance.
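To put rough numbers on that "few percent": memory bandwidth is just bus width times effective data rate. The figures below are illustrative assumptions (a 256-bit bus, 7 Gbps on the cut-down card vs 8 Gbps on the full one), not specs quoted in this thread:

```python
# Sketch: GDDR5 bandwidth in GB/s = (bus width in bits / 8) * effective data rate in Gbps
def bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
    return bus_width_bits / 8 * data_rate_gbps

# Illustrative numbers only: 256-bit bus, 7 vs 8 Gbps effective rate.
stock_4gb = bandwidth_gbs(256, 7.0)  # 224.0 GB/s
stock_8gb = bandwidth_gbs(256, 8.0)  # 256.0 GB/s

# Raw bandwidth deficit of the slower card; real-world performance scales
# less than linearly with bandwidth, hence "a few percent" in benchmarks.
deficit = (stock_8gb - stock_4gb) / stock_8gb * 100  # 12.5% bandwidth deficit
print(f"{stock_4gb:.0f} GB/s vs {stock_8gb:.0f} GB/s ({deficit:.1f}% deficit)")
```

Overclocking the 4GB card's memory back to the 8GB card's stock rate closes that gap entirely, which is the point being made above.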



#22
Gizmo

    (7) Enchanter

  • Members
  • 811 posts
  • Pillars of Eternity Backer
  • Kickstarter Backer

... and that combined with brand loyalty suggests to me that it'll probably sell very well regardless.

I used to be staunchly in the ATI camp because of its better DVD playback at the time, but I had to switch to Nvidia for its performance boost in Blender, and I've never had an incentive to switch back ~yet. So it's always going to be Nvidia for me until the Blender situation changes ~if it ever does.



#23
Gorgon

    Forum Moderator

  • Moderators
  • 4450 posts

LED mode on/off is listed as a 'feature'. When are they going to start listing 'no LED nonsense' as a feature?

 

LEDs are like ketchup: you can put them on anything, and sometimes you definitely shouldn't.



#24
Bartimaeus

    (10) Necromancer

  • Members
  • 1402 posts
  • Steam:Ask!

Well, personally, I would instead say something more like, "LEDs are like ketchup: you shouldn't be putting them on anything you actually like." ;)



#25
teknoman2

    (9) Sorcerer

  • Members
  • 1346 posts
  • Pillars of Eternity Backer
  • Kickstarter Backer

 

... and that combined with brand loyalty suggests to me that it'll probably sell very well regardless.

I used to be staunchly in the ATI camp because of its better DVD playback at the time, but I had to switch to Nvidia for its performance boost in Blender, and I've never had an incentive to switch back ~yet. So it's always going to be Nvidia for me until the Blender situation changes ~if it ever does.

 

The main problem with the performance of AMD cards and CPUs is that they are locked into a contract with GlobalFoundries for their silicon. If they could get it from TSMC, like Nvidia does, they could clock the cores to the same speeds, and combined with all the innovations they have crammed in there, they could be way ahead.



#26
Gizmo

    (7) Enchanter

  • Members
  • 811 posts
  • Pillars of Eternity Backer
  • Kickstarter Backer

The main problem with the performance of AMD cards and CPUs is that they are locked into a contract with GlobalFoundries for their silicon. If they could get it from TSMC, like Nvidia does, they could clock the cores to the same speeds, and combined with all the innovations they have crammed in there, they could be way ahead.

But in Blender's case, Blender supports CUDA GPU parallel rendering. It does have OpenCL support for a limited selection of AMD cards, but that support is still early and not optimized.





