Everything posted by samm

  1. Awesome video. Still, if anyone buys the card with the reference cooler and keeps that piece of yesterday-tech installed, it's their own fault. Just looking at the 290 (non-X) and what a difference cooler speed makes, I'd say investing in a custom cooler and slightly undervolting the card, without any overclocking, will propel it to Titan levels of performance for less than half the price.
  2. Total system power consumption - it's not just the card. What are you suggesting about ATI (AMD?) obscuring something? I don't quite get that part of your posting. Erm, yes, but in absolute numbers it's a rather small percentage of a desktop's power consumption at idle, even if it is 20W vs. 5W. In the mobile sector such differences would be huge of course, but there's nothing anywhere near as powerful as the discrete desktop cards.
  3. Seems like an obsession for some users to pledge and pledge more and pledge even more - maybe Roberts should close the thing at 30 million or so to avoid getting a reputation of ruining his backers. Although it's their own fault of course. Hopefully, there will be a lot of good exploration gameplay to be had, and a not overly humorless militaristic Squadron 42 story...
  4. Nice write-up, though not very current, it seems
  5. In a way, it is no longer very necessary. Cards have suitable power saving modes when nothing is demanded from them, i.e. on the desktop, lowering clock speed and voltage significantly so they dissipate no more than about 10-20W for the high end ones. If the screen goes to sleep / is unplugged for a sufficient amount of time, newer AMD cards power down to about 3W. The switch to an integrated GPU would not save that much anymore.
  6. You're right. Regarding the timeline, my memory was really inaccurate. The HD 4770, containing the first 40nm GPU architecture, debuted in early 2009, and Cayman as the last one debuted at the end of 2010, so a good two years between them. Almost the same time span between Tahiti and Hawaii.
  7. Haha, yeah, generic is meant to sound negative on my part: It's not their description, but mine, based on what I've seen of it so far. I like the looks and the turn-based system, I also like the team, but the setting struck me as, well, too generic for my taste.
  8. So http://deathfiregame.com/ is getting kickstarted soon. A first-person, turn-based, party-based RPG in a generic fantasy setting. Looks like a pretty dungeon crawler. Also, it's by Guido Henkel, who on the one hand has great games like Planescape: Torment and the Dark Eye trilogy of old on his list of achievements, but is also known to ramble a lot about how bad the PC gaming industry has become. Ah well, I'll be interested to see how this turns out
  9. Not too similar - Diablo III obviously uses a completely different design approach (dumbed down, yes, there you have it) to combat, and DA:O even in zoomed-out state and on high difficulty still was not as RTS-like to control as the Infinity Engine games - marquee selection, formations etc. - which are the proclaimed model for P:E. In my opinion, wishing for controller support in P:E is as meaningful as wishing for it to be a 3D first-person game. It's just against its very initially stated design goals. However, I fully share the following sentiment:
  10. It is not, you can specify folders to back up (at least in Win 7). The Sync functionality can be quite a PITA; I have not found out how to disable the Sync list entry if the origin or the destination folder is no longer reachable.
  11. 750W is of course ridiculous. But the power consumption *is* high, in the > 200W range in games, at least with stock cooling. If cooled to better temperatures with an Accelero or water-cooled, it should use quite a bit less juice. The problem is that the projection of both IHVs did not work out regarding the progress in manufacturing tech - they have been releasing GPUs on the same process node for several generations now, and are just pushing it to the limit.
  12. As stated in the other thread, I really dislike this Latin pronunciation, as well as the phrases used for incantations (thanks for spelling them out, Sensuki! ) I can't help but feel embarrassed for the writers/speakers/characters hearing them. So for me, it's a [x]NO. Yes to cool incantations, no to Latin. Use the conlangs instead, for example
  13. Awesome card, price-, performance- and feature-wise. It even removes the woes of Crossfire. The cooler is a shame though, and the new PowerTune will take some testing on the user side to get used to - meaning that setting a temp target too low may cause the GPU to throttle if you don't allow the cooler to spin up high enough, etc. I'm happy with my current setup, but the next gen of nVidia and AMD cards - I won't be buying anything new before then - will be even more interesting due to AMD's ability to keep up at the high end again. Also... I want TrueAudio to succeed. Ideally with peripherals, i.e. discrete DSP cards, to allow nVidia or older-generation AMD users to get the benefit of truly well-computed sound as well... ah, a man can dream...
  14. Temps and max voltage are way too high. Firstly, disable Turbo mode in your BIOS, as it requires a massively higher voltage (your 1.376 probably) and only marginally adds to performance. Secondly, as stated by Humanoid, get a cooler that works. Not only will the PC be more stable with lower temperatures, it will also be faster, because the FX will throttle at these high temps. Regarding the cooler, I'm using a Megahalems Rev. C on my FX 8350, and it does a reasonable job cooling it, even with a little overclocking. Just make sure the cooler you get will fit in your case and on AMD Socket AM3. Anything will be better than what you are using currently.
  15. Uh, I hope they do not use "Latin" casting voices, after all the effort they put into developing specific languages for the different cultures in the game. Everything "Latin" in games with English speakers bugs me to no end. Being from a country with German, French, Italian and Romansh as official languages, it always sounded ridiculous to my ears, and by now, having taken Latin classes and studied languages, it has become even more of a distraction when playing such a game.
  16. Receiver between PC and TV for both audio and video, yes. The PC 'feeds' the receiver via HDMI, the receiver 'feeds' the TV via HDMI and the speakers via speaker cables (just plain naked copper cables).
  17. I would spend slightly more on the mainboard and use the standard M5A97 R2.0, because it has a heatsink on the power section. The FX heats the mainboard pretty well. I'd also upgrade the memory to DDR3-1866; it may only be a few percent faster, but your processor officially supports it and 1333 isn't noticeably cheaper. Check the mainboard's official support list for guaranteed compatibility - I think the Ripjaws X do work with all the Asus boards though. Other than that, I don't have any complaints. Ok, the Battlefield 4 Beta supposedly has a problem with the six cores (way slower even than four cores), but this is a bug. No "go the Intel route" criers yet? I'm positively surprised
  18. If you intend to go the Blu-ray / HD streaming route, please consider using a receiver. You plug in the graphics card via HDMI, with no need for any kind of sound card or onboard audio. With an added benefit for music as well regarding quality and price. Reasons:
     1. Timing. Blu-ray (or other HD content) often uses 23.somethingsomething fps, and when separating audio content from video content (i.e., plugging the graphics card into the monitor and the speakers into a separate audio output) there will be periodic re-sync issues where the image/audio stutters until it's in sync again. This won't happen if you use HDMI, because the audio and video signals will stay in sync automatically.
     2. Sound quality. Digital transmission to the receiver, whose amp will deliver better audio quality (SNR, crosstalk etc.) than any onboard solution and most sound cards. Passive "monitor" speakers are a safe bet for quality for the money - you can spend $1000 on a "home theatre setup" from Samsung or whatever with tiny loudspeakers and rumbling bass, or have a better experience with components you select to your own taste and budget.
     3. Flexibility. You automatically have a "relay" station between PC and screen - two shorter HDMI cables are less expensive than one long one*. You can add more speakers effortlessly should money permit it in the future. You can buy any length or quality of speaker cable**
     -----
     * don't let audio-/videophiles talk you into buying super-special-silvercoated-quadruple-isolated-gold-contact-$100/m cables
     ** don't let audiophiles talk you into buying too fat ones: expensive but crappy handling, and no immediate gain unless you transmit absurd amounts of power, which the amps in a home receiver don't deliver anyway
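     To put a rough number on the timing point: the post doesn't name the exact cadence, but assuming the common NTSC "film" rate of 24000/1001 fps, a playback chain that treats it as an even 24 fps drifts by a few seconds every hour - which is exactly the periodic re-sync the post describes. A quick sketch of that arithmetic:

     ```python
     from fractions import Fraction

     # The "23.somethingsomething fps" from the post, assumed here to be
     # the NTSC film rate of 24000/1001 ≈ 23.976 fps.
     video_rate = Fraction(24000, 1001)
     rounded_rate = Fraction(24, 1)       # what a naive playback clock assumes

     hour = 3600                          # seconds of real playback
     frames_shown = video_rate * hour     # frames actually delivered
     time_assumed = frames_shown / rounded_rate  # wall time a 24 fps clock expects

     drift = hour - float(time_assumed)   # seconds audio runs ahead of video
     print(f"drift after one hour: {drift:.2f} s")  # ≈ 3.60 s
     ```

     About 3.6 seconds of drift per hour is far more than the ~100 ms threshold at which lip-sync errors become noticeable, hence the periodic stutter-and-resync.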
  19. Hell yes, how I would love to :D However, I'd prefer it not to be fantasy-like (think Star Wars or ME) and to have a lot of SPACE to enjoy - vast and deadly silent and black it should feel, instead of colourful and populated
  20. Yay :D Finally someone is pushing positional audio again, and even Creative's CMSS-3D seems to have found a worthy successor! Along with the announcement of AMD's Hawaii and smaller graphics cards, they pitched something called "AMD TrueAudio". FMOD is along for the ride, as well as Wwise and GenAudio. It features precise calculations regarding orientation and elevation and can, but does not have to, leverage an additional DSP for computation. Here's hoping that this initiative works out and the lack of quality brought along by Vista and the previous-gen consoles is about to be replaced by something better again. Maybe not relevant to P:E, but hopefully to Obsidian's audio guys in other contexts.
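  For readers unfamiliar with what "orientation" calculations mean here: the simplest building block of positional audio is panning a source between speakers by its azimuth. This is not TrueAudio's actual code or API (neither is public in the post), just a minimal constant-power panning sketch:

  ```python
  import math

  def constant_power_pan(azimuth_deg: float) -> tuple[float, float]:
      """Map a source azimuth (-90 = hard left, +90 = hard right)
      to (left, right) gains whose squares always sum to 1, so the
      perceived loudness stays constant as the source moves."""
      # Normalize the azimuth to a pan angle in [0, pi/2].
      theta = (azimuth_deg + 90.0) / 180.0 * (math.pi / 2.0)
      return math.cos(theta), math.sin(theta)

  left, right = constant_power_pan(0.0)  # source dead ahead
  # Both gains are ≈ 0.707 and left**2 + right**2 == 1.
  ```

  Real positional-audio engines extend this with elevation cues (HRTFs), distance attenuation and reverb, which is where a dedicated DSP starts to pay off.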
  21. And for that issue, there'd be another, more suitable thread here: http://forums.obsidian.net/topic/63884-okay-so-high-quality-video-has-been-covered-what-about-high-quality-audio/ This one was not intended to be about sampling rates or resolution.
  22. And again, the concept of HDMI is complete crap even in its 2.0 version. Supporting 32 audio streams but only 60Hz @ 4K? Probably to save bandwidth/processing power for HDCP... and for idiotic new features like auto lip-sync - wtf do these things technically have to do with a *display connection* standard? Nothing. And all this comes with license fees. I'd say the consortium needs more techies and fewer marketing and lobby persons. Improving DP further would be the better route to take.
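  A back-of-the-envelope calculation shows why 4K@60Hz is already near the ceiling of HDMI 2.0's 18 Gbit/s TMDS link (the blanking intervals and exact video timings are simplified away here):

  ```python
  # Raw pixel payload for 4K UHD at 60 Hz, 8 bits per RGB channel.
  width, height = 3840, 2160
  refresh_hz = 60
  bits_per_pixel = 24

  payload_gbps = width * height * refresh_hz * bits_per_pixel / 1e9
  print(f"raw pixel data: {payload_gbps:.2f} Gbit/s")  # ≈ 11.94 Gbit/s

  # HDMI 2.0's TMDS link is 18 Gbit/s gross; its 8b/10b coding leaves
  # 14.4 Gbit/s for data, so active video plus blanking only just fits -
  # little headroom for higher refresh rates or deeper color.
  ```

  That tight margin is why pushing past 60 Hz at 4K had to wait for a faster link.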
  23. Oh Ubisoft, how I love you! Always something good up your sleeve to improve your public image *searches the net for the footage*
  24. 's just like a movie... life doesn't move me http://www.youtube.com/watch?v=eRFvHLRfhCI