Posts posted by AwesomeOcelot
-
Wub wub wub *random electronic samples* build up... *break* drop that completely random bass with massive wobble. No thanks. I have a strong hatred for modern grime and dubstep; they weren't my favourite genres to begin with, but now they seem to be dominated by talentless idiots. Can I get some structure? Maybe a little rhythm? Melody? I can't tell one track from the next. I'm sorry I'm not blind drunk and coked off my **** enough to enjoy this. Remember when Wipeout came out with awesome soundtracks, with songs like Fluke - "Atom Bomb"?
-
I'm OK with paid mods, and with the platform getting a cut. I think 75% is a bit large, mostly Bethesda's cut. For what? It would be great if, for instance, a texture company sold a texture pack made from their assets for a game, a musician sold an alternative soundtrack, or a former EA artist sold building packs for Cities: Skylines. I hope this results in professionals and talented people being able to dedicate time to modding where they would not have before. It would be great if rights holders used their IP in mods. The pricing seems to be way off, though: these mods seem to be even more expensive than the majority of rip-off DLC that a lot of games come with now.
-
Grimrock and Blackguards were in Humble Bundles... and have been deeply discounted numerous times. Ownership doesn't represent margins and profit. Grimrock is a more niche game than PoE; if PoE gets the same treatment through sales and bundles, it would do even better than Grimrock did. Blackguards was also heavily criticized.
-
Rebuilt old PC with new case/cooling:
CPU: Intel i5 2500K @ 4GHz
MoBo: Asus P8Z68-V Pro
SSD: Crucial M4
HDD: Western Digital Caviar Green 1TB
RAM: Kingston HyperX DDR3 1600 16GB
Cooler: Corsair i100
Case: Fractal Define R5
ODD: DVDRW
Upgraded my HTPC now with:
Kingston HyperX 1600 8GB
Western Digital Green 3TB.
Main PC:
Case: Fractal Define XL R2
PSU: Corsair HX850i
CPU: Intel Core i7-5820K 3.30GHz
CPU Cooler: Corsair Hydro H110 280mm
MoBo: ASUS RAMPAGE V EXTREME
RAM: 16GB G.Skill Ripjaws 4 F4 2666MHz
GPU: Asus GeForce GTX 970 4GB DirectCU II OC STRIX
SSD: Crucial MX100 512GB
SSD: Samsung 840 EVO 250GB
HDD: Western Digital 4TB Black
Keyboard: Corsair Vengeance K70
Gamepad: Xbox One Controller
Mouse: Logitech G502
Mousepad: Razer Goliathus Speed Medium
Monitor: AOC G2460PG 24" G-Sync 144Hz 1ms LED LCD TN
2nd Monitor: Asus VE247 60Hz 2ms LED LCD TN
Soundcard: Asus Xonar U7 Compact 7.1
Speakers: Creative GigaWorks T40 Series II
Headphones: AKG K551
ODD: Samsung SE-506CB 6x Blu-Ray Writer Slim
Laptop: Lenovo Flex 2
CPU: Intel Core i7-4510U 2.0 GHz
RAM: DDR3 1600 16 GB
SSD: 256 GB
ODD: DVDRW
GPU: Nvidia GeForce 840M
OS: Windows 8.1
Display: Multitouch 15.6" LED 1080p
-
I don't think SteamSpy would be able to tell which version it was. It might be possible for it to, but since it measures all Steam ownership I doubt he makes individual rules for games.
-
If it's the same profiling/API scraping technique Ars and SteamSpy use, then it would include backers who used a Steam key. In terms of numbers, games get spikes on release and during sales. Performance is not bad considering the game cost $4m and took 2.5 years. D:OS sold 160K copies in a week; according to SteamSpy, PoE sold ~153K ± 25K (not factoring in backers after the Kickstarter). PoE is more expensive and doesn't have multiplayer, and if reports are true that they started development after Divinity II, D:OS took 4 years to develop and cost €4.5m.
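For anyone wondering where SteamSpy's "± 25K" comes from: it scrapes a random slice of public profiles and scales up, so the error bar is just a binomial confidence interval. Here's a rough sketch of the arithmetic; the sample counts and the 150M Steam population are hypothetical numbers I picked to land near the figures above, not SteamSpy's actual internals.

```python
import math

def ownership_estimate(sample_owners, sample_size, steam_population):
    """Scale a random-profile sample up to the whole population,
    with a ~95% binomial confidence interval (SteamSpy-style)."""
    p = sample_owners / sample_size                      # observed ownership rate
    estimate = p * steam_population                      # scaled-up owner count
    margin = 1.96 * math.sqrt(p * (1 - p) / sample_size) * steam_population
    return estimate, margin

# hypothetical: 102 owners seen in a 100,000-profile sample,
# against an assumed 150M Steam accounts
est, moe = ownership_estimate(102, 100_000, 150_000_000)
print(f"{est:,.0f} ± {moe:,.0f}")  # roughly 153,000 ± 30,000
```

The margin shrinks with the square root of the sample size, which is why SteamSpy's error bars are wide for niche games.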
-
Backer Content

It's come to our attention that a piece of backer-created content has made it into Pillars of Eternity that was not vetted. Once it was brought to our attention, it followed the same vetting process as all of our other content. Prior to release, we worked with many of our backers to iterate on content they asked to be put into the game that didn't strike the right tone.

In the case of this specific content, we checked with the backer who wrote it and asked them about changing it. We respect our backers greatly, and felt it was our duty to include them in the process. They gave us new content which we have used to replace what is in the game. To be clear, we followed the process we would have followed had this content been vetted prior to the release of the product.

We appreciate the faith you have all given us into making Pillars of Eternity the great game that it has become, and we appreciate the strength of conviction all of you bring to every conversation we have together.

Sincerely,
Feargus Urquhart, CEO
Obsidian Entertainment, Inc.
From Firedorn: I had the choice not to change it. They simply emailed me and asked if I was OK with changing it, but I could choose not to.
From CEO: It failed our vetting process. We have already worked to change ("iterate") other content that didn't strike the right tone. We include backers in this process.
I guess what seems surprising is Firedorn's assertion that it was his choice.
They probably emailed Firedorn asking, "In light of the controversy, would you like to change your memorial?" That might have given Firedorn the impression that it was his choice, but it doesn't tell us either way, since we don't know what would have happened if he had replied, "No, I'd rather not". That it "failed our vetting process" leans towards them removing it entirely if the backer refused to change it themselves. Which I'm fine with; the devs should decide, but if they are influenced by one **** gamer on twitter crying about bull****, then that's not OK. Have your opinions and stick by them, don't give in to political correctness or fringe nonsense bullying. If Obsidian found it truly offensive, transphobic, then I can live with it, but I just can't see it. Also, this is the same studio that loves South Park and made The Stick of Truth, amongst other things. No one should be giving those ****ers an easy win like this.
-
I love how you think anyone will take you guys professed motives seriously when your posts lecture about how misgendering is ok.
Like, credibility went all the way out the window there.
Do you even know what "misgendering" means? Would you care to point me to which posts "misgender" somebody? I've been reading the replies here and I haven't seen anybody misgender anybody else. I might have glossed over the parts you see so clearly, or you might just be making up words and definitions, like that twitter person and their "transmisogyny".
Here are some definitions:
http://www.oxforddictionaries.com/definition/english/misgender
http://en.wiktionary.org/wiki/misgender
Calling a trans woman "he" or "it" when their preferred pronouns are "she/her" is called misgendering. Transmisogyny is basically the intersection of transphobia and misogyny: misogyny that trans women face primarily because of their trans status. This isn't very difficult to understand.
It can't be misogyny because they're not women. Calling someone "it" is not misgendering, it's dehumanizing. It's not misgendering because they're not women; if you called a woman "him", then that would be misgendering. I'm all for their individual rights, they can refer to themselves how they want, and I might even humour them personally, but gender is completely subjective; they have no right to dictate how others perceive gender.
-
...making a stupid "Joke" about trans women being men.
It's also interesting that the OP of this thread took exception to this, not to the stereotyping and phobia in the contents of the joke. It's not even mentioned that the character in the joke is trans. Also, it's up to the individual whether they consider trans women men or not.
-
It's OK to be transphobic, e.g. to feel shame for sleeping with a man when you thought he was a woman. It's wrong to expect shielding from this point of view, in life or in fiction. I'm not transphobic, although I don't subscribe to all the views and language some trans people use. I can't see how that situation can possibly be rape, for instance. I don't want to live in a world where anything that might possibly be offensive or have an impact on people is censored. I find many things offensive in media, including games, but it's only particular groups of people that seem to want to censor things. That's surely going to bring people to your side; I'm sure that's going to win hearts and minds. Even when I support their point of view, I think "these people are giant arseholes".
Neil Gaiman recently wrote a book, Trigger Warning, after people wanted to ban disturbing books from universities, even classics. I'll be really disappointed in Obsidian if they remove this joke from PoE; think about what kind of message that sends out. Also consider the implications for a possible future Kickstarter: these people are not your backers, and they don't even speak for all trans people.
-
People have highlighted segments of 50 Shades of Grey and The Da Vinci Code, and they undoubtedly contain some of the worst writing in history, yet they are two of the most popular books. Elitism isn't bad; I think these books prove we need a lot more of it. And the same goes for games: a game can be popular, but that doesn't make it good.
-
Giant publishers decide which pitches they fund or don't; a developer is not in the driver's seat. Always-online, DRM, and social media features come from the publisher. Bioware are the ones behind having a franchise go from successor to BG/NWN, to ****ty 3rd person action RPG, to boring faux MMORPG. I don't think they were trying to "fix" anything; they wanted to turn Dragon Age into Mass Effect, then they wanted to turn it into The Old Republic. If Bioware start making an FPS series with a new IP, the next Dragon Age is going to be an FPS-RPG.
I think the key difference between these developers before and after they started to suck is that before, they made games because they love games and wanted to play the games they were making; afterward, they adopted a different mentality, chasing demographics and thinking too much about what "the players" want, or worse, listening to what they say they want. I think that mindset comes with being taken over by a publisher. Of course, gamers are to blame for this; they reward **** at every turn.
-
Judging by the name, the GPU will likely be a GTX 970, which, lawsuit aside, is a damn good GPU. No word yet on pricing or when it will hit the market.
The 970M laptop GPU has more horsepower than the PS4, by about the same margin as the gap between the PS4 and XB1.
-
I will never forget them what they have done to Fallout universe
Successfully resurrected it and then gave Obsidian a chance to improve it...Bastards!
From their perspective, maybe; for fans of Fallout it was a complete failure. They didn't even try, but then they obviously don't have the talent to create a Fallout game.
-
Hardware Canucks and PC Perspective also fail to find the stutter. The GTX 970 does not have issues going over 3.5GB. People experience problems with games all the time, and this is a nice scapegoat for them.
-
Fahrenheit is one of the worst games I've ever played. The plot is nonsensical, the voice acting is awful, the characters and dialogue are really bad, and the gameplay mostly consists of QTEs, including climbing by pressing left or right for each grab/step. There's an interactive QTE sex scene, segments where you just piss about until timed events happen, and a lot of the time the game uses a 3rd person fixed camera, from the most frustrating angles imaginable, like walking towards the camera.
-
Anandtech was also unable to find performance problems with the GTX 970 over 3.5GB.
Underclocking an 980 and its memory could give you a 970, if the GPU would actually be designed as it was marketed in the first place. But it is not.
This is true, it also contradicts what you posted, and it invalidates the methodology of the benchmark. If that site is claiming that this shows a performance problem with using two pools of memory, then I'd suggest you stop using that site.
-
This is wrong: a) there are tools that will show you what the 970 is using, and b) the card is designed so that the game will only allocate 3.5GB if that's all it needs. On a 980 it doesn't really matter whether it allocates 3.5GB or 4GB. Thus it's reasonable to assume that the memory use of the 970 is between 3.5 and 4GB in this case.
My assumptions from a) do not apply for the 290X, so the first point in section b) is irrelevant (different architecture, different driver, game very likely aware of architectural differences between 290X and 9xx).
It shows that this behaviour is not indicative of having two pools of memory, but of the GPU hitting a wall, and the 290x is a similarly priced and performing GPU.
between a 980 and a 970 that are clocked to have identical processing power.
That is not possible with the differences in L2, ROP, and SM. Underclocking a 980 does not give you a 970.
the same theoretical horsepower of the same architecture with the same driver it could be expected to behave identically.
But they don't have the same theoretical horsepower.
-
You can't hotlink them. All 4 of the image links are from the same article.
-
Yes it has; I posted links to game benchmark results showing an occurrence of such problems.
a) The graph is reporting the engine using less than 3.5GB of VRAM which should only use the one pool anyway, some articles claim certain tools read the amount of allocated VRAM incorrectly.
b) There's a 3rd graph of the R9 290X using 4GB and having lots of spikes, which only means that the GTX 980 is a better GPU than a GTX 970 or Radeon R9 290X. Spikes are what happen when the GPU starts falling over due to excessive demand on it. Many cards have spikes at those settings, and many cards have worse spikes than the GTX 970. We've already seen frame-time benchmarks for the GTX 970 at ultra settings in games like Watch Dogs, and the GTX 970 didn't perform worse than similar-class cards with a single 4GB pool.
c) A GTX 970 is not an underclocked GTX 980, as you yourself acknowledged.
d) I don't understand the argument, now that we know the pools can be accessed simultaneously, that this could cause significant stuttering. A game like Watch Dogs is only using one pool for some reason, and it's the slower 512MB pool? That's not going to happen. Stuttering can be caused by a lot of things, and people latched onto Nai's benchmark even though it's meaningless in terms of real-world game performance. If the partial disable is causing stuttering, it's not for the reasons stated; it's not because the 512MB is a slower pool. I seem to recall people complaining about stuttering in Watch Dogs with a variety of cards, including the various Titans.
-
Hm, as far as I've gathered, if the GPU is accessing the 512MB in question, it has to do so exclusively.
That's what the Anandtech article says, but this is what the PC perspective article says:
To those wondering how peak bandwidth would remain at 224 GB/s despite the division of memory controllers on the GTX 970, Alben stated that it can reach that speed only when memory is being accessed in both pools.
As I was writing this I noticed an update further down:
I wanted to clarify a point on the GTX 970's ability to access both the 3.5GB and 0.5GB pools of data at the same. Despite some other outlets reporting that the GPU cannot do that, Alben confirmed to me that because the L2 has multiple request busses, the 7th L2 can indeed access both memories that are attached to it at the same time.
To my mind, if the 970 couldn't access both memory pools at the same time then the card would not perform anywhere near as well as it does.
In normal usage, a user will probably just feel a little more stutter when some of the impaired memory blocks are accessed every now and then. But if the "sour spot" is achieved, something like that can result:
Which hasn't been shown in game benchmarks.
And regarding the claim of that engineer that 28GB/s is four times faster than main memory bandwidth: if he was not talking about access times or anything like that, I'd be ashamed if I were him
Even a very common case of, let's say, a dual channel pair of 1600 DDR 3 (aka. PC3-12800, which shows the MB/s for one channel) offers bandwidth similar to that (at least between CPU and RAM, while the bandwidth available between GPU and system RAM is capped by a PCI-e 3 x16 at a theoretical maximum of about 16GB/s)
What I assume he's referring to is the scenario where the PC would use the system RAM instead of the VRAM, the PCIe link being around 4 times slower. In this context, I don't think anyone who knows a little about this would assume he's referring to the bandwidth of the memory itself; he's obviously referring to the bandwidth of the system memory in practice with this application. Also, in practice the bandwidth will be less than 16GB/s.
As for the possibilities to avoid the situation: game engines almost certainly will not account for one GPU's particular and very specific weaknesses, unless nVidia pays the developers for the extra effort and layer of abstraction they'd have to insert for that. The same is true for an OS, especially if developed prior to the common knowledge of this issue - which is every OS that is available now or appearing soon. So if anywhere, I'd expect this to be handled at driver level.
Going forward, Nvidia might employ this method for their second-"tier" GPU, so it may affect how game engines allocate memory, especially since the GTX 970 is pretty popular and Nvidia has the most market share. PC Perspective seem to think the OS, at least a modern OS like Windows 8, already sees the different pools and allocates memory based on their speed. Obviously the drivers are going to handle this as best they can.
I think people are overreacting to this, but Nvidia have shot themselves in the foot. The GTX 970 was a win in performance and cost for them, and the partial-disable technology allowed this card to exist, but now that's soured.
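The bandwidth figures being traded in this thread are easy to sanity-check. A back-of-envelope sketch using the textbook formulas (theoretical peaks only; real-world PCIe throughput is noticeably lower, which is presumably how the engineer's "4x" claim works out):

```python
def ddr3_dual_channel_gbs(mt_per_s=1600, bus_bytes=8, channels=2):
    # DDR3-1600: 1600 MT/s * 8 bytes per 64-bit channel * 2 channels
    return mt_per_s * bus_bytes * channels / 1000  # GB/s

def pcie3_x16_gbs(lanes=16, gt_per_s=8.0, encoding=128 / 130):
    # PCIe 3.0: 8 GT/s per lane, 128b/130b encoding, 8 bits per byte
    return lanes * gt_per_s * encoding / 8  # GB/s

ram_gbs = ddr3_dual_channel_gbs()   # 25.6 GB/s theoretical
pcie_gbs = pcie3_x16_gbs()          # ~15.75 GB/s theoretical
slow_pool_gbs = 28.0                # the 970's 0.5GB segment, per Nvidia
full_gbs = 224.0                    # both pools accessed together, per Nvidia

# even the "slow" pool beats the theoretical PCIe 3.0 x16 ceiling,
# and practical PCIe throughput is lower still
assert slow_pool_gbs > pcie_gbs
```

So spilling into system RAM over PCIe is always worse than staying in the 512MB pool, which is the point being argued above.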
-
Old games are still good games. I'm not a nostalgia-driven person; I've gone back to games I loved in the 90's and some of them do not hold up, and I often play old games I've never played before and enjoy them thoroughly. Some of the design choices of new games make them significantly worse than old games. It's worth making old games accessible, maintaining and improving them. I'd rather have a large pool of quality games to buy than more new mediocre games each year. Some old games require quality-of-life modifications; whether through arcade heritage or system limitations, some things can be incredibly annoying.
What DoubleFine are doing is good: the lighting is a vast improvement, the textures are better now, and I hear the soundtrack's improved. I don't have as much love for Grim as a lot of people, but it is a rare example of good plot, characters, and acting in video games; I can literally watch a let's play of it from start to end, and it's almost as enjoyable as a good TV series. Some of the voice acting is among the best in games, some of it is bad, and the soundtrack is great. The movement system is awful, the puzzles are irritating, you can get lost in dialogue trees that are way too large and have required triggers, selection in the environment can be frustrating, backtracking can be slow, and cycling through your inventory would be much faster with a menu. I didn't like Glottis; that got old fast. Is it a great game? Not really; on a first playthrough there are way too many problems with the gameplay, and the gameplay doesn't have much replayability. "Why was this game not an animated movie or TV series?" is not a compliment to a game.
-
The limitation in the GTX 970 doesn't seem to be showing in any benchmarks when compared to the 980, whether that's average fps or frame time, even at the highest settings. Are we buying GPUs for synthetic benchmarks or real-world performance? Reviewers are trying to produce evidence of this limitation in game benchmarks.
Nvidia still gave out false specifications, and that should be an automatic refund for false advertising. If Nvidia had sold the card with this information, the GTX 970 would still be the best deal on the market; even at 3.5GB it would still be the best deal. If it were found to affect game performance significantly, then that's a bigger issue.
That 970 memory issue where it will lag bad once you go over 3gb vram, is a deal breaker for me
It affects the last 512MB, not the last 1024MB.
Yeah, that issue makes the whole "4GB RAM" somewhat misleading. Obviously the RAM is physically there, but it could just as well not be.
An Nvidia engineer has stated that the 512MB pool is still around 4 times faster than system memory, and the GPU still has 224 GB/s bandwidth if memory is being accessed from both pools. The only time bandwidth is 28 GB/s is when the GPU is only accessing the 512MB pool, and it's possible for the OS, drivers, and game engine to stop that happening.
RANDOM VIDEO GAME NEWS
A lot of these Kickstarters seem to have private investment behind them, e.g. Star Citizen, Underworld Ascendant, Divinity: Original Sin. Some of them have the budget for a game but want a bigger game, e.g. Giana Sisters: Twisted Dreams, Defense Grid 2.
- Yacht Club Games
With a lot of Kickstarters you're basically expecting them to work for way under market pay, but on the flip side they get royalties.