Hiro Protagonist II Posted September 29, 2014 Posted September 29, 2014 Matt, maybe try creating a character with a higher Might than Perception, then buy the same class at the inn but with the Might/Perception values swapped so Perception is the higher attribute, and test which one does more damage. Among people playing the Beta - the OP of this thread, Fiebras with his thread, Mutonizer and myself - we would all be in agreement that Perception is the DPS stat. I know you have the maths to make your argument, but what we're experiencing is not what the maths says, which would indicate to me that something else is at work.
Matt516 Posted September 29, 2014 Posted September 29, 2014 No.... Please read my earlier posts in this thread. That's not the correct way to look at it. A 1% increased chance to crit, coupled with a 1% decreased chance to miss, is equivalent to +1.5% damage. You cannot just compare the crit damage to the Might bonus. You have to weight the damage by the probability of it actually happening. Matt, I think the DT will really eliminate the small damage bonus from Might. Can you tell me what some average DT values are like? If they are comparable to usual damage values then they will wipe out the benefits of small bonuses to damage from Might. Also, if you read my explanation correctly, I clearly explained that the PER bonuses are relevant only close to a Def/Att ratio of 1. The farther you get from that, the smaller the effect. Then maybe item + talent bonuses to ACC dominate. But I cannot definitely side with you that Might is all that useful. I would personally keep it just enough to overcome DT and pump the rest into ACC bonuses wherever I can get them. The damage reduction from DT is equal to your attack speed times DT. It's independent of Might or Accuracy bonuses. The only time this isn't true is when you're doing minimum damage (DT is blocking you completely). And in that case, some small bit of Might isn't going to help you that much anyway. Regarding "doubling the crit chance": yeah - if ACC-DEF starts at 0, then 5 points in Accuracy will raise your crit chance from 5% to 10%. That is doubling the crit chance... but it's not even close to doubling your damage. The total effective DPS multiplier from Accuracy and Might is given as such: (0.5 * Graze chance + 1 * Hit chance + 1.5 * Crit chance) * Might multiplier, where Might multiplier is equal to (1 + (Might - 10)*0.02). 
Assuming Might is at 10 again, this means that in raising Accuracy by those 5 points, we went from doing 75% of our base DPS (0.5 * 45% + 1 * 45% + 1.5 * 5%) * 100% to doing 82.5% of our base DPS (0.5 * 45% + 1 * 45% + 1.5 * 10%) * 100%. We doubled our crit chance, but only increased our DPS by 7.5% of the base value. Let's compare this to what we'd get if we varied Might from 10 to 15. When we raise Might from 10 to 15 (while ACC-DEF is at 0), our Might multiplier goes from 100% to 110%. Plugging this into the above equation, we get that we've gone from doing 75% of our base DPS (0.5 * 45% + 1 * 45% + 1.5 * 5%) * 100% to doing 82.5% of our base DPS (0.5 * 45% + 1 * 45% + 1.5 * 5%) * 110%. By increasing Might by 5, we've increased our DPS by the same amount, 7.5% of the base value. Responding "But I'm critting more so I'm doing more damage" is nonsensical. The math is right there. You got the exact same average DPS benefit from that 5 point increase in Might as you would by increasing Accuracy by 5. More Accuracy increases your crit chance, yes - but it's only a chance. More Might increases ALL your damage by a set amount. Basically (at least in this case) it comes down to whether you'd rather have your attacks do more consistent damage or more sporadic damage - but the overall average is the exact same. You can't just look at "more crit chance" and think that because 50% is bigger than 2%, Accuracy outweighs Might. Each point of Accuracy gives you a tiny bit higher chance to do a lot more damage. Each point of Might gives you a 100% chance to do a tiny bit more damage. They are the same in the end. Literally. The same.
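[Editor's note: the worked numbers above can be replicated with a few lines of code. This is a minimal sketch, not Matt's actual spreadsheet; the 45/45/5 graze/hit/crit split at ACC-DEF = 0, and the assumption that +5 Accuracy converts miss chance into crit chance, are taken from the post above.]

```python
# Effective-DPS multiplier from the formula quoted above:
# (0.5 * graze + 1.0 * hit + 1.5 * crit) * Might multiplier,
# where Might multiplier = 1 + (Might - 10) * 0.02.

def dps_multiplier(graze, hit, crit, might=10):
    might_mult = 1 + (might - 10) * 0.02
    return (0.5 * graze + 1.0 * hit + 1.5 * crit) * might_mult

base = dps_multiplier(0.45, 0.45, 0.05)              # 75% of base DPS
plus_5_acc = dps_multiplier(0.45, 0.45, 0.10)        # 5% miss shifted into crit
plus_5_might = dps_multiplier(0.45, 0.45, 0.05, 15)  # Might multiplier 110%

# Both boosts come out to ~0.825: the same +7.5% of base DPS either way.
print(base, plus_5_acc, plus_5_might)
```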
Captain Shrek Posted September 29, 2014 Posted September 29, 2014 Matt, that is why I asked what some general DT values are like. If they are roughly equal to the damage you can gain from Might bonuses, you would see only a small effect from those bonuses. But assuming that DT is smaller than the damage you can gain from Might, then yes, you are right: your total damage will be larger over time with high Might. "The essence of balance is detachment. To embrace a cause, to grow fond or spiteful, is to lose one's balance, after which, no action can be trusted. Our burden is not for the dependent of spirit."
Matt516 Posted September 30, 2014 Posted September 30, 2014 Matt, maybe try out and create a character with a higher Might than Perception and then buy the same class at the inn but swap the Might/Perception so the Perception is the higher attribute, and test which one does more damage. From people playing the Beta, the OP of this thread, Fiebras with his thread, Mutonizer and myself, we would all be in agreement that Perception is the DPS stat. I know you have the maths to make your argument, but what we're experiencing is not what the maths says which would indicate to me that something else is at work. I've already expressed numerous times why I don't think in-game testing is the right way to figure out this particular balance issue. I'll enumerate again:
1. Humans (myself definitely included) are absolutely abysmal at judging probabilistic events, statistical trends, etc. just by running a bunch of tests. Really, we're terrible at it. We are subject to a number of heuristic biases that make our subjective opinions and judgments about probabilistic events really bad. There are entire fields of research devoted to helping humans learn how to make better decisions specifically because of how bad our natural biases are.
2. There are too many other variables at play. Granted, this could be accounted for by making two identical characters with only Might and Perception varying, so this isn't really a strong point. Ignore it - it doesn't really belong here. I just wanted to mention it because I think the other variables at play have affected other people's subjective comparisons between characters (like BB Rogue and BB Fighter).
3. The amount of RNG involved in damage and to-hit rolls means that I'd have to run combat for a very, very, very long time in order to get reliable results.
4. I don't have a good way of recording that much data other than screenshots and manually entering thousands of combat rolls from screenshots into a spreadsheet.
Because of 3 and 4, it's not worth the time. This would be an endeavor of many, many nights, and it's not worth it because I already know the answer. This isn't like a physical experiment, where I have equations that only roughly describe the phenomenon and I'm trying to get an approximate answer by neglecting a bunch of stuff. In this case, the equations describe exactly what is going on. Like - to a T. The systems are very clear, and it's all just math being done in a computer. I'm just using my own computer to replicate that math on a statistical scale. Calculating effective DPS is very straightforward, and I know I'm right unless there's a bug in the game or some completely random mechanic we've never heard about that would invalidate my model. That said, I'd be more than happy to change my mind if someone points out an error in my equations or assumptions - I've done so multiple times before (see the first threads where I posted my spreadsheets - Azrael Ultima and Caerdon both corrected me on various things, and I changed my equations accordingly). But I don't really see the value of spending many hours testing something I know to be mathematically true. At the moment, the only major argument against my math has been "but I played with high Perception and wrecked everything". Sure. I'm not disputing that. But the experience of any one person is colored by biases, RNG, confounding variables, etc... whereas going straight to the numbers on how the damage is actually calculated doesn't lie. So while I appreciate the suggestion, I'm going to have to respectfully decline.
Matt516 Posted September 30, 2014 Posted September 30, 2014 Matt, that is why I asked what some general DT values are like. If they are ~ equal to damage you can gain from Might benefits, you would see only a little effect of might bonuses. But assuming that DT is smaller than the damage you can gain from Might, then yes, you are right: Your total damage will be larger over time with Might high. Well, from the Beta (someone correct me if I'm remembering wrong), DT tends to range between 0 and 12, with buffs capable of taking it higher.
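[Editor's note: a small sketch of the DT interaction Matt describes above, under the thread's model: a flat DT comes off every connecting hit, so the DPS lost to armor is attack rate times DT, independent of Might or Accuracy. The minimum-damage floor of 1 is an assumption for illustration, not a confirmed game value.]

```python
# Flat damage-threshold (DT) subtraction, per the description above:
# DPS lost to DT = attacks per second * DT, unless the hit is floored
# at minimum damage (floor value assumed here).

def dps_after_dt(damage_per_hit, attacks_per_sec, dt, min_damage=1.0):
    per_hit = max(damage_per_hit - dt, min_damage)  # DT blocks a flat amount
    return per_hit * attacks_per_sec

print(dps_after_dt(20, 1.0, 0))   # 20.0 -- unarmored target
print(dps_after_dt(20, 1.0, 12))  # 8.0  -- top of the beta's usual DT range
```

Note how the 12 DPS lost is the same whether the raw hit is 20 or 30, which is why a flat Might bonus neither shrinks nor grows the DT penalty (until you bottom out at minimum damage).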
Matt516 Posted September 30, 2014 Posted September 30, 2014 (edited) Sorry for making yet another post after my previous two - I wanted to make sure this got seen and wasn't buried as an edit. I just wanted to add that I'm not trying to spoil anyone's fun, or tell them their build sucks, or anything like that. Despite the fact that I like to look at game mechanics from a really mathematical perspective, I'm not a min-maxer. I'm really not. I tend to pick a build that is both viable and fun, then just go with it, optimization be damned. The reason I'm being so adamant about this is that I really hate to see people claiming X is overpowered or whatever when it's demonstrably not true. If Perception and Might are mathematically well balanced right now (and I'm still willing to listen to arguments that they're not), changing them due to an incorrect surge of opinion in the community would throw the whole thing out of whack. Then, months down the line, people would be really unhappy to discover that Perception actually kind of sucks now (or whatever). The whole reason I started with all this spreadsheet stuff was that I saw people making balance and game mechanics arguments based on incorrect information. The spreadsheets were created to give people the tools they need to make correct arguments about balance. There's more in that spreadsheet than just the graphs Sensuki and I put in our paper - there's a combat simulator that lets you set any variable I could imagine (from attributes to DT to weapon damage range to attack speed - and more!) and get the resulting DPS. With DT fully accounted for! Including minimum damage! I didn't make it for fun... well, not entirely for fun. I made it so that people who may not be as mathematically minded as I am could test various theories without having to spend hours and hours in-game doing it and dealing with RNG mucking up the results. 
The current version of the spreadsheet (linked at the very end of the paper Sensuki and I wrote, btw) is a bit out of date. It doesn't account for DR or the new 0 point of Might yet. I'm planning to update it though, and I'd love it if people would take that opportunity to use it to test theories. As for the current argument - I think I've said all I can really say. If some people still aren't convinced that my math accurately represents the damage dynamics, that's ok. I don't really know what else to do to convince you. Anyone stumbling on this midway can pretty much get 100% of what I would say about it from reading the posts I've already written. So... I guess what I'm saying is I probably won't keep arguing this (at least on this thread). I think the math and the explanations I've already given speak for themselves. I'd be happy to continue to answer questions in PM or whatever, but I think this would be a good point to retire this particular debate - at least for now. So with all that said - back to the OP. Fire Godlike Barbarian. Kind of great. Except for when you activate Frenzy while at low health and your fire blowback ability kills you instantly. Does that count as... breaking the game? Edited September 30, 2014 by Matt516
Hiro Protagonist II Posted September 30, 2014 Posted September 30, 2014 I've already expressed numerous times why I don't think in-game testing is the right way to figure this particular balance issue out. Let's hope Josh does the same thing, sticks with his Excel spreadsheets, and we powergamers can have some fun.
Matt516 Posted September 30, 2014 Posted September 30, 2014 I've already expressed numerous times why I don't think in-game testing is the right way to figure this particular balance issue out. Let's hope Josh does the same thing, sticks with his Excel spreadsheets, and we powergamers can have some fun. Well, testing is really important for determining whether the game is actually working properly. It's just not that useful for determining whether the game mechanics are balanced as designed.
Sensuki Posted September 30, 2014 Posted September 30, 2014 (edited) At ACC-DEF +6 or higher you can't miss. You'll either graze, hit or crit. Might makes all of those damage hits better. Extra accuracy only makes one of those hits better if it shifts the value to a higher range. Edited September 30, 2014 by Sensuki
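[Editor's note: Sensuki's point can be sketched with the shifted-roll model implied earlier in the thread. The exact band edges used here (miss on a total of 5 or less, graze up to 50, hit up to 95, crit above) are an assumption reconstructed from the 5/45/45/5 percentages quoted above, not confirmed game code.]

```python
import random

# Hypothetical attack resolution: a d100 roll shifted by ACC - DEF,
# then bucketed into miss / graze / hit / crit. Band edges assumed.

def resolve_attack(acc, defense, roll=None):
    roll = roll if roll is not None else random.randint(1, 100)
    total = roll + (acc - defense)
    if total <= 5:
        return "miss"
    elif total <= 50:
        return "graze"
    elif total <= 95:
        return "hit"
    return "crit"

# At ACC - DEF = +6 even a minimum roll of 1 totals 7 -- a graze, never a miss.
print(resolve_attack(56, 50, roll=1))   # graze
print(resolve_attack(50, 50, roll=96))  # crit
```

Under these bands, extra Accuracy past the no-miss point only matters when it pushes a roll across the graze/hit or hit/crit edge, which is exactly Sensuki's point.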
mutonizer Posted September 30, 2014 Posted September 30, 2014 (edited) The fact that ACC is necessary to hit is built into the core of my calculations [...] Ok then, my bad. What I am saying is that if your Accuracy is already roughly equal to or greater than their Defense, you benefit more from Might. And that's a fact. Which is somewhat, to me, irrelevant, since you don't know what kind of DEF you'll be facing, what kind of buff you'll have, or debuff they'll have, or if you'll be in range for an aura, or just a meter too far, etc. That's why I consider ACC always better in reality. Sure, in some cases that might not be true, but that just means easy fights will be easy anyway. I've said it multiple times. If Accuracy < Defense, Accuracy helps more. If Accuracy > Defense, Might helps more. The issue is that your statements seem to be based on some bizarre assumption that if you don't pump Perception, you can't hit anything. How is this true, exactly? We're not starting from Accuracy - Defense = -95. In general, we're starting from an Accuracy - Defense of roughly 0, plus or minus 20 (the base Deflections range from 5 to 25 and the base Accuracies range from 5? to 25). The base values are relatively in the same range (meaning Accuracy - Defense is going to be close-ish to 0, plus or minus 20), and are then modified by various factors (talents, enchantments, buffs, level difference, shields, etc). So I see absolutely no reason to frame the discussion as if the difference between pumping Perception and not pumping Perception is the difference between missing all the time and hitting all the time. It's not. Not at all. It's not in itself, I agree. But it stacks the odds in your favor under bad circumstances, which are the cases where you need to perform, and overall it reduces grazes and boosts critical chances. The statement that "ACC is a prerequisite for Damage, while Damage is not a prerequisite for ACC" is meaningless. Sure, it's kind of true.. I guess..... sort of... 
in a very specific context... but it doesn't have any bearing on the current discussion. We're talking about Accuracy vs Might. And this is how it is: one point in either of those stats increases your raw DPS by some percentage of the base DPS. That's how the math works. That's how the system is set up. I'm not idealizing some complicated heat transfer or fluid mechanics problem with broad assumptions here - I'm literally just doing the algebra. Ok, that's probably the problem then: we're not talking about the same thing. I'm talking actual combat situations with unknown variables; you're talking mathematical balancing between PER and MIG on the attribute page based on a hypothetical ACC/DEF ratio. In which case, putting myself in your view, I 100% agree with you. In reality however, I'll still be pumping PER on every single character that at some point might need to do something to a hostile, and then maybe look at MIG, because that's what works currently in actual battle. The statement that "PER outshines MIG on every single level" is just incorrect, pure and simple. In general, if your Accuracy will already be equal to or greater than their Defense without putting points into Perception, those points are better spent on Might (provided pure DPS is your goal). And if your Accuracy will already be lesser than their Defense without putting points into Perception, it benefits you more to invest in Perception. This leads to a system where very Perceptive characters are skilled at hurting enemies with high Deflection, and very Mighty characters are skilled at hurting enemies with low Deflection. Provided there isn't a serious skew one way or the other (really high Deflection enemies or really low Deflection enemies), this means that Perception and Might are balanced. 
Each is better in a different situation. You're right in a vacuum again, but in reality, you do NOT know - that's my point and that's why PER is always better, because you do not know if your ACC will be greater than their DEF. One second it might be, another it might not be. A high PER character can perform better in any situation where these variables are unknown, while a high MIG character will not. Sure, against a low DEF enemy, it won't matter as much, but even then, PER will still boost your chance to actually hit and critical hit, not graze. If you could respec your characters before every single battle after looking at the exact enemy stats, then yeah, you could follow the math and balance out MIG and PER on the fly. But when you create a character at level 1 and need to live with his attributes all the way to level 12, MIG is just too damn risky, while you cannot go wrong with PER; the chance of it being a waste is extremely low. At ACC-DEF +6 or higher you can't miss. You'll either graze, hit or crit. Might makes all of those damage hits better. Extra accuracy only makes one of those hits better if it shifts the value to a higher range. A graze cannot be considered anything but a partial miss from a DPS standpoint, though. Only Hits and Critical Hits are meaningful because they usually ensure you'll bypass DT. Support/utility-wise it has some interest, but the 0.5 multiplier to duration and some effects make it a very bad investment for a given daily resource. Edited September 30, 2014 by mutonizer
Sensuki Posted September 30, 2014 Posted September 30, 2014 (edited) In reality however, I'll still be pumping PER on every single character that at some point might need to do something on a hostile, then maybe look at MIGHT, because that's what works currently in actual battle. Absolutely, but for pure DPS where the character will be primarily focusing on pure damage, Might is always better if you already have a high Accuracy. The more Accuracy you have, the better the Might bonus is. As more and more Accuracy bonuses stack on top of each other, the benefit Perception gives will be worth much, much less than the Might bonus. At the moment, some of the secondary defenses - Fort, Ref, Will - can be pretty high, so for casters attacking those, the extra Accuracy may be more valuable, and for debuffing it becomes a case of ACC vs bonus durations. Accuracy scales with level and has a lot of inputs (base, Perception, items, buffs and progression). Might does not scale with level, but its bonus is worth more with each point of Accuracy. As Accuracy gets higher and higher over the course of the game, if you're finding that you're outclassing enemy defenses, Might will always be better. There will be cases where certain enemies have REALLY high defenses, and in Path of the Damned all enemies will have really high defenses, so in those cases Perception is better. Edited September 30, 2014 by Sensuki
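[Editor's note: Sensuki's claim that stacked Accuracy makes Might relatively stronger can be quantified with the same assumed bands used earlier in the thread (miss on a total of 5 or less, graze to 50, hit to 95, crit above, weighted 0 / 0.5 / 1.0 / 1.5). This sketch compares the marginal DPS gain from +1 Accuracy versus +1 Might at different ACC-DEF gaps.]

```python
# Marginal value of +1 ACC (from PER) vs +1 Might at a given ACC - DEF gap,
# using the thread's assumed roll bands and hit-quality weights.

def expected_mult(acc_minus_def, might=10):
    weight = 0.0
    for roll in range(1, 101):          # enumerate every d100 outcome
        total = roll + acc_minus_def
        if total <= 5:
            weight += 0.0               # miss
        elif total <= 50:
            weight += 0.5               # graze
        elif total <= 95:
            weight += 1.0               # hit
        else:
            weight += 1.5               # crit
    return (weight / 100) * (1 + (might - 10) * 0.02)

for gap in (0, 20):
    gain_acc = expected_mult(gap + 1) - expected_mult(gap)
    gain_might = expected_mult(gap, might=11) - expected_mult(gap, might=10)
    print(gap, round(gain_acc, 4), round(gain_might, 4))
```

At a gap of 0 the two gains come out identical (0.015 each, matching Matt's worked example); at +20 the Might gain is roughly double the Accuracy gain, matching Sensuki's point above.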
mutonizer Posted September 30, 2014 Posted September 30, 2014 (edited) Most pure DPS will pump PER and MIG to the max though probably. DEX is neat but not sure I'd put it before these two yet Edited September 30, 2014 by mutonizer
swordofthesith Posted September 30, 2014 Author Posted September 30, 2014 Most pure DPS will pump PER and MIG to the max though probably. DEX is neat but not sure I'd put it before these two yet Mut has the right of it. You will notice that in my Rogue build I did not fail to max out MIG. However, as Mut has correctly pointed out several times, you can make perfectly viable melee builds that rely on PER over MIG, especially with classes that have multiple damage multipliers like Rogues. But without decent PER/accuracy, any DPS build is bound to fall flat.
swordofthesith Posted September 30, 2014 Author Posted September 30, 2014 Ergo PER > MIG.
Sensuki Posted September 30, 2014 Posted September 30, 2014 Most pure DPS will pump PER and MIG to the max though probably. DEX is neat but not sure I'd put it before these two yet No, you wouldn't, because of the DT system. If it was a pure DR system it would be pretty good though. It's most beneficial for buff/debuff based casters. Probably the weakest attribute overall, maybe?
swordofthesith Posted September 30, 2014 Author Posted September 30, 2014 No you wouldn't because of the DT system. If it was a pure DR system it would be pretty good though. It's most beneficial for buff/debuff based casters. Probably the weakest attribute overall too maybe? Greatswords. High DT weapons, etcetera. DT hurdle solved.
Sensuki Posted September 30, 2014 Posted September 30, 2014 (edited) Estocs are better, but Might gives more DPS per point than Dexterity gives per point, so you're always better off taking Might for pure DPS. Edited September 30, 2014 by Sensuki
swordofthesith Posted September 30, 2014 Author Posted September 30, 2014 (edited) Estocs are better, but Might gives more DPS per point than Dexterity gives per point, so you're always better off taking Might for pure DPS. Only in a vacuum, when you discount variable DF. Edited September 30, 2014 by swordofthesith
mutonizer Posted September 30, 2014 Posted September 30, 2014 (edited) No you wouldn't because of the DT system. If it was a pure DR system it would be pretty good though. It's most beneficial for buff/debuff based casters. Probably the weakest attribute overall too maybe? Well, we're missing the Range component for now so it's hard to tell. Also not sure how the % speed modifier is applied or calculated. I'll have a look later on, see if I can figure it out. As it stands now, I would put it on DPS classes more than casters though. DPS will get its bonus constantly during auto-attacks and, in theory, it'll help should they need to put out some kind of ability in time. For buff/debuff/healing, it's rare that you need to chain-cast with a character, so it's not really that useful I think, and since their auto-attack damage is irrelevant, you're much better off keeping them idle all the time, ready to cast instantly when you need them - which is probably why they put the Range component in there, to make it useful for these guys. Edit: Scratch that, Range is on PER, not DEX. So yeah, DEX is a no-go for me on casters (Wizard, healing Priest, etc), just pure DPS and DPS supports. Edit 2: Seems to be a simple +/- system. First you take the armor speed penalty, then +/- character armor-speed modifiers, then +/- attack-speed modifiers, and that gives your recovery-time modifier. So, for example: a Fighter in plate armor (50% armor speed penalty) with Armored Grace (-16% armor speed) and DEX 18 (+16% attack speed) would have a recovery-time modifier of 18% - better than padded armor. And after some testing, that proves correct in game. So yeah, pretty good stat actually, especially if you're going to wear heavy armor, as it just about negates the penalty. But since it only changes recovery time, if you don't chain-cast or auto-attack constantly, it's wasted. Edited September 30, 2014 by mutonizer
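[Editor's note: mutonizer's worked example above can be written out in a couple of lines. The purely additive model is his inference from in-game testing, not official documentation.]

```python
# Additive recovery-time model inferred from in-game testing above:
# armor speed penalty, minus armor-speed reducers, minus attack-speed bonuses.

def recovery_time_modifier(armor_speed, armor_mods=(), attack_speed_mods=()):
    return armor_speed - sum(armor_mods) - sum(attack_speed_mods)

# Fighter: plate (50%), Armored Grace (-16% armor speed), DEX 18 (+16% attack speed)
print(recovery_time_modifier(50, armor_mods=[16], attack_speed_mods=[16]))  # 18
```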
Sensuki Posted September 30, 2014 Posted September 30, 2014 (edited) I admit that when they split AoE and durations, I repeated over and over that this would be terrible for support casters, forcing them to invest in two attributes to get the primary thing they want, and that it would make Dexterity an unimportant attribute for any class. So yeah, that's why I modded the attribute system - because the current one sucks, just not as much as the old one. I wouldn't get IAS on DPS though; I'd just get Might and Perception. Edited September 30, 2014 by Sensuki
mutonizer Posted September 30, 2014 Posted September 30, 2014 (edited) I wouldn't get IAS on DPS though I'd just get Might and Perception. On melee DPS I agree, since you cannot really dump stats without risking really poor defenses, which is very risky. On ranged DPS however, you can dump a bit here and there to grab 8 more points to put into DEX after you've maxed out PER and MIG. All things being equal, that's like a flat 16% DPS boost on auto-attacks, no? And it provides faster reaction time if you need to use an ability on an auto-attack DPSer, which is pretty nice in itself. If I redo my Support Melee Priest build, this is something I might actually look into, and on ranged Rogues (not Rangers, because the pet puts you at risk), this is definitely worth it I'd say. Edited September 30, 2014 by mutonizer
Sensuki Posted September 30, 2014 Posted September 30, 2014 (edited) Yeah sure, IAS is the third best behind MIG and PER, and on ranged you can afford to dump RES and CON a bit. In our paper we found that IAS would be very, very good for buff and debuff based casters - the second best attribute behind Intellect in the old system. In my mod this is still the case, but in the current patch it depends. Whereas it's a black and white choice for all other classes. Josh either did not consider this when he made his system or does not care about it. Edited September 30, 2014 by Sensuki
mutonizer Posted September 30, 2014 Posted September 30, 2014 (edited) Yeah sure, IAS is the third best behind MIG and PER and on ranged you can afford to dump RES and CON a bit. In our paper we found that IAS would be very very good for buff and debuff based casters, the second best attribute behind Intellect in the old system. In my mod this is still the case, but in the current patch it depends. Whereas it's a black and white choice for all other classes. Josh either did not consider this when he made his system or does not care about it. I don't see how it would impact a buff/debuff caster much. It doesn't make you cast faster (edit: I mean, it doesn't change how fast your casting/attack animation is or anything, from what I could see); it just reduces your recovery time, so if you don't chain-cast (and I mean really chain-cast, as in not a nanosecond wasted anywhere), it's 100% useless. Since most spells are AoE and all are resource-based, there aren't many situations where a 16% difference in recovery speed would change much of anything. I mean, you'd really need to be pumping out a lot of buffs/debuffs with perfect timing. Edited September 30, 2014 by mutonizer
Sensuki Posted September 30, 2014 Posted September 30, 2014 (edited) This is how recovery time works: ([Action Animation] + [Recovery Time]) + [Modifiers]. Modifiers affect the total of Action Animation and Recovery Time, but instead of increasing/decreasing the speed of the animation and the recovery at once like in Warcraft 3, the recovery time is sped up/slowed down first, taking into account the bonus from the whole total. When recovery time reaches 0, then the animation is sped up. It would not be any different if IAS increased the action animation speed and reduced recovery time at the same time, because it would be the same percentage of both. I think the "it's 100% useless" claim is a farce because it is giving you more actions per encounter. It would have been more beneficial in the old system where all bonuses were positive, but it makes less of a difference now due to the very minimal increases from attributes. Did you not read our paper, man? Edited September 30, 2014 by Sensuki
mutonizer Posted September 30, 2014 Posted September 30, 2014 (edited) This is how recovery time works ([Action Animation] + [Recovery Time]) + [Modifiers] Modifiers affect the total of Action Animation and Recovery time, but instead of increasing/decreasing the speed of the animation and the recovery at once like in Warcraft 3, the recovery time is sped up/slowed down first taking into account the bonus from the whole total. When recovery time becomes 0, then the animation is sped up. Yea, I get that bit. It would not be any different if IAS increased the action animation and reduced recovery time at the same time, because it would be the same percentage of both. I think the "it's 100% useless" is a farce because it is giving you more actions per encounter. Not sure what you do in battle, but my casters are always idle during encounters to make sure that when I need them, they can act instantly. Do you have your Wizards and Priests auto-attack or something? For me, casters are not classes that need more actions per encounter, because they are not among the sustained classes. Melee/ranged characters who auto-attack constantly are part of the sustained classes, and that makes more actions per encounter useful for them. Since I also don't play with "naked" characters, they'll never really go over the recovery threshold, and therefore never benefit from an increased casting animation speed. Of course, if you keep them naked then yeah, that makes it nice I guess. Did you not read our paper man? Parts of it, to check it out, but I'd much rather test how it's actually working in game than debate abstracts removed from everything. I mean it's nice and useful to a point, but the reality is a bit more chaotic and unpredictable than a lab test. Edited September 30, 2014 by mutonizer