Will Project Eternity have multi-core support?



Will the game support multiple cores, or is it going to be single-threaded? I'm asking because multi-core support only arrived in the latest version of Unity, and that's still a beta.

 

Even though it wouldn't be a major problem if it were only single-threaded, thanks to the combat system's pause, there are nevertheless many dual- and quad-core processors with low frequencies (1.4-1.6 GHz, especially in notebooks) that could use the extra muscle of another core to make the gaming experience even better.


I doubt this game will require the use of multiple processors, unless the Unity engine is far worse than I understand it to be. This is an IE knock-off, not Dwarf Fortress.

I made a 2 hour rant video about dragon age 2. It's not the greatest... but if you want to watch it, here ya go:


I doubt this game will require the use of multiple processors, unless the Unity engine is far worse than I understand it to be. This is an IE knock-off, not Dwarf Fortress.

It probably will require multi-core processors, since more and more low-frequency, high-core-count computers are entering the market by the day (looking mostly at laptops/ultrabooks). The benefit of using two low-frequency cores is far greater than using a single high-frequency one (lower temperatures, more simultaneous processes).

 

If the engine can support it, and their intention is to support current-gen low-end systems, I'm certain it will support multiple cores :). The game will rely a lot more on the CPU than the GPU, seeing as things like advanced AI and pathfinding are all handled by the CPU.

"What if a mid-life crisis is just getting halfway through the game and realising you put all your points into the wrong skill tree?"

I doubt this game will require the use of multiple processors, unless the Unity engine is far worse than I understand it to be. This is an IE knock-off, not Dwarf Fortress.

It probably will require multi-core processors, since more and more low-frequency, high-core-count computers are entering the market by the day (looking mostly at laptops/ultrabooks). The benefit of using two low-frequency cores is far greater than using a single high-frequency one (lower temperatures, more simultaneous processes).

 

If the engine can support it, and their intention is to support current-gen low-end systems, I'm certain it will support multiple cores :). The game will rely a lot more on the CPU than the GPU, seeing as things like advanced AI and pathfinding are all handled by the CPU.

 

Sure, I pity people who are using AMD processors this generation too, but I'm just saying - BG2 has sufficiently good pathfinding and it runs on 200 MHz processors. Its AI isn't the greatest, but not even Arkane bothered to put much effort into Dishonored's guard AI. If a high-budget stealth game isn't going to invest in AI, I doubt we'll see any processor-hungry AI in PE.

 

Since Unity does support multicore, it's not a big deal I guess, but I will be very surprised if this game can't be run on a tablet (in theory anyway; I doubt the control scheme will lend itself to playing on a tablet).

Edited by anubite



BG2 had one of the worst pathfinding algorithms in any game, ever, heh. Something the PE team has admitted, and said they will greatly improve on.

 

Played it recently? Every time you issue a new move command, the character stops for nearly a full second before it gets executed. Characters get stuck on each other. Characters very often set off in the completely wrong direction.

 

(AMD is a processor manufacturer; they make LV, ULV, CULV, and full-power processors, just like Intel. There's no great difference in performance between AMD and Intel; if you look at price-to-performance ratios, AMD even comes out on top. You will be very happy for multi-core support if you're on a ~1.2 GHz ULV dual-core processor, whether it's made by Intel or AMD.)

Edited by mstark

BG2 had one of the worst pathfinding algorithms in any game, ever, heh. Something the PE team has admitted, and said they will greatly improve on.

 

You have never played the Total War series. Literally miles out of their way.

Haha, I haven't actually, and I suppose I should be glad about that :p

BG2 had one of the worst pathfinding algorithms in any game, ever, heh. Something the PE team has admitted, and said they will greatly improve on.

 

Hurray! In IWD2 there was nothing quite as frustrating as moving the party to a new location on the map and having half of them take the long way around. It led to constant micromanagement of the party movement. If they can fix that, it would be great.

"It has just been discovered that research causes cancer in rats."


BG2 had one of the worst pathfinding algorithms in any game, ever, heh. Something the PE team has admitted, and said they will greatly improve on.

 

Considering the time in which it was written, and that it only screws up when tons of stuff gets crammed into doorways, I think it does pretty well, at least compared to games like StarCraft.

 

(AMD is a processor manufacturer; they make LV, ULV, CULV, and full-power processors, just like Intel. There's no great difference in performance between AMD and Intel; if you look at price-to-performance ratios, AMD even comes out on top. You will be very happy for multi-core support if you're on a ~1.2 GHz ULV dual-core processor, whether it's made by Intel or AMD.)

 

AMD's "top of the line" processors tend to be six or more cores stuck together. Sure, they get the job done if you're just surfing the net or playing an older game, but they're not as well suited to most newer games. I imagine the only reason they can even compete with Intel right now is that people jump to the conclusion that more cores = better, when it's often the opposite.

Edited by anubite



Depends on what you do with your computer, but yes, most games lack support for more than 2 threads, which is bad for the game and bad for gamers. That does mean that today it's usually better to have fewer cores with higher clock speeds, if all you do with your computer is gaming. If PE does support multiple threads, it'll benefit all of us, since there isn't a single current-gen processor out there with fewer than 2 cores/threads at the moment.

 

However, if you're not just a gamer, and you do any form of actual work, or use productivity tools, you'll be very happy for every extra core you can get. Multi-core support is more common in productivity tools.
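That split (games rarely scaling past a couple of threads, while productivity work eats every core it can get) is easy to picture with a process pool. A minimal sketch, assuming Python's standard multiprocessing module; the workload (summing squares) is arbitrary, and on platforms that use the "spawn" start method this should live under an `if __name__ == "__main__":` guard:

```python
# Illustrative sketch: spreading a CPU-bound workload across cores with a
# process pool, the way productivity tools commonly do. Each chunk runs in
# its own process, so a multi-core CPU can execute chunks in parallel.
from multiprocessing import Pool

def sum_of_squares(chunk):
    # CPU-bound work on one chunk of the input.
    return sum(n * n for n in chunk)

def parallel_sum_of_squares(numbers, workers=2, chunk_size=250):
    chunks = [numbers[i:i + chunk_size]
              for i in range(0, len(numbers), chunk_size)]
    with Pool(processes=workers) as pool:
        partial_sums = pool.map(sum_of_squares, chunks)
    return sum(partial_sums)

result = parallel_sum_of_squares(list(range(1000)))
```

The same pattern works with any embarrassingly parallel workload; games, by contrast, have lots of tightly coupled state, which is why they rarely split this cleanly.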


  • 1 year later...

I apologize for necro-posting but I didn't want to start a new thread about the same thing; recently I have been wondering about exactly this.

 

It would be nice to have an official word on multi-core support from the devs (is it going to be supported? If yes, up to how many cores? I seem to recall that most games only take advantage of up to 2 cores nowadays, despite quad-core CPUs being quite widespread).

 

Tangentially, will hyper-threading (as offered, for example, by Intel i7 CPUs) be supported?

"Time is not your enemy. Forever is."

— Fall-From-Grace, Planescape: Torment

"It's the questions we can't answer that teach us the most. They teach us how to think. If you give a man an answer, all he gains is a little fact. But give him a question, and he'll look for his own answers."

— Kvothe, The Wise Man's Fear

My Deadfire mods: Brilliant Mod | Faster Deadfire | Deadfire Unnerfed | Helwalker Rekke | Permanent Per-Rest Bonuses | PoE Items for Deadfire | No Recyled Icons | Soul Charged Nautilus

 


I know that you are looking for official word (and I'm interested as well), but since I'm seeing this thread for the first time, a couple of comments:

 

As far as applications are concerned, hyperthreading = multiple cores.  The application creates a thread, which is then assigned by the OS to a core, as far as I know without regard to whether it is a real CPU or a "virtual" CPU created by hyperthreading.  If I'm remembering correctly, hyperthreading allows a single core to start processing instruction X, then (once X has stopped using certain circuits) start the execution of Y, as long as instructions X and Y fall into some range of valid combinations.  If instructions X and Y aren't hyperthreading-compatible, then the core automatically blocks the execution of Y until X completes.  Given that application programmers don't generally have visibility into the machine-language instructions that are generated when they issue a command, there is very little an application developer can do to leverage hyperthreading beyond creating multiple threads and hoping that the OS does something intelligent with them.

 

On multi-core processing within a single application:  This is very, very hard to do.  Even for a very well understood problem that has been worked on by hundreds of programmers, the best-known approach gets a speedup of only 2x on 2 processors.  And on 16 processors the speedup reaches only 11.1, and the implementation is so complex that most chess programs don't implement it...  Cite: http://chessprogramming.wikispaces.com/Parallel+Search.  Even an extraordinary game has a tiny fraction of the developers that the general group of "chess programmers" has, and those developers rarely have loads of experience in MPP development, so...
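Those flattening speedups are roughly what Amdahl's law predicts once any part of the work stays serial. A back-of-the-envelope sketch; the parallel fraction `p` below is an assumption chosen for illustration, not a measured figure for any chess engine:

```python
# Amdahl's law: if a fraction p of the work parallelizes perfectly and the
# rest stays serial, the best possible speedup on n cores is
#   1 / ((1 - p) + p / n)
# so even p = 0.95 caps out well below linear scaling.

def amdahl_speedup(p, n):
    return 1.0 / ((1.0 - p) + p / n)

p = 0.95  # assumption: 95% of the work parallelizes
for n in (2, 4, 16):
    print(f"{n:>2} cores -> {amdahl_speedup(p, n):.2f}x")
```

With p = 0.95 you get about 1.9x on 2 cores but only about 9x on 16, which is why adding processors yields diminishing returns even for well-studied problems.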

 

Generally, game programs that support multiple threads only support a couple -- a "graphics thread", a "sound thread", and a "game thread" -- and do a very, very poor job of distributing work evenly among them.  So one of the three threads will run at 100% utilization (say, the "graphics" thread), one will run at 5% (the "sound" thread), and the other at 20% (the "game" thread).  As one might expect, this leads to only a marginal improvement in performance... :(
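The arithmetic behind "only a marginal improvement" is simple: with fixed-role threads, the frame takes as long as the busiest thread, so the speedup over doing everything in sequence is capped by the load imbalance. A tiny sketch using the utilization figures above (the numbers are the post's illustration, not measurements):

```python
# With fixed-role threads, wall-clock time is set by the busiest thread, so
# the speedup over single-threading is total work / bottleneck work.

def multithreaded_speedup(loads):
    total = sum(loads)       # single-threaded: do everything in sequence
    bottleneck = max(loads)  # multi-threaded: wait for the busiest thread
    return total / bottleneck

speedup = multithreaded_speedup([100, 20, 5])  # graphics, game, sound
```

Here the three threads together yield only a 1.25x speedup, because the graphics thread alone accounts for 80% of the work.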

 

Worse still, that marginal improvement comes at an enormous cost in development / debug time.  That's because a multi-threaded game has loads of intermittent failure modes that only occur when the "graphics thread" is executing instruction 95,483 at the same time as the "game thread" is executing instruction 38,283 -- in all other cases, everything works properly.  Worse yet, the failure might only occur when the debugger is not attached to the code, because the debugger is itself a thread and interacts with the other threads running on the computer at the same time.  Speaking as a developer myself (not a game developer), trying to troubleshoot a bug without a debugger is an exercise in frustration.
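The classic source of those intermittent failures is an unsynchronized read-modify-write on shared state. A toy sketch (not game code) showing the hazard and the standard fix:

```python
# Two threads update one shared counter. Without the lock, the
# read-modify-write inside `count += 1` can interleave between threads and
# silently lose updates -- and whether it does depends on scheduling, which
# is exactly why such bugs appear and vanish between runs. Holding a lock
# around the update makes the result deterministic.
import threading

count = 0
lock = threading.Lock()

def worker(iterations):
    global count
    for _ in range(iterations):
        with lock:  # remove this lock and the result may vary run to run
            count += 1

threads = [threading.Thread(target=worker, args=(100_000,)) for _ in range(2)]
for t in threads:
    t.start()
for t in threads:
    t.join()
# count is now exactly 200000
```

Every shared structure in a game (entity lists, render queues, audio buffers) needs this kind of discipline, which is where the "enormous cost in development / debug time" comes from.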

 

To sum up:  Multi-threaded games generally get only marginal speedups, at the cost of significantly increased development time and a very real risk of serious bugs only being discovered after release.  This is why the vast majority of games don't support multiple threads.

 

Note that the formula is a bit different if the underlying engine (Unity in this case) uses multiple threads internally.  This is because the engine is (fingers crossed) well tested -- and is certainly better tested than any single game would be -- and the game can be developed as a single-threaded application.  This is one of the big advantages of using an off-the-shelf engine, and is the way that most "multi-threaded games" get the bullet point (the game itself is a single thread, but the engine has several threads, so there is some multi-core activity going on).


I think I can answer hyperthreads.  Your OS recognizes each as an extra CPU core, but they are not cores as such.  Instead, each core is given an extra set of addressable and non-addressable registers, so that two processes can be loaded into CPU memory at once.  This has the benefit of being able to switch between the two processes on the single core really, really quickly, instead of reloading the process CPU state from RAM on each context switch.  In effect, you have an additional "core" to work with, and it will be treated as such by your system (which is why applications show the number of cores you have as your hyperthread count).  In actuality, it is not truly an additional core, because it does not support parallel (simultaneous) processing.

 

Basically, a 2 core - 2 hyperthread computer (I think they just count this as 4 hyperthreads) can hold 4 processes in CPU memory at once, but can only run 2 simultaneously.  It can, however, switch between running any 2 of these 4 at once with incredible speed.
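The "4 resident, 2 running" picture can be modeled with a semaphore standing in for the execution slots. This is a scheduling analogy only, not how SMT works in silicon; all names are made up for illustration:

```python
# Toy model of "2 cores, 4 hardware threads": four workers are resident at
# once, but a semaphore with 2 slots stands in for the 2 real cores, so at
# most 2 workers are ever "executing" simultaneously.
import threading

cores = threading.Semaphore(2)   # 2 physical cores
state_lock = threading.Lock()
running = 0
max_running = 0

def hardware_thread():
    global running, max_running
    with cores:                   # claim a core (block while both are busy)
        with state_lock:
            running += 1
            max_running = max(max_running, running)
        for _ in range(10_000):   # pretend to execute instructions
            pass
        with state_lock:
            running -= 1

workers = [threading.Thread(target=hardware_thread) for _ in range(4)]
for w in workers:
    w.start()
for w in workers:
    w.join()
# max_running never exceeds 2: only two "hardware threads" run at a time
```

The win of real hyperthreading is that the "switch" between the two resident threads costs almost nothing, unlike an OS context switch that reloads state from memory.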

Edited by Pipyui

It will "support" multicore of course, i.e. it will run on multi-core CPUs, but no one knows if it'll take advantage of them in any meaningful way. Perhaps my information is outdated, but support for multithreading is not a strong point of Unity. Anyway, we'd need access to their code to see if anything is done in a multithreaded fashion.

 

None of this matters if the game performs adequately.


Tangentially, will hyper-threading (as offered, for example, by Intel i7 CPUs) be supported?

There's nothing in particular that a program can do to "support hyperthreading". It's a feature built into your CPU that makes thread switching faster regardless of what those threads are doing or which application is using them. So the question doesn't really make sense.

 

If PoE is highly multithreaded (which is unlikely), then it will benefit from hyperthreading; otherwise it won't.

Edited by Zeckul

BG2 had one of the worst pathfinding algorithms in any game, ever, heh. Something the PE team has admitted, and said they will greatly improve on.

 

Played it recently? Every time you issue a new move command, the character stops for nearly a full second before it gets executed. Characters get stuck on each other. Characters very often set off in the completely wrong direction.

 

For its day, BG2's pathfinding was above most if not all of the rest. What limited it more than anything were the capabilities of the processors of the day. I really don't think it's that bad even by today's standards. A lot of titles nowadays cheat: rather than have characters possibly get stuck on each other, they have them walk right through. I personally would prefer my characters in PoE to get stuck on each other than have them walk right through each other. I've never noticed the one-second pause you mentioned, and I've played it as recently as 2011.

 

As already mentioned, there are titles many consider to be AAA in quality that have worse pathfinding (though I personally think Total War is poop; so much potential, programmed so badly).

Edited by Valsuelm

It's kind of an interesting question what might be useful to program for multicore in a title like that.

 

I mean, you could imagine that path-finding, node generation for AI, other non-immediate tasks, preloading, caching, and so on would run in separate threads, and that these could complete asynchronously without interfering with immediate tasks. Those could be things like triggering effects, updating animation states (maybe dependent on terrain, sneak/detection, spots/looking towards target, aggressiveness), hit detection (if any)...

 

So given that they're a bit economical in the first place, and targeting relatively low powered rigs, they're probably:

1. Avoiding the expensive immediate calculations to keep cpu-usage low.

2. Making sure the game is not actually dependent on instant multicore runs.

 

So maybe it'd be the difference between a 2-second pause once in a while and a 0.1-second pause from a large mob AI run... you know, hidden behind a short "Graak kill human!" prompt.
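Hiding the AI run behind a prompt amounts to running the expensive, non-immediate job asynchronously and collecting the result when it's ready. A minimal sketch with a worker thread and queues; the job itself (a big sum) is a stand-in, and all names are hypothetical:

```python
# Minimal sketch of deferring a non-immediate task: ship an expensive job
# (say, a mob AI / pathfinding run) to a background thread and pick up the
# result later, so the main loop never stalls on it.
import queue
import threading

jobs = queue.Queue()
results = queue.Queue()

def ai_worker():
    while True:
        job = jobs.get()
        if job is None:          # shutdown sentinel
            break
        tag, n = job
        result = sum(range(n))   # stand-in for an expensive AI computation
        results.put((tag, result))

worker = threading.Thread(target=ai_worker, daemon=True)
worker.start()

jobs.put(("mob_ai", 1_000_000))  # fire and forget from the "game loop"
# ... the game loop would keep rendering frames here ...
tag, value = results.get()       # collect the result once it's ready
jobs.put(None)                   # tell the worker to shut down
worker.join()
```

Because the game never blocks waiting for the worker (except when it explicitly collects the result), a slow AI run shows up as latency on one decision, not as a frame hitch.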

 

edit: As much as I like to hear myself talk, it would be interesting to hear from tech at Obsidian about this, though :p

Edited by nipsen

The injustice must end! Sign the petition and Free the Krug!


Generally game programs that support multiple threads only support a couple -- a "graphics thread", a "sound thread", and a "game thread"

 

I was told that the "state of the art" is to have at least two "graphics threads": one that runs the animations of all animated objects in real time, and another that performs the actual rendering in cooperation with the graphics card (by taking a snapshot of the animation states 60 times per second).

 

But I could have misunderstood (my knowledge of this stuff is pretty basic).

 

In any case, I think that's something that the developers of the engine (in this case Unity) would have to implement, not the developers of the game that uses the engine.

 

"Some ideas are so stupid that only an intellectual could believe them." -- attributed to George Orwell


BG2 had one of the worst pathfinding algorithms in any game, ever, heh.

 

Well, the actual pathfinding (i.e. calculating the shortest path from A to B) was pretty good.

 

The problem was how the engine reacted to moving characters blocking each others paths. Namely:

  • Whenever the pathfinding algorithm was called by the engine, it only took into account the state of the map and entities at that very instant, effectively treating mobile entities the same as immobile obstacles.

     

  • As soon as a moving character bumped into an obstacle that hadn't been there when his current path was calculated, the engine called the pathfinding algorithm again for that character, and set him on that new path (on which he then continued even if the obstacle was only temporary).

I could think of at least 3 solutions for that problem:

 

a) Allow characters to move through each other, or "push each other out of the way", so the pathfinding algorithm won't have to treat other characters as obstacles at all.

 

b) Re-run the pathfinding algorithm for every moving character continually (say, twice per second), to allow them to quickly correct their paths whenever obstacles appear and disappear. In the Infinity Engine days computers were not powerful enough for that, but nowadays they probably are.

 

c) Come up with a smart, dynamic, cooperative pathfinding algorithm that can intrinsically take other characters' movement into account, and can detect whether obstacles blocking the shortest path are only temporary.

 

Obviously (c) would give the best results, but I doubt that the Obsidian developers would want to spend that much effort. (Unless the Unity engine helps with that? Does anyone know?)
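Option (b) is easy to sketch: a grid pathfinder that treats other characters as blocked cells, simply re-run whenever the world changes. BFS is used below for brevity (a real engine would use A* with a heuristic), and the grid layout and positions are made up for illustration:

```python
# Sketch of option (b): re-run the search whenever obstacles move. Other
# characters are passed in as temporarily blocked cells.
from collections import deque

def find_path(grid, start, goal, blocked):
    """Shortest 4-connected path on a grid; `blocked` are other characters."""
    rows, cols = len(grid), len(grid[0])
    frontier = deque([start])
    came_from = {start: None}
    while frontier:
        cur = frontier.popleft()
        if cur == goal:
            path = []
            while cur is not None:
                path.append(cur)
                cur = came_from[cur]
            return path[::-1]
        r, c = cur
        for nxt in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            nr, nc = nxt
            if (0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0
                    and nxt not in blocked and nxt not in came_from):
                came_from[nxt] = cur
                frontier.append(nxt)
    return None  # no path right now; re-plan on the next tick

grid = [[0, 0, 0],
        [0, 0, 0],
        [0, 0, 0]]

# Another character stands on (1, 1): the path routes around that cell.
detour = find_path(grid, (0, 0), (2, 2), blocked={(1, 1)})
# The character moves away; re-planning now considers (1, 1) walkable again.
direct = find_path(grid, (0, 0), (2, 2), blocked=set())
```

Re-running this a couple of times per second per character is exactly the brute-force fix the post describes; it fixes stale paths at the cost of repeated searches.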



:) Yeah, the usual solution is just to add more updates per second (and a queue system) until the most obvious problems disappear. A lot of games in the Unreal engine just avoid the problem by removing the objects that actually move as well. Or they create a fighting system that has literally no overworld interference, so that seen in the abstract from the top level, it's still square blocks that run into each other on flat ground, which the animation sequences then fit into. Lots of work tends to go into that, connecting specific moves to the animations of the other characters.

 

It's not something the framework really helps you with, but I know people have made good plugins to improve on it. The problem is the overhead, and that you still run into the same problem as before: you have paths that rely on out-of-date info. Maybe you could let the paths have curves, have characters obey ordering and slip into line that way, etc. There are probably lots of good ways to do it that would look perfectly fine.

 

But it's not easy to have that starting point and then add hit detection or node generation that isn't based on linear approximations, always having squares like on a checkerboard to work from. Otherwise, you end up having to rework everything from scratch.

 

I don't know... I suppose the most tempting thing to do would be to prepare an algorithm that executes in a reasonably instant amount of time, based on a vector for each character and the reach of the next few steps or so. That would make up some curve. And then create longer animation cycles adjusted from the existing splines they're made from anyway, and so on. It's probably not impossible to do that on an x86 platform either. And then... fantasizing completely here... having the start of a fireball spell begin with a staff movement and a staff effect that comes naturally from the previous stances, rather than ending in a rest state before executing the cast-spell animation, and so on.

 

This would also solve the milling-around problem, since you could project formations and things in a way that would make sense: having the fighters take position up front just outside the reach of the wizard, with movement that doesn't actually have obstructions in the normal checkerboard sense.

 

...could see bunches of beautiful solutions to that on the PS3: the animation and hit detection in inFamous, the dragon wings in Lair, the animation interference in Heavenly Sword.

 

The guy who makes Overgrowth has a pretty neat solution to this, by the way. You can see some neat trickery with OpenCL, and other interesting things happen with shared memory on multicore ARM processors... and I kind of go around hoping that we'll see more and more solutions in "simple" games that use more imaginative ways to do pathfinding and animation interference. It will probably have to come along with fewer D&D dice-roll abstractions in RPGs, at least.


It will probably have to come along with fewer D&D dice-roll abstractions in RPGs, at least.

 

Just for the record, I do not see this as an inevitable consequence of animation improvement, and, more to the point, I don't see it as a good thing.  If you see this as progress, then I'd suggest that you should be looking at a different company.


c) Come up with a smart, dynamic, cooperative pathfinding algorithm that can intrinsically take other characters' movement into account, and can detect whether obstacles blocking the shortest path are only temporary.

 

This is a well-understood topic with a lot of existing and available research (see steering behaviors, ORCA, ClearPath, flow fields, etc.). RVO2 is a free library that could be easily integrated to provide cooperative collision avoidance; it even has a pure C# implementation so it should just compile as-is in Unity.
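The simplest member of that family is a separation steering behavior: each agent steers toward its goal plus a repulsive force away from nearby agents. A toy sketch comparing a run with and without the separation force; this is far cruder than RVO2/ORCA, which reason about velocities rather than positions, and every constant below is made up:

```python
# Toy separation-steering sketch: two agents walk toward goals on opposite
# sides; with avoidance on, a repulsive force keeps them further apart at
# their closest approach than the naive straight-line run.
import math

def simulate(avoid, steps=600, dt=0.05, speed=1.0, sep_radius=2.0):
    agents = [
        {"pos": [0.0,  0.1], "goal": (10.0,  0.1)},
        {"pos": [10.0, -0.1], "goal": (0.0, -0.1)},
    ]
    min_dist = float("inf")
    for _ in range(steps):
        forces = []
        for i, a in enumerate(agents):
            fx = a["goal"][0] - a["pos"][0]
            fy = a["goal"][1] - a["pos"][1]
            d_goal = math.hypot(fx, fy) or 1.0
            fx, fy = fx / d_goal, fy / d_goal        # unit pull toward goal
            if avoid:
                for j, b in enumerate(agents):
                    if i == j:
                        continue
                    dx = a["pos"][0] - b["pos"][0]
                    dy = a["pos"][1] - b["pos"][1]
                    d = math.hypot(dx, dy)
                    if 0 < d < sep_radius:           # push away from neighbor
                        w = (sep_radius - d) / d
                        fx += 1.5 * w * dx
                        fy += 1.5 * w * dy
            forces.append((fx, fy))
        for a, (fx, fy) in zip(agents, forces):      # apply simultaneously
            norm = math.hypot(fx, fy) or 1.0
            a["pos"][0] += speed * dt * fx / norm
            a["pos"][1] += speed * dt * fy / norm
        d = math.hypot(agents[0]["pos"][0] - agents[1]["pos"][0],
                       agents[0]["pos"][1] - agents[1]["pos"][1])
        min_dist = min(min_dist, d)
    return min_dist

closest_naive = simulate(avoid=False)
closest_steered = simulate(avoid=True)
```

Position-based separation like this can oscillate and deadlock in crowds, which is exactly the failure mode the velocity-based methods (RVO/ORCA) were designed to fix.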

