Pioneering The Multi-GPU Future With Oxide Games
A little over a month ago, the tech community was abuzz over an article about running an AMD GPU and an Nvidia GPU in the same system with Oxide Games' Ashes of the Singularity under DirectX 12. That early evaluation covered a number of current top-tier graphics cards, as well as a pairing of top-tier cards from a few years ago. Although the test served as an example of the benefits of such a configuration, it generated a number of questions from the community: Why weren't integrated graphics tested? And what happens when you pair two very different graphics cards, such as a brand new GPU with one from a previous generation, or a lower-tier card with a higher-tier card?
I recently had the opportunity to speak with Dan Baker, one of the co-founders of Oxide Games and the creator of the Nitrous Engine, to get answers to those questions and learn much more about the engine and DirectX 12.
iGPU Not Being Overlooked
When the GeForce + Radeon performance evaluation ran, the most common questions around the Internet concerned integrated GPUs. Many people want to make use of the GPU cores already sitting in their PCs and wondered why this configuration was absent from the evaluation. Because of that interest from the community, I started our conversation with this topic.
Baker said that the rendering mode the game uses is a traditional symmetrical multi-GPU configuration. The game renders alternating frames to each GPU somewhat like SLI or Crossfire, which works best with GPUs that offer similar performance. This mode doesn’t work well when two GPUs have a large performance delta between them, though, so Baker and his team are working on ways to do asymmetric multi-GPU rendering. Rather than alternate frame rendering, the team is working on ways to offload sections of the work to the slower GPU and then upload that rendered data back to the faster GPU to put the scene together.
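The difference between the two approaches can be sketched in a few lines. This is a hypothetical illustration, not Oxide's code: alternate-frame rendering hands whole frames to each GPU in turn, which assumes roughly equal performance, while an asymmetric scheme splits each frame's work in proportion to how fast each GPU is.

```python
# Hypothetical sketch of the two multi-GPU scheduling strategies described
# above. "Throughput" is an abstract work-units-per-millisecond figure;
# none of these names come from the Nitrous Engine.

def alternate_frame_rendering(frame_index, gpu_count=2):
    """Classic SLI/CrossFire-style AFR: each GPU gets every Nth frame."""
    return frame_index % gpu_count

def asymmetric_split(total_work, fast_throughput, slow_throughput):
    """Split one frame's work so both GPUs finish at about the same time.

    Work is divided in proportion to throughput, so a GPU that is 4x
    faster receives 4x the work.
    """
    slow_share = slow_throughput / (fast_throughput + slow_throughput)
    slow_work = total_work * slow_share
    return total_work - slow_work, slow_work

# AFR alternates frames regardless of GPU speed...
frame_owners = [alternate_frame_rendering(i) for i in range(4)]

# ...while the asymmetric split gives a discrete GPU that is 4x faster
# than the iGPU 80% of each frame's work.
fast_work, slow_work = asymmetric_split(100.0, fast_throughput=4.0,
                                        slow_throughput=1.0)
```

With AFR, a slow GPU stalls every other frame; with the proportional split, the slow GPU simply contributes a smaller slice of each frame, which is why the asymmetric path is what makes iGPU pairings plausible.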
Earlier this year, I had the chance to see Ashes of the Singularity running in a multi-GPU configuration that showed a discrete R9 290 paired with the integrated GPU cores from an AMD A10-7870K APU. I was curious what the difference here is, and Baker clarified that the build I was shown back then was running on Vulkan, not DirectX 12.
The bigger difference, though, is multi-vendor support. In the demo I saw, both the R9 290 and the integrated GPU cores were from AMD and shared the GCN architecture. Synchronizing GPUs from multiple vendors requires a more general implementation. The developers need to ensure that using the slower integrated GPU cores doesn't hold the rest of the system back from completing the task faster without them.
The iGPU Can Hold Back Top Tier Discrete Graphics
The hard part is dealing with the large performance delta between discrete and integrated GPUs. Baker said, “You want to keep the integrated [GPU] a little under filled, because if it ever takes more time on the integrated than the discrete [GPU], then you’re really probably going to get a loss, and that’s the tricky part that we have to manage.”
Baker believes that, for example, pairing the new Intel Iris Pro integrated graphics with a comparable discrete GPU would work very well. Two GPUs that don't share similar performance characteristics add a whole new level of complexity to the engine, but he believes it is completely feasible. "We're convinced we can make good games on it," he said.
Baker acknowledged that Oxide Games wants to cater to as many PC configurations as possible. He noted that many gamers like to upgrade their GPU every couple of years, and he would like to allow people to make use of their old GPU alongside their new GPU for even better performance. He also noted that most systems have some form of integrated GPU that goes to waste when a discrete GPU is installed. He and his team are keen to capitalize on every bit of computing power available to the engine.
I asked Baker how long until we see a build with support for cross-vendor GPUs and whether or not he thinks this technology will be translated into future titles from Oxide or other developers using the game’s engine. He was reluctant to say when we’d see a public release, but he did confirm that the company intends to have it working during the beta period of Ashes of the Singularity. He also noted that testing every single GPU combination out there is next to impossible. “We were doing the calculations on it, and we realized that there [are] over 4,000 combinations of multiple vendor video cards that are probably valid,” he said. “That’s insane; we can’t test all that, so we have to roll it out early so that users can make a database of what works and what doesn’t.” The company will be relying on beta test feedback for verification of how various combinations fare. As for future titles and projects from other developers, he said that Oxide is always talking to other developers about what they have done; the company isn’t interested in keeping the process secret.
Baker noted that they are currently very much in the experimental stages of figuring out how DX12 can be used with multiple GPUs from different vendors. He said that in many cases, Microsoft doesn’t have the answer to their questions, so there’s been much trial and error between the two companies to get things to work.
Baker noted that all of this may seem like too much work for one title, but this grand effort is not about Ashes of the Singularity. It’s bigger than that. This work is being done at the engine level so that these multi-GPU features will be available to any future development team working with the Nitrous Engine.
Multi-GPU In DirectX 12: No More Relying On GPU Drivers
DirectX 11 (and earlier versions of DirectX) managed multiple GPUs at the driver level. It relied on profiles created by the GPU manufacturers for each and every game. Without a specific profile, a game had no way of using the extra GPU resources; it wouldn't even "know" that additional GPUs were present at all.
DirectX 12 gives direct control of the GPU to the game designers, which allows for synchronization across multiple GPUs. Baker said the hard part for his team was wrapping their heads around how to use the new API, but once they figured out that part, multi-GPU support just worked. He said they are no longer dependent on GPU driver updates, which takes away some of the fragility of working with multiple graphics cards. He believes this will pave the way for wide support for multi-GPU configurations with a variety of games, regardless of driver support.
There are traditionally two ways to work with multiple GPUs in an application: Implicit Multiadapter, which is a driver-level feature, and Explicit Linked Adapter, which supports SLI-style configurations using a bridge. The Nitrous Engine is being designed to support a new approach -- Explicit Multiadapter -- which doesn't require driver-level multi-GPU support and treats each GPU as a separate, independently addressable device.
Explicit Linked Adapter has not yet been implemented in the engine. Baker said he isn't sure what kind of gains it would net, as the company is already seeing fairly efficient scaling with Explicit Multiadapter. He did say that Explicit Linked Adapter offers additional features, such as better memory sharing, that allow for more efficient data synchronization. Baker noted that linked GPUs don't need specific driver profiles for SLI and CrossFire as long as the GPUs comply with DX12 specifications. He said the profile is essentially meaningless with DirectX 12, as the driver doesn't even know that the game is using two graphics cards.
GPU Memory Resources
One of the features that DirectX 12 brings to the table is the ability to have independent memory pools for each GPU. In SLI and CrossFire, the GPUs have replicated memory pools that are identical copies of each other, but in DX12, each GPU can hold completely separate data in memory, and the pools can even be different sizes.
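A hypothetical back-of-the-envelope comparison makes the difference plain (asset names and sizes are invented; none of this is Nitrous Engine code): replicated pools multiply the footprint by the GPU count, while independent pools store each asset once across the system.

```python
# Hypothetical comparison of the two memory strategies described above.
# Asset names and sizes are invented for illustration.

ASSET_MB = {"terrain": 600, "units": 900, "effects": 300}

def replicated_footprint(assets, gpu_count):
    """SLI/CrossFire-style pools: every GPU holds a full copy."""
    return sum(assets.values()) * gpu_count

def independent_footprint(assets):
    """DX12-style independent pools: each asset lives on one GPU only."""
    return sum(assets.values())

two_gpu_replicated = replicated_footprint(ASSET_MB, gpu_count=2)
two_gpu_independent = independent_footprint(ASSET_MB)
```

On two GPUs the replicated scheme consumes twice the VRAM of the partitioned one, which is the trade-off the next paragraphs weigh against bandwidth.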
Baker said that the current build of the Nitrous Engine uses duplicated resources for each GPU because of bandwidth constraints. He went on to explain that for Explicit Multi-GPU to share resources, the data has to be accessible by both GPUs -- meaning it needs to go into system memory. Because of the bandwidth constraints of system memory compared to the throughput of GDDR5 and HBM, the company opted to use duplicated memory resources. However, Oxide has not yet written off the idea of using a shared memory pool in some cases, and the company has some ideas about how to use two independent memory pools to its advantage.
Baker mentioned that with a discrete GPU plus integrated GPU configuration, one GPU is already accessing system memory to begin with, so it’s not much effort to allow the discrete GPU to see that pool. Additionally, Baker is toying with the idea of dedicating specific portions of the game to one GPU or the other.
The example that he gave is to have one GPU handle all of the textures of one faction in the game, while the other GPU handles a completely different faction of characters. The texture files in this situation would not need to be duplicated. (Baker had just thought of that concept the day we spoke and seemed genuinely interested to experiment with the idea.)
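Baker's faction idea can be sketched as a simple partitioning scheme. This is a hypothetical illustration, not anything from the engine; all faction and file names are invented.

```python
# Hypothetical sketch of the faction-partitioning idea Baker floats here:
# each GPU owns the textures of one faction, so no texture file is
# stored twice. All names are invented for illustration.

FACTION_TEXTURES = {
    "faction_a": ["a_armor.dds", "a_hull.dds"],
    "faction_b": ["b_carapace.dds", "b_glow.dds"],
}

def assign_textures(factions, gpu_ids):
    """Round-robin each faction's texture set onto a single GPU."""
    assignment = {gpu: [] for gpu in gpu_ids}
    for i, faction in enumerate(sorted(factions)):
        assignment[gpu_ids[i % len(gpu_ids)]].extend(factions[faction])
    return assignment

placement = assign_textures(FACTION_TEXTURES, gpu_ids=[0, 1])

# Each texture lives on exactly one GPU: memory is partitioned,
# not replicated.
overlap = set(placement[0]) & set(placement[1])
```

Because every asset has exactly one owner, the combined VRAM of both cards effectively behaves like one larger pool, which is the payoff of giving up replication.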
Oxide Engine Built To Fill A Gap
Ashes of the Singularity is the new game that Oxide is building, and the Nitrous Engine is being developed in conjunction with it. The Nitrous Engine will eventually be available to third-party developers for their own projects. Baker said that Oxide is building Nitrous to serve an under-served market: engines that can render many assets while simultaneously running multiple simulations. Nitrous is particularly suited to real-time strategy games, but Baker said there's nothing limiting it from being used for other genres, such as adventure games or first-person shooters.
The short-term plan for the Nitrous Engine is to keep it mostly internal at the beginning. Baker said that Oxide has a couple of as-yet unannounced projects, as well as the already-announced reboot of Star Control, in development using the Nitrous Engine. He said that Oxide is a game development company first and an engine licensing company second. The company is building the engine for its own purposes, but Baker expects to move into licensing it in the next couple of years.
Graphics Rendering Lessons From The Film Industry
People always ask why games don't look like Pixar films when currently available computer hardware is hundreds of times faster than the machines that rendered Toy Story in the 1990s. Although the graphics in those movies don't have the same level of texture detail, the 3D objects in them are crisp and free of jagged edges.
When the team at Oxide sat down to start building the Nitrous Engine, it wondered whether it was now possible to apply these rendering techniques in real time. Baker said that every industry insider the team talked to loved the idea but didn't believe it would be possible to get it working fast enough. In practice, Oxide discovered that today's graphics hardware is, in fact, capable of such calculations.
Traditional game graphics use a process called Deferred Rendering. It is the most common rendering technique, used almost universally in video game graphics. Deferred Rendering draws the geometry first and applies the lighting and shading calculations in a later pass over the scene. The downside to this process is that it introduces aliasing into the scene, which has to be corrected with anti-aliasing techniques afterward.
The film industry’s process has been completely the opposite. Companies such as Pixar make every effort to remove any aliasing and noise before adding any level of detail to the scene. Pixar’s team developed an algorithm called REYES that allowed them to render scenes with this level of clarity. Oxide Games took that concept and developed a technique it called Object Space Rendering (OSR) that effectively does the same rendering work in real time on modern GPUs. When the team realized that hardware exists to make this work, they decided it was the direction they needed to take the Nitrous Engine.
Performance
Baker and his team started off trying out the Object Space Rendering technique as an experiment. He said they didn't plan to use it, but after they started testing it out, they realized that it can be scaled up and down easily to suit different hardware profiles. The Nitrous Engine can control the number of shading samples that the entire scene has to work with so it can be scaled to suit any graphics hardware, including integrated GPUs. Baker said that Object Space Rendering is more work overall, but that "it’s actually very efficient to use hardware in this way. Even though we have to do more overall work, the GPU is being used at a higher percent of efficiency."
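The scalability Baker describes can be illustrated with a hypothetical sample-budget sketch (the function, budgets, and quality floor are invented): the engine picks a scene-wide shading-sample count and scales it to the target GPU's relative power.

```python
# Hypothetical sketch of scaling an object-space shading budget to the
# hardware: the engine controls how many shade samples the whole scene
# may use, so the same content can target an iGPU or a high-end card.
# All numbers are invented for illustration.

def scale_shading_samples(base_samples, gpu_budget, reference_budget):
    """Scale the scene-wide sample count to the GPU's relative power,
    never dropping below a minimum quality floor."""
    scale = gpu_budget / reference_budget
    return max(int(base_samples * scale), 1024)

# The same scene shrinks its shading work for an iGPU (1/8 the power of
# the reference card) and uses the full budget on the reference card.
igpu_samples = scale_shading_samples(1_000_000, gpu_budget=1.0,
                                     reference_budget=8.0)
dgpu_samples = scale_shading_samples(1_000_000, gpu_budget=8.0,
                                     reference_budget=8.0)
```

Because shading happens in object space, reducing the sample count degrades texture sharpness gradually rather than breaking the rendering, which is what makes a single knob like this viable.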
Pioneering New Industry Standards
With all of the success that the team at Oxide has seen with the adoption of Object Space Rendering, I queried Baker for his perspective on where this technology will go in the industry. I wondered if OSR will be adopted by other game makers quickly or if it will take some time. He suggested that it will be a bit of a process for developers to retool for OSR, but he added that there is already interest -- and more coming -- from the hardware makers. The hardware industry is becoming convinced that this is the way to render things in the future.
Baker suspects that Oxide Games and the Nitrous Engine are just the pioneers of an industry-wide change in how graphics are rendered in the future. He was reluctant to elaborate, but when I asked if he thought this will be used in VR, he allowed, "I would say yes."
It’s fine to get excited about a new game -- Ashes of the Singularity may well be a terrific title -- but there’s so much more happening around it that will have far-reaching ramifications for the whole of the gaming industry. The multi-GPU capabilities being developed thanks to innovations in DX12 and the efforts of Oxide Games have the gaming community -- rightfully -- excited.
Kevin Carbotte is a contributing writer for Tom's Hardware who primarily covers VR and AR hardware. He has been writing for us for more than four years.
urishima: This is quite exciting, to see what looks like a revolution on the hardware and engine side of game development happening.
renz496: The future: GPU makers are still the ones pushing multi-GPU while game engines and game developers are moving away from it.
WFang: So multi-GPU was demoed earlier using Vulkan? Why are we not hearing more about Vulkan? With the disaster that is shaping up to be Windows 10, I really would hope to see a bigger focus on Vulkan, paving a path to more titles on, e.g., Linux, SteamOS, etc.
Logsdonb: Multi-GPU options are getting a lot better next year with the next generation of GPUs from Nvidia and AMD, which are based on HBM 2.0. This architecture will support much higher bandwidth than current GPUs, which will allow much better performance with SLI or CrossFire implementations, as well as with multi-GPU cards. We may even end up seeing more use of two smaller GPUs rather than one massive one like today.
epobirs: If multi-GPU support got sufficiently common, I'd wonder how inexpensive a video card Intel could produce from their IGA to double up on the GPU hardware in a PC. If the price was right, say around $50, and it gave a major boost to a large number of games, it could be a nice accessory for the large, less demanding part of the market that doesn't want much better than what consoles offer but wants the variety of genres better served in the PC realm.
cwolf78 (replying to WFang): What "disaster" are you talking about? Windows 10 is still on pace to be the fastest-selling OS release ever. Even if it miraculously tanked and Microsoft didn't sell even one more copy, it would still have an order of magnitude more market share (especially gaming market share) than all the Linux distros, including SteamOS, combined. So which API do you think developers are going to be targeting most? The answer should be obvious. I think Vulkan will be much more beneficial in the mobile space. And with the Xbox One getting DX12 support, perhaps the PS4 will get Vulkan?
epobirs (replying to WFang): Probably because your perception of disaster is subjective and at odds with reality. Yes, I know, Linux is the desktop of tomorrow. And always will be.
Gamers will flock to Windows 10 when a significant amount of product makes good use of DX12. There is no rush for them before then, and it's still quite early yet for developers. Microsoft knows this and is doing what it can to move things forward, but no amount of resources can substitute for the time needed to get a handle on major API changes and create products.
Stuff like SLI and other multi-GPU approaches has been demoed with every graphics API to come down the pike since the '90s. That doesn't mean the API is going to receive widespread adoption, just that another checkbox has been filled on the demo chart. You aren't hearing much about Vulkan because it's mainly in use where the consumer has little concern about such matters. Just as for many millions of console gamers, Unreal Engine is just a name that goes by in the opening credits. PC gamers care a lot more about under-the-hood stuff than gamers on most other platforms do. It's part of the attraction if you have the inclination.
In much the same way, you don't see a lot of articles about the changes to Android's runtime infrastructure. It's really important stuff for the platform, but there's a very limited audience, outside of developers, for the interesting details.
tical2399: Where is this disaster you're talking about? W10 has been getting tons of praise, and the same goes for DX12. Talk about straw-man arguments.