There was an interesting thread on Reddit [1] the other day which did a breakdown of the state of gaming in terms of native games and games that can be run easily via Steam Play (aka 'Proton', Valve's in-built WINE+DXVK layer). The results were quite interesting and, as one of the replies pointed out in another breakdown, Linux now competes with (and in some ways beats) gaming on macOS in terms of size of catalogue.
Granted, there is a ways to go to beat Windows in this space, but the progress over the last six years generally, and the last six months specifically, is astounding.
As someone who has just released[1] a first access game across Windows, macOS and Linux:
Ten years ago, developing games on Linux was far worse than on Windows and macOS; now there is no real difference at all. I speak as a long-time UNIX user: Linux was terrible for C++ development, debugging, and asset creation/viewing (compared to the others), making gamedev tougher than it needed to be.
Ten years ago Linux's OpenGL drivers were behind even macOS's, and way behind Windows OpenGL, which itself trailed DirectX by quite some way. Now Linux's drivers are fairly good, although it's still a nightmare upgrading Radeon drivers and NVIDIA's drivers seem fickle. This leads to a lot of bug reports from users and to supporting various distro/hardware/driver combinations.
Libraries like libSDL (and libSDL2) were less mature ten years ago: multi-monitor, input, and sound support were patchy because the Linux landscape was fragmented and buggy. Now libSDL2 is pretty much spot on. Multi-monitor is doable (not as good as on macOS, but as good as on Windows), input is pretty good, and sound is OK (though most people use FMOD, which is superb).
Where Linux falls behind is deployment. No, I am not going to ship my source for you to compile. itch.io and Steam go a long way to make it easy to deploy and, crucially, update the game. It's still a PITA to deploy a compiled app on Linux with unclear glibc versions, and amazingly it's worse than Microsoft's DLL hell of MSVC redistributable shambles.
It's not impossible to overcome, but it's a PITA for a small market. Not a good position to be in. I support macOS and Linux as well as Windows because I believe in plurality, but I can see why many choose not to. Don't think that using Unity or Unreal makes anything easier either; in many ways they are worse than a well-written custom engine.
> It's still a PITA to deploy a compiled app on Linux with unclear glibc versions, and amazingly it's worse than Microsoft's DLL hell of MSVC redistributable shambles.
One option is to statically link against musl libc [1].
The kernel ABI is about the only stable thing you can rely on (which is something you _can't_ rely on for macOS or Windows).
> The kernel ABI is about the only stable thing you can rely on (which is something you _can't_ rely on for macOS or Windows).
In Windows's case, technically accurate but not really relevant, since the interface for working with the base DLLs like kernel32, user32, etc. hasn't changed.
Sorry I wasn't clearer. I just meant that on Windows and macOS, you can rely on certain userland libraries, but not the kernel ABI, while on Linux you can rely on the kernel ABI, but not the libraries.
Note that a program compiled with musl can't use shared libraries which were compiled against glibc.
I wrote my own tiny x64 C standard library which, unlike musl, is compatible with other glibc libraries: https://github.com/procedural/c_stdlib
Is it not possible to link glibc statically? IIRC musl has readability and code size in mind, while glibc optimizes for performance at the cost of readability and code size.
The problem that I know of is that glibc supports loading DSO modules, e.g. for NSS, so it's not going to support pluggable NSS sources like LDAP when statically linked. There are workarounds, however.
I can't imagine many Linux gamers are going to need NSS support in their game binaries. It's a bit of a stretch to say glibc doesn't support static linking; perhaps "officially" it doesn't, it's just that not all the extended features are going to work.
Static linking is generally going to be frowned upon these days, however, since you're effectively baking security issues into the binary forever.
Not really. musl optimizes for performance and code size, while glibc optimizes for the random user who needs to read error messages in Farsi, logs in via an external LDAP service, and needs LD and MALLOC hooks to debug weird edge cases.
glibc's Unicode tables are huge; musl's are tiny.
glibc uses assembler optimizations for everything, while musl relies on compiler auto-vectorization, which mostly beats handwritten assembly, especially with constant arguments.
Well, you should release the source code, or else a DOS executable that can run in DOSBox, or a .NES ROM image, or whatever; the end user can then provide their own emulator to run it. Only the end user shall have full control over what runs on their computer, not you.
I currently use Linux and it is much better than Windows (although I didn't like the desktop environment, so I uninstalled it; but customizing the system is what I do regardless of the operating system).
> It's still a PITA to deploy a compiled app on Linux with unclear glibc versions
If you don't want to worry about the glibc version on your users' machines, you can just ship your dynamic linker and use it to run your executable in your wrapper script (you do have one, right?). Why's that so hard?
I honestly don't see the problem. It's a different platform so it's got different quirks, that's just life.
Having said that, I strongly prefer Linux's explicit shared-object search paths and very user-friendly dynamic linker to Windows' implicit search scheme (aka DLL hell). If that means I need to ship my dynamic linker along with a couple of lines of wrapper script so my binaries work, so be it.
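A minimal sketch of such a launcher, assuming a hypothetical layout where the loader and libraries are bundled under lib/ next to the game binary; the last two lines just demonstrate the mechanism using the system's own loader:

```shell
# launcher.sh - run the game through the bundled dynamic linker
# instead of whatever ld-linux the user's distro ships. Assumed
# layout (hypothetical): ./launcher.sh, ./lib/ld-linux-x86-64.so.2,
# ./lib/*.so, ./bin/mygame
cat > launcher.sh <<'EOF'
#!/bin/sh
HERE=$(dirname "$(readlink -f "$0")")
exec "$HERE/lib/ld-linux-x86-64.so.2" \
     --library-path "$HERE/lib" \
     "$HERE/bin/mygame" "$@"
EOF
chmod +x launcher.sh

# The mechanism itself, shown with the system's own loader:
LOADER=$(ls /lib64/ld-linux-x86-64.so.2 /lib/ld-linux-x86-64.so.2 2>/dev/null | head -n1)
"$LOADER" /bin/echo "launched via explicit loader"
```

Because the loader and every .so come from your package, the user's glibc version is irrelevant; only the kernel ABI has to match.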
And you can even avoid the wrapper script if you can make sure your stuff is always installed to a standard location (e.g. /opt/yourcompany) and you pass -Wl,-rpath to your linker (at build time), but I really don't see the point.
The differences between them are getting smaller and smaller with time. With the notable exception of some really distinctive distros like NixOS or Solus, most others ship almost the same libraries at this point, at varying versions.
What about it? It doesn't solve the problem, it just allows you to conveniently package everything you can't rely on the distro having, which is basically everything.
But the Linux kernel is really the only common thing you can expect across a wide array of distributions.
* If you're a GUI app you can also speak X11 with GLX, and it's up to the user to have something that understands your messages.
* If you want desktop integration you use one of the many dbus protocols and it's up to the user's DE to understand your requests.
* If you distribute through Steam you may also depend on the Steam runtime, but you have to be careful to only depend on what's provided.
* If you distribute through Flatpak or Snap you can also depend on any of the provided runtimes or build your own for maybe easier maintainability. It's essentially 'bundle everything' as a service.
tl;dr depend on protocols and runtimes, not libraries.
Also, nothing is stopping you from building against Ubuntu:latest. It's more cumbersome but Linux users will typically find a way to make your stuff work on other distros.
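A hedged sketch of that workflow as a Dockerfile (the tag and build commands are illustrative, and assume a make-based project; in practice you might pin an older LTS tag instead of latest, so the resulting binary's glibc requirement stays as low as possible):

```dockerfile
# Build inside a fixed, known userland instead of on the developer's
# own distro, so you know exactly which glibc you linked against.
FROM ubuntu:latest
RUN apt-get update && apt-get install -y build-essential
WORKDIR /src
COPY . .
RUN make
```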
> Also, nothing is stopping you from building against Ubuntu:latest. It's more cumbersome but Linux users will typically find a way to make your stuff work on other distros.
This is not the mentality of anyone who cares about the user experience of the product.
macOS has had perfect, crisp multi-monitor support, including fractional scaling and mixed DPI with per-window/per-monitor application DPI awareness, for years.
Windows is only now introducing a diagnostic in Task Manager to show which applications have high-DPI support. The modern option, per-monitor awareness, isn't even possible on Linux with the X display server; it requires Wayland.
Snaps work on at least 40 different distros last time I looked. They remove a significant chunk of that 'library compatibility' because they ship the necessary dependencies in the package. While Ubuntu dominates the Linux desktop market, by a couple of orders of magnitude, we made snaps with the express intention of them working on all the major leading distros. The goal being to ensure you don't need to test on all those different distros. One snap to rule them all ;)
He still has a deployment problem. Flatpak needs to be installed and requires users to set up repos, including flathub itself, so he has to have them jump through extra hoops even if he puts his product in the largest most default repo available.
From what I understand about Snap, the only difference is that the One True Repo is installed by default.
AppImage is another option he could consider, which requires no repos or additional installs, just flipping the execute bit on the AppImage file. Unfortunately even that is spotty because there is no standard set of libraries to rely on like there is on other platforms, so he'd pretty much have to include everything from glibc up to be safe.
I think people mentioned that snaps work on non-Ubuntu systems. Can't you build everything you need statically and deploy as a snap? Is it much more difficult than that? Not saying that's easy, but it seems less of a burden, and you can reach at least a large share of users.
"5,000 games run on Linux" vs. "run well on Linux". Unfortunately, Linux still has some way to go. Generally you get lower FPS on Linux (which can matter for competitive gaming). There are graphical glitches and artifacts even in well-supported games (I'm looking at you, Dota 2). It's far more likely for a game to be entirely broken by an update on Linux than on Windows, and it can take quite a while to get a fix.
This is the truth! I've been a Linux user since '95, though I'm migrating to Windows on the desktop now that I'm getting older. I've tried gaming on Linux throughout the years (even very recently), and this is a big issue: getting games to work in the first place is most times a PITA, and one slight update can break what you had going. It's difficult to invest in a game if you never know when it might break on your OS.
Consoles give very strict contracts that you can rely on not to change for the whole lifetime of the console. By contrast, macOS changes stuff every year, apparently based on what a fortune teller relays from Steve Jobs's ghost.
That means that early on, yeah, it's as bad as the Mac, but it gets way better over the lifetime of the console.
Yeah, no. There's talk of Blizzard, for instance, dropping support for the Mac. DICE has a blog post signaling concern. It looks like a bunch of engines are going to limp along on MoltenVK, which kind of imposes a weird impedance mismatch and sometimes gives weird perf issues that you probably wouldn't see with a native Metal backend.
And that's before getting into the release of Metal 2. There's a non-zero amount of work to support it, and it's not clear how long Apple is going to support Metal 1.
And all of that is before all sorts of other crazy stuff, like Apple changing their app-signing requirements and messaging that they're going to require all apps to be signed by Apple in some future macOS release (but won't tell you when that is).
I wasn't talking about in-house solutions, rather engines that many AAA studios buy in order to actually focus on the game itself.
As for the rest of your remark, it comes up in places like HN, but not at all when attending local game developer meetups, developer articles on Making Games, Gamasutra, Connection, IGDA, or many other professional publications.
I mean, my day job is supporting an application across Win/Mac/Linux. Even ignoring the graphics, Apple is easily the hardest to support. I don't really care if you haven't read a magazine article on it.
And to pretend like FrostBite doesn't matter is ridiculous.
So wait, the AAA developers not supporting don't count against your argument? Even in the case of Blizzard who has famously been one of the biggest Mac supporters? Isn't "all AAA support Mac... except all the ones that don't support Mac" a tautology?
Also, just noticed that you lumped in Unity with AAA, lolz. What's next libgdx?
> Because their focus is clearly PlayStation, Xbox and PC, not even Nintendo hardware.
> There are plenty of other AAA studios using Unreal, Unity, CryEngine.
Blizzard's focus had been on the Mac in addition to Windows. With the switch to Metal, they're probably abandoning it. Frostbite means that EA AAA games probably won't support it either. Ubisoft didn't release Assassin's Creed Odyssey on the Mac. And even looking at Unreal Engine 4 games, only Fortnite and a tower defense game have been released for the Mac. Looking at CryEngine, no games have ever been released for the Mac. So where is all this AAA support for Metal that you're talking about?
To lead you to water, Mac support is a nice to have so that their in house tools work with the artists' platforms they're used to. But they don't care enough to finish out the QA, or put in any work to make the game actually shippable on that platform. The switch to Metal means that you can't justify it with "well we can just support OpenGL and get Mac for free" like they used to.
> Maybe you should check again the names of some studios using Unity, ever heard of Nintendo and Microsoft?
AAA is about the games, not the studios. Name a single AAA game on Unity.
It was about three years ago that, through pseudo-public channels, Apple started messaging that OpenGL was on its way out. And look which Blizzard game came out (Overwatch) that has pretty flagrantly disregarded the idea of Mac support, even entertaining the idea of a possible Switch port.
> Yet OpenGL doesn't make them support Linux any better.
> So support or not for Metal is not the real reason why they don't want to focus on the Mac.
"But they don't care enough to finish out the QA, or put in any work to make the game actually shippable on that platform." The Mac was a fixed platform, and you used to be able to justify the engineering because the work ultimately helped make your Windows port better ("the end user will have a way out if there's a bug in their DirectX drivers"), and it let your artists do all their work on the tools they were used to. And if you're running your tooling on the Mac, you've been supporting it the whole time, so there's very little QA overhead for release since it's a relatively fixed platform. That last part doesn't apply to Linux. This whole time I've been saying it's not just OpenGL -> Metal; it's a nexus of several things all coming together to break the camel's back.
> As for games, Nascar Heat 3, for example.
You know that a game that's less than $50 at release isn't a AAA game, right?
Today I learned that games like Sea of Thieves, Fortnite, Hitman, GTA, and Assassin's Creed aren't AAA because they are too cheap according to your price table.
(not in the game industry, but a graphics programmer)
Are there really no games out there that program their own graphics anymore and don’t rely on “middleware” engines? This seems shocking to me. Then again I was shocked the first time I learned that most games don’t hand-code assembly anymore. Things move so fast.
AAA studios always use middleware, if it isn't bought, it is done in-house.
The actual set of 3D API is a tiny portion of everything that a game engine requires, among scene management, materials handling, graphical editor, plugins, sound, physics,....
So one always ends up with a pluggable rendering layer, where adding a new API is relatively simple.
Now, what has been happening is that with production costs skyrocketing, most studios are increasingly adopting external middleware that they adapt to their purposes, rather than writing everything from scratch.
For example, you can get Unreal and get support for NVidia's raytracing features out of the box, or invest the money to develop the same features from scratch in-house.
The culture in the games industry is that what matters is the story, gameplay, taking advantage of hardware features and getting the game out there, tech comes and goes.
macOS can't really be considered a proper gaming competitor to Linux, since it runs on very limited hardware. You can't use it for demanding games, unlike Linux.
[1] https://www.reddit.com/r/linux_gaming/comments/9qopag/5000_l...