First of all, please do not read this post as a complaint, just as an analysis. I have been a Linux user for decades; I remember using Linux in my first year of university, around 1996. I never stopped using Linux on the server, but for some years, roughly between 2002 and 2017, I was a Mac user on the desktop (one PPC MacBook, two Intel MacBook Pros). At the beginning of 2017 I needed an upgrade and decided to go back to Linux, mostly because of the price of a decent MacBook Pro compared with a generic laptop. I settled on a Dell and, without much thought, picked one with a 4K display. Well, I had a hard time installing a distribution. I tried Mint, Debian and Ubuntu. Curiously, all of them gave trouble with UEFI boot, but the one that ended up installing a working Linux was Mint. Given it is Debian-based, I can keep up with Debian updates and install most packages that are available only for Ubuntu.

While I see some applications got better between 2002 and 2017, it seems the Linux community keeps reinventing the wheel. I can't see any big difference between what I experience on the desktop today and what I experienced back with GNOME 1. Yes, the code has changed. It might be more stable, faster, and support a couple of new things. But it seems we keep rewriting and rewriting the same old applications.

Then there is the issue of the 4K display. Even though GTK3 has support for HiDPI screens, a lot of applications are not written with that toolkit. And I am not sure, at all, that this is something that needs to be managed by the graphical toolkit. I still think it is an Xorg issue: we should be able to define a DPI for each screen and have the basic low-level layer scale everything. As this is how I see things, I decided today to look at the Xorg blog, and it has had no news since 2013. From what I could read, most of the work is now done in independent libraries. Nevertheless, it is strange that no changes were needed in four years.
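For what it is worth, the closest thing Xorg gives me today is xrandr, which only accepts one server-wide DPI plus an optional per-output scale factor, and that is exactly the limitation I mean. A minimal sketch of the idea, assuming the 4K panel shows up as eDP-1 (check xrandr --listmonitors on your own machine):

    import subprocess

    # Assumption: the built-in 4K panel is named eDP-1 on this laptop.
    PANEL = "eDP-1"

    # Declare 192 DPI (2 x the traditional 96) for the whole server and re-apply
    # the preferred mode. Toolkits that honour the server DPI pick it up;
    # everything else keeps drawing as if the screen were 96 DPI.
    subprocess.run(["xrandr", "--dpi", "192", "--output", PANEL, "--auto"], check=True)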

It is also curious that a bunch of applications built on node.js are working great. Examples are GitKraken, Code, Atom, Franz… and even Sublime works great on 4K (even if it has some other issues). Unfortunately Unity3D is not working properly in 4K, but that looks more like an issue with their own GUI system than anything else (but then, if Xorg took care of things, maybe it would work great, just like it works acceptably under Windows). But other things, like old GTK, Xlib, Qt or even Java applications, still need a microscope to be read.
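The workarounds I have found so far are per-toolkit environment variables rather than anything central. A small sketch of the juggling involved (the application name is made up; the scale values assume a 2x screen):

    import os, subprocess

    # GDK_SCALE doubles GTK3 interfaces and QT_SCALE_FACTOR does the same for
    # recent Qt, while the JVM wants its own -Dsun.java2d.uiScale flag instead.
    # Old GTK2 and raw Xlib applications ignore all of them.
    env = dict(os.environ, GDK_SCALE="2", QT_SCALE_FACTOR="2")
    subprocess.run(["some-application"], env=env)  # hypothetical program name

Three toolkits, three different knobs, and nothing at all for the rest.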

So here I am, with a shiny new laptop, deciding whether to keep Linux or go back to… huh… Windows! Yeah, I do dual boot, but I like Linux for most things. Some things, though, just aren't possible. As a teacher, I know I will have problems when trying to use a beamer: when connecting an external display, everything will look monstrous. Or I can change the resolution on the built-in screen, go hunting for the HiDPI switch, turn it off, restart the session so it is properly applied, and then use the laptop. Shame.
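The least painful workaround I know, instead of flipping the HiDPI switch, is to let xrandr up-scale the beamer so both screens end up at roughly the same logical size. Something like the sketch below, assuming a 1080p projector on HDMI-1 next to the 4K panel on eDP-1 (output names vary per machine):

    import subprocess

    INTERNAL, BEAMER = "eDP-1", "HDMI-1"  # assumed output names

    # Render a 3840x2160 area onto the 1920x1080 beamer (--scale 2x2), so windows
    # keep roughly the proportions they have on the 4K panel, and place it to the
    # right of the built-in screen.
    subprocess.run([
        "xrandr",
        "--output", INTERNAL, "--auto",
        "--output", BEAMER, "--auto", "--scale", "2x2", "--right-of", INTERNAL,
    ], check=True)

It works, but it is a manual incantation every single time, which is precisely my point.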

And yes, I know a lot of this is my fault. If I had not switched to the Mac, and if hundreds of other developers had not done the same, we would probably have a lot more Linux users writing and patching these applications. Or we would just end up with a lot more distributions and a lot more window managers, but with the same main issues.

Last, but not least, I would like to thank everyone who is still working on Linux and making it better. I know this is not a paid job. I know you (and I) do what we want, and what makes us happy. That is why this is not a complaint, just a look at what I see, without pointing any fingers.
