TechOpsGuys.com Diggin' technology every day

June 17, 2012

The old Linux ABI compatibility argument

Filed under: Random Thought — Nate @ 12:45 pm

[WARNING: Rambling ahead]

I was just reading a somewhat interesting discussion over on Slashdot (despite the fact that I'm a regular there, I don't have an account and have posted as an Anonymous Coward maybe four times over the past twelve years).

The discussion is specifically about Linus slamming NVIDIA for their lack of co-operation in open sourcing their drivers and integrating them into the kernel.

As an NVIDIA customer going back to at least 1999, I think, when I got a pair of cheap white-boxed TNT2 PCI graphics cards from Fry's Electronics, I can say I've been quite happy with their support of Linux. Sure, I would love it if the drivers were more open source and better integrated, but it's not that big of a deal for me to grab the binaries from their site and install them.

I got a new Dell desktop at work late last year, and specifically sought out a model with Nvidia graphics in it because of my positive experience with them (my Toshiba laptop is also Nvidia-based). I went ahead and installed Ubuntu 10.04 64-bit on it, and it just so happens that the Nvidia driver in 10.04 did not support whatever graphics chip was in the Dell box – it worked OK in safe mode but not in regular 3D/high-performance/normal mode. So I went to download the driver from Nvidia's site, only to find I had no network connectivity: the e1000e driver in 10.04 also did not support the network chip that happened to be in that desktop. I had to use another computer to track down the source code for the driver and copy it over via USB or something, I forget. Ever since then, whenever Ubuntu upgrades the kernel on me I have to boot to text mode, recompile the e1000e driver and re-install the Nvidia driver. As an experienced Linux user this is not a big deal to me. I have read enough bad things about Ubuntu and Unity that I would much rather put up with the pain of the occasional driver re-install than have constant pain because of a messed-up UI. A more typical user should perhaps use a newer distro release that hopefully has built-in support for all the hardware (perhaps one of the Ubuntu offshoots that doesn't have Unity – I haven't tried any of the offshoots myself).
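
For anyone curious, the post-kernel-upgrade routine boils down to a few commands. Here is a minimal sketch, assuming the Intel e1000e source tarball and the NVIDIA .run installer have already been downloaded – the version numbers and file names below are illustrative, not the exact ones I used:

    # drop to a text console and stop X first (on 10.04: sudo service gdm stop)
    tar xzf e1000e-1.9.5.tar.gz && cd e1000e-1.9.5/src
    sudo make install                 # builds against the running kernel's headers
    sudo depmod -a && sudo modprobe e1000e
    # then re-run the NVIDIA installer so it can rebuild its kernel module
    cd ~/Downloads && sudo sh NVIDIA-Linux-x86_64-295.59.run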

One of the other arguments is that the Nvidia code taints the kernel, making diagnostics harder – this is true – though I struggle to think of a single time I had a problem where I thought the Nvidia driver was getting in the way of finding the root cause. I tend to run a fairly conservative set of software (I recently rolled back the 64-bit Firefox 13 on my Ubuntu machine at work to the 32-bit Firefox 3.6 due to massive stability problems with the newer Firefox – 5 crashes in the span of about 3 hours), so system crashes and the like really aren't that common.

It's sad that the state of ATI video drivers on Linux is apparently still so poor despite significant efforts over the years in the open source community to make it better. If I'm remembering right, back in the late 90s Weather.com invested a bunch of resources in getting ATI drivers up to speed to power the graphics on their sets. AMD seems to have contributed quite a bit of stuff themselves. But the results still don't seem to cut it. I've never, to my knowledge at least, used an ATI video card in a desktop/laptop setting on one of my own systems. I keep watching to see if their driver/hardware situation on Linux is improving, but haven't seen much to get excited about over the years.

From what I understand, Nvidia's drivers are fairly unified across platforms, and a lot of their magic sauce is in the drivers, less of it in the chips. So I can understand them wanting to protect that competitive edge – provided they keep supplying a quality product anyway.

Strangely enough, the most recent kernel upgrade didn't impact the Nvidia driver but still, of course, broke the e1000e driver. I'm not complaining about that though, it comes with the territory. (My Toshiba laptop, on the other hand, is fully supported by Ubuntu 10.04 with no special drivers needed – though I do need to restart X11 after suspend/resume if I expect to get high-performance video, mainly in intensive 3D games. My laptop doesn't travel much and stays on 24×7, so it's not a big deal.)

The issue, more than anything else, is that even now after all these years there isn't a good enough level of compatibility across kernel versions, or even across userland. So many headaches for the user would be fixed if this was made more of a priority. The counter-argument, of course, is to open source the code and integrate it and things will be better all around. Except that unless the product is particularly popular, it's much more likely (even if open source) that it will just die on the vine, unable to compile against more modern libraries, while the binaries themselves end up segfaulting. "Use the source, Luke" comes to mind here – I could technically try to hire someone to fix it for me (or learn to code myself), but it's not that important. I wish product X would still work, and there isn't anything I can realistically do to make it work.

But even if the application (or game or whatever) is old and not being maintained anymore, it may still be useful to people. Microsoft has obviously done a really good job in this department over the years. I was honestly pretty surprised when I was able to play the game X-Wing vs. TIE Fighter (1997) on my dual-processor Opteron with XP Professional (and reports say it works fine in Windows 7, provided you install it using another OS, because the installer is 16-bit, which doesn't work in 64-bit Windows 7). I could very well be wrong, but 1997 may have been even before Linux moved from libc5 to glibc.

I had been quietly hoping that as time went on some of these interfaces would stabilize as being good enough, but it doesn't seem to be happening. One thing that does seem to have stabilized is the use of iptables as the firewall of choice on Linux. I of course went through ipfwadm in kernel 2.0 and ipchains in 2.2, and by the time iptables came out I had basically moved on to FreeBSD for my firewalls (later OpenBSD when pf came out). I still find iptables quite a mess compared to pf, but about the most complicated thing I have to do with it is transparent port redirection, and for that I just copy/paste config examples from older systems. It doesn't bug me if I don't end up using it.
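
For what it's worth, that copy/paste transparent redirection is essentially a single NAT rule; the interface and port numbers below are illustrative rather than from any of my actual systems:

    # send inbound web traffic arriving on eth0 to a local proxy listening on 3128
    iptables -t nat -A PREROUTING -i eth0 -p tcp --dport 80 -j REDIRECT --to-ports 3128

    # the (old-syntax) pf.conf equivalent, for comparison
    rdr on em0 inet proto tcp from any to any port 80 -> 127.0.0.1 port 3128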

Another piece of software that I had grown to like over the years – this one really is open source – is xmms (version 1). Basically a lookalike of the popular Winamp software, xmms v1 is a really nice, simple MP3/OGG player. I even used it in its original binary-only incarnation. Version 1 was abandoned years ago (they list Red Hat 9 binaries, if that gives you an idea), and version 2 seems to be nothing remotely similar to version 1, so I've tried to stick with version 1. With today's screen resolutions I like to keep it in double-size mode. There's a bug report on Debian from 2005 to give you an idea how old that issue is, but fortunately the workaround still works. Xmms still compiles (though I did need to jump through quite a few hoops, if I recall right) – for how long, I don't know.

I remember a few months ago wanting to hook up my old Slingboxes again, which are used to stream TV over the internet (especially since I was going to be doing some traveling late last year and this year). I bought them probably 6-8 years ago and have not had them hooked up in years. Back then, in 2006, I was able to happily use WINE to install the Windows-based SlingPlayer and watch video. I tried again earlier this year and it doesn't work anymore – the same version of SlingPlayer (the same .exe from 5+ years ago) doesn't work on today's WINE. I wasn't the only one; a lot of other people had problems too (I could not find any reports of it working for anyone). Of course it still worked in XP. I keep the Slingbox turned off so it doesn't wear out prematurely unless I plan to use it – and of course I forgot to plug it in before my recent trip to Amsterdam.

I look at a stack of old Linux games from Loki Software and am quite certain none of them will ever run again, while the Windows versions of the same games will still happily run (some of them even run in Wine, of all things). It's disappointing to say the least.

I'm sure I am more committed to Linux on the desktop/laptop than most Linux folks out there (who are more often than not using OS X), and I don't plan to change – just keep chugging along, from the early days of staying up all night compiling KDE 0.something on Slackware to what I have in Ubuntu today.

I'm grateful that Nvidia has been able to put out such quality drivers for Linux over the years, and as a result I opt for their chipsets in my Linux laptops and desktops at every opportunity. I've been running Linux since, I want to say, 1998, when my patience with NT4 finally ran out. Linux was the first system I was exposed to at a desktop level that didn't seem to slow down or become less stable the more software you loaded on it (that stays true for me today as well). I never quite understood what I was doing, or what the OS was doing, that would prompt me to re-install Windows from the ground up at least once a year back in the mid 90s.

I don't see myself ever going to OS X. I gave it an honest run for about two weeks a couple of years ago and it was just so different from what I'm used to that I could not continue using it; even putting Ubuntu as the base OS on the hardware didn't work, because I couldn't stand the trackpad (I like the nipple – who wouldn't like a nipple? My current laptop has both and I always use the nipple) and the keyboard was missing a bunch of keys. I'm sure if I tried to forget all of the habits I have developed over the years and did things the Apple way it could have worked, but going and buying a Toshiba and putting Ubuntu 10.04 on it was (and remains) the path of least resistance for me to becoming productive on a new system (the second-least resistance, next to Linux, being a customized XP).

I did use Windows as my desktop at work for many years, but it was heavily, heavily customized with Blackbox for Windows as well as cygwin and other tools – so much so that the IT departments didn't know how to use my system (no Explorer shell, no Start menu). But it gave Windows a feel familiar from Linux, with mouse-over window activation (via the XP PowerToys – another feature OS X lacked, outside of the terminal emulator anyway) and virtual desktops (though no edge flipping). It took some time to configure, but once up and going it worked well. I don't know how well it would work in Windows 7; the version of Blackbox I was using came out in the 2004/2005 time frame, though there are newer versions.

I do fear what may be coming down the pike from a Linux UI perspective though, so I plan to stick to Ubuntu 10.04 for as long as I can. The combination of Gnome 2 plus some software called brightside, which allows for edge flipping (otherwise I'd be in KDE), works pretty well for me – even though I have to manually start brightside every time I log in, because when it starts automatically it doesn't work for some reason. The virtual desktop implementation isn't as good as AfterStep, something I used for many years, but Gnome makes up for it in other areas where AfterStep fell short.
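
I haven't dug into why the automatic start fails, but if it is a startup-ordering race (which is only a guess on my part), a common workaround is a delayed GNOME autostart entry; the file name and sleep value below are just placeholders:

    # create a GNOME 2 autostart entry that launches brightside a bit after login
    mkdir -p ~/.config/autostart
    cat > ~/.config/autostart/brightside-delayed.desktop <<'EOF'
    [Desktop Entry]
    Type=Application
    Name=Brightside (delayed)
    Exec=sh -c "sleep 10 && brightside"
    EOF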

I’ve gotten even more off topic than I thought I would.

So – thanks, Nvidia, for making such good drivers over the years; they've made Linux on the desktop/laptop that much easier for me to deal with. The only annoying issue I recall having was on my M5 laptop, and that one wasn't limited to Linux and didn't seem specific to Nvidia (or Toshiba).

Also thank you to Linus for making Linux and getting it to where it is today.

5 Comments

  1. I am fairly confident that you can get pretty much any app to work on Linux with only the binary; the issue is 99.99% of the time purely a userland one. ELF binaries will all work if you have the right userland libraries in place. I can still get the Mosaic web browser running on Linux (not that it's very useful), and Unreal Tournament GOTY Edition from 1999 still works with a few small tricks, although it's better to run it in Wine so that all the Windows-specific mods work (yes, Loki did it) ;)

    With Linux, you can very easily maintain compatibility with older apps by grabbing older libraries and working with those in a separate directory, using LD_LIBRARY_PATH to tell the linker to use the path you specify rather than the normal /usr/lib* type of situation.
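
    As a sketch (the binary name and library paths here are made up purely for illustration), it is roughly:

        # see which libraries an old binary can't find on a modern system
        ldd ./old-loki-game | grep "not found"

        # drop era-appropriate copies of those libraries into a private directory
        mkdir -p $HOME/oldlibs
        cp /mnt/old-distro/usr/lib/libstdc++-libc6.1-1.so.2 $HOME/oldlibs/

        # point the dynamic linker at that directory just for this program
        LD_LIBRARY_PATH=$HOME/oldlibs ./old-loki-game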

    If you're concerned about security and want a decent source of older libraries, slam a distro with older libs into a chroot directory and update it with their update tools. Hint: Red Hat Enterprise Linux clones like CentOS offer errata for free and have 10 years of support. Not only do they offer many maintained compat libs for prior versions, but their current versions are often old enough to keep apps going for longer – you can make use of their libs for older apps while keeping a bleeding-edge distro like Fedora going ;)
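
    One rough way to bootstrap such a chroot on an RPM-based host (the release RPM, path and package set below are illustrative):

        sudo mkdir -p /srv/centos-chroot
        sudo rpm --root /srv/centos-chroot --initdb
        # seed the repo definitions from a downloaded centos-release RPM...
        sudo rpm --root /srv/centos-chroot -ivh centos-release-6-2.el6.centos.7.x86_64.rpm
        # ...then let yum pull in a minimal system and keep it patched with the normal tools
        sudo yum --installroot=/srv/centos-chroot -y groupinstall Base
        sudo yum --installroot=/srv/centos-chroot -y update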

    Of course, now that I've mentioned CentOS, it has to be said: if you don't like bleeding edge (or Unity), you should avoid Ubuntu. Ubuntu is normally derived from Debian Unstable and their LTS versions come from Debian Testing. If you want stability and GNOME 2, go get CentOS.

    Red Hat Enterprise Linux (CentOS' upstream) comes from a completely tested, stable, older Fedora release. They literally wait for the Fedora version (and the one above it) they wish to base upon to EOL, then they internally test and stabilise based on any outstanding bugs which may not have been fixed before EOL, and they backport the stable releases of drivers for new hardware from the latest kernels. You then get bugfixes and fresh hardware enablement for all classes of hardware for the first 3 years, then critical hardware enablement for one more year (e.g. CPU and minor GPU revisions), and after that you get just security and major-bugfix patches. With 10 years of support you get a stable environment – oh, and did I forget to mention *they maintain ABI compatibility on libraries throughout*. CentOS gets you these benefits for free; I recommend you try it. Also, if you need more than 10 years, Red Hat can give critical patches for an additional 3 years on a set of core packages too… That's 13 years of support, which matches that of Windows XP (2001 through to April 2014).

    Windows is actually less compatible than Linux if you use the latest userland binaries they offer from system32. It's actually WinSxS hard-link voodoo magic that keeps everything working: if you examine WinSxS, you'll find many, many copies of the same DLLs which are optionally linked in based on manifests at run time or by compatibility-database hacks. Also, Windows 8 is due to break compatibility with plenty of older games, by the way; it can't do 8-bit/16-bit display modes, so hacks are being used to fake to games that 32-bit is really 16-bit/8-bit, because DWM is now a core part of the display management and it requires 32-bit colour. In addition, they've yet again broken driver ABIs, so all the graphics vendors have to write new drivers, and WinSxS gets bigger with each update. You need 40GB+ reserved for growth of WinSxS now.

    The only system which I know goes all the way back with full compatibility is Solaris (they even maintain driver ABI compatibility), and unfortunately, try as they might, Illumos/OpenIndiana are unlikely to revive OpenSolaris, and Oracle has killed certain compatibility features in Oracle Solaris 11, like lx-branded zones for compatibility with older Linux 2.2/2.4 binaries.

    Anyway, older apps do work on Linux; they just need a tiny bit of "environmental hacking" by giving them the libs they are comfy with. Also, the benefit of open source is that you can maintain these libs long after support is gone if the applications are security-critical, by backporting fixes (of course by then, chances are, the binary apps have holes like Swiss cheese anyway), or you can use SELinux/AppArmor or LXC to provide safeguards.

    Comment by Martyn Hare — June 21, 2012 @ 3:47 am

  2. Now I've totally gone off-topic :P

    Try using a 6200SE TurboCache on an ATI Xpress chipset with their older drivers, for example (hint: random lockups), or a GeForce 2 MX 400 with an Athlon XP 2000 with earlier drivers (hint: random lockups). NVIDIA may be offering better backwards compatibility, and in many cases better consistency than ATI Catalyst right now, but their driver quality has historically been just as crap; they've just had a better installer.

    NVIDIA have no plans to support Kernel Mode Setting in Linux, which means that VT switching will forever be laggier than with other drivers. However, rootless Xorg is possible with NVIDIA drivers, just like with KMS :D

    However, ATI and Intel have better drivers which do KMS and use more modern rendering infrastructure, so in 5-10 years, NVIDIA will be 3rd rate unless they up their game, GPL the kernel drivers and simply keep the userland drivers under lock-and-key.

    Comment by Martyn Hare — June 21, 2012 @ 4:25 am

  3. Thanks for the comments! Yes, I do agree that as far as Linux binary compatibility goes, the ELF format (I only vaguely remember the a.out days) has remained constant and the issue is userland from that perspective. BUT the kernel has its own issues with regard to driver compatibility across versions. One of the nice things, I suppose, about running a virtualized environment is that at least from a basic driver perspective the VM's virtual hardware doesn't change often, so you do have the ability to move between kernels without much pain – though that can change if you're using paravirtualized drivers like VMXNET in VMware.

    The headaches of hacking modules.cgz in Red Hat installer images to get updated network and/or storage drivers in order to get a PXE-boot kickstart to function haven't been felt by me in a while, but the memories are burned in pretty good, and I'm reminded of them to some extent (though the pain is significantly less) when I have to recompile my network driver on my desktop at work. For some drivers there is the option of removing the version tag, for lack of a better word, so the driver isn't tied to a particular kernel version (I believe, though this could be old), and you could *try* to force the driver to load on different versions and it may work (though of course it may not). [EDIT: I completely forgot about building a system with the respective kernel and compiling the drivers from HP/Dell/whomever against that version so the installer would work – totally forgot about that part!]
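
    For anyone who has never had the pleasure, a rough sketch of the modules.cgz dance and the "force it anyway" option – paths and module names are illustrative, and forcing a module across kernel versions can of course crash the box:

        # modules.cgz is just a gzipped cpio archive of kernel modules
        mkdir /tmp/moddisk && cd /tmp/moddisk
        zcat /path/to/driverdisk/modules.cgz | cpio -idv
        # ...drop in a module rebuilt against the installer's kernel, then repack...
        find . | cpio -o -H crc | gzip -9 > /path/to/driverdisk/modules.cgz.new

        # the blunt alternative: ignore the version tag and hope for the best
        sudo modprobe --force e1000e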

    For me at least, Nvidia driver quality has been good for the chipsets I have used over the past 10 years or so. In distributions like Ubuntu that include the drivers, it's only a matter of time until the drivers in the distro become obsolete vs. newer hardware, but downloading the drivers from Nvidia's website has always worked for me – with the caveat, of course, that I run more conservative hardware configurations. I have read that Nvidia doesn't support (is it Optimus?) the hybrid GPU setup where you can switch between a dedicated GPU and the integrated one to save power on demand. I'm not sure what that means though – whether you can use the Nvidia chip or the integrated one and decide at boot, or whether you're forced onto one or the other. The lack of the ability to switch between the two would be unfortunate, but I'd probably still be happy to just use the Nvidia one, since it's what I've used in my two recent laptops (a Toshiba M5, now a Toshiba A10). Of course, battery life of about 2 hours (even on Windows) isn't very good, but I didn't buy this thing as anything more than a desktop replacement with on-site support – I knew about the battery going into the purchase.

    I haven't read all of your comments yet but will read more and may add more – thanks again! I have a conference call to go to right now.

    Comment by Nate — June 21, 2012 @ 10:00 am

  4. When you say VT switching, I assume you mean X11-to-terminal switching? I gave up on that many, many years ago; the lag doesn't bother me, but there seemed to be 50/50 odds that switching back and forth would freeze the video and force a reboot. This happened to me on Nvidia and other chipsets too, though it's been a while since I used anything but Nvidia. I can recall a couple of laptops I had a few years ago with embedded video (they weren't my primary machines but I still used them on occasion; I forget what chips they used – most likely Intel – but they weren't Nvidia). My sister still has one of them and uses it with Ubuntu – an old Toshiba laptop from about 2005.

    So I only attempt X11-to-VT switching as a last resort, and I normally save all of my open files before I try it.

    ATI and Intel may integrate better with Linux, but very consistently I see people say to use Nvidia over anything else due to the completeness of the driver itself. I recall Cedega, for example, which used to make a version of Wine to run games; their only certified supported configuration (if I remember right) required an Nvidia video chip.

    There was another discussion on Slashdot recently where some XBMC developer(s) were complaining about ATI drivers – simple things like hardware decoding of MPEG-2 video weren't possible (on at least some chips) because AMD hadn't released the specs/info needed to do it. The reason given was, I believe, something about them being nervous about things like DRM being broken (which didn't seem like a good excuse to me, since it's pretty easy to break DVD DRM at least). In any case it's a strong sign that while there are better open source drivers (better as in how they integrate with the OS, etc.), they still lack a bunch of the functionality and/or performance of the card/driver itself.

    I recall yet another discussion not long ago, again on Slashdot, where AMD/ATI were going to move away from frequent release cycles on their drivers, and I read comments from folks talking about how many of the newer games had issues with ATI video cards because of driver problems even on Windows, with speculation that if the frequent release cycles couldn't improve quality, then longer release cycles weren't likely to help a whole lot either. Seeing similar driver issues on the native platform for the chips (which is not new, I believe – I have seen such complaints for a while now) certainly makes me pause and think harder before using them in Linux.

    The last time I had Intel video chips (granted, in lower-end machines) the 3D performance was basically non-existent. While I haven't played many games in Linux *recently*, I had been playing a few over the past several years, especially in Cedega or Wine – I can't think of any native Linux games I've played seriously since Loki Software was around. I'm not a serious gamer by any means, but Nvidia gives the performance and features to make it possible.

    Nvidia has its warts for sure, but in my experience thus far they are easier to live with than the alternatives.

    thanks again!

    Comment by Nate — June 24, 2012 @ 9:58 am

  5. […] main argument seems to be around backwards compatibility, an argument I raised somewhat recently, backwards compatibility has been really the bane for Linux on the desktop, and […]

    Pingback by Some reasons why Linux on the desktop failed « TechOpsGuys.com — September 2, 2012 @ 10:40 pm
