
The True Challenges of Desktop Linux

Posted by timothy
from the editing-video-would-be-nice dept.
olau writes "Hot on the heels of the opinion piece on how Mac OS X killed Linux on the desktop is a more levelheaded analysis by another GNOME old-timer, Christian Schaller, who doesn't think Mac OS X killed anything. In fact, in spite of the hype surrounding Mac OS X, it seems to have barely made a dent in the overall market, he argues. Instead he points to a much longer list of thorny issues that Linux has historically faced as a contender to Microsoft's double monopoly on the OS and the Office suite."
This discussion has been archived. No new comments can be posted.


  • by MrEricSir (398214) on Friday August 31, 2012 @06:57PM (#41195893) Homepage

    FTA:

    The core of his argument seems to be that the lack of ABI stability was the main reason we didn't get a significant market share in the desktop market. Personally I think this argument doesn't hold water at all...

    This is one argument I really don't get, and yet the FOSS library maintainers seem to be adamant that they must be able to break their ABIs whenever they want.

    Yes, I know keeping a stable ABI is hard. But here's the deal: as a maintainer, it's your job.

    Let's not forget that the point of libraries is to develop software on top of them. If the library ABIs are shifting all the time, then those libraries have failed at their most fundamental task.

    There's absolutely zero excuses for why an app written three years ago shouldn't run fine today. None. If MS and Apple can do it, then so can you.

    But it's worse than that. Writing a GUI application that runs on just the past two or three versions of Ubuntu requires writing your own compatibility layers, or at least peppering your code with #defines. Why on earth would we want to put this burden on application developers?

    • by cynyr (703126) on Friday August 31, 2012 @07:12PM (#41196005)

      But does the binary have to run, or just work if you configure; make; make install again? Right, the OSS world assumes that software can be recompiled, and most software only needs that. Sometimes it needs a simple patch, but breaking ABI isn't really an issue. Breaking an API is much more of one.

      • by MrEricSir (398214) on Friday August 31, 2012 @07:16PM (#41196037) Homepage

        But does the binary have to run, or just work if you configure; make; make install again?

        First of all, if you do that it's no longer the same binary.

        Secondly, why would you place that burden on the user? The whole point of software is to solve problems for users, not to create new ones.

        • by monkeyhybrid (1677192) on Friday August 31, 2012 @07:55PM (#41196301)

          But does the binary have to run, or just work if you configure; make; make install again?

          First of all, if you do that it's no longer the same binary.

          So? If most of your software is FOSS and can be recompiled, why do you care if it's the same binary or not?

          Secondly, why would you place that burden on the user? The whole point of software is to solve problems for users, not to create new ones.

          It's not often that burden is placed on the user; package maintainers for each Linux distribution generally take care of compiling and making sure the relevant libraries are in place. With every distribution upgrade I do there's been less and less reason to compile anything myself. In fact, IIRC, I've not compiled a single piece of third-party software for my use for at least a year or two.

          A moving ABI really isn't a problem at all for the vast majority of Linux users, especially if most of the software we use is FOSS and available from a distribution's repositories. Now, that's not to say it doesn't cause a few headaches for package maintainers...

          • For every binary change that somebody other than the user recompiles, the user still has to download the result, so that's just trading computing resources and time for network resources and time. As such, all else being equal, the ABI should not change.

          • Circular rhetoric (Score:5, Insightful)

            by dutchwhizzman (817898) on Saturday September 01, 2012 @02:58AM (#41198181)

            You are talking about current Linux users and application suppliers that seem to not bother about ABI stability. If you want to get the other 95+ percent of people that use desktop computers using your product, you may want to look at their needs and not solely at the needs of the few you are catering for already.

            Diversity is good for an ecosystem; evolution depends on it. However, with too much instability and chaos, evolution loses, because most of the deviations are too crippled to grow into something useful, even if they have some very good mutations. This is true for the development of the organisms themselves, but also for people wanting to "farm" these organisms.

            Large corporations making enterprise software don't want to bother supporting variations that quickly run into thousands of possible software combinations, each requiring adaptation in their product or service to make it work. Why do you think Oracle supports only a few Linux distributions for its RDBMS? It's not just because they want to promote their own distribution; it's because it simply is a pain in the behind to have to support someone's Arch or Gentoo box, only to find out, after dozens of expensive analyses by expensive software-debugging experts, that some flag was set differently at compile time, or that a minor version of some library is in use that has an obscure bug triggered only in specific circumstances. Just a few of those cases and your profit model is out of the window. It's just way too risky.

            Both Microsoft and Apple tend to announce well in advance when they want to retire some framework for binary compatibility, so application developers can adapt their product to the new alternative ahead of time and still support older versions of their product for years to come. Windows still offers most (if not all) 16-bit Windows ABIs in some form on some OSes supported today. Apple took many years to kill "Classic" support, and support for PPC CPUs and legacy frameworks was around for years before they stopped supporting anything but Cocoa.

            If you compiled an app for OSX or Windows XP five years ago using the then-latest standards, the chances that it will run without any modification or extra work on a freshly installed system with OSX 10.7 or Windows 7 are very high. Try that with a graphical application for a Linux desktop and, at the very least, you'd probably be looking at installing "compat libs", if your distro supplies them at all. This is a support nightmare, and a nuisance at the least for people able to deal with this sort of problem themselves. For Linux to make it to the desktop successfully this needs to change. Linux needs its VisiCalc, WordPerfect, Office, Photoshop or similar "must-have killer application" to get a decent share of desktop usage, and making it hard for application makers to choose Linux for that isn't going to make that happen.

      • Whichever. But, if a recompile is needed, either you make it idiot-proof (ideally, one-click, with a 99% success rate), or you lose 95% of PC users.

        If the configure, make... steps are always the same, why aren't they scripted once and for all? Is there a GUI to do it?

        • by monkeyhybrid (1677192) on Friday August 31, 2012 @08:08PM (#41196367)

          Whichever. But, if a recompile is needed, either you make it idiot-proof (ideally, one-click, with a 99% success rate), or you lose 95% of PC users.

          That idiot-proof method you wish for is already there. It's called a package manager and every major distribution has one. Ok, so it's not recompiling the software for you on the fly (in most cases) but that's because someone else has done that for you so you don't even need to think about it. It really couldn't be easier, either by GUI or CLI.

          • by enos (627034) on Friday August 31, 2012 @10:20PM (#41197097)

            It's called a package manager and every major distribution has one.

            Every major distribution has its own, incompatible with every other major distribution's, even though the package systems do the same job. Even distros that use the same package management system don't share compatible repositories.

            So you just turned supporting "Linux" into supporting Ubuntu, RedHat, SuSE, etc.

            • So you just turned supporting "Linux" into supporting Ubuntu, RedHat, SuSE, etc.

              Unfortunately, that happened a long time ago, and that is a major reason why Linux never really took off as a replacement for Windows.

            • by F.Ultra (1673484)
              And? At work we simply call a script that sends the project to the build server, which builds packages for all the major distributions and architectures and publishes them to the appropriate package repositories.
          • by Burz (138833) on Friday August 31, 2012 @11:00PM (#41197347) Journal

            Package managers do not solve the problem of compatibility across different distros. In fact, not even across the semi-major upgrades you see each month with a single distro. PMs also contribute to making the development environment app-unfriendly because they don't work well with anything that hasn't been subsumed into "the repository"... i.e. independent software distribution is really an uphill slog to the point where even Mozilla gave up on packaging apps for Linux-based distros long ago; Mozilla packages apps for Windows and OS X.

            Really, if you don't make it easy for curious types to make something interesting and to then share it easily with others, then the platform doesn't work. People will continue cutting their programming teeth on OS X and Windows and will stay there or with other platforms that satisfy the same criteria. So-called "Desktop Linux" doesn't even have an SDK! The longbeard hacker politics affecting the Linux Foundation demand that it doesn't have an SDK. Skittering around in Google's wake, they saw fit to create an SDK for Mobile Linux but heaven forbid if we get one for ye olde desktop.

            The subculture stubbornly refuses to standardize both the user experience and that of the app developer. And so it drives both groups away.

            • Package managers do not solve the problem of compatibility across different distros.

              That's correct. That's not the aim.

              In fact, not even across the semi-major upgrades you see each month with a single distro.

              Not sure I follow. I've had plenty of day-to-day updates work with no problems over the years. My Arch installation is 4 years old and still up to date.

              PMs also contribute to making the development environment app-unfriendly because they don't work well with anything that hasn't been subsumed into "the

      • by Guy Harris (3803)

        But does the binary have to run, or just work if you configure; make; make install again? Right, the OSS world assumes that software can be recompiled, and most software only needs that. Sometimes it needs a simple patch, but breaking ABI isn't really an issue. Breaking an API is much more of one.

        How many of the ABI breakages about which people complain are the result of API breakage, and how many are the result of changing the sizes or layout of data types in ways that don't break the API?

      • by Anonymous Coward on Friday August 31, 2012 @09:17PM (#41196703)

        Yes, the binary should still run, and the SAME binary should run across several distros and several versions of those distros. Even in the current messed-up environment it is possible, if you are very careful: use the oldest compiler you can find so that all of your users have newer versions of libc and libstdc++, build and bundle all the rest of the libraries yourself, including the GUI libraries, and be careful with the X11 options to configure, since you can't count on xfixes. This is why commercial development has little patience for Linux. From: Linux user since the 0.92 kernel and principal developer of a commercial desktop Linux statistical visualization product. The product is still sold, and thriving on Windows and Mac, even an iPad version, but now discontinued on Linux! Sadly, without ABI stability and at least compatibility libraries, Linux will not be more than a niche on the desktop.

      • by golodh (893453)
        Largely agreed, and I think it's an important point.

        I will never forget how I tried to install JGR, which is a graphical shell for the statistical package R.

        I tried this under Windows XP, and the whole process took 10 seconds and everything worked.

        I tried the same thing under Linux and first found that there was no package, my distro didn't include it (for JGR was experimental at that time) so I had to use a tarball. Downloaded the tarball, did configure and make ... and was confronted by a load of er

    • It's funny you say that. The important Linux desktop APIs have been stable for over a decade. Look at GLib 2.x and indeed the entire GNOME 2.x stack: it hasn't been broken. You can still run an application compiled against GTK+ 2.0 on any modern distribution. Obviously, it will have the same functionality that it had 10 years ago, but the same can be said of Windows or OSX.

      And well, GTK+ 3 has a slightly different API, etc., but the same is true of WinRT or many of the newer OSX APIs. And GTK+ 2.x is parallel-installable, so you can keep using it more or less forever.

      • by martin-boundary (547041) on Friday August 31, 2012 @08:41PM (#41196541)
        The OP is talking about ABIs (Application Binary Interface), but as your post implies, that's a red herring. Who cares if the low level binary interface that handles OS and library system calls changes? Just recompile the software for the most recent version of everything you've got.

        We can do that in the FOSS world, because we ship the source to everything and the APIs are what matters. The ABI "problem" is a nonproblem that's really a side effect of the misguided commercial belief in secrecy.

        If you're a company that only wants to sell a compiled binary to a bunch of clients, then you don't get to complain if the binary you prepared fifteen years ago for some distro using linux 2.1 no longer works in 2012.

        Just tell your clients to run the older distro, or else recompile your code for a modern distro. Or you know, you could make your code open source, and reap the benefits of community support.

        • Re: (Score:3, Insightful)

          by Anonymous Coward

          Who cares if the low level binary interface that handles OS and library system calls changes?

          It used to be my job. I worked on a commercial program used by digital hardware engineers. We supported many flavors of UNIX.

          Just recompile the software for the most recent version of everything you've got.

          You want me to compile, ship, and support a binary on every flavor of Linux a customer has? We tried to figure out what customers had, and supported the top four. This only covered about 60% of requests.

          We can do that in the FOSS world, because we ship the source to everything and the APIs are what matters. The ABI "problem" is a nonproblem that's really a side effect of the misguided commercial belief in secrecy.

          Paying money for closed source software is how 99.99% of users expect to get software that does the job and is well supported. If you refuse to support this use case, do not expect

          • I get what you're saying. Customers' IT departments like to lock down their systems, and that caused problems for your (former?) company. So it's natural to wish that the Linux community would adjust its development practices so that various distros' users' IT departments' decision makers can't mess about too much.

            But the fact is that decision makers in IT departments will always find ways to mess about. If it isn't ABIs, it will be something else. The philosophy of the Linux community is that technical e

    • Actually, I think code should be distributed in an ABI-independent manner.

      So distribute some form of intermediate code. Put a version number in there. And then let the OS process it into something the CPU can understand.

    • I have one. (Score:5, Insightful)

      by Anonymous Coward on Friday August 31, 2012 @07:20PM (#41196069)

      There's absolutely zero excuses for why an app written three years ago shouldn't run fine today.

      You sound like you're a paying customer or their boss. If said maintainers are volunteers and doing this in their spare time and juggling work and family and just having a life, I think they have an excuse.

      If it were me and I heard horseshit like your post, I'd say, "Here's the code. Knock yourself out. I'm taking my kid to the movies like I promised him three releases ago."

      • Re:I have one. (Score:4, Insightful)

        by shutdown -p now (807394) on Friday August 31, 2012 @09:11PM (#41196687) Journal

        Which is precisely why Linux on the Desktop is still confined to 1%.

        • Bingo (Score:5, Insightful)

          by Sycraft-fu (314770) on Friday August 31, 2012 @09:19PM (#41196713)

          You can't have it both ways. If you are happy with the "Whatever it is free, the quality can vary and people can do whatever they like," then cool. That's great but understand and accept it'll never be mainstream desktop. If you want that mainstream desktop, then you have to start to support users. You have to make things easy for them. "Just get the source code," can NEVER be something you utter.

          So if you want Linux to always just be the "geek OS" on the desktop or something people use when they are going to stack a bunch of custom shit on (like Android) then no problem. However if you want to advocate Linux for all and a Linux desktop then you need to accept that some shit has to change.

      • Re:I have one. (Score:5, Insightful)

        by MobileTatsu-NJG (946591) on Friday August 31, 2012 @09:37PM (#41196831)

        You sound like you're a paying customer or their boss. If said maintainers are volunteers and doing this in their spare time and juggling work and family and just having a life, I think they have an excuse.

        Look, if you're going to pull this 'you get what you pay for' nonsense then you're not allowed to try to convert people over to OSS. You can't have it both ways.

    • by Misagon (1135) on Friday August 31, 2012 @07:23PM (#41196105)

      I think that you are thinking of "API": Application Programming Interface. I don't think Christian Schaller is referring to programming-interface compatibility but to binary compatibility of software packages between Linux distributions.

      Let's say that you have a Fedora RPM for an app, and you wish to run that under Ubuntu.
      While you can convert the raw RPM to DEB format, you cannot auto-convert the binary files within the package.
      The binary programs in the RPM have most likely been configured at compile time in a way that gives them dependencies on libraries that are different on Ubuntu.
      On Windows and Mac OS there is only one distribution each, and therefore they do not have this problem.

      But yes, API compatibility between versions of a library is also a problem.

      • And I think that is also why the Linux Foundation's Desktop specification doesn't come close to cutting the mustard: It defines RPM as the standard file format, but you need a lot more than that to make it a working standard. Mark Shuttleworth used to campaign to get different distros to synchronize on some of the more common library versions (seems like he gave up though).

        I think desktop distros would do us all a service by dropping "Linux" from their monikers and short descriptions. Just take the kernel a

    • by NoNonAlphaCharsHere (2201864) on Friday August 31, 2012 @07:28PM (#41196133)
      You'd probably have a point there if every single Windows app didn't ship with 42 DLLs that only work with/for that particular app, providing a shim between the app and the OS. In contrast, Linux apps are actually expected to interface with shared libraries not directly under the particular app developer's control.
      • Re: (Score:3, Insightful)

        by Tyler Eaves (344284)

        Hard drive space is cheap.
        My time isn't.

        I know which situation has caused me more heartache.

        • Re: (Score:3, Insightful)

          by 0123456 (636235)

          Hard drive space is cheap.
          My time isn't.

          I know which situation has caused me more heartache.

          You mean, finding all seventy five copies of zlib.dll strewn through random directories on your system which have exploitable security holes so you can individually replace them all with a patched version?

          • by Blakey Rat (99501) on Friday August 31, 2012 @08:42PM (#41196545)

            So the solution there is to ship BIG EXPANSIVE libraries with the OS, and keep on top of them so new stuff is supported by those libraries ASAP. You don't have 75 copies of zlib.dll, you have one-- and it's owned and updated by the OS.

            Take Microsoft's .net for example. The library covers pretty much everything you can imagine wanting to do with a computer, and it's constantly updated as new file formats/etc arrive. But since there's only ONE .net, the library is still one holistic thing that can be updated when security problems arise without breaking anything.

            That's not to say that .net is the perfect solution to all problems, but it's definitely worth examining how other vendors solve the problems in Linux.

            For what it's worth, I come from Mac Classic, a platform that never had DLLs in the first place (but did have a huge expansive built-in library). Frankly, I've never been convinced that shared libraries were a good idea, even when HD space was expensive. But that's just me.

    • Casual User Here (Score:5, Insightful)

      by Iskender (1040286) on Friday August 31, 2012 @07:36PM (#41196183)

      As a single-booting but casual Linux user, I don't really know if these libraries are what makes distributing software such a pain, but whatever the reason, something needs to change; the point about software distribution was spot on.

      Package management is nice, but if something isn't available through it I won't install it. Why not? Because:
      * I have to compile it myself. This often results in errors which I can't handle.
      * I have to edit config files. Might be xorg.conf, might be something else. All I know is someone failed to make it work out of the box properly. Things will break.
      * I have to find the application. Yes, that's right: often applications leave no trace after installing, especially when using a manager. They're buried in the complex-just-cause Unixey filesystem. Typing the name into the CLI fails too of course.

      Now all of these problems can be solved, some seemingly trivially. This doesn't matter - the fact that I can edit xorg.conf means I'm probably in the top 3-5% of all computer users as far as Linux goes, meaning it could just as well be impossible for a normal user.

      Users are used to the Windows XP interface and Linux is frequently more like it than Windows 7 is, so the exterior isn't a problem. The ACTUAL usability problem is installing software - it needs to work universally so people can actually do things and therefore be interested in and dependent on the OS.

      • by dbIII (701233)
        Something is changing but it's getting worse. The current problem is the design of gnome3 is bringing DLL hell to linux for anyone that wants to run things based on portions of gnome2. This appears to be by design to kill off what is left of gnome2 (which I think is a stupid reason, but it's still a reason) and it's creating a variety of library problems.
    • It is not that FOSS developers hate ABI compatibility. It is that the value of such compatibility for the projects they consider important (FOSS ones) is very near zero, so why should they do extra work to achieve it?

      Yeah, there is a bias here. Linux developers don't think closed source drivers are important. If you think they are wrong, the burden is on you to convince them.

      About TFA, well, I've not read it yet, but if that is its best argument, it just doesn't fly. The lack of ABI compatibility only impacts drivers dev

      • About TFA, well, I've not read it yet, but if that is its best argument, it just doesn't fly.

        Yep, now that I've RTFA, it agrees with me.

    • Frankly I'm shocked you were modded up, so many of us have been saying that for years and called filthy names and told what idiots we are for not "seeing the genius" of constant breaking by the devs.

      Let's be honest and cut the bullshit, folks: why does Windows rule the desktop? It's simply because there is software that covers every niche, from inventory management to medical billing, electrical supply to salvage yards; there is SOMEBODY out there making software for it, and it runs on Windows.

      So what does that have to do with anything? Simple. When the core of the OS is constantly shifting like sand, keeping those applications running is gonna be costly as hell. For every big software house like Adobe you got 10,000 little shops run by a handful of guys filling one of those teeny tiny niches the big boys don't care about, and they simply don't have the money to constantly "pull an Nvidia" and pay a team of devs to constantly rewrite their stuff so it'll work.

      And the saddest part? you have this HUGE network that would be happy to support you, little guys like me and the other smaller shops paying too much for MSFT licenses, tons of little software houses getting screwed just like we are when it comes to licensing fees, yet when we point out what we need to support you, which frankly isn't all that much, just some real stability and timetables we can work with, what do we get? Insults and told what idiots we are for not seeing how fricking brilliant breaking stuff is.

      So if Linux goes nowhere on the desktop you really only have yourselves to blame, hell MSFT has been treating their customers like crap so long they might as well put a Goatse on the box yet we still buy it because we have no choice because everything is simply in too much flux with Linux. The one or two "solutions" trotted out when we point that out cost several times what Windows does, like RHEL, thus making MSFT the better deal.

      People aren't on Windows because it gives them a fuzzy to see a WinFlag; it's because nobody else will step up and give us a platform where we can have long-term access to our applications without having to constantly futz and study and fix like it's a dying '76 Dodge. Why do you think OSX adoption has been climbing, when they charge so much for older X86 hardware? Because when you get fed up with MSFT's bullshit there isn't anywhere else to go, and that is a damned shame, because it didn't have to be that way; the devs chose to make it so.

    • by Eil (82413) on Friday August 31, 2012 @10:05PM (#41196981) Homepage Journal

      This is one argument I really don't get, and yet the FOSS library maintainers seem to be adamant that they must be able to break their ABIs whenever they want.

      Yes, FOSS library maintainers want to be able to break their ABIs. They do it often. And that's fine. Why? Because we have this thing called versioning. You can write your application against libfoo.so.2, and the author of libfoo can rewrite the thing from ground-up and call it libfoo.so.3. And guess what? Your application works just fine because libfoo.so.2 didn't disappear from the face of the earth. You just install libfoo.so.2 and libfoo.so.3 side-by-side and everybody's happy. This is a primary strength of open source, not a weakness.

    • KDE keeps binary compatibility through all releases with the same major number. And in practice, source compatibility is broken in major ways only once every two major releases (happened between 1 and 2 and between 3 and 4).

      So many articles about "Linux failing on the desktop" should really be "GNOME failed as the Linux desktop". People forget history; they should read the KDE mailing list archives from 1999. The mother of all flamewars is easy to spot by the extreme number of posts.

      Y

  • by future assassin (639396) on Friday August 31, 2012 @07:03PM (#41195939) Homepage

    who think that when they buy something it belongs to them to do with as they wish, there will always be Linux. As it is, it seems that with Windows 8 MS is taking that away, and so is Apple.

    As a non-developer and non-programmer, it seems to me Linux is stronger than ever.

  • by hawguy (1600213) on Friday August 31, 2012 @07:04PM (#41195949)

    At my company, out of 500 computer users, we have around 60% Windows and 40% OSX; Linux users (including me) don't even account for 1% of our desktops (but factor heavily in our servers: we're around 50% Windows, 40% Linux and 10% OSX, the last of which will be moved to Linux before the end of the year). Most of the OSX users are normal business users (finance, IT, etc.), not graphic designers or other users that have traditionally preferred OSX.

    There's little reason for anyone here to run Linux to do their work - Office 2011 runs well on OSX and gives users an Office Suite and Outlook that's compatible with the rest of the corporation. And there's the whole Apple Ecosystem that some people like to be inside of.

    Even though I run Linux, I still do most of my work on a Win 7 virtual machine because some apps just don't run well (or at all) on Linux. I tried Crossover Office/Wine for a while to run Office, but it wasn't worth dealing with the quirks, it runs much better on Windows. Plus, some of our corporate tools and infrastructure management tools run only on Windows (or require MSIE for full functionality). We run a terminal server for OSX users that need to run Windows apps.

    OSX may not have killed Linux, but it sure has kicked it into the corner.

    • WordPerfect was already being used extensively by legal offices. It would not have been a huge jump to get legal offices to switch to Linux running WordPerfect. But after version 8, WordPerfect was not a native Linux port but this convoluted thing that ran through an emulation layer, which was insane. Then, not long after, it died. That was the end of the chance for Linux to advance to the corporate/business desktop.
      I'm sure some other things didn't help as well. I still think one major issue is th

    • by cynyr (703126)

      Does the OSX Office allow VBA? Yes, this is a serious question; the engineering world seems to depend on Excel and VBA to make things go 'round.

    • Basically your argument is that it's all about Microsoft Office? I agree with you; then it has nothing to do with how good or bad GNOME vs. OSX are. The Linux desktop will not happen on any serious scale until the corporate world stops revolving around Office, and there isn't a damn thing we can do about it.

    • by perpenso (1613749) on Friday August 31, 2012 @07:46PM (#41196245)
      No, Mac OS X has not killed desktop Linux. However, it has halted Linux's advance into the desktop market, much as Linux did not kill MS Windows Server but halted the advance of Windows Server into what had been traditional *nix server territory.

      That said ...

      So he argues that Mac OS X has not displaced Linux because its overall market share has only gone from 5 to 7.5%?

      That seems to be an odd conclusion. That growth is nearly twice the entire Linux market share according to his cited numbers. If he wanted to argue that Mac OS X is not displacing Windows, he would have a point. As for Linux, he really offers no evidence.

      Yet the number of Mac laptops seen at Linux-specific conferences, and the number of long-term Linux users confessing they moved to Mac OS X, are so common as to be far more than mere anecdotes.

      The truth is that a bunch of people out there wanted a *nix environment. Workstations were beyond their reach and Linux filled an empty niche by delivering *nix on PC hardware. Many historic Linux users just want an affordable *nix and didn't care about the politics and drama of the FSF and the "free software" movement. So when Mac OS X delivered another affordable *nix implementation that runs side by side with a nice consumer GUI environment that has support from many commercial software publishers they switched. It also helped that the Mac hardware delivers the "holy grail" of running Mac OS X, Windows and Linux. Sure you can emulate but for things like games you are probably better off booting into Windows. Something many Linux users do too.
  • by AugstWest (79042) on Friday August 31, 2012 @07:19PM (#41196063)

    I was a Linux user beginning with Redhat 3. I went through Redhat, Mandrake, Fedora, Gentoo and Ubuntu. I've also used Solaris for a daily workstation.

    Then I was assigned a Mac at a new job (running Tiger), and have never used anything else for a desktop since. I've had no reason to. I still keep an Ubuntu box in the house, but it's a server.

    My name is Anecdotal Evidence, it's true, but whatever. I went Mac, and never looked back.

    • by perpenso (1613749) on Friday August 31, 2012 @07:54PM (#41196295)

      I was a Linux user beginning with Redhat 3. I went through Redhat, Mandrake, Fedora, Gentoo and Ubuntu. I've also used Solaris for a daily workstation.

      Then I was assigned a Mac at a new job (running Tiger), and have never used anything else for a desktop since. I've had no reason to. I still keep an Ubuntu box in the house, but it's a server.

      My name is Anecdotal Evidence, it's true, but whatever. I went Mac, and never looked back.

      Your experience is so common it goes beyond anecdotal. Many Linux users just wanted a *nix environment. They did not care about the FSF, the GPL, the free software movement, etc. They just wanted to run some *nix applications and tools. Linux was originally their only affordable option to workstations back in the day. Mac OS X comes along and they have another affordable *nix option. One that also gives them a consumer oriented desktop and off-the-shelf consumer and business productivity software. Mac OS X basically offers a superset of the software they can run under Linux.

      • This is the reason I'm writing this on a MacBook Pro. I went to mac 10 years ago. That was during their "switcher" campaign. Fact was all the people I know who "switched" went from Linux to Mac, not windows. It was only after the switch to Intel and the release of the iDevices that a lot of my non-tech friends went mac.

        But the main reason why I went to mac was I wanted a *nix that worked. And trying to get Linux to work with laptop hardware back at that time was nearly impossible. Sure it may run, but

      • by scream at the sky (989144) on Saturday September 01, 2012 @12:02AM (#41197669) Homepage

        This is exactly what happened with me.

        I'd been using Debian and its various derivatives since Woody was the unstable distribution, and I had always been happy with it (so I thought)

        Then, in April it was time to buy myself a new laptop, and I bought a 13" MacBook Pro on a whim, knowing that I could install Debian if I wanted to with no issues, but I figured I would try OSX out to see what the deal is.

        5 months later, Debian has been relegated to running in a VMware Fusion instance that takes up 8GB of disk space and gets booted once a month or so, and I am really wishing I had just bought a Mac back in '99 when I first started pissing around with Debian.

      • by Tom (822)

        Your experience is so common it goes beyond anecdotal. Many Linux users just wanted a *nix environment. They did not care about the FSF, the GPL, the free software movement, etc.

        Actually, some of us did care - I know I still say Free Software and not "Open Source", I've done a couple of things for the EFF, talked with the FSF, etc., etc.

        But, I tried out a MacBook Pro one day, fully intending to install Linux on it, and in the end I never did, because I discovered how pleasant working with computers is when everything just works and I can focus on whatever it is I want to get done.

        And that's the part the Linux desktop misses - getting out of my way and letting me get my stuff done. I don

    • by rasmusbr (2186518)

      I think it's pretty well established that Apple's hardware-software combo has been superior to everything else on the market for about five years now, aside from niche markets such as gaming. If Apple released a $300 laptop tomorrow, Microsoft's Windows business would be destroyed within a matter of years.

      Why is Apple not doing that? Well, I'll tell you why. Apple cares about profit, profit margin and market share counted in dollars (and not in users or units shipped). They've probably concluded that they

  • by Shavano (2541114) on Friday August 31, 2012 @07:20PM (#41196071)

    I think the real root of the difference is that Linux serves a different market. Apple Mac OS X is a consumer product pitched for people who want their computers to "just work." Windows is a consumer/business product geared to people who want (and are convinced they need) a high level of support. Linux is not either of those and never will be. It's a system made by and for programmers and other techies who want to be free of the monopolistic practices and have full control of their own machines from top to bottom.

    I think Linux may in fact be close to saturating that market. It may make inroads into the business and consumer user spaces. I think it will and should because businesses shouldn't be using things that are very expensive and promote lock-in when there are good-enough alternatives that meet most of their needs. Corporate customers are very conservative about risk, and they perceive that buying a professionally supported commercial product is a lower-risk option. And they've drunk the Kool-Aid regarding how efficient their office applications are.

    In reality, Windows customers probably pay the steepest price for their OS choice. It requires tons of support in a corporate environment and exposes you to a much higher risk of malware infections and security breaches. Maybe you need Windows on a few of your machines -- those of people who need to establish an appearance of "Corporate" credibility. And maybe you need some Macs for certain applications where the Mac apps give you enough of a productivity improvement to pay for the expensive system. But most of the worker bees can do as well or better on Linux at much less cost. But it will never come with support. Support will be either hire-your-own or contracted separately.

    • by Kjella (173770) on Friday August 31, 2012 @10:12PM (#41197049) Homepage

      Windows is a consumer/business product geared to people who want (and are convinced they need) a high level of support.

      Lolwut? I know there are many businesses that want support because it's their bread and butter, but my home desktop isn't supported by anyone and I think that's pretty common. The reasons are more:

      1) It's what most other people run, meaning most shit has been found by somebody else and fixed. Maybe the driver developers should care that they have crap support for the 1% that's Linux or the 5% that's Mac, but they sure as hell care if 90%+ of their market think they're crap. Of course this is a chicken-and-egg situation; if Linux had 90%+ market share it'd be the one with stellar support, but it isn't. How many laptops still have problems with power management and suspend/resume? How many dare ship a laptop with those functions broken in Windows?
      2) Because most people are on Windows, most software is written for Windows. I'm sure you can try arguing that quality beats quantity, but it doesn't hold up in practice. Most commercial software has people to do all the boring and tedious work and polish that so often is skimped on in the OSS community. Not to mention most OSS is available on Windows; sure, if you want to use GIMP you can, but you can also get Photoshop or whatever else you fancy. The list of Linux-exclusive killer apps is short if not empty.
      3) With lots of users, there's also lots of people that might be able to help you. If people have any kind of installation instructions or guides or tutorials for something, it's likely to be for Windows and possibly Mac. How to do it in Linux? You're on your own. It's not that I can't find out on my own and there's usually something analogous but it's still time spent and if you don't like to play with those details then it's time wasted. If you get any training at work it's likely to be for Windows or Windows applications.

      Using Windows is travelling down the well-worn path; if you're using Linux, you're far more paving your own way. I'd also wager that any person able to manage a Linux box could just as easily have managed a Windows box. Seriously, if you spend any significant time managing your home desktop then you're doing it wrong.

  • Perhaps Linux needs a minimalist leader. Throw everything out. Then step by step, bring back features and see what works, and what doesn't. In the process make sure that everything has a consistent look and feel.

    • Perhaps Linux needs a minimalist leader. Throw everything out. Then step by step, bring back features and see what works, and what doesn't. In the process make sure that everything has a consistent look and feel.

      Eee PC 2G Surf [asus.com], from 2007. The first "netbook".

      It wasn't a huge success, but it panicked Microsoft. For a brief moment, the future of mobile computing was Linux. Windows Vista wouldn't fit on the thing. Microsoft had to re-animate Windows XP to compete.

      (It also had a terrible variant of Linux. I have two of the things. The WiFi code is unreliable, and the "union file system" which makes one read-only and one read-write file system appear to be in the same namespace leaks inodes. The hardware is solid,

      • by otuz (85014)

        Yep, products like that are bad for Linux. They made the general population see Linux as the cheapo toy operating system that doesn't really work and doesn't really have any software.

    • by DesScorp (410532)

      Perhaps Linux needs a minimalist leader. Throw everything out. Then step by step, bring back features and see what works, and what doesn't. In the process make sure that everything has a consistent look and feel.

      Linux on the desktop hasn't happened for one reason, and one reason only: Linux is fractured. There are several desktops, window managers, package systems, even kernels. This isn't the case with OS X or Windows, where you have a single API and standard to develop for. No commercial developer is going to write software for a chameleon operating system with a half dozen desktop packages. The same thing that caused Linux to take off with hobbyists and adapt so well to the server room is the same thing that wil

    • I run a very minimalist Linux setup. Xmonad window manager; no applets, wireless managers, widgets, etc. I use it for coding. It's awesome.

      But I can't see how the majority of users would be happy with it. Hell, my wife doesn't even know how to run a browser on it. Users want to have their phones auto mounted, printers working, scanners working, xbox controllers, nintendo controllers, games and whatever they buy at best buy and plug in, to just work.

      So where do you stop? What's a minimalist distro?

      Use
    • Re:minimalist (Score:5, Insightful)

      by devphaeton (695736) on Friday August 31, 2012 @10:03PM (#41196975)

      Perhaps Linux needs a minimalist leader. Throw everything out. Then step by step, bring back features and see what works, and what doesn't. In the process make sure that everything has a consistent look and feel.

      Believe it or not, that used to be Ubuntu. Back 8 or 10 years ago, there were all these distributions that offered 'choice!' by loading the biggest Gnome or KDE desktop crammed to the gills with EVERY and I mean EVERY app that was available. Stable, beta, working or not. You opened a panel and there were 17 calculators to choose from, 23 IRC clients, about 15 web browsers, 7 different terminal apps... you get the idea. Most of it was half-broken shit.

      The beauty of Ubuntu in the beginning (I thought) was that they cut out all of that. You got a nice, slick graphical installer that installed Debian Unstable (which we'd all known for years was fine for everyday use). You booted up to a nicely themed Gnome desktop with only the best ONE of each type of application installed. They were smart about choosing what apps to include by default, and I felt that their choices resonated very closely with experienced linux users who generally all agreed on the best app for a particular usage. The whole Debian repository was mirrored and available, but you didn't have to dig through a bunch of crap to find the stuff that you most likely would have chosen to install yourself. Configs were all clicky-clicky, but all your fave debian cli tools like aptitude still worked as expected.

      I really thought that Ubuntu was going to become the polished distro that brought Year Of The Linux Desktop(tm) from fantasy to reality. I still think that they had a real chance to pull that off. (At least up until about 8.0, then it started to get weird).

      My $0.02 plus tax.

  • The reasons listed make good sense to me and most could help explain why a comparable or even a better desktop experience could still fail to get adoption, especially in the enterprise.

    But is it really the case that the desktop Linux experience is as polished as Windows or the Mac? Please understand I am not trying to start a flame war; I like all these platforms. I use Windows mostly for my personal desktop use and Linux mostly for my servers.

    I have not spent time recently trying to configure the be

  • by slashmydots (2189826) on Friday August 31, 2012 @07:35PM (#41196179)
    That isn't surprising considering they're polar opposites! I'm not talking about design and function and style, I mean that Apple is all about psychotic levels of control, MONEY MONEY MONEY, and locking everything down into their pretty little walled garden. Linux is exactly, perfectly the opposite. It's designed for anyone to use without some company controlling it or paying a ton of money or not being able to modify it, etc. They aren't even targeting remotely the same market other than "people who don't want to use Windows."
  • was never an office killer. Calc is missing a tonne of stuff people use Excel for (e.g. as a poor man's application database). Writer has several nasty document-eater bugs that haven't been fixed to this day. There's also nothing that competes with Outlook for enterprise-grade messaging. The fact is that stuff is expensive and above all boring to write. Large gov't grants could do it, but good luck getting that done between Microsoft's lobbying and the cries of 'Socialism!'.
    • Evolution is a viable Outlook replacement, if configured right. Trust me, I have experience with eGroupware and Evolution. It works.

    • by dbIII (701233)
      Nice joke. You had me thinking you were serious until you mentioned Outlook.
  • by Zombie Ryushu (803103) on Friday August 31, 2012 @07:56PM (#41196305)

    No more "New Distros". No more new package managers. If you have applications, make meta-packages. What really needs to happen is that DEB and RPM need to talk to each other. Stop making "New Distro that changes everything needlessly again."

    Make applications that solve problems, make meta-packages for large suites of applications, make it so RPM distros can talk to DEB databases and vice versa. Agree on a system. And give the "I'm going to make a new distro where the Wallpaper is blue rather than brown" a big glass of shut-up juice. There needs to be one overlording Linux.

  • by obarthelemy (160321) on Friday August 31, 2012 @08:01PM (#41196329)

    I think the basic issue is that Linux is an OS by nerds, for nerds. Which is fine, as long as they don't pretend they're something else.

    - While using a preinstalled Linux system can be OK (if the system is vanilla, well installed, and you don't want to change anything), installing/admin-ing a Linux system requires the CLI within 10 minutes
    - the code might be good, but the documentation is horrendous. Codenames are fun except when you don't care about them and have to keep a post-it note to remember if Carmic Crap is 8.10 or 9.14; once you know that, you've got to try and find relevant info (man pages are often out of sync and/or a bit unclear; forum posts rarely state which versions they apply to or not...). I think this is both accidental (writing docs is boring and unglamorous) and by design (if only a few people can make head or tail of something, their market value increases)
    - the feature set is chosen to impress your programmer peers, not to seduce/help non-techies.
    - many distros, GUIs... are *released* in what is barely a beta state (early Unity, KDE4...). People howl at MS putting out crap v1s... Linux does worse with v4s...

    Engineers often wonder what the world would be like without marketing people or businessmen. The answer is: Desktop Linux.

  • Why does it have to be someone else's fault? Why's it Mac OS X's fault? Or Microsoft's monopoly? Or even ABI compatibility? Where's the analysis of whether the bulk of average-joe users actually like using Linux desktops?

    Seriously, it's the first explanation that needs to be looked at. Yes, many people love their Linux desktops, and they're very vocal here on slashdot. But is there any Linux desktop that is there today, or has been, that could be loved by the masses?

    I switched from Linux desktops about y

  • by Osgeld (1900440) on Friday August 31, 2012 @10:18PM (#41197085)

    It's just too much of a pain in the ass to deal with on a daily basis, and I have been fighting it for at least a decade.

  • by Tom (822) on Saturday September 01, 2012 @02:47AM (#41198145) Homepage Journal

    Yes, OS X did kill the Linux desktop. But not for the reasons usually mentioned. What it did was take the pressure off that had been driving Linux.
    You see, many of us simply wanted an alternative to Windows, preferably a Unix-like system. There was none after OS/2 died (lots of early Linux fans moved in from OS/2, do you still remember?) and academic alternatives like Oberon went nowhere. So we worked on Linux.

    And then OS X came along and gave us what we wanted and we went there. Not the story of everyone, but one you hear again and again.

    At least two thirds of the Mac fans in my circles used to be Linux, not Windows, users.

  • by Kr1ll1n (579971) on Saturday September 01, 2012 @01:36PM (#41200709)

    I am capable of using all 3, but my preference is OSX. I have many reasons why, which I will detail below:

    In my mind, Desktop Linux suffers from the following:

    1. Distribution maintainers have created a new form of "dependency hell".
    I had a Fedora-KDE VM set up the other day. I found an application that was installed with the default load that I didn't want, and I wanted to remove it. I found, by issuing yum remove "app", that the entire KDE desktop, in one way, shape, or form, listed that particular app as a dependency. Welcome to dependency hell, folks, where removing a lone app can break your entire desktop and force you to use the CLI for the rest of your days.

    2. Lack of library version numbering means installing the app you want could break your OS.
    I want version X of product Y. Product Y requires version Z of libc.so.6. Each version of libc has its own libc.so.6, and by installing version X of product Y, I will break compatibility in my OS and may require a reload, even if that version of the product is considered at least beta, or even stable. The underpinning library structure has failed me and again placed me in a new type of dependency hell.

    3. No unified package management across distributions.
    Let's see, I want CalligraFlow version X. Fedora through yum would install vA, Ubuntu through apt-get would install vB, and Chakra through pacman would install vC.
    Do we really think this is beneficial to have every distro package a different version of the same product?
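    The per-distro point above can be sketched in a few lines of shell. This is purely illustrative: the package name and the version letters are placeholders from the comment, not actual repository contents.

    ```shell
    #!/bin/sh
    # Illustrative sketch: the same "install this app" intent needs a
    # different tool (and may yield a different version) per distro family.
    # "calligra-flow" and vA/vB/vC are placeholders, not real repo data.
    install_hint() {
      case "$1" in
        fedora) echo "yum install calligra-flow      # might give you vA" ;;
        ubuntu) echo "apt-get install calligra-flow  # might give you vB" ;;
        chakra) echo "pacman -S calligra-flow        # might give you vC" ;;
        *)      echo "unknown distro: $1" ;;
      esac
    }

    for distro in fedora ubuntu chakra; do
      install_hint "$distro"
    done
    ```

    Three distros, three commands, three potentially different versions of the same app.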

    Now, let's compare:

    OSX - .dmg or .pkg for applications. Double-click to install a pkg, or double-click to mount a dmg and drag and drop. Done. No dependency hell.
    Windows - download either an MSI or an EXE. Double-click either to install. Click Yes or Next through some menus. Done.

    In both OSX and Windows, the supporting infrastructure for the application to work is not the problem, but in Linux, it almost always is.

    When OSS developers are busy trying to maintain 25 different versions of a single supporting library, there is WAY too much fragmentation.
    When distribution maintainers are building their catalogs, and they differ from every other distribution maintainer, there is too much fragmentation, especially when they make everything a dependency of the core desktop for their distro to support a single meta-package.

    When your options are yum, apt, pacman, etc..etc.. and only a select few packages are the same across all, there is too much fragmentation.
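    For what it's worth, the shared-library mechanism behind complaint 2 can be sketched in shell. The soname convention actually allows two incompatible ABI versions to coexist on disk; the complaint is really that distros usually package only one. The library name libfoo and the version numbers below are invented for the demo.

    ```shell
    #!/bin/sh
    # Sketch of the soname convention: the ABI version is part of the file
    # name, so two incompatible versions *can* coexist side by side.
    # "libfoo" and its versions are made up for illustration.
    demo=$(mktemp -d)
    cd "$demo"

    touch libfoo.so.1.0.3 libfoo.so.2.1.0   # two real library builds
    ln -s libfoo.so.1.0.3 libfoo.so.1       # ABI v1 consumers resolve here
    ln -s libfoo.so.2.1.0 libfoo.so.2       # ABI v2 consumers resolve here

    # An app records the soname ("libfoo.so.1") at link time; the dynamic
    # loader follows the symlink, so a bugfix bump 1.0.3 -> 1.0.4 is
    # transparent, while an ABI break gets a new soname instead of
    # clobbering v1.
    readlink libfoo.so.1
    readlink libfoo.so.2
    ```

    Whether a given distro ships both sonames is a packaging policy decision, which is where the dependency hell described above comes back in.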
