2009: software installation in GNU/Linux is still broken -- and a path to fixing it

GNU/Linux is slowly invading everybody's everyday life. I won't say "The year of the GNU/Linux desktop is here". Been there, done that. But, GNU/Linux is definitely imposing its presence -- think about Android, or the number of people who are currently using GNU/Linux as their main desktop.

And yet, software installation in GNU/Linux is broken. No, not broken... it's terribly broken. Why is that, and what can be done to fix it?

The current story

Most distributions today (including the great Ubuntu) are based on package managers. If you want to install a piece of software, you grab it from one of the official repositories, and your package manager "explodes" it onto your computer's file system: a program will place bits and pieces in /usr/bin, /usr/lib, /etc, and so on. In Ubuntu, for example, you would probably use Synaptic for this. A package manager will normally solve all the "dependency problems" for you. Ah, dependencies... an image viewing program might need, for example, libjpeg to function (libjpeg being a library of functions to open, save, and generally deal with JPEG files). This is a very Unix-ish approach. It works perfectly well for servers, but fails on several levels for clients. Why?
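
As a rough illustration (assuming a Debian-based system where the GNOME image viewer eog happens to be installed -- any package would do), you can see the scattering for yourself:

    # List where one package's files ended up (Debian/Ubuntu):
    dpkg -L eog | head -20     # bits land in /usr/bin, /usr/lib, /usr/share...

    # Ask the package system what the package depends on:
    apt-cache depends eog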

There are several drastic problems with this approach. Here is a long but by no means exhaustive list (which will probably grow as people e-mail me):

  • Users need to have root access in order to install a piece of software; no per-user installation is allowed;
  • It's very tricky to install several versions of the same piece of software. Just think of the poor graphic designer who needs to install several versions of Opera and Firefox;
  • Users are stuck with the piece of software installed system-wide;
  • The software needs to be downloaded from the official repositories. Well, it doesn't need to be, but an average user wants to stay well away from unofficial repositories for technical reasons;
  • In some cases (especially when the user adds repositories or installs packages directly), the dependency-checking mechanism fails and users end up with circular dependencies. They are nasty;
  • A piece of software is bound to a specific distribution, and -- what's worse -- to a specific version of that distribution too. It's not trivial to install OpenOffice 3.1 on Ubuntu 8.10. You can argue that you can install the bunch of .deb packages from OpenOffice's web site. Tell that to your grandmother or your average inexperienced computer user;
  • It's not trivial to "give" a program to a friend. To the end user, giving a program to a friend should be as simple as dragging an icon onto a memory stick; instead, files are scattered all over the system.

It's 2009, and GNU/Linux is still plagued by all of these problems. Even Ubuntu, a distribution I love, is plagued by all of these issues -- and we are talking about a distribution aimed at end users!

What the story should be

The story should be very simple:

  • Users should be able to install software even without being root
  • Users should be able to install different versions of the same software immensely easily
  • Users should be able to run either their own version of the software, or the one installed on the system (if any)
  • It should be possible to download and install software even though it doesn't come from an official repository
  • Software just needs to work -- unchanged -- even if it's a bit old and it's run on a newer distribution
  • It should be possible to "give" a piece of software to a friend by simply dragging an icon onto a memory stick

All this is true with Apple's OS X. They got software installation just right -- although a few programs, lately, seem to come with an ugly installation process.

Where does the problem come from?

Don't get me wrong: I think Ubuntu is a fantastic system, and gets a lot of things right. I think the problem stems from an issue that is more philosophical than anything else.

The issue is at the heart of this article, and deserves to be put in bold.

Every GNU/Linux distribution at the moment (including Ubuntu) confuses system software with end user software, whereas they are two very different beasts which should be treated very, very differently.

I think using dpkg/apt-get or rpm/yum for system-wide software, libraries and so on is the way to go. GNU/Linux's success in the server arena is not a coincidence: a distribution is made up of several independent "bricks" which create a majestic building.

However, using the same philosophy -- and therefore architecture -- for end-user software is just too limiting. My point list above is not "a list of unfortunate drawbacks". It's one of the major reasons why GNU/Linux hasn't achieved mass penetration in the desktop arena.

What bothers me is that while all of the other problems are being solved (vendor support amongst them), this one remains a persistent thorn in every GNU/Linux user's side. A painful one.

Existing material about this problem

A lot of debate -- as well as software -- exists around this issue. In terms of software, you can get a whole distribution -- GoboLinux -- which follows exactly this principle: one directory per program. There is a problem with GoboLinux's approach: it applies the "one directory per thing" approach to everything -- including system libraries and server-side programs. GoboLinux also goes one step further by completely changing the file system layout -- an idea I am strongly against.

In terms of what's been said, there are several discussions about this in Ubuntu and Debian. A good start is the [Rename Top Directory Names](http://brainstorm.ubuntu.com/idea/6243/) idea in Ubuntu. This link has a long list of duplicates. There are also many, many "blueprint" drafts in Ubuntu's launchpad. There are so many, in fact, that you will get lost reading them. A lot of them talk about a simplified directory structure for the system, which as a consequence would imply simplified software installation.

What's wrong with GoboLinux?

I don't think GoboLinux's approach is a winner for two reasons:

  • The Unix file system has been around for a long time -- for good reason. It does work extremely well to keep a system sane and working.

  • It would meet too much resistance in the GNU/Linux community -- for good reason.

However, GoboLinux gave us a practical example that this change can be made. It's actually possible.

Four steps to fix the problem

I can't really fix this problem. It will take a lot of effort, and a lot of courage from major players to even start heading in the right direction.

The first step is to face the truth and admit that there is a problem. This is the aim of this article, which -- I hope -- will resonate within the GNU/Linux community.

The second step is to set out a path which might eventually lead to a solution. This is what I will attempt to do in this article. The solution will be generic and I will try to borrow from as much existing software as possible.

The third step is to improve on the proposed solution; this is tricky, because there needs to be the right balance between too little and too much planning. It also requires somebody to coordinate the discussion -- somebody able to lead everybody towards a full solution. My secret dream is that somebody from Canonical, or from Red Hat, would do this.

The fourth step is implementation. This is the hard part. I am sure that implementing it will reveal problems, limitations -- and more.

My own semi-technical take

Here is my idea. I haven't programmed in C in years; this means that I might make some silly mistakes. However, I am confident I can provide a good starting point.

Here we go.

  • There should be a comprehensive list of libraries, and versions, expected to be available with the operating system. Today, GNU/Linux powers quite a number of desktop installations, and we have a pretty strong idea of what a desktop system is expected to provide. This list should include both GNOME and KDE. This should be a cross-distribution list. To get there, maybe one distribution (Ubuntu?) might write a list, and then others might follow. Every two years or so a new "version" of the super-system might come out with an updated list of libraries and versions. Note that applications should do their best to work on the current version of the list, and on the previous one. This would mean that newer applications would have the potential to work on four year-old systems.

  • There should be a well defined directory tree that contains a whole application. It should include 1) the executable, 2) the icon, 3) the "lib" directory with extra libraries not listed in the point above, and 4) anything else. This directory should be expected to be read-only. The directory could have the extension .apx and have a file called application.xml in it (see the sketch after this list).

  • In case extra libraries are provided, the system should add them to the library path before the system ones. So, if a program needs a specific library that is not listed in the first point, or if for some reason it needs a different version of one of the "stock" libraries, the bundled copy would be found first (the launcher sketch after this list shows one way to do this).

  • GNU/Linux file managers should show those directories and their icons

  • There should be different directories for the different versions of executables and libraries, according to the processor architecture.

  • The operating system should keep track of the applications available (each folder with the extension .apx and with an application.xml file in it could be expected to be an application) and their locations. This can be done with kernel file-notification hooks (such as inotify) that fire when a file is moved, copied, and so on. The system should allow two different copies of the same application in two different directories.

  • The operating system should offer a way to upgrade all the existing applications (since it knows what's on the disk and what version it is).

  • There should be a security system where whoever distributes the application is able to "sign it" -- users should be able to view the signature before running it. This can be extended as far as we want to take it.

  • The distribution should have the option to hide end-user applications completely in its package manager. Yum/apt-get/Synaptic and the like should still be used to keep the system itself up to date -- just not for end-user programs.

  • There should be a "recipe system" like the one available in GoboLinux, where a piece of software is compiled making sure that it works in its own directory. Here, looking at GoboLinux did would be immensely beneficial. Note that providing a working recipe for each piece of software would be a big task, but it would be limited to end-user software.

Whoever manages this system should look closely at what OS X does, because OS X's engineers had the exact same problems to solve -- and solved them successfully.

Conclusions

This article might start a revolution -- or it might just be yet another article complaining about installing software in GNU/Linux.

I have a dream. A dream of a world where people distribute applications as bundled directories, and these bundles work in Ubuntu, Fedora, etc -- and they keep on working when a new version of the operating system is installed. A world where software installation in GNU/Linux is easy and applications can be swapped by simply copying them onto a memory stick.

I wonder if I will ever see this in GNU/Linux?

P.S. Some will say, "if you like the way OS X does things, use OS X". My answer to that is, "I like the way OS X does things, it works, it solves problems, but let's rather be inspired by it and improve on it".

Comments

Comment by jabjoe

What you want is called Zero Install http://0install.net/

It is used in ROX to copy RISC OS's Application Directory.

http://en.wikipedia.org/wiki/Application_Directory

Having been a RiscOS user I can say this isn't a perfect system. Each application folder ends up with everything required to run the application, which sounds good, but isn't.

Like all OSs, RiscOS looked for a dll in the system first, then the local folder. But what if a dll of the same name but a different version was in the system? BANG. So you must name the dlls differently, or load version info and select from that.

So if that's all working, you have multiple apps using different versions of the dll all in memory. This is something I at least don't want to be the common case!

You want dynamic libs shared by many applications. Which yes means a dependency soup. That's why you need a package manager! (WHY DOES WINDOWS STILL NOT HAVE ONE)

Perhaps for special cases, you could use linux custom namespaces to knock something up that meant a user could install and run applications without admin. The applications are run in a namespace with the /usr, /etc, /lib, /bin folders unionfs'd with the user's and system's versions. Or you make the split between user applications and system applications, and user applications run in this custom namespace. But do you want this per user? I guess not. You want all users to be able to install/uninstall/modify these apps. I guess not. Then don't you just end up with admin again?

Joe

Comment by Tony Mobily

Hi,

I had already replied to this comment, but must have forgotten to submit it after previewing it...

-----------------
What you want is called Zero Install http://0install.net/

It is used in Rox to copy RiscOSs Application Directory.

http://en.wikipedia.org/wiki/Application_Directory
-------------------

See my other comment on Zero Install in my other answer.

-------------
So if that's all working, you have multiple apps using different version of the dll all in memory. This is something I at least don't want to be the common case!

You want dynamic libs shared by many applications. Which yes means a dependency soup. That's why you need a package manager! (WHY DOES WINDOWS STILL NOT HAVE ONE)
---------------

Did you actually read my article...?
This is precisely why I propose a set of common, guaranteed libraries that can be expected.
So, a program would only have the extra ones, or the ones that are not "guaranteed".

-----------------
Perhaps for special cases, you could use the linux custom namespaces, you could knock something up that meant a user could install and run applications without admin. The applications are run in a namespace with the /usr,/etc,/lib,/bin folders unionfs with the user's and system's version. Or you make the split between user applications and system applications, and user applications run in this custom name space.
-------------------

This goes against the important principle in my article, "One application per folder".

----------------
But do you want this per user? I guess not. You want all users to be able install/uninstall/modify these apps. I guess not. Then don't you just end up with admin again?
-----------------

I don't get it. What's stopping people from doing that _today_? If you are saying that you don't want users to execute applications _at all_, then that's a different issue -- something I suspect you will get _very little_ support for...

Merc.

Comment by jabjoe

Zero Install I'll cover in the other thread.

I read your article, well more precisely I listened to it as I worked. ;-)

No matter how wide your guaranteed lib list is, people will want to use libs not in it. You want a system where applications that use the same lib use the same file for that lib. Even if you did have a system with every possible lib installed just in case, do you want that?

The custom namespace could work however required; you could make an existing application adhere to "One application per folder", but I admit it's a hack.

Of course I want users to execute applications. I can't see how you thought I meant that. I'll try to be clearer.

You don't want each user to have their own copy of the app -- users will need their own copy of very few apps. So you want a system of shared applications. So who decides what is installed or not? One user could uninstall another's app. So you end up with permissions.

Comment by Tony Mobily

Hi,

> Zero Install I'll cover in the other thread.

OK

> I read your article, well more precisely I listened to it as I worked. ;-)

:-D

--------------------------
No matter how wide your guaranteed lib list is, people will want to use libs not in it. You want a system where applications that use the same lib use the same file for that lib. Even if you did have a system with every possible lib installed just in case, do you want that?
----------------------------

I wrote this in the article... you write a comprehensive list of libraries that you use as a "base" system, and then if a program requires more libraries, it would have them in its own lib directory. If those extra libraries are installed with the system already, the system-wide ones will be found beforehand.

------------
You don't want each user to have their own copy of the app -- users will need their own copy of very few apps. So you want a system of shared applications. So who decides what is installed or not? One user could uninstall another's app. So you end up with permissions.
------------

...? Just have a shared "application" folder system-wide, available to all users. Only users with administration access will be able to copy apps in that folder.

This really is the least of the problems.

Merc.

Comment by jabjoe

------------
I wrote this in the article... you write a comprehensive list of libraries that you use as a "base" system, and then if a program requires more libraries, it would have them in its own lib directory. If those extra libraries are installed with the system already, the system-wide ones will be found beforehand.
------------

That's standard, search PATH then locally.

One thing that helps is to make the dll interface version part of the name. So your app links to lib_123.o, and maybe later lib_123.o becomes a wrapper of lib_124.o for old apps. Of course, that doesn't 100% work, because sometimes with other implementations bugs show up that were hidden before, but in the open source world this is easily fixed. In the closed world it isn't, so in Windows you have the manifest enforcing.
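
A minimal sketch of that versioned-naming convention, using the GNU toolchain's soname mechanism (the library names here are made up):

    # Build a shared library whose soname carries the interface version:
    gcc -shared -fPIC -Wl,-soname,libfoo.so.1 -o libfoo.so.1.0 foo.c
    ln -s libfoo.so.1.0 libfoo.so.1   # runtime name the dynamic loader resolves
    ln -s libfoo.so.1 libfoo.so       # build-time name used by "gcc -lfoo"
    # A later, incompatible libfoo.so.2 can live alongside libfoo.so.1
    # without breaking programs linked against the old interface.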

With local-only DLLs, you end up in DLL hell, which is another reason MS introduced the manifest, which solves one problem but gives you another -- for instance, "hey, I actually wanted it to use the new DLL, it fixes the bug!". Local libs you want to be the exception, not the rule.

Apps can be written like you want, in a single folder, and some are, at least in ported versions. But in general it's better if the libs are system-wide and brought down only when needed. Which is why they tend to do that.

I don't feel you're pushing for something new; it seems like what existed on RiscOS and can exist now (on any OS, bar being able to "execute" these application folders) but isn't normally done, as it's considered bad practice because it doesn't scale.

----------
...? Just have a shared "application" folder system-wide, available to all users. Only users with administration access will be able to copy apps in that folder.

This really is the least of the problems.
---------

How is that different from now in the area of administration?

Seems like what would make you happy is just a script that makes a folder containing everything an app needs (sucks it from a deb file?), and creates a namespace to run the app in, to make it think it's installed normally. I'm sure there are things that make this kind of portable app already. But you wouldn't want the whole system done like that; you only want it when you want to run something that either doesn't play nice with the rest of the system, or you don't have admin.

Comment by Terry Hancock

------------
You don't want each user to have their own copy of the app -- users will need their own copy of very few apps. So you want a system of shared applications. So who decides what is installed or not? One user could uninstall another's app. So you end up with permissions.
------------

At first I thought I agreed with you on this, but then I started thinking about how programming languages like Python handle the problem, which is reference-counting. Then I remembered that my current back up system does precisely this using hardlinks, which is a common element of nearly all Unix/Linux filesystems.

One could easily imagine an installation system which shows each user an independent /usr drive (or /usr/lib or /usr/share/lib or whatever is needed), but without taking up (much) more space when the same shared object libraries are installed -- instead, the libraries get hardlinked to the same data. So long as any one user is using a given library, the library is stored on the system, but when the last user abandons it, it automatically goes back into the free space pool. It's a simple reference-counting system, which is how standard Unix/Linux filesystems work normally, so it doesn't require new engineering on that level.
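
A quick way to see that reference-counting behaviour in action (the file names are made up):

    # Hard links are extra names for the same inode; the data is freed
    # only when the link count drops to zero.
    echo "shared library bytes" > libfoo.so.1
    ln libfoo.so.1 second-user-libfoo.so.1   # a second name for the same data
    stat -c %h libfoo.so.1                   # link count is now 2
    rm libfoo.so.1                           # the data survives via the other name
    stat -c %h second-user-libfoo.so.1       # link count back to 1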

Of course, this won't satisfy the "one app/one file" requirement that Tony suggests (I'm not too sure about the sensibility of that requirement, though I do appreciate the convenience it would provide).

But it does provide a means of supporting multiple library version installations on the same computer.

What would be even nicer is if individual applications could pick and choose which library to bind to. This may be harder, though, if I understand how library-binding works.

As regards the one-file idea, might there be a way to trigger (or suggest) installation of necessary libs from library loading errors? One could imagine an installation system that read those logs to look for failures from applications which are not within the package-management system. Maybe some kind of helper application or plugin for some user-friendly package manager like Synaptic?

Comment by morris

Oh yeah! Let's strip the source code shipping requirement from the GPL too, it's too much of a hassle to read all that code. Who needs to be creative?! Even better, who thinks they can be creative?

I also have a few improvements for this article:

* Users can not have root access, unless they ARE root; per-user installation is allowed in $HOME
* SOMETIMES, with SOME distros, it's very tricky to install several versions of SOME _PACKAGES_ (while the same is not true for the software itself).
* Users are not stuck with software installed system-wide, that's why I have ~/bin/ and even ~/local/bin/, doing so is as simple as exporting $PATH;
* Supported _packages_ usually are downloaded from the official repositories. Pre-compiled binaries for major distros USUALLY can be found on the project's website, testing usually is made using a public repository, and running `make` at $HOME;
* IN SOME CASES, especially when the user adds repositories or installs packages directly without relying on the package manager, the package manager itself is not capable of dependency-checking and users themselves end up with circular dependencies. Nasty Users!
* A PACKAGE (again, not true for the software itself) is bound to a specific distribution, which assures it works on that same distro, and SOMETIMES to a specific version of that distribution, which usually means easier support.
* It's nontrivial to _give a package_ to a friend, just because giving its name is easier, and even a simple tarball SOMETIMES is too (and it's possible on every single distro);

I also recommend this article (http://mdzlog.alcor.net/2009/06/20/white-boy-test/) for updates; you are still stuck with the _grandma_ thing, that's old!

Comment by Tony Mobily

Hi,

-----------
Oh yeah! Let's strip the source code shipping requirement from the GPL too, it's too much of a hassle to read all that code. Who needs to be creative?! Even better, who thinks they can be creative?
-------------

You seem to be implying that with *current* package managers people get the source code when they install binaries... I think your point here is irrelevant.

> I also have a few improvements for this article:

I am all ears!

--------------
* Users can not have root access, unless they ARE root; per-user installation is allowed in $HOME
--------------

This would make system-wide configuration hard. I don't think it's a good idea.

-----------------
* SOMETIMES, with SOME distros, it's very tricky to install several versions of SOME _PACKAGES_ (while the same is not true for the software itself).
------------------

I am not following you here.

----------------------------
* Users are not stuck with software installed system-wide, that's why I have ~/bin/ and even ~/local/bin/, doing so is as simple as exporting $PATH;
-----------------------------

Common users don't know -- and shouldn't know -- what "exporting $PATH" means. Plus, I very very very much doubt people manage to install Firefox, OpenOffice, Pidgin in $HOME/bin.

------------------
* Supported _packages_ usually are downloaded from the official repositories. Pre-compiled binaries for major distros USUALLY can be found on the project's website, testing usually is made using a public repository, and running `make` at $HOME;
-------------------

I probably wasn't clear enough in the article... I was aiming at common users. At the moment some programs make binaries available, but they have to fight with rpms, debs, and then again different distros, and different versions of the same distro. That's just too hard. The article is about making this simplified.

----------------------------------
* IN SOME CASES, especially when the user adds repositories or installs packages directly without relying on the package manager, the package manager itself is not capable of dependency-checking and users themselves end up with circular dependencies. Nasty Users!
-----------------------------------

I assume you are being sarcastic. Note that users are sometimes _forced_ to add repositories and install extra packages. Try installing OpenOffice 3.x on Ubuntu 8.10...

-----------------------
* A PACKAGE (again, not true for the software itself) is bound to a specific distribution, which assures it works on that same distro, and SOMETIMES to a specific version of that distribution, which usually means easier support.
-----------------------

I assume you missed the part of the article where I specify that there should be a list of packages _cross_ distribution...

----------------------------
* It's nontrivial to _give a package_ to a friend, just because giving its name is easier, and even a simple tarball SOMETIMES is too (and it's possible on every single distro);
-----------------------------

If by tarball you are referring to the source, then we are talking about two different audiences.

------------------
I also recommend this article (http://mdzlog.alcor.net/2009/06/20/white-boy-test/) for updates; you are still stuck with the _grandma_ thing, that's old!
-------------------

I won't comment on that article.

Thank you!

Merc.

Comment by imthefrizzlefry

Concerning: "* Users can not have root access, unless they ARE root; per-user installation is allowed in $HOME"

You replied: "This would make system-wide configuration hard. I don't think it's a good idea."

----------------

It sounds like you are implying users should be capable of making system wide changes. I think that Windows has demonstrated time and time again that this is a bad idea. It allows for the very typical situation where a user is capable of installing a virus or other application that can make negative system wide changes.

I admit that nag screens are annoying, however I think the Linux solution of giving some users permission to use the sudo command with the user's password to elevate a process is the best solution. It is an improvement over the solution in Windows Vista where the user is asked to confirm each individual change.

I feel the answer to your original point of individual users needing to have multiple versions of the same software installed, is best dealt with by having user copies in the home directory, or having the super user account make a system-wide change.

Comment by bkor

Time to cancel my FSM 'subscription'. From a magazine I expect research to be done, not articles like this... there are enough blogs to ignore.

Comment by Tony Mobily

Hi,

I can't comment on your desire not to read Free Software Magazine -- that's your choice.

Well, I've been a GNU/Linux user for some 15 years, have been a programmer and have been installing GNU/Linux on clients' machines for *years*.

I can only guess a lot of people out here haven't seen common users struggle with Ubuntu (one of the most user friendly distributions) as much as I have.

Regards,

Merc.

Comment by mrak018

It doesn't broken. It is your brain that is broken, LameDuck. Didn't Linux Hater guy tell you enough to understand why you're wrong?

You simply want software installation in Mac OS X way, while it is done in Linux way. Both ways have their advantages and disadvantages. Both are fine. Why don't you just build a Mac-like software distribution system for Linux, if you need one? It is easy. Just go and make it.

Comment by Tony Mobily

> It doesn't broken. It is your brain that is broken, LameDuck.

You are being offensive. I have to ask you to please avoid attacking people like this on our web site. I don't mind being attacked myself -- my skin is definitely thick enough. But, please avoid it if at all possible.

> Didn't Linux Hater guy tell you enough to understand why you're wrong?

I am not very familiar with the Linux Hater guy. However, I used my experience as a programmer, user and advocate as the basis for my opinions and suggestions.

--------------
You simply want software installation in Mac OS X way, while it is done in Linux way.
--------------

That's correct.

> Both ways have their advantages and disadvantages.

My (humble) point is that the disadvantages GNU/Linux is currently suffering are too many -- and that the disadvantages OS X has are possible to overcome.

> Both are fine.

I disagree on this one.

--------------
Why don't you just build a Mac-like software distribution system for Linux, if you need one? It is easy. Just go and make it.
---------------

I have built a lot of software in my life -- I maintain 50000 lines of code in Drupal (User Karma, Drigg, Friend List, Extra Voting Forms) and can tell you, with an open heart, that something like this is *anything but* easy. It requires interaction between distros, and some technical trickery.

Please don't be offensive in your comments.

Bye!

Merc.

Comment by Lopo Lencastre de Almeida

«Users should be able to install different versions of the same software immensely easily»

I agree with Jabjoe: this could be really messy...

... but on the other hand I do have Swiftweasel and Firefox both running on the same machine, and I had two different Firefox versions also running on the same machine and using parts of the same configuration, etc. It is tricky and implies tackling the configuration files yourself, but it is doable.

And, usually, COMMON USERS should not have more than one stable and updated version of the software.

«Users should be able to install software even without being root»

I don't know your Ubuntu but in mine there is a sudoers file and I don't have to be root... just need to have part of its rights.

Comment by Tony Mobily

Hi,

----------------
... but on the other hand I do have Swiftweasel and Firefox both running on the same machine, and I had two different Firefox versions also running on the same machine and using parts of the same configuration, etc. It is tricky and implies tackling the configuration files yourself, but it is doable.
------------------

Most users would find this impossible...

------------------
And, usually, COMMON USERS should not have more than one stable and updated version of the software.
------------------

...? There are some cases where different versions of the same software are important. A graphic designer will want several versions of Firefox.

-----------------------------
I don't know your Ubuntu but in mine there is a sudoers file and I don't have to be root... just need to have part of its rights.
----------------------------

Becoming root *is* like being root. I was talking about installing software in the users' home directories...

Merc.

Comment by Lopo Lencastre de Almeida

«However, using the same philosophy — and therefore architecture — for end-user software is just too limiting. My point list above is not “a list of unfortunate drawbacks”. It’s one of the major reasons why GNU/Linux hasn’t achieved mass penetration in the desktop arena.»

Sorry, Tony, to disagree with this.

The major reasons are:

  1. Lack of support for the most popular games running real well
  2. Lack of a real promotion of an easy development language for common apps
  3. Lack of lobbying (with money/goods/other rolling) with some governments to push it to schools

On 1. I have several kids around here who won't use Linux due to its lack of capability to play the games they want without too much fuss. I know that there is Wine and I know that there is Cedega (which costs €25 per half year)... the first is a pain to use to make some games barely playable, and the second costs money... and they don't want to pay for the game and for something that they wouldn't need if playing on MS Windows.

On 2. I can say from my experience -- because I'm that old -- that the real boost MS Windows had was thanks to Visual Basic 2, and mostly thanks to Visual Basic 3, which allowed the conversion of thousands of MS Access / Clipper apps into real graphical software apps and allowed developers with little experience in developing real graphic interfaces to port their programs to MS Windows. Visual C came later, to fight Borland C, and Borland Delphi was the last RAD tool I used before moving away from Microsoft. You have several around Linux but, in my view, only Gambas could win hearts, and that is not the one you see promoted. Canonical Ubuntu pushes Python, Novell SuSE pushes Mono and so on... but none of those are real graphical RAD tools.

On 3. we can see what happens in most countries where Microsoft wants to be the one (http://ur1.ca/6591), and that happens not only in developing countries.

Those are the most important. The rest, I think, although important, are cosmetic for the common user. My mother and my nephews don't really care where the software is really installed.

Comment by Tony Mobily

-----------------
«However, using the same philosophy — and therefore architecture — for end-user software is just too limiting. My point list above is not “a list of unfortunate drawbacks”. It’s one of the major reasons why GNU/Linux hasn’t achieved mass penetration in the desktop arena.»

Sorry, Tony, to disagree with this.
--------------------

No problem at all :-D

> The major reasons are:
> 1. Lack of support for the most popular games running real well

OK. So, you are a game developer and want to release it. Do you release it as a deb package? Or RPM? For Ubuntu 8.04? Or Ubuntu 8.10? Or maybe Fedora 9? Fedora 10? Or Debian? Or...?

> 2. Lack of a real promotion of an easy development language for common apps

I don't see this as a major problem, but I might well be wrong...

> 3. Lack of lobbying (with money/goods/other rolling) with some governments to push it to schools

I agree!

---------------------
Those are the most important. The rest, I think, although important, are cosmetic for the common user. My mother and my nephews don't really care where the software is really installed.
----------------------

I think users have a sense of being highly empowered when applications are containers and they decide where they live, or who they give them to. That, plus all of the PROs I listed in the article :-D

Merc.

Comment by jabjoe

>> 1. Lack of support for the most popular games running real well

> OK. So, you are a game developer and want to release it. Do you release it as a deb package? Or RPM? For Ubuntu 8.04? Or Ubuntu 8.10? Or maybe Fedora 9? Fedora 10? Or Debian? Or...?

Look at Nexuiz and see. You can download a deb file, probably an rpm, but also a zip file containing an executable for each platform.

This is not the problem with game development on Linux, it's the market, or lack of it. If games are a big driver for a platform, it's a chicken and egg problem.

Comment by Tony Mobily

Hi,

My feeling, from practical feedback given by software makers and from my experience, is that a lack of an easy, multi-distro, standard way of distributing software discourages software makers from doing so.

Merc.

Comment by jabjoe

Developing a game requires fewer libs than an application. You could maybe get away with just OpenGL and SDL.

Id's engines have been used as the basis of many games, and they have always been cross platform. There is even Quake for the RiscPC! There have been many possibilities for many games to be easily developed on or ported to Linux. But they are not, and this is because of the lack of a market. As someone who works in the game industry, I can tell you PC games generally aren't that profitable, and that's Windows games!

Nexuiz is a really good example that it's not the development and distribution method that is the problem.

Comment by Tony Mobily

Hi,

Received by email, reproduced with permission of Ian:

> The story should be very simple:
>
> * Users should be able to install software even without being root

Should they be free to install viruses as well ;-)

> * Users should be able to install different versions of the same software immensely easily
> * Users should be able to run either their own version of the software, or the one installed on the system (if any)
> * It should be possible to download and install software even though it doesn't come from an official repository
> * Software just needs to work -- unchanged -- even if it's a bit old and it's run on a newer distribution
> * It should be possible to "give" a piece of software to a friend by simply dragging an icon onto a memory stick

Acorn had this well supported using application directories in RISC OS. Drag !app to any position in the Filesystem and it knew where it was and where libraries etc were that it might need. Each directory had scope for a custom script, application icon and program + supporting resources. Shared libraries and system resources used more generally in !System !fonts or other shared repositories. Delete the applications directory and the app is uninstalled. Big downside is that the app could be a virus or contain a virus.

> However, using the same philosophy -- and therefore architecture --
> for end-user software is just too limiting. My point list above is not
> "a list of unfortunate drawbacks". It's one of the major reasons why
> GNU/Linux hasn't achieved mass penetration in the desktop arena.

Not really, the main reason is that it hasn't hundreds of billions of marketing budget backing it and until recently key applications were not available on the Linux platform, most of the user training is on the Windows GUI etc. I think that is still an issue but getting easier. For most end users having an on-line add/remove as with Ubuntu is a major improvement over messing about with CDs and DVDs (not to mention malware). How many PC apps can I simply copy to a USB stick and distribute without installshield and worrying about which DLLs are or are not present etc? Not many in my experience.

> There are also many, many "blueprint" drafts in Ubuntu's launchpad.
> There are so many in fact that you will get lost reading them. A lot
> of them talk about a simplified directory structure for the system,
> which as a _consequence_ would imply simplified software installation.

Simplifying the directory structure and making it more obvious and consistent what goes where is a good idea but I don't think it is primarily about end users installing software. More making it easier for techs and system support managers to transition from Windows and manage systems quickly and effectively.

> This would mean that newer
> applications would have the potential to work on four year-old
> systems.

Sounds reasonable especially as systems mature.

> The directory could have the extension `.apx` and have a file called
> `application.xml` in it.

Sounds very like the Acorn Risc OS approach.

> I have a dream. A dream of a world where people distribute
> applications as bundled directories,

If you were in a UK school in the early 90s you probably had this on a Risc PC or A3000 :-)

> I wonder if I will ever see this in GNU/Linux?

I still think virus proliferation would be a significant problem to solve with such a system. However, rationalising the architecture of the file system is a worthy goal irrespective of its relevance to installing applications.

Ian

Comment by Tony Mobily

>> * Users should be able to install software even without being root
>
> Should they be free to install viruses as well ;-)

"Digital signatures".
Users should have the freedom to only install applications digitally signed by a trusted source (the distro) -- or not.

And... let's not get into what users end up doing when a piece of software isn't in the main repository -- they end up installing untrusted software as _root_...

> Acorn had this well supported using application directories in RISC OS.
> Drag !app to any position in the Filesystem and it knew where it was and
> where libraries etc were that it might need. Each directory had scope
> for a custom script, application icon and program + supporting
> resources. Shared libraries and system resources used more generally
> in !System !fonts or other shared repositories. Delete the applications
> directory and the app is uninstalled. Big downside is that the app could
> be a virus or contain a virus.

See above about viruses...

>> However, using the same philosophy -- and therefore architecture --
>> for end-user software is just too limiting. My point list above is not
>> "a list of unfortunate drawbacks". It's one of the major reasons why
>> GNU/Linux hasn't achieved mass penetration in the desktop arena.
>
> Not really, the main reason is that it hasn't hundreds of billions of
> marketing budget backing it and until recently key applications were not
> available on the Linux platform, most of the user training is on the
> Windows GUI etc.

I disagree on this one... but hey.

> I think that is still an issue but getting easier. For
> most end users having an on-line add/remove as with Ubuntu is a major
> improvement over messing about with CDs and DVDs (not to mention
> malware). How many PC apps can I simply copy to a USB stick and
> distribute without installshield and worrying about which DLLs are or
> are not present etc? Not many in my experience.

None in my experience -- I was referring to Mac, not Windows...

>> In terms of what's been said, there are several discussions about this
>> in Ubuntu and Debian. A good start is the [Rename Top Directory
>> Names](http://brainstorm.ubuntu.com/idea/6243/) in Ubuntu. This link
>> has a _long_ list of duplicates.
>> There are also many, many "blueprint" drafts in Ubuntu's launchpad.
>> There are so many in fact that you will get lost reading them. A lot
>> of them talk about a simplified directory structure for the system,
>> which as a _consequence_ would imply simplified software installation.
>
> Simplifying the directory structure and making it more obvious and
> consistent what goes where is a good idea but I don't think it is
> primarily about end users installing software. More making it easier for
> techs and system support managers to transition from Windows and manage
> systems quickly and effectively.

I wrote this from an OS X perspective... rather than a Windows one.
I think the directory structure can stay as it is, because it shouldn't even matter...

[...]
>> The directory could have the extension `.apx` and have a file called
>> `application.xml` in it.
>
> Sounds very like the Acorn Risc OS approach.

I wish I had seen it...

>> I have a dream. A dream of a world where people distribute
>> applications as bundled directories,
>
> If you were in a UK school in the early 90s you probably had this on a
> Risc PC or A3000 :-)

AAAHHHHHHHH :-D

>> I wonder if I will ever see this in GNU/Linux?
>
> I still think virus proliferation would be a significant problem to
> solve with such a system. However, rationalising the architecture of the
> file system is a worthy goal irrespective of its relevance to installing
> applications.

That's not quite my point of view... but it's a point of view :-D

Merc.

Comment by Tony Mobily

(Received by email from Brian and republished with his permission)

Interesting article.

Your description of "a new list every two years" is effectively what every Linux vendor already does with long-term support releases. You might want to also consider Gentoo's notion of "slots", which allows multiple versions of libraries and compilers to exist gracefully next to each other for use by other software.

Your dream is not hard technically. It requires either statically linking the program, which results in a very large binary, or distributing the required libraries with the program (with the correct relative path to get to them), which results in a lot of duplicated files.
One solution to the duplication problem is to use a deduplication routine to automagically hard-link duplicate library files. This then adds the complication that a file can't be deleted while *any* (hard or symbolic) link to it still exists.

For user-space applications, some methods like this might make sense.
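
One existing mechanism along the lines Brian describes -- shipping the libraries at a relative path next to the program -- is the GNU linker's $ORIGIN token. A sketch, with made-up file names:

    # Link the program so that, at run time, it searches a lib/ directory
    # relative to wherever the executable itself is located:
    gcc -o myapp main.c -Llib -lfoo -Wl,-rpath,'$ORIGIN/lib'
    # The binary and its lib/ directory can then be moved around together
    # (e.g. copied onto a memory stick) and still find their libraries.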

Comment by edeion

As for me, I found your article interesting and promising.

I guess system wide install of what you call "user software" has also some advantages such as:
- decreasing disk space loss
- factoring the updates
- making generic software available by default (which is still better for hopeless users)

And this is not true only for huge servers: even on a family computer, more technical users may be asked to manage software updates (and so) for the whole family.

However, you're right: it's quite frustrating to have pain installing (say) a plain emacs when you are not root. And this happens especially on servers. Some software pieces could be installed per user even with a package manager.

I would have liked to provide a more in depth comment but I don't manage to concentrate.

Comment by Tony Mobily

Hi,

> As for me, I found your article interesting and promising.

Thanks :-D

--------------
I guess system wide install of what you call "user software" has also some advantages such as:
- decreasing disk space loss
---------------

Probably true. But... we are talking about very negligible space here.

> - factoring the updates

See one of the points above -- the system knows where each application is, and can therefore do updates.

> - making generic software available by default (which is still better for hopeless users)

Not following you.

-------------------
And this is not true only for huge servers: even on a family computer, more technical users may be asked to manage software updates (and so) for the whole family.
--------------------

That's when you get the applications in a "system-wide" folder.

----------------------
However, you're right: it's quite frustrating to have pain installing (say) a plain emacs when you are not root. And this happens especially on servers. Some software pieces could be installed per user even with a package manager.
-------------------

Yep.

--------------
I would have liked to provide a more in depth comment but I don't manage to concentrate.
---------------

That's quite alright :-D

Merc.

Comment by NilsR

Interesting post, too bad you felt you had to go negative (to be heard?) There are always ways to improve any kind of software, but the flip side of that coin is that everything is forever more or less "broken".

What I think would be a cool solution to at least some of your issues, would be to add a virtual layer on top of the regular system. The bottom interface of this layer would relate to the file-system as it is. The top interface would show the application only what it needs, a virtual copy of the file-system, made exclusively for that app. No need to actually have more than one copy of (each version of) the libraries, the virtual layer would take care of serving each app with the library version it is built with.

I suspect this is done already in many situations, Wine using "Bottles" and chroot jails are probably worth looking into for lessons learned. I'm sure others could find many more examples. The hardest part would most likely be to automate it and make it as easy, user-friendly and stable as possible.

Comment by Tony Mobily

> Interesting post, too bad you felt you had to go negative (to be heard?)

I didn't think it was negative... I wasn't "emotional" or anything. I just listed all of the drawbacks of the current installation methods and explained why I thought they were show-stoppers...

----------------------
There are always ways to improve any kind of software, but the flip side of that coin is that everything is forever more or less "broken".
---------------------

Yep. And then somebody says "it's broken!" :-D

-----------------------
What I think would be a cool solution to at least some of your issues, would be to add a virtual layer on top of the regular system. The bottom interface of this layer would relate to the file-system as it is. The top interface would show the application only what it needs, a virtual copy of the file-system, made exclusively for that app. No need to actually have more than one copy of (each version of) the libraries, the virtual layer would take care of serving each app with the library version it is built with.
----------------------------

This is an interesting idea. But... why would you go that far? What clear advantages would there be, compared to adding the directory's own "lib" directory to the LD_LIBRARY_PATH var?

Merc.

Comment by code.guru

This is a great article, and controversial too :P

I totally agree with Tony on this one, Linux is waaaaaay too hard for people like my mom. If we were to make it like 30% easier, I bet people would flock to it in truckloads :D

Everything Tony said was correct and great, as a programmer myself, I thought I might add these suggestions:

1. Make installing programs like Windows. You download an .EXE, run it, click Next a few times and you're done
2. Rewrite some of the Linux Kernel in C++ to make it more OOP; or make later modifications in that Language
3. Make the kernel have a well structured API, that way developers can worry less about all that system stuff; if they want to make a new dir, they should be allowed to call a makedir(); function, right?

Hope you liked my suggestions
~ mike

----
Please Visit My Blog:
http://shallweprogram.blogspot.com/

Comment by Tony Mobily

> This is a great article, and controversial too :P

Yep... it was a bit.

-------------------
I totally agree with Tony on this one, Linux is waaaaaay too hard for people like my mom. If we were to make it like 30% easier, I bet people would flock to it in truckloads :D
--------------------

We can only hope!

-------------
Everything Tony said was correct and great, as a programmer myself, I thought I might add these suggestions:
-----------------

----------
1. Make installing programs like Windows. You download an .EXE, run it, click Next a few times and you're done
-----------

Well this is actually exactly what my piece was against... software installation should be a matter of copying the application over!

---------------------
2. Rewrite some of the Linux Kernel in C++ to make it more OOP; or make later modifications in that Language
----------------------

Have you seen the Linux kernel's code...?
I don't think this would be possible, to be honest.

--------------------
3. Make the kernel have a well structured API, that way developers can worry less about all that system stuff; if they want to make a new dir, they should be allowed to call a makedir(); function, right?
------------------------

Lost you here... "mkdir" *is* a current Linux system call -- in fact, it's part of the POSIX standard.

Merc.

Comment by northofnowhere

You are a brave man. Every time this subject is broached, the author can expect a large number of responses of the "it works for me, so why change" variety. (And those are the nice ones!)

I agree entirely with your analysis - software installation is broken because it's stuck in the Unix server timewarp, and this "little" problem is indeed the major reason for Linux's anaemic adoption rate. The effects permeate throughout the software creation process, marketing (or lack thereof) of Linux systems, and end user support. It's what condemns Linux to a microscopic share.

Your proposed steps to solution are a good start. I particularly agree that there must be a recognized cross-distribution system base, with regular updates, and applications added on top in their own directories.

Gobo is not the only effort in this direction. One that's taken an approach nearer to your ideal is PC-BSD, which consists of self-sufficient applications installed on top of a FreeBSD base, and allows most of the things you call for, eg multiple versions of the same application.

I also agree that the only way something like this is going to get any traction is if there is a major sponsor. But is there a company/organization brave enough to get behind something different, when what we've tried up to now hasn't worked? In some ways, the free software community is very conservative.

Tony Mobily's picture

Hi,

---------------
You are a brave man. Every time this subject is broached, the author can expect a large number of responses of the "it works for me, so why change" variety. (And those are the nice ones!)
---------------

Yeah... we have a variety of those in this very article :-D
But, thank you so much for the PC-BSD link! You are right, they are very close to doing exactly what I proposed.

There are things I don't quite agree with (for example, the need to have an "install" procedure...) but it's really remarkable!

Merc.

jabjoe's picture
Submitted by jabjoe on

>> There should be a comprehensive list of libraries, and versions, expected to be available with the operating system.

Isn't this what the Linux Standard Base is about?
You develop to that. Yes, you might need to package it in a few different repository types (deb and rpm), but competition is good: you wouldn't want one repository type to rule them all, or you've missed the point of free software's competing ecosystem of interchangeable parts.

Since you don't mention "Zero Install", "Super OS" (with its Portable Applications) or the Linux Standard Base, I'm wondering if you did your research, or if this is just a rant.

I don't want the system you advocate, because I feel shared components are very important for a number of reasons, and that always leads to where we are now. I don't want any Tom, Dick or Harry to be able to install and uninstall applications; that leads to restricted rights, which leads to admin, which leads to where we are now. It is currently possible to install multiple versions of the same software: Firefox 3 and 3.5 are both in the Ubuntu repository and work on the same machine just fine. Yes, sometimes, outside the repository, setting up multiple versions on the same machine is "advanced", but that's OK, because it's not the common case; when it is, as with Firefox, both versions are made to work together and put in a repository -- which is where we are now.

What we have now has evolved to where it is and, for my money, is working rather well. But by all means, install "Super OS" or a distro with "Zero Install" instead of package management; from what you are saying you want, I think these might fit you.

Tony Mobily's picture

Hi,

>> There should be a comprehensive list of libraries, and versions, expected to be available with the operating system.

>Isn't this what the Linux Standard Base is about?

No it's not. From http://www.ibm.com/developerworks/library/l-lsb.html :

In general, applications have the potential to become LSB compliant if they can limit themselves to using only these 14 system libraries: libc, libdl, libm, libutil, libcrypt, libz, libpthread, libncurses, libX11, libXext, LibXt, libICE, libSM, and libGL.

See the word *comprehensive*.

> You develop to that.

You are a _little_ restricted. The list above is not comprehensive enough.

> Yes, you might need to package it in a few different repository types (deb and rpm), but competition is good: you wouldn't want one repository type to rule them all, or you've missed the point of free software's competing ecosystem of interchangeable parts.

You could write a much more comprehensive list of libraries (which would include Gnome and KDE) but you are stuck with the following problems:

* Users need to have root access in order to install a piece of software; no per-user installation is allowed
* It’s very tricky to install several versions of the same piece of software. Just think of the poor graphic designer who needs to install several versions of Opera and Firefox;
* Users are stuck with the piece of software installed system-wide;
* The software needs to be downloaded from the official repositories. Well, it doesn’t need to, but an average user wants to stay well away from unofficial repositories for technical reasons;
* In some cases (especially when the user adds repositories or installs packages directly), the dependency-checking mechanism often fails and users end up with circular dependencies. They are nasty;
* A piece of software is bound to a specific distribution, and — what’s worse — to a specific version of that distribution too. It’s not trivial to install Openoffice 3.1 on Ubuntu 8.10. You can argue that you can install the bunch of .deb packages from OpenOffice’s web site. Tell that to your grandmother or your average inexperienced computer user.
* It’s not trivial to “give” a program to a friend. To the end user, giving a program to a friend should be as simple as dragging an icon onto a memory stick; instead, files are scattered all over the system.

----------------
Since you don't mention "Zero Install", "Super OS" (with its Portable Applications) or the Linux Standard Base, I'm wondering if you did your research, or if this is just a rant.
------------------

I have Zero Install installed on my system. I studied it inside out. Zero Install does _not_ solve all of the problems I list above. Did you see the contents of the "Application directory" created by Zero Install? It goes against the "transparency" principle my proposed idea tries to work towards.

As for Super OS's "RUNZ" system: it's still under development and I wasn't aware of it. It _seems_ promising -- and so does PC-BSD's system.

I consider a "rant" a piece that just complains about things. In my piece, I pointed out a problem, explained why it was a problem, and offered a possible solution.

To me, that's not a rant.

--------------
I don't want the system you advocate, because I feel shared components are very important for a number of reasons, and that always leads to where we are now. I don't want any Tom, Dick or Harry to be able to install and uninstall applications; that leads to restricted rights, which leads to admin, which leads to where we are now.
-----------------

Are you saying that you think people should mount their home directory with the "x" flag disabled?
I am asking because _right now_ GNU/Linux allows users to install applications in their own home directories -- it's just hard to do.
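(For reference: mounting with execution disabled is the "noexec" mount option; an /etc/fstab line like the following -- the device and filesystem type are illustrative -- would do it:

/dev/sda3  /home  ext3  defaults,noexec  0  2

With that in place, binaries under /home simply won't run.)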

-------------------
It is currently possible to install multiple versions of the same software: Firefox 3 and 3.5 are both in the Ubuntu repository and work on the same machine just fine. Yes, sometimes, outside the repository, setting up multiple versions on the same machine is "advanced", but that's OK, because it's not the common case; when it is, as with Firefox, both versions are made to work together and put in a repository -- which is where we are now.
----------------------

I disagree with this point, but we will need to agree to disagree.

---------------------
What we have now has evolved to where it is and, for my money, is working rather well. But by all means, install "Super OS" or a distro with "Zero Install" instead of package management; from what you are saying you want, I think these might fit you.
------------------------

Zero Install doesn't. Super OS is promising. We shall see.

Merc.

jabjoe's picture
Submitted by jabjoe on

I'm sure the standard libs in the LSB will expand. For additional libs, the current policy is that you should statically link them. The point is, the LSB is at least the start of what you are asking for, so you should have mentioned it.

-------------------
Users installing applications:
-------------------
There are a number of systems that can already be put in place for the odd application that needs to be per-user rather than system-wide. If it's system-wide, as most applications will be, you want permissions on who does what -- which is what we have now.

-------------------
It’s very tricky to install several versions of the same piece of software:
-------------------
Thought you wanted to agree to disagree on this? There are already multiple versions of some software in the repositories, set up to live together. Any common cases should go in the repository; for the others, the user needs to be more technical, but a non-technical user wouldn't even try that on any system.

-------------------
Users are stuck with the piece of software installed system-wide:
-------------------

Not if they are technical.

-------------------
The software needs to be downloaded from the official repositories. Well, it doesn’t need to, but an average user wants to stay well away from unofficial repositories for technical reasons;
-------------------
I've never had any problems, and don't know anyone who has. It's not something you hear of. I think the repositories are one of the strong points of Linux. They keep the whole system up to date, give you a safe place to get apps, resolve dependency issues by pulling down everything an app requires, ensure applications use the same version of shared libs when possible and, when that isn't possible, make sure libs of different versions can live in peace together.

-------------------
In some cases (especially when the user adds repositories or installs packages directly), the dependency-checking mechanism often fails and users end up with circular dependencies. They are nasty;
-------------------
I've just never heard of this being an issue before. If it was such a huge issue, you would think it would be everywhere...

-------------------
A piece of software is bound to a specific distribution, and — what’s worse — to a specific version of that distribution too. It’s not trivial to install Openoffice 3.1 on Ubuntu 8.10. You can argue that you can install the bunch of .deb packages from OpenOffice’s web site. Tell that to your grandmother or your average inexperienced computer user.
-------------------

What is in the repository is what is known to work. You could argue it's a release branch, so it should only get bugfixes.
But I did install OpenOffice 3.1 on Ubuntu 8.10: I just added a repository + key and updated. Just like Wine and many other things. Dead easy.

For grandmothers, this could be put into an installer, where they just double-click and enter their password.
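For reference, "adding a repository + key" boils down to a couple of commands (the repository address and key ID below are placeholders, not a real OpenOffice repository):

echo "deb http://ppa.launchpad.net/SOME-TEAM/ppa/ubuntu intrepid main" | sudo tee -a /etc/apt/sources.list
sudo apt-key adv --keyserver keyserver.ubuntu.com --recv-keys SOME-KEY-ID
sudo apt-get update && sudo apt-get upgrade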

-------------------
It’s not trivial to “give” a program to a friend. To the end user, giving a program to a friend should be as simple as dragging an icon onto a memory stick; instead, files are scattered all over the system.
-------------------

RiscOS did that, and it had its uses, but as a former RiscOS user I can tell you: you don't want that. Not in the form RiscOS had it, at any rate (which it does sound like you want). With FUSE, perhaps an interface like you're after could be hacked up: a virtual filesystem you can drag and drop applications to/from. Could be a nice little project.

-------------------
I have Zero Install installed on my system. I studied it inside out. Zero Install does _not_ solve all of the problems I list above. Did you see the contents of the "Application directory" created by Zero Install? It goes against the "transparency" principle my proposed idea tries to work towards.
-------------------

Nope: I saw it, read about it, realized it was a copy of the RiscOS system, and left it alone. Don't get me wrong, I loved RiscOS, but computers have moved on since then, and that way of doing things doesn't scale (and even at the scale of things back then, I had issues a few times because of it).

-------------------
As for Super OS's "RUNZ" system: it's still under development and I wasn't aware of it. It _seems_ promising -- and so does PC-BSD's system.
-------------------

Cool. Once you have played with it, let us know how you got on. Personally, I'm far from convinced that going back to the RiscOS way is better than repositories.

-------------------
I consider a "rant" a piece that just complains about things. In my piece, I pointed out a problem, explained why it was a problem, and offered a possible solution.
To me, that's not a rant.
-------------------

Without looking at existing systems like the one you are advocating.
Before you say it: you stated GoboLinux wasn't what you were advocating.

Tony Mobily's picture

Hi,

------------------
I'm sure the standard libs in the LSB will expand.
------------------

Glad you are. I am sure it won't. It's not within their scope to expand to include *desktop* libraries.

------------------------------
For additional libs, the current policy is that you should statically link them. The point is, the LSB is at least the start of what you are asking for, so you should have mentioned it.
------------------------------

Incorrect. The LSB has different goals, and has been around for *years*.

-----------------
There are a number of systems that can already be put in place for the odd application that needs to be per-user rather than system-wide. If it's system-wide, as most applications will be, you want permissions on who does what -- which is what we have now.
-------------------

There are a number of ways to do per-user applications, and I am proposing one that solves the list of points in the article.
The "who does what" you mention applies to _any_ system.

-----------------------
Thought you wanted to agree to disagree on this? There are already multiple versions of some software in the repositories, set up to live together.
-------------------------

I was talking about end-user software, not libraries. It's mentioned clearly in the article.

-----------------
Any common cases should go in the repository; for the others, the user needs to be more technical, but a non-technical user wouldn't even try that on any system.
------------------

End users often want to _try_ a newer version of a piece of software before making it *theirs*.
This is one basic freedom that the current system doesn't give you.
Again, please keep in mind that I am talking about end user software.

>> Users are stuck with the piece of software installed system-wide:
> Not if they are technical.

That's great. As you can see in the article, I am after empowering _average_ users, not "technical" ones.

>> [Staying with official repositories]
>I've never had any problems, and don't know anyone who has.

...I had huge trouble _yesterday_, and all I wanted to do was install the latest OpenOffice on Ubuntu 8.10.
You were probably lucky.

---------------
It's not something you hear of. I think the repositories are one of the strong points of Linux. They keep the whole system up to date, give you a safe place to get apps, resolve dependency issues by pulling down everything an app requires, ensure applications use the same version of shared libs when possible and, when that isn't possible, make sure libs of different versions can live in peace together.
-----------------

If you add unofficial repositories, you end up having trouble. Look up "pinning" for example.
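"Pinning" is apt's mechanism for deciding which repository wins when the same package is available from several. A sketch of an /etc/apt/preferences file (the origin hostname is only an example):

Explanation: prefer the distribution's own packages by default
Package: *
Pin: release a=intrepid
Pin-Priority: 900

Explanation: rank the third-party repository below the official one
Package: *
Pin: origin ppa.launchpad.net
Pin-Priority: 400

Get those priorities wrong and apt will happily "upgrade" you sideways -- which is exactly the kind of trouble described here.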

-------------
[dependency hell from unofficial repositories]
I've just never heard of this being an issue before. If it was such a huge issue, you would think it would be everywhere...
--------------

Try googling it.

-----------------
What is in the repository is what is known to work. You could argue it's a release branch, so it should only get bugfixes.
But I did install OpenOffice 3.1 on Ubuntu 8.10: I just added a repository + key and updated. Just like Wine and many other things. Dead easy.
-------------------

Google it. I had a recursive dependency hell which took me ages -- and the command line -- to fix.

-------------
[Giving programs to a friend]
RiscOS did that, and it had its uses, but as a former RiscOS user I can tell you: you don't want that. Not in the form RiscOS had it, at any rate (which it does sound like you want). With FUSE, perhaps an interface like you're after could be hacked up: a virtual filesystem you can drag and drop applications to/from. Could be a nice little project.
----------------

I proposed a solution that already works for others. I think a virtual file system would be a huge complication -- but I might be wrong.

--------------
Cool. Once you have played with it, let us know how you got on. Personally, I'm far from convinced that going back to the RiscOS way is better than repositories.
--------------

We will see...

-------------------
"I consider a "rant" a piece that just complains about things. In my piece, I pointed out a problem, explained why it was a problem, and offered a possible solution.
To me, that's not a rant."

Without looking at existing systems like the one you are advocating.
Before you say it: you stated GoboLinux wasn't what you were advocating.
----------------------

...? What can I say? You have the freedom to see my piece as a rant; we must have different opinions on what a "rant" is!

Bye,

Merc.

code.guru's picture
Submitted by code.guru on

-------
Have you seen the Linux kernel's code...?
I don't think this would be possible, to be honest.
-------

LOL, that is true, but it wouldn't be too hard to make future modifications in C++, would it?

----
Visit My Blog:
http://shallweprogram.blogspot.com/

jabjoe's picture
Submitted by jabjoe on

We'll see what happens with the LSB; without doubt it will grow. I've got mixed feelings about having a standard base anyway. One of Windows' issues is that everything is written to one implementation, so when that changes, loads of hidden bugs surface. Maybe it's better to have to test on many implementations. For open source stuff, some of your users might even fix it for you. ;-)

This looks like what I followed to install the latest OpenOffice in 8.10:

http://news.softpedia.com/news/How-to-Install-OpenOffice-org-3-1-on-Ubuntu-9-04-111105.shtml

(it does cover 8.10 too)

It really was easy.

Doing a quick Google, it seems a big cause of dependency hell is people mixing Debian and Ubuntu repositories.
Regardless, there does indeed seem to be an issue I've just never hit. But there are better solutions being talked about than walking away from all the advantages of package management.

http://wiki.winehq.org/TrustingThirdPartyRepositories

So, DLL hell or dependency hell: I've not hit dependency hell, but I have hit DLL hell, and I don't like the MS manifest solution. I'll stick with repositories until something better comes along -- which, at the moment, I'm not seeing out there. I see a net gain from repositories, which I think is why repository-based distros have won out.

A virtual filesystem can make things much simpler; that's why they are so widely used in Unix systems. I wish Linux was as far along on that front as Plan 9; with ALSA meant to be replacing OSS, it looks to me like we are moving further away from it. :-(
Anyway, without starting on this virtual filesystem, I don't know how much work it would be. Just an idea, which I'm unlikely to >really< look at. I tend to only do things at home (i.e. on Linux) that are useful to me. ;-)

dingletec's picture
Submitted by dingletec on

A few points in the beginning of your article are incorrect, and since they were primary examples, I couldn't continue.

You mention that under Linux (I primarily use Ubuntu) you are unable to:

1. Install software as a non-root user. Using your two examples, Opera and Firefox, this is easily done as a non-root user by downloading the tar.gz version into your home folder and extracting it. Simply run the executable file located in that folder. The easiest way is to create a launcher pointing to it, and drop the launcher on a panel or the desktop.

2. Run multiple versions of Opera or Firefox. As in the above example, download the tar.gz version of as many releases as you like and extract them. Rename the folder in the case of Firefox but, as in the first example, simply run the executable in each release's folder. Create launchers pointing to each version on your desktop or panel to simplify the process; a sketch of the commands follows.
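In shell terms, the whole "installation" looks something like this (filenames and version numbers are illustrative):

cd ~
tar xzf firefox-3.5.tar.gz               # unpacks into ~/firefox
~/firefox/firefox &                      # run it straight from the folder
mkdir ~/ff-3.0
tar xzf firefox-3.0.tar.gz -C ~/ff-3.0   # unpack a second version elsewhere
~/ff-3.0/firefox/firefox &               # both versions run side by side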

Tony Mobily's picture

---------------
A few points in the beginning of your article are incorrect, and since they were primary examples, I couldn't continue.
---------------

It makes it a little awkward to discuss an article you haven't read.

You talk about .tgz files. I don't think they're a sane way to go -- please feel free to read the rest of the article to see why.

A short list of points:

* You lose the ability to reuse at least some of the libraries. See my point about having a common set of libraries

* It's not straightforward for the average user to install an application that comes in .tgz format. I tested it numerous times with average users -- the failure rate was huge

* What I suggest is _somewhat_ similar to the .tar.gz distribution method, but with several enhancements which allow library sharing, a default icon, automatic upgrading, etc.

Bye,

Merc.

dingletec's picture
Submitted by dingletec on

I also just tested your point: It’s not trivial to “give” a program to a friend

I dragged the Opera and Firefox folders created by extracting the tar.gz version of each from my Ubuntu system to a thumbdrive. I carried it over to a coworker's computer, which is running Linux Mint (Ubuntu-based). I copied the folders from the thumbdrive to the user folder and ran the executables within.

Both ran, and neither is installed on the system.

kimdolin's picture
Submitted by kimdolin on

I know this may sound like an inappropriate metaphor, but consider this:

When you want to drive a car, you need at least some knowledge and formal status to be able to do so. So basically, even if your friend gives you his car, you should not take it on the road unless you have at least minimal knowledge of the rules. Yes, of course, your friend can tell you that the car is in his possession and that it is he who decides who will drive it and when. But it is you who has to have the experience to drive a car and knowledge of the rules; otherwise you may harm somebody. Reducing the overhead of sharing applications with others does not mean others will be able to use them correctly, or that the applications will then play well with other components in the system (with the whole traffic on the road). There has to be some intelligence on the consumer's end as well.

Do you know how your car works inside? I doubt it. That is also because you are presented only with the nice user interface and do not interact with the engine directly. Yet when you apply for a driver's license, you have to learn the basic concepts. And guess what? Nobody has a problem with that. Why not teach people the basic concepts of the operating system and the filesystem structure? Is it less important than learning about a car?

Systems are complex and will always be. Complexity is a must because it differentiates and unites at the same time. Simplicity is a nice feature but definitely not the best one.

Usually you do not want to have 3 different versions of shared libraries on your system. It may only happen when the software you want to use is either outdated or bleeding edge. An ordinary user most likely does not want to use bleeding-edge software, so I do not see the point of this. If the user wants to do something un-ordinary, he should no longer be considered ordinary, and it is his responsibility to study the documentation and solve the issues.

I am very happy with ports on FreeBSD, and it is not the simplest package management system available. But because I am only presented with the nice user interface, I can only guess at the internals. If something breaks during the build process, I guess I have to solve it on my own.

cjs8's picture
Submitted by cjs8 on

Uh... have you heard of relocatable packages, which can be installed with a prefix of your own choosing? There are even a couple of standard prefixes for non-standard package installations.
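For example, a relocatable RPM can be installed under a prefix of the user's choosing (the package name is hypothetical, and this only works if the packager built it as relocatable):

rpm -ivh --prefix=/opt/myapps example-1.0-1.i386.rpm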

ebug's picture
Submitted by ebug on

I am not a programmer, just an end user. In my experience using Linux (Ubuntu/Linux Mint), I install programs through ADD/REMOVE APPLICATIONS, or .deb files through GDebi, Synaptic or Minstall. So what is broken?
I understand that beneath this simplicity there are things that need to be fixed, which only you guys (the hardcore coders) can solve. But for an end user like me, installing programs in Linux, particularly with Ubuntu/Linux Mint, is now soooooo easy. So why is it so hard to penetrate the huge market of users? Probably, in my opinion, it is that many programs for Windows/OSX are marketed individually. You can see them in stores, fliers, magazines, etc. In Linux, the programs are available after you install -- or somebody forces you to install -- a distro. Ask someone what GIMP or Amarok is. For 98-something percent of people, these are just funny-sounding words. Try asking them what Photoshop is... you see what I mean! So what is my point?
1. Installation and sharing of programs in Linux is NOW so easy.
2. Promote the best programs to the masses. With a caption like... "Amarok, your long-lost love..." and beneath the packaging, "Works best with Linux". If I knew Amarok's reputation, I'd install a Linux OS just to be able to install my amour.

So coders, do what you've got to do; I don't care a bit. But thank you for bringing Linux to life, with all its glory and shortcomings. We do love, or have loved, someone even with his/her imperfections. Right?

zephyr1977's picture
Submitted by zephyr1977 on

I couldn't disagree with you more. Linux, like any other software, has its problems, but one of the things Linux got VERY right is package management. I'm not going to go over everything, just your main points.

You don't want individual users to install their own copies of software... it's that simple. It may sound nice, but in a home setting this is a non-issue, and in a corporate setting you want users to have a locked desktop. You don't want them installing anything they want; it can lead to an absolute nightmare for administrators. Per-user installation is just poor design.

As for installing multiple versions of software, you hit on one of the extremely few programs that might meet a situation like that. However, someone above me just answered the question of how simple it is to do that. But seriously, do you need multiple versions of office software? Of your media player? Of your favorite game? I can see absolutely no reason to have multiple versions of 99.9% of the software available, in any situation (except for libraries, which already allow you to easily install different versions).

You can in fact download and install software not in the official repositories easily, if you choose the right format (assuming it's available). The problem comes if you don't use one of the most popular distros; then you run into trouble. But Linux by nature will always be fragmented. If your distro of choice isn't popular enough for projects to build a package for it, then you accept that or choose a different distro. There will never be one way of doing things in the Linux world, which is good: it leads to more innovation. But that also means there will never be one standard binary format that works on everything. If that's something you want, maybe Linux is not for you.

As for older software working on newer distributions... I understand the reasoning here, specifically for companies that may have some expensive custom app that might end up tied to a specific distribution and version. However, backwards compatibility should never be a goal of Linux; if you want to see how that works out, just take a look at Windows. Generally, apps die off for a reason, but the beauty of Linux is that if you REALLY want to keep using one, with the right knowledge you can update it yourself. The average user? They'll stick to a newer version, or to the apps that have come out on top in their respective arenas.

Now, giving someone an app... is it really that hard to copy a .deb or .rpm file onto a memory stick and run it on their computer? Technically, that is just copying an icon onto a memory stick, if you're using the GUI. Better yet, just find out the package name and use the package manager! No copying needed.

Besides these individual points, the biggest problem with the way you think software installation should work is that it throws out the single greatest advantage package management has... and that is, quite simply, management. It manages all of my software for me. It checks every single program on my computer for updates. I would hate to go back to managing all these programs on my own or, worse, having every program run its own little hidden update manager in the background... it would be like going back to Windows *shudder*. No distribution has a mere software installation system; it's a management system, and that's what sets it apart from Windows and Mac. Could things be done better in some areas? Sure. Are certain package managers better than others? Maybe (it has a lot to do with personal preference). But throwing out the whole thing in order to make a mess of your system, have software installed in whichever directory the user throws it into, end up with multiple instances of the same application in different users' directories, have multiple copies of the same library strewn around the system, and then have to manage and update all of these separately? (Or are you proposing some magical update manager that manages every file you drag and drop onto your system?) Sounds like a winning idea to me!

zelrik's picture
Submitted by zelrik on

I am not an expert, but there are a couple of points that are bothering me:

* Users need to have root access in order to install a piece of software; no per-user installation is allowed

I should say that anyone who has an administrative password can install software, not necessarily root. I agree that once the software is there... well, it's there for everyone, but the configuration remains personal.

* It’s very tricky to install several versions of the same piece of software. Just think of the poor graphic designer who needs to install several versions of Opera and Firefox;

I don't really see the point of having several versions of the same software, but OK... maybe it's a pain in some cases...

* The software needs to be downloaded from the official repositories. Well, it doesn’t need to, but an average user wants to stay well away from unofficial repositories for technical reasons;

I would argue PPAs are pretty handy already, while I agree that they still need improvement (mainly for security reasons, imo).

* In some cases (especially when the user adds repositories or installs packages directly), the dependency-checking mechanism often fails and users end up with circular dependencies. They are nasty;

I haven't experienced this personally, but I have heard of it -- a lot more for rpm-based package managers, though.

* A piece of software is bound to a specific distribution, and — what’s worse — to a specific version of that distribution too. It’s not trivial to install Openoffice 3.1 on Ubuntu 8.10. You can argue that you can install the bunch of .deb packages from OpenOffice’s web site. Tell that to your grandmother or your average inexperienced computer user.

Again, I disagree: .debs and PPAs are already pretty good and fairly easy to use (while needing some improvements).

* It’s not trivial to “give” a program to a friend. To the end user, giving a program to a friend should be as simple as dragging an icon onto a memory stick; instead, files are scattered all over the system.

You can still send a .deb, but it's kinda pointless in most cases. Correct me if I am wrong, but I really do think Canonical is working on improving software portability too.

----------------

Just some thoughts...

ammorais's picture
Submitted by ammorais on

First, I'm not a native English speaker so sorry for anything.

I'm sorry, but your article seems to be from someone who is a complete newbie on Linux, and who is trying to overcome the normal newbie frustration with some poor excuses for why the model he's familiar with is better.
Get over it, and try to understand Linux and why it's considered one of the most secure OSes in existence.

First, let me just tell you that Ubuntu != Linux

Now let's go to the facts.

Users should be able to install software even without being root

First, users are allowed to install software even if they aren't root. You can install software in your home directory, as long as the software allows it and it doesn't require admin privileges (like accessing the hardware).
Even Windows started requesting admin privileges to install software (in Vista, if I'm not wrong). Being able to install programs without root privileges is a characteristic of a poor OS design that runs the kernel and applications in the same privileged space; that's what gets a freshly installed Windows box compromised within minutes of connecting to the internet.
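The classic way to do a per-user install from a source tarball is the --prefix switch (the program name is illustrative):

tar xzf someprogram-1.0.tar.gz
cd someprogram-1.0
./configure --prefix=$HOME/usr    # everything will land under ~/usr
make && make install              # no root needed
export PATH=$HOME/usr/bin:$PATH   # so the shell can find it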

It’s very tricky to install several versions of the same piece of software. Just think of the poor graphic designer who needs to install several versions of Opera and Firefox;

That's not true. In the very rare cases where applications are justified in having more than one version, distros normally offer a solution.

Users should be able to run either their own version of the software, or the one installed on the system (if any)
As someone pointed out, if you run a newbie-friendly distro like Ubuntu or some other popular distro, you can, since there is probably a deb package for you. If you run a more advanced distro, you are probably an advanced user and able to manage a simple software installation. I haven't yet found a distro on which I didn't manage to install custom software.

Software just needs to work — unchanged — even if it’s a bit old and it’s run on a newer distribution
Why? Why do you think that is an advantage? Only OSes based on closed binaries have to be ABI-compatible. Linux is 99% open source. You can always recompile a version for your specific platform, and normally applications are optimized for your system and your computer.

It should be possible to “give” a piece of software to a friend by simply dragging an icon onto a memory stick

You can give software to your friend by dragging an icon from your desktop to a pen drive. You have several formats: deb, rpm, tar, tgz, tar.bz2, etc. The difference is that you are not familiar with these formats. What's next? Arguing that Linux executables should end in .exe, just because you think it's more intuitive?

Conclusion: I know it's difficult to stay objective when we have an emotional background attached. What you know is Windows, and you probably feel (and believe) that the Windows system is more intuitive; as we get older, it's difficult to learn new stuff. Linux is different, but it isn't less functional in any way.

admin's picture
Submitted by admin on

Hi,

---------------
First, I'm not a native English speaker so sorry for anything.

I'm sorry, but your article seems to be from someone who is a complete newbie on Linux, and who is trying to overcome the normal newbie frustration with some poor excuses for why the model he's familiar with is better.
-------------------

Well, I've been a programmer for about 18 years (although I moved away from device drivers and system programming about 10 years ago) and have used GNU/Linux since 1995. I am sure I qualify as a newbie compared to a _lot_ of people out there. However, I was talking more about what I have seen in terms of newbies using GNU/Linux -- and failing.

-----------------
"Users should be able to install software even without being root"

First, users are allowed to install software even if they aren't root. You can install software in your home directory, as long as the software allows it and it doesn't require admin privileges (like accessing the hardware).
-------------------

The missing keyword there -- my fault -- is "easily". Average users get lost when they download a .tar.gz.

-----------------------
"It’s very tricky to install several versions of the same piece of software. Just think of the poor graphic designer who needs to install several versions of Opera and Firefox;

That's not true. In the very rare cases that applications are justified to have more than one version, distros normally offer a solution.
-----------------------------

Testing a new version of a program before embracing it is not rare. The user should have the freedom to do that.

----------------------
"Users should be able to run either their own version of the software, or the one installed on the system (if any)"

As someone pointed out, if you run a newbie-friendly distro like Ubuntu or some other popular distro, you can, since there is probably a deb package for you. If you run a more advanced distro, you are probably an advanced user and able to manage a simple software installation. I haven't yet found a distro on which I didn't manage to install custom software.
-----------------------

My article is trying to go a little further than "if you use Ubuntu then you *should* be all right"... In the article, I am trying to offer a cross-distro, transparent solution.

-----------------------
"Software just needs to work — unchanged — even if it’s a bit old and it’s run on a newer distribution"

Why? Why do you think that is an advantage? Only OSes based on closed binaries have to be ABI-compatible. Linux is 99% open source. You can always recompile a version for your specific platform, and normally applications are optimized for your system and your computer.
---------------------------

Running old software is not a good idea. However, getting somebody else to decide for everybody which version of the software you should use is limiting.

I might desire to burn a CD with all my most precious end-user programs, and give it to everybody when I see them -- and expect their super-new distro to just run them.

-------------------------
"It should be possible to “give” a piece of software to a friend by simply dragging an icon onto a memory stick

You can give software from to your friend dragging an icon from your desktop to a pen. You have several formats. deb, rpm, tar, tgz, tar.bz2, etc... The difference is that you are not familiarized with this formats. What's next? You arguing that Linux executables should end in .exe just because you think it's more intuitive.
---------------------------

I am very familiar with those formats. Average users aren't. And most users will install software from a repository, hence having no intuitive idea of "where" the program is, and how to "move it" somewhere else.

--------------------------------
Conclusion: I know it's difficult to stay objective when we have an emotional background attached. What you know is Windows, and you probably feel (and believe) that the Windows system is more intuitive; as we get older, it's difficult to learn new stuff. Linux is different, but it isn't less functional in any way.
------------------------------------

...? I offered a technical proposal on how to deal with what I consider a problem. I can only assume you missed the technical proposal in the second part of the article?
And no, I haven't used Windows in years... But I have used Macs, and can clearly see the differences.

Bye,

Merc.

jabjoe's picture
Submitted by jabjoe on

http://en.wikipedia.org/wiki/Dependency_hell

Personally, I come away thinking we already have the best solution to date: package management. For special cases, there is nothing stopping portable applications from being created.

What really seals the deal for me is:

* Safe. A software repository is software known to be safe. In Windows, a good proportion of the infection problem comes from people installing apps from any old place.

* Easy for the user. One place to get software known to work.

* Easy for the programmer. Source dependencies are solved too. Building an open source app on Windows is often a nightmare, as you have to hunt down the right version of the right libs and install them as expected. With package management, you just add the source repositories and say "install the source for A", and it tells you that A needs the B, C and D dev libs -- do you still want to install? You just say yes, as shown below, and you're ready to program while your Windows counterpart hasn't even started weeping at his desk (which he will in a few hours, once the horror of his situation has sunk in). ;-)
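On a Debian/Ubuntu box, "saying it" is literally two commands once the deb-src lines are in sources.list (the package name is just an example):

sudo apt-get build-dep inkscape   # pulls in every dev lib the build needs
apt-get source inkscape           # fetches and unpacks the source itself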

Mitch Meyran's picture

It's been said in other comments, but...

Installing an application system-wide is what the root account is for. Local installs are perfectly possible, and a modern OS does not prevent you from running a local copy of any software that does NOT require system-wide installation. The same is true even under Windows, which enforces the very same system on any machine running an NT kernel; the fact that Microsoft decided to make everybody 'root' by default is faulty design, not an asset.

Installing several applications side by side: well, this is faulty design on the part of installer designers, under Windows too. Nothing, and I do mean NOTHING, prevents you from installing any copy of any software in a separate directory and accessing it freely, be it in Windows or Linux (actually, Linux already has a design for that: /usr for system-wide, /usr/local for local installs, /opt for alternative, locally installed software, and inside /home for a single user... and not all in /Program Files!). Yup, 2 points covered at once.

Official repositories: modern distros allow the install of local packages, untrusted repositories (no signature) with user confirmation, or trusted repositories; most require root access if the installed package will modify stuff outside the user's account.

Installing packages from outside official repositories: that's what personal repositories are for. And personally, I didn't find installing OOo from .deb packages any harder than installing MSO from a CD (key, updates, reboots and all).

Software running unchanged: well, it does.

Copying an installed application to give it away: no. Software is a delicate collection of stuff; that's why you have installers. Give a friend an original installer, not files copied from here and there!

---
A computer is like air conditioning: it becomes useless when you open windows.

Terry Hancock's picture

Installing several applications side by side: well, this is faulty design on the part of installer designers, under Windows too. Nothing, and I do mean NOTHING, prevents you from installing any copy of any software in a separate directory and accessing it freely, be it in Windows or Linux (actually, Linux already has a design for that: /usr for system-wide, /usr/local for local installs, /opt for alternative, locally installed software, and inside /home for a single user... and not all in /Program Files!). Yup, 2 points covered at once.

Sorry, but I've come across too many hard-coded filesystem dependencies in my life not to comment on this. If a library is almost always installed in the same place, then there are going to be packages that just write that location right into the code (or into the Makefile, or somewhere else that isn't easy to find and configure).

There's really very little that distributions or installers can do to stop programmers from being lazy this way. Hopefully you won't see this in the most popular end-user applications -- it's possible that I've seen this more often because I have maintained aging crufty scientific packages, but I have definitely seen it in packages I use.

bogdanbiv's picture
Submitted by bogdanbiv on

We should just add that, when users run package management software without administrative rights, the default installation prefix should be ~/usr or ~/.usr instead of /usr. Configuration should go into ~/etc or ~/.etc.

Just change the installation prefix when users do not have admin privileges.

Also, because we retain the most common Unix tree, we can move/migrate software from just one user to system-wide availability. It would be just:
apt-get remove firefox          # uninstall the user-space Firefox
sudo apt-get install firefox    # install the system-wide Firefox
sudo cp -r ~/.gnome /etc/...    # copy the customized configuration across

Also, adopting the /etc structure in user-space would solve the mess with user-specific configuration files: .bash_history, .mozilla-thunderbird/, .subversion/ and .gnome/ could all move from ~/. to ~/.etc

To me this is just a small modification to the package management software, something which should be easy with the help of PackageKit.
So what do you think?

bogdanbiv's picture
Submitted by bogdanbiv on

I got this idea today when I installed Firefox 3.5 into /home/user/apps, because it came as a tarball and there are no packages for my distro yet.
But when I thought about the flaws in this strategy, I quickly found the biggest: with the ~/apps scheme, you get the system paths of what's in the system, plus ~/apps/firefox/, ~/apps/eclipse, which is all wrong by my standards. Why not keep in userspace the same split between bin, docs and configuration that is standard Unix system-wide?
And then logic follows: if we keep the same tree structure in /home/user1/usr as in the system, why not just use the standard package management software?
What do you think?
NB: I used ~/. as a shorthand for /home/user1/.

UPDATE: Ah... I put roughly the same post up twice... sorry guys!
I edited the text to remove the duplicates.

thornik's picture
Submitted by thornik on

I agree only with 'Linux has a problem' and the issues pointed out. But the problem sits much deeper: in the kind of dependency between modules, and in the whole development architecture. GoboLinux is THE BEST you can afford right now in this ugly Linux.
The problem is not "this is the user's soft, this is the system's" -- the problem is the static link between "library" and "soft". If you need LibJPEG, WTF do you ask for version "1.7.3"?? Who cares which version is there? If the soft worries about an exact version, where some "feature" first appeared, just point to it! "require LibJPEG >= 1.7" -- it's enough. Or, more exotic: "require LibJPEG stable", or "option LibMP3" -- module LibMP3 _can_ be used, but is not necessary. Instead we have idiotic packages which pull a whole system in behind them; even if it's "ls" with an optional GUI, be sure the whole X system with Gnome & KDE will be installed! Unix way, heh...
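For what it's worth, Debian's package format can already express these kinds of constraints; a control file for a hypothetical image viewer could declare:

Package: imageviewer
Depends: libjpeg62 (>= 6b), libc6 (>= 2.7)
Suggests: libmp3lame0

Both the ">= 1.7"-style requirement and the "optional module" case are in the vocabulary; the complaint is really about packagers pinning dependencies more tightly than they need to.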

> There should be a well defined directory tree that contains a whole application.
and earlier:
> The Unix file system has been around for a long time — for good reason.

ha-ha :) Very logical! WHERE will you put your 'application' when you have a mess of /bin, /sbin, /usr/bin, etc.? Use your 'good reason' file system!

Well, this article is not a revolution -- it's just throwing shit into the air: shit assembled over a long, long time, named Linux. If you want a revolution, throw away LINUX -- the ugliest of systems.

cichlasoma's picture
Submitted by cichlasoma on

Nix(OS) goes in the same direction at least.
http://nixos.org/

mmarion7's picture
Submitted by mmarion7 on

As one of the unwashed masses that represent the market penetration mentioned in the article, I can wholeheartedly agree that the difficulty of installing software is THE reason I don't use Linux as my primary OS.

I have revisited the Linux world every year or so for about 10 years. Installation has improved hugely and hardware support is almost a non-issue now, but installing software remains an absolute mystery.

Installing from a distribution CD is simple, but after that I'd rather re-install the OS than try to install software.

Not a criticism of the OS, just the experience of a newbie who isn't a hobbyist or an IT professional.

gondorr's picture
Submitted by gondorr on

It's about the missing differentiation between OS and application; one Mozilla developer mentioned this before: http://benjamin.smedbergs.us/blog/2006-10-04/is-ubuntu-an-operating-system/

Also, this topic has already been addressed by autopackage:
http://web.archive.org/web/20061124075140/http://www.autopackage.org/faq.html#1_1

But, again, it's missing support from the community: http://web.archive.org/web/20080331092730/http://www.linux.com/articles/60124 ("Autopackage struggling to gain acceptance")

Author information

Tony Mobily's picture

Biography

Tony is the founder and the Editor In Chief of Free Software Magazine