Sometimes I just want to be stupid

Most modern Linux distributions have slick graphical installers, fit on a single DVD and install common applications with ease. The installers make software choices that lead new users by the hand, leaving little to go wrong. Life is also easier for us old-timers who, in the past, suffered through endless configuration files, compiled network drivers by hand and endured the miscellaneous headaches of trying to get our hardware to work.

The first Linux distribution that I installed, way back around 1995, was Slackware, which came on a set of floppy disks. There was another set for X Windows and other sets for additional software packages. Installing the operating system, once you got through the disk-partitioning detour, was pretty straightforward. Getting X to run was an entirely different exercise. You had to install X Windows, load the correct driver for your graphics card, set up the monitor and manually edit the configuration file, albeit through a handy utility. If you didn't set up the software correctly, supposedly, you could destroy your monitor or graphics card. I never heard of this actually happening, but I could imagine typing "startx" and watching the monitor begin to billow gray electronic smoke. The kind of smoke that tells you your system will never work right again.

Installing and configuring Linux really wasn't that much of a leap, though. Getting the most out of DOS high memory, TSRs and knowing the difference between "extended" and "expanded" memory was an art in itself. Making DOS do its magic tricks and making Linux do its own, though different, still required you to know what you were doing. Without that knowledge you suffered through a partially working machine or, worse, a machine that just stared back at you with a flashing cursor, no input or output. All of this was before search engines had the answer to every question; well, probably a wrong answer, but an answer nonetheless.

All of this graphical and auto-magic installation goodness has made me mentally fat and lazy; we'll leave the physical out of this. I now expect this wondrous GUI-ness all the time, and when it's not there, the small nerve in my neck that lets itself be known when I'm frustrated starts to signal its existence. Now, with the hundreds of packages available on Linux, you would think I would know better than to expect everything to be point and click. Actually, I do know better, and I should be happy with that.

I've been working on a couple of projects that have me downloading source code and doing the compile-and-install dance. Compilation is required either because the hardware I have isn't supported by the driver version shipped with the distribution, or because the package is only available as source code. Making sure I have the correct compiler installed, along with the required libraries, forces me to know what I'm doing. I'm back to the old days of using vi and tweaking configuration files to make everything work just right. Basically, I need to know how Linux works to make it do what I need it to do.
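A minimal sketch of the sanity check that precedes the compile-and-install dance: before touching a source tree, confirm the basic build tools are present. The tool list here is illustrative; a real build may also need specific compilers, autotools and development libraries.

```shell
# Pre-flight check before building from source: verify the basic
# build tools exist on the PATH. The list is illustrative; real
# packages often need more (autoconf, pkg-config, -dev libraries).
missing=""
for tool in cc make tar; do
    command -v "$tool" >/dev/null 2>&1 || missing="$missing $tool"
done
if [ -n "$missing" ]; then
    msg="missing build tools:$missing"
else
    msg="all build tools present"
fi
echo "$msg"
```

On a distribution that ships without a compiler, the first line of output tells you what to install before any `./configure` run can succeed.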

The nice thing about Linux, though, is that if you have to install something from source or configure a particular piece of hardware that isn't widely supported, the knowledge gained makes you a better user. Sure, compiling, installing and configuring is more difficult, but you have more control over the installation process. If something goes wrong during one of the steps, you will most likely get enough error messages to see what went wrong and where. Then you can either figure out the problem yourself or, with the error information in hand, ask an expert for help. That's much more helpful than getting a meaningless error message from an installer you have no control over.

I guess blissful ignorance isn't all it's cracked up to be. You can't blame me for wanting to be stupid sometimes, though. I can dream of being, let's say, less than knowledgeable. It's just good to realize that with computing knowledge comes computing power. Without that knowledge, I might as well be staring at a blinking cursor.



Submitted by Anonymous visitor (not verified)

I'd like to see Linux get to the point where package management and making hardware work are completely brainless. Sure, it's good to know how to compile apps, and how to dink with drivers and customize kernels and all that sysadmin stuff, but that's all just system management, and I'm glad it's finally growing up and becoming more automated. That's the kind of boring stuff computers should be doing, not people. I'd like to see a Linux installer that installs only kernels, drivers, and applications that are supported by your hardware. Then have some kind of slick utility for future changes and upgrades. Of course in classic FOSS fashion the user will have the choice of how much manual control they want over the process.

In my almost-humble opinion, the real work is either coding, or using applications. The in-between jobs of having to learn weird hacks and command options just to make things run in the first place are wastes of time, and are overdue for retirement. And thankfully, Linux is getting there.

Submitted by Terry Hancock

"I'd like to see a Linux installer that installs only kernels, drivers, and applications that are supported by your hardware. Then have some kind of slick utility for future changes and upgrades. Of course in classic FOSS fashion the user will have the choice of how much manual control they want over the process."

Take your pick: Debian, Fedora, SUSE (even though we are currently miffed with them!), Ubuntu, MEPIS, Knoppix, Linspire/Freespire, Xandros, ....

Of course, as in all cases, the installers aren't 100% accurate. But that goes for any O/S and distribution! I have quite vivid recollections of having to talk users through the correct way to get HP drivers to install on Windows, which involved second-guessing and avoiding Windows' thumb-fingered attempts to autodetect "Plug-n-Play" hardware!

But a number of Linux distributions have quite good installers that are accurate on a wide range of hardware. Tools like apt-get or synaptic make it remarkably easy to control your package install state (I know there are also good tools on RPM distributions, such as Yum and YaST, but I can only personally vouch for Debian's tools).

Submitted by Anonymous visitor (not verified)

None of those self-customize for your hardware. They all install lardy pre-fab kernels and big fat initrd images that contain everything in the world for booting on every possible system, and they dump a big fat load of kernel modules you'll never use on your system. Take a peek and see how much ISDN stuff is cluttering up your system, for one example. Joystick junk? Bluetooth baggage? I don't want those, but they always get installed. In my dream world nothing gets installed unless the installer detects hardware for it or the user requests it, and the kernel only supports what I want it to.

Yes, I can have all that now by building custom kernels and fine-tuning package selection. But it's very time-consuming. I'm dreaming of nice automated tools.

Submitted by Terry Hancock

Of course, other O/Ss like Windows and Mac OS also do that, so this is not a competitive issue.

OTOH, there are distributions like Gentoo or Lunar that do compile the kernel (and other packages) as part of the distribution process. This achieves very nearly the optimum performance possible on the target system, at the expense of additional installation time.

In practice, however, with 80 GB hard drives being practically free, the extra bulk of a pre-fab kernel seems like a non-issue for most end-user systems! Remember that loadable kernel modules don't consume RAM if they aren't loaded, so we are really just talking about disk space.

So this is more of a "high performance" complaint than an "ease of use" concern. And of course, HPC always costs more time and effort for the user.

Submitted by Anonymous visitor (not verified)

Unnecessary packages make updates and upgrades longer and more complicated, and they increase the risk of dependency problems. They carry some security risks as well. Admittedly, these are fairly minor concerns on stable distributions with reliable upgrade paths like Debian and RHEL.

"HPC always costs more time and effort for the user." But must it be a lot of time and effort? We already have all the tools to do this, we're just stuck with doing it manually. I want the computer to do the work!

Submitted by Terry Hancock

Don't like how it works now?

Remember: one thing ALL free software operating systems come with is a COMPLETE development environment for just about any programming language you can imagine.

So, if you want improvements, hey, there's the keyboard, start typing, bucko... ;-)

That's what free software is all about.

Submitted by Anonymous visitor (not verified)

Honestly, I would like to see ANY OS get to this point. The closest I have yet seen was a Mac Classic running official Mac hardware from Apple. Even then you occasionally got that piece that just would never work quite right. Windows has never made it, OS X is starting to slip, and Linux, while for the most part on the ball, still has its thorny spots.

Submitted by Anonymous visitor (not verified)

I started using *nix back on Motorola microcomputers in the late 70's, highlighted in the mid 80's by high-speed Sun and Indy GUI 32-bit workstations.

The Wintel crowd since the mid 90's seems to be regurgitating this period. Although the 32-bit CPUs are faster, the overall interface, functionality, usefulness, performance and general-purpose application selection are still about the same, though lacking some of the really easy end-user tools for manipulating information, automating repetitive tasks, and providing high levels of security and open networking.

Intel had to add 128-bit registers to its core for multimedia extensions, but for profit reasons did not extend this to the whole chip, so we have been forced to go without the obvious media input and output advances over the 80's model that very-long-word instructions would have given us.

The marketing rationale behind this decision is that people have to make these dumb 32-bit machines perform by themselves, and more bottoms on seats means more profits for Wintel.

Now that the IBM Cell architecture is hitting the shelves in what can be a cheap PS3 Linux development machine, we expect to see garage developers give us some amazing advances on the current dumb system interface, and with the ability of Cells to Open Grid to process AI data domains, we should at long last have the power to do audio visual recognition and responses.

Couple this with *nix daemons and information processing tools and you have the potential for an automatic intelligent slave system instead of a dumb PC.

Roll on 2008.

Submitted by Anonymous visitor (not verified)

I can tell you from first-hand experience, with that same version of Slackware you are remembering, that you could indeed smoke a monitor. I did just that the second time I ever configured an X server. After running "startx" the monitor came up all nice and bright and ran just fine for about 20 minutes or so, except for an odd hum. Then the gray smoke that powers the monitor started to leak out and left me with one dead monitor. It turns out that with X, especially the older versions, you could overdrive the video card or send too high a refresh rate to your monitor if you weren't careful. I made the too-high-refresh mistake.

Submitted by Anonymous visitor (not verified)

You could (and probably still can) blow up your CRT monitor by using the wrong frequency settings, even just a few years ago. I did just that by running my monitor at about 30% higher vertical refresh than it could handle; it just went ping and that was that.

Submitted by Anonymous visitor (not verified)

But it seems that the Linux hacker community doesn't want me to, as it would pollute the pristine purity of the concept. If I want to install an application (not Linux itself), I have to copy out a set of command-line strings, and being stupid, it takes me ages and I often do it wrong. A lot of the time, the strings I have to copy out are one of a smallish number of variants, with a few differences (noticeably the package name) in the arguments. Now, when programmers (I'm not a programmer) find themselves in this position, they write a program, or a script, or something that relieves them of this tedium and gets it right almost every time. And they either sell it or give it away, so that other people don't have to reinvent the nuclear reactor. Think of things like, er, C, which saved a hell of a lot of tedious assembler work when Unix was first invented, and now everybody has GCC and quite likely a few other compilers hanging around. And Kylix, which saves you all that trouble with GUI design. And I suppose someone had to write the first assembler like this: 10001100 0110000 1010001 ....

And I bet a decent application installer, that knows about the different distros, hardware, directory structure, where to get packages, how to make everything fit together AND plays Stairway to Heaven while you're waiting isn't as much work as an assembler.

But when it comes to making program installation easy, all the hackers dig their heels in. It's good for the soul to type "make xxx" on the command line, having got your sources into the right directory by hand in the first place, and then have it all fall over because you've got the wrong library version or whatever. Having an installer find the right directory, prompt for su passwords, check for the libraries, check for dependencies, copy everything and then type "make" for you is cheating.
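The installer the commenter describes can be sketched in a few lines of shell. Everything here is hypothetical (the function name, the tool list, the directory); a real tool would actually run the build steps instead of just reporting that it is ready to.

```shell
# Hypothetical sketch of a friendlier source installer: find the
# source directory, check for the build tools up front, and only
# then hand off to the usual steps. All names are illustrative.
build_and_install() {
    src_dir="$1"
    [ -d "$src_dir" ] || { echo "no such source directory: $src_dir" >&2; return 1; }
    for tool in cc make; do
        command -v "$tool" >/dev/null 2>&1 || echo "warning: $tool not found" >&2
    done
    # A real installer would now run, stopping with a readable
    # message at the first failure:
    #   cd "$src_dir" && ./configure && make && sudo make install
    echo "ready to build in $src_dir"
}

mkdir -p demo-src       # stand-in for an unpacked source tree
build_and_install demo-src
```

The point is not the few lines themselves but that the error checking happens before anything is compiled, so the user gets a readable message instead of a failed build halfway through.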

The OS community at the moment looks like a Manchester City forward faced with an open goal. Microsoft by now is probably beginning to realise that Vista is an appalling blunder, and that the world is not quite ready yet for Windows 1984. All Linux needs is the little push to make it idiot-usable (it's already practically idiot-proof), and you could easily see mass migration. That would mean hardware manufacturers issuing Linux drivers automatically, major software vendors selling Linux versions and, in case you hadn't realised, name-your-fee rates for experienced Linux hackers for at least five years.

Now, where's the ball going? In the net, or row G?

Author information

Ken Leyba


Ken has been working in the IT field since the early 80's, first as a hardware tech whose oscilloscope was always by his side, and currently as a system administrator. He supports both Windows and Linux; Windows keeps him consistently busy, while Linux keeps his job fun.