Editorial

The IT world has a reputation for being extremely fast-paced. And it is: an accounting program in the ’80s would have been written in COBOL. In the ’90s it would have been written with a RAD (Rapid Application Development) environment such as Delphi or Visual Basic. In the... ’00s (noughties?), the same application would today probably be written as a web system, possibly using all of the “Web 2.0” technologies to make it responsive and highly usable.

I am not going to try to predict what it will be like in the ’10s, since I am notorious for being wrong (yes, I am one of those people who thought that Java would be “it”...).

However, there is a shift that has indeed already happened, and it hasn’t been highlighted much by the media. This shift helps free software, and at the same time is helped by free software.

Let me take a step back. At the beginning of this decade, the internet became the most important feature of any personal computer. There was a very strong distinction between PCs (which acted as “clients”) and servers (which often didn’t have a monitor and were stacked in a server rack in a data centre). PCs would request a page; servers would serve it; PCs would render it. Blogging systems and photo albums often worked on the same model: the PCs were often used as a means to transfer data onto the servers, using a web interface or an unfortunate proprietary client.

While I shouldn’t use the past tense just yet, something has changed today: people are using GNU/Linux and Mac OS, and therefore have fully featured servers hidden behind all the pretty icons they are used to. Windows users are trying to catch up, but they do seem to struggle in terms of available software and, more importantly, security; still, even Windows machines can act as servers.

This change is paramount. Computer users today are finding themselves with very fast, always-on home connections, as well as powerful server systems hidden behind fancy icons. They are realising that there is no point in paying a hosting company for a web site: they can just run their own. The IP address changes? That’s not a problem: dynamic DNS, free of charge, comes to the rescue. Everybody is becoming “a bit of a system administrator”, with their computer hosting their web site, their photo album, or their music. Quite a few issues still remain, especially around security and problem solving when something “goes wrong”. However, none of these problems are really new: how many spam zombies run happily on client-only Windows machines? How many times do client machines crash?
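Just to give an idea of how low the barrier already is, here is a deliberately simplistic sketch of my own (the port number is an arbitrary choice) that publishes a folder of photos over HTTP using nothing but Python’s standard library:

    # Minimal sketch: publish the current directory (say, a photo album)
    # over HTTP with only the Python standard library. The port number is
    # an arbitrary choice for illustration.
    from http.server import HTTPServer, SimpleHTTPRequestHandler

    PORT = 8080  # anything above 1024 avoids needing root privileges

    # SimpleHTTPRequestHandler serves files from the current working
    # directory, so running this from ~/photos publishes that folder.
    server = HTTPServer(("", PORT), SimpleHTTPRequestHandler)
    print(f"Serving the current directory on port {PORT}...")
    server.serve_forever()

Point a free dynamic DNS name at the home connection, forward the port on the router, and that folder is, for better or worse, a web site.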

To me, this is yet another fantastic opportunity for free software: if installing a web server, a blogging system or a Wiki becomes as simple as installing any desktop application (that is, no command line is needed), then less skilled people will have yet another good reason to use free software and free operating systems.

For quite a few years, a lot of people predicted that “clients will be thinner and thinner, until most of the processing will happen in the server”. What a lot of people didn’t predict is that all those servers would be in the hands of the end users through very fast internet connections, rather than in dedicated data centres.

I don’t know if this is only the beginning of yet another revolution, or just a temporary trend which will be relegated to geeks and computer experts. This will also depend on how easy it will be for the common user to install a pre-configured web server, photo albums, Wikis, etc. on their systems (right now, my mother wouldn’t manage any of those things and she knows as much as 90% of the computer users out there).

In any case, it’s definitely something to keep an eye on.

Comments

Terry Hancock:

While I agree that self-hosting is more possible than it used to be, it can be a problem. What may seem like enormously high bandwidth for browsing may still be kind of tight for a server. Furthermore, servers require high "up and down" speeds, whereas much broadband service is "asymmetric", meaning you get much slower upload than download.

For example, it's not uncommon to see "384 kbps down / 128 kbps up" or "1024 kbps down / 128 kbps up". For a server, that "up" figure is the limiting factor: HTTP requests (down) are usually tiny, but the web pages served in response (up) are large.
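To put rough numbers on that "up" figure (the 200 KB page size below is just an illustrative assumption):

    # Back-of-the-envelope: time to send one 200 KB page over different uplinks.
    # The page size is an assumed, illustrative figure.
    page_size_bits = 200 * 1024 * 8  # 200 KB expressed in bits

    for uplink_kbps in (128, 1024):
        seconds = page_size_bits / (uplink_kbps * 1000)
        print(f"{uplink_kbps} kbps up: roughly {seconds:.1f} s to serve one page")

At 128 kbps the uplink alone adds over ten seconds per page view, which is why the asymmetry matters so much for self-hosting.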

There's also reliability. Data centers usually have better uptime than you can manage at home.

But even if you have the bandwidth, you still have the headaches associated with running security on your website. Even if I did run my server at my house, I'd surely want it to be a separate machine, so that it wouldn't interfere with my desktop use -- I prefer to run a good firewall so I can be a little relaxed about the security updates on my desktop machine. Almost all attacks on Linux security are server-attacks -- if you only use the machine as a client (and block other traffic), the odds of an intrusion are really tiny (I've never had this happen, after 6 years of using Linux on the desktop).

Of course, this all depends on what you want a website to do. If it's just your home page, then a direct site makes sense. If it's any kind of business or organization, though, the cost of a data-center-based webhost is probably worth it (and for me, anyway, it costs less than my ISP does, so it's not a big expense).

Author information

Biography

Tony is the founder and Editor in Chief of Free Software Magazine