The IT world has a reputation for being extremely fast-paced. And it is: an accounting program in the ’80s would have been written in COBOL. In the ’90s, it would have been written with a RAD (Rapid Application Development) environment such as Delphi or Visual Basic. In the ’00s (the noughties?), today, the same application would probably be written as a web system, possibly using all of the “Web 2.0” technologies to make it responsive and highly usable.
I am not going to try to predict what it will be like in the ’10s, since I am notorious for being wrong when I guess (yes, I am one of those people who thought that Java would be “it”...).
However, there is a shift that has already happened, and it hasn’t been highlighted much by the media. This shift helps free software, and is at the same time helped by it.
Let me take a step back. At the beginning of this decade, the internet became the most important feature of any personal computer. There was a very strong distinction between PCs (which acted as “clients”) and servers (which often didn’t have a monitor and were stacked in racks in a data centre). PCs would request a page; servers would serve it; PCs would render it. Blogging systems and photo albums often worked on the same model: PCs were used mainly as a means to transfer data onto the servers, through a web interface or an unfortunate proprietary client.
I shouldn’t use the past tense just yet, because something has now changed: people are using GNU/Linux and Mac OS, and therefore have fully featured servers hidden behind all the pretty icons they are used to. Windows users are trying to catch up, but they do seem to struggle in terms of available software and, more importantly, security; still, even Microsoft users can act as servers.
This change is paramount. Computer users today find themselves with very fast permanent home connections, as well as powerful server systems hidden behind fancy icons. They are realising that there is no point in paying a hosting company for a web site: they can just run their own. The IP address changes? That’s not a problem: dynamic DNS, free of charge, comes to the rescue. Everybody is becoming “a bit of a system administrator”, with their computer hosting their web site, their photo album, or their music. Quite a few issues remain, especially around security and troubleshooting when something “goes wrong”. However, none of these problems is really new: how many spam zombies run happily on client-only Windows machines? How many times do client machines crash?
To me, this is yet another fantastic opportunity for free software: if installing a web server, a blogging system or a Wiki becomes as simple as installing any desktop application (that is, with no command line needed), then less skilled people will have yet another good reason to use free software and free operating systems.
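To give a sense of how low the bar already is for the technically inclined, here is a minimal sketch showing that a free operating system with Python installed can serve a directory over HTTP using nothing but the standard library. This is an illustration of the idea, not a production setup (no TLS, no access control), and the port number is an arbitrary choice:

```python
# Minimal sketch: serve the current directory over HTTP using only
# Python's standard library. No third-party software, no configuration
# files -- it shows how little is needed for a PC to "be a server".
from http.server import HTTPServer, SimpleHTTPRequestHandler


def run(port=8000):
    # Bind to all interfaces so other machines on the network can connect.
    server = HTTPServer(("", port), SimpleHTTPRequestHandler)
    print(f"Serving HTTP on port {port}...")
    server.serve_forever()


if __name__ == "__main__":
    run()
```

The point is not that everyone should type this in, but that the server machinery is already sitting on the disk of every GNU/Linux and Mac OS machine; what is missing is only the pretty icon in front of it.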
For quite a few years, a lot of people predicted that “clients will get thinner and thinner, until most of the processing happens on the server”. What a lot of people didn’t predict is that many of those servers would end up in the hands of end users on very fast internet connections, rather than in dedicated data centres.
I don’t know if this is only the beginning of yet another revolution, or just a temporary trend which will be relegated to geeks and computer experts. This will also depend on how easy it becomes for the common user to install a pre-configured web server, photo album, Wiki, and so on (right now, my mother wouldn’t manage any of those things, and she knows as much as 90% of the computer users out there).
In any case, it’s definitely something to keep an eye on.