When Javascript became the world's new CPU

Short URL: http://fsmsh.com/3258


The computing world is always very unpredictable. That must be why a small number of people make large amounts of money from it: they are in the right (unpredictable) place, at the right (unpredictable) time. Who would have ever guessed that Javascript, a simple scripting language initially conceived as a way to make web pages "cooler", would become... drum roll... the world's new CPU?

I doubt anybody expected it. Even after seeing AJAX (which, ironically, was started by Microsoft...), very few would have bet that Javascript would become quite so important. And yet, it made it: Javascript is the only really widespread, multiplatform solution the modern IT world has seen. Google Documents is an office suite that runs in your browser -- and it's not even the best one. And that's only the beginning: the world is absolutely full of software -- full blown software -- that will run for you wherever you are.



The few readers who were here in the mid 90s will have a very strong deja-vu feeling right now: this is precisely what Java tried to do back then, when Windows had just won the "PC operating system war" against OS/2 and Java was proposed by Sun as an emergency exit: a platform where the operating system didn't matter at all. No wonder Microsoft did anything and everything to shoot it down! (Although Sun did a pretty good job itself, by not releasing Java under a free license and hanging on to it way too tightly...)

So, why should Javascript succeed?

So, what is the difference between Java and Javascript? Why should Javascript succeed where Java failed?

Java programs needed a virtual machine that people had to download and install. It was a big download back then -- and a big install. There were several virtual machines -- Sun's, Microsoft's, and a couple of GPL ones that never seemed to work 100% fine. They were all sort-of compatible with one another, but not quite. Desktop Java programs didn't tend to run well: they seemed to need phenomenal amounts of RAM, especially if you ran them for more than a couple of hours. Java libraries kept on changing -- AWT, then Swing -- and so did the various "editions". When Java finally matured and was released under the GPL, many years too late, it was already obsolete.

Javascript has a very different starting point. First of all, everybody has it -- well, everybody with a browser, which is probably 99.5% of computer users today (if not more). The various versions of Javascript are compatible with each other -- or, I should say, "compatible enough". Yes, writing Javascript programs that will run on any browser is tricky at times, but it's also a requirement -- and it's definitely possible. Javascript programs have another advantage: they are actually useful. You can edit your Google documents wherever you are: from an Internet cafe in Nicaragua, or from your home. Plus, Javascript virtual machines are getting faster and faster -- in fact, they are getting stupidly fast, thanks to some very healthy competition. While there is browser fragmentation, there are only a few, very established browsers: Firefox, Internet Explorer, Chrome, Safari, Opera.
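To give a flavour of what "compatible enough" means in practice, here is a minimal sketch of the kind of feature detection cross-browser Javascript has always required. It assumes the classic AJAX situation: standards-based browsers expose `XMLHttpRequest` directly, while older Internet Explorer only offered it through ActiveX, so portable code probes for whichever object actually exists.

```javascript
// Probe for an AJAX request object in a cross-browser way.
// typeof is used so that an undeclared global doesn't throw.
function createRequest() {
  if (typeof XMLHttpRequest !== "undefined") {
    // Standards-based browsers (Firefox, Chrome, Safari, Opera, newer IE)
    return new XMLHttpRequest();
  }
  if (typeof ActiveXObject !== "undefined") {
    // Old Internet Explorer exposed the same functionality via ActiveX
    return new ActiveXObject("Microsoft.XMLHTTP");
  }
  // No AJAX support at all
  return null;
}
```

This is the whole trick in miniature: one function, one capability test per browser family, and the rest of the program never needs to know which browser it is running in.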

Today, when somebody thinks about a new piece of software to develop, they don't think about a stand alone application; they think about a web application. They don't think about which operating system it should run under; they know that is going to run in a browser. It might be Flash, it might be Silverlight, but most of the time -- luckily -- it's Javascript.

Wait, what about Silverlight?

Microsoft realised that the world was going online and got really worried about it. They dodged the Java threat (helped by Sun), but must have realised that the Javascript threat was bigger, because the world had changed and was now ready for change. So, they created and released Silverlight.

Silverlight is a typical example of how Microsoft does things: release something; make it "sort of open"; allow a free implementation, but by turning a blind eye more than actually allowing it; make sure the Windows version of it has some specific Windows-specific features, and that crucial software will use those features. There you have it: Silverlight (or Moonlight).

Silverlight has no chance. It came late, and the world is largely ignoring it. Microsoft will try to push it as much as possible. However, the world has changed under their feet. Developers want to know that they can reach every user they can get to: a semi-proprietary tool won't do that.

Nice try though.

The irony of Javascript: where are the free applications?

This is an argument that has been raised many times: the world is finally seeing a real liberation from the operating system, thanks to the free, standards-bound, multiplatform browser -- and, more importantly, thanks to incredibly fast Javascript virtual machines. But... is that really light at the end of the tunnel?

Maybe not. Right now, at the end of the tunnel there are thousands of online applications that are not free at all. They are sometimes (not always) free as in beer, and definitely not free software. Google Documents is not free software. Zoho is not free software. Basecamp is not free software (and in fact, you pay real money for the right to use it). The list goes on and on.

So, we have these fantastic (and free) toolkits to develop AJAX applications, fantastic (free) Javascript virtual machines... and everything is basically used to create non-free applications. We will never see their source code, and will never be able to do much with them.

Maybe free software is shooting itself in the foot?

The counter-argument: is this really bad?

I am now going to make an argument that probably shouldn't belong here. It's an argument a very smart colleague of mine made once -- and it's stuck ever since.

Did we ever complain if the source code of a web site wasn't available? It's a weak argument, but it's a valid one nevertheless. To me, even more than the source code, there are two major components that need to be free:

  • The data. If I have data stored somewhere, I should be able to access it from several online applications. Unfortunately, that's not the case right now -- think about Google Documents, where all your files are sort-of locked in a Google world, a sort of file explorer that leaves a lot to be desired. The same applies to most other online applications. Yes, it's generally easy to import and export files, but that's just not quite enough.

  • The format. This is another really important point. Luckily, it looks like we managed: we have ODF -- the Open Document Format. Most online and offline office suites today, free or non-free, support ODF. Microsoft Office "sort of" does, but that's the usual Microsoft story, and nobody is surprised: files don't look right, importing is painful, and so on. My point is that if all of your documents are stored in an open, free format, then you shouldn't have a problem.

Note that once these conditions are true, free software alternatives to proprietary online programs are likely to pop up.

Back to the "world's CPU" argument

All this wouldn't be possible without that little scripting language created some fifteen years ago to spice up web pages. Yes, that's Javascript. Some people hate it, some people love it, but the world definitely needs it. Google, with Google Chrome OS, is betting a lot on it. Visual environments like Lily are starting to pop up -- and you can bet there will be more and more.

We live in an online world, and Javascript is the new world's engine. It's open, it's free, it's powerful, and it's managed to reshape the computer world.

Whoever bet on it was definitely in the right place at the right time. Would you have ever imagined?



Submitted by feranick on

The scenario described in this article is somewhat restrictive, as it gives full credit to JavaScript. It is, however, the combination of JavaScript, HTML and CSS that makes sites like Google Docs what they are. The role of HTML will be even greater as HTML 5 gains market share. You cannot easily replace Flash and Silverlight with JavaScript alone. But you easily can with HTML 5.

Submitted by AmyRose on

Here we go again. No, the cloud is NOT taking away the need for desktop applications. Most people want to store their documents locally, not on some random company's servers. I don't know of anyone who would use a website like Google Documents to do most of their work. Yeah, sure, it has some advantages, but most people I know don't use it for anything. Of course, we also have games. And stop with this "everybody develops for the web. Nobody develops for the desktop anymore." It's BS and you know it. Last I checked, Free desktop applications were continuing to be developed actively, like Empathy, Rhythmbox, Pidgin, OpenOffice.org, AbiWord, GIMP, Inkscape, etc. There are also some newer FOSS applications out there.

Terry Hancock's picture

While I'm not certain that Tony was seriously trying to tell us that free software apps on local CPUs are obsolete, I have to agree that it is emphatically not true -- at least for serious computer users.

For core computer users, there's no substitute for having your data right here, right now. I already get burned by using my gmail account -- whenever there's an outage, I can't even read my old mail (fortunately, I don't keep a lot of important data there!).

Also, internet applications really only work for low-data-rate applications. Even the best broadband available is a far cry from the bus speeds on the inside of your computer! So for graphics and multimedia applications, I'm afraid the ribbon still goes to native apps.

What happens to the masses who don't really know they are using a computer, is another story. The heyday of the PC may indeed be passing, in favor of more compact, more specialized devices like "smartphones" or "netbooks". And in those cases, distributed "cloud" computing may be more important so that the device itself is less important.

That creates some threat for free software -- which I think may be Tony's point. Because those apps are not distributed (but merely provided "as a service"), they don't need to publish their source code, even if they are based on copylefted software. This is kind of old news, but it also explains why Richard Stallman is so down on "cloud computing" -- too much reliance on it is a danger to "user freedoms".

I think the importance you and I place on keeping our apps local is basically just an expression of our own greater concern with holding onto those "user freedoms", though we may express this in more pragmatic terms.

Submitted by Ingotian on

This is about trends, not absolutes. For most people, web based applications have the capacity to do all the things they need: get to Facebook, send e-mail, write notes and messages, handle their photos. For some people that might not be the case for a long time. Not too long ago Unix was only available on minicomputers; now it runs on cell phones. Not long ago a 56k internet connection was considered fast. Water flows downhill even if it gets temporarily diverted from time to time. The rate varies, but nevertheless it gets to the sea.

Change takes time -- that is why there is still a vast majority of Windows users. There will definitely be a persistent desktop market for a long time to come, and a lot of it will stick with what it knows. After all, some people still use typewriters, and we still spend a disproportionate amount of time teaching handwriting in schools rather than keyboard skills, when most kids are going to type a lot more than they write by hand.

In my own company we are moving more and more to the web. I put most documents on Drupal pages, not in OOo; I share business plans using Google's on-line spreadsheet; we are evaluating Ubuntu One to get rid of the need for a local server. Why? It's simply more convenient. We have several 3G broadband connections as well as the main cable one, so the internet going down is not really an issue (and outages are rare these days). We have two independent web hosts, so we can back up between them, and they back up everything 4 times a day for us in any case. So it's actually less expensive and less hassle.

While I currently have a desktop computer, a netbook and a smartphone, I can see a time when the smartphone-internet combination, with a decent size screen and keyboard, is good enough. If it does the job and is less expensive it will take off, particularly if the Windows lock-in is broken.

Where does free software fit? Well, the web servers have to run an OS, my smartphone is Android based, Javascript has already been mentioned, and no doubt popular apps will get repurposed for the web, so you will be able to choose either a desk based or a web version. Many providers can support cloud computing -- we use both Canonical and Google. I don't see the same lock-in problems as with Windows, because open standards are becoming more and more mandated, and the cloud is likely to be an interoperating environment where there is at least some choice of providers, even if a big one like Google becomes dominant.


Terry Hancock's picture

Your post suggests that centralization is an unavoidable trend, but I don't think that's true. Technology allows both centralization and de-centralization, and both trends can be seen in the history of computing.

Centralization is as much about politics as it is about the technology: who do you want to have control over your data? On the one hand, you can manage it yourself, with the caveat that you'll only have yourself to blame if you fumble and lose it. On the other hand, you can trust it to a corporation who is probably more reliable in principle, but has no sense of loyalty or personal motivation to protect your rights or needs.

A lot of us would rather take the chance on our own abilities than to take the risk of being subject to someone else's agenda.

Submitted by Ingotian on

People have been shown to have the characteristics of sheep; Windows is the evidence :-) We are talking mass market here. Most people that use technology already use centralised technologies: cell phone providers, MSN, YouTube, Wikipedia, Google Maps, AOL, etc. There will always be individuals who don't conform to these trends -- 1.5 million SMEs in the UK have no web site, but even that is changing. When you say "a lot of us", you mean a diminishing minority when compared to the whole population. But choice is good, and I don't see choice ending.


Submitted by Ingotian on

This is why we have a strategy to teach Javascript in schools.

1. Schools have a real problem installing software because of network security. Teaching and using Javascript avoids that.
2. Javascript code is immediately available by viewing the source in the browser. We can show a small program and how it works.
3. Javascript programming demonstrates open source instantly.
4. Students can work on their Javascript projects at school or at home.
5. Students can learn by modifying small web based puzzles and games to produce new ones for their friends.

Here are some examples in development.


Take a puzzle, change it and share it.

Anyone who wants to take part, feel free.


David Sugar's picture

In some ways, Javascript became what Java was originally envisioned to be (for those who remember Java applets...): delivering a universal, platform-agnostic client-side experience in a browser. In truth, it has been much harder to do so portably with Javascript, because of broken document models and compatibility issues in certain browsers; yet it largely has happened despite even these fundamental flaws, so it is very valid to reflect on what this means going forward. Another interesting place for Javascript in the future is entirely off the browser, as a platform/native client scripting language.

Submitted by Ingotian on

If browsers become increasingly standards compliant, which seems likely, Javascript will get used more and more, with increasing support as more and more libraries and routines spring up. I can see a time when Visual Javascript, or something similar, makes it very easy to develop quite sophisticated applications without installing anything special. Extending this off the browser would be a natural and logical development.


Author information

Tony Mobily's picture


Tony is the founder and the Editor In Chief of Free Software Magazine