Does free software make sense for your enterprise?

“Dude, I can, like, totally do that way cheaper with Linux and stuff.” These were the words of a bearded geek running Linux on his digital watch. As he proceeded to cut and patch alpha code into the Linux kernel running on the production database system, the manager looked on in admiration. Six months later, long after the young hacker had decided to move into a commune in the Santa Cruz hills, something broke. Was it really “way” cheaper?

Nostalgia and first impressions

I remember the first time I actually opened up a Sun server after years of using them remotely. Coming from a background of PC-based hardware (machines that were somehow deemed servers), it was a memorable moment. Some obsessive-compulsive engineer with an eye for detail had obviously spent countless hours making sure that the 2-inch gap between a hard drive and an interface port was filled with (gasp!) a 2-inch cable. Each wire was color-coded, the appropriate length, and carefully held down with ties and widgets to ensure that it never got in the way. Each server I opened was configured the exact same way, and each component matched—down to the screws (which were custom fitted and just right). This was a far cry from the 9-foot, 2-device IDE cable I dug out of the junk drawer to add a random hard drive held in place by duct tape. As the halo above the server lit up the room, I was suddenly struck with a justification for the hefty price tag for these magical machines. Quality.

The SPARCstation 20, as beautiful as a rose

The same quality control that went into packing the server went into the hardware support. Sure, I couldn’t connect my Colorado tape drive, digital camera, or latest toy to the back of these servers as I could to a PC, or use a really cheap hard drive made by some no-name vendor, but all the supported hardware that I actually did connect was controlled by a perfect driver and was carefully integrated into the operating system. It all just worked. Magically. If it was on the support list, you knew that some team of detail-oriented engineers had taken care to make sure that there were no flaws.

Someone had to pay for all of this, hence the hefty price tags, but I knew that most of the servers I was running my mission-critical applications on had been deployed when I was in diapers. Baby diapers, not the diapers that people who remember this type of quality are wearing today—or the ones I’ll be wearing when these servers finally crash.

The alternatives to Sun Microsystems, IBM, Solaris, AIX, HP, HP-UX and all of the other commercial software running on proprietary hardware were untested. Linux was in its infancy, focused on desktops, and attempted to support everything you plugged into it. It was a time of instant gratification and first-to-market thinking, which was necessary for Linux to gain acceptance in the desktop world. If a digital camera worked in Windows, the Linux community had better jump on providing support for it in their world. This led to terrible drivers, kernels with goto statements, and functions with comments like “Uhm, we’ll get to this later. Return 0 for now”. It led to instability when these systems were used as servers. The BSD community, while providing more stable software, wasn’t seen as sexy and didn’t gain as much acceptance. FreeBSD, NetBSD and OpenBSD were “theoretical” operating systems that were coded well but weren’t supported enough to provide integration with some of the more common components in use in current infrastructures. Additionally, more focus within the BSD community was spent on ensuring software was functional (and up to standards) than pretty—which led to a lack of usable interfaces. This gap seemed to propel commercial offerings more and more. They were well programmed, and the vendors had enough resources to cover both usability and stability.

When the engineers, system administrators, programmers, and jack-of-all-trades geeks move up and finally become managers (or are forced into the role), they remember these days. And they associate Sun Microsystems, IBM, and other giants with this type of quality. To them, the quality they first saw and admired is still around today. Decisions on what to use are made by these new managers, and those decisions aren’t necessarily made with the same sane principles they would apply to managing staff.

Who runs this thing?

Use of free software and alternatives to expensive proprietary hardware went crazy during the early days of the new AOL—err, internet. Instead of the goal being stability and an infinite life, new companies were satisfied with getting something out quickly; upon gaining venture capital or selling out, they could “move up”. The investment required for the commercial side was just too much for a fledgling company. But as we all know, a temporary solution usually becomes a permanent part of your infrastructure. Even worse, a temporary solution in place until more funding is available will outlast the company CEO.

Adding to the equation, the open source and free software movements were growing, and the quality of their software was definitely increasing, providing a reasonable solution that wouldn’t instantly crash.

Unfortunately, the problem facing management was responsibility and support. If there was a problem with software written by a 16-year-old kid in Finland, who was responsible? If an administrator walked in and deployed a quick solution to fix your immediate needs, who would support his undocumented idea when he left? Employees leave, and they are no longer expected to grow old with the company (especially during the years of high-school-age CEOs) as they once were. This creates the need for a redundant array of inexpensive administrators, and the prerequisite for that is an infrastructure that can be supported by many.

Your free-software-based system running alpha drivers for that new array to save you 500 bucks? Gone: you’re the only one who can run it. Your sendmail infrastructure optimized to take advantage of inferior hardware, using custom interfaces for account management? Gone: welcome the appliances.

The need to have software and hardware maintainable by anyone with a base level of experience has replaced making the most of the hardware and software in your infrastructure. If the configurations aren’t default, the performance improvement that you might be giving them with your take on things just won’t be cost-effective. Look at it like blackmail or extortion. You walk into a company, “save” them hundreds of thousands of dollars on software, get them locked into your infrastructure, and then demand a huge raise just to keep it going because no one else can. By the time they’ve moved their operations towards your infrastructure, they can’t easily go back. Ironically, the same can be said for commercial software—even software based on open standards. That one extra little feature that Microsoft has in their implementation of a product over the free software version will lock your company into using Microsoft products for all eternity. All the while, your company will feel that it could easily move away and use any vendor because, hey, it’s an open standard!

Consistency in project life cycles

OK, so I used a buzzword, but my hair isn’t standing up into a point. This matters. There are processes in place for the development of software, quality assurance testing, and validation that happen before software reaches the customer. In the commercial realm, there are people paid to do some of the most boring tasks just to make sure they get it right. While 30 QA engineers aren’t necessarily going to be as good as a public user community, they are consistent in their operations and try to make sure that nothing slips through the cracks. The user community will often test a few things here and there but won’t go through the tedious task of making sure some of the most basic operations work with each subminor revision of code. If something is missed, the assumption is that someone else will find it and any potential problems will be taken care of.

These things are boring, but someone has to do them. Programming, QA, and the other areas of software development each have a defined process that needs to be followed. The same is true for deployment within your infrastructure. What would you do to bring Postfix, which is amazing software, into the mix within your environment? Most people would take some of the old email infrastructure, validate a few things here and there (to make sure no users requiring the delivery of mail are missed), and flip the switch. An old legacy host doesn’t speak well with your email gateways? Uh oh, you overlooked that. Important emails being inappropriately tagged as spam? My bad, sorry. These mistakes happen, and because these little things were overlooked in your haste to show off Postfix, someone is going to look bad.
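
To make that validation step concrete, here is a rough sketch of the kind of pre-cutover audit that catches forgotten users. It assumes the swaks SMTP testing tool is installed, and that mx-new.example.com and recipients.txt are placeholders: the candidate Postfix host and a hypothetical dump of every address found in the old infrastructure.

    #!/bin/sh
    # Sketch only: walk every known recipient and confirm the new host
    # will accept mail for it, quitting each session after RCPT TO so
    # no test messages are actually delivered.
    NEWMX=mx-new.example.com      # placeholder for the new Postfix host
    while read rcpt; do
        if ! swaks --server "$NEWMX" --from audit@example.com \
                   --to "$rcpt" --quit-after RCPT \
                   </dev/null >/dev/null 2>&1; then
            echo "check manually: $rcpt"    # rejected or unreachable
        fi
    done < recipients.txt

A checklist like this won’t catch everything (the mistagged spam needs its own testing), but it turns “I think everyone is covered” into something you can show your manager.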

Try deploying any commercial package (especially with a support contract). All of the little caveats you run into will surely be documented by the vendor. A list of supported products will also be given so you know which integration efforts will require a bit of extra work or should remain pointed to an interim solution. And if all else fails…

Who’s to blame?

We’re a society of lawsuits and passing the buck. Slip and fall while running the wrong way on an escalator drunk? The first thing someone will say is that there wasn’t a sign telling you not to do that. Sue someone for your own stupidity. Fall behind on a project and lose your bonus because of a bug in software from a vendor? The threat of tarnishing their reputation lets you strong-arm them into giving you anything you want. Big or small, there’s someone on the other end of the software with a livelihood.

Have a problem? Blame sendmail.com, the commercial side

The only person to blame when free software fails is the person who deployed the software. But the person in charge of your organization can’t just say “Oh, that crazy Ted! He did it again!” and get away with it. Heads roll when something bad happens. If the bad thing is free software, the heads are internal. If the bad thing can be blamed on a vendor, then more people within your organization are safe from the axe. Companies like it when there’s someone outside the organization who can be blamed for a failure.

Sun and other large vendors have teams of highly trained engineers ready to parachute into your company at a moment’s notice. All are trained consistently, all read from the same handbook, and all can come in and give you 2080 hours of work in a weekend just to get you up and running. Try doing that by bringing in the same people “trained” on free software. Each has their own idea on how to do things, no consistent sets of manuals, and, for the most part, all from different companies. There isn’t a single source of gurus who know the same Linux implementation who can run out to help you when you’re in a bind.

At the same time, a list of truly supported packages will be there. If you try to integrate a commercial package with something that isn’t supported, there are no guarantees. There’s no “It should work” here. These certifications by the vendor are often done after extensive testing from both sides—the package being deployed and the package being integrated. They often leave you with an all-commercial environment, as no one is going to have the time or money to make sure a product will work with the latest free software version of anything. These things set free software back a bit but let your management rest a little easier at night. After all, if Microsoft certifies something will work, it will, right?

If you can’t beat ’em…

Check out sendmail.org, the original free version of the software

The open source and free software communities saw some of the same issues that I just ranted about from my proprietary soap box. Their response? Sendmail, software so established that it has been the victim of a love-to-hate mentality, never had much competition in the commercial realm for much of its life. Eric Allman (father of sendmail) could have chosen to sit quiet and do nothing to further sendmail. Instead, he chose to create a commercial version of the software, offer support, and create an alternative to the commercial email packages that were up and coming but not yet a threat. The end result was the same sendmail everyone was already running, with the added aspect of accountability and a helping hand. This ended up being a good move because, when the dot com boom happened, the former telemarketer turned “engineer” didn’t understand even the new set of simpler m4 configuration macros used within sendmail, let alone anything that required more than 3 mouse clicks and a “Duh!”
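
For readers who never met them, those m4 macros look roughly like the sketch below. The include path is the Red Hat-style location and varies by platform, and relay.example.com is a placeholder; the point is that a handful of readable macros expand into the notoriously cryptic sendmail.cf:

    divert(-1)
    A minimal, illustrative sendmail.mc; paths and names are examples only.
    divert(0)dnl
    include(`/usr/share/sendmail-cf/m4/cf.m4')dnl
    VERSIONID(`illustrative configuration')dnl
    OSTYPE(`linux')dnl
    define(`confSMART_HOST', `relay.example.com')dnl
    FEATURE(`access_db')dnl
    MAILER(`local')dnl
    MAILER(`smtp')dnl

Running “m4 sendmail.mc > sendmail.cf” then generates the full configuration file. Simple enough, but apparently still too much for our telemarketer.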

Check out apache.org, home of quality projects near you

Apache, like Sendmail, has always had a stable following. It lived through commercial pressure (though one can always question its legitimacy) from Netscape Commerce and Enterprise Servers, iPlanet, IIS, and a slew of poor-quality commercial versions of the same kind of software. A foundation was started, and an entire community grew to support each other and their relevant projects. While there wasn’t necessarily someone on the other end of a phone line, there was an established group that your managers would have a hard time convincing you would magically disappear.

Sendmail and Apache moved away from the unorganized efforts seen in the free software community when it was smaller and less significant. This gave them credibility and made some of their software a viable replacement for commercial products.

What can be done?

What can you do about all of this, if anything? Don’t be the bearded geek I mentioned at the beginning of the article; put in some extra effort to make sure that what you deploy is supportable. Baby steps. You won’t win anyone over right away, but when you deploy a free replacement for software, document the process, live by standards, and make sure that the life cycle of your project involves a significant amount of quality assurance, integration and interoperability testing. Put in the extra time to see what sorts of processes exist within your company and what will make everyone comfortable. Take the extra effort required to follow these procedures. Not only will you earn respect, you might learn something, turn up a bug, and let management view your free software deployment in a better light. Don’t just “do it” and expect everything to work magically with your 3am software install.

The community as a whole should also spend time making sure that its software meets the requirements set by other vendors for interoperability. Sun and Microsoft have strict sets of guidelines that must be met before they’ll list a product as fully supported. While expensive, going through the process of certifying free software as fully interoperable with commercial packages will give a big boost to the movement.

One problem area that people fall into is the latest-and-greatest upgrade cycle. The stability of a system goes down if you keep upgrading your kernel, libraries, and applications to the latest revision. There’s really no reason to fix something that isn’t broken. In many environments, constantly upgrading libraries will cause components to fail. It will also cause custom code written by programmers in your company to act differently. The lack of backwards compatibility in some packages just doesn’t work in a production environment. Sure, these problems exist in the commercial realm as well (and the hiccups are even worse), but those systems aren’t upgraded as often. A patch upgrade every 6 months or a year (with minor security updates in between) isn’t going to create as big a problem as a weekly kernel upgrade done just for the sake of a kernel upgrade. It might be boring to sit around and wait while your office servers fall a few revisions behind your desktop at home, but it’s worth it. Schedule significant system upgrades after testing in development environments (you know what those are, right?) for a while. Create integration environments so that developers and application groups can make sure their code is going to work with your proposed update. Stay behind the curve a little bit and let others find the bugs in the new code before you’re running it on a system that must be up 99.9999% of the time.
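
On a Debian-style system, one way to institutionalize staying behind the curve is apt pinning. This is only a sketch; the priorities are a matter of local policy, and the release names assume a standard Debian archive:

    # /etc/apt/preferences -- prefer the stable release; packages from
    # testing are visible but will never be installed automatically.
    Package: *
    Pin: release a=stable
    Pin-Priority: 900

    Package: *
    Pin: release a=testing
    Pin-Priority: 200

With something like this in place, an administrator has to make a deliberate, documented decision to pull in newer code, which is exactly the kind of friction a production environment wants.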

Discipline, process, and doing a few extra tedious tasks will give everyone a better impression of some of the solutions you’re proposing. Maybe, just maybe, quality software will catch up with the vendors. Maybe the smiles on their faces won’t be so big and their thinning bank accounts will make them realize that we should all work together to create better code and worry more about things that matter—not just the bottom line.

Bibliography

Jackiewicz, Tom. Deploying OpenLDAP. Apress, 2004.


Comments

From: Robert Pogson
Url: http://www.skyweb.ca/~alicia
Date: 2005-12-03
Subject: free software make sense for my enterprise

The biggest problem I see with the adoption of free software is trying to maintain a mix of free and proprietary like GNU/Linux mixed with Windows. Microsoft does not follow standards. They take most of a standard and change it a bit so that software following the standard cannot work well with Windows. I recommend cold turkey switches to Linux to avoid these complications. This works well with small to medium operations where the usual servers can be replaced in a day and the usual desktop installations can be replaced by Linux Terminal Server in a day. If this is done on a non-working weekend, there is plenty of time to diagnose difficulties and fix them or revert to the previous state. If the operation is too large or this abrupt change is not feasible, I recommend using free software in the Windows environment until the nerves calm down and the training period passes, and then switching to Linux one department at a time. Either way, the long term gain is worth a bit of short term pain.

Most of the problems I have seen lately are not problems with free software, but the perception of it with the potential users. Only last week a professional in my organization told me Linux was pirated software and we should not use it... Last month our chief technologist told me he was worried by the thought of adding 20 PCs to our system let alone running Linux on them. I gave him a list of 12 advantages of using LTSP (Linux Terminal Server Project) with the new machines and all he could come up with was "but our users are used to Windows... and we would like to do that but we do not want to maintain two systems". At the same time, our techs are running themselves ragged trying to keep up with routine problems of Windows systems. Some problems do not get fixed for weeks. I pointed out that we do not need anything that Windows offers that Linux cannot supply and that there would be much less work maintaining a few terminal servers compared to hundreds of thick clients. We are still using Windows except in my department where we use both and my Linux users have no problems and my Windows users are a burden. It is like standing outside in bitter winter weather because we are afraid to open the door.

From: Eduardo Cruz
Url:
Date: 2005-12-10
Subject: free software

This comment by Robert Pogson is so typical of the Linux/free software devotees..."Most of the problems I have seen lately are not problems with free software".

Always, the problem is the users (dumb), Microsoft (always must have a hand in it), the rest of us, who are not clever enough to understand the higher meaning of things, etc., etc. We in business need things to work, and if they do not work, I want someone here to fix it, yesterday. Whether they cost little or lots of money, usability is the key!

I have been running some version of Linux (Suse now) in dual boot windows machines for several years now. On a laptop, while running Linux, the computer crashes and reboots quite often. On desktops, it crashes my network or itself. So, in the dependability arena I would not ditch windows for it. I run Linux for the experience, but in fact, there isn't any reason to have it. What can I do in Linux that I cannot do in windows? Nothing. At least in windows my fonts are better on my eyes thanks to Clear Type.

Because it's free? My time is too valuable. In many ways it is more expensive, if in fact you were to help those companies doing some of the heavy work in Linux and buy it. Suse Linux costs around $80 retail with manuals and 90 days of support. Anyone can buy windows home edition for about $100-120, and it works, period.

Look at the most successful free program, OpenOffice. Who has done all the work for OpenOffice? It has not been the so-called community, but the 100 or so programmers working at Sun. You need that continuity to make it happen.

I just happen to be able to fix the things that go wrong when they go wrong. How many hours has it taken me to get my wireless running in Linux? You don't have enough hands and feet to count. But how about the regular computer user? Why should he spend the hours and frustration necessary to get a Linux system running right? Most do not care, and that is why they stick with windows.

It is not a religion, as many want it to be. Right now windows works better, tomorrow, who knows? Maybe in a couple years, Linux, any flavor, will be ready to compete with windows, and I don't mean just to be able to load it on any pc, I mean to work from the get go, without having to spend countless hours tinkering with it and posting and reading in forums to get things right. Build me a better mousetrap and I will buy it.

Don't tell me about updates, etc. I have to update Suse just as much. Thank god for my broadband! And forget about viruses...who wants to write a virus for a three percent user base when you have the 800 pound gorilla to wreak havoc with?

Like the article explains, Linux followers love their once-a-month updates, their new kernel number (whatever .5667 it is at the time), and their new KDE 3.5 upgrade that either crashes, runs slow, or may work just fine on your system. That creates chaos in a working business environment.

You better start listening to the likes of the writer. He has free software's best interests in mind. For myself, I like to see some competition in the field. It makes all the companies edgy. That is why I want Linux to stick around. But you can't hide behind excuses all the time if you do not want Linux to end up like those great little OS's of the past. Remember CP/M, DR-DOS, Amiga, Atari, OS/2, etc.

From: Tony Mobily (SUBSCRIBER!)
Url: http://www.mobily.com
Date: 2005-12-10
Subject: Not quite

Hello,

-------

QUOTE:

We in business need things to work, and if they do not work, I want someone here to fix it, yesterday. Whether they cost little or lots of money, usability is the key!

-------

Tell that to companies who have to buy, let's say, 3000 licenses for Windows AND Office.

-------

QUOTE:

I have been running some version of Linux (Suse now) in dual boot windows machines for several years now. On a laptop, while running Linux, the computer crashes and reboots quite often. On desktops, it crashes my network or itself.

-------

This is when I stopped actually believing your post - sorry.

I know a _lot_ of people who say the exact opposite. I've never seen a Linux machine "crash" unless there was a serious hardware problem.

It's possible that your laptop has some serious compatibility issues with Linux - which is rare, but it can definitely happen. However, writing "Linux crashes, Windows doesn't" is still unfair.

-------

QUOTE:

So, in the dependability arena I would not ditch windows for it. I run Linux for the experience, but in fact, there isn't any reason to have it.

-------

I suggest you delete it.

-------

QUOTE:

Nothing. At least in windows my fonts are better on my eyes thanks to Clear Type.

-------

Here we go again. The font problems were there because of patents held by Apple. Note: I said "were". Fonts now look just about the same on Linux, Windows, Mac, etc.

The trouble is, the odd person reading this page might actually believe you.

------

QUOTE:

Because it's free? My time is too valuable.

-------

Great. Use Windows.

-------

QUOTE:

In many ways it is more expensive, if in fact you were to help those companies doing some of the heavy work in Linux and buy it. Suse Linux costs around $80 retail with manuals and 90 days of support. Anyone can buy windows home edition for about $100-120, and it works, period.

-------

Here we go again. Ubuntu is _free_ and it's _fantastic_. Casual reader: go and visit http://www.ubuntu.com and don't believe the post. You don't have to spend $80 for Linux.

--------

QUOTE:

Look at the most successful free program, OpenOffice. Who has done all the work for OpenOffice? It has not been the so-called community, but the 100 or so programmers working at Sun. You need that continuity to make it happen.

---------

Casual reader: don't be misled. OpenOffice is only the best known of many successful free programs.

Also, OpenOffice has a strong community; anybody can contribute to it - and in fact people DO. Visit here: http://contributing.openoffice.org/index.html

-------

QUOTE:

I just happen to be able to fix the things that go wrong when they go wrong. How many hours has it taken me to get my wireless running in Linux? You don't have enough hands and feet to count. But how about the regular computer user? Why should he spend the hours and frustration necessary to get a Linux system running right? Most do not care, and that is why they stick with windows.

-------

This is unfortunately true. Some of the hardware support is sometimes tricky. Again, as far as wireless is concerned, things got a lot better over the last 2 years. Visit this for wireless:

http://www.hpl.hp.com/personal/Jean_Tourrilhes/Linux/

The post doesn't seem to acknowledge that. Oh well.

--------

QUOTE:

It is not a religion, as many want it to be. Right now windows works better, tomorrow, who knows?

--------

Did you just say "Windows works better?" Right.

--------

QUOTE:

in a couple years, Linux, any flavor, will be ready to compete with windows, and I don't mean just to be able to load it on any pc, I mean to work from the get go, without having to spend countless hours tinkering with it and posting and reading in forums to get things right. Build me a better mousetrap and I will buy it.

---------

This will ALWAYS depend on what hardware you buy. Hardware makers are already (and finally, I may add) creating machines that are guaranteed to work on Linux. This is not a "Linux problem". This is part of the normal evolution of an operating system. Now that enough people are using it, and the market is significant, hardware makers will make PCs and laptops which will definitely work on Linux, 100%.

--------

QUOTE:

And forget about viruses...who wants to write a virus for a three percent user base when you have the 800 pound gorilla to wreak havoc with?

---------

Question: what's 3% of TEN MILLION?

Do you know that it's much, much harder to create a virus for Linux than it is for Windows?

---------

QUOTE:

Like the article explains, Linux followers love their once-a-month updates, their new kernel number (whatever .5667 it is at the time), and their new KDE 3.5 upgrade that either crashes, runs slow, or may work just fine on your system. That creates chaos in a working business environment.

----------

In a business environment, install the latest STABLE Suse, latest STABLE Ubuntu, latest STABLE RedHat, and you're set.

There isn't much anybody can do if you get annoyed at people trying out the latest pieces of software.

-------

QUOTE:

But you can't hide behind excuses all the time if you do not want Linux to end up like those great little OS's of the past. Remember CP/M, DR-DOS, Amiga, Atari, OS/2, etc.

--------

Unlike those "great little OS's of the past", Linux's adoption is *growing* _exponentially_.

Did you notice?

Merc.

From: Robert Pogson
Url: http://www.skyweb.ca/~alicia
Date: 2005-12-13
Subject: We in business need things to work,

That is an excellent recommendation for Linux. Let me tell you what Windows users have had to put up with in my organization:

For three months, 24 new staff and all the students have not been able to print on a common Laserjet. The techs had set up every machine in the building with a new "profile" that did not include a printer definition. After repeated technical service requests, this item finally made it up the queue and was fixed. The day after the techs left, the students could print, but none of the staff could without an "add printer" operation. Our techs are smart people and quite competent, but after spending hours trying to get XP Service Pack 2 onto all our PCs and getting only some to install, they ran short of time and had to rush off to the next location. This is on a system locked down so tightly that no one can wiggle. The hard drives are re-written at every reboot to combat malware. The techs are supposed to be able to maintain the software remotely, but often they cannot get in because Windows ignores them.

On the other hand, I have a single Linux server to maintain and all my users have been able to print from day one. Not one of my users has lost a file either. My system goes for weeks without a reboot and runs most of the processes for 22 users simultaneously whereas our Windows systems run processes for one user and many files have been lost due to freezes. I would say Linux works here and Windows does not. Last year I was in a shop where Windows worked pretty well except that the scanner could not be used after Service Pack 2 and the computers in the lab needed to be re-imaged. It took many hours to do this. Microsoft asserted its right to install Service Pack 2 without our permission and broke our system.

From: Richard Corfield
Url:
Date: 2005-12-11
Subject: Processes, design outweigh the "Free/Proprietary" choice

Yes, you can patch alpha software into the Linux kernel and make all sorts of interesting things happen. You can develop on DotNet2 beta using experimental new features if you wish. If your business needs that kind of cutting-edge software then it can employ people to manage it. Some businesses do.

So much of what many businesses want can be achieved with commodity Linux software. Install Apache or Tomcat or JBoss or MySQL or Postgres or Eclipse. This is software that a lot of people know how to use. Set up proper processes and documentation - something missing in too many companies I expect! - and there's no need to worry about managing it.

Case in point: the company I worked for lost their Linux guy for 6 months with cancer. The Linux systems didn't fail in this time, but it wasn't a risk anyway.

The entire process to recover the automated backups onto clean PCs, using original media for the software, was documented and tested, and the other UNIX admins in the organisation were familiar with it. We had them try it using the previous night's automated backup and tested the resulting system. The same instructions were used in other offices once the initial pilot was deemed a success.

Part of the decision to try Linux was based on the knowledge that the software (Apache, Tomcat, an in-house Java application and source control in CVS) was available for Solaris should the Linux experiment fail. Potential permanent loss of the main Linux guy was not a risk.

For a process like this to work, it must be repeatable. A lot of the good Linux software is highly modular and loosely coupled. This is a good software design goal in its own right, and true open standards help in this goal.

Where in a tightly coupled (heavily integrated) system a small change in one module can cost hours of support or developer time to adjust, the more loosely coupled system is more resilient. There is far more freedom to deal with a small change to Apache or Tomcat, or even to completely swap them out if need be, without breaking the rest of the system.

Author information

Biography

Tom Jackiewicz is currently responsible for global LDAP and email architecture at a Fortune 100 company. Over the past 12 years, he has worked on the email and LDAP capabilities of the Palm VII, helped architect many large-scale ISPs servicing millions of active email users, and audited security for many Fortune 500 companies. Jackiewicz has held management, engineering, and consulting positions at Applied Materials, Motorola, and Winstar GoodNet. He has also published articles on network security and monitoring, IT infrastructure, Solaris, Linux, DNS, LDAP, and LDAP security. He lives in San Francisco’s Mission neighborhood, where he relies on public transportation and a bicycle to get himself to the office, fashionably late. He is the author of Deploying OpenLDAP, published by Apress in November 2004.