Friday, March 19, 2010

Macs in the Enterprise

Actually, this is about more than just Macs in the enterprise. I recently had a rather good conversation about Apple taking over computing in the enterprise. It just goes to show how old I've become that I find myself justifying the existence of big, evil IT.

Dell, HP, etc. have business lines of PCs. They tend to be the less-than-exciting computers that seem to carry a price premium over what an enterprising geek could build themselves, or what you could buy on special from the same vendors. However, the vendors guarantee that the hardware won't change and that the exact models will be available for a certain length of time, so an office can standardize on a single model and still buy the same one for a new hire a year later. You also know when a model will be discontinued and new ones introduced, so you can plan refresh cycles accordingly. Apple is rather lousy at this, as are other vendors targeting consumer markets, where agility in introducing new gizmos matters more. Then there are the vendor / consultant / VAR / third-party partnerships, but those are boring and straightforward.

Then there is the argument that Apple is less friendly to centralized IT as far as available management tools go. In the Windows world, a well-run IT shop can have more or less complete control over its computers: where they are; what their service tags, serial numbers, installed software, patch levels, and hardware specs are; when they turn on; when they go to sleep; whether screens lock when the screensaver kicks in; who has what rights; and so forth. Bringing up the things that matter to centralized IT is often met with, "You can do that with a Mac, too." And it may well be true.
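To make that kind of visibility concrete, here is a minimal sketch of the sort of inventory query centralized IT leans on, assuming a Windows host where the standard wmic command-line tool is available. The specific queries and the plain-text handling are illustrative only, not how any particular management suite actually works.

```python
# A minimal sketch of a per-machine inventory query, assuming a Windows
# host with the standard wmic command-line tool on the PATH.
import subprocess

def wmic_query(args):
    """Run a wmic query and return its raw text output."""
    out = subprocess.check_output(["wmic"] + args)
    return out.decode(errors="replace").strip()

# Hardware identity: the service tag / serial number mentioned above.
serial = wmic_query(["bios", "get", "serialnumber"])

# Installed software and versions, as reported by Windows Installer.
software = wmic_query(["product", "get", "name,version"])

# Installed hotfixes, for tracking patch levels.
patches = wmic_query(["qfe", "get", "hotfixid,installedon"])

print(serial)
print(software)
print(patches)
```

A real management system would push such queries to thousands of machines and feed the results into a database; the point is simply that on Windows this plumbing is standard and well-trodden.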

But then there is the matter of the less-tangible aspects of centralized IT, analogous to institutional or bureaucratic inertia. An enterprise may have, say, a system that keeps track of inventory, another that does backups, and a third that pushes patches to computers. To switch to Macs, all three might have to be replaced. That's the easy part: you can measure how much money it costs and estimate how much the new computers might save. What's much harder to gauge is not only how much it will cost to retrain the IT administrators on the new tools, but how long it will take them to learn to use those tools properly.

Specifically, chances are that the IT organization has spent years figuring out which features work, which don't, which annoy users, which annoy admins, and so forth. They have an idea how long a given operation will take. They know how to troubleshoot issues. They know how to write scripts to leverage the existing systems. The way the IT organization is structured, the forms it uses, the documentation it keeps: all of these may be affected. Of course, bureaucracy shouldn't hinder productive activity, but in reality there are trade-offs. The extra work for IT may be offset by a more productive workforce, the new ways of doing things may be better than the old ones, and a migration is a chance to rethink how things are done and ditch historical baggage, as long as people are aware of the implications on the human, organizational, and process levels.

Another example: an organization has a network consisting of equipment from vendor X. Vendor Y sells equipment that is just as good, and cheaper. However, the organization has a decade's worth of experience with vendor X, along with many custom scripts and programs that work with X but not with Y. Each of these may be a little issue, but taken as a whole, it will take years to get back to the same operational effectiveness with vendor Y that the organization had with vendor X to begin with. If the depth of these intangible changes isn't communicated up the management chain, the CIO may be rather surprised that a simple vendor switch is causing so much heartburn, or isn't happening as fast as expected.
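To show why those scripts are the sticking point, here is a hedged sketch of the coupling involved. The CLI output formats and the parser below are hypothetical, not any real vendor's syntax; the point is that the parsing logic encodes one vendor's conventions.

```python
# A hypothetical script tied to "vendor X" CLI output. The regex encodes
# one vendor's formatting; equivalent gear from "vendor Y" reports the
# same facts in a different layout, so scripts like this silently break.
import re

# Hypothetical vendor X line: "GigabitEthernet0/1 is up, line protocol is up"
VENDOR_X_IFACE = re.compile(
    r"^(?P<name>\S+) is (?P<admin>up|down), line protocol is (?P<oper>up|down)"
)

def parse_interfaces(show_output):
    """Parse vendor X 'show interfaces' text into a list of dicts."""
    interfaces = []
    for line in show_output.splitlines():
        m = VENDOR_X_IFACE.match(line)
        if m:
            interfaces.append(m.groupdict())
    return interfaces

# Vendor Y might print "Interface ge-0/0/1, Enabled, Physical link is Up":
# the same information, but this parser returns nothing for it. Multiply
# by every monitoring, provisioning, and audit script accumulated over a
# decade, and the "simple" vendor switch stops being simple.
```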
