A few weeks ago, Frank Scavo of the Enterprise System Spectator linked to a post on “IT propaganda” from “CEO Blogger”, and I’ve been meaning to respond to these posts ever since. (I don’t usually write about this subject, mostly because I don’t want to spill anything about my clients, but it is actually my area of expertise.)
Frank Scavo actually asks the right question: “Is information technology a strategic investment that should be leveraged to produce a competitive advantage? Or is it a utility that should be managed to lowest cost once minimal levels of service are established?”
And answers it correctly, too: “[W]hether IT is a strategic investment or a utility cost depends on what the company needs information technology to do. If information technology is a key element of the firm’s product, service, operations, or strategy, then IT should be viewed as a strategic investment. … On the other hand, if information technology is not a key element, then it should be viewed as a cost of doing business, seeking to maintain acceptable levels of service with managed levels of risk, at the lowest cost.”
There are a few myths about IT, and the anonymous CEO at CEO Blogger has fallen into some of them without apparently realizing it. Here’s the key part of his post:
> The IT industry is always trying to convince those outside of IT that IT matters, that if you don’t spend enough money on IT it will hurt your company.
>
> But most of us are not convinced. IT is not a profit center, it’s a cost center. Once your IT department grows past the minimum size needed to maintain your company, additional money spent on IT is a loss. But IT is always trying to shake down extra unnecessary money in order to bleed away profits.
>
> I really can’t recall reading any company news releases about IT security breaches, but I certainly recall reading news about companies writing off tens of millions of dollars after an “investment” in new enterprise software failed and was abandoned.
As in every career, there are good IT managers and bad IT managers. (At all levels, including directors, CTOs, CIOs, and so on.) If you can’t hire the good ones, that may say more about you than about IT in general. (In particular, I’m referring to the comment about a “shake down” for more money.)
The truth is that whether IT can do what you want depends on what you want. If the CEO is not sufficiently clear about what he wants, and if his employees, from the CIO down to the assistant night-shift operator, do not sufficiently understand their duties, there is likely to be a gap between expectation and delivery. I see that gap a lot, and it almost always starts from the top down, not the bottom up. (When it comes up from the bottom, with IT suggesting all kinds of large projects to automate this or that, or to buy expensive off-the-shelf solutions that kind-of do what the company needs, it’s usually a sign of bad discipline in the business management, for not compelling IT to concentrate on providing services as requested. IT should never be allowed to run the show, any more than the accountants or lawyers or HR people should, unless the company’s business is the provision of IT services.)
This whole problem, of whether IT is a cost or a profit, relates to the “build vs. buy” debate: do you hire programmers and build an application, or do you buy COTS (commercial off-the-shelf) software and customize it? Well, it depends on your need. If you need a word processor, just buy something off the shelf: that problem is solved generically for everyone. If you need an access and identity management system (what I’ve been doing for the last couple of years), it depends on your requirements: are you selling your security, or something heavily dependent on your security (for example, are you a bank?), or are you using the AIM system to protect what you are selling (as an online retailer does)? If you are selling your security, you will get better results faster and more reliably by building the key components yourself than by buying and customizing COTS software. The reason is that you are seeking an edge in your business, not a reasonable solution that works for most people.
This effect grows as systems get larger. For example, implementing SAP or PeopleSoft (the whole package, not the accounting software alone) is almost always more trouble than it’s worth. You can easily spend more on customizing such a package than you would have spent doing the job manually. This leads to my next point: productivity.
IT cannot make a company more productive, except in limited ways. What IT can do is allow a company to take on jobs it could not have done at all without IT. For example, Wal-Mart has built a large amount of custom software, some based on or using COTS components and some built entirely in-house. This has not increased the productivity of its retail employees; rather, it has allowed Wal-Mart to operate a larger number of stores as if its purchasing managers were present in each store. The net effect of automating purchasing across all stores has been greater efficiency, achieved not by increasing productivity, but by doing things that were previously not possible.
WYSIWYG word processors, similarly, have eliminated the typing pool. Not by making typists more efficient, so that fewer are needed, but by making every employee capable of producing a well-formatted document as they compose it. In other words, a whole category of work – turning compositions into finished documents – has been eliminated. There is still frequently a need for tech writers or similar specialists to fine-tune documents, but there is no longer a need to turn notes into documents in the first place.
The extent to which IT can deliver benefits to the business depends less on IT than on the business. If a company has a competent IT department, competently managed, that department’s ability to deliver depends entirely on the requirements it gets from the business managers. A good example of a mismatch in this area is when a company’s business managers want something, but cannot explain exactly what they want, nor formulate requirements that will be testable at the end of the project. In this case, the application or infrastructure that emerges can probably be made to work, but it will almost certainly not be a good match for the business needs – especially if the business need has evolved. Combine this with the inevitable “vanity features” – filling in checkboxes that have no relation to reality, or giving someone a chance to “make an impact” on the project, or what have you – and you get a bloated project that doesn’t do its job. It is then usually IT that is blamed, even though the fault lies with the poor initial specification that IT was given. This is the most common set of circumstances I’ve seen in failed large IT projects: overreach combined with poor initial requirements.
(And, while I’m on the topic, guard against your CIOs being wined and dined by large consultancies. It’s not too unusual for large projects to be let on the basis of what amounts to bribery, and the results are usually less than stellar.)
In the end, the way to make IT work for the business is to make sure each project fits one of a few categories before approving it. These categories are:
- The project will do something already being done (either automatically or manually), but cheaper, so that the ROI alone justifies the project. (See the sketch below the list.)
- The project will do something new, that makes it possible for the business to do something new and profitable, and will cost sufficiently little that the business can still be run profitably. (SAP is a key example of where this often fails: the business sometimes cannot make enough profit to pay for the cost of building and operating the systems it needs to run SAP.)
- The project will do something which may or may not help the business, but where the costs are small enough that failure is not problematic. This is usually a research or proof-of-concept project. Where these work, they often evolve into a full-featured product or infrastructure.
- The project meets a legal requirement (such as SOX or HIPAA compliance), and that requirement cannot be met more cheaply with process.
If you can’t fit the project into one of those frameworks (without shoving and twisting the numbers), then the project will probably fail, and will likely be an expensive failure at that.
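To make that first category concrete, here is a minimal back-of-the-envelope sketch in Python. The figures, the function name, and the five-year cutoff are all hypothetical, invented for illustration; the point is only that the savings alone, with no strategic hand-waving, must cover the cost.

```python
# Back-of-the-envelope ROI test for a "do the same thing, cheaper" project.
# All figures are hypothetical; the point is that the project must pay for
# itself out of the savings alone.

def payback_years(build_cost, annual_run_cost, annual_cost_today):
    """Years until cumulative savings cover the up-front build cost."""
    annual_savings = annual_cost_today - annual_run_cost
    if annual_savings <= 0:
        return None  # the project never pays for itself
    return build_cost / annual_savings

# Hypothetical example: automating a process that costs $400k/year to do
# manually, with a $600k build and $150k/year to operate afterward.
years = payback_years(build_cost=600_000,
                      annual_run_cost=150_000,
                      annual_cost_today=400_000)

if years is None or years > 5:   # the 5-year horizon is an arbitrary cutoff
    print("Reject: savings alone do not justify the project.")
else:
    print(f"Approve on ROI grounds: pays back in {years:.1f} years.")
```

If you find yourself inflating the "annual cost today" number to make the arithmetic come out right, that is the shoving and twisting I mean.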
There’s another category of project that can be quite successful and is worth mentioning: capturing information in the course of a regular business process that could not practically be captured otherwise. This is the real utility of practically all financial accounting systems (for many organizations, the accounting function itself could actually be performed more efficiently manually).
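To illustrate what I mean by capture-as-a-byproduct, here is a minimal sketch, again in Python; the names and fields are mine, invented for illustration, not taken from any real point-of-sale system. The sale has to be processed anyway; the ledger row it leaves behind is the information no one could practically collect by hand across thousands of transactions.

```python
# Sketch: a routine business transaction that, as a side effect, captures
# data for later purchasing and accounting analysis. Names and fields are
# illustrative only, not from any real system.

import csv
from datetime import datetime, timezone

def record_sale(store_id, sku, quantity, unit_price, ledger_path="sales.csv"):
    """Process a sale; the ledger row is the byproduct that makes later
    analysis possible."""
    row = {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "store_id": store_id,
        "sku": sku,
        "quantity": quantity,
        "total": quantity * unit_price,
    }
    with open(ledger_path, "a", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=row.keys())
        if f.tell() == 0:          # write a header on first use
            writer.writeheader()
        writer.writerow(row)
    return row

# The sale would have happened anyway; the analysis data comes along for free.
record_sale(store_id="042", sku="WIDGET-9", quantity=3, unit_price=19.99)
```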
Jumping back up to the beginning of your post, this is exactly right:
> The problem is that most companies (large and small) don’t know what business they’re actually in. Don’t believe me? I have two words for you: New Coke.
>
> Finally, the number of good IT managers is very, very small. Book-learnin’ is helpful but not sufficient. Few technicians have the temperament or skills. But most importantly organizations rarely value or encourage IT management.
Sadly, you are correct about the number of good IT managers. In my experience, this stems from two causes.
First, too many IT guys consider themselves geeks, bad at people things, etc. As a result, they don’t attempt to develop themselves into managers. (They just complain about how their managers don’t understand technical stuff, while making no attempt to understand business needs.)
Second, too many executives subscribe to the idea that a manager can manage anything, while also agreeing that the IT guys are geeks who can’t do people stuff. As a result, the executives appoint unqualified managers and don’t develop their internal talent.
Ironically, the companies that are worst about this are the companies whose main business is the provision of IT goods and services.