I grew up in the 1950s, when network television was booming. My dad was an early adopter and always had to have the newest and latest version of everything, so we had a “television set” for as long as I can remember. In order to watch something, you had to become expert at things like fine tuning, brightness and contrast, focusing, and adjusting the horizontal and vertical hold. If you were one of the elite with your own antenna tower and rotor, you learned to aim your directional antenna just so in order to catch that show being broadcast by a distant station. In those days, because the technology was relatively new, the hardware was at the centre of things. You turned on “the TV” to see what was available to watch on the broadcaster’s schedule. Today, after over half a century of evolution, one no longer watches “the TV”. Television as a technology is no longer about the hardware or even the delivery system. Television is about the content. People now simply watch “TV”.
Personal computers have gone through a similar growth process. Almost a quarter-century ago I bought my first PC — a 25 MHz 486 with 2 MB of RAM and a 250 MB hard drive for the princely sum of $2200 — and dutifully learned about autoexec.bat files and Lotus 1-2-3 macros. I learned about defragging, installing software from floppy disks (Lotus SmartSuite used 28 of them), and setting the screen resolution for that big new 15-inch monitor. Today most of those tasks are no longer necessary or have been automated. Here at home we have six machines on our network that are largely specialized, including a couple of NAS devices, a dedicated machine for watching streaming video content, and a Chromebook for checking the weather or finding a recipe while breakfast is warming. We no longer “use the computer”; instead we look something up on “the Internet”. As with television before it, it is no longer about the computer hardware but about the content.
Many IT departments began in the same era in which I bought my first PC. People I worked with back then often did not have a PC at home; their only exposure to computing was at work. The support desk was a busy place. The IT department was responsible for keeping the computers running and connected. At some places I have worked, IT was even responsible for the phones and printers (they were networked, after all). Lately, the IT industry newsletters that I subscribe to online have been filled with headlines and quotes like the following:
• “Where did all the on-premises IT jobs go?” – InfoWorld newsletter
• “With minimal IT intervention, you can set up a self-service BI environment…” – TDWI
• “Software as a service vs. old-school IT” – InfoWorld.com
In spite of resistance and denial from “old-school” IT departments that still see themselves as the keepers of the keys, today’s knowledgeable and connected business users are doing an end run around the IT department in their pursuit of useful and timely business information and applications. The help desk has been outsourced, the equipment is leased from and maintained by a third party, and increasingly even the data and the applications are stored and run in the remote data centres of contracted cloud providers. There is a surprising amount of denial and trash talk about cloud computing on IT industry forums — mostly along the lines of cloud computing and big data being just the latest fad. Unfortunately, many IT departments that are stuck in the past have become “computer departments” or “network departments” or “data departments” instead of “Information Technology departments”. They have lost sight of the fact that it is no longer about the hardware and software but about the content.
Take an honest look at your own IT department. What does it do right now that cannot be outsourced? How easy is it to obtain the information you need to do your job? Do the members of your IT department have the skills necessary to support your organization’s information needs, and if so, are they allowed to use those skills?