Dan's Take

When Tech Becomes the Wiring In the Wall

It may be out of sight, but it shouldn't be out of mind.

IT decision makers are a forgetful bunch. Once a technology has been around for a while and become part of the "wiring in the wall" -- commonplace and working well enough -- it's largely forgotten. It draws attention only when it stops working well.

Decision makers seem to focus on whatever the industry has decided is the newest technology, and don't think much about the extensive, complex stack of technology keeping the enterprise running day in and day out. Their image of the IT market is something like a radar screen: unless a blip is refreshed regularly, it fades from view.

I've been in the industry long enough to have seen many highly competitive, interesting and exciting markets evolve into the wiring in the wall. Here's a partial list:

  • Mainframe operating systems. These complex software products have gotten good enough that many don't give them a single thought. Most of the early entrants have become footnotes in history. Only a few survive today.

  • UNIX operating systems. At one point, IDC's Server Operating Environment service was tracking 37 different UNIX operating systems, and there were many attempts to create a broad set of standards. Although there are far fewer choices today and the levels of compatibility and interoperability are very high, anyone who pulls back the covers will find several fiefdoms still fighting to dominate and control the platform.

  • Linux distributions. At one point, the same IDC service was tracking nearly 400 different Linux distributions. Many universities, government institutions, and hardware, software and services suppliers had their own distribution for internal use. Now only a few distributions dominate. As with UNIX, the levels of compatibility and interoperability are high enough that most don't give the technology much thought.

  • Desktop/laptop operating systems. Many hardware suppliers offered their own operating systems, development tools and applications. Furthermore, systems were built on many different microprocessors from suppliers such as North American Rockwell, Intel and AMD. We've seen 8-bit, 12-bit, 16-bit, 32-bit and 64-bit designs, and both reduced instruction set (RISC) and complex instruction set (CISC) architectures. Does anyone think much about this now that x86, Windows, OS X and Linux have largely taken over?

  • Distributed processing architectures. How many of today's applications are built on the foundation of client/server computing? Web-based applications, cloud-based applications and a large majority of enterprise applications are now built as collections of network services herded together to present one application, then recombined in a different configuration to present another. I remember the excitement when I was first able to open and read a file on a server in Digital Equipment Corporation's Chicago regional office from a local system in the Kansas City office.

The industry has gotten to a place in which this is so commonplace that many hardly give a thought to where their data is located, what machine is hosting it, what operating system that machine runs, where the storage sits or what physical media holds the data.
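To make that point concrete, here's a minimal sketch in Python of the client/server pattern: one process serves a file over TCP, another connects and reads it across the wire. It's illustrative only -- the host, port and file name are invented for the example, and it's not any vendor's implementation -- but it is essentially what that Chicago-to-Kansas City file access was doing, and what countless network services still do underneath today's applications.

    # Illustrative sketch: one process serves a file over TCP, another reads it.
    # HOST, PORT and "example.txt" are hypothetical values for this example.
    import socket
    import threading

    HOST, PORT = "127.0.0.1", 9099

    def serve_file(srv, path):
        # Accept one connection and send the file's contents to the client.
        conn, _addr = srv.accept()
        with conn, open(path, "rb") as f:
            conn.sendall(f.read())

    def read_remote_file():
        # Connect to the server and read everything it sends.
        with socket.create_connection((HOST, PORT)) as conn:
            chunks = []
            while chunk := conn.recv(4096):
                chunks.append(chunk)
            return b"".join(chunks)

    if __name__ == "__main__":
        with open("example.txt", "w") as f:  # stand-in for the remote office's file
            f.write("Hello from the Chicago regional office.\n")
        srv = socket.create_server((HOST, PORT))  # bind and listen before the client connects
        threading.Thread(target=serve_file, args=(srv, "example.txt"), daemon=True).start()
        print(read_remote_file().decode(), end="")
        srv.close()

Strip away the HTTP, TLS and JSON that dress up modern services, and a remarkable share of them reduce to this exchange: a listener, a connection and bytes over the wire.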

Dan's Take: Check Your Wiring To Avoid Getting Burned
Time and again, I take part in conversations with vendor representatives in which they tell me that they're the first to do something. They crow about how innovative and creative their engineering team is. I think back on what I've already seen in the industry, and ask them if they've ever heard of product x, y or z that did the same thing, only on a different platform. One supplier thought they had invented a form of virtualization technology that was first seen on Burroughs mainframes, later on IBM mainframes, then on UNIX systems, then Windows, and now everything but the kitchen sink.

It's important to remember what technology supports what we're doing today, or we'll be bamboozled by glib marketing messages into believing that something old is really something new, or that something supporting the enterprise every day is merely "legacy" and should be thrown away.

We need to remain aware of the wiring in the wall, even though we look at it through different lenses all the time. Is client/server computing really obsolete when we use variations on that theme every day from every network-enabled device?

I'm reminded of something George Santayana said: "Those who cannot remember the past are condemned to repeat it." I'd add that those who don't remember the past are likely to pay different vendors exorbitant fees to buy it over and over again.

About the Author

Daniel Kusnetzky, a reformed software engineer and product manager, founded Kusnetzky Group LLC in 2006. He's literally written the book on virtualization and often comments on cloud computing, mobility and systems software. He has been a business unit manager at a hardware company and head of corporate marketing and strategy at a software company.
