The Cranky Admin

The Case for Systems Integrators

Think they're outdated? Think again.

If there is one advantage to no longer being reliant on practical systems administration for a living, it is that I no longer have to play the stupid "nerd supremacy" games. Now that writing about tech pays the bills, I get paid to not only make mistakes, but to talk publicly about them. Recently, I've learned to greatly appreciate good systems integrators.

The Fear of 'I Don't Know'
One thing that's always bothered me about working in IT is the pervasive fear that saying "I don't know" will be seen as a professional failing. That has certainly been my experience. Non-nerds tend to view an "I don't know" either as outrageous incompetence on my part or as proof that I am purposefully withholding information from them. In either case, they usually get angry.

Nerds tend to react differently to an "I don't know." The really senior nerds usually smile knowingly, or, if there's still any fire in them, enjoy the puzzle and join in on the research. The rest start playing the alpha nerd game, a defensive reaction typically more related to impostor syndrome than ego.

Those of you reading this probably know full well that what the masses think we know about computers and what we actually know about computers have nothing to do with one another. We are all of us fleshy extrusions of the search engine dimension into this plane of existence. Google, Stack Exchange, Bing (kidding!), and many others are where we store our knowledge. With the exception of the day-to-day, we just store metadata about where the knowledge is for when we need it later on.

Five years ago, all of the above would have been an intro to a modestly interesting bit of philosophy about the role of peer pressure in job satisfaction. I could probably have gotten a few hundred words out of a discussion about how a lack of understanding by managers about how much nerds learn by rote leads to a hostile work environment, stress, burnout and so on.

Today, the rise of software-defined whatsits changes this conversation. When designing our datacenters, we aren't simply choosing which meticulously QAed solutions we'll lash together. Whitebox is the new black, and if we can't find a way to be honest with ourselves -- and those with the purse strings -- about the limits of our knowledge, we can get ourselves into a lot of trouble.

Building, Then Breaking
The job of a systems integrator is to take a bunch of technologies, lash them together, and present it to a customer as a single solution. They then need to support that solution through its lifetime. Often, systems integrators are building solutions out of technologies from multiple vendors. This is a lot harder today than it was 10 years ago.

Having a deep understanding of the technologies involved gives an architect a place to start, but it is only a place to start. An awful lot of the job is building test systems and stressing them until they break. Specifically, one needs to find out how things break under different circumstances and, more importantly, how to recover when they do.

Systems integration is a huge amount of "I don't know." It's a lot of asking others for help. It is, above all, remarkably humbling. Software-defined widgetry presents us with the ability to take that burden on ourselves.

For large enough shops, this makes sense. They can afford to have a team of nerds buying up different bits of kit and figuring out just how far everything will go. For small shops this can also make sense: skip the value-add cost of systems integration, because the organization just won't be pushing its gear anywhere near the limits.

In the fuzzy middle, however, there lies an awful lot of danger. This is where systems integrators come in.

Adding Actual Value
Buying a built solution from a systems integrator can cost more than building your own. If your approach to building your own is making an educated guess (maybe by using a vendor HCL), buying the gear, and then just sort of hoping it all works to plan, a professionally integrated solution will probably cost more.

Proper systems integrators buy at least one copy of the hardware they're going to sell to their customers, install all the software on it, configure it, and then test the crap out of it. The cost of doing this -- including the purchase of the gear that didn't make the cut -- is then shared across all customers.

Actually knowing what you're doing in the software-defined widget world means doing something similar, which means a testing budget, time to test, and justifying both to those who pay for things. It also means admitting you don't know something.

This isn't to say all systems integrators do their job, nor that all workplaces are so hostile that running your own test program is a potentially career-ending move. But the idea that systems integrators will go away because of software-defined widgets, the public cloud or other changes is nonsense.

Systems integrators do more than just bundle bits and sell them as a single SKU. They offer a certain amount of political butt-covering. If they have a broad enough client base all using the same stuff, they can also organize testing regimens, canary groups and other "proper IT" across their customer base that just isn't possible for any but the really big, really rich, or really progressive IT shops.

Learning from mistakes as a service will always have value, no matter what technologies are in play.

About the Author

Trevor Pott is a full-time nerd from Edmonton, Alberta, Canada. He splits his time between systems administration, technology writing, and consulting. As a consultant he helps Silicon Valley startups better understand systems administrators and how to sell to them.
