
How Standardization Creates Dis-Order

E. Forrest Christian | Change, Computers/IT, Reviews - Articles

What happens when you move to standardization?

In IT, we’re pushed to do single server loads, unified single sign-on systems, international PKI-based Role-Based Access Control, and all sorts of other forms of simplifying, rationalizing and standardizing.

But no one ever stops to ask, Does standardizing in IT really make life easier?

Well, not “no one” exactly: two researchers in Oslo did.

Ole Hanseth and Kristin Braa, both of the Department of Informatics at the University of Oslo, examined a desktop standards project done within Norsk Hydro. The company had grown substantially over a period of several years, acquiring and developing new suborganizations. These had more or less been allowed to develop their own infrastructures.

But in the 1990s, Norsk Hydro decided to rationalize their IT infrastructures and create a standard desktop platform.

Remember, in the 1990s the world was different. OS/2 still had a shot at being your backend server. Novell ruled the LAN market. Apple was going bankrupt. Lotus had its own office suite. WordPerfect still had the lion’s share of the word processing business in many industries. The Web was still in its infancy and client-server technology didn’t really work.

Standardizing on a single set of desktop applications and operating system would allow NH to purchase things in bulk, saving substantial money.

Unfortunately….

Standardizing created disorder at the same time it was creating order elsewhere. You might even say standardization created more chaos than order. The standard wasn’t enforced at the level above the divisions, so each division could continue to use whatever platform it wished. Hanseth and Braa also found that the diversity of Norsk Hydro’s businesses meant that many home-grown applications had to be supported or ported, creating the usual nightmare for upgrades.

I’ve been a part of a massive single-step upgrade before, when a large insurer in Chicago moved from Windows 3.1 and Novell to Windows 95 and Windows NT. We still had applications running on OS/2 in 1998; it had been a big product in the insurance industry because of its robustness. They had over 600 applications that had to be tested against the single loads for desktop and servers. We had only one OS load for all NT servers, and it kept getting changed to let important applications pass testing. That created an integration nightmare: things that had passed testing didn’t work when they got put into production, because the server load had changed underneath them.

I think that the problem of standardization is really one of getting it at the right level. Standardizing on Windows NT is an abstract standard; standardizing on Windows NT configured in only one certain way is a concrete standard. The vague standard causes problems once you start using it, because two groups can both conform to it and still conflict in practice, and each must give up its own concrete configuration to adopt someone else’s.
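A minimal sketch of that distinction in Python (the field names and values here are invented for illustration): a vague standard passes machines that will still conflict with each other, while a concrete one pins the configuration down.

```python
# "Standardize on Windows NT" vs. "standardize on this exact NT load."
# All field names and values below are hypothetical examples.

vague_standard = {"os": "Windows NT"}
concrete_standard = {"os": "Windows NT", "service_pack": 4, "tcp_stack": "vendor-A"}

def conforms(machine: dict, standard: dict) -> bool:
    """A machine conforms if it matches every field the standard pins down."""
    return all(machine.get(key) == value for key, value in standard.items())

m1 = {"os": "Windows NT", "service_pack": 3, "tcp_stack": "vendor-B"}
m2 = {"os": "Windows NT", "service_pack": 4, "tcp_stack": "vendor-A"}

# Both machines satisfy the vague standard, yet they differ in ways
# that will break applications tested against only one of them.
assert conforms(m1, vague_standard) and conforms(m2, vague_standard)

# The concrete standard actually distinguishes them.
assert conforms(m2, concrete_standard) and not conforms(m1, concrete_standard)
```

The point of the sketch: the abstract standard gives a false sense of uniformity, which is exactly what bit the insurer’s single server load above.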

We need standards because they make our lives easy in many ways. When I go to the hardware store, I need to know that a particular nail type is the same from box to box, that a 2×4 is the same size (although the ones in my house are actually 2 inches by 4 inches and not the modern size), that a 5V AC adapter actually produces what my machine needs. But they also introduce as much dis-order (the destruction of order) as they create order.

Think about standardizing on a language. While you gain the ability to communicate easily across a wider geographical region, you lose a richness of local diversity that gave a wider set of points of view, simply because our language shapes our perceptions. The adoption of Parisian French as the lingua franca of France made the nation easier to maintain, from taxation to controlling the armies, but lost the diversity of local stories and languages.

But what can you do? In Flanders, my friends from Ypres can’t use “home tongue” Dutch to speak with their friends from Ghent. They both have to resort to “standard Dutch”. The local variants simply won’t do.

The same is true of writing. We need some standard language to communicate about certain things. Yes, there is the adoption of Greek, Latin, French, Mandarin and English as global languages, but also the use of jargons to communicate within a field. These standard words and syntaxes let us talk to each other across large boundaries, but with a reduced diversity.

Perhaps only dictators can truly create standards. But there are other examples.

TCP/IP came about as a standard partly because the Dept. of Defense in the U.S. wanted to interconnect computers so that something might survive a Soviet nuclear strike. But the standard survived and thrived, I think, because in the end it didn’t care about content. You can put anything in a packet. The standard doesn’t care.

Especially if it’s in English.
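That content-agnosticism is easy to sketch in Python. Here a `socketpair` stands in for a real TCP connection; the point is simply that the transport standard never inspects the payload.

```python
import socket

def round_trip(payload: bytes) -> bytes:
    """Push arbitrary bytes through a connected stream socket pair and
    return what arrives on the other end. socketpair() stands in for a
    real TCP connection: the transport carries the bytes without ever
    caring what they mean."""
    a, b = socket.socketpair()
    try:
        a.sendall(payload)
        return b.recv(len(payload) or 1)
    finally:
        a.close()
        b.close()

# Text, JSON-ish data, and raw binary all pass through identically:
# the standard succeeds precisely because it is indifferent to content.
for payload in (b"plain text", b'{"json": true}', bytes([0, 255, 128])):
    assert round_trip(payload) == payload
```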

But standardization also destroys the current local ways. If you are making parts for just your own machines, the parts can be based on your own local standards. But when you have to have parts work on many machines, they must adhere to some set of standards created and maintained outside your organization. And standards take a lot of time to maintain. Even our measurement standards (meter, volt, joule, etc.) require a massive worldwide effort at maintenance and coordination.

Hanseth and Braa argue (and Hanseth and others argue elsewhere) that in the end standardization devolves into creating standard gateways between systems that translate one’s output into another’s input. Having done standardization of IT infrastructures, I know that this is true.
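A minimal sketch of the gateway idea: rather than forcing two systems onto one standard, a translator maps one system’s output format into the other’s expected input. All field names and record formats below are invented for illustration.

```python
# Hypothetical example: a legacy HR export and the schema a newer
# system expects. The gateway sits between them and translates.

legacy_record = {"EMP_NO": "00042", "NM": "Lindqvist, A.", "DEPT": "HR"}

def gateway(legacy: dict) -> dict:
    """Translate a legacy-format record into the new system's schema."""
    last, first = (part.strip() for part in legacy["NM"].split(","))
    return {
        "employee_id": int(legacy["EMP_NO"]),  # zero-padded string -> int
        "first_name": first,
        "last_name": last,
        "department": legacy["DEPT"].lower(),
    }

assert gateway(legacy_record) == {
    "employee_id": 42,
    "first_name": "A.",
    "last_name": "Lindqvist",
    "department": "hr",
}
```

Each system keeps its local format and its local workarounds; only the gateway has to know both sides, which is why standardization efforts so often end up here.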

I wonder how much of this also has to do with tacit knowledge embedded in local work systems. One of the problems of implementing PeopleSoft is that it enforces a particular way of doing things across the organization. But local work-arounds have been created to get work done, making for a lot of variation locally. The standard is simply not complex enough to handle the level of variance in the work done locally.

Because it’s telling people how to do their jobs rather than giving them limits and letting them figure it out from there. It restricts judgment too much.

So maybe that’s the point.

Ole Hanseth and Kristin Braa (2001). Hunting for the treasure at the end of the rainbow: Standardizing corporate IT infrastructure. Computer Supported Cooperative Work (CSCW). The Journal of Collaborative Computing. 10(3-4):261-292.

Image Credit: ADLER typewriter Model n°7 (Frankfurt / Germany). Unknown model date (probably ~1930/40). © Dake. (CC BY-SA 2.5)

About the Author

Forrest Christian


E. Forrest Christian is a consultant, coach, author, trainer and speaker at The Manasclerk Company who helps managers and experts find insight and solutions to what seem like insolvable problems. Cited for his "unique ability and insight" by his clients, Forrest has worked with people from almost every background, from artists to programmers to executives to global consultants. Forrest lives and works in plain view of North Carolina's Mount Baker. [contact]
