Encourage innovation? What does that mean?
The session was titled End User Computing. I thought it was a grammatical error … I expected it to be about end-user computing, not about how to end … as in prevent … user computing.
And to be fair, the panelists didn’t advocate ending it. One, a credentialed authority on security, pointed out … somewhat grudgingly, but she did point out … that the lockdown era is over. Given the proliferation of end-user devices and the increase in travelers, teleworkers, contract labor and so on, locking down every access point is no longer practical.
In its place is a better approach, one that makes protecting information assets the centerpiece rather than hardening every access point.
The other panelist led a team within IT responsible for developing apps and such for the company’s customers and employees to use on their personal smartphones and tablets. His focus was creating innovative products for end-user computing devices.
And then they were done. Time for Q&A. Thinking I was tossing a soft, high one over the plate, I asked what they were doing to promote innovation by end-users.
Whiff! The second panelist spoke more about how he encouraged his team of developers to innovate. The security expert did note that some end-users were finding innovative ways to use tablets and smartphones, now that they’re allowed to do so. That was about it.
But encourage innovation? I might as well have asked, “Igli og slog, flub glubbly wub?”
It’s time for a trip down memory lane … back to the early days of personal computers. With limited storage and processing power, and networking still in the future, they were pretty much useless for serious, mainframe-style computing, which is why IT (back then it was MIS) considered them a pointless, uninteresting distraction.
And so they leaked in, hidden in office equipment budgets because end-users ate them up. Empowered by:
- Annoyingly limited but cheap and easy-to-learn languages like interpreted BASIC and Turbo Pascal,
- A sort-of-database-management-system (dBase II), and
- Thousands of new, inexpensive commercial applications written for these new devices,
- The sudden ability to ignore MIS entirely, doing whatever they needed to do, when they needed to do it, for themselves …
… they figured out countless ways to incrementally improve how they, their workgroups, and their departments operated. This is what made the PC a disruptive technology: Its first success came from providing the ability to do things nobody had done before, not from doing the same old stuff on a new platform.
Until the PC became powerful enough to gain a place in mainstream IT architecture. When that happened, in most companies IT gained control and put a stop to all the innovation, because of all the bad things that could happen if end-users were allowed to do whatever they wanted to with their no-longer-personal computers.
Now we have tablets and smartphones. For the most part they’re optimized for consumers, not business use (see “A tablet-driven view of what’s wrong with American business,” KJR, 4/25/2011 and “Tablets won’t be disruptive ’til the future gets here,” KJR, 5/2/2011).
From IT’s perspective they’re more annoyance than opportunity — a proliferation of browsers and form factors we need to support so employees can use them to get at whatever they need to get at.
And even this isn’t good enough. Not because tablets and smartphones have the potential to be truly disruptive. That’s just prognosticating … they will be or they won’t, and we can deal with the disruption to existing marketplaces when it happens, just as we did with the PC.
It’s what we do with them in the meantime.
It’s true: If you let employees innovate on their own, Terrible Things might happen, especially if your company routinely hires stupid people and subjects them to inept guidance.
And yes, you could provide the tools and nobody will find anything useful to do with them. That could happen, especially if your company typically hires dullards whose primary virtue is showing up on time.
When PCs came on the scene, they drove a burst of innovation. A few of the most enlightened companies actively encouraged it. Most of the rest didn’t even know it was happening until it was too late.
Tablets and smartphones are the heirs to the original personal computer. Give employees half a chance and they’ll find new and interesting things to do on them that can help your business. The problem is that nobody can predict, in advance, what your employees will come up with.
Giving them that chance means investing in and relying on the smarts, good judgment, and creativity of individual employees. What a radical notion.