There can be little doubt that the personal and domestic use of technology has had an impact on user expectations and behaviour with regard to the way IT is used in a business context.
This arguably started when the PC used at home typically became more up to date and of a higher spec than the equipment used in the workplace. The same has happened more recently with mobile devices, where extremely advanced technology is accessible to anyone on the high street, making the standard-issue corporate mobile look pretty lame in comparison.
Then we have the advent of social media and the evolution of the Web in general. Whether it’s blogging, personal networking, content sharing, conferencing, or even full online application services, people have become used to sophisticated capability being available over the wire on demand.
When looked at from a business perspective, these developments can be viewed very positively. The general adaptability of employees and their willingness and ability to embrace new ideas and capabilities are clearly enhanced. Costs can potentially be saved too, as many users seem willing to spend their own money to meet business requirements if it means they can work with the tools they prefer.
However, tech-savvy users solving their own IT needs can also create issues in the form of elevated risks and increased support overhead and costs. Contributing factors here are the desire to be different for the sake of it (with the fragmentation that comes with that), a tendency to tamper with standard-issue tools, and the enthusiastic amateur's DIY approach to development and integration.
With this in mind, the topic of ‘consumerisation’ is coming into focus. By this we mean the trend towards users having a lot more say in the technology that is used in the workplace. Within this, we have users acquiring technology and services for business use on a personal funding basis or via local departmental budgets and expense accounts.
If you don’t see this happening in your organisation, then you’re probably not looking hard enough, so the key question is how to deal with it.
The first thing to dismiss is any thought of simply blocking all of this activity. Experience has shown that if you try, it will just go underground, which makes any challenges even more difficult to deal with. Locking down the infrastructure and policing bans on certain activities is itself very expensive in any case and, even if you go down this draconian route, can you really be sure that creative and determined users will not find ways to circumvent your control measures?
There are some lessons we can learn from more progressive CIOs and other senior IT leaders who started to acknowledge and embrace consumerisation long before it became as prominent a phenomenon as it is today. We spoke with some of these as part of the research for The Technology Garden. Through these conversations and subsequent research, it is clear that while no hard and fast formulas exist, a number of principles have emerged that are worth considering as part of your response to the trend.
The first and most obvious is to acknowledge that consumerisation is unstoppable, so the sooner you accept this and start to figure out how to deal with it, the better.
The next principle, and a good place to start when moving forward, is to establish clarity on what constitutes core activity, to which a set of non-negotiable constraints and policies will apply, as opposed to peripheral activity, within which you can accommodate more personal preference.
As an example, you might define the execution of ERP and CRM related transactions as core, but provide some freedom over how those transactions are invoked. In practical terms, this could translate to an SOA back-end infrastructure that exposes transaction-related services for assembly in whatever way the user wants in a portal interface, or for access via any other suitable front end, whether browser, PC or mobile device based.
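To make this concrete, here is a minimal sketch of the idea in Python. The names here (submit_order, invoke, SERVICES) are purely illustrative assumptions, not a real ERP API: the point is that the core transaction logic sits behind a single service boundary, while any front end, be it portal widget, browser or mobile app, invokes it through the same generic entry point.

```python
import json

# Core, non-negotiable business logic lives behind the service boundary.
# The rules enforced here apply regardless of which front end calls in.
def submit_order(customer_id: str, items: list) -> dict:
    """Validate and record an order (illustrative stand-in for an ERP transaction)."""
    if not items:
        raise ValueError("an order must contain at least one item")
    return {"status": "accepted", "customer": customer_id, "item_count": len(items)}

# Service registry: front ends discover and invoke transactions by name,
# so the choice of user interface remains a peripheral concern.
SERVICES = {"submit_order": submit_order}

def invoke(service_name: str, payload: str) -> str:
    """Generic entry point a portal, browser or mobile app could all call."""
    args = json.loads(payload)
    result = SERVICES[service_name](**args)
    return json.dumps(result)

# A portal widget and a mobile app would both go through the same call:
response = invoke("submit_order", '{"customer_id": "C42", "items": ["widget"]}')
print(response)
```

The design choice being illustrated is simply that the core (the transaction and its validation rules) is fixed, while everything in front of `invoke` is left open to user preference.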
Picking up on a key word here, ‘suitable’, it is important to acknowledge the difference between flexibility and anarchy. Even within the domain of peripheral activity, it makes sense to define some basic ground rules and guidelines around issues such as security, compliance, integrity, supportability and so on.
The spirit here is to meet users halfway by agreeing to accommodate their preferences, but not to the extent of creating tangible negative consequences that will be difficult, impossible or prohibitively expensive to manage. An example here is saying you will allow personal equipment to be hooked up to the network provided it meets certain criteria in terms of spec, securability, ability to receive electronic policies, and so on.
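As a rough illustration of how such admission criteria might be checked in practice, here is a small sketch. The field names and the minimum OS version are assumptions made purely for the sake of the example; real criteria would come from your own security and support policies.

```python
# Non-negotiable criteria a personal device must satisfy (illustrative only).
REQUIRED = {
    "encryption_enabled": True,   # the device can be secured
    "accepts_policies": True,     # it can receive pushed policy updates
}
MIN_OS_VERSION = 14  # assumed minimum supported OS major version

def admit(device: dict) -> bool:
    """Return True only if the device satisfies every criterion."""
    if device.get("os_version", 0) < MIN_OS_VERSION:
        return False
    return all(device.get(key) == value for key, value in REQUIRED.items())

ok = admit({"os_version": 16, "encryption_enabled": True, "accepts_policies": True})
bad = admit({"os_version": 16, "encryption_enabled": False, "accepts_policies": True})
print(ok, bad)  # True False
```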
Consumerisation best practice is evolving in many other ways, and it is impossible to go into all of them here. It is, however, worth mentioning one more very important principle, and that is visibility, i.e. making sure you have sight, as far as is practical, of everything that goes on. Through automated discovery of devices and software in the context of asset management, monitoring of network activity to track the websites and online services being accessed, and so on, the trick is to watch rather than block by default, and to act when certain scenarios arise.
One such scenario might be the spreading of a particular type of clearly useful activity within the user base (some saw this happen with public web and audio conferencing services), in which case you can step in to either support what’s being used or provide a more supportable alternative that meets the same need. Other scenarios might represent more of a threat, e.g. the leakage of corporate information via social networks, in which case a combination of policy and education might be in order.
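The ‘watch rather than block’ approach described above can be sketched very simply. Again, the names and the review threshold are illustrative assumptions: usage of each observed service is logged per user, nothing is blocked, and a service is flagged for review once enough distinct users have adopted it.

```python
from collections import defaultdict

REVIEW_THRESHOLD = 3  # assumed: distinct users that trigger a review
usage = defaultdict(set)  # service -> set of users seen using it

def observe(user: str, service: str) -> str:
    """Record usage and return an action, rather than blocking by default."""
    usage[service].add(user)
    if len(usage[service]) >= REVIEW_THRESHOLD:
        return "review"   # e.g. decide to support it or offer an alternative
    return "watch"

observe("alice", "webconf.example.com")
observe("bob", "webconf.example.com")
action = observe("carol", "webconf.example.com")
print(action)  # the third distinct user crosses the threshold: "review"
```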
So, you can do a lot to manage, even leverage, the consumerisation trend. It is therefore far better to roll with it than resist it, otherwise you risk losing control and missing opportunities.