Tuesday, June 14, 2005

Learning innovations

I have just spent a couple of days at a small highly-focused symposium titled “Innovations in E-learning.” It was put together by the US Naval Education and Training Command and the Defense Acquisition University (DAU), who have among the best and brightest training minds that the American taxpayer’s money can buy. They are not short of budget, manpower, or technology, and they get to mess with lots of experimental stuff. I decided to participate because the future of learning matters to me, and because a couple of my virtual colleagues were pretty much dominating the presentations in one stream.

For several years now, Training Departments have been transfixed by the evolving internet in the same way that dinosaurs were probably awe-struck by the approaching comet. So what does the future hold? I’m happy to report that learning will thrive, but trainers will have to merge back into operational roles. Oh, and Training Departments are dead, at least as we know them. As are Learning Management Systems and any other relics of centralized distribution of learning. Learning that is informal, collaborative, contextual, real-time, and peer-generated will be the mode of tomorrow.

It seems counter-intuitive that military types whose culture is defined by command and control hierarchies would advocate devolution of learning to the swab on the deck-plates or the grunt in the foxhole, but that was the gist of what was being said. Admittedly, it was not being said by the JAG look-alikes or their entourages, but by the civilian gurus who write their white papers for them. And devolution of learning does not necessarily mean relinquishing control – in fact there are some very scary big-brother systems being deployed that (allegedly) will tell anyone with access pretty much what any individual sailor anywhere in the world had for breakfast last Tuesday and, to five decimal places, what his or her competency rating is on any given skill. It is hard to reconcile what they are saying with what they are doing, until you realize that, because these systems are so vast, they take a long time to build and deploy. So at any point in time the military are rolling out systems and policies that have long since been abandoned for something new – which may not see the light of day for a decade.

I was mainly interested in hearing what folks like Jay Cross, Clark Aldrich, Harvey Singh and Ben Watson had to say about workflow learning, collaboration, and simulations. However, in amongst their sessions was a real eye-opener from a VP at IBM. IBM used to be a blue-suit red-tie operation as monolithic as a bank, but it has been doing a lot of shape-shifting in recent years. These days any organization that is unwilling or unable to do that is unlikely to be around very long. It’s Darwinian – those who can adapt most readily are most likely to survive in times of rapid change. IBM’s consulting wing, adrenalised a couple of years ago by the acquisition of PricewaterhouseCoopers’ consulting arm, is doing what big consulting firms rarely do – they are advocating unique solutions that they don’t already have parked in a truck around the corner.

Here’s a quick version of the IBM line on “embedded” or workflow learning:

The most profound shift that will take place in training over the next three years is a movement away from traditional, formal, course-based learning (classroom or online), and towards the clever integration of learning-enabling tools, such as Instant Messaging and informal collaboration processes, into the workflow. As we move learning from its “separate service” role to a more integrated coal-face role, one of the biggest obstacles is the political question of who owns it. The other is the need for a deeply rooted culture of collaboration throughout the organization.

A simple example of workflow learning in action: Tom in Finance gets an urgent request to authorize foreign travel funds for an executive. He learned how to do that in a training course last year, but has never needed to do it in practice, so he’s lost. The help system, typically, doesn’t. The FAQ gives no guidance either. So he sends out a broadcast Instant Message to a small group of SMEs and experienced practitioners asking for help. So far this is not a lot different from “prairie dogging” – popping your head up above your cube divider and yelling “Does anyone know how to…”

But here is where it gets interesting. Jill, an experienced practitioner in another city, responds to the message. She remotely takes control of Tom’s computer and talks to him as he watches her go through the steps on his screen. She identifies that the help system, the FAQ, and possibly the original training are inadequate, and updates the FAQ in wiki-like fashion. Then she identifies a group of Tom’s peers who might benefit from knowing what Tom now knows, and sends them an announcement of a ten-minute webinar for later that week. She records the webinar session and saves it to the system, where those who could not make it, or those who may encounter the problem in the future, can easily find and watch it. Then she notifies those responsible for basic training, and those responsible for the help system, that they might need to pay attention to the issue. Tom, in the meantime, evaluates the help he has received, and his ratings and comments are added to Jill’s profile for reference by future aid-seekers and her management.
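For the technically inclined, here is a minimal sketch of that loop in Python. Every name in it (KnowledgeBase, Expert, the five-point rating, the single responder) is hypothetical shorthand for the pattern just described – it is not a description of IBM’s, or anyone’s, actual system:

```python
# A minimal sketch of the workflow-learning loop described above.
# All names and structures here are hypothetical illustrations.

from dataclasses import dataclass, field


@dataclass
class KnowledgeBase:
    """Stands in for the FAQ/wiki and the recorded-session library."""
    faq: dict = field(default_factory=dict)
    recordings: list = field(default_factory=list)

    def update_faq(self, topic: str, guidance: str) -> None:
        self.faq[topic] = guidance  # wiki-style: anyone qualified can edit

    def archive_recording(self, title: str) -> None:
        self.recordings.append(title)  # findable by future aid-seekers


@dataclass
class Expert:
    name: str
    ratings: list = field(default_factory=list)

    def respond(self, topic: str, kb: KnowledgeBase) -> None:
        # 1. Walk the requester through the task (screen-sharing in the story).
        print(f"{self.name} walks the requester through '{topic}'")
        # 2. Patch the gap at its source.
        kb.update_faq(topic, f"Steps for {topic}, as shown by {self.name}")
        # 3. Spread the fix: a short webinar, recorded for later.
        kb.archive_recording(f"10-minute webinar: {topic}")
        # 4. Flag the gap to the owners of formal training and the help system.
        print(f"{self.name} notifies training and help-system owners")


def workflow_learning(topic: str, experts: list, kb: KnowledgeBase) -> None:
    """Broadcast a request; let the first available expert close the loop."""
    responder = experts[0]       # in practice, whoever answers first
    responder.respond(topic, kb)
    responder.ratings.append(5)  # requester's evaluation, added to the profile


kb = KnowledgeBase()
workflow_learning("authorize foreign travel funds", [Expert("Jill")], kb)
print(kb.faq, kb.recordings, sep="\n")
```

The shape of the loop is the point: one person’s question gets answered once, and the answer propagates to the FAQ, the recording library, the peer group, and the owners of the formal training.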

The technology is not complex, or even expensive. Most people have it on their computers already. Aspects of this are widely used already in e-commerce and e-customer support. Individuals already learn this way intuitively. What is hard is achieving the mindset and the culture that allows and encourages this to happen across an organization.

There is nothing revolutionary in the IBM vision. If you have followed those who advocate informal learning and collaborative learning (and indeed many of my own rants), you will realize that the ideas are not new. But, for me, the amazing thing was to hear them coming from IBM. If Big Blue is advocating this approach, and is actively setting about trying to get it to work in its clients’ cultures as well as its own, then there is something serious going on. Workflow learning has moved from the drawing board to the boardroom. They say that in theory there is no difference between theory and practice, but in practice there is. IBM is taking its theories on the road, and, in practice, is being taken seriously.


Original in TrainingZONE Parkin Space column of 10 June 2005

Monday, June 13, 2005

Knowledge managing the retirement brain drain

In most Western countries, the baby boomer bubble is causing concern for those planning pension and social security services. It should also be worrying employers. A quarter of current American employees will be retiring within the next five years. If the outgoing masses know anything of value and it is not being passed down to others in their companies, those organisations face a brain drain that could harm their ability to operate. While much of the work done in examining this issue is carried out by those seeking to market a solution of some kind, the workplace realities, on which they all seem to agree, are fascinating. It appears that finance departments have little regard for the enduring value of an employee’s lifetime of work.

Accenture recently surveyed several hundred employees who qualified as “approaching retirement” to find out what, if anything, their employers were going to do about retaining their knowledge before they left. (I personally found the sample demographics a little alarming: since when are people between the ages of 40 and 50 approaching retirement, unless of course they are in US government jobs?)

It seems that companies are doing very little to capture that knowledge. Could it be that companies simply don’t value what their “ageing” employees contribute? There’s plenty of anecdotal evidence to support that view. When the choice is between recruiting or promoting an inexperienced 25-year-old MBA and a 50-year-old veteran, the MBA is always the odds-on favourite – even in a country where overt ageism is illegal.

What opportunities are offered to imminently retiring employees to pass on their wisdom? According to Accenture, one company in four makes no effort whatsoever to capture the workplace knowledge of retirees, and a further 16% of companies expect retirees to have an informal chat with colleagues before leaving. That’s more than 40% of companies that have no formal processes for retaining expertise.

When you think of the money and time that have been put into training and developing the expertise that is apparently seen as disposable, you have to wonder how serious companies are about the value they place on “human capital”. After all, what is human capital if not expertise? If it is disposable, it has no significant value. And if the expertise of your most experienced people has no significant value, why on earth are you wasting your time bandying about training ROI calculations? At the end of the day, the return on all that investment, in Kirkpatrick Level Four terms, is assumed to be not worth the trouble of securing. Or so your accountants will tell you.

Now, you can look at the other side of the coin and say that more than half of all companies do make an effort to hang onto that expertise. In fact 20% of companies claim to put their retirees through a knowledge transfer process that lasts several months. But I suspect that this is only in exceptional cases for particularly high-ranking employees.

It may be that the failure to make an effort to preserve workplace knowledge is not because such knowledge is undervalued but because the extent of the problem is not realised. In that case, how do we get senior management to sit up and take notice? And, once they do appreciate the scale of the impending problem, what can be done to move awareness to effective action? Perhaps this responsibility falls more squarely on the shoulders of HR management and human capital strategists, rather than on trainers. Or perhaps it is the responsibility of those charged with knowledge management.

Knowledge management (KM) has had a bit of a rough ride over the past 10 years, having made many of the same mistakes that e-learning made. Its practitioners initially focused on building technology-based tools to extract, retain, and retrieve what was in the heads of employees, and on explicit knowledge rather than tacit knowledge. Nowadays tacit knowledge is recognised as being more relevant to performance, even if it is harder to capture, and KM people are working more with informal approaches, such as story-telling, to capture this less tangibly expressed expertise.

The problem is that the less formal our processes become – in both training and knowledge capture – the less easy they are to sell to the corporate bean-counters. And the less tangible our activities are to those who like dealing with hard numbers, the less value they are assumed to have.

I sometimes think that accountants are the biggest obstacles to progress in corporate learning and knowledge management. They love structure and hierarchy and abhor ambiguity and fuzziness. If they can’t measure it, it has no value. How 20th century! We have all run more than our share of the mandatory “Finance for non-financial managers” courses. Perhaps it is time to lobby for some mandatory “knowledge for non-knowledge managers” courses…

A trainer's output is performance, not courses

In Kirkpatrick terms, most “learning professionals” don’t like looking beyond level two. Because it is generally considered too difficult to evaluate training once learners leave the classroom, they have a good excuse for setting themselves much more easily achieved objectives.

A couple of weeks ago, in a reputable training forum, I heard a trainer proudly declaring that his learning objectives were always cast in such a way that they would be fully achieved by the end of a course. I was appalled, and said so. My reservations about competency-based learning management aside, surely instructional designers and trainers seek to impact the performance of learners once they return to the real world? If that is true, instructional designers and trainers have a responsibility for the ultimate on-the-job effectiveness of their handiwork. And, if they have a responsibility, they should set objectives to know whether or not they are meeting that responsibility.

I was stunned by the number of people who came back at me as if I were mad, declaring adamantly (and somewhat self-righteously) that their responsibility for learner performance stopped at the door of the classroom or at the closing of the e-learning browser window. What happened after training was not their problem. If instructional designers as a group have such a limited perception of their importance to the business, then whoever is training them is doing a lousy job.

The question "what things will participants be able to do at the end of the session that they were not able to do pre-session" is a common cop-out that does not produce learning objectives. Rather, it produces a reverse-engineered set of often grotesquely constrained learning promises that have as much substance as those of a used-car salesman.

I am increasingly wary of any learning objective whose time horizon is only the end of a course. By the end of any learning experience, learners may have acquired some new competence. But within months that competence should have grown, been applied, and be achieving business results. That scope should characterise our learning objectives. To be satisfied with taking responsibility only for that which is directly within our domain, important as it is, ignores the fact that learning has a greater business purpose.

Setting objectives that do not look beyond the point where a learner hands in his/her smile-sheet is abdicating responsibility for the effectiveness of training. If course-end is the training performance horizon, then post-course drop-offs in recall or ability become acceptable to trainers as "not my problem."

You could argue that as long as the trainer has created an acceptable change between pre-training ability and post-training ability, then he or she has performed successfully.

But learning is not only about acquiring temporary knowledge and skills. Motivation is an important part of what a trainer should be doing. At the very least, we should design our courses and conduct them in such a way as to motivate people to want to learn, to want to keep learning, and to want to apply what they are learning once they leave the training environment. If that motivation is absent, we have not done our job.

Admittedly, there are many factors beyond the control of individual trainers that impact the transferability of learning to the workplace. Management and systemic obstacles alike affect the way people grow in their jobs. But trainers can’t simply ignore these. We should try to exert influence even if we have no overt responsibility. And if those obstacles are immovable, we should design our training to accommodate them and to equip learners to deal with them.

Simply saying "not my job" is an inadequate response in someone who has taken on the task of improving the performance of fellow employees. Trainers and instructional designers do not have total responsibility for a learner’s performance on the job, but I think they should share some of that responsibility. If their salaries, bonuses, or careers depended on it, we’d see much more effective learning experiences being developed.

Sunday, June 12, 2005

E-learning salaries

Recently the E-learning Guild released the results of its annual e-learning salary survey. Data was gathered in the first quarter of this year from people who work in the e-learning field.

It’s amazing that they were able to find any participants for the survey, since, in the US at least, you hardly ever see a job advertised as an “e-learning” position any more. Two or three years ago, they were all over the place, but now it seems that some facility with e-learning is simply one of the requirements of anyone looking for employment in the broader field of corporate learning.

So while the results may be a little questionable, they do offer an interesting insight into the relative values placed on different skill-sets and responsibilities by companies in America. For those who responded, what do US pay packages look like?

Across the country, the median annual pay of a Training Manager working in e-learning was $74,000 (£40,200 for those of you in the UK).

The US is a big place, and salaries vary with geography: jobs in the middle of the country normally pay 10-20% less than those on the East or West coasts. Washington DC, where I am, is actually part of the South East, and salaries are typically a little less than those paid in New York, just to the north. So while New York Training Managers pulled in $84,000 (£45,600), those in DC settled for around $72,000 (£39,100).

Higher up the ladder, the national median salary of a Training Executive was $120,000 (£65,200). At the bottom of the scale, a classroom or online trainer typically earned $50,000 (£27,200). Between the humble trainer and the training manager fell all of the “specialized” folk, including curriculum designers and developers, instructional designers, and content developers.
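As a sanity check on the conversions (the arithmetic, not the survey), each sterling figure above implies the same exchange rate of roughly $1.84 to the pound, consistent with rates at the time. A throwaway script using only the numbers quoted above:

```python
# Each sterling figure quoted above implies the same USD/GBP rate.
pay_usd_gbp = {
    "Training Manager (US median)": (74_000, 40_200),
    "Training Manager (New York)":  (84_000, 45_600),
    "Training Manager (DC)":        (72_000, 39_100),
    "Training Executive":           (120_000, 65_200),
    "Trainer":                      (50_000, 27_200),
}
for role, (usd, gbp) in pay_usd_gbp.items():
    print(f"{role}: ${usd:,} / £{gbp:,} = {usd / gbp:.2f} USD per GBP")
```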

How hard do people have to work to earn these salaries? Permanent full-time employees are averaging 49.9 hours per week, and they get, on average, 18.8 days of paid vacation each year. (One poor soul reported working 80 hours a week! Been there, done that, dodged the bypass.)

The question that jumps out at me is this: why are trainers, who contribute so much to the actual effectiveness of training, so obviously undervalued by their organisations? Is an instructional designer really worth 20% more than a trainer? Trainers, after all, have the ultimate responsibility to “make it work” even if the design is no good. Given the insipid nature of instructional design that I see so often, I have to believe there is something wrong with this picture. Perhaps it is simply supply and demand. Or perhaps it is that, in companies that have gone overboard with e-learning, the classroom or synchronous trainer has indeed been sidelined.

You can get the complete report, free of charge, at the E-learning Guild site, though you do have to sign up (also free) as an associate member.

Friday, June 10, 2005

Training Needs Analysis needs analysis

I have yet to meet anyone in the learning business who will question the value of a Training Needs Analysis (TNA). Indeed, there are many who will insist that a TNA is carried out before any work is done on defining or developing training of any kind.

We all know, understand, and value the concept of a TNA, and believe that, in a perfect world, we’d invest time and resources in TNAs whenever we could.

That’s why I was initially surprised when a debate erupted recently over the definition of Training Needs Analyses. On reflection, though, it is clear that “TNA,” rather like “e-learning” or “evaluation”, is one of those Alice in Wonderland terms that mean exactly what you want them to mean.

As a result, half a dozen learning experts can carry on a conversation about TNAs all day, not realising that they are each talking about rather different things. And a training department can commission a consultant to carry out a TNA, and neither party will know until it is too late that they were each working with very different concepts.

To some, a TNA is a big-picture strategic process that helps define which performance gaps are best addressed by training, and which are best addressed by other means. To others, it is a project-specific tactical analysis which assumes that training is called for and seeks to examine the learning environment and define the optimal instructional processes. To others still, a TNA falls somewhere in the middle, or is an operational process designed to aid in planning and budgeting.

But whatever the motive or focus level, most will surely agree that the outcome of a TNA is a clear idea of what has to be done and how it is best approached. The TNA may also tell you what the recommended approach might cost and what return might be expected.

My own perception of Training Needs Analysis is as a bridge between the strategic and the tactical. I see a TNA as being most useful at a project level, where a specific performance gap has been identified, and objectives to close the gap have been defined.

A TNA would tell you how much of that gap might be addressed by training and at what cost. And, if training is subsequently allocated a role, a more detailed TNA would define precisely what learning objectives would be met, and in what way.

Unfortunately, reality often intervenes. The time and cost taken up by a formal TNA are frequently just not worth the potential benefit. The individual judgment, instincts, and insights of those in training management can often produce a much faster, more practical ‘analysis’ than a formal study. And, in truth, the only time it is worth doing something more formal is when the cost of making a mistake is simply unacceptable.

There is absolutely nothing wrong with this approach. It is a perfectly valid way to make decisions. And, despite its not being a formal research-based study, it is still a TNA. To me, a Training Needs Analysis is any analytical process that seeks out and examines relevant information, qualitative or quantitative, with the objective of determining the nature of the role that training can or should play in resolving a performance problem or exploiting a performance opportunity.

In the real world of market research, multi-million dollar budgets are frequently decided on the mutterings of a few dozen people in focus groups and the collective experience of the marketing decision-makers – simply because a statistically significant “proper” research project might take too long, cost too much, and produce a result that you can’t take to the bank anyway.

I am, sadly, an old-school geek, having spent a large part of my academic and work life in the market research field. As a result, I get unreasonably excited about anything that pretends to be a rigorous analytical study but is, in reality, a sloppy or token gesture. I am not talking about the calculated expertise-based decision-making approach mentioned above, but about the attempt to pass off pseudo-studies as proper research and analysis. I know how easy it is to “prove” just about anything with research, and how assuming what you want to prove can result in a study that conveniently reaffirms your expectations.

I have seen many “TNAs” do just that. They are designed to discover a set of needs that magically coincide with a pre-determined solution. Company training departments are less guilty of this than large outsourced training vendors, who have a rather fixed set of offerings to sell.

They find it more expedient to “customize” their customers’ perceptions of their needs, than to customize the nature of their solution. Probably the worst culprits are the big consulting firms who have a batch of “processes” in their database that they try to surreptitiously syndicate across as many client companies as possible, all under the guise of original tailor-made work. They use token TNAs to do the selling job for them, and usually get paid substantially for those TNAs as well.

It has been said that it doesn’t matter what you call it so long as you do it well. I disagree. While we may never reach an industry-wide consensus, aligning our thinking on what a TNA is does matter. The more we outsource the different aspects of the training role, the more open we become to confusion, ambiguity, and waste, especially if we don’t take the time to define our terms and – more importantly – demand that our business partners define theirs.