Wednesday, October 11, 2006

Learning Circuits Big Question -- follow-up

The big question follow-up: So what can we do about it?

Here is the new question:
If you think it's important that everyone be blogging, how do we get there? If you agree the goals I've just listed are important but blogs aren't the answer, what is?
As I noted in my previous post on this subject, I don’t think it’s important that everyone be blogging. In fact, the longer I think about this, the less I like the idea. Here is what I keep coming back to – what does it mean to be a professional? And what does it mean to be a typical elearning professional in a corporate setting?

It seems to me that the average elearning professional in the US is likely working for a corporation, and their primary mandate is to meet the business needs and goals of whoever is paying their salaries. Their primary goal is not career development or professional development, it’s business impact. Insofar as blogging helps us achieve the latter, I think it’s great. Insofar as it achieves the former at the expense of the latter, I think we need to question our priorities.

The mere fact of being a knowledge worker doesn’t entitle us to prioritize our time to the benefit of our own professional growth at the expense of company goals. If someone’s primary job is administering an LMS or delivering instructor-led training or developing PSS, how exactly would blogging factor in? I don’t mean “what might they blog about?” – that would be self-evident. What I mean is “how would blogging help them perform their core jobs more effectively or efficiently?” How would blogging help them deliver greater business value? I’m pretty sure that these elearning professionals wouldn’t be allowed to surf Fark all day at work, and by the same logic, they shouldn’t be blogging all day at work either. While one certainly has more value than the other (and I’ll let you decide which : ), if neither contributes to some sort of bottom-line business impact, they are equally a waste of time as far as the business is concerned.

Different rules apply of course if you are a consultant or a vendor, or if you are willing to contribute on your own dime and on your own time. And maybe different rules apply outside the US. I agree with most of Tony’s points about the value of blogging to the individual; I’m just not sure I agree with its value to the business, particularly when it’s phrased as “everyone.” Organizations will not benefit if every one of their learning professionals is spending 2 hours per week blogging. In a large organization with, say, 300 learning professionals, that means 600 hours per week spent blogging, or some 30,000 hours per year. I think it’s safe to say that with 30,000 hours, an organization could design a learning or performance initiative that might have more impact than the results associated with blogging.
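
For what it’s worth, here is the back-of-the-envelope math behind that figure, as a quick sketch in Python. The 300-person headcount and the 50 working weeks are my illustrative assumptions, not numbers from any real organization.

    # Back-of-the-envelope estimate of organization-wide blogging time.
    # Assumptions (illustrative only): 300 learning professionals,
    # 2 hours of blogging per person per week, ~50 working weeks per year.
    professionals = 300
    hours_per_week_each = 2
    working_weeks = 50

    org_hours_per_week = professionals * hours_per_week_each   # 600 hours/week
    org_hours_per_year = org_hours_per_week * working_weeks    # 30,000 hours/year

    print(f"{org_hours_per_week} hours/week, {org_hours_per_year:,} hours/year")

Tweak the headcount or the weekly hours and the point stands: at enterprise scale, even a small per-person time commitment adds up to a full initiative’s worth of hours.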

As to how else to achieve and practice the professional characteristics Tony mentioned:
  • being self-reflective,
  • being collaborative,
  • being rigorous in supporting our positions,
  • open to feedback,
  • understanding our point of view and learning to share it,
  • working knowledge of new technologies

there are lots of ways. But first, it’s worth noting that blogging will not magically endow you with these characteristics. In many ways, I think a desire to blog about elearning probably means you already possess many of these traits, which together are really sort of a wish-list of elearning professionalism. If you are a doctor, you probably already have an interest in helping people and in exploring and learning about complex subjects, and maybe a fair degree of detective-like, analytical thinking skills. You get the idea. In other words, you possess the professional traits one would expect of someone in the “doctoring” profession.

The folks who have self-selected blogging likely possess the traits that would lead them to blogging. Just as the universe happens to have favorable physical laws that lead to the formation of people who then marvel at how weird it is that the universe happens to favor the formation of people, we now have bloggers who believe that people should blog so that they will possess the traits of bloggers which is what led them to blogging in the first place. Do we blog so we can be blogger-like or are we blogger-like and therefore blog? Ok, I’m clearly having too much fun with this.

As to other ways to achieve this sort of Zen-like self-reflection in collaborative openness and communal sharing of future technologies? How about talking with colleagues within your organization over lunch? Better yet, during the design of the next learning initiative. Or maybe during an elearning conference? Or by commenting on other people’s blogs or through listservs or bulletin board style interactions?

Is there something about authoring a blog which imbues it with more importance or significance than a verbal dialog with peers? I suppose there is the permanence factor and the ability for a larger, wider debate. Of course, with this, you also lose intimacy. Do I learn more from strangers who tell me my design “sucks” or do I learn more from peers who tell me my design “could use some work”? I don’t know. What is unique about blogs is the idea of “putting yourself out there” – “these are my words and thoughts for good or for ill. This is what I believe at this moment.” And in that sense, a blog requires more rigor perhaps than a discussion, more clarity of mind and more internal self-reflection about “what I really think” on a particular subject. But I’m not sure that this is an artifact of my writing a “blog” or just the fact that I’m writing. I suppose either way, the impact is the same. To derive the same unique benefits of blogging, there may be no choice but to publish and distill the cacophony of everyday thinking into the coherency of the written word.

Which of course leads to the question – what does a typical elearning professional need to think so deeply about anyway? For the typical elearning professional, are there enough “boy, I really need to think through this subject” situations on a daily or weekly basis to justify the time and energy of creating and maintaining a blog? I have no idea, but I’d be curious as to what others think.

Tuesday, October 03, 2006

The Big Question -- Should all learning professionals be blogging?

Should all learning professionals be blogging?

So this month’s Learning Circuits Blog question is on the subject of blogging. Which I guess makes this response some sort of meta-blog. I suppose another question we could be asking is “should learning professionals who already blog comment on whether learning professionals should be blogging?” Isn’t this a bit like asking a group of Republicans whether all taxes should be lowered? And really, can there be any answer but “yes and no”?

Yes
Why? For me, it’s a simple issue of “practicing what you preach.” Blogging, wikis, podcasting, WBT’s, simulations, instructor-led delivery (live and virtual), EPSS… it all has a place and a function in helping drive organizational performance. But knowing when to use each requires some understanding of the pros and cons, which is best developed through usage and experimentation. While you can probably argue that a theoretical understanding of the relative merits of each intervention is sufficient for most learning professionals to make accurate decisions on when to use each, I’m not sure that you really “get” this stuff until you do it. And until you “get” it, how do you design your solution? The best performance-centric solutions are ones that blur the traditional delivery lines: WBT with PSS with a Wiki for on-going maintenance, or Virtual Classroom with Simulation and a weekly Video Podcast on the latest product news. You get the idea. Until you understand your choices as both a consumer and a producer, you are “book” smart, and you will be harder pressed to innovate to meet your organization’s unique training and performance needs.

No
Why? For me, it’s a question of “time management.” Blogging, wikis, podcasting, simulations, virtual classrooms, LMS… all new technologies take some time to investigate. But some of these are investigations of technology and infrastructure, while others are investigations of fundamental content development models. I can investigate LMS’s without necessarily implementing one or developing courses specific to the LMS. But blogging, wikis, podcasting… these are transformations in the way we communicate – they are content-centric, and therefore, the level of investigation required to have a “gut” level understanding of these technologies is non-trivial. To get a “feel” for the impact of blogs and wikis and podcasting, I should, at a minimum, participate as a learner, but ideally as a producer as well. That works for me as a consultant / vendor in this space, but I know from working with my clients that 90% of them don’t have as much time as they would like to work on existing projects and business initiatives, let alone take on the task of investigating new technologies that will likely consume a lot of time. So “no,” there are too many other things that will help training organizations deliver real-world business impact, and blogging for the sake of blogging is not the best use of time for most elearning professionals.

I suppose the ideal answer is “maybe” -- maybe use a blog as a kind of internal newsletter on some business-relevant subject, or as a change management communication vehicle for an upcoming initiative? Or maybe dedicate a few key individuals to act as learning R&D to investigate new technologies and their uses. While I’m strongly in favor of all of us learning and experimenting with new techniques and delivery models, I’m also passionately committed to the notion that we should be delivering business value first and foremost.

Just as we wouldn’t want every programmer on the team to spend hours each week learning .NET if we were a J2EE house, we probably don’t want every member of the elearning community writing blogs when there are still thousands of hours of classes and WBT to deliver. On the other hand, we probably would want to use .NET on the first project where it made sense. And we might want a sub-set of the team to focus on investigation and R&D on emerging technologies. It seems to me that a similar model should hold for elearning. As with all forward-leaning professional development in any field, the key is striking the right balance between “big picture,” strategic, “what if” activities and the need to deliver real-world business value today (if for no other reason than the selfish desire to be employed long enough to get to the cool stuff… ; )

Thursday, September 21, 2006

Performance Management -- a cautionary tale

In my previous post, I defined what I meant by "performance-based learning." In this one, I want to tell you a story about the danger facing a related movement – the Performance Management movement – you know, the idea of managing performance through an LMS or a third-party tool that every analyst is talking about? Obviously, performance management and performance-based learning are related concepts. One is about how you measure performance and one is about how you improve it. Simple, right? Maybe not. Read on.

Here’s the context and the prelude:
This past week I was at a client site to discuss their need for custom learning and content development tools – in this case, Firefly and Firefly Publisher (a software simulation tool and its big brother, a collaborative content authoring tool). The client is an existing LMS customer who has recently purchased an enterprise license of our LMS (KnowledgePlanet). Since our LMS also has some pretty sophisticated performance management features, they also planned to use the LMS to manage skills and competencies, report on skill gaps and do some limited succession planning in the future. The key for them was the value in being able to link skill gaps and succession planning models directly to learning. They saw this as a key advantage to the solution. Keep that in mind.

So fast forward to this past week. I had been asked to go to the client site to discuss tools and custom eLearning. As an expert in both areas, I didn’t need to do much prep. Just loaded the usual demos, worked on a few simple conceptual PPT slides, and off I went. In retrospect, I should have been slightly more curious about a company that had all the cutting-edge learning and performance infrastructure anyone could ask for, and yet wanted an overview of our custom content and tool capabilities. Relative to all the planning that goes into an LMS and performance management strategy, eLearning content development is pretty straightforward: task and audience analysis + performance objectives + learning and corporate culture + infrastructure + timeline + budget = courseware design and cost. Not exactly rocket science. And as for the tools we offer, they speak pretty well for themselves and are recognized as industry-leading solutions, so again, not a lot of research or demo required to determine fit.
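
To put a number on the “not rocket science” claim, here is a minimal sketch of the kind of first-pass estimate I mean. It uses a common seat-time-times-development-ratio heuristic rather than the full analysis above, and every figure in it (the 200:1 ratio, the blended rate, the seat time) is a hypothetical placeholder, not a number from this engagement.

    # First-pass courseware estimate: seat time x development ratio x blended rate.
    # All figures below are hypothetical placeholders for illustration only.
    def estimate_courseware(seat_time_hours, dev_hours_per_seat_hour=200, blended_rate_usd=85):
        """Return (development hours, rough cost in USD) for a custom eLearning build."""
        dev_hours = seat_time_hours * dev_hours_per_seat_hour
        return dev_hours, dev_hours * blended_rate_usd

    hours, cost = estimate_courseware(seat_time_hours=4)
    print(f"~{hours} development hours, roughly ${cost:,.0f}")

The real variables, of course, are the ones in the list above – audience, culture, infrastructure, timeline – which is exactly why the arithmetic is the easy part.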

So what the heck was I doing there? Well, as it turns out, this well-respected “everyone knows their name” company who is spending significant dollars on a massive LMS and performance management initiative had zero, yes ZERO, eLearning to distribute with either system. Zero, like nada, zilch, nihilo, zip. None. Nor did they have any in-house expertise to build any. Nor did they have any budget to hire new people to build any. They did have some limited budget for custom content development. And they also had some limited budget to train their “coordinators” to build some simple custom content.

I was there to discuss our custom content capabilities and to show them our tools, specifically to help them determine if they were “too complex” for their “not courseware developers, not instructional designers, not trainers -- some other job function” resources to use successfully in developing learning. I was also there to help them budget for eLearning for the coming year so they could outsource key projects as budget allowed. Needless to say, I was surprised and alarmed. Given the consultative nature of our sales process, I was also confused as to how we could have sold such a complex solution to a customer with so little understanding of what it was going to take to be successful. As it turns out, there wasn’t much we could have done differently on this front.

So let’s see, what’s wrong with this picture? Here is just a partial list:

Cart before the horse
Building infrastructure before you build courseware is like building a house before you have any furniture, curtains, rugs, silverware, dishes, or a TV (which may be OK if you set money aside to furnish it after you build it, but not so OK if you don’t…). Building infrastructure before you staff instructional designers and course developers and content experts is like building a restaurant without planning for chefs, hostesses and wait staff. And trying to build a training practice staffed with people who have zero ID or training experience? Ugghh. “The tools you choose are the least of your worries…” came to mind at least half a dozen times. I wanted to tell them that their plan was sort of like buying a Corvette and filling the gas tank with sand, but I didn’t.

Poor alignment between goals and methods
In any talent management or performance management engagement, the central goal should not be the identification and quantification of gaps. Nor should the goal simply be succession planning or a comprehensive catalog of skills and competencies by job role. All of this is secondary, even tertiary, to the real business goal – improved performance leading to the successful completion of business objectives.

Tracking does not improve performance, identification of skill gaps does not rectify them, and having a succession plan doesn’t magically endow the successor with the requisite skills. What does? Training, learning, practice, mentoring, cognitive and task-oriented apprenticeships, systems changes, process changes... Central to any performance or talent management initiative, therefore, is the need for a comprehensive training and support plan, a significant chunk of which will need to be e-something – eLearning, e-Performance Support Systems, e-documentation, e-Seminars (recorded or live)… Logistically speaking, if the org is big enough to consider comprehensive talent or performance management, it is almost certainly too big to do all the required training via live instructors.

Poor alignment between business units internally
How was it that we were able to sell an entire LMS without visibility into the content development plan? Simple really. The infrastructure purchase and the content development responsibilities were aligned with different business units. IT owned infrastructure; HR owned corporate training; each business unit owned its own training development. No one had the big picture. HR started the performance management initiative, which clearly required technology. So IT took the lead and worked with HR to define needs and whatnot (this ultimately became another bit of irony since the entire thing is hosted by us and therefore requires nearly zero IT involvement from the client -- in other words, IT never should have driven this process in the first place). Along the way, a few questions were asked of the business units, but no one seriously inquired as to the existing state of content or existing budget or existing skill sets. Since there were no complaints, HR just assumed training was under control at the business unit level and that once a corporate LMS and performance management strategy was unveiled, the content would just start populating itself. Sort of an “if you build it, the content will come” strategy.

So why didn’t we know? After all, we sold them the solution. Didn’t we do our homework? Sure we did. We asked and were told there was content in every business unit. Business unit reps were in a few sales and consulting meetings and never raised any concerns. The IT guys didn’t seem to have any idea what content meant and the HR folks were mostly concerned about system admin and reports and the like. So even though we asked the right questions and we seemingly got the right answers, the reality is that we didn’t have the right people at the table.

In retrospect, the business units were woefully underrepresented. And as it turns out, they often sent different reps to each meeting. They were often asked at the last minute because HR forgot to invite them, and worse, because they were in the business units actually delivering business value, they often had deadlines and commitments that resulted in missed meetings. It also became clear as we learned more that the business units just assumed HR was going to take care of all the training. It never even crossed their minds that they would need to build hours and hours of training without any additional support or staff. After all, their main job was production and generating revenue, not developing training.

So in the end, IT and HR were both well-represented, but no one was really representing the learners or the business units. HR and IT both had executive level involvement. The business units had managers and directors. HR and IT had dedicated resources and time to spend on this initiative. The business units tried to find time in-between production. Because of this, the one group who was expected to do the most to make the roll-out successful had the least input and the least involvement. And, as a result, the one group who was supposed to benefit the most is the least likely to see real benefits from the solution.

Change management and failure to plan
What happens when the systems go live and there is little to no business-unit specific eLearning available? What will be your opinion of this system? What would your CEO’s opinion be? What’s the likelihood you will view the Talent or Performance Management System as a success if you can’t improve your talent or performance in any new or valuable ways? How likely are you to embrace it?

At the end of the day, Performance Management is not about measuring performance, it’s about improving performance. To do true performance management, you can’t just put an architecture in place to measure gaps and manage succession planning. Learning must be connected to the plan, and not just hypothetical learning you might someday build but real learning and training resources that exist today. You must plan for how you will actually change the performance of the organization. While it’s true that you can’t manage what you can’t measure, it’s equally true that you can’t improve anything just by measuring it. Knowing that the convenience store is a mile away doesn’t help me get there. For that, I need a pair of sneakers, a bike, a car, or some other conveyance to move toward my destination. In the case of corporate performance and the personal performance of individual employees, there are a number of appropriate vehicles that should be considered: eLearning, simulation, performance support, process changes, systems changes…

While all of this may seem blatantly obvious, I’m not seeing nearly enough attention devoted to this in the hype around Performance Management. HPT (Human Performance Technology) and ISPI (International Society for Performance Improvement) should be factoring significantly into these discussions, but so far, there’s a lot of talk about saving money by automating processes and not much at all about improving the bottom line through performance improvements. True visionary companies are not the ones who put in place architecture to measure and track; true visionaries are the ones who measure and track so they can make tactical improvements in the performance of individual employees to better enable the organization to achieve its business objectives.

Lessons learned
So lessons learned on my end? Don’t make even rudimentary assumptions about the state of the organization, regardless of its reputation in its vertical. Do ask more questions about the whole strategy and the overall plan. I’m not sure this would have changed much in this case as this organization was hell-bent on buying an integrated LMS and performance management solution, but maybe for a more enlightened organization, more of the right questions will put the focus where it needs to be – on performers and performance and the methods and means by which to improve both.

I also think I'm going to spend more time talking about the importance of connecting the overall learning strategy to organizational goals. For years, the pendulum was toward architecture -- LMS's were king. Then it seemed to swing back to content for a while, with simulation tools and Wikis and interactivity tools leading a counter-charge in favor of content. Now I think we're back the other way, where analysts seem to be focusing on infrastructure again, this time with Performance Management and Talent Management. As with LMS's, these buzzwords don't really mean anything in the absence of a content strategy.

I don't know. Maybe they are making the same mistake I did and assuming that the need for content is self-evident. Clearly, however, this is wrong. Now, more than ever, we need to think about a holistic strategy to address performance: including Wikis, functional or departmental blogs, podcasting, simulations, collaboration, PSS, nano-learning, etc... Layering yet another heavy infrastructure onto a content-weak model seriously undermines the likely success of a solution or any chance for real business impact. On the one hand, it's sad to still have to focus on such basic issues as connecting a learning and performance strategy to organizational goals. On the other hand, maybe this need never goes away and we just need to keep hammering home the idea that the biggest value of elearning lies in actual performance improvement, which is inextricably tied to quality content.

Tuesday, August 01, 2006

Why Performance-based Learning?

So, after 15 years in the learning space, I finally decided to jump into the fray with my own blog on all things learning. Technically, I suppose someone could nitpick and claim that since I titled the blog “Performance-based Learning” it must be about “performance-based learning” and not just learning. So what’s the difference? Why the distinction? Shouldn’t all learning be about performance?

Let’s start with the last question first. Yes. In a corporate setting, all learning should translate into performance. Just as you wouldn’t invest in new hardware for no reason, you shouldn’t invest in training for employees with an expectation of the same old performance and efficiencies. If there isn’t a demonstrable ROI, why do it?

Why the distinction? Well, for starters, too many people talk about students and learning when we should be talking about performers and performance. Even experienced people in this field sometimes talk about learning as if it’s an end rather than a means. Frankly, I’m tired of hearing about award-winning training – what about award-winning ROI?

The words “Performance-based Learning” are an attempt to cover a broad range of related concepts in an anagram-ish sort of way. They describe not only the end state (learning that results in performance improvements) but also the means, namely learning methodologies that rely on performance as a training technique. Or more succinctly: learning that results from performance and learning that results in performance, namely workplace performance.

This is where I’ve spent my life for the last 15 years – developing products, consulting, evangelizing, selling, presenting at conferences… Given the recent buzz from analysts in this area, it seemed that some clarity might be useful from someone who has been advocating this approach since prior to its adoption as the latest eLearning fad.

So what’s the difference between “Learning” and “Performance-based Learning”? Ideally, none. Through real world examples, I hope to show the value of learning that impacts the bottom-line while also making the case for a broader view of learning that includes performance support, mentoring, human factors consulting, and process analysis. I hope that you find this blog useful. More importantly, I hope that you tell me I’m crazy now and then. None of us have all the answers on this stuff, but through dialog, maybe we’ll find that we have most of them… Happy reading.