

The Devil's in the Details: 7 Tips for Change Readiness Assessments

There are plenty of change management models to choose from (Kotter, ProSci, Bridges, et al.), and several learning-development approaches (ADDIE, iterative, or a combination of the two). While the theories behind each approach are fairly easy to understand, things can get a little tricky when you actually have to execute them.

One critical component shared by all the models – whether for learning or organizational change management (OCM) – is the up-front needs assessment phase. Theoretically, the assessment phase is pretty simple: you have to understand your needs so you can recommend appropriate action. And the steps to perform the assessment usually sound straightforward: interview key stakeholders, run a survey, analyze a bunch of documents, and voilà – you have a plan of action.

In reality, however, there are a lot of pesky details that can – at minimum – slow you down and – at worst – put your whole program at risk.

Practicing the Preaching

Here are a few simple things to remember the next time you find yourself assessing a new project. Keep in mind: these tips are more pragmatic than theoretical. There’s plenty of information on the theory behind these assessments elsewhere (books, whitepapers, bloggers, conferences, etc.). I’ve chosen to provide some detail-oriented insights that you won’t find in those sources.

1. Get Both Qualitative and Quantitative Information

While most organizations are happy to allow interviews with subject matter experts and project team members, some balk at conducting end-user surveys to gather a broader, more quantitative data set. Usually it’s because “we just did a survey last year …” or “we know what works with our learners.” Don’t give in to the pressure to skip the survey; it will allow you to confirm or debunk long-standing organizational assumptions.


Often, a key executive makes a blanket statement that is quickly accepted as an absolute truth (e.g., “Our learners hate eLearning…”). Since no one wants to question the senior leader’s judgment, this unsubstantiated, and often outdated, viewpoint can have a serious impact on the project. If you have a hunch that eLearning might actually work for your learners, there’s no substitute for hard data gathered from an online survey of your entire learner base.

2. Avoid Focus Groups at All Costs

On one recent project, I agreed to facilitate a focus group to gather information from a cross-functional group of about 15 employees. It was a minor disaster. I had difficulty getting meaningful input from more than four or five participants. The two most vocal participants, I learned later, were not employees at all, but worked for the system integrator that was contracted to customize and configure software at the heart of the business change.

Clearly, these vocal individuals had a stake in the assessment outcome. And because they were articulate and influential, they used the large-group dynamic to sway the opinions of many and to suppress the opinions of some.

3. Take Interview Notes by Topic, Not Person

Obviously, you have to interview people one at a time. But that doesn’t mean you need to organize your notes that way. By taking your notes on a by-topic basis, you can save a huge amount of time and energy when you go back and analyze them for common themes.

Previously, I had always grouped my notes by interviewee. After all the interviews were done, I had a whole bunch of MS Word documents with pages and pages of notes – each interviewee in a separate document, like this:

[Image: Change Readiness Assessment Sample – interview notes grouped by interviewee, one Word document per person]

Here’s the problem: when you’re looking for common themes and you’ve interviewed a couple dozen people, you have to open every Word document and scroll through each one to find the matching topics in each set of notes. This approach is so inefficient that I can’t believe it took me so long to try anything else.

I finally realized it was more efficient to take notes by question, not person. So, I began using Excel instead. This way, I only need one document, and all my findings can be reviewed by topic. Since I’m not reporting the identities of the interviewees, it’s less important that the notes be grouped by individual. The new way looks like this:

[Image: needs assessment questions – one Excel workbook with a tab for each question]

During the interview, I just click on each tab – Question 1, then Question 2, and so on – essentially “pre-grouping” the responses, so summarizing common themes takes a fraction of the time the old way did.
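
As a side benefit, the one-workbook, one-tab-per-question layout is easy to mine programmatically. Here’s a minimal Python sketch, assuming a hypothetical file named interview_notes.xlsx in which each tab holds one question’s responses in columns named Interviewee and Response (your file and column names will differ):

```python
import pandas as pd

# sheet_name=None loads every tab into a dict of {tab name: DataFrame}
sheets = pd.read_excel("interview_notes.xlsx", sheet_name=None)

# Scan each question's responses for a recurring theme, e.g. "training"
keyword = "training"
for question, notes in sheets.items():
    hits = notes[notes["Response"].str.contains(keyword, case=False, na=False)]
    print(f"{question}: {len(hits)} of {len(notes)} responses mention '{keyword}'")
```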

4. Capture Demographic Data

Since we've already determined that you're going to do a survey as part of your next assessment, the next step is to be sure you capture as much demographic information as you can. Demographic details can help you understand how employees’ opinions align with their geographic location, tenure with the organization, business unit, job function, seniority, and even who their manager is.

To analyze the data, you can use whatever survey tool you’ve chosen (SurveyMonkey, Wufoo, etc.), but none of these tools really has the power of plain old Excel. Of course, if you don’t already know how to create pivot tables in Excel, you’ll need to learn, but you should view this as an opportunity, not a roadblock. Running pivot tables on huge sets of data is one of life’s little-known joys.

In some cases, it’s OK to simply report an “average” score for each survey question, but it’s much more useful to show how the responses break down by question and demographic group – something that’s virtually impossible to do without pivot tables.

For example, let's say you have a set of questions like this:

[Image: Change Assessment Questions – a sample set of five Likert-scale survey questions]

If you just want to compare responses at a high level, you can substitute numbers for each of the responses (5 = Strongly Agree and 1 = Strongly Disagree) and calculate an average score for each question. This is useful for highlighting areas of concern relative to one another. Here’s an example of what the summarized data might look like for these five questions:

[Image: Organizational Readiness Survey – average score for each of the five questions]

As you can see, all these questions scored between 3.0 and 4.0 on a 1 to 5 scale. Frankly, this doesn't tell us much. Because "neutral" was one of the options on the survey, average results tend to gravitate toward 3.0.
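
If you’d rather crunch these numbers in code than in Excel, the average-score summary is only a few lines of pandas. This is just a sketch; the file name (survey_results.csv) and the column names (Q11 through Q15 for the five questions, one row per respondent) are stand-ins for whatever your survey tool exports:

```python
import pandas as pd

df = pd.read_csv("survey_results.csv")

# Map Likert labels to numbers: 5 = Strongly Agree ... 1 = Strongly Disagree
scale = {"Strongly Agree": 5, "Agree": 4, "Neutral": 3,
         "Disagree": 2, "Strongly Disagree": 1}
questions = ["Q11", "Q12", "Q13", "Q14", "Q15"]

# One average per question -- quick to produce, but it hides the distribution
averages = df[questions].apply(lambda col: col.map(scale)).mean()
print(averages.round(2))
```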

Luckily, we asked respondents to tell us (among other things) their role relative to the project, and their department. By summarizing the data with pivot tables, we can get more useful insights into how employees really responded.

Here is the same data, focusing just on Question 15, and broken down by role and department ("neutral" responses have been removed):

[Image: Change Readiness Survey – Question 15 responses, broken down by role (left) and by department (right), with neutral responses removed]

The second set of graphs tells a much richer story. The graph on the left clearly shows two big issues. First, more than half of Project Team Members and System Users feel the future state has not been explained. What's worse, senior management believes just the opposite. In the graph on the right, which shows responses grouped by function, we can see which specific departments are really pushing the results into the red.
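
Here’s roughly how that breakdown could be produced in pandas, continuing the sketch above and assuming the survey captured the demographics in hypothetical Role and Department columns:

```python
import pandas as pd

df = pd.read_csv("survey_results.csv")

# Drop the "Neutral" responses for Question 15, as in the graphs above
q15 = df[df["Q15"] != "Neutral"]

# Pivot-table equivalents of the two graphs: count each response option
# by role, then by department (pd.crosstab would give the same counts)
by_role = q15.pivot_table(index="Role", columns="Q15", values="Department",
                          aggfunc="count", fill_value=0)
by_dept = q15.pivot_table(index="Department", columns="Q15", values="Role",
                          aggfunc="count", fill_value=0)
print(by_role)
print(by_dept)
```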

5. There's No Such Thing as a "Draft" Assessment

Many times, you'll be asked to share a "draft" or "advance copy" of your assessment report with certain key stakeholders. They will tell you they "just want to know what to expect," etc. Do not share previews of your assessment with anyone who has a stake in its findings.

On one recent project, we found that a major ERP implementation was moving forward with no resources assigned to the training effort and no plan for how training would be created or delivered. We made the mistake of sharing an "advance copy" of the report with a couple of team members and, amazingly, by the time we delivered the findings to senior management a few days later, a training team had been formed, along with a 20-page "training plan." Not surprisingly, the team was made up of borrowed resources, and the "plan" was a blank template pulled from another project.

Of course, you may share a draft of your report with a trusted adviser, so it can be examined for clarity and rigor.

6. Move Quickly

Because a learning or change readiness assessment is essentially a "snapshot" of the current situation, it's important to move fast. If you wait a month, or even a few weeks, to report your findings, you may appear misinformed, since things may have changed since you finished gathering data.

So, to be sure you're ready to present your findings quickly, start building your presentation or report in real time. Create a thesis statement after the first couple of interviews. You can always change it later; the point is to create a framework for your report as quickly as possible, so once your data collection is complete, you can finish the entire document in short order.

If you can perform the assessment quickly, with minimal disruption to the organization, it will be easier to make a case to "re-assess" the situation in six months – so you can measure the impact of your learning or OCM program interventions.

7. No Executive Summaries

If you're asked to present your findings to an executive audience, do not create a separate, abridged version. Instead, choose a subset of slides that you'd like to present and use PowerPoint to hide the rest. Or skip quickly through some slides. If you must create an executive summary, put it at the beginning of your report and label it as such; never create a separate document that serves as an executive summary.

The point is, your entire report needs to remain unaltered because, ideally, it tells a story and makes a solid case for a certain course of action. Once you begin to abstract your findings by separating them from the supporting data, they instantly begin to lose credibility and start to look like suggestions rather than imperatives.

That's Just the Beginning

If you've ever done a full-scale OCM or learning program assessment, you'll know that this list just scratches the surface. There are a lot of landmines out there. If you have performed such assessments and have other ideas, I'd love to hear them. This seems to me like an area where we as learning and change practitioners have a lot of room for improvement.

Not sure what to ask during the assessment phase? Download one of our needs assessment guides to get started!
