Chris from Coursecheck talks about the importance of customer feedback

Episode 2 - The importance of collecting customer feedback, and how to do it well

Chris Wigglesworth from Coursecheck joins us in the accessplanit office to talk about why it's so important for training companies to collect customer feedback, how to design your feedback forms, how to ask the right questions, and how to use your feedback more effectively.

Transcript from The Everest Podcast

Series 1 Episode 2

Chris Wigglesworth from Coursecheck joins us in the office to talk about the importance of gathering good customer feedback, and how to do it well.


Amity:

Hello, and welcome back to The Everest Podcast. This is our second episode. Today the podcast will be hosted by myself, Amity, and my colleague Hannah from accessplanit.

Hannah:            

For our listeners who aren't familiar with what we do already, here at accessplanit we provide training management tools that help customers be the most effective they can be, which is what this podcast is all about. We cover topics such as marketing, course feedback, understanding your learners, and running effective events. Today we're joined by Chris from Coursecheck. He's going to be sharing with us how to collect effective feedback, as well as how to use that feedback to market your courses.

Amity:                

Chris, why don't you tell us a little bit about yourself and what Coursecheck is all about?

Chris:                 

Thanks, Amity. Yes, delighted to do that. I run Coursecheck. I set it up after having run a training business, so I've had some first-hand experience of being on that side of the fence, if you like, and of the challenges involved in collecting feedback and making good use of it. I suppose I'm that classic case of, "I wish we'd had Coursecheck when I had the training company," because it was really designed for people like me and other training businesses to use. So I've got good first-hand experience of it, and that's what I do now. We're building the company up and dealing with anyone who runs training courses.

Hannah:            

Okay. Interesting. We obviously have a similar backstory, so we've got lots in common. accessplanit started as a training business as well, and similarly we thought, "I wish we had this at the time." So we built it.

Amity:                

We're going to talk a little bit about what customer feedback is and why it's so important. My first question to you is: what do we mean when we talk about customer feedback and feedback forms? What does collecting feedback look like, and what are some of the ways people do it?

Chris:                 

Yes. It's one of these areas that sounds very simple, doesn't it? Collecting feedback. But actually, when you look at all the different ways that people do it, you realize that it's quite a complicated subject. At one end of the spectrum, if you're going through an airport, you often see those little smiley faces. "Was that a good experience or bad?" And you're just pressing one of three buttons. And then at the other end of the spectrum, you've got extensive survey forms with lots and lots of questions and text. It's one of those things where you can go into any amount of detail, or very little. Most companies recognize that it is something worth doing. But what we find is that a lot of companies just don't do it particularly well. They want to do it, but they struggle to actually get business value out of the process and can get quite disillusioned with it.

Amity:                

It's kind of an afterthought for a lot of companies.

Chris:                 

Yeah. In between those extremes you've got things like SMS-type messaging. We've all probably had that on our mobile phones: "Rate your satisfaction one to five, or one to 10." And some companies will do extensive interviews with customers and get very detailed feedback that way, rather than trying to get something from everybody. Lots of ways of doing it.

Amity:                

What would you suggest? What would you say to people who aren't actually doing it yet-

Hannah:            

Or not very well.

Chris:                 

What I would say to them is: perhaps you've formed an opinion that it's just not worth doing, but we would say, especially in this day and age, if you're in any kind of business, it's very hard to argue that listening to your customers is something you shouldn't bother to do. We live in a social age where proof of quality is really, really important. I think it's increasingly hard for anyone to say, "Look, I just don't think it's worth it." But it's not worth doing unless you're going to do it well and get some value out of it. I think a lot of people perhaps pay lip service to it rather than not do it at all. So I think an interesting thing to talk about is the best practices around actually going through the process and making it useful.

Amity:                

And I know people can implement it for marketing purposes as well as just seeing how a course went.

Chris:                 

Yeah. Those are the two main reasons for collecting feedback. There are a few more, actually, but primarily, yes: first, as a quality control tool. How good is our training? And secondly, we can use feedback for marketing. It can be a very valuable tool in that respect; I'm sure we'll talk about that more later. But there's also value in things like: is it making a difference? A lot of feedback tends to be about the reaction to the training. Did you like the trainer? Was the lunch okay? That sort of thing. Rather than: did you actually learn anything, and are you going to remember what you've been taught?

Amity:                

Yeah. So, following on from that, when people are thinking about designing their feedback forms, obviously it comes down to asking the right questions to get the most out of it. What are your recommendations for-

Chris:                 

Yes. It's a big, big subject area, actually. What questions to ask. And designing good questions is really, really key. It's a bit like rubbish in, rubbish out.

If we don't design our survey form with good quality questions, we're not going to get useful data out the other end. But essentially, we want to be measuring, yes, their reaction to the training. It's fair enough to want to do that sort of thing, but we also want to focus on the learning, and the likelihood that the course is going to make a difference to you after you've been on it.

There are lots of different question types you can ask. Obviously score-based questions are fine as a sort of health check, if you like. Going back to the earlier example in the airport: that's great if everyone is mainly hitting the green button to say, "Yes, it was fine." But if everyone is hitting the red button, you have no idea what's gone wrong. So the only way you're going to find out what you could do better is by asking much more open questions, questions that the learner has to think about a lot more before they answer.

I'll give you an example. If we're talking about the trainer: how good was the trainer? Well, the simple way of doing it would be to say, "How did you rate the trainer? One to five, or one to 10." And people will say, "He was a very good trainer." But actually, if all the trainer did was give a wonderful presentation throughout the day, and you were completely engaged throughout, all the evidence says you will not remember what you've heard, because if you haven't actually practiced it, done activities, and thought about how you're going to apply it, it's just not going to have the impact. You might think it was a great course, but there's been a lot of research comparing the feedback that was provided with the difference that was actually made in the workplace.

There's an incredible lack of correlation between the two. That's not because collecting feedback doesn't work; it's just that it's got to be done in the right way. So, questions which make the learners think. For example, going back to the trainer: there are certain types of activity, if you like, during the course that the trainer perhaps should do, like spending time on worked examples, or asking the learners to think about how they're going to apply their newly found skills. If those things don't happen, then we know the course is not going to be effective. So rather than asking how good the trainer was, it's better to say, "To what extent did the trainer do this? Do that? Do the other?" And then we can be very specific about the things that we know will have the impact.

Hannah:            

Definitely. I think the difference between a great day out, or a great event, and a valuable learning experience is widely misunderstood. So that's really good knowledge to share. Thank you.

Chris:                 

And the flip side is when people complain that the air conditioning or the lunch wasn't very good, which has no correlation with learning. They'll have forgotten that soon enough, but did they learn something?

Hannah:            

That's a very popular one, isn't it?

Chris:                 

Yes. It's difficult for training companies because typically when the learner has left the room at the end of the course, they're going back to their place of work and often the training company has nothing more to do with them.

In that sense, in-house training departments have the potential to get a lot more feedback later about whether it made a difference, and what are you now doing differently? It's quite hard for training companies to go the extra mile, if you like, but they can ask questions like: how might you apply this in your workplace? Are you going to get support in the workplace to help you reinforce these skills? Things like that can be really useful information for the training company to gather, if only because they can then communicate it back to their customer to say, "This was the reaction we got, and actually, if you do these things, it'll make your investment in that training much more worthwhile."

Hannah:            

That's really interesting. I hadn't thought of that. I guess, moving on from that, would you recommend that best practice for training companies is to send follow-up feedback forms? So it wouldn't just be at the end of the course, it would also be, say, three months later, six months?

Chris:                 

Yes. There's a lot to be said for that. And sadly most training companies don't do it, and that's not entirely their fault, because they may not have the relationship, if you like, with the individual going down the track. But in principle, you're quite right.

That's when you get to the questions that really will tell you whether the training made a difference. But whether it did make a difference will come down to the extent to which the individual had the opportunity to apply their skills. And again, the evidence is that if they don't apply them very soon, I'm talking within days, if not the next day, we all forget remarkably quickly. Much quicker than we want to believe possible.

Amity:                

I remember you showed us a scale at the Everest Conference. One of the slides was the scale of how quickly people forget things.

Chris:                 

The forgetting curve. Yes.

Amity:                

Do you think there's a maximum number of questions that people should ask? Is there such a thing as too many questions?

Chris:                 

There definitely is. It's one of these less-is-more examples, really. We see training companies asking as few as five questions and getting pretty much a 100% response rate. Conversely, some who ask 20 or more find that half of them aren't getting answered anyway. So it's a bit of a false economy to think, "Well, I'll just ask more." It's very tempting, but actually, you won't get as much out of it.

So, we recommend 10 to 12 questions, really, and a mix of questions. There are lots of other things to consider, like making sure your questions are unambiguous. A lot of subjectivity can creep in, and you want to try and remove that from the responses to ensure you're getting good data. When you ask somebody how good something was, some people are much tougher than others.

Hannah:            

Some people have higher standards.

Chris:                 

And don't have questions that are essentially the same thing asked in a slightly different way. So if you ask, "To what extent did the trainer use examples?" and then another question, "To what extent did you feel the trainer related the material to your work?", that's essentially the same thing, and you want to look at combining them. Always go through your questions and try to get them down so that every single one gives you something genuinely different.

Hannah:            

That's interesting because I've also heard perspectives that that is a good thing to do. Ask things in slightly different ways. I don't know. I don't know where I've read that.

Amity:                

I feel like I've heard that as well.

Chris:                 

Even the type of question. They call them Likert questions, where it's, "To what extent do you agree with the following?" You know? The answers are strongly disagree, disagree, agree, and so on. And there's a tendency to look at the results of those, turn them into numbers, create some sort of average, and draw some conclusion. That really doesn't help. It's much better to look at each category, the people that responded in a certain way, and say, "How are we going to address that?"

Amity:                

Is there a chance that when you number a question, say between one and 10, and people kind of feel like it was good but can't quite remember, they'll always go for a six, seven, eight?

Chris:                 

Yes.

Amity:                

Is that a pitfall with using-

Chris:                 

Yes, it is. The thing is, in our personal lives, we're all used to the one-to-five star rating, aren't we? Where three neatly falls in the middle. But there are schools of thought that say you should have a six-point scale, so that you force people to come down slightly on one side of the fence or the other. I'd say it's more important that your questions are written in such a way that the responses are going to be clear and unambiguous. We're not great supporters of lots of score-based questions, for that reason actually.

Amity:                

Yeah.

Hannah:            

So, how do you feel about free text boxes? Good tool to use?

Chris:                 

Absolutely. Absolutely. There's an assumption, I think, with a lot of training providers, or companies generally, that there's nothing my customers could tell me that I don't already know. But actually, sometimes they do have a really good idea. It's not going to be that they all do, but that's how you get those little nuggets, the occasional person who makes you say, "Actually, that's a great idea. We should do that." And when you're in the quality control world, it's all about continuous improvement.

It's looking for those tiny little things. It's rarely going to be something massive. But it's through that combination of incremental little improvements that you end up with a better product and a better service. I'm a big fan of, "What did you particularly like?" Or, "What did you particularly not like?" "What are you going to do differently as a result of this training?" Those sorts of questions.

They're very specific. And what you want to avoid is the situation where people go, tick, tick, tick, tick, tick. You know? We've all probably done that on survey forms. It should be a thoughtful process that you're asking people to go through, and actually, if you make it interesting for them, make it thoughtful, and talk to them about why you're doing it, they'll engage with it. Response rate is something we get asked about a lot. People worry about response rates. But if you make it an interesting exercise, they'll do it, they'll do it well, and you'll get more value from it.

Hannah:            

Just a question on the analysis of the data. Where do you commonly see the responsibility sitting in training organizations? Would it be with management? Would it be with the course coordinators? Where does it usually sit?

Chris:                 

Well, we know where it should sit, which isn't always the same thing as where it actually does sit. One of the things about collecting feedback digitally is that there's the potential to share it widely, very quickly, in a way that you certainly don't get from paper. The traditional paper approach is that someone will look at the feedback, typically the trainer to start with, and then perhaps an administrator, and then it's very much reliant on the administrator to flag up something good or bad that they want to do something about.

But with digital, you can bypass all of that, and you can make sure that senior people in the company are given information. It's perhaps pushed out to them by email or automated reporting; we're great fans of that, because if it's not easy, people just don't do it. And so then you can put the right information in front of the right people, who can actually make a difference. It's very important to act promptly, especially if something hasn't gone quite as well as you might like.

Hannah:            

Yeah. In those kinds of situations, would you suggest, or do you see from your customers, people actually getting on the phone, talking to the customer if something's not gone right?

Chris:                 

Yes. It's a very good point, Hannah. The whole thing about negative feedback is that you should have an agreed process around it, not just reacting randomly to whatever happens to occur. And the key thing is that you want to communicate with that individual directly about it, and do it promptly. If you do that quickly and well, and we've all probably experienced this in our personal lives, you can end up thinking that you're dealing with a great company even though something went wrong. And we've all been on the opposite side of that, where we've complained about something and nothing happens, and actually we get even madder than we would have been if they'd just dealt with it.

In the public domain, there's another reason for responding to negative feedback and doing it well: if those comments are out there, other people can see them, and it's really, really important that you respond, and that other people can see that you responded.

Amity:                

Any suggestions for when you're responding? Because people-

Chris:                 

It's an art.

Amity:                

You have to deal with them in any industry, don't you?

Chris:                 

You do.

Amity:                

And people struggle to find the right words to respond.

Chris:                 

And particularly so where you feel the complaint is totally unjustified. And let's face it, we know we can't please every customer all the time. Sometimes things genuinely go wrong, but sometimes people are just plain unreasonable, and we have to be able to cater for both. If something has genuinely gone wrong, in some ways that's more straightforward. You take it on the chin, react in some way to show you've listened and taken it on board, and say what you're going to do about it. Great.

When someone has reacted unreasonably, you have to treat them with a bit more care. You obviously don't want to fall on your sword and say it was all our fault when it wasn't, but equally, you don't want to pour fuel on the fire and make it worse. Your comments need to empathize with the individual without overtly apologizing. You can be sorry that they feel that way. There's a way of dealing with it.

Hannah:            

Yeah. I can imagine there's a fine line involved with that kind of response.

Chris:                 

There is. And we recommend that somebody owns that task, and that you don't leave it to the instructors, for example, because they're likely to be a bit more emotionally involved. It's better to have someone who is a little bit more unemotional about it.

Hannah:            

Yeah. Definitely.

Amity:                

Being a marketer for accessplanit, I'd like to talk a bit more about the marketing side of utilizing your feedback.

Chris:                 

Sure.

Amity:                

Digital feedback, as we all know, can help you gain new customers as well as being used as an indicator of how well your course is going. Talk to us a bit about how Coursecheck can help and how feedback in general can help you with your marketing.

Chris:                 

Sure. Customer feedback, potentially, and I say potentially, is a very valuable marketing asset, but there are a lot of caveats as to whether it is or isn't. We've seen, for example, some companies who will put every single piece of feedback they've ever received on their own website. Page after page after page. You can see it all. But whether you believe it's all there and complete or not, the fact is it doesn't work. It's all about how you use the data. People are getting much more savvy now about reviews. It's a bit like a bottle of wine. You want to see: where does it come from? How did it get there? Could it be fake? We've all heard of fake news.

Hannah:            

I've got a great wine app I'll share with you after.

Chris:                 

Yeah. People are more conscious of that. No system is perfect, I think, when it comes to online reviews, but crucially, it needs to be at arm's length from yourself. If you're an organization and you want to showcase customer feedback, you're much better off having that feedback on an independent website. And there are plenty of review websites out there.

It was really Amazon and eBay that started this whole thing off. They made reviews central to their platforms, and really, everybody else has followed that model. We all now trust what people say about something almost as much as we do a personal recommendation. It's not quite the same, but the gap is closing. But that trust can be eroded if we have reason to be doubtful about its authenticity.

Amity:                

Like, if they're all five stars.

Chris:                 

Exactly. So there's this concept of peak trust, which is the idea that if you've got all five stars, no one is going to believe you. And if you've got all one stars, no one is going to want you. Peak trust is around about 4.7, 4.75.

Hannah:            

I think that's what we've got on Capterra.

Chris:                 

Then you're perfect. You're perfect. It's credible, because you can't be perfect, but equally, it's very good. And so you tick those two boxes. The other thing that I think people miss a trick on with the marketing is all the ways you can use this asset. Imagine you've got this body of evidence of how good you are, and it's on an independent site. Of course Google can find that, and there are ways it can help drive traffic to your website, etc.

But there are lots and lots of other ways you can use it as well, in a more proactive way. For example, we see our customers tweeting about their feedback and posting links to show people where it is. They post about it on LinkedIn. They put it in their email footers and sales presentations. It's a great differentiator to say, "Look, we're really serious about our customer feedback, and this is how we do it, and here's the evidence." And trade shows. There are all sorts of ways in which you can proactively use it. It's a really, really valuable marketing asset, if you like. And best of all, it's free.

Amity:                

Yes. And what about for SEO purposes? I believe that Google pulls through feedback from Coursecheck. Is that right?

Chris:                 

That's correct. Yes. There's no real magic to this. If you build your website in the right way and you put reviews on it, you can essentially tell Google, "This is a review, this is the rating, and these are the comments." And we, like other review sites, do this as a matter of course. It's not something only we can do; anyone could do it, in fact.

Going back to authenticity, it works best on an independent website anyway. Google will see reviews on Coursecheck, but there's a big "if" here: companies obviously have to have a good course outline as well. That's really, really essential to be found in a search. People are going to be searching on the type of training. So to have good, rich content about your course, and on the same page to have recent reviews, which change, that's a great combination, because Google will look at that page and say, "This is a fresh page. It's different to what it was yesterday." And the more often it changes, the more often Google comes back and checks it again, and the more Google will promote that page. It's that combination of reviews and good content that makes it work.
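
For readers curious what "telling Google this is a review" looks like in practice, review markup is usually expressed as schema.org structured data embedded in the page. Here's a minimal sketch in Python that generates such a JSON-LD block; the course name, provider, ratings, and review text are invented examples, not real Coursecheck output.

```python
import json

# A minimal sketch of schema.org review markup (JSON-LD), the general
# mechanism for telling search engines "this is a review, this is the
# rating". All names and values below are hypothetical examples.
course_page_markup = {
    "@context": "https://schema.org",
    "@type": "Course",
    "name": "Example First Aid Course",  # hypothetical course
    "description": "Rich course outline text goes here.",
    "provider": {"@type": "Organization", "name": "Example Training Ltd"},
    "review": {
        "@type": "Review",
        "reviewRating": {"@type": "Rating", "ratingValue": 5, "bestRating": 5},
        "author": {"@type": "Person", "name": "A. Learner"},
        "reviewBody": "Engaging trainer, lots of hands-on practice.",
    },
    "aggregateRating": {
        "@type": "AggregateRating",
        "ratingValue": 4.7,  # near the "peak trust" zone Chris mentions
        "reviewCount": 132,
    },
}

# Embed the JSON-LD in the course page so crawlers can parse it.
print(f'<script type="application/ld+json">{json.dumps(course_page_markup)}</script>')
```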

Hannah:            

That's interesting. I guess that saves the marketers a job of updating the pages all the time if Coursecheck is doing it for them.

Amity:             

We'll have to put reviews on all our pages.

Chris:                 

It's worth using. For example, some of our customers use our API to pull through their most recent reviews, so they can have a feed of, say, the 10 most recent reviews just scrolling along on their own website.
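
To make the idea concrete, a feed like that is typically a small script that calls the review site's API and renders the results. The sketch below is illustrative only; the endpoint URL, parameters, and response fields are invented, not Coursecheck's documented API.

```python
import requests  # third-party HTTP library: pip install requests

# Hypothetical endpoint and response shape, purely for illustration.
API_URL = "https://api.example.com/v1/reviews"

def latest_reviews(provider_id: str, limit: int = 10) -> list:
    """Fetch the most recent reviews for a provider (assumed API)."""
    response = requests.get(
        API_URL,
        params={"provider": provider_id, "limit": limit, "sort": "newest"},
        timeout=10,
    )
    response.raise_for_status()
    return response.json()["reviews"]  # assumed field name

# Render a simple feed, e.g. for injection into a web page.
for review in latest_reviews("example-training-ltd"):
    print(f"{review['rating']}/5 - {review['comment']}")
```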

Hannah:            

And Google loves that.

Chris:                 

And that's a great benefit for the training provider, to have that. Of course, people sometimes ask for that, and then you have to remind them that it does mean all of your most recent reviews, whatever they say. At that point, people can be slightly hesitant. But by and large, coming back to this negative feedback: if people know that they do do a good job, then they're going to get the occasional piece of negative feedback, and they just shouldn't be concerned about it. You know? If you're getting a lot of negative feedback, that's obviously a different story.

Hannah:            

Yeah. Unplug the API at that point. I guess as well, it's good to know how to use it from a marketing perspective, with the training industry being so competitive, as it has been for a lot of years, and lots of new providers popping up all the time. Anything training companies can do to build awareness and win more course bookings helps, and that positive feedback, which differentiates them from their competitors, is invaluable.

Chris:                 

It is really valuable, and it is, to be fair, tougher for some than for others. If you're a training provider in a niche area, it's a lot easier, relatively speaking, to show Google, "Here's a course that's got high ratings and it seems to match what you're looking for." I sometimes feel a bit sorry for people who are running Microsoft Office training, where there are just so many companies doing it and so many different types of offering. It's really, really hard for them to stand out.

Amity:                

I guess you're competing against Microsoft Office as well.

Chris:                 

Reviews or no reviews, it's a crowded field.

Hannah:            

Yeah. We've spoken about how collecting feedback digitally is a lot more valuable than paper forms. It goes further; we can share it with more people. What would you say to someone choosing between a site like yours and something like SurveyMonkey for collecting feedback?

Chris:                 

Right. Two main answers to that question. First of all, SurveyMonkey is a fantastic survey tool. No one is going to beat it, but it is generic. With Coursecheck, our survey form is designed specifically for training providers. It's configurable, but it's built for training providers. What that means is that out of the box you're going to get all your reports and your alerts and your analytics, and we know from experience, my own experience, the kinds of questions that training providers want answers to. In that sense, a more bespoke tool like Coursecheck is going to have an advantage over a completely generic, albeit very powerful, tool like SurveyMonkey.

The other key difference comes back to the marketing: with Coursecheck there's the opportunity to make your feedback public as well. Obviously with SurveyMonkey you're not going to get that, so you'd have to look at some other solution for your marketing.

Amity:                

Yeah. It would be manual input if you did it on SurveyMonkey, wouldn't it? You'd have to type it all out and copy and paste it onto your website?

Chris:                 

Yes. And that's assuming that, for GDPR, the delegate has given you permission to do that. You get into a whole other potential can of worms there as to what extent you can use their feedback.

Hannah:            

Good point.

Amity:                

What would you say to people who are stuck in their ways and want to stick to paper feedback forms?

Chris:                 

I would say that you should seriously consider looking at a digital alternative. The number one reason people give us for sticking with paper comes down to one thing, and that's response rates. They worry, or they believe, that if they were to switch to a digital approach, as much as they might like to, no one will do it and they just won't get the feedback. And there's no point in having a wonderful survey tool if it's not being used.

I think the bit that is often missed, and this is not particular to Coursecheck, is that these days everyone has got a smartphone, and there's really no reason not to collect feedback digitally, whether you're using Coursecheck or anything else. But do it in the room. That way you get feedback from everybody, and you get all the benefits of honesty and more thoughtful feedback that you don't necessarily get on paper.

With paper, especially if you've got something to say about the trainer, are you really going to put that on your piece of paper and then look the trainer in the eye and say, "Thanks very much, and here's my feedback form"? Probably not. Lots of people we've talked to who've moved from paper to digital are genuinely surprised by how much more they're getting from it. It's a bit like you've got to try it to see it. The benefits of going digital are pretty well understood: the immediacy, never mind the environmental savings on paper. But some of the other benefits, as I mentioned, the honesty, the insight, are less well appreciated until you actually do it. So that would be my advice. Try going digital, but do it in the room, and that way you get the best of both worlds.

Amity:                

Yeah. I guess it's a case of training your trainers to add that extra five minutes in at the end and make sure everyone has the link.

Chris:                 

Yes. And that's another important point that you raise, because the trainers are your ambassadors, really. They are the best people to explain why we want to collect this feedback. We've seen people do it by email afterwards and then complain that the response rates were so low. You ask, "Well, what did you do to make it happen?", and they didn't communicate with the learners. So you need to explain to people why you want this feedback. It may sound obvious, but tell them how much it matters. And of course, if you're doing it as an activity in the room at the end of the course, you've got their attention, and there's no reason not to do that and get a 100% response rate.

Hannah:            

Yeah. We've been running our training benchmark report for three years now, and this year is the first time it was neck and neck between paper and digital. It's always been further towards the paper side before. So I think the industry is catching up.

Chris:                 

I read that in your report. There's definitely a trend, which I'm pleased to see. My hunch is that the people who respond to your survey are, by definition, probably more likely to be going digital or thinking about going digital. I think there's actually even more use of paper out there still, sadly, than your results suggest. But what is unambiguous is that it's going in the right direction.

Hannah:            

It's all well and good collecting feedback, but how do people know what good looks like?

Chris:                 

Sure. That's a good question. It's really important when you're starting to collect feedback that you have a goal as to why you're doing it and, as you say, what good looks like. You can change your goals over time, obviously, but it's really important that you create a target for what you're going to be measuring against, so that you've got a way of saying to people, "We beat our target." You might beat it after six months, and then you review that target, but it's really worth having. And the most common way people do that is with net promoter score, which is that question: on a scale of 0 to 10, how likely are you to recommend us? It's a very particular way of measuring customer sentiment, widely used not just in the training world but elsewhere too.

And that's a great way to set a benchmark for what is good, if you like. It's quite a volatile score. The problem with 0 to 5 as a score base is that you end up asking, "Were we 4.53 or were we 4.55?", and it gets a bit tricky. Whereas net promoter score is a much broader scale, so it's a much better way of having a measure. Some of our customers actually take this very, very seriously, to the extent that they bonus everybody in the company based on what their net promoter score was the last quarter. And guess what? Everybody in that company really, really cares.
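
For reference, the standard net promoter score calculation behind that 0-to-10 question: respondents scoring 9-10 count as promoters, 0-6 as detractors, and NPS is the percentage of promoters minus the percentage of detractors, giving a scale from -100 to +100. A minimal sketch:

```python
def net_promoter_score(scores: list) -> float:
    """Standard NPS: % promoters (9-10) minus % detractors (0-6).

    Passives (7-8) count in the total but in neither group, so the
    result ranges from -100 to +100.
    """
    if not scores:
        raise ValueError("no responses")
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return 100 * (promoters - detractors) / len(scores)

# Example: 6 promoters, 2 passives, 2 detractors out of 10 -> NPS of 40.
print(net_promoter_score([10, 9, 9, 10, 9, 9, 8, 7, 6, 3]))
```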

Hannah:            

Yeah. I was going to ask how people motivate the staff, so that's good. Are there any other measures that you see training companies using other than the net promoter score?

Chris:                 

The one I'd like to see them using more is around the extent to which people learned, and the extent to which they're likely to remember what they learned. This goes back to where we started, with composing the right questions. It's rare to find those kinds of questions on a survey form, which is ironic considering people are there to learn. The questions tend to be about what the experience was like, rather than focusing heavily on what you've learned, the extent to which it's going to make a difference, how likely you are to be able to apply it in your workplace, and so on. I think that's where training companies could learn a lot more, and potentially feed that information back to their own customers to help those customers get the most out of their training investment.

Hannah:            

Designing surveys: who would you recommend should be involved in that process? Should it be a mixture of various levels? Trainers? Administrators? Senior management? Or would it be based on what the business wants to achieve?

Chris:                 

Yes, the business, really. I think it's a quality control question. But if you're the kind of training provider that does a lot of work for private clients, in other words you're not running open courses, then your questions can be very much tailored towards that particular customer. Some of our customers run no public courses at all; all their training is private programs, and in that environment you definitely want the customer as a stakeholder in the whole process, and they will have views about what they want to find out.

So it can be a bit more than just your own views. There's lots of good advice out there, too. I'm a big fan of a gentleman called Will Thalheimer, who runs a website called Smile Sheets. He's based in the States, and he's got a wealth of information about what makes a good survey form. We recommend his approach. He does consultancy around this sort of thing.

Hannah:            

How would you recommend increasing response rates?

Chris:                 

Well, generally speaking, if you're doing it in the room, you should get feedback from everybody. But if for some reason you are doing it by email, then, as I said before, the important thing is that you explain to people why you want the feedback. One training company once suggested to me that they would have a prize draw into which all five-star reviews would be entered. I thought that was, well, not ideal. So yes, there are different ways. On a more serious note, some companies link it to providing a certificate. It's like: you give us your feedback and we'll send you your certificate. Soft bullying.

Amity:                

Good idea though. An incentive is always good, I think.

Hannah:            

We recently partnered with Coursecheck, and accessplanit now has an integration. So could you tell us a little bit about the benefits of that integration?

Chris:                 

Sure. Although Coursecheck can be used standalone, one of the things that Coursecheck needs to know about is the course schedule. So there are two parts to the integration. One part is that Coursecheck can use the accessplanit API to pull through all of the schedule information about what courses are coming up, and that obviously saves the effort; otherwise we'd have to enter it manually or upload it from a spreadsheet, with lots of potential for errors to creep in. So that's a really nice thing about the integration: it means that from an administrative point of view, there's really no extra effort involved in running Coursecheck.

All the trainers need to do is give out an event code on the day, and again, that can come through from accessplanit, so the whole system is nicely linked up. It also means that if customers don't want to collect feedback in the room for any reason, and they want to do it by email, then accessplanit has the ability to send out emails with dynamic links to Coursecheck. When someone clicks on one, they're sending feedback about the right event, and the date and the trainer are all there for them. They're not having to fill all that out from scratch.
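
As a rough sketch of what such a dynamic link involves, a post-course email can embed a URL pre-filled with the event's details, so the delegate lands on the right feedback form. The base URL and parameter names below are invented for illustration; the real integration generates these links for you.

```python
from urllib.parse import urlencode

# Hypothetical pre-filled feedback link for a post-course email.
# Base URL and parameter names are invented, not the real scheme.
def feedback_link(event_code: str, course: str, date: str, trainer: str) -> str:
    params = urlencode(
        {"event": event_code, "course": course, "date": date, "trainer": trainer}
    )
    return f"https://feedback.example.com/review?{params}"

# Example usage with made-up event details.
print(feedback_link("EV1234", "Example First Aid Course", "2019-06-03", "J. Smith"))
```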

Hannah:            

Sounds amazing.

Amity:                

It is amazing. Time saving.

Chris:                 

Big time saver.

Amity:                

For the person attending the course as well as the trainer and the administrator.

Hannah:            

Your main customers are obviously commercial training companies, but do you have any knowledge of whether learning and development teams are more effective at collecting feedback than training companies, or is it something that-

Chris:                

That's a good question. Yeah. I think L&D departments tend to come at it from the perspective of HR more than the training function, if that makes sense. And the L&D systems that they use tend to lend themselves to reporting up through the management: how well trained is my team? What's their next learning objective going to be? It's that sort of view of the world, rather than, "I've got this team of trainers." And I don't think they're well catered for, to be honest, because L&D systems often have a feedback component, but it's not really designed to answer questions like: who are my best instructors? Or which course do I get negative feedback on? It doesn't report in that way. That's my understanding of it.

Hannah:            

Yeah. I guess that also comes down to commercial training providers making money; it's about customer service and what the customer thinks, a lot more. The learner in an internal department isn't always treated as a customer like they should be.

Chris:                 

No. And the other complication with the in-company stuff is that people are maybe going to be less honest.

Hannah:            

Yeah. I can imagine.

Chris:                 

With a training company, they're generally quite forthright. It's like, "It was rubbish." They'll say what they thought. But imagine you're doing an internal training course. You might think twice before saying, "That was a waste of time," when it's your-

Hannah:            

Yeah. I guess on that same note, would you recommend anonymous surveys or people putting-

Chris:                 

Yeah. That's a good question, actually. We've recently introduced a feature where the training company can say, "I'm willing to allow anonymous feedback." Most of them don't; people have to say who they are. But it's interesting. Where they have said, "Yes, I'm willing to allow it," some people do choose to be anonymous. I'm not quite sure why that is. I think a little bit of it is just laziness, because if they don't tick that box, they've got to fill out another five boxes with their name and so on.

Hannah:            

Yeah. I suppose. If it was me, if I had a bad experience, I'd want them to know who I was.

Amity:                

I'm the opposite.

Hannah:            

So they could call me up and see how good the customer service was.

Chris:                 

I share the frustration of the training provider, because I've had this where someone has said it was really bad, and because they'd chosen to remain anonymous, the training company is kind of powerless. Sometimes they can work it out, can't they? But in principle they can't. They don't know, and they can't just ask, "Was it you that said that?"

Hannah:            

Yeah. It's a bit like staff surveys.

Chris:                 

Yeah.

Hannah:            

They're anonymous but it's kind of, we can't do anything with it because we don't know who you are.

Amity:                

All right. Well, thank you very much Chris.

Chris:                 

Thank you, Amity.

Amity:                

It was really interesting. Hopefully our listeners will get a lot out of it and start using their feedback forms more effectively.

Chris:                 

Let's hope so.

