Tracking where learners go after they leave the program is also important. Do they in fact go on to further education and training, or to employment, if that is their goal? How do you know? And if they do go, do they finish the course they started, or do they stay employed long-term? We already have to track exit and follow-up statistics, so again CIPMS is not requiring additional work, but it does require that you spend some time analyzing what the numbers mean. If people don’t complete college courses, for example, does it mean that they weren’t adequately prepared? On the positive side, if course completion rates are high, it most likely means that your agency has done a good job of preparing people for further education and training.
A good place to start analyzing program impact is with the information you collected for exit and follow-up. Start with the mandatory learner satisfaction exit survey and use it to ask other questions; you are not restricted to the six mandatory questions! For example, you could ask learners to identify one thing they can do at exit that they were not able to do when they started. Gathering this information from a number of learners could also provide material for a very valuable marketing tool!
The exit surveys give you a starting point for evaluating program impact, but they capture fairly immediate results. You will also want to know more about longer-term outcomes. This information can be harder to quantify, but it speaks to the results your program is achieving, i.e., the impact on learners, their families and the community. Gathering this data takes more thought and creativity, but it can be done.
The Espanola site of North Channel Literacy wanted to know if its employment preparation modules were giving learners the skills they needed to find and maintain employment. Staff went back through their records to find out. They tracked the information in chart form, and the resulting document provides a wealth of data that can be used to clearly demonstrate the effectiveness of their modules. Staff could use these statistics for recruiting both learners and tutors as well as for seeking funding. Over time, staff will continue to gather further evidence of the success of their modules and of the learners who use them.
This type of data tracking can clearly show that results were achieved and how the program knows that they were achieved. Although it can take some time to set up this type of tracking, once it is in place, it simply becomes a matter of keeping it up to date and adding information on a regular basis. This is a sample of what Espanola’s tracking form looks like.