I may not have been one of those called, but I would not likely have answered had the call come through. Even if I had picked up, there’s an excellent chance I’d have said “no” right out of the gate, not out of indifference but out of an abundance of caution in this age of sophisticated scams.
Out of curiosity I checked to be sure my cell number is still listed on the National Do Not Call List; it is. I also use an app that does a fair job of identifying “likely spam” so I can block those numbers if they leave voice mail or dead air. The CRTC enforces the Unsolicited Telecommunications Rules and also provides information to help identify and report suspected scams.
https://crtc.gc.ca/eng/phone/telemarketing/fraud.htm
I’d have been happy to participate in the survey, but here’s the dilemma: how can people like me be reached when we are so aware of the need to guard our electronic privacy that even an authorized agency’s chances of speaking to us are hampered by our imperative for caution? There has to be a better way.
I think that this is a really good point. We have all become very cautious out of concern to protect ourselves, which is a rational choice. I wonder if the report accounted for this reluctance to provide information.
For all the thinking and adjusting that goes into the statistics of surveying, there are human factors that are given little consideration when it comes to getting someone to give their precious time to a survey. In the early 1990s I worked a second job with Environics as a telephone surveyor/interviewer. What I observed was that the surveys I conducted led to government and private actions later in real life. I resolved, then, never to turn down an opportunity to affect what was going to happen. I believe that most people do not see the connection between the surveys and the outcomes, so they think the surveys are a waste of their time. So tell them realistically what the results will or could lead to, and the time frame. I also noticed that if the surveyor sounded like a sweet young thing with a girly name like Tammy or Cindy, men would usually agree to complete the survey. I also personally learned that if I went slightly off the dry introductory script to rephrase the request to participate as helping me out, I was more successful in getting a “Yes, go ahead” instead of a “Sorry, I’m busy.”
A few points I'd offer as additional context; hopefully it's helpful. It feels a bit ironic for me to be offering clarification about something the government is doing... given that the subject is how the government communicates what it is doing. Maybe they should hire me.
Anyway, this survey work is not a one-off exercise; the “Continuous Tracking of Canadians’ Views” program is recurring federal public opinion research that has been conducted for several years. While the published methodology does not explicitly reference automated dialing, the scale and structure of the fieldwork make it extremely unlikely that unanswered or invalid calls ever reached a human interviewer—standard CATI practice relies on automated dialers, with interviewers engaged only once a live person answers.
It is also worth noting that the contract cost would normally include not just interviewing, but data cleaning, weighting to Census benchmarks, non-response analysis, and the preparation of formal executive and methodology reports, not simply the act of dialing phone numbers.
On the issue of sample distortion, the need for weighting is not an indication of methodological failure. In contemporary telephone surveys, achieving a raw respondent pool that mirrors the population on age, education, and other characteristics is close to impossible and is not the objective. Post-survey weighting to correct known biases is standard operating procedure and is explicitly built into survey design for work of this type.
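To make the weighting point concrete, here is a minimal sketch of post-stratification weighting in Python. The age groups, census shares, and respondent counts are invented for illustration and are not taken from the report or the contractor's actual procedure; they only show how a skewed raw sample gets rebalanced to known benchmarks.

```python
# Illustrative post-stratification weighting: every figure below is made up,
# not drawn from the report; the point is only to show the mechanics.
from collections import Counter

# Hypothetical census benchmarks: population share by age group
census_share = {"18-34": 0.27, "35-54": 0.33, "55+": 0.40}

# Hypothetical raw respondent pool, skewed toward older respondents
respondents = ["18-34"] * 150 + ["35-54"] * 300 + ["55+"] * 550

n = len(respondents)
sample_share = {group: count / n for group, count in Counter(respondents).items()}

# Each respondent's weight = census share of their group / sample share of their group
weights = {group: census_share[group] / sample_share[group] for group in census_share}

for group in census_share:
    print(f"{group}: sample {sample_share[group]:.2f}, "
          f"census {census_share[group]:.2f}, weight {weights[group]:.2f}")
```

Younger respondents end up counting for more and older respondents for less, which is the kind of correction the weighting step is there to perform, not a sign that the fieldwork failed.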
Finally, the very high number of dial attempts and low response rate, while striking, are not unexpected or novel. These conditions are well-documented across government and private-sector polling. Government expectations are typically set around completed interviews, regional coverage, and statistical precision—not around minimizing call volume or assuming higher participation rates than current realities allow.
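As a rough back-of-envelope illustration (the completed-interview target, contact rate, and cooperation rate below are assumptions I've picked, not figures from the contract), here is why dial counts dwarf completes, and what a completed-interview target buys in precision:

```python
import math

# Hypothetical completed-interview target and simple-random-sample assumptions
completed_interviews = 2000
p = 0.5          # worst-case proportion
z = 1.96         # 95% confidence

# Margin of error for a proportion from a simple random sample
moe = z * math.sqrt(p * (1 - p) / completed_interviews)
print(f"Margin of error: +/- {moe * 100:.1f} percentage points")

# Why dial attempts dwarf completes: assume only 10% of dials reach a live
# person and only 10% of those people agree to participate (both invented rates)
contact_rate = 0.10
cooperation_rate = 0.10
dials_needed = completed_interviews / (contact_rate * cooperation_rate)
print(f"Approximate dial attempts needed: {dials_needed:,.0f}")
```

Even with deliberately modest assumptions, the arithmetic lands in the hundreds of thousands of dials for a couple of thousand completes, which is why the expectations are framed around completes, coverage, and precision rather than call volume.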
Very interesting Bill - really appreciate the detail!