Sunday, August 11, 2013

Are Consumer Satisfaction Surveys Meaningful?

 Citizen Satisfaction Survey Presentation to Council

The next city council meeting will see another Ipsos Reid Citizen Satisfaction Survey presented. Serious questions can be raised about the validity of this survey, as it relies on a 4-point scale, which has a tendency to produce falsely positive results. When not given a 'neutral' option, as the more accurate 5-point scale provides, most people tend to answer toward the positive, a tendency known as positive response bias.

How Your Voice Is Corrupted
In Citizen Satisfaction Surveys

CALGARY, AB, Jun 18, 2012/ Troy Media/ – Some years ago, the City of Calgary conducted a multi-million-dollar effort to engage citizens in the development of a new transportation strategy. The Go Plan used consultations, focus groups, workshops and polls to engage citizens and gather feedback.

Shortly before the release of the final report, someone thought it might be a good idea to conduct a more scientific survey of Calgarians. It was an afterthought, a last-minute effort to ensure the i’s were dotted and t’s crossed. A professor of transportation planning at the University of Calgary was quickly contracted to conduct the survey.

The results completely contradicted what was already in the soon-to-be-released Go Plan report, setting up a potential public relations disaster. Disaster was averted by simultaneously releasing and dismissing the scientific effort as a small, insignificant part of the larger consultative process. Thus, the unbiased picture of what Calgarians wanted in a transportation system was ignored, while millions were wasted meeting the needs of specialized interests.

Gathering citizen feedback to improve public services has all but disappeared. Now, survey research is conducted for the public relations purpose of ensuring a positive result. This is the Billy Beane, Moneyball strategy: “If you get the answer you’re looking for, hang up.” People are suspicious of these surveys but are unsure how things are manipulated. Well, here’s how.

Step one

Cherry-pick the sample. This ensures those participating will give you the answer you’re looking for. You have to be subtle about it, though. Surveying only cycling enthusiasts on bike lanes is too obvious. Instead, hold public engagement sessions on topics appealing to cycling enthusiasts, like ‘Bike lanes, should we have them?’ The meeting will be overwhelmed with those answering ‘Yes!’ This enables sponsors to claim the session was open to both positive and negative feedback while knowing only those in favour will show up.

An example is the engagement process concerning Calgary’s new central library. The process assumes that Calgarians want a new library, that libraries have an important role to play, and so on, ensuring that responses will be heavily biased in favour of current library users and supporters. Administrators will get the answers they’re looking for and will follow up with claims they listened using an ‘extensive’ public consultation process. (Consultations are usually described as extensive but rarely as unbiased.) Unfortunately, this public relations approach to engagement ensures that the people the library needs to hear from, to grow and remain relevant, are largely excluded.

Step two

Design the questions carefully. Again, you can’t be too obvious about this. Subtly implying something for nothing will generate positive responses. ‘Are you in favour of more police on the streets?’ ‘Sure, who isn’t?’ ‘Would you like a new central library?’ ‘You bet.’ Missing is any hint of cost.

Vague, general questions also elicit more positive responses than specific questions. ‘How do you feel about the quality of city services?’ will yield more positive results than ‘Considering your last interaction with the city, how would you rate the level of service you received?’
Plus, vague questions enable what Darrell Huff, in How to Lie With Statistics, calls “the semi-attached figure”. This is measuring one thing and drawing conclusions about another. For example, food quality at Alberta extended care facilities, particularly in rural areas, has recently come under fire. Criticism has been deflected by noting that customer satisfaction in facilities is high. Perhaps, but what about customer satisfaction with the food and in rural areas?

Step three

This is the best trick of all and a trade secret. If you want to guarantee positive results on your survey, use a small, five-point scale and report the top-two-box scores – where a score of four or five is recorded as ‘positive’ or ‘satisfied’. Where’s the trick?

It’s in the phenomenon called positive response bias. People tend to answer to the positive even when their attitudes are neutral or slightly negative, answering four when three better describes it. A tally of ‘satisfied’ respondents therefore includes people who are ‘neutral’ or ‘slightly dissatisfied’, ensuring an inflated positive result.

Check this out for yourself. Download a citizen satisfaction report and see if it doesn’t rely on a top-two-box score summary on a five-point scale. By the way, the smaller the scale, the stronger the bias. Citizen satisfaction skyrockets with four-point scales.
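The top-two-box arithmetic is easy to reproduce. Here is a minimal Python sketch of the mechanism; all of the numbers (the attitude distribution and the share of neutrals who round up) are illustrative assumptions, not figures from any actual Ipsos Reid survey:

```python
import random

random.seed(1)

# Hypothetical true attitudes on a 5-point scale
# (1 = very dissatisfied, 5 = very satisfied), centred on neutral.
true_scores = random.choices([1, 2, 3, 4, 5],
                             weights=[10, 20, 40, 20, 10], k=10_000)

def with_bias(score, p_round_up=0.4):
    """Positive response bias: assume some respondents who would
    honestly answer 3 ('neutral') round up to 4 instead."""
    if score == 3 and random.random() < p_round_up:
        return 4
    return score

reported = [with_bias(s) for s in true_scores]

def top_two_box(scores):
    """Share of respondents counted as 'satisfied' (a 4 or a 5)."""
    return sum(s >= 4 for s in scores) / len(scores)

print(f"honest top-two box: {top_two_box(true_scores):.0%}")
print(f"biased top-two box: {top_two_box(reported):.0%}")
```

With these assumed numbers, shifting a fraction of the neutral middle up one notch lifts the reported 'satisfied' share by roughly half again – without a single genuinely satisfied respondent being added.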

So there you have it. You’re satisfied with public services and now you know why – the survey says so and it was designed to say so from the start.

Robert Gerst is a Partner in Charge of Operational Excellence and Research & Statistical Methods at Converge Consulting Group Inc. He is author of numerous peer reviewed articles and of The Performance Improvement Toolkit: The Guide to Knowledge-Based Improvement. 

 An example of manipulating a survey by using unbalanced questions:

Example 1 Balanced:

Very Poor     Poor     Average     Good     Excellent

Example 2 Unbalanced:

Poor      Average      Good      Very Good      Excellent

In Example 1 there are 2 positive and 2 negative statements with a neutral midpoint, so respondents are not led in either direction. In Example 2, however, there are 3 positive statements and only 2 negative statements, and respondents tend to select from the more numerous positive options.

In the Ipsos Reid survey that council likes to quote as approval of its performance, the question on satisfaction with council and staff offered only ‘very satisfied’ and ‘somewhat satisfied’; there was no ‘not at all satisfied’ option. The balance of the questions presented two positive and two negative options with no middle ground. As pointed out above, this is the perfect way to manufacture a false positive result.


1 comment:

  1. Definitely a useful article. I hope the Nanaimo survey itself will be published in this space so that readers can see first hand how consumer satisfaction (i.e., city hall with Ipsos-Reid) is achieved. After all, the nature of this symbiotic relationship is obvious: I-R wants continuing contracts and city hall wants continuing news of just how great it is. The need for that is clear: council and staff want to ward off pesky council-watchers, letter writers, and a new phenom in Nanaimo -- critical editorialists.

    It would also be useful to know more about the sampling methodology employed and the context.

    Concerning the former, information such as the following would help readers detect the degree of bias built into the sampling approach, and thus the survey's validity: was the survey conducted exclusively by land-line telephone? by internet canvassing of Nanaimo users registered with Ipsos-Reid? etc.

    Concerning the latter: questions such as the following (appropriately scaled, of course) would help the reader assess the reliability of the survey findings:

    1. Do you follow city hall issues? Do you know who the mayor is? any councillors? any staff members? If so, in what capacity?

    2. Do you know how your taxes compare to those of comparably-sized municipalities in BC? Would your views about local tax rates be affected by such knowledge?


    When it comes to spending money on consultations that are often ignored, our city hall likely has few equals. So some pertinent FOI questions to put to the city would include:

    1. How much was spent in each of the last 3 years on surveys and opinion polls?

    2. How much was spent on other community consultations, such as open houses with survey forms, public hearings, etc.

    3. To what extent did city hall follow the expressed wishes of the responding public for each such initiative?


    My guess is that most such surveys and consultations are a sham in terms of results and might as well be dispensed with.

    It's clear to anyone who has participated in almost any local rezoning application public hearing that council and staff are tone-deaf to criticism and can't do enough for developers, no matter how intrusive and costly their proposals are for neighbourhood stability and the financial interests of homeowners.

    In fact, it's fair to say that public hearings would likely be disposed of if council wasn't obliged to follow provincial legislation. Ditto the OCP itself, which on key points was transformed into something to satisfy developers and planning staff after the last review. All the citizen input was virtually ignored.


Your comment will appear after moderation before publishing.

Thank you for your comments. Any comment that could be considered slanderous or includes unacceptable language will be removed.

Thank you for participating and making your opinions known.

Note: only a member of this blog may post a comment.