Another way to mislead with statistics is to load the question you ask, or to target groups of people who are likely to answer it in a certain way, and then present the predictable results as an amazing discovery.

I could confidently predict the following without conducting an opinion poll:

  • The majority of unemployed people think unemployment is too high.
  • The majority of taxpayers think taxes are too high.
  • The majority of victims of crime think sentencing is too lenient.
  • The majority of people convicted of offences think sentencing is too severe.
  • The majority of farmers do not want farm subsidies to be scrapped.

You will rarely see these pairs of questions asked in the same survey; if they were, I suspect the answers to only one question from each pair would be published:

  1. a. Would you like to see immigration reduced?
     b. Would you like the NHS to stop recruiting foreign doctors and nurses?

  2. a. How long do you think terrorists should be detained without trial?
     b. What is the definition of a terrorist?

  3. a. Should there be stiffer penalties for tax evasion?
     b. Do you ever pay tradesmen cash-in-hand?

  4. a. Should more money be spent on X?
     b. Should less money be spent on Y?
     c. Should Council Tax be capped?

  5. a. Should local authorities have more power to respond to local needs?
     b. Should something be done about the postcode lottery?

I would also expect the same basic question to get different answers depending on which of these ways it was asked:

  1. Should businesses be subject to less state interference?
  2. Should more be done to ensure adequate standards of consumer protection/safety/transparency?

Or similarly:

  1. Do we need to do more to protect the countryside?
  2. Should there be fewer planning restrictions?

In short: always check what questions were actually asked, and how the respondents were selected.  If that information is not given, be very suspicious!