Editor’s Note: The public-use microdata file (PUMF) will be released shortly; the following will be updated accordingly.
So 95 percent of Canadians are ‘somewhat’ to ‘very’ proud to be Canadian. Or so some segment of the 48.1 percent of respondents to Statistics Canada’s 2013 General Social Survey (GSS) – aka Cycle 27, Social Identity – indicated. If you thought the response rate for the 2011 National Household Survey (NHS) was bad, at least that survey asked fairly discrete, straightforward questions. In addition to the arbitrary questions on national pride and patriotism, the 2013 GSS also contained questions that likely discouraged certain individuals from responding, effectively defeating its purpose.
To start with the positives, the GSS is one of the few Statscan surveys that somewhat regularly asks about race, or ‘visible minority’ status. Notably, Statscan makes a point of asking it on this voluntary household survey with a known poor response record, but not the mandatory Labour Force Survey (LFS). It makes more sense to ask it on the LFS, since vismin status is used to evaluate employment equity.
In addition to the arbitrary ‘pride and patriotism’ questions, the 2013 GSS asked subjective questions about confidence in institutions, experienced discrimination and general well-being. When cross-referenced with responses to other more objective questions, such as race, education and income, the 2013 GSS could have provided some interesting insights. That’s if the survey had a decent response rate.
Which it didn’t. The majority of Canadians asked either refused or failed to respond adequately to the 2013 GSS. While voluntary Statscan surveys are known to have lower response rates, even the 2012 GSS – aka Cycle 26, Caregiving and Care Receiving – supposedly saw nearly two thirds (65.7 percent) of those queried responding.
The GSS release notes state the obvious: “To the extent that the non-responding households and persons differ from the rest of the sample, the results may be biased.”
The 2013 GSS leaves one wondering how the other half lives. Or rather, at least half. 48.1 percent was the ‘overall response rate’. Whether that’s the total collection rate or a composite of questionnaire and individual question response rates is unclear at this point. Either way, the response rates for certain individual questions would have been lower. Candidates for even lower response rates on the 2013 GSS questionnaire would include this block of questions:
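To see how a single headline figure can mask lower stage-level rates, here is a minimal sketch of a composite response rate as a product of stage rates. The stage numbers below are invented purely for illustration; they are not Statscan’s published components, which (as noted above) remain unclear.

```python
# Hypothetical illustration: a composite "overall response rate" is often the
# product of stage-level rates. These inputs are INVENTED for illustration;
# they are not Statscan's actual published components for the 2013 GSS.
household_collection_rate = 0.70  # hypothetical share of sampled households reached
person_response_rate = 0.687      # hypothetical share of contacted persons responding

overall = household_collection_rate * person_response_rate
print(f"Overall response rate: {overall:.1%}")  # ~48.1% with these inputs

# Individual sensitive questions can then fall lower still:
item_response_rate = 0.85  # hypothetical item-level response among respondents
effective = overall * item_response_rate
print(f"Effective rate for a sensitive item: {effective:.1%}")
```

The point is arithmetic, not accusation: any multiplicative composite guarantees that item-level effective rates sit below the headline number.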
In the past 12 months, have you done any of the following activities:
REP_Q30 expressed your views on an issue by contacting a newspaper or a politician?
REP_Q35 expressed your views on a political or social issue through an Internet forum or news website?
REP_Q40 signed a petition on paper?
REP_Q45 signed an Internet petition?
REP_Q50 boycotted or chosen a product for ethical reasons?
REP_Q60 attended a public meeting?
REP_Q70 spoken out at a public meeting?
REP_Q80 participated in a demonstration or march?
REP_Q85 worn a badge or T-shirt, or displayed a lawn sign, in support of or opposition to a political or social cause?
Consider the era: even supposedly ‘liberal’ members of parliament approve ‘preventative arrests’ of suspected terrorists; the federal government classifies those supporting causes from environmental protection to boycott, divestment and sanctions (BDS) as suspected terrorists; and such motives/intentions are determined by secretly monitoring individuals’ social interactions and social media activities. In that climate, it seems quite reasonable to avoid responding to such questions.
Indeed, non-response would be the prudent response – especially for socially engaged individuals or members of racial and/or religious minority groups.
It’s also worth noting that the referenced question block immediately preceded the ‘pride and patriotism’ blocks on the 2013 GSS. Some of the less-than-patriotic ne’er-do-wells who did things like protest, boycott and sign petitions could’ve dropped out of the survey before reaching the ‘pride and patriotism’ questions; those continuing with the survey could’ve felt pressured to prove their ‘pride and patriotism’ by responding more positively to such questions. Question order matters.
Given the nature of the 2013 GSS questions and questionnaire structure, it’s likely that “non-responding households and persons” did significantly “differ from the rest of the sample,” rendering “biased” results.
To its credit, Statscan didn’t try to impute missing answers to these highly subjective and personal questions. (But apparently did so for demographic questions like sex?) The likely biased results were nevertheless reweighted as if they were representative of the general population. (The method isn’t specified; it’s likely based on what Statscan considers core demographics: age, gender and possibly language and/or immigration status, but likely not race, which would have been particularly relevant given the nature of the questions, if the whole exercise weren’t a moot point.)
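To make the reweighting objection concrete, here is a minimal post-stratification sketch. This is an assumed method with invented numbers, not Statscan’s actual (unspecified) procedure: respondents are reweighted so that weighted age-group shares match known population shares, but any trait outside the weighting model, such as race or willingness to report protest activity, remains uncorrected.

```python
# Minimal post-stratification sketch (assumed method; the actual 2013 GSS
# weighting procedure is not specified). All counts and shares are invented.
population_shares = {"18-34": 0.30, "35-54": 0.40, "55+": 0.30}

# Hypothetical respondent counts: younger people under-respond.
respondents = {"18-34": 150, "35-54": 400, "55+": 450}
n = sum(respondents.values())  # 1000 respondents in total

# Post-stratification weight = population share / sample share, per group.
weights = {g: population_shares[g] / (respondents[g] / n) for g in respondents}

for g, w in weights.items():
    print(f"{g}: weight {w:.2f}")

# Weighted shares now match population margins by construction...
for g in respondents:
    weighted_share = weights[g] * respondents[g] / n
    assert abs(weighted_share - population_shares[g]) < 1e-9
# ...but nothing here corrects for non-responders who differ on traits
# the weighting model never measured.
```

Matching the margins on age (or gender, language, immigration status) says nothing about whether the respondents within each cell resemble the non-respondents within it, which is exactly the bias the release notes warn about.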
To address the oft-repeated rebuttal about Statscan data: Quite simply, it’s neither that protected nor private. Statscan’s become a little more transparent recently in admitting it links responses between surveys without respondent consent. As discussed previously, the current privacy commissioner has shown little interest in review or oversight of Statscan’s increasingly questionable practices. And section 12 of the Statistics Act basically entitles the government to access and share any Statscan data with pretty much any department or ‘corporation’.
Digression aside, the bigger question is: Why’s Statscan effectively conducting public opinion polls – and that’s what large swathes of the 2013 GSS are – that likely aren’t any more reliable than ones conducted by private research/polling firms? Especially when, in recent years, it has consistently invoked budgetary restraint to justify reduced public data dissemination, and used that in turn to justify greater demands on the public for ‘cost recovery’.
Despite inherent bias issues, the various GSS iterations over the years have provided some unique insights into Canadian household socio-economic profiles otherwise not easily attained from Statscan’s regular flagship household surveys (census, LFS). However, there are some serious questions that need to be (re)considered with regard to this Social Identity iteration of the GSS – and those questions go beyond just the ones on the questionnaire script.