Chart 1 Job Vacancy Statistics (JVS), February 2015 (3MMA)
Chart 2 Job Vacancy and Wages Survey (JVWS), Q1 2015
Source(s): Statistics Canada
Last month’s inaugural Job Vacancy and Wages Survey (JVWS) release by Statistics Canada – which the agency emphasises was undertaken on behalf of Employment and Social Development Canada (ESDC) – raised more questions than it answered. When initially contacted for comment, Statscan indicated it would release more data by the end of September. That data, along with additional feedback provided by the agency, points to problems with the survey. It’s worth noting that ESDC was also the source of the now infamous Kijiji jobs report, to which the JVWS bears a striking resemblance.
As previously noted, there’s a wide disparity between the total job vacancy estimates from the JVWS and those from the Job Vacancy Statistics (JVS) that Statscan has been producing since 2011. The initial JVWS release didn’t allow for more detailed comparison, as it provided job vacancy estimates only by occupation, while the JVS provides estimates only by industry. The additional JVWS data since released by Statscan allows for a basic (two-digit) industry-level comparison between the two surveys. The attached charts show treemaps sized proportionately to the total number of vacancies estimated by each survey; the boxes within each treemap are sized proportionately to each industry’s share of total estimated vacancies, and the box shading corresponds to each industry’s vacancy rate.
It’s apparent the two surveys stand in stark contrast. At this point it’s worth revisiting the five differences between the JVWS and JVS that Statscan emphasised in the inaugural JVWS release.
The JVWS quarterly sample includes 100,000 business locations, compared with a (JVS) monthly sample of 15,000 from the survey component of the SEPH.
Statscan verified that the JVWS methodology sees only a third of sampled locations surveyed each month, with each location surveyed only once per quarter. The rush to get the survey out before it was ready meant Statscan in fact surveyed only 67,000 of its sampled locations, about 33,000 each in February and March 2015. So how does the release account for all of Q1? Statscan imputed results for the third of locations that would have been surveyed in January from the responses received from the businesses surveyed in the two subsequent months – a really bad idea given the inherent seasonality of January labour estimates. The “collection response rate” (which does not account for incomplete or invalid responses) for the surveyed locations was only 67 percent, which means that 45 percent or less of the data used to generate the JVWS vacancy estimates for Q1 2015 came from actual respondents (the rest being computer-generated ‘guesstimates’ based on those few responses).
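The 45 percent figure follows directly from the numbers above. A minimal sketch of the arithmetic – the individual figures are taken from the text; combining them this way is our own back-of-envelope calculation:

```python
# Back-of-envelope check of the response figures cited above.
# The inputs come from the text; the combination is our own arithmetic.

sampled_locations = 100_000      # the quarterly JVWS sample
surveyed_locations = 67_000      # only the February and March waves were run
collection_response_rate = 0.67  # per Statscan; ignores incomplete/invalid responses

responding = surveyed_locations * collection_response_rate   # ~44,890
share_of_sample = responding / sampled_locations             # ~0.45

print(f"Locations that actually responded: {responding:,.0f}")
print(f"Share of the full quarterly sample: {share_of_sample:.0%}")
```

And since the 67 percent collection rate counts incomplete and invalid responses as responses, 45 percent is a ceiling, not a floor.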
The JVWS covers the entire agriculture, forestry, fishing and hunting sector, whereas the SEPH (JVS) covers select subsectors of this industry group.
The SEPH may cover select subsectors of the agriculture, forestry, fishing and hunting sector, but the JVS has never reported a single job for the sector, from its first release in March 2011 through May 2015. The reason cited between March 2011 and February 2015 was poor data quality, “F – too unreliable to publish”; the reason cited since has been confidentiality, “x – suppressed to meet the confidentiality requirements of the Statistics Act”. The JVWS estimated 12,900 agriculture vacancies for Q1 2015, with a data quality indicator of “D – acceptable”. It also happened to estimate a 7.5 percent agriculture vacancy rate, the highest for any sector – an artefact of the JVWS methodology, as discussed further on.
In the JVWS, the sampling unit is the location, while it is the establishment in the SEPH (JVS). For example, the JVWS surveys individual stores or restaurants, whereas the SEPH generally surveys the head offices of large retailers or restaurant chains.
While it’s true that the JVS is based on a survey of establishments while the JVWS is a survey of locations, the example given is misleading. As Statscan notes elsewhere: “… two stores in the retail industry may be considered one establishment if the accounting information… is not available separately, but is combined at a higher level.” The additional JVWS information provided by Statscan illustrates the limited potential impact of this conceptual difference: 94,231 establishments accounted for the total of 101,088 locations sampled for Q1 2015. It’s difficult to imagine how 7% more locations could account for 81% higher estimated vacancies in the JVWS, even with absurd weighting.
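The scale of that conceptual difference is easy to check from the figures Statscan provided. A quick sketch, using only the counts quoted above; the ratio calculation is our own:

```python
# The location/establishment gap is small relative to the vacancy gap.
# Counts are those quoted from Statscan; the ratio is our own arithmetic.

establishments = 94_231  # establishments in the Q1 2015 JVWS sample
locations = 101_088      # locations those establishments accounted for

extra_locations = locations / establishments - 1   # ~0.073
print(f"Locations exceed establishments by {extra_locations:.1%}")
```

A roughly 7 percent difference in sampling units is nowhere near large enough, on its own, to explain an 81 percent gap in estimated vacancies.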
JVWS respondents tend to be responsible for human resources with a good understanding of both current and emerging job vacancies, while the SEPH respondents are often responsible for the company payroll.
As previously noted, the suggestion that the discrepancy can be accounted for by ’emerging’, i.e. not yet advertised, vacancies is simply false. The JVWS questionnaire clearly indicates that only positions for which the employer was “doing active external recruitment” count as vacancies – the same criterion as the JVS.
JVWS respondents are asked to report jobs that are vacant on the first day of the month as well as those that will become vacant during the month. In contrast, the two questions on job vacancies in the SEPH refer to jobs that are vacant on the last day of the month.
Statscan conceded that while this is a technical difference, in practice it would have little if any impact: On average, employers’ responses on the last day of a given month should differ little if at all from their responses on the first day of the following month. It would make a difference if the questionnaire asked about ’emerging’ or potential vacancies not yet actively being recruited for, which it doesn’t. In any case, using such a technique to ‘gin up’ vacancy estimates would constitute rather questionable practice at best.
So other than the obviously unreliable estimates generated from having less than half of sampled businesses responding, what else could account for the incredible discrepancy between the JVWS and JVS estimates? It would appear to be the peculiar and unique JVWS methodology, as highlighted by this comment received from Statscan:
“The JVWS counts the number of distinct vacancies over 3 months (the quarter), and is not a weighted average of monthly estimates. This also means that you can’t divide the figure by three to represent a monthly figure as the sample is designed to be representative of the quarter. As with any other sample based survey, the responses are then weighted to represent other locations of similar size/activity/etc for the purpose of creating a total job vacancy count.”
Reading that over with the sampling and survey methodology in mind, it’s obvious that the JVWS does not and cannot count only distinct job vacancies during a given quarter. A different 33,000 sampled businesses are surveyed in each month of the quarter, and the vacancies each business reports are treated as distinct over the span of the entire quarter – which they clearly can’t all be.
That’s because hundreds of thousands of Canadians change jobs every month, and presumably many of those job changers also change employers. This affects both vacancies and labour demand. For businesses in sectors with relatively high temporary/casual employment and/or turnover – say, retail trade or accommodation and food services – the peculiar JVWS methodology would necessarily end up counting individual vacancies multiple times.
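The double-counting mechanism can be sketched in a few lines. All the data below are invented purely for illustration – the point is the difference between pooling monthly reports over a quarter and counting distinct positions once:

```python
# Illustrative sketch (invented data) of the double-counting described above.
# A vacancy that stays open, or is refilled and reopened through turnover,
# gets reported in more than one monthly wave and then pooled as "distinct".

# Each entry: (position, months in which it was actively recruited)
postings = [
    ("retail-cashier-1", ["Jan", "Feb", "Mar"]),  # persistent, high turnover
    ("retail-cashier-2", ["Feb", "Mar"]),
    ("nurse-1",          ["Jan"]),                # filled quickly
]

# Pooling every monthly report over the quarter vs counting each position once
pooled_count = sum(len(months) for _, months in postings)
distinct_count = len(postings)

print(pooled_count, distinct_count)  # 6 vs 3
```

The longer a vacancy stays open – precisely the pattern in high-turnover sectors – the more monthly waves it can be reported in, and the more the pooled quarterly count overstates distinct vacancies.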
The attached charts clearly show a significant increase in both the number and share of total vacancies in the JVWS relative to the JVS for both those industries, an estimated 42,000 more in retail and 31,000 more in accommodation and food services. Those two differences, along with the peculiar situation that produced an estimated 13,000 vacancies in agriculture in the JVWS but none in the JVS, account for half the discrepancy between the two surveys’ vacancy counts. Arts, entertainment and recreation, a sector characterised by temporary/casual employment, also showed a relatively large discrepancy of nearly 14,000 vacancies more in the JVWS.
(Conversely, the JVWS estimated 15,000 fewer vacancies than the JVS in health care and social assistance, commonly viewed as a sector characterised by shortages and relatively high labour demand.)
There’s also a stark contrast in vacancy rates between the two surveys. Except for the health care and social assistance sector, vacancy rates were higher across the board in the JVWS. That one’s easy enough to explain: Statscan used the higher ‘quarterly’ job vacancy estimates in the numerator and a monthly employment estimate from the LFS in the denominator to calculate the JVWS vacancy rates. This produced some remarkable results, including vacancy rates for the agriculture and arts sectors of 7.5 and 7.0 percent, respectively – the respective JVS estimates were ‘undefined’ and 1.1 percent.
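The denominator mismatch is easy to demonstrate. A minimal sketch with invented numbers – none of these figures come from either survey; they only show how a quarterly pooled numerator over a monthly denominator inflates the rate:

```python
# Sketch (invented numbers) of the rate-inflation mechanism described above:
# a quarterly pooled vacancy count divided by one month's employment level,
# versus a comparable monthly-vacancies / monthly-employment rate.

monthly_vacancies = 10_000    # vacancies open in a typical single month
quarterly_pooled = 25_000     # monthly reports pooled over three months
monthly_employment = 330_000  # a single month's LFS-style employment estimate

monthly_rate = monthly_vacancies / monthly_employment   # ~3.0%
pooled_rate = quarterly_pooled / monthly_employment     # ~7.6%

print(f"Monthly vacancies over monthly employment:  {monthly_rate:.1%}")
print(f"Quarterly pooled over monthly employment:   {pooled_rate:.1%}")
```

Mixing a three-month flow-like count with a one-month stock in the denominator more than doubles the apparent rate here, without a single additional real vacancy.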
Readers may recall another job vacancy ‘survey’ released last year with a peculiar methodology that had the same ‘vacancies’ counted multiple times. That ‘survey’ also used a single month’s LFS employment estimate as the denominator to artificially inflate estimated job vacancy rates. It was commissioned and subsequently defended by the former Employment Minister, the same one responsible for commissioning the JVWS.
There were a couple of simple and cost-effective measures that could have been taken to fix the JVS instead. Those measures were clearly spelled out for the Employment Minister, who acknowledged having read the recommendations.
Informative & thoughtful critique of our inadequate labour market information by Boshra & MacEwan in @globeandmail: http://t.co/2JzT5Uq4oL
— Jason Kenney (@jkenney) December 18, 2013
Which begs the question: why did the Employment Minister repeatedly choose to go to the trouble and expense of producing a job vacancy estimate the wrong way, when it would have been relatively simple, quick and cost-effective to do it the right way to begin with?