
2.7 Field operations

The NRVA 2011-12 field staff consisted of two mixed (male-female) interviewer couples and one field supervisor for each of the 34 provinces of Afghanistan. The field operations were supervised by nine Regional Statistical Officers (RSOs), who were selected from the Provincial Statistical Officers (PSOs). In addition, NRVA staff from CSO Headquarters performed monthly monitoring missions to provide direct feedback to interviewers and supervisors. The survey instrument consisted of paper questionnaires for households, male and female community Shuras (councils) and commodity prices in the nearest market place. The male interviewers administered the interviews with the male household representative, and the female interviewers those with the female household representative and other eligible female household members. In addition, a female interviewer conducted the female Shura interviews, whereas the supervisor usually administered the male Shura interviews. The supervisors were also responsible for collecting the market prices.

Each of the field teams had a monthly target of 50 household interviews in 5 selected clusters, resulting in a national monthly total of 1,700 household interviews. Data collection started in April 2011. Progress in the first months was slow for a variety of reasons, including access problems related to insecurity and physical circumstances, replacement of field staff, Ramazan, and the requirement to revise the sampling procedure. Effectively, this meant that in spring and summer 2011 fewer interviews were conducted than planned. The missing interviews were compensated for in the corresponding period in 2012, and for this reason data collection was extended to August 2012. In addition to surveying the resident population during the entire survey period, the nomadic Kuchi population was accessed in winter and summer, when they tend to stay put for some time.

The provinces that faced the most security challenges were Kapisa, Paktya, Zabul, Logar, Wardak, Sar-e-Pul, Jawzjan, Helmand and Urozgan. In view of recurrent access problems, a security strategy was developed. This strategy included mapping of insecure areas, security assessment in the field, consultation of relevant information sources (PSOs, NSP Regional Management Units, CDCs), and discussions and negotiations with relevant actors, such as governors, community leaders and Jahadi commanders.6 As a last resort, insecure areas were replaced by more secure areas. The implementation of this strategy resulted in fewer replacements in the second and third survey quarters. The security situation in Zabul did not allow participation of female interviewers.

Figure 2.1 shows in which districts the survey was implemented according to the sample design, and in which districts less or no data collection took place. Out of the 357 sampled districts and provincial centres of Afghanistan, information was collected in 342 (96 percent), although in 35 (10 percent) fewer interviews were conducted than originally planned.

Figure 2.1 Implementation of NRVA 2011-12 sampling clusters, by district

6 CSO acknowledges the valuable support of MRRD in the development and implementation of this strategy.
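As a rough cross-check of the workload and coverage figures quoted in this section, the minimal sketch below reproduces the arithmetic behind the national monthly total and the district coverage percentages. The constants come from the text above; the script itself is purely illustrative and is not part of the survey's data-processing system.

```python
# Illustrative cross-check of the workload and coverage figures quoted above.
# The constants are taken from the report text; this script is not part of
# the NRVA data-processing system.

provinces = 34               # one field team per province
team_monthly_target = 50     # 50 household interviews in 5 clusters per team

national_monthly_total = provinces * team_monthly_target
print(national_monthly_total)                    # 1700 household interviews per month

sampled_districts = 357      # sampled districts and provincial centres
covered = 342                # districts with at least some data collection
reduced = 35                 # districts with fewer interviews than planned

print(round(covered / sampled_districts * 100))  # 96 (percent)
print(round(reduced / sampled_districts * 100))  # 10 (percent)
```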

2.8 Data processing

The tasks of the RSOs included checking a sample of the completed questionnaires, as a second level of quality control in the field after the checking by the supervisors. On a monthly basis, they transported batches of completed questionnaires and other survey documents back to CSO Headquarters and took new field supplies to the provinces. The PSOs were responsible for the introduction of the field teams to the provincial and local authorities, for monitoring fieldwork progress and the security situation, and for verification of survey results in the field. In four provinces field staff was replaced due to underperformance.

Data processing at CSO Headquarters was done in parallel to the fieldwork and started upon arrival of the first batch of completed questionnaires in May 2011. The first stage consisted of manual checking by three questionnaire editors. Subsequently, the questionnaire batch was submitted for data entry. The data entry staff received two rounds of training before actual data capture started. In the course of the survey, the team was expanded to 30 operators to keep up with the workload and eliminate the backlog that arose due to double data entry. Data capture was done with a specially designed MS Access programme, which was piloted to ensure smooth performance. The database was equipped with VB code to perform basic consistency and range checks. The database programme also included several data-cleaning and data-management procedures for process monitoring and daily back-ups by the Database Director.

The principle of double data entry was introduced to avoid high levels of manual data capture errors. For each of the double-entered batches, integrity checks were performed at individual, household and batch level. Emerging issues were resolved by a team of seven data editors. A complementary MS Access programme identified discrepancies between the batches of double-entered data, which were subsequently reconciled and again tested for integrity. Further data editing was first performed on the MS Access database. This database was then transferred to Stata software for the application of programmes to identify data flaws and either perform automatic imputation or manual screen editing. Data processing was completed in September 2012. During the analysis phase, final edits were made.
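The double-entry comparison and range checks described above were implemented in MS Access with VB code; the sketch below is only an illustration of the general technique in Python, with hypothetical record keys and field names, and is not a reproduction of the CSO programmes.

```python
# Illustrative sketch of double-entry reconciliation, assuming each keying of a
# questionnaire batch is a mapping keyed by (household_id, question). The NRVA
# system itself was built in MS Access/VB; the field names here are hypothetical.

def find_discrepancies(entry_a, entry_b):
    """Compare two independent keyings of the same batch and return the cells
    where they disagree, so a data editor can consult the paper questionnaire
    and decide which value is correct."""
    discrepancies = []
    for key in sorted(set(entry_a) | set(entry_b)):
        value_a = entry_a.get(key)
        value_b = entry_b.get(key)
        if value_a != value_b:
            discrepancies.append((key, value_a, value_b))
    return discrepancies


def range_check(record, valid_ranges):
    """Basic range check of the kind applied at data entry, e.g. age 0-120."""
    return [(field, value) for field, value in record.items()
            if field in valid_ranges
            and not (valid_ranges[field][0] <= value <= valid_ranges[field][1])]


# Example usage with two hypothetical keyings of the same two cells
first_pass  = {("HH001", "age_head"): 43, ("HH001", "hh_size"): 7}
second_pass = {("HH001", "age_head"): 48, ("HH001", "hh_size"): 7}
print(find_discrepancies(first_pass, second_pass))
# [(('HH001', 'age_head'), 43, 48)] -> flagged for manual reconciliation
```

After reconciliation of the flagged cells against the paper questionnaires, the merged batch would again be subjected to the integrity checks, mirroring the workflow described above.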

2.9 Comparability of results

Comparability between the 2007-08 and 2011-12 surveys was maintained as much as possible by a largely similar questionnaire design and content for reported indicators, and by similar training and data collection procedures. Whereas the sampling design differed between the surveys, both produced representative results at national and provincial level. Comparability with NRVA 2005 is more limited due to major questionnaire revisions in 2007 and the limitation of data collection to three months in 2005, which prevented seasonal analysis like that in the last two NRVA rounds. Any comparison with 2005 results in this report should, therefore, be treated with caution.7

The NRVA questionnaire design partially built on major international survey practices, such as the DHS and MICS surveys. In addition, for internationally agreed indicators, NRVA usually applies the standard conceptualisation and definitions. Therefore, many indicators produced in this report embody a high level of international comparability. The report text indicates if, for some reason, the applied definitions deviate from the internationally recommended ones. The annex with concepts and definitions provides the specifications applied in the present analysis (Annex XI).

7 As the sampling design, survey design and questionnaire content of NRVA 2003 were very different from the subsequent rounds, no effort is made here to include its results in any trend analysis.