
Revista Panamericana de Salud Pública

Print version ISSN 1020-4989

Rev Panam Salud Publica vol.6 n.3 Washington Sep. 1999 

Rapid surveys for program evaluation: design and implementation of an experiment in Ecuador


Kate Macintyre,1 Richard E. Bilsborrow,2 Caton Olmedo,3 and Rodolfo Carrasco3



ABSTRACT

This paper presents details from the field test of two rapid surveys conducted in Ecuador in 1995. It focuses on how the surveys were designed and implemented, including descriptions of the sampling procedures, the preparation and use of preprogrammed palmtop computers for data entry, the selection criteria for the interviewing team, and how the training was designed. Lessons are drawn that will help health professionals plan and carry out rapid data collection more effectively in the future.
The objective of the study was to evaluate the reliability and validity of data gathered during the rapid surveys as compared with a recent "gold standard" national survey. A two-way factorial design was used to control for differences in sampling (probability versus quasi-probability) and methods of data collection (paper versus palmtop computer). Few differences were detected between the surveys done on palmtops and those done on paper, but urban and rural differentials in contraceptive use were less pronounced in the rapid surveys than in the earlier, national survey. This suggests that caution should be exercised in interpreting the disaggregated data in these rapid surveys. In-depth interviews revealed two features of the rapid surveys that were especially popular: the palmtops, for their speed of data entry, and the short questionnaire, for its "low impact" on a respondent's time. Contrary to the common belief that computers would disturb respondents, no such effect was found. Even with no prior computer experience, the interviewers rapidly mastered the new technology.



Conventional household-level surveys are expensive, time consuming, and demand considerable skilled personnel. Further, many developing nations lack the funds and human resources for these types of surveys. For at least the past 15 years, researchers in the international health community have been exploring methods, generally called "rapid surveys," to collect data more quickly and cheaply than can be done with conventional methods (1, 2). Despite several years of using rapid methods in the field, skepticism about the quality of the data prevails (3). In addition, survey planners lack information on the practical aspects and technical difficulties of rapid approaches. This paper provides realistic suggestions to help program managers and health-statistics managers improve the design and implementation of rapid surveys. The paper is intended to assist those contemplating a rapid survey by warning them of potential pitfalls and suggesting ways to avoid those difficulties. We draw on our experience of conducting two rapid surveys in Ecuador in 1995.

Following a brief introduction to rapid survey methodology, the paper describes the procedures followed in the Ecuador example. We include lessons learned from the experience and suggestions for the future use of these surveys. The paper concludes with estimates of the length of time needed for each stage and a discussion of several misconceptions regarding rapid survey work in developing countries.



Rapid surveys have become increasingly important in recent years for health planners and researchers (4). International agencies and national program managers have been experimenting with various types of rapid surveys to reduce costs. One researcher (5) even calls this an "epidemic" of rapid methods. The World Health Organization's Expanded Programme on Immunization (EPI) cluster sampling methodology, tailored originally for use in measuring immunization coverage rates in developing countries (6), has now been used in at least 5 000 instances (2). It has been adapted to gather information on diarrheal diseases, acute respiratory infections, and sexually transmitted diseases (7). Rapid epidemiological studies have also evaluated community-level knowledge, as well as the approximate prevalence of various infectious and vector-borne diseases (5, 7). Recently, there have been adaptations of the original rapid anthropological procedures (RAP) (8) and the rapid rural appraisal (RRA) (9). These include the "situation analysis" method developed by the Population Council (10), a RAP method used by the United Nations Children's Fund (11), and the World Bank's participatory rural appraisal methods (12). (See Manderson and Aaby (5), Scrimshaw and Hurtado (8), and Macintyre (16) for reviews of rapid methodologies).

While different definitions of "rapid" abound, most authorities cite the following five features as characteristic of a "rapid" survey: 1) relatively low cost, 2) use of a short questionnaire, 3) reduced sample size, 4) fast feedback of data to decisionmakers, and 5) the use of computers in data collection and analysis. Depending on the topic and population under study, rapid methodologies may be qualitative or quantitative. For example, the EPI cluster sampling method is regarded as a quantitative approach, as it is designed to gather data generalizable to a larger population (6). The anthropological RAP methodology, in contrast, is considered qualitative (8) and is intended to illustrate unique constraints at the local level based on in-depth information collection. These data are not usually generalizable beyond the immediate local area. The Ecuador study used two rapid survey designs of the quantitative type.

While preparing for the research in Ecuador, we found that few researchers have published descriptions of their methodologies for and experiences in conducting rapid surveys. Indeed, this gap is not limited to the rapid survey world, but exists in the larger field of survey sampling. We had hoped to find, for example, evidence of advantages and disadvantages to various methods of selecting a sample in the field and discussions of the appropriate questionnaire length, the ideal size of the interviewing team, and how to program the computers we were planning to use in our surveys. Frerichs' work in Burma (13, 14) and that of Forster et al. in The Gambia (15) recorded computer requirements and discussed some of the obstacles but provided few guidelines for field implementation.

To remedy this deficiency we document our field work procedures below. The text describes the preparation, field work, and main lessons from this rapid study in Ecuador. A brief description of the experimental design is provided first, to facilitate comparisons between the rapid surveys' results and the results from a large, national survey completed 6 months earlier.



The Rapid Survey of Contraception and Fertility (RSCF, known in Spanish by the abbreviation "ENRAF") was implemented in 1995 by an Ecuadorian research organization, the Center for Population Studies and Responsible Parenthood (CPSRP, or CEPAR in Spanish). The year before, the experienced and highly qualified researchers at the CPSRP had successfully conducted a reproductive health survey called the Survey of Demographics and Maternal and Child Health (SDMCH) in 15 of Ecuador's 21 provinces, as well as the country's two largest cities, Guayaquil and Quito.

The purpose of RSCF was to conduct a rigorous, scientific test of the quality of data obtained from rapid surveys. An ideal situation arose in Ecuador, where SDMCH had just been completed, and thus could be used as a "gold standard" against which to compare the rapid survey results. The impetus for such a study came from the EVALUATION Project, a research project funded by the U.S. Agency for International Development to develop and test new research methodologies to better measure the impacts of family planning programs.4 Although the main thrust of the RSCF study was methodological, we designed the survey questionnaire to elicit standard information on family planning use that would be useful to provincial-level decisionmakers.

Two rapid sampling designs were tested in the RSCF study. The first design was a pure probability subsample or follow-up survey. Known in this paper as the Rapid A survey, it was based on reinterviews of a subset of the 1994 SDMCH respondents. The second RSCF design, called Rapid B, was based on a "quasi-probability" sample that used the EPI-type method of cluster sampling but sampled from the same clusters as SDMCH.

Since the RSCF study was also designed to test new technology and respondents' reactions to it, we also divided the interviewees by mode of data capture. Half of the respondents were interviewed using the traditional paper and pencil method, and the other half were interviewed using the latest palmtop computers. Both methods can be called "rapid," as each conforms to the criteria listed earlier in this paper for rapid surveys. RSCF was conducted in only 4 provinces, compared with the 17 areas surveyed the year before with SDMCH.

Table 1 compares some of the results from the SDMCH and RSCF surveys; a more detailed presentation and discussion are beyond the scope of this paper but can be found elsewhere (16). Reasonable-quality data came from both the Rapid A probability subsample and the Rapid B quasi-probability subsample. This was true for both contraceptive prevalence and related individual sociodemographic characteristics. For example, the estimated mean proportion who reported "ever using" any family planning method was not statistically different (in t tests) across the surveys. Also comparable in all the surveys were the mean age of respondents and the proportion either married or in consensual union; either being married or in a consensual union is a strong predictor of contraceptive use in Ecuador. However, some differences also arose. For example, there was a statistically significant difference between the results of the SDMCH survey and the Rapid A survey in terms of the proportion of women reporting they were "currently using" family planning. This difference may be due to selective migration to other parts of Ecuador by younger, unmarried women, leaving behind older women, who were more likely to be current users. The results also illustrate the problem of finding the same respondents for a follow-up study. This is always difficult but may be compounded by a lack of time. That is, in this case, the "rapid" nature of the survey may be a shortcoming in design. This lesson and others point to the need to fully understand the sampling methods in the rapid survey and their potential to bias the results. The following section explores how the sampling design was drawn up and implemented.
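The kind of comparison reported above can be illustrated with a pooled two-proportion test. The sketch below is only a generic illustration: the sample sizes and proportions are hypothetical, not the published SDMCH or RSCF estimates.

```python
import math

def two_proportion_z(p1, n1, p2, n2):
    """Compare two independent survey proportions (e.g., 'ever use' of any
    family planning method in two surveys) with a pooled z statistic."""
    pooled = (p1 * n1 + p2 * n2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    return (p1 - p2) / se

# Hypothetical figures for illustration only (not the published estimates):
z = two_proportion_z(0.62, 850, 0.60, 900)
significant = abs(z) > 1.96  # 5% two-sided threshold
```

With samples of this size, a two-percentage-point gap is well within sampling error, which is the sense in which the "ever use" estimates above were judged comparable.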




The rapid-survey questionnaire

The 1994 SDMCH questionnaire was 50 pages long and contained about 250 questions. For the RSCF, a questionnaire was developed with 45 questions that focused on a subset of the SDMCH topics. This approach permitted comparison between the 1994 and 1995 surveys since they used the same topics and the same wording. The questions selected for the RSCF included the household roster, almost all the SDMCH family planning questions, some basic background characteristics on the woman and other household members, some fertility-related questions, and a section on reproductive preferences. Anyone who has developed a questionnaire knows it can be a difficult and contentious process, especially when trying to reduce the length. We rigorously sought to keep the length below 50 questions, since this was the estimated number that could be covered in a 15-minute interview. We recommend that the questionnaire development stage not be viewed as part of the "rapid" process, and that great care be taken to keep the questionnaire short and entirely focused on the main objective of the rapid survey. Otherwise, the objective will be lost as the questionnaire will become too long for a rapid survey.

An important modification in the SDMCH procedures for the RSCF was to reduce, from eight to four, the maximum number of callback visits made during a 12-hour period before a woman was coded as absent. The earlier, larger survey thus had a greater chance of contacting women. The reason for this reduction in the RSCF callbacks was to accommodate the faster pace of the field work, which planned for two census sectors to be completed each day, that is, three or four interviews per interviewer per day.


Programming the palmtop computers

The rapid survey questionnaire was first entered into a standard desktop personal computer using basic DOS-compatible editor software. It was then translated to the SURVEY data entry package, a public-use bilingual software program (U.S. Centers for Disease Control and Prevention, Atlanta, Georgia, United States of America). Once the questionnaire had been checked for logical skip patterns and range checks, communications software was used to transfer it to the handheld computers. The programming was done by an experienced senior computer programmer, who initially took about 5 days to enter and check the questionnaire. A further 2 weeks were necessary to check each palmtop carefully, as well as the laptop that was to be used in the field as the central data bank. The SURVEY software was used to test each palmtop for glitches. The flow and accuracy of the programming were tested prior to the field work, and several small changes and corrections were made following the pretests. No changes had to be made to the program once the teams were in the field.
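The skip patterns and range checks mentioned above are the core of any electronic data entry program. The sketch below shows the general idea in Python; the question names, codes, and ranges are hypothetical and are not taken from the RSCF instrument or the SURVEY package.

```python
# Hypothetical questionnaire rules: an age range check, a coded yes/no
# question, and a skip rule that asks the method question only of
# women who report ever having used family planning.
QUESTIONS = {
    "age": {"type": "range", "min": 15, "max": 49},
    "ever_used_fp": {"type": "coded", "codes": {1: "yes", 2: "no"}},
    "current_method": {"type": "coded",
                       "codes": {1: "pill", 2: "IUD", 3: "other"},
                       "ask_if": ("ever_used_fp", 1)},
}

def validate(record):
    """Return a list of error messages for one completed interview record."""
    errors = []
    for name, rule in QUESTIONS.items():
        cond = rule.get("ask_if")
        if cond and record.get(cond[0]) != cond[1]:
            if name in record:
                errors.append(f"{name}: answered but should have been skipped")
            continue  # question legitimately skipped
        value = record.get(name)
        if value is None:
            errors.append(f"{name}: missing")
        elif rule["type"] == "range" and not rule["min"] <= value <= rule["max"]:
            errors.append(f"{name}: {value} out of range")
        elif rule["type"] == "coded" and value not in rule["codes"]:
            errors.append(f"{name}: invalid code {value}")
    return errors
```

Enforcing these rules at the moment of entry, rather than during later cleaning, is what allowed the palmtops to catch illogical skips that the paper questionnaire had missed (see the pretest discussion below).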

Hewlett Packard (HP) 200LX palmtops were used in the field. A standard 200LX palmtop unit weighs about 312 g and measures 16 cm by 8.5 cm by 2 cm, that is, about the size of an adult's hand. The screen is about 12 cm by 5 cm. The 200LX comes preloaded with several software programs. Its DOS operating system makes it completely compatible with the SURVEY software. The unit uses very little battery power, another advantage for use in the field.

Improvements in the technical capacity of palmtops were a key factor in making the computer-assisted data entry possible in these Ecuadorian rapid surveys. The HP 200LX palmtops came standard with two megabytes of internal memory, which would allow for up to 68 completed RSCF questionnaires to be entered per computer. To supplement the standard memory, we purchased two flash memory cards of five megabytes each for each palmtop. These cards provided extra capacity during the days when using the palmtops in very remote sectors made the downloading of data difficult. No data were lost in the field while the interviews were being conducted.
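The storage figures above imply a record size of roughly 30 KB per completed questionnaire. A back-of-the-envelope check, assuming the full internal memory was available for questionnaire records:

```python
# Implied storage per questionnaire, assuming all 2 MB of internal
# memory could hold completed records (a simplifying assumption).
internal_kb = 2 * 1024                        # 2 MB internal memory, in KB
per_record_kb = internal_kb / 68              # implied size of one record
card_records = 5 * 1024 * 68 // internal_kb   # capacity of one 5 MB flash card
```

By this arithmetic each 5 MB flash card could hold roughly 170 additional questionnaires, which explains why no data capacity problems arose even in remote sectors where downloading was delayed.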

All the questions in the survey were closed-ended (either yes/no or precoded, digitized responses). The only question requiring the typing of multiple characters on the keyboard was the respondent's name. This was intentional, as the 200LX keypad is small and difficult to use in the field.


Selecting and training interviewers and supervisors

Two supervisors (one male, one female), six interviewers (female), and two drivers (male) were trained and hired by CPSRP to implement the rapid survey. This group of 10 was split into two teams of five (three interviewers, one supervisor, and one driver) for the field work. CPSRP had trained all the interviewers for the SDMCH survey, so they needed no introduction to the content of the rapid survey. Other criteria for interviewer selection included good performance during the SDMCH, having completed high school, and being fit enough to spend lengthy periods in the field. These criteria were basically no different from those normally used in selecting candidates for survey interviewing.

The interviewers received three days of training just on the use of computers. In retrospect, this was probably not sufficient, as additional training sessions were needed during the first seven days of field work. Questions arose in the field that the technical advisor had to answer, such as how to get out of an interview once started and how to back up through an interview to change responses. These problems were not due to the interviewers' lack of experience with computers. Rather, they were mostly the typical difficulties encountered with new software, such as memorizing keystrokes. Additional days of pretesting the questionnaires and of computer training under field conditions would have better prepared the interviewers and reduced the amount of troubleshooting needed during the field work.



Two days were spent doing pretests, in addition to the training on the palmtops at the CPSRP headquarters. During the pretests rural and urban census sectors were visited to try out the questionnaire, the computers, and the field sample selection procedures, as well as to ensure that training was adequate. The pretests were completed in sectors not included in the RSCF survey but thought to present a reasonable range of problems that might arise. Various minor computer-program changes were made following the pretests. It was noted that the automated skips in the computers helped identify illogical steps in the questionnaire that had been missed in the preparation of the skips on the paper questionnaire. The usefulness of thorough pretesting in rapid survey implementation cannot be overstated. The pretest in the rural sector also demonstrated a weakness of the EPI method of sampling and necessitated an important modification for the Rapid B independently sampled survey (see below in the subsection on sampling strategies in rural areas).


Maps and equipment preparation

The National Institute of Statistics and Censuses of Ecuador provided maps of each census sector that had been used the year before for the SDMCH. Maps are so important for survey research that it is worth noting that where map quality is poor, high costs will be incurred either in terms of errors in the research due to missed households or communities, or for the creation of better maps. Other survey equipment included backpacks, pencils, clipboards, and hats to identify the institutional affiliation of the interviewers. In addition, the rapid study team members needed fanny packs to carry the palmtops, ponchos, and a bottle for selecting the random direction for sampling (see below).


Sampling strategies

The SDMCH had been designed using standard multistage cluster sampling, with selections made at two stages, the province and the census sector. That 1994 survey had included all the provinces in the Sierra Andean highlands of central Ecuador, as well as all the provinces in the coastal plains of western Ecuador. Between 25 and 30 primary sampling units (PSUs) were selected in each province. The PSUs were census sectors selected randomly from Ecuador's 1990 national census. At the second stage 40 households per PSU were randomly selected as eligible for inclusion. At the final stage, one woman per household was randomly selected from a household roster of all eligible women aged 15-49 years.
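The three selection stages just described can be sketched as follows. The function and the sampling frames here are hypothetical illustrations of the design, not the actual SDMCH frames or software.

```python
import random

def draw_sample(sectors, households_by_sector, women_by_household,
                n_psus=25, n_households=40, seed=1994):
    """Stage 1: randomly select census sectors (PSUs) within a province.
    Stage 2: randomly select up to 40 households in each PSU.
    Stage 3: randomly select one eligible woman (aged 15-49) per household."""
    rng = random.Random(seed)
    sample = []
    for psu in rng.sample(sectors, n_psus):
        frame = households_by_sector[psu]
        for hh in rng.sample(frame, min(n_households, len(frame))):
            eligible = women_by_household.get(hh, [])
            if eligible:  # skip households with no woman aged 15-49
                sample.append((psu, hh, rng.choice(eligible)))
    return sample
```

Note that stage 2 presupposes a household listing for each PSU; as discussed below, avoiding the cost of building that listing is precisely what motivated the Rapid B quasi-probability design.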

Both RSCF rapid surveys were also multistage cluster sample surveys, though they differed from the SDMCH in the method of selecting the households to interview. In RSCF, two provinces were selected to represent the Sierra and two others to represent the coastal region. Earlier, some thought had been given to including provinces along Ecuador's southern frontier, but that was precluded by the border war that broke out with Peru. The final four provinces selected, Cotopaxi and Imbabura in the Sierra and Manabí and Esmeraldas on the coast, provided considerable variation in terms of geography, ethnic groups, and socioeconomic conditions.

In the Rapid A survey all women in the selected PSUs who had completed a questionnaire during SDMCH were subjects for interview. The intent was to reinterview SDMCH respondents, as a check on those being interviewed for the first time in the rapid study, and also to compare answers given to a short questionnaire and to a long questionnaire. The names of the original SDMCH respondents, names of head of households, and street addresses were entered in advance on all the Rapid A paper and electronic questionnaires. This was to enable interviewers to find the same respondents from the SDMCH.

One of the main lessons learned from this sampling strategy was that finding women based only on their name and address was often difficult (as distinguished from not finding women who were absent because of temporary or permanent migration). This situation led to considerable bias in whom we managed to find, especially in urban sectors. There were also cases of women who had been recorded incorrectly the first time, or where an SDMCH team had interviewed women from outside the selected sector. In rural areas the local residents were more likely to know the women in question, so it was generally easier to find them.

For the Rapid B survey, the EPI quasi-probability method of cluster sampling was followed. The purpose of this approach was to reduce costs by avoiding the creation of a sampling frame, that is, a list of all the households in each PSU. Typically, the creation of a frame in the field is an expensive, time-consuming process. In order to avoid making a sampling frame of households but still retain a quasi-probability sample in each cluster, the second stage sample selection of households to contact takes place in the field. The normal procedure under the EPI method involves first identifying the geographical central point of the PSU on a map. This is the starting point for the selection of households. Next, a random direction from this starting point is chosen using a spinning object, such as a bottle or pencil. The team then proceeds in the direction selected, interviewing households, up to the boundary of the sector.

During the pretest we ascertained that different starting point procedures needed to be established for rural and urban areas. In urban areas one can be reasonably confident of finding households within reach of the central starting point of each PSU. However, in sparsely populated rural areas a random direction may or may not provide enough households for the survey. Therefore, the EPI sampling method was significantly modified for rural areas.

Rural areas. On all the maps of the rural census sectors in Ecuador there are some labeled points ("dots") where there are small centers of population. The dots indicate such locations as large farms, schools, churches, and river and roadside settlements. Figure 1 shows a typical remote rural census sector. Located in Manabí province, in Ecuador's Pacific coastal region, this sector has three main settlement points: Boca del Cayo, Cantagallo, and Motete. But there are usually other households dispersed away from such named dots. The issue is where to begin a rapid survey in such a sector. For the sake of the rapidity of the survey, we assumed a priori that there was sufficient homogeneity in the rural population living in or close to these small communities to represent the whole sector. Therefore, it would not matter which marked dot on the map we selected as the survey starting point. We first numbered all the dots on the sector map from north to south and then chose one of them at random as the starting point for the sector. Typically, the number of dots per rural census sector ranged from two to six.



For work in the field the team would arrive at this selected dot by vehicle, on foot, or by some other means. The team would interview all the households living in that place. That might be only one or two households, or it could be many more. There were always fewer than the fixed total of 40, so interviewing continued towards another point on the map. Instead of spinning a bottle to choose a direction or a second dot, in the interest of rapidity we decided to continue interviewing in the direction of the next closest dot, as determined by "proximity." Rather than just simple physical distance, "proximity" took into account accessibility by existing roads or paths and the existence of any geographical obstacles, such as impassable rivers or mountains. The team then proceeded along the road or path towards that second population center. Along the way, the team interviewed all the households, including ones within a one-kilometer walk from the road. The interviewing continued until 40 households had been contacted. This number included uninhabited or deserted dwellings, households without a woman of reproductive age, and households that refused to be interviewed. If two centers still did not yield 40 households, we selected a third center with the best accessibility.
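The dot-selection and proximity rule just described can be sketched as follows. The function name, travel costs, and household counts are hypothetical; the point is that "proximity" is lowest travel cost (with impassable routes excluded), not straight-line distance.

```python
import random

def plan_rural_route(dots, access_cost, households_at, target=40, start=None):
    """dots: labeled points on the sector map; access_cost[(a, b)]: travel
    difficulty from dot a to dot b (use float('inf') for impassable routes);
    households_at[d]: list of households at or near dot d."""
    route = [start if start is not None else random.choice(dots)]
    contacted = list(households_at[route[-1]])
    while len(contacted) < target:
        remaining = [d for d in dots if d not in route]
        if not remaining:
            break  # sector exhausted before reaching 40 households
        # "Proximity" means lowest travel cost, not straight-line distance.
        nxt = min(remaining, key=lambda d: access_cost[(route[-1], d)])
        route.append(nxt)
        contacted.extend(households_at[nxt])
    return route, contacted[:target]
```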

Figure 1 illustrates how this process worked with the rural census sector in Manabí province. The randomly selected starting point was Cantagallo (dot "1" in the figure), and the next closest (most accessible) point was Boca del Cayo ("2"). Physically, or "as the crow flies," Motete ("3") was closer, but the road between the two communities had been washed away, so we could not have reached Motete that same day. Therefore, we chose the most accessible nearest point on the map, Boca del Cayo, and proceeded along the road towards it until we contacted 40 households.

Urban areas. For the Rapid B surveys in urban areas, it was possible to apply the procedure recommended for the EPI method. This involved using existing maps from the National Institute of Statistics and Censuses to select a sector starting point in the field and to guide the interviewers as to where the boundaries of that sector lay. Having chosen a starting point, the team then spun a bottle in the street to randomly select a direction for choosing particular households. The bottle usually stopped at an angle to the direction of the street, that is, not quite perpendicular. The interviewers chose as their first direction the side of the street that the top of the bottle was pointing to. The interviewers would proceed from there, along that side of the street, until they reached the boundary of the sector. They would next do interviews on the opposite side of the same street, but again beginning from the same central starting point. If the team reached the sector boundary on this second leg of interviewing without having found the required 40 households, the supervisor then selected another "spinning point" by picking a second street intersection near the center of the sector. That was usually on a street parallel to the one on which the first set of interviews had been done. After spinning the bottle a second time, the team would then set out in the indicated direction, contacting households until the required 40 had been reached.

Figure 2 illustrates the Rapid B sampling process with an urban sector in the city of Ibarra, in the Sierra-region province of Imbabura. The central starting point was determined to be the middle of Sucre Street between Borrero and Juan Mejía Legueroca Streets. The bottle was spun and landed pointing along a north-south axis, as shown in the diagram. The interviewers first contacted all the households along the north side of Sucre Street, beginning at the starting point and moving northeastward. They next moved southwestward, contacting all the households on the south side of Sucre Street, until they had reached the limits of the sector. This gave the team about 30 households. To reach the required 40, the supervisor chose a second spinning or starting point, at the intersection of Sucre and Borrero Streets. The bottle was spun and landed pointing northwest. Given the sector limits, this meant the interviewers first contacted households on the north side of Borrero between Sucre and Simón Bolívar. They then continued on the south side of Borrero east of Sucre but still inside the sector limit, until they had 40 households. The three interviewers completed the sector in approximately 5 hours of work.
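The urban household-contact sequence just illustrated can be sketched as a walk through successive legs (first side of the street, the opposite side, then legs from a second spinning point) that stops as soon as 40 households have been contacted. The household lists below are hypothetical stand-ins for what interviewers find in the field.

```python
def contact_urban_sector(legs, target=40):
    """legs: lists of households along successive walking legs, in the
    order they would be walked. Stops once the target has been reached."""
    contacted = []
    for leg in legs:
        for household in leg:
            contacted.append(household)
            if len(contacted) == target:
                return contacted
    return contacted  # sector exhausted before reaching the target
```

In the Ibarra example, the first two legs yielded about 30 households, so a third leg (from the second spinning point) supplied the remaining 10.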




During the six weeks of RSCF field work 2 862 households were visited and 1 702 interviews with women were completed. At an estimated rate of one sector per team per day, the survey should have taken 36 working days to complete. Allowances for delays, travel time, and breaks between provinces meant the survey took a total of 45 days. Each team alternated between paper and palmtop surveying on consecutive days.

At the end of each day the senior supervisor collected all the palmtops and downloaded the data from them into a laptop. This process took about 45 minutes and included preliminary checks on data quality, such as consistent answers for some of the questions (e.g., reported age in years and date of birth). Within two weeks the interviewers could see basic frequencies. This provided a quick demonstration of the usefulness of the palmtop computers in the field, and the interviewers were pleased to see the product of their work so promptly.

The interviewers unanimously approved of the palmtops. The convenient length of the questionnaire was also frequently commented on, and the ease of the automatic skip patterns and range checks meant the interviews were easy to conduct. It is notable that there was not one incomplete interview during RSCF; that is, no respondent refused to complete the survey once she had started. Of course, time is valuable, and the fact that the interviewers could truthfully say at the outset that the interview would only take 15 minutes was an enormous relief to both the interviewers and the respondents.

Table 2 provides summary statistics from the field work, including the number of interviews conducted per day for each survey, number of households contacted, average length of time to complete interviews, and refusal and response rates.



The palmtop computers worked well, and we lost no data despite extreme weather conditions. Several palmtops were dropped in the mud, and one was dropped in shallow water. They were transported by mule, donkey, horse, bicycle, canoe, and car. Essentially, we broke all the rules regarding the climatic conditions recommended for the palmtops. The HP manual warned against exposing the palmtops to high heat or humidity and suggested particular caution in dusty conditions. At the Ecuadorian coast during the February and March survey period, humidity ranged between 80% and 100% and the temperature from 20 °C to 40 °C. There were downpours in the coastal region, and the Sierra sectors were generally cool, very dusty, and some 2 000 ft above the maximum altitude of 10 000 ft recommended for the palmtops.

But we worked hard to protect both the palmtops and the laptop. All could have been targets of theft. In one reportedly violent poor urban sector in the city of Esmeraldas, all the supervisors and other project staff shadowed the interviewers in case problems arose. Luckily there were no attempted thefts that day or during the rest of the field work. This could have been because the palmtops resemble large calculators, which are now becoming common. However, we strongly recommend that care be taken in unsafe areas to prevent problems.


Time frame

Estimating the time needed to prepare and conduct a "generic" rapid survey was somewhat problematic given that we were conducting a methodological study that incorporated features not usually associated with survey design, such as different data capture and sampling methods. While rapid surveys should be, and are in our experience, much quicker to implement than traditional surveys, they should not be viewed as "instant." As with any household survey, careful planning is needed prior to the field work. Particular care should be taken in developing a short questionnaire, pretesting it, and preparing a computerized data entry program. The time taken to select the sample will also vary according to the quality, recency, and availability of a census and maps of census sectors or other possible PSUs.

Figure 3 shows the time frame for this methodological study. The preparatory stages took approximately 5 months, including the design of the questionnaire, the programming, preparation of sampling plans, training of the interviewers, pretesting, and final equipment preparation. The field work itself was completed in 6 weeks. Data entry for the paper questionnaires took 3 weeks. With preliminary data cleaning and analysis taking about 3 weeks, this brought the total time to about 8 months for design, implementation, and basic analysis of the two rapid surveys. If we had done the rapid surveys using only palmtops, the total time needed would have been around 6 or 7 months.




We wish to emphasize several lessons, some of which dispel common misconceptions concerning the implementation of rapid surveys. One common belief is that using computers may increase refusal rates and bias the overall results as compared to using paper forms to record responses in face-to-face interviews (13). In no part of Ecuador did we find the computers a hindrance. If anything, the palmtops were an attraction, as demonstrated by the 2.3% refusal rate for the computer-entered questionnaires compared to the 3.1% for paper-entered interviews (Table 2).
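Whether a gap of this size could be due to chance can be checked with a standard two-proportion z test. The sketch below is illustrative only: the per-arm sample sizes are hypothetical assumptions, not the survey's actual counts.

```python
import math

def two_prop_z(p1, n1, p2, n2):
    """Two-sample z statistic for the difference between two
    proportions, using the pooled estimate under the null
    hypothesis of equal refusal rates."""
    pooled = (p1 * n1 + p2 * n2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    return (p2 - p1) / se

# Hypothetical arms of 600 households each, with refusal rates
# of 2.3% (palmtop) and 3.1% (paper); z is compared against the
# 1.96 threshold for significance at the 5% level.
z = two_prop_z(0.023, 600, 0.031, 600)
```

At sample sizes of this order the difference would not reach statistical significance, which reinforces the substantive point: the computers did not raise refusals.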

A second lesson concerns the educational level interviewers need to operate the computers. It is often assumed that prior computer experience and substantial training are required for interviewers to use computers in the field. In the early stages of RSCF we were concerned that we might need more highly trained interviewers, as only one of the interviewers had any prior experience with computers. Indeed, three of the six reported that they had never used a typewriter before. While all the interviewers had completed high school, none had any further formal education. And even though they had all been trained as SDMCH interviewers, they all expressed nervousness about the palmtop computers. During the first week of field work the interviewers occasionally asked for assistance with operations they had studied but not completely mastered, such as going back in the interview when a respondent changed her mind about some information. Two more days of training during the pretest period would have eliminated this problem. After the first week there were almost no computer-related questions, and the supervisors were able to concentrate on sample selection.

The importance of callbacks (return visits) was illustrated at several stages of implementation. Because we were attempting to do these surveys as quickly as possible, there was a tendency in the early stages to push the teams to complete more than one sector per day; this was remedied after the first province. Many women in Ecuador work outside the home, so when we first approached a household in the morning, especially in an urban area, the woman selected for the interview was often at work or at the market. Interviewing these women thus required returning later in the day, sometimes as late as 9:00 p.m., and even then we missed women who were away overnight. Pushing to complete more sectors per day only increased the likelihood of missing women who were away during daylight hours. We recommend that at least four callbacks be allowed for each household in a sector and that the interviewing team be scheduled to complete no more than one sector each day.
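The payoff from repeated callbacks can be illustrated with a simple calculation. Assuming, purely for illustration, that each visit has the same independent probability of finding the selected woman at home, the cumulative contact rate rises quickly with the number of attempts:

```python
def contact_rate(p_home, attempts):
    """Cumulative probability of finding the respondent at home
    at least once over the given number of independent visits."""
    return 1 - (1 - p_home) ** attempts

# With an assumed 50% chance per visit, four callbacks reach
# roughly 94% of selected respondents (1 - 0.5**4 = 0.9375).
four_visit_rate = contact_rate(0.5, 4)
```

The assumed per-visit probability is not a figure from the Ecuador surveys; the point is only that each additional callback yields diminishing but still worthwhile gains in coverage.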

Allowing for a return visit the following day would greatly reduce the problem of missed women identified in some sectors during the surveys, but it would also slow the survey and increase its cost. Whether this approach is desirable depends on the percentage of respondents likely to be missed if a second day is not allowed and on how different they are from those contacted on the first day. This will vary with the purpose and content of the rapid survey; it was not a major problem in the case of family planning in Ecuador.

Our experience with sampling in the field using the EPI-method Rapid B survey taught us several things. First, there is a clear advantage to having good maps. These enable supervisors to mark with confidence the geographical central point, and to know when the boundary of the sector has been reached. The majority of the maps we used in the four provinces were of good quality. It was readily apparent when the quality dropped. For example, when some small roads had not been included or small hamlets had been missed entirely because they were new, the team could easily become confused and overshoot the boundary of the census sector. Second, although we had spent one afternoon carefully describing the sampling scheme to the supervisors and training them, it was still necessary to review this method and correct procedures several times in the field. More time spent on training, and particularly on training during pretesting, would have helped the supervisors understand the method better at the outset.

Disagreements about the exact midpoint of an urban sector were usually resolved by flipping a coin. It would have been better, however, to agree before the field work on an a priori rule for identifying the point closest to the center. There was also some discussion about what to use as the "spinning object." A plastic soda bottle was ideal, since it spun freely several times before stopping, but a glass soda bottle also worked well. Either led to some interested looks from passers-by.
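The selection procedure described above (spin for a random direction from the sector's center, start from a household near that line, then proceed to the nearest unvisited household until the cluster is filled) can be sketched as a simple simulation. This is an illustrative simplification on assumed household coordinates, not the operational protocol used in Ecuador:

```python
import math
import random

def epi_select(households, cluster_size, center=(0.0, 0.0), rng=None):
    """Simplified simulation of EPI-style cluster selection.

    households: list of (x, y) coordinates within the sector.
    Picks a random direction from the center (the "bottle spin"),
    starts at the household closest to that ray, then repeatedly
    moves to the nearest unvisited household until the cluster
    reaches the target size.
    """
    rng = rng or random.Random()
    angle = rng.uniform(0, 2 * math.pi)
    dx, dy = math.cos(angle), math.sin(angle)

    def ray_distance(p):
        # Perpendicular distance from p to the chosen ray;
        # points behind the center are excluded.
        vx, vy = p[0] - center[0], p[1] - center[1]
        if vx * dx + vy * dy < 0:
            return float("inf")
        return abs(vx * dy - vy * dx)

    start = min(households, key=ray_distance)
    cluster = [start]
    remaining = [h for h in households if h != start]
    while remaining and len(cluster) < cluster_size:
        last = cluster[-1]
        nxt = min(remaining, key=lambda h: math.dist(h, last))
        cluster.append(nxt)
        remaining.remove(nxt)
    return cluster
```

Such a simulation can also be used in training to show supervisors how the starting direction and nearest-household walk shape which parts of a sector end up in the sample.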

In conclusion, it is important that the notion of "rapidity" not affect the survey preparation stages, including questionnaire design, location of population/census data and maps, training, and pretesting in the field. "Rapid" should refer only to the period between the beginning of data collection and the presentation of basic results. Technological advances make computerized data collection relatively simple, though sufficient training time should be allowed to overcome interviewers' natural worries. Concerns that the use of computers would generate higher nonresponse rates were not borne out in Ecuador. Leaving sufficient time for callbacks (thus lowering nonresponse rates) is essential for high-quality surveys, and rapid surveys are no exception; this illustrates an important tradeoff between time and money on the one hand and response rates on the other. Finally, map preparation, supervisor training, and other preparatory work for the EPI method of household sample selection in the field should not be equated with "quick and dirty" methods. Basic statistical and managerial competence are vital elements of rapid surveys, so that good-quality data can be collected and evaluated accurately and efficiently.


Acknowledgments. Funding for this research was provided by the Department of Health Policy and Administration of the School of Public Health, and the Carolina Population Center, both at the University of North Carolina, Chapel Hill. The principal funding sources for the field work were the U.S. Agency for International Development, through its EVALUATION Project, and the Andrew Mellon Foundation. A fellowship from the Population Council, New York, supported the analytic work.



1. Von Braun J, Puetz D. Data needs for food policy in developing countries: New directions for household surveys. Washington, DC: International Food Policy Research Institute; 1993.

2. Anker M. Epidemiological and statistical methods for rapid health assessment: Introduction. World Health Stat Q 1991;44(3):94-97.

3. Turner AG, Magnani RJ, Shuaib M. A not quite as quick but much cleaner alternative to the Expanded Programme on Immunization (EPI) cluster survey design. Int J Epidemiol 1996;25(1):198-203.

4. Samara R, Buckner B, Tsui AO. Understanding how family planning programs work: Findings from five years of evaluation research. Chapel Hill, NC: The EVALUATION Project; 1996.

5. Manderson L, Aaby P. An epidemic in the field? Rapid assessment procedures and health research. Soc Sci Med 1992;35(7):839-850.

6. Lemeshow S, Robinson D. Surveys to measure program coverage and impact: Review of methodology used by the Expanded Programme on Immunization. World Health Stat Q 1985;38(1):65-75.

7. Vlassoff C, Tanner M. The relevance of rapid assessment to health research and interventions. Health Policy Plan 1992;7:1-9.

8. Scrimshaw S, Hurtado E. Rapid assessment procedures for nutrition and primary health care. Tokyo: The UN University, UCLA Latin American Center Publications; 1988.

9. Chambers R. Shortcut methods of gathering social information for rural development projects. In: Cernea M, ed. Putting people first: Sociology and development projects. Washington, DC: World Bank; 1985.

10. Miller R, Fisher A, Miller K, Ndhlovu L, Ndugga Maggwa B, Askew I, et al. The situation analysis approach to assessing family planning and reproductive health services: A handbook. New York: The Population Council; 1997.

11. United Nations Children's Fund. RAP handbook. New York: UNICEF; 1993.

12. World Bank. World Bank participation sourcebook, 1996. Washington, DC: World Bank; 1996. p. 181-204.

13. Frerichs RR, Tar KT. Computer-assisted rapid surveys in developing countries. Public Health Rep 1989;104(1):14-23.

14. Frerichs RR. Simple analytic procedures for rapid micro-computer-assisted cluster surveys in developing countries. Public Health Rep 1989;104(1):24-35.

15. Forster D, Behrens RH, Campbell H, Byass P. Evaluation of a computerized field data collection system for health surveys. Bull World Health Organ 1991;69(1):107-111.

16. Macintyre K. Tradeoffs between precision and cost: A field test of rapid survey methods for family planning evaluation [doctoral dissertation]. Chapel Hill, NC: University of North Carolina at Chapel Hill; 1997.



Manuscript received on 8 May 1998. Accepted for publication on 16 March 1999.







1 Tulane University, School of Public Health and Tropical Medicine, Department of International Health and Development. Send correspondence to: Kate Macintyre, Tulane University, School of Public Health and Tropical Medicine, Department of International Health and Development, 1440 Canal Street, Suite 2200, New Orleans, Louisiana 70112. Telephone: (504) 588-5185. E-mail:

2 University of North Carolina at Chapel Hill, Carolina Population Center and Department of Biostatistics, Chapel Hill, North Carolina.

3 Centro de Estudios de Población y Desarrollo Social, Quito, Ecuador.

4 Tsui A, Hermalin AI. Improving the effectiveness of family planning programs by improving evaluation capabilities. Presented at: International Union for the Scientific Study of Population General Conference, Montreal, Canada, August 1993.