This paper seeks to show the importance of appropriately
using and creating surveys, a task often desirable and required
(Mathews 1992:31; Shipp 1990:12). However, let the reader be at
ease; no statistical jargon is used; a future paper will discuss
statistical terms and issues. Four errors will be examined that
occur when creating, using, and interpreting surveys.
On November 16, 1995, it was reported on the local news that
one-fifth of the world had no food (Canal 4 1995). As no
reports of massive starvation have occurred as of this writing, I
can only conclude that the news was wrong. What was meant (I
assume) is that a large portion of the world's population has
very little food.
A leading pollster in Montevideo recently revealed how
sloppily data were collected on poverty levels in Paraguay. He
was hired to conduct surveys for the World Bank and other
organizations in Asuncion. As time ran out for his project, his
supervisor instructed him to join him on the roof of a tall
building. From that vantage point they counted the number of tin
roofs that they could see (Entrevista 1995:1-3). The survey
assumed that tin roofs equaled poverty, when it is quite likely
the opposite is true; those living inside at least enjoyed some
protection from the elements.
Of course, who conducts the survey can influence the
outcome. A survey by the Catholic Church found 76 percent of
Uruguay to be Catholic, while a survey by a government entity
found only 36 percent (Rama 1964:12-13).
Two checks to see if a survey is scientific: 1) does it
measure what it says it does? and 2) can it consistently measure
what it says it does? So there are some questions we would ask
before we conduct a survey or rely on survey results: Does the
survey find out what we want to know, or is it the best thing we
happen to have at hand? Will this survey work in a different
culture, just by translating it? We will explore these issues in
the last section of this paper.
To put it another way, if you ask the respondent the same
question two weeks from now, will the answer be the same, or does
the response depend on time? Perhaps church surveys are like a
snapshot, never to be repeated or duplicated (and thus no basis
on which to establish policy).
Despite these warnings, it is possible to create, conduct,
and analyze a survey. It is not an easy task, nor is it an easy
solution to any problem. Each survey best answers the questions
it was created to answer, and it requires time to think up the
questions, and time to get the answers. But once conducted, a
good survey can be a rich source of information to target a
people group to evangelize or to decide what to teach in Sunday
school--but not both!
LYING TO YOURSELF AND YOUR CONGREGATION
Warren Roane
Montevideo, Uruguay
Four Errors of Surveys
Error 1: Take Data at Face Value.
The most common error is thinking that the question asked is
the same as the question answered. Often, a survey does nothing
but produce a report full of what Huff termed "semi-detached
figures" (Huff 1954:3). A number is thrown out that is meant to
impress the audience, but has very little meaning. Or, a chart
of numbers is generated, but it does not really tell you what you
want to know. Two examples illustrate the point.
Error 2: If it is in Print, it is Scientific.
Huff mentions several of the classic tricks used to deceive
the public, including the "gee-whiz graph," a device used
impress the reader by distorting figures to fit an objective
(Huff 1954:3). A tall graph of yearly conversions looks more
impressive than a short, squatty graph with the same information.
A recent U.S. News and World Report article (Consulting 1995:52-58) lists
modern tricks: slanted questions, double negatives, and forcing
you to give an opinion (even if you do not have one). How a
question is framed plays a big role in how someone responds to it
(Paulos 1988:87). A six percent tax increase sounds better than
a $91 million increase. Questions of semantics also give rise to
confusion, as in the recent controversy about the estimated
number of homosexuals in the U.S. population (1.4% or 10%), which
grows out of different political, social, and religious
orientations.
Error 3: Data Collected Scientifically are Accurate.
Now that we have acquired a scientific, reliable, accurate
survey, have we fool-proofed our survey? No, we still have to
conduct our survey responsibly and scientifically. For example,
response to written surveys (or newspaper ads) may reflect
literacy rates rather than attitudes about the Gospel. As
mentioned above, the way a question is worded makes a world of
difference. One example: "Do you consider yourself a
homosexual?" is different from "Have you ever had an attraction
to someone of the same sex?" Wording probably accounts for the
1948 Kinsey survey result of 10% incidence of homosexuality cited
above. Better wording in recent surveys (as well as the use of
scientific methods) indicates that the true rate is 1.4 to 2.8%
(Sex 1996:3).
Error 4: Once True, Always True.
Data are not static. As part of my dissertation, I
conducted a survey of college professors and asked them about
their jobs (percent of time allocated to teaching, conducting
research, etc.). Because I asked them to reply, and then respond
again later in the semester, I discovered that most of their
answers changed over time. In fact, one could conclude that only
name, rank, and serial number were reported consistently over
time. I conclude that this type of survey, last given at the
national level in 1987, was not reliable enough on which to base
institutional policy (Roane 1993a). Despite the "obviousness"
of this conclusion, most surveys are revered as true, once for
all, given for all time.
What This Means for the Missionary
1. Check official data with informal methods.
For example, Mexico is 88% Catholic, while Uruguay is 76%,
according to one "official" survey (Rama 1964:12). Counting
people who genuflect on a bus may not be scientific, but it may
give an indication of whether the official figures are accurate.
An informal bus survey I conducted (Mexico 1988 and Uruguay
1993) indicates that while most Mexicans genuflect (90%), few
Uruguayans do (10%). This does not mean the official figures
are wrong; it just causes me to want to investigate more, perhaps
with my own scientific survey.
2. Examine data carefully.
My own informal word count for the book of Jonah shows that
"salvation" or "forgiveness" occurs five times, while "destroy"
or "die" is used 13 times. Does this mean that the theme of
Jonah is the destruction of the wicked, or that God is
unmerciful? Take Huff's suggestions on "how to talk back to a
statistic": ask who says so, how does he know, what is missing,
and does it make sense (Huff 1954:3).
3. Be careful in preparing your own surveys.
a) Scenario one: borrowing an existing survey in English.
Translation of "perfect" surveys from English to another
language does not make the new survey perfect! Culture,
language, and the psychology of testing all play a role in
creating a survey. I recently showed an American teacher a
survey given to the director of the Uruguayan teachers'
institute. On the "easy" question of race/ethnicity he was
puzzled: should he mark "White" or "American Indian" (because his
grandmother was half-Indian)? It never occurred to him to mark
"Hispanic," although his mother is from Spain, he has a Spanish
surname, and the only language he speaks is Spanish (Roane
1993b). Once translated and corrected for cultural differences,
the survey has become a different entity, basically your own (see
below for caveats).
b) Scenario two: creating your own instrument from scratch.
If you design your own survey instrument, you should perform
statistical tests of reliability (does it give consistent
results?) and validity (does it measure what you want it to?). You
should also field test it, to see what problems you might
encounter during the actual survey. You may even need to give
pre- and post-surveys to measure the effects of time on the
results. Above all, the person who collects the data should be
trained and have experience with that particular survey
instrument.
BIBLIOGRAPHY
1995 Canal 4 (Montevideo). Television news broadcast,
November 16, 8:30 PM.
1995 "Consulting the Oracle. Everyone loves polls.
But can you trust them?" U.S. News and World
Report (December 4) 52-58.
1995 "Entrevista con Oscar Alivos, Gerente Equipos
[Interview with Oscar Alivos, director of Equipos
survey firm]." Cultural Supplement of El Pais
(Montevideo), September 21, 1-3.
1996 "Sex in America: Good News and Bad News,"
Current Thoughts and Trends (February) 3.
HUFF, Darrell
1954 How to Lie With Statistics. New York: W.W.
Norton.
MATHEWS, Ed
1992 "Research Assisted Missions: A Rationale," Journal
of Applied Missiology (April) 31-35.
PAULOS, John Allen
1988 Innumeracy: Mathematical Illiteracy and its
Consequences. New York: Hill and Wang.
RAMA, Carlos
1964 La Religion en el Uruguay [Religion in Uruguay].
Montevideo: Nuestro Tiempo.
ROANE, Warren
1993a "Faculty Development Participation at Selected
Private Colleges in Texas with Special
Consideration of the 1987 Faculty at Work Study"
(Ph.D. dissertation, University of Texas).
1993b Interview with Enrique Segarra, Director of
Instituto de Profesores Artigas [Artigas
Teachers' College], December 10. Montevideo.
SHIPP, Glover
1990 "Research: A Key to Successful Urban Evangelism,"
Journal of Applied Missiology (April) 12-18.
Mirrored by permission of ACU Missions Personnel