Appendix

Obtaining a Sample

The heart of this research is a questionnaire sent to members of a variety of groups noted for their opposition to the Christian right. The questionnaire can be seen in table A.1. We administered it through SurveyMonkey, an online survey website. We located several cultural progressive groups and contacted them. In every case we sent a link to a contact member of the targeted group, who then sent the survey out to the members of that group. The advantage of this approach is that we obtained no list of potential respondents and were thus better able to maintain the anonymity of the respondents.1 The downside is that we do not know how many individuals received our link and cannot calculate a response rate.

We used a variety of local and national groups. One is a national group known for its promotion of atheism. Another is a national group known for its progressive political activism. A third is a regional group that concentrates on issues of education. Finally, we sent the survey link to a local activist group, which sent it out to other local groups and social contacts that our contact believed would fit the definition of cultural progressive. While we acknowledge that we did not obtain a probability sample, we did make efforts to gain some diversity in our population by religious ideology, level of political concern, and region. We are certain that we did not succeed by region, given the large number of individuals we surveyed from the South. However, we do believe that we found an adequate number of individuals who are highly concerned about the Christian right for political reasons and an adequate number who are concerned for religious reasons.

In addition to our survey, we also subscribed to a variety of news and electronic newsletters. Each of the three major groups in the previous paragraph supplied this primary material.
We read all of the materials for a given period of time to get a sense of the messages being sent out to the members of each group. The time frame depended on how frequently the primary sources were sent out. For example, one organization used a daily electronic journal, which we monitored for three months. Another organization sent out a monthly paper newsletter, which we monitored for an entire year. The periodicals were coded according to the general theme of each article. As we looked through the codings, we gained a general perception of the attitudes of the leaders of the social movement of cultural progressives. Articles and readings that were purely informational and did not attempt to persuade the reader about a given point were not necessarily coded.

One of the periodicals also contained letters to the editor. We looked through those letters as well for clues as to what drives cultural progressive activists. Although the letters do not come directly from cultural progressive organizations, the fact that the editors of the periodical allowed them to be printed indicates that the writers were compatible with the aims of the organizations. This is reinforced by the fact that few, if any, of the letters were critical of the organization. Finally, one of us attended a training workshop for one of the groups mentioned above. It was the only time we interacted face-to-face with some of the cultural progressive activists at an event. He kept notes during the meeting, and we discussed his experience. Fortunately, this visit occurred before the survey was conducted and allowed us to develop some understanding of what some cultural progressive activists want.

Coding of the Short-Answer Responses

The coding of the questions was straightforward and allowed for a quantitative assessment of the group as a whole. Each open-ended response was coded by one researcher.
Each attribute was coded as a dichotomous yes or no for that particular response. This allowed us to enter the answers into a statistical program. Each attribute was assessed on its own, and for many responses several attributes were coded as yes. However, we strove not to code the same phrase or word in a response under more than one attribute. Thus, a respondent who defined the Christian right as "murderous" would likely have that response coded under the attribute of violence but...
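The dichotomous coding scheme described above can be sketched in code. The following is a minimal illustration only; the attribute names and the example response are hypothetical stand-ins, not the study's actual codebook.

```python
# Hypothetical sketch of the dichotomous (yes/no) attribute coding
# described above. Attribute names here are illustrative, not the
# study's actual codebook.

# Each attribute in the codebook becomes its own yes/no variable.
ATTRIBUTES = ["violence", "intolerance", "political_power"]

def code_response(attributes_marked_yes):
    """Turn the set of attributes a coder marked 'yes' for one
    open-ended response into a row of 0/1 indicators suitable for
    entry into a statistical program."""
    return {attr: int(attr in attributes_marked_yes)
            for attr in ATTRIBUTES}

# A single response may be coded 'yes' on several attributes at once,
# as the text notes, but each phrase is counted under only one.
row = code_response({"violence", "intolerance"})
print(row)  # {'violence': 1, 'intolerance': 1, 'political_power': 0}
```

Rows produced this way stack naturally into a rectangular dataset, which is what allows the quantitative assessment of the group as a whole that the authors describe.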