
Online Survey: Instant Publication, Instant Mistake, All of the Above

Gina Walejko

Developments in information and communication technologies have led to new forms of communication and personal expression. For example, today many adolescents and adults spend time interacting with others via online video games (e.g., see Williams and Xiong in this volume). Others use social networking Web sites to keep in contact with peers, foster new friendships, find significant others, and even develop professional connections. The past several years have also seen the rise of blogs on a diverse array of topics, from personal diaries to politics and technology (e.g., Adamic and Glance 2005; Drezner and Farrell 2004; Singer 2005).

From a social scientific perspective, users of online communication technologies offer researchers interesting populations to study. Scholars can gain easy access to large samples and, with the help of online surveys, can ask these users attitudinal and behavioral questions. For example, what do political bloggers think about the mainstream media? Are young users of social networking sites concerned with privacy? Add to this the ease and relatively small cost of online survey software, and the junior researcher might begin to see a recipe for instant success. (Start with 1,000 bloggers. Add equal parts Web questionnaire and Stata. Stir. Bake for one year in a 90-degree graduate student office.)

Unfortunately, the allure of online surveys often masks the difficulties and trade-offs that they entail. For the most part, researchers have surveyed populations of interest located in the "real world" with offline modes such as mail, telephone, and face-to-face data-collection techniques. The Internet challenges researchers to rethink established standards of constructing sampling frames, creating questionnaires, and contacting samples. Furthermore, certain characteristics of these new forms of online communication complicate the process of quantitative research.
For example, how does a social scientist interested in studying "the most popular bloggers" operationalize such a project? Should this measurement be based on blog aggregator Web sites (which ones?), Web page traffic (hits, visitors, or page counts?), or neither? This chapter will explore the challenges that arise when doing survey research on Internet-based populations, discussing in depth the complications that arose during one specific study of bloggers and ending with a list of lessons learned.

the 2007 academic blogger survey

Given my interests in survey methodology and Internet use, I decided to execute a survey of academic bloggers. Blogs are online publications usually consisting of Web entries or short articles to which visitors may post public comments, either on the Web site itself or on their own blogs elsewhere. Blogs have become increasingly numerous; in 2008, the blog-tracking site Technorati followed almost 113 million blogs (2008). I began my research by collecting a list of bloggers who work as professors or researchers in universities, colleges, and research institutions. I then designed an online survey instrument to measure characteristics of this sample, specifically attitudes toward blogging and scholarship, frequency of blogging activities, type of content, Web use, and technological proficiency, as well as general demographic information. I then administered the survey questionnaire online.

As a new (and somewhat inexperienced) graduate student and researcher, I expected the project to be relatively painless. Little did I know that it would take me nearly three months (a whole summer) to collect the contact information for my population. The intricacies involved in sampling an online population, as well as the challenges of balancing coursework and research obligations, made the study more difficult than I had expected.
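The operationalization question raised above, how to measure "the most popular bloggers," can be made concrete with a minimal sketch. All blog names and traffic figures below are invented for illustration; the point is simply that equally plausible popularity metrics can produce different rankings of the same blogs.

```python
# Hypothetical data: three blogs scored on three candidate popularity
# metrics. Names and numbers are invented for illustration only.
blogs = {
    # blog:     (hits,   unique_visitors, inbound_links)
    "blog_a": (50_000, 4_000, 120),
    "blog_b": (30_000, 9_000, 80),
    "blog_c": (45_000, 2_500, 300),
}

def rank_by(metric_index):
    """Order blog names from most to least popular on one metric."""
    return sorted(blogs, key=lambda b: blogs[b][metric_index], reverse=True)

print(rank_by(0))  # ranking by hits
print(rank_by(1))  # ranking by unique visitors
print(rank_by(2))  # ranking by inbound links
```

With these invented figures, each metric crowns a different "most popular" blog, which is exactly the measurement ambiguity the chapter describes.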
What follows is a step-by-step description of the hurdles that I encountered while designing and administering an online survey of academic bloggers. First, I discuss in depth the planning associated with the project, including hypotheses and sampling design, questionnaire design and testing, and Institutional Review Board (IRB) approval. Second, I briefly discuss survey implementation, including correspondence and list management, technical difficulties that arose during data collection, and the problems that I encountered when calculating a response rate. I conclude the chapter with a list of lessons learned for researchers embarking on similar projects.

planning, planning, and more planning

Our fourth-grade science teachers taught us to develop hypotheses before testing and experimentation. We were also instructed that these hypotheses must be derived from inductive reasoning and educated guesses. How does a researcher interested in technology create well-informed hypotheses about phenomena that are rapidly changing? During the...

