Knowledge Construction and Information Seeking in Collaborative Learning / La construction des connaissances et la recherche d’information dans l’apprentissage collaboratif
Abstract

This study aims to better understand the complex dynamics of knowledge construction and information seeking in a collaborative learning setting. A total of 34 graduate students who participated in a collaborative research project were asked to complete process surveys in the initiation, midpoint, and completion phases of the project. The process survey comprised closed questions that sought to measure students’ perceptions of knowledge and difficulty as well as open-ended questions that asked students what they knew about the topic and what they considered difficult at each phase of the project. The results revealed growth in individual students’ knowledge as they proceeded through the project. In contrast to findings from studies of individual information seeking, students who participated in the collaborative research project began with confidence, as they developed a shared understanding of the topic in the early phase of the project. However, students became more stressed as the project progressed and they carried out their information-seeking activities in more individual ways.

Résumé

Cette étude a pour objectif de mieux comprendre la dynamique complexe de la construction des connaissances et de la recherche d’information dans la situation de l’apprentissage collaboratif. Nous avons demandé à trente-quatre étudiants des cycles supérieurs ayant participé à un projet de recherche collaborative de participer à une enquête sur le processus lors des phases de début, de milieu et de fin du projet. Aux fins de cette étude, l’enquête comprenait des questions fermées qui avaient pour but de mesurer les connaissances acquises ainsi que la difficulté telles que perçues par les étudiants, et des questions ouvertes qui demandaient aux étudiants ce qu’ils savaient sur le sujet et ce qu’ils considéraient comme difficile à chaque phase du projet. Les résultats ont montré une croissance des connaissances des étudiants au fur et à mesure qu’ils avançaient dans le projet. En comparant les résultats de cette étude avec les résultats d’études portant sur la recherche individuelle d’information, les étudiants ayant participé à un projet de recherche collaborative ont démarré le projet avec une confiance due au fait qu’ils avaient développé une compréhension commune de la question dès la première phase du projet. Toutefois, les étudiants ressentent plus de stress quand le projet avance et qu’ils doivent effectuer leurs activités de recherche d’information individuellement.

Keywords

information-seeking behaviour, collaborative information seeking, knowledge construction, collaborative learning, graduate students

Mots-clés

comportement de recherche d’information, recherche d’information en collaboration, construction des connaissances, apprentissage collaboratif, étudiants aux cycles supérieurs

Introduction

A substantial body of research in library and information science has been devoted to the theoretical and empirical understanding of users’ information-seeking behaviour. The question of what motivates information seeking has been a contentious one among researchers. For instance, a cognitive viewpoint on information seeking has asserted that people seek information to bridge a gap in their knowledge structure and then incorporate the information they find into that structure (e.g., Belkin 1980; Buckland 1991; Cole 2011; Dervin 1983; Kuhlthau 2004; Marchionini 1995; Wilson 2000). Several studies have supported the notion of information seeking as a process of knowledge construction with different cognitive stages (e.g., Cole et al. 2013; Tang and Solomon 1998; Vakkari and Hakala 2000; Vakkari, Pennanen, and Serola 2003; Wang and Soergel 1998). According to Savolainen (2012), the motivation for information seeking often relates to the actor’s questions about his or her ability to perform the information-seeking task. He further argued that the answers to these questions depend on diverse factors, such as the perceived difficulty of the task and the actor’s current level of knowledge about the subject area.

In a similar vein, the question of what motivates collaborative information seeking has been addressed. Since the early 1990s, a substantial number of researchers have expressed a need to explore aspects of collaboration in support of information seeking, on the assumption that information-seeking activities can be performed collaboratively as well as individually. Several studies have focused on groups of individuals within various work settings, such as engineers (Fidel et al. 2004), military personnel (Prekop 2002), health care teams (Reddy and Jansen 2008), and students (Hyldegård 2006). Most aimed to understand how people search for information to resolve their shared information needs, but several studies explored triggers for collaborative information-seeking activities (e.g., Paul and Reddy 2010; Reddy and Jansen 2008). However, much of the published research focused on social and environmental factors that influence the motivation to initiate and continue collaborative information seeking—that is, “the role demands of the individuals’ work or environments within which that work takes place” (Wilson 1999, 252), rather than the individual’s personal characteristics, such as emotional and cognitive space.

Learning contexts have been frequently studied to address the question of what motivates information seeking, as students often engage in information-seeking tasks as part of an instructional activity in a learning environment. Students frequently engage in such activities as locating, selecting, organizing, evaluating, synthesizing, and using relevant information sources to construct meaning about some particular knowledge content. Such activities have been assumed to lead to higher levels of knowledge acquisition and learning. Collaborative learning tasks, which involve interaction among students, constitute a complex process and have been considered a useful pedagogical method for fostering knowledge construction (Schellens and Valcke 2006). Many researchers have discussed the advantages of collaboration with peers within the learning environment. Boud, Cohen, and Sampson (1999), for instance, asserted that collaborative learning helps improve learners’ subject knowledge and achieve strong learning outcomes without additional staffing in the learning environment. Such collaborative learning tasks have been considered a new source for research into collaborative information seeking (e.g., Hyldegård 2006, 2009; Saleh and Large 2011; Sormunen, Tanni, and Heinström 2013).

However, as Limberg and Alexandersson (2009) pointed out, studies on information seeking for learning tasks have focused on research questions regarding how students seek, select, and use information for their learning task. Little research has been conducted on the interaction between how students seek, select, and use information and what they actually learn about the subject. Despite several studies on information seeking in a collaborative learning context, little attention has been paid to the unique drivers and constraints that exist in the collaboration, which are critical to enabling students to mediate collaborative information seeking. The question of how individuals perceive their collaborative learning task differently has rarely been considered, as most studies have centred on the group work dimension, which regards “groups as problem solving units and individuals acting as group members” (Hyldegård and Ingwersen 2007).

This article aims to understand the cognitive experience of students who are engaged in collaborative learning. In particular, it explores how individual students construct knowledge throughout the process, what difficulties they experience in each phase of the process, and how they perceive such difficulties. For students’ research processes, the study used the framework outlined in Kuhlthau’s (2004) Information Search Process (ISP) model, which describes information-seeking behaviour in tasks that require knowledge construction.

This study uses aggregated individual-level observations to examine the processes through which individual graduate students construct knowledge and seek information in a collaborative learning context, in particular:

  1. To assess the knowledge students gain as they progress through the collaborative learning task;

  2. To measure students’ perceptions of what they know and how difficult the collaborative learning task is; and

  3. To identify the constraints students face as they progress through the collaborative learning project.

Related studies

Information seeking in collaboration

The area of information seeking has been extensively studied in several contexts over the past few decades. More recently, researchers have paid attention to the collaborative aspects of information seeking because collaboration has become more prevalent in people’s everyday lives. The growing prominence of collaborative work has led to the emergence of a new research area called Collaborative Information Seeking (CIS), which investigates the role of collaboration in the information-seeking activities of groups (Reddy and Jansen 2008). In addition, the nature of information seeking in collaboration within various professional groups has become an intriguing research topic (Fidel et al. 2004; Prekop 2002; Reddy and Jansen 2008; Hyldegård 2006, 2009; Hansen and Järvelin 2005; Reddy and Spence 2008).

In the field of CIS, several models of collaborative information seeking have been constructed to explain the information-seeking roles and patterns performed (Prekop 2002), the different layers or phases of collaborative information seeking (Shah 2008; Karunakaran, Spence, and Reddy 2010), and collaboration levels and types by stages (Yue and He 2010). Such models were developed to explain complex phenomena related to information seeking in collaboration, but they have not been empirically tested and validated. The ISP model, which describes users’ experience in the process of information seeking as a series of thoughts, feelings, and actions (Kuhlthau 2004), has also been adopted in collaborative settings. The applicability of the ISP model has been explored in several studies on collaborative information seeking (e.g., Hyldegård 2006; Saleh and Large 2011; van Aalst et al. 2007; Shah and Gonzalez-Ibanez 2010). For instance, van Aalst et al. (2007) found that students in a collaborative group had a difficult time developing a focus and felt overwhelmed by the amount of information they found on their topics; several students reported feeling frustrated at the end of a search. By focusing on affective relevance in the collaborative information search context, Shah and Gonzalez-Ibanez (2010) found that some stages in the model, such as exploration, formulation, and collection, were not distinct in collaborative information seeking.

Among various environments, collaborative learning has become a topic for research. Several prior studies have provided evidence that students work together to find information, reflect on experience, and create knowledge in classroom settings, where collaborative learning often takes place. Several researchers have focused on the information search process in which students engage. Hyldegård (2006), based on Kuhlthau’s ISP model, examined the information behaviour of university students engaged in a group-based setting. Her research noted that the social dimension of collaborative information seeking did not completely coincide with the stages of the ISP model; the affective and cognitive states of individuals in groups also did not match the model in this context. Saleh and Large (2011) reported that students collaborated most during the task-formulation stage early in the project and less during the selection of the design solution. However, a few researchers have tried to explore an explicit link between information seeking and learning outcomes. In her early study on students who participated in a group project, Limberg (1999) found that few students differed from their group with regard to either their information seeking and use or their learning outcomes. Meyers (2011), in his study of middle school students, concluded that students’ group work experiences activated certain beneficial cognitive processes, such as elaboration, resource sharing, and strategy discussion, but task outcomes and learning outcomes revealed that students in the group condition did not outperform students working alone.

Graduate students’ information seeking

As research in the area of information seeking has progressed, various user groups and their respective information-seeking behaviours have been explored. Among those user groups, university students, the nearest and most convenient subjects, have been studied extensively. However, graduate students as the sole target group have rarely been studied in detail because they are often regarded as not distinct from undergraduate students.

Several studies have explored graduate students’ general access to and use of the library, searching habits and patterns, preferred information resources, and use of information resources for their scholarly activities. Some targeted graduate students in specific disciplines, such as the humanities (Barrett 2005; Delgadillo and Lynch 1999), education (Earp 2008), biology (Brown 2005), and the physical sciences (Brown 1999; Jamali and Nicholas 2008), whereas other studies targeted students from various departments at a single university (George et al. 2006; Fidzani 1998; Liao, Finn, and Lu 2007; Liu and Yang 2004; Barton et al. 2002; Kayongo and Helm 2010). Those previous studies have shown that discipline did not critically affect students’ information-seeking behaviour, with a few exceptions (e.g., Jamali and Nicholas 2008); rather, prior knowledge and search experience more significantly influenced information-seeking behaviour (Khosrowjerdi and Iranshahi 2011; Korobili, Malliari, and Zapounidou 2011). Khosrowjerdi and Iranshahi’s (2011) study, for instance, found a strong and positive relationship between prior knowledge and information-seeking behaviour, which was measured by five dimensions, such as relevance judgement, generation of new ideas, and effort made to search for information. Other common findings include that graduate students rely heavily on the Internet because of its timeliness and easy online access (Earp 2008; George et al. 2006; Liao, Finn and Lu 2007; Barton et al. 2002). Liu and Yang (2004) found that students were most satisfied with easily attainable information instead of seeking the most relevant sources, which indicates that convenience is preferred over content when choosing information resources.

Previous studies (e.g., Catalano 2010; Korobili, Malliari, and Zapounidou 2011) also concluded that many graduate students lack the basic skills required to use libraries and their resources effectively. In addition, they reported that students rarely obtained help from a reference librarian. Such studies stressed that graduate students need to develop the competencies to define research problems and to locate and organize the resources needed for academic research. Bradigan and her colleagues stressed that “students come to graduate study with vastly different levels of preparation and may understandably be unaware or reluctant to confront the deficiencies in their research training” (Bradigan, Kroll, and Sims 1987, 336). Accordingly, there has been a call for additional instruction for graduate students on how to conduct more sophisticated searches. Information literacy skills that are important for this audience include completing a comprehensive review of the literature, learning how to evaluate sources within the context of particular projects, and properly citing and including sources in theses or dissertations. To support the need for information literacy instruction, several studies discussed how librarians and faculty often worked together to integrate information literacy outcomes into graduate-level courses and assessed students’ information literacy outcomes and skills (Cooney and Hiris 2003; Buchanan, Luck and Jones 2002; Emmett and Emde 2007; Samson and Millet 2003; Grant and Berg 2003).

Methodology

Data collection

The participants of this study were graduate students enrolled in an elective course at a large public university in the Southwestern United States. All students were pursuing a master’s degree in library and information science, and none were first-year students of the degree program. All students had previously worked with their classmates in a distributed learning environment. The course was delivered online over a 15-week semester. The online course was supported with a learning management system that offered various communication tools, such as e-mail, discussion boards, and synchronous chat. The system also offered a group area as a platform for group interaction, where students had access to a variety of tools, including a group discussion board, instant chat, file exchange, a group wiki, and group e-mail.

The collaborative learning task, a curriculum-based unit, was designed to explore a specific topic in the field of digital library research. For the task, students were instructed to work together in groups of four to investigate contemporary issues and trends in a given topic and to present their research paper at the end of the semester. Groups were formed early in the semester according to students’ choice from a given list of topics. An information session was conducted to discuss expectations and provide clarification about the project. Throughout this project, students were expected to be involved in the research process, from understanding and defining the problem to writing the paper.

Of the 15-week semester, a total of 12 weeks was given to students to work together on the project. The project was broken down into phases, each four weeks long, called the initiation, midpoint, and completion phases of the project. Both individual and group deliverables were required for each phase. As group deliverables, each group was asked to submit a progress report at the midpoint of the project describing its progress towards the final paper, including the purpose and a brief outline of the paper, and to present the final paper by the end of the completion phase. As individual deliverables, each student was required to reflect on and document the progress of his or her own learning through a process survey at the end of each phase. In addition, students’ interaction in the group area was monitored and captured throughout the project.

The process survey was created based on the Student Learning through Inquiry Measure (SLIM) toolkit, which was developed by Todd, Kuhlthau, and Heinström (2005) to measure students’ knowledge construction and to track experiences in their information-seeking process. The toolkit has been employed to study various student groups, such as students in grades 6 to 12 (Kuhlthau, Heinström, and Todd 2008), high school students learning English as a second language (Kim 2010), and students in grades 7 to 10 (FitzGerald 2011). The survey for this study slightly modified the questions from the original SLIM toolkit. The process survey included the following questions:

  1. A closed question that sought to measure students’ perceived knowledge level for a given topic. Example: “How would you rate your knowledge about the topic?”

  2. A closed question to measure students’ perception of the difficulty of the research task. Example: “How would you rate the difficulty about the research task?”

  3. An open-ended question that sought to assess the actual knowledge students gained by asking them to describe what they knew about the topic in a few sentences. Example: “Write down what you know about the topic.”

  4. An open-ended question to identify what they considered difficult about the research task. Example: “Thinking of your research so far, what did you find difficult to do?”

It should be noted that the survey instrument used in this study was developed for a course in which the researcher was the instructor and was given to the students who served as research subjects for this study. The survey, in fact, helped students take the pulse of how they felt about their work and contributions, as well as how the group was operating. To mitigate the risk to students of serving as research subjects, informed consent forms were sent after the final grades for the course had been entered, allowing students the option of excluding their survey responses and interactions in the group area from data collection and analysis for this study. Of the 35 students enrolled in the class, 34 signed the informed consent agreeing to participate in the study.

Data analysis

A subject ID was assigned randomly to disguise the identity of the students. All responses to the closed questions were then coded on a 5-point scale in Excel. For example, the “Very easy” level of perceived difficulty was coded as 1, whereas “Very difficult” was coded as 5; the “Not at all knowledgeable” level of perceived knowledge was coded as 1, whereas “Extremely knowledgeable” was coded as 5.
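As a rough illustration of this coding step, the following Python sketch maps Likert-style labels to numeric codes. This is not the authors’ procedure (the coding was done in Excel); only the endpoint labels and the “not very knowledgeable” / “very knowledgeable” labels reported later in the results come from the article, and the remaining intermediate labels are assumed for illustration.

```python
# Minimal sketch of coding Likert-style survey responses on a 5-point scale.
# Labels marked "assumed" do not appear in the article and are illustrative only.
difficulty_scale = {
    "Very easy": 1,
    "Easy": 2,            # assumed label
    "Neutral": 3,         # assumed label
    "Difficult": 4,       # assumed label
    "Very difficult": 5,
}

knowledge_scale = {
    "Not at all knowledgeable": 1,
    "Not very knowledgeable": 2,   # label reported in the results
    "Somewhat knowledgeable": 3,   # assumed label
    "Very knowledgeable": 4,       # label reported in the results
    "Extremely knowledgeable": 5,
}

responses = ["Very easy", "Difficult", "Very difficult"]
coded = [difficulty_scale[r] for r in responses]
print(coded)  # [1, 4, 5]
```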

Open-ended responses were analysed using content analysis and grounded theory with the assistance of NVivo software. Responses regarding what students found difficult to do were analysed using open coding with the constant comparative method (Glaser 1965); a total of 10 coding categories for reasons of difficulty were identified in this study. Responses to what students knew about the topic, which reflect the content of knowledge, were coded according to the schemes of the SLIM toolkit. Such topic statements were divided into facts, explanations, and conclusions as follows:

  • Facts are statements that describe characteristics, processes, styles, actions, and class inclusion.

  • Explanations are statements that explain how and why, provide end results, and articulate some causality.

  • Conclusions are statements that formulate syntheses and express opinions, positions, and evaluations.

In terms of inter-coder reliability, two independent coders coded the open-ended responses from five participants. Cohen’s kappa was calculated to measure the agreement of the two coders; it was 0.73 (p < .05) for responses to what students knew about the topic and 0.86 (p < .01) for responses to what students considered difficult, indicating a high level of agreement between the coders.
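For readers unfamiliar with this reliability statistic, the following is a minimal sketch of how Cohen’s kappa can be computed with scikit-learn. The two coders’ category assignments below are hypothetical examples of the SLIM categories, not the study’s data.

```python
# Minimal sketch: Cohen's kappa for two coders' category assignments.
# The labels are invented placeholders, not the study's coded responses.
from sklearn.metrics import cohen_kappa_score

coder_a = ["fact", "fact", "conclusion", "explanation", "fact", "conclusion"]
coder_b = ["fact", "conclusion", "conclusion", "explanation", "fact", "conclusion"]

kappa = cohen_kappa_score(coder_a, coder_b)
print(f"Cohen's kappa: {kappa:.2f}")
```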

All coded data were entered into SPSS and analysed to produce descriptive and inferential statistics. A Mann-Whitney U test was used to determine whether students’ levels of difficulty, perceived knowledge, and actual knowledge varied between phases. A Friedman test, a non-parametric alternative to the one-way ANOVA, was performed to determine whether a significant change existed among the three phases of the project. These tests were chosen because the data were ordinal and non-parametric, the samples were small, and the design involved repeated measures. A Pearson correlation coefficient was computed to assess the relationship between difficulty and knowledge.
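A minimal SciPy sketch of the tests named above is shown below. The analysis in the study was run in SPSS; the arrays here are invented placeholder ratings, not the study’s measurements, and serve only to show how the Friedman, Mann-Whitney U, and Pearson statistics could be obtained.

```python
# Minimal sketch of the statistical tests described above, using SciPy.
# All values are invented placeholder data, not the study's measurements.
import numpy as np
from scipy.stats import friedmanchisquare, mannwhitneyu, pearsonr

# Perceived difficulty (1-5) for the same students in the three phases
initiation = np.array([3, 2, 3, 4, 2, 3, 3, 2])
midpoint = np.array([3, 3, 4, 4, 3, 3, 4, 3])
completion = np.array([2, 3, 3, 3, 2, 3, 3, 2])

# Friedman test: does the repeated measure change across the three phases?
chi2, p_friedman = friedmanchisquare(initiation, midpoint, completion)

# Mann-Whitney U test: compare ratings between two phases
u_stat, p_mwu = mannwhitneyu(initiation, midpoint)

# Pearson correlation: relationship between difficulty and topic statements
topic_statements = np.array([2, 3, 1, 2, 4, 3, 2, 5])
r, p_corr = pearsonr(completion, topic_statements)

print(chi2, p_friedman, u_stat, p_mwu, r, p_corr)
```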

In addition, field notes on students’ interaction in the group area were documented in Word files and organized in order by date. They were used to verify and support the findings from the analysis of the process surveys.

Results

Initiation of the project

At the beginning of the research process, students mostly assessed their knowledge as “not very knowledgeable” (M = 2.44). The number of total topic statements averaged 2.24. Most students presented their topics with statements that focused on facts (M = 1.03). The number of explanation statements was the lowest (M = 0.21). No correlations were found between perceived knowledge and actual knowledge. At this stage, students had not yet engaged deeply with their given topic.

This lack of topic knowledge influenced students’ perceptions of difficulty. Students, in fact, expressed their expectations about difficulty rather than the actual difficulty they were encountering. Several students expected the research to be difficult because of their lack of topic knowledge, as in the following example: “For topics in which I know little like this, visualizing the entire big picture and then narrowing down to the specific topic is often difficult” (S021). It should also be noted that group members’ opinions could influence students’ perceptions, as exemplified in subject 017’s comment: “I have not tried to find research this far. I have only read what my group has sent me. My group has told me it is hard to market to faculty. I think I might try taking a stab at that. I am not sure if I’ll be successful, though.” This is supported by the fact that group discussions were used exclusively for asking questions and for expressing personal doubts or ideas during this phase.

Regardless, most students expected the project to be “easy” (M = 2.88). They were fairly confident in their knowledge of searching tools and skills. For example, subject 027 was sure about where to go for information. He said, “I find it easier to find good academic resources through Google Scholar and the online electronics database.”

Midpoint of the project

Students’ perceived knowledge increased (M = 3.03) compared to the beginning of the process. Overall, actual knowledge increased significantly as the number of total topic statements averaged 4.03. The number of conclusion statements was the highest (M = 2.00) and was significantly higher than in the initiation phase (U = 400, p < .05). Even though the number of explanation statements was still the lowest (M = 0.82), students presented more explanation statements in the midpoint than in the initiation phase (U = 386, p < .05). This is because many students started to think about why or how something was happening in their topic through their group brainstorming between the initiation and the midpoint. Several students found that understanding the context of the topic was quite challenging, as presented in subject 026’s comment: “I’m trying to answer the ‘why is this important’ question, but I don’t know if the answer is so obvious that I feel dumb or if I haven’t found the answer to that question yet.”

The perceived level of difficulty also increased (M = 3.06, SD = 0.74). The level of difficulty students felt was the highest in the midpoint phase, as shown in figure 1. Finding specific, targeted information for the topic was most frequently mentioned as the reason for such perceived difficulty (33% of the statements). Examples include the following: “Finding more targeted personalization information has required more in-depth searching” (S036) and “The most challenging part has been narrowing down search results to digital library-specific challenges and not just challenges faced by users working in a virtual environment” (S038). Many students had already obtained general background resources for their topic and started to collect information on their focused views of the topic at this phase. They focused on individual reasoning rather than group knowledge building. As such, several students confessed that developing a focused topic was challenging (23% of the statements). Another challenge was picking out the relevant information from the many sources they had already identified (15% of statements). Students often felt overwhelmed by too much information related to the topic and had difficulty selecting appropriate information for their focus. During this phase, group discussion was devoted to sharing found information with the group and providing suggestions to others. Thus, students might have been challenged to manage information overload when determining what counts as enough information. Subject 018 completed her searching and said, “The difficult part is going through all the information and selecting the information I need for my part.” Sometimes, this challenge made students feel lost. Subject 003 decided to stop searching and said, “I am trying to keep my research brief and not get lost in the research process. At some point you need to stop with the searching.”

Figure 1. Mean of perceived difficulty by the three phases of the project

In this phase, individuals within the group often worked on their individual tasks, an outcome of the division of tasks among group members in the initiation phase. The postings in the group discussion represented the work of individuals, and uneven participation in the group discussion was observed. This implies that each individual’s progress diverged. For instance, subject 002 seemed to be far behind in the research process compared with his group members, even failing to initiate his own search and expressing deep frustration: “I have no idea what to begin with it. It seems that I have no clear idea of our topic.” When there was little interaction with peers in their group, students felt isolated and tended to seek help from other interpersonal sources rather than from their group. For instance, subject 011 sought help from his work colleagues, who were easily accessible, to solve the problem he faced: “Finding information on marketing digital resources has been challenging, and I will need to consult my work colleagues for direction.”

Completion of the project

Between the midpoint and the completion phases, students’ perceived knowledge increased significantly. Students mostly assessed themselves as “very knowledgeable” (M = 3.79, SD = 0.73). Students’ actual knowledge also increased; the number of total topic statements averaged 5.18. However, no significant increases were observed from the midpoint to the completion phase. The Pearson correlation test indicated that the number of total topic statements was negatively correlated with students’ perceived difficulty (r = –0.35, p < .05). In addition, the number of fact statements was negatively correlated with students’ perceived difficulty (r = –0.62, p < .01). That is, the more students had learned about the topic, the less difficult the task felt to them during this phase.

Even though students’ perceived level of difficulty was lower (M = 2.85) than in other stages, several students still spoke of the recurring difficulty in tracking down particular information; 22% of total difficulty statements were identified as falling in this category. For example, “Finding literature on accessibility specifically of digital libraries was difficult” (S024) and “It was a little difficult to find newer articles on born-digital storage solutions” (S031). It is worth noting that a small number of students (12% of statements) still seemed unclear about their focus because the information gained from peers in a group changed their existing schema or modified their original ideas. Subject 015 realized that his initial focus was too broad: “Digital data and its preservation is a huge topic. Even limiting the project to the digital humanities left a lot of research domains to explore, each with its own particular vocabulary, problem space, and methodological approach.”

At this phase, most students had completed their search and engaged in synthesizing the information to write a paper. Thus, the postings in the group discussion area demonstrated increased interactivity and participation. During this stage, most students focused on group knowledge building. Synthesizing information required students to process and interact with information and with peers in the group; extracting the most relevant content from the information and organizing that content in a manner that supported the purpose and format of the product or performance was strenuous for many students (25% of statements). Many students commented that it was not easy to summarize the main ideas extracted from the information gathered. The following are some examples: “The most difficult part for me is putting together information in a coherent thought for the paper” (S037); “I think the most difficult part of the project was deciding which information to use given the length constraints of the project” (S033).

Whereas few students commented on the difficulty of collaboration until the midpoint, many felt anxious about working with their group members during this phase. Another quarter of the statements concerned the issues of communicating as a group, coordinating different writing styles, and reaching consensus. Students commented as follows: “Clarifying expectations with other team members was more of a challenge” (S027) and “Working with a group remotely was a little challenging as well as trying to make the paper flow with four pieces of writing from people with different writing styles” (S028). This implies that communication and coordination activities in a collaborative learning task imposed an additional cognitive load on students.

Figure 2. Mean of topic statements by the three phases of the project

Differences across the three phases

A Friedman test was used to compare the mean changes in perceived difficulty, perceived knowledge, and actual knowledge across the three phases. Figure 2 shows that the number of fact, explanation, conclusion, and total statements increased as students proceeded through their project. The results revealed significant differences in the number of total topic statements and the number of conclusion statements among the three phases of the project, χ2 (2) = 16.15, p < .01 and χ2 (2) = 11.26, p < .01, respectively. This implies that students submitted more statements in the completion phase of the project than in the other two phases. However, students’ knowledge increased particularly between the initiation and the midpoint.

The test also showed significant differences among the three phases of the project with respect to the level of perceived knowledge, χ2 (2) = 41.79, p < .01. As shown in figure 3, most students presented themselves as very knowledgeable about their topics during the final writing task.

Discussion

This study documented change and growth in individual students’ knowledge as they proceeded through a collaborative learning task. The growth was confirmed both by the perceived knowledge assessed by students and by the actual knowledge measured by their topic statements. This result confirms Kuhlthau’s ISP model, which implies that individuals learn about their topic and construct knowledge as they proceed through the phases of the information-seeking process. Students’ knowledge particularly increased between the initiation and the midpoint, when groups changed their research strategies from group brainstorming to individual searching. In contrast to previous studies on individual learning contexts, which found that students expressed their knowledge mostly through factual statements (e.g., Todd 2006; Kim 2010), the students in this study expressed their knowledge predominantly through conclusion statements. This is because students viewed the collaborative learning task as one that brings together a range of ideas and meanings from individual group members, and they strove to establish group-level resolution in an early phase of the project.

Figure 3. Mean of perceived knowledge by the three phases of the project

We also observed a slight increase in the number of explanation statements between the initiation and the midpoint; however, the explanation statements did not increase substantially throughout the project. This result may indicate that the students who participated in a collaborative learning task might have missed an opportunity to engage in “deep learning” that enables them to transfer their “factual knowledge into usable knowledge” (Bransford, Brown, and Cocking 2000), which requires critical thinking and higher-order reasoning. This also corroborates a later study by Kuhlthau and her colleagues Heinström and Todd (2008), which found that if students failed to find a focus, their depth of knowledge remained at a superficial level.

The students showed significant increases in their perceived knowledge throughout the process. At the initiation, students tended to relate the task at hand to their previous knowledge; those who recognized gaps in their knowledge tended to feel that they were not knowledgeable about the topic. Previous empirical studies in cognitive psychology have asserted that an individual’s perceptions of his or her knowledge can play an integral role in influencing his or her decision making and behaviour (Radecki and Jaccard 1995); for instance, individuals who believe that they are already knowledgeable about a topic may be less likely to search out additional information about that topic. Perceived knowledge was also influenced by how knowledgeable a student’s group members were perceived as being. Therefore, several groups presented similar patterns in their perceptions of knowledge at the beginning of the project, then showed individual differences within the group as individuals moved into doing their own information seeking and the reasoning process evolved. It should also be noted that perceived knowledge and actual knowledge did not coincide. This weak correlation between actual and perceived knowledge suggests that other factors influenced students’ perceptions of knowledge. Students could be biased assessors of their knowledge levels as the knowledge levels of their group members could affect judgements of their own knowledge.

This study also revealed that students who participated in collaborative learning initiated the project with confidence but became more stressed as the project progressed. Students in this study arrived at a shared intragroup understanding of the topic quickly through discussions and brainstorming with their group members, which alleviated feelings of uncertainty and lack of confidence. In contrast, students’ perceived difficulty was highest at the midpoint, when they tended to do their research in a more individual way. Such perceived difficulty was likely related to feelings of frustration and cognitive burden. Several students struggled to find specific information for their targeted topic after a division of tasks and subtopics had been agreed on. This finding is in line with Pennanen and Vakkari’s (2003) study, which found that students’ search goals changed from expecting general information to expecting more specific information as they became more familiar with the topic. Many students shared information they found with their group members while gathering relevant and up-to-date information to aid their understanding of their own focused topic. However, when information is shared and much of it is relevant, students need to deal with the issue of information overload. This means that information sharing may not be beneficial in every instance, as it can hinder students from staying focused on the relevant topic. Thus, students were placed in a condition of cognitive overload, where “information overload was added to multitasking and interruptions” (Kirsch 2000).

Along with systematically searching for information, filtering and extracting information was another constraint experienced by the students. Such difficulty continued until the completion stage, when students needed to synthesize and organize what each group member had gathered on his or her given subtopic. The completion phase, which required group-level agreement and coordination, presented another set of challenges. Unlike in individual-based information seeking, some context-specific challenges of collaboration, such as negotiation and communication with group members, influenced students’ cognitive and emotional experiences until the process ended. Such collaboration challenges in the completion phase negatively influenced students’ experiences rather than the outcomes of their work.

It is interesting to note that students’ perceptions of difficulty were negatively correlated with the actual knowledge assessed by topic statements. This indicates that students who assessed their research project as more difficult tended to construct their knowledge less actively. The result is probably related to what Kuhlthau (1999) found in her study on perceptions of the information search process of an early career information worker: it is the perception of complexity, rather than the actual objective complexity of a task, that causes feelings of uncertainty. Furthermore, it also confirms what Ingwersen and Järvelin (2005) asserted: perceived task complexity or difficulty relates to the task performer’s knowledge and experience.

Despite the interesting findings of this study, several limitations challenge both the validity of the findings and the ability to generalize from them. Caution should be used when making generalizations based on the findings of this study alone, in part because of the small sample size. The sampling validity was further weakened by the use of graduate students in a single discipline: students in library and information science tend to be confident and sure of their advanced searching skills. In addition, the survey instrument, which was designed as a data collection framework to “chart the information-to-knowledge development” (Kuhlthau, Heinström, and Todd 2008) and identify barriers in the information-seeking experience, is subject to further improvement. For instance, in the open-ended question “Write down what you know about the topic,” students responded in their own words about the topic they were researching, which produced rich information from which their interpretations of the question could be inferred. However, it is challenging to measure knowledge, as knowledge itself may be a multidimensional construct. It should also be noted that this study included no direct measure of performance or learning outcomes, such as a post-test or an assessment of the task itself. Therefore, it was difficult to observe how much knowledge students actually gained during the project.

Conclusion

This study was conducted to (1) assess the knowledge students gained as they progressed through collaborative learning, (2) measure students’ perceptions of what they knew and how difficult the collaborative learning task was, and (3) identify constraints the students faced as they progressed through the collaborative learning project.

To answer these questions, we used Kuhlthau’s ISP model as a framework for understanding the process of information seeking in a collaborative learning setting. In particular, this study found the ISP model to be a useful and insightful explanation of the cognitive aspects of an individual’s involvement in collaborative information seeking. This study found that students’ problems and their progress in knowledge construction were diverse at each stage, because each student’s information-seeking activities became more differentiated, especially at the midpoint of the process. This confirms the results of Vakkari (2000), who validated Kuhlthau’s ISP model by investigating the information behaviour of students writing a research proposal and found that the participants proceeded at varying paces. In this study, students in the same group were not necessarily moving at the same pace or experiencing the same difficulty, even though their perceptions were influenced by their group members’ knowledge and opinions. This has implications for students who are involved in a collaborative learning process in terms of a particular “zone of intervention.” For instance, if students are aware that increased frustration and anxiety are to be expected midway through the process, they become less discouraged when it happens (Kracker 2002). In this sense, instructors and librarians who guide students through the collaborative learning process can emphasize this for students and be ready to provide efficient, helpful, and appropriate instructional intervention.

Furthermore, it should be emphasized that each group member can act as a collaborator who intervenes in other group members’ information-seeking processes. In this study, students often worked individually on their subtopics after dividing up the tasks among the group members through group communication. The division of tasks, which has been recognized as one of the phenomena of collaborative information seeking, can help reduce the difficulty and redundancy of the tasks and improve the task performer’s knowledge and experience during the group search process (Foley and Smeaton 2010). This implies that it is important to consider how to mediate the group search process by dividing tasks and sharing knowledge effectively in collaborative information seeking. Several forms of mediation, such as communicative mediation, user interface mediation, algorithmic mediation, and role-based mediation, can be used to divide up tasks so that collaborative search leads to better results than individual search (Shah, Pickens and Golovchinsky 2010; Kelly and Payne 2013). Thus, by mediating various aspects of task division, each group member can keep track of the others’ search activity, which supports a productive collaborative information-seeking process.

While CIS has matured to become a distinctive field of research over the last two decades, as Shah (2014) pointed out, the important question of how people synthesize and make sense of information to construct knowledge in the process of collaborative information seeking still needs to be addressed. Future research, complemented with other research strategies to collect data on what students actually do at each stage of the process, might continue to explore how mediating the division of tasks in the collaborative information-seeking process supports students’ knowledge construction. Further study with additional participants from various disciplines is needed to confirm the findings. Such study would strengthen the findings and provide additional, finer details concerning information seeking in collaborative learning.

Jeonghyun Kim
Department of Library and Information Sciences, College of Information, University of North Texas
Jeonghyun.Kim@unt.edu
Jisu Lee
Department of Library and Information Science, Sookmyung Women’s University, Seoul, Korea
Jisulee0423@gmail.com

References

Barrett, Andy. 2005. “The Information-Seeking Habits of Graduate Student Researchers in the Humanities.” Journal of Academic Librarianship 31 (4): 324–31. http://dx.doi.org/10.1016/j.acalib.2005.04.005.
Barton, Hope, Jim Cheng, Leo Clougherty, John Forys, Toby Lyles, Dorothy Marie Persson, Christine Walters, and Carlette Washington-Hoagland. 2002. “Identifying the Resource and Service Needs of Graduate and Professional Students: The University of Iowa User Needs of Graduate Professional Series.” Portal: Libraries and the Academy 2 (1): 125–43. http://dx.doi.org/10.1353/pla.2002.0014.
Belkin, Nicholas J. 1980. “Anomalous States of Knowledge as a Basis for Information Retrieval.” Canadian Journal of Information Science 5:133–43.
Boud, David, Ruth Cohen, and Jane Sampson. 1999. “Peer Learning and Assessment.” Assessment & Evaluation in Higher Education 24 (4): 413–26. http://dx.doi.org/10.1080/0260293990240405.
Bradigan, Pamela S., Susan M. Kroll, and Sally R. Sims. 1987. “Graduate Student Bibliographic Instruction at a Large University: A Workshop Approach.” Reference Quarterly 26 (3): 335–40.
Bransford, John D., Ann L. Brown, and Rodney R. Cocking. 2000. How People Learn. Washington, DC: National Academy Press.
Brown, Cecelia M. 1999. “Information Literacy of Physical Science Graduate Students in the Information Age.” College & Research Libraries 60 (5): 426–39.
———. 2005. “Where Do Molecular Biology Graduate Students Find Information?” Science & Technology Libraries 25 (3): 89–104. http://dx.doi.org/10.1300/J122v25n03_06.
Buchanan, Lori E., DeAnne Luck, and Ted C. Jones. 2002. “Integrating Information Literacy into the Virtual University: A Course Model.” Library Trends 51 (2): 144–67.
Buckland, Michael K. 1991. “Information as Thing.” Journal of the American Society for Information Science 42 (5): 351–60. http://dx.doi.org/10.1002/(SICI)1097-4571(199106)42:5<351::AID-ASI5>3.0.CO;2-3.
Catalano, Amy J. 2010. “Using ACRL Standards to Assess the Information Literacy of Graduate Students in an Education Program.” Evidence-Based Library and Information Practice 5 (4): 7–20.
Cole, Charles. 2011. “A Theory of Information Need for Information Retrieval That Connects Information to Knowledge.” Journal of the American Society for Information Science and Technology 62 (7): 1216–31. http://dx.doi.org/10.1002/asi.21541.
Cole, Charles, Jamshid Beheshti, Andrew Large, Isabelle Lamoureux, Dhary Abuhimed, and Mohammed AlGhamdi. 2013. “Seeking Information for a Middle School History Project: The Concept of Implicit Knowledge in the Students’ Transition from Kuhlthau’s Stage 3 to Stage 4.” Journal of the American Society for Information Science and Technology 64 (3): 558–73. http://dx.doi.org/10.1002/asi.22786.
Cooney, Martha, and Lolene Hiris. 2003. “Integrating Information Literacy and Its Assessment into a Graduate Business Course: A Collaborative Framework.” Research Strategies 19 (3–4): 213–32. http://dx.doi.org/10.1016/j.resstr.2004.11.002.
Delgadillo, Roberto, and Beverly P. Lynch. 1999. “Future Historians: Their Quest for Information.” College & Research Libraries 60 (3): 245–59.
Dervin, Brenda. 1983. “An Overview of Sense-Making Research: Concepts, Methods and Results.” Paper presented at the annual meeting of the International Communication Association, Dallas, TX.
Earp, Vanessa. 2008. “Information Source Preferences of Education Graduate Students.” Behavioral & Social Sciences Librarian 27 (2): 73–91. http://dx.doi.org/10.1080/01639260802194974.
Emmett, Ada, and Judith Emde. 2007. “Assessing Information Literacy Skills Using the ACRL Standards as a Guide.” Reference Services Review 35 (2): 210–29. http://dx.doi.org/10.1108/00907320710749146.
Fidel, Raya, Annelise Mark Pejtersen, Bryan Cleal, and Harry Bruce. 2004. “A Multidimensional Approach to the Study of Human-Information Interaction: A Case Study of Collaborative Information Retrieval.” Journal of the American Society for Information Science and Technology 55 (11): 939–53. http://dx.doi.org/10.1002/asi.20041.
Fidzani, Babakisi T. 1998. “Information Needs and Information Seeking Behaviour of Graduate Students at the University of Botswana.” Library Review 47 (7): 329–40. http://dx.doi.org/10.1108/00242539810233459.
FitzGerald, Lee. 2011. “The Twin Purposes of Guided Inquiry: Guiding Student Inquiry and Evidence Based Practice.” Scan 39: 26–41.
Foley, Colum, and Alan F. Smeaton. 2010. “Division of Labour and Sharing of Knowledge for Synchronous Collaborative Information Retrieval.” Information Processing & Management 46 (6): 762–72. http://dx.doi.org/10.1016/j.ipm.2009.10.010.
Glaser, Barney G. 1965. “The Constant Comparative Method of Qualitative Analysis.” Social Problems 12 (4): 436–45. http://dx.doi.org/10.2307/798843.
George, Carole, Alice Bright, Terry Hurlbert, Erika C. Linke, Gloriana St. Clair, and Joan Stein. 2006. “Scholarly Use of Information: Graduate Students’ Information Seeking Behaviour.” Information Research 11 (4). http://www.informationr.net/ir/11-4/paper272.html.
Grant, Marcia, and Marlowe Berg. 2003. “Information Literacy Integration in a Doctoral Program.” Behavioral & Social Sciences Librarian 22 (1): 115–28. http://dx.doi.org/10.1300/J103v22n01_08.
Hansen, Preben, and Kalervo Järvelin. 2005. “Collaborative Information Retrieval in an Information Intensive Domain.” Information Processing & Management 41 (5): 1101–19. http://dx.doi.org/10.1016/j.ipm.2004.04.016.
Hyldegård, Jette. 2006. “Collaborative Information Behaviour—Exploring Kuhlthau’s Information Search Process Model in a Group Based Educational Setting.” Information Processing & Management 42 (1): 276–98. http://dx.doi.org/10.1016/j.ipm.2004.06.013.
———. 2009. “Beyond the Search Process Exploring Group Members’ Information Behaviour in Context.” Information Processing & Management 45 (1): 142–58. http://dx.doi.org/10.1016/j.ipm.2008.05.007.
Hyldegård, Jette, and Peter Ingwersen. 2007. “Task Complexity and Information Behaviour in Group Based Problem Solving.” Information Research 12 (4). http://www.informationr.net/ir/12-4/colis/colis27.html.
Ingwersen, Peter, and Kalervo Järvelin. 2005. The Turn: Integration of Information Seeking and Retrieval in Context. Dordrecht, the Netherlands: Springer.
Jamali, Hamid R., and David Nicholas. 2008. “Information-Seeking Behaviour of Physicists and Astronomers.” Aslib Proceedings 60 (5): 444–62. http://dx.doi.org/10.1108/00012530810908184.
Karunakaran, Arvind, Patricia R. Spence, and Madhu C. Reddy. 2010. “Towards a Model of Collaborative Information Behaviour.” Proceedings of the 2nd International Workshop on Collaborative Information Seeking, Savannah, GA.
Kayongo, Jessica, and Clarence Helm. 2010. “Graduate Students and the Library: A Survey of Research Practices and Library Use at the University of Notre Dame.” Reference and User Services Quarterly 49 (4): 341–49. http://dx.doi.org/10.5860/rusq.49n4.341.
Kelly, Ryan, and Stephen J. Payne. 2013. “Division of Labour in Collaborative Information Seeking: Current Approaches and Future Directions.” Proceedings of the 5th International Workshop on Collaborative Information Seeking, San Antonio, TX.
Khosrowjerdi, Mahmood, and Mohammad Iranshahi. 2011. “Prior Knowledge and Information-Seeking Behavior of PhD and MA Students.” Library & Information Science Research 33 (4): 331–35. http://dx.doi.org/10.1016/j.lisr.2010.04.008.
Kim, Sung Un. 2010. “The Information Seeking and Use of English Language Learners in a High School Setting.” PhD diss., Rutgers University.
Kirsch, David. 2000. “A Few Thoughts on Cognitive Overload.” Intellectica 30:19–51.
Korobili, Stella, Aphrodite Malliari, and Sofia Zapounidou. 2011. “Factors That Influence Information Seeking Behavior: The Case of Greek Graduate Students.” Journal of Academic Librarianship 37 (2): 155–65. http://dx.doi.org/10.1016/j.acalib.2011.02.008.
Kracker, Jacqueline. 2002. “Research Anxiety and Students’ Perceptions of Research: An Experiment. Part 1. Effect of Teaching Kuhlthau’s ISP Model.” Journal of the American Society for Information Science and Technology 53 (4): 282–94. http://dx.doi.org/10.1002/asi.10040.
Kuhlthau, Carol C. 1999. “The Role of Experience in the Information Search Process of an Early Career Information Worker: Perceptions of Uncertainty, Complexity, Construction and Sources.” Journal of the American Society for Information Science 50 (5): 399–412. http://dx.doi.org/10.1002/(SICI)1097-4571(1999)50:5<399::AID-ASI3>3.0.CO;2-L.
———. 2004. Seeking Meaning: A Process Approach to Library and Information Services. 2nd ed. Westport, CT: Libraries Unlimited.
Kuhlthau, Carol C., Jannica E. Heinström, and Ross Todd. 2008. “The Information Search Process Revisited: Is the Model Still Useful?” Information Research 13 (4). http://www.informationr.net/ir/13-4/paper355.html.
Liao, Yan, Mary Finn, and Jun Lu. 2007. “Information-Seeking Behavior of International Graduate Students vs. American Graduate Students: A User Study at Virginia Tech 2005.” College & Research Libraries 68 (1): 5–25.
Limberg, Louise. 1999. “Three Conceptions of Information Seeking and Use.” In Exploring the Contexts of Information Behavior, ed. Thomas D. Wilson and David K. Allen, 116–35. London: Taylor Graham.
Limberg, Louise, and Mikael Alexandersson. 2009. “Learning and Information Seeking.” Encyclopaedia of Library and Information Sciences 3: 3252–62.
Liu, Zao, and Zheng Y. Yang. 2004. “Factors Influencing Distance-Education Graduate Students’ Use of Information Sources: A Use Study.” Journal of Academic Librarianship 30 (1): 24–35. http://dx.doi.org/10.1016/j.jal.2003.11.005.
Marchionini, Gary. 1995. Information Seeking in Electronic Environments. Cambridge: Cambridge University Press.
Meyers, Eric M. 2011. “The Nature and Impact of Information Problem Solving in the Middle School Science Classroom.” PhD diss., University of Washington.
Paul, Sharoda A., and Madhu C. Reddy. 2010. “A Framework for Sensemaking in Collaborative Information Seeking.” Proceedings of the Collaborative Information Seeking (CIS) Workshop, Savannah, GA.
Pennanen, Mikko, and Pertti Vakkari. 2003. “Students’ Conceptual Structure, Search Process, and Outcome while Preparing a Research Proposal: A Longitudinal Case Study.” Journal of the American Society for Information Science and Technology 54 (8): 759–70. http://dx.doi.org/10.1002/asi.10273.
Prekop, Paul. 2002. “A Qualitative Study of Collaborative Information Seeking.” Journal of Documentation 58 (5): 533–47. http://dx.doi.org/10.1108/00220410210441000.
Radecki, Carmen C., and James Jaccard. 1995. “Perceptions of Knowledge, Actual Knowledge, and Information Search Behavior.” Journal of Experimental Social Psychology 31 (2): 107–38. http://dx.doi.org/10.1006/jesp.1995.1006.
Reddy, Madhu C., and Bernard J. Jansen. 2008. “A Model for Understanding Collaborative Information Behaviour in Context: A Study of Two Healthcare Teams.” Information Processing & Management 44 (1): 256–73. http://dx.doi.org/10.1016/j.ipm.2006.12.010.
Reddy, Madhu C., and Patricia R. Spence. 2008. “Collaborative Information Seeking: A Field Study of a Multidisciplinary Patient Care Team.” Information Processing & Management 44 (1): 242–55. http://dx.doi.org/10.1016/j.ipm.2006.12.003.
Saleh, Nasser, and Andrew Large. 2011. “Collaborative Information Behaviour in Undergraduate Group Projects: A Study of Engineering Students.” Proceedings of the American Society for Information Science and Technology, New Orleans, LA. http://dx.doi.org/10.1002/meet.2011.14504801035.
Samson, Sue, and Michelle S. Millet. 2003. “The Learning Environment: First-Year Students, Teaching Assistants, and Information Literacy.” Research Strategies 19 (2): 84–98. http://dx.doi.org/10.1016/j.resstr.2004.02.001.
Savolainen, Reijo. 2012. “Conceptualizing Information Need in Context.” Information Research 17 (4). http://www.informationr.net/ir/17-4/paper534.html.
Schellens, Tammy, and Martin Valcke. 2006. “Fostering Knowledge Construction in University Students through Asynchronous Discussion Groups.” Computers & Education 46 (4): 349–70. http://dx.doi.org/10.1016/j.compedu.2004.07.010.
Shah, Chirag C. 2008. “Toward Collaborative Information Seeking (CIS).” Proceedings of Collaborative Exploratory Search Workshop at JCDL, Pittsburgh, PA.
———. 2014. “Collaborative Information Seeking.” Journal of the Association for Information Science and Technology 65 (2): 215–36. http://dx.doi.org/10.1002/asi.22977.
Shah, Chirag C., and Roberto Gonzalez-Ibanez. 2010. “Exploring Information Seeking Processes in Collaborative Search Tasks.” Proceedings of the American Society of Information Science and Technology, Pittsburgh, PA. http://dx.doi.org/10.1002/meet.14504701211.
Shah, Chirag C., Jeremy Pickens, and Gene Golovchinsky. 2010. “Role-Based Results Redistribution for Collaborative Information Retrieval.” Information Processing & Management 46 (6): 773–81. http://dx.doi.org/10.1016/j.ipm.2009.10.002.
Sormunen, Eero, Mikko Tanni, and Jannica Heinström. 2013. “Students’ Engagement in Collaborative Knowledge Construction in Group Assignments for Information Literacy.” Information Research 18 (3). http://www.informationr.net/ir/18-3/colis/paperC40.html.
Tang, Rong, and Paul Solomon. 1998. “Toward an Understanding of the Dynamics of Relevance Judgment: An Analysis of One Person’s Search Behavior.” Information Processing & Management 34 (2-3): 237–56. http://dx.doi.org/10.1016/S0306-4573(97)00081-2.
Todd, Ross. 2006. “From Information to Knowledge: Charting and Measuring Changes in Students’ Knowledge of a Curriculum Topic.” Information Research 11 (4). http://www.informationr.net/ir/11-4/paper264.html.
Todd, Ross, Carol C. Kuhlthau, and Jannica E. Heinström. 2005. “Student Learning Inquiry Measure (SLIM) Handbook.” The Center for International Scholarship in School Libraries. http://cissl.rutgers.edu/images/stories/docs/slimtoolkit.pdf.
Vakkari, Pertti. 2000. “Cognition and Changes of Search Terms and Tactics during Task Performance: A Longitudinal Case Study.” Proceedings of the RIAO Conference, Paris.
Vakkari, Pertti, and Nanna Hakala. 2000. “Changes in Relevance Criteria and Problem Stages in Task Performance.” Journal of Documentation 56 (5): 540–62. http://dx.doi.org/10.1108/EUM0000000007127.
Vakkari, Pertti, Mikko Pennanen, and Sami Serola. 2003. “Changes of Search Terms and Tactics while Writing a Research Proposal: A Longitudinal Case Study.” Information Processing & Management 39 (3): 445–63. http://dx.doi.org/10.1016/S0306-4573(02)00031-6.
van Aalst, Jan, Fung W. Hing, Li S. May, and Wong P. Yan. 2007. “Exploring Information Literacy in Secondary Schools in Hong Kong: A Case Study.” Library & Information Science Research 29 (4): 533–52. http://dx.doi.org/10.1016/j.lisr.2007.06.004.
Wang, Peiling, and Dagobert Soergel. 1998. “A Cognitive Model of Document Use during a Research Project. Study 1. Document Selection.” Journal of the American Society for Information Science 49 (2): 115–33. http://dx.doi.org/10.1002/(SICI)1097-4571(199802)49:2<115::AID-ASI3>3.0.CO;2-T.
Wilson, Tom D. 1999. “Models in Information Behaviour Research.” Journal of Documentation 55 (3): 249–70. http://dx.doi.org/10.1108/EUM0000000007145.
———. 2000. “Human Information Behaviour.” Informing Science 3 (1): 49–55.
Yue, Zhen, and Daqing He. 2010. “Exploring Collaborative Information Behaviour in Context: A Case Study of E-discovery.” Proceedings of the 2nd International Workshop on Collaborative Information Seeking, Savannah, GA.
