Enhancing Skills, Effecting Change: Evaluating an Intervention for Students with Below-Proficient Information Literacy Skills / Renforcer les compétences pour induire des changements : évaluation d'une intervention auprès d'étudiants possédant des compétences informationnelles inférieures à la maîtrise

Don Latham and Melissa Gross
Abstract

An intervention was developed for first-year community college students with below-proficient information literacy skills for the purpose of helping them gain an understanding of information literacy as a discrete skill set, develop a more accurate view of their own skill levels, and learn at least one skill they could apply in imposed and self-generated information-seeking tasks. Using an experimental design, the researchers conducted a summative evaluation of the intervention through pre- and post-intervention Information Literacy Test scores, pre- and post-intervention tests, pre- and post-intervention surveys, post-intervention interviews with participants, and a debriefing with the intervention instructor. This paper focuses on the results of the post-intervention interviews. Those results suggest that students learned at least one skill from the intervention. The results are mixed as to whether they gained a greater understanding of information literacy as a skill set and whether they recalibrated their perceptions of their own information literacy skill levels.

Résumé

Nous avons organisé une intervention dans un centre universitaire de premier cycle, auprès d'étudiants de première année ayant pour point commun des compétences informationnelles inférieures à la maîtrise, notre but étant de les aider à acquérir une compréhension des compétences informationnelles comme ensemble individuel d'habiletés à acquérir, à développer un point de vue personnel plus juste sur leurs propres niveaux de savoir-faire, et à apprendre au moins une compétence applicable dans les travaux informationnels imposés ou auto-générés. À l'aide d'un modèle expérimental, les chercheurs ont procédé à une évaluation globale de l'intervention au moyen des résultats pré et post-intervention, des tests pré et post-intervention, des enquêtes pré et post-intervention, des entrevues post-intervention avec les participants, et après avoir recueilli le témoignage de l'instructeur ayant mené l'intervention. Cette étude se concentre sur les résultats des entrevues post-intervention. Les résultats montrent que les étudiants ont acquis au moins une compétence grâce à l'intervention. Les résultats sont mitigés en ce qui concerne leur acquisition d'une compréhension des compétences informationnelles en tant qu'ensemble d'habiletés, et quant à savoir si les étudiants ont pu réévaluer leur perception de leurs propres niveaux de savoir-faire dans les compétences informationnelles.

Keywords

information literacy, information literacy instruction, information behaviour of college students

Mots-clés

compétences informationnelles, enseignement des compétences informationnelles, comportement informationnel chez les étudiants au collégial

Introduction

An information literacy skills intervention was developed and evaluated as part of a three-year project funded by the Institute of Museum and Library Services to identify and respond to the needs of first-year community college students with below-proficient information literacy (IL) skill levels. This project, now in its third year, involves library and information studies faculty collaborating with community college librarians to develop and deliver an effective intervention for such students. Students with below-proficient information literacy skill levels were identified through the Information Literacy Test (ILT) (Wise et al. 2009). In year one of the project, interviews were conducted to understand students' experiences with and perceptions of information and information seeking, and a series of focus groups was held to understand students' preferences for instruction. In year two an intervention was designed, developed, and piloted using an iterative approach. In year three the resulting intervention was delivered and evaluated to determine its impact on the target user group. Using an experimental design, the researchers conducted a summative evaluation of the intervention through pre- and post-intervention ILT scores, pre- and post-intervention tests, pre- and post-intervention surveys, post-intervention interviews with participants, and a debriefing with the intervention instructor. The results from the ILT scores, tests, surveys, and debriefing are in the process of being analysed. This paper focuses on the development of the intervention, the design of the summative evaluation, and the results of the post-intervention interviews with intervention participants.

Background

For over two decades, librarians and researchers have been emphasizing the importance of information literacy skills for academic success and personal fulfilment. In 1989, the Association of College and Research Libraries' Presidential Committee on Information Literacy issued its final report, asserting that "information literacy is a survival skill in the Information Age" and emphasizing that "libraries, which provide a significant public access point to such information and usually at no cost, must play a key role in preparing people for the demands of today's information society" (ACRL 1989). In 2000, ACRL released its Information Literacy Competency Standards for Higher Education, outlining five skills needed in order for a person to be information literate: being able to recognize when information is needed, access information, evaluate information, use information to accomplish a goal, and understand the ethical dimensions of information use (ACRL 2000). In 2002, several entities, including the United States Department of Education, the AOL Time Warner Foundation, Apple Computer, Microsoft Corporation, and the National Education Association, founded the Partnership for 21st Century Skills and identified information literacy as one of the key skill sets for successful 21st-century learning (http://www.p21.org/).

Yet, in spite of the emphasis on this important skill set, evidence suggests that many students enter college without having attained competence in information literacy. A study by the Educational Testing Service (ETS) found that of 3,000 college students and 800 high school students who took the ETS Information and Communication Technology Test, only 13% achieved scores that would indicate they are information literate (Foster 2006). A pilot study that included 51 first-semester students at a Research I university found that 45% scored in the below-proficient range on the Information Literacy Test, a standardized, validated test developed at James Madison University (Gross and Latham 2007). Moreover, a recent survey of nearly 900 college students found that 40% of them feel that they have "some gaps" in their ability to do research (Peter D. Hart Research Associates/Public Opinion Strategies 2005).

Community college students are often at a particular disadvantage. Open admissions policies mean that a wide range of students enrol in community colleges, and they come from a wide range of backgrounds in terms of their academic preparation before entering college. Approximately 50% of these students are the first in their families to attend college (Boswell and Wilson 2004). Over 40% of them enrol in remedial education courses (Boswell and Wilson 2004). It is therefore not surprising that there are low rates of retention and transfer among many community college students (Jacobson 2005).

Theoretical frameworks

This project has employed three theoretical frameworks in investigating and responding to the needs of community college students with below-proficient information literacy skill levels: the Dunning-Kruger effect, the imposed-query model, and the relational model of information literacy. The Dunning-Kruger effect (Kruger and Dunning 1999) suggests that, in a given knowledge domain, people with low skill levels are unlikely to recognize their deficiencies and are also unlikely to recognize competence in others. Gross (2005) hypothesizes that such individuals are unlikely to seek remediation since they believe they already possess the skills in question. Fortunately, it is possible to help these individuals recalibrate their self-perceptions by teaching them one or more of these skills (Kruger and Dunning 1999). The Dunning-Kruger effect does not, of course, apply in every knowledge domain—most laypeople, for example, recognize that they are not competent to perform brain surgery. However, research with first-year college students indicates that the Dunning-Kruger effect does apply in the domain of information literacy (Gross and Latham 2007).

The imposed-query model (Gross 1995) posits a significant distinction between self-generated queries and imposed queries. Self-generated queries are defined as information-seeking tasks that arise from personal information needs and interests. Imposed queries are defined as information-seeking tasks that are given by one party (the imposer) to another (the agent). With imposed queries, the agent may be unclear as to exactly what information is desired by the imposer and/or may have a low level of motivation to complete the information task.

The relational model of information literacy (Bruce 1997), rather than emphasizing attributes of information literacy, focuses instead on users' varying conceptions of and experiences with information literacy, information literacy instruction, and what may be called the "information landscape." It is thus the relation between users and information literacy that is the primary concern of the relational model. The relational model has been used to explore first-year college students' experiences with information seeking, their conceptions of information literacy, and their perceptions of their own information literacy skill levels. Gross and Latham (2011) found that students tend to see information seeking as a product, not a process. In other words, they focus on what information is found rather than on how it is found. They think of themselves as being good at finding information but do not see information literacy as a discrete skill set. Students prefer the Internet and people as sources, but they give little consideration to issues of information quality. They do see differences between imposed information tasks, which they describe as having several constraints, and self-generated tasks, which they consider to be more open and flexible (Gross and Latham 2011).

Research questions and instructional goals

Interviews and focus groups were conducted with students who had demonstrated below-proficient skills as determined by their performance on the Information Literacy Test (ILT). The ILT is a 60-item, computer-based, multiple-choice test that measures competency in four of the five ACRL standards; the use of information (the fourth standard) is not measured. The test developers define three levels of proficiency: a score of 90% or higher is considered advanced; a score of 65% to 89% is considered proficient; a score below 65% is considered below proficient (Wise et al. 2009).
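To make the banding rule concrete, here is a minimal sketch (our illustration, not part of the ILT or the study instruments) that maps a percentage score to the three proficiency levels defined by Wise et al. (2009):

```python
def ilt_proficiency(score_pct: float) -> str:
    """Classify an ILT percentage score using the bands from Wise et al. (2009)."""
    if score_pct >= 90:
        return "advanced"
    if score_pct >= 65:
        return "proficient"
    return "below proficient"

# Example: a student answering 38 of the ILT's 60 items correctly scores
# about 63.3%, which falls in the below-proficient band.
print(ilt_proficiency(38 / 60 * 100))  # -> below proficient
```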

Semi-structured in-depth interviews were conducted with 57 students with below-proficient information literacy skills. Students were asked about recent experiences with imposed and self-generated information seeking, the importance of information search skills, their perceptions of their own skill levels, and their experiences with information skills instruction. Six focus groups were conducted with 64 students with below-proficient information literacy skills. By design, this represented a different group of students; none of the students who were interviewed participated in the focus groups. In the focus groups, students were asked about their experiences with instruction, their preferences for instruction, and their thoughts about how to publicize an information skills workshop and how to motivate students to attend.

Based on what was learned from the interviews and focus groups, three primary instructional goals were developed: By the end of the intervention, participants should be able to

  • identify the three steps that make up the information skills process,

  • use keywords to search for information online, and

  • evaluate Internet search results.

The summative evaluation of the intervention has been guided by three research questions:

  • Do students who attend an educational intervention focused on the needs of students with below-proficient information literacy skills evince a change in their conception of the skills required to find, evaluate, and use information?

  • Do students who attend an educational intervention focused on the needs of students with below-proficient information literacy skills evince a change in their conception of their personal ability to find, evaluate, and use information?

  • Do students who attend an educational intervention focused on the needs of students with below-proficient information literacy skills learn at least one skill that they could use to improve both self-generated and imposed information-seeking task outcomes?

Intervention design

The design of the intervention is characterized by four guiding principles: (1) it is evidence based; (2) it focuses on issues of perception; (3) it focuses on learner-centred instruction; and (4) it is reality based. The intervention was based on various kinds of data collected from students with below-proficient information literacy skills as identified by their scores on the ILT. Both before and immediately after taking the ILT, these students were asked to estimate their performance on the test. As previous research has shown (Gross and Latham 2007), students with below-proficient skills tend to greatly overestimate their skill levels. In addition, data collected from the interviews with below-proficient students confirmed that these students have an inflated sense of their own skills and that they have the perception that information literacy is not a discrete skill set. Data collected from the focus groups indicated that students prefer face-to-face, interactive instruction with an opportunity to practise the skill they are learning.

The intervention also focused on students' experiences with and perceptions of information seeking, information literacy, and their own information literacy skills. It was felt that gaining an understanding of students' perceptions was crucial to developing an intervention that would address their needs. The data gathered from students suggested that the key goals of the workshop should be to change students' views of information literacy and change their perceptions of their own skills. The intervention is thus designed to be learner centred, and therefore employs Bruce's (2008) informed-learning approach. Informed learning, as it relates to course development, involves understanding students' experiences, developing relevant experiences for students as part of the learning process, encouraging reflection on learning, and providing opportunities for students to apply what they are learning (Bruce 2008). Finally, the intervention was reality based, designed to fit within the constraints faced by most instruction librarians. The intervention is intended to provide a framework for information literacy instruction but also to be flexible and adaptable in terms of the specific learning objectives addressed. It is designed to be presented as a one-hour workshop. Whether the one-hour workshop is the optimal way of providing information literacy instruction is certainly open to debate; however, time is a constraint that many instruction librarians must deal with.

The intervention was developed using an iterative approach to formative evaluation involving four stages. Initially, the intervention was pilot-tested with a few students in one-on-one sessions. A researcher sat down with each student and talked through the presentation slides, the worksheets, and the handouts. Feedback was solicited from each student and the results were compiled. The researchers then made changes to the various workshop materials based on feedback received from the students as well as their own observations during the one-on-one sessions. In the second stage of the formative evaluation, small groups of three to four students each were presented with the workshop slides, worksheets, and handouts and essentially experienced the workshop as it would be presented to a larger group. Again, feedback was solicited from the students, the researchers recorded their own observations, and changes were made to the content and materials based on the data collected. In the third stage, the workshop was presented in two full-scale pilot sessions involving a total of 19 students. Feedback was solicited from the students and the researchers who had observed the sessions, and changes were made accordingly. Finally, a training session was held with a group of five instruction librarians. This session included a demonstration of the intervention in which the librarians played the role of students. Following this session, the librarians were asked to provide their feedback, and this feedback was then incorporated into additional refinement of the workshop content and materials.

The intervention was designed as a one-hour, face-to-face workshop with a recommended class size of 12 to 16 students. It was held in a computer lab, and students worked in pairs. Students were introduced to the ASE Process Model (ASE being an acronym for "Analyse, Search, and Evaluate"), and they practised conducting web (rather than database) searches on self-generated (rather than imposed) information tasks. In the focus groups, students expressed a preference for face-to-face instruction with a small class size, ample opportunities to interact with other students and the instructor, opportunities to practise the skill(s) being taught, and the availability of handouts. It was felt that web searching on self-generated topics would give students the personal relevance that Bruce (2008) describes, as this type of search allowed students to focus on topics and tasks with which they already had some interest and familiarity.
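To illustrate the kinds of search refinements practised in the workshop, the sketch below contrasts a full-sentence query with keyword, exact-phrase, and truncated forms, using the raccoon scenario from the pre- and post-intervention tests; the specific queries are invented for illustration, and phrase and wildcard syntax varies across search engines and databases:

```python
# Hypothetical queries illustrating the workshop's search techniques.
full_sentence = "How do I stop raccoons from raiding my bird feeder without killing them?"

keyword_query = "raccoons bird feeder deterrent"        # keywords only, no filler words
exact_phrase_query = '"bird feeder" raccoon deterrent'  # quotes keep the phrase intact
truncated_query = "raccoon* deter*"                     # * matches raccoon(s), deter/deterrent

for label, query in [
    ("Full sentence (before)", full_sentence),
    ("Keywords", keyword_query),
    ("Exact phrase", exact_phrase_query),
    ("Truncation", truncated_query),
]:
    print(f"{label:22} -> {query}")
```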

The intervention was delivered in five sessions to 49 students with below-proficient information literacy skills (none of whom had participated in the previous interviews or focus groups). The same instructor, a librarian from one of the community college partners, led all of the workshops. This was done to control for any variability that might have been introduced had different instructors led the different sessions.

The students who participated in the workshops reflect the demographic diversity of the community colleges in which they are enrolled. One of the colleges is a mid-size school located in a mid-size city. Its students come from all over the state of Florida. The other college is a small school in a rural area. Its students come primarily from the college's five-county service region. Of the 49 students who participated in the workshop, 27 (55%) were female and 22 (45%) were male. Forty (82%) were between the ages of 18 and 22 years, while nine (18%) were older, ranging in age from 29 to 47 years. Thirty (61%) of the participants were white, sixteen (33%) black, two (4%) multiracial, and one (2%) Hispanic.

Summative evaluation of these sessions included several data collection techniques: pre- and post-intervention administration of the ILT, pre- and post-intervention tests and surveys, in-depth interviews with intervention participants, and a debriefing with the intervention instructor. In keeping with the experimental design, a control group of 43 students who did not take the workshop completed the pre- and post-intervention ILT, and pre- and post-intervention tests and surveys. The control group was used to determine whether any changes noted in students' performance on the ILT, conceptions of information literacy, and perceptions of their own skill levels could reasonably be attributed to the intervention.

The ILT was administered before and immediately after the workshop to determine whether students demonstrated any measurable changes in an objective score of their overall information literacy skills. A comparison of scores will be used to address RQ 3 (whether students had learned one or more skills from the workshop). To further address RQ 3, the pre- and post-intervention tests asked students several questions related to a hypothetical search activity (namely, if they wanted to find information about raccoons that were raiding their bird feeder but did not want to kill the raccoons, what would they do first; how would they search for information on the web; how would they broaden their search results; how would they narrow their search results; and how would they decide which information to use?). In addition, the post-intervention test asked students to list the steps in the information skills process. The post-intervention survey asked students to list any new skills or ideas they learned in the workshop. To address RQ 1 (whether students evinced a change in their conceptions of information literacy), both pre- and post-intervention surveys asked students whether they thought information skills were important for personal and school-related success, and the post-intervention survey asked students whether the workshop had changed their view of the importance of information skills. To address RQ 2 (whether students evinced a change in their perceptions of their own skills), the pre-intervention survey asked students to rate their information skills, and the post-intervention survey asked students to rate their skills both before and after the workshop.
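As a minimal sketch of how the pre- and post-intervention ILT scores for the workshop and control groups might be tallied, consider the following; the score pairs are hypothetical, and this is not the study's actual analysis:

```python
# Hypothetical (pre, post) ILT percentage scores for a few students.
from statistics import mean

workshop_pairs = [(58, 66), (61, 63), (52, 60)]  # attended the intervention
control_pairs = [(57, 58), (60, 59), (55, 57)]   # did not attend

def mean_change(pairs):
    """Average pre-to-post change across a group."""
    return mean(post - pre for pre, post in pairs)

print("Workshop mean change:", mean_change(workshop_pairs))  # -> 6
print("Control mean change: ", mean_change(control_pairs))   # -> ~0.67
```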

To gain more in-depth information to help address the research questions, follow-up interviews were conducted approximately two to four weeks after the workshop with 30 of the students who had participated in the intervention. To address RQ 1, students were asked whether the workshop had changed their views of information literacy as a skill set. To address RQ 2, students were asked whether the workshop had changed their views of their own ability to find, evaluate, and use information. To address RQ 3, they were asked whether they had used any skill learned in the workshop and, if so, what skills and in what context (school assignment, personal information need, etc.). If they answered in the affirmative, they were asked to explain how the use of this skill was different from what they would have done before the workshop. They were also asked to describe what they found most memorable about the workshop. To address RQ 1, 2, and 3, students were asked what the biggest impact of the workshop was on them, whether they would likely attend another similar workshop, and, if so, what additional skill(s) they would be interested in learning.

In addition, students were asked several questions related specifically to their assessment of the workshop presentation and format. For example, they were asked to comment on what they liked best and what they liked least, what they felt could be improved, whether they liked working in pairs, and what they thought about the worksheets and handouts. They were also asked about incentives that could be used to encourage them and other students at their college to attend future workshops and about the optimum time in the semester to offer such workshops. Finally, they were given an opportunity to make additional comments and to ask questions.

Each of the summative evaluation techniques relates, in one way or another, to the research goals for the intervention—its impact on how students conceive of information literacy as a skill set, the extent to which the intervention helps students recalibrate self-perceptions of ability, the extent to which it promotes an increase in skills, and its impact on both personal and academic information-seeking experiences. The interview results have been analysed. The results from the other evaluation techniques are in the process of being analysed. When the data analysis is complete, the results of the multiple evaluation techniques will be shared with the project's advisory board and their feedback will be solicited. On the basis of all of the evaluative data collected, the researchers will decide what, if any, changes should be made to workshop content and materials.

Interview results

The post-intervention interviews were conducted by the researchers approximately two to four weeks after the workshops were held. It was felt that this period would be sufficient to give students an opportunity to use what they learned in the workshop and to perhaps experience a change in their attitudes about information literacy and their own skill levels. In addition, it was felt that allowing this much time to pass would help researchers determine which parts of the workshops students remembered most vividly and which skills had been retained. The interviews were audio recorded, and the research assistants transcribed notes from the recordings. These notes were reviewed and supplemented by the researchers. The notes were then analysed for common themes related to the research questions discussed above.

Overall, the interview results indicate that the students liked the workshop, found it engaging, and agreed that they learned at least one skill from it. Specific results of the interviews will be discussed in relation to the three research questions for the intervention:

RQ 1. Do students who attend an educational intervention focused on the needs of students with below-proficient information literacy skills evince a change in their conception of the skills required to find, evaluate, and use information?

The students interviewed acknowledged that they had not really thought about information skills (the term that was used in the workshop) before taking the workshop, but that now they felt information skills were important in today's digital society, especially in regard to success with academic assignments such as writing research papers. One student said that before the workshop "I didn't think about how to search," while another stated that the process of taking the ILT "helped to get me to think about these skills." Students expressed the belief that information skills could be learned. They said that the workshop had made them aware of the process of finding and evaluating information; moreover, the skills they learned made finding information easier, and they now appreciated the importance of evaluating information. Some students, however, demonstrated that they did not have a clear idea of what constitutes information skills, as they conflated information literacy skills with other skills, such as computer literacy and writing skills, in their discussion of what they learned and the importance of the workshop.

Some students indicated that they were surprised to discover that such "tips" or "tricks" for finding and evaluating information existed, and they lamented the fact that they had not been taught these skills sooner. One student, for example, stated, "I wasn't taught this in high school." Another student identified the most important impact of the workshop as "recognizing the importance of finding information." Several students volunteered that they had shared one or more of the skills learned in the workshop with other students, friends, and/or family members. They also said that they believed other students at their college would benefit from the workshop—and some indicated that they had recommended the workshop to their classmates. Overall, the workshop participants who were interviewed seemed to value the skills taught in the workshop, and in many cases they indicated that the workshop changed their view of information skills.

RQ 2. Do students who attend an educational intervention focused on the needs of students with below-proficient information literacy skills evince a change in their conception of their personal ability to find, evaluate, and use information?

Students who attended the workshop stated that they felt more confident in their searching skills after attending the workshop than they had before. As with their conception of information skills, some students said that they had not given a lot of thought to their own ability to find, evaluate, and use information before taking the workshop. Some indicated that they had thought their skills were good before they attended the workshop but realized afterwards that their skills were not as strong as they had thought. One student, for example, said, "I realized I didn't know as much as I thought I did." Another said that he thought his skills were "okay" before but now they were "good." And yet another student said that, while he "already knew a lot of the stuff" before the workshop, now searching was "a little easier." Other students said that before attending the workshop they had felt their skills were weak but that after attending they felt their skills were stronger. One student stated that her skills before the workshop were "primitive" but now had improved to "average." Another said that he "never knew how to find information before." Yet another stated that after taking the ILT, "I realized what I didn't know." However, one student said she experienced "no change," that she was "still not good" with information skills.

Most students, however (as discussed in relation to RQ 1 above), indicated that the workshop had a positive impact on their information skills, which suggests that they had engaged in self-reflection and self-assessment in the time since the workshop. Several commented on liking the pre- and post-intervention surveys and added that these surveys helped them to think about what they had learned. In addition, students indicated that they would be interested in attending an additional workshop, and some students said that they would attend the same workshop again as a refresher. Their interest also suggests not only that they found the workshop useful but also that they have considered their current skill levels and believe they could use further improvement. When asked what other skills they would like to learn, students identified things such as additional search techniques, using additional search engines (besides Google), searching in databases, finding books in the library's online catalogue, and general library skills. Some mentioned wanting more information about how to evaluate information, particularly information found on the web. Others identified skills and content not directly related to information literacy, for example, using various computer applications, essay writing, grammar, and even stress management.

RQ 3. Do students who attend an educational intervention focused on the needs of students with below-proficient information literacy skills learn (at least) one skill that they could use to improve both self-generated and imposed information-seeking task outcomes?

The students agreed that they had learned something from the workshop. Typically, what they said they had learned was related specifically to finding information. Several specific search skills were mentioned as being especially valuable, including the use of keywords, exact-phrase searching, truncation, and advanced searching in Google. Some students noted topic/question analysis as a key skill they had learned, while others mentioned using techniques for evaluation as a key skill. When asked how these skills were different from what they would have done before the workshop, students said that before they would have "just Googled." They said that they would have typed "more words" and even full sentences or full questions in the search box and would have retrieved a lot of information that was not relevant. They stated that searching would have taken them much longer before the workshop, and several indicated that they would have had to seek help from librarians and others. Several mentioned that they would have taken the first results from their searches without evaluating the credibility of the sources.

Students indicated not only that they had learned something from the workshop but also that they had used one or more of these skills since attending the workshop. The skills were used in completing school assignments, typically essays for English classes, and in personal information seeking as well. Some examples of personal information seeking mentioned by students are planning a trip, comparing products for a pending purchase, starting a business, and reading the game and fishing report. Interestingly, several students indicated that they had tried their newly acquired skills in searching databases—even though the workshop had focused only on web searching—and that they were pleased with their success.

The most memorable part of the workshop varied among students. Some remembered the ASE Process; others recalled specific techniques, such as truncation or the advanced Google search. Still others remembered working with a partner and completing the worksheets while searching. Some recalled the example of looking for information about raccoons that was part of the pre- and post-intervention tests. When asked about the impact the workshop had on them, they stated that the skills learned would help them in conducting research, finding "proper information" for both school assignments and personal use, writing papers, and getting better grades. Research, they said, would now be easier; they would not have to seek help from others to be successful at conducting research; and they would now be able to get more useful information in less time.

Additional findings

Overall, students liked the workshop, and many commented on the effectiveness of the instructor. In particular, they appreciated that she had taken a "laid back" approach and was personable, that her pacing had been neither too fast nor too slow, and that she had interacted with them by asking questions and responding to the results of their searches. They liked being shown how to search for and evaluate information, and also having the opportunity to practise these skills in the workshop using their own topics. One student, for example, said that she liked the workshop because she "felt involved." All but a few of the students liked working in pairs and noted that being able to conduct searches with a partner helped to hold their interest much more than a straight lecture or demonstration or combination lecture-demonstration would have. One student commented that he appreciated being able to work in pairs because his partner was "more computer savvy" than he was. Some liked completing the worksheets during their searches, and others did not. Some commented on the usefulness of the handout, which contained a summary of the ASE Process as well as a grid of evaluation criteria to use with web resources.

The thing they liked least about the workshop was having to take the Information Literacy Test again afterwards. They felt that the test was too long and tiresome. The test, of course, was administered for research purposes and would most likely not be part of a regular workshop, but the students had no way of knowing that. When asked whether they thought the workshop had helped improve their performance on the ILT, most of them indicated that, while they had felt more comfortable taking the ILT the second time because they knew what to expect, they did not feel that the workshop had had any great impact on their scores (though, as discussed above, some said that taking the test made them more aware of the importance of information skills). The researchers expected this result, as the ILT is a broader assessment and measures more skills than were (or could have been) taught in the workshop.

When asked if they had any suggestions for improving the workshop, some students said that it should focus more on academic material rather than personal information needs. Others said that it should be made longer as a lot of material was covered in a relatively short period. Some suggested that they be allowed to take the worksheets home. (For the purposes of conducting an evaluation of the workshop, the researchers retained the worksheets.) And several stated that the workshop should be offered more frequently.

Participants also commented on the importance of incentives in encouraging students to attend such workshops. The kinds of incentives mentioned included extra credit, college credit, volunteer/community service credit, food, gift cards, and small gifts, such as T-shirts and pens. Several said that the workshop should be required. Others stated that students would be likely to attend if teachers recommended it and if other students reported that the content was interesting and useful. In terms of when the workshop should be offered, most students agreed that early in the semester, during the third or fourth week, would be ideal. That way, students could benefit from the skills learned and apply them in their classes. Some, though, recommended the middle of the semester, perhaps thinking that by that point major projects would have been assigned and discussed in class but that there would still be time left for completing the assignments. As for the context in which the workshop should be offered, some students mentioned orientation as a good candidate, while others said that it should be offered as part of a class, such as English, computer literacy, or a remedial course. Some noted that the workshop should be presented before students begin college—the summer before freshman year, for example, or even as early as middle school.

One final note: Not all students expressed comfort or familiarity with computers. One student said that because he lives in the country (where Internet access is problematic), he does not use computers except when he is at school. Another said that she had "not used computers until about a year and a half ago." As noted above, several students conflated information literacy with computer literacy and, when asked what additional skills they would like to learn, identified various computer applications.

Discussion

As indicated by the interview data, it is clear that students learned one or more skills from the workshop. In terms of whether the workshop changed their conception of information skills, the results are somewhat mixed. On the one hand, students said that they had not really considered these skills before the workshop. This is in keeping with prior research that indicates students do not think of information literacy as a skill set and that they think of information seeking as a product rather than a process (Gross and Latham 2009; 2011). The students interviewed reported that now they understood the importance of being able to find and evaluate information. The fact that they had used the skills and in many cases had shared them with others suggests not only that they found the workshop useful but also that they were beginning to think of information literacy as a discrete set of skills that they could learn and then share with others. On the other hand, while some of them were able to identify additional information skills that they would like to learn (such as other search engines and database searching), others had difficulty and identified tangentially related skills, such as computer applications and writing, or mostly unrelated skills, such as stress management. Some students had trouble thinking of any skill at all. So if students are beginning to think of information literacy as a skill set, it appears that some of them are still unclear as to exactly what those skills are.

By the same token, students generally reported a change in their perception of their own information literacy skill levels. Many acknowledged having never really thought about their skills before the workshop. Some of them thought before the workshop that their skills were good. This goes along with earlier research studies that found students with below-proficient information literacy skills tend to greatly overestimate their skill levels (Gross and Latham 2007). Kruger and Dunning (1999) suggest that often people can more accurately assess their skills in a given knowledge domain if they are taught these skills. The students who participated in the interviews reported that they did learn at least one skill, and they reported a change in their self-perceptions as well. Some said that, in retrospect, they realized their skills were not as strong as they had thought, while others stated that they knew their skills were weak. Students overall reported increased confidence in their information skills after completing the workshop.

The fact that some students were able to identify specific additional skills that they would like to acquire also suggests that they perhaps were able to change their self-perceptions and to recognize that they lack certain skills that could be potentially useful to them. Others, though, had difficulty identifying a particular skill, which may indicate not only an unclear conception of information literacy as a skill set but also students' lack of awareness of their own deficiencies. An important goal of instruction, of course, is to help students feel empowered with new knowledge and more confident in their abilities; however, the fact that students often described their information skills as "good" or "much better" suggests perhaps that they still have an inflated sense of their actual abilities. Analysis of the pre- and post-intervention surveys, and, to a lesser extent, comparison of the pre- and post-intervention ILT scores will help the researchers confirm whether this is the case.

It was gratifying to discover that students reported learning through the workshop at least one skill that they had been able to apply in schoolwork and/or their personal lives. It was also gratifying to see that students, overall, found all of the skills associated with the ASE Process approach useful, although they tended to focus on the skills associated with more effective searching. The researchers were surprised—and pleased—to learn that some students had successfully applied their newly acquired search skills to academic databases. This supported the researchers' belief that the ASE Process approach would be adaptable to other kinds of information seeking besides web searching. The fact that relatively few students identified learning how to evaluate search results as the most memorable aspect of the workshop or as the biggest impact the workshop had on them is in keeping with previous research that discovered students tend not to be greatly concerned about the quality of information found (Gross and Latham 2011).

Students responded positively to the workshop format and especially liked the opportunities for interaction with the instructor and other students, the combination of demonstration and practice, and the availability of handouts. Both their own interest in attending future workshops and their recommendations of the workshop to other students underscored the fact that they found value in the workshop they had attended. Their dislike of the ILT is not surprising, given that the test is long (60 multiple-choice items) and covers a lot of material not covered directly in the workshop or in other classes. Though most of the students found the test more familiar and a bit less intimidating the second time they took it (following the workshop), they did not think they necessarily performed better the second time around. The researchers also do not expect the ILT scores to increase noticeably, if at all, considering that the test assesses a much broader range of information than was covered in the workshop.

Students are aware of the importance of incentives. Those who participated in this project were paid for their participation, and they acknowledged that money was a strong motivator. In the absence of payment, though, students felt that the strongest incentives for students to attend the workshop would be to make it mandatory, to provide college credit, to provide extra credit, and/or to provide community service credit. While several mentioned the importance of the workshop and felt it was beneficial, especially in terms of helping them get better grades, most felt that more tangible motivators would be necessary to entice students to attend future workshops.

Implications for research and practice

The researchers will complete the data analysis of the pre- and post-intervention ILT scores, pre- and post-intervention tests, and the pre- and post-intervention survey responses, and will compare the results from the workshop participants to those from the control group. Data from all sources will be triangulated to determine the extent to which the data sets support or contradict each other in order to draw conclusions on the effectiveness of the workshop in reaching the research goals. What is learned in the summative evaluation will also be used to revise the workshop content and materials as needed. The researchers are committed to continuing to refine the ASE Process workshop and disseminating the results of this project to academic and school librarians. Toward that end, an Attaining Information Literacy website has been developed, where information about the project, workshop materials, and citations to research presentations and publications are being made available (http://www.attaininfolit.org). The site has had over 9,000 visitors since May 1, 2010.

Possible future research projects include using the ASE Process approach with other instructional objectives. For instance, it could be used to teach database searching and could be focused on a school assignment instead of a self-generated topic. Another strategy would be to focus on one part of the ASE Process—evaluating sources, for example. The ASE Process approach could be used with other audiences too: with first-year students at other colleges; with college juniors, seniors, and even graduate students; with high school students or, as one interview participant suggested, middle school students; and with senior adults.

The results of the project suggest that the ASE Process workshop could be incorporated into the services offered by libraries of various types. It is adaptable and flexible and could be used by instruction librarians in academic, school, and even public libraries to teach basic information skills. It could be introduced in an initial workshop and then used as a kind of scaffold for teaching additional skills related to finding, evaluating, and using information. It could easily be presented as a stand-alone session, as was done in this project, or offered as part of a class. Regardless of how the workshop is used or adapted, the features that the students in our study responded to most positively should be retained: the interactive aspect, the opportunity to practise skills, and the availability of handouts for review purposes. It is also important to provide incentives for students to participate in such workshops. Depending on the target audience, such motivators might include course credit, extra credit, making attendance a class requirement, or demonstrable benefits (e.g., improved grades or the acquisition of new skills).

This project has sought to inform research and practice in the development of information literacy skills by gathering data about students' experiences with information seeking, conceptions of information literacy, and perceptions of their own skill levels. The workshop that has been developed has the potential to address the instructional needs of students with below-proficient information literacy skills, but it may also be beneficial to students with proficient and even advanced skills. By helping students gain an understanding of information literacy as a skill set, helping them accurately assess their own skill levels, and teaching them useful skills that they can apply in both school-related and personal information-seeking tasks, this workshop can help students develop a skill set that will serve them well as they complete their academic programs and then move into the workforce.

Don Latham, Associate Professor
School of Library & Information Studies, Florida State University
Melissa Gross, Professor
School of Library & Information Studies, Florida State University

Acknowledgements

The researchers would like to thank the Institute of Museum and Library Services for funding the Attaining Information Literacy Project; Bonnie Armstrong for assisting with the instructional design for the intervention; the librarians at the two community colleges, Renee Hopkins, Jane Stephens, and Colleen Thorburn; our graduate research assistants, Debi Carruth, Jon Hollister, Meredith Mills, and Will Woodley; and our advisory board, Rebecca Bichel, Kenneth Burhanna, Sarah McDaniel, and Bianca Rodriguez.

References

Association of College and Research Libraries. 1989. Presidential Committee on Information Literacy: Final Report. Chicago, IL: Association of College and Research Libraries. http://www.ala.org/ala/mgrps/divs/acrl/publications/whitepapers/presidential.cfm.
Association of College and Research Libraries. 2000. Information Literacy Competency Standards for Higher Education. Chicago, IL: Association of College and Research Libraries. http://www.ala.org/ala/mgrps/divs/acrl/standards/informationliteracycompetency.cfm.
Boswell, Katherine, and Cynthia D. Wilson, eds. 2004. Keeping America's Promise: A Report on the Future of the Community College. Denver, CO: Education Commission of the States. http://www.league.org/league/projects/promise/files/promise.pdf.
Bruce, Christine. 1997. The Seven Faces of Information Literacy. Adelaide, Australia: Auslib Press.
———. 2008. Informed Learning. Chicago: Association of College and Research Libraries.
Foster, Andrea L. 2006. "Students Fall Short on 'Information Literacy,' Educational Testing Service's Study Finds." Chronicle of Higher Education, October 26.
Gross, Melissa. 1995. "The Imposed Query." RQ 35 (2): 236-43.
———. 2005. "The Impact of Low-Level Skills on Information-Seeking Behavior: Implications of Competency Theory for Research and Practice." Reference and User Services Quarterly 45 (2): 155-62.
Gross, Melissa, and Don Latham. 2007. "Attaining Information Literacy: An Investigation of the Relationship between Skill Level, Self Estimates of Skill, and Library Anxiety." Library & Information Science Research 29 (3): 332-53.
———. 2009. "Undergraduate Perceptions of Information Literacy: Defining, Attaining, and Self-Assessing Skills." College & Research Libraries 70 (4): 336-50.
———. 2011. "Experiences with and Perceptions of Information: A Phenomenographic Study of First-Year College Students." Library Quarterly 81 (2): 161-86.
Jacobson, David L. 2005. "The New Core Competence of the Community College." Change 37 (4): 52-62.
Kruger, Justin, and David Dunning. 1999. "Unskilled and Unaware of It: How Difficulties in Recognizing One's Own Incompetence Lead to Inflated Self-Assessments." Journal of Personality and Social Psychology 77 (6): 1121-34.
Peter D. Hart Research Associates/Public Opinion Strategies. 2005. Rising to the Challenge: Are High School Graduates Prepared for College and Work? A Study of Recent High School Graduates, College Instructors, and Employers. Washington, DC: Achieve, Inc. http://www.achieve.org/node/548.
Wise, Steven L., Lynn Cameron, Sheng-Ta Yang, and Susan L. Davis. 2009. Information Literacy Test: Test Manual. Harrisonburg, VA: Center for Assessment & Research Studies. http://www.madisonassessment.com/uploads/ILT%20Test%20Manual%202010.pdf.
