University of Toronto Press
Service Quality Assessment in University Libraries of Pakistan
Abstract

This study measures the service quality of university libraries of Pakistan from the user’s perspective. The data were collected (N = 1,473) from undergraduate and graduate students and faculty members of 22 Pakistani universities through a locally modified LibQUAL+ survey in Urdu. Library users rated 22 core survey items on a scale of 1 (low) to 9 (high) in terms of minimum acceptable service quality, desired service quality, and perceived service quality. Study findings indicate that libraries overall do not meet users’ minimum acceptable and desired levels of service quality. The zone of tolerance identified eight problematic services, most of which are related to the information control dimension. This study also indicates a wide gap between users’ perceptions and expectations of service quality.

Keywords

LibQUAL+, service quality, university libraries, Pakistan, South Asia, developing countries

Introduction

A university library contributes greatly to its institution's mission and to academic excellence. To this end, university libraries acquire, organize, preserve, and disseminate information, and they offer a range of services, physical facilities, document collections, access to information, and study spaces. The traditional services and role of university libraries have changed over time owing to the growth of information providers (vendors, Google, Amazon, etc.), rising user expectations, the application of modern technologies, global competition in the information service sector, the digital revolution, the proliferation of information formats, and the rising costs of physical information materials. University libraries require a greater understanding of users' needs, experiences, expectations, and perceptions if they are to overcome these challenges. Assessment of library service quality helps to identify weak and strong areas, to narrow the gap between customers' perceptions and expectations, to justify resources, and to plan current and future services. The service-quality literature (Chweh 1982; Hernon and McClure 1986, 1990; Nitecki 1996; Oldman and Wills 1977; Taylor 1986; Whitehall 1992) places customers at the centre of service quality assessment and claims that "only customers judge quality; all other judgments are essentially irrelevant" (Zeithaml, Parasuraman, and Berry 1993, 16).

Traditional services are becoming obsolete and no longer meet users' demands for information. Library authorities should recognize users' differing needs, priorities, and feedback in this regard. All programs and initiatives regarding current and future services must be user-centred. Libraries in developed countries have recognized this reality and focused on meeting their customers' needs, moving from input-based measures to outcome-based evaluation of service quality; in developing countries like Pakistan, however, libraries still lag behind in this area. Unlike in the developed world, users' perceptions of library service quality in Pakistan are not regularly assessed (Rehman and Pervaiz 2007). No data are available to inform library managers, policy makers, universities, and the Higher Education Commission (HEC) about users' expectations and perceptions, or about gaps between perceptions and expectations across individual services, dimensions, and user groups.

Gap in literature and significance of the study

Scholarly literature on the various aspects of library service quality in developed countries is commonly available. However, very few studies are available from developing countries, like China (Wang 2007), South Africa (Moon 2007), Bangladesh (Ahmed and Shoeb 2009), Iran (Hariri and Afnani 2008), India (Sahu 2007; Manjunatha and Shivalingaiah 2004; Sherikar, Jange, and Sangam 2006; Thakuria 2007), and Malaysia (Kiran 2010), and there is no comprehensive study of university libraries in Pakistan.

The LibQUAL+ survey is the only reliable, valid, and specialized instrument for the assessment of library service quality. More than 1.5 million library users from 1,200 libraries have participated in the LibQUAL+ survey (Kyrillidou 2011), yet not a single participant has come from South Asia. Service quality assessment in university libraries of Pakistan offers numerous benefits: (1) identification of service shortfalls, (2) identification of gaps across services and user groups, (3) help in justifying resources, (4) improvement of library services, and (5) marketing and promotion of library services and resources. Assessing the service quality of Pakistani university libraries will give library users an opportunity to communicate where library services need improvement, and the concerned authorities can then take action to better meet users' expectations of service quality.

Research questions

This study seeks answers to the following research questions with reference to university libraries of Pakistan:

  • What services are within the zone of tolerance for the overall user group and subgroups?

  • Which of the service quality dimensions are within or outside the zone of tolerance for the overall user group as well as for subgroups?

  • Which attributes of the library service quality meet, exceed, or fall short of user expectations?

Literature review

Service quality in libraries

The term "library service quality" is now frequently used in the library and information science (LIS) literature. However, there is no consensus on its definition: it is an elusive, ambiguous, subjective, and multifaceted concept. Library service quality is basically defined in terms of gap analysis, specifically the gap between customers' minimum acceptable or desired level of service and their perceptions of the services actually received (Hernon and Whitman 2001).

Theoretical foundations of service quality

The most well-established service quality evaluation models (i.e., SERVQUAL and LibQUAL+) define service quality as the “difference between customers’ perceptions and expectations” using disconfirmation/confirmation theory. This theory is based on the satisfaction literature.

According to the confirmation/disconfirmation theory model, positive disconfirmation occurs when a customer's perception of a service performance exceeds prior expectations of that performance (Green 2007). A positive gap indicates that users' minimum acceptable and desired levels of service quality were met or exceeded; the customer is delighted and considers the service quality exceptionally good. Conversely, when service quality is lower than the acceptable or desired level, expectations are disconfirmed negatively and users' standards or desires are not met; the customer is likely to judge the quality of service to be low and to be disappointed. Zeithaml, Berry, and Parasuraman (1993) point out that customers have two types of expectations: minimum acceptable service quality and desired service quality. The first is the minimum level of service that users would find acceptable, and the second is the level of service that users personally want. The difference between minimum acceptable and desired service quality is called the zone of tolerance (ZOT). The difference between perceived service quality and minimum acceptable service quality is called the service adequacy gap (SAG), and the difference between perceived service quality and desired service quality is called the service superiority gap (SSG).
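The three measures defined above are simple arithmetic on the three ratings a respondent gives each item. A minimal sketch (illustrative only; the ratings below are hypothetical, not the study's data):

```python
# Gap scores from the three ratings a user gives one item on the
# 1 (low) to 9 (high) scale: minimum acceptable, desired, perceived.

def gap_scores(minimum: float, desired: float, perceived: float) -> dict:
    """Compute the zone of tolerance width and the two gap measures."""
    return {
        "ZOT": desired - minimum,    # width of the zone of tolerance
        "SAG": perceived - minimum,  # service adequacy gap
        "SSG": perceived - desired,  # service superiority gap
    }

# Hypothetical item: minimum 5.0, desired 7.0, perceived 4.5.
scores = gap_scores(5.0, 7.0, 4.5)
# A negative SAG means even the minimum acceptable level was not met;
# the SSG is negative whenever perceived quality falls short of desires.
```

A perceived score inside the ZOT yields a non-negative SAG and a non-positive SSG, which is the pattern the literature describes as acceptable performance.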

The LibQUAL+ instrument

LibQUAL+ is a well-known and recognized instrument that libraries use to "solicit, track, understand, and act upon users' opinions of service quality" (Association of Research Libraries 2010). More than 1.5 million library users from 1,200 libraries have participated in LibQUAL+ since its inception. The instrument was developed collaboratively by the Association of Research Libraries (ARL) and the Texas A&M University Libraries, and it rates service quality from the customer's perspective. As a result of various refinements, the current LibQUAL+ version measures library service quality through 22 core questions across three dimensions: "affect of service" (AS), "information control" (IC), and "library as place" (LP).

The AS dimension consists of nine questions which are related to courtesy, knowledge, and helpfulness of library staff in delivering user services. The IC dimension addresses (through eight questions) the adequacy of print and electronic collections, the ease with which access tools can be used, the modernity of equipment, the quality of the library website, and the ability of users to access information themselves without assistance. The LP dimension focuses on user perceptions of quiet, comfortable, inviting, and reflective study space that inspires study and learning.

Various studies (Cook, Heath, Thompson, et al. 2001; Thompson and Cook 2002; Thompson, Cook, and Heath 2003; Thompson, Cook, and Kyrillidou 2005; Thompson, Cook, and Kyrillidou 2006; Thompson, Kyrillidou, and Cook 2008) have confirmed the psychometric integrity of the LibQUAL+ instrument using well-known approaches such as "structural equation modeling, reliability analysis, factor analysis, taxonometric analysis and latent trait item response theory" (Miller 2008, 37).

Users' perceptions and expectations of library services

The perception score typically falls within the ZOT. Parasuraman, Zeithaml, and Berry (1991) state that a service performance below the ZOT can create dissatisfaction and disappointment and reduce customer reliability and loyalty. A performance level within or above the ZOT shows that users' minimum standards were met, and it increases their loyalty, reliability, and satisfaction with a service organization. Considering minimum acceptable, desired, and perceived service quality together provides a great deal of understanding of how well a library is performing from the perspective of its users.

The review of literature (Cook et al. 2008, 2009; Cook et al. 2010; Cook, Heath, and Thompson 2003; Hiller 2004; Hubbard and Walter 2005; Jaggars, Jaggars, and Duffy 2009; Lessin 2004; Nimsomboon and Nagata 2003; Shedlock and Walton 2004; Wilson 2004) reveals that libraries were meeting users’ minimum requirements and users were satisfied overall with staff, collection, access, physical facilities, environment, and study-space related services, and all dimensions were within the ZOT. However, a few studies (Ahmed and Shoeb 2009; Association of Research Libraries 2011; Kemp 2002; Kyrillidou and Persson 2006; Lock and Town 2005) indicate users’ minimum requirements were not met in specific IC dimension questions and the overall IC dimension.

In terms of the service quality dimensions, the findings of the reviewed studies varied (Boyd-Byrnes and Rosenthal 2005; Cook et al. 2003; Hubbard and Walter 2005; Jaggars, Jaggars, and Duffy 2009; Kyrillidou and Persson 2006; Lippincott and Kyrillidou 2004; Shedlock and Walton 2004; Wilson 2004). They suggest that users (especially faculty and graduate students) have high expectations for information control and a low opinion of their libraries' performance in this area. On the contrary, users (except undergraduates) were found to have low expectations for the library-as-place dimension and a high opinion of how their libraries perform in this area. The largest negative gap was noted on the IC dimension.

A few studies from France and some developing countries reported better scores than the rest of the developed world (Arshad 2009; Cook et al. 2008, 2009; Cook et al. 2010; Seay, Seaman, and Cohen 1996). Most studies (Cook, Heath, and Thompson 2001; Dole 2002; Hariri and Afnani 2008; Johnson 2007; Sharma, Anand, and Sharma 2010; Thompson, Kyrillidou, and Cook 2007) did not find significant differences in perceived service quality on the basis of gender or user type. However, users' opinions differed significantly on the basis of academic subject and library sector (i.e., public or private) (Wisniewski 1996; Cook, Heath, and Thompson 2003; Gatten 2004a, 2004b; Lee 2004; Creaser 2006; Lessin 2004; Wilson 2004).

Service quality and university libraries of Pakistan

In Pakistan, service quality is an unfamiliar topic, and regular assessment of library service quality is not practised at any level. University library performance is normally assessed through statistics presented in annual reports to higher management: counts of collections, staff, and library users, as well as various usage figures (numbers of borrowed books and visitors). A few user studies, satisfaction surveys, and service evaluation studies of individual libraries have touched on this topic; the following section gives an overview of these studies.

Jabeen (2004) explored the status of physical and material facilities in the university libraries of Lahore. She points out that existing facilities were inadequate; the major problems identified were the non-availability of facilities for users with special needs, a shortage of space for materials and readers, and inadequate, outdated printed material. Akhtar (2008) asserts that most users were dissatisfied with the overall quality, collection, organization of material, public services, physical facilities, and para-professional library staff's attitude; however, respondents were satisfied with library membership and circulation services. Awan, Azam, and Asif (2008) identified discrepancies between the perceptions and expectations of library users. The three poorest services fell in the following categories: (1) "library has modern looking equipment;" (2) "materials associated with library services are visually appealing;" and (3) "staff members of library understand specific needs of users." Arshad (2009) investigated users' perceptions of Punjab University's departmental libraries, using 22 items from the SERVQUAL scale to collect data from students (N = 334). The results showed a large overall negative gap (SSG = 1.26). All 22 survey items had negative scores, but those related to staff services were the largest. The five items with the largest negative gaps were (1) modern equipment, (2) assuring customers' secrecy, (3) knowledgeable library staff to respond to users' queries, (4) library staff having the confidence of users, and (5) physical facilities. This study did not report the validity of the scale in the Pakistani context; the exclusion of faculty members' perceptions and the small sample size were additional limitations. Moreover, the study did not measure minimum acceptable service expectations.

Rehman, Shafiq, and Mahmood (2011) found that users were somewhat satisfied with reference sources, reference personnel, physical facilities, and services offered by their libraries but they were not fully satisfied with any of the reference services. The authors recommended that libraries should immediately address the issues related to their reference collections, staff, and public services in their reference sections. Furthermore, libraries should also introduce electronic or virtual reference services to increase current levels of user satisfaction.

University libraries in Pakistan rapidly expanded after the establishment of the HEC in 2002 because of an increase in university enrolment, changing methods of learning, the growth of science and technology activity, and the recognition of the library as an important source of learning. Concern for responsiveness to users has become increasingly imperative. But despite rising expectations for enhanced library services in Pakistani universities, no research has studied users’ expectations and perceptions of the quality of service provided by central libraries.

Research methodology

Research design

A cross-sectional design was used in this empirical study. Data were collected with a self-report survey questionnaire, administered by the researcher through personal visits to the relevant universities of Pakistan. This study is part of a larger, ongoing research project in which a wider range of variables was collected; this article reports the findings concerning the three research questions.

Sample and sampling

Sampling was done in two stages. In the first stage, 22 universities (13 public and 9 private) were randomly sampled from the 43 universities with central libraries in Punjab province and the federal capital of Pakistan. In the second stage, from each selected university, 25 undergraduate students, 25 graduate students, and 25 teachers of varying age, experience, department, gender, and qualification were selected through convenience sampling to respond to the questionnaire. Convenience sampling was used because a complete list of the population was not available; however, the researcher made every possible effort to collect data from representative user groups. The sample fairly represents different types of users (faculty, graduate students, and undergraduate students), sectors, geographical locations, ages, academic disciplines, genders, and qualifications.

Measure

Users' opinions were measured with a locally modified LibQUAL+ instrument. The latest English LibQUAL+ version was adapted to the Pakistani context through a nine-member focus group, and the modified version was translated into Urdu using standard forward-backward translation procedures. The psychometric properties of the instrument were established through exploratory factor analysis, confirmatory factor analysis, and Cronbach's alpha. The final instrument consisted of 21 items measuring service quality in three dimensions: affect of service (AS), information control (IC), and library as place (LP). The AS dimension consists of eight questions related to the courtesy, knowledge, and helpfulness of library staff in delivering user services. The IC dimension addresses (through eight questions) the adequacy of print and electronic collections, the ease with which access tools can be used, the modernity of equipment, the quality of the library website, and the ability of users to access information themselves without assistance. The third dimension, LP, focuses on user perceptions of quiet, comfortable, inviting, and reflective study space that inspires study and learning. Users rated all items in terms of perceived service quality, desired service quality, and minimum acceptable service quality, on a scale ranging from 1 (low) to 9 (high).

Data collection

Of the 1,650 distributed questionnaires, 1,497 completed questionnaires were returned, a response rate of 91%. Responses revealed that 66% of the respondents were male and 34% were female; 34% were graduate students, 37% were undergraduate students, and 29% were faculty members. Sixty percent of the respondents were from public universities and 40% were from private universities. Respondents represented eight major categories of academic disciplines: sciences (10%), engineering and technology (22%), management (29%), social sciences (17%), agriculture (4%), health (10%), education (4%), and others (3%).

Data analysis

The quantitative data analysis was conducted with the Statistical Package for the Social Sciences (SPSS) software and the Analysis of Moment Structures (AMOS) add-on module. After initial data screening (e.g., missing values, descriptive statistics, normality checks, detection of multivariate outliers, and correlation analysis), the final sample was reduced to 1,473 cases for further analysis.

Figure 1. ZOT for Pakistani users. Note: Items shaded light grey are below the ZOT and medium grey are within the ZOT.

Services inside and outside the zone of tolerance (ZOT)

The mean scores for minimum, desired, and perceived service quality were compared to identify the ZOT. The study identified the ZOT for the overall user group through a radar chart (see figure 1). Each spoke of the radar chart represents one question, identified by item code (AS-1, AS-2, AS-3, etc.). On each spoke, respondents' minimum, desired, and perceived levels of service quality were plotted. The light grey area represents items that scored below the ZOT, the medium grey area represents items that scored within the ZOT, and the dark grey area represents the region above the ZOT, where no item scored.
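The classification that underlies the radar chart can be sketched as follows; the item codes are the survey's, but the mean scores here are hypothetical, not the study's data:

```python
def classify(minimum: float, desired: float, perceived: float) -> str:
    """Place an item's perceived mean relative to the zone of tolerance."""
    if perceived < minimum:
        return "below ZOT"   # even the minimum acceptable level is unmet
    if perceived > desired:
        return "above ZOT"   # service exceeds what users desire
    return "within ZOT"      # between minimum acceptable and desired

# Hypothetical per-item mean scores: (minimum, desired, perceived).
items = {
    "AS-1": (5.2, 7.1, 5.5),  # perceived between minimum and desired
    "IC-3": (5.6, 7.4, 5.1),  # perceived below the minimum acceptable
}
zones = {code: classify(*means) for code, means in items.items()}
```

Applied to the study's means, this rule reproduces the radar chart's shading: the eight problematic items are those whose perceived mean falls under "below ZOT".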

Tables 1 and 2 show mean scores and gaps for each question for the overall user group and subgroups, respectively. For the overall user group, eight items were below the ZOT. Seven items were below the ZOT for faculty members, 12 for graduate students, and 13 for undergraduate students. None of the survey items were above the zone of tolerance for any user group. There were some

Table 1. Service adequacy gap (SAG) and average ratings of expectations and perceived service quality for the overall user group (ordered according to SAG)


Table 2. Service adequacy gap (SAG) of subgroups (items ordered according to SAG)


Table 3. Dimension-specific summary of service adequacy gaps (SAG) for overall user group and subgroups

commonalities between the overall user group and the subgroups. Five items of the IC dimension were below the ZOT for the overall user group, faculty, graduate students, and undergraduate students: (1) "the library has modern equipment that lets me easily access the needed information;" (2) "electronic resources of the library are accessible from my home or office;" (3) "the library has electronic information resources that I need;" (4) "the library's website enables me to locate information on my own;" and (5) "the library has printed materials that I need for my work."

Two other items were below the ZOT for faculty and the overall user group: (1) "the library has community spaces for group learning and group study" and (2) "the library has print and/or electronic journal collections that I require for my work." Four further items, all related to the AS dimension, were below the ZOT for undergraduate and graduate students: (1) "library staff has knowledge to answer users' questions;" (2) "library staff shows dependability in handling users' service problems;" (3) "library staff is always willing to help users;" and (4) "library staff is always ready to respond to users' questions" (see table 2). The overall mean scores for undergraduate and graduate students were below the ZOT, but for faculty they were within it (see table 2). Pakistani users overall had a wide zone of tolerance (1.76). Among the subgroups, faculty had the widest zone of tolerance (1.90), graduate students the narrowest (1.70), and undergraduate students one slightly wider than that of graduate students (1.73).

Dimension-specific ZOTs for overall user group and subgroups

The zone of tolerance was also calculated for individual dimensions, for the overall user group as well as the subgroups. The results show that for the overall user group the IC dimension was below the ZOT while the AS and LP dimensions were inside it (see figure 2). For faculty, the AS and LP dimensions were within the ZOT (SAGs of 0.24 and 0.11, respectively) but the IC dimension (−0.22) was outside it (see figure 3). For graduate students, the LP dimension was within the ZOT (0.27; see figure 4) but the AS and IC dimensions were below it (AS = −0.008, IC = −0.27). The results were the same for undergraduate students (AS = −0.04, IC = −0.29, LP = 0.001; see figure 5). In sum, the LP dimension was within the ZOT and the IC dimension outside it for the overall user group and all subgroups.
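Dimension-level gaps of this kind are simply the item-level gaps averaged within each dimension. A small illustrative sketch (the item codes follow the survey's naming convention, but the gap values are hypothetical, not the study's data):

```python
from statistics import mean

# Hypothetical item-level service adequacy gaps (perceived - minimum),
# keyed by item code; the code prefix names the dimension (AS, IC, LP).
item_sag = {
    "AS-1": 0.30, "AS-2": 0.15,
    "IC-1": -0.20, "IC-2": -0.35,
    "LP-1": 0.10, "LP-2": 0.05,
}

def dimension_sag(gaps: dict) -> dict:
    """Average the item-level gaps within each dimension."""
    grouped = {}
    for code, gap in gaps.items():
        grouped.setdefault(code.split("-")[0], []).append(gap)
    return {dim: round(mean(vals), 3) for dim, vals in grouped.items()}

# A negative dimension-level SAG (here, IC) marks that dimension as
# falling below the zone of tolerance.
```

The same aggregation, applied per subgroup, yields the dimension-specific figures reported above.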

Figure 2. Dimension-specific ZOTs for all users. Note: Grey bar indicates the range between minimum acceptable and desired service quality. Black marker indicates perceived service quality.

Figure 3. Dimension-specific ZOTs for faculty. Note: Grey bar indicates the range between minimum acceptable and desired service quality. Black marker indicates perceived service quality.

Gap between users’ desires and perceptions

The "service superiority gap" (SSG) is the difference between perceived service quality and the desired level of service (i.e., what the user wishes to receive from the library). The SSG is the key to assessing perceived service quality: if perceived-quality scores for a service equal or exceed the desired level, that service is regarded as exceptionally good, whereas perceived-quality scores below the desired level show that the library is not meeting users' desires.

The SSG was identified by subtracting the mean desired score from the mean perceived score on 21 core questions for the overall user group and subgroups (i.e., faculty, graduate students, and undergraduate students). The results showed that all 21 items had negative SSGs, both for the overall user group and the subgroups (see tables 4 and 5). None of the library services met or exceeded users’ desired expectations. The five items with the highest negative SSG for the overall user group and for faculty are related to the IC dimension. These items were (1) “the library has modern equipment that lets me easily access the needed information;” (2) “electronic resources of the library are accessible from my home or office;” (3) “the library’s website enables me to locate information on my own;” (4) “the library has electronic information resources that I need;” and (5) “the library has printed materials that I need for my work.”

Figure 4. Dimension-specific ZOTs for graduate students. Note: Grey bar indicates the range between minimum acceptable and desired service quality. Black marker indicates perceived service quality.

Figure 5. Dimension-specific ZOTs for undergraduate students. Note: Grey bar indicates the range between minimum acceptable and desired service quality. Black marker indicates perceived service quality.


Table 4. Service superiority gaps (SSG) for overall user group

The results for the subgroups (see table 5) show that faculty have the largest overall negative SSG (−1.86) and graduates (−1.73) have the smallest negative SSG. The five items having the largest negative SSG for graduates are (1) “the library has modern equipment that lets me easily access the needed information;”

Table 5. Service superiority gap (SSG) of subgroups (items ordered according to SSG)

(2) “electronic resources of the library are accessible from my home or office;” (3) “the library’s website enables me to locate information on my own;” (4) “library staff instills confidence in users;” and (5) “the library has electronic information resources that I need.”

Most of these items are again related to the IC dimension. The five items with the largest SSGs for undergraduate students are also related to the IC dimension, except for one item from the LP dimension. These items are (1) "the library's website enables me to locate information on my own;" (2) "the library has community spaces for group learning and group study;" (3) "electronic resources of the library are accessible from my home or office;" (4) "the library has modern equipment that lets me easily access the needed information;" and (5) "the library has electronic information resources that I need." The dimension-specific comparison reveals negative SSGs on all three dimensions (AS = −1.76, IC = −2.02, and LP = −1.60). The largest gap was noted for the IC dimension and the smallest for the LP dimension.

Discussion of results

In this section, the results of the study are discussed in the context of the available literature, and alternative explanations are offered for some of them. The first research question is "What services are within the zone of tolerance for the overall user group and subgroups?" The results suggest that the overall mean score for all services is below the ZOT. At the individual-service level, eight services are below the ZOT for the overall user group. At the subgroup level (graduate students, undergraduate students, and faculty), five services related to collection and access (IC dimension) are commonly below the ZOT: (1) availability of modern equipment; (2) remote access to electronic resources; (3) availability of electronic information; (4) the library website; and (5) availability of print material. For faculty and the overall user group, the availability of print and electronic journals was also below the ZOT.

The dimension-specific analysis answered the second question, “Which of the service quality dimensions are within or outside the zone of tolerance for the overall user group as well as for subgroups?” It was found that the AS and LP dimensions are within the ZOT and that the IC dimension is below the ZOT for the overall user group. This result is somewhat surprising because most studies (Cook et al. 2008, 2009; Cook et al. 2010; Cook, Heath, and Thompson 2003; Hiller 2004; Hubbard and Walter 2005; Jaggars, Jaggars, and Duffy 2009; Lessin 2004; Nimsomboon and Nagata 2003; Shedlock and Walton 2004; Wilson 2004) found that the IC dimension was within the ZOT. Only a few studies (Association of Research Libraries 2011; Kemp 2002; Kyrillidou and Persson 2006; Lock and Town 2005) found that the IC dimension was below the ZOT. The major reasons for poor service quality in the university libraries of Pakistan are a lack of modern equipment (computers, scanners, photocopiers, multimedia, etc.), poor marketing of electronic resources, the absence of active library websites, inadequate collections, meagre journal collections, the inability of library staff to provide high-quality services, and the lack of space for group discussion. Library authorities should allocate resources to improve these services.

To answer the third research question—“Which attributes of the library service quality meet, exceed, or fall short of user expectations?”—the study examined the service superiority gaps (i.e., the difference between perceived service quality and desired service quality). The results show that none of the library services met or exceeded users’ desired levels of service quality. A potential reason is that users tend to wish for more than what they already have; their expectations may also have been shaped by exposure to information-rich environments elsewhere. The services with the largest negative gaps are modern equipment, electronic resources and remote access to them, a library website that enables users to find information independently, and availability of printed materials. All three dimensions had large negative superiority gaps, but the IC dimension had the largest and the LP dimension the smallest. These results were not surprising and are in line with the existing literature (Lippincott and Kyrillidou 2004; Arshad 2009; Jankowska, Hertel, and Young 2006; Kayongo and Jones 2008; Kemp 2002; Lock and Town 2005; Ahmed and Shoeb 2009; Moon 2007; Hitchingham and Kenney 2002; Peterson et al. 2004; Cook et al. 2008), which highlighted that users’ desired expectations were not met. Other possible reasons include a lack of resources (e.g., material, IT infrastructure, easy access, and modern equipment); poor skills, knowledge, and attitudes among library staff; the absence of Web-based services; insufficient training of users and staff in information retrieval; and imbalanced, outdated, and inadequate collections.
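The gap scores discussed above follow mechanically from the three LibQUAL+ ratings each respondent gives an item (minimum acceptable, desired, and perceived, all on the 1–9 scale). As a minimal illustration of that arithmetic—using hypothetical ratings, not data from this study:

```python
# Sketch of LibQUAL+ gap-score arithmetic. The function name and the
# example ratings are illustrative only; they do not come from the study.

def gap_scores(minimum, desired, perceived):
    """Return (adequacy gap, superiority gap, ZOT status) for one item."""
    adequacy = perceived - minimum      # negative => below minimum acceptable
    superiority = perceived - desired   # negative => short of the desired level
    if perceived < minimum:
        status = "below ZOT"            # outside the zone of tolerance
    elif perceived > desired:
        status = "above ZOT"            # exceeds even the desired level
    else:
        status = "within ZOT"           # between minimum and desired
    return adequacy, superiority, status

# Example: an item rated minimum = 6, desired = 8, perceived = 5
print(gap_scores(6, 8, 5))  # (-1, -3, 'below ZOT')
```

Averaging these per-item gaps over respondents and over the items belonging to a dimension yields dimension-level figures such as the SSGs reported above (e.g., IC = −2.02).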

Implications of the study

The results of the study suggest the following implications for university library administrations, the HEC, the government of Pakistan, and policy makers seeking to improve the overall quality of library services.

The study results will help libraries understand users’ perceptions of library services and identify both strong and weak service areas. Library administrations can use these results for future planning and service improvement, and to justify the resources spent on services.

This research identified service attributes that fall below users’ minimum acceptable and desired levels of service quality. Practitioners can isolate these problematic areas and plan immediate action based on these results.

It was found that the current library allocations of most universities are insufficient. Therefore, the HEC, universities, and other concerned authorities should immediately increase budget allocations for the services this study identifies as below the minimum level (below the ZOT). For instance, most university libraries of Pakistan urgently need to acquire, develop, or improve print and electronic collections, modern equipment, space for group study, interpersonal training, and remote and Web-based services.

The findings also have implications for policy makers, the HEC, and universities. Users rated the quality of electronic resources, and their access to them, as very poor. Effective delivery and marketing of services is an important factor in library service quality, but library administrations may pay it little attention; a potential reason for these findings is that librarians are not marketing these resources to end users. The findings are therefore thought provoking for library administrations, policy makers, and the HEC: although substantial resources have been spent on subscriptions to electronic resources, the benefits of access have not been realized, in that library users do not recognize the value they get from these resources. More work needs to take place in this area in Pakistan.

The study results imply that library administrations should immediately pay attention to improving the IC dimension. More specifically, libraries should improve print and electronic collections (books, journals, and electronic resources) and provide the latest information-access tools (catalogue, website, and classification tools), modern equipment (computers, photocopiers, scanners, etc.), remote access, Web-based services, and space for group discussion. In addition, library managers should provide knowledgeable, cooperative, and courteous staff at customer service points.

Conclusion

The study found that the Urdu version of LibQUAL+ is reliable and valid in the Pakistani context. The implementation of LibQUAL+ in Pakistan ran smoothly, and the scales and questions were well understood by respondents. The results demonstrate a wide gap between users’ perceptions and expectations of service quality. Undergraduate students have the largest service quality gap, and faculty members the smallest. All services at the university libraries of Pakistan were perceived as falling below users’ desired levels. Eight problematic areas were identified as outside the zone of tolerance, most of them related to the IC dimension. These perceptions of low quality result in disappointment, frustration, and dissatisfaction, and they decrease customer loyalty and reliance on library services. Concerned authorities should allocate resources for the related services, such as print and electronic collections, modern equipment and tools, the library website, remote access, and space for group discussion. In addition, managers should allocate resources for staff training to improve courtesy, willingness to help, and knowledge among library staff, so that staff can meet or exceed user requirements.

The results of this research will also help the authorities of the university libraries of Pakistan gain a deeper understanding of users’ perceptions and expectations and the discrepancies between them. The university libraries in Pakistan are often criticized for failing to deliver quality services to their users. This study provides data that support these observations: perceived service quality falls short of users’ desired service quality. It is therefore suggested that the HEC and the government of Pakistan, as administrative and statutory bodies of universities in Pakistan, work toward introducing national standards for benchmarking library services in order to maintain the quality of the university libraries in Pakistan. The Urdu version of the LibQUAL+ protocol could serve as a first step toward soliciting, tracking, and comparing service quality among university libraries in Pakistan.

Shafiq Ur Rehman
Assistant Professor
Department of Library & Information Science
University of the Punjab
Pakistan
s_rehman25@hotmail.com

References

Ahmed, S. M. Zabed, and Md Zahid Hossain Shoeb. 2009. “Measuring Service Quality of a Public University Library in Bangladesh using SERVQUAL.” Performance Measurement and Metrics 10 (1): 17–32. http://dx.doi.org/10.1108/14678040910949666.
Akhtar, M. Z. 2008. “Library Services and User Satisfaction.” Pakistan Library and Information Science Journal 39 (2): 25–35.
Arshad, Alia. 2009. “User’s Perceptions and Expectations of Quality Punjab University Libraries’ Services.” MPhil thesis, Department of Library and Information Science, University of the Punjab, Lahore.
Association of Research Libraries. 2010. “General Information—What is LIBQUAL.” www.libqual.org/about/about_lq/general_info.
Association of Research Libraries. 2011. “2011 LibQUAL+ Survey Highlights—Session I: January–May.” www.libqual.org/documents/LibQual/publications/LibQUALHighlights2011_SessionI.pdf.
Awan, Muhammad Usman, S. Azam, and Muhammad Asif. 2008. “Library Service Quality Assessment.” Journal of Quality and Technology Management IV (1): 51–64.
Boyd-Byrnes, Mary Kate, and Marilyn Rosenthal. 2005. “Remote Access Revisited: Disintermediation and Its Discontents.” Journal of Academic Librarianship 31 (3): 216–24. http://dx.doi.org/10.1016/j.acalib.2005.03.002.
Chweh, S.S. 1982. “User Criteria for Evaluation of Library Service.” Journal of Library Administration 2 (1): 35–46. http://dx.doi.org/10.1300/J111V02N01_05.
Cook, Colleen, Fred Heath, and Bruce Thompson. 2001. “Users’ Hierarchical Perspectives on Library Service Quality: A ‘LibQUAL+’ Study.” College & Research Libraries 62 (2): 147–53.
Cook, Colleen, Fred Heath, and Bruce Thompson. 2003. “Zones of Tolerance in Perceptions of Library Service Quality: A LibQUAL+ Study.” portal: Libraries and the Academy 3 (1): 113–23. http://dx.doi.org/10.1353/pla.2003.0003.
Cook, Colleen, Fred Heath, Bruce Thompson, MaShana Davis, Martha Kyrillidou, and Gary Roebuck. 2008. LIBQUAL 2008 Survey: LibQUAL France. Washington, DC: Association of Research Libraries/Texas A&M University.
Cook, Colleen, Fred Heath, Bruce Thompson, MaShana Davis, Martha Kyrillidou, and Gary Roebuck. 2009. LIBQUAL 2009 Survey: LibQUAL France. Washington, DC: Association of Research Libraries/Texas A&M University.
Cook, Colleen, Fred Heath, Bruce Thompson, David Green, Martha Kyrillidou, and Gary Roebuck. 2010. LIBQUAL 2010 Survey: LibQUAL France. Washington, DC: Association of Research Libraries/Texas A&M University. http://libqual-fr.pbworks.com/f/Notebook_LibqualFrance_2010.pdf.
Cook, Colleen, Fred Heath, Bruce Thompson, and Russel Thompson. 2001a. “LibQUAL+: Service Quality Assessment in Research Libraries.” IFLA Journal 27 (4): 264–68. http://dx.doi.org/10.1177/034003520102700410.
Cook, Colleen, Fred Heath, Bruce Thompson, and Russel Thompson. 2001b. “The Search for New Measures: The ARL LibQUAL Project—A Preliminary Report.” portal: Libraries and the Academy 1 (1): 103–12.
Cook, Colleen, Fred Heath, Bruce Thompson, and Duane Webster. 2003. “LibQUAL+: Preliminary Results from 2002.” Performance Measurement and Metrics 4 (1): 38–47. http://dx.doi.org/10.1108/14678040310471239.
Creaser, Claire. 2006. “One Size Does Not Fit All: User Surveys in Academic Libraries.” Performance Measurement and Metrics 7 (3): 153–62. http://dx.doi.org/10.1108/14678040610713110.
Dole, Wanda. 2002. “LibQUAL and the Small Academic Library.” Performance Measurement and Metrics 3 (2): 85–95. http://dx.doi.org/10.1108/14678040210429982.
Gatten, Jeffrey N. 2004a. “Measuring Consortium Impact on User Perceptions: OhioLINK and LibQUAL.” Journal of Academic Librarianship 30 (3): 222–28. http://dx.doi.org/10.1016/j.acalib.2004.02.004.
Gatten, Jeffrey N. 2004b. “The OhioLINK LibQUAL+ 2002 Experience: A Consortium Looks at Service Quality.” Journal of Library Administration 40 (3/4): 19–48. http://dx.doi.org/10.1300/J111v40n03_03.
Green, John P. 2007. “Determining the Reliability and Validity of Service Quality Scores in a Public Library Context: A Confirmatory Approach.” PhD thesis, Capella University, Minnesota.
Hariri, Nadjla, and Farideh Afnani. 2008. “LibQUAL in Iran: A Subgroup Analysis by Gender.” Performance Measurement and Metrics 9 (2): 80–93. http://dx.doi.org/10.1108/14678040810906790.
Hernon, Peter, and C. R. McClure. 1986. “Unobtrusive Reference Testing: The 55 Percent Rule.” Library Journal 111 (7): 37–41.
Hernon, Peter, and C. R. McClure. 1990. Evaluation and Library Decision Making. Norwood: Ablex Publishing Corporation.
Hernon, Peter, and John R. Whitman. 2001. Delivering Satisfaction and Service Quality: A Customer-Based Approach for Libraries. Chicago: American Library Association.
Hiller, Steve. 2004. “Another Tool in the Assessment Toolbox.” Journal of Library Administration 40 (3/4): 121–37. http://dx.doi.org/10.1300/J111v40n03_10.
Hitchingham, Eileen E., and Donald Kenney. 2002. “Extracting Meaningful Measures of User Satisfaction from LibQUAL+ for the University Libraries at Virginia Tech.” Performance Measurement and Metrics 3 (2): 48–58. http://dx.doi.org/10.1108/14678040210440937.
Hubbard, William J., and Donald E. Walter. 2005. “Assessing Library Services with LibQUAL+: A Case Study.” Southeastern Librarian 53 (1): 35–45.
Jabeen, M. 2004. “Lahore kay Jamiati Kutab Khano ki Madi wa Tibee Sahoolia” (“Physical and material facilities of university libraries of Lahore”). Master’s thesis, Department of Library and Information Science, University of the Punjab, Lahore.
Jaggars, D. E., S. S. Jaggars, and J. S. Duffy. 2009. “Comparing Service Priorities between Staff and Users in Association of Research Libraries (ARL) Member Libraries.” portal: Libraries and the Academy 9 (4): 441–52.
Jankowska, M. A., K. Hertel, and N. J. Young. 2006. “Improving Library Service Quality to Graduate Students: Libqual Survey Results in a Practical Setting.” portal: Libraries and the Academy 6 (1): 59–77.
Johnson, W. G. 2007. “LibQUAL+ and the Community College Library.” Community & Junior College Libraries 14 (2): 139–50. http://dx.doi.org/10.1300/02763910802139405.
Kayongo, Jessica, and Sherri Jones. 2008. “Faculty Perception of Information Control using LibQUAL+ Indicators.” Journal of Academic Librarianship 34 (2): 130–38. http://dx.doi.org/10.1016/j.acalib.2007.12.002.
Kemp, Jan H. 2002. “Using the LibQUAL+ Survey to Assess User Perceptions of Collections and Service Quality.” Collection Management 26 (4): 1–14. http://dx.doi.org/10.1300/J105v26n04_01.
Kiran, Kaur. 2010. “Service Quality and Customer Satisfaction in Academic Libraries: Perspectives from a Malaysian University.” Library Review 59 (4): 261–73. http://dx.doi.org/10.1108/00242531011038578.
Kyrillidou, Martha. 2011. “LibQUAL+ Survey Introduction 2011.” Presentation delivered at American Library Association Midwinter Meeting, San Diego, CA, January 10. www.libqual.org/documents/LibQual/publications/2011_ALA_SanDiego_SurveyIntro.pdf.
Kyrillidou, Martha, and A. C. Persson. 2006. “The New Library User in Sweden: A LibQUAL Study at Lund University.” Performance Measurement and Metrics 7 (1): 45–53. http://dx.doi.org/10.1108/14678040610654855.
Lee, Tamera. 2004. “Exploring Outcomes Assessment.” Journal of Library Administration 40 (3/4): 49–58. http://dx.doi.org/10.1300/J111v40n03_04.
Lessin, Barton. 2004. “Mining LibQUAL+ Data for Pointers to Service Quality at Wayne State University.” Journal of Library Administration 40 (3/4): 139–55. http://dx.doi.org/10.1300/J111v40n03_11.
Lippincott, S., and M. Kyrillidou. 2004. “How ARL University Communities Access Information: Highlights from LibQUAL+.” ARL Bimonthly Report, no. 236: 7–8.
Lock, Selena, and J. Stephen Town. 2005. “LibQUAL+ in the UK and Ireland: Three Years’ Findings and Experience.” SCONUL Focus 35 (Summer/Autumn): 41–45.
Manjunatha, K., and D. Shivalingaiah. 2004. “Customer’s Perception of Service Quality in Libraries.” Annals of Library and Information Studies 51 (4): 145–51.
Miller, K. 2008. “Service Quality in Academic Libraries: An Analysis of LibQUAL Scores and Institutional Characteristics.” PhD thesis, University of Central Florida, Florida.
Moon, A. 2007. “LibQUAL+ at Rhodes University Library: An Overview of the First South African Implementation.” Performance Measurement and Metrics 8 (2): 72–87. http://dx.doi.org/10.1108/14678040710760586.
Nimsomboon, Narit, and Haruki Nagata. 2003. Assessment of Library Service Quality at Thammasat University Library System. Japan: Research Center for Knowledge Communities.
Nitecki, D. A. 1996. “Changing the Concept and Measure of Service Quality in Academic Libraries.” Journal of Academic Librarianship 22 (3): 181–90. http://dx.doi.org/10.1016/S0099-1333(96)90056-7.
Oldman, Christine, and Gordon Wills. 1977. The Beneficial Library. Bradford, UK: MCB Books.
Parasuraman, A., V. A. Zeithaml, and L. L. Berry. 1991. “Refinement and Reassessment of the SERVQUAL Scale.” Journal of Retailing 67 (4): 421–50.
Peterson, Richard, Beverly Murphy, Stephanie Holmgren, and Patricia L. Thibodeau. 2004. “The LibQUAL Challenge.” Journal of Library Administration 40 (3/4): 83–98. http://dx.doi.org/10.1300/J111v40n03_07.
Rehman, Shafiq Ur, and A. Pervaiz. 2007. “Challenges and Opportunities for Libraries in Pakistan.” Pakistan Library and Information Science Journal 38 (3): 6–11.
Rehman, Shafiq Ur, Farzana Shafique, and Khalid Mahmood. 2011. “A Survey of User Perception and Satisfaction with Reference Services in University Libraries of Punjab.” Library Philosophy and Practice. http://digitalcommons.unl.edu/libphilprac/624/.
Sahu, A. K. 2007. “Measuring Service Quality in an Academic Library: An Indian Case Study.” Library Review 56 (3): 234–43. http://dx.doi.org/10.1108/00242530710736019.
Seay, T., S. Seaman, and D. Cohen. 1996. “Measuring and Improving the Quality of Public Services: A Hybrid Approach.” Library Trends 44 (3): 464–90.
Sharma, Sanjeev K., V. K. Anand, and Geeta Sharma. 2010. “Quality of Services Rendered by University Libraries: An Empirical Investigation.” Trends in Information Management 6 (1): 1–16.
Shedlock, James, and Linda Walton. 2004. “An Academic Medical Library Using LibQUAL+.” Journal of Library Administration 40 (3/4): 99–110. http://dx.doi.org/10.1300/J111v40n03_08.
Sherikar, Amruth, Suresh Jange, and S. L. Sangam. 2006. “Performance Measurement of Quality Services in Academic and Research Libraries in India.” Paper presented at Asia-Pacific Conference on Library & Information Education & Practice 2006 (A-LIEP 2006), April 3–6, Singapore.
Taylor, Robert S. 1986. Value-Added Processes in Information Systems. Norwood, NJ: Ablex.
Thakuria, P.K. 2007. “Concept of Quality in Library Services: An Overview.” Paper presented at 5th convention of PLANNER (Promotion of Library Automation and Networking in North Eastern Region), December 7–8, Guwahati, India. http://ir.inflibnet.ac.in/dxml/bitstream/handle/1944/1370/46.pdf?sequence=1.
Thompson, Bruce, and Colleen Cook. 2002. “Stability of the Reliability of LibQual+ Scores: A Reliability Generalization Meta-analysis Study.” Educational and Psychological Measurement 62 (4): 735–43. http://dx.doi.org/10.1177/0013164402062004013.
Thompson, Bruce, Colleen Cook, and Fred Heath. 2003. “Two Short Forms of the LibQUAL+ Survey: Assessing Users’ Perceptions of Library Service Quality.” Library Quarterly 73 (4): 453–65. http://dx.doi.org/10.1086/603441.
Thompson, Bruce, Colleen Cook, and Martha Kyrillidou. 2005. “Concurrent Validity of LibQUAL+ Scores: What Do LibQUAL+ Scores Measure?” Journal of Academic Librarianship 31 (6): 517–22. http://dx.doi.org/10.1016/j.acalib.2005.08.002.
Thompson, Bruce, Colleen Cook, and Martha Kyrillidou. 2006. “Stability of Library Service Quality Benchmarking Norms across Time and Cohorts: A LIBQUAL+ Study.” Paper read at Asia-Pacific Conference on Library & Information Education & Practice 2006 (A-LIEP 2006), April 3–6, Singapore.
Thompson, Bruce, Martha Kyrillidou, and Colleen Cook. 2007. “Library Users’ Service Expectations: A LibQUAL+ Study of the Range of What Users Will Tolerate.” Paper presented at the 7th Northumbria Conference on Performance Measurement in Libraries and Information Services, August 13–16, Stellenbosch, South Africa.
Thompson, Bruce, Martha Kyrillidou, and Colleen Cook. 2008. “How You Can Evaluate the Integrity of Your Library Service Quality Assessment Data: Intercontinental LibQUAL+ Analyses Used as Concrete Heuristic Examples.” Performance Measurement and Metrics 9 (3): 202–215. http://dx.doi.org/10.1108/14678040810928426.
Wang, K. 2007. “Users’ Evaluation on Library Service Quality: A LibQUAL+ Empirical Study.” Paper read at International Conference on Service Systems and Service Management, June 9–11, Chengdu, China.
Whitehall, T. 1992. “Quality in Library and Information Service: A Review.” Library Management 13 (5): 23–35. http://dx.doi.org/10.1108/01435129210020361.
Wilson, Flo. 2004. “LibQUAL+ 2002 at Vanderbilt University.” Journal of Library Administration 40 (3/4): 197–240. http://dx.doi.org/10.1300/J111v40n03_15.
Wisniewski, Mik. 1996. “Measuring Service Quality in the Public Sector: The Potential for SERVQUAL.” Total Quality Management & Business Excellence 7 (4): 357–66.
Zeithaml, Valarie, Leonard Berry, and A. Parasuraman. 1993. “The Nature and Determinants of Customer Expectations of Service.” Journal of the Academy of Marketing Science 21 (1): 1–12. http://dx.doi.org/10.1177/0092070393211001.