Abstract

This study used two different samples of undergraduate business students taking both online and face-to-face courses to measure the perceived favorability of online courses. The Fall 2014 sample consisted of 237 respondents with complete data, while the Spring 2015 sample consisted of 114 respondents with complete data. A new, reliable four-item measure of perceived favorability of online versus face-to-face courses was used. Across both samples, two correlates, satisfaction with course tools and satisfaction with instructor response time, were each positively related to perceived favorability of online courses after controlling for background and behavioral variables.

Keywords

Online learning, perceived favorability of online courses, satisfaction with online course tools, instructor response time

Correlates of Business Undergraduates’ Perceived Favorability of Online Compared to Face-to-Face Courses

Many universities and colleges increasingly view online education as a critical component of their strategic enrollment plans (Allen and Seaman 2013), and there has been phenomenal growth over the past several years in colleges offering online degree programs (Britt 2015). The number [End Page 50] of accredited schools offering online undergraduate- or graduate-level programs continues to increase (BizEd 2015). Such growth in online business education is expected to continue as organizations increasingly expect students to develop the skills necessary to perform in virtual teams (Arbaugh 2014). Ongoing research on the effectiveness of online versus face-to-face classes is needed. The goal of this study is to test for correlates of business undergraduates’ perceived favorability of online compared to face-to-face courses. First, prior research will be briefly reviewed.

Prior Research

A recent study of 200 business undergraduates (Comer, Lenaghan, and Sengupta 2015) found that the respondents generally perceived that they had the necessary attributes for successful online learning (e.g., capability, self-discipline, an active learning perspective). They also found that students in qualitative (versus quantitative) and introductory (versus advanced) online courses reported more positive perceptions of their learning experience. Using a sample of undergraduate and graduate business students, Beqiri, Chase, and Bishka (2010) found that students who were male, graduate, married, and living more than one mile from campus were more satisfied with the delivery of online courses. Using a survey sample of 277 online MBA students, Endres et al. (2009) found five distinct student satisfaction factors: learning practices, course materials, faculty practices, student-to-student interaction, and online course tools. These satisfaction factors related differently to three types of intent to recommend to others: to take the course, to take the faculty member teaching the course, or to attend the university. Intent to recommend was measured using a yes/no response format. For example, of the five satisfaction factors, only faculty practices significantly explained intent to recommend the faculty member to other students.

Prior research supporting students’ favoring of face-to-face courses includes media richness theory (An and Frick 2006), which suggests several advantages of face-to-face, synchronous communication during class discussion and participation, including visible nonverbal cues and immediate feedback. However, Rovai (2004) has argued, from a constructivist perspective, that online discussion boards allow more reflective interaction than the more spontaneous exchanges of a typical face-to-face classroom. Meyer (2007) found that the advantages of online discussion (versus [End Page 51] face-to-face) included the opportunity to take more time and care in reflecting on what response to make, and the tendency of quieter students to open up more online. The availability of stronger technology, including WebEx and high-quality taped video lectures (Bonk and Zhang 2006), makes it possible to compare live online WebEx sessions to face-to-face class discussions, and high-quality video lectures made for online classes to face-to-face class lectures. A study-specific measure of perceived favorability of online compared to face-to-face classes was used as the key outcome variable.

Correlates to Perceived Favorability of Online Courses and Control Variables

In their model of antecedents of virtual learning effectiveness, Piccoli, Ahmad, and Ives (2001) differentiate between students’ technology comfort and technology attitudes. Students who feel more comfortable using the different online course tools should have a more positive evaluation of the virtual learning experience. In addition, students who are more satisfied with the different online course tools also report a more positive evaluation. Such satisfaction can involve easy access and less downtime when working with the online tools. Prior research has also shown that instructor availability and response time to questions affect student online course satisfaction (Bolliger and Martindale 2004). Collectively, this research suggests that (1) comfort with technology (course tools), (2) satisfaction with technology (course tools), and (3) satisfaction with instructor response time should each be positively related to perceived online course favorability. Taken as a group, these three variables represent an attitudinal variables set. The above research suggests the first study hypothesis (H1):

H1: Comfort with technology, satisfaction with technology, and satisfaction with instructor response time should each be positively related to perceived favorability of online courses.

Related research (Crede, Roch, and Kieszczynka 2010) has argued that regardless of the teaching mode used by the instructor, for example, face-to-face versus online, attendance is likely to be beneficial for students’ learning. The amount of time a student invests in preparing for a course (e.g., studying, completing assignments) can be an indicator of the student’s motivation (Piccoli, Ahmad, and Ives 2001). Collectively, these two variables, attendance and hours spent per week preparing, represent [End Page 52] a behavioral variables set. The above research suggests the second study hypothesis (H2):

H2: Online course attendance and hours spent per week preparing for an online course should each be positively related to perceived favorability of online courses.

The lack of accumulated prior online research suggests that, without hypothesizing their impact on perceived favorability of online courses, the following four background variables should be at least controlled for: (1) gender (Beqiri, Chase, and Bishka 2010); (2) introductory versus advanced online course; (3) quantitative versus qualitative online course (Comer, Lenaghan, and Sengupta 2015); and (4) the use of high-quality video lectures in an online course (Bonk and Zhang 2006). Together these four variables represent a background variables set. Prior general research on undergraduate student outcomes, such as professional development and timely graduation (Blau and Snell 2013), suggests controlling first for such background variables. Therefore, these four variables will be tested first in the third hypothesis (H3):

H3: The background variables set, followed by the behavioral variables set, followed by the attitudinal variables set will each significantly explain incremental variance in perceived favorability of online courses.

Finally, Endres et al. (2009) found that student satisfaction with faculty practices, learning practices, and course materials was positively related to intent to recommend the online course. This leads to our final study hypothesis (H4).

H4: Perceived favorability of online courses will be positively related to intent to recommend online courses to others.

Methods

Sample and Procedure

The initial sample in the fall semester of 2014 consisted of business undergraduate students enrolled in at least one online course within the Fox School of Business and Management, one of eleven colleges within Temple University, located in Philadelphia, Pennsylvania. In the fall of 2014 there [End Page 53] were 27,642 undergraduates enrolled at Temple University, of which 6,455 (23%) were in the Fox School of Business and Management. At the undergraduate level, within the University and the Business School, an increasing number of online courses are being offered to complement traditional face-to-face classes. Temple’s online bachelor of business administration (BBA) was recently ranked in the top 10 online BBA programs in the United States by US News and World Report (2016).

Students were asked to voluntarily fill out an online survey (using Qualtrics). As an incentive to voluntarily complete the online survey, an iPad mini was offered via lottery. A student could fill out a survey for each different online course he or she was taking and the student’s name was entered in the lottery for each completed survey. Generally, participating students were taking 1–2 online classes along with traditional classes as part of being full-time (at least 12 credit hours/semester). This allowed students to directly compare current online versus face-to-face courses they were taking. In several cases, however, students were taking three or more online classes. Four hundred and sixty-five students (N = 465) filled out at least part of the online course survey. The survey was posted near the end of the course for one week. The fall sample background variables will be reported below.

In the Spring 2015 semester, a separate validation sample of business undergraduate students taking at least one online course was also collected using the same process as in the fall. An iPad mini was again offered via lottery. Two hundred and ninety-nine students (N = 299) filled out at least part of the online course survey using Qualtrics. The spring sample background variables will be reported below.

Measures

Background Variables for the Fall 2014 Sample

Four variables were collected (percentage in category): gender; introductory versus advanced online course; quantitative versus qualitative online course; and use of high-quality video lectures in an online course. Gender was coded as 1 = male (48%), 2 = female (52%). Introductory versus advanced online course involved checking whether a course was classified as a “lower-division foundation requirement” by the business school. If so, the online course was coded as introductory; if the course had a prerequisite, it was classified as advanced. The breakdown was 1 = introductory (58%), 2 = advanced (42%). In addition, an online course was classified as quantitative or qualitative [End Page 54] based on preexisting business school guidelines. Quantitative courses included statistics, accounting, finance, managerial information systems, and risk management and insurance. Qualitative courses included human resource management, marketing, legal, strategic management, international business, and business administration. The breakdown was 1 = quantitative (50%), 2 = qualitative (50%). The response breakdown for the item on use of high-quality video lectures in an online course was 1 = No (31%), 2 = Yes (69%).

Background Variables for the Spring 2015 Sample

The same four variables were collected from the spring surveys (percentage in category): gender; introductory versus advanced online course; quantitative versus qualitative online course; and use of high-quality video lectures in an online course. Gender was coded as 1 = male (47%), 2 = female (53%). Introductory versus advanced online course was defined as above. The breakdown was 1 = introductory (61%), 2 = advanced (39%). The same mix of quantitative and qualitative courses noted above was used, and the breakdown was 1 = quantitative (54%), 2 = qualitative (46%). The response breakdown for the item on use of high-quality video lectures in an online course was 1 = No (20%), 2 = Yes (80%).

Measures Used for Both the Fall 2014 and Spring 2015 Samples

To ascertain students’ comfort level with course tools, four questions were asked, using the lead-in, “please describe your comfort level with each of the following tools: ‘Blackboard/Canvas,’ ‘WebEx,’ ‘Edmodo Discussion Board,’ and ‘Blackboard Discussion Board.’” Items were answered from 1 to 6, where 1 = extremely low, 2 = low, 3 = average, 4 = high, 5 = very high, and 6 = not applicable. An item frequency breakdown indicated that 41% of the fall respondents selected “not applicable” for the “Edmodo Discussion Board” item, which had to be coded as missing data. Similarly 42% of the spring sample selected “not applicable.” This item was thus deleted in further analyses.

To measure satisfaction level with course tools, four items were asked, using the lead-in, “please describe your satisfaction level with each of the following tools: ‘Blackboard/Canvas,’ ‘WebEx,’ ‘Edmodo Discussion Board,’ and ‘Blackboard Discussion Board.’” Items were answered from 1 to 6, where 1 = extremely low, 2 = low, 3 = average, 4 = high, 5 = very high, and 6 = not applicable. Again, an item frequency breakdown indicated that 47% of the fall respondents and 50% of the spring respondents selected [End Page 55] “not applicable” for the “Edmodo Discussion Board” item, so this item was deleted in further analyses.

For satisfaction with the response time of instructors answering questions, one question was asked: “Your satisfaction level with the response time of your instructor when you have questions or concerns,” with the response scale from 1 = extremely low to 5 = very high.

The issues of attendance and hours spent per week preparing were addressed through a pair of items. The first measured attendance: “on average how often did you attend synchronous WebEx sessions,” with the usable response scale as 1 = never, 2 = seldom, 3 = half of the time, 4 = most of the time, and 5 = every time. Nineteen percent of the fall sample and 21% of the spring sample responded 6 = not applicable, and these responses were coded as missing data. The second focused on hours spent per week preparing: “on average how many hours a week did you spend on preparing for class and completing course assignments,” using the following response scale: 1 = less than one hour; 2 = 1 to 2 hours; 3 = 2 to 3 hours; 4 = 3 to 4 hours; 5 = 4 to 6 hours; 6 = 6 to 8 hours; 7 = 8 or more hours.

We measured perceived favorability of online (versus face-to-face) courses through the following four items: “compared to face-to-face lectures, the high-quality video lectures were”; “compared to face-to-face class discussions, the live online WebEx sessions were”; “compared to face-to-face class participation, the online discussion boards were”; and “overall compared to face-to-face classes, the online course was.” All items were answered on the following scale, 1 = inferior, 2 = somewhat inferior, 3 = same, 4 = somewhat superior, 5 = superior. Any “not applicable” response to an item was coded as missing data.
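
The recoding of “not applicable” responses and the scoring of the multi-item measures can be summarized in a short sketch. This is a minimal illustration, assuming the responses sit in a pandas DataFrame; the column names and the helper function are hypothetical and not taken from the study’s instrument.

```python
import numpy as np
import pandas as pd

# Hypothetical item columns for the four-item perceived favorability measure
FAVORABILITY_ITEMS = ["fav_video_lectures", "fav_webex_sessions",
                      "fav_discussion_boards", "fav_overall"]

def score_scale(df: pd.DataFrame, items, na_code: int = 6) -> pd.Series:
    """Average a multi-item measure on its original response metric.

    "Not applicable" responses (coded na_code) become missing, and a
    respondent missing any item receives a missing scale score, mirroring
    the handling of missing data described in the article.
    """
    responses = df[items].replace(na_code, np.nan)
    complete = responses.notna().all(axis=1)
    # The summed score divided by the number of items equals the item mean,
    # keeping the scale mean on the original response scale.
    return responses.mean(axis=1).where(complete)

# df["perceived_favorability"] = score_scale(df, FAVORABILITY_ITEMS)
```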

We measured intent to recommend an online course to others as one item, “I would recommend an online course to others,” where 1 = no and 2 = yes.

Data Analysis

SPSS-PC version 22 (SPSS 2013) was used for all data analyses. Beyond the specific missing data issues noted above, there were additional missing data for both samples when the data were aggregated into scales to test the study hypotheses. For the Fall 2014 cohort, this resulted in a complete-data sample size of N = 237 (out of 465), or 51%. For the Spring 2015 group, the complete-data sample size was N = 114 (out of 299), or 38%. The hierarchical regression models were checked for outliers (there were none with residuals of at least three standard deviations). It was determined that the [End Page 56] assumptions of no multicollinearity, linearity, homoscedasticity, and normally distributed errors were satisfactorily met (Stevens 1992). For the hierarchical regression models the background variables were entered first, followed by the behavior-related variables, followed by the attitudinal variables. This order of variable set entry is consistent with prior research testing undergraduate outcomes (Blau and Snell 2013).
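
The blockwise entry and incremental variance reading described here can be sketched as follows. The Python/statsmodels code below is an illustrative equivalent, not the SPSS procedure actually run, and all column names are hypothetical placeholders.

```python
import pandas as pd
import statsmodels.api as sm

# Hypothetical predictor blocks, in the entry order used in the study
BACKGROUND  = ["gender", "intro_vs_advanced", "quant_vs_qual", "video_lectures"]
BEHAVIORAL  = ["webex_attendance", "hours_preparing"]
ATTITUDINAL = ["comfort_tools", "satisfaction_tools", "satisfaction_response_time"]

def hierarchical_regression(df: pd.DataFrame, outcome: str, blocks):
    """Enter predictor blocks sequentially and report incremental R-squared."""
    columns = [outcome] + [v for block in blocks for v in block]
    data = df[columns].dropna()          # complete-data cases only
    predictors, previous_r2, model = [], 0.0, None
    for block in blocks:
        predictors = predictors + block
        model = sm.OLS(data[outcome], sm.add_constant(data[predictors])).fit()
        print(f"after {block}: R2 = {model.rsquared:.3f}, "
              f"delta R2 = {model.rsquared - previous_r2:.3f}")
        previous_r2 = model.rsquared
    return model                          # final model with all blocks entered

# final = hierarchical_regression(df, "perceived_favorability",
#                                 [BACKGROUND, BEHAVIORAL, ATTITUDINAL])
# print(final.summary())                  # coefficients, adjusted R2, residuals
```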

Results

General

Table 1 shows the continuous variable means, standard deviations, scale reliabilities, and correlations for both samples. The descriptive data are generally consistent across both samples for each variable. Multi-item measure scores were divided by the number of items in that measure, so that each scale mean is expressed on the original response scale. The scale reliabilities for satisfaction with course tools and perceived favorability of online courses are both acceptable at over .70 (Nunnally 1978), but the scale reliability for comfort with course tools falls below this threshold.
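
For reference, Cronbach’s alpha, the reliability statistic behind the .70 benchmark (Nunnally 1978), can be computed directly from item responses. A minimal sketch with hypothetical item names:

```python
import pandas as pd

def cronbach_alpha(items: pd.DataFrame) -> float:
    """Alpha = (k / (k - 1)) * (1 - sum of item variances / variance of total score)."""
    complete = items.dropna()
    k = complete.shape[1]
    item_variance_sum = complete.var(axis=0, ddof=1).sum()
    total_score_variance = complete.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_variance_sum / total_score_variance)

# Example: the three retained satisfaction-with-course-tools items
# alpha = cronbach_alpha(df[["sat_blackboard", "sat_webex", "sat_bb_discussion"]])
```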

The magnitude of correlation results between samples is generally consistent, but the significance of results varies more because the fall complete-data sample (N = 237) is more than double the size of the spring complete-data sample (N = 114). There is correlational overlap between comfort with course tools and satisfaction with course tools but they are sufficiently distinct to be used separately (Stevens 1992). Factor analyses, using eigenvalues greater than one and the scree test, confirmed the presence of two factors (satisfaction, comfort) for both samples (Stevens 1992).
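
The eigenvalue-greater-than-one and scree checks reported here amount to inspecting the eigenvalues of the item correlation matrix. A brief illustrative sketch, again with hypothetical item lists:

```python
import numpy as np
import pandas as pd

def item_eigenvalues(items: pd.DataFrame) -> np.ndarray:
    """Eigenvalues of the item correlation matrix, sorted for a scree inspection."""
    corr = items.dropna().corr().to_numpy()
    return np.sort(np.linalg.eigvalsh(corr))[::-1]

# values = item_eigenvalues(df[COMFORT_ITEMS + SATISFACTION_ITEMS])
# (values > 1).sum()  # number of retained factors; two expected here (comfort, satisfaction)
```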

Tests of Hypotheses

The first hypothesis, H1, was that comfort with technology, satisfaction with technology, and satisfaction with instructor response time, should each be positively related to perceived favorability of online courses. The correlational results in table 1 show that there is complete support for this hypothesis (fall and spring), that is, comfort with course tools (r = .35 and r = .40), satisfaction with course tools (r = .43 and r = .53), and satisfaction with instructor response time (r = .49 and r = .44) are each significantly positively related (p < .01) to perceived favorability of online courses for both the fall and spring samples. Thus, H1 is supported. [End Page 57]

Table 1. Means, Standard Deviations, Reliabilities, and Correlations for Continuous Variables

[End Page 58]

The second hypothesis, H2, was that attendance and hours spent per week preparing should each be positively related to perceived favorability of online courses. Looking at table 1, support for this hypothesis is much weaker. Only for the fall sample is there a significant positive correlation (r = .25) between attendance at WebEx sessions and perceived favorability.

The third hypothesis, H3, was that the background variables set, followed by the behavioral variables set, followed by the attitudinal variables set will each significantly explain incremental variance in perceived favorability of online courses. Table 2 presents the results of the final hierarchical regression models for testing the incremental impact of each variable set for the fall and spring samples. For the fall sample, there is support for H3 as each of the variable sets, background variables (9%), behavior-related variables (6%), and attitudinal variables (20%), accounts for significant incremental variance in perceived favorability of online courses. However, for the spring sample there is less support for H3 since only the attitudinal variables set accounts for significant additional variance (36%). Overall there is partial support for H3. Looking at the individual correlates, there is consistent support across both samples for two of them. Both satisfaction with course tools (Fall, b = .36, p < .01; Spring, b = .57, p < .01) and satisfaction with instructor response time (Fall, b = .36, p < .01; Spring, b = .31, p < .01) significantly impact perceived favorability of online courses. For the fall sample, one variable, introductory versus advanced, was significant such that introductory online courses were perceived more favorably (b = –.35, p < .01). Overall, for perceived favorability of online courses, 35% of the variance (adjusted R2 = 32%) for the fall and 41% of the variance (adjusted R2 = 36%) for the spring was explained.
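
Read as a sum of increments, the fall-sample variance figures just reported combine as

$$
R^2_{\text{total}} = \Delta R^2_{\text{background}} + \Delta R^2_{\text{behavioral}} + \Delta R^2_{\text{attitudinal}} = .09 + .06 + .20 = .35,
$$

with the adjusted value of .32 lower because it corrects for the number of predictors entered.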

The final hypothesis to be tested, H4, is that perceived favorability of online courses will be positively related to intent to recommend an online course to others. The correlation results support this hypothesis. For the fall sample, the correlation between perceived favorability and intent to recommend was r = .47, p < .01, and for the spring sample this correlation was r = .44, p < .01. Therefore, H4 is supported.

Discussion

Review of the relevant literature suggests that this study is the first to use a perceived favorability of online versus face-to-face courses scale, comparing specific components of each course, that is, video lectures versus face-to-face [End Page 59]

Table 2. Final Hierarchical Regression Models for Incrementally Testing the Contributions of Common Correlates for Explaining Perceived Favorability of Online Courses for Fall 2014 and Spring 2015 Samples

[End Page 61]

lectures, live online WebEx sessions versus face-to-face class discussions, and online discussion boards versus face-to-face class participation. Prior research (Sun et al. 2008) used a three-item “e-learning course quality” scale that made a more general comparison. Two sample items from their scale (Sun et al. 2008, 1198) are “conducting the course via the Internet improved the quality of the course compared to other courses” and “I feel the quality of the course I took was largely unaffected by conducting it via the Internet.” Sampling undergraduates taking both online and face-to-face courses simultaneously in a semester allowed for direct comparison. This is a study strength compared to prior online research, which uses only online students (Beqiri, Chase, and Bishka 2010; Comer, Lenaghan, and Sengupta 2015; Sun et al. 2008). The descriptive results, including strong reliabilities across both samples, support future use of this perceived favorability scale. However, testing the generalizability of the perceived favorability measure using other samples of undergraduates, for example, liberal arts, engineering, natural sciences, and health sciences, is needed. Finding partial support for introductory (versus advanced) online courses being perceived more favorably is consistent with Comer, Lenaghan, and Sengupta (2015). In addition, finding that perceived favorability of online courses is positively related to intent to recommend online courses to others is consistent with Endres et al. (2009).

The strongest study results were found for satisfaction with course tools and satisfaction with instructor response time, with each being significantly related to perceived favorability of online courses across both samples. Prior research supports the importance of instructor response time for student online course satisfaction (Bolliger and Martindale 2004). In both hierarchical regression models, satisfaction with course tools “overrode” comfort with course tools as a significant contributor to explaining perceived favorability. One complication for this comparison was that the scale reliability for the course tools comfort scale was weaker. Depending on required use, it may be harder for undergraduates to feel equally comfortable using different online course tools such as Blackboard/Canvas, WebEx, and Blackboard Discussion Boards. However, ambiguity clouds the issue of what “comfort level” means. One inference is that it means “ability to use” (Piccoli, Ahmad, and Ives 2001), but future research needs to measure this factor more specifically. Satisfaction level may imply that students felt these course tools “helped them” in the online course, but again more specific measurement is needed. Greater item specificity may be one factor contributing to the stronger reliability of the perceived favorability of online courses scale. [End Page 62]

Limitations and Implications for Future Research

Other study limitations need to be acknowledged, including the use of self-selected samples of business students whose reasons for taking online courses were not measured. Measuring the motivation for taking online courses (Daymont, Blau, and Campbell 2011) could help to further explain their perceived favorability relative to face-to-face courses. Measuring other individual differences, such as self-discipline (Comer, Lenaghan, and Sengupta 2015), or age and distance from campus (Beqiri, Chase, and Bishka 2010), could also have helped to explain additional variance in perceived favorability of online courses. Since all data were self-reported, common method variance is a limitation. However, a one-factor test (Podsakoff et al. 2003) found that the first factor accounted for 28% of the total variance in the fall sample, and there were six factors with eigenvalues of at least one. For the spring sample, 27% of the total variance was accounted for by the first factor, and there were five factors with eigenvalues of at least one. These results indicate that if the first factor represents “method variance,” it is not an overriding limitation.
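
The one-factor test cited above is commonly run as an unrotated principal components analysis of all self-report items, checking the share of variance absorbed by the first component. A minimal sketch, with the combined item list left hypothetical:

```python
import numpy as np
import pandas as pd

def single_factor_test(items: pd.DataFrame):
    """Share of total variance in the first unrotated component, plus the
    number of components with eigenvalue of at least one."""
    corr = items.dropna().corr().to_numpy()
    eigenvalues = np.sort(np.linalg.eigvalsh(corr))[::-1]
    first_factor_share = eigenvalues[0] / eigenvalues.sum()  # sum equals number of items
    n_components = int((eigenvalues >= 1).sum())
    return first_factor_share, n_components

# share, n = single_factor_test(df[ALL_SELF_REPORT_ITEMS])
# A first-factor share well below half (28% and 27% here) suggests method
# variance is not an overriding concern.
```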

As noted earlier, missing data, including the response of “not applicable,” greatly reduced the complete-data sample sizes. The items most affected by missing data were “comfort level” and “satisfaction level” for the Edmodo discussion board items. Evidently, many online faculty did not use Edmodo, but instead used the Blackboard discussion board course tool, if any tool was used at all. Even a single missing item response resulted in a missing value for that respondent’s scale score. It may be difficult to require faculty to use certain course tools (e.g., Edmodo) if there are other available options. However, offering increased faculty training and individual consultation, if possible, could increase faculty usage. To reduce the likelihood of missing data, requiring students to fill in all items on a survey page before being allowed to go to the next page is an option. Reminding faculty at the beginning of their online courses about the positive impact of their quicker response time to students on end-of-course teaching evaluations may be useful as well.

Conclusion

This study suggests the value of a new measure, perceived favorability of online courses, which allows more direct comparison with student perceptions of face-to-face courses. Across two business undergraduate samples, [End Page 63] two correlates, satisfaction with course tools and satisfaction with instructor response time, were each positively related to perceived favorability of online courses after controlling for background and behavioral variables. Unfortunately, we did not compare the satisfaction levels of students taking online versus face-to-face courses. However, it is encouraging that the mean perceived favorability of online courses compared to face-to-face courses increased from 3.26 out of 5 in the fall of 2014 to 3.42 out of 5 in the spring of 2015. It is also encouraging that the percentage of students reporting the use of high-quality video lectures in their online courses increased from 69% in the fall to 80% in the spring.

As the Fox Business School moves forward with current and new online course delivery, every effort is being made to keep the “integrity” of a course, that is, to offer the same content and process in an online course as in the equivalent face-to-face course. The perceived online course favorability measure is one way to gauge this equivalence. The Fox online BBA program, as well as individual online business courses, offers students a way to take college classes conveniently without having to travel to campus. For working adults, veterans preparing to take traditional classes, or students who want to take a class during the summer while away at an internship, online classes allow the flexibility they need in their busy schedules. They have the same courses and faculty that they would have in a traditional classroom. This is also a great option for students with disabilities who would struggle to get to and from campus. Within the Fox Business School’s Online and Digital Learning Department, full-time support staff, including instructional designers, video production specialists, and senior technology support specialists, assist faculty in developing the highest-quality online courses. As the number of online course offerings increases across universities and colleges, continued comparison of online versus face-to-face courses across diverse student samples is important. [End Page 64]

Gary Blau

Gary Blau received his PhD in organizational behavior from the University of Cincinnati in 1982. His research interests include understanding student-related outcomes.

Darin Kapanjie

Darin Kapanjie received his EdD in curriculum, instruction, and technology in education from Temple University in 2011. He is a strategic thought-leader and innovator in the field of online and digital learning.

References

Allen, I. E., and J. Seaman. 2013. Changing Course: Ten Years of Tracking Online Education in the United States. Babson Park, MA: Babson Survey Research Group. Retrieved from http://www.onlinelearningsurvey.com/reports/changingcourse.pdf.
An, Y.-J., and T. Frick. 2006. “Student Perceptions of Asynchronous Computer-mediated Communication in Face-to-Face Courses.” Journal of Computer-Mediated Communication 11:485–99. doi:10.1111/j.1083-6101.2006.00023.x.
Arbaugh, J. B. 2014. “What Might Online Delivery Teach Us about Blended Management Education? Prior Perspectives and Future Directions.” Journal of Management Education 38:784–817. doi:10.1177/1052562914534244.
Beqiri, M. S., N. M. Chase, and A. Bishka. 2010. “Online Course Delivery: An Empirical Investigation of Factors Affecting Student Satisfaction.” Journal of Education for Business 85:95–100. doi:10.1080/08832320903258527.
Blau, G., and C. S. Snell. 2013. “Understanding Undergraduate Professional Development Engagement and Its Impact.” College Student Journal 47 (4): 689–702.
Bolliger, D. U., and T. Martindale. 2004. “Key Factors for Determining Student Satisfaction in Online Courses.” International Journal on E-learning (January–March): 61–67.
Bonk, C. J., and K. Zhang. 2006. “Introducing the R2D2 Model: Online Learning for the Diverse Learners of the World.” Distance Education 27 (2): 249–64.
Britt, M. 2015. “How to Better Engage Online Students with Online Strategies.” College Student Journal 49 (3): 399–404.
Comer, D. R., J. A. Lenaghan, and K. Sengupta. 2015. “Factors that Affect Students’ Capacity to Fulfill the Role of Online Learner.” Journal of Education for Business 90:145–55. doi:10.1080/08832323.2015.1007906.
Crede, M., S. G. Roch, and U. S. Kieszczynka. 2010. “Class Attendance in College: A Meta-analytic Review of the Relationships with Grades and Student Characteristics.” Review of Educational Research 80 (2): 272–95. [End Page 65]
Daymont, T., G. Blau, and D. Campbell. 2011. “Deciding Between Traditional and Online Formats: Exploring the Role of Learning Advantages, Flexibility and Compensatory Adaptation.” Journal of Behavioral and Applied Management 11:156–79.
Endres, M. L., C. A. Hurtubis, S. Chowdhury, and C. Frye. 2009. “The Multifaceted Nature of Online MBA Student Satisfaction and Impacts on Behavioral Intentions.” Journal of Education for Business 84 (5): 304–12. doi:10.3200/JOEB.84.5.304-312.
Meyer, K. A. 2007. “Student Perceptions of Face-to-Face and Online Discussions: The Advantage Goes to . . .” Journal of Asynchronous Learning Networks 11 (4): 53–69.
Nunnally, J. C. 1978. Psychometric Theory. 2nd ed. New York: McGraw-Hill.
Podsakoff, P., S. Mackenzie, J. Lee, and N. Podsakoff. 2003. “Common Method Biases in Behavioral Research: A Critical Review of the Literature and Recommended Remedies.” Journal of Applied Psychology 88 (5): 879–903.
Rovai, A. P. 2004. “A Constructivist Approach to Online College Learning.” The Internet and Higher Education 7 (2): 79–93.
Stevens, J. 1992. Applied Multivariate Statistics for the Social Sciences. Mahwah, NJ: Lawrence Erlbaum.
Sun, P.-C., R. J. Tsai, G. Finger, U.-Y. Chen, and D. Yeh. 2008. “What Drives a Successful E-Learning? An Empirical Investigation of the Critical Factors Influencing Learner Satisfaction.” Computers and Education 50:1183–1202.
US News and World Report. 2016. “Top 10 Online Business Degree Programs.” Best Degree Programs website. http://www.bestdegreeprograms.org/top-schools/online-business-degree-bachelors (accessed January 19, 2016). [End Page 66]
