**5. Conclusions and future research**

The framework appears to evaluate levels of e-readiness effectively throughout a blended or online course. The results suggest that students' level of e-readiness contributes to their academic success via different pathways. While students often have a choice between a blended and an online course, it should not be assumed that the provision of an ELS can replace the traditional classroom. Evaluating students' suitability for different modalities should be paramount so that they are given the best options, and advice, to make an informed decision. One can only wonder how the experiences of the 166 students who were deficient in all three scales could have been improved with pre-course screening and ELS training. These methods should be embedded in courses with an online component, where students can be assessed and, if necessary, exposed to the technical and social skills required for the ELS. Failure to do so could result in missed opportunities for improving expectations of the online environment and in unnecessary increases in attrition rates. Another recommendation for fixing the 'ready' in e-readiness would be to increase the visibility and accessibility of help desk services for students in both cohorts, possibly by embedding email contacts and online chat facilities in every ELS log-in screen and on every course page. While this may initially seem trivial, it could improve retention among students who would otherwise become frustrated with the ELS and drop the course.
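As a rough illustration of the pre-course screening recommended above, the sketch below flags students who fall below a cut-off on each of three e-readiness scales and groups a cohort by how many scales they are deficient on. The scale names, thresholds, and data layout are assumptions made for illustration only, not the instrument used in this study.

```python
# Hypothetical pre-course e-readiness screening: flag students who score
# below a cut-off on any of three scales so they can be offered ELS
# training before the course begins. Scale names and the cut-off are
# illustrative assumptions, not the instrument used in this chapter.

SCALES = ("technical_skills", "social_skills", "self_direction")
CUTOFF = 3.0  # e.g. midpoint of a 5-point Likert-type scale

def screen(student_scores):
    """Return the list of scales on which a student is deficient."""
    return [s for s in SCALES if student_scores.get(s, 0.0) < CUTOFF]

def summarize(cohort):
    """Group student IDs by the number of scales they are deficient on."""
    summary = {0: [], 1: [], 2: [], 3: []}
    for sid, scores in cohort.items():
        summary[len(screen(scores))].append(sid)
    return summary

cohort = {
    "s01": {"technical_skills": 4.2, "social_skills": 3.8, "self_direction": 4.0},
    "s02": {"technical_skills": 2.1, "social_skills": 3.5, "self_direction": 2.8},
    "s03": {"technical_skills": 2.4, "social_skills": 2.2, "self_direction": 2.0},
}

summary = summarize(cohort)
# Students deficient on all three scales are the highest-priority group
# for pre-course ELS training and advice on course modality.
print(summary[3])  # → ['s03']
```

A screening step like this, run when a student first logs in to the ELS, would identify the analogue of the 166 students deficient in all three scales early enough for training to make a difference.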

Fixing the 'Ready' in E-Learning Readiness http://dx.doi.org/10.5772/intechopen.74287

There were some limitations to the study. First, students in the region who still pay high 'per minute' fees, use 'dial-up' to access the Internet, do not own a computer, or lack reliable Internet access are less likely to spend extra time completing an online survey. Moreover, students who were not categorized as e-ready and were still unfamiliar with the ELS may have contributed to under-reporting of responses if they were not keen on completing a non-essential task such as an online survey. These findings could serve as a benchmark for comparing levels and characteristics of e-readiness in other blended and online courses. Tracking students' levels of e-readiness, whether or not they are categorized as e-ready, in subsequent courses through to graduation would be a most useful study for university administrations and instructors seeking to understand and use key indicators to reduce attrition rates.
