Bridging System Development Perspectives and Quantitative Analysis of User Adoption in Educational Information Systems
Author(s) -
Paniti Netinant,
Sorapak Pukdesree,
Meennapa Rukhiran
Publication year - 2025
Publication title -
IEEE Access
Language(s) - English
Resource type - Magazines
SCImago Journal Rank - 0.587
H-Index - 127
eISSN - 2169-3536
DOI - 10.1109/ACCESS.2025.3616630
Abstract - Sustaining digital platform adoption in computer science requires both robust system design and quantitative evaluation of user behavior and system performance. System Quality (SQ), encompassing reliability, responsiveness, security, and usability, is a critical determinant of adoption. However, most studies grounded in the Technology Acceptance Model (TAM) and Task–Technology Fit (TTF) have overlooked objective SQ metrics and the moderating influence of Digital Skills (DS). This study introduces the Digital Educational Systems Adoption Framework (DESAF), which integrates TAM, TTF, and DS to analyze the adoption of the Graduate Academic Information System Services (GAISS). A mixed-method design combined three years of Microsoft IIS server log data (2022–2024) with 550 user survey responses. SQ was quantified through request volumes, response times, error rates, and downtime, and analyzed alongside behavioral constructs using structural equation modeling in SmartPLS 4. Results indicate that SQ strongly predicts DS (β = 0.416, p < 0.001) and actual system use (β = 0.251, p < 0.001), while DS significantly enhances perceived ease of use, usefulness, satisfaction, and actual use (β = 0.151, p < 0.001), confirming its dual role as both predictor and moderator. System performance improved substantially as average response times decreased from 526 ms to 305 ms and downtime from 3.67% to 2.86%, while annual requests rose from 155,000 to 966,000. Yet feature expansion increased error rates to 14.88%, highlighting a trade-off between performance and complexity. The validated DESAF model advances computer science by bridging software performance engineering and behavioral modeling, offering an evidence-based framework for adaptive, user-centered platforms sustained through longitudinal performance monitoring.
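The abstract describes deriving System Quality metrics (request volume, average response time, error rate) from IIS server logs. A minimal sketch of that kind of aggregation is shown below; the simplified two-field log layout, sample data, and function names are illustrative assumptions, not the study's actual pipeline or log schema.

```python
# Hypothetical sketch: aggregating SQ metrics (request volume, mean response
# time, error rate) from IIS-style log records. Real W3C-extended IIS logs
# have many more fields; here each line is assumed to hold just
# "sc-status time-taken" (HTTP status code and response time in ms).

from dataclasses import dataclass


@dataclass
class LogEntry:
    status: int         # HTTP status code (sc-status)
    time_taken_ms: int  # response time in milliseconds (time-taken)


def parse_line(line: str) -> LogEntry:
    """Parse a simplified 'status time-taken' pair from one log line."""
    status, time_taken = line.split()
    return LogEntry(int(status), int(time_taken))


def sq_metrics(lines: list[str]) -> dict:
    """Compute request volume, mean response time, and error rate (4xx/5xx)."""
    entries = [parse_line(line) for line in lines]
    requests = len(entries)
    avg_ms = sum(e.time_taken_ms for e in entries) / requests
    errors = sum(1 for e in entries if e.status >= 400)
    return {
        "requests": requests,
        "avg_response_ms": avg_ms,
        "error_rate_pct": 100.0 * errors / requests,
    }


# Illustrative sample: four requests, two of which failed.
sample = ["200 480", "200 520", "404 90", "500 610"]
print(sq_metrics(sample))
```

Running the same aggregation per calendar year over the full log archive would yield the longitudinal trend the abstract reports (e.g., falling average response times alongside rising request volumes).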