A Simulation-based Approach to Optimize the Execution Time and Minimization of Average Waiting Time Using Queuing Model in Cloud Computing Environment
Author(s) -
Souvik Pal,
Prasant Kumar Pattnaik
Publication year - 2016
Publication title -
International Journal of Electrical and Computer Engineering
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.277
H-Index - 22
ISSN - 2088-8708
DOI - 10.11591/ijece.v6i2.pp743-750
Subject(s) - computer science, cloud computing, CloudSim, queue, queueing theory, server, service (business), minification, quality of service, distributed computing, operations research, operating system, computer network, economy, economics, programming language, engineering
Cloud computing is an emerging domain in both academia and the IT industry. It is a business framework for delivering services and computing power on demand, with users paying service providers according to their usage. For enterprises, cloud computing is worth considering as a way to build business systems with lower costs, higher profits, and better quality of service. With cost optimization in mind, a service provider may initially try to use fewer CPU cores and data centers. This paper therefore uses the CloudSim simulation toolkit to evaluate the relationship between the number of CPU cores and execution time. Minimizing waiting time is also a significant issue: when a large number of jobs are submitted, they must wait to be allocated to servers, which increases both queue length and waiting time. This paper also applies a multi-server, finite-capacity queuing model to reduce waiting time and queue length.
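A multi-server queue with finite capacity of the kind the abstract describes is conventionally modeled as an M/M/c/K system. The Java sketch below computes its standard steady-state metrics (blocking probability, mean queue length, and mean waiting time via Little's law); the class name, parameter values, and the M/M/c/K reading of the paper's model are illustrative assumptions, not details taken from the paper itself.

    /**
     * Minimal sketch of an M/M/c/K queue (multi-server, finite capacity).
     * All parameter values are illustrative, not from the paper.
     */
    public class MMcKQueue {
        public static void main(String[] args) {
            double lambda = 8.0; // arrival rate (jobs per unit time) - assumed
            double mu = 3.0;     // service rate per server - assumed
            int c = 4;           // number of servers (e.g., CPU cores) - assumed
            int K = 10;          // system capacity (in service + waiting) - assumed

            double a = lambda / mu; // offered load
            double rho = a / c;     // per-server utilization

            // P0: probability the system is empty
            double sum = 0.0;
            for (int n = 0; n < c; n++) sum += Math.pow(a, n) / factorial(n);
            double tail = 0.0;
            for (int n = c; n <= K; n++) tail += Math.pow(rho, n - c);
            sum += Math.pow(a, c) / factorial(c) * tail;
            double p0 = 1.0 / sum;

            // Lq: mean queue length; pK: probability an arriving job is blocked
            double pK = stateProb(K, a, c, p0);
            double lq = 0.0;
            for (int n = c + 1; n <= K; n++) lq += (n - c) * stateProb(n, a, c, p0);

            // Effective arrival rate excludes blocked jobs; Wq via Little's law
            double lambdaEff = lambda * (1.0 - pK);
            double wq = lq / lambdaEff;

            System.out.printf("Blocking probability P_K = %.4f%n", pK);
            System.out.printf("Mean queue length Lq     = %.4f%n", lq);
            System.out.printf("Mean waiting time Wq     = %.4f%n", wq);
        }

        // Steady-state probability of exactly n jobs in the system
        static double stateProb(int n, double a, int c, double p0) {
            if (n < c) return Math.pow(a, n) / factorial(n) * p0;
            return Math.pow(a, n) / (factorial(c) * Math.pow(c, n - c)) * p0;
        }

        static double factorial(int n) {
            double f = 1.0;
            for (int i = 2; i <= n; i++) f *= i;
            return f;
        }
    }

With these illustrative rates, adding servers (increasing c) or capacity (increasing K) trades blocking probability against waiting time, which is the kind of trade-off the paper's simulation explores when varying the number of CPU cores.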
