Open Access
Implementing a Real Time Virtual Mouse System and Fingertip Detection based on Artificial Intelligence
Author(s) - Banda Aneela
Publication year - 2021
Publication title - International Journal for Research in Applied Science and Engineering Technology
Language(s) - English
Resource type - Journals
ISSN - 2321-9653
DOI - 10.22214/ijraset.2021.35485
Subject(s) - computer science , gesture , artificial intelligence , human–computer interaction , identification (biology) , gesture recognition , construct (python library) , tracking (education) , computer vision , modalities , human intelligence , psychology , pedagogy , social science , botany , sociology , biology , programming language
Artificial intelligence refers to the simulation of human intelligence in machines that are programmed to think and act like humans. It is a broad branch of computer science devoted to building intelligent machines capable of performing tasks that normally require human intelligence. Although artificial intelligence is a heterogeneous field with many techniques, advances in machine learning and deep learning are driving a paradigm shift in practically every industry. Human-computer interaction requires the identification of hand gestures using vision-based technology. The keyboard and mouse have become central to human-computer interaction in recent decades, and interfaces have since progressed from buttons to touch technology and a variety of other gesture-control modalities. A hand-tracking virtual mouse application can be built with an ordinary camera. The proposed system combines the camera with computer vision techniques, such as fingertip identification and gesture recognition, to handle mouse operations (volume control, right click, left click), and we show that it can perform everything that existing mouse devices can.
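The abstract does not specify how fingertip positions are mapped to mouse actions, but systems of this kind typically infer which fingers are raised from detected hand landmarks and map finger combinations to commands. Below is a minimal, hypothetical sketch of that classification step in Python. It assumes 21 hand landmarks in the ordering used by detectors such as MediaPipe Hands, given as (x, y) pairs in normalized image coordinates where y increases downward; the landmark indices, gesture-to-action mapping, and function names are illustrative, not the authors' actual design.

```python
# Hypothetical gesture classifier for a hand-tracking virtual mouse.
# Input: 21 (x, y) hand landmarks in MediaPipe Hands order, with y
# increasing downward (normalized image coordinates). A finger counts
# as "raised" when its tip sits above its PIP joint in the image.

FINGER_TIPS = {"index": 8, "middle": 12, "ring": 16, "pinky": 20}
FINGER_PIPS = {"index": 6, "middle": 10, "ring": 14, "pinky": 18}


def fingers_up(landmarks):
    """Return the set of finger names whose tip is above its PIP joint."""
    return {
        name
        for name, tip in FINGER_TIPS.items()
        if landmarks[tip][1] < landmarks[FINGER_PIPS[name]][1]
    }


def classify_gesture(landmarks):
    """Map raised-finger combinations to mouse actions (illustrative)."""
    up = fingers_up(landmarks)
    if up == {"index"}:
        return "move_cursor"
    if up == {"index", "middle"}:
        return "left_click"
    if up == {"index", "middle", "ring"}:
        return "right_click"
    if up == {"index", "middle", "ring", "pinky"}:
        return "volume_control"
    return "idle"
```

In a full system, the landmarks would come from a per-frame hand detector running on the camera feed, and the recognized action would be forwarded to an OS-level automation layer that moves the cursor or issues clicks.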
