Niaz Mohammad
Papers by Niaz Mohammad
The robot houses a single-board computer, a Raspberry Pi, for its computation, and an Arduino Uno to drive the motors as instructed. Visual sensory input comes from a webcam mounted on the robot platform itself. While the robot is in motion, we compute optical flow from successive frames captured by the webcam using OpenCV's robust API. This allows us to navigate the robot through an unknown environment while avoiding obstacles.
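As a rough illustration of how optical flow can drive obstacle avoidance, the sketch below implements a common "flow balancing" strategy: the side of the image with larger flow magnitude is assumed to contain the nearer obstacle, so the robot steers toward the calmer side. The function name, threshold, and synthetic input are hypothetical, not taken from the paper; in a real pipeline the flow field would come from an OpenCV call such as `cv2.calcOpticalFlowFarneback` on consecutive grayscale webcam frames.

```python
import numpy as np

def steer_from_flow(flow, balance_eps=0.1):
    """Decide a steering command from a dense optical-flow field.

    flow: H x W x 2 array of per-pixel (dx, dy) vectors, e.g. the
    output of cv2.calcOpticalFlowFarneback on two successive frames.
    Larger average flow magnitude on one side suggests a nearer
    obstacle there, so we turn toward the side with less flow.
    balance_eps is an assumed tolerance, not a value from the paper.
    """
    mag = np.linalg.norm(flow, axis=2)          # per-pixel flow magnitude
    w = flow.shape[1]
    left = mag[:, : w // 2].mean()
    right = mag[:, w // 2 :].mean()
    if abs(left - right) < balance_eps:         # roughly balanced: go straight
        return "forward"
    return "right" if left > right else "left"  # turn away from larger flow

# Synthetic flow field: strong horizontal motion on the left half only,
# mimicking an obstacle looming on the robot's left.
flow = np.zeros((4, 8, 2))
flow[:, :4, 0] = 3.0
print(steer_from_flow(flow))  # obstacle on the left -> steer "right"
```

In practice this decision would be recomputed every frame and translated into motor commands sent from the Raspberry Pi to the Arduino Uno.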
We intend to develop a gesture-based mouse and application controller, which we have named "G-MAC". Our system will be developed with two major visions in mind: (1) to build a virtual mouse system that virtually replaces all the major mouse functionalities, and (2) to build a gesture application that enables us to control various applications via hand gestures. Our gesture application will consist of two types of gesture recognition: (1) static gesture recognition, which opens an application when a gesture matches one in an image database, and (2) a simple linear swipe-based motion gesture, which lets us control a selected application such as "Picasa" to move to the next/previous image and even zoom in/out. The ultimate goal is to build a gesture-based interface that is more intuitive and flexible for users.
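The linear swipe gesture described above can be reduced to a very simple decision once the hand is being tracked: look at the net horizontal displacement of the hand's centroid over a short window of frames and map it to a "next" or "previous" command. The sketch below shows that idea; the function name and the pixel threshold are illustrative assumptions, not details from the paper.

```python
def classify_swipe(xs, min_dist=50):
    """Classify a horizontal swipe from a sequence of tracked hand
    x-coordinates (e.g. the centroid of the detected hand region in
    each frame of a short window).

    min_dist is an assumed pixel threshold below which the motion is
    treated as jitter rather than a deliberate swipe. Returns "next"
    for a rightward swipe, "previous" for a leftward one, or None.
    """
    if len(xs) < 2:
        return None
    dx = xs[-1] - xs[0]                 # net horizontal displacement
    if abs(dx) < min_dist:
        return None                     # too small to count as a swipe
    return "next" if dx > 0 else "previous"

# A hand moving right across the frame maps to "next image";
# the resulting command would then be forwarded to the target
# application (e.g. as a key press that Picasa understands).
print(classify_swipe([100, 150, 220, 310]))  # prints "next"
```

A real system would add debouncing (ignoring frames immediately after a recognized swipe) so one physical gesture does not trigger multiple commands.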