omiio.org is dedicated to fostering open source growth for mobile applications software development on TI’s OMAP™ platform.
Through omiio.org, you can collaborate, share and grow open source hardware and software.
Gesture Recognition In Android
Gestures are a powerful means of communication among humans. In fact, gesturing is so deeply rooted in our communication that people often continue gesturing while speaking on the telephone. Hand gestures provide a complementary modality to speech for expressing one's ideas. In a conversation, hand gestures convey information such as degree of emphasis, discourse structure, and spatial and temporal structure. Natural interaction between humans and computing devices can therefore be achieved by using hand gestures as a channel of communication between them.
The purpose of this project is to provide a highly sophisticated Human Machine Interface.
The camera of the computing device is opened simultaneously with a photo viewer application. The user makes the appropriate gestures, which are captured by the camera; the type of each gesture is determined by recognition algorithms, and the gestures are converted into computer-understandable commands. These commands are mapped to an application, where the intended actions are performed.
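The gesture-to-command mapping described above can be sketched as a simple lookup table. The gesture names, commands, and bindings below are illustrative assumptions for a photo viewer, not the project's actual mapping:

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical sketch: translating recognized gesture types into
// application commands via a lookup table.
public class GestureCommandMapper {
    public enum Gesture { SWIPE_LEFT, SWIPE_RIGHT, OPEN_PALM, CLOSED_FIST }
    public enum Command { NEXT_PHOTO, PREVIOUS_PHOTO, ZOOM_IN, ZOOM_OUT }

    private final Map<Gesture, Command> bindings = new HashMap<>();

    public GestureCommandMapper() {
        // Assumed bindings for a photo viewer; a real system would make
        // these configurable per application.
        bindings.put(Gesture.SWIPE_LEFT, Command.NEXT_PHOTO);
        bindings.put(Gesture.SWIPE_RIGHT, Command.PREVIOUS_PHOTO);
        bindings.put(Gesture.OPEN_PALM, Command.ZOOM_IN);
        bindings.put(Gesture.CLOSED_FIST, Command.ZOOM_OUT);
    }

    // Returns the command bound to a gesture, or null if the gesture is unbound.
    public Command translate(Gesture g) {
        return bindings.get(g);
    }
}
```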
This new gesture-based approach allows users to interact with computers through hand postures, with the system adapting to different lighting conditions and backgrounds. Its efficiency makes it suitable for real-time applications.
Gesturing can be used by developers as a tool for building a wide range of applications and by typical users of smartphones and tablets running Android. People with physical disabilities will also find this system very useful.
The application user performs gestures with the hand. A gesture recognition system uses a video camera to capture images of the hand movement: it captures the live stream and splits it into frames. The gesture-recognition software tracks the moving hand's features, identifies the motion, and sends it to the Android application, which then issues commands to the currently running application.
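The frame-extraction and motion-identification step can be illustrated with a minimal frame-differencing sketch in plain Java. The grayscale-array frame representation and both thresholds are assumptions for illustration; a real implementation would operate on camera preview buffers:

```java
// Hypothetical sketch of the frame-differencing step: compare two grayscale
// frames pixel by pixel and report whether enough pixels changed to count
// as hand motion. Threshold values here are illustrative assumptions.
public class MotionDetector {
    private static final int PIXEL_THRESHOLD = 30;   // min brightness change per pixel
    private static final double MOTION_RATIO = 0.02; // fraction of pixels that must change

    // Frames are height x width grayscale values in [0, 255].
    public static boolean motionDetected(int[][] prev, int[][] curr) {
        int changed = 0;
        int total = prev.length * prev[0].length;
        for (int y = 0; y < prev.length; y++) {
            for (int x = 0; x < prev[0].length; x++) {
                if (Math.abs(curr[y][x] - prev[y][x]) > PIXEL_THRESHOLD) {
                    changed++;
                }
            }
        }
        // Motion is reported only when enough of the frame has changed,
        // which suppresses sensor noise on individual pixels.
        return changed >= total * MOTION_RATIO;
    }
}
```

In a full pipeline, frames flagged as containing motion would then be passed to the feature-tracking stage that classifies the gesture.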
An evaluation kit with the OMAP 4430 processor (PandaBoard).
A motion-sensing camera.
RAM: 120 MB or more.
Hard disk: minimum 200 MB.
Android SDK 2.0 or later.
Java and XML.
The system is required to perform the following functions.
Switch on the camera and open an application simultaneously.
The camera should stay on in video-capture mode and run in the background, so that the intended application remains in the display.
Capture the gestures made by the device's user with the motion-sensing camera.
Perform the corresponding actions for the appropriate gestures made by the user.
Dalvik virtual machine optimized for Android devices.
Rich development environment including device emulators, tools for debugging, memory and performance profiling, and a plug-in for the Eclipse IDE.
The system is expected to run on low-memory devices as well.
The response time should be minimal, i.e., the corresponding action should be performed as soon as a gesture is made.
The system should ignore inappropriate gestures made by the user.
Availability of the system depends on availability of the device and its service.
The documentation provided with the application is simple and easy to understand.
Platform compatibility is limited to Android devices.
The product build is scalable.
Usability by the target user community is given utmost importance.
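The requirements to ignore inappropriate gestures and keep response time low can be sketched as a small filter that drops low-confidence recognitions and too-rapid repeats. The confidence score and both threshold parameters are assumptions for illustration, not part of the project's stated design:

```java
// Hypothetical sketch of gesture filtering: a recognizer typically emits a
// confidence score with each candidate gesture; low-confidence results and
// repeats arriving too quickly are dropped so that inappropriate gestures
// do not trigger actions.
public class GestureFilter {
    private final double minConfidence;
    private final long minIntervalMs;
    private boolean hasAccepted = false;
    private long lastAcceptedAt;

    public GestureFilter(double minConfidence, long minIntervalMs) {
        this.minConfidence = minConfidence;
        this.minIntervalMs = minIntervalMs;
    }

    // Accept a gesture only if it is confident enough and not too soon
    // after the previously accepted gesture.
    public boolean accept(double confidence, long timestampMs) {
        if (confidence < minConfidence) {
            return false;
        }
        if (hasAccepted && timestampMs - lastAcceptedAt < minIntervalMs) {
            return false;
        }
        hasAccepted = true;
        lastAcceptedAt = timestampMs;
        return true;
    }
}
```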
By PES School of Engineering