Bibliography:
Rubine, D. (1991). Specifying gestures by example (Vol. 25, No. 4, pp. 329-337). ACM.
Link: http://srl.tamu.edu/srlng_media/content/objects/object-1236962325-cefe7476d664dc727f969660eac672cc/bde-GI-FinalVersion.pdf
Summary:
Previously, gesture recognizers were hand-coded.
GDP is a sketch program that recognizes gestures.
GRANDMA (Gesture Recognition Automated in Novel Direct Manipulation Architecture) lets a designer build an application by assigning gestures to view classes and specifying what each gesture does.
Gestures are recognized by first building a classifier from examples. Gestures are single strokes, and a feature vector is computed for each stroke.
Classification. Each gesture class is specified by the weights it assigns to each feature. Each feature vector is built from measurements of a set of 13 features. Features 12 and 13 have a dynamic component: they measure speed and duration.
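To make the feature vector concrete, here is a minimal sketch of computing a few stroke features from (x, y, t) samples. The numbering loosely follows the paper's f1..f13, but this is an illustrative subset, not the full 13-feature set:

```python
import math

def stroke_features(points):
    """Compute a small subset of Rubine-style features for one stroke.

    `points` is a list of (x, y, t) samples. Only a few features are
    shown here; the paper defines 13.
    """
    (x0, y0, t0), (xn, yn, tn) = points[0], points[-1]
    # f1, f2: cosine and sine of the initial angle (using an early
    # point to smooth digitizer noise).
    x2, y2, _ = points[2] if len(points) > 2 else points[-1]
    d0 = math.hypot(x2 - x0, y2 - y0) or 1.0
    f1, f2 = (x2 - x0) / d0, (y2 - y0) / d0
    # f5: distance between the first and last points.
    f5 = math.hypot(xn - x0, yn - y0)
    # f8: total stroke length (sum of segment lengths).
    f8 = sum(math.hypot(points[i + 1][0] - points[i][0],
                        points[i + 1][1] - points[i][1])
             for i in range(len(points) - 1))
    # f13: duration of the stroke (one of the dynamic features).
    f13 = tn - t0
    return [f1, f2, f5, f8, f13]
```

A horizontal stroke of length 3 drawn over 0.3 seconds, for example, yields an initial-angle cosine of 1, sine of 0, endpoint distance and path length of 3, and duration 0.3.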
The classifier is trained as a linear discriminator. First, a sample estimate of the mean feature vector is computed for each class. Using the per-class means, a sample covariance matrix is estimated for each class, and these are pooled into a sample estimate of the common covariance matrix across all classes.
Finally, the inverse of the common covariance matrix estimate is used to compute the weights for each class evaluator.
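The training steps above can be sketched in pure Python. Two features are used so the common covariance matrix can be inverted by hand; the paper uses all 13. The weight formulas (w_j from the inverse covariance times the class mean, plus a constant w_0) follow the linear-discriminant construction described in the summary:

```python
def train_linear_classifier(examples_by_class):
    """Train per-class linear evaluators from example feature vectors.

    `examples_by_class` maps a class name to a list of 2-D feature
    vectors. Returns {class: (w0, [w1, w2])}.
    """
    # Sample mean feature vector for each class.
    means = {}
    for c, ex in examples_by_class.items():
        n = len(ex)
        means[c] = [sum(f[i] for f in ex) / n for i in range(2)]

    # Pool the per-class scatter into the common covariance estimate,
    # normalizing by (total examples - number of classes).
    total = sum(len(ex) for ex in examples_by_class.values())
    C = len(examples_by_class)
    S = [[0.0, 0.0], [0.0, 0.0]]
    for c, ex in examples_by_class.items():
        m = means[c]
        for f in ex:
            d = [f[0] - m[0], f[1] - m[1]]
            for i in range(2):
                for j in range(2):
                    S[i][j] += d[i] * d[j]
    cov = [[S[i][j] / (total - C) for j in range(2)] for i in range(2)]

    # Invert the 2x2 common covariance matrix directly.
    det = cov[0][0] * cov[1][1] - cov[0][1] * cov[1][0]
    inv = [[cov[1][1] / det, -cov[0][1] / det],
           [-cov[1][0] / det, cov[0][0] / det]]

    # Per-class weights: w_j = sum_i inv[i][j] * mean_i, and
    # w0 = -1/2 * sum_j w_j * mean_j.
    weights = {}
    for c, m in means.items():
        w = [inv[0][j] * m[0] + inv[1][j] * m[1] for j in range(2)]
        w0 = -0.5 * (w[0] * m[0] + w[1] * m[1])
        weights[c] = (w0, w)
    return weights

def classify(weights, f):
    """Pick the class whose evaluator v_c = w0 + w . f is largest."""
    return max(weights, key=lambda c: weights[c][0]
               + weights[c][1][0] * f[0] + weights[c][1][1] * f[1])
```

With two well-separated example clusters, a point near each cluster center classifies to that cluster's class.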
For outliers and ambiguous classifications, rejection is decided from the estimated probability of correct classification and the distance of the input gesture's feature vector from the mean of its assigned class.
Rejection occasionally discards gestures that would have been acceptable, so it should be turned off if the application supports robust undo.
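A sketch of the rejection test, in the spirit of the paper: the probability of the chosen class is estimated from the evaluator scores (a softmax over score differences), and an outlier check compares the squared Mahalanobis distance against half the squared number of features. The exact cutoffs (0.95 and F^2/2) should be treated as tunable assumptions:

```python
import math

def should_reject(scores, best, mahal_sq, n_features, p_min=0.95):
    """Decide whether to reject a classified gesture.

    `scores` maps class -> evaluator value v_c, `best` is the chosen
    class, and `mahal_sq` is the squared Mahalanobis distance of the
    feature vector from the chosen class mean.
    """
    # Estimated probability that `best` is correct, derived from the
    # evaluator scores.
    p_best = 1.0 / sum(math.exp(v - scores[best])
                       for v in scores.values())
    # Reject ambiguous gestures (low probability) and outliers
    # (feature vector too far from the class mean).
    return p_best < p_min or mahal_sq > (n_features ** 2) / 2.0
```

A tie between two classes gives probability 0.5 and is rejected as ambiguous; a clear winner with a feature vector close to its class mean is accepted.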
GSCORE, a gesture-based musical score editor, is a second example application.
Comments:
Seems like a good approach for recognizing gestures, and gestures could be tailored for each user. Using a mouse for gestures is outdated; multi-touch gestures (at least for direct manipulation) are now the norm.
