S = (Max(R,G,B) − Min(R,G,B)) / Max(R,G,B)

V = Max(R,G,B) / 255

[Figure 3 here: flow chart of the enrollment process (feature extraction on static gesture → composite template for registration → secure storage with ID, security keys and PIN) and the authentication process (feature extraction → encrypted composite template for testing → matching → Yes/No).]

Figure 3 Work flow diagram that shows the enrollment processes and authentication processes of our proposed biometric authentication model.

Fong et al. BioMedical Engineering OnLine 2013, 12:111, Page 6 of 18. http://www.biomedical-engineering-online.com/content/12/1/111

After the HSV conversion, the image is subject to erosion and dilation, with the purpose of removing noise and separating the foreground object from the background. Erosion is defined as follows. Consider two sets A and B, where A is the input image and B is the structural element; the erosion of A by B is written A ⊖ B. An output pixel is set to 255 only when the input pixel and all of its surrounding pixels under the 1-entries of the structural element have the value 255. Erosion can effectively remove unnecessary elements when an appropriate structural element is selected. Dilation, written A ⊕ B, is the next step after erosion. It works as follows: an output pixel is set to 255 when at least one pixel under the 1-entries of the structural element has the value 255. This makes the image visually expand. The aim of dilation is to fill the gaps, by using an appropriate structural element, and to remove the background.

Feature extraction

The simultaneously captured hand gesture image is passed through three stages: preprocessing, feature extraction, and finally classification. As described earlier, in the preprocessing stage some operations are applied to extract the hand gesture from
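The erosion and dilation steps described above can be sketched in a few lines. The following is a minimal pure-Python illustration on a binary (0/255) image with a 3×3 all-ones structuring element; it is a sketch of the general morphological operations, not the authors' implementation (a production system would more likely use a library routine such as OpenCV's erode/dilate).

```python
SE = ((1, 1, 1), (1, 1, 1), (1, 1, 1))  # 3x3 all-ones structuring element

def erode(img, se=SE):
    """Binary erosion: output pixel is 255 only if every pixel under a
    1-entry of the structuring element is 255 (0/255 image assumed;
    out-of-range neighbours are treated as background)."""
    h, w = len(img), len(img[0])
    out = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            ok = True
            for dy in (-1, 0, 1):
                for dx in (-1, 0, 1):
                    if se[dy + 1][dx + 1]:
                        ny, nx = y + dy, x + dx
                        if not (0 <= ny < h and 0 <= nx < w and img[ny][nx] == 255):
                            ok = False
            if ok:
                out[y][x] = 255
    return out

def dilate(img, se=SE):
    """Binary dilation: output pixel is 255 if any pixel under a
    1-entry of the structuring element is 255."""
    h, w = len(img), len(img[0])
    out = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            hit = False
            for dy in (-1, 0, 1):
                for dx in (-1, 0, 1):
                    ny, nx = y + dy, x + dx
                    if se[dy + 1][dx + 1] and 0 <= ny < h and 0 <= nx < w and img[ny][nx] == 255:
                        hit = True
            if hit:
                out[y][x] = 255
    return out

# Erosion followed by dilation (an "opening") removes an isolated noise
# pixel while preserving a solid 3x3 foreground block.
img = [[0] * 7 for _ in range(7)]
img[1][5] = 255                      # isolated speck (noise)
for y in range(3, 6):
    for x in range(1, 4):
        img[y][x] = 255              # solid 3x3 block
opened = dilate(erode(img))
```

The speck disappears after erosion, while the 3×3 block shrinks to its centre and is then restored by dilation, which is exactly the noise-removal behaviour described above.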
its background and prepare the hand gesture image for the feature extraction stage. Features are extracted by several image analysis functions, which are applied over two different types of image data. The first is the original intensity image of the hand, and the other is the hand contour. While the intensity map provides rich information about the shades and texture of the hand skin, the hand contour describes almost explicitly the outlined shape of the hand.

For extracting the contour of a hand image, an effective and simple edge detection algorithm, the Sobel filter, is used. The filter is closely related to the Prewitt gradient edge detector. It detects and highlights the edges of an image by measuring its 2D spatial gradient, according to the high spatial frequency of the regions near the edges. The operation is done by a pair of 3×3 convolution kernels which estimate the absolute gradient magnitude at each point and the orientation of that gradient. The kernels are shown in Figure 4. The gradient magnitude is thereby |g| = √(g∅² + g₊₉₀²). To allow fast computation from the results of the two kernels, the gradient magnitude is approximated by |g| = |g∅| + |g₊₉₀|. These kernels are designed to respond maximally to edges running vertically and horizontally relative to the pixel grid, one kernel for each of the two perpendicular orientations. The kernels can be applied separately to the input image to produce separate measurements of the gradient component in each orientation (one perpendicular to the other). One kernel is simply the other rotated by 90 degrees.

Natural edges in images often lead to lines in the output image that are several pixels wide, due to the smoothing effect of the Sobel operator. This phenomenon, which may be undesired in other applications, has the advantage of amplifying the outline of a hand gesture; it therefore becomes easier for a classifier to
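The two-kernel convolution and the |g| = |g∅| + |g₊₉₀| approximation can be illustrated with a minimal sketch. This pure-Python version ignores the image border and is illustrative only, not the authors' code:

```python
# The pair of 3x3 Sobel kernels (as in Figure 4).
KY = ((+1, +2, +1), (0, 0, 0), (-1, -2, -1))   # responds to horizontal edges
KX = ((-1, 0, +1), (-2, 0, +2), (-1, 0, +1))   # responds to vertical edges

def sobel_magnitude(img):
    """Approximate gradient magnitude |g| = |gx| + |gy| at each interior
    pixel (border pixels are left at 0 for brevity)."""
    h, w = len(img), len(img[0])
    out = [[0] * w for _ in range(h)]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            gx = gy = 0
            for dy in (-1, 0, 1):
                for dx in (-1, 0, 1):
                    p = img[y + dy][x + dx]
                    gx += KX[dy + 1][dx + 1] * p
                    gy += KY[dy + 1][dx + 1] * p
            out[y][x] = abs(gx) + abs(gy)
    return out

# A vertical step edge produces a strong response along the boundary.
step = [[0, 0, 255, 255]] * 4
mag = sobel_magnitude(step)
```

On the step image, interior pixels adjacent to the 0→255 transition receive the full response 4 × 255 = 1020, while the untouched border stays at 0.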
accurately differentiate hand signs by recognizing their exaggerated outlines. After the hand gesture images have been processed by the Sobel filter, features that describe the hand gesture are subsequently extracted. Several types of vision-based information are available via image processing algorithms for harvesting descriptive features, as described below. They are the intensity histogram and its averaged profile, the color histogram, and dimensionality measures. The simplest type of vision-based information is intensity. Taking the intensity value (0 to 255) of each pixel across a hand gesture image, and projecting these intensity values on a 3D plot (with the x-axis and y-axis being the coordinates of the 2D spatial positions of the pixels, and the z-axis their corresponding intensities), a visual hand gesture can be recreated and becomes recognizable. The same information is used for training a classifier that automatically distinguishes the gestures. Figure 5 shows a sample intensity map for the hand gesture of letter 'a'. It can be seen clearly how the intensities mimic the detailed brightness and contrast of the surface of the hand gesture. Its counterpart after the Sobel filter is applied is shown in Figure 6.

Directionality analysis is an image processing method that quantitatively computes a histogram of the directional structures of an image. It was developed by Jean-Yves Tinevez, Max Planck Institute of Cell Biology and Genetics, Dresden (http://fiji.sc/JeanYvesTinevez). The analysis is designed to infer the visual orientation of structures in an image. The output histogram indicates, across all the different directions along the x-axis, the amount of structure oriented in each direction. The normalized amount of pixels of the image areas that are slanted towards each corresponding direction lies on the
y-axis. Images with completely isotropic content (e.g. a photo of a clear blue sky or a pile of random pebbles) are expected to produce a flat histogram. Images that contain subjects inclined towards some directional orientation are expected to show a histogram with peaks at that orientation. For example, as shown in Figure 7, in the image of a hand gesture of letter 'p', the fingers and wrist are oriented mainly in three populations of pixels: the index finger points almost flat, about horizontally; the thumb and the middle finger bend towards a direction of approximately 120°, assuming zero degrees at the East direction going clockwise; finally, the wrist is slanted at around 75°, supporting the hand. These groups of directionally oriented pixels give rise to the peaks shown in Figure 8, known as the directionality histogram.

The directionality analysis is implemented based on Fourier spectrum analysis. For a square image, structures with a preferred orientation generate a periodic pattern at +90° orientation in the Fourier transform of the image, compared to the direction of the objects in the input image. The plugin chops the image into square pieces and computes their Fourier power spectra. The latter are analyzed in polar coordinates, and the power is measured for each angle using the spatial filters proposed in [29].

+1 +2 +1      -1  0 +1
 0  0  0      -2  0 +2
-1 -2 -1      -1  0 +1

Figure 4 The pair of 3×3 Sobel convolution kernels for the horizontal (left) and vertical (right) orientations.

In addition to the histogram, the directionality analysis also generates statistics pertaining to the highest peak found in the histogram. In the above example, the highest peak belongs to the wrist that bends at 75°, because the wrist area has the most pixels oriented in a common direction. The statistics generated are harvested as informative features, as well
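As an illustration of the idea, a directionality histogram can be approximated from local gradient orientations. Note that the plugin referenced above computes it from Fourier power spectra; this pure-Python sketch is a simplified stand-in for conveying the concept, not a reimplementation of the plugin:

```python
import math

def directionality_histogram(img, nbins=18):
    """Histogram of local structure orientations over 0-180 degrees,
    weighted by gradient magnitude and normalized to sum to 1.
    (A simplified stand-in for the Fiji Directionality plugin, which
    derives the same kind of histogram from Fourier power spectra.)"""
    hist = [0.0] * nbins
    h, w = len(img), len(img[0])
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            gx = img[y][x + 1] - img[y][x - 1]   # central differences
            gy = img[y + 1][x] - img[y - 1][x]
            mag = math.hypot(gx, gy)
            if mag == 0:
                continue
            # Structure orientation is perpendicular to the gradient.
            theta = (math.degrees(math.atan2(gy, gx)) + 90.0) % 180.0
            hist[min(int(theta / (180.0 / nbins)), nbins - 1)] += mag
    total = sum(hist) or 1.0
    return [v / total for v in hist]

# Horizontal stripes: all structure lies at 0 degrees, so the histogram
# concentrates its weight in the first bin.
stripes = [[255 if (y // 2) % 2 else 0] * 8 for y in range(8)]
hist = directionality_histogram(stripes)
```

An isotropic image would instead spread its weight across all bins, matching the flat-histogram behaviour described above.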
as those of the other peaks, to be used for training a hand gesture classifier. The maximum peak is fitted by a Gaussian function, taking into account the periodic nature of the histogram. The 'Direction (°)' column reports the center of the Gaussian. The 'Dispersion (°)' column reports the standard deviation of the Gaussian. The 'Amount' column is the sum of the histogram from center − standard deviation to center + standard deviation, divided by the total sum of the histogram. The real histogram values are used for the summation, not the Gaussian fit. The 'Goodness' column reports the goodness of the fit; 1 is good, 0 is bad.

Experiment

The design of the proposed biometric authentication system addresses two unique functions: (1) it enables hand sign recognition via static images of hand gestures; (2) it allows personal identification by distinguishing the subtle but unique behavioral patterns of users in posing the hand signs. During the test of static hand posture, the characteristics of a particular hand, in terms of shape, intensity and color distributions, and its directional orientation in posing a hand sign, are used to identify the users.

Figure 5 3D intensity map of hand sign of letter 'a' presented at +45 degrees.

Signers who are the users of the biometric authentication system pose their hands according to their enrolled secret patterns in front of a camera, to be authenticated. Experimental evaluation is carried out as two computer performance tests: (i) predicting the identity of a signer by hand gesture; and (ii) predicting hand sign content by gesture. The data sources from which the features of the hand gestures are extracted and tested in the experiments are introduced in Section 4.1. The performance evaluation criteria are described in Section 4.2, and the experimental results are reported and discussed in Section 4.3.

Experimental data

For acquiring the static hand
gesture image data, four student volunteers took turns, each generating four sets of hand gestures for the 26 letters according to the standard American Sign Language. For each letter, each student posed at four slightly different angles, in order to enact the effect of inexactness in sign language.

Figure 6 3D intensity map of hand sign of letter 'a' presented at +45 degrees with Sobel filter applied.

In this set of data, which is subject to training and testing of the classifier methods, the hand contour is extracted as a feature, treated by scaling and removal of the background in real time. After that, the fitted images of the gestures are processed with further feature extraction as discussed in Section 3.2. There are a total of 1,536 features taken from all these image analysis techniques. Dimensionality reduction is applied to remove the redundant features before training an effective classifier. The algorithm used is Correlation-based Feature Selection. The algorithm evaluates subsets of features on the basis of the following hypothesis: "Good feature subsets contain features highly correlated with the classification, yet uncorrelated to each other" [30]. The significant features are retained and used for training the classifier. Twenty-seven significant features are selected for the classifier responsible for prediction of the signer by gesture, and sixteen useful features are retained for prediction of content by gesture. More features are needed for predicting the signer than for predicting content; this suggests that it may be easier to generalize a satisfactorily accurate classifier for contents (which are limited to the distinctive shapes of the 26 letters) than for identifying the hands of each individual. The differences between each signer's hand may be subtle and hence require
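The quoted hypothesis is operationalized in CFS as a merit score, merit = k·r_cf / sqrt(k + k(k−1)·r_ff), where r_cf is the mean feature-class correlation and r_ff the mean pairwise feature-feature correlation over a subset of k features [30]. A small sketch using Pearson correlation on toy numeric data (illustrative only; the data and variable names are made up):

```python
import math

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy) if sx and sy else 0.0

def cfs_merit(features, labels):
    """CFS merit of a feature subset: rewards high average feature-class
    correlation (r_cf) and penalizes feature-feature correlation (r_ff):
    merit = k * r_cf / sqrt(k + k*(k-1) * r_ff)."""
    k = len(features)
    r_cf = sum(abs(pearson(f, labels)) for f in features) / k
    pairs = [(i, j) for i in range(k) for j in range(i + 1, k)]
    r_ff = (sum(abs(pearson(features[i], features[j])) for i, j in pairs)
            / len(pairs)) if pairs else 0.0
    return k * r_cf / math.sqrt(k + k * (k - 1) * r_ff)

labels = [0, 0, 1, 1, 1, 0]
good   = [0, 0, 1, 1, 1, 0]   # perfectly correlated with the class
noisy  = [1, 0, 1, 0, 1, 0]   # only weakly related to the class
# Adding a weak, partly redundant feature lowers the subset's merit.
m_good, m_both = cfs_merit([good], labels), cfs_merit([good, noisy], labels)
```

A greedy search over subsets, scored by this merit, is how CFS arrives at the 27 and 16 retained features mentioned above.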
more features to accomplish the training. The proportions by which the features are selected from the different types of image analysis are shown in pie charts in Figures 9 and 10. It can easily be observed that for prediction of contents, directionality analysis is more important, because hand signs are distinctively different in the shapes of their hand gestures. Contrariwise, significant features from the color histogram dominate the feature space (by 48%) for classifying each individual's hand, which could largely be due to the different skin complexion colors.

Figure 7 Hand gesture image of letter 'p' that has the directional lines added for illustration.

Figure 8 The respective directionality histogram of the hand gesture image of letter 'p'.

Performance criteria

The experiments concern the performance of our proposed biometric authentication model, especially the classifier. The classifier serves as the brain in predicting the identity of the signers and the content of the hand signs; therefore its accuracy is of utmost importance. There are many classification algorithms available, some of which may be more suitable for hand sign pattern recognition than others. Likewise, there are multiple performance criteria by which the performance of these classifiers can be assessed.

The performance criteria adopted here include the accuracy of the classifiers, its counterpart Mean Absolute Error, the Kappa statistic, F-measure, and ROC, observed during hand sign recognition. All the performance results except accuracy, which is a percentage, are normalized to [0,1], where 0 is the minimum and 1 is the maximum. The accuracy is simply the percentage of correctly classified cases over the total number of testing cases. It serves as the main performance measure of the model, indicating how 'useful' it is with respect to prediction. In the experiment, the
option for training/testing is set to 10-fold cross-validation, which is a common way in statistics to validate how well the results of a data mining model will generalize to an independent dataset. It works by randomly partitioning the full dataset into two subsets, one being the training segment and the other the testing segment. The testing segment serves as unseen samples for assessing the performance of the induced model; the testing segment of course already has the predefined class labels, so the software is able to score the accuracy of the model that was trained on the training subset. This process is repeated for ten rounds, again with random partitions at different positions of the full dataset, in order to obtain unbiased performance results. Each round, the cross-validation is performed over different random partitions. The final performance scores are those averaged over the ten rounds.

Figure 9 Proportion of features selected from different image analysis for classifying individual signers.

The Kappa statistic is generally used in data mining, statistical analysis, and even the assessment of medical diagnostic tests, as an indicator of how 'reliable' a trained model is. It basically reflects how consistent the evaluation results obtained from multiple inter-observers are, and how well they agree. A full description of the Kappa statistic can be found in [31]. Generally, a Kappa of 0 indicates agreement equivalent to chance, whereas a Kappa of 1 means perfect agreement. We loosely define reliability here by implying that a model with a high Kappa value is a consistent model that would be expected to show about the same level of performance (in this case, accuracy) even when tested with datasets from other sources. The Kappa statistic is computed here from the 10-fold cross-validation, with each fold of a different combination of partitions (training and
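Cohen's kappa can be computed from a confusion matrix as kappa = (p_o − p_e)/(1 − p_e), where p_o is the observed agreement (accuracy) and p_e the agreement expected by chance from the row and column marginals. A minimal sketch, with an illustrative 2-class confusion matrix (the counts are made up):

```python
def cohens_kappa(cm):
    """Cohen's kappa from a square confusion matrix:
    kappa = (p_o - p_e) / (1 - p_e), where p_o is the observed accuracy
    and p_e the chance agreement from the row/column marginals."""
    k = len(cm)
    n = sum(sum(row) for row in cm)
    p_o = sum(cm[i][i] for i in range(k)) / n
    p_e = sum(sum(cm[i]) * sum(row[i] for row in cm) for i in range(k)) / n ** 2
    return (p_o - p_e) / (1 - p_e)

# 45 of 60 cases on the diagonal (75% accuracy), but balanced marginals
# mean 50% agreement is expected by chance, so kappa is only 0.5.
cm = [[25, 5],
      [10, 20]]
kappa = cohens_kappa(cm)
```

This is why a classifier can look acceptable on raw accuracy yet show a sharply lower Kappa, as observed for some algorithms in the results below.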
testing) as different inter-observers.

In pattern recognition, such as hand sign recognition in biometric authentication, the precision rate, or just Precision, is the fraction of relevantly recognized instances. In our biometric authentication model, Precision is a measure of the accuracy given that a specific class has been predicted. It is calculated by this simple formula:

Precision = True Positive / (True Positive + False Positive).

Recall is defined as the fraction of relevantly retrieved instances. The common part of both precision and recall is relevance, on which both take their measurement. Usually, precision and recall scores are not discussed in isolation, and the relationship between them is inverse: as one increases, the other decreases. Recall is defined as:

Recall = True Positive / (True Positive + False Negative).

In a classification task, recall is a criterion of the ability of a prediction model to select labeled instances from the training and testing datasets. A precision of 1.0 means that every instance predicted by the classifier to belong to a specific class does indeed belong to that class. A recall of 1.0 means that every instance from that particular class is labeled with this class and all are predicted correctly; none are left out.

F-measure is the harmonic mean of precision and recall, that is:

F-measure = 2 / (1/Precision + 1/Recall) = 2 × Precision × Recall / (Precision + Recall).

It is also known as the balanced F-score, because recall and precision are equally weighted. The general formula for the Fβ measure is:

Fβ = (1 + β²) × Precision × Recall / (β² × Precision + Recall).

Figure 10 Proportion of features selected from different image analysis for classifying hand gesture contents.

As mentioned before, precision and recall scores should be taken into account
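The formulas above translate directly into code; a small sketch with illustrative counts (the TP/FP/FN values are made up):

```python
def precision_recall_f(tp, fp, fn, beta=1.0):
    """Precision = TP/(TP+FP), Recall = TP/(TP+FN), and
    F_beta = (1+beta^2) * P * R / (beta^2 * P + R); beta=1 gives the
    balanced F-measure (harmonic mean of precision and recall)."""
    p = tp / (tp + fp)
    r = tp / (tp + fn)
    f = (1 + beta ** 2) * p * r / (beta ** 2 * p + r)
    return p, r, f

# Example: 8 true positives, 2 false positives, 4 false negatives.
p, r, f1 = precision_recall_f(tp=8, fp=2, fn=4)
```

Because the harmonic mean is dominated by the smaller of the two values, F1 here (8/11 ≈ 0.727) sits below the precision of 0.8, reflecting the weaker recall of 2/3.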
simultaneously, because they have a strong inter-relation. Consequently, both are combined into a single measure, the F-measure, which is perceived as a well-rounded performance evaluation, more highly valued than simple accuracy.

ROC is an acronym for Receiver Operating Characteristic; it is an important means to evaluate the performance of a classifier system. An ROC curve is created by plotting the fraction of true positives out of the positives (the sensitivity) on the y-axis against the fraction of false positives out of the negatives (1 − specificity) on the x-axis. When plotted this way, the closer the curve runs to the upper-left corner, the better. Theoretically, any classifier will display a certain trade-off between these two measures. For example, in a biometric authentication system in which the user is tested with extra precaution for security requirements, the classifier may be set to consider more biometric features in addition to the standard ones, even though they are minor ones (low specificity), and perhaps higher influential factors are adjusted for those event variables that may directly or indirectly trigger the security alert (high sensitivity). In this paper, we used the area under the curve (AUC), which represents the probability that a classifier will rank a randomly chosen positive instance higher than a randomly chosen negative one, as a quantitative measure for classifier model comparison. In general, the area under the ROC curve is widely recognized as the measure of a diagnostic test's discriminatory power; in our case, the stronger the better in discriminating signers' hand signs and their subtle behavioral patterns apart.

Experimental results

In the experiment, ten popular machine learning algorithms are used to test four different types of prediction. The ten algorithms are chosen from the major types of classification, including decision trees, rule-based methods, kernel functions, and Bayes methods. For the details
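The ranking interpretation of AUC mentioned above can be computed directly: over all positive-negative pairs, count how often the positive instance receives the higher score, with ties counting one half. A sketch with made-up scores:

```python
def auc_from_scores(pos_scores, neg_scores):
    """AUC as the probability that a randomly chosen positive instance
    is scored above a randomly chosen negative one (ties count 0.5)."""
    wins = sum((p > n) + 0.5 * (p == n)
               for p in pos_scores for n in neg_scores)
    return wins / (len(pos_scores) * len(neg_scores))

# 3 positives and 2 negatives: 5 of the 6 pairs are ranked correctly.
auc = auc_from_scores([0.9, 0.8, 0.4], [0.7, 0.3])
```

This pairwise formulation is equivalent to the area under the ROC curve and is convenient for comparing classifiers without plotting the curve itself.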
of the algorithms, readers are referred to [32], where a similar framework of experiments using classification algorithms is described. For fairness of comparison, all the selected algorithms have been fine-tuned in advance with their best-performing parameters. The experiment is executed in the same computing environment, including both hardware and software; the same hand sign data as described in Section 4.1 are used for all, and the same 10-fold cross-validation option is selected for each experimental trial of each algorithm. The performance results of the classifiers are shown in terms of the various performance criteria in Tables 1 and 2. The results in the two tables correspond to the two different types of authentication: by the identity of the signer, and by the content of the hand signs.

In terms of accuracy and its counterpart mean absolute error, classification algorithms such as J48, Random Forest, NNge and Perceptron performed consistently well for the two types of authentication. On the other hand, algorithms like NBTree, Decision Table, Association Rules, SVM, BayesNet and NaiveBayes performed poorly, with low accuracy and high error. In contrast, the mean absolute errors for BayesNet and SVM are very small, close to zero, for predicting letters from static hand sign images. This shows that these algorithms can generalize pattern recognition models very well for hand sign letters. However, some algorithms, for example the neural network (Perceptron) and NNge, which can map decision rules to hyper-rectangles by nearest-neighbor methods, generally perform relatively well on both types of prediction.

The performances by the criteria of F-measure and ROC AUC follow about the same patterns as accuracy. For the Kappa statistic, again the performance comparison follows closely the
patterns of Accuracy, F-measure and ROC AUC. However, the Kappa values for the classifiers induced by the following three algorithms shrink very sharply: NBTree, Association Rules and BayesNet. They fail to generalize the model for a wide variety of datasets. For a biometric authentication system, these algorithms should be avoided because of their poor reliability.

Largely, the two types of prediction follow a general trend; it appears that the prediction of the signer by static hand gesture has a higher overall classification performance than the prediction of letters by static hand sign. The Kappa value, however, shows that identifying contents by gesture is more reliable than identifying signers in the system. That means that, overall, it is more difficult for the system to recognize a human's behavioral pattern in a hand gesture than to recognize the content of the gesture. This is further assured by the ROC AUC performance; predicting contents is always easier than predicting signers.

Interestingly, a Type-I error exists in the comparison: the FP (false positive) rate is lowest in the prediction of content by gesture. The prediction of the signer by gesture, in contrast, has a higher accuracy and better other performance factors than the prediction of content by gesture, but it has a higher (almost double) FP rate. A false positive rate that is higher in predicting the signer by gesture means the false alarm rate is high. A false positive occurs when the authentication system mistakenly flags a legitimate user as a wrong user. This may seem harmless compared with the opposite case, but false positives can be a nuisance in denying access to eligible users.

Table 1 Classification results of predicting signers' identities by static hand gesture images (signer prediction by static gesture)

Group          Algorithm         Accuracy%  Kappa   Mean-abs-error  TP Rate  FP Rate  Precision  Recall  F-Measure  ROC Area
Decision Tree  J48               78.125     0.5625  0.2316          0.781    0.219    0.791      0.781   0.779      0.725
               NBTree            90.625     0.8125  0.1216          0.906    0.094    0.908      0.906   0.906      0.973
               RandomForest      87.5       0.75    0.2138          0.875    0.125    0.881      0.875   0.875      0.941
Rule-based     DecisionTable     71.875     0.4375  0.327           0.719    0.281    0.72       0.719   0.718      0.775
               NNge              84.375     0.6875  0.1563          0.844    0.156    0.856      0.844   0.842      0.844
               AssociationRules  68.75      0.375   0.3125          0.688    0.313    0.75       0.688   0.667      0.688
Functions      Perceptron        93.75      0.875   0.086           0.938    0.063    0.944      0.938   0.937      0.98
               SVM               87.5       0.75    0.125           0.875    0.125    0.881      0.875   0.875      0.875
Bayes          BayesNet          87.5       0.75    0.1177          0.875    0.125    0.875      0.875   0.875      0.977
               NaiveBayes        87.5       0.75    0.1382          0.875    0.125    0.881      0.875   0.875      0.931

Conclusion

Biometrics is a scientific approach that involves recognizing people by measuring their physical and/or behavioral characteristics. In this paper, we proposed a novel biometric discipline that uses hand sign gestures as captured in static images while signing. The motivation for using hand signs for biometric authentication is their ease of use and the intrinsic behavioral characteristics in signing. Furthermore, signing can convey a secret message, which adds another level of secrecy to the authentication on top of the underlying hand patterns and hand movements. The full potential of hand sign biometrics in a spectrum of security applications is yet to be unleashed.

This paper serves as preliminary research investigating the possibilities of using hand signs for biometric authentication. Specifically, we rigorously tested two types of prediction from the perspective of an authentication system, over static hand gesture data, using ten popular machine learning algorithms. The two types of prediction are: (1) identifying the signer using a static hand gesture, and (2) recognizing the content of the hand gesture using its static image. We argued in the paper that low operational cost is emphasized in the proposed model, as
it relies only on a simple video camera without expensive scanning hardware. The image processing is designed to be lightweight too: simple histogram methods and directionality analysis are used in lieu of complex computational transforms.

In conclusion, the experiments showed that the results of our proposed multimodal biometric authentication system are promising overall. A maximum accuracy of 93.75% could be achieved by the artificial neural network in predicting signers' identities from static hand gestures. On-par accuracy was observed in predicting contents from static hand sign images too. In general, the extensive experiments over various performance factors showed that recognizing signers (behavioral patterns) is far more difficult than recognizing the hand sign contents (character recognition). It is believed that plenty of research niches and opportunities exist, both at the level of technical methods and of functional policies, in using hand sign data for biometric authentication. This paper contributes a pioneering investigation of this novel approach.

Table 2 Classification results of predicting hand gesture contents by static hand gesture images (hand sign content prediction by static gesture)

Group          Algorithm         Accuracy%  Kappa   Mean-abs-error  TP Rate  FP Rate  Precision  Recall  F-Measure  ROC Area
Decision Tree  J48               56.25      0.4167  0.2221          0.563    0.146    0.576      0.563   0.566      0.752
               NBTree            93.75      0.9167  0.0345          0.938    0.021    0.938      0.938   0.938      0.995
               RandomForest      78.125     0.7083  0.1953          0.781    0.073    0.776      0.781   0.777      0.942
Rule-based     DecisionTable     53.125     0.375   0.2898          0.531    0.156    0.499      0.531   0.49       0.824
               NNge              84.375     0.7917  0.0781          0.844    0.052    0.846      0.844   0.838      0.896
               AssociationRules  50.0       0.3333  0.25            0.5      0.167    0.333      0.5     0.375      0.667
Functions      Perceptron        93.75      0.9167  0.0538          0.938    0.021    0.95       0.938   0.937      0.964
               SVM               93.75      0.9167  0.0313          0.938    0.021    0.95       0.938   0.937      0.958
Bayes          BayesNet          93.75      0.9167  0.0303          0.938    0.021    0.938      0.938   0.938      0.995
               NaiveBayes        78.125     0.7083  0.1114          0.781    0.073    0.8        0.781   0.776      0.85
Competing interests

The authors declare that they have no competing interests.

Authors' contributions

SF initiated the original concept of using hand gestures as a biometric method. The experiments for validating the concepts via data mining were carried out under the supervision of SF and YZ. IF and IFJr offered ideas along the way and supported the work by enhancing the model as well as contributing to the mathematics of the analysis. SF drafted the manuscript. All authors read and approved the final manuscript.

Acknowledgement

The authors are thankful for the financial support from research grant no. MYRG152(Y3-L2)-FST11-ZY, offered by the University of Macau, RDAO. Special thanks go to Mr. Tang Ka Wai (Paul) and Ms. Cheng Yiming (Jacqueline), who are software engineering graduates of the University of Macau, for programming the image processing software and conducting the pattern recognition experiments.

Author details

1 Department of Computer and Information Science, University of Macau, Macau, SAR, China. 2 Faculty of Electrical Engineering and Computer Science, University of Maribor, Smetanova 17, 2000 Maribor, Slovenia.

Received: 31 July 2013. Accepted: 11 October 2013. Published: 30 October 2013.

References

1. The great baby signing debate. The British Psychological Society; 2008. http://www.thepsychologist.org.uk/archive/archive_home.cfm/volumeID_21-editionID_159-ArticleID_1330.
2. Schools encouraged to explore British sign language in the classroom. UK: Education Magazine. http://www.education-magazine.co.uk/Schools_Encouraged_to_Explore_British_Sign_Language_in_the_Classroom-a-2365.html.
3. Roizenblatt R, Schor P, Dante F, Roizenblatt J, Belfort R: Iris recognition as a biometric method after cataract surgery. Biomed Eng Online 2004, 3:2.
4. Fong S, Zhuang Y: Using Medical History Embedded in Biometrics Medical Card for User Identity Authentication: Privacy Preserving Authentication Model by Features Matching. J Biomed Biotechnol
2012, 2012:1–11. Article ID 403987.
5. Singh YN, Singh SK: Identifying Individuals Using Eigenbeat Features of Electrocardiogram. J Eng 2013:1–8. http://www.hindawi.com/journals/je/2013/539284/cta/.
6. Fong S: Using Hierarchical Time Series Clustering Algorithm and Wavelet Classifier for Biometric Voice Classification. J Biomed Biotechnol 2012, 2012:1–12. Article ID 215019.
7. Kumar A, Ravikanth CH: Personal authentication using finger knuckle surface. IEEE Transactions on Information Forensics and Security 2009, 4(1):98–110.
8. Kim MG, Moon HM, Chung YW, Pan SB: A Survey and Proposed Framework on the Soft Biometrics Technique for Human Identification in Intelligent Video Surveillance System. J Biomed Biotechnol 2012, 2012:1–7. Article ID 614146.
9. Bashir M, Scharfenberg G, Kempf J: Person authentication by handwriting in air using a biometric smart pen device. BIOSIG 2011, 191:219–226.
10. Banerjee SP, Woodard DL: Biometric authentication and identification using keystroke dynamics: a survey. J Pattern Recognition Res 2012, 7:116–139.
11. Wong SF, Cipolla R: Real-time adaptive hand motion recognition using a sparse Bayesian classifier. In Proceedings of the Computer Vision in Human-Computer Interaction, Lecture Notes in Computer Science 2005, 3766:170–179.
12. Wang GW, Zhang C, Zhuang J: An Application of Classifier Combination Methods in Hand Gesture Recognition. Mathematical Problems in Engineering; 2012. Article ID 346951, 17.
13. Garg P, Aggarwal N, Sofat S: Vision based hand gesture recognition. World Acad Sci Eng Technol 2009, 49:972–977.
14. Naidoo S, Omlin C, Glaser M: Vision-based static hand gesture recognition using support vector machines. 1998:88–94. http://www.docstoc.com/docs/19593197/Vision-Based-Static-Hand-Gesture-Recognition-Using-Support-Vector.
15. Ghotkar AS, Kharate GK: Vision based Real Time Hand Gesture Recognition Techniques for Human Computer Interaction. Int J Comput Appl 2013, 70(16):1–8. Published by Foundation of Computer Science, New York, USA.
16. Pradhan A, Ghose MK, Pradhan M: A hand gesture recognition using feature
extraction. Int J Curr Eng Technol 2012, 2(4):323–327.
17. Chen Q, Georganas ND, Petriu EM: Real-time vision-based hand gesture recognition using Haar-like features. Warsaw: Proceedings of the Instrumentation and Measurement Technology Conference; 2007:1–6.
18. Sathish G, Saravanan SV, Narmadha S, Maheswari SU: Personal authentication system using hand vein biometric. Int J Comput Technol Appl 2012, 3(1):383–391.
19. Nandini C, Ashwini C, Aparna M, Ramani N, Kini P, Sheeba K: Biometric authentication by dorsal hand vein pattern. International Journal of Engineering and Technology 2012, 2(5).
20. Gayathri S, Nigel KGJ, Prabakar S: Low cost hand vein authentication system on embedded Linux platform. Int J Innovative Technol Exploring Eng 2013, 2(4):138–141.
21. Mathivanan B, Palanisamy V, Selvarajan S: Multi dimensional hand geometry based biometric verification and recognition system. Int J Emerg Technol Adv Eng 2012, 2(7):348–354.
22. Fotak T, Koruga P, Baca M: Trends in hand geometry biometrics. Croatia: Proceedings of the Central European Conference on Information and Intelligent Systems; 2012:323–493.
23. Hasan H, Kareem SA: Static hand gesture recognition using neural networks. Artificial Intelligence Review; January 2012.
24. Eisenstein J, Davis R: Visual and Linguistic Information in Gesture Classification. State College, Pennsylvania, USA: Proceedings of the CMI 2004; 2004:1–8.
25. Tsalakanidou F, Malassiotis S, Strintzis MG: A 3D face and hand biometric system for robust user-friendly authentication. Pattern Recognition Lett 2007, 28:2238–2249.
26. Gutta S, Huang J, Imam IF, Wechsler H: Face and hand gesture recognition using hybrid classifiers. Killington, USA: Proceedings of the Second International Conference on Automatic Face and Gesture Recognition; 1996:164–169.
27. Pavešić N, Savič T, Ribarić S: Multimodal biometric authentication system based on hand features. Germany: From Data and Information Analysis to Knowledge
Engineering, Studies in Classification, Data Analysis, and Knowledge Organization; 2006:630–637.
28. Hashem HF: Adaptive Technique for Human Face Detection Using HSV Color Space and Neural Networks. Cairo, Egypt: Proceedings of the National Radio Science Conference 2009; 2009:1–7.
29. Liu ZQ: Scale space approach to directional analysis of images. Appl Optimization 1991, 30(11):1369–1373.
30. Hall MA: Correlation-based feature selection for machine learning. PhD thesis; Department of Computer Science, The University of Waikato, Hamilton, New Zealand; 1999.
31. Viera AJ, Garrett JM: Understanding interobserver agreement: the Kappa statistic. Family Medicine, Research Series, May 2005, 37(5):360–363.
32. Fong S, Lan K, Wong R: Classifying Human Voices By Using Hybrid SFX Time-series Pre-processing and Ensemble Feature Selection. J Biomed Biotechnol 2013, 2013:1–28. Article ID 720834.

doi:10.1186/1475-925X-12-111
Cite this article as: Fong et al.: A biometric authentication model using hand gesture images. BioMedical Engineering OnLine 2013, 12:111.