Introduction to Handwritten Signature Verification
Dave Fenton, University of Ottawa
SPOT presentation, University of Ottawa, 29 Oct 2004, p. 1/53
Handwritten signature verification

Presentation overview (altered to remove all signatures):
- Goal, applications and assumptions
- Basic concepts in biometrics
- Experimental setup
- Technical difficulties posed by HSV
- Past research and the current state of the art
- Overview of my research
Goal of HSV

To verify a person's identity based on the way in which he/she signs his/her name.

Two types of system:
- Offline systems use static features (the signature image)
- Online systems use dynamic features (the time series)

Written passwords are also under consideration.
Applications

Principal application: reduce fraud in financial transactions.
- Cannot rely on sales staff to visually verify signatures on credit card receipts
- Occasional acceptances of forgeries are allowable
- Rejections of valid signatures may irritate valuable customers

To date, used mostly for electronic signing of business documents (a hash function protects the document against alteration).
Document verification

- Apply a hash function to the document to generate a hash code
- If the signature is valid, encrypt the hash with the signer's private key
- The recipient decodes the received hash using the public key
- If the document has been altered, the hashes don't match
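The tamper-detection step above can be sketched with the standard library's hashlib; this minimal sketch shows only the hash-comparison part (in a real system the sent hash would additionally be encrypted with the signer's private key), and the document strings are invented for illustration.

```python
import hashlib

def hash_document(doc: bytes) -> str:
    # Generate a fixed-length digest of the document contents.
    return hashlib.sha256(doc).hexdigest()

# Signer computes the hash; in a real system this hash would then be
# encrypted with the signer's private key before transmission.
original = b"Pay to the order of Alice: $100"
sent_hash = hash_document(original)

# Recipient recomputes the hash over the document actually received.
received = b"Pay to the order of Alice: $900"  # altered in transit
assert hash_document(received) != sent_hash    # alteration is detected
assert hash_document(original) == sent_hash    # unaltered copy verifies
```

Because any change to the document changes the digest, a mismatch between the recomputed and decrypted hashes reveals the alteration.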
Applications

Secondary application: access security for buildings or mobile computing devices.
- For building security, it would not be tolerable to accept forgeries
- HSV would have to be combined with on-site security staff or other biometric/password/PIN systems
- Already used on some laptops and PDAs
Naïve assumptions

- A person signs his or her name consistently each time
- All signatures contain enough steady features to be reliably verified
- A forger cannot perfectly imitate the dynamic features of a signature
- All a user's passwords can be replaced by his/her signature
Example of consistency: static

[Figure: three genuine renderings of the written password "Prejunife", signed 21 July 2003, 2 Sept 2003 and 24 Sep 2003.]
Example of consistency: dynamic

[Figure: Y velocity (cm/s) versus normalized time for several genuine signatures, showing closely matching velocity profiles.]
Example of inconsistency: static

[Figure: three genuine renderings of the written password "Ingusions", signed 12 Jan 2004, 18 Mar 2004 and 27 Sep 2004.]
Example of inconsistency: dynamic

[Figure: Y velocity (cm/s) versus normalized time for several genuine signatures, showing inconsistent velocity profiles.]
Example of forger ability: static

[Figure: forged renderings of the written password "Prejunife".]
Example of forger ability: dynamic

[Figure: Y velocity (cm/s) versus normalized time for several forgery attempts.]
More realistic assumptions

- Most signers sign their names consistently
- Most signatures contain enough steady features to be reliably verified
- Most forgers cannot reproduce a signature well enough to defeat a good verifier
- It is more difficult to forge both the static and dynamic features of a signature than just the static features
Fallout from broken assumptions

- It may not be possible to verify all signatures reliably
- For any signature, there will probably exist a skilled forger who can forge it competently
- Serious consideration must be given to passwords:
  - confidential
  - easily replaced if the template is compromised
  - can exert some control over the length (quality of features)
  - can request the signer to write legibly
Handwritten signature verification

Presentation overview:
- Goal, applications and assumptions
- Basic concepts in biometrics
- Experimental setup
- Technical difficulties posed by HSV
- Past research and the current state of the art
- Overview of my research
Basics of biometrics

Physical v. behavioural biometrics:
- A physical biometric makes use of a fixed characteristic of the body (e.g. fingerprints, iris patterns, retina patterns, hand geometry, facial features). The most accurate methods are usually perceived as too intrusive.
- A behavioural biometric makes use of personal behaviours which are assumed to be almost invariant (e.g. voice, handwriting, typing, gait). Perceived as less intrusive, but less accurate than physical biometrics.
Two stages

1. Enrolment: a user's signature characteristics are learned from a small number of input samples. The resulting information is called the template. Typically, 3–5 signatures are used.
2. Verification or recognition: for verification, a candidate signature is compared to the template of a single signer. For recognition, the candidate signature must be compared against many templates.
FRR and FAR

Two error rates are specified:
- The False Rejection Rate (FRR) is the rate at which valid signatures are rejected.
- The False Acceptance Rate (FAR) is the rate at which forged signatures are accepted as valid.

In many cases, low FRR implies high FAR, and vice versa.

Current state of the art: FRR and FAR sum to 2–5%. Actual numbers may be even worse!
ROC curves

Most verifiers have a single numerical output. If the output level is above a decision threshold, the signature is accepted as valid; otherwise it is rejected.

In this case, the FRR and FAR can both be plotted against the decision threshold in a receiver operating characteristic (ROC) curve.
Example ROC curves

[Figure: FAR and FRR plotted as error rates (0 to 1) against the decision threshold (0 to 5).]
Example ROC curves

[Figure: the same FAR and FRR curves, with their crossing point marked as the Equal Error Rate.]
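The curves above can be computed directly from verifier output scores; a minimal sketch, where the genuine and forgery scores are invented for illustration and a signature is accepted when its score exceeds the threshold:

```python
import numpy as np

# Hypothetical verifier scores: higher means "more likely genuine".
genuine_scores = np.array([3.1, 2.8, 3.5, 2.9, 3.3, 2.5, 3.0])
forgery_scores = np.array([1.2, 2.6, 1.8, 0.9, 2.1, 1.5, 2.9])

thresholds = np.linspace(0, 5, 501)
# FRR: fraction of genuine signatures rejected (score at or below threshold).
frr = np.array([(genuine_scores <= t).mean() for t in thresholds])
# FAR: fraction of forgeries accepted (score above threshold).
far = np.array([(forgery_scores > t).mean() for t in thresholds])

# Equal Error Rate: threshold where FRR and FAR are closest.
eer_idx = np.argmin(np.abs(frr - far))
print(f"EER = {(frr[eer_idx] + far[eer_idx]) / 2:.3f} "
      f"at threshold {thresholds[eer_idx]:.2f}")
```

Raising the threshold trades false acceptances for false rejections, which is why the two curves move in opposite directions.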
Types of forgery

- Random: a random forgery is simply another person's valid signature.
- Simple: the forger spells the name correctly, but writes in his own style.
- Skilled, or knowledgeable: the forger tries to fully reproduce all the shapes and dynamics of the original signature. In this study, forgers are shown MPEG movies of the original signature.

The training set consists of a few valid signatures and many random forgeries. After training, the verifier is tested against all three types of forgery.
Genuine samples

[Figure: three genuine renderings of the written password "Taximotels", signed 19 Sep 2003, 16 Oct 2003 and 6 Nov 2003.]
Forgeries

[Figure: random, simple and knowledgeable forgeries of the written password "Taximotels".]
Motivations for research

- Despite company claims, error rates are high and need improvement
- For computing devices with pen inputs (PDAs, tablet PCs), automatic signature verification is a sensible technology
- Signatures are already a widely accepted means of identification
Handwritten signature verification

Presentation overview:
- Goal, applications and assumptions
- Basic concepts in biometrics
- Experimental setup
- Technical difficulties posed by HSV
- Past research and the current state of the art
- Overview of my research
Data collection

- Signatures collected using an Interlink Electronics epad-ink (100 Hz sampling frequency)
- Captures X & Y position, pressure and time stamp
- Data collection program written in C++
- Data protected by PGPdisk
Data collection

Two levels of volunteer:
- Level 1: one signing session
- Level 2: three signing sessions

Each volunteer contributes:
- 10 samples of genuine signature
- 10 samples of genuine password
- Simple forgeries of 2 signatures and 2 passwords
- Knowledgeable forgeries of a signature and a password
Enrolment

Acquire valid signatures:
- Operational systems typically collect 3–5 genuine signatures; academic systems up to 20
- Some use warping and interpolation schemes to create extra valid signatures
Enrolment

Preprocess:
- Concatenate strokes into a single time sequence
- Render invariant to:
  - translation: subtract X & Y means
  - rotation: force the linear regression line to be horizontal
  - scale: may normalize based on box size or signal power
- Normalization of duration is not carried out at this stage
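The translation and rotation steps above can be sketched as follows; the function name and the example trace are invented for illustration, and scale and duration normalization are left out as on the slide.

```python
import numpy as np

def normalize_signature(x: np.ndarray, y: np.ndarray):
    """Translation and rotation normalization, as a minimal sketch.

    x, y: concatenated pen-position samples of one signature.
    """
    # Translation invariance: subtract the X and Y means.
    x = x - x.mean()
    y = y - y.mean()

    # Rotation invariance: fit y = m*x + b by least squares, then
    # rotate so the regression line becomes horizontal.
    m = np.polyfit(x, y, 1)[0]
    theta = np.arctan(m)
    c, s = np.cos(-theta), np.sin(-theta)
    return c * x - s * y, s * x + c * y

# Example: a tilted straight-line pen trace becomes horizontal.
t = np.linspace(0, 1, 50)
xr, yr = normalize_signature(t, 0.5 * t + 2.0)
print(np.allclose(yr, 0.0, atol=1e-9))  # True: offset and slope removed
```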
Enrolment

Extract features (may extract hundreds). Examples:
- Function features: time series such as velocity or acceleration
- Dynamic discrete features: signing time, number of strokes, pen-down distance, max velocity, mean pressure, time to write longest stroke
- Static discrete features: bounding box, slant
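A few of the dynamic discrete features named above can be computed directly from the sampled pen trace; a minimal sketch, where the function name and the tiny synthetic trace are invented for illustration:

```python
import numpy as np

def discrete_features(x, y, p, t):
    """Sketch of a few dynamic discrete features.

    x, y: pen positions; p: pressure samples; t: time stamps (seconds).
    """
    dt = np.diff(t)
    dist = np.hypot(np.diff(x), np.diff(y))  # distance per sample step
    velocity = dist / dt
    return {
        "signing_time": t[-1] - t[0],
        "pen_down_distance": dist.sum(),
        "max_velocity": velocity.max(),
        "mean_pressure": p.mean(),
    }

# Example on a tiny synthetic pen trace (three samples).
feats = discrete_features(np.array([0.0, 3.0, 3.0]),
                          np.array([0.0, 4.0, 4.0]),
                          np.array([0.8, 1.0, 1.2]),
                          np.array([0.0, 1.0, 2.0]))
print(feats)
```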
Enrolment

Select features:
- With function features, typically use the same features for each signer
- Not all discrete features are equally informative
- The cost used for feature selection is usually the error rate; classifier dependent
- Sequential forward/backward search
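Sequential forward search, mentioned above, can be sketched as a greedy loop; the cost function here is a hypothetical stand-in for the classifier error rate, invented for illustration.

```python
def sequential_forward_search(features, cost):
    """Greedy sequential forward selection, as a minimal sketch.

    features: list of candidate feature names.
    cost: function mapping a feature subset (tuple) to an error rate.
    At each step, add the feature that lowers the cost the most;
    stop when no addition improves it.
    """
    selected = []
    best = cost(tuple(selected))
    while True:
        candidates = [f for f in features if f not in selected]
        if not candidates:
            break
        new_cost, new_feat = min((cost(tuple(selected + [f])), f)
                                 for f in candidates)
        if new_cost >= best:
            break
        selected.append(new_feat)
        best = new_cost
    return selected, best

# Hypothetical cost: features "a" and "b" reduce the error rate,
# "c" does not, and each kept feature adds a small complexity penalty.
cost = lambda s: 1.0 - 0.4 * ("a" in s) - 0.3 * ("b" in s) + 0.05 * len(s)
selected, best = sequential_forward_search(["a", "b", "c"], cost)
print(selected)  # ['a', 'b']
```

Backward search works the same way in reverse, starting from the full feature set and greedily removing features.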
Enrolment

Create template:
- Best performance so far: keep the raw data of multiple signatures
- This is bad practice from a security perspective
- The template may also include a list of features to keep, the best classifier to use, and decision thresholds
Verification

- Initial steps of verification are the same as enrolment
- Only the selected features need to be extracted
Verification

- Template ID is acquired at the same time as the candidate signature
- Designated by swipe card, PIN, etc.
- The template is usually stored in a central database, but may also be held on the swipe card
Verification

- Many different classifiers have been tried
- May have to combine results from multiple classifiers
- Will be covered in more detail later
Handwritten signature verification

Presentation overview:
- Goal, applications and assumptions
- Basic concepts in biometrics
- Experimental setup
- Technical difficulties posed by HSV
- Past research and the current state of the art
- Overview of my research
Technical difficulties

- The physiology of handwriting is not well understood
- Signers are not motivated to sign in a careful, invariant manner
- Training sets are sparse and badly imbalanced (few valid signatures)
- No knowledgeable forgeries are available for training
- Short, variable signatures are often easily forged
Technical difficulties

No standard database of signatures and forgeries:
- every researcher uses a different set of amateur forgeries
- some researchers test only against random forgeries

FAR is ill-defined; a low error rate may reflect the forgers' lack of skill rather than the verifier's ability.
Technical work-arounds

- Disqualify certain signers during enrolment
- Allow multiple signing attempts
- Allow a probationary period with relaxed acceptance criteria (collect more training signatures)
- Use passwords with a certain minimum length
Handwritten signature verification

Presentation overview:
- Goal, applications and assumptions
- Basic concepts in biometrics
- Experimental setup
- Technical difficulties posed by HSV
- Past research and the current state of the art
- Overview of my research
Past research

- Research has been underway for several decades; peak activity from the mid-1980s to the mid-1990s
- Early ideas are still good performers because they have fewer control parameters
- Between 30 and 60% of forgeries can be detected by a basic time verifier
Time verifier

[Figure: histograms of total signing time (seconds), with the regions accepted and rejected by the time verifier marked.
- Genuine signatures: sample mean = 3.58, sample deviation = 1.53
- Simple forgeries: sample mean = 4.46, sample deviation = 1.60
- Knowledgeable forgeries: sample mean = 6.11, sample deviation = 2.74]
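A basic time verifier of the kind shown above can be sketched as follows; the accept rule (within k sample deviations of the enrolment mean), the choice k = 2.0, and the enrolment times are all assumptions for illustration, not the slide's actual verifier.

```python
import statistics

def time_verifier(enrol_times, candidate_time, k=2.0):
    """Accept a candidate signature when its total signing time lies
    within k sample deviations of the mean of the enrolment times.
    """
    mean = statistics.mean(enrol_times)
    dev = statistics.stdev(enrol_times)
    return abs(candidate_time - mean) <= k * dev

# Hypothetical enrolment times (seconds) for one signer.
enrol = [3.1, 3.9, 3.4, 4.1, 3.5]
print(time_verifier(enrol, 3.7))  # True: plausible genuine time
print(time_verifier(enrol, 9.0))  # False: far slower, likely a forgery
```

This is consistent with the histograms above: forgeries, especially knowledgeable ones, tend to take longer than genuine signatures, so a simple duration check already catches a sizeable fraction of them.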
Classifiers

- Euclidean distance
- Weighted linear metrics
- Regional correlation
- Dynamic time warping (DTW)
- Neural networks
- Hidden Markov models (HMMs)
- Bayesian belief nets
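Of the classifiers above, dynamic time warping is simple enough to sketch in full; this is the classic textbook recurrence on 1-D series (e.g. velocity profiles), without the windowing constraints a production verifier would add.

```python
import numpy as np

def dtw_distance(a: np.ndarray, b: np.ndarray) -> float:
    """Classic dynamic time warping distance between two 1-D series."""
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            # Extend the cheapest of the three allowed warping steps.
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return float(D[n, m])

# The same shape at different sampling rates still aligns exactly.
a = np.array([0.0, 1.0, 2.0, 1.0, 0.0])
b = np.array([0.0, 1.0, 1.0, 2.0, 1.0, 0.0])  # one sample repeated
print(dtw_distance(a, b))  # 0.0: the warping absorbs the extra sample
```

This tolerance to local speed variation is exactly what makes DTW attractive for signatures, where the same signer never writes at precisely the same pace twice.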
Features used

- Function features (usually position, velocity and pressure)
- Vectors of discrete features
- Features calculated within a sliding window (e.g. centre of mass, torque)
- Wavelet coefficients
- LPC coefficients
- Walsh transform of the pen-up/pen-down signal
- Pre-defined strokes (HMMs)
Classifier issues

- Time alignment is important if using function features
- With few enrolment signatures, statistical estimates are unreliable
- Lack of training data is a severe problem for learning machines
- Data imbalance is also problematic
State of the art

In academic studies, more complicated verifiers often achieve better results than simple verifiers.

However, in field use, simple verifiers like DTW often outperform everything else:
- few adjustable parameters
- with normalization, a single decision threshold can be set for all signers

Best verifier in a public contest: DTW with a 5-signature template.
State of the art

Most sophisticated verifier: Plamondon's Sign@metric solution
- discrete parametric verifier
- physiological delta-lognormal verifier
- static feature verifier
- claimed performance: error rate of 0.0003% among 86,500 people!!

Other companies that did not take part in the public contest: CIC, Cyber-SIGN, SoftPro, Wondernet.
Handwritten signature verification

Presentation overview:
- Goal, applications and assumptions
- Basic concepts in biometrics
- Experimental setup
- Technical difficulties posed by HSV
- Past research and the current state of the art
- Overview of my research
My research

- Classifier comparison (DTW, NN, SVM, weighted distance metric)
- Techniques to mitigate the imbalance of training data
- Re-open the debate on the use of passwords
- Data analysis across signing sessions
- A feature selection algorithm that gives preferential treatment to features that are most likely to be stable
- Use of support vector machines
Questions?

To volunteer, please e-mail d.fenton@ieee.org