ORIGINAL ARTICLE
Year : 2017  |  Volume : 3  |  Issue : 2  |  Page : 49-54

Development of quality control system for fingerprint comparison processes


1 Institute of Evidence Law and Forensic Science, China University of Political Science and Law, Beijing, China
2 Department of Shanghai Police, Shanghai Key Laboratory of Criminal Scene Evidence, Shanghai, China
3 Department of Criminology, University of Leicester, Leicester, UK
4 College of Science, Northeastern University, Shenyang, China
5 Department of Forensic and Investigative Science, West Virginia University, Morgantown, WV, USA

Date of Web Publication: 30-Jun-2017

Correspondence Address:
Shiquan Liu
Institute of Evidence Law and Forensic Science, China University of Political Science and Law, Beijing
China

Source of Support: None, Conflict of Interest: None


DOI: 10.4103/jfsm.jfsm_56_17

Abstract

Fingerprint evidence has played an important role in investigation, prosecution, and trial because of the belief in its uniqueness and permanence. In recent years, however, the science behind the process of fingerprint comparison has been questioned. Research questions have focused on the opacity of the comparison process, subjective judgments, the lack of universal standards, the absence of an error rate attached to final conclusions, and poor foundational research data. Facing these questions, this paper proposes a quality control system (QCS) for fingerprint comparison processes. The QCS is based on the use of software (PiAnoS) and its technological features and provides a data management model that increases the transparency and quality of fingerprint comparison processes.

Keywords: Fingerprint comparisons, quality control systems, standards in forensics


How to cite this article:
Liu S, Mi Z, Gonçalves FV, Wu J, Ayers K. Development of quality control system for fingerprint comparison processes. J Forensic Sci Med 2017;3:49-54

How to cite this URL:
Liu S, Mi Z, Gonçalves FV, Wu J, Ayers K. Development of quality control system for fingerprint comparison processes. J Forensic Sci Med [serial online] 2017 [cited 2017 Oct 23];3:49-54. Available from: http://www.jfsmonline.com/text.asp?2017/3/2/49/209292


Introduction


Conventional procedures for comparing fingerprints worldwide are based on different sources. Specific manuals, such as the Best Practice Manual for Fingerprint Examination,[1] have provided fingerprint examiners and forensic agencies with guidelines for best practice in this field. Official reports, such as those of the National Academy of Sciences[2] and the President's Council of Advisors on Science and Technology,[3] have also begun to examine the handicaps that the forensic sciences may have been facing recently. Since the first high-profile cases[4] in which miscarriages of justice arose from misguided decisions in the processes carried out by examiners, research has focused on errors in forensics[5] and has suggested, among other recommendations, that fingerprint examiners follow a strict procedure, the ACE-V process. Briefly, based on the previously mentioned manuals,[1] analysis is an initial information-gathering phase in which the examiner assesses the unknown latent mark to identify the quality and quantity of discriminating details, together with the known prints, retrieved either from an automated system (e.g., the Automated Fingerprint Identification System [AFIS]) or taken from potential suspects. Comparison refers to the observation of the friction ridge details within the fingermarks collected at a crime scene to determine agreement or disagreement with the details of the known fingerprint. At the evaluation phase, the examiner weighs the agreement or disagreement of the information observed during analysis and comparison and forms a conclusion. During the different phases, fingerprint examiners carry out different assessments and judgments following the specific requirements of their agency.[1] Conventionally, a quality control system (QCS) for fingerprint comparisons depends mainly on human judgment,[6] also known as the verification phase. Recent research has monitored the comparison and verification phases to provide suggestions on errors during these two phases,[7-10] although a common comment for future studies is the need to focus more on QCSs and their validity.[11]

Currently, scientific definitions of standards for QCSs are provided by international bodies such as the International Organization for Standardization, which works to standardize terminology and quality management practices for worldwide application.[12] To add to the research on this topic, the present paper proposes a scientifically based QCS developed with PiAnoS (an open-source software package developed by the Université de Lausanne, available at http://ips-labs.unil.ch) and statistical tools, applied on site with Chinese forensic bureaus. The suggested QCS is structured in four sections in which every step can be monitored, reducing the likelihood of errors occurring or, at least, ensuring that they are detected once they happen.


Structure of Quality Control System


In this section, the structure of the proposed QCS, as used with the Chinese bureaus, is presented. The QCS offers a new perspective on monitoring fingerprint comparison processes through a multi-focus design with a four-step structure in which fingerprint bureaus can manage the functionality and validity of the whole process [Figure 1]. In line with previous research,[5] to control the quality of each phase of the ACE-V process, bureaus should monitor the whole process simultaneously. As shown in [Figure 2], all the latent print samples and data (known prints) from the identification process are gathered into the database through the software. The quality of all data is checked throughout the QCS, providing quality control of the identification process in real time.
Figure 1: Framework of fingerprint identification
Figure 2: Quality control system design
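As a rough illustration of how these four subsystems fit together, the following Python sketch chains placeholder stages in the order shown in [Figure 2]; the stage functions and field names are assumptions for illustration only, not the authors' implementation.

```python
# Schematic sketch (an assumption, not the authors' code) of the four QCS
# subsystems chained in order: fingerprint image evaluation -> data storage ->
# data analysis -> result evaluation, mirroring Figure 2.
from typing import Callable, Dict, List

Stage = Callable[[Dict], Dict]

def run_qcs_pipeline(case: Dict, stages: List[Stage]) -> Dict:
    """Pass one case record through each QCS stage in order."""
    for stage in stages:
        case = stage(case)
    return case

# Placeholder stages; real ones would wrap ULW scoring, PiAnoS exports, etc.
def image_evaluation(case: Dict) -> Dict:
    case["quality_score"] = 0.8          # preassessment score of the mark
    return case

def data_storage(case: Dict) -> Dict:
    case["stored"] = True                # persisted to the text/Excel store
    return case

def data_analysis(case: Dict) -> Dict:
    case["reportable"] = case["quality_score"] > 0.5
    return case

def result_evaluation(case: Dict) -> Dict:
    case["needs_review"] = not case["reportable"]
    return case

print(run_qcs_pipeline({"case_id": "demo"},
                       [image_evaluation, data_storage,
                        data_analysis, result_evaluation]))
```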


Fingerprint image evaluation system

As shown in [Figure 1], the model includes a fingerprint image evaluation system, which is split into two phases: first, examiners map the location of the minutiae and, second, they assess their quality. Software such as the Universal Latent Workstation (ULW) (provided by the FBI to the authors) can be used to calculate the quality scores of latent prints and known prints [Figure 3] and [Figure 4].
Figure 3: Universal Latent Workstation quality map distribution of 4 marks (courtesy of Professor Christophe Champod, Université de Lausanne)
Figure 4: Expert quality map distribution of 4 marks


Based on the above factors, examiners can quantify the proportion of high-quality area and the proportion of the area occupied by minutiae. The specific calculation is as follows: QR is the ratio between the high-quality area and the low-quality area; the larger QR is, the higher the preassessment score will be. To improve the flexibility of the model, the ratio is assumed to be linear with the score. As the numbers of minutiae in the high-quality area (NM1) and the medium-quality area (NM2) increase, the score gradually increases; when the number of minutiae exceeds 12, the score tends to increase more slowly. Based on this understanding of fingerprint identification, the fingerprint image evaluation model is described by the following linear additive equation:

Score = μ + β × QR + m1(NM1) + m2(NM2) + ε

This model is an initial exploration of fingerprint value evaluation. Examiners can use the ULW software to obtain n sets of test data, allowing professionals to estimate the model parameters and the nonparametric function components.
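As a rough illustration of how the parameters of such a model could be estimated from test data, the following Python sketch fits the linear additive equation by ordinary least squares; the saturating transform standing in for the nonparametric components m1 and m2, and the training data, are assumptions for illustration, not the authors' procedure.

```python
# Illustrative sketch (not the authors' implementation) of fitting
#   Score = mu + beta*QR + m1(NM1) + m2(NM2) + eps.
# The saturating transform for the minutiae terms (scores level off above
# ~12 minutiae, as described in the text) is an assumption.
import numpy as np

def minutiae_basis(nm, cap=12.0):
    """Saturating feature: grows roughly linearly, flattens past `cap`."""
    nm = np.asarray(nm, dtype=float)
    return np.minimum(nm, cap) + 0.25 * np.maximum(nm - cap, 0.0)

def fit_score_model(qr, nm1, nm2, scores):
    """Estimate (mu, beta, m1_coef, m2_coef) by ordinary least squares."""
    X = np.column_stack([
        np.ones_like(qr),          # intercept mu
        qr,                        # quality-area ratio QR
        minutiae_basis(nm1),       # high-quality minutiae term m1(NM1)
        minutiae_basis(nm2),       # medium-quality minutiae term m2(NM2)
    ])
    coef, *_ = np.linalg.lstsq(X, scores, rcond=None)
    return coef

def predict_score(coef, qr, nm1, nm2):
    mu, beta, c1, c2 = coef
    return mu + beta * qr + c1 * minutiae_basis(nm1) + c2 * minutiae_basis(nm2)

# Example with made-up training data (n sets of annotated marks):
qr     = np.array([0.3, 0.8, 1.5, 2.0, 0.5])
nm1    = np.array([4,   9,  14,  16,   6])
nm2    = np.array([3,   5,   6,   8,   4])
scores = np.array([35., 60., 80., 88., 45.])

coef = fit_score_model(qr, nm1, nm2, scores)
print("estimated coefficients:", np.round(coef, 3))
print("predicted score for a new mark:", round(predict_score(coef, 1.2, 11, 5), 1))
```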

Data storage system

The data storage system (DSS) is built on fingerprint comparison software (e.g., PiAnoS), translated into Chinese. All the data from the identification process are stored in a text document, which is then converted into an MS Excel spreadsheet. Professionals can also use other data storage software, such as IBM SPSS Statistics (Statistical Package for the Social Sciences).
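A minimal sketch of this conversion step, assuming a tab-separated text export with hypothetical column names (PiAnoS's actual export format may differ):

```python
# Illustrative sketch of the DSS step: converting a delimited text export of
# comparison annotations into a CSV file that Excel or SPSS can open. The
# tab-separated layout and column names are assumptions.
import csv

FIELDS = ["examiner_id", "case_id", "phase", "minutiae_count",
          "value_judgment", "conclusion"]

def text_export_to_csv(txt_path: str, csv_path: str) -> int:
    """Read tab-separated records and rewrite them as a CSV table."""
    rows = 0
    with open(txt_path, encoding="utf-8") as src, \
         open(csv_path, "w", newline="", encoding="utf-8") as dst:
        writer = csv.DictWriter(dst, fieldnames=FIELDS)
        writer.writeheader()
        for line in src:
            parts = line.rstrip("\n").split("\t")
            if len(parts) != len(FIELDS):
                continue  # skip malformed lines rather than fail the batch
            writer.writerow(dict(zip(FIELDS, parts)))
            rows += 1
    return rows

# Example: text_export_to_csv("pianos_export.txt", "comparisons.csv")
```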

Data analysis system

The data analysis system (DAS) receives different kinds of information from the previous step (the DSS), such as basic personal information about the examiners, minutiae numbers, value judgments, distortion assessments, conclusions, and discussion. According to the requirements of each bureau, the DAS decides which data should be accepted for reporting.
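A minimal sketch of a bureau-specific acceptance rule in the DAS; the thresholds and field names below are hypothetical, not taken from the paper:

```python
# Illustrative DAS filtering rule: keep only records that satisfy a bureau's
# reporting requirements (the thresholds here are placeholders).
from typing import Iterable, List

def select_reportable(records: Iterable[dict],
                      min_minutiae: int = 8,
                      allowed_conclusions=("identification", "exclusion",
                                           "inconclusive")) -> List[dict]:
    """Return the records a bureau accepts for its report."""
    kept = []
    for rec in records:
        if int(rec.get("minutiae_count", 0)) < min_minutiae:
            continue  # below the bureau's minutiae requirement
        if rec.get("conclusion") not in allowed_conclusions:
            continue  # conclusion category not recognized by the bureau
        kept.append(rec)
    return kept

# Example:
sample = [{"examiner_id": "E01", "minutiae_count": "12", "conclusion": "identification"},
          {"examiner_id": "E02", "minutiae_count": "5", "conclusion": "inconclusive"}]
print(select_reportable(sample))  # only the first record passes
```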

Result evaluation system

The main role of the result evaluation system (RES) is to evaluate the information from the DAS and its associated conclusions. During the decision-making process, examiners are liable to commit two types of errors: Type I, or false negatives (an identification that is rejected), and Type II, or false positives (an exclusion that is reported as an identification). In the process of fingerprint identification, the weight of committing these two types of errors is not the same, as reflected in Wald's decision theory [Table 1].
Table 1: Wald's decision theory


In the process of identification, inconclusive decisions are the most conservative and least risky decisions.[13] According to Wald's decision theory, if the minimax decision for the problem is determined from the inconclusive value, the decision should be an exclusion, with the minimax decision serving as a conservative choice that reduces the risk of misguided conclusions. The evaluation system uses data collected from the fingerprint identification process [Figure 5] to calculate the score.
Figure 5: Judgment information distribution
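To make the minimax reasoning concrete, the following sketch applies a worst-case rule to a placeholder loss table; the loss values are assumptions (Table 1 is not reproduced here), chosen only to show how the rule favors conservative decisions.

```python
# Illustrative minimax decision sketch in the spirit of Wald's decision theory.
# The loss values are placeholders; the point is only to show how a minimax
# rule picks the decision whose worst-case loss is smallest.
LOSSES = {
    # decision:        (loss if prints truly match, loss if they truly do not)
    "identification":  (0.0, 100.0),   # false positive penalized heavily
    "exclusion":       (10.0, 0.0),    # false negative penalized less
    "inconclusive":    (2.0, 2.0),     # conservative option, small fixed loss
}

def minimax_decision(losses: dict) -> str:
    """Choose the decision minimizing the maximum (worst-case) loss."""
    return min(losses, key=lambda d: max(losses[d]))

print(minimax_decision(LOSSES))  # -> 'inconclusive' under these placeholder losses
```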


Each organization is represented by a 10-dimensional vector consisting of the evaluation of the fingerprint value and its conclusion. The scores are expressed as follows:



Here, ai denotes the weight: the higher the difficulty of assessing the fingerprint, the greater the weight assigned. n1 is the number of matching fingerprints, n2 is the number of close non-matching fingerprints, and N = n1 + n2 is the total number of finger marks. The main purpose of the RES is to evaluate differences between results for the same comparison and to distinguish the scores of the same person or organization within the same case.
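Since the score expression itself is not reproduced above, the following sketch shows only one plausible weighted aggregation consistent with the stated definitions (difficulty weights ai, n1 matches, n2 close non-matches); it is an assumption for illustration, not the paper's formula.

```python
# Hypothetical illustration only: the paper's actual score expression is not
# reproduced here, so this shows one plausible difficulty-weighted aggregation
# over N = n1 + n2 trials.
def weighted_agreement_score(a, correct):
    """a[i]: difficulty weight of trial i; correct[i]: 1 if the conclusion on
    trial i agreed with ground truth, else 0. Returns a score in [0, 1]."""
    assert len(a) == len(correct) and len(a) > 0
    return sum(w * c for w, c in zip(a, correct)) / sum(a)

# Example: three trials (n1 = 2 matches, n2 = 1 close non-match, N = 3), with
# the close non-match weighted as more difficult.
a = [1.0, 1.0, 2.0]
correct = [1, 1, 0]          # the harder close non-match was concluded wrongly
print(weighted_agreement_score(a, correct))  # -> 0.5
```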


Implementation and Experiments


Aiming to assess the impact of the suggested system, this paper presents a case study developed to evaluate its capabilities and validity within a national proficiency test involving Chinese forensic bureaus.

Methodology

The study was conducted as part of a fingerprint identification proficiency test within the National Forensic Science Centre to show how the QCS can be used to digitally monitor and manage the quality of the comparison processes used by the bureaus.

Sample

Nineteen fingerprint bureaus within forensic science laboratories were selected to perform the experiment. All the laboratories were accredited by the China National Accreditation Service for Conformity Assessment at the time the experiment took place.

Materials

Three trials were provided for comparison [Figure 6]. Two trials were matches; the third (at the bottom of the image) was a close non-match. The three trials and their correct conclusions were provided by a certified latent print examiner who had used them in previous research.[14]
Figure 6: Trials


Bureaus performed the three assessments as real cases within their workflow, using all the technology available for their work. However, all of the bureaus were aware that the three cases were part of an experiment. Bureaus were not aware of the correct conclusion for each case.


Results


All the information was collected by the DSS, from the minutiae annotation results [Table 2] to the value judgments and conclusions [Table 3] and [Graph 1]. Fingerprint examiners were also asked to annotate minutiae so that differences between examiners and agencies during the test could be seen [Figure 7] and [Figure 8].
Table 2: Minutiae annotation results
Table 3: Conclusion results

Figure 7: Minutiae annotation and quality map from
Figure 8: Consensus of minutiae annotation and quality map in the test (numbers represent the minutiae location)



Discussion


From the results, we can see that each agency changed the minutiae numbers between the analysis and comparison phases, and most agencies annotated more minutiae in the second phase [Figure 7] and [Figure 8]. Agencies and their experts need training in how to make annotations and how to keep their minutiae annotation consistent in future analyses. We also found false identification errors (false positives): some agencies failed to make the correct distinction only on the close non-match trial.


Summary


In this paper, an approach to managing the quality of fingerprint comparison processes, the QCS, is shown to be an effective way to improve quality management for the following four reasons. First, once implemented, it is not a complex system, because its operation is similar to that of an AFIS, allowing both fingerprint examiners and managers to understand the system itself. Second, reliable data storage ensures complete collection of data from the fingerprint comparison processes. Third, data analysis using statistical software ensures the accuracy of the results. Finally, evaluation reports from the QCS can detect problems, provide recommendations, and improve the management level of fingerprint agencies.

In summary, we suggest that fingerprint bureaus set up scientific procedures including an independent inspection phase, data recording and analysis systems, report review mechanisms, and features for additional report content analysis. Internal training, as part of continuing professional development, should be implemented alongside these features. Future studies should also examine the process of comparing close non-matches, distortion levels, and the scientific evaluation of fingerprint evidence in courts.

Acknowledgment

This paper was supported by award number 71371188 (on an evidence evaluation model and its application: a study based on decision-making in management and court), funded by the China National Natural Science Fund, and by award number 2016XCWZK08, funded by the Shanghai Key Laboratory of Crime Scene Evidence.

Financial support and sponsorship

Nil.

Conflicts of interest

There are no conflicts of interest.



 
References

1. European Network of Forensic Science Institutes (ENFSI). Best Practice Manual for Fingerprint Examination (ENFSI-BPM-FIN-01); 2015.
2. Committee on Identifying the Needs of the Forensic Sciences Community, National Research Council. Strengthening Forensic Science in the United States: A Path Forward. National Academy of Sciences; 2009.
3. Executive Office of the President, President's Council of Advisors on Science and Technology. Report to the President: Forensic Science in Criminal Courts: Ensuring Scientific Validity of Feature-Comparison Methods; 2016.
4. Office of the Inspector General. A Review of the FBI's Handling of the Brandon Mayfield Case. Department of Justice; 2006.
5. Dror IE. Why experts make errors. J Forensic Identification 2006;4:600-56.
6. Forensic Science Regulator. Codes of Practice and Conduct; 2015.
7. Langenburg G, Champod C, Wertheim P. Testing for potential contextual bias effects during the verification stage of the ACE-V methodology when conducting fingerprint comparisons. J Forensic Sci 2009;54:571-82.
8. Ulery BT, Hicklin RA, Buscaglia J, Roberts MA. Repeatability and reproducibility of decisions by latent fingerprint examiners. PLoS One 2012;7:e32800.
9. Dror IE, Charlton D, Péron AE. Contextual information renders experts vulnerable to making erroneous identifications. Forensic Sci Int 2006;156:74-8.
10. Langenburg G. A performance study of the ACE-V process: A pilot study to measure the accuracy, precision, reproducibility, and biasability of conclusions resulting from the ACE-V process. J Forensic Identification 2009;59:219-57.
11. Kassin SM, Dror IE, Kukucka J. The forensic confirmation bias: Problems, perspectives, and proposed solutions. J Appl Res Mem Cogn 2013;2:42-52.
12. Expert Working Group on Human Factors in Latent Print Analysis. Latent Print Examination and Human Factors: Improving the Practice through a Systems Approach; February 2012.
13. Thompson MB, Tangen JM, McCarthy DJ. Expertise in fingerprint identification. J Forensic Sci 2013;58:1519-30.
14. Neumann C, Champod C, Yoo M, Genessay T, Langenburg G. Improving the Understanding and the Reliability of the Concept of "Sufficiency" in Friction Ridge Examination. Washington, DC: U.S. Department of Justice; 2013.

