Published 2004 | Version v1
Publication

Computerised anthropomorphometric analysis of images: case report.

Description

The personal identification of living subjects from video-recorded images can occasionally be necessary, particularly in the following circumstances: (1) the need to identify unknown subjects by comparing two-dimensional images of someone of known identity with the subject; (2) the need to identify subjects captured in photographs or recorded on video camera through comparison with individuals of known identity.

METHODS AND APPARATUS: The final aim of our research was to analyse a video clip of a bank robbery and to determine whether one of the subjects filmed could be identified as one of the suspects. Following the correct methodology for personal identification, the original videotape of the robbery carried out in the bank was first analysed to study the characteristics of the criminal action and to pinpoint the best scenes for anthropomorphometric analysis. The scene of the crime was then reconstructed by bringing the suspect back to the bank where the robbery took place; he was filmed with the same closed-circuit video cameras and made to assume positions as close as possible to those of the bank robber to be identified.

RESULTS AND CONCLUSIONS: Taking frame no. 17, points of comparable similarity were identified on the face and right ear of the perpetrator of the crime, and the same points of similarity were identified on the face of the suspect: right and left eyebrows, right and left eyes, "glabella", nose, mouth, chin, fold between nose and upper lip, right ear, helix, tragus, "fossetta" (fossa), "conca" (concha) and lobule. After careful comparative morphometric computer analysis, it was concluded that none of the 17 points of similarity showed the same anthropomorphology (points of negative similarity). It is reasonable to maintain that 17 points of negative similarity (or non-coincidental points) are sufficient to exclude the identification of the person compared with the other.

Additional details

Created:
April 14, 2023
Modified:
December 1, 2023