INFORMATION-THEORETIC BOUNDS TO ACCURACY FOR BIOMETRIC IDENTIFICATION IN METRIC SPACES OF DATA REPRESENTATIONS


Abstract

For datasets of biometric objects given by images, as well as for an ensemble of datasets of different modalities, lower bounds on the error probability of person identification subject to a fixed amount of processed information are investigated. The bounds are constructed using a probabilistic object classification model in metric spaces of object representations. These bounds are independent of the decision algorithms and are given by the inverses of the rate-distortion functions for the model of discrete source coding with Hamming distortion when the source letters are transmitted over a noisy channel. The difference between unity and any obtained lower bound on the error probability yields a corresponding upper bound on the accuracy of person identification as a function of the amount of processed information in a given dataset of object representations. The obtained bounds are useful for estimating the efficiency of decision algorithms in terms of the deviation of an algorithm's error probability or accuracy from the boundary values for a given average amount of information used in making decisions.
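The inversion underlying such bounds can be sketched numerically. The sketch below is illustrative and assumes the classical rate-distortion function of an equiprobable m-letter discrete source with Hamming distortion, R(D) = log2(m) - H2(D) - D*log2(m-1); the function names and the bisection-based inversion are assumptions for illustration, not the authors' implementation. Given an amount of information I (in bits) available for the decision, the smallest distortion D with R(D) <= I serves as a lower bound on the misclassification probability, and 1 - D as the corresponding upper bound on accuracy:

```python
import math


def rate_distortion_hamming(d: float, m: int) -> float:
    """R(D) for an equiprobable discrete source with m letters under
    Hamming distortion: R(D) = log2(m) - H2(D) - D*log2(m-1),
    defined for 0 <= D <= (m-1)/m, where H2 is the binary entropy."""
    if d <= 0.0:
        return math.log2(m)
    h2 = -d * math.log2(d) - (1.0 - d) * math.log2(1.0 - d)
    return math.log2(m) - h2 - d * math.log2(m - 1)


def error_probability_lower_bound(info_bits: float, m: int,
                                  tol: float = 1e-10) -> float:
    """Invert R(D) by bisection: the smallest D with R(D) <= info_bits.
    R is strictly decreasing on [0, (m-1)/m], so bisection applies."""
    if info_bits >= math.log2(m):
        return 0.0  # enough information to identify the class exactly
    lo, hi = 0.0, (m - 1) / m
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if rate_distortion_hamming(mid, m) > info_bits:
            lo = mid  # R(mid) still above the budget: distortion must grow
        else:
            hi = mid
    return lo
```

For example, with m = 2 classes and zero bits of information the bound is 0.5 (random guessing), and the accuracy upper bound is 1 minus the returned value; deviations of an algorithm's measured error from this curve then quantify its efficiency at a given information budget.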

About the authors

A. M. Lange

Federal Research Center “Computer Science and Control” RAS

Email: lange_am@mail.ru
Moscow, Russia

M. M. Lange

Federal Research Center “Computer Science and Control” RAS

Email: lange_mm@mail.ru
Moscow, Russia

S. V. Paramonov

Federal Research Center “Computer Science and Control” RAS

Email: psypobox@gmail.com
Moscow, Russia



Copyright (c) 2025 Russian Academy of Sciences
